WorldWideScience

Sample records for point target analysis

  1. NIF Ignition Target 3D Point Design

    Energy Technology Data Exchange (ETDEWEB)

    Jones, O; Marinak, M; Milovich, J; Callahan, D

    2008-11-05

We have developed an input file for running 3D NIF hohlraums that is optimized such that it can be run in 1-2 days on parallel computers. We have incorporated increasing levels of automation into the 3D input file: (1) Configuration controlled input files; (2) Common file for 2D and 3D, different types of capsules (symcap, etc.); and (3) Can obtain target dimensions, laser pulse, and diagnostics settings automatically from NIF Campaign Management Tool. We are using 3D Hydra calculations to investigate different problems: (1) Intrinsic 3D asymmetry; (2) Tolerance to nonideal 3D effects (e.g. laser power balance, pointing errors); and (3) Synthetic diagnostics.

  2. Stereotactic Target point Verification in Actual Treatment Position of Radiosurgery

    International Nuclear Information System (INIS)

    Yun, Hyong Geun; Lee, Hyun Koo

    1995-01-01

Purpose : Authors tried to enhance the safety and accuracy of radiosurgery by verifying the stereotactic target point in the actual treatment position prior to irradiation. Materials and Methods : Before the actual treatment, several sections of an anthropomorphic head phantom were used to create a condition of unknown coordinates of the target point. A film was sandwiched between the phantom sections and punctured by a sharp needle tip. The tip of the needle represented the target point. The head phantom was fixed to the stereotactic ring and CT scanning was done with the CT localizer attached to the ring. After the CT scanning, the stereotactic coordinates of the target point were determined. The head phantom was secured to the accelerator's treatment couch, and the laser isocenter was moved to the stereotactic coordinates determined by CT scanning using the target positioner. The accelerator's anteroposterior and lateral portal films were taken using angiographic localizers. The stereotactic coordinates determined by analysis of the portal films were compared with the stereotactic coordinates previously determined by CT scanning. Following the correction of any discrepancy, the head phantom was irradiated using a stereotactic technique of several arcs. After the irradiation, the film which was sandwiched between the phantom sections was developed, and the degree of coincidence between the center of the radiation distribution and the target point represented by the hole in the film was measured. In the treatment of actual patients, the stereotactic coordinates were determined with both CT localizers and angiographic localizers; after confirming agreement between the two sets of coordinates, we proceeded to the irradiation of the actual patient. Results : In the phantom study, the agreement between the center of the radiation distribution and the localized target point was very good. By measuring optical density profiles of the sandwiched film along axes that intersected the target point, authors could confirm

  3. Dim point target detection against bright background

    Science.gov (United States)

    Zhang, Yao; Zhang, Qiheng; Xu, Zhiyong; Xu, Junping

    2010-05-01

For target detection within a large-field cluttered background from a long distance, several difficulties, involving low contrast between target and background, small target occupancy, illumination non-uniformity caused by vignetting of the lens, and system noise, make it a challenging problem. The existing approaches to dim target detection can be roughly divided into two categories: detection before tracking (DBT) and tracking before detection (TBD). The DBT-based scheme has been widely used in practical applications due to its simplicity, but it often requires working in situations with a higher signal-to-noise ratio (SNR). In contrast, the TBD-based methods can provide impressive detection results even in cases of very low SNR; unfortunately, the large memory requirement and high computational load prevent these methods from being used in real-time tasks. In this paper, we propose a new method for dim target detection. We address this problem by combining the advantages of the DBT-based scheme in computational efficiency and of the TBD-based scheme in detection capability. Our method first predicts the local background, and then employs energy accumulation and a median filter to remove background clutter. The dim target is finally located by double window filtering together with an improved high-order correlation which speeds up the convergence. The proposed method is implemented on a hardware platform and performs well in field experiments.
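The background-prediction and clutter-removal stage described above can be sketched in a few lines. This is an illustrative toy version only: the filter size, the k-sigma threshold rule, and the synthetic scene are assumptions, and the paper's energy-accumulation and double-window steps are omitted.

```python
import numpy as np

def local_median(img, size=15):
    """Brute-force local median filter used as a background predictor.
    Fine for small frames; a real system would use an optimised filter."""
    pad = size // 2
    padded = np.pad(img, pad, mode="reflect")
    out = np.empty(img.shape, dtype=float)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            out[y, x] = np.median(padded[y:y + size, x:x + size])
    return out

def detect_dim_targets(frame, bg_size=15, k=4.0):
    """Toy DBT-style detector: predict the local background with a median
    filter, subtract it, and threshold the residual at k standard deviations."""
    frame = frame.astype(float)
    residual = frame - local_median(frame, bg_size)  # clutter suppression
    thresh = residual.mean() + k * residual.std()
    ys, xs = np.nonzero(residual > thresh)
    return list(zip(ys.tolist(), xs.tolist()))

# Synthetic scene: a smooth bright ramp, sensor noise, and one dim point target.
rng = np.random.default_rng(0)
scene = 100.0 + 20.0 * np.tile(np.linspace(0, 1, 64), (64, 1))
scene += rng.normal(0, 0.5, scene.shape)
scene[40, 25] += 8.0  # dim target, a few counts above a ~120-count background
hits = detect_dim_targets(scene)
```

Because the median filter tracks the smooth background ramp but not an isolated bright pixel, the target survives the subtraction while the vignetting-like gradient is removed.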

  4. Change point analysis and assessment

    DEFF Research Database (Denmark)

    Müller, Sabine; Neergaard, Helle; Ulhøi, John Parm

    2011-01-01

The aim of this article is to develop an analytical framework for studying processes such as continuous innovation and business development in high-tech SME clusters that transcends the traditional qualitative-quantitative divide. It integrates four existing and well-recognized approaches...... to studying events, processes and change, namely change-point analysis, event-history analysis, critical-incident technique and sequence analysis....

  5. Parametric statistical change point analysis

    CERN Document Server

    Chen, Jie

    2000-01-01

This work is an in-depth study of the change point problem from a general point of view and a further examination of change point analysis of the most commonly used statistical models. Change point problems are encountered in such disciplines as economics, finance, medicine, psychology, signal processing, and geology, to mention only several. The exposition is clear and systematic, with a great deal of introductory material included. Different models are presented in each chapter, including gamma and exponential models, rarely examined thus far in the literature. Other models covered in detail are the multivariate normal, univariate normal, regression, and discrete models. Extensive examples throughout the text emphasize key concepts, and different methodologies are used, namely the likelihood ratio criterion, and the Bayesian and information criterion approaches. A comprehensive bibliography and two indices complete the study.
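A minimal example of the likelihood-ratio idea the book covers, for the simplest case of a single mean shift in a Gaussian sequence with known variance (the function and the synthetic data here are illustrative, not taken from the book):

```python
import numpy as np

def mean_change_point(x):
    """Likelihood-ratio scan for a single mean shift in a Gaussian sequence
    with known unit variance: maximise the log-likelihood gain of a
    two-mean model over a one-mean model across all split points."""
    x = np.asarray(x, float)
    n = len(x)
    total = x.sum()
    best_k, best_stat = None, -np.inf
    for k in range(1, n):  # candidate change point after index k-1
        m1 = x[:k].mean()
        m2 = x[k:].mean()
        # sum-of-squares decomposition: gain of splitting at k
        stat = k * m1**2 + (n - k) * m2**2 - total**2 / n
        if stat > best_stat:
            best_k, best_stat = k, stat
    return best_k, best_stat

# A sequence whose mean jumps from 0 to 2 at index 50.
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(0, 1, 50), rng.normal(2, 1, 50)])
k, stat = mean_change_point(data)
```

With a two-sigma shift, the estimated change point lands close to the true index 50; the statistic itself would be compared against a critical value in a formal test.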

  6. High precision target center determination from a point cloud

    Directory of Open Access Journals (Sweden)

    K. Kregar

    2013-10-01

Full Text Available Many applications of terrestrial laser scanners (TLS) require the determination of a specific point from a point cloud. In this paper a procedure for high-precision planar target center acquisition from a point cloud is presented. The process is based on an image matching algorithm, but before we can deal with the raster image to fit a target on it, we need to properly determine the best fitting plane and project the points onto it. The main emphasis of this paper is on the precision estimation and propagation through the whole procedure, which allows us to obtain a precision assessment of the final results (target center coordinates). The theoretical precision estimates obtained through the procedure were rather high, so we compared them with the empirical precision estimates obtained as standard deviations of the results of 60 independently scanned targets. A χ²-test confirmed that the theoretical precisions are overestimated. The problem most probably lies in the overestimated precisions of the plane parameters due to the vast redundancy of points. However, the empirical precisions also confirmed that the proposed procedure can ensure a submillimeter precision level. The algorithm can automatically detect grossly erroneous results to some extent. It can operate when the incidence angle of the laser beam is as high as 80°, which is a desirable property if one is going to use planar targets as tie points in scan registration. The proposed algorithm will also help to improve TLS calibration procedures.
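The first stage of such a procedure, fitting the best plane to the target's points and projecting them onto it, can be sketched with an SVD-based least-squares fit. This is a generic sketch on assumed synthetic data, not the authors' implementation, and it omits the rasterisation and image-matching stages.

```python
import numpy as np

def fit_plane_and_project(points):
    """Least-squares plane through a point cloud via SVD, then orthogonal
    projection of the points onto that plane."""
    pts = np.asarray(points, float)
    centroid = pts.mean(axis=0)
    # The right-singular vector with the smallest singular value is the normal.
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    dist = (pts - centroid) @ normal          # signed distances to the plane
    projected = pts - np.outer(dist, normal)  # drop the normal component
    return normal, projected

# Noisy samples of the plane z = 0.1 x + 0.2 y (assumed test geometry).
rng = np.random.default_rng(2)
xy = rng.uniform(-1, 1, (200, 2))
z = 0.1 * xy[:, 0] + 0.2 * xy[:, 1] + rng.normal(0, 0.001, 200)
normal, proj = fit_plane_and_project(np.column_stack([xy, z]))
```

The projected points are exactly coplanar, which is what makes the subsequent raster-image target fitting well defined; precision propagation would track the covariance of the plane parameters through this step.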

  7. Inertial fusion energy target injection, tracking, and beam pointing

    International Nuclear Information System (INIS)

    Petzoldt, R.W.

    1995-01-01

Several cryogenic targets must be injected each second into a reaction chamber. The required target speed is about 100 m/s. The required accuracy of the driver beams on target is a few hundred micrometers. Fuel strength is calculated to allow acceleration in excess of 10,000 m/s² if the fuel temperature is less than 17 K. A 0.1 μm thick dual membrane will allow nearly 2,000 m/s² acceleration. Acceleration is gradually increased and decreased over a few membrane oscillation periods (a few ms) to avoid added stress from vibrations, which could otherwise cause a factor of two decrease in allowed acceleration. Movable shielding allows multiple targets to be in flight toward the reaction chamber at once while minimizing neutron heating of subsequent targets. The use of multiple injectors is recommended for redundancy, which increases availability and allows a higher pulse rate. Gas gun, rail gun, induction accelerator, and electrostatic accelerator target injection devices are studied and compared. A gas gun is the preferred device for indirect-drive targets due to its simplicity and proven reliability. With the gas gun, the amount of gas required for each target (about 10 to 100 mg) is acceptable. A revolver loading mechanism is recommended with a cam-operated poppet valve to control the gas flow. Cutting vents near the muzzle of the gas gun barrel is recommended to improve accuracy and aid gas pumping. If a railgun is used, we recommend an externally applied magnetic field to reduce the required current by an order of magnitude. Optical target tracking is recommended. Up/down counters are suggested to predict target arrival time. Target steering is shown to be feasible and would avoid the need to actively point the beams. Calculations show that induced tumble from electrostatically steering the target is not excessive

  8. The registration of non-cooperative moving targets laser point cloud in different view point

    Science.gov (United States)

    Wang, Shuai; Sun, Huayan; Guo, Huichao

    2018-01-01

Multi-view point cloud registration of non-cooperative moving targets is a key technology in the 3D reconstruction of laser three-dimensional imaging. The main problem is that the point cloud density changes greatly and noise exists under different acquisition conditions. In this paper, a feature descriptor is first used to find the most similar point cloud. Then, based on a region-segmentation registration algorithm, the geometric structure of the points is extracted from the point-to-point geometric similarity: the point cloud is divided into regions by spectral clustering, a feature descriptor is created for each region, the most similar region is searched for in the most similar view's point cloud, and the pair of point clouds is then aligned by aligning their minimum bounding boxes. These steps are repeated until the registration of all point clouds is completed. Experiments show that this method is insensitive to the density of the point clouds and performs well under the noise of laser three-dimensional imaging.
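The coarse alignment step, aligning the bounding boxes of two clouds, can be illustrated with a deliberately simplified version that matches only the centres of axis-aligned bounding boxes (the paper uses minimum bounding boxes and region-level matching; the clouds below are assumed test data):

```python
import numpy as np

def align_by_bounding_box(src, dst):
    """Coarse rigid alignment of two point clouds by translating `src` so
    that the centre of its axis-aligned bounding box coincides with the
    centre of `dst`'s bounding box (translation only, no rotation)."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    src_center = (src.min(axis=0) + src.max(axis=0)) / 2
    dst_center = (dst.min(axis=0) + dst.max(axis=0)) / 2
    return src + (dst_center - src_center)

# The same target observed from another viewpoint, here modelled as a shift.
rng = np.random.default_rng(3)
cloud = rng.uniform(0, 1, (500, 3))
shifted = cloud + np.array([5.0, -2.0, 0.5])
aligned = align_by_bounding_box(shifted, cloud)
```

Bounding-box alignment is robust to density differences because it depends only on the cloud extents, which is one reason it suits the varying-density setting described above; a finer registration (e.g. ICP) would normally follow.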

  9. Effect of Antenna Pointing Errors on SAR Imaging Considering the Change of the Point Target Location

    Science.gov (United States)

    Zhang, Xin; Liu, Shijie; Yu, Haifeng; Tong, Xiaohua; Huang, Guoman

    2018-04-01

In spaceborne spotlight SAR, the antenna is steered by the SAR system with a specific regularity, so shaking of the internal mechanism is inevitable. Moreover, the external environment also affects the stability of the SAR platform. Both of these cause jitter of the SAR platform attitude. Platform attitude instability introduces antenna pointing errors in both the azimuth and range directions, and influences the acquisition of the SAR raw data and the ultimate imaging quality. In this paper, the relations between the antenna pointing errors and the three-axis attitude errors are deduced, and then the relations between spaceborne spotlight SAR imaging of a point target and the antenna pointing errors are analysed based on the paired-echo theory; meanwhile, the change of the azimuth antenna gain is considered as the spotlight SAR platform moves ahead. The simulation experiments show that the effects of antenna pointing errors on spotlight SAR imaging are related to the target location; that is, the pointing errors of the antenna beam will severely influence the part of the illuminated scene far away from the scene centre in the azimuth direction.

  10. SU-E-T-310: Targeting Safety Improvements Through Analysis of Near-Miss Error Detection Points in An Incident Learning Database

    International Nuclear Information System (INIS)

    Novak, A; Nyflot, M; Sponseller, P; Howard, J; Logan, W; Holland, L; Jordan, L; Carlson, J; Ermoian, R; Kane, G; Ford, E; Zeng, J

    2014-01-01

Purpose: Radiation treatment planning involves a complex workflow that can make safety improvement efforts challenging. This study utilizes an incident reporting system to identify detection points of near-miss errors, in order to guide our departmental safety improvement efforts. Previous studies have examined where errors arise, but not where they are detected or their patterns. Methods: 1377 incidents were analyzed from a departmental near-miss error reporting system from 3/2012–10/2013. All incidents were prospectively reviewed weekly by a multi-disciplinary team, and assigned a near-miss severity score ranging from 0–4 reflecting potential harm (no harm to critical). A 98-step consensus workflow was used to determine origination and detection points of near-miss errors, categorized into 7 major steps (patient assessment/orders, simulation, contouring/treatment planning, pre-treatment plan checks, therapist/on-treatment review, post-treatment checks, and equipment issues). Categories were compared using ANOVA. Results: In the 7-step workflow, 23% of near-miss errors were detected within the same step in the workflow, while an additional 37% were detected by the next step in the workflow, and 23% were detected two steps downstream. Errors detected further from origination were more severe (p<.001; Figure 1). The most common source of near-miss errors was treatment planning/contouring, with 476 near misses (35%). Of those 476, only 72 (15%) were found before leaving treatment planning, 213 (45%) were found at physics plan checks, and 191 (40%) were caught at the therapist pre-treatment chart review or on portal imaging. Errors that passed through physics plan checks and were detected by therapists were more severe than other errors originating in contouring/treatment planning (1.81 vs 1.33, p<0.001). Conclusion: Errors caught by radiation treatment therapists tend to be more severe than errors caught earlier in the workflow, highlighting the importance of safety

  11. Measuring coseismic displacements with point-like targets offset tracking

    KAUST Repository

    Hu, Xie; Wang, Teng; Liao, Mingsheng

    2014-01-01

Offset tracking is an important complement for measuring large ground displacements in both the azimuth and range dimensions where synthetic aperture radar (SAR) interferometry is unfeasible. Subpixel offsets can be obtained by searching for the cross-correlation peak calculated from matching patches uniformly distributed on two SAR images. However, this has its limitations, including redundant computation and incorrect estimations on decorrelated patches. In this letter, we propose a simple strategy that performs offset tracking on detected point-like targets (PT). We first detect image patches within bright PT by using a sinc-like template from a single SAR image and then perform offset tracking on them to obtain the pixel shifts. Compared with the standard method, the application to the 2010 M 7.2 El Mayor-Cucapah earthquake shows that the proposed PT offset tracking can significantly increase the cross-correlation and thus result in both efficiency and reliability improvements. © 2013 IEEE.
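The core operation behind any offset-tracking scheme is locating the cross-correlation peak between two patches. A minimal FFT-based, integer-pixel version is sketched below; subpixel refinement and the PT detection step are omitted, and the synthetic data are assumptions for the example.

```python
import numpy as np

def integer_offset(master, slave):
    """Estimate the integer pixel offset between two equally sized patches
    by locating the peak of their circular cross-correlation, computed via
    FFT as IFFT(FFT(master) * conj(FFT(slave)))."""
    m = master - master.mean()
    s = slave - slave.mean()
    corr = np.fft.ifft2(np.fft.fft2(m) * np.conj(np.fft.fft2(s))).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Peak indices beyond half the patch size wrap around to negative offsets.
    return tuple(int(p) if p <= n // 2 else int(p) - n
                 for p, n in zip(peak, corr.shape))

# Synthetic pair: a noise scene and a circularly shifted copy of it.
rng = np.random.default_rng(4)
scene = rng.normal(size=(128, 128))
shifted = np.roll(scene, (3, -5), axis=(0, 1))  # known displacement
offset = integer_offset(shifted, scene)
```

On real SAR patches the correlation peak is refined to subpixel precision (e.g. by oversampling or peak fitting), and decorrelated patches yield low peaks, which is exactly the failure mode the PT-based strategy above is designed to avoid.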

  12. Analysis of irregularly distributed points

    DEFF Research Database (Denmark)

    Hartelius, Karsten

    1996-01-01

conditional modes are applied to this problem. The Kalman filter is described as a powerful tool for modelling two-dimensional data. Motivated by the development of the reduced update Kalman filter we propose a reduced update Kalman smoother which offers considerable computational savings. Kriging...... on hybridisation analysis, which comprises matching a grid to an arrayed set of DNA-clones spotted onto a hybridisation filter. The line process has proven to perform a satisfactory modelling of shifted fields (subgrids) in the hybridisation grid, and a two-staged hierarchical grid matching scheme which...

  13. Point Information Gain and Multidimensional Data Analysis

    Directory of Open Access Journals (Sweden)

    Renata Rychtáriková

    2016-10-01

    Full Text Available We generalize the point information gain (PIG and derived quantities, i.e., point information gain entropy (PIE and point information gain entropy density (PIED, for the case of the Rényi entropy and simulate the behavior of PIG for typical distributions. We also use these methods for the analysis of multidimensional datasets. We demonstrate the main properties of PIE/PIED spectra for the real data with the examples of several images and discuss further possible utilizations in other fields of data processing.
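The idea of a point information gain, the change in a (Rényi) entropy when a single observation is discounted, can be sketched for a discrete histogram. This follows the general idea only; the published definitions of PIG/PIE/PIED may differ in detail, and the example histogram is an assumption.

```python
import numpy as np

def renyi_entropy(counts, alpha=2.0):
    """Rényi entropy (in bits) of a discrete histogram, for alpha != 1."""
    p = counts / counts.sum()
    p = p[p > 0]
    return float(np.log2((p ** alpha).sum()) / (1.0 - alpha))

def point_information_gain(counts, bin_index, alpha=2.0):
    """Sketch of a point information gain: the change in Rényi entropy when
    one occurrence is removed from the given histogram bin."""
    reduced = counts.copy()
    reduced[bin_index] -= 1
    return renyi_entropy(counts, alpha) - renyi_entropy(reduced, alpha)

counts = np.array([50.0, 30.0, 15.0, 5.0])
pig_rare = point_information_gain(counts, 3)    # discounting a rare observation
pig_common = point_information_gain(counts, 0)  # discounting a common one
```

For this histogram the gain is larger for the rare bin than for the common one, reflecting the intuition that rare events carry more information about the distribution; a PIE-style spectrum would collect these values over all observations.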

  14. Enhancing RGI lyase thermostability by targeted single point mutations

    DEFF Research Database (Denmark)

    Silva, Inês R.; Larsen, Dorte Møller; Jers, Carsten

    2013-01-01

    Rhamnogalacturonan I lyase (RGI lyase) (EC 4.2.2.-) catalyzes the cleavage of rhamnogalacturonan I in pectins by β-elimination. In this study the thermal stability of a RGI lyase (PL 11) originating from Bacillus licheniformis DSM 13/ATCC14580 was increased by a targeted protein engineering...

  15. Interior point algorithms theory and analysis

    CERN Document Server

    Ye, Yinyu

    2011-01-01

    The first comprehensive review of the theory and practice of one of today's most powerful optimization techniques. The explosive growth of research into and development of interior point algorithms over the past two decades has significantly improved the complexity of linear programming and yielded some of today's most sophisticated computing techniques. This book offers a comprehensive and thorough treatment of the theory, analysis, and implementation of this powerful computational tool. Interior Point Algorithms provides detailed coverage of all basic and advanced aspects of the subject.

  16. Immunotherapy targeting immune check-point(s) in brain metastases.

    Science.gov (United States)

    Di Giacomo, Anna Maria; Valente, Monica; Covre, Alessia; Danielli, Riccardo; Maio, Michele

    2017-08-01

Immunotherapy with monoclonal antibodies (mAb) directed to different immune check-point(s) is showing a significant clinical impact in a growing number of human tumors of different histotypes, both in terms of disease response and long-term patient survival. In this rapidly changing scenario, the treatment of brain metastases remains a high unmet medical need, and the efficacy of immunotherapy in this highly dismal clinical setting remains largely to be demonstrated. Nevertheless, emerging observations are beginning to suggest a clinical potential of cancer immunotherapy also in brain metastases, regardless of the underlying tumor histotype. These observations remain to be validated in larger clinical trials, eventually designed also to address the efficacy of therapeutic mAb to immune check-point(s) within multimodality therapies for brain metastases. Notably, the initial proofs of efficacy of immunotherapy in central nervous system metastases are already fostering clinical trials investigating its therapeutic potential also in primary brain tumors. We here review ongoing immunotherapeutic approaches to brain metastases and primary brain tumors, and the foreseeable strategies to overcome their main biologic hurdles and clinical challenges. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. 3D reconstruction of laser projective point with projection invariant generated from five points on 2D target.

    Science.gov (United States)

    Xu, Guan; Yuan, Jing; Li, Xiaotao; Su, Jian

    2017-08-01

    Vision measurement on the basis of structured light plays a significant role in the optical inspection research. The 2D target fixed with a line laser projector is designed to realize the transformations among the world coordinate system, the camera coordinate system and the image coordinate system. The laser projective point and five non-collinear points that are randomly selected from the target are adopted to construct a projection invariant. The closed form solutions of the 3D laser points are solved by the homogeneous linear equations generated from the projection invariants. The optimization function is created by the parameterized re-projection errors of the laser points and the target points in the image coordinate system. Furthermore, the nonlinear optimization solutions of the world coordinates of the projection points, the camera parameters and the lens distortion coefficients are contributed by minimizing the optimization function. The accuracy of the 3D reconstruction is evaluated by comparing the displacements of the reconstructed laser points with the actual displacements. The effects of the image quantity, the lens distortion and the noises are investigated in the experiments, which demonstrate that the reconstruction approach is effective to contribute the accurate test in the measurement system.

  18. Target validation for FCV technology development in Japan from energy competition point of view

    International Nuclear Information System (INIS)

    ENDO Eiichi

    2006-01-01

The objective of this work is to validate the technical targets in the governmental hydrogen energy roadmap of Japan by analyzing the market penetration of fuel cell vehicles (FCVs) and the effects of fuel price and carbon tax on it from the technology competition point of view. In this analysis, an energy system model of Japan based on MARKAL is used. The results of the analysis show that hydrogen FCVs would not be cost-competitive until 2030 without a carbon tax, including the government's actual carbon tax plan. However, as the carbon tax rate increases, hydrogen FCVs penetrate the market earlier and in greater numbers, instead of conventional vehicles including gasoline hybrid electric vehicles. By assuming a higher fuel price and a more severe carbon tax rate, the market share of hydrogen FCVs approaches the governmental goal. This suggests that a cheaper vehicle cost and/or hydrogen price than those targeted in the roadmap is required. At the same time, achievement of the technical targets in the roadmap also makes it possible to attain the market penetration target of hydrogen FCVs under some possible conditions. (authors)

  19. Why Targets of Economic Sanctions React Differently: Reference Point Effects on North Korea and Libya

    Directory of Open Access Journals (Sweden)

    Jiyoun Park

    2017-06-01

Full Text Available The international community has frequently introduced economic sanctions to curb the proliferation of weapons of mass destruction, to which each target nation has reacted differently. This paper explores the reasons why each target of economic sanctions reacts differently by specifically building a model based on reference point effects, and by analyzing the cases of North Korea and Libya. According to the results, when the reference point level increases, as in the case of North Korea, the target resists more firmly; on the other hand, when the reference point decreases, as in the case of Libya, the target resists more subtly.

  20. Tipping point analysis of ocean acoustic noise

    Science.gov (United States)

    Livina, Valerie N.; Brouwer, Albert; Harris, Peter; Wang, Lian; Sotirakopoulos, Kostas; Robinson, Stephen

    2018-02-01

    We apply tipping point analysis to a large record of ocean acoustic data to identify the main components of the acoustic dynamical system and study possible bifurcations and transitions of the system. The analysis is based on a statistical physics framework with stochastic modelling, where we represent the observed data as a composition of deterministic and stochastic components estimated from the data using time-series techniques. We analyse long-term and seasonal trends, system states and acoustic fluctuations to reconstruct a one-dimensional stochastic equation to approximate the acoustic dynamical system. We apply potential analysis to acoustic fluctuations and detect several changes in the system states in the past 14 years. These are most likely caused by climatic phenomena. We analyse trends in sound pressure level within different frequency bands and hypothesize a possible anthropogenic impact on the acoustic environment. The tipping point analysis framework provides insight into the structure of the acoustic data and helps identify its dynamic phenomena, correctly reproducing the probability distribution and scaling properties (power-law correlations) of the time series.
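The potential-analysis step, reconstructing an effective potential from the observed fluctuations and counting its wells to identify system states, can be sketched as follows. This is a toy version on a synthetic bistable signal: the published framework also estimates the noise level, fits polynomial potentials, and tracks the well structure over sliding windows.

```python
import numpy as np

def empirical_potential(x, bins=40):
    """Reconstruct an effective potential U(x) ~ -log p(x) from a time
    series via its histogram (first step of potential analysis)."""
    hist, edges = np.histogram(x, bins=bins, density=True)
    centers = (edges[:-1] + edges[1:]) / 2
    mask = hist > 0           # drop empty bins before taking the log
    return centers[mask], -np.log(hist[mask])

def count_wells(u):
    """Number of strict local minima of the sampled potential."""
    return int(np.sum((u[1:-1] < u[:-2]) & (u[1:-1] < u[2:])))

# Bistable test signal: noisy switching between two states at -1 and +1.
rng = np.random.default_rng(5)
states = np.where(rng.random(20000) < 0.5, -1.0, 1.0)
series = states + rng.normal(0, 0.25, 20000)
centers, U = empirical_potential(series)
```

The deepest well of the reconstructed potential sits at one of the two states, and a change in the number of wells over time is the kind of structural transition a tipping-point analysis looks for.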

  1. Tipping point analysis of ocean acoustic noise

    Directory of Open Access Journals (Sweden)

    V. N. Livina

    2018-02-01

Full Text Available We apply tipping point analysis to a large record of ocean acoustic data to identify the main components of the acoustic dynamical system and study possible bifurcations and transitions of the system. The analysis is based on a statistical physics framework with stochastic modelling, where we represent the observed data as a composition of deterministic and stochastic components estimated from the data using time-series techniques. We analyse long-term and seasonal trends, system states and acoustic fluctuations to reconstruct a one-dimensional stochastic equation to approximate the acoustic dynamical system. We apply potential analysis to acoustic fluctuations and detect several changes in the system states in the past 14 years. These are most likely caused by climatic phenomena. We analyse trends in sound pressure level within different frequency bands and hypothesize a possible anthropogenic impact on the acoustic environment. The tipping point analysis framework provides insight into the structure of the acoustic data and helps identify its dynamic phenomena, correctly reproducing the probability distribution and scaling properties (power-law correlations) of the time series.

  2. Segmentation of foreground apple targets by fusing visual attention mechanism and growth rules of seed points

    Energy Technology Data Exchange (ETDEWEB)

    Qu, W.; Shang, W.; Shao, Y.; Wang, D.; Yu, X.; Song, H.

    2015-07-01

Accurate segmentation of apple targets is one of the most important problems to be solved in the vision systems of apple-picking robots. This work aimed to solve the difficulties that background targets often bring to foreground target segmentation by fusing the visual attention mechanism and the growth rule of seed points. Background targets could be eliminated by extracting the ROI (region of interest) of the apple targets; the ROI was roughly segmented in the HSV color space, and then each of its pixels was used as a seed growing point. The growth rule of the seed points was adopted to obtain the whole area of the apple targets from the seed growing points. The proposed method was tested with 20 images captured in a natural scene, including 54 foreground apple targets and approximately 84 background apple targets. Experimental results showed that the proposed method can remove background targets and focus on foreground targets, while the k-means algorithm and the chromatic aberration algorithm cannot. Additionally, its average segmentation error rate was 13.23%, which is 2.71% higher than that of the k-means algorithm and 2.95% lower than that of the chromatic aberration algorithm. In conclusion, the proposed method helps the vision system of apple-picking robots to locate foreground apple targets quickly and accurately in a natural scene. (Author)
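The growth rule of seed points is a form of seeded region growing. A minimal 4-connected grayscale version is sketched below; it is illustrative only, since the paper grows seeds in the HSV colour space with its own rule, and the tolerance and test image here are assumptions.

```python
from collections import deque
import numpy as np

def grow_region(image, seed, tol=20):
    """4-connected seeded region growing on a grayscale image: starting from
    a seed pixel, absorb neighbours whose intensity is within `tol` of the
    seed value, using a breadth-first flood fill."""
    h, w = image.shape
    sy, sx = seed
    seed_val = float(image[sy, sx])
    mask = np.zeros((h, w), dtype=bool)
    mask[sy, sx] = True
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if (0 <= ny < h and 0 <= nx < w and not mask[ny, nx]
                    and abs(float(image[ny, nx]) - seed_val) <= tol):
                mask[ny, nx] = True
                queue.append((ny, nx))
    return mask

# A bright square "apple" on a dark background.
img = np.zeros((20, 20), dtype=np.uint8)
img[5:12, 6:14] = 200
region = grow_region(img, seed=(8, 9))
```

Starting from any seed inside the bright patch, the region expands to exactly the 7×8 apple area and stops at the dark background, which is the behaviour the fused attention-plus-growth pipeline relies on once the ROI has been found.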

  3. Music analysis and point-set compression

    DEFF Research Database (Denmark)

    Meredith, David

    A musical analysis represents a particular way of understanding certain aspects of the structure of a piece of music. The quality of an analysis can be evaluated to some extent by the degree to which knowledge of it improves performance on tasks such as mistake spotting, memorising a piece...... as the minimum description length principle and relates closely to certain ideas in the theory of Kolmogorov complexity. Inspired by this general principle, the hypothesis explored in this paper is that the best ways of understanding (or explanations for) a piece of music are those that are represented...... by the shortest possible descriptions of the piece. With this in mind, two compression algorithms are presented, COSIATEC and SIATECCompress. Each of these algorithms takes as input an in extenso description of a piece of music as a set of points in pitch-time space representing notes. Each algorithm...

  4. Numerical analysis on pump turbine runaway points

    International Nuclear Information System (INIS)

    Guo, L; Liu, J T; Wang, L Q; Jiao, L; Li, Z F

    2012-01-01

To research the character of pump turbine runaway points at different guide vane openings, a hydraulic model was established based on a pumped storage power station. The RNG k-ε model and the SIMPLEC algorithm were used to simulate the internal flow fields. The result of the simulation was compared with the test data, and good correspondence was obtained between the experimental data and the CFD results. Based on this model, an internal flow analysis was carried out. The results show that when the pump turbine ran at the runaway speed, many vortices appeared in the flow passage of the runner. These vortices could always be observed even as the guide vane opening changed. That is an important source of energy loss in the runaway condition. The pressures on the two sides of the runner blades were almost the same, so the runner power is very low. The high speed induced a large centrifugal force, and the small guide vane opening gave the water velocity a large tangential component; thus an obvious water ring could be observed between the runner blades and the guide vanes at small guide vane openings. That ring disappeared when the opening was larger than 20°. These conclusions can provide a theoretical basis for the analysis and simulation of pump turbine runaway points.

  5. The phylogenomic analysis of the anaphase promoting complex and its targets points to complex and modern-like control of the cell cycle in the last common ancestor of eukaryotes

    Directory of Open Access Journals (Sweden)

    Brochier-Armanet Céline

    2011-09-01

Full Text Available Abstract Background The Anaphase Promoting Complex or Cyclosome (APC/C) is the largest member of the ubiquitin ligase [E3] family. It plays a crucial role in the control of the cell cycle and cell proliferation by mediating the proteolysis of key components by the proteasome. The APC/C is made of a dozen subunits that assemble into a large complex of ~1.5 MDa, which interacts with various cofactors and targets. Results Using comparative genomic and phylogenetic approaches, we showed that 24 out of 37 known APC/C subunits, adaptors/co-activators and main targets were already present in the Last Eukaryotic Common Ancestor (LECA) and were well conserved, with a few exceptions, in all present-day eukaryotic lineages. The phylogenetic analysis of the 24 components inferred to be present in LECA showed that they contain a reliable phylogenetic signal to reconstruct the phylogeny of the domain Eucarya. Conclusions Taken together, our analyses indicate that LECA had a complex and highly controlled modern-like cell cycle. Moreover, we showed that, despite what is generally assumed, proteins involved in housekeeping cellular functions may be a good complement to informational genes for studying the phylogeny of eukaryotes.

  6. Evaluating gaze-based interface tools to facilitate point-and-select tasks with small targets

    DEFF Research Database (Denmark)

    Skovsgaard, Henrik; Mateo, Julio C.; Hansen, John Paulin

    2011-01-01

    -and-select tasks. We conducted two experiments comparing the performance of dwell, magnification and zoom methods in point-and-select tasks with small targets in single- and multiple-target layouts. Both magnification and zoom showed higher hit rates than dwell. Hit rates were higher when using magnification than...

  7. Residual analysis for spatial point processes

    DEFF Research Database (Denmark)

    Baddeley, A.; Turner, R.; Møller, Jesper

    We define residuals for point process models fitted to spatial point pattern data, and propose diagnostic plots based on these residuals. The techniques apply to any Gibbs point process model, which may exhibit spatial heterogeneity, interpoint interaction and dependence on spatial covariates. Ou...... or covariate effects. Q-Q plots of the residuals are effective in diagnosing interpoint interaction. Some existing ad hoc statistics of point patterns (quadrat counts, scan statistic, kernel smoothed intensity, Berman's diagnostic) are recovered as special cases....
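The raw-residual idea can be illustrated for the simplest case: a homogeneous Poisson model fitted by maximum likelihood, where the residual in each quadrat is the observed count minus the fitted expected count. A minimal sketch assuming NumPy (the original work covers general Gibbs models, which this toy example does not):

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulate a homogeneous Poisson point pattern on the unit square.
n = rng.poisson(200)
pts = rng.uniform(0.0, 1.0, size=(n, 2))

# Fit a homogeneous model: the MLE of the intensity is n / area (area = 1).
intensity_hat = n

# Raw residual per quadrat: observed count minus fitted expected count.
k = 4  # k x k grid of quadrats
counts, _, _ = np.histogram2d(pts[:, 0], pts[:, 1], bins=k,
                              range=[[0, 1], [0, 1]])
residuals = counts - intensity_hat / k**2
```

For an MLE fit of the homogeneous model, the raw residuals sum to zero over the whole observation window, mirroring the mean-zero property of regression residuals.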

  8. Critical point analysis of phase envelope diagram

    Energy Technology Data Exchange (ETDEWEB)

    Soetikno, Darmadi; Siagian, Ucok W. R. [Department of Petroleum Engineering, Institut Teknologi Bandung, Jl. Ganesha 10, Bandung 40132 (Indonesia); Kusdiantara, Rudy, E-mail: rkusdiantara@s.itb.ac.id; Puspita, Dila, E-mail: rkusdiantara@s.itb.ac.id; Sidarto, Kuntjoro A., E-mail: rkusdiantara@s.itb.ac.id; Soewono, Edy; Gunawan, Agus Y. [Department of Mathematics, Institut Teknologi Bandung, Jl. Ganesha 10, Bandung 40132 (Indonesia)

    2014-03-24

    Phase diagram or phase envelope is a relation between temperature and pressure that shows the conditions of equilibrium between the different phases of chemical compounds, mixtures of compounds, and solutions. The phase diagram is an important issue in chemical thermodynamics and hydrocarbon reservoir engineering, and is very useful for process simulation, hydrocarbon reactor design, and petroleum engineering studies. It is constructed from the bubble line, the dew line, and the critical point. The bubble line and dew line are composed of bubble points and dew points, respectively. The bubble point is the first point at which gas is formed when a liquid is heated; the dew point is the first point at which liquid is formed when a gas is cooled. The critical point is the point where the properties of the gas and liquid phases become equal, such as temperature, pressure, amount of substance, and others. The critical point is very useful in fuel processing and the dissolution of certain chemicals. In this paper, we derive the critical point analytically. The result is then compared with numerical calculations based on the Peng-Robinson equation using the Newton-Raphson method. As case studies, several hydrocarbon mixtures are simulated using Matlab.
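The critical-point conditions ∂P/∂V = 0 and ∂²P/∂V² = 0 can be solved with Newton-Raphson as described. The sketch below uses a Peng-Robinson equation simplified to α(T) = 1, with dimensionless parameters a = 1, b = 0.1, R = 1 chosen purely for illustration (not values from the paper):

```python
import numpy as np

R, a, b = 1.0, 1.0, 0.1  # dimensionless, illustrative values

def dP_dV(T, V):
    # First volume derivative of the Peng-Robinson pressure (alpha = 1)
    D = V * (V + b) + b * (V - b)          # = V^2 + 2*b*V - b^2
    return -R * T / (V - b)**2 + a * (2 * V + 2 * b) / D**2

def d2P_dV2(T, V):
    # Second volume derivative
    D = V * (V + b) + b * (V - b)
    return (2 * R * T / (V - b)**3
            + 2 * a / D**2 - 2 * a * (2 * V + 2 * b)**2 / D**3)

def critical_point(T0, V0, n_iter=50):
    """Newton-Raphson on F(T, V) = (dP/dV, d2P/dV2) = 0."""
    x = np.array([T0, V0])
    for _ in range(n_iter):
        F = np.array([dP_dV(*x), d2P_dV2(*x)])
        # Jacobian by central differences
        J = np.empty((2, 2))
        for j in range(2):
            h = np.zeros(2)
            h[j] = 1e-6
            J[:, j] = (np.array([dP_dV(*(x + h)), d2P_dV2(*(x + h))])
                       - np.array([dP_dV(*(x - h)), d2P_dV2(*(x - h))])) / 2e-6
        x = x - np.linalg.solve(J, F)
    return x

Tc, Vc = critical_point(1.6, 0.4)
Pc = R * Tc / (Vc - b) - a / (Vc * (Vc + b) + b * (Vc - b))
```

With these parameters, the standard Peng-Robinson relations a = 0.45724 R²Tc²/Pc and b = 0.07780 RTc/Pc give Tc ≈ 1.70, Vc ≈ 0.395, Pc ≈ 1.32 as a cross-check on the iteration.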

  9. Critical point analysis of phase envelope diagram

    International Nuclear Information System (INIS)

    Soetikno, Darmadi; Siagian, Ucok W. R.; Kusdiantara, Rudy; Puspita, Dila; Sidarto, Kuntjoro A.; Soewono, Edy; Gunawan, Agus Y.

    2014-01-01

    Phase diagram or phase envelope is a relation between temperature and pressure that shows the conditions of equilibrium between the different phases of chemical compounds, mixtures of compounds, and solutions. The phase diagram is an important issue in chemical thermodynamics and hydrocarbon reservoir engineering, and is very useful for process simulation, hydrocarbon reactor design, and petroleum engineering studies. It is constructed from the bubble line, the dew line, and the critical point. The bubble line and dew line are composed of bubble points and dew points, respectively. The bubble point is the first point at which gas is formed when a liquid is heated; the dew point is the first point at which liquid is formed when a gas is cooled. The critical point is the point where the properties of the gas and liquid phases become equal, such as temperature, pressure, amount of substance, and others. The critical point is very useful in fuel processing and the dissolution of certain chemicals. In this paper, we derive the critical point analytically. The result is then compared with numerical calculations based on the Peng-Robinson equation using the Newton-Raphson method. As case studies, several hydrocarbon mixtures are simulated using Matlab

  10. Bayesian analysis of Markov point processes

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper

    2006-01-01

    Recently Møller, Pettitt, Berthelsen and Reeves introduced a new MCMC methodology for drawing samples from a posterior distribution when the likelihood function is only specified up to a normalising constant. We illustrate the method in the setting of Bayesian inference for Markov point processes...... a partially ordered Markov point process as the auxiliary variable. As the method requires simulation from the "unknown" likelihood, perfect simulation algorithms for spatial point processes become useful....

  11. A proton point source produced by laser interaction with cone-top-end target

    International Nuclear Information System (INIS)

    Yu, Jinqing; Jin, Xiaolin; Zhou, Weimin; Zhao, Zongqing; Yan, Yonghong; Li, Bin; Hong, Wei; Gu, Yuqiu

    2012-01-01

    In this paper, we propose a proton point source produced by the interaction of a laser with a cone-top-end target and investigate it by two-dimensional particle-in-cell (2D-PIC) simulations, since proton point sources are well known to give higher spatial resolution in proton radiography. Our results show that the relativistic electrons are guided to the rear of the cone-top-end target by the electrostatic charge-separation field and the self-generated magnetic field along the profile of the target. As a result, the peak magnitude of the sheath field at the rear surface of the cone-top-end target is higher than that of a common cone target. We test this scheme by 2D-PIC simulation and find that the resulting proton source has a diameter of 0.79λ₀, an average energy of 9.1 MeV and an energy spread of less than 35%.

  12. Impedance analysis of acupuncture points and pathways

    International Nuclear Information System (INIS)

    Teplan, Michal; Kukucka, Marek; Ondrejkovicová, Alena

    2011-01-01

    The investigation of impedance characteristics of acupuncture points from the acoustic to the radio-frequency range is addressed. In an initial single-subject study, discernment and localization of acupuncture points by the impedance-map technique was attempted without success. Vector impedance analyses identified possible resonant zones in the MHz region.

  13. Fingerprint Analysis with Marked Point Processes

    DEFF Research Database (Denmark)

    Forbes, Peter G. M.; Lauritzen, Steffen; Møller, Jesper

    We present a framework for fingerprint matching based on marked point process models. An efficient Monte Carlo algorithm is developed to calculate the marginal likelihood ratio for the hypothesis that two observed prints originate from the same finger against the hypothesis that they originate from...... different fingers. Our model achieves good performance on an NIST-FBI fingerprint database of 258 matched fingerprint pairs....

  14. Music analysis and point-set compression

    DEFF Research Database (Denmark)

    Meredith, David

    2015-01-01

    COSIATEC, SIATECCompress and Forth’s algorithm are point-set compression algorithms developed for discovering repeated patterns in music, such as themes and motives that would be of interest to a music analyst. To investigate their effectiveness and versatility, these algorithms were evaluated...... on three analytical tasks that depend on the discovery of repeated patterns: classifying folk song melodies into tune families, discovering themes and sections in polyphonic music, and discovering subject and countersubject entries in fugues. Each algorithm computes a compressed encoding of a point......-set representation of a musical object in the form of a list of compact patterns, each pattern being given with a set of vectors indicating its occurrences. However, the algorithms adopt different strategies in their attempts to discover encodings that maximize compression.The best-performing algorithm on the folk...
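The algorithms named above share a core step that is easy to illustrate: SIA-style discovery of maximal translatable patterns (MTPs), which groups points by the translation vectors mapping them onto other points of the set. A minimal sketch of that step (not the authors' implementation):

```python
from collections import defaultdict

def mtps(points):
    """Group points by inter-point translation vector (the core step of SIA):
    the MTP of vector v is the set of points translatable by v onto other
    points of the dataset."""
    pts = sorted(points)
    table = defaultdict(list)
    for i, p in enumerate(pts):
        for q in pts[i + 1:]:
            v = (q[0] - p[0], q[1] - p[1])
            table[v].append(p)
    return table

# A three-note motif (onset, pitch) repeated four time-steps later:
motif = [(0, 60), (1, 62), (2, 61)]
pattern = motif + [(t + 4, p) for t, p in motif]
result = mtps(pattern)
# The repetition appears as the MTP of vector (4, 0), containing the motif.
```

A compression algorithm such as COSIATEC would then select, from these MTPs, the patterns whose occurrence sets yield the most compact encoding of the point set.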

  15. Uncertainties in thick-target PIXE analysis

    International Nuclear Information System (INIS)

    Campbell, J.L.; Cookson, J.A.; Paul, H.

    1983-01-01

    Thick-target PIXE analysis involves uncertainties arising from the calculation of thick-target X-ray production in addition to the usual PIXE uncertainties. The calculation demands knowledge of ionization cross-sections, stopping powers and photon attenuation coefficients. Information on these is reviewed critically, and a computational method is used to estimate the uncertainties transmitted from this data base into the results of thick-target PIXE analyses, with reference to particular specimen types, using beams of 2-3 MeV protons. A detailed assessment of the accuracy of thick-target PIXE is presented. (orig.)

  16. Fixed point theory, variational analysis, and optimization

    CERN Document Server

    Al-Mezel, Saleh Abdullah R; Ansari, Qamrul Hasan

    2015-01-01

    ""There is a real need for this book. It is useful for people who work in areas of nonlinear analysis, optimization theory, variational inequalities, and mathematical economics.""-Nan-Jing Huang, Sichuan University, Chengdu, People's Republic of China

  17. NEWTONIAN IMPERIALIST COMPETITVE APPROACH TO OPTIMIZING OBSERVATION OF MULTIPLE TARGET POINTS IN MULTISENSOR SURVEILLANCE SYSTEMS

    Directory of Open Access Journals (Sweden)

    A. Afghan-Toloee

    2013-09-01

    Full Text Available The problem of specifying the minimum number of sensors to deploy in a certain area to face multiple targets has been widely studied in the literature. In this paper, we address the multi-sensor deployment problem (MDP). The multi-sensor placement problem can be stated as minimizing the cost required to cover the multiple target points in the area. We propose a more feasible method for the multi-sensor placement problem: it provides the high coverage of grid-based placements while minimizing the cost, as achieved in perimeter placement techniques. The NICA algorithm, an improved ICA (Imperialist Competitive Algorithm), is used to reduce the time needed to find a sufficiently good solution compared with other meta-heuristic schemes such as GA, PSO and ICA. A three-dimensional area is used to represent the multiple target and placement points, providing x, y, and z computations in the observation algorithm. A model for the multi-sensor placement problem is proposed: the problem is formulated as an optimization problem with the objective of minimizing the cost while covering all multiple target points subject to a given probability of observation tolerance.
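A simple baseline for this placement problem is greedy set cover over candidate sensor sites in 3D (the paper's NICA metaheuristic is not reproduced here; all coordinates, the sensing range, and the candidate grid below are invented for illustration):

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(4)
targets = rng.uniform(0.0, 10.0, (20, 3))     # target points in a 10x10x10 region
# Candidate sensor sites on a coarse 3D grid (illustrative)
candidates = np.array(list(product(range(0, 11, 2), repeat=3)), dtype=float)
sensing_range = 4.0

# For each candidate site, the set of target indices it can observe
covers = [set(np.flatnonzero(np.linalg.norm(targets - c, axis=1) <= sensing_range))
          for c in candidates]

# Greedy set cover: repeatedly pick the site covering most uncovered targets
uncovered = set(range(len(targets)))
chosen = []
while uncovered:
    best = max(range(len(candidates)), key=lambda i: len(covers[i] & uncovered))
    if not covers[best] & uncovered:
        break  # remaining targets cannot be covered by any candidate
    chosen.append(best)
    uncovered -= covers[best]
```

Greedy gives a logarithmic approximation guarantee for set cover; metaheuristics such as NICA aim to beat it on cost while keeping full coverage.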

  18. Point Cluster Analysis Using a 3D Voronoi Diagram with Applications in Point Cloud Segmentation

    Directory of Open Access Journals (Sweden)

    Shen Ying

    2015-08-01

    Full Text Available Three-dimensional (3D) point analysis and visualization is one of the most effective methods of point cluster detection and segmentation in geospatial datasets. However, serious scattering and clotting characteristics interfere with the visual detection of 3D point clusters. To overcome this problem, this study proposes the use of 3D Voronoi diagrams to analyze and visualize 3D points instead of the original data items. The proposed algorithm computes the cluster of 3D points by applying a set of 3D Voronoi cells to describe and quantify the points. The decomposition of the point cloud of 3D models is guided by the 3D Voronoi cell parameters. The parameter values are mapped from the Voronoi cells to the 3D points to show the spatial pattern and relationships; thus, a 3D point cluster pattern can be highlighted and easily recognized. To capture different cluster patterns, continuous progressive clusters and segmentations are tested. The 3D spatial relationship is shown to facilitate cluster detection. Furthermore, the generated segmentations of real 3D data cases are exploited to demonstrate the feasibility of our approach in detecting different spatial clusters for continuous point cloud segmentation.
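The cell-parameter idea can be sketched with SciPy: compute the 3D Voronoi diagram, use each bounded cell's volume as a density descriptor, and threshold it to separate a dense cluster from sparse background (the synthetic data and the threshold value are illustrative, not the paper's):

```python
import numpy as np
from scipy.spatial import Voronoi, ConvexHull

rng = np.random.default_rng(0)
cluster = rng.normal(0.0, 0.2, size=(60, 3))   # dense cluster at the origin
noise = rng.uniform(-3.0, 3.0, size=(40, 3))   # sparse background points
pts = np.vstack([cluster, noise])

vor = Voronoi(pts)
volumes = np.full(len(pts), np.inf)
for i, reg in enumerate(vor.point_region):
    vertices = vor.regions[reg]
    if -1 in vertices or len(vertices) == 0:
        continue  # unbounded cell: leave its volume infinite
    volumes[i] = ConvexHull(vor.vertices[vertices]).volume

# Dense regions produce small Voronoi cells; the threshold is chosen by eye
# for this synthetic example.
in_cluster = volumes < 1.0
```

Mapping the volume (or other cell parameters such as anisotropy) back onto the points is what makes the cluster pattern visually separable, as the abstract describes.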

  19. Two-point anchoring of a lanthanide-binding peptide to a target protein enhances the paramagnetic anisotropic effect

    International Nuclear Information System (INIS)

    Saio, Tomohide; Ogura, Kenji; Yokochi, Masashi; Kobashigawa, Yoshihiro; Inagaki, Fuyuhiko

    2009-01-01

    Paramagnetic lanthanide ions fixed in a protein frame induce several paramagnetic effects such as pseudo-contact shifts and residual dipolar couplings. These effects provide long-range distance and angular information for proteins and, therefore, are valuable in protein structural analysis. However, until recently this approach had been restricted to metal-binding proteins, but now it has become applicable to non-metalloproteins through the use of a lanthanide-binding tag. Here we report a lanthanide-binding peptide tag anchored via two points to the target proteins. Compared to conventional single-point attached tags, the two-point linked tag provides two to threefold stronger anisotropic effects. Though there is slight residual mobility of the lanthanide-binding tag, the present tag provides a higher anisotropic paramagnetic effect

  20. A RECOGNITION METHOD FOR AIRPLANE TARGETS USING 3D POINT CLOUD DATA

    Directory of Open Access Journals (Sweden)

    M. Zhou

    2012-07-01

    Full Text Available LiDAR is capable of obtaining three-dimensional coordinates of the terrain and targets directly and is widely applied in digital city modelling, disaster mitigation and environmental monitoring. In particular, because of its ability to penetrate low-density vegetation and canopy, LiDAR has clear advantages in the detection and recognition of hidden and camouflaged targets. Based on multi-echo LiDAR data and invariant-moment theory, this paper presents a recognition method for classic airplanes (including hidden targets, mainly under canopy cover) using KD-tree-segmented point cloud data. The proposed algorithm first uses a KD-tree to organize and manage the point cloud data and applies a clustering method to segment objects; prior knowledge and invariant moments are then used to recognize airplanes. The test results verify the practicality and feasibility of the proposed method, which can be applied to target measurement and modelling in subsequent data processing.
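The KD-tree-plus-clustering segmentation step can be sketched as Euclidean region growing: a KD-tree answers fixed-radius neighbour queries, and connected points are merged into one segment. A minimal sketch with SciPy's `cKDTree` (the data and radius are invented; the paper's exact clustering method is not specified here):

```python
import numpy as np
from scipy.spatial import cKDTree

def euclidean_cluster(points, radius):
    """Region-growing segmentation: points within `radius` of each other
    (transitively) receive the same label."""
    tree = cKDTree(points)
    labels = np.full(len(points), -1)
    current = 0
    for seed in range(len(points)):
        if labels[seed] != -1:
            continue
        labels[seed] = current
        stack = [seed]
        while stack:
            i = stack.pop()
            for j in tree.query_ball_point(points[i], radius):
                if labels[j] == -1:
                    labels[j] = current
                    stack.append(j)
        current += 1
    return labels

# Two well-separated synthetic blobs segment into two clusters
rng = np.random.default_rng(1)
a = rng.normal(0.0, 0.1, (50, 3))
b = rng.normal(5.0, 0.1, (50, 3))
labels = euclidean_cluster(np.vstack([a, b]), radius=0.5)
```

Each resulting segment would then be passed to the invariant-moment classifier to decide whether it is an airplane.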

  1. Investigating Spatial Patterns of Persistent Scatterer Interferometry Point Targets and Landslide Occurrences in the Arno River Basin

    Directory of Open Access Journals (Sweden)

    Ping Lu

    2014-07-01

    Full Text Available Persistent Scatterer Interferometry (PSI) has been widely used for landslide studies in recent years. This paper investigated the spatial patterns of PSI point targets and landslide occurrences in the Arno River basin in Central Italy. The main purpose is to analyze whether spatial patterns of Persistent Scatterers (PS) can be recognized as indicators of landslide occurrences throughout the whole basin. The bivariate K-function was employed to assess spatial relationships between PS and landslides. The PSI point targets were acquired from almost 4 years (from March 2003 to January 2007) of RADARSAT-1 images. The landslide inventory was collected from 15 years (1992–2007) of surveying and mapping data, mainly including remote sensing data, topographic maps and field investigations. The proposed approach is able to assess spatial patterns between a variety of PS and landslides and, in particular, to understand whether PSI point targets are spatially clustered (spatial attraction) or randomly distributed (spatial independence) on various types of landslides across the basin. Additionally, the degree and scale distances of PS clustering on a variety of landslides can be characterized. The results rejected the null hypothesis that PSI point targets cluster similarly on the four types of landslides (slides, flows, falls and creeps) in the Arno River basin. A significant influence of PS velocities and acquisition orbits can be noticed in detecting landslides with different states of activity. Although the assessment may be influenced by the quality of the landslide inventory and of the Synthetic Aperture Radar (SAR) images, the proposed approach is expected to provide guidelines for studies trying to detect and investigate landslide occurrences at a regional scale through spatial statistical analysis of PS, for which an advanced understanding of the impact of scale distances on landslide clustering is fundamentally needed.
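The bivariate K-function measures attraction between two point types: K₁₂(r) is the area-scaled mean number of type-2 points within distance r of a type-1 point, and exceeds πr² under spatial independence. A naive estimate without edge correction (synthetic PS and landslide coordinates, invented for illustration):

```python
import numpy as np

def bivariate_K(pts1, pts2, r, area):
    """Naive bivariate Ripley's K estimate (no edge correction):
    K12(r) = area / (n1 * n2) * #{(i, j): dist(x_i, y_j) <= r}."""
    d = np.linalg.norm(pts1[:, None, :] - pts2[None, :, :], axis=2)
    return area * np.mean(d <= r)

rng = np.random.default_rng(2)
ps = rng.uniform(0.0, 1.0, (100, 2))                 # PS point targets
# Synthetic landslides clustered near PS points (spatial attraction)
slides = ps[rng.integers(0, 100, 80)] + rng.normal(0.0, 0.01, (80, 2))

r = 0.05
k_obs = bivariate_K(ps, slides, r, area=1.0)
k_indep = np.pi * r**2   # expected K12 under spatial independence
# k_obs well above k_indep indicates clustering of landslides on PS.
```

Real analyses, like the study above, add edge correction and simulation envelopes before declaring attraction significant.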

  2. Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method.

    Science.gov (United States)

    Shen, Yueqian; Lindenbergh, Roderik; Wang, Jinhu

    2016-12-24

    A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system having the disadvantage that this registration step introduces additional errors. In addition, registration requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan. To analyze changes, baselines connecting corresponding points in two scans are compared. As feature points either targets or virtual points corresponding to some reconstructable feature in the scene are used. The new method is implemented on two scans sampling a masonry laboratory building before and after seismic testing, that resulted in damages in the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud to cloud change analysis demonstrating the potential of the new method for structural analysis.
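The registration-free property comes from comparing baseline lengths, which are invariant to scanner pose. A minimal sketch with synthetic "brick centre" virtual points, a rigid pose change between epochs, and one point displaced by damage (all coordinates invented):

```python
import numpy as np
from itertools import combinations

def baseline_changes(pts_epoch1, pts_epoch2):
    """Length change of every baseline (segment between corresponding feature
    points) across two scans; lengths are pose-invariant, so the two scans
    never need to be registered to a common coordinate system."""
    changes = {}
    for i, j in combinations(range(len(pts_epoch1)), 2):
        d1 = np.linalg.norm(pts_epoch1[i] - pts_epoch1[j])
        d2 = np.linalg.norm(pts_epoch2[i] - pts_epoch2[j])
        changes[(i, j)] = d2 - d1
    return changes

# Epoch 2 sees the same scene from a different scanner pose (rotation +
# translation); point 0 is additionally displaced 5 cm by "damage".
rng = np.random.default_rng(7)
bricks = rng.uniform(0.0, 2.0, (6, 3))
theta = 0.3
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
epoch2 = bricks @ Rz.T + np.array([10.0, -3.0, 0.5])
epoch2[0] += np.array([0.05, 0.0, 0.0])

changes = baseline_changes(bricks, epoch2)
```

Baselines among undamaged points keep their lengths to machine precision despite the pose change, while baselines touching the displaced point register the deformation.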

  3. Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method

    Directory of Open Access Journals (Sweden)

    Yueqian Shen

    2016-12-01

    Full Text Available A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system having the disadvantage that this registration step introduces additional errors. In addition, registration requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan. To analyze changes, baselines connecting corresponding points in two scans are compared. As feature points either targets or virtual points corresponding to some reconstructable feature in the scene are used. The new method is implemented on two scans sampling a masonry laboratory building before and after seismic testing, that resulted in damages in the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud to cloud change analysis demonstrating the potential of the new method for structural analysis.

  4. Deployment Design of Wireless Sensor Network for Simple Multi-Point Surveillance of a Moving Target

    Science.gov (United States)

    Tsukamoto, Kazuya; Ueda, Hirofumi; Tamura, Hitomi; Kawahara, Kenji; Oie, Yuji

    2009-01-01

    In this paper, we focus on the problem of tracking a moving target in a wireless sensor network (WSN), in which the capability of each sensor is relatively limited, to construct large-scale WSNs at a reasonable cost. We first propose two simple multi-point surveillance schemes for a moving target in a WSN and demonstrate that one of the schemes can achieve high tracking probability with low power consumption. In addition, we examine the relationship between tracking probability and sensor density through simulations, and then derive an approximate expression representing the relationship. As the results, we present guidelines for sensor density, tracking probability, and the number of monitoring sensors that satisfy a variety of application demands. PMID:22412326
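The kind of density-versus-tracking-probability relationship the paper derives can be illustrated with the textbook Poisson-deployment approximation: the chance that at least one sensor lies within sensing range of the target. This closed form is an illustrative stand-in, not the paper's own expression:

```python
import numpy as np

def tracking_probability(density, r):
    """Probability that at least one sensor lies within sensing range r of the
    target, for sensors deployed as a 2D Poisson process of the given density:
    p = 1 - exp(-density * pi * r^2)."""
    return 1.0 - np.exp(-density * np.pi * r**2)

def required_density(p, r):
    # Invert the relation to obtain a deployment guideline
    return -np.log(1.0 - p) / (np.pi * r**2)

rho = required_density(0.95, 10.0)  # density needed for 95% with r = 10 m
```

Such an inverse relation is exactly the kind of guideline the abstract mentions: given an application's demanded tracking probability and the sensing range, it yields the sensor density to deploy.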

  5. Assessment of X-point target divertor configuration for power handling and detachment front control

    Directory of Open Access Journals (Sweden)

    M.V. Umansky

    2017-08-01

    Full Text Available A study of long-legged tokamak divertor configurations is performed with the edge transport code UEDGE (Rognlien et al., J. Nucl. Mater. 196, 347, 1992). The model parameters are based on the ADX tokamak concept design (LaBombard et al., Nucl. Fusion 55, 053020, 2015). Several long-legged divertor configurations are considered, in particular the X-point target configuration proposed for ADX, and compared with a standard divertor. For otherwise identical conditions, a scan of the input power from the core plasma is performed. It is found that as the power is reduced to a threshold value, the plasma in the outer leg transitions to a fully detached state, which defines the upper limit on the power for detached divertor operation. Reducing the power further results in the detachment front shifting upstream but remaining stable. At low power the detachment front eventually moves to the primary X-point, which is usually associated with degradation of the core plasma, and this defines the lower limit on the power for detached divertor operation. For the studied parameters, the operating window for a detached divertor in the standard divertor configuration is very small, or even non-existent; under the same conditions the detached operating window for long-legged divertors is quite large, in particular for the X-point target configuration, allowing a factor of 5–10 variation in the input power. These modeling results point to the possibility of stable, fully detached divertor operation for a tokamak with extended divertor legs.

  6. Analysis of Stress Updates in the Material-point Method

    DEFF Research Database (Denmark)

    Andersen, Søren; Andersen, Lars

    2009-01-01

    The material-point method (MPM) is a new numerical method for analysis of large strain engineering problems. The MPM applies a dual formulation, where the state of the problem (mass, stress, strain, velocity etc.) is tracked using a finite set of material points while the governing equations...... are solved on a background computational grid. Several references state, that one of the main advantages of the material-point method is the easy application of complicated material behaviour as the constitutive response is updated individually for each material point. However, as discussed here, the MPM way...
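The per-point constitutive update highlighted above can be sketched in 1D: the velocity gradient is interpolated from the grid nodes through shape-function gradients, and each material point integrates its own stress. A linear-elastic toy step with invented values (the paper's discussion concerns the ordering of this update within the MPM cycle, which this fragment does not reproduce):

```python
import numpy as np

# Illustrative 1D linear-elastic update, applied per material point: each
# point carries and updates its own stress, independently of the others.
E = 1.0e7        # Young's modulus [Pa] (illustrative)
dt = 1.0e-4      # time step [s]
dx = 1.0         # grid spacing [m]
node_v = np.array([0.0, 0.01])   # grid node velocities after momentum solve

def shape_grad(xp):
    # Gradients of the 1D linear shape functions of the cell containing xp
    return np.array([-1.0, 1.0]) / dx

xp = np.array([0.25, 0.5, 0.75])   # material point positions in the cell
stress = np.zeros_like(xp)
for p in range(len(xp)):
    dvdx = shape_grad(xp[p]) @ node_v   # velocity gradient at the point
    dstrain = dvdx * dt                 # strain increment
    stress[p] += E * dstrain            # hypoelastic stress update
```

Whether this stress update happens before or after the grid momentum solve (USF versus USL/MUSL schemes) is precisely the choice whose consequences the paper analyzes.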

  7. CFD analysis of the HYPER spallation target

    International Nuclear Information System (INIS)

    Cho, Chungho; Tak, Nam-il; Choi, Jae-Hyuk; Lee, Yong-Bum

    2008-01-01

    KAERI (Korea Atomic Energy Research Institute) is developing an accelerator driven system (ADS) named HYPER (HYbrid Power Extraction Reactor) for a transmutation of long-lived nuclear wastes. One of the challenging tasks for the HYPER system is to design a large spallation target with a beam power of 15-25 MW. The paper focuses on a thermal-hydraulic analysis of the active part of the HYPER target. Computational fluid dynamics (CFD) analysis was performed by using a commercial code CFX 5.7.1. Several advanced turbulence models with different grid structures were applied. The CFX results reveal a significant impact of the turbulence model on the window temperature. Particularly, the k-ε model predicts the lowest window temperature among the five investigated turbulence models

  8. Uncertainty Prediction in Passive Target Motion Analysis

    Science.gov (United States)

    2016-05-12

    Patent application number 15/152,696, filed 12 May 2016; inventor John G. Baylog et al. The invention concerns uncertainty prediction in passive target motion analysis: a target at an unknown location follows an unknown course relative to an observer, and the observer carries a sensor array such as a passive sonar or radar.

  9. Research on Scheduling Algorithm for Multi-satellite and Point Target Task on Swinging Mode

    Science.gov (United States)

    Wang, M.; Dai, G.; Peng, L.; Song, Z.; Chen, G.

    2012-12-01

    Nowadays, using satellites in space to observe the ground is a major method of obtaining ground information. With the development of space science and technology, fields such as the military and the economy place ever greater demands on space technology because of the benefits satellites offer: wide coverage, timeliness, and freedom from area and national boundaries. At the same time, because of the wide use of various satellites, sensors, relay satellites and ground receiving stations, ground control systems now face great challenges. Therefore, how to make the best use of satellite resources becomes an important problem for ground control systems. Satellite scheduling distributes resources to all tasks without conflict, so as to complete as many tasks as possible and meet user requirements while respecting the constraints of satellites, sensors and ground receiving stations. Considering the size of a task, tasks can be divided into point tasks and area tasks; this paper considers only point targets. The paper first describes the satellite scheduling problem and briefly introduces the theory of satellite scheduling. We also analyze the resource and task constraints in satellite scheduling, and briefly describe the input and output flows of the scheduling process. On the basis of these analyses, we put forward a scheduling model, a multi-variable optimization model for multi-satellite, point-target tasks in swinging mode. In this model, the scheduling problem is transformed into a parametric optimization problem; the parameter to be optimized is the swinging angle of every time window.
In view of efficiency and accuracy, some important problems relating to satellite scheduling, such as the angle relation between satellites and ground targets, positive

  10. Mars Science Laboratory Frame Manager for Centralized Frame Tree Database and Target Pointing

    Science.gov (United States)

    Kim, Won S.; Leger, Chris; Peters, Stephen; Carsten, Joseph; Diaz-Calderon, Antonio

    2013-01-01

    The FM (Frame Manager) flight software module is responsible for maintaining the frame tree database containing coordinate transforms between frames. The frame tree is a proper tree structure of directed links, consisting of surface and rover subtrees. Actual frame transforms are updated by their owner. FM updates site and saved frames for the surface tree. As the rover drives to a new area, a new site frame with an incremented site index can be created. Several clients including ARM and RSM (Remote Sensing Mast) update their related rover frames that they own. Through the onboard centralized FM frame tree database, client modules can query transforms between any two frames. Important applications include target image pointing for RSM-mounted cameras and frame-referenced arm moves. The use of frame tree eliminates cumbersome, error-prone calculations of coordinate entries for commands and thus simplifies flight operations significantly.
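The central query the abstract describes, the transform between any two frames in the tree, can be sketched by storing each frame's parent-from-frame transform and composing through the root. A minimal sketch with homogeneous 4×4 matrices (frame names and transforms invented; this is not the FM flight code):

```python
import numpy as np

def translation(x, y, z):
    """Homogeneous 4x4 pure-translation transform."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

class FrameTree:
    """Minimal frame-tree database: each frame stores the transform from its
    parent; queries compose transforms through the tree root."""
    def __init__(self):
        self.parent = {"site": None}
        self.T = {"site": np.eye(4)}   # parent-from-frame transform

    def add(self, name, parent, T_parent_from_frame):
        self.parent[name] = parent
        self.T[name] = T_parent_from_frame

    def _root_from(self, name):
        T = np.eye(4)
        while name is not None:
            T = self.T[name] @ T
            name = self.parent[name]
        return T

    def transform(self, src, dst):
        """Return the dst-from-src transform."""
        return np.linalg.inv(self._root_from(dst)) @ self._root_from(src)

tree = FrameTree()
tree.add("rover", "site", translation(1.0, 2.0, 0.0))
tree.add("mast", "rover", translation(0.0, 0.0, 1.0))
# A point at the mast origin, expressed in site coordinates:
p_site = tree.transform("mast", "site") @ np.array([0.0, 0.0, 0.0, 1.0])
```

Centralizing the tree this way is exactly what removes the error-prone hand-composition of transforms that the abstract credits FM with eliminating.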

  11. Impact of target point deviations on control and complication probabilities in stereotactic radiosurgery of AVMs and metastases

    International Nuclear Information System (INIS)

    Treuer, Harald; Kocher, Martin; Hoevels, Moritz; Hunsche, Stefan; Luyken, Klaus; Maarouf, Mohammad; Voges, Juergen; Mueller, Rolf-Peter; Sturm, Volker

    2006-01-01

    Objective: Determination of the impact of inaccuracies in the determination and setup of the target point in stereotactic radiosurgery (SRS) on the expected complication and control probabilities. Methods: Two randomized samples of patients with arteriovenous malformation (AVM) (n = 20) and with brain metastases (n = 20) treated with SRS were formed, and the probability of complete obliteration (COP) or complete remission (CRP), the size of the 10 Gy-volume in the brain tissue (VOI10), and the probability of radiation necrosis (NTCP) were calculated. The dose-effect relations for COP and CRP were fitted to clinical data. Target point deviations were simulated through random vectors, and the resulting probabilities and volumes were calculated and compared with the values of the treatment plan. Results: The decrease of the relative value of the control probabilities at 1 mm target point deviation was up to 4% for AVMs and up to 10% for metastases. At 2 mm the median decrease was 5% for AVMs and 9% for metastases. The target point deviation at which COP and CRP decreased by about 0.05 in 90% of the cases was 1.3 mm. The increase of NTCP was at most 0.0025 per mm target point deviation for AVMs and 0.0035/mm for metastases. The maximal increase of VOI10 was 0.7 cm³/mm target point deviation in both patient groups. Conclusions: The upper limit for tolerable target point deviations is 1.3 mm. If this value cannot be achieved during the system test, a supplementary safety margin should be applied in the definition of the target volume. A better accuracy level is desirable in order to ensure optimal chances for the success of the treatment. The target point precision is less important for the minimization of the probability of radiation necrosis
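The sensitivity analysis described, control probability versus target-point deviation, can be sketched with a toy rotationally symmetric dose profile and a sigmoid dose-response curve. All parameters below are invented for illustration; they are not the paper's fitted models:

```python
import numpy as np

def dose(r_mm, sigma=20.0):
    # Toy radially symmetric relative dose profile around the isocentre
    return np.exp(-0.5 * (r_mm / sigma) ** 2)

def control_prob(min_dose, d50=0.7, gamma=20.0):
    # Illustrative sigmoid dose-response for the control probability
    return 1.0 / (1.0 + np.exp(-gamma * (min_dose - d50)))

target_radius = 10.0  # mm, illustrative spherical target

def tcp_with_shift(shift_mm):
    # The worst-covered target point lies opposite the shift direction,
    # at radius (target_radius + shift) from the displaced isocentre.
    return control_prob(dose(target_radius + shift_mm))

baseline = tcp_with_shift(0.0)
drop_1mm = baseline - tcp_with_shift(1.0)
drop_2mm = baseline - tcp_with_shift(2.0)
```

Even this toy model reproduces the qualitative finding: control probability falls by a few percent per millimetre of target-point deviation, and the loss grows with the shift.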

  12. Simulation of beam pointing stability on targeting plane of high power excimer laser system

    International Nuclear Information System (INIS)

    Wang Dahui; Zhao Xueqing; Zhang Yongsheng; Zheng Guoxin; Hu Yun; Zhao Jun

    2011-01-01

    Based on the characteristics of the image-relaying structure in a high-power excimer MOPA laser system, simulation and analysis software for the targeting beam's barycenter stability was developed using LabVIEW and MATLAB. Simulations were performed on the measured results of every optical component in the laboratory environment. Simulation and validation of the budget values for the optical components were carried out, and the system error budget was optimized through several rounds of re-allocation. It is shown that the targeting beam's barycenter stability under current laboratory conditions cannot satisfy the requirements, and that an index of 1.7 μrad can be allotted to the high-demand optical components when the indices of the low-demand optical components retain some stability margin. These results can guide the construction of the system, the design and machining of the optical components, and the optimization of the system. The optical components currently in use in the laboratory can satisfy the optimized distributed index, which reduces the demands on the structure to some extent. (authors)
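Error-budget allocation of this kind typically combines independent component jitters in quadrature. A sketch with hypothetical per-component values (the component names and numbers are invented; only the root-sum-square rule and the idea of comparing against an allocated budget are generic):

```python
import numpy as np

# Hypothetical pointing-jitter contributions (microradians) of relay optics;
# independent terms combine as a root-sum-square at the targeting plane.
component_jitter_urad = {
    "relay_mirror_1": 0.8,
    "relay_mirror_2": 0.6,
    "amplifier_window": 1.0,
    "final_focus_lens": 0.9,
}
total_urad = np.sqrt(sum(v ** 2 for v in component_jitter_urad.values()))
within_budget = total_urad <= 1.7   # compare against an allocated budget
```

Re-allocation then means tightening some component indices and relaxing others while keeping the quadrature total inside the system budget.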

  13. Molecular Composition Analysis of Distant Targets

    Science.gov (United States)

    Hughes, Gary B.; Lubin, Philip

    2017-01-01

    This document is the Final Report for NASA Innovative Advanced Concepts (NIAC) Phase I Grant 15-NIAC16A-0145, titled Molecular Composition Analysis of Distant Targets. The research was focused on developing a system concept for probing the molecular composition of cold solar system targets, such as Asteroids, Comets, Planets and Moons from a distant vantage, for example from a spacecraft that is orbiting the target (Hughes et al., 2015). The orbiting spacecraft is equipped with a high-power laser, which is run by electricity from photovoltaic panels. The laser is directed at a spot on the target. Materials on the surface of the target are heated by the laser beam, and begin to melt and then evaporate, forming a plume of asteroid molecules in front of the heated spot. The heated spot glows, producing blackbody illumination that is visible from the spacecraft, via a path through the evaporated plume. As the blackbody radiation from the heated spot passes through the plume of evaporated material, molecules in the plume absorb radiation in a manner that is specific to the rotational and vibrational characteristics of the specific molecules. A spectrometer aboard the spacecraft is used to observe absorption lines in the blackbody signal. The pattern of absorption can be used to estimate the molecular composition of materials in the plume, which originated on the target. Focusing on a single spot produces a borehole, and shallow subsurface profiling of the target's bulk composition is possible. At the beginning of the Phase I research, the estimated Technology Readiness Level (TRL) of the system was TRL-1. During the Phase I research, an end-to-end theoretical model of the sensor system was developed from first principles. The model includes laser energy and optical propagation, target heating, melting and evaporation of target material, plume density, thermal radiation from the heated spot, molecular cross section of likely asteroid materials, and estimation of the
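The sensing chain described above, blackbody emission from the heated spot attenuated by the plume, can be approximated by Planck's law combined with Beer-Lambert absorption. The cross section and column density values used in the example are placeholders, not measured asteroid data:

```python
import math

H = 6.62607015e-34   # Planck constant (J s)
C = 2.99792458e8     # speed of light (m/s)
KB = 1.380649e-23    # Boltzmann constant (J/K)

def planck(wavelength_m, temperature_k):
    """Spectral radiance of a blackbody, W sr^-1 m^-3."""
    a = 2.0 * H * C ** 2 / wavelength_m ** 5
    return a / (math.exp(H * C / (wavelength_m * KB * temperature_k)) - 1.0)

def transmitted(wavelength_m, temperature_k, sigma_m2, column_m2):
    """Blackbody signal after Beer-Lambert absorption through the plume:
    I = I0 * exp(-sigma * N), with sigma the molecular absorption cross
    section and N the line-of-sight column density."""
    return planck(wavelength_m, temperature_k) * math.exp(-sigma_m2 * column_m2)
```

The depth of an absorption line relative to the blackbody continuum then encodes sigma * N for the molecule responsible for that line.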

  14. Analysis of directly driven ICF targets

    International Nuclear Information System (INIS)

    Velarde, G.; Aragones, J.M.; Gago, J.A.

    1986-01-01

    The current capabilities at DENIM for the analysis of directly driven targets are presented. These include theoretical, computational and applied physical studies and developments of detailed simulation models for the most relevant processes in ICF. The simulation of directly driven ICF targets is carried out with the one-dimensional NORCLA code developed at DENIM. This code contains two main segments, NORMA and CLARA, able to work fully coupled and in an iterative manner. NORMA solves the hydrodynamic equations in a Lagrangian mesh. It has modular programs coupled to it to treat the laser or particle beam interaction with matter. Equations of state, opacities and conductivities are taken from a DENIM atomic data library, generated externally with other codes that will also be explained in this work. CLARA solves the transport equation for neutrons, as well as for charged particles and suprathermal electrons, using discrete ordinates and finite element methods in the computational procedure. Parametric calculations of multilayered single-shell targets driven by heavy ion beams are also analyzed. Finally, conclusions are focused on the ongoing developments in the areas of interest such as: radiation transport, atomic physics, particle in cell method, charged particle transport, two-dimensional calculations and instabilities. (author)

  15. Failure probability analysis on mercury target vessel

    International Nuclear Information System (INIS)

    Ishikura, Syuichi; Futakawa, Masatoshi; Kogawa, Hiroyuki; Sato, Hiroshi; Haga, Katsuhiro; Ikeda, Yujiro

    2005-03-01

    Failure probability analysis was carried out to estimate the lifetime of the mercury target to be installed in the JSNS (Japan spallation neutron source) at J-PARC (Japan Proton Accelerator Research Complex). The lifetime was estimated taking the loading conditions and materials degradation into account. The loads imposed on the target vessel are the static stresses due to thermal expansion and static pre-pressure on the He gas and mercury, and the dynamic stresses due to the thermally shocked pressure waves generated repeatedly at 25 Hz. The materials used in the target vessel will be degraded by fatigue, neutron and proton irradiation, mercury immersion, pitting damage, etc. The imposed stresses were evaluated through static and dynamic structural analyses. The material degradation was deduced from published experimental data. As a result, it was quantitatively confirmed that the failure probability over the design lifetime is very low, 10⁻¹¹, for the safety hull, meaning that it will hardly fail during the design lifetime. On the other hand, the beam window of the mercury vessel, which is subjected to high pressure waves, exhibits a failure probability of 12%. It was concluded, therefore, that mercury leaking from a failed area at the beam window can be adequately contained in the space between the safety hull and the mercury vessel by using mercury-leakage sensors. (author)
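A common simplification behind such failure probability estimates is stress-strength interference, in which the imposed stress and the degraded material strength are both treated as normal random variables and the failure probability is the chance that stress exceeds strength. This toy model is not the detailed structural analysis used for the JSNS target; it only illustrates the idea:

```python
import math

def failure_probability(stress_mean, stress_sd, strength_mean, strength_sd):
    """Stress-strength interference with normal distributions:
    P(failure) = P(strength - stress < 0) = Phi(-mu / sd), where
    mu = strength_mean - stress_mean and sd combines both spreads."""
    mu = strength_mean - stress_mean
    sd = math.hypot(stress_sd, strength_sd)
    return 0.5 * (1.0 + math.erf(-mu / (sd * math.sqrt(2.0))))
```

When the mean strength margin is large relative to the combined scatter, the probability is vanishingly small; as degradation erodes the margin, it climbs toward 0.5.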

  16. An Approach for Automatic Orientation of Big Point Clouds from the Stationary Scanners Based on the Spherical Targets

    Directory of Open Access Journals (Sweden)

    YAO Jili

    2015-04-01

    Full Text Available Terrestrial laser scanning (TLS) technology offers high-speed data acquisition, large point clouds and long measuring distances. However, it also has disadvantages such as limited range in target detection, lag in point cloud processing, low automation and poor suitability for long-distance topographic survey. To address this, we put forward a method for long-range target detection in the orientation of big point clouds. The method first searches for the point cloud rings that contain targets according to the engineering coordinate system. The detected rings are then divided into sectors so that the targets can be detected in a very short time and their central coordinates obtained. Finally, the position and orientation parameters of the scanner are calculated, and the point clouds in the scanner's own coordinate system (SOCS) are converted into the engineering coordinate system. The method can be applied on ordinary computers for long-distance topographic survey (with distances between scanner and targets ranging from 180 to 700 m) in mountainous areas, with target radii of 0.162 m.
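Once candidate points for a spherical target are isolated, its center is usually recovered with a least-squares sphere fit. A minimal algebraic fit, assuming NumPy is available (the sector-based ring search of the paper is not reproduced here):

```python
import numpy as np

def fit_sphere(points):
    """Algebraic least-squares sphere fit. From |p|^2 = 2 c.p + (r^2 - |c|^2),
    solve the linear system for the center c and the offset term, then
    recover the radius. Returns (center, radius)."""
    P = np.asarray(points, dtype=float)
    A = np.c_[2.0 * P, np.ones(len(P))]
    b = (P ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius
```

With noise-free points on a 0.162 m sphere the fit recovers center and radius essentially exactly; with real scans the residuals give a quality check on the target detection.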

  17. Colocalization coefficients evaluating the distribution of molecular targets in microscopy methods based on pointed patterns

    Czech Academy of Sciences Publication Activity Database

    Pastorek, Lukáš; Sobol, Margaryta; Hozák, Pavel

    2016-01-01

    Roč. 146, č. 4 (2016), s. 391-406 ISSN 0948-6143 R&D Projects: GA TA ČR(CZ) TE01020118; GA ČR GA15-08738S; GA MŠk(CZ) ED1.1.00/02.0109; GA MŠk(CZ) LM2015062 Grant - others:Human Frontier Science Program(FR) RGP0017/2013 Institutional support: RVO:68378050 Keywords : Colocalization * Quantitative analysis * Pointed patterns * Transmission electron microscopy * Manders' coefficients * Immunohistochemistry Subject RIV: EB - Genetics ; Molecular Biology Impact factor: 2.553, year: 2016

  18. Nuclear Security: Target Analysis-rev

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Surinder Paul [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Gibbs, Philip W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bultz, Garl A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2014-03-01

    The objectives of this presentation are to understand target identification, including roll-up and protracted theft; evaluate target identification in the SNRI; recognize the target characteristics and consequence levels; and understand graded safeguards.

  19. Effect of target color and scanning geometry on terrestrial LiDAR point-cloud noise and plane fitting

    Science.gov (United States)

    Bolkas, Dimitrios; Martinez, Aaron

    2018-01-01

    Point-cloud coordinate information derived from terrestrial Light Detection And Ranging (LiDAR) is important for several applications in surveying and civil engineering. Plane fitting and segmentation of target-surfaces is an important step in several applications such as in the monitoring of structures. Reliable parametric modeling and segmentation relies on the underlying quality of the point-cloud. Therefore, understanding how point-cloud errors affect fitting of planes and segmentation is important. Point-cloud intensity, which accompanies the point-cloud data, often goes hand-in-hand with point-cloud noise. This study uses industrial particle boards painted with eight different colors (black, white, grey, red, green, blue, brown, and yellow) and two different sheens (flat and semi-gloss) to explore how noise and plane residuals vary with scanning geometry (i.e., distance and incidence angle) and target-color. Results show that darker colors, such as black and brown, can produce point clouds that are several times noisier than bright targets, such as white. In addition, semi-gloss targets manage to reduce noise in dark targets by about 2-3 times. The study of plane residuals with scanning geometry reveals that, in many of the cases tested, residuals decrease with increasing incidence angles, which can assist in understanding the distribution of plane residuals in a dataset. Finally, a scheme is developed to derive survey guidelines based on the data collected in this experiment. Three examples demonstrate that users should consider instrument specification, required precision of plane residuals, required point-spacing, target-color, and target-sheen, when selecting scanning locations. Outcomes of this study can aid users to select appropriate instrumentation and improve planning of terrestrial LiDAR data-acquisition.
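The plane fitting and residual analysis used in studies like this one can be sketched with a PCA fit: the singular direction of smallest variance of the centered cloud is the plane normal, and the RMS of the projections onto it is the noise measure. A minimal sketch, not the authors' processing pipeline:

```python
import numpy as np

def plane_residuals(points):
    """Fit a plane to a point cloud by PCA and return (normal, rms_residual).
    The right-singular vector with the smallest singular value of the
    centered cloud is the plane normal; residuals are signed distances."""
    P = np.asarray(points, dtype=float)
    centered = P - P.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    residuals = centered @ normal
    return normal, float(np.sqrt(np.mean(residuals ** 2)))
```

Comparing the RMS residual across colors, sheens, ranges and incidence angles is exactly the kind of comparison the study reports.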

  20. Stability Analysis of Periodic Systems by Truncated Point Mappings

    Science.gov (United States)

    Guttalu, R. S.; Flashner, H.

    1996-01-01

    An approach is presented for deriving analytical stability and bifurcation conditions for systems with periodically varying coefficients. The method is based on a point mapping (period-to-period mapping) representation of the system's dynamics. An algorithm is employed to obtain an analytical expression for the point mapping and its dependence on the system's parameters. The algorithm is devised to derive the coefficients of a multinomial expansion of the point mapping up to an arbitrary order in terms of the state variables and of the parameters. Analytical stability and bifurcation conditions are then formulated and expressed as functional relations between the parameters. To demonstrate the application of the method, the parametric stability of Mathieu's equation and of a two-degree-of-freedom system are investigated. The results obtained by the proposed approach are compared to those obtained by perturbation analysis and by direct integration, which we consider to be the "exact solution". It is shown that, unlike perturbation analysis, the proposed method provides a very accurate solution even for large values of the parameters. If an expansion of the point mapping in terms of a small parameter is performed, the method is equivalent to perturbation analysis. Moreover, it is demonstrated that the method can be easily applied to multiple-degree-of-freedom systems using the same framework. This feature is an important advantage since most of the existing analysis methods apply mainly to single-degree-of-freedom systems and their extension to higher dimensions is difficult and computationally cumbersome.
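For the Mathieu example, the "exact solution" by direct integration amounts to computing the monodromy matrix over one period and checking, by Floquet theory, that the magnitude of its trace is below 2. A sketch of that reference computation (plain RK4 integration; this is not the truncated point-mapping algorithm itself):

```python
import math

def mathieu_monodromy_trace(a, q, steps=2000):
    """Integrate x'' + (a - 2 q cos 2t) x = 0 over one period (pi) with RK4
    for two independent initial conditions; returns the trace of the
    monodromy matrix. |trace| < 2 means the solutions are bounded."""
    T = math.pi
    h = T / steps

    def deriv(t, x, v):
        return v, -(a - 2.0 * q * math.cos(2.0 * t)) * x

    def propagate(x, v):
        t = 0.0
        for _ in range(steps):
            k1x, k1v = deriv(t, x, v)
            k2x, k2v = deriv(t + h / 2, x + h / 2 * k1x, v + h / 2 * k1v)
            k3x, k3v = deriv(t + h / 2, x + h / 2 * k2x, v + h / 2 * k2v)
            k4x, k4v = deriv(t + h, x + h * k3x, v + h * k3v)
            x += h / 6 * (k1x + 2 * k2x + 2 * k3x + k4x)
            v += h / 6 * (k1v + 2 * k2v + 2 * k3v + k4v)
            t += h
        return x, v

    x1, v1 = propagate(1.0, 0.0)   # first column of the monodromy matrix
    x2, v2 = propagate(0.0, 1.0)   # second column
    return x1 + v2

def is_stable(a, q):
    return abs(mathieu_monodromy_trace(a, q)) < 2.0
```

For q = 0 the trace reduces to 2 cos(pi sqrt(a)), which gives an easy check on the integrator; for q > 0 the function reproduces the familiar instability tongues.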

  1. Point of Care Testing Services Delivery: Policy Analysis using a ...

    African Journals Online (AJOL)

    Annals of Biomedical Sciences ... The service providers (hospital management) and the testing personnel are faced with the task of trying to explain these problems. Objective of the study: To critically do a policy analysis of the problems of point of care testing with the aim of identifying the causes of these problems and ...

  2. A mathematical analysis of multiple-target SELEX.

    Science.gov (United States)

    Seo, Yeon-Jung; Chen, Shiliang; Nilsen-Hamilton, Marit; Levine, Howard A

    2010-10-01

    SELEX (Systematic Evolution of Ligands by Exponential Enrichment) is a procedure by which a mixture of nucleic acids can be fractionated with the goal of identifying those with specific biochemical activities. One combines the mixture with a specific target molecule and then separates the target-NA complex from the resulting reactions. The target-NA complex is separated from the unbound NA by mechanical means (such as by filtration), the NA is eluted from the complex, amplified by PCR (polymerase chain reaction), and the process repeated. After several rounds, one should be left with the nucleic acids that best bind to the target. The problem was first formulated mathematically in Irvine et al. (J. Mol. Biol. 222:739-761, 1991). In Levine and Nilsen-Hamilton (Comput. Biol. Chem. 31:11-25, 2007), a mathematical analysis of the process was given. In Vant-Hull et al. (J. Mol. Biol. 278:579-597, 1998), multiple target SELEX was considered. It was assumed that each target has a single nucleic acid binding site that permits occupation by no more than one nucleic acid. Here, we revisit Vant-Hull et al. (J. Mol. Biol. 278:579-597, 1998) using the same assumptions. The iteration scheme is shown to be convergent and a simplified algorithm is given. Our interest here is in the behavior of the multiple target SELEX process as a discrete "time" dynamical system. Our goal is to characterize the limiting states and their dependence on the initial distribution of nucleic acid and target fraction components. (In multiple-target SELEX, we take the target component fractions, but not their concentrations, as fixed, and the initial pool of nucleic acids as a variable starting condition.) Given N nucleic acids and a target consisting of M subtarget component species, there is an M × N matrix of affinities, the (i,j) entry corresponding to the affinity of the jth nucleic acid for the ith subtarget. We give a structure condition on this matrix that is equivalent to the following
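The flavor of the iteration can be shown with a deliberately crude sketch: in a weak-binding limit, each round reweights the pool by a target-fraction-weighted affinity and renormalizes, so the pool concentrates on the best overall binder. This ignores the binding equilibrium and saturation treated in the paper, and the affinity matrix in the example is invented for illustration:

```python
import numpy as np

def selex_rounds(K, target_fractions, pool, rounds=20):
    """Toy multiple-target SELEX iteration in the weak-binding limit:
    each round, the pool fraction of nucleic acid j is reweighted by its
    target-fraction-weighted affinity sum_i t_i K[i, j] and renormalized
    (PCR amplification is assumed unbiased)."""
    K = np.asarray(K, dtype=float)            # K[i, j]: affinity of NA j for subtarget i
    t = np.asarray(target_fractions, dtype=float)
    f = np.asarray(pool, dtype=float)
    for _ in range(rounds):
        f = f * (t @ K)                       # capture weight per nucleic acid
        f = f / f.sum()                       # renormalize the pool
    return f
```

With two nucleic acids and two equally weighted subtargets, the species with the larger weighted affinity dominates the pool after a few rounds, which is the qualitative limiting behavior the analysis characterizes rigorously.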

  3. A systematic analysis of the Braitenberg vehicle 2b for point-like stimulus sources

    International Nuclear Information System (INIS)

    Rañó, Iñaki

    2012-01-01

    Braitenberg vehicles have been used experimentally for decades in robotics with limited empirical understanding. This paper presents the first mathematical model of the vehicle 2b, displaying so-called aggression behaviour, and analyses the possible trajectories for point-like smooth stimulus sources. This sensory-motor steering control mechanism is used to implement biologically grounded target approach, target-seeking or obstacle-avoidance behaviour. However, the analysis of the resulting model reveals that complex and unexpected trajectories can result even for point-like stimuli. We also prove how the implementation of the controller and the vehicle morphology interact to affect the behaviour of the vehicle. This work provides a better understanding of Braitenberg vehicle 2b, explains experimental results and paves the way for a formally grounded application in robotics as well as for a new way of understanding target seeking in biology. (paper)
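A vehicle-2b simulation for a point-like source is easy to set up: crossed excitatory sensor-wheel connections on a differential-drive base, so the wheel opposite the more stimulated sensor turns faster and the vehicle accelerates toward the stimulus. The body dimensions, the 1/(1+d²) stimulus law and the gain below are assumptions for illustration, not the paper's model:

```python
import math

def simulate_vehicle_2b(source, pose, gain=1.0, dt=0.01, steps=2000):
    """Braitenberg vehicle 2b ('aggression'): crossed excitatory connections
    on a differential-drive base with a point stimulus of intensity
    1/(1 + d^2). Returns the trajectory of body positions (x, y)."""
    x, y, theta = pose
    half_axle, offset = 0.1, 0.1          # wheel half-separation, sensor offset (assumed)
    traj = []
    for _ in range(steps):
        intensities = []
        for sign in (+1.0, -1.0):         # left (+) and right (-) sensor
            sx = x + offset * math.cos(theta) - sign * half_axle * math.sin(theta)
            sy = y + offset * math.sin(theta) + sign * half_axle * math.cos(theta)
            d2 = (sx - source[0]) ** 2 + (sy - source[1]) ** 2
            intensities.append(1.0 / (1.0 + d2))
        s_left, s_right = intensities
        v_left, v_right = gain * s_right, gain * s_left   # crossed excitation
        v = 0.5 * (v_left + v_right)                      # forward speed
        omega = (v_right - v_left) / (2.0 * half_axle)    # turn rate
        x += v * math.cos(theta) * dt
        y += v * math.sin(theta) * dt
        theta += omega * dt
        traj.append((x, y))
    return traj
```

Started slightly off-axis from the source, the vehicle turns toward it and closes in at increasing speed, the "aggression" behaviour; varying the sensor offset and axle width shows the morphology-controller interaction the paper analyses.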

  4. Scattering analysis of point processes and random measures

    International Nuclear Information System (INIS)

    Hanisch, K.H.

    1984-01-01

    In the present paper scattering analysis of point processes and random measures is studied. Known formulae which connect the scattering intensity with the pair distribution function of the studied structures are proved in a rigorous manner with tools of the theory of point processes and random measures. For some special fibre processes the scattering intensity is computed. For a class of random measures, namely for 'grain-germ-models', a new formula is proved which yields the pair distribution function of the 'grain-germ-model' in terms of the pair distribution function of the underlying point process (the 'germs') and of the mean structure factor and the mean squared structure factor of the particles (the 'grains'). (author)

  5. Process for structural geologic analysis of topography and point data

    Science.gov (United States)

    Eliason, Jay R.; Eliason, Valerie L. C.

    1987-01-01

    A quantitative method of geologic structural analysis of digital terrain data is described for implementation on a computer. Assuming selected valley segments are controlled by the underlying geologic structure, topographic lows in the terrain data, defining valley bottoms, are detected, filtered and accumulated into a series of line segments defining contiguous valleys. The line segments are then vectorized to produce vector segments, defining valley segments, which may be indicative of the underlying geologic structure. Coplanar analysis is performed on vector segment pairs to determine which vectors produce planes which represent underlying geologic structure. Point data such as fracture phenomena which can be related to fracture planes in 3-dimensional space can be analyzed to define common plane orientation and locations. The vectors, points, and planes are displayed in various formats for interpretation.

  6. SPATIAL ANALYSIS TO SUPPORT GEOGRAPHIC TARGETING OF GENOTYPES TO ENVIRONMENTS

    Directory of Open Access Journals (Sweden)

    Glenn eHyman

    2013-03-01

    Full Text Available Crop improvement efforts have benefited greatly from advances in available data, computing technology and methods for targeting genotypes to environments. These advances support the analysis of genotype by environment interactions to understand how well a genotype adapts to environmental conditions. This paper reviews the use of spatial analysis to support crop improvement research aimed at matching genotypes to their most appropriate environmental niches. Better data sets are now available on soils, weather and climate, elevation, vegetation, crop distribution and local conditions where genotypes are tested in experimental trial sites. The improved data are now combined with spatial analysis methods to compare environmental conditions across sites, create agro-ecological region maps and assess environment change. Climate, elevation and vegetation data sets are now widely available, supporting analyses that were much more difficult even five or ten years ago. While detailed soil data for many parts of the world remains difficult to acquire for crop improvement studies, new advances in digital soil mapping are likely to improve our capacity. Site analysis and matching and regional targeting methods have advanced in parallel to data and technology improvements. All these developments have increased our capacity to link genotype to phenotype and point to a vast potential to improve crop adaptation efforts.

  7. Spectrography analysis of stainless steel by the point to point technique

    International Nuclear Information System (INIS)

    Bona, A.

    1986-01-01

    A method is presented for the determination of the elements Ni, Cr, Mn, Si, Mo, Nb, Cu, Co and V in stainless steel by emission spectrographic analysis using high-voltage spark sources, employing the 'point-to-point' technique. The experimental parameters were optimized as a compromise between detection sensitivity and measurement precision. The parameters investigated were the high-voltage capacitance, the inductance, the analytical and auxiliary gaps, the pre-burn spark period and the exposure time. The edge shape of the counter electrodes and the type of polishing and diameter variation of the stainless steel electrodes were evaluated in preliminary assays. In addition, the degradation of the chemical power of the developer was also investigated. Counter electrodes of graphite, copper, aluminium and iron were employed, and the counter electrode itself was used as an internal standard; in the case of graphite counter electrodes, the iron lines were employed as the internal standard. The relative errors were the criteria for evaluation of these experiments. The National Bureau of Standards certified reference stainless steel standards and the Eletrometal Acos Finos S.A. samples (certified by the supplier) were employed for drawing up the calibration systems and analytical curves. The best results were obtained using the conventional graphite counter electrodes. The inaccuracy and the imprecision of the proposed method varied from 2% to 15% and from 1% to 9%, respectively. The present technique was compared to other instrumental techniques such as inductively coupled plasma, X-ray fluorescence and neutron activation analysis, and the advantages and disadvantages of each were discussed. (author) [pt

  8. Screw compressor analysis from a vibration point-of-view

    Science.gov (United States)

    Hübel, D.; Žitek, P.

    2017-09-01

    Vibrations are a very typical feature of all compressors and are given great attention in the industry. The reason for this interest is primarily the negative influence that it can have on both the operating staff and the entire machine's service life. The purpose of this work is to describe the methodology of screw compressor analysis from a vibration point-of-view. This analysis is an essential part of the design of vibro-diagnostics of screw compressors with regard to their service life.

  9. Point Climat no. 21 'Regional wind power plans: is there enough wind to reach the Grenelle wind power targets?'

    International Nuclear Information System (INIS)

    Bordier, Cecile; Charentenay, Jeremie de

    2012-01-01

    Among the publications of CDC Climat Research, 'Climate Briefs' presents, in a few pages, hot topics in climate change policy. This issue addresses the following points: Regional wind power plans assess the wind power development potential of every French region. The aggregate regional potential largely exceeds national targets for 2020. However, achieving these targets is still far from guaranteed: the forecasted potential is theoretical, and the issues involved in implementing wind power projects on the ground will likely reduce this potential

  10. Growth Curve Analysis and Change-Points Detection in Extremes

    KAUST Repository

    Meng, Rui

    2016-05-15

    The thesis consists of two coherent projects. The first project presents the results of evaluating salinity tolerance in barley using growth curve analysis where different growth trajectories are observed within barley families. The study of salinity tolerance in plants is crucial to understanding plant growth and productivity. Because fully-automated smarthouses with conveyor systems allow non-destructive and high-throughput phenotyping of large numbers of plants, it is now possible to apply advanced statistical tools to analyze daily measurements and to study salinity tolerance. To compare different growth patterns of barley variates, we use functional data analysis techniques to analyze the daily projected shoot areas. In particular, we apply the curve registration method to align all the curves from the same barley family in order to summarize the family-wise features. We also illustrate how to use statistical modeling to account for spatial variation in microclimate in smarthouses and for temporal variation across runs, which is crucial for identifying traits of the barley variates. In our analysis, we show that the concentrations of sodium and potassium in leaves are negatively correlated, and their interactions are associated with the degree of salinity tolerance. The second project studies change-point detection methods in extremes when multiple time series data are available. Motivated by the scientific question of whether the chances of experiencing extreme weather differ across the seasons of a year, we develop a change-point detection model to study changes in extremes or in the tail of a distribution. Most existing models identify seasons from multiple yearly time series assuming a season or a change-point location remains exactly the same across years. In this work, we propose a random effect model that allows the change-point to vary from year to year, following a given distribution. Both parametric and nonparametric methods are developed
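The fixed-change-point building block underlying such models is the classical least-squares split of a series into two mean segments. A sketch of that step only (the random-effects model across years developed in the thesis is not reproduced):

```python
import numpy as np

def best_change_point(x):
    """Least-squares single change-point in the mean: return the split index
    k that minimizes the within-segment sum of squares of x[:k] and x[k:]
    (one binary-segmentation step). Requires at least 2 points per segment."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    best_k, best_cost = None, np.inf
    for k in range(2, n - 1):
        left, right = x[:k], x[k:]
        cost = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if cost < best_cost:
            best_k, best_cost = k, cost
    return best_k
```

Allowing the detected k to vary from year to year, with a distribution over its location, is the random-effect extension the thesis proposes for extremes.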

  11. Integrated modeling and analysis methodology for precision pointing applications

    Science.gov (United States)

    Gutierrez, Homero L.

    2002-07-01

    Space-based optical systems that perform tasks such as laser communications, Earth imaging, and astronomical observations require precise line-of-sight (LOS) pointing. A general approach is described for integrated modeling and analysis of these types of systems within the MATLAB/Simulink environment. The approach can be applied during all stages of program development, from early conceptual design studies to hardware implementation phases. The main objective is to predict the dynamic pointing performance subject to anticipated disturbances and noise sources. Secondary objectives include assessing the control stability, levying subsystem requirements, supporting pointing error budgets, and performing trade studies. The integrated model resides in Simulink, and several MATLAB graphical user interfaces (GUIs) allow the user to configure the model, select analysis options, run analyses, and process the results. A convenient parameter naming and storage scheme, as well as model conditioning and reduction tools and run-time enhancements, are incorporated into the framework. This enables the proposed architecture to accommodate models of realistic complexity.

  12. Point Cloud Analysis for Conservation and Enhancement of Modernist Architecture

    Science.gov (United States)

    Balzani, M.; Maietti, F.; Mugayar Kühl, B.

    2017-02-01

    Documentation of cultural assets through improved acquisition processes for advanced 3D modelling is one of the main challenges to be faced in order to address, through digital representation, advanced analysis on shape, appearance and conservation condition of cultural heritage. 3D modelling can originate new avenues in the way tangible cultural heritage is studied, visualized, curated, displayed and monitored, improving key features such as analysis and visualization of material degradation and state of conservation. An applied research project focused on the analysis of surface specifications and material properties by means of 3D laser scanner survey has been developed within the project of Digital Preservation of the FAUUSP building, Faculdade de Arquitetura e Urbanismo da Universidade de São Paulo, Brazil. The integrated 3D survey has been performed by the DIAPReM Center of the Department of Architecture of the University of Ferrara in cooperation with the FAUUSP. The 3D survey has allowed the realization of a point cloud model of the external surfaces, as the basis to investigate in detail the formal characteristics, geometric textures and surface features. The digital geometric model was also the basis for processing the intensity values acquired by the laser scanning instrument; this method of analysis was an essential complement to the macroscopic investigations in order to manage additional information related to surface characteristics displayable on the point cloud.

  13. Target Audience of Live Opera Transmissions to Cinema Theatres from the Marketing Point of View

    Directory of Open Access Journals (Sweden)

    Radek Tahal

    2016-03-01

    Full Text Available Opera has a famous history and even the present-day repertoire in opera houses mostly consists of classical and well-known works. Marketers are trying to find new ways that would enable opera lovers all over the world to enjoy top quality performances. One of the most successful models is real-time transmissions of operas to geographically remote cinemas. Cinemas from all around the world participate in the project. In this paper, the authors analyze the spectators' profile and point out differences between North America and the Czech Republic, focusing on transmissions of performances by the Metropolitan Opera in New York. The authors submit a detailed analysis of the socio-demographic characteristics of the spectators and the attendance frequency. Special attention is paid to the marketing profile of Czech spectators, based on primary data gathered in the research. The paper is a combination of research report and business case study. The study reveals that female visitors prevail. Elderly people are also represented in high percentages. The spectators are characterized by refined taste in their lifestyles and familiarity with modern technology.

  14. Assisting People with Multiple Disabilities by Improving Their Computer Pointing Efficiency with an Automatic Target Acquisition Program

    Science.gov (United States)

    Shih, Ching-Hsiang; Shih, Ching-Tien; Peng, Chin-Ling

    2011-01-01

    This study evaluated whether two people with multiple disabilities would be able to improve their pointing performance through an Automatic Target Acquisition Program (ATAP) and a newly developed mouse driver (i.e. a new mouse driver replaces standard mouse driver, and is able to monitor mouse movement and intercept click action). Initially, both…

  15. Percolation analysis for cosmic web with discrete points

    Science.gov (United States)

    Zhang, Jiajun; Cheng, Dalong; Chu, Ming-Chung

    2018-01-01

    Percolation analysis has long been used to quantify the connectivity of the cosmic web. Most of the previous work is based on density fields on grids. By smoothing into fields, we lose information about galaxy properties like shape or luminosity. The lack of mathematical modeling also limits our understanding of the percolation analysis. To overcome these difficulties, we have studied percolation analysis based on discrete points. Using a friends-of-friends (FoF) algorithm, we generate the S-b_b relation between the fractional mass of the largest connected group (S) and the FoF linking length (b_b). We propose a new model, the probability cloud cluster expansion theory, to relate the S-b_b relation with correlation functions. We show that the S-b_b relation reflects a combination of all orders of correlation functions. Using N-body simulation, we find that the S-b_b relation is robust against redshift distortion and incompleteness in observation. From the Bolshoi simulation, with halo abundance matching (HAM), we have generated a mock galaxy catalog. Good matching of the projected two-point correlation function with observation is confirmed. However, comparing the mock catalog with the latest galaxy catalog from Sloan Digital Sky Survey (SDSS) Data Release 12 (DR12), we have found significant differences in their S-b_b relations. This indicates that the mock galaxy catalog cannot accurately retain higher-order correlation functions than the two-point correlation function, which reveals the limit of the HAM method. As a new measurement, the S-b_b relation is applicable to a wide range of data types, fast to compute, robust against redshift distortion and incompleteness, and contains information on all orders of correlation functions.
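The S-b_b relation can be computed directly from discrete points with a friends-of-friends pass at each linking length, e.g. with a simple union-find. A brute-force O(n²) sketch (real catalogs would need a spatial tree for the neighbor search):

```python
import numpy as np
from collections import Counter

def largest_group_fraction(points, b):
    """Friends-of-friends grouping at linking length b using union-find;
    returns S, the fraction of points in the largest connected group."""
    P = np.asarray(points, dtype=float)
    n = len(P)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    for i in range(n):
        d2 = ((P[i + 1:] - P[i]) ** 2).sum(axis=1)
        for j in np.nonzero(d2 <= b * b)[0] + i + 1:
            ri, rj = find(i), find(int(j))
            if ri != rj:
                parent[ri] = rj
    counts = Counter(find(i) for i in range(n))
    return max(counts.values()) / n

def s_bb_relation(points, linking_lengths):
    """S as a function of the linking length b_b: monotonically rises from
    1/n (all isolated) to 1 (fully percolated)."""
    return [largest_group_fraction(points, b) for b in linking_lengths]
```

Sweeping b_b from below the typical nearest-neighbor distance to above it traces the percolation transition of the point set.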

  16. Classification of Phase Transitions by Microcanonical Inflection-Point Analysis

    Science.gov (United States)

    Qi, Kai; Bachmann, Michael

    2018-05-01

    By means of the principle of minimal sensitivity we generalize the microcanonical inflection-point analysis method by probing derivatives of the microcanonical entropy for signals of transitions in complex systems. A strategy of systematically identifying and locating independent and dependent phase transitions of any order is proposed. The power of the generalized method is demonstrated in applications to the ferromagnetic Ising model and a coarse-grained model for polymer adsorption onto a substrate. The results shed new light on the intrinsic phase structure of systems with cooperative behavior.
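The basic signal probed by inflection-point analysis is an inflection of the microcanonical entropy S(E), i.e. a local extremum of the inverse temperature β(E) = dS/dE. A numerical sketch on sampled entropy data (the generalized higher-derivative criteria introduced in the paper are not implemented here):

```python
import numpy as np

def inflection_points(E, S):
    """Locate inflection points of the microcanonical entropy S(E):
    energies where beta(E) = dS/dE has a strict local extremum,
    detected as a sign change of successive differences of beta."""
    beta = np.gradient(S, E)
    idx = [i for i in range(1, len(E) - 1)
           if (beta[i] - beta[i - 1]) * (beta[i + 1] - beta[i]) < 0]
    return [E[i] for i in idx]
```

On a synthetic entropy with a single inflection the detector returns exactly that energy; with simulation data, S(E) would first be estimated from the density of states and smoothed.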

  17. Relatively Inexact Proximal Point Algorithm and Linear Convergence Analysis

    Directory of Open Access Journals (Sweden)

    Ram U. Verma

    2009-01-01

    Full Text Available Based on a notion of relatively maximal (m)-relaxed monotonicity, the approximation solvability of a general class of inclusion problems is discussed, while generalizing Rockafellar's theorem (1976) on linear convergence using the proximal point algorithm in a real Hilbert space setting. The convergence analysis based on this new model is simpler and more compact than that of the celebrated technique of Rockafellar, in which the Lipschitz continuity at 0 of the inverse of the set-valued mapping is applied. Furthermore, it can be used to generalize the Yosida approximation, which, in turn, can be applied to first-order evolution equations as well as evolution inclusions.
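
    Rockafellar-style linear convergence is easy to see in a one-dimensional sketch. The operator A(x) = a·x and the step size below are illustrative choices, not taken from the paper; for this strongly monotone A the resolvent contracts by a fixed factor every step.

    ```python
    # Proximal point iteration x_{k+1} = (I + c*A)^{-1}(x_k) for the strongly
    # monotone operator A(x) = a*x; the resolvent is x / (1 + c*a), so the
    # error shrinks by the constant factor 1 / (1 + c*a) at every step.
    a, c = 2.0, 1.0
    x = 1.0
    ratios = []
    for _ in range(10):
        x_next = x / (1.0 + c * a)   # resolvent (proximal) step
        ratios.append(abs(x_next) / abs(x))
        x = x_next
    ```

    Every ratio equals 1/3 here, which is exactly the linear convergence the record's analysis establishes in the general Hilbert space setting.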

  18. Analysis on Single Point Vulnerabilities of Plant Control System

    International Nuclear Information System (INIS)

    Chi, Moon Goo; Lee, Eun Chan; Bae, Yeon Kyoung

    2011-01-01

    The Plant Control System (PCS) is a system that controls pumps, valves, dampers, etc. in nuclear power plants with an OPR-1000 design. When there is a failure or spurious actuation of the critical components in the PCS, it can result in unexpected plant trips or transients. From this viewpoint, single point vulnerabilities are evaluated in detail using failure mode and effect analyses (FMEA) and fault tree analyses (FTA). This evaluation demonstrates that the PCS has many vulnerable components, and the analysis results are provided for OPR-1000 plants for reliability improvements that can reduce their vulnerabilities.

  19. Analysis on Single Point Vulnerabilities of Plant Control System

    Energy Technology Data Exchange (ETDEWEB)

    Chi, Moon Goo; Lee, Eun Chan; Bae, Yeon Kyoung [Korea Hydro and Nuclear Power Co., Daejeon (Korea, Republic of)

    2011-08-15

    The Plant Control System (PCS) is a system that controls pumps, valves, dampers, etc. in nuclear power plants with an OPR-1000 design. When there is a failure or spurious actuation of the critical components in the PCS, it can result in unexpected plant trips or transients. From this viewpoint, single point vulnerabilities are evaluated in detail using failure mode and effect analyses (FMEA) and fault tree analyses (FTA). This evaluation demonstrates that the PCS has many vulnerable components, and the analysis results are provided for OPR-1000 plants for reliability improvements that can reduce their vulnerabilities.

  20. Analysis of a simple pendulum driven at its suspension point

    International Nuclear Information System (INIS)

    Yoshida, S; Findley, T

    2005-01-01

    To familiarize undergraduate students with the dynamics of a damped driven harmonic oscillator, a simple pendulum was set up and driven at its suspension point under different damping conditions. From the time domain analysis, the decay constant was estimated and used to predict the frequency response. The simple pendulum was then driven at a series of frequencies near the resonance. By measuring the maximum amplitude at each driving frequency, the frequency response was determined. With one free parameter, which was determined under the first damping condition, the predicted frequency responses showed good agreement with the measured frequency responses under all damping conditions.
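
    The prediction step in this record can be sketched directly: with a decay constant estimated from the free ring-down, the steady-state response of a damped driven oscillator is a resonance curve peaked near the natural frequency. The numbers below are invented for illustration, not the experiment's values.

    ```python
    import numpy as np

    # Assumed oscillator parameters: natural frequency f0 [Hz] and decay
    # constant gamma [1/s], as would be fitted from the ring-down envelope.
    f0, gamma = 1.5, 0.05
    w0 = 2.0 * np.pi * f0

    def amplitude(f, drive=1.0):
        """Steady-state amplitude of a damped driven harmonic oscillator."""
        w = 2.0 * np.pi * f
        return drive / np.sqrt((w0**2 - w**2) ** 2 + (2.0 * gamma * w) ** 2)

    freqs = np.linspace(1.3, 1.7, 401)
    response = amplitude(freqs)
    f_peak = freqs[np.argmax(response)]   # ~f0 for weak damping
    ```

    Comparing such predicted curves against measured maximum amplitudes at each drive frequency is exactly the check the record describes.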

  1. Neutron performance analysis for ESS target proposal

    International Nuclear Information System (INIS)

    Magán, M.; Terrón, S.; Thomsen, K.; Sordo, F.; Perlado, J.M.; Bermejo, F.J.

    2012-01-01

    In the course of discussing different target types for their suitability in the European Spallation Source (ESS), one main focus was on neutronics performance. Diverse concepts have been assessed, baselining some preliminary engineering and geometrical details and including some optimization. With the restrictions and resulting uncertainty imposed by the lack of detailed design optimization at the time of compiling this paper, the conclusion drawn is basically that there is little difference in the neutronic yield of the investigated targets. Other criteria like safety, environmental compatibility, reliability and cost will thus dominate the choice of an ESS target.

  2. Molecular Composition Analysis of Distant Targets

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose a system capable of probing the molecular composition of cold solar system targets such as asteroids, comets, planets and moons from a distant vantage....

  3. An introduction to nonlinear analysis and fixed point theory

    CERN Document Server

    Pathak, Hemant Kumar

    2018-01-01

    This book systematically introduces the theory of nonlinear analysis, providing an overview of topics such as geometry of Banach spaces, differential calculus in Banach spaces, monotone operators, and fixed point theorems. It also discusses degree theory, nonlinear matrix equations, control theory, differential and integral equations, and inclusions. The book presents surjectivity theorems, variational inequalities, stochastic game theory and mathematical biology, along with a large number of applications of these theories in various other disciplines. Nonlinear analysis is characterised by its applications in numerous interdisciplinary fields, ranging from engineering to space science, hydromechanics to astrophysics, chemistry to biology, theoretical mechanics to biomechanics and economics to stochastic game theory. Organised into ten chapters, the book shows the elegance of the subject and its deep-rooted concepts and techniques, which provide the tools for developing more realistic and accurate models for ...

  4. Tipping point analysis of a large ocean ambient sound record

    Science.gov (United States)

    Livina, Valerie N.; Harris, Peter; Brower, Albert; Wang, Lian; Sotirakopoulos, Kostas; Robinson, Stephen

    2017-04-01

    We study a long (2003-2015) high-resolution (250 Hz) sound pressure record provided by the Comprehensive Nuclear-Test-Ban Treaty Organisation (CTBTO) from the hydro-acoustic station Cape Leeuwin (Australia). We transform the hydrophone waveforms into five bands of 10-min-average sound pressure levels (including the third-octave band) and apply tipping point analysis techniques [1-3]. We report the results of the analysis of fluctuations and trends in the data and discuss the big-data challenges in processing this record, including handling data segments of large size and possible HPC solutions. References: [1] Livina et al, GRL 2007, [2] Livina et al, Climate of the Past 2010, [3] Livina et al, Chaos 2015.
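
    A standard tipping-point indicator of the kind cited here is rising lag-1 autocorrelation in sliding windows. The surrogate below is generated so its memory grows with time; the AR(1) model and window length are illustrative stand-ins, not the hydro-acoustic data.

    ```python
    import numpy as np

    def lag1_autocorr(x):
        """Lag-1 autocorrelation, a common early-warning indicator."""
        x = x - x.mean()
        return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))

    # AR(1) surrogate whose coefficient drifts upward, mimicking the loss
    # of resilience that tipping-point analysis looks for in real records.
    rng = np.random.default_rng(1)
    n, win = 20000, 2000
    phi = np.linspace(0.1, 0.9, n)
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi[t] * x[t - 1] + rng.standard_normal()

    indicator = [lag1_autocorr(x[i:i + win]) for i in range(0, n - win + 1, win)]
    # A sustained rise in the indicator is the early-warning signal.
    ```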

  5. Uncertainty analysis of point by point sampling complex surfaces using touch probe CMMs

    DEFF Research Database (Denmark)

    Barini, Emanuele; Tosello, Guido; De Chiffre, Leonardo

    2007-01-01

    The paper describes a study concerning point by point scanning of complex surfaces using tactile CMMs. A four factors-two level full factorial experiment was carried out, involving measurements on a complex surface configuration item comprising a sphere, a cylinder and a cone, combined in a singl...

  6. Throughput analysis of point-to-multi-point hybrid FSO/RF network

    KAUST Repository

    Rakia, Tamer; Gebali, Fayez; Yang, Hong-Chuan; Alouini, Mohamed-Slim

    2017-01-01

    This paper presents and analyzes a point-to-multi-point (P2MP) network that uses a number of free-space optical (FSO) links for data transmission from the central node to the different remote nodes. A common backup radio-frequency (RF) link is used

  7. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Science.gov (United States)

    2010-04-01

    ... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF... Provisions § 123.6 Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan. (a) Hazard... fish or fishery product being processed in the absence of those controls. (b) The HACCP plan. Every...

  8. Single Point Vulnerability Analysis of Automatic Seismic Trip System

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Seo Bin; Chung, Soon Il; Lee, Yong Suk [FNC Technology Co., Yongin (Korea, Republic of); Choi, Byung Pil [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    Single Point Vulnerability (SPV) analysis is a process used to identify individual equipment whose failure alone will result in a reactor trip, turbine generator failure, or power reduction of more than 50%. The Automatic Seismic Trip System (ASTS) is a newly installed system to ensure the safety of the plant when an earthquake occurs. Since this system directly shuts down the reactor, the failure or malfunction of its components can cause a reactor trip more frequently than other systems. Therefore, an SPV analysis of ASTS is necessary to maintain its essential performance. To analyze SPV for ASTS, failure mode and effect analysis (FMEA) and fault tree analysis (FTA) were performed to select the SPV equipment of ASTS. The D/O, D/I, and A/I cards, seismic sensors, and trip relays had an effect on the reactor trip, but no single failure among them will cause a reactor trip. In conclusion, ASTS is excluded as an SPV. These results can be utilized as basis data for ways to enhance facility reliability, such as design modification and improvement of preventive maintenance procedures.
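
    The FMEA/FTA screening reduces to one question per basic event: does its failure alone raise the top event? A toy fault tree makes this concrete; the gate structure below is hypothetical, not the actual ASTS design.

    ```python
    # Toy single-point-vulnerability screen: fail each basic event alone and
    # check whether the top event (reactor trip) occurs. The AND gate models
    # two redundant trains; each train fails if its card OR its relay fails.

    def top_event(failed):
        train_a = failed["card_a"] or failed["relay_a"]
        train_b = failed["card_b"] or failed["relay_b"]
        return train_a and train_b

    events = ["card_a", "relay_a", "card_b", "relay_b"]
    spv = [e for e in events if top_event({k: (k == e) for k in events})]
    # With redundancy, no single failure trips the plant, so spv is empty,
    # mirroring the record's conclusion that ASTS has no SPV equipment.
    ```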

  9. Single Point Vulnerability Analysis of Automatic Seismic Trip System

    International Nuclear Information System (INIS)

    Oh, Seo Bin; Chung, Soon Il; Lee, Yong Suk; Choi, Byung Pil

    2016-01-01

    Single Point Vulnerability (SPV) analysis is a process used to identify individual equipment whose failure alone will result in a reactor trip, turbine generator failure, or power reduction of more than 50%. The Automatic Seismic Trip System (ASTS) is a newly installed system to ensure the safety of the plant when an earthquake occurs. Since this system directly shuts down the reactor, the failure or malfunction of its components can cause a reactor trip more frequently than other systems. Therefore, an SPV analysis of ASTS is necessary to maintain its essential performance. To analyze SPV for ASTS, failure mode and effect analysis (FMEA) and fault tree analysis (FTA) were performed to select the SPV equipment of ASTS. The D/O, D/I, and A/I cards, seismic sensors, and trip relays had an effect on the reactor trip, but no single failure among them will cause a reactor trip. In conclusion, ASTS is excluded as an SPV. These results can be utilized as basis data for ways to enhance facility reliability, such as design modification and improvement of preventive maintenance procedures.

  10. NIF pointing and centering systems and target alignment using a 351 nm laser source

    International Nuclear Information System (INIS)

    Boege, S.J.; Bliss, E.S.; Chocol, C.J.; Holdener, F.R.; Miller, J.L.; Toeppen, J.S.; Vann, C.S.; Zacharias, R.A.

    1996-10-01

    The operational requirements of the National Ignition Facility (NIF) place tight constraints upon its alignment system. In general, the alignment system must establish and maintain the correct relationships between beam position, beam angle, laser component clear apertures, and the target. At the target, this includes adjustment of beam focus to obtain the correct spot size. This must be accomplished for all beamlines in a time consistent with planned shot rates and yet, in the front end and main laser, beam control functions cannot be initiated until the amplifiers have sufficiently cooled so as to minimize dynamic thermal distortions during and after alignment and wavefront optimization. The scope of the task dictates an automated system that implements parallel processes. We describe reticle choices and other alignment references, insertion of alignment beams, principles of operation of the Chamber Center Reference System 2048 and Target Alignment Sensor, and the anticipated alignment sequence that will occur between shots

  11. Spatially explicit spectral analysis of point clouds and geospatial data

    Science.gov (United States)

    Buscombe, Daniel D.

    2015-01-01

    The increasing use of spatially explicit analyses of high-resolution spatially distributed data (imagery and point clouds) for the purposes of characterising spatial heterogeneity in geophysical phenomena necessitates the development of custom analytical and computational tools. In recent years, such analyses have become the basis of, for example, automated texture characterisation and segmentation, roughness and grain size calculation, and feature detection and classification, from a variety of data types. In this work, much use has been made of statistical descriptors of localised spatial variations in amplitude variance (roughness), however the horizontal scale (wavelength) and spacing of roughness elements is rarely considered. This is despite the fact that the ratio of characteristic vertical to horizontal scales is not constant and can yield important information about physical scaling relationships. Spectral analysis is a hitherto under-utilised but powerful means to acquire statistical information about relevant amplitude and wavelength scales, simultaneously and with computational efficiency. Further, quantifying spatially distributed data in the frequency domain lends itself to the development of stochastic models for probing the underlying mechanisms which govern the spatial distribution of geological and geophysical phenomena. The software package PySESA (Python program for Spatially Explicit Spectral Analysis) has been developed for generic analyses of spatially distributed data in both the spatial and frequency domains. Developed predominantly in Python, it accesses libraries written in Cython and C++ for efficiency. It is open source and modular, therefore readily incorporated into, and combined with, other data analysis tools and frameworks with particular utility for supporting research in the fields of geomorphology, geophysics, hydrography, photogrammetry and remote sensing. The analytical and computational structure of the toolbox is

  12. Spatially explicit spectral analysis of point clouds and geospatial data

    Science.gov (United States)

    Buscombe, Daniel

    2016-01-01

    The increasing use of spatially explicit analyses of high-resolution spatially distributed data (imagery and point clouds) for the purposes of characterising spatial heterogeneity in geophysical phenomena necessitates the development of custom analytical and computational tools. In recent years, such analyses have become the basis of, for example, automated texture characterisation and segmentation, roughness and grain size calculation, and feature detection and classification, from a variety of data types. In this work, much use has been made of statistical descriptors of localised spatial variations in amplitude variance (roughness), however the horizontal scale (wavelength) and spacing of roughness elements is rarely considered. This is despite the fact that the ratio of characteristic vertical to horizontal scales is not constant and can yield important information about physical scaling relationships. Spectral analysis is a hitherto under-utilised but powerful means to acquire statistical information about relevant amplitude and wavelength scales, simultaneously and with computational efficiency. Further, quantifying spatially distributed data in the frequency domain lends itself to the development of stochastic models for probing the underlying mechanisms which govern the spatial distribution of geological and geophysical phenomena. The software package PySESA (Python program for Spatially Explicit Spectral Analysis) has been developed for generic analyses of spatially distributed data in both the spatial and frequency domains. Developed predominantly in Python, it accesses libraries written in Cython and C++ for efficiency. It is open source and modular, therefore readily incorporated into, and combined with, other data analysis tools and frameworks with particular utility for supporting research in the fields of geomorphology, geophysics, hydrography, photogrammetry and remote sensing. 
The analytical and computational structure of the toolbox is described
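
    The core of such spectral roughness analysis is a power-spectrum estimate. The sketch below (plain NumPy, not PySESA) recovers a known dominant wavelength from a synthetic one-dimensional profile, which is the horizontal-scale information that amplitude-only roughness statistics miss.

    ```python
    import numpy as np

    # Synthetic profile: 256 samples at 0.1 m spacing with a 3.2 m undulation.
    n, dx = 256, 0.1
    x = np.arange(n) * dx
    wavelength = 3.2
    profile = np.sin(2.0 * np.pi * x / wavelength)

    # Power spectrum of the demeaned profile; the dominant wavelength supplies
    # the characteristic horizontal scale of the roughness elements.
    power = np.abs(np.fft.rfft(profile - profile.mean())) ** 2
    freqs = np.fft.rfftfreq(n, d=dx)
    dominant = 1.0 / freqs[np.argmax(power[1:]) + 1]  # skip the DC bin
    ```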

  13. Accuracy of multi-point boundary crossing time analysis

    Directory of Open Access Journals (Sweden)

    J. Vogt

    2011-12-01

    Full Text Available Recent multi-spacecraft studies of solar wind discontinuity crossings using the timing (boundary plane triangulation) method gave boundary parameter estimates that are significantly different from those of the well-established single-spacecraft minimum variance analysis (MVA) technique. A large survey of directional discontinuities in Cluster data turned out to be particularly inconsistent in the sense that multi-point timing analyses did not identify any rotational discontinuities (RDs), whereas the MVA results of the individual spacecraft suggested that RDs form the majority of events. To make multi-spacecraft studies of discontinuity crossings more conclusive, the present report addresses the accuracy of the timing approach to boundary parameter estimation. Our error analysis is based on the reciprocal vector formalism and takes into account uncertainties both in crossing times and in the spacecraft positions. A rigorous error estimation scheme is presented for the general case of correlated crossing time errors and arbitrary spacecraft configurations. Crossing time error covariances are determined through cross correlation analyses of the residuals. The principal influence of the spacecraft array geometry on the accuracy of the timing method is illustrated using error formulas for the simplified case of mutually uncorrelated and identical errors at different spacecraft. The full error analysis procedure is demonstrated for a solar wind discontinuity as observed by the Cluster FGM instrument.
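
    The timing method itself is a small linear solve: a planar boundary with unit normal n moving at speed V satisfies (r_i - r_0)·(n/V) = t_i - t_0 for each spacecraft i. A noise-free four-spacecraft sketch (geometry and numbers invented, not Cluster data):

    ```python
    import numpy as np

    n_true = np.array([1.0, 2.0, 2.0]) / 3.0   # unit boundary normal
    v_true = 50.0                              # boundary speed [km/s]

    # Four spacecraft positions [km] and the exact crossing times [s].
    r = np.array([[0.0, 0.0, 0.0],
                  [100.0, 0.0, 0.0],
                  [0.0, 100.0, 0.0],
                  [0.0, 0.0, 100.0]])
    t = r @ n_true / v_true

    # Solve (r_i - r_0) . m = t_i - t_0 with m = n / V (a 3x3 system).
    m = np.linalg.solve(r[1:] - r[0], t[1:] - t[0])
    v_est = 1.0 / np.linalg.norm(m)
    n_est = m * v_est
    ```

    With real data the crossing times and positions carry errors, which is exactly what the record's error analysis propagates through this solve.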

  14. Material-Point-Method Analysis of Collapsing Slopes

    DEFF Research Database (Denmark)

    Andersen, Søren; Andersen, Lars

    2009-01-01

    To understand the dynamic evolution of landslides and predict their physical extent, a computational model is required that is capable of analysing complex material behaviour as well as large strains and deformations. Here, a model is presented based on the so-called generalised-interpolation material-point method…, a deformed material description is introduced, based on time integration of the deformation gradient and utilising Gauss quadrature over the volume associated with each material point. The method has been implemented in a Fortran code and employed for the analysis of a landslide that took place during…

  15. Integrative analysis of RUNX1 downstream pathways and target genes

    Directory of Open Access Journals (Sweden)

    Liu Marjorie

    2008-07-01

    Full Text Available Abstract Background The RUNX1 transcription factor gene is frequently mutated in sporadic myeloid and lymphoid leukemia through translocation, point mutation or amplification. It is also responsible for a familial platelet disorder with predisposition to acute myeloid leukemia (FPD-AML). The disruption of the largely unknown biological pathways controlled by RUNX1 is likely to be responsible for the development of leukemia. We have used multiple microarray platforms and bioinformatic techniques to help identify these biological pathways to aid in the understanding of why RUNX1 mutations lead to leukemia. Results Here we report genes regulated either directly or indirectly by RUNX1 based on the study of gene expression profiles generated from 3 different human and mouse platforms. The platforms used were global gene expression profiling of: (1) cell lines with RUNX1 mutations from FPD-AML patients, (2) over-expression of RUNX1 and CBFβ, and (3) Runx1 knockout mouse embryos using either cDNA or Affymetrix microarrays. We observe that our datasets (lists of differentially expressed genes) significantly correlate with published microarray data from sporadic AML patients with mutations in either RUNX1 or its cofactor, CBFβ. A number of biological processes were identified among the differentially expressed genes and functional assays suggest that heterozygous RUNX1 point mutations in patients with FPD-AML impair cell proliferation, microtubule dynamics and possibly genetic stability. In addition, analysis of the regulatory regions of the differentially expressed genes has for the first time systematically identified numerous potential novel RUNX1 target genes. Conclusion This work is the first large-scale study attempting to identify the genetic networks regulated by RUNX1, a master regulator in the development of the hematopoietic system and leukemia.
The biological pathways and target genes controlled by RUNX1 will have considerable importance in disease

  16. Automatic detection of the unknown number point targets in FMICW radar signals

    Czech Academy of Sciences Publication Activity Database

    Rejfek, L.; Mošna, Zbyšek; Beran, L.; Fišer, O.; Dobrovolný, M.

    2017-01-01

    Roč. 4, č. 11 (2017), s. 116-120 ISSN 2313-626X R&D Projects: GA ČR(CZ) GA15-24688S Institutional support: RVO:68378289 Keywords : FMICW radar * 2D FFT * signal filtration * target detection * target parameter estimation Subject RIV: DG - Athmosphere Sciences, Meteorology OBOR OECD: Meteorology and atmospheric sciences http://science-gate.com/IJAAS/Articles/2017-4-11/18%202017-4-11-pp.116-120.pdf

  17. Free-time and fixed end-point multi-target optimal control theory: Application to quantum computing

    International Nuclear Information System (INIS)

    Mishima, K.; Yamashita, K.

    2011-01-01

    Graphical abstract: The two-state Deutsch-Jozsa algorithm used to demonstrate the utility of free-time and fixed end-point multi-target optimal control theory. Research highlights: → Free-time and fixed end-point multi-target optimal control theory (FRFP-MTOCT) was constructed. → The features of our theory include optimization of the external time-dependent perturbations with high transition probabilities, that of the temporal duration, the monotonic convergence, and the ability to optimize multiple-laser pulses simultaneously. → The advantage of the theory and a comparison with conventional fixed-time and fixed end-point multi-target optimal control theory (FIFP-MTOCT) are presented by comparing data calculated using the present theory with those published previously [K. Mishima, K. Yamashita, Chem. Phys. 361 (2009) 106]. → The qubit system of our interest consists of two polar NaCl molecules coupled by dipole-dipole interaction. → The calculation examples show that our theory is useful for minor adjustment of the external fields. - Abstract: An extension of free-time and fixed end-point optimal control theory (FRFP-OCT) to monotonically convergent free-time and fixed end-point multi-target optimal control theory (FRFP-MTOCT) is presented. The features of our theory include optimization of the external time-dependent perturbations with high transition probabilities, that of the temporal duration, the monotonic convergence, and the ability to optimize multiple-laser pulses simultaneously. The advantage of the theory and a comparison with conventional fixed-time and fixed end-point multi-target optimal control theory (FIFP-MTOCT) are presented by comparing data calculated using the present theory with those published previously [K. Mishima, K. Yamashita, Chem. Phys. 361 (2009) 106]. The qubit system of our interest consists of two polar NaCl molecules coupled by dipole-dipole interaction. The calculation examples show that our theory is useful for minor adjustment of the external fields.

  18. A critical analysis of the tender points in fibromyalgia.

    Science.gov (United States)

    Harden, R Norman; Revivo, Gadi; Song, Sharon; Nampiaparampil, Devi; Golden, Gary; Kirincic, Marie; Houle, Timothy T

    2007-03-01

    To pilot methodologies designed to critically assess the American College of Rheumatology's (ACR) diagnostic criteria for fibromyalgia. Prospective, psychophysical testing. An urban teaching hospital. Twenty-five patients with fibromyalgia and 31 healthy controls (convenience sample). Pressure pain threshold was determined at the 18 ACR tender points and five sham points using an algometer (dolorimeter). The patients' "algometric total scores" (sums of the patients' average pain thresholds at the 18 tender points) were derived, as well as pain thresholds across sham points. The "algometric total score" could differentiate patients with fibromyalgia from normals with an accuracy of 85.7% (P …). … pain across sham points than across ACR tender points, sham points also could be used for diagnosis (85.7%; Ps …) tested vs other painful conditions. The points specified by the ACR were only modestly superior to sham points in making the diagnosis. Most importantly, this pilot suggests single points, smaller groups of points, or sham points may be as effective in diagnosing fibromyalgia as the use of all 18 points, and suggests methodologies to definitively test that hypothesis.

  19. Seeking a fingerprint: analysis of point processes in actigraphy recording

    Science.gov (United States)

    Gudowska-Nowak, Ewa; Ochab, Jeremi K.; Oleś, Katarzyna; Beldzik, Ewa; Chialvo, Dante R.; Domagalik, Aleksandra; Fąfrowicz, Magdalena; Marek, Tadeusz; Nowak, Maciej A.; Ogińska, Halszka; Szwed, Jerzy; Tyburczyk, Jacek

    2016-05-01

    Motor activity of humans displays complex temporal fluctuations which can be characterised by scale-invariant statistics, thus demonstrating that structure and fluctuations of such kinetics remain similar over a broad range of time scales. Previous studies on humans regularly deprived of sleep or suffering from sleep disorders predicted a change in the invariant scale parameters with respect to those for healthy subjects. In this study we investigate the signal patterns from actigraphy recordings by means of characteristic measures of fractional point processes. We analyse spontaneous locomotor activity of healthy individuals recorded during a week of regular sleep and a week of chronic partial sleep deprivation. Behavioural symptoms of lack of sleep can be evaluated by analysing statistics of duration times during active and resting states, and alteration of behavioural organisation can be assessed by analysis of power laws detected in the event count distribution, distribution of waiting times between consecutive movements and detrended fluctuation analysis of recorded time series. We claim that among different measures characterising complexity of the actigraphy recordings and their variations implied by chronic sleep distress, the exponents characterising slopes of survival functions in resting states are the most effective biomarkers distinguishing between healthy and sleep-deprived groups.
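
    One of the measures mentioned, detrended fluctuation analysis (DFA), fits a power law to window-averaged fluctuations of the integrated signal. A compact first-order DFA, checked on white noise where the exponent is known to be about 0.5 (the scales and sample size are illustrative, not the actigraphy protocol):

    ```python
    import numpy as np

    def dfa_exponent(x, scales):
        """First-order DFA: slope of log F(s) versus log s."""
        y = np.cumsum(x - np.mean(x))          # integrated profile
        flucts = []
        for s in scales:
            f2 = []
            t = np.arange(s)
            for i in range(len(y) // s):
                seg = y[i * s:(i + 1) * s]
                trend = np.polyval(np.polyfit(t, seg, 1), t)  # linear detrend
                f2.append(np.mean((seg - trend) ** 2))
            flucts.append(np.sqrt(np.mean(f2)))
        slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
        return slope

    rng = np.random.default_rng(2)
    alpha = dfa_exponent(rng.standard_normal(10000), [16, 32, 64, 128, 256])
    # White noise gives alpha ~ 0.5; persistent activity records drift higher.
    ```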

  20. Analysis of Multicomponent Adsorption Close to a Dew Point

    DEFF Research Database (Denmark)

    Shapiro, Alexander; Stenby, Erling Halfdan

    1998-01-01

    We develop the potential theory of multicomponent adsorption close to a dew point. The approach is based on an asymptotic adsorption equation (AAE) which is valid in a vicinity of the dew point. By this equation the thickness of the liquid film is expressed through thermodynamic characteristics...... and the direct calculations, even if the mixture is not close to a dew point.Key Words: adsorption; potential theory; multicomponent; dew point....

  1. Throughput analysis of point-to-multi-point hybrid FSO/RF network

    KAUST Repository

    Rakia, Tamer

    2017-07-31

    This paper presents and analyzes a point-to-multi-point (P2MP) network that uses a number of free-space optical (FSO) links for data transmission from the central node to the different remote nodes. A common backup radio-frequency (RF) link is used by the central node for data transmission to any remote node in case of the failure of any one of the FSO links. We develop a cross-layer Markov chain model to study the throughput from the central node to a tagged remote node. Numerical examples are presented to compare the performance of the proposed P2MP hybrid FSO/RF network with that of a P2MP FSO-only network and show that the P2MP hybrid FSO/RF network achieves considerable performance improvement over the P2MP FSO-only network.
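
    The cross-layer model in the record is more elaborate, but the throughput logic can be sketched with a two-state availability chain for the tagged node's FSO link. The transition probabilities and link rates below are assumptions for illustration.

    ```python
    import numpy as np

    # Two-state Markov chain: state 0 = FSO link up, state 1 = FSO link down
    # (the tagged node then falls back to the slower shared RF link).
    p_fail, p_repair = 0.1, 0.6
    P = np.array([[1.0 - p_fail, p_fail],
                  [p_repair, 1.0 - p_repair]])

    # Stationary distribution pi solves pi P = pi with sum(pi) = 1,
    # obtained here from the eigenvector of P.T for eigenvalue 1.
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
    pi /= pi.sum()

    rate_fso, rate_rf = 1.0, 0.25          # normalized link rates (assumed)
    throughput = pi[0] * rate_fso + pi[1] * rate_rf
    ```

    The hybrid gain over an FSO-only network shows up as the pi[1] * rate_rf term: without the RF backup, that fraction of time contributes zero throughput.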

  2. A Targeted Search for Point Sources of EeV Photons with the Pierre Auger Observatory

    Energy Technology Data Exchange (ETDEWEB)

    Aab, A. [Institute for Mathematics, Astrophysics and Particle Physics (IMAPP), Radboud Universiteit, Nijmegen (Netherlands); Abreu, P. [Laboratório de Instrumentação e Física Experimental de Partículas—LIP and Instituto Superior Técnico—IST, Universidade de Lisboa—UL, Lisbon (Portugal); Aglietta, M. [INFN, Sezione di Torino, Torino (Italy); Samarai, I. Al [Laboratoire de Physique Nucléaire et de Hautes Energies (LPNHE), Universités Paris 6 et Paris 7, CNRS-IN2P3, Paris (France); Albuquerque, I. F. M. [Universidade de São Paulo, Inst. de Física, São Paulo (Brazil); Allekotte, I. [Centro Atómico Bariloche and Instituto Balseiro (CNEA-UNCuyo-CONICET), San Carlos de Bariloche (Argentina); Almela, A. [Instituto de Tecnologías en Detección y Astropartículas (CNEA, CONICET, UNSAM), Centro Atómico Constituyentes, Comisión Nacional de Energía Atómica, Buenos Aires (Argentina); Castillo, J. Alvarez [Universidad Nacional Autónoma de México, México, D. F., México (Mexico); Alvarez-Muñiz, J. [Universidad de Santiago de Compostela, La Coruña (Spain); Anastasi, G. A. [Gran Sasso Science Institute (INFN), L’Aquila (Italy); and others

    2017-03-10

    Simultaneous measurements of air showers with the fluorescence and surface detectors of the Pierre Auger Observatory allow a sensitive search for EeV photon point sources. Several Galactic and extragalactic candidate objects are grouped in classes to reduce the statistical penalty of many trials from that of a blind search and are analyzed for a significant excess above the background expectation. The presented search does not find any evidence for photon emission at candidate sources, and combined p-values for every class are reported. Particle and energy flux upper limits are given for selected candidate sources. These limits significantly constrain predictions of EeV proton emission models from non-transient Galactic and nearby extragalactic sources, as illustrated for the particular case of the Galactic center region.

  3. 33 CFR 334.200 - Chesapeake Bay, Point Lookout to Cedar Point; aerial and surface firing range and target area, U...

    Science.gov (United States)

    2010-07-01

    ... degrees 09 minutes 26 seconds identified as Hannibal Target. (3) The regulations. Nonexplosive projectiles and bombs will be dropped at frequent intervals in the target areas. Hooper and Hannibal target areas...

  4. Series-nonuniform rational B-spline signal feedback: From chaos to any embedded periodic orbit or target point

    Energy Technology Data Exchange (ETDEWEB)

    Shao, Chenxi, E-mail: cxshao@ustc.edu.cn; Xue, Yong; Fang, Fang; Bai, Fangzhou [Department of Computer Science and Technology, University of Science and Technology of China, Hefei 230027 (China); Yin, Peifeng [Department of Computer Science and Engineering, Pennsylvania State University, State College, Pennsylvania 16801 (United States); Wang, Binghong [Department of Modern Physics, University of Science and Technology of China, Hefei 230026 (China)

    2015-07-15

    The self-controlling feedback control method requires an external periodic oscillator with special design, which is technically challenging. This paper proposes a chaos control method based on time series non-uniform rational B-splines (SNURBS for short) signal feedback. It first builds the chaos phase diagram or chaotic attractor with the sampled chaotic time series and any target orbit can then be explicitly chosen according to the actual demand. Second, we use the discrete timing sequence selected from the specific target orbit to build the corresponding external SNURBS chaos periodic signal, whose difference from the system current output is used as the feedback control signal. Finally, by properly adjusting the feedback weight, we can quickly lead the system to an expected status. We demonstrate both the effectiveness and efficiency of our method by applying it to two classic chaotic systems, i.e., the Van der Pol oscillator and the Lorenz chaotic system. Further, our experimental results show that compared with delayed feedback control, our method takes less time to obtain the target point or periodic orbit (from the starting point) and that its parameters can be fine-tuned more easily.
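The abstract above describes feeding back the difference between a reference periodic signal and the system's current output, weighted by a feedback gain. The following minimal sketch illustrates the same idea in its simplest form, replacing the SNURBS reference signal with a constant target (the embedded fixed point C+ of the Lorenz system) and using a hand-tuned proportional feedback weight; all numerical values are illustrative assumptions, not taken from the paper.

```python
import math

def lorenz_step(state, dt, sigma=10.0, rho=28.0, beta=8.0/3.0, u=(0.0, 0.0, 0.0)):
    """One forward-Euler step of the Lorenz system with an additive control input u."""
    x, y, z = state
    dx = sigma * (y - x) + u[0]
    dy = x * (rho - z) - y + u[1]
    dz = x * y - beta * z + u[2]
    return (x + dt * dx, y + dt * dy, z + dt * dz)

# Target: the embedded fixed point C+ = (sqrt(beta*(rho-1)), sqrt(beta*(rho-1)), rho-1),
# at which all three derivatives vanish, so the control effort decays to zero there.
beta, rho = 8.0 / 3.0, 28.0
c = math.sqrt(beta * (rho - 1.0))
target = (c, c, rho - 1.0)

state, dt, gain = (1.0, 1.0, 1.0), 0.002, 30.0   # gain is hand-tuned (assumption)
for step in range(100_000):
    if step < 25_000:
        u = (0.0, 0.0, 0.0)                       # free-running chaotic transient
    else:
        # Feedback signal = difference between target signal and current output,
        # scaled by the feedback weight.
        u = tuple(gain * (t - s) for t, s in zip(target, state))
    state = lorenz_step(state, dt, u=u)

dist = math.dist(state, target)
print(f"final distance to fixed point: {dist:.2e}")
```

Because the target is itself a fixed point of the uncontrolled system, the feedback vanishes once the state arrives, mirroring the non-invasive character of such feedback schemes.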

  5. Series-nonuniform rational B-spline signal feedback: From chaos to any embedded periodic orbit or target point.

    Science.gov (United States)

    Shao, Chenxi; Xue, Yong; Fang, Fang; Bai, Fangzhou; Yin, Peifeng; Wang, Binghong

    2015-07-01

    The self-controlling feedback control method requires an external periodic oscillator with special design, which is technically challenging. This paper proposes a chaos control method based on time series non-uniform rational B-splines (SNURBS for short) signal feedback. It first builds the chaos phase diagram or chaotic attractor with the sampled chaotic time series and any target orbit can then be explicitly chosen according to the actual demand. Second, we use the discrete timing sequence selected from the specific target orbit to build the corresponding external SNURBS chaos periodic signal, whose difference from the system current output is used as the feedback control signal. Finally, by properly adjusting the feedback weight, we can quickly lead the system to an expected status. We demonstrate both the effectiveness and efficiency of our method by applying it to two classic chaotic systems, i.e., the Van der Pol oscillator and the Lorenz chaotic system. Further, our experimental results show that compared with delayed feedback control, our method takes less time to obtain the target point or periodic orbit (from the starting point) and that its parameters can be fine-tuned more easily.

  6. Pressure Points in Reading Comprehension: A Quantile Multiple Regression Analysis

    Science.gov (United States)

    Logan, Jessica

    2017-01-01

    The goal of this study was to examine how selected pressure points or areas of vulnerability are related to individual differences in reading comprehension and whether the importance of these pressure points varies as a function of the level of children's reading comprehension. A sample of 245 third-grade children were given an assessment battery…

  7. A targeted search for point sources of EeV photons with the Pierre Auger Observatory

    Czech Academy of Sciences Publication Activity Database

    Aab, A.; Abreu, P.; Aglietta, M.; Blažek, Jiří; Boháčová, Martina; Chudoba, Jiří; Ebr, Jan; Mandát, Dušan; Palatka, Miroslav; Pech, Miroslav; Prouza, Michael; Řídký, Jan; Schovánek, Petr; Trávníček, Petr; Vícha, Jakub

    2017-01-01

    Roč. 837, č. 2 (2017), 1-7, č. článku L25. ISSN 2041-8205 R&D Projects: GA MŠk LM2015038; GA MŠk LG15014; GA ČR(CZ) GA14-17501S Institutional support: RVO:68378271 Keywords : astroparticle physics * cosmic rays * methods * data analysis Subject RIV: BF - Elementary Particles and High Energy Physics OBOR OECD: Particles and field physics Impact factor: 5.522, year: 2016

  8. Pharmacological receptors of nematoda as target points for action of antiparasitic drugs

    Directory of Open Access Journals (Sweden)

    Trailović Saša M.

    2010-01-01

    Full Text Available Cholinergic receptors of parasitic nematodes are among the most important possible sites of action of antiparasitic drugs. This paper presents some of our own results of electrophysiological and pharmacological examinations of nicotinic and muscarinic receptors of nematodes, as well as data from the literature on a new class of anthelmintics that act precisely on cholinergic receptors. The nicotinic acetylcholine receptor (nAChR) is located on somatic muscle cells of nematodes and is responsible for the coordination of parasite movement. Cholinomimetic anthelmintics act on this receptor, as does acetylcholine, the endogenous neurotransmitter, but unlike acetylcholine they are not sensitive to the enzyme acetylcholinesterase, which hydrolyzes it. As opposed to the nicotinic receptor of vertebrates, whose structure has been examined thoroughly, the stoichiometry of the nicotinic receptor of nematodes is not completely known. However, on the grounds of knowledge acquired so far, a model of the potential composition of one type of nematode nicotinic receptor, as the site of action of anthelmintics, has recently been constructed. Based on earlier investigations, it is supposed that a conventional muscarinic receptor exists in nematodes as well, so that it too can be a new pharmacological target for the development of antinematode drugs. The latest class of synthesized anthelmintics, named aminoacetonitrile derivatives (AAD), act via the nicotinic receptor. Monepantel, the first drug from the AAD group, is the most significant candidate for registration in veterinary medicine. Even though several groups of cholinomimetic anthelmintics (imidazothiazoles, tetrahydropyrimidines, organophosphate anthelmintics) have been in use in veterinary practice for many years now, it is evident that cholinergic receptors of nematodes still present an attractive target in the examination and development of new antinematode drugs.

  9. Methane Flux Estimation from Point Sources using GOSAT Target Observation: Detection Limit and Improvements with Next Generation Instruments

    Science.gov (United States)

    Kuze, A.; Suto, H.; Kataoka, F.; Shiomi, K.; Kondo, Y.; Crisp, D.; Butz, A.

    2017-12-01

    Atmospheric methane (CH4) has an important role in global radiative forcing of climate, but its emission estimates have larger uncertainties than those of carbon dioxide (CO2). The area of anthropogenic emission sources is usually much smaller than 100 km2. The Thermal And Near infrared Sensor for carbon Observation Fourier-Transform Spectrometer (TANSO-FTS) onboard the Greenhouse gases Observing SATellite (GOSAT) has measured CO2 and CH4 column density using sunlight reflected from the earth's surface. It has an agile pointing system, and its footprint can cover 87 km2 with a single detector. By specifying pointing angles and observation time for every orbit, TANSO-FTS can target various CH4 point sources together with reference points every 3 days over years. We selected a reference point that represents CH4 background density before or after targeting a point source. By combining satellite-measured enhancement of the CH4 column density with surface-measured wind data or estimates from the Weather Research and Forecasting (WRF) model, we estimated CH4 emission amounts. Here, we selected two sites on the US West Coast, where clear-sky frequency is high and a series of data are available. The natural gas leak at Aliso Canyon showed a large enhancement and its decrease with time since the initial blowout. We present a time series of flux estimates assuming the source is a single point without influx. The cattle feedlot observed in Chino, California has a weather station within the TANSO-FTS footprint. The wind speed is monitored continuously, and the wind direction is stable at the time of GOSAT overpass. The large TANSO-FTS footprint and strong winds decrease the enhancement below the noise level. Weak winds show enhancements in CH4, but the velocity data have large uncertainties. We show the detection limit of single samples and how to reduce uncertainty using a time series of satellite data. 
We will propose that the next generation instruments for accurate anthropogenic CO2 and CH
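The flux estimation described above combines a column enhancement with wind data. A minimal mass-balance sketch of that calculation follows; the enhancement, plume width, and wind speed are invented illustrative numbers (real retrievals involve averaging kernels, background subtraction, and careful unit handling that are omitted here).

```python
# Illustrative mass-balance source-strength estimate from a column enhancement
# and wind speed. All input numbers are hypothetical.

M_CH4 = 16.04e-3         # kg/mol, molar mass of CH4
AVOGADRO = 6.022e23      # molecules/mol

delta_column = 2.0e19    # molecules/m^2 column enhancement above background (assumed)
plume_width = 5_000.0    # m, cross-wind extent of the enhancement (assumed)
wind_speed = 3.0         # m/s, surface wind at overpass time (assumed)

# Q [kg/s] = (enhancement in kg/m^2) * wind speed * cross-wind width
enhancement_kg_m2 = delta_column / AVOGADRO * M_CH4
flux_kg_s = enhancement_kg_m2 * wind_speed * plume_width
print(f"estimated source strength: {flux_kg_s:.4f} kg/s")
```

As the abstract notes, a strong wind dilutes the enhancement while a weak wind carries large velocity uncertainty, so the same formula trades retrieval noise against wind-speed error.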

  10. A targeted search for point sources of EeV neutrons

    Czech Academy of Sciences Publication Activity Database

    Aab, A.; Abreu, P.; Aglietta, M.; Boháčová, Martina; Chudoba, Jiří; Ebr, Jan; Mandát, Dušan; Nečesal, Petr; Palatka, Miroslav; Pech, Miroslav; Prouza, Michael; Řídký, Jan; Schovánek, Petr; Trávníček, Petr; Vícha, Jakub

    2014-01-01

    Roč. 789, č. 2 (2014), s. 1-7 ISSN 2041-8205 R&D Projects: GA ČR(CZ) GA14-17501S; GA MŠk(CZ) 7AMB14AR005; GA MŠk(CZ) LG13007 Institutional support: RVO:68378271 Keywords : cosmic rays * Galaxy * disk * methods * data analysis Subject RIV: BF - Elementary Particles and High Energy Physics Impact factor: 5.339, year: 2014 http://iopscience.iop.org/2041-8205/789/2/L34/pdf/2041-8205_789_2_L34.pdf

  11. Material-Point Method Analysis of Bending in Elastic Beams

    DEFF Research Database (Denmark)

    Andersen, Søren Mikkel; Andersen, Lars

    2007-01-01

    The aim of this paper is to test different types of spatial interpolation for the material-point method. The interpolations include quadratic elements and cubic splines. A brief introduction to the material-point method is given. Simple linear-elastic problems are tested, including the classical...... cantilevered beam problem. As shown in the paper, the use of negative shape functions is not consistent with the material-point method in its current form, necessitating other types of interpolation such as cubic splines in order to obtain smoother representations of field quantities. It is shown...
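The abstract's central observation, that quadratic element shape functions are partially negative while spline interpolation is not, can be checked numerically. The sketch below evaluates the corner shape function of a standard 3-node quadratic (Lagrange) element and a uniform cubic B-spline basis function; it illustrates the sign behaviour only and is not taken from the paper's implementation.

```python
# Corner shape function of a 3-node quadratic Lagrange element on [-1, 1]
# takes negative values inside the element, whereas a uniform cubic
# B-spline basis function is non-negative everywhere on its support.

def quad_corner(xi):
    # N1 on the reference element [-1, 1]; N1(-1) = 1, N1(0) = N1(1) = 0
    return 0.5 * xi * (xi - 1.0)

def cubic_bspline(x):
    # Uniform cubic B-spline centred at 0 with support [-2, 2]
    a = abs(x)
    if a < 1.0:
        return 2.0 / 3.0 - a * a + 0.5 * a ** 3
    if a < 2.0:
        return (2.0 - a) ** 3 / 6.0
    return 0.0

min_quad = min(quad_corner(i / 100.0) for i in range(-100, 101))
min_spline = min(cubic_bspline(i / 100.0) for i in range(-200, 201))
print("min quadratic shape value:", min_quad, " min B-spline value:", min_spline)
```

The negative lobe of the quadratic shape function (minimum −0.125 at the mid-edge) is what transfers mass and momentum with the wrong sign in the material-point mapping, motivating the spline-based interpolation the paper advocates.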

  12. Fast Change Point Detection for Electricity Market Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Berkeley, UC; Gu, William; Choi, Jaesik; Gu, Ming; Simon, Horst; Wu, Kesheng

    2013-08-25

    Electricity is a vital part of our daily life; therefore it is important to avoid irregularities such as the California Electricity Crisis of 2000 and 2001. In this work, we seek to predict anomalies using advanced machine learning algorithms. These algorithms are effective, but computationally expensive, especially if we plan to apply them on hourly electricity market data covering a number of years. To address this challenge, we significantly accelerate the computation of the Gaussian Process (GP) for time series data. In the context of a Change Point Detection (CPD) algorithm, we reduce its computational complexity from O($n^{5}$) to O($n^{2}$). Our efficient algorithm makes it possible to compute the Change Points using the hourly price data from the California Electricity Crisis. By comparing the detected Change Points with known events, we show that the Change Point Detection algorithm is indeed effective in detecting signals preceding major events.
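The paper above accelerates a Gaussian-Process-based Change Point Detection algorithm. As a much simpler stand-in for the idea of locating a structural break in a price series, the sketch below detects a single mean shift by minimising the within-segment squared error over all split points; the data are synthetic and the method is a textbook least-squares detector, not the GP/CPD algorithm of the paper.

```python
import random

random.seed(42)
# Synthetic "hourly price" series with a mean shift at index 100 (hypothetical data).
series = [10.0 + random.gauss(0, 0.5) for _ in range(100)] + \
         [14.0 + random.gauss(0, 0.5) for _ in range(100)]

def sse(xs):
    """Sum of squared deviations from the segment mean."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs)

# Choose the split minimising total within-segment squared error: O(n^2) overall,
# echoing the complexity the paper achieves for its far richer GP model.
best = min(range(5, len(series) - 5), key=lambda k: sse(series[:k]) + sse(series[k:]))
print("detected change point index:", best)
```

Comparing detected change points against known events (here, the construction point of the synthetic shift) is exactly the validation strategy the abstract describes for the California Electricity Crisis data.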

  13. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard Analysis and Critical Control Point (HACCP... SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS General Provisions § 120.8 Hazard Analysis and Critical Control Point (HACCP) plan. (a) HACCP plan. Each...

  14. Risk-analysis of global climate tipping points

    Energy Technology Data Exchange (ETDEWEB)

    Frieler, Katja; Meinshausen, Malte; Braun, N [Potsdam Institute for Climate Impact Research e.V., Potsdam (Germany). PRIMAP Research Group; and others

    2012-09-15

    There are many elements of the Earth system that are expected to change gradually with increasing global warming. Such changes might prove to be reversible after global warming returns to lower levels. But there are others that have the potential of showing threshold behavior. This means that these changes would imply a transition between qualitatively disparate states which can be triggered by only small shifts in background climate (2). These changes are often expected not to be reversible by returning to the current level of warming. The reason is that many of them are characterized by self-amplifying processes that could lead to a new internally stable state which is qualitatively different from before. Several elements of the climate system have already been identified as potential tipping elements. This group includes mass loss from the Greenland and the West-Antarctic Ice Sheets, the decline of the Arctic summer sea ice, different monsoon systems, the degradation of coral reefs, the dieback of the Amazon rainforest, the thawing of the permafrost regions, as well as the release of methane hydrates (3). Crucially, these tipping elements have regional- to global-scale effects on human society, biodiversity and/or ecosystem services. Several examples may have a discernible effect on global climate through a large-scale positive feedback, meaning they would further amplify human-induced climate change. These tipping elements pose risks comparable to risks found in other fields of human activity: high-impact events that have at least a few percent chance of occurring classify as high-risk events. In many of these examples adaptation options are limited, and prevention of occurrence may be a more viable strategy. Therefore, a better understanding of the processes driving tipping points is essential. There might be other tipping elements even more critical but not yet identified. These may also lie within our socio-economic systems that are

  15. Analysis of hygienic critical control points in boar semen production.

    Science.gov (United States)

    Schulze, M; Ammon, C; Rüdiger, K; Jung, M; Grobbel, M

    2015-02-01

    The present study addresses the microbiological results of a quality control audit in artificial insemination (AI) boar studs in Germany and Austria. The raw and processed semen of 344 boars in 24 AI boar studs were analyzed. Bacteria were found in 26% (88 of 344) of the extended ejaculates and 66.7% (18 of 24) of the boar studs. The bacterial species found in the AI dose were not cultured from the respective raw semen in 95.5% (84 of 88) of the positive samples. These data, together with the fact that in most cases all the samples from one stud were contaminated with identical bacteria (species and resistance profile), indicate contamination during processing. Microbiological investigations of the equipment and the laboratory environment during semen processing in 21 AI boar studs revealed nine hygienic critical control points (HCCP), which were addressed after the first audit. On the basis of the analysis of the contamination rates of the ejaculate samples, improvements in the hygiene status were already present in the second audit (P = 0.0343, F-test). Significant differences were observed for heating cabinets (improvement, P = 0.0388) and manual operating elements (improvement, P = 0.0002). The odds ratio of finding contaminated ejaculates in the first and second audit was 1.68 (with the 95% confidence interval ranging from 1.04 to 2.69). Furthermore, an overall good hygienic status was shown for extenders, the inner face of dilution tank lids, dyes, and ultrapure water treatment plants. Among the nine HCCP considered, the most heavily contaminated samples, as assessed by the median scores throughout all the studs, were found in the sinks and/or drains. High numbers (>10³ colony-forming units/cm²) of bacteria were found in the heating cabinets, ejaculate transfer, manual operating elements, and laboratory surfaces. In conclusion, the present study emphasizes the need for both training of the laboratory staff in monitoring HCCP in routine semen

  16. Material-Point Analysis of Large-Strain Problems

    DEFF Research Database (Denmark)

    Andersen, Søren

    The aim of this thesis is to apply and improve the material-point method for modelling of geotechnical problems. One of the geotechnical phenomena that is a subject of active research is the study of landslides. A large amount of research is focused on determining when slopes become unstable. Hence......, it is possible to predict if a certain slope is stable using commercial finite element or finite difference software such as PLAXIS, ABAQUS or FLAC. However, the dynamics during a landslide are less explored. The material-point method (MPM) is a novel numerical method aimed at analysing problems involving...... materials subjected to large strains in a dynamical time–space domain. This thesis explores the material-point method with the specific aim of improving the performance for geotechnical problems. Large-strain geotechnical problems such as landslides pose a major challenge to model numerically. Employing...

  17. Material-point Method Analysis of Bending in Elastic Beams

    DEFF Research Database (Denmark)

    Andersen, Søren Mikkel; Andersen, Lars

    The aim of this paper is to test different types of spatial interpolation for the material-point method. The interpolations include quadratic elements and cubic splines. A brief introduction to the material-point method is given. Simple linear-elastic problems are tested, including the classical...... cantilevered beam problem. As shown in the paper, the use of negative shape functions is not consistent with the material-point method in its current form, necessitating other types of interpolation such as cubic splines in order to obtain smoother representations of field quantities. It is shown...

  18. Washing and chilling as critical control points in pork slaughter hazard analysis and critical control point (HACCP) systems.

    Science.gov (United States)

    Bolton, D J; Pearce, R A; Sheridan, J J; Blair, I S; McDowell, D A; Harrington, D

    2002-01-01

    The aim of this research was to examine the effects of preslaughter washing, pre-evisceration washing, final carcass washing and chilling on final carcass quality and to evaluate these operations as possible critical control points (CCPs) within a pork slaughter hazard analysis and critical control point (HACCP) system. This study estimated bacterial numbers (total viable counts) and the incidence of Salmonella at three surface locations (ham, belly and neck) on 60 animals/carcasses processed through a small commercial pork abattoir (80 pigs d⁻¹). Significant reductions (P …) were observed, supporting the evaluation of these operations as CCPs within HACCP in pork slaughter plants. This research will provide a sound scientific basis on which to develop and implement effective HACCP in pork abattoirs.

  19. Analysis of Spatial Interpolation in the Material-Point Method

    DEFF Research Database (Denmark)

    Andersen, Søren; Andersen, Lars

    2010-01-01

    are obtained using quadratic elements. It is shown that for more complex problems, the use of partially negative shape functions is inconsistent with the material-point method in its current form, necessitating other types of interpolation such as cubic splines in order to obtain smoother representations...

  20. The application of hazard analysis and critical control points and risk management in the preparation of anti-cancer drugs.

    Science.gov (United States)

    Bonan, Brigitte; Martelli, Nicolas; Berhoune, Malik; Maestroni, Marie-Laure; Havard, Laurent; Prognon, Patrice

    2009-02-01

    To apply the Hazard analysis and Critical Control Points method to the preparation of anti-cancer drugs. To identify critical control points in our cancer chemotherapy process and to propose control measures and corrective actions to manage these processes. The Hazard Analysis and Critical Control Points application began in January 2004 in our centralized chemotherapy compounding unit. From October 2004 to August 2005, monitoring of the process nonconformities was performed to assess the method. According to the Hazard Analysis and Critical Control Points method, a multidisciplinary team was formed to describe and assess the cancer chemotherapy process. This team listed all of the critical points and calculated their risk indexes according to their frequency of occurrence, their severity and their detectability. The team defined monitoring, control measures and corrective actions for each identified risk. Finally, over a 10-month period, pharmacists reported each non-conformity of the process in a follow-up document. Our team described 11 steps in the cancer chemotherapy process. The team identified 39 critical control points, including 11 of higher importance with a high-risk index. Over 10 months, 16,647 preparations were performed; 1225 nonconformities were reported during this same period. The Hazard Analysis and Critical Control Points method is relevant when it is used to target a specific process such as the preparation of anti-cancer drugs. This method helped us to focus on the production steps, which can have a critical influence on product quality, and led us to improve our process.

  1. The Terahertz Scattering Analysis of Rough Metallic and Dielectric Targets

    Directory of Open Access Journals (Sweden)

    Mou Yuan

    2018-02-01

    Full Text Available The terahertz scattering characteristics of metallic and dielectric rough targets are important for the investigation of terahertz radar target properties. According to stationary phase theory and the scalar approximation, if the radius of curvature at any point of the surface is much larger than the incident wavelength, and the wavelength is also much longer than the surface height function and the Root-Mean-Square (RMS) surface slope, the coherent and incoherent scattering Radar Cross Section (RCS) of rough metallic and dielectric targets can be obtained. Based on the stationary phase approximation, the coherent RCS of rough conductors, smooth dielectric targets, and rough dielectric targets can be readily deduced. The scattering characteristics of electrically large smooth Al and painted spheres are investigated in this paper, and the calculated RCS values are verified against Mie scattering theory; the error is less than 0.1 dBm². Based on Lambert's theory, it is demonstrated that the incoherent RCS is computed with better precision if the rough surfaces are divided into more facets. In this paper, the coherent and incoherent scattering of rough Al and painted spheres are numerically investigated, and the effects of surface roughness and materials are analyzed. The conclusions provide a theoretical foundation for the terahertz scattering characteristics of electrically large rough targets.
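The dependence of coherent scattering on surface roughness described above can be illustrated with the classical specular reduction factor exp(−(2kσ_h cos θ)²) for a Gaussian rough surface. This sketch is the standard textbook result, not the paper's own derivation, and the roughness values are invented for illustration.

```python
import math

def coherent_reduction(sigma_h, wavelength, theta_inc=0.0):
    """Classical specular reduction factor exp(-(2 k sigma_h cos(theta))^2)
    for a Gaussian rough surface; sigma_h is the RMS surface height."""
    k = 2.0 * math.pi / wavelength
    g = 2.0 * k * sigma_h * math.cos(theta_inc)
    return math.exp(-g * g)

wavelength = 1.0e-3                               # 0.3 THz -> 1 mm wavelength
smooth = coherent_reduction(10e-6, wavelength)    # 10 um RMS roughness (assumed)
rough = coherent_reduction(100e-6, wavelength)    # 100 um RMS roughness (assumed)
print(f"coherent reduction: {smooth:.3f} (smooth) vs {rough:.3e} (rough)")
```

At terahertz wavelengths even tens of micrometres of RMS roughness noticeably suppresses the coherent return, which is why the incoherent (facet-based) contribution dominates for rough targets.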

  2. Unified analysis of preconditioning methods for saddle point matrices

    Czech Academy of Sciences Publication Activity Database

    Axelsson, Owe

    2015-01-01

    Roč. 22, č. 2 (2015), s. 233-253 ISSN 1070-5325 R&D Projects: GA MŠk ED1.1.00/02.0070 Institutional support: RVO:68145535 Keywords : saddle point problems * preconditioning * spectral properties Subject RIV: BA - General Mathematics Impact factor: 1.431, year: 2015 http://onlinelibrary.wiley.com/doi/10.1002/nla.1947/pdf

  3. Preliminary analysis of a target factory for laser fusion

    International Nuclear Information System (INIS)

    Sherohman, J.W.; Hendricks, C.D.

    1980-01-01

    An analysis of a target factory leading to the determination of production expressions has provided the basis for a parametric study. Parameters involving the input and output rate of a process system, processing yield factors, and multiple processing steps and production lines have been used to develop an understanding of their dependence on the rate of target injection for laser fusion. Preliminary results have indicated that a parametric study of this type will be important in the selection of processing methods to be used in the final production scheme of a target factory

  4. [Segment analysis of the target market of physiotherapeutic services].

    Science.gov (United States)

    Babaskin, D V

    2010-01-01

    The objective of the present study was to demonstrate the possibilities of analysing selected segments of the target market of physiotherapeutic services provided by medical and preventive facilities of two major types. The main features of a target segment, such as the provision of therapeutic massage, are illustrated in terms of two characteristics, namely attractiveness to users and the ability of a given medical facility to satisfy their requirements. Based on the analysis of the portfolio of available target segments, the most promising ones (winner segments) were selected for further marketing studies. This choice does not exclude the possibility of involving other segments of medical services in marketing activities.

  5. A note on the statistical analysis of point judgment matrices

    Directory of Open Access Journals (Sweden)

    MG Kabera

    2013-06-01

    Full Text Available The Analytic Hierarchy Process is a multicriteria decision making technique developed by Saaty in the 1970s. The core of the approach is the pairwise comparison of objects according to a single criterion using a 9-point ratio scale and the estimation of weights associated with these objects based on the resultant judgment matrix. In the present paper some statistical approaches to extracting the weights of objects from a judgment matrix are reviewed and new ideas which are rooted in the traditional method of paired comparisons are introduced.
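The core computation the abstract refers to, extracting weights from a pairwise judgment matrix, is commonly done via Saaty's principal-eigenvector method. The sketch below recovers known weights from a perfectly consistent 3×3 matrix; the weight vector is a made-up example, and the statistical estimation approaches the paper reviews are not shown.

```python
import numpy as np

# Consistent 3x3 judgment matrix built from known weights (hypothetical example):
w_true = np.array([0.6, 0.3, 0.1])
A = w_true[:, None] / w_true[None, :]        # A[i, j] = w_i / w_j

# Saaty's priority vector: principal eigenvector of A, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
weights = principal / principal.sum()        # sum-normalisation also fixes the sign
print(np.round(weights, 3))
```

For a consistent matrix the principal eigenvalue equals the matrix order (here 3) and the weights are recovered exactly; real 9-point-scale judgments are inconsistent, which is precisely where the statistical estimation methods discussed in the paper come in.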

  6. Extracting Loop Bounds for WCET Analysis Using the Instrumentation Point Graph

    Science.gov (United States)

    Betts, A.; Bernat, G.

    2009-05-01

    Every calculation engine proposed in the literature of Worst-Case Execution Time (WCET) analysis requires upper bounds on loop iterations. Existing mechanisms to procure this information are either error prone, because they are gathered from the end-user, or limited in scope, because automatic analyses target very specific loop structures. In this paper, we present a technique that obtains bounds completely automatically for arbitrary loop structures. In particular, we show how to employ the Instrumentation Point Graph (IPG) to parse traces of execution (generated by an instrumented program) in order to extract bounds relative to any loop-nesting level. With this technique, therefore, non-rectangular dependencies between loops can be captured, allowing more accurate WCET estimates to be calculated. We demonstrate the improvement in accuracy by comparing WCET estimates computed through our HMB framework against those computed with state-of-the-art techniques.
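The trace-parsing step described above, counting how often a loop-header instrumentation point fires per invocation, can be sketched very simply. The point IDs and trace below are hypothetical, and real IPG-based analysis handles nesting and non-rectangular loop dependencies that this toy omits.

```python
# Hypothetical trace of instrumentation-point IDs from an instrumented program:
# 'E'/'X' mark function entry/exit, 'H' marks the header of the loop of
# interest, 'B' marks its body.
trace = "E H B H B H B X E H B H B X E H B H B H B H B X".split()

bounds = []      # observed iteration count for each invocation
count = 0
for point in trace:
    if point == "E":
        count = 0
    elif point == "H":
        count += 1
    elif point == "X":
        bounds.append(count)

print("per-invocation counts:", bounds, "-> extracted loop bound:", max(bounds))
```

As the paper cautions for any measurement-based approach, a bound extracted this way is only safe for WCET calculation if the observed traces actually exercise the worst-case iteration count.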

  7. Analysis of target implosion irradiated by proton beam, (1)

    International Nuclear Information System (INIS)

    Tamba, Moritake; Nagata, Norimasa; Kawata, Shigeo; Niu, Keishiro.

    1982-10-01

    Numerical simulation and analysis were performed for the implosion of a hollow shell target driven by proton beam. The target consists of three layers of Pb, Al and DT. As the Al layer is heated by proton beam, the layer expands and pushes the DT layer toward the target center. To obtain the optimal velocity of DT implosion, the optimal target size and optimal layer thickness were determined. The target size is determined by, for example, the instability of the implosion or beam focusing on the target surface. The Rayleigh-Taylor instability and the unstable implosion due to the inhomogeneity were investigated. Dissipation, nonlinear effects and density gradient at the boundary were expected to reduce the growth rate of the Rayleigh-Taylor instability during the implosion. In order that the deviation of the boundary surface during the implosion is less than the thickness of fuel, the inhomogeneity of the temperature and the density of the target should be less than ten percent. The amplitude of the boundary surface roughness is required to be less than 4 micrometer. (Kato, T.)

  8. Super-Relaxed (η)-Proximal Point Algorithms, Relaxed (η)-Proximal Point Algorithms, Linear Convergence Analysis, and Nonlinear Variational Inclusions

    Directory of Open Access Journals (Sweden)

    Agarwal, Ravi P.

    2009-01-01

    Full Text Available We glance at recent advances in the general theory of maximal (set-valued) monotone mappings and their demonstrated role in examining convex programming and the closely related field of nonlinear variational inequalities. We focus mostly on applications of the super-relaxed (η)-proximal point algorithm to the context of solving a class of nonlinear variational inclusion problems, based on the notion of maximal (η)-monotonicity. Investigations highlighted in this communication are greatly influenced by the celebrated work of Rockafellar (1976), while others have played a significant part as well in generalizing the proximal point algorithm considered by Rockafellar (1976) to the case of the relaxed proximal point algorithm by Eckstein and Bertsekas (1992). Even for the linear convergence analysis of the overrelaxed (or super-relaxed) (η)-proximal point algorithm, the fundamental model for Rockafellar's case does the job. Furthermore, we attempt to explore possibilities of generalizing the Yosida regularization/approximation in light of maximal (η)-monotonicity, and then applying it to first-order evolution equations/inclusions.

  9. Low dose response analysis through a cytogenetic end-point

    International Nuclear Information System (INIS)

    Bojtor, I.; Koeteles, G.J.

    1998-01-01

    The effects of low doses were studied on human lymphocytes of various individuals. The frequency of micronuclei in cytokinesis-blocked cultured lymphocytes was taken as the end-point. The probability distribution of the radiation-induced increment was statistically tested and identified as asymmetric when the blood samples had been irradiated with doses of 0.01-0.05 Gy of X-rays, similar to that in the unirradiated control population. On the contrary, at or above 1 Gy the corresponding normal curve could be accepted, reflecting an approximately symmetrical scatter of the increments about their mean value. It was found that the slope as well as the closeness of correlation of the variables changed considerably as progressively lower dose ranges were selected. Below approximately 0.2 Gy, no correlation was found between the absorbed dose and the increment

  10. Framework for adaptive multiscale analysis of nonhomogeneous point processes.

    Science.gov (United States)

    Helgason, Hannes; Bartroff, Jay; Abry, Patrice

    2011-01-01

    We develop the methodology for hypothesis testing and model selection in nonhomogeneous Poisson processes, with an eye toward the application of modeling and variability detection in heart beat data. Modeling the process' non-constant rate function using templates of simple basis functions, we develop the generalized likelihood ratio statistic for a given template and a multiple testing scheme to model-select from a family of templates. A dynamic programming algorithm inspired by network flows is used to compute the maximum likelihood template in a multiscale manner. In a numerical example, the proposed procedure is nearly as powerful as the super-optimal procedures that know the true template size and true partition, respectively. Extensions to general history-dependent point processes is discussed.
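The generalized likelihood ratio (GLR) statistic mentioned above can be illustrated at a single scale: binned Poisson counts are tested for a rate change at the midpoint by comparing the maximized likelihoods of a one-rate and a two-rate model. The counts are made-up data, and the multiscale template selection and dynamic-programming machinery of the paper are not reproduced.

```python
import math

# Binned event counts (hypothetical): rate ~5 in the first half, ~15 in the second.
counts = [4, 6, 5, 5, 4, 6, 5, 5, 14, 16, 15, 15, 14, 16, 15, 15]

def poisson_loglik(cs, lam):
    # log-factorial terms are identical in both models, so they cancel in the ratio
    return sum(c * math.log(lam) - lam for c in cs)

n = len(counts)
lam_null = sum(counts) / n                  # single-rate (null) MLE
half = n // 2
lam_a = sum(counts[:half]) / half           # two-segment (alternative) MLEs
lam_b = sum(counts[half:]) / half

glr = 2.0 * (poisson_loglik(counts[:half], lam_a)
             + poisson_loglik(counts[half:], lam_b)
             - poisson_loglik(counts, lam_null))
print(f"GLR statistic: {glr:.1f}")
```

Under the null, twice the log-likelihood ratio is asymptotically chi-squared with one degree of freedom, so a value far above ~3.84 (as here) flags a rate change, the same logic the paper applies to heart-beat variability with families of templates.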

  11. Design of thermostable rhamnogalacturonan lyase mutants from Bacillus licheniformis by combination of targeted single point mutations

    DEFF Research Database (Denmark)

    da Silva, Ines Isabel Cardoso Rodrigues; Jers, Carsten; Otten, Harm

    2014-01-01

    Rhamnogalacturonan I lyases (RGI lyases) (EC 4.2.2.-) catalyze cleavage of α-1,4 bonds between rhamnose and galacturonic acid in the backbone of pectins by β-elimination. In the present study, targeted improvement of the thermostability of a PL family 11 RGI lyase from Bacillus licheniformis (DSM......, were obtained due to additive stabilizing effects of single amino acid mutations (E434L, G55V, and G326E) compared to the wild type. The crystal structure of the B. licheniformis wild-type RGI lyase was also determined; the structural analysis corroborated that especially mutation of charged amino...

  12. Efficient moving target analysis for inverse synthetic aperture radar images via joint speeded-up robust features and regular moment

    Science.gov (United States)

    Yang, Hongxin; Su, Fulin

    2018-01-01

    We propose a moving target analysis algorithm using speeded-up robust features (SURF) and regular moment in inverse synthetic aperture radar (ISAR) image sequences. In our study, we first extract interest points from ISAR image sequences by SURF. Different from traditional feature point extraction methods, SURF-based feature points are invariant to scattering intensity, target rotation, and image size. Then, we employ a bilateral feature registering model to match these feature points. The feature registering scheme can not only search the isotropic feature points to link the image sequences but also reduce the error matching pairs. After that, the target centroid is detected by regular moment. Consequently, a cost function based on correlation coefficient is adopted to analyze the motion information. Experimental results based on simulated and real data validate the effectiveness and practicability of the proposed method.
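    The centroid-detection step via regular (geometric) moments reduces to the ratio of first-order to zeroth-order image moments. A minimal sketch on a synthetic frame (the SURF and registration stages are omitted, and the image is made up):

```python
import numpy as np

def centroid_from_moments(img):
    """Target centroid from regular (geometric) image moments:
    m00 = sum(I), m10 = sum(x*I), m01 = sum(y*I); centroid = (m10/m00, m01/m00)."""
    ys, xs = np.indices(img.shape)
    m00 = img.sum()
    m10 = (xs * img).sum()
    m01 = (ys * img).sum()
    return m10 / m00, m01 / m00

# synthetic ISAR-like frame: a bright 3x3 scatterer patch centred at (x=6, y=4)
frame = np.zeros((10, 12))
frame[3:6, 5:8] = 1.0
cx, cy = centroid_from_moments(frame)
print(cx, cy)  # 6.0 4.0
```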

  13. FINANCIAL ANALYSIS FROM AN ACCOUNTING POINT OF VIEW

    Directory of Open Access Journals (Sweden)

    Mihaela Ungureanu

    2013-03-01

    Full Text Available Despite the developments which tend to relax the relationship between financial analysis and accounting, property information provided by the latter irreplaceable render its use for diagnostic approaches financial foundation. An efficient information system can provide relevant indicators to users based on accurate and real information and financial analysis results are based on a diagnosis of return and risk. The aim of this article is to present primarily the origin and evolution of the relationship between financial analysis and accounting, and the fundamental role which accounting holds, through the information it produces, into analysts’ work. The used research method is the bibliographic one, being studied timely books and articles of the domain. Literature does not provide concrete answers to this problem, resolutions being expected especially from practitioners.

  14. Spatial Distribution Characteristics of Healthcare Facilities in Nanjing: Network Point Pattern Analysis and Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Jianhua Ni

    2016-08-01

    Full Text Available The spatial distribution of urban service facilities is largely constrained by the road network. In this study, network point pattern analysis and correlation analysis were used to analyze the relationship between the road network and healthcare facility distribution. The weighted network kernel density estimation method proposed in this study identifies significant differences between the areas outside and inside the Ming city wall. The results of network K-function analysis show that private hospitals are more evenly distributed than public hospitals, and pharmacy stores tend to cluster around hospitals along the road network. After performing correlation analysis between the different categories of hospitals and street centrality, we find that the distribution of these hospitals correlates highly with street centralities, and that the correlations are higher for private and small hospitals than for public and large hospitals. The comprehensive analysis results could help examine the reasonableness of the existing urban healthcare facility distribution and optimize the location of new healthcare facilities.
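    The clustering test behind the K-function analysis can be sketched in its simplest planar form. The study uses a network-constrained K-function; the naive planar estimate below (no edge correction, synthetic points) only illustrates how clustering inflates K(r) above the π r² value expected under complete spatial randomness:

```python
import numpy as np

def ripley_k(points, r, area):
    """Naive planar Ripley's K estimate (no edge correction): average number
    of neighbours within distance r, divided by the point density."""
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    close = (d < r).sum() - n          # exclude self-pairs on the diagonal
    lam = n / area
    return close / (n * lam)

rng = np.random.default_rng(2)
csr = rng.uniform(0, 10, (200, 2))     # complete spatial randomness on 10x10
clustered = np.concatenate([rng.normal(c, 0.3, (50, 2))
                            for c in ((2, 2), (8, 8), (2, 8), (8, 2))])
r = 1.0
print(ripley_k(csr, r, 100.0), ripley_k(clustered, r, 100.0))
```

    Under CSR the estimate stays near π r² ≈ 3.14 (slightly lower because edges are uncorrected), while the four synthetic clusters inflate it by an order of magnitude.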

  15. Introduction to charm decay analysis in fixed target experiments

    Energy Technology Data Exchange (ETDEWEB)

    Bediaga, Ignacio; Goebel, Carla

    1996-01-01

    We present an introduction to data analysis in experimental high-energy physics and discuss some useful concepts and tools. As an illustration, we use data from E-791, a fixed-target experiment recently carried out at Fermilab. In particular, we analyse decay modes of the D{sup +} meson with three charged particles in the final state. (author). 8 refs., 22 figs., 1 tab.

  16. Introduction to charm decay analysis in fixed target experiments

    International Nuclear Information System (INIS)

    Bediaga, Ignacio; Goebel, Carla.

    1996-01-01

    We present an introduction to data analysis in experimental high-energy physics and discuss some useful concepts and tools. As an illustration, we use data from E-791, a fixed-target experiment recently carried out at Fermilab. In particular, we analyse decay modes of the D+ meson with three charged particles in the final state. (author). 8 refs., 22 figs., 1 tab

  17. Human detection and motion analysis at security points

    Science.gov (United States)

    Ozer, I. Burak; Lv, Tiehan; Wolf, Wayne H.

    2003-08-01

    This paper presents a real-time video surveillance system for the recognition of specific human activities. Specifically, the proposed automatic motion analysis is used as an on-line alarm system to detect abnormal situations in a campus environment. A smart multi-camera system developed at Princeton University is extended for use in smart environments in which the camera detects the presence of multiple persons as well as their gestures and their interaction in real-time.

  18. Kick-Off Point (KOP) and End of Buildup (EOB) Data Analysis in Trajectory Design

    Directory of Open Access Journals (Sweden)

    Novrianti Novrianti

    2017-06-01

    Full Text Available Well X is a development well that is directionally drilled. Directional drilling was chosen because the target coordinates of Well X lie above a buffer zone. The directional track plan needs accurate survey calculations in order to define the right track for directional drilling. There are many survey calculation methods in directional drilling, such as the tangential, balanced tangential, average angle, radius of curvature, and mercury methods. The minimum curvature method is used in this directional track plan calculation because it gives less error than the other methods. Kick-Off Point (KOP) and End of Buildup (EOB) analysis is done at depths of 200 ft, 400 ft, and 600 ft to determine the trajectory design and optimal inclination; potential hole problems are also assessed for each design. The optimal trajectory design is obtained with a KOP at 200 ft, because its maximum inclination of 18.87º stays below 35º and the trajectory reaches the target well at 1632.28 ft TVD and 408.16 ft AHD. Hole problems would occur if the trajectory were designed with a KOP at 600 ft: stuck pipe, and casing or tubing unable to bend through the curve.
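    The minimum curvature method used for the survey calculations can be sketched as follows: between two survey stations it fits a circular arc, scaling the straight-line averaging of the station directions by a ratio factor derived from the dogleg angle. This is the generic textbook formulation, not the paper's worked example:

```python
import math

def minimum_curvature(md1, inc1, azi1, md2, inc2, azi2):
    """Increments of TVD, northing, and easting between two survey stations
    by the minimum curvature method. Depths in ft, angles in degrees."""
    i1, i2 = math.radians(inc1), math.radians(inc2)
    a1, a2 = math.radians(azi1), math.radians(azi2)
    # dogleg angle between the two station directions
    cos_dl = (math.cos(i2 - i1)
              - math.sin(i1) * math.sin(i2) * (1 - math.cos(a2 - a1)))
    dl = math.acos(max(-1.0, min(1.0, cos_dl)))
    # ratio factor bends the averaged straight-line result onto a circular arc
    rf = 1.0 if dl < 1e-9 else (2.0 / dl) * math.tan(dl / 2.0)
    dmd = md2 - md1
    dtvd = 0.5 * dmd * (math.cos(i1) + math.cos(i2)) * rf
    dnorth = 0.5 * dmd * (math.sin(i1) * math.cos(a1)
                          + math.sin(i2) * math.cos(a2)) * rf
    deast = 0.5 * dmd * (math.sin(i1) * math.sin(a1)
                         + math.sin(i2) * math.sin(a2)) * rf
    return dtvd, dnorth, deast

# a vertical hold accumulates pure TVD; a build section trades TVD for displacement
print(minimum_curvature(0, 0, 0, 200, 0, 0))
print(minimum_curvature(200, 0, 0, 400, 20, 0))
```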

  19. Comparison between dose values specified at the ICRU reference point and the mean dose to the planning target volume

    International Nuclear Information System (INIS)

    Kukoowicz, Pawel F.; Mijnheer, Bernard J.

    1997-01-01

    Background and purpose: To compare dose values specified at the reference point, as recommended by the International Commission on Radiation Units and Measurements, ICRU, and the mean dose to the planning target volume, PTV. Material and methods: CT-based dose calculations were performed with a 3-D treatment planning system for 6 series of patients treated for bladder, brain, breast, lung, oropharynx and parotid gland tumour. All patients were arbitrarily chosen from a set of previously treated patients irradiated with a two- or three-field technique using customised blocks. Appropriate wedge angles and beam weights were chosen to make the dose distribution as homogeneous as possible. Results: The dose at the ICRU reference point was generally higher than the mean dose to the PTV. The difference between the ICRU reference dose and the mean dose to the PTV for an individual patient was less than 3% in 88% of cases and less than 2% in 72% of the cases. The differences were larger in those patients where the dose distribution is significantly influenced by the presence of lungs or air gaps. For each series of patients the mean difference between the ICRU reference dose and the mean dose to the PTV was calculated. The difference between these two values never exceeded 2%. Because not all planning systems are able to calculate the mean dose to the PTV, the concept of the mean central dose, the mean of the dose values at the centre of the PTV in each CT slice, has been introduced. The mean central dose was also calculated for the same patients and was closer to the mean dose to the PTV than the ICRU reference dose. Conclusion: The mean dose to the PTV is well estimated by either the ICRU reference dose or the mean central dose for a variety of treatment techniques for common types of cancer
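    The mean central dose introduced above is simply the average of the dose values at the centre of the PTV in each CT slice, which can then be compared with the ICRU reference dose and the mean dose to the PTV. A toy numerical illustration (all dose values are invented, not taken from the study):

```python
import numpy as np

# hypothetical per-slice doses (Gy) at the PTV centre in each CT slice
central_doses = np.array([60.2, 60.5, 60.8, 61.0, 60.6, 60.1])
# hypothetical voxel doses (Gy) sampled across the whole PTV
ptv_voxel_doses = np.array([59.0, 60.0, 60.5, 61.2, 61.5, 60.3, 60.8, 59.8])
icru_ref = 61.0                         # dose at the ICRU reference point

mean_central = central_doses.mean()     # mean central dose
mean_ptv = ptv_voxel_doses.mean()       # mean dose to the PTV

for name, val in [("ICRU ref", icru_ref),
                  ("mean central", mean_central),
                  ("mean PTV", mean_ptv)]:
    print(f"{name}: {val:.2f} Gy ({100 * (val - mean_ptv) / mean_ptv:+.1f}% vs mean PTV)")
```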

  20. Computer-aided target tracking in motion analysis studies

    Science.gov (United States)

    Burdick, Dominic C.; Marcuse, M. L.; Mislan, J. D.

    1990-08-01

    Motion analysis studies require the precise tracking of reference objects in sequential scenes. In a typical situation, events of interest are captured at high frame rates using special cameras, and selected objects or targets are tracked on a frame by frame basis to provide necessary data for motion reconstruction. Tracking is usually done using manual methods which are slow and prone to error. A computer based image analysis system has been developed that performs tracking automatically. The objective of this work was to eliminate the bottleneck due to manual methods in high volume tracking applications such as the analysis of crash test films for the automotive industry. The system has proven to be successful in tracking standard fiducial targets and other objects in crash test scenes. Over 95 percent of target positions which could be located using manual methods can be tracked by the system, with a significant improvement in throughput over manual methods. Future work will focus on the tracking of clusters of targets and on tracking deformable objects such as airbags.

  1. Penetration analysis of projectile with inclined concrete target

    Directory of Open Access Journals (Sweden)

    Kim S.B.

    2015-01-01

    Full Text Available This paper presents numerical analysis results of projectile penetration into an inclined concrete target. We applied dynamic material properties of 4340 steel, aluminium, and explosive for the projectile body. The dynamic material properties were measured with a static tensile testing machine and Hopkinson pressure bar tests. Moreover, we used three concrete damage models included in LS-DYNA 3D: SOIL_CONCRETE, CSCM (cap model with smooth interaction), and CONCRETE_DAMAGE (K&C concrete). The strain rate effect for the concrete material is important for predicting the fracture deformation and shape of the concrete and the penetration depth of projectiles, so the CONCRETE_DAMAGE model with strain rate effect was also applied to the penetration analysis. The analysis result with the CSCM model shows good agreement with penetration experimental data. The projectile trace and fracture shapes of the concrete target were compared with experimental data.

  2. Penetration analysis of projectile with inclined concrete target

    Science.gov (United States)

    Kim, S. B.; Kim, H. W.; Yoo, Y. H.

    2015-09-01

    This paper presents numerical analysis results of projectile penetration into an inclined concrete target. We applied dynamic material properties of 4340 steel, aluminium, and explosive for the projectile body. The dynamic material properties were measured with a static tensile testing machine and Hopkinson pressure bar tests. Moreover, we used three concrete damage models included in LS-DYNA 3D: SOIL_CONCRETE, CSCM (cap model with smooth interaction), and CONCRETE_DAMAGE (K&C concrete). The strain rate effect for the concrete material is important for predicting the fracture deformation and shape of the concrete and the penetration depth of projectiles, so the CONCRETE_DAMAGE model with strain rate effect was also applied to the penetration analysis. The analysis result with the CSCM model shows good agreement with penetration experimental data. The projectile trace and fracture shapes of the concrete target were compared with experimental data.

  3. Standard hazard analysis, critical control point and hotel management

    Directory of Open Access Journals (Sweden)

    Vujačić Vesna

    2017-01-01

    Full Text Available Tourism is a dynamic category which is continuously evolving in the world. Compared with the food industry, catering has specific characteristics that must be respected: different food-serving procedures, numerous complex recipes and production technologies, staff turnover, and ageing equipment. Effective and permanent implementation of the HACCP concept requires a solid foundation, which in this case is the people who handle the food. This paper presents international ISO standards, the concept of HACCP and the importance of its application in the tourism and hospitality industry. The HACCP concept is a food safety management system based on the analysis and control of biological, chemical and physical hazards in the entire process, from raw material production, procurement and handling, to manufacturing, distribution and consumption of the finished product. The aim of this paper is to present the importance of the application of the HACCP concept in tourism and hotel management as a recognizable international standard.

  4. Visibility Analysis in a Point Cloud Based on the Medial Axis Transform

    NARCIS (Netherlands)

    Peters, R.; Ledoux, H.; Biljecki, F.

    2015-01-01

    Visibility analysis is an important application of 3D GIS data. Current approaches require 3D city models that are often derived from detailed aerial point clouds. We present an approach to visibility analysis that does not require a city model but works directly on the point cloud. Our approach is

  5. Integrative analysis to select cancer candidate biomarkers to targeted validation

    Science.gov (United States)

    Heberle, Henry; Domingues, Romênia R.; Granato, Daniela C.; Yokoo, Sami; Canevarolo, Rafael R.; Winck, Flavia V.; Ribeiro, Ana Carolina P.; Brandão, Thaís Bianca; Filgueiras, Paulo R.; Cruz, Karen S. P.; Barbuto, José Alexandre; Poppi, Ronei J.; Minghim, Rosane; Telles, Guilherme P.; Fonseca, Felipe Paiva; Fox, Jay W.; Santos-Silva, Alan R.; Coletta, Ricardo D.; Sherman, Nicholas E.; Paes Leme, Adriana F.

    2015-01-01

    Targeted proteomics has flourished as the method of choice for prospecting for and validating potential candidate biomarkers in many diseases. However, challenges still remain due to the lack of standardized routines that can prioritize a limited number of proteins to be further validated in human samples. To help researchers identify candidate biomarkers that best characterize their samples under study, a well-designed integrative analysis pipeline, comprising MS-based discovery, feature selection methods, clustering techniques, bioinformatic analyses and targeted approaches was performed using discovery-based proteomic data from the secretomes of three classes of human cell lines (carcinoma, melanoma and non-cancerous). Three feature selection algorithms, namely, Beta-binomial, Nearest Shrunken Centroids (NSC), and Support Vector Machine-Recursive Features Elimination (SVM-RFE), indicated a panel of 137 candidate biomarkers for carcinoma and 271 for melanoma, which were differentially abundant between the tumor classes. We further tested the strength of the pipeline in selecting candidate biomarkers by immunoblotting, human tissue microarrays, label-free targeted MS and functional experiments. In conclusion, the proposed integrative analysis was able to pre-qualify and prioritize candidate biomarkers from discovery-based proteomics to targeted MS. PMID:26540631

  6. Microbiological analysis of critical points in the chicken industry

    Directory of Open Access Journals (Sweden)

    Rogério Luis Cansian

    2005-05-01

    Full Text Available The objective of this work was to identify microbial contamination in the chicken scalding, asepsis, and cooling (chiller) processes, and in fresh chicken sausages produced from these carcasses. Samples were collected at a poultry slaughterhouse on seven dates and analyzed in triplicate. Salmonella was identified in two scald-water samples but was absent from the chiller water and from the final product, which can be explained by chlorine addition and the reduction of the water temperature. The MPN of fecal coliforms ranged from <1 to 11/ml in the scald water and from <1 to 64/ml in the chiller water; although within acceptable standards, these counts show a tendency to increase in the chiller, due mainly to the evisceration process. Aeromonas counts ranged from 5 to 3.5x10¹ CFU/ml in the scald water and from 9 to 3.7x10² CFU/ml in the chiller water; this increase probably occurs because Aeromonas is psychrophilic, and also because of the removal of the viscera. Analysis of the chicken sausages showed a further increase in Aeromonas counts, reaching up to 2.5x10³ CFU/g. This growth trend in the final product, combined with the capacity of Aeromonas to cause infections, demonstrates the need to include Aeromonas in the microbiological evaluation of foods.

  7. Single-Molecule Analysis for RISC Assembly and Target Cleavage.

    Science.gov (United States)

    Sasaki, Hiroshi M; Tadakuma, Hisashi; Tomari, Yukihide

    2018-01-01

    RNA-induced silencing complex (RISC) is a small RNA-protein complex that mediates silencing of complementary target RNAs. Biochemistry has been successfully used to characterize the molecular mechanism of RISC assembly and function for nearly two decades. However, further dissection of intermediate states during the reactions has been warranted to fill in the gaps in our understanding of RNA silencing mechanisms. Single-molecule analysis with total internal reflection fluorescence (TIRF) microscopy is a powerful imaging-based approach to interrogate complex formation and dynamics at the individual molecule level with high sensitivity. Combining this technique with our recently established in vitro reconstitution system of fly Ago2-RISC, we have developed a single-molecule observation system for RISC assembly. In this chapter, we summarize the detailed protocol for single-molecule analysis of chaperone-assisted assembly of fly Ago2-RISC as well as its target cleavage reaction.

  8. The Purification Method of Matching Points Based on Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    DONG Yang

    2017-02-01

    Full Text Available The traditional purification method of matching points usually uses a small number of points as the initial input. Although this can meet most point-constraint requirements, the iterative purification solution easily falls into local extrema, which results in the loss of correct matching points. To solve this problem, we introduce the principal component analysis method and use the whole point set as the initial input. Through stepwise elimination of mismatched points and robust solving, a more accurate global optimal solution can be obtained, which reduces the omission rate of correct matching points and thus achieves a better purification effect. Experimental results show that this method can obtain the global optimal solution under a certain initial false-matching rate, and can decrease or avoid the omission of correct matching points.
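    One simple way to see how principal component analysis can flag mismatched point pairs is to project each pair's displacement vector onto the principal axes of all displacements and remove pairs that lie far from the bulk. This is a simplified sketch of the idea, not the paper's algorithm; the threshold and data are invented:

```python
import numpy as np

def purify_matches(src, dst, n_sigma=2.0):
    """Flag likely mismatches among matched point pairs: compute the PCA of
    all displacement vectors, then reject pairs whose Mahalanobis-style
    distance in the principal-axis basis exceeds n_sigma."""
    d = dst - src                                 # displacement of each pair
    dc = d - d.mean(axis=0)
    vals, vecs = np.linalg.eigh(np.cov(dc.T))     # principal axes/variances
    proj = dc @ vecs
    dist = np.sqrt(((proj ** 2) / np.maximum(vals, 1e-12)).sum(axis=1))
    return dist < n_sigma                         # True = keep the pair

rng = np.random.default_rng(1)
src = rng.uniform(0, 100, (50, 2))
dst = src + np.array([5.0, 2.0]) + rng.normal(0, 0.1, (50, 2))
dst[0] += 30.0                                    # plant one gross mismatch
keep = purify_matches(src, dst)
print(keep[0], int(keep.sum()))
```

    The planted mismatch is rejected because its displacement is far from the consistent translation shared by the true matches.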

  9. Extended Fitts' model of pointing time in eye-gaze input system - Incorporating effects of target shape and movement direction into modeling.

    Science.gov (United States)

    Murata, Atsuo; Fukunaga, Daichi

    2018-04-01

    This study investigated the effects of the target shape and the movement direction on the pointing time using an eye-gaze input system, and extended Fitts' model so that these factors are incorporated into it and its predictive power is enhanced. The target shape, the target size, the movement distance, and the direction of target presentation were set as within-subject experimental variables. The target shapes included a circle and rectangles with aspect ratios of 1:1, 1:2, 1:3, and 1:4. The movement directions included eight directions: upper, lower, left, right, upper left, upper right, lower left, and lower right. On the basis of the data identifying the effects of the target shape and the movement direction on the pointing time, an attempt was made to develop a generalized and extended Fitts' model that takes the movement direction and the target shape into account. The generalized and extended model was found to fit the experimental data better and to be more effective for predicting the pointing time in a variety of human-computer interaction (HCI) tasks using an eye-gaze input system. Copyright © 2017. Published by Elsevier Ltd.
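    The baseline that the extended model builds on is the Shannon form of Fitts' law, MT = a + b·log2(D/W + 1), fitted by least squares. A minimal sketch with invented pointing-time data follows; the paper's extension adds shape and direction terms, which are omitted here:

```python
import numpy as np

# Invented pointing times MT (s) for movement distances D and target widths W (px)
D = np.array([100, 200, 400, 100, 200, 400], dtype=float)
W = np.array([20, 20, 20, 40, 40, 40], dtype=float)
MT = np.array([0.59, 0.72, 0.86, 0.47, 0.59, 0.72])

ID = np.log2(D / W + 1.0)                    # Shannon index of difficulty
A = np.column_stack([np.ones_like(ID), ID])  # design matrix for MT = a + b*ID
(a, b), *_ = np.linalg.lstsq(A, MT, rcond=None)
print(f"MT = {a:.3f} + {b:.3f} * ID")
```

    The fitted slope b is the time cost per bit of difficulty; the extended model in the paper makes a and b depend on target aspect ratio and movement direction.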

  10. Bioinformatics analysis of Brucella vaccines and vaccine targets using VIOLIN.

    Science.gov (United States)

    He, Yongqun; Xiang, Zuoshuang

    2010-09-27

    Brucella spp. are Gram-negative, facultative intracellular bacteria that cause brucellosis, one of the commonest zoonotic diseases found worldwide in humans and a variety of animal species. While several animal vaccines are available, there is no effective and safe vaccine for prevention of brucellosis in humans. VIOLIN (http://www.violinet.org) is a web-based vaccine database and analysis system that curates, stores, and analyzes published data of commercialized vaccines, and vaccines in clinical trials or in research. VIOLIN contains information for 454 vaccines or vaccine candidates for 73 pathogens. VIOLIN also contains many bioinformatics tools for vaccine data analysis, data integration, and vaccine target prediction. To demonstrate the applicability of VIOLIN for vaccine research, VIOLIN was used for bioinformatics analysis of existing Brucella vaccines and prediction of new Brucella vaccine targets. VIOLIN contains many literature mining programs (e.g., Vaxmesh) that provide in-depth analysis of Brucella vaccine literature. As a result of manual literature curation, VIOLIN contains information for 38 Brucella vaccines or vaccine candidates, 14 protective Brucella antigens, and 68 host response studies to Brucella vaccines from 97 peer-reviewed articles. These Brucella vaccines are classified in the Vaccine Ontology (VO) system and used for different ontological applications. The web-based VIOLIN vaccine target prediction program Vaxign was used to predict new Brucella vaccine targets. Vaxign identified 14 outer membrane proteins that are conserved in six virulent strains from B. abortus, B. melitensis, and B. suis that are pathogenic in humans. Of the 14 membrane proteins, two proteins (Omp2b and Omp31-1) are not present in B. ovis, a Brucella species that is not pathogenic in humans. Brucella vaccine data stored in VIOLIN were compared and analyzed using the VIOLIN query system. Bioinformatics curation and ontological representation of Brucella vaccines

  11. Influence of the distance between target surface and focal point on the expansion dynamics of a laser-induced silicon plasma with spatial confinement

    Science.gov (United States)

    Zhang, Dan; Chen, Anmin; Wang, Xiaowei; Wang, Ying; Sui, Laizhi; Ke, Da; Li, Suyu; Jiang, Yuanfei; Jin, Mingxing

    2018-05-01

    Expansion dynamics of a laser-induced plasma plume, with spatial confinement, for various distances between the target surface and focal point were studied by the fast photography technique. A silicon wafer was ablated to induce the plasma with a Nd:YAG laser in an atmospheric environment. The expansion dynamics of the plasma plume depended on the distance between the target surface and focal point. In addition, spatially confined time-resolved images showed the different structures of the plasma plumes at different distances between the target surface and focal point. By analyzing the plume images, the optimal distance for emission enhancement was found to be approximately 6 mm away from the geometrical focus using a 10 cm focal length lens. This optimized distance resulted in the strongest compression ratio of the plasma plume by the reflected shock wave. Furthermore, the duration of the interaction between the reflected shock wave and the plasma plume was also prolonged.

  12. Thermal Analysis of Fission Moly Target Solid Waste Storage

    Energy Technology Data Exchange (ETDEWEB)

    Son, Hyung Min; Park, Jonghark [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    There are various ways to produce Mo-99. Among them, nuclear transmutation of a uranium target became the major one owing to its superior specific activity. After the fission molybdenum (FM) target is irradiated, it is transported to a treatment facility to extract the wanted isotope. During this process, various forms of waste are produced, including filter cake and other solid wastes. The filter cake consists mostly of decaying uranium compounds. The solid wastes are then packaged and moved to a storage facility, where they stay for a considerable amount of time. Because the solid wastes are a continuous source of heat, they are required to be cooled for a certain amount of time before being transported to the storage area. In this study, a temperature evaluation of the storage facility is carried out, with a sensitivity study on pre-cooling time, to check its thermal integrity. Thermal analysis of the FM target solid waste storage is performed using the finite volume method to numerically discretize and solve the geometry of interest. The analysis shows that the developed method can simulate the temperature behavior during the storage process, but it needs to be checked against another code to assess its calculation accuracy. The highest temperature distribution is observed when every hole is filled with waste containers. Sensitivity results on pre-cooling time show that at least 13 months of cooling is necessary to keep the structural integrity.

  13. SU-F-T-36: Dosimetric Comparison of Point Based Vs. Target Based Prescription for Intracavitary Brachytherapy in Cancer of the Cervix

    Energy Technology Data Exchange (ETDEWEB)

    Ashenafi, M; McDonald, D; Peng, J; Mart, C; Koch, N; Cooper, L; Vanek, K [Medical University of South Carolina, Charleston, SC (United States)

    2016-06-15

    Purpose: Improved patient imaging used for planning the treatment of cervical cancer with Tandem and Ovoid (T&O) Intracavitary high-dose-rate brachytherapy (HDR) now allows for 3D delineation of target volumes and organs-at-risk. However, historical data relies on the conventional point A-based planning technique. A comparative dosimetric study was performed by generating both target-based (TBP) and point-based (PBP) plans for ten clinical patients. Methods: Treatment plans created using Elekta Oncentra v. 4.3 for ten consecutive cervical cancer patients were analyzed. All patients were treated with HDR using the Utrecht T&O applicator. Both CT and MRI imaging modalities were utilized to delineate clinical target volume (CTV) and organs-at-risk (rectum, sigmoid, bladder, and small bowel). Point A (left and right), vaginal mucosa, and ICRU rectum and bladder points were defined on CT. Two plans were generated for each patient using two prescription methods (PBP and TBP). 7Gy was prescribed to each point A for each PBP plan and to the target D90% for each TBP plan. Target V90%, V100%, and V200% were evaluated. In addition, D0.1cc and D2cc were analyzed for each organ-at-risk. Differences were assessed for statistical significance (p<0.05) by use of Student’s t-test. Results: Target coverage was comparable for both planning methods, with each method providing adequate target coverage. TBP showed lower absolute dose to the target volume than PBP (D90% = 7.0Gy vs. 7.4Gy, p=0.028), (V200% = 10.9cc vs. 12.8cc, p=0.014), (ALeft = 6.4Gy vs. 7Gy, p=0.009), and (ARight = 6.4Gy vs. 7Gy, p=0.013). TBP also showed a statistically significant reduction in bladder, rectum, small bowel, and sigmoid doses compared to PBP. There was no statistically significant difference in vaginal mucosa or ICRU-defined rectum and bladder dose. Conclusion: Target based prescription resulted in substantially lower dose to delineated organs-at-risk compared to point based prescription, while

  14. Dual keel Space Station payload pointing system design and analysis feasibility study

    Science.gov (United States)

    Smagala, Tom; Class, Brian F.; Bauer, Frank H.; Lebair, Deborah A.

    1988-01-01

    A Space Station attached Payload Pointing System (PPS) has been designed and analyzed. The PPS is responsible for maintaining fixed payload pointing in the presence of disturbance applied to the Space Station. The payload considered in this analysis is the Solar Optical Telescope. System performance is evaluated via digital time simulations by applying various disturbance forces to the Space Station. The PPS meets the Space Station articulated pointing requirement for all disturbances except Shuttle docking and some centrifuge cases.

  15. Prediction methodologies for target scene generation in the aerothermal targets analysis program (ATAP)

    Science.gov (United States)

    Hudson, Douglas J.; Torres, Manuel; Dougherty, Catherine; Rajendran, Natesan; Thompson, Rhoe A.

    2003-09-01

    The Air Force Research Laboratory (AFRL) Aerothermal Targets Analysis Program (ATAP) is a user-friendly, engineering-level computational tool that features integrated aerodynamics, six-degree-of-freedom (6-DoF) trajectory/motion, convective and radiative heat transfer, and thermal/material response to provide an optimal blend of accuracy and speed for design and analysis applications. ATAP is sponsored by the Kinetic Kill Vehicle Hardware-in-the-Loop Simulator (KHILS) facility at Eglin AFB, where it is used with the CHAMP (Composite Hardbody and Missile Plume) technique for rapid infrared (IR) signature and imagery predictions. ATAP capabilities include an integrated 1-D conduction model for up to 5 in-depth material layers (with options for gaps/voids with radiative heat transfer), fin modeling, several surface ablation modeling options, a materials library with over 250 materials, options for user-defined materials, selectable/definable atmosphere and earth models, multiple trajectory options, and an array of aerodynamic prediction methods. All major code modeling features have been validated with ground-test data from wind tunnels, shock tubes, and ballistics ranges, and flight-test data for both U.S. and foreign strategic and theater systems. Numerous applications include the design and analysis of interceptors, booster and shroud configurations, window environments, tactical missiles, and reentry vehicles.

  16. Homotopy analysis solutions of point kinetics equations with one delayed precursor group

    International Nuclear Information System (INIS)

    Zhu Qian; Luo Lei; Chen Zhiyun; Li Haofeng

    2010-01-01

    The homotopy analysis method is proposed to obtain series solutions of nonlinear differential equations. It was applied to the point kinetics equations with one delayed precursor group: analytic solutions were obtained, and the algorithm was analysed. The results show that the computation time and precision of the algorithm meet engineering requirements. (authors)
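    For reference, the one-delayed-group point kinetics equations solved in the paper are dn/dt = ((ρ − β)/Λ)n + λC and dC/dt = (β/Λ)n − λC. The paper derives homotopy analysis series solutions; the sketch below instead integrates the system numerically with explicit Euler steps and illustrative parameter values (not taken from the paper), just to show the equations in action:

```python
def point_kinetics(rho, beta=0.0065, Lam=1e-4, lam=0.08,
                   t_end=1.0, dt=1e-5):
    """Integrate the one-delayed-group point kinetics equations for a
    constant reactivity step rho, starting from equilibrium at n = 1.
    Explicit Euler with a small step; parameter values are illustrative."""
    n, C = 1.0, beta / (Lam * lam)      # equilibrium precursor level at rho = 0
    for _ in range(int(t_end / dt)):
        dn = ((rho - beta) / Lam) * n + lam * C
        dC = (beta / Lam) * n - lam * C
        n += dn * dt
        C += dC * dt
    return n

# a positive reactivity step below prompt critical: prompt jump, then slow rise
print(point_kinetics(rho=0.001))
```

    With ρ = 0 the populations stay at equilibrium; with a small positive step the neutron population jumps promptly toward β/(β − ρ) and then grows on the stable reactor period.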

  17. Seafood safety: economics of hazard analysis and Critical Control Point (HACCP) programmes

    National Research Council Canada - National Science Library

    Cato, James C

    1998-01-01

    .... This document on economic issues associated with seafood safety was prepared to complement the work of the Service in seafood technology, plant sanitation and Hazard Analysis Critical Control Point (HACCP) implementation...

  18. Ergodic Capacity Analysis of Free-Space Optical Links with Nonzero Boresight Pointing Errors

    KAUST Repository

    Ansari, Imran Shafique; Alouini, Mohamed-Slim; Cheng, Julian

    2015-01-01

A unified capacity analysis of a free-space optical (FSO) link that accounts for nonzero boresight pointing errors and both types of detection techniques (i.e. intensity modulation/direct detection as well as heterodyne detection) is addressed.

  19. Stakeholder analysis and mapping as targeted communication strategy.

    Science.gov (United States)

    Shirey, Maria R

    2012-09-01

    This department highlights change management strategies that may be successful in strategically planning and executing organizational change initiatives. With the goal of presenting practical approaches helpful to nurse leaders advancing organizational change, content includes evidence-based projects, tools, and resources that mobilize and sustain organizational change initiatives. In this article, the author highlights the importance of stakeholder theory and discusses how to apply the theory to conduct a stakeholder analysis. This article also provides an explanation of how to use related stakeholder mapping techniques with targeted communication strategies.

  20. Molecular analysis of point mutations in a barley genome exposed to MNU and gamma rays

    Energy Technology Data Exchange (ETDEWEB)

    Kurowska, Marzena, E-mail: mkurowsk@us.edu.pl [Department of Genetics, Faculty of Biology and Environmental Protection, University of Silesia, Jagiellonska 28, 40-032 Katowice (Poland); Labocha-Pawlowska, Anna; Gnizda, Dominika; Maluszynski, Miroslaw; Szarejko, Iwona [Department of Genetics, Faculty of Biology and Environmental Protection, University of Silesia, Jagiellonska 28, 40-032 Katowice (Poland)

    2012-10-15

We present studies aimed at determining the types and frequencies of mutations induced in the barley genome after treatment with chemical (N-methyl-N-nitrosourea, MNU) and physical (gamma rays) mutagens. We created M2 populations of a doubled haploid line and used them for the analysis of mutations in targeted DNA sequences and over the entire barley genome using the TILLING (Targeting Induced Local Lesions in Genomes) and AFLP (Amplified Fragment Length Polymorphism) techniques, respectively. Based on the TILLING analysis of a total DNA sequence of 4,537,117 bp in the MNU population, the average mutation density was estimated as 1/504 kb. Only one nucleotide change was found after an analysis of 3,207,444 bp derived from the highest dose of gamma rays applied. MNU was clearly a more efficient mutagen than gamma rays in inducing point mutations in barley. The majority (63.6%) of the MNU-induced nucleotide changes were transitions, with a similar number of G > A and C > T substitutions. The similar share of G > A and C > T transitions indicates a lack of bias in the repair of O6-methylguanine lesions between DNA strands. There was, however, a strong specificity of the nucleotide surrounding the O6-meG at the -1 position: purines formed 81% of the nucleotides observed at the -1 site. Scanning the barley genome with AFLP markers revealed a level of AFLP polymorphism ca. three times higher in the MNU-treated than in the gamma-irradiated population. In order to check whether AFLP markers can really scan the whole barley genome for mutagen-induced polymorphism, 114 different AFLP products were cloned and sequenced. 94% of the bands were heterogenic, with some bands containing up to 8 different amplicons. The polymorphic AFLP products were characterised in terms of their similarity to the records deposited in the GenBank database. The types of sequences present in the polymorphic bands reflected the organisation of the barley genome.

  1. Ideal MHD stability analysis of KSTAR target AT mode

    International Nuclear Information System (INIS)

    Yi, S.M.; Kim, J.H.; You, K.I.; Kim, J.Y.

    2009-01-01

A main research objective of the KSTAR (Korea Superconducting Tokamak Advanced Research) device is to demonstrate the steady-state operation capability of a high-performance AT (Advanced Tokamak) mode. To meet this goal, it is critical for KSTAR to have a good MHD stability boundary, particularly against high-beta ideal instabilities such as the external kink and ballooning modes. To support this MHD stability, KSTAR has been designed to have a strong plasma shape and a close interval between the plasma and the passive-plate wall. During the conceptual design phase of KSTAR, a preliminary study was performed to estimate the high-beta MHD stability limit of the KSTAR target AT mode using the PEST and VACUUM codes, and it was shown that the target AT mode can be stable up to βN ∼ 5 with well-defined plasma pressure and current profiles. Recently, a new calculation has been performed to estimate the ideal stability limit in various KSTAR operating conditions using the DCON code, and some difference has been observed between the new and old calculation results, particularly in the dependence of the maximum βN value on the toroidal mode number. Here, we thus present a more detailed analysis of the ideal MHD stability limit of the KSTAR target AT mode, comparing results among three codes: GATO as well as PEST and DCON. (author)

  2. Analysis of relationship between registration performance of point cloud statistical model and generation method of corresponding points

    International Nuclear Information System (INIS)

    Yamaoka, Naoto; Watanabe, Wataru; Hontani, Hidekata

    2010-01-01

When constructing a statistical point cloud model, we usually need to compute corresponding points, and the resulting statistical model differs depending on the method used to compute them. This article examines how different methods of computing corresponding points affect statistical models of human organs. We validated the performance of each statistical model by registering it to an organ surface in a 3D medical image. We compare two methods for computing corresponding points. The first, Generalized Multi-Dimensional Scaling (GMDS), determines the corresponding points from the shapes of two curved surfaces. The second, the entropy-based particle system, chooses corresponding points by statistically evaluating a number of curved surfaces. With these methods we constructed the statistical models, and using these models we conducted registration with the medical image. For the estimation we use non-parametric belief propagation, which estimates not only the position of the organ but also the probability density of the organ position. We evaluate how the two methods of computing corresponding points affect the statistical model through the change in the probability density of each point. (author)

  3. Global analysis of small molecule binding to related protein targets.

    Directory of Open Access Journals (Sweden)

    Felix A Kruger

    2012-01-01

We report on the integration of pharmacological data and homology information for a large-scale analysis of small molecule binding to related targets. Differences in small molecule binding have been assessed for curated pairs of human to rat orthologs and also for recently diverged human paralogs. Our analysis shows that, in general, small molecule binding is conserved for pairs of human to rat orthologs. Using statistical tests, we identified a small number of cases where small molecule binding differs between human and rat, some of which had previously been reported in the literature. Knowledge of species-specific pharmacology can be advantageous for drug discovery, where rats are frequently used as a model system. For human paralogs, we demonstrate a global correlation between sequence identity and the binding of small molecules with equivalent affinity. Our findings provide an initial model relating small molecule binding and sequence divergence, laying the foundations for a general model to anticipate and predict within-target-family selectivity.

  4. Simultaneous colour visualizations of multiple ALS point cloud attributes for land cover and vegetation analysis

    Science.gov (United States)

    Zlinszky, András; Schroiff, Anke; Otepka, Johannes; Mandlburger, Gottfried; Pfeifer, Norbert

    2014-05-01

    LIDAR point clouds hold valuable information for land cover and vegetation analysis, not only in the spatial distribution of the points but also in their various attributes. However, LIDAR point clouds are rarely used for visual interpretation, since for most users, the point cloud is difficult to interpret compared to passive optical imagery. Meanwhile, point cloud viewing software is available allowing interactive 3D interpretation, but typically only one attribute at a time. This results in a large number of points with the same colour, crowding the scene and often obscuring detail. We developed a scheme for mapping information from multiple LIDAR point attributes to the Red, Green, and Blue channels of a widely used LIDAR data format, which are otherwise mostly used to add information from imagery to create "photorealistic" point clouds. The possible combinations of parameters are therefore represented in a wide range of colours, but relative differences in individual parameter values of points can be well understood. The visualization was implemented in OPALS software, using a simple and robust batch script, and is viewer independent since the information is stored in the point cloud data file itself. In our case, the following colour channel assignment delivered best results: Echo amplitude in the Red, echo width in the Green and normalized height above a Digital Terrain Model in the Blue channel. With correct parameter scaling (but completely without point classification), points belonging to asphalt and bare soil are dark red, low grassland and crop vegetation are bright red to yellow, shrubs and low trees are green and high trees are blue. Depending on roof material and DTM quality, buildings are shown from red through purple to dark blue. Erroneously high or low points, or points with incorrect amplitude or echo width usually have colours contrasting from terrain or vegetation. This allows efficient visual interpretation of the point cloud in planar
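The channel assignment described above can be sketched as a simple min-max scaling of three point attributes into 8-bit RGB (a minimal illustration only; the paper's implementation is an OPALS batch script, and the scaling ranges here are assumptions, not values from the data set):

```python
import numpy as np

def attributes_to_rgb(amplitude, echo_width, height_above_dtm,
                      ranges=((0.0, 400.0), (2.0, 8.0), (0.0, 30.0))):
    """Map three per-point LIDAR attributes to 8-bit RGB channels:
    echo amplitude -> Red, echo width -> Green, height above DTM -> Blue.
    The (lo, hi) scaling ranges are hypothetical and must be tuned per data set."""
    def scale(values, lo, hi):
        v = np.clip((np.asarray(values, dtype=float) - lo) / (hi - lo), 0.0, 1.0)
        return (v * 255).astype(np.uint8)
    r = scale(amplitude, *ranges[0])
    g = scale(echo_width, *ranges[1])
    b = scale(height_above_dtm, *ranges[2])
    return np.stack([r, g, b], axis=-1)

# A low-vegetation point: strong amplitude, narrow echo, near the terrain
print(attributes_to_rgb([350.0], [2.5], [0.2])[0])  # reddish: high R, low G/B
```

Storing the result in the point cloud's RGB fields keeps the visualization viewer-independent, as the abstract notes.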

  5. Definition of distance for nonlinear time series analysis of marked point process data

    Energy Technology Data Exchange (ETDEWEB)

    Iwayama, Koji, E-mail: koji@sat.t.u-tokyo.ac.jp [Research Institute for Food and Agriculture, Ryukoku Univeristy, 1-5 Yokotani, Seta Oe-cho, Otsu-Shi, Shiga 520-2194 (Japan); Hirata, Yoshito; Aihara, Kazuyuki [Institute of Industrial Science, The University of Tokyo, 4-6-1 Komaba, Meguro-ku, Tokyo 153-8505 (Japan)

    2017-01-30

    Marked point process data are time series of discrete events accompanied with some values, such as economic trades, earthquakes, and lightnings. A distance for marked point process data allows us to apply nonlinear time series analysis to such data. We propose a distance for marked point process data which can be calculated much faster than the existing distance when the number of marks is small. Furthermore, under some assumptions, the Kullback–Leibler divergences between posterior distributions for neighbors defined by this distance are small. We performed some numerical simulations showing that analysis based on the proposed distance is effective. - Highlights: • A new distance for marked point process data is proposed. • The distance can be computed fast enough for a small number of marks. • The method to optimize parameter values of the distance is also proposed. • Numerical simulations indicate that the analysis based on the distance is effective.

  6. Protein targeting in the analysis of learning and memory: a potential alternative to gene targeting.

    Science.gov (United States)

    Gerlai, R; Williams, S P; Cairns, B; Van Bruggen, N; Moran, P; Shih, A; Caras, I; Sauer, H; Phillips, H S; Winslow, J W

    1998-11-01

Gene targeting using homologous recombination in embryonic stem (ES) cells offers unprecedented precision with which one may manipulate single genes and investigate the in vivo effects of defined mutations in the mouse. Geneticists argue that this technique abrogates the lack of highly specific pharmacological tools in the study of brain function and behavior. However, by now it has become clear that gene targeting has some limitations too. One problem is the spatial and temporal specificity of the generated mutation, which may appear in multiple brain regions or even in other organs and may also be present throughout development, giving rise to complex, secondary phenotypical alterations. This may be a disadvantage in the functional analysis of a number of genes associated with learning and memory processes. For example, several proteins that play a significant developmental role, including neurotrophins, cell-adhesion molecules, and protein kinases, have recently been suggested to be also involved in neural and behavioral plasticity. Knocking out genes of such proteins may lead to developmental alterations or even embryonic lethality in the mouse, making it difficult to study their function in neural plasticity, learning, and memory. Therefore, alternative strategies to gene targeting may be needed. Here, we suggest a potentially useful in vivo strategy based on systemic application of immunoadhesins, genetically engineered fusion proteins possessing the Fc portion of the human IgG molecule and, for example, a binding domain of a receptor of interest. These proteins are stable in vivo and exhibit high binding specificity and affinity for the endogenous ligand of the receptor, but lack the ability to signal. Thus, if delivered to the brain, immunoadhesins may specifically block signalling of the receptor of interest. Using osmotic minipumps, the protein can be infused in a localized region of the brain for a specified period of time (days or weeks). Thus, the location

  7. Analysis of kinematically redundant reaching movements using the equilibrium-point hypothesis.

    Science.gov (United States)

    Cesari, P; Shiratori, T; Olivato, P; Duarte, M

    2001-03-01

Six subjects performed a planar reaching arm movement to a target while unpredictable perturbations were applied to the endpoint; the perturbations consisted of pulling springs of different stiffness. Two conditions were applied: in the first, subjects had to reach for the target despite the perturbation; in the second, subjects were asked not to correct the motion when a perturbation was applied. We analyzed the kinematic profiles of the three arm segments and, by means of inverse dynamics, calculated the joint torques. The framework of the equilibrium-point (EP) hypothesis, the lambda model, allowed the reconstruction of the control variables, the "equilibrium trajectories", in the "do not correct" condition for the wrist and elbow joints as well as for the endpoint final position, while for the other condition the reconstruction was less reliable. The findings support the paradigm of the EP hypothesis, along with the "do not correct" instruction, and extend it to a multiple-joint planar movement.

  8. Environmental protection standards - from the point of view of systems analysis

    Energy Technology Data Exchange (ETDEWEB)

    Becker, K

    1978-11-01

    A project of the International Institute of Applied Systems Analysis (IIASA) in Laxenburg castle near Vienna is reviewed where standards for environmental protection are interpreted from the point of view of systems analysis. Some examples are given to show how results are influenced not only by technical and economic factors but also by psychological and political factors.

  9. Environmental protection standards - from the point of view of systems analysis

    International Nuclear Information System (INIS)

    Becker, K.

    1978-01-01

A project of the International Institute of Applied Systems Analysis (IIASA) in Laxenburg castle near Vienna is reviewed where standards for environmental protection are interpreted from the point of view of systems analysis. Some examples are given to show how results are influenced not only by technical and economic factors but also by psychological and political factors. (orig.)

  10. Challenges in thermal and hydraulic analysis of ADS target systems

    International Nuclear Information System (INIS)

    Groetzbach, G.; Batta, A.; Lefhalm, C.-H.; Otic, I.

    2004-01-01

The liquid metal cooled spallation targets of Accelerator Driven nuclear reactor Systems are subject to high thermal loads; in addition, some flow and cooling conditions are of a prototypical character, while the operating margins for the materials involved are narrow; thus, target development requires very careful analysis by experimental and numerical means. Especially the cooling of the steel window, which is heated by the proton beam, needs special care. Some of the main goals of the experimental and numerical analyses of the thermal dynamics of those systems are discussed. The prediction of locally detached flows and of flows with larger recirculation areas suffers from insufficient turbulence modeling; this has to be compensated by using prototypical model experiments, e.g. with water, to select adequate models and numerical schemes. The well-known problems with the Reynolds analogy in predicting heat transfer in liquid metals always require prototypic liquid metal experiments to select and adapt the turbulent heat flux models. The uncertainties in liquid metal experiments cannot be neglected, so it is necessary to perform CFD calculations and experiments hand in hand and to develop improved turbulent heat flux models. One contribution to an improved 3- or 4-equation model is deduced from recent Direct Numerical Simulation (DNS) data. (author)

  11. SeedVicious: Analysis of microRNA target and near-target sites.

    Science.gov (United States)

    Marco, Antonio

    2018-01-01

    Here I describe seedVicious, a versatile microRNA target site prediction software that can be easily fitted into annotation pipelines and run over custom datasets. SeedVicious finds microRNA canonical sites plus other, less efficient, target sites. Among other novel features, seedVicious can compute evolutionary gains/losses of target sites using maximum parsimony, and also detect near-target sites, which have one nucleotide different from a canonical site. Near-target sites are important to study population variation in microRNA regulation. Some analyses suggest that near-target sites may also be functional sites, although there is no conclusive evidence for that, and they may actually be target alleles segregating in a population. SeedVicious does not aim to outperform but to complement existing microRNA prediction tools. For instance, the precision of TargetScan is almost doubled (from 11% to ~20%) when we filter predictions by the distance between target sites using this program. Interestingly, two adjacent canonical target sites are more likely to be present in bona fide target transcripts than pairs of target sites at slightly longer distances. The software is written in Perl and runs on 64-bit Unix computers (Linux and MacOS X). Users with no computing experience can also run the program in a dedicated web-server by uploading custom data, or browse pre-computed predictions. SeedVicious and its associated web-server and database (SeedBank) are distributed under the GPL/GNU license.
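The notion of a canonical seed site that such tools search for can be sketched as follows (a general illustration of 7mer-m8 seed matching, not seedVicious's actual Perl implementation; the example miRNA and UTR sequences are hypothetical):

```python
# A 7mer-m8 canonical site in a target UTR is the reverse complement
# of microRNA nucleotides 2-8 (the "seed" region).

COMPLEMENT = {"A": "T", "U": "A", "G": "C", "C": "G"}

def seed_sites(mirna, utr):
    """Return start positions of 7mer-m8 seed matches in a DNA UTR."""
    seed = mirna[1:8]                                  # miRNA positions 2-8
    site = "".join(COMPLEMENT[b] for b in reversed(seed))
    return [i for i in range(len(utr) - 6) if utr[i:i + 7] == site]

# Hypothetical example: a miRNA carrying the let-7 family seed GAGGUAG
# matches the DNA site CTACCTC at position 3 of this toy UTR.
print(seed_sites("UGAGGUAGUAGGUUGUAUAGUU", "AAACTACCTCGGG"))  # [3]
```

Near-target sites in the sense of the abstract would then be sequences at Hamming distance one from `site`, which is a straightforward extension of this search.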

  12. Datum Feature Extraction and Deformation Analysis Method Based on Normal Vector of Point Cloud

    Science.gov (United States)

    Sun, W.; Wang, J.; Jin, F.; Liang, Z.; Yang, Y.

    2018-04-01

To address the lack of an applicable analysis method when applying three-dimensional laser scanning technology to deformation monitoring, an efficient method for extracting datum features and analysing deformation based on the normal vectors of a point cloud is proposed. First, a kd-tree is used to establish the topological relations. Datum points are detected by tracking the point cloud normal vectors determined from local planar fits. Then, cubic B-spline curve fitting is performed on the datum points. Finally, the datum elevation and the inclination angle of each radial point are calculated from the fitted curve, and the deformation information is analysed. The proposed approach was verified on a real large-scale tank data set captured with a terrestrial laser scanner in a chemical plant. The results show that the method can obtain complete information about the monitored object quickly and comprehensively, and accurately reflects the deformation of the datum feature.
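The per-point normal estimation underlying such methods can be sketched as a PCA plane fit over each point's nearest neighbours (a minimal illustration, not the authors' implementation; brute-force search stands in for the kd-tree, and k = 8 is an arbitrary choice):

```python
import numpy as np

def estimate_normals(points, k=8):
    """Estimate a unit normal for every point as the eigenvector of the
    smallest eigenvalue of the covariance of its k nearest neighbours
    (a PCA plane fit). Brute-force neighbour search stands in for the
    kd-tree used in practice."""
    pts = np.asarray(points, dtype=float)
    normals = np.empty_like(pts)
    for i, p in enumerate(pts):
        d = np.linalg.norm(pts - p, axis=1)
        nbrs = pts[np.argsort(d)[:k]]
        cov = np.cov((nbrs - nbrs.mean(axis=0)).T)
        eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues ascending
        normals[i] = eigvecs[:, 0]               # smallest-eigenvalue direction
    return normals

# Points sampled on the plane z = 0 should get normals close to +/- z.
rng = np.random.default_rng(0)
plane = np.column_stack([rng.random(50), rng.random(50), np.zeros(50)])
print(np.abs(estimate_normals(plane)[:, 2]).min())  # close to 1.0
```

Tracking where the normal direction changes abruptly along such a cloud is then the basis for detecting datum points on curved surfaces like tank shells.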

  13. Analysis of point source size on measurement accuracy of lateral point-spread function of confocal Raman microscopy

    Science.gov (United States)

    Fu, Shihang; Zhang, Li; Hu, Yao; Ding, Xiang

    2018-01-01

Confocal Raman Microscopy (CRM) has matured to become one of the most powerful instruments in analytical science because of its molecular sensitivity and high spatial resolution. Compared with conventional Raman microscopy, CRM can perform three-dimensional mapping of tiny samples and achieves high spatial resolution thanks to its unique pinhole. With the wide application of the instrument, there is a growing requirement for evaluating the imaging performance of the system. The point-spread function (PSF) is an important approach to evaluating the imaging capability of an optical instrument. Among the various methods for measuring the PSF, the point source method has been widely used because it is easy to operate and its results approximate the true PSF. In the point source method, the point source size has a significant impact on the final measurement accuracy. In this paper, the influence of point source size on the measurement accuracy of the PSF is analyzed and verified experimentally. A theoretical model of the lateral PSF for CRM is established and the effect of point source size on the full-width at half maximum (FWHM) of the lateral PSF is simulated. For long-term preservation and measurement convenience, a PSF measurement phantom of polydimethylsiloxane resin doped with polystyrene microspheres of different sizes is designed. The PSFs of the CRM with different sizes of microspheres are measured and the results are compared with the simulation results. The results provide a guide for measuring the PSF of the CRM.
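Assuming a Gaussian lateral PSF model, the FWHM estimation that such a simulation relies on can be illustrated by interpolating the half-maximum crossings of a sampled profile (a sketch; the Gaussian model and the sigma value are assumptions, not values from the paper):

```python
import numpy as np

def fwhm(x, profile):
    """Estimate the full-width at half maximum of a sampled PSF profile
    by linear interpolation of the two half-maximum crossings."""
    y = np.asarray(profile, dtype=float)
    half = y.max() / 2.0
    above = np.where(y >= half)[0]
    i0, i1 = above[0], above[-1]
    # interpolate the left and right crossings (y is locally monotone there)
    xl = np.interp(half, [y[i0 - 1], y[i0]], [x[i0 - 1], x[i0]])
    xr = np.interp(half, [y[i1 + 1], y[i1]], [x[i1 + 1], x[i1]])
    return xr - xl

# A Gaussian profile with sigma = 0.2 um has FWHM = 2.3548 * sigma = 0.471 um.
x = np.linspace(-1.0, 1.0, 2001)
print(fwhm(x, np.exp(-x**2 / (2 * 0.2**2))))  # ~0.471
```

Measuring with a finite point source convolves this profile with the source, broadening the apparent FWHM, which is precisely the error the abstract analyses.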

  14. In-depth resistome analysis by targeted metagenomics.

    Science.gov (United States)

    Lanza, Val F; Baquero, Fernando; Martínez, José Luís; Ramos-Ruíz, Ricardo; González-Zorn, Bruno; Andremont, Antoine; Sánchez-Valenzuela, Antonio; Ehrlich, Stanislav Dusko; Kennedy, Sean; Ruppé, Etienne; van Schaik, Willem; Willems, Rob J; de la Cruz, Fernando; Coque, Teresa M

    2018-01-15

Antimicrobial resistance is a major global health challenge. Metagenomics allows analyzing the presence and dynamics of "resistomes" (the ensemble of genes encoding antimicrobial resistance in a given microbiome) in disparate microbial ecosystems. However, the low sensitivity and specificity of available metagenomic methods preclude the detection of minority populations (often present below the detection threshold) and/or the identification of allelic variants that differ in the resulting phenotype. Here, we describe a novel strategy that combines targeted metagenomics using latest-generation in-solution capture platforms with novel bioinformatics tools to establish a standardized framework that allows both quantitative and qualitative analyses of resistomes. We developed ResCap, a targeted sequence capture platform based on SeqCapEZ (NimbleGene) technology, which includes probes for 8667 canonical resistance genes (7963 antibiotic resistance genes and 704 genes conferring resistance to metals or biocides), 2517 relaxase genes (plasmid markers), and 78,600 genes homologous to the previously identified targets (47,806 for antibiotics and 30,794 for biocides or metals). Its performance was compared with metagenomic shotgun sequencing (MSS) for 17 fecal samples (9 human, 8 swine). ResCap significantly improves on MSS in detecting "gene abundance" (from 2.0% to 83.2%) and "gene diversity" (26 versus 14.9 genes unequivocally detected per sample per million reads; the number of reads unequivocally mapped increasing up to 300-fold with ResCap), which were calculated using novel bioinformatic tools. ResCap also facilitated the analysis of novel genes potentially involved in resistance to antibiotics, metals, biocides, or any combination thereof. ResCap, the first targeted sequence capture platform specifically developed to analyze resistomes, greatly enhances the sensitivity and specificity of available metagenomic methods and offers the possibility to analyze genes

  15. Nonlinear bending and collapse analysis of a poked cylinder and other point-loaded cylinders

    International Nuclear Information System (INIS)

    Sobel, L.H.

    1983-06-01

    This paper analyzes the geometrically nonlinear bending and collapse behavior of an elastic, simply supported cylindrical shell subjected to an inward-directed point load applied at midlength. The large displacement analysis results for this thin (R/t = 638) poked cylinder were obtained from the STAGSC-1 finite element computer program. STAGSC-1 results are also presented for two other point-loaded shell problems: a pinched cylinder (R/t = 100), and a venetian blind (R/t = 250)

  16. Numerical analysis of sandwich beam with corrugated core under three-point bending

    Energy Technology Data Exchange (ETDEWEB)

    Wittenbeck, Leszek [Poznan University of Technology, Institute of Mathematics Piotrowo Street No. 5, 60-965 Poznan (Poland); Grygorowicz, Magdalena; Paczos, Piotr [Poznan University of Technology, Institute of Applied Mechanics Jana Pawla IIStreet No. 24, 60-965 Poznan (Poland)

    2015-03-10

The strength problem of a sandwich beam with corrugated core under three-point bending is presented. The beams are made of steel and formed by three mutually orthogonal corrugated layers. The finite element analysis (FEA) of the sandwich beam is performed with the FEM system ABAQUS. The relationship between the applied load and the deflection in three-point bending is considered.

  17. Watershed-based point sources permitting strategy and dynamic permit-trading analysis.

    Science.gov (United States)

    Ning, Shu-Kuang; Chang, Ni-Bin

    2007-09-01

    Permit-trading policy in a total maximum daily load (TMDL) program may provide an additional avenue to produce environmental benefit, which closely approximates what would be achieved through a command and control approach, with relatively lower costs. One of the important considerations that might affect the effective trading mechanism is to determine the dynamic transaction prices and trading ratios in response to seasonal changes of assimilative capacity in the river. Advanced studies associated with multi-temporal spatially varied trading ratios among point sources to manage water pollution hold considerable potential for industries and policy makers alike. This paper aims to present an integrated simulation and optimization analysis for generating spatially varied trading ratios and evaluating seasonal transaction prices accordingly. It is designed to configure a permit-trading structure basin-wide and provide decision makers with a wealth of cost-effective, technology-oriented, risk-informed, and community-based management strategies. The case study, seamlessly integrating a QUAL2E simulation model with an optimal waste load allocation (WLA) scheme in a designated TMDL study area, helps understand the complexity of varying environmental resources values over space and time. The pollutants of concern in this region, which are eligible for trading, mainly include both biochemical oxygen demand (BOD) and ammonia-nitrogen (NH3-N). The problem solution, as a consequence, suggests an array of waste load reduction targets in a well-defined WLA scheme and exhibits a dynamic permit-trading framework among different sub-watersheds in the study area. Research findings gained in this paper may extend to any transferable dynamic-discharge permit (TDDP) program worldwide.

  18. Analysis of the flight dynamics of the Solar Maximum Mission (SMM) off-sun scientific pointing

    Science.gov (United States)

    Pitone, D. S.; Klein, J. R.; Twambly, B. J.

    1990-01-01

Algorithms are presented which were created and implemented by the Goddard Space Flight Center's (GSFC's) Solar Maximum Mission (SMM) attitude operations team to support large-angle spacecraft pointing at scientific objectives. The mission objective of the post-repair SMM satellite was to study solar phenomena. However, because the scientific instruments, such as the Coronagraph/Polarimeter (CP) and the Hard X-ray Burst Spectrometer (HXRBS), were able to view objects other than the Sun, attitude operations support for attitude pointing at large angles from the nominal solar-pointing attitudes was required. Subsequently, attitude support for SMM was provided for scientific objectives such as Comet Halley, Supernova 1987A, Cygnus X-1, and the Crab Nebula. In addition, the analysis was extended to include the reverse problem, computing the right ascension and declination of a body given the off-Sun angles. This analysis led to the computation of the orbits of seven new solar comets seen in the field-of-view (FOV) of the CP. The activities necessary to meet these large-angle attitude-pointing sequences, such as slew sequence planning, viewing-period prediction, and tracking-bias computation, are described. Analysis is presented for the computation of maneuvers and pointing parameters relative to the SMM-unique, Sun-centered reference frame. Finally, science data and independent attitude solutions are used to evaluate the large-angle pointing performance.
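The forward version of the problem, the off-Sun angle of a target given its right ascension and declination and the Sun's, reduces to a great-circle separation (a minimal sketch of the spherical trigonometry involved, not the SMM operations code):

```python
import math

def angular_separation(ra1, dec1, ra2, dec2):
    """Great-circle angle (degrees) between two celestial directions,
    e.g. the Sun and a target, given right ascension and declination
    in degrees (spherical law of cosines)."""
    ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
    cos_sep = (math.sin(dec1) * math.sin(dec2)
               + math.cos(dec1) * math.cos(dec2) * math.cos(ra1 - ra2))
    # clamp against floating-point overshoot before taking the arccosine
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_sep))))

# Two directions on the celestial equator, 90 degrees apart in RA:
print(angular_separation(0.0, 0.0, 90.0, 0.0))  # 90.0
```

The reverse problem mentioned in the abstract, recovering (RA, dec) from measured off-Sun angles, amounts to inverting this relation on the celestial sphere.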

  19. Space-based infrared sensors of space target imaging effect analysis

    Science.gov (United States)

    Dai, Huayu; Zhang, Yasheng; Zhou, Haijun; Zhao, Shuang

    2018-02-01

The target identification problem is one of the core problems of a ballistic missile defense system, and infrared imaging simulation is an important means of target detection and recognition. This paper first establishes a point-source imaging model for a ballistic target above the atmosphere as seen by space-based infrared sensors; it then simulates the infrared imaging of the ballistic target from two aspects, the camera parameters of the space-based sensors and the target characteristics, and analyzes the effects of camera line-of-sight jitter, camera system noise, and different wavebands on the target image.

  20. STRUCTURE LINE DETECTION FROM LIDAR POINT CLOUDS USING TOPOLOGICAL ELEVATION ANALYSIS

    Directory of Open Access Journals (Sweden)

    C. Y. Lo

    2012-07-01

Airborne LIDAR point clouds, which provide considerable numbers of points on object surfaces, are essential to building modeling. In the last two decades, studies have developed different approaches to identify structure lines using two main strategies: data-driven and model-driven. These studies have shown that automatic modeling processes depend on certain considerations, such as the thresholds used, initial values, designed formulas, and predefined cues. With the development of laser scanning systems, scanning rates have increased and can provide point clouds with higher point density. Therefore, this study proposes using topological elevation analysis (TEA) to detect structure lines instead of threshold-dependent concepts and predefined constraints. The analysis contains two parts: data pre-processing and structure line detection. To preserve the original elevation information, a pseudo-grid for generating digital surface models is produced in the first part. The highest point in each grid cell is set as the elevation value, and its original three-dimensional position is preserved. In the second part, using TEA, the structure lines are identified based on the topology of local elevation changes in two directions. Because structure lines have certain geometric properties, their locations show small relief in the radial direction and steep elevation changes in the circular direction. Following the proposed approach, TEA can determine 3D line information without selecting thresholds. For validation, the TEA results are compared with those of the region-growing approach. The results indicate that the proposed method can produce structure lines using dense point clouds.

  1. Error Analysis of Fast Moving Target Geo-location in Wide Area Surveillance Ground Moving Target Indication Mode

    Directory of Open Access Journals (Sweden)

    Zheng Shi-chao

    2013-12-01

    Full Text Available As an important mode in airborne radar systems, the Wide Area Surveillance Ground Moving Target Indication (WAS-GMTI) mode has the ability to monitor a large area in a short time, so that detected moving targets can be located quickly. In real environments, however, many factors introduce considerable errors into the location of moving targets. In this paper, a fast location method based on the characteristics of moving targets in WAS-GMTI mode is utilized, and, in order to improve the location performance, the factors that introduce location errors are analyzed and the moving targets are relocated. Finally, the analysis of those factors is shown to be reasonable by simulations and real-data experiments.

  2. Hazard analysis and critical control point (HACCP) history and conceptual overview.

    Science.gov (United States)

    Hulebak, Karen L; Schlosser, Wayne

    2002-06-01

    Hazard Analysis and Critical Control Point (HACCP) is a system that enables the production of safe meat and poultry products through the thorough analysis of production processes; identification of all hazards that are likely to occur in the production establishment; identification of the critical points in the process at which these hazards may be introduced into product and therefore should be controlled; establishment of critical limits for control at those points; verification of these prescribed steps; and methods by which the processing establishment and the regulatory authority can monitor how well process control through the HACCP plan is working. The history of the development of HACCP is reviewed, and examples of practical applications of HACCP are described.

  3. The study and analysis of point-to-point vibration isolation and its utility to seismic base isolator

    International Nuclear Information System (INIS)

    Mehboob, M.; Qureshi, A.S.

    2001-01-01

    This paper presents a systematic approach to piecewise vibration isolation, generally termed point-to-point vibration isolation, and its broad-spectrum utility for economical seismic base isolation. Transfer curves for Coulomb-damped (i.e. softening-damper) flexible mountings are presented, and the approach proves equally good for both rigidly and elastically coupled damping. It is shown that close solutions are easily obtainable for both the slipping and the sticking phases of the motion, which eliminates the conventional approximations based on linearization of the damping. The new concept will not endanger a superstructure mounted on such isolation systems. (author)

  4. Study on characteristic points of boiling curve by using wavelet analysis and genetic algorithm

    International Nuclear Information System (INIS)

    Wei Huiming; Su Guanghui; Qiu Suizheng; Yang Xingbo

    2009-01-01

    Based on the wavelet theory of signal singularity detection, the critical heat flux (CHF) and the minimum film boiling starting point (q_min) of boiling curves can be detected and analyzed using wavelet multi-resolution analysis. To predict the CHF in engineering applications, empirical correlations were obtained using a genetic algorithm. The results of the wavelet detection and the genetic-algorithm prediction agree very well with experimental data. (authors)
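The singularity-detection idea can be illustrated with a toy first-level Haar decomposition: detail coefficients stay small where a curve is smooth and jump in magnitude where the slope changes abruptly, which is how characteristic points such as the CHF can be flagged. This is a minimal sketch of the principle, not the authors' multi-resolution scheme.

```python
import math

def haar_details(signal):
    """First-level Haar wavelet detail coefficients of an even-length signal."""
    return [(signal[2 * k] - signal[2 * k + 1]) / math.sqrt(2)
            for k in range(len(signal) // 2)]

# Synthetic "boiling curve": rises gently, then drops sharply after index 8
s = [float(i) for i in range(8)] + [7.0 - 5.0 * i for i in range(8)]
d = haar_details(s)
peak = max(range(len(d)), key=lambda k: abs(d[k]))
# the detail magnitude jumps at the pair straddling the slope change,
# i.e. peak == 4, corresponding to samples 8-9
```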

  5. One-point fluctuation analysis of the high-energy neutrino sky

    DEFF Research Database (Denmark)

    Feyereisen, Michael R.; Tamborra, Irene; Ando, Shin'ichiro

    2017-01-01

    We perform the first one-point fluctuation analysis of the high-energy neutrino sky. This method reveals itself to be especially suited to contemporary neutrino data, as it allows one to study the properties of the astrophysical components of the high-energy flux detected by the IceCube telescope, even...

  6. Multiscale change-point analysis of inhomogeneous Poisson processes using unbalanced wavelet decompositions

    NARCIS (Netherlands)

    Jansen, M.H.; Di Bucchianico, A.; Mattheij, R.M.M.; Peletier, M.A.

    2006-01-01

    We present a continuous wavelet analysis of count data with time-varying intensities. The objective is to extract intervals with significant intensities from background intervals, including the precise starting point of the significant interval, its exact duration and the (average) level of

  7. Chopped or long roughage: what do calves prefer? Using cross point analysis of double demand functions

    NARCIS (Netherlands)

    Webb, L.E.; Bak Jensen, M.; Engel, B.; Reenen, van C.G.; Gerrits, W.J.J.; Boer, de I.J.M.; Bokkers, E.A.M.

    2014-01-01

    The present study aimed to quantify calves' (Bos taurus) preference for long versus chopped hay and straw, and hay versus straw, using cross point analysis of double demand functions, in a context where energy intake was not a limiting factor. Nine calves, fed milk replacer and concentrate, were

  8. An Exploratory Study: A Kinesic Analysis of Academic Library Public Service Points

    Science.gov (United States)

    Kazlauskas, Edward

    1976-01-01

    An analysis of body movements of individuals at reference and circulation public service points in four academic libraries indicated that both receptive and nonreceptive nonverbal behaviors were used by all levels of library employees, and these behaviors influenced patron interaction. (Author/LS)

  9. Microchip capillary electrophoresis for point-of-care analysis of lithium

    NARCIS (Netherlands)

    Vrouwe, E.X.; Luttge, R.; Vermes, I.; Berg, van den A.

    2007-01-01

    Background: Microchip capillary electrophoresis (CE) is a promising method for chemical analysis of complex samples such as whole blood. We evaluated the method for point-of-care testing of lithium. Methods: Chemical separation was performed on standard glass microchip CE devices with a conductivity

  10. Qweak Data Analysis for Target Modeling Using Computational Fluid Dynamics

    Science.gov (United States)

    Moore, Michael; Covrig, Silviu

    2015-04-01

    The 2.5 kW liquid hydrogen (LH2) target used in the Qweak parity violation experiment is the highest-power LH2 target in the world and the first at Jefferson Lab to be designed with Computational Fluid Dynamics (CFD). The Qweak experiment determined the weak charge of the proton by measuring the parity-violating elastic scattering asymmetry of longitudinally polarized electrons from unpolarized liquid hydrogen at small momentum transfer (Q2 = 0.025 GeV2). The target met its design goals, and the CFD model was benchmarked with the Qweak target data. This work is an essential ingredient in future designs of very high power, low-noise targets such as MOLLER (5 kW, target noise asymmetry contribution < 25 ppm) and MESA (4.5 kW).

  11. Radiological error: analysis, standard setting, targeted instruction and teamworking

    International Nuclear Information System (INIS)

    FitzGerald, Richard

    2005-01-01

    Diagnostic radiology does not have objective benchmarks for acceptable levels of missed diagnoses [1]. Until now, data collection of radiological discrepancies has been very time consuming. The culture within the specialty did not encourage it. However, public concern about patient safety is increasing. There have been recent innovations in compiling radiological interpretive discrepancy rates which may facilitate radiological standard setting. However standard setting alone will not optimise radiologists' performance or patient safety. We must use these new techniques in radiological discrepancy detection to stimulate greater knowledge sharing, targeted instruction and teamworking among radiologists. Not all radiological discrepancies are errors. Radiological discrepancy programmes must not be abused as an instrument for discrediting individual radiologists. Discrepancy rates must not be distorted as a weapon in turf battles. Radiological errors may be due to many causes and are often multifactorial. A systems approach to radiological error is required. Meaningful analysis of radiological discrepancies and errors is challenging. Valid standard setting will take time. Meanwhile, we need to develop top-up training, mentoring and rehabilitation programmes. (orig.)

  12. Transmutation technology development; thermal hydraulic power analysis and structure analysis of the HYPER target beam window

    Energy Technology Data Exchange (ETDEWEB)

    Choi, J. H.; Ju, E. S.; Song, M. K.; Jeon, Y. Z. [Gyeongsang National University, Jinju (Korea)

    2002-03-01

    In this study, a thermal-hydraulic analysis, a structural analysis, and optimization of some design factors were performed for the design of a spallation target suitable for HYPER, with 1000 MW thermal power. A heat generation formula recently evaluated on the basis of the LAHET code was used, mainly to find the maximum beam current under the given computational conditions. The thermal hydraulics of the HYPER target system were calculated using the FLUENT code, and the structural analysis was conducted by inputting into ANSYS the beam-window temperature and pressure distributions calculated with FLUENT. A data transformation program was written to pass the data from FLUENT, a commercial CFD code, to ANSYS, an FEM code, for the structural analysis. A basic study was conducted on various single targets to obtain fundamental data on the shape for an optimum target design. Thermal-hydraulic and structural analyses were conducted for parabolic, uniform, and scanning beam shapes to choose the optimum beam shape, and the analysis was repeated with several turbulence models to simulate the real flow. To evaluate the reliability of the numerical results, the FLUENT code was benchmarked against calculations performed at SNU and the Korea Advanced Institute of Science and Technology, and compared with the CFX code in the possession of the Korea Atomic Energy Research Institute. The deviations in the FLUENT results were acceptable, but a temperature deviation of about 200 deg. C was observed in the CFX results at the optimum design condition. Several benchmarking calculations were performed on the basis of numerical analyses of the conventional HYPER design. A beam current of 17.3 mA was allowable for the {phi} 350 mm parabolic beam suggested as the optimum for nuclear transmutation, with a calculated von Mises equivalent stress of 140 MPa. 29 refs., 109 figs. (Author)

  13. Nonlinear consider covariance analysis using a sigma-point filter formulation

    Science.gov (United States)

    Lisano, Michael E.

    2006-01-01

    The research reported here extends the mathematical formulation of nonlinear, sigma-point estimators to enable consider covariance analysis for dynamical systems. This paper presents a novel sigma-point consider filter algorithm, for consider-parameterized nonlinear estimation, following the unscented Kalman filter (UKF) variation on the sigma-point filter formulation, which requires no partial derivatives of dynamics models or measurement models with respect to the parameter list. It is shown that, consistent with the attributes of sigma-point estimators, a consider-parameterized sigma-point estimator can be developed entirely without requiring the derivation of any partial-derivative matrices related to the dynamical system, the measurements, or the considered parameters, which appears to be an advantage over the formulation of a linear-theory sequential consider estimator. It is also demonstrated that a consider covariance analysis performed with this 'partial-derivative-free' formulation yields equivalent results to the linear-theory consider filter, for purely linear problems.
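The derivative-free property the abstract emphasizes can be illustrated in the scalar case with a standard unscented transform: a Gaussian is propagated through a nonlinearity using only function evaluations at sigma points. The parameters (alpha, beta, kappa) follow common UKF conventions; this is a generic sketch, not the paper's consider-filter algorithm.

```python
import math

def unscented_transform_1d(mean, var, f, alpha=1.0, beta=2.0, kappa=2.0):
    """Propagate a 1D Gaussian (mean, var) through a nonlinearity f
    using sigma points; no derivatives of f are required."""
    n = 1
    lam = alpha ** 2 * (n + kappa) - n
    spread = math.sqrt((n + lam) * var)
    sigmas = [mean, mean + spread, mean - spread]
    wm0 = lam / (n + lam)
    wc0 = wm0 + (1 - alpha ** 2 + beta)
    wi = 1.0 / (2 * (n + lam))
    ys = [f(x) for x in sigmas]
    y_mean = wm0 * ys[0] + wi * (ys[1] + ys[2])
    y_var = (wc0 * (ys[0] - y_mean) ** 2
             + wi * ((ys[1] - y_mean) ** 2 + (ys[2] - y_mean) ** 2))
    return y_mean, y_var

# For a linear map the transform is exact: y = 3x + 1 with x ~ N(2, 4)
m, v = unscented_transform_1d(2.0, 4.0, lambda x: 3.0 * x + 1.0)
# m = 7.0, v = 36.0
```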

  14. Identification of 'Point A' as the prevalent source of error in cephalometric analysis of lateral radiographs.

    Science.gov (United States)

    Grogger, P; Sacher, C; Weber, S; Millesi, G; Seemann, R

    2018-04-10

    Deviations in measuring dentofacial components in a lateral X-ray represent a major hurdle in the subsequent treatment of dysgnathic patients. In a retrospective study, we investigated the most prevalent source of error in the following commonly used cephalometric measurements: the angles Sella-Nasion-Point A (SNA), Sella-Nasion-Point B (SNB) and Point A-Nasion-Point B (ANB); the Wits appraisal; the anteroposterior dysplasia indicator (APDI); and the overbite depth indicator (ODI). Preoperative lateral radiographic images of patients with dentofacial deformities were collected and the landmarks digitally traced by three independent raters. Cephalometric analysis was automatically performed based on 1116 tracings. Error analysis identified the x-coordinate of Point A as the prevalent source of error in all investigated measurements except SNB, in which it is not incorporated. In SNB, the y-coordinate of Nasion predominated in the error variance. SNB showed the lowest inter-rater variation. In addition, our observations confirmed previous studies showing that landmark identification variance follows characteristic error envelopes, here in the largest number of tracings analysed to date. Variance orthogonal to the defining planes was of relevance, while variance parallel to the planes was not. Taking these findings into account, orthognathic surgeons as well as orthodontists can perform cephalometry more accurately and accomplish better therapeutic results. Copyright © 2018 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  15. [Analysis and research on cleaning points of HVAC systems in public places].

    Science.gov (United States)

    Yang, Jiaolan; Han, Xu; Chen, Dongqing; Jin, Xin; Dai, Zizhu

    2010-03-01

    To analyse the cleaning points of HVAC systems in public places, and to provide a scientific basis for regulating the cleaning of HVAC systems. Based on survey results on the cleaning of HVAC systems around China over the past three years, we analyse the cleaning points of HVAC systems from various aspects: the major health risk factors of HVAC systems; the strategy for formulating HVAC cleaning; the cleaning methods and acceptance points for the air ducts and components of HVAC systems; on-site and individual protection; waste treatment and the cleaning of removed equipment; inspection of the cleaning results; video records; and the final acceptance of the cleaning. An analysis of the major health risk factors of HVAC systems and of the strategy for formulating HVAC cleaning is given. Specific methods are proposed for cleaning the air ducts, machine units, air ports, coil pipes and water cooling towers of HVAC systems, together with the acceptance points for HVAC systems and the requirements of the final acceptance report. By analysing the cleaning points of HVAC systems and proposing corresponding measures, this study provides a basis for the scientific and regular conduct of HVAC cleaning, a novel technology service, and lays a foundation for revising the existing cleaning regulations, which may generate technical and social benefits.

  16. Dosimetric analysis at ICRU reference points in HDR-brachytherapy of cervical carcinoma.

    Science.gov (United States)

    Eich, H T; Haverkamp, U; Micke, O; Prott, F J; Müller, R P

    2000-01-01

    In vivo dosimetry in the bladder and rectum, as well as determination of doses at the reference points suggested in ICRU Report 38, contribute to quality assurance in HDR brachytherapy of cervical carcinoma, especially to minimize side effects. In order to gain information on the radiation exposure at the ICRU reference points in the rectum, bladder, ureter and regional lymph nodes, these doses were calculated (by digitization) from orthogonal radiographs of 11 applications in patients with cervical carcinoma who received primary radiotherapy. In addition, the dose at the ICRU rectum reference point was compared with the results of in vivo measurements in the rectum. The in vivo measurements were lower than the doses determined for the ICRU rectum reference point by a factor of 1.5 (4.05 +/- 0.68 Gy versus 6.11 +/- 1.63 Gy). Reasons for this were calibration errors, non-orthogonal radiographs, movement of applicator and probe in the time span between X-ray and application, and missing contact between the probe and the anterior rectal wall. The standard deviation of the calculations at the ICRU reference points was on average +/- 30%. Possible reasons for this relatively large standard deviation were difficulties in defining the points, identifying them on radiographs, and the different locations of the applicators. Although 3D CT-, US- or MR-based treatment planning using dose-volume histogram analysis is increasingly established, this simple procedure of marking and digitizing the ICRU reference points lengthened treatment planning by only 5 to 10 minutes. The advantages of in vivo dosimetry are its easy practicability and the possibility of determining rectum doses during irradiation. The advantages of computer-aided planning at the ICRU reference points are that the calculations are available before irradiation and can still be taken into account for treatment planning. Both methods should be applied in HDR brachytherapy of cervical carcinoma.

  17. IMAGE-PLANE ANALYSIS OF n-POINT-MASS LENS CRITICAL CURVES AND CAUSTICS

    Energy Technology Data Exchange (ETDEWEB)

    Danek, Kamil; Heyrovský, David, E-mail: kamil.danek@utf.mff.cuni.cz, E-mail: heyrovsky@utf.mff.cuni.cz [Institute of Theoretical Physics, Faculty of Mathematics and Physics, Charles University in Prague (Czech Republic)

    2015-06-10

    The interpretation of gravitational microlensing events caused by planetary systems or multiple stars is based on the n-point-mass lens model. The first planets detected by microlensing were well described by the two-point-mass model of a star with one planet. By the end of 2014, four events involving three-point-mass lenses had been announced. Two of the lenses were stars with two planetary companions each; two were binary stars with a planet orbiting one component. While the two-point-mass model is well understood, the same cannot be said for lenses with three or more components. Even the range of possible critical-curve topologies and caustic geometries of the three-point-mass lens remains unknown. In this paper we provide new tools for mapping the critical-curve topology and caustic cusp number in the parameter space of n-point-mass lenses. We perform our analysis in the image plane of the lens. We show that all contours of the Jacobian are critical curves of re-scaled versions of the lens configuration. Utilizing this property further, we introduce the cusp curve to identify cusp-image positions on all contours simultaneously. In order to track cusp-number changes in caustic metamorphoses, we define the morph curve, which pinpoints the positions of metamorphosis-point images along the cusp curve. We demonstrate the usage of both curves on simple two- and three-point-mass lens examples. For the three simplest caustic metamorphoses we illustrate the local structure of the image and source planes.
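The image-plane starting point of this analysis can be explored numerically: in complex notation the Jacobian determinant of the n-point-mass lens map is det J = 1 - |sum_i m_i / (conj(z) - conj(z_i))^2|^2, and its zero contour is the critical curve. The sketch below flags grid cells where det J changes sign for an equal-mass binary lens; the grid size and lens parameters are illustrative choices, not from the paper.

```python
def jacobian_det(z, masses, positions):
    """det J of the n-point-mass lens map at image-plane point z;
    det J = 1 - |sum_i m_i / (conj(z) - conj(z_i))**2| ** 2."""
    s = sum(m / (z.conjugate() - p.conjugate()) ** 2
            for m, p in zip(masses, positions))
    return 1.0 - abs(s) ** 2

# Equal-mass binary lens, separation 1 (units of the total Einstein radius)
masses = [0.5, 0.5]
positions = [complex(-0.5, 0.0), complex(0.5, 0.0)]

# Cells where det J changes sign straddle the critical curve
N, L = 120, 2.0
step = 2 * L / N
critical_cells = []
for i in range(N - 1):
    for j in range(N - 1):
        # half-step offset keeps grid points away from the point-mass poles
        z = complex(-L + (i + 0.5) * step, -L + (j + 0.5) * step)
        d00 = jacobian_det(z, masses, positions)
        if (d00 * jacobian_det(z + step, masses, positions) < 0
                or d00 * jacobian_det(z + 1j * step, masses, positions) < 0):
            critical_cells.append(z)
```

Contours of det J at nonzero levels are, as the abstract notes, critical curves of re-scaled lens configurations, so the same grid evaluation supports the paper's contour-based tools.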

  18. CRISPRTarget: bioinformatic prediction and analysis of crRNA targets

    NARCIS (Netherlands)

    Biswas, A.; Gagnon, J.N.; Brouns, S.J.J.; Fineran, P.C.; Brown, C.M.

    2013-01-01

    The bacterial and archaeal CRISPR/Cas adaptive immune system targets specific protospacer nucleotide sequences in invading organisms. This requires base pairing between processed CRISPR RNA and the target protospacer. For type I and II CRISPR/Cas systems, protospacer adjacent motifs (PAM) are

  19. Meta-Analysis of Effect Sizes Reported at Multiple Time Points Using General Linear Mixed Model

    Science.gov (United States)

    Musekiwa, Alfred; Manda, Samuel O. M.; Mwambi, Henry G.; Chen, Ding-Geng

    2016-01-01

    Meta-analysis of longitudinal studies combines effect sizes measured at pre-determined time points. The most common approach involves performing separate univariate meta-analyses at individual time points. This simplistic approach ignores dependence between longitudinal effect sizes, which might result in less precise parameter estimates. In this paper, we show how to conduct a meta-analysis of longitudinal effect sizes in which we contrast different covariance structures for the dependence between effect sizes, both within and between studies. We propose new combinations of covariance structures for the dependence between effect sizes and utilize a practical example involving meta-analysis of 17 trials comparing postoperative treatments for a type of cancer, where survival is measured at 6, 12, 18 and 24 months post randomization. Although the results from this particular data set show the benefit of accounting for within-study serial correlation between effect sizes, simulations are required to confirm these results. PMID:27798661
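The "simplistic approach" the authors contrast against, separate univariate meta-analyses at each time point, reduces to inverse-variance pooling of that time point's effect sizes, ignoring any correlation across time points. The numbers below are hypothetical, purely for illustration.

```python
def fixed_effect_pool(effects, variances):
    """Inverse-variance fixed-effect pooling of one time point's effect sizes."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_var = 1.0 / sum(weights)
    return pooled, pooled_var

# Three hypothetical trials reporting a log hazard ratio at month 6
effects = [-0.30, -0.10, -0.25]
variances = [0.04, 0.02, 0.08]
est, var = fixed_effect_pool(effects, variances)
```

Running this separately at 6, 12, 18 and 24 months is exactly the univariate strategy; the paper's contribution is to replace it with a joint mixed model over all time points.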

  20. Comparative analysis among several methods used to solve the point kinetic equations

    International Nuclear Information System (INIS)

    Nunes, Anderson L.; Goncalves, Alessandro da C.; Martinez, Aquilino S.; Silva, Fernando Carvalho da

    2007-01-01

    The main objective of this work is the development of a methodology for comparing several methods of solving the point kinetics equations. The evaluated methods are: the finite differences method, the stiffness confinement method, the improved stiffness confinement method, and the piecewise constant approximations method. These methods were implemented and compared through a systematic analysis that consists basically of checking which method consumes the least computational time with the highest precision. A relative performance factor was calculated, whose function is to combine both criteria in order to reach this goal. Through analysis of the performance factor it is possible to choose the best method for the solution of the point kinetics equations. (author)
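As a minimal illustration of the first of the compared methods, an explicit finite-difference integration of the one-delayed-group point kinetics equations might look like the sketch below. The kinetic parameters are typical illustrative values, not taken from the paper; the tiny time step is what copes with the stiffness that the other compared methods are designed to confine.

```python
def point_kinetics_fd(rho, beta=0.0065, lam=0.08, Lambda=1e-4,
                      n0=1.0, dt=1e-6, steps=100000):
    """Explicit finite-difference (Euler) integration of the
    one-delayed-group point kinetics equations:
        dn/dt = ((rho - beta)/Lambda) n + lam C
        dC/dt = (beta/Lambda) n - lam C
    """
    n = n0
    c = beta * n0 / (Lambda * lam)  # precursor equilibrium at t = 0
    for _ in range(steps):
        dn = ((rho - beta) / Lambda) * n + lam * c
        dc = (beta / Lambda) * n - lam * c
        n += dt * dn
        c += dt * dc
    return n

# Zero reactivity leaves the equilibrium unchanged; a small positive
# step reactivity produces the prompt jump and a slow rise
n_flat = point_kinetics_fd(rho=0.0)
n_up = point_kinetics_fd(rho=0.001)
```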

  1. Comparative analysis among several methods used to solve the point kinetic equations

    Energy Technology Data Exchange (ETDEWEB)

    Nunes, Anderson L.; Goncalves, Alessandro da C.; Martinez, Aquilino S.; Silva, Fernando Carvalho da [Universidade Federal, Rio de Janeiro, RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia. Programa de Engenharia Nuclear; E-mails: alupo@if.ufrj.br; agoncalves@con.ufrj.br; aquilino@lmp.ufrj.br; fernando@con.ufrj.br

    2007-07-01

    The main objective of this work is the development of a methodology for comparing several methods of solving the point kinetics equations. The evaluated methods are: the finite differences method, the stiffness confinement method, the improved stiffness confinement method, and the piecewise constant approximations method. These methods were implemented and compared through a systematic analysis that consists basically of checking which method consumes the least computational time with the highest precision. A relative performance factor was calculated, whose function is to combine both criteria in order to reach this goal. Through analysis of the performance factor it is possible to choose the best method for the solution of the point kinetics equations. (author)

  2. Hazard analysis and critical control point (HACCP) for an ultrasound food processing operation.

    Science.gov (United States)

    Chemat, Farid; Hoarau, Nicolas

    2004-05-01

    Emerging technologies, such as ultrasound (US), used for food and drink production can introduce hazards for product safety, and classical quality control methods are inadequate to control them. Hazard analysis and critical control points (HACCP) is the most secure and cost-effective method for controlling possible product contamination or cross-contamination due to physical or chemical hazards during production. The following case study on the application of HACCP to an US food-processing operation demonstrates how the hazards at the critical control points of the process are effectively controlled through the implementation of HACCP.

  3. Antimalarial drug targets in Plasmodium falciparum predicted by stage-specific metabolic network analysis

    Directory of Open Access Journals (Sweden)

    Huthmacher Carola

    2010-08-01

    Full Text Available Abstract Background Despite enormous efforts to combat malaria, the disease still afflicts up to half a billion people each year, of whom more than one million die. Currently no approved vaccine is available and resistance to antimalarials is widespread. Hence, new antimalarial drugs are urgently needed. Results Here, we present a computational analysis of the metabolism of Plasmodium falciparum, the deadliest malaria pathogen. We assembled a compartmentalized metabolic model and predicted life-cycle-stage-specific metabolism with the help of a flux balance approach that integrates gene expression data. Predicted metabolite exchanges between parasite and host were found to be in good accordance with experimental findings when the parasite's metabolic network was embedded into that of its host (the erythrocyte). Knock-out simulations identified 307 indispensable metabolic reactions within the parasite. 35 of 57 experimentally demonstrated essential enzymes were recovered, and another 16 were recovered if it was additionally assumed that nutrient uptake from the host cell is limited and that all reactions catalyzed by the inhibited enzyme are blocked. This predicted set of putative drug targets, shown to be enriched with true targets by a factor of at least 2.75, was further analyzed with respect to homology to human enzymes, functional similarity to therapeutic targets in other organisms, and predicted potency for prophylaxis and disease treatment. Conclusions The results suggest that the set of essential enzymes predicted by our flux balance approach represents a promising starting point for further drug development.

  4. Trend analysis and change point detection of annual and seasonal temperature series in Peninsular Malaysia

    Science.gov (United States)

    Suhaila, Jamaludin; Yusop, Zulkifli

    2017-06-01

    Most trend analyses that have been conducted have not considered the existence of a change point in the time series. When one exists, the trend analysis will not be able to detect an obvious increasing or decreasing trend over certain parts of the series. Furthermore, the lack of discussion of the possible factors that influenced a decreasing or increasing trend needs to be addressed in any trend analysis. Hence, this study investigates trends and change-point detection in mean, maximum and minimum temperature series, both annually and seasonally, in Peninsular Malaysia, and determines the possible factors that could contribute to the significant trends. The Pettitt and sequential Mann-Kendall (SQ-MK) tests were used to examine the occurrence of any abrupt climate changes in the independent series. The analyses suggested that most of the change points in Peninsular Malaysia were detected during the years 1996, 1997 and 1998. These change points, captured by the Pettitt and SQ-MK tests, are possibly related to climatic factors such as El Niño and La Niña events. The findings also showed that the majority of the significant change points in the series are related to significant trends at the stations. Significant increasing trends in annual and seasonal mean, maximum and minimum temperatures were found in Peninsular Malaysia, with a range of 2-5 °C/100 years during the last 32 years. The magnitudes of the increasing trends in minimum temperatures were larger than those in maximum temperatures for most of the studied stations, particularly the urban stations. These increases are suspected to be linked with the urban heat island effect in addition to El Niño events.
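The Pettitt test used above can be sketched in a few lines: the statistic U_t counts sign differences across each candidate split of the series, the most probable change point maximizes |U_t|, and the p-value uses the usual large-sample approximation. The temperature series below is invented for illustration.

```python
import math

def pettitt(series):
    """Pettitt change-point test: returns the most probable change point
    (length of the first segment) and an approximate two-sided p-value."""
    n = len(series)

    def sign(v):
        return (v > 0) - (v < 0)

    # U_t = sum of sign(x_j - x_i) over i < t <= j, for each split t
    U = [sum(sign(series[j] - series[i])
             for i in range(t) for j in range(t, n))
         for t in range(1, n)]
    K = max(abs(u) for u in U)
    t_star = max(range(len(U)), key=lambda k: abs(U[k])) + 1
    p = min(1.0, 2.0 * math.exp(-6.0 * K * K / (n ** 3 + n ** 2)))
    return t_star, p

# Toy temperature series whose mean shifts upward after the fifth value
series = [25.1, 25.3, 25.0, 25.2, 25.1, 26.0, 26.2, 25.9, 26.1, 26.3]
t_star, p = pettitt(series)
# t_star == 5: the shift is detected after the first five observations
```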

  5. Assessment of hygiene standards and Hazard Analysis Critical Control Points implementation on passenger ships.

    Science.gov (United States)

    Mouchtouri, Varavara; Malissiova, Eleni; Zisis, Panagiotis; Paparizou, Evina; Hadjichristodoulou, Christos

    2013-01-01

    The level of hygiene on ferries can have an impact on travellers' health. The aim of this study was to assess the hygiene standards of ferries in Greece and to investigate whether Hazard Analysis Critical Control Points (HACCP) implementation contributes to the hygiene status, and particularly food safety, aboard passenger ships. Hygiene inspections on 17 ferries in Greece were performed using a standardized inspection form with a 135-point scale. Thirty-four water and 17 food samples were collected and analysed. About 65% (11/17) of the ferries scored >100 points. Ferries with HACCP received higher scores during inspection than those without HACCP. Of the food samples, only one was found positive for Salmonella spp. Implementation of management systems including HACCP principles can help to raise the level of hygiene aboard passenger ships.

  6. Analysis of payload bay magnetic fields due to dc power multipoint and single point ground configurations

    Science.gov (United States)

    Lawton, R. M.

    1976-01-01

    An analysis of magnetic fields in the Orbiter payload bay resulting from the present grounding configuration (structure return) is presented, and the amount of improvement that would result from installing wire returns for the three dc power buses is determined. The ac and dc magnetic fields at five points in a cross-section of the bay are calculated for both grounding configurations, and the Y and Z components of the field at each point are derived in terms of a constant coefficient and the current amplitude of each bus. The assumed dc load is 100 Amperes for each bus; the ac noise current used is a spectrum 6 dB higher than the Orbiter equipment limit for narrowband conducted emissions. It was concluded that installing return wiring to provide a single-point ground for the dc buses in the payload bay would reduce the ac and dc magnetic field intensity by approximately 30 dB.
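The order of magnitude of such a reduction can be made plausible with a long-straight-wire (Biot-Savart far-field) estimate: with a structure return, only the supply conductor contributes near the observation point, while a closely spaced return conductor carries the opposing current and cancels most of the field. The geometry below (0.5 m to the bus, 2 cm wire spacing, collinear arrangement) is assumed purely for illustration, not taken from the report.

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, T*m/A

def b_wire(current, distance):
    """Field magnitude of a long straight wire: B = mu0 * I / (2 * pi * r)."""
    return MU0 * current / (2 * math.pi * distance)

I = 100.0  # one dc bus, A (the load assumed in the abstract)
r = 0.5    # observation point 0.5 m from the supply wire (assumed)
d = 0.02   # return wire 2 cm beyond the supply wire (assumed geometry)

b_single = b_wire(I, r)                        # structure return: no nearby cancellation
b_pair = abs(b_wire(I, r) - b_wire(I, r + d))  # supply and return fields oppose
reduction_db = 20 * math.log10(b_single / b_pair)
# with these assumed distances the cancellation is roughly 28 dB
```

The result is sensitive to the assumed distances, but a few tens of dB of cancellation from a dedicated return conductor is consistent with the abstract's ~30 dB conclusion.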

  7. Targeting gender: A content analysis of alcohol advertising in magazines.

    Science.gov (United States)

    Jung, A-Reum; Hovland, Roxanne

    2016-01-01

    Creating target specific advertising is fundamental to maximizing advertising effectiveness. When crafting an advertisement, message and creative strategies are considered important because they affect target audiences' attitudes toward advertised products. This study endeavored to find advertising strategies that are likely to have special appeal for men or women by examining alcohol advertising in magazines. The results show that the substance of the messages is the same for men and women, but they only differ in terms of presentation. However, regardless of gender group, the most commonly used strategies in alcohol advertising are appeals to the target audience's emotions.

  8. Field evaluation of a rapid point-of-care assay for targeting antibiotic treatment for trachoma control: a comparative study.

    Science.gov (United States)

    Michel, Claude-Edouard C; Solomon, Anthony W; Magbanua, Jose P V; Massae, Patrick A; Huang, Ling; Mosha, Jonaice; West, Sheila K; Nadala, Elpidio C B; Bailey, Robin; Wisniewski, Craig; Mabey, David C W; Lee, Helen H

    2006-05-13

    Trachoma results from repeated episodes of conjunctival infection with Chlamydia trachomatis and is the leading infectious cause of blindness. To eliminate trachoma, control programmes use the SAFE strategy (Surgery, Antibiotics, Face cleanliness, and Environmental improvement). The A component is designed to treat C trachomatis infection, and is initiated on the basis of the prevalence of the clinical sign trachomatous inflammation-follicular (TF). Unfortunately, TF correlates poorly with C trachomatis infection. We sought to assess a newly developed point-of-care (POC) assay, compared with the presence of TF, for guiding the use of antibiotics in trachoma control. We compared the performance of the POC assay and the presence of TF, using commercial PCR as a comparator, in 664 children aged 1-9 years in remote, trachoma-endemic villages in Tanzania. Signs of trachoma were graded according to the WHO simplified trachoma grading system. Of the 664 participants, 128 (19%) were positive for ocular C trachomatis infection by PCR. Presence of TF had a sensitivity of 64.1% (95% CI 55.8-72.4), specificity of 80.2% (76.8-83.6), and positive predictive value of 43.6% (36.5-50.7). By contrast, the POC assay had a sensitivity of 83.6% (77.2-90.0), specificity of 99.4% (98.8-100.0), and positive predictive value of 97.3% (94.2-100.3). Inter-operator and intra-operator agreements among four novice operators were 0.988 (0.973-1.000) and 0.950 (0.894-1.000), respectively. The POC assay is substantially more accurate than TF prevalence in identifying the presence or absence of infection. Additional studies should assess the use of the assay in the planning and monitoring of trachoma control activities.

  9. Pasteurised milk and implementation of HACCP (Hazard Analysis Critical Control Point)

    Directory of Open Access Journals (Sweden)

    T.B Murdiati

    2004-10-01

    The purpose of pasteurisation is to destroy pathogenic bacteria without affecting the taste, flavour, and nutritional value of milk. A study on the implementation of HACCP (Hazard Analysis Critical Control Point) in producing pasteurised milk was carried out in four processing units: one in Jakarta, two in Bandung and one in Bogor. The critical control points in the production line were identified. Milk samples were collected from the critical points and analysed for total microbial count. Raw milk was also tested for antibiotic residues. The study indicated that one unit in Bandung and one unit in Jakarta produced pasteurised milk with lower microbial counts than the other units, owing to better management and control applied along the production chain. Penicillin residues were detected in the raw milk used by the unit in Bogor. Six critical points were identified, together with the hazards that might arise at those points and how to prevent them. A quality assurance system such as HACCP would make it possible to produce safe, high-quality pasteurised milk, and should be implemented gradually.

  10. System implementation of hazard analysis and critical control points (HACCP) in a nitrogen production plant

    International Nuclear Information System (INIS)

    Barrantes Salazar, Alexandra

    2014-01-01

    A hazard analysis and critical control points (HACCP) system was deployed in a liquid nitrogen production plant. The main motivation is that nitrogen has become a complement to food packaging, used to extend shelf life or to provide a protective atmosphere. The methodology adopted was an analysis of critical control points for the nitrogen production plant, which required knowledge of both the standard and the production process, as well as on-site verification. In addition, all materials and processing units in contact with the raw material or the product under study were evaluated, so that the intrinsic physical, chemical and biological risks of each were identified according to their origin or source of contamination. For each risk found, the probability of occurrence was evaluated according to its frequency and severity; with these variables the type of risk was classified. Cases presenting a major or critical risk were subjected to a decision tree, which led to the conclusion that no critical control points need be designated; nevertheless, maximum permitted limits were established for each of them. Each result is supported by literature or scientific references of reliable provenance. In general, the material matrix and the process matrix were found to contain no critical control points, so the project concludes with the analysis, without generating a monitoring and verification system. It is suggested that this project be extended to cover the gaseous nitrogen packaging system, since it was limited to liquid nitrogen; moreover, liquid nitrogen production is a fully automated, closed process in which the introduction of contaminants is very limited, unlike the gaseous nitrogen process. (author)

  11. Identification of the Key Fields and Their Key Technical Points of Oncology by Patent Analysis.

    Science.gov (United States)

    Zhang, Ting; Chen, Juan; Jia, Xiaofeng

    2015-01-01

    This paper aims to identify the key fields and their key technical points of oncology by patent analysis. Patents of oncology applied for from 2006 to 2012 were searched in the Thomson Innovation database. The key fields and their key technical points were determined by analyzing the Derwent Classification (DC) and the International Patent Classification (IPC), respectively. Patent applications in the top ten DCs accounted for 80% of all patent applications in oncology; these were the ten fields of oncology analyzed. The number of patent applications in these ten fields was standardized based on the patent applications of oncology from 2006 to 2012. For each field, standardization was conducted separately for each of the seven years (2006-2012), and the mean of the seven standardized values was calculated to reflect the relative amount of patent applications in that field; meanwhile, regression analysis using time (year) and the standardized values of patent applications over the seven years (2006-2012) was conducted to evaluate the trend of patent applications in each field. Two-dimensional quadrant analysis, together with professional knowledge of oncology, was used to determine the key fields of oncology: fields located in the quadrants with a high relative amount or an increasing trend of patent applications were identified as key ones. Using the same method, the key technical points in each key field were identified. Altogether 116,820 patents of oncology applied for from 2006 to 2012 were retrieved, and four key fields with twenty-nine key technical points were identified: "natural products and polymers" with nine key technical points, "fermentation industry" with twelve, "electrical medical equipment" with four, and "diagnosis, surgery" with four. The results of this study could provide guidance on the development direction of oncology, and also help researchers broaden innovative ideas and discover new
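
    The screening procedure described above (per-year standardization across fields, the mean standardized value as relative volume, a regression slope as trend, and a quadrant rule) can be sketched as follows; the field counts below are invented placeholders, not the study's data:

```python
import numpy as np

# Hedged sketch of the screening idea: for each field, standardize yearly
# patent counts across fields, average them (relative volume), and regress
# the standardized values on the year (trend). Fields in a quadrant with
# high volume or a rising trend are flagged as "key". Counts are invented.
years = np.arange(2006, 2013)
counts = {
    "natural products and polymers": [300, 340, 390, 430, 480, 520, 570],
    "fermentation industry":         [200, 230, 270, 320, 380, 450, 530],
    "electrical medical equipment":  [150, 150, 155, 150, 160, 155, 160],
}
matrix = np.array(list(counts.values()), dtype=float)
# standardize each year (column) across fields
z = (matrix - matrix.mean(axis=0)) / matrix.std(axis=0)

for name, row in zip(counts, z):
    volume = row.mean()                   # relative amount of applications
    slope = np.polyfit(years, row, 1)[0]  # trend of standardized counts
    key = volume > 0 or slope > 0
    print(f"{name}: volume={volume:+.2f}, slope={slope:+.3f}, key={key}")
```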

  12. Monetary targeting and financial system characteristics : An empirical analysis

    NARCIS (Netherlands)

    Samarina, A.

    2012-01-01

    This paper investigates how reforms and characteristics of the financial system affect the likelihood of countries to abandon their strategy of monetary targeting. Apart from financial system characteristics, we include macroeconomic, fiscal, and institutional factors potentially associated with

  13. Four points function fitted and first derivative procedure for determining the end points in potentiometric titration curves: statistical analysis and method comparison.

    Science.gov (United States)

    Kholeif, S A

    2001-06-01

    A new method that belongs to the differential category for determining the end points from potentiometric titration curves is presented. It uses a preprocess to find first derivative values by fitting four data points in and around the region of inflection to a non-linear function, and then locate the end point, usually as a maximum or minimum, using an inverse parabolic interpolation procedure that has an analytical solution. The behavior and accuracy of the sigmoid and cumulative non-linear functions used are investigated against three factors. A statistical evaluation of the new method using linear least-squares method validation and multifactor data analysis are covered. The new method is generally applied to symmetrical and unsymmetrical potentiometric titration curves, and the end point is calculated using numerical procedures only. It outperforms the "parent" regular differential method in almost all factors levels and gives accurate results comparable to the true or estimated true end points. Calculated end points from selected experimental titration curves compatible with the equivalence point category of methods, such as Gran or Fortuin, are also compared with the new method.
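
    The closed-form step of the method, locating the extremum of the first-derivative curve by inverse parabolic interpolation through three bracketing points, can be sketched as follows (the titration data are invented):

```python
# Hedged sketch: locate a titration end point as the extremum of the first
# derivative dE/dV, using the closed-form vertex of the parabola through
# three bracketing points. The data below are invented for illustration.
def parabola_vertex(x1, y1, x2, y2, x3, y3):
    """Abscissa of the extremum of the parabola through three points."""
    num = (x2 - x1) ** 2 * (y2 - y3) - (x2 - x3) ** 2 * (y2 - y1)
    den = (x2 - x1) * (y2 - y3) - (x2 - x3) * (y2 - y1)
    return x2 - 0.5 * num / den

# titrant volumes (mL) and first-derivative values dE/dV near the inflection
v = [9.80, 9.90, 10.00, 10.10, 10.20]
dEdV = [40.0, 95.0, 180.0, 90.0, 35.0]
i = dEdV.index(max(dEdV))  # bracket the maximum of the derivative
v_end = parabola_vertex(v[i - 1], dEdV[i - 1], v[i], dEdV[i],
                        v[i + 1], dEdV[i + 1])
print(round(v_end, 4))     # end point slightly below 10.00 mL
```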

  14. Visualizing nD Point Clouds as Topological Landscape Profiles to Guide Local Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Oesterling, Patrick [Univ. of Leipzig (Germany). Computer Science Dept.; Heine, Christian [Univ. of Leipzig (Germany). Computer Science Dept.; Federal Inst. of Technology (ETH), Zurich (Switzerland). Dept. of Computer Science; Weber, Gunther H. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Computational Research Division; Scheuermann, Gerik [Univ. of Leipzig (Germany). Computer Science Dept.

    2012-05-04

    Analyzing high-dimensional point clouds is a classical challenge in visual analytics. Traditional techniques, such as projections or axis-based techniques, suffer from projection artifacts, occlusion, and visual complexity. We propose to split data analysis into two parts to address these shortcomings. First, a structural overview phase abstracts data by its density distribution. This phase performs topological analysis to support accurate and non-overlapping presentation of the high-dimensional cluster structure as a topological landscape profile. Utilizing a landscape metaphor, it presents clusters and their nesting as hills whose height, width, and shape reflect cluster coherence, size, and stability, respectively. A second local analysis phase utilizes this global structural knowledge to select individual clusters or point sets for further, localized data analysis. Focusing on structural entities significantly reduces visual clutter in established geometric visualizations and permits a clearer, more thorough data analysis. In conclusion, this analysis complements the global topological perspective and enables the user to study subspaces or geometric properties, such as shape.

  15. Fuzzy Risk Analysis for a Production System Based on the Nagel Point of a Triangle

    Directory of Open Access Journals (Sweden)

    Handan Akyar

    2016-01-01

    Ordering and ranking fuzzy numbers and their comparisons play a significant role in decision-making problems such as social and economic systems, forecasting, optimization, and risk analysis problems. In this paper, a new method for ordering triangular fuzzy numbers using the Nagel point of a triangle is presented. With the aid of the proposed method, reasonable properties of ordering fuzzy numbers are verified. Certain comparative examples are given to illustrate the advantages of the new method. Many papers have been devoted to studies on fuzzy ranking methods, but some of these studies have certain shortcomings. The proposed method overcomes the drawbacks of the existing methods in the literature. The suggested method can order triangular fuzzy numbers as well as crisp numbers and fuzzy numbers with the same centroid point. An application to the fuzzy risk analysis problem is given, based on the suggested ordering approach.
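
    A triangular fuzzy number (l, m, u) can be identified with the triangle having vertices (l, 0), (m, 1), (u, 0); the Nagel point of a triangle has barycentric coordinates (s − a : s − b : s − c), where a, b, c are the side lengths and s the semiperimeter. A minimal sketch computing that point (the paper's full ranking score may differ):

```python
import math

# Hedged sketch: Nagel point of the triangle associated with a triangular
# fuzzy number (l, m, u), i.e. the triangle with vertices (l,0), (m,1), (u,0).
# Barycentrics of the Nagel point are (s-a : s-b : s-c), where a, b, c are
# the side lengths opposite the vertices and s is the semiperimeter.
def nagel_point(l, m, u):
    A, B, C = (l, 0.0), (m, 1.0), (u, 0.0)
    dist = lambda P, Q: math.hypot(P[0] - Q[0], P[1] - Q[1])
    a, b, c = dist(B, C), dist(C, A), dist(A, B)  # sides opposite A, B, C
    s = (a + b + c) / 2.0
    w = (s - a, s - b, s - c)                     # weights sum to s
    x = (w[0] * A[0] + w[1] * B[0] + w[2] * C[0]) / s
    y = (w[0] * A[1] + w[1] * B[1] + w[2] * C[1]) / s
    return x, y

print(nagel_point(1.0, 2.0, 3.0))  # symmetric fuzzy number -> x = 2.0
```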

  16. A Deep Learning Prediction Model Based on Extreme-Point Symmetric Mode Decomposition and Cluster Analysis

    OpenAIRE

    Li, Guohui; Zhang, Songling; Yang, Hong

    2017-01-01

    To address the irregularity of nonlinear signals and the difficulty of predicting them, a deep learning prediction model based on extreme-point symmetric mode decomposition (ESMD) and cluster analysis is proposed. Firstly, the original data are decomposed by ESMD to obtain a finite number of intrinsic mode functions (IMFs) and a residual. Secondly, fuzzy c-means clustering is used to group the decomposed components, and the deep belief network (DBN) is then used to predict them. Finally, the reconstructed ...

  17. PREMOR: a point reactor exposure model computer code for survey analysis of power plant performance

    International Nuclear Information System (INIS)

    Vondy, D.R.

    1979-10-01

    The PREMOR computer code was written to exploit a simple, two-group point nuclear reactor power plant model for survey analysis. Up to thirteen actinides, fourteen fission products, and one lumped absorber nuclide density are followed over a reactor history. Successive feed batches are accounted for with provision for from one to twenty batches resident. The effect of exposure of each of the batches to the same neutron flux is determined.

  19. Identification of estrogen target genes during zebrafish embryonic development through transcriptomic analysis.

    Directory of Open Access Journals (Sweden)

    Ruixin Hao

    Estrogen signaling is important for vertebrate embryonic development. Here we have used zebrafish (Danio rerio) as a vertebrate model to analyze estrogen signaling during development. Zebrafish embryos were exposed to 1 µM 17β-estradiol (E2) or vehicle from 3 hours to 4 days post fertilization (dpf), harvested at 1, 2, 3 and 4 dpf, and subjected to RNA extraction for transcriptome analysis using microarrays. Genes differentially expressed upon E2-treatment were analyzed with hierarchical clustering followed by biological process and tissue enrichment analysis. Markedly distinct sets of genes were up- and down-regulated by E2 at the four different time points. Among these genes, only the well-known estrogenic marker vtg1 was co-regulated at all time points. Despite this, the biological functional categories targeted by E2 were relatively similar throughout zebrafish development. According to knowledge-based tissue enrichment, estrogen-responsive genes were clustered mainly in the liver, pancreas and brain. This was in line with the developmental dynamics of the estrogen-target tissues that were visualized using transgenic zebrafish containing estrogen responsive elements driving the expression of GFP (Tg(5xERE:GFP)). Finally, the identified embryonic estrogen-responsive genes were compared to already published estrogen-responsive genes identified in male adult zebrafish (Gene Expression Omnibus database). The expression of a few genes was co-regulated by E2 in both embryonic and adult zebrafish. These could potentially be used as estrogenic biomarkers for exposure to estrogens or estrogenic endocrine disruptors in zebrafish. In conclusion, our data suggest that estrogen effects on early embryonic zebrafish development are stage- and tissue-specific.

  20. Analysis method of beam pointing stability based on optical transmission matrix

    Science.gov (United States)

    Wang, Chuanchuan; Huang, PingXian; Li, Xiaotong; Cen, Zhaofen

    2016-10-01

    Many factors affect the beam pointing stability of an optical system; among them, element tolerance is one of the most important and common. In some large laser systems it can make the final micro beam spots deviate noticeably on the image plane, so an effective and accurate theoretical analysis of element tolerance is essential. To make the analysis of beam pointing stability convenient and theoretical, we approximate the whole spot deviation by the transmission of a single chief ray rather than the full beam. Using optical matrices, we also reduce the complex process of light transmission to the multiplication of many matrices, so that an element tolerance model can be set up, i.e. a mathematical expression for the spot deviation in an optical system with element tolerances. In this way, a quantitative, theoretical analysis of beam pointing stability is possible. In the second half of the paper we design an experiment to measure the spot deviation caused by element tolerance in a multipass optical system, adjust the tolerance step by step, compare the results with the data obtained from the tolerance model, and finally confirm the correctness of the model.
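
    The matrix picture the abstract describes is the paraxial ray-transfer (ABCD) formalism: a chief ray is a (height, angle) vector, each element is a 2×2 matrix, and a tolerance such as an element tilt injects a small angular offset. A minimal sketch with invented values:

```python
import numpy as np

# Hedged sketch: propagate a chief ray (height y, angle theta) through a
# toy system as a product of 2x2 ray-transfer matrices. An element tilt
# error is modeled as a small additive angle at that element. All numbers
# are invented for illustration.
def free_space(d):
    return np.array([[1.0, d], [0.0, 1.0]])

def thin_lens(f):
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

def trace(ray, elements, tilt_errors):
    for M, dtheta in zip(elements, tilt_errors):
        ray = M @ ray
        ray[1] += dtheta  # pointing error injected by this element
    return ray

elements = [thin_lens(0.5), free_space(1.0), thin_lens(0.25), free_space(0.5)]
ideal = trace(np.array([0.0, 0.001]), elements, [0.0] * 4)
# 50 microradian tilt error on the second element only
perturbed = trace(np.array([0.0, 0.001]), elements, [0.0, 50e-6, 0.0, 0.0])
print("spot deviation on image plane:", perturbed[0] - ideal[0])
```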

  1. Genome-wide analysis of Polycomb targets in Drosophila

    Energy Technology Data Exchange (ETDEWEB)

    Schwartz, Yuri B.; Kahn, Tatyana G.; Nix, David A.; Li, Xiao-Yong; Bourgon, Richard; Biggin, Mark; Pirrotta, Vincenzo

    2006-04-01

    Polycomb Group (PcG) complexes are multiprotein assemblages that bind to chromatin and establish chromatin states leading to epigenetic silencing. PcG proteins regulate homeotic genes in flies and vertebrates but little is known about other PcG targets and the role of the PcG in development, differentiation and disease. We have determined the distribution of the PcG proteins PC, E(Z) and PSC and of histone H3K27 trimethylation in the Drosophila genome. At more than 200 PcG target genes, binding sites for the three PcG proteins colocalize to presumptive Polycomb Response Elements (PREs). In contrast, H3 me3K27 forms broad domains including the entire transcription unit and regulatory regions. PcG targets are highly enriched in genes encoding transcription factors but receptors, signaling proteins, morphogens and regulators representing all major developmental pathways are also included.

  2. Targets for bulk hydrogen analysis using thermal neutrons

    CERN Document Server

    Csikai, J; Buczko, C M

    2002-01-01

    The reflection property of substances can be characterized by the reflection cross-section of thermal neutrons, σ_β. A combination of the targets with thin polyethylene foils allowed an estimation of the flux depression of thermal neutrons caused by a bulk sample containing highly absorbing elements or compounds. Some new and more accurate σ_β values were determined by using the combined target arrangement. For the ratio R of the reflection and the elastic scattering cross-sections of thermal neutrons, R = σ_β/σ_el, a value of 0.60 ± 0.02 was found on the basis of the data obtained for a number of elements from H to Pb. Using this correlation factor and the σ_el values, unknown σ_β data can be deduced. The equivalent thicknesses, to polyethylene or hydrogen, of the different target materials were determined from the σ_β values.

  3. Portable Dew Point Mass Spectrometry System for Real-Time Gas and Moisture Analysis

    Science.gov (United States)

    Arkin, C.; Gillespie, Stacey; Ratzel, Christopher

    2010-01-01

    A portable instrument incorporates both mass spectrometry and dew point measurement to provide real-time, quantitative gas measurements of helium, nitrogen, oxygen, argon, and carbon dioxide, along with real-time, quantitative moisture analysis. The Portable Dew Point Mass Spectrometry (PDP-MS) system comprises a single quadrupole mass spectrometer and a high vacuum system consisting of a turbopump and a diaphragm-backing pump. A capacitive membrane dew point sensor was placed upstream of the MS, but still within the pressure-flow control pneumatic region. Pressure-flow control was achieved with an upstream precision metering valve, a capacitance diaphragm gauge, and a downstream mass flow controller. User configurable LabVIEW software was developed to provide real-time concentration data for the MS, dew point monitor, and sample delivery system pressure control, pressure and flow monitoring, and recording. The system has been designed to include in situ, NIST-traceable calibration. Certain sample tubing retains sufficient water that even if the sample is dry, the sample tube will desorb water to an amount resulting in moisture concentration errors up to 500 ppm for as long as 10 minutes. It was determined that Bev-A-Line IV was the best sample line to use. As a result of this issue, it is prudent to add a high-level humidity sensor to PDP-MS so such events can be prevented in the future.

  4. [Evaluation of a new blood gas analysis system: RapidPoint 500(®)].

    Science.gov (United States)

    Nicolas, Thierry; Cabrolier, Nadège; Bardonnet, Karine; Davani, Siamak

    2013-01-01

    We present here an evaluation of a new blood gas analysis system, the RapidPoint 500(®) (Siemens Healthcare Diagnostics). The aim of this research was to compare the ergonomics and analytical performance of this analyser with those of the RapidLab 1265 for the following parameters: pH, partial oxygen pressure, partial carbon dioxide pressure, sodium, potassium, ionized calcium, lactate and the CO-oximetry parameters hemoglobin, oxyhemoglobin, carboxyhemoglobin, methemoglobin, reduced hemoglobin and neonatal bilirubin; as well as with the Dimension Vista 500 results for chloride and glucose. The Valtec protocol, recommended by the French Society of Clinical Biology (SFBC), was used to analyze the study results. The experiment was carried out over a period of one month in the department of medical biochemistry. One hundred and sixty-five samples from adult patients admitted to the ER or hospitalized in intensive care were tested. The RapidPoint 500(®) was highly satisfactory from an ergonomic point of view. Intra- and inter-assay coefficients of variation (CV) with the three control levels were below those recommended by the SFBC for all parameters, and the comparative study gave coefficients of determination higher than 0.91. Taken together, the RapidPoint 500(®) appears fully satisfactory in terms of ergonomics and analytical performance.

  5. Inflation targeting and inflation performance : a comparative analysis

    NARCIS (Netherlands)

    Samarina, Anna; De Haan, Jakob; Terpstra, M.

    2014-01-01

    This article examines how the impact of inflation targeting on inflation performance depends on the choice of country samples, adoption dates, time periods and methodological approaches. We apply two different estimation methods - difference-in-differences and propensity score matching - for our

  6. Proposition for Improvement of Economics Situation with Use of Analysis of Break Even Point

    OpenAIRE

    Starečková, Alena

    2015-01-01

    The bachelor's thesis deals with carrying out a break-even-point analysis in a company, analysing its costs, and proposing improvements to the company's financial situation, particularly from the cost perspective. The first part defines the terms and formulas related to break-even-point analysis and cost issues. In the second part, a break-even analysis is carried out for a specific company, followed by proposals for improving the current situation.
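
    The break-even point itself is the sales volume at which the contribution margin covers fixed costs. A minimal sketch with invented figures:

```python
# Hedged sketch: classic break-even-point calculation. Figures are invented.
def break_even_units(fixed_costs, price_per_unit, variable_cost_per_unit):
    contribution_margin = price_per_unit - variable_cost_per_unit
    return fixed_costs / contribution_margin

units = break_even_units(fixed_costs=120_000, price_per_unit=50.0,
                         variable_cost_per_unit=30.0)
print(units)  # → 6000.0 units; above this volume the company makes a profit
```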

  7. Analysis of Myc-induced histone modifications on target chromatin.

    Directory of Open Access Journals (Sweden)

    Francesca Martinato

    The c-myc proto-oncogene is induced by mitogens and is a central regulator of cell growth and differentiation. The c-myc product, Myc, is a transcription factor that binds a multitude of genomic sites, estimated to be over 10-15% of all promoter regions. Target promoters generally pre-exist in an active or poised chromatin state that is further modified by Myc, contributing to fine transcriptional regulation (activation or repression) of the afferent gene. Among other mechanisms, Myc recruits histone acetyl-transferases to target chromatin and locally promotes hyper-acetylation of multiple lysines on histones H3 and H4, although the identity and combination of the modified lysines are unknown. Whether Myc dynamically regulates other histone modifications (or marks) at its binding sites also remains to be addressed. Here, we used quantitative chromatin immunoprecipitation (qChIP) to profile a total of 24 lysine-acetylation and -methylation marks modulated by Myc at target promoters in a human B-cell line with a regulatable c-myc transgene. Myc binding promoted acetylation of multiple lysines, primarily of H3K9, H3K14, H3K18, H4K5 and H4K12, but significantly also of H4K8, H4K91 and H2AK5. Dimethylation of H3K79 was also selectively induced at target promoters. A majority of target promoters showed co-induction of multiple marks - in various combinations - correlating with recruitment of the two HATs tested (Tip60 and HBO1), incorporation of the histone variant H2A.Z and transcriptional activation. Based on this and previous findings, we surmise that Myc recruits the Tip60/p400 complex to achieve a coordinated histone acetylation/exchange reaction at activated promoters. Our data are also consistent with the additive and redundant role of multiple acetylation events in transcriptional activation.

  8. A protein-targeting strategy used to develop a selective inhibitor of the E17K point mutation in the PH domain of Akt1

    Science.gov (United States)

    Deyle, Kaycie M.; Farrow, Blake; Qiao Hee, Ying; Work, Jeremy; Wong, Michelle; Lai, Bert; Umeda, Aiko; Millward, Steven W.; Nag, Arundhati; Das, Samir; Heath, James R.

    2015-05-01

    Ligands that can bind selectively to proteins with single amino-acid point mutations offer the potential to detect or treat an abnormal protein in the presence of the wild type (WT). However, it is difficult to develop a selective ligand if the point mutation is not associated with an addressable location, such as a binding pocket. Here we report an all-chemical synthetic epitope-targeting strategy that we used to discover a 5-mer peptide with selectivity for the E17K-transforming point mutation in the pleckstrin homology domain of the Akt1 oncoprotein. A fragment of Akt1 that contained the E17K mutation and an I19[propargylglycine] substitution was synthesized to form an addressable synthetic epitope. Azide-presenting peptides that clicked covalently onto this alkyne-presenting epitope were selected from a library using in situ screening. One peptide exhibits a 10:1 in vitro selectivity for the oncoprotein relative to the WT, with a similar selectivity in cells. This 5-mer peptide was expanded into a larger ligand that selectively blocks the E17K Akt1 interaction with its PIP3 (phosphatidylinositol (3,4,5)-trisphosphate) substrate.

  9. LIFE CYCLE ASSESSMENT AND HAZARD ANALYSIS AND CRITICAL CONTROL POINTS TO THE PASTA PRODUCT

    Directory of Open Access Journals (Sweden)

    Yulexis Meneses Linares

    2016-10-01

    The objective of this work is to combine the Life Cycle Assessment (LCA) and Hazard Analysis and Critical Control Points (HACCP) methodologies to determine the risks that food production poses to human health and the ecosystem. The environmental performance of pasta production at the "Marta Abreu" Pasta Factory in Cienfuegos is assessed. The critical control points, determined by the biological hazards (fungi and pests) and the physical hazards (wood, paper, thread and ferromagnetic particles), were the raw materials (flour, semolina and their mixtures) and their disposition and extraction. Resources are the most affected damage category, owing to the consumption of fossil fuels.

  10. Using thermal analysis techniques for identifying the flash point temperatures of some lubricant and base oils

    Directory of Open Access Journals (Sweden)

    Aksam Abdelkhalik

    2018-03-01

    The flash point (FP) temperatures of some lubricant and base oils were measured according to ASTM D92 and ASTM D93. In addition, the thermal stability of the oils was studied using a differential scanning calorimeter (DSC) and thermogravimetric analysis (TGA) under a nitrogen atmosphere. The DSC results showed that the FP temperature of each oil fell within the first decomposition step, and the temperature at the peak of the first decomposition step was usually higher than the FP temperature. The TGA results indicated that the temperature at which 17.5% weight loss took place (T17.5%) was nearly identical to the FP temperature (±10 °C) measured according to ASTM D92. The deviation percentage between FP and T17.5% was in the range from −0.8% to 3.6%. Keywords: Flash point, TGA, DSC
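
    Reading T17.5% off a TGA trace amounts to interpolating the weight-loss curve at 17.5%; a minimal sketch with an invented trace:

```python
import numpy as np

# Hedged sketch: estimate the temperature at 17.5% weight loss (T17.5%)
# from a TGA curve by linear interpolation. The trace below is invented.
temperature_C = np.array([100, 150, 200, 250, 300, 350])  # deg C
mass_percent = np.array([100, 99, 95, 85, 70, 50])        # residual mass %

weight_loss = 100.0 - mass_percent
# np.interp needs increasing x-values; weight loss rises with temperature
t_17_5 = np.interp(17.5, weight_loss, temperature_C)
print(t_17_5)  # temperature at which 17.5% of the mass has been lost
```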

  11. Search for neutrino point sources with an all-sky autocorrelation analysis in IceCube

    Energy Technology Data Exchange (ETDEWEB)

    Turcati, Andrea; Bernhard, Anna; Coenders, Stefan [TU, Munich (Germany); Collaboration: IceCube-Collaboration

    2016-07-01

    The IceCube Neutrino Observatory is a cubic-kilometre-scale neutrino telescope located in the Antarctic ice. Its full-sky field of view gives unique opportunities to study neutrino emission from the Galactic and extragalactic sky. Recently, IceCube found the first signal of astrophysical neutrinos with energies up to the PeV scale, but the origin of these particles remains unresolved. Given the observed flux, the absence of observations of bright point sources can be explained by the presence of numerous weak sources. This scenario can be tested using autocorrelation methods. We present here the sensitivities and discovery potentials of a two-point angular correlation analysis performed on seven years of IceCube data, taken between 2008 and 2015. The test is applied to the northern and southern skies separately, using the neutrino energy information to improve the effectiveness of the method.
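
    A two-point angular autocorrelation test of the kind mentioned above counts event pairs closer than a given angular separation and compares the count with scrambled (isotropic) skies; a minimal sketch with invented directions:

```python
import numpy as np

# Hedged sketch of a two-point angular correlation statistic: count event
# pairs with angular separation below a threshold. Directions are invented
# unit vectors; a real analysis compares the observed count against many
# sky scramblings to build the null distribution.
rng = np.random.default_rng(0)

def random_directions(n):
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

def pair_count(dirs, max_sep_deg):
    cos_min = np.cos(np.radians(max_sep_deg))
    cosines = dirs @ dirs.T               # pairwise dot products
    iu = np.triu_indices(len(dirs), k=1)  # each unordered pair once
    return int(np.sum(cosines[iu] >= cos_min))

events = random_directions(200)
observed = pair_count(events, max_sep_deg=3.0)
null = [pair_count(random_directions(200), 3.0) for _ in range(100)]
print(observed, np.mean(null))  # compare observed count with isotropic mean
```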

  12. Second-order analysis of structured inhomogeneous spatio-temporal point processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Ghorbani, Mohammad

    Statistical methodology for spatio-temporal point processes is in its infancy. We consider second-order analysis based on pair correlation functions and K-functions, first for general inhomogeneous spatio-temporal point processes and second for inhomogeneous spatio-temporal Cox processes. Assuming spatio-temporal separability of the intensity function, we clarify different meanings of second-order spatio-temporal separability. One is second-order spatio-temporal independence and relates e.g. to log-Gaussian Cox processes with an additive covariance structure of the underlying spatio-temporal Gaussian process. Another concerns shot-noise Cox processes with a separable spatio-temporal covariance density. We propose diagnostic procedures for checking hypotheses of second-order spatio-temporal separability, which we apply on simulated and real data (the UK 2001 epidemic foot and mouth disease data).

  13. Analysis of Point Based Image Registration Errors With Applications in Single Molecule Microscopy.

    Science.gov (United States)

    Cohen, E A K; Ober, R J

    2013-12-15

    We present an asymptotic treatment of errors involved in point-based image registration where control point (CP) localization is subject to heteroscedastic noise; a suitable model for image registration in fluorescence microscopy. Assuming an affine transform, CPs are used to solve a multivariate regression problem. With measurement errors existing for both sets of CPs this is an errors-in-variables problem and linear least squares is inappropriate; the correct method is generalized least squares. To allow for point-dependent errors, the equivalence of a generalized maximum likelihood and a heteroscedastic generalized least squares model is established, allowing previously published asymptotic results to be extended to image registration. For a particularly useful model of heteroscedastic noise where covariance matrices are scalar multiples of a known matrix (including the case where covariance matrices are multiples of the identity), we provide closed-form solutions to the estimators and derive their distributions. We consider the target registration error (TRE) and define a new measure called the localization registration error (LRE), believed to be useful especially in microscopy registration experiments. Assuming Gaussianity of the CP localization errors, it is shown that the asymptotic distributions of the TRE and LRE are themselves Gaussian, and the parameterized distributions are derived. Results are successfully applied to registration in single molecule microscopy to derive the key dependence of the TRE and LRE variance on the number of CPs and their associated photon counts. Simulations show the asymptotic results are robust for low CP numbers and non-Gaussianity. The method presented here is shown to outperform GLS on real imaging data.
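
    In the special case where each control point's covariance is a scalar multiple of the identity, the generalized least squares estimator discussed above reduces to weighted least squares on the affine parameters. A minimal sketch under that assumption, with invented control points (not the paper's estimator in full generality):

```python
import numpy as np

# Hedged sketch: estimate a 2D affine transform from control points by
# weighted least squares, a special case (scalar covariances) of the
# generalized least squares estimator discussed in the paper. Points,
# weights, and the "true" transform are invented for illustration.
rng = np.random.default_rng(1)
A_true = np.array([[1.02, 0.05], [-0.03, 0.98]])
t_true = np.array([2.0, -1.0])

src = rng.uniform(0, 100, size=(10, 2))
sigma = rng.uniform(0.1, 0.5, size=10)  # per-point localization noise
dst = src @ A_true.T + t_true + rng.normal(size=(10, 2)) * sigma[:, None]

# design matrix for the affine map acting on homogeneous coordinates
X = np.hstack([src, np.ones((10, 1))])
W = np.diag(1.0 / sigma**2)                         # precision weights
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ dst)  # shape (3, 2)
A_hat, t_hat = beta[:2].T, beta[2]
print(np.round(A_hat, 3), np.round(t_hat, 2))
```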

  14. Point defect characterization in HAADF-STEM images using multivariate statistical analysis

    International Nuclear Information System (INIS)

    Sarahan, Michael C.; Chi, Miaofang; Masiel, Daniel J.; Browning, Nigel D.

    2011-01-01

    Quantitative analysis of point defects is demonstrated through the use of multivariate statistical analysis. This analysis consists of principal component analysis for dimensional estimation and reduction, followed by independent component analysis to obtain physically meaningful, statistically independent factor images. Results from these analyses are presented in the form of factor images and scores. Factor images show characteristic intensity variations corresponding to physical structure changes, while scores relate how much those variations are present in the original data. The application of this technique is demonstrated on a set of experimental images of dislocation cores along a low-angle tilt grain boundary in strontium titanate. A relationship between chemical composition and lattice strain is highlighted in the analysis results, with picometer-scale shifts in several columns measurable from compositional changes in a separate column. -- Research Highlights: → Multivariate analysis of HAADF-STEM images. → Distinct structural variations among SrTiO3 dislocation cores. → Picometer atomic column shifts correlated with atomic column population changes.
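    The PCA-then-ICA workflow can be sketched with scikit-learn on an image stack. `factor_images` is a hypothetical helper, not the authors' code, and FastICA stands in for whichever ICA variant they used; the PCA explained-variance spectrum is what one would inspect to estimate the dimensionality:

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

def factor_images(stack, n_components):
    """Multivariate statistical analysis of an image stack (hedged sketch).

    stack : (n_images, h, w) array, e.g. sub-images cut around each defect.
    Returns (factors, scores, evr): factor images (n_components, h, w) showing
    characteristic intensity variations, per-image scores (n_images,
    n_components) saying how much of each variation is present, and the PCA
    explained-variance ratios used to judge dimensionality.
    """
    n, h, w = stack.shape
    X = stack.reshape(n, h * w)
    evr = PCA().fit(X).explained_variance_ratio_
    # FastICA whitens via PCA internally, then rotates to independent factors.
    ica = FastICA(n_components=n_components, random_state=0, max_iter=1000)
    scores = ica.fit_transform(X)                  # (n_images, n_components)
    factors = ica.mixing_.T.reshape(n_components, h, w)
    return factors, scores, evr
```

Each input image is then approximately the score-weighted sum of the factor images, matching the factor-image/score interpretation in the abstract.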

  15. Validation of capillary blood analysis and capillary testing mode on the epoc Point of Care system

    Directory of Open Access Journals (Sweden)

    Jing Cao

    2017-12-01

    Background: Laboratory testing during transport is a critical component of patient care, and capillary blood is a preferred sample type, particularly in children. This study evaluated the performance of capillary blood testing on the epoc Point of Care Blood Analysis System (Alere Inc.). Methods: Ten fresh venous blood samples were tested on the epoc system under the capillary mode. Correlation with the GEM 4000 (Instrumentation Laboratory) was examined for Na+, K+, Cl-, Ca2+, glucose, lactate, hematocrit, hemoglobin, pO2, pCO2, and pH, and correlation with serum tested on the Vitros 5600 (Ortho Clinical Diagnostics) was examined for creatinine. Eight paired capillary and venous blood samples were tested on the epoc and ABL800 (Radiometer) for the correlation of Na+, K+, Cl-, Ca2+, glucose, lactate, hematocrit, hemoglobin, pCO2, and pH. Capillary blood from 23 apparently healthy volunteers was tested on the epoc system to assess concordance with the reference ranges used locally. Results: Deming regression correlation coefficients for all comparisons were above 0.65 except for ionized Ca2+. Accordance of greater than 85% with the local reference ranges was found for all assays with the exception of pO2 and Cl-. Conclusion: Data from this study indicate that capillary blood tests on the epoc system provide results comparable to reference methods for the assays Na+, K+, glucose, lactate, hematocrit, hemoglobin, pCO2, and pH. Further validation in critically ill patients is needed before implementing the epoc system in patient transport. Impact of the study: This study demonstrated that capillary blood tests on the epoc Point of Care Blood Analysis System give results comparable to other chemistry analyzers for major blood gas and critical tests. The results are informative to institutions where pre-hospital and inter-hospital laboratory testing on capillary blood is a critical component of patient point of care testing. Keywords: Epoc, Capillary, Transport, Blood gas, Point of care
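    Deming regression, used above for the method comparisons, has a closed-form solution. A minimal sketch, with `lam` the assumed ratio of the two analyzers' error variances (lam=1 treats both as equally imprecise):

```python
import numpy as np

def deming(x, y, lam=1.0):
    """Deming regression y = a + b*x for method-comparison studies.

    Unlike ordinary least squares, it allows measurement error in both
    methods; lam is the ratio of y-error variance to x-error variance.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    sxx = ((x - mx) ** 2).mean()
    syy = ((y - my) ** 2).mean()
    sxy = ((x - mx) * (y - my)).mean()
    # closed-form slope of the errors-in-both-variables fit
    b = (syy - lam * sxx
         + np.sqrt((syy - lam * sxx) ** 2 + 4 * lam * sxy ** 2)) / (2 * sxy)
    a = my - b * mx
    return a, b
```

On perfectly agreeing analyzers the fit returns slope 1 and intercept 0; systematic proportional or constant bias shows up directly in `b` and `a`.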

  16. Thermal shock analysis of liquid-mercury spallation target

    CERN Document Server

    Ishikura, S; Futakawa, M; Hino, R; Date, H

    2002-01-01

    The development of neutron scattering facilities is being carried out under the high-intensity proton accelerator project promoted by JAERI and KEK. To estimate the structural integrity of the heavy liquid-metal (Hg) target used as a spallation neutron source in a MW-class neutron scattering facility, the dynamic stress behavior due to the incidence of a 1 MW pulsed proton beam was analyzed using an FEM code. Two target containers, one with a semi-cylindrical window and one with a flat-plate window, were used as models for the analyses. As a result, it is confirmed that the stress (pressure wave) generated by the dynamic thermal shock is largest at the center of the window, and that the flat-plate window is more advantageous from the structural viewpoint than the semi-cylindrical window. It was also found that the stress generated in the window by the pressure wave can be treated as a secondary stress. (author)

  17. Analysis of an XADS Target with the System Code TRACE

    International Nuclear Information System (INIS)

    Jaeger, Wadim; Sanchez Espinoza, Victor H.; Feng, Bo

    2008-01-01

    Accelerator-driven systems (ADS) present an option to reduce the radioactive waste of the nuclear industry. The experimental Accelerator-Driven System (XADS) has been designed to investigate the feasibility of using ADS on an industrial scale to burn minor actinides. The target section lies in the middle of the subcritical core and is bombarded by a proton beam to produce spallation neutrons. The thermal energy produced from this reaction requires a heat removal system for the target section. The target is cooled by liquid lead-bismuth-eutectics (LBE) in the primary system which in turn transfers the heat via a heat exchanger (HX) to the secondary coolant, Diphyl THT (DTHT), a synthetic diathermic fluid. Since this design is still in development, a detailed investigation of the system is necessary to evaluate the behavior during normal and transient operations. Due to the lack of experimental facilities and data for ADS, the analyses are mostly done using thermal hydraulic codes. In addition to evaluating the thermal hydraulics of the XADS, this paper also benchmarks a new code developed by the NRC, TRACE, against other established codes. The events used in this study are beam power switch-on/off transients and a loss of heat sink accident. The obtained results from TRACE were in good agreement with the results of various other codes. (authors)

  18. Analysis of Mo99 production irradiating 20% U targets

    International Nuclear Information System (INIS)

    Calabrese, C. Ruben; Grant, Carlos R.; Marajofsky, Andres; Parkansky, David G.

    1999-01-01

    At present, the National Atomic Energy Commission is producing about 800 Ci of Mo99 per week by irradiating 90% enriched uranium-aluminum alloy plate targets in the RA-3 reactor, a 5 MW MTR-type reactor. In order to change to 20% enriched uranium, and to increase production to about 3000 Ci per week, several configurations were studied with rod and plate geometries using uranium (20% enriched)-aluminum targets. The first case was the irradiation of a plate target element in the normal reactor configuration. Results showed a good efficiency, but both the reactivity value and the power density were too high. An element with rods was also analyzed, but results showed a poor efficiency and too much aluminum involved in the process, although a low reactivity and an acceptable rod power density. Finally, a solution consisting of plate elements with Zircaloy cladding was adopted, which has shown not only a good efficiency but is also acceptable from the viewpoint of safety, heat transfer criteria, and feasibility.

  19. CT-guided intracavitary radiotherapy for cervical cancer: Comparison of conventional point A plan with clinical target volume-based three-dimensional plan using dose-volume parameters

    International Nuclear Information System (INIS)

    Shin, Kyung Hwan; Kim, Tae Hyun; Cho, Jung Keun; Kim, Joo-Young; Park, Sung Yong; Park, Sang-Yoon; Kim, Dae Yong; Chie, Eui Kyu; Pyo, Hong Ryull; Cho, Kwan Ho

    2006-01-01

    Purpose: To perform an intracavitary radiotherapy (ICR) plan comparison between the conventional point A plan (conventional plan) and computed tomography (CT)-guided clinical target volume-based plan (CTV plan) by analysis of the quantitative dose-volume parameters and irradiated volumes of organs at risk in patients with cervical cancer. Methods and Materials: Thirty plans for 192Ir high-dose-rate ICR after 30-40 Gy external beam radiotherapy were investigated. CT images were acquired at the first ICR session with artifact-free applicators in place. The gross tumor volume, clinical target volume (CTV), point A, and International Commission on Radiation Units and Measurements Report 38 rectal and bladder points were defined on reconstructed CT images. A fractional 100% dose was prescribed to point A in the conventional plan and to the outermost point to cover all CTVs in the CTV plan. The reference volume receiving 100% of the prescribed dose (Vref) and the dose-volume parameters of the coverage index, conformal index, and external volume index were calculated from the dose-volume histogram. The bladder and rectal point doses and the percentage of volumes receiving 50%, 80%, and 100% of the prescribed dose were also analyzed. Results: Conventional plans were performed, and patients were categorized on the basis of whether the 100% isodose line of the point A prescription dose fully encompassed the CTV (Group 1, n = 20) or not (Group 2, n = 10). The mean gross tumor volume (11.6 cm3) and CTV (24.9 cm3) of Group 1 were smaller than the corresponding values (23.7 and 44.7 cm3, respectively) for Group 2 (p = 0.003). The mean Vref for all patients was 129.6 cm3 for the conventional plan and 97.0 cm3 for the CTV plan (p = 0.003). The mean Vref in Group 1 decreased markedly with the CTV plan (p < 0.001). For the conventional and CTV plans in all patients, the mean coverage index, conformal index, and external volume index were 0.98 and 1.0, 0.23 and 0.34, and 3.86 and

  20. Automated extraction and analysis of rock discontinuity characteristics from 3D point clouds

    Science.gov (United States)

    Bianchetti, Matteo; Villa, Alberto; Agliardi, Federico; Crosta, Giovanni B.

    2016-04-01

    A reliable characterization of fractured rock masses requires an exhaustive geometrical description of discontinuities, including orientation, spacing, and size. These are required to describe discontinuum rock mass structure, perform Discrete Fracture Network and DEM modelling, or provide input for rock mass classification or equivalent continuum estimate of rock mass properties. Although several advanced methodologies have been developed in the last decades, a complete characterization of discontinuity geometry in practice is still challenging, due to scale-dependent variability of fracture patterns and difficult accessibility to large outcrops. Recent advances in remote survey techniques, such as terrestrial laser scanning and digital photogrammetry, allow a fast and accurate acquisition of dense 3D point clouds, which promoted the development of several semi-automatic approaches to extract discontinuity features. Nevertheless, these often need user supervision on algorithm parameters which can be difficult to assess. To overcome this problem, we developed an original Matlab tool, allowing fast, fully automatic extraction and analysis of discontinuity features with no requirements on point cloud accuracy, density and homogeneity. The tool consists of a set of algorithms which: (i) process raw 3D point clouds, (ii) automatically characterize discontinuity sets, (iii) identify individual discontinuity surfaces, and (iv) analyse their spacing and persistence. The tool operates in either a supervised or unsupervised mode, starting from an automatic preliminary exploration data analysis. The identification and geometrical characterization of discontinuity features is divided in steps. First, coplanar surfaces are identified in the whole point cloud using K-Nearest Neighbor and Principal Component Analysis algorithms optimized on point cloud accuracy and specified typical facet size. 
Then, discontinuity set orientation is calculated using Kernel Density Estimation and
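    The first step (K-Nearest Neighbor plus Principal Component Analysis to identify coplanar surfaces) can be sketched as local normal estimation on a 3D point cloud. `estimate_normals` is a hypothetical name, and the tool's optimizations on point-cloud accuracy and typical facet size are omitted:

```python
import numpy as np
from scipy.spatial import cKDTree

def estimate_normals(points, k=12):
    """Per-point unit normals from the PCA of each k-nearest-neighbour patch.

    The eigenvector of the smallest covariance eigenvalue approximates the
    local facet normal; points on the same discontinuity surface share
    (anti)parallel normals, which is what the set-clustering step exploits.
    """
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    normals = np.empty_like(points)
    for i, nb in enumerate(idx):
        patch = points[nb] - points[nb].mean(axis=0)
        # eigh returns eigenvalues in ascending order; column 0 = normal
        _, vecs = np.linalg.eigh(patch.T @ patch)
        normals[i] = vecs[:, 0]
    return normals
```

Clustering these normals on the unit sphere (e.g. by kernel density estimation of their orientations) is then what yields the discontinuity sets.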

  1. Breed differences in dogs' sensitivity to human points: a meta-analysis.

    Science.gov (United States)

    Dorey, Nicole R; Udell, Monique A R; Wynne, Clive D L

    2009-07-01

    The last decade has seen a substantial increase in research on the behavioral and cognitive abilities of pet dogs, Canis familiaris. The most commonly used experimental paradigm is the object-choice task in which a dog is given a choice of two containers and guided to the reinforced object by human pointing gestures. We review here studies of this type and attempt a meta-analysis of the available data. In the meta-analysis breeds of dogs were grouped into the eight categories of the American Kennel Club, and into four clusters identified by Parker and Ostrander [Parker, H.G., Ostrander, E.A., 2005. Canine genomics and genetics: running with the pack. PLoS Genet. 1, 507-513] on the basis of a genetic analysis. No differences in performance between breeds categorized in either fashion were identified. Rather, all dog breeds appear to be similarly and highly successful in following human points to locate desired food. We suggest this result could be due to the paucity of data available in published studies, and the restricted range of breeds tested.

  2. Benefits analysis of Soft Open Points for electrical distribution network operation

    International Nuclear Information System (INIS)

    Cao, Wanyu; Wu, Jianzhong; Jenkins, Nick; Wang, Chengshan; Green, Timothy

    2016-01-01

    Highlights: • An analysis framework was developed to quantify the operational benefits. • The framework considers both network reconfiguration and SOP control. • Benefits were analyzed through both quantitative and sensitivity analysis. - Abstract: Soft Open Points (SOPs) are power electronic devices installed in place of normally-open points in electrical power distribution networks. They are able to provide active power flow control, reactive power compensation and voltage regulation under normal network operating conditions, as well as fast fault isolation and supply restoration under abnormal conditions. A steady state analysis framework was developed to quantify the operational benefits of a distribution network with SOPs under normal network operating conditions. A generic power injection model was developed and used to determine the optimal SOP operation using an improved Powell’s Direct Set method. Physical limits and power losses of the SOP device (based on back to back voltage-source converters) were considered in the model. Distribution network reconfiguration algorithms, with and without SOPs, were developed and used to identify the benefits of using SOPs. Test results on a 33-bus distribution network compared the benefits of using SOPs, traditional network reconfiguration and the combination of both. The results showed that using only one SOP achieved a similar improvement in network operation compared to the case of using network reconfiguration with all branches equipped with remotely controlled switches. A combination of SOP control and network reconfiguration provided the optimal network operation.
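    The optimal-SOP-operation step can be illustrated with SciPy's implementation of Powell's direct-search method on a toy loss function. The objective, setpoints and limits below are invented stand-ins for the paper's AC power-flow model; only the use of a derivative-free Powell search under converter capacity bounds is mirrored:

```python
from scipy.optimize import minimize

# Hypothetical setpoints at which the toy "network losses" are smallest:
# p = active power transferred between feeders, q = reactive power injected.
p_star, q_star = 0.4, 0.2

def loss(x):
    """Toy quadratic network-loss surrogate plus a small converter-loss term."""
    p, q = x
    return (p - p_star) ** 2 + 2.0 * (q - q_star) ** 2 + 0.01 * (p ** 2 + q ** 2)

# Powell's method needs no gradients, which suits power-flow objectives
# evaluated by a black-box solver; bounds model converter capacity limits.
res = minimize(loss, x0=[0.0, 0.0], method="Powell",
               bounds=[(-1.0, 1.0), (-1.0, 1.0)])
```

In the paper the same search would be wrapped around a full power-flow evaluation of the 33-bus network rather than this closed-form surrogate.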

  3. Point Analysis in Java applied to histological images of the perforant pathway: a user's account.

    Science.gov (United States)

    Scorcioni, Ruggero; Wright, Susan N; Patrick Card, J; Ascoli, Giorgio A; Barrionuevo, Germán

    2008-01-01

    The freeware Java tool Point Analysis in Java (PAJ), created to perform 3D point analysis, was tested in an independent laboratory setting. The input data consisted of images of the hippocampal perforant pathway from serial immunocytochemical localizations of the rat brain in multiple views at different resolutions. The low magnification set (x2 objective) comprised the entire perforant pathway, while the high magnification set (x100 objective) allowed the identification of individual fibers. A preliminary stereological study revealed a striking linear relationship between the fiber count at high magnification and the optical density at low magnification. PAJ enabled fast analysis for down-sampled data sets and a friendly interface with automated plot drawings. Noted strengths included the multi-platform support as well as the free availability of the source code, conducive to a broad user base and maximum flexibility for ad hoc requirements. PAJ has great potential to extend its usability by (a) improving its graphical user interface, (b) increasing its input size limit, (c) improving response time for large data sets, and (d) potentially being integrated with other Java graphical tools such as ImageJ.

  4. The Melting Point of Palladium Using Miniature Fixed Points of Different Ceramic Materials: Part II—Analysis of Melting Curves and Long-Term Investigation

    Science.gov (United States)

    Edler, F.; Huang, K.

    2016-12-01

    Fifteen miniature fixed-point cells made of three different ceramic crucible materials (Al2O3, ZrO2, and Al2O3(86 %)+ZrO2(14 %)) were filled with pure palladium and used to calibrate type B thermocouples (Pt30 %Rh/Pt6 %Rh). A critical point when using miniature fixed points with small amounts of fixed-point material is the analysis of the melting curves, which are characterized by significant slopes during the melting process compared to the flat melting plateaus obtainable with conventional fixed-point cells. The method of the extrapolated starting point temperature, using a straight-line approximation of the melting plateau, was applied to analyze the melting curves. This method allowed an unambiguous determination of an electromotive force (emf) assignable as the melting temperature. The strict consideration of two constraints resulted in a unique, repeatable and objective method to determine the emf at the melting temperature within an uncertainty of about 0.1 μV. The lifetime and long-term stability of the miniature fixed points were investigated by performing more than 100 melt/freeze cycles for each crucible of the different ceramic materials. No failure of the crucibles occurred, indicating an excellent mechanical stability of the investigated miniature cells. The consequent limitation of heating rates to values below ±3.5 K·min⁻¹ above 1100 °C and the carefully and completely filled crucibles (the liquid palladium occupies the whole volume of the crucible) are the reasons for successfully preventing the crucibles from breaking. The thermal stability of the melting temperature of palladium was excellent when using the crucibles made of Al2O3(86 %)+ZrO2(14 %) and ZrO2. Emf drifts over the total duration of the long-term investigation were below a temperature equivalent of about 0.1 K-0.2 K.
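    The extrapolated-starting-point method can be illustrated on a synthetic melting curve: fit a straight line to the sloped plateau and evaluate it at the onset of melting. All numbers below are invented for illustration and carry no metrological meaning:

```python
import numpy as np

# Synthetic melting curve: flat emf before melting, then the sloped
# "plateau" characteristic of miniature cells with little fixed-point material.
t = np.linspace(0.0, 10.0, 201)                  # time (arbitrary units)
t_onset, emf_onset, slope = 4.0, 10.80, 0.02     # hypothetical values
emf = np.where(t < t_onset, emf_onset, emf_onset + slope * (t - t_onset))

# Straight-line approximation of the sloped melting plateau ...
sel = (t > 5.0) & (t < 9.0)
a, b = np.polyfit(t[sel], emf[sel], 1)

# ... extrapolated back to the start of melting gives the emf that is
# assigned as the melting temperature (the "extrapolated starting point").
emf_melt = a * t_onset + b
```

On real data the onset time itself must be identified from the curve, which is where the paper's two constraints make the assignment unique and repeatable.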

  5. Dual time point 18FDG-PET/CT versus single time point 18FDG-PET/CT for the differential diagnosis of pulmonary nodules - A meta-analysis

    International Nuclear Information System (INIS)

    Zhang, Li; Wang, Yinzhong; Lei, Junqiang; Tian, Jinhui; Zhai, Yanan

    2013-01-01

    Background: Lung cancer is one of the most common cancer types in the world. An accurate diagnosis of lung cancer is crucial for early treatment and management. Purpose: To perform a comprehensive meta-analysis to evaluate the diagnostic performance of dual time point 18F-fluorodeoxyglucose positron emission tomography/computed tomography (FDG-PET/CT) and single time point 18FDG-PET/CT in the diagnosis of pulmonary nodules. Material and Methods: PubMed (1966-2011.11), EMBASE (1974-2011.11), Web of Science (1972-2011.11), Cochrane Library (-2011.11), and four Chinese databases; CBM (1978-2011.11), CNKI (1994-2011.11), VIP (1989-2011.11), and Wanfang Database (1994-2011.11) were searched. Summary sensitivity, summary specificity, summary diagnostic odds ratios (DOR), and summary positive likelihood ratios (LR+) and negative likelihood ratios (LR-) were obtained using Meta-Disc software. Summary receiver-operating characteristic (SROC) curves were used to evaluate the diagnostic performance of dual time point 18FDG-PET/CT and single time point 18FDG-PET/CT. Results: The inclusion criteria were fulfilled by eight articles, with a total of 415 patients and 430 pulmonary nodules. Compared with the gold standard (pathology or clinical follow-up), the summary sensitivity of dual time point 18FDG-PET/CT was 79% (95%CI, 74.0-84.0%), and its summary specificity was 73% (95%CI, 65.0-79.0%); the summary LR+ was 2.61 (95%CI, 1.96-3.47), and the summary LR- was 0.29 (95%CI, 0.21-0.41); the summary DOR was 10.25 (95%CI, 5.79-18.14), and the area under the SROC curve (AUC) was 0.8244. The summary sensitivity for single time point 18FDG-PET/CT was 77% (95%CI, 71.9-82.3%), and its summary specificity was 59% (95%CI, 50.6-66.2%); the summary LR+ was 1.97 (95%CI, 1.32-2.93), and the summary LR- was 0.37 (95%CI, 0.29-0.49); the summary DOR was 6.39 (95%CI, 3.39-12.05), and the AUC was 0.8220. Conclusion: The results indicate that dual time point 18FDG-PET/CT and single
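    For a single 2x2 table the reported measures are related by standard identities: LR+ = sensitivity/(1-specificity), LR- = (1-sensitivity)/specificity, and DOR = LR+/LR-. A sketch with toy counts chosen to roughly mirror the pooled sensitivity and specificity above (pooled meta-analytic values are not reproduced exactly, since each measure is pooled separately across studies):

```python
def diagnostic_summary(tp, fp, fn, tn):
    """Sensitivity, specificity, likelihood ratios and DOR from a 2x2 table."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    lr_pos = sens / (1 - spec)      # P(test+ | disease) / P(test+ | no disease)
    lr_neg = (1 - sens) / spec      # P(test- | disease) / P(test- | no disease)
    dor = lr_pos / lr_neg           # algebraically equal to (tp*tn)/(fp*fn)
    return sens, spec, lr_pos, lr_neg, dor

# Hypothetical counts giving sensitivity 0.79 and specificity 0.73:
sens, spec, lr_pos, lr_neg, dor = diagnostic_summary(tp=79, fp=27, fn=21, tn=73)
```

The DOR from these toy counts lands near the reported 10.25, illustrating the internal consistency of the summary measures.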

  6. An Investigation of Three-point Shooting through an Analysis of NBA Player Tracking Data

    OpenAIRE

    Sliz, Bradley A.

    2017-01-01

    I address the difficult challenge of measuring the relative influence of competing basketball game strategies, and I apply my analysis to plays resulting in three-point shots. I use a glut of SportVU player tracking data from over 600 NBA games to derive custom position-based features that capture tangible game strategies from game-play data, such as teamwork, player matchups, and on-ball defender distances. Then, I demonstrate statistical methods for measuring the relative importance of any ...

  7. [Incorporation of the Hazard Analysis and Critical Control Point system (HACCP) in food legislation].

    Science.gov (United States)

    Castellanos Rey, Liliana C; Villamil Jiménez, Luis C; Romero Prada, Jaime R

    2004-01-01

    The Hazard Analysis and Critical Control Point system (HACCP), recommended by different international organizations such as the Codex Alimentarius Commission, the World Trade Organization (WTO), the International Office of Epizootics (OIE) and the International Convention for Vegetables Protection (ICPV), among others, contributes to ensuring food safety along the agro-alimentary chain and requires Good Manufacturing Practices (GMP) for its implementation; GMPs are legislated in most countries. Since 1997, Colombia has set rules and legislation for the application of the HACCP system in agreement with international standards. This paper discusses the potential and difficulties of enforcing this legislation and suggests some policy implications for food safety.

  8. Farmer cooperatives in the food economy of Western Europe: an analysis from the Marketing point of view

    NARCIS (Netherlands)

    Meulenberg, M.T.G.

    1979-01-01

    This paper is concerned with an analysis of farmer cooperatives in Western Europe from the marketing point of view. The analysis is restricted to marketing and processing cooperatives. First some basic characteristics of farmer cooperatives are discussed from a systems point of view. Afterwards

  9. Analysis of tree stand horizontal structure using random point field methods

    Directory of Open Access Journals (Sweden)

    O. P. Sekretenko

    2015-06-01

    This paper uses the model approach to analyze the horizontal structure of forest stands. The main types of models of random point fields and statistical procedures that can be used to analyze spatial patterns of trees of uneven and even-aged stands are described. We show how modern methods of spatial statistics can be used to address one of the objectives of forestry – to clarify the laws of natural thinning of a forest stand and the corresponding changes in its spatial structure over time. Studying natural forest thinning, we describe the consecutive stages of modeling: selection of the appropriate parametric model, parameter estimation and generation of point patterns in accordance with the selected model, the selection of statistical functions to describe the horizontal structure of forest stands, and testing of statistical hypotheses. We show the possibilities of a specialized software package, spatstat, which is designed to meet the challenges of spatial statistics and provides software support for modern methods of analysis of spatial data. We show that a model of stand thinning that does not consider inter-tree interaction can project the size distribution of the trees properly, but the spatial pattern of the modeled stand is not quite consistent with observed data. Using data from three even-aged pine forest stands of 25, 55, and 90 years old, we demonstrate that spatial point process models are useful for combining measurements in forest stands of different ages to study natural forest stand thinning.
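    Statistical functions of the kind used to describe horizontal stand structure (e.g. Ripley's K, provided in spatstat as `Kest`) can be sketched in a few lines. This naive estimator omits the edge corrections a real analysis would apply near the plot boundary:

```python
import numpy as np

def ripley_k(points, r_values, area):
    """Naive Ripley's K estimate for a 2D point pattern (no edge correction).

    Under complete spatial randomness K(r) ~ pi*r**2; smaller values indicate
    regularity (e.g. inhibition between neighbouring trees), larger values
    indicate clustering.
    """
    points = np.asarray(points, float)
    n = len(points)
    lam = n / area                               # intensity: points per unit area
    # pairwise distance matrix; self-pairs excluded via the diagonal
    d = np.sqrt(((points[:, None, :] - points[None, :, :]) ** 2).sum(-1))
    np.fill_diagonal(d, np.inf)
    return np.array([(d < r).sum() / (lam * n) for r in r_values])
```

Comparing the empirical curve against the CSR benchmark pi*r**2 (with simulation envelopes) is the hypothesis-testing step the abstract describes.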

  10. Miniaturization for Point-of-Care Analysis: Platform Technology for Almost Every Biomedical Assay.

    Science.gov (United States)

    Schumacher, Soeren; Sartorius, Dorian; Ehrentreich-Förster, Eva; Bier, Frank F

    2012-10-01

    Platform technologies for the changing needs of diagnostics are one of the main challenges in medical device technology. From one point of view, the demand for new and more versatile diagnostics is increasing due to a deeper knowledge of biomarkers and their association with diseases. From another point of view, diagnostics will become decentralized, since decisions can be made faster, resulting in a higher success of therapy. Hence, new types of technologies have to be established which enable multiparameter analysis at the point of care. Within this review-like article, a system called the Fraunhofer ivD-platform is introduced. It consists of a credit-card sized cartridge with integrated reagents, sensors and pumps, and a read-out/processing unit. Within the cartridge the assay runs fully automated within 15-20 minutes. Due to the open design of the platform, different analyses such as antibody, serological or DNA assays can be performed. Specific examples of these three different assay types are given to show the broad applicability of the system.

  11. Change-Point and Trend Analysis on Annual Maximum Discharge in Continental United States

    Science.gov (United States)

    Serinaldi, F.; Villarini, G.; Smith, J. A.; Krajewski, W. F.

    2008-12-01

    Annual maximum discharge records from 36 stations representing different hydro-climatic regimes in the continental United States with at least 100 years of records are used to investigate the presence of temporal trends and abrupt changes in mean and variance. Change point analysis is performed by means of two non-parametric (Pettitt and CUSUM), one semi-parametric (Guan), and two parametric (Rodionov and Bayesian Change Point) tests. Two non-parametric (Mann-Kendall and Spearman) and one parametric (Pearson) tests are applied to detect the presence of temporal trends. Generalized Additive Models for Location, Scale and Shape (GAMLSS) are also used to parametrically model the streamflow data, exploiting their flexibility to account for changes and temporal trends in the parameters of distribution functions. Additionally, serial correlation is assessed in advance by computing the autocorrelation function (ACF), and the Hurst parameter is estimated using two estimators (aggregated variance and differenced variance methods) to investigate the presence of long-range dependence. The results of this study indicate a lack of long-range dependence in the maximum streamflow series. At some stations the authors found a statistically significant change point in the mean and/or variance, while in general they detected no statistically significant temporal trends.
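    The Mann-Kendall trend test used above admits a compact implementation. This sketch uses the normal approximation without the tie correction a production implementation would include:

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Mann-Kendall trend test: S statistic and two-sided p-value.

    S sums the signs of all pairwise later-minus-earlier differences;
    the p-value uses the large-sample normal approximation (no tie
    correction), with the standard continuity correction on S.
    """
    x = np.asarray(x, float)
    n = len(x)
    s = int(sum(np.sign(x[j] - x[i])
                for i in range(n - 1) for j in range(i + 1, n)))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - norm.cdf(abs(z)))
    return s, p
```

Because the test assumes serial independence, checking the ACF first (as the study does) is what justifies applying it to annual maxima directly.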

  12. One-point fluctuation analysis of the high-energy neutrino sky

    Energy Technology Data Exchange (ETDEWEB)

    Feyereisen, Michael R.; Ando, Shin'ichiro [GRAPPA Institute, University of Amsterdam, Science Park 904, 1098 XH Amsterdam (Netherlands); Tamborra, Irene, E-mail: m.r.feyereisen@uva.nl, E-mail: tamborra@nbi.ku.dk, E-mail: s.ando@uva.nl [Niels Bohr International Academy, Niels Bohr Institute, Blegdamsvej 17, 2100 Copenhagen (Denmark)

    2017-03-01

    We perform the first one-point fluctuation analysis of the high-energy neutrino sky. This method reveals itself to be especially suited to contemporary neutrino data, as it allows one to study the properties of the astrophysical components of the high-energy flux detected by the IceCube telescope, even with low statistics and in the absence of point source detection. Besides the veto-passing atmospheric foregrounds, we adopt a simple model of the high-energy neutrino background by assuming two main extra-galactic components: star-forming galaxies and blazars. By leveraging multi-wavelength data from Herschel and Fermi, we predict the spectral and anisotropic probability distributions for their expected neutrino counts in IceCube. We find that star-forming galaxies are likely to remain a diffuse background due to the poor angular resolution of IceCube, and we determine an upper limit on the number of shower events that can reasonably be associated with blazars. We also find that upper limits on the contribution of blazars to the measured flux are unfavourably affected by the skewness of the blazar flux distribution. One-point event clustering and likelihood analyses of the IceCube HESE data suggest that this method has the potential to dramatically improve over more conventional model-based analyses, especially for the next generation of neutrino telescopes.

  13. Electron-density critical points analysis and catastrophe theory to forecast structure instability in periodic solids.

    Science.gov (United States)

    Merli, Marcello; Pavese, Alessandro

    2018-03-01

    The critical points analysis of electron density, i.e. ρ(x), from ab initio calculations is used in combination with catastrophe theory to show a correlation between the topology of ρ(x) and the appearance of instability that may lead to transformations of crystal structures, as a function of pressure/temperature. In particular, this study focuses on the evolution of coalescing non-degenerate critical points, i.e. such that ∇ρ(xc) = 0 and λ1, λ2, λ3 ≠ 0 [λ being the eigenvalues of the Hessian of ρ(x) at xc], towards degenerate critical points, i.e. ∇ρ(xc) = 0 and at least one λ equal to zero. The catastrophe theory formalism provides a mathematical tool to model ρ(x) in the neighbourhood of xc and allows one to rationalize the occurrence of instability in terms of electron-density topology and Gibbs energy. The phase/state transitions that TiO2 (rutile structure), MgO (periclase structure) and Al2O3 (corundum structure) undergo because of pressure and/or temperature are here discussed. An agreement of 3-5% is observed between the theoretical model and the experimental pressure/temperature of transformation.
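    The rank/signature classification of critical points follows directly from the Hessian eigenvalues. In the usual (rank, signature) notation for electron density, (3,-3) is a maximum (nuclear attractor), (3,-1) a bond, (3,+1) a ring and (3,+3) a cage critical point, and a zero eigenvalue marks the degeneracy that signals incipient instability. A minimal sketch (the function name and tolerance are ours):

```python
import numpy as np

def classify_critical_point(hessian, tol=1e-8):
    """Classify a critical point of a scalar field from its Hessian.

    Returns (rank, signature): rank = number of non-zero eigenvalues,
    signature = sum of their signs. Rank < 3 indicates a degenerate
    critical point, the kind tracked in the catastrophe-theory analysis.
    """
    lam = np.linalg.eigvalsh(hessian)            # real eigenvalues, ascending
    nonzero = np.abs(lam) > tol
    rank = int(nonzero.sum())
    signature = int(np.sign(lam[nonzero]).sum())
    return rank, signature
```

Watching two eigenvalues of opposite sign shrink toward zero as pressure or temperature varies is, in this language, watching a bond and a ring critical point coalesce.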

  14. An analysis of China's CO2 emission peaking target and pathways

    OpenAIRE

    He, Jian-Kun

    2017-01-01

    China has set the goal for its CO2 emissions to peak around 2030, which is not only a strategic decision coordinating domestic sustainable development and global climate change mitigation but also an overarching target and a key point of action for China's resource conservation, environmental protection, shift in economic development patterns, and CO2 emission reduction to avoid climate change. The development stage where China maps out the CO2 emission peak target is earlier than that of the ...

  15. Proteomics analysis of antimalarial targets of Garcinia mangostana Linn.

    Institute of Scientific and Technical Information of China (English)

    Wanna Chaijaroenkul; Artitiya Thiengsusuk; Kanchana Rungsihirunrat; Stephen Andrew Ward; Kesara Na-Bangchang

    2014-01-01

    Objective: To investigate possible protein targets for the antimalarial activity of Garcinia mangostana Linn. (G. mangostana) (pericarp) in the 3D7 Plasmodium falciparum clone using 2-dimensional electrophoresis and liquid chromatography mass spectrometry (LC/MS/MS). Methods: 3D7 Plasmodium falciparum was exposed to the crude ethanolic extract of G. mangostana Linn. (pericarp) at concentrations of 12 μg/mL (IC50 level: concentration that inhibits parasite growth by 50%) and 30 μg/mL (IC90 level: concentration that inhibits parasite growth by 90%) for 12 h. Parasite proteins were separated by 2-dimensional electrophoresis and identified by LC/MS/MS. Results: At the IC50 concentration, about 82% of the expressed parasite proteins were matched with the control (non-exposed), while at the IC90 concentration, only 15% matched proteins were found. The selected protein spots from parasite exposed to the plant extract at the concentration of 12 μg/mL were identified as enzymes that play a role in the glycolysis pathway, i.e., phosphoglycerate mutase (putative), L-lactate dehydrogenase/glyceraldehyde-3-phosphate dehydrogenase, and fructose-bisphosphate aldolase/phosphoglycerate kinase. The proteasome was found in parasite exposed to 30 μg/mL of the extract. Conclusions: Results suggest that proteins involved in the glycolysis pathway may be the targets for the antimalarial activity of G. mangostana Linn. (pericarp).

  16. Spectral analysis of growing graphs: a quantum probability point of view

    CERN Document Server

    Obata, Nobuaki

    2017-01-01

    This book is designed as a concise introduction to the recent achievements on spectral analysis of graphs or networks from the point of view of quantum (or non-commutative) probability theory. The main topics are spectral distributions of the adjacency matrices of finite or infinite graphs and their limit distributions for growing graphs. The main vehicle is quantum probability, an algebraic extension of the traditional probability theory, which provides a new framework for the analysis of adjacency matrices revealing their non-commutative nature. For example, the method of quantum decomposition makes it possible to study spectral distributions by means of interacting Fock spaces or equivalently by orthogonal polynomials. Various concepts of independence in quantum probability and corresponding central limit theorems are used for the asymptotic study of spectral distributions for product graphs. This book is written for researchers, teachers, and students interested in graph spectra, their (asymptotic) spectr...
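
The basic object of such a spectral analysis, the eigenvalue distribution of an adjacency matrix, is easy to compute directly. The path-graph example below is a generic illustration (not taken from the book), chosen because its spectrum is known in closed form.

```python
import numpy as np

def path_graph_spectrum(n):
    """Sorted eigenvalues of the adjacency matrix of the path graph P_n."""
    A = np.zeros((n, n))
    for i in range(n - 1):
        A[i, i + 1] = A[i + 1, i] = 1.0  # edge between consecutive vertices
    return np.sort(np.linalg.eigvalsh(A))

# Closed form for P_n: eigenvalues are 2*cos(k*pi/(n+1)), k = 1..n.
n = 6
numeric = path_graph_spectrum(n)
exact = np.sort([2 * np.cos(k * np.pi / (n + 1)) for k in range(1, n + 1)])
```

For growing graphs, one would compute such spectra for increasing `n` and study the limiting distribution, which is where the quantum-probability machinery of the book takes over.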

  17. Development of Point Kernel Shielding Analysis Computer Program Implementing Recent Nuclear Data and Graphic User Interfaces

    International Nuclear Information System (INIS)

    Kang, Sang Ho; Lee, Seung Gi; Chung, Chan Young; Lee, Choon Sik; Lee, Jai Ki

    2001-01-01

    In order to comply with revised national regulations on radiological protection and to implement recent nuclear data and dose conversion factors, KOPEC developed a new point kernel gamma and beta ray shielding analysis computer program. This new code, named VisualShield, adopted mass attenuation coefficients and buildup factors from recent ANSI/ANS standards and flux-to-dose conversion factors from the International Commission on Radiological Protection (ICRP) Publication 74 for estimation of the effective/equivalent dose recommended in ICRP 60. VisualShield utilizes graphical user interfaces and 3-D visualization of the geometric configuration for preparing input data sets and analyzing results, which leads users to error-free processing with visual effects. Code validation and data analysis were performed by comparing the results of various calculations to the data outputs of previous programs such as MCNP 4B, ISOSHLD-II, QAD-CGGP, etc.
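
The point-kernel method such codes implement combines an inverse-square uncollided flux, exponential attenuation, a buildup factor, and a flux-to-dose conversion factor. A minimal sketch follows; the linear buildup model `B = 1 + mu*t` is a deliberate simplification for illustration, whereas codes like VisualShield interpolate tabulated ANSI/ANS buildup factors.

```python
import math

def point_kernel_dose_rate(S, mu, t, r, flux_to_dose):
    """Dose rate from an isotropic point source of strength S (photons/s)
    behind a slab of thickness t (cm) with attenuation coefficient mu (1/cm),
    evaluated at distance r (cm) from the source.
    B = 1 + mu*t is a toy buildup model used only for illustration."""
    buildup = 1.0 + mu * t
    uncollided_flux = S * math.exp(-mu * t) / (4.0 * math.pi * r ** 2)
    return buildup * uncollided_flux * flux_to_dose
```

With `mu = 0` (no shield) the expression reduces to pure inverse-square spreading times the conversion factor, which is a convenient sanity check.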

  18. Application of the Hazard Analysis Critical Control Point (HACCP) system to the production process of tempe chips

    Directory of Open Access Journals (Sweden)

    Rahmi Yuniarti

    2015-06-01

    Malang is one of the industrial centers of tempe chips. To maintain quality and food safety, analysis is required to identify hazards during the production process. This study was conducted to identify the hazards during the production process of tempe chips and provide recommendations for developing a HACCP system. The production process of tempe chips starts with slicing the tempe, moving it to the kitchen, coating it with flour dough, frying it in the pan, draining it, packaging it, and then storing it. There are 3 types of potential hazards, biological, physical, and chemical, during the production process. The CCP identification found three processes that have a Critical Control Point: the slicing of the tempe, the immersion of the tempe into the flour mixture, and the draining. Recommendations for the development of a HACCP system include recommendations related to employee hygiene, supporting equipment, 5-S analysis, and the production layout.

  19. Validation of capillary blood analysis and capillary testing mode on the epoc Point of Care system.

    Science.gov (United States)

    Cao, Jing; Edwards, Rachel; Chairez, Janette; Devaraj, Sridevi

    2017-12-01

    Laboratory testing in transport is a critical component of patient care, and capillary blood is a preferred sample type, particularly in children. This study evaluated the performance of capillary blood testing on the epoc Point of Care Blood Analysis System (Alere Inc). Ten fresh venous blood samples were tested on the epoc system under the capillary mode. Correlation with the GEM 4000 (Instrumentation Laboratory) was examined for Na+, K+, Cl-, Ca2+, glucose, lactate, hematocrit, hemoglobin, pO2, pCO2, and pH, and correlation with serum tested on the Vitros 5600 (Ortho Clinical Diagnostics) was examined for creatinine. Eight paired capillary and venous blood samples were tested on the epoc and the ABL800 (Radiometer) for the correlation of Na+, K+, Cl-, Ca2+, glucose, lactate, hematocrit, hemoglobin, pCO2, and pH. Capillary blood from 23 apparently healthy volunteers was tested on the epoc system to assess concordance with the reference ranges used locally. Deming regression correlation coefficients for all the comparisons were above 0.65 except for ionized Ca2+. Accordance of greater than 85% with the local reference ranges was found in all assays with the exception of pO2 and Cl-. Data from this study indicate that capillary blood tests on the epoc system provide results comparable to the reference methods for these assays: Na+, K+, glucose, lactate, hematocrit, hemoglobin, pCO2, and pH. Further validation in critically ill patients is needed to implement the epoc system in patient transport. This study demonstrated that capillary blood tests on the epoc Point of Care Blood Analysis System give comparable results to other chemistry analyzers for major blood gas and critical tests. The results are informative for institutions where pre-hospital and inter-hospital laboratory testing on capillary blood is a critical component of patient point-of-care testing.
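
Deming regression, used above for method comparison, fits a line assuming measurement error in both methods rather than only in y. A minimal sketch with an assumed error-variance ratio `delta = 1` (a common default when both analyzers have similar imprecision; the study's actual settings are not stated in the abstract):

```python
import numpy as np

def deming_regression(x, y, delta=1.0):
    """Return (slope, intercept) of the Deming fit of y on x, where delta
    is the ratio of the y-error variance to the x-error variance."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    sxx = np.sum((x - mx) ** 2)
    syy = np.sum((y - my) ** 2)
    sxy = np.sum((x - mx) * (y - my))
    # Closed-form Deming slope (errors-in-variables maximum likelihood)
    slope = (syy - delta * sxx +
             np.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2)) / (2 * sxy)
    return slope, my - slope * mx
```

On exactly linear data the fit recovers the line regardless of `delta`, which makes a simple correctness check.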

  20. Application of hazard analysis critical control points (HACCP) to organic chemical contaminants in food.

    Science.gov (United States)

    Ropkins, K; Beck, A J

    2002-03-01

    Hazard Analysis Critical Control Points (HACCP) is a systematic approach to the identification, assessment, and control of hazards that was developed as an effective alternative to conventional end-point analysis to control food safety. It has been described as the most effective means of controlling foodborne diseases, and its application to the control of microbiological hazards has been accepted internationally. By contrast, relatively little has been reported relating to the potential use of HACCP, or HACCP-like procedures, to control chemical contaminants of food. This article presents an overview of the implementation of HACCP and discusses its application to the control of organic chemical contaminants in the food chain. Although this is likely to result in many of the advantages previously identified for microbiological HACCP, that is, more effective, efficient, and economical hazard management, a number of areas are identified that require further research and development. These include: (1) a need to refine the methods of chemical contaminant identification and risk assessment employed, (2) develop more cost-effective monitoring and control methods for routine chemical contaminant surveillance of food, and (3) improve the effectiveness of process optimization for the control of chemical contaminants in food.

  1. AUTOMATED VOXEL MODEL FROM POINT CLOUDS FOR STRUCTURAL ANALYSIS OF CULTURAL HERITAGE

    Directory of Open Access Journals (Sweden)

    G. Bitelli

    2016-06-01

    In the context of cultural heritage, an accurate and comprehensive digital survey of a historical building is today essential in order to measure its geometry in detail for documentation or restoration purposes, for supporting special studies regarding materials and constructive characteristics, and finally for structural analysis. Some proven geomatic techniques, such as photogrammetry and terrestrial laser scanning, are increasingly used to survey buildings of different complexity and dimensions; one typical product is in the form of point clouds. We developed a semi-automatic procedure to convert point clouds, acquired by laser scanning or digital photogrammetry, to a filled volume model of the whole structure. The filled volume model, in a voxel format, can be useful for further analysis and also for the generation of a Finite Element Model (FEM) of the surveyed building. In this paper a new approach is presented with the aim to decrease operator intervention in the workflow and obtain a better description of the structure. In order to achieve this result a voxel model with variable resolution is produced. Different parameters are compared and different steps of the procedure are tested and validated in the case study of the North tower of the San Felice sul Panaro Fortress, a monumental historical building located in San Felice sul Panaro (Modena, Italy) that was hit by an earthquake in 2012.
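
The core step of converting a point cloud to a voxel model is binning points into a regular 3-D grid. A minimal fixed-resolution sketch follows; the paper's procedure additionally fills the interior volume and varies the resolution, which this illustration omits.

```python
import numpy as np

def voxelize(points, voxel_size):
    """Return the occupied voxel indices for an (N, 3) point cloud,
    using the cloud's minimum corner as the grid origin."""
    pts = np.asarray(points, dtype=float)
    origin = pts.min(axis=0)
    # Integer grid coordinates of each point
    indices = np.floor((pts - origin) / voxel_size).astype(int)
    # Duplicate indices collapse to a single occupied voxel
    return np.unique(indices, axis=0), origin

occupied, origin = voxelize([[0.0, 0.0, 0.0],
                             [0.1, 0.2, 0.0],
                             [1.5, 0.0, 0.0]], voxel_size=1.0)
```

The occupied-voxel set is exactly the kind of structure that can then be exported as solid elements for a FEM mesh.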

  2. Bayesian change-point analysis reveals developmental change in a classic theory of mind task.

    Science.gov (United States)

    Baker, Sara T; Leslie, Alan M; Gallistel, C R; Hood, Bruce M

    2016-12-01

    Although learning and development reflect changes situated in an individual brain, most discussions of behavioral change are based on the evidence of group averages. Our reliance on group-averaged data creates a dilemma. On the one hand, we need to use traditional inferential statistics. On the other hand, group averages are highly ambiguous when we need to understand change in the individual; the average pattern of change may characterize all, some, or none of the individuals in the group. Here we present a new method for statistically characterizing developmental change in each individual child we study. Using false-belief tasks, fifty-two children in two cohorts were repeatedly tested for varying lengths of time between 3 and 5 years of age. Using a novel Bayesian change point analysis, we determined both the presence and, just as importantly, the absence of change in individual longitudinal cumulative records. Whenever the analysis supports a change conclusion, it identifies in that child's record the most likely point at which change occurred. Results show striking variability in patterns of change and stability across individual children. We then group the individuals by their various patterns of change or no change. The resulting patterns provide scarce support for sudden changes in competence and shed new light on the concepts of "passing" and "failing" in developmental studies. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
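
The paper's analysis is Bayesian (via the R package bcp), but the underlying idea of locating a change in a child's 0/1 success record can be sketched with a simple frequentist stand-in: scan every split point and keep the one maximizing the two-segment Bernoulli log-likelihood. This is an illustrative simplification, not the authors' method.

```python
import math

def bernoulli_loglik(segment):
    """Log-likelihood of a 0/1 segment under its own MLE success rate."""
    n, k = len(segment), sum(segment)
    if k in (0, n):  # pure segment: likelihood is 1, log-likelihood 0
        return 0.0
    p = k / n
    return k * math.log(p) + (n - k) * math.log(1 - p)

def best_change_point(record):
    """Index t that best splits the record into two Bernoulli segments."""
    n = len(record)
    return max(range(1, n),
               key=lambda t: bernoulli_loglik(record[:t])
                             + bernoulli_loglik(record[t:]))
```

A record of ten failures followed by ten successes splits cleanly at trial 10, mirroring the "most likely point at which change occurred" the abstract describes.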

  3. Spectral analysis of point-vortex dynamics: first application to vortex polygons in a circular domain

    International Nuclear Information System (INIS)

    Speetjens, M F M; Meleshko, V V; Van Heijst, G J F

    2014-01-01

    The present study addresses the classical problem of the dynamics and stability of a cluster of N point vortices of equal strength arranged in a polygonal configuration ('N-vortex polygons'). In unbounded domains, such N-vortex polygons are unconditionally stable for N⩽7. Confinement in a circular domain tightens the stability conditions to N⩽6 and a maximum polygon size relative to the domain radius. This work expands on existing studies on stability and integrability by first giving an exploratory spectral analysis of the dynamics of N-vortex polygons in circular domains. Key to this is that the spectral signature of the time evolution of vortex positions reflects their qualitative behaviour. Expressing vortex motion by a generic evolution operator (the so-called Koopman operator) provides a rigorous framework for such spectral analyses. This paves the way to further differentiation and classification of point-vortex behaviour beyond stability and integrability. The concept of Koopman-based spectral analysis is demonstrated for N-vortex polygons. This reveals that conditional stability can be seen as a local form of integrability and confirms an important generic link between spectrum and dynamics: discrete spectra imply regular (quasi-periodic) motion; continuous (sub-)spectra imply chaotic motion. Moreover, this exposes rich nonlinear dynamics such as intermittency between regular and chaotic motion and quasi-coherent structures formed by chaotic vortices.

  4. Area, and Power Performance Analysis of a Floating-Point Based Application on FPGAs

    National Research Council Canada - National Science Library

    Govindu, Gokul

    2003-01-01

    .... However the inevitable quantization effects and the complexity of converting the floating-point algorithm into a fixed point one, limit the use of fixed-point arithmetic for high precision embedded computing...
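
The quantization effect mentioned in the abstract is easy to make concrete: rounding a value to a fixed-point format with f fractional bits introduces an error of at most half an LSB, i.e. 2^-(f+1). A small illustrative sketch (not from the report), ignoring overflow of the integer part:

```python
def to_fixed_point(x, frac_bits):
    """Round x to the nearest value representable with frac_bits
    fractional bits (integer-part overflow/saturation not modeled)."""
    scale = 1 << frac_bits  # 2 ** frac_bits
    return round(x * scale) / scale

# Rounding error is bounded by half an LSB: 2 ** -(frac_bits + 1).
err = abs(0.1 - to_fixed_point(0.1, 8))
```

Sweeping `frac_bits` over an algorithm's data path is exactly the float-to-fixed conversion trade-off the abstract says limits fixed-point arithmetic for high-precision embedded computing.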

  5. Thermal analysis of titanium drive-in target for D-D neutron generation.

    Science.gov (United States)

    Jung, N S; Kim, I J; Kim, S J; Choi, H D

    2010-01-01

    Thermal analysis was performed for a titanium drive-in target of a D-D neutron generator. Computational fluid dynamics code CFX-5 was used in this study. To define the heat flux term for the thermal analysis, beam current profile was measured. Temperature of the target was calculated at some of the operating conditions. The cooling performance of the target was evaluated by means of the comparison of the calculated maximum target temperature and the critical temperature of titanium. Copyright 2009 Elsevier Ltd. All rights reserved.

  6. Multivariate analysis for the estimation of target localization errors in fiducial marker-based radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Takamiya, Masanori [Department of Nuclear Engineering, Graduate School of Engineering, Kyoto University, Kyoto 606-8501, Japan and Department of Radiation Oncology and Image-applied Therapy, Graduate School of Medicine, Kyoto University, Kyoto 606-8507 (Japan); Nakamura, Mitsuhiro, E-mail: m-nkmr@kuhp.kyoto-u.ac.jp; Akimoto, Mami; Ueki, Nami; Yamada, Masahiro; Matsuo, Yukinori; Mizowaki, Takashi; Hiraoka, Masahiro [Department of Radiation Oncology and Image-applied Therapy, Graduate School of Medicine, Kyoto University, Kyoto 606-8507 (Japan); Tanabe, Hiroaki [Division of Radiation Oncology, Institute of Biomedical Research and Innovation, Kobe 650-0047 (Japan); Kokubo, Masaki [Division of Radiation Oncology, Institute of Biomedical Research and Innovation, Kobe 650-0047, Japan and Department of Radiation Oncology, Kobe City Medical Center General Hospital, Kobe 650-0047 (Japan); Itoh, Akio [Department of Nuclear Engineering, Graduate School of Engineering, Kyoto University, Kyoto 606-8501 (Japan)

    2016-04-15

    Purpose: To assess the target localization error (TLE) in terms of the distance between the target and the localization point estimated from the surrogates (|TMD|), the average of respiratory motion for the surrogates and the target (|aRM|), and the number of fiducial markers used for estimating the target (n). Methods: This study enrolled 17 lung cancer patients who subsequently underwent four fractions of real-time tumor tracking irradiation. Four or five fiducial markers were implanted around the lung tumor. The three-dimensional (3D) distance between the tumor and markers was at maximum 58.7 mm. One of the markers was used as the target (P_t), and those markers with a 3D |TMD_n| ≤ 58.7 mm at end-exhalation were then selected. The estimated target position (P_e) was calculated from a localization point consisting of one to three markers except P_t. Respiratory motion for P_t and P_e was defined as the root mean square of each displacement, and |aRM| was calculated from the mean value. TLE was defined as the root mean square of each difference between P_t and P_e during the monitoring of each fraction. These procedures were performed repeatedly using the remaining markers. To provide the best guidance on the answer with n and |TMD|, fiducial markers with a 3D |aRM| ≥ 10 mm were selected. Finally, a total of 205, 282, and 76 TLEs that fulfilled the 3D |TMD| and 3D |aRM| criteria were obtained for n = 1, 2, and 3, respectively. Multiple regression analysis (MRA) was used to evaluate TLE as a function of |TMD| and |aRM| for each n. Results: |TMD| for n = 1 was larger than that for n = 3. Moreover, |aRM| was almost constant for all n, indicating a similar scale for the marker's motion near the lung tumor. MRA showed that |aRM| in the left–right direction was the major cause of TLE; however, the contribution made little difference to the 3D TLE because of the small amount of motion in the left–right direction. The TLE

  7. Numerical analysis for multi-group neutron-diffusion equation using Radial Point Interpolation Method (RPIM)

    International Nuclear Information System (INIS)

    Kim, Kyung-O; Jeong, Hae Sun; Jo, Daeseong

    2017-01-01

    Highlights: • Employing the Radial Point Interpolation Method (RPIM) in the numerical analysis of the multi-group neutron-diffusion equation. • Establishing the mathematical formulation of the modified multi-group neutron-diffusion equation by RPIM. • Performing the numerical analysis for a 2D critical problem. - Abstract: A mesh-free method is introduced to overcome the drawbacks (e.g., mesh generation and connectivity definition between the meshes) of mesh-based (nodal) methods such as the finite-element method and finite-difference method. In particular, the Point Interpolation Method (PIM) using a radial basis function is employed in the numerical analysis of the multi-group neutron-diffusion equation. The benchmark calculations are performed for 2D homogeneous and heterogeneous problems, and the Multiquadrics (MQ) and Gaussian (EXP) functions are employed to analyze the effect of the radial basis function on the numerical solution. Additionally, the effect of the dimensionless shape parameter in those functions on the calculation accuracy is evaluated. According to the results, the radial PIM (RPIM) can provide a highly accurate solution for the multiplication eigenvalue and the neutron flux distribution, and the numerical solution with the MQ radial basis function exhibits stable accuracy with respect to the reference solutions compared with the other solution. The dimensionless shape parameter directly affects the calculation accuracy and computing time. Values between 1.87 and 3.0 for the benchmark problems considered in this study lead to the most accurate solution. The difference between the analytical and numerical results for the neutron flux is significantly increased at the edge of the problem geometry, even though the maximum difference is lower than 4%. This phenomenon seems to arise from the derivative boundary condition at the (x,0) and (0,y) positions, and it may be necessary to introduce additional strategy (e.g., the method using fictitious points and
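
The multiquadric basis at the heart of RPIM can be demonstrated with a plain 1-D RBF interpolation sketch. The shape parameter `c` below plays the role of the shape parameter discussed in the abstract, but its value here is an illustrative assumption, not one of the paper's benchmark settings.

```python
import numpy as np

def multiquadric_interpolate(x_nodes, f_nodes, x_eval, c=0.5):
    """Interpolate 1-D data with multiquadric RBFs phi(r) = sqrt(r^2 + c^2)."""
    phi = lambda r: np.sqrt(r ** 2 + c ** 2)
    # Collocation matrix: phi(|x_i - x_j|) over all node pairs
    A = phi(np.abs(x_nodes[:, None] - x_nodes[None, :]))
    weights = np.linalg.solve(A, f_nodes)
    return phi(np.abs(np.asarray(x_eval)[:, None] - x_nodes[None, :])) @ weights

x = np.linspace(0.0, 1.0, 9)
f = np.sin(2 * np.pi * x)
```

By construction the interpolant reproduces the data at the nodes; the accuracy between nodes, and the conditioning of `A`, both depend on `c`, which is the trade-off the abstract quantifies for the diffusion benchmarks.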

  8. Targeted drugs for pulmonary arterial hypertension: a network meta-analysis of 32 randomized clinical trials

    Directory of Open Access Journals (Sweden)

    Gao XF

    2017-05-01

    Xiao-Fei Gao,1 Jun-Jie Zhang,1,2 Xiao-Min Jiang,1 Zhen Ge,1,2 Zhi-Mei Wang,1 Bing Li,1 Wen-Xing Mao,1 Shao-Liang Chen1,2 1Department of Cardiology, Nanjing First Hospital, Nanjing Medical University, Nanjing, 2Department of Cardiology, Nanjing Heart Center, Nanjing, People’s Republic of China Background: Pulmonary arterial hypertension (PAH) is a devastating disease and ultimately leads to right heart failure and premature death. A total of four classes of targeted drugs, prostanoids, endothelin receptor antagonists (ERAs), phosphodiesterase 5 inhibitors (PDE-5Is), and soluble guanylate cyclase stimulators (sGCS), have been proved to improve exercise capacity and hemodynamics compared to placebo; however, direct head-to-head comparisons of these drugs are lacking. This network meta-analysis was conducted to comprehensively compare the efficacy of these targeted drugs for PAH. Methods: Medline, the Cochrane Library, and other Internet sources were searched for randomized clinical trials exploring the efficacy of targeted drugs for patients with PAH. The primary effective end point of this network meta-analysis was the 6-minute walk distance (6MWD). Results: Thirty-two eligible trials including 6,758 patients were identified. There was a statistically significant improvement in 6MWD, mean pulmonary arterial pressure, pulmonary vascular resistance, and clinical worsening events associated with each of the four targeted drugs compared with placebo. Combination therapy improved 6MWD by 20.94 m (95% confidence interval [CI]: 6.94, 34.94; P=0.003) vs prostanoids, and 16.94 m (95% CI: 4.41, 29.47; P=0.008) vs ERAs. PDE-5Is improved 6MWD by 17.28 m (95% CI: 1.91, 32.65; P=0.028) vs prostanoids, with a similar result for combination therapy. In addition, combination therapy reduced mean pulmonary artery pressure by 3.97 mmHg (95% CI: -6.06, -1.88; P<0.001) vs prostanoids, 8.24 mmHg (95% CI: -10.71, -5.76; P<0.001) vs ERAs, 3.38 mmHg (95% CI: -6.30, -0.47; P=0.023) vs

  9. Singular point analysis during rail deployment into vacuum vessel for ITER blanket maintenance

    International Nuclear Information System (INIS)

    Kakudate, Satoshi; Shibanuma, Kiyoshi

    2007-05-01

    Remote maintenance of the ITER blanket, composed of about 400 modules in the vessel, is required to be carried out by a maintenance robot due to the high gamma radiation of ∼500 Gy/h in the vessel. A concept of a rail-mounted vehicle manipulator system has been developed to apply to the maintenance of the ITER blanket. The most critical issue of the vehicle manipulator system is the feasibility of the deployment of the articulated rail, composed of eight rail links, into the donut-shaped vessel without any driving mechanism in the rail. To solve this issue, a new driving mechanism and procedure for the rail deployment have been proposed, taking account of a repeated operation of the multi-rail links deployed in the same kinematical manner. The new driving mechanism, which is different from that of a usual 'articulated arm' equipped with an actuator in every joint for movement, is composed of three mechanisms. To assess the feasibility of the kinematics of the articulated rail for rail deployment, a kinematical model composed of three rail links related to a cycle of the repeated operation for rail deployment was considered. The determinant det J' of the Jacobian matrix J' was solved so as to estimate the existence of a singular point of the transformation during rail deployment. As a result, it is found that there is a singular point due to det J' = 0. To avoid the singular point of the rail links, a new location of the second driving mechanism and the related rail deployment procedure are proposed. As a result of the rail deployment test based on the new proposal using a full-scale vehicle manipulator system, the respective rail links have been successfully deployed within 6 h, less than the target of 8 h, in the same manner of repeated operation under synchronized cooperation among the three driving mechanisms. It is therefore concluded that the feasibility of the rail deployment of the articulated rail composed of simple structures without any driving mechanism has been demonstrated.
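
The singularity test described above, checking where det J' vanishes, can be illustrated on the simplest articulated chain: a planar two-link arm, for which the determinant collapses to l1*l2*sin(theta2), so the fully stretched pose theta2 = 0 is singular. This is a generic textbook example, not the ITER three-link rail model itself.

```python
import math

def det_jacobian_two_link(l1, l2, theta1, theta2):
    """Determinant of the tip-position Jacobian of a planar 2-link arm."""
    s1, c1 = math.sin(theta1), math.cos(theta1)
    s12, c12 = math.sin(theta1 + theta2), math.cos(theta1 + theta2)
    j11 = -l1 * s1 - l2 * s12
    j12 = -l2 * s12
    j21 = l1 * c1 + l2 * c12
    j22 = l2 * c12
    # Algebraically this simplifies to l1 * l2 * sin(theta2)
    return j11 * j22 - j12 * j21
```

Scanning det J over a planned joint trajectory and flagging near-zero values is exactly the kind of analysis that motivated relocating the second driving mechanism in the paper.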

  10. [Experience of a Break-Even Point Analysis for Make-or-Buy Decision.].

    Science.gov (United States)

    Kim, Yunhee

    2006-12-01

    Cost containment through continuous quality improvement of medical services is required in an age of keen competition in the medical market. Laboratory managers should examine make-or-buy decisions periodically. On such occasions, a break-even point analysis can be a useful analytical tool. In this study, cost accounting and break-even point (BEP) analysis were performed for the case in which immunoassay items showing a recent increase in order volume were to be made in-house. Fixed and variable costs were calculated for the case in which alpha fetoprotein (AFP), carcinoembryonic antigen (CEA), prostate-specific antigen (PSA), ferritin, free thyroxine (fT4), triiodothyronine (T3), thyroid-stimulating hormone (TSH), CA 125, CA 19-9, and hepatitis B envelope antibody (HBeAb) were to be tested with the Abbott AxSYM instrument. Break-even volume was calculated as the fixed cost per year divided by the purchasing cost per test minus the variable cost per test, and the BEP ratio as the total purchasing costs at break-even volume divided by the total purchasing costs at the actual annual volume. The average fixed cost per year of AFP, CEA, PSA, ferritin, fT4, T3, TSH, CA 125, CA 19-9, and HBeAb was 8,279,187 won and the average variable cost per test, 3,786 won. The average break-even volume was 1,599 and the average BEP ratio was 852%. The average BEP ratio without including quality costs such as calibration and quality control was 74%. Because the quality assurance of clinical tests cannot be waived, outsourcing all 10 items was financially more adequate than in-house production at the present volume. BEP analysis was useful as a financial tool for the make-or-buy decision, a common matter that laboratory managers encounter.
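
The two formulas in the abstract are simple enough to state directly. The numbers in the test are made up for illustration; the outsourcing price per test is not given in the abstract, so no attempt is made to reproduce the study's 1,599-test average.

```python
def break_even_volume(fixed_cost_per_year, price_per_test, variable_cost_per_test):
    """Break-even volume = fixed cost / (purchasing cost - variable cost)."""
    return fixed_cost_per_year / (price_per_test - variable_cost_per_test)

def bep_ratio(fixed_cost_per_year, price_per_test, variable_cost_per_test,
              actual_volume):
    """Total purchasing cost at break-even volume divided by total purchasing
    cost at the actual annual volume. The price cancels, so this reduces to
    break-even volume / actual volume; a ratio above 1 (100%) means the
    actual volume is below break-even and outsourcing is cheaper."""
    bev = break_even_volume(fixed_cost_per_year, price_per_test,
                            variable_cost_per_test)
    return bev / actual_volume
```

With the study's average BEP ratio of 852%, the actual volume sat far below break-even, which is why outsourcing won.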

  11. Analysis of web-based online services for GPS relative and precise point positioning techniques

    Directory of Open Access Journals (Sweden)

    Taylan Ocalan

    Nowadays, the Global Positioning System (GPS) has been used effectively for survey purposes in several engineering applications across multiple disciplines. Web-based online services developed by several organizations, which are user friendly, unlimited, and mostly free, have become a significant alternative to high-cost scientific and commercial software for post-processing and analyzing GPS data. When centimeter (cm) or decimeter (dm) level accuracies are desired, they can be obtained easily through these services for engineering applications of different quality requirements. In this paper, a test study was conducted on the ISKI-CORS network (Istanbul, Turkey) in order to carry out an accuracy analysis of the most used web-based online services around the world (namely OPUS, AUSPOS, SCOUT, CSRS-PPP, GAPS, APPS, magicGNSS). These services use relative and precise point positioning (PPP) solution approaches. In this test study, the coordinates of eight stations were estimated using both the online services and the Bernese 5.0 scientific GPS processing software from a 24-hour GPS data set, and then the coordinate differences between the online services and the Bernese processing software were computed. From the evaluations, it was seen that each individual difference was less than 10 mm for the relative online services and less than 20 mm for the precise point positioning services. The accuracy analysis was gathered from these coordinate differences and the standard deviations of the coordinates obtained from the different techniques, and then the online services were compared to each other. The results show that the position accuracies obtained by the associated online services provide highly accurate solutions that may be used in many engineering applications and geodetic analyses.

  12. Kinetic analysis of the effects of target structure on siRNA efficiency

    Science.gov (United States)

    Chen, Jiawen; Zhang, Wenbing

    2012-12-01

    RNAi efficiency for target cleavage and protein expression is related to the target structure. Considering the RNA-induced silencing complex (RISC) as a multiple-turnover enzyme, we investigated the effect of target mRNA structure on siRNA efficiency with kinetic analysis. A 4-step model was used to study the target cleavage kinetic process: hybridization nucleation at an accessible target site, RISC-mRNA hybrid elongation along with melting of the mRNA target structure, target cleavage, and enzyme reactivation. In this model, the terms accounting for target accessibility, stability, and the seed and nucleation site effects are all included. The results are in good agreement with those of experiments that report differing conclusions about the effects of structure on siRNA efficiency. This shows that siRNA efficiency is influenced by the integrated factors of the target's accessibility, stability, and the seed effects. To study off-target effects, a simple model of one siRNA binding to two mRNA targets was designed. Using this model, the possibility of diminishing off-target effects by means of the siRNA concentration was discussed.

  13. Evaluating Google, Twitter, and Wikipedia as Tools for Influenza Surveillance Using Bayesian Change Point Analysis: A Comparative Analysis.

    Science.gov (United States)

    Sharpe, J Danielle; Hopkins, Richard S; Cook, Robert L; Striley, Catherine W

    2016-10-20

    Traditional influenza surveillance relies on influenza-like illness (ILI) syndrome that is reported by health care providers. It primarily captures individuals who seek medical care and misses those who do not. Recently, Web-based data sources have been studied for application to public health surveillance, as there is a growing number of people who search, post, and tweet about their illnesses before seeking medical care. Existing research has shown some promise of using data from Google, Twitter, and Wikipedia to complement traditional surveillance for ILI. However, past studies have evaluated these Web-based sources individually or in pairs without comparing all 3 of them, and it would be beneficial to know which of the Web-based sources performs best in order to be considered to complement traditional methods. The objective of this study is to comparatively analyze Google, Twitter, and Wikipedia by examining which best corresponds with Centers for Disease Control and Prevention (CDC) ILI data. It was hypothesized that Wikipedia would best correspond with CDC ILI data, as previous research found it to be least influenced by high media coverage in comparison with Google and Twitter. Publicly available, deidentified data were collected from the CDC, Google Flu Trends, HealthTweets, and Wikipedia for the 2012-2015 influenza seasons. Bayesian change point analysis was used to detect seasonal changes, or change points, in each of the data sources. Change points in Google, Twitter, and Wikipedia that occurred during the exact week, 1 preceding week, or 1 week after the CDC's change points were compared with the CDC data as the gold standard. All analyses were conducted using the R package "bcp" version 4.0.0 in RStudio version 0.99.484 (RStudio Inc). In addition, sensitivity and positive predictive values (PPV) were calculated for Google, Twitter, and Wikipedia. During the 2012-2015 influenza seasons, a high sensitivity of 92% was found for Google, whereas the PPV for
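
The sensitivity and PPV figures reported above follow from matching each source's change points against the CDC's. A sketch under the study's matching rule (a detection counts if it falls within the exact week, the preceding week, or the following week of a CDC change point); the function and variable names are illustrative.

```python
def sensitivity_and_ppv(cdc_points, source_points, window=1):
    """Sensitivity: fraction of CDC change points matched by the Web source.
    PPV: fraction of the source's change points matching some CDC point.
    Weeks are integers; a match is any pair within +/- window weeks."""
    matched_cdc = sum(1 for t in cdc_points
                      if any(abs(t - d) <= window for d in source_points))
    matched_src = sum(1 for d in source_points
                      if any(abs(t - d) <= window for t in cdc_points))
    sens = matched_cdc / len(cdc_points) if cdc_points else 0.0
    ppv = matched_src / len(source_points) if source_points else 0.0
    return sens, ppv
```

With CDC change points at weeks 10 and 30 and a source detecting weeks 11 and 50, one true point is matched and one detection is spurious, giving 50% on both measures.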

  14. Bioinformatic analysis to discover putative drug targets against ...

    African Journals Online (AJOL)

    /

    2012-01-26

    Jan 26, 2012 ... JVIRTUAL GEL. GELBANK was available from the NCBI FTP server. This website incorporates only completed genomes and information pertinent to 2-DE. Link is available at www.gelbank.anl.gov. JVirGel is a software for the simulation and analysis of proteomics data (http://www.jvirgel.de/). The Java TM.

  15. SU-E-T-72: A Retrospective Correlation Analysis On Dose-Volume Control Points and Treatment Outcomes

    Energy Technology Data Exchange (ETDEWEB)

    Roy, A; Nohadani, O [Northwestern University, Evanston, IL (United States)]; Refaat, T; Bacchus, I; Cutright, D; Sathiaseelan, V; Mittal, B [Northwestern University, Chicago, IL (United States)]

    2015-06-15

    Purpose: To quantify correlation between dose-volume control points and treatment outcomes. Specifically, two outcomes are analyzed: occurrence of radiation-induced dysphagia and target complications. The results inform the treatment planning process when competing dose-volume criteria require relaxations. Methods: 32 patients, treated with whole-field sequential intensity modulated radiation therapy during the 2009–2010 period, are considered for this study. Acute dysphagia, categorized into 3 grades, is observed in all patients: 3 patients in grade 1, 17 patients in grade 2, and 12 patients in grade 3. Ordinal logistic regression is employed to establish correlations between grades of dysphagia and dose to the cervico-thoracic esophagus. Particularly, minimum (Dmin), mean (Dmean), and maximum (Dmax) dose control points are analyzed. Additionally, target complication, which includes local-regional recurrence and/or distant metastasis, is observed in 4 patients. Binary logistic regression is used to quantify correlation between target complication and four dose control points, namely the ICRU-recommended dose control points D2, D50, D95, and D98. Results: For correlation with dysphagia, Dmin on the cervico-thoracic esophagus is statistically significant (p-value = 0.005). Additionally, Dmean on the cervico-thoracic esophagus is also significant in association with dysphagia (p-value = 0.012). However, no correlation was observed between Dmax and dysphagia (p-value = 0.263). For target complications, D50 on the target is a statistically significant dose control point (p-value = 0.032). No correlations were observed between treatment complications and D2 (p-value = 0.866), D95 (p-value = 0.750), and D98 (p-value = 0.710) on the target. Conclusion: Significant correlations are observed between radiation-induced dysphagia and Dmean (and Dmin) to the cervico-thoracic esophagus. 
Additionally, correlation between target complications and median dose to target
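
    The binary logistic regression used in this record can be illustrated with a minimal Newton-Raphson fit. This is a generic sketch, not the study's statistical software, and the doses and outcomes below are invented rather than the study's patient data.

```python
import math

# Illustrative sketch (not the study's statistical package): a minimal binary
# logistic regression fitted by Newton-Raphson, of the kind used to relate a
# dose control point (e.g. D50) to a binary complication outcome.

def fit_logistic(x, y, iters=25):
    """Fit P(y=1) = 1/(1+exp(-(b0 + b1*x))) by Newton-Raphson."""
    b0, b1 = 0.0, 0.0
    for _ in range(iters):
        g0 = g1 = h00 = h01 = h11 = 0.0
        for xi, yi in zip(x, y):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * xi)))
            w = p * (1.0 - p)
            g0 += yi - p          # gradient of the log-likelihood
            g1 += (yi - p) * xi
            h00 += w              # (negative) Hessian entries
            h01 += w * xi
            h11 += w * xi * xi
        det = h00 * h11 - h01 * h01
        b0 += (h11 * g0 - h01 * g1) / det
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1

# Invented D50-like doses (Gy), centered for numerical stability, and
# invented complication indicators:
dose = [50.0, 52.0, 55.0, 58.0, 60.0, 63.0, 66.0, 70.0]
mean_dose = sum(dose) / len(dose)
centered = [d - mean_dose for d in dose]
complication = [0, 0, 0, 1, 0, 1, 1, 1]
b0, b1 = fit_logistic(centered, complication)   # b1 > 0: risk rises with dose
```

    A production analysis would report Wald or likelihood-ratio p-values on b1; the sketch only shows the fitting step.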

  16. SU-E-T-72: A Retrospective Correlation Analysis On Dose-Volume Control Points and Treatment Outcomes

    International Nuclear Information System (INIS)

    Roy, A; Nohadani, O; Refaat, T; Bacchus, I; Cutright, D; Sathiaseelan, V; Mittal, B

    2015-01-01

    Purpose: To quantify correlation between dose-volume control points and treatment outcomes. Specifically, two outcomes are analyzed: occurrence of radiation-induced dysphagia and target complications. The results inform the treatment planning process when competing dose-volume criteria require relaxations. Methods: 32 patients, treated with whole-field sequential intensity modulated radiation therapy during the 2009–2010 period, are considered for this study. Acute dysphagia, categorized into 3 grades, is observed in all patients: 3 patients in grade 1, 17 patients in grade 2, and 12 patients in grade 3. Ordinal logistic regression is employed to establish correlations between grades of dysphagia and dose to the cervico-thoracic esophagus. Particularly, minimum (Dmin), mean (Dmean), and maximum (Dmax) dose control points are analyzed. Additionally, target complication, which includes local-regional recurrence and/or distant metastasis, is observed in 4 patients. Binary logistic regression is used to quantify correlation between target complication and four dose control points, namely the ICRU-recommended dose control points D2, D50, D95, and D98. Results: For correlation with dysphagia, Dmin on the cervico-thoracic esophagus is statistically significant (p-value = 0.005). Additionally, Dmean on the cervico-thoracic esophagus is also significant in association with dysphagia (p-value = 0.012). However, no correlation was observed between Dmax and dysphagia (p-value = 0.263). For target complications, D50 on the target is a statistically significant dose control point (p-value = 0.032). No correlations were observed between treatment complications and D2 (p-value = 0.866), D95 (p-value = 0.750), and D98 (p-value = 0.710) on the target. Conclusion: Significant correlations are observed between radiation-induced dysphagia and Dmean (and Dmin) to the cervico-thoracic esophagus. 
Additionally, correlation between target complications and median dose to target

  17. An econometric analysis of the effects of the penalty points system driver's license in Spain.

    Science.gov (United States)

    Castillo-Manzano, José I; Castro-Nuño, Mercedes; Pedregal, Diego J

    2010-07-01

    This article seeks to quantify the effects of the penalty points driver's license system during the 18-month period following its coming into force. This is achieved by means of univariate and multivariate unobserved component models set up in a state space framework and estimated using maximum likelihood. A detailed intervention analysis is carried out in order to test for the effects, and their duration, of the introduction of the penalty points driver's license system in Spain. Other variables, mainly indicators of the level of economic activity in Spain, are also considered. Among the main effects, we can mention an average reduction of almost 12.6% in the number of deaths in highway accidents. It would take at least 2 years for that effect to disappear. For the rest of the safety indicator variables (vehicle occupants injured in highway accidents and vehicle occupants injured in accidents in built-up areas) the effects disappeared 1 year after the law coming into force. Copyright 2010 Elsevier Ltd. All rights reserved.
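
    The intervention-analysis idea can be sketched with a much simpler tool than the paper's unobserved component state-space models: an ordinary least-squares interrupted time series with a trend and a post-intervention level-shift dummy. The monthly fatality series below is simulated, not the Spanish data.

```python
import numpy as np

# Simplified sketch of the intervention idea (the paper uses unobserved
# component state-space models estimated by maximum likelihood; this is a
# plain OLS interrupted time series). Counts below are simulated.

rng = np.random.default_rng(0)
t = np.arange(36)                      # 36 months; the law enters at month 18
law = (t >= 18).astype(float)          # level-shift intervention dummy
deaths = 300 - 0.5 * t - 40 * law + rng.normal(0, 5, t.size)

# Design matrix: intercept, linear trend, and the intervention step.
X = np.column_stack([np.ones_like(t, dtype=float), t, law])
beta, *_ = np.linalg.lstsq(X, deaths, rcond=None)
intercept, trend, law_effect = beta    # law_effect estimates the step change
```

    With the simulated step of -40 deaths per month, the fitted `law_effect` recovers a value near -40; the state-space formulation additionally handles stochastic trends and effect duration, which this sketch does not.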

  18. At what age group blood pressure discontinue to increase? An assessment using change-point analysis

    Directory of Open Access Journals (Sweden)

    Khalib A. Latiff

    2010-05-01

    Full Text Available Aim: To study at what age group blood pressure ceases to increase for women and men. Methods: Applying the change-point technique, we used our existing database - the mega baseline cross-sectional Hulu Langat Health Study, initiated in 2000 - to locate the most appropriate age limit in planning promotive, preventive and controlling strategies against systolic hypertension. Results: Systolic hypertension was found to increase constantly for both genders from an early age until the middle age group. However, women reached the systolic peak 15 years earlier (at 41-45 years old) than men (at 56-60 years old). Systolic blood pressure declined steadily after the peak. Conclusions: For hypertension intervention, we recommend the ages before 40 (women) and 55 (men) as the most appropriate period to apply various public health interventions; after that, the action must be exclusively curative. (Med J Indones 2010; 19:136-41) Keywords: change-point analysis, public health intervention, systolic hypertension

  19. Testing to fulfill HACCP (Hazard Analysis Critical Control Points) requirements: principles and examples.

    Science.gov (United States)

    Gardner, I A

    1997-12-01

    On-farm HACCP (hazard analysis critical control points) monitoring requires cost-effective, yet accurate and reproducible tests that can determine the status of cows, milk, and the dairy environment. Tests need to be field-validated, and their limitations need to be established so that appropriate screening strategies can be initiated and test results can be rationally interpreted. For infections and residues of low prevalence, tests or testing strategies that are highly specific help to minimize false-positive results and excessive costs to the dairy industry. The determination of the numbers of samples to be tested in HACCP monitoring programs depends on the specific purpose of the test and the likely prevalence of the agent or residue at the critical control point. The absence of positive samples from a herd test should not be interpreted as freedom from a particular agent or residue unless the entire herd has been tested with a test that is 100% sensitive. The current lack of field-validated tests for most of the chemical and infectious agents of concern makes it difficult to ensure that the stated goals of HACCP programs are consistently achieved.
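
    The sample-size reasoning in this record - that absence of positives only demonstrates freedom from an agent under known test sensitivity and prevalence assumptions - follows a standard calculation. The sketch below assumes a perfectly specific test and a large-herd (sampling-with-replacement) approximation; it is an illustration, not the record's method.

```python
import math

# Sketch of the standard sample-size reasoning behind HACCP herd monitoring:
# probability of detecting at least one positive among n sampled animals at
# true prevalence p, and the n needed for a target detection probability.
# Assumes a perfectly specific test and a large-herd approximation.

def prob_detect(n, prevalence, test_sensitivity=1.0):
    """P(at least one test-positive among n sampled animals)."""
    p_pos = prevalence * test_sensitivity
    return 1.0 - (1.0 - p_pos) ** n

def samples_needed(prevalence, confidence=0.95, test_sensitivity=1.0):
    """Smallest n with detection probability >= confidence."""
    p_pos = prevalence * test_sensitivity
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_pos))

# Classic result: at 5% prevalence, 59 animals give 95% confidence of
# observing at least one positive with a perfectly sensitive test.
n95 = samples_needed(0.05)
```

    Imperfect sensitivity raises the required n (pass, e.g., `test_sensitivity=0.8`), which is why the record stresses that a negative herd test is not proof of freedom.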

  20. Controlling organic chemical hazards in food manufacturing: a hazard analysis critical control points (HACCP) approach.

    Science.gov (United States)

    Ropkins, K; Beck, A J

    2002-08-01

    Hazard analysis by critical control points (HACCP) is a systematic approach to the identification, assessment and control of hazards. Effective HACCP requires the consideration of all hazards, i.e., chemical, microbiological and physical. However, to-date most 'in-place' HACCP procedures have tended to focus on the control of microbiological and physical food hazards. In general, the chemical component of HACCP procedures is either ignored or limited to applied chemicals, e.g., food additives and pesticides. In this paper we discuss the application of HACCP to a broader range of chemical hazards, using organic chemical contaminants as examples, and the problems that are likely to arise in the food manufacturing sector. Chemical HACCP procedures are likely to result in many of the advantages previously identified for microbiological HACCP procedures: more effective, efficient and economical than conventional end-point-testing methods. However, the high costs of analytical monitoring of chemical contaminants and a limited understanding of formulation and process optimisation as means of controlling chemical contamination of foods are likely to prevent chemical HACCP becoming as effective as microbiological HACCP.

  1. The chaotic points and XRD analysis of Hg-based superconductors

    Energy Technology Data Exchange (ETDEWEB)

    Aslan, Oe [Anatuerkler Educational Consultancy and Trading Company, Orhan Veli Kanik Cad., 6/1, Kavacik 34810 Beykoz, Istanbul (Turkey); Oezdemir, Z Gueven [Physics Department, Yildiz Technical University, Davutpasa Campus, Esenler 34210, Istanbul (Turkey); Keskin, S S [Department of Environmental Eng., University of Marmara, Ziverbey, 34722, Istanbul (Turkey); Onbasli, Ue, E-mail: ozdenaslan@yahoo.co [Physics Department, University of Marmara, Ridvan Pasa Cad. 3. Sok. 85/12 Goztepe, Istanbul (Turkey)

    2009-03-01

    In this article, high T{sub c} mercury based cuprate superconductors with different oxygen doping rates have been examined by means of magnetic susceptibility (magnetization) versus temperature data and X-ray diffraction pattern analysis. The under, optimally and over oxygen doping procedures have been defined from the magnetic susceptibility versus temperature data of the superconducting sample by extracting the Meissner critical transition temperature, T{sub c}, and the paramagnetic Meissner temperature, T{sub PME}, the so-called critical quantum chaos points. Moreover, the optimally oxygen doped samples have been investigated under both a.c. and d.c. magnetic fields. The related a.c. data for virgin (uncut) and cut samples with optimal doping have been obtained under an a.c. magnetic field of 1 Gauss. For the cut sample with the rectangular shape, the chaotic points have been found to occur at 122 and 140 K, respectively. The Meissner critical temperature of 140 K is the new world record for the high temperature oxide superconductors under normal atmospheric pressure. Moreover, the crystallographic lattice parameters of superconducting samples have a crucial importance in calculating the Josephson penetration depth determined by the XRD patterns. From the XRD data obtained for under and optimally doped samples, the crystal symmetries have been found to be in tetragonal structure.

  2. Linear stability analysis of laminar flow near a stagnation point in the slip flow regime

    Science.gov (United States)

    Essaghir, E.; Oubarra, A.; Lahjomri, J.

    2017-12-01

    The aim of the present contribution is to analyze the effect of the slip parameter on the stability of a laminar incompressible flow near a stagnation point in the slip flow regime. The analysis is based on the traditional normal mode approach and assumes the parallel flow approximation. The Orr-Sommerfeld equation that governs the infinitesimal disturbance of the stream function imposed on the steady main flow, which is an exact solution of the Navier-Stokes equation satisfying slip boundary conditions, is obtained by using the powerful spectral Chebyshev collocation method. The results of the effect of slip parameter K on the hydrodynamic characteristics of the base flow, namely the velocity profile, the shear stress profile, the boundary layer, displacement and momentum thicknesses, are illustrated and discussed. The numerical data for these characteristics, as well as those of the eigenvalues and the corresponding wave numbers, recover the results of the special case of no-slip boundary conditions. They are found to be in good agreement with previous numerical calculations. The effects of slip parameter on the neutral curves of stability, for two-dimensional disturbances in the Reynolds-wave number plane, are then obtained for the first time in the slip flow regime for stagnation point flow. Furthermore, the evolution of the critical Reynolds number against the slip parameter is established. The results show that the critical Reynolds number for instability is significantly increased with the slip parameter, and the flow turns out to be more stable when the effect of rarefaction becomes important.
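
    The spectral Chebyshev collocation machinery underlying such an Orr-Sommerfeld solver rests on a differentiation matrix over Chebyshev points. The sketch below builds the standard (Trefethen-style) matrix; powers of it give the higher derivatives the stability eigenproblem needs. It illustrates the discretization only, not the paper's full eigenvalue computation.

```python
import numpy as np

# Sketch of the spectral Chebyshev collocation building block: the standard
# differentiation matrix D on Chebyshev points. D applied to samples of a
# smooth function returns a spectrally accurate derivative; D @ D, etc.,
# build the higher derivatives an Orr-Sommerfeld eigenproblem requires.

def cheb(N):
    """Chebyshev differentiation matrix D and points x (N+1 of each)."""
    if N == 0:
        return np.zeros((1, 1)), np.ones(1)
    x = np.cos(np.pi * np.arange(N + 1) / N)          # Chebyshev points
    c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** np.arange(N + 1)
    dX = x[:, None] - x[None, :]
    D = np.outer(c, 1.0 / c) / (dX + np.eye(N + 1))   # off-diagonal entries
    D -= np.diag(D.sum(axis=1))                       # diagonal: negative row sums
    return D, x

D, x = cheb(16)
# The derivative of x**2 is recovered as 2x to near machine precision:
err = np.max(np.abs(D @ x**2 - 2 * x))
```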

  3. Implementation of hazard analysis and critical control point (HACCP) in dried anchovy production process

    Science.gov (United States)

    Citraresmi, A. D. P.; Wahyuni, E. E.

    2018-03-01

    The aim of this study was to inspect the implementation of Hazard Analysis and Critical Control Point (HACCP) for identification and prevention of potential hazards in the production process of dried anchovy at PT. Kelola Mina Laut (KML), Lobuk unit, Sumenep. A cold storage process is needed in each anchovy processing step in order to maintain its physical and chemical condition. In addition, the implementation of a quality assurance system should be undertaken to maintain product quality. The research was conducted using a survey method, by following the whole process of making anchovy from the receiving of raw materials to the packaging of the final product. The method of data analysis used was the descriptive analysis method. Implementation of HACCP at PT. KML, Lobuk unit, Sumenep was conducted by applying Pre Requisite Programs (PRP) and a preparation stage consisting of 5 initial stages and the 7 principles of HACCP. The results showed that a CCP was found in the boiling process flow, with a significant hazard of Listeria monocytogenes bacteria, and in the final sorting process, with a significant hazard of foreign material contamination in the product. Actions taken were controlling the boiling temperature at 100 – 105°C for 3 - 5 minutes and training for sorting process employees.

  4. Performance Analysis of a Maximum Power Point Tracking Technique using Silver Mean Method

    Directory of Open Access Journals (Sweden)

    Shobha Rani Depuru

    2018-01-01

    Full Text Available The proposed paper presents a simple and particularly efficacious Maximum Power Point Tracking (MPPT) algorithm based on the Silver Mean Method (SMM). This method operates by choosing a search interval from the P-V characteristics of the given solar array and converges to the MPP of the Solar Photo-Voltaic (SPV) system by shrinking its interval. After achieving the maximum power, the algorithm stops shrinking and maintains a constant voltage until the next interval is decided. The tracking capability, efficiency, and performance of the proposed algorithm are validated by the simulation and experimental results with a 100 W solar panel under variable temperature and irradiance conditions. The results obtained confirm that even without any perturbation and observation process, the proposed method still outperforms the traditional perturb and observe (P&O) method by demonstrating far better steady state output, more accuracy and higher efficiency.
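
    The interval-shrinking idea can be sketched as a golden-section-style search whose interior points use the reciprocal of the silver ratio. This is an illustration in the spirit of the SMM, not the paper's exact algorithm, and the P-V curve below is a hypothetical panel model, not measured data.

```python
import math

# Illustrative interval-shrinking search in the spirit of the Silver Mean
# Method: interior points are placed using the reciprocal of the silver
# ratio, 1/(1+sqrt(2)) ~ 0.414, and the bracket around the maximum power
# point is shrunk each iteration. The P-V curve below is hypothetical.

SILVER = 1.0 / (1.0 + math.sqrt(2.0))   # ~0.4142

def silver_section_max(power, v_lo, v_hi, tol=1e-4):
    """Shrink [v_lo, v_hi] around the maximum of a unimodal power(v)."""
    while v_hi - v_lo > tol:
        v1 = v_lo + SILVER * (v_hi - v_lo)   # v1 < v2 since SILVER < 0.5
        v2 = v_hi - SILVER * (v_hi - v_lo)
        if power(v1) > power(v2):
            v_hi = v2        # the maximum cannot lie right of v2
        else:
            v_lo = v1        # the maximum cannot lie left of v1
    return 0.5 * (v_lo + v_hi)

# Hypothetical unimodal P-V curve with its peak near 16 V:
def panel_power(v, v_oc=21.0, i_sc=6.0):
    return v * i_sc * (1.0 - (v / v_oc) ** 8)

v_mpp = silver_section_max(panel_power, 0.0, 21.0)
```

    Unlike P&O, this search never perturbs around the operating point: once the bracket is below tolerance it holds the voltage, matching the "stops shrinking and maintains constant voltage" behavior described above.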

  5. Measurement and analysis of pressure tube elongation in the Douglas Point reactor

    International Nuclear Information System (INIS)

    Causey, A.R.; MacEwan, S.R.; Jamieson, H.C.; Mitchell, A.B.

    1980-02-01

    Elongations of zirconium alloy pressure tubes in CANDU reactors, which occur as a result of neutron-irradiation-induced creep and growth, have been measured over the past 6 years, and the consequences of these elongations have recently been analysed. Elongation rates, previously deduced from extensive measurements of elongations of cold-worked Zircaloy-2 pressure tubes in the Pickering reactors, have been modified to apply to the pressure tubes in the Douglas Point (DP) reactor by taking into account measured differences in texture and dislocation density. Using these elongation rates, and structural data unique to the DP reactor, the analysis predicts elongation behaviour which is in good agreement with pressure tube elongations measured during the ten years of reactor operation. (Auth)

  6. Ergodic Capacity Analysis of Free-Space Optical Links with Nonzero Boresight Pointing Errors

    KAUST Repository

    Ansari, Imran Shafique

    2015-04-01

    A unified capacity analysis of a free-space optical (FSO) link that accounts for nonzero boresight pointing errors and both types of detection techniques (i.e. intensity modulation/direct detection as well as heterodyne detection) is addressed in this work. More specifically, an exact closed-form expression for the moments of the end-to-end signal-to-noise ratio (SNR) of a single link FSO transmission system is presented in terms of well-known elementary functions. Capitalizing on these new moment expressions, we present approximate and simple closed-form results for the ergodic capacity at high and low SNR regimes. All the presented results are verified via computer-based Monte-Carlo simulations.

  7. Indian Point Nuclear Power Station: verification analysis of County Radiological Emergency-Response Plans

    International Nuclear Information System (INIS)

    Nagle, J.; Whitfield, R.

    1983-05-01

    This report was developed as a management tool for use by the Federal Emergency Management Agency (FEMA) Region II staff. The analysis summarized in this report was undertaken to verify the extent to which procedures, training programs, and resources set forth in the County Radiological Emergency Response Plans (CRERPs) for Orange, Putnam, and Westchester counties in New York had been realized prior to the March 9, 1983, exercise of the Indian Point Nuclear Power Station near Buchanan, New York. To this end, a telephone survey of county emergency response organizations was conducted between January 19 and February 22, 1983. This report presents the results of responses obtained from this survey of county emergency response organizations

  8. Simplified Probabilistic Analysis of Settlement of Cyclically Loaded Soil Stratum by Point Estimate Method

    Science.gov (United States)

    Przewłócki, Jarosław; Górski, Jarosław; Świdziński, Waldemar

    2016-12-01

    The paper deals with the probabilistic analysis of the settlement of a non-cohesive soil layer subjected to cyclic loading. Originally, the settlement assessment is based on a deterministic compaction model, which requires integration of a set of differential equations. However, with the use of the Bessel functions, the settlement of a soil stratum can be calculated by a simplified algorithm. The compaction model parameters were determined for soil samples taken from subsoil near the Izmit Bay, Turkey. The computations were performed for various sets of random variables. The point estimate method was applied, and the results were verified by the Monte Carlo method. The outcome leads to a conclusion that can be useful in the prediction of soil settlement under seismic loading.
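
    One common variant of the point estimate method used above is Rosenblueth's two-point scheme: evaluate the model at all 2^n combinations of mean ± standard deviation and recover approximate output moments. The sketch below assumes uncorrelated inputs and uses a stand-in model function, not the paper's compaction model.

```python
from itertools import product
import math

# Minimal sketch of Rosenblueth's two-point estimate method for uncorrelated
# inputs: evaluate the model at the 2^n combinations of (mean +/- std dev),
# weight each point by 1/2^n, and recover approximate output moments. The
# model g() here is a stand-in, not the paper's settlement model.

def point_estimate(g, means, stds):
    """Return (mean, std) of g(x1,...,xn) by Rosenblueth's 2^n-point method."""
    n = len(means)
    weight = 1.0 / 2 ** n
    m1 = m2 = 0.0
    for signs in product((-1.0, 1.0), repeat=n):
        x = [m + s * sd for m, s, sd in zip(means, signs, stds)]
        val = g(*x)
        m1 += weight * val
        m2 += weight * val * val
    return m1, math.sqrt(max(m2 - m1 * m1, 0.0))

# For a linear model the method is exact: g = 2*X + 3*Y with X~(1, 2) and
# Y~(2, 1) has mean 2*1 + 3*2 = 8 and std sqrt(4*4 + 9*1) = 5.
mu, sigma = point_estimate(lambda x, y: 2 * x + 3 * y, [1.0, 2.0], [2.0, 1.0])
```

    The appeal in reliability work, as in the paper, is that 2^n model runs replace the thousands a Monte Carlo verification needs, at the cost of capturing only low-order moments.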

  9. 'Fixed point' QCD analysis of the CCFR data on deep inelastic neutrino-nucleon scattering

    International Nuclear Information System (INIS)

    Sidorov, A.V.; Stamenov, D.B.

    1995-01-01

    The results of a LO fixed point QCD (FP-QCD) analysis of the CCFR data for the nucleon structure function xF_3(x, Q^2) are presented. The predictions of FP-QCD, in which α_S(Q^2) tends to a nonzero coupling constant α_0 as Q^2 → ∞, are in good agreement with the data. The description of the data is even better than that in the case of LO QCD. The FP-QCD parameter α_0 is determined with good accuracy: α_0 = 0.198 ± 0.009. Having in mind the recent QCD fits to the same data, we conclude that despite the high precision and large (x, Q^2) kinematic range of the CCFR data, they cannot discriminate between QCD and FP-QCD predictions for xF_3(x, Q^2). 11 refs., 1 tab

  10. Comparative analysis of different configurations of PLC-based safety systems from reliability point of view

    Science.gov (United States)

    Tapia, Moiez A.

    1993-01-01

    A comparative analysis of distinct multiplex and fault-tolerant configurations for a PLC-based safety system from a reliability point of view is presented. It considers simplex, duplex and fault-tolerant triple redundancy configurations. The standby unit in the duplex configuration has a failure rate which is k times the failure rate of the main unit, with the value of k varying from 0 to 1. For distinct values of MTTR and MTTF of the main unit, the MTBF and availability for these configurations are calculated. The effect of duplexing only the PLC module, or only the sensors and the actuators module, on the MTBF of the configuration is also presented. The results are summarized, and the merits and demerits of various configurations under distinct environments are discussed.
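
    The availability comparison at the heart of such an analysis can be sketched with a simple independence approximation (a full treatment, like the paper's, would use a Markov model with the standby factor k). The MTTF/MTTR numbers below are hypothetical.

```python
# Back-of-the-envelope sketch (independence approximation, not the Markov
# model a detailed analysis would use): steady-state availability of a
# simplex unit versus a 1-out-of-2 duplex arrangement, from MTTF and MTTR.
# The numbers are hypothetical.

def availability(mttf, mttr):
    """Steady-state availability of a single repairable unit."""
    return mttf / (mttf + mttr)

def duplex_availability(mttf, mttr):
    """1-out-of-2 system, assuming independent failure and repair."""
    a = availability(mttf, mttr)
    return 1.0 - (1.0 - a) ** 2

# Hypothetical unit: MTTF = 10,000 h, MTTR = 8 h.
a_simplex = availability(10_000.0, 8.0)
a_duplex = duplex_availability(10_000.0, 8.0)
```

    Even this crude model shows why duplexing the weakest module pays off: unavailability drops from roughly 8e-4 to under 1e-6.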

  11. Multivariate analysis and extraction of parameters in resistive RAMs using the Quantum Point Contact model

    Science.gov (United States)

    Roldán, J. B.; Miranda, E.; González-Cordero, G.; García-Fernández, P.; Romero-Zaliz, R.; González-Rodelas, P.; Aguilera, A. M.; González, M. B.; Jiménez-Molinos, F.

    2018-01-01

    A multivariate analysis of the parameters that characterize the reset process in Resistive Random Access Memory (RRAM) has been performed. The different correlations obtained can help to shed light on the current components that contribute to the Low Resistance State (LRS) of the technology considered. In addition, a screening method for the Quantum Point Contact (QPC) current component is presented. For this purpose, the second derivative of the current has been obtained using a novel numerical method which allows determining the QPC model parameters. Once the procedure is completed, a whole Resistive Switching (RS) series of thousands of curves is studied by means of a genetic algorithm. The extracted QPC parameter distributions are characterized in depth to get information about the filamentary pathways associated with LRS in the low voltage conduction regime.
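
    The differentiation step the record refers to can be illustrated with plain central differences applied twice; this is a generic sketch on a smooth synthetic curve, not the authors' novel numerical method, and real noisy I-V data would need smoothing first.

```python
import numpy as np

# Generic sketch of estimating the second derivative of an I-V curve (here a
# smooth synthetic stand-in) with central differences via np.gradient applied
# twice. This illustrates the differentiation step only; measured RRAM data
# would require smoothing before differentiation.

v = np.linspace(0.0, 1.0, 201)
i = v ** 3                      # synthetic "current"; d2i/dv2 = 6v

di_dv = np.gradient(i, v)       # second-order accurate central differences
d2i_dv2 = np.gradient(di_dv, v)

# Interior points track the analytic second derivative 6v closely:
max_err = np.max(np.abs(d2i_dv2[5:-5] - 6 * v[5:-5]))
```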

  12. Mode analysis of heuristic behavior of searching for multimodal optimum point

    Energy Technology Data Exchange (ETDEWEB)

    Kamei, K; Araki, Y; Inoue, K

    1982-01-01

    Describes an experimental study of the heuristic behavior of searching for the global optimum (maximum) point of a two-dimensional, multimodal, nonlinear and unknown function. First, the authors define three modes dealing with the trial purposes, called the purpose modes, and show the heuristic search behaviors expressed by the purpose modes which the human subjects select in the search experiments. Second, the authors classify the heuristic search behaviors into three types according to the mode transitions and extract eight states of search which cause the mode transitions. Third, a model of the heuristic search behavior is composed of the eight mode transitions. The analysis of the heuristic search behaviors by use of the purpose modes plays an important role in heuristic search techniques. 6 references.

  13. Readiness to implement Hazard Analysis and Critical Control Point (HACCP) systems in Iowa schools.

    Science.gov (United States)

    Henroid, Daniel; Sneed, Jeannie

    2004-02-01

    To evaluate current food-handling practices, food safety prerequisite programs, and employee knowledge and food safety attitudes and provide baseline data for implementing Hazard Analysis and Critical Control Point (HACCP) systems in school foodservice. One member of the research team visited each school to observe food-handling practices and assess prerequisite programs using a structured observation form. A questionnaire was used to determine employees' attitudes, knowledge, and demographic information. A convenience sample of 40 Iowa schools was recruited with input from the Iowa Department of Education. Descriptive statistics were used to summarize data. One-way analysis of variance was used to assess differences in attitudes and food safety knowledge among managers, cooks, and other foodservice employees. Multiple linear regression assessed the relationship between manager and school district demographics and the food safety practice score. Proper food-handling practices were not being followed in many schools and prerequisite food safety programs for HACCP were found to be inadequate for many school foodservice operations. School foodservice employees were found to have a significant amount of food safety knowledge (15.9±2.4 out of 20 possible points). School districts with managers (P=.019) and employees (P=.030) who had a food handler certificate were found to have higher food safety practice scores. Emphasis on implementing prerequisite programs in preparation for HACCP is needed in school foodservice. Training programs, both basic food safety such as ServSafe and HACCP, will support improvement of food-handling practices and implementation of prerequisite programs and HACCP.

  14. Emergency medical technician-performed point-of-care blood analysis using the capillary blood obtained from skin puncture.

    Science.gov (United States)

    Kim, Changsun; Kim, Hansol

    2017-12-09

    Comparing a point-of-care (POC) test using capillary blood obtained from skin puncture with conventional laboratory tests. In this study, which was conducted at the emergency department of a tertiary care hospital in April-July 2017, 232 patients were enrolled, and three types of blood samples (capillary blood from skin puncture, arterial and venous blood from blood vessel puncture) were simultaneously collected. Each blood sample was analyzed using a POC analyzer (epoc® system, USA), an arterial blood gas analyzer (pHOx®Ultra, Nova Biomedical, USA) and venous blood analyzers (AU5800, DxH2401, Beckman Coulter, USA). Twelve parameters were compared between the epoc and reference analyzers, with an equivalence test, Bland-Altman plot analysis and linear regression employed to show the agreement or correlation between the two methods. The pH, HCO3-, Ca2+, Na+, K+, Cl-, glucose, Hb and Hct measured by the epoc were equivalent to the reference values (95% confidence interval of mean difference within the range of the agreement target) with clinically inconsequential mean differences and narrow limits of agreement. All of them, except pH, had clinically acceptable agreement between the two methods (results within target value ≥80%). Of the remaining three parameters (pCO2, pO2 and lactate), the epoc pCO2 and lactate values were highly correlated with the reference device values, whereas pO2 was not (pCO2: R^2 = 0.824, y = -1.411 + 0.877·x; lactate: R^2 = 0.902, y = -0.544 + 0.966·x; pO2: R^2 = 0.037, y = 61.6 + 0.431·x). Most parameters, except only pO2, measured by the epoc were equivalent to or correlated with those from the reference method. Copyright © 2017 Elsevier Inc. All rights reserved.
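
    The Bland-Altman agreement statistics used in this record reduce to a bias (mean difference) and 95% limits of agreement. The paired readings below are invented, not the study's measurements.

```python
import numpy as np

# Minimal Bland-Altman sketch of the agreement analysis described above:
# bias (mean paired difference) and 95% limits of agreement between a POC
# device and a reference analyzer. The paired readings are invented.

def bland_altman(poc, ref):
    """Return (bias, lower_loa, upper_loa) for paired measurements."""
    poc, ref = np.asarray(poc, float), np.asarray(ref, float)
    diff = poc - ref
    bias = diff.mean()
    sd = diff.std(ddof=1)                 # sample SD of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Invented paired glucose readings (mg/dL):
poc = [98, 105, 112, 90, 130, 101, 118, 95]
ref = [100, 103, 115, 92, 128, 100, 120, 96]
bias, lo, hi = bland_altman(poc, ref)
```

    Agreement is judged by whether the limits of agreement fall inside a clinically acceptable band for the analyte, which is how the record's "agreement target" criterion works.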

  15. Integrative Analysis of CRISPR/Cas9 Target Sites in the Human HBB Gene

    Directory of Open Access Journals (Sweden)

    Yumei Luo

    2015-01-01

    Full Text Available Recently, the clustered regularly interspaced short palindromic repeats (CRISPR) system has emerged as a powerful customizable artificial nuclease to facilitate precise genetic correction for tissue regeneration and isogenic disease modeling. However, previous studies have reported substantial off-target activity of the CRISPR system in human cells, and the enormous number of putative off-target sites is labor-intensive to validate experimentally, thus motivating bioinformatics methods for rational design of the CRISPR system and prediction of its potential off-target effects. Here, we describe an integrative analytical process to identify specific CRISPR target sites in the human β-globin gene (HBB) and predict their off-target effects. Our method includes off-target analysis in both coding and noncoding regions, which was neglected by previous studies. It was found that the CRISPR target sites in the introns have fewer off-target sites in the coding regions than those in the exons. Remarkably, target sites containing certain transcription factor motifs have enriched binding sites of the relevant transcription factors in their off-target sets. We also found that the intron sites have fewer SNPs, which leads to less variation of CRISPR efficiency in different individuals during clinical applications. Our studies provide a standard analytical procedure to select specific CRISPR targets for genetic correction.
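
    The core of off-target enumeration can be illustrated with a toy scan: slide along a sequence, require an NGG PAM after each 20-nt candidate protospacer, and count mismatches against the guide. Real pipelines like the one described use genome-wide alignment and scoring; the guide and "genome" below are invented.

```python
# Toy sketch of the off-target screening idea: scan a sequence for 20-nt
# protospacers followed by an NGG PAM and count mismatches against the guide.
# Real pipelines are genome-wide and far more involved; sequences are invented.

def find_offtargets(genome, guide, max_mismatches=3):
    """Return (position, mismatches) for candidate sites with an NGG PAM."""
    hits = []
    glen = len(guide)
    for i in range(len(genome) - glen - 2):
        pam = genome[i + glen: i + glen + 3]
        if pam[1:3] != "GG":        # SpCas9 requires an NGG PAM
            continue
        site = genome[i: i + glen]
        mm = sum(a != b for a, b in zip(site, guide))
        if mm <= max_mismatches:
            hits.append((i, mm))
    return hits

guide = "GTAACGGCAGACTTCTCCTC"      # invented 20-nt guide
# Invented sequence containing a perfect site and a 1-mismatch site:
genome = "TT" + guide + "AGGA" + "GTAACGGCAGACTTCTCCAC" + "TGGC"
hits = find_offtargets(genome, guide)
```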

  16. Targeted next-generation sequencing analysis identifies novel mutations in families with severe familial exudative vitreoretinopathy

    Science.gov (United States)

    Huang, Xiao-Yan; Zhuang, Hong; Wu, Ji-Hong; Li, Jian-Kang; Hu, Fang-Yuan; Zheng, Yu; Tellier, Laurent Christian Asker M.; Zhang, Sheng-Hai; Gao, Feng-Juan; Zhang, Jian-Guo

    2017-01-01

    Purpose Familial exudative vitreoretinopathy (FEVR) is a genetically and clinically heterogeneous disease, characterized by failure of vascular development of the peripheral retina. The symptoms of FEVR vary widely among patients in the same family, and even between the two eyes of a given patient. This study was designed to identify the genetic defect in a patient cohort of ten Chinese families with a definitive diagnosis of FEVR. Methods To identify the causative gene, next-generation sequencing (NGS)-based target capture sequencing was performed. Segregation analysis of the candidate variant was performed in additional family members by using Sanger sequencing and quantitative real-time PCR (QPCR). Results Of the cohort of ten FEVR families, six pathogenic variants were identified, including four novel and two known heterozygous mutations. Of the variants identified, four were missense variants, and two were novel heterozygous deletion mutations [LRP5, c.4053 DelC (p.Ile1351IlefsX88); TSPAN12, EX8Del]. The two novel heterozygous deletion mutations were not observed in the control subjects and could give rise to a relatively severe FEVR phenotype, which could be explained by the protein function prediction. Conclusions We identified two novel heterozygous deletion mutations [LRP5, c.4053 DelC (p.Ile1351IlefsX88); TSPAN12, EX8Del] using targeted NGS as a causative mutation for FEVR. These genetic deletion variations exhibit a severe form of FEVR, with tractional retinal detachments compared with other known point mutations. The data further enrich the mutation spectrum of FEVR and enhance our understanding of genotype–phenotype correlations to provide useful information for disease diagnosis, prognosis, and effective genetic counseling. PMID:28867931

  17. Error analysis of dimensionless scaling experiments with multiple points using linear regression

    International Nuclear Information System (INIS)

    Guercan, Oe.D.; Vermare, L.; Hennequin, P.; Bourdelle, C.

    2010-01-01

    A general method of error estimation in the case of multiple point dimensionless scaling experiments, using linear regression and standard error propagation, is proposed. The method reduces to the previous result of Cordey (2009 Nucl. Fusion 49 052001) in the case of a two-point scan. On the other hand, if the points follow a linear trend, it explains how the estimated error decreases as more points are added to the scan. Based on the analytical expression that is derived, it is argued that for a low number of points, adding points to the ends of the scanned range, rather than the middle, results in a smaller error estimate. (letter)
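    The claim that end-loaded scans give a smaller error estimate than mid-loaded scans follows from the standard variance of an ordinary-least-squares slope; a minimal numerical sketch (not the paper's derivation, just the textbook OLS formula):

```python
import numpy as np

def slope_stderr(x, sigma=1.0):
    # standard error of an OLS slope under homoscedastic noise sigma:
    # SE(b) = sigma / sqrt(sum((x_i - mean(x))^2))
    x = np.asarray(x, dtype=float)
    return sigma / np.sqrt(np.sum((x - x.mean()) ** 2))

# a four-point scan over a dimensionless range [0, 1]
base = [0.0, 1/3, 2/3, 1.0]
se_end = slope_stderr(base + [1.0])  # fifth point added at an end of the range
se_mid = slope_stderr(base + [0.5])  # fifth point added in the middle
# se_end < se_mid: the end-loaded scan constrains the fitted slope more tightly
```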

  18. Cost Minimization Analysis of Hypnotic Drug: Target Controlled Inhalation Anesthesia (TCIA) Sevoflurane and Target Controlled Infusion (TCI) Propofol

    Directory of Open Access Journals (Sweden)

    Made Wiryana

    2016-09-01

    Full Text Available Background: Cost minimization analysis is a pharmaco-economic study used to compare two or more health interventions that have been shown to have the same, similar, or equivalent effect. With the limited health insurance budget from the Indonesian National Social Security System implementation in 2015, quality control and drug cost are two important points of focus. The application of pharmaco-economic studies results in the selection and use of drugs more effectively and efficiently. Objective: To perform a cost minimization analysis of hypnotic drugs between target controlled inhalation anesthesia (TCIA) with sevoflurane and target controlled infusion (TCI) with propofol in patients undergoing major oncologic surgery in Sanglah General Hospital. Methods: Sixty ASA physical status I-II patients undergoing major oncologic surgery were divided into two groups. Group A used TCIA sevoflurane and group B used TCI propofol. The bispectral index (BIS) monitor was used to evaluate the depth of anesthesia. The statistical tests used were the Shapiro-Wilk test, Levene test, Mann-Whitney U test, and unpaired t-test (α = 0.05). Data analysis used the Statistical Package for the Social Sciences (SPSS) for Windows. Results: In this study, the rate of drug use per unit time in group A was 0.12 ml sevoflurane per minute (±0.03) and in group B was 7.25 mg propofol per minute (±0.98). The total cost of hypnotic drug in group A was IDR598.43 (IQR 112.47) per minute; in group B it was IDR703.27 (IQR 156.73) per minute (p > 0.05). Conclusions: There was no statistically significant difference in the cost minimization analysis of hypnotic drugs in major oncologic surgery using TCIA sevoflurane and TCI propofol.

  19. Single cell analysis of G1 checkpoints: the relationship between the restriction point and phosphorylation of pRb

    International Nuclear Information System (INIS)

    Martinsson, Hanna-Stina; Starborg, Maria; Erlandsson, Fredrik; Zetterberg, Anders

    2005-01-01

    Single cell analysis allows high resolution investigation of temporal relationships between transition events in G1. It has been suggested that phosphorylation of the retinoblastoma tumor suppressor protein (pRb) is the molecular mechanism behind passage through the restriction point (R). We performed a detailed single cell study of the temporal relationship between R and pRb phosphorylation in human fibroblasts using time lapse video-microscopy combined with immunocytochemistry. Four principally different criteria for pRb phosphorylation were used, namely (i) phosphorylation of residues Ser795 and Ser780, (ii) degree of pRb association with the nuclear structure, a property that is closely related to pRb phosphorylation status, (iii) release of the transcription factor E2F-1 from pRb, and (iv) accumulation of cyclin E, which is dependent on phosphorylation of pRb. The analyses of individual cells revealed that passage through R preceded phosphorylation of pRb, which occurs in a gradually increasing proportion of cells in late G1. Our data clearly suggest that pRb phosphorylation is not the molecular mechanism behind passage through R. The restriction point and phosphorylation of pRb thus seem to represent two separate checkpoints in G1.

  20. Can Detectability Analysis Improve the Utility of Point Counts for Temperate Forest Raptors?

    Science.gov (United States)

    Temperate forest breeding raptors are poorly represented in typical point count surveys because these birds are cryptic and typically breed at low densities. In recent years, many new methods for estimating detectability during point counts have been developed, including distanc...

  1. Gas analysis within remote porous targets using LIDAR multi-scatter techniques

    Science.gov (United States)

    Guan, Z. G.; Lewander, M.; Grönlund, R.; Lundberg, H.; Svanberg, S.

    2008-11-01

    Light detection and ranging (LIDAR) experiments are normally pursued for range resolved atmospheric gas measurements or for analysis of solid target surfaces using fluorescence or laser-induced breakdown spectroscopy. In contrast, we now demonstrate the monitoring of free gas enclosed in pores of materials, subject to impinging laser radiation, employing the photons that emerge back to the surface lateral to the injection point after penetrating the medium in heavy multiple scattering processes. The directly reflected light is blocked by a beam stop. The technique presented is a remote version of the newly introduced gas in scattering media absorption spectroscopy (GASMAS) technique, which has so far been pursued with the injection optics and the detector in close contact with the sample. Feasibility measurements of LIDAR-GASMAS on oxygen in polystyrene foam were performed at a distance of 6 m. Multiple-scattering induced delays of the order of 50 ns, which corresponds to 15 m optical path length, were observed. First extensions to a range of 60 m are discussed. Remote observation of gas composition anomalies in snow using differential absorption LIDAR (DIAL) may find application in avalanche victim localization or for leak detection in snow-covered natural gas pipelines. Further, the techniques may be even more useful for short-range, non-intrusive GASMAS measurements, e.g., on packed food products.
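    The conversion between the observed multiple-scattering delay and optical path length is simply path = c·t; a quick check of the figures quoted above (the helper name is ours, not the paper's):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def optical_path_m(delay_ns):
    # optical path length corresponding to a photon time-of-flight delay
    return C * delay_ns * 1e-9

# a ~50 ns multiple-scattering delay corresponds to ~15 m of optical path,
# as quoted in the abstract
path = optical_path_m(50)
```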

  2. Are your hands clean enough for point-of-care electrolyte analysis?

    Science.gov (United States)

    Lam, Hugh S; Chan, Michael H M; Ng, Pak C; Wong, William; Cheung, Robert C K; So, Alan K W; Fok, Tai F; Lam, Christopher W K

    2005-08-01

    To investigate clinically significant analytical interference in point-of-care electrolyte analysis caused by contamination of blood specimens with hand disinfectant. Six different hand hygiene products were added separately to heparinised blood samples in varying amounts as contaminant. The contaminated samples were analysed by three different blood gas and electrolyte analysers for assessing interference on measured whole blood sodium and potassium concentrations. There were significant analytical interferences caused by hand hygiene product contamination that varied depending on the combination of disinfectant and analyser. Small amounts of Microshield Antibacterial Hand Gel contamination caused large increases in measured sodium concentration. This effect was much greater than that of the other five products tested, and started to occur at much lower levels of contamination. There was a trend towards lower sodium results in blood samples contaminated with Hexol Antiseptic Lotion (Hexol), the hand hygiene product that we used initially. Apart from AiE Hand Sanitizer, all the other hand disinfectants, especially Hexol, significantly elevated the measured potassium concentration, particularly when a direct ion-selective electrode method was used for measurement. Hand disinfectant products can significantly interfere with blood electrolyte analysis. Proper precautions must be taken against contamination since the resultant errors can adversely affect the clinical management of patients.

  3. Biospectral analysis of the bladder channel point in chronic low back pain patients

    Science.gov (United States)

    Vidal, Alberto Espinosa; Nava, Juan José Godina; Segura, Miguel Ángel Rodriguez; Bastida, Albino Villegas

    2012-10-01

    Chronic pain is the main cause of disability in people of productive age and is a public health problem that affects both the patient and society. On the other hand, there is no instrument to measure it objectively; it is only estimated using subjective variables. The healthy cells generate a known membrane potential which is part of a network of biologically closed electric circuits still unstudied. A biospectral analysis of a bladder channel point is proposed as a diagnostic method for chronic low back pain patients. Materials and methods: We employed a study group of chronic low back pain patients and a control group of patients without low back pain. The visual analog scale (VAS) was applied to determine the level of pain. Bioelectric variables were measured for 10 seconds and the respective biostatistical analyses were made. Results: Biospectral analysis in the frequency domain shows a depression in the 60-300 Hz frequency range proportional to the chronicity of low back pain compared against healthy patients.

  4. Challenges of teacher-based clinical evaluation from nursing students' point of view: Qualitative content analysis.

    Science.gov (United States)

    Sadeghi, Tabandeh; Seyed Bagheri, Seyed Hamid

    2017-01-01

    Clinical evaluation is very important in the educational system of nursing. One of the most common methods of clinical evaluation is evaluation by the teacher, but the challenges that students face in this evaluation method have not been examined. Thus, this study aimed to explore the experiences and views of nursing students about the challenges of teacher-based clinical evaluation. This study was a descriptive qualitative study with a qualitative content analysis approach. Data were gathered through semi-structured focus group sessions with undergraduate nursing students who were passing their 8th semester at Rafsanjan University of Medical Sciences. Data were analyzed using Graneheim and Lundman's proposed method. Data collection and analysis were concurrent. According to the findings, "factitious evaluation" was the main theme of the study, consisting of three categories: "personal preferences," "unfairness" and "shirking responsibility." These categories are explained using quotes derived from the data. According to the results of this study, teacher-based clinical evaluation can lead to factitious evaluation. Thus, changing this approach of evaluation toward modern methods of evaluation is suggested. The findings can help nursing instructors get a better understanding of the nursing students' point of view toward this evaluation approach and, as a result, plan for changing it.

  5. Temperature calibration procedure for thin film substrates for thermo-ellipsometric analysis using melting point standards

    International Nuclear Information System (INIS)

    Kappert, Emiel J.; Raaijmakers, Michiel J.T.; Ogieglo, Wojciech; Nijmeijer, Arian; Huiskes, Cindy; Benes, Nieck E.

    2015-01-01

    Highlights: • Facile temperature calibration method for thermo-ellipsometric analysis. • The melting point of thin films of indium, lead, zinc, and water can be detected by ellipsometry. • In-situ calibration of ellipsometry hot stage, without using any external equipment. • High-accuracy temperature calibration (±1.3 °C). - Abstract: Precise and accurate temperature control is pertinent to studying thermally activated processes in thin films. Here, we present a calibration method for the substrate–film interface temperature using spectroscopic ellipsometry. The method is adapted from temperature calibration methods that are well developed for thermogravimetric analysis and differential scanning calorimetry instruments, and is based on probing a transition temperature. Indium, lead, and zinc could be spread on a substrate, and the phase transition of these metals could be detected by a change in the Ψ signal of the ellipsometer. For water, the phase transition could be detected by a loss of signal intensity as a result of light scattering by the ice crystals. The combined approach allowed for construction of a linear calibration curve with an accuracy of 1.3 °C or lower over the full temperature range
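    The calibration described above amounts to fitting a line through (observed transition, literature melting point) pairs. A minimal sketch; the literature melting points are standard values, but the observed stage readings below are purely hypothetical:

```python
import numpy as np

# literature melting points, °C (water/ice, indium, lead, zinc)
true_mp  = np.array([0.0, 156.6, 327.5, 419.5])
# hypothetical stage readings at which each transition was detected
observed = np.array([2.1, 160.3, 333.0, 426.1])

# linear calibration curve: stage reading -> substrate-film interface temperature
slope, intercept = np.polyfit(observed, true_mp, 1)

def calibrate(t_observed):
    return slope * t_observed + intercept
```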

  6. Cost analysis of premixed multichamber bags versus compounded parenteral nutrition: breakeven point.

    Science.gov (United States)

    Bozat, Erkut; Korubuk, Gamze; Onar, Pelin; Abbasoglu, Osman

    2014-02-01

    Industrially premixed multichamber bags or hospital-manufactured compounded products can be used for parenteral nutrition. The aim of this study was to compare the cost of these 2 approaches. Costs of compounded parenteral nutrition bags in a university hospital were calculated. A total of 600 bags that were administered during 34 days between December 10, 2009 and February 17, 2010 were included in the analysis. For quality control, specific gravity evaluation of the filled bags was performed. It was calculated that the variable cost of a hospital compounded bag was $26.15. If we take the annual fixed costs into consideration, the production cost reaches $36.09 for each unit. It was estimated that the cost for the corresponding multichamber bag was $37.79. Taking the fixed and the variable costs into account, the breakeven point of the hospital compounded and the premixed multichamber bags was seen at 5,404 units per year. In specific gravity evaluation, it was observed that the mean and interval values were inside the upper and lower control margins. In this analysis, usage of hospital-compounded parenteral nutrition bags showed a cost advantage in hospitals that treat more than 15 patients per day. In small volume hospitals, premixed multichamber bags may be more beneficial.
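    The breakeven arithmetic can be reproduced from the reported per-bag costs; the annual fixed cost below is back-computed from the reported breakeven point, not taken directly from the paper:

```python
variable_compounded = 26.15  # $ per hospital-compounded bag (variable cost)
premixed_cost      = 37.79   # $ per premixed multichamber bag
reported_breakeven = 5404    # bags per year, from the study

# annual fixed cost implied by the reported breakeven point
fixed_annual = reported_breakeven * (premixed_cost - variable_compounded)

def breakeven_units(fixed, premixed, variable):
    # volume at which compounding (fixed + variable*n) equals premixed (premixed*n)
    return fixed / (premixed - variable)

# at 15 patients/day (~5,475 bags/yr) a hospital is above breakeven,
# so compounding shows a cost advantage
annual_bags_15_per_day = 15 * 365
```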

  7. Intraosseous blood samples for point-of-care analysis: agreement between intraosseous and arterial analyses.

    Science.gov (United States)

    Jousi, Milla; Saikko, Simo; Nurmi, Jouni

    2017-09-11

    Point-of-care (POC) testing is highly useful when treating critically ill patients. In case of difficult vascular access, the intraosseous (IO) route is commonly used, and blood is aspirated to confirm the correct position of the IO-needle. Thus, IO blood samples could be easily accessed for POC analyses in emergency situations. The aim of this study was to determine whether IO values agree sufficiently with arterial values to be used for clinical decision making. Two samples of IO blood were drawn from 31 healthy volunteers and compared with arterial samples. The samples were analysed for sodium, potassium, ionized calcium, glucose, haemoglobin, haematocrit, pH, blood gases, base excess, bicarbonate, and lactate using the i-STAT® POC device. Agreement and reliability were estimated by using the Bland-Altman method and intraclass correlation coefficient calculations. Good agreement was evident between the IO and arterial samples for pH, glucose, and lactate. Potassium levels were clearly higher in the IO samples than those from arterial blood. Base excess and bicarbonate were slightly higher, and sodium and ionised calcium values were slightly lower, in the IO samples compared with the arterial values. The blood gases in the IO samples were between arterial and venous values. Haemoglobin and haematocrit showed remarkable variation in agreement. POC diagnostics of IO blood can be a useful tool to guide treatment in critical emergency care. Seeking out the reversible causes of cardiac arrest or assessing the severity of shock are examples of situations in which obtaining vascular access and blood samples can be difficult, though information about the electrolytes, acid-base balance, and lactate could guide clinical decision making. The analysis of IO samples should, however, be limited to situations in which no other option is available, and the results should be interpreted with caution, because there is not yet enough scientific evidence regarding the agreement of IO
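    Agreement in the study was estimated with the Bland-Altman method; a generic sketch of that computation, using illustrative paired values rather than the study's data:

```python
import numpy as np

def bland_altman(a, b):
    # bias (mean difference) and 95% limits of agreement between two methods
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# illustrative paired IO vs arterial lactate values (mmol/L), not study data
io       = [1.1, 0.9, 1.4, 1.0, 1.2]
arterial = [1.0, 0.8, 1.3, 1.1, 1.1]
bias, limits = bland_altman(io, arterial)
```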

  8. Coupling analysis of the target temperature and thermal stress due to pulsed ion beam

    International Nuclear Information System (INIS)

    Yan Jie; Liu Meng; Lin Jufang; An Li; Long Xinggui

    2013-01-01

    Background: Target temperature has an important effect on the target life for a sealed neutron generator without a cooling system. Purpose: To carry out a thermal-mechanical coupling analysis of the film-substrate target bombarded by a pulsed ion beam. Methods: The indirect coupling Finite Element Method (FEM), with a 2-dimensional time-space Gaussian axisymmetric power density as heat source, was used to simulate the target temperature and thermal stress fields. Results: The effects on the target temperature and thermal stress fields under different pulse widths and beam sizes were analyzed in terms of the FEM results. Conclusions: Combining the temperature requirement with the film thermomechanical destruction effect induced by thermal stress in the sealed neutron generator film-substrate targets, an optimized pulsed ion beam working status was proposed. (authors)

  9. Beyond typing and grading: target analysis in individualized therapy as a new challenge for tumour pathology.

    Science.gov (United States)

    Kreipe, Hans H; von Wasielewski, Reinhard

    2007-01-01

    In order to bring about its beneficial effects in oncology, targeted therapy depends on accurate target analysis. Whether cells of a tumour will be sensitive to a specific treatment is predicted by the detection of appropriate targets in cancer tissue by immunohistochemistry or molecular methods. In most instances this is performed by histopathologists. Reliability and reproducibility of tissue-based target analysis in histopathology require novel measures of quality assurance by internal and external controls. As a model for external quality assurance in targeted therapy an annual inter-laboratory trial has been set up in Germany applying tissue arrays with up to 60 mammary cancer samples which are tested by participants for expression of HER2/neu and steroid hormone receptors.

  10. Thermal hydraulics analysis of LIBRA-SP target chamber

    International Nuclear Information System (INIS)

    Mogahed, E.A.

    1996-01-01

    LIBRA-SP is a conceptual design study of an inertially confined 1000 MWe fusion power reactor utilizing self-pinched light ion beams. There are 24 ion beams which are arranged around the reactor cavity. The reaction chamber is an upright cylinder with an inverted conical roof resembling a mushroom, and a pool floor. The vertical sides of the cylinder are occupied by a blanket zone consisting of many perforated rigid HT-9 ferritic steel tubes called PERITs (PErforated RIgid Tube). The breeding/cooling material, liquid lead-lithium, flows through the PERITs, providing protection to the reflector/vacuum chamber so as to make it a lifetime component. The neutronics analysis and cavity hydrodynamics calculations are performed to account for the neutron heating and also to determine the effects of vaporization/condensation processes on the surface heat flux. The steady state nuclear heating distribution at the midplane is used for thermal hydraulics calculations. The maximum surface temperature of the HT-9 is chosen to not exceed 625 °C to avoid drastic deterioration of the metal's mechanical properties. This choice restricts the thermal hydraulics performance of the reaction cavity. The inlet first surface coolant bulk temperature is 370 °C, and the heat exchanger inlet coolant bulk temperature is 502 °C. 4 refs., 6 figs., 2 tabs

  11. Analysis of drug advertising targeted to health professionals

    Directory of Open Access Journals (Sweden)

    Marcela Campos Esqueff Abdalla

    2017-08-01

    Full Text Available The advertising of medicines is the dissemination of the product by the pharmaceutical industry, with emphasis on brand, aiming to promote their prescription and/or purchase. This practice must comply with the legal provisions in effect determined by the Brazilian National Health Surveillance Agency. The present work aimed to analyze advertisements of medicines offered by the industry to health professionals. The capture of advertisements covered physician offices of various specialties, public and private hospitals, and magazines directed at health professionals. The analysis of the collected pieces involved verification of the legibility and visibility of required information, as well as compliance with the health legislation that regulates the promotion and advertising of medicines in Brazil (the agency's resolution n. 96/2008). The results showed that no piece meets the health legislation in full. Most companies employ strategies that hinder access to restricted information on the use of the medicine, such as contraindications, constituting an obstacle to rational use. Indications other than those approved by the agency were also observed, as well as indications for age groups different from those specified in the product registration. There is a clear need for a new, stricter regulatory model that prioritizes, above private interests, what matters most: society. Society must be protected from false and abusive advertising, and the rational use of medicines must be promoted.

  12. Drug target mining and analysis of the Chinese tree shrew for pharmacological testing.

    Directory of Open Access Journals (Sweden)

    Feng Zhao

    Full Text Available The discovery of new drugs requires the development of improved animal models for drug testing. The Chinese tree shrew is considered to be a realistic candidate model. To assess the potential of the Chinese tree shrew for pharmacological testing, we performed drug target prediction and analysis on genomic and transcriptomic scales. Using our pipeline, 3,482 proteins were predicted to be drug targets. Of these predicted targets, 446 and 1,049 proteins with the highest rank and total scores, respectively, included homologs of targets for cancer chemotherapy, depression, age-related decline and cardiovascular disease. Based on comparative analyses, more than half of the drug target proteins identified from the tree shrew genome were shown to have higher similarity to human targets than mouse targets do. Target validation also demonstrated that the constitutive expression of the proteinase-activated receptors of tree shrew platelets is similar to that of human platelets but differs from that of mouse platelets. We developed an effective pipeline and search strategy for drug target prediction and the evaluation of model-based target identification for drug testing. This work provides useful information for future studies of the Chinese tree shrew as a source of novel targets for drug discovery research.

  13. Chopped or long roughage: what do calves prefer? Using cross point analysis of double demand functions.

    Directory of Open Access Journals (Sweden)

    Laura E Webb

    Full Text Available The present study aimed to quantify calves' (Bos taurus) preference for long versus chopped hay and straw, and hay versus straw, using cross point analysis of double demand functions, in a context where energy intake was not a limiting factor. Nine calves, fed milk replacer and concentrate, were trained to work for roughage rewards from two simultaneously available panels. The cost (number of muzzle presses) required on the panels varied in each session (left panel/right panel): 7/35, 14/28, 21/21, 28/14, 35/7. Demand functions were estimated from the proportion of rewards achieved on one panel relative to the total number of rewards achieved in one session. Cross points (cp) were calculated as the cost at which an equal number of rewards was achieved from both panels. The deviation of the cp from the midpoint (here 21) indicates the strength of the preference. Calves showed a preference for long versus chopped hay (cp = 14.5; P = 0.004), and for hay versus straw (cp = 38.9; P = 0.004), both of which improve rumen function. Long hay may stimulate chewing more than chopped hay, and the preference for hay versus straw could be related to hedonic characteristics. No preference was found for chopped versus long straw (cp = 20.8; P = 0.910). These results could be used to improve the welfare of calves in production systems; for example, in systems where calves are fed hay along with high energy concentrate, providing long hay instead of chopped could promote roughage intake, rumen development, and rumination.

  14. Chopped or Long Roughage: What Do Calves Prefer? Using Cross Point Analysis of Double Demand Functions

    Science.gov (United States)

    Webb, Laura E.; Bak Jensen, Margit; Engel, Bas; van Reenen, Cornelis G.; Gerrits, Walter J. J.; de Boer, Imke J. M.; Bokkers, Eddie A. M.

    2014-01-01

    The present study aimed to quantify calves' (Bos taurus) preference for long versus chopped hay and straw, and hay versus straw, using cross point analysis of double demand functions, in a context where energy intake was not a limiting factor. Nine calves, fed milk replacer and concentrate, were trained to work for roughage rewards from two simultaneously available panels. The cost (number of muzzle presses) required on the panels varied in each session (left panel/right panel): 7/35, 14/28, 21/21, 28/14, 35/7. Demand functions were estimated from the proportion of rewards achieved on one panel relative to the total number of rewards achieved in one session. Cross points (cp) were calculated as the cost at which an equal number of rewards was achieved from both panels. The deviation of the cp from the midpoint (here 21) indicates the strength of the preference. Calves showed a preference for long versus chopped hay (cp = 14.5; P = 0.004), and for hay versus straw (cp = 38.9; P = 0.004), both of which improve rumen function. Long hay may stimulate chewing more than chopped hay, and the preference for hay versus straw could be related to hedonic characteristics. No preference was found for chopped versus long straw (cp = 20.8; P = 0.910). These results could be used to improve the welfare of calves in production systems; for example, in systems where calves are fed hay along with high energy concentrate, providing long hay instead of chopped could promote roughage intake, rumen development, and rumination. PMID:24558426
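    The cross point calculation can be sketched as follows: fit the proportion of left-panel rewards against left-panel cost and solve for the cost at which the proportion equals 0.5. The proportions below are hypothetical, not the study's data:

```python
import numpy as np

# left-panel cost in each session (the right-panel cost is 42 minus this)
left_cost = np.array([7, 14, 21, 28, 35])
# hypothetical proportion of rewards earned on the left panel
prop_left = np.array([0.90, 0.75, 0.55, 0.30, 0.10])

slope, intercept = np.polyfit(left_cost, prop_left, 1)
# cross point: cost at which both panels yield equal rewards (proportion = 0.5)
crosspoint = (0.5 - intercept) / slope
# deviation of the cross point from the midpoint (21) measures preference strength
```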

  15. Curie point depth from spectral analysis of aeromagnetic data for geothermal reconnaissance in Afghanistan

    Science.gov (United States)

    Saibi, H.; Aboud, E.; Gottsmann, J.

    2015-11-01

    The geologic setting of Afghanistan has the potential to contain significant mineral, petroleum and geothermal resources. However, much of the country's potential remains unknown due to limited exploration surveys. Here, we present countrywide aeromagnetic data to estimate the Curie point depth (CPD) and to evaluate the geothermal exploration potential. CPD is an isothermal surface at which magnetic minerals lose their magnetization and as such outlines an isotherm of about 580 °C. We use spectral analysis on the aeromagnetic data to estimate the CPD spatial distribution and compare our findings with known geothermal fields in the western part of Afghanistan. The results outline four regions with geothermal potential: 1) Regions of shallow Curie point depths (∼16-21 km) are located in the Helmand basin. 2) Regions of intermediate depths (∼21-27 km) are located in the southern Helmand basin and the Baluchistan area. 3) Regions of great depths (∼25-35 km) are located in the Farad block. 4) Regions of greatest depths (∼35-40 km) are located in the western part of the northern Afghanistan platform. The deduced thermal structure in western Afghanistan relates to the collision of the Eurasian and Indian plates, while the shallow CPDs are related to crustal thinning. This study also shows that the geothermal systems are associated with complex magmatic and tectonic association of major intrusions and fault systems. Our results imply geothermal gradients ranging from 14 °C/km to 36 °C/km and heat-flow values ranging from 36 to 90 mW/m2 for the study area.
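    The quoted geothermal gradients follow directly from dividing the ~580 °C Curie isotherm temperature by the Curie point depth; a quick check of the end-member depths:

```python
CURIE_TEMP_C = 580.0  # approximate Curie temperature of magnetite, °C

def geothermal_gradient(cpd_km, surface_temp_c=0.0):
    # mean geothermal gradient (°C/km) implied by the Curie isotherm at depth cpd_km
    return (CURIE_TEMP_C - surface_temp_c) / cpd_km

shallow = geothermal_gradient(16)  # shallowest CPD (Helmand basin)
deep    = geothermal_gradient(40)  # deepest CPD (northern Afghanistan platform)
```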

  16. Evenly spaced Detrended Fluctuation Analysis: Selecting the number of points for the diffusion plot

    Science.gov (United States)

    Liddy, Joshua J.; Haddad, Jeffrey M.

    2018-02-01

    Detrended Fluctuation Analysis (DFA) has become a widely-used tool to examine the correlation structure of a time series and has provided insights into neuromuscular health and disease states. As the popularity of utilizing DFA in the human behavioral sciences has grown, understanding its limitations and how to properly determine parameters is becoming increasingly important. DFA examines the correlation structure of variability in a time series by computing α, the slope of the log SD versus log n diffusion plot. When using the traditional DFA algorithm, the timescales, n, are often selected as a set of integers between a minimum and maximum length based on the number of data points in the time series. This produces non-uniformly distributed values of n in logarithmic scale, which influences the estimation of α due to a disproportionate weighting of the long-timescale regions of the diffusion plot. Recently, the evenly spaced DFA and evenly spaced average DFA algorithms were introduced. Both algorithms compute α by selecting k points for the diffusion plot based on the minimum and maximum timescales of interest and improve the consistency of α estimates for simulated fractional Gaussian noise and fractional Brownian motion time series. Two issues that remain unaddressed are (1) how to select k and (2) whether the evenly spaced DFA algorithms show similar benefits when assessing human behavioral data. We manipulated k and examined its effects on the accuracy, consistency, and confidence limits of α in simulated and experimental time series. We demonstrate that the accuracy and consistency of α are relatively unaffected by the selection of k. However, the confidence limits of α narrow as k increases, dramatically reducing measurement uncertainty for single trials. We provide guidelines for selecting k and discuss potential uses of the evenly spaced DFA algorithms when assessing human behavioral data.
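    The core of the evenly spaced variant is choosing the k timescales uniformly in log space rather than taking every integer n; a minimal sketch of that selection step (not the full DFA algorithm):

```python
import numpy as np

def evenly_spaced_scales(n_min, n_max, k):
    # k window sizes evenly spaced in log scale, rounded to unique integers,
    # so each region of the log SD vs log n diffusion plot is weighted equally
    scales = np.logspace(np.log10(n_min), np.log10(n_max), num=k)
    return np.unique(np.round(scales).astype(int))

# k = 7 points between n = 16 and n = 1024 gives one scale per octave
scales = evenly_spaced_scales(16, 1024, 7)
```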

  17. Identifying target processes for microbial electrosynthesis by elementary mode analysis.

    Science.gov (United States)

    Kracke, Frauke; Krömer, Jens O

    2014-12-30

    Microbial electrosynthesis and electro fermentation are techniques that aim to optimize microbial production of chemicals and fuels by regulating the cellular redox balance via interaction with electrodes. While the concept is known for decades major knowledge gaps remain, which make it hard to evaluate its biotechnological potential. Here we present an in silico approach to identify beneficial production processes for electro fermentation by elementary mode analysis. Since the fundamentals of electron transport between electrodes and microbes have not been fully uncovered yet, we propose different options and discuss their impact on biomass and product yields. For the first time 20 different valuable products were screened for their potential to show increased yields during anaerobic electrically enhanced fermentation. Surprisingly we found that an increase in product formation by electrical enhancement is not necessarily dependent on the degree of reduction of the product but rather the metabolic pathway it is derived from. We present a variety of beneficial processes with product yield increases of maximal 36% in reductive and 84% in oxidative fermentations and final theoretical product yields up to 100%. This includes compounds that are already produced at industrial scale such as succinic acid, lysine and diaminopentane as well as potential novel bio-commodities such as isoprene, para-hydroxybenzoic acid and para-aminobenzoic acid. Furthermore, it is shown that the way of electron transport has major impact on achievable biomass and product yields. The coupling of electron transport to energy conservation could be identified as crucial for most processes. This study introduces a powerful tool to determine beneficial substrate and product combinations for electro-fermentation. It also highlights that the maximal yield achievable by bio electrochemical techniques depends strongly on the actual electron transport mechanisms. 
Therefore it is of great importance to
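    The yield comparison described above can be caricatured with a small flux-balance linear program rather than full elementary mode analysis: a hypothetical four-reaction network (stoichiometry invented for illustration, not taken from the paper) shows how an extra cathodic electron-supply reaction raises the maximum product yield per unit substrate.

```python
import numpy as np
from scipy.optimize import linprog

# Toy anaerobic network (hypothetical stoichiometry, for illustration only):
#   R1: Glc -> 2 Pyr + 2 NADH   (glycolysis, lumped)
#   R2: Pyr + 2 NADH -> Succ    (reductive product branch, lumped)
#   R3: electrode -> NADH       (cathodic electron supply)
#   R4: Pyr -> Ace              (overflow byproduct)
# Internal metabolites held at steady state: Pyr, NADH.
S = np.array([
    [2, -1, 0, -1],   # Pyr balance
    [2, -2, 1,  0],   # NADH balance
])

def max_succinate_yield(allow_electrode):
    # fix glucose uptake v1 = 1 and maximize v2 (succinate per glucose)
    ub_e = None if allow_electrode else 0.0
    res = linprog(c=[0, -1, 0, 0],
                  A_eq=S, b_eq=[0, 0],
                  bounds=[(1, 1), (0, None), (0, ub_e), (0, None)])
    return -res.fun

print(max_succinate_yield(False))  # plain fermentation
print(max_succinate_yield(True))   # electrically enhanced
```

    In this toy model the electrode supply doubles the succinate yield because NADH, not carbon, is limiting; the 36%/84% figures in the abstract come from realistic networks and electron-transport assumptions, not from this sketch.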

  18. Analysis of step-up transformer tap change on the quantities at the point of connection to transmission grid

    Directory of Open Access Journals (Sweden)

    Đorđević Dragan

    2017-01-01

    Full Text Available The analysis of the effect of a step-up transformer tap change on the quantities at the point of connection to the transmission grid is presented in this paper. The point of connection of generator TENT A6 has been analyzed; a detailed model of this generator is available in the software package DIgSILENT PowerFactory. The effect of a step-up transformer tap change on the quantities at the point of connection has been compared for automatic and manual operation of the voltage regulator. To analyze manual operation of the voltage regulator, different methods of modeling this mode were compared. Several generator operating points, selected to represent the need for a tap change, have been analyzed. All of the above analyses have also been performed taking into account the voltage-reactive stiffness at the point of connection.
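    As a rough illustration of why a tap change shifts the quantities at the point of connection, consider a lossless two-bus model: a fixed generator terminal voltage behind an ideal tap-changing transformer and a series reactance, tied to an infinite bus. All numbers, including the reactance, are invented per-unit values, and the sketch ignores the voltage-regulator dynamics and network stiffness the paper actually studies.

```python
import math

def connection_point_q(vg_pu, tap, p_pu, x_pu=0.15, vs_pu=1.0):
    """Reactive power injected into the grid at the point of connection.

    Generator terminal voltage vg_pu behind an ideal tap-changing
    step-up transformer (ratio `tap`) and lumped series reactance x_pu,
    tied to an infinite bus vs_pu, delivering active power p_pu.
    """
    e = vg_pu * tap                                # HV-side internal voltage
    delta = math.asin(p_pu * x_pu / (e * vs_pu))   # load angle for given P
    return (e * vs_pu * math.cos(delta) - vs_pu ** 2) / x_pu

# raising the tap (with terminal voltage held by the AVR) pushes
# reactive power toward the grid; lowering it absorbs reactive power
for tap in (0.95, 1.00, 1.05):
    print(tap, round(connection_point_q(1.0, tap, 0.8), 3))
```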

  19. SCAP-82, Single Scattering, Albedo Scattering, Point-Kernel Analysis in Complex Geometry

    International Nuclear Information System (INIS)

    Disney, R.K.; Vogtman, S.E.

    1987-01-01

    1 - Description of problem or function: SCAP solves for radiation transport in complex geometries using the single-scatter or albedo-scatter point kernel method. The program is designed to calculate the neutron or gamma-ray radiation level at detector points located within or outside a complex radiation scatter source geometry or a user-specified discrete scattering volume. Geometry is describable by zones bounded by intersecting quadratic surfaces, with an arbitrary maximum number of boundary surfaces per zone. Anisotropic point sources are describable as pointwise energy-dependent distributions of polar angles on a meridian; isotropic point sources may also be specified. The attenuation function for gamma rays is an exponential function on the primary source leg and the scatter leg, with a build-up factor approximation to account for multiple scatter on the scatter leg. The neutron attenuation function is an exponential function using neutron removal cross sections on the primary source leg and scatter leg. Line or volumetric sources can be represented as a distribution of isotropic point sources, with uncollided line-of-sight attenuation and buildup calculated between each source point and the detector point. 2 - Method of solution: A point kernel method with an anisotropic or isotropic point source representation is used; line-of-sight material attenuation and inverse-square spatial attenuation are applied between the source point and the scatter points, and between the scatter points and the detector point. A direct summation of individual point source results is obtained. 3 - Restrictions on the complexity of the problem: The SCAP program is written with completely flexible dimensioning, so that no restrictions are imposed on the number of energy groups or geometric zones. The geometric zone description is restricted to zones defined by boundary surfaces given by the general quadratic equation or one of its degenerate forms.
The only restriction in the program is that the total
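    The uncollided line-of-sight kernel described above (exponential material attenuation, inverse-square spatial attenuation, multiplicative build-up factor) can be sketched as follows; the linear build-up form and all numeric inputs are placeholders for illustration, not values or code from SCAP.

```python
import math

def point_kernel_flux(source_strength, mu, thickness, r,
                      buildup=lambda mu_t: 1.0 + mu_t):
    """Uncollided line-of-sight point-kernel flux with a build-up factor.

    source_strength: photons/s emitted by an isotropic point source
    mu: linear attenuation coefficient (1/cm) of the intervening material
    thickness: material path length (cm) along the source-detector leg
    r: total source-detector distance (cm)
    buildup: build-up factor B(mu*t); a crude linear form stands in
             for tabulated build-up data here.
    Returns flux in photons/(cm^2 s).
    """
    mu_t = mu * thickness
    return source_strength * buildup(mu_t) * math.exp(-mu_t) / (4 * math.pi * r ** 2)

# flux 1 m from a 1e9 photon/s source behind 5 cm of material with mu = 0.2/cm
print(point_kernel_flux(1e9, 0.2, 5.0, 100.0))
```

    A scatter-leg calculation repeats the same kernel from each scatter point to the detector and sums the contributions, which is the direct summation the program description refers to.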

  20. Quantitative Analysis of VIIRS DNB Nightlight Point Source for Light Power Estimation and Stability Monitoring

    Directory of Open Access Journals (Sweden)

    Changyong Cao

    2014-12-01

    Full Text Available The high sensitivity and advanced onboard calibration of the Visible Infrared Imaging Radiometer Suite (VIIRS) Day/Night Band (DNB) enable accurate measurements of low-light radiances, which lead to enhanced quantitative applications at night. The finer spatial resolution of the DNB also allows users to examine socioeconomic activities at urban scales. Given the growing interest in the use of DNB data, there is a pressing need for a better understanding of the calibration stability and absolute accuracy of the DNB at low radiances. The low-light calibration accuracy was previously estimated at a moderate 15% using extended sources, while the long-term stability has yet to be characterized. There are also several science-related questions to be answered: for example, how the Earth's atmosphere and surface variability contribute to the stability of the DNB-measured radiances; how to separate these contributions from instrument calibration stability; whether or not SI (International System of Units) traceable active light sources can be designed and installed at selected sites to monitor the calibration stability, radiometric and geolocation accuracy, and point spread functions of the DNB; and, furthermore, whether or not such active light sources can be used for detecting environmental changes, such as aerosols. This paper explores the quantitative analysis of nightlight point sources, such as those from fishing vessels, bridges and cities, using fundamental radiometry and radiative transfer, which would be useful for a number of applications, including search and rescue in severe weather events as well as calibration/validation of the DNB. Time series of the bridge light data are used to assess the stability of the light measurements and the calibration of the VIIRS DNB. It was found that the light radiant power computed from the VIIRS DNB data matched relatively well with independent assessments based on the in situ light installations, although estimates have to be
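    A back-of-envelope version of the radiant-power estimate for an unresolved night light might look like the following; the isotropic-hemisphere assumption, the fixed one-way transmittance and the sample radiance are all simplifications chosen for illustration and are not taken from the paper.

```python
import math

def point_source_power(radiance, pixel_area_m2, transmittance=0.8):
    """Crude radiant power of an unresolved night light seen by the DNB.

    radiance: at-sensor radiance in W m^-2 sr^-1 averaged over the pixel
    pixel_area_m2: DNB pixel footprint (~742 m x 742 m near nadir)
    transmittance: assumed one-way atmospheric transmittance
    Assumes the light radiates uniformly into the upward hemisphere
    and ignores the angular emission pattern of real luminaires.
    """
    intensity = radiance * pixel_area_m2 / transmittance  # W/sr toward sensor
    return 2 * math.pi * intensity                        # W into 2*pi sr

# a bright point target filling one near-nadir pixel
print(point_source_power(2e-4, 742.0 * 742.0))
```

    A realistic inversion, as the paper notes, must also fold in radiative transfer, the DNB point spread function and off-nadir pixel growth.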

  1. Hazard analysis and critical control point to irradiated food in Brazil

    International Nuclear Information System (INIS)

    Boaratti, Maria de Fatima Guerra

    2004-01-01

    Foodborne diseases, in particular gastrointestinal infections, represent a very large group of pathologies with a strong negative impact on the health of the population because of their widespread nature. Little consideration is given to such conditions because their symptoms are often moderate and self-limiting. This has led to a general underestimation of their importance, and consequently to incorrect practices during the preparation and preservation of food, resulting in the frequent occurrence of outbreaks involving groups of varying numbers of consumers. Despite substantial efforts in the avoidance of contamination, an upward trend in the number of outbreaks of foodborne illness caused by non-spore-forming pathogenic bacteria is reported in many countries. Good hygienic practices can reduce the level of contamination, but the most important pathogens cannot presently be eliminated from most farms, nor is it possible to eliminate them by primary processing, particularly from those foods which are sold raw. Several decontamination methods exist, but the most versatile treatment among them is ionizing radiation. HACCP (Hazard Analysis and Critical Control Point) is a management system in which food safety is addressed through the analysis and control of biological, chemical and physical hazards from raw material production, procurement and handling to manufacturing, distribution and consumption of the finished product. For successful implementation of a HACCP plan, management must be strongly committed to the HACCP concept. A firm commitment to HACCP by top management provides company employees with a sense of the importance of producing safe food. At the same time, it has to be emphasized that, like other intervention strategies, irradiation must be applied as part of a total sanitation program.
The benefits of irradiation should never be considered as an excuse for poor quality or for poor handling and storage conditions

  2. Genetic interaction analysis of point mutations enables interrogation of gene function at a residue-level resolution

    Science.gov (United States)

    Braberg, Hannes; Moehle, Erica A.; Shales, Michael; Guthrie, Christine; Krogan, Nevan J.

    2014-01-01

    We have achieved a residue-level resolution of genetic interaction mapping – a technique that measures how the function of one gene is affected by the alteration of a second gene – by analyzing point mutations. Here, we describe how to interpret point mutant genetic interactions, and outline key applications for the approach, including interrogation of protein interaction interfaces and active sites, and examination of post-translational modifications. Genetic interaction analysis has proven effective for characterizing cellular processes; however, to date, systematic high-throughput genetic interaction screens have relied on gene deletions or knockdowns, which limits the resolution of gene function analysis and poses problems for multifunctional genes. Our point mutant approach addresses these issues, and further provides a tool for in vivo structure-function analysis that complements traditional biophysical methods. We also discuss the potential for genetic interaction mapping of point mutations in human cells and its application to personalized medicine. PMID:24842270

  3. Joint Clustering and Component Analysis of Correspondenceless Point Sets: Application to Cardiac Statistical Modeling.

    Science.gov (United States)

    Gooya, Ali; Lekadir, Karim; Alba, Xenia; Swift, Andrew J; Wild, Jim M; Frangi, Alejandro F

    2015-01-01

    Construction of Statistical Shape Models (SSMs) from arbitrary point sets is a challenging problem due to significant shape variation and lack of explicit point correspondence across the training data set. In medical imaging, point sets can generally represent different shape classes that span healthy and pathological exemplars. In such cases, the constructed SSM may not generalize well, largely because the probability density function (pdf) of the point sets deviates from the underlying assumption of Gaussian statistics. To this end, we propose a generative model for unsupervised learning of the pdf of point sets as a mixture of distinctive classes. A Variational Bayesian (VB) method is proposed for making joint inferences on the labels of point sets and the principal modes of variation in each cluster. The method provides a flexible framework to handle point sets with no explicit point-to-point correspondences. We also show that by maximizing the marginalized likelihood of the model, the optimal number of clusters of point sets can be determined. We illustrate this work in the context of understanding the anatomical phenotype of the left and right ventricles in the heart. To this end, we use a database containing hearts of healthy subjects, patients with Pulmonary Hypertension (PH), and patients with Hypertrophic Cardiomyopathy (HCM). We demonstrate that our method can outperform traditional PCA in both generalization and specificity measures.
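    The cluster-then-analyze-modes idea (though not the paper's correspondenceless Variational Bayesian model) can be sketched on toy fixed-length shape descriptors: group the samples, then extract per-cluster principal modes of variation. Everything here, including the two synthetic classes, is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for shape descriptors: two classes of 5-D feature vectors
# (e.g. pooled landmark coordinates). This is NOT the paper's
# correspondenceless point-set model, just the idea it generalises.
a = rng.normal([0, 0, 0, 0, 0], 0.3, size=(40, 5))
b = rng.normal([4, 4, 0, 0, 0], 0.3, size=(40, 5))
x = np.vstack([a, b])

# 2-means clustering (a few fixed-point iterations)
centers = x[[0, -1]].copy()
for _ in range(20):
    labels = np.argmin(((x[:, None] - centers[None]) ** 2).sum(-1), axis=1)
    centers = np.array([x[labels == k].mean(0) for k in (0, 1)])

# per-cluster principal modes of variation via SVD of centred data
for k in (0, 1):
    xs = x[labels == k]
    _, s, _ = np.linalg.svd(xs - xs.mean(0), full_matrices=False)
    print(k, len(xs), s[:2].round(2))
```

    The VB formulation in the paper replaces both the hard cluster assignment and the fixed component count with posterior inference, which is what lets it pick the number of clusters automatically.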

  4. Whole-genome analysis of herbicide-tolerant mutant rice generated by Agrobacterium-mediated gene targeting.

    Science.gov (United States)

    Endo, Masaki; Kumagai, Masahiko; Motoyama, Ritsuko; Sasaki-Yamagata, Harumi; Mori-Hosokawa, Satomi; Hamada, Masao; Kanamori, Hiroyuki; Nagamura, Yoshiaki; Katayose, Yuichi; Itoh, Takeshi; Toki, Seiichi

    2015-01-01

    Gene targeting (GT) is a technique used to modify endogenous genes in target genomes precisely via homologous recombination (HR). Although GT plants are produced using genetic transformation techniques, if the difference between the endogenous and the modified gene is limited to point mutations, GT crops can be considered equivalent to non-genetically modified mutant crops generated by conventional mutagenesis techniques. However, it is difficult to guarantee the non-incorporation of DNA fragments from Agrobacterium in GT plants created by Agrobacterium-mediated GT, despite screening with conventional Southern blot and/or PCR techniques. Here, we report a comprehensive analysis of herbicide-tolerant rice plants generated by inducing point mutations in the rice ALS gene via Agrobacterium-mediated GT. We performed comparative genomic hybridization (CGH) array analysis and whole-genome sequencing to evaluate the molecular composition of GT rice plants. Thus far, no integration of Agrobacterium-derived DNA fragments has been detected in GT rice plants. However, >1,000 single nucleotide polymorphisms (SNPs) and insertions/deletions (InDels) were found in GT plants. Among these mutations, 20-100 variants might have some effect on expression levels and/or protein function. Information about these additive mutations should be useful in clearing out unwanted mutations by backcrossing. © The Author 2014. Published by Oxford University Press on behalf of Japanese Society of Plant Physiologists.

  5. [Effect of ear point embedding on plasma and effect site concentrations of propofol-remifentanil in elderly patients after target-controlled induction].

    Science.gov (United States)

    Zheng, Xiaochun; Wan, Liling; Gao, Fei; Chen, Jianghu; Tu, Wenshao

    2017-08-12

    To observe the clinical effect of ear point embedding on the plasma and effect site concentrations of propofol and remifentanil, determined by target-controlled infusion (TCI) and the bispectral index (BIS) at the moments when consciousness and pain response disappear, in elderly patients undergoing abdominal external hernia surgery. Fifty patients scheduled for elective abdominal hernia surgery were randomly assigned to an observation group and a control group, 25 cases in each one. In the observation group, 30 minutes before anesthesia induction, Fugugou (Extra), Gan (CO12), Pizhixia (AT4) and Shenmen (TF4) were embedded with auricular needles, retained until the end of surgery, with each point counter-pressed 10 times. In the control group, auricular tape alone was applied at the same points, without stimulation, 30 minutes before anesthesia induction and retained until the end of surgery. Patients in both groups received total intravenous anesthesia, with anesthetic depth monitored by a BIS monitor. Propofol was infused by TCI at an initial target concentration of 1.5 μg/L, increased by 0.3 μg/L every 30 s until loss of consciousness; remifentanil was then infused by TCI at an initial target concentration of 2.0 μg/L, increased by 0.3 μg/L every 30 s until there was no bodily reaction to a pain stimulus (orbital reflex). Mean arterial pressure (MAP), heart rate (HR) and BIS values were recorded at T0 (entering the operating room), T1 (loss of consciousness) and T2 (disappearance of pain response), together with the plasma and effect site concentrations of propofol at T1 and of remifentanil at T2. After surgery, the total doses of propofol and remifentanil, the surgery time and the anesthesia time were recorded. At T1 and T2, MAP and HR in the observation group were higher than those in the control group, and the plasma and effect site concentrations of propofol in the observation group were significantly lower than those in the control group

  6. Identification and analysis of potential targets in Streptococcus sanguinis using computer aided protein data analysis.

    Science.gov (United States)

    Chowdhury, Md Rabiul Hossain; Bhuiyan, Md IqbalKaiser; Saha, Ayan; Mosleh, Ivan Mhai; Mondol, Sobuj; Ahmed, C M Sabbir

    2014-01-01

    Streptococcus sanguinis is a Gram-positive, facultative aerobic bacterium that is a member of the viridans streptococcus group. It is found in human mouths in dental plaque, which accounts for both dental cavities and bacterial endocarditis, and which entails a mortality rate of 25%. Although a range of remedial mediators have been found to control this organism, the effectiveness of agents such as penicillin, amoxicillin, trimethoprim-sulfamethoxazole, and erythromycin, was observed. The emphasis of this investigation was on finding substitute and efficient remedial approaches for the total destruction of this bacterium. In this computational study, various databases and online software were used to ascertain some specific targets of S. sanguinis. Particularly, the Kyoto Encyclopedia of Genes and Genomes databases were applied to determine human nonhomologous proteins, as well as the metabolic pathways involved with those proteins. Different software such as Phyre2, CastP, DoGSiteScorer, the Protein Function Predictor server, and STRING were utilized to evaluate the probable active drug binding site with its known function and protein-protein interaction. In this study, among 218 essential proteins of this pathogenic bacterium, 81 nonhomologous proteins were accrued, and 15 proteins that are unique in several metabolic pathways of S. sanguinis were isolated through metabolic pathway analysis. Furthermore, four essentially membrane-bound unique proteins that are involved in distinct metabolic pathways were revealed by this research. Active sites and druggable pockets of these selected proteins were investigated with bioinformatic techniques. In addition, this study also mentions the activity of those proteins, as well as their interactions with the other proteins. Our findings helped to identify the type of protein to be considered as an efficient drug target. This study will pave the way for researchers to develop and discover more effective and specific
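    The screening funnel described above (essential proteins, minus human homologs, restricted to pathways unique to the pathogen) reduces to set operations once the annotations are in hand. The identifiers and pathway names below are invented placeholders, not data from the S. sanguinis study.

```python
# Hypothetical miniature version of the target-screening funnel.
essential = {"rpoB", "murA", "folP", "dnaN", "gapA", "pdhC"}
human_homologs = {"gapA", "pdhC"}          # e.g. hits from a BLASTp screen
pathway_of = {
    "rpoB": "transcription",
    "murA": "peptidoglycan biosynthesis",
    "folP": "folate biosynthesis",
    "dnaN": "DNA replication",
}
pathogen_unique_pathways = {"peptidoglycan biosynthesis",
                            "folate biosynthesis"}

# step 1: drop proteins with human homologs (avoid host toxicity)
nonhomologous = essential - human_homologs

# step 2: keep proteins confined to pathogen-unique metabolic pathways
targets = {p for p in nonhomologous
           if pathway_of.get(p) in pathogen_unique_pathways}
print(sorted(targets))
```

    In the actual study the homology screen and pathway assignments come from BLAST against the human proteome and the KEGG database, followed by structure- and pocket-level vetting of the survivors.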

  7. Identification and analysis of potential targets in Streptococcus sanguinis using computer aided protein data analysis

    Science.gov (United States)

    Chowdhury, Md Rabiul Hossain; Bhuiyan, Md IqbalKaiser; Saha, Ayan; Mosleh, Ivan MHAI; Mondol, Sobuj; Ahmed, C M Sabbir

    2014-01-01

    Purpose Streptococcus sanguinis is a Gram-positive, facultative aerobic bacterium that is a member of the viridans streptococcus group. It is found in human mouths in dental plaque, which accounts for both dental cavities and bacterial endocarditis, and which entails a mortality rate of 25%. Although a range of remedial mediators have been found to control this organism, the effectiveness of agents such as penicillin, amoxicillin, trimethoprim–sulfamethoxazole, and erythromycin, was observed. The emphasis of this investigation was on finding substitute and efficient remedial approaches for the total destruction of this bacterium. Materials and methods In this computational study, various databases and online software were used to ascertain some specific targets of S. sanguinis. Particularly, the Kyoto Encyclopedia of Genes and Genomes databases were applied to determine human nonhomologous proteins, as well as the metabolic pathways involved with those proteins. Different software such as Phyre2, CastP, DoGSiteScorer, the Protein Function Predictor server, and STRING were utilized to evaluate the probable active drug binding site with its known function and protein–protein interaction. Results In this study, among 218 essential proteins of this pathogenic bacterium, 81 nonhomologous proteins were accrued, and 15 proteins that are unique in several metabolic pathways of S. sanguinis were isolated through metabolic pathway analysis. Furthermore, four essentially membrane-bound unique proteins that are involved in distinct metabolic pathways were revealed by this research. Active sites and druggable pockets of these selected proteins were investigated with bioinformatic techniques. In addition, this study also mentions the activity of those proteins, as well as their interactions with the other proteins. Conclusion Our findings helped to identify the type of protein to be considered as an efficient drug target. This study will pave the way for researchers to

  8. Identification and analysis of potential targets in Streptococcus sanguinis using computer aided protein data analysis

    Directory of Open Access Journals (Sweden)

    Chowdhury MRH

    2014-11-01

    Full Text Available Md Rabiul Hossain Chowdhury,1 Md IqbalKaiser Bhuiyan,2 Ayan Saha,2 Ivan MHAI Mosleh,2 Sobuj Mondol,2 C M Sabbir Ahmed3 1Department of Pharmacy, University of Science and Technology Chittagong, Chittagong, Bangladesh; 2Department of Genetic Engineering and Biotechnology, University of Chittagong, Chittagong, Bangladesh; 3Biotechnology and Genetic Engineering Discipline, Khulna University, Khulna, Bangladesh Purpose: Streptococcus sanguinis is a Gram-positive, facultative aerobic bacterium that is a member of the viridans streptococcus group. It is found in human mouths in dental plaque, which accounts for both dental cavities and bacterial endocarditis, and which entails a mortality rate of 25%. Although a range of remedial mediators have been found to control this organism, the effectiveness of agents such as penicillin, amoxicillin, trimethoprim–sulfamethoxazole, and erythromycin, was observed. The emphasis of this investigation was on finding substitute and efficient remedial approaches for the total destruction of this bacterium. Materials and methods: In this computational study, various databases and online software were used to ascertain some specific targets of S. sanguinis. Particularly, the Kyoto Encyclopedia of Genes and Genomes databases were applied to determine human nonhomologous proteins, as well as the metabolic pathways involved with those proteins. Different software such as Phyre2, CastP, DoGSiteScorer, the Protein Function Predictor server, and STRING were utilized to evaluate the probable active drug binding site with its known function and protein–protein interaction. Results: In this study, among 218 essential proteins of this pathogenic bacterium, 81 nonhomologous proteins were accrued, and 15 proteins that are unique in several metabolic pathways of S. sanguinis were isolated through metabolic pathway analysis. Furthermore, four essentially membrane-bound unique proteins that are involved in distinct metabolic

  9. Stress analysis of neutral beam pivot point bellows for tokamak fusion test reactor

    International Nuclear Information System (INIS)

    Johnson, J.J.; Benda, B.J.; Tiong, L.W.

    1983-01-01

    The neutral beam pivot point bellows serves as an airtight flexible linkage between the torus duct and the neutral beam transition duct in Princeton University's Tokamak Fusion Test Reactor. The bellows considered here is basically rectangular in cross section with rounded corners, a unique shape. Its overall external dimensions are about 28 in. (about 711 mm) x about 35 in. (about 889 mm). The bellows is formed from 18 convolutions and is of the nested-ripple type. It is about 11 in. (about 43.3 mm) in length, composed of Inconel 718, and each leaf has a thickness of 0.034 in. (0.86 mm). The bellows is subjected to a series of design loading conditions: vacuum; vacuum + 2 psi (0.12 MPa); vacuum + stroke (10,000 cycles); vacuum + temperature increase + extension; extension to a stress of 120 ksi (838 MPa); and a series of rotational loading conditions induced in the bellows by alignment of the neutral beam injector. A stress analysis of the bellows was performed by the finite element method; the locations and magnitudes of the maximum stresses were calculated for all of the design loading conditions, checked against allowable values, and used to guide the placement of strain gauges during proof testing. A typical center convolution and end convolution were analyzed. Loading conditions were separated into symmetric and antisymmetric cases about the planes of symmetry of the cross section. Iterative linear analyses were performed: compressive loading conditions led to predicted overlap of the leaves in the linear analysis, and restraints were added to prevent such overlap. This effect was found to be substantial in stress prediction and had to be taken into account. A total of eleven loading conditions and seven models were analyzed. The results showed peak stresses to be within allowable limits and the number of allowable cycles to be greater than the design condition

  10. Strain Analysis of Stretched Tourmaline Crystals Using ImageJ, Microsoft Excel and PowerPoint

    Science.gov (United States)

    Bosbyshell, H.

    2012-12-01

    This poster describes an undergraduate structural geology lab exercise utilizing the Mohr's circle diagram for finite strain, constructed using measurements obtained from stretched tourmaline crystals. A small building housing HVAC equipment at the south end of West Chester University's Recitation Hall (itself made of serpentinite) is constructed of early-Cambrian Chickies Quartzite. Stretched tourmaline crystals, with segments joined by fibrous quartz, are visible on many surfaces (presumably originally bedding). While the original orientation of any stone is unknown, these rocks provide an opportunity for a short field exercise during a two-hour lab period and a great base for conducting strain analysis. It is always fun to ask how many in the class have ever noticed the tourmaline (few have). Students take photos using their cell phones or cameras. Since strain is a ratio the absolute size of the tourmaline crystals is immaterial. Nonetheless, this is a good opportunity to remind students of the importance of including a scale in their photographs. The photos are opened in ImageJ and the line tool is used to determine the original and final lengths of selected crystals. Students calculate strain parameters using Microsoft Excel. Then, we use Adobe Illustrator or the drafting capabilities of Microsoft PowerPoint 2010 to follow Ramsay and Huber's techniques using a Mohr's circle construction to determine the finite strain ellipse. If a stretching direction can be estimated, elongation of two crystals is all that is required to determine the strain ratio. If no stretching direction is apparent, three crystals are required for a more complicated analysis that allows for determination of the stretching direction, as well as the strain ratio.
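    The per-crystal arithmetic behind the exercise is simple enough to script: restore the original crystal length by summing the broken segments, measure the tip-to-tip stretched length, and compute elongation, stretch and quadratic elongation. The measurements below are made-up stand-ins for ImageJ line-tool readings, not data from the Recitation Hall outcrop.

```python
def stretch_parameters(l0, lf):
    """Elongation e, stretch S and quadratic elongation lambda = S**2
    for a boudinaged crystal of restored length l0 and final length lf."""
    e = (lf - l0) / l0
    s = 1 + e
    return e, s, s ** 2

# segment lengths of one broken tourmaline (arbitrary units from ImageJ);
# the restored crystal length is the sum of the pieces
segments = [3.1, 2.4, 4.0]
# tip-to-tip length including the quartz-fibre gaps between pieces
total = 11.8

e, s, lam = stretch_parameters(sum(segments), total)
print(round(e, 3), round(s, 3), round(lam, 3))
```

    With a known stretching direction, the ratio of the stretches of two such crystals gives the strain ratio directly; the Mohr circle construction from Ramsay and Huber then works with the reciprocal quadratic elongations.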

  11. Analysis of ultrasonically rotating droplet using moving particle semi-implicit and distributed point source methods

    Science.gov (United States)

    Wada, Yuji; Yuge, Kohei; Tanaka, Hiroki; Nakamura, Kentaro

    2016-07-01

    Numerical analysis of the rotation of an ultrasonically levitated droplet with a free surface boundary is discussed. The ultrasonically levitated droplet is often reported to rotate owing to the surface tangential component of acoustic radiation force. To observe the torque from an acoustic wave and clarify the mechanism underlying the phenomena, it is effective to take advantage of numerical simulation using the distributed point source method (DPSM) and moving particle semi-implicit (MPS) method, both of which do not require a calculation grid or mesh. In this paper, the numerical treatment of the viscoacoustic torque, which emerges from the viscous boundary layer and governs the acoustical droplet rotation, is discussed. The Reynolds stress traction force is calculated from the DPSM result using the idea of effective normal particle velocity through the boundary layer and input to the MPS surface particles. A droplet levitated in an acoustic chamber is simulated using the proposed calculation method. The droplet is vertically supported by a plane standing wave from an ultrasonic driver and subjected to a rotating sound field excited by two acoustic sources on the side wall with different phases. The rotation of the droplet is successfully reproduced numerically and its acceleration is discussed and compared with those in the literature.

  12. Thermomagnetic instabilities in a vertical layer of ferrofluid: nonlinear analysis away from a critical point

    Energy Technology Data Exchange (ETDEWEB)

    Dey, Pinkee; Suslov, Sergey A, E-mail: ssuslov@swin.edu.au [Department of Mathematics H38, Swinburne University of Technology, Hawthorn, Victoria 3122 (Australia)

    2016-12-15

    A finite amplitude instability has been analysed to discover the exact mechanism leading to the appearance of stationary magnetoconvection patterns in a vertical layer of a non-conducting ferrofluid heated from the side and placed in an external magnetic field perpendicular to the walls. The physical results have been obtained using a version of a weakly nonlinear analysis that is based on the disturbance amplitude expansion. It enables a low-dimensional reduction of a full nonlinear problem in supercritical regimes away from a bifurcation point. The details of the reduction are given in comparison with traditional small-parameter expansions. It is also demonstrated that Squire’s transformation can be introduced for higher-order nonlinear terms, thus reducing the full three-dimensional problem to its equivalent two-dimensional counterpart and enabling significant computational savings. The full three-dimensional instability patterns are subsequently recovered using the inverse transforms. The analysed stationary thermomagnetic instability is shown to occur as a result of a supercritical pitchfork bifurcation. (paper)

  13. An automated smartphone-based diagnostic assay for point-of-care semen analysis

    Science.gov (United States)

    Kanakasabapathy, Manoj Kumar; Sadasivam, Magesh; Singh, Anupriya; Preston, Collin; Thirumalaraju, Prudhvi; Venkataraman, Maanasa; Bormann, Charles L.; Draz, Mohamed Shehata; Petrozza, John C.; Shafiee, Hadi

    2017-01-01

    Male infertility affects up to 12% of the world’s male population and is linked to various environmental and medical conditions. Manual microscope-based testing and computer-assisted semen analysis (CASA) are the current standard methods to diagnose male infertility; however, these methods are labor-intensive, expensive, and laboratory-based. Cultural and socially dominated stigma against male infertility testing hinders a large number of men from getting tested for infertility, especially in resource-limited African countries. We describe the development and clinical testing of an automated smartphone-based semen analyzer designed for quantitative measurement of sperm concentration and motility for point-of-care male infertility screening. Using a total of 350 clinical semen specimens at a fertility clinic, we have shown that our assay can analyze an unwashed, unprocessed liquefied semen sample; by leveraging smartphone capabilities, it can make remote semen quality testing accessible to people in both developed and developing countries who have access to smartphones. PMID:28330865

  14. What motivates individuals with ADHD? A qualitative analysis from the adolescent's point of view.

    Science.gov (United States)

    Morsink, Sarah; Sonuga-Barke, Edmund; Mies, Gabry; Glorie, Nathalie; Lemiere, Jurgen; Van der Oord, Saskia; Danckaerts, Marina

    2017-08-01

    Individuals with ADHD appear to respond differently to incentives than their peers. This could be due to a general altered sensitivity to reinforcers. However, apart from differences in the degree of motivation, individuals with ADHD might also be motivated by qualitatively different factors. This study aimed to harvest a range of motivational factors and identify ADHD-related qualitative differences in motivation, from the adolescent's point of view. Semi-structured interviews allowing participants to describe what motivates them in daily life were conducted with young adolescents (9-16 years) with and without ADHD. Thematic analysis was undertaken using NVivo software. Major themes relating to motivation were identified from the interview data. These were: (1) achieving a sense of togetherness; (2) feeling competent; (3) fulfilling a need for variation; (4) gaining pleasure from applying effort to achieve a goal; (5) valuing social reinforcement; (6) desiring to be absorbed/forget problems; (7) feeling free and independent; (8) attaining material reinforcement; and (9) an enjoyment of bodily stimulation. The theme structure was very similar for both groups. However, individuals with ADHD differed in some specifics: their focus on the passing of time, the absence of preference for predictable and familiar tasks, and their less elaborate description of the togetherness theme. A broad range of motivational themes was identified, stretching beyond the current focus of ADHD research and motivational theories. Similarities and differences in motivational values of individuals with and without ADHD should be taken into account in reward sensitivity research, and in psychological treatment.

  15. Analysis of three geopressured geothermal aquifer-natural gas fields; Duson Hollywood and Church Point, Louisiana

    Energy Technology Data Exchange (ETDEWEB)

    Rogers, L.A.; Boardman, C.R.

    1981-05-01

The available well logs, production records and geological structure maps were analyzed for the Hollywood, Duson, and Church Point, Louisiana oil and gas fields to determine the areal extent of the sealed geopressured blocks and to identify which aquifer sands within the blocks are connected to commercial production of hydrocarbons. The analysis showed that, over the depth intervals of the geopressured zones shown on the logs, essentially all of the sands of any substantial thickness had gas produced from them somewhere in the fault block. It is therefore expected that the sands which are fully brine saturated in many of the wells are the water-drive portion of producing gas/oil horizons elsewhere within the fault block. In this study only one deep sand, in the Hollywood field, was identified which was not connected to a producing horizon somewhere else in the field. Estimates of the reservoir parameters were made, and a hypothetical production calculation showed the probable production to be less than 10,000 b/d. The gas price required to produce this gas profitably is well above the current market price.

  16. HACCP (Hazard Analysis Critical Control Points): is it coming to the dairy?

    Science.gov (United States)

    Cullor, J S

    1997-12-01

    The risks and consequences of foodborne and waterborne pathogens are coming to the forefront of public health concerns, and strong pressure is being applied on agriculture for immediate implementation of on-farm controls. The FDA is considering HACCP (Hazard Analysis Critical Control Points) as the new foundation for revision of the US Food Safety Assurance Program because HACCP is considered to be a science-based, systematic approach to the prevention of food safety problems. In addition, the implementation of HACCP principles permits more government oversight through requirements for standard operating procedures and additional systems for keeping records, places primary responsibility for ensuring food safety on the food manufacturer or distributor, and may assist US food companies in competing more effectively in the world market. With the HACCP-based program in place, a government investigator should be able to determine and evaluate both current and past conditions that are critical to ensuring the safety of the food produced by the facility. When this policy is brought to the production unit, the impact for producers and veterinarians will be substantial.

  17. Thermomagnetic instabilities in a vertical layer of ferrofluid: nonlinear analysis away from a critical point

    International Nuclear Information System (INIS)

    Dey, Pinkee; Suslov, Sergey A

    2016-01-01

A finite amplitude instability has been analysed to discover the exact mechanism leading to the appearance of stationary magnetoconvection patterns in a vertical layer of a non-conducting ferrofluid heated from the side and placed in an external magnetic field perpendicular to the walls. The physical results have been obtained using a version of a weakly nonlinear analysis that is based on the disturbance amplitude expansion. It enables a low-dimensional reduction of a full nonlinear problem in supercritical regimes away from a bifurcation point. The details of the reduction are given in comparison with traditional small-parameter expansions. It is also demonstrated that Squire’s transformation can be introduced for higher-order nonlinear terms, thus reducing the full three-dimensional problem to its equivalent two-dimensional counterpart and enabling significant computational savings. The full three-dimensional instability patterns are subsequently recovered using the inverse transforms. The analysed stationary thermomagnetic instability is shown to occur as a result of a supercritical pitchfork bifurcation. (paper)

  18. An automated smartphone-based diagnostic assay for point-of-care semen analysis.

    Science.gov (United States)

    Kanakasabapathy, Manoj Kumar; Sadasivam, Magesh; Singh, Anupriya; Preston, Collin; Thirumalaraju, Prudhvi; Venkataraman, Maanasa; Bormann, Charles L; Draz, Mohamed Shehata; Petrozza, John C; Shafiee, Hadi

    2017-03-22

Male infertility affects up to 12% of the world's male population and is linked to various environmental and medical conditions. Manual microscope-based testing and computer-assisted semen analysis (CASA) are the current standard methods to diagnose male infertility; however, these methods are labor-intensive, expensive, and laboratory-based. Cultural and socially dominated stigma against male infertility testing hinders a large number of men from getting tested for infertility, especially in resource-limited African countries. We describe the development and clinical testing of an automated smartphone-based semen analyzer designed for quantitative measurement of sperm concentration and motility for point-of-care male infertility screening. Using a total of 350 clinical semen specimens at a fertility clinic, we have shown that our assay can analyze an unwashed, unprocessed liquefied semen sample and provide the user with a semen quality evaluation based on the World Health Organization (WHO) guidelines with ~98% accuracy. The work suggests that the integration of microfluidics, optical sensing accessories, and advances in consumer electronics, particularly smartphone capabilities, can make remote semen quality testing accessible to people in both developed and developing countries who have access to smartphones. Copyright © 2017, American Association for the Advancement of Science.

  19. Instantaneous normal mode analysis for intermolecular and intramolecular vibrations of water from atomic point of view.

    Science.gov (United States)

    Chen, Yu-Chun; Tang, Ping-Han; Wu, Ten-Ming

    2013-11-28

By exploiting the instantaneous normal mode (INM) analysis for models of flexible molecules, we investigate intermolecular and intramolecular vibrations of water from the atomic point of view. With two flexible SPC/E models, our investigations cover three aspects of their INM spectra, which are separated into the unstable, intermolecular, bending, and stretching bands. First, the O- and H-atom contributions in the four INM bands are calculated and their stable INM spectra are compared with the power spectra of the atomic velocity autocorrelation functions. The unstable and intermolecular bands of the flexible models are also compared with those of the SPC/E model of rigid molecules. Second, we formulate the inverse participation ratio (IPR) of the INMs for the O atom, the H atom, and the molecule, respectively. With the IPRs, the numbers of the three species participating in the INMs are estimated, so that the localization characters of the INMs in each band are studied. Further, from the ratio of the IPR of the H atom to that of the O atom, we explore the number of OH bonds per molecule involved in the INMs. Third, by classifying simulated molecules into subensembles according to the geometry of their local environments or their H-bond configurations, we examine the local-structure effects on the bending and stretching INM bands. All of our results are verified to be insensitive to the definition of the H-bond. Our conclusions about the intermolecular and intramolecular vibrations in water are given.
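The inverse participation ratio used above can be sketched numerically. The following is a minimal illustration (not the authors' code, and the mode vectors are made up): per-atom weights w_i are built from squared eigenvector components, the IPR is Σ w_i², and 1/IPR estimates the effective number of participating atoms.

```python
import numpy as np

def participation_number(eigvec, n_dims=3):
    """Effective number of atoms participating in a normal mode.

    eigvec: flat array of length n_atoms * n_dims (eigenvector components).
    Returns 1/IPR, where IPR is the inverse participation ratio of the
    normalized per-atom weight distribution.
    """
    comps = np.asarray(eigvec, dtype=float).reshape(-1, n_dims)
    w = (comps ** 2).sum(axis=1)   # per-atom weight
    w = w / w.sum()                # normalize so weights sum to 1
    ipr = (w ** 2).sum()           # inverse participation ratio
    return 1.0 / ipr

# A mode localized on a single atom of a 4-atom system:
local_mode = np.array([1.0, 0.0, 0.0] + [0.0] * 9)
# A mode spread evenly over all 4 atoms:
extended_mode = np.tile([0.5, 0.0, 0.0], 4)
```

For the localized mode the participation number is 1; for the evenly spread mode it is 4, matching the intuition that the IPR counts involved species.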

  20. Design of the solid target structure and the study on the coolant flow distribution in the solid target using the 2-dimensional flow analysis

    International Nuclear Information System (INIS)

    Haga, Katsuhiro; Terada, Atsuhiko; Ishikura, Shuichi; Teshigawara, Makoto; Kinoshita, Hidetaka; Kobayashi, Kaoru; Kaminaga, Masaki; Hino, Ryutaro; Susuki, Akira

    1999-11-01

A solid target cooled by heavy water is presently under development under the Neutron Science Research Project of the Japan Atomic Energy Research Institute (JAERI). Target plates of several millimeters thickness made of heavy metal are used as the spallation target material; they are placed face to face in a row with one- to two-millimeter gaps in between, through which heavy water flows as the coolant. Based on the design criteria regarding the target plate cooling, the volume percentage of the coolant, and the thermal stress produced in the target plates, we conducted a thermal and hydraulic analysis with a one-dimensional target plate model. We chose tungsten as the target material and decided on various target plate thicknesses. We then calculated the temperature and the thermal stress in the target plates using a two-dimensional model, and confirmed the validity of the target plate thicknesses. Based on these analytical results, we proposed a target structure in which forty target plates are divided into six groups and each group is cooled by a single pass of coolant. In order to investigate the relationship between the distribution of the coolant flow, the pressure drop, and the coolant velocity, we conducted a hydraulic analysis using a general-purpose hydraulic analysis code. As a result, we found that a uniform coolant flow distribution can be achieved over a wide range of flow velocities in the target plate cooling channels, from 1 m/s to 10 m/s. The pressure drop along the coolant path was 0.09 MPa and 0.17 MPa at coolant flow velocities of 5 m/s and 7 m/s, respectively, which are the velocities required to cool the 1.5 MW and 2.5 MW solid targets. (author)
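The quoted pressure drops scale roughly with the square of the coolant velocity, as in the Darcy-Weisbach relation Δp = f (L/D_h) ρ v²/2. A minimal sketch follows; the channel length, hydraulic diameter, friction factor and density below are assumed illustration values, not the JAERI design figures.

```python
def darcy_pressure_drop(v, length, d_h, rho=1105.0, f=0.02):
    """Darcy-Weisbach pressure drop [Pa] for a single cooling channel.

    v      : coolant velocity [m/s]
    length : channel length [m]        (assumed value)
    d_h    : hydraulic diameter [m]    (assumed value)
    rho    : heavy-water density [kg/m^3]
    f      : Darcy friction factor     (assumed constant here)
    """
    return f * (length / d_h) * rho * v ** 2 / 2.0

# Illustrative comparison of the two design velocities:
dp_5 = darcy_pressure_drop(5.0, length=0.5, d_h=0.004)
dp_7 = darcy_pressure_drop(7.0, length=0.5, d_h=0.004)
```

With a constant friction factor the drop at 7 m/s is (7/5)² ≈ 1.96 times the drop at 5 m/s, the same order as the reported 0.17 MPa versus 0.09 MPa (which also include header and manifold losses).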

  1. An assessment of independent component analysis for detection of military targets from hyperspectral images

    Science.gov (United States)

    Tiwari, K. C.; Arora, M. K.; Singh, D.

    2011-10-01

Hyperspectral data acquired over hundreds of narrow contiguous wavelength bands are extremely suitable for target detection due to their high spectral resolution. Though the spectral response of every material is expected to be unique, in practice it exhibits variations, which is known as spectral variability. Most target detection algorithms depend on spectral modelling using a priori available target spectra. In practice, however, target spectra are seldom available a priori. Independent component analysis (ICA) is a new, evolving technique that aims at finding components which are statistically independent, or as independent as possible. The technique therefore has the potential of being used for target detection applications. An assessment of target detection from hyperspectral images using ICA and other algorithms based on spectral modelling may be of immense interest, since ICA does not require a priori target information. The aim of this paper is, thus, to assess the potential of an ICA-based algorithm vis-à-vis other prevailing algorithms for military target detection. Four spectral matching algorithms, namely Orthogonal Subspace Projection (OSP), Constrained Energy Minimisation (CEM), Spectral Angle Mapper (SAM) and Spectral Correlation Mapper (SCM), and four anomaly detection algorithms, namely the OSP anomaly detector (OSPAD), the Reed-Xiaoli anomaly detector (RXD), the Uniform Target Detector (UTD) and a combination of the Reed-Xiaoli anomaly detector and Uniform Target Detector (RXD-UTD), were considered. The experiments were conducted using a set of synthetic and AVIRIS hyperspectral images containing aircraft as military targets. A comparison of the true positive and false positive rates of target detections obtained from ICA and the other algorithms, plotted in receiver operating characteristic (ROC) space, indicates the superior performance of ICA over the other algorithms.
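The ROC comparison described above reduces to computing true and false positive rates of each detector over a range of score thresholds. A minimal sketch with made-up detector scores and ground-truth labels (not actual AVIRIS output):

```python
import numpy as np

def roc_point(scores, labels, threshold):
    """TPR and FPR of a detector's score map at a single threshold.

    scores : per-pixel detector scores
    labels : ground truth, 1 = target pixel, 0 = background
    """
    detected = scores >= threshold
    tpr = float(np.mean(detected[labels == 1]))  # true positive rate
    fpr = float(np.mean(detected[labels == 0]))  # false positive rate
    return tpr, fpr

# Hypothetical scores for four pixels, two of which are targets:
scores = np.array([0.9, 0.8, 0.3, 0.1])
labels = np.array([1, 1, 0, 0])
tpr, fpr = roc_point(scores, labels, 0.5)
```

Sweeping the threshold over the score range and plotting the resulting (FPR, TPR) pairs yields each algorithm's curve in ROC space.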

  2. CETF Space Station payload pointing system design and analysis feasibility study. [Critical Evaluation Task Force

    Science.gov (United States)

    Smagala, Tom; Mcglew, Dave

    1988-01-01

    The expected pointing performance of an attached payload coupled to the Critical Evaluation Task Force Space Station via a payload pointing system (PPS) is determined. The PPS is a 3-axis gimbal which provides the capability for maintaining inertial pointing of a payload in the presence of disturbances associated with the Space Station environment. A system where the axes of rotation were offset from the payload center of mass (CM) by 10 in. in the Z axis was studied as well as a system having the payload CM offset by only 1 inch. There is a significant improvement in pointing performance when going from the 10 in. to the 1 in. gimbal offset.

  3. Dynamic analysis of the Nova Target Chamber to assess alignment errors due to ambient noise

    International Nuclear Information System (INIS)

    McCallen, D.B.; Murray, R.C.

    1984-01-01

We performed a study to determine the dynamic behavior of the Nova Target Chamber. We conducted a free vibration analysis to determine the natural frequencies of vibration and the corresponding mode shapes of the target chamber. Utilizing the free vibration results, we performed a forced vibration analysis to predict the displacements of the chamber due to ambient vibration. The input support motion for the forced vibration analysis was defined by a white noise acceleration spectrum based on previous measurements of ground noise near the Nova site. A special purpose computer program was prepared to process the results of the forced vibration analysis. The program yields the distances by which the lines of sight of the various laser beams miss the target as a result of ambient vibrations. We also performed additional estimates of miss distance to provide bounds on the results. A description of the finite element model of the chamber, the input spectrum, and the results of the analyses are included.
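The forced-vibration step, predicting RMS response to a white-noise acceleration spectrum, can be illustrated for a single mode with the classic Miles' equation estimate from random-vibration practice. This is a generic sketch, not the Nova finite element model; the modal frequency, damping ratio, and PSD level below are assumed values.

```python
import math

def miles_rms_displacement(fn_hz, zeta, psd_g2_per_hz):
    """Rough RMS modal displacement [m] under white-noise base excitation.

    fn_hz         : modal natural frequency [Hz]          (assumed)
    zeta          : modal damping ratio                   (assumed)
    psd_g2_per_hz : one-sided base-acceleration PSD [g^2/Hz] at fn (assumed)

    Miles' equation gives the RMS acceleration response of a lightly
    damped single mode; dividing by omega_n^2 converts it to an
    approximate RMS displacement.
    """
    g = 9.81
    q = 1.0 / (2.0 * zeta)  # amplification factor at resonance
    accel_rms = math.sqrt(math.pi / 2.0 * fn_hz * q * psd_g2_per_hz) * g
    omega_n = 2.0 * math.pi * fn_hz
    return accel_rms / omega_n ** 2
```

For a hypothetical 10 Hz mode with 2% damping on a quiet site (1e-9 g²/Hz), this predicts an RMS displacement on the order of a micron, the scale at which beam-pointing miss distances become relevant.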

  4. Sentiment analysis enhancement with target variable in Kumar’s Algorithm

    Science.gov (United States)

    Arman, A. A.; Kawi, A. B.; Hurriyati, R.

    2016-04-01

Sentiment analysis (also known as opinion mining) refers to the use of text analysis and computational linguistics to identify and extract subjective information from source materials. Sentiment analysis is widely applied to reviews and discussions in social media for many purposes, ranging from marketing and customer service to gauging public opinion of public policy. One popular algorithm for sentiment analysis is the Kumar algorithm, developed by Kumar and Sebastian. The Kumar algorithm can identify the sentiment score of a statement, sentence or tweet, but cannot relate that sentiment to the object or target being discussed. This research proposes a solution to that challenge by adding a component representing the object or target to the existing algorithm (the Kumar algorithm). The result of this research is a modified algorithm that can give a sentiment score with respect to a given object or target.
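The target-aware extension can be sketched as scoring only the sentiment words that fall within a token window around the target term. The lexicon, window heuristic, and function below are toy illustrations, not the Kumar algorithm itself:

```python
# Toy sentiment lexicon -- illustrative only, not a real resource
LEXICON = {"great": 1, "love": 1, "slow": -1, "bad": -1}

def targeted_sentiment(text, target, window=3):
    """Sum lexicon scores of words within `window` tokens of the target.

    Returns 0 when the target does not occur, so sentiment about
    unrelated objects in the same sentence is ignored.
    """
    tokens = text.lower().split()
    if target not in tokens:
        return 0
    idx = tokens.index(target)
    lo, hi = max(0, idx - window), idx + window + 1
    return sum(LEXICON.get(t, 0) for t in tokens[lo:hi])
```

On "the camera is great but the battery is bad", the score is positive with respect to "camera" and negative with respect to "battery", which a sentence-level score would collapse to zero.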

  5. Design study and heat transfer analysis of a neutron converter target for medical radioisotope production

    International Nuclear Information System (INIS)

    Masoud Behzad; Sang-In Bak; Seung-Woo Hong; Jong-Seo Chai; Yacine Kadi; Claudio Tenreiro; University of Talca, Talca

    2014-01-01

    A worldwide challenge in the near future will be to find a way of producing radioisotopes in sufficient quantity without relying on research reactors. The motivation for this innovative work on targets lies in the accelerator-based production of radioisotopes using a neutron converter target as in the transmutation by adiabatic resonance crossing concept. Thermal analysis of a multi-channel helium cooled device is performed with the computational fluid dynamics code CFX. Different boundary conditions are taken into account in the simulation process and many important parameters such as maximum allowable solid target temperature as well as uniform inlet velocity and outlet pressure changes in the channels are investigated. The results confirm that the cooling configuration works well; hence such a solid target could be operated safely and may be considered for a prototype target. (author)

  6. Throat swabs in children with respiratory tract infection: associations with clinical presentation and potential targets for point-of-care testing.

    Science.gov (United States)

    Thornton, Hannah V; Hay, Alastair D; Redmond, Niamh M; Turnbull, Sophie L; Christensen, Hannah; Peters, Tim J; Leeming, John P; Lovering, Andrew; Vipond, Barry; Muir, Peter; Blair, Peter S

    2017-08-01

Diagnostic uncertainty over respiratory tract infections (RTIs) in primary care contributes to over-prescribing of antibiotics and drives antibiotic resistance. If symptoms and signs predicted respiratory tract microbiology, they could help clinicians target antibiotics to bacterial infection. This study aimed to determine relationships between symptoms and signs in children presenting to primary care and microbes detected in throat swabs. Cross-sectional study of children aged ≥3 months presenting with acute cough and RTI, with a subset followed up. Associations and area under the receiver operating characteristic curve (AUROC) statistics were sought between clinical presentation and baseline microbe detection, and microbe prevalence was compared between baseline (symptomatic) and follow-up (asymptomatic) visits. At baseline, ≥1 bacterium was detected in 1257/2113 (59.5%) children and ≥1 virus in 894/2127 (42%) children. Clinical presentation was not associated with detection of ≥1 bacterium [AUROC 0.54 (95% CI 0.52-0.56)] or ≥1 virus [0.64 (95% CI 0.61-0.66)]. Individually, only respiratory syncytial virus (RSV) was associated with clinical presentation [AUROC 0.80 (0.77-0.84)]. Prevalence fell between baseline and follow-up, more so for viruses (68% versus 26%). Clinical presentation cannot distinguish the presence of bacteria or viruses in the upper respiratory tract. However, individual and overall microbe prevalence was greater when children were unwell than when well, providing some evidence that upper respiratory tract microbes may be the cause or consequence of the illness. If causal, selective microbial point-of-care testing could be beneficial. © The Author 2017. Published by Oxford University Press.

  7. APPLICABILITY ANALYSIS OF CLOTH SIMULATION FILTERING ALGORITHM FOR MOBILE LIDAR POINT CLOUD

    Directory of Open Access Journals (Sweden)

    S. Cai

    2018-04-01

Classifying the original point clouds into ground and non-ground points is a key step in LiDAR (light detection and ranging) data post-processing. The cloth simulation filtering (CSF) algorithm, which is based on a physical process, has been validated to be an accurate, automatic and easy-to-use algorithm for airborne LiDAR point clouds. As a new technique of three-dimensional data collection, mobile laser scanning (MLS) has gradually been applied in various fields, such as reconstruction of digital terrain models (DTM), 3D building modeling, and forest inventory and management. Compared with airborne LiDAR point clouds, mobile LiDAR point clouds differ in point density, distribution and complexity. Some filtering algorithms designed for airborne LiDAR data have been used directly on mobile LiDAR point clouds, but they did not give satisfactory results. In this paper, we explore the ability of the CSF algorithm for mobile LiDAR point clouds. Three samples with different terrain shapes are selected to test the performance of this algorithm, which respectively yields total errors of 0.44 %, 0.77 % and 1.20 %. Additionally, a large-area dataset is also tested to further validate the effectiveness of this algorithm, and the results show that it can quickly and accurately separate point clouds into ground and non-ground points. In summary, this algorithm is efficient and reliable for mobile LiDAR point clouds.
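The total errors reported above follow the standard ground-filtering error definitions: Type I (ground points rejected), Type II (non-ground points accepted), and total error (all misclassified points over all points). A minimal sketch with hypothetical classification results:

```python
def filtering_errors(true_ground, pred_ground):
    """Type I, Type II, and total error rates for a ground filter.

    true_ground : list of bool, True where the point is truly ground
    pred_ground : list of bool, True where the filter labeled it ground
    """
    n = len(true_ground)
    # Type I: true ground points the filter rejected
    t1 = sum(t and not p for t, p in zip(true_ground, pred_ground))
    # Type II: non-ground points the filter accepted as ground
    t2 = sum((not t) and p for t, p in zip(true_ground, pred_ground))
    return t1 / n, t2 / n, (t1 + t2) / n
```

The paper's 0.44-1.20 % figures are this total-error rate computed against manually classified reference points.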

  8. A quantitative analysis of statistical power identifies obesity end points for improved in vivo preclinical study design.

    Science.gov (United States)

    Selimkhanov, J; Thompson, W C; Guo, J; Hall, K D; Musante, C J

    2017-08-01

    The design of well-powered in vivo preclinical studies is a key element in building the knowledge of disease physiology for the purpose of identifying and effectively testing potential antiobesity drug targets. However, as a result of the complexity of the obese phenotype, there is limited understanding of the variability within and between study animals of macroscopic end points such as food intake and body composition. This, combined with limitations inherent in the measurement of certain end points, presents challenges to study design that can have significant consequences for an antiobesity program. Here, we analyze a large, longitudinal study of mouse food intake and body composition during diet perturbation to quantify the variability and interaction of the key metabolic end points. To demonstrate how conclusions can change as a function of study size, we show that a simulated preclinical study properly powered for one end point may lead to false conclusions based on secondary end points. We then propose the guidelines for end point selection and study size estimation under different conditions to facilitate proper power calculation for a more successful in vivo study design.
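The power calculations the authors advocate can be approximated by simulation: draw many synthetic two-group studies at a candidate sample size and count how often the end-point comparison reaches significance. The sketch below assumes a simple two-sample t-test end point with normal variation; the real end points (food intake, body composition) have richer variance structure.

```python
import numpy as np
from scipy import stats

def simulated_power(n_per_group, effect, sd, alpha=0.05, n_sim=2000, seed=0):
    """Fraction of simulated studies detecting a true group difference.

    effect : true mean difference between treated and control groups
    sd     : common within-group standard deviation (assumed normal)
    """
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sim):
        control = rng.normal(0.0, sd, n_per_group)
        treated = rng.normal(effect, sd, n_per_group)
        if stats.ttest_ind(control, treated).pvalue < alpha:
            hits += 1
    return hits / n_sim
```

Running this over a grid of sample sizes for each candidate end point shows directly how a study powered for one end point can be badly underpowered for another with larger variability.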

  9. Analysis of the Neutron Generator and Target for the LSDTS System

    Energy Technology Data Exchange (ETDEWEB)

    Park, Chang Je; Lee, Yong Deok; Song, Jae Hoon; Song, Kee Chan [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2008-11-15

A preliminary analysis based on the literature and patents was performed for the neutron generators and targets of the lead slowing-down time spectrometer (LSDTS) system. It was found that a local neutron generator did not exhibit sufficient neutron intensity, i.e. 1E+12 n/s, which is the minimum requirement for the LSDTS system to overcome curium backgrounds. However, a neutron generator implemented with an electron accelerator may provide a higher intensity, around 1E+13 n/s, and further investigation including a detailed analysis is required. In addition to the neutron generator, a study on the target was performed with Monte Carlo simulation. In the study, an optimal target design was suggested to provide a high neutron yield and better thermal resistance. The suggested target consists of several cylindrical plates of increasing thickness and radius, separated by cooling gaps.

  10. Hierarchical spatial point process analysis for a plant community with high biodiversity

    DEFF Research Database (Denmark)

    Illian, Janine B.; Møller, Jesper; Waagepetersen, Rasmus

    2009-01-01

    A complex multivariate spatial point pattern of a plant community with high biodiversity is modelled using a hierarchical multivariate point process model. In the model, interactions between plants with different post-fire regeneration strategies are of key interest. We consider initially a maxim...

  11. Design of glass-ceramic complex microstructure with using onset point of crystallization in differential thermal analysis

    International Nuclear Information System (INIS)

    Hwang, Seongjin; Kim, Jinho; Shin, Hyo-Soon; Kim, Jong-Hee; Kim, Hyungsun

    2008-01-01

    Two types of frits with different compositions were used to develop a high strength substrate in electronic packaging using a low temperature co-fired ceramic process. In order to reveal the crystallization stage during heating to approximately 900 deg. C, a glass-ceramic consisting of the two types of frits, which had been crystallized to diopside and anorthite after firing, was tested at different mixing ratios of the frits. The exothermal peaks deconvoluted by a Gauss function in the differential thermal analysis curves were used to determine the onset point of crystallization of diopside or anorthite. The onset points of crystallization were affected by the mixing ratio of the frits, and the microstructure of the glass-ceramic depended on the onset point of crystallization. It was found that when multicrystalline phases appear in the microstructure, the resulting complex microstructure could be predicted from the onset point of crystallization obtained by differential thermal analysis
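The Gaussian deconvolution of overlapping DTA exotherms described above can be sketched as a least-squares fit of a sum of two Gaussians; the recovered peak centres (with, e.g., centre minus ~2σ as a crude onset estimate) then locate each crystallization event. The data below are synthetic and noise-free, for illustration only; the peak temperatures are assumed, not the paper's measured values.

```python
import numpy as np
from scipy.optimize import curve_fit

def two_gaussians(t, a1, m1, s1, a2, m2, s2):
    """Sum of two Gaussian exotherms as a function of temperature t."""
    return (a1 * np.exp(-0.5 * ((t - m1) / s1) ** 2)
            + a2 * np.exp(-0.5 * ((t - m2) / s2) ** 2))

# Synthetic DTA curve: one peak near 820 and one near 880 (arbitrary units)
t = np.linspace(700.0, 950.0, 500)
y = two_gaussians(t, 1.0, 820.0, 12.0, 0.6, 880.0, 15.0)

# Fit the overlapping peaks; p0 is a rough initial guess
popt, _ = curve_fit(two_gaussians, t, y, p0=[1, 810, 10, 0.5, 890, 10])
```

In practice the same fit is applied to the measured DTA curve, and shifts in the recovered centres as the frit mixing ratio changes track the onset of diopside versus anorthite crystallization.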

  12. Target preparation and neutron activation analysis a successful story at IRMM

    CERN Document Server

    Robouch, P; Eguskiza, M; Maguregui, M I; Pommé, S; Ingelbrecht, C

    2002-01-01

    The main task of a target producer is to make well characterized and homogeneous deposits on specific supports. Alpha and/or gamma spectrometry are traditionally used to monitor the quality of actinide deposits. With the increasing demand for enriched stable isotope targets, other analytical techniques, such as ICP-MS and NAA, are needed. This paper presents the application of neutron activation analysis to quality control of 'thin' targets, 'thicker' neutron dosimeters and 'thick' bronze disks prepared by the Reference Materials Unit at the Institute of Reference Materials and Measurements.

  13. Target preparation and neutron activation analysis: a successful story at IRMM

    International Nuclear Information System (INIS)

    Robouch, P.; Arana, G.; Eguskiza, M.; Maguregui, M.I.; Pomme, S.; Ingelbrecht, C.

    2002-01-01

    The main task of a target producer is to make well characterized and homogeneous deposits on specific supports. Alpha and/or gamma spectrometry are traditionally used to monitor the quality of actinide deposits. With the increasing demand for enriched stable isotope targets, other analytical techniques, such as ICP-MS and NAA, are needed. This paper presents the application of neutron activation analysis to quality control of 'thin' targets, 'thicker' neutron dosimeters and 'thick' bronze disks prepared by the Reference Materials Unit at the Institute of Reference Materials and Measurements

  14. Performance Analysis of Free-Space Optical Links Over Malaga (M) Turbulence Channels with Pointing Errors

    KAUST Repository

    Ansari, Imran Shafique; Yilmaz, Ferkan; Alouini, Mohamed-Slim

    2015-01-01

In this work, we present a unified performance analysis of a free-space optical (FSO) link that accounts for pointing errors and both types of detection techniques (i.e. intensity modulation/direct detection (IM/DD) as well as heterodyne detection). More specifically, we present unified exact closed-form expressions for the cumulative distribution function, the probability density function, the moment generating function, and the moments of the end-to-end signal-to-noise ratio (SNR) of a single-link FSO transmission system, all in terms of the Meijer’s G function except for the moments, which are in terms of simple elementary functions. We then capitalize on these unified results to offer unified exact closed-form expressions for various performance metrics of FSO link transmission systems, such as the outage probability, the scintillation index (SI), the average error rate for binary and M-ary modulation schemes, and the ergodic capacity (except for the IM/DD technique, where we present closed-form lower bound results), all in terms of Meijer’s G functions except for the SI, which is in terms of simple elementary functions. Additionally, we derive asymptotic results for all the expressions derived earlier in terms of the Meijer’s G function in the high-SNR regime in terms of simple elementary functions via an asymptotic expansion of the Meijer’s G function. We also derive new asymptotic expressions for the ergodic capacity in the low- as well as high-SNR regimes in terms of simple elementary functions by utilizing moments. All the presented results are verified via computer-based Monte-Carlo simulations.
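Closed-form expressions in terms of the Meijer G function can be evaluated numerically, e.g. with mpmath's `meijerg`, which groups the upper and lower parameters into front/back lists. As a sanity check of that machinery (this is a textbook identity, not one of the paper's expressions): G^{1,0}_{0,1}(z | -; 0) = e^{-z}.

```python
from mpmath import mp, meijerg, exp

mp.dps = 30  # working precision in decimal digits

# G^{1,0}_{0,1}(z | -; 0): empty "a" parameter lists, b = [0] in front
z = 1.0
val = meijerg([[], []], [[0], []], z)
# val should equal exp(-z) to working precision
```

The same call pattern evaluates the paper's CDF and error-rate expressions once their specific parameter lists are substituted.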

  15. New approach to accuracy verification of 3D surface models: An analysis of point cloud coordinates.

    Science.gov (United States)

    Lee, Wan-Sun; Park, Jong-Kyoung; Kim, Ji-Hwan; Kim, Hae-Young; Kim, Woong-Chul; Yu, Chin-Ho

    2016-04-01

The precision of two types of surface digitization devices, i.e., a contact probe scanner and an optical scanner, and the trueness of two types of stone replicas, i.e., one without an imaging powder (SR/NP) and one with an imaging powder (SR/P), were evaluated using a computer-aided analysis. A master die was fabricated from stainless steel. Ten impressions were taken, and ten stone replicas were prepared from Type IV stone (Fujirock EP, GC, Leuven, Belgium). The precision of the two types of scanners was analyzed using the root mean square (RMS), measurement error (ME), and limits of agreement (LoA) at each coordinate. The trueness of the stone replicas was evaluated using the total deviation. A Student's t-test was applied to compare the discrepancies between the CAD-reference-models of the master die (m-CRM) and point clouds for the two types of stone replicas (α = .05). The RMS values for the precision were 1.58, 1.28, and 0.98 μm along the x-, y-, and z-axes for the contact probe scanner and 1.97, 1.32, and 1.33 μm along the x-, y-, and z-axes for the optical scanner, respectively. A comparison with the m-CRM revealed a trueness of 7.10 μm for SR/NP and 8.65 μm for SR/P. The precision at each coordinate (x-, y-, and z-axes) was revealed to be higher than that assessed by the previous method (overall offset differences). A comparison between the m-CRM and 3D surface models of the stone replicas revealed a greater dimensional change in SR/P than in SR/NP. Copyright © 2015 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.
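The per-axis RMS figures above come from coordinate-wise deviations between matched point clouds. A minimal sketch with hypothetical arrays (the real analysis also requires registering the scanned cloud to the reference model first):

```python
import numpy as np

def per_axis_rms(reference, measured):
    """RMS deviation along x, y, z between matched (N x 3) point clouds."""
    d = np.asarray(measured, dtype=float) - np.asarray(reference, dtype=float)
    return np.sqrt(np.mean(d ** 2, axis=0))

# Two matched points deviating only along x:
ref = np.zeros((2, 3))
meas = np.array([[1.0, 0.0, 0.0], [-1.0, 0.0, 0.0]])
```

Reporting RMS per coordinate, as here, exposes axis-dependent scanner error that a single overall offset figure averages away.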

  16. Flux balance analysis of ammonia assimilation network in E. coli predicts preferred regulation point.

    Science.gov (United States)

    Wang, Lu; Lai, Luhua; Ouyang, Qi; Tang, Chao

    2011-01-25

Nitrogen assimilation is a critical biological process for the synthesis of biomolecules in Escherichia coli. The central ammonium assimilation network in E. coli converts carbon skeleton α-ketoglutarate and ammonium into glutamate and glutamine, which further serve as nitrogen donors for nitrogen metabolism in the cell. This reaction network involves three enzymes: glutamate dehydrogenase (GDH), glutamine synthetase (GS) and glutamate synthase (GOGAT). In minimal media, E. coli tries to maintain an optimal growth rate by regulating the activity of the enzymes to match the availability of the external ammonia. The molecular mechanism and the strategy of the regulation in this network have been the research topics for many investigators. In this paper, we develop a flux balance model for the nitrogen metabolism, taking into account the cellular composition and biosynthetic requirements for nitrogen. The model agrees well with known experimental results. Specifically, it reproduces all the (15)N isotope labeling experiments in the wild type and the two mutant (ΔGDH and ΔGOGAT) strains of E. coli. Furthermore, the predicted catalytic activities of GDH, GS and GOGAT in different ammonium concentrations and growth rates for the wild type, ΔGDH and ΔGOGAT strains agree well with the enzyme concentrations obtained from western blots. Based on this flux balance model, we show that GS is the preferred regulation point among the three enzymes in the nitrogen assimilation network. Our analysis reveals the pattern of regulation in this central and highly regulated network, thus providing insights into the regulation strategy adopted by the bacteria. Our model and methods may also be useful in future investigations in this and other networks.
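Flux balance analysis of the kind described reduces to a linear program: maximize an objective flux subject to steady-state mass balance S·v = 0 and capacity bounds on each reaction. A toy three-reaction chain (uptake → assimilation → biomass), not the paper's E. coli network, solved with scipy:

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix: rows = metabolites A, B; columns = reactions
#   R0: uptake -> A    R1: A -> B (assimilation)    R2: B -> biomass
S = np.array([[1.0, -1.0,  0.0],
              [0.0,  1.0, -1.0]])

bounds = [(0, 10), (0, None), (0, None)]  # uptake capped at 10 flux units
c = [0.0, 0.0, -1.0]  # maximize biomass flux v2 (linprog minimizes c.v)

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
# At the optimum every flux equals the uptake cap: v = [10, 10, 10]
```

In a real FBA model the columns would be the GDH, GS and GOGAT reactions among hundreds of others, and the biomass column would encode the measured nitrogen composition of the cell.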

  17. Screening of point mutations by multiple SSCP analysis in the dystrophin gene

    Energy Technology Data Exchange (ETDEWEB)

    Lasa, A.; Baiget, M.; Gallano, P. [Hospital Sant Pau, Barcelona (Spain)

    1994-09-01

Duchenne muscular dystrophy (DMD) is a lethal, X-linked neuromuscular disorder. The population frequency of DMD is one in approximately 3500 boys, of which one third are thought to be new mutants. The DMD gene is the largest known to date, spanning 2.3 Mb in band Xp21.2; 79 exons are transcribed into a 14 kb mRNA coding for a protein of 427 kDa which has been named dystrophin. It has been shown that about 65% of affected boys have a gene deletion with a wide variation in localization and size. The remaining affected individuals, who have no detectable deletions or duplications, probably carry more subtle mutations that are difficult to detect. These mutations occur in several different exons and seem to be unique to single patients. Their identification represents a formidable goal because of the large size and complexity of the dystrophin gene. SSCP is a very efficient method for the detection of point mutations if the parameters that affect the separation of the strands are optimized for a particular DNA fragment. Multiple SSCP allows the simultaneous study of several exons and implies the use of different conditions, because no single set of conditions will be optimal for all fragments. Seventy-eight DMD patients with no deletion or duplication in the dystrophin gene were selected for the multiple SSCP analysis. Genomic DNA from these patients was amplified using the primers described for the diagnosis procedure (muscle promoter and exons 3, 8, 12, 16, 17, 19, 32, 45, 48 and 51). We have observed different mobility shifts in bands corresponding to exons 8, 12, 43 and 51. In exons 17 and 45, altered electrophoretic patterns were found in different samples, identifying polymorphisms already described.

  18. Performance Analysis of Free-Space Optical Links Over Malaga (M) Turbulence Channels with Pointing Errors

    KAUST Repository

    Ansari, Imran Shafique

    2015-08-12

    In this work, we present a unified performance analysis of a free-space optical (FSO) link that accounts for pointing errors and both types of detection techniques (i.e. intensity modulation/direct detection (IM/DD) as well as heterodyne detection). More specifically, we present unified exact closed-form expressions for the cumulative distribution function, the probability density function, the moment generating function, and the moments of the end-to-end signal-to-noise ratio (SNR) of a single-link FSO transmission system, all in terms of the Meijer's G function except for the moments, which are in terms of simple elementary functions. We then capitalize on these unified results to offer unified exact closed-form expressions for various performance metrics of FSO link transmission systems, such as the outage probability, the scintillation index (SI), the average error rate for binary and M-ary modulation schemes, and the ergodic capacity (except for the IM/DD technique, where we present closed-form lower-bound results), all in terms of Meijer's G functions except for the SI, which is in terms of simple elementary functions. Additionally, for all the expressions derived earlier in terms of the Meijer's G function, we derive asymptotic results in the high-SNR regime in terms of simple elementary functions via an asymptotic expansion of the Meijer's G function. We also derive new asymptotic expressions for the ergodic capacity in the low- as well as high-SNR regimes in terms of simple elementary functions by utilizing moments. All the presented results are verified via computer-based Monte-Carlo simulations.
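
The simulation-based verification mentioned above can be illustrated with a toy Monte-Carlo outage-probability check. As an assumption for this sketch, a simplified log-normal turbulence model (Gaussian SNR in dB) stands in for the Málaga model analyzed in the paper, so the closed-form reference is just the Gaussian CDF rather than a Meijer's G expression:

```python
import math
import random

def outage_prob_mc(avg_snr_db, thresh_db, sigma_db, n=200_000, seed=7):
    """Monte-Carlo estimate of the outage probability P(SNR < threshold)
    when the SNR in dB is Gaussian, i.e. a log-normal turbulence model
    (a simplified stand-in for the Malaga model treated in the paper)."""
    rng = random.Random(seed)
    return sum(rng.gauss(avg_snr_db, sigma_db) < thresh_db for _ in range(n)) / n

def outage_prob_exact(avg_snr_db, thresh_db, sigma_db):
    """Closed-form counterpart for this toy model: Gaussian CDF at the threshold."""
    z = (thresh_db - avg_snr_db) / (sigma_db * math.sqrt(2.0))
    return 0.5 * (1.0 + math.erf(z))

# 20 dB average SNR, 10 dB outage threshold, 5 dB turbulence spread
p_mc = outage_prob_mc(20.0, 10.0, 5.0)
p_th = outage_prob_exact(20.0, 10.0, 5.0)
```

With 200,000 trials the Monte-Carlo estimate agrees with the closed form to within a few parts per thousand, which is the same consistency check the paper performs for its exact expressions.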

  19. Point-of-sale tobacco promotion and youth smoking: a meta-analysis.

    Science.gov (United States)

    Robertson, Lindsay; Cameron, Claire; McGee, Rob; Marsh, Louise; Hoek, Janet

    2016-12-01

    Previous systematic reviews have found consistent evidence of a positive association between exposure to point-of-sale (POS) tobacco promotion and increased smoking and smoking susceptibility among children and adolescents. No meta-analysis has been conducted on these studies to date. Systematic literature searches were carried out to identify all quantitative observational studies that examined the relationship between POS tobacco promotion and individual-level smoking and smoking-related cognitions among children and adolescents, published between January 1990 and June 2014. Random-effects meta-analyses were used. Subgroup analyses were conducted according to extent of tobacco POS advertising environment in the study environment. Sensitivity analyses were performed according to study size and quality. 13 studies met the inclusion criteria; 11 reported data for behavioural outcomes, 6 for cognitive outcomes (each of these assessed smoking susceptibility). The studies were cross-sectional, with the exception of 2 cohort studies. For the behavioural outcomes, the pooled OR was 1.61 (95% CI 1.33 to 1.96) and for smoking susceptibility the pooled OR was 1.32 (95% CI 1.09 to 1.61). Children and adolescents more frequently exposed to POS tobacco promotion have around 1.6 times higher odds of having tried smoking and around 1.3 times higher odds of being susceptible to future smoking, compared with those less frequently exposed. Together with the available evaluations of POS display bans, the results strongly indicate that legislation banning tobacco POS promotion will effectively reduce smoking among young people. Published by the BMJ Publishing Group Limited.
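
The random-effects pooling behind odds ratios like those above can be sketched with the DerSimonian-Laird method. The 2x2 tables below are invented for illustration and are not the review's data:

```python
import math

def pooled_or_random_effects(tables):
    """DerSimonian-Laird random-effects pooled odds ratio with 95% CI.
    tables: list of 2x2 counts (a, b, c, d) =
    (exposed cases, exposed non-cases, unexposed cases, unexposed non-cases)."""
    logs, variances = [], []
    for a, b, c, d in tables:
        logs.append(math.log((a * d) / (b * c)))      # log odds ratio
        variances.append(1/a + 1/b + 1/c + 1/d)       # Woolf variance
    w = [1.0 / v for v in variances]
    ybar = sum(wi * yi for wi, yi in zip(w, logs)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, logs))   # Cochran's Q
    df = len(tables) - 1
    c_term = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c_term)                # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    mu = sum(wi * yi for wi, yi in zip(w_star, logs)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return math.exp(mu), math.exp(mu - 1.96 * se), math.exp(mu + 1.96 * se)

# three hypothetical studies, each showing more smoking among the more-exposed
or_pooled, ci_lo, ci_hi = pooled_or_random_effects(
    [(60, 40, 40, 60), (55, 45, 45, 55), (70, 50, 50, 70)])
```

The same machinery, applied to the 11 behavioural studies in the review, yields the pooled OR of 1.61 reported above.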

  20. Flux balance analysis of ammonia assimilation network in E. coli predicts preferred regulation point.

    Directory of Open Access Journals (Sweden)

    Lu Wang

    Full Text Available Nitrogen assimilation is a critical biological process for the synthesis of biomolecules in Escherichia coli. The central ammonium assimilation network in E. coli converts the carbon skeleton α-ketoglutarate and ammonium into glutamate and glutamine, which further serve as nitrogen donors for nitrogen metabolism in the cell. This reaction network involves three enzymes: glutamate dehydrogenase (GDH), glutamine synthetase (GS) and glutamate synthase (GOGAT). In minimal media, E. coli tries to maintain an optimal growth rate by regulating the activity of the enzymes to match the availability of the external ammonia. The molecular mechanism and the strategy of the regulation in this network have been research topics for many investigators. In this paper, we develop a flux balance model for the nitrogen metabolism, taking into account the cellular composition and biosynthetic requirements for nitrogen. The model agrees well with known experimental results. Specifically, it reproduces all the (15)N isotope labeling experiments in the wild type and the two mutant (ΔGDH and ΔGOGAT) strains of E. coli. Furthermore, the predicted catalytic activities of GDH, GS and GOGAT at different ammonium concentrations and growth rates for the wild type, ΔGDH and ΔGOGAT strains agree well with the enzyme concentrations obtained from western blots. Based on this flux balance model, we show that GS is the preferred regulation point among the three enzymes in the nitrogen assimilation network. Our analysis reveals the pattern of regulation in this central and highly regulated network, thus providing insights into the regulation strategy adopted by the bacteria. Our model and methods may also be useful in future investigations of this and other networks.
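
The flux-balance reasoning can be sketched with a toy version of the three-enzyme network. All numbers below are illustrative assumptions, not the paper's fitted model: the cap on GDH flux stands in for GDH's low ammonium affinity at low external ammonium, and minimizing the ATP-consuming GS flux stands in for the model's optimality criterion:

```python
def assimilation_fluxes(d_glu, d_gln, gdh_cap):
    """Toy flux balance for GDH / GS / GOGAT.
    Reactions:  GDH: akg + NH4 -> glu
                GS:  glu + NH4 (+ ATP) -> gln
                GOGAT: gln + akg -> 2 glu
    Steady state with biosynthetic demands d_glu, d_gln:
        glu:  v_gdh + 2*v_gogat - v_gs = d_glu
        gln:  v_gs - v_gogat = d_gln
    Scan the free flux v_gogat and return the feasible flux set that
    minimizes the ATP-consuming GS flux, subject to v_gdh <= gdh_cap."""
    best = None
    steps = 1000
    t_max = d_glu + d_gln            # largest v_gogat keeping v_gdh >= 0
    for i in range(steps + 1):
        t = t_max * i / steps        # v_gogat
        v_gs = d_gln + t             # from the gln balance
        v_gdh = d_glu + d_gln - t    # from the glu balance
        if v_gdh < 0 or v_gdh > gdh_cap:
            continue
        if best is None or v_gs < best["v_gs"]:
            best = {"v_gdh": v_gdh, "v_gs": v_gs, "v_gogat": t}
    return best

# ammonium-rich: GDH unconstrained; ammonium-poor: GDH flux capped
high_N = assimilation_fluxes(d_glu=1.0, d_gln=0.2, gdh_cap=10.0)
low_N  = assimilation_fluxes(d_glu=1.0, d_gln=0.2, gdh_cap=0.3)
```

In this toy model the GS flux shifts from covering only the glutamine demand under ammonium excess to carrying most nitrogen assimilation under ammonium limitation, which is consistent with GS being the natural regulation point.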

  1. Malware Analysis: From Large-Scale Data Triage to Targeted Attack Recognition (Dagstuhl Seminar 17281)

    OpenAIRE

    Zennou, Sarah; Debray, Saumya K.; Dullien, Thomas; Lakhothia, Arun

    2018-01-01

    This report summarizes the program and the outcomes of the Dagstuhl Seminar 17281, entitled "Malware Analysis: From Large-Scale Data Triage to Targeted Attack Recognition". The seminar brought together practitioners and researchers from industry and academia to discuss the state of the art in the analysis of malware, from both a big-data perspective and that of fine-grained analysis. Obfuscation was also considered. The meeting created new links within this very diverse community.

  2. Optimization of an Optical Inspection System Based on the Taguchi Method for Quantitative Analysis of Point-of-Care Testing

    Directory of Open Access Journals (Sweden)

    Chia-Hsien Yeh

    2014-09-01

    Full Text Available This study presents an optical inspection system for detecting a commercial point-of-care testing product and a new detection model covering from qualitative to quantitative analysis. Human chorionic gonadotropin (hCG) strips (the cut-off value of the hCG commercial product is 25 mIU/mL) were the detection target in our study. We used a complementary metal-oxide semiconductor (CMOS) sensor to detect the colors of the test line and control line in the specific strips and to reduce the observation errors by the naked eye. To achieve better linearity between the grayscale and the concentration, and to decrease the standard deviation (increase the signal-to-noise ratio, S/N), the Taguchi method was used to find the optimal parameters for the optical inspection system. The pregnancy test used the principles of the lateral flow immunoassay, and the colors of the test and control line were caused by the gold nanoparticles. Because of the sandwich immunoassay model, the color of the gold nanoparticles in the test line was darkened by increasing the hCG concentration. As the results reveal, the S/N increased from 43.48 dB to 53.38 dB, and the hCG concentration detection increased from 6.25 to 50 mIU/mL with a standard deviation of less than 10%. With the optimal parameters to decrease the detection limit and to increase the linearity determined by the Taguchi method, the optical inspection system can be applied to various commercial rapid tests for the detection of ketamine, troponin I, and fatty acid binding protein (FABP).
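
The Taguchi S/N ratios used to score parameter settings like those above have simple closed forms. A minimal sketch, with invented replicate measurements:

```python
import math

def sn_larger_is_better(ys):
    """Taguchi larger-the-better S/N ratio in dB:
    S/N = -10 * log10( mean(1 / y_i^2) )."""
    return -10.0 * math.log10(sum(1.0 / (y * y) for y in ys) / len(ys))

def sn_nominal_is_best(ys):
    """Taguchi nominal-the-best S/N ratio in dB:
    S/N = 10 * log10( ybar^2 / s^2 ), rewarding low replicate scatter."""
    n = len(ys)
    ybar = sum(ys) / n
    s2 = sum((y - ybar) ** 2 for y in ys) / (n - 1)
    return 10.0 * math.log10(ybar * ybar / s2)

# hypothetical grayscale readings: a tight run scores higher than a noisy one
tight = sn_nominal_is_best([100, 101, 99, 100])
noisy = sn_nominal_is_best([100, 110, 90, 100])
```

In a Taguchi design, each row of the orthogonal array gets such an S/N score, and the level of each factor that maximizes the average S/N is selected, which is how the reported gain from 43.48 dB to 53.38 dB is obtained.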

  3. Meta-Analysis of PECS with Individuals with ASD: Investigation of Targeted versus Non-Targeted Outcomes, Participant Characteristics, and Implementation Phase

    Science.gov (United States)

    Ganz, Jennifer B.; Davis, John L.; Lund, Emily M.; Goodwyn, Fara D.; Simpson, Richard L.

    2012-01-01

    The Picture Exchange Communication System (PECS) is a widely used picture/icon aided augmentative communication system designed for learners with autism and other developmental disorders. This meta-analysis analyzes the extant empirical literature for PECS relative to targeted (functional communication) and non-targeted concomitant outcomes…

  4. Acupuncture-Point Stimulation for Postoperative Pain Control: A Systematic Review and Meta-Analysis of Randomized Controlled Trials

    Directory of Open Access Journals (Sweden)

    Xian-Liang Liu

    2015-01-01

    Full Text Available The purpose of this study was to evaluate the effectiveness of acupuncture-point stimulation (APS) in postoperative pain control compared with sham/placebo acupuncture or standard treatments (usual care or no treatment). Only randomized controlled trials (RCTs) were included. Meta-analysis results indicated that APS interventions improved VAS scores significantly and also reduced total morphine consumption. No serious APS-related adverse effects (AEs) were reported. There is Level I evidence for the effectiveness of body points plaster therapy and Level II evidence for body points electroacupuncture (EA), body points acupressure, body points APS for abdominal surgery patients, auricular points seed embedding, manual auricular acupuncture, and auricular EA. We obtained Level III evidence for body points APS in patients who underwent cardiac surgery and cesarean section and for auricular-point stimulation in patients who underwent abdominal surgery. There is insufficient evidence to conclude that APS is an effective postoperative pain therapy in surgical patients, although the evidence does support the conclusion that APS can reduce analgesic requirements without AEs. The best level of evidence was not adequate in most subgroups. Some limitations of this study may have affected the results, possibly leading to an overestimation of APS effects.

  5. A SWOT Analysis of the Nabucco Pipeline from Romania’s Point of View

    Directory of Open Access Journals (Sweden)

    Mariana Papatulica

    2009-07-01

    Full Text Available European Union energy sources are supposed to be sufficient to cover the expected growth of natural gas demand for the coming decades, but there is not enough infrastructure to transport these volumes of gas to European markets. Arbitrary interruptions of Russian gas deliveries towards Europe, the delays in the rehabilitation of its obsolete pipeline network and the interdiction of direct Asian gas exports transiting through Russian transport infrastructure made it strictly necessary for European countries to diversify their portfolio of gas suppliers by avoiding Russian territory. The Nabucco pipeline was conceived as an alternative to European Union countries' high dependence on Russian gas (about 40% of their consumption is provided by Russia), by connecting European Union countries directly to the huge natural gas resources of Central Asia, on the route Turkey – Bulgaria – Romania – Hungary – Austria. The purpose of this paper is to make a SWOT analysis of this project, highlighting its strengths and weaknesses from Romania's point of view, as well as the opportunities and threats as external factors. The main idea resulting from the analysis is that the strengths prevail for Romania. Realizing this project will ensure the diversification of gas sources and the development of competitive markets, which can entail price reductions. It is supposed to be a fair and advantageous option, economically reliable, that will reduce dependence on deliveries of gas from a single source – Russia – ensuring two undeniable prerequisites: accessibility (to new supply sources) and availability (which refers to guarantees of the long-term sustainability of gas deliveries). The project implementation will allow energy to help establish new structural links between the EU, Turkey and the Caspian Sea states and will ensure cross-border cooperation possibilities inside some euro-regions already constituted, by accessing regional development

  6. Quantitative functional analysis of Late Glacial projectile points from northern Europe

    DEFF Research Database (Denmark)

    Dev, Satya; Riede, Felix

    2012-01-01

    This paper discusses the function of Late Glacial arch-backed and tanged projectile points from northern Europe in general and southern Scandinavia in particular. Ballistic requirements place clear and fairly well understood constraints on the design of projectile points. We outline the argument...... surely fully serviceable, diverged considerably from the functional optimum predicated by ballistic theory. These observations relate directly to southern Scandinavian Late Glacial culture-history which is marked by a sequence of co-occurrence of arch-backed and large tanged points in the earlier part...

  7. Genome-Wide Analysis of miRNA targets in Brachypodium and Biomass Energy Crops

    Energy Technology Data Exchange (ETDEWEB)

    Green, Pamela J. [Univ. of Delaware, Newark, DE (United States)

    2015-08-11

    MicroRNAs (miRNAs) contribute to the control of numerous biological processes through the regulation of specific target mRNAs. Although the identities of these targets are essential to elucidate miRNA function, the targets are much more difficult to identify than the small RNAs themselves. Before this work, we pioneered the genome-wide identification of the targets of Arabidopsis miRNAs using an approach called PARE (German et al., Nature Biotech. 2008; Nature Protocols, 2009). Under this project, we applied PARE to Brachypodium distachyon (Brachypodium), a model plant in the Poaceae family, which includes the major food grain and bioenergy crops. Through in-depth global analysis and examination of specific examples, this research greatly expanded our knowledge of miRNAs and target RNAs of Brachypodium. New regulation in response to environmental stress or tissue type was found, and many new miRNAs were discovered. More than 260 targets of new and known miRNAs with PARE sequences at the precise sites of miRNA-guided cleavage were identified and characterized. Combining PARE data with the small RNA data also identified the miRNAs responsible for initiating approximately 500 phased loci, including one of the novel miRNAs. PARE analysis also revealed that differentially expressed miRNAs in the same family guide specific target RNA cleavage in a correspondingly tissue-preferential manner. The project included generation of small RNA and PARE resources for bioenergy crops, to facilitate ongoing discovery of conserved miRNA-target RNA regulation. By associating specific miRNA-target RNA pairs with known physiological functions, the research provides insights about gene regulation in different tissues and in response to environmental stress. This, and release of new PARE and small RNA data sets should contribute basic knowledge to enhance breeding and may suggest new strategies for improvement of biomass energy crops.

  8. Bioinformatic analysis of xenobiotic reactive metabolite target proteins and their interacting partners

    Directory of Open Access Journals (Sweden)

    Hanzlik Robert P

    2009-06-01

    Full Text Available Abstract Background Protein covalent binding by reactive metabolites of drugs, chemicals and natural products can lead to acute cytotoxicity. Recent rapid progress in reactive metabolite target protein identification has shown that adduction is surprisingly selective, and inspired the hope that analysis of target proteins might reveal protein factors that differentiate target vs. non-target proteins and illuminate mechanisms connecting covalent binding to cytotoxicity. Results Sorting 171 known reactive metabolite target proteins revealed a number of GO categories and KEGG pathways to be significantly enriched in targets, but in most cases the classes were too large, and the "percent coverage" too small, to allow meaningful conclusions about mechanisms of toxicity. However, a similar analysis of the directly interacting partners of 28 common targets of multiple reactive metabolites revealed highly significant enrichments in terms likely to be highly relevant to cytotoxicity (e.g., MAP kinase pathways, apoptosis, response to unfolded protein). Machine learning was used to rank the contribution of 211 computed protein features to determining protein susceptibility to adduction. Protein lysine (but not cysteine) content and protein instability index (i.e., rate of turnover in vivo) were among the features most important to determining susceptibility. Conclusion As yet there is no good explanation for why some low-abundance proteins become heavily adducted while some abundant proteins become only lightly adducted in vivo. Analyzing the directly interacting partners of target proteins appears to yield greater insight into mechanisms of toxicity than analyzing target proteins per se. The insights provided can readily be formulated as hypotheses to test in future experimental studies.

  9. Numerical analysis of free surface instabilities in the IFMIF lithium target

    International Nuclear Information System (INIS)

    Gordeev, S.; Heinzel, V.; Moeslang, A.

    2007-01-01

    The International Fusion Materials Irradiation Facility (IFMIF) uses a high-speed (10-20 m/s) lithium (Li) jet flow as a target for two 40 MeV/125 mA deuteron beams. The major function of the Li target is to provide a stable Li jet for the production of an intense neutron flux. To understand the lithium jet behaviour and eliminate free-surface flow instabilities, a detailed analysis of the Li jet flow is necessary. Different kinds of instability mechanisms in the liquid jet flow have been evaluated and classified based on analytical and experimental data. Numerical investigations of the target free-surface flow have been performed. Previous numerical investigations have shown in principle the suitability of the CFD code Star-CD for the simulation of the Li-target flow. The main objective of this study is a detailed numerical analysis of instabilities in the Li-jet flow caused by boundary-layer relaxation near the nozzle exit, transition to turbulent flow and back-wall curvature. A number of CFD models are developed to investigate the formation of instabilities on the target surface. Turbulence models are validated on the experimental data. Experimental observations have shown that a change of the nozzle geometry at the outlet, such as a slight divergence of the nozzle surfaces or nozzle edge defects, causes flow separation and the occurrence of longitudinal periodic structures on the free surface with an amplitude of up to 5 mm. Target surface fluctuations of this magnitude can lead to penetration of the deuteron beam into the target structure and cause local overheating of the back plate. An analysis of large instabilities in the Li-target flow, combined with the heat distribution in lithium depending on the free-surface shape, is performed in this study. (orig.)

  10. Numerical analysis of slowest heating or cooling point in a canned food in oil

    Energy Technology Data Exchange (ETDEWEB)

    Hanzawa, T.; Wang, Q.; Suzuki, M.; Sakai, N. [Tokyo Univ. of Fisheries (Japan)

    1998-06-01

    In the sterilizing process of canned food in oil for a fish meat such as tunny, the slowest heating or cooling point is very important for the thermal process determination of the can. To obtain the slowest point, the temperature profiles in the solid food are estimated by numerical calculation from the fundamental equations at unsteady state, taking into consideration the free convection in the space occupied by the oil. The positions of the slowest heating or cooling point in the canned food in oil are obtained accurately, and a correlative equation for the position is obtained numerically under various operating conditions. The calculated temperature profiles and the positions of both slowest points agree well with the experimental ones. 4 refs., 9 figs.
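
The idea of locating the slowest heating point numerically can be sketched with a pure-conduction finite-difference model. This is a deliberately simplified assumption: the paper also models free convection in the oil, which shifts the slowest point away from the geometric centre, whereas conduction alone leaves it at the centre:

```python
def slowest_heating_point(nx=21, ny=21, steps=400, alpha=0.2):
    """Explicit finite-difference heating of a 2D can cross-section:
    boundary held at dimensionless temperature 1.0 (retort), interior
    starting at 0.0. Returns the grid index of the coldest node, i.e.
    the slowest heating point. alpha is the dimensionless diffusion
    number (must be < 0.25 for stability of the explicit scheme)."""
    T = [[0.0] * ny for _ in range(nx)]
    for i in range(nx):                      # hot boundary, all four sides
        T[i][0] = T[i][ny - 1] = 1.0
    for j in range(ny):
        T[0][j] = T[nx - 1][j] = 1.0
    for _ in range(steps):
        new = [row[:] for row in T]
        for i in range(1, nx - 1):
            for j in range(1, ny - 1):       # 5-point Laplacian update
                new[i][j] = T[i][j] + alpha * (
                    T[i + 1][j] + T[i - 1][j] + T[i][j + 1] + T[i][j - 1] - 4 * T[i][j])
        T = new
    coldest = min((T[i][j], i, j) for i in range(nx) for j in range(ny))
    return coldest[1], coldest[2]
```

With symmetric boundary heating the coldest node is the centre of the grid; adding a convective term for the oil layer, as the paper does, is what moves this point off-centre.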

  11. Do points, levels and leaderboards harm intrinsic motivation? An empirical analysis of common gamification elements

    DEFF Research Database (Denmark)

    Mekler, Elisa D.; Brühlmann, Florian; Opwis, Klaus

    2013-01-01

    It is heavily debated within the gamification community whether specific game elements may actually undermine users' intrinsic motivation. This online experiment examined the effects of three commonly employed game design elements - points, leaderboard, levels - on users' performance, intrinsic...

  12. Thermal Analysis of a Cracked Half-plane under Moving Point Heat Source

    Directory of Open Access Journals (Sweden)

    He Kuanfang

    2017-09-01

    Full Text Available The heat conduction in a half-plane with an insulated crack subjected to a moving point heat source is investigated. The analytical solution and numerical means are combined to analyze the transient temperature distribution of a cracked half-plane under a moving point heat source. The transient temperature distribution of the half-plane structure under the moving point heat source is first obtained by the moving coordinate method; the heat conduction equation with the thermal boundary of an insulated crack face is then reduced to a singular integral equation by applying Fourier transforms and solved numerically. Numerical examples of the temperature distribution on the cracked half-plane structure under a moving point heat source are presented and discussed in detail.
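
For the uncracked body, the moving-coordinate step above has a classical closed form: the quasi-steady Rosenthal solution for a point source moving on a semi-infinite solid. A minimal sketch with illustrative material parameters (the crack itself requires the singular-integral treatment described in the abstract and is not modelled here):

```python
import math

def rosenthal_temp(x, y, z, t, Q=1000.0, k=50.0, alpha=1e-5, v=0.01, T0=20.0):
    """Quasi-steady Rosenthal temperature for a point source of power Q [W]
    moving with speed v [m/s] along x on a semi-infinite solid:
        T = T0 + Q / (2*pi*k*R) * exp(-v*(xi + R) / (2*alpha)),
    where xi = x - v*t is the coordinate ahead of the source and
    R = sqrt(xi^2 + y^2 + z^2). k is conductivity, alpha diffusivity.
    All parameter values are illustrative assumptions."""
    xi = x - v * t
    R = math.sqrt(xi * xi + y * y + z * z)
    return T0 + Q / (2.0 * math.pi * k * R) * math.exp(-v * (xi + R) / (2.0 * alpha))

# at time t = 1 s the source sits at x = 0.01 m; compare equal distances
t_behind = rosenthal_temp(0.009, 0.0, 0.0, t=1.0)   # 1 mm behind the source
t_ahead  = rosenthal_temp(0.011, 0.0, 0.0, t=1.0)   # 1 mm ahead of it
```

The exponential factor makes the field asymmetric: at equal distance, the material behind the moving source is hotter than the material ahead of it, the familiar "comet tail" of moving-source heating.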

  13. COMPARATIVE ANALYSIS OF 3D POINT CLOUDS GENERATED FROM A FREEWARE AND TERRESTRIAL LASER SCANNER

    Directory of Open Access Journals (Sweden)

    K. R. Dayal

    2017-07-01

    Full Text Available In the recent past, several heritage structures have faced destruction due to both human-made incidents and natural calamities, causing a great loss to the human race regarding its cultural achievements. In this context, the importance of documenting such structures to create a substantial database cannot be emphasised enough. The Clock Tower of Dehradun, India is one such structure. There is a lack of sufficient information in the digital domain, which justified the need to carry out this study. Thus, an attempt has been made to gauge the possibilities of using open-source 3D tools such as VSfM to quickly and easily obtain point clouds of an object and assess their quality. The photographs were collected using consumer-grade cameras with reasonable effort to ensure overlap. Sparse and dense reconstruction were carried out to generate a 3D point-cloud model of the tower. A terrestrial laser scanner (TLS) was also used to obtain a point cloud of the tower. The point clouds obtained from the two methods were analyzed to understand the quality of the information present, the TLS-acquired point cloud being the benchmark against which to assess the VSfM point cloud. They were compared to analyze point density and subjected to a plane-fitting test for sample flat portions on the structure. The plane-fitting test revealed the planarity of the point clouds: a Gaussian distribution fit yielded a standard deviation of 0.002 and 0.01 for TLS and VSfM, respectively. For more insight, comparisons with Agisoft Photoscan results were also made.
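
A plane-fitting planarity test like the one described can be sketched as a least-squares fit of z = a*x + b*y + c, with the RMS residual playing the role of the reported standard deviation. The synthetic patch below (a plane plus a small deterministic ripple) is an invented stand-in for a flat wall sample from a scan:

```python
import math

def fit_plane(points):
    """Least-squares fit of z = a*x + b*y + c via the 3x3 normal
    equations (solved by Gaussian elimination with partial pivoting).
    Returns ((a, b, c), rms_residual)."""
    sxx = sxy = sx = syy = sy = n = 0.0
    sxz = syz = sz = 0.0
    for x, y, z in points:
        sxx += x * x; sxy += x * y; sx += x
        syy += y * y; sy += y; n += 1.0
        sxz += x * z; syz += y * z; sz += z
    M = [[sxx, sxy, sx, sxz],
         [sxy, syy, sy, syz],
         [sx,  sy,  n,  sz]]
    for col in range(3):                       # forward elimination
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    p = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):                        # back substitution
        p[r] = (M[r][3] - sum(M[r][c] * p[c] for c in range(r + 1, 3))) / M[r][r]
    a, b, c = p
    rms = math.sqrt(sum((z - (a * x + b * y + c)) ** 2 for x, y, z in points) / n)
    return (a, b, c), rms

# synthetic "flat wall" patch: true plane z = 0.5x - 0.2y + 1 plus 2 mm ripple
pts = [(i * 0.1, j * 0.1,
        0.5 * i * 0.1 - 0.2 * j * 0.1 + 1.0 + 0.002 * math.sin(7 * i + 3 * j))
       for i in range(10) for j in range(10)]
(a, b, c), rms = fit_plane(pts)
```

The fitted coefficients recover the true plane and the residual scatter matches the injected ripple amplitude, which is exactly the kind of number the TLS-vs-VSfM comparison reports.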

  14. Inference of miRNA targets using evolutionary conservation and pathway analysis

    Directory of Open Access Journals (Sweden)

    van Nimwegen Erik

    2007-03-01

    assigns a posterior probability to each putative target site. The results presented here indicate that our general method achieves very good performance in predicting miRNA target sites, providing at the same time insights into the evolution of target sites for individual miRNAs. Moreover, by combining our predictions with pathway analysis, we propose functions of specific miRNAs in nervous system development, inter-cellular communication and cell growth. The complete target site predictions as well as the miRNA/pathway associations are accessible on the ElMMo web server.

  15. Thermal analysis of Ti drive-in target for D-D neutron generation

    International Nuclear Information System (INIS)

    Jung, N.S.; Kim, I.J.; Kim, S.J.; Choi, H.D.

    2008-01-01

    Full text: Thermal analysis was performed for a Ti drive-in target of a D-D neutron generator. Numerical calculation was the only feasible way to obtain the information of the target temperature, since it was very difficult to measure the target temperature during neutron generation due to high voltage being applied to the target. Computational fluid dynamics code CFX-5 was used in this study. In order to define the heat flux term for the thermal analysis, the current profile of the ion beam was measured. The one-dimensional, integrated current profile was measured by using a single slit and a Faraday cup. The measured current profile was transformed into the axially symmetric two-dimensional distribution function by using the Abel inversion, which had the two-dimensional Gaussian function shape. Temperature distribution in the target was calculated at the operating condition. The influence of operational parameters like the ion beam energy, current, coolant mass flow rate and coolant inlet temperature on the target temperature was investigated

  16. Positioning accuracy analysis of adjusting target mechanism of three-dimensional attitude

    International Nuclear Information System (INIS)

    Ma Li; Wang Kun; Sun Linzhi; Zhou Shasha

    2012-01-01

    A novel adjusting target mechanism of three-dimensional attitude is presented according to the characteristics of the target transport subsystem in inertial confinement fusion (ICF). The mechanism consists of a tangent mechanism adjusting the rotation angle and a set of orthogonal tangent mechanisms adjusting two-dimensional deflection angles. The structural parameters of the adjusting target mechanism are analyzed with respect to principle errors, structure errors and motion errors of following. The analysis results indicate that the system error of the adjusting target mechanism is influenced by the displacement of the linear actuators, the actuator ball radius, the working radius of the tangent mechanism, the angle error of the inclined installation hole, the centralization error of the actuators, the orthogonality error of the two tangent mechanisms, and the angle errors of the inclined target rod and inclined rotation shaft. The errors of the inclined target rod and inclined rotation shaft are the two greatest impact factors; the spherical contact error is the next. By means of precise assembly and control-system compensation, the accuracy of the adjusting target mechanism can be kept below 0.1 mrad. (authors)

  17. Comprehensive analysis of Curie-point depths and lithospheric effective elastic thickness at Arctic Region

    Science.gov (United States)

    Lu, Y.; Li, C. F.

    2017-12-01

    The Arctic Ocean remains at the forefront of geological exploration. Here we investigate its deep geological structures and geodynamics on the basis of gravity, magnetic and bathymetric data. We estimate Curie-point depth and lithospheric effective elastic thickness to understand deep geothermal structures and Arctic lithospheric evolution. A fractal exponent of 3.0 for the 3D magnetization model is used in the Curie-point depth inversion. The result shows that Curie-point depths are between 5 and 50 km. Curie depths are mostly small near the active mid-ocean ridges, corresponding well to high heat flow and active shallow volcanism. Large Curie depths are distributed mainly at the continental marginal seas around the Arctic Ocean. We present a map of the effective elastic thickness (Te) of the lithosphere using a multitaper coherence technique; Te is between 5 and 110 km. Te primarily depends on the geothermal gradient and composition, as well as structures in the lithosphere. We find that Te and Curie-point depths are often correlated. Large Te values are distributed mainly in continental regions and small Te values in oceanic regions. The Alpha-Mendeleyev Ridge (AMR) and the Svalbard Archipelago (SA) are symmetrical about the mid-ocean ridge. The AMR and SA were formed before an early stage of Eurasian basin spreading, and they are considered conjugate large igneous provinces, which show small Te and Curie-point depths. The Novaya Zemlya region has large Curie-point depths and small Te; we consider that faults and fractures near the Novaya Zemlya orogenic belt cause the small Te. A series of transform faults connects the Arctic mid-ocean ridge with the North Atlantic mid-ocean ridge. We see large Te near the transform faults, but small Curie-point depths. We consider that although the temperature near the transform faults is high, the lithosphere there is mechanically strengthened.

  18. Scrub typhus point-of-care testing: A systematic review and meta-analysis.

    Directory of Open Access Journals (Sweden)

    Kartika Saraswati

    2018-03-01

    Full Text Available Diagnosing scrub typhus clinically is difficult, hence laboratory tests play a very important role in diagnosis. As performing sophisticated laboratory tests in resource-limited settings is not feasible, accurate point-of-care testing (POCT) for scrub typhus diagnosis would be invaluable for patient diagnosis and management. Here we summarise the existing evidence on the accuracy of scrub typhus POCTs to inform clinical practitioners in resource-limited settings of their diagnostic value. Studies on POCTs which can be feasibly deployed in primary health care or outpatient settings were included. Thirty-one studies were identified through PubMed and manual searches of reference lists. The quality of the studies was assessed with the Quality Assessment of Diagnostic Accuracy Studies 2 (QUADAS-2) tool. About half (n = 14/31) of the included studies were of moderate quality. Meta-analysis showed the pooled sensitivity and specificity of commercially available immunochromatographic tests (ICTs) were 66.0% (95% CI 0.37-0.86) and 92.0% (95% CI 0.83-0.97), respectively. There was a significant and high degree of heterogeneity between the studies (I2 = 97.48%, 95% CI 96.71-98.24 for sensitivity and I2 = 98.17%, 95% CI 97.67-98.67 for specificity). Significant heterogeneity was observed for the total number of samples between studies (p = 0.01), study design (whether using a case-control design or not, p = 0.01), blinding during index test interpretation (p = 0.02), and QUADAS-2 score (p = 0.01). There was significant heterogeneity between the scrub typhus POCT diagnostic accuracy studies examined. Overall, the commercially available scrub typhus ICTs demonstrated better performance when 'ruling in' the diagnosis. There is a need for standardised methods and reporting of diagnostic accuracy to decrease between-study heterogeneity and increase comparability among study results, as well as development of an affordable and accurate antigen-based POCT to tackle the
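
The I2 heterogeneity statistic quoted above derives from Cochran's Q. A minimal sketch with invented per-study estimates (not the review's data):

```python
def heterogeneity(estimates, std_errs):
    """Cochran's Q and Higgins' I^2 (%) from per-study effect estimates
    (e.g. logit-transformed sensitivities) and their standard errors.
    I^2 = max(0, (Q - df) / Q) * 100, the share of total variation
    attributable to between-study differences rather than chance."""
    w = [1.0 / (se * se) for se in std_errs]                    # inverse-variance weights
    ybar = sum(wi * yi for wi, yi in zip(w, estimates)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, estimates))
    df = len(estimates) - 1
    i2 = max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0
    return q, i2

# widely scattered studies -> high I^2, as in the review
q_scattered, i2_scattered = heterogeneity([0.2, 1.5, 0.4, 2.0], [0.1] * 4)
# near-identical studies -> I^2 of zero
q_uniform, i2_uniform = heterogeneity([1.0, 1.01, 0.99], [0.1] * 3)
```

I2 values in the high nineties, as reported for both sensitivity and specificity here, indicate that almost all of the observed spread reflects genuine between-study differences.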

  19. Retrospective analysis of the financial break-even point for intrathecal morphine pump use in Korea.

    Science.gov (United States)

    Kim, Eun Kyoung; Shin, Ji Yeon; Castañeda, Anyela Marcela; Lee, Seung Jae; Yoon, Hyun Kyu; Kim, Yong Chul; Moon, Jee Youn

    2017-10-01

    The high cost of intrathecal morphine pump (ITMP) implantation may be the main obstacle to its use. Since July 2014, the Korean national health insurance (NHI) program began paying 50% of the ITMP implantation cost in select refractory chronic pain patients. The aims of this study were to investigate the financial break-even point and patients' satisfaction with ITMP treatment after the initiation of the NHI reimbursement. We collected data retrospectively or via direct phone calls to patients who underwent ITMP implantation at a single university-based tertiary hospital between July 2014 and May 2016. Pain severity, changes in the morphine equivalent daily dosage (MEDD), any adverse events, and patients' satisfaction were determined. We calculated the financial break-even point of ITMP implantation by investigating each patient's actual medical costs and insurance information. During the studied period, 23 patients received ITMP implantation, and 20 patients were included in our study. Scores on an 11-point numeric rating scale (NRS) for pain were significantly reduced compared to the baseline value. The financial break-even point was 28 months for ITMP treatment after the NHI reimbursement policy. ITMP provided effective chronic pain management with improved satisfaction and a reasonable financial break-even point of 28 months with 50% financial coverage by the NHI program.
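
The break-even arithmetic is simple: the month at which the monthly savings of pump therapy over conventional therapy have repaid the patient's share of the implant cost. The amounts below are invented and merely chosen so the toy example lands on the study's 28-month figure; they are not the study's actual Korean-won costs:

```python
def break_even_month(upfront_cost, monthly_cost_itmp, monthly_cost_conventional):
    """First month at which cumulative ITMP cost (upfront patient share
    plus monthly maintenance) is no longer higher than cumulative
    conventional-therapy cost. Returns None if ITMP never breaks even."""
    saving = monthly_cost_conventional - monthly_cost_itmp
    if saving <= 0:
        return None                      # monthly costs never repay the implant
    month = 1
    while saving * month < upfront_cost:
        month += 1
    return month

# hypothetical figures: 14,000,000 upfront share, 100,000/month for ITMP
# refills vs 600,000/month for conventional opioid therapy
months = break_even_month(14_000_000, 100_000, 600_000)
```

The study's 50% NHI reimbursement enters this calculation by halving the upfront cost, which directly shortens the break-even horizon.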

  20. TRAC analysis of design basis events for the accelerator production of tritium target/blanket

    International Nuclear Information System (INIS)

    Lin, J.C.; Elson, J.

    1997-01-01

    A two-loop primary cooling system with a residual heat removal system was designed to remove the heat generated in the tungsten neutron source rods inside the rungs of the ladders and the shells of the rungs. The Transient Reactor Analysis Code (TRAC) was used to analyze the thermal-hydraulic behavior of the primary cooling system during a pump coastdown transient; a cold-leg, large-break loss-of-coolant accident (LBLOCA); a hot-leg LBLOCA; and a target downcomer LBLOCA. The TRAC analysis results showed that the heat generated in the tungsten neutron source rods can be removed by the primary cooling system for the pump coastdown transient and all the LBLOCAs except the target downcomer LBLOCA. For the target downcomer LBLOCA, a cavity flood system is required to fill the cavity with water to a level above the large fixed headers.

  1. Multiview 3D sensing and analysis for high quality point cloud reconstruction

    Science.gov (United States)

    Satnik, Andrej; Izquierdo, Ebroul; Orjesek, Richard

    2018-04-01

    Multiview 3D reconstruction techniques enable digital reconstruction of 3D objects from the real world by fusing different viewpoints of the same object into a single 3D representation. This process is by no means trivial and the acquisition of high quality point cloud representations of dynamic 3D objects is still an open problem. In this paper, an approach for high fidelity 3D point cloud generation using low cost 3D sensing hardware is presented. The proposed approach runs in an efficient low-cost hardware setting based on several Kinect v2 scanners connected to a single PC. It performs autocalibration and runs in real-time exploiting an efficient composition of several filtering methods including Radius Outlier Removal (ROR), Weighted Median filter (WM) and Weighted Inter-Frame Average filtering (WIFA). The performance of the proposed method has been demonstrated through efficient acquisition of dense 3D point clouds of moving objects.
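The Radius Outlier Removal (ROR) step named in the abstract can be illustrated with a brute-force sketch; the radius and neighbour-count thresholds below are illustrative assumptions, not the authors' settings:

```python
import numpy as np

def radius_outlier_removal(points, radius=0.1, min_neighbors=3):
    """Keep only points that have at least `min_neighbors` other points
    within `radius`; a brute-force O(n^2) sketch of the ROR filter."""
    diffs = points[:, None, :] - points[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    neighbor_counts = (dists < radius).sum(axis=1) - 1  # exclude the point itself
    return points[neighbor_counts >= min_neighbors]

# A tight cluster of 50 points plus one far-away stray point.
cloud = np.vstack([np.random.default_rng(0).normal(0.0, 0.01, (50, 3)),
                   [[5.0, 5.0, 5.0]]])
filtered = radius_outlier_removal(cloud)
```

A production implementation would use a spatial index (k-d tree) instead of the quadratic all-pairs distance matrix, but the filtering criterion is the same.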

  2. Data analysis strategies for the characterization of normal: superconductor point contacts by barrier strength parameter

    Science.gov (United States)

    Smith, Charles W.; Reinertson, Randal C.; Dolan, P. J., Jr.

    1993-05-01

    The theoretical description by Blonder, Tinkham, and Klapwijk [Phys. Rev. B 25, 4515 (1982)] of the I-V curves of normal: superconductor point contacts encompasses a broad range of experimental behavior, from the tunnel junction case, on the one hand, to the clean metallic microconstriction limit on the other. The theory characterizes point contacts in terms of a single parameter, the barrier strength. The differential conductance of a point contact, at zero bias, as a function of temperature, offers a direct experimental method by which the barrier strength parameter can be evaluated. In view of the full range of phenomena incorporated by this theory, we suggest several different strategies for the evaluation of the barrier strength parameter from data in the low and intermediate barrier strength regimes and for measurements in the low temperature (near T=0 K) and high temperature (near T=Tc) limits.
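For orientation, the zero-bias, zero-temperature limit of the normalized point-contact conductance in the BTK theory is commonly quoted in terms of the barrier strength $Z$ as (recalled from the BTK framework; worth verifying against the original paper):

```latex
\left.\frac{G_{NS}}{G_{NN}}\right|_{eV=0,\;T=0}
  = \frac{2\,(1+Z^{2})}{(1+2Z^{2})^{2}},
```

which reproduces the two limits discussed in the abstract: a clean metallic microconstriction ($Z \to 0$) gives the Andreev-doubled value 2, while a strong tunnel barrier ($Z \to \infty$) suppresses the sub-gap conductance as $1/(2Z^{2})$.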

  3. Plasma triglycerides and cardiovascular events in the Treating to New Targets and Incremental Decrease in End-Points through Aggressive Lipid Lowering trials of statins in patients with coronary artery disease

    DEFF Research Database (Denmark)

    Faergeman, Ole; Holme, Ingar; Fayyad, Rana

    2009-01-01

    We determined the ability of in-trial measurements of triglycerides (TGs) to predict new cardiovascular events (CVEs) using data from the Incremental Decrease in End Points through Aggressive Lipid Lowering (IDEAL) and Treating to New Targets (TNT) trials. The trials compared atorvastatin 80 mg...

  4. Analysis of Arbitrary Reflector Antennas Applying the Geometrical Theory of Diffraction Together with the Master Points Technique

    Directory of Open Access Journals (Sweden)

    María Jesús Algar

    2013-01-01

    Full Text Available An efficient approach for the analysis of surface conformed reflector antennas fed arbitrarily is presented. The near field in a large number of sampling points in the aperture of the reflector is obtained applying the Geometrical Theory of Diffraction (GTD. A new technique named Master Points has been developed to reduce the complexity of the ray-tracing computations. The combination of both GTD and Master Points reduces the time requirements of this kind of analysis. To validate the new approach, several reflectors and the effects on the radiation pattern caused by shifting the feed and introducing different obstacles have been considered concerning both simple and complex geometries. The results of these analyses have been compared with the Method of Moments (MoM results.

  5. HACCP (Hazard Analysis and Critical Control Points) to guarantee safe water reuse and drinking water production--a case study.

    Science.gov (United States)

    Dewettinck, T; Van Houtte, E; Geenens, D; Van Hege, K; Verstraete, W

    2001-01-01

    To obtain a sustainable water catchment in the dune area of the Flemish west coast, the integration of treated domestic wastewater in the existing potable water production process is planned. The hygienic hazards associated with the introduction of treated domestic wastewater into the water cycle are well recognised. Therefore, the concept of HACCP (Hazard Analysis and Critical Control Points) was used to guarantee hygienically safe drinking water production. Taking into account the literature data on the removal efficiencies of the proposed advanced treatment steps with regard to enteric viruses and protozoa and after setting high quality limits based on the recent progress in quantitative risk assessment, the critical control points (CCPs) and points of attention (POAs) were identified. Based on the HACCP analysis a specific monitoring strategy was developed which focused on the control of these CCPs and POAs.

  6. Spatial point process analysis for a plant community with high biodiversity

    DEFF Research Database (Denmark)

    Illian, Janine; Møller, Jesper; Waagepetersen, Rasmus Plenge

    A complex multivariate spatial point pattern for a plant community with high biodiversity is modelled using a hierarchical multivariate point process model. In the model, interactions between plants with different post-fire regeneration strategies are of key interest. We consider initially a maximum likelihood approach to inference, where problems arise due to unknown interaction radii for the plants. We next demonstrate that a Bayesian approach provides a flexible framework for incorporating prior information concerning the interaction radii. From an ecological perspective, we are able both...

  7. RETRAN operational transient analysis of the Big Rock Point plant boiling water reactor

    International Nuclear Information System (INIS)

    Sawtelle, G.R.; Atchison, J.D.; Farman, R.F.; VandeWalle, D.J.; Bazydlo, H.G.

    1983-01-01

    Energy Incorporated used the RETRAN computer code to model and calculate nine Consumers Power Company Big Rock Point Nuclear Power Plant transients. RETRAN, a best-estimate, one-dimensional, homogeneous-flow thermal-equilibrium code, is applicable to FSAR Chapter 15 transients for Conditions I through IV. The BWR analyses were performed in accordance with USNRC Standard Review Plan criteria and in response to the USNRC Systematic Evaluation Program. The RETRAN Big Rock Point model was verified by comparison to plant startup test data. This paper discusses the unique modeling techniques used in RETRAN to model this steam-drum-type BWR. Transient analysis results are also presented.

  8. Development, validation and application of multi-point kinetics model in RELAP5 for analysis of asymmetric nuclear transients

    Energy Technology Data Exchange (ETDEWEB)

    Pradhan, Santosh K., E-mail: santosh@aerb.gov.in [Nuclear Safety Analysis Division, Atomic Energy Regulatory Board, Mumbai 400094 (India); Obaidurrahman, K. [Nuclear Safety Analysis Division, Atomic Energy Regulatory Board, Mumbai 400094 (India); Iyer, Kannan N. [Department of Mechanical Engineering, IIT Bombay, Mumbai 400076 (India); Gaikwad, Avinash J. [Nuclear Safety Analysis Division, Atomic Energy Regulatory Board, Mumbai 400094 (India)

    2016-04-15

    Highlights: • A multi-point kinetics model is developed for the RELAP5 system thermal-hydraulics code. • The model is validated against an extensive 3D kinetics code. • The RELAP5 multi-point kinetics formulation is used to investigate the critical break for LOCA in a PHWR. - Abstract: The point kinetics approach in the system code RELAP5 limits its use for many reactivity-induced transients that involve asymmetric core behaviour. Development of a fully coupled 3D core kinetics code with system thermal-hydraulics is the ultimate requirement in this regard; however, coupling and validation of a 3D kinetics module with a system code is cumbersome, and it also requires access to the source code. An intermediate approach with multi-point kinetics is appropriate and relatively easy to implement for analysis of several asymmetric transients for large cores. The multi-point kinetics formulation is based on dividing the entire core into several regions and solving the ODEs describing the kinetics in each region. These regions are interconnected by spatial coupling coefficients which are estimated from a diffusion theory approximation. This model offers the advantage that the associated ordinary differential equations (ODEs) governing the multi-point kinetics formulation can be solved using numerical methods to the desired level of accuracy, and it thus allows a formulation based on user-defined control variables, i.e., without disturbing the source code and hence avoiding the associated coupling issues. Euler's method has been used in the present formulation to solve the several coupled ODEs internally at each time step. The results have been verified against the in-built point-kinetics model of RELAP5 and validated against the 3D kinetics code TRIKIN. The model was used to identify the critical break in the RIH of a typical large PHWR core. The neutronic asymmetry produced in the core due to the system-induced transient was effectively handled by the multi-point kinetics model, overcoming the limitation of the in-built point kinetics model.
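The region-wise ODE system described above can be sketched for a toy two-region core with one delayed-neutron group; the kinetics parameters and the symmetric coupling coefficient below are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Illustrative two-region, one-delayed-group multi-point kinetics.
beta, lam, Lam = 0.0065, 0.08, 1e-3   # delayed fraction, decay const (1/s), generation time (s)
alpha = 0.5                            # hypothetical spatial coupling coefficient (1/s)
rho = np.array([0.0, 0.0])             # region reactivities (critical, symmetric core)

def derivs(n, C):
    """Region kinetics plus inter-region neutron exchange."""
    dn = (rho - beta) / Lam * n + lam * C + alpha * (n[::-1] - n)
    dC = beta / Lam * n - lam * C
    return dn, dC

n = np.ones(2)                         # relative neutron level per region
C = beta / (lam * Lam) * n             # equilibrium precursor concentrations
dt = 1e-4                              # s; explicit Euler, as in the paper's scheme
for _ in range(10_000):                # integrate 1 s
    dn, dC = derivs(n, C)
    n, C = n + dt * dn, C + dt * dC
```

Starting from the critical equilibrium, both regional neutron levels stay at their initial value; making `rho` asymmetric would drive the regions apart, which is exactly the behaviour a single-point kinetics model cannot represent.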

  9. School programs targeting stress management in children and adolescence: a meta-analysis

    NARCIS (Netherlands)

    Kraag, G.C; Zeegers, M.P.; Kok, G.J.; Hosman, C.M.H.; Huijer Abu-Saad, H.

    2006-01-01

    Introduction This meta-analysis evaluates the effect of school programs targeting stress management or coping skills in school children. Methods Articles were selected through a systematic literature search. Only randomized controlled trials or quasi-experimental studies were included. The

  10. Simulating Serial-Target Antibacterial Drug Synergies Using Flux Balance Analysis

    DEFF Research Database (Denmark)

    Krueger, Andrew S.; Munck, Christian; Dantas, Gautam

    2016-01-01

    Flux balance analysis (FBA) is an increasingly useful approach for modeling the behavior of metabolic systems. However, standard FBA modeling of genetic knockouts cannot predict drug combination synergies observed between serial metabolic targets, even though such synergies give rise to some of t...
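The serial-target effect mentioned in the abstract can be seen in a degenerate flux-balance sketch (a hypothetical three-reaction linear pathway, not a model from the paper): at steady state every reaction in a linear chain carries the same flux, so the FBA optimum reduces to the bottleneck capacity, and capping two serial steps is no worse than the stronger single inhibition. This is precisely why standard knockout-style FBA fails to reproduce serial drug synergies.

```python
def max_linear_pathway_flux(capacities):
    """Flux-balance optimum for a strictly linear pathway
    (substrate -> A -> B -> biomass): steady-state mass balance forces
    all fluxes equal, so the maximum flux is the tightest capacity bound."""
    return min(capacities)

# Hypothetical flux capacities (arbitrary units) for three serial reactions.
wild_type = max_linear_pathway_flux([10.0, 10.0, 10.0])  # unconstrained pathway
drug_a    = max_linear_pathway_flux([4.0, 10.0, 10.0])   # inhibit step 1
drug_ab   = max_linear_pathway_flux([4.0, 6.0, 10.0])    # also inhibit step 2: no extra effect
```

A full FBA treatment replaces `min` with a linear program over the stoichiometric matrix, but the bottleneck behaviour of serial targets carries over.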

  11. Analysis of shots on target and goals scored in soccer matches ...

    African Journals Online (AJOL)

    The aim of this study was to analyse the characteristics and patterns of shots on target and goals scored during the 2012 European Championship. The broadcast matches were recorded and converted into electronic video files for a computer-based analysis. This quantitative study examined 31 matches of the ...

  12. Doing Televised Rhetorical Analysis as a Means of Promoting College Awareness in a Target Market.

    Science.gov (United States)

    Schnell, Jim

    This paper describes aspects of doing televised rhetorical analysis as they relate to the promotion of college awareness in a particular target market. Considerations in the paper include variables that most professors encounter in their efforts to address the "service" expectations of their employment and how these variables can be…

  13. 75 FR 8239 - School Food Safety Program Based on Hazard Analysis and Critical Control Point Principles (HACCP...

    Science.gov (United States)

    2010-02-24

    ... (HACCP); Approval of Information Collection Request AGENCY: Food and Nutrition Service, USDA. ACTION... Safety Program Based on Hazard Analysis and Critical Control Point Principles (HACCP) was published on... must be based on the (HACCP) system established by the Secretary of Agriculture. The food safety...

  14. Point-of-care lactate and creatinine analysis for sick obstetric ...

    African Journals Online (AJOL)

    2016-03-15

    Mar 15, 2016 ... may take up to three days via the main hospital laboratory. The aim of this ... for handling point-of-care clinical chemistry devices and the most suitable .... Creatinine test strips and quality control solutions were stored in the ...

  15. Analysis of multi-species point patterns using multivariate log Gaussian Cox processes

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Guan, Yongtao; Jalilian, Abdollah

    Multivariate log Gaussian Cox processes are flexible models for multivariate point patterns. However, they have so far only been applied in bivariate cases. In this paper we move beyond the bivariate case in order to model multi-species point patterns of tree locations. In particular we address the problems of identifying parsimonious models and of extracting biologically relevant information from the fitted models. The latent multivariate Gaussian field is decomposed into components given in terms of random fields common to all species and components which are species specific. This allows ... of the data. The selected number of common latent fields provides an index of complexity of the multivariate covariance structure. Hierarchical clustering is used to identify groups of species with similar patterns of dependence on the common latent fields.

  16. Analysis of the stochastic channel model by Saleh & Valenzuela via the theory of point processes

    DEFF Research Database (Denmark)

    Jakobsen, Morten Lomholt; Pedersen, Troels; Fleury, Bernard Henri

    2012-01-01

    and underlying features, like the intensity function of the component delays and the delay-power intensity. The flexibility and clarity of the mathematical instruments utilized to obtain these results lead us to conjecture that the theory of spatial point processes provides a unifying mathematical framework

  17. Computational Analysis of Distance Operators for the Iterative Closest Point Algorithm.

    Directory of Open Access Journals (Sweden)

    Higinio Mora

    Full Text Available The Iterative Closest Point (ICP) algorithm is currently one of the most popular methods for rigid registration, to the point that it has become the standard in the Robotics and Computer Vision communities. Many applications take advantage of it to align 2D/3D surfaces due to its popularity and simplicity. Nevertheless, some of its phases present a high computational cost, rendering some of its applications impossible. In this work, an efficient approach for the matching phase of the Iterative Closest Point algorithm is proposed. This stage is the main bottleneck of the method, so any efficiency improvement has a great positive impact on the performance of the algorithm. The proposal consists of using low computational cost point-to-point distance metrics instead of the classic Euclidean one. The candidates analysed are the Chebyshev and Manhattan distance metrics due to their simpler formulation. The experiments carried out have validated the performance, robustness and quality of the proposal. Different experimental cases and configurations have been set up, including a heterogeneous set of 3D figures and several scenarios with partial data and random noise. The results prove that an average speed-up of 14% can be obtained while preserving the convergence properties of the algorithm and the quality of the final results.
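The metric substitution at the core of the proposal can be sketched with a brute-force matcher (toy data and a naive all-pairs search, not the authors' implementation):

```python
import numpy as np

def closest_points(src, dst, metric="euclidean"):
    """For each source point, return the index of the nearest destination
    point under the chosen metric (brute force, for illustration)."""
    diff = np.abs(src[:, None, :] - dst[None, :, :])
    if metric == "euclidean":
        cost = np.sqrt((diff ** 2).sum(-1))
    elif metric == "manhattan":        # L1: no squares or square roots
        cost = diff.sum(-1)
    elif metric == "chebyshev":        # L-inf: a single max per point pair
        cost = diff.max(-1)
    else:
        raise ValueError(metric)
    return cost.argmin(axis=1)

rng = np.random.default_rng(1)
dst = rng.uniform(size=(100, 3))       # reference cloud
src = dst[:10] + 1e-4                  # slightly perturbed copies of 10 points
match_l2 = closest_points(src, dst, "euclidean")
match_l1 = closest_points(src, dst, "manhattan")
```

For well-separated points the cheaper metrics recover the same correspondences as the Euclidean distance, which is the intuition behind the reported speed-up without loss of convergence.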

  18. The solution of the neutron point kinetics equation with stochastic extension: an analysis of two moments

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Milena Wollmann da; Vilhena, Marco Tullio M.B.; Bodmann, Bardo Ernst J.; Vasques, Richard, E-mail: milena.wollmann@ufrgs.br, E-mail: vilhena@mat.ufrgs.br, E-mail: bardobodmann@ufrgs.br, E-mail: richard.vasques@fulbrightmail.org [Universidade Federal do Rio Grande do Sul (UFRGS), Porto Alegre, RS (Brazil). Programa de Pos-Graduacao em Engenharia Mecanica

    2015-07-01

    The neutron point kinetics equation, which models the time-dependent behavior of nuclear reactors, is often used to understand the dynamics of nuclear reactor operations. It consists of a system of coupled differential equations that models the interaction between (i) the neutron population and (ii) the concentration of the delayed neutron precursors, which are radioactive isotopes formed in the fission process that decay through neutron emission. These equations are deterministic in nature, and therefore can provide only average values of the modeled populations. However, the actual dynamical process is stochastic: the neutron density and the delayed neutron precursor concentrations vary randomly with time. To address this stochastic behavior, Hayes and Allen have generalized the standard deterministic point kinetics equation. They derived a system of stochastic differential equations that can accurately model the random behavior of the neutron density and the precursor concentrations in a point reactor. Due to the stiffness of these equations, this system was numerically implemented using a stochastic piecewise constant approximation method (Stochastic PCA). Here, we present a study of the influence of stochastic fluctuations on the results of the neutron point kinetics equation. We reproduce the stochastic formulation introduced by Hayes and Allen and compute Monte Carlo numerical results for examples with constant and time-dependent reactivity, comparing these results with stochastic and deterministic methods found in the literature. Moreover, we introduce a modified version of the stochastic method to obtain a non-stiff solution, analogous to a previously derived deterministic approach. (author)
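A minimal Euler-Maruyama sketch conveys the idea of adding stochastic fluctuations to a point-kinetics-like equation; the scalar drift and the demographic-noise term below are caricature assumptions, not the Hayes-Allen covariance structure derived in the paper:

```python
import numpy as np

rng = np.random.default_rng(42)
a, sigma = -0.5, 0.05        # illustrative net decay rate (1/s) and noise strength
dt, steps, paths = 1e-3, 2000, 500

n = np.ones(paths)           # ensemble of neutron-density sample paths
for _ in range(steps):
    dW = rng.normal(0.0, np.sqrt(dt), paths)          # Wiener increments
    n += a * n * dt + sigma * np.sqrt(np.clip(n, 0.0, None)) * dW

mean_mc = n.mean()
mean_det = np.exp(a * steps * dt)  # deterministic (ensemble-average) solution at t = 2 s
```

Because the noise term has zero mean, the Monte Carlo ensemble average tracks the deterministic solution while individual paths fluctuate around it, which is the qualitative point of the stochastic extension.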

  19. Stability analysis of the Gyroscopic Power Take-Off wave energy point absorber

    DEFF Research Database (Denmark)

    Nielsen, Søren R. K.; Zhang, Zili; Kramer, Morten Mejlhede

    2015-01-01

    The Gyroscopic Power Take-Off (GyroPTO) wave energy point absorber consists of a float rigidly connected to a lever. The operational principle is somewhat similar to that of the so-called gyroscopic hand wrist exercisers, where the rotation of the float is brought forward by the rotational particle...

  20. A numerical analysis on forming limits during spiral and concentric single point incremental forming

    Science.gov (United States)

    Gipiela, M. L.; Amauri, V.; Nikhare, C.; Marcondes, P. V. P.

    2017-01-01

    Sheet metal forming is a major manufacturing process, producing numerous parts for the aerospace, automotive and medical industries. Driven by high demand in the vehicle industry on the one hand and environmental regulations requiring lower fuel consumption on the other, researchers are developing energy-efficient sheet metal forming processes to replace the conventional punch-and-die approach and achieve lightweight parts. One of the most recognized manufacturing processes in this category is Single Point Incremental Forming (SPIF). SPIF is a die-less sheet metal forming process in which a single-point tool incrementally forces any single point of the sheet metal into the plastic deformation zone at any process time. In the present work, the finite element method (FEM) is applied to analyze the forming limits of a high strength low alloy steel formed by single point incremental forming (SPIF) with spiral and concentric tool paths. SPIF numerical simulations were modelled with 24 and 29 mm cup depths, and the results were compared with Nakajima results obtained by experiments and FEM. It was found that the cup formed with the Nakajima tool failed at 24 mm, while cups formed by SPIF surpassed the limit at both depths with both profiles. It was also noticed that the strains achieved with the concentric profile are lower than those with the spiral profile.

  1. Spectral analysis of point-vortex dynamics : first application to vortex polygons in a circular domain

    NARCIS (Netherlands)

    Speetjens, M.F.M.; Meleshko, V.V.; Heijst, van G.J.F.

    2014-01-01

    The present study addresses the classical problem of the dynamics and stability of a cluster of N point vortices of equal strength arranged in a polygonal configuration ("N-vortex polygons"). In unbounded domains, such N-vortex polygons are unconditionally stable for N < 7

  2. The solution of the neutron point kinetics equation with stochastic extension: an analysis of two moments

    International Nuclear Information System (INIS)

    Silva, Milena Wollmann da; Vilhena, Marco Tullio M.B.; Bodmann, Bardo Ernst J.; Vasques, Richard

    2015-01-01

    The neutron point kinetics equation, which models the time-dependent behavior of nuclear reactors, is often used to understand the dynamics of nuclear reactor operations. It consists of a system of coupled differential equations that models the interaction between (i) the neutron population and (ii) the concentration of the delayed neutron precursors, which are radioactive isotopes formed in the fission process that decay through neutron emission. These equations are deterministic in nature, and therefore can provide only average values of the modeled populations. However, the actual dynamical process is stochastic: the neutron density and the delayed neutron precursor concentrations vary randomly with time. To address this stochastic behavior, Hayes and Allen have generalized the standard deterministic point kinetics equation. They derived a system of stochastic differential equations that can accurately model the random behavior of the neutron density and the precursor concentrations in a point reactor. Due to the stiffness of these equations, this system was numerically implemented using a stochastic piecewise constant approximation method (Stochastic PCA). Here, we present a study of the influence of stochastic fluctuations on the results of the neutron point kinetics equation. We reproduce the stochastic formulation introduced by Hayes and Allen and compute Monte Carlo numerical results for examples with constant and time-dependent reactivity, comparing these results with stochastic and deterministic methods found in the literature. Moreover, we introduce a modified version of the stochastic method to obtain a non-stiff solution, analogous to a previously derived deterministic approach. (author)

  3. Analysis of Deregulated microRNAs and Their Target Genes in Gastric Cancer.

    Directory of Open Access Journals (Sweden)

    Simonas Juzėnas

    Full Text Available MicroRNAs (miRNAs) are widely studied non-coding RNAs that modulate gene expression. MiRNAs are deregulated in different tumors including gastric cancer (GC) and have potential diagnostic and prognostic implications. The aim of our study was to determine the miRNA profile in GC tissues, followed by evaluation of deregulated miRNAs in plasma of GC patients. Using available databases and bioinformatics methods we also aimed to evaluate potential target genes of confirmed differentially expressed miRNAs and validate these findings in GC tissues. The study included 51 GC patients and 51 controls. Initially, we screened the miRNA expression profile in 13 tissue samples of GC and 12 normal gastric tissues with a TaqMan low density array (TLDA). In the second stage, differentially expressed miRNAs were validated in a replication cohort using qRT-PCR in tissue and plasma samples. Subsequently, we analyzed potential target genes of deregulated miRNAs using a bioinformatics approach, determined their expression in GC tissues and performed correlation analysis with the targeting miRNAs. Profiling with TLDA revealed 15 deregulated miRNAs in GC tissues compared to normal gastric mucosa. Replication analysis confirmed that miR-148a-3p, miR-204-5p, miR-223-3p and miR-375 were consistently deregulated in GC tissues. Analysis of GC patients' plasma samples showed significant down-regulation of miR-148a-3p and miR-375 and up-regulation of miR-223-3p compared to healthy subjects. Further, using bioinformatic tools we identified targets of the replicated miRNAs and performed disease-associated gene enrichment analysis. Ultimately, we evaluated expression of the potential target genes BCL2 and DNMT3B by qRT-PCR in GC tissue, which correlated with the targeting miRNA expression. Our study revealed the miRNA profile in GC tissues and showed that miR-148a-3p, miR-223-3p and miR-375 are deregulated in GC plasma samples, but these circulating miRNAs showed relatively weak diagnostic performance as sole biomarkers.

  4. Improved target detection and bearing estimation utilizing fast orthogonal search for real-time spectral analysis

    International Nuclear Information System (INIS)

    Osman, Abdalla; El-Sheimy, Naser; Nourledin, Aboelamgd; Theriault, Jim; Campbell, Scott

    2009-01-01

    The problem of target detection and tracking in the ocean environment has attracted considerable attention due to its importance in military and civilian applications. Sonobuoys are among the passive sonar systems used in underwater target detection. Target detection and bearing estimation are mainly obtained through spectral analysis of received signals. The frequency resolution offered by current techniques is limited, which affects the accuracy of target detection and bearing estimation at relatively low signal-to-noise ratio (SNR). This research investigates the development of a bearing estimation method using fast orthogonal search (FOS) for enhanced spectral estimation. FOS is employed in this research in order to improve both target detection and bearing estimation in the case of low-SNR inputs. The proposed methods were tested using simulated data developed for two different scenarios under different underwater environmental conditions. The results show that the proposed method is capable of enhancing the accuracy of target detection as well as bearing estimation, especially in cases of very low SNR.

  5. Interacting with target tracking algorithms in a gaze-enhanced motion video analysis system

    Science.gov (United States)

    Hild, Jutta; Krüger, Wolfgang; Heinze, Norbert; Peinsipp-Byma, Elisabeth; Beyerer, Jürgen

    2016-05-01

    Motion video analysis is a challenging task, particularly if real-time analysis is required. It is therefore an important issue how to provide suitable assistance for the human operator. Given that the use of customized video analysis systems is more and more established, one supporting measure is to provide system functions which perform subtasks of the analysis. Recent progress in the development of automated image exploitation algorithms allows, e.g., real-time moving target tracking. Another supporting measure is to provide a user interface which strives to reduce the perceptual, cognitive and motor load of the human operator, for example by incorporating the operator's visual focus of attention. A gaze-enhanced user interface is able to help here. This work extends prior work on automated target recognition, segmentation, and tracking algorithms, as well as on the benefits of a gaze-enhanced user interface for interaction with moving targets. We also propose a prototypical system design aiming to combine the qualities of the human observer's perception and the automated algorithms in order to improve the overall performance of a real-time video analysis system. In this contribution, we address two novel issues in analyzing gaze-based interaction with target tracking algorithms. The first issue extends the gaze-based triggering of a target tracking process, e.g., investigating how best to relaunch in the case of track loss. The second issue addresses the initialization of tracking algorithms without motion segmentation, where the operator has to provide the system with the object's image region in order to start the tracking algorithm.

  6. Uncertainty analysis of point-by-point sampling complex surfaces using touch probe CMMs DOE for complex surfaces verification with CMM

    DEFF Research Database (Denmark)

    Barini, Emanuele Modesto; Tosello, Guido; De Chiffre, Leonardo

    2010-01-01

    The paper describes a study concerning point-by-point sampling of complex surfaces using tactile CMMs. A four-factor, two-level completely randomized factorial experiment was carried out, involving measurements on a complex surface configuration item comprising a sphere, a cylinder and a cone, co...

  7. Virtual diplomacy: an analysis of the structure of the target audiences

    Directory of Open Access Journals (Sweden)

    V. V. Verbytska

    2016-03-01

    Full Text Available In the context of the global information society, communication processes, especially at the international level, are becoming more important. The effectiveness of communication depends primarily on its focus, i.e. on clearly defining the target audience on which it should concentrate. Virtual diplomacy, as a kind of political communication at the international level, is no exception. The novelty, rapid development and dissemination of this phenomenon require profound analysis and the elaboration of effective utilization strategies, including study of its recipients and target audiences. Purpose: identification, structuring and analysis of the recipients of virtual diplomacy as the audiences of international political communication. The study uses research methods such as system analysis, structural functionalism, dialectics and synergy, comparison, and critical analysis. Main results of the research: 1. The study examined the specifics of political communication at the international level in the context of the development of the global information society. 2. It analyzed the recipients of virtual diplomacy as a kind of political communication at the international level. 3. It highlighted the key target groups in the global Internet network based on the tasks performed by virtual diplomacy. 4. It proved the effectiveness of cooperation with each target group in the framework of virtual diplomacy. 5. It described the specifics of the work with each target group in the context of virtual diplomacy. Practical implications: The article may be useful for writing theoretical studies, tests, essays and term papers, and for designing special courses at universities in the sphere of international relations and international information. It can also be a guide for authorities carrying out diplomatic activities and international information cooperation. Findings: In the context of the establishment of the global information society political

  8. System for automatic x-ray-image analysis, measurement, and sorting of laser fusion targets

    International Nuclear Information System (INIS)

    Singleton, R.M.; Perkins, D.E.; Willenborg, D.L.

    1980-01-01

    This paper describes the Automatic X-Ray Image Analysis and Sorting (AXIAS) system which is designed to analyze and measure x-ray images of opaque hollow microspheres used as laser fusion targets. The x-ray images are first recorded on a high resolution film plate. The AXIAS system then digitizes and processes the images to accurately measure the target parameters and defects. The primary goals of the AXIAS system are: to provide extremely accurate and rapid measurements, to engineer a practical system for a routine production environment and to furnish the capability of automatically measuring an array of images for sorting and selection

  9. An analysis of health promotion materials for Dutch truck drivers: Off target and too complex?

    Science.gov (United States)

    Boeijinga, Anniek; Hoeken, Hans; Sanders, José

    2017-01-01

    Despite various health promotion initiatives, unfavorable figures regarding Dutch truck drivers' eating behaviors, exercise behaviors, and absenteeism have not improved. The aim was to obtain a better understanding of the low level of effectiveness of current health interventions for Dutch truck drivers by examining to what extent these are tailored to the target group's particular mindset (focus of content) and health literacy skills (presentation of content). The article analyzes 21 health promotion materials for Dutch truck drivers using a two-step approach: (a) an analysis of the materials' focus, guided by the Health Action Process Approach; and (b) an argumentation analysis, guided by pragma-dialectics. The corpus analysis revealed: (a) a predominant focus on the motivation phase; and (b) in line with the aim of motivating the target group, a consistent use of pragmatic arguments, which were typically presented in an implicit way. The results indicate that existing health promotion materials for Dutch truck drivers are not sufficiently tailored to the target group's mindset and health literacy skills. Recommendations are offered to develop more tailored/effective health interventions targeting this high-risk, underserved occupational group.

  10. TargetVue: Visual Analysis of Anomalous User Behaviors in Online Communication Systems.

    Science.gov (United States)

    Cao, Nan; Shi, Conglei; Lin, Sabrina; Lu, Jie; Lin, Yu-Ru; Lin, Ching-Yung

    2016-01-01

Users with anomalous behaviors in online communication systems (e.g. email and social media platforms) are potential threats to society. Automated anomaly detection based on advanced machine learning techniques has been developed to combat this issue; challenges remain, though, due to the difficulty of obtaining proper ground truth for model training and evaluation. Therefore, substantial human judgment on the automated analysis results is often required to better adjust the performance of anomaly detection. Unfortunately, techniques that allow users to understand the analysis results more efficiently, to make a confident judgment about anomalies, and to explore data in their context, are still lacking. In this paper, we propose a novel visual analysis system, TargetVue, which detects anomalous users via an unsupervised learning model and visualizes the behaviors of suspicious users in behavior-rich context through novel visualization designs and multiple coordinated contextual views. Particularly, TargetVue incorporates three new ego-centric glyphs to visually summarize a user's behaviors which effectively present the user's communication activities, features, and social interactions. An efficient layout method is proposed to place these glyphs on a triangle grid, which captures similarities among users and facilitates comparisons of behaviors of different users. We demonstrate the power of TargetVue through its application in a social bot detection challenge using Twitter data, a case study based on email records, and an interview with expert users. Our evaluation shows that TargetVue is beneficial to the detection of users with anomalous communication behaviors.

  11. Hot-spot analysis for drug discovery targeting protein-protein interactions.

    Science.gov (United States)

    Rosell, Mireia; Fernández-Recio, Juan

    2018-04-01

    Protein-protein interactions are important for biological processes and pathological situations, and are attractive targets for drug discovery. However, rational drug design targeting protein-protein interactions is still highly challenging. Hot-spot residues are seen as the best option to target such interactions, but their identification requires detailed structural and energetic characterization, which is only available for a tiny fraction of protein interactions. Areas covered: In this review, the authors cover a variety of computational methods that have been reported for the energetic analysis of protein-protein interfaces in search of hot-spots, and the structural modeling of protein-protein complexes by docking. This can help to rationalize the discovery of small-molecule inhibitors of protein-protein interfaces of therapeutic interest. Computational analysis and docking can help to locate the interface, molecular dynamics can be used to find suitable cavities, and hot-spot predictions can focus the search for inhibitors of protein-protein interactions. Expert opinion: A major difficulty for applying rational drug design methods to protein-protein interactions is that in the majority of cases the complex structure is not available. Fortunately, computational docking can complement experimental data. An interesting aspect to explore in the future is the integration of these strategies for targeting PPIs with large-scale mutational analysis.

  12. Comparative Analysis of Maximum Power Point Tracking Controllers under Partial Shaded Conditions in a Photovoltaic System

    Directory of Open Access Journals (Sweden)

    R. Ramaprabha

    2015-06-01

Full Text Available Mismatching effects due to partial shading are a major drawback in today's photovoltaic (PV) systems. These mismatch effects are greatly reduced in distributed PV system architectures, where each panel is effectively decoupled from its neighboring panel. To obtain optimal operation of the PV panels, maximum power point tracking (MPPT) techniques are used. Under partial shading, detecting the maximum operating point is difficult because the characteristic curves are complex, with multiple peaks. In this paper, a neural network control technique is employed for MPPT. Detailed analyses were carried out on MPPT controllers in centralized and distributed architectures under partially shaded environments. The efficiency of the MPPT controllers and the effectiveness of the proposed control technique under partially shaded environments were examined using MATLAB software. The results were validated through experimentation.
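The multiple-peak problem described above can be made concrete with a small sketch (this is not the paper's neural-network controller): on a synthetic two-peak P-V curve, a naive hill-climbing tracker started near the wrong peak locks onto a local maximum, while an exhaustive voltage sweep finds the global maximum power point. The curve shape, starting point, and step size are all illustrative assumptions.

```python
import numpy as np

# Synthetic P-V curve of a partially shaded string: two power peaks.
# (Illustrative shape only, not a physical PV model.)
v = np.linspace(0.0, 40.0, 401)
p = 60.0 * np.exp(-((v - 12.0) / 4.0) ** 2) + 100.0 * np.exp(-((v - 30.0) / 5.0) ** 2)

def hill_climb(p, start_idx):
    """Naive perturb-and-observe: move while a neighboring sample has more power."""
    i = start_idx
    while True:
        if i + 1 < len(p) and p[i + 1] > p[i]:
            i += 1
        elif i - 1 >= 0 and p[i - 1] > p[i]:
            i -= 1
        else:
            return i  # stuck: a local maximum

local_idx = hill_climb(p, start_idx=100)  # starts near the 12 V (lower) peak
global_idx = int(np.argmax(p))            # exhaustive sweep finds the true MPP

print(f"local peak:  {v[local_idx]:.1f} V, {p[local_idx]:.1f} W")
print(f"global peak: {v[global_idx]:.1f} V, {p[global_idx]:.1f} W")
```

The hill climber stalls on the 12 V peak, illustrating why single-peak MPPT logic fails under partial shading and why a global search (or a trained controller) is needed.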

  13. Fault Detection and Diagnosis of Railway Point Machines by Sound Analysis

    Science.gov (United States)

    Lee, Jonguk; Choi, Heesu; Park, Daihee; Chung, Yongwha; Kim, Hee-Young; Yoon, Sukhan

    2016-01-01

    Railway point devices act as actuators that provide different routes to trains by driving switchblades from the current position to the opposite one. Point failure can significantly affect railway operations, with potentially disastrous consequences. Therefore, early detection of anomalies is critical for monitoring and managing the condition of rail infrastructure. We present a data mining solution that utilizes audio data to efficiently detect and diagnose faults in railway condition monitoring systems. The system enables extracting mel-frequency cepstrum coefficients (MFCCs) from audio data with reduced feature dimensions using attribute subset selection, and employs support vector machines (SVMs) for early detection and classification of anomalies. Experimental results show that the system enables cost-effective detection and diagnosis of faults using a cheap microphone, with accuracy exceeding 94.1% whether used alone or in combination with other known methods. PMID:27092509
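As a rough sketch of the pipeline described above (hedged: this is not the authors' code, and a simple nearest-centroid rule stands in for their attribute-selected SVM), the following computes minimal MFCC-style features with NumPy and separates a clean tone from a fault-like signal carrying an extra spectral component. The sampling rate, signal models, and all parameters are illustrative assumptions.

```python
import numpy as np

def mfcc(signal, sr=8000, n_fft=256, n_mels=20, n_coeffs=8):
    """Minimal MFCC sketch: frame -> power spectrum -> mel filterbank -> log -> DCT."""
    hop = n_fft // 2
    frames = [signal[i:i + n_fft] * np.hanning(n_fft)
              for i in range(0, len(signal) - n_fft, hop)]
    power = np.abs(np.fft.rfft(frames, axis=1)) ** 2

    def hz_to_mel(f):
        return 2595.0 * np.log10(1.0 + f / 700.0)

    def mel_to_hz(m):
        return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

    # Triangular mel filterbank between 0 Hz and sr/2.
    mel_pts = mel_to_hz(np.linspace(0.0, hz_to_mel(sr / 2.0), n_mels + 2))
    bins = np.floor((n_fft + 1) * mel_pts / sr).astype(int)
    fb = np.zeros((n_mels, n_fft // 2 + 1))
    for m in range(1, n_mels + 1):
        l, c, r = bins[m - 1], bins[m], bins[m + 1]
        fb[m - 1, l:c] = (np.arange(l, c) - l) / max(c - l, 1)
        fb[m - 1, c:r] = (r - np.arange(c, r)) / max(r - c, 1)
    logmel = np.log(power @ fb.T + 1e-10)

    # DCT-II decorrelates the log-mel energies; keep the first few coefficients,
    # averaged over frames, as one feature vector per recording.
    k = np.arange(n_coeffs)[:, None] * (2 * np.arange(n_mels) + 1)[None, :]
    return (logmel @ np.cos(np.pi * k / (2.0 * n_mels)).T).mean(axis=0)

# Hypothetical signals: a clean tone vs. a "fault" with an extra harmonic.
rng = np.random.default_rng(0)
t = np.arange(8000) / 8000.0

def normal_sig():
    return np.sin(2 * np.pi * 440 * t) + 0.1 * rng.standard_normal(t.size)

def fault_sig():
    return (0.7 * np.sin(2 * np.pi * 440 * t)
            + 0.7 * np.sin(2 * np.pi * 2090 * t)
            + 0.1 * rng.standard_normal(t.size))

# Nearest-centroid rule stands in for the paper's SVM classifier.
c_normal = np.mean([mfcc(normal_sig()) for _ in range(5)], axis=0)
c_fault = np.mean([mfcc(fault_sig()) for _ in range(5)], axis=0)
probe = mfcc(fault_sig())
label = ("fault" if np.linalg.norm(probe - c_fault) < np.linalg.norm(probe - c_normal)
         else "normal")
```

The extra 2 kHz component moves the log-mel energies in the corresponding filterbank bands, so even this crude classifier separates the two conditions; the paper's contribution is doing this robustly with feature selection and SVMs on real point-machine audio.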

  14. Identifying critical constraints for the maximum loadability of electric power systems - analysis via interior point method

    Energy Technology Data Exchange (ETDEWEB)

    Barboza, Luciano Vitoria [Sul-riograndense Federal Institute for Education, Science and Technology (IFSul), Pelotas, RS (Brazil)], E-mail: luciano@pelotas.ifsul.edu.br

    2009-07-01

This paper presents an overview of the maximum loadability problem and aims to study the main factors that limit this loadability. Specifically, the study focuses on determining which electric system buses directly influence the supply of power demand. The proposed approach uses the conventional maximum loadability method modelled as an optimization problem. The solution of this model is performed using the Interior Point methodology. As a consequence of this solution method, the Lagrange multipliers are used as parameters that identify the probable 'bottlenecks' in the electric power system. The study also shows the relationship between the Lagrange multipliers and the cost function in the Interior Point optimization, interpreted as sensitivity parameters. In order to illustrate the proposed methodology, the approach was applied to an IEEE test system and, to assess its performance, a real equivalent electric system from the South-Southeast region of Brazil was simulated. (author)
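The sensitivity interpretation of the Lagrange multipliers can be illustrated on a toy equality-constrained quadratic program (a sketch, not the paper's power-flow model): at the optimum, the multiplier of the constraint a'x = b equals the derivative of the optimal cost with respect to b, which is exactly the "bottleneck" signal exploited above. Q, a, and b are arbitrary illustrative values.

```python
import numpy as np

# Toy equality-constrained QP:  minimize 0.5 * x'Qx   subject to  a'x = b.
# KKT conditions:  Qx - lam * a = 0  and  a'x = b, solved as one linear system.
Q = np.diag([2.0, 5.0, 1.0])   # illustrative "generation cost" curvature
a = np.array([1.0, 1.0, 1.0])  # illustrative "total demand" constraint

def solve_qp(b):
    n = len(a)
    K = np.zeros((n + 1, n + 1))
    K[:n, :n] = Q
    K[:n, n] = -a
    K[n, :n] = a
    rhs = np.zeros(n + 1)
    rhs[n] = b
    sol = np.linalg.solve(K, rhs)
    x, lam = sol[:n], sol[n]
    return 0.5 * x @ Q @ x, lam

cost, lam = solve_qp(10.0)

# The multiplier is the sensitivity d(cost)/d(b), checked by finite differences.
eps = 1e-4
fd = (solve_qp(10.0 + eps)[0] - solve_qp(10.0 - eps)[0]) / (2.0 * eps)
```

A large multiplier flags a constraint whose relaxation would change the objective the most; in the loadability setting this is what singles out the limiting buses.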

  15. Quality control for electron beam processing of polymeric materials by end-point analysis

    International Nuclear Information System (INIS)

    DeGraff, E.; McLaughlin, W.L.

    1981-01-01

Properties of certain plastics, e.g. polytetrafluoroethylene, polyethylene, and ethylene vinyl acetate copolymer, can be modified selectively by ionizing radiation. One of the advantages of this treatment over chemical methods is better control of the process and the end-product properties. The most convenient method of dosimetry for monitoring quality control is post-irradiation evaluation of the plastic itself, e.g. melt index and melting-point determination. It is shown that, with proper calibration in terms of total dose and sufficiently reproducible radiation effects, such product test methods provide convenient and meaningful analyses. Other appropriate standardized analytical methods include stress-crack resistance, stress-strain-to-fracture testing, and solubility determination. Standard routine dosimetry over the dose and dose-rate ranges of interest confirms that measured product end points can be correlated with calibrated values of absorbed dose in the product within the uncertainty limits of the measurements. (author)

  16. Analysis and research on Maximum Power Point Tracking of Photovoltaic Array with Fuzzy Logic Control and Three-point Weight Comparison Method

    Institute of Scientific and Technical Information of China (English)

    LIN; Kuang-Jang; LIN; Chii-Ruey

    2010-01-01

The photovoltaic array has an optimal operating point at which it can deliver maximum power. However, this operating point shifts with the strength and angle of solar radiation and with changes in the environment and load. Because these conditions change constantly, it is difficult to locate the optimal operating point from a mathematical model alone. This study therefore applies Fuzzy Logic Control theory and the Three-point Weight Comparison Method to locate the optimal operating point of the solar panel and achieve maximum efficiency in power generation. The Three-point Weight Comparison Method compares points on the characteristic curve of photovoltaic-array voltage versus output power; it is a rather simple way to track the maximum power. Fuzzy Logic Control, on the other hand, can handle problems that cannot be dealt with effectively by calculation rules, such as concepts, contemplation, deductive reasoning, and identification. This paper therefore uses both methods in successive simulations. The simulation results show that the Three-point Weight Comparison Method is more effective in environments where solar radiation changes frequently, whereas Fuzzy Logic Control achieves better tracking efficiency in environments where solar radiation changes sharply.
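A minimal sketch of the three-point comparison idea (assuming a smooth single-peak P-V curve, the situation the method handles well): power is sampled at the operating voltage and one step to either side, and the operating point moves toward whichever neighbor has higher power, holding when the center sample wins. The curve and step size are illustrative assumptions, and the fuzzy controller is not modeled here.

```python
import numpy as np

# Illustrative single-peak P-V curve (maximum power point at 30 V).
def pv_power(v):
    return 100.0 * np.exp(-((v - 30.0) / 8.0) ** 2)

def three_point_mppt(p, v0, dv=0.5, iters=200):
    """Compare power at v-dv, v, v+dv; step toward the larger neighbor."""
    v = v0
    for _ in range(iters):
        pl, pc, pr = p(v - dv), p(v), p(v + dv)
        if pr > pc:
            v += dv      # right neighbor wins: climb right
        elif pl > pc:
            v -= dv      # left neighbor wins: climb left
        # otherwise the center point wins all three: hold at the peak
    return v

v_mpp = three_point_mppt(pv_power, v0=10.0)
```

Because each decision needs only three power measurements, the method is cheap; the trade-off discussed in the abstract is how it copes when the curve itself moves between samples.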

  17. Recommendations for dealing with waste contaminated with Ebola virus: a Hazard Analysis of Critical Control Points approach.

    Science.gov (United States)

    Edmunds, Kelly L; Elrahman, Samira Abd; Bell, Diana J; Brainard, Julii; Dervisevic, Samir; Fedha, Tsimbiri P; Few, Roger; Howard, Guy; Lake, Iain; Maes, Peter; Matofari, Joseph; Minnigh, Harvey; Mohamedani, Ahmed A; Montgomery, Maggie; Morter, Sarah; Muchiri, Edward; Mudau, Lutendo S; Mutua, Benedict M; Ndambuki, Julius M; Pond, Katherine; Sobsey, Mark D; van der Es, Mike; Zeitoun, Mark; Hunter, Paul R

    2016-06-01

    To assess, within communities experiencing Ebola virus outbreaks, the risks associated with the disposal of human waste and to generate recommendations for mitigating such risks. A team with expertise in the Hazard Analysis of Critical Control Points framework identified waste products from the care of individuals with Ebola virus disease and constructed, tested and confirmed flow diagrams showing the creation of such products. After listing potential hazards associated with each step in each flow diagram, the team conducted a hazard analysis, determined critical control points and made recommendations to mitigate the transmission risks at each control point. The collection, transportation, cleaning and shared use of blood-soiled fomites and the shared use of latrines contaminated with blood or bloodied faeces appeared to be associated with particularly high levels of risk of Ebola virus transmission. More moderate levels of risk were associated with the collection and transportation of material contaminated with bodily fluids other than blood, shared use of latrines soiled with such fluids, the cleaning and shared use of fomites soiled with such fluids, and the contamination of the environment during the collection and transportation of blood-contaminated waste. The risk of the waste-related transmission of Ebola virus could be reduced by the use of full personal protective equipment, appropriate hand hygiene and an appropriate disinfectant after careful cleaning. Use of the Hazard Analysis of Critical Control Points framework could facilitate rapid responses to outbreaks of emerging infectious disease.

  18. The implementation of a Hazard Analysis and Critical Control Point management system in a peanut butter ice cream plant

    Directory of Open Access Journals (Sweden)

    Yu-Ting Hung

    2015-09-01

Full Text Available To ensure the safety of peanut butter ice cream manufacturing, a Hazard Analysis and Critical Control Point (HACCP) plan has been designed and applied to the production process. Potential biological, chemical, and physical hazards in each manufacturing procedure were identified. The critical control points for the peanut butter ice cream were then determined to be the pasteurization and freezing processes. The establishment of a monitoring system, corrective actions, verification procedures, and documentation and record keeping completed the HACCP program. The results of this study indicate that implementing the HACCP system in food industries can effectively enhance food safety and quality while improving production management.

  19. [Powdered infant formulae preparation guide for hospitals based on Hazard Analysis and Critical Control Points (HACCP) principles].

    Science.gov (United States)

    Vargas-Leguás, H; Rodríguez Garrido, V; Lorite Cuenca, R; Pérez-Portabella, C; Redecillas Ferreiro, S; Campins Martí, M

    2009-06-01

    This guide for the preparation of powdered infant formulae in hospital environments is a collaborative work between several hospital services and is based on national and European regulations, international experts meetings and the recommendations of scientific societies. This guide also uses the Hazard Analysis and Critical Control Point principles proposed by Codex Alimentarius and emphasises effective verifying measures, microbiological controls of the process and the corrective actions when monitoring indicates that a critical control point is not under control. It is a dynamic guide and specifies the evaluation procedures that allow it to be constantly adapted.

  20. Structure analysis in algorithms and programs. Generator of coordinates of equivalent points. (Collected programs)

    International Nuclear Information System (INIS)

    Matyushenko, N.N.; Titov, Yu.G.

    1982-01-01

Programs for generating atom coordinates of the space symmetry groups in the form of equivalent point systems are presented. The programs for generation and for coordinate output from on-line storage are written in FORTRAN for the ES computer. They may be used in laboratories specializing in the study of atomic structure and material properties, in colleges, and by specialists in other fields of physics and chemistry

  1. ANALYSIS, THEMATIC MAPS AND DATA MINING FROM POINT CLOUD TO ONTOLOGY FOR SOFTWARE DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    R. Nespeca

    2016-06-01

Full Text Available The primary purpose of the survey for the restoration of Cultural Heritage is the interpretation of the state of building preservation. For this, the advantages of remote sensing systems that generate dense point clouds (range-based or image-based) are not limited to the acquired data alone. The paper shows that it is possible to extract very useful diagnostic information using spatial annotation, with algorithms already implemented in open-source software. Generally, the drawing of degradation maps is manual work, and hence dependent on the subjectivity of the operator. This paper describes a method of extraction and visualization of information obtained by mathematical procedures that are quantitative, repeatable, and verifiable. The case study is a part of the east facade of the Eglise collégiale Saint-Maurice, also called Notre Dame des Grâces, in Caromb, in southern France. The work was conducted on the matrix of information contained in the ASCII point cloud. The first result is the extraction of new geometric descriptors. First, we create digital maps of the calculated quantities. Subsequently, we move to semi-quantitative analyses that transform the new data into useful information. We wrote algorithms for accurate selection, for segmentation of the point cloud, and for automatic calculation of the real surface area and volume. Furthermore, we created graphs of the spatial distribution of the descriptors. This work shows that by working during data processing we can transform the point cloud into an enriched database: its use, management, and mining are easy, fast, and effective for everyone involved in the restoration process.
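The descriptor-extraction-and-segmentation idea can be sketched as follows (a hypothetical stand-in, not the authors' algorithms): for a synthetic planar "facade" with one bulging patch, the signed distance of each point to the PCA best-fit plane serves as a geometric descriptor, and thresholding it yields a crude degradation map. All geometry and thresholds are illustrative assumptions.

```python
import numpy as np

# Hypothetical stand-in for a facade scan: a noisy plane with one bulging patch.
rng = np.random.default_rng(1)
xy = rng.uniform(0.0, 10.0, size=(2000, 2))
z = 0.02 * rng.standard_normal(2000)
bulge = np.linalg.norm(xy - np.array([5.0, 5.0]), axis=1) < 1.5
z[bulge] += 0.5                    # the deformed area sticks out of the plane
cloud = np.column_stack([xy, z])   # rows as they would appear in an ASCII file

# Geometric descriptor: signed distance of each point to the global
# best-fit plane, obtained from the PCA/SVD of the centered cloud.
centered = cloud - cloud.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
normal = vt[2]                     # least-variance direction = plane normal
dist = centered @ normal

# Thresholding the descriptor gives a crude, repeatable "degradation map".
anomalous = np.abs(dist) > 0.1
```

Because the descriptor and threshold are computed, not drawn by hand, the resulting map is repeatable and verifiable, which is the point the abstract makes about replacing subjective manual mapping.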

  2. Second-order analysis of inhomogeneous spatial point processes with proportional intensity functions

    DEFF Research Database (Denmark)

    Guan, Yongtao; Waagepetersen, Rasmus; Beale, Colin M.

    2008-01-01

    of the intensity functions. The first approach is based on nonparametric kernel-smoothing, whereas the second approach uses a conditional likelihood estimation approach to fit a parametric model for the pair correlation function. A great advantage of the proposed methods is that they do not require the often...... to two spatial point patterns regarding the spatial distributions of birds in the U.K.'s Peak District in 1990 and 2004....

  3. In-Plane free Vibration Analysis of an Annular Disk with Point Elastic Support

    OpenAIRE

    Bashmal, S.; Bhat, R.; Rakheja, S.

    2011-01-01

    In-plane free vibrations of an elastic and isotropic annular disk with elastic constraints at the inner and outer boundaries, which are applied either along the entire periphery of the disk or at a point are investigated. The boundary characteristic orthogonal polynomials are employed in the Rayleigh-Ritz method to obtain the frequency parameters and the associated mode shapes. Boundary characteristic orthogonal polynomials are generated for the free boundary conditions of the disk while arti...

  4. Point-splitting analysis of commutator anomalies in non-abelian chiral gauge theories

    International Nuclear Information System (INIS)

    Ghosh, S.; Banerjee, R.

    1988-01-01

    A gauge covariant point-splitting regularisation is employed to calculate different anomalous commutators in four dimensional chiral gauge theories. For an external gauge field the fixed time anomalous commutator of the gauge group generators is seen to violate the Jacobi identity. The cohomological prediction can be confirmed provided the electric fields do not commute. Other commutators like the current-current and current-electric field are consistent with the Bjorken-Johnson-Low (BJL) derivation. (orig.)

  5. Sorting points into neighborhoods (SPIN): data analysis and visualization by ordering distance matrices.

    Science.gov (United States)

    Tsafrir, D; Tsafrir, I; Ein-Dor, L; Zuk, O; Notterman, D A; Domany, E

    2005-05-15

We introduce a novel unsupervised approach for the organization and visualization of multidimensional data. At the heart of the method is a presentation of the full pairwise distance matrix of the data points, viewed in pseudocolor. The ordering of points is iteratively permuted in search of a linear ordering, which can be used to study embedded shapes. Several examples indicate how the shapes of certain structures in the data (elongated, circular and compact) manifest themselves visually in our permuted distance matrix. It is important to identify the elongated objects, since they are often associated with a set of hidden variables underlying continuous variation in the data. The problem of determining an optimal linear ordering is shown to be NP-complete, and therefore an iterative search algorithm with O(n³) step-complexity is suggested. By using sorting points into neighborhoods (SPIN) to analyze colon cancer expression data, we were able to address the serious problem of sample heterogeneity, which hinders identification of metastasis-related genes in our data. Our methodology brings to light the continuous variation of heterogeneity: starting with homogeneous tumor samples and gradually increasing the amount of another tissue. Ordering the samples according to their degree of contamination by unrelated tissue allows the separation of genes associated with irrelevant contamination from those related to cancer progression. A software package will be available for academic users upon request.
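A simplified stand-in for SPIN's permutation search (the paper iteratively permutes the ordering to optimize it; here a greedy nearest-neighbor walk is used instead, under the assumption of a clearly elongated structure): for points sampled along a curve and then shuffled, hopping from each point to its nearest unvisited neighbor recovers the linear ordering, which is what makes the elongated shape visible in the reordered distance matrix.

```python
import numpy as np

# Points sampled along an elongated 1-D curve in 3-D, then shuffled.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 60)
pts = np.column_stack([t, np.sin(t), t ** 2])
pts += 0.001 * rng.standard_normal(pts.shape)
perm = rng.permutation(60)
shuffled = pts[perm]

# Full pairwise distance matrix, as in the pseudocolor display.
D = np.linalg.norm(shuffled[:, None, :] - shuffled[None, :, :], axis=2)

# Greedy ordering: start at the point farthest from everything,
# then repeatedly hop to the nearest unvisited point.
order = [int(np.argmax(D.sum(axis=0)))]
visited = set(order)
while len(order) < len(shuffled):
    d = D[order[-1]].copy()
    d[list(visited)] = np.inf
    nxt = int(np.argmin(d))
    order.append(nxt)
    visited.add(nxt)

# The recovered ordering walks the curve monotonically (up or down).
recovered = perm[order]
```

After reordering rows and columns of D by `order`, small distances concentrate near the diagonal, the band-diagonal signature of an elongated object that SPIN's pseudocolor view exposes.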

  6. The fractography analysis of IN718 alloy after three-point flexure fatigue test

    Directory of Open Access Journals (Sweden)

    Belan Juraj

    2018-01-01

Full Text Available In this study, the high cycle fatigue (HCF) properties of IN718 superalloy with a given chemical composition were investigated in a three-point flexure fatigue test at room temperature. INCONEL alloy 718 is a hardenable nickel-chromium-iron alloy that, owing to its unique combination of mechanical properties (high strength, corrosion resistance, and so on), is used mostly for the production of heat-resistant parts of aero jet engines. The mechanical properties of this alloy depend strongly on the microstructure and on the presence of structural features such as the principal strengthening phase gamma double prime, gamma prime, and the (due to its morphology) less desired delta phase. These phases precipitate over various temperature ranges and depend on the Nb content as well. The three-point flexure fatigue test was performed on ZWICK/ROELL Amsler 150 HFP 5100 test equipment with an approximate loading frequency f = 150 Hz. The S-N (stress vs. number of cycles) curve was obtained after testing. With the help of a scanning electron microscope (SEM), fractography analyses were performed to disclose the fracture features of specimens in different life ranges. A brief comparison of three-point flexure and push-pull fatigue loading modes and their influence on fatigue life is discussed as well.

  7. In-Plane free Vibration Analysis of an Annular Disk with Point Elastic Support

    Directory of Open Access Journals (Sweden)

    S. Bashmal

    2011-01-01

    Full Text Available In-plane free vibrations of an elastic and isotropic annular disk with elastic constraints at the inner and outer boundaries, which are applied either along the entire periphery of the disk or at a point are investigated. The boundary characteristic orthogonal polynomials are employed in the Rayleigh-Ritz method to obtain the frequency parameters and the associated mode shapes. Boundary characteristic orthogonal polynomials are generated for the free boundary conditions of the disk while artificial springs are used to account for different boundary conditions. The frequency parameters for different boundary conditions of the outer edge are evaluated and compared with those available in the published studies and computed from a finite element model. The computed mode shapes are presented for a disk clamped at the inner edge and point supported at the outer edge to illustrate the free in-plane vibration behavior of the disk. Results show that addition of point clamped support causes some of the higher modes to split into two different frequencies with different mode shapes.

  8. CFD Analysis of the Active Part of the HYPER Spallation Target

    International Nuclear Information System (INIS)

    Nam-il Tak; Chungho Cho; Tae-Yung Song

    2006-01-01

    KAERI (Korea Atomic Energy Research Institute) is developing an accelerator driven system (ADS) named HYPER (HYbrid Power Extraction Reactor) for a transmutation of long-lived nuclear wastes. One of the challenging tasks for the HYPER system is to design a large spallation target having a beam power of 15∼25 MW. The present paper focuses on the thermal-hydraulic performance of the active part of the HYPER target. Computational fluid dynamics (CFD) analysis was performed using a commercial code CFX 5.7.1. Several advanced turbulence models with different grid structures were applied. The CFX results show the significant impact of the turbulence model on the window temperature. It is concluded that experimental verifications are very important for the design of the HYPER target. (authors)

  9. Accurate Analysis of Target Characteristic in Bistatic SAR Images: A Dihedral Corner Reflectors Case.

    Science.gov (United States)

    Ao, Dongyang; Li, Yuanhao; Hu, Cheng; Tian, Weiming

    2017-12-22

Dihedral corner reflectors are the basic geometric structure of many targets and are the main contributors to radar cross section (RCS) in synthetic aperture radar (SAR) images. In stealth technologies, the elaborate design of dihedral corners with different opening angles is a useful approach to reducing the high RCS generated by multiple reflections. As bistatic synthetic aperture sensors have flexible geometric configurations and are sensitive to dihedral corners with different opening angles, they are especially suited to stealth target detection. In this paper, the scattering characteristic of dihedral corner reflectors is accurately analyzed in bistatic synthetic aperture images. The variation of RCS with changing opening angle is formulated, and a method to design a proper bistatic radar for maximizing the detection capability is provided. Both the theoretical analysis and the experiments show that bistatic SAR can detect dihedral corners under a certain bistatic angle, which is related to the geometry of the target structure.

  10. Detection of Moving Targets Based on Doppler Spectrum Analysis Technique for Passive Coherent Radar

    Directory of Open Access Journals (Sweden)

    Zhao Yao-dong

    2013-06-01

Full Text Available A novel method of moving-target detection using Doppler spectrum analysis for Passive Coherent Radar (PCR) is presented. After dividing the received signal into segments treated as a pulse series, it applies pulse compression and Doppler processing to detect and locate the targets. Based on the algorithm for Pulse-Doppler (PD) radar, the equivalence between continuous-wave and pulsed-wave matched filtering is proved and the details of the method are introduced. To compare it with the traditional method of Cross-Ambiguity Function (CAF) calculation, the relationship and mathematical models of the two are analyzed, with some suggestions on parameter choices. With little effect on target gain, the method greatly improves processing efficiency. The validity of the proposed method is demonstrated by offline processing of real collected data sets and by simulation results.
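The segment-compress-then-Doppler idea can be sketched with synthetic data (a toy sketch, not the paper's PCR processing chain; the reference waveform, delay, and Doppler values are illustrative assumptions): each "pulse" is matched-filtered against the reference via circular correlation, and an FFT across the pulse dimension forms a range-Doppler map whose peak sits at the target's delay and Doppler bins.

```python
import numpy as np

# M pulses of N samples; target echo delayed by `delay` samples with a
# Doppler rotation of `fd` cycles per pulse (all values illustrative).
rng = np.random.default_rng(2)
M, N, delay, fd = 32, 128, 20, 4.0 / 32.0
ref = np.exp(2j * np.pi * rng.random(N))   # noise-like reference pulse

pulses = np.zeros((M, N), dtype=complex)
for m in range(M):
    echo = np.roll(ref, delay) * np.exp(2j * np.pi * fd * m)
    noise = 0.5 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
    pulses[m] = echo + noise

# Pulse compression: circular matched filtering against the reference ...
Rf = np.conj(np.fft.fft(ref))
compressed = np.fft.ifft(np.fft.fft(pulses, axis=1) * Rf, axis=1)

# ... then Doppler processing: FFT across the slow-time (pulse) dimension.
rd_map = np.fft.fft(compressed, axis=0)
dbin, rbin = np.unravel_index(np.argmax(np.abs(rd_map)), rd_map.shape)
```

Compressing each short segment costs two length-N FFTs instead of correlating the full continuous record against every Doppler hypothesis, which is the efficiency gain over direct CAF evaluation that the abstract claims.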

  11. Deterministic and stochastic analysis of alternative climate targets under differentiated cooperation regimes

    International Nuclear Information System (INIS)

    Loulou, Richard; Labriet, Maryse; Kanudia, Amit

    2009-01-01

This article analyzes the feasibility of attaining a variety of climate targets during the 21st century under alternative cooperation regimes by groups of countries. Five climate targets of increasing severity are analyzed, following the EMF-22 experiment. Each target is attempted under two cooperation regimes: a First Best scenario, where all countries fully cooperate from 2012 on, and a Second Best scenario, where the world is partitioned into three groups and each group of countries enters the cooperation at a different date and implements emission abatement actions progressively once in the coalition. The resulting ten combinations are simulated via the ETSAP-TIAM technology-based integrated assessment model. In addition to the 10 separate case analyses, the article proposes a probabilistic treatment of three targets under the First Best scenario, and shows that the three forcing targets may in fact be interpreted as a single target on global temperature change, while assuming that the climate sensitivity Cs is uncertain. It is shown that such an interpretation is possible only if the probability distribution of Cs is carefully chosen. The analysis of the results shows that the lowest forcing level is unattainable unless immediate coordinated action is undertaken by all countries, and even then only at a high global cost. The middle and high forcing levels are feasible at affordable global costs, even under the Second Best scenario. Another original contribution of this article is to explain why certain combinations of technological choices are made by the model, and in particular why the climate target clearly supersedes the usually accepted objective of improving energy efficiency.
The analysis shows that under some climate targets, it is not optimal to improve energy efficiency, but rather to take advantage of certain technologies that help to reach the climate objective, but that happen to be less energy efficient than even the technologies

  12. The application of quality risk management to the bacterial endotoxins test: use of hazard analysis and critical control points.

    Science.gov (United States)

    Annalaura, Carducci; Giulia, Davini; Stefano, Ceccanti

    2013-01-01

    Risk analysis is widely used in the pharmaceutical industry to manage production processes, validation activities, training, and other activities. Several methods of risk analysis are available (for example, failure mode and effects analysis, fault tree analysis), and one or more should be chosen and adapted to the specific field where they will be applied. Among the methods available, hazard analysis and critical control points (HACCP) is a methodology that has been applied since the 1960s, and whose areas of application have expanded over time from food to the pharmaceutical industry. It can be easily and successfully applied to several processes because its main feature is the identification, assessment, and control of hazards. It can be also integrated with other tools, such as fishbone diagram and flowcharting. The aim of this article is to show how HACCP can be used to manage an analytical process, propose how to conduct the necessary steps, and provide data templates necessary to document and useful to follow current good manufacturing practices. In the quality control process, risk analysis is a useful tool for enhancing the uniformity of technical choices and their documented rationale. Accordingly, it allows for more effective and economical laboratory management, is capable of increasing the reliability of analytical results, and enables auditors and authorities to better understand choices that have been made. The aim of this article is to show how hazard analysis and critical control points can be used to manage bacterial endotoxins testing and other analytical processes in a formal, clear, and detailed manner.

  13. Transcriptome profiling to identify ATRA-responsive genes in human iPSC-derived endoderm for high-throughput point of departure analysis (SOT Annual Meeting)

    Science.gov (United States)

    Toxicological tipping points occur at chemical concentrations that overwhelm a cell’s adaptive response leading to permanent effects. We focused on retinoid signaling in differentiating endoderm to identify developmental pathways for tipping point analysis. Human induced pluripot...

  14. A psychological and Islamic analysis of corporal punishment from children’s point of view

    Directory of Open Access Journals (Sweden)

    Mahbobeh Alborzi

    2012-04-01

Full Text Available There are various methods and ways for the socialization of children. To this end, this study analytically examines children's point of view on corporal punishment from Islamic and psychological perspectives. Forty male and female preschoolers, selected on the basis of availability, were assessed using drawing and interview tests. The statistical analyses showed that most of the participating children had experienced mild corporal punishment. They did not have a positive view of corporal punishment, but they approved of other disciplining methods such as deprivation, penalization, and not being spoken to. Children who had experienced severe corporal punishment used dark colors, light lines, and the least space in their paintings. Regression analysis also showed that, among the demographic variables of the parents, the age and education of the mothers were significant negative predictors of the use of corporal punishment. The results were analyzed from both Islamic and psychological perspectives.

  15. Seismic analysis of fuel and target assemblies at a production reactor

    International Nuclear Information System (INIS)

    Braverman, J.I.; Wang, Y.K.

    1991-01-01

This paper describes the unique modeling and analysis considerations used to assess the seismic adequacy of the fuel and target assemblies in a production reactor at the Savannah River Site. This confirmatory analysis was necessary to provide assurance that the reactor can operate safely during a seismic event and be brought to a safe shutdown condition. The plant, originally designed in the 1950s, was required to be assessed against more current seismic criteria. The design of the reactor internals and the magnitude of the structural responses enabled the use of a linear elastic dynamic analysis. A seismic analysis was performed using a finite element model consisting of the fuel and target assemblies, the reactor tank, and a portion of the concrete structure supporting the reactor tank. The submergence of the fuel and target assemblies in the water contained within the reactor tank can have a significant effect on their seismic response. Thus, the model included hydrodynamic fluid coupling effects between the assemblies and the reactor tank. Fluid coupling mass terms were based on formulations for solid bodies immersed in incompressible and frictionless fluids. The potential effects of gap conditions were also assessed in this evaluation. 5 refs., 6 figs., 1 tab

  16. Mapping the order and pattern of brain structural MRI changes using change-point analysis in premanifest Huntington's disease.

    Science.gov (United States)

    Wu, Dan; Faria, Andreia V; Younes, Laurent; Mori, Susumu; Brown, Timothy; Johnson, Hans; Paulsen, Jane S; Ross, Christopher A; Miller, Michael I

    2017-10-01

Huntington's disease (HD) is an autosomal dominant neurodegenerative disorder that progressively affects motor, cognitive, and emotional functions. Structural MRI studies have demonstrated brain atrophy beginning many years prior to clinical onset (the "premanifest" period), but the order and pattern of brain structural changes have not been fully characterized. In this study, we investigated brain regional volumes and diffusion tensor imaging (DTI) measurements in premanifest HD, aiming to determine (1) the extent of MRI changes in a large number of structures across the brain by atlas-based analysis, and (2) the initiation points of structural MRI changes in these brain regions. We adopted a novel multivariate linear regression model to detect the inflection points at which the MRI changes begin (namely, "change-points") with respect to the CAG-age product (CAP, an indicator of the extent of exposure to the effects of CAG repeat expansion). We used approximately 300 T1-weighted and DTI datasets from premanifest HD and control subjects in the PREDICT-HD study, with atlas-based whole-brain segmentation and change-point analysis. The results indicated a distinct topology of structural MRI changes: the change-points of the volumetric measurements suggested a central-to-peripheral pattern of atrophy from the striatum to the deep white matter, and the change-points of the DTI measurements indicated the earliest changes in mean diffusivity in the deep white matter and posterior white matter. While interpretation needs to be cautious given the cross-sectional nature of the data, these findings suggest a spatial and temporal pattern of spread of structural changes within the HD brain. Hum Brain Mapp 38:5035-5050, 2017. © 2017 Wiley Periodicals, Inc.
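The change-point idea in this abstract can be illustrated with a toy calculation. This is a sketch under invented data, not the authors' multivariate regression model: a hinge (constant-then-linear) model is fitted at each candidate breakpoint of a synthetic volume-versus-CAP curve, and the breakpoint minimizing the squared error is taken as the change-point.

```python
# Illustrative change-point fit on synthetic data (not the PREDICT-HD pipeline).

def sse_hinge(xs, ys, cp):
    """SSE of a model that is constant before cp and linear after it."""
    pre = [y for x, y in zip(xs, ys) if x <= cp]
    post = [(x, y) for x, y in zip(xs, ys) if x > cp]
    sse = 0.0
    if pre:
        m = sum(pre) / len(pre)              # best constant = mean
        sse += sum((y - m) ** 2 for y in pre)
    if len(post) >= 2:                        # least-squares line after cp
        n = len(post)
        mx = sum(x for x, _ in post) / n
        my = sum(y for _, y in post) / n
        sxx = sum((x - mx) ** 2 for x, _ in post)
        sxy = sum((x - mx) * (y - my) for x, y in post)
        b = sxy / sxx if sxx else 0.0
        a = my - b * mx
        sse += sum((y - (a + b * x)) ** 2 for x, y in post)
    return sse

def find_change_point(xs, ys, candidates):
    return min(candidates, key=lambda cp: sse_hinge(xs, ys, cp))

# Synthetic regional volume: stable until CAP = 300, then declining.
xs = list(range(200, 500, 10))
ys = [100.0 if x <= 300 else 100.0 - 0.05 * (x - 300) for x in xs]
cp = find_change_point(xs, ys, range(220, 480, 10))
print(cp)  # lands within one grid step of the true breakpoint at 300
```

On noisy real measurements the same search is typically wrapped in a regression with covariates and a permutation or bootstrap test, as the abstract's model does.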

  17. HYGIENE PRACTICES IN URBAN RESTAURANTS AND CHALLENGES TO IMPLEMENTING FOOD SAFETY AND HAZARD ANALYSIS CRITICAL CONTROL POINTS (HACCP) PROGRAMMES IN THIKA TOWN, KENYA.

    Science.gov (United States)

    Muinde, R K; Kiinyukia, C; Rombo, G O; Muoki, M A

    2012-12-01

To determine the microbial load in food, examine safety measures, and assess the possibility of implementing a Hazard Analysis Critical Control Points (HACCP) system. The target population for this study consisted of restaurant owners in Thika Municipality (n = 30). Simple random samples of restaurants were selected by a systematic sampling method for microbial analysis of cooked food, non-cooked food, raw food, and water sanitation in the selected restaurants. Two hundred and ninety-eight restaurants within Thika Municipality were identified. Of these, 30 were sampled for microbiological testing. From the study, 221 (74%) of the restaurants were ready-to-eat establishments where food was prepared early enough to hold, and only in 77 (26%) of the restaurants did customers place an order for the food they wanted. 118 (63%) of the restaurant operators/staff had knowledge of quality control and food safety measures, 24 (8%) of the restaurants applied this knowledge, while 256 (86%) of the restaurant staff indicated that food contains ingredients that are hazardous if poorly handled. 238 (80%) of the restaurants used weighing and sorting of food materials, 45 (15%) used preservation methods, and the rest used dry foods as critical control points for food safety. The study showed a need for implementation of a Hazard Analysis Critical Control Points (HACCP) system to enhance food safety. Knowledge of HACCP was very low, with 89 (30%) of the restaurants applying some quality measures to the food production process. There was contamination with coliforms, Escherichia coli and Staphylococcus aureus, though at very low levels. The mean counts of coliforms, Escherichia coli and Staphylococcus aureus in sampled food were 9.7 × 10³ CFU/g, 8.2 × 10³ CFU/g and 5.4 × 10³ CFU/g respectively, with coliforms having the highest mean.

  18. Analysis of divertor asymmetry using a simple five-point model

    International Nuclear Information System (INIS)

    Hayashi, Nobuhiko; Takizuka, Tomonori; Hatayama, Akiyoshi; Ogasawara, Masatada.

    1997-03-01

    A simple five-point model of the scrape-off layer (SOL) plasma outside the separatrix of a diverted tokamak has been developed to study the inside/outside divertor asymmetry. The SOL current, gas pumping/puffing in the divertor region, and divertor plate biasing are included in this model. Gas pumping/puffing and biasing are shown to control divertor asymmetry. In addition, the SOL current is found to form asymmetric solutions without external controls of gas pumping/puffing and biasing. (author)

  19. Analysis of random point images with the use of symbolic computation codes and generalized Catalan numbers

    Science.gov (United States)

    Reznik, A. L.; Tuzikov, A. V.; Solov'ev, A. A.; Torgov, A. V.

    2016-11-01

    Original codes and combinatorial-geometrical computational schemes are presented, which are developed and applied for finding exact analytical formulas that describe the probability of errorless readout of random point images recorded by a scanning aperture with a limited number of threshold levels. Combinatorial problems encountered in the course of the study and associated with the new generalization of Catalan numbers are formulated and solved. An attempt is made to find the explicit analytical form of these numbers, which is, on the one hand, a necessary stage of solving the basic research problem and, on the other hand, an independent self-consistent problem.

  20. The analysis of thermal network of district heating system from investor point of view

    Science.gov (United States)

    Takács, Ján; Rácz, Lukáš

    2016-06-01

    The hydraulics of a thermal network of a district heating system is a very important issue, to which not enough attention is often paid. In this paper the authors want to point out some of the important aspects of the design and operation of thermal networks in district heating systems. The design boundary conditions of a heat distribution network and the requirements on active pressure - circulation pump - influencing the operation costs of the centralized district heating system as a whole, are analyzed in detail. The heat generators and the heat exchange stations are designed according to the design heat loads after thermal insulation, and modern boiler units are installed in the heating plant.

  1. A BRDF-BPDF database for the analysis of Earth target reflectances

    Science.gov (United States)

    Breon, Francois-Marie; Maignan, Fabienne

    2017-01-01

Land surface reflectance is not isotropic. It varies with the observation geometry, which is defined by the sun and view zenith angles and the relative azimuth. In addition, the reflectance is linearly polarized. The reflectance anisotropy is quantified by the bidirectional reflectance distribution function (BRDF), while its polarization properties are described by the bidirectional polarization distribution function (BPDF). The POLDER radiometer that flew onboard the PARASOL microsatellite remains the only space instrument that measured numerous samples of the BRDF and BPDF of Earth targets. Here, we describe a database of representative BRDFs and BPDFs derived from the POLDER measurements. From the huge number of data acquired by the spaceborne instrument over a period of 7 years, we selected a set of targets with high-quality observations. The selection aimed for a large number of observations, free of significant cloud or aerosol contamination, acquired in diverse observation geometries with a focus on the backscatter direction that shows the specific hot spot signature. The targets are sorted according to the 16-class International Geosphere-Biosphere Programme (IGBP) land cover classification system, and the target selection aims at spatial representativeness within each class. The database thus provides a set of high-quality BRDF and BPDF samples that can be used to assess the typical variability of natural surface reflectances or to evaluate models. It is freely available from the PANGAEA website (doi:10.1594/PANGAEA.864090). In addition to the database, we provide a visualization and analysis tool based on the Interactive Data Language (IDL). It allows an interactive analysis of the measurements and a comparison against various BRDF and BPDF analytical models. The present paper describes the input data, the selection principles, the database format, and the analysis tool.

  2. Towards understanding the lifespan extension by reduced insulin signaling: bioinformatics analysis of DAF-16/FOXO direct targets in Caenorhabditis elegans.

    Science.gov (United States)

    Li, Yan-Hui; Zhang, Gai-Gai

    2016-04-12

    DAF-16, the C. elegans FOXO transcription factor, is an important determinant in aging and longevity. In this work, we manually curated FOXODB http://lyh.pkmu.cn/foxodb/, a database of FOXO direct targets. It now covers 208 genes. Bioinformatics analysis on 109 DAF-16 direct targets in C. elegans found interesting results. (i) DAF-16 and transcription factor PQM-1 co-regulate some targets. (ii) Seventeen targets directly regulate lifespan. (iii) Four targets are involved in lifespan extension induced by dietary restriction. And (iv) DAF-16 direct targets might play global roles in lifespan regulation.

  3. Analysis of the thermomechanical behavior of the IFMIF bayonet target assembly under design loading scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Bernardi, D., E-mail: davide.bernardi@enea.it [ENEA Brasimone, Camugnano, BO (Italy); Arena, P.; Bongiovì, G.; Di Maio, P.A. [Dipartimento di Energia, Ingegneria dell’Informazione e Modelli Matematici, Università di Palermo, Viale delle Scienze, Palermo (Italy); Frisoni, M. [ENEA Bologna, Via Martiri di Monte Sole 4, Bologna (Italy); Miccichè, G.; Serra, M. [ENEA Brasimone, Camugnano, BO (Italy)

    2015-10-15

    In the framework of the IFMIF Engineering Validation and Engineering Design Activities (IFMIF/EVEDA) phase, ENEA is responsible for the design of the European concept of the IFMIF lithium target system which foresees the possibility to periodically replace only the most irradiated and thus critical component (i.e., the backplate) while continuing to operate the rest of the target for a longer period (the so-called bayonet backplate concept). In this work, the results of the steady state thermomechanical analysis of the IFMIF bayonet target assembly under two different design loading scenarios (a “hot” scenario and a “cold” scenario) are briefly reported highlighting the relevant indications obtained with respect to the fulfillment of the design requirements. In particular, the analyses have shown that in the hot scenario the temperatures reached in the target assembly are within the material acceptable limits while in the cold scenario transition below the ductile to brittle transition temperature (DBTT) cannot be excluded. Moreover, results indicate that the contact between backplate and high flux test module is avoided and that the overall structural integrity of the system is assured in both scenarios. However, stress linearization analysis reveals that ITER Structural Design Criteria for In-vessel Components (SDC-IC) design rules are not always met along the selected paths at backplate middle plane section in the hot scenario, thus suggesting the need of a revision of the backplate design or a change of the operating conditions.

  4. Thermal analysis of LEU modified Cintichem target irradiated in TRIGA reactor

    International Nuclear Information System (INIS)

    Catana, A; Toma, C.

    2009-01-01

Actions taken in recent years at the international level to convert the molybdenum production process from HEU to LEU targets created opportunities for INR to access information and participate in international discussions under IAEA auspices, and concrete steps toward developing fission molybdenum technology were thereby facilitated. The Institute for Nuclear Research, which combines a number of favourable conditions, such as suitable irradiation possibilities, direct communication between the reactor and the hot cell facility, the capacity to handle highly radioactive sources, and, at the same time, an expanding internal market, decided to undertake the steps necessary to produce fission molybdenum. Over the last several years of effort in this direction, we developed the steps for fission molybdenum technology based on the modified Cintichem process, in accordance with the proven Argonne National Laboratory methodology. Progress made by INR on heat transfer computations for the annular target is presented. An advanced thermal-hydraulic analysis was performed to estimate the heat removal capability of a low-enriched uranium (LEU) foil annular target irradiated in the TRIGA reactor core. As a result, the present analysis provides an upper-limit estimate of the LEU-foil and external target surface temperatures during irradiation in the TRIGA 14 MW reactor. (authors)

  5. Radiation inactivation analysis of enzymes. Effect of free radical scavengers on apparent target sizes

    International Nuclear Information System (INIS)

    Eichler, D.C.; Solomonson, L.P.; Barber, M.J.; McCreery, M.J.; Ness, G.C.

    1987-01-01

    In most cases the apparent target size obtained by radiation inactivation analysis corresponds to the subunit size or to the size of a multimeric complex. In this report, we examined whether the larger than expected target sizes of some enzymes could be due to secondary effects of free radicals. To test this proposal we carried out radiation inactivation analysis on Escherichia coli DNA polymerase I, Torula yeast glucose-6-phosphate dehydrogenase, Chlorella vulgaris nitrate reductase, and chicken liver sulfite oxidase in the presence and absence of free radical scavengers (benzoic acid and mannitol). In the presence of free radical scavengers, inactivation curves are shifted toward higher radiation doses. Plots of scavenger concentration versus enzyme activity showed that the protective effect of benzoic acid reached a maximum at 25 mM then declined. Mannitol alone had little effect, but appeared to broaden the maximum protective range of benzoic acid relative to concentration. The apparent target size of the polymerase activity of DNA polymerase I in the presence of free radical scavengers was about 40% of that observed in the absence of these agents. This is considerably less than the minimum polypeptide size and may reflect the actual size of the polymerase functional domain. Similar effects, but of lesser magnitude, were observed for glucose-6-phosphate dehydrogenase, nitrate reductase, and sulfite oxidase. These results suggest that secondary damage due to free radicals generated in the local environment as a result of ionizing radiation can influence the apparent target size obtained by this method
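The "apparent target size" discussed in this abstract comes from classical target theory: the molecular mass is inferred from the dose D37 at which activity falls to 37% (1/e) of its initial value. A minimal sketch, assuming the commonly quoted room-temperature relation M ≈ 6.4 × 10¹¹ / D37 (D37 in rads) and invented survival numbers, neither taken from this paper:

```python
# Hedged illustration of a target-size estimate from one survival data point.
import math

def d37_from_survival(dose_rads, fraction_active):
    """D37 from single-hit kinetics: fraction = exp(-dose / D37)."""
    return -dose_rads / math.log(fraction_active)

def target_mass_da(d37_rads):
    # Constant from the classical room-temperature calibration (illustrative).
    return 6.4e11 / d37_rads

d37 = d37_from_survival(12e6, 0.30)       # 30% activity left after 12 Mrad
print(round(target_mass_da(d37) / 1000))  # apparent target size, ~64 kDa
```

The abstract's point is that free radical scavengers shift the inactivation curve to higher doses, i.e. increase D37 and therefore shrink the apparent mass computed this way.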

  6. Object-Based Point Cloud Analysis of Full-Waveform Airborne Laser Scanning Data for Urban Vegetation Classification

    Directory of Open Access Journals (Sweden)

    Norbert Pfeifer

    2008-08-01

Full Text Available Airborne laser scanning (ALS) is a remote sensing technique well suited for 3D vegetation mapping and structure characterization because the emitted laser pulses are able to penetrate small gaps in the vegetation canopy. The backscattered echoes from the foliage, woody vegetation, the terrain, and other objects are detected, leading to a cloud of points. Higher echo densities (>20 echoes/m²) and additional classification variables from full-waveform (FWF) ALS data, namely echo amplitude, echo width, and information on multiple echoes from one shot, offer new possibilities in classifying the ALS point cloud. Currently, FWF sensor information is hardly used for classification purposes. This contribution presents an object-based point cloud analysis (OBPA) approach, combining segmentation and classification of the 3D FWF ALS points, designed to detect tall vegetation in urban environments. The definition of tall vegetation includes trees and shrubs but excludes grassland and herbage. In the applied procedure, FWF ALS echoes are segmented by a seeded region growing procedure. All echoes, sorted in descending order by their surface roughness, are used as seed points. Segments are grown based on echo width homogeneity. Next, segment statistics (mean, standard deviation, and coefficient of variation) are calculated by aggregating echo features such as amplitude and surface roughness. For classification, a rule base is derived automatically from a training area using a statistical classification tree. To demonstrate our method we present data of three sites with around 500,000 echoes each. The accuracy of the classified vegetation segments is evaluated for two independent validation sites. In a point-wise error assessment, where the classification is compared with manually classified 3D points, completeness and correctness better than 90% are reached for the validation sites. In comparison to many other algorithms the proposed 3D point classification works on the original

  7. Delay analysis of a point-to-multipoint spectrum sharing network with CSI based power allocation

    KAUST Repository

    Khan, Fahd Ahmed

    2012-10-01

In this paper, we analyse the delay performance of a point-to-multipoint cognitive radio network that shares the spectrum with a point-to-multipoint primary network. The channel is assumed to be independent but not identically distributed and to experience Nakagami-m fading. A constraint on the peak transmit power of the secondary user transmitter (SU-Tx) is considered in addition to the peak interference power constraint. Based on the constraints, a power allocation scheme that requires knowledge of the instantaneous channel state information (CSI) of the interference links is derived. The SU-Tx is assumed to be equipped with a buffer and is modelled using the M/G/1 queueing model. Closed-form expressions for the probability density function (PDF) and cumulative distribution function (CDF) of the packet transmission time are derived. Using the PDF, expressions for the moments of the transmission time are obtained. In addition, using the moments, expressions for performance measures such as the total average waiting time of packets and the average number of packets waiting in the buffer of the SU-Tx are obtained. Numerical simulations corroborate the theoretical results. © 2012 IEEE.
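The final step described in this abstract rests on the standard M/G/1 relations. A minimal sketch with made-up arrival-rate and service-moment values (not taken from the paper): once the first two moments of the packet transmission time are known, the Pollaczek-Khinchine formula gives the mean waiting time in the buffer, and Little's law gives the mean number of waiting packets.

```python
# Hedged illustration of the M/G/1 queueing step; all numbers are invented.

def mg1_waiting_time(lam, es, es2):
    """Mean wait in queue for M/G/1: W = lam * E[S^2] / (2 * (1 - lam * E[S]))."""
    rho = lam * es                      # utilisation; must be < 1 for stability
    assert rho < 1, "queue is unstable"
    return lam * es2 / (2.0 * (1.0 - rho))

lam = 0.5    # packet arrival rate (packets/s)
es = 1.0     # E[S], mean transmission time (s)
es2 = 2.0    # E[S^2]; for exponential service, E[S^2] = 2 * E[S]^2
W = mg1_waiting_time(lam, es, es2)
Lq = lam * W                            # Little's law: mean packets waiting
print(W, Lq)                            # 1.0 0.5
```

With exponential service these values reduce to the M/M/1 result W = ρ/(μ − λ), which is a quick sanity check on the formula.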

  8. Theoretical analysis of leaky surface acoustic waves of point-focused acoustic lens and some experiments

    International Nuclear Information System (INIS)

    Ishikawa, Isao; Suzuki, Yoshiaki; Ogura, Yukio; Katakura, Kageyoshi

    1997-01-01

When a point-focused acoustic lens in the scanning acoustic microscope (SAM) is faced toward a test specimen and defocused to some extent, two effective echoes can be obtained. One is the echo of the longitudinal wave, which is normally incident upon the specimen as an on-axis beam in the central region of the lens and is reflected normal to the lens surface, hence detected by the transducer. The other is of leaky surface acoustic waves (LSAW), which are mode converted from a narrow beam of off-axis longitudinal waves, then propagate across the surface of the specimen and reradiate at angles normal to the lens surface, thus being detected by the transducer. These two echoes either interfere with or are separated from each other depending on the defocused distance. It turned out theoretically that the LSAW have a narrow focal spot in the central region of the point-focused acoustic lens, whose size is approximately 40% of the LSAW wavelength. On top of that, the wavelength of the LSAW is about 50% as short as that of the longitudinal wave. So, it is expected that high-resolution images can be obtained provided LSAW are used in the scanning acoustic microscope.

  9. Unsteady-state analysis of a counter-flow dew point evaporative cooling system

    KAUST Repository

    Lin, J.

    2016-07-19

    Understanding the dynamic behavior of the dew point evaporative cooler is crucial in achieving efficient cooling for real applications. This paper details the development of a transient model for a counter-flow dew point evaporative cooling system. The transient model approaching steady conditions agreed well with the steady state model. Additionally, it is able to accurately predict the experimental data within 4.3% discrepancy. The transient responses of the cooling system were investigated under different inlet air conditions. Temporal temperature and humidity profiles were analyzed for different transient and step responses. The key findings from this study include: (1) the response trend and settling time is markedly dependent on the inlet air temperature, humidity and velocity; (2) the settling time of the transient response ranges from 50 s to 300 s when the system operates under different inlet conditions; and (3) the average transient wet bulb effectiveness (1.00–1.06) of the system is observed to be higher than the steady state wet bulb effectiveness (1.01) for our range of study. © 2016 Elsevier Ltd
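The settling times reported above (50 s to 300 s) are consistent with roughly first-order thermal dynamics. As a minimal sketch, not the authors' transient model, one can step-change the inlet of a lumped first-order system and measure the time to settle within a 2% band of the new steady state; the time constant below is an assumed value:

```python
# Hedged illustration: settling time of a first-order response to a step input.

def settling_time(tau, dt=0.1, band=0.02, t_max=1000.0):
    """Time for x(t) = 1 - exp(-t/tau) to enter within `band` of 1."""
    t, x = 0.0, 0.0
    while abs(1.0 - x) > band and t < t_max:
        x += dt * (1.0 - x) / tau   # explicit Euler for dx/dt = (1 - x)/tau
        t += dt
    return t

# For a first-order system, settling time ≈ tau * ln(1/band) ≈ 3.9 * tau.
print(round(settling_time(25.0)))   # ~98 s, inside the reported 50-300 s range
```

Fitting such a time constant to measured outlet-temperature transients is one simple way to summarize the dependence of settling time on inlet conditions that the study reports.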

  10. Data for Suspect Screening and Non-Targeted Analysis of Drinking Water Using Point-Of-Use Filters

    Data.gov (United States)

    U.S. Environmental Protection Agency — This dataset contains information about all the features extracted from the raw data files, the formulas that were assigned to some of these features, and the...

  11. A single point in protein trafficking by Plasmodium falciparum determines the expression of major antigens on the surface of infected erythrocytes targeted by human antibodies.

    Science.gov (United States)

    Chan, Jo-Anne; Howell, Katherine B; Langer, Christine; Maier, Alexander G; Hasang, Wina; Rogerson, Stephen J; Petter, Michaela; Chesson, Joanne; Stanisic, Danielle I; Duffy, Michael F; Cooke, Brian M; Siba, Peter M; Mueller, Ivo; Bull, Peter C; Marsh, Kevin; Fowkes, Freya J I; Beeson, James G

    2016-11-01

    Antibodies to blood-stage antigens of Plasmodium falciparum play a pivotal role in human immunity to malaria. During parasite development, multiple proteins are trafficked from the intracellular parasite to the surface of P. falciparum-infected erythrocytes (IEs). However, the relative importance of different proteins as targets of acquired antibodies, and key pathways involved in trafficking major antigens remain to be clearly defined. We quantified antibodies to surface antigens among children, adults, and pregnant women from different malaria-exposed regions. We quantified the importance of antigens as antibody targets using genetically engineered P. falciparum with modified surface antigen expression. Genetic deletion of the trafficking protein skeleton-binding protein-1 (SBP1), which is involved in trafficking the surface antigen PfEMP1, led to a dramatic reduction in antibody recognition of IEs and the ability of human antibodies to promote opsonic phagocytosis of IEs, a key mechanism of parasite clearance. The great majority of antibody epitopes on the IE surface were SBP1-dependent. This was demonstrated using parasite isolates with different genetic or phenotypic backgrounds, and among antibodies from children, adults, and pregnant women in different populations. Comparisons of antibody reactivity to parasite isolates with SBP1 deletion or inhibited PfEMP1 expression suggest that PfEMP1 is the dominant target of acquired human antibodies, and that other P. falciparum IE surface proteins are minor targets. These results establish SBP1 as part of a critical pathway for the trafficking of major surface antigens targeted by human immunity, and have key implications for vaccine development, and quantifying immunity in populations.

  12. Allocating the Fixed Resources and Setting Targets in Integer Data Envelopment Analysis

    Directory of Open Access Journals (Sweden)

    Kobra Gholami

    2013-11-01

Full Text Available Data envelopment analysis (DEA) is a non-parametric approach to evaluating a set of decision-making units (DMUs) that consume multiple inputs to produce multiple outputs. Formally, DEA is used to estimate efficiency scores relative to the empirical efficient frontier. DEA can also be used to allocate resources and set targets for future forecasts. In the standard DEA model the data are continuous, whereas many real-life problems require integer data, such as the number of employees, machines, experts, and so on. Thus, in this paper we propose, for the first time, an approach to allocating fixed resources and setting fixed targets under a selective integer assumption, based on an integer data envelopment analysis (IDEA) approach. The major aim of this approach is to preserve the efficiency scores of the DMUs, and we use the concept of benchmarking to reach this aim. A numerical example illustrates the applicability of the proposed method.
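The frontier idea underlying DEA can be shown in its simplest special case. This is a sketch with invented data, not the authors' integer model: with one input and one output under constant returns to scale, the CCR efficiency of each DMU reduces to its output/input ratio divided by the best ratio in the sample (the general multi-input case requires solving a linear program per DMU).

```python
# Hedged one-input/one-output DEA (CCR) illustration; data are invented.

def ccr_efficiency_1in_1out(inputs, outputs):
    """Efficiency of each DMU = its productivity ratio / best ratio."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)                 # the DMU defining the efficient frontier
    return [r / best for r in ratios]

inputs = [10.0, 20.0, 30.0]    # e.g. number of employees per unit
outputs = [5.0, 15.0, 18.0]    # e.g. services delivered
print(ccr_efficiency_1in_1out(inputs, outputs))
# The second DMU (ratio 0.75) defines the frontier; the others score below 1.
```

Resource-allocation and target-setting variants, like the one this paper proposes, then adjust inputs or outputs while keeping each DMU's score against this frontier unchanged.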

  13. Avian Analysis of LCTA Core Plot Data: West Point Military Academy

    National Research Council Canada - National Science Library

    Schreiber, Eric

    1998-01-01

    The Land Condition Trend Analysis (LCTA) program is the Army's standard for land inventory and monitoring, using standard methods for natural resources data collection, analyses, and reporting designed to meet multiple goals and objectives...

  14. Avian Analysis of LCTA Core Plot Data: West Point Military Academy (Revised)

    National Research Council Canada - National Science Library

    Schreiber, Eric

    1998-01-01

    The Land Condition Trend Analysis (LCTA) program is the Army's standard for land inventory and monitoring, using standard methods for natural resources data collection, analyses, and reporting designed to meet multiple goals and objectives...

  15. Behavior analysis and social constructionism: some points of contact and departure.

    Science.gov (United States)

    Roche, Bryan; Barnes-Holmes, Dermot

    2003-01-01

Social constructionists occasionally single out behavior analysis as the field of psychology that most closely resembles the natural sciences in its commitment to empiricism, and accuse it of suffering from many of the limitations of science identified by the postmodernist movement (e.g., K. J. Gergen, 1985a; Soyland, 1994). Indeed, behavior analysis is a natural science in many respects. However, it also shares with social constructionism important epistemological features, such as a rejection of mentalism, a functional-analytic approach to language, the use of interpretive methodologies, and a reflexive stance on analysis. The current paper briefly outlines the key tenets of the behavior-analytic and social constructionist perspectives before examining a number of commonalities between these approaches. The paper aims to show that, far from being a nemesis to social constructionism, behavior analysis may in fact be its close ally.

  16. [Power, interdependence and complementarity in hospital work: an analysis from the nursing point of view].

    Science.gov (United States)

    Lopes, M J

    1997-01-01

This essay discusses recent transformations in hospital work, and in nursing work specifically. The analysis privileges the inter- and intra-relations of the multidisciplinary teams constituted by the practices of the therapeutic process present in hospital space-time.

  17. Life-Cycle Cost-Benefit (LCCB) Analysis of Bridges from a User and Social Point of View

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle

    2009-01-01

During the last two decades, important progress has been made in the life-cycle cost-benefit (LCCB) analysis of structures, especially offshore platforms, bridges and nuclear installations, due to the large uncertainties related to the deterioration, maintenance, and benefits of such structures. The aim of this paper is to present and discuss some of these problems from a user and social point of view. A brief presentation of a preliminary study of the importance of including benefits in life-cycle cost-benefit analysis in management systems for bridges is shown. Benefits may be positive as well as negative from the user point of view. In the paper, negative benefits (user costs) are discussed in relation to the maintenance of concrete bridges. A limited number of excerpts from published reports are included that relate to the importance of estimating user costs when repairs of bridges are planned and when optimized strategies are developed.

  18. Introduction of the system of hazard analysis critical control point to ensure the safety of irradiated food

    International Nuclear Information System (INIS)

    Sajet, A.S.

    2014-01-01

    Hazard Analysis Critical Control Point (HACCP) is a preventive system for food safety. It identifies the safety hazards faced by food; the identified points are then controlled to ensure product safety. Because many pathogenic microorganisms and parasites present in food have caused cases of food poisoning and numerous food-borne diseases, current methods of food production cannot completely prevent food contamination or the growth of these pathogens, since they are part of the normal flora of the environment. Irradiation technology has helped to control food-borne diseases caused by the pathogenic microorganisms and parasites present in food. The application of a system based on risk analysis as a means of risk management in the food chain demonstrated the importance of food irradiation. (author)

  19. [Introduction of hazard analysis and critical control points (HACCP) principles at the flight catering food production plant].

    Science.gov (United States)

    Popova, A Yu; Trukhina, G M; Mikailova, O M

    The article considers the quality control and safety system implemented at one of the largest flight catering food production plants serving airline passengers and flight crews. The control system was based on the Hazard Analysis and Critical Control Points (HACCP) principles and on the hygienic and anti-epidemic measures developed. The identification of hazard factors at the stages of the technological process is considered. Monitoring data for 6 critical control points over a five-year period are presented. The quality control and safety system reduces the risk of food contamination during the acceptance, preparation and supply of in-flight meals. The efficiency of the implemented system was demonstrated, and further ways of harmonizing and implementing the HACCP principles at the plant are determined.

  20. Point process modeling and estimation: Advances in the analysis of dynamic neural spiking data

    Science.gov (United States)

    Deng, Xinyi

    2016-08-01

    A common interest of scientists in many fields is to understand the relationship between the dynamics of a physical system and the occurrences of discrete events within such a physical system. Seismologists study the connection between mechanical vibrations of the Earth and the occurrences of earthquakes so that future earthquakes can be better predicted. Astrophysicists study the association between the oscillating energy of celestial regions and the emission of photons to learn about the Universe's various objects and their interactions. Neuroscientists study the link between behavior and the millisecond-timescale spike patterns of neurons to understand higher brain functions. Such relationships can often be formulated within the framework of state-space models with point process observations. The basic idea is that the dynamics of the physical system are driven by the dynamics of some stochastic state variables, and the discrete events we observe in an interval are noisy observations with distributions determined by the state variables. This thesis proposes several new methodological developments that advance the framework of state-space models with point process observations at the intersection of statistics and neuroscience. In particular, we develop new methods 1) to characterize rhythmic spiking activity using history-dependent structure, 2) to model population spike activity using marked point process models, 3) to allow for real-time decision making, and 4) to take into account the need for dimensionality reduction for high-dimensional state and observation processes. We applied these methods to a novel problem of tracking rhythmic dynamics in the spiking of neurons in the subthalamic nucleus of Parkinson's patients with the goal of optimizing the placement of deep brain stimulation electrodes. We developed a decoding algorithm that can make decisions in real time (for example, to stimulate the neurons or not) based on various sources of information present in
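The history-dependent conditional-intensity idea described above can be sketched with a toy simulation. Everything here (baseline rate, refractory window, Bernoulli binning) is illustrative and not the thesis's actual models:

```python
import numpy as np

# Hypothetical sketch: simulate a history-dependent point process in
# discrete time bins. The conditional intensity drops to zero just after
# each spike (a refractory effect); all parameter values are illustrative.
rng = np.random.default_rng(0)

dt = 0.001                      # 1 ms bins
n_bins = 5000                   # 5 s of simulated activity
base_rate = 20.0                # baseline intensity (spikes/s)
refractory = 0.005              # 5 ms suppression window after each spike

spikes = np.zeros(n_bins, dtype=bool)
last_spike = -np.inf
for i in range(n_bins):
    t = i * dt
    # history dependence: intensity is zero inside the refractory window
    lam = 0.0 if (t - last_spike) < refractory else base_rate
    # Bernoulli approximation: P(spike in bin) ~= lam * dt
    if rng.random() < lam * dt:
        spikes[i] = True
        last_spike = t

print("spike count:", spikes.sum())
print("mean rate (Hz):", spikes.sum() / (n_bins * dt))
```

Because of the refractory term, the realized mean rate falls slightly below the baseline intensity, which is exactly the kind of history effect a point-process likelihood can capture.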

  1. Analysis and Simulation of Multi-target Echo Signals from a Phased Array Radar

    OpenAIRE

    Jia Zhen; Zhou Rui

    2017-01-01

    The construction of digital radar simulation systems has been a research hotspot of the radar field. This paper focuses on theoretical analysis and simulation of multi-target echo signals produced in a phased array radar system, and constructs an array antenna element and a signal generation environment. The antenna element can simulate planar arrays and optimize these arrays by adding window functions, and the signal environment can model and simulate radar transmission signals, rada...

  2. Methodological Approach to Company Cash Flows Target-Oriented Forecasting Based on Financial Position Analysis

    OpenAIRE

    Sergey Krylov

    2012-01-01

    The article treats a new methodological approach to company cash flow target-oriented forecasting based on financial position analysis. The approach is intended to be universal and presumes the application of the following techniques developed by the author: the financial ratio values correction technique and the correcting cash flows technique. The financial ratio values correction technique is used to analyze and forecast the company's financial position, while the correcting cash flows technique i...

  3. Targeting khat or targeting Somalis? A discourse analysis of project evaluations on khat abuse among Somali immigrants in Scandinavia

    Directory of Open Access Journals (Sweden)

    Nordgren Johan

    2015-09-01

    Full Text Available BACKGROUND – In Denmark, Norway and Sweden, the use of the psychoactive plant khat is widely seen as a social and health problem exclusively affecting the Somali immigrant population. Several projects by governmental and municipal bodies and agencies have been initiated to reduce khat use and abuse within this target population.

  4. Point Analysis in Java applied to histological images of the perforant pathway: A user’s account

    OpenAIRE

    Scorcioni, Ruggero; Wright, Susan N.; Card, J. Patrick; Ascoli, Giorgio A.; Barrionuevo, Germán

    2008-01-01

    The freeware Java tool PAJ, created to perform 3D point analysis, was tested in an independent laboratory setting. The input data consisted of images of the hippocampal perforant pathway from serial immunocytochemical localizations of the rat brain in multiple views at different resolutions. The low magnification set (2× objective) comprised the entire perforant pathway, while the high magnification set (100× objective) allowed the identification of individual fibers. A preliminary stereologi...

  5. An analysis of market development strategy of a point-of-sale solutions provider's market research database

    OpenAIRE

    Medina, Ahmed

    2007-01-01

    This paper is a strategic analysis of Vivonet Inc. and its restaurant performance-benchmarking tool ZATA. Vivonet is a Point of Sale (POS) systems provider for the hospitality and retail industries. Its ZATA product captures POS and other related information from restaurants and allows the restaurants to compare their performance with restaurants in their market segment. With ZATA, Vivonet has the opportunity to extend beyond the POS systems segment and compete in the market research i...

  6. Analysis of the dynamics of a nutating body. [numerical analysis of displacement, velocity, and acceleration of point on mechanical drives

    Science.gov (United States)

    Anderson, W. J.

    1974-01-01

    The equations for the displacement, velocity, and acceleration of a point in a nutating body are developed. These are used to derive equations for the inertial moment developed by a nutating body of arbitrary shape. Calculations made for a previously designed nutating plate transmission indicate that the device is severely speed-limited because of its very high-magnitude inertial moment.

  7. Analysis of residual stress state in sheet metal parts processed by single point incremental forming

    Science.gov (United States)

    Maaß, F.; Gies, S.; Dobecki, M.; Brömmelhoff, K.; Tekkaya, A. E.; Reimers, W.

    2018-05-01

    The mechanical properties of formed metal components are highly affected by the prevailing residual stress state. A selective induction of residual compressive stresses in the component can improve product properties such as the fatigue strength. By means of single point incremental forming (SPIF), the residual stress state can be influenced by adjusting the process parameters during the manufacturing process. To achieve a fundamental understanding of the residual stress formation caused by the SPIF process, a valid numerical process model is essential. Within the scope of this paper, the significance of kinematic hardening effects on the determined residual stress state is presented based on numerical simulations. The effect of the unclamping step after the manufacturing process is also analyzed. An average deviation of 18% between the residual stress amplitudes in the clamped and unclamped conditions reveals that the unclamping step needs to be considered to reach a high numerical prediction quality.

  8. An improved local radial point interpolation method for transient heat conduction analysis

    Science.gov (United States)

    Wang, Feng; Lin, Gao; Zheng, Bao-Jing; Hu, Zhi-Qiang

    2013-06-01

    The smoothing thin plate spline (STPS) interpolation using the penalty function method according to the optimization theory is presented to deal with transient heat conduction problems. The smooth conditions of the shape functions and derivatives can be satisfied so that the distortions hardly occur. Local weak forms are developed using the weighted residual method locally from the partial differential equations of the transient heat conduction. Here the Heaviside step function is used as the test function in each sub-domain to avoid the need for a domain integral. Essential boundary conditions can be implemented like the finite element method (FEM) as the shape functions possess the Kronecker delta property. The traditional two-point difference method is selected for the time discretization scheme. Three selected numerical examples are presented in this paper to demonstrate the availability and accuracy of the present approach compared with the traditional thin plate spline (TPS) radial basis functions.
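The "two-point difference method" mentioned for time discretization is a two-level time-stepping scheme. As a generic illustration only (finite differences standing in for the paper's meshless radial point interpolation, with arbitrary grid and diffusivity), here is an implicit two-level step, backward Euler, applied to the 1D transient heat equation:

```python
import numpy as np

# Illustrative sketch (not the paper's meshless method): a two-level
# ("two-point") time difference scheme -- backward Euler -- applied to the
# 1D transient heat equation u_t = alpha * u_xx with fixed ends at zero.
n = 50                          # interior grid points
alpha = 1.0                     # thermal diffusivity (illustrative)
dx = 1.0 / (n + 1)
dt = 1e-4
r = alpha * dt / dx**2

# initial temperature: a triangular "hat" profile peaking at x = 0.5
x = np.linspace(dx, 1 - dx, n)
u = np.where(x < 0.5, 2 * x, 2 * (1 - x))

# implicit system (I - r*L) u_new = u_old, with L the 1D Laplacian stencil
A = np.diag((1 + 2 * r) * np.ones(n)) \
  + np.diag(-r * np.ones(n - 1), 1) \
  + np.diag(-r * np.ones(n - 1), -1)

for _ in range(100):
    u = np.linalg.solve(A, u)   # advance one time level per solve

print("max temperature after 100 steps:", u.max())
```

The two-level structure means each new time level depends only on the previous one, so the transient solution is obtained by a sequence of linear solves.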

  9. An improved local radial point interpolation method for transient heat conduction analysis

    International Nuclear Information System (INIS)

    Wang Feng; Lin Gao; Hu Zhi-Qiang; Zheng Bao-Jing

    2013-01-01

    The smoothing thin plate spline (STPS) interpolation using the penalty function method according to the optimization theory is presented to deal with transient heat conduction problems. The smooth conditions of the shape functions and derivatives can be satisfied so that the distortions hardly occur. Local weak forms are developed using the weighted residual method locally from the partial differential equations of the transient heat conduction. Here the Heaviside step function is used as the test function in each sub-domain to avoid the need for a domain integral. Essential boundary conditions can be implemented like the finite element method (FEM) as the shape functions possess the Kronecker delta property. The traditional two-point difference method is selected for the time discretization scheme. Three selected numerical examples are presented in this paper to demonstrate the availability and accuracy of the present approach comparing with the traditional thin plate spline (TPS) radial basis functions

  10. RETRAN applications in pressurized thermal shock analysis of turkey point units 3 and 4

    International Nuclear Information System (INIS)

    Arpa, J.; Fatemi, A.S.; Mathavan, S.K.

    1985-01-01

    A methodology to assess the impact of overcooling transients on vessel wall integrity with respect to pressurized thermal shock conditions has been developed at Florida Power and Light Company for the Turkey Point Nuclear Units. Small break loss-of-coolant and small steamline break events have been simulated with the RETRAN code. Highly conservative assumptions, such as engineered safeguards with minimum temperature and maximum flow, have been made to maximize cooldown and thermal stress in the vessel wall. Temperatures, pressures, and flows obtained with RETRAN provide input for stress and fracture mechanics analyses that evaluate reactor vessel integrity. The results of the RETRAN analyses compare well with generic calculations performed by the Westinghouse Owners Group for a similar type of plant

  11. Analysis of stagnation point flow of an upper-convected Maxwell fluid

    Directory of Open Access Journals (Sweden)

    Joseph E. Paullet

    2017-12-01

    Full Text Available Several recent papers have investigated the two-dimensional stagnation point flow of an upper-convected Maxwell fluid by employing a similarity change of variable to reduce the governing PDEs to a nonlinear third order ODE boundary value problem (BVP. In these previous works, the BVP was studied numerically and several conjectures regarding the existence and behavior of the solutions were made. The purpose of this article is to mathematically verify these conjectures. We prove the existence of a solution to the BVP for all relevant values of the elasticity parameter. We also prove that this solution has monotonically increasing first derivative, thus verifying the conjecture that no ``overshoot'' of the boundary condition occurs. Uniqueness results are presented for a large range of parameter space and bounds on the skin friction coefficient are calculated.

  12. Analysis on signal properties due to concurrent leaks at two points in water supply pipelines

    International Nuclear Information System (INIS)

    Lee, Young Sup

    2015-01-01

    Intelligent leak detection is an essential component of an underground water supply pipeline network such as a smart water grid system. In such a network, numerous leak detection sensors, installed at regular distances, are needed to cover all of the pipelines in a given area. It is also necessary to determine the existence of any leak and estimate its location within a short time after it occurs. In this study, the leak signal properties and the feasibility of leak location detection were investigated when concurrent leaks occurred at two points in a pipeline. The straight distance between the two leak sensors in the 100A-sized cast-iron pipeline was 315.6 m, and their signals were measured with one leak and with two concurrent leaks. Each leak location was determined after analyzing the frequency properties and cross-correlation of the measured signals.
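The cross-correlation step described above can be sketched as follows. A leak emits broadband noise that reaches the two sensors with a time difference tau; the leak's distance from sensor 1 is then d1 = (D - c*tau)/2, with D the sensor spacing and c the wave speed in the pipe. The sensor spacing matches the study's 315.6 m, but the wave speed, sample rate, and signals are illustrative assumptions:

```python
import numpy as np

# Hypothetical sketch of two-sensor leak localization by cross-correlation.
# All values except the 315.6 m spacing are illustrative, not from the study.
rng = np.random.default_rng(1)

fs = 10_000.0                   # sample rate (Hz), assumed
D = 315.6                       # sensor spacing (m), as in the study
c = 1200.0                      # assumed wave speed in the pipe (m/s)
d1 = 100.0                      # "true" leak distance from sensor 1 (m)

tau_true = (D - 2 * d1) / c     # arrival-time difference (s)
lag_true = int(round(tau_true * fs))

noise = rng.standard_normal(4000)            # broadband leak noise source
s1 = noise.copy()                             # sensor 1 signal
s2 = np.roll(noise, lag_true)                 # sensor 2: delayed copy

# full cross-correlation; the peak lag estimates tau
xcorr = np.correlate(s2, s1, mode="full")
lag_est = np.argmax(xcorr) - (len(s1) - 1)
tau_est = lag_est / fs
d1_est = (D - c * tau_est) / 2

print(f"estimated leak distance from sensor 1: {d1_est:.1f} m")
```

With real pipe signals the correlation peak is broader and band-limited filtering is usually applied first, but the peak-lag geometry is the same.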

  13. Analysis on signal properties due to concurrent leaks at two points in water supply pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young Sup [Dept. of Embedded Systems Engineering, Incheon National University, Incheon (Korea, Republic of)

    2015-02-15

    Intelligent leak detection is an essential component of an underground water supply pipeline network such as a smart water grid system. In such a network, numerous leak detection sensors, installed at regular distances, are needed to cover all of the pipelines in a given area. It is also necessary to determine the existence of any leak and estimate its location within a short time after it occurs. In this study, the leak signal properties and the feasibility of leak location detection were investigated when concurrent leaks occurred at two points in a pipeline. The straight distance between the two leak sensors in the 100A-sized cast-iron pipeline was 315.6 m, and their signals were measured with one leak and with two concurrent leaks. Each leak location was determined after analyzing the frequency properties and cross-correlation of the measured signals.

  14. Feasibility study of point cloud data from test deposition holes for deformation analysis

    International Nuclear Information System (INIS)

    Carrea, D.; Jaboyedoff, M.; Derron, M.-H.

    2014-02-01

    The present document reports the observations and analyses made at the University of Lausanne (UNIL) on the point cloud datasets from the test deposition holes of the ONKALO facility (Olkiluoto, Finland). This study revealed that an artificial distortion due to the acquisition procedure affects part of the data (up to a 6 mm shift). This distortion occurs when the incidence angle gets too high, and recommendations are proposed to avoid it during future acquisitions. Another issue is the influence of the surface condition on range measurement, i.e. wet versus dry, or dark versus light colored surfaces. No obvious ground deformation was observed in the data provided for this study. However, because of the distortion mentioned previously, a rather large deformation amplitude would be required for detection in some parts of the holes with the present data. We think that slightly changing the scanning strategy in the field for future acquisitions should make it possible to detect sub-mm deformations. (orig.)

  15. Summary - COG: A new point-wise Monte Carlo code for burnup credit analysis

    International Nuclear Information System (INIS)

    Alesso, H.P.

    1989-01-01

    COG, a new point-wise Monte Carlo code being developed and tested at Lawrence Livermore National Laboratory (LLNL) for the Cray-1, solves the Boltzmann equation for the transport of neutrons, photons, and (in future versions) other particles. Techniques included in the code for modifying the random walk of particles make COG most suitable for solving deep-penetration (shielding) problems and a wide variety of criticality problems. COG is similar to a number of other computer codes used in the shielding community. Each code is a little different in its geometry input and its random-walk modification options. COG is a Monte Carlo code specifically designed for the CRAY (in 1986) to be as precise as the current state of physics knowledge. It has been extensively benchmarked and used as a shielding code at LLNL since 1986, and has recently been extended to accomplish criticality calculations. It will make an excellent tool for future shipping cask studies

  16. ANALYSIS OF FORECASTING METHODS FROM THE POINT OF VIEW OF EARLY WARNING CONCEPT IN PROJECT MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Florin POPESCU

    2017-12-01

    Full Text Available An early warning system (EWS) based on a reliable forecasting process has become a critical component of the management of large complex industrial projects in the globalized transnational environment. The purpose of this research is to critically analyze forecasting methods from the point of view of early warning, choosing those useful for the construction of an EWS. This research addresses complementary techniques using Bayesian networks, which address both uncertainties and causality in project planning and execution, with the goal of generating early warning signals for project managers. Even though Bayesian networks have been widely used in a range of decision-support applications, their application as early warning systems for project management is still new.
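The probabilistic reasoning behind such a warning signal can be illustrated with the smallest possible Bayesian network: one "project delayed" node and one observable indicator. All probabilities below are hypothetical:

```python
# Minimal illustration (hypothetical numbers) of the Bayesian updating
# behind an early warning signal: revise the probability of project delay
# after observing a warning indicator (e.g., a missed interim milestone).
p_delay = 0.20                       # prior P(delay)
p_warn_given_delay = 0.85            # P(indicator fires | delay)
p_warn_given_ok = 0.10               # false-alarm rate P(indicator | no delay)

# Bayes' rule: P(delay | warning)
p_warn = p_warn_given_delay * p_delay + p_warn_given_ok * (1 - p_delay)
posterior = p_warn_given_delay * p_delay / p_warn

print(f"P(delay | warning) = {posterior:.2f}")
```

A full EWS chains many such nodes so that several weak indicators can jointly raise the delay probability past an alert threshold.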

  17. Three-dimensional distribution of cortical synapses: a replicated point pattern-based analysis

    Science.gov (United States)

    Anton-Sanchez, Laura; Bielza, Concha; Merchán-Pérez, Angel; Rodríguez, José-Rodrigo; DeFelipe, Javier; Larrañaga, Pedro

    2014-01-01

    The biggest problem when analyzing the brain is that its synaptic connections are extremely complex. Generally, the billions of neurons making up the brain exchange information through two types of highly specialized structures: chemical synapses (the vast majority) and so-called gap junctions (a substrate of one class of electrical synapse). Here we are interested in exploring the three-dimensional spatial distribution of chemical synapses in the cerebral cortex. Recent research has shown that the three-dimensional spatial distribution of synapses in layer III of the neocortex can be modeled by a random sequential adsorption (RSA) point process, i.e., synapses are distributed in space almost randomly, with the only constraint that they cannot overlap. In this study we hypothesize that RSA processes can also explain the distribution of synapses in all cortical layers. We also investigate whether there are differences in both the synaptic density and spatial distribution of synapses between layers. Using combined focused ion beam milling and scanning electron microscopy (FIB/SEM), we obtained three-dimensional samples from the six layers of the rat somatosensory cortex and identified and reconstructed the synaptic junctions. A total volume of tissue of approximately 4500 μm³ and around 4000 synapses from three different animals were analyzed. Different samples, layers and/or animals were aggregated and compared using RSA replicated spatial point processes. The results showed no significant differences in the synaptic distribution across the different rats used in the study. We found that RSA processes described the spatial distribution of synapses in all samples of each layer. We also found that the synaptic distribution in layers II to VI conforms to a common underlying RSA process with different densities per layer. Interestingly, the results showed that synapses in layer I had a slightly different spatial distribution from the other layers. PMID:25206325
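A random sequential adsorption process of the kind used above is simple to simulate: candidate points are placed uniformly at random and accepted only if they do not overlap an already-placed point. The box size, exclusion radius, and attempt count here are illustrative, not the tissue values from the study:

```python
import numpy as np

# Sketch of a random sequential adsorption (RSA) process in a 3D box:
# a candidate point is accepted only if it lies at least r_excl away from
# every previously placed point. All dimensions are illustrative.
rng = np.random.default_rng(42)

box = 10.0          # cube side (arbitrary units)
r_excl = 1.0        # minimum allowed distance between "synapses"
n_attempts = 200    # placement attempts
points = []

for _ in range(n_attempts):
    cand = rng.uniform(0, box, size=3)
    if all(np.linalg.norm(cand - p) >= r_excl for p in points):
        points.append(cand)

points = np.array(points)
print("placed", len(points), "non-overlapping points out of", n_attempts, "attempts")
```

The resulting pattern is "almost random": locally it looks Poisson, but the hard-core exclusion suppresses close pairs, which is exactly the signature tested for in the tissue samples.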

  18. Nitrate transport and supply limitations quantified using high-frequency stream monitoring and turning point analysis

    Science.gov (United States)

    Jones, Christopher S.; Wang, Bo; Schilling, Keith E.; Chan, Kung-sik

    2017-06-01

    Agricultural landscapes often leak inorganic nitrogen to the stream network, usually in the form of nitrate-nitrite (NOx-N), degrading downstream water quality on both the local and regional scales. While the spatial distribution of nitrate sources has been delineated in many watersheds, less is known about the complicated temporal dynamics that drive stream NOx-N because traditional methods of stream grab sampling are often conducted at a low frequency. Deployment of accurate real-time, continuous measurement devices that have been developed in recent years enables high-frequency sampling that provides detailed information on the concentration-discharge relation and the timing of NOx-N delivery to streams. We aggregated 15-min interval NOx-N and discharge data over a nine-year period into daily averages and then used robust statistical methods to identify how the discharge regime within an artificially-drained agricultural watershed reflected catchment hydrology and NOx-N delivery pathways. We then quantified how transport and supply limitations varied from year-to-year and how dependence of these limitations varied with climate, especially drought. Our results show NOx-N concentrations increased linearly with discharge up to an average "turning point" of 1.42 mm of area-normalized discharge, after which concentrations decline with increasing discharge. We estimate transport and supply limitations to govern 57 and 43 percent, respectively, of the NOx-N flux over the nine-year period. Drought effects on the NOx-N flux linger for multiple years and this is reflected in a greater tendency toward supply limitations in the three years following drought. How the turning point varies with climate may aid in prediction of NOx-N loading in future climate regimes.
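The concentration-discharge "turning point" can be illustrated with a generic broken-stick fit: two line segments are fitted on either side of a candidate breakpoint, and the breakpoint minimizing the total squared error is selected. The data below are synthetic, built around the study's reported 1.42 mm turning point; the fitting procedure is a sketch, not the authors' robust statistical method:

```python
import numpy as np

# Illustrative broken-stick fit: find the discharge "turning point" at
# which the concentration-discharge slope changes sign. Synthetic data.
rng = np.random.default_rng(7)

q = rng.uniform(0.1, 4.0, 300)                      # daily discharge (mm)
true_bp = 1.42                                      # breakpoint, per the study
conc = np.where(q < true_bp,
                5 + 4 * q,                          # rising limb
                5 + 4 * true_bp - 2 * (q - true_bp))  # falling limb
conc += rng.normal(0, 0.3, q.size)                  # observation noise

def sse_for_breakpoint(bp):
    """Fit separate lines below/above bp; return total squared error."""
    err = 0.0
    for m in (q < bp, q >= bp):
        A = np.column_stack([q[m], np.ones(m.sum())])
        _, res, *_ = np.linalg.lstsq(A, conc[m], rcond=None)
        err += float(res[0]) if res.size else 0.0
    return err

candidates = np.linspace(0.5, 3.5, 61)
best_bp = min(candidates, key=sse_for_breakpoint)
print(f"estimated turning point: {best_bp:.2f} mm")
```

Discharge days below the recovered breakpoint correspond to transport-limited export, days above it to supply-limited export, mirroring the 57/43 percent split reported above.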

  19. Risk analysis of hematopoietic stem cell transplant process: failure mode, effect, and criticality analysis and hazard analysis critical control point methods integration based on guidelines to good manufacturing practice for medicinal product ANNEX 20 (February 2008).

    Science.gov (United States)

    Gianassi, S; Bisin, S; Bindi, B; Spitaleri, I; Bambi, F

    2010-01-01

    The collection and handling of hematopoietic stem cells (HSCs) must meet high quality requirements. An integrated Quality Risk Management can help to identify and contain potential risks related to HSC production. Risk analysis techniques allow one to "weigh" identified hazards, considering the seriousness of their effects, frequency, and detectability, seeking to prevent the most harmful hazards. The Hazard Analysis Critical Control Point technique, recognized as the most appropriate to identify risks associated with physical, chemical, and biological hazards for cellular products, consists of classifying finished product specifications and limits of acceptability, identifying all off-specifications, defining activities that can cause them, and finally establishing both a monitoring system for each Critical Control Point and corrective actions for deviations. The severity of possible effects on patients, as well as the occurrence and detectability of critical parameters, are measured on quantitative scales (Risk Priority Number [RPN]). Risk analysis was performed with this technique on the HPC manipulation process performed at our blood center. The data analysis showed that the hazards with higher RPN values and greater impact on the process are loss of dose and loss of tracking; the technical skills of operators and the manual transcription of data were the most critical parameters. Problems related to operator skills are handled by defining targeted training programs, while other critical parameters can be mitigated with the use of continuous control systems. The blood center management software was completed by a labeling system with forms designed to be in compliance with standards in force and by starting implementation of a cryopreservation management module. Copyright 2010 Elsevier Inc. All rights reserved.
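The Risk Priority Number used above is simply the product of the severity, occurrence, and detectability scores for each failure mode. A minimal sketch with entirely hypothetical failure modes and scores (not the blood center's actual data):

```python
# FMECA-style sketch: RPN = severity x occurrence x detectability,
# each scored on a 1-10 scale. All entries below are hypothetical.
failure_modes = {
    # name: (severity, occurrence, detectability)
    "loss of dose":               (9, 3, 6),
    "loss of tracking":           (8, 3, 7),
    "manual transcription error": (6, 5, 5),
    "mislabeled bag":             (9, 2, 4),
}

rpn = {name: s * o * d for name, (s, o, d) in failure_modes.items()}
for name, score in sorted(rpn.items(), key=lambda kv: -kv[1]):
    print(f"{name:28s} RPN = {score}")
```

Ranking by RPN is what lets a quality team direct mitigation (training, continuous monitoring) at the hazards with the highest combined risk rather than at the most severe ones alone.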

  20. FEA Analysis of AP-0 Target Hall Collection Lens (Current Design)

    International Nuclear Information System (INIS)

    Hurh, P.G.; Tang, Z.

    2001-01-01

    The AP-0 Target Hall Collection Lens is a pulsed device which focuses anti-protons just downstream of the Target. Since the angles at which the anti-protons depart the Target can be quite large, a very high focusing strength is required to maximize anti-proton capture into the downstream Debuncher Ring. The current Collection Lens was designed to operate with a focusing gradient of 1,000 T/m. However, multiple failures of early devices resulted in lowering the normal operating gradient to about 750 T/m. At this gradient, the Lens design fares much better, lasting several million pulses, but ultimately still fails. A Finite Element Analysis (FEA) has been performed on this Collection Lens design to help determine the cause and/or nature of the failures. The Collection Lens magnetic field is created by passing a high current through a central conductor cylinder. A uniform current distribution through the cylinder creates a tangential (azimuthal) magnetic field that varies linearly from zero at the center of the cylinder to a maximum at its outer surface. Anti-proton particles passing through the cylinder (along the longitudinal direction) see an inward focusing kick back toward the center of the cylinder, proportional to the magnetic field strength. For the current Lens design, a gradient of 1,000 T/m requires a current of about 580,000 amps. Since the DC power and cooling requirements would be prohibitive, the Lens is operated in a pulsed mode. Each pulse is a half sine wave in shape with a duration of about 350 microseconds. Because of the skin effect, the most uniform current density actually occurs about two-thirds of the way through the pulse. This means that the maximum current of the pulse is actually higher than that required in the DC case (about 670,000 amps). Since the beam must pass through the central conductor cylinder, it must be made of a conducting material that is also very 'transparent' to the beam. For the
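The linear field profile described above follows from Ampere's law: inside a cylinder of radius R carrying uniform current I, the azimuthal field is B(r) = mu0*I*r/(2*pi*R^2), so the focusing gradient is g = mu0*I/(2*pi*R^2). The conductor radius is not stated in the text, so as a back-of-envelope check (DC case only, ignoring the skin effect) we can solve for the radius implied by the quoted 1,000 T/m at roughly 580,000 A:

```python
import math

# DC back-of-envelope check for a uniform-current-density cylinder:
#   B(r) = mu0 * I * r / (2 * pi * R**2)  =>  gradient g = mu0 * I / (2 * pi * R**2)
# The conductor radius R is NOT given in the abstract; we derive the value
# implied by the quoted numbers.
mu0 = 4 * math.pi * 1e-7        # vacuum permeability (T*m/A)
I = 580_000.0                   # current (A), from the text
g = 1_000.0                     # focusing gradient (T/m), from the text

R = math.sqrt(mu0 * I / (2 * math.pi * g))
print(f"implied conductor radius: {R * 100:.2f} cm")
print(f"surface field B(R): {g * R:.2f} T")
```

The implied radius of roughly a centimeter is consistent with a compact lens conductor; the pulsed skin-effect case quoted in the text needs about 15% more peak current for the same gradient.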

  1. Ensemble Sensitivity Analysis of a Severe Downslope Windstorm in Complex Terrain: Implications for Forecast Predictability Scales and Targeted Observing Networks

    Science.gov (United States)

    2013-09-01

    observations, linear regression finds the straight line that explains the linear relationship of the sample. This line is given by the equation y = mx + b...
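The regression step behind ensemble sensitivity analysis fits exactly this y = mx + b relationship across ensemble members: the slope m measures how strongly the forecast metric responds to an initial-condition variable. A minimal sketch with synthetic ensemble data (all values illustrative):

```python
import numpy as np

# Sketch of the regression step in ensemble sensitivity analysis: across
# ensemble members, regress a forecast metric (e.g., downslope wind speed)
# onto an initial-condition variable. Data here are synthetic.
rng = np.random.default_rng(3)

n_members = 50
x = rng.normal(0.0, 1.0, n_members)                  # initial-condition perturbation
y = 2.5 * x + 1.0 + rng.normal(0, 0.2, n_members)    # forecast response

# least-squares fit of y = m*x + b
m, b = np.polyfit(x, y, 1)
print(f"sensitivity (slope m): {m:.2f}, intercept b: {b:.2f}")
```

Fields of such slopes, computed at every grid point, are what identify the upstream regions where targeted observations would most reduce forecast error.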

  2. An analysis of possible off target effects following CAS9/CRISPR targeted deletions of neuropeptide gene enhancers from the mouse genome.

    Science.gov (United States)

    Hay, Elizabeth Anne; Khalaf, Abdulla Razak; Marini, Pietro; Brown, Andrew; Heath, Karyn; Sheppard, Darrin; MacKenzie, Alasdair

    2017-08-01

    We have successfully used comparative genomics to identify putative regulatory elements within the human genome that contribute to the tissue specific expression of neuropeptides such as galanin and receptors such as CB1. However, a previous inability to rapidly delete these elements from the mouse genome has prevented optimal assessment of their function in-vivo. This has been solved using CAS9/CRISPR genome editing technology which uses a bacterial endonuclease called CAS9 that, in combination with specifically designed guide RNA (gRNA) molecules, cuts specific regions of the mouse genome. However, reports of "off target" effects, whereby the CAS9 endonuclease is able to cut sites other than those targeted, limit the appeal of this technology. We used cytoplasmic microinjection of gRNA and CAS9 mRNA into 1-cell mouse embryos to rapidly generate enhancer knockout mouse lines. The current study describes our analysis of the genomes of these enhancer knockout lines to detect possible off-target effects. Bioinformatic analysis was used to identify the most likely putative off-target sites and to design PCR primers that would amplify these sequences from genomic DNA of founder enhancer deletion mouse lines. Amplified DNA was then sequenced and blasted against the mouse genome sequence to detect off-target effects. Using this approach we were unable to detect any evidence of off-target effects in the genomes of three founder lines using any of the four gRNAs used in the analysis. This study suggests that the problem of off-target effects in transgenic mice has been exaggerated and that CAS9/CRISPR represents a highly effective and accurate method of deleting putative neuropeptide gene enhancer sequences from the mouse genome. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
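The bioinformatic nomination of putative off-target sites boils down to scanning the genome for near-matches to the guide sequence within some mismatch tolerance. A toy sketch of that idea (a naive mismatch scan on made-up sequences, not the authors' pipeline or real mouse genome data, and ignoring PAM requirements):

```python
# Hypothetical sketch of off-target site nomination: scan a sequence for
# windows within k mismatches of a 20-nt guide protospacer. Toy data only;
# real pipelines also require an adjacent PAM and use indexed search.
def find_off_targets(genome: str, guide: str, max_mismatches: int = 3):
    """Return (position, mismatch_count) for windows within the tolerance."""
    hits = []
    g = len(guide)
    for i in range(len(genome) - g + 1):
        window = genome[i:i + g]
        mm = sum(1 for a, b in zip(window, guide) if a != b)
        if mm <= max_mismatches:
            hits.append((i, mm))
    return hits

guide = "GACGTTACCGGATTACGCTA"
# toy "genome": an exact copy of the target plus a 1-mismatch decoy site
genome = "TTT" + guide + "AAAA" + "GACGTTACCGGATTACGCAA" + "GGG"
print(find_off_targets(genome, guide, max_mismatches=3))
```

Sites returned with low mismatch counts are the candidates for which PCR primers would then be designed and the amplicons sequenced, as described above.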

  3. Sedentary behaviour profiling of office workers: a sensitivity analysis of sedentary cut-points

    NARCIS (Netherlands)

    Boerema, Simone Theresa; Essink, Gerard B.; Tönis, Thijs; van Velsen, Lex Stefan; Hermens, Hermanus J.

    2016-01-01

    Measuring sedentary behaviour and physical activity with wearable sensors provides detailed information on activity patterns and can serve health interventions. At the basis of activity analysis stands the ability to distinguish sedentary from active time. As there is no consensus regarding the

  4. Points of convergence between functional and formal approaches to syntactic analysis

    DEFF Research Database (Denmark)

    Bjerre, Tavs; Engels, Eva; Jørgensen, Henrik

    2008-01-01

    respectively: The functional approach is represented by Paul Diderichsen's (1936, 1941, 1946, 1964) sætningsskema, ‘sentence model', and the formal approach is represented by analysis whose main features are common to the principles and parameters framework (Chomsky 1986) and the minimalist programme (Chomsky...

  5. Fed-state gastric media and drug analysis techniques: Current status and points to consider.

    Science.gov (United States)

    Baxevanis, Fotios; Kuiper, Jesse; Fotaki, Nikoletta

    2016-10-01

    Gastric fed-state conditions can have a significant effect on drug dissolution and absorption. In vitro dissolution tests with simple aqueous media cannot usually predict a drug's in vivo response, as several factors such as the meal content, gastric emptying and possible interactions between food and drug formulations can affect the drug's pharmacokinetics. A good understanding of the effect of the in vivo fed gastric conditions on the drug is essential for the development of biorelevant dissolution media simulating the gastric environment after administration of the standard high-fat meal proposed by the FDA and the EMA in bioavailability/bioequivalence (BA/BE) studies. The analysis of drugs in fed-state media can be quite challenging, as most analytical protocols currently employed are time-consuming and labour-intensive. In this review, an overview of the in vivo gastric conditions and the biorelevant media used for their in vitro simulation is given, together with an analysis of the physicochemical properties of the drugs and formulations related to food effects. In terms of drug analysis, the protocols currently used for fed-state media sample treatment and analysis are discussed, along with the analytical challenges and the need for more efficient and time-saving techniques for a broad spectrum of compounds. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Analysis and Segmentation of Face Images using Point Annotations and Linear Subspace Techniques

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille

    2002-01-01

    This report provides an analysis of 37 annotated frontal face images. All results presented have been obtained using our freely available Active Appearance Model (AAM) implementation. To ensure the reproducibility of the presented experiments, the data set has also been made available. As such...

  7. A new integrated dual time-point amyloid PET/MRI data analysis method

    Energy Technology Data Exchange (ETDEWEB)

    Cecchin, Diego; Zucchetta, Pietro; Turco, Paolo; Bui, Franco [University Hospital of Padua, Nuclear Medicine Unit, Department of Medicine - DIMED, Padua (Italy); Barthel, Henryk; Tiepolt, Solveig; Sabri, Osama [Leipzig University, Department of Nuclear Medicine, Leipzig (Germany); Poggiali, Davide; Cagnin, Annachiara; Gallo, Paolo [University Hospital of Padua, Neurology, Department of Neurosciences (DNS), Padua (Italy); Frigo, Anna Chiara [University Hospital of Padua, Biostatistics, Epidemiology and Public Health Unit, Department of Cardiac, Thoracic and Vascular Sciences, Padua (Italy)

    2017-11-15

    In the initial evaluation of patients with suspected dementia and Alzheimer's disease, there is no consensus on how to perform semiquantification of amyloid in such a way that it: (1) facilitates visual qualitative interpretation, (2) takes the kinetic behaviour of the tracer into consideration particularly with regard to at least partially correcting for blood flow dependence, (3) analyses the amyloid load based on accurate parcellation of cortical and subcortical areas, (4) includes partial volume effect correction (PVEC), (5) includes MRI-derived topographical indexes, (6) enables application to PET/MRI images and PET/CT images with separately acquired MR images, and (7) allows automation. A method with all of these characteristics was retrospectively tested in 86 subjects who underwent amyloid (¹⁸F-florbetaben) PET/MRI in a clinical setting (using images acquired 90-110 min after injection, 53 were classified visually as amyloid-negative and 33 as amyloid-positive). Early images after tracer administration were acquired between 0 and 10 min after injection, and later images were acquired between 90 and 110 min after injection. PVEC of the PET data was carried out using the geometric transfer matrix method. Parametric images and some regional output parameters, including two innovative "dual time-point" indexes, were obtained. Subjects classified visually as amyloid-positive showed sparse tracer uptake in the primary sensory, motor and visual areas in accordance with the isocortical stage of the topographic distribution of amyloid plaques (Braak stages V/VI). In patients classified visually as amyloid-negative, the method revealed detectable levels of tracer uptake in the basal portions of the frontal and temporal lobes, areas that are known to be sites of early deposition of amyloid plaques and that probably represented early accumulation (Braak stage A) typical of normal ageing. There was a strong correlation between

  8. A new integrated dual time-point amyloid PET/MRI data analysis method

    International Nuclear Information System (INIS)

    Cecchin, Diego; Zucchetta, Pietro; Turco, Paolo; Bui, Franco; Barthel, Henryk; Tiepolt, Solveig; Sabri, Osama; Poggiali, Davide; Cagnin, Annachiara; Gallo, Paolo; Frigo, Anna Chiara

    2017-01-01

    In the initial evaluation of patients with suspected dementia and Alzheimer's disease, there is no consensus on how to perform semiquantification of amyloid in such a way that it: (1) facilitates visual qualitative interpretation, (2) takes the kinetic behaviour of the tracer into consideration particularly with regard to at least partially correcting for blood flow dependence, (3) analyses the amyloid load based on accurate parcellation of cortical and subcortical areas, (4) includes partial volume effect correction (PVEC), (5) includes MRI-derived topographical indexes, (6) enables application to PET/MRI images and PET/CT images with separately acquired MR images, and (7) allows automation. A method with all of these characteristics was retrospectively tested in 86 subjects who underwent amyloid (¹⁸F-florbetaben) PET/MRI in a clinical setting (using images acquired 90-110 min after injection, 53 were classified visually as amyloid-negative and 33 as amyloid-positive). Early images after tracer administration were acquired between 0 and 10 min after injection, and later images were acquired between 90 and 110 min after injection. PVEC of the PET data was carried out using the geometric transfer matrix method. Parametric images and some regional output parameters, including two innovative "dual time-point" indexes, were obtained. Subjects classified visually as amyloid-positive showed sparse tracer uptake in the primary sensory, motor and visual areas in accordance with the isocortical stage of the topographic distribution of amyloid plaques (Braak stages V/VI). In patients classified visually as amyloid-negative, the method revealed detectable levels of tracer uptake in the basal portions of the frontal and temporal lobes, areas that are known to be sites of early deposition of amyloid plaques and that probably represented early accumulation (Braak stage A) typical of normal ageing. There was a strong correlation between age

  9. Immune checkpoint inhibitors and targeted therapies for metastatic melanoma: A network meta-analysis.

    Science.gov (United States)

    Pasquali, Sandro; Chiarion-Sileni, Vanna; Rossi, Carlo Riccardo; Mocellin, Simone

    2017-03-01

    Immune checkpoint inhibitors and targeted therapies, two new classes of drugs for the treatment of metastatic melanoma, have not been compared in randomized controlled trials (RCT). We quantitatively summarized the evidence and compared immune and targeted therapies in terms of both efficacy and toxicity. A comprehensive search for RCTs of immune checkpoint inhibitors and targeted therapies was conducted to August 2016. Using a network meta-analysis approach, treatments were compared with each other and ranked based on their effectiveness (as measured by the impact on progression-free survival [PFS]) and acceptability (the inverse of high-grade toxicity). Twelve RCTs enrolling 6207 patients were included. Network meta-analysis generated 15 comparisons. Combined BRAF and MEK inhibitors were associated with longer PFS as compared to anti-CTLA4 (HR: 0.22; 95% confidence interval [CI]: 0.12-0.41) and anti-PD1 antibodies alone (HR: 0.38; CI: 0.20-0.72). However, anti-PD1 monoclonal antibodies were less toxic than anti-CTLA4 monoclonal antibodies (RR: 0.65; CI: 0.40-0.78), and their combination significantly increased toxicity compared to either single-agent anti-CTLA4 (RR: 2.06; CI: 1.45-2.93) or anti-PD1 monoclonal antibodies (RR: 3.67; CI: 2.27-5.96). Consistently, ranking analysis suggested that the combination of targeted therapies is the most effective strategy, whereas single-agent anti-PD1 antibodies have the best acceptability. The GRADE level of evidence quality for these findings was moderate to low. The simultaneous inhibition of BRAF and MEK appears the most effective treatment for melanomas harboring the BRAF V600 mutation, although anti-PD1 antibodies appear to be less toxic. Further research is needed to increase the quality of evidence. Copyright © 2017 Elsevier Ltd. All rights reserved.
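
    The pairwise estimates in a network meta-analysis generalize the adjusted indirect comparison (Bucher) method, which can be sketched as follows for two treatments that were each trialled against a common comparator. The hazard ratios and intervals below are hypothetical placeholders, not figures from this review.

```python
import math

def indirect_hr(hr_ac, ci_ac, hr_bc, ci_bc, z=1.96):
    """Adjusted indirect comparison (Bucher method): estimate the
    hazard ratio of A vs B from trials of A vs C and B vs C.
    CIs are (lower, upper) 95% intervals on the HR scale."""
    log_hr = math.log(hr_ac) - math.log(hr_bc)
    # Standard errors recovered from the CI widths on the log scale
    se_ac = (math.log(ci_ac[1]) - math.log(ci_ac[0])) / (2 * z)
    se_bc = (math.log(ci_bc[1]) - math.log(ci_bc[0])) / (2 * z)
    se = math.sqrt(se_ac ** 2 + se_bc ** 2)
    return (math.exp(log_hr),
            math.exp(log_hr - z * se),
            math.exp(log_hr + z * se))

# Hypothetical inputs: PFS hazard ratios of regimens A and B, each
# against the same comparator C (not data from this meta-analysis)
hr, lo, hi = indirect_hr(0.55, (0.45, 0.67), 0.90, (0.75, 1.08))
print(f"indirect HR A vs B: {hr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

    A full network meta-analysis pools many such loops simultaneously and also yields the treatment rankings mentioned in the abstract.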

  10. Critical analysis of the potential for therapeutic targeting of mammalian target of rapamycin (mTOR) in gastric cancer

    Directory of Open Access Journals (Sweden)

    Inokuchi M

    2014-04-01

    Mikito Inokuchi,1 Keiji Kato,1 Kazuyuki Kojima,2 Kenichi Sugihara1 (1Department of Surgical Oncology, 2Department of Minimally Invasive Surgery, Tokyo Medical and Dental University, Tokyo, Japan) Abstract: Multidisciplinary treatment including chemotherapy has become the global standard of care for patients with metastatic gastric cancer (mGC); nonetheless, survival remains poor. Although many molecular-targeted therapies have been developed for various cancers, only anti-HER2 treatment has produced promising results in patients with mGC. Mammalian target of rapamycin (mTOR) plays a key role in cell proliferation, antiapoptosis, and metastasis in signaling pathways downstream of the tyrosine kinase receptor, and its activation has been demonstrated in gastric cancer (GC) cells. This review discusses the clinical relevance of mTOR in GC and examines its potential as a therapeutic target in patients with mGC. Preclinical studies in animal models suggest that suppression of the mTOR pathway inhibits the proliferation of GC cells and delays tumor progression. The mTOR inhibitor everolimus has been evaluated as second- or third-line treatment in clinical trials. Adverse events were well tolerated, although the effectiveness of everolimus alone was limited. Everolimus is now being evaluated in combination with chemotherapy in Phase III clinical studies in this subgroup of patients. Two Phase III studies include exploratory biomarker research designed to evaluate the predictive value of the expression or mutation of molecules related to the Akt/mTOR signaling pathway. These biomarker studies may lead to the realization of targeted therapy for selected patients with mGC in the future. Keywords: gastric cancer, mTOR, everolimus

  11. Selective analysis of power plant operation on the Hudson River with emphasis on the Bowline Point Generating Station. Volume 1

    International Nuclear Information System (INIS)

    Barnthouse, L.W.; Cannon, J.B.; Christensen, S.G.

    1977-07-01

    A comprehensive study of the effects of power plant operation on the Hudson River was conducted. The study included thermal, biological, and air quality effects of existing and planned electrical generating stations. This section on thermal impacts presents a comprehensive mathematical modeling and computer simulation study of the effects of heat rejection from the plants. The overall study consisted of three major parts: near-field analysis; far-field analysis; and zone-matched near-field/far-field analysis. Near-field analyses were completed for Roseton, Danskammer, and Bowline Point Generating Stations, and near-field dilution ratios range from a low of about 2 for Bowline Point and 3 for Roseton to a maximum of 6 for both plants. The far-field analysis included a critical review of existing studies and a parametric review of operating plants. The maximum thermal load case, based on hypothetical 1974 river conditions, gives the daily maximum cross-section-averaged and 2-mile-segment-averaged water temperatures as 83.80°F in the vicinity of the Indian Point Station and 83.25°F in the vicinity of the Bowline Station. This maximum case will be significantly modified if cooling towers are used at certain units. A full analysis and discussion of these cases is presented. A study of the Hudson River striped bass population is divided into the following eight subsections: distribution of striped bass eggs, larvae, and juveniles in the Hudson River; entrainment mortality factor; intake factor; impingement; effects of discharges; compensation; model estimates of percent reduction; and Hudson River striped bass stock

  12. In-silico Metabolome Target Analysis Towards PanC-based Antimycobacterial Agent Discovery.

    Science.gov (United States)

    Khoshkholgh-Sima, Baharak; Sardari, Soroush; Izadi Mobarakeh, Jalal; Khavari-Nejad, Ramezan Ali

    2015-01-01

    Mycobacterium tuberculosis, the main cause of tuberculosis (TB), remains a global health crisis, especially in developing countries. Tuberculosis treatment is a laborious and lengthy process with a high risk of noncompliance, cytotoxic adverse events and drug resistance in patients. Recently, there has been an alarming rise of drug resistance in TB. In this regard, there is an unmet need to develop novel antitubercular medicines that target new or more effective biochemical pathways to prevent drug-resistant Mycobacterium. Integrated study of metabolic pathways through an in-silico approach played a key role in the antimycobacterial design process in this study. Our results suggest that pantothenate synthetase (PanC), anthranilate phosphoribosyl transferase (TrpD) and 3-isopropylmalate dehydratase (LeuD) might be appropriate drug targets. In the next step, in-silico ligand analysis was used for a more detailed study of the chemical tractability of these targets. This helped identify pantothenate synthetase (PanC, Rv3602c) as the best target for the antimycobacterial design procedure. Virtual library screening on the best ligand of PanC was then performed for inhibitory ligand design. In the end, five chemical intermediates showed significant inhibition of Mycobacterium bovis with good selectivity indices (SI) ≥ 10 according to the criteria of the US Tuberculosis Antimicrobial Acquisition & Coordinating Facility for antimycobacterial screening programs.
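
    The SI ≥ 10 criterion mentioned above amounts to a simple filter over cytotoxicity and potency measurements. A minimal sketch, with hypothetical compound data rather than the study's screening results:

```python
def selectivity_index(cc50, mic):
    """SI = cytotoxic concentration (CC50) / antimycobacterial MIC;
    higher values mean activity well below toxic concentrations."""
    return cc50 / mic

# Hypothetical screening results: compound -> (CC50, MIC) in ug/mL
screen = {
    "cmpd_A": (128.0, 4.0),
    "cmpd_B": (64.0, 16.0),
    "cmpd_C": (200.0, 12.5),
}

# Keep compounds meeting the SI >= 10 antimycobacterial criterion
hits = {c for c, (cc50, mic) in screen.items()
        if selectivity_index(cc50, mic) >= 10}
print(sorted(hits))
```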

  13. Identification of Cell Surface Targets through Meta-analysis of Microarray Data

    Directory of Open Access Journals (Sweden)

    Henry Haeberle

    2012-07-01

    High-resolution image guidance for resection of residual tumor cells would enable more precise and complete excision for more effective treatment of cancers, such as medulloblastoma, the most common pediatric brain cancer. Numerous studies have shown that brain tumor patient outcomes correlate with the precision of resection. To enable guided resection with molecular specificity and cellular resolution, molecular probes that effectively delineate brain tumor boundaries are essential. Therefore, we developed a bioinformatics approach to analyze microarray datasets for the identification of transcripts that encode candidate cell surface biomarkers highly enriched in medulloblastoma. The results identified 380 genes with greater than a two-fold increase in expression in medulloblastoma compared with normal cerebellum. To enrich for targets accessible to extracellular molecular probes, we further refined this list by filtering it with gene ontology to identify genes with protein localization on, or within, the plasma membrane. To validate this meta-analysis, the top 10 candidates were evaluated with immunohistochemistry. We identified two targets, fibrillin 2 and EphA3, which specifically stain medulloblastoma. These results demonstrate a novel bioinformatics approach that successfully identified cell surface and extracellular candidate markers enriched in medulloblastoma versus adjacent cerebellum. These two proteins are high-value targets for the development of tumor-specific probes in medulloblastoma. This bioinformatics method has broad utility for the identification of accessible molecular targets in a variety of cancers and will enable probe development for guided resection.
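
    The two-step selection described (fold-change enrichment, then a gene-ontology membrane filter) can be sketched as a set operation. The expression values and GO annotation set below are hypothetical, not the paper's microarray data:

```python
def candidate_surface_markers(tumor, normal, go_membrane, min_fold=2.0):
    """Select transcripts >= min_fold enriched in tumor vs normal
    tissue, then keep those annotated to the plasma membrane."""
    enriched = {g for g in tumor
                if g in normal and normal[g] > 0
                and tumor[g] / normal[g] >= min_fold}
    return enriched & go_membrane

# Hypothetical mean expression values (arbitrary units)
tumor = {"FBN2": 40.0, "EPHA3": 25.0, "GAPDH": 100.0, "NEUROD1": 90.0}
normal = {"FBN2": 10.0, "EPHA3": 5.0, "GAPDH": 95.0, "NEUROD1": 20.0}
# Hypothetical GO "plasma membrane" annotation set
go_membrane = {"FBN2", "EPHA3", "GAPDH"}

print(sorted(candidate_surface_markers(tumor, normal, go_membrane)))
```

    Here GAPDH is filtered out for lacking enrichment and NEUROD1 for lacking the membrane annotation, leaving the two illustrative surface candidates.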

  14. Analysis on Target Detection and Classification in LTE Based Passive Forward Scattering Radar

    Directory of Open Access Journals (Sweden)

    Raja Syamsul Azmir Raja Abdullah

    2016-09-01

    The passive bistatic radar (PBR) system can utilize an illuminator of opportunity to enhance radar capability. Incorporating the forward scattering technique into a specific mode of PBR can provide an improvement in target detection and classification; the resulting system is known as passive Forward Scattering Radar (FSR). The passive FSR system can exploit the peculiar advantage of the enhancement in forward scatter radar cross section (FSRCS) for target detection. Thus, the aim of this paper is to show the feasibility of passive FSR for moving target detection and classification through experimental analysis and results. The signal source comes from the latest technology of 4G Long-Term Evolution (LTE) base stations. A detailed explanation of the passive FSR receiver circuit, the detection scheme and the classification algorithm is given. In addition, the proposed passive FSR circuit employs a self-mixing technique at the receiver; hence a synchronization signal from the transmitter is not required. The experimental results confirm the passive FSR system’s capability for ground target detection and classification. Furthermore, this paper illustrates the first classification result in a passive FSR system. The great potential of the passive FSR system opens a new research area in passive radar that can be used for diverse remote monitoring applications.

  15. Analysis on Target Detection and Classification in LTE Based Passive Forward Scattering Radar.

    Science.gov (United States)

    Raja Abdullah, Raja Syamsul Azmir; Abdul Aziz, Noor Hafizah; Abdul Rashid, Nur Emileen; Ahmad Salah, Asem; Hashim, Fazirulhisyam

    2016-09-29

    The passive bistatic radar (PBR) system can utilize an illuminator of opportunity to enhance radar capability. Incorporating the forward scattering technique into a specific mode of PBR can provide an improvement in target detection and classification; the resulting system is known as passive Forward Scattering Radar (FSR). The passive FSR system can exploit the peculiar advantage of the enhancement in forward scatter radar cross section (FSRCS) for target detection. Thus, the aim of this paper is to show the feasibility of passive FSR for moving target detection and classification through experimental analysis and results. The signal source comes from the latest technology of 4G Long-Term Evolution (LTE) base stations. A detailed explanation of the passive FSR receiver circuit, the detection scheme and the classification algorithm is given. In addition, the proposed passive FSR circuit employs a self-mixing technique at the receiver; hence a synchronization signal from the transmitter is not required. The experimental results confirm the passive FSR system's capability for ground target detection and classification. Furthermore, this paper illustrates the first classification result in a passive FSR system. The great potential of the passive FSR system opens a new research area in passive radar that can be used for diverse remote monitoring applications.
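
    After self-mixing, the forward-scatter return is reduced to a low-frequency Doppler signature, and locating its dominant spectral component is one plausible first step in such a detection chain. This is a hedged sketch on a synthetic signal, not the authors' detection scheme; the sample rate, Doppler shift and noise level are assumptions.

```python
import numpy as np

# Synthetic baseband signal: a self-mixing receiver removes the LTE
# carrier, leaving only the target's Doppler signature plus noise.
fs = 1000.0                       # sample rate, Hz (assumed)
t = np.arange(0, 2.0, 1.0 / fs)
doppler_hz = 35.0                 # hypothetical target Doppler shift
rng = np.random.default_rng(7)
signal = np.cos(2 * np.pi * doppler_hz * t) + 0.3 * rng.standard_normal(t.size)

# Dominant Doppler frequency from the windowed magnitude spectrum
spectrum = np.abs(np.fft.rfft(signal * np.hanning(t.size)))
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
peak_hz = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
print(f"estimated Doppler: {peak_hz:.1f} Hz")
```

    Classification would then operate on features of this Doppler spectrum rather than on the raw carrier, which is what makes the unsynchronized passive receiver practical.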

  16. Expression analysis of miRNA and target mRNAs in esophageal cancer

    Energy Technology Data Exchange (ETDEWEB)

    Meng, X.R. [Oncology Department, The First Affiliated Hospital of Zhengzhou University, Zhengzhou (China); Lu, P. [Gastrointestinal Surgery Department, People' s Hospital of Zhengzhou, Zhengzhou (China); Mei, J.Z.; Liu, G.J. [Medical Oncology Department, People' s Hospital of Zhengzhou, Zhengzhou (China); Fan, Q.X. [Oncology Department, The First Affiliated Hospital of Zhengzhou University, Zhengzhou (China)

    2014-08-01

    We aimed to investigate miRNAs and related mRNAs through a network-based approach in order to learn the crucial role that they play in the biological processes of esophageal cancer. Esophageal squamous-cell carcinoma (ESCC) and adenocarcinoma (EAC)-related miRNA and gene expression data were downloaded from the Gene Expression Omnibus database, and differentially expressed miRNAs and genes were selected. Target genes of differentially expressed miRNAs were predicted and their regulatory networks were constructed. Differentially expressed miRNA analysis selected four miRNAs associated with EAC and ESCC, among which hsa-miR-21 and hsa-miR-202 were shared by both diseases. hsa-miR-202 was reported for the first time to be associated with esophageal cancer in the present study. Differentially expressed miRNA target genes were mainly involved in cancer-related and signal-transduction pathways. Functional categories of these target genes were related to transcriptional regulation. The results may indicate potential target miRNAs and genes for future investigations of esophageal cancer.
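
    Finding the miRNAs shared by the two diseases, as reported above for hsa-miR-21 and hsa-miR-202, reduces to a set intersection over the differentially expressed lists. The disease-specific entries below are placeholders, not the study's full lists:

```python
# Differentially expressed miRNAs per disease; the shared members
# match the abstract, the others are hypothetical placeholders.
eac_mirnas = {"hsa-miR-21", "hsa-miR-202", "hsa-miR-placeholder-1"}
escc_mirnas = {"hsa-miR-21", "hsa-miR-202", "hsa-miR-placeholder-2"}

shared = eac_mirnas & escc_mirnas   # miRNAs common to EAC and ESCC
print(sorted(shared))
```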

  17. Analysis and Visualization Tool for Targeted Amplicon Bisulfite Sequencing on Ion Torrent Sequencers.

    Directory of Open Access Journals (Sweden)

    Stephan Pabinger

    Targeted sequencing of PCR amplicons generated from bisulfite-deaminated DNA is a flexible, cost-effective way to study the methylation of a sample at single-CpG resolution and perform subsequent multi-target, multi-sample comparisons. Currently, no platform-specific protocol, support, or analysis solution is provided to perform targeted bisulfite sequencing on a Personal Genome Machine (PGM). Here, we present a novel tool, called TABSAT, for analyzing targeted bisulfite sequencing data generated on Ion Torrent sequencers. The workflow starts with raw sequencing data, performs quality assessment, and uses a tailored version of Bismark to map the reads to a reference genome. The pipeline visualizes results as lollipop plots and is able to deduce specific methylation patterns present in a sample. The obtained profiles are then summarized and compared between samples. In order to assess the performance of the targeted bisulfite sequencing workflow, 48 samples were used to generate 53 different bisulfite-sequencing PCR amplicons from each sample, resulting in 2,544 amplicon targets. We obtained a mean coverage of 282X using 1,196,822 aligned reads. Next, we compared the sequencing results of these targets to the methylation level of the corresponding sites on an Illumina 450k methylation chip. The calculated average Pearson correlation coefficient of 0.91 confirms the sequencing results with one of the industry-leading CpG methylation platforms and shows that targeted amplicon bisulfite sequencing provides an accurate and cost-efficient method for DNA methylation studies, e.g., to provide platform-independent confirmation of Illumina Infinium 450k methylation data. TABSAT offers a novel way to analyze data generated by Ion Torrent instruments and can also be used with data from the Illumina MiSeq platform. It can be easily accessed via the Platomics platform, which offers a web-based graphical user interface along with sample and parameter storage
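
    The chip-versus-sequencing validation step reduces to a Pearson correlation over matched per-CpG methylation levels. A self-contained sketch with hypothetical values (the study's actual average coefficient was 0.91):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-CpG methylation levels (0..1): amplicon
# sequencing vs the corresponding Illumina 450k beta values
seq_meth = [0.05, 0.22, 0.48, 0.61, 0.83, 0.95]
chip_beta = [0.08, 0.25, 0.44, 0.66, 0.79, 0.92]
print(f"Pearson r = {pearson_r(seq_meth, chip_beta):.3f}")
```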

  18. Maximum power point tracking analysis of a coreless ironless electric generator for renewable energy application

    Science.gov (United States)

    Razali, Akhtar; Rahman, Fadhlur; Leong, Yap Wee; Razali Hanipah, Mohd; Azri Hizami, Mohd

    2018-04-01

    The magnetic attraction between the permanent magnets and the soft iron-core laminations of a conventional iron-core electric generator is known as cogging. Cogging requires additional input power to overcome and is therefore one of the sources of power loss. As power output increases, cogging increases proportionally, which in turn raises the power the driver motor must supply to overcome the cog. Therefore, this research was undertaken to study, at a fundamental level, the possibility of removing the iron-core lamination from an electric generator and to examine its performance characteristics. In the maximum power point tracking test, the fabricated ironless coreless electricity generator was tested by applying a load and maximizing the power, voltage and current produced as the rotational speed of the rotor was increased throughout the test. The rotational torque and power output were measured, and efficiency was then analyzed. Results indicated that the generator produced an RMS voltage of 200 VAC at a rotational speed of 318 RPM. The torque required to rotate the generator was 10.8 Nm. The generator had a working efficiency of 77.73% and generated 280 W.
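
    The reported efficiency can be cross-checked from the quoted figures, since mechanical input power is torque times angular speed:

```python
import math

def generator_efficiency(p_out_w, torque_nm, rpm):
    """Efficiency (%) = electrical output / mechanical input,
    where P_in = torque * angular speed (rad/s)."""
    omega = rpm * 2 * math.pi / 60.0   # rad/s
    p_in = torque_nm * omega
    return p_out_w / p_in * 100.0

# Figures quoted in the abstract: 280 W out, 10.8 Nm at 318 RPM
eff = generator_efficiency(280.0, 10.8, 318.0)
print(f"efficiency = {eff:.1f}%")
```

    This comes out near 77.9%, consistent with the reported 77.73%; the small gap is plausibly rounding in the quoted inputs.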

  19. Smart point-of-care systems for molecular diagnostics based on nanotechnology: whole blood glucose analysis

    Science.gov (United States)

    Devadhasan, Jasmine P.; Kim, Sanghyo

    2015-07-01

    Complementary metal oxide semiconductor (CMOS) image sensors have received great attention for their high efficiency in biological applications. The present work describes a CMOS image sensor-based whole blood glucose monitoring system using a point-of-care (POC) approach. A simple poly-ethylene terephthalate (PET) film chip was developed to carry out the enzyme kinetic reaction at various concentrations of blood glucose. In this technique, the assay reagent was adsorbed onto amine-functionalized silica (AFSiO2) nanoparticles in order to achieve glucose oxidation on the PET film chip. The AFSiO2 nanoparticles immobilize the assay reagent through electrostatic attraction and made it straightforward to develop an opaque platform, a chip technically suitable for analysis by the camera module. The oxidized glucose then produces a green color according to the glucose concentration and is analyzed by the camera module as a photon detection technique; the photon number decreases with increasing glucose concentration. This simple sensing approach, utilizing an enzyme-immobilized AFSiO2 nanoparticle chip and an assay detection method, was developed for quantitative glucose measurement.
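
    Converting a measured photon count back to a glucose concentration requires a calibration curve; since the count decreases as glucose rises, a simple interpolation sketch looks like this. The calibration pairs are hypothetical, not the paper's data:

```python
def glucose_from_photons(count, calibration):
    """Linear interpolation on a calibration curve of
    (photon count, glucose mg/dL) pairs; the photon count
    decreases monotonically as glucose increases."""
    cal = sorted(calibration, reverse=True)   # descending photon count
    for (c1, g1), (c2, g2) in zip(cal, cal[1:]):
        if c2 <= count <= c1:
            frac = (c1 - count) / (c1 - c2)
            return g1 + frac * (g2 - g1)
    raise ValueError("photon count outside calibrated range")

# Hypothetical calibration from glucose standards
calibration = [(9000, 50), (7000, 100), (5200, 200), (4000, 300)]
print(glucose_from_photons(6100, calibration))
```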

  20. Parametric analysis of a combined dew point evaporative-vapour compression based air conditioning system

    Directory of Open Access Journals (Sweden)

    Shailendra Singh Chauhan

    2016-09-01

    A combined dew point evaporative-vapour compression air conditioning system for providing good human comfort conditions at low cost is proposed in this paper. The proposed system has been parametrically analysed for a wide range of ambient temperatures and specific humidities under some reasonable assumptions. It has also been compared with a conventional vapour compression air conditioner on the basis of the cooling load on the cooling coil, assuming operation on 100% fresh air. The saving in cooling load on the coil was found to be greatest, at 60.93%, at 46 °C and 6 g/kg specific humidity, while it was negative for very humid ambient air, indicating that the proposed system is suited to dry and moderately humid conditions but not to very humid conditions. The system works well, with an average net monthly power saving of 192.31 kW h for hot and dry conditions and 124.38 kW h for hot and moderately humid conditions. It could therefore be a better alternative for dry and moderately humid climates, with a payback period of 7.2 years.
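
    The quoted payback period follows from a simple-payback calculation. In this sketch the monthly energy saving comes from the abstract, while the extra capital cost and electricity tariff are assumed placeholders (the abstract does not state them):

```python
def simple_payback_years(extra_capital_cost, monthly_kwh_saved,
                         tariff_per_kwh):
    """Simple payback = extra capital cost / annual cost saving."""
    annual_saving = 12 * monthly_kwh_saved * tariff_per_kwh
    return extra_capital_cost / annual_saving

# 192.31 kWh/month saving from the abstract; cost figures assumed
hot_dry = simple_payback_years(1500.0, 192.31, 0.09)
print(f"payback (hot-dry climate): {hot_dry:.1f} years")
```

    With these assumed cost figures the result lands near the reported 7.2 years, but the actual cost inputs used by the authors are not given in the abstract.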