WorldWideScience

Sample records for pca temporally constrained

  1. k-t PCA: temporally constrained k-t BLAST reconstruction using principal component analysis

    DEFF Research Database (Denmark)

    Pedersen, Henrik; Kozerke, Sebastian; Ringgaard, Steffen

    2009-01-01

    … in applications exhibiting a broad range of temporal frequencies such as free-breathing myocardial perfusion imaging. We show that temporal basis functions calculated by subjecting the training data to principal component analysis (PCA) can be used to constrain the reconstruction such that the temporal resolution is improved. The presented method is called k-t PCA. …
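
    A minimal sketch (not the authors' implementation) of the central step described above: deriving temporal basis functions by applying PCA to low-resolution training data, which can then be used to constrain a k-t PCA-style reconstruction. Array names and sizes are illustrative assumptions.

```python
import numpy as np

def temporal_basis_from_training(training, n_components):
    """training: (n_voxels, n_frames) low-resolution dynamic training data."""
    X = training - training.mean(axis=1, keepdims=True)   # remove the temporal mean
    # PCA along the temporal dimension via SVD of the Casorati (voxel x time) matrix
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return Vt[:n_components]            # (n_components, n_frames) temporal basis functions

# Example: 500 training voxels, 40 dynamic frames, keep 8 temporal components
train = np.random.default_rng(0).standard_normal((500, 40))
print(temporal_basis_from_training(train, 8).shape)        # (8, 40)
```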

  2. On applicability of PCA, voxel-wise variance normalization and dimensionality assumptions for sliding temporal window sICA in resting-state fMRI.

    Science.gov (United States)

    Remes, Jukka J; Abou Elseoud, Ahmed; Ollila, Esa; Haapea, Marianne; Starck, Tuomo; Nikkinen, Juha; Tervonen, Osmo; Silven, Olli

    2013-10-01

    Subject-level resting-state fMRI (RS-fMRI) spatial independent component analysis (sICA) may provide new ways to analyze the data when performed in the sliding time window. However, whether principal component analysis (PCA) and voxel-wise variance normalization (VN) are applicable pre-processing procedures in the sliding-window context, as they are for regular sICA, has not been addressed so far. Also model order selection requires further studies concerning sliding-window sICA. In this paper we have addressed these concerns. First, we compared PCA-retained subspaces concerning overlapping parts of consecutive temporal windows to answer whether in-window PCA and VN can confound comparisons between sICA analyses in consecutive windows. Second, we compared the PCA subspaces between windowed and full data to assess expected comparability between windowed and full-data sICA results. Third, temporal evolution of dimensionality estimates in RS-fMRI data sets was monitored to identify potential challenges in model order selection in a sliding-window sICA context. Our results illustrate that in-window VN can be safely used, in-window PCA is applicable with most window widths and that comparisons between windowed and full data should not be performed from a subspace similarity point of view. In addition, our studies on dimensionality estimates demonstrated that there are sustained, periodic and very case-specific changes in signal-to-noise ratio within RS-fMRI data sets. Consequently, dimensionality estimation is needed for well-founded model order determination in the sliding-window case. The observed periodic changes correspond to a frequency band of ≤0.1 Hz, which is commonly associated with brain activity in RS-fMRI and become on average most pronounced at window widths of 80 and 60 time points (144 and 108 s, respectively). Wider windows provided only slightly better comparability between consecutive windows, and 60 time point or shorter windows also provided the
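
    A hedged illustration of the kind of window-overlap comparison described above: PCA-retained subspaces from two overlapping sliding windows are compared through the principal angles between them. This is a toy sketch, not the authors' pipeline; the data sizes and the similarity measure are assumptions.

```python
import numpy as np

def pca_subspace(X, k):
    """X: (n_timepoints, n_voxels); returns a (k, n_voxels) orthonormal spatial PC basis."""
    Xc = X - X.mean(axis=0, keepdims=True)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:k]

def subspace_similarity(V1, V2):
    """Mean squared cosine of the principal angles between two orthonormal bases."""
    s = np.linalg.svd(V1 @ V2.T, compute_uv=False)   # singular values = cos(principal angles)
    return float(np.mean(s ** 2))

rng = np.random.default_rng(1)
data = rng.standard_normal((200, 1000))              # toy (timepoints x voxels) matrix
w1, w2 = data[0:80], data[20:100]                    # two overlapping 80-timepoint windows
print(subspace_similarity(pca_subspace(w1, 10), pca_subspace(w2, 10)))
```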

  3. Audio-visual temporal recalibration can be constrained by content cues regardless of spatial overlap

    Directory of Open Access Journals (Sweden)

    Warrick Roseboom

    2013-04-01

    It has now been well established that the point of subjective synchrony for audio and visual events can be shifted following exposure to asynchronous audio-visual presentations, an effect often referred to as temporal recalibration. Recently it was further demonstrated that it is possible to concurrently maintain two such recalibrated, and opposing, estimates of audio-visual temporal synchrony. However, it remains unclear precisely what defines a given audio-visual pair such that it is possible to maintain a temporal relationship distinct from other pairs. It has been suggested that spatial separation of the different audio-visual pairs is necessary to achieve multiple distinct audio-visual synchrony estimates. Here we investigated if this was necessarily true. Specifically, we examined whether it is possible to obtain two distinct temporal recalibrations for stimuli that differed only in featural content. Using both complex (audio-visual speech; Experiment 1) and simple stimuli (high and low pitch audio matched with either vertically or horizontally oriented Gabors; Experiment 2), we found concurrent, and opposite, recalibrations despite there being no spatial difference in presentation location at any point throughout the experiment. This result supports the notion that the content of an audio-visual pair can be used to constrain distinct audio-visual synchrony estimates regardless of spatial overlap.

  4. The effect of stimulus intensity on response time and accuracy in dynamic, temporally constrained environments.

    Science.gov (United States)

    Causer, J; McRobert, A P; Williams, A M

    2013-10-01

    The ability to make accurate judgments and execute effective skilled movements under severe temporal constraints are fundamental to elite performance in a number of domains including sport, military combat, law enforcement, and medicine. In two experiments, we examine the effect of stimulus strength on response time and accuracy in a temporally constrained, real-world, decision-making task. Specifically, we examine the effect of low stimulus intensity (black) and high stimulus intensity (sequin) uniform designs, worn by teammates, to determine the effect of stimulus strength on the ability of soccer players to make rapid and accurate responses. In both field- and laboratory-based scenarios, professional soccer players viewed developing patterns of play and were required to make a penetrative pass to an attacking player. Significant differences in response accuracy between uniform designs were reported in laboratory- and field-based experiments. Response accuracy was significantly higher in the sequin compared with the black uniform condition. Response times only differed between uniform designs in the laboratory-based experiment. These findings extend the literature into a real-world environment and have significant implications for the design of clothing wear in a number of domains. © 2012 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  5. Acoustical Survey of Methane Plumes on North Hydrate Ridge: Constraining Temporal and Spatial Characteristics.

    Science.gov (United States)

    Kannberg, P. K.; Trehu, A. M.

    2008-12-01

    While methane plumes associated with hydrate formations have been acoustically imaged before, little is known about their temporal characteristics. Previous acoustic surveys have focused on determining plume location, but as far as we know, multiple, repeated surveys of the same plume have not been done prior to the survey presented here. In July 2008, we acquired sixteen identical surveys within 19 hours over the northern summit of Hydrate Ridge in the Cascadia accretionary complex using the onboard 3.5 and 12 kHz echosounders. As in previous studies, the plumes were invisible to the 3.5 kHz echosounder and clearly imaged with 12 kHz. Seafloor depth in this region is ~600 m. Three distinct plumes were detected close to where plumes were located by Heeschen et al. (2003) a decade ago. Two of the plumes disappeared at ~520 m water depth, which is the depth of the top of the gas hydrate stability zone as determined from CTD casts obtained during the cruise. This supports the conclusion of Heeschen et al. (2003) that the bubbles are armored by gas hydrate and that they dissolve in the water column when they leave the hydrate stability zone. One of the plumes near the northern summit, however, extended through this boundary to at least 400 m (the shallowest depth recorded). A similar phenomenon was observed in methane plumes in the Gulf of Mexico, where the methane was found to be armored by an oil skin. In addition to the steady plumes, two discrete "burps" were observed. One "burp" occurred approximately 600 m to the SSW of the northern summit. This was followed by a second strong event 300 m to the north an hour later. To evaluate temporal and spatial patterns, we summed the power of the backscattered signal in different depth windows for each survey. We present the results as a movie in which the backscatter power is shown in map view as a function of time. The surveys encompassed two complete tidal cycles, but no correlation between plume location or intensity and tides was observed.

  6. The PCa Tumor Microenvironment.

    Science.gov (United States)

    Sottnik, Joseph L; Zhang, Jian; Macoska, Jill A; Keller, Evan T

    2011-12-01

    The tumor microenvironment (TME) is a very complex niche that consists of multiple cell types, supportive matrix and soluble factors. Cells in the TME consist of both host cells that are present at the tumor site at the onset of tumor growth and cells that are recruited in response to either tumor- or host-derived factors. Prostate cancer (PCa) thrives on crosstalk between tumor cells and the TME. Crosstalk results in an orchestrated evolution of both the tumor and microenvironment as the tumor progresses. The TME reacts to PCa-produced soluble factors as well as direct interaction with PCa cells. In return, the TME produces soluble factors, structural support and direct contact interactions that influence the establishment and progression of PCa. In this review, we focus on the host side of the equation to provide a foundation for understanding how different aspects of the TME contribute to PCa progression. We discuss immune effector cells, specialized niches, such as the vascular and bone marrow, and several key protein factors that mediate host effects on PCa. This discussion highlights the concept that the TME offers a potentially very fertile target for PCa therapy.

  7. Using temporal seeding to constrain the disparity search range in stereo matching

    CSIR Research Space (South Africa)

    Ndhlovu, T

    2011-11-01

    … for reusing computed disparity estimates on features in a stereo image sequence to constrain the disparity search range. Features are detected on a left image and their disparity estimates are computed using a local-matching algorithm. The features …

  8. Audio-Visual Temporal Recalibration Can be Constrained by Content Cues Regardless of Spatial Overlap

    OpenAIRE

    Roseboom, Warrick; Kawabe, Takahiro; Nishida, Shin?Ya

    2013-01-01

    It has now been well established that the point of subjective synchrony for audio and visual events can be shifted following exposure to asynchronous audio-visual presentations, an effect often referred to as temporal recalibration. Recently it was further demonstrated that it is possible to concurrently maintain two such recalibrated, and opposing, estimates of audio-visual temporal synchrony. However, it remains unclear precisely what defines a given audio-visual pair such that it is possib...

  9. Identification of associations between genotypes and longitudinal phenotypes via temporally-constrained group sparse canonical correlation analysis.

    Science.gov (United States)

    Hao, Xiaoke; Li, Chanxiu; Yan, Jingwen; Yao, Xiaohui; Risacher, Shannon L; Saykin, Andrew J; Shen, Li; Zhang, Daoqiang

    2017-07-15

    Neuroimaging genetics identifies the relationships between genetic variants (i.e., single nucleotide polymorphisms) and brain imaging data to reveal the associations from genotypes to phenotypes. So far, most existing machine-learning approaches are widely used to detect the effective associations between genetic variants and brain imaging data at one time-point. However, those associations are based on static phenotypes and ignore the temporal dynamics of the phenotypical changes. The phenotypes across multiple time-points may exhibit temporal patterns that can be used to facilitate the understanding of the degenerative process. In this article, we propose a novel temporally constrained group sparse canonical correlation analysis (TGSCCA) framework to identify genetic associations with longitudinal phenotypic markers. The proposed TGSCCA method is able to capture the temporal changes in brain from longitudinal phenotypes by incorporating the fused penalty, which requires that the differences between two consecutive canonical weight vectors from adjacent time-points should be small. A new efficient optimization algorithm is designed to solve the objective function. Furthermore, we demonstrate the effectiveness of our algorithm on both synthetic and real data (i.e., the Alzheimer's Disease Neuroimaging Initiative cohort, including progressive mild cognitive impairment (MCI), stable MCI and normal control participants). In comparison with conventional SCCA, our proposed method can achieve strong associations and discover phenotypic biomarkers across multiple time-points to guide disease-progression interpretation. The Matlab code is available at https://sourceforge.net/projects/ibrain-cn/files/. Contact: dqzhang@nuaa.edu.cn or shenli@iu.edu. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
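
    The temporal constraint described above can be sketched as the following objective (my notation, a hedged reconstruction rather than the paper's exact formulation): u is the canonical weight vector for the genetic data X, v_t the canonical weight vector for the imaging phenotypes Y_t at time point t, Ω(u) a group-sparsity penalty, and the fused term keeps consecutive canonical weight vectors close.

```latex
\max_{u,\,v_1,\dots,v_T}\; \sum_{t=1}^{T} u^\top X^\top Y_t\, v_t
  \;-\; \lambda_1\, \Omega(u)
  \;-\; \lambda_2 \sum_{t=2}^{T} \| v_t - v_{t-1} \|_1
\qquad \text{s.t.}\quad \|X u\|_2^2 \le 1,\;\; \|Y_t v_t\|_2^2 \le 1,\;\; t = 1,\dots,T.
```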

  10. Constraining Swiss Methane Emissions from Atmospheric Observations: Sensitivities and Temporal Development

    Science.gov (United States)

    Henne, Stephan; Leuenberger, Markus; Steinbacher, Martin; Eugster, Werner; Meinhardt, Frank; Bergamaschi, Peter; Emmenegger, Lukas; Brunner, Dominik

    2017-04-01

    As in other Western European countries, agricultural sources dominate the methane (CH4) emission budget in Switzerland. 'Bottom-up' estimates of these emissions are still subject to relatively large uncertainties due to considerable variability and uncertainties in observed emission factors for the underlying processes (e.g., enteric fermentation, manure management). Here, we present a regional-scale (~300 x 200 km²) atmospheric inversion study of CH4 emissions in Switzerland making use of the recently established CarboCount-CH network of four stations on the Swiss Plateau as well as the neighbouring mountain-top sites Jungfraujoch and Schauinsland (Germany). Continuous observations from all CarboCount-CH sites are available since 2013. We use a high-resolution (7 x 7 km²) Lagrangian particle dispersion model (FLEXPART-COSMO) in connection with two different inversion systems (Bayesian and extended Kalman filter) to estimate spatially and temporally resolved CH4 emissions for the Swiss domain in the period 2013 to 2016. An extensive set of sensitivity inversions is used to assess the overall uncertainty of our inverse approach. In general we find good agreement of the total Swiss CH4 emissions between our 'top-down' estimate and the national 'bottom-up' reporting. In addition, a robust emission seasonality, with reduced wintertime values, can be seen in all years. No significant trend or year-to-year variability was observed for the analysed four-year period, again in agreement with a very small downward trend in the national 'bottom-up' reporting. Special attention is given to the influence of boundary conditions as taken from different global scale model simulations (TM5, FLEXPART) and remote observations. We find that uncertainties in the boundary conditions can induce large offsets in the national total emissions. However, spatial emission patterns are less sensitive to the choice of boundary condition. Furthermore and in order to demonstrate the …

  11. Volatilization from PCA steel alloy

    Energy Technology Data Exchange (ETDEWEB)

    Hagrman, D.L.; Smolik, G.R.; McCarthy, K.A.; Petti, D.A.

    1996-08-01

    The mobilizations of key components from Primary Candidate Alloy (PCA) steel have been measured with laboratory-scale experiments. The experiments indicate most of the mobilization from PCA steel is due to oxide formation and spalling, but that the spalled particles are large enough to settle rapidly. Based on the experiments, models for the volatilization of iron, manganese, and cobalt from PCA steel in steam, and of molybdenum from PCA steel in air, have been derived.

  12. Petrology of Antarctic Eucrites PCA 91078 and PCA 91245

    Science.gov (United States)

    Howard, L. M.; Domanik, K. J.; Drake, M. J.; Mittlefehldt, D. W.

    2002-01-01

    Antarctic eucrites PCA 91078 and PCA 91245, are petrographically characterized and found to be unpaired, type 6, basaltic eucrites. Observed textures that provide insight into the petrogenesis of these meteorites are also discussed. Additional information is contained in the original extended abstract.

  13. Memory Efficient PCA Methods for Large Group ICA.

    Science.gov (United States)

    Rachakonda, Srinivas; Silva, Rogers F; Liu, Jingyu; Calhoun, Vince D

    2016-01-01

    Principal component analysis (PCA) is widely used for data reduction in group independent component analysis (ICA) of fMRI data. Commonly, group-level PCA of temporally concatenated datasets is computed prior to ICA of the group principal components. This work focuses on reducing very high dimensional temporally concatenated datasets into its group PCA space. Existing randomized PCA methods can determine the PCA subspace with minimal memory requirements and, thus, are ideal for solving large PCA problems. Since the number of dataloads is not typically optimized, we extend one of these methods to compute PCA of very large datasets with a minimal number of dataloads. This method is coined multi power iteration (MPOWIT). The key idea behind MPOWIT is to estimate a subspace larger than the desired one, while checking for convergence of only the smaller subset of interest. The number of iterations is reduced considerably (as well as the number of dataloads), accelerating convergence without loss of accuracy. More importantly, in the proposed implementation of MPOWIT, the memory required for successful recovery of the group principal components becomes independent of the number of subjects analyzed. Highly efficient subsampled eigenvalue decomposition techniques are also introduced, furnishing excellent PCA subspace approximations that can be used for intelligent initialization of randomized methods such as MPOWIT. Together, these developments enable efficient estimation of accurate principal components, as we illustrate by solving a 1600-subject group-level PCA of fMRI with standard acquisition parameters, on a regular desktop computer with only 4 GB RAM, in just a few hours. MPOWIT is also highly scalable and could realistically solve group-level PCA of fMRI on thousands of subjects, or more, using standard hardware, limited only by time, not memory. Also, the MPOWIT algorithm is highly parallelizable, which would enable fast, distributed implementations ideal for big
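
    A rough sketch of the idea behind an MPOWIT-like scheme, under my own simplifying assumptions (in-memory data, a single multiplication by XᵀX per iteration standing in for one "dataload"): iterate on a block larger than the desired subspace and monitor convergence of only the leading k eigenvalues. This is not the released MPOWIT code.

```python
import numpy as np

def block_power_pca(X, k, oversample=2, max_iter=100, tol=1e-6, seed=0):
    """X: (n_samples, n_features), assumed mean-centered; returns an (n_features, k) PC basis."""
    rng = np.random.default_rng(seed)
    m = oversample * k                                 # iterate on a larger block than needed
    Z = rng.standard_normal((X.shape[1], m))
    prev = np.full(k, np.inf)
    for _ in range(max_iter):
        Q = np.linalg.qr(Z)[0]                         # orthonormal basis of the current block
        Z = X.T @ (X @ Q)                              # one pass over the data ("dataload")
        B = Q.T @ Z                                    # small m x m Rayleigh-Ritz matrix
        evals, evecs = np.linalg.eigh(B)
        order = np.argsort(evals)[::-1]
        top = evals[order[:k]]
        if np.all(np.abs(top - prev) <= tol * np.abs(top)):
            break                                      # convergence monitored on the top k only
        prev = top
    return Q @ evecs[:, order[:k]]                     # rotate to the leading k directions

X = np.random.default_rng(1).standard_normal((2000, 300))
X -= X.mean(axis=0)
print(block_power_pca(X, 10).shape)                    # (300, 10)
```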

  14. Memory efficient PCA methods for large group ICA

    Directory of Open Access Journals (Sweden)

    Srinivas eRachakonda

    2016-02-01

    Principal component analysis (PCA) is widely used for data reduction in group independent component analysis (ICA) of fMRI data. Commonly, group-level PCA of temporally concatenated datasets is computed prior to ICA of the group principal components. This work focuses on reducing very high dimensional temporally concatenated datasets into its group PCA space. Existing randomized PCA methods can determine the PCA subspace with minimal memory requirements and, thus, are ideal for solving large PCA problems. Since the number of dataloads is not typically optimized, we extend one of these methods to compute PCA of very large datasets with a minimal number of dataloads. This method is coined multi power iteration (MPOWIT). The key idea behind MPOWIT is to estimate a subspace larger than the desired one, while checking for convergence of only the smaller subset of interest. The number of iterations is reduced considerably (as well as the number of dataloads), accelerating convergence without loss of accuracy. More importantly, in the proposed implementation of MPOWIT, the memory required for successful recovery of the group principal components becomes independent of the number of subjects analyzed. Highly efficient subsampled eigenvalue decomposition techniques are also introduced, furnishing excellent PCA subspace approximations that can be used for intelligent initialization of randomized methods such as MPOWIT. Together, these developments enable efficient estimation of accurate principal components, as we illustrate by solving a 1600-subject group-level PCA of fMRI with standard acquisition parameters, on a regular desktop computer with only 4 GB RAM, in just a few hours. MPOWIT is also highly scalable and could realistically solve group-level PCA of fMRI on thousands of subjects, or more, using standard hardware, limited only by time, not memory. Also, the MPOWIT algorithm is highly parallelizable, which would enable fast, distributed implementations …

  15. Circle of Willis Variants: Fetal PCA

    Directory of Open Access Journals (Sweden)

    Amir Shaban

    2013-01-01

    We sought to determine the prevalence of fetal posterior cerebral artery (fPCA) and if fPCA was associated with specific stroke etiology and vessel territory affected. This paper is a retrospective review of prospectively identified patients with acute ischemic stroke from July 2008 to December 2010. We defined complete fPCA as absence of a P1 segment linking the basilar with the PCA and partial fPCA as a small segment linking the basilar with the PCA. Patients without intracranial vascular imaging were excluded. We compared patients with complete fPCA, partial fPCA, and without fPCA in terms of demographics, stroke severity, distribution, and etiology, and factored in whether the stroke was ipsilateral to the fPCA. Of the 536 included patients, 9.5% had complete fPCA and 15.1% had partial fPCA. Patients with complete fPCA were older and more often female than partial fPCA and no fPCA patients, and significant variation in TOAST classification was detected across groups. Patients with complete fPCA had less small vessel and more large vessel strokes than patients with no fPCA and partial fPCA. Fetal PCA may predispose to stroke mechanism, but is not associated with vascular distribution, stroke severity, or early outcome.

  16. Recent Improvements to the Calibration Models for RXTE/PCA

    Science.gov (United States)

    Jahoda, K.

    2008-01-01

    We are updating the calibration of the PCA to correct for slow variations, primarily in the energy-to-channel relationship. We have also improved the physical model in the vicinity of the Xe K-edge, which should increase the reliability of continuum fits above 20 keV. The improvements to the matrix are especially important for simultaneous observations, where the PCA is often used to constrain the continuum while other higher-resolution spectrometers are used to study the shape of lines and edges associated with iron.

  17. PCaPAC 2006 Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Pavel Chevtsov; Matthew Bickley (Eds.)

    2007-03-30

    The 6th international PCaPAC (Personal Computers and Particle Accelerator Controls) workshop was held at Jefferson Lab, Newport News, Virginia, from October 24-27, 2006. The main objectives of the conference were to discuss the most important issues of the use of PCs and modern IT technologies for controls of accelerators and to give scientists, engineers, and technicians a forum to exchange ideas on control problems and their solutions. The workshop consisted of plenary sessions and poster sessions; no parallel sessions were held. In total, more than seventy oral and poster presentations, as well as tutorials, were given during the conference, on the basis of which about fifty papers were submitted by the authors and included in this publication. This printed version of the PCaPAC 2006 Proceedings is published at Jefferson Lab according to the decision of the PCaPAC International Program Committee of October 26, 2006.

  18. Automated tractography in patients with temporal lobe epilepsy using TRActs Constrained by UnderLying Anatomy (TRACULA)

    Directory of Open Access Journals (Sweden)

    Barbara A.K. Kreilkamp

    2017-01-01

    Conclusion: This study shows that TRACULA permits the detection of alterations of DTI tract scalar metrics in patients with TLE. It also provides the opportunity to explore relationships with structural volume measurements and clinical variables along white matter tracts. Our data suggest that the anterior temporal lobe portions of the uncinate and inferior longitudinal fasciculus may be particularly vulnerable to pathological alterations in patients with TLE. These alterations are unrelated to the extent of hippocampal atrophy (and therefore potentially mediated by independent mechanisms) but are influenced by the chronicity and severity of the disorder.

  19. Spatio-Temporal Constrained Human Trajectory Generation from the PIR Motion Detector Sensor Network Data: A Geometric Algebra Approach.

    Science.gov (United States)

    Yu, Zhaoyuan; Yuan, Linwang; Luo, Wen; Feng, Linyao; Lv, Guonian

    2015-12-30

    Passive infrared (PIR) motion detectors, which can support long-term continuous observation, are widely used for human motion analysis. Extracting all possible trajectories from the PIR sensor networks is important. Because the PIR sensor does not log location and individual information, none of the existing methods can generate all possible human motion trajectories that satisfy various spatio-temporal constraints from the sensor activation log data. In this paper, a geometric algebra (GA)-based approach is developed to generate all possible human trajectories from the PIR sensor network data. First, the geographical network, sensor activation response sequences and the human motion are represented as algebraic elements using GA. The human motion status of each sensor activation is labeled using GA-based trajectory tracking. Then, a matrix multiplication approach is developed to dynamically generate the human trajectories according to the sensor activation log and the spatio-temporal constraints. The method is tested with the MERL motion database. Experiments show that our method can flexibly extract the major statistical pattern of the human motion. Compared with direct statistical analysis and the tracklet graph method, our method can effectively extract all possible trajectories of the human motion, which makes it more accurate. Our method is also likely to provide a new way to filter other passive sensor log data in sensor networks.
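
    A toy illustration of the matrix-multiplication idea described above: feasible positions are propagated through a sensor-adjacency matrix and masked at each step by which PIR sensors actually fired. The network, firing log and variable names are invented for illustration, and the geometric-algebra encoding used in the paper is not reproduced.

```python
import numpy as np

# Sensor i is reachable from sensor j in one step (invented 4-sensor corridor network)
adj = np.array([[1, 1, 0, 0],
                [1, 1, 1, 0],
                [0, 1, 1, 1],
                [0, 0, 1, 1]])

# Which sensors fired at each time step (invented activation log)
activations = [np.array([1, 0, 0, 0]),
               np.array([0, 1, 0, 0]),
               np.array([0, 0, 1, 1])]

feasible = activations[0].astype(bool)
for fired in activations[1:]:
    # one-step propagation through the network, then mask by the observed activations
    feasible = ((adj @ feasible) > 0) & fired.astype(bool)

print(feasible)   # sensors at which a trajectory consistent with the log can currently end
```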

  20. Spatio-Temporal Constrained Human Trajectory Generation from the PIR Motion Detector Sensor Network Data: A Geometric Algebra Approach

    Directory of Open Access Journals (Sweden)

    Zhaoyuan Yu

    2015-12-01

    Passive infrared (PIR) motion detectors, which can support long-term continuous observation, are widely used for human motion analysis. Extracting all possible trajectories from the PIR sensor networks is important. Because the PIR sensor does not log location and individual information, none of the existing methods can generate all possible human motion trajectories that satisfy various spatio-temporal constraints from the sensor activation log data. In this paper, a geometric algebra (GA)-based approach is developed to generate all possible human trajectories from the PIR sensor network data. First, the geographical network, sensor activation response sequences and the human motion are represented as algebraic elements using GA. The human motion status of each sensor activation is labeled using GA-based trajectory tracking. Then, a matrix multiplication approach is developed to dynamically generate the human trajectories according to the sensor activation log and the spatio-temporal constraints. The method is tested with the MERL motion database. Experiments show that our method can flexibly extract the major statistical pattern of the human motion. Compared with direct statistical analysis and the tracklet graph method, our method can effectively extract all possible trajectories of the human motion, which makes it more accurate. Our method is also likely to provide a new way to filter other passive sensor log data in sensor networks.

  1. Constrained principal component analysis and related techniques

    CERN Document Server

    Takane, Yoshio

    2013-01-01

    In multivariate data analysis, regression techniques predict one set of variables from another while principal component analysis (PCA) finds a subspace of minimal dimensionality that captures the largest variability in the data. How can regression analysis and PCA be combined in a beneficial way? Why and when is it a good idea to combine them? What kind of benefits are we getting from them? Addressing these questions, Constrained Principal Component Analysis and Related Techniques shows how constrained PCA (CPCA) offers a unified framework for these approaches.The book begins with four concre

  2. Circle of Willis Variants: Fetal PCA

    OpenAIRE

    Amir Shaban; Karen C. Albright; Amelia K. Boehme; Sheryl Martin-Schild

    2013-01-01

    We sought to determine the prevalence of fetal posterior cerebral artery (fPCA) and if fPCA was associated with specific stroke etiology and vessel territory affected. This paper is a retrospective review of prospectively identified patients with acute ischemic stroke from July 2008 to December 2010. We defined complete fPCA as absence of a P1 segment linking the basilar with the PCA and partial fPCA as small segment linking the basilar with the PCA. Patients without intracranial vascular ima...

  3. Sparse PCA with Oracle Property.

    Science.gov (United States)

    Gu, Quanquan; Wang, Zhaoran; Liu, Han

    In this paper, we study the estimation of the k-dimensional sparse principal subspace of the covariance matrix Σ in the high-dimensional setting. We aim to recover the oracle principal subspace solution, i.e., the principal subspace estimator obtained assuming the true support is known a priori. To this end, we propose a family of estimators based on the semidefinite relaxation of sparse PCA with novel regularizations. In particular, under a weak assumption on the magnitude of the population projection matrix, one estimator within this family exactly recovers the true support with high probability, has exact rank k, and attains a [Formula: see text] statistical rate of convergence, with s being the subspace sparsity level and n the sample size. Compared to existing support recovery results for sparse PCA, our approach does not hinge on the spiked covariance model or the limited correlation condition. As a complement to the first estimator, which enjoys the oracle property, we prove that another estimator within the family achieves a sharper statistical rate of convergence than the standard semidefinite relaxation of sparse PCA, even when the previous assumption on the magnitude of the projection matrix is violated. We validate the theoretical results by numerical experiments on synthetic datasets.
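
    For orientation, one common semidefinite relaxation of sparse PCA of the general kind referred to above can be written as below (a sketch in my own notation, with a Fantope constraint and an elementwise l1 penalty; the paper's novel regularizations differ).

```latex
% Sigma-hat: sample covariance; Pi: convex surrogate of the rank-k projection matrix;
% rho: sparsity weight; the constraint set is the Fantope of dimension k.
\widehat{\Pi} \;=\; \arg\max_{\Pi}\;\; \langle \widehat{\Sigma}, \Pi \rangle
  \;-\; \rho \sum_{i,j} |\Pi_{ij}|
\qquad \text{s.t.}\quad 0 \preceq \Pi \preceq I,\;\; \operatorname{tr}(\Pi) = k.
```

    The sparse principal subspace estimate is then taken as the span of the top-k eigenvectors of the optimizer.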

  4. A temporal record of pre-eruptive magmatic volatile contents at Campi Flegrei: Insights from texturally-constrained apatite analyses

    Science.gov (United States)

    Stock, Michael J.; Isaia, Roberto; Humphreys, Madeleine C. S.; Smith, Victoria C.; Pyle, David M.

    2016-04-01

    Apatite is capable of incorporating all major magmatic volatile species (H2O, CO2, S, Cl and F) into its crystal structure. Analysis of apatite volatile contents can be related to parental magma compositions through the application of pressure- and temperature-dependent exchange reactions (Piccoli and Candela, 1994). Once included within phenocrysts, apatite inclusions are isolated from the melt and preserve a temporal record of magmatic volatile contents in the build-up to eruption. In this work, we measured the volatile compositions of apatite inclusions, apatite microphenocrysts and pyroxene-hosted melt inclusions from the Astroni 1 eruption of Campi Flegrei, Italy (Stock et al., 2016). These data are coupled with magmatic differentiation models (Gualda et al., 2012), experimental volatile solubility data (Webster et al., 2014) and thermodynamic models of apatite compositional variations (Piccoli and Candela, 1994) to decipher pre-eruptive magmatic processes. We find that apatite halogen/OH ratios decreased through magmatic differentiation, while melt inclusion F and Cl concentrations increased. Melt inclusion H2O contents are constant at ~2.5 wt%. These data are best explained by volatile-undersaturated differentiation over most of the crystallisation history of the Astroni 1 melt, with melt inclusion H2O contents reset at shallow levels during ascent. Given the high diffusivity of volatiles in apatite (Brenan, 1993), the preservation of volatile-undersaturated melt compositions in microphenocrysts suggests that saturation was only achieved 10-10³ days before eruption. We suggest that late-stage transition into a volatile-saturated state caused an increase in magma chamber overpressure, which ultimately triggered the Astroni 1 eruption. This has major implications for monitoring of Campi Flegrei and other similar volcanic systems. References: Piccoli and Candela (1994), Am. J. Sci., 294, 92-135; Stock et al. (2016), Nat. Geosci.; Gualda et al. (2012), J. Pet., 53, 875 …

  5. Tensile properties of unirradiated path A PCA

    International Nuclear Information System (INIS)

    Braski, D.N.; Maziasz, P.J.

    1983-01-01

    The tensile properties of PCA in the A1 (solution annealed), A3 (25% cold worked), and B2 (aged, cold worked, and reaged) conditions were determined from room temperature to 600 °C. The tensile behavior of PCA-A1 and -A3 was generally similar to that of titanium-modified type 316 stainless steel with similar microstructures. The PCA-B2 was weaker than PCA-A3, especially above 500 °C, but demonstrated slightly better ductility.

  6. Improved k-t PCA Algorithm Using Artificial Sparsity in Dynamic MRI.

    Science.gov (United States)

    Wang, Yiran; Chen, Zhifeng; Wang, Jing; Yuan, Lixia; Xia, Ling; Liu, Feng

    2017-01-01

    The k-t principal component analysis (k-t PCA) is an effective approach for high spatiotemporal resolution dynamic magnetic resonance (MR) imaging. However, it suffers from larger residual aliasing artifacts and noise amplification when the reduction factor goes higher. To further enhance the performance of this technique, we propose a new method called sparse k-t PCA that combines the k-t PCA algorithm with an artificial sparsity constraint. It is a self-calibrated procedure that is based on the traditional k-t PCA method by further eliminating the reconstruction error derived from complex subtraction of the sampled k-t space from the original reconstructed k-t space. The proposed method is tested through both simulations and in vivo datasets with different reduction factors. Compared to the standard k-t PCA algorithm, the sparse k-t PCA can improve the normalized root-mean-square error performance and the accuracy of temporal resolution. It is thus useful for rapid dynamic MR imaging.

  7. Semi-Supervised Kernel PCA

    DEFF Research Database (Denmark)

    Walder, Christian; Henao, Ricardo; Mørup, Morten

    We present three generalisations of Kernel Principal Components Analysis (KPCA) which incorporate knowledge of the class labels of a subset of the data points. The first, MV-KPCA, penalises within class variances similar to Fisher discriminant analysis. The second, LSKPCA, is a hybrid of least squares regression and kernel PCA. The final LR-KPCA is an iteratively reweighted version of the previous which achieves a sigmoid loss function on the labeled points. We provide a theoretical risk bound as well as illustrative experiments on real and toy data sets.

  8. International assessment of PCA codes

    International Nuclear Information System (INIS)

    Neymotin, L.; Lui, C.; Glynn, J.; Archarya, S.

    1993-11-01

    Over the past three years (1991-1993), an extensive international exercise for intercomparison of a group of six Probabilistic Consequence Assessment (PCA) codes was undertaken. The exercise was jointly sponsored by the Commission of European Communities (CEC) and OECD Nuclear Energy Agency. This exercise was a logical continuation of a similar effort undertaken by OECD/NEA/CSNI in 1979-1981. The PCA codes are currently used by different countries for predicting radiological health and economic consequences of severe accidents at nuclear power plants (and certain types of non-reactor nuclear facilities) resulting in releases of radioactive materials into the atmosphere. The codes participating in the exercise were: ARANO (Finland), CONDOR (UK), COSYMA (CEC), LENA (Sweden), MACCS (USA), and OSCAAR (Japan). In parallel with this inter-code comparison effort, two separate groups performed a similar set of calculations using two of the participating codes, MACCS and COSYMA. Results of the intercode and inter-MACCS comparisons are presented in this paper. The MACCS group included four participants: GREECE: Institute of Nuclear Technology and Radiation Protection, NCSR Demokritos; ITALY: ENEL, ENEA/DISP, and ENEA/NUC-RIN; SPAIN: Universidad Politecnica de Madrid (UPM) and Consejo de Seguridad Nuclear; USA: Brookhaven National Laboratory, US NRC and DOE

  9. Parallel GPU implementation of iterative PCA algorithms.

    Science.gov (United States)

    Andrecut, M

    2009-11-01

    Principal component analysis (PCA) is a key statistical technique for multivariate data analysis. For large data sets, the common approach to PCA computation is based on the standard NIPALS-PCA algorithm, which unfortunately suffers from loss of orthogonality, and therefore its applicability is usually limited to the estimation of the first few components. Here we present an algorithm based on Gram-Schmidt orthogonalization (called GS-PCA), which eliminates this shortcoming of NIPALS-PCA. Also, we discuss the GPU (Graphics Processing Unit) parallel implementation of both NIPALS-PCA and GS-PCA algorithms. The numerical results show that the GPU parallel optimized versions, based on CUBLAS (NVIDIA), are substantially faster (up to 12 times) than the CPU optimized versions based on CBLAS (GNU Scientific Library).
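
    An illustrative CPU version of the idea behind GS-PCA described above: NIPALS iterations in which each new loading vector is explicitly re-orthogonalised (Gram-Schmidt) against the previously extracted ones. This is a plain NumPy sketch under my own assumptions, not the paper's CUBLAS/GPU implementation.

```python
import numpy as np

def gs_pca(X, n_components, max_iter=500, tol=1e-10):
    """NIPALS-style PCA with explicit Gram-Schmidt re-orthogonalisation of the loadings."""
    R = X - X.mean(axis=0)                            # residual matrix, deflated per component
    scores, loadings = [], []
    for _ in range(n_components):
        t = R[:, np.argmax(R.var(axis=0))].copy()     # start from the highest-variance column
        for _ in range(max_iter):
            p = R.T @ t
            for q in loadings:                        # Gram-Schmidt against earlier loadings
                p -= (q @ p) * q
            p /= np.linalg.norm(p)
            t_new = R @ p
            if np.linalg.norm(t_new - t) <= tol * np.linalg.norm(t_new):
                t = t_new
                break
            t = t_new
        R -= np.outer(t, p)                           # deflate the fitted rank-1 component
        scores.append(t)
        loadings.append(p)
    return np.column_stack(scores), np.column_stack(loadings)

X = np.random.default_rng(0).standard_normal((200, 30))
T, P = gs_pca(X, 5)
print(T.shape, P.shape)      # (200, 5) (30, 5)
```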

  10. MD-11 PCA - Research flight team photo

    Science.gov (United States)

    1995-01-01

    On Aug. 30, 1995, the McDonnell Douglas MD-11 transport aircraft landed equipped with a computer-assisted engine control system that has the potential to increase flight safety. In landings at NASA Dryden Flight Research Center, Edwards, California, on August 29 and 30, the aircraft demonstrated software used in the aircraft's flight control computer that essentially landed the MD-11 without a need for the pilot to manipulate the flight controls significantly. In partnership with McDonnell Douglas Aerospace (MDA), with Pratt & Whitney and Honeywell helping to design the software, NASA developed this propulsion-controlled aircraft (PCA) system following a series of incidents in which hydraulic failures resulted in the loss of flight controls. This new system enables a pilot to operate and land the aircraft safely when its normal, hydraulically-activated control surfaces are disabled. This August 29, 1995, photo shows the MD-11 team. Back row, left to right: Tim Dingen, MDA pilot; John Miller, MD-11 Chief pilot (MDA); Wayne Anselmo, MD-11 Flight Test Engineer (MDA); Gordon Fullerton, PCA Project pilot; Bill Burcham, PCA Chief Engineer; Rudey Duran, PCA Controls Engineer (MDA); John Feather, PCA Controls Engineer (MDA); Daryl Townsend, Crew Chief; Henry Hernandez, aircraft mechanic; Bob Baron, PCA Project Manager; Don Hermann, aircraft mechanic; Jerry Cousins, aircraft mechanic; Eric Petersen, PCA Manager (Honeywell); Trindel Maine, PCA Data Engineer; Jeff Kahler, PCA Software Engineer (Honeywell); Steve Goldthorpe, PCA Controls Engineer (MDA). Front row, left to right: Teresa Hass, Senior Project Management Analyst; Hollie Allingham (Aguilera), Senior Project Management Analyst; Taher Zeglum, PCA Data Engineer (MDA); Drew Pappas, PCA Project Manager (MDA); John Burken, PCA Control Engineer.

  11. An efficient algorithm for weighted PCA

    NARCIS (Netherlands)

    Krijnen, W.P.; Kiers, H.A.L.

    1995-01-01

    The method for analyzing three-way data where one of the three component matrices in TUCKALS3 is chosen to have one column is called Replicated PCA. The corresponding algorithm is relatively inefficient. This is shown by offering an alternative algorithm called Weighted PCA. Specifically, it is …

  12. Model selection for Gaussian kernel PCA denoising

    DEFF Research Database (Denmark)

    Jørgensen, Kasper Winther; Hansen, Lars Kai

    2012-01-01

    We propose kernel Parallel Analysis (kPA) for automatic kernel scale and model order selection in Gaussian kernel PCA. Parallel Analysis [1] is based on a permutation test for covariance and has previously been applied for model order selection in linear PCA; we here augment the procedure to also tune the Gaussian kernel scale of radial basis function based kernel PCA. We evaluate kPA for denoising of simulated data and the US Postal data set of handwritten digits. We find that kPA outperforms other heuristics to choose the model order and kernel scale in terms of signal-to-noise ratio (SNR) …
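
    A rough sketch of a Parallel Analysis-style permutation test for choosing the number of Gaussian kernel PCA components, in the spirit of the kPA procedure described above. The kernel centering, the independent column permutations and the 95th-percentile threshold are my assumptions, not necessarily the authors' exact choices.

```python
import numpy as np

def centered_rbf_eigvals(X, gamma):
    """Eigenvalues of the centered Gaussian (RBF) kernel matrix, sorted descending."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-gamma * sq)
    n = len(X)
    J = np.eye(n) - np.ones((n, n)) / n
    return np.sort(np.linalg.eigvalsh(J @ K @ J))[::-1]

def kpa_model_order(X, gamma, n_perm=20, seed=0):
    """Number of kernel PCA components whose eigenvalues exceed the permutation null."""
    rng = np.random.default_rng(seed)
    obs = centered_rbf_eigvals(X, gamma)
    null = np.array([centered_rbf_eigvals(
        np.column_stack([rng.permutation(col) for col in X.T]), gamma)
        for _ in range(n_perm)])
    thresh = np.percentile(null, 95, axis=0)          # 95th percentile of the permuted spectra
    below = np.where(obs < thresh)[0]
    return int(below[0]) if below.size else len(obs)

X = np.random.default_rng(1).standard_normal((100, 5))
X[:, 0] += 3 * np.sin(np.linspace(0, 6 * np.pi, 100))  # inject low-dimensional structure
print(kpa_model_order(X, gamma=0.5))
```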

  13. Performance comparisons between PCA-EA-LBG and PCA-LBG-EA approaches in VQ codebook generation for image compression

    Science.gov (United States)

    Tsai, Jinn-Tsong; Chou, Ping-Yi; Chou, Jyh-Horng

    2015-11-01

    The aim of this study is to generate vector quantisation (VQ) codebooks by integrating the principal component analysis (PCA) algorithm, the Linde-Buzo-Gray (LBG) algorithm, and evolutionary algorithms (EAs). The EAs include the genetic algorithm (GA), particle swarm optimisation (PSO), honey bee mating optimisation (HBMO), and the firefly algorithm (FF). The study provides performance comparisons between the PCA-EA-LBG and PCA-LBG-EA approaches. The PCA-EA-LBG approaches contain PCA-GA-LBG, PCA-PSO-LBG, PCA-HBMO-LBG, and PCA-FF-LBG, while the PCA-LBG-EA approaches contain PCA-LBG, PCA-LBG-GA, PCA-LBG-PSO, PCA-LBG-HBMO, and PCA-LBG-FF. All training vectors of the test images are grouped according to PCA. The PCA-EA-LBG approaches use the vectors grouped by PCA as initial individuals, and the best solution gained by the EAs is given to LBG to discover a codebook. The PCA-LBG approach uses PCA to select vectors as initial individuals for LBG to find a codebook. The PCA-LBG-EA approaches use the final result of PCA-LBG as an initial individual for the EAs to find a codebook. The search scheme in PCA-EA-LBG first uses a global search and then applies a local search, while PCA-LBG-EA first uses a local search and then employs a global search. The results verify that PCA-EA-LBG indeed gains superior results compared to PCA-LBG-EA, because PCA-EA-LBG explores a global area to find a solution and then exploits a better one from the local area of that solution. Furthermore, the proposed PCA-EA-LBG approaches to designing VQ codebooks outperform existing approaches in the literature.
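
    For reference, the basic LBG (generalized Lloyd) codebook update that the PCA-EA-LBG and PCA-LBG-EA hybrids above build on can be sketched as follows; the PCA grouping and evolutionary initialisation steps are not reproduced here, and all sizes are illustrative.

```python
import numpy as np

def lbg_codebook(vectors, codebook, n_iter=20):
    """vectors: (N, d) training vectors; codebook: (K, d) initial codewords (updated in place)."""
    for _ in range(n_iter):
        # nearest-codeword partition of the training vectors
        d2 = ((vectors[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
        assign = d2.argmin(axis=1)
        # centroid (codeword) update for every non-empty cell
        for k in range(len(codebook)):
            members = vectors[assign == k]
            if len(members):
                codebook[k] = members.mean(axis=0)
    return codebook

rng = np.random.default_rng(5)
blocks = rng.standard_normal((1024, 16))                  # e.g. 4x4 image blocks as vectors
cb = lbg_codebook(blocks, blocks[rng.choice(1024, 32, replace=False)].copy())
print(cb.shape)   # (32, 16)
```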

  14. PcaO Positively Regulates pcaHG of the β-Ketoadipate Pathway in Corynebacterium glutamicum

    OpenAIRE

    Zhao, Ke-Xin; Huang, Yan; Chen, Xi; Wang, Nan-Xi; Liu, Shuang-Jiang

    2010-01-01

    We identified a new regulator, PcaO, which is involved in regulation of the protocatechuate (PCA) branch of the β-ketoadipate pathway in Corynebacterium glutamicum. PcaO is an atypical large ATP-binding LuxR family (LAL)-type regulator and does not have a Walker A motif. A mutant of C. glutamicum in which pcaO was disrupted (RES167ΔpcaO) was unable to grow on PCA, and growth on PCA was restored by complementation with pcaO. Both an enzymatic assay of PCA 3,4-dioxygenase activity (encoded by p...

  15. Simultaneous Estimation of Hydrochlorothiazide, Hydralazine Hydrochloride, and Reserpine Using PCA, NAS, and NAS-PCA.

    Science.gov (United States)

    Sharma, Chetan; Badyal, Pragya Nand; Rawal, Ravindra K

    2015-01-01

    In this study, new and feasible UV-visible spectrophotometric and multivariate spectrophotometric methods were described for the simultaneous determination of hydrochlorothiazide (HCTZ), hydralazine hydrochloride (H.HCl), and reserpine (RES) in combined pharmaceutical tablets. Methanol was used as the solvent for analysis and the whole UV region was scanned from 200 to 400 nm. Resolution was obtained by using multivariate methods, namely the net analyte signal method (NAS), principal component analysis (PCA), and net analyte signal-principal component analysis (NAS-PCA), applied to the UV spectra of the mixture. The results obtained from the three methods were compared. NAS-PCA resolved considerably more information than NAS and PCA alone. Thus, the NAS-PCA technique, a combination of the NAS and PCA methods, is advantageous for extracting information from overlapping spectra.

  16. PCA3 and PCA3-Based Nomograms Improve Diagnostic Accuracy in Patients Undergoing First Prostate Biopsy

    Directory of Open Access Journals (Sweden)

    Virginie Vlaeminck-Guillem

    2013-08-01

    While now recognized as an aid to predict repeat prostate biopsy outcome, the urinary PCA3 (prostate cancer gene 3) test has also been recently advocated to predict initial biopsy results. The objective is to evaluate the performance of the PCA3 test in predicting results of initial prostate biopsies and to determine whether its incorporation into specific nomograms reinforces its diagnostic value. A prospective study included 601 consecutive patients addressed for initial prostate biopsy. The PCA3 test was performed before ≥12-core initial prostate biopsy, along with standard risk factor assessment. Diagnostic performance of the PCA3 test was evaluated. The three available nomograms (Hansen's and Chun's nomograms, as well as the updated Prostate Cancer Prevention Trial risk calculator, PCPT) were applied to the cohort, and their predictive accuracies were assessed in terms of biopsy outcome: the presence of any prostate cancer (PCa) and high-grade prostate cancer (HGPCa). The PCA3 score provided significant predictive accuracy. While the PCPT risk calculator appeared less accurate, both Chun's and Hansen's nomograms provided good calibration and high net benefit on decision curve analyses. When applying nomogram-derived PCa probability thresholds ≤30%, ≤6% of HGPCa would have been missed, while avoiding up to 48% of unnecessary biopsies. The urinary PCA3 test and PCA3-incorporating nomograms can be considered as reliable tools to aid in the initial biopsy decision.

  17. Theoretical analysis of the PCA experiment

    International Nuclear Information System (INIS)

    Minsart, G.

    1980-01-01

    A very brief description of the PCA-PVF facility is given, and the studied configurations are mentioned. The analysis of the experiment has been divided into two parts: study of the fission density distribution across the PCA core, and neutronic analysis of the flux spectra and spatial distributions in the whole facility. For both parts, the procedure of calculation is explained: cross-section sets, one- and two-dimensional models, group collapsing, choice of bucklings, etc. The obtained results are briefly compared with the measured values and illustrated by a figure and several tables. The computations of the fission map in the PCA core yield results in good agreement with the experimental ones (within a few percent for nearly all points). The discrepancies observed for relative reaction rates and spectral indices of a series of threshold detectors at the selected locations in and between steel and iron layers in the water reflector are briefly discussed.

  18. PEM-PCA: A Parallel Expectation-Maximization PCA Face Recognition Architecture

    Directory of Open Access Journals (Sweden)

    Kanokmon Rujirakul

    2014-01-01

    Principal component analysis, or PCA, has traditionally been used as one of the feature extraction techniques in face recognition systems, yielding high accuracy when requiring a small number of features. However, the covariance matrix and eigenvalue decomposition stages cause high computational complexity, especially for a large database. Thus, this research presents an alternative approach utilizing an Expectation-Maximization algorithm to reduce the determinant matrix manipulation, resulting in a reduction of the stages' complexity. To improve the computational time, a novel parallel architecture was employed to utilize the benefits of parallelization of matrix computation during the feature extraction and classification stages, including parallel preprocessing, and their combinations, a so-called Parallel Expectation-Maximization PCA architecture. Compared to a traditional PCA and its derivatives, the results indicate lower complexity with an insignificant difference in recognition precision, leading to high-speed face recognition systems, that is, speed-ups of over nine and three times relative to PCA and Parallel PCA, respectively.

  19. PEM-PCA: a parallel expectation-maximization PCA face recognition architecture.

    Science.gov (United States)

    Rujirakul, Kanokmon; So-In, Chakchai; Arnonkijpanich, Banchar

    2014-01-01

    Principal component analysis, or PCA, has traditionally been used as one of the feature extraction techniques in face recognition systems, yielding high accuracy when requiring a small number of features. However, the covariance matrix and eigenvalue decomposition stages cause high computational complexity, especially for a large database. Thus, this research presents an alternative approach utilizing an Expectation-Maximization algorithm to reduce the determinant matrix manipulation, resulting in a reduction of the stages' complexity. To improve the computational time, a novel parallel architecture was employed to utilize the benefits of parallelization of matrix computation during the feature extraction and classification stages, including parallel preprocessing, and their combinations, a so-called Parallel Expectation-Maximization PCA architecture. Compared to a traditional PCA and its derivatives, the results indicate lower complexity with an insignificant difference in recognition precision, leading to high-speed face recognition systems, that is, speed-ups of over nine and three times relative to PCA and Parallel PCA, respectively.
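
    A minimal EM-for-PCA iteration (in the spirit of expectation-maximization PCA) illustrating how the explicit covariance matrix and its eigendecomposition can be avoided. This is a generic, sequential sketch under my own assumptions, not the paper's parallel architecture, and the toy "face" data are random.

```python
import numpy as np

def em_pca(X, n_components, n_iter=100, seed=0):
    """Returns an orthonormal basis of the top principal subspace of X (rows = samples)."""
    Xc = X - X.mean(axis=0)
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_components))   # (features, components)
    for _ in range(n_iter):
        Z = np.linalg.solve(W.T @ W, W.T @ Xc.T)           # E-step: latent coordinates
        W = Xc.T @ Z.T @ np.linalg.inv(Z @ Z.T)            # M-step: update the basis
    return np.linalg.qr(W)[0]                              # orthonormal subspace basis

faces = np.random.default_rng(4).standard_normal((50, 400))  # toy flattened "face" vectors
print(em_pca(faces, 8).shape)    # (400, 8)
```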

  20. Constraining magma physical properties and its temporal evolution from InSAR and topographic data only: a physics-based eruption model for the effusive phase of the Cordon Caulle 2011-2012 rhyodacitic eruption

    Science.gov (United States)

    Delgado, F.; Kubanek, J.; Anderson, K. R.; Lundgren, P.; Pritchard, M. E.

    2017-12-01

    The 2011-2012 eruption of Cordón Caulle volcano in Chile is the best scientifically observed rhyodacitic eruption and is thus a key place to understand the dynamics of these rare but powerful explosive rhyodacitic eruptions. Because the volatile phase controls both the eruption temporal evolution and the eruptive style, either explosive or effusive, it is important to constrain the physical parameters that drive these eruptions. The eruption began explosively and after two weeks evolved into a hybrid explosive - lava flow effusion whose volume-time evolution we constrain with a series of TanDEM-X Digital Elevation Models. Our data show the intrusion of a large-volume laccolith or cryptodome during the first 2.5 months of the eruption and lava flow effusion only afterwards, with a total volume of 1.4 km³. InSAR data from the ENVISAT and TerraSAR-X missions shows more than 2 m of subsidence during the effusive eruption phase produced by deflation of a finite spheroidal source at a depth of 5 km. In order to constrain the magma total H2O content, crystal cargo, and reservoir pressure drop we numerically solve the coupled set of equations of a pressurized magma reservoir, magma conduit flow and time dependent density, volatile exsolution and viscosity that we use to invert the InSAR and topographic data time series. We compare the best-fit model parameters with independent estimates of magma viscosity and total gas content measured from lava samples. Preliminary modeling shows that although it is not possible to model both the InSAR and the topographic data during the onset of the laccolith emplacement, it is possible to constrain the magma H2O and crystal content, to 4 wt% and 30%, which agree well with published literature values.

  1. A g-factor metric for k-t SENSE and k-t PCA based parallel imaging.

    Science.gov (United States)

    Binter, Christian; Ramb, Rebecca; Jung, Bernd; Kozerke, Sebastian

    2016-02-01

    To propose and validate a g-factor formalism for k-t SENSE, k-t PCA and related k-t methods for assessing SNR and temporal fidelity. An analytical gxf-factor formulation in the spatiotemporal frequency domain is derived, enabling assessment of noise and depiction fidelity in both the spatial and frequency domains. Using pseudoreplica analysis of cardiac cine data, the gxf-factor description is validated and example data are used to analyze the performance of k-t methods for various parameter settings. Analytical gxf-factor maps were found to agree well with pseudoreplica analysis for 3x, 5x, and 7x k-t SENSE and k-t PCA. While k-t SENSE resulted in lower average gxf values (gx,avg) in static regions when compared with k-t PCA, k-t PCA yielded lower gx,avg values in dynamic regions. Temporal transfer was better preserved with k-t PCA for increasing undersampling factors. The proposed gxf-factor and temporal transfer formalism allows assessing noise performance and temporal depiction fidelity of k-t methods including k-t SENSE and k-t PCA. The framework enables quantitative comparison of different k-t methods relative to frame-by-frame parallel imaging reconstruction. © 2015 Wiley Periodicals, Inc.
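
    A hedged sketch of the pseudoreplica analysis mentioned above: the same k-space data are reconstructed many times with independent synthetic noise added, and the pixel-wise noise of the accelerated reconstruction is compared with that of an unaccelerated reference. The `reconstruct` callables are placeholders for any k-t method (e.g. k-t PCA); the toy check below uses a plain inverse FFT.

```python
import numpy as np

def pseudo_replica_noise(kspace, reconstruct, noise_std, n_rep=64, seed=0):
    """Pixel-wise noise SD of a reconstruction, estimated from synthetic noise replicas."""
    rng = np.random.default_rng(seed)
    reps = [reconstruct(kspace + noise_std * (rng.standard_normal(kspace.shape)
                                              + 1j * rng.standard_normal(kspace.shape)))
            for _ in range(n_rep)]
    return np.std(np.stack(reps), axis=0)

def g_factor_map(kspace, recon_accel, recon_full, noise_std, R, n_rep=64):
    """g = noise(accelerated recon) / (noise(unaccelerated recon) * sqrt(R))."""
    sd_a = pseudo_replica_noise(kspace, recon_accel, noise_std, n_rep, seed=0)
    sd_f = pseudo_replica_noise(kspace, recon_full, noise_std, n_rep, seed=1)
    return sd_a / (sd_f * np.sqrt(R))

# Toy check with a plain inverse-FFT "reconstruction" and R = 1: g should be close to 1.
kspace = np.fft.fft2(np.random.default_rng(2).standard_normal((32, 32)))
print(np.round(np.median(g_factor_map(kspace, np.fft.ifft2, np.fft.ifft2,
                                      noise_std=1.0, R=1)), 2))
```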

  2. MD-11 PCA - First Landing at Edwards

    Science.gov (United States)

    1995-01-01

    This McDonnell Douglas MD-11 approaches the first landing ever of a transport aircraft under engine power only on Aug. 29, 1995, at NASA's Dryden Flight Research Center, Edwards, California. The milestone flight, flown by NASA research pilot and former astronaut Gordon Fullerton, was part of a NASA project to develop a computer-assisted engine control system that enables a pilot to land a plane safely when its normal control surfaces are disabled. The Propulsion-Controlled Aircraft (PCA) system uses standard autopilot controls already present in the cockpit, together with the new programming in the aircraft's flight control computers. The PCA concept is simple: for pitch control, the program increases thrust to climb and reduces thrust to descend. To turn right, the autopilot increases the left engine thrust while decreasing the right engine thrust. The initial Propulsion-Controlled Aircraft studies by NASA were carried out at Dryden with a modified twin-engine F-15 research aircraft.

  3. MD-11 PCA - Research flight team egress

    Science.gov (United States)

    1995-01-01

    This McDonnell Douglas MD-11 has parked on the flightline at NASA's Dryden Flight Research Center, Edwards, California, following its completion of the first and second landings ever performed by a transport aircraft under engine power only (on Aug. 29, 1995). The milestone flight, with NASA research pilot and former astronaut Gordon Fullerton at the controls, was part of a NASA project to develop a computer-assisted engine control system that enables a pilot to land a plane safely when its normal control surfaces are disabled. Coming down the steps from the aircraft are Gordon Fullerton (in front), followed by Bill Burcham, Propulsion Controlled Aircraft (PCA) project engineer at Dryden; NASA Dryden controls engineer John Burken; John Feather of McDonnell Douglas; and Drew Pappas, McDonnell Douglas' project manager for PCA.

  4. Improvements to the RXTE/PCA Calibration

    Science.gov (United States)

    Jahoda, K.

    2009-01-01

    The author presents the current status of the RXTE/PCA Calibration, with emphasis on recent updates to the energy scale and the background subtraction. A new treatment of the Xenon K-escape line removes the largest remaining residual in the previously distributed matrices. Observations of Sco X-1 made simultaneously with Swift XRT, expressly for the purpose of cross calibrating the response to bright sources, are presented.

  5. Constrained consequence

    CSIR Research Space (South Africa)

    Britz, K

    2011-09-01

    Full Text Available their basic properties and relationship. In Section 3 we present a modal instance of these constructions which also illustrates with an example how to reason abductively with constrained entailment in a causal or action oriented context. In Section 4 we... of models with the former approach, whereas in Section 3.3 we give an example illustrating ways in which C can be defined with both. Here we employ the following versions of local consequence: Definition 3.4. Given a model M = ⟨W;R;V⟩ and formulas...

  6. Sequential combination of k-t principle component analysis (PCA) and partial parallel imaging: k-t PCA GROWL.

    Science.gov (United States)

    Qi, Haikun; Huang, Feng; Zhou, Hongmei; Chen, Huijun

    2017-03-01

    k-t principal component analysis (k-t PCA) is a distinguished method for high spatiotemporal resolution dynamic MRI. To further improve the accuracy of k-t PCA, a combination with partial parallel imaging (PPI), k-t PCA/SENSE, has been tested. However, k-t PCA/SENSE suffers from long reconstruction time and limited improvement. This study aims to improve the combination of k-t PCA and PPI on both reconstruction speed and accuracy. A sequential combination scheme called k-t PCA GROWL (GRAPPA operator for wider readout line) was proposed. The GRAPPA operator was performed before k-t PCA to extend each readout line into a wider band, which improved the condition of the encoding matrix in the following k-t PCA reconstruction. k-t PCA GROWL was tested and compared with k-t PCA and k-t PCA/SENSE on cardiac imaging. k-t PCA GROWL consistently resulted in better image quality compared with k-t PCA/SENSE at high acceleration factors for both retrospectively and prospectively undersampled cardiac imaging, with a much lower computation cost. The improvement in image quality became greater with the increase of acceleration factor. By sequentially combining the GRAPPA operator and k-t PCA, the proposed k-t PCA GROWL method outperformed k-t PCA/SENSE in both reconstruction speed and accuracy, suggesting that k-t PCA GROWL is a better combination scheme than k-t PCA/SENSE. Magn Reson Med 77:1058-1067, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
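
    A rough sketch of the core GRAPPA-operator idea behind the readout widening described above, under simplifying assumptions (a single coil-mixing operator fitted by least squares from calibration data, with its inverse reused for the opposite shift); it is not the k-t PCA GROWL implementation, and all names and toy data are made up.

```python
import numpy as np

def fit_grappa_operator(acs, shift=1):
    """Fit a coil-mixing operator G such that acs[:, ky + shift, :] ~ G @ acs[:, ky, :].

    acs: calibration k-space of shape (n_coils, n_ky, n_kx). Returns G (n_coils x n_coils),
    obtained by least squares over all calibration source/target pairs.
    """
    nc = acs.shape[0]
    src = acs[:, :-shift, :].reshape(nc, -1)      # source points
    tgt = acs[:, shift:, :].reshape(nc, -1)       # points one phase-encode step away
    G_t, *_ = np.linalg.lstsq(src.T, tgt.T, rcond=None)   # solves src.T @ G.T ~ tgt.T
    return G_t.T

def widen_readout_line(line, G, n_steps=2):
    """Extrapolate one acquired readout line (n_coils, n_kx) into a band of
    2*n_steps + 1 lines by applying G upwards and, as a simplification, its
    inverse downwards; a real implementation would fit a separate operator
    for the opposite shift."""
    up, down = [line], [line]
    G_inv = np.linalg.inv(G)
    for _ in range(n_steps):
        up.append(G @ up[-1])
        down.append(G_inv @ down[-1])
    return np.stack(down[:0:-1] + up, axis=1)     # (n_coils, 2*n_steps + 1, n_kx)

# Toy usage with random "calibration" data (8 coils, 24 ACS lines, 64 readout points)
rng = np.random.default_rng(0)
acs = rng.standard_normal((8, 24, 64)) + 1j * rng.standard_normal((8, 24, 64))
G = fit_grappa_operator(acs)
band = widen_readout_line(acs[:, 12, :], G, n_steps=2)
print(band.shape)    # (8, 5, 64)
```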

  7. 24 CFR 401.451 - PAE Physical Condition Analysis (PCA).

    Science.gov (United States)

    2010-04-01

    ... (PCA). 401.451 Section 401.451 Housing and Urban Development Regulations Relating to Housing and Urban... PROGRAM (MARK-TO-MARKET) Restructuring Plan § 401.451 PAE Physical Condition Analysis (PCA). (a) Review... of the project by means of a PCA. If the PAE finds any immediate threats to health and safety, the...

  8. Grassmann Averages for Scalable Robust PCA

    DEFF Research Database (Denmark)

    Hauberg, Søren; Feragen, Aasa; Black, Michael J.

    2014-01-01

    As the collection of large datasets becomes increasingly automated, the occurrence of outliers will increase—“big data” implies “big outliers”. While principal component analysis (PCA) is often used to reduce the size of data, and scalable solutions exist, it is well-known that outliers can...... to vectors (subspaces) or elements of vectors; we focus on the latter and use a trimmed average. The resulting Trimmed Grassmann Average (TGA) is particularly appropriate for computer vision because it is robust to pixel outliers. The algorithm has low computational complexity and minimal memory requirements...

  9. Preliminary PCA/TT Results on MRO CRISM Multispectral Images

    Science.gov (United States)

    Klassen, David R.; Smith, M. D.

    2010-10-01

    Mars Reconnaissance Orbiter arrived at Mars in March 2006 and by September had achieved its science-phase orbit with the Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) beginning its visible to near-infrared (VIS/NIR) spectral imaging shortly thereafter. One goal of CRISM is to fill in the spatial gaps between the various targeted observations, eventually mapping the entire surface. Due to the large volume of data this would create, the instrument works in a reduced spectral sampling mode creating "multispectral" images. From these data we can create image cubes using 64 wavelengths from 0.410 to 3.923 µm. We present here our analysis of these multispectral mode data products using Principal Components Analysis (PCA) and Target Transformation (TT) [1]. Previous work with ground-based images [2-5] has shown that over an entire visible hemisphere, there are only three to four meaningful components using 32-105 wavelengths over 1.5-4.1 µm; the first two are consistent over all temporal scales. The TT retrieved spectral endmembers show nearly the same level of consistency [5]. The preliminary work on the CRISM image cubes implies similar results; three to four significant principal components that are fairly consistent over time. These components are then used in TT to find spectral endmembers which can be used to characterize the surface reflectance for future use in radiative transfer cloud optical depth retrievals. We present here the PCA/TT results comparing the principal components and recovered endmembers from six reconstructed CRISM multi-spectral image cubes. References: [1] Bandfield, J. L., et al. (2000) JGR, 105, 9573. [2] Klassen, D. R. and Bell III, J. F. (2001) BAAS, 33, 1069. [3] Klassen, D. R. and Bell III, J. F. (2003) BAAS, 35, 936. [4] Klassen, D. R., Wark, T. J., Cugliotta, C. G. (2005) BAAS, 37, 693. [5] Klassen, D. R. (2009) Icarus, 204, 32.
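
    A small sketch, with synthetic data, of the PCA plus target-transformation step described above: build a principal-component basis from the spectral cube, then test whether a trial spectrum can be fit by that basis. This follows the general approach of [1] rather than the authors' exact pipeline; all names and data are placeholders.

```python
import numpy as np

def pca_basis(spectra, n_components=4):
    """PCA of a (n_pixels, n_wavelengths) spectral cube; returns the mean and top PCs."""
    mean = spectra.mean(axis=0)
    centered = spectra - mean
    # Rows of vt are principal-component spectra (eigenvectors over wavelength)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_components]

def target_transform(trial_spectrum, mean, pcs):
    """Fit a trial/library spectrum with the PC subspace (least squares) and report
    the modeled spectrum and RMS misfit; a small misfit suggests the trial spectrum
    is a viable endmember of the scene."""
    coeffs, *_ = np.linalg.lstsq(pcs.T, trial_spectrum - mean, rcond=None)
    model = mean + pcs.T @ coeffs
    rms = np.sqrt(np.mean((model - trial_spectrum) ** 2))
    return model, rms

# Toy usage with synthetic spectra (64 wavelengths, as in the CRISM cubes above)
rng = np.random.default_rng(1)
cube = rng.random((500, 64))
mean, pcs = pca_basis(cube, n_components=4)
model, rms = target_transform(rng.random(64), mean, pcs)
print(rms)
```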

  10. PCA: Principal Component Analysis for spectra modeling

    Science.gov (United States)

    Hurley, Peter D.; Oliver, Seb; Farrah, Duncan; Wang, Lingyu; Efstathiou, Andreas

    2012-07-01

    The mid-infrared spectra of ultraluminous infrared galaxies (ULIRGs) contain a variety of spectral features that can be used as diagnostics to characterize the spectra. However, such diagnostics are biased by our prior prejudices on the origin of the features. Moreover, by using only part of the spectrum they do not utilize the full information content of the spectra. Blind statistical techniques such as principal component analysis (PCA) consider the whole spectrum, find correlated features and separate them out into distinct components. This code, written in IDL, uses the principal components of IRS spectra to define a new classification scheme based on 5D Gaussian mixture modelling. The five PCs and the average spectra for the four classifications used to classify objects are made available with the code.
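
    A rough Python analogue of the workflow described above (the released code itself is IDL): project spectra onto five principal components and model the 5-D PC space with a Gaussian mixture. The data and component counts here are placeholders.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

# Toy spectra standing in for mid-infrared IRS spectra (n_objects x n_channels)
rng = np.random.default_rng(0)
spectra = rng.random((200, 180))

# Project onto the first five principal components, then model the 5-D PC space
# with a four-component Gaussian mixture to obtain a data-driven classification
pcs = PCA(n_components=5).fit_transform(spectra)
gmm = GaussianMixture(n_components=4, random_state=0).fit(pcs)
labels = gmm.predict(pcs)
print(np.bincount(labels))
```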

  11. Nonlinear peculiar-velocity analysis and PCA

    Energy Technology Data Exchange (ETDEWEB)

    Dekel, A. [and others

    2001-02-20

    We allow for nonlinear effects in the likelihood analysis of peculiar velocities, and obtain {approximately}35%-lower values for the cosmological density parameter and for the amplitude of mass-density fluctuations. The power spectrum in the linear regime is assumed to be of the flat {Lambda}CDM model (h = 0:65, n = 1) with only {Omega}{sub m} free. Since the likelihood is driven by the nonlinear regime, we break the power spectrum at k{sub b} {approximately} 0.2 (h{sup {minus}1} Mpc){sup {minus}1} and fit a two-parameter power-law at k > k{sub b} . This allows for an unbiased fit in the linear regime. Tests using improved mock catalogs demonstrate a reduced bias and a better fit. We find for the Mark III and SFI data {Omega}{sub m} = 0.35 {+-} 0.09 with {sigma}{sub 8}{Omega}P{sub m}{sup 0.6} = 0.55 {+-} 0.10 (90% errors). When allowing deviations from {Lambda}CDM, we find an indication for a wiggle in the power spectrum in the form of an excess near k {approximately} 0.05 and a deficiency at k {approximately} 0.1 (h{sup {minus}1} Mpc){sup {minus}1}--a cold flow which may be related to a feature indicated from redshift surveys and the second peak in the CMB anisotropy. A {chi}{sup 2} test applied to principal modes demonstrates that the nonlinear procedure improves the goodness of fit. The Principal Component Analysis (PCA) helps identifying spatial features of the data and fine-tuning the theoretical and error models. We address the potential for optimal data compression using PCA.

  12. Beyond textbook neuroanatomy: The syndrome of malignant PCA infarction.

    Science.gov (United States)

    Gogela, Steven L; Gozal, Yair M; Rahme, Ralph; Zuccarello, Mario; Ringer, Andrew J

    2015-01-01

    Given its limited vascular territory, occlusion of the posterior cerebral artery (PCA) usually does not result in malignant infarction. Challenging this concept, we present 3 cases of unilateral PCA infarction with secondary malignant progression, resulting from extension into what would classically be considered the posterior middle cerebral artery (MCA) territory. Interestingly, these were true PCA infarctions, not "MCA plus" strokes, since the underlying occlusive lesion was in the PCA. We hypothesize that congenital and/or acquired variability in the distribution and extent of territory supplied by the PCA may underlie this rare clinical entity. Patients with a PCA infarction should thus be followed closely and offered early surgical decompression in the event of malignant progression.

  13. Nonlinear PCA: characterizing interactions between modes of brain activity.

    OpenAIRE

    Friston, K; Phillips, J; Chawla, D; Büchel, C

    2000-01-01

    This paper presents a nonlinear principal component analysis (PCA) that identifies underlying sources causing the expression of spatial modes or patterns of activity in neuroimaging time-series. The critical aspect of this technique is that, in relation to conventional PCA, the sources can interact to produce (second-order) spatial modes that represent the modulation of one (first-order) spatial mode by another. This nonlinear PCA uses a simple neural network architecture that embodies a spec...

  14. SVD vs PCA: Comparison of Performance in an Imaging Spectrometer

    Directory of Open Access Journals (Sweden)

    Wilma Oblefias

    2004-12-01

    Full Text Available The calculation of basis spectra from a spectral library is an important prerequisite of any compact imaging spectrometer. In this paper, we compare the basis spectra computed by singular-value decomposition (SVD) and principal component analysis (PCA) in terms of estimation performance with respect to resolution, presence of noise, intensity variation, and quantization error. Results show that SVD is robust to intensity variation while PCA is not. However, PCA performs better with signals of low signal-to-noise ratio. No significant difference is seen between SVD and PCA in terms of resolution and quantization error.
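
    A minimal sketch of the formal difference between the two bases compared above: SVD of the raw data versus PCA, i.e. SVD of the mean-centered data, which is what makes PCA sensitive to overall intensity offsets. The data here are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((120, 40)) * 5.0          # spectral library: 120 spectra x 40 channels

# SVD basis of the raw data (no centering)
_, _, vt_svd = np.linalg.svd(X, full_matrices=False)

# PCA basis: SVD of the mean-centered data (equivalently, eigenvectors of the
# covariance matrix); the centering is the only structural difference
Xc = X - X.mean(axis=0)
_, _, vt_pca = np.linalg.svd(Xc, full_matrices=False)

# Compare the leading basis spectra; with a large common offset they differ markedly
print(np.abs(vt_svd[0] @ vt_pca[0]))     # |cosine| between first SVD and PCA vectors
```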

  15. Preliminary Design Review: PCA Integrated Radar-Tracker Application

    National Research Council Canada - National Science Library

    Lebak, J

    2002-01-01

    The DARPA Polymorphous Computing Architecture (PCA) program is building advanced computer architectures that can reorganize their computation and communication structure to achieve better overall application performance...

  16. Denoising by semi-supervised kernel PCA preimaging

    DEFF Research Database (Denmark)

    Hansen, Toke Jansen; Abrahamsen, Trine Julie; Hansen, Lars Kai

    2014-01-01

    Kernel Principal Component Analysis (PCA) has proven a powerful tool for nonlinear feature extraction, and is often applied as a pre-processing step for classification algorithms. In denoising applications Kernel PCA provides the basis for dimensionality reduction, prior to the so-called pre-imaging...
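
    A short sketch of kernel PCA denoising with pre-images, using scikit-learn's KernelPCA, whose inverse_transform learns the pre-image map by ridge regression; this illustrates the general denoising setup, not the semi-supervised scheme proposed in the record, and all parameters are arbitrary.

```python
import numpy as np
from sklearn.decomposition import KernelPCA

# Noisy toy data: points on a circle plus Gaussian noise
rng = np.random.default_rng(0)
t = rng.uniform(0, 2 * np.pi, 400)
X = np.c_[np.cos(t), np.sin(t)] + 0.15 * rng.standard_normal((400, 2))

# Kernel PCA with a learned inverse map; projecting onto a few kernel PCs and
# mapping back to input space yields approximate pre-images, i.e. denoised points
kpca = KernelPCA(n_components=4, kernel="rbf", gamma=2.0,
                 fit_inverse_transform=True, alpha=0.1)
Z = kpca.fit_transform(X)
X_denoised = kpca.inverse_transform(Z)
print(np.mean(np.linalg.norm(X_denoised - X, axis=1)))
```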

  17. Condition Monitoring of Sensors in a NPP Using Optimized PCA

    Directory of Open Access Journals (Sweden)

    Wei Li

    2018-01-01

    Full Text Available An optimized principal component analysis (PCA framework is proposed to implement condition monitoring for sensors in a nuclear power plant (NPP in this paper. Compared with the common PCA method in previous research, the PCA method in this paper is optimized at different modeling procedures, including data preprocessing stage, modeling parameter selection stage, and fault detection and isolation stage. Then, the model’s performance is greatly improved through these optimizations. Finally, sensor measurements from a real NPP are used to train the optimized PCA model in order to guarantee the credibility and reliability of the simulation results. Meanwhile, artificial faults are sequentially imposed to sensor measurements to estimate the fault detection and isolation ability of the proposed PCA model. Simulation results show that the optimized PCA model is capable of detecting and isolating the sensors regardless of whether they exhibit major or small failures. Meanwhile, the quantitative evaluation results also indicate that better performance can be obtained in the optimized PCA method compared with the common PCA method.
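
    A generic sketch of PCA-based sensor monitoring with Hotelling's T2 and SPE (Q) statistics, which is the common core of such schemes; the specific optimizations described above (preprocessing, parameter selection, isolation logic) are not reproduced, and all data are synthetic.

```python
import numpy as np

def fit_pca_monitor(X_train, n_components):
    """Fit a PCA monitoring model on normal-operation sensor data."""
    mean, std = X_train.mean(axis=0), X_train.std(axis=0)
    Z = (X_train - mean) / std
    cov = np.cov(Z, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    P = eigvecs[:, :n_components]                 # retained loadings
    lam = eigvals[:n_components]                  # retained variances
    return dict(mean=mean, std=std, P=P, lam=lam)

def monitor(model, x):
    """Return the T^2 and SPE (Q) statistics for one new sensor sample."""
    z = (x - model["mean"]) / model["std"]
    t = model["P"].T @ z                          # scores in the PC subspace
    t2 = np.sum(t ** 2 / model["lam"])            # Hotelling's T^2
    resid = z - model["P"] @ t                    # residual outside the subspace
    spe = resid @ resid                           # squared prediction error
    return t2, spe

# Toy usage: train on "normal" data, then test a sample with a sensor offset fault
rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 8)) @ rng.standard_normal((8, 8))
model = fit_pca_monitor(X, n_components=3)
faulty = X[0].copy()
faulty[2] += 15.0                                 # artificial offset on sensor 2
print(monitor(model, X[1]), monitor(model, faulty))
```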

  18. Gas-Chromatographic Determination Of Water In Freon PCA

    Science.gov (United States)

    Melton, Donald M.

    1994-01-01

    Gas-chromatographic apparatus measures small concentrations of water in specimens of Freon PCA. Testing by use of the apparatus is faster and provides greater protection against accidental contamination of specimens by water in the testing environment. Automated for unattended operation. Also used to measure water contents of materials other than Freon PCA. Innovation extended to development of purgeable sampling accessory for gas chromatographs.

  19. Decoupled ARX and RBF Neural Network Modeling Using PCA and GA Optimization for Nonlinear Distributed Parameter Systems.

    Science.gov (United States)

    Zhang, Ridong; Tao, Jili; Lu, Renquan; Jin, Qibing

    2018-02-01

    Modeling of distributed parameter systems is difficult because of their nonlinearity and infinite-dimensional characteristics. Based on principal component analysis (PCA), a hybrid modeling strategy that consists of a decoupled linear autoregressive exogenous (ARX) model and a nonlinear radial basis function (RBF) neural network model is proposed. The spatial-temporal output is first divided into a few dominant spatial basis functions and finite-dimensional temporal series by PCA. Then, a decoupled ARX model is designed to model the linear dynamics of the dominant modes of the time series. The nonlinear residual part is subsequently parameterized by RBFs, where a genetic algorithm is utilized to optimize their hidden layer structure and the parameters. Finally, the nonlinear spatial-temporal dynamic system is obtained after the time/space reconstruction. Simulation results of a catalytic rod and a heat conduction equation demonstrate the effectiveness of the proposed strategy compared to several other methods.
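
    A small sketch of the first half of the strategy described above: PCA of a spatio-temporal snapshot matrix into spatial basis functions and temporal coefficients, followed by a simple least-squares AR fit to each coefficient series. The RBF residual model and GA tuning are omitted, and the field is synthetic.

```python
import numpy as np

# Toy spatio-temporal snapshot matrix: n_time x n_space (e.g. a temperature field)
rng = np.random.default_rng(0)
x_grid = np.linspace(0, 1, 100)
t_grid = np.arange(400)
field = (np.outer(np.sin(0.1 * t_grid), np.sin(np.pi * x_grid))
         + 0.3 * np.outer(np.cos(0.05 * t_grid), np.sin(2 * np.pi * x_grid))
         + 0.01 * rng.standard_normal((400, 100)))

# PCA: dominant spatial basis functions and their temporal coefficient series
mean = field.mean(axis=0)
_, _, Vt = np.linalg.svd(field - mean, full_matrices=False)
n_modes = 2
phi = Vt[:n_modes]                 # spatial basis functions
a = (field - mean) @ phi.T         # temporal coefficients, shape (n_time, n_modes)

# Fit a simple AR(2) model to each temporal coefficient by least squares
# (standing in for the decoupled ARX part of the hybrid model)
for k in range(n_modes):
    y = a[2:, k]
    A = np.column_stack([a[1:-1, k], a[:-2, k]])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    print(f"mode {k}: AR(2) coefficients {coef}")

# Reconstruction of the field from the low-dimensional representation
recon = mean + a @ phi
print("reconstruction error:", np.abs(recon - field).max())
```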

  20. Comparative study of PCA in classification of multichannel EMG signals.

    Science.gov (United States)

    Geethanjali, P

    2015-06-01

    Electromyographic (EMG) signals are abundantly used in the field of rehabilitation engineering for controlling prosthetic devices, and it is essential to find a fast and accurate EMG pattern recognition system to avoid intrusive delay. The main objective of this paper is to study the influence of principal component analysis (PCA), a transformation technique, in pattern recognition of six hand movements using four-channel surface EMG signals from ten healthy subjects. For this reason, time domain (TD) statistical as well as auto regression (AR) coefficients are extracted from the four-channel EMG signals. The extracted statistical features as well as AR coefficients are transformed using PCA to 25, 50 and 75 % of the corresponding original feature vector space. The classification accuracy of PCA transformed and non-PCA transformed TD statistical features as well as AR coefficients are studied with simple logistic regression (SLR), decision tree (DT) with J48 algorithm, logistic model tree (LMT), k nearest neighbor (kNN) and neural network (NN) classifiers in the identification of six different movements. The Kruskal-Wallis (KW) statistical test shows that there is a significant reduction (P < 0.05) in the classification accuracy of PCA transformed features compared to non-PCA transformed features. SLR with non-PCA transformed time domain (TD) statistical features performs better in accuracy and computational power compared to other features considered in this study. In addition, the motion control of three drives for six movements of the hand is implemented with SLR using TD statistical features off-line with a TMSLF2407 digital signal controller (DSC).

  1. PRINCIPAL COMPONENT ANALYSIS (PCA) AND ITS APPLICATION WITH SPSS

    Directory of Open Access Journals (Sweden)

    Hermita Bus Umar

    2009-03-01

    Full Text Available PCA (Principal Component Analysis) is a statistical technique applied to a single set of variables when the researcher is interested in discovering which variables in the set form coherent subsets that are relatively independent of one another. Variables that are correlated with one another but largely independent of other subsets of variables are combined into factors. The goal of PCA is to determine the extent to which each variable is explained by each dimension. Steps in PCA include selecting and measuring a set of variables, preparing the correlation matrix, extracting a set of factors from the correlation matrix, rotating the factors to increase interpretability, and interpreting the results.

  2. Stability and chaos of LMSER PCA learning algorithm

    International Nuclear Information System (INIS)

    Lv Jiancheng; Y, Zhang

    2007-01-01

    LMSER PCA algorithm is a principal components analysis algorithm. It is used to extract principal components on-line from input data. The algorithm has both stability and chaotic dynamic behavior under some conditions. This paper studies the local stability of the LMSER PCA algorithm via a corresponding deterministic discrete time system. Conditions for local stability are derived. The paper also explores the chaotic behavior of this algorithm. It shows that the LMSER PCA algorithm can produce chaos. Waveform plots, Lyapunov exponents and bifurcation diagrams are presented to illustrate the existence of chaotic behavior of this algorithm

  3. Elevated YKL40 is associated with advanced prostate cancer (PCa) and positively regulates invasion and migration of PCa cells.

    Science.gov (United States)

    Jeet, Varinder; Tevz, Gregor; Lehman, Melanie; Hollier, Brett; Nelson, Colleen

    2014-10-01

    Chitinase 3-like 1 (CHI3L1 or YKL40) is a secreted glycoprotein highly expressed in tumours from patients with advanced stage cancers, including prostate cancer (PCa). The exact function of YKL40 is poorly understood, but it has been shown to play an important role in promoting tumour angiogenesis and metastasis. The therapeutic value and biological function of YKL40 are unknown in PCa. The objective of this study was to examine the expression and function of YKL40 in PCa. Gene expression analysis demonstrated that YKL40 was highly expressed in metastatic PCa cells when compared with less invasive and normal prostate epithelial cell lines. In addition, the expression was primarily limited to androgen receptor-positive cell lines. Evaluation of YKL40 tissue expression in PCa patients showed a progressive increase in patients with aggressive disease when compared with those with less aggressive cancers and normal controls. Treatment of LNCaP and C4-2B cells with androgens increased YKL40 expression, whereas treatment with an anti-androgen agent decreased the gene expression of YKL40 in androgen-sensitive LNCaP cells. Furthermore, knockdown of YKL40 significantly decreased invasion and migration of PCa cells, whereas overexpression rendered them more invasive and migratory, which was commensurate with an enhancement in the anchorage-independent growth of cells. To our knowledge, this study characterises the role of YKL40 for the first time in PCa. Together, these results suggest that YKL40 plays an important role in PCa progression and thus inhibition of YKL40 may be a potential therapeutic strategy for the treatment of PCa. © 2014 The authors.

  4. Effect of Spatial Alignment Transformations in PCA and ICA of Functional Neuroimages

    DEFF Research Database (Denmark)

    Lukic, Ana S.; Wernick, Miles N.; Yang, Yongui

    2007-01-01

    It has been previously observed that spatial independent component analysis (ICA), if applied to data pooled in a particular way, may lessen the need for spatial alignment of scans in a functional neuroimaging study. In this paper we seek to determine analytically the conditions under which... this observation is true, not only for spatial ICA, but also for temporal ICA and for principal component analysis (PCA). In each case we find conditions that the spatial alignment operator must satisfy to ensure invariance of the results. We illustrate our findings using functional magnetic-resonance imaging (fMRI)...

  5. Quantitation of passive cutaneous anaphylaxis (PCA) by using radiolabelled antigen

    International Nuclear Information System (INIS)

    Ring, J.; Seifert, J.; Brendel, W.

    1978-01-01

    The major problem of detecting reaginic antibody by passive cutaneous anaphylaxis (PCA) is the quantitation of the dye reaction. Radiolabelled antigen was used in an attempt to quantitate the PCA reaction (Radio-PCA). Antisera containing reaginic antibody against human serum albumin (HSA) were produced in rabbits. These antisera were injected into normal rabbit skin in different dilutions. Twenty-four hours later HSA was injected intravenously either with Evans Blue or as 125-I-HSA. Radioactivity found in antibody-containing skin was significantly higher than in control specimens containing saline or normal rabbit serum, at antiserum dilutions as low as 1:1,000. Compared with the Evans Blue technique, Radio-PCA was able to distinguish quantitatively between different antiserum dilutions at a higher level of statistical significance. (author)

  6. Geochemical Constraints for Mercury's PCA-Derived Geochemical Terranes

    Science.gov (United States)

    Stockstill-Cahill, K. R.; Peplowski, P. N.

    2018-05-01

    PCA-derived geochemical terranes provide a robust, analytical means of defining these terranes using strictly geochemical inputs. Using the end members derived in this way, we are able to assess the geochemical implications for Mercury.

  7. EEG frequency PCA in EEG-ERP dynamics.

    Science.gov (United States)

    Barry, Robert J; De Blasio, Frances M

    2018-05-01

    Principal components analysis (PCA) has long been used to decompose the ERP into components, and these mathematical entities are increasingly accepted as meaningful and useful representatives of the electrophysiological components constituting the ERP. A similar expansion appears to be beginning in regard to decomposition of the EEG amplitude spectrum into frequency components via frequency PCA. However, to date, there has been no exploration of the brain's dynamic EEG-ERP linkages using PCA decomposition to assess components in each measure. Here, we recorded intrinsic EEG in both eyes-closed and eyes-open resting conditions, followed by an equiprobable go/no-go task. Frequency PCA of the EEG, including the nontask resting and within-task prestimulus periods, found seven frequency components within the delta to beta range. These differentially predicted PCA-derived go and no-go N1 and P3 ERP components. This demonstration suggests that it may be beneficial in future brain dynamics studies to implement PCA for the derivation of data-driven components from both the ERP and EEG. © 2017 Society for Psychophysiological Research.

  8. On the Link Between L1-PCA and ICA.

    Science.gov (United States)

    Martin-Clemente, Ruben; Zarzoso, Vicente

    2017-03-01

    Principal component analysis (PCA) based on L1-norm maximization is an emerging technique that has drawn growing interest in the signal processing and machine learning research communities, especially due to its robustness to outliers. The present work proves that L1-norm PCA can perform independent component analysis (ICA) under the whitening assumption. However, when the source probability distributions fulfil certain conditions, the L1-norm criterion needs to be minimized rather than maximized, which can be accomplished by simple modifications on existing optimal algorithms for L1-PCA. If the sources have symmetric distributions, we show in addition that L1-PCA is linked to kurtosis optimization. A number of numerical experiments illustrate the theoretical results and analyze the comparative performance of different algorithms for ICA via L1-PCA. Although our analysis is asymptotic in the sample size, this equivalence opens interesting new perspectives for performing ICA using optimal algorithms for L1-PCA with guaranteed global convergence while inheriting the increased robustness to outliers of the L1-norm criterion.
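
    A brief sketch of one classical fixed-point algorithm for the rank-one L1-PCA criterion mentioned above (maximize the sum of absolute projections); it is shown only to make the criterion concrete and is not the specific optimal L1-PCA algorithms discussed in the record. Data and mixing are synthetic.

```python
import numpy as np

def l1_pca_component(X, n_iter=100, rng=None):
    """First L1-norm principal component: maximize sum_i |w^T x_i| over unit w.

    Uses the simple fixed-point iteration w <- X^T b / ||X^T b|| with b = sign(X w),
    a classical L1-PCA heuristic used here purely for illustration.
    """
    rng = np.random.default_rng(rng)
    w = rng.standard_normal(X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        b = np.sign(X @ w)
        b[b == 0] = 1.0
        w_new = X.T @ b
        w_new /= np.linalg.norm(w_new)
        if np.allclose(w_new, w):
            break
        w = w_new
    return w

# Toy usage: whitened non-Gaussian sources mixed by an orthogonal matrix
# (the whitening assumption under which the record relates L1-PCA to ICA)
rng = np.random.default_rng(0)
S = rng.laplace(size=(2000, 2))
S = (S - S.mean(0)) / S.std(0)
Q, _ = np.linalg.qr(rng.standard_normal((2, 2)))   # orthogonal mixing
X = S @ Q.T
print(l1_pca_component(X))
```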

  9. A PCA3 gene-based transcriptional amplification system targeting primary prostate cancer

    OpenAIRE

    Neveu, Bertrand; Jain, Pallavi; Têtu, Bernard; Wu, Lily; Fradet, Yves; Pouliot, Frédéric

    2015-01-01

    Targeting specifically primary prostate cancer (PCa) cells for immune therapy, gene therapy or molecular imaging is of high importance. The PCA3 long non-coding RNA is a unique PCa biomarker and oncogene that has been widely studied. This gene has been mainly exploited as an accurate diagnostic urine biomarker for PCa detection. In this study, the PCA3 promoter was introduced into a new transcriptional amplification system named the 3-Step Transcriptional Amplification System (PCA3-3STA) and ...

  10. Joint Group Sparse PCA for Compressed Hyperspectral Imaging.

    Science.gov (United States)

    Khan, Zohaib; Shafait, Faisal; Mian, Ajmal

    2015-12-01

    A sparse principal component analysis (PCA) seeks a sparse linear combination of input features (variables), so that the derived features still explain most of the variations in the data. A group sparse PCA introduces structural constraints on the features in seeking such a linear combination. Collectively, the derived principal components may still require measuring all the input features. We present a joint group sparse PCA (JGSPCA) algorithm, which forces the basic coefficients corresponding to a group of features to be jointly sparse. Joint sparsity ensures that the complete basis involves only a sparse set of input features, whereas the group sparsity ensures that the structural integrity of the features is maximally preserved. We evaluate the JGSPCA algorithm on the problems of compressed hyperspectral imaging and face recognition. Compressed sensing results show that the proposed method consistently outperforms sparse PCA and group sparse PCA in reconstructing the hyperspectral scenes of natural and man-made objects. The efficacy of the proposed compressed sensing method is further demonstrated in band selection for face recognition.

  11. PCA and Postoperative Pain Management After Orthopedic Surgeries

    Directory of Open Access Journals (Sweden)

    S.M. Hashemi

    2016-08-01

    Full Text Available Background: Patients often suffer from inadequate treatment of postoperative pain. The aim of this study was to investigate the effect of PCA on postoperative pain management and patients' satisfaction with the use of PCA. Materials and Methods: In this prospective study, conducted between 2010 and 2011, patients were referred by orthopedic specialists to the acute and chronic pain service of Akhtar Hospital. A satisfaction questionnaire was given to these patients on discharge; they were asked to fill it out, and it was then collected by the ward nurse. Results: Patients' satisfaction with pain relief using PCA was high (94.9%). In these patients, pain at the third day after surgery and the need for analgesics were significantly low (p=0.0001). Patients' satisfaction with the effect of PCA on pain control and with product support was significantly high (p=0.0001). Conclusion: Patient-controlled analgesia is a safe, effective and noninvasive method for postoperative pain management, and in this study patients' satisfaction with pain management using PCA and the pain service was high.

  12. [A Quantitative Verification for Operability of Three PCA Devices Attached to the Disposable Infusion Pumps].

    Science.gov (United States)

    Tadokoro, Takahiro; Fuchibe, Makoto; Odo, Yuichiro; Kakinohana, Manabu

    2015-11-01

    In this study using 3 different PCA devices (Baxter infuser LVBB +PCM 2 ml: Pump B, Coopdech Balloonjector +PCA 3 ml: Pump C, Rakuraku fuser +PCA 3 ml: Pump S), we investigated how easily PCA devices could be handled. In this study with 42 volunteers (14 elders and 28 nurses), we compared PCA ejection volume and ejection rate among the three PCA devices. PCA ejection rate was defined as the ratio of actual ejection volume to the maximum ejection volume (MEV) of each PCA device. Although not only elders but also nurses failed to produce an accurate PCA ejection volume with the Pump B, Pump S could provide the MEV even when operated by elders. With the Pump C, approximately 80% of MEV could be achieved by nurses, but only 60% of MEV by elders (P < 0.05). These results suggest that PCA ejection volume might be dependent on the PCA device.

  13. On a PCA-based lung motion model

    Energy Technology Data Exchange (ETDEWEB)

    Li Ruijiang; Lewis, John H; Jia Xun; Jiang, Steve B [Department of Radiation Oncology and Center for Advanced Radiotherapy Technologies, University of California San Diego, 3855 Health Sciences Dr, La Jolla, CA 92037-0843 (United States); Zhao Tianyu; Wuenschel, Sara; Lamb, James; Yang Deshan; Low, Daniel A [Department of Radiation Oncology, Washington University School of Medicine, 4921 Parkview Pl, St. Louis, MO 63110-1093 (United States); Liu Weifeng, E-mail: sbjiang@ucsd.edu [Amazon.com Inc., 701 5th Ave. Seattle, WA 98104 (United States)

    2011-09-21

    Respiration-induced organ motion is one of the major uncertainties in lung cancer radiotherapy and is crucial to be able to accurately model the lung motion. Most work so far has focused on the study of the motion of a single point (usually the tumor center of mass), and much less work has been done to model the motion of the entire lung. Inspired by the work of Zhang et al (2007 Med. Phys. 34 4772-81), we believe that the spatiotemporal relationship of the entire lung motion can be accurately modeled based on principle component analysis (PCA) and then a sparse subset of the entire lung, such as an implanted marker, can be used to drive the motion of the entire lung (including the tumor). The goal of this work is twofold. First, we aim to understand the underlying reason why PCA is effective for modeling lung motion and find the optimal number of PCA coefficients for accurate lung motion modeling. We attempt to address the above important problems both in a theoretical framework and in the context of real clinical data. Second, we propose a new method to derive the entire lung motion using a single internal marker based on the PCA model. The main results of this work are as follows. We derived an important property which reveals the implicit regularization imposed by the PCA model. We then studied the model using two mathematical respiratory phantoms and 11 clinical 4DCT scans for eight lung cancer patients. For the mathematical phantoms with cosine and an even power (2n) of cosine motion, we proved that 2 and 2n PCA coefficients and eigenvectors will completely represent the lung motion, respectively. Moreover, for the cosine phantom, we derived the equivalence conditions for the PCA motion model and the physiological 5D lung motion model (Low et al 2005 Int. J. Radiat. Oncol. Biol. Phys. 63 921-9). For the clinical 4DCT data, we demonstrated the modeling power and generalization performance of the PCA model. The average 3D modeling error using PCA was within 1
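
    A toy sketch of the two ideas in the abstract above: a few PCA coefficients can represent a whole motion field, and a single marker can drive those coefficients through a fitted linear map. The displacement data here are synthetic, not 4DCT, and the mapping is a plain least-squares fit rather than the paper's method.

```python
import numpy as np

# Toy displacement data: 10 respiratory phases, 500 lung points, 3 components each
rng = np.random.default_rng(0)
n_phase, n_pts = 10, 500
phase = np.linspace(0, 2 * np.pi, n_phase, endpoint=False)
mode1 = rng.standard_normal(n_pts * 3)
mode2 = 0.3 * rng.standard_normal(n_pts * 3)
D = (np.outer(np.cos(phase), mode1) + np.outer(np.cos(phase) ** 2, mode2)
     + 0.01 * rng.standard_normal((n_phase, n_pts * 3)))   # (phases, 3 * points)

# PCA of the motion: a couple of coefficients capture most of the field
mean = D.mean(axis=0)
_, _, Vt = np.linalg.svd(D - mean, full_matrices=False)
n_coeff = 2
coeffs = (D - mean) @ Vt[:n_coeff].T          # per-phase PCA coefficients

# Drive the whole field from a single "marker" (one point's 3-D displacement):
# fit a linear map marker -> PCA coefficients, then reconstruct the full field
marker = D[:, :3]                             # displacements of point 0
A = np.column_stack([marker, np.ones(n_phase)])
W, *_ = np.linalg.lstsq(A, coeffs, rcond=None)
pred_field = mean + (A @ W) @ Vt[:n_coeff]
print("max prediction error:", np.abs(pred_field - D).max())
```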

  14. On a PCA-based lung motion model.

    Science.gov (United States)

    Li, Ruijiang; Lewis, John H; Jia, Xun; Zhao, Tianyu; Liu, Weifeng; Wuenschel, Sara; Lamb, James; Yang, Deshan; Low, Daniel A; Jiang, Steve B

    2011-09-21

    Respiration-induced organ motion is one of the major uncertainties in lung cancer radiotherapy and is crucial to be able to accurately model the lung motion. Most work so far has focused on the study of the motion of a single point (usually the tumor center of mass), and much less work has been done to model the motion of the entire lung. Inspired by the work of Zhang et al (2007 Med. Phys. 34 4772-81), we believe that the spatiotemporal relationship of the entire lung motion can be accurately modeled based on principle component analysis (PCA) and then a sparse subset of the entire lung, such as an implanted marker, can be used to drive the motion of the entire lung (including the tumor). The goal of this work is twofold. First, we aim to understand the underlying reason why PCA is effective for modeling lung motion and find the optimal number of PCA coefficients for accurate lung motion modeling. We attempt to address the above important problems both in a theoretical framework and in the context of real clinical data. Second, we propose a new method to derive the entire lung motion using a single internal marker based on the PCA model. The main results of this work are as follows. We derived an important property which reveals the implicit regularization imposed by the PCA model. We then studied the model using two mathematical respiratory phantoms and 11 clinical 4DCT scans for eight lung cancer patients. For the mathematical phantoms with cosine and an even power (2n) of cosine motion, we proved that 2 and 2n PCA coefficients and eigenvectors will completely represent the lung motion, respectively. Moreover, for the cosine phantom, we derived the equivalence conditions for the PCA motion model and the physiological 5D lung motion model (Low et al 2005 Int. J. Radiat. Oncol. Biol. Phys. 63 921-9). For the clinical 4DCT data, we demonstrated the modeling power and generalization performance of the PCA model. The average 3D modeling error using PCA was within 1

  15. Synthesis and antifungal evaluation of PCA amide analogues.

    Science.gov (United States)

    Qin, Chuan; Yu, Di-Ya; Zhou, Xu-Dong; Zhang, Min; Wu, Qing-Lai; Li, Jun-Kai

    2018-04-18

    To improve the physical and chemical properties of phenazine-1-carboxylic acid (PCA) and find compounds with higher antifungal activity, a series of PCA amide analogues were designed and synthesized and their structures were confirmed by 1H NMR, HRMS, and X-ray. Most compounds showed some antifungal activities in vitro. In particular, compound 3d exhibited an inhibitory effect against Pyricularia oryzae with an EC50 value of 28.7 μM and compound 3q exhibited an effect against Rhizoctonia solani with an EC50 value of 24.5 μM, more potently active than the positive control PCA with its EC50 values of 37.3 μM (Pyricularia oryzae) and 33.2 μM (Rhizoctonia solani), respectively.

  16. On a PCA-based lung motion model

    International Nuclear Information System (INIS)

    Li Ruijiang; Lewis, John H; Jia Xun; Jiang, Steve B; Zhao Tianyu; Wuenschel, Sara; Lamb, James; Yang Deshan; Low, Daniel A; Liu Weifeng

    2011-01-01

    Respiration-induced organ motion is one of the major uncertainties in lung cancer radiotherapy and is crucial to be able to accurately model the lung motion. Most work so far has focused on the study of the motion of a single point (usually the tumor center of mass), and much less work has been done to model the motion of the entire lung. Inspired by the work of Zhang et al (2007 Med. Phys. 34 4772-81), we believe that the spatiotemporal relationship of the entire lung motion can be accurately modeled based on principle component analysis (PCA) and then a sparse subset of the entire lung, such as an implanted marker, can be used to drive the motion of the entire lung (including the tumor). The goal of this work is twofold. First, we aim to understand the underlying reason why PCA is effective for modeling lung motion and find the optimal number of PCA coefficients for accurate lung motion modeling. We attempt to address the above important problems both in a theoretical framework and in the context of real clinical data. Second, we propose a new method to derive the entire lung motion using a single internal marker based on the PCA model. The main results of this work are as follows. We derived an important property which reveals the implicit regularization imposed by the PCA model. We then studied the model using two mathematical respiratory phantoms and 11 clinical 4DCT scans for eight lung cancer patients. For the mathematical phantoms with cosine and an even power (2n) of cosine motion, we proved that 2 and 2n PCA coefficients and eigenvectors will completely represent the lung motion, respectively. Moreover, for the cosine phantom, we derived the equivalence conditions for the PCA motion model and the physiological 5D lung motion model (Low et al 2005 Int. J. Radiat. Oncol. Biol. Phys. 63 921-9). For the clinical 4DCT data, we demonstrated the modeling power and generalization performance of the PCA model. The average 3D modeling error using PCA was within 1

  17. Adaptive PCA based fault diagnosis scheme in imperial smelting process.

    Science.gov (United States)

    Hu, Zhikun; Chen, Zhiwen; Gui, Weihua; Jiang, Bin

    2014-09-01

    In this paper, an adaptive fault detection scheme based on recursive principal component analysis (PCA) is proposed to deal with the problem of false alarms due to normal process changes in a real process. A fault isolation approach is further developed based on the Generalized Likelihood Ratio (GLR) test and Singular Value Decomposition (SVD), one of the general techniques underlying PCA, with which offset and scaling faults can be easily isolated with an explicit offset fault direction and scaling fault classification. The identification of offset and scaling faults is also addressed. The complete scheme of the PCA-based fault diagnosis procedure is proposed. The proposed scheme is first applied to the Imperial Smelting Process, and the results show that the proposed strategies are able to mitigate false alarms and isolate faults efficiently. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.

  18. Principal Component Analysis Based Two-Dimensional (PCA-2D) Correlation Spectroscopy: PCA Denoising for 2D Correlation Spectroscopy

    International Nuclear Information System (INIS)

    Jung, Young Mee

    2003-01-01

    Principal component analysis based two-dimensional (PCA-2D) correlation analysis is applied to FTIR spectra of a polystyrene/methyl ethyl ketone/toluene solution mixture during solvent evaporation. A substantial amount of artificial noise was added to the experimental data to demonstrate the practical noise-suppressing benefit of the PCA-2D technique. 2D correlation analysis of the data matrix reconstructed from PCA loading vectors and scores successfully extracted only the most important features of synchronicity and asynchronicity without interference from noise or insignificant minor components. 2D correlation spectra constructed with only one principal component yield a strictly synchronous response with no discernible asynchronous features, while those involving at least two or more principal components generated meaningful asynchronous 2D correlation spectra. Deliberate manipulation of the rank of the reconstructed data matrix, by choosing the appropriate number and type of PCs, yields potentially more refined 2D correlation spectra.
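
    A compact sketch of the procedure described above: reconstruct the spectra from a few principal components, then compute synchronous and asynchronous 2D correlation spectra (Noda's formalism, using the Hilbert-Noda matrix). Data are synthetic and parameter choices are arbitrary.

```python
import numpy as np

def pca_denoise(Y, n_pcs):
    """Reconstruct the spectra matrix from its first n_pcs principal components."""
    mean = Y.mean(axis=0)
    U, S, Vt = np.linalg.svd(Y - mean, full_matrices=False)
    return mean + (U[:, :n_pcs] * S[:n_pcs]) @ Vt[:n_pcs]

def two_d_correlation(Y):
    """Synchronous and asynchronous 2D correlation spectra (Noda's formalism)."""
    m = Y.shape[0]
    dyn = Y - Y.mean(axis=0)                      # dynamic spectra
    sync = dyn.T @ dyn / (m - 1)
    j, k = np.meshgrid(np.arange(m), np.arange(m), indexing="ij")
    with np.errstate(divide="ignore"):
        noda = np.where(j == k, 0.0, 1.0 / (np.pi * (k - j)))   # Hilbert-Noda matrix
    asyn = dyn.T @ noda @ dyn / (m - 1)
    return sync, asyn

# Toy usage: 30 perturbation steps x 200 spectral channels with added noise
rng = np.random.default_rng(0)
Y = np.outer(np.linspace(0, 1, 30), rng.random(200)) + 0.05 * rng.standard_normal((30, 200))
sync, asyn = two_d_correlation(pca_denoise(Y, n_pcs=2))
print(sync.shape, asyn.shape)
```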

  19. Metoclopramide improves the quality of tramadol PCA indistinguishable to morphine PCA: a prospective, randomized, double blind clinical comparison.

    Science.gov (United States)

    Pang, Weiwu; Liu, Yu-Cheng; Maboudou, Edgard; Chen, Tom Xianxiu; Chois, John M; Liao, Cheng-Chun; Wu, Rick Sai-Chuen

    2013-09-01

    Multimodal analgesia has been effectively used in postoperative pain control. Tramadol can be considered "multimodal" because it has two main mechanisms of action, an opioid agonist and a reuptake inhibitor of norepinephrine and serotonin. Tramadol is not as commonly used as morphine due to the increased incidence of postoperative nausea and vomiting (PONV). As metoclopramide is an antiemetic and an analgesic, it was hypothesized that when added to reduce PONV, metoclopramide may enhance the multimodal feature of tramadol by the analgesic property of metoclopramide. Therefore, the effectiveness of postoperative patient-controlled analgesia (PCA) with morphine was compared against PCA with a combination of tramadol and metoclopramide. A prospective, randomized, double blind clinical trial. Academic pain service of a university hospital. Sixty patients undergoing elective total knee arthroplasty with general anesthesia. Sixty patients were randomly divided into Group M and Group T. In a double-blinded fashion, Group M received intraoperative 0.2 mg/kg morphine and postoperative PCA with 1 mg morphine per bolus, whereas Group T received intraoperative tramadol 2.5 mg/kg and postoperative PCA with 20 mg tramadol plus 1 mg metoclopramide per bolus. Lockout interval was 5 minutes in both groups. Pain scale, satisfaction rate, analgesic consumption, PCA demand, and side effects were recorded by a blind investigator. These two groups displayed no statistically significant difference between the items and variables evaluated. This combination provides analgesia equivalent to that of morphine and can be used as an alternative to morphine PCA. Wiley Periodicals, Inc.

  20. Linking PCA and time derivatives of dynamic systems

    NARCIS (Netherlands)

    Stanimirovic, Olja; Hoefsloot, Huub C. J.; de Bokx, Pieter K.; Smilde, Age K.

    2006-01-01

    Low dimensional approximate descriptions of the high dimensional phase space of dynamic processes are very useful. Principal component analysis (PCA) is the most used technique to find the low dimensional subspace of interest. Here, it will be shown that mean centering of the process data across

  1. Copenhagen uPAR prostate cancer (CuPCa) database

    DEFF Research Database (Denmark)

    Lippert, Solvej; Berg, Kasper D; Høyer-Hansen, Gunilla

    2016-01-01

    AIM: Urokinase plasminogen activator receptor (uPAR) plays a central role during cancer invasion by facilitating pericellular proteolysis. We initiated the prospective 'Copenhagen uPAR Prostate Cancer' study to investigate the significance of uPAR levels in prostate cancer (PCa) patients. METHODS...

  2. The in-reactor deformation of the PCA alloy

    International Nuclear Information System (INIS)

    Puigh, R.J.

    1986-04-01

    The swelling and in-reactor creep behaviors of the PCA alloy have been determined from the irradiation of pressurized tube specimens in the FFTF reactor. These data have been obtained to a peak neutron fluence corresponding to approximately 80 dpa in the FFTF reactor for irradiation temperatures between 400 and 750 °C. Diametral measurements performed on the unstressed specimens indicate the possible onset of swelling in the PCA alloy for irradiation temperatures between 400 and 550 °C and at a neutron fluence corresponding to ∼50 dpa. The creep data suggest a non-linear fluence dependence and linear stress dependence (for hoop stresses less than 100 MPa) which is consistent with the in-reactor creep behavior of many cold worked austenitic stainless steels. These PCA creep data are compared to available 316 SS in-reactor creep data. The in-reactor creep strains for PCA are significantly less than observed in 20% cold worked 316 SS over the temperature ranges and fluences investigated.

  3. Neutron spectral characterization of the PCA-PV benchmark facility

    International Nuclear Information System (INIS)

    Stallmann, F.W.; Kam, F.B.K.; Fabry, A.

    1980-01-01

    The Pool Critical Assembly (PCA) at the Oak Ridge National Laboratory is being used to generate the PCA-PV benchmark neutron field. A configuration consisting of steel blocks and water gaps is used to simulate the thermal shield pressure vessel configurations in power reactors. The distances between the steel blocks can be changed so that the penetration of neutrons through water and steel can be determined and compared for many different configurations. Easy access and low flux levels make it possible to conduct extensive measurements using active and passive neutron dosimetry, which are impossible to perform in commercial reactors. The clean core and simple geometry facilitates neutron transport calculations which can be validated in detail by comparison with measurements. A facility which has the same configuration of water and steel as the PCA-PV facility but contains test specimens for materials testing, will be irradiated in the higher fluxes at the Oak Ridge Research Reactor. Using the results from the PCA-PV facility, the correlation between neutron flux-fluences and radiation damage in steel can be established. This facility is being discussed in a separate paper

  4. ECG-derived respiration methods: adapted ICA and PCA.

    Science.gov (United States)

    Tiinanen, Suvi; Noponen, Kai; Tulppo, Mikko; Kiviniemi, Antti; Seppänen, Tapio

    2015-05-01

    Respiration is an important signal in early diagnostics, prediction, and treatment of several diseases. Moreover, a growing trend toward ambulatory measurements outside laboratory environments encourages developing indirect measurement methods such as ECG derived respiration (EDR). Recently, decomposition techniques like principal component analysis (PCA), and its nonlinear version, kernel PCA (KPCA), have been used to derive a surrogate respiration signal from single-channel ECG. In this paper, we propose an adapted independent component analysis (AICA) algorithm to obtain EDR signal, and extend the normal linear PCA technique based on the best principal component (PC) selection (APCA, adapted PCA) to improve its performance further. We also demonstrate that the usage of smoothing spline resampling and bandpass-filtering improve the performance of all EDR methods. Compared with other recent EDR methods using correlation coefficient and magnitude squared coherence, the proposed AICA and APCA yield a statistically significant improvement with correlations 0.84, 0.82, 0.76 and coherences 0.90, 0.91, 0.85 between reference respiration and AICA, APCA and KPCA, respectively. Copyright © 2015 IPEM. Published by Elsevier Ltd. All rights reserved.
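
    A rough sketch of one simple PCA-based EDR, assuming numpy/scipy: PCA of R-peak-aligned beats, first-PC scores per beat as the respiratory surrogate, spline resampling and band-pass filtering (the resampling and filtering steps the record reports as beneficial). It is not the adapted ICA/PCA algorithms of the paper; peak detection settings and band limits are arbitrary choices, and the test signal is synthetic.

```python
import numpy as np
from scipy.signal import find_peaks, butter, filtfilt
from scipy.interpolate import CubicSpline

def edr_pca(ecg, fs, resp_band=(0.1, 0.5)):
    """Rough ECG-derived respiration via PCA of R-peak-aligned beats."""
    peaks, _ = find_peaks(ecg, distance=int(0.4 * fs), height=np.percentile(ecg, 90))
    half = int(0.1 * fs)
    peaks = peaks[(peaks > half) & (peaks < len(ecg) - half)]
    beats = np.array([ecg[p - half:p + half] for p in peaks])

    beats = beats - beats.mean(axis=0)
    _, _, vt = np.linalg.svd(beats, full_matrices=False)
    scores = beats @ vt[0]                       # first PC score per beat

    t = np.arange(len(ecg)) / fs
    edr = CubicSpline(peaks / fs, scores)(t)     # spline resampling to the ECG grid
    b, a = butter(2, [resp_band[0] / (fs / 2), resp_band[1] / (fs / 2)], btype="band")
    return filtfilt(b, a, edr)                   # band-pass to the respiratory band

# Toy usage: synthetic spike train amplitude-modulated by a 0.25 Hz "respiration"
fs, dur = 250, 60
t = np.arange(fs * dur) / fs
ecg = np.zeros_like(t)
ecg[::fs] = 1.0 + 0.2 * np.sin(2 * np.pi * 0.25 * t[::fs])   # one beat per second
print(edr_pca(ecg, fs).shape)
```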

  5. A Hold-out method to correct PCA variance inflation

    DEFF Research Database (Denmark)

    Garcia-Moreno, Pablo; Artes-Rodriguez, Antonio; Hansen, Lars Kai

    2012-01-01

    In this paper we analyze the problem of variance inflation experienced by the PCA algorithm when working in an ill-posed scenario where the dimensionality of the training set is larger than its sample size. In an earlier article a correction method based on a Leave-One-Out (LOO) procedure...

  6. Effects of PCA and DMAE on the nematode Caenorhabditis briggsae.

    Science.gov (United States)

    Zuckerman, B M; Barrett, K A

    1978-04-01

    Concentration of 6.8 mM DMAE did not retard age pigment accumulation in Caenorhabditis briggsae. However, when the nematodes were exposed to 6.8 mM PCA + 6.8 mM DMAE combined, the accumulation of age pigment was significantly retarded. A combination of 3.4 mM DMAE + 3.4 mM PCA had no effect on age pigment. It is concluded from this study that PCA and DMAE act in concert to produce the observed effect on age pigment. In respect to this parameter neither molecule was effective alone. The results indicate that the effect of centrophenoxine on age pigment might be enhanced by retarding the hydrolysis of centrophenoxine. The accumulation of electron dense aggregates, thought to be aggregates of cross-linked molecules, was reduced by 6.8 PCA + 6.8 DMAE. It is suggested that centrophenoxine be tested for its ability to remove random, unwanted cross-linkages in higher animals.

  7. Tracking image features with PCA-SURF descriptors

    CSIR Research Space (South Africa)

    Pancham, A

    2015-05-01

    Full Text Available IAPR International Conference on Machine Vision Applications, May 18-22, 2015, Tokyo, Japan. Tracking Image Features with PCA-SURF Descriptors. Ardhisha Pancham (CSIR, UKZN, South Africa; apancham@csir.co.za), Daniel Withey (CSIR, South Africa)...

  8. Optimization of CNC end milling process parameters using PCA ...

    African Journals Online (AJOL)

    Optimization of CNC end milling process parameters using PCA-based Taguchi method. ... International Journal of Engineering, Science and Technology ... To meet the basic assumption of Taguchi method; in the present work, individual response correlations have been eliminated first by means of Principal Component ...

  9. [An improved low spectral distortion PCA fusion method].

    Science.gov (United States)

    Peng, Shi; Zhang, Ai-Wu; Li, Han-Lun; Hu, Shao-Xing; Meng, Xian-Gang; Sun, Wei-Dong

    2013-10-01

    Aiming at the spectral distortion produced in the PCA fusion process, the present paper proposes an improved low spectral distortion PCA fusion method. This method uses the NCUT (normalized cut) image segmentation algorithm to divide a complex hyperspectral remote sensing image into multiple sub-images, increasing the separability of samples, which can weaken the spectral distortions of traditional PCA fusion. A pixel similarity weighting matrix and masks are produced by using graph theory and clustering theory. These masks are used to cut the hyperspectral image and the high-resolution image into sub-region objects. All corresponding sub-region objects between the hyperspectral image and the high-resolution image are fused using the PCA method, and all sub-regional fusion results are spliced together to produce a new image. In the experiment, Hyperion hyperspectral data and RapidEye data were used. The experimental results show that the proposed method has the same ability to enhance spatial resolution and a greater ability to improve spectral fidelity.
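
    A sketch of the classic global PCA fusion (pan-sharpening) step that the method above refines: replace the first principal component of the low-resolution cube with a moment-matched high-resolution image and invert the transform. The NCUT segmentation and per-region fusion of the paper are not reproduced, and the imagery is random.

```python
import numpy as np

def pca_fusion(ms, pan):
    """Classic global PCA image fusion.

    ms: multispectral/hyperspectral cube resampled to the pan grid, shape (h, w, bands);
    pan: high-resolution panchromatic image, shape (h, w).
    """
    h, w, nb = ms.shape
    X = ms.reshape(-1, nb).astype(float)
    mean = X.mean(axis=0)
    Xc = X - mean
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    eigvecs = eigvecs[:, np.argsort(eigvals)[::-1]]   # columns sorted by variance
    pcs = Xc @ eigvecs

    p = pan.reshape(-1).astype(float)
    p = (p - p.mean()) / p.std() * pcs[:, 0].std() + pcs[:, 0].mean()  # match moments
    pcs[:, 0] = p                                     # substitute the first PC

    fused = pcs @ eigvecs.T + mean                    # invert the orthogonal transform
    return fused.reshape(h, w, nb)

# Toy usage with random imagery
rng = np.random.default_rng(0)
ms = rng.random((64, 64, 6))
pan = rng.random((64, 64))
print(pca_fusion(ms, pan).shape)
```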

  10. Chemical fingerprinting of petroleum biomarkers using time warping and PCA

    DEFF Research Database (Denmark)

    Christensen, Jan H.; Tomasi, Giorgio; Hansen, Asger B.

    2005-01-01

    A new method for chemical fingerprinting of petroleum biomarkers is described. The method consists of GC-MS analysis, preprocessing of GC-MS chromatograms, and principal component analysis (PCA) of selected regions. The preprocessing consists of baseline removal by derivatization, normalization...

  11. Parent-controlled PCA for pain management in pediatric oncology: is it safe?

    Science.gov (United States)

    Anghelescu, Doralina L; Faughnan, Lane G; Oakes, Linda L; Windsor, Kelley B; Pei, Deqing; Burgoyne, Laura L

    2012-08-01

    Patient-controlled analgesia offers safe and effective pain control for children who can self-administer medication. Some children may not be candidates for patient-controlled analgesia (PCA) unless a proxy can administer doses. The safety of proxy-administered PCA has been studied, but the safety of parent-administered PCA in children with cancer has not been reported. In this study, we compare the rate of complications in PCA by parent proxy versus PCA by clinician (nurse) proxy and self-administered PCA. Our pediatric institution's quality improvement database was reviewed for adverse events associated with PCA from 2004 through 2010. Each PCA day was categorized according to patient or proxy authorization. Data from 6151 PCA observation days were included; 61.3% of these days were standard PCA, 23.5% were parent-proxy PCA, and 15.2% were clinician-proxy PCA days. The mean duration of PCA use was 12.1 days, and the mean patient age was 12.3 years. The mean patient age was lower in the clinician-proxy (9.4 y) and parent-proxy (5.1 y) groups, respectively. The complication rate was lowest in the parent-proxy group (0.62%). We found that proxy administration of PCA by authorized parents is as safe as clinician administered and standard PCA at our pediatric institution.

  12. Evolutionary constrained optimization

    CERN Document Server

    Deb, Kalyanmoy

    2015-01-01

    This book makes available a self-contained collection of modern research addressing the general constrained optimization problems using evolutionary algorithms. Broadly the topics covered include constraint handling for single and multi-objective optimizations; penalty function based methodology; multi-objective based methodology; new constraint handling mechanism; hybrid methodology; scaling issues in constrained optimization; design of scalable test problems; parameter adaptation in constrained optimization; handling of integer, discrete and mix variables in addition to continuous variables; application of constraint handling techniques to real-world problems; and constrained optimization in dynamic environment. There is also a separate chapter on hybrid optimization, which is gaining lots of popularity nowadays due to its capability of bridging the gap between evolutionary and classical optimization. The material in the book is useful to researchers, novice, and experts alike. The book will also be useful...

  13. SU-G-BRA-03: PCA Based Imaging Angle Optimization for 2D Cine MRI Based Radiotherapy Guidance

    Energy Technology Data Exchange (ETDEWEB)

    Chen, T; Yue, N; Jabbour, S; Zhang, M [Rutgers University, New Brunswick, NJ (United States)

    2016-06-15

    Purpose: To develop an imaging angle optimization methodology for orthogonal 2D cine MRI based radiotherapy guidance using Principal Component Analysis (PCA) of target motion retrieved from 4DCT. Methods: We retrospectively analyzed 4DCT of 6 patients with lung tumor. A radiation oncologist manually contoured the target volume at the maximal inhalation phase of the respiratory cycle. An object constrained deformable image registration (DIR) method has been developed to track the target motion along the respiration at ten phases. The motion of the center of the target mass has been analyzed using the PCA to find out the principal motion components that were uncorrelated with each other. Two orthogonal image planes for cineMRI have been determined using this method to minimize the through plane motion during MRI based radiotherapy guidance. Results: 3D target respiratory motion for all 6 patients has been efficiently retrieved from 4DCT. In this process, the object constrained DIR demonstrated satisfactory accuracy and efficiency to enable the automatic motion tracking for clinical application. The average motion amplitude in the AP, lateral, and longitudinal directions were 3.6mm (min: 1.6mm, max: 5.6mm), 1.7mm (min: 0.6mm, max: 2.7mm), and 5.6mm (min: 1.8mm, max: 16.1mm), respectively. Based on PCA, the optimal orthogonal imaging planes were determined for cineMRI. The average angular difference between the PCA determined imaging planes and the traditional AP and lateral imaging planes were 47 and 31 degrees, respectively. After optimization, the average amplitude of through plane motion reduced from 3.6mm in AP images to 2.5mm (min:1.3mm, max:3.9mm); and from 1.7mm in lateral images to 0.6mm (min: 0.2mm, max:1.5mm), while the principal in plane motion amplitude increased from 5.6mm to 6.5mm (min: 2.8mm, max: 17mm). Conclusion: DIR and PCA can be used to optimize the orthogonal image planes of cineMRI to minimize the through plane motion during radiotherapy
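
    A toy illustration of the plane-selection idea above: PCA of the target-centroid trajectory gives orthogonal motion directions ordered by amplitude, and imaging planes spanned by the first two components minimize through-plane motion. The trajectory here is synthetic, not derived from 4DCT or deformable registration.

```python
import numpy as np

# Toy target-centroid trajectory over ten respiratory phases (mm), with the
# dominant motion along a direction oblique to the AP/lateral/longitudinal axes
rng = np.random.default_rng(0)
phase = np.linspace(0, 2 * np.pi, 10, endpoint=False)
direction = np.array([0.5, 0.2, 1.0]) / np.linalg.norm([0.5, 0.2, 1.0])
traj = 5.0 * np.outer(np.cos(phase), direction) + 0.3 * rng.standard_normal((10, 3))

# PCA of the centroid motion: eigenvectors of the trajectory covariance
centered = traj - traj.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
pc1, pc2, pc3 = vt                     # orthogonal directions, ordered by motion amplitude

# Choose the two orthogonal cine-MRI planes to contain pc1 (and pc2); the
# residual through-plane motion is then the projection onto pc3
amplitudes = [np.ptp(centered @ v) for v in (pc1, pc2, pc3)]
print("peak-to-peak motion along PC1/PC2/PC3 (mm):", np.round(amplitudes, 2))
print("plane normals:", np.round(pc3, 3), "and", np.round(pc2, 3))
```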

  14. SU-G-BRA-03: PCA Based Imaging Angle Optimization for 2D Cine MRI Based Radiotherapy Guidance

    International Nuclear Information System (INIS)

    Chen, T; Yue, N; Jabbour, S; Zhang, M

    2016-01-01

    Purpose: To develop an imaging angle optimization methodology for orthogonal 2D cine MRI based radiotherapy guidance using Principal Component Analysis (PCA) of target motion retrieved from 4DCT. Methods: We retrospectively analyzed 4DCT of 6 patients with lung tumor. A radiation oncologist manually contoured the target volume at the maximal inhalation phase of the respiratory cycle. An object constrained deformable image registration (DIR) method has been developed to track the target motion along the respiration at ten phases. The motion of the center of the target mass has been analyzed using the PCA to find out the principal motion components that were uncorrelated with each other. Two orthogonal image planes for cineMRI have been determined using this method to minimize the through plane motion during MRI based radiotherapy guidance. Results: 3D target respiratory motion for all 6 patients has been efficiently retrieved from 4DCT. In this process, the object constrained DIR demonstrated satisfactory accuracy and efficiency to enable the automatic motion tracking for clinical application. The average motion amplitude in the AP, lateral, and longitudinal directions were 3.6mm (min: 1.6mm, max: 5.6mm), 1.7mm (min: 0.6mm, max: 2.7mm), and 5.6mm (min: 1.8mm, max: 16.1mm), respectively. Based on PCA, the optimal orthogonal imaging planes were determined for cineMRI. The average angular difference between the PCA determined imaging planes and the traditional AP and lateral imaging planes were 47 and 31 degrees, respectively. After optimization, the average amplitude of through plane motion reduced from 3.6mm in AP images to 2.5mm (min:1.3mm, max:3.9mm); and from 1.7mm in lateral images to 0.6mm (min: 0.2mm, max:1.5mm), while the principal in plane motion amplitude increased from 5.6mm to 6.5mm (min: 2.8mm, max: 17mm). Conclusion: DIR and PCA can be used to optimize the orthogonal image planes of cineMRI to minimize the through plane motion during radiotherapy

  15. Parent-Controlled PCA for Pain Management in Pediatric Oncology: Is it Safe?

    OpenAIRE

    Anghelescu, Doralina L.; Faughnan, Lane G.; Oakes, Linda L.; Windsor, Kelley B.; Pei, Deqing; Burgoyne, Laura L.

    2012-01-01

    Patient-controlled analgesia offers safe and effective pain control for children who can self-administer medication. Some children may not be candidates for PCA unless a proxy can administer doses. The safety of proxy-administered PCA has been studied, but the safety of parent-administered PCA in children with cancer has not been reported. In this study we compare the rate of complications in PCA by parent proxy versus PCA by clinician (nurse) proxy and self-administered PCA. Our pediatric in...

  16. F-15 PCA (Propulsion Controlled Aircraft) Simulation Cockpit

    Science.gov (United States)

    1990-01-01

    The F-15 PCA (Propulsion Controlled Aircraft) simulation was used from 1990 to 1993. It was used for the development of propulsion algorithms and piloting techniques (using throttles only) to be used for emergency flight control in the event of a major flight control system failure on a multi-engine aircraft. Following this program with the Dryden F-15, similar capabilities were developed for other aircraft, such as the B-720, Lear 24, B-727, C-402, and B-747.

  17. Technology Marketing using PCA , SOM, and STP Strategy Modeling

    OpenAIRE

    Sunghae Jun

    2011-01-01

    Technology marketing is the overall process of identifying and meeting the technological needs of human society. Most technology results exist in intellectual properties such as patents. In our research, we consider each patent document as a technology. Patent data are therefore analyzed by Principal Component Analysis (PCA) and Self Organizing Map (SOM) for STP (Segmentation, Targeting, and Positioning) strategy modeling. STP is a popular approach for developing marketing strategies. We use STP strategy m...
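
    A rough sketch of the PCA-plus-SOM segmentation step is given below. It assumes the third-party minisom package for the SOM and uses a random patent-by-keyword matrix purely as a placeholder; the grid size, component count, and term counts are illustrative guesses, not values from the paper.

      import numpy as np
      from sklearn.decomposition import PCA
      from minisom import MiniSom   # third-party SOM implementation, assumed available

      rng = np.random.default_rng(0)
      # Placeholder patent-by-keyword frequency matrix (200 patents, 50 terms).
      patents = rng.poisson(lam=1.0, size=(200, 50)).astype(float)

      # PCA compresses the keyword space before training the SOM.
      scores = PCA(n_components=5).fit_transform(patents)

      som = MiniSom(4, 4, scores.shape[1], sigma=1.0, learning_rate=0.5,
                    random_seed=0)
      som.train_random(scores, 1000)

      # Each SOM node acts as a candidate technology segment (the "S" of STP);
      # assigning every patent to its winning node yields the segmentation.
      nodes = np.array([som.winner(s) for s in scores])
      counts = {tuple(n): int((nodes == n).all(axis=1).sum())
                for n in np.unique(nodes, axis=0)}
      print("patents per SOM node:", counts)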

  18. Imaging network level language recovery after left PCA stroke.

    Science.gov (United States)

    Sebastian, Rajani; Long, Charltien; Purcell, Jeremy J; Faria, Andreia V; Lindquist, Martin; Jarso, Samson; Race, David; Davis, Cameron; Posner, Joseph; Wright, Amy; Hillis, Argye E

    2016-05-11

    The neural mechanisms that support aphasia recovery are not yet fully understood. Our goal was to evaluate longitudinal changes in naming recovery in participants with posterior cerebral artery (PCA) stroke using a case-by-case analysis. Using task-based and resting-state functional magnetic resonance imaging (fMRI) and detailed language testing, we longitudinally studied the recovery of the naming network in four participants with PCA stroke with naming deficits at the acute (0 weeks), subacute (3-5 weeks), and chronic (5-7 months) time points post stroke. Behavioral and imaging analyses (task-related and resting-state functional connectivity) were carried out to elucidate longitudinal changes in naming recovery. These analyses revealed that an improvement in naming accuracy from the acute to the chronic stage was reflected by increased connectivity within and between left and right hemisphere "language" regions. One participant who had a persistent moderate naming deficit showed weak and decreasing connectivity longitudinally within and between left and right hemisphere language regions. These findings emphasize a network view of aphasia recovery, and show that a degree of inter- and intra-hemispheric balance between the language-specific regions is necessary for optimal recovery of naming, at least in participants with PCA stroke.
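
    The resting-state connectivity comparison reported here boils down to correlating ROI time series and summarizing within- and between-hemisphere coupling. The toy sketch below uses random data and invented ROI labels purely to show the computation; it is not the authors' pipeline.

      import numpy as np

      rng = np.random.default_rng(1)
      # Placeholder BOLD time series (200 volumes x 6 "language" ROIs); the
      # ROI names are illustrative only.
      rois = ["L_IFG", "L_MTG", "L_AG", "R_IFG", "R_MTG", "R_AG"]
      ts = rng.standard_normal((200, len(rois)))
      ts[:, 1] += 0.6 * ts[:, 0]     # inject some left intra-hemispheric coupling

      # Resting-state functional connectivity as the ROI-by-ROI Pearson matrix.
      fc = np.corrcoef(ts, rowvar=False)

      left, right = slice(0, 3), slice(3, 6)
      intra_left = fc[left, left][np.triu_indices(3, k=1)].mean()
      inter_hemi = fc[left, right].mean()
      print("mean intra-left connectivity:", round(float(intra_left), 3))
      print("mean inter-hemispheric connectivity:", round(float(inter_hemi), 3))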

  19. PCA3 Silencing Sensitizes Prostate Cancer Cells to Enzalutamide-mediated Androgen Receptor Blockade.

    Science.gov (United States)

    Özgür, Emre; Celik, Ayca Iribas; Darendeliler, Emin; Gezer, Ugur

    2017-07-01

    Prostate cancer (PCa) is an androgen-dependent disease. Novel anti-androgens (i.e. enzalutamide) have recently been developed for the treatment of patients with metastatic castration-resistant prostate cancer (CRPC). Evidence is accumulating that prostate cancer antigen 3 (PCA3) is involved in androgen receptor (AR) signaling. Here, in combination with enzalutamide-mediated AR blockade, we investigated the effect of PCA3 targeting on the viability of PCa cells. In hormone-sensitive LNCaP cells, AR-overexpressing LNCaP-AR+ cells and VCaP cells (representing CRPC), PCA3 was silenced using siRNA oligonucleotides. Gene expression and cell viability were assessed in PCA3-silenced and/or AR-blocked cells. PCA3 targeting reduced the expression of AR-related genes (i.e. prostate-specific antigen (PSA) and prostate-specific transcript 1 (non-protein coding) (PCGEM1)) and potentiated the effect of enzalutamide. Proliferation of PCa cells was suppressed upon PCA3 silencing, with a greater effect in LNCaP-AR+ cells. Furthermore, PCA3 silencing sensitized PCa cells to enzalutamide-induced loss of cell growth. PCA3, as a therapeutic target in PCa, might be used to potentiate AR antagonists. Copyright © 2017, International Institute of Anticancer Research (Dr. George J. Delinasios), All rights reserved.

  20. Prostate cancer (PCa) risk variants and risk of fatal PCa in the National Cancer Institute Breast and Prostate Cancer Cohort Consortium.

    Science.gov (United States)

    Shui, Irene M; Lindström, Sara; Kibel, Adam S; Berndt, Sonja I; Campa, Daniele; Gerke, Travis; Penney, Kathryn L; Albanes, Demetrius; Berg, Christine; Bueno-de-Mesquita, H Bas; Chanock, Stephen; Crawford, E David; Diver, W Ryan; Gapstur, Susan M; Gaziano, J Michael; Giles, Graham G; Henderson, Brian; Hoover, Robert; Johansson, Mattias; Le Marchand, Loic; Ma, Jing; Navarro, Carmen; Overvad, Kim; Schumacher, Fredrick R; Severi, Gianluca; Siddiq, Afshan; Stampfer, Meir; Stevens, Victoria L; Travis, Ruth C; Trichopoulos, Dimitrios; Vineis, Paolo; Mucci, Lorelei A; Yeager, Meredith; Giovannucci, Edward; Kraft, Peter

    2014-06-01

    Screening and diagnosis of prostate cancer (PCa) is hampered by an inability to predict who has the potential to develop fatal disease and who has indolent cancer. Studies have identified multiple genetic risk loci for PCa incidence, but it is unknown whether they could be used as biomarkers for PCa-specific mortality (PCSM). To examine the association of 47 established PCa risk single-nucleotide polymorphisms (SNPs) with PCSM. We included 10 487 men who had PCa and 11 024 controls, with a median follow-up of 8.3 yr, during which 1053 PCa deaths occurred. The main outcome was PCSM. The risk allele was defined as the allele associated with an increased risk for PCa in the literature. We used Cox proportional hazards regression to calculate the hazard ratios of each SNP with time to progression to PCSM after diagnosis. We also used logistic regression to calculate odds ratios for each risk SNP, comparing fatal PCa cases to controls. Among the cases, we found that 8 of the 47 SNPs were significantly associated with PCSM. In the case-control comparison, 22 SNPs were associated with fatal PCa, but most did not differentiate between fatal and nonfatal PCa. Rs11672691 and rs10993994 were associated with both fatal and nonfatal PCa, while rs6465657, rs7127900, rs2735839, and rs13385191 were associated with nonfatal PCa only. Eight established risk loci were associated with progression to PCSM after diagnosis. Twenty-two SNPs were associated with fatal PCa incidence, but most did not differentiate between fatal and nonfatal PCa. The relatively small magnitudes of the associations do not translate well into risk prediction, but these findings merit further follow-up, because they may yield important clues about the complex biology of fatal PCa. In this report, we assessed whether established PCa risk variants could predict PCSM. We found eight risk variants associated with PCSM: One predicted an increased risk of PCSM, while seven were associated with decreased risk. Larger studies that focus on fatal PCa are needed to identify more markers that
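
    The two statistical models named in the abstract (a per-SNP Cox model for time to PCa-specific death, and a logistic model comparing fatal cases with controls) can be sketched as follows. The genotype and survival data are simulated and the lifelines package is an assumed dependency; nothing here reproduces the consortium analysis.

      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter              # assumed available
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(2)
      n = 2000
      snp = rng.binomial(2, 0.3, size=n)             # risk-allele count, one SNP
      follow_up = rng.exponential(8.0, size=n)       # years from diagnosis
      death = rng.binomial(1, 0.10 + 0.02 * snp)     # toy PCa-specific death flag

      # Cox proportional hazards: hazard ratio per risk allele for PCSM.
      df = pd.DataFrame({"snp": snp, "time": follow_up, "event": death})
      cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
      print(cph.summary[["exp(coef)", "p"]])

      # Logistic regression treating subjects with the event as "fatal cases"
      # and the rest as controls (a loose stand-in for the case-control arm).
      lr = LogisticRegression().fit(snp.reshape(-1, 1), death.astype(bool))
      print("per-allele odds ratio:", float(np.exp(lr.coef_[0, 0])))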

  1. Exploring Constrained Creative Communication

    DEFF Research Database (Denmark)

    Sørensen, Jannick Kirk

    2017-01-01

    Creative collaboration via online tools offers a less ‘media rich’ exchange of information between participants than face-to-face collaboration. The participants’ freedom to communicate is restricted in the means of communication, and rectified in terms of the possibilities offered in the interface. How do...... these constraints influence the creative process and the outcome? In order to isolate the communication problem from the interface and technology problems, we examine the creative communication on an open-ended task in a highly constrained setting, a design game. Via an experiment, the relation...... between communicative constraints and participants’ perception of dialogue and creativity is examined. Four batches of the game were conducted and documented with students preparing to form semester project groups. Students were asked to create an unspecified object without any exchange of communication except...

  2. Choosing health, constrained choices.

    Science.gov (United States)

    Chee Khoon Chan

    2009-12-01

    In parallel with the neo-liberal retrenchment of the welfarist state, an increasing emphasis on the responsibility of individuals in managing their own affairs and their well-being has been evident. In the health arena for instance, this was a major theme permeating the UK government's White Paper Choosing Health: Making Healthy Choices Easier (2004), which appealed to an ethos of autonomy and self-actualization through activity and consumption which merited esteem. As a counterpoint to this growing trend of informed responsibilization, constrained choices (constrained agency) provides a useful framework for a judicious balance and sense of proportion between an individual behavioural focus and a focus on societal, systemic, and structural determinants of health and well-being. Constrained choices is also a conceptual bridge between responsibilization and population health which could be further developed within an integrative biosocial perspective one might refer to as the social ecology of health and disease.

  3. New genomic structure for prostate cancer specific gene PCA3 within BMCC1: implications for prostate cancer detection and progression.

    Directory of Open Access Journals (Sweden)

    Raymond A Clarke

    Full Text Available The prostate cancer antigen 3 (PCA3/DD3) gene is a highly specific biomarker upregulated in prostate cancer (PCa). In order to understand the importance of PCA3 in PCa we investigated the organization and evolution of the PCA3 gene locus. We have employed cDNA synthesis, RT-PCR and DNA sequencing to identify 4 new transcription start sites, 4 polyadenylation sites and 2 new differentially spliced exons in an extended form of PCA3. Primers designed from these novel PCA3 exons greatly improve RT-PCR based discrimination between PCa, PCa metastases and BPH specimens. Comparative genomic analyses demonstrated that PCA3 has only recently evolved in an anti-sense orientation within a second gene, BMCC1/PRUNE2. BMCC1 has been shown previously to interact with RhoA and RhoC, determinants of cellular transformation and metastasis, respectively. Using RT-PCR we demonstrated that the longer BMCC1-1 isoform - like PCA3 - is upregulated in PCa tissues and metastases and in PCa cell lines. Furthermore, PCA3 and BMCC1-1 levels are responsive to dihydrotestosterone treatment. Upregulation of two new PCA3 isoforms in PCa tissues improves discrimination between PCa and BPH. The functional relevance of this specificity is now of particular interest given PCA3's overlapping association with a second gene BMCC1, a regulator of Rho signalling. Upregulation of PCA3 and BMCC1 in PCa has potential for improved diagnosis.

  4. Opioid Patient Controlled Analgesia (PCA) use during the Initial Experience with the IMPROVE PCA Trial: A Phase III Analgesic Trial for Hospitalized Sickle Cell Patients with Painful Episodes

    OpenAIRE

    Dampier, Carlton D.; Smith, Wally R.; Kim, Hae-Young; Wager, Carrie Greene; Bell, Margaret C.; Minniti, Caterina P.; Keefer, Jeffrey; Hsu, Lewis; Krishnamurti, Lakshmanan; Mack, A. Kyle; McClish, Donna; McKinlay, Sonja M.; Miller, Scott T.; Osunkwo, Ifeyinwa; Seaman, Phillip

    2011-01-01

    Opioid analgesics administered by patient-controlled analgesia (PCA) are frequently used for pain relief in children and adults with sickle cell disease (SCD) hospitalized for persistent vaso-occlusive pain, but optimum opioid dosing is not known. To better define PCA dosing recommendations, a multi-center phase III clinical trial was conducted comparing two alternative opioid PCA dosing strategies (HDLI-higher demand dose with low constant infusion or LDHI- lower demand dose and higher const...

  5. Clinical utility of the PCA3 urine assay in European men scheduled for repeat biopsy.

    NARCIS (Netherlands)

    Haese, A.; Taille, A. De La; Poppel, H. van; Marberger, M.; Stenzl, A.; Mulders, P.F.A.; Huland, H.; Abbou, C.C.; Remzi, M.; Tinzl, M.; Feyerabend, S.; Stillebroer, A.B.; Gils, M.P.M.Q.; Schalken, J.A.

    2008-01-01

    BACKGROUND: The Prostate CAncer gene 3 (PCA3) assay has shown promise as an aid in prostate cancer (pCA) diagnosis in identifying men with a high probability of a positive (repeat) biopsy. OBJECTIVE: This study evaluated the clinical utility of the PROGENSA PCA3 assay. DESIGN, SETTING, AND

  6. Constrained superfields in supergravity

    Energy Technology Data Exchange (ETDEWEB)

    Dall’Agata, Gianguido; Farakos, Fotis [Dipartimento di Fisica ed Astronomia “Galileo Galilei”, Università di Padova,Via Marzolo 8, 35131 Padova (Italy); INFN, Sezione di Padova,Via Marzolo 8, 35131 Padova (Italy)

    2016-02-16

    We analyze constrained superfields in supergravity. We investigate the consistency and solve all known constraints, presenting a new class that may have interesting applications in the construction of inflationary models. We provide the superspace Lagrangians for minimal supergravity models based on them and write the corresponding theories in component form using a simplifying gauge for the goldstino couplings.

  7. Minimal constrained supergravity

    Energy Technology Data Exchange (ETDEWEB)

    Cribiori, N. [Dipartimento di Fisica e Astronomia “Galileo Galilei”, Università di Padova, Via Marzolo 8, 35131 Padova (Italy); INFN, Sezione di Padova, Via Marzolo 8, 35131 Padova (Italy); Dall' Agata, G., E-mail: dallagat@pd.infn.it [Dipartimento di Fisica e Astronomia “Galileo Galilei”, Università di Padova, Via Marzolo 8, 35131 Padova (Italy); INFN, Sezione di Padova, Via Marzolo 8, 35131 Padova (Italy); Farakos, F. [Dipartimento di Fisica e Astronomia “Galileo Galilei”, Università di Padova, Via Marzolo 8, 35131 Padova (Italy); INFN, Sezione di Padova, Via Marzolo 8, 35131 Padova (Italy); Porrati, M. [Center for Cosmology and Particle Physics, Department of Physics, New York University, 4 Washington Place, New York, NY 10003 (United States)

    2017-01-10

    We describe minimal supergravity models where supersymmetry is non-linearly realized via constrained superfields. We show that the resulting actions differ from the so called “de Sitter” supergravities because we consider constraints eliminating directly the auxiliary fields of the gravity multiplet.

  8. Minimal constrained supergravity

    Directory of Open Access Journals (Sweden)

    N. Cribiori

    2017-01-01

    Full Text Available We describe minimal supergravity models where supersymmetry is non-linearly realized via constrained superfields. We show that the resulting actions differ from the so called “de Sitter” supergravities because we consider constraints eliminating directly the auxiliary fields of the gravity multiplet.

  9. Minimal constrained supergravity

    International Nuclear Information System (INIS)

    Cribiori, N.; Dall'Agata, G.; Farakos, F.; Porrati, M.

    2017-01-01

    We describe minimal supergravity models where supersymmetry is non-linearly realized via constrained superfields. We show that the resulting actions differ from the so called “de Sitter” supergravities because we consider constraints eliminating directly the auxiliary fields of the gravity multiplet.

  10. Biometric identification based on feature fusion with PCA and SVM

    Science.gov (United States)

    Lefkovits, László; Lefkovits, Szidónia; Emerich, Simina

    2018-04-01

    Biometric identification is gaining ground compared to traditional identification methods. Many biometric measurements may be used for secure human identification. The most reliable among them is the iris pattern because of its uniqueness, stability, unforgeability and inalterability over time. The approach presented in this paper is a fusion of different feature descriptor methods such as HOG, LIOP, LBP, used for extracting iris texture information. The classifiers obtained through the SVM and PCA methods demonstrate the effectiveness of our system applied to one and both irises. The performances measured are highly accurate and foreshadow a fusion system with a rate of identification approaching 100% on the UPOL database.
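
    A minimal version of the fused-descriptor pipeline (HOG plus a uniform-LBP histogram, PCA compression, then an SVM) might look as follows. LIOP is omitted because it has no off-the-shelf scikit-image implementation, the images are random placeholders, and every parameter value is a guess rather than a setting from the paper.

      import numpy as np
      from skimage.feature import hog, local_binary_pattern
      from sklearn.decomposition import PCA
      from sklearn.pipeline import make_pipeline
      from sklearn.svm import SVC

      rng = np.random.default_rng(3)

      def iris_descriptor(img):
          """Concatenate HOG features and a uniform-LBP histogram for one image."""
          h = hog(img, orientations=9, pixels_per_cell=(16, 16),
                  cells_per_block=(2, 2))
          lbp = local_binary_pattern(img, P=8, R=1, method="uniform")
          lbp_hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
          return np.concatenate([h, lbp_hist])

      # Placeholder normalized iris images (64x64): 5 subjects, 8 samples each.
      images = rng.random((40, 64, 64))
      labels = np.repeat(np.arange(5), 8)
      features = np.array([iris_descriptor(im) for im in images])

      # PCA reduces the fused descriptor before the SVM classifier.
      model = make_pipeline(PCA(n_components=20), SVC(kernel="rbf", C=10.0))
      model.fit(features, labels)
      print("accuracy on the synthetic training set:", model.score(features, labels))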

  11. Innovative Comparison of Transient Ignition Temperature at the Booster Interface, New Stainless Steel Pyrovalve Primer Chamber Assembly "V" (PCA) Design Versus the Current Aluminum "Y" PCA Design

    Science.gov (United States)

    Saulsberry, Regor L.; McDougle, Stephen H.; Garcia, Roberto; Johnson, Kenneth L.; Sipes, William; Rickman, Steven; Hosangadi, Ashvin

    2011-01-01

    An assessment of four spacecraft pyrovalve anomalies that occurred during ground testing was conducted by the NASA Engineering & Safety Center (NESC) in 2008. In all four cases, a common aluminum (Al) primer chamber assembly (PCA) was used with dual NASA Standard Initiators (NSIs) and the nearly simultaneous (separated by less than 80 microseconds) firing of both initiators failed to ignite the booster charge. The results of the assessment and associated test program were reported in AIAA Paper AIAA-2008-4798, NESC Independent Assessment of Pyrovalve Ground Test Anomalies. As a result of the four Al PCA anomalies, and the test results and findings of the NESC assessment, the Mars Science Laboratory (MSL) project team decided to make changes to the PCA. The material for the PCA body was changed from aluminum (Al) to stainless steel (SS) to avoid melting, distortion, and potential leakage of the NSI flow passages when the device functioned. The flow passages, which were interconnected in a Y-shaped configuration (Y-PCA) in the original design, were changed to a V-shaped configuration (V-PCA). The V-shape was used to more efficiently transfer energy from the NSIs to the booster. Development and qualification testing of the new design clearly demonstrated faster booster ignition times compared to the legacy AL Y-PCA design. However, the final NESC assessment report recommended that the SS V-PCA be experimentally characterized and quantitatively compared to the Al Y-PCA design. This data was deemed important for properly evaluating the design options for future NASA projects. This test program has successfully quantified the improvement of the SS V-PCA over the Al Y-PCA. A phase B of the project was also conducted and evaluated the effect of firing command skew and enlargement of flame channels to further assist spacecraft applications.

  12. On the use of A PCA as a multichannel time analyzer

    International Nuclear Information System (INIS)

    Adib, M.; Abdelkawy, A.; Abuelela, M.; Habib, N.; Wahba, M.; Salama, F.

    1992-01-01

    The PCA and PCA-11 software programmes have been used to operate the Nucleus personal computer analyzer PCA-8000 in its multichannel scaler (MCS) mode. The operating conditions of the PCA-8000 were selected to match the time-of-flight (TOF) spectrometer in operation at the ET-RR-1 reactor. The results of measuring the main parameters of the PCA-8000 operating in its MCS mode showed that it can be successfully used as a multichannel time analyzer. 5 figs.

  13. Performance evaluation of PCA-based spike sorting algorithms.

    Science.gov (United States)

    Adamos, Dimitrios A; Kosmidis, Efstratios K; Theophilidis, George

    2008-09-01

    Deciphering the electrical activity of individual neurons from multi-unit noisy recordings is critical for understanding complex neural systems. A widely used spike sorting algorithm is being evaluated for single-electrode nerve trunk recordings. The algorithm is based on principal component analysis (PCA) for spike feature extraction. In the neuroscience literature it is generally assumed that the use of the first two or most commonly three principal components is sufficient. We estimate the optimum PCA-based feature space by evaluating the algorithm's performance on simulated series of action potentials. A number of modifications are made to the open source nev2lkit software to enable systematic investigation of the parameter space. We introduce a new metric to define clustering error considering over-clustering more favorable than under-clustering as proposed by experimentalists for our data. Both the program patch and the metric are available online. Correlated and white Gaussian noise processes are superimposed to account for biological and artificial jitter in the recordings. We report that the employment of more than three principal components is in general beneficial for all noise cases considered. Finally, we apply our results to experimental data and verify that the sorting process with four principal components is in agreement with a panel of electrophysiology experts.
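
    The core of such an evaluation (project spikes onto a varying number of principal components, cluster, and score against the known units) can be sketched as below. The spike templates are synthetic, KMeans stands in for the package's clustering stage, and the adjusted Rand index replaces the paper's custom over-/under-clustering metric.

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.decomposition import PCA
      from sklearn.metrics import adjusted_rand_score

      rng = np.random.default_rng(4)

      # Three synthetic spike templates (48 samples) plus noisy realizations,
      # standing in for the simulated action-potential series.
      t = np.linspace(0, 1, 48)
      templates = np.stack([a * np.exp(-((t - c) / 0.08) ** 2)
                            for c, a in [(0.30, 1.0), (0.45, 0.8), (0.60, 1.2)]])
      true_unit = rng.integers(0, 3, size=600)
      spikes = templates[true_unit] + 0.15 * rng.standard_normal((600, 48))

      # Sorting quality as a function of the PCA feature-space dimension.
      for n_pc in (2, 3, 4, 5):
          feats = PCA(n_components=n_pc).fit_transform(spikes)
          labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(feats)
          print(n_pc, "PCs -> adjusted Rand index",
                round(adjusted_rand_score(true_unit, labels), 3))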

  14. MD-11 PCA - View of aircraft on ramp

    Science.gov (United States)

    1995-01-01

    This McDonnell Douglas MD-11 is taxiing to a position on the flightline at NASA's Dryden Flight Research Center, Edwards, California, following its completion of the first and second landings ever performed by a transport aircraft under engine power only (on Aug. 29, 1995). The milestone flight, with NASA research pilot and former astronaut Gordon Fullerton at the controls, was part of a NASA project to develop a computer-assisted engine control system that enables a pilot to land a plane safely when its normal control surfaces are disabled. The Propulsion-Controlled Aircraft (PCA) system uses standard autopilot controls already present in the cockpit, together with the new programming in the aircraft's flight control computers. The PCA concept is simple. For pitch control, the program increases thrust to climb and reduces thrust to descend. To turn right, the autopilot increases the left engine thrust while decreasing the right engine thrust. The initial Propulsion-Controlled Aircraft studies by NASA were carried out at Dryden with a modified twin-engine F-15 research aircraft.

  15. MD-11 PCA - Closeup view of aircraft on ramp

    Science.gov (United States)

    1995-01-01

    This McDonnell Douglas MD-11 has taxied to a position on the flightline at NASA's Dryden Flight Research Center, Edwards, California, following its completion of the first and second landings ever performed by a transport aircraft under engine power only (on Aug. 29, 1995). The milestone flight, with NASA research pilot and former astronaut Gordon Fullerton at the controls, was part of a NASA project to develop a computer-assisted engine control system that enables a pilot to land a plane safely when its normal control surfaces are disabled. The Propulsion-Controlled Aircraft (PCA) system uses standard autopilot controls already present in the cockpit, together with the new programming in the aircraft's flight control computers. The PCA concept is simple. For pitch control, the program increases thrust to climb and reduces thrust to descend. To turn right, the autopilot increases the left engine thrust while decreasing the right engine thrust. The initial Propulsion-Controlled Aircraft studies by NASA were carried out at Dryden with a modified twin-engine F-15 research aircraft.

  16. PCA safety data review after clinical decision support and smart pump technology implementation.

    Science.gov (United States)

    Prewitt, Judy; Schneider, Susan; Horvath, Monica; Hammond, Julia; Jackson, Jason; Ginsberg, Brian

    2013-06-01

    Medication errors account for 20% of medical errors in the United States, with the largest risk at prescribing and administration. Analgesics or opioids are frequently used medications that can be associated with patient harm when prescribed or administered improperly. In an effort to decrease medication errors, Duke University Hospital implemented clinical decision support via computer provider order entry (CPOE) and "smart pump" technology in February 2008, with the goal of decreasing patient-controlled analgesia (PCA) adverse events. This project evaluated PCA safety events, reviewing the voluntary report system and adverse drug events detected via surveillance (ADE-S), on intermediate and step-down units before and after implementation of clinical decision support via CPOE and PCA smart pumps for the prescribing and administration of opioid therapy in adult patients requiring analgesia for acute pain. Voluntary report system and ADE-S PCA events decreased when normalized to 1000 PCA days; ADE-S PCA events per 1000 PCA days decreased 22%, from 5.3 (pre) to 4.2 (post) (P = 0.09). Voluntary report system events decreased 72%, from 2.4/1000 PCA days (pre) to 0.66/1000 PCA days (post), and this decrease was statistically significant. PCA events thus decreased between time periods in both the ADE-S and voluntary report system data, supporting the recommendation of clinical decision support via CPOE and PCA smart pump technology.

  17. Fault detection of feed water treatment process using PCA-WD with parameter optimization.

    Science.gov (United States)

    Zhang, Shirong; Tang, Qian; Lin, Yu; Tang, Yuling

    2017-05-01

    The feed water treatment process (FWTP) is an essential part of utility boilers, and fault detection is expected to improve its reliability. Classical principal component analysis (PCA) has been applied to FWTPs in our previous work; however, the noise in the T² and SPE statistics results in false detections and missed detections. In this paper, wavelet denoising (WD) is combined with PCA to form a new algorithm (PCA-WD), in which WD is employed specifically to deal with this noise. The parameter selection of PCA-WD is further formulated as an optimization problem, and PSO is employed to solve it. A FWTP sustaining two 1000 MW generation units in a coal-fired power plant is taken as the study case, and its operation data were collected for the subsequent verification study. The results show that the optimized WD is effective in suppressing the noise in the T² and SPE statistics, thereby improving the performance of the PCA-WD algorithm. The parameter optimization also enables PCA-WD to obtain its optimal parameters automatically rather than from individual experience. The optimized PCA-WD is further compared with classical PCA and sliding window PCA (SWPCA) in terms of four cases: bias fault, drift fault, broken line fault, and normal condition. The advantages of the optimized PCA-WD over classical PCA and SWPCA are confirmed by the results. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
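
    The monitoring statistics at the heart of PCA-WD are easy to sketch: a PCA model of normal operation gives Hotelling T² and SPE for new samples, and the WD step smooths those statistics before they are compared with control limits. The fragment below uses PyWavelets for the denoising, random stand-in data, and fixed (not PSO-optimized) parameters, so it only illustrates the structure of the method.

      import numpy as np
      import pywt                                  # PyWavelets, assumed available
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(5)
      train = rng.standard_normal((500, 6))        # stand-in normal operation data
      test = rng.standard_normal((200, 6))
      test[120:, 2] += 4.0                         # injected bias fault

      pca = PCA(n_components=3).fit(train)

      def t2_spe(model, X):
          """Hotelling T2 and squared prediction error (SPE) for each sample."""
          scores = model.transform(X)
          t2 = np.sum(scores ** 2 / model.explained_variance_, axis=1)
          residual = X - model.inverse_transform(scores)
          return t2, np.sum(residual ** 2, axis=1)

      def wavelet_denoise(signal, wavelet="db4", level=3):
          """Soft-threshold the detail coefficients (the WD step of PCA-WD)."""
          coeffs = pywt.wavedec(signal, wavelet, level=level)
          sigma = np.median(np.abs(coeffs[-1])) / 0.6745
          thr = sigma * np.sqrt(2.0 * np.log(len(signal)))
          coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
          return pywt.waverec(coeffs, wavelet)[: len(signal)]

      t2, spe = t2_spe(pca, test)
      spe_denoised = wavelet_denoise(spe)
      print("mean denoised SPE before / after the fault onset:",
            round(float(spe_denoised[:120].mean()), 2),
            round(float(spe_denoised[120:].mean()), 2))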

  18. Posterior cerebral artery involvement in moyamoya disease: initial infarction and angle between PCA and basilar artery.

    Science.gov (United States)

    Lee, Ji Yeoun; Kim, Seung-Ki; Cheon, Jung-Eun; Choi, Jung Won; Phi, Ji Hoon; Kim, In-One; Cho, Byung-Kyu; Wang, Kyu-Chang

    2013-12-01

    Moyamoya disease (MMD) is a chronic cerebrovascular occlusive disease, and progressive involvement of the posterior cerebral artery (PCA) has been reported. However, the majority of MMD articles address issues related to the classic anterior circulation. This study investigates the preoperative factors related to the long-term outcome of the posterior circulation in MMD. A retrospective review was performed of 88 MMD patients (166 PCAs in either hemisphere) without symptomatic disease involvement of the PCA at initial diagnosis. Data at initial diagnosis regarding age, presence of infarction, status of the PCA, type of posterior communicating artery, and the angle between the PCA and basilar artery were reviewed. Progressive stenosis of the PCA was evaluated by symptoms or radiological imaging during follow-up. During an average follow-up of 8.3 years, 29 of the 166 (18%) evaluated PCAs showed progressive disease involvement. The average time to progression from the initial operation was 4.9 years, with the latest onset at 10.8 years. The patients who showed progressive stenosis of the PCA tended to be younger, to present with infarction, to have a smaller angle between the PCA and basilar artery, and to have asymptomatic stenosis of the PCA at initial presentation. However, multivariate analysis confirmed only the presence of initial infarction and a smaller angle between the PCA and basilar artery to be significantly associated with progressive stenosis of the PCA. Involvement of the PCA in MMD may occur in a delayed fashion, years after the completion of revascularization of the anterior circulation. Persistent long-term follow-up of the posterior circulation is recommended.

  19. PCA and vTEC climatology at midnight over mid-latitude regions

    Science.gov (United States)

    Natali, M. P.; Meza, A.

    2017-12-01

    The effect of the thermospheric vertical neutral wind on vertical total electron content (vTEC) variations including longitudinal anomaly, remaining winter anomaly, mid-latitude summer night anomaly, and semiannual anomaly is studied at mid-latitude regions around zero magnetic declination at midnight during high solar activity. By using the principal component analysis (PCA) numerical technique, this work studies the spatial and temporal variations of the ionosphere at midnight over mid-latitude regions during 2000-2002. PCA is applied to a time series of global vTEC maps produced by the International Global Navigation Satellite System (GNSS) Service. Four regions were studied in particular, each located at mid-latitude and approximately centered at zero magnetic declination, with two in the northern hemisphere and two in southern hemisphere, and all are located near and far from geomagnetic poles in each case. This technique provides an effective method to analyze the main ionospheric variabilities at mid-latitudes. PCA is also applied to the vTEC computed using the International Reference Ionosphere (IRI) 2012 model, to analyze the capability of this model to represent ionospheric variabilities at mid-latitude. Also, the Horizontal Wind Model 2007 (HWM07) is used to improve our climatology interpretation, by analyzing the relationship between vTEC and thermospheric wind, both quantitatively and qualitatively. At midnight, the behavior of mean vTEC values strongly responds to vertical wind variation, experiencing a decrease of about 10-15% with the action of the positive vertical component of the field-aligned neutral wind lasting for 2 h in all regions except for Oceania. Notable results include: a significant increase toward higher latitudes during summer in the South America and Asia regions, associated with the mid-latitude summer night anomaly, and an increase toward higher latitudes in winter in the North America and Oceania regions, highlighting the
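
    The PCA step applied to the vTEC map series is the classical EOF decomposition: flatten each map, centre the series in time, and take the SVD. The sketch below runs it on a synthetic year of nightly maps with a built-in annual cycle, simply to show how the spatial modes and temporal coefficients are obtained; it does not use IGS or IRI data.

      import numpy as np

      rng = np.random.default_rng(6)

      # Synthetic midnight vTEC maps: 365 nights on a 10 x 20 lat/lon grid,
      # with an annual modulation that grows towards higher latitudes.
      n_days, n_lat, n_lon = 365, 10, 20
      day = np.arange(n_days)
      annual = np.cos(2.0 * np.pi * day / 365.25)
      maps = (10.0
              + 3.0 * annual[:, None, None] * np.linspace(1, 2, n_lat)[None, :, None]
              + 0.5 * rng.standard_normal((n_days, n_lat, n_lon)))

      # PCA/EOF of the map time series.
      X = maps.reshape(n_days, -1)
      X = X - X.mean(axis=0)
      u, s, vt = np.linalg.svd(X, full_matrices=False)

      explained = s ** 2 / np.sum(s ** 2)
      pc1 = u[:, 0] * s[0]                   # temporal coefficient of mode 1
      eof1 = vt[0].reshape(n_lat, n_lon)     # spatial pattern of mode 1

      print("variance explained by the first mode:", round(float(explained[0]), 3))
      print("EOF1 grid shape:", eof1.shape)
      print("correlation of PC1 with the annual cycle:",
            round(float(np.corrcoef(pc1, annual)[0, 1]), 3))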

  20. Constrained Vapor Bubble Experiment

    Science.gov (United States)

    Gokhale, Shripad; Plawsky, Joel; Wayner, Peter C., Jr.; Zheng, Ling; Wang, Ying-Xi

    2002-11-01

    Microgravity experiments on the Constrained Vapor Bubble Heat Exchanger, CVB, are being developed for the International Space Station. In particular, we present results of a precursory experimental and theoretical study of the vertical Constrained Vapor Bubble in the Earth's environment. A novel non-isothermal experimental setup was designed and built to study the transport processes in an ethanol/quartz vertical CVB system. Temperature profiles were measured using an in situ PC (personal computer)-based LabView data acquisition system via thermocouples. Film thickness profiles were measured using interferometry. A theoretical model was developed to predict the curvature profile of the stable film in the evaporator. The concept of the total amount of evaporation, which can be obtained directly by integrating the experimental temperature profile, was introduced. Experimentally measured curvature profiles are in good agreement with modeling results. For microgravity conditions, an analytical expression, which reveals an inherent relation between temperature and curvature profiles, was derived.

  1. Constrained noninformative priors

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1994-10-01

    The Jeffreys noninformative prior distribution for a single unknown parameter is the distribution corresponding to a uniform distribution in the transformed model where the unknown parameter is approximately a location parameter. To obtain a prior distribution with a specified mean but with diffusion reflecting great uncertainty, a natural generalization of the noninformative prior is the distribution corresponding to the constrained maximum entropy distribution in the transformed model. Examples are given

  2. Identification of an IL-1-induced gene expression pattern in AR+ PCa cells that mimics the molecular phenotype of AR- PCa cells.

    Science.gov (United States)

    Thomas-Jardin, Shayna E; Kanchwala, Mohammed S; Jacob, Joan; Merchant, Sana; Meade, Rachel K; Gahnim, Nagham M; Nawas, Afshan F; Xing, Chao; Delk, Nikki A

    2018-06-01

    In immunosurveillance, bone-derived immune cells infiltrate the tumor and secrete inflammatory cytokines to destroy cancer cells. However, cancer cells have evolved mechanisms to usurp inflammatory cytokines to promote tumor progression. In particular, the inflammatory cytokine, interleukin-1 (IL-1), is elevated in prostate cancer (PCa) patient tissue and serum, and promotes PCa bone metastasis. IL-1 also represses androgen receptor (AR) accumulation and activity in PCa cells, yet the cells remain viable and tumorigenic; suggesting that IL-1 may also contribute to AR-targeted therapy resistance. Furthermore, IL-1 and AR protein levels negatively correlate in PCa tumor cells. Taken together, we hypothesize that IL-1 reprograms AR positive (AR+) PCa cells into AR negative (AR-) PCa cells that co-opt IL-1 signaling to ensure AR-independent survival and tumor progression in the inflammatory tumor microenvironment. LNCaP and PC3 PCa cells were treated with IL-1β or HS-5 bone marrow stromal cell (BMSC) conditioned medium and analyzed by RNA sequencing and RT-QPCR. To verify genes identified by RNA sequencing, LNCaP, MDA-PCa-2b, PC3, and DU145 PCa cell lines were treated with the IL-1 family members, IL-1α or IL-1β, or exposed to HS-5 BMSC in the presence or absence of Interleukin-1 Receptor Antagonist (IL-1RA). Treated cells were analyzed by western blot and/or RT-QPCR. Comparative analysis of sequencing data from the AR+ LNCaP PCa cell line versus the AR- PC3 PCa cell line reveals an IL-1-conferred gene suite in LNCaP cells that is constitutive in PC3 cells. Bioinformatics analysis of the IL-1 regulated gene suite revealed that inflammatory and immune response pathways are primarily elicited; likely facilitating PCa cell survival and tumorigenicity in an inflammatory tumor microenvironment. Our data supports that IL-1 reprograms AR+ PCa cells to mimic AR- PCa gene expression patterns that favor AR-targeted treatment resistance and cell survival. © 2018 Wiley

  3. Cost effectiveness analysis of screening in the early diagnosis of prostate cancer (PCA)

    International Nuclear Information System (INIS)

    Mueller-Lisse, U.G.; Mueller-Lisse, U.L.

    2002-01-01

    Purpose. The authors attempted to provide an overview of current concepts and the status of research in the field of cost effectiveness analysis (CEA) of screening for prostate cancer (PCA).Material and methods. Basic concepts and methods of CEA were reviewed. Examples of CEA-related studies of PCA were obtained from pertinent literature through medical databases.Results. Screening for PCA has so far been restricted to limited groups of health care recipients, usually within the framework of clinical trials. In those trials, screening for PCA usually results in higher numbers of PCAs being detected at lower average stages in a given population. As a consequence of screening, the rate of potentially curable PCAs increases. However, it has not yet been demonstrated that screening for PCA decreases PCA-related mortality or morbidity from metastatic PCA. On the other hand, additional costs are associated with the screening measure and with increased use of resources for diagnosis and treatment of the additional PCAs detected through screening.Conclusions. Throughout the European Union and North America, mass screening for PCA has not been implemented. This may chiefly be due to the current lack of information on long term benefits of PCA screening, particularly disease-specific survival. Currently, major studies are underway to assess the effects of PCA screening and its cost effectiveness. These studies include the US-American prostate, lung, colon and ovary trials (PLCO) and the European randomised study of Screening for Prostate Cancer (ERSPC). (orig.) [de

  4. Prostate-Specific Antigen (PSA) Screening and New Biomarkers for Prostate Cancer (PCa).

    Science.gov (United States)

    Stephan, Carsten; Rittenhouse, Harry; Hu, Xinhai; Cammann, Henning; Jung, Klaus

    2014-04-01

    PSA screening reduces PCa mortality, but the disadvantages of overdiagnosis and overtreatment require multivariable risk-prediction tools to select appropriate treatment or active surveillance. This review explains the differences between the two largest screening trials and discusses the drawbacks of screening and its meta-analyses. The current American and European screening strategies are described. Nonetheless, PSA is one of the most widely used tumor markers and strongly correlates with the risk of harboring PCa. However, while PSA has limitations for PCa detection with its low specificity, there are several potential biomarkers presented in this review with utility for PCa currently being studied. There is an urgent need for new biomarkers, especially to detect clinically significant and aggressive PCa. Of all PSA-based markers, the FDA-approved prostate health index (phi) shows improved specificity over percent free and total PSA. Another kallikrein panel, 4K, which includes KLK2, has recently shown promise in clinical research studies but has not yet undergone formal validation studies. In urine, prostate cancer gene 3 (PCA3) has also been validated and approved by the FDA for its utility to detect PCa. The potential correlation of PCA3 with cancer aggressiveness requires more clinical studies. The detection of the fusion of androgen-regulated genes with genes of the regulatory transcription factors in the tissue of ~50% of all PCa patients is a milestone in PCa research. A combination of the urinary assays for TMPRSS2:ERG gene fusion and PCA3 shows an improved accuracy for PCa detection. Overall, the field of PCa biomarker discovery is very exciting and prospective.

  5. Prostate-Specific Antigen (PSA) Screening and New Biomarkers for Prostate Cancer (PCa)

    Science.gov (United States)

    Rittenhouse, Harry; Hu, Xinhai; Cammann, Henning; Jung, Klaus

    2014-01-01

    PSA screening reduces PCa mortality, but the disadvantages of overdiagnosis and overtreatment require multivariable risk-prediction tools to select appropriate treatment or active surveillance. This review explains the differences between the two largest screening trials and discusses the drawbacks of screening and its meta-analyses. The current American and European screening strategies are described. Nonetheless, PSA is one of the most widely used tumor markers and strongly correlates with the risk of harboring PCa. However, while PSA has limitations for PCa detection with its low specificity, there are several potential biomarkers presented in this review with utility for PCa currently being studied. There is an urgent need for new biomarkers, especially to detect clinically significant and aggressive PCa. Of all PSA-based markers, the FDA-approved prostate health index (phi) shows improved specificity over percent free and total PSA. Another kallikrein panel, 4K, which includes KLK2, has recently shown promise in clinical research studies but has not yet undergone formal validation studies. In urine, prostate cancer gene 3 (PCA3) has also been validated and approved by the FDA for its utility to detect PCa. The potential correlation of PCA3 with cancer aggressiveness requires more clinical studies. The detection of the fusion of androgen-regulated genes with genes of the regulatory transcription factors in the tissue of ~50% of all PCa patients is a milestone in PCa research. A combination of the urinary assays for TMPRSS2:ERG gene fusion and PCA3 shows an improved accuracy for PCa detection. Overall, the field of PCa biomarker discovery is very exciting and prospective. PMID:27683457

  6. Ring-constrained Join

    DEFF Research Database (Denmark)

    Yiu, Man Lung; Karras, Panagiotis; Mamoulis, Nikos

    2008-01-01

    We introduce a novel spatial join operator, the ring-constrained join (RCJ). Given two sets P and Q of spatial points, the result of RCJ consists of pairs (p, q) (where p ∈ P, q ∈ Q) satisfying an intuitive geometric constraint: the smallest circle enclosing p and q contains no other points in P, Q... This new operation has important applications in decision support, e.g., placing recycling stations at fair locations between restaurants and residential complexes. Clearly, RCJ is defined based on a geometric constraint but not on distances between points. Thus, our operation is fundamentally different......
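
    The geometric predicate itself can be stated in a few lines: the smallest circle enclosing p and q has its centre at their midpoint and radius half their distance, and the pair qualifies only if no other point of P or Q falls inside that circle. The brute-force sketch below is purely illustrative (the paper's contribution is evaluating RCJ efficiently) and uses made-up coordinates for the recycling-station example.

      from itertools import product
      from math import dist

      def ring_constrained_join(P, Q):
          """Brute-force RCJ: keep (p, q) whose smallest enclosing circle
          contains no other point of P or Q."""
          points = list(P) + list(Q)
          result = []
          for p, q in product(P, Q):
              centre = ((p[0] + q[0]) / 2.0, (p[1] + q[1]) / 2.0)
              radius = dist(p, q) / 2.0
              if all(r in (p, q) or dist(centre, r) > radius for r in points):
                  result.append((p, q))
          return result

      restaurants = [(0.0, 0.0), (4.0, 0.0)]
      residential = [(2.0, 1.0), (2.0, 3.0)]
      print(ring_constrained_join(restaurants, residential))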

  7. PCA-derived factors that may be predictive of postoperative pain in pediatric patients: a possible role for the PCA ratio.

    Science.gov (United States)

    McDonnell, Conor; Pehora, Carolyne; Crawford, Mark W

    2012-01-01

    No method exists to reliably predict which patients will develop severe postoperative pain. The authors hypothesized that data derived from patient-controlled analgesia (PCA) pumps (specifically, the ratio of patient demands to pump deliveries) may predict which patients would develop severe pain after scoliosis repair. Quaternary, university-affiliated, pediatric hospital. Forty American Society of Anesthesiologists I-II pediatric patients who had undergone elective scoliosis repair and had consented to recruitment to a randomized clinical trial investigating the effects of early morphine administration on remifentanil-induced hyperalgesia. To test the hypothesis of the current study, the authors calculated the PCA ratio of demand to delivery at every 4 hours throughout the first 24 hours after surgery for all the patients recruited to the original study. The authors compared calculated PCA ratios, numeric rating scale pain scores, and cumulative morphine consumption for those patients who developed severe postoperative pain and met the criteria for opioid rotation versus those patients who did not. Seven patients required opioid rotation from PCA morphine to PCA hydromorphone. Eight hours after surgery, the median PCA ratio for those seven patients (2.5 [range, 1.8-4.3]) was significantly greater than that for all other recruited patients (1.3 [range, 0-2.7]). Patients at risk of severe postoperative pain may therefore be identifiable from their PCA ratios of demand to delivery as early as 8 hours after surgery.
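
    The quantity itself is easy to compute from a pump event log; a toy version is shown below. The abstract does not say whether the ratio was cumulative or per window, so the sketch simply bins demands and deliveries into consecutive 4-hour windows; all numbers are invented.

      import pandas as pd

      # Toy PCA-pump log: hour post-op, button presses (demands) and doses
      # actually given (deliveries) recorded at each nursing check.
      log = pd.DataFrame({
          "hours_post_op": [1, 2, 3, 4, 5, 6, 7, 8, 10, 12],
          "demands":       [3, 5, 6, 6, 8, 9, 9, 10, 7, 6],
          "deliveries":    [3, 4, 4, 4, 4, 4, 3, 3, 3, 3],
      })

      # Demand-to-delivery ratio in consecutive 4-hour windows over day one.
      window = pd.cut(log["hours_post_op"], bins=range(0, 28, 4))
      summary = log.groupby(window, observed=True)[["demands", "deliveries"]].sum()
      summary["pca_ratio"] = summary["demands"] / summary["deliveries"]
      print(summary)

      # In the study, a median ratio of about 2.5 at 8 hours post-op distinguished
      # the patients who later required opioid rotation.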

  8. Opioid Patient Controlled Analgesia (PCA) use during the Initial Experience with the IMPROVE PCA Trial: A Phase III Analgesic Trial for Hospitalized Sickle Cell Patients with Painful Episodes

    Science.gov (United States)

    Dampier, Carlton D.; Smith, Wally R.; Kim, Hae-Young; Wager, Carrie Greene; Bell, Margaret C.; Minniti, Caterina P.; Keefer, Jeffrey; Hsu, Lewis; Krishnamurti, Lakshmanan; Mack, A. Kyle; McClish, Donna; McKinlay, Sonja M.; Miller, Scott T.; Osunkwo, Ifeyinwa; Seaman, Phillip; Telen, Marilyn J.; Weiner, Debra L.

    2015-01-01

    Opioid analgesics administered by patient-controlled analgesia (PCA) are frequently used for pain relief in children and adults with sickle cell disease (SCD) hospitalized for persistent vaso-occlusive pain, but optimum opioid dosing is not known. To better define PCA dosing recommendations, a multi-center phase III clinical trial was conducted comparing two alternative opioid PCA dosing strategies (HDLI-higher demand dose with low constant infusion or LDHI- lower demand dose and higher constant infusion) in 38 subjects who completed randomization prior to trial closure. Total opioid utilization (morphine equivalents, mg/kg) in 22 adults was 11.6 ± 2.6 and 4.7 ± 0.9 in the HDLI and in the LDHI arms, respectively, and in 12 children it was 3.7 ± 1.0 and 5.8 ± 2.2, respectively. Opioid-related symptoms were mild and similar in both PCA arms (mean daily opioid symptom intensity score: HDLI 0.9 ± 0.1, LDHI 0.9 ± 0.2). The slow enrollment and early study termination limited conclusions regarding superiority of either treatment regimen. This study adds to our understanding of opioid PCA usage in SCD. Future clinical trial protocol designs for opioid PCA may need to consider potential differences between adults and children in PCA usage. PMID:21953763

  9. Shallow-Land Buriable PCA-type austenitic stainless steel for fusion application

    International Nuclear Information System (INIS)

    Zucchetti, M.

    1991-01-01

    Neutron-induced activity in the PCA (Primary Candidate Alloy) austenitic stainless steel is examined when the alloy is used for first-wall components in a DEMO fusion reactor. Some low-activity definitions, based on different waste management and disposal concepts, are introduced. Activity in the PCA is so high that any recycling of the irradiated material can be excluded. Disposal of PCA radioactive wastes by Shallow-Land Burial (SLB) is prevented as well. Mo, Nb and some impurity elements have to be removed or limited, in order to reduce the radioactivity of the PCA. Possible low-activity versions of the PCA are introduced (PCA-la); they meet the requirements for SLB and may also be recycled under certain conditions. (author)

  10. Patient perspectives of patient-controlled analgesia (PCA) and methods for improving pain control and patient satisfaction.

    Science.gov (United States)

    Patak, Lance S; Tait, Alan R; Mirafzali, Leela; Morris, Michelle; Dasgupta, Sunavo; Brummett, Chad M

    2013-01-01

    This study aimed to (1) identify patient-controlled analgesia (PCA) attributes that negatively impact patient satisfaction and ability to control pain while using PCA and (2) obtain data on patient perceptions of new PCA design features. We conducted a prospective survey study of postoperative pain control among patients using a PCA device. The survey was designed to evaluate patient satisfaction with pain control, understanding of PCA, difficulties using PCA, lockout-period management, and evaluation of new PCA design features. A total of 350 eligible patients completed the survey (91%). Patients who had difficulties using PCA were less satisfied and less able to control their pain while using PCA. Forty-nine percent of patients reported not knowing if they would receive medicine when they pushed the PCA button, and of these, 22% believed that this uncertainty made their pain worse. The majority of patients preferred the proposed PCA design features for easier use, including a light on the button, making it easier to find (57%), and a PCA button that vibrates (55%) or lights up (70%), alerting the patient that the PCA pump is able to deliver more medicine. A majority of patients, irrespective of their satisfaction with PCA, preferred a new PCA design. Certain attributes of current PCA technology may negatively impact patient experience, and modifications could potentially address these concerns and improve patient outcomes.

  11. Epileptic seizure detection in EEG signal with GModPCA and support vector machine.

    Science.gov (United States)

    Jaiswal, Abeg Kumar; Banka, Haider

    2017-01-01

    Epilepsy is one of the most common neurological disorders caused by recurrent seizures. Electroencephalograms (EEGs) record neural activity and can be used to detect epilepsy. Visual inspection of an EEG signal for epileptic seizure detection is a time-consuming process and may lead to human error; therefore, a number of automated seizure detection frameworks have recently been proposed to replace these traditional methods. Feature extraction and classification are two important steps in these procedures. Feature extraction focuses on finding the informative features that could be used for classification and correct decision-making. Therefore, proposing effective feature extraction techniques for seizure detection is of great significance. Principal Component Analysis (PCA) is a dimensionality reduction technique used in different fields of pattern recognition, including EEG signal classification. Global modular PCA (GModPCA) is a variation of PCA. In this paper, an effective framework with GModPCA and a Support Vector Machine (SVM) is presented for epileptic seizure detection in EEG signals. The feature extraction is performed with GModPCA, whereas an SVM trained with a radial basis function kernel performs the classification between seizure and nonseizure EEG signals. Seven different experimental cases were conducted on the benchmark epilepsy EEG dataset. The system performance was evaluated using 10-fold cross-validation. In addition, we prove analytically that GModPCA has lower time and space complexity than PCA. The experimental results show that EEG signals have strong inter-sub-pattern correlations. GModPCA and SVM were able to achieve 100% accuracy for the classification between normal and epileptic signals. Along with this, seven different experimental cases were tested. The classification results of the proposed approach were better than those of some of the existing methods proposed in the literature. It is also found that the time and space
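
    A stripped-down version of the feature-extraction-plus-classification chain is sketched below: each EEG epoch is split into modules, PCA is applied per module and the scores are concatenated (a simplified stand-in for GModPCA), and an RBF-kernel SVM is scored with 10-fold cross-validation. The signals are synthetic, and unlike a rigorous evaluation the PCA step here is fitted on all epochs rather than inside each fold.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(7)

      # Synthetic EEG epochs (200 per class, 1024 samples each); "seizure"
      # epochs get extra low-frequency rhythmic power as a crude stand-in.
      t = np.arange(1024) / 173.61
      normal = rng.standard_normal((200, 1024))
      seizure = (rng.standard_normal((200, 1024))
                 + 2.0 * np.sin(2.0 * np.pi * 5.0 * t) * rng.random((200, 1)))
      X = np.vstack([normal, seizure])
      y = np.array([0] * 200 + [1] * 200)

      def modular_pca_features(epochs, n_modules=4, n_components=8):
          """Per-module PCA scores, concatenated (simplified GModPCA stand-in)."""
          parts = np.split(epochs, n_modules, axis=1)
          return np.hstack([PCA(n_components=n_components).fit_transform(p)
                            for p in parts])

      features = modular_pca_features(X)
      clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
      scores = cross_val_score(clf, features, y, cv=10)
      print("10-fold accuracy: %.3f +/- %.3f" % (scores.mean(), scores.std()))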

  12. Comparative analysis of the PCA3 gene expression in sediments and exosomes isolated from urine

    Directory of Open Access Journals (Sweden)

    D. S. Mikhaylenko

    2017-01-01

    Full Text Available Introduction. Prostate cancer (PCa) is one of the most common oncological diseases in men. Expression of the PCA3 gene in urine is currently used as a molecular genetic marker of PCa. Objective: to perform a comparative analysis of PCA3 expression in urine sediments and exosomes, in order to determine which biomaterial allows PCA3 expression to be detected more efficiently. Materials and methods. Twelve patients with different stages of PCa and 8 control samples were examined. Results. The diagnostic accuracy of the PCA3 gene expression analysis in this cohort exceeded 90%. We did not obtain significant differences in the sensitivity and specificity of PCA3 hyperexpression in urine sediments compared with exosomes. This result favors using urine sediment as the biomaterial for PCA3 analysis, since its sample preparation is less time-consuming, although the possible advantage of exosomes for the analysis of expression marker panels requires further study.

  13. A Multi-Time Scale Morphable Software Milieu for Polymorphous Computing Architectures (PCA) - Composable, Scalable Systems

    National Research Council Canada - National Science Library

    Skjellum, Anthony

    2004-01-01

    Polymorphous Computing Architectures (PCA) rapidly "morph" (reorganize) software and hardware configurations in order to achieve high performance on computation styles ranging from specialized streaming to general threaded applications...

  14. PCA-1/ALKBH3 contributes to pancreatic cancer by supporting apoptotic resistance and angiogenesis.

    Science.gov (United States)

    Yamato, Ichiro; Sho, Masayuki; Shimada, Keiji; Hotta, Kiyohiko; Ueda, Yuko; Yasuda, Satoshi; Shigi, Naoko; Konishi, Noboru; Tsujikawa, Kazutake; Nakajima, Yoshiyuki

    2012-09-15

    The PCA-1/ALKBH3 gene implicated in DNA repair is expressed in several human malignancies but its precise contributions to cancer remain mainly unknown. In this study, we have determined its functions and clinical importance in pancreatic cancer. PCA-1/ALKBH3 functions in proliferation, apoptosis and angiogenesis were evaluated in human pancreatic cancer cells in vitro and in vivo. Further, PCA-1/ALKBH3 expression in 116 patients with pancreatic cancer was evaluated by immunohistochemistry. siRNA-mediated silencing of PCA-1/ALKBH3 expression induced apoptosis and suppressed cell proliferation. Conversely, overexpression of PCA-1/ALKBH3 increased anchorage-independent growth and invasiveness. In addition, PCA-1/ALKBH3 silencing downregulated VEGF expression and inhibited angiogenesis in vivo. Furthermore, immunohistochemical analysis showed that PCA-1/ALKBH3 expression was abundant in pancreatic cancer tissues, where it correlated with advanced tumor status, pathological stage and VEGF intensity. Importantly, patients with low positivity of PCA-1/ALKBH3 expression had improved postoperative prognosis compared with those with high positivity. Our results establish PCA-1/ALKBH3 as important gene in pancreatic cancer with potential utility as a therapeutic target in this fatal disease.

  15. Tensile properties of unirradiated PCA from room temperature to 700°C

    International Nuclear Information System (INIS)

    Braski, D.N.; Maziasz, P.J.

    1983-01-01

    The tensile properties of Prime Candidate Alloy (PCA) austenitic stainless steel after three different thermomechanical treatments were determined from room temperature to 700°C. The solution-annealed PCA had the lowest strength and highest ductility, while the reverse was true for the 25%-cold-worked material. The PCA containing titanium-rich MC particles fell between the other two heats. The cold-worked PCA had nearly the same tensile properties as cold-worked type 316 stainless steel. Both alloys showed ductility minima at 300°C.

  16. Quality of Life and Sexual Health in the Aging of PCa Survivors.

    Science.gov (United States)

    Gacci, Mauro; Baldi, Elisabetta; Tamburrino, Lara; Detti, Beatrice; Livi, Lorenzo; De Nunzio, Cosimo; Tubaro, Andrea; Gravas, Stavros; Carini, Marco; Serni, Sergio

    2014-01-01

    Prostate cancer (PCa) is the most common malignancy in elderly men. The progressive ageing of the world male population will further increase the need for tailored assessment and treatment of PCa patients. The determinant role of androgens and sexual hormones for PCa growth and progression has been established. However, several trials on androgens and PCa have recently focused on urinary continence, quality of life, and sexual function, suggesting a new point of view on the whole endocrinological aspect of PCa. During aging, metabolic syndrome, including diabetes, hypertension, dyslipidemia, and central obesity, can be associated with a chronic, low-grade inflammation of the prostate and with changes in the sex steroid pathways. These factors may affect both the carcinogenesis processes and treatment outcomes of PCa. Any treatment for PCa can have a long-lasting negative impact on quality of life and sexual health, which should be assessed by validated self-reported questionnaires. In particular, sexual health, urinary continence, and bowel function can be worsened after prostatectomy, radiotherapy, or hormone treatment, mostly in the elderly population. In the present review we summarized the current knowledge on the role of hormones, metabolic features, and primary treatments for PCa on the quality of life and sexual health of elderly PCa survivors.

  17. Polymorphous Computing Architecture (PCA) Application Benchmark 1: Three-Dimensional Radar Data Processing

    National Research Council Canada - National Science Library

    Lebak, J

    2001-01-01

    The DARPA Polymorphous Computing Architecture (PCA) program is building advanced computer architectures that can reorganize their computation and communication structures to achieve better overall application performance...

  18. Sharp spatially constrained inversion

    DEFF Research Database (Denmark)

    Vignoli, Giulio G.; Fiandaca, Gianluca G.; Christiansen, Anders Vest C A.V.C.

    2013-01-01

    We present sharp reconstruction of multi-layer models using a spatially constrained inversion with minimum gradient support regularization. In particular, its application to airborne electromagnetic data is discussed. Airborne surveys produce extremely large datasets, traditionally inverted...... by using smoothly varying 1D models. Smoothness is a result of the regularization constraints applied to address the inversion ill-posedness. The standard Occam-type regularized multi-layer inversion produces results where boundaries between layers are smeared. The sharp regularization overcomes...... inversions are compared against classical smooth results and available boreholes. With the focusing approach, the obtained blocky results agree with the underlying geology and allow for easier interpretation by the end-user....

  19. Temporal Concurrent Constraint Programming

    DEFF Research Database (Denmark)

    Valencia, Frank Dan

    Concurrent constraint programming (ccp) is a formalism for concurrency in which agents interact with one another by telling (adding) and asking (reading) information in a shared medium. Temporal ccp extends ccp by allowing agents to be constrained by time conditions. This dissertation studies...... temporal ccp by developing a process calculus called ntcc. The ntcc calculus generalizes the tcc model, the latter being a temporal ccp model for deterministic and synchronous timed reactive systems. The calculus is built upon a few basic ideas but it captures several aspects of timed systems. As tcc, ntcc...... structures, robotic devices, multi-agent systems and music applications. The calculus is provided with a denotational semantics that captures the reactive computations of processes in the presence of arbitrary environments. The denotation is proven to be fully-abstract for a substantial fragment...

  20. Statistical Downscaling Output GCM Modeling with Continuum Regression and Pre-Processing PCA Approach

    Directory of Open Access Journals (Sweden)

    Sutikno Sutikno

    2010-08-01

    Full Text Available One of the climate models used to predict climatic conditions is the Global Circulation Model (GCM). A GCM is a computer-based model built from numerical, deterministic equations that follow physical laws. GCMs are the main tools for predicting climate and weather, and they serve as a primary information source for assessing the effects of climate change. The Statistical Downscaling (SD) technique is used to bridge the large-scale GCM output with the small scale of a study area. GCM data are spatial and temporal, so spatial correlation is likely between data at different grid points within a single domain. This multicollinearity requires pre-processing of the predictor variables X. Continuum Regression (CR) with Principal Component Analysis (PCA) pre-processing is an alternative approach to SD modelling. CR, developed by Stone and Brooks (1990), is a generalization of the Ordinary Least Squares (OLS), Principal Component Regression (PCR) and Partial Least Squares (PLS) methods, and is used to overcome multicollinearity problems. Data processing for the stations at Ambon, Pontianak, Losarang, Indramayu and Yuntinyuat shows that, for the 8x8 and 12x12 domains, the CR method produces better RMSEP and predictive R2 values than PCR and PLS.
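
    A minimal sketch of the idea on synthetic grid data: the Continuum Regression estimator itself is not implemented here; PCA pre-processing of collinear predictors followed by ordinary regression (i.e., principal component regression) stands in for it, and all dimensions and signals below are assumptions for illustration.

```python
# Hedged sketch: PCA pre-processing of collinear "GCM grid" predictors, then
# regression (PCR as a stand-in for Continuum Regression); data are synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_months, n_grid = 240, 64              # e.g. an 8x8 domain observed monthly
base = rng.normal(size=(n_months, 4))   # a few underlying climate signals
X = base @ rng.normal(size=(4, n_grid)) + 0.1 * rng.normal(size=(n_months, n_grid))
y = 3.0 * base[:, 0] + rng.normal(scale=0.5, size=n_months)   # station rainfall proxy

# Standardize, compress the collinear grid into a few PCs, then regress.
pcr = make_pipeline(StandardScaler(), PCA(n_components=4), LinearRegression())
rmsep = -cross_val_score(pcr, X, y, cv=5, scoring="neg_root_mean_squared_error").mean()
print("cross-validated RMSEP:", round(float(rmsep), 3))
```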

  1. PCA/HEXTE Observations of Coma and A2319

    Science.gov (United States)

    Rephaeli, Yoel

    1998-01-01

    The Coma cluster was observed in 1996 for 90 ks by the PCA and HEXTE instruments aboard the RXTE satellite, the first simultaneous, pointing measurement of Coma in the broad, 2-250 keV, energy band. The high sensitivity achieved during this long observation allows precise determination of the spectrum. Our analysis of the measurements clearly indicates that in addition to the main thermal emission from hot intracluster gas at kT=7.5 keV, a second spectral component is required to best-fit the data. If thermal, it can be described with a temperature of 4.7 keV contributing about 20% of the total flux. The additional spectral component can also be described by a power-law, possibly due to Compton scattering of relativistic electrons by the CMB. This interpretation is based on the diffuse radio synchrotron emission, which has a spectral index of 2.34, within the range allowed by fits to the RXTE spectral data. A Compton origin of the measured nonthermal component would imply that the volume-averaged magnetic field in the central region of Coma is B =0.2 micro-Gauss, a value deduced directly from the radio and X-ray measurements (and thus free of the usual assumption of energy equipartition). Barring the presence of unknown systematic errors in the RXTE source or background measurements, our spectral analysis yields considerable evidence for Compton X-ray emission in the Coma cluster.

  2. Behavior of the PCA3 gene in the urine of men with high grade prostatic intraepithelial neoplasia.

    Science.gov (United States)

    Morote, Juan; Rigau, Marina; Garcia, Marta; Mir, Carmen; Ballesteros, Carlos; Planas, Jacques; Raventós, Carles X; Placer, José; de Torres, Inés M; Reventós, Jaume; Doll, Andreas

    2010-12-01

    An ideal marker for the early detection of prostate cancer (PCa) should also differentiate between men with isolated high grade prostatic intraepithelial neoplasia (HGPIN) and those with PCa. Prostate Cancer Gene 3 (PCA3) is a highly specific PCa gene and its score, in relation to the PSA gene in post-prostate massage urine (PMU-PCA3), seems to be useful in ruling out PCa, especially after a negative prostate biopsy. Because PCA3 is also expressed in the HGPIN lesion, the aim of this study was to determine the efficacy of PMU-PCA3 scores for ruling out PCa in men with previous HGPIN. The PMU-PCA3 score was assessed by quantitative PCR (multiplex research assay) in 244 men subjected to prostate biopsy: 64 men with an isolated HGPIN (no cancer detected after two or more repeated biopsies), 83 men with PCa and 97 men with benign pathology findings (BP: no PCa, HGPIN or ASAP). The median PMU-PCA3 score was 1.56 in men with BP, 2.01 in men with HGPIN (p = 0.128) and 9.06 in men with PCa (p = 0.008). The AUC in the ROC analysis was 0.705 in the subset of men with BP and PCa, while it decreased to 0.629 when only men with isolated HGPIN and PCa were included in the analysis. Fixing the sensitivity of the PMU-PCA3 score at 90%, its specificity was 79% in men with BP and 69% in men with isolated HGPIN. The efficacy of the PMU-PCA3 score to rule out PCa in men with HGPIN is lower than in men with BP.

  3. Developing and Evaluating Creativity Gamification Rehabilitation System: The Application of PCA-ANFIS Based Emotions Model

    Science.gov (United States)

    Su, Chung-Ho; Cheng, Ching-Hsue

    2016-01-01

    This study aims to explore the factors in a patient's rehabilitation achievement after a total knee replacement (TKR) patient exercises, using a PCA-ANFIS emotion model-based game rehabilitation system, which combines virtual reality (VR) and motion capture technology. The researchers combine a principal component analysis (PCA) and an adaptive…

  4. The relationship between Prostate CAncer gene 3 (PCA3) and prostate cancer significance

    NARCIS (Netherlands)

    van Poppel, Hein; Haese, Alexander; Graefen, Markus; de la Taille, Alexandre; Irani, Jacques; de Reijke, Theo; Remzi, Mesut; Marberger, Michael

    2012-01-01

    OBJECTIVE To evaluate the relationship between Prostate CAncer gene 3 (PCA3) and prostate cancer significance. PATIENTS AND METHODS Clinical data from two multi-centre European open-label, prospective studies evaluating the clinical utility of the PCA3 assay in guiding initial and repeat biopsy

  5. The role of PCA3 in the diagnosis of prostate cancer.

    NARCIS (Netherlands)

    Hessels, D.

    2010-01-01

    Serum PSA has shown to be the most valuable tool in the detection, staging and monitoring of prostate cancer (PCa). However, the substantial overlap in serum PSA values between men with non-malignant prostatic diseases and PCa is the limitation of PSA as a prostate tumor marker. In patients with

  6. Protocatechuic acid (PCA) induced a better antiviral effect by immune enhancement in SPF chickens.

    Science.gov (United States)

    Guo, Yongxia; Zhang, Qiang; Zuo, Zonghui; Chu, Jun; Xiao, Hongzhi; Javed, M Tariq; He, Cheng

    2018-01-01

    Protocatechuic acid (PCA) is an antiviral agent against Avian Influenza virus (AIV) and Infectious Bursal Disease (IBD) virus, but its antiviral mechanism is unknown. In this study, we evaluated the humoral and cellular responses to PCA in specific pathogen-free (SPF) chickens. One hundred forty 35-day-old SPF chickens were randomly divided into 7 groups. The birds were inoculated with the commercial, attenuated Newcastle Disease Virus (NDV) vaccine and then received 10, 20 or 40 mg/kg body weight of PCA orally for 30 days. Immune organ indexes, anti-Newcastle Disease Virus (NDV) antibodies and lymphocyte proliferation, but not body weight, were significantly increased in chickens treated with 40 mg/kg PCA, compared to the control birds treated with Astragalus polysaccharide (ASP). Survival rates were 70% and 60% in the chickens treated with 40 mg/kg and 20 mg/kg PCA, respectively, while 50% survival was found in the birds treated with 125 mg/kg ASP. PCA treatment resulted in significantly lower viral load and reduced shedding. These results indicate that PCA may improve poultry health by enhancing both the humoral and cellular immune response. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  7. GO-PCA: An Unsupervised Method to Explore Gene Expression Data Using Prior Knowledge.

    Science.gov (United States)

    Wagner, Florian

    2015-01-01

    Genome-wide expression profiling is a widely used approach for characterizing heterogeneous populations of cells, tissues, biopsies, or other biological specimens. The exploratory analysis of such data typically relies on generic unsupervised methods, e.g. principal component analysis (PCA) or hierarchical clustering. However, generic methods fail to exploit prior knowledge about the molecular functions of genes. Here, I introduce GO-PCA, an unsupervised method that combines PCA with nonparametric GO enrichment analysis, in order to systematically search for sets of genes that are both strongly correlated and closely functionally related. These gene sets are then used to automatically generate expression signatures with functional labels, which collectively aim to provide a readily interpretable representation of biologically relevant similarities and differences. The robustness of the results obtained can be assessed by bootstrapping. I first applied GO-PCA to datasets containing diverse hematopoietic cell types from human and mouse, respectively. In both cases, GO-PCA generated a small number of signatures that represented the majority of lineages present, and whose labels reflected their respective biological characteristics. I then applied GO-PCA to human glioblastoma (GBM) data, and recovered signatures associated with four out of five previously defined GBM subtypes. My results demonstrate that GO-PCA is a powerful and versatile exploratory method that reduces an expression matrix containing thousands of genes to a much smaller set of interpretable signatures. In this way, GO-PCA aims to facilitate hypothesis generation, design of further analyses, and functional comparisons across datasets.
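
    A simplified, hypothetical sketch of the general idea only: PCA loadings are scored against a made-up gene set with a hypergeometric test. It is not the published GO-PCA implementation, and the expression matrix and "GO term" below are assumptions for illustration.

```python
# Hedged sketch: test whether the top-loading genes of each PC are enriched for a
# (toy) functional gene set, via a hypergeometric tail probability.
import numpy as np
from scipy.stats import hypergeom
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 200))                     # 50 samples x 200 genes (synthetic)
gene_sets = {"GO:toy_term": set(range(0, 20))}     # hypothetical annotation

pca = PCA(n_components=5).fit(X)
top_k = 30
for pc in range(pca.n_components_):
    top_genes = set(np.argsort(-np.abs(pca.components_[pc]))[:top_k])
    for term, members in gene_sets.items():
        overlap = len(top_genes & members)
        # P(overlap >= observed) for 200 genes, |members| annotated, top_k drawn
        p = hypergeom.sf(overlap - 1, X.shape[1], len(members), top_k)
        print(f"PC{pc + 1} {term}: overlap={overlap}, p={p:.3g}")
```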

  8. Enabling Agility through Coordinating Temporally Constrained Planning Agents

    NARCIS (Netherlands)

    Steenhuisen, J.R.; De Weerdt, M.M.; Witteveen, C.

    2007-01-01

    In crisis response, hierarchical organizations are being replaced by dynamic assemblies of autonomous agents that promise more agility. However, these autonomous agents might cause a decrease in effectiveness when individually constructed plans for moderately-coupled tasks are not jointly feasible.

  9. Structure constrained semi-nonnegative matrix factorization for EEG-based motor imagery classification.

    Science.gov (United States)

    Lu, Na; Li, Tengfei; Pan, Jinjin; Ren, Xiaodong; Feng, Zuren; Miao, Hongyu

    2015-05-01

    Electroencephalogram (EEG) provides a non-invasive approach to measure the electrical activities of brain neurons and has long been employed for the development of brain-computer interface (BCI). For this purpose, various patterns/features of EEG data need to be extracted and associated with specific events like cue-paced motor imagery. However, this is a challenging task since EEG data are usually non-stationary time series with a low signal-to-noise ratio. In this study, we propose a novel method, called structure constrained semi-nonnegative matrix factorization (SCS-NMF), to extract the key patterns of EEG data in time domain by imposing the mean envelopes of event-related potentials (ERPs) as constraints on the semi-NMF procedure. The proposed method is applicable to general EEG time series, and the temporal features extracted by SCS-NMF can also be combined with other features in frequency domain to improve the performance of motor imagery classification. Real data experiments have been performed using the SCS-NMF approach for motor imagery classification, and the results clearly suggest the superiority of the proposed method. Comparison experiments have also been conducted. The compared methods include ICA, PCA, Semi-NMF, Wavelets, EMD and CSP, which further verified the effectiveness of SCS-NMF. The SCS-NMF method obtained performance better than or competitive with state-of-the-art methods, which provides a novel solution for brain pattern analysis from the perspective of structure constraint. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Early cosmology constrained

    Energy Technology Data Exchange (ETDEWEB)

    Verde, Licia; Jimenez, Raul [Institute of Cosmos Sciences, University of Barcelona, IEEC-UB, Martí Franquès, 1, E08028 Barcelona (Spain); Bellini, Emilio [University of Oxford, Denys Wilkinson Building, Keble Road, Oxford, OX1 3RH (United Kingdom); Pigozzo, Cassio [Instituto de Física, Universidade Federal da Bahia, Salvador, BA (Brazil); Heavens, Alan F., E-mail: liciaverde@icc.ub.edu, E-mail: emilio.bellini@physics.ox.ac.uk, E-mail: cpigozzo@ufba.br, E-mail: a.heavens@imperial.ac.uk, E-mail: raul.jimenez@icc.ub.edu [Imperial Centre for Inference and Cosmology (ICIC), Imperial College, Blackett Laboratory, Prince Consort Road, London SW7 2AZ (United Kingdom)

    2017-04-01

    We investigate our knowledge of early universe cosmology by exploring how much additional energy density can be placed in different components beyond those in the ΛCDM model. To do this we use a method to separate early- and late-universe information enclosed in observational data, thus markedly reducing the model-dependency of the conclusions. We find that the 95% credibility regions for extra energy components of the early universe at recombination are: non-accelerating additional fluid density parameter Ω_MR < 0.006 and extra radiation parameterised as extra effective neutrino species 2.3 < N_eff < 3.2 when imposing flatness. Our constraints thus show that even when analyzing the data in this largely model-independent way, the possibility of hiding extra energy components beyond ΛCDM in the early universe is seriously constrained by current observations. We also find that the standard ruler, the sound horizon at radiation drag, can be well determined in a way that does not depend on late-time Universe assumptions, but depends strongly on early-time physics and in particular on additional components that behave like radiation. We find that the standard ruler length determined in this way is r_s = 147.4 ± 0.7 Mpc if the radiation and neutrino components are standard, but the uncertainty increases by an order of magnitude when non-standard dark radiation components are allowed, to r_s = 150 ± 5 Mpc.

  11. The biological knowledge discovery by PCCF measure and PCA-F projection.

    Science.gov (United States)

    Jia, Xingang; Zhu, Guanqun; Han, Qiuhong; Lu, Zuhong

    2017-01-01

    In the process of biological knowledge discovery, PCA is commonly used to complement the clustering analysis, but PCA typically gives poor visualizations for most gene expression data sets. Here, we propose a PCCF measure, and use PCA-F to display clusters of PCCF, where PCCF and PCA-F are modeled from the modified cumulative probabilities of genes. From the analysis of simulated and experimental data sets, we demonstrate that PCCF is more appropriate and reliable for analyzing gene expression data compared to other commonly used distances or similarity measures, and PCA-F is a good visualization technique for identifying clusters of PCCF; here we target data sets in which the expression values of genes are collected at different time points.

  12. Constraining neutrinoless double beta decay

    International Nuclear Information System (INIS)

    Dorame, L.; Meloni, D.; Morisi, S.; Peinado, E.; Valle, J.W.F.

    2012-01-01

    A class of discrete flavor-symmetry-based models predicts constrained neutrino mass matrix schemes that lead to specific neutrino mass sum-rules (MSR). We show how these theories may constrain the absolute scale of neutrino mass, leading in most of the cases to a lower bound on the neutrinoless double beta decay effective amplitude.

  13. Global Incidence and Mortality for Prostate Cancer: Analysis of Temporal Patterns and Trends in 36 Countries.

    Science.gov (United States)

    Wong, Martin C S; Goggins, William B; Wang, Harry H X; Fung, Franklin D H; Leung, Colette; Wong, Samuel Y S; Ng, Chi Fai; Sung, Joseph J Y

    2016-11-01

    Prostate cancer (PCa) is a leading cause of mortality and morbidity globally, but its specific geographic patterns and temporal trends are under-researched. To test the hypotheses that PCa incidence is higher and PCa mortality is lower in countries with higher socioeconomic development, and that temporal trends for PCa incidence have increased while mortality has decreased over time. Data on age-standardized incidence and mortality rates in 2012 were retrieved from the GLOBOCAN database. Temporal patterns were assessed for 36 countries using data obtained from Cancer incidence in five continents volumes I-X and the World Health Organization mortality database. Correlations between incidence or mortality rates and socioeconomic indicators (human development index [HDI] and gross domestic product [GDP]) were evaluated. The average annual percent change in PCa incidence and mortality in the most recent 10 yr according to join-point regression. Reported PCa incidence rates varied more than 25-fold worldwide in 2012, with the highest incidence rates observed in Micronesia/Polynesia, the USA, and European countries. Mortality rates paralleled the incidence rates except for Africa, where PCa mortality rates were the highest. Countries with higher HDI (r=0.58) and per capita GDP (r=0.62) reported greater incidence rates. According to the most recent 10-yr temporal data available, most countries experienced increases in incidence, with sharp rises in incidence rates in Asia and Northern and Western Europe. A substantial reduction in mortality rates was reported in most countries, except in some Asian countries and Eastern Europe, where mortality increased. Data in regional registries could be underestimated. PCa incidence has increased while PCa mortality has decreased in most countries. The reported incidence was higher in countries with higher socioeconomic development. The incidence of prostate cancer has shown high variations geographically and over time, with smaller

  14. [The value of PHI/PCA3 in the early diagnosis of prostate cancer].

    Science.gov (United States)

    Tan, S J; Xu, L W; Xu, Z; Wu, J P; Liang, K; Jia, R P

    2016-01-12

    To investigate the value of prostate health index (PHI) and prostate cancer gene 3 (PCA3) in the early diagnosis of prostate cancer (PCa). A total of 190 patients with abnormal serum prostate specific antigen (PSA) or abnormal digital rectal examination were enrolled. All underwent an initial biopsy, and 11 of them also underwent a repeated biopsy. In addition, 25 healthy cases (with normal digital rectal examination and normal PSA) were included. PHI and PCA3 were detected by using immunofluorescence and Loop-Mediated Isothermal Amplification (LAMP). The sensitivity and specificity of diagnosis were determined by ROC curve. In addition, the relationship between PHI/PSA and the Gleason score and clinical stage was analyzed. A total of 89 patients were confirmed to have PCa by pathological diagnosis. The other 101 patients were diagnosed as benign prostatic hyperplasia (BPH). The sensitivity and specificity of the PCA3 test were 85.4% and 92.1%, respectively. The area under the curve (AUC) of PHI was higher than that of PSA (0.727 > 0.699). The PHI in peripheral blood was positively correlated with Gleason score and clinical stage. The detection of PCA3 and PHI shows excellent detecting effectiveness. Compared with single PSA, the combined detection of PHI and PCA3 improved the diagnostic specificity. It can provide a new method for the early diagnosis of prostate cancer and avoid unnecessary biopsies.

  15. GND-PCA-based statistical modeling of diaphragm motion extracted from 4D MRI.

    Science.gov (United States)

    Swastika, Windra; Masuda, Yoshitada; Xu, Rui; Kido, Shoji; Chen, Yen-Wei; Haneishi, Hideaki

    2013-01-01

    We analyzed a statistical model of diaphragm motion using regular principal component analysis (PCA) and generalized N-dimensional PCA (GND-PCA). First, we generate 4D MRI of respiratory motion from 2D MRI using an intersection profile method. We then extract semiautomatically the diaphragm boundary from the 4D-MRI to get subject-specific diaphragm motion. In order to build a general statistical model of diaphragm motion, we normalize the diaphragm motion in time and spatial domains and evaluate the diaphragm motion model of 10 healthy subjects by applying regular PCA and GND-PCA. We also validate the results using the leave-one-out method. The results show that the first three principal components of regular PCA contain more than 98% of the total variation of diaphragm motion. However, validation using the leave-one-out method gives a mean error of up to 5.0 mm for right diaphragm motion and 3.8 mm for left diaphragm motion. Model analysis using GND-PCA provides a margin of error of about 1 mm and is able to reconstruct the diaphragm model from fewer samples.
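
    A hedged sketch of the leave-one-out evaluation with regular PCA on synthetic, already time- and space-normalized motion vectors; the GND-PCA tensor extension and the 4D-MRI processing are not shown, and all sizes below are assumptions.

```python
# Hedged sketch: leave-one-out test of a regular PCA motion model on synthetic
# diaphragm trajectories flattened to fixed-length vectors.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
n_subjects, n_samples = 10, 300
basis = rng.normal(size=(3, n_samples))    # shared motion modes
motions = rng.normal(size=(n_subjects, 3)) @ basis \
    + 0.05 * rng.normal(size=(n_subjects, n_samples))

errors = []
for i in range(n_subjects):
    train = np.delete(motions, i, axis=0)                       # leave subject i out
    pca = PCA(n_components=3).fit(train)
    recon = pca.inverse_transform(pca.transform(motions[i:i + 1]))
    errors.append(np.abs(recon - motions[i]).mean())
print("mean leave-one-out reconstruction error:", round(float(np.mean(errors)), 4))
```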

  16. The applications of PCA in QSAR studies: A case study on CCR5 antagonists.

    Science.gov (United States)

    Yoo, ChangKyoo; Shahlaei, Mohsen

    2018-01-01

    Principal component analysis (PCA), as a well-known multivariate data analysis and data reduction technique, is an important and useful algebraic tool in drug design and discovery. PCA, in a typical quantitative structure-activity relationship (QSAR) study, analyzes an original data matrix in which molecules are described by several intercorrelated quantitative dependent variables (molecular descriptors). Although extensively applied, there is disparity in the literature with respect to the applications of PCA in QSAR studies. This study investigates the different applications of PCA in QSAR studies using a dataset including CCR5 inhibitors. Different types of preprocessing are used to compare PCA performance. The use of PC plots in the exploratory investigation of the matrix of descriptors is described. This work also shows PCA to be a powerful technique for exploring complex datasets in QSAR studies and for the identification of outliers. This study shows that PCA can easily be applied to the pool of calculated structural descriptors, and that the extracted information can be used to help decide upon an appropriate harder model for further analysis. © 2017 John Wiley & Sons A/S.
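
    A hedged sketch of one such use: flagging outlier molecules from PCA scores of an autoscaled descriptor pool via a simple Hotelling's T² cut-off. The descriptor matrix is synthetic and the paper's CCR5 data set and preprocessing variants are not reproduced.

```python
# Hedged sketch: PCA on a (synthetic) QSAR descriptor pool and a Hotelling's T^2
# limit to flag potential outlier molecules.
import numpy as np
from scipy.stats import f as f_dist
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
X = rng.normal(size=(80, 30))          # 80 molecules x 30 descriptors
X[:5] += 6.0                           # a few deliberately aberrant molecules

Z = StandardScaler().fit_transform(X)  # autoscaling, one common preprocessing choice
T = PCA(n_components=3).fit_transform(Z)
t2 = np.sum((T / T.std(axis=0, ddof=1)) ** 2, axis=1)

n, k = X.shape[0], T.shape[1]
limit = k * (n - 1) / (n - k) * f_dist.ppf(0.95, k, n - k)   # 95% T^2 limit
print("potential outliers:", np.where(t2 > limit)[0])
```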

  17. Periarticular infiltration for pain relief after total hip arthroplasty: a comparison with epidural and PCA analgesia.

    Science.gov (United States)

    Pandazi, Ageliki; Kanellopoulos, Ilias; Kalimeris, Konstantinos; Batistaki, Chrysanthi; Nikolakopoulos, Nikolaos; Matsota, Paraskevi; Babis, George C; Kostopanagiotou, Georgia

    2013-11-01

    Epidural and intravenous patient-controlled analgesia (PCA) are established methods for pain relief after total hip arthroplasty (THA). Periarticular infiltration is an alternative method that is gaining ground due to its simplicity and safety. Our study aims to assess the efficacy of periarticular infiltration in pain relief after THA. Sixty-three patients undergoing THA under spinal anaesthesia were randomly assigned to receive postoperative analgesia with continuous epidural infusion with ropivacaine (epidural group), intraoperative periarticular infiltration with ropivacaine, clonidine, morphine, epinephrine and corticosteroids (infiltration group) or PCA with morphine (PCA group). PCA morphine provided rescue analgesia in all groups. We recorded morphine consumption, visual analog scale (VAS) scores at rest and movement, blood loss from wound drainage, mean arterial pressure (MAP) and adverse effects at 1, 6, 12, 24 h postoperatively. Morphine consumption at all time points, as well as VAS scores at rest (6, 12 and 24 h) and at movement (6 and 12 h postoperatively), were lower in the infiltration group compared to the PCA group. Periarticular infiltration was superior to PCA with morphine after THA, providing better pain relief and lower opioid consumption postoperatively. Infiltration seems to be equally effective to epidural analgesia without having the potential side effects of the latter.

  18. Constrained evolution in numerical relativity

    Science.gov (United States)

    Anderson, Matthew William

    The strongest potential source of gravitational radiation for current and future detectors is the merger of binary black holes. Full numerical simulation of such mergers can provide realistic signal predictions and enhance the probability of detection. Numerical simulation of the Einstein equations, however, is fraught with difficulty. Stability even in static test cases of single black holes has proven elusive. Common to unstable simulations is the growth of constraint violations. This work examines the effect of controlling the growth of constraint violations by solving the constraints periodically during a simulation, an approach called constrained evolution. The effects of constrained evolution are contrasted with the results of unconstrained evolution, evolution where the constraints are not solved during the course of a simulation. Two different formulations of the Einstein equations are examined: the standard ADM formulation and the generalized Frittelli-Reula formulation. In most cases constrained evolution vastly improves the stability of a simulation at minimal computational cost when compared with unconstrained evolution. However, in the more demanding test cases examined, constrained evolution fails to produce simulations with long-term stability in spite of producing improvements in simulation lifetime when compared with unconstrained evolution. Constrained evolution is also examined in conjunction with a wide variety of promising numerical techniques, including mesh refinement and overlapping Cartesian and spherical computational grids. Constrained evolution in boosted black hole spacetimes is investigated using overlapping grids. Constrained evolution proves to be central to the host of innovations required in carrying out such intensive simulations.

  19. Distinct clinical and metabolic deficits in PCA and AD are not related to amyloid distribution.

    Science.gov (United States)

    Rosenbloom, M H; Alkalay, A; Agarwal, N; Baker, S L; O'Neil, J P; Janabi, M; Yen, I V; Growdon, M; Jang, J; Madison, C; Mormino, E C; Rosen, H J; Gorno-Tempini, M L; Weiner, M W; Miller, B L; Jagust, W J; Rabinovici, G D

    2011-05-24

    Patients with posterior cortical atrophy (PCA) often have Alzheimer disease (AD) at autopsy, yet are cognitively and anatomically distinct from patients with clinical AD. We sought to compare the distribution of β-amyloid and glucose metabolism in PCA and AD in vivo using Pittsburgh compound B (PiB) and FDG-PET. Patients with PCA (n = 12, age 57.5 ± 7.4, Mini-Mental State Examination [MMSE] 22.2 ± 5.1), AD (n = 14, age 58.8 ± 9.6, MMSE 23.8 ± 6.7), and cognitively normal controls (NC, n = 30, age 73.6 ± 6.4) underwent PiB and FDG-PET. Group differences in PiB distribution volume ratios (DVR, cerebellar reference) and FDG uptake (pons-averaged) were assessed on a voxel-wise basis and by comparing binding in regions of interest (ROIs). Compared to NC, both patients with AD and patients with PCA showed diffuse PiB uptake throughout frontal, temporoparietal, and occipital cortex, and the PiB distribution was similar in PCA and AD even after correcting for atrophy. FDG patterns in PCA and AD were distinct: while both groups showed hypometabolism compared to NC in temporoparietal cortex and precuneus/posterior cingulate, patients with PCA further showed hypometabolism in inferior occipitotemporal cortex compared to both NC and patients with AD. Fibrillar amyloid deposition in PCA is diffuse and similar to AD, while glucose hypometabolism extends more posteriorly into occipital cortex. Further studies are needed to determine the mechanisms of selective network degeneration in focal variants of AD.

  20. Can we use PCA to detect small signals in noisy data?

    Science.gov (United States)

    Spiegelberg, Jakob; Rusz, Ján

    2017-01-01

    Principal component analysis (PCA) is among the most commonly applied dimension reduction techniques suitable to denoise data. Focusing on its limitations to detect low variance signals in noisy data, we discuss how statistical and systematical errors occur in PCA reconstructed data as a function of the size of the data set, which extends the work of Lichtert and Verbeeck (2013) [16]. Particular attention is directed towards the estimation of bias introduced by PCA and its influence on experiment design. Aiming at the denoising of large matrices, nullspace based denoising (NBD) is introduced. Copyright © 2016 Elsevier B.V. All rights reserved.
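
    A toy illustration on synthetic spectra of the central point: a weak, low-variance component is still captured by the retained PCA components at low noise, but is lost once the noise level exceeds a threshold. The nullspace based denoising (NBD) method itself is not shown, and all signal shapes below are assumptions.

```python
# Hedged toy experiment: does any retained PCA component still resemble a weak,
# localized signature as the noise level grows?
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
n_spectra, n_channels = 2000, 200
strong = rng.normal(size=(n_spectra, 1)) @ rng.normal(size=(1, n_channels))
weak_profile = np.zeros(n_channels)
weak_profile[90:110] = 0.2                      # small, localized spectral feature
weak = rng.normal(size=(n_spectra, 1)) @ weak_profile[None, :]

for noise_level in (0.1, 1.0, 5.0):
    X = strong + weak + noise_level * rng.normal(size=(n_spectra, n_channels))
    pca = PCA(n_components=2).fit(X)
    corr = max(abs(np.corrcoef(weak_profile, c)[0, 1]) for c in pca.components_)
    print(f"noise={noise_level}: max |corr| with weak signature = {corr:.2f}")
```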

  1. Improved swelling resistance for PCA austenitic stainless steel under HFIR irradiation through microstructural control

    International Nuclear Information System (INIS)

    Maziasz, P.J.; Braski, D.N.

    1984-01-01

    Swelling evaluation of PCA variants and 20%-cold-worked (N-Lot) type 316 stainless steel (CW 316) at 300 to 600°C was extended to 44 dpa. Swelling was negligible in all the steels at 300°C after approx. 44 dpa. At 500 to 600°C, 25%-cold-worked PCA showed better void swelling resistance than type 316 at approx. 44 dpa. There was less swelling variation among alloys at 400°C, but again 25%-cold-worked PCA was the best.

  2. The spatial and temporal variations of nematofauna of recovering ...

    African Journals Online (AJOL)

    The spatio-temporal variations in physical sediment characteristics and nematode community assemblages were investigated and compared between a natural, a 10-year reforested, and a degraded Rhizophora mucronata mangrove ecosystem in Gazi Bay, Kenya. PCA showed a clear separation of the degraded site from ...

  3. Identification of spatially-localized initial conditions via sparse PCA

    Science.gov (United States)

    Dwivedi, Anubhav; Jovanovic, Mihailo

    2017-11-01

    Principal Component Analysis involves maximization of a quadratic form subject to a quadratic constraint on the initial flow perturbations and it is routinely used to identify the most energetic flow structures. For general flow configurations, principal components can be efficiently computed via power iteration of the forward and adjoint governing equations. However, the resulting flow structures typically have a large spatial support leading to a question of physical realizability. To obtain spatially-localized structures, we modify the quadratic constraint on the initial condition to include a convex combination with an additional regularization term which promotes sparsity in the physical domain. We formulate this constrained optimization problem as a nonlinear eigenvalue problem and employ an inverse power-iteration-based method to solve it. The resulting solution is guaranteed to converge to a nonlinear eigenvector which becomes increasingly localized as our emphasis on sparsity increases. We use several fluids examples to demonstrate that our method indeed identifies the most energetic initial perturbations that are spatially compact. This work was supported by Office of Naval Research through Grant Number N00014-15-1-2522.
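
    A generic sketch of a sparsity-promoting power iteration (soft-thresholding after each matrix-vector product), applied to a synthetic covariance matrix. It is not the authors' inverse power iteration on the forward/adjoint flow equations; the regularization weight and data below are assumptions.

```python
# Hedged sketch: power iteration with soft-thresholding to obtain a spatially
# sparse leading direction of a covariance matrix.
import numpy as np

def sparse_leading_direction(C, gamma=0.1, n_iter=200):
    """Approximate a sparse leading eigenvector of covariance C.

    gamma controls the emphasis on sparsity (gamma = 0 reduces to ordinary
    power iteration up to normalization)."""
    rng = np.random.default_rng(0)
    v = rng.normal(size=C.shape[0])
    v /= np.linalg.norm(v)
    for _ in range(n_iter):
        w = C @ v
        w = np.sign(w) * np.maximum(np.abs(w) - gamma * np.abs(w).max(), 0.0)
        norm = np.linalg.norm(w)
        if norm == 0:
            break
        v = w / norm
    return v

rng = np.random.default_rng(4)
X = rng.normal(size=(500, 60))
X[:, 20:25] += 3.0 * rng.normal(size=(500, 1))    # energetic, spatially compact structure
C = np.cov(X, rowvar=False)
v = sparse_leading_direction(C, gamma=0.2)
print("support of sparse direction:", np.nonzero(np.abs(v) > 1e-8)[0])
```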

  4. Prostate health index (phi) and prostate cancer antigen 3 (PCA3) significantly improve diagnostic accuracy in patients undergoing prostate biopsy.

    Science.gov (United States)

    Perdonà, Sisto; Bruzzese, Dario; Ferro, Matteo; Autorino, Riccardo; Marino, Ada; Mazzarella, Claudia; Perruolo, Giuseppe; Longo, Michele; Spinelli, Rosa; Di Lorenzo, Giuseppe; Oliva, Andrea; De Sio, Marco; Damiano, Rocco; Altieri, Vincenzo; Terracciano, Daniela

    2013-02-15

    Prostate health index (phi) and prostate cancer antigen 3 (PCA3) have been recently proposed as novel biomarkers for prostate cancer (PCa). We assessed the diagnostic performance of these biomarkers, alone or in combination, in men undergoing first prostate biopsy for suspicion of PCa. One hundred sixty male subjects were enrolled in this prospective observational study. PSA molecular forms, phi index (Beckman coulter immunoassay), PCA3 score (Progensa PCA3 assay), and other established biomarkers (tPSA, fPSA, and %fPSA) were assessed before patients underwent an 18-core first prostate biopsy. The discriminating ability between PCa-negative and PCa-positive biopsies of Beckman coulter phi, PCA3 score and the other biomarkers was determined. One hundred sixty patients met inclusion criteria. %p2PSA (p2PSA/fPSA × 100), phi and PCA3 were significantly higher in patients with PCa compared to the PCa-negative group (median values: 1.92 vs. 1.55, 49.97 vs. 36.84, and 50 vs. 32, respectively, P ≤ 0.001). ROC curve analysis showed that %p2PSA, phi, and PCA3 are good indicators of malignancy (AUCs = 0.68, 0.71, and 0.66, respectively). A multivariable logistic regression model consisting of both the phi index and PCA3 score reached an overall diagnostic accuracy of 0.77. Decision curve analysis revealed that this "combined" marker achieved the highest net benefit over the examined range of the threshold probability. phi and PCA3 showed no significant difference in the ability to predict PCa diagnosis in men undergoing first prostate biopsy. However, diagnostic performance is significantly improved by combining phi and PCA3. Copyright © 2012 Wiley Periodicals, Inc.

  5. Lightweight cryptography for constrained devices

    DEFF Research Database (Denmark)

    Alippi, Cesare; Bogdanov, Andrey; Regazzoni, Francesco

    2014-01-01

    Lightweight cryptography is a rapidly evolving research field that responds to the request for security in resource constrained devices. This need arises from crucial pervasive IT applications, such as those based on RFID tags where cost and energy constraints drastically limit the solution...... complexity, with the consequence that traditional cryptography solutions become too costly to be implemented. In this paper, we survey design strategies and techniques suitable for implementing security primitives in constrained devices....

  6. Two-dimensional PCA-based human gait identification

    Science.gov (United States)

    Chen, Jinyan; Wu, Rongteng

    2012-11-01

    Recognizing people automatically through visual surveillance is an important requirement for public security. Human gait-based identification focuses on recognizing a person automatically from walking video using computer vision and image processing approaches. As a potential biometric measure, human gait identification has attracted more and more researchers. Current human gait identification methods can be divided into two categories: model-based methods and motion-based methods. In this paper a human gait identification method based on two-dimensional Principal Component Analysis (2DPCA) and temporal-space analysis is proposed. Using background estimation and image subtraction, we obtain a sequence of binary images from the surveillance video. By comparing two adjacent images in this gait sequence, we obtain a sequence of binary difference images. Every binary difference image indicates how the body moves while a person walks. The temporal-space features are extracted from the difference image sequence as follows: projecting one difference image onto the Y axis and the X axis gives two vectors; projecting every difference image in the sequence onto the Y and X axes therefore gives two matrices. These two matrices characterize the style of one walking sequence. Then 2DPCA is used to transform these two matrices into compact features while preserving maximum separability. Finally, the similarity of two human gaits is calculated as the Euclidean distance between the resulting features. The performance of our method is illustrated using the CASIA Gait Database.
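
    A hedged sketch of the axis-projection feature and the 2DPCA step on toy silhouettes; it is not the authors' code, and the random "silhouettes" below merely stand in for the CASIA sequences.

```python
# Hedged sketch: Y-axis projections of frame-to-frame difference images stacked
# over time, followed by 2DPCA and a Euclidean similarity between two gaits.
import numpy as np

rng = np.random.default_rng(5)

def gait_matrix(frames):
    """frames: (T, H, W) binary silhouettes -> (T-1, H) matrix of Y-axis
    projections of the difference images."""
    diffs = np.abs(np.diff(frames.astype(int), axis=0))
    return diffs.sum(axis=2)              # project each difference image onto the Y axis

def two_d_pca(mats, n_components=4):
    """2DPCA: eigenvectors of the image covariance G = mean((A - mean)^T (A - mean))."""
    mean = mats.mean(axis=0)
    G = np.zeros((mats.shape[2], mats.shape[2]))
    for A in mats:
        G += (A - mean).T @ (A - mean)
    G /= len(mats)
    _, vecs = np.linalg.eigh(G)
    W = vecs[:, ::-1][:, :n_components]   # top eigenvectors
    return np.array([A @ W for A in mats]), W

# Toy data: 6 "subjects", 20 frames each, 32x24 silhouettes.
silhouettes = rng.integers(0, 2, size=(6, 20, 32, 24))
mats = np.array([gait_matrix(s) for s in silhouettes])   # shape (6, 19, 32)
features, W = two_d_pca(mats, n_components=4)
dist = np.linalg.norm(features[0] - features[1])          # similarity of two gaits
print("feature shape:", features[0].shape, "distance:", round(float(dist), 2))
```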

  7. Using a cross-model loadings plot to identify protein spots causing 2-DE gels to become outliers in PCA

    DEFF Research Database (Denmark)

    Kristiansen, Luise Cederkvist; Jacobsen, Susanne; Jessen, Flemming

    2010-01-01

    The multivariate method PCA is an exploratory tool often used to get an overview of multivariate data, such as the quantified spot volumes of digitized 2-DE gels. PCA can reveal hidden structures present in the data, and thus enables identification of potential outliers and clustering. Based on PCA...

  8. Faults detection approach using PCA and SOM algorithm in PMSG-WT system

    Directory of Open Access Journals (Sweden)

    Mohamed Lamine FADDA

    2016-07-01

    Full Text Available In this paper, a new approach is presented for fault detection in the observable data of a wind turbine - permanent magnet synchronous generator (WT-PMSG) system. The objective of the study is to illustrate how the SOM-PCA combination can be used to build multi-local-PCA models for fault detection in the WT-PMSG system. The performance of the suggested method for fault detection in the system data shows good results in a simulation experiment.

  9. Autophagosomal Sequestration of Mitochondria as an Indicator of Antiandrogen Therapy Resistance of Prostate Cancer (PCa)

    Science.gov (United States)

    2017-11-01

    Principal Investigator: George Wilding, M.D. Contracting Organization: The University of Texas MD Anderson Cancer Center, 1515 Holcombe Blvd., Houston, TX 77030-4009. Grant Number: W81XWH-15-1-0509.

  10. Lactate Oxidation Coupled to Iron or Electrode Reduction by Geobacter sulfurreducens PCA

    KAUST Repository

    Call, D. F.

    2011-10-14

    Geobacter sulfurreducens PCA completely oxidized lactate and reduced iron or an electrode, producing pyruvate and acetate intermediates. Compared to the current produced by Shewanella oneidensis MR-1, G. sulfurreducens PCA produced 10-times-higher current levels in lactate-fed microbial electrolysis cells. The kinetic and comparative analyses reported here suggest a prominent role of G. sulfurreducens strains in metal- and electrode-reducing communities supplied with lactate. © 2011, American Society for Microbiology.

  11. Lactate Oxidation Coupled to Iron or Electrode Reduction by Geobacter sulfurreducens PCA

    KAUST Repository

    Call, D. F.; Logan, B. E.

    2011-01-01

    Geobacter sulfurreducens PCA completely oxidized lactate and reduced iron or an electrode, producing pyruvate and acetate intermediates. Compared to the current produced by Shewanella oneidensis MR-1, G. sulfurreducens PCA produced 10-times-higher current levels in lactate-fed microbial electrolysis cells. The kinetic and comparative analyses reported here suggest a prominent role of G. sulfurreducens strains in metal- and electrode-reducing communities supplied with lactate. © 2011, American Society for Microbiology.

  12. Lateral supraorbital approach to ipsilateral PCA-P1 and ICA-PCoA aneurysms.

    Science.gov (United States)

    Goehre, Felix; Jahromi, Behnam Rezai; Elsharkawy, Ahmed; Lehto, Hanna; Shekhtman, Oleg; Andrade-Barazarte, Hugo; Munoz, Francisco; Hijazy, Ferzat; Makhkamov, Makhkam; Hernesniemi, Juha

    2015-01-01

    Aneurysms of the posterior cerebral artery (PCA) are rare and often associated with anterior circulation aneurysms. The lateral supraorbital approach allows for a very fast and safe approach to ipsilateral lesions of the Circle of Willis. A technical note on the successful clip occlusion of two aneurysms in the anterior and posterior Circle of Willis via this less invasive approach has not been published before. The objective of this technical note is to describe the simultaneous microsurgical clip occlusion of an ipsilateral PCA-P1 and an internal carotid artery - posterior communicating artery (ICA-PCoA) aneurysm via the lateral supraorbital approach. The authors present a technical report of successful clip occlusions of ipsilaterally located PCA-P1 and ICA-PCoA aneurysms. A 59-year-old female patient was diagnosed with a PCA-P1 and an ipsilateral ICA-PCoA aneurysm by computed tomography angiography (CTA) after an ischemic stroke secondary to a contralateral ICA dissection. The patient underwent microsurgical clipping after a lateral supraorbital craniotomy. The intraoperative indocyanine green (ICG) videoangiography and the postoperative CTA showed a complete occlusion of both aneurysms; the parent vessels (ICA and PCA) were patent. The patient presented no new neurologic deficit postoperatively. The lateral supraorbital approach is suitable for the simultaneous microsurgical treatment of proximal anterior circulation and ipsilateral proximal PCA aneurysms. Compared to endovascular treatment, direct visual control of brainstem perforators is possible.

  13. Efficacy and tolerability of intravenous morphine patient-controlled analgesia (PCA) in women undergoing cesarean delivery.

    Science.gov (United States)

    Andziak, Marta; Beta, Jarosław; Barwijuk, Michal; Issat, Tadeusz; Jakimiuk, Artur J

    2015-06-01

    The aim of the study was to evaluate analgesic efficacy and tolerability of patient-controlled analgesia (PCA) with intravenous morphine. Our observational study included 50 women who underwent a Misgav-Ladach or modified Misgav-Ladach cesarean section. An automated PCA infusion device (Medima S-PCA Syringe Pump, Medima, Krakow, Poland) was used for postoperative pain control. Time of morphine administration or initiation of intravenous patient-controlled analgesia (IV PCA) with morphine was recorded, as well as post-operative pain at rest assessed by a visual analogue scale (VAS). All patients were followed up for 24 hours after discharge from the operating room, taking into account patient records, worst pain score at rest, number of IV PCA attempts, and drug consumption. The median total morphine dose used during the postoperative period was 42.9 mg (IQR 35.6-48.5), with a median infusion time of 687.0 min (IQR 531.0-757.5). Pain severity and total drug consumption improved after the first 3 hours following cesarean delivery. The median number of IV PCA attempts per patient was 33 (IQR: 24-37), with a median of 11 placebo attempts (IQR: 3-27). Patient-controlled analgesia with morphine is an efficient and acceptable analgesic method in women undergoing cesarean section.

  14. Temporal networks

    CERN Document Server

    Saramäki, Jari

    2013-01-01

    The concept of temporal networks is an extension of complex networks as a modeling framework to include information on when interactions between nodes happen. Many studies of the last decade examine how the static network structure affects dynamic systems on the network. In this traditional approach, the temporal aspects are pre-encoded in the dynamic system model. Temporal-network methods, on the other hand, lift the temporal information from the level of system dynamics to the mathematical representation of the contact network itself. This framework becomes particularly useful for cases where there is a lot of structure and heterogeneity both in the timings of interaction events and the network topology. The advantage compared to common static network approaches is the ability to design more accurate models in order to explain and predict large-scale dynamic phenomena (such as, e.g., epidemic outbreaks and other spreading phenomena). On the other hand, temporal network methods are mathematically and concept...

  15. Optimization of temporal networks under uncertainty

    CERN Document Server

    Wiesemann, Wolfram

    2012-01-01

    Many decision problems in Operations Research are defined on temporal networks, that is, workflows of time-consuming tasks whose processing order is constrained by precedence relations. For example, temporal networks are used to model projects, computer applications, digital circuits and production processes. Optimization problems arise in temporal networks when a decision maker wishes to determine a temporal arrangement of the tasks and/or a resource assignment that optimizes some network characteristic (e.g. the time required to complete all tasks). The parameters of these optimization probl

  16. PCA as a practical indicator of OPLS-DA model reliability.

    Science.gov (United States)

    Worley, Bradley; Powers, Robert

    Principal Component Analysis (PCA) and Orthogonal Projections to Latent Structures Discriminant Analysis (OPLS-DA) are powerful statistical modeling tools that provide insights into separations between experimental groups based on high-dimensional spectral measurements from NMR, MS or other analytical instrumentation. However, when used without validation, these tools may lead investigators to statistically unreliable conclusions. This danger is especially real for Partial Least Squares (PLS) and OPLS, which aggressively force separations between experimental groups. As a result, OPLS-DA is often used as an alternative method when PCA fails to expose group separation, but this practice is highly dangerous. Without rigorous validation, OPLS-DA can easily yield statistically unreliable group separation. A Monte Carlo analysis of PCA group separations and OPLS-DA cross-validation metrics was performed on NMR datasets with statistically significant separations in scores-space. A linearly increasing amount of Gaussian noise was added to each data matrix followed by the construction and validation of PCA and OPLS-DA models. With increasing added noise, the PCA scores-space distance between groups rapidly decreased and the OPLS-DA cross-validation statistics simultaneously deteriorated. A decrease in correlation between the estimated loadings (added noise) and the true (original) loadings was also observed. While the validity of the OPLS-DA model diminished with increasing added noise, the group separation in scores-space remained basically unaffected. Supported by the results of Monte Carlo analyses of PCA group separations and OPLS-DA cross-validation metrics, we provide practical guidelines and cross-validatory recommendations for reliable inference from PCA and OPLS-DA models.
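
    A hedged sketch of the Monte Carlo idea on synthetic two-group data: as Gaussian noise is added, both the PCA scores-space separation between group means and a cross-validated Q² are tracked. PLS-DA (scikit-learn's PLSRegression with a binary response) is used here only as a stand-in for OPLS-DA, which scikit-learn does not provide.

```python
# Hedged sketch: add increasing Gaussian noise and track PCA group separation
# versus a cross-validated Q2 for a PLS-DA stand-in model.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(6)
n_per_group, n_vars = 20, 150
X0 = rng.normal(size=(2 * n_per_group, n_vars))
X0[n_per_group:, :10] += 1.5                       # true group difference in 10 variables
y = np.r_[np.zeros(n_per_group), np.ones(n_per_group)]

for noise in (0.0, 2.0, 6.0):
    X = X0 + noise * rng.normal(size=X0.shape)
    scores = PCA(n_components=2).fit_transform(X)
    sep = np.linalg.norm(scores[y == 0].mean(axis=0) - scores[y == 1].mean(axis=0))
    y_cv = cross_val_predict(PLSRegression(n_components=2), X, y, cv=7).ravel()
    q2 = 1 - np.sum((y - y_cv) ** 2) / np.sum((y - y.mean()) ** 2)
    print(f"noise={noise}: PCA group separation={sep:.2f}, PLS-DA Q2={q2:.2f}")
```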

  17. PCA based clustering for brain tumor segmentation of T1w MRI images.

    Science.gov (United States)

    Kaya, Irem Ersöz; Pehlivanlı, Ayça Çakmak; Sekizkardeş, Emine Gezmez; Ibrikci, Turgay

    2017-03-01

    Medical images are huge collections of information that are difficult to store and process, consuming extensive computing time. Therefore, reduction techniques are commonly used as a data pre-processing step to make the image data less complex, so that high-dimensional data can be identified by an appropriate low-dimensional representation. PCA is one of the most popular multivariate methods for data reduction. This paper focuses on clustering T1-weighted MRI images for brain tumor segmentation with dimension reduction by different common Principal Component Analysis (PCA) algorithms. Our primary aim is to present a comparison between different variations of PCA algorithms on MRIs for two cluster methods. The five most common PCA algorithms, namely the conventional PCA, Probabilistic Principal Component Analysis (PPCA), Expectation Maximization Based Principal Component Analysis (EM-PCA), Generalized Hebbian Algorithm (GHA), and Adaptive Principal Component Extraction (APEX), were applied to reduce dimensionality in advance of two clustering algorithms, K-Means and Fuzzy C-Means. In the study, T1-weighted MRI images of the human brain with brain tumor were used for clustering. In addition to the original size of 512 lines and 512 pixels per line, three more sizes, 256 × 256, 128 × 128 and 64 × 64, were included in the study to examine their effect on the methods. The obtained results were compared in terms of both the reconstruction errors and the Euclidean distance errors among the clustered images containing the same number of principal components. According to the findings, the PPCA obtained the best results among all others. Furthermore, the EM-PCA and the PPCA assisted the K-Means algorithm in accomplishing the best clustering performance in the majority of cases, as well as achieving significant results with both clustering algorithms for all sizes of T1w MRI images. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
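
    A hedged sketch of the pipeline using conventional PCA and K-Means on local patch features of a synthetic "T1-like" image; the PPCA, EM-PCA, GHA and APEX variants and fuzzy c-means are not shown, and the image, patch size and cluster count below are assumptions.

```python
# Hedged sketch: PCA-compressed 5x5 patch features of a toy image, clustered
# with K-Means into two classes (background vs. bright "lesion").
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.feature_extraction.image import extract_patches_2d

rng = np.random.default_rng(7)
img = rng.normal(loc=0.3, scale=0.05, size=(64, 64))
img[20:35, 25:40] = rng.normal(loc=0.8, scale=0.05, size=(15, 15))   # bright region

patches = extract_patches_2d(img, (5, 5))             # one patch per valid position
X = patches.reshape(len(patches), -1)
X_red = PCA(n_components=6).fit_transform(X)           # dimension reduction step
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X_red)
label_map = labels.reshape(64 - 4, 64 - 4)              # one label per patch position
print("cluster sizes:", np.bincount(labels), "map shape:", label_map.shape)
```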

  18. 2D-3D Face Recognition Method Based on a Modified CCA-PCA Algorithm

    Directory of Open Access Journals (Sweden)

    Patrik Kamencay

    2014-03-01

    Full Text Available This paper presents a proposed methodology for face recognition based on an information theory approach to coding and decoding face images. In this paper, we propose a 2D-3D face-matching method based on a principal component analysis (PCA) algorithm using canonical correlation analysis (CCA) to learn the mapping between a 2D face image and 3D face data. This method makes it possible to match a 2D face image with enrolled 3D face data. Our proposed fusion algorithm is based on the PCA method, which is applied to extract base features. PCA feature-level fusion requires the extraction of different features from the source data before features are merged together. Experimental results on the TEXAS face image database have shown that the classification and recognition results based on the modified CCA-PCA method are superior to those based on the CCA method. Testing the 2D-3D face match results gave a recognition rate for the CCA method of a rather poor 55%, while the modified CCA method based on PCA-level fusion achieved a very good recognition score of 85%.
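
    A hedged sketch of the 2D-3D matching idea only: PCA features per modality, a CCA-learned mapping between them, and nearest-neighbour matching in the shared canonical space. Random vectors stand in for the TEXAS face images and range data, and the component counts are assumptions.

```python
# Hedged sketch: PCA features for 2D and 3D data, CCA alignment, and rank-1
# matching in the canonical space (synthetic stand-in data).
import numpy as np
from sklearn.cross_decomposition import CCA
from sklearn.decomposition import PCA

rng = np.random.default_rng(8)
n_subjects = 40
latent = rng.normal(size=(n_subjects, 10))                     # shared identity factors
X2d = latent @ rng.normal(size=(10, 300)) + 0.1 * rng.normal(size=(n_subjects, 300))
X3d = latent @ rng.normal(size=(10, 400)) + 0.1 * rng.normal(size=(n_subjects, 400))

# PCA feature extraction per modality, then CCA to align the two feature spaces.
P2d = PCA(n_components=15).fit_transform(X2d)
P3d = PCA(n_components=15).fit_transform(X3d)
U, V = CCA(n_components=8).fit(P2d, P3d).transform(P2d, P3d)   # canonical projections

# Match each 2D probe against the enrolled 3D gallery in the canonical space.
d = ((U[:, None, :] - V[None, :, :]) ** 2).sum(-1)
rank1 = (d.argmin(axis=1) == np.arange(n_subjects)).mean()
print("toy rank-1 recognition rate:", rank1)
```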

  19. Differential research of inflammatory and related mediators in BPH, histological prostatitis and PCa.

    Science.gov (United States)

    Huang, T R; Wang, G C; Zhang, H M; Peng, B

    2018-02-14

    Prostate cancer (PCa) is one of the most common male malignancies in the world. The aim was to investigate the differential expression of inflammatory and related factors in benign prostatic hyperplasia (BPH), prostate cancer (PCa) and histological prostatitis (HP), and to explore the role of inducible nitric oxide synthase (iNOS), vascular endothelial growth factor (VEGF), androgen receptor (AR) and IL-2, IL-8 and TNF-α in the occurrence and development of prostate cancer. RT-PCR was used to detect the mRNA expression levels of iNOS, VEGF, AR and IL-2, IL-8 and TNF-α in BPH, PCa and BPH+HP. Western blotting and immunohistochemical staining were used to detect the protein levels of the various proteins in the three diseases. The results showed that the mRNA and protein levels of iNOS, VEGF and IL-2, IL-8 and TNF-α were significantly increased in the PCa and BPH+HP groups compared with the BPH group, whereas other comparisons did not reach significance (p > .05). iNOS, VEGF, AR and IL-2, IL-8 and TNF-α are involved in the malignant transformation of prostate tissue and play an important role in the development and progression of prostate cancer (PCa). © 2018 Blackwell Verlag GmbH.

  20. PCA criterion for SVM (MLP) classifier for flavivirus biomarker from salivary SERS spectra at febrile stage.

    Science.gov (United States)

    Radzol, A R M; Lee, Khuan Y; Mansor, W; Omar, I S

    2016-08-01

    Non-structural protein (NS1) has been recognized as one of the biomarkers for flavivirus, which causes diseases with life-threatening consequences. NS1 is an antigen that allows detection of the illness at the febrile stage, currently mostly from blood samples. Our work here intends to define an optimum model for PCA-SVM with MLP kernel for classification of the flavivirus biomarker, the NS1 molecule, from SERS spectra of saliva, which to the best of our knowledge has never been explored. Since performance of the model depends on the PCA criterion and MLP parameters, both are examined in tandem. The input vector to the classifier determined by each PCA criterion is subjected to brute-force tuning of the MLP parameters in its entirety. Its performance is also compared to our previous works where Linear and RBF kernels are used. It is found that the best PCA-SVM (MLP) model can be defined by 5 PCs from Cattell's Scree test for PCA, together with P1 and P2 values of 0.1 and -0.2 respectively, with a classification performance of [96.9%, 93.8%, 100.0%].
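
    A hedged sketch using scikit-learn's sigmoid (MLP-style) SVM kernel after retaining 5 principal components; the synthetic spectra stand in for the salivary SERS data, and the gamma/coef0 values merely echo the roles of P1 and P2 rather than reproducing the paper's tuning.

```python
# Hedged sketch: 5 PCs of synthetic "spectra" fed to an SVM with a sigmoid
# (MLP-style) kernel; gamma and coef0 play the slope/intercept roles of P1/P2.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(9)
n_per_class, n_channels = 40, 500
spectra = rng.normal(size=(2 * n_per_class, n_channels))
spectra[n_per_class:, 200:220] += 1.0                 # an NS1-like band in one class
labels = np.r_[np.zeros(n_per_class), np.ones(n_per_class)]

model = make_pipeline(StandardScaler(),
                      PCA(n_components=5),
                      SVC(kernel="sigmoid", gamma=0.1, coef0=-0.2))
acc = cross_val_score(model, spectra, labels, cv=5).mean()
print("cross-validated accuracy:", round(float(acc), 3))
```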

  1. Avoiding Optimal Mean ℓ2,1-Norm Maximization-Based Robust PCA for Reconstruction.

    Science.gov (United States)

    Luo, Minnan; Nie, Feiping; Chang, Xiaojun; Yang, Yi; Hauptmann, Alexander G; Zheng, Qinghua

    2017-04-01

    Robust principal component analysis (PCA) is one of the most important dimension-reduction techniques for handling high-dimensional data with outliers. However, most of the existing robust PCA presupposes that the mean of the data is zero and incorrectly utilizes the average of data as the optimal mean of robust PCA. In fact, this assumption holds only for the squared ℓ2-norm-based traditional PCA. In this letter, we equivalently reformulate the objective of conventional PCA and learn the optimal projection directions by maximizing the sum of projected difference between each pair of instances based on the ℓ2,1-norm. The proposed method is robust to outliers and also invariant to rotation. More important, the reformulated objective not only automatically avoids the calculation of optimal mean and makes the assumption of centered data unnecessary, but also theoretically connects to the minimization of reconstruction error. To solve the proposed nonsmooth problem, we exploit an efficient optimization algorithm to soften the contributions from outliers by reweighting each data point iteratively. We theoretically analyze the convergence and computational complexity of the proposed algorithm. Extensive experimental results on several benchmark data sets illustrate the effectiveness and superiority of the proposed method.
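
    A hedged sketch of the reformulated objective only: the sum over all pairs of the ℓ2 norm of the projected difference, together with a check that translating the data leaves it unchanged, which is why no optimal mean has to be estimated. The paper's iterative reweighting optimizer is not reproduced here, and the data and projection are synthetic.

```python
# Hedged sketch: evaluate sum_{i<j} || W^T (x_i - x_j) ||_2 and verify that a
# global translation of the data does not change it (differences cancel shifts).
import numpy as np

def pairwise_l21_objective(X, W):
    """X: (n, d) data, W: (d, k) orthonormal projection."""
    P = X @ W                                        # projected instances
    diffs = P[:, None, :] - P[None, :, :]            # all pairwise projected differences
    norms = np.linalg.norm(diffs, axis=2)
    return norms[np.triu_indices(len(X), k=1)].sum()

rng = np.random.default_rng(10)
X = rng.normal(size=(60, 8))
W, _ = np.linalg.qr(rng.normal(size=(8, 2)))         # some orthonormal projection
shift = 5.0 * rng.normal(size=(1, 8))                # arbitrary translation of all points

print(pairwise_l21_objective(X, W))
print(pairwise_l21_objective(X + shift, W))          # identical value
```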

  2. Constraining walking and custodial technicolor

    DEFF Research Database (Denmark)

    Foadi, Roshan; Frandsen, Mads Toudal; Sannino, Francesco

    2008-01-01

    We show how to constrain the physical spectrum of walking technicolor models via precision measurements and modified Weinberg sum rules. We also study models possessing a custodial symmetry for the S parameter at the effective Lagrangian level-custodial technicolor-and argue that these models...

  3. Project Temporalities

    DEFF Research Database (Denmark)

    Tryggestad, Kjell; Justesen, Lise; Mouritsen, Jan

    2013-01-01

    Purpose – The purpose of this paper is to explore how animals can become stakeholders in interaction with project management technologies and what happens with project temporalities when new and surprising stakeholders become part of a project and a recognized matter of concern to be taken...... into account. Design/methodology/approach – The paper is based on a qualitative case study of a project in the building industry. The authors use actor-network theory (ANT) to analyze the emergence of animal stakeholders, stakes and temporalities. Findings – The study shows how project temporalities can...... multiply in interaction with project management technologies and how conventional linear conceptions of project time may be contested with the emergence of new non-human stakeholders and temporalities. Research limitations/implications – The study draws on ANT to show how animals can become stakeholders...

  4. Predicting prostate biopsy outcome: prostate health index (phi) and prostate cancer antigen 3 (PCA3) are useful biomarkers.

    Science.gov (United States)

    Ferro, Matteo; Bruzzese, Dario; Perdonà, Sisto; Mazzarella, Claudia; Marino, Ada; Sorrentino, Alessandra; Di Carlo, Angelina; Autorino, Riccardo; Di Lorenzo, Giuseppe; Buonerba, Carlo; Altieri, Vincenzo; Mariano, Angela; Macchia, Vincenzo; Terracciano, Daniela

    2012-08-16

    Indication for prostate biopsy is presently mainly based on prostate-specific antigen (PSA) serum levels and digital-rectal examination (DRE). In view of the unsatisfactory accuracy of these two diagnostic exams, research has focused on novel markers to improve pre-biopsy prostate cancer detection, such as phi and PCA3. The purpose of this prospective study was to assess the diagnostic accuracy of phi and PCA3 for prostate cancer using biopsy as the gold standard. The phi index (Beckman Coulter immunoassay), PCA3 score (Progensa PCA3 assay) and other established biomarkers (tPSA, fPSA and %fPSA) were assessed before an 18-core prostate biopsy in a group of 251 subjects at their first biopsy. Values of %p2PSA and phi were significantly higher in patients with PCa compared with the PCa-negative group; phi and PCA3 were predictive of malignancy. In conclusion, %p2PSA, phi and PCA3 may predict a diagnosis of PCa in men undergoing their first prostate biopsy. PCA3 score is more useful in discriminating between HGPIN and non-cancer. Copyright © 2012 Elsevier B.V. All rights reserved.

  5. COMBINING PCA ANALYSIS AND ARTIFICIAL NEURAL NETWORKS IN MODELLING ENTREPRENEURIAL INTENTIONS OF STUDENTS

    Directory of Open Access Journals (Sweden)

    Marijana Zekić-Sušac

    2013-02-01

    Full Text Available Despite increased interest in the entrepreneurial intentions and career choices of young adults, reliable prediction models are yet to be developed. Two nonparametric methods were used in this paper to model entrepreneurial intentions: principal component analysis (PCA and artificial neural networks (ANNs. PCA was used to perform feature extraction in the first stage of modelling, while artificial neural networks were used to classify students according to their entrepreneurial intentions in the second stage. Four modelling strategies were tested in order to find the most efficient model. Dataset was collected in an international survey on entrepreneurship self-efficacy and identity. Variables describe students’ demographics, education, attitudes, social and cultural norms, self-efficacy and other characteristics. The research reveals benefits from the combination of the PCA and ANNs in modeling entrepreneurial intentions, and provides some ideas for further research.

  6. Enhancement of noisy EDX HRSTEM spectrum-images by combination of filtering and PCA.

    Science.gov (United States)

    Potapov, Pavel; Longo, Paolo; Okunishi, Eiji

    2017-05-01

    STEM spectrum-imaging with EDX signal collection is considered with a view to extracting maximum information from very noisy data. It is emphasized that spectrum-images with a weak EDX signal often suffer from information loss in the course of PCA treatment. The loss occurs when the level of random noise exceeds a certain threshold. Weighted PCA, though potentially helpful in isolating meaningful variations from noise, might provoke the complete loss of information when the EDX signal is weak. Filtering datasets prior to PCA can improve the situation and recover the lost information. In particular, Gaussian kernel filters are found to be efficient. A new filter useful in the case of sparse atomic-resolution EDX spectrum-images is suggested. Copyright © 2017 Elsevier Ltd. All rights reserved.
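
    A hedged sketch of the filter-then-PCA idea on a synthetic spectrum-image: a spatial Gaussian kernel filter is applied to Poisson-noisy data before PCA reconstruction. The data, filter width and component count are invented for illustration and do not reproduce the authors' pipeline.

```python
# Illustrative sketch (synthetic data): spatially filter a noisy
# spectrum-image with a Gaussian kernel before PCA denoising.
import numpy as np
from scipy.ndimage import gaussian_filter
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
ny, nx, n_channels = 32, 32, 200
clean = np.zeros((ny, nx, n_channels))
clean[:, :16, 50] = 5.0                          # fake elemental line, left half
clean[:, 16:, 120] = 5.0                         # another line, right half
noisy = rng.poisson(clean + 0.1).astype(float)   # sparse, Poisson-noisy counts

# Gaussian kernel filter applied in the two spatial dimensions only
filtered = gaussian_filter(noisy, sigma=(1.5, 1.5, 0))

def pca_denoise(cube, n_components=4):
    flat = cube.reshape(-1, cube.shape[-1])
    pca = PCA(n_components=n_components).fit(flat)
    return pca.inverse_transform(pca.transform(flat)).reshape(cube.shape)

den_raw = pca_denoise(noisy)
den_filt = pca_denoise(filtered)
print(np.abs(den_raw - clean).mean(), np.abs(den_filt - clean).mean())
```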

  7. Fundamental flow and fracture analysis of prime candidate alloy (PCA) for path a (austenitics)

    International Nuclear Information System (INIS)

    Lucas, G.E.; Jayakumar, M.; Maziasz, P.J.

    1982-01-01

    Room temperature microhardness tests have been performed on samples of Prime Candidate Alloy (PCA) for the austenitics (Path A) subjected to various thermomechanical treatments (TMT). The TMTs have effected various microstructures, which have been well characterized by optical metallography and TEM. For comparison, microhardness tests have been performed on samples of N-lot, DO heat and MFE 316 stainless steel with similar TMTs. The results indicate that the TMTs investigated can significantly alter the microhardness of the PCA in a manner which is consistent with microstructural changes. Moreover, while PCA had the lowest microhardness of the four alloy types after cold working, its microhardness increased while the others decreased to comparable values after aging for 2 h at 750 °C

  8. Improved swelling resistance for PCA austenitic stainless steel under HFIR irradiation through microstructural control

    International Nuclear Information System (INIS)

    Maziasz, P.J.; Braski, D.N.

    1983-01-01

    Six microstructural variants of Prime Candidate Alloy (PCA) were evaluated for swelling resistance during HFIR irradiation, together with several heats of type 316 stainless steel (316). Swelling was negligible in all the steels at 300 °C after approx. 44 dpa. At 500 to 600 °C, 25%-cold-worked PCA showed better void swelling resistance than type 316 at approx. 44 dpa. There was less swelling variability among alloys at 400 °C, but again 25%-cold-worked PCA was the best. Microstructurally, swelling resistance correlated with development of fine, stable bubbles whereas high swelling was due to coarser distributions of bubbles becoming unstable and converting to voids (bias-driven cavities)

  9. A comparative study of PCA, SIMCA and Cole model for classification of bioimpedance spectroscopy measurements.

    Science.gov (United States)

    Nejadgholi, Isar; Bolic, Miodrag

    2015-08-01

    Due to safety and low cost of bioimpedance spectroscopy (BIS), classification of BIS can be potentially a preferred way of detecting changes in living tissues. However, for longitudinal datasets linear classifiers fail to classify conventional Cole parameters extracted from BIS measurements because of their high variability. In some applications, linear classification based on Principal Component Analysis (PCA) has shown more accurate results. Yet, these methods have not been established for BIS classification, since PCA features have neither been investigated in combination with other classifiers nor have been compared to conventional Cole features in benchmark classification tasks. In this work, PCA and Cole features are compared in three synthesized benchmark classification tasks which are expected to be detected by BIS. These three tasks are classification of before and after geometry change, relative composition change and blood perfusion in a cylindrical organ. Our results show that in all tasks the features extracted by PCA are more discriminant than Cole parameters. Moreover, a pilot study was done on a longitudinal arm BIS dataset including eight subjects and three arm positions. The goal of the study was to compare different methods in arm position classification which includes all three synthesized changes mentioned above. Our comparative study on various classification methods shows that the best classification accuracy is obtained when PCA features are classified by a K-Nearest Neighbors (KNN) classifier. The results of this work suggest that PCA+KNN is a promising method to be considered for classification of BIS datasets that deal with subject and time variability. Copyright © 2015 Elsevier Ltd. All rights reserved.
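
    A minimal sketch of the PCA+KNN combination reported as best performing, using synthetic feature vectors in place of the BIS measurements; the component and neighbour counts are arbitrary choices.

```python
# PCA feature extraction followed by a K-Nearest Neighbors classifier,
# evaluated by cross-validation on synthetic "BIS-like" data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=200, n_features=50, n_informative=8,
                           n_classes=3, random_state=0)

pca_knn = make_pipeline(StandardScaler(), PCA(n_components=5),
                        KNeighborsClassifier(n_neighbors=5))
print(cross_val_score(pca_knn, X, y, cv=5).mean())
```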

  10. Controversies in using urine samples for prostate cancer detection: PSA and PCA3 expression analysis

    Directory of Open Access Journals (Sweden)

    S. Fontenete

    2011-12-01

    Full Text Available PURPOSE: Prostate cancer (PCa) is one of the most commonly diagnosed malignancies in the world. Although PSA utilization as a serum marker has improved prostate cancer detection, it still presents some limitations, mainly regarding its specificity. The expression of this marker, along with the detection of PCA3 mRNA in urine samples, has been suggested as a new approach for PCa detection. The goal of this work was to evaluate the efficacy of the urinary detection of PCA3 mRNA and PSA mRNA without performing the somewhat embarrassing prostate massage. It was also intended to optimize and implement a methodological protocol for this kind of sampling. MATERIALS AND METHODS: Urine samples from 57 patients with suspected prostate disease were collected, without undergoing prostate massage. Increased serum PSA levels were confirmed by medical records review. RNA was extracted by different methods and a preamplification step was included in order to improve gene detection by Real-Time PCR. RESULTS: An increase in RNA concentration was observed with the use of TriPure Isolation Reagent. Despite this optimization, only 15.8% of the cases showed expression of PSA mRNA and only 3.8% of prostate cancer patients presented detectable levels of PCA3 mRNA. The use of a preamplification step revealed no improvement in the results obtained. CONCLUSION: This work confirms that prostate massage is important before urine collection for gene expression analysis. Since PSA and PCA3 are prostate specific, it is necessary to promote the passage of cells from prostate to urinary tract, in order to detect these genetic markers in urine samples.

  11. Modification of the grain boundary microstructure of the austenitic PCA stainless steel to improve helium embrittlement resistance

    International Nuclear Information System (INIS)

    Maziasz, P.J.; Braski, D.N.

    1986-01-01

    Grain boundary MC precipitation was produced by a modified thermal-mechanical pretreatment in 25% cold worked (CW) austenitic prime candidate alloy (PCA) stainless steel prior to HFIR irradiation. Postirradiation tensile results and fracture analysis showed that the modified material (B3) resisted helium embrittlement better than either solution annealed (SA) or 25% CW PCA irradiated at 500 to 600 °C to approx. 21 dpa and 1370 at. ppm He. PCA SA and 25% CW were not embrittled at 300 to 400 °C. Grain boundary MC survives in PCA-B3 during HFIR irradiation at 500 °C but dissolves at 600 °C; it does not form in either SA or 25% CW PCA during similar irradiation. The grain boundary MC appears to play an important role in the helium embrittlement resistance of PCA-B3

  12. Biopsy and treatment decisions in the initial management of prostate cancer and the role of PCA3; a systematic analysis of expert opinion

    NARCIS (Netherlands)

    Tombal, Bertrand; Ameye, Filip; de la Taille, Alexandre; de Reijke, Theo; Gontero, Paolo; Haese, Alexander; Kil, Paul; Perrin, Paul; Remzi, Mesut; Schröder, Jörg; Speakman, Mark; Volpe, Alessandro; Meesen, Bianca; Stoevelaar, Herman

    2012-01-01

    The Prostate CAncer gene 3 (PCA3) assay may guide prostate biopsy decisions and predict prostate cancer (PCa) aggressiveness. This study explored the appropriateness of (1) PCA3 testing; (2) biopsy; (3) active surveillance (AS) and the value of the PCA3 Score for biopsy and AS decisions. Using the

  13. Trends in PDE constrained optimization

    CERN Document Server

    Benner, Peter; Engell, Sebastian; Griewank, Andreas; Harbrecht, Helmut; Hinze, Michael; Rannacher, Rolf; Ulbrich, Stefan

    2014-01-01

    Optimization problems subject to constraints governed by partial differential equations (PDEs) are among the most challenging problems in the context of industrial, economical and medical applications. Almost the entire range of problems in this field of research was studied and further explored as part of the Deutsche Forschungsgemeinschaft (DFG) priority program 1253 on “Optimization with Partial Differential Equations” from 2006 to 2013. The investigations were motivated by the fascinating potential applications and challenging mathematical problems that arise in the field of PDE constrained optimization. New analytic and algorithmic paradigms have been developed, implemented and validated in the context of real-world applications. In this special volume, contributions from more than fifteen German universities combine the results of this interdisciplinary program with a focus on applied mathematics.   The book is divided into five sections on “Constrained Optimization, Identification and Control”...

  14. PCA3 noncoding RNA is involved in the control of prostate-cancer cell survival and modulates androgen receptor signaling

    International Nuclear Information System (INIS)

    Ferreira, Luciana Bueno; Gimba, Etel Rodrigues Pereira; Palumbo, Antonio; Mello, Kivvi Duarte de; Sternberg, Cinthya; Caetano, Mauricio S; Oliveira, Felipe Leite de; Neves, Adriana Freitas; Nasciutti, Luiz Eurico; Goulart, Luiz Ricardo

    2012-01-01

    PCA3 is a non-coding RNA (ncRNA) that is highly expressed in prostate cancer (PCa) cells, but its functional role is unknown. To investigate its putative function in PCa biology, we used gene expression knockdown by small interference RNA, and also analyzed its involvement in androgen receptor (AR) signaling. LNCaP and PC3 cells were used as in vitro models for these functional assays, and three different siRNA sequences were specifically designed to target PCA3 exon 4. Transfected cells were analyzed by real-time qRT-PCR and cell growth, viability, and apoptosis assays. Associations between PCA3 and the androgen-receptor (AR) signaling pathway were investigated by treating LNCaP cells with 100 nM dihydrotestosterone (DHT) and with its antagonist (flutamide), and analyzing the expression of some AR-modulated genes (TMPRSS2, NDRG1, GREB1, PSA, AR, FGF8, CdK1, CdK2 and PMEPA1). PCA3 expression levels were investigated in different cell compartments by using differential centrifugation and qRT-PCR. LNCaP siPCA3-transfected cells significantly inhibited cell growth and viability, and increased the proportion of cells in the sub G0/G1 phase of the cell cycle and the percentage of pyknotic nuclei, compared to those transfected with scramble siRNA (siSCr)-transfected cells. DHT-treated LNCaP cells induced a significant upregulation of PCA3 expression, which was reversed by flutamide. In siPCA3/LNCaP-transfected cells, the expression of AR target genes was downregulated compared to siSCr-transfected cells. The siPCA3 transfection also counteracted DHT stimulatory effects on the AR signaling cascade, significantly downregulating expression of the AR target gene. Analysis of PCA3 expression in different cell compartments provided evidence that the main functional roles of PCA3 occur in the nuclei and microsomal cell fractions. Our findings suggest that the ncRNA PCA3 is involved in the control of PCa cell survival, in part through modulating AR signaling, which may raise new

  15. Characteristics and Validation Techniques for PCA-Based Gene-Expression Signatures

    Directory of Open Access Journals (Sweden)

    Anders E. Berglund

    2017-01-01

    Full Text Available Background. Many gene-expression signatures exist for describing the biological state of profiled tumors. Principal Component Analysis (PCA can be used to summarize a gene signature into a single score. Our hypothesis is that gene signatures can be validated when applied to new datasets, using inherent properties of PCA. Results. This validation is based on four key concepts. Coherence: elements of a gene signature should be correlated beyond chance. Uniqueness: the general direction of the data being examined can drive most of the observed signal. Robustness: if a gene signature is designed to measure a single biological effect, then this signal should be sufficiently strong and distinct compared to other signals within the signature. Transferability: the derived PCA gene signature score should describe the same biology in the target dataset as it does in the training dataset. Conclusions. The proposed validation procedure ensures that PCA-based gene signatures perform as expected when applied to datasets other than those that the signatures were trained upon. Complex signatures, describing multiple independent biological components, are also easily identified.
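
    A hedged sketch of summarizing a gene signature into a single PCA score, taken here as the first principal component of the z-scored signature genes; the expression matrix and gene set are random placeholders, and the explained-variance check only loosely mirrors the coherence/robustness ideas above.

```python
# Summarise a gene signature as a single PCA score (PC1) on random data.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
expr = rng.normal(size=(300, 5000))                        # samples x genes (placeholder)
signature_idx = rng.choice(5000, size=50, replace=False)   # hypothetical gene signature

def signature_score(expr, idx):
    sub = expr[:, idx]
    sub = (sub - sub.mean(axis=0)) / sub.std(axis=0)   # z-score each gene
    pca = PCA(n_components=3).fit(sub)
    score = pca.transform(sub)[:, 0]                   # PC1 acts as the signature score
    # rough check: PC1 should dominate the signature's variance if the signal is strong
    return score, pca.explained_variance_ratio_

score, evr = signature_score(expr, signature_idx)
print(score.shape, evr)
```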

  16. Decision tree and PCA-based fault diagnosis of rotating machinery

    Science.gov (United States)

    Sun, Weixiang; Chen, Jin; Li, Jiaqing

    2007-04-01

    After analysing the flaws of conventional fault diagnosis methods, data mining technology is introduced into the fault diagnosis field, and a new method based on the C4.5 decision tree and principal component analysis (PCA) is proposed. In this method, PCA is used to reduce the feature set after data collection, preprocessing and feature extraction. Then, C4.5 is trained on the samples to generate a decision tree model encoding diagnosis knowledge. Finally, the tree model is used for diagnosis. To validate the proposed method, six kinds of running states (normal or without any defect, unbalance, rotor radial rub, oil whirl, shaft crack, and a simultaneous state of unbalance and radial rub) are simulated on a Bently Rotor Kit RK4 to test the C4.5 and PCA-based method against a back-propagation neural network (BPNN). The results show that the C4.5 and PCA-based diagnosis method has higher accuracy and needs less training time than the BPNN.
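
    A minimal sketch of the two-stage scheme on synthetic data. Note that scikit-learn's DecisionTreeClassifier implements CART rather than C4.5, so it only stands in for the tree learner here, and the simulated features do not represent the Bently Rotor Kit signals.

```python
# Stage 1: PCA feature reduction. Stage 2: decision tree classification.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, n_features=40, n_informative=12,
                           n_classes=6, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

model = make_pipeline(PCA(n_components=10),        # feature reduction
                      DecisionTreeClassifier(max_depth=6, random_state=0))
model.fit(X_tr, y_tr)
print("test accuracy:", model.score(X_te, y_te))
```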

  17. PCA-MLP SVM distinction of salivary Raman spectra of dengue fever infection.

    Science.gov (United States)

    Radzol, A R M; Lee, Khuan Y; Mansor, W; Wong, P S; Looi, I

    2017-07-01

    Dengue fever (DF) is a disease of major concern caused by flavivirus infection. Delayed diagnosis leads to severe stages, which can be deadly. Recently, non-structural protein (NS1) has been acknowledged as a biomarker, an alternative to immunoglobulins for early detection of dengue in blood. Further, non-invasive detection of NS1 in saliva makes the approach more appealing. However, since its concentration in saliva is lower than in blood, a sensitive and specific technique, Surface Enhanced Raman Spectroscopy (SERS), is employed. Our work intends to define an optimal PCA-SVM (Principal Component Analysis-Support Vector Machine) model with a Multilayer Perceptron (MLP) kernel to distinguish between NS1-positive and NS1-negative samples from salivary SERS spectra, which, to the best of our knowledge, has never been explored. Salivary samples of DF-positive and DF-negative subjects were collected, pre-processed and analyzed. PCA and an SVM classifier were then used to differentiate the SERS-analyzed spectra. Since performance of the model depends on the PCA criterion and the MLP kernel parameters, both are examined in tandem. Its performance is also compared to our previous works on simulated NS1 salivary samples. It is found that the best PCA-SVM (MLP) model can be defined by 95 PCs from the CPV criterion with P1 and P2 values of 0.01 and -0.2 respectively. A classification performance of [76.88%, 85.92%, 67.83%] is achieved.

  18. AlleleCoder: a PERL script for coding codominant polymorphism data for PCA analysis

    Science.gov (United States)

    A useful biological interpretation of diploid heterozygotes is in terms of the dose of the common allele (0, 1 or 2 copies). We have developed a PERL script that converts FASTA files into coded spreadsheets suitable for Principal Component Analysis (PCA). In combination with R and R Commander, two- ...
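
    AlleleCoder itself is a PERL script; the following Python sketch only illustrates the underlying coding idea (0, 1 or 2 copies of the most common allele per SNP) on made-up genotype calls before feeding the matrix to PCA.

```python
# Hypothetical illustration of allele-dose coding for PCA.
import numpy as np
from sklearn.decomposition import PCA

# genotype calls per sample (rows) and SNP (columns), e.g. parsed from FASTA
genotypes = [["AA", "AG", "GG"],
             ["AA", "GG", "AG"],
             ["AG", "GG", "GG"]]

def dose_matrix(genos):
    genos = np.array(genos)
    coded = np.zeros(genos.shape, dtype=float)
    for j in range(genos.shape[1]):
        alleles = "".join(genos[:, j])
        common = max(set(alleles), key=alleles.count)      # most frequent allele
        coded[:, j] = [g.count(common) for g in genos[:, j]]
    return coded

X = dose_matrix(genotypes)
print(X)                                   # entries: dose of the common allele (0/1/2)
print(PCA(n_components=2).fit_transform(X))
```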

  19. Statistical Significance of the Contribution of Variables to the PCA Solution: An Alternative Permutation Strategy

    Science.gov (United States)

    Linting, Marielle; van Os, Bart Jan; Meulman, Jacqueline J.

    2011-01-01

    In this paper, the statistical significance of the contribution of variables to the principal components in principal components analysis (PCA) is assessed nonparametrically by the use of permutation tests. We compare a new strategy to a strategy used in previous research consisting of permuting the columns (variables) of a data matrix…
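
    A simplified sketch of the permutation idea (not the paper's exact strategy): one variable's values are permuted, PCA is refit, and the variable's summed squared loadings are compared against the permutation null.

```python
# Permutation test for one variable's contribution to the leading PCs.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n, p = 200, 6
X = rng.normal(size=(n, p))
X[:, 0] = X[:, 1] + 0.3 * rng.normal(size=n)      # variable 0 carries real structure

def contribution(X, var, n_components=2):
    Xc = (X - X.mean(0)) / X.std(0)
    pca = PCA(n_components=n_components).fit(Xc)
    return (pca.components_[:, var] ** 2).sum()   # summed squared loadings

obs = contribution(X, var=0)
null = []
for _ in range(500):
    Xp = X.copy()
    Xp[:, 0] = rng.permutation(Xp[:, 0])          # break variable 0's association
    null.append(contribution(Xp, var=0))
p_value = (np.sum(np.array(null) >= obs) + 1) / (len(null) + 1)
print(obs, p_value)
```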

  20. Investigation of domain walls in PPLN by confocal raman microscopy and PCA analysis

    Science.gov (United States)

    Shur, Vladimir Ya.; Zelenovskiy, Pavel; Bourson, Patrice

    2017-07-01

    Confocal Raman microscopy (CRM) is a powerful tool for the investigation of ferroelectric domains. Mechanical stresses and electric fields existing in the vicinity of neutral and charged domain walls modify the frequency, intensity and width of spectral lines [1], thus allowing visualization of micro- and nanodomain structures both at the surface and in the bulk of the crystal [2,3]. Stresses and fields are naturally coupled in ferroelectrics due to the inverse piezoelectric effect and can hardly be separated in Raman spectra. PCA is a powerful statistical method for the analysis of large data matrices, providing a set of orthogonal variables called principal components (PCs). PCA is widely used for classification of experimental data, for example in crystallization experiments, for detection of small amounts of components in solid mixtures, etc. [4,5]. In Raman spectroscopy, PCA has been applied to the analysis of phase transitions and provided the critical pressure with good accuracy [6]. In the present work we applied, for the first time, the Principal Component Analysis (PCA) method to the analysis of Raman spectra measured in periodically poled lithium niobate (PPLN). We found that the principal components demonstrate different sensitivity to mechanical stresses and electric fields in the vicinity of the domain walls. This allowed us to separately visualize the spatial distributions of mechanical stresses and electric fields at the surface and in the bulk of PPLN.

  1. Applications of PCA and SVM-PSO Based Real-Time Face Recognition System

    Directory of Open Access Journals (Sweden)

    Ming-Yuan Shieh

    2014-01-01

    Full Text Available This paper incorporates principal component analysis (PCA) with support vector machine-particle swarm optimization (SVM-PSO) for developing real-time face recognition systems. The integrated scheme adopts the SVM-PSO method to improve the validity of PCA-based image recognition systems under dynamic visual perception. Face recognition in most human-robot interaction applications is accomplished by PCA-based methods because of their dimensionality reduction. However, PCA-based systems are only suitable for processing faces with the same facial expressions and/or under the same view directions. Since the facial feature selection process can be considered a problem of global combinatorial optimization in machine learning, SVM-PSO is usually used as an optimal classifier of the system. In this paper, the PSO is used to implement feature selection, and the SVMs serve as fitness functions of the PSO for classification problems. Experimental results demonstrate that the proposed method simplifies features effectively and obtains higher classification accuracy.

  2. Pre-processing data using wavelet transform and PCA based on ...

    Indian Academy of Sciences (India)

    Abazar Solgi

    2017-07-14

    Jul 14, 2017 ... Pre-processing data using wavelet transform and PCA based on support vector regression and gene expression programming for river flow simulation. Abazar Solgi, Amir Pourhaghi, Ramin Bahmani and Heidar Zarei. Department of Water Resources Engineering, Shahid Chamran University of ...

  3. Elemental concentration analysis in PCa, BPH and normal prostate tissues using SR-TXRF

    International Nuclear Information System (INIS)

    Leitao, Roberta G.; Anjos, Marcelino J.; Canellas, Catarine G.L.; Lopes, Ricardo T.

    2009-01-01

    Prostate cancer (PCa) is one of the main causes of illness and death all over the world. In Brazil, prostate cancer currently represents the second most prevalent malignant neoplasia in men, representing 21% of all cancer cases. Benign Prostate Hyperplasia (BPH) is an illness prevailing in men above the age of 50, close to 90% after the age of 80. The prostate presents a high zinc concentration, about 10-fold higher than any other body tissue. In this work, samples of human prostate tissues with cancer (PCa), BPH and normal tissue were analyzed utilizing the total reflection X-ray fluorescence spectroscopy using synchrotron radiation technique (SRTXRF) to investigate the differences in the elemental concentrations in these tissues. SR-TXRF analyses were performed at the X-Ray fluorescence beamline at Brazilian National Synchrotron Light Laboratory (LNLS), in Campinas, Sao Paulo. It was possible to determine the concentrations of the following elements: P, S, K, Ca, Fe, Cu, Zn, Br and Rb. By using Mann-Whitney U test it was observed that almost all elements presented concentrations with significant differences (α = 0.05) between the groups studied. The elements and groups were: S, K, Ca, Fe, Zn, Br and Rb (PCa X Normal); S, Fe, Zn and Br (PCa X BPH); K, Ca, Fe, Zn, Br and Rb (BPH X Normal). (author)

  4. PCA-based bootstrap confidence interval tests for gene-disease association involving multiple SNPs

    Directory of Open Access Journals (Sweden)

    Xue Fuzhong

    2010-01-01

    Full Text Available Abstract Background Genetic association studies are currently the primary vehicle for identification and characterization of disease-predisposing variant(s), and usually involve multiple single-nucleotide polymorphisms (SNPs). However, SNP-wise association tests raise concerns over multiple testing. Haplotype-based methods have the advantage of being able to account for correlations between neighbouring SNPs, yet the assumption of Hardy-Weinberg equilibrium (HWE) and the potentially large number of degrees of freedom can harm their statistical power and robustness. Approaches based on principal component analysis (PCA) are preferable in this regard, but their performance varies with the method of extracting principal components (PCs). Results A PCA-based bootstrap confidence interval test (PCA-BCIT), which directly uses the PC scores to assess gene-disease association, was developed and evaluated for three ways of extracting PCs, i.e., cases only (CAES), controls only (COES), and cases and controls combined (CES). Extraction of PCs with COES is preferred to that with CAES or CES. Performance of the test was examined via simulations as well as analyses of data on rheumatoid arthritis and heroin addiction; the test maintains the nominal level under the null hypothesis and shows performance comparable to a permutation test. Conclusions PCA-BCIT is a valid and powerful method for assessing gene-disease association involving multiple SNPs.
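
    A schematic sketch of a PCA-based bootstrap confidence interval test on synthetic SNP dose data, extracting PCs from controls only (the COES strategy). The tested statistic, a case-control difference in mean PC1 score with a percentile bootstrap CI, is an illustrative assumption rather than the paper's exact formulation.

```python
# Controls-only PC extraction, then a percentile bootstrap CI for the
# case-control difference in mean PC1 score on synthetic 0/1/2 dose data.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_cases, n_controls, n_snps = 150, 150, 8
controls = rng.binomial(2, 0.3, size=(n_controls, n_snps)).astype(float)
cases = rng.binomial(2, 0.4, size=(n_cases, n_snps)).astype(float)  # shifted allele freq

pca = PCA(n_components=2).fit(controls)           # controls-only extraction (COES)
s_cases = pca.transform(cases)[:, 0]
s_controls = pca.transform(controls)[:, 0]

def boot_ci(a, b, n_boot=2000, alpha=0.05):
    diffs = []
    for _ in range(n_boot):
        da = rng.choice(a, size=len(a), replace=True)
        db = rng.choice(b, size=len(b), replace=True)
        diffs.append(da.mean() - db.mean())
    return np.quantile(diffs, [alpha / 2, 1 - alpha / 2])

lo, hi = boot_ci(s_cases, s_controls)
print("95% CI for mean PC1-score difference:", (lo, hi),
      "significant:", not (lo <= 0 <= hi))
```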

  5. Regularized Pre-image Estimation for Kernel PCA De-noising

    DEFF Research Database (Denmark)

    Abrahamsen, Trine Julie; Hansen, Lars Kai

    2011-01-01

    The main challenge in de-noising by kernel Principal Component Analysis (PCA) is the mapping of de-noised feature space points back into input space, also referred to as “the pre-image problem”. Since the feature space mapping is typically not bijective, pre-image estimation is inherently ill-posed...
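
    For orientation, scikit-learn's KernelPCA can learn a ridge-regularized inverse map when fit_inverse_transform=True, which illustrates the de-noising and pre-image setting in general; this built-in regularization is not the specific pre-image estimator proposed in the paper.

```python
# Kernel PCA de-noising with a learned (ridge-regularised) inverse map.
import numpy as np
from sklearn.datasets import make_circles
from sklearn.decomposition import KernelPCA

X, _ = make_circles(n_samples=400, factor=0.3, noise=0.08, random_state=0)

kpca = KernelPCA(n_components=4, kernel="rbf", gamma=10,
                 fit_inverse_transform=True, alpha=0.1)  # alpha = ridge regulariser
Z = kpca.fit_transform(X)                 # de-noised feature-space representation
X_denoised = kpca.inverse_transform(Z)    # pre-image estimates back in input space
print(X.shape, X_denoised.shape)
```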

  6. Global Clustering Quality Coefficient Assessing the Efficiency of PCA Class Assignment

    Directory of Open Access Journals (Sweden)

    Mirela Praisler

    2014-01-01

    Full Text Available An essential factor influencing the efficiency of the predictive models built with principal component analysis (PCA is the quality of the data clustering revealed by the score plots. The sensitivity and selectivity of the class assignment are strongly influenced by the relative position of the clusters and by their dispersion. We are proposing a set of indicators inspired from analytical geometry that may be used for an objective quantitative assessment of the data clustering quality as well as a global clustering quality coefficient (GCQC that is a measure of the overall predictive power of the PCA models. The use of these indicators for evaluating the efficiency of the PCA class assignment is illustrated by a comparative study performed for the identification of the preprocessing function that is generating the most efficient PCA system screening for amphetamines based on their GC-FTIR spectra. The GCQC ranking of the tested feature weights is explained based on estimated density distributions and validated by using quadratic discriminant analysis (QDA.

  7. Temporal networks

    Science.gov (United States)

    Holme, Petter; Saramäki, Jari

    2012-10-01

    A great variety of systems in nature, society and technology-from the web of sexual contacts to the Internet, from the nervous system to power grids-can be modeled as graphs of vertices coupled by edges. The network structure, describing how the graph is wired, helps us understand, predict and optimize the behavior of dynamical systems. In many cases, however, the edges are not continuously active. As an example, in networks of communication via e-mail, text messages, or phone calls, edges represent sequences of instantaneous or practically instantaneous contacts. In some cases, edges are active for non-negligible periods of time: e.g., the proximity patterns of inpatients at hospitals can be represented by a graph where an edge between two individuals is on throughout the time they are at the same ward. Like network topology, the temporal structure of edge activations can affect dynamics of systems interacting through the network, from disease contagion on the network of patients to information diffusion over an e-mail network. In this review, we present the emergent field of temporal networks, and discuss methods for analyzing topological and temporal structure and models for elucidating their relation to the behavior of dynamical systems. In the light of traditional network theory, one can see this framework as moving the information of when things happen from the dynamical system on the network, to the network itself. Since fundamental properties, such as the transitivity of edges, do not necessarily hold in temporal networks, many of these methods need to be quite different from those for static networks. The study of temporal networks is very interdisciplinary in nature. Reflecting this, even the object of study has many names-temporal graphs, evolving graphs, time-varying graphs, time-aggregated graphs, time-stamped graphs, dynamic networks, dynamic graphs, dynamical graphs, and so on. This review covers different fields where temporal graphs are considered

  8. [Interest of evaluation of professional practice for the improvement of the management of postoperative pain with patient controlled analgesia (PCA)].

    Science.gov (United States)

    Baumann, A; Cuignet-Royer, E; Cornet, C; Trueck, S; Heck, M; Taron, F; Peignier, C; Chastel, A; Gervais, P; Bouaziz, H; Audibert, G; Mertes, P-M

    2010-10-01

    To evaluate the daily practice of postoperative PCA in Nancy University Hospital, in continuity with a quality program of postoperative pain (POP) care conducted in 2003. A retrospective audit of patient medical records. A review of all the medical records of consecutive surgical patients managed by PCA over a 5-week period in six surgical services. Criteria studied: Evaluation of hospital means (eight criteria) and of medical and nursing staff practice (16 criteria). A second audit was conducted 6 months after the implementation of quality improvement measures. Assessment of the hospital means: temperature chart including pain scores and PCA drug consumption, patient information leaflet, PCA protocol, postoperative pre-filled prescription form (PFPF) for post-anaesthesia care including PCA, and optional training of nurses in postoperative pain management. EVALUATION OF PRACTICES: One hundred and fifty-nine files of a total of 176 patients were analyzed (88%). Improvements noted after 6 months: trace of POP evaluation progressed from 73 to 87%, advance prescription of PCA adjustment increased from 56 to 68% and of the treatment of adverse effects from 54 to 68%, trace of PCA adaptation by attending nurse from 15 to 43%, trace of the administration of the treatment of adverse effects by attending nurse from 24% to 64%, as did the use of PFPF from 59 to 70%. The usefulness of a pre-filled prescription form for post-anaesthesia care including PCA prescription is demonstrated. Quality improvement measures include: poster information and pocket guides on PCA for nurses, training of 3 nurses per service to act as "PCA advisers" who will in turn train their ward colleagues in PCA management and the use of equipment until an acute pain team is established. Copyright © 2010 Elsevier Masson SAS. All rights reserved.

  9. Nested Sampling with Constrained Hamiltonian Monte Carlo

    OpenAIRE

    Betancourt, M. J.

    2010-01-01

    Nested sampling is a powerful approach to Bayesian inference ultimately limited by the computationally demanding task of sampling from a heavily constrained probability distribution. An effective algorithm in its own right, Hamiltonian Monte Carlo is readily adapted to efficiently sample from any smooth, constrained distribution. Utilizing this constrained Hamiltonian Monte Carlo, I introduce a general implementation of the nested sampling algorithm.

  10. Constrained minimization in C++ environment

    International Nuclear Information System (INIS)

    Dymov, S.N.; Kurbatov, V.S.; Silin, I.N.; Yashchenko, S.V.

    1998-01-01

    Based on ideas proposed by one of the authors (I.N. Silin), suitable software was developed for constrained data fitting. Constraints may be of arbitrary type: equalities and inequalities. The simplest possible approach was used. The widely known program FUMILI was reimplemented in the C++ language. Constraints in the form of inequalities φ(θi) ≥ a were taken into account by converting them into equalities φ(θi) = t together with simple inequalities of the type t ≥ a. The equalities were handled by means of quadratic penalty functions. The software was tested on model data of the ANKE setup (COSY accelerator, Forschungszentrum Juelich, Germany)
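
    A toy sketch of the strategy described: an inequality constraint g(x) ≥ a is rewritten as an equality g(x) = t with the simple bound t ≥ a, and the equality is enforced with a quadratic penalty. The objective, constraint and penalty weight are invented for illustration; this is not FUMILI.

```python
# Inequality -> equality + slack bound, enforced by a quadratic penalty.
import numpy as np
from scipy.optimize import minimize

a = 1.0

def g(x):                        # constraint function, g(x) >= a required
    return x[0] + x[1]

def objective(x):                # chi-square-like objective (invented)
    return (x[0] - 0.2) ** 2 + (x[1] - 0.1) ** 2

def penalised(z, mu=100.0):
    x, t = z[:2], z[2]
    return objective(x) + mu * (g(x) - t) ** 2   # quadratic penalty on g(x) = t

res = minimize(penalised, x0=[0.0, 0.0, a],
               bounds=[(None, None), (None, None), (a, None)])   # t >= a
print(res.x[:2], "constraint value:", g(res.x[:2]))
```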

  11. Coherent states in constrained systems

    International Nuclear Information System (INIS)

    Nakamura, M.; Kojima, K.

    2001-01-01

    When quantizing the constrained systems, there often arise the quantum corrections due to the non-commutativity in the re-ordering of constraint operators in the products of operators. In the bosonic second-class constraints, furthermore, the quantum corrections caused by the uncertainty principle should be taken into account. In order to treat these corrections simultaneously, the alternative projection technique of operators is proposed by introducing the available minimal uncertainty states of the constraint operators. Using this projection technique together with the projection operator method (POM), these two kinds of quantum corrections were investigated

  12. Effect of process control agent (PCA) on the characteristics of mechanically alloyed Ti-Mg powders [Journal article]

    CSIR Research Space (South Africa)

    Machio, Christopher N

    2011-03-01

    Full Text Available This paper reports the results of a study to determine the effect of process control agent (PCA) on the characteristics of Ti-Mg powders during milling. It has been shown that a 2% increase in PCA content leads to up to a 40% increase in yield...

  13. RXTE PCA and Swift BAT detects the millisecond pulsar Swift J1756.9-2508 in outburst

    NARCIS (Netherlands)

    Patruno, A.; Markwardt, C.B.; Strohmayer, T.E.; Swank, J.H.; Smith, S.E.; Pereira, D.

    2009-01-01

    We report a detection of increased activity of the accreting millisecond X-ray pulsar Swift J1756.9-2508 observed with the RXTE-PCA monitoring on July 8, 9hr UTC. Increased flux is detected simultaneously on the Swift-BAT camera. RXTE-PCA follow up observations starting on July 13, 23hr UTC,

  14. A comparison of PCA and PMF models for source identification of fugitive methane emissions

    Science.gov (United States)

    Assan, Sabina; Baudic, Alexia; Bsaibes, Sandy; Gros, Valerie; Ciais, Philippe; Staufer, Johannes; Robinson, Rod; Vogel, Felix

    2017-04-01

    Methane (CH4) is a greenhouse gas with a global warming potential 28-32 times that of carbon dioxide (CO2) over a 100 year period, and even greater on shorter timescales [Etminan, et al., 2016, Allen, 2014]. Thus, despite its relatively short lifetime and smaller emission quantities compared to CO2, CH4 emissions contribute approximately 20% of today's anthropogenic greenhouse gas warming [Kirschke et al., 2013]. Major anthropogenic sources include livestock (enteric fermentation), oil and gas production and distribution, landfills, and wastewater emissions [EPA, 2011]. Especially in densely populated areas multiple CH4 sources can be found in close vicinity. Thus, when measuring CH4 emissions at local scales it is necessary to distinguish between different CH4 source categories to effectively quantify the contribution of each sector and aid the implementation of greenhouse gas reduction strategies. To this end, source apportionment models can be used to aid the interpretation of spatial and temporal patterns in order to identify and characterise emission sources. The focus of this study is to evaluate two common linear receptor models, namely Principal Component Analysis (PCA) and Positive Matrix Factorisation (PMF), for CH4 source apportionment. The statistical models I will present combine continuous in-situ CH4, C2H6 and δ13CH4 measured using a Cavity Ring Down Spectroscopy (CRDS) instrument [Assan et al. 2016] with volatile organic compound (VOC) observations performed using Gas Chromatography (GC) in order to explain the underlying variance of the data. The strengths and weaknesses of both models are identified for data collected in multi-source environments in the vicinity of four different types of sites: an agricultural farm with cattle, a natural gas compressor station, a wastewater treatment plant, and a peri-urban location in the Ile de France region impacted by various sources. To conclude, receptor model results to separate statistically the

  15. Temporal naturalism

    Science.gov (United States)

    Smolin, Lee

    2015-11-01

    Two people may claim both to be naturalists, but have divergent conceptions of basic elements of the natural world which lead them to mean different things when they talk about laws of nature, or states, or the role of mathematics in physics. These disagreements do not much affect the ordinary practice of science which is about small subsystems of the universe, described or explained against a background, idealized to be fixed. But these issues become crucial when we consider including the whole universe within our system, for then there is no fixed background to reference observables to. I argue here that the key issue responsible for divergent versions of naturalism and divergent approaches to cosmology is the conception of time. One version, which I call temporal naturalism, holds that time, in the sense of the succession of present moments, is real, and that laws of nature evolve in that time. This is contrasted with timeless naturalism, which holds that laws are immutable and the present moment and its passage are illusions. I argue that temporal naturalism is empirically more adequate than the alternatives, because it offers testable explanations for puzzles its rivals cannot address, and is likely a better basis for solving major puzzles that presently face cosmology and physics. This essay also addresses the problem of qualia and experience within naturalism and argues that only temporal naturalism can make a place for qualia as intrinsic qualities of matter.

  16. Amorphization of Fe-based alloy via wet mechanical alloying assisted by PCA decomposition

    Energy Technology Data Exchange (ETDEWEB)

    Neamţu, B.V., E-mail: Bogdan.Neamtu@stm.utcluj.ro [Materials Science and Engineering Department, Technical University of Cluj-Napoca, 103-105, Muncii Avenue, 400641, Cluj-Napoca (Romania); Chicinaş, H.F.; Marinca, T.F. [Materials Science and Engineering Department, Technical University of Cluj-Napoca, 103-105, Muncii Avenue, 400641, Cluj-Napoca (Romania); Isnard, O. [Université Grenoble Alpes, Institut NEEL, F-38042, Grenoble (France); CNRS, Institut NEEL, 25 rue des martyrs, BP166, F-38042, Grenoble (France); Pană, O. [National Institute for Research and Development of Isotopic and Molecular Technologies, 65-103 Donath Street, 400293, Cluj-Napoca (Romania); Chicinaş, I. [Materials Science and Engineering Department, Technical University of Cluj-Napoca, 103-105, Muncii Avenue, 400641, Cluj-Napoca (Romania)

    2016-11-01

    Amorphization of Fe75Si20B5 (at.%) alloy has been attempted both by wet and dry mechanical alloying starting from a mixture of elemental powders. Powder amorphization was not achieved even after 140 hours of dry mechanical alloying. Using the same milling parameters, when wet mechanical alloying was used, powder amorphization was achieved after 40 h of milling. Our assumption regarding the enhancement of the powder amorphization capability by contamination with carbon was proved by X-ray Photoelectron Spectroscopy (XPS) measurements, which revealed the presence of carbon in the chemical composition of the wet mechanically alloyed sample. Using shorter milling times and several process control agents (PCA) (ethanol, oleic acid and benzene) with different carbon content, it was proved that the milling duration required for powder amorphization is linked to the carbon content of the PCA. Differential Scanning Calorimetry (DSC), thermomagnetic (TG) and X-ray Diffraction (XRD) measurements performed on the heated samples revealed that crystallisation occurs at 488 °C, leading to the formation of Fe3Si and Fe2B. Thermogravimetry measurements performed under H2 atmosphere showed the same amount of contamination with C, about 2.3 wt%, for the amorphous samples regardless of the type of PCA. Saturation magnetisation of the wet milled samples decreases upon increasing milling time. In the case of the amorphous samples wet milled with benzene up to 20 h and with oleic acid up to 30 h, the saturation magnetisation has roughly the same value, indicating the same degree of contamination. XRD performed on the samples milled using the same parameters revealed that powder amorphization can be achieved even via dry milling, just by adding the equivalent amount of elemental C calculated from the TG plots. This proves that in this system by considering the atomic species which can contaminate the powder, they can be

  17. Use of principal components analysis (PCA) on estuarine sediment datasets: The effect of data pre-treatment

    International Nuclear Information System (INIS)

    Reid, M.K.; Spencer, K.L.

    2009-01-01

    Principal components analysis (PCA) is a multivariate statistical technique capable of discerning patterns in large environmental datasets. Although widely used, there is disparity in the literature with respect to data pre-treatment prior to PCA. This research examines the influence of commonly reported data pre-treatment methods on PCA outputs, and hence data interpretation, using a typical environmental dataset comprising sediment geochemical data from an estuary in SE England. This study demonstrated that applying the routinely used log (x + 1) transformation skewed the data and masked important trends. Removing outlying samples and correcting for the influence of grain size had the most significant effect on PCA outputs and data interpretation. Reducing the influence of grain size using granulometric normalisation meant that other factors affecting metal variability, including mineralogy, anthropogenic sources and distance along the salinity transect could be identified and interpreted more clearly. - Data pre-treatment can have a significant influence on the outcome of PCA.
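
    A schematic comparison on synthetic data of the pre-treatment choices discussed above (raw values, log(x + 1) transformation, and normalisation to a grain-size proxy) and their effect on the leading PCA component; the variables and relationships are made up solely to illustrate the mechanics, not to reproduce the estuarine dataset.

```python
# Compare how different pre-treatments change the variance captured by PC1.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 120
mud = rng.uniform(0.1, 0.9, n)                            # grain-size proxy
metals = np.column_stack([mud * 50 + rng.normal(0, 2, n),  # grain-size driven metal
                          mud * 20 + rng.normal(0, 1, n),  # grain-size driven metal
                          rng.lognormal(1.0, 0.8, n)])     # source-driven metal

def top_variance(X):
    Xs = StandardScaler().fit_transform(X)
    return PCA().fit(Xs).explained_variance_ratio_[0]

print("raw:             ", top_variance(metals))
print("log(x+1):        ", top_variance(np.log1p(metals)))
print("grain-size norm.:", top_variance(metals / mud[:, None]))
```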

  18. Antineoplastic and immunomodulatory effect of polyphenolic components of Achyranthes aspera (PCA) extract on urethane induced lung cancer in vivo.

    Science.gov (United States)

    Narayan, Chandradeo; Kumar, Arvind

    2014-01-01

    The polyphenolic compounds of Achyranthes aspera (PCA) extract are evaluated for anti-cancerous and cytokine-based immunomodulatory effects. The PCA extract contains known phenolic acid and flavonoid components, such as a mixture of quinic acid, chlorogenic acid, kaempferol, quercetin and chrysin, along with many unknown components. PCA was orally fed to urethane (ethyl carbamate)-primed lung cancerous mice at a dosage of 100 mg/kg body weight for 30 consecutive days. 100 mg of A. aspera powder contains 2.4 mg phenolic acid and 1.1 mg flavonoid (2:1 ratio). Enhanced activities and expression of the antioxidant enzymes GST, GR, CAT and SOD, and downregulated expression and activation of LDH, were observed in PCA-fed urethane-primed lung cancerous tissues compared to non-PCA-fed urethane-primed lung cancerous tissues. PCA-fed urethane-primed lung tissues showed downregulated expression of the pro-inflammatory cytokines IL-1β, IL-6 and TNF-α, along with the transcription factors NF-κB and Stat3, while the expression of the pro-apoptotic proteins Bax and p53 was enhanced. FTIR and CD spectroscopy data revealed that PCA resisted the urethane-mediated conformational changes of DNA, as evident from the shift of the guanine and thymine bands in FTIR from 1,708 to 1,711 cm(-1) and 1,675 to 1,671 cm(-1), respectively, in the DNA of PCA-fed urethane-primed lung cancerous tissues in comparison to urethane-primed lung cancerous tissues. The present study suggests that the PCA components have a synergistic anti-cancerous and cytokine-based immunomodulatory role and DNA conformation-restoring effects. However, more research is required to show the effects of each component separately and in combination for effective therapeutic use to cure and prevent lung cancer and other cancers.

  19. A Study of Wind Turbine Comprehensive Operational Assessment Model Based on EM-PCA Algorithm

    Science.gov (United States)

    Zhou, Minqiang; Xu, Bin; Zhan, Yangyan; Ren, Danyuan; Liu, Dexing

    2018-01-01

    To assess wind turbine performance accurately and provide a theoretical basis for wind farm management, a hybrid assessment model based on the Entropy Method and Principal Component Analysis (EM-PCA) was established, which takes most factors of operational performance into consideration and reaches a comprehensive result. To verify the model, six wind turbines were chosen as the research objects; the ranking obtained by the proposed method was 4#>6#>1#>5#>2#>3#, which is completely in conformity with the theoretical ranking, indicating that the reliability and effectiveness of the EM-PCA method are high. The method can guide state comparisons among different units and support wind farm operational assessment.

  20. Short-term PV/T module temperature prediction based on PCA-RBF neural network

    Science.gov (United States)

    Li, Jiyong; Zhao, Zhendong; Li, Yisheng; Xiao, Jing; Tang, Yunfeng

    2018-02-01

    To address the non-linearity and large inertia of temperature control in PV/T systems, short-term temperature prediction of the PV/T module is proposed, so that the PV/T system controller can act ahead of time according to the short-term forecast and optimize the control effect. Based on an analysis of the correlation between PV/T module temperature, meteorological factors, and the temperature of adjacent points in the time series, the principal component analysis (PCA) method is used to pre-process the original input sample data. Combined with RBF neural network theory, the simulation results show that the PCA pre-processing gives the network model higher prediction accuracy and stronger generalization performance than an RBF neural network without principal component extraction.

  1. Extraction of prostatic lumina and automated recognition for prostatic calculus image using PCA-SVM.

    Science.gov (United States)

    Wang, Zhuocai; Xu, Xiangmin; Ding, Xiaojun; Xiao, Hui; Huang, Yusheng; Liu, Jian; Xing, Xiaofen; Wang, Hua; Liao, D Joshua

    2011-01-01

    Identification of prostatic calculi is an important basis for determining the tissue origin. Computation-assisted diagnosis of prostatic calculi may have promising potential but is currently still little studied. We studied the extraction of prostatic lumina and automated recognition of calculus images. Extraction of lumina from prostate histology images was based on local entropy and Otsu thresholding, and recognition used PCA-SVM based on the texture features of prostatic calculi. The SVM classifier showed an average time of 0.1432 seconds, an average training accuracy of 100%, an average test accuracy of 93.12%, a sensitivity of 87.74%, and a specificity of 94.82%. We concluded that the algorithm, based on texture features and PCA-SVM, can recognize the concentric structure and visualized features easily. Therefore, this method is effective for the automated recognition of prostatic calculi.

  2. Thermogravimetry/mass spectrometry study of woody residues and an herbaceous biomass crop using PCA techniques

    Energy Technology Data Exchange (ETDEWEB)

    Gomez, C.J.; Velo, E.; Puigjaner, L. [Department of Chemical Engineering, ETSEIB, Universitat Politecnica de Catalunya, Avinguda Diagonal 647, G2, E-08028 Barcelona (Spain); Meszaros, E.; Jakab, E. [Institute of Materials and Environmental Chemistry, Chemical Research Center, Hungarian Academy of Sciences, P.O. Box 17, Budapest 1525 (Hungary)

    2007-10-15

    The devolatilization behaviour of pine and beech wood from carpentry residuals and an herbaceous product from an energy plantation (artichoke thistle) was investigated by thermogravimetry/mass spectrometry (TG/MS). The effect of three pre-treatments, hot-water washing, ethanol extraction and their combination, was also studied. Principal component analysis (PCA) was employed to help in the evaluation of the large data set of results. The characteristics of the thermal decomposition of the herbaceous crop are considerably different from that of the woody biomass samples. The evolution profiles of some characteristic pyrolysis products revealed that the thermal behaviour of wood and thistle is still considerably different after the elimination of some of the inorganic ions and extractive compounds, although the macromolecular components of the samples decompose at similar temperatures. With the help of the PCA calculations, the effect of the different pre-treatments on the production of the main pyrolysis products was evidenced. (author)

  3. Microstructural design of PCA austenitic stainless steel for improved resistance to helium embrittlement under HFIR irradiation

    International Nuclear Information System (INIS)

    Maziasz, P.J.; Braski, D.N.

    1983-01-01

    Several variants of Prime Candidate Alloy (PCA) with different preirradiation thermal-mechanical treatments were irradiated in HFIR and were evaluated for embrittlement resistance via disk-bend tensile testing. Comparison tests were made on two heats of 20%-cold-worked type 316 stainless steel. None of the alloys were brittle after irradiation at 300 to 400 °C to approx. 44 dpa and helium levels of 3000 to approx. 3600 at. ppm. However, all were quite brittle after similar exposure at 600 °C. Embrittlement varied with alloy and pretreatment for irradiation to 44 dpa at 500 °C and to 22 dpa at 600 °C. Better relative embrittlement resistance among PCA variants was found in alloys which contained prior grain boundary MC carbide particles that remained stable under irradiation

  4. Statistical Fractal Models Based on GND-PCA and Its Application on Classification of Liver Diseases

    Directory of Open Access Journals (Sweden)

    Huiyan Jiang

    2013-01-01

    Full Text Available A new method is proposed to establish a statistical fractal model for liver disease classification. Firstly, fractal theory is used to construct a high-order tensor, and then Generalized N-dimensional Principal Component Analysis (GND-PCA) is used to establish the statistical fractal model and select features from the liver region, with different features given different weights; finally, a Support Vector Machine optimized by Ant Colony Optimization (ACO-SVM) is used to build the classifier for the recognition of liver disease. In order to verify the effectiveness of the proposed method, the PCA eigenface method and a standard SVM are chosen as the contrast methods. The experimental results show that the proposed method can reconstruct the liver volume better and improve the classification accuracy of liver diseases.

  5. The effects of multidisciplinary rehabilitation: RePCa-a randomised study among primary prostate cancer patients

    DEFF Research Database (Denmark)

    Dieperink, K B; Johansen, C; Hansen, Steinbjørn

    2013-01-01

    Background:The objective of this study is the effectiveness of multidisciplinary rehabilitation on treatment-related adverse effects after completed radiotherapy in patients with prostate cancer (PCa).Methods:In a single-centre oncology unit in Odense, Denmark, 161 PCa patients treated...... with radiotherapy and androgen deprivation therapy were randomly assigned to either a programme of two nursing counselling sessions and two instructive sessions with a physical therapist (n=79) or to usual care (n=82). Primary outcome was Expanded Prostate Cancer Index Composite (EPIC-26) urinary irritative sum......-score.Before radiotherapy, pre-intervention 4 weeks after radiotherapy, and after a 20-week intervention, measurements included self-reported disease-specific quality of life (QoL; EPIC-26, including urinary, bowel, sexual, and hormonal symptoms), general QoL (Short-form-12, SF-12), pelvic floor muscle strength (Modified...

  6. Application of principal component analysis (PCA) as a sensory assessment tool for fermented food products.

    Science.gov (United States)

    Ghosh, Debasree; Chattopadhyay, Parimal

    2012-06-01

    The objective of the work was to use the method of quantitative descriptive analysis (QDA) to describe the sensory attributes of fermented food products prepared with the incorporation of lactic cultures. Panellists were selected and trained to evaluate various attributes, especially color and appearance, body texture, flavor, overall acceptability and acidity, of fermented food products such as cow milk curd and soymilk curd, idli, sauerkraut and probiotic ice cream. Principal component analysis (PCA) identified the six significant principal components that accounted for more than 90% of the variance in the sensory attribute data. Overall product quality was modelled as a function of principal components using multiple least squares regression (R² = 0.8). The result from PCA was statistically analyzed by analysis of variance (ANOVA). These findings demonstrate the utility of quantitative descriptive analysis for identifying and measuring the fermented food product attributes that are important for consumer acceptability.

  8. [Identification of varieties of cashmere by Vis/NIR spectroscopy technology based on PCA-SVM].

    Science.gov (United States)

    Wu, Gui-Fang; He, Yong

    2009-06-01

    A combined algorithm using principal component analysis (PCA) and a support vector machine (SVM) was presented to discriminate cashmere varieties. Cashmere fiber is characterized by its thread-like form, softness, glossiness and high tensile strength. The quality characteristics and economic value of each breed of cashmere are very different. In order to safeguard consumers' rights and guarantee the quality of cashmere products, quick, efficient and correct identification of cashmere is of great significance for the production and trade of cashmere material. The present research adopts Vis/NIR diffuse reflectance spectroscopy to collect the spectral data of cashmere. The near-infrared fingerprint of cashmere was acquired by principal component analysis (PCA), and support vector machine (SVM) methods were used to further identify the cashmere material. The PCA result indicated that the score map made from the scores of PC1, PC2 and PC3 could be used, and 10 principal components (PCs) were selected as the input of the support vector machine (SVM) based on the reliabilities of the PCs of 99.99%. One hundred cashmere samples were used for calibration and the remaining 75 cashmere samples were used for validation. A one-against-all multi-class SVM model was built, the capabilities of SVM with different kernel functions were comparatively analyzed, and the results showed that the SVM with the Gaussian kernel function has the best identification capability, with an accuracy of 100%. This research indicates that the data mining method of PCA-SVM has a good identification effect and can serve as a new method for rapid identification of cashmere material varieties.

  9. SU-C-BRF-03: PCA Modeling of Anatomical Changes During Head and Neck Radiation Therapy

    International Nuclear Information System (INIS)

    Chetvertkov, M; Kim, J; Siddiqui, F; Kumarasiri, A; Chetty, I; Gordon, J

    2014-01-01

    Purpose: To develop principal component analysis (PCA) models from daily cone beam CTs (CBCTs) of head and neck (H and N) patients that could be used prospectively in adaptive radiation therapy (ART). Methods: For 7 H and N patients, the Pinnacle Treatment Planning System (Philips Healthcare) was used to retrospectively deformably register daily CBCTs to the planning CT. The number N of CBCTs per treatment course ranged from 14 to 22. For each patient a PCA model was built from the deformation vector fields (DVFs), after first subtracting the mean DVF, producing N eigen-DVFs (EDVFs). It was hypothesized that EDVFs with large eigenvalues represent the major anatomical deformations during the course of treatment, and that it is feasible to relate each EDVF to a clinically meaningful systematic or random change in anatomy, such as weight loss, neck flexion, etc. Results: DVFs contained on the order of 3×87×87×58 = 1.3 million scalar values (3 times the number of voxels in the registered volume). The top 3 eigenvalues accounted for ∼90% of the variance. Anatomical changes corresponding to an EDVF were evaluated by generating a synthetic DVF and applying that DVF to the CT to produce a synthetic CBCT. For all patients, the EDVF for the largest eigenvalue was interpreted to model weight loss. The EDVFs for other eigenvalues appeared to represent quasi-random fraction-to-fraction changes. Conclusion: The leading EDVFs from single-patient PCA models have tentatively been identified with weight loss changes during treatment. Other EDVFs are tentatively identified as quasi-random inter-fraction changes. Clean separation of systematic and random components may require further work. This work is expected to facilitate development of population-based PCA models that can be used to prospectively identify significant anatomical changes, such as weight loss, early in treatment, triggering replanning where beneficial.
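
    The PCA-of-deformation-fields construction can be sketched as follows: flattened DVFs (one row per daily CBCT) are mean-centred and decomposed with an SVD, the right singular vectors playing the role of the eigen-DVFs. The array shapes and the scaling of the synthetic DVF are illustrative assumptions, not the study's values.

        import numpy as np

        # Placeholder: N daily DVFs, each flattened to 3 * (number of voxels) scalars.
        N, n_vox = 20, 40 * 40 * 30
        rng = np.random.default_rng(1)
        dvfs = rng.normal(size=(N, 3 * n_vox))

        mean_dvf = dvfs.mean(axis=0)            # systematic (average) deformation
        centred = dvfs - mean_dvf

        # Thin SVD: rows of Vt are the eigen-DVFs (EDVFs); s**2/(N-1) are the eigenvalues.
        U, s, Vt = np.linalg.svd(centred, full_matrices=False)
        explained = (s ** 2) / np.sum(s ** 2)
        print("Variance explained by top 3 EDVFs:", explained[:3].sum())

        # A synthetic DVF two standard deviations along the leading mode, which could be
        # applied to the planning CT to visualise the anatomical meaning of that EDVF.
        synthetic_dvf = mean_dvf + 2.0 * s[0] / np.sqrt(N - 1) * Vt[0]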

  10. Improved medical image fusion based on cascaded PCA and shift invariant wavelet transforms.

    Science.gov (United States)

    Reena Benjamin, J; Jayasree, T

    2018-02-01

    In the medical field, radiologists need more informative and high-quality medical images to diagnose diseases. Image fusion plays a vital role in the field of biomedical image analysis. It aims to integrate the complementary information from multimodal images, producing a new composite image which is expected to be more informative for visual perception than any of the individual input images. The main objective of this paper is to improve the information, to preserve the edges and to enhance the quality of the fused image using cascaded principal component analysis (PCA) and shift invariant wavelet transforms. A novel image fusion technique based on cascaded PCA and shift invariant wavelet transforms is proposed in this paper. PCA in spatial domain extracts relevant information from the large dataset based on eigenvalue decomposition, and the wavelet transform operating in the complex domain with shift invariant properties brings out more directional and phase details of the image. The significance of maximum fusion rule applied in dual-tree complex wavelet transform domain enhances the average information and morphological details. The input images of the human brain of two different modalities (MRI and CT) are collected from whole brain atlas data distributed by Harvard University. Both MRI and CT images are fused using cascaded PCA and shift invariant wavelet transform method. The proposed method is evaluated based on three main key factors, namely structure preservation, edge preservation, contrast preservation. The experimental results and comparison with other existing fusion methods show the superior performance of the proposed image fusion framework in terms of visual and quantitative evaluations. In this paper, a complex wavelet-based image fusion has been discussed. The experimental results demonstrate that the proposed method enhances the directional features as well as fine edge details. Also, it reduces the redundant details, artifacts, distortions.

  11. Wavelet Compressed PCA Models for Real-Time Image Registration in Augmented Reality Applications

    OpenAIRE

    Christopher Cooper; Kent Wise; John Cooper; Makarand Deo

    2015-01-01

    The use of augmented reality (AR) has shown great promise in enhancing medical training and diagnostics via interactive simulations. This paper presents a novel method to perform accurate and inexpensive image registration (IR) utilizing a pre-constructed database of reference objects in conjunction with a principal component analysis (PCA) model. In addition, a wavelet compression algorithm is utilized to enhance the speed of the registration process. The proposed method is used to perform r...

  12. In Vivo Imaging of Experimental Melanoma Tumors using the Novel Radiotracer 68Ga-NODAGA-Procainamide (PCA).

    Science.gov (United States)

    Kertész, István; Vida, András; Nagy, Gábor; Emri, Miklós; Farkas, Antal; Kis, Adrienn; Angyal, János; Dénes, Noémi; Szabó, Judit P; Kovács, Tünde; Bai, Péter; Trencsényi, György

    2017-01-01

    The most aggressive form of skin cancer is malignant melanoma. Because of its high metastatic potential, the early detection of primary melanoma tumors and metastases using non-invasive PET imaging determines the outcome of the disease. Previous studies have already shown that benzamide derivatives, such as procainamide (PCA), specifically bind to melanin pigment. The aim of this study was to synthesize and investigate the melanin specificity of the novel 68Ga-labeled NODAGA-PCA molecule in vitro and in vivo using PET techniques. Procainamide (PCA) was conjugated with the NODAGA chelator and labeled with Ga-68 (68Ga-NODAGA-PCA). The melanin specificity of 68Ga-NODAGA-PCA was tested in vitro, ex vivo and in vivo using melanotic B16-F10 and amelanotic Melur melanoma cell lines. By subcutaneous and intravenous injection of melanoma cells, tumor-bearing mice were prepared, on which biodistribution studies and small animal PET/CT scans were performed for the 68Ga-NODAGA-PCA and 18FDG tracers. 68Ga-NODAGA-PCA was produced with high specific activity (14.9±3.9 GBq/µmol) and with excellent radiochemical purity (98%). The PCA uptake of B16-F10 cells was significantly (p≤0.01) higher than that of Melur cells. Ex vivo biodistribution and in vivo PET/CT studies using subcutaneous and metastatic tumor models showed significantly (p≤0.01) higher 68Ga-NODAGA-PCA uptake in B16-F10 primary tumors and lung metastases in comparison with amelanotic Melur tumors. In experiments comparing the 18FDG and 68Ga-NODAGA-PCA uptake of B16-F10 tumors, we found that the tumor-to-muscle (T/M) and tumor-to-lung (T/L) ratios were significantly (p≤0.05 and p≤0.01) higher using 68Ga-NODAGA-PCA than with 18FDG. Our novel radiotracer 68Ga-NODAGA-PCA showed specific binding to melanin-producing experimental melanoma tumors. Therefore, 68Ga-NODAGA-PCA is a suitable diagnostic radiotracer for the detection of melanoma tumors and metastases in vivo.

  13. Temporal contingency

    Science.gov (United States)

    Gallistel, C.R.; Craig, Andrew R.; Shahan, Timothy A.

    2015-01-01

    Contingency, and more particularly temporal contingency, has often figured in thinking about the nature of learning. However, it has never been formally defined in such a way as to make it a measure that can be applied to most animal learning protocols. We use elementary information theory to define contingency in such a way as to make it a measurable property of almost any conditioning protocol. We discuss how making it a measurable construct enables the exploration of the role of different contingencies in the acquisition and performance of classically and operantly conditioned behavior. PMID:23994260

  14. Temporal contingency.

    Science.gov (United States)

    Gallistel, C R; Craig, Andrew R; Shahan, Timothy A

    2014-01-01

    Contingency, and more particularly temporal contingency, has often figured in thinking about the nature of learning. However, it has never been formally defined in such a way as to make it a measure that can be applied to most animal learning protocols. We use elementary information theory to define contingency in such a way as to make it a measurable property of almost any conditioning protocol. We discuss how making it a measurable construct enables the exploration of the role of different contingencies in the acquisition and performance of classically and operantly conditioned behavior. Copyright © 2013 Elsevier B.V. All rights reserved.

  15. Automatic detection of optic disc based on PCA and mathematical morphology.

    Science.gov (United States)

    Morales, Sandra; Naranjo, Valery; Angulo, Us; Alcaniz, Mariano

    2013-04-01

    The algorithm proposed in this paper automatically segments the optic disc from a fundus image. The goal is to facilitate the early detection of certain pathologies and to fully automate the process so as to avoid specialist intervention. The method proposed for the extraction of the optic disc contour is mainly based on mathematical morphology along with principal component analysis (PCA). It makes use of different operations such as the generalized distance function (GDF), a variant of the watershed transformation (the stochastic watershed), and geodesic transformations. The input of the segmentation method is obtained through PCA. The purpose of using PCA is to obtain the grey-scale image that best represents the original RGB image. The implemented algorithm has been validated on five public databases, obtaining promising results. The average values obtained (Jaccard and Dice coefficients of 0.8200 and 0.8932, respectively, an accuracy of 0.9947, and true positive and false positive fractions of 0.9275 and 0.0036) demonstrate that this method is a robust tool for the automatic segmentation of the optic disc. Moreover, it is fairly reliable since it works properly on databases with a large degree of variability and improves the results of other state-of-the-art methods.
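
    The PCA step that produces the grey-scale image best representing the RGB fundus image can be sketched as below: pixels are treated as 3-vectors and projected onto their first principal component. This is only the input stage of the segmentation pipeline, and the file name is a placeholder.

        import numpy as np
        from skimage import io
        from sklearn.decomposition import PCA

        rgb = io.imread("fundus.png")[..., :3].astype(float)   # hypothetical fundus image
        pixels = rgb.reshape(-1, 3)

        # Project every pixel onto the first principal component of the RGB point cloud.
        pc1 = PCA(n_components=1).fit_transform(pixels).reshape(rgb.shape[:2])

        # Rescale to [0, 1] so it can serve as the grey-scale input to the morphology stage.
        grey = (pc1 - pc1.min()) / (pc1.max() - pc1.min() + 1e-12)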

  16. Human Face Identification for a Lecture Attendance Monitoring System Using Principal Component Analysis (PCA) Feature Extraction

    Directory of Open Access Journals (Sweden)

    Cucu Suhery

    2017-04-01

    Full Text Available Existing attendance monitoring systems each have their own strengths and weaknesses, and need to be developed further to simplify data processing. In this study, an attendance monitoring system was developed using human face detection integrated with a database, implemented in the Python programming language with the opencv library. Image data were acquired with an Android phone; faces were then detected and cropped so that only the face region remained. Face detection used the Haar-Cascade Classifier method, and feature extraction was performed with Principal Component Analysis (PCA). The PCA results were labeled according to the person records in the database. During testing, every image with a stored PCA representation in the database was matched against the query face image using the Euclidean distance method. The database used in this study is MySQL. Face detection during training achieved a 100% success rate, and face identification during testing achieved a 90% success rate. Keywords: android, haar-cascade classifier, principal component analysis, euclidian distance, MySQL, attendance monitoring system, face detection
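
    A minimal sketch of the described pipeline (OpenCV Haar-cascade detection, PCA/eigenface features, Euclidean-distance matching) is given below. The file paths, patch size and gallery contents are placeholders, and the MySQL storage step is omitted; this is an illustration of the idea, not the authors' system.

        import cv2
        import numpy as np
        from sklearn.decomposition import PCA

        cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

        def extract_face(path, size=(100, 100)):
            """Detect the largest face in an image and return it as a flattened grey patch."""
            grey = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
            faces = cascade.detectMultiScale(grey, scaleFactor=1.1, minNeighbors=5)
            x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
            return cv2.resize(grey[y:y + h, x:x + w], size).ravel().astype(float)

        # Hypothetical enrolled gallery: one image path and label per student.
        gallery_paths = ["student_01.jpg", "student_02.jpg"]
        labels = ["student_01", "student_02"]
        gallery = np.stack([extract_face(p) for p in gallery_paths])

        pca = PCA(n_components=min(50, len(gallery)))   # eigenface projection
        gallery_pca = pca.fit_transform(gallery)

        # Identify a query face by the nearest gallery vector in Euclidean distance.
        query_pca = pca.transform(extract_face("query.jpg").reshape(1, -1))
        distances = np.linalg.norm(gallery_pca - query_pca, axis=1)
        print("Identified as:", labels[int(np.argmin(distances))])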

  17. EEG channels reduction using PCA to increase XGBoost's accuracy for stroke detection

    Science.gov (United States)

    Fitriah, N.; Wijaya, S. K.; Fanany, M. I.; Badri, C.; Rezal, M.

    2017-07-01

    In Indonesia, based on the results of the 2013 Basic Health Research, the number of stroke patients had increased from 8.3‰ (2007) to 12.1‰ (2013). These days, some researchers use electroencephalography (EEG) results as another option for detecting stroke, besides CT scan images as the gold standard. A previous study on data from stroke and healthy patients at the National Brain Center Hospital (RS PON) used the Brain Symmetry Index (BSI), Delta-Alpha Ratio (DAR), and Delta-Theta-Alpha-Beta Ratio (DTABR) as features for classification by an Extreme Learning Machine (ELM). The study achieved 85% accuracy with sensitivity above 86% for acute ischemic stroke detection. Using EEG data means dealing with many data dimensions, which can reduce classifier accuracy (the curse of dimensionality). Principal Component Analysis (PCA) can reduce dimensionality and computation cost without decreasing classification accuracy. XGBoost, a scalable tree boosting classifier, can solve real-world scale problems (the Higgs Boson and Allstate datasets) using a minimal amount of resources. This paper reuses the same data from RS PON and the features from previous research, preprocessed with PCA and classified with XGBoost, to increase the accuracy with fewer electrodes. The specific reduced electrode set improved the accuracy of stroke detection. Our future work will examine other algorithms besides PCA to achieve higher accuracy with fewer channels.
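
    The PCA-then-XGBoost stage can be sketched as follows, assuming the BSI/DAR/DTABR features per channel have already been assembled into a feature matrix; the shapes, labels and hyper-parameters are illustrative rather than the study's configuration.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.model_selection import train_test_split
        from xgboost import XGBClassifier

        # Placeholder EEG-derived features: samples x (channels * features-per-channel).
        rng = np.random.default_rng(2)
        X = rng.normal(size=(120, 96))
        y = rng.integers(0, 2, size=120)          # 0 = healthy, 1 = stroke

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

        # Keep enough PCs to explain 95% of the variance, then boost trees on the scores.
        pca = PCA(n_components=0.95).fit(X_tr)
        clf = XGBClassifier(n_estimators=200, max_depth=3, learning_rate=0.1,
                            eval_metric="logloss")
        clf.fit(pca.transform(X_tr), y_tr)
        print("Test accuracy:", clf.score(pca.transform(X_te), y_te))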

  18. A neuro-fuzzy inference system for sensor failure detection using wavelet denoising, PCA and SPRT

    International Nuclear Information System (INIS)

    Na, Man Gyun

    2001-01-01

    In this work, a neuro-fuzzy inference system combined with wavelet denoising, PCA (principal component analysis) and SPRT (sequential probability ratio test) methods is developed to detect the failure of a relevant sensor using other sensor signals. The wavelet denoising technique is applied to remove noise components from the input signals to the neuro-fuzzy system. PCA is used to reduce the dimension of the input space without losing a significant amount of information, which simplifies the selection of input signals for the neuro-fuzzy system. A lower-dimensional input space also usually reduces the time necessary to train a neuro-fuzzy system. The parameters of the neuro-fuzzy inference system, which estimates the relevant sensor signal, are optimized by a genetic algorithm and a least-squares algorithm. The residuals between the estimated signals and the measured signals are used to detect whether the sensors have failed, with the SPRT applied in this failure detection step. The proposed sensor-monitoring algorithm was verified through applications to the pressurizer water level and hot-leg flowrate sensors in pressurized water reactors

  19. Towards Constraining Glacial Isostatic Adjustment in Greenland Using ICESat and GPS Observations

    DEFF Research Database (Denmark)

    Nielsen, Karina; Sørensen, Louise Sandberg; Khan, Shfaqat Abbas

    2014-01-01

    Constraining glacial isostatic adjustment (GIA) i.e. the Earth’s viscoelastic response to past ice changes, is an important task, because GIA is a significant correction in gravity-based ice sheet mass balance estimates. Here, we investigate how temporal variations in the observed and modeled cru...

  20. Formal language constrained path problems

    Energy Technology Data Exchange (ETDEWEB)

    Barrett, C.; Jacob, R.; Marathe, M.

    1997-07-08

    In many path finding problems arising in practice, certain patterns of edge/vertex labels in the labeled graph being traversed are allowed/preferred, while others are disallowed. Motivated by such applications as intermodal transportation planning, the authors investigate the complexity of finding feasible paths in a labeled network, where the mode choice for each traveler is specified by a formal language. The main contributions of this paper include the following: (1) the authors show that the problem of finding a shortest path between a source and destination for a traveler whose mode choice is specified as a context free language is solvable efficiently in polynomial time, when the mode choice is specified as a regular language they provide algorithms with improved space and time bounds; (2) in contrast, they show that the problem of finding simple paths between a source and a given destination is NP-hard, even when restricted to very simple regular expressions and/or very simple graphs; (3) for the class of treewidth bounded graphs, they show that (i) the problem of finding a regular language constrained simple path between source and a destination is solvable in polynomial time and (ii) the extension to finding context free language constrained simple paths is NP-complete. Several extensions of these results are presented in the context of finding shortest paths with additional constraints. These results significantly extend the results in [MW95]. As a corollary of the results, they obtain a polynomial time algorithm for the BEST k-SIMILAR PATH problem studied in [SJB97]. The previous best algorithm was given by [SJB97] and takes exponential time in the worst case.
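
    For the regular-language case, the standard construction is to run a shortest-path search on the product of the labeled graph and a DFA accepting the language. The sketch below illustrates that idea on a toy graph and a toy two-state DFA; it demonstrates the construction only and is not the algorithm with improved space/time bounds referred to in the abstract.

        import heapq

        def constrained_shortest_path(graph, dfa_delta, dfa_start, dfa_accept, src, dst):
            """Dijkstra on the product of an edge-labeled graph and a DFA.

            graph: {u: [(v, label, weight), ...]}; dfa_delta: {(state, label): state}.
            Returns the length of the shortest src->dst walk whose label string is
            accepted by the DFA, or None if no such walk exists.
            """
            best = {(src, dfa_start): 0}
            pq = [(0, src, dfa_start)]
            while pq:
                d, u, q = heapq.heappop(pq)
                if d > best.get((u, q), float("inf")):
                    continue
                if u == dst and q in dfa_accept:
                    return d
                for v, label, w in graph.get(u, []):
                    q2 = dfa_delta.get((q, label))
                    if q2 is None:
                        continue                   # this edge label is not allowed here
                    if d + w < best.get((v, q2), float("inf")):
                        best[(v, q2)] = d + w
                        heapq.heappush(pq, (d + w, v, q2))
            return None

        # Toy instance: language "any number of 'road' edges followed by at most one 'rail' edge".
        graph = {"a": [("b", "road", 2), ("c", "rail", 4)],
                 "b": [("c", "rail", 1), ("c", "road", 5)],
                 "c": []}
        delta = {(0, "road"): 0, (0, "rail"): 1}   # state 1 = a rail edge has been used
        print(constrained_shortest_path(graph, delta, 0, {0, 1}, "a", "c"))   # prints 3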

  1. Synthesis and bioactivities of Phenazine-1-carboxylic acid derivatives based on the modification of PCA carboxyl group.

    Science.gov (United States)

    Xiong, Zhipeng; Niu, Junfan; Liu, Hao; Xu, Zhihong; Li, Junkai; Wu, Qinglai

    2017-05-01

    Phenazine-1-carboxylic acid (PCA), a natural product widely present in microbial metabolites of Pseudomonads and Streptomycetes, has been registered in China as a fungicide against rice sheath blight. To find compounds with higher fungicidal activity and to study how changing the carboxyl group of PCA affects fungicidal activity, we synthesized a series of PCA derivatives by modifying the carboxyl group of PCA; their structures were confirmed by 1H NMR and HRMS. Most compounds exhibited significant fungicidal activity in vitro. In particular, compound 6 inhibited Rhizoctonia solani with an EC50 value of 4.35 mg/L and compound 3b inhibited Fusarium graminearum with an EC50 value of 8.30 mg/L, compared with the positive control PCA with EC50 values of 7.88 mg/L (Rhizoctonia solani) and 127.28 mg/L (Fusarium graminearum), respectively. The results indicated that the carboxyl group of PCA can be modified into amide, acylhydrazine, ester, methyl, hydroxymethyl, chloromethyl and ether groups, and that appropriate modifications of the carboxyl group of PCA are useful for extending the fungicidal scope. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Treatment of a patient with posterior cortical atrophy (PCA) with chiropractic manipulation and Dynamic Neuromuscular Stabilization (DNS): A case report.

    Science.gov (United States)

    Francio, Vinicius T; Boesch, Ron; Tunning, Michael

    2015-03-01

    Posterior cortical atrophy (PCA) is a rare progressive neurodegenerative syndrome whose unusual symptoms include deficits of balance and bodily orientation, chronic pain syndrome and dysfunctional motor patterns. Current research provides minimal guidance on support, education and recommended evidence-based patient care. This case reports the use of chiropractic spinal manipulation, dynamic neuromuscular stabilization (DNS), and other adjunctive procedures along with medical treatment of PCA. A 54-year-old male presented to a chiropractic clinic with non-specific back pain associated with visual disturbances, slight memory loss, and inappropriate cognitive motor control. After physical examination, brain MRI and PET scan, the diagnosis of PCA was established. Chiropractic spinal manipulation and dynamic neuromuscular stabilization were utilized as adjunctive care to conservative pharmacological treatment of PCA. Outcome measurements showed a 60% improvement in the patient's perception of health, with a restored functional neuromuscular pattern and improvements in locomotion, posture, pain control, mood, tolerance of activities of daily living (ADLs) and overall satisfactory progress in quality of life. However, no changes in memory loss progression, visual-spatial orientation, or speech were observed. PCA is a progressive and debilitating condition. Because of poor awareness of PCA among physicians, patients usually receive incomplete care. Additional efforts must be centered on the musculoskeletal features of PCA, aiming at enhancement of quality of life and functional improvement (FI). Adjunctive rehabilitative treatment is considered essential for individuals with cognitive and motor disturbances, and manual medicine procedures may be considered a viable option.

  3. [The role of a single PCA3 test before a first negative prostate biopsy: 5-year follow-up].

    Science.gov (United States)

    Bernardeau, S; Charles, T; Fromont-Hankard, G; Irani, J

    2017-04-01

    We report the 5-year follow-up of a cohort of patients who underwent a first prostate biopsy following a prostate cancer antigen 3 (PCA3) test. We reviewed consecutive patients who had, in 2008, a single urinary PCA3 test using the Gen-Probe® assay before a first prostate biopsy for a prostate-specific antigen (PSA) between 3 and 20 ng/mL and/or a suspicious digital rectal examination. PCA3 performance was analyzed in 2008 and then in 2013 after taking into account the results of repeat biopsies. At the initial biopsy in 2008, among the 125 patients in the study cohort, prostate cancer was diagnosed in 47 patients (37.6%). An abnormal digital rectal exam, PSA density, prostate volume and PCA3 score were significantly associated with prostate cancer diagnosis. The PCA3 area under the receiver operating characteristic curve was 0.67 [95% CI: 0.57-0.76], with an optimal PCA3 threshold of 24 units in this sample. During the 5-year follow-up, among the 78 patients with a negative prostate biopsy in 2008, 23 (29.5%) had a repeat prostate biopsy, of whom 14 were diagnosed with prostate cancer. The PCA3 score measured in 2008 was associated with prostate cancer diagnosis (P=0.002). All 9 patients with a negative repeat prostate biopsy had a PCA3 score below the cut-off, while this was the case in only 2 of the 14 patients with a positive repeat prostate biopsy. The result of a single PCA3 test before a first prostate biopsy seems to be a useful aid in deciding whether to perform a repeat biopsy. Copyright © 2017. Published by Elsevier Masson SAS.

  4. A better understanding of long-range temporal dependence of traffic flow time series

    Science.gov (United States)

    Feng, Shuo; Wang, Xingmin; Sun, Haowei; Zhang, Yi; Li, Li

    2018-02-01

    Long-range temporal dependence is an important research perspective for the modelling of traffic flow time series. Various methods have been proposed to characterize long-range temporal dependence, including autocorrelation function analysis, spectral analysis and fractal analysis. However, few studies have examined the daily temporal dependence (i.e. the similarity between different daily traffic flow time series), which can help us better understand the long-range temporal dependence, such as the origin of the crossover phenomenon. Moreover, considering both types of dependence contributes to establishing a more accurate model and characterizing the properties of traffic flow time series. In this paper, we study the properties of daily temporal dependence by a simple averaging method and a Principal Component Analysis (PCA) based method. Meanwhile, we also study the long-range temporal dependence by Detrended Fluctuation Analysis (DFA) and Multifractal Detrended Fluctuation Analysis (MFDFA). The results show that both the daily and long-range temporal dependence exert considerable influence on the traffic flow series. The DFA results reveal that the daily temporal dependence creates the crossover phenomenon when estimating the Hurst exponent, which quantifies the long-range temporal dependence. Furthermore, comparison with the DFA test shows that the PCA-based method is a better way to extract the daily temporal dependence, especially when the difference between days is significant.
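
    A compact detrended fluctuation analysis (DFA) sketch is given below to make the Hurst-exponent estimate concrete; it uses first-order (linear) detrending and a placeholder series, so the window sizes and data are illustrative only.

        import numpy as np

        def dfa_hurst(x, scales=(16, 32, 64, 128, 256)):
            """Estimate the scaling (Hurst-like) exponent of series x via first-order DFA."""
            y = np.cumsum(x - np.mean(x))            # integrated (profile) series
            flucts = []
            for s in scales:
                n_seg = len(y) // s
                rms = []
                for i in range(n_seg):
                    seg = y[i * s:(i + 1) * s]
                    t = np.arange(s)
                    trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear trend
                    rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
                flucts.append(np.mean(rms))
            # The slope of log F(s) versus log s is the scaling exponent.
            slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
            return slope

        rng = np.random.default_rng(3)
        print(dfa_hurst(rng.normal(size=4096)))      # white noise: exponent near 0.5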

  5. Arabidopsis PCaP2 Functions as a Linker Between ABA and SA Signals in Plant Water Deficit Tolerance

    Directory of Open Access Journals (Sweden)

    Xianling Wang

    2018-05-01

    Full Text Available Water stress has a major influence on plant growth, development, and productivity. However, the cross-talk networks involved in drought tolerance are not well understood. Arabidopsis PCaP2 is a plasma membrane-associated Ca2+-binding protein. In this study, we employ qRT-PCR and β-glucuronidase (GUS) histochemical staining to demonstrate that PCaP2 expression is strongly induced in roots, cotyledons, true leaves, lateral roots, and whole plants under water deficit conditions. Compared with wild-type (WT) plants, PCaP2-overexpressing (PCaP2-OE) plants displayed enhanced water deficit tolerance in terms of seed germination, seedling growth, and plant survival. In contrast, PCaP2 mutation and reduction via PCaP2-RNAi rendered plants more sensitive to water deficit. Furthermore, PCaP2-RNAi and pcap2 seedlings showed shorter root hairs and lower relative water content compared with WT under normal conditions, and these phenotypes were exacerbated under water deficit. Additionally, the expression of PCaP2 was strongly induced by exogenous abscisic acid (ABA) and salicylic acid (SA) treatments. PCaP2-OE plants were insensitive to exogenous ABA and SA treatments, in contrast to the susceptible phenotypes of pcap2 and PCaP2-RNAi. It is well known that SNF1-related kinase 2s (SnRK2s) and pathogenesis-related proteins (PRs) are major factors that influence plant drought tolerance through ABA- and SA-mediated pathways, respectively. Interestingly, PCaP2 positively regulated the expression of drought-inducible genes (RD29A, KIN1, and KIN2), ABA-mediated drought-responsive genes (SnRK2.2, -2.3, -2.6, ABF1, -2, -3, -4), and SA-mediated drought-responsive genes (PR1, -2, -5) under water deficit, ABA, or SA treatments. Taken together, our results show that PCaP2 plays an important and positive role in Arabidopsis water deficit tolerance by participating in the response to both ABA and SA signals and by regulating root hair growth. This study provides novel insights into the

  6. 2L-PCA: a two-level principal component analyzer for quantitative drug design and its applications.

    Science.gov (United States)

    Du, Qi-Shi; Wang, Shu-Qing; Xie, Neng-Zhong; Wang, Qing-Yan; Huang, Ri-Bo; Chou, Kuo-Chen

    2017-09-19

    A two-level principal component predictor (2L-PCA) was proposed based on the principal component analysis (PCA) approach. It can be used to quantitatively analyze various compounds and peptides with respect to their functions or their potential to become useful drugs. One level deals with the physicochemical properties of the drug molecules, while the other deals with their structural fragments. The predictor has self-learning and feedback features to automatically improve its accuracy. It is anticipated that 2L-PCA will become a very useful tool for providing timely and useful clues during the process of drug development.

  7. Wavelet library for constrained devices

    Science.gov (United States)

    Ehlers, Johan Hendrik; Jassim, Sabah A.

    2007-04-01

    The wavelet transform is a powerful tool for image and video processing, useful in a range of applications. This paper is concerned with the efficiency of a certain fast wavelet transform (FWT) implementation and several wavelet filters that are more suitable for constrained devices. Such constraints are typically found on mobile (cell) phones or personal digital assistants (PDAs) and can be a combination of limited memory, slow floating-point operations (compared to integer operations, most often as a result of missing hardware support), and limited local storage. Yet these devices are burdened with demanding tasks such as processing a live video or audio signal through on-board capturing sensors. In this paper we present a new wavelet software library, HeatWave, that can be used efficiently for image/video processing and analysis tasks on mobile phones and PDAs. We demonstrate that HeatWave is suitable for real-time applications, with fine control and range to suit transform demands, and we present experimental results to substantiate these claims. Finally, since this library is intended for real use, we considered several well-known differences between common embedded operating system platforms, such as a lack of common routines or functions, stack limitations, etc. This makes HeatWave suitable for a range of applications and research projects.
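
    To illustrate the kind of integer-only transform such a library targets, a one-level Haar (S-transform) lifting step is sketched below; it uses only additions and shifts and is exactly invertible. This is a generic example rather than HeatWave's actual API.

        def haar_lifting_forward(x):
            """One level of the integer Haar (S) transform: returns (approx, detail)."""
            approx, detail = [], []
            for i in range(0, len(x) - 1, 2):
                d = x[i + 1] - x[i]          # detail (difference)
                s = x[i] + (d >> 1)          # approximation (integer mean)
                approx.append(s)
                detail.append(d)
            return approx, detail

        def haar_lifting_inverse(approx, detail):
            """Exact integer reconstruction of the original samples."""
            x = []
            for s, d in zip(approx, detail):
                a = s - (d >> 1)
                x.extend([a, a + d])
            return x

        samples = [12, 14, 7, 9, 30, 28, 5, 5]
        a, d = haar_lifting_forward(samples)
        assert haar_lifting_inverse(a, d) == samples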

  8. Swelling and microstructural development in path A PCA and type 316 stainless steel irradiated in HFIR to about 22 dpa

    International Nuclear Information System (INIS)

    Maziasz, P.J.; Braski, D.N.

    1983-01-01

    Irradiation of several microstructural variants of PCA and 20%-cold-worked N-lot type 316 stainless steel (CW 316) in HFIR to about 10 dpa produced no visible cavities at 300°C, bubbles at 400°C, and varying distributions of bubbles and voids at 500 and 600°C. The PCA-B1 swells the most and CW 316 (N-lot) the least at 600°C. Irradiations have been extended to about 22 dpa. The PCA-A1 swells at 0.06%/dpa at 600°C but at a much lower rate at 500°C. The PCA-A3 shows the lowest swelling at 600°C, about half the swelling rate of type 316 stainless steel.

  9. Testing a Modified PCA-Based Sharpening Approach for Image Fusion

    Directory of Open Access Journals (Sweden)

    Jan Jelének

    2016-09-01

    Full Text Available Image data sharpening is a challenging field of remote sensing science, which has become more relevant as high spatial-resolution satellites and superspectral sensors have emerged. Although the spectral property is crucial for mineral mapping, spatial resolution is also important as it allows targeted minerals/rocks to be identified/interpreted in a spatial context. Therefore, improving the spatial context, while keeping the spectral property provided by the superspectral sensor, would bring great benefits for geological/mineralogical mapping, especially in arid environments. In this paper, a new concept was tested using superspectral data (ASTER) and high spatial-resolution panchromatic data (WorldView-2) for image fusion. A modified Principal Component Analysis (PCA)-based sharpening method, which implements a histogram matching workflow that takes into account the real distribution of values, was employed to test whether the substitution of Principal Components (PC1–PC4) can yield a fused image which is spectrally more accurate. The new approach was compared to the most widely used methods, PCA sharpening and Gram–Schmidt sharpening (GS), both available in ENVI software (Version 5.2 and lower), as well as to the standard approach of sharpening Landsat 8 multispectral bands (MUL) using its own panchromatic (PAN) band. The visual assessment and the spectral quality indicators proved that the spectral performance of the proposed sharpening approach employing PC1 and PC2 improves on the performance of the PCA algorithm; moreover, comparable or better results are achieved compared to the GS method. It was shown that when using PC1, the visible-near infrared (VNIR) part of the spectrum was preserved better; however, if PC2 was used, the short-wave infrared (SWIR) part was preserved better. Furthermore, this approach improved the output spectral quality when fusing image data from different sensors (e.g., ASTER and WorldView-2) while keeping the proper albedo
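
    The core PC-substitution step, with a simple mean/variance match of the panchromatic band to PC1, can be sketched as follows. Real workflows must first co-register and resample the two datasets; the band count, array names and the matching rule here are simplified assumptions, not the authors' exact histogram-matching workflow.

        import numpy as np
        from sklearn.decomposition import PCA

        def pca_sharpen(multispectral, pan):
            """multispectral: (rows, cols, bands) upsampled to the PAN grid; pan: (rows, cols)."""
            r, c, b = multispectral.shape
            pca = PCA(n_components=b)
            pcs = pca.fit_transform(multispectral.reshape(-1, b))

            # Match the PAN band to PC1's first two moments before substitution.
            pc1 = pcs[:, 0]
            pan_flat = pan.ravel().astype(float)
            pan_std = (pan_flat - pan_flat.mean()) / (pan_flat.std() + 1e-12)
            pcs[:, 0] = pan_std * pc1.std() + pc1.mean()

            # Inverse transform back to the spectral space.
            return pca.inverse_transform(pcs).reshape(r, c, b)

        rng = np.random.default_rng(4)
        ms = rng.random((64, 64, 4))           # placeholder 4-band image
        pan = rng.random((64, 64))             # placeholder panchromatic band
        sharpened = pca_sharpen(ms, pan)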

  10. PCA-based algorithm for calibration of spectrophotometric analysers of food

    International Nuclear Information System (INIS)

    Morawski, Roman Z; Miekina, Andrzej

    2013-01-01

    Spectrophotometric analysers of food, instruments for determining the composition of food products and ingredients, are today of growing importance for the food industry, as well as for food distributors and consumers. Their metrological performance significantly depends on the numerical performance of the available means for spectrophotometric data processing, in particular the means for calibration of the analysers. In this paper, a new algorithm for this purpose is proposed, viz. an algorithm using principal component analysis (PCA). It is almost as efficient as PLS-based calibration algorithms, but much simpler.
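
    A minimal principal-component-regression sketch of the kind of calibration referred to above is shown below; the spectra, reference analyte values and the number of retained components are placeholders.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression
        from sklearn.pipeline import make_pipeline

        # Placeholder calibration set: absorbance spectra and a reference analyte content.
        rng = np.random.default_rng(5)
        spectra = rng.normal(size=(60, 500))
        analyte = rng.normal(loc=10.0, scale=2.0, size=60)

        # PCA compresses the spectra; ordinary least squares maps the scores to the analyte.
        pcr = make_pipeline(PCA(n_components=8), LinearRegression())
        pcr.fit(spectra, analyte)
        print("Calibration R^2:", pcr.score(spectra, analyte))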

  11. Research on distributed heterogeneous data PCA algorithm based on cloud platform

    Science.gov (United States)

    Zhang, Jin; Huang, Gang

    2018-05-01

    Principal component analysis (PCA) of heterogeneous data sets can address the limited scalability of centralized data processing. In order to reduce the intermediate data and error components generated for distributed heterogeneous data sets, a principal component analysis algorithm for heterogeneous data sets on a cloud platform is proposed. The algorithm performs the eigenvalue computation using Householder tridiagonalization and QR factorization, and calculates the error component of the heterogeneous database associated with the public key to obtain the intermediate data set and the lost information. Experiments on distributed DBM heterogeneous datasets show that the method is feasible and reliable in terms of execution time and accuracy.

  12. Convergence analysis of Chauvin's PCA learning algorithm with a constant learning rate

    International Nuclear Information System (INIS)

    Lv Jiancheng; Yi Zhang

    2007-01-01

    The convergence of Chauvin's PCA learning algorithm with a constant learning rate is studied in this paper by using a DDT method (deterministic discrete-time system method). Unlike the DCT method (deterministic continuous-time system method), the DDT method does not require the learning rate to converge to zero. An invariant set of Chauvin's algorithm with a constant learning rate is obtained so that the non-divergence of this algorithm can be guaranteed. Rigorous mathematical proofs are provided to establish the local convergence of this algorithm.

  13. Convergence analysis of Chauvin's PCA learning algorithm with a constant learning rate

    Energy Technology Data Exchange (ETDEWEB)

    Lv Jiancheng [Computational Intelligence Laboratory, School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu 610054 (China); Yi Zhang [Computational Intelligence Laboratory, School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu 610054 (China)]. E-mail: zhangyi@uestc.edu.cn

    2007-05-15

    The convergence of Chauvin's PCA learning algorithm with a constant learning rate is studied in this paper by using a DDT method (deterministic discrete-time system method). Unlike the DCT method (deterministic continuous-time system method), the DDT method does not require the learning rate to converge to zero. An invariant set of Chauvin's algorithm with a constant learning rate is obtained so that the non-divergence of this algorithm can be guaranteed. Rigorous mathematical proofs are provided to establish the local convergence of this algorithm.
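
    To make the setting concrete, the sketch below iterates a classical single-unit Hebbian PCA rule (Oja's rule) with a constant learning rate. Chauvin's algorithm uses a different update, so this is only an illustration of the kind of stochastic PCA iteration whose convergence such DDT analyses address, not the algorithm studied in the record above.

        import numpy as np

        rng = np.random.default_rng(6)
        # Zero-mean data whose leading principal direction is close to (1, 1)/sqrt(2).
        C = np.array([[3.0, 2.0], [2.0, 3.0]])
        data = rng.multivariate_normal(np.zeros(2), C, size=20000)

        w = rng.normal(size=2)
        eta = 0.01                       # constant learning rate (does not decay to zero)
        for x in data:
            y = w @ x
            w += eta * (y * x - y * y * w)   # Oja's normalized Hebbian update

        top_eigvec = np.linalg.eigh(C)[1][:, -1]
        print("alignment with true PC1:", abs(w @ top_eigvec) / np.linalg.norm(w))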

  14. FUZZY FUSION OF PCA, ICA AND ILDA FACE ALGORITHMS FOR ENHANCED USER AUTHENTICATION

    Directory of Open Access Journals (Sweden)

    PRASHANT KUMAR JAIN

    2017-09-01

    Full Text Available The use of biometrics has increased over the last few years due to its inherent advantages over customary identification tools such as token cards and passwords. In biometrics, after fingerprint, face recognition is the second most preferred method, with reasonably good accuracy. In some applications, such as CCTV cameras, where the face of a person is available for processing, face recognition techniques can be very useful. In this paper, the integration of the face recognition techniques PCA, ICA and ILDA using a fuzzy fusion method is detailed. The preliminary results clearly reveal that the fusion of methods improves the accuracy of user identification.

  15. PMSVM: An Optimized Support Vector Machine Classification Algorithm Based on PCA and Multilevel Grid Search Methods

    Directory of Open Access Journals (Sweden)

    Yukai Yao

    2015-01-01

    Full Text Available We propose an optimized Support Vector Machine classifier, named PMSVM, in which System Normalization, PCA, and Multilevel Grid Search methods are comprehensively considered for data preprocessing and parameter optimization, respectively. The main goals of this study are to improve the classification efficiency and accuracy of SVM. Sensitivity, Specificity, Precision, the ROC curve, and so forth are adopted to appraise the performance of PMSVM. Experimental results show that PMSVM has better accuracy and remarkably higher efficiency compared with traditional SVM algorithms.

  16. Temporal Glare

    DEFF Research Database (Denmark)

    Ritschel, Tobias; Ihrke, Matthias; Frisvad, Jeppe Revall

    2009-01-01

    Glare is a consequence of light scattered within the human eye when looking at bright light sources. This effect can be exploited for tone mapping since adding glare to the depiction of high-dynamic range (HDR) imagery on a low-dynamic range (LDR) medium can dramatically increase perceived contrast. Even though most, if not all, subjects report perceiving glare as a bright pattern that fluctuates in time, up to now it has only been modeled as a static phenomenon. We argue that the temporal properties of glare are a strong means to increase perceived brightness and to produce realistic ... to initially static HDR images. By conducting psychophysical studies, we validate that our method improves perceived brightness and that dynamic glare-renderings are often perceived as more attractive depending on the chosen scene.

  17. Order-constrained linear optimization.

    Science.gov (United States)

    Tidwell, Joe W; Dougherty, Michael R; Chrabaszcz, Jeffrey S; Thomas, Rick P

    2017-11-01

    Despite the fact that data and theories in the social, behavioural, and health sciences are often represented on an ordinal scale, there has been relatively little emphasis on modelling ordinal properties. The most common analytic framework used in psychological science is the general linear model, whose variants include ANOVA, MANOVA, and ordinary linear regression. While these methods are designed to provide the best fit to the metric properties of the data, they are not designed to maximally model ordinal properties. In this paper, we develop an order-constrained linear least-squares (OCLO) optimization algorithm that maximizes the linear least-squares fit to the data conditional on maximizing the ordinal fit based on Kendall's τ. The algorithm builds on the maximum rank correlation estimator (Han, 1987, Journal of Econometrics, 35, 303) and the general monotone model (Dougherty & Thomas, 2012, Psychological Review, 119, 321). Analyses of simulated data indicate that when modelling data that adhere to the assumptions of ordinary least squares, OCLO shows minimal bias, little increase in variance, and almost no loss in out-of-sample predictive accuracy. In contrast, under conditions in which data include a small number of extreme scores (fat-tailed distributions), OCLO shows less bias and variance, and substantially better out-of-sample predictive accuracy, even when the outliers are removed. We show that the advantages of OCLO over ordinary least squares in predicting new observations hold across a variety of scenarios in which researchers must decide to retain or eliminate extreme scores when fitting data. © 2017 The British Psychological Society.

  18. Ketamine PCA for treatment of end-of-life neuropathic pain in pediatrics.

    Science.gov (United States)

    Taylor, Matthew; Jakacki, Regina; May, Carol; Howrie, Denise; Maurer, Scott

    2015-12-01

    Control of neuropathic pain (NP) for children at end of life is challenging. Ketamine improves control of NP, but its use in children is not well described. We describe a retrospective case review of 14 children with terminal prognoses treated with ketamine patient-controlled analgesia (PCA) for management of opioid-refractory NP at the end of life. Median ketamine dose was 0.06 mg/kg/h (range 0.014-0.308 mg/kg/h) with a 0.05 mg/kg (range 0.03-0.5mg/kg) demand dose available every 15 minutes (range 10-60 minutes). All patients noted subjective pain relief with ketamine, and 79% had no adverse effects. Benzodiazepines limited neuropsychiatric side effects. Ketamine treatment arrested dose escalation of opioids in 64% of patients, and 79% were discharged to home hospice. Ketamine PCA is an effective, well-tolerated therapy for opioid-refractory NP in pediatric end-of-life care. © The Author(s) 2014.

  19. Location optimization of solar plants by an integrated hierarchical DEA PCA approach

    International Nuclear Information System (INIS)

    Azadeh, A.; Ghaderi, S.F.; Maghsoudi, A.

    2008-01-01

    Unique features of renewable energies such as solar energy have caused increasing demand for such resources. In order to use solar energy as a natural resource, environmental circumstances and the geographical location related to solar intensity must be considered. Different factors may affect the selection of a suitable location for solar plants, and these factors must be considered concurrently for optimum location identification of solar plants. This article presents an integrated hierarchical approach for the location of solar plants by data envelopment analysis (DEA), principal component analysis (PCA) and numerical taxonomy (NT). Furthermore, an integrated hierarchical DEA approach incorporating the most relevant parameters of solar plants is introduced. Moreover, two multivariate methods, namely PCA and NT, are used to validate the results of the DEA model. The proposed approach is tested for 25 different cities in Iran with 6 different regions within each city. This is the first study that considers an integrated hierarchical DEA approach for the geographical location optimization of solar plants. Implementation of the proposed approach would enable energy policy makers to select the best possible location for the construction of a solar power plant with the lowest possible costs.

  20. PCA-based ANN approach to leak classification in the main pipes of VVER-1000

    International Nuclear Information System (INIS)

    Hadad, Kamal; Jabbari, Masoud; Tabadar, Z.; Hashemi-Tilehnoee, Mehdi

    2012-01-01

    This paper presents a neural-network-based fault diagnosis approach that allows dynamic identification of crack and leak faults. The method utilizes the Principal Component Analysis (PCA) technique to reduce the problem dimension. Such a dimension reduction approach leads to faster diagnosis and allows a better graphical presentation of the results. To show the effectiveness of the proposed approach, two methodologies are used to train the neural network (NN). At first, a training matrix composed of 14 variables is used to train a Multilayer Perceptron neural network (MLP) with the Resilient Backpropagation (RBP) algorithm. Employing the proposed method, a more accurate and simpler network is designed in which the input size is reduced from 14 to 6 variables for training the NN. In short, the application of PCA greatly reduces the network topology and allows more efficient training algorithms to be employed. The accuracy, generalization ability, and reliability of the designed networks are verified using data from 10 simulated events of a VVER-1000 simulation with the DINAMIKA-97 code. Noise is added to the data to evaluate the robustness of the method, and the method again proves to be effective and powerful. (orig.)

  1. PCA Based Stress Monitoring of Cylindrical Specimens Using PZTs and Guided Waves

    Directory of Open Access Journals (Sweden)

    Jabid Quiroga

    2017-12-01

    Full Text Available Since mechanical stress in structures affects issues such as strength, expected operational life and dimensional stability, a continuous stress monitoring scheme is necessary for a complete integrity assessment. Consequently, this paper proposes a stress monitoring scheme for cylindrical specimens, which are widely used in structures such as pipelines, wind turbines or bridges. The approach consists of tracking guided wave variations due to load changes by comparing wave statistical patterns via Principal Component Analysis (PCA). Each load scenario is projected into the PCA space by means of a baseline model and represented using the Q-statistical indices. Experimental validation of the proposed methodology is conducted on two specimens: (i) a 12.7 mm (1/2″) diameter, 0.4 m long, AISI 1020 steel rod, and (ii) a 25.4 mm (1″) diameter, 6 m long, schedule 40, A-106 hollow cylinder. Specimen 1 was subjected to axial loads, while specimen 2 was subjected to flexion. In both cases, longitudinal and flexural guided waves were generated simultaneously via piezoelectric devices (PZTs) in a pitch-catch configuration. Experimental results show the feasibility of the approach and its potential use as an in-situ continuous stress monitoring application.
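
    The baseline-plus-Q-statistic scheme can be sketched as follows: a PCA baseline is fitted to guided-wave records from a reference load state, and new records are flagged when their reconstruction error (Q, or squared prediction error) exceeds a threshold taken from the baseline distribution. The shapes, threshold percentile and data are placeholders, not the experimental values.

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(7)
        baseline = rng.normal(size=(50, 1024))           # guided-wave signals, reference load
        new_scenario = baseline[:10] + 0.8 * rng.normal(size=(10, 1024))   # changed stress state

        pca = PCA(n_components=5).fit(baseline)

        def q_statistic(X):
            """Squared prediction error of X with respect to the PCA baseline model."""
            residual = X - pca.inverse_transform(pca.transform(X))
            return np.sum(residual ** 2, axis=1)

        threshold = np.percentile(q_statistic(baseline), 99)    # empirical control limit
        print("Records exceeding the Q limit:", int(np.sum(q_statistic(new_scenario) > threshold)))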

  2. The PCA learning effect: An emerging correlate of face memory during childhood.

    Science.gov (United States)

    Gao, Xiaoqing; Maurer, Daphne; Wilson, Hugh R

    2015-10-01

    Human adults implicitly learn the prototype and the principal components of the variability distinguishing faces (Gao & Wilson, 2014). Here we measured the implicit learning effect in adults and 9-year-olds and, with a modified child-friendly procedure, in 7-year-olds. All age groups showed the implicit learning effect by falsely recognizing the average face (the prototype effect) and the principal component faces (the PCA learning effect) as having been seen. The PCA learning effect, but not the prototype effect, increased between 9 years of age and adulthood, and at both ages it was the better predictor of memory for the actually studied faces. In contrast, for the 7-year-olds, the better predictor of face memory was the prototype effect. This pattern suggests that there may be a developmental change between ages 7 and 9 in the mechanism underlying memory for faces. We provide the first evidence that children as young as age 7 can extract the most important dimensions of variation represented by principal components among individual faces, a key ability that grows stronger with age and comes to underlie memory for faces. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Features and performance of a PCa board that blocks electromagnetic waves. Electromagnetically shielded building using carbon-fiber-admixed PCa boards; Denjiha wo shadansuru PCa ban no tokucho to seino. Tanso sen'i konnyu PCa ban wo mochiita denji shirudobiru

    Energy Technology Data Exchange (ETDEWEB)

    Yoshida, Katsuo; Kasai, Yasuaki [Obayashi Corp., Osaka (Japan); Okada, Shin'ichiro; Sakamoto, Shin [Osaka Gas Corp., Osaka (Japan)

    1999-03-10

    With the rapid spread of public radio communication equipment such as portable telephones and wireless LANs, blocking electromagnetic waves between the inside and outside of a building (building shielding) has become a significant problem. To provide shielding at a cost comparable to conventional methods, without shielding the whole building and while ensuring comfort as an office, a PCa board in which carbon fiber is mixed into the mortar was developed. This paper describes the survey results on the electromagnetic shielding performance of a building whose external walls were constructed with this board, and explains the electromagnetic characteristics of the carbon-fiber mortar, its application to the PCa board, and the construction method. Measurements were carried out in an electromagnetic shield room laboratory, with the following results: 1) concrete alone or a carbon fiber mesh alone has little shielding effect; 2) mixing 1% chopped carbon fiber into the concrete has a considerable effect; 3) the effect is greatest when chopped carbon fiber and carbon fiber mesh are combined, and this combination was confirmed to be cost-effective. (NEDO)

  4. A Method for Aileron Actuator Fault Diagnosis Based on PCA and PGC-SVM

    Directory of Open Access Journals (Sweden)

    Wei-Li Qin

    2016-01-01

    Full Text Available Aileron actuators are pivotal components of the aircraft flight control system. Thus, the fault diagnosis of aileron actuators is vital for enhancing reliability and fault-tolerant capability. This paper presents an aileron actuator fault diagnosis approach combining principal component analysis (PCA), grid search (GS), 10-fold cross-validation (CV), and a one-versus-one support vector machine (SVM). This method is referred to as PGC-SVM and utilizes the direct drive valve input, force motor current, and displacement feedback signal to realize fault detection and location. First, several common faults of aileron actuators, which include force motor coil break, sensor coil break, cylinder leakage, and amplifier gain reduction, are extracted from the fault quadrantal diagram and the corresponding fault mechanisms are analyzed. Second, feature extraction is performed with dimension reduction using PCA. Finally, the GS and CV algorithms are employed to train a one-versus-one SVM for fault classification, thus obtaining the optimal model parameters and assuring the generalization of the trained SVM, respectively. To verify the effectiveness of the proposed approach, four types of faults are introduced into the simulation model established with AMESim and Simulink. The results demonstrate its desirable diagnostic performance, which outperforms that of the traditional SVM.
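
    The PGC-SVM training stage can be approximated with the scikit-learn sketch below: PCA for dimension reduction, a grid search over (C, gamma) with 10-fold cross-validation, and an SVC, which handles multi-class problems in a one-versus-one fashion by construction. The feature matrix, fault labels and grid values are placeholders.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.model_selection import GridSearchCV
        from sklearn.pipeline import Pipeline
        from sklearn.svm import SVC

        # Placeholder features from valve input, force motor current and displacement feedback.
        rng = np.random.default_rng(8)
        X = rng.normal(size=(300, 30))
        y = rng.integers(0, 5, size=300)     # 0 = normal, 1..4 = the four fault classes

        pipe = Pipeline([("pca", PCA(n_components=10)),
                         ("svm", SVC(decision_function_shape="ovo"))])
        grid = {"svm__C": [0.1, 1, 10, 100], "svm__gamma": [1e-3, 1e-2, 1e-1]}

        search = GridSearchCV(pipe, grid, cv=10)     # 10-fold cross-validation
        search.fit(X, y)
        print("Best parameters:", search.best_params_)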

  5. Learning binary code via PCA of angle projection for image retrieval

    Science.gov (United States)

    Yang, Fumeng; Ye, Zhiqiang; Wei, Xueqi; Wu, Congzhong

    2018-01-01

    With the benefits of low storage cost and high query speed, binary code representation methods are widely researched for efficiently retrieving large-scale data. In image hashing methods, learning a hashing function that embeds high-dimensional features into Hamming space is a key step for accurate retrieval. The principal component analysis (PCA) technique is widely used in compact hashing methods; most of these methods adopt PCA projection functions to project the original data onto several real-valued dimensions, and each projected dimension is then quantized into one bit by thresholding. The variances of the different projected dimensions differ, and real-valued projection produces more quantization error. To avoid real-valued projections with large quantization error, in this paper we propose to use a cosine-similarity (angle) projection for each dimension; the angle projection preserves the original structure and yields a more compact representation of the cosine values. We combined our method with the ITQ hashing algorithm, and extensive experiments on the public CIFAR-10 and Caltech-256 datasets validate the effectiveness of the proposed method.

  6. External validation of a PCA-3-based nomogram for predicting prostate cancer and high-grade cancer on initial prostate biopsy.

    Science.gov (United States)

    Greene, Daniel J; Elshafei, Ahmed; Nyame, Yaw A; Kara, Onder; Malkoc, Ercan; Gao, Tianming; Jones, J Stephen

    2016-08-01

    The aim of this study was to externally validate a previously developed PCA3-based nomogram for the prediction of prostate cancer (PCa) and high-grade (intermediate and/or high-grade) prostate cancer (HGPCa) at the time of initial prostate biopsy. A retrospective review was performed on a cohort of 336 men from a large urban academic medical center. All men had serum PSA, PSA at diagnosis, PCA3, total prostate volume (TPV), and abnormal finding on digital rectal exam (DRE) recorded. These variables were used to test the accuracy (concordance index) and calibration of the previously published PCA3 nomogram. Biopsy confirmed PCa and HGPCa in 51.0% and 30.4% of validation patients, respectively. This differed from the original cohort in that it had significantly more PCa and HGPCa (51% vs. 44%, P = 0.019; and 30.4% vs. 19.1%, respectively). For PCa detection the concordance index was 75% and 77% for overall PCa and HGPCa, respectively. Calibration for overall PCa was good. This represents the first external validation of a PCA3-based prostate cancer predictive nomogram in a North American population. Prostate 76:1019-1023, 2016. © 2016 Wiley Periodicals, Inc.

  7. Modeling the microstructural evolution during constrained sintering

    DEFF Research Database (Denmark)

    Bjørk, Rasmus; Frandsen, Henrik Lund; Tikare, V.

    A numerical model able to simulate solid state constrained sintering of a powder compact is presented. The model couples an existing kinetic Monte Carlo (kMC) model for free sintering with a finite element (FE) method for calculating stresses on a microstructural level. The microstructural response to the stress field, as well as the FE calculation of the stress field from the microstructural evolution, is discussed. The sintering behavior of two powder compacts constrained by a rigid substrate is simulated and compared to free sintering of the same samples. Constrained sintering results in a larger number

  8. Quantum Temporal Imaging

    OpenAIRE

    Tsang, Mankei; Psaltis, Demetri

    2006-01-01

    The concept of quantum temporal imaging is proposed to manipulate the temporal correlation of entangled photons. In particular, we show that time correlation and anticorrelation can be converted to each other using quantum temporal imaging.

  9. Asymptotic Likelihood Distribution for Correlated & Constrained Systems

    CERN Document Server

    Agarwal, Ujjwal

    2016-01-01

    This report describes my work as a summer student at CERN. It discusses the asymptotic distribution of the likelihood ratio when the total number of parameters is h and 2 of these are constrained and correlated.

  10. Constrained bidirectional propagation and stroke segmentation

    Energy Technology Data Exchange (ETDEWEB)

    Mori, S; Gillespie, W; Suen, C Y

    1983-03-01

    A new method for decomposing a complex figure into its constituent strokes is described. This method, based on constrained bidirectional propagation, is suitable for parallel processing. Examples of its application to the segmentation of Chinese characters are presented. 9 references.

  11. Mathematical Modeling of Constrained Hamiltonian Systems

    NARCIS (Netherlands)

    Schaft, A.J. van der; Maschke, B.M.

    1995-01-01

    Network modelling of unconstrained energy conserving physical systems leads to an intrinsic generalized Hamiltonian formulation of the dynamics. Constrained energy conserving physical systems are directly modelled as implicit Hamiltonian systems with regard to a generalized Dirac structure on the

  12. Client's Constraining Factors to Construction Project Management

    African Journals Online (AJOL)

    factors as a significant system that constrains project management success of public and ... finance for the project and prompt payment for work executed; clients .... consideration of the loading patterns of these variables, the major factor is ...

  13. On the origin of constrained superfields

    Energy Technology Data Exchange (ETDEWEB)

    Dall’Agata, G. [Dipartimento di Fisica “Galileo Galilei”, Università di Padova,Via Marzolo 8, 35131 Padova (Italy); INFN, Sezione di Padova,Via Marzolo 8, 35131 Padova (Italy); Dudas, E. [Centre de Physique Théorique, École Polytechnique, CNRS, Université Paris-Saclay,F-91128 Palaiseau (France); Farakos, F. [Dipartimento di Fisica “Galileo Galilei”, Università di Padova,Via Marzolo 8, 35131 Padova (Italy); INFN, Sezione di Padova,Via Marzolo 8, 35131 Padova (Italy)

    2016-05-06

    In this work we analyze constrained superfields in supersymmetry and supergravity. We propose a constraint that, in combination with the constrained goldstino multiplet, consistently removes any selected component from a generic superfield. We also describe its origin, providing the operators whose equations of motion lead to the decoupling of such components. We illustrate our proposal by means of various examples and show how known constraints can be reproduced by our method.

  14. Pain management in patients with adolescent idiopathic scoliosis undergoing posterior spinal fusion: combined intrathecal morphine and continuous epidural versus PCA.

    Science.gov (United States)

    Ravish, Matthew; Muldowney, Bridget; Becker, Aimee; Hetzel, Scott; McCarthy, James J; Nemeth, Blaise A; Noonan, Kenneth J

    2012-12-01

    A retrospective case-comparison study. Compare efficacy and safety of combined intrathecal morphine (ITM) and epidural analgesia (EPI) to that of conventional intravenous patient-controlled analgesia (IV-PCA) after posterior spinal fusion (PSF) for adolescent idiopathic scoliosis (AIS). Pain control after PSF in AIS has been managed traditionally with IV-PCA. More recently, studies have shown improvement in pain control with the use of continuous EPI or intraoperative ITM. No studies to our knowledge have compared the use of both ITM and EPI analgesia to that of IV-PCA. An Institutional Review Board-approved retrospective case-comparison study was performed from 1989 to 2009 of all patients undergoing PSF for AIS. Patients received either IV-PCA or ITM/EPI. Daily pain scores were recorded along with total opioid and benzodiazepine use. Adverse events were recorded for all the patients. A total of 146 patients were initially included in the study; 95 patients received ITM/EPI and 51 received IV-PCA as a historical control. Eight patients from the ITM/EPI group were excluded from the pain comparison portion of the study. There were no statistical differences in age, sex, weight, or hospital stay between the 2 groups. The ITM/EPI group had, on average, 1 additional level of fusion (P = 0.001). Daily average pain scores were lower in the ITM/EPI group on all hospital days, and statistically lower on days 1 and 3 to 5. Total opioid requirement was significantly lower in the ITM/EPI patients, although oral opioid use was higher among this group. Total benzodiazepine use was lower among the IV-PCA group. A total of 15.7% of the IV-PCA patients had bladder hypotonia, compared with 1.1% of the ITM/EPI group (P = 0.002). The rate of ileus was 15.7% in the IV-PCA patients and 5.7% in the ITM/EPI (P = 0.071). Respiratory depression was reported in 4 ITM/EPI patients, 0 in our PCA group. Technical catheter malfunction was reported in 8.5% of the EPI group. The use of ITM

  15. Improved algorithms for the classification of rough rice using a bionic electronic nose based on PCA and the Wilks distribution.

    Science.gov (United States)

    Xu, Sai; Zhou, Zhiyan; Lu, Huazhong; Luo, Xiwen; Lan, Yubin

    2014-03-19

    Principal Component Analysis (PCA) is one of the main methods used for electronic nose pattern recognition. However, poor classification performance is common when using regular PCA for classification and recognition. This paper aims to improve the classification performance of regular PCA based on the existing Wilks Λ-statistic (i.e., PCA combined with the Wilks distribution). The improved algorithms, which combine regular PCA with the Wilks Λ-statistic, were developed after analysing the functionality and defects of PCA. Verification tests were conducted using a PEN3 electronic nose. The collected samples consisted of the volatiles of six varieties of rough rice (Zhongxiang1, Xiangwan13, Yaopingxiang, WufengyouT025, Pin 36, and Youyou122), grown in the same area and season. The first two principal components used as analysis vectors cannot perform the rough rice varieties classification task based on regular PCA. Using the improved algorithms, which combine regular PCA with the Wilks Λ-statistic, different principal components were selected as analysis vectors. The set of data points of the Mahalanobis distance between each of the varieties of rough rice was used to estimate the performance of the classification. The result illustrates that the rough rice varieties classification task is achieved well using the improved algorithm. A Probabilistic Neural Network (PNN) was also established to test the effectiveness of the improved algorithms. The first two principal components (namely PC1 and PC2) and the first and fifth principal components (namely PC1 and PC5) were selected as the inputs of the PNN for the classification of the six rough rice varieties. The results indicate that the classification accuracy based on the improved algorithm was improved by 6.67% compared to the results of the regular method. These results prove the effectiveness of using the Wilks Λ-statistic to improve the classification accuracy of the regular PCA approach. The results
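
    As a rough illustration of the kind of procedure described above (not the authors' implementation), the sketch below ranks PCA scores of simulated electronic-nose data by a per-component Wilks' lambda and keeps the two most discriminative components; the data, the univariate form of the statistic and the two-component selection are assumptions made for the example.

```python
# Illustrative sketch: select discriminative principal components by Wilks' lambda.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 10))            # hypothetical e-nose sensor responses
y = np.repeat(np.arange(6), 10)          # six rough-rice varieties (hypothetical labels)

scores = PCA(n_components=6).fit_transform(X)

def wilks_lambda(col, labels):
    """Univariate Wilks' lambda: within-group scatter / total scatter (small = separable)."""
    total = np.sum((col - col.mean()) ** 2)
    within = sum(np.sum((col[labels == g] - col[labels == g].mean()) ** 2)
                 for g in np.unique(labels))
    return within / total

lams = np.array([wilks_lambda(scores[:, j], y) for j in range(scores.shape[1])])
best = np.argsort(lams)[:2]              # e.g. PC1 and PC5 rather than simply PC1/PC2
print("selected components:", best, "Wilks' lambda:", lams[best])
```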

  16. Comparative evaluation of urinary PCA3 and TMPRSS2: ERG scores and serum PHI in predicting prostate cancer aggressiveness.

    Science.gov (United States)

    Tallon, Lucile; Luangphakdy, Devillier; Ruffion, Alain; Colombel, Marc; Devonec, Marian; Champetier, Denis; Paparel, Philippe; Decaussin-Petrucci, Myriam; Perrin, Paul; Vlaeminck-Guillem, Virginie

    2014-07-30

    It has been suggested that urinary PCA3 and TMPRSS2:ERG fusion tests and serum PHI correlate to cancer aggressiveness-related pathological criteria at prostatectomy. To evaluate and compare their ability in predicting prostate cancer aggressiveness, PHI and urinary PCA3 and TMPRSS2:ERG (T2) scores were assessed in 154 patients who underwent radical prostatectomy for biopsy-proven prostate cancer. Univariate and multivariate analyses using logistic regression and decision curve analyses were performed. All three markers were predictors of a tumor volume ≥0.5 mL. Only PHI predicted Gleason score ≥7. T2 score and PHI were both independent predictors of extracapsular extension (≥pT3), while multifocality was only predicted by PCA3 score. Moreover, when compared to a base model (age, digital rectal examination, serum PSA, and Gleason sum at biopsy), the addition of both PCA3 score and PHI to the base model induced a significant increase (+12%) when predicting tumor volume >0.5 mL. PHI and urinary PCA3 and T2 scores can be considered as complementary predictors of cancer aggressiveness at prostatectomy.

  17. Comparative Evaluation of Urinary PCA3 and TMPRSS2: ERG Scores and Serum PHI in Predicting Prostate Cancer Aggressiveness

    Directory of Open Access Journals (Sweden)

    Lucile Tallon

    2014-07-01

    Full Text Available It has been suggested that urinary PCA3 and TMPRSS2:ERG fusion tests and serum PHI correlate to cancer aggressiveness-related pathological criteria at prostatectomy. To evaluate and compare their ability in predicting prostate cancer aggressiveness, PHI and urinary PCA3 and TMPRSS2:ERG (T2) scores were assessed in 154 patients who underwent radical prostatectomy for biopsy-proven prostate cancer. Univariate and multivariate analyses using logistic regression and decision curve analyses were performed. All three markers were predictors of a tumor volume ≥0.5 mL. Only PHI predicted Gleason score ≥7. T2 score and PHI were both independent predictors of extracapsular extension (≥pT3), while multifocality was only predicted by PCA3 score. Moreover, when compared to a base model (age, digital rectal examination, serum PSA, and Gleason sum at biopsy), the addition of both PCA3 score and PHI to the base model induced a significant increase (+12%) when predicting tumor volume >0.5 mL. PHI and urinary PCA3 and T2 scores can be considered as complementary predictors of cancer aggressiveness at prostatectomy.

  18. Human Classification Based on Gestural Motions by Using Components of PCA

    International Nuclear Information System (INIS)

    Aziz, Azri A; Wan, Khairunizam; Za'aba, S K; Shahriman A B; Asyekin H; Zuradzman M R; Adnan, Nazrul H

    2013-01-01

    Lately, the study of human capabilities with the aim of integrating them into machines has become a widely discussed topic. Humans are blessed with special abilities: they can hear, see, sense, speak, think and understand each other. Giving such abilities to machines to improve human life is the researchers' aim for a better quality of life in the future. This research concentrates on human gesture, specifically arm motions, for distinguishing individuals, which led to the development of a hand gesture database. We try to differentiate human physical characteristics based on hand gestures represented by arm trajectories. Subjects were selected with different body sizes, and the acquired data then underwent a resampling process. The results discuss the classification of humans based on arm trajectories using Principal Component Analysis (PCA)

  19. Permeability Estimation of Rock Reservoir Based on PCA and Elman Neural Networks

    Science.gov (United States)

    Shi, Ying; Jian, Shaoyong

    2018-03-01

    An intelligent method based on fuzzy neural networks with a PCA algorithm is proposed to estimate the permeability of rock reservoirs. First, the dimensionality of the rock slice characteristic parameters is reduced by the principal component analysis method. The mapping relationship between these parameters and permeability is then found through fuzzy neural networks. The validity and reliability of the estimation were tested with practical data from the Yan’an region in the Ordos Basin. The results showed that the average relative error of the permeability estimates is 6.25%, and that the method has better convergence speed and accuracy than others. Therefore, by using cheap rock slice information, the permeability of a rock reservoir can be estimated efficiently and accurately, with high reliability, practicability and application prospects.

  20. Evaluation of the application of BIM technology based on PCA - Q Clustering Algorithm and Choquet Integral

    Directory of Open Access Journals (Sweden)

    Wei Xiaozhao

    2016-03-01

    Full Text Available As the construction industry enters the era of data, BIM (building information model) software has been widely used to meet the industry's practical needs, although different software packages differ in the maturity of their practical application. In this work, expert scoring is used to mark the maturity indices of BIM technology application and to establish an evaluation index system; a PCA-Q clustering algorithm is then used to classify the evaluation index system, and a comprehensive evaluation combining the Choquet integral is performed on the classified index system to achieve a reasonable assessment of the maturity of BIM technology application. This lays a foundation for the future development of BIM technology in various fields of construction and provides direction for its comprehensive application.

  1. Visual tracking based on the sparse representation of the PCA subspace

    Science.gov (United States)

    Chen, Dian-bing; Zhu, Ming; Wang, Hui-li

    2017-09-01

    We construct a collaborative model of the sparse representation and the subspace representation. First, we represent the tracking target in the principal component analysis (PCA) subspace, and then we employ an L1 regularization to restrict the sparsity of the residual term, an L2 regularization term to restrict the sparsity of the representation coefficients, and an L2 norm to restrict the distance between the reconstruction and the target. Then we implement the algorithm in the particle filter framework. Furthermore, an iterative method is presented to get the global minimum of the residual and the coefficients. Finally, an alternative template update scheme is adopted to avoid the tracking drift which is caused by inaccurate updates. In the experiment, we test the algorithm on 9 sequences, and compare the results with 5 state-of-the-art methods. According to the results, we can conclude that our algorithm is more robust than the other methods.
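
    The objective sketched in the abstract can be written, under assumed notation, as min over c and e of ||y - Uc - e||^2 + lam1*||e||_1 + lam2*||c||^2, with U a PCA basis of the target. The snippet below is a minimal alternating solver for that assumed form (a ridge step for the coefficients, soft-thresholding for the sparse residual); it is not the authors' tracker and omits the particle filter and template update.

```python
# Minimal sketch: sparse representation of a target in a PCA subspace.
import numpy as np

def soft(x, t):
    """Soft-thresholding operator for the L1 term."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def sparse_pca_represent(y, U, lam1=0.1, lam2=0.01, iters=50):
    d, k = U.shape
    c, e = np.zeros(k), np.zeros(d)
    A = np.linalg.inv(U.T @ U + lam2 * np.eye(k)) @ U.T   # ridge solution operator
    for _ in range(iters):
        c = A @ (y - e)                 # coefficients in the PCA subspace
        e = soft(y - U @ c, lam1 / 2)   # sparse residual (e.g. occlusion)
    return c, e

rng = np.random.default_rng(1)
U = np.linalg.qr(rng.normal(size=(256, 8)))[0]            # hypothetical PCA basis of the target
y = U @ rng.normal(size=8) + rng.normal(scale=0.01, size=256)
c, e = sparse_pca_represent(y, U)
print("reconstruction error:", np.linalg.norm(y - U @ c - e))
```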

  2. Oxidation/volatilization rates in air for candidate fusion reactor blanket materials, PCA and HT-9

    International Nuclear Information System (INIS)

    Piet, S.J.; Kraus, H.G.; Neilson, R.M. Jr.; Jones, J.L.

    1986-01-01

    Large uncertainties exist in the quantity of neutron-induced activation products that can be mobilized in potential fusion accidents. The accidental combination of high temperatures and oxidizing conditions might lead to mobilization of a significant amount of activation products from structural materials. Here, the volatilization of constituents of PCA and HT-9 resulting from oxidation in air was investigated. Tests were conducted in flowing air at temperatures from 600 to 1300 °C for 1, 5, or 20 h. Elemental volatility was calculated in terms of the weight fraction of the element volatilized from the initial alloy. Molybdenum and manganese were the radiologically significant primary constituents most volatilized, suggesting that molybdenum and manganese should be minimized in fusion steel compositions. Higher chromium content appears beneficial in reducing hazards from mobile activation products. Scanning electron microscopy and energy dispersive spectroscopy were used to study the oxide layer on samples. (orig.)

  3. Oxidation/volatilization rates in air for candidate fusion reactor blanket materials, PCA and HT-9

    International Nuclear Information System (INIS)

    Piet, S.J.; Kraus, H.G.; Neilson, R.M. Jr.; Jones, J.L.

    1986-01-01

    Large uncertainties exist in the quantity of neutron-induced activation products that can be mobilized in potential fusion accidents. The accidental combination of high temperatures and oxidizing conditions might lead to mobilization of a significant amount of activation products from structural materials. Here, the volatilization of constituents of PCA and HT-9 resulting from oxidation in air was investigated. Tests were conducted in flowing air at temperatures from 600 to 1300 °C for 1, 5, or 20 hours. Elemental volatility was calculated in terms of the weight fraction of the element volatilized from the initial alloy. Molybdenum and manganese were the radiologically significant primary constituents most volatilized, suggesting that molybdenum and manganese should be minimized in fusion steel compositions. Higher chromium content appears beneficial in reducing hazards from mobile activation products. Scanning electron microscopy and energy dispersive spectroscopy were used to study the oxide layer on samples

  4. Coarse-to-fine markerless gait analysis based on PCA and Gauss-Laguerre decomposition

    Science.gov (United States)

    Goffredo, Michela; Schmid, Maurizio; Conforto, Silvia; Carli, Marco; Neri, Alessandro; D'Alessio, Tommaso

    2005-04-01

    Human movement analysis is generally performed through the utilization of marker-based systems, which allow reconstructing, with high levels of accuracy, the trajectories of markers placed on specific points of the human body. Marker-based systems, however, show some drawbacks that can be overcome by the use of video systems applying markerless techniques. In this paper, a specifically designed computer vision technique for the detection and tracking of relevant body points is presented. It is based on the Gauss-Laguerre Decomposition, and a Principal Component Analysis technique (PCA) is used to circumscribe the region of interest. Results obtained on both synthetic and experimental tests show a significant reduction in computational cost, with no significant reduction in tracking accuracy.

  5. Insights on the Spectral Signatures of Stellar Activity and Planets from PCA

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Allen B.; Fischer, Debra A. [Department of Astronomy, Yale University, 52 Hillhouse Avenue, New Haven, CT 06511 (United States); Cisewski, Jessi [Department of Statistics, Yale University, 24 Hillhouse Avenue, New Haven, CT 06511 (United States); Dumusque, Xavier [Observatoire de Genève, Université de Genève, 51 ch. des Maillettes, 1290 Versoix (Switzerland); Ford, Eric B., E-mail: allen.b.davis@yale.edu [Center for Exoplanets and Habitable Worlds, The Pennsylvania State University, University Park, PA 16802 (United States)

    2017-09-01

    Photospheric velocities and stellar activity features such as spots and faculae produce measurable radial velocity signals that currently obscure the detection of sub-meter-per-second planetary signals. However, photospheric velocities are imprinted differently in a high-resolution spectrum than are Keplerian Doppler shifts. Photospheric activity produces subtle differences in the shapes of absorption lines due to differences in how temperature or pressure affects the atomic transitions. In contrast, Keplerian Doppler shifts affect every spectral line in the same way. With a high enough signal-to-noise (S/N) and resolution, statistical techniques can exploit differences in spectra to disentangle the photospheric velocities and detect lower-amplitude exoplanet signals. We use simulated disk-integrated time-series spectra and principal component analysis (PCA) to show that photospheric signals introduce spectral line variability that is distinct from that of Doppler shifts. We quantify the impact of instrumental resolution and S/N for this work.

  6. Dimensionality Reduction Methods: Comparative Analysis of methods PCA, PPCA and KPCA

    Directory of Open Access Journals (Sweden)

    Jorge Arroyo-Hernández

    2016-01-01

    Full Text Available Dimensionality reduction methods are algorithms that map a data set into subspaces derived from the original space, of fewer dimensions, allowing a description of the data at a lower cost. Because of their importance, they are widely used in machine learning processes. This article presents a comparative analysis of the PCA, PPCA and KPCA dimensionality reduction methods. A reconstruction experiment with worm-shape data was performed using structures of landmarks located on the body contour, with the methods using different numbers of principal components. The results showed that all methods can be seen as alternative processes. Nevertheless, thanks to its potential for analysis in feature space and the method presented for computing its pre-image, KPCA offers a better method for the recognition process and pattern extraction
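
    A hedged sketch of the comparison idea: reconstruct synthetic landmark contours through PCA and kernel PCA (using the pre-image approximation provided by scikit-learn's fit_inverse_transform) and compare reconstruction errors. The worm-shape data, component counts and kernel settings are placeholders, and the PPCA variant is omitted for brevity.

```python
# Sketch: compare PCA and KPCA reconstruction of synthetic 2-D landmark contours.
import numpy as np
from sklearn.decomposition import PCA, KernelPCA

rng = np.random.default_rng(2)
t = np.linspace(0, 2 * np.pi, 40)

def contour():
    a, b = 1 + 0.3 * rng.normal(size=2)                 # per-sample shape variation
    pts = np.c_[a * np.cos(t), b * np.sin(t)]           # 40 landmarks on an ellipse
    return pts.ravel() + rng.normal(scale=0.05, size=80)

X = np.stack([contour() for _ in range(100)])

pca = PCA(n_components=5).fit(X)
X_pca = pca.inverse_transform(pca.transform(X))

kpca = KernelPCA(n_components=5, kernel="rbf", gamma=0.05,
                 fit_inverse_transform=True).fit(X)
X_kpca = kpca.inverse_transform(kpca.transform(X))      # approximate pre-image

for name, Xr in [("PCA", X_pca), ("KPCA", X_kpca)]:
    print(name, "mean reconstruction error:", np.mean(np.linalg.norm(X - Xr, axis=1)))
```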

  7. Intrathecal morphine is superior to intravenous PCA in patients undergoing minimally invasive cardiac surgery

    Directory of Open Access Journals (Sweden)

    Chirojit Mukherjee

    2012-01-01

    Full Text Available The aim of our study was to evaluate the beneficial effect of low dose intrathecal morphine on postoperative analgesia, over the use of intravenous patient controlled analgesia (PCA), in patients undergoing fast track anesthesia during minimally invasive cardiac surgical procedures. A randomized controlled trial was undertaken after approval from the local ethics committee. Written informed consent was obtained from 61 patients receiving mitral or tricuspid or both surgical valve repair with a minimally invasive technique. Patients were assigned randomly to 2 groups. Group 1 received general anesthesia and an intravenous patient controlled analgesia (PCA) pump with piritramide (GA group). Group 2 received a single shot of intrathecal morphine (1.5 μg/kg body weight) prior to the administration of general anesthesia (ITM group). The site of puncture was confined to the lumbar (L1-2 or L2-3) intrathecal space. The amount of intravenous piritramide used in the post anesthesia care unit (PACU) and on the first postoperative day was defined as the primary end point. Secondary end points included: time to tracheal extubation, and pain and sedation scores in the PACU up to the third postoperative day. For statistical analysis the Mann-Whitney U test and Fisher's exact test (SPSS) were used. We found that the demand for intravenous opioids in the PACU was significantly reduced in the ITM group (P <0.001). Pain scores were significantly decreased in the ITM group until the second postoperative day (P <0.01). There was no delay in tracheal extubation in the ITM group, and sedation scores did not differ between the groups. We conclude that low dose single shot intrathecal morphine provides adequate postoperative analgesia, reduces intravenous opioid consumption during the early postoperative period and does not defer early extubation.

  8. Identification of beta-2 as a key cell adhesion molecule in PCa cell neurotropic behavior: a novel ex vivo and biophysical approach.

    Science.gov (United States)

    Jansson, Keith H; Castillo, Deborah G; Morris, Joseph W; Boggs, Mary E; Czymmek, Kirk J; Adams, Elizabeth L; Schramm, Lawrence P; Sikes, Robert A

    2014-01-01

    Prostate cancer (PCa) is believed to metastasize through the blood/lymphatic systems; however, PCa may utilize the extensive innervation of the prostate for glandular egress. The interaction of PCa and its nerve fibers is observed in 80% of PCa and is termed perineural invasion (PNI). PCa cells have been observed traveling through the endoneurium of nerves, although the underlying mechanisms have not been elucidated. Voltage-sensitive sodium channels (VSSC) are multimeric transmembrane protein complexes composed of a pore-forming α subunit and one or two auxiliary beta (β) subunits with inherent cell adhesion molecule (CAM) functions. The beta-2 isoform (gene SCN2B) interacts with several neural CAMs, while interacting putatively with other prominent neural CAMs. Furthermore, beta-2 exhibits elevated mRNA and protein levels in highly metastatic and castrate-resistant PCa. When overexpressed in weakly aggressive LNCaP cells (2BECFP), beta-2 alters LNCaP cell morphology and enhances LNCaP cell metastasis-associated behavior in vitro. We hypothesize that PCa cells use beta-2 as a CAM during PNI and subsequent PCa metastasis. The objective of this study was to determine the effect of beta-2 expression on PCa cell neurotropic metastasis-associated behavior. We overexpressed beta-2 as a fusion protein with enhanced cyan fluorescent protein (ECFP) in weakly aggressive LNCaP cells and observed neurotropic effects utilizing our novel ex vivo organotypic spinal cord co-culture model, and performed functional assays with neural matrices and atomic force microscopy. With increased beta-2 expression, PCa cells display a trend of enhanced association with nerve axons. On laminin, a neural CAM, overexpression of beta-2 enhances PCa cell migration, invasion, and growth. 2BECFP cells exhibit marked binding affinity to laminin relative to LNECFP controls, and recombinant beta-2 ectodomain elicits more binding events to laminin than a BSA control. Functional overexpression of VSSC

  9. Design and synthesis of prostate cancer antigen-1 (PCA-1/ALKBH3) inhibitors as anti-prostate cancer drugs.

    Science.gov (United States)

    Nakao, Syuhei; Mabuchi, Miyuki; Shimizu, Tadashi; Itoh, Yoshihiro; Takeuchi, Yuko; Ueda, Masahiro; Mizuno, Hiroaki; Shigi, Naoko; Ohshio, Ikumi; Jinguji, Kentaro; Ueda, Yuko; Yamamoto, Masatatsu; Furukawa, Tatsuhiko; Aoki, Shunji; Tsujikawa, Kazutake; Tanaka, Akito

    2014-02-15

    A series of 1-aryl-3,4-substituted-1H-pyrazol-5-ol derivatives was synthesized and evaluated as prostate cancer antigen-1 (PCA-1/ALKBH3) inhibitors to obtain a novel anti-prostate cancer drug. After modifying 1-(1H-benzimidazol-2-yl)-3,4-dimethyl-1H-pyrazol-5-ol (1), a hit compound found during random screening using a recombinant PCA-1/ALKBH3, 1-(1H-5-methylbenzimidazol-2-yl)-4-benzyl-3-methyl-1H-pyrazol-5-ol (35, HUHS015), was obtained as a potent PCA-1/ALKBH3 inhibitor both in vitro and in vivo. The bioavailability (BA) of 35 was 7.2% in rats after oral administration. As expected, continuously administering 35 significantly suppressed the growth of DU145 cells, which are human hormone-independent prostate cancer cells, in a mouse xenograft model without untoward effects. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. Comparative Performance Of Using PCA With K-Means And Fuzzy C Means Clustering For Customer Segmentation

    Directory of Open Access Journals (Sweden)

    Fahmida Afrin

    2015-08-01

    Full Text Available Data mining is the process of analyzing data and discovering useful information; it is sometimes called knowledge discovery. Clustering groups data in such a way that the data in one cluster are similar while data in different clusters are dissimilar. Many data mining techniques have been developed for customer segmentation, and many clustering methods are applied to it. PCA works as a preprocessor for Fuzzy C-means and K-means, reducing high-dimensional and noisy data. In this paper the performance of Fuzzy C-means and K-means after applying Principal Component Analysis is analyzed on a standard dataset. The results indicate that PCA-based fuzzy clustering produces better results than PCA-based K-means and is a more stable method for customer segmentation.
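
    The preprocessing role of PCA described above can be illustrated as follows; K-means is shown and fuzzy c-means would be used analogously. The customer data are synthetic and the cluster count is an assumption.

```python
# Sketch: PCA as a preprocessor before clustering, compared with clustering raw features.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(loc=m, size=(100, 12)) for m in (0.0, 2.0, 4.0)])  # fake customer features

Xs = StandardScaler().fit_transform(X)
for reduce_first in (False, True):
    Z = PCA(n_components=3).fit_transform(Xs) if reduce_first else Xs
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(Z)
    print("PCA first:" if reduce_first else "raw features:",
          round(silhouette_score(Z, labels), 3))
```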

  11. Towards General Temporal Aggregation

    DEFF Research Database (Denmark)

    Boehlen, Michael H.; Gamper, Johann; Jensen, Christian Søndergaard

    2008-01-01

    associated with the management of temporal data. Indeed, temporal aggregation is complex and among the most difficult, and thus interesting, temporal functionality to support. This paper presents a general framework for temporal aggregation that accommodates existing kinds of aggregation, and it identifies...

  12. Enriched flour quality investigation using Principal Component Analysis (PCA)

    Directory of Open Access Journals (Sweden)

    Bruno Thiago Soeiro

    2010-09-01

    Full Text Available Some countries, including Brazil (RDC 344, 2002), have established regulations requiring that corn and wheat flours be enriched with folic acid and iron. The main objective of this work was to evaluate some characteristics of enriched flours using Principal Component Analysis (PCA). Parameters such as folic acid, iron, protein, lipid, moisture, ash and carbohydrate contents were evaluated in 30 flour packages purchased in local stores. On average, the wheat and corn flours presented an acceptable proximate composition according to Brazilian legislation. For the wheat flours, the folic acid concentration was, on average, close to the expected value; the corn flours contained a higher amount of the vitamin. For both types of flour, the iron content was above the value declared on the product labels. A matrix with 30 rows (samples) and 7 columns (variables) was organized and the data were autoscaled. The first observation was a clear differentiation between the two types of flour. The wheat flours were characterized by higher amounts of protein, moisture and ash, whereas the corn flours presented higher concentrations of iron, lipids, carbohydrates and folic acid. It was also noted that flours packed in plastic packaging presented a lower amount of folic acid (152 µg.100 g-1, on average) than samples stored in paper packaging (259 µg.100 g-1, on average). This study can provide important tools for the evaluation of folic acid food enrichment programs, mainly by pointing, preliminarily, to the importance of the type of packaging used for flours enriched with the vitamin.

  13. Classification of prostate cancer grade using temporal ultrasound: in vivo feasibility study

    Science.gov (United States)

    Ghavidel, Sahar; Imani, Farhad; Khallaghi, Siavash; Gibson, Eli; Khojaste, Amir; Gaed, Mena; Moussa, Madeleine; Gomez, Jose A.; Siemens, D. Robert; Leveridge, Michael; Chang, Silvia; Fenster, Aaron; Ward, Aaron D.; Abolmaesumi, Purang; Mousavi, Parvin

    2016-03-01

    Temporal ultrasound has been shown to have high classification accuracy in differentiating cancer from benign tissue. In this paper, we extend the temporal ultrasound method to distinguish lower grade Prostate Cancer (PCa) from all other grades. We use a group of nine patients with mostly lower grade PCa, where cancerous regions are also limited. A critical challenge is to train a classifier with limited aggressive cancerous tissue compared to low grade cancerous tissue. To resolve the problem of imbalanced data, we use the Synthetic Minority Oversampling Technique (SMOTE) to generate synthetic samples for the minority class. We calculate spectral features of temporal ultrasound data and perform feature selection using Random Forests. In a leave-one-patient-out cross-validation strategy, an area under the receiver operating characteristic curve (AUC) of 0.74 is achieved with overall sensitivity and specificity of 70%. Using an unsupervised learning approach prior to the proposed method improves sensitivity and AUC to 80% and 0.79. This work presents promising results for classifying lower and higher grade PCa with limited cancerous training samples, using temporal ultrasound.
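
    A small sketch of the class-balancing step described above, using the imbalanced-learn implementation of SMOTE and a random forest on synthetic features as stand-ins for the spectral features and the paper's feature-selection stage; all sizes and parameters are illustrative.

```python
# Sketch: oversample the minority class with SMOTE, then classify with a random forest.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
from imblearn.over_sampling import SMOTE

X, y = make_classification(n_samples=400, n_features=30, n_informative=6,
                           weights=[0.9, 0.1], random_state=0)   # imbalanced "grades"
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

Xb, yb = SMOTE(random_state=0).fit_resample(Xtr, ytr)            # balance the training set only
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(Xb, yb)
print("AUC on held-out data:", round(roc_auc_score(yte, clf.predict_proba(Xte)[:, 1]), 3))
```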

  14. PCA3 Reference Set Application: T2-Erg-Martin Sanda-Emory (2014) — EDRN Public Portal

    Science.gov (United States)

    We hypothesize that combining TMPRSS2:ERG (T2:erg) fusion and PCA3 detection in urine collected after digital rectal exam can improve the specificity of identifying clinically significant prostate cancer presence over the standard PSA and DRE. To address this hypothesis we propose to validate the performance of urinary T2:erg in a multiplex model predicting the diagnosis of clinically significant prostate cancer on subsequent prostate biopsy, using post-DRE, pre-biopsy urine specimens from a cohort of 900 men in the EDRN’s PCA3 trial.

  15. Age-related postoperative morphine requirements in children following major surgery--an assessment using patient-controlled analgesia (PCA)

    DEFF Research Database (Denmark)

    Hansen, Tom Giedsing; Henneberg, Steen Winther; Hole, P

    1996-01-01

    To investigate if small children require less morphine for postoperative analgesia than do older children and adolescents we analysed the morphine consumption pattern of 28 consecutive children on intravenous patient-controlled analgesia (PCA) following major surgery. The median age-specific morp......

  16. Towards weakly constrained double field theory

    Directory of Open Access Journals (Sweden)

    Kanghoon Lee

    2016-08-01

    Full Text Available We show that it is possible to construct a well-defined effective field theory incorporating string winding modes without using the strong constraint in double field theory. We show that the X-ray (Radon) transform on a torus is well-suited for describing weakly constrained double fields, and any weakly constrained fields are represented as a sum of strongly constrained fields. Using the inverse X-ray transform we define a novel binary operation which is compatible with the level matching constraint. Based on this formalism, we construct a consistent gauge transform and gauge invariant action without using the strong constraint. We then discuss the relation of our result to closed string field theory. Our construction suggests that there exists an effective field theory description for the massless sector of closed string field theory on a torus in an associative truncation.

  17. Continuation of Sets of Constrained Orbit Segments

    DEFF Research Database (Denmark)

    Schilder, Frank; Brøns, Morten; Chamoun, George Chaouki

    Sets of constrained orbit segments of time continuous flows are collections of trajectories that represent a whole or parts of an invariant set. A non-trivial but simple example is a homoclinic orbit. A typical representation of this set consists of an equilibrium point of the flow and a trajectory...... that starts close and returns close to this fixed point within finite time. More complicated examples are hybrid periodic orbits of piecewise smooth systems or quasi-periodic invariant tori. Even though it is possible to define generalised two-point boundary value problems for computing sets of constrained...... orbit segments, this is very disadvantageous in practice. In this talk we will present an algorithm that allows the efficient continuation of sets of constrained orbit segments together with the solution of the full variational problem....

  18. The effectiveness of Patient Controlled Analgesia (PCA morphine-ketamine compared to Patient Controlled Analgesia (PCA morphine to reduce total dose of morphine and Visual Analog Scale (VAS in postoperative laparotomy surgery

    Directory of Open Access Journals (Sweden)

    I Gusti Ngurah Mahaalit Aribawa

    2017-05-01

    Full Text Available Background: Laparotomy may cause moderate to severe postoperative pain, so adequate pain management is needed. The addition of ketamine to patient controlled analgesia (PCA) morphine after surgery can be an option. This study aims to evaluate the effectiveness of PCA morphine-ketamine compared to PCA morphine in patients after laparotomy surgery in reducing the total morphine requirement and pain intensity evaluated with a visual analog scale (VAS). Methods: This study was a double-blind RCT in 58 patients of ASA I and II, aged 18-64 years, who underwent an elective laparotomy at Sanglah General Hospital. Patients were divided into 2 groups. Group A received ketamine (1 mg/ml) added to PCA morphine (1 mg/ml), and patients in group B received morphine (1 mg/ml) by PCA. Prior to surgical incision both groups were given a bolus of ketamine 0.15 mg/kg and ketorolac 0.5 mg/kg. The total dose of morphine and VAS were measured at 6, 12, and 24 hours postoperatively. Results: The total dose of morphine in the first 24 hours postoperatively in the morphine-ketamine group (5.1±0.8 mg) was lower than in the morphine-only group (6.5±0.9 mg), p<0.001. VAS (resting) at 6 and 12 hours postoperatively in the morphine-ketamine group (13.4±4.8 mm and 10.7±2.6 mm) was lower than with morphine (17.9±4.1 mm, p≤0.05, and 12.8±5.3 mm, p≤0.05). VAS (moving) at 6, 12, and 24 hours postoperatively in the morphine-ketamine group (24.8±5.1 mm, 18±5.6 mm and 9±5.6 mm) was lower than with morphine (28.7±5.2 mm, p≤0.05; 23.1±6.0 mm, p≤0.05; and 12.8±5.3 mm, p≤0.05). Conclusions: The addition of ketamine to PCA morphine for postoperative laparotomy surgery reduces the total morphine requirement in 24 hours compared to PCA morphine alone.

  19. Neuroendocrine prostate cancer (NEPCa) increased the neighboring PCa chemo-resistance via altering the PTHrP/p38/Hsp27/androgen receptor (AR)/p21 signals

    Science.gov (United States)

    Cui, Yun; Sun, Yin; Hu, Shuai; Luo, Jie; Li, Lei; Li, Xin; Yeh, Shuyuan; Jin, Jie; Chang, Chawnshang

    2016-01-01

    Prostatic neuroendocrine cells (NE) are an integral part of prostate cancer (PCa) that are associated with PCa progression. As the current androgen-deprivation therapy (ADT) with anti-androgens may promote the neuroendocrine PCa (NEPCa) development, and few therapies can effectively suppress NEPCa, understanding the impact of NEPCa on PCa progression may help us to develop better therapies to battle PCa. Here we found NEPCa cells could increase the docetaxel-resistance of their neighboring PCa cells. Mechanism dissection revealed that through secretion of PTHrP, NEPCa cells could alter the p38/MAPK/Hsp27 signals in their neighboring PCa cells that resulted in increased androgen receptor (AR) activity via promoting AR nuclear translocation. The consequences of increased AR function might then increase docetaxel-resistance via increasing p21 expression. In vivo xenograft mice experiments also confirmed NEPCa could increase the docetaxel-resistance of neighboring PCa, and targeting this newly identified PTHrP/p38/Hsp27/AR/p21 signaling pathway with either p38 inhibitor (SB203580) or sh-PTHrP may result in improving/restoring the docetaxel sensitivity to better suppress PCa. PMID:27375022

  20. pcaH, a molecular marker for estimating the diversity of the protocatechuate-degrading bacterial community in the soil environment

    DEFF Research Database (Denmark)

    El Azhari, Najoi

    2007-01-01

    Microorganisms degrading phenolic compounds play an important role in soil carbon cycling as well as in pesticide degradation. The pcaH gene encoding a key ring-cleaving enzyme of the β-ketoadipate pathway was selected as a functional marker. Using a degenerate primer pair, pcaH fragments were cl......H sequences from Actinobacteria and Proteobacteria phyla. This confirms that the developed primer pair targets a wide diversity of pcaH sequences, thereby constituting a suitable molecular marker to estimate the response of the pca community to agricultural practices....

  1. Temporal resolution for the perception of features and conjunctions.

    Science.gov (United States)

    Bodelón, Clara; Fallah, Mazyar; Reynolds, John H

    2007-01-24

    The visual system decomposes stimuli into their constituent features, represented by neurons with different feature selectivities. How the signals carried by these feature-selective neurons are integrated into coherent object representations is unknown. To constrain the set of possible integrative mechanisms, we quantified the temporal resolution of perception for color, orientation, and conjunctions of these two features. We find that temporal resolution is measurably higher for each feature than for their conjunction, indicating that time is required to integrate features into a perceptual whole. This finding places temporal limits on the mechanisms that could mediate this form of perceptual integration.

  2. PCA-based approach for subtracting thermal background emission in high-contrast imaging data

    Science.gov (United States)

    Hunziker, S.; Quanz, S. P.; Amara, A.; Meyer, M. R.

    2018-03-01

    Aims. Ground-based observations at thermal infrared wavelengths suffer from large background radiation due to the sky, telescope and warm surfaces in the instrument. This significantly limits the sensitivity of ground-based observations at wavelengths longer than 3 μm. The main purpose of this work is to analyse this background emission in infrared high-contrast imaging data as illustrative of the problem, show how it can be modelled and subtracted and demonstrate that it can improve the detection of faint sources, such as exoplanets. Methods: We used principal component analysis (PCA) to model and subtract the thermal background emission in three archival high-contrast angular differential imaging datasets in the M' and L' filters. We used an M' dataset of β Pic to describe in detail how the algorithm works and explain how it can be applied. The results of the background subtraction are compared to the results from a conventional mean background subtraction scheme applied to the same dataset. Finally, both methods for background subtraction are compared by performing complete data reductions. We analysed the results from the M' dataset of HD 100546 only qualitatively. For the M' band dataset of β Pic and the L' band dataset of HD 169142, which was obtained with an angular groove phase mask vortex vector coronagraph, we also calculated and analysed the achieved signal-to-noise ratio (S/N). Results: We show that applying PCA is an effective way to remove spatially and temporally varying thermal background emission down to close to the background limit. The procedure also proves to be very successful at reconstructing the background that is hidden behind the point spread function. In the complete data reductions, we find at least qualitative improvements for HD 100546 and HD 169142; however, we fail to find a significant increase in S/N of β Pic b. We discuss these findings and argue that in particular datasets with strongly varying observing conditions or
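
    The core of a PCA background model can be sketched as follows: build a principal-component basis from a library of (here simulated) background frames and subtract each science frame's projection onto that basis. The frame sizes, the 10-component cut and the injected point source are assumptions; the actual pipeline adds masking of the stellar PSF and other refinements.

```python
# Sketch: PCA model of spatially varying background emission, subtracted from a science frame.
import numpy as np

rng = np.random.default_rng(4)
ny = nx = 64
yy, xx = np.mgrid[0:ny, 0:nx] / 64.0
patterns = np.stack([np.ones(ny * nx), xx.ravel(), (yy * xx).ravel()])   # assumed thermal structure

def frame(amp):
    """One background frame: varying structured emission plus read noise."""
    return 100.0 + amp @ patterns + rng.normal(scale=0.2, size=ny * nx)

bg_library = np.array([frame(rng.normal(scale=5.0, size=3)) for _ in range(50)])
science = frame(rng.normal(scale=5.0, size=3))
science[2080] += 5.0                                   # hypothetical faint point source

mean_bg = bg_library.mean(axis=0)
_, _, Vt = np.linalg.svd(bg_library - mean_bg, full_matrices=False)
basis = Vt[:10]                                        # first 10 principal components
model = mean_bg + (basis @ (science - mean_bg)) @ basis
residual = (science - model).reshape(ny, nx)
print("residual at source pixel vs typical pixel:",
      round(residual.ravel()[2080], 2), round(float(np.median(np.abs(residual))), 2))
```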

  3. Comparison of PCA and ICA based clutter reduction in GPR systems for anti-personal landmine detection

    DEFF Research Database (Denmark)

    Karlsen, Brian; Larsen, Jan; Sørensen, Helge Bjarup Dissing

    2001-01-01

    This paper presents statistical signal processing approaches for clutter reduction in stepped-frequency ground penetrating radar (SF-GPR) data. In particular, we suggest clutter/signal separation techniques based on principal and independent component analysis (PCA/ICA). The approaches...

  4. Accounting for baryonic effects in cosmic shear tomography: Determining a minimal set of nuisance parameters using PCA

    Energy Technology Data Exchange (ETDEWEB)

    Eifler, Tim; Krause, Elisabeth; Dodelson, Scott; Zentner, Andrew; Hearin, Andrew; Gnedin, Nickolay

    2014-05-28

    Systematic uncertainties that have been subdominant in past large-scale structure (LSS) surveys are likely to exceed statistical uncertainties of current and future LSS data sets, potentially limiting the extraction of cosmological information. Here we present a general framework (PCA marginalization) to consistently incorporate systematic effects into a likelihood analysis. This technique naturally accounts for degeneracies between nuisance parameters and can substantially reduce the dimension of the parameter space that needs to be sampled. As a practical application, we apply PCA marginalization to account for baryonic physics as an uncertainty in cosmic shear tomography. Specifically, we use CosmoLike to run simulated likelihood analyses on three independent sets of numerical simulations, each covering a wide range of baryonic scenarios differing in cooling, star formation, and feedback mechanisms. We simulate a Stage III (Dark Energy Survey) and Stage IV (Large Synoptic Survey Telescope/Euclid) survey and find a substantial bias in cosmological constraints if baryonic physics is not accounted for. We then show that PCA marginalization (employing at most 3 to 4 nuisance parameters) removes this bias. Our study demonstrates that it is possible to obtain robust, precise constraints on the dark energy equation of state even in the presence of large levels of systematic uncertainty in astrophysical processes. We conclude that the PCA marginalization technique is a powerful, general tool for addressing many of the challenges facing the precision cosmology program.
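
    A toy version of the idea, with entirely synthetic numbers: PCA the differences between baryonic and dark-matter-only data vectors across scenarios and remove (or, in a likelihood analysis, marginalize over) the leading modes. The three-mode cut and the simple projection below stand in for the full likelihood-level marginalization.

```python
# Sketch: build PCA nuisance modes from baryonic-minus-DM differences and project them out.
import numpy as np

rng = np.random.default_rng(9)
n_bins, n_scenarios = 50, 8
dm_only = np.ones(n_bins)                                # fiducial data vector
ramp = np.linspace(0.0, 0.2, n_bins)                     # assumed shape of baryonic suppression
baryon_diffs = np.outer(rng.normal(size=n_scenarios), ramp) \
               + 0.01 * rng.normal(size=(n_scenarios, n_bins))

_, _, Vt = np.linalg.svd(baryon_diffs - baryon_diffs.mean(axis=0), full_matrices=False)
modes = Vt[:3]                                           # nuisance directions

observed = dm_only + 0.15 * ramp                         # "data" with baryonic contamination
cleaned = observed - modes.T @ (modes @ (observed - dm_only))
print("rms deviation from DM-only before/after:",
      round(float(np.std(observed - dm_only)), 4), round(float(np.std(cleaned - dm_only)), 4))
```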

  5. PASS Reference Set Application: Lin UW (2010) TMPRSS2-ERG-PCA-PASS — EDRN Public Portal

    Science.gov (United States)

    Active surveillance is used to manage low-risk prostate cancer. Both PCA3 and TMPRSS2:ERG are promising biomarkers that may be associated with aggressive disease. This study examines the correlation of these biomarkers with higher cancer volume and grade determined at the time of biopsy in an active surveillance cohort.

  6. Discrimination of liver cancer in cellular level based on backscatter micro-spectrum with PCA algorithm and BP neural network

    Science.gov (United States)

    Yang, Jing; Wang, Cheng; Cai, Gan; Dong, Xiaona

    2016-10-01

    The incidence and mortality rate of primary liver cancer are very high, and its postoperative metastasis and recurrence have become important factors in the prognosis of patients. Circulating tumor cells (CTC), as a new tumor marker, play important roles in early diagnosis and individualized treatment. This paper presents an effective method to identify liver cancer based on the cellular scattering spectrum, a non-fluorescence technique based on a fiber confocal microscopic spectrometer. Principal component analysis (PCA) combined with a back propagation (BP) neural network was utilized to establish an automatic recognition model for the backscatter spectra of liver cancer cells and blood cells. PCA was applied to reduce the dimension of the scattering spectral data obtained by the fiber confocal microscopic spectrometer. After dimensionality reduction by PCA, a neural network pattern recognition model with 2 input layer nodes, 11 hidden layer nodes and 3 output nodes was established. We trained the network with 66 samples and also tested it. Results showed that the recognition rate for the three types of cells is more than 90%, with a relative standard deviation of only 2.36%. The experimental results show that the fiber confocal microscopic spectrometer combined with the PCA and BP neural network algorithms can automatically identify liver cancer cells among blood cells. This will provide a better tool for investigating the metastasis of liver cancers in vivo, the biological and metabolic characteristics of liver cancers, and drug transportation. Additionally, it has clear reference value for practical application.
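
    A hedged sketch of the described pipeline on simulated spectra: PCA down to two components feeding a small back-propagation network with 11 hidden units and three output classes. The spectra, labels and training split are invented for illustration.

```python
# Sketch: PCA feature reduction followed by a small back-propagation (MLP) classifier.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n_per, n_wavelengths = 40, 200
centers = rng.normal(size=(3, n_wavelengths))                 # three cell types (hypothetical)
X = np.vstack([c + rng.normal(scale=0.3, size=(n_per, n_wavelengths)) for c in centers])
y = np.repeat([0, 1, 2], n_per)

X2 = PCA(n_components=2).fit_transform(X)                     # 2 inputs, as described above
Xtr, Xte, ytr, yte = train_test_split(X2, y, test_size=0.3, random_state=0, stratify=y)
clf = MLPClassifier(hidden_layer_sizes=(11,), max_iter=2000, random_state=0).fit(Xtr, ytr)
print("recognition rate:", round(clf.score(Xte, yte), 3))
```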

  7. Contrasting Effects of Dissolved Organic Matter on Mercury Methylation by Geobacter sulfurreducens PCA and Desulfovibrio desulfuricans ND132.

    Science.gov (United States)

    Zhao, Linduo; Chen, Hongmei; Lu, Xia; Lin, Hui; Christensen, Geoff A; Pierce, Eric M; Gu, Baohua

    2017-09-19

    Natural dissolved organic matter (DOM) affects mercury (Hg) redox reactions and anaerobic microbial methylation in the environment. Several studies have shown that DOM can enhance Hg methylation, especially under sulfidic conditions, whereas others show that DOM inhibits Hg methylation due to strong Hg-DOM complexation. In this study, we investigated and compared the effects of DOM on Hg methylation by an iron-reducing bacterium, Geobacter sulfurreducens PCA, and a sulfate-reducing bacterium, Desulfovibrio desulfuricans ND132, under nonsulfidic conditions. The methylation experiment was performed with washed cells either in the absence or presence of DOM or glutathione, both of which form strong complexes with Hg via thiol-functional groups. DOM was found to greatly inhibit Hg methylation by G. sulfurreducens PCA but enhance Hg methylation by D. desulfuricans ND132 cells with increasing DOM concentration. These strain-dependent opposing effects of DOM were also observed with glutathione, suggesting that thiols in DOM likely played an essential role in affecting microbial Hg uptake and methylation. Additionally, DOM and glutathione greatly decreased Hg sorption by G. sulfurreducens PCA but showed little effect on D. desulfuricans ND132 cells, demonstrating that ND132 has a higher affinity to sorb or take up Hg than the PCA strain. These observations indicate that DOM effects on Hg methylation are bacterial strain specific, depend on the DOM:Hg ratio or site-specific conditions, and may thus offer new insights into the role of DOM in methylmercury production in the environment.

  8. Relationship between swelling and irradiation creep in cold worked PCA stainless steel to 178 DPA at ∼400 degrees C

    International Nuclear Information System (INIS)

    Toloczko, M.B.; Garner, F.A.

    1993-01-01

    At 178 dpa and ∼400 degrees C, the irradiation creep behavior of 20% cold-worked PCA has become dominated by the creep disappearance phenomenon. The total diametral deformation rate has reached the limiting value of 0.33%/dpa at the three highest stress levels. The stress-enhancement of swelling tends to camouflage the onset of creep disappearance, however

  9. Improving Accuracy of Intrusion Detection Model Using PCA and optimized SVM

    Directory of Open Access Journals (Sweden)

    Sumaiya Thaseen Ikram

    2016-06-01

    Full Text Available Intrusion detection is essential for providing security to different network domains and is mostly used for locating and tracing intruders. There are many problems with traditional intrusion detection models (IDS), such as low detection capability against unknown network attacks, a high false alarm rate and insufficient analysis capability. Hence the major scope of research in this domain is to develop an intrusion detection model with improved accuracy and reduced training time. This paper proposes a hybrid intrusion detection model by integrating principal component analysis (PCA) and a support vector machine (SVM). The novelty of the paper is the optimization of the kernel parameters of the SVM classifier using an automatic parameter selection technique. This technique optimizes the punishment factor (C) and kernel parameter gamma (γ), thereby improving the accuracy of the classifier and reducing the training and testing time. The experimental results obtained on the NSL KDD and gurekddcup datasets show that the proposed technique performs better, with higher accuracy, faster convergence speed and better generalization. Minimum resources are consumed as the classifier input requires a reduced feature set for optimum classification. A comparative analysis of hybrid models with the proposed model is also performed.
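
    A minimal sketch of the hybrid model on synthetic data (not NSL-KDD or gurekddcup): PCA feature reduction followed by an RBF-SVM whose penalty factor C and kernel parameter gamma are tuned by grid search, used here as a stand-in for the paper's automatic parameter selection.

```python
# Sketch: PCA + SVM pipeline with (C, gamma) selected by cross-validated grid search.
from sklearn.datasets import make_classification
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=600, n_features=40, n_informative=10, random_state=0)

pipe = Pipeline([("scale", StandardScaler()),
                 ("pca", PCA(n_components=10)),        # reduced feature set for the classifier
                 ("svm", SVC(kernel="rbf"))])
grid = GridSearchCV(pipe, {"svm__C": [1, 10, 100], "svm__gamma": [0.01, 0.1, 1.0]}, cv=5)
grid.fit(X, y)
print("best (C, gamma):", grid.best_params_, "cv accuracy:", round(grid.best_score_, 3))
```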

  10. PCA-based detection of damage in time-varying systems

    Science.gov (United States)

    Bellino, A.; Fasana, A.; Garibaldi, L.; Marchesiello, S.

    2010-10-01

    When performing Structural Health Monitoring, it is well known that the natural frequencies depend not only on damage but also on environmental conditions, such as temperature and humidity. Principal Component Analysis is used to take this problem into account, because it allows the effect of external factors to be eliminated. The purpose of the present work is to show that this technique can be successfully used not only for time-invariant systems, but also for time-varying ones. Among the latter, one of the most studied systems showing these characteristics is a bridge with crossing loads, such as the railway bridge studied in the present paper; in this case, the mass and the velocity of the train can be considered as "environmental" factors. This paper, after a brief description of the PCA method and one example of its application to time-invariant systems, presents the great potential of the methodology when applied to time-varying systems. The results show that this method is able to better detect the presence of damage and also to properly distinguish among different levels of crack depth.
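
    The principle can be sketched as follows, with synthetic data: natural frequencies drift with an 'environmental' factor, the leading principal component learned on the healthy condition absorbs that drift, and a damage-induced shift shows up in the residual novelty index. All sensitivities, noise levels and the damage pattern are assumptions.

```python
# Sketch: PCA-based removal of environmental effects before damage detection.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(6)
n = 400
temp = np.sin(np.linspace(0, 8 * np.pi, n)) + 0.1 * rng.normal(size=n)      # environmental factor
freqs = np.c_[5.0 + 0.05 * temp, 12.0 + 0.12 * temp, 21.0 + 0.20 * temp]    # three natural frequencies
freqs += 0.005 * rng.normal(size=freqs.shape)
freqs[300:] -= np.array([0.10, 0.02, 0.0])       # assumed damage pattern after sample 300

pca = PCA(n_components=1).fit(freqs[:200])       # learn the environmental direction on healthy data
residual = freqs - pca.inverse_transform(pca.transform(freqs))
novelty = np.linalg.norm(residual, axis=1)       # damage index
print("mean novelty healthy vs damaged:",
      round(float(novelty[:300].mean()), 4), round(float(novelty[300:].mean()), 4))
```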

  11. Hybrid Pixel-Based Method for Cardiac Ultrasound Fusion Based on Integration of PCA and DWT

    Directory of Open Access Journals (Sweden)

    Samaneh Mazaheri

    2015-01-01

    Full Text Available Medical image fusion is the procedure of combining several images from one or multiple imaging modalities. In spite of numerous attempts toward automated ventricle segmentation and tracking in echocardiography, this problem remains challenging due to low quality images with missing anatomical details or speckle noise and a restricted field of view. This paper presents a fusion method which particularly intends to increase the segmentability of echocardiography features such as the endocardium and to improve the image contrast. In addition, it tries to expand the field of view, decrease the impact of noise and artifacts and enhance the signal to noise ratio of the echo images. The proposed algorithm weights the image information using an integration feature between all the overlapping images, by combining principal component analysis and the discrete wavelet transform. For evaluation, a comparison has been made between the results of some well-known techniques and the proposed method, and different metrics are used to evaluate the performance of the proposed algorithm. It is concluded that the presented pixel-based method based on the integration of PCA and DWT gives the best result for the segmentability of cardiac ultrasound images and better performance in all metrics.

  12. PCA-based spatially adaptive denoising of CFA images for single-sensor digital cameras.

    Science.gov (United States)

    Zheng, Lei; Lukac, Rastislav; Wu, Xiaolin; Zhang, David

    2009-04-01

    Single-sensor digital color cameras use a process called color demosaicking to produce full color images from the data captured by a color filter array (CFA). The quality of demosaicked images is degraded by the sensor noise introduced during the image acquisition process. The conventional solution to combating CFA sensor noise is demosaicking first, followed by separate denoising processing. This strategy generates many noise-caused color artifacts in the demosaicking process, which are hard to remove in the denoising process. Few denoising schemes that work directly on the CFA images have been presented because of the difficulties arising from the red, green and blue interlaced mosaic pattern, yet a well-designed "denoising first and demosaicking later" scheme can have advantages such as fewer noise-caused color artifacts and cost-effective implementation. This paper presents a principal component analysis (PCA)-based spatially adaptive denoising algorithm, which works directly on the CFA data using a supporting window to analyze the local image statistics. By exploiting the spatial and spectral correlations existing in the CFA image, the proposed method can effectively suppress noise while preserving color edges and details. Experiments using both simulated and real CFA images indicate that the proposed scheme outperforms many existing approaches, including sophisticated demosaicking and denoising schemes, in terms of both objective measurement and visual evaluation.
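
    A highly simplified, global-patch stand-in for the spatially adaptive scheme (and ignoring the CFA mosaic structure): collect image patches, keep only principal components whose variance exceeds the assumed noise level, and rebuild the image from the shrunk patches.

```python
# Sketch: patch-based PCA denoising with variance thresholding of the components.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(7)
sigma, p = 15.0, 8
clean = np.kron(rng.integers(0, 255, size=(16, 16)).astype(float), np.ones((p, p)))  # 128x128 test image
noisy = clean + rng.normal(scale=sigma, size=clean.shape)

patches = np.array([noisy[i:i + p, j:j + p].ravel()
                    for i in range(0, 128, p) for j in range(0, 128, p)])
pca = PCA().fit(patches)
coef = pca.transform(patches)
coef[:, pca.explained_variance_ < 2 * sigma ** 2] = 0.0     # drop noise-dominated components
restored = pca.inverse_transform(coef)

denoised = np.zeros_like(noisy)
for k, (i, j) in enumerate((i, j) for i in range(0, 128, p) for j in range(0, 128, p)):
    denoised[i:i + p, j:j + p] = restored[k].reshape(p, p)

rmse = lambda a: np.sqrt(np.mean((a - clean) ** 2))
print("RMSE noisy vs denoised:", round(rmse(noisy), 1), round(rmse(denoised), 1))
```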

  13. PCA based feature reduction to improve the accuracy of decision tree c4.5 classification

    Science.gov (United States)

    Nasution, M. Z. F.; Sitompul, O. S.; Ramli, M.

    2018-03-01

    Attribute splitting is a major process in Decision Tree C4.5 classification. However, this process does not have a significant impact on the construction of the decision tree in terms of removing irrelevant features, which leads to a major problem in decision tree classification called over-fitting, resulting from noisy data and irrelevant features. In turn, over-fitting creates misclassification and data imbalance. Many algorithms have been proposed to overcome misclassification and over-fitting in Decision Tree C4.5 classification. Feature reduction is one of the important issues in classification models; it is intended to remove irrelevant data in order to improve accuracy. A feature reduction framework is used to simplify high dimensional data into low dimensional data with non-correlated attributes. In this research, we propose a framework for selecting relevant and non-correlated feature subsets. We consider principal component analysis (PCA) for feature reduction to perform non-correlated feature selection and the Decision Tree C4.5 algorithm for classification. In experiments conducted using the Cervical cancer data set from the UCI repository, with 858 instances and 36 attributes, we evaluated the performance of our framework based on accuracy, specificity and precision. Experimental results show that our proposed framework robustly enhances classification accuracy, reaching a 90.70% accuracy rate.
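
    A sketch of the framework with a synthetic data set of the same shape as the cervical-cancer data (858 x 36): PCA removes correlated features before an entropy-based decision tree, which is scikit-learn's closest stand-in for C4.5. The component count and the accuracy comparison are illustrative only.

```python
# Sketch: PCA feature reduction before an entropy-based decision tree classifier.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=858, n_features=36, n_informative=8,
                           n_redundant=20, random_state=0)

tree = DecisionTreeClassifier(criterion="entropy", random_state=0)
acc_raw = cross_val_score(tree, X, y, cv=10).mean()        # tree on the raw, correlated features

Xr = PCA(n_components=10).fit_transform(X)                 # non-correlated reduced features
acc_pca = cross_val_score(tree, Xr, y, cv=10).mean()
print("accuracy without/with PCA:", round(acc_raw, 3), round(acc_pca, 3))
```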

  14. Biological image construction by using Raman radiation and Pca: preliminary results

    International Nuclear Information System (INIS)

    Martinez E, J. C.; Cordova F, T.; Hugo R, V.

    2015-10-01

    Full text: In recent years, the Raman spectroscopy (Rs) technique has found applications in the study and analysis of biological samples, because it is able to detect the concentration or presence of certain organic and inorganic compounds of medical interest. In this work, raw data were obtained through measurements at selected points of square regions in order to detect specific organic/inorganic compounds on biological samples. Gold nano star samples were prepared, coated with membrane markers (CD 10+ and CD 19+) and diluted in leukemic B lymphocytes. Each data block was evaluated independently by the method of principal component analysis (Pca) in order to find representative dimensionless values (Cp) for each Raman spectrum at a specific coordinate. Each Cp was normalized to a range of 0-255 in order to generate a representative 8-bit image of the region under study. Data acquisition was performed with a Renishaw inVia Raman microscopy system in the range of 550 to 1700 cm-1; a 785 nm laser source with a power of 17 mW and 15 s of exposure time was used for each spectrum. Preliminary results showed detection of the molecular markers CD 10+ and CD 19+ with gold nano stars and discrimination between both markers. The results suggest conducting studies with specific concentrations of organic and inorganic materials. (Author)

  15. Biological image construction by using Raman radiation and Pca: preliminary results

    Energy Technology Data Exchange (ETDEWEB)

    Martinez E, J. C. [IPN, Unidad Profesional Interdisciplinaria de Ingenieria, Campus Guanajuato, Av. Mineral de Valenciana 200, Col. Fracc. Industrial Puerto Interior, 36275 Silao, Guanajuato (Mexico); Cordova F, T. [Universidad de Guanajuato, DIC, Departamento de Ingenieria Fisica, Loma del Bosque No. 103, Col. Lomas del Campestre, 37150 Leon, Guanajuato (Mexico); Hugo R, V., E-mail: jcmartineze@ipn.mx [Universidad de Guadalajara, Centro Universitario de Tonala, Morelos No. 180, 69584 Tonala, Jalisco (Mexico)

    2015-10-15

    Full text: In recent years, the Raman spectroscopy (Rs) technique has found applications in the study and analysis of biological samples, because it is able to detect concentrations or the presence of certain organic and inorganic compounds of medical interest. In this work, raw data were obtained through measurements at selected points on square regions in order to detect specific organic/inorganic compounds on biological samples. Gold nano star samples were prepared, coated with membrane markers (CD 10+ and CD 19+) and diluted in leukemic B lymphocytes. Each data block was evaluated independently by the method of principal component analysis (Pca) in order to find representative dimensionless values (Cp) for each Raman spectrum at a specific coordinate. Each Cp was normalized to the range 0-255 in order to generate a representative 8-bit image of the region under study. Data acquisition was performed with a Renishaw inVia Raman microscopy system in the range of 550 to 1700 cm-1 with a 785 nm laser source; a power of 17 mW and an exposure time of 15 s were used for each spectrum. Preliminary results detected the presence of the molecular markers CD 10+ and CD 19+ with gold nano stars and allowed discrimination between both markers. The results suggest conducting studies with specific concentrations of organic and inorganic materials. (Author)

  16. Comprehensive analysis and evaluation of big data for main transformer equipment based on PCA and Apriori

    Science.gov (United States)

    Guo, Lijuan; Yan, Haijun; Hao, Yongqi; Chen, Yun

    2018-01-01

    As the power supply level of the urban power grid develops toward high reliability, it is necessary to adopt appropriate methods for the comprehensive evaluation of existing equipment. Considering the wide-ranging and multi-dimensional power system data, big data mining methods are used to explore the potential laws and value of power system equipment data. Based on the monitoring data of main transformers and the records of defects and faults, this paper integrates the data of the power grid equipment environment. Apriori is used as an association identification algorithm to extract the frequent correlation factors of the main transformer, and the potential dependencies in the big data are analyzed through support and confidence. Then, the integrated data are analyzed by PCA, and an integrated quantitative scoring model is constructed. The evaluation algorithm and scheme are proved to be effective by validation on a test set. This paper provides a new idea for data fusion in the smart grid, and provides a reference for further evaluation of the big data of power grid equipment.
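
    The PCA-based integrated scoring step can be illustrated with a small NumPy sketch: indicators are standardised, principal component scores are computed, and a composite score is formed by weighting the components with their explained variance. The indicator matrix below is invented for demonstration and does not come from the paper.

        import numpy as np

        # rows = main transformers, columns = monitored indicators
        # (e.g. oil temperature, dissolved gas, defect count, load factor)
        X = np.array([[62.0, 12.0, 1, 0.71],
                      [55.0,  8.0, 0, 0.64],
                      [70.0, 30.0, 3, 0.85],
                      [58.0, 10.0, 1, 0.60]])

        Z = (X - X.mean(axis=0)) / X.std(axis=0)     # standardise indicators
        eigval, eigvec = np.linalg.eigh(np.cov(Z, rowvar=False))
        order = np.argsort(eigval)[::-1]
        eigval, eigvec = eigval[order], eigvec[:, order]

        weights = eigval / eigval.sum()              # variance-explained weights
        scores = Z @ eigvec                          # principal component scores
        composite = scores @ weights                 # integrated quantitative score
        print(np.round(composite, 3))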

  17. Application of EOF/PCA-based methods in the post-processing of GRACE derived water variations

    Science.gov (United States)

    Forootan, Ehsan; Kusche, Jürgen

    2010-05-01

    Two problems that users of monthly GRACE gravity field solutions face are 1) the presence of correlated noise in the Stokes coefficients that increases with harmonic degree and causes ‘striping', and 2) the fact that different physical signals are overlaid and difficult to separate from each other in the data. These problems are termed the signal-noise separation problem and the signal-signal separation problem. Methods that are based on principal component analysis and empirical orthogonal functions (PCA/EOF) have been frequently proposed to deal with these problems for GRACE. However, different strategies have been applied to different (spatial: global/regional, spectral: global/order-wise, geoid/equivalent water height) representations of the GRACE level 2 data products, leading to differing results and a general feeling that PCA/EOF-based methods are to be applied ‘with care'. In addition, it is known that conventional EOF/PCA methods force separated modes to be orthogonal, and that, on the other hand, an arbitrary orthogonal rotation can be applied to either the EOFs or the PCs. The aim of this paper is to provide a common theoretical framework and to study the application of PCA/EOF-based methods as a signal separation tool to post-process GRACE data products. In order to investigate and illustrate the applicability of PCA/EOF-based methods, we have employed them on GRACE level 2 monthly solutions based on the Center for Space Research, University of Texas (CSR/UT) RL04 products and on the ITG-GRACE03 solutions from the University of Bonn, and on various representations of them. Our results show that EOF modes do reveal the dominating annual, semiannual and also long-periodic signals in the global water storage variations, but they also show how choosing different strategies changes the outcome and may lead to unexpected results.
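
    The basic EOF/PCA decomposition of a gridded space-time field can be written compactly via the SVD; the sketch below (NumPy) uses a purely synthetic "water storage" field with an annual cycle, and the grid size, noise level and mode count are illustrative assumptions.

        import numpy as np

        n_months, n_grid = 96, 500
        t = np.arange(n_months)
        rng = np.random.default_rng(1)
        pattern = rng.normal(size=n_grid)                 # spatial pattern
        annual = np.sin(2 * np.pi * t / 12.0)             # annual cycle
        field = np.outer(annual, pattern) \
                + 0.3 * rng.normal(size=(n_months, n_grid))

        anom = field - field.mean(axis=0)                 # remove temporal mean
        U, s, Vt = np.linalg.svd(anom, full_matrices=False)
        eofs = Vt                                         # spatial EOF patterns
        pcs = U * s                                       # temporal principal components
        explained = s ** 2 / np.sum(s ** 2)
        print("variance explained by leading modes:", np.round(explained[:3], 3))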

  18. Piper-PCA-Fisher Recognition Model of Water Inrush Source: A Case Study of the Jiaozuo Mining Area

    Directory of Open Access Journals (Sweden)

    Pinghua Huang

    2018-01-01

    Full Text Available Source discrimination of mine water plays an important role in guiding mine water prevention in mine water management. To accurately determine the water inrush source of a mine in the Jiaozuo mining area, a Piper trilinear diagram based on hydrochemical experimental data of stratified underground water in the area was utilized to determine typical water samples. Additionally, principal component analysis (PCA) was used for dimensionality reduction of conventional hydrochemical variables, after which mutually independent variables were extracted. The Piper-PCA-Fisher water inrush source recognition model was established by combining the Piper trilinear diagram and Fisher discrimination theory. Screened typical samples were used to conduct back-discriminate verification of the model. Results showed that 28 typical water samples in different aquifers were determined through the Piper trilinear diagram as a water sample set for training. After PCA was carried out, the first five factors covered 98.92% of the information quantity of the original data and could effectively represent the data information of the original samples. During the one-by-one rediscrimination process of the 28 groups of training samples using the Piper-PCA-Fisher water inrush source model, a 100% correct discrimination rate was achieved. During the prediction and discrimination process of 13 samples, one water sample was misdiscriminated; hence, the correct prediscrimination rate was 92.3%. Compared with the traditional Fisher water source recognition model, the Piper-PCA-Fisher water source recognition model established in this study had higher accuracy in both the rediscrimination and prediscrimination processes. Thus it had a strong ability to discriminate water inrush sources.
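
    A minimal PCA-plus-Fisher-discriminant sketch with scikit-learn is shown below; the 28 x 7 hydrochemical feature matrix and the three aquifer labels are placeholders, and LinearDiscriminantAnalysis stands in for the Fisher discrimination step.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        centers = 2.0 * rng.normal(size=(3, 7))    # 3 aquifers, 7 ion features
        y = np.repeat([0, 1, 2], [10, 9, 9])       # 28 typical training samples
        X = centers[y] + rng.normal(size=(28, 7))

        model = make_pipeline(StandardScaler(),
                              PCA(n_components=5),            # first five factors
                              LinearDiscriminantAnalysis())   # Fisher discrimination
        model.fit(X, y)
        print("re-discrimination accuracy:", model.score(X, y))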

  19. On Tree-Constrained Matchings and Generalizations

    NARCIS (Netherlands)

    S. Canzar (Stefan); K. Elbassioni; G.W. Klau (Gunnar); J. Mestre

    2011-01-01

    We consider the following \textsc{Tree-Constrained Bipartite Matching} problem: Given two rooted trees $T_1=(V_1,E_1)$, $T_2=(V_2,E_2)$ and a weight function $w: V_1\times V_2 \mapsto \mathbb{R}_+$, find a maximum weight matching $\mathcal{M}$ between nodes of the two trees, such that

  20. Constrained systems described by Nambu mechanics

    International Nuclear Information System (INIS)

    Lassig, C.C.; Joshi, G.C.

    1996-01-01

    Using the framework of Nambu's generalised mechanics, we obtain a new description of constrained Hamiltonian dynamics, involving the introduction of another degree of freedom in phase space, and the necessity of defining the action integral on a world sheet. We also discuss the problem of quantizing Nambu mechanics. (authors). 5 refs

  1. Client's constraining factors to construction project management ...

    African Journals Online (AJOL)

    This study analyzed client-related factors that constrain the project management success of public and private sector construction in Nigeria. Issues that concern clients in any project cannot be overlooked, as they are the owners and the initiators of project proposals. It is assumed that success, failure or abandonment of ...

  2. Hyperbolicity and constrained evolution in linearized gravity

    International Nuclear Information System (INIS)

    Matzner, Richard A.

    2005-01-01

    Solving the 4-d Einstein equations as evolution in time requires solving equations of two types: the four elliptic initial data (constraint) equations, followed by the six second order evolution equations. Analytically the constraint equations remain solved under the action of the evolution, and one approach is to simply monitor them (unconstrained evolution). Since computational solution of differential equations introduces almost inevitable errors, it is clearly 'more correct' to introduce a scheme which actively maintains the constraints by solution (constrained evolution). This has shown promise in computational settings, but the analysis of the resulting mixed elliptic hyperbolic method has not been completely carried out. We present such an analysis for one method of constrained evolution, applied to a simple vacuum system, linearized gravitational waves. We begin with a study of the hyperbolicity of the unconstrained Einstein equations. (Because the study of hyperbolicity deals only with the highest derivative order in the equations, linearization loses no essential details.) We then give explicit analytical construction of the effect of initial data setting and constrained evolution for linearized gravitational waves. While this is clearly a toy model with regard to constrained evolution, certain interesting features are found which have relevance to the full nonlinear Einstein equations

  3. A Dynamic Programming Approach to Constrained Portfolios

    DEFF Research Database (Denmark)

    Kraft, Holger; Steffensen, Mogens

    2013-01-01

    This paper studies constrained portfolio problems that may involve constraints on the probability or the expected size of a shortfall of wealth or consumption. Our first contribution is that we solve the problems by dynamic programming, which is in contrast to the existing literature that applies...

  4. A model for optimal constrained adaptive testing

    NARCIS (Netherlands)

    van der Linden, Willem J.; Reese, Lynda M.

    2001-01-01

    A model for constrained computerized adaptive testing is proposed in which the information on the test at the ability estimate is maximized subject to a large variety of possible constraints on the contents of the test. At each item-selection step, a full test is first assembled to have maximum

  5. A model for optimal constrained adaptive testing

    NARCIS (Netherlands)

    van der Linden, Willem J.; Reese, Lynda M.

    1997-01-01

    A model for constrained computerized adaptive testing is proposed in which the information in the test at the ability estimate is maximized subject to a large variety of possible constraints on the contents of the test. At each item-selection step, a full test is first assembled to have maximum

  6. Neutron Powder Diffraction and Constrained Refinement

    DEFF Research Database (Denmark)

    Pawley, G. S.; Mackenzie, Gordon A.; Dietrich, O. W.

    1977-01-01

    The first use of a new program, EDINP, is reported. This program allows the constrained refinement of molecules in a crystal structure with neutron diffraction powder data. The structures of p-C6F4Br2 and p-C6F4I2 are determined by packing considerations and then refined with EDINP. Refinement is...

  7. Terrestrial Sagnac delay constraining modified gravity models

    Science.gov (United States)

    Karimov, R. Kh.; Izmailov, R. N.; Potapov, A. A.; Nandi, K. K.

    2018-04-01

    Modified gravity theories include f(R)-gravity models that are usually constrained by the cosmological evolutionary scenario. However, it has been recently shown that they can also be constrained by the signatures of accretion disk around constant Ricci curvature Kerr-f(R0) stellar sized black holes. Our aim here is to use another experimental fact, viz., the terrestrial Sagnac delay, to constrain the parameters of specific f(R)-gravity prescriptions. We shall assume that a Kerr-f(R0) solution asymptotically describes Earth's weak gravity near its surface. In this spacetime, we shall study oppositely directed light beams from a source/observer moving on non-geodesic and geodesic circular trajectories and calculate the time gap when the beams re-unite. We obtain the exact time gap, called the Sagnac delay, in both cases and expand it to show how the flat space value is corrected by the Ricci curvature, the mass and the spin of the gravitating source. Under the assumption that the magnitude of the corrections is of the order of the residual uncertainties in the delay measurement, we derive the allowed intervals for the Ricci curvature. We conclude that the terrestrial Sagnac delay can be used to constrain the parameters of specific f(R) prescriptions. Despite using the weak field gravity near Earth's surface, it turns out that the model parameter ranges still remain the same as those obtained from the strong field accretion disk phenomenon.

  8. Chance constrained uncertain classification via robust optimization

    NARCIS (Netherlands)

    Ben-Tal, A.; Bhadra, S.; Bhattacharayya, C.; Saketha Nat, J.

    2011-01-01

    This paper studies the problem of constructing robust classifiers when the training data are plagued with uncertainty. The problem is posed as a Chance-Constrained Program (CCP) which ensures that the uncertain data points are classified correctly with high probability. Unfortunately such a CCP turns out

  9. Integrating job scheduling and constrained network routing

    DEFF Research Database (Denmark)

    Gamst, Mette

    2010-01-01

    This paper examines the NP-hard problem of scheduling jobs on resources such that the overall profit of executed jobs is maximized. Job demand must be sent through a constrained network to the resource before execution can begin. The problem has application in grid computing, where a number...

  10. Neuroevolutionary Constrained Optimization for Content Creation

    DEFF Research Database (Denmark)

    Liapis, Antonios; Yannakakis, Georgios N.; Togelius, Julian

    2011-01-01

    and thruster types and topologies) independently of game physics and steering strategies. According to the proposed framework, the designer picks a set of requirements for the spaceship that a constrained optimizer attempts to satisfy. The constraint satisfaction approach followed is based on neuroevolution...... and survival tasks and are also visually appealing....

  11. Models of Flux Tubes from Constrained Relaxation

    Indian Academy of Sciences (India)

    tribpo

    J. Astrophys. Astr. (2000) 21, 299-302. Models of Flux Tubes from Constrained Relaxation. A. Mangalam* & V. Krishan†, Indian Institute of Astrophysics, Koramangala, Bangalore 560 034, India. *e-mail: mangalam@iiap.ernet.in. †e-mail: vinod@iiap.ernet.in. Abstract. We study the relaxation of a compressible plasma to ...

  12. Improving the prediction of pathologic outcomes in patients undergoing radical prostatectomy: the value of prostate cancer antigen 3 (PCA3), prostate health index (phi) and sarcosine.

    Science.gov (United States)

    Ferro, Matteo; Lucarelli, Giuseppe; Bruzzese, Dario; Perdonà, Sisto; Mazzarella, Claudia; Perruolo, Giuseppe; Marino, Ada; Cosimato, Vincenzo; Giorgio, Emilia; Tagliamonte, Virginia; Bottero, Danilo; De Cobelli, Ottavio; Terracciano, Daniela

    2015-02-01

    Several efforts have been made to find biomarkers that could help clinicians to preoperatively determine prostate cancer (PCa) pathological characteristics and choose the best therapeutic approach, avoiding over-treatment. To this end, prostate cancer antigen 3 (PCA3), the prostate health index (phi) and sarcosine have been presented as promising tools. We evaluated the ability of these biomarkers to predict the pathologic PCa characteristics within a prospectively collected contemporary cohort of patients who underwent radical prostatectomy (RP) for clinically localized PCa at a single high-volume Institution. The prognostic performance of PCA3, phi and sarcosine was evaluated in 78 patients undergoing RP for biopsy-proven PCa. Receiver operating characteristic (ROC) curve analyses tested the accuracy (area under the curve (AUC)) in predicting PCa pathological characteristics. Decision curve analyses (DCA) were used to assess the clinical benefit of the three biomarkers. We found that PCA3, phi and sarcosine levels were significantly higher in patients with tumor volume (TV) ≥ 0.5 ml, pathologic Gleason sum (GS) ≥ 7 and pT3 disease (all p-values ≤ 0.01). ROC curve analysis showed that phi is an accurate predictor of high-stage (AUC 0.85 [0.77-0.93]), high-grade (AUC 0.83 [0.73-0.93]) and high-volume disease (AUC 0.94 [0.88-0.99]). Sarcosine showed a comparable AUC (0.85 [0.76-0.94]) only for T3 stage prediction, whereas the PCA3 score showed lower AUCs, ranging from 0.74 (for GS) to 0.86 (for TV). PCA3, phi and sarcosine are predictors of PCa characteristics at final pathology. Successful clinical translation of these findings would reduce the frequency of surveillance biopsies and may enhance acceptance of active surveillance (AS). Copyright © 2015 International Institute of Anticancer Research (Dr. John G. Delinassios), All rights reserved.

  13. Interleukin-6: a bone marrow stromal cell paracrine signal that induces neuroendocrine differentiation and modulates autophagy in bone metastatic PCa cells.

    Science.gov (United States)

    Delk, Nikki A; Farach-Carson, Mary C

    2012-04-01

    Autophagy reallocates nutrients and clears normal cells of damaged proteins and organelles. In the context of metastatic disease, invading cancer cells hijack autophagic processes to survive and adapt in the host microenvironment. We sought to understand how autophagy is regulated in the metastatic niche for prostate cancer (PCa) cells where bone marrow stromal cell (BMSC) paracrine signaling induces PCa neuroendocrine differentiation (NED). In PCa, this transdifferentiation of metastatic PCa cells to neuronal-like cells correlates with advanced disease. Because autophagy provides a survival advantage for cancer cells and promotes cell differentiation, we hypothesized that autophagy mediates PCa NED in the bone. Thus, we determined the ability of paracrine factors in conditioned media (CM) from two separate BMSC subtypes, HS5 and HS27a, to induce autophagy in C4-2 and C4-2B bone metastatic PCa cells by characterizing the autophagy marker, LC3. Unlike HS27a CM, HS5 CM induced LC3 accumulation in PCa cells, suggesting autophagy was induced and indicating that HS5 and HS27a secrete a different milieu of paracrine factors that influence PCa autophagy. We identified interleukin-6 (IL-6), a cytokine more highly expressed in HS5 cells than in HS27a cells, as a paracrine factor that regulates PCa autophagy. Pharmacological inhibition of STAT3 activity did not attenuate LC3 accumulation, implying that IL-6 regulates NED and autophagy through different pathways. Finally, chloroquine inhibition of autophagic flux blocked PCa NED; hence autophagic flux maintains NED. Our studies imply that autophagy is cytoprotective for PCa cells in the bone, thus targeting autophagy is a potential therapeutic strategy.

  14. Revascularization of the upper posterior circulation with the anterior temporal artery: an anatomical feasibility study.

    Science.gov (United States)

    Tayebi Meybodi, Ali; Lawton, Michael T; Griswold, Dylan; Mokhtari, Pooneh; Payman, Andre; Tabani, Halima; Yousef, Sonia; Benet, Arnau

    2017-09-22

    OBJECTIVE In various disease processes, including unclippable aneurysms, a bypass to the upper posterior circulation (UPC) including the superior cerebellar artery (SCA) and posterior cerebral artery (PCA) may be needed. Various revascularization options exist, but the role of intracranial (IC) donors has not been scrutinized. The objective of this study was to evaluate the anatomical feasibility of utilizing the anterior temporal artery (ATA) for revascularization of the UPC. METHODS ATA-SCA and ATA-PCA bypasses were performed on 14 cadaver specimens. After performing an orbitozygomatic craniotomy and opening the basal cisterns, the ATA was divided at the M3-M4 junction and mobilized to the crural cistern to complete an end-to-side bypass to the SCA and PCA. The length of the recipient artery between the anastomosis and origin was measured. RESULTS Seventeen ATAs were found. Successful anastomosis was performed in 14 (82%) of the ATAs. The anastomosis point on the PCA was 14.2 mm from its origin on the basilar artery. The SCA anastomosis point was 10.1 mm from its origin. Three ATAs did not reach the UPC region due to a common opercular origin with the middle temporal artery. The ATA-SCA bypass was also applied to the management of an incompletely coiled SCA aneurysm. CONCLUSIONS The ATA is a promising IC donor for UPC revascularization. The ATA is exposed en route to the proximal SCA and PCA through the pterional-orbitozygomatic approach. Also, the end-to-side anastomosis provides an efficient and straightforward bypass without the need to harvest a graft or perform multiple or difficult anastomoses.

  15. Baryonic effects in cosmic shear tomography: PCA parametrization and importance of extreme baryonic models

    Energy Technology Data Exchange (ETDEWEB)

    Mohammed, Irshad [Fermilab; Gnedin, Nickolay Y. [Fermilab

    2017-07-07

    Baryonic effects are amongst the most severe systematics in the tomographic analysis of weak lensing data, which is the principal probe in many future generations of cosmological surveys like LSST, Euclid etc. Modeling or parameterizing these effects is essential in order to extract valuable constraints on cosmological parameters. In a recent paper, Eifler et al. (2015) suggested a reduction technique for baryonic effects by conducting a principal component analysis (PCA) and removing the largest baryonic eigenmodes from the data. In this article, we conducted the investigation further and addressed two critical aspects. Firstly, we performed the analysis by separating the simulations into training and test sets, computing a minimal set of principal components from the training set and examining the fits on the test set. We found that using only four parameters, corresponding to the four largest eigenmodes of the training set, the test sets can be fitted thoroughly with an RMS $\sim 0.0011$. Secondly, we explored the significance of outliers, the most exotic/extreme baryonic scenarios, in this method. We found that excluding the outliers from the training set results in a relatively bad fit and degrades the RMS by nearly a factor of 3. Therefore, for a direct employment of this method in the tomographic analysis of weak lensing data, the principal components should be derived from a training set that comprises adequately exotic but reasonable models such that reality is included inside the parameter domain sampled by the training set. The baryonic effects can be parameterized as the coefficients of these principal components and should be marginalized over the cosmological parameter space.
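
    The eigenmode-fitting idea can be illustrated with a short NumPy sketch: principal components are built from a training set of spectra (here synthetic stand-ins for baryonic power-spectrum ratios) and a held-out spectrum is fitted with the four largest eigenmodes; the dimensions and noise-free construction are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(2)
        n_train, n_k = 12, 40                           # training models, k-bins
        basis = rng.normal(size=(4, n_k))               # hidden "baryonic" modes
        train = rng.normal(size=(n_train, 4)) @ basis   # training spectra (ratios)

        mean = train.mean(axis=0)
        U, s, Vt = np.linalg.svd(train - mean, full_matrices=False)
        pc4 = Vt[:4]                                    # four largest eigenmodes

        test = rng.normal(size=4) @ basis               # an unseen "test" spectrum
        coeff = (test - mean) @ pc4.T                   # 4 fitted parameters
        fit = mean + coeff @ pc4
        print("RMS residual of the 4-mode fit:",
              float(np.sqrt(np.mean((fit - test) ** 2))))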

  16. Change detection of medical images using dictionary learning techniques and PCA

    Science.gov (United States)

    Nika, Varvara; Babyn, Paul; Zhu, Hongmei

    2014-03-01

    Automatic change detection methods for identifying the changes between serial MR images taken at different times are of great interest to radiologists. The majority of existing change detection methods in medical imaging, and those of brain images in particular, include many preprocessing steps and rely mostly on statistical analysis of MRI scans. Although most methods utilize registration software, tissue classification remains a difficult and overwhelming task. Recently, dictionary learning techniques have been used in many areas of image processing, such as image surveillance, face recognition, remote sensing, and medical imaging. In this paper we present the Eigen-Block Change Detection algorithm (EigenBlockCD). It performs local registration and identifies the changes between consecutive MR images of the brain. Blocks of pixels from the baseline scan are used to train local dictionaries that are then used to detect changes in the follow-up scan. We use PCA to reduce the dimensionality of the local dictionaries and the redundancy of the data. Choosing the appropriate distance measure significantly affects the performance of our algorithm. We examine the differences between the L1 and L2 norms as two possible similarity measures in the EigenBlockCD. We show the advantages of the L2 norm over the L1 norm theoretically and numerically. We also demonstrate the performance of the EigenBlockCD algorithm for detecting changes in MR images and compare our results with those provided in recent literature. Experimental results with both simulated and real MRI scans show that the EigenBlockCD outperforms the previous methods. It detects clinical changes while ignoring changes due to the patient's position and other acquisition artifacts.
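
    A much-simplified flavour of a block-wise, PCA-reduced change detector is sketched below in NumPy; block size, PCA rank, the L2-based threshold and the synthetic images are illustrative assumptions and do not reproduce the EigenBlockCD algorithm itself.

        import numpy as np

        def block_change_map(baseline, followup, block=8, rank=4, thresh=0.5):
            """Flag blocks of `followup` whose content is poorly explained by a
            PCA subspace learned from shifted baseline blocks at the same site."""
            h, w = baseline.shape
            change = np.zeros((h // block, w // block), dtype=bool)
            for bi in range(h // block):
                for bj in range(w // block):
                    ys, xs = bi * block, bj * block
                    cur = followup[ys:ys + block, xs:xs + block].ravel()
                    # Local training set: the baseline block and its small shifts.
                    D = []
                    for dy in (-1, 0, 1):
                        for dx in (-1, 0, 1):
                            y0, x0 = ys + dy, xs + dx
                            p = baseline[max(y0, 0):y0 + block,
                                         max(x0, 0):x0 + block]
                            if p.size == block * block:
                                D.append(p.ravel())
                    D = np.asarray(D, dtype=float)
                    mean = D.mean(axis=0)
                    _, _, Vt = np.linalg.svd(D - mean, full_matrices=False)
                    P = Vt[:rank]                    # reduced local dictionary (PCA)
                    resid = (cur - mean) - (cur - mean) @ P.T @ P
                    change[bi, bj] = np.linalg.norm(resid) / block > thresh
            return change

        # Smooth "anatomy" plus a localized change in one block.
        xx, yy = np.meshgrid(np.arange(32), np.arange(32))
        base = np.sin(xx / 5.0) + np.cos(yy / 7.0)
        follow = base.copy()
        follow[8:16, 8:16] += np.random.default_rng(0).normal(0.0, 2.0, (8, 8))
        print(block_change_map(base, follow))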

  17. SU-F-R-41: Regularized PCA Can Model Treatment-Related Changes in Head and Neck Patients Using Daily CBCTs

    International Nuclear Information System (INIS)

    Chetvertkov, M; Siddiqui, F; Chetty, I; Kumarasiri, A; Liu, C; Gordon, J

    2016-01-01

    Purpose: To use daily cone beam CTs (CBCTs) to develop regularized principal component analysis (PCA) models of anatomical changes in head and neck (H&N) patients, to guide replanning decisions in adaptive radiation therapy (ART). Methods: Known deformations were applied to planning CT (pCT) images of 10 H&N patients to model several different systematic anatomical changes. A Pinnacle plugin was used to interpolate systematic changes over 35 fractions, generating a set of 35 synthetic CTs for each patient. Deformation vector fields (DVFs) were acquired between the pCT and synthetic CTs and random fraction-to-fraction changes were superimposed on the DVFs. Standard non-regularized and regularized patient-specific PCA models were built using the DVFs. The ability of PCA to extract the known deformations was quantified. PCA models were also generated from clinical CBCTs, for which the deformations and DVFs were not known. It was hypothesized that the resulting eigenvectors/eigenfunctions with the largest eigenvalues represent the major anatomical deformations during the course of treatment. Results: As demonstrated with quantitative results in the supporting document, regularized PCA is more successful than standard PCA at capturing systematic changes early in the treatment. Regularized PCA is able to detect smaller systematic changes against the background of random fraction-to-fraction changes. To be successful at guiding ART, regularized PCA should be coupled with models of when anatomical changes occur: early, late or throughout the treatment course. Conclusion: The leading eigenvector/eigenfunction from both PCA approaches can tentatively be identified as a major systematic change during the radiotherapy course when systematic changes are large enough with respect to random fraction-to-fraction changes. In all cases the regularized PCA approach appears to be more reliable at capturing systematic changes, enabling dosimetric consequences to be projected once trends are
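
    A bare-bones, non-regularized PCA of per-fraction deformation vector fields can be sketched as follows (NumPy); each DVF is flattened into one row and the leading right-singular vector is read off as the candidate systematic mode. The fraction count, voxel count and synthetic DVFs are stand-ins, and the paper's regularization is not reproduced here.

        import numpy as np

        rng = np.random.default_rng(3)
        n_fractions, n_voxels = 35, 4000                 # 35 fractions, 3-D vectors
        systematic = rng.normal(size=3 * n_voxels)       # one systematic deformation mode
        progress = np.linspace(0.0, 1.0, n_fractions)    # grows over the course
        dvfs = np.outer(progress, systematic) \
               + 0.2 * rng.normal(size=(n_fractions, 3 * n_voxels))  # + random changes

        mean = dvfs.mean(axis=0)
        U, s, Vt = np.linalg.svd(dvfs - mean, full_matrices=False)
        leading_mode = Vt[0]                             # candidate systematic change
        weights = (dvfs - mean) @ leading_mode           # its evolution over fractions
        print("correlation with the true systematic trend:",
              round(float(abs(np.corrcoef(weights, progress)[0, 1])), 3))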

  18. Increasing patient knowledge on the proper usage of a PCA machine with the use of a post-operative instructional card.

    Science.gov (United States)

    Shovel, Louisa; Max, Bryan; Correll, Darin J

    2016-01-01

    The purpose of this study was to see if an instructional card, attached to the PCA machine following total hip arthroplasty describing proper use of the device, would positively affect subjects' understanding of device usage, pain scores, pain medication consumption and satisfaction. Eighty adults undergoing total hip replacements who had been prescribed PCA were randomized into two study groups. Forty participants received the standard post-operative instruction on PCA device usage at our institution. The other 40 participants received the standard of care in addition to being given a typed instructional card immediately post-operatively, describing proper PCA device use. This card was attached to the PCA device during their recovery period. On post-operative day one, each patient completed a questionnaire on PCA usage, pain scores and satisfaction scores. The pain scores in the Instructional Card group were significantly lower than the Control group (p = 0.024). Subjects' understanding of PCA usage was also improved in the Instructional Card group for six of the seven questions asked. The findings from this study strongly support that postoperative patient information on proper PCA use by means of an instructional card improves pain control and hence the overall recovery for patients undergoing surgery. In addition, through improved understanding it adds an important safety feature in that patients and potentially their family members and/or friends may refrain from PCA-by-proxy. This article demonstrates that the simple intervention of adding an instructional card to a PCA machine is an effective method to improve patients' knowledge as well as pain control and potentially increase the safety of the device use.

  19. SU-F-R-41: Regularized PCA Can Model Treatment-Related Changes in Head and Neck Patients Using Daily CBCTs

    Energy Technology Data Exchange (ETDEWEB)

    Chetvertkov, M [Wayne State University, Detroit, MI (United States); Henry Ford Health System, Detroit, MI (United States); Siddiqui, F; Chetty, I; Kumarasiri, A; Liu, C; Gordon, J [Henry Ford Health System, Detroit, MI (United States)

    2016-06-15

    Purpose: To use daily cone beam CTs (CBCTs) to develop regularized principal component analysis (PCA) models of anatomical changes in head and neck (H&N) patients, to guide replanning decisions in adaptive radiation therapy (ART). Methods: Known deformations were applied to planning CT (pCT) images of 10 H&N patients to model several different systematic anatomical changes. A Pinnacle plugin was used to interpolate systematic changes over 35 fractions, generating a set of 35 synthetic CTs for each patient. Deformation vector fields (DVFs) were acquired between the pCT and synthetic CTs and random fraction-to-fraction changes were superimposed on the DVFs. Standard non-regularized and regularized patient-specific PCA models were built using the DVFs. The ability of PCA to extract the known deformations was quantified. PCA models were also generated from clinical CBCTs, for which the deformations and DVFs were not known. It was hypothesized that the resulting eigenvectors/eigenfunctions with the largest eigenvalues represent the major anatomical deformations during the course of treatment. Results: As demonstrated with quantitative results in the supporting document, regularized PCA is more successful than standard PCA at capturing systematic changes early in the treatment. Regularized PCA is able to detect smaller systematic changes against the background of random fraction-to-fraction changes. To be successful at guiding ART, regularized PCA should be coupled with models of when anatomical changes occur: early, late or throughout the treatment course. Conclusion: The leading eigenvector/eigenfunction from both PCA approaches can tentatively be identified as a major systematic change during the radiotherapy course when systematic changes are large enough with respect to random fraction-to-fraction changes. In all cases the regularized PCA approach appears to be more reliable at capturing systematic changes, enabling dosimetric consequences to be projected once trends are

  20. Self-constrained inversion of potential fields

    Science.gov (United States)

    Paoletti, V.; Ialongo, S.; Florio, G.; Fedi, M.; Cella, F.

    2013-11-01

    We present a potential-field-constrained inversion procedure based on a priori information derived exclusively from the analysis of the gravity and magnetic data (self-constrained inversion). The procedure is designed to be applied to underdetermined problems and involves scenarios where the source distribution can be assumed to be of simple character. To set up effective constraints, we first estimate through the analysis of the gravity or magnetic field some or all of the following source parameters: the source depth-to-the-top, the structural index, the horizontal position of the source body edges and their dip. The second step is incorporating the information related to these constraints in the objective function as depth and spatial weighting functions. We show, through 2-D and 3-D synthetic and real data examples, that potential field-based constraints, for example, structural index, source boundaries and others, are usually enough to obtain substantial improvement in the density and magnetization models.

  1. Cosmogenic photons strongly constrain UHECR source models

    Directory of Open Access Journals (Sweden)

    van Vliet Arjen

    2017-01-01

    Full Text Available With the newest version of our Monte Carlo code for ultra-high-energy cosmic ray (UHECR) propagation, CRPropa 3, the flux of neutrinos and photons due to interactions of UHECRs with extragalactic background light can be predicted. Together with the recently updated data for the isotropic diffuse gamma-ray background (IGRB) by Fermi LAT, it is now possible to severely constrain UHECR source models. The evolution of the UHECR sources especially plays an important role in the determination of the expected secondary photon spectrum. Pure proton UHECR models are already strongly constrained, primarily by the highest energy bins of Fermi LAT’s IGRB, as long as their number density is not strongly peaked at recent times.

  2. A constrained supersymmetric left-right model

    Energy Technology Data Exchange (ETDEWEB)

    Hirsch, Martin [AHEP Group, Instituto de Física Corpuscular - C.S.I.C./Universitat de València, Edificio de Institutos de Paterna, Apartado 22085, E-46071 València (Spain); Krauss, Manuel E. [Bethe Center for Theoretical Physics & Physikalisches Institut der Universität Bonn, Nussallee 12, 53115 Bonn (Germany); Institut für Theoretische Physik und Astronomie, Universität Würzburg,Emil-Hilb-Weg 22, 97074 Wuerzburg (Germany); Opferkuch, Toby [Bethe Center for Theoretical Physics & Physikalisches Institut der Universität Bonn, Nussallee 12, 53115 Bonn (Germany); Porod, Werner [Institut für Theoretische Physik und Astronomie, Universität Würzburg,Emil-Hilb-Weg 22, 97074 Wuerzburg (Germany); Staub, Florian [Theory Division, CERN,1211 Geneva 23 (Switzerland)

    2016-03-02

    We present a supersymmetric left-right model which predicts gauge coupling unification close to the string scale and extra vector bosons at the TeV scale. The subtleties in constructing a model which is in agreement with the measured quark masses and mixing for such a low left-right breaking scale are discussed. It is shown that in the constrained version of this model radiative breaking of the gauge symmetries is possible and a SM-like Higgs is obtained. Additional CP-even scalars of a similar mass or even much lighter are possible. The expected mass hierarchies for the supersymmetric states differ clearly from those of the constrained MSSM. In particular, the lightest down-type squark, which is a mixture of the sbottom and extra vector-like states, is always lighter than the stop. We also comment on the model’s capability to explain current anomalies observed at the LHC.

  3. Coding for Two Dimensional Constrained Fields

    DEFF Research Database (Denmark)

    Laursen, Torben Vaarbye

    2006-01-01

    a first order model to model higher order constraints by the use of an alphabet extension. We present an iterative method that based on a set of conditional probabilities can help in choosing the large numbers of parameters of the model in order to obtain a stationary model. Explicit results are given...... for the No Isolated Bits constraint. Finally we present a variation of the encoding scheme of bit-stuffing that is applicable to the class of checkerboard constrained fields. It is possible to calculate the entropy of the coding scheme thus obtaining lower bounds on the entropy of the fields considered. These lower...... bounds are very tight for the Run-Length limited fields. Explicit bounds are given for the diamond constrained field as well....

  4. Communication Schemes with Constrained Reordering of Resources

    DEFF Research Database (Denmark)

    Popovski, Petar; Utkovski, Zoran; Trillingsgaard, Kasper Fløe

    2013-01-01

    This paper introduces a communication model inspired by two practical scenarios. The first scenario is related to the concept of protocol coding, where information is encoded in the actions taken by an existing communication protocol. We investigate strategies for protocol coding via combinatorial...... reordering of the labelled user resources (packets, channels) in an existing, primary system. However, the degrees of freedom of the reordering are constrained by the operation of the primary system. The second scenario is related to communication systems with energy harvesting, where the transmitted signals...... are constrained by the energy that is available through the harvesting process. We have introduced a communication model that covers both scenarios and elicits their key feature, namely the constraints of the primary system or the harvesting process. We have shown how to compute the capacity of the channels...

  5. Q-deformed systems and constrained dynamics

    International Nuclear Information System (INIS)

    Shabanov, S.V.

    1993-01-01

    It is shown that quantum theories of the q-deformed harmonic oscillator and one-dimensional free q-particle (a free particle on the 'quantum' line) can be obtained by the canonical quantization of classical Hamiltonian systems with commutative phase-space variables and a non-trivial symplectic structure. In the framework of this approach, classical dynamics of a particle on the q-line coincides with the one of a free particle with friction. It is argued that q-deformed systems can be treated as ordinary mechanical systems with the second-class constraints. In particular, second-class constrained systems corresponding to the q-oscillator and q-particle are given. A possibility of formulating q-deformed systems via gauge theories (first-class constrained systems) is briefly discussed. (orig.)

  6. Online constrained model-based reinforcement learning

    CSIR Research Space (South Africa)

    Van Niekerk, B

    2017-08-01

    Full Text Available Constrained Model-based Reinforcement Learning. Benjamin van Niekerk (School of Computer Science, University of the Witwatersrand, South Africa), Andreas Damianou (Amazon.com, Cambridge, UK), Benjamin Rosman (Council for Scientific and Industrial Research, and School...). MULTIPLE SHOOTING: Using direct multiple shooting (Bock and Plitt, 1984), problem (1) can be transformed into a structured non-linear program (NLP). First, the time horizon [t0, t0 + T] is partitioned into N equal subintervals [tk, tk+1] for k = 0...

  7. Constraining supergravity models from gluino production

    International Nuclear Information System (INIS)

    Barbieri, R.; Gamberini, G.; Giudice, G.F.; Ridolfi, G.

    1988-01-01

    The branching ratios for gluino decays $\tilde{g} \to q\bar{q}\chi$ and $\tilde{g} \to g\chi$ into a stable undetected neutralino are computed as functions of the relevant parameters of the underlying supergravity theory. A simple way of constraining supergravity models from gluino production emerges. The effectiveness of hadronic versus $e^+e^-$ colliders in the search for supersymmetry can be directly compared. (orig.)

  8. Cosmicflows Constrained Local UniversE Simulations

    Science.gov (United States)

    Sorce, Jenny G.; Gottlöber, Stefan; Yepes, Gustavo; Hoffman, Yehuda; Courtois, Helene M.; Steinmetz, Matthias; Tully, R. Brent; Pomarède, Daniel; Carlesi, Edoardo

    2016-01-01

    This paper combines observational data sets and cosmological simulations to generate realistic numerical replicas of the nearby Universe. The latter are excellent laboratories for studies of the non-linear process of structure formation in our neighbourhood. With measurements of radial peculiar velocities in the local Universe (cosmicflows-2) and a newly developed technique, we produce Constrained Local UniversE Simulations (CLUES). To assess the quality of these constrained simulations, we compare them with random simulations as well as with local observations. The cosmic variance, defined as the mean one-sigma scatter of cell-to-cell comparison between two fields, is significantly smaller for the constrained simulations than for the random simulations. Within the inner part of the box where most of the constraints are, the scatter is smaller by a factor of 2 to 3 on a 5 h^-1 Mpc scale with respect to that found for random simulations. This one-sigma scatter obtained when comparing the simulated and the observation-reconstructed velocity fields is only 104 ± 4 km s^-1, i.e. the linear theory threshold. These two results demonstrate that these simulations are in agreement with each other and with the observations of our neighbourhood. For the first time, simulations constrained with observational radial peculiar velocities resemble the local Universe up to a distance of 150 h^-1 Mpc on a scale of a few tens of megaparsecs. When focusing on the inner part of the box, the resemblance with our cosmic neighbourhood extends to a few megaparsecs (<5 h^-1 Mpc). The simulations provide a proper large-scale environment for studies of the formation of nearby objects.

  9. Dynamic Convex Duality in Constrained Utility Maximization

    OpenAIRE

    Li, Yusong; Zheng, Harry

    2016-01-01

    In this paper, we study a constrained utility maximization problem following the convex duality approach. After formulating the primal and dual problems, we construct the necessary and sufficient conditions for both the primal and dual problems in terms of FBSDEs plus additional conditions. Such formulation then allows us to explicitly characterize the primal optimal control as a function of the adjoint process coming from the dual FBSDEs in a dynamic fashion and vice versa. Moreover, we also...

  10. Statistical mechanics of budget-constrained auctions

    OpenAIRE

    Altarelli, F.; Braunstein, A.; Realpe-Gomez, J.; Zecchina, R.

    2009-01-01

    Finding the optimal assignment in budget-constrained auctions is a combinatorial optimization problem with many important applications, a notable example being the sale of advertisement space by search engines (in this context the problem is often referred to as the off-line AdWords problem). Based on the cavity method of statistical mechanics, we introduce a message passing algorithm that is capable of solving efficiently random instances of the problem extracted from a natural distribution,...

  11. Constraining neutron star matter with Quantum Chromodynamics

    CERN Document Server

    Kurkela, Aleksi; Schaffner-Bielich, Jurgen; Vuorinen, Aleksi

    2014-01-01

    In recent years, there have been several successful attempts to constrain the equation of state of neutron star matter using input from low-energy nuclear physics and observational data. We demonstrate that significant further restrictions can be placed by additionally requiring the pressure to approach that of deconfined quark matter at high densities. Remarkably, the new constraints turn out to be highly insensitive to the amount --- or even presence --- of quark matter inside the stars.

  12. Semantic category interference in overt picture naming: sharpening current density localization by PCA.

    Science.gov (United States)

    Maess, Burkhard; Friederici, Angela D; Damian, Markus; Meyer, Antje S; Levelt, Willem J M

    2002-04-01

    The study investigated the neuronal basis of the retrieval of words from the mental lexicon. The semantic category interference effect was used to locate lexical retrieval processes in time and space. This effect reflects the finding that, for overt naming, volunteers are slower when naming pictures out of a sequence of items from the same semantic category than from different categories. Participants named pictures blockwise either in the context of same- or mixed-category items while the brain response was registered using magnetoencephalography (MEG). Fifteen out of 20 participants showed longer response latencies in the same-category compared to the mixed-category condition. Event-related MEG signals for the participants demonstrating the interference effect were submitted to a current source density (CSD) analysis. As a new approach, a principal component analysis was applied to decompose the grand average CSD distribution into spatial subcomponents (factors). The spatial factor indicating left temporal activity revealed significantly different activation for the same-category compared to the mixed-category condition in the time window between 150 and 225 msec post picture onset. These findings indicate a major involvement of the left temporal cortex in the semantic interference effect. As this effect has been shown to take place at the level of lexical selection, the data suggest that the left temporal cortex supports processes of lexical retrieval during production.

  13. Modeling and query the uncertainty of network constrained moving objects based on RFID data

    Science.gov (United States)

    Han, Liang; Xie, Kunqing; Ma, Xiujun; Song, Guojie

    2007-06-01

    The management of network-constrained moving objects is increasingly practical, especially in intelligent transportation systems. In the past, the location information of moving objects on a network was collected by GPS, which is costly and raises problems of frequent updates and privacy. RFID (Radio Frequency IDentification) devices are used more and more widely to collect location information. They are cheaper, require fewer updates and interfere less with privacy. They detect the id of the object and the time when the moving object passes a node of the network. They do not detect the object's exact movement inside an edge, which leads to a problem of uncertainty. How to model and query the uncertainty of network-constrained moving objects based on RFID data therefore becomes a research issue. In this paper, a model is proposed to describe the uncertainty of network-constrained moving objects. A two-level index is presented to provide efficient access to the network and the movement data. The processing of imprecise time-slice queries and spatio-temporal range queries is studied in this paper. The processing includes four steps: spatial filtering, spatial refinement, temporal filtering and probability calculation. Finally, some experiments are done based on simulated data. In the experiments the performance of the index is studied. The precision and recall of the result set are defined, and how the query arguments affect the precision and recall of the result set is also discussed.

  14. Constraining the mass of the Local Group

    Science.gov (United States)

    Carlesi, Edoardo; Hoffman, Yehuda; Sorce, Jenny G.; Gottlöber, Stefan

    2017-03-01

    The mass of the Local Group (LG) is a crucial parameter for galaxy formation theories. However, its observational determination is challenging - its mass budget is dominated by dark matter that cannot be directly observed. To this end, the posterior distributions of the LG and its massive constituents have been constructed by means of constrained and random cosmological simulations. Two priors are assumed - the Λ cold dark matter model that is used to set up the simulations, and an LG model that encodes the observational knowledge of the LG and is used to select LG-like objects from the simulations. The constrained simulations are designed to reproduce the local cosmography as it is imprinted on to the Cosmicflows-2 data base of velocities. Several prescriptions are used to define the LG model, focusing in particular on different recent estimates of the tangential velocity of M31. It is found that (a) different v_tan choices affect the peak mass values up to a factor of 2, and change mass ratios of M_M31 to M_MW by up to 20 per cent; (b) constrained simulations yield more sharply peaked posterior distributions compared with the random ones; (c) LG mass estimates are found to be smaller than those found using the timing argument; (d) preferred Milky Way masses lie in the range of (0.6-0.8) × 10^12 M⊙; whereas (e) M_M31 is found to vary between (1.0-2.0) × 10^12 M⊙, with a strong dependence on the v_tan values used.

  15. The Feasibility Study for Multigeometries Identification of Uranium Components Using PCA-LSSVM Based on Correlation Measurements

    Directory of Open Access Journals (Sweden)

    Mi Zhou

    2018-01-01

    Full Text Available The geometry of uranium components is one of their key characteristics and is strictly confidential. The geometry identification of metal uranium components was studied using the 252Cf source-driven correlation measurement method. For the 3 uranium samples with the same mass and enrichment, there are only subtle differences in the neutron signals. Even worse, the correlation functions are disturbed by scattered neutrons and include “accidental” coincidences, which is not conducive to geometry identification. In this paper, we propose an identification method combining principal component analysis and least-squares support vector machines (PCA-LSSVM). The results based on PCA-LSSVM showed that the training precision of the identification model was 100% and the test precision was 95.83%. The total precision of the identification model was 98.41%, which indicates that it is an effective way to identify geometry properties from the correlation functions.
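
    The PCA-LSSVM combination can be sketched with a small NumPy implementation of a binary least-squares SVM (solving its KKT linear system) applied to PCA-reduced features; the RBF kernel width, regularisation value and the synthetic "correlation function" features are all illustrative assumptions.

        import numpy as np

        def lssvm_train(X, y, gamma=10.0, sigma=1.0):
            """Binary LS-SVM with an RBF kernel: solve the KKT linear system."""
            n = X.shape[0]
            sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
            K = np.exp(-sq / (2 * sigma ** 2))
            A = np.zeros((n + 1, n + 1))
            A[0, 1:], A[1:, 0] = 1.0, 1.0
            A[1:, 1:] = K + np.eye(n) / gamma
            sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
            return sol[0], sol[1:]                       # bias b, multipliers alpha

        def lssvm_predict(X_train, b, alpha, X, sigma=1.0):
            sq = ((X[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
            return np.sign(np.exp(-sq / (2 * sigma ** 2)) @ alpha + b)

        rng = np.random.default_rng(4)
        X = np.vstack([rng.normal(0.0, 1.0, (30, 20)),
                       rng.normal(1.5, 1.0, (30, 20))])  # two geometry classes
        y = np.r_[-np.ones(30), np.ones(30)]

        # PCA reduction before the LS-SVM, as in the PCA-LSSVM scheme.
        mean = X.mean(axis=0)
        _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
        Z = (X - mean) @ Vt[:5].T
        b, alpha = lssvm_train(Z, y)
        print("training accuracy:", (lssvm_predict(Z, b, alpha, Z) == y).mean())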

  16. Preliminary identification of unicellular algal genus by using combined confocal resonance Raman spectroscopy with PCA and DPLS analysis

    Science.gov (United States)

    He, Shixuan; Xie, Wanyi; Zhang, Ping; Fang, Shaoxi; Li, Zhe; Tang, Peng; Gao, Xia; Guo, Jinsong; Tlili, Chaker; Wang, Deqiang

    2018-02-01

    The analysis of algae and of the dominant alga plays important roles in ecological and environmental fields, since it can be used to forecast water blooms and control their potential deleterious effects. Herein, we combine in vivo confocal resonance Raman spectroscopy with multivariate analysis methods to preliminarily identify the three algal genera in water blooms at the unicellular scale. Statistical analysis of characteristic Raman peaks demonstrates that certain shifts and different normalized intensities, resulting from the composition of different carotenoids, exist in the Raman spectra of the three algal cells. Principal component analysis (PCA) scores and corresponding loading weights show some differences arising from the Raman spectral characteristics, which are caused by vibrations of carotenoids in the unicellular algae. Then, the discriminant partial least squares (DPLS) classification method is used to verify the effectiveness of algal identification with confocal resonance Raman spectroscopy. Our results show that confocal resonance Raman spectroscopy combined with PCA and DPLS can handle the preliminary identification of the dominant alga for forecasting and controlling water blooms.

  17. News Schemes for Activity Recognition Systems Using PCA-WSVM, ICA-WSVM, and LDA-WSVM

    Directory of Open Access Journals (Sweden)

    M’hamed Bilal Abidine

    2015-08-01

    Full Text Available Feature extraction and classification are two key steps for activity recognition in a smart home environment. In this work, we used three methods for feature extraction: Principal Component Analysis (PCA), Independent Component Analysis (ICA), and Linear Discriminant Analysis (LDA). The new features selected by each method are then used as the inputs for a Weighted Support Vector Machines (WSVM) classifier. This classifier is used to handle the problem of imbalanced activity data from the sensor readings. Experiments implemented on multiple real-world datasets with Conditional Random Fields (CRF), standard Support Vector Machines (SVM), Weighted SVM, and the combined methods PCA+WSVM, ICA+WSVM, and LDA+WSVM showed that LDA+WSVM had a higher recognition rate than the other methods for activity recognition.
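
    The PCA + weighted SVM combination can be sketched with scikit-learn, where class_weight='balanced' plays the role of the instance weighting for imbalanced activity classes; the sensor features, rare-class fraction and component count below are placeholders.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(5)
        X = rng.normal(size=(300, 40))                # sensor feature windows
        y = (rng.random(300) < 0.1).astype(int)       # rare activity (imbalanced)
        X[y == 1] += 1.0                              # give the rare class a signal

        model = make_pipeline(StandardScaler(),
                              PCA(n_components=10),
                              SVC(kernel="rbf", class_weight="balanced"))
        model.fit(X, y)
        print("recall on the rare class:", model.score(X[y == 1], y[y == 1]))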

  18. Corrosion of path A PCA and 12 Cr-1 MoVW steel in thermally convective lithium

    International Nuclear Information System (INIS)

    Tortorelli, P.F.; DeVan, J.H.

    1984-01-01

    Exposure of path A PCA alloys to thermally convective lithium for 6700 h at 600 and 570 °C resulted in corrosion reactions that were similar to what is observed for other austenitic alloys exposed under similar conditions. It corroded more rapidly than type 316 stainless steel, and the presence of nitride stringers in PCA did not affect the measured weight losses. Consideration of the weight change and surface analysis data for 12 Cr-1 MoVW steel exposed to thermally convective lithium between 500 and 350 °C for 10,088 h revealed that reactions with carbon and nitrogen were probably the principal corrosion processes for this alloy in this temperature range. Corrosion was not severe.

  19. An EEMD-PCA approach to extract heart rate, respiratory rate and respiratory activity from PPG signal.

    Science.gov (United States)

    Motin, Mohammod Abdul; Karmakar, Chandan Kumar; Palaniswami, Marimuthu

    2016-08-01

    The pulse oximeter's photoplethysmographic (PPG) signals measure the local variations of blood volume in tissues, reflecting the peripheral pulse modulated by cardiac activity, respiration and other physiological effects. Therefore, PPG can be used to extract vital cardiorespiratory signals like heart rate (HR), respiratory rate (RR) and respiratory activity (RA), and this will reduce the number of sensors connected to the patient's body for recording vital signs. In this paper, we propose an algorithm based on ensemble empirical mode decomposition with principal component analysis (EEMD-PCA) as a novel approach to estimate HR, RR and RA simultaneously from the PPG signal. To examine the performance of the proposed algorithm, we used 45 epochs of PPG, electrocardiogram (ECG) and respiratory signals extracted from the MIMIC database (Physionet ATM data bank). The ECG and capnograph-based respiratory signals were used as the ground truth, and several metrics such as magnitude squared coherence (MSC), correlation coefficients (CC) and root mean square (RMS) error were used to compare the performance of the EEMD-PCA algorithm with most of the existing methods in the literature. Results of EEMD-PCA based extraction of HR, RR and RA from the PPG signal showed that the median RMS error (quartiles) obtained for RR was 0 (0, 0.89) breaths/min, for HR was 0.62 (0.56, 0.66) beats/min and for RA the average value of MSC and CC was 0.95 and 0.89, respectively. These results illustrate that the proposed EEMD-PCA approach is more accurate in estimating HR, RR and RA than other existing methods.
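
    A rough EEMD-PCA sketch is given below, assuming the third-party PyEMD package provides the ensemble empirical mode decomposition; the synthetic "PPG" mixes a 1.2 Hz cardiac and a 0.25 Hz respiratory oscillation purely for illustration, and the trial count and sampling rate are arbitrary choices.

        import numpy as np
        from PyEMD import EEMD            # assumed dependency (pip install EMD-signal)

        fs = 50.0
        t = np.arange(0, 30, 1 / fs)
        ppg = (np.sin(2 * np.pi * 1.2 * t)                 # cardiac component
               + 0.5 * np.sin(2 * np.pi * 0.25 * t)        # respiratory component
               + 0.1 * np.random.default_rng(6).normal(size=t.size))

        imfs = EEMD(trials=20).eemd(ppg, t)                # intrinsic mode functions

        # PCA across the IMFs: the leading temporal component concentrates the
        # strongest oscillation, whose dominant frequency approximates the HR.
        X = imfs - imfs.mean(axis=1, keepdims=True)
        U, s, Vt = np.linalg.svd(X.T, full_matrices=False)
        pc1 = U[:, 0] * s[0]
        freqs = np.fft.rfftfreq(pc1.size, 1 / fs)
        peak = freqs[np.argmax(np.abs(np.fft.rfft(pc1 - pc1.mean())))]
        print("dominant frequency of PC1 (Hz):", round(float(peak), 2))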

  20. Epigenetic Signature: A New Player as Predictor of Clinically Significant Prostate Cancer (PCa) in Patients on Active Surveillance (AS).

    Science.gov (United States)

    Ferro, Matteo; Ungaro, Paola; Cimmino, Amelia; Lucarelli, Giuseppe; Busetto, Gian Maria; Cantiello, Francesco; Damiano, Rocco; Terracciano, Daniela

    2017-05-27

    Widespread prostate-specific antigen (PSA) testing notably increased the number of prostate cancer (PCa) diagnoses. However, about 30% of these patients have low-risk tumors that are not lethal and remain asymptomatic during their lifetime. Overtreatment of such patients may reduce quality of life and increase healthcare costs. Active surveillance (AS) has become an accepted alternative to immediate treatment in selected men with low-risk PCa. Despite much progress toward identifying the best candidates for AS in recent years, the greatest risk remains the possibility of misclassification of the cancer or missing a high-risk cancer. This is particularly worrisome in men with a life expectancy of greater than 10-15 years. The Prostate Cancer Research International Active Surveillance (PRIAS) study showed that, in addition to age and PSA at diagnosis, both PSA density (PSA-D) and the number of positive cores at diagnosis (two compared with one) are the strongest predictors for reclassification biopsy or switching to deferred treatment. However, there is still no consensus on guidelines for placing patients on AS. Each institution has its own protocol for AS that is based on PRIAS criteria. Many different variables have been proposed as tools to enrol patients in AS: PSA-D, the percentage of free PSA, and the extent of cancer on biopsy (number of positive cores or percentage of core involvement). More recently, the Prostate Health Index (PHI), the 4 Kallikrein (4K) score, and other patient factors, such as age, race, and family history, have been investigated as tools able to predict clinically significant PCa. Recently, some reports suggested that epigenetic mapping differs significantly between cancer patients and healthy subjects. These findings indicate, as a future prospect, the use of epigenetic markers to identify PCa patients with low-grade disease, who are likely candidates for AS. This review explores literature data about the potential of

  1. Plaque Tissue Morphology-Based Stroke Risk Stratification Using Carotid Ultrasound: A Polling-Based PCA Learning Paradigm.

    Science.gov (United States)

    Saba, Luca; Jain, Pankaj K; Suri, Harman S; Ikeda, Nobutaka; Araki, Tadashi; Singh, Bikesh K; Nicolaides, Andrew; Shafique, Shoaib; Gupta, Ajay; Laird, John R; Suri, Jasjit S

    2017-06-01

    Severe atherosclerotic disease in the carotid arteries causes stenosis, which in turn can lead to stroke. Machine learning systems have previously been developed for plaque wall risk assessment using morphology-based characterization. The fundamental assumption in such systems is the extraction of grayscale features of the plaque region. Even though these systems are able to perform risk stratification, they cannot achieve higher performance because of their inability to select and retain dominant features. This paper introduces a polling-based principal component analysis (PCA) strategy embedded in the machine learning framework to select and retain dominant features, resulting in superior performance as well as greater stability and reliability. The automated system uses offline image data along with the ground truth labels to generate the parameters, which are then used to transform the online grayscale features to predict the risk of stroke. A set of sixteen grayscale plaque features is computed. Using the cross-validation protocol (K = 10) and a PCA cutoff of 0.995, the machine learning system achieves an accuracy of 98.55% and 98.83% for the carotid far-wall and near-wall plaques, respectively. The corresponding reliability of the system was 94.56% and 95.63%, respectively. The automated system was validated against the manual risk assessment system, and the precision of merit for the same cross-validation settings and PCA cutoff is 98.28% and 93.92% for the far and near wall, respectively. PCA-embedded morphology-based plaque characterization offers a powerful strategy for risk assessment and can be adapted to clinical settings.
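
    The "PCA cutoff of 0.995" corresponds to retaining the principal components that together explain 99.5% of the variance of the grayscale features before classification. A minimal sketch of that step, assuming scikit-learn and a generic classifier standing in for the paper's full machine learning framework (the feature matrix and labels below are random placeholders):

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # X: (n_plaques, 16) grayscale plaque features; y: risk labels (placeholders).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 16))
    y = rng.integers(0, 2, size=200)

    # A float n_components in (0, 1) keeps just enough components to explain
    # that fraction of the variance -- here the 0.995 cutoff from the abstract.
    model = make_pipeline(StandardScaler(),
                          PCA(n_components=0.995, svd_solver="full"),
                          SVC())

    # K = 10 cross-validation protocol, as in the abstract.
    scores = cross_val_score(model, X, y, cv=10)
    print("mean cross-validated accuracy:", scores.mean())
    ```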

  2. CT-guided thin needles percutaneous cryoablation (PCA) in patients with primary and secondary lung tumors: A preliminary experience

    Energy Technology Data Exchange (ETDEWEB)

    Pusceddu, Claudio, E-mail: clapusceddu@gmail.com [Division of Interventional Radiology, Department of Oncological Radiology, Businco Hospital, Regional Referral Center for Oncologic Diseases, Cagliari, Zip code 09100 (Italy); Sotgia, Barbara, E-mail: barbara.sotgia@gmail.com [Department of Oncological Radiology, Businco Hospital, Regional Referral Center for Oncological Diseases, Cagliari, Zip code 09100 (Italy); Fele, Rosa Maria, E-mail: rosellafele@tiscali.it [Department of Oncological Radiology, Businco Hospital, Regional Referral Center for Oncological Diseases, Cagliari, Zip code 09100 (Italy); Melis, Luca, E-mail: doclucamelis@tiscali.it [Department of Oncological Radiology, Businco Hospital, Regional Referral Center for Oncological Diseases, Cagliari, Zip code 09100 (Italy)

    2013-05-15

    Purpose: To report the data of our initial experience with CT-guided thin cryoprobes for percutaneous cryoablation (PCA) in patients with primary and secondary pulmonary tumors. Material and methods: CT-guided thin-needle PCA was performed on 34 lung masses (11 NSCLC = 32%; 23 secondary lung malignancies = 68%) in 32 consecutive patients (24 men and 8 women; mean age 67 ± 10 years) not suitable for surgical resection. Lung masses were treated using two types of cryoprobes, IceRod and IceSeed, which produce ice balls of different sizes. The number of probes used ranged from 1 to 5 depending on the size of the tumor. After insertion of the cryoprobes into the lesion, PCA was performed with 2 (91%) or 3 (9%) cycles, each consisting of 12 min of freezing followed by a 4 min active thawing phase and a 4 min passive thawing phase. Results: All cryoablation sessions were successfully completed, and all primary and metastatic lung tumors were ablated. No procedure-related deaths occurred. Morbidity consisted of pneumothorax in 21% (7 of 34) and asymptomatic small pulmonary hemorrhage in 3% (1 of 34) of cases, all of CTCAE grade 1 (Common Terminology Criteria for Adverse Events). Low density of the entire lesion, central necrosis and solid mass appearance were identified in 21 (62%), 7 (21%) and 6 (17%) of the cryoablated tumors, respectively. No lymphadenopathy developed in the region of the treated lesions. Technical success (complete lack of enhancement) was achieved in 82%, 97% and 91% of treated lesions at the 1-, 3- and 6-month CT follow-up scans, respectively (p < .000). Comparing the longest tumor diameter between baseline and 6-month CT images, technical success was observed in 92% of cases (p < .000). Conclusion: Our preliminary experience suggests that PCA is a feasible treatment option. Well-designed clinical trials with a larger patient population are necessary to further investigate the long-term results and prognostic factors.

  3. A Principal Component Analysis (PCA) Approach to Seasonal and Zooplankton Diversity Relationships in Fishing Grounds of Mannar Gulf, India

    Directory of Open Access Journals (Sweden)

    Selvin J. PITCHAIKANI

    2017-06-01

    Full Text Available Principal component analysis (PCA) is a technique used to emphasize variation and bring out strong patterns in a dataset. It is often used to make data easy to explore and visualize. The primary objective of the present study was to record information on zooplankton diversity in a systematic way and to study the variability and relationships among the seasons prevailing in the Gulf of Mannar. The PCA for the seasonal zooplankton diversity was investigated using the four seasonal datasets to understand the statistical significance among the four seasons. Two different principal components (PC) were segregated homogeneously in all the seasons. The PCA revealed that Temora turbinata is an opportunistic species, that zooplankton diversity differed significantly from season to season and that, principally, zooplankton abundance and its dynamics in the Gulf of Mannar are structured by seasonal current patterns. The factor loadings of zooplankton for the different seasons in Tiruchendur coastal waters (GOM) differ from those on the southwest coast of India; in particular, routine and opportunistic species were found within the positive and negative factors. The copepods Acrocalanus gracilis and Acartia erythrea were dominant during the summer and southwest monsoon seasons owing to rainfall and freshwater discharge during the summer season; however, these species were replaced by Temora turbinata during the northeast monsoon season.

  4. Tool Wear Prediction in Ti-6Al-4V Machining through Multiple Sensor Monitoring and PCA Features Pattern Recognition

    Directory of Open Access Journals (Sweden)

    Alessandra Caggiano

    2018-03-01

    Full Text Available Machining of titanium alloys is characterised by extremely rapid tool wear due to the high cutting temperature and the strong adhesion at the tool-chip and tool-workpiece interfaces, caused by the low thermal conductivity and high chemical reactivity of Ti alloys. With the aim of monitoring the tool conditions during dry turning of Ti-6Al-4V alloy, a machine learning procedure based on the acquisition and processing of cutting force, acoustic emission and vibration sensor signals during turning is implemented. A number of sensorial features are extracted from the acquired sensor signals in order to feed machine learning paradigms based on artificial neural networks. To reduce the large dimensionality of the sensorial features, an advanced feature extraction methodology based on Principal Component Analysis (PCA) is proposed. PCA made it possible to identify a smaller number of features (k = 2, the principal component scores), obtained through linear projection of the original d features into a new space of reduced dimensionality k = 2, sufficient to describe the variance of the data. By feeding artificial neural networks with the PCA features, an accurate diagnosis of tool flank wear (VBmax) was achieved, with predicted values very close to the measured tool wear values.
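
    A minimal sketch of the reduction described above, with the sensorial feature vectors projected onto k = 2 principal component scores that feed a neural network regressor predicting flank wear. The feature matrix, network size and choice of scikit-learn estimator are illustrative assumptions, not the authors' implementation.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # X: sensorial features from force/acoustic-emission/vibration signals per pass.
    # y: measured flank wear VBmax (mm). Both are random placeholders here.
    rng = np.random.default_rng(42)
    X = rng.normal(size=(60, 12))
    y = rng.uniform(0.0, 0.3, size=60)

    model = make_pipeline(
        StandardScaler(),
        PCA(n_components=2),                 # project the d features onto k = 2 scores
        MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0),
    )
    model.fit(X, y)
    print("predicted VBmax for a new cutting pass:", model.predict(X[:1]))
    ```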

  5. An integrated DEA PCA numerical taxonomy approach for energy efficiency assessment and consumption optimization in energy intensive manufacturing sectors

    International Nuclear Information System (INIS)

    Azadeh, A.; Amalnick, M.S.; Ghaderi, S.F.; Asadzadeh, S.M.

    2007-01-01

    This paper introduces an integrated approach based on data envelopment analysis (DEA), principal component analysis (PCA) and numerical taxonomy (NT) for total energy efficiency assessment and optimization in energy intensive manufacturing sectors. The total energy efficiency assessment and optimization of the proposed approach considers structural indicators in addition to conventional consumption and manufacturing-sector output indicators. The validity of the DEA model is verified and validated by PCA and NT through a Spearman correlation experiment. Moreover, the proposed approach uses the measure-specific super-efficiency DEA model for sensitivity analysis to determine the critical energy carriers. Four energy intensive manufacturing sectors are discussed in this paper: the iron and steel, pulp and paper, petroleum refining and cement manufacturing sectors. To show its superiority and applicability, the proposed approach has been applied to refinery sub-sectors of some OECD (Organization for Economic Cooperation and Development) countries. This study has several unique features: (1) a total approach that considers structural indicators in addition to conventional energy efficiency indicators; (2) a verification and validation mechanism for DEA based on PCA and NT; and (3) the utilization of DEA for total energy efficiency assessment and consumption optimization of energy intensive manufacturing sectors.

  6. Tool Wear Prediction in Ti-6Al-4V Machining through Multiple Sensor Monitoring and PCA Features Pattern Recognition.

    Science.gov (United States)

    Caggiano, Alessandra

    2018-03-09

    Machining of titanium alloys is characterised by extremely rapid tool wear due to the high cutting temperature and the strong adhesion at the tool-chip and tool-workpiece interfaces, caused by the low thermal conductivity and high chemical reactivity of Ti alloys. With the aim of monitoring the tool conditions during dry turning of Ti-6Al-4V alloy, a machine learning procedure based on the acquisition and processing of cutting force, acoustic emission and vibration sensor signals during turning is implemented. A number of sensorial features are extracted from the acquired sensor signals in order to feed machine learning paradigms based on artificial neural networks. To reduce the large dimensionality of the sensorial features, an advanced feature extraction methodology based on Principal Component Analysis (PCA) is proposed. PCA made it possible to identify a smaller number of features (k = 2, the principal component scores), obtained through linear projection of the original d features into a new space of reduced dimensionality k = 2, sufficient to describe the variance of the data. By feeding artificial neural networks with the PCA features, an accurate diagnosis of tool flank wear (VBmax) was achieved, with predicted values very close to the measured tool wear values.

  7. Improving the bioavailability and anticancer effect of the PCA-1/ALKBH3 inhibitor HUHS015 using sodium salt.

    Science.gov (United States)

    Mabuchi, Miyuki; Shimizu, Tadashi; Ueda, Masahiro; Sasakawa, Yuka; Nakao, Syuhei; Ueda, Yuko; Kawamura, Akio; Tsujikawa, Kazutake; Tanaka, Akito

    2015-01-01

    Prostate cancer antigen (PCA)-1/AlkB homologue 3 (ALKBH3) has been identified as a clinically significant factor, and siRNA against PCA-1 inhibits DU145 proliferation both in vitro and in vivo. HUHS015 (1), a previously reported small-molecule PCA-1 inhibitor, was also effective without any obvious side effects or toxicity. The potency of HUHS015, however, is not satisfactory, which we attributed to its poor solubility, since insoluble material remained at the injection site after subcutaneous administration. To improve the inhibitor's solubility, we prepared various salts of HUHS015 and examined their solubility, which resulted in the selection of the HUHS015 sodium salt (2) for further studies in vivo. Next, we compared the pharmacokinetics of 1 and 2 via several administration routes and observed significant improvements in the pharmacokinetic parameters. For example, subcutaneous administration of 2 increased the area under the curve (AUC0-24) 8-fold compared with 1 and enhanced the suppressive effect on the proliferation of DU145 cells in a xenograft model. Copyright © 2015 International Institute of Anticancer Research (Dr. John G. Delinassios), All rights reserved.

  8. Tool Wear Prediction in Ti-6Al-4V Machining through Multiple Sensor Monitoring and PCA Features Pattern Recognition

    Science.gov (United States)

    2018-01-01

    Machining of titanium alloys is characterised by extremely rapid tool wear due to the high cutting temperature and the strong adhesion at the tool-chip and tool-workpiece interfaces, caused by the low thermal conductivity and high chemical reactivity of Ti alloys. With the aim of monitoring the tool conditions during dry turning of Ti-6Al-4V alloy, a machine learning procedure based on the acquisition and processing of cutting force, acoustic emission and vibration sensor signals during turning is implemented. A number of sensorial features are extracted from the acquired sensor signals in order to feed machine learning paradigms based on artificial neural networks. To reduce the large dimensionality of the sensorial features, an advanced feature extraction methodology based on Principal Component Analysis (PCA) is proposed. PCA made it possible to identify a smaller number of features (k = 2, the principal component scores), obtained through linear projection of the original d features into a new space of reduced dimensionality k = 2, sufficient to describe the variance of the data. By feeding artificial neural networks with the PCA features, an accurate diagnosis of tool flank wear (VBmax) was achieved, with predicted values very close to the measured tool wear values. PMID:29522443

  9. A PCA aided cross-covariance scheme for discriminative feature extraction from EEG signals.

    Science.gov (United States)

    Zarei, Roozbeh; He, Jing; Siuly, Siuly; Zhang, Yanchun

    2017-07-01

    Feature extraction of EEG signals plays a significant role in brain-computer interfaces (BCI), as it can significantly affect the performance and the computational time of the system. The main aim of the current work is to introduce an innovative algorithm for acquiring reliable discriminating features from EEG signals to improve classification performance and to reduce the time complexity. This study develops a robust feature extraction method combining principal component analysis (PCA) and the cross-covariance technique (CCOV) for the extraction of discriminatory information from the mental states based on EEG signals in BCI applications. We apply correlation-based variable selection with best-first search on the extracted features to identify the best feature set for characterizing the distribution of mental state signals. To verify the robustness of the proposed feature extraction method, three machine learning techniques, multilayer perceptron neural networks (MLP), least square support vector machine (LS-SVM) and logistic regression (LR), are employed on the obtained features. The proposed methods are evaluated on two publicly available datasets. Furthermore, we evaluate the performance of the proposed methods by comparing them with some recently reported algorithms. The experimental results show that all three classifiers achieve high performance (above 99% overall classification accuracy) for the proposed feature set. Among these classifiers, the MLP and LS-SVM methods yield the best performance for the obtained features. The average sensitivity, specificity and classification accuracy are the same for these two classifiers: 99.32%, 100% and 99.66%, respectively, for BCI competition dataset IVa, and 100%, 100% and 100% for BCI competition dataset IVb. The results also indicate that the proposed methods outperform the most recently reported methods by at least 0.25% in average accuracy on dataset IVa. The execution time
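
    A hedged sketch of one way to combine PCA with cross-covariance for EEG feature extraction, projecting a trial onto its principal components and using the cross-covariance between its component scores and those of a reference trial as the feature vector. This is only a plausible reading of the combination described above; the exact composition used by the authors may differ, and the channel counts and data below are placeholders.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    def pca_ccov_features(trial, reference, n_components=4):
        """Return the flattened cross-covariance between the PCA scores of a
        trial (channels x samples) and those of a reference trial."""
        pca = PCA(n_components=n_components)
        s1 = pca.fit_transform(trial.T)        # (n_samples, n_components)
        s2 = pca.transform(reference.T)
        s1 = s1 - s1.mean(axis=0)
        s2 = s2 - s2.mean(axis=0)
        ccov = s1.T @ s2 / s1.shape[0]         # (n_components, n_components)
        return ccov.ravel()

    # Illustrative usage with random data standing in for two EEG trials.
    rng = np.random.default_rng(0)
    trial, reference = rng.normal(size=(2, 22, 512))   # 22 channels, 512 samples each
    features = pca_ccov_features(trial, reference)
    print(features.shape)                              # (16,) for n_components = 4
    ```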

  10. Advances in temporal logic

    CERN Document Server

    Fisher, Michael; Gabbay, Dov; Gough, Graham

    2000-01-01

    Time is a fascinating subject that has captured mankind's imagination from ancient times to the present. It has been, and continues to be, studied across a wide range of disciplines, from the natural sciences to philosophy and logic. More than two decades ago, Pnueli showed in a seminal work the value of temporal logic in the specification and verification of computer programs. Today, a strong, vibrant international research community exists within the broader fields of computer science and AI. This volume presents a number of articles from leading researchers containing state-of-the-art results in such areas as pure temporal/modal logic, specification and verification, temporal databases, temporal aspects in AI, tense and aspect in natural language, and temporal theorem proving. Earlier versions of some of the articles were given at the most recent International Conference on Temporal Logic, University of Manchester, UK. Readership: Any student of the area - postgraduate, postdoctoral or even research professor ...

  11. Spatial attention does improve temporal discrimination.

    Science.gov (United States)

    Chica, Ana B; Christie, John

    2009-02-01

    It has recently been claimed that exogenous attention impairs performance on temporal-resolution tasks (Hein, Rolke, & Ulrich, 2006; Rolke, Dinkelbach, Hein, & Ulrich, 2008; Yeshurun, 2004; Yeshurun & Levy, 2003). In comparisons of performance on spatially cued trials versus neutral cued trials, the results have suggested that spatial attention decreases temporal resolution. However, when performance on cued and uncued trials has been compared in order to equate for cue salience, speed-accuracy trade-offs (SATs) have typically been observed, making the interpretation of the results difficult. In the present experiments, we aimed to study the effect of spatial attention on temporal resolution using a procedure that controls for SATs. We controlled reaction times (RTs) by constraining the time to respond, so that response decisions would be made within comparable time windows. The results revealed that when RT was controlled, performance was impaired for cued trials as compared with neutral trials, replicating previous findings. However, when cued and uncued trials were compared, performance was actually improved for cued trials as compared with uncued trials. These results suggest that SAT effects may have played an important role in the previous studies, because when they were controlled and measured, the results reversed, revealing that exogenous attention does improve performance on temporal-resolution tasks.

  12. Temporal fossa hemangiopericytoma: a case series.

    Science.gov (United States)

    Heiser, Marc A; Waldron, James S; Tihan, Tarik; Parsa, Andrew T; Cheung, Steven W

    2009-10-01

    Review clinical experience with temporal fossa hemangiopericytomas (HPCs). Retrospective case series review. Tertiary referral center. Intracranial HPCs within the temporal fossa. Craniotomy for either subtotal or gross total tumor excision. Determination of clinical outcome (alive with no evidence of disease, alive with disease, or died of disease). Five cases of HPC involving the temporal fossa were treated at our tertiary referral center between 1995 and 2008. All but 1 patient were men. The age at presentation ranged from 31 to 62 years, and the duration of follow-up ranged from 8 to 153 months. Clinical presentation was protean; headache was the most common symptom. Gross total tumor excision was achieved in 2 patients, whereas subtotal tumor excision was achieved in 3 patients. Reasons for subtotal resection included excessive intraoperative blood loss and inextricable tumor. Histologically, all tumors were composed of tightly packed, randomly oriented (jumbled-up) tumor cells with little intervening collagen. CD34 staining mostly highlighted the vascular background. One patient died of disease, 2 patients were alive with disease, and 2 patients had no evidence of disease. Management of temporal fossa HPC is challenging because clinical presentation is often late, and the extent of tumor excision is constrained by vital structures in the cranial base and intracranial contents. A multidisciplinary approach involving neurosurgery and neurotology, undertaken to achieve the most complete tumor resection possible while minimizing morbidity, is likely to confer a longer period of symptom-free survival and to improve the curability of these difficult lesions.

  13. Opioid patient controlled analgesia use during the initial experience with the IMPROVE PCA trial: a phase III analgesic trial for hospitalized sickle cell patients with painful episodes.

    Science.gov (United States)

    Dampier, Carlton D; Smith, Wally R; Kim, Hae-Young; Wager, Carrie Greene; Bell, Margaret C; Minniti, Caterina P; Keefer, Jeffrey; Hsu, Lewis; Krishnamurti, Lakshmanan; Mack, A Kyle; McClish, Donna; McKinlay, Sonja M; Miller, Scott T; Osunkwo, Ifeyinwa; Seaman, Phillip; Telen, Marilyn J; Weiner, Debra L

    2011-12-01

    Opioid analgesics administered by patient-controlled analgesia (PCA) are frequently used for pain relief in children and adults with sickle cell disease (SCD) hospitalized for persistent vaso-occlusive pain, but optimum opioid dosing is not known. To better define PCA dosing recommendations, a multi-center phase III clinical trial was conducted comparing two alternative opioid PCA dosing strategies (HDLI—higher demand dose with low constant infusion, or LDHI—lower demand dose and higher constant infusion) in 38 subjects who completed randomization prior to trial closure. Total opioid utilization (morphine equivalents, mg/kg) in 22 adults was 11.6 ± 2.6 and 4.7 ± 0.9 in the HDLI and in the LDHI arms, respectively, and in 12 children it was 3.7 ± 1.0 and 5.8 ± 2.2, respectively. Opioid-related symptoms were mild and similar in both PCA arms (mean daily opioid symptom intensity score: HDLI 0.9 ± 0.1, LDHI 0.9 ± 0.2). The slow enrollment and early study termination limited conclusions regarding superiority of either treatment regimen. This study adds to our understanding of opioid PCA usage in SCD. Future clinical trial protocol designs for opioid PCA may need to consider potential differences between adults and children in PCA usage.

  14. TEXPLORE temporal difference reinforcement learning for robots and time-constrained domains

    CERN Document Server

    Hester, Todd

    2013-01-01

    This book presents and develops new reinforcement learning methods that enable fast and robust learning on robots in real-time. Robots have the potential to solve many problems in society, because of their ability to work in dangerous places doing necessary jobs that no one wants or is able to do. One barrier to their widespread deployment is that they are mainly limited to tasks where it is possible to hand-program behaviors for every situation that may be encountered. For robots to meet their potential, they need methods that enable them to learn and adapt to novel situations that they were not programmed for. Reinforcement learning (RL) is a paradigm for learning sequential decision making processes and could solve the problems of learning and adaptation on robots. This book identifies four key challenges that must be addressed for an RL algorithm to be practical for robotic control tasks. These RL for Robotics Challenges are: 1) it must learn in very few samples; 2) it must learn in domains with continuou...

  15. The precise temporal calibration of dinosaur origins.

    Science.gov (United States)

    Marsicano, Claudia A; Irmis, Randall B; Mancuso, Adriana C; Mundil, Roland; Chemale, Farid

    2016-01-19

    Dinosaurs have been major components of ecosystems for over 200 million years. Although different macroevolutionary scenarios exist to explain the Triassic origin and subsequent rise to dominance of dinosaurs and their closest relatives (dinosauromorphs), all lack critical support from a precise biostratigraphically independent temporal framework. The absence of robust geochronologic age control for comparing alternative scenarios makes it impossible to determine if observed faunal differences vary across time, space, or a combination of both. To better constrain the origin of dinosaurs, we produced radioisotopic ages for the Argentinian Chañares Formation, which preserves a quintessential assemblage of dinosaurian precursors (early dinosauromorphs) just before the first dinosaurs. Our new high-precision chemical abrasion thermal ionization mass spectrometry (CA-TIMS) U-Pb zircon ages reveal that the assemblage is early Carnian (early Late Triassic), 5- to 10-Ma younger than previously thought. Combined with other geochronologic data from the same basin, we constrain the rate of dinosaur origins, demonstrating their relatively rapid origin in a less than 5-Ma interval, thus halving the temporal gap between assemblages containing only dinosaur precursors and those with early dinosaurs. After their origin, dinosaurs only gradually dominated mid- to high-latitude terrestrial ecosystems millions of years later, closer to the Triassic-Jurassic boundary.

  16. Constrained motion estimation-based error resilient coding for HEVC

    Science.gov (United States)

    Guo, Weihan; Zhang, Yongfei; Li, Bo

    2018-04-01

    Unreliable communication channels can lead to packet losses and bit errors in the videos transmitted through them, which cause severe video quality degradation. This is even worse for HEVC, since more advanced and powerful motion estimation methods are introduced to further remove inter-frame dependency and thus improve coding efficiency. Once a Motion Vector (MV) is lost or corrupted, it causes distortion in the decoded frame. More importantly, due to motion compensation, the error propagates along the motion prediction path, accumulates over time, and significantly degrades the overall video presentation quality. To address this problem, we study encoder-side error resilient coding for HEVC and propose a constrained motion estimation scheme to mitigate the propagation of errors to subsequent frames. The approach works by cutting off MV dependencies and limiting the block regions that are predicted by temporal motion vectors. The experimental results show that the proposed method can effectively suppress the error propagation caused by bit errors in motion vectors and can improve the robustness of the stream over bit-error channels. When the bit error probability is 10^-5, an increase in decoded video quality (PSNR) of up to 1.310 dB and on average 0.762 dB can be achieved, compared to the reference HEVC.

  17. Less favourable climates constrain demographic strategies in plants.

    Science.gov (United States)

    Csergő, Anna M; Salguero-Gómez, Roberto; Broennimann, Olivier; Coutts, Shaun R; Guisan, Antoine; Angert, Amy L; Welk, Erik; Stott, Iain; Enquist, Brian J; McGill, Brian; Svenning, Jens-Christian; Violle, Cyrille; Buckley, Yvonne M

    2017-08-01

    Correlative species distribution models are based on the observed relationship between species' occurrence and macroclimate or other environmental variables. In climates predicted to be less favourable, populations are expected to decline, whereas in favourable climates they are expected to persist. However, little comparative empirical support exists for a relationship between predicted climate suitability and population performance. We found that the performance of 93 populations of 34 plant species worldwide - as measured by in situ population growth rate, its temporal variation and extinction risk - was not correlated with climate suitability. However, correlations of the demographic processes underpinning population performance with climate suitability indicated both resistance and vulnerability pathways of population responses to climate: in less suitable climates, plants experienced greater retrogression (resistance pathway) and greater variability in some demographic rates (vulnerability pathway). While a range of demographic strategies occur within species' climatic niches, demographic strategies are more constrained in climates predicted to be less suitable. © 2017 The Authors. Ecology Letters published by CNRS and John Wiley & Sons Ltd.

  18. Indeterministic Temporal Logic

    Directory of Open Access Journals (Sweden)

    Trzęsicki Kazimierz

    2015-09-01

    Full Text Available The questions of determinism, causality, and freedom have been the main philosophical problems debated since the beginning of temporal logic. The issue of the logical value of sentences about the future was raised by Aristotle in the famous passage on tomorrow's sea battle. The question inspired Łukasiewicz's idea of many-valued logics and was a motive behind A. N. Prior's considerations on the logic of tenses. Within temporal logic there are different solutions to the problem. In this paper we consider an indeterministic temporal logic based on the idea of temporal worlds and the relation of accessibility between them.

  19. PHI and PCA3 improve the prognostic performance of PRIAS and Epstein criteria in predicting insignificant prostate cancer in men eligible for active surveillance.

    Science.gov (United States)

    Cantiello, Francesco; Russo, Giorgio Ivan; Cicione, Antonio; Ferro, Matteo; Cimino, Sebastiano; Favilla, Vincenzo; Perdonà, Sisto; De Cobelli, Ottavio; Magno, Carlo; Morgia, Giuseppe; Damiano, Rocco

    2016-04-01

    To assess the performance of the prostate health index (PHI) and prostate cancer antigen 3 (PCA3) when added to the PRIAS or Epstein criteria in predicting the presence of pathologically insignificant prostate cancer (IPCa) in patients who underwent radical prostatectomy (RP) but were eligible for active surveillance (AS). An observational retrospective study was performed in 188 PCa patients treated with laparoscopic or robot-assisted RP but eligible for AS according to the Epstein or PRIAS criteria. Blood and urinary specimens were collected before initial prostate biopsy for PHI and PCA3 measurements. Multivariate logistic regression analyses and decision curve analysis (DCA) were carried out to identify predictors of IPCa using the updated ERSPC definition. In the multivariate analyses, the inclusion of PCA3 and PHI significantly increased the accuracy of the Epstein multivariate model in predicting IPCa, by 17% (AUC = 0.77) and 32% (AUC = 0.92), respectively. The inclusion of PCA3 and PHI also increased the predictive accuracy of the PRIAS multivariate model, by 29% (AUC = 0.87) and 39% (AUC = 0.97), respectively. DCA revealed that the multivariable models with the addition of PHI or PCA3 showed a greater net benefit and performed better than the reference models. In a direct comparison, PHI outperformed PCA3, resulting in a higher net benefit. In the same cohort of patients eligible for AS, the addition of PHI and PCA3 to the Epstein or PRIAS models improved their prognostic performance. PHI resulted in a greater net benefit in predicting IPCa than PCA3.

  20. Cascading Constrained 2-D Arrays using Periodic Merging Arrays

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Laursen, Torben Vaarby

    2003-01-01

    We consider a method for designing 2-D constrained codes by cascading finite width arrays using predefined finite width periodic merging arrays. This provides a constructive lower bound on the capacity of the 2-D constrained code. Examples include symmetric RLL and density constrained codes...

  1. Operator approach to solutions of the constrained BKP hierarchy

    International Nuclear Information System (INIS)

    Shen, Hsin-Fu; Lee, Niann-Chern; Tu, Ming-Hsien

    2011-01-01

    The operator formalism for the vector k-constrained BKP hierarchy is presented. We solve the Hirota bilinear equations of the vector k-constrained BKP hierarchy via the method of neutral free fermions. In particular, by choosing a suitable group element of O(∞), we construct rational and soliton solutions of the vector k-constrained BKP hierarchy.

  2. Feature and Pose Constrained Visual Aided Inertial Navigation for Computationally Constrained Aerial Vehicles

    Science.gov (United States)

    Williams, Brian; Hudson, Nicolas; Tweddle, Brent; Brockers, Roland; Matthies, Larry

    2011-01-01

    A Feature and Pose Constrained Extended Kalman Filter (FPC-EKF) is developed for highly dynamic, computationally constrained micro aerial vehicles. Vehicle localization is achieved using only a low-performance inertial measurement unit and a single camera. The FPC-EKF framework augments the vehicle's state with both previous vehicle poses and critical environmental features, including vertical edges. This filter framework efficiently incorporates measurements from hundreds of opportunistic visual features to constrain the motion estimate, while allowing navigation and sustained tracking with respect to a few persistent features. In addition, vertical features in the environment are opportunistically used to provide global attitude references. Accurate pose estimation is demonstrated on a sequence including fast traversing, where visual features enter and exit the field of view quickly, as well as hover and ingress maneuvers where drift-free navigation is achieved with respect to the environment.

  3. Incomplete Dirac reduction of constrained Hamiltonian systems

    Energy Technology Data Exchange (ETDEWEB)

    Chandre, C., E-mail: chandre@cpt.univ-mrs.fr

    2015-10-15

    First-class constraints constitute a potential obstacle to the computation of a Poisson bracket in Dirac’s theory of constrained Hamiltonian systems. Using the pseudoinverse instead of the inverse of the matrix defined by the Poisson brackets between the constraints, we show that a Dirac–Poisson bracket can be constructed, even if it corresponds to an incomplete reduction of the original Hamiltonian system. The uniqueness of Dirac brackets is discussed. The relevance of this procedure for infinite dimensional Hamiltonian systems is exemplified.

  4. Capturing Hotspots For Constrained Indoor Movement

    DEFF Research Database (Denmark)

    Ahmed, Tanvir; Pedersen, Torben Bach; Lu, Hua

    2013-01-01

    Finding hotspots in large indoor spaces is very important for identifying overloaded locations and for security, crowd management, indoor navigation and guidance. The tracking data coming from indoor tracking are huge in volume and not readily usable for finding hotspots. This paper presents a graph-based model for constrained indoor movement that can map the tracking records into mapping records which represent the entry and exit times of an object in a particular location. It then discusses the technique for extracting hotspots from the mapping records.

  5. Quantization of soluble classical constrained systems

    International Nuclear Information System (INIS)

    Belhadi, Z.; Menas, F.; Bérard, A.; Mohrbach, H.

    2014-01-01

    The derivation of the brackets among coordinates and momenta for classical constrained systems is a necessary step toward their quantization. Here we present a new approach for the determination of the classical brackets which does neither require Dirac’s formalism nor the symplectic method of Faddeev and Jackiw. This approach is based on the computation of the brackets between the constants of integration of the exact solutions of the equations of motion. From them all brackets of the dynamical variables of the system can be deduced in a straightforward way

  6. Quantization of soluble classical constrained systems

    Energy Technology Data Exchange (ETDEWEB)

    Belhadi, Z. [Laboratoire de physique et chimie quantique, Faculté des sciences, Université Mouloud Mammeri, BP 17, 15000 Tizi Ouzou (Algeria); Laboratoire de physique théorique, Faculté des sciences exactes, Université de Bejaia, 06000 Bejaia (Algeria); Menas, F. [Laboratoire de physique et chimie quantique, Faculté des sciences, Université Mouloud Mammeri, BP 17, 15000 Tizi Ouzou (Algeria); Ecole Nationale Préparatoire aux Etudes d’ingéniorat, Laboratoire de physique, RN 5 Rouiba, Alger (Algeria); Bérard, A. [Equipe BioPhysStat, Laboratoire LCP-A2MC, ICPMB, IF CNRS No 2843, Université de Lorraine, 1 Bd Arago, 57078 Metz Cedex (France); Mohrbach, H., E-mail: herve.mohrbach@univ-lorraine.fr [Equipe BioPhysStat, Laboratoire LCP-A2MC, ICPMB, IF CNRS No 2843, Université de Lorraine, 1 Bd Arago, 57078 Metz Cedex (France)

    2014-12-15

    The derivation of the brackets among coordinates and momenta for classical constrained systems is a necessary step toward their quantization. Here we present a new approach for the determination of the classical brackets which does neither require Dirac’s formalism nor the symplectic method of Faddeev and Jackiw. This approach is based on the computation of the brackets between the constants of integration of the exact solutions of the equations of motion. From them all brackets of the dynamical variables of the system can be deduced in a straightforward way.

  7. Probing the mysteries of the X-ray binary 4U 1210-64 with ASM, PCA, MAXI, BAT, and Suzaku

    Energy Technology Data Exchange (ETDEWEB)

    Coley, Joel B.; Corbet, Robin H. D.; Mukai, Koji; Pottschmidt, Katja, E-mail: jcoley1@umbc.edu [University of Maryland Baltimore County, 1000 Hilltop Cir, Baltimore, MD 21250 (United States)

    2014-10-01

    4U 1210-64 has been postulated to be a high-mass X-ray binary powered by the Be mechanism. X-ray observations with Suzaku, the ISS Monitor of All-sky X-ray Image (MAXI), and the Rossi X-ray Timing Explorer Proportional Counter Array (PCA) and All Sky Monitor (ASM) provide detailed temporal and spectral information on this poorly understood source. Long-term ASM and MAXI observations show distinct high and low states and the presence of a 6.7101 ± 0.0005 day modulation, interpreted as the orbital period. Folded light curves reveal a sharp dip, interpreted as an eclipse. To determine the nature of the mass donor, the predicted eclipse half-angle was calculated as a function of inclination angle for several stellar spectral types. The eclipse half-angle is not consistent with a mass donor of spectral type B5 V; however, stars with spectral types B0 V or B0-5 III are possible. The best-fit spectral model consists of a power law with index Γ = 1.85 (+0.04/−0.05) and a high-energy cutoff at 5.5 ± 0.2 keV, modified by an absorber that fully covers the source as well as partially covering absorption. Emission lines from S XVI Kα, Fe Kα, Fe XXV Kα, and Fe XXVI Kα were observed in the Suzaku spectra. Out of eclipse, the Fe Kα line flux was strongly correlated with the unabsorbed continuum flux, indicating that the Fe I emission is the result of fluorescence of cold dense material near the compact object. The Fe I feature is not detected during eclipse, further supporting an origin close to the compact object.
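
    The eclipse half-angle test mentioned above rests on simple eclipse geometry. For a circular orbit and a spherical companion of radius $R_c$ at orbital separation $a$ viewed at inclination $i$ (an illustrative assumption, not necessarily the authors' exact formulation), the eclipse half-angle $\theta_e$ satisfies

    \[
    \frac{R_c}{a} = \sqrt{\cos^2 i + \sin^2 i \, \sin^2 \theta_e},
    \]

    so each candidate spectral type, which fixes $R_c$ and (through the component masses and the orbital period) $a$, predicts $\theta_e$ as a function of $i$ that can be compared with the observed eclipse duration.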

  8. Chondroblastoma of temporal bone

    Energy Technology Data Exchange (ETDEWEB)

    Tanohta, K.; Noda, M.; Katoh, H.; Okazaki, A.; Sugiyama, S.; Maehara, T.; Onishi, S.; Tanida, T.

    1986-07-01

    The case of a 55-year-old female with chondroblastoma arising from the left temporal bone is presented. Although 10 cases of temporal chondroblastoma have been reported, this is the first in which plain radiography, pluridirectional tomography, computed tomography (CT) and angiography were performed. We discuss the clinical and radiological aspects of this rare tumor.

  9. Chondroblastoma of temporal bone

    International Nuclear Information System (INIS)

    Tanohta, K.; Noda, M.; Katoh, H.; Okazaki, A.; Sugiyama, S.; Maehara, T.; Onishi, S.; Tanida, T.

    1986-01-01

    The case of a 55-year-old female with chondroblastoma arising from the left temporal bone is presented. Although 10 cases of temporal chondroblastoma have been reported, this is the first in which plain radiography, pluridirectional tomography, computed tomography (CT) and angiography were performed. We discuss the clinical and radiological aspects of this rare tumor. (orig.)

  10. Pole shifting with constrained output feedback

    International Nuclear Information System (INIS)

    Hamel, D.; Mensah, S.; Boisvert, J.

    1984-03-01

    The concept of pole placement plays an important role in linear multi-variable control theory. It has received much attention since its introduction, and several pole shifting algorithms are now available. This work presents a new method which allows practical engineering constraints, such as gain limits and controller structure, to be introduced directly into the pole shifting design strategy. This is achieved by formulating the pole placement problem as a constrained optimization problem. Explicit constraints (controller structure and gain limits) are defined to identify an admissible region for the feedback gain matrix. The desired pole configuration is translated into an appropriate cost function which is then minimized over the closed loop. The resulting constrained optimization problem can thus be solved with optimization algorithms. The method has been implemented as an algorithmic interactive module in a computer-aided control system design package, MVPACK. The application of the method is illustrated by designing controllers for an aircraft and an evaporator. The results illustrate the importance of controller structure for the overall performance of a control system.
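
    A minimal sketch of the idea of casting pole placement with gain limits as a constrained optimization, for a toy state-space plant with static output feedback. The plant, cost function, bounds and solver settings are illustrative assumptions, not the MVPACK implementation; with a constrained controller structure the desired pole configuration may only be approached, not matched exactly.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Toy plant x' = A x + B u, y = C x, with static output feedback u = -K y.
    A = np.array([[0.0, 1.0], [-2.0, -0.5]])
    B = np.array([[0.0], [1.0]])
    C = np.array([[1.0, 0.0]])

    desired = np.array([-2.0 + 1.0j, -2.0 - 1.0j])      # target closed-loop poles

    def cost(k_flat):
        K = k_flat.reshape(B.shape[1], C.shape[0])
        poles = np.linalg.eigvals(A - B @ K @ C)
        # Squared distance between achieved and desired pole sets.
        return np.sum(np.abs(np.sort_complex(poles) - np.sort_complex(desired)) ** 2)

    # Explicit gain limits define the admissible region for the feedback gain.
    bounds = [(-10.0, 10.0)]
    result = minimize(cost, x0=np.zeros(1), bounds=bounds, method="L-BFGS-B")
    K_opt = result.x.reshape(B.shape[1], C.shape[0])
    print("gain:", K_opt.ravel(),
          "closed-loop poles:", np.linalg.eigvals(A - B @ K_opt @ C))
    ```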

  11. Changes in epistemic frameworks: Random or constrained?

    Directory of Open Access Journals (Sweden)

    Ananka Loubser

    2012-11-01

    Full Text Available Since the emergence of a solid anti-positivist approach in the philosophy of science, an important question has been to understand how and why epistemic frameworks change in time, are modified or even substituted. In contemporary philosophy of science three main approaches to framework change can be detected in the humanist tradition: 1. In both the pre-theoretical and theoretical domains, changes occur according to a rather constrained, predictable or even pre-determined pattern (e.g. Holton). 2. Changes occur in a way that is more random or unpredictable and free from constraints (e.g. Kuhn, Feyerabend, Rorty, Lyotard). 3. Between these approaches, a middle position can be found, attempting some kind of synthesis (e.g. Popper, Lakatos). Because this situation calls for clarification and systematisation, this article tried to achieve more clarity on how changes in pre-scientific frameworks occur, and provided transcendental criticism of the above positions. The article suggested that the above-mentioned positions are not fully satisfactory, as change and constancy are not sufficiently integrated. An alternative model was suggested in which changes in epistemic frameworks occur according to a pattern, neither completely random nor rigidly constrained, which results in change being dynamic but not arbitrary. This alternative model is integral rather than dialectical and therefore does not correspond to position three.

  12. Fringe instability in constrained soft elastic layers.

    Science.gov (United States)

    Lin, Shaoting; Cohen, Tal; Zhang, Teng; Yuk, Hyunwoo; Abeyaratne, Rohan; Zhao, Xuanhe

    2016-11-04

    Soft elastic layers with top and bottom surfaces adhered to rigid bodies are abundant in biological organisms and engineering applications. As the rigid bodies are pulled apart, the stressed layer can exhibit various modes of mechanical instabilities. In cases where the layer's thickness is much smaller than its length and width, the dominant modes that have been studied are the cavitation, interfacial and fingering instabilities. Here we report a new mode of instability which emerges if the thickness of the constrained elastic layer is comparable to or smaller than its width. In this case, the middle portion along the layer's thickness elongates nearly uniformly while the constrained fringe portions of the layer deform nonuniformly. When the applied stretch reaches a critical value, the exposed free surfaces of the fringe portions begin to undulate periodically without debonding from the rigid bodies, giving the fringe instability. We use experiments, theory and numerical simulations to quantitatively explain the fringe instability and derive scaling laws for its critical stress, critical strain and wavelength. We show that in a force controlled setting the elastic fingering instability is associated with a snap-through buckling that does not exist for the fringe instability. The discovery of the fringe instability will not only advance the understanding of mechanical instabilities in soft materials but also have implications for biological and engineered adhesives and joints.

  13. Molecular Characterization of the Genes pcaG and pcaH, Encoding Protocatechuate 3,4-Dioxygenase, Which Are Essential for Vanillin Catabolism in Pseudomonas sp. Strain HR199

    Science.gov (United States)

    Overhage, Jörg; Kresse, Andreas U.; Priefert, Horst; Sommer, Horst; Krammer, Gerhard; Rabenhorst, Jürgen; Steinbüchel, Alexander

    1999-01-01

    Pseudomonas sp. strain HR199 is able to utilize eugenol (4-allyl-2-methoxyphenol), vanillin (4-hydroxy-3-methoxybenzaldehyde), or protocatechuate as the sole carbon source for growth. Mutants of this strain which were impaired in the catabolism of vanillin but retained the ability to utilize eugenol or protocatechuate were obtained after nitrosoguanidine mutagenesis. One mutant (SK6169) was used as recipient of a Pseudomonas sp. strain HR199 genomic library in cosmid pVK100, and phenotypic complementation was achieved with a 5.8-kbp EcoRI fragment (E58). The amino acid sequences deduced from two corresponding open reading frames (ORF) identified on E58 revealed high degrees of homology to pcaG and pcaH, encoding the two subunits of protocatechuate 3,4-dioxygenase. Three additional ORF most probably encoded a 4-hydroxybenzoate 3-hydroxylase (PobA) and two putative regulatory proteins, which exhibited homology to PcaQ of Agrobacterium tumefaciens and PobR of Pseudomonas aeruginosa, respectively. Since mutant SK6169 was also complemented by a subfragment of E58 that harbored only pcaH, this mutant was most probably lacking a functional β subunit of the protocatechuate 3,4-dioxygenase. Since this mutant was still able to grow on protocatechuate and lacked protocatechuate 4,5-dioxygenase and protocatechuate 2,3-dioxygenase, the degradation had to be catalyzed by different enzymes. Two other mutants (SK6184 and SK6190), which were also impaired in the catabolism of vanillin, were not complemented by fragment E58. Since these mutants accumulated 3-carboxy muconolactone during cultivation on eugenol, they most probably exhibited a defect in a step of the catabolic pathway following the ortho cleavage. Moreover, in these mutants cyclization of 3-carboxymuconic acid seems to occur by a syn absolute stereochemical course, which is normally only observed for cis,cis-muconate lactonization in pseudomonads. In conclusion, vanillin is degraded through the ortho-cleavage pathway

  14. Comparative Effectiveness of Semantic Feature Analysis (SFA and Phonological Components Analysis (PCA for Anomia Treatment in Persian Speaking Patients With Aphasia

    Directory of Open Access Journals (Sweden)

    Zahra Sadeghi

    2017-09-01

    Discussion: While PCA is more effective for participants with phonological impairments, SFA is more effective for participants with semantic impairments. Therefore, a direct relationship between underlying functional deficit and response to specific treatment was established for all participants.

  15. NOAA TIFF Image - 4m Bathymetric Principal Component Analysis (PCA) of Red Snapper Research Areas in the South Atlantic Bight, 2010

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset contains unified Bathymetric PCA GeoTiffs with 4x4 meter cell resolution describing the topography of 15 areas along the shelf edge off the South...

  16. Principle Component Analysis with Incomplete Data: A simulation of R pcaMethods package in Constructing an Environmental Quality Index with Missing Data

    Science.gov (United States)

    Missing data is a common problem in the application of statistical techniques. In principal component analysis (PCA), a technique for dimensionality reduction, incomplete data points are either discarded or imputed using interpolation methods. Such approaches are less valid when ...
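
    For readers who want to reproduce the idea outside R, a minimal Python analogue of imputing before PCA (this is not the pcaMethods package itself; the imputer choice, component count and data below are assumptions for illustration):

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.impute import SimpleImputer
    from sklearn.pipeline import make_pipeline

    # Environmental indicator matrix with missing entries (NaN placeholders).
    rng = np.random.default_rng(1)
    X = rng.normal(size=(100, 8))
    X[rng.random(X.shape) < 0.1] = np.nan          # roughly 10% missing at random

    # Impute (here: column means), then keep the leading components that
    # could be combined into an environmental quality index.
    pipeline = make_pipeline(SimpleImputer(strategy="mean"), PCA(n_components=3))
    scores = pipeline.fit_transform(X)
    print(scores.shape)                            # (100, 3) component scores
    ```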

  17. The singular value filter: a general filter design strategy for PCA-based signal separation in medical ultrasound imaging.

    Science.gov (United States)

    Mauldin, F William; Lin, Dan; Hossack, John A

    2011-11-01

    A general filtering method, called the singular value filter (SVF), is presented as a framework for principal component analysis (PCA) based filter design in medical ultrasound imaging. The SVF approach operates by projecting the original data onto a new set of bases determined from PCA using singular value decomposition (SVD). The shape of the SVF weighting function, which relates the singular value spectrum of the input data to the filtering coefficients assigned to each basis function, is designed in accordance with a signal model and statistical assumptions regarding the underlying source signals. In this paper, we applied SVF for the specific application of clutter artifact rejection in diagnostic ultrasound imaging. SVF was compared to a conventional PCA-based filtering technique, which we refer to as the blind source separation (BSS) method, as well as a simple frequency-based finite impulse response (FIR) filter used as a baseline for comparison. The performance of each filter was quantified in simulated lesion images as well as experimental cardiac ultrasound data. SVF was demonstrated in both simulation and experimental results, over a wide range of imaging conditions, to outperform the BSS and FIR filtering methods in terms of contrast-to-noise ratio (CNR) and motion tracking performance. In experimental mouse heart data, SVF provided excellent artifact suppression with an average CNR improvement of 1.8 dB with over 40% reduction in displacement tracking error. It was further demonstrated from simulation and experimental results that SVF provided superior clutter rejection, as reflected in larger CNR values, when filtering was achieved using complex pulse-echo received data and non-binary filter coefficients.
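
    A minimal sketch of the general projection-and-reweighting idea described above: decompose the data with the SVD, scale each basis function by a weight derived from its (normalized) singular value, and reconstruct. The particular weighting function below, a smooth roll-off that suppresses the largest, clutter-dominated components, is an illustrative assumption, not the SVF weighting derived in the paper.

    ```python
    import numpy as np

    def svd_reweight_filter(X, weight_fn):
        """Project X onto its singular vectors, weight each basis function
        according to its singular value, and reconstruct the filtered data."""
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        w = weight_fn(s / s.max())                 # weights from the normalized spectrum
        return (U * (w * s)) @ Vt

    def soft_clutter_weights(s_norm, cutoff=0.3, sharpness=20.0):
        """Smoothly attenuate the largest components and pass the rest (an assumption)."""
        return 1.0 / (1.0 + np.exp(sharpness * (s_norm - cutoff)))

    # Illustrative usage: rows = depth samples, columns = slow-time ensemble.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(64, 32))
    filtered = svd_reweight_filter(X, soft_clutter_weights)
    print(filtered.shape)                          # (64, 32)
    ```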

  18. A Hybrid PCA-CART-MARS-Based Prognostic Approach of the Remaining Useful Life for Aircraft Engines

    Directory of Open Access Journals (Sweden)

    Fernando Sánchez Lasheras

    2015-03-01

    Full Text Available Prognostics is an engineering discipline that predicts the future health of a system. In this research work, a data-driven approach to prognostics is proposed. Indeed, the present paper describes a data-driven hybrid model for the successful prediction of the remaining useful life of aircraft engines. The approach combines the multivariate adaptive regression splines (MARS) technique with principal component analysis (PCA), dendrograms and classification and regression trees (CARTs). Elements extracted from sensor signals are used to train this hybrid model, representing different levels of health for aircraft engines. In this way, the hybrid algorithm is used to predict the trends of these elements. Based on this fitting, one can determine the future health state of a system and estimate its remaining useful life (RUL) with accuracy. To evaluate the proposed approach, a test was carried out using aircraft engine signals collected from physical sensors (temperature, pressure, speed, fuel flow, etc.). Simulation results show that the PCA-CART-MARS-based approach can forecast faults long before they occur and can predict the RUL. The main advantage of the proposed hybrid model is that it does not require information about the previous operating states of the input variables of the engine. The performance of this model was compared with that obtained by other benchmark models (multivariate linear regression and artificial neural networks) also applied in recent years to the modeling of remaining useful life. The PCA-CART-MARS-based approach is therefore very promising in the field of prognostics of the RUL for aircraft engines.
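
    A partial sketch of the data-driven chain described above, using PCA followed by a regression tree (the CART stage) to map sensor-derived health indicators to remaining useful life. The MARS stage is omitted here (it is available in third-party Python packages), and the data, component count and tree depth are illustrative assumptions rather than the authors' configuration.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.tree import DecisionTreeRegressor

    # X: sensor-derived health indicators per engine cycle; y: remaining useful life.
    # Both are random placeholders standing in for real engine data.
    rng = np.random.default_rng(7)
    X = rng.normal(size=(500, 21))
    y = rng.uniform(0, 300, size=500)              # RUL in cycles

    model = make_pipeline(
        StandardScaler(),
        PCA(n_components=5),                       # compress correlated sensor channels
        DecisionTreeRegressor(max_depth=6, random_state=0),   # CART stage
    )
    model.fit(X, y)
    print("predicted RUL for one engine snapshot:", model.predict(X[:1]))
    ```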

  19. Evaluation of significant sources influencing the variation of water quality of Kandla creek, Gulf of Katchchh, using PCA

    Digital Repository Service at National Institute of Oceanography (India)

    Dalal, S.G.; Shirodkar, P.V.; Jagtap, T.G.; Naik, B.G.; Rao, G.S.

    and Marhaba 2003). The use of PCA for water quality assessment has increased in the last few years, mainly due to the need to obtain appreciable data reduction for analysis and decision (Morales et al. 1999). Bartlett's sphericity test (χ2 with degrees....). Florida: CRC. Morales, M. M., Mart, P., Llopis, A., Campos, L., & Sagrado, J. (1999). An environmental study by factor analysis of surface seawater in the Gulf of Valencia (western Mediterranean). Analytica Chimica Acta, 394, 109-117. doi:10.1016/S0003...

  20. PCA/INCREMENT MEMORY interface for analog processors on-line with PC-XT/AT IBM

    International Nuclear Information System (INIS)

    Biri, S.; Buttsev, V.S.; Molnar, J.; Samojlov, V.N.

    1989-01-01

    The functional and operational descriptions of the PCA/INCREMENT MEMORY interface are given. The unit provides: a connection between the analogue signal processor and the PC; nuclear spectrum acquisition up to 2^24 − 1 counts/channel using the increment or decrement method; and data read/write to and from memory via the PC data bus during spectrum acquisition. The dual-ported memory is organized as 4096 × 24 bit, and the increment cycle time at a 4.77 MHz system clock frequency is 1.05 μs. 6 refs.; 2 figs

  1. Otosclerosis: Temporal Bone Pathology.

    Science.gov (United States)

    Quesnel, Alicia M; Ishai, Reuven; McKenna, Michael J

    2018-04-01

    Otosclerosis is pathologically characterized by abnormal bony remodeling, which includes bone resorption, new bone deposition, and vascular proliferation in the temporal bone. Sensorineural hearing loss in otosclerosis is associated with extension of otosclerosis to the cochlear endosteum and deposition of collagen throughout the spiral ligament. Persistent or recurrent conductive hearing loss after stapedectomy has been associated with incomplete footplate fenestration, poor incus-prosthesis connection, and incus resorption in temporal bone specimens. Human temporal bone pathology has helped to define the role of computed tomography imaging for otosclerosis, confirming that computed tomography is highly sensitive for diagnosis, yet limited in assessing cochlear endosteal involvement. Copyright © 2017 Elsevier Inc. All rights reserved.

  2. Prostate Health Index (Phi) and Prostate Cancer Antigen 3 (PCA3) significantly improve prostate cancer detection at initial biopsy in a total PSA range of 2-10 ng/ml.

    Science.gov (United States)

    Ferro, Matteo; Bruzzese, Dario; Perdonà, Sisto; Marino, Ada; Mazzarella, Claudia; Perruolo, Giuseppe; D'Esposito, Vittoria; Cosimato, Vincenzo; Buonerba, Carlo; Di Lorenzo, Giuseppe; Musi, Gennaro; De Cobelli, Ottavio; Chun, Felix K; Terracciano, Daniela

    2013-01-01

    Many efforts have been made to reduce prostate specific antigen (PSA) overdiagnosis and overtreatment. To this aim, the Prostate Health Index (Phi) and Prostate Cancer Antigen 3 (PCA3) have been proposed as new, more specific biomarkers. We evaluated the ability of phi and PCA3 to identify prostate cancer (PCa) at initial prostate biopsy in men with total PSA in the range of 2-10 ng/ml. The performance of phi and PCA3 was evaluated in 300 patients undergoing first prostate biopsy. ROC curve analyses tested the accuracy (AUC) of phi and PCA3 in predicting PCa. Decision curve analyses (DCA) were used to compare the clinical benefit of the two biomarkers. We found that the AUC value of phi (0.77) was comparable to those of %p2PSA (0.76) and PCA3 (0.73), with no significant differences in pairwise comparisons (%p2PSA vs phi p = 0.673, %p2PSA vs. PCA3 p = 0.417 and phi vs. PCA3 p = 0.247). These three biomarkers significantly outperformed fPSA (AUC = 0.60), %fPSA (AUC = 0.62) and p2PSA (AUC = 0.63). At DCA, phi and PCA3 exhibited very similar net benefit profiles up to a threshold probability of 25%, after which the phi index showed a higher net benefit than PCA3. Multivariable analysis showed that the addition of phi and PCA3 to the base multivariable model (age, PSA, %fPSA, DRE, prostate volume) increased predictive accuracy, whereas no model improved single biomarker performance. Finally, we showed that subjects with active surveillance (AS)-compatible cancer had significantly lower phi and PCA3 values. Phi and PCA3 comparably increase the accuracy of predicting the presence of PCa in the total PSA range of 2-10 ng/ml at initial biopsy, outperforming the currently used %fPSA.
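
    A minimal sketch of the kind of ROC comparison reported here, computing an AUC for each competing biomarker against the biopsy outcome with scikit-learn. The data are random placeholders, and the pairwise significance testing and decision curve analysis used in the study are not shown.

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(3)
    n = 300
    biopsy_positive = rng.integers(0, 2, size=n)     # 1 = PCa found at biopsy (placeholder)

    # Placeholder biomarker values; positives drawn from slightly shifted distributions.
    phi = rng.normal(loc=35 + 10 * biopsy_positive, scale=12, size=n)
    pca3 = rng.normal(loc=30 + 8 * biopsy_positive, scale=14, size=n)

    print("AUC phi :", roc_auc_score(biopsy_positive, phi))
    print("AUC PCA3:", roc_auc_score(biopsy_positive, pca3))
    ```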

  3. Scheduling of resource-constrained projects

    CERN Document Server

    Klein, Robert

    2000-01-01

    Project management has become a widespread instrument enabling organizations to efficiently master the challenges of steadily shortening product life cycles, global markets and decreasing profit margins. With projects increasing in size and complexity, their planning and control represents one of the most crucial management tasks. This is especially true for scheduling, which is concerned with establishing execution dates for the sub-activities to be performed in order to complete the project. The ability to manage projects where resources must be allocated between concurrent projects or even sub-activities of a single project requires the use of commercial project management software packages. However, the results yielded by the solution procedures included are often rather unsatisfactory. Scheduling of Resource-Constrained Projects develops more efficient procedures, which can easily be integrated into software packages by incorporated programming languages, and thus should be of great interest for practiti...

  4. Constrained mathematics evaluation in probabilistic logic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Arlin Cooper, J

    1998-06-01

    A challenging problem in mathematically processing uncertain operands is that constraints inherent in the problem definition can require computations that are difficult to implement. Examples of possible constraints are that the sum of the probabilities of partitioned possible outcomes must be one, and repeated appearances of the same variable must all have the identical value. The latter, called the 'repeated variable problem', will be addressed in this paper in order to show how interval-based probabilistic evaluation of Boolean logic expressions, such as those describing the outcomes of fault trees and event trees, can be facilitated in a way that can be readily implemented in software. We will illustrate techniques that can be used to transform complex constrained problems into trivial problems in most tree logic expressions, and into tractable problems in most other cases.
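
    A toy illustration of the repeated variable problem, assuming a single uncertain probability p that appears twice in an expression: naive interval arithmetic propagates each occurrence independently and over-widens the bounds, while constraining both occurrences to one value recovers the true range. This only shows the effect, not the transformation techniques developed in the paper.

```python
import numpy as np

# The same uncertain probability p appears twice in the expression p * (1 - p),
# e.g. "event occurs" and "event does not occur" in a two-branch event tree.
lo, hi = 0.2, 0.4                      # interval for p (hypothetical numbers)

# Naive interval arithmetic: each occurrence of p is propagated independently.
naive_lo = lo * (1 - hi)               # 0.12
naive_hi = hi * (1 - lo)               # 0.32

# Constrained evaluation: both occurrences must share the same value of p.
p = np.linspace(lo, hi, 10_001)
vals = p * (1 - p)
constrained_lo, constrained_hi = vals.min(), vals.max()   # 0.16, 0.24

print("naive bounds      :", (naive_lo, naive_hi))
print("constrained bounds:", (round(constrained_lo, 4), round(constrained_hi, 4)))
```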

  5. Constraining dark sectors with monojets and dijets

    International Nuclear Information System (INIS)

    Chala, Mikael; Kahlhoefer, Felix; Nardini, Germano; Schmidt-Hoberg, Kai; McCullough, Matthew

    2015-03-01

    We consider dark sector particles (DSPs) that obtain sizeable interactions with Standard Model fermions from a new mediator. While these particles can avoid observation in direct detection experiments, they are strongly constrained by LHC measurements. We demonstrate that there is an important complementarity between searches for DSP production and searches for the mediator itself, in particular bounds on (broad) dijet resonances. This observation is crucial not only in the case where the DSP is all of the dark matter but whenever - precisely due to its sizeable interactions with the visible sector - the DSP annihilates away so efficiently that it only forms a dark matter subcomponent. To highlight the different roles of DSP direct detection and LHC monojet and dijet searches, as well as perturbativity constraints, we first analyse the exemplary case of an axial-vector mediator and then generalise our results. We find important implications for the interpretation of LHC dark matter searches in terms of simplified models.

  6. Constrained KP models as integrable matrix hierarchies

    International Nuclear Information System (INIS)

    Aratyn, H.; Ferreira, L.A.; Gomes, J.F.; Zimerman, A.H.

    1997-01-01

We formulate the constrained KP hierarchy (denoted by cKP_{K+1,M}) as an affine ŝl(M+K+1) matrix integrable hierarchy generalizing the Drinfeld-Sokolov hierarchy. Using an algebraic approach, including the graded structure of the generalized Drinfeld-Sokolov hierarchy, we are able to find several new universal results valid for the cKP hierarchy. In particular, our method yields a closed expression for the second bracket obtained through Dirac reduction of any untwisted affine Kac-Moody current algebra. An explicit example is given for the case ŝl(M+K+1), for which a closed expression for the general recursion operator is also obtained. We show how isospectral flows are characterized and grouped according to the semisimple non-regular element E of sl(M+K+1) and the content of the center of the kernel of E. copyright 1997 American Institute of Physics

  7. Quantum cosmology of classically constrained gravity

    International Nuclear Information System (INIS)

    Gabadadze, Gregory; Shang Yanwen

    2006-01-01

In [G. Gabadadze, Y. Shang, hep-th/0506040] we discussed a classically constrained model of gravity. This theory contains known solutions of General Relativity (GR) and admits solutions that are absent in GR. Here we study cosmological implications of some of these new solutions. We show that a spatially flat de Sitter universe can be created from 'nothing'. This universe has boundaries, and its total energy equals zero. Although the probability to create such a universe is exponentially suppressed, it favors initial conditions suitable for inflation. Then we discuss a finite-energy solution with a nonzero cosmological constant and zero space-time curvature. There is no tunneling suppression to fluctuate into this state. We show that for a positive cosmological constant this state is unstable: it can rapidly transition to a de Sitter universe, providing a new unsuppressed channel for inflation. For a negative cosmological constant the space-time flat solution is stable.

  8. Multiple Clustering Views via Constrained Projections

    DEFF Research Database (Denmark)

    Dang, Xuan-Hong; Assent, Ira; Bailey, James

    2012-01-01

    Clustering, the grouping of data based on mutual similarity, is often used as one of principal tools to analyze and understand data. Unfortunately, most conventional techniques aim at finding only a single clustering over the data. For many practical applications, especially those being described...... in high dimensional data, it is common to see that the data can be grouped into different yet meaningful ways. This gives rise to the recently emerging research area of discovering alternative clusterings. In this preliminary work, we propose a novel framework to generate multiple clustering views....... The framework relies on a constrained data projection approach by which we ensure that a novel alternative clustering being found is not only qualitatively strong but also distinctively different from a reference clustering solution. We demonstrate the potential of the proposed framework using both synthetic...
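
    One common way to realize this idea (not necessarily the authors' exact framework) is to project the data onto a subspace in which a reference clustering is uninformative and then cluster again; the sketch below assumes synthetic blob data and k-means for both clustering steps.

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic data with two natural, orthogonal 2-way groupings (a 2x2 grid of blobs).
rng = np.random.default_rng(0)
centers = np.array([[0, 0], [0, 5], [5, 0], [5, 5]], dtype=float)
X = np.vstack([c + rng.normal(scale=0.4, size=(100, 2)) for c in centers])

# Reference clustering.
ref = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Direction that discriminates the reference clusters (difference of cluster means).
d = X[ref == 0].mean(axis=0) - X[ref == 1].mean(axis=0)
d /= np.linalg.norm(d)

# Constrained projection: remove the component along d so the reference structure
# is no longer visible, then cluster again to obtain an alternative view.
X_alt = X - np.outer(X @ d, d)
alt = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X_alt)

# The two clusterings should cut the data along different directions,
# so most combinations of (reference label, alternative label) occur.
print(np.unique(np.stack([ref, alt]), axis=1).shape[1])  # expect ~4 distinct label pairs
```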

  9. Shape space exploration of constrained meshes

    KAUST Repository

    Yang, Yongliang

    2011-12-12

    We present a general computational framework to locally characterize any shape space of meshes implicitly prescribed by a collection of non-linear constraints. We computationally access such manifolds, typically of high dimension and co-dimension, through first and second order approximants, namely tangent spaces and quadratically parameterized osculant surfaces. Exploration and navigation of desirable subspaces of the shape space with regard to application specific quality measures are enabled using approximants that are intrinsic to the underlying manifold and directly computable in the parameter space of the osculant surface. We demonstrate our framework on shape spaces of planar quad (PQ) meshes, where each mesh face is constrained to be (nearly) planar, and circular meshes, where each face has a circumcircle. We evaluate our framework for navigation and design exploration on a variety of inputs, while keeping context specific properties such as fairness, proximity to a reference surface, etc. © 2011 ACM.

  10. Shape space exploration of constrained meshes

    KAUST Repository

    Yang, Yongliang; Yang, Yijun; Pottmann, Helmut; Mitra, Niloy J.

    2011-01-01

    We present a general computational framework to locally characterize any shape space of meshes implicitly prescribed by a collection of non-linear constraints. We computationally access such manifolds, typically of high dimension and co-dimension, through first and second order approximants, namely tangent spaces and quadratically parameterized osculant surfaces. Exploration and navigation of desirable subspaces of the shape space with regard to application specific quality measures are enabled using approximants that are intrinsic to the underlying manifold and directly computable in the parameter space of the osculant surface. We demonstrate our framework on shape spaces of planar quad (PQ) meshes, where each mesh face is constrained to be (nearly) planar, and circular meshes, where each face has a circumcircle. We evaluate our framework for navigation and design exploration on a variety of inputs, while keeping context specific properties such as fairness, proximity to a reference surface, etc. © 2011 ACM.

  11. Constrained vertebrate evolution by pleiotropic genes.

    Science.gov (United States)

    Hu, Haiyang; Uesaka, Masahiro; Guo, Song; Shimai, Kotaro; Lu, Tsai-Ming; Li, Fang; Fujimoto, Satoko; Ishikawa, Masato; Liu, Shiping; Sasagawa, Yohei; Zhang, Guojie; Kuratani, Shigeru; Yu, Jr-Kai; Kusakabe, Takehiro G; Khaitovich, Philipp; Irie, Naoki

    2017-11-01

    Despite morphological diversification of chordates over 550 million years of evolution, their shared basic anatomical pattern (or 'bodyplan') remains conserved by unknown mechanisms. The developmental hourglass model attributes this to phylum-wide conserved, constrained organogenesis stages that pattern the bodyplan (the phylotype hypothesis); however, there has been no quantitative testing of this idea with a phylum-wide comparison of species. Here, based on data from early-to-late embryonic transcriptomes collected from eight chordates, we suggest that the phylotype hypothesis would be better applied to vertebrates than chordates. Furthermore, we found that vertebrates' conserved mid-embryonic developmental programmes are intensively recruited to other developmental processes, and the degree of the recruitment positively correlates with their evolutionary conservation and essentiality for normal development. Thus, we propose that the intensively recruited genetic system during vertebrates' organogenesis period imposed constraints on its diversification through pleiotropic constraints, which ultimately led to the common anatomical pattern observed in vertebrates.

  12. Constraining Lyman continuum escape using Machine Learning

    Science.gov (United States)

    Giri, Sambit K.; Zackrisson, Erik; Binggeli, Christian; Pelckmans, Kristiaan; Cubo, Rubén; Mellema, Garrelt

    2018-05-01

    The James Webb Space Telescope (JWST) will observe the rest-frame ultraviolet/optical spectra of galaxies from the epoch of reionization (EoR) in unprecedented detail. While escaping into the intergalactic medium, hydrogen-ionizing (Lyman continuum; LyC) photons from the galaxies will contribute to the bluer end of the UV slope and make nebular emission lines less prominent. We present a method to constrain leakage of the LyC photons using the spectra of high redshift (z >~ 6) galaxies. We simulate JWST/NIRSpec observations of galaxies at z =6-9 by matching the fluxes of galaxies observed in the Frontier Fields observations of galaxy cluster MACS-J0416. Our method predicts the escape fraction fesc with a mean absolute error Δfesc ~ 0.14. The method also predicts the redshifts of the galaxies with an error .

  13. Statistical mechanics of budget-constrained auctions

    International Nuclear Information System (INIS)

    Altarelli, F; Braunstein, A; Realpe-Gomez, J; Zecchina, R

    2009-01-01

    Finding the optimal assignment in budget-constrained auctions is a combinatorial optimization problem with many important applications, a notable example being in the sale of advertisement space by search engines (in this context the problem is often referred to as the off-line AdWords problem). On the basis of the cavity method of statistical mechanics, we introduce a message-passing algorithm that is capable of solving efficiently random instances of the problem extracted from a natural distribution, and we derive from its properties the phase diagram of the problem. As the control parameter (average value of the budgets) is varied, we find two phase transitions delimiting a region in which long-range correlations arise

  14. Constraining Dark Sectors with Monojets and Dijets

    CERN Document Server

    Chala, Mikael; McCullough, Matthew; Nardini, Germano; Schmidt-Hoberg, Kai

    2015-01-01

    We consider dark sector particles (DSPs) that obtain sizeable interactions with Standard Model fermions from a new mediator. While these particles can avoid observation in direct detection experiments, they are strongly constrained by LHC measurements. We demonstrate that there is an important complementarity between searches for DSP production and searches for the mediator itself, in particular bounds on (broad) dijet resonances. This observation is crucial not only in the case where the DSP is all of the dark matter but whenever - precisely due to its sizeable interactions with the visible sector - the DSP annihilates away so efficiently that it only forms a dark matter subcomponent. To highlight the different roles of DSP direct detection and LHC monojet and dijet searches, as well as perturbativity constraints, we first analyse the exemplary case of an axial-vector mediator and then generalise our results. We find important implications for the interpretation of LHC dark matter searches in terms of simpli...

  15. Statistical mechanics of budget-constrained auctions

    Science.gov (United States)

    Altarelli, F.; Braunstein, A.; Realpe-Gomez, J.; Zecchina, R.

    2009-07-01

    Finding the optimal assignment in budget-constrained auctions is a combinatorial optimization problem with many important applications, a notable example being in the sale of advertisement space by search engines (in this context the problem is often referred to as the off-line AdWords problem). On the basis of the cavity method of statistical mechanics, we introduce a message-passing algorithm that is capable of solving efficiently random instances of the problem extracted from a natural distribution, and we derive from its properties the phase diagram of the problem. As the control parameter (average value of the budgets) is varied, we find two phase transitions delimiting a region in which long-range correlations arise.

  16. Constrained least squares regularization in PET

    International Nuclear Information System (INIS)

    Choudhury, K.R.; O'Sullivan, F.O.

    1996-01-01

    Standard reconstruction methods used in tomography produce images with undesirable negative artifacts in background and in areas of high local contrast. While sophisticated statistical reconstruction methods can be devised to correct for these artifacts, their computational implementation is excessive for routine operational use. This work describes a technique for rapid computation of approximate constrained least squares regularization estimates. The unique feature of the approach is that it involves no iterative projection or backprojection steps. This contrasts with the familiar computationally intensive algorithms based on algebraic reconstruction (ART) or expectation-maximization (EM) methods. Experimentation with the new approach for deconvolution and mixture analysis shows that the root mean square error quality of estimators based on the proposed algorithm matches and usually dominates that of more elaborate maximum likelihood, at a fraction of the computational effort
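
    The sketch below illustrates the general benefit of constrained least squares on a toy 1-D deconvolution, comparing an unconstrained solution with a nonnegativity-constrained one via SciPy's nnls; it does not reproduce the paper's fast non-iterative algorithm, and the blur kernel and spike positions are assumptions.

```python
import numpy as np
from scipy.optimize import nnls
from scipy.linalg import toeplitz

# Toy 1-D deconvolution: blur a nonnegative spike train and try to recover it.
rng = np.random.default_rng(1)
n = 60
x_true = np.zeros(n)
x_true[[10, 25, 40]] = [1.0, 0.6, 0.8]

# Gaussian blur expressed as a symmetric Toeplitz (convolution) matrix.
t = np.arange(n)
kernel = np.exp(-0.5 * (t / 2.0) ** 2)
A = toeplitz(kernel)
y = A @ x_true + rng.normal(scale=0.01, size=n)

# Unconstrained least squares can go negative in the background ...
x_ls, *_ = np.linalg.lstsq(A, y, rcond=None)

# ... whereas nonnegativity-constrained least squares suppresses those artifacts.
x_nnls, _ = nnls(A, y)

print("min of unconstrained estimate:", x_ls.min())
print("min of constrained estimate  :", x_nnls.min())   # >= 0 by construction
```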

  17. Constraining dark sectors with monojets and dijets

    Energy Technology Data Exchange (ETDEWEB)

    Chala, Mikael; Kahlhoefer, Felix; Nardini, Germano; Schmidt-Hoberg, Kai [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); McCullough, Matthew [European Organization for Nuclear Research (CERN), Geneva (Switzerland). Theory Div.

    2015-03-15

    We consider dark sector particles (DSPs) that obtain sizeable interactions with Standard Model fermions from a new mediator. While these particles can avoid observation in direct detection experiments, they are strongly constrained by LHC measurements. We demonstrate that there is an important complementarity between searches for DSP production and searches for the mediator itself, in particular bounds on (broad) dijet resonances. This observation is crucial not only in the case where the DSP is all of the dark matter but whenever - precisely due to its sizeable interactions with the visible sector - the DSP annihilates away so efficiently that it only forms a dark matter subcomponent. To highlight the different roles of DSP direct detection and LHC monojet and dijet searches, as well as perturbativity constraints, we first analyse the exemplary case of an axial-vector mediator and then generalise our results. We find important implications for the interpretation of LHC dark matter searches in terms of simplified models.

  18. Constraining the dark side with observations

    International Nuclear Information System (INIS)

    Diez-Tejedor, Alberto

    2007-01-01

The main purpose of this talk is to use the observational evidence pointing to the existence of a dark side in the universe in order to infer some of the properties of the unseen material. We will work within Unified Dark Matter models, in which both Dark Matter and Dark Energy appear as the result of one unknown component. By modeling this component effectively with a classical scalar field minimally coupled to gravity, we will use the observations to constrain the form of the dark action. Using the flat rotation curves of spiral galaxies we will see that we are restricted to the use of purely kinetic actions, previously studied in cosmology by Scherrer. Finally we arrive at a simple action which fits both cosmological and astrophysical observations

  19. Constraining the dark side with observations

    Energy Technology Data Exchange (ETDEWEB)

    Diez-Tejedor, Alberto [Dpto. de Fisica Teorica, Universidad del PaIs Vasco, Apdo. 644, 48080, Bilbao (Spain)

    2007-05-15

The main purpose of this talk is to use the observational evidence pointing to the existence of a dark side in the universe in order to infer some of the properties of the unseen material. We will work within Unified Dark Matter models, in which both Dark Matter and Dark Energy appear as the result of one unknown component. By modeling this component effectively with a classical scalar field minimally coupled to gravity, we will use the observations to constrain the form of the dark action. Using the flat rotation curves of spiral galaxies we will see that we are restricted to the use of purely kinetic actions, previously studied in cosmology by Scherrer. Finally we arrive at a simple action which fits both cosmological and astrophysical observations.

  20. Hard exclusive meson production to constrain GPDs

    Energy Technology Data Exchange (ETDEWEB)

    Wolbeek, Johannes ter; Fischer, Horst; Gorzellik, Matthias; Gross, Arne; Joerg, Philipp; Koenigsmann, Kay; Malm, Pasquale; Regali, Christopher; Schmidt, Katharina; Sirtl, Stefan; Szameitat, Tobias [Physikalisches Institut, Albert-Ludwigs-Universitaet Freiburg, Freiburg im Breisgau (Germany); Collaboration: COMPASS Collaboration

    2014-07-01

    The concept of Generalized Parton Distributions (GPDs) combines the two-dimensional spatial information, given by form factors, with the longitudinal momentum information from the PDFs. Thus, GPDs provide a three-dimensional 'tomography' of the nucleon. Furthermore, according to Ji's sum rule, the GPDs H and E enable access to the total angular momenta of quarks, antiquarks and gluons. While H can be approached using electroproduction cross section, hard exclusive meson production off a transversely polarized target can help to constrain the GPD E. At the COMPASS experiment at CERN, two periods of data taking were performed in 2007 and 2010, using a longitudinally polarized 160 GeV/c muon beam and a transversely polarized NH{sub 3} target. This talk introduces the data analysis of the process μ + p → μ' + p' + V, and recent results are presented.

  1. Recent progresses of neural network unsupervised learning: I. Independent component analyses generalizing PCA

    Science.gov (United States)

    Szu, Harold H.

    1999-03-01

The early-vision principle of redundancy reduction of ~10^8 sensor excitations is understandable from a computer vision viewpoint in terms of sparse edge maps. It has only recently been derived using a truly unsupervised learning paradigm of artificial neural networks (ANN). In fact, the biological vision result, the Hubel-Wiesel edge maps, is reproduced by seeking the underlying independent component analysis (ICA) among ~10^2 image samples and maximizing the ANN output entropy via the gradient-ascent rule ∂[W]/∂t = ∂H(V)/∂[W]. When a pair of newborn eyes or ears meets the bustling and hustling world without supervision, they seek ICA by comparing two sensory measurements (x1(t), x2(t))^T = X(t). Assuming a linear and instantaneous mixture model of the external world, X(t) = [A]S(t), where both the mixing matrix [A] = [a1, a2] of ICA vectors and the source percentages (s1(t), s2(t))^T = S(t) are unknown, we seek the independent sources (≈ [I], where the approximation sign indicates that higher-order statistics (HOS) may not be trivial). Without a teacher, the ANN weight matrix [W] = [w1, w2] adjusts the outputs V(t) = tanh([W]X(t)) ≈ [W]X(t) until there are no desired outputs except the (Gaussian) 'garbage' (neither YES '1' nor NO '-1', but the linear maybe range around the origin '0') defined by the Gaussian covariance G = [I] = [W][A], giving the internal knowledge representation [W] as the inverse of the external world matrix, [A]^-1. To unify ICA, PCA, ANN and HOS theories developed since 1991 (advanced by Jutten & Herault, Comon, Oja, Bell-Sejnowski, Amari-Cichocki, Cardoso), the Lyapunov function L(v1,...,vn, w1,...,wn) = E(v1,...,vn) - H(w1,...,wn) is constructed as the Helmholtz free energy to prove the convergence of both supervised energy-E learning and unsupervised entropy-H learning. Consequently, rather than using the faithful but dumb computer ('GARBAGE-IN, GARBAGE-OUT'), the smarter neurocomputer will be equipped with an unsupervised learning that extracts
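
    As a hedged illustration of the X(t) = [A]S(t) unmixing problem described above, the sketch below uses scikit-learn's FastICA on a two-source instantaneous mixture; this is not the entropy-maximization ANN learning rule of the record, but it recovers the same kind of independent components. The source waveforms and mixing matrix are assumptions.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Two independent, non-Gaussian sources s1(t), s2(t) and an unknown 2x2 mixing matrix A.
rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
S = np.c_[np.sign(np.sin(3 * t)),            # square wave
          rng.laplace(size=t.size)]          # heavy-tailed noise source
A = np.array([[1.0, 0.6],
              [0.5, 1.0]])
X = S @ A.T                                   # observed mixtures x(t) = A s(t)

# Unsupervised unmixing: estimate W such that the outputs recover the sources
# (up to permutation and scale).
ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(X)

# Correlate recovered and true sources to check the separation quality.
corr = np.corrcoef(S_hat.T, S.T)[:2, 2:]
print(np.round(np.abs(corr), 2))              # close to a permutation of the identity
```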

  2. Nontraumatic temporal subcortical hemorrhage

    International Nuclear Information System (INIS)

    Weisberg, L.A.; Stazio, A.; Shamsnia, M.; Elliott, D.; Charity Hospital, New Orleans, LA

    1990-01-01

    Thirty patients with temporal hematomas were analyzed. Four with frontal extension survived. Of 6 with ganglionic extension, three had residual deficit. Of 8 with parietal extension, 4 had delayed deterioration and died, two patients recovered, and two with peritumoral hemorrhage due to glioblastoma multiforme died. Five patients with posterior temporal hematomas recovered. In 7 patients with basal-inferior temporal hematomas, angiography showed aneurysms in 3 cases, angiomas in 2 cases and no vascular lesion in 2 cases. Of 23 cases with negative angiography and no systemic cause for temporal hematoma, 12 patients were hypertensive and 11 were normotensive. Ten hypertensive patients without evidence of chronic vascular disease had the largest hematomas, extending into the parietal or ganglionic regions. Seven of these patients died; 3 had residual deficit. Eleven normotensive and two hypertensive patients with evidence of chronic vascular change had smaller hematomas. They survived with good functional recovery. (orig.)

  3. Temporal Lobe Seizure

    Science.gov (United States)

... functions, including having odd feelings, such as euphoria, deja vu or fear. Temporal lobe seizures are sometimes called ... a sudden sense of unprovoked fear or joy; a deja vu experience, a feeling that what's happening has happened ...

  4. SHEEP TEMPORAL BONE

    Directory of Open Access Journals (Sweden)

    Kesavan

    2016-03-01

Full Text Available INTRODUCTION Human temporal bones are difficult to procure nowadays due to various ethical issues. The sheep temporal bone is a good alternative owing to its morphological similarities, ease of procurement and lower cost. Many middle ear exercises can be performed easily, and handling of instruments can be practised in procedures such as myringoplasty, tympanoplasty, stapedotomy, facial nerve dissection and some middle ear implants. This is useful for resident training programmes.

  5. PCA-induced respiratory depression simulating stroke following endoluminal repair of abdominal aortic aneurysm: a case report

    Directory of Open Access Journals (Sweden)

    Ahmad Javed

    2007-07-01

Full Text Available Abstract Aim: To report a case of severe respiratory depression with PCA fentanyl use simulating stroke in a patient who underwent routine elective endoluminal graft repair of an abdominal aortic aneurysm (AAA). Case presentation: A 78-year-old obese woman underwent routine endoluminal graft repair of an AAA that was progressively increasing in size. Following an uneventful operation, postoperative analgesia was managed with a patient-controlled analgesia (PCA) device delivering fentanyl. On the morning following the operation the patient was found to be unusually drowsy and unresponsive to stimuli. Her GCS was 11, with plantars upgoing bilaterally. A provisional diagnosis of stroke was made. Urgent transfer to a high-dependency unit (HDU) was arranged and she was given ventilatory support with a BiPAP device. CT was performed and found to be normal. Arterial blood gas (ABG) analysis showed respiratory acidosis with PaCO2 81 mmHg, PaO2 140 mmHg, pH 7.17 and base excess -2 mmol/l. A total dose of 600 mcg of fentanyl had been self-administered in the 16 hours following emergence from general anaesthesia. Naloxone was given with good effect. There was an increase in the creatinine level from 90 μmol/L preoperatively to 167 μmol/L on the first postoperative day. The patient remained on BiPAP for two days, which resulted in marked improvement in gas exchange. Recovery was complete.

  6. Forecasting East Asian Indices Futures via a Novel Hybrid of Wavelet-PCA Denoising and Artificial Neural Network Models

    Science.gov (United States)

    2016-01-01

    The motivation behind this research is to innovatively combine new methods like wavelet, principal component analysis (PCA), and artificial neural network (ANN) approaches to analyze trade in today’s increasingly difficult and volatile financial futures markets. The main focus of this study is to facilitate forecasting by using an enhanced denoising process on market data, taken as a multivariate signal, in order to deduct the same noise from the open-high-low-close signal of a market. This research offers evidence on the predictive ability and the profitability of abnormal returns of a new hybrid forecasting model using Wavelet-PCA denoising and ANN (named WPCA-NN) on futures contracts of Hong Kong’s Hang Seng futures, Japan’s NIKKEI 225 futures, Singapore’s MSCI futures, South Korea’s KOSPI 200 futures, and Taiwan’s TAIEX futures from 2005 to 2014. Using a host of technical analysis indicators consisting of RSI, MACD, MACD Signal, Stochastic Fast %K, Stochastic Slow %K, Stochastic %D, and Ultimate Oscillator, empirical results show that the annual mean returns of WPCA-NN are more than the threshold buy-and-hold for the validation, test, and evaluation periods; this is inconsistent with the traditional random walk hypothesis, which insists that mechanical rules cannot outperform the threshold buy-and-hold. The findings, however, are consistent with literature that advocates technical analysis. PMID:27248692
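
    A minimal sketch of the wavelet denoising plus PCA stage, assuming PyWavelets and scikit-learn and a synthetic open-high-low-close style series; the ANN forecasting stage, the technical indicators and the trading evaluation of WPCA-NN are not reproduced, and all parameters are illustrative assumptions.

```python
import numpy as np
import pywt
from sklearn.decomposition import PCA

def wavelet_denoise(x, wavelet="db4", level=3):
    """Soft-threshold the detail coefficients of a 1-D series (universal threshold)."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise estimate from finest scale
    thr = sigma * np.sqrt(2 * np.log(len(x)))
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(x)]

# Synthetic open-high-low-close style signal: a common trend plus channel-specific noise.
rng = np.random.default_rng(0)
n = 512
trend = np.cumsum(rng.normal(scale=0.5, size=n))
ohlc = np.column_stack([trend + rng.normal(scale=1.0, size=n) for _ in range(4)])

# Step 1: wavelet-denoise each channel.
denoised = np.column_stack([wavelet_denoise(ohlc[:, j]) for j in range(4)])

# Step 2: PCA on the denoised multivariate signal; keep the leading components and
# reconstruct, discarding the residual common noise captured by the minor components.
pca = PCA(n_components=2).fit(denoised)
reconstructed = pca.inverse_transform(pca.transform(denoised))

print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 3))
```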

  7. Regionalization and classification of bioclimatic zones in the central-northeastern region of Mexico using principal component analysis (PCA)

    Energy Technology Data Exchange (ETDEWEB)

    Pineda-Martinez, L.F.; Carbajal, N.; Medina-Roldan, E. [Instituto Potosino de Investigacion Cientifica y Tecnologica, A. C., San Luis Potosi (Mexico)]. E-mail: lpineda@ipicyt.edu.mx

    2007-04-15

Applying principal component analysis (PCA), we determined climate zones in a topographic gradient in the central-northeastern part of Mexico. We employed nearly 30 years of monthly temperature and precipitation data at 173 meteorological stations. The climate classification was carried out applying the Koeppen system modified for the conditions of Mexico. PCA indicates a regionalization in agreement with topographic characteristics and vegetation. We describe the different bioclimatic zones, associated with typical vegetation, for each climate using geographical information systems (GIS).

  8. PCA determination of the radiometric noise of high spectral resolution infrared observations from spectral residuals: Application to IASI

    Science.gov (United States)

    Serio, C.; Masiello, G.; Camy-Peyret, C.; Jacquette, E.; Vandermarcq, O.; Bermudo, F.; Coppens, D.; Tobin, D.

    2018-02-01

The problem of characterizing and estimating the instrumental or radiometric noise of satellite high spectral resolution infrared spectrometers directly from Earth observations is addressed in this paper. An approach has been developed which relies on Principal Component Analysis (PCA) with a suitable criterion to select the optimal number of PC scores. Different selection criteria have been set up and analysed, based on the estimation theory of Least Squares and/or the Maximum Likelihood Principle. The approach is independent of any forward model and/or radiative transfer calculations. The PCA is used to define an orthogonal basis, which, in turn, is used to derive an optimal linear reconstruction of the observations. The residual vector, that is the observation vector minus the calculated or reconstructed one, is then used to estimate the instrumental noise. It will be shown that the use of the spectral residuals to assess the radiometric instrumental noise leads to efficient estimators, which are largely independent of possible departures of the true noise from that assumed a priori to model the observational covariance matrix. Application to the Infrared Atmospheric Sounding Interferometer (IASI) has been considered. A series of case studies has been set up, which make use of IASI observations. As a major result, the analysis confirms the high stability and radiometric performance of IASI. The approach also proved to be efficient in characterizing noise features due to mechanical micro-vibrations of the beam splitter of the IASI instrument.
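
    The core idea, reconstructing each spectrum from a truncated principal-component basis and reading the noise off the residuals, can be sketched as follows on synthetic spectra; the score-selection criteria and the IASI specifics are not reproduced, and the number of retained components is simply fixed here.

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic "observations": smooth spectra living in a low-dimensional subspace
# plus white radiometric noise of known standard deviation.
rng = np.random.default_rng(0)
n_obs, n_chan, true_sigma = 2000, 300, 0.05
wavenumber = np.linspace(0, 1, n_chan)
basis = np.stack([np.sin((k + 1) * np.pi * wavenumber) for k in range(5)])   # 5 true modes
clean = rng.normal(size=(n_obs, 5)) @ basis
obs = clean + rng.normal(scale=true_sigma, size=clean.shape)

# Reconstruct each spectrum from a truncated PC basis (in practice the number of
# retained scores would be chosen by a selection criterion; here it is fixed).
pca = PCA(n_components=5).fit(obs)
recon = pca.inverse_transform(pca.transform(obs))

# The spectral residuals estimate the instrumental noise.
residual = obs - recon
print("estimated noise std:", round(residual.std(), 4), "(true:", true_sigma, ")")
```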

  9. Detection of l-Cysteine in wheat flour by Raman microspectroscopy combined chemometrics of HCA and PCA.

    Science.gov (United States)

    Cebi, Nur; Dogan, Canan Ekinci; Develioglu, Ayşen; Yayla, Mediha Esra Altuntop; Sagdic, Osman

    2017-08-01

l-Cysteine is deliberately added to various flour types since it enables favorable baking properties such as low viscosity, increased elasticity and rise during baking. In Turkey, the use of l-Cysteine as a food additive is not allowed in wheat flour according to the Turkish Food Codex Regulation on food additives. There is therefore an urgent need for effective methods to detect l-Cysteine in wheat flour. In this study, for the first time, a new, rapid, effective, non-destructive and cost-effective method was developed for detection of l-Cysteine in wheat flour using Raman microscopy. Detection of l-Cysteine in wheat flour was accomplished successfully using Raman microscopy combined with chemometrics of PCA (Principal Component Analysis) and HCA (Hierarchical Cluster Analysis). In this work, the 500-2000 cm⁻¹ spectral range (fingerprint region) was selected for the PCA and HCA analyses. l-Cysteine and l-Cystine were determined with a detection limit of 0.125% (w/w) in different wheat flour samples. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Radar target classification method with high accuracy and decision speed performance using MUSIC spectrum vectors and PCA projection

    Science.gov (United States)

    Secmen, Mustafa

    2011-10-01

    This paper introduces the performance of an electromagnetic target recognition method in resonance scattering region, which includes pseudo spectrum Multiple Signal Classification (MUSIC) algorithm and principal component analysis (PCA) technique. The aim of this method is to classify an "unknown" target as one of the "known" targets in an aspect-independent manner. The suggested method initially collects the late-time portion of noise-free time-scattered signals obtained from different reference aspect angles of known targets. Afterward, these signals are used to obtain MUSIC spectrums in real frequency domain having super-resolution ability and noise resistant feature. In the final step, PCA technique is applied to these spectrums in order to reduce dimensionality and obtain only one feature vector per known target. In the decision stage, noise-free or noisy scattered signal of an unknown (test) target from an unknown aspect angle is initially obtained. Subsequently, MUSIC algorithm is processed for this test signal and resulting test vector is compared with feature vectors of known targets one by one. Finally, the highest correlation gives the type of test target. The method is applied to wire models of airplane targets, and it is shown that it can tolerate considerable noise levels although it has a few different reference aspect angles. Besides, the runtime of the method for a test target is sufficiently low, which makes the method suitable for real-time applications.

  11. Estimating the number of components and detecting outliers using Angle Distribution of Loading Subspaces (ADLS) in PCA analysis.

    Science.gov (United States)

    Liu, Y J; Tran, T; Postma, G; Buydens, L M C; Jansen, J

    2018-08-22

    Principal Component Analysis (PCA) is widely used in analytical chemistry, to reduce the dimensionality of a multivariate data set in a few Principal Components (PCs) that summarize the predominant patterns in the data. An accurate estimate of the number of PCs is indispensable to provide meaningful interpretations and extract useful information. We show how existing estimates for the number of PCs may fall short for datasets with considerable coherence, noise or outlier presence. We present here how Angle Distribution of the Loading Subspaces (ADLS) can be used to estimate the number of PCs based on the variability of loading subspace across bootstrap resamples. Based on comprehensive comparisons with other well-known methods applied on simulated dataset, we show that ADLS (1) may quantify the stability of a PCA model with several numbers of PCs simultaneously; (2) better estimate the appropriate number of PCs when compared with the cross-validation and scree plot methods, specifically for coherent data, and (3) facilitate integrated outlier detection, which we introduce in this manuscript. We, in addition, demonstrate how the analysis of different types of real-life spectroscopic datasets may benefit from these advantages of ADLS. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
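
    The quantity underlying ADLS, the principal angles between loading subspaces fitted on bootstrap resamples, can be sketched as below with SciPy's subspace_angles; the actual ADLS criterion and its outlier detection are not reproduced, and the data and component counts are synthetic assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.linalg import subspace_angles

# Synthetic data with 3 strong components plus noise.
rng = np.random.default_rng(0)
n, p, k_true = 200, 20, 3
scores = rng.normal(size=(n, k_true)) * np.array([5.0, 3.0, 2.0])
loadings = np.linalg.qr(rng.normal(size=(p, k_true)))[0]
X = scores @ loadings.T + rng.normal(scale=0.5, size=(n, p))

def mean_max_angle(X, k, n_boot=50):
    """Mean largest principal angle (radians) between the loading subspaces of
    bootstrap resamples and of the full data, for a k-component PCA model."""
    ref = PCA(n_components=k).fit(X).components_.T
    angles = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(X), size=len(X))
        boot = PCA(n_components=k).fit(X[idx]).components_.T
        angles.append(subspace_angles(ref, boot).max())
    return np.mean(angles)

for k in range(1, 7):
    print(k, round(mean_max_angle(X, k), 3))
```

    In this toy setting the loading subspace typically stays stable up to the true number of components and becomes noticeably less stable beyond it, which is the kind of behaviour a subspace-angle criterion exploits.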

  12. An Improved Pathological Brain Detection System Based on Two-Dimensional PCA and Evolutionary Extreme Learning Machine.

    Science.gov (United States)

    Nayak, Deepak Ranjan; Dash, Ratnakar; Majhi, Banshidhar

    2017-12-07

Pathological brain detection has made notable strides in recent years, and as a consequence many pathological brain detection systems (PBDSs) have been proposed. However, the accuracy of these systems still needs significant improvement in order to meet the needs of real-world diagnostic situations. In this paper, an efficient PBDS based on MR images is proposed that markedly improves on recent results. The proposed system makes use of contrast-limited adaptive histogram equalization (CLAHE) to enhance the quality of the input MR images. Thereafter, a two-dimensional PCA (2DPCA) strategy is employed to extract the features and, subsequently, a PCA+LDA approach is used to generate a compact and discriminative feature set. Finally, a new learning algorithm called MDE-ELM is suggested that combines modified differential evolution (MDE) and the extreme learning machine (ELM) for segregation of MR images as pathological or healthy. The MDE is utilized to optimize the input weights and hidden biases of single-hidden-layer feed-forward neural networks (SLFN), whereas an analytical method is used for determining the output weights. The proposed algorithm performs optimization based on both the root mean squared error (RMSE) and the norm of the output weights of the SLFNs. The suggested scheme is benchmarked on three standard datasets and the results are compared against other competent schemes. The experimental outcomes show that the proposed scheme offers superior results compared to its counterparts. Further, it has been noticed that the proposed MDE-ELM classifier obtains better accuracy with a more compact network architecture than conventional algorithms.

  13. Antiallergic effect of fisetin on IgE-mediated mast cell activation in vitro and on passive cutaneous anaphylaxis (PCA).

    Science.gov (United States)

    Jo, Woo-Ri; Park, Hye-Jin

    2017-10-01

Fisetin (3,7,3',4'-tetrahydroxyflavone), a naturally occurring bioactive flavonoid, has been shown to inhibit inflammation. However, little is known about the effect of fisetin on immunoglobulin E (IgE)-mediated allergic responses. In this study, the effect of fisetin on rat basophilic leukemia (RBL-2H3) cell-mediated allergic reactions was investigated. Fisetin inhibited β-hexosaminidase release and decreased the levels of interleukin-4 and tumor necrosis factor-α mRNA in IgE/antigen (IgE/Ag)-stimulated RBL-2H3 cells. To elucidate the antiallergic mechanism, we examined the levels of signaling molecules responsible for degranulation and release of inflammatory cytokines. Fisetin decreased the levels of activated spleen tyrosine kinase, Gab2 proteins, linker of activated T cells and extracellular signal-regulated kinase 1/2 in the IgE/Ag-stimulated RBL-2H3 cells, and of NFκB and STAT3 proteins activated in the ear tissue of mice with passive cutaneous anaphylaxis (PCA). In addition, fisetin significantly lowered FcɛRI α-subunit mRNA expression. Consistent with the cellular data, fisetin markedly suppressed RBL-2H3 cell-dependent PCA in IgE/Ag-sensitized mice. These results suggest that fisetin may have potential as a therapeutic agent for the treatment of allergic diseases. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. A novel fusion method of improved adaptive LTP and two-directional two-dimensional PCA for face feature extraction

    Science.gov (United States)

    Luo, Yuan; Wang, Bo-yu; Zhang, Yi; Zhao, Li-ming

    2018-03-01

In this paper, a local three-value model, the improved adaptive local ternary pattern (IALTP), is proposed to address the fact that, under different illuminations and random noise, the local texture features of a face image cannot be completely described because the threshold of the local ternary pattern (LTP) cannot be calculated adaptively. Firstly, a difference function between the center pixel and the neighborhood pixel weights is established to obtain the statistical characteristics of the central pixel and the neighborhood pixels. Secondly, an adaptive gradient-descent iterative function is established to calculate the difference coefficient, which is defined as the threshold of the IALTP operator. Finally, the mean and standard deviation of the pixel weights of the local region are used as the coding mode of IALTP. In order to reflect the overall properties of the face and reduce the dimension of the features, two-directional two-dimensional PCA ((2D)2PCA) is adopted. The IALTP is used to extract local texture features of the eye and mouth areas. After combining the global features and local features, the fusion features (IALTP+) are obtained. The experimental results on the Extended Yale B and AR standard face databases indicate that, under different illuminations and random noise, the algorithm proposed in this paper is more robust than others, and the feature dimension is smaller. The shortest running time reaches 0.3296 s, and the highest recognition rate reaches 97.39%.

  15. Prebiotic Low Sugar Chocolate Dairy Desserts: Physical and Optical Characteristics and Performance of PARAFAC and PCA Preference Map.

    Science.gov (United States)

    Morais, E C; Esmerino, E A; Monteiro, R A; Pinheiro, C M; Nunes, C A; Cruz, A G; Bolini, Helena M A

    2016-01-01

    The addition of prebiotic and sweeteners in chocolate dairy desserts opens up new opportunities to develop dairy desserts that besides having a lower calorie intake still has functional properties. In this study, prebiotic low sugar dairy desserts were evaluated by 120 consumers using a 9-point hedonic scale, in relation to the attributes of appearance, aroma, flavor, texture, and overall liking. Internal preference map using parallel factor analysis (PARAFAC) and principal component analysis (PCA) was performed using the consumer data. In addition, physical (texture profile) and optical (instrumental color) analyses were also performed. Prebiotic dairy desserts containing sucrose and sucralose were equally liked by the consumers. These samples were characterized by firmness and gumminess, which can be considered drivers of liking by the consumers. Optimization of the prebiotic low sugar dessert formulation should take in account the choice of ingredients that contribute in a positive manner for these parameters. PARAFAC allowed the extraction of more relevant information in relation to PCA, demonstrating that consumer acceptance analysis can be evaluated by simultaneously considering several attributes. Multiple factor analysis reported Rv value of 0.964, suggesting excellent concordance for both methods. © 2015 Institute of Food Technologists®

  16. An Investigation of GIS Overlay and PCA Techniques for Urban Environmental Quality Assessment: A Case Study in Toronto, Ontario, Canada

    Directory of Open Access Journals (Sweden)

    Kamil Faisal

    2017-03-01

Full Text Available The United Nations estimates that the global population is going to double in the coming 40 years, which may cause a negative impact on the environment and human life. Such an impact may instigate increased water demand, overuse of power, anthropogenic noise, etc. Thus, modelling the Urban Environmental Quality (UEQ) becomes indispensable for better city planning and efficient urban sprawl control. This study aims to investigate the ability of remote sensing and Geographic Information System (GIS) techniques to model the UEQ, with a case study in the city of Toronto, via deriving different environmental, urban and socio-economic parameters. Remote sensing, GIS and census data were first obtained to derive environmental, urban and socio-economic parameters. Two techniques, GIS overlay and Principal Component Analysis (PCA), were used to integrate all of these environmental, urban and socio-economic parameters. Socio-economic parameters including family income, higher education and land value were used as a reference to assess the outcomes derived from the two integration methods. The outcomes were assessed by evaluating the relationship between the extracted UEQ results and the reference layers. Preliminary findings showed that the GIS overlay achieves better precision and accuracy (71% and 65%, respectively) compared to the PCA technique. The outcomes of the research can serve as a generic indicator to help the authorities with better city planning, taking into consideration all possible social, environmental and urban requirements or constraints.

  17. Retrieval of spheroid particle size distribution from spectral extinction data in the independent mode using PCA approach

    International Nuclear Information System (INIS)

    Tang, Hong; Lin, Jian-Zhong

    2013-01-01

First, an improved anomalous diffraction approximation (ADA) method is presented for calculating the extinction efficiency of spheroids. In this approach, the extinction efficiency of spheroid particles can be calculated with good accuracy and high efficiency over a wider size range by combining the Latimer method and ADA theory, and the method provides a more general expression for calculating the extinction efficiency of spheroid particles with various complex refractive indices and aspect ratios. Meanwhile, the visible spectral extinction for varied spheroid particle size distributions and complex refractive indices is surveyed. Furthermore, a selection principle for the spectral extinction data is developed based on PCA (principal component analysis) of the first-derivative spectral extinction. By calculating the contribution rate of the first-derivative spectral extinction, the spectra with more significant features can be selected as the input data, while those with fewer features are removed from the inversion data. In addition, we propose an improved Tikhonov iteration method to retrieve the spheroid particle size distributions in the independent mode. Simulation experiments indicate that the spheroid particle size distributions obtained with the proposed method coincide fairly well with the given distributions, and this inversion method provides a simple, reliable and efficient way to retrieve spheroid particle size distributions from spectral extinction data. -- Highlights: ► An improved ADA is presented for calculating the extinction efficiency of spheroids. ► A selection principle for the spectral extinction data is developed based on PCA. ► An improved Tikhonov iteration method is proposed to retrieve the spheroid PSD.
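
    A minimal sketch of a Tikhonov-regularized retrieval of a discretized size distribution from spectral extinction data; the kernel below is a smooth stand-in rather than the improved ADA expression, the regularization is the plain (non-iterative) form, and all quantities are synthetic assumptions.

```python
import numpy as np

# Discretized forward problem: measured spectral extinction tau = K @ f, where
# K[i, j] is a (synthetic) extinction efficiency of size bin j at wavelength i
# and f is the particle size distribution to retrieve.
rng = np.random.default_rng(0)
n_wl, n_size = 40, 60
wl = np.linspace(0.4, 0.8, n_wl)[:, None]           # wavelengths (um)
r = np.linspace(0.1, 3.0, n_size)[None, :]          # particle radii (um)
K = 2 - 2 * np.sinc(2 * r / wl)                     # smooth stand-in kernel (not ADA)

f_true = np.exp(-0.5 * ((r.ravel() - 1.2) / 0.3) ** 2)   # Gaussian-shaped distribution
tau = K @ f_true + rng.normal(scale=0.01, size=n_wl)

# Tikhonov-regularized solution: minimize ||K f - tau||^2 + lam * ||f||^2.
lam = 1e-2
f_hat = np.linalg.solve(K.T @ K + lam * np.eye(n_size), K.T @ tau)
f_hat = np.clip(f_hat, 0, None)                     # enforce nonnegativity

rel_err = np.linalg.norm(f_hat - f_true) / np.linalg.norm(f_true)
print("relative retrieval error:", round(rel_err, 3))
```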

  18. Constrained optimization via simulation models for new product innovation

    Science.gov (United States)

    Pujowidianto, Nugroho A.

    2017-11-01

We consider the problem of constrained optimization in which decision makers aim to optimize a primary performance measure while constraining secondary performance measures. This paper provides a brief overview of stochastically constrained optimization via discrete-event simulation. Most review papers tend to be methodology-based; this review attempts to be problem-based, as decision makers may have already decided on the problem formulation. We consider constrained optimization models because there are usually constraints on secondary performance measures as trade-offs in new product development. The paper starts by laying out the different possible methods and the reasons for using constrained optimization via simulation models. This is followed by a review of different simulation optimization approaches to constrained optimization, depending on the number of decision variables, the type of constraints, and the risk preferences of the decision makers in handling uncertainties.

  19. Mapping the Wetland Vegetation Communities of the Australian Great Artesian Basin Springs Using SAM, Mtmf and Spectrally Segmented PCA Hyperspectral Analyses

    Science.gov (United States)

    White, D. C.; Lewis, M. M.

    2012-07-01

The Australian Great Artesian Basin (GAB) supports a unique and diverse range of groundwater-dependent wetland ecosystems termed GAB springs. In recent decades the ecological sustainability of the springs has become uncertain as demands on this iconic groundwater resource increase. The impacts of existing water extractions for mining and pastoral activities are unknown, and this situation is compounded by the likelihood of increasing future demand for extractions. Hyperspectral remote sensing provides the necessary spectral and spatial detail to discriminate wetland vegetation communities. The objectives of this paper are therefore to discriminate the spatial extent and distribution of key spring wetland vegetation communities associated with the GAB springs by evaluating three hyperspectral techniques: Spectral Angle Mapper (SAM), Mixture Tuned Matched Filtering (MTMF) and Spectrally Segmented PCA; and to determine whether the hyperspectral techniques developed can be applied at a number of sites representative of the range of spring formations and geomorphic settings and at two temporal intervals. Two epochs of HyMap airborne hyperspectral imagery were captured for this research, in March 2009 and April 2011, at a number of sites representative of the floristic and geomorphic diversity of GAB spring groups/complexes within South Australia. Colour digital aerial photography at 30 cm GSD was acquired concurrently with the HyMap imagery. The image acquisition coincided with a field campaign of spectroradiometry measurements and a botanical survey. To identify key wavebands with the greatest capability to discriminate vegetation communities of the GAB springs and surrounding area, three hyperspectral data reduction techniques were employed: (i) Spectrally Segmented PCA (SSPCA); (ii) the Minimum Noise Fraction (MNF) transform; and (iii) the Pixel Purity Index (PPI). SSPCA was applied to NDVI-masked vegetation portions of the HyMap imagery with wavelength regions spectrally
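
    Of the three techniques named above, SAM is the simplest to sketch: it classifies each pixel by the angle between its spectrum and a set of reference endmember spectra. The example below uses synthetic spectra and hypothetical endmembers; MTMF and SSPCA are not shown.

```python
import numpy as np

def spectral_angle(pixels, endmembers):
    """Spectral angle (radians) between each pixel spectrum and each endmember.
    pixels: (n_pixels, n_bands), endmembers: (n_classes, n_bands)."""
    p = pixels / np.linalg.norm(pixels, axis=1, keepdims=True)
    e = endmembers / np.linalg.norm(endmembers, axis=1, keepdims=True)
    cos = np.clip(p @ e.T, -1.0, 1.0)
    return np.arccos(cos)

# Synthetic example: 3 reference endmember spectra and noisy pixels drawn from them.
rng = np.random.default_rng(0)
n_bands = 120
endmembers = rng.random((3, n_bands))
labels_true = rng.integers(0, 3, size=500)
pixels = endmembers[labels_true] + rng.normal(scale=0.05, size=(500, n_bands))

angles = spectral_angle(pixels, endmembers)
labels_sam = angles.argmin(axis=1)          # assign each pixel to the closest endmember
print("SAM agreement:", (labels_sam == labels_true).mean())
```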

  20. Wave speed in excitable random networks with spatially constrained connections.

    Directory of Open Access Journals (Sweden)

    Nikita Vladimirov

Full Text Available Very fast oscillations (VFO) in neocortex are widely observed before epileptic seizures, and there is growing evidence that they are caused by networks of pyramidal neurons connected by gap junctions between their axons. We are motivated by the spatio-temporal waves of activity recorded using electrocorticography (ECoG), and study the speed of activity propagation through a network of neurons axonally coupled by gap junctions. We simulate wave propagation by excitable cellular automata (CA) on random (Erdös-Rényi) networks of a special type, with spatially constrained connections. From the cellular automaton model, we derive a mean field theory to predict wave propagation. The governing equation, when resolved by the Fisher-Kolmogorov PDE, fails to describe the wave speed. A new (hyperbolic) PDE is suggested, which provides an adequate wave speed v(⟨k⟩) that saturates with the network degree ⟨k⟩, in agreement with intuitive expectations and CA simulations. We further show that the maximum length of connection is a much better predictor of the wave speed than the mean length. When tested in networks with various degree distributions, wave speeds are found to depend strongly on the ratio of network moments ⟨k²⟩/⟨k⟩ rather than on the mean degree ⟨k⟩, which is explained by general network theory. The wave speeds are strikingly similar in a diverse set of networks, including regular, Poisson, exponential and power-law distributions, supporting our theory for various network topologies. Our results suggest practical predictions for networks of electrically coupled neurons, and our mean field method can be readily applied to a wide class of similar problems, such as the spread of epidemics through spatial networks.

  1. Constraining the SIF - GPP relationship via estimation of NPQ

    Science.gov (United States)

    Silva, C. E.; Yang, X.; Tang, J.; Lee, J. E.; Cushman, K.; Toh Yuan Kun, L.; Kellner, J. R.

    2016-12-01

Airborne and satellite measurements of solar-induced fluorescence (SIF) have the potential to improve estimates of gross primary production (GPP). Plants dissipate absorbed photosynthetically active radiation (APAR) among three de-excitation pathways: SIF; photochemical quenching (PQ), which results in electron transport and the production of ATP and NADPH consumed during carbon fixation (i.e., GPP); and heat dissipation via conversion of xanthophyll pigments (non-photochemical quenching: NPQ). As a result, the relationship between SIF and GPP is a function of NPQ and may vary temporally and spatially with environmental conditions (e.g., light and water availability) and plant traits (e.g., leaf N content). Accurate estimates of any one of the de-excitation pathways require measurement of the other two. Here we combine half-hourly measurements of canopy APAR and SIF with eddy covariance estimates of GPP at Harvard Forest to close the canopy radiation budget and infer canopy NPQ throughout the 2013 growing season. We use molecular-level photosynthesis equations to compute PQ (umol photons m-2 s-1) from GPP (umol CO2 m-2 s-1) and invert an integrated canopy radiative transfer and leaf-level photosynthesis/fluorescence model (SCOPE) to quantify hemispherically and spectrally integrated SIF emission (umol photons m-2 s-1) from single-band (760 nm) top-of-canopy SIF measurements. We estimate half-hourly NPQ as the residual required to close the radiation budget (NPQ = APAR - SIF - PQ). Our future work will test the estimated NPQ against simultaneously acquired measurements of the photochemical reflectance index (PRI), a spectral index sensitive to xanthophyll pigments. By constraining two of the three de-excitation pathways, simultaneous SIF and PRI measurements are likely to improve GPP estimates, which are crucial to the study of climate-carbon cycle interactions.
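
    A minimal sketch of the radiation-budget closure NPQ = APAR - SIF - PQ on hypothetical half-hourly values; the conversion of GPP to photon units uses an assumed fixed quantum requirement purely for illustration, whereas the study uses molecular-level photosynthesis equations.

```python
import numpy as np

# Hypothetical half-hourly values, all in umol photons m^-2 s^-1 except GPP.
apar = np.array([900.0, 1200.0, 1500.0])     # absorbed PAR
sif  = np.array([1.2, 1.6, 1.9])             # hemispherically/spectrally integrated SIF
gpp  = np.array([18.0, 22.0, 25.0])          # umol CO2 m^-2 s^-1 (eddy covariance)

# Convert GPP to photochemical quenching in photon units using an assumed
# quantum requirement (photons absorbed per CO2 fixed); illustration only.
QUANTUM_REQUIREMENT = 10.0
pq = gpp * QUANTUM_REQUIREMENT

# Close the canopy radiation budget: NPQ is the residual.
npq = apar - sif - pq
print("NPQ:", np.round(npq, 1), "umol photons m^-2 s^-1")
print("NPQ fraction of APAR:", np.round(npq / apar, 2))
```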

  2. Anatomically constrained dipole adjustment (ANACONDA) for accurate MEG/EEG focal source localizations

    Science.gov (United States)

    Im, Chang-Hwan; Jung, Hyun-Kyo; Fujimaki, Norio

    2005-10-01

    This paper proposes an alternative approach to enhance localization accuracy of MEG and EEG focal sources. The proposed approach assumes anatomically constrained spatio-temporal dipoles, initial positions of which are estimated from local peak positions of distributed sources obtained from a pre-execution of distributed source reconstruction. The positions of the dipoles are then adjusted on the cortical surface using a novel updating scheme named cortical surface scanning. The proposed approach has many advantages over the conventional ones: (1) as the cortical surface scanning algorithm uses spatio-temporal dipoles, it is robust with respect to noise; (2) it requires no a priori information on the numbers and initial locations of the activations; (3) as the locations of dipoles are restricted only on a tessellated cortical surface, it is physiologically more plausible than the conventional ECD model. To verify the proposed approach, it was applied to several realistic MEG/EEG simulations and practical experiments. From the several case studies, it is concluded that the anatomically constrained dipole adjustment (ANACONDA) approach will be a very promising technique to enhance accuracy of focal source localization which is essential in many clinical and neurological applications of MEG and EEG.

  3. Anatomically constrained dipole adjustment (ANACONDA) for accurate MEG/EEG focal source localizations

    International Nuclear Information System (INIS)

    Im, Chang-Hwan; Jung, Hyun-Kyo; Fujimaki, Norio

    2005-01-01

    This paper proposes an alternative approach to enhance localization accuracy of MEG and EEG focal sources. The proposed approach assumes anatomically constrained spatio-temporal dipoles, initial positions of which are estimated from local peak positions of distributed sources obtained from a pre-execution of distributed source reconstruction. The positions of the dipoles are then adjusted on the cortical surface using a novel updating scheme named cortical surface scanning. The proposed approach has many advantages over the conventional ones: (1) as the cortical surface scanning algorithm uses spatio-temporal dipoles, it is robust with respect to noise; (2) it requires no a priori information on the numbers and initial locations of the activations; (3) as the locations of dipoles are restricted only on a tessellated cortical surface, it is physiologically more plausible than the conventional ECD model. To verify the proposed approach, it was applied to several realistic MEG/EEG simulations and practical experiments. From the several case studies, it is concluded that the anatomically constrained dipole adjustment (ANACONDA) approach will be a very promising technique to enhance accuracy of focal source localization which is essential in many clinical and neurological applications of MEG and EEG

  4. PAPR-Constrained Pareto-Optimal Waveform Design for OFDM-STAP Radar

    Energy Technology Data Exchange (ETDEWEB)

    Sen, Satyabrata [ORNL

    2014-01-01

    We propose a peak-to-average power ratio (PAPR) constrained Pareto-optimal waveform design approach for an orthogonal frequency division multiplexing (OFDM) radar signal to detect a target using the space-time adaptive processing (STAP) technique. The use of an OFDM signal not only increases the frequency diversity of our system, but also enables us to adaptively design the OFDM coefficients in order to further improve the system performance. First, we develop a parametric OFDM-STAP measurement model by considering the effects of signal-dependent clutter and colored noise. Then, we observe that the resulting STAP performance can be improved by maximizing the output signal-to-interference-plus-noise ratio (SINR) with respect to the signal parameters. However, in practical scenarios, the computation of output SINR depends on the estimated values of the spatial and temporal frequencies and target scattering responses. Therefore, we formulate a PAPR-constrained multi-objective optimization (MOO) problem to design the OFDM spectral parameters by simultaneously optimizing four objective functions: maximizing the output SINR, minimizing two separate Cramer-Rao bounds (CRBs) on the normalized spatial and temporal frequencies, and minimizing the trace of the CRB matrix on the target scattering coefficient estimates. We present several numerical examples to demonstrate the achieved performance improvement due to the adaptive waveform design.
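    As a small, self-contained illustration of the PAPR constraint itself (not the paper's Pareto-optimal design), the sketch below computes the PAPR of an OFDM symbol from its spectral coefficients and iteratively clips the time-domain peaks until an assumed PAPR bound is met; the coefficient values and the 6 dB target are placeholders.

```python
# Sketch: PAPR of an OFDM symbol and naive iterative clipping to meet a PAPR bound.
import numpy as np

def papr_db(freq_coeffs):
    x = np.fft.ifft(freq_coeffs)                 # time-domain OFDM symbol
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def clip_to_papr(freq_coeffs, max_papr_db=6.0, shrink=0.95, max_iter=500):
    c = np.asarray(freq_coeffs, dtype=complex).copy()
    for _ in range(max_iter):
        if papr_db(c) <= max_papr_db:
            break
        x = np.fft.ifft(c)
        mag = np.abs(x)
        thresh = shrink * mag.max()              # pull the largest peaks down
        scale = np.minimum(1.0, thresh / np.maximum(mag, 1e-12))
        c = np.fft.fft(x * scale)                # back to spectral (design) coefficients
    return c

rng = np.random.default_rng(0)
coeffs = rng.choice([-1.0, 1.0], 64) + 1j * rng.choice([-1.0, 1.0], 64)  # QPSK-like weights
print(papr_db(coeffs), papr_db(clip_to_papr(coeffs)))
```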

  5. Challenges in constraining anthropogenic aerosol effects on cloud radiative forcing using present-day spatiotemporal variability.

    Science.gov (United States)

    Ghan, Steven; Wang, Minghuai; Zhang, Shipeng; Ferrachat, Sylvaine; Gettelman, Andrew; Griesfeller, Jan; Kipling, Zak; Lohmann, Ulrike; Morrison, Hugh; Neubauer, David; Partridge, Daniel G; Stier, Philip; Takemura, Toshihiko; Wang, Hailong; Zhang, Kai

    2016-05-24

    A large number of processes are involved in the chain from emissions of aerosol precursor gases and primary particles to impacts on cloud radiative forcing. Those processes are manifest in a number of relationships that can be expressed as factors dlnX/dlnY driving aerosol effects on cloud radiative forcing. These factors include the relationships between cloud condensation nuclei (CCN) concentration and emissions, droplet number and CCN concentration, cloud fraction and droplet number, cloud optical depth and droplet number, and cloud radiative forcing and cloud optical depth. The relationship between cloud optical depth and droplet number can be further decomposed into the sum of two terms involving the relationship of droplet effective radius and cloud liquid water path with droplet number. These relationships can be constrained using observations of recent spatial and temporal variability of these quantities. However, we are most interested in the radiative forcing since the preindustrial era. Because few relevant measurements are available from that era, relationships from recent variability have been assumed to be applicable to the preindustrial to present-day change. Our analysis of Aerosol Comparisons between Observations and Models (AeroCom) model simulations suggests that estimates of relationships from recent variability are poor constraints on relationships from anthropogenic change for some terms, with even the sign of some relationships differing in many regions. Proxies connecting recent spatial/temporal variability to anthropogenic change, or sustained measurements in regions where emissions have changed, are needed to constrain estimates of anthropogenic aerosol impacts on cloud radiative forcing.
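    For concreteness, one such factor can be estimated from present-day variability as the slope of a log-log regression; the sketch below does this for a hypothetical dln(Nd)/dln(CCN) term with synthetic data (the variable names, values and the 0.45 slope are illustrative, not AeroCom output).

```python
# Sketch: estimate a dlnX/dlnY factor (here dln Nd / dln CCN) as a log-log regression slope.
import numpy as np

rng = np.random.default_rng(1)
ccn = rng.lognormal(mean=5.0, sigma=0.6, size=500)            # CCN concentration (cm-3), synthetic
nd = 80.0 * ccn ** 0.45 * rng.lognormal(0.0, 0.2, size=500)   # droplet number with scatter

slope, intercept = np.polyfit(np.log(ccn), np.log(nd), 1)
print("dln(Nd)/dln(CCN) ~", round(slope, 3))                  # recovers roughly 0.45
```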

  6. Reflected stochastic differential equation models for constrained animal movement

    Science.gov (United States)

    Hanks, Ephraim M.; Johnson, Devin S.; Hooten, Mevin B.

    2017-01-01

    Movement for many animal species is constrained in space by barriers such as rivers, shorelines, or impassable cliffs. We develop an approach for modeling animal movement constrained in space by considering a class of constrained stochastic processes, reflected stochastic differential equations. Our approach generalizes existing methods for modeling unconstrained animal movement. We present methods for simulation and inference based on augmenting the constrained movement path with a latent unconstrained path and illustrate this augmentation with a simulation example and an analysis of telemetry data from a Steller sea lion (Eumetopias jubatus) in southeast Alaska.
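    A one-dimensional Euler-Maruyama sketch of the reflection idea is shown below (illustrative only, not the authors' model or data): the latent unconstrained step is proposed first and then reflected at a barrier at x = 0, so the constrained path never crosses the shoreline.

```python
# Sketch: simulate a 1-D reflected stochastic differential equation by reflecting
# a latent unconstrained Euler-Maruyama step at the barrier x = 0.
import numpy as np

def simulate_reflected(x0=1.0, mu=-0.05, sigma=0.4, dt=0.1, n_steps=1000, seed=0):
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps + 1)
    x[0] = x0
    for t in range(n_steps):
        latent = x[t] + mu * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        x[t + 1] = abs(latent)        # reflection keeps the path in [0, inf)
    return x

path = simulate_reflected()
print(path.min() >= 0.0, path[:5])
```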

  7. Resource Management in Constrained Dynamic Situations

    Science.gov (United States)

    Seok, Jinwoo

    Resource management is considered in this dissertation for systems with limited resources, possibly combined with other system constraints, in unpredictably dynamic environments. Resources may represent fuel, power, capabilities, energy, and so on. Resource management is important for many practical systems; usually, resources are limited, and their use must be optimized. Furthermore, systems are often constrained, and constraints must be satisfied for safe operation. Simplistic resource management can result in poor use of resources and failure of the system. Furthermore, many real-world situations involve dynamic environments. Many traditional problems are formulated based on the assumptions of given probabilities or perfect knowledge of future events. However, in many cases, the future is completely unknown, and information on or probabilities about future events are not available. In other words, we operate in unpredictably dynamic situations. Thus, a method is needed to handle dynamic situations without knowledge of the future, but few formal methods have been developed to address them. Thus, the goal is to design resource management methods for constrained systems, with limited resources, in unpredictably dynamic environments. To this end, resource management is organized hierarchically into two levels: 1) planning, and 2) control. In the planning level, the set of tasks to be performed is scheduled based on limited resources to maximize resource usage in unpredictably dynamic environments. In the control level, the system controller is designed to follow the schedule by considering all the system constraints for safe and efficient operation. Consequently, this dissertation is mainly divided into two parts: 1) planning level design, based on finite state machines, and 2) control level methods, based on model predictive control. We define a recomposable restricted finite state machine to handle limited resource situations and unpredictably dynamic environments

  8. Management of Localized Prostate Cancer by Focal Transurethral Resection of Prostate Cancer: An Application of Radical TUR-PCa to Focal Therapy

    Directory of Open Access Journals (Sweden)

    Masaru Morita

    2012-01-01

    Full Text Available Background. We analyzed radical TUR-PCa against localized prostate cancer. Patients and Methods. Seventy-nine out of 209 patients with prostate cancer in one lobe were studied. Patients’ ages ranged from 58 to 91 years and preoperative PSA from 0.70 to 17.30 ng/mL. In 16 additional patients we performed focal TUR-PCa; their ages ranged from 51 to 87 years and preoperative PSA from 1.51 to 25.74 ng/mL. Results. PSA failure in radical TUR-PCa was 5.1% during the mean follow-up period of 58.9 months. The actuarial biochemical non-recurrence rate was 98.2% for pT2a and 90.5% for pT2b. Bladder neck contracture occurred in 28 patients (35.4%). In the 209 patients, pathological study revealed prostate cancer of the peripheral zone near the neurovascular bundle bilaterally in 25%, unilaterally in 39% and no cancer bilaterally in 35%, suggesting the possibility of focal TUR-PCa. Postoperative PSA of the 16 patients treated by focal TUR-PCa was stable between 0.007 and 0.406 ng/mL at 24.2 months’ follow-up. No patients suffered from urinary incontinence. Bladder neck contracture developed in only 1 patient, and none of the 5 patients who underwent nerve-preserving TUR-PCa showed erectile dysfunction. Conclusion. Focal TUR-PCa was considered to be a promising option among focal therapies against localized prostate cancer.

  9. Management of Localized Prostate Cancer by Focal Transurethral Resection of Prostate Cancer: An Application of Radical TUR-PCa to Focal Therapy.

    Science.gov (United States)

    Morita, Masaru; Matsuura, Takeshi

    2012-01-01

    Background. We analyzed radical TUR-PCa against localized prostate cancer. Patients and Methods. Seventy-nine out of 209 patients with prostate cancer in one lobe were studied. Patients' ages ranged from 58 to 91 years and preoperative PSA from 0.70 to 17.30 ng/mL. In 16 additional patients we performed focal TUR-PCa; their ages ranged from 51 to 87 years and preoperative PSA from 1.51 to 25.74 ng/mL. Results. PSA failure in radical TUR-PCa was 5.1% during the mean follow-up period of 58.9 months. The actuarial biochemical non-recurrence rate was 98.2% for pT2a and 90.5% for pT2b. Bladder neck contracture occurred in 28 patients (35.4%). In the 209 patients, pathological study revealed prostate cancer of the peripheral zone near the neurovascular bundle bilaterally in 25%, unilaterally in 39% and no cancer bilaterally in 35%, suggesting the possibility of focal TUR-PCa. Postoperative PSA of the 16 patients treated by focal TUR-PCa was stable between 0.007 and 0.406 ng/mL at 24.2 months' follow-up. No patients suffered from urinary incontinence. Bladder neck contracture developed in only 1 patient, and none of the 5 patients who underwent nerve-preserving TUR-PCa showed erectile dysfunction. Conclusion. Focal TUR-PCa was considered to be a promising option among focal therapies against localized prostate cancer.

  10. Tracking polychlorinated biphenyls (PCBs) congener patterns in Newark Bay surface sediment using principal component analysis (PCA) and positive matrix factorization (PMF).

    Science.gov (United States)

    Saba, Tarek; Su, Steave

    2013-09-15

    PCB congener data for Newark Bay surface sediments were analyzed using PCA and PMF, and relationships between the outcomes from these two techniques were explored. The PCA scores plot separated the Lower Passaic River Mouth samples from North Newark Bay, thus indicating dissimilarity. Although PCA was able to identify subareas in the Bay system with specific PCB congener patterns (e.g., higher-chlorinated congeners in the Elizabeth River), further conclusions regarding potential PCB source profiles or potential upland source areas were not clear from the PCA scores plot. PMF identified five source factors and explained the Bay sample congener profiles as a mix of these factors. This PMF solution was equivalent to (1) defining an envelope that encompasses all samples on the PCA scores plot, (2) defining source factors that plot on that envelope, and (3) explaining the congener profile for each Bay sediment sample (inside the scores plot envelope) as a mix of factors. PMF analysis identified characteristic features in the source factor congener distributions that allowed tracking of source factors to shoreline areas where PCB inputs to the Bay may have originated. The combined analysis from PCA and PMF showed that direct discharges to the Bay are likely the dominant sources of PCBs to the sediment. Review of historical upland activities and regulatory files will be needed, in addition to the PCA and PMF analysis, to fully reconstruct the history of operations and PCB releases around the Newark Bay area that impacted the Bay sediment. Copyright © 2013 Elsevier B.V. All rights reserved.
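    The complementary roles of the two factorizations can be sketched as follows. This is an illustration with synthetic congener profiles, not the Newark Bay data, and PMF is stood in for by scikit-learn's non-negative matrix factorization (true PMF additionally weights each entry by its measurement uncertainty).

```python
# Sketch: PCA scores plot vs. PMF-style factor profiles on a samples-by-congeners matrix.
import numpy as np
from sklearn.decomposition import PCA, NMF

rng = np.random.default_rng(0)
profiles = rng.dirichlet(np.ones(20), size=5)          # 5 hypothetical source factor profiles
mix = rng.dirichlet(np.ones(5), size=120)              # per-sample mixing proportions
X = mix @ profiles + 0.01 * rng.random((120, 20))      # congener fraction matrix with noise

scores = PCA(n_components=2).fit_transform(X)          # ordination for the scores plot
nmf = NMF(n_components=5, init="nndsvda", max_iter=1000, random_state=0).fit(X)
contributions, factors = nmf.transform(X), nmf.components_

print(scores[:3])
print(factors.shape)    # (5, 20): factor congener distributions to compare against source areas
```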

  11. Multiple concurrent temporal recalibrations driven by audiovisual stimuli with apparent physical differences.

    Science.gov (United States)

    Yuan, Xiangyong; Bi, Cuihua; Huang, Xiting

    2015-05-01

    Out-of-synchrony experiences can easily recalibrate one's subjective simultaneity point in the direction of the experienced asynchrony. Although temporal adjustment of multiple audiovisual stimuli has been recently demonstrated to be spatially specific, perceptual grouping processes that organize separate audiovisual stimuli into distinctive "objects" may play a more important role in forming the basis for subsequent multiple temporal recalibrations. We investigated whether apparent physical differences between audiovisual pairs that make them distinct from each other can independently drive multiple concurrent temporal recalibrations regardless of spatial overlap. Experiment 1 verified that reducing the physical difference between two audiovisual pairs diminishes the multiple temporal recalibrations by exposing observers to two utterances with opposing temporal relationships spoken by one single speaker rather than two distinct speakers at the same location. Experiment 2 found that increasing the physical difference between two stimuli pairs can promote multiple temporal recalibrations by complicating their non-temporal dimensions (e.g., disks composed of two rather than one attribute and tones generated by multiplying two frequencies); however, these recalibration aftereffects were subtle. Experiment 3 further revealed that making the two audiovisual pairs differ in temporal structures (one transient and one gradual) was sufficient to drive concurrent temporal recalibration. These results confirm that the more audiovisual pairs physically differ, especially in temporal profile, the more likely multiple temporal perception adjustments will be content-constrained regardless of spatial overlap. These results indicate that multiple temporal recalibrations are based secondarily on the outcome of perceptual grouping processes.

  12. Sequential unconstrained minimization algorithms for constrained optimization

    International Nuclear Information System (INIS)

    Byrne, Charles

    2008-01-01

    The problem of minimizing a function f(x): R^J → R, subject to constraints on the vector variable x, occurs frequently in inverse problems. Even without constraints, finding a minimizer of f(x) may require iterative methods. We consider here a general class of iterative algorithms that find a solution to the constrained minimization problem as the limit of a sequence of vectors, each solving an unconstrained minimization problem. Our sequential unconstrained minimization algorithm (SUMMA) is an iterative procedure for constrained minimization. At the kth step we minimize the function G_k(x) = f(x) + g_k(x) to obtain x^k. The auxiliary functions g_k(x): D ⊂ R^J → R_+ are nonnegative on the set D, each x^k is assumed to lie within D, and the objective is to minimize the continuous function f: R^J → R over x in the set C = D-bar, the closure of D. We assume that such minimizers exist and denote one by x̂. We assume that the functions g_k(x) satisfy the inequalities 0 ≤ g_k(x) ≤ G_{k-1}(x) - G_{k-1}(x^{k-1}), for k = 2, 3, .... Using this assumption, we show that the sequence {f(x^k)} is decreasing and converges to f(x̂). If the restriction of f(x) to D has bounded level sets, which happens if x̂ is unique and f(x) is closed, proper and convex, then the sequence {x^k} is bounded, and f(x*) = f(x̂) for any cluster point x*. Therefore, if x̂ is unique, x* = x̂ and {x^k} → x̂. When x̂ is not unique, convergence can still be obtained in particular cases. The SUMMA includes, as particular cases, the well-known barrier- and penalty-function methods, the simultaneous multiplicative algebraic reconstruction technique (SMART), the proximal minimization algorithm of Censor and Zenios, the entropic proximal methods of Teboulle, as well as certain cases of gradient descent and the Newton-Raphson method. The proof techniques used for SUMMA can be extended to obtain related results.
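    A compact sketch of the SUMMA idea, using the quadratic-penalty special case mentioned in the abstract, is given below; the objective, the constraint and the penalty schedule are illustrative choices, and each iterate x^k is obtained with an off-the-shelf unconstrained minimizer.

```python
# Sketch: sequential unconstrained minimization via an increasing quadratic penalty,
# i.e. each iterate minimizes G_k(x) = f(x) + g_k(x) without constraints.
import numpy as np
from scipy.optimize import minimize

f = lambda x: (x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2        # objective
violation = lambda x: max(0.0, x[0] + x[1] - 0.5)          # constraint: x0 + x1 <= 0.5

x = np.zeros(2)
for k in range(1, 8):
    weight = 10.0 ** k                                      # penalty weight grows with k
    G_k = lambda x, w=weight: f(x) + w * violation(x) ** 2  # G_k(x) = f(x) + g_k(x)
    x = minimize(G_k, x, method="Nelder-Mead").x            # unconstrained sub-problem
print(x, x.sum())   # approaches the constrained minimizer, roughly (1.75, -1.25)
```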

  13. Explaining evolution via constrained persistent perfect phylogeny

    Science.gov (United States)

    2014-01-01

    Background The perfect phylogeny is an often used model in phylogenetics since it provides an efficient basic procedure for representing the evolution of genomic binary characters in several frameworks, such as for example in haplotype inference. The model, which is conceptually the simplest, is based on the infinite sites assumption, that is no character can mutate more than once in the whole tree. A main open problem regarding the model is finding generalizations that retain the computational tractability of the original model but are more flexible in modeling biological data when the infinite site assumption is violated because of e.g. back mutations. A special case of back mutations that has been considered in the study of the evolution of protein domains (where a domain is acquired and then lost) is persistency, that is the fact that a character is allowed to return back to the ancestral state. In this model characters can be gained and lost at most once. In this paper we consider the computational problem of explaining binary data by the Persistent Perfect Phylogeny model (referred as PPP) and for this purpose we investigate the problem of reconstructing an evolution where some constraints are imposed on the paths of the tree. Results We define a natural generalization of the PPP problem obtained by requiring that for some pairs (character, species), neither the species nor any of its ancestors can have the character. In other words, some characters cannot be persistent for some species. This new problem is called Constrained PPP (CPPP). Based on a graph formulation of the CPPP problem, we are able to provide a polynomial time solution for the CPPP problem for matrices whose conflict graph has no edges. Using this result, we develop a parameterized algorithm for solving the CPPP problem where the parameter is the number of characters. Conclusions A preliminary experimental analysis shows that the constrained persistent perfect phylogeny model allows to

  14. Constrained Supersymmetric Flipped SU(5) GUT Phenomenology

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, John; /CERN /King' s Coll. London; Mustafayev, Azar; /Minnesota U., Theor. Phys. Inst.; Olive, Keith A.; /Minnesota U., Theor. Phys. Inst. /Minnesota U. /Stanford U., Phys. Dept. /SLAC

    2011-08-12

    We explore the phenomenology of the minimal supersymmetric flipped SU(5) GUT model (CFSU(5)), whose soft supersymmetry-breaking (SSB) mass parameters are constrained to be universal at some input scale, M_in, above the GUT scale, M_GUT. We analyze the parameter space of CFSU(5) assuming that the lightest supersymmetric particle (LSP) provides the cosmological cold dark matter, paying careful attention to the matching of parameters at the GUT scale. We first display some specific examples of the evolutions of the SSB parameters that exhibit some generic features. Specifically, we note that the relationship between the masses of the lightest neutralino χ and the lighter stau τ_1 is sensitive to M_in, as is the relationship between m_χ and the masses of the heavier Higgs bosons A, H. For these reasons, prominent features in generic (m_1/2, m_0) planes such as coannihilation strips and rapid-annihilation funnels are also sensitive to M_in, as we illustrate for several cases with tan β = 10 and 55. However, these features do not necessarily disappear at large M_in, unlike the case in the minimal conventional SU(5) GUT. Our results are relatively insensitive to neutrino masses.

  15. Constrained supersymmetric flipped SU(5) GUT phenomenology

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, John [CERN, TH Division, PH Department, Geneva 23 (Switzerland); King' s College London, Theoretical Physics and Cosmology Group, Department of Physics, London (United Kingdom); Mustafayev, Azar [University of Minnesota, William I. Fine Theoretical Physics Institute, Minneapolis, MN (United States); Olive, Keith A. [University of Minnesota, William I. Fine Theoretical Physics Institute, Minneapolis, MN (United States); Stanford University, Department of Physics and SLAC, Palo Alto, CA (United States)

    2011-07-15

    We explore the phenomenology of the minimal supersymmetric flipped SU(5) GUT model (CFSU(5)), whose soft supersymmetry-breaking (SSB) mass parameters are constrained to be universal at some input scale, M_in, above the GUT scale, M_GUT. We analyze the parameter space of CFSU(5) assuming that the lightest supersymmetric particle (LSP) provides the cosmological cold dark matter, paying careful attention to the matching of parameters at the GUT scale. We first display some specific examples of the evolutions of the SSB parameters that exhibit some generic features. Specifically, we note that the relationship between the masses of the lightest neutralino χ and the lighter stau τ_1 is sensitive to M_in, as is the relationship between m_χ and the masses of the heavier Higgs bosons A, H. For these reasons, prominent features in generic (m_1/2, m_0) planes such as coannihilation strips and rapid-annihilation funnels are also sensitive to M_in, as we illustrate for several cases with tan β = 10 and 55. However, these features do not necessarily disappear at large M_in, unlike the case in the minimal conventional SU(5) GUT. Our results are relatively insensitive to neutrino masses. (orig.)

  16. Joint Chance-Constrained Dynamic Programming

    Science.gov (United States)

    Ono, Masahiro; Kuwata, Yoshiaki; Balaram, J. Bob

    2012-01-01

    This paper presents a novel dynamic programming algorithm with a joint chance constraint, which explicitly bounds the risk of failure in order to maintain the state within a specified feasible region. A joint chance constraint cannot be handled by existing constrained dynamic programming approaches since their application is limited to constraints in the same form as the cost function, that is, an expectation over a sum of one-stage costs. We overcome this challenge by reformulating the joint chance constraint into a constraint on an expectation over a sum of indicator functions, which can be incorporated into the cost function by dualizing the optimization problem. As a result, the primal variables can be optimized by standard dynamic programming, while the dual variable is optimized by a root-finding algorithm that converges exponentially. Error bounds on the primal and dual objective values are rigorously derived. We demonstrate the algorithm on a path planning problem, as well as an optimal control problem for Mars entry, descent and landing. The simulations are conducted using real terrain data of Mars, with four million discrete states at each time step.
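    The dualization-plus-root-finding structure can be illustrated on a toy problem: a short-horizon random walk in which leaving a feasible corridor counts as failure, the joint chance constraint is priced by a dual variable lam inside a standard backward DP, and lam is then adjusted by bisection (a simple stand-in for the exponentially converging root-finder in the paper) until the simulated risk meets the bound. All dynamics, costs and numbers below are made up for illustration.

```python
# Toy sketch of joint chance-constrained DP via dualization: backward DP on
# cost + lam * 1{failure}, then root-finding on the dual variable lam.
import numpy as np

T, delta = 6, 0.05                       # horizon and allowed failure probability
states = np.arange(-2, 3)                # feasible positions; |x| > 2 counts as failure
actions = np.array([-1, 0, 1])
noise = ((0.8, 0), (0.2, 1))             # w = 0 with prob 0.8, +1 drift with prob 0.2

def solve_dp(lam):
    """Backward DP with the dualized penalty lam paid on entering the failure set."""
    V = np.zeros(len(states))
    policy = np.zeros((T, len(states)), dtype=int)
    for t in reversed(range(T)):
        Vn = np.empty(len(states))
        for i, x in enumerate(states):
            q = []
            for a in actions:
                val = 0.0
                for p, w in noise:
                    nx = x + a + w
                    if abs(nx) > 2:
                        val += p * lam                        # failure: pay lam, then absorb
                    else:
                        val += p * (abs(nx - 2) + V[nx + 2])  # stage cost: distance to target x = 2
                q.append(val)
            policy[t, i] = int(np.argmin(q))
            Vn[i] = min(q)
        V = Vn
    return policy

def failure_prob(policy, x0=0):
    """Propagate the state distribution forward under the policy; accumulate failure mass."""
    dist = np.zeros(len(states)); dist[x0 + 2] = 1.0
    p_fail = 0.0
    for t in range(T):
        new = np.zeros_like(dist)
        for i, x in enumerate(states):
            a = actions[policy[t, i]]
            for p, w in noise:
                nx = x + a + w
                if abs(nx) > 2:
                    p_fail += dist[i] * p
                else:
                    new[nx + 2] += dist[i] * p
        dist = new
    return p_fail

lo, hi = 0.0, 100.0                      # bisection on the dual variable
for _ in range(40):
    lam = 0.5 * (lo + hi)
    lo, hi = (lam, hi) if failure_prob(solve_dp(lam)) > delta else (lo, lam)
print("dual variable ~", round(hi, 3), " risk:", failure_prob(solve_dp(hi)))
```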

  17. Constraining the roughness degree of slip heterogeneity

    KAUST Repository

    Causse, Mathieu

    2010-05-07

    This article investigates different approaches for assessing the degree of roughness of the slip distribution of future earthquakes. First, we analyze a database of slip images extracted from a suite of 152 finite-source rupture models from 80 events (Mw = 4.1–8.9). This results in an empirical model defining the distribution of the slip spectrum corner wave numbers (kc) as a function of moment magnitude. To reduce the “epistemic” uncertainty, we select a single slip model per event and screen out poorly resolved models. The number of remaining models (30) is thus rather small. In addition, the robustness of the empirical model rests on a reliable estimation of kc by kinematic inversion methods. We address this issue by performing tests on synthetic data with a frequency domain inversion method. These tests reveal that due to smoothing constraints used to stabilize the inversion process, kc tends to be underestimated. We then develop an alternative approach: (1) we establish a proportionality relationship between kc and the peak ground acceleration (PGA), using a k−2 kinematic source model, and (2) we analyze the PGA distribution, which is believed to be better constrained than slip images. These two methods reveal that kc follows a lognormal distribution, with similar standard deviations for both methods.
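    The distributional claim at the end is easy to illustrate: the sketch below fits a lognormal distribution to a set of corner-wavenumber estimates and runs a goodness-of-fit check. The kc values are synthetic stand-ins, not the 30-model database.

```python
# Sketch: fit a lognormal distribution to corner-wavenumber (kc) estimates.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
kc = rng.lognormal(mean=np.log(0.05), sigma=0.5, size=30)   # synthetic kc values (km^-1)

shape, loc, scale = stats.lognorm.fit(kc, floc=0)           # shape = sigma, scale = exp(mu)
print("mu =", round(np.log(scale), 3), " sigma =", round(shape, 3))
print("KS p-value:", stats.kstest(kc, "lognorm", args=(shape, loc, scale)).pvalue)
```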

  18. Technologies for a greenhouse-constrained society

    International Nuclear Information System (INIS)

    Kuliasha, M.A.; Zucker, A.; Ballew, K.J.

    1992-01-01

    This conference explored how three technologies might help society adjust to life in a greenhouse-constrained environment. Technology experts and policy makers from around the world met June 11-13, 1991, in Oak Ridge, Tennessee, to address questions about how energy efficiency, biomass, and nuclear technologies can mitigate the greenhouse effect and to explore energy production and use in countries in various stages of development. The conference was organized by Oak Ridge National Laboratory and sponsored by the US Department of Energy. Energy efficiency, biomass, and nuclear energy are potential substitutes for fossil fuels that might help slow or even reverse the global warming changes that may result from mankind's thirst for energy. Many other conferences have questioned whether the greenhouse effect is real and what reductions in greenhouse gas emissions might be necessary to avoid serious ecological consequences; this conference studied how these reductions might actually be achieved. For these conference proceedings, individual papers are processed separately for the Energy Data Base.

  19. Constrained Supersymmetric Flipped SU(5) GUT Phenomenology

    CERN Document Server

    Ellis, John; Olive, Keith A

    2011-01-01

    We explore the phenomenology of the minimal supersymmetric flipped SU(5) GUT model (CFSU(5)), whose soft supersymmetry-breaking (SSB) mass parameters are constrained to be universal at some input scale, $M_{in}$, above the GUT scale, $M_{GUT}$. We analyze the parameter space of CFSU(5) assuming that the lightest supersymmetric particle (LSP) provides the cosmological cold dark matter, paying careful attention to the matching of parameters at the GUT scale. We first display some specific examples of the evolutions of the SSB parameters that exhibit some generic features. Specifically, we note that the relationship between the masses of the lightest neutralino and the lighter stau is sensitive to $M_{in}$, as is the relationship between the neutralino mass and the masses of the heavier Higgs bosons. For these reasons, prominent features in generic $(m_{1/2}, m_0)$ planes such as coannihilation strips and rapid-annihilation funnels are also sensitive to $M_{in}$, as we illustrate for several cases with tan(beta)...

  20. Constrained supersymmetric flipped SU(5) GUT phenomenology

    International Nuclear Information System (INIS)

    Ellis, John; Mustafayev, Azar; Olive, Keith A.

    2011-01-01

    We explore the phenomenology of the minimal supersymmetric flipped SU(5) GUT model (CFSU(5)), whose soft supersymmetry-breaking (SSB) mass parameters are constrained to be universal at some input scale, M_in, above the GUT scale, M_GUT. We analyze the parameter space of CFSU(5) assuming that the lightest supersymmetric particle (LSP) provides the cosmological cold dark matter, paying careful attention to the matching of parameters at the GUT scale. We first display some specific examples of the evolutions of the SSB parameters that exhibit some generic features. Specifically, we note that the relationship between the masses of the lightest neutralino χ and the lighter stau τ_1 is sensitive to M_in, as is the relationship between m_χ and the masses of the heavier Higgs bosons A, H. For these reasons, prominent features in generic (m_1/2, m_0) planes such as coannihilation strips and rapid-annihilation funnels are also sensitive to M_in, as we illustrate for several cases with tan β = 10 and 55. However, these features do not necessarily disappear at large M_in, unlike the case in the minimal conventional SU(5) GUT. Our results are relatively insensitive to neutrino masses. (orig.)

  1. Scheduling Aircraft Landings under Constrained Position Shifting

    Science.gov (United States)

    Balakrishnan, Hamsa; Chandran, Bala

    2006-01-01

    Optimal scheduling of airport runway operations can play an important role in improving the safety and efficiency of the National Airspace System (NAS). Methods that compute the optimal landing sequence and landing times of aircraft must accommodate practical issues that affect the implementation of the schedule. One such practical consideration, known as Constrained Position Shifting (CPS), is the restriction that each aircraft must land within a pre-specified number of positions of its place in the First-Come-First-Served (FCFS) sequence. We consider the problem of scheduling landings of aircraft in a CPS environment in order to maximize runway throughput (minimize the completion time of the landing sequence), subject to operational constraints such as FAA-specified minimum inter-arrival spacing restrictions, precedence relationships among aircraft that arise either from airline preferences or air traffic control procedures that prevent overtaking, and time windows (representing possible control actions) during which each aircraft landing can occur. We present a Dynamic Programming-based approach that scales linearly in the number of aircraft, and describe our computational experience with a prototype implementation on realistic data for Denver International Airport.
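    For intuition, the sketch below solves a four-aircraft toy instance by brute force: it enumerates only the sequences allowed under a maximum position shift k, checks time windows and a uniform minimum separation, and keeps the sequence with the smallest completion time. This is an illustration of the CPS constraint, not the paper's dynamic program, and all times are made up.

```python
# Brute-force illustration of Constrained Position Shifting (CPS) landing sequencing.
from itertools import permutations

k = 1                                        # maximum position shift from FCFS
earliest = [0, 2, 3, 5]                      # earliest landing times (FCFS order)
latest   = [10, 12, 14, 16]                  # latest landing times (time windows)
sep      = 2                                 # minimum inter-arrival separation (uniform here)

best = None
for order in permutations(range(4)):
    if any(abs(pos - ac) > k for pos, ac in enumerate(order)):
        continue                             # violates the CPS restriction
    t, feasible = -sep, True
    for ac in order:
        t = max(t + sep, earliest[ac])       # earliest feasible landing time in this slot
        if t > latest[ac]:
            feasible = False
            break
    if feasible and (best is None or t < best[0]):
        best = (t, order)

print("makespan", best[0], "sequence", best[1])
```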

  2. Should we still believe in constrained supersymmetry?

    International Nuclear Information System (INIS)

    Balazs, Csaba; Buckley, Andy; Carter, Daniel; Farmer, Benjamin; White, Martin

    2013-01-01

    We calculate partial Bayes factors to quantify how the feasibility of the constrained minimal supersymmetric standard model (CMSSM) has changed in the light of a series of observations. This is done in the Bayesian spirit where probability reflects a degree of belief in a proposition and Bayes' theorem tells us how to update it after acquiring new information. Our experimental baseline is the approximate knowledge that was available before LEP, and our comparison model is the Standard Model with a simple dark matter candidate. To quantify the amount by which experiments have altered our relative belief in the CMSSM since the baseline data we compute the partial Bayes factors that arise from learning in sequence the LEP Higgs constraints, the XENON100 dark matter constraints, the 2011 LHC supersymmetry search results, and the early 2012 LHC Higgs search results. We find that LEP and the LHC strongly shatter our trust in the CMSSM (with M_0 and M_1/2 below 2 TeV), reducing its posterior odds by approximately two orders of magnitude. This reduction is largely due to substantial Occam factors induced by the LEP and LHC Higgs searches. (orig.)

  3. Electricity in a Climate-Constrained World

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-07-01

    After experiencing a historic drop in 2009, electricity generation reached a record high in 2010, confirming the close linkage between economic growth and electricity usage. Unfortunately, CO2 emissions from electricity have also resumed their growth: Electricity remains the single-largest source of CO2 emissions from energy, with 11.7 billion tonnes of CO2 released in 2010. The imperative to 'decarbonise' electricity and improve end-use efficiency remains essential to the global fight against climate change. The IEA’s Electricity in a Climate-Constrained World provides an authoritative resource on progress to date in this area, including statistics related to CO2 and the electricity sector across ten regions of the world (supply, end-use and capacity additions). It also presents topical analyses on the challenge of rapidly curbing CO2 emissions from electricity. Looking at policy instruments, it focuses on emissions trading in China, using energy efficiency to manage electricity supply crises and combining policy instruments for effective CO2 reductions. On regulatory issues, it asks whether deregulation can deliver decarbonisation and assesses the role of state-owned enterprises in emerging economies. And from technology perspectives, it explores the rise of new end-uses, the role of electricity storage, biomass use in Brazil, and the potential of carbon capture and storage for ‘negative emissions’ electricity supply.

  4. Automatic temporal segment detection via bilateral long short-term memory recurrent neural networks

    Science.gov (United States)

    Sun, Bo; Cao, Siming; He, Jun; Yu, Lejun; Li, Liandong

    2017-03-01

    Constrained by the physiology, the temporal factors associated with human behavior, irrespective of facial movement or body gesture, are described by four phases: neutral, onset, apex, and offset. Although they may benefit related recognition tasks, it is not easy to accurately detect such temporal segments. An automatic temporal segment detection framework using bilateral long short-term memory recurrent neural networks (BLSTM-RNN) to learn high-level temporal-spatial features, which synthesizes the local and global temporal-spatial information more efficiently, is presented. The framework is evaluated in detail over the face and body database (FABO). The comparison shows that the proposed framework outperforms state-of-the-art methods for solving the problem of temporal segment detection.
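    A rough architectural sketch of such a frame-level phase tagger is given below, with a standard bidirectional LSTM standing in for the paper's BLSTM-RNN; the feature dimension, layer sizes and input data are assumptions for illustration, not the authors' configuration.

```python
# Sketch: a bidirectional LSTM that labels each frame with one of the four temporal
# phases (neutral, onset, apex, offset).
import torch
import torch.nn as nn

class PhaseTagger(nn.Module):
    def __init__(self, feat_dim=136, hidden=128, n_phases=4):
        super().__init__()
        self.lstm = nn.LSTM(feat_dim, hidden, num_layers=2,
                            batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_phases)   # both directions concatenated

    def forward(self, x):                  # x: (batch, frames, feat_dim)
        out, _ = self.lstm(x)
        return self.head(out)              # per-frame phase logits

model = PhaseTagger()
frames = torch.randn(8, 60, 136)           # 8 clips, 60 frames, 136 placeholder features
logits = model(frames)
loss = nn.CrossEntropyLoss()(logits.reshape(-1, 4), torch.randint(0, 4, (8 * 60,)))
print(logits.shape, float(loss))
```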

  5. The Comparison of Intrathecal Morphine and IV Morphine PCA on Pain Control, Patient Satisfaction, Morphine Consumption, and Adverse Effects in Patients Undergoing Reduction Mammoplasty.

    Science.gov (United States)

    Karamese, Mehtap; Akdağ, Osman; Kara, İnci; Yıldıran, Gokce Unal; Tosun, Zekeriya

    2015-01-01

    Following breast reduction procedures, the level of postoperative pain can be severe, and sufficient pain control influences a patient's physiological, immunological, and psychological status. The aim of this study was to examine the use of intrathecal morphine (ITM) in breast reduction surgery with patient-controlled analgesia (PCA). Sixty-two female patients who underwent breast reductions with the same technique participated in this study. The study group (ITM + PCA) included 32 patients; a single shot (0.2 mg) of ITM and intravenous morphine with PCA were administered. In the control group, morphine PCA alone was intravenously administered to 30 patients. Comparisons between the groups of cumulative morphine consumption, visual analog scale scores, and patient satisfaction scores, which were the primary outcome measures, and adverse effects, which were the secondary outcome measures, were conducted. The patients in the 2 groups had similar degrees of pain and satisfaction scores. The study group had lower cumulative morphine consumption (P = .001) than the PCA-only control group; there was no statistically significant difference in adverse effects between the 2 groups. Intrathecal morphine may effectively control pain with lower total morphine consumption following breast reduction surgery.

  6. Novel PCA-VIP scheme for ranking MRI protocols and identifying computer-extracted MRI measurements associated with central gland and peripheral zone prostate tumors.

    Science.gov (United States)

    Ginsburg, Shoshana B; Viswanath, Satish E; Bloch, B Nicolas; Rofsky, Neil M; Genega, Elizabeth M; Lenkinski, Robert E; Madabhushi, Anant

    2015-05-01

    To identify computer-extracted features for central gland and peripheral zone prostate cancer localization on multiparametric magnetic resonance imaging (MRI). Preoperative T2-weighted (T2w), diffusion-weighted imaging (DWI), and dynamic contrast-enhanced (DCE) MRI were acquired from 23 men with confirmed prostate cancer. Following radical prostatectomy, the cancer extent was delineated by a pathologist on ex vivo histology and mapped to MRI by nonlinear registration of histology and corresponding MRI slices. In all, 244 computer-extracted features were extracted from MRI, and principal component analysis (PCA) was employed to reduce the data dimensionality so that a generalizable classifier could be constructed. A novel variable importance on projection (VIP) measure for PCA (PCA-VIP) was leveraged to identify computer-extracted MRI features that discriminate between cancer and normal prostate, and these features were used to construct classifiers for cancer localization. Classifiers using features selected by PCA-VIP yielded an area under the curve (AUC) of 0.79 and 0.85 for peripheral zone and central gland tumors, respectively. For tumor localization in the central gland, T2w, DCE, and DWI MRI features contributed 71.6%, 18.1%, and 10.2%, respectively; for peripheral zone tumors T2w, DCE, and DWI MRI contributed 29.6%, 21.7%, and 48.7%, respectively. PCA-VIP identified relatively stable subsets of MRI features that performed well in localizing prostate cancer on MRI. © 2014 Wiley Periodicals, Inc.
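    A rough sketch of a PCA-based variable-importance score is shown below; the exact PCA-VIP weighting used in the paper may differ, and the feature matrix is random placeholder data of the same nominal size (23 patients by 244 features).

```python
# Sketch: rank features by a PCA-based importance score (loadings weighted by the
# variance explained by each retained component).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
X = rng.standard_normal((23, 244))                  # 23 patients x 244 MRI features (toy)
X[:, :10] += 2.0 * rng.standard_normal((23, 1))     # make a few features correlated, high variance

pca = PCA(n_components=5).fit(X)
evr = pca.explained_variance_ratio_                 # weight per component
importance = np.sqrt(((pca.components_ ** 2) * evr[:, None]).sum(axis=0))

top = np.argsort(importance)[::-1][:10]
print("highest-importance feature indices:", top)
```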

  7. Automatic individual arterial input functions calculated from PCA outperform manual and population-averaged approaches for the pharmacokinetic modeling of DCE-MR images.

    Science.gov (United States)

    Sanz-Requena, Roberto; Prats-Montalbán, José Manuel; Martí-Bonmatí, Luis; Alberich-Bayarri, Ángel; García-Martí, Gracián; Pérez, Rosario; Ferrer, Alberto

    2015-08-01

    To introduce a segmentation method to calculate an automatic arterial input function (AIF) based on principal component analysis (PCA) of dynamic contrast enhanced MR (DCE-MR) imaging and compare it with individual manually selected and population-averaged AIFs using calculated pharmacokinetic parameters. The study included 65 individuals with prostate examinations (27 tumors and 38 controls). Manual AIFs were individually extracted and also averaged to obtain a population AIF. Automatic AIFs were individually obtained by applying PCA to volumetric DCE-MR imaging data and finding the highest correlation of the PCs with a reference AIF. Variability was assessed using coefficients of variation and repeated measures tests. The different AIFs were used as inputs to the pharmacokinetic model and correlation coefficients, Bland-Altman plots and analysis of variance tests were obtained to compare the results. Automatic PCA-based AIFs were successfully extracted in all cases. The manual and PCA-based AIFs showed good correlation (r between pharmacokinetic parameters ranging from 0.74 to 0.95), with differences below the manual individual variability (RMSCV up to 27.3%). The population-averaged AIF showed larger differences (r from 0.30 to 0.61). The automatic PCA-based approach minimizes the variability associated to obtaining individual volume-based AIFs in DCE-MR studies of the prostate. © 2014 Wiley Periodicals, Inc.
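    The component-selection step can be sketched as follows (simplified from the paper, with synthetic signals and a made-up reference AIF shape): PCA is run on the voxel time-courses and the component whose temporal pattern correlates best with the reference AIF is kept as the automatic AIF.

```python
# Sketch: automatic AIF selection as the PCA component best correlated with a reference AIF.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
n_voxels, n_times = 5000, 40
t = np.arange(n_times, dtype=float)
reference_aif = (t / 5.0) * np.exp(1 - t / 5.0)             # hypothetical population AIF shape

signals = 0.1 * rng.standard_normal((n_voxels, n_times))    # synthetic DCE time-courses
arterial = rng.random(n_voxels) < 0.02                       # small arterial voxel fraction
signals[arterial] += reference_aif * (1 + 0.1 * rng.standard_normal((arterial.sum(), 1)))

pcs = PCA(n_components=8).fit(signals).components_           # (8, n_times) temporal patterns
corr = [abs(np.corrcoef(pc, reference_aif)[0, 1]) for pc in pcs]
auto_aif = pcs[int(np.argmax(corr))]
print("best-correlated component:", int(np.argmax(corr)), "r =", round(max(corr), 3))
```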

  8. Serum crosslinked-N-terminal telopeptide of type I collagen (NTx) has prognostic implications for patients with initial prostate carcinoma (PCa): a pilot study.

    Science.gov (United States)

    Jablonka, Fernando; Alves, Beatriz da Costa Aguiar; de Oliveira, Claudia Giorgia Bronzati; Wroclawski, Marcelo L; Szwarc, Marcelo; Vitória, Webster de Oliveira; Fonseca, Fernando; Del Giglio, Auro

    2014-09-25

    NTx is a type I collagen metabolite previously shown to be increased in patients with bone metastasis. We evaluate NTx potential prognostic role in PCa at diagnosis, when most of the patients have no overt bone involvement. Men with histologic diagnosis of PCa were included at diagnosis. Serum NTx was measured serially every 3 months up to two years by ELISA. Fifty-five PCa patients with a median age of 67 y (51-83 y) were included. Most (86%) had stage I; 4% stage II; 2% stage III and 10% stage IV disease. At entry, median NTx was 14.65 nMBCE and it did not correlate with age, Gleason score or PSA, but we observed a significant direct correlation with stage (p=0.0094). With a median follow up of 23 months, serum NTx correlated significantly with biochemical recurrence (p=0.012), as did Gleason score (p=0.00056), stage (p=0.012) and PSA (pPCa at diagnosis. These results emphasize the importance of bone metabolism biomarkers in patients with PCa even without evident overt bone involvement. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. Temporal network epidemiology

    CERN Document Server

    Holme, Petter

    2017-01-01

    This book covers recent developments in epidemic process models and related data on temporally varying networks. It is widely recognized that contact networks are indispensable for describing, understanding, and intervening to stop the spread of infectious diseases in human and animal populations; “network epidemiology” is an umbrella term to describe this research field. More recently, contact networks have been recognized as being highly dynamic. This observation, also supported by an increasing amount of new data, has led to research on temporal networks, a rapidly growing area. Changes in network structure are often informed by epidemic (or other) dynamics, in which case they are referred to as adaptive networks. This volume gathers contributions by prominent authors working in temporal and adaptive network epidemiology, a field essential to understanding infectious diseases in real society.

  10. A discretized algorithm for the solution of a constrained, continuous ...

    African Journals Online (AJOL)

    A discretized algorithm for the solution of a constrained, continuous quadratic control problem. ... The results obtained show that the Discretized constrained algorithm (DCA) is much more accurate and more efficient than some of these techniques, particularly the FSA. Journal of the Nigerian Association of Mathematical ...

  11. I/O-Efficient Construction of Constrained Delaunay Triangulations

    DEFF Research Database (Denmark)

    Agarwal, Pankaj Kumar; Arge, Lars; Yi, Ke

    2005-01-01

    In this paper, we designed and implemented an I/O-efficient algorithm for constructing constrained Delaunay triangulations. If the number of constraining segments is smaller than the memory size, our algorithm runs in expected O((N/B) log_{M/B}(N/B)) I/Os for triangulating N points in the plane, where M is the size of the main memory and B is the disk block size.

  12. Feature Selection on a Disaster Preparedness Factor Dataset for Provinces in Indonesia Using the PCA (Principal Component Analysis) Method

    Directory of Open Access Journals (Sweden)

    Septa Firmansyah Putra

    2017-01-01

    Full Text Available This study aims to determine which attributes should be used for clustering the provinces of Indonesia according to their disaster-preparedness factors. The data consist of three groups: the number of disaster events (19 sub-attributes), the number of health facilities (14 sub-attributes), and the number of health workers (11 sub-attributes). The study illustrates how data should be cleaned and selected before being used in a clustering process. These data are cleaned and selected before they are later used for clustering; the cleaning and selection are performed with the help of PCA (Principal Component Analysis), after an initial manual cleaning step. The study comprises three experiments: the first experiment yielded 31 sub-attributes ready for use, the second 29 sub-attributes, and the third 24 sub-attributes.

  13. Temporary employment agencies

    OpenAIRE

    Chico Abad, Virginia

    2015-01-01

    Temporary employment agencies have been gaining relevance because of the structure of society and the economy. The entry into force of Law 14/1994, which regulates temporary employment agencies, incorporated into the Spanish legal system a type of company whose activity had already spread across other European countries. The general idea revolves around the flexibility of a new economic and organizational framework and requires companies to have a …

  14. Medial temporal lobe

    International Nuclear Information System (INIS)

    Silver, A.J.; Cross, D.T.; Friedman, D.P.; Bello, J.A.; Hilal, S.K.

    1989-01-01

    To better define the MR appearance of hippocampal sclerosis, the authors reviewed over 500 coronal MR images of the temporal lobes. Many cysts were noted; analysis showed these to be of choroid-fissure (arachnoid) origin, and their association with seizures was low. A few nontumorous, static, medial temporal lesions, noted on T2-weighted coronal images, were poorly visualized on T1-weighted images and did not enhance with gadolinium. These lesions had irregular margins, involved the hippocampus, and were often associated with focal atrophy. They were usually associated with seizure disorders and specific electroencephalographic changes, and the authors believe they represented hippocampal sclerosis.

  15. Probabilistic M/EEG source imaging from sparse spatio-temporal event structure

    DEFF Research Database (Denmark)

    Stahlhut, Carsten; Attias, Hagai T.; Wipf, David

    While MEG and EEG source imaging methods have to tackle a severely ill-posed problem their success can be stated as their ability to constrain the solutions using appropriate priors. In this paper we propose a hierarchical Bayesian model facilitating spatio-temporal patterns through the use of bo...

  16. Spatial and temporal air quality pattern recognition using environmetric techniques: a case study in Malaysia.

    Science.gov (United States)

    Syed Abdul Mutalib, Sharifah Norsukhairin; Juahir, Hafizan; Azid, Azman; Mohd Sharif, Sharifah; Latif, Mohd Talib; Aris, Ahmad Zaharin; Zain, Sharifuddin M; Dominick, Doreena

    2013-09-01

    The objective of this study is to identify spatial and temporal patterns in the air quality at three selected Malaysian air monitoring stations based on an eleven-year database (January 2000-December 2010). Four statistical methods, Discriminant Analysis (DA), Hierarchical Agglomerative Cluster Analysis (HACA), Principal Component Analysis (PCA) and Artificial Neural Networks (ANNs), were selected to analyze the datasets of five air quality parameters, namely: SO2, NO2, O3, CO and particulate matter with a diameter size of below 10 μm (PM10). The three selected air monitoring stations share the characteristic of being located in highly urbanized areas and are surrounded by a number of industries. The DA results show that spatial characterizations allow successful discrimination between the three stations, while HACA shows the temporal pattern from the monthly and yearly factor analysis which correlates with severe haze episodes that have happened in this country at certain periods of time. The PCA results show that the major source of air pollution is mostly due to the combustion of fossil fuel in motor vehicles and industrial activities. The spatial pattern recognition (S-ANN) results show a better prediction performance in discriminating between the regions, with an excellent percentage of correct classification compared to DA. This study presents the necessity and usefulness of environmetric techniques for the interpretation of large datasets aiming to obtain better information about air quality patterns based on spatial and temporal characterizations at the selected air monitoring stations.
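    The toolchain can be sketched compactly on a random placeholder matrix standing in for the station-by-parameter records; the real study used eleven years of SO2, NO2, O3, CO and PM10 measurements at three stations, and also trained an ANN for the spatial pattern recognition step, which is omitted here.

```python
# Sketch: PCA (source patterns), hierarchical clustering (HACA) and discriminant
# analysis (DA) on a toy air-quality matrix (rows = station-months, columns = 5 parameters).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
station = np.repeat([0, 1, 2], 120)                         # 3 stations, 120 monthly records each
X = rng.random((360, 5)) + station[:, None] * np.array([0.5, 0.2, 0.0, 0.3, 0.8])

Z = StandardScaler().fit_transform(X)
scores = PCA(n_components=2).fit_transform(Z)               # exploratory source patterns
clusters = fcluster(linkage(Z, method="ward"), t=3, criterion="maxclust")   # HACA grouping
da = LinearDiscriminantAnalysis().fit(Z, station)           # spatial discrimination
print("DA classification rate:", da.score(Z, station))
```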

  17. [Pattern recognition of decorative papers with different visual characteristics using visible spectroscopy coupled with principal component analysis (PCA)].

    Science.gov (United States)

    Zhang, Mao-mao; Yang, Zhong; Lu, Bin; Liu, Ya-na; Sun, Xue-dong

    2015-02-01

    As one of the most important decorative materials for the modern household products, decorative papers impregnated with melamine not only have better decorative performance, but also could greatly improve the surface properties of materials. However, the appearance quality (such as color-difference evaluation and control) of decorative papers, as an important index for the surface quality of decorative paper, has been a puzzle for manufacturers and consumers. Nowadays, human eye is used to discriminate whether there exist color difference in the factory, which is not only of low efficiency but also prone to bring subjective error. Thus, it is of great significance to find an effective method in order to realize the fast recognition and classification of the decorative papers. In the present study, the visible spectroscopy coupled with principal component analysis (PCA) was used for the pattern recognition of decorative papers with different visual characteristics to investigate the feasibility of visible spectroscopy to rapidly recognize the types of decorative papers. The results showed that the correlation between visible spectroscopy and visual characteristics (L*, a* and b*) was significant, and the correlation coefficients wereup to 0.85 and some was even more than 0. 99, which might suggest that the visible spectroscopy reflected some information about visual characteristics on the surface of decorative papers. When using the visible spectroscopy coupled with PCA to recognize the types of decorative papers, the accuracy reached 94%-100%, which might suggest that the visible spectroscopy was a very potential new method for the rapid, objective and accurate recognition of decorative papers with different visual characteristics.

  18. On application of kernel PCA for generating stimulus features for fMRI during continuous music listening.

    Science.gov (United States)

    Tsatsishvili, Valeri; Burunat, Iballa; Cong, Fengyu; Toiviainen, Petri; Alluri, Vinoo; Ristaniemi, Tapani

    2018-06-01

    There has been growing interest in naturalistic neuroimaging experiments, which deepen our understanding of how the human brain processes and integrates incoming streams of multifaceted sensory information, as commonly occurs in the real world. Music is a good example of such a complex continuous phenomenon. In a few recent fMRI studies examining neural correlates of music in continuous listening settings, multiple perceptual attributes of the music stimulus were represented by a set of high-level features, produced as linear combinations of acoustic descriptors computationally extracted from the stimulus audio. Here, fMRI data from a naturalistic music listening experiment were employed. Kernel principal component analysis (KPCA) was applied to acoustic descriptors extracted from the stimulus audio to generate a set of nonlinear stimulus features. Subsequently, perceptual and neural correlates of the generated high-level features were examined. The generated features captured musical percepts that were hidden from the linear PCA features, namely Rhythmic Complexity and Event Synchronicity. Neural correlates of the new features revealed activations associated with the processing of complex rhythms, including auditory, motor, and frontal areas. Results were compared with the findings of a previously published study, which analyzed the same fMRI data but applied linear PCA for generating stimulus features. To enable comparison of the results, the methodology for finding stimulus-driven functional maps was adopted from the previous study. Exploiting nonlinear relationships among acoustic descriptors can lead to novel high-level stimulus features, which can in turn reveal new brain structures involved in music processing. Copyright © 2018 Elsevier B.V. All rights reserved.
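    The feature-generation step translates directly into a short sketch, with scikit-learn's KernelPCA standing in for the paper's implementation and a random matrix standing in for the frame-by-frame acoustic descriptors; the kernel choice and gamma value are illustrative assumptions.

```python
# Sketch: generate nonlinear high-level stimulus features from acoustic descriptors via kernel PCA.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(6)
descriptors = rng.standard_normal((1200, 25))       # stimulus frames x acoustic descriptors (toy)

Z = StandardScaler().fit_transform(descriptors)
kpca = KernelPCA(n_components=6, kernel="rbf", gamma=0.05)
stimulus_features = kpca.fit_transform(Z)           # nonlinear high-level features over time
print(stimulus_features.shape)                      # (1200, 6): candidate regressors for fMRI
```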

  19. Characterization of dynamic changes of current source localization based on spatiotemporal fMRI constrained EEG source imaging

    Science.gov (United States)

    Nguyen, Thinh; Potter, Thomas; Grossman, Robert; Zhang, Yingchun

    2018-06-01

    Objective. Neuroimaging has been employed as a promising approach to advance our understanding of brain networks in both basic and clinical neuroscience. Electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) represent two neuroimaging modalities with complementary features; EEG has high temporal resolution and low spatial resolution while fMRI has high spatial resolution and low temporal resolution. Multimodal EEG inverse methods have attempted to capitalize on these properties but have been subjected to localization error. The dynamic brain transition network (DBTN) approach, a spatiotemporal fMRI constrained EEG source imaging method, has recently been developed to address these issues by solving the EEG inverse problem in a Bayesian framework, utilizing fMRI priors in a spatial and temporal variant manner. This paper presents a computer simulation study to provide a detailed characterization of the spatial and temporal accuracy of the DBTN method. Approach. Synthetic EEG data were generated in a series of computer simulations, designed to represent realistic and complex brain activity at superficial and deep sources with highly dynamical activity time-courses. The source reconstruction performance of the DBTN method was tested against the fMRI-constrained minimum norm estimates algorithm (fMRIMNE). The performances of the two inverse methods were evaluated both in terms of spatial and temporal accuracy. Main results. In comparison with the commonly used fMRIMNE method, results showed that the DBTN method produces results with increased spatial and temporal accuracy. The DBTN method also demonstrated the capability to reduce crosstalk in the reconstructed cortical time-course(s) induced by neighboring regions, mitigate depth bias and improve overall localization accuracy. Significance. The improved spatiotemporal accuracy of the reconstruction allows for an improved characterization of complex neural activity. This improvement can be

  20. Regional Responses to Constrained Water Availability

    Science.gov (United States)

    Cui, Y.; Calvin, K. V.; Hejazi, M. I.; Clarke, L.; Kim, S. H.; Patel, P.

    2017-12-01

    There have been many concerns about water as a constraint on agricultural production, electricity generation, and many other human activities in the coming decades. Nevertheless, how different countries/economies would respond to such constraints has not been explored. Here, we examine the mechanisms by which countries respond to binding water availability constraints at the water basin level and across a wide range of socioeconomic, climate and energy technology scenarios. Specifically, we look at the change in water withdrawals between energy, land-use and other sectors within an integrated framework, using the Global Change Assessment Model (GCAM), which endogenizes water use and allocation decisions based on costs. We find that, when water is taken into account as part of production decision-making, countries/basins in general fall into three categories, depending on the change in water withdrawals and the re-allocation of water between sectors. First, water is not a constraining factor for most basins. Second, advancements in water-saving technologies for electricity generation cooling systems are sufficient to reduce water withdrawals enough to meet binding water availability constraints, as in China and the EU-15. Third, water savings in the electricity sector alone are not sufficient to make up for the lowered water availability in the binding case; for example, many basins in Pakistan, the Middle East and India have to reduce irrigated water withdrawals substantially, by either switching to rain-fed agriculture or reducing production. The dominant response strategy for individual countries/basins is quite robust across the range of alternative scenarios that we test. The relative size of water withdrawals between the energy and agriculture sectors is one of the most important factors affecting the dominant response mechanism.

  1. Constraining Cosmic Evolution of Type Ia Supernovae

    Energy Technology Data Exchange (ETDEWEB)

    Foley, Ryan J.; Filippenko, Alexei V.; Aguilera, C.; Becker, A.C.; Blondin, S.; Challis, P.; Clocchiatti, A.; Covarrubias, R.; Davis, T.M.; Garnavich, P.M.; Jha, S.; Kirshner, R.P.; Krisciunas, K.; Leibundgut, B.; Li, W.; Matheson, T.; Miceli, A.; Miknaitis, G.; Pignata, G.; Rest, A.; Riess, A.G.; /UC, Berkeley, Astron. Dept. /Cerro-Tololo InterAmerican Obs. /Washington U., Seattle, Astron. Dept. /Harvard-Smithsonian Ctr. Astrophys. /Chile U., Catolica /Bohr Inst. /Notre Dame U. /KIPAC, Menlo Park /Texas A-M /European Southern Observ. /NOAO, Tucson /Fermilab /Chile U., Santiago /Harvard U., Phys. Dept. /Baltimore, Space Telescope Sci. /Johns Hopkins U. /Res. Sch. Astron. Astrophys., Weston Creek /Stockholm U. /Hawaii U. /Illinois U., Urbana, Astron. Dept.

    2008-02-13

    We present the first large-scale effort of creating composite spectra of high-redshift type Ia supernovae (SNe Ia) and comparing them to low-redshift counterparts. Through the ESSENCE project, we have obtained 107 spectra of 88 high-redshift SNe Ia with excellent light-curve information. In addition, we have obtained 397 spectra of low-redshift SNe through a multiple-decade effort at Lick and Keck Observatories, and we have used 45 ultraviolet spectra obtained by HST/IUE. The low-redshift spectra act as a control sample when comparing to the ESSENCE spectra. In all instances, the ESSENCE and Lick composite spectra appear very similar. The addition of galaxy light to the Lick composite spectra allows a nearly perfect match of the overall spectral-energy distribution with the ESSENCE composite spectra, indicating that the high-redshift SNe are more contaminated with host-galaxy light than their low-redshift counterparts. This is caused by observing objects at all redshifts with similar slit widths, which correspond to different projected distances. After correcting for the galaxy-light contamination, subtle differences in the spectra remain. We have estimated the systematic errors when using current spectral templates for K-corrections to be ~0.02 mag. The variance in the composite spectra gives an estimate of the intrinsic variance in low-redshift maximum-light SN spectra of ~3% in the optical, growing toward the ultraviolet. The difference between the maximum-light low- and high-redshift spectra constrains SN evolution between our samples to be < 10% in the rest-frame optical.
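
    The basic operation of building a composite spectrum and estimating its wavelength-dependent scatter can be sketched in a few lines. The snippet below is a simplified illustration with synthetic, flux-normalized spectra on a common rest-frame grid; it is not the ESSENCE pipeline, and the normalization choice is an assumption made for the example.

```python
import numpy as np

def composite_spectrum(wavelengths, spectra):
    """Mean composite and fractional scatter of flux-normalized spectra.

    wavelengths : common rest-frame wavelength grid, shape (n_wave,)
    spectra     : list of (wave, flux) arrays for individual spectra
    """
    resampled = []
    for wave, flux in spectra:
        f = np.interp(wavelengths, wave, flux)   # resample to the common grid
        resampled.append(f / f.mean())           # simple flux normalization
    resampled = np.array(resampled)
    mean = resampled.mean(axis=0)
    frac_scatter = resampled.std(axis=0) / mean  # scatter per wavelength bin
    return mean, frac_scatter

# Toy usage with synthetic spectra
grid = np.linspace(3500.0, 7000.0, 500)
rng = np.random.default_rng(1)
fake = [(grid, 1.0 + 0.3 * np.sin(grid / 300.0) + 0.05 * rng.standard_normal(grid.size))
        for _ in range(20)]
mean, scatter = composite_spectrum(grid, fake)
print(scatter.mean())
```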

  2. Laterally constrained inversion for CSAMT data interpretation

    Science.gov (United States)

    Wang, Ruo; Yin, Changchun; Wang, Miaoyue; Di, Qingyun

    2015-10-01

    Laterally constrained inversion (LCI) has been successfully applied to the inversion of dc resistivity, TEM and airborne EM data. However, it has not yet been applied to the interpretation of controlled-source audio-frequency magnetotelluric (CSAMT) data. In this paper, we apply the LCI method to CSAMT data inversion by preconditioning the Jacobian matrix. We apply a weighting matrix to the Jacobian to balance the sensitivity of the model parameters, so that the resolution with respect to different model parameters becomes more uniform. Numerical experiments confirm that this improves the convergence of the inversion. We first invert a synthetic dataset with and without noise to investigate the effect of applying LCI to CSAMT data. For the noise-free data, the results show that the LCI method recovers the true model better than the traditional single-station inversion; for the noisy data, the true model is recovered even with a noise level of 8%, indicating that LCI inversions are to some extent insensitive to noise. Then, we re-invert two CSAMT datasets collected respectively in a watershed and in a coal mine area in Northern China and compare our results with those from previous inversions. The comparison for the coal mine shows that the LCI method delivers smoother layer interfaces that correlate well with seismic data, while the comparison with a global search algorithm, simulated annealing (SA), for the watershed shows that although both methods deliver very similar, good results, the LCI algorithm presented in this paper runs much faster. The inversion results for the coal mine CSAMT survey show that a conductive water-bearing zone that was not revealed by the previous inversions has been identified by the LCI. This further demonstrates that the method presented in this paper works for CSAMT data inversion.
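
    To make the idea of Jacobian column weighting plus lateral constraints concrete, the sketch below performs one damped Gauss-Newton update for a set of adjacent 1D models tied together by a lateral roughness operator. It is a generic illustration under assumed shapes and a linearized forward model, not the authors' CSAMT code; the weighting scheme and regularization values are assumptions.

```python
import numpy as np

def lci_update(J, residual, m, n_stations, n_layers, alpha=1.0, beta=1e-2):
    """One laterally constrained, column-weighted Gauss-Newton step.

    J        : (n_data, n_params) Jacobian of the stacked forward problem
    residual : (n_data,) data misfit d_obs - d_pred
    m        : (n_params,) current model, ordered station by station
    """
    # Column weighting balances sensitivities of different parameter types
    w = 1.0 / (np.linalg.norm(J, axis=0) + 1e-12)
    Jw = J * w                                        # Jacobian in scaled parameters

    # Lateral roughness operator: difference of the same layer at adjacent stations
    rows = []
    for layer in range(n_layers):
        for s in range(n_stations - 1):
            r = np.zeros(n_stations * n_layers)
            r[s * n_layers + layer] = 1.0
            r[(s + 1) * n_layers + layer] = -1.0
            rows.append(r)
    D = np.array(rows) * w                            # same scaling in constraint space

    A = Jw.T @ Jw + alpha * D.T @ D + beta * np.eye(J.shape[1])
    b = Jw.T @ residual - alpha * D.T @ (D @ (m / w))
    dm_scaled = np.linalg.solve(A, b)
    return m + w * dm_scaled                          # undo the column scaling

# Shapes-only toy usage: 5 stations, 3 layers, 10 data per station
ns, nl = 5, 3
rng = np.random.default_rng(2)
J = rng.standard_normal((ns * 10, ns * nl))
m = np.ones(ns * nl)
res = rng.standard_normal(ns * 10)
print(lci_update(J, res, m, ns, nl).shape)
```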

  3. The cost-constrained traveling salesman problem

    Energy Technology Data Exchange (ETDEWEB)

    Sokkappa, P.R.

    1990-10-01

    The Cost-Constrained Traveling Salesman Problem (CCTSP) is a variant of the well-known Traveling Salesman Problem (TSP). In the TSP, the goal is to find a tour of a given set of cities such that the total cost of the tour is minimized. In the CCTSP, each city is given a value, and a fixed cost-constraint is specified. The objective is to find a subtour of the cities that achieves maximum value without exceeding the cost-constraint. Thus, unlike the TSP, the CCTSP requires both selection and sequencing. As a consequence, most results for the TSP cannot be extended to the CCTSP. We show that the CCTSP is NP-hard and that no K-approximation algorithm or fully polynomial approximation scheme exists, unless P = NP. We also show that several special cases are polynomially solvable. Algorithms for the CCTSP, which outperform previous methods, are developed in three areas: upper bounding methods, exact algorithms, and heuristics. We found that a bounding strategy based on the knapsack problem performs better, both in speed and in the quality of the bounds, than methods based on the assignment problem. Likewise, we found that a branch-and-bound approach using the knapsack bound was superior to a method based on a common branch-and-bound method for the TSP. In our study of heuristic algorithms, we found that, when selecting nodes for inclusion in the subtour, it is important to consider the "neighborhood" of the nodes. A node with low value that brings the subtour near many other nodes may be more desirable than an isolated node of high value. We found two types of repetition to be desirable: repetitions based on randomization in the subtour building process, and repetitions encouraging the inclusion of different subsets of the nodes. By varying the number and type of repetitions, we can adjust the computation time required by our method to obtain algorithms that outperform previous methods.
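
    One common way to build a knapsack-style bound of the kind described (assuming the subtour is a closed cycle through the selected nodes) is to charge each node only a lower bound on the travel cost its inclusion must incur, for instance half the sum of its two cheapest incident edges, and then take the best-value subset that fits the budget while ignoring sequencing. The snippet below is an illustrative relaxation along those lines, not the thesis's algorithm; the distance matrix, budget and cost-scaling are made up.

```python
import numpy as np

def knapsack_upper_bound(values, dist, budget, scale=100):
    """Upper bound on CCTSP value via a knapsack relaxation.

    values : (n,) value collected when a node is visited
    dist   : (n, n) symmetric travel-cost matrix
    budget : total cost that the subtour may not exceed
    Each node is charged half the sum of its two cheapest incident edges,
    a lower bound on what any closed tour pays for visiting it.
    """
    n = len(values)
    node_cost = np.empty(n)
    for i in range(n):
        edges = np.sort(np.delete(dist[i], i))
        node_cost[i] = 0.5 * (edges[0] + edges[1])

    # Standard 0/1 knapsack DP on integer-scaled costs; rounding costs down
    # keeps the result a valid upper bound.
    cap = int(budget * scale)
    w = np.floor(node_cost * scale).astype(int)
    best = np.zeros(cap + 1)
    for i in range(n):
        for c in range(cap, w[i] - 1, -1):
            best[c] = max(best[c], best[c - w[i]] + values[i])
    return best[cap]

rng = np.random.default_rng(3)
pts = rng.random((12, 2))
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
vals = rng.integers(1, 10, size=12).astype(float)
print(knapsack_upper_bound(vals, D, budget=1.5))
```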

  4. Optimizing Temporal Queries

    DEFF Research Database (Denmark)

    Toman, David; Bowman, Ivan Thomas

    2003-01-01

    Recent research in the area of temporal databases has proposed a number of query languages that vary in their expressive power and the semantics they provide to users. These query languages represent a spectrum of solutions to the tension between clean semantics and efficient evaluation. Often, t...

  5. Temporal Concurrent Constraint Programming

    DEFF Research Database (Denmark)

    Nielsen, Mogens; Palamidessi, Catuscia; Valencia, Frank Dan

    2002-01-01

    The ntcc calculus is a model of non-deterministic temporal concurrent constraint programming. In this paper we study behavioral notions for this calculus. In the underlying computational model, concurrent constraint processes are executed in discrete time intervals. The behavioral notions studied...

  6. Temporal Photon Differentials

    DEFF Research Database (Denmark)

    Schjøth, Lars; Frisvad, Jeppe Revall; Erleben, Kenny

    2010-01-01

    The finite frame rate also used in computer-animated films is a cause of adverse temporal aliasing effects. The most noticeable of these is a stroboscopic effect that is seen as intermittent movement of fast-moving illumination. This effect can be mitigated using non-zero shutter times, effectively...

  7. Temporal compressive sensing systems

    Science.gov (United States)

    Reed, Bryan W.

    2017-12-12

    Methods and systems for temporal compressive sensing are disclosed, where within each of one or more sensor array data acquisition periods, one or more sensor array measurement datasets comprising distinct linear combinations of time slice data are acquired, and where mathematical reconstruction allows for calculation of accurate representations of the individual time slice datasets.
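
    In essence, each acquired frame is a coded linear combination of several time slices, and the slices are recovered afterwards by inverting that linear system. The snippet below simulates this idea for a single pixel stream using random binary temporal codes and a regularized least-squares recovery; it is a conceptual sketch, not the patented reconstruction, and a temporal-smoothness penalty stands in for the sparsity priors normally used.

```python
import numpy as np

rng = np.random.default_rng(4)

T = 16            # time slices per acquisition period
M = 8             # coded measurements acquired in that period (2x compression)
signal = np.sin(np.linspace(0, 4 * np.pi, T)) + 0.5   # one pixel's true time course

# Each measurement sums a random subset of the time slices (binary temporal code)
codes = rng.integers(0, 2, size=(M, T)).astype(float)
measurements = codes @ signal + 0.01 * rng.standard_normal(M)

# Recover the time slices with smoothness-regularized least squares
D = np.diff(np.eye(T), 2, axis=0)                     # second-difference operator
lam = 0.5
A = codes
x_hat = np.linalg.solve(A.T @ A + lam * D.T @ D + 1e-6 * np.eye(T),
                        A.T @ measurements)

print(np.round(signal, 2))
print(np.round(x_hat, 2))
```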

  8. Information and Temporality

    Directory of Open Access Journals (Sweden)

    Christian Flender

    2016-09-01

    Being able to give reasons for what the world is and how it works is one of the defining characteristics of modernity. Mathematical reason and empirical observation brought science and engineering to unprecedented success. However, modernity has reached a post-state where an instrumental view of technology needs revision with reasonable arguments and evidence, i.e. without falling back to superstition and mysticism. Instrumentally, technology bears the potential to ease and to harm. Easing and harming cannot be controlled in the way that the initial development of technology is a controlled exercise for a specific, mostly easing, purpose. Therefore, a revised understanding of information technology is proposed based upon mathematical concepts and intuitions as developed in quantum mechanics. Quantum mechanics offers unequaled opportunities because it raises foundational questions in a precise form. Beyond instrumentalism it makes it possible to raise the question of essences, as that which remains through time what it is. The essence of information technology is acausality. The time of acausality is temporality. Temporality is not a concept or a category. It is not epistemological. As an existential, and thus more comprehensive and fundamental than a concept or a category, temporality is ontological; it does not simply have ontic properties. Rather it exhibits general essences. Datability, significance, spannedness and openness are general essences of equiprimordial time (temporality).

  9. Temporal logic motion planning

    CSIR Research Space (South Africa)

    Seotsanyana, M

    2010-01-01

    In this paper, a critical review of temporal logic motion planning is presented. The review aims to address the following problems: (a) In a realistic situation, the motion planning problem is carried out in real-time, in a dynamic, uncertain...

  10. Experimental temporal quantum steering

    Czech Academy of Sciences Publication Activity Database

    Bartkiewicz, K.; Černoch, Antonín; Lemr, K.; Miranowicz, A.; Nori, F.

    2016-01-01

    Vol. 6, Nov (2016), 1-8, article No. 38076. ISSN 2045-2322. R&D Projects: GA ČR GAP205/12/0382. Institutional support: RVO:68378271. Keywords: temporal quantum steering * EPR steering. Subject RIV: BH - Optics, Masers, Lasers. Impact factor: 4.259, year: 2016

  11. Cluster analysis of commercial samples of Bauhinia spp. using HPLC-UV/PDA and MCR-ALS/PCA without peak alignment procedure.

    Science.gov (United States)

    Ardila, Jorge Armando; Funari, Cristiano Soleo; Andrade, André Marques; Cavalheiro, Alberto José; Carneiro, Renato Lajarim

    2015-01-01

    Bauhinia forficata Link. is recognised by the Brazilian Health Ministry as a treatment of hypoglycemia and diabetes. Analytical methods are useful to assess the plant identity due to the similarities found in plants from Bauhinia spp. HPLC-UV/PDA in combination with chemometric tools is a widely used and suitable alternative for the authentication of plant material; however, shifts in retention times for similar compounds in different samples are a problem. The objectives were to perform comparisons between the authentic medicinal plant (Bauhinia forficata Link.) and samples commercially available in drugstores claiming to be "Bauhinia spp. to treat diabetes", and to evaluate the performance of multivariate curve resolution - alternating least squares (MCR-ALS) combined with principal component analysis (PCA) when compared to pure PCA. HPLC-UV/PDA data obtained from extracts of leaves were evaluated employing a combination of MCR-ALS and PCA, which allowed the use of the full chromatographic and spectrometric information without the need for peak alignment procedures. The use of MCR-ALS/PCA showed better results than conventional PCA using only one wavelength. Only two of nine commercial samples presented characteristics similar to the authentic Bauhinia forficata spp., considering the full HPLC-UV/PDA data. The combination of MCR-ALS and PCA is very useful when applied to a group of samples where a general alignment procedure cannot be applied due to the different chromatographic profiles. This work also demonstrates the need for stricter control from the health authorities regarding herbal products available on the market. Copyright © 2015 John Wiley & Sons, Ltd.
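
    For readers unfamiliar with the workflow, the sketch below shows the core of an MCR-ALS loop on a column-wise augmented data matrix (samples stacked along the elution-time axis) followed by PCA of the resolved per-sample component contributions. It is a bare-bones illustration with random data and simple non-negativity clipping; real analyses add initial estimates from purity methods, convergence checks and further constraints, and all names and dimensions here are assumptions.

```python
import numpy as np

def mcr_als(D, n_comp, n_iter=50, seed=0):
    """Minimal MCR-ALS: D (rows = elution times, cols = wavelengths) ~= C @ S."""
    rng = np.random.default_rng(seed)
    S = np.abs(rng.standard_normal((n_comp, D.shape[1])))   # spectra estimate
    for _ in range(n_iter):
        C = np.clip(D @ np.linalg.pinv(S), 0, None)         # non-negative concentrations
        S = np.clip(np.linalg.pinv(C) @ D, 0, None)         # non-negative spectra
    return C, S

# Fake augmented matrix: 9 samples x 200 time points, 80 wavelengths
n_samples, n_times, n_wave, n_comp = 9, 200, 80, 3
rng = np.random.default_rng(5)
D_aug = np.abs(rng.standard_normal((n_samples * n_times, n_wave)))

C_aug, S = mcr_als(D_aug, n_comp)

# Per-sample "peak areas": integrate each component's concentration profile
areas = C_aug.reshape(n_samples, n_times, n_comp).sum(axis=1)

# PCA of the resolved areas (no peak alignment needed at any point)
X = (areas - areas.mean(axis=0)) / areas.std(axis=0)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
scores = U * s                                              # sample scores on the PCs
print(scores[:, :2])
```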

  12. Masked-Volume-Wise PCA and "reference Logan" illustrate similar regional differences in kinetic behavior in human brain PET study using [11C]-PIB

    Directory of Open Access Journals (Sweden)

    Engler Henry

    2009-01-01

    Background. Kinetic modeling using reference Logan is commonly used to analyze data obtained from dynamic Positron Emission Tomography (PET) studies on patients with Alzheimer's disease (AD) and healthy volunteers (HVs) using the amyloid imaging agent N-methyl [11C]2-(4'-methylaminophenyl)-6-hydroxy-benzothiazole, [11C]-PIB. The aim of the present study was to explore whether results obtained using the newly introduced method, Masked Volume Wise Principal Component Analysis (MVW-PCA), were similar to the results obtained using reference Logan. Methods. MVW-PCA and reference Logan were performed on dynamic PET images obtained from four Alzheimer's disease (AD) patients on two occasions (baseline and follow-up) and on four healthy volunteers (HVs). Regions of interest (ROIs) of similar sizes were positioned in different parts of the brain in both AD patients and HVs where the difference between AD patients and HVs is largest. Signal-to-noise ratio (SNR) and discrimination power (DP) were calculated for images generated by the different methods and the results were compared both qualitatively and quantitatively. Results. MVW-PCA generated images that illustrated similar regional binding patterns compared to reference Logan images, with slightly higher quality, enhanced contrast, and improved SNR and DP, without being based on modeling assumptions. MVW-PCA also generated additional MVW-PC images by using the whole dataset, which illustrated regions with different and uncorrelated kinetic behaviors of the administered tracer. This additional information might improve the understanding of the kinetic behavior of the administered tracer. Conclusion. MVW-PCA is a potential multivariate method that without modeling assumptions generates high quality images, which illustrated similar regional changes compared to modeling methods such as reference Logan. In addition, MVW-PCA could be used as a new technique, applicable not only on dynamic human brain studies but also on

  13. Predicting the effectiveness of virtual reality relaxation on pain and anxiety when added to PCA morphine in patients having burns dressings changes.

    Science.gov (United States)

    Konstantatos, A H; Angliss, M; Costello, V; Cleland, H; Stafrace, S

    2009-06-01

    Pain arising in burns sufferers is often severe and protracted. The prospect of a dressing change can heighten existing pain by impacting both physically and psychologically. In this trial we examined whether pre-procedural virtual reality guided relaxation added to patient controlled analgesia with morphine reduced pain severity during awake dressings changes in burns patients. We conducted a prospective randomized clinical trial in all patients with burns necessitating admission to a tertiary burns referral centre. Eligible patients requiring awake dressings changes were randomly allocated to single use virtual reality relaxation plus intravenous morphine patient controlled analgesia (PCA) infusion or to intravenous morphine patient controlled analgesia infusion alone. Patients rated their worst pain intensity during the dressing change using a visual analogue scale. The primary outcome measure was presence of 30% or greater difference in pain intensity ratings between the groups in estimation of worst pain during the dressing change. Of 88 eligible and consenting patients having awake dressings changes, 43 were assigned to virtual reality relaxation plus intravenous morphine PCA infusion and 43 to morphine PCA infusion alone. The group receiving virtual reality relaxation plus morphine PCA infusion reported significantly higher pain intensities during the dressing change (mean=7.3) compared with patients receiving morphine PCA alone (mean=5.3) (p=0.003) (95% CI 0.6-2.8). The addition of virtual reality guided relaxation to morphine PCA infusion in burns patients resulted in a significant increase in pain experienced during awake dressings changes. In the absence of a validated predictor for responsiveness to virtual reality relaxation such a therapy cannot be recommended for general use in burns patients having awake dressings changes.

  14. APPLICATION OF THE PCA 1.0 SOFTWARE: TO REDUCE THE DETERIORATION OF AIR QUALITY IN CALI, COLOMBIA (PHASE I)

    Directory of Open Access Journals (Sweden)

    Luis Granada

    2009-01-01

    PCA 1.0 was designed as a support tool for an Organizational Procedure that collects and processes the information obtained from the monitoring and control of atmospheric pollutants in Cali, Colombia. PCA 1.0 is a Java application built with the NetBeans integrated development environment (IDE). The PCA 1.0 software is designed to systematically receive, clean, validate and report the data obtained from the Air Quality and Meteorology Monitoring Network and from technical vehicle and emissions inspections. PCA 1.0 estimates the emission factors and the daily environmental load generated by mobile sources, in kilograms, as well as the hourly average concentration of criteria pollutants. The general methodology of the Organizational Procedure is based on ISO 14040 (Life Cycle Assessment). The most important result of applying PCA 1.0 is that it helps the environmental, health, traffic and transport authorities make decisions based on the selection of air pollution scenarios and on actions aimed at issuing regulations to reduce the deterioration of air quality in Cali, Colombia. Finally, PCA 1.0 stands out as an agile and adaptable tool for the air component of municipal environmental management systems.

  15. The Northern Norway Mother-and-Child Contaminant Cohort (MISA) Study: PCA analyses of environmental contaminants in maternal sera and dietary intake in early pregnancy.

    Science.gov (United States)

    Veyhe, Anna Sofía; Hofoss, Dag; Hansen, Solrunn; Thomassen, Yngvar; Sandanger, Torkjel M; Odland, Jon Øyvind; Nieboer, Evert

    2015-03-01

    Although predictors of contaminants in serum or whole blood are usually examined by chemical group (e.g., POPs, toxic and/or essential elements; dietary sources), principal component analysis (PCA) permits consideration of both individual substances and combined variables. Our study had two primary objectives: (i) characterize the sources and predictors of a suite of eight PCBs, four organochlorine (OC) pesticides, and five essential and five toxic elements in serum and/or whole blood of pregnant women recruited as part of the Mother-and-Child Contaminant Cohort Study conducted in Northern Norway (The MISA study); and (ii) determine the influence of personal and social characteristics on both dietary and contaminant factors. Recruitment and sampling started in May 2007 and continued for the next 31 months until December 2009. Blood/serum samples were collected during the 2nd trimester (mean: 18.2 weeks, range 9.0-36.0). A validated questionnaire was administered to obtain personal information. The samples were analysed by established laboratories employing verified methods and reference standards. PCA involved Varimax rotation, and significant predictors (p≤0.05) in linear regression models were included in the multivariable linear regression analysis. When considering all the contaminants, three PCA axes stood out, with prominent loadings of: all POPs; arsenic, selenium and mercury; and cadmium and lead. The respective predictors in the multivariate models were: maternal age, parity and consumption of freshwater fish and land-based wild animals; consumption of marine fish; and cigarette smoking, dietary PCA axes reflecting consumption of grains and cereals, and food items involving hunting. PCA of only the POPs separated them into two axes that, in terms of recently published findings, can be understood to reflect longitudinal trends and their relative contributions to summed POPs. The linear combinations of variables generated by PCA identified prominent
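
    The core numerical step described here, PCA of standardized contaminant concentrations followed by a Varimax rotation of the retained loadings, can be sketched as below. The variable names, number of retained components and random data are purely illustrative, and the regression stage with dietary predictors is omitted.

```python
import numpy as np

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    """Varimax rotation of a (variables x components) loading matrix."""
    p, k = loadings.shape
    R = np.eye(k)
    var_old = 0.0
    for _ in range(max_iter):
        L = loadings @ R
        u, s, vt = np.linalg.svd(
            loadings.T @ (L**3 - (gamma / p) * L @ np.diag((L**2).sum(axis=0))))
        R = u @ vt
        if s.sum() < var_old * (1 + tol):
            break
        var_old = s.sum()
    return loadings @ R

# Fake matrix: 100 participants x 22 log-transformed contaminant concentrations
rng = np.random.default_rng(6)
X = rng.standard_normal((100, 22))
Z = (X - X.mean(axis=0)) / X.std(axis=0)          # standardize before PCA

U, s, Vt = np.linalg.svd(Z, full_matrices=False)
n_keep = 3                                        # e.g. three prominent axes
loadings = Vt[:n_keep].T * s[:n_keep] / np.sqrt(Z.shape[0] - 1)
rotated = varimax(loadings)
scores = Z @ rotated                              # unscaled component scores per participant
print(rotated.shape, scores.shape)
```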

  16. Communication, Technology, Temporality

    Directory of Open Access Journals (Sweden)

    Mark A. Martinez

    2012-08-01

    This paper proposes a media studies that foregrounds technological objects as communicative and historical agents. Specifically, I take the digital computer as a powerful catalyst of crises in communication theories and certain key features of modernity. Finally, the computer is the motor of "New Media", which is at once a set of technologies, a historical epoch, and a field of knowledge. As such the computer shapes "the new" and "the future" as History pushes its origins further into the past and its convergent quality pushes its future as a predominant medium. As the treatment of information and interface suggests, communication theories observe computers, and technologies generally, for the mediated languages they either afford or foreclose to us. My project describes the figures of information and interface for the different ways they can be thought of as aspects of communication. I treat information not as semantic meaning, formal or discursive language, but rather as a physical organism. Similarly, an interface is not a relationship between a screen and a human visual intelligence, but is instead a reciprocal, affective and physical process of contact. I illustrate that historically there have been conceptions of information and interface complementary to mine, fleeting as they have been in the face of a dominant temporality of mediation. I begin with a theoretically informed approach to media history, and extend it to a new theory of communication. In doing so I discuss a model of time common to popular, scientific, and critical conceptions of media technologies, especially in theories of computer technology. This is a predominant model with particular rules of temporal change and causality for thinking about mediation, and it limits the conditions of possibility for knowledge production about communication. I suggest a new model of time as integral to any event of observation and analysis, and that human mediation does not exhaust the

  17. Temporal Variability of Observed and Simulated Hyperspectral Earth Reflectance

    Science.gov (United States)

    Roberts, Yolanda; Pilewskie, Peter; Kindel, Bruce; Feldman, Daniel; Collins, William D.

    2012-01-01

    The Climate Absolute Radiance and Refractivity Observatory (CLARREO) is a climate observation system designed to study Earth's climate variability with unprecedented absolute radiometric accuracy and SI traceability. Observation System Simulation Experiments (OSSEs) were developed using GCM output and MODTRAN to simulate CLARREO reflectance measurements during the 21st century as a design tool for the CLARREO hyperspectral shortwave imager. With OSSE simulations of hyperspectral reflectance, Feldman et al. [2011a,b] found that shortwave reflectance is able to detect changes in climate variables during the 21st century and improve time-to-detection compared to broadband measurements. The OSSE has been a powerful tool in the design of the CLARREO imager and for understanding the effect of climate change on the spectral variability of reflectance, but it is important to evaluate how well the OSSE simulates the Earth's present-day spectral variability. For this evaluation we have used hyperspectral reflectance measurements from the Scanning Imaging Absorption Spectrometer for Atmospheric Cartography (SCIAMACHY), a shortwave spectrometer that was operational between March 2002 and April 2012. To study the spectral variability of SCIAMACHY-measured and OSSE-simulated reflectance, we used principal component analysis (PCA), a spectral decomposition technique that identifies dominant modes of variability in a multivariate data set. Using quantitative comparisons of the OSSE and SCIAMACHY PCs, we have quantified how well the OSSE captures the spectral variability of Earth's climate system at the beginning of the 21st century relative to SCIAMACHY measurements. These results showed that the OSSE and SCIAMACHY data sets share over 99% of their total variance in 2004. Using the PCs and the temporally distributed reflectance spectra projected onto the PCs (PC scores), we can study the temporal variability of the observed and simulated reflectance spectra. Multivariate time
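
    The decomposition and projection described here follow the standard PCA recipe: remove the mean spectrum, extract the leading modes of the spectral covariance, and project each spectrum onto them to obtain PC scores whose time series can be compared between data sets. A minimal sketch with synthetic spectra is given below; the array shapes and the mode-similarity check are illustrative assumptions, not the CLARREO/SCIAMACHY processing.

```python
import numpy as np

def spectral_pca(spectra, n_pcs):
    """spectra: (n_obs, n_wavelengths). Returns PCs, scores and variance fractions."""
    mean = spectra.mean(axis=0)
    anomalies = spectra - mean
    U, s, Vt = np.linalg.svd(anomalies, full_matrices=False)
    pcs = Vt[:n_pcs]                    # spectral shapes of the leading modes
    scores = anomalies @ pcs.T          # projection of each spectrum (time series of scores)
    var_frac = (s**2 / (s**2).sum())[:n_pcs]
    return pcs, scores, var_frac

# Two synthetic "data sets" sharing the same dominant spectral modes
rng = np.random.default_rng(7)
wave_modes = rng.standard_normal((4, 300))
a = rng.standard_normal((500, 4)) @ wave_modes + 0.05 * rng.standard_normal((500, 300))
b = rng.standard_normal((500, 4)) @ wave_modes + 0.05 * rng.standard_normal((500, 300))

pcs_a, scores_a, var_a = spectral_pca(a, n_pcs=4)
pcs_b, scores_b, var_b = spectral_pca(b, n_pcs=4)

# Cumulative variance captured, and similarity of the leading modes
print(var_a.cumsum(), var_b.cumsum())
print(np.abs(pcs_a @ pcs_b.T).round(2))   # near-permutation pattern => shared variability
```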

  18. Real-time dynamic MR image reconstruction using compressed sensing and principal component analysis (CS-PCA): Demonstration in lung tumor tracking.

    Science.gov (United States)

    Dietz, Bryson; Yip, Eugene; Yun, Jihyun; Fallone, B Gino; Wachowicz, Keith

    2017-08-01

    This work presents a real-time dynamic image reconstruction technique, which combines compressed sensing and principal component analysis (CS-PCA), to achieve real-time adaptive radiotherapy with the use of a linac-magnetic resonance imaging system. Six retrospective fully sampled dynamic data sets of patients diagnosed with non-small-cell lung cancer were used to investigate the CS-PCA algorithm. Using a database of fully sampled k-space, principal components (PCs) were calculated to aid in the reconstruction of undersampled images. Missing k-space data were calculated by projecting the current undersampled k-space data onto the PCs to generate the corresponding PC weights. The weighted PCs were summed together, and the missing k-space was iteratively updated. To gain insight into how the reconstruction might proceed at lower fields, 6× noise was added to the 3T data to investigate how the algorithm handles noisy data. Acceleration factors ranging from 2 to 10× were investigated using CS-PCA and Split Bregman CS for comparison. Metrics to determine the reconstruction quality included the normalized mean square error (NMSE), as well as the Dice coefficients (DC) and centroid displacement of the tumor segmentations. Our results demonstrate that CS-PCA performed better than CS alone. The CS-PCA patient-averaged DC for 3T and 6× noise-added data remained above 0.9 for acceleration factors up to 10×. The patient-averaged NMSE gradually increased with increasing acceleration; however, it remained below 0.06 up to an acceleration factor of 10× for both 3T and 6× noise-added data. The CS-PCA reconstruction speed ranged from 5 to 20 ms (Intel i7-4710HQ CPU @ 2.5 GHz), depending on the chosen parameters. A real-time reconstruction technique was developed for adaptive radiotherapy using a Linac-MRI system. Our CS-PCA algorithm can achieve tumor contours with DC greater than 0.9 and NMSE less than 0.06 at acceleration factors of up to, and including, 10×. The
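
    The PCA step of the reconstruction can be paraphrased in a few lines: learn PCs from a database of fully sampled (flattened) k-space frames, then, for an undersampled frame, estimate PC weights from the acquired samples, synthesize the missing samples from the weighted PCs, and re-impose the acquired data. The sketch below follows that description with synthetic complex data; it is an interpretation of the paper's outline, not the authors' code, and the compressed-sensing update that the full CS-PCA method alternates with is omitted.

```python
import numpy as np

def pca_fill(k_under, mask, pcs, mean):
    """Estimate missing k-space samples by projection onto training PCs.

    k_under : (n_k,) undersampled k-space vector (zeros where not acquired)
    mask    : (n_k,) boolean, True where samples were acquired
    pcs     : (n_comp, n_k) principal components of the training k-space
    mean    : (n_k,) mean of the training k-space
    """
    A = pcs[:, mask].T                                   # (n_acquired, n_comp)
    w, *_ = np.linalg.lstsq(A, k_under[mask] - mean[mask], rcond=None)
    k_model = mean + w @ pcs                             # PC-based estimate of full k-space
    return np.where(mask, k_under, k_model)              # keep acquired data, fill the rest

# Synthetic example: frames flattened to length-4096 k-space vectors
rng = np.random.default_rng(8)
basis = rng.standard_normal((6, 4096)) + 1j * rng.standard_normal((6, 4096))
train = rng.standard_normal((40, 6)) @ basis             # 40 "fully sampled" training frames
mean = train.mean(axis=0)
_, _, Vt = np.linalg.svd(train - mean, full_matrices=False)
pcs = Vt[:6]

truth = rng.standard_normal(6) @ basis                   # new frame to reconstruct
mask = rng.random(4096) < 0.25                           # roughly 4x undersampling
k_under = np.where(mask, truth, 0)
recon = pca_fill(k_under, mask, pcs, mean)
print(np.linalg.norm(recon - truth) / np.linalg.norm(truth))   # normalized error
```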

  19. pcaGoPromoter--an R package for biological and regulatory interpretation of principal components in genome-wide gene expression data

    DEFF Research Database (Denmark)

    Hansen, Morten; Gerds, Thomas Alexander; Nielsen, Ole Haagen

    2012-01-01

    Analyzing data obtained from genome-wide gene expression experiments is challenging due to the quantity of variables, the need for multivariate analyses, and the demands of managing large amounts of data. Here we present the R package pcaGoPromoter, which facilitates the interpretation of genome-wide gene expression data... e.g., cell cycle progression and the predicted involvement of expected transcription factors, including E2F. In addition, unexpected results, e.g., cholesterol synthesis in serum-depleted cells and NF-κB activation in inhibitor-treated cells, were noted. In summary, the pcaGoPromoter R package provides

  20. KINETIC CONSEQUENCES OF CONSTRAINING RUNNING BEHAVIOR

    Directory of Open Access Journals (Sweden)

    John A. Mercer

    2005-06-01

    It is known that impact forces increase with running velocity as well as when stride length increases. Since stride length naturally changes with changes in submaximal running velocity, it was not clear which factor, running velocity or stride length, played a critical role in determining impact characteristics. The aim of the study was to investigate whether or not stride length influences the relationship between running velocity and impact characteristics. Eight volunteers (mass = 72.4 ± 8.9 kg; height = 1.7 ± 0.1 m; age = 25 ± 3.4 years) completed two running conditions: preferred stride length (PSL) and stride length constrained at 2.5 m (SL2.5). During each condition, participants ran at a variety of speeds with the intent that the range of speeds would be similar between conditions. During PSL, participants were given no instructions regarding stride length. During SL2.5, participants were required to strike targets placed on the floor that resulted in a stride length of 2.5 m. Ground reaction forces were recorded (1080 Hz) as well as leg and head accelerations (uni-axial accelerometers). Impact force and impact attenuation (calculated as the ratio of head and leg impact accelerations) were recorded for each running trial. Scatter plots were generated plotting each parameter against running velocity. Lines of best fit were calculated with the slopes recorded for analysis. The slopes were compared between conditions using paired t-tests. Data from two subjects were dropped from analysis since the velocity ranges were not similar between conditions, resulting in the analysis of six subjects. The slope of the impact force vs. velocity relationship was different between conditions (PSL: 0.178 ± 0.16 BW/m·s-1; SL2.5: -0.003 ± 0.14 BW/m·s-1; p < 0.05). The slope of the impact attenuation vs. velocity relationship was different between conditions (PSL: 5.12 ± 2.88 %/m·s-1; SL2.5: 1.39 ± 1.51 %/m·s-1; p < 0.05). Stride length was an important factor
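
    The statistical treatment described, fitting a line of best fit per participant and condition and comparing the slopes with a paired t-test, is straightforward to reproduce. The sketch below uses fabricated per-trial data purely to show the computation; the numbers and trends are not the study's measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)

def slope(velocity, impact):
    """Slope of the line of best fit (e.g. impact force in BW vs. velocity in m/s)."""
    return np.polyfit(velocity, impact, 1)[0]

n_subjects, n_trials = 6, 10
slopes_psl, slopes_sl25 = [], []
for _ in range(n_subjects):
    v = np.linspace(2.5, 5.5, n_trials) + 0.1 * rng.standard_normal(n_trials)
    # Fabricated trends: impact force rises with velocity at PSL, stays flat at SL2.5
    impact_psl = 1.5 + 0.18 * v + 0.05 * rng.standard_normal(n_trials)
    impact_sl25 = 2.0 + 0.00 * v + 0.05 * rng.standard_normal(n_trials)
    slopes_psl.append(slope(v, impact_psl))
    slopes_sl25.append(slope(v, impact_sl25))

t, p = stats.ttest_rel(slopes_psl, slopes_sl25)   # paired comparison across subjects
print(np.mean(slopes_psl), np.mean(slopes_sl25), p)
```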