WorldWideScience

Sample records for multi-parametric mapping software

  1. A generalized parametric response mapping method for analysis of multi-parametric imaging: A feasibility study with application to glioblastoma.

    Science.gov (United States)

    Lausch, Anthony; Yeung, Timothy Pok-Chi; Chen, Jeff; Law, Elton; Wang, Yong; Urbini, Benedetta; Donelli, Filippo; Manco, Luigi; Fainardi, Enrico; Lee, Ting-Yim; Wong, Eugene

    2017-11-01

    Parametric response map (PRM) analysis of functional imaging has been shown to be an effective tool for early prediction of cancer treatment outcomes and may also be well-suited toward guiding personalized adaptive radiotherapy (RT) strategies such as sub-volume boosting. However, the PRM method was primarily designed for analysis of longitudinally acquired pairs of single-parameter image data. The purpose of this study was to demonstrate the feasibility of a generalized parametric response map analysis framework, which enables analysis of multi-parametric data while maintaining the key advantages of the original PRM method. MRI-derived apparent diffusion coefficient (ADC) and relative cerebral blood volume (rCBV) maps acquired at 1 and 3-months post-RT for 19 patients with high-grade glioma were used to demonstrate the algorithm. Images were first co-registered and then standardized using normal tissue image intensity values. Tumor voxels were then plotted in a four-dimensional Cartesian space with coordinate values equal to a voxel's image intensity in each of the image volumes and an origin defined as the multi-parametric mean of normal tissue image intensity values. Voxel positions were orthogonally projected onto a line defined by the origin and a pre-determined response vector. The voxels are subsequently classified as positive, negative or nil, according to whether projected positions along the response vector exceeded a threshold distance from the origin. The response vector was selected by identifying the direction in which the standard deviation of tumor image intensity values was maximally different between responding and non-responding patients within a training dataset. Voxel classifications were visualized via familiar three-class response maps and then the fraction of tumor voxels associated with each of the classes was investigated for predictive utility analogous to the original PRM method. Independent PRM and MPRM analyses of the contrast
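The projection-and-threshold step described in this abstract can be sketched in a few lines of NumPy. Everything below is illustrative: the array shapes, the response vector and the threshold are assumptions, not the authors' implementation or trained values.

```python
import numpy as np

def mprm_classify(voxels, normal_mean, response_vector, threshold):
    """Project standardized multi-parametric voxel intensities onto a
    response vector anchored at the normal-tissue mean and classify each
    voxel as positive (+1), negative (-1) or nil (0) by a distance threshold."""
    v = np.asarray(response_vector, float)
    v = v / np.linalg.norm(v)
    proj = (voxels - normal_mean) @ v          # signed projected distance
    labels = np.zeros(len(voxels), dtype=int)
    labels[proj > threshold] = 1
    labels[proj < -threshold] = -1
    return labels

# Illustrative use: fraction of tumour voxels in each response class
rng = np.random.default_rng(0)
vox = rng.normal(size=(1000, 4))               # e.g. ADC/rCBV at two time points
labels = mprm_classify(vox, np.zeros(4), [1.0, -1.0, 0.5, 0.0], threshold=1.5)
print({c: float(np.mean(labels == c)) for c in (-1, 0, 1)})
```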

  2. A microcomputer based multi parametric system for nuclear data acquisition and processing

    International Nuclear Information System (INIS)

    Toledo Acosta B, Rene; Osorio Deliz F, Juan; Arista Romeu, Eduardo; Perez Sanchez, Reinaldo; Lopes Torres, E.

    1997-01-01

    A four-parameter multi-parametric system for the acquisition and processing of nuclear data is described. It is characterized by its flexibility and relatively low cost, while also guaranteeing a high acquisition capacity. The system can be used in a multi-parametric manner, in pulse-height analysis, or in any combination of both per parameter. The hardware and the software of the system are described

  3. Personalized precision radiotherapy by integration of multi-parametric functional and biological imaging in prostate cancer. A feasibility study

    Energy Technology Data Exchange (ETDEWEB)

    Thorwarth, Daniela [Tuebingen Univ. (Germany). Section for Biomedical Physics; Notohamiprodjo, Mike [Tuebingen Univ. (Germany). Dept. of Diagnostic and Interventional Radiology; Zips, Daniel; Mueller, Arndt-Christan [Tuebingen Univ. (Germany). Dept. of Radiation Oncology

    2017-05-01

    To increase tumour control probability (TCP) in prostate cancer, a method was developed integrating multi-parametric functional and biological information into a dose painting treatment plan aiming at focal dose escalation to tumour sub-volumes. A dose-escalation map was derived considering individual, multi-parametric estimated tumour aggressiveness. Multi-parametric functional imaging (MRI, Choline-/PSMA-/FMISO-PET/CT) was acquired for a high-risk prostate cancer patient with a high level of tumour load (cT3b cN0 cM0) indicated by subtotal involvement of the prostate including the right seminal vesicle and by a PSA level >100. The probability of tumour presence was determined by a combination of multi-parametric functional image information, resulting in a voxel-based map of tumour aggressiveness. This probability map was directly integrated into dose optimization in order to plan for inhomogeneous, biological-imaging-based dose painting. Histograms of the multi-parametric prescription function were generated in addition to a differential histogram of the planned inhomogeneous doses. Comparison of prescribed doses with planned doses on a voxel level was realized using an effective DVH, containing the ratio of prescribed vs. planned dose for each tumour voxel. Multi-parametric imaging data of PSMA, Choline and FMISO PET/CT as well as ADC maps derived from diffusion-weighted MRI were combined into an individual probability map of tumour presence. Voxel-based prescription doses ranged from 75.3 Gy up to 93.4 Gy (median: 79.6 Gy), whereas the planned dose painting doses varied only between 72.5 and 80.0 Gy with a median dose of 75.7 Gy. However, inhomogeneous voxel-based dose prescriptions can only be implemented in a treatment plan up to a certain level. Multi-parametric probability-based dose painting in prostate cancer is technically and clinically feasible. However, detailed calibration functions to define the necessary probability functions need to be assessed in future

  4. Explicit/multi-parametric model predictive control (MPC) of linear discrete-time systems by dynamic and multi-parametric programming

    KAUST Repository

    Kouramas, K.I.

    2011-08-01

    This work presents a new algorithm for solving the explicit/multi-parametric model predictive control (or mp-MPC) problem for linear, time-invariant discrete-time systems, based on dynamic programming and multi-parametric programming techniques. The algorithm features two key steps: (i) a dynamic programming step, in which the mp-MPC problem is decomposed into a set of smaller subproblems in which only the current control, state variables, and constraints are considered, and (ii) a multi-parametric programming step, in which each subproblem is solved as a convex multi-parametric programming problem, to derive the control variables as an explicit function of the states. The key feature of the proposed method is that it overcomes potential limitations of previous methods for solving multi-parametric programming problems with dynamic programming, such as the need for global optimization for each subproblem of the dynamic programming step. © 2011 Elsevier Ltd. All rights reserved.
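The payoff of the offline mp-MPC computation is that the online controller reduces to a point-location problem over precomputed critical regions. A minimal sketch of that online step follows; the regions and gains below are placeholder numbers for a one-dimensional state, not output of the algorithm in this record.

```python
import numpy as np

# Each critical region is a polyhedron {x : H @ x <= h} carrying an affine
# law u = K @ x + g. These would come from the offline dynamic/multi-parametric
# programming step; the numbers below are placeholders for a 1-D state.
regions = [
    {"H": np.array([[1.0], [-1.0]]), "h": np.array([1.0, 0.0]),
     "K": np.array([[-0.5]]), "g": np.array([0.0])},   # covers 0 <= x <= 1
    {"H": np.array([[1.0], [-1.0]]), "h": np.array([0.0, 1.0]),
     "K": np.array([[-0.8]]), "g": np.array([0.1])},   # covers -1 <= x <= 0
]

def explicit_mpc_control(x, regions, tol=1e-9):
    """Online part of explicit MPC: locate the critical region containing
    the current state and evaluate its affine control law."""
    for r in regions:
        if np.all(r["H"] @ x <= r["h"] + tol):
            return r["K"] @ x + r["g"]
    raise ValueError("state lies outside the explored parameter space")

print(explicit_mpc_control(np.array([0.4]), regions))   # -> array([-0.2])
```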

  5. PET image reconstruction using multi-parametric anato-functional priors

    Science.gov (United States)

    Mehranian, Abolfazl; Belzunce, Martin A.; Niccolini, Flavia; Politis, Marios; Prieto, Claudia; Turkheimer, Federico; Hammers, Alexander; Reader, Andrew J.

    2017-08-01

    In this study, we investigate the application of multi-parametric anato-functional (MR-PET) priors for the maximum a posteriori (MAP) reconstruction of brain PET data in order to address the limitations of the conventional anatomical priors in the presence of PET-MR mismatches. In addition to partial volume correction benefits, the suitability of these priors for reconstruction of low-count PET data is also introduced and demonstrated, comparing to standard maximum-likelihood (ML) reconstruction of high-count data. The conventional local Tikhonov and total variation (TV) priors and current state-of-the-art anatomical priors including the Kaipio, non-local Tikhonov prior with Bowsher and Gaussian similarity kernels are investigated and presented in a unified framework. The Gaussian kernels are calculated using both voxel- and patch-based feature vectors. To cope with PET and MR mismatches, the Bowsher and Gaussian priors are extended to multi-parametric priors. In addition, we propose a modified joint Burg entropy prior that by definition exploits all parametric information in the MAP reconstruction of PET data. The performance of the priors was extensively evaluated using 3D simulations and two clinical brain datasets of [18F]florbetaben and [18F]FDG radiotracers. For simulations, several anato-functional mismatches were intentionally introduced between the PET and MR images, and furthermore, for the FDG clinical dataset, two PET-unique active tumours were embedded in the PET data. Our simulation results showed that the joint Burg entropy prior far outperformed the conventional anatomical priors in terms of preserving PET unique lesions, while still reconstructing functional boundaries with corresponding MR boundaries. In addition, the multi-parametric extension of the Gaussian and Bowsher priors led to enhanced preservation of edge and PET unique features and also an improved bias-variance performance. In agreement with the simulation results, the clinical results
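As a rough orientation to MAP reconstruction with an image-derived prior, the sketch below implements a toy one-step-late MAP-EM update with a weighted quadratic penalty; the Bowsher, Gaussian-kernel and joint Burg entropy priors studied in the paper are considerably richer, and all data here are synthetic.

```python
import numpy as np

def osl_map_em(y, A, weights, beta=0.1, n_iter=50):
    """Toy one-step-late MAP-EM for a Poisson model y ~ Poisson(A @ x) with a
    weighted quadratic neighbourhood prior; MR information would enter only
    through `weights`.

    y       : (M,) measured counts
    A       : (M, N) system matrix
    weights : (N, N) symmetric nonnegative similarity weights
    """
    x = np.ones(A.shape[1])
    sens = A.sum(axis=0)                      # sensitivity image A^T 1
    for _ in range(n_iter):
        # Gradient of U(x) = 0.5 * sum_jk w_jk (x_j - x_k)^2
        prior_grad = weights.sum(axis=1) * x - weights @ x
        ratio = y / np.clip(A @ x, 1e-12, None)
        x = x * (A.T @ ratio) / np.clip(sens + beta * prior_grad, 1e-12, None)
    return x

# Tiny illustration with a random system matrix and uniform weights
rng = np.random.default_rng(1)
A = rng.uniform(size=(30, 10))
x_true = rng.uniform(1, 5, size=10)
y = rng.poisson(A @ x_true)
w = np.ones((10, 10)) - np.eye(10)
print(osl_map_em(y, A, w, beta=0.05)[:5])
```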

  6. Theoretical and algorithmic advances in multi-parametric programming and control

    KAUST Repository

    Pistikopoulos, Efstratios N.; Dominguez, Luis; Panos, Christos; Kouramas, Konstantinos; Chinchuluun, Altannar

    2012-01-01

    This paper presents an overview of recent theoretical and algorithmic advances, and applications in the areas of multi-parametric programming and explicit/multi-parametric model predictive control (mp-MPC). In multi-parametric programming, advances include areas such as nonlinear multi-parametric programming (mp-NLP), bi-level programming, dynamic programming and global optimization for multi-parametric mixed-integer linear programming problems (mp-MILPs). In multi-parametric/explicit MPC (mp-MPC), advances include areas such as robust multi-parametric control, multi-parametric nonlinear MPC (mp-NMPC) and model reduction in mp-MPC. A comprehensive framework for multi-parametric programming and control is also presented. Recent applications include a hydrogen storage device, a fuel cell power generation system, an unmanned autonomous vehicle (UAV) and a hybrid pressure swing adsorption (PSA) system. © 2012 Springer-Verlag.

  7. Theoretical and algorithmic advances in multi-parametric programming and control

    KAUST Repository

    Pistikopoulos, Efstratios N.

    2012-04-21

    This paper presents an overview of recent theoretical and algorithmic advances, and applications in the areas of multi-parametric programming and explicit/multi-parametric model predictive control (mp-MPC). In multi-parametric programming, advances include areas such as nonlinear multi-parametric programming (mp-NLP), bi-level programming, dynamic programming and global optimization for multi-parametric mixed-integer linear programming problems (mp-MILPs). In multi-parametric/explicit MPC (mp-MPC), advances include areas such as robust multi-parametric control, multi-parametric nonlinear MPC (mp-NMPC) and model reduction in mp-MPC. A comprehensive framework for multi-parametric programming and control is also presented. Recent applications include a hydrogen storage device, a fuel cell power generation system, an unmanned autonomous vehicle (UAV) and a hybrid pressure swing adsorption (PSA) system. © 2012 Springer-Verlag.

  8. Multi parametric system for the acquisition and processing of nuclear data on a personal computer

    International Nuclear Information System (INIS)

    Toledo Acosta, R. B.; Osorio Deliz, J. F.; Arista Romeu, E.; Perez Sanchez, R.; Lopez Torres, E.

    1997-01-01

    A four-parameter multi-parametric system for the acquisition and processing of nuclear data is described. It is characterized by its flexibility and relatively low cost, while also guaranteeing a high acquisition capacity. The system can be used in a multi-parametric manner, in pulse-height analysis, or in any combination of both per parameter. The hardware and the software of the system are described. A general explanation of the operation and the characteristics of the system is offered. (author)

  9. Explicit/multi-parametric model predictive control (MPC) of linear discrete-time systems by dynamic and multi-parametric programming

    KAUST Repository

    Kouramas, K.I.; Faísca, N.P.; Panos, C.; Pistikopoulos, E.N.

    2011-01-01

    This work presents a new algorithm for solving the explicit/multi-parametric model predictive control (or mp-MPC) problem for linear, time-invariant discrete-time systems, based on dynamic programming and multi-parametric programming techniques

  10. Integrable multi parametric SU(N) chain

    International Nuclear Information System (INIS)

    Foerster, Angela; Roditi, Itzhak; Rodrigues, Ligia M.C.S.

    1996-03-01

    We analyse integrable models associated with a multi-parametric SU(N) R-matrix. We show that the Hamiltonians describe SU(N) chains with twisted boundary conditions and that the underlying algebraic structure is the multi-parametric deformation of SU(N) enlarged by the introduction of a central element. (author). 15 refs

  11. Close-range geophotogrammetric mapping of trench walls using multi-model stereo restitution software

    Energy Technology Data Exchange (ETDEWEB)

    Coe, J.A.; Taylor, E.M.; Schilling, S.P.

    1991-06-01

    Methods for mapping geologic features exposed on trench walls have advanced from conventional gridding and sketch mapping to precise close-range photogrammetric mapping. In our study, two strips of small-format (60 × 60) stereo pairs, each containing 42 photos and covering approximately 60 m of nearly vertical trench wall (2-4 m high), were contact printed onto eight 205 × 255-mm transparent film sheets. Each strip was oriented in a Kern DSR15 analytical plotter using the bundle adjustment module of Multi-Model Stereo Restitution Software (MMSRS). We experimented with several systematic-control-point configurations to evaluate orientation accuracies as a function of the number and position of control points. We recommend establishing control-point columns (each containing 2-3 points) in every 5th photo to achieve the 7-mm Root Mean Square Error (RMSE) accuracy required by our trench-mapping project. 7 refs., 8 figs., 1 tab.

  12. Close-range geophotogrammetric mapping of trench walls using multi-model stereo restitution software

    International Nuclear Information System (INIS)

    Coe, J.A.; Taylor, E.M.; Schilling, S.P.

    1991-01-01

    Methods for mapping geologic features exposed on trench walls have advanced from conventional gridding and sketch mapping to precise close-range photogrammetric mapping. In our study, two strips of small-format (60 x 60) stereo pairs, each containing 42 photos and covering approximately 60 m of nearly vertical trench wall (2-4 m high), were contact printed onto eight 205 x 255-mm transparent film sheets. Each strip was oriented in a Kern DSR15 analytical plotter using the bundle adjustment module of Multi-Model Stereo Restitution Software (MMSRS). We experimented with several systematic-control-point configurations to evaluate orientation accuracies as a function of the number and position of control points. We recommend establishing control-point columns (each containing 2-3 points) in every 5th photo to achieve the 7-mm Root Mean Square Error (RMSE) accuracy required by our trench-mapping project. 7 refs., 8 figs., 1 tab

  13. Machine learning-based dual-energy CT parametric mapping.

    Science.gov (United States)

    Su, Kuan-Hao; Kuo, Jung-Wen; Jordan, David W; Van Hedent, Steven; Klahr, Paul; Wei, Zhouping; Al Helo, Rose; Liang, Fan; Qian, Pengjiang; Pereira, Gisele C; Rassouli, Negin; Gilkeson, Robert C; Traughber, Bryan J; Cheng, Chee-Wai; Muzic, Raymond F

    2018-05-22

    The aim is to develop and evaluate machine learning methods for generating quantitative parametric maps of effective atomic number (Zeff), relative electron density (ρe), mean excitation energy (Ix), and relative stopping power (RSP) from clinical dual-energy CT data. The maps could be used for material identification and radiation dose calculation. Machine learning methods of historical centroid (HC), random forest (RF), and artificial neural networks (ANN) were used to learn the relationship between dual-energy CT input data and ideal output parametric maps calculated for phantoms from the known compositions of 13 tissue substitutes. After training and model selection steps, the machine learning predictors were used to generate parametric maps from independent phantom and patient input data. Precision and accuracy were evaluated using the ideal maps. This process was repeated for a range of exposure doses, and performance was compared to that of the clinically-used dual-energy, physics-based method which served as the reference. The machine learning methods generated more accurate and precise parametric maps than those obtained using the reference method. Their performance advantage was particularly evident when using data from the lowest exposure, one-fifth of a typical clinical abdomen CT acquisition. The RF method achieved the greatest accuracy. In comparison, the ANN method was only 1% less accurate but had much better computational efficiency than RF, being able to produce parametric maps in 15 seconds. Machine learning methods outperformed the reference method in terms of accuracy and noise tolerance when generating parametric maps, encouraging further exploration of the techniques. Among the methods we evaluated, ANN is the most suitable for clinical use due to its combination of accuracy, excellent low-noise performance, and computational efficiency. © 2018 Institute of Physics and Engineering in Medicine.
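A minimal sketch of the supervised-mapping idea (dual-energy CT numbers in, parametric values out) using a scikit-learn random forest; the tissue-substitute numbers are placeholders, and the paper's HC and ANN variants, dose levels and evaluation protocol are not reproduced.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical training data: paired low/high-kVp CT numbers for tissue
# substitutes of known composition, with reference Zeff and relative electron
# density derived from those compositions (all values are placeholders).
hu_pairs = np.array([[-1000, -1000], [-90, -60], [0, 0], [40, 35],
                     [60, 55], [250, 180], [800, 500], [1200, 700]], float)
zeff_rho = np.array([[1.0, 0.001], [6.2, 0.95], [7.4, 1.00], [7.5, 1.04],
                     [7.6, 1.06], [10.0, 1.09], [12.5, 1.45], [13.5, 1.70]])

# One multi-output regressor learns (HU_low, HU_high) -> (Zeff, rho_e);
# the same pattern would apply to Ix and RSP targets.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(hu_pairs, zeff_rho)

# Apply the trained predictor voxel-wise to a dual-energy image pair
low_kvp = np.full((4, 4), 45.0)
high_kvp = np.full((4, 4), 38.0)
features = np.stack([low_kvp.ravel(), high_kvp.ravel()], axis=1)
zeff_map, rho_map = model.predict(features).T
print(zeff_map.reshape(4, 4)[0, 0], rho_map.reshape(4, 4)[0, 0])
```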

  14. SU-G-JeP2-02: A Unifying Multi-Atlas Approach to Electron Density Mapping Using Multi-Parametric MRI for Radiation Treatment Planning

    Energy Technology Data Exchange (ETDEWEB)

    Ren, S [Stanford University, Stanford, CA (United States); Tianjin University, Tianjin (China); Hara, W; Le, Q; Wang, L; Xing, L; Li, R [Stanford University, Stanford, CA (United States)

    2016-06-15

    Purpose: MRI has a number of advantages over CT as a primary modality for radiation treatment planning (RTP). However, one key bottleneck problem still remains, which is the lack of electron density information in MRI. In the work, a reliable method to map electron density is developed by leveraging the differential contrast of multi-parametric MRI. Methods: We propose a probabilistic Bayesian approach for electron density mapping based on T1 and T2-weighted MRI, using multiple patients as atlases. For each voxel, we compute two conditional probabilities: (1) electron density given its image intensity on T1 and T2-weighted MR images, and (2) electron density given its geometric location in a reference anatomy. The two sources of information (image intensity and spatial location) are combined into a unifying posterior probability density function using the Bayesian formalism. The mean value of the posterior probability density function provides the estimated electron density. Results: We evaluated the method on 10 head and neck patients and performed leave-one-out cross validation (9 patients as atlases and remaining 1 as test). The proposed method significantly reduced the errors in electron density estimation, with a mean absolute HU error of 138, compared with 193 for the T1-weighted intensity approach and 261 without density correction. For bone detection (HU>200), the proposed method had an accuracy of 84% and a sensitivity of 73% at specificity of 90% (AUC = 87%). In comparison, the AUC for bone detection is 73% and 50% using the intensity approach and without density correction, respectively. Conclusion: The proposed unifying method provides accurate electron density estimation and bone detection based on multi-parametric MRI of the head with highly heterogeneous anatomy. This could allow for accurate dose calculation and reference image generation for patient setup in MRI-based radiation treatment planning.
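The voxel-wise Bayesian fusion described in the Methods can be illustrated with a discretized posterior; how the intensity and location conditionals are actually estimated from the multi-atlas data is beyond this sketch, and the distributions below are synthetic.

```python
import numpy as np

def posterior_mean_density(p_intensity, p_location, density_grid):
    """Fuse the two per-voxel conditional distributions on a common discrete
    grid of electron-density (HU) values and return the posterior mean."""
    post = p_intensity * p_location                 # Bayesian fusion (up to a constant)
    post /= post.sum(axis=-1, keepdims=True)        # normalize per voxel
    return post @ density_grid                      # posterior mean HU

# Illustrative single voxel: the intensity term favours soft tissue, while the
# spatial term (voxel near the skull in the reference anatomy) favours bone.
grid = np.linspace(-1000, 1500, 251)
p_int = np.exp(-0.5 * ((grid - 40) / 80) ** 2)
p_loc = 0.3 * np.exp(-0.5 * ((grid - 40) / 150) ** 2) \
      + 0.7 * np.exp(-0.5 * ((grid - 700) / 150) ** 2)
p_int /= p_int.sum()
p_loc /= p_loc.sum()
print(posterior_mean_density(p_int, p_loc, grid))
```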

  15. SU-G-JeP2-02: A Unifying Multi-Atlas Approach to Electron Density Mapping Using Multi-Parametric MRI for Radiation Treatment Planning

    International Nuclear Information System (INIS)

    Ren, S; Hara, W; Le, Q; Wang, L; Xing, L; Li, R

    2016-01-01

    Purpose: MRI has a number of advantages over CT as a primary modality for radiation treatment planning (RTP). However, one key bottleneck problem still remains, which is the lack of electron density information in MRI. In the work, a reliable method to map electron density is developed by leveraging the differential contrast of multi-parametric MRI. Methods: We propose a probabilistic Bayesian approach for electron density mapping based on T1 and T2-weighted MRI, using multiple patients as atlases. For each voxel, we compute two conditional probabilities: (1) electron density given its image intensity on T1 and T2-weighted MR images, and (2) electron density given its geometric location in a reference anatomy. The two sources of information (image intensity and spatial location) are combined into a unifying posterior probability density function using the Bayesian formalism. The mean value of the posterior probability density function provides the estimated electron density. Results: We evaluated the method on 10 head and neck patients and performed leave-one-out cross validation (9 patients as atlases and remaining 1 as test). The proposed method significantly reduced the errors in electron density estimation, with a mean absolute HU error of 138, compared with 193 for the T1-weighted intensity approach and 261 without density correction. For bone detection (HU>200), the proposed method had an accuracy of 84% and a sensitivity of 73% at specificity of 90% (AUC = 87%). In comparison, the AUC for bone detection is 73% and 50% using the intensity approach and without density correction, respectively. Conclusion: The proposed unifying method provides accurate electron density estimation and bone detection based on multi-parametric MRI of the head with highly heterogeneous anatomy. This could allow for accurate dose calculation and reference image generation for patient setup in MRI-based radiation treatment planning.

  16. Application of statistical parametric mapping in PET and SPECT brain functional imaging

    International Nuclear Information System (INIS)

    Guo Wanhua

    2002-01-01

    Region of interest (ROI) analysis is the method regularly used to analyze brain functional imaging. However, its obvious shortcomings, such as subjectivity and poor reproducibility, seriously limit precise analysis of brain function. Statistical parametric mapping (SPM) was therefore developed as automatic, voxel- or pixel-based analysis software to resolve this problem. Using numerous mathematical models, it can be used to statistically assess the whole brain pixel by pixel. The present review introduces its main principles, modular composition and practical applications. It can be concluded that, with the development of neuroscience, SPM software will be used more widely in related fields such as neurobiology, cognition and neuropharmacology

  17. Multi-level approach for parametric roll analysis

    Science.gov (United States)

    Kim, Taeyoung; Kim, Yonghwan

    2011-03-01

    The present study considers a multi-level approach for the analysis of parametric roll phenomena. Three kinds of computational methods, GM variation, impulse response function (IRF), and the Rankine panel method, are applied for the multi-level approach. The IRF and Rankine panel methods are based on a weakly nonlinear formulation which includes nonlinear Froude-Krylov and restoring forces. In computations of parametric roll occurrence tests in regular waves, the IRF and Rankine panel methods show similar tendencies. Although the GM variation approach predicts the occurrence of parametric roll at twice the roll natural frequency, its frequency criterion shows a slight difference. Nonlinear roll motion in bichromatic waves is also considered in this study. To prove the existence of unstable roll motion in bichromatic waves, theoretical and numerical approaches are applied. The occurrence of parametric roll is theoretically examined by introducing the quasi-periodic Mathieu equation. Instability criteria are well predicted from the stability analysis in the theoretical approach. From Fourier analysis, it has been verified that difference-frequency effects create the unstable roll motion. The occurrence of unstable roll motion in bichromatic waves is also observed in the experiment.
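The GM-variation route to parametric roll is often reduced to a damped Mathieu-type equation; the sketch below integrates such an equation with SciPy to show growth near the principal (twice-natural-frequency) resonance. The coefficients are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

def roll_response(delta, eps, omega_e, t_end=200.0):
    """Integrate a damped Mathieu-type roll equation
        phi'' + 2*mu*phi' + (delta + eps*cos(omega_e*t)) * phi = 0,
    a common single-degree-of-freedom surrogate for GM-variation
    parametric roll (coefficients here are illustrative)."""
    mu = 0.01
    def rhs(t, y):
        phi, dphi = y
        return [dphi, -2 * mu * dphi - (delta + eps * np.cos(omega_e * t)) * phi]
    sol = solve_ivp(rhs, (0.0, t_end), [0.01, 0.0], max_step=0.05)
    return sol.t, sol.y[0]

# Principal parametric resonance: encounter frequency near twice the roll
# natural frequency (delta = 1 -> natural frequency 1, omega_e = 2).
t, phi = roll_response(delta=1.0, eps=0.4, omega_e=2.0)
print("max roll amplitude:", np.max(np.abs(phi)))   # grows if unstable
```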

  18. Documenting the location of systematic transrectal ultrasound-guided prostate biopsies: correlation with multi-parametric MRI.

    Science.gov (United States)

    Turkbey, Baris; Xu, Sheng; Kruecker, Jochen; Locklin, Julia; Pang, Yuxi; Shah, Vijay; Bernardo, Marcelino; Baccala, Angelo; Rastinehad, Ardeshir; Benjamin, Compton; Merino, Maria J; Wood, Bradford J; Choyke, Peter L; Pinto, Peter A

    2011-03-29

    During transrectal ultrasound (TRUS)-guided prostate biopsies, the actual location of the biopsy site is rarely documented. Here, we demonstrate the capability of TRUS-magnetic resonance imaging (MRI) image fusion to document the biopsy site and correlate biopsy results with multi-parametric MRI findings. Fifty consecutive patients (median age 61 years) with a median prostate-specific antigen (PSA) level of 5.8 ng/ml underwent 12-core TRUS-guided biopsy of the prostate. Pre-procedural T2-weighted magnetic resonance images were fused to TRUS. A disposable needle guide with miniature tracking sensors was attached to the TRUS probe to enable fusion with MRI. Real-time TRUS images during biopsy and the corresponding tracking information were recorded. Each biopsy site was superimposed onto the MRI. Each biopsy site was classified as positive or negative for cancer based on the results of each MRI sequence. Sensitivity, specificity, and receiver operating curve (ROC) area under the curve (AUC) values were calculated for multi-parametric MRI. Gleason scores for each multi-parametric MRI pattern were also evaluated. Six hundred and five systematic biopsy cores were analyzed in 50 patients, of whom 20 patients had 56 positive cores. MRI identified 34 of 56 positive cores. Overall, sensitivity, specificity, and ROC area values for multi-parametric MRI were 0.607, 0.727, 0.667, respectively. TRUS-MRI fusion after biopsy can be used to document the location of each biopsy site, which can then be correlated with MRI findings. Based on correlation with tracked biopsies, T2-weighted MRI and apparent diffusion coefficient maps derived from diffusion-weighted MRI are the most sensitive sequences, whereas the addition of delayed contrast enhancement MRI and three-dimensional magnetic resonance spectroscopy demonstrated higher specificity consistent with results obtained using radical prostatectomy specimens.

  19. Comparison of Absolute Apparent Diffusion Coefficient (ADC) Values in ADC Maps Generated Across Different Postprocessing Software: Reproducibility in Endometrial Carcinoma.

    Science.gov (United States)

    Ghosh, Adarsh; Singh, Tulika; Singla, Veenu; Bagga, Rashmi; Khandelwal, Niranjan

    2017-12-01

    Apparent diffusion coefficient (ADC) maps are usually generated by built-in software provided by the MRI scanner vendors; however, various open-source postprocessing software packages are available for image manipulation and parametric map generation. The purpose of this study is to establish the reproducibility of absolute ADC values obtained using different postprocessing software programs. DW images with three b values were obtained with a 1.5-T MRI scanner, and the trace images were obtained. ADC maps were automatically generated by the in-line software provided by the vendor during image generation and were also separately generated on postprocessing software. These ADC maps were compared on the basis of ROIs using paired t test, Bland-Altman plot, mountain plot, and Passing-Bablok regression plot. There was a statistically significant difference in the mean ADC values obtained from the different postprocessing software programs when the same baseline trace DW images were used for the ADC map generation. To use ADC values as a quantitative cutoff for histologic characterization of tissues, standardization of the postprocessing algorithm is essential across processing software packages, especially in view of the implementation of vendor-neutral archiving.
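One source of cross-software ADC differences is the fitting convention itself. For reference, a generic pixel-wise mono-exponential log-linear ADC fit looks like the sketch below; it does not reproduce any particular vendor's or package's algorithm from the study.

```python
import numpy as np

def adc_map(dwi, b_values):
    """Pixel-wise mono-exponential ADC fit, S(b) = S0 * exp(-b * ADC), via
    log-linear least squares over all b-values. `dwi` has shape (n_b, H, W);
    the result is in mm^2/s if b is given in s/mm^2."""
    b = np.asarray(b_values, float)
    logs = np.log(np.clip(dwi, 1e-6, None)).reshape(len(b), -1)   # (n_b, H*W)
    design = np.stack([np.ones_like(b), -b], axis=1)              # [ln S0, ADC]
    coeffs, *_ = np.linalg.lstsq(design, logs, rcond=None)
    return coeffs[1].reshape(dwi.shape[1:])

# Synthetic 3-b-value example (b = 0, 500, 1000 s/mm^2)
b_vals = [0, 500, 1000]
true_adc = 1.2e-3
signal = np.stack([1000 * np.exp(-b * true_adc) * np.ones((2, 2)) for b in b_vals])
print(adc_map(signal, b_vals))   # ~1.2e-3 everywhere
```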

  20. Analysis of family-wise error rates in statistical parametric mapping using random field theory.

    Science.gov (United States)

    Flandin, Guillaume; Friston, Karl J

    2017-11-01

    This technical report revisits the analysis of family-wise error rates in statistical parametric mapping - using random field theory - reported in Eklund et al. (arXiv 1511.01863). Contrary to the understandable spin that these sorts of analyses attract, a review of their results suggests that they endorse the use of parametric assumptions - and random field theory - in the analysis of functional neuroimaging data. We briefly rehearse the advantages parametric analyses offer over nonparametric alternatives and then unpack the implications of Eklund et al. (arXiv 1511.01863) for parametric procedures. Hum Brain Mapp, 2017. © 2017 The Authors. Human Brain Mapping published by Wiley Periodicals, Inc.

  1. Parametric perturbations and suppression of chaos in n-dimensional maps

    International Nuclear Information System (INIS)

    Loskutov, A.Y.; Rybalko, S.D.

    1994-11-01

    The problem of a qualitative change in the dynamics of n-dimensional chaotic maps under the influence of parametric perturbations is considered. We prove that for certain maps - the quadratic map family, a piecewise linear map family, and a two-dimensional map having a hyperbolic attractor - there are perturbations which lead to suppression of chaos. Arguments are given that for such maps the set of parameter values corresponding to ordered behaviour has positive Lebesgue measure. (author). 36 refs, 12 figs
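A quick numerical way to see chaos suppression of this kind is to compute the Lyapunov exponent of the logistic map while cycling its parameter; the perturbed parameter values below are illustrative candidates, not those analyzed in the paper.

```python
import numpy as np

def lyapunov_logistic(a_cycle, n_iter=20000, n_burn=1000, x0=0.3):
    """Largest Lyapunov exponent of x_{n+1} = a_n x_n (1 - x_n) when the
    parameter is perturbed periodically through the cyclic list `a_cycle`.
    A negative value indicates that the perturbation has suppressed chaos."""
    x = x0
    k = len(a_cycle)
    for n in range(n_burn):                      # discard the transient
        x = a_cycle[n % k] * x * (1.0 - x)
    s = 0.0
    for n in range(n_iter):
        a = a_cycle[n % k]
        s += np.log(max(abs(a * (1.0 - 2.0 * x)), 1e-300))
        x = a * x * (1.0 - x)
    return s / n_iter

print(lyapunov_logistic([3.8]))          # unperturbed map: positive (chaotic)
print(lyapunov_logistic([3.8, 3.62]))    # candidate period-2 perturbation to test
```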

  2. Software patterns, knowledge maps, and domain analysis

    CERN Document Server

    Fayad, Mohamed E; Hegde, Srikanth GK; Basia, Anshu; Vakil, Ashka

    2014-01-01

    Preface; Acknowledgments; Authors; INTRODUCTION; An Overview of Knowledge Maps; Introduction: Key Concepts - Software Stable Models, Knowledge Maps, Pattern Language, Goals, Capabilities (Enduring Business Themes + Business Objects); The Motivation; The Problem; The Objectives; Overview of Software Stability Concepts; Overview of Knowledge Maps; Pattern Languages versus Knowledge Maps: A Brief Comparison; The Solution; Knowledge Maps Methodology or Concurrent Software Development Model; Why Knowledge Maps?; Research Methodology Undertaken; Research Verification and Validation; The Stratification of This Book; Summary

  3. Design of parametric software tools

    DEFF Research Database (Denmark)

    Sabra, Jakob Borrits; Mullins, Michael

    2011-01-01

    The studies investigate the field of evidence-based design used in architectural design practice and propose a method using 2D/3D CAD applications to: 1) enhance integration of evidence-based design knowledge in architectural design phases with a focus on lighting and interior design and 2) assess fulfilment of evidence-based design criteria regarding light distribution and location in relation to patient safety in architectural health care design proposals. The study uses the 2D/3D CAD modelling software Rhinoceros 3D with the plug-in Grasshopper to create parametric tool prototypes to exemplify the operations and functions of the design method. To evaluate the prototype potentials, surveys with architectural and healthcare design companies are conducted. Evaluation is done by the administration of questionnaires as part of the development of the tools. The results show that architects, designers...

  4. A parametric visualization software for the assignment problem

    Directory of Open Access Journals (Sweden)

    Papamanthou Charalampos

    2005-01-01

    In this paper we present a parametric visualization software package used to assist the teaching of the Network Primal Simplex Algorithm for the assignment problem (AP). The assignment problem is a special case of the balanced transportation problem. The main functions of the algorithm and its design techniques are also presented. Through this process, we aim to underline the importance and necessity of using such educational methods in order to improve the teaching of computer algorithms.
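For context, solving a small assignment problem instance takes only a few lines; SciPy's Hungarian-style linear_sum_assignment is used here as a stand-in for the network primal simplex algorithm that the teaching software visualizes, and the cost matrix is made up.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# A small balanced assignment instance: cost[i, j] is the cost of assigning
# worker i to task j. Any optimal-assignment solver returns the same optimum.
cost = np.array([[4, 1, 3],
                 [2, 0, 5],
                 [3, 2, 2]])
rows, cols = linear_sum_assignment(cost)
print(list(zip(rows, cols)), "total cost:", cost[rows, cols].sum())
```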

  5. Multi-parametric variational data assimilation for hydrological forecasting

    Science.gov (United States)

    Alvarado-Montero, R.; Schwanenberg, D.; Krahe, P.; Helmke, P.; Klein, B.

    2017-12-01

    Ensemble forecasting is increasingly applied in flow forecasting systems to provide users with a better understanding of forecast uncertainty and consequently to take better-informed decisions. A common practice in probabilistic streamflow forecasting is to force deterministic hydrological model with an ensemble of numerical weather predictions. This approach aims at the representation of meteorological uncertainty but neglects uncertainty of the hydrological model as well as its initial conditions. Complementary approaches use probabilistic data assimilation techniques to receive a variety of initial states or represent model uncertainty by model pools instead of single deterministic models. This paper introduces a novel approach that extends a variational data assimilation based on Moving Horizon Estimation to enable the assimilation of observations into multi-parametric model pools. It results in a probabilistic estimate of initial model states that takes into account the parametric model uncertainty in the data assimilation. The assimilation technique is applied to the uppermost area of River Main in Germany. We use different parametric pools, each of them with five parameter sets, to assimilate streamflow data, as well as remotely sensed data from the H-SAF project. We assess the impact of the assimilation in the lead time performance of perfect forecasts (i.e. observed data as forcing variables) as well as deterministic and probabilistic forecasts from ECMWF. The multi-parametric assimilation shows an improvement of up to 23% for CRPS performance and approximately 20% in Brier Skill Scores with respect to the deterministic approach. It also improves the skill of the forecast in terms of rank histogram and produces a narrower ensemble spread.
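The CRPS figure quoted above is an ensemble-forecast score that is easy to estimate from samples; the sketch below shows the standard sample-based estimator with made-up numbers, not data from the River Main application.

```python
import numpy as np

def crps_ensemble(members, obs):
    """Sample-based CRPS for one ensemble forecast and one observation:
    CRPS = E|X - y| - 0.5 * E|X - X'|, estimated from the members.
    Lower is better; skill comparisons average this score over many
    forecast-observation pairs."""
    members = np.asarray(members, float)
    term1 = np.mean(np.abs(members - obs))
    term2 = 0.5 * np.mean(np.abs(members[:, None] - members[None, :]))
    return term1 - term2

print(crps_ensemble([9.8, 10.4, 11.0, 10.1, 9.6], obs=10.2))
```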

  6. Computer-aided diagnosis of prostate cancer using multi-parametric MRI: comparison between PUN and Tofts models

    Science.gov (United States)

    Mazzetti, S.; Giannini, V.; Russo, F.; Regge, D.

    2018-05-01

    Computer-aided diagnosis (CAD) systems are increasingly being used in clinical settings to report multi-parametric magnetic resonance imaging (mp-MRI) of the prostate. Usually, CAD systems automatically highlight cancer-suspicious regions to the radiologist, reducing reader variability and interpretation errors. Nevertheless, implementing this software requires the selection of which mp-MRI parameters can best discriminate between malignant and non-malignant regions. To exploit functional information, some parameters are derived from dynamic contrast-enhanced (DCE) acquisitions. In particular, much CAD software employs pharmacokinetic features, such as K trans and k ep, derived from the Tofts model, to estimate a likelihood map of malignancy. However, non-pharmacokinetic models can be also used to describe DCE-MRI curves, without any requirement for prior knowledge or measurement of the arterial input function, which could potentially lead to large errors in parameter estimation. In this work, we implemented an empirical function derived from the phenomenological universalities (PUN) class to fit DCE-MRI. The parameters of the PUN model are used in combination with T2-weighted and diffusion-weighted acquisitions to feed a support vector machine classifier to produce a voxel-wise malignancy likelihood map of the prostate. The results were all compared to those for a CAD system based on Tofts pharmacokinetic features to describe DCE-MRI curves, using different quality aspects of image segmentation, while also evaluating the number and size of false positive (FP) candidate regions. This study included 61 patients with 70 biopsy-proven prostate cancers (PCa). The metrics used to evaluate segmentation quality between the two CAD systems were not statistically different, although the PUN-based CAD reported a lower number of FP, with reduced size compared to the Tofts-based CAD. In conclusion, the CAD software based on PUN parameters is a feasible means with which to
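For reference, fitting the standard Tofts model that provides the K trans and k ep features mentioned above can be sketched with SciPy's curve_fit; the arterial input function and tissue curve below are synthetic, and the PUN-class function used by the authors is not reproduced here.

```python
import numpy as np
from scipy.optimize import curve_fit

def tofts(t, ktrans, kep, cp):
    """Standard Tofts model Ct(t) = Ktrans * int_0^t Cp(u) exp(-kep (t-u)) du,
    evaluated by discrete convolution on a uniform time grid."""
    dt = t[1] - t[0]
    kernel = np.exp(-kep * t)
    return ktrans * np.convolve(cp, kernel)[:len(t)] * dt

# Illustrative fit of one DCE-MRI voxel curve with a toy AIF
t = np.linspace(0, 5, 100)                       # minutes
cp = 5.0 * t * np.exp(-2.0 * t)                  # synthetic arterial input function
ct = tofts(t, 0.25, 0.6, cp) + np.random.default_rng(0).normal(0, 0.002, t.size)
popt, _ = curve_fit(lambda tt, kt, kp: tofts(tt, kt, kp, cp), t, ct, p0=[0.1, 0.5])
print("Ktrans, kep:", popt)
```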

  7. Variability in Multi-Tenant Enterprise Software

    OpenAIRE

    Kabbedijk, J.

    2014-01-01

    Enterprise software applications have changed significantly over the last decades. Increasingly, software is deployed in a central location to be accessed through the internet, instead of installing software at end-users. Having software in a central location enables multi-tenancy, where multiple customers transparently share a system’s resources. Currently, multi-tenancy is a popular way to offer functionality of a software product through the internet to numerous customers, offering many ad...

  8. Generating Multi-Destination Maps.

    Science.gov (United States)

    Zhang, Junsong; Fan, Jiepeng; Luo, Zhenshan

    2017-08-01

    Multi-destination maps are a kind of navigation map intended to guide visitors to multiple destinations within a region, and they can be of great help to urban visitors. However, they have not been developed in current online map services. To address this issue, we introduce a novel layout model designed especially for generating multi-destination maps, which considers the global and local layout of a multi-destination map. We model the layout problem as a graph drawing that satisfies a set of hard and soft constraints. In the global layout phase, we balance the scale factor between ROIs. In the local layout phase, we make all edges have good visibility and optimize the map layout to preserve the relative length and angle of roads. We also propose a perturbation-based optimization method to find an optimal layout in the complex solution space. The multi-destination maps generated by our system are potentially feasible on modern mobile devices, and our results can show an overview and a detail view of the whole map at the same time. In addition, we perform a user study to evaluate the effectiveness of our method, and the results show that the multi-destination maps achieve our goals well.
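The perturbation-based optimization mentioned above can be illustrated with a toy accept-if-better jitter of node positions under a simple edge-length energy; the real system's visibility, angle and scale-factor constraints are omitted and all numbers are made up.

```python
import numpy as np

def perturb_layout(pos, edges, target_len, n_iter=5000, step=0.05, seed=0):
    """Toy perturbation-based layout optimization: repeatedly jitter one node
    and keep the move if it lowers an energy penalizing deviation of edge
    lengths from their target (relative) lengths."""
    rng = np.random.default_rng(seed)
    pos = pos.copy()

    def energy(p):
        d = np.linalg.norm(p[edges[:, 0]] - p[edges[:, 1]], axis=1)
        return np.sum((d - target_len) ** 2)

    e = energy(pos)
    for _ in range(n_iter):
        i = rng.integers(len(pos))
        trial = pos.copy()
        trial[i] += rng.normal(0, step, 2)
        e_trial = energy(trial)
        if e_trial < e:
            pos, e = trial, e_trial
    return pos

nodes = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], float)
edges = np.array([[0, 1], [1, 2], [2, 3], [3, 0]])
print(perturb_layout(nodes, edges, target_len=np.array([1.0, 0.5, 1.0, 0.5])))
```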

  9. Breast-Lesion Characterization using Textural Features of Quantitative Ultrasound Parametric Maps.

    Science.gov (United States)

    Sadeghi-Naini, Ali; Suraweera, Harini; Tran, William Tyler; Hadizad, Farnoosh; Bruni, Giancarlo; Rastegar, Rashin Fallah; Curpen, Belinda; Czarnota, Gregory J

    2017-10-20

    This study evaluated, for the first time, the efficacy of quantitative ultrasound (QUS) spectral parametric maps in conjunction with texture-analysis techniques to non-invasively differentiate benign versus malignant breast lesions. Ultrasound B-mode images and radiofrequency data were acquired from 78 patients with suspicious breast lesions. QUS spectral-analysis techniques were performed on the radiofrequency data to generate parametric maps of mid-band fit, spectral slope, spectral intercept, spacing among scatterers, average scatterer diameter, and average acoustic concentration. Texture-analysis techniques were applied to determine imaging biomarkers consisting of mean, contrast, correlation, energy and homogeneity features of the parametric maps. These biomarkers were utilized to classify benign versus malignant lesions with leave-one-patient-out cross-validation. Results were compared to histopathology findings from biopsy specimens and radiology reports on MR images to evaluate the accuracy of the technique. Among the biomarkers investigated, one mean-value parameter and 14 textural features demonstrated statistically significant differences (p < 0.05). A feature selection method could classify the lesions with a sensitivity of 96%, a specificity of 84%, and an AUC of 0.97. Findings from this study pave the way towards adapting novel QUS-based frameworks for breast cancer screening and rapid diagnosis in the clinic.
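The texture features listed above (contrast, correlation, energy, homogeneity) are standard GLCM properties; a generic sketch using scikit-image follows, applied to a synthetic map, and it is not the authors' processing pipeline.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def texture_biomarkers(parametric_map, levels=32):
    """Mean value plus GLCM contrast, correlation, energy and homogeneity of a
    QUS parametric map (e.g. mid-band fit). The map is rescaled to `levels`
    gray levels before computing the co-occurrence matrix."""
    m = parametric_map.astype(float)
    q = np.round((m - m.min()) / (np.ptp(m) + 1e-12) * (levels - 1)).astype(np.uint8)
    glcm = graycomatrix(q, distances=[1], angles=[0, np.pi / 2],
                        levels=levels, symmetric=True, normed=True)
    feats = {"mean": m.mean()}
    for prop in ("contrast", "correlation", "energy", "homogeneity"):
        feats[prop] = graycoprops(glcm, prop).mean()
    return feats

print(texture_biomarkers(np.random.default_rng(0).normal(size=(64, 64))))
```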

  10. Mapping social networks in software process improvement

    DEFF Research Database (Denmark)

    Tjørnehøj, Gitte; Nielsen, Peter Axel

    2005-01-01

    Software process improvement in small, agile organizations is often problematic. Model-based approaches seem to overlook problems. We have been seeking an alternative approach to overcome this through action research. Here we report on a piece of action research from which we developed an approach to map social networks and suggest how it can be used in software process improvement. We applied the mapping approach in a small software company to support the realization of new ways of improving software processes. The mapping approach was found useful in improving social networks, and thus furthers software process improvement.

  11. Application of computer-generated functional (parametric) maps in radionuclide renography

    International Nuclear Information System (INIS)

    Agress, H. Jr.; Levenson, S.M.; Gelfand, M.J.; Green, M.V.; Bailey, J.J.; Johnston, G.S.

    1975-01-01

    A functional (parametric) map is a single visual display of regional dynamic phenomena which facilitates interpretation of the nature of focal abnormalities in renal function. Methods for producing several kinds of functional maps based on computer calculations of radionuclide scan data are briefly described. Three abnormal cases are presented to illustrate the use of functional maps to separate focal lesions and to specify the dynamic nature of the abnormalities in a way which is difficult to achieve with conventional sequential renal scans and renograms alone

  12. IDAS, software support for mathematical models and map-based graphics

    International Nuclear Information System (INIS)

    Birnbaum, M.D.; Wecker, D.B.

    1984-01-01

    IDAS (Intermediate Dose Assessment System) was developed for the U.S. Nuclear Regulatory Commission as a hardware/software host for radiological models and display of map-based plume graphics at the Operations Center (HQ), regional incident response centers, and site emergency facilities. IDAS design goals acknowledged the likelihood of future changes in the suite of models and the composition of map features for analysis and graphical display. IDAS provides a generalized software support environment to programmers and users of modeling programs. A database manager process provides multi-user access control to all input and output data for modeling programs. A programmer-created data description file (schema) specifies data field names, data types, legal and recommended ranges, default values, preferred units of measurement, and ''help'' text. Subroutine calls to IDAS from a model program invoke a consistent user interface which can show any of the schema contents, convert units of measurement, and route data to multiple logical devices, including the database. A stand-alone data editor allows the user to read and write model data records without execution of a model. IDAS stores digitized map features in a 4-level naming hierarchy. A user can select the map icon, color, and whether to show a stored name tag, for each map feature. The user also selects image scale (zoom) within limits set by map digitization. The resulting image combines static map information, computed analytic modeling results, and the user's feature selections for display to decision-makers

  13. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    International Nuclear Information System (INIS)

    Messroghli, Daniel R; Rudolph, Andre; Abdel-Aty, Hassan; Wassmuth, Ralf; Kühne, Titus; Dietz, Rainer; Schulz-Menger, Jeanette

    2010-01-01

    In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different sorts of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet
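As an illustration of the pixel-wise fitting such a tool automates, a generic log-linear mono-exponential T2/T2* fit over a multi-echo series is sketched below; MRmap's own fitting options (e.g. Look-Locker T1) are not reproduced.

```python
import numpy as np

def t2_map(echoes, echo_times):
    """Pixel-wise mono-exponential T2 (or T2*) fit, S(TE) = S0 * exp(-TE/T2),
    via log-linear least squares over a multi-echo series. `echoes` has shape
    (n_echo, H, W); TE and the returned T2 share the same time unit."""
    te = np.asarray(echo_times, float)
    logs = np.log(np.clip(echoes, 1e-6, None)).reshape(len(te), -1)
    design = np.stack([np.ones_like(te), -te], axis=1)        # [ln S0, 1/T2]
    coeffs, *_ = np.linalg.lstsq(design, logs, rcond=None)
    r2 = np.clip(coeffs[1], 1e-6, None)                       # decay rate 1/T2
    return (1.0 / r2).reshape(echoes.shape[1:])

te = [10.0, 30.0, 50.0, 70.0]                                  # ms
signal = np.stack([800 * np.exp(-t / 45.0) * np.ones((2, 2)) for t in te])
print(t2_map(signal, te))                                      # ~45 ms
```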

  14. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    Directory of Open Access Journals (Sweden)

    Kühne Titus

    2010-07-01

    Background: In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. Results: After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different sorts of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. Conclusions: MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet.

  15. Toward uniform implementation of parametric map Digital Imaging and Communication in Medicine standard in multisite quantitative diffusion imaging studies.

    Science.gov (United States)

    Malyarenko, Dariya; Fedorov, Andriy; Bell, Laura; Prah, Melissa; Hectors, Stefanie; Arlinghaus, Lori; Muzi, Mark; Solaiyappan, Meiyappan; Jacobs, Michael; Fung, Maggie; Shukla-Dave, Amita; McManus, Kevin; Boss, Michael; Taouli, Bachir; Yankeelov, Thomas E; Quarles, Christopher Chad; Schmainda, Kathleen; Chenevert, Thomas L; Newitt, David C

    2018-01-01

    This paper reports on results of a multisite collaborative project launched by the MRI subgroup of Quantitative Imaging Network to assess current capability and provide future guidelines for generating a standard parametric diffusion map Digital Imaging and Communication in Medicine (DICOM) in clinical trials that utilize quantitative diffusion-weighted imaging (DWI). Participating sites used a multivendor DWI DICOM dataset of a single phantom to generate parametric maps (PMs) of the apparent diffusion coefficient (ADC) based on two models. The results were evaluated for numerical consistency among models and true phantom ADC values, as well as for consistency of metadata with attributes required by the DICOM standards. This analysis identified missing metadata descriptive of the sources for detected numerical discrepancies among ADC models. Instead of the DICOM PM object, all sites stored ADC maps as DICOM MR objects, generally lacking designated attributes and coded terms for quantitative DWI modeling. Source-image reference, model parameters, ADC units and scale, deemed important for numerical consistency, were either missing or stored using nonstandard conventions. Guided by the identified limitations, the DICOM PM standard has been amended to include coded terms for the relevant diffusion models. Open-source software has been developed to support conversion of site-specific formats into the standard representation.
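A small, assumption-laden sketch of the kind of metadata check motivated by this work: it uses pydicom only to verify that a stored ADC map carries a few attributes the analysis flagged as important. The keyword list is an illustrative subset, not a normative one, and the sketch does not implement the amended DICOM Parametric Map object.

```python
import pydicom

# Illustrative subset of attributes relevant to reproducing ADC values from a
# stored map (scaling, provenance); the full requirements live in the DICOM
# Parametric Map standard and the project's guidelines.
REQUIRED_KEYWORDS = [
    "Modality",
    "SeriesDescription",
    "RescaleSlope",             # scale applied to stored pixel values
    "RescaleIntercept",
    "ReferencedImageSequence",  # link back to the source DWI images
]

def check_parametric_map_metadata(path):
    """Report which of the expected attributes are missing from a DICOM file."""
    ds = pydicom.dcmread(path)
    missing = [kw for kw in REQUIRED_KEYWORDS if not hasattr(ds, kw)]
    for kw in missing:
        print(f"missing attribute: {kw}")
    return not missing

# check_parametric_map_metadata("adc_map.dcm")
```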

  16. Multi-pulse orbits and chaotic dynamics in motion of parametrically excited viscoelastic moving belt

    International Nuclear Information System (INIS)

    Zhang Wei; Yao Minghui

    2006-01-01

    In this paper, the Shilnikov-type multi-pulse orbits and chaotic dynamics of a parametrically excited viscoelastic moving belt are studied in detail. Using a Kelvin-type viscoelastic constitutive law, the equations of motion for the viscoelastic moving belt with external damping and parametric excitation are given. The four-dimensional averaged equation for the case of primary parametric resonance is obtained by directly applying the method of multiple scales and Galerkin's approach to the partial differential governing equation of the viscoelastic moving belt. From the averaged equations obtained here, the theory of normal forms is used to give the explicit expressions of the normal form with a double zero and a pair of pure imaginary eigenvalues. Based on the normal form, the energy-phase method is employed to analyze the global bifurcations and chaotic dynamics of the parametrically excited viscoelastic moving belt. The global bifurcation analysis indicates that there exist heteroclinic bifurcations and Shilnikov-type multi-pulse homoclinic orbits in the averaged equation. The results obtained above imply the existence of chaos in the sense of the Smale horseshoe for the parametrically excited viscoelastic moving belt. The chaotic motions of viscoelastic moving belts are also found by numerical simulation. A new phenomenon of multi-pulse jumping orbits is observed in the three-dimensional phase space

  17. Variability in Multi-Tenant Enterprise Software

    NARCIS (Netherlands)

    Kabbedijk, J.

    2014-01-01

    Enterprise software applications have changed significantly over the last decades. Increasingly, software is deployed in a central location to be accessed through the internet, instead of installing software at end-users. Having software in a central location enables multi-tenancy, where multiple

  18. Multi-parametric MRI findings of granulomatous prostatitis developing after intravesical Bacillus Calmette-Guérin therapy.

    Science.gov (United States)

    Gottlieb, Josh; Princenthal, Robert; Cohen, Martin I

    2017-07-01

    To evaluate the multi-parametric MRI (mpMRI) findings in patients with biopsy-proven granulomatous prostatitis and prior Bacillus Calmette-Guérin (BCG) exposure. MRI was performed in six patients with pathologically proven granulomatous prostatitis and a prior history of bladder cancer treated with intravesical BCG therapy. Multi-parametric prostate MRI images were recorded on a GE 750W or Philips Achieva 3.0 Tesla MRI scanner with high-resolution, small-field-of-view imaging consisting of axial T2, axial T1, coronal T2, sagittal T2, axial multiple b-value diffusion (multiple values up to 1200 or 1400), and dynamic contrast-enhanced 3D axial T1 with fat suppression sequence. Two different patterns of MR findings were observed. Five of the six patients had a low mean ADC value consistent with granulomatous prostatitis. The other pattern seen in one of the six patients was decreased signal on the ADC map images with increased signal on the high-b-value sequence, revealing true restricted diffusion indistinguishable from aggressive prostate cancer. This patient had biopsy-confirmed acute BCG prostatitis. Our study suggests that patients with known BCG exposure and PI-RADS v2 scores ≤3, showing similar mpMRI findings as demonstrated, may not require prostate biopsy.

  19. Massively multi-parametric immunoassays using ICPMS

    International Nuclear Information System (INIS)

    Tanner, S.D.; Ornatsky, O.; Bandura, D.R.; Baranov, V.I.

    2009-01-01

    The use of stable isotopes as tags in immunoassays, and their determination by ICPMS, is poised to have a huge impact on multi-parametric bioanalysis. A new technology, which we term 'mass cytometry', enables high throughput, highly multiplexed individual cell analysis. Preliminary results for T-cell immunophenotyping in peripheral blood mononuclear cells (PBMC), agonist influence on concomitant phosphorylation pathways, and sub-classification of acute myeloid leukemia patients' samples will be presented. The significance of individual cell analysis is demonstrated by the identification of populations of rogue cells in PBMC samples through the use of multidimensional neural network cluster analysis. (author)

  20. Optimizing a machine learning based glioma grading system using multi-parametric MRI histogram and texture features.

    Science.gov (United States)

    Zhang, Xin; Yan, Lin-Feng; Hu, Yu-Chuan; Li, Gang; Yang, Yang; Han, Yu; Sun, Ying-Zhi; Liu, Zhi-Cheng; Tian, Qiang; Han, Zi-Yang; Liu, Le-De; Hu, Bin-Quan; Qiu, Zi-Yu; Wang, Wen; Cui, Guang-Bin

    2017-07-18

    Current machine learning techniques provide the opportunity to develop noninvasive and automated glioma grading tools, by utilizing quantitative parameters derived from multi-modal magnetic resonance imaging (MRI) data. However, the efficacies of different machine learning methods in glioma grading have not been investigated. A comprehensive comparison of varied machine learning methods in differentiating low-grade gliomas (LGGs) and high-grade gliomas (HGGs) as well as WHO grade II, III and IV gliomas based on multi-parametric MRI images was proposed in the current study. The parametric histogram and image texture attributes of 120 glioma patients were extracted from the perfusion, diffusion and permeability parametric maps of preoperative MRI. Then, 25 commonly used machine learning classifiers combined with 8 independent attribute selection methods were applied and evaluated using a leave-one-out cross validation (LOOCV) strategy. Besides, the influences of parameter selection on the classifying performances were investigated. We found that support vector machine (SVM) exhibited superior performance to other classifiers. By combining all tumor attributes with synthetic minority over-sampling technique (SMOTE), the highest classifying accuracy of 0.945 or 0.961 for LGG and HGG or grade II, III and IV gliomas was achieved. Application of the Recursive Feature Elimination (RFE) attribute selection strategy further improved the classifying accuracies. Besides, the performances of the LibSVM, SMO, and IBk classifiers were influenced by some key parameters such as kernel type, c, gamma, K, etc. SVM is a promising tool in developing an automated preoperative glioma grading system, especially when combined with the RFE strategy. Model parameters should be considered in glioma grading model optimization.
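A minimal scikit-learn sketch of the SVM-plus-RFE pipeline evaluated with leave-one-out cross validation, on synthetic placeholder features; SMOTE over-sampling (from imbalanced-learn) and the paper's 25-classifier comparison are omitted.

```python
import numpy as np
from sklearn.feature_selection import RFE
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder feature matrix: histogram/texture attributes extracted from
# perfusion, diffusion and permeability maps (columns), one row per patient.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 40))
y = rng.integers(0, 2, size=60)          # 0 = LGG, 1 = HGG (synthetic labels)

# Linear-kernel SVM wrapped in RFE attribute selection, evaluated with
# leave-one-out cross validation as in the study.
clf = make_pipeline(StandardScaler(),
                    RFE(SVC(kernel="linear", C=1.0), n_features_to_select=10),
                    SVC(kernel="linear", C=1.0))
acc = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
print("LOOCV accuracy:", acc)
```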

  1. The parametric open-plus-closed-loop control of chaotic maps and its robustness

    International Nuclear Information System (INIS)

    Chen Liqun

    2004-01-01

    This paper proposes a parametric open-plus-closed-loop control approach to controlling chaos. The logistic map is treated as an example to demonstrate the application of the proposed approach. It is proved that the approach is robust to the model error. Its relations to the open-plus-closed-loop control and the parametric entrainment control are discussed

  2. Ordered and isomorphic mapping of periodic structures in the parametrically forced logistic map

    Energy Technology Data Exchange (ETDEWEB)

    Maranhão, Dariel M., E-mail: dariel@ifsp.edu.br [Departamento de Ciências e Matemática, Instituto Federal de Educação, Ciência e Tecnologia de São Paulo, São Paulo (Brazil); Diretoria de Informática, Universidade Nove de Julho, São Paulo (Brazil)

    2016-09-23

    Highlights: • A direct description of the internal structure of a periodic window in terms of winding numbers is proposed. • Periodic structures in parameter spaces are mapped in a recurrent and isomorphic way. • Sequences of winding numbers show global and local organization of periodic domains. - Abstract: We investigate the periodic domains found in the parametrically forced logistic map, i.e. the classical logistic map whose control parameter changes dynamically. Phase diagrams in two-parameter spaces reveal intricate periodic structures composed of patterns of intersecting superstable orbit curves, defining the cell of a periodic window. Cells appear multifoliated and ordered, and they are isomorphically mapped when one changes the map parameters. Also, we identify the characteristics of the simplest cell and apply them to other, more complex cells, discussing how the topography of the parameter space is affected. By use of the winding number as defined in periodically forced oscillators, we show that the hierarchical organization of the periodic domains is manifested on global and local scales.
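
    A minimal sketch of the map under study may help: the classical logistic map is iterated while its control parameter is varied dynamically. The sinusoidal forcing, parameter values and the crude Lyapunov estimate below are illustrative assumptions, not the settings used in the cited work.

```python
# Minimal sketch of a parametrically forced logistic map: r_n varies while
# the map is iterated; a rough Lyapunov exponent distinguishes periodic from
# chaotic behaviour. Forcing form and values are illustrative assumptions.
import numpy as np

def forced_logistic(r0, amp, period, x0=0.4, n_iter=5000, n_transient=1000):
    x = x0
    lyap_sum = 0.0
    for n in range(n_iter):
        r = r0 + amp * np.sin(2.0 * np.pi * n / period)   # dynamic control parameter
        if n >= n_transient:
            lyap_sum += np.log(abs(r * (1.0 - 2.0 * x)) + 1e-300)  # |df/dx| at x_n
        x = r * x * (1.0 - x)
    return lyap_sum / (n_iter - n_transient)              # crude Lyapunov estimate

# negative values suggest a periodic domain, positive values chaos
for amp in (0.0, 0.05, 0.2):
    print(amp, forced_logistic(r0=3.6, amp=amp, period=7))
```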

  3. Non-parametric estimation of the individual's utility map

    OpenAIRE

    Noguchi, Takao; Sanborn, Adam N.; Stewart, Neil

    2013-01-01

    Models of risky choice have attracted much attention in behavioural economics. Previous research has repeatedly demonstrated that individuals' choices are not well explained by expected utility theory, and a number of alternative models have been examined using carefully selected sets of choice alternatives. The model performance, however, can depend on which choice alternatives are being tested. Here we develop a non-parametric method for estimating the utility map over the wide range of choi...

  4. Multi-Level Formation of Complex Software Systems

    Directory of Open Access Journals (Sweden)

    Hui Li

    2016-05-01

    Full Text Available We present a multi-level formation model for complex software systems. Previous works abstract software systems into software networks for further study, but usually investigate the software networks at the class level. In contrast to these works, our treatment of software systems as multi-level networks is more realistic. In particular, the software networks are organized by three levels of granularity, which reflects the modularity and hierarchy in the formation process of real-world software systems. More importantly, simulations based on this model have generated more realistic structural properties of software networks, such as power laws, clustering and modularization. On the basis of this model, how the structure of software systems affects software design principles is then explored, and it could be helpful for understanding software evolution and software engineering practices.

  5. ABMapper: a suffix array-based tool for multi-location searching and splice-junction mapping.

    Science.gov (United States)

    Lou, Shao-Ke; Ni, Bing; Lo, Leung-Yau; Tsui, Stephen Kwok-Wing; Chan, Ting-Fung; Leung, Kwong-Sak

    2011-02-01

    Sequencing reads generated by RNA-sequencing (RNA-seq) must first be mapped back to the genome through alignment before they can be further analyzed. Current fast and memory-saving short-read mappers could give us a quick view of the transcriptome. However, they are neither designed for reads that span across splice junctions nor for repetitive reads, which can be mapped to multiple locations in the genome (multi-reads). Here, we describe a new software package: ABMapper, which is specifically designed for exploring all putative locations of reads that are mapped to splice junctions or repetitive in nature. The software is freely available at: http://abmapper.sourceforge.net/. The software is written in C++ and PERL. It runs on all major platforms and operating systems including Windows, Mac OS X and LINUX.
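
    A toy sketch of the underlying data structure: a suffix array of the reference lets a read be located at every matching genomic position by binary search. This is didactic only and does not reproduce ABMapper's handling of splice junctions or mismatches.

```python
# Toy illustration of multi-location searching with a suffix array: build the
# array for a reference sequence and report every exact occurrence of a read.
from bisect import bisect_left, bisect_right

def build_suffix_array(ref):
    # O(n^2 log n) construction; fine for short references in a sketch
    return sorted(range(len(ref)), key=lambda i: ref[i:])

def find_all(ref, sa, read):
    prefixes = [ref[i:i + len(read)] for i in sa]   # read-length prefixes of sorted suffixes
    lo = bisect_left(prefixes, read)
    hi = bisect_right(prefixes, read)
    return sorted(sa[lo:hi])                        # all start positions in the reference

ref = "ACGTACGTTACGT"
sa = build_suffix_array(ref)
print(find_all(ref, sa, "ACGT"))   # the read maps to multiple locations: [0, 4, 9]
```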

  6. Multi-moment maps

    DEFF Research Database (Denmark)

    Swann, Andrew Francis; Madsen, Thomas Bruun

    2012-01-01

    We introduce a notion of moment map adapted to actions of Lie groups that preserve a closed three-form. We show existence of our multi-moment maps in many circumstances, including mild topological assumptions on the underlying manifold. Such maps are also shown to exist for all groups whose second...

  7. A novel approach to study effects of asymmetric stiffness on parametric instabilities of multi-rotor-system

    Science.gov (United States)

    Jain, Anuj Kumar; Rastogi, Vikas; Agrawal, Atul Kumar

    2018-01-01

    The main focus of this paper is to study the effects of asymmetric stiffness on parametric instabilities of a multi-rotor system through an extended Lagrangian formalism, where symmetries are broken in terms of the rotor stiffness. A complete insight into the dynamic behaviour of the multi-rotor system with asymmetries is obtained through an extension of the Lagrangian equation, illustrated with a case study. In this work, a dynamic mathematical model of the multi-rotor system is developed through a novel extension of Lagrangian mechanics, where the system has asymmetries due to varying stiffness. The amplitude and the natural frequency of the rotor are obtained analytically through the proposed methodology. The bond graph modeling technique is used for modeling the asymmetric rotor. Symbol-shakti® software is used for the simulation of the model. The effects of the stiffness of the multi-rotor system on amplitude and frequencies are studied using numerical simulation. Simulation results show considerable agreement with the theoretical results obtained through the extended Lagrangian formalism. It is further shown that the amplitude of the rotor increases inversely with the stiffness of the rotor up to a certain limit, which is also confirmed theoretically.

  8. A Parametric Study on Welding Process Simulation for Multi-pass welds in a Plate

    Energy Technology Data Exchange (ETDEWEB)

    Park, Won Dong; Bahn, Chi Bum; Kim, Ji Hun [Pusan National University, Busan (Korea, Republic of)

    2016-05-15

    EPRI (MRP-316, 317) (1,2) and USNRC (NUREG-2162)(3) have performed related studies on FEA models to predict the weld residual stress distribution. In this work, a systematic parametric study was performed to find out how major assumptions and conditions used in the simulation could affect the weld residual stress distribution. A 2-dimensional simulation was conducted using the commercial FEA software ABAQUS(4) for multi-pass Alloy 82 welds performed in a stainless steel plate (EPRI MRP-316, P-4, phase 1). From these results, the following conclusions can be drawn. 1. The method of applying power density is more realistic than applying a predefined temperature. 2. The annealing effect appears to reduce the transverse-direction weld residual stress (S33). However, more detailed analyses of the annealing effect are needed.

  9. WE-AB-202-12: Voxel-Wise Analysis of Apparent Diffusion Coefficient and Perfusion Maps in Multi-Parametric MRI of Prostate Cancer

    International Nuclear Information System (INIS)

    Engstroem, K; Casares-Magaz, O; Muren, L; Roervik, J; Andersen, E

    2016-01-01

    Purpose: Multi-parametric MRI (mp-MRI) is being introduced in radiotherapy (RT) of prostate cancer, including for tumour delineation in focal boosting strategies. We recently developed an image-based tumour control probability model, based on cell density distributions derived from apparent diffusion coefficient (ADC) maps. Beyond tumour volume and cell densities, tumour hypoxia is also an important determinant of RT response. Since tissue perfusion from mp-MRI has been related to hypoxia we have explored the patterns of ADC and perfusion maps, and the relations between them, inside and outside prostate index lesions. Methods: ADC and perfusion maps from 20 prostate cancer patients were used, with the prostate and index lesion delineated by a dedicated uro-radiologist. To reduce noise, the maps were averaged over a 3×3×3 voxel cube. Associations between different ADC and perfusion histogram parameters within the prostate, inside and outside the index lesion, were evaluated with the Pearson’s correlation coefficient. In the voxel-wise analysis, scatter plots of ADC vs perfusion were analysed for voxels in the prostate, inside and outside of the index lesion, again with the associations quantified with the Pearson’s correlation coefficient. Results: Overall ADC was lower inside the index lesion than in the normal prostate as opposed to ktrans that was higher inside the index lesion than outside. In the histogram analysis, the minimum ktrans was significantly correlated with the maximum ADC (Pearson=0.47; p=0.03). At the voxel level, 15 of the 20 cases had a statistically significant inverse correlation between ADC and perfusion inside the index lesion; ten of the cases had a Pearson < −0.4. Conclusion: The minimum value of ktrans across the tumour was correlated to the maximum ADC. However, on the voxel level, the ‘local’ ktrans in the index lesion is inversely (i.e. negatively) correlated to the ‘local’ ADC in most patients. Research agreement with
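
    The voxel-wise analysis can be sketched as follows: both maps are smoothed with a 3×3×3 moving average and the Pearson correlation of ADC and Ktrans values is computed within the lesion mask. Array names, shapes and values are hypothetical placeholders; real maps must already be co-registered.

```python
# Sketch of a voxel-wise ADC vs Ktrans correlation inside an index lesion.
import numpy as np
from scipy.ndimage import uniform_filter
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
adc = rng.normal(1.2e-3, 2e-4, size=(32, 32, 16))       # synthetic ADC map [mm^2/s]
ktrans = rng.normal(0.25, 0.05, size=(32, 32, 16))       # synthetic Ktrans map [1/min]
lesion_mask = np.zeros(adc.shape, dtype=bool)
lesion_mask[10:20, 10:20, 5:10] = True                   # hypothetical index lesion

# 3x3x3 moving-average smoothing to reduce noise, as described in the abstract
adc_s = uniform_filter(adc, size=3)
ktrans_s = uniform_filter(ktrans, size=3)

r, p = pearsonr(adc_s[lesion_mask], ktrans_s[lesion_mask])
print(f"Pearson r = {r:.2f}, p = {p:.3g}")
```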

  10. Parametric mapping using spectral analysis for 11C-PBR28 PET reveals neuroinflammation in mild cognitive impairment subjects.

    Science.gov (United States)

    Fan, Zhen; Dani, Melanie; Femminella, Grazia D; Wood, Melanie; Calsolaro, Valeria; Veronese, Mattia; Turkheimer, Federico; Gentleman, Steve; Brooks, David J; Hinz, Rainer; Edison, Paul

    2018-07-01

    Neuroinflammation and microglial activation play an important role in amnestic mild cognitive impairment (MCI) and Alzheimer's disease. In this study, we investigated the spatial distribution of neuroinflammation in MCI subjects, using spectral analysis (SA) to generate parametric maps and quantify 11C-PBR28 PET, and compared these with compartmental and other kinetic models of quantification. Thirteen MCI and nine healthy controls were enrolled in this study. Subjects underwent 11C-PBR28 PET scans with arterial cannulation. Spectral analysis with an arterial plasma input function was used to generate 11C-PBR28 parametric maps. These maps were then compared with regional 11C-PBR28 VT (volume of distribution) using a two-tissue compartment model and Logan graphic analysis. Amyloid load was also assessed with 18F-Flutemetamol PET. With SA, three component peaks were identified in addition to blood volume. The 11C-PBR28 impulse response function (IRF) at 90 min produced the lowest coefficient of variation. Single-subject analysis using this IRF demonstrated microglial activation in five out of seven amyloid-positive MCI subjects. IRF parametric maps of 11C-PBR28 uptake revealed a group-wise significant increase in neuroinflammation in amyloid-positive MCI subjects versus HC in multiple cortical association areas, and particularly in the temporal lobe. Interestingly, compartmental analysis detected group-wise increase in 11C-PBR28 binding in the thalamus of amyloid-positive MCI subjects, while Logan parametric maps did not perform well. This study demonstrates for the first time that spectral analysis can be used to generate parametric maps of 11C-PBR28 uptake, and is able to detect microglial activation in amyloid-positive MCI subjects. IRF parametric maps of 11C-PBR28 uptake allow voxel-wise single-subject analysis and could be used to evaluate microglial activation in individual subjects.
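
    A simplified sketch of the spectral-analysis step for a single time-activity curve: the tissue curve is modelled as the plasma input convolved with a sum of decaying exponentials whose non-negative coefficients are estimated by NNLS, from which the impulse response function at 90 min and the volume of distribution follow. The input function, beta grid and frame timing below are synthetic illustrations, not real 11C-PBR28 kinetics or the authors' implementation.

```python
# Simplified spectral analysis of one tissue time-activity curve (TAC).
import numpy as np
from scipy.optimize import nnls

dt = 0.5                                    # frame length [min]
t = np.arange(0, 90, dt)                    # 90-min scan
cp = t * np.exp(-t / 3.0)                   # toy plasma input function
betas = np.logspace(-3, 1, 40)              # grid of decay rates [1/min]

# basis functions: plasma input convolved with exp(-beta * t)
basis = np.stack([np.convolve(cp, np.exp(-b * t))[:len(t)] * dt for b in betas], axis=1)

true_alpha = np.zeros(len(betas)); true_alpha[[5, 20]] = [0.02, 0.05]
tac = basis @ true_alpha + np.random.default_rng(1).normal(0, 1e-3, len(t))

alpha, _ = nnls(basis, tac)                          # non-negative spectrum
irf_90 = np.sum(alpha * np.exp(-betas * 90.0))       # impulse response function at 90 min
vt = np.sum(alpha / betas)                           # volume of distribution estimate
print(f"IRF(90 min) = {irf_90:.4g}, VT = {vt:.3g}")
```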

  11. Pixel-based parametric source depth map for Cerenkov luminescence imaging

    International Nuclear Information System (INIS)

    Altabella, L.; Spinelli, A.E.; Boschi, F.

    2016-01-01

    Optical tomography represents a challenging problem in optical imaging because of the intrinsically ill-posed inverse problem due to photon diffusion. Cerenkov luminescence tomography (CLT) for optical photons produced in tissues by several radionuclides (i.e.: 32P, 18F, 90Y) has been investigated using both a 3D multispectral approach and multi-view methods. Difficulties with the convergence of 3D algorithms can discourage the use of this technique to recover the depth and intensity of the source. For these reasons, we developed a faster, corrected 2D approach based on multispectral acquisitions, to obtain the source depth and its intensity using a pixel-based fitting of source intensity. Monte Carlo simulations and experimental data were used to develop and validate the method to obtain the parametric map of source depth. With this approach we obtain parametric source depth maps with a precision between 3% and 7% for MC simulations and 5–6% for experimental data. Using this method we are able to obtain reliable information about the source depth of Cerenkov luminescence with a simple and flexible procedure.

  12. Poster: A Software-Defined Multi-Camera Network

    OpenAIRE

    Chen, Po-Yen; Chen, Chien; Selvaraj, Parthiban; Claesen, Luc

    2016-01-01

    The widespread popularity of OpenFlow leads to a significant increase in the number of applications developed in Software-Defined Networking (SDN). In this work, we propose the architecture of a Software-Defined Multi-Camera Network consisting of small, flexible, economic, and programmable cameras which combine the functions of the processor, switch, and camera. A Software-Defined Multi-Camera Network can effectively reduce the overall network bandwidth and reduce a large amount of the Capex a...

  13. Performance of non-parametric algorithms for spatial mapping of tropical forest structure

    Directory of Open Access Journals (Sweden)

    Liang Xu

    2016-08-01

    Full Text Available Abstract Background Mapping tropical forest structure is a critical requirement for accurate estimation of emissions and removals from land use activities. With the availability of a wide range of remote sensing imagery of vegetation characteristics from space, development of finer resolution and more accurate maps has advanced in recent years. However, the mapping accuracy relies heavily on the quality of input layers, the algorithm chosen, and the size and quality of inventory samples for calibration and validation. Results By using airborne lidar data as the “truth” and focusing on the mean canopy height (MCH) as a key structural parameter, we test two commonly-used non-parametric techniques of maximum entropy (ME) and random forest (RF) for developing maps over a study site in Central Gabon. Results of mapping show that both approaches have improved accuracy with more input layers in mapping canopy height at 100 m (1-ha) pixels. The bias-corrected spatial models further improve estimates for small and large trees across the tails of height distributions with a trade-off in increasing overall mean squared error that can be readily compensated by increasing the sample size. Conclusions A significant improvement in tropical forest mapping can be achieved by weighting the number of inventory samples against the choice of image layers and the non-parametric algorithms. Without future satellite observations with better sensitivity to forest biomass, the maps based on existing data will remain slightly biased towards the mean of the distribution and will under- and overestimate the upper and lower tails of the distribution.
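
    A hedged sketch of the random-forest option: stacked remote-sensing layers serve as predictors and lidar-derived mean canopy height at 1-ha pixels as the response. All arrays below are synthetic placeholders used only to show the workflow.

```python
# Random-forest regression of mean canopy height (MCH) from image layers.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_pixels, n_layers = 5000, 8                   # inventory pixels x input image layers
X = rng.normal(size=(n_pixels, n_layers))      # e.g. radar backscatter, optical indices
mch = 20 + 5 * X[:, 0] - 3 * X[:, 1] + rng.normal(0, 2, n_pixels)   # toy MCH [m]

X_tr, X_te, y_tr, y_te = train_test_split(X, mch, test_size=0.3, random_state=0)
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
rmse = mean_squared_error(y_te, rf.predict(X_te)) ** 0.5
print(f"RMSE = {rmse:.2f} m; layer importances = {np.round(rf.feature_importances_, 2)}")
```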

  14. Hardware-Software Complex for Functional and Parametric Tests of ARM Microcontrollers STM32F1XX

    Directory of Open Access Journals (Sweden)

    Egorov Aleksey

    2016-01-01

    Full Text Available The article presents the hardware-software complex for functional and parametric tests of ARM microcontrollers STM32F1XX. The complex is based on PXI devices by National Instruments and the LabVIEW software environment. The data exchange procedure between a microcontroller under test and the complex hardware is described. Some test results are also presented.

  15. WIN Energy: A case study in using MultiSpeak to enable best of breed software selection

    Energy Technology Data Exchange (ETDEWEB)

    Wolven, G. [WIN Energy REMC, Vincennes, IN (United States)

    2004-10-01

    Automation of a small 16,000 member rural electric cooperative covering approximately 2,500 miles of distribution lines in Indiana is described. The project was undertaken in an effort to meet the challenge of annual load growth of 15 per cent over the last several years, and to keep rates low by investing in technological solutions. To ensure the best possible computer software in each area of operation, WIN Energy decided to use the best-of-breed approach (in place of the 'single vendor' approach) to select software for accounting, staking, mapping, automated meter reading and customer information systems. This decision was taken despite the obvious difficulties involved in getting software vendors to communicate willingly among themselves, and to come up with the custom interfaces or integration between the various systems. Based on the success of their participation in a cooperative study to test the viability of interfacing different software systems using a software specification called MultiSpeak, WIN Energy decided to focus on MultiSpeak-compliant products. This article describes the implementation of the following software packages: Minimax Stakeout for field design and automation, Lookout for utility-wide map viewing, the ArcGIS geographic information system, Hunt Technologies' AMR for automated meter reading, NISC's CAPsXL+ financial accounting and Milsoft's Windmill for use in engineering analysis. To date, implementation is proceeding smoothly. Plans include the addition of Milsoft's DisSPatch Outage package at a future date.

  16. Multi-parametric MR imaging for prostate carcinoma; Multiparametrische MR-Bildgebung beim Prostatakarzinom

    Energy Technology Data Exchange (ETDEWEB)

    Schlemmer, Heinz-Peter [Deutsches Krebsforschungszentrum, Heidelberg (Germany). Abt. Radiologie

    2017-03-15

    Multi-parametric MR imaging of prostate carcinoma can improve diagnosis, allows reliable prognostic estimates and helps to identify the optimal individual therapy. The contribution focuses on providing the methodological tools and background knowledge needed for daily routine.

  17. Development of a specific geological mapping software under MAPGIS

    International Nuclear Information System (INIS)

    Zhang Wenkai

    2010-01-01

    The mapping software most often used in geological exploration is the MAPGIS system, and the related standards are based on it. The software offers flexible functions but has several shortcomings: many parameters to select, a steep learning curve, inconsistent parameter choices between users, and low efficiency. A dedicated geological mapping application was therefore developed with VC++ on the MAPGIS platform. Following the standards, toolbars were built for strata, rock, geographic information, materials, etc. Parameters are selected by pressing toolbar buttons, the toolbar menus can be adapted to the parameters of each working area, and legends are sorted automatically. As a result, mapping speed is greatly improved and parameters remain consistent across users. The software also handles the conversion between Gauss coordinates and longitude-latitude coordinates, drawing points and map frames by longitude-latitude, responsibility tables, plan views and profiles, and it improves the clipping, topology-building and node-snapping methods. Application of the software shows that it greatly increases the speed of geological mapping and improves the standardization of the final maps. (authors)

  18. Incorporating Oxygen-Enhanced MRI into Multi-Parametric Assessment of Human Prostate Cancer

    Directory of Open Access Journals (Sweden)

    Heling Zhou

    2017-08-01

    Full Text Available Hypoxia is associated with prostate tumor aggressiveness, local recurrence, and biochemical failure. Magnetic resonance imaging (MRI) offers insight into tumor pathophysiology and recent reports have related transverse relaxation rate (R2*) and longitudinal relaxation rate (R1) measurements to tumor hypoxia. We have investigated the inclusion of oxygen-enhanced MRI for multi-parametric evaluation of tumor malignancy. Multi-parametric MRI sequences at 3 Tesla were evaluated in 10 patients to investigate hypoxia in prostate cancer prior to radical prostatectomy. Blood oxygen level dependent (BOLD), tissue oxygen level dependent (TOLD), dynamic contrast enhanced (DCE), and diffusion weighted imaging MRI were intercorrelated and compared with the Gleason score. The apparent diffusion coefficient (ADC) was significantly lower in tumor than normal prostate. Baseline R2* (BOLD-contrast) was significantly higher in tumor than normal prostate. Upon the oxygen breathing challenge, R2* decreased significantly in the tumor tissue, suggesting improved vascular oxygenation, however changes in R1 were minimal. R2* of contralateral normal prostate decreased in most cases upon oxygen challenge, although the differences were not significant. Moderate correlation was found between ADC and Gleason score. ADC and R2* were correlated and trends were found between Gleason score and R2*, as well as maximum-intensity-projection and area-under-the-curve calculated from DCE. Tumor ADC and R2* have been associated with tumor hypoxia, and thus the correlations are of particular interest. A multi-parametric approach including oxygen-enhanced MRI is feasible and promises further insights into the pathophysiological information of tumor microenvironment.
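
    How an R2* change upon oxygen breathing might be computed can be sketched simply: fit a monoexponential S(TE) = S0·exp(−TE·R2*) to multi-gradient-echo signals for each breathing condition and take the difference. Echo times and signal values below are synthetic and not taken from the cited study.

```python
# Delta-R2* estimate from multi-echo signals under air and oxygen breathing.
import numpy as np

te = np.array([5., 10., 15., 20., 25.]) * 1e-3       # echo times [s]

def r2star(signal, te):
    slope, _ = np.polyfit(te, np.log(signal), 1)      # log-linear monoexponential fit
    return -slope                                     # R2* in [1/s]

s_air = 1000 * np.exp(-te * 35.0) * (1 + np.random.default_rng(0).normal(0, 0.01, te.size))
s_o2  = 1000 * np.exp(-te * 31.0) * (1 + np.random.default_rng(1).normal(0, 0.01, te.size))

delta_r2s = r2star(s_o2, te) - r2star(s_air, te)
print(f"Delta R2* = {delta_r2s:.1f} 1/s")             # negative: consistent with improved oxygenation
```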

  19. Closed forms and multi-moment maps

    DEFF Research Database (Denmark)

    Madsen, Thomas Bruun; Swann, Andrew Francis

    2013-01-01

    We extend the notion of multi-moment map to geometries defined by closed forms of arbitrary degree. We give fundamental existence and uniqueness results and discuss a number of essential examples, including geometries related to special holonomy. For forms of degree four, multi-moment maps are gu...

  20. Reliable single chip genotyping with semi-parametric log-concave mixtures.

    Directory of Open Access Journals (Sweden)

    Ralph C A Rippe

    Full Text Available The common approach to SNP genotyping is to use (model-based) clustering per individual SNP, on a set of arrays. Genotyping all SNPs on a single array is much more attractive, in terms of flexibility, stability and applicability, when developing new chips. A new semi-parametric method, named SCALA, is proposed. It is based on a mixture model using semi-parametric log-concave densities. Instead of using the raw data, the mixture is fitted on a two-dimensional histogram, thereby making computation time almost independent of the number of SNPs. Furthermore, the algorithm is effective in low-MAF situations. Comparisons between SCALA and CRLMM on HapMap genotypes show very reliable calling of single arrays. Some heterozygous genotypes from HapMap are called homozygous by SCALA and to a lesser extent by CRLMM too. Furthermore, HapMap's NoCalls (NN) could be genotyped by SCALA, mostly with high probability. The software is available as R scripts from the website www.math.leidenuniv.nl/~rrippe.

  1. Functional brain mapping using H2 15O positron emission tomography (I): statistical parametric mapping method

    International Nuclear Information System (INIS)

    Lee, Dong Soo; Lee, Jae Sung; Kim, Kyeong Min; Chung, June Key; Lee, Myung Chul

    1998-01-01

    We investigated the statistical methods to compose the functional brain map of human working memory and the principal factors that have an effect on the methods for localization. Repeated PET scans with successive four tasks, which consist of one control and three different activation tasks, were performed on six right-handed normal volunteers for 2 minutes after bolus injections of 925 MBq H2 15O at the intervals of 30 minutes. Image data were analyzed using SPM96 (Statistical Parametric Mapping) implemented with Matlab (Mathworks Inc., U.S.A.). Images from the same subject were spatially registered and were normalized using linear and nonlinear transformation methods. Significant difference between control and each activation state was estimated at every voxel based on the general linear model. Differences of global counts were removed using analysis of covariance (ANCOVA) with global activity as covariate. Using the mean and variance for each condition which was adjusted using ANCOVA, t-statistics was performed on every voxel. To interpret the results more easily, t-values were transformed to the standard Gaussian distribution (Z-score). All the subjects carried out the activation and control tests successfully. Average rate of correct answers was 95%. The numbers of activated blobs were 4 for verbal memory I, 9 for verbal memory II, 9 for visual memory, and 6 for conjunctive activation of these three tasks. The verbal working memory activates predominantly left-sided structures, and the visual memory activates the right hemisphere. We conclude that rCBF PET imaging and statistical parametric mapping method were useful in the localization of the brain regions for verbal and visual working memory
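
    The voxel-wise statistics can be illustrated with a heavily simplified sketch. SPM removes global-count differences with ANCOVA and fits a general linear model; the snippet below instead uses simple proportional scaling by the global mean, a paired t-test per voxel and a t-to-Z transformation, applied to synthetic data, to convey the idea rather than reproduce SPM96.

```python
# Simplified voxel-wise activation statistics with global normalization.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_subj, shape = 6, (20, 20, 10)
control = rng.normal(50, 5, size=(n_subj, *shape))
task = control + rng.normal(0, 2, size=(n_subj, *shape))
task[:, 8:12, 8:12, 4:6] += 4.0                          # simulated activation blob

def scale_global(scans):
    g = scans.reshape(scans.shape[0], -1).mean(axis=1)   # global mean per scan
    return scans / g[:, None, None, None] * 50.0         # proportional scaling

t, _ = stats.ttest_rel(scale_global(task), scale_global(control), axis=0)
z = stats.norm.ppf(stats.t.cdf(t, df=n_subj - 1))        # transform t to standard Gaussian Z
print("max Z:", float(np.max(z)), "| voxels with Z > 3.09:", int(np.sum(z > 3.09)))
```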

  2. Parametric Response Mapping as an Indicator of Bronchiolitis Obliterans Syndrome after Hematopoietic Stem Cell Transplantation

    NARCIS (Netherlands)

    Galban, Craig J.; Boes, Jennifer L.; Bule, Maria; Kitko, Carrie L.; Couriel, Daniel R.; Johnson, Timothy D.; Lama, Vihba; Telenga, Eef D.; van den Berge, Maarten; Rehemtulla, Alnawaz; Kazerooni, Ella A.; Ponkowski, Michael J.; Ross, Brian D.; Yanik, Gregory A.

    2014-01-01

    The management of bronchiolitis obliterans syndrome (BOS) after hematopoietic cell transplantation presents many challenges, both diagnostically and therapeutically. We developed a computed tomography (CT) voxel-wise methodology termed parametric response mapping (PRM) that quantifies normal

  3. An approach to multi-attribute utility analysis under parametric uncertainty

    International Nuclear Information System (INIS)

    Kelly, M.; Thorne, M.C.

    2001-01-01

    The techniques of cost-benefit analysis and multi-attribute analysis provide a useful basis for informing decisions in situations where a number of potentially conflicting opinions or interests need to be considered, and where there are a number of possible decisions that could be adopted. When the input data to such decision-making processes are uniquely specified, cost-benefit analysis and multi-attribute utility analysis provide unambiguous guidance on the preferred decision option. However, when the data are not uniquely specified, application and interpretation of these techniques is more complex. Herein, an approach to multi-attribute utility analysis (and hence, as a special case, cost-benefit analysis) when input data are subject to parametric uncertainty is presented. The approach is based on the use of a Monte Carlo technique, and has recently been applied to options for the remediation of former uranium mining liabilities in a number of Central and Eastern European States
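
    A hedged sketch of the Monte Carlo approach: attribute utilities and weights are sampled from assumed distributions and the probability that each remediation option ranks first is tallied. The options, attributes and distributions below are hypothetical illustrations, not those of the cited assessment.

```python
# Monte Carlo multi-attribute utility analysis under parametric uncertainty.
import numpy as np

rng = np.random.default_rng(0)
options = ["no action", "cover", "full remediation"]
n_mc, n_attr = 10_000, 3                      # attributes: e.g. dose, cost, land use

# mean attribute utilities (rows: options) and an assumed uncertainty on each
mean_u = np.array([[0.2, 0.9, 0.3],
                   [0.6, 0.6, 0.6],
                   [0.9, 0.2, 0.8]])
sd_u = 0.1

wins = np.zeros(len(options))
for _ in range(n_mc):
    w = rng.dirichlet(np.ones(n_attr))        # uncertain attribute weights summing to 1
    u = rng.normal(mean_u, sd_u)              # uncertain attribute utilities
    wins[np.argmax(u @ w)] += 1               # option with the highest overall utility

for name, p in zip(options, wins / n_mc):
    print(f"P({name} preferred) = {p:.2f}")
```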

  4. Dual frequency parametric excitation of a nonlinear, multi degree of freedom mechanical amplifier with electronically modified topology

    Science.gov (United States)

    Dolev, A.; Bucher, I.

    2018-04-01

    Mechanical or electromechanical amplifiers can exploit the high-Q and low noise features of mechanical resonance, in particular when parametric excitation is employed. Multi-frequency parametric excitation introduces tunability and is able to project weak input signals on a selected resonance. The present paper addresses multi degree of freedom mechanical amplifiers or resonators whose analysis and features require treatment of the spatial as well as temporal behavior. In some cases, virtual electronic coupling can alter the given topology of the resonator to better amplify specific inputs. An analytical development is followed by a numerical and experimental sensitivity and performance verifications, illustrating the advantages and disadvantages of such topologies.
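
    A minimal numerical test bench, under assumed parameter values, for a single-degree-of-freedom oscillator with dual-frequency parametric stiffness modulation and a weak direct input: x'' + 2ζω0 x' + ω0²[1 + ε1 cos(ω1 t) + ε2 cos(ω2 t)] x = f cos(ωf t). It is not the authors' multi-degree-of-freedom electromechanical model; the pump depths are kept small here so the response stays bounded in this example.

```python
# Single-DOF oscillator under dual-frequency parametric excitation.
import numpy as np
from scipy.integrate import solve_ivp

w0 = 2 * np.pi * 10.0                         # natural frequency [rad/s]
zeta = 0.01                                   # damping ratio
e1, w1 = 0.03, 2 * w0                         # principal parametric pump (depth, frequency)
e2, w2 = 0.015, 2 * w0 + 2 * np.pi * 0.5      # second, slightly detuned pump
f, wf = 1e-3, w0                              # weak direct drive near resonance

def rhs(t, y):
    x, v = y
    k = w0**2 * (1.0 + e1 * np.cos(w1 * t) + e2 * np.cos(w2 * t))   # modulated stiffness
    return [v, -2.0 * zeta * w0 * v - k * x + f * np.cos(wf * t)]

sol = solve_ivp(rhs, (0.0, 10.0), [0.0, 0.0], max_step=1e-3)
steady = np.abs(sol.y[0][sol.t > 5.0]).max()
print(f"late-time response amplitude ~ {steady:.3e}")
```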

  5. Use of statistical parametric mapping of 18F-FDG-PET in frontal lobe epilepsy

    International Nuclear Information System (INIS)

    Plotkin, M.; Amthauer, H.; Luedemann, L.; Hartkop, E.; Ruf, J.; Gutberlet, M.; Bertram, H.; Felix, R.; Venz, St.; Merschhemke, M.; Meencke, H.-J.

    2003-01-01

    Aim: Evaluation of the use of statistical parametric mapping (SPM) of FDG-PET for seizure lateralization in frontal lobe epilepsy. Patients: 38 patients with suspected frontal lobe epilepsy supported by clinical findings and video-EEG monitoring. Method: Statistical parametric maps were generated by subtracting individual scans from a control group, formed by 16 patients with a negative neurological/psychiatric history and no abnormalities in the MR scan. The scans were also analyzed visually as well as semiquantitatively by manually drawn ROIs. Results: SPM showed better agreement with the results of surface EEG monitoring compared with visual scan analysis and ROI quantification. In comparison with intracranial EEG recordings, the best performance was achieved by combining the ROI-based quantification with SPM analysis. Conclusion: These findings suggest that SPM analysis of FDG-PET data could be useful as a complementary tool in the evaluation of seizure focus lateralization in patients with suspected frontal lobe epilepsy. (orig.)

  6. Parametric tools over crowdsourced maps as means for participatory consideration of environmental issues in cities

    Science.gov (United States)

    Montoya, Paula; Ballesteros, José; Gervás, Pablo

    2015-04-01

    The increasing complexity of space use and resource cycles in cities demands an understanding of the built environment as "ecological": enabling mutation while remaining balanced and biologically sustainable. Designing man's environment is no longer a question of defining types, but rather an act of inserting changes within a complex system. Architecture and urban planning have become increasingly aware of their condition as system-oriented disciplines, and they are in the process of developing the necessary languages, design tools, and alliances. We will argue the relevance of parametric maps as one of the most powerful of those tools, in terms of their potential for adaptive prototype design, convergence of disciplines, and collaborative work. Cities need to change in order to survive. As the main human landscape (by 2050 75% of the world's population will live in urban areas) cities follow biological patterns of behaviour, constantly replacing their cells, renovating infrastructure systems and refining methods for energy provision and waste management. They need to adapt constantly. As responsive entities, they develop their own protocols for reaction to environmental change and challenge the increasing pressure of several issues related to scale: population, mobility, water and energy supply, pollution... The representation of these urban issues on maps becomes crucial for understanding and addressing them in design. Maps enhanced with parametric tools are relational, and they not only register environmental dynamics but also allow adaptation of the system through interwoven parameters of mutation. Citizens are taking part in decisions and becoming aware of their role as urban experts in a bottom-up design process of the cities where they live. Modern tools for dynamic visualisation and collaborative editing of maps have an important role to play in this process. More and more people consult maps on hand-held devices as part of their daily routine. The advent

  7. Influence of spatial beam inhomogeneities on the parameters of a petawatt laser system based on multi-stage parametric amplification

    International Nuclear Information System (INIS)

    Frolov, S A; Trunov, V I; Pestryakov, Efim V; Leshchenko, V E

    2013-01-01

    We have developed a technique for investigating the evolution of spatial inhomogeneities in high-power laser systems based on multi-stage parametric amplification. A linearised model of the inhomogeneity development is first devised for parametric amplification with small-scale self-focusing taken into account. It is shown that this model gives results that agree, with high accuracy and over a wide range of inhomogeneity parameters, with the calculation performed without approximations. Using the linearised model, we have analysed the development of spatial inhomogeneities in a petawatt laser system based on multi-stage parametric amplification, developed at the Institute of Laser Physics, Siberian Branch of the Russian Academy of Sciences (ILP SB RAS). (control of laser radiation parameters)

  8. Shift of critical points in the parametrically modulated Henon map with coexisting attractors

    International Nuclear Information System (INIS)

    Saucedo-Solorio, J.M.; Pisarchik, A.N.; Aboites, V.

    2002-01-01

    We study how the critical point positions change in the parametrically modulated Henon map with coexisting period-1 and period-3 attractors. In particular, a new type of scaling law is found coinciding with that evidenced by laser experiments. We show that resonance phenomena play a crucial role in deformation of attractors and their basins of attraction
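
    A sketch of the map in question: the Henon map x_{n+1} = 1 − a·x_n² + y_n, y_{n+1} = b·x_n is iterated while the parameter a is modulated periodically, and the long-term orbit is probed from two different initial conditions (with coexisting attractors, different starts can settle on different cycles). The modulation form and the parameter values are illustrative assumptions, not those of the cited paper.

```python
# Parametrically modulated Henon map probed from two initial conditions.
import numpy as np

def iterate_henon(x0, y0, a0=1.0, b=0.3, depth=0.01, period=5, n=20000, keep=200):
    x, y = x0, y0
    orbit = []
    for i in range(n):
        a = a0 * (1.0 + depth * np.sin(2.0 * np.pi * i / period))   # modulated parameter
        x, y = 1.0 - a * x * x + y, b * x
        if not np.isfinite(x):            # orbit left the basin of attraction
            return np.empty((0, 2))
        if i >= n - keep:
            orbit.append((x, y))
    return np.array(orbit)

for x0, y0 in [(0.0, 0.0), (0.9, -0.1)]:
    orb = iterate_henon(x0, y0)
    if orb.size:
        print(f"start ({x0}, {y0}): x in [{orb[:, 0].min():.3f}, {orb[:, 0].max():.3f}]")
```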

  9. VESsel GENeration Analysis (VESGEN): Innovative Vascular Mappings for Astronaut Exploration Health Risks and Human Terrestrial Medicine

    Science.gov (United States)

    Parsons-Wingerter, Patricia; Kao, David; Valizadegan, Hamed; Martin, Rodney; Murray, Matthew C.; Ramesh, Sneha; Sekaran, Srinivaas

    2017-01-01

    Currently, astronauts face significant health risks in future long-duration exploration missions such as colonizing the Moon and traveling to Mars. Numerous risks include greatly increased radiation exposures beyond the low earth orbit (LEO) of the ISS, and visual and ocular impairments in response to microgravity environments. The cardiovascular system is a key mediator in human physiological responses to radiation and microgravity. Moreover, blood vessels are necessarily involved in the progression and treatment of vascular-dependent terrestrial diseases such as cancer, coronary vessel disease, wound-healing, reproductive disorders, and diabetes. NASA developed an innovative, globally requested beta-level software, VESsel GENeration Analysis (VESGEN) to map and quantify vascular remodeling for application to astronaut and terrestrial health challenges. VESGEN mappings of branching vascular trees and networks are based on a weighted multi-parametric analysis derived from vascular physiological branching rules. Complex vascular branching patterns are determined by biological signaling mechanisms together with the fluid mechanics of multi-phase laminar blood flow.

  10. Variance in parametric images: direct estimation from parametric projections

    International Nuclear Information System (INIS)

    Maguire, R.P.; Leenders, K.L.; Spyrou, N.M.

    2000-01-01

    Recent work has shown that it is possible to apply linear kinetic models to dynamic projection data in PET in order to calculate parameter projections. These can subsequently be back-projected to form parametric images - maps of parameters of physiological interest. Critical to the application of these maps, to test for significant changes between normal and pathophysiology, is an assessment of the statistical uncertainty. In this context, parametric images also include simple integral images from, e.g., [O-15]-water used to calculate statistical parametric maps (SPMs). This paper revisits the concept of parameter projections and presents a more general formulation of the parameter projection derivation as well as a method to estimate parameter variance in projection space, showing which analysis methods (models) can be used. Using simulated pharmacokinetic image data we show that a method based on an analysis in projection space inherently calculates the mathematically rigorous pixel variance. This results in an estimation which is as accurate as either estimating variance in image space during model fitting, or estimation by comparison across sets of parametric images - as might be done between individuals in a group pharmacokinetic PET study. The method based on projections has, however, a higher computational efficiency, and is also shown to be more precise, as reflected in smooth variance distribution images when compared to the other methods. (author)

  11. Statistical parametric mapping of Tc-99m HMPAO SPECT cerebral perfusion in the normal elderly

    International Nuclear Information System (INIS)

    Turlakow, A.; Scott, A.M.; Berlangieri, S.U.; Sonkila, C.; Wardill, T.D.; Crowley, K.; Abbott, D.; Egan, G.F.; McKay, W.J.; Hughes, A.

    1998-01-01

    Full text: The clinical value of Tc-99m HMPAO SPECT cerebral blood flow studies in cognitive and neuropsychiatric disorders has been well described. Currently, interpretation of these studies relies on qualitative or semi-quantitative techniques. The aim of our study is to generate statistical measures of regional cerebral perfusion in the normal elderly using statistical parametric mapping (Friston et al, Wellcome Department of Cognitive Neurology, London, UK) in order to facilitate the objective analysis of cerebral blood flow studies in patient groups. A cohort of 20 healthy, elderly volunteers, aged 68 to 81 years, was prospectively selected on the basis of normal physical examination and neuropsychological testing. Subjects with risk factors or a history of cognitive impairment were excluded from our study group. All volunteers underwent SPECT cerebral blood flow imaging, 30 minutes following the administration of 370 MBq Tc-99m HMPAO, on a Trionix Triad XLT triple-headed scanner (Trionix Research Laboratory Twinsburg, OH) using high resolution, fan-beam collimators resulting in a system resolution of 10 mm full width at half-maximum (FWHM). The SPECT cerebral blood flow studies were analysed using statistical parametric mapping (SPM) software specifically developed for the routine statistical analysis of functional neuroimaging data. The SPECT images were coregistered with each individual's T1-weighted MR volume brain scan and spatially normalized to standardised Talairach space. Using SPM, these data were analyzed for differences in interhemispheric regional cerebral blood flow. Significant asymmetry of cerebral perfusion was detected in the pre-central gyrus at the 95th percentile. In conclusion, the interpretation of cerebral blood flow studies in the elderly should take into account the statistically significant asymmetry in interhemispheric pre-central cortical blood flow. In the future, clinical studies will be compared to statistical data sets in age

  12. Statistical parametric mapping of Tc-99m HMPAO SPECT cerebral perfusion in the normal elderly

    Energy Technology Data Exchange (ETDEWEB)

    Turlakow, A.; Scott, A.M.; Berlangieri, S.U.; Sonkila, C.; Wardill, T.D.; Crowley, K.; Abbott, D.; Egan, G.F.; McKay, W.J.; Hughes, A. [Austin and Repatriation Medical Centre, Heidelberg, VIC (Australia). Departments of Nuclear Medicine and Centre for PET Neurology and Clinical Neuropsychology

    1998-06-01

    Full text: The clinical value of Tc-99m HMPAO SPECT cerebral blood flow studies in cognitive and neuropsychiatric disorders has been well described. Currently, interpretation of these studies relies on qualitative or semi-quantitative techniques. The aim of our study is to generate statistical measures of regional cerebral perfusion in the normal elderly using statistical parametric mapping (Friston et al, Wellcome Department of Cognitive Neurology, London, UK) in order to facilitate the objective analysis of cerebral blood flow studies in patient groups. A cohort of 20 healthy, elderly volunteers, aged 68 to 81 years, was prospectively selected on the basis of normal physical examination and neuropsychological testing. Subjects with risk factors or a history of cognitive impairment were excluded from our study group. All volunteers underwent SPECT cerebral blood flow imaging, 30 minutes following the administration of 370 MBq Tc-99m HMPAO, on a Trionix Triad XLT triple-headed scanner (Trionix Research Laboratory Twinsburg, OH) using high resolution, fan-beam collimators resulting in a system resolution of 10 mm full width at half-maximum (FWHM). The SPECT cerebral blood flow studies were analysed using statistical parametric mapping (SPM) software specifically developed for the routine statistical analysis of functional neuroimaging data. The SPECT images were coregistered with each individual's T1-weighted MR volume brain scan and spatially normalized to standardised Talairach space. Using SPM, these data were analyzed for differences in interhemispheric regional cerebral blood flow. Significant asymmetry of cerebral perfusion was detected in the pre-central gyrus at the 95th percentile. In conclusion, the interpretation of cerebral blood flow studies in the elderly should take into account the statistically significant asymmetry in interhemispheric pre-central cortical blood flow. In the future, clinical studies will be compared to statistical data sets in age

  13. Multi-Directional Non-Parametric Analysis of Agricultural Efficiency

    DEFF Research Database (Denmark)

    Balezentis, Tomas

    This thesis seeks to develop methodologies for assessment of agricultural efficiency and employ them to Lithuanian family farms. In particular, we focus on three particular objectives throughout the research: (i) to perform a fully non-parametric analysis of efficiency effects, (ii) to extend...... to the Multi-Directional Efficiency Analysis approach when the proposed models were employed to analyse empirical data of Lithuanian family farm performance, we saw substantial differences in efficiencies associated with different inputs. In particular, assets appeared to be the least efficiently used input...... relative to labour, intermediate consumption and land (in some cases land was not treated as a discretionary input). These findings call for further research on relationships among financial structure, investment decisions, and efficiency in Lithuanian family farms. Application of different techniques...

  14. Assessing pupil and school performance by non-parametric and parametric techniques

    NARCIS (Netherlands)

    de Witte, K.; Thanassoulis, E.; Simpson, G.; Battisti, G.; Charlesworth-May, A.

    2010-01-01

    This paper discusses the use of the non-parametric free disposal hull (FDH) and the parametric multi-level model (MLM) as alternative methods for measuring pupil and school attainment where hierarchical structured data are available. Using robust FDH estimates, we show how to decompose the overall

  15. Assessment of three different software systems in the evaluation of dynamic MRI of the breast

    International Nuclear Information System (INIS)

    Kurz, K.D.; Steinhaus, D.; Klar, V.; Cohnen, M.; Wittsack, H.J.; Saleh, A.; Moedder, U.; Blondin, D.

    2009-01-01

    Objective: The aim was to compare the diagnostic performance and handling of dynamic contrast-enhanced MRI of the breast with two commercial software solutions ('CADstream' and '3TP') and one self-developed software system ('Mammatool'). Materials and methods: Identical data sets of dynamic breast MRI from 21 patients were evaluated retrospectively with all three software systems. The exams were classified according to the BI-RADS classification. The number of lesions in the parametric mapping was compared to histology or follow-up of more than 2 years. In addition, 25 quality criteria were judged by 3 independent investigators with a score from 0 to 5. Statistical analysis was performed to document the quality ranking of the different software systems. Results: There were 9 invasive carcinomas, one pure DCIS, one papilloma, one radial scar, three histologically proven changes due to mastopathy, one adenosis and two fibroadenomas. Additionally two patients with enhancing parenchyma followed with MRI for more than 3 years and one scar after breast conserving therapy were included. All malignant lesions were classified as BI-RADS 4 or 5 using all software systems and showed significant enhancement in the parametric mapping. 'CADstream' showed the best score on subjective quality criteria. '3TP' showed the lowest number of false-positive results. 'Mammatool' produced the lowest number of benign tissues indicated with parametric overlay. Conclusion: All three software programs tested were adequate for sensitive and efficient assessment of dynamic MRI of the breast. Improvements in specificity may be achievable
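
    The kind of curve-type classification behind such parametric overlays can be sketched schematically: from a pre-contrast and two post-contrast time points, the initial enhancement and the late-phase change are computed and each voxel is labelled as persistent, plateau or washout. The thresholds and arrays below are illustrative and do not reproduce the proprietary CADstream, 3TP or Mammatool algorithms.

```python
# Schematic curve-type classification of a dynamic breast MRI series.
import numpy as np

def classify_curves(s_pre, s_early, s_late, enh_thr=0.5, wash_tol=0.10):
    """-1 = not significantly enhancing, 0 = persistent, 1 = plateau, 2 = washout."""
    early = (s_early - s_pre) / s_pre          # initial relative enhancement
    late = (s_late - s_early) / s_early        # late-phase relative change
    curve = np.full(early.shape, 1, dtype=int) # default: plateau
    curve[late > wash_tol] = 0                 # still enhancing: persistent
    curve[late < -wash_tol] = 2                # signal loss: washout (suspicious)
    curve[early < enh_thr] = -1                # weak initial enhancement: not classified
    return curve

rng = np.random.default_rng(0)
shape = (64, 64)
s_pre = rng.normal(100, 5, shape)
s_early = s_pre * rng.uniform(1.2, 2.2, shape)     # synthetic post-contrast frames
s_late = s_early * rng.uniform(0.8, 1.2, shape)

curve_map = classify_curves(s_pre, s_early, s_late)
labels, counts = np.unique(curve_map, return_counts=True)
print(dict(zip(labels.tolist(), counts.tolist())))
```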

  16. Assessment of three different software systems in the evaluation of dynamic MRI of the breast.

    Science.gov (United States)

    Kurz, K D; Steinhaus, D; Klar, V; Cohnen, M; Wittsack, H J; Saleh, A; Mödder, U; Blondin, D

    2009-02-01

    The aim was to compare the diagnostic performance and handling of dynamic contrast-enhanced MRI of the breast with two commercial software solutions ("CADstream" and "3TP") and one self-developed software system ("Mammatool"). Identical data sets of dynamic breast MRI from 21 patients were evaluated retrospectively with all three software systems. The exams were classified according to the BI-RADS classification. The number of lesions in the parametric mapping was compared to histology or follow-up of more than 2 years. In addition, 25 quality criteria were judged by 3 independent investigators with a score from 0 to 5. Statistical analysis was performed to document the quality ranking of the different software systems. There were 9 invasive carcinomas, one pure DCIS, one papilloma, one radial scar, three histologically proven changes due to mastopathy, one adenosis and two fibroadenomas. Additionally two patients with enhancing parenchyma followed with MRI for more than 3 years and one scar after breast conserving therapy were included. All malignant lesions were classified as BI-RADS 4 or 5 using all software systems and showed significant enhancement in the parametric mapping. "CADstream" showed the best score on subjective quality criteria. "3TP" showed the lowest number of false-positive results. "Mammatool" produced the lowest number of benign tissues indicated with parametric overlay. All three software programs tested were adequate for sensitive and efficient assessment of dynamic MRI of the breast. Improvements in specificity may be achievable.

  17. A menu-driven software package of Bayesian nonparametric (and parametric) mixed models for regression analysis and density estimation.

    Science.gov (United States)

    Karabatsos, George

    2017-02-01

    Most of applied statistics involves regression analysis of data. In practice, it is important to specify a regression model that has minimal assumptions which are not violated by data, to ensure that statistical inferences from the model are informative and not misleading. This paper presents a stand-alone and menu-driven software package, Bayesian Regression: Nonparametric and Parametric Models, constructed from MATLAB Compiler. Currently, this package gives the user a choice from 83 Bayesian models for data analysis. They include 47 Bayesian nonparametric (BNP) infinite-mixture regression models; 5 BNP infinite-mixture models for density estimation; and 31 normal random effects models (HLMs), including normal linear models. Each of the 78 regression models handles either a continuous, binary, or ordinal dependent variable, and can handle multi-level (grouped) data. All 83 Bayesian models can handle the analysis of weighted observations (e.g., for meta-analysis), and the analysis of left-censored, right-censored, and/or interval-censored data. Each BNP infinite-mixture model has a mixture distribution assigned one of various BNP prior distributions, including priors defined by either the Dirichlet process, Pitman-Yor process (including the normalized stable process), beta (two-parameter) process, normalized inverse-Gaussian process, geometric weights prior, dependent Dirichlet process, or the dependent infinite-probits prior. The software user can mouse-click to select a Bayesian model and perform data analysis via Markov chain Monte Carlo (MCMC) sampling. After the sampling completes, the software automatically opens text output that reports MCMC-based estimates of the model's posterior distribution and model predictive fit to the data. Additional text and/or graphical output can be generated by mouse-clicking other menu options. This includes output of MCMC convergence analyses, and estimates of the model's posterior predictive distribution, for selected

  18. [The primary research and development of software oversampling mapping system for electrocardiogram].

    Science.gov (United States)

    Zhou, Yu; Ren, Jie

    2011-04-01

    We put forward a new concept of a software oversampling mapping system for the electrocardiogram (ECG) to assist research on the ECG inverse problem and to improve the generality of the mapping system and the quality of the mapped signals. We then developed a conceptual system based on a traditional ECG detection circuit, LabVIEW and a DAQ card produced by National Instruments, and integrated the newly developed oversampling method into the system. The results indicated that the system could map ECG signals accurately and that the signal quality was good. The improvements in hardware and software make the system suitable for mapping in different situations. The primary development of the software oversampling mapping system was therefore successful, and further research and development can make the system a powerful tool for studying the ECG inverse problem.
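
    The oversampling idea can be sketched in a few lines: acquire well above the target rate and average blocks of N samples, which reduces uncorrelated noise by roughly √N. The signal model and rates below are illustrative, not those of the cited system.

```python
# Oversample-and-average sketch for ECG-like signals.
import numpy as np

fs_over, n_avg = 8000, 16                        # oversampled rate [Hz], block size
fs_out = fs_over // n_avg                        # effective output rate: 500 Hz
t = np.arange(0, 2.0, 1.0 / fs_over)
ecg_clean = np.sin(2 * np.pi * 1.2 * t) * np.exp(-((t % 0.833) - 0.2) ** 2 / 0.002)  # toy beats
noisy = ecg_clean + np.random.default_rng(0).normal(0, 0.2, t.size)

n_keep = t.size // n_avg * n_avg
decimated = noisy[:n_keep].reshape(-1, n_avg).mean(axis=1)       # block averaging
ref = ecg_clean[:n_keep].reshape(-1, n_avg).mean(axis=1)

def snr_db(sig, noise):
    return 10 * np.log10(np.mean(sig ** 2) / np.mean(noise ** 2))

print("SNR before:", round(snr_db(ecg_clean, noisy - ecg_clean), 1), "dB")
print("SNR after :", round(snr_db(ref, decimated - ref), 1), "dB")
```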

  19. Multi-channel software defined radio experimental evaluation and analysis

    CSIR Research Space (South Africa)

    Van der Merwe, JR

    2014-09-01

    Full Text Available Multi-channel software-defined radios (SDRs) can be utilised as inexpensive prototyping platforms for transceiver arrays. The application for multi-channel prototyping is discussed and measured results of coherent channels for both receiver...

  20. Unified Multi-Layer among Software Defined Multi-Domain Optical Networks (Invited)

    Directory of Open Access Journals (Sweden)

    Hui Yang

    2015-06-01

    Full Text Available Software defined networking (SDN), enabled by the OpenFlow protocol, has gained popularity, as it can make the network programmable and accommodate both fixed and flexible bandwidth services. In this paper, we present a unified multi-layer (UML) architecture with multiple controllers and a dynamic orchestra plane (DOP) for software defined multi-domain optical networks. The proposed architecture can shield the differences among various optical devices from multiple vendors and the details of connecting heterogeneous networks. Cross-domain services with on-demand bandwidth can be deployed via unified interfaces provided by the dynamic orchestra plane. Additionally, the globalization strategy and practical capture of signal processing are presented based on the architecture. The overall feasibility and efficiency of the proposed architecture is experimentally verified on the control plane of our OpenFlow-based testbed. The performance of the globalization strategy under a heavy traffic load scenario is also quantitatively evaluated based on the UML architecture and compared with other strategies in terms of blocking probability, average hops, and average resource consumption.

  1. Applying the metro map to software development management

    Science.gov (United States)

    Aguirregoitia, Amaia; Dolado, J. Javier; Presedo, Concepción

    2010-01-01

    This paper presents MetroMap, a new graphical representation model for controlling and managing the software development process. Metromap uses metaphors and visual representation techniques to explore several key indicators in order to support problem detection and resolution. The resulting visualization addresses diverse management tasks, such as tracking of deviations from the plan, analysis of patterns of failure detection and correction, overall assessment of change management policies, and estimation of product quality. The proposed visualization uses a metaphor with a metro map along with various interactive techniques to represent information concerning the software development process and to deal efficiently with multivariate visual queries. Finally, the paper shows the implementation of the tool in JavaFX with data of a real project and the results of testing the tool with the aforementioned data and users attempting several information retrieval tasks. The conclusion shows the results of analyzing user response time and efficiency using the MetroMap visualization system. The utility of the tool was positively evaluated.

  2. The IceCube Data Acquisition Software: Lessons Learned during Distributed, Collaborative, Multi-Disciplined Software Development.

    Energy Technology Data Exchange (ETDEWEB)

    Beattie, Keith S; Beattie, Keith; Day Ph.D., Christopher; Glowacki, Dave; Hanson Ph.D., Kael; Jacobsen Ph.D., John; McParland, Charles; Patton Ph.D., Simon

    2007-09-21

    In this experiential paper we report on lessons learned during the development of the data acquisition software for the IceCube project - specifically, how to effectively address the unique challenges presented by a distributed, collaborative, multi-institutional, multi-disciplined project such as this. While development progress in software projects is often described solely in terms of technical issues, our experience indicates that non- and quasi-technical interactions play a substantial role in the effectiveness of large software development efforts. These include: selection and management of multiple software development methodologies, the effective use of various collaborative communication tools, project management structure and roles, and the impact and apparent importance of these elements when viewed through the differing perspectives of hardware, software, scientific and project office roles. Even in areas clearly technical in nature, success is still influenced by non-technical issues that can escape close attention. In particular we describe our experiences on software requirements specification, development methodologies and communication tools. We make observations on what tools and techniques have and have not been effective in this geographically dispersed (including the South Pole) collaboration and offer suggestions on how similarly structured future projects may build upon our experiences.

  3. Mapping of multi-floor buildings: A barometric approach

    DEFF Research Database (Denmark)

    Özkil, Ali Gürcan; Fan, Zhun; Xiao, Jizhong

    2011-01-01

    This paper presents a new method for mapping multi-floor buildings. The method combines a laser range sensor for metric mapping and a barometric pressure sensor for detecting floor transitions and map segmentation. We exploit the fact that the barometric pressure is a function of the elevation......, and it varies between different floors. The method is tested with a real robot in a typical indoor environment, and the results show that physically consistent multi-floor representations are achievable....
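
    A hedged sketch of the barometric part: pressure is converted to relative altitude with the international barometric formula and floor indices are obtained by rounding the altitude change to the nearest floor height. The pressure trace, noise level and floor height below are illustrative assumptions, not the cited system's parameters.

```python
# Barometric floor-transition detection from a pressure trace.
import numpy as np

P0 = 101325.0                                    # reference sea-level pressure [Pa]

def altitude_m(p_pa):
    # international barometric formula (ISA troposphere)
    return 44330.0 * (1.0 - (p_pa / P0) ** (1.0 / 5.255))

def floor_index(pressures_pa, floor_height=3.5):
    alt = altitude_m(np.asarray(pressures_pa, dtype=float))
    rel = alt - np.median(alt[:50])              # altitude relative to the start of the trace
    return np.round(rel / floor_height).astype(int)

# synthetic trace: ground floor, then one floor up (~42 Pa drop for 3.5 m)
rng = np.random.default_rng(0)
p = np.concatenate([np.full(200, 100000.0), np.full(200, 100000.0 - 42.0)])
p += rng.normal(0, 3.0, p.size)                  # ~3 Pa sensor noise
print(np.unique(floor_index(p)))                 # expected floors: [0 1]
```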

  4. Assessment of three different software systems in the evaluation of dynamic MRI of the breast

    Energy Technology Data Exchange (ETDEWEB)

    Kurz, K.D. [Department of Radiology, Stavanger University Hospital, Postbox 8100, Stavanger (Norway)], E-mail: kurk@sus.no; Steinhaus, D. [Institute of Diagnostic Radiology, Duesseldorf University Hospital, Moorenstr. 5, 40225 Duesseldorf (Germany)], E-mail: Daniele.Steinhaus@med.uni-duesseldorf.de; Klar, V. [Institute of Diagnostic Radiology, Duesseldorf University Hospital, Moorenstr. 5, 40225 Duesseldorf (Germany)], E-mail: verena.klar@uni-duesseldorf.de; Cohnen, M. [Institute of Diagnostic Radiology, Duesseldorf University Hospital, Moorenstr. 5, 40225 Duesseldorf (Germany)], E-mail: cohnen@med.uni-duesseldorf.de; Wittsack, H.J. [Institute of Diagnostic Radiology, Duesseldorf University Hospital, Moorenstr. 5, 40225 Duesseldorf (Germany)], E-mail: wittsack@uni-duesseldorf.de; Saleh, A. [Institute of Diagnostic Radiology, Duesseldorf University Hospital, Moorenstr. 5, 40225 Duesseldorf (Germany)], E-mail: saleh@uni-duesseldorf.de; Moedder, U. [Institute of Diagnostic Radiology, Duesseldorf University Hospital, Moorenstr. 5, 40225 Duesseldorf (Germany)], E-mail: moedder@med.uni-duesseldorf.de; Blondin, D. [Institute of Diagnostic Radiology, Duesseldorf University Hospital, Moorenstr. 5, 40225 Duesseldorf (Germany)], E-mail: blondin@med.uni-duesseldorf.de

    2009-02-15

    Objective: The aim was to compare the diagnostic performance and handling of dynamic contrast-enhanced MRI of the breast with two commercial software solutions ('CADstream' and '3TP') and one self-developed software system ('Mammatool'). Materials and methods: Identical data sets of dynamic breast MRI from 21 patients were evaluated retrospectively with all three software systems. The exams were classified according to the BI-RADS classification. The number of lesions in the parametric mapping was compared to histology or follow-up of more than 2 years. In addition, 25 quality criteria were judged by 3 independent investigators with a score from 0 to 5. Statistical analysis was performed to document the quality ranking of the different software systems. Results: There were 9 invasive carcinomas, one pure DCIS, one papilloma, one radial scar, three histologically proven changes due to mastopathy, one adenosis and two fibroadenomas. Additionally two patients with enhancing parenchyma followed with MRI for more than 3 years and one scar after breast conserving therapy were included. All malignant lesions were classified as BI-RADS 4 or 5 using all software systems and showed significant enhancement in the parametric mapping. 'CADstream' showed the best score on subjective quality criteria. '3TP' showed the lowest number of false-positive results. 'Mammatool' produced the lowest number of benign tissues indicated with parametric overlay. Conclusion: All three software programs tested were adequate for sensitive and efficient assessment of dynamic MRI of the breast. Improvements in specificity may be achievable.

  5. Functional brain mapping using H{sub 2}{sup 15}O positron emission tomography (I): statistical parametric mapping method

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dong Soo; Lee, Jae Sung; Kim, Kyeong Min; Chung, June Key; Lee, Myung Chul [College of Medicine, Seoul National Univ., Seoul (Korea, Republic of)

    1998-08-01

    We investigated the statistical methods used to compose the functional brain map of human working memory and the principal factors that affect the localization methods. Repeated PET scans with four successive tasks, consisting of one control and three different activation tasks, were performed on six right-handed normal volunteers for 2 minutes after bolus injections of 925 MBq H{sub 2}{sup 15}O at intervals of 30 minutes. Image data were analyzed using SPM96 (Statistical Parametric Mapping) implemented in Matlab (Mathworks Inc., U.S.A.). Images from the same subject were spatially registered and normalized using linear and nonlinear transformation methods. Significant differences between the control and each activation state were estimated at every voxel based on the general linear model. Differences in global counts were removed using analysis of covariance (ANCOVA) with global activity as a covariate. Using the ANCOVA-adjusted mean and variance for each condition, t-statistics were computed for every voxel. To interpret the results more easily, t-values were transformed to the standard Gaussian distribution (Z-score). All the subjects carried out the activation and control tests successfully. The average rate of correct answers was 95%. The numbers of activated blobs were 4 for verbal memory I, 9 for verbal memory II, 9 for visual memory, and 6 for conjunctive activation of these three tasks. The verbal working memory tasks activated predominantly left-sided structures, and the visual memory task activated the right hemisphere. We conclude that rCBF PET imaging and the statistical parametric mapping method were useful in the localization of the brain regions for verbal and visual working memory.
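The processing chain summarized above (global-count adjustment followed by a voxel-wise test and conversion to Z-scores) can be sketched outside SPM96 with standard numerical tools. The snippet below is an illustrative simplification, not the SPM implementation: it removes global activity by proportional scaling rather than a full ANCOVA, assumes the images are already spatially normalized, and uses synthetic arrays in place of real PET volumes.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic, already spatially normalized volumes: 6 subjects x 2 conditions,
# flattened to (subjects, voxels). Real data would come from registered PET images.
n_subj, n_vox = 6, 10_000
control = rng.normal(50.0, 5.0, size=(n_subj, n_vox))
active = control + rng.normal(0.0, 5.0, size=(n_subj, n_vox))
active[:, :200] += 6.0                      # a small "activated" region

def scale_to_global(imgs, target=50.0):
    """Proportionally scale each scan so its global mean equals `target`
    (a simplification of the ANCOVA adjustment used in SPM)."""
    g = imgs.mean(axis=1, keepdims=True)
    return imgs * (target / g)

control_adj = scale_to_global(control)
active_adj = scale_to_global(active)

# Paired t-test at every voxel (each subject performed both conditions).
t, p = stats.ttest_rel(active_adj, control_adj, axis=0)

# Convert t-values to standard Gaussian Z-scores via the survival function.
z = stats.norm.isf(stats.t.sf(t, df=n_subj - 1))

print("voxels with Z > 3.09 (p < 0.001 uncorrected):", int((z > 3.09).sum()))
```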

  6. Worst-case Throughput Analysis for Parametric Rate and Parametric Actor Execution Time Scenario-Aware Dataflow Graphs

    Directory of Open Access Journals (Sweden)

    Mladen Skelin

    2014-03-01

    Full Text Available Scenario-aware dataflow (SADF is a prominent tool for modeling and analysis of dynamic embedded dataflow applications. In SADF the application is represented as a finite collection of synchronous dataflow (SDF graphs, each of which represents one possible application behaviour or scenario. A finite state machine (FSM specifies the possible orders of scenario occurrences. The SADF model renders the tightest possible performance guarantees, but is limited by its finiteness. This means that from a practical point of view, it can only handle dynamic dataflow applications that are characterized by a reasonably sized set of possible behaviours or scenarios. In this paper we remove this limitation for a class of SADF graphs by means of SADF model parametrization in terms of graph port rates and actor execution times. First, we formally define the semantics of the model relevant for throughput analysis based on (max,+ linear system theory and (max,+ automata. Second, by generalizing some of the existing results, we give the algorithms for worst-case throughput analysis of parametric rate and parametric actor execution time acyclic SADF graphs with a fully connected, possibly infinite state transition system. Third, we demonstrate our approach on a few realistic applications from digital signal processing (DSP domain mapped onto an embedded multi-processor architecture.
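The throughput analysis referred to above is grounded in (max,+) algebra, in which one graph iteration is modelled as x(k+1) = A ⊗ x(k), with ⊗ the (max,+) matrix product, and the worst-case throughput is the inverse of the asymptotic growth rate of x(k). The fragment below is a generic illustration of that machinery with a fixed numeric matrix standing in for one scenario's timing matrix; it is not the parametric algorithm of the paper.

```python
import numpy as np

NEG_INF = float("-inf")

def maxplus_matvec(A, x):
    """(max,+) matrix-vector product: y_i = max_j (A_ij + x_j)."""
    return np.max(A + x[np.newaxis, :], axis=1)

# Hypothetical timing matrix of one scenario: entry (i, j) is the delay
# contributed to actor i's next firing by actor j's previous firing.
A = np.array([[2.0, 5.0, NEG_INF],
              [NEG_INF, 3.0, 1.0],
              [4.0, NEG_INF, 2.0]])

x = np.zeros(3)                 # initial completion times
growth = []
for k in range(200):
    x_next = maxplus_matvec(A, x)
    growth.append(np.max(x_next) - np.max(x))
    x = x_next

# For a strongly connected graph the growth rate converges to the (max,+)
# eigenvalue lambda; the worst-case throughput is 1/lambda iterations per time unit.
lam = np.mean(growth[-20:])
print(f"cycle time ~ {lam:.3f}, worst-case throughput ~ {1.0 / lam:.3f}")
```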

  7. ActionMap: A web-based software that automates loci assignments to framework maps.

    Science.gov (United States)

    Albini, Guillaume; Falque, Matthieu; Joets, Johann

    2003-07-01

    Genetic linkage computation may be a repetitive and time consuming task, especially when numerous loci are assigned to a framework map. We thus developed ActionMap, a web-based software that automates genetic mapping on a fixed framework map without adding the new markers to the map. Using this tool, hundreds of loci may be automatically assigned to the framework in a single process. ActionMap was initially developed to map numerous ESTs with a small plant mapping population and is limited to inbred lines and backcrosses. ActionMap is highly configurable and consists of Perl and PHP scripts that automate command steps for the MapMaker program. A set of web forms were designed for data import and mapping settings. Results of automatic mapping can be displayed as tables or drawings of maps and may be exported. The user may create personal access-restricted projects to store raw data, settings and mapping results. All data may be edited, updated or deleted. ActionMap may be used either online or downloaded for free (http://moulon.inra.fr/~bioinfo/).

  8. Monopole and dipole estimation for multi-frequency sky maps by linear regression

    Science.gov (United States)

    Wehus, I. K.; Fuskeland, U.; Eriksen, H. K.; Banday, A. J.; Dickinson, C.; Ghosh, T.; Górski, K. M.; Lawrence, C. R.; Leahy, J. P.; Maino, D.; Reich, P.; Reich, W.

    2017-01-01

    We describe a simple but efficient method for deriving a consistent set of monopole and dipole corrections for multi-frequency sky map data sets, allowing robust parametric component separation with the same data set. The computational core of this method is linear regression between pairs of frequency maps, often called T-T plots. Individual contributions from monopole and dipole terms are determined by performing the regression locally in patches on the sky, while the degeneracy between different frequencies is lifted whenever the dominant foreground component exhibits a significant spatial spectral index variation. Based on this method, we present two different, but each internally consistent, sets of monopole and dipole coefficients for the nine-year WMAP, Planck 2013, SFD 100 μm, Haslam 408 MHz and Reich & Reich 1420 MHz maps. The two sets have been derived with different analysis assumptions and data selection, and provide an estimate of residual systematic uncertainties. In general, our values are in good agreement with previously published results. Among the most notable results are a relative dipole between the WMAP and Planck experiments of 10-15μK (depending on frequency), an estimate of the 408 MHz map monopole of 8.9 ± 1.3 K, and a non-zero dipole in the 1420 MHz map of 0.15 ± 0.03 K pointing towards Galactic coordinates (l,b) = (308°,-36°) ± 14°. These values represent the sum of any instrumental and data processing offsets, as well as any Galactic or extra-Galactic component that is spectrally uniform over the full sky.
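The computational core named above, linear regression between pairs of frequency maps evaluated locally in sky patches (T-T plots), is simple to reproduce in outline. The snippet below is a schematic stand-in using flat arrays instead of HEALPix maps: for each patch it fits T2 = a*T1 + b, and the distribution of the fitted offsets b across patches constrains the relative monopole (and, on the real sphere, the dipole).

```python
import numpy as np

rng = np.random.default_rng(1)

# Two synthetic "frequency maps" sharing a common foreground template,
# with a known relative offset of 10 units injected into map2.
npix = 12_000
foreground = rng.gamma(2.0, 5.0, size=npix)
map1 = 1.0 * foreground + rng.normal(0, 0.5, npix)
map2 = 0.8 * foreground + 10.0 + rng.normal(0, 0.5, npix)

def patch_offsets(m1, m2, n_patches=50):
    """Fit m2 = a*m1 + b in each patch (a T-T plot) and return the offsets b."""
    offsets = []
    for patch1, patch2 in zip(np.array_split(m1, n_patches),
                              np.array_split(m2, n_patches)):
        a, b = np.polyfit(patch1, patch2, deg=1)
        offsets.append(b)
    return np.asarray(offsets)

b = patch_offsets(map1, map2)
print(f"recovered relative offset: {b.mean():.2f} +/- "
      f"{b.std(ddof=1) / np.sqrt(b.size):.2f}")
```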

  9. Multi-index Stochastic Collocation Convergence Rates for Random PDEs with Parametric Regularity

    KAUST Repository

    Haji Ali, Abdul Lateef; Nobile, Fabio; Tamellini, Lorenzo; Tempone, Raul

    2016-01-01

    We analyze the recent Multi-index Stochastic Collocation (MISC) method for computing statistics of the solution of a partial differential equation (PDE) with random data, where the random coefficient is parametrized by means of a countable sequence of terms in a suitable expansion. MISC is a combination technique based on mixed differences of spatial approximations and quadratures over the space of random data, and naturally, the error analysis uses the joint regularity of the solution with respect to both the variables in the physical domain and parametric variables. In MISC, the number of problem solutions performed at each discretization level is not determined by balancing the spatial and stochastic components of the error, but rather by suitably extending the knapsack-problem approach employed in the construction of the quasi-optimal sparse-grids and Multi-index Monte Carlo methods, i.e., we use a greedy optimization procedure to select the most effective mixed differences to include in the MISC estimator. We apply our theoretical estimates to a linear elliptic PDE in which the log-diffusion coefficient is modeled as a random field, with a covariance similar to a Matérn model, whose realizations have spatial regularity determined by a scalar parameter. We conduct a complexity analysis based on a summability argument showing algebraic rates of convergence with respect to the overall computational work. The rate of convergence depends on the smoothness parameter, the physical dimensionality and the efficiency of the linear solver. Numerical experiments show the effectiveness of MISC in this infinite dimensional setting compared with the Multi-index Monte Carlo method and compare the convergence rate against the rates predicted in our theoretical analysis. © 2016 SFoCM
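The knapsack-style construction mentioned in the abstract, selecting the mixed differences with the best error-reduction-per-work ratio, can be illustrated with a toy greedy loop. The sketch below is not the MISC algorithm itself: it assumes hypothetical models in which the error contribution and the work of a mixed difference with multi-index (i, j) decay and grow geometrically, and it simply adds indices in order of profit until a work budget is exhausted.

```python
import itertools
import heapq

# Hypothetical models: error contribution decays, work grows, geometrically
# in the spatial level i and the stochastic level j.
def error_contribution(i, j):
    return 2.0 ** (-2 * i) * 3.0 ** (-j)

def work(i, j):
    return 2.0 ** i * 2.0 ** j

budget = 200.0

# Candidate mixed differences (a real implementation would also enforce
# downward closure of the selected index set).
candidates = [(i, j) for i, j in itertools.product(range(8), range(8))]

# Greedy knapsack: highest error reduction per unit work first.
heap = [(-error_contribution(i, j) / work(i, j), i, j) for i, j in candidates]
heapq.heapify(heap)

selected, spent = [], 0.0
while heap:
    profit, i, j = heapq.heappop(heap)
    if spent + work(i, j) > budget:
        continue
    selected.append((i, j))
    spent += work(i, j)

residual = sum(error_contribution(i, j) for i, j in candidates) \
           - sum(error_contribution(i, j) for i, j in selected)
print(f"selected {len(selected)} mixed differences, work {spent:.0f}, "
      f"residual (unselected) error {residual:.3e}")
```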

  10. Multi-index Stochastic Collocation Convergence Rates for Random PDEs with Parametric Regularity

    KAUST Repository

    Haji Ali, Abdul Lateef

    2016-08-26

    We analyze the recent Multi-index Stochastic Collocation (MISC) method for computing statistics of the solution of a partial differential equation (PDE) with random data, where the random coefficient is parametrized by means of a countable sequence of terms in a suitable expansion. MISC is a combination technique based on mixed differences of spatial approximations and quadratures over the space of random data, and naturally, the error analysis uses the joint regularity of the solution with respect to both the variables in the physical domain and parametric variables. In MISC, the number of problem solutions performed at each discretization level is not determined by balancing the spatial and stochastic components of the error, but rather by suitably extending the knapsack-problem approach employed in the construction of the quasi-optimal sparse-grids and Multi-index Monte Carlo methods, i.e., we use a greedy optimization procedure to select the most effective mixed differences to include in the MISC estimator. We apply our theoretical estimates to a linear elliptic PDE in which the log-diffusion coefficient is modeled as a random field, with a covariance similar to a Matérn model, whose realizations have spatial regularity determined by a scalar parameter. We conduct a complexity analysis based on a summability argument showing algebraic rates of convergence with respect to the overall computational work. The rate of convergence depends on the smoothness parameter, the physical dimensionality and the efficiency of the linear solver. Numerical experiments show the effectiveness of MISC in this infinite dimensional setting compared with the Multi-index Monte Carlo method and compare the convergence rate against the rates predicted in our theoretical analysis. © 2016 SFoCM

  11. Realization of multi-parameter and multi-state in fault tree computer-aided building software

    International Nuclear Information System (INIS)

    Guo Xiaoli; Tong Jiejuan; Xue Dazhi

    2004-01-01

    More than one parameter, and more than one failed state per parameter, are often involved in building a fault tree, so computer-aided fault tree building software must be able to deal with multiple parameters and multiple states. The Fault Tree Expert System (FTES) aims to aid the fault-tree-building work for hydraulic systems. This paper describes how multi-parameter and multi-state handling are realized in FTES, with a focus on the Knowledge Base and the Illation Engine. (author)

  12. Technical issues relating to the statistical parametric mapping of brain SPECT studies

    International Nuclear Information System (INIS)

    Hatton, R.L.; Cordato, N.; Hutton, B.F.; Lau, Y.H.; Evans, S.G.

    2000-01-01

    Full text: Statistical Parametric Mapping (SPM) is a software tool designed for the statistical analysis of functional neuroimages, specifically Positron Emission Tomography and functional Magnetic Resonance Imaging, and more recently SPECT. This review examines some problems associated with the analysis of SPECT. A comparison of a patient group with normal studies revealed factors that could influence results, some that commonly occur, others that require further exploration. To optimise the differences between two groups of subjects, both spatial variability and differences in global activity must be minimised. The choice and effectiveness of the co-registration method and the approach to normalisation of activity concentration can affect the optimisation. A small number of subject scans were identified as possessing truncated data resulting in edge effects that could adversely influence the analysis. Other problems included unusual areas of significance possibly related to reconstruction methods and the geometry associated with nonparallel collimators. Areas of extracerebral significance are a point of concern and may result from scatter effects or mis-registration. Difficulties in patient positioning, due to postural limitations, can lead to resolution differences. SPM has been used to assess areas of statistical significance arising from these technical factors, as opposed to areas of true clinical significance, when comparing subject groups. This contributes to a better understanding of the effects of technical factors so that these may be eliminated, minimised, or incorporated in the study design. Copyright (2000) The Australian and New Zealand Society of Nuclear Medicine Inc

  13. Multi-physics fluid-structure interaction modelling software

    CSIR Research Space (South Africa)

    Malan, AG

    2008-11-01

    Full Text Available Multi-physics fluid-structure interaction modelling software AG MALAN AND O OXTOBY CSIR Defence, Peace, Safety and Security, PO Box 395, Pretoria, 0001 Email: amalan@csir.co.za – www.csir.co.za Internationally leading aerospace company Airbus sponsored key components... of the development of the CSIR fluid-structure interaction (FSI) software. Below are extracts from their evaluation of the developed technology: “The field of FSI covers a massive range of engineering problems, each with their own multi-parameter, individual...

  14. Multi-copy entanglement purification with practical spontaneous parametric down conversion sources

    International Nuclear Information System (INIS)

    Zhang Shuai-Shuai; Shu Qi; Sheng Yu-Bo; Zhou Lan

    2017-01-01

    Entanglement purification distills high-quality entanglement from low-quality entanglement using local operations and classical communication. It is one of the key technologies in long-distance quantum communication. We discuss an entanglement purification protocol (EPP) with spontaneous parametric down conversion (SPDC) sources, in contrast to previous multi-copy mixed-state EPPs, which require ideal entanglement sources. We show that the SPDC source is not an obstacle to purification, but can benefit the fidelity of the purified mixed state. This EPP works with linear optics and is feasible with current experimental technology. (paper)

  15. Multi-parametric monitoring and assessment of high-intensity focused ultrasound (HIFU) boiling by harmonic motion imaging for focused ultrasound (HMIFU): an ex vivo feasibility study

    International Nuclear Information System (INIS)

    Hou, Gary Y; Marquet, Fabrice; Wang, Shutao; Konofagou, Elisa E

    2014-01-01

    Harmonic motion imaging for focused ultrasound (HMIFU) is a recently developed high-intensity focused ultrasound (HIFU) treatment monitoring method with feasibility demonstrated in vitro and in vivo. Here, a multi-parametric study is performed to investigate both elastic and acoustics-independent viscoelastic tissue changes using the Harmonic Motion Imaging (HMI) displacement, axial compressive strain and change in relative phase shift during high-energy HIFU treatment with tissue boiling. Forty-three (n = 43) thermal lesions were formed in ex vivo canine liver specimens (n = 28). Two-dimensional (2D) transverse HMI displacement maps were also obtained before and after lesion formation. The same method was repeated for 10 s, 20 s and 30 s HIFU durations at three different acoustic powers of 8, 10, and 11 W, which were selected and verified as treatment parameters capable of inducing boiling using both thermocouple and passive cavitation detection (PCD) measurements. Although a steady decrease in the displacement, compressive strain, and relative change in the focal phase shift (Δϕ) was observed in numerous cases, indicating an overall increase in relative stiffness, the study outcomes also showed that during boiling, a reverse lesion-to-background displacement contrast was detected, indicating potential changes in tissue absorption, geometry, and/or mechanical gelatification or pulverization. Following treatment, corresponding 2D HMI displacement images of the thermal lesions also mapped a consistent discrepancy in the lesion-to-background displacement contrast. Despite the expectedly chaotic changes in acoustic properties with boiling, the relative change in phase shift showed a consistent decrease, indicating its robustness for monitoring biomechanical properties independent of the acoustic property changes throughout the HIFU treatment. In addition, the 2D HMI displacement images confirmed and indicated the increase in the thermal lesion size with

  16. Logistic discriminant parametric mapping: a novel method for the pixel-based differential diagnosis of Parkinson's disease

    International Nuclear Information System (INIS)

    Acton, P.D.; Mozley, P.D.; Kung, H.F.; Pennsylvania Univ., Philadelphia, PA

    1999-01-01

    Positron emission tomography (PET) and single-photon emission tomography (SPET) imaging of the dopaminergic system is a powerful tool for distinguishing groups of patients with neurodegenerative disorders, such as Parkinson's disease (PD). However, the differential diagnosis of individual subjects presenting early in the progress of the disease is much more difficult, particularly using region-of-interest analysis where small localized differences between subjects are diluted. In this paper we present a novel pixel-based technique using logistic discriminant analysis to distinguish between a group of PD patients and age-matched healthy controls. Simulated images of an anthropomorphic head phantom were used to test the sensitivity of the technique to striatal lesions of known size. The methodology was applied to real clinical SPET images of binding of technetium-99m labelled TRODAT-1 to dopamine transporters in PD patients (n=42) and age-matched controls (n=23). The discriminant model was trained on a subset (n=17) of patients for whom the diagnosis was unequivocal. Logistic discriminant parametric maps were obtained for all subjects, showing the probability distribution of pixels classified as being consistent with PD. The probability maps were corrected for correlated multiple comparisons assuming an isotropic Gaussian point spread function. Simulated lesion sizes measured by logistic discriminant parametric mapping (LDPM) gave strong correlations with the known data (r²=0.985, P<0.001). LDPM correctly classified all PD patients (sensitivity 100%) and only misclassified one control (specificity 95%). All patients who had equivocal clinical symptoms associated with early onset PD (n=4) were correctly assigned to the patient group. Statistical parametric mapping (SPM) had a sensitivity of only 24% on the same patient group. LDPM is a powerful pixel-based tool for the differential diagnosis of patients with PD and healthy controls. The diagnosis of disease even
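As a rough illustration of pixel-based logistic discriminant analysis (training a logistic model on subjects with unequivocal diagnoses and then producing per-voxel probability maps for new scans), the following sketch uses scikit-learn on synthetic "striatal uptake" images. It is a schematic of the general idea only; the published method's smoothing, correction for correlated multiple comparisons and SPECT preprocessing are not reproduced here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)

# Synthetic training data: per-subject images flattened to (subjects, voxels),
# with reduced "uptake" in a block of voxels for the patient group.
n_train, n_vox = 17, 500
controls = rng.normal(1.0, 0.1, size=(n_train // 2, n_vox))
patients = rng.normal(1.0, 0.1, size=(n_train - n_train // 2, n_vox))
patients[:, :50] -= 0.3                       # striatal deficit region
X = np.vstack([controls, patients])
y = np.r_[np.zeros(len(controls)), np.ones(len(patients))]

# Fit an independent logistic discriminant at every voxel.
models = []
for v in range(n_vox):
    clf = LogisticRegression()
    clf.fit(X[:, [v]], y)
    models.append(clf)

# Probability map for a new (simulated) patient scan.
new_scan = rng.normal(1.0, 0.1, size=n_vox)
new_scan[:50] -= 0.3
prob_map = np.array([m.predict_proba(new_scan[v].reshape(1, 1))[0, 1]
                     for v, m in enumerate(models)])

print("mean P(PD) inside deficit region :", round(prob_map[:50].mean(), 2))
print("mean P(PD) outside deficit region:", round(prob_map[50:].mean(), 2))
```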

  17. Some software issues in mapping of power distribution feeders

    International Nuclear Information System (INIS)

    Mufti, I.A.

    1994-01-01

    This paper is about the in-house developed software for a distribution feeder mapping project. It first gives a bird's-eye view of the project, highlighting the management and logistical complexity introduced by the sheer size of the project. It then gives an overview of the software developed and moves on to describe circuit tracing, the circuit model, leaf isolation (for the tree-structured network) and backtracking in more detail; describing all of the many parts of the software is not possible because of space limitations. (author)

  18. A multitemporal and non-parametric approach for assessing the impacts of drought on vegetation greenness

    DEFF Research Database (Denmark)

    Carrao, Hugo; Sepulcre, Guadalupe; Horion, Stéphanie Marie Anne F

    2013-01-01

    This study evaluates the relationship between the frequency and duration of meteorological droughts and the subsequent temporal changes on the quantity of actively photosynthesizing biomass (greenness) estimated from satellite imagery on rainfed croplands in Latin America. An innovative non-parametric...... and non-supervised approach, based on the Fisher-Jenks optimal classification algorithm, is used to identify multi-scale meteorological droughts on the basis of empirical cumulative distributions of 1, 3, 6, and 12-monthly precipitation totals. As input data for the classifier, we use the gridded GPCC...... for the period between 1998 and 2010. The time-series analysis of vegetation greenness is performed during the growing season with a non-parametric method, namely the seasonal Relative Greenness (RG) of spatially accumulated fAPAR. The Global Land Cover map of 2000 and the GlobCover maps of 2005/2006 and 2009...
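The seasonal Relative Greenness (RG) used above is essentially a rescaling of accumulated fAPAR against its historical range, which keeps the analysis non-parametric. The snippet below sketches one common form of that rescaling for a single pixel's time series of growing-season fAPAR sums; the formula and numbers are illustrative assumptions, not the exact operational definition used in the study.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic growing-season accumulated fAPAR for one pixel, 1998-2010.
years = np.arange(1998, 2011)
fapar_sum = rng.normal(60.0, 8.0, size=years.size)
fapar_sum[years == 2009] = 38.0          # an artificially dry year

def relative_greenness(series):
    """Rescale each year's accumulated fAPAR against the historical
    minimum and maximum of the record (0 = driest year, 100 = greenest)."""
    lo, hi = series.min(), series.max()
    return 100.0 * (series - lo) / (hi - lo)

rg = relative_greenness(fapar_sum)
for yr, val in zip(years, rg):
    flag = "  <-- low greenness" if val < 20 else ""
    print(f"{yr}: RG = {val:5.1f}{flag}")
```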

  19. Software project management tools in global software development: a systematic mapping study.

    Science.gov (United States)

    Chadli, Saad Yasser; Idri, Ali; Ros, Joaquín Nicolás; Fernández-Alemán, José Luis; de Gea, Juan M Carrillo; Toval, Ambrosio

    2016-01-01

    Global software development (GSD) which is a growing trend in the software industry is characterized by a highly distributed environment. Performing software project management (SPM) in such conditions implies the need to overcome new limitations resulting from cultural, temporal and geographic separation. The aim of this research is to discover and classify the various tools mentioned in literature that provide GSD project managers with support and to identify in what way they support group interaction. A systematic mapping study has been performed by means of automatic searches in five sources. We have then synthesized the data extracted and presented the results of this study. A total of 102 tools were identified as being used in SPM activities in GSD. We have classified these tools, according to the software life cycle process on which they focus and how they support the 3C collaboration model (communication, coordination and cooperation). The majority of the tools found are standalone tools (77%). A small number of platforms (8%) also offer a set of interacting tools that cover the software development lifecycle. Results also indicate that SPM areas in GSD are not adequately supported by corresponding tools and deserve more attention from tool builders.

  20. Dynamic modeling and explicit/multi-parametric MPC control of pressure swing adsorption systems

    KAUST Repository

    Khajuria, Harish

    2011-01-01

    Pressure swing adsorption (PSA) is a flexible, albeit complex gas separation system. Due to its inherent nonlinear nature and discontinuous operation, the design of a model based PSA controller, especially with varying operating conditions, is a challenging task. This work focuses on the design of an explicit/multi-parametric model predictive controller for a PSA system. Based on a system involving four adsorbent beds separating 70% H2, 30% CH4 mixture into high purity hydrogen, the key controller objective is to fast track H2 purity to a set point value of 99.99%. To perform this task, a rigorous and systematic framework is employed. First, a high fidelity detailed dynamic model is built to represent the system's real operation, and understand its dynamic behavior. The model is then used to derive appropriate linear models by applying suitable system identification techniques. For the reduced models, a model predictive control (MPC) step is formulated, where latest developments in multi-parametric programming and control are applied to derive a novel explicit MPC controller. To test the performance of the designed controller, closed loop simulations are performed where the dynamic model is used as the virtual plant. Comparison studies of the derived explicit MPC controller are also performed with conventional PID controllers. © 2010 Elsevier Ltd. All rights reserved.
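An explicit/multi-parametric MPC controller replaces the online optimization with an offline-computed piecewise-affine law: the state (and set-point) space is partitioned into polyhedral critical regions, and inside region r the input is u = F_r x + g_r. The sketch below only illustrates the online evaluation step of such a law on a hypothetical two-region partition with made-up gains; deriving the regions themselves requires a multi-parametric programming tool and is not shown.

```python
import numpy as np

# Hypothetical explicit MPC law: each critical region is a polyhedron
# {x : A x <= b} with an affine feedback u = F x + g valid inside it.
REGIONS = [
    {"A": np.array([[1.0, 0.0]]), "b": np.array([0.0]),      # x1 <= 0
     "F": np.array([[-0.8, -0.2]]), "g": np.array([0.05])},
    {"A": np.array([[-1.0, 0.0]]), "b": np.array([0.0]),     # x1 >= 0
     "F": np.array([[-1.2, -0.4]]), "g": np.array([0.0])},
]

def explicit_mpc(x, regions=REGIONS, tol=1e-9):
    """Point-locate the state in the polyhedral partition and evaluate
    the corresponding affine control law."""
    for r in regions:
        if np.all(r["A"] @ x <= r["b"] + tol):
            return r["F"] @ x + r["g"]
    raise ValueError("state outside the explored parameter space")

# Closed-loop simulation against a toy linear plant x+ = A x + B u.
A = np.array([[1.0, 0.1], [0.0, 0.9]])
B = np.array([[0.0], [0.1]])
x = np.array([0.8, -0.5])
for k in range(30):
    u = explicit_mpc(x)
    x = A @ x + B @ u
print("state after 30 steps:", np.round(x, 4))
```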

  1. Multi-parametric MRI in cervical cancer. Early prediction of response to concurrent chemoradiotherapy in combination with clinical prognostic factors

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Wei; Chen, Bing; Wang, Ai Jun; Zhao, Jian Guo [The General Hospital of Ningxia Medical University, Department of Radiology, Yinchuan (China); Qiang, Jin Wei [Fudan University, Department of Radiology, Jinshan Hospital, Shanghai (China); Tian, Hai Ping [The General Hospital of Ningxia Medical University, Department of Pathology, Yinchuan (China)

    2018-01-15

    To investigate the prediction of response to concurrent chemoradiotherapy (CCRT) through a combination of pretreatment multi-parametric magnetic resonance imaging (MRI) with clinical prognostic factors (CPF) in cervical cancer patients. Sixty-five patients underwent conventional MRI, diffusion-weighted imaging (DWI), and dynamic contrast-enhanced MRI (DCE-MRI) before CCRT. The patients were divided into non- and residual tumour groups according to post-treatment MRI. Pretreatment MRI parameters and CPF between the two groups were compared and prognostic factors, optimal thresholds, and predictive performance for post-treatment residual tumour occurrence were estimated. The residual group showed a lower maximum slope of increase (MSI{sub L}) and signal enhancement ratio (SER{sub L}) in low-perfusion subregions, a higher apparent diffusion coefficient (ADC) value, and a higher stage than the non-residual tumour group (p < 0.001, p = 0.003, p < 0.001, and p < 0.001, respectively). MSI{sub L} and ADC were independent prognostic factors. The combination of both measures improved the diagnostic performance compared with individual MRI parameters. A further combination of these two factors with CPF exhibited the highest predictive performance. Pretreatment MSI{sub L} and ADC were independent prognostic factors for cervical cancer. The predictive capacity of multi-parametric MRI was superior to individual MRI parameters. The combination of multi-parametric MRI with CPF further improved the predictive performance. (orig.)

  2. MRI volumetric measurement of hippocampal formation based on statistic parametric mapping

    International Nuclear Information System (INIS)

    Hua Jianming; Jiang Biao; Zhou Jiong; Zhang Weimin

    2010-01-01

    Objective: To study MRI volumetric measurement of the hippocampal formation using statistical parametric mapping (SPM) software and to discuss the value of the method applied to Alzheimer's disease (AD). Methods: The SPM software was used to divide the three-dimensional MRI brain image into gray matter, white matter and CSF separately. The bilateral hippocampal formations in both the AD group and the normal control group were delineated and the volumes were measured. The SPM method was compared with the conventional method based on regions of interest (ROI), which was the gold standard of volume measurement. The time used in measuring the volume by these two methods was recorded and compared by a two independent samples t test. Moreover, 7 physicians measured the left hippocampal formation of one same control with both of the two methods. The frequency distribution and dispersion of data acquired with the two methods were evaluated using the coefficient of variation (CV). Results: (1) The volume of the bilateral hippocampal formations with the SPM method was (1.88 ± 0.07) cm³ and (1.93 ± 0.08) cm³ respectively in the AD group, while it was (2.99 ± 0.07) cm³ and (3.02 ± 0.06) cm³ in the control group. The volume of the bilateral hippocampal formations measured by the ROI method was (1.87 ± 0.06) cm³ and (1.91 ± 0.09) cm³ in the AD group, while it was (2.97 ± 0.08) cm³ and (3.00 ± 0.05) cm³ in the control group. There was no significant difference between the SPM method and the conventional ROI method in the AD group and the control group (t=1.500, 1.617, 1.095, 1.889, P>0.05). However, the time used for delineation and volume measurement was significantly different. The time used in SPM measurement was (38.1 ± 2.0) min, while that in ROI measurement was (55.4 ± 2.4) min (t=-25.918, P 3 respectively. The frequency distribution of hippocampal formation volume measured by the SPM method and the ROI method was different. The CV of the SPM method was 7% and the CV of the ROI method was 19%. Conclusions: The borders of

  3. MODIS-based multi-parametric platform for mapping of flood affected areas. Case study: 2006 Danube extreme flood in Romania

    Directory of Open Access Journals (Sweden)

    Craciunescu Vasile

    2016-12-01

    Full Text Available Flooding remains the most widely distributed natural hazard in Europe, leading to significant economic and social impact. Earth observation data are presently capable of making fundamental contributions towards reducing the detrimental effects of extreme floods. Technological advances make it possible to develop online services able to process high volumes of satellite data without the need for dedicated desktop software licenses. The main objective of the case study is to present and evaluate a methodology for mapping flooded areas based on indices derived from MODIS satellite images and using state-of-the-art geospatial web services. The methodology and the developed platform were tested with data for the historical flood event that affected the Danube floodplain in 2006 in Romania. The results proved that, despite the relatively coarse resolution, MODIS data are very useful for mapping the development of the flooded area in large plain floods. Moreover, it was shown that the possibility to adapt and combine the existing global algorithms for flood detection to fit the local conditions is extremely important for obtaining accurate results.
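Index-based flood detection from MODIS reflectances typically amounts to computing a water index per pixel and thresholding it, optionally differencing against a pre-flood mask. The snippet below is a generic illustration on synthetic reflectance arrays using the McFeeters NDWI; the specific index, bands and threshold used by the platform described above may differ.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic MODIS surface reflectance bands (green and NIR), 0-1 scale.
shape = (200, 200)
green = rng.uniform(0.05, 0.15, shape)
nir = rng.uniform(0.25, 0.45, shape)
# Inject a flooded patch: open water reflects more in green than in NIR.
nir[60:120, 40:160] = rng.uniform(0.02, 0.08, (60, 120))

def ndwi(green_band, nir_band, eps=1e-6):
    """Normalized Difference Water Index (McFeeters): (G - NIR) / (G + NIR)."""
    return (green_band - nir_band) / (green_band + nir_band + eps)

water_mask = ndwi(green, nir) > 0.0      # a commonly used, scene-dependent threshold

print("flooded pixels detected:", int(water_mask.sum()))
print("fraction of scene flooded:", round(water_mask.mean(), 3))
```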

  4. Diagnosis of regional cerebral blood flow abnormalities using SPECT: agreement between individualized statistical parametric maps and visual inspection by nuclear medicine physicians with different levels of expertise in nuclear neurology

    Energy Technology Data Exchange (ETDEWEB)

    Rocha, Euclides Timoteo da, E-mail: euclidestimoteo@uol.com.b [Fundacao Pio XII, Barretos, SP (Brazil). Hospital de Cancer. Dept. de Medicina Nuclear; Buchpiguel, Carlos Alberto [Hospital do Coracao, Sao Paulo, SP (Brazil). Dept. de Medicina Nuclear; Nitrini, Ricardo [Universidade de Sao Paulo (USP), SP (Brazil). Faculdade de Medicina. Dept. de Neurologia; Tazima, Sergio [Hospital Alemao Oswaldo Cruz (HAOC), Sao Paulo, SP (Brazil). Dept. de Medicina Nuclear; Peres, Stela Verzinhase [Fundacao Pio XII, Barretos, SP (Brazil). Hospital de Cancer; Busatto Filho, Geraldo [Universidade de Sao Paulo (USP), SP (Brazil). Faculdade de Medicina. Div. de Medicina Nuclear

    2009-07-01

    Introduction: visual analysis is widely used to interpret regional cerebral blood flow (rCBF) SPECT images in clinical practice despite its limitations. Automated methods are employed to investigate between-group rCBF differences in research studies but have rarely been explored in individual analyses. Objectives: to compare visual inspection by nuclear physicians with the automated statistical parametric mapping program using a SPECT dataset of patients with neurological disorders and normal control images. Methods: using statistical parametric mapping, 14 SPECT images from patients with various neurological disorders were compared individually with a databank of 32 normal images using a statistical threshold of p<0.05 (corrected for multiple comparisons at the level of individual voxels or clusters). Statistical parametric mapping results were compared with visual analyses by a nuclear physician highly experienced in neurology (A) as well as a nuclear physician with a general background of experience (B) who independently classified images as normal or altered, and determined the location of changes and the severity. Results: of the 32 images of the normal databank, 4 generated maps showing rCBF abnormalities (p<0.05, corrected). Among the 14 images from patients with neurological disorders, 13 showed rCBF alterations. Statistical parametric mapping and physician A completely agreed on 84.37% and 64.28% of cases from the normal databank and neurological disorders, respectively. The agreement between statistical parametric mapping and ratings of physician B was lower (71.18% and 35.71%, respectively). Conclusion: statistical parametric mapping replicated the findings described by the more experienced nuclear physician. This finding suggests that automated methods for individually analyzing rCBF SPECT images may be a valuable resource to complement visual inspection in clinical practice. (author)
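Comparing a single patient scan against a databank of normal scans reduces, at each voxel, to a contrast in which one "group" has a single member; a simple approximation is a voxel-wise z-map against the normal mean and standard deviation. The sketch below illustrates that idea on synthetic data and is not the SPM model (general linear model with corrected thresholds) used in the study above.

```python
import numpy as np

rng = np.random.default_rng(5)

# Normal databank: 32 spatially normalized rCBF scans, flattened to (scans, voxels).
n_normals, n_vox = 32, 5_000
normals = rng.normal(50.0, 4.0, size=(n_normals, n_vox))

# One patient scan with a hypoperfused region.
patient = rng.normal(50.0, 4.0, size=n_vox)
patient[1000:1200] -= 12.0

mu = normals.mean(axis=0)
sigma = normals.std(axis=0, ddof=1)

# Voxel-wise z-map of the individual patient against the normal databank.
z = (patient - mu) / sigma

threshold = -3.0            # uncorrected; SPM would correct for multiple comparisons
print("voxels flagged as hypoperfused:", int((z < threshold).sum()))
```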

  5. Software support for SBGN maps: SBGN-ML and LibSBGN.

    Science.gov (United States)

    van Iersel, Martijn P; Villéger, Alice C; Czauderna, Tobias; Boyd, Sarah E; Bergmann, Frank T; Luna, Augustin; Demir, Emek; Sorokin, Anatoly; Dogrusoz, Ugur; Matsuoka, Yukiko; Funahashi, Akira; Aladjem, Mirit I; Mi, Huaiyu; Moodie, Stuart L; Kitano, Hiroaki; Le Novère, Nicolas; Schreiber, Falk

    2012-08-01

    LibSBGN is a software library for reading, writing and manipulating Systems Biology Graphical Notation (SBGN) maps stored using the recently developed SBGN-ML file format. The library (available in C++ and Java) makes it easy for developers to add SBGN support to their tools, whereas the file format facilitates the exchange of maps between compatible software applications. The library also supports validation of maps, which simplifies the task of ensuring compliance with the detailed SBGN specifications. With this effort we hope to increase the adoption of SBGN in bioinformatics tools, ultimately enabling more researchers to visualize biological knowledge in a precise and unambiguous manner. Milestone 2 was released in December 2011. Source code, example files and binaries are freely available under the terms of either the LGPL v2.1+ or Apache v2.0 open source licenses from http://libsbgn.sourceforge.net. sbgn-libsbgn@lists.sourceforge.net.

  6. Landslide mapping with multi-scale object-based image analysis – a case study in the Baichi watershed, Taiwan

    Directory of Open Access Journals (Sweden)

    T. Lahousse

    2011-10-01

    Full Text Available We developed a multi-scale OBIA (object-based image analysis) landslide detection technique to map shallow landslides in the Baichi watershed, Taiwan, after the 2004 Typhoon Aere event. Our semi-automated detection method selected multiple scales through landslide size statistics analysis for successive classification rounds. The detection performance achieved a modified success rate (MSR) of 86.5% with the training dataset and 86% with the validation dataset. This performance level was due to the multi-scale aspect of our methodology, as the MSR for single scale classification was substantially lower, even after spectral difference segmentation, with a maximum of 74%. Our multi-scale technique was capable of detecting landslides of varying sizes, including very small landslides, as small as 95 m2. The method presented certain limitations: the thresholds we established for classification were specific to the study area, to the landslide type in the study area, and to the spectral characteristics of the satellite image. Because updating site-specific and image-specific classification thresholds is easy with OBIA software, our multi-scale technique is expected to be useful for mapping shallow landslides at watershed level.

  7. BYMUR software: a free and open source tool for quantifying and visualizing multi-risk analyses

    Science.gov (United States)

    Tonini, Roberto; Selva, Jacopo

    2013-04-01

    The BYMUR software aims to provide an easy-to-use open source tool for both computing multi-risk and managing/visualizing/comparing all the inputs (e.g. hazard, fragilities and exposure) as well as the corresponding results (e.g. risk curves, risk indexes). For all inputs, a complete management of inter-model epistemic uncertainty is considered. The BYMUR software will be one of the final products provided by the homonymous ByMuR project (http://bymur.bo.ingv.it/) funded by the Italian Ministry of Education, Universities and Research (MIUR), focused on (i) providing a quantitative and objective general method for a comprehensive long-term multi-risk analysis in a given area, accounting for inter-model epistemic uncertainty through Bayesian methodologies, and (ii) applying the methodology to seismic, volcanic and tsunami risks in Naples (Italy). More specifically, the BYMUR software will be able to separately account for the probabilistic hazard assessment of different kinds of hazardous phenomena, the relative (time-dependent/independent) vulnerabilities and exposure data, and their possible (predefined) interactions: the software will analyze these inputs and will use them to estimate both single- and multi-risk associated with a specific target area. In addition, it will be possible to connect the software to further tools (e.g., a full hazard analysis), allowing a dynamic I/O of results. The use of the Python programming language guarantees that the final software will be open source and platform independent. Moreover, thanks to the integration of some of the most popular and feature-rich Python scientific modules (Numpy, Matplotlib, Scipy) with the wxPython graphical user toolkit, the final tool will be equipped with a comprehensive Graphical User Interface (GUI) able to control and visualize (in the form of tables, maps and/or plots) any stage of the multi-risk analysis. The additional features of importing/exporting data in MySQL databases and/or standard XML formats (for

  8. PyPWA: A partial-wave/amplitude analysis software framework

    Science.gov (United States)

    Salgado, Carlos

    2016-05-01

    The PyPWA project aims to develop a software framework for Partial Wave and Amplitude Analysis of data, providing the user with software tools to identify resonances from multi-particle final states in photoproduction. Most of the code is written in Python. The software is divided into two main branches: one general shell where amplitude parameters (or those of any parametric model) are estimated from the data. This branch also includes software to produce simulated data-sets using the fitted amplitudes. A second branch contains a specific realization of the isobar model (with room to include Deck-type and other isobar model extensions) to perform PWA with an interface into the computer resources at Jefferson Lab. We are currently implementing parallelism and vectorization using Intel's Xeon Phi family of coprocessors.

  9. A Systematic Mapping Study of Software Architectures for Cloud Based Systems

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali

    2014-01-01

    Context: Cloud computing has gained significant attention from researchers and practitioners. This emerging paradigm is being used to provide solutions in multiple domains without huge upfront investment because of its on-demand resource-provisioning model. However, the information about how software...... of this study is to systematically identify and analyze the currently published research on the topics related to software architectures for cloud-based systems in order to identify architecture solutions for achieving quality requirements. Method: We decided to carry out a systematic mapping study to find...... as much peer-reviewed literature on the topics related to software architectures for cloud-based systems as possible. This study has been carried out by following the guidelines for conducting systematic literature reviews and systematic mapping studies as reported in the literature. Based on our paper...

  10. Managing mapping data using commercial data base management software.

    Science.gov (United States)

    Elassal, A.A.

    1985-01-01

    Electronic computers are involved in almost every aspect of the map-making process. This involvement has become so thorough that it is practically impossible to find a recently developed process or device in the mapping field which does not employ digital processing in some form or another. This trend, which has been evolving over two decades, is accelerated by the significant improvements in capability, reliability, and cost-effectiveness of electronic devices. Computerized mapping processes and devices share a common need for machine readable data. Integrating groups of these components into automated mapping systems requires careful planning for data flow amongst them. Exploring the utility of commercial data base management software to assist in this task is the subject of this paper. -Author

  11. Quantitative analysis of diffusion tensor imaging (DTI) using statistical parametric mapping (SPM) for brain disorders

    Science.gov (United States)

    Lee, Jae-Seung; Im, In-Chul; Kang, Su-Man; Goo, Eun-Hoe; Kwak, Byung-Joon

    2013-07-01

    This study aimed to quantitatively analyze data from diffusion tensor imaging (DTI) using statistical parametric mapping (SPM) in patients with brain disorders and to assess its potential utility for analyzing brain function. DTI was obtained by performing 3.0-T magnetic resonance imaging for patients with Alzheimer's disease (AD) and vascular dementia (VD), and the data were analyzed using Matlab-based SPM software. The two-sample t-test was used for error analysis of the location of the activated pixels. We compared regions of white matter where the fractional anisotropy (FA) values were low and the apparent diffusion coefficients (ADCs) were increased. In the AD group, the FA values were low in the right superior temporal gyrus, right inferior temporal gyrus, right sub-lobar insula, and right occipital lingual gyrus whereas the ADCs were significantly increased in the right inferior frontal gyrus and right middle frontal gyrus. In the VD group, the FA values were low in the right superior temporal gyrus, right inferior temporal gyrus, right limbic cingulate gyrus, and right sub-lobar caudate tail whereas the ADCs were significantly increased in the left lateral globus pallidus and left medial globus pallidus. In conclusion by using DTI and SPM analysis, we were able to not only determine the structural state of the regions affected by brain disorders but also quantitatively analyze and assess brain function.
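The group comparison of FA and ADC maps described above rests on a voxel-wise two-sample t-test. The following sketch shows the core computation on synthetic, already spatially normalized FA volumes; it is illustrative only and omits SPM's smoothing, masking and corrected-threshold machinery.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Synthetic FA maps, flattened to (subjects, voxels), for patients and controls.
n_per_group, n_vox = 15, 8_000
controls = rng.normal(0.45, 0.05, size=(n_per_group, n_vox))
patients = rng.normal(0.45, 0.05, size=(n_per_group, n_vox))
patients[:, 500:700] -= 0.08          # region of reduced white-matter integrity

# Voxel-wise two-sample t-test (controls > patients, i.e. lower FA in patients).
t, p = stats.ttest_ind(controls, patients, axis=0)

alpha = 0.001                          # uncorrected threshold for illustration
significant = (p < alpha) & (t > 0)
print("voxels with significantly reduced FA:", int(significant.sum()))
```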

  12. Development of a software for reconstruction of X-ray fluorescence intensity maps

    International Nuclear Information System (INIS)

    Almeida, Andre Pereira de; Braz, Delson; Mota, Carla Lemos; Oliveira, Luis Fernando de; Barroso, Regina Cely; Pinto, Nivia Graciele Villela; Cardoso, Simone Coutinho; Moreira, Silvana

    2009-01-01

    The technique of X-ray fluorescence (XRF) using SR microbeams is a powerful analysis tool for studying the elemental composition of a variety of samples. One application of this technique is the analysis done through the mapping of chemical elements, forming a matrix of data. The aim of this work is the presentation of the program MapXRF, in-house software designed to optimize the processing and mapping of fluorescence intensity data. This program uses spectra generated by QXAS as input data and separates the intensities of each chemical element found in the fluorescence spectra into individual files. From these files, the program generates intensity maps that can be visualized in any image-processing program. The proposed software was tested using fluorescence data obtained at the XRF beamline of the Synchrotron Light National Laboratory (LNLS), Brazil. Automatic 2D scans were performed and element distribution maps were obtained in the form of a matrix of data. (author)
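In outline, what MapXRF does, taking per-point fitted intensities for each chemical element and rearranging them into 2D maps that follow the scan raster, can be mimicked with a few lines of array manipulation. The sketch below works from a hypothetical table of (row, column, element intensities) produced by spectrum fitting; it is not the MapXRF code, whose input is the QXAS output format.

```python
import numpy as np

# Hypothetical fitted intensities from a 2D scan: one record per scan point,
# columns = (row index, column index, I_Fe, I_Cu, I_Zn).
ny, nx = 20, 30
rows, cols = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
rng = np.random.default_rng(7)
records = np.column_stack([
    rows.ravel(), cols.ravel(),
    rng.poisson(200, ny * nx),      # Fe counts
    rng.poisson(80, ny * nx),       # Cu counts
    rng.poisson(30, ny * nx),       # Zn counts
])

elements = ["Fe", "Cu", "Zn"]

def build_maps(records, ny, nx, elements):
    """Scatter per-point intensities into one 2D map per element."""
    maps = {}
    r = records[:, 0].astype(int)
    c = records[:, 1].astype(int)
    for k, name in enumerate(elements):
        img = np.zeros((ny, nx))
        img[r, c] = records[:, 2 + k]
        maps[name] = img
    return maps

maps = build_maps(records, ny, nx, elements)
for name, img in maps.items():
    print(f"{name}: map shape {img.shape}, mean intensity {img.mean():.1f}")
```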

  13. Development of a software for reconstruction of X-ray fluorescence intensity maps

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, Andre Pereira de; Braz, Delson; Mota, Carla Lemos, E-mail: apalmeid@gmail.co, E-mail: delson@lin.ufrj.b, E-mail: clemos@con.ufrj.b [Coordenacao dos Programas de Pos-graduacao de Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Nuclear; Oliveira, Luis Fernando de; Barroso, Regina Cely; Pinto, Nivia Graciele Villela, E-mail: cely@uerj.b, E-mail: lfolive@uerj.b, E-mail: nitatag@gmail.co [Universidade do Estado do Rio de Janeiro (UERJ), Rio de Janeiro, RJ (Brazil). Inst. de Fisica; Cardoso, Simone Coutinho, E-mail: simone@if.ufrj.b [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Inst. de Fisica; Moreira, Silvana, E-mail: silvana@fec.unicamp.b [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Faculdade de Engenharia Civil, Arquitetura e Urbanismo

    2009-07-01

    The technique of X-ray fluorescence (XRF) using SR microbeams is a powerful analysis tool for studying the elemental composition of a variety of samples. One application of this technique is the analysis done through the mapping of chemical elements, forming a matrix of data. The aim of this work is the presentation of the program MapXRF, an in-house software designed to optimize the processing and mapping of fluorescence intensity data. This program uses spectra generated by QXAS as input data and separates the intensities of each chemical element found in the fluorescence spectra into individual files. From these files, the program generates intensity maps that can be visualized in any image-processing program. The proposed software was tested using fluorescence data obtained at the XRF beamline of the Synchrotron Light National Laboratory (LNLS), Brazil. Automatic 2D scans were performed and element distribution maps were obtained in the form of a matrix of data. (author)

  14. PARAMETRIC DRAWINGS VS. AUTOLISP

    OpenAIRE

    PRUNĂ Liviu; SLONOVSCHI Andrei

    2015-01-01

    In this paper the authors make a critical analysis of the advantages offered by parametric drawing in comparison with the AutoLISP programs used for parametric design. Studying and analysing these two working models, the authors have reached ideas and conclusions which should be considered when deciding whether to develop software using the AutoLISP language or to establish the base rules that must be followed...

  15. The Perron-Frobenius theorem for multi-homogeneous mappings

    OpenAIRE

    Gautier, Antoine; Tudisco, Francesco; Hein, Matthias

    2018-01-01

    The Perron-Frobenius theory for nonnegative matrices has been generalized to order-preserving homogeneous mappings on a cone and more recently to nonnegative multilinear forms. We unify both approaches by introducing the concept of order-preserving multi-homogeneous mappings, their associated nonlinear spectral problems and spectral radii. We show several Perron-Frobenius type results for these mappings addressing existence, uniqueness and maximality of nonnegative and positive eigenpairs. We...
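For order-preserving homogeneous maps on the nonnegative cone, a positive eigenvector and the associated eigenvalue can often be approximated by a normalized fixed-point (power) iteration, which is the computational counterpart of Perron-Frobenius type results. The snippet below runs that iteration for a simple nonlinear example built from a positive matrix and an entrywise power mean; it is a generic illustration under those assumptions, not an algorithm taken from the paper.

```python
import numpy as np

def power_iteration(F, x0, iters=200, tol=1e-12):
    """Normalized fixed-point iteration x <- F(x)/||F(x)|| for an
    order-preserving, 1-homogeneous map F on the nonnegative cone."""
    x = x0 / np.linalg.norm(x0, 1)
    lam = 0.0
    for _ in range(iters):
        y = F(x)
        lam_new = np.linalg.norm(y, 1)        # approximates the eigenvalue
        x_new = y / lam_new
        if abs(lam_new - lam) < tol and np.linalg.norm(x_new - x, 1) < tol:
            return lam_new, x_new
        x, lam = x_new, lam_new
    return lam, x

# Example: an order-preserving, degree-1 homogeneous map built from a
# positive matrix and an entrywise power mean (p = 2).
A = np.array([[0.2, 1.0, 0.5],
              [0.7, 0.1, 0.9],
              [0.4, 0.6, 0.3]])
p = 2.0
F = lambda x: (A @ (x ** p)) ** (1.0 / p)

lam, x = power_iteration(F, np.ones(3))
print("approximate eigenvalue:", round(lam, 6))
print("positive eigenvector (l1-normalized):", np.round(x, 4))
```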

  16. A radial map of multi-whisker correlation selectivity in the rat barrel cortex.

    Science.gov (United States)

    Estebanez, Luc; Bertherat, Julien; Shulz, Daniel E; Bourdieu, Laurent; Léger, Jean-François

    2016-11-21

    In the barrel cortex, several features of single-whisker stimuli are organized in functional maps. The barrel cortex also encodes spatio-temporal correlation patterns of multi-whisker inputs, but so far the cortical mapping of neurons tuned to such input statistics is unknown. Here we report that layer 2/3 of the rat barrel cortex contains an additional functional map based on neuronal tuning to correlated versus uncorrelated multi-whisker stimuli: neuron responses to uncorrelated multi-whisker stimulation are strongest above barrel centres, whereas neuron responses to correlated and anti-correlated multi-whisker stimulation peak above the barrel-septal borders, forming rings of multi-whisker synchrony-preferring cells.

  17. Coherence properties of spontaneous parametric down-conversion pumped by a multi-mode cw diode laser.

    Science.gov (United States)

    Kwon, Osung; Ra, Young-Sik; Kim, Yoon-Ho

    2009-07-20

    Coherence properties of the photon pair generated via spontaneous parametric down-conversion pumped by a multi-mode cw diode laser are studied with a Mach-Zehnder interferometer. Each photon of the pair enters a different input port of the interferometer and the biphoton coherence properties are studied with a two-photon detector placed at one output port. When the photon pair simultaneously enters the interferometer, periodic recurrence of the biphoton de Broglie wave packet is observed, closely resembling the coherence properties of the pump diode laser. With non-zero delays between the photons at the input ports, biphoton interference exhibits the same periodic recurrence but the wave packet shapes are shown to be dependent on both the input delay as well as the interferometer delay. These properties could be useful for building engineered entangled photon sources based on diode laser-pumped spontaneous parametric down-conversion.

  18. An Anomaly Detector Based on Multi-aperture Mapping for Hyperspectral Data

    Directory of Open Access Journals (Sweden)

    LI Min

    2016-10-01

    Full Text Available Because the spectral content of anomalies is correlated with that of the clutter background, inaccurate selection of background pixels induces estimation errors in the background model. In order to solve this problem, a multi-aperture mapping based anomaly detector is proposed in this paper. Firstly, unlike a background model, which focuses on feature extraction of the background, the multi-aperture mapping of hyperspectral data characterizes the whole hyperspectral data set. Based on the constructed basis set of the multi-aperture mapping, an anomaly salience index is proposed for every test pixel to measure its relative statistical difference. Secondly, in order to analyse moderately salient anomalies precisely, a membership value based on fuzzy logic theory is constructed to grade the anomaly salience of test pixels continuously. At the same time, a weighted iterative estimation of the multi-aperture mapping converges adaptively with the membership value as weight. Thirdly, classical defuzzification is used to fuse the different detection results. Hyperspectral data were used in the experiments, and the robustness of the proposed detector and its sensitivity to anomalies with lower salience were tested.

  19. Inferring Parametric Energy Consumption Functions at Different Software Levels

    DEFF Research Database (Denmark)

    Liqat, Umer; Georgiou, Kyriakos; Kerrison, Steve

    2016-01-01

    The static estimation of the energy consumed by program executions is an important challenge, which has applications in program optimization and verification, and is instrumental in energy-aware software development. Our objective is to estimate such energy consumption in the form of functions...... on the input data sizes of programs. We have developed a tool for experimentation with static analysis which infers such energy functions at two levels, the instruction set architecture (ISA) and the intermediate code (LLVM IR) levels, and reflects it upwards to the higher source code level. This required...... the development of a translation from LLVM IR to an intermediate representation and its integration with existing components, a translation from ISA to the same representation, a resource analyzer, an ISA-level energy model, and a mapping from this model to LLVM IR. The approach has been applied to programs...

  20. Modelling Multi Hazard Mapping in Semarang City Using GIS-Fuzzy Method

    Science.gov (United States)

    Nugraha, A. L.; Awaluddin, M.; Sasmito, B.

    2018-02-01

    One important aspect of disaster mitigation planning is hazard mapping. Hazard mapping can provide spatial information on the distribution of locations that are threatened by disaster. Semarang City, the capital of Central Java Province, is one of the cities with high natural disaster intensity. Frequent natural disasters in Semarang City include tidal floods, floods, landslides, and droughts. Therefore, Semarang City needs the spatial information provided by multi-hazard mapping to support its disaster mitigation planning. The multi-hazard map model can be derived from parameters such as slope, rainfall, land use, and soil type. The modelling is done using a GIS method with scoring and overlay techniques. However, the accuracy of the modelling is better if the GIS method is combined with fuzzy logic techniques to provide a good classification when determining disaster threats. The fuzzy-GIS method builds a multi-hazard map of Semarang City that delivers results with good accuracy and an appropriate spread of threat classes, providing disaster information for the city's disaster mitigation planning. From the multi-hazard modelling using GIS-Fuzzy, the membership type with good accuracy was found to be the Gaussian membership, with the smallest RMSE (0.404) and the largest VAF value (72.909%) among the tested membership functions.
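The GIS-Fuzzy workflow sketched above, converting each hazard parameter raster into a fuzzy membership grade (here with a Gaussian membership function, as favoured in the abstract) and then combining the grades in a weighted overlay, can be illustrated in a few lines of array code. The layers, midpoints, spreads and weights below are made-up placeholders, not values from the Semarang study.

```python
import numpy as np

rng = np.random.default_rng(8)
shape = (100, 100)

# Made-up hazard parameter rasters (already co-registered): slope (degrees),
# annual rainfall (mm) and elevation (m, low areas prone to tidal flooding).
slope = rng.uniform(0, 40, shape)
rainfall = rng.uniform(1500, 3500, shape)
elevation = rng.uniform(0, 200, shape)

def gauss_membership(x, c, sigma):
    """Gaussian fuzzy membership: 1 at the most hazardous value c, decaying with sigma."""
    return np.exp(-0.5 * ((x - c) / sigma) ** 2)

# Membership grades per parameter (placeholder midpoints/spreads).
m_slope = gauss_membership(slope, c=40.0, sigma=15.0)        # steep slopes -> landslides
m_rain = gauss_membership(rainfall, c=3500.0, sigma=800.0)   # heavy rain -> floods
m_elev = gauss_membership(elevation, c=0.0, sigma=30.0)      # low land -> tidal floods

# Weighted overlay of the fuzzy grades into a single multi-hazard score.
weights = {"slope": 0.4, "rain": 0.35, "elev": 0.25}
hazard = (weights["slope"] * m_slope
          + weights["rain"] * m_rain
          + weights["elev"] * m_elev)

# Classify into threat classes for mapping.
classes = np.digitize(hazard, bins=[0.2, 0.4, 0.6, 0.8])     # 0 = low ... 4 = very high
print("pixels per threat class:", np.bincount(classes.ravel(), minlength=5))
```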

  1. An object-oriented multi-threaded software beamformation toolbox

    DEFF Research Database (Denmark)

    Hansen, Jens Munk; Hemmsen, Martin Christian; Jensen, Jørgen Arendt

    2011-01-01

    Focusing and apodization are an essential part of signal processing in ultrasound imaging. Although the fundamental principles are simple, the dramatic increase in computational power of CPUs, GPUs, and FPGAs motivates the development of software-based beamformers, which further improves image...... new beam formation strategies. It is a general 3D implementation capable of handling a multitude of focusing methods, interpolation schemes, and parametric and dynamic apodization. Despite being flexible, it is capable of exploiting parallelization on a single computer, on a cluster, or on both.... On a single computer, it mimics the parallelization in a scanner containing multiple beamformers. The focusing is determined using the positions of the transducer elements, presence of virtual sources, and the focus points. For interpolation, a number of interpolation schemes can be chosen, e.g. linear, polynomial...
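Software beamformation of the kind described reduces, at its core, to delay-and-sum: for each focus point, compute the propagation delays from the transducer elements, interpolate each element's RF signal at the delayed time, apodize and sum. The snippet below is a deliberately minimal single-point, linear-interpolation sketch with synthetic data (a plane-wave transmit and a linear array are assumed); it stands in for the toolbox's far richer handling of apodization, virtual sources and parallelization.

```python
import numpy as np

c = 1540.0                     # speed of sound in tissue [m/s]
fs = 40e6                      # RF sampling frequency [Hz]

# A 64-element linear array with 0.3 mm pitch, centred at x = 0, z = 0.
n_elem = 64
pitch = 0.3e-3
elem_x = (np.arange(n_elem) - (n_elem - 1) / 2) * pitch

# Synthetic RF data: an echo from a scatterer at (0, 30 mm), assuming plane-wave
# transmit (one-way delay z/c) plus the point-to-element return path.
scatterer = np.array([0.0, 30e-3])
n_samples = 4000
t = np.arange(n_samples) / fs
rf = np.zeros((n_elem, n_samples))
for i, x in enumerate(elem_x):
    tau = scatterer[1] / c + np.hypot(x - scatterer[0], scatterer[1]) / c
    rf[i] = np.exp(-0.5 * ((t - tau) / 50e-9) ** 2)     # Gaussian echo pulse

def delay_and_sum(rf, focus, elem_x, fs, c):
    """Dynamic-receive delay-and-sum for one focus point with Hanning apodization
    and linear interpolation of each channel."""
    apod = np.hanning(len(elem_x))
    t_axis = np.arange(rf.shape[1]) / fs
    value = 0.0
    for i, x in enumerate(elem_x):
        tau = focus[1] / c + np.hypot(x - focus[0], focus[1]) / c
        value += apod[i] * np.interp(tau, t_axis, rf[i])
    return value

on_target = delay_and_sum(rf, scatterer, elem_x, fs, c)
off_target = delay_and_sum(rf, scatterer + np.array([3e-3, 0.0]), elem_x, fs, c)
print(f"beamformed amplitude on target:  {on_target:.2f}")
print(f"beamformed amplitude off target: {off_target:.2f}")
```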

  2. A parametric investigation of hydrogen hcci combustion using a multi-zone model approach

    International Nuclear Information System (INIS)

    Komninos, N.P.; Hountalas, D.T.; Rakopoulos, C.D.

    2007-01-01

    The purpose of the present study is to examine the effect of various operating variables of a homogeneous charge compression ignition (HCCI) engine fueled with hydrogen, using a multi-zone model developed by the authors. The multi-zone model consists of zones, which are allotted spatial locations within the combustion chamber. The model takes into account heat transfer between the zones and the combustion chamber walls, providing a spatial temperature distribution during the closed part of the engine cycle, i.e. compression, combustion and expansion. Mass transfer between zones is also accounted for, based on the geometric configuration of the zones, and includes the flow of mass in and out of the crevice regions, represented by the crevice zone. Combustion is incorporated using chemical kinetics based on a chemical reaction mechanism for the oxidation of hydrogen. This chemical reaction mechanism also includes the reactions for nitrogen oxides formation. Using the multi-zone model a parametric investigation is conducted, in order to determine the effect of engine speed, equivalence ratio, compression ratio, inlet pressure and inlet temperature, on the performance, combustion characteristics and emissions of an HCCI engine fueled with hydrogen

  3. Topological characteristics of multi-valued maps and Lipschitzian functionals

    International Nuclear Information System (INIS)

    Klimov, V S

    2008-01-01

    This paper deals with the operator inclusion 0 ∈ F(x) + N_Q(x), where F is a multi-valued map of monotonic type from a reflexive space V to its conjugate V* and N_Q is the cone normal to the closed set Q, which, generally speaking, is not convex. To estimate the number of solutions of this inclusion we introduce topological characteristics of multi-valued maps and Lipschitzian functionals that have the properties of additivity and homotopy invariance. We prove some infinite-dimensional versions of the Poincare-Hopf theorem

  4. Developing a Parametric Urban Design Tool

    DEFF Research Database (Denmark)

    Steinø, Nicolai; Obeling, Esben

    2014-01-01

    Parametric urban design is a potentially powerful tool for collaborative urban design processes. Rather than making one-off designs which need to be redesigned from the ground up in case of changes, parametric design tools make it possible to keep the design open while at the same time allowing...... for a level of detailing which is high enough to facilitate an understanding of the generic qualities of proposed designs. Starting from a brief overview of parametric design, this paper presents initial findings from the development of a parametric urban design tool with regard to developing a structural...... logic which is flexible and expandable. It then moves on to outline and discuss further development work. Finally, it offers a brief reflection on the potentials and shortcomings of the software – CityEngine – which is used for developing the parametric urban design tool....

  5. Investigation of olfactory function in normal volunteers by Tc-99m ECD Brain SPECT: Analysis using statistical parametric mapping

    International Nuclear Information System (INIS)

    Chung, Y.A.; Kim, S.H.; Park, Y.H.; Lee, S.Y.; Sohn, H.S.; Chung, S.K.

    2002-01-01

    The purpose of this study was to investigate olfactory function according to the Tc-99m ECD uptake pattern in brain perfusion SPET of normal volunteers by means of statistical parametric mapping (SPM) analysis. The study population was 8 healthy volunteer subjects (M:F = 6:2, age range: 22-54 years, mean 34 years). We performed baseline brain perfusion SPET using 555 MBq of Tc-99m ECD in a silent dark room. Two hours later, we obtained brain perfusion SPET using 1110 MBq of Tc-99m ECD after administration of a 3% butanol solution under the same conditions. All SPET images were spatially transformed to standard space, smoothed and globally normalized. The differences between the baseline and odor-identification SPET images were statistically analyzed using SPM-99 software. The difference between the two sets of brain perfusion SPET was considered significant at a threshold of uncorrected p values less than 0.01. SPM analysis revealed significant hyper-perfusion in both cingulate gyri, the right middle temporal gyrus, right superior and inferior frontal gyri, right lingual gyrus and right fusiform gyrus on odor-identification SPET. This study shows that brain perfusion SPET can securely support other diagnostic techniques in the evaluation of olfactory function

  6. Development of a reconstruction software of elemental maps by micro X-ray fluorescence

    International Nuclear Information System (INIS)

    Almeida, Andre Pereira de; Braz, Delson; Mota, Carla Lemos; Oliveira, Luis Fernando de; Barroso, Regina Cely; Pinto, Nivia Graciele Villela; Cardoso, Simone Coutinho; Moreira, Silvana

    2009-01-01

    The technique of X-ray fluorescence (XRF) using SR microbeams is a powerful analysis tool for studying elemental composition in many kinds of samples. One application of this technique is analysis through the mapping of chemical elements, forming a matrix of data. The aim of this work is to present the program MapXRF, an in-house software package designed to optimize the processing and mapping of fluorescence intensity data. The program uses spectra generated by QXAS as input data and separates the intensities of each chemical element found in the fluorescence spectra into its own file. From these files, the program generates intensity maps that can be visualized in any image-processing program. The proposed software was tested using fluorescence data obtained at the XRF beamline of the National Synchrotron Light Laboratory (LNLS), Brazil. Automatic 2D scans were performed and element distribution maps were obtained in the form of a matrix of data. (author)
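
    MapXRF itself is not reproduced in this record; the following is a minimal sketch of the mapping step it describes, assuming one flat list of per-point intensities per element in row-major scan order (the file name, scan dimensions and synthetic data are hypothetical).

```python
import numpy as np
import matplotlib.pyplot as plt

def intensities_to_map(intensities, n_rows, n_cols):
    """Reshape a flat list of per-point fluorescence intensities into a 2D element map."""
    intensities = np.asarray(intensities)
    assert intensities.size == n_rows * n_cols, "scan grid does not match the number of points"
    return intensities.reshape(n_rows, n_cols)     # row-major scan order assumed

# In MapXRF the intensities would come from the per-element files extracted from the QXAS spectra;
# here a synthetic array stands in for one such file (e.g. np.loadtxt("Fe_intensities.txt") in practice).
fe_intensities = np.random.poisson(lam=200, size=50 * 50)
fe_map = intensities_to_map(fe_intensities, n_rows=50, n_cols=50)
plt.imsave("Fe_map.png", fe_map, cmap="viridis")
```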

  7. Development of a reconstruction software of elemental maps by micro X-ray fluorescence

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, Andre Pereira de; Braz, Delson; Mota, Carla Lemos, E-mail: apalmeid@gmail.co, E-mail: delson@lin.ufrj.b, E-mail: clemos@con.ufrj.b [Universidade Federal do Rio de Janeiro (PEN/COPPE/UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-Graduacao de Engenharia. Programa de Energia Nuclear; Oliveira, Luis Fernando de; Barroso, Regina Cely; Pinto, Nivia Graciele Villela, E-mail: cely@uerj.b, E-mail: lfolive@uerj.b, E-mail: nitatag@gmail.co [Universidade do Estado do Rio de Janeiro (IF/UERJ), RJ (Brazil). Inst. de Fisica; Cardoso, Simone Coutinho [Universidade Federal do Rio de Janeiro (IF/UFRJ), RJ (Brazil). Inst. de Fisica; Moreira, Silvana [Universidade Estadual de Campinas (FEC/UNICAMP), SP (Brazil) Faculdade de Engenharia Civil, Arquitetura e Urbanismo

    2009-07-01

    The technique of X-ray fluorescence (XRF) using SR microbeams is a powerful analysis tool for studying elemental composition in many kinds of samples. One application of this technique is analysis through the mapping of chemical elements, forming a matrix of data. The aim of this work is to present the program MapXRF, an in-house software package designed to optimize the processing and mapping of fluorescence intensity data. The program uses spectra generated by QXAS as input data and separates the intensities of each chemical element found in the fluorescence spectra into its own file. From these files, the program generates intensity maps that can be visualized in any image-processing program. The proposed software was tested using fluorescence data obtained at the XRF beamline of the National Synchrotron Light Laboratory (LNLS), Brazil. Automatic 2D scans were performed and element distribution maps were obtained in the form of a matrix of data. (author)

  8. Ray calibration and phase mapping for structured-light-field 3D reconstruction.

    Science.gov (United States)

    Cai, Zewei; Liu, Xiaoli; Peng, Xiang; Gao, Bruce Z

    2018-03-19

    In previous work, we presented a structured light field (SLF) method combining light field imaging with structured illumination to perform multi-view depth measurement. However, that work accomplishes only depth measurement rather than full 3D reconstruction. In this paper, we propose a novel active method, involving ray calibration and phase mapping, to achieve SLF 3D reconstruction. We performed the ray calibration for the first time to determine each light field ray with metric spatio-angular parameters, enabling the SLF to realize multi-view 3D reconstruction. Based on the ray parametric equation, we further derived the phase mapping in the SLF, by which spatial coordinates can be directly mapped from phase. A flexible calibration strategy was correspondingly designed to determine the mapping coefficients for each light field ray, achieving high-efficiency SLF 3D reconstruction. Experimental results demonstrated that the proposed method is suitable for high-efficiency multi-view 3D reconstruction in the SLF.
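
    The exact mapping function is defined in the paper itself; purely as an illustration of the idea (a calibrated parametric ray plus a per-ray phase-to-depth mapping), here is a sketch that assumes a polynomial phase-to-depth model, which may well differ from the authors' actual formulation, and uses invented calibration values.

```python
import numpy as np

def reconstruct_point(origin, direction, phase, coeffs):
    """Map a measured phase to a 3D point along a calibrated light-field ray.

    origin, direction : calibrated ray parameters (direction assumed unit-length)
    phase             : unwrapped fringe phase observed along this ray
    coeffs            : per-ray polynomial coefficients of an assumed phase-to-depth model
    """
    depth = np.polyval(coeffs, phase)          # t(phi), fitted during calibration (assumption)
    return origin + depth * direction          # X = o + t(phi) * d

# Example with hypothetical calibration values for one ray.
o = np.array([0.0, 0.0, 0.0])
d = np.array([0.0, 0.0, 1.0])
point = reconstruct_point(o, d, phase=12.56, coeffs=[0.0, 0.02, 5.0])
```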

  9. Different uptake of 99mTc-ECD and 99mTc-HMPAO in the same brains: analysis by statistical parametric mapping.

    Science.gov (United States)

    Hyun, Y; Lee, J S; Rha, J H; Lee, I K; Ha, C K; Lee, D S

    2001-02-01

    The purpose of this study was to investigate the differences between technetium-99m ethyl cysteinate dimer (99mTc-ECD) and technetium-99m hexamethylpropylene amine oxime (99mTc-HMPAO) uptake in the same brains by means of statistical parametric mapping (SPM) analysis. We examined 20 patients (9 male, 11 female, mean age 62+/-12 years) using 99mTc-ECD and 99mTc-HMPAO single-photon emission tomography (SPET) and magnetic resonance imaging (MRI) of the brain less than 7 days after onset of stroke. MRI showed no cortical infarctions. Infarctions in the pons (6 patients) and medulla (1), ischaemic periventricular white matter lesions (13) and lacunar infarction (7) were found on MRI. Split-dose and sequential SPET techniques were used for 99mTc-ECD and 99mTc-HMPAO brain SPET, without repositioning of the patient. All of the SPET images were spatially transformed to standard space, smoothed and globally normalized. The differences between the 99mTc-ECD and 99mTc-HMPAO SPET images were statistically analysed using statistical parametric mapping (SPM) 96 software. The difference between two groups was considered significant at a threshold of uncorrected P values less than 0.01. Visual analysis showed no hypoperfused areas on either 99mTc-ECD or 99mTc-HMPAO SPET images. SPM analysis revealed significantly different uptake of 99mTc-ECD and 99mTc-HMPAO in the same brains. On the 99mTc-ECD SPET images, relatively higher uptake was observed in the frontal, parietal and occipital lobes, in the left superior temporal lobe and in the superior region of the cerebellum. On the 99mTc-HMPAO SPET images, relatively higher uptake was observed in the medial temporal lobes, thalami, periventricular white matter and brain stem. These differences in uptake of the two tracers in the same brains on SPM analysis suggest that interpretation of cerebral perfusion is possible using SPET with 99mTc-ECD and 99mTc-HMPAO.

  10. MOST-visualization: software for producing automated textbook-style maps of genome-scale metabolic networks.

    Science.gov (United States)

    Kelley, James J; Maor, Shay; Kim, Min Kyung; Lane, Anatoliy; Lun, Desmond S

    2017-08-15

    Visualization of metabolites, reactions and pathways in genome-scale metabolic networks (GEMs) can assist in understanding cellular metabolism. Three attributes are desirable in software used for visualizing GEMs: (i) automation, since GEMs can be quite large; (ii) production of understandable maps that provide ease in identification of pathways, reactions and metabolites; and (iii) visualization of the entire network to show how pathways are interconnected. No software currently exists for visualizing GEMs that satisfies all three characteristics, but MOST-Visualization, an extension of the software package MOST (Metabolic Optimization and Simulation Tool), satisfies (i), and by using a pre-drawn overview map of metabolism based on the Roche map satisfies (ii) and comes close to satisfying (iii). MOST is distributed for free under the GNU General Public License. The software and full documentation are available at http://most.ccib.rutgers.edu/. dslun@rutgers.edu. Supplementary data are available at Bioinformatics online.

  11. Multi parametric card to personal computers interface based in ispLSI1016 circuits

    International Nuclear Information System (INIS)

    Osorio Deliz, J.F.; Toledo Acosta, R.B.; Arista Romeu, E.

    1997-01-01

    The design and principal characteristics of the interface circuit for a 16-bit multi parametric add-on card for IBM or compatible microcomputers are described. The card contains two bidirectional direct-memory-access communication channels between the card and the computer, an interrupt controller, a programmable address register, a default address register for the card, and a four-channel multiplexer, as well as the decoder logic for the 80C186 and the computer. The circuit was designed with two ispLSI1016 programmable logic devices, which made it possible to drastically reduce the number of components used and to obtain a more flexible design with better characteristics in less time

  12. MATLAB simulation software used for the PhD thesis "Acquisition of Multi-Band Signals via Compressed Sensing

    DEFF Research Database (Denmark)

    2014-01-01

    MATLAB simulation software used for the PhD thesis "Acquisition of Multi-Band Signals via Compressed Sensing".

  13. Design, Development and Testing of Web Services for Multi-Sensor Snow Cover Mapping

    Science.gov (United States)

    Kadlec, Jiri

    This dissertation presents the design, development and validation of new data integration methods for mapping the extent of snow cover based on open access ground station measurements, remote sensing images, volunteer observer snow reports, and cross country ski track recordings from location-enabled mobile devices. The first step of the data integration procedure includes data discovery, data retrieval, and data quality control of snow observations at ground stations. The WaterML R package developed in this work enables hydrologists to retrieve and analyze data from multiple organizations that are listed in the Consortium of Universities for the Advancement of Hydrologic Sciences Inc (CUAHSI) Water Data Center catalog directly within the R statistical software environment. Use of the WaterML R package is demonstrated by running an energy balance snowpack model in R with data inputs from CUAHSI, and by automating uploads of real-time sensor observations to CUAHSI HydroServer. The second step of the procedure requires efficient access to multi-temporal remote sensing snow images. The Snow Inspector web application developed in this research enables users to retrieve a time series of fractional snow cover from the Moderate Resolution Imaging Spectroradiometer (MODIS) for any point on Earth. The time series retrieval method is based on automated data extraction from tile images provided by a Web Map Tile Service (WMTS). The average required time for retrieving 100 days of data using this technique is 5.4 seconds, which is significantly faster than other methods that require the download of large satellite image files. The presented data extraction technique and space-time visualization user interface can be used as a model for working with other multi-temporal hydrologic or climate data WMTS services. The third, final step of the data integration procedure is generating continuous daily snow cover maps. A custom inverse distance weighting method has been developed
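
    The Snow Inspector's exact implementation is not shown here; as an illustration of extracting a point value from WMTS tile imagery, the sketch below uses the standard Web Mercator tile-indexing formulas to find the tile and pixel for a latitude/longitude at a given zoom level (the 256-pixel tile size and the example coordinates are assumptions).

```python
import math

def latlon_to_tile_pixel(lat, lon, zoom, tile_size=256):
    """Return (tile_x, tile_y, pixel_x, pixel_y) for a WGS84 point in a Web Mercator tile grid."""
    n = 2 ** zoom
    x = (lon + 180.0) / 360.0 * n
    lat_rad = math.radians(lat)
    y = (1.0 - math.log(math.tan(lat_rad) + 1.0 / math.cos(lat_rad)) / math.pi) / 2.0 * n
    tile_x, tile_y = int(x), int(y)
    pixel_x = int((x - tile_x) * tile_size)
    pixel_y = int((y - tile_y) * tile_size)
    return tile_x, tile_y, pixel_x, pixel_y

# Example: locate the tile pixel for a point of interest at zoom level 8.
tx, ty, px, py = latlon_to_tile_pixel(lat=43.6, lon=-110.7, zoom=8)
# The tile image would then be fetched from the WMTS endpoint (URL template assumed) and the
# pixel value at (px, py) read for each date to build the fractional-snow-cover time series.
```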

  14. A software tool for rapid flood inundation mapping

    Science.gov (United States)

    Verdin, James; Verdin, Kristine; Mathis, Melissa L.; Magadzire, Tamuka; Kabuchanga, Eric; Woodbury, Mark; Gadain, Hussein

    2016-06-02

    The GIS Flood Tool (GFT) was developed by the U.S. Geological Survey with support from the U.S. Agency for International Development’s Office of U.S. Foreign Disaster Assistance to provide a means for production of reconnaissance-level flood inundation mapping for data-sparse and resource-limited areas of the world. The GFT has also attracted interest as a tool for rapid assessment flood inundation mapping for the Flood Inundation Mapping Program of the U.S. Geological Survey. The GFT can fill an important gap for communities that lack flood inundation mapping by providing a first-estimate of inundation zones, pending availability of resources to complete an engineering study. The tool can also help identify priority areas for application of scarce flood inundation mapping resources. The technical basis of the GFT is an application of the Manning equation for steady flow in an open channel, operating on specially processed digital elevation data. The GFT is implemented as a software extension in ArcGIS. Output maps from the GFT were validated at 11 sites with inundation maps produced previously by the Flood Inundation Mapping Program using standard one-dimensional hydraulic modeling techniques. In 80 percent of the cases, the GFT inundation patterns matched 75 percent or more of the one-dimensional hydraulic model inundation patterns. Lower rates of pattern agreement were seen at sites with low relief and subtle surface water divides. Although the GFT is simple to use, it should be applied with the oversight or review of a qualified hydraulic engineer who understands the simplifying assumptions of the approach.
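
    The GFT's internal computations are not reproduced here; as a reminder of the Manning equation it applies, a minimal sketch (SI units, assuming a wide rectangular channel so that the hydraulic radius is approximated by the depth) is shown below.

```python
def manning_discharge(depth, width, slope, n_roughness):
    """Steady uniform flow discharge Q = (1/n) * A * R^(2/3) * S^(1/2), SI units.

    Assumes a wide rectangular channel, so the hydraulic radius R is approximated by the depth.
    """
    area = depth * width                 # flow cross-section [m^2]
    hydraulic_radius = depth             # wide-channel approximation [m]
    return (1.0 / n_roughness) * area * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5

# Example: invert for the depth that carries a target flood discharge by simple bisection.
def depth_for_discharge(q_target, width, slope, n_roughness, lo=0.0, hi=20.0):
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if manning_discharge(mid, width, slope, n_roughness) < q_target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

flood_depth = depth_for_discharge(q_target=500.0, width=80.0, slope=0.001, n_roughness=0.035)
```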

  15. Design and Implement a MapReduce Framework for Executing Standalone Software Packages in Hadoop-based Distributed Environments

    Directory of Open Access Journals (Sweden)

    Chao-Chun Chen

    2013-12-01

    Full Text Available Hadoop MapReduce is a programming model for designing automatically scalable distributed computing applications. It provides developers with an effective environment for attaining automatic parallelization. However, most existing manufacturing systems are arduous and restrictive to migrate to a MapReduce private cloud, due to platform incompatibility and the tremendous complexity of system reconstruction. To increase the efficiency of manufacturing systems with minimum modification of the existing systems, we design a framework in this thesis, called MC-Framework: Multi-uses-based Cloudizing-Application Framework. It provides a simple interface for users to fairly execute requested tasks that work with traditional standalone software packages in MapReduce-based private cloud environments. Moreover, this thesis focuses on multi-user workloads, where the default Hadoop scheduling scheme, i.e., FIFO, would increase delay. Hence, we also propose a new scheduling mechanism, called Job-Sharing Scheduling, to explore and fairly share the jobs among machines in the MapReduce-based private cloud. We then prototype an experimental virtual-metrology module of a manufacturing system as a case study to verify and analyse the proposed MC-Framework. The results of our experiments indicate that the proposed framework considerably improves time performance compared with the original package.
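
    MC-Framework itself is not available in this record; to illustrate the general idea of running a standalone package inside a MapReduce task, here is a minimal Hadoop Streaming-style mapper sketch that shells out to a hypothetical command-line tool (the tool name `legacy_tool` and its flags are invented for the example and are not part of the framework).

```python
#!/usr/bin/env python3
"""Hadoop Streaming mapper sketch: each input line names a work item for a standalone package."""
import subprocess
import sys

for line in sys.stdin:
    work_item = line.strip()
    if not work_item:
        continue
    # Run the legacy standalone package on this work item (tool name and flags are hypothetical).
    result = subprocess.run(
        ["legacy_tool", "--input", work_item],
        capture_output=True, text=True, check=False,
    )
    # Emit key<TAB>value pairs for the reducer: the work item and the tool's summary output.
    print(f"{work_item}\t{result.stdout.strip()}")
```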

  16. Brain SPECT analysis using statistical parametric mapping in patients with posttraumatic stress disorder

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Euy Neyng; Sohn, Hyung Sun; Kim, Sung Hoon; Chung, Soo Kyo; Yang, Dong Won [College of Medicine, The Catholic Univ. of Korea, Seoul (Korea, Republic of)

    2001-07-01

    This study investigated alterations in regional cerebral blood flow (rCBF) in patients with posttraumatic stress disorder (PTSD) using statistical parametric mapping (SPM99). Noninvasive rCBF measurements using 99mTc-ethyl cysteinate dimer (ECD) SPECT were performed on 23 patients with PTSD and 21 age-matched normal controls, without re-exposure to accident-related stimuli. The relative rCBF maps of patients with PTSD and controls were compared. In patients with PTSD, significantly increased rCBF was found along the limbic system of the brain. There were a few foci of decreased rCBF in the superior frontal gyrus and the parietal and temporal regions. PTSD is associated with increased rCBF in limbic areas compared with age-matched normal controls. These findings implicate regions of the limbic brain, which may mediate the response to aversive stimuli in healthy individuals, as playing an important role in patients suffering from PTSD, and suggest ongoing hyperfunction of an 'overlearned survival response' or flashback response in these regions after painful, life-threatening, or horrifying events, even without re-exposure to the traumatic stimulus.

  17. Brain SPECT analysis using statistical parametric mapping in patients with posttraumatic stress disorder

    International Nuclear Information System (INIS)

    Kim, Euy Neyng; Sohn, Hyung Sun; Kim, Sung Hoon; Chung, Soo Kyo; Yang, Dong Won

    2001-01-01

    This study investigated alterations in regional cerebral blood flow (rCBF) in patients with posttraumatic stress disorder (PTSD) using statistical parametric mapping (SPM99). Noninvasive rCBF measurements using 99mTc-ethyl cysteinate dimer (ECD) SPECT were performed on 23 patients with PTSD and 21 age-matched normal controls, without re-exposure to accident-related stimuli. The relative rCBF maps of patients with PTSD and controls were compared. In patients with PTSD, significantly increased rCBF was found along the limbic system of the brain. There were a few foci of decreased rCBF in the superior frontal gyrus and the parietal and temporal regions. PTSD is associated with increased rCBF in limbic areas compared with age-matched normal controls. These findings implicate regions of the limbic brain, which may mediate the response to aversive stimuli in healthy individuals, as playing an important role in patients suffering from PTSD, and suggest ongoing hyperfunction of an 'overlearned survival response' or flashback response in these regions after painful, life-threatening, or horrifying events, even without re-exposure to the traumatic stimulus

  18. Quantitative multi-parameter mapping of R1, PD*, MT and R2* at 3T: a multi-center validation

    Directory of Open Access Journals (Sweden)

    Nikolaus eWeiskopf

    2013-06-01

    Full Text Available Multi-center studies using magnetic resonance imaging facilitate studying small effect sizes, global population variance and rare diseases. The reliability and sensitivity of these multi-center studies crucially depend on the comparability of the data generated at different sites and time points. The level of inter-site comparability is still controversial for conventional anatomical T1-weighted MRI data. Quantitative multi-parameter mapping (MPM) was designed to provide MR parameter measures that are comparable across sites and time points, i.e., 1 mm high-resolution maps of the longitudinal relaxation rate (R1=1/T1), effective proton density (PD*), magnetization transfer saturation (MT) and effective transverse relaxation rate (R2*=1/T2*). MPM was validated at 3T for use in multi-center studies by scanning five volunteers at three different sites. We determined the inter-site bias, and the inter-site and intra-site coefficient of variation (CoV), for typical morphometric measures (i.e., gray matter probability maps used in voxel-based morphometry) and the four quantitative parameters. The inter-site bias and CoV were smaller than 3.1% and 8%, respectively, except for the inter-site CoV of R2* (< 20%). The gray matter probability maps based on the MT parameter maps had a 14% higher inter-site reproducibility than maps based on conventional T1-weighted images. The low inter-site bias and variance in the parameters and derived gray matter probability maps confirm the high comparability of the quantitative maps across sites and time points. The reliability, short acquisition time, high resolution and the detailed insights into the brain microstructure provided by MPM make it an efficient tool for multi-center imaging studies.
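
    The study's exact statistical pipeline is not reproduced here; a minimal sketch of how inter-site bias and coefficient of variation could be computed from per-subject, per-site parameter means (the array layout and the numbers are assumed, not the study's data) is:

```python
import numpy as np

# Hypothetical mean R1 values [1/s]: rows = 5 subjects, columns = 3 sites.
r1 = np.array([[0.62, 0.63, 0.61],
               [0.65, 0.66, 0.64],
               [0.60, 0.61, 0.60],
               [0.64, 0.63, 0.65],
               [0.63, 0.64, 0.62]])

grand_mean = r1.mean()
site_mean = r1.mean(axis=0)

# Inter-site bias: deviation of each site's mean from the grand mean, in percent.
inter_site_bias = 100.0 * (site_mean - grand_mean) / grand_mean

# Inter-site CoV: variability across sites within each subject, averaged over subjects.
inter_site_cov = 100.0 * (r1.std(axis=1, ddof=1) / r1.mean(axis=1)).mean()

print(inter_site_bias, inter_site_cov)
```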

  19. Sparse-grid, reduced-basis Bayesian inversion: Nonaffine-parametric nonlinear equations

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Peng, E-mail: peng@ices.utexas.edu [The Institute for Computational Engineering and Sciences, The University of Texas at Austin, 201 East 24th Street, Stop C0200, Austin, TX 78712-1229 (United States); Schwab, Christoph, E-mail: christoph.schwab@sam.math.ethz.ch [Seminar für Angewandte Mathematik, Eidgenössische Technische Hochschule, Römistrasse 101, CH-8092 Zürich (Switzerland)

    2016-07-01

    We extend the reduced basis (RB) accelerated Bayesian inversion methods for affine-parametric, linear operator equations which are considered in [16,17] to non-affine, nonlinear parametric operator equations. We generalize the analysis of sparsity of parametric forward solution maps in [20] and of Bayesian inversion in [48,49] to the fully discrete setting, including Petrov–Galerkin high-fidelity (“HiFi”) discretization of the forward maps. We develop adaptive, stochastic collocation based reduction methods for the efficient computation of reduced bases on the parametric solution manifold. The nonaffinity and nonlinearity with respect to (w.r.t.) the distributed, uncertain parameters and the unknown solution is collocated; specifically, by the so-called Empirical Interpolation Method (EIM). For the corresponding Bayesian inversion problems, computational efficiency is enhanced in two ways: first, expectations w.r.t. the posterior are computed by adaptive quadratures with dimension-independent convergence rates proposed in [49]; the present work generalizes [49] to account for the impact of the PG discretization in the forward maps on the convergence rates of the Quantities of Interest (QoI for short). Second, we propose to perform the Bayesian estimation only w.r.t. a parsimonious, RB approximation of the posterior density. Based on the approximation results in [49], the infinite-dimensional parametric, deterministic forward map and operator admit N-term RB and EIM approximations which converge at rates which depend only on the sparsity of the parametric forward map. In several numerical experiments, the proposed algorithms exhibit dimension-independent convergence rates which equal, at least, the currently known rate estimates for N-term approximation. We propose to accelerate Bayesian estimation by first offline construction of reduced basis surrogates of the Bayesian posterior density. The parsimonious surrogates can then be employed for online data

  20. A scalable hybrid multi-robot SLAM method for highly detailed maps

    NARCIS (Netherlands)

    Pfingsthorn, M.; Slamet, B.; Visser, A.

    2008-01-01

    Recent successful SLAM methods employ hybrid map representations combining the strengths of topological maps and occupancy grids. Such representations often facilitate multi-agent mapping. In this paper, a successful SLAM method is presented, which is inspired by the manifold data structure by

  1. Simulation, identification and statistical variation in cardiovascular analysis (SISCA) - A software framework for multi-compartment lumped modeling.

    Science.gov (United States)

    Huttary, Rudolf; Goubergrits, Leonid; Schütte, Christof; Bernhard, Stefan

    2017-08-01

    It has not yet been possible to obtain modeling approaches suitable for covering a wide range of real-world scenarios in cardiovascular physiology because many of the system parameters are uncertain or even unknown. Natural variability and statistical variation of cardiovascular system parameters in healthy and diseased conditions are characteristic features for understanding cardiovascular diseases in more detail. This paper presents SISCA, a novel software framework for cardiovascular system modeling, and its MATLAB implementation. The framework defines a multi-model statistical ensemble approach for dimension-reduced, multi-compartment models and focuses on statistical variation, system identification and patient-specific simulation based on clinical data. We also discuss a data-driven modeling scenario as a use case example. The dataset considered originated from routine clinical examinations and comprised typical pre- and post-surgery clinical data from a patient diagnosed with coarctation of the aorta. We conducted patient- and disease-specific pre/post-surgery modeling by adapting a validated nominal multi-compartment model with respect to structure and parametrization, using metadata and MRI geometry. In both models, the simulation reproduced measured pressures and flows fairly well with respect to stenosis and stent treatment and the pre-treatment cross-stenosis phase shift of the pulse wave. However, with the post-treatment data showing unrealistic phase shifts and other more obvious inconsistencies within the dataset, the methods and results we present suggest that conditioning and uncertainty management of routine clinical data sets need significantly more attention to obtain reasonable results in patient-specific cardiovascular modeling.
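
    SISCA's models are not included in this record; as a generic illustration of a dimension-reduced, lumped multi-compartment model of the kind described, here is a two-element Windkessel sketch, where the parameter values and the inflow waveform are assumptions made for the example.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Two-element Windkessel: C dP/dt = Q_in(t) - P/R  (pressure P, peripheral resistance R, compliance C).
R = 1.0   # mmHg*s/mL (assumed)
C = 1.5   # mL/mmHg   (assumed)

def q_in(t):
    """Assumed pulsatile inflow: half-sine systolic ejection, 60 beats per minute."""
    phase = t % 1.0
    return 400.0 * np.sin(np.pi * phase / 0.35) if phase < 0.35 else 0.0

def dpdt(t, p):
    return [(q_in(t) - p[0] / R) / C]

sol = solve_ivp(dpdt, t_span=(0.0, 10.0), y0=[80.0], max_step=0.002)
# sol.y[0] now holds the simulated arterial pressure waveform, to be compared against measurements.
```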

  2. Una introduzione ai software per il crime mapping / Observations préliminaires sur les logiciels du mappage du crime / Some introductory notes on crime mapping software

    OpenAIRE

    Ummarino Alessandro

    2013-01-01

    Abstract: Crime mapping, rather than being a discipline in its own right, is simply the application of statistical-geographical analysis techniques to the study of crime. Thanks to the use of GIS (Geographic Information System) software, the exponential development of information technology and easy access to the web, the production of quality maps is now within the reach of any average user. The possibility of applying such analysis techniques is effectively offered by commercial GIS software...

  3. Programming heterogeneous MPSoCs tool flows to close the software productivity gap

    CERN Document Server

    Castrillón Mazo, Jerónimo

    2014-01-01

    This book provides embedded software developers with techniques for programming heterogeneous Multi-Processor Systems-on-Chip (MPSoCs), capable of executing multiple applications simultaneously. It describes a set of algorithms and methodologies to narrow the software productivity gap, as well as an in-depth description of the underlying problems and challenges of today’s programming practices. The authors present four different tool flows: a parallelism extraction flow for applications written using the C programming language, a mapping and scheduling flow for parallel applications, a special mapping flow for baseband applications in the context of Software Defined Radio (SDR) and a final flow for analyzing multiple applications at design time. The tool flows are evaluated on Virtual Platforms (VPs), which mimic different characteristics of state-of-the-art heterogeneous MPSoCs.   • Provides a novel set of algorithms and methodologies for programming heterogeneous Multi-Processor Systems-on-Chip (MPSoCs)...

  4. Introduction to SNPP/VIIRS Flood Mapping Software Version 1.0

    Science.gov (United States)

    Li, S.; Sun, D.; Goldberg, M.; Sjoberg, W.; Santek, D.; Hoffman, J.

    2017-12-01

    Near real-time satellite-derived flood maps are invaluable to river forecasters and decision-makers for disaster monitoring and relief efforts. With support from the JPSS (Joint Polar Satellite System) Proving Ground and Risk Reduction (PGRR) Program, flood detection software has been developed using Suomi-NPP/VIIRS (Suomi National Polar-orbiting Partnership/Visible Infrared Imaging Radiometer Suite) imagery to automatically generate near real-time flood maps for National Weather Service (NWS) River Forecast Centers (RFC) in the USA. The software, which is called VIIRS NOAA GMU Flood Version 1.0 (hereafter referred to as VNG Flood V1.0), consists of a series of algorithms that include water detection, cloud shadow removal, terrain shadow removal, minor flood detection, water fraction retrieval, and floodwater determination. The software is designed for flood detection in any land region between 80°S and 80°N, and it has been running routinely with direct broadcast SNPP/VIIRS data at the Space Science and Engineering Center at the University of Wisconsin-Madison (UW/SSEC) and the Geographic Information Network of Alaska at the University of Alaska-Fairbanks (UAF/GINA) since 2014. Near real-time flood maps are distributed via the Unidata Local Data Manager (LDM), reviewed by river forecasters in AWIPS-II (the second generation of the Advanced Weather Interactive Processing System) and applied in flood operations. Initial feedback from operational forecasters on the product accuracy and performance has been largely positive. The software capability has also been extended to areas outside of the USA via a case-driven mode to detect major floods all over the world. Offline validation efforts include the visual inspection of over 10,000 VIIRS false-color composite images, an inter-comparison with MODIS automatic flood products and a quantitative evaluation using Landsat imagery. The steady performance from the 3-year routine process and the promising validation results

  5. PARAMETRIC DRAWINGS VS. AUTOLISP

    Directory of Open Access Journals (Sweden)

    PRUNĂ Liviu

    2015-06-01

    Full Text Available In this paper the authors present a critical analysis of the advantages offered by the use of parametric drawings in comparison with AutoLISP programs used for parametric design. By studying and analysing these two working models, the authors arrive at ideas and conclusions that should be considered when one must decide whether to develop software using the AutoLISP language or to establish the basic rules that the drawing must follow, with the aim of constructing outlines or blocks that can be used in the design process.

  6. Statistical parametric mapping for effects of verapamil on olfactory connections of rat brain in vivo using manganese-enhanced MR imaging

    International Nuclear Information System (INIS)

    Soma, Tsutomu; Kurakawa, Masami; Koto, Daichi

    2011-01-01

    We investigated the effect of verapamil on the transport of manganese in the olfactory connections of rat brains in vivo using statistical parametric mapping and manganese-enhanced magnetic resonance (MR) imaging. We divided 12 7-week-old male Sprague-Dawley rats into 2 groups of six and injected 10 μL of saline into the right nasal cavities of the first group and 10 μL of verapamil (2.5 mg/mL) into the other group. Twenty minutes after the initial injection, we injected 10 μL of MnCl2 (1 mol/L) into the right nasal cavities of both groups. We obtained serial T1-weighted MR images before administering the verapamil or saline and at 0.5, one, 24, 48, and 72 hours and 7 days after administering the MnCl2, spatially normalized the MR images on the rat brain atlas, and analyzed the data using voxel-based statistical comparison. Statistical parametric maps demonstrated the transport of manganese. Manganese ions created significant enhancement (t-score=36.6) 24 hours after MnCl2 administration in the group administered saline but not at the same time point in the group receiving verapamil. The extent of significantly enhanced regions peaked at 72 hours in both groups and on both sides of the brain. The peak extent in the right side of the brain was 70.2 mm3 in the group injected with saline and 92.4 mm3 in the group with verapamil. The extents in the left side were 64.0 mm3 for the group with saline and 53.2 mm3 for the group with verapamil. We applied statistical parametric mapping using manganese-enhanced MR imaging to demonstrate in vivo the transport of manganese in the olfactory connections of rat brains with and without verapamil and found that verapamil did affect this transport. (author)

  7. TU-H-CAMPUS-IeP3-02: Neurovascular 4D Parametric Imaging Using Co-Registration of Biplane DSA Sequences with 3D Vascular Geometry Obtained From Cone Beam CT

    Energy Technology Data Exchange (ETDEWEB)

    Balasubramoniam, A; Bednarek, D; Rudin, S; Ionita, C [Toshiba Stroke and Vascular Research Centre, SUNY at Buffalo (United States)

    2016-06-15

    Purpose: To create 4D parametric images using biplane Digital Subtraction Angiography (DSA) sequences co-registered with the 3D vascular geometry obtained from Cone Beam-CT (CBCT). Methods: We investigated a method to derive multiple 4D Parametric Imaging (PI) maps using only one CBCT acquisition. During this procedure a 3D-DSA geometry is stored and used subsequently for all 4D images. Each time a biplane DSA is acquired, we calculate 2D parametric maps of Bolus Arrival Time (BAT), Mean Transit Time (MTT) and Time to Peak (TTP). Arterial segments which are nearly parallel with one of the biplane imaging planes in the 2D parametric maps are co-registered with the 3D geometry. The values in the remaining vascular network are found using spline interpolation since the points chosen for co-registration on the vasculature are discrete and remaining regions need to be interpolated. To evaluate the method we used a patient CT volume data set for 3D printing a neurovascular phantom containing a complete Circle of Willis. We connected the phantom to a flow loop with a peristaltic pump, simulating physiological flow conditions. Contrast media was injected with an automatic injector at 10 ml/sec. Images were acquired with a Toshiba Infinix C-arm and 4D parametric image maps of the vasculature were calculated. Results: 4D BAT, MTT, and TTP parametric image maps of the Circle of Willis were derived. We generated color-coded 3D geometries which avoided artifacts due to vessel overlap or foreshortening in the projection direction. Conclusion: The software was tested successfully and multiple 4D parametric images were obtained from biplane DSA sequences without the need to acquire additional 3D-DSA runs. This can benefit the patient by reducing the contrast media and the radiation dose normally associated with these procedures. Partial support from NIH Grant R01-EB002873 and Toshiba Medical Systems Corp.
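
    The parametric maps named above are standard time-density-curve summaries; purely as an illustration (not the authors' code), a sketch computing bolus arrival time, time to peak and a simple first-moment transit-time estimate from one pixel's contrast curve could look as follows, where the 10% arrival threshold and the synthetic curve are assumptions.

```python
import numpy as np

def time_density_parameters(curve, dt, arrival_frac=0.10):
    """Per-pixel parametric values from a baseline-subtracted contrast time-density curve.

    curve        : 1D array of contrast values
    dt           : frame interval [s]
    arrival_frac : fraction of the peak used to define bolus arrival (assumed 10%)
    """
    t = np.arange(len(curve)) * dt
    peak = curve.max()
    ttp = t[curve.argmax()]                               # Time to Peak
    above = np.nonzero(curve >= arrival_frac * peak)[0]
    bat = t[above[0]] if above.size else np.nan           # Bolus Arrival Time
    mtt = (t * curve).sum() / curve.sum()                 # simple first-moment transit-time estimate
    return bat, ttp, mtt

# Example with a synthetic gamma-variate-like curve sampled at 6 frames per second.
tt = np.arange(0, 8, 1 / 6)
c = np.maximum(tt - 1.0, 0) ** 2 * np.exp(-(tt - 1.0) / 0.8)
print(time_density_parameters(c, dt=1 / 6))
```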

  8. TU-H-CAMPUS-IeP3-02: Neurovascular 4D Parametric Imaging Using Co-Registration of Biplane DSA Sequences with 3D Vascular Geometry Obtained From Cone Beam CT

    International Nuclear Information System (INIS)

    Balasubramoniam, A; Bednarek, D; Rudin, S; Ionita, C

    2016-01-01

    Purpose: To create 4D parametric images using biplane Digital Subtraction Angiography (DSA) sequences co-registered with the 3D vascular geometry obtained from Cone Beam-CT (CBCT). Methods: We investigated a method to derive multiple 4D Parametric Imaging (PI) maps using only one CBCT acquisition. During this procedure a 3D-DSA geometry is stored and used subsequently for all 4D images. Each time a biplane DSA is acquired, we calculate 2D parametric maps of Bolus Arrival Time (BAT), Mean Transit Time (MTT) and Time to Peak (TTP). Arterial segments which are nearly parallel with one of the biplane imaging planes in the 2D parametric maps are co-registered with the 3D geometry. The values in the remaining vascular network are found using spline interpolation since the points chosen for co-registration on the vasculature are discrete and remaining regions need to be interpolated. To evaluate the method we used a patient CT volume data set for 3D printing a neurovascular phantom containing a complete Circle of Willis. We connected the phantom to a flow loop with a peristaltic pump, simulating physiological flow conditions. Contrast media was injected with an automatic injector at 10 ml/sec. Images were acquired with a Toshiba Infinix C-arm and 4D parametric image maps of the vasculature were calculated. Results: 4D BAT, MTT, and TTP parametric image maps of the Circle of Willis were derived. We generated color-coded 3D geometries which avoided artifacts due to vessel overlap or foreshortening in the projection direction. Conclusion: The software was tested successfully and multiple 4D parametric images were obtained from biplane DSA sequences without the need to acquire additional 3D-DSA runs. This can benefit the patient by reducing the contrast media and the radiation dose normally associated with these procedures. Partial support from NIH Grant R01-EB002873 and Toshiba Medical Systems Corp.

  9. Multi-copy entanglement purification with practical spontaneous parametric down conversion sources

    Science.gov (United States)

    Zhang, Shuai-Shuai; Shu, Qi; Zhou, Lan; Sheng, Yu-Bo

    2017-06-01

    Entanglement purification is to distill the high quality entanglement from the low quality entanglement with local operations and classical communications. It is one of the key technologies in long-distance quantum communication. We discuss an entanglement purification protocol (EPP) with spontaneous parametric down conversion (SPDC) sources, in contrast to previous EPP with multi-copy mixed states, which requires ideal entanglement sources. We show that the SPDC source is not an obstacle for purification, but can benefit the fidelity of the purified mixed state. This EPP works for linear optics and is feasible in current experiment technology. Project supported by the National Natural Science Foundation of China (Grant Nos. 11474168 and 61401222), the Natural Science Foundation of Jiangsu Province, China (Grant No. BK20151502), the Qing Lan Project in Jiangsu Province, China, and a Project Funded by the Priority Academic Program Development of Jiangsu Higher Education Institutions, China.

  10. An Approach for the Implementation of Software Quality Models Adopting CERTICS and CMMI-DEV

    Directory of Open Access Journals (Sweden)

    GARCIA, F.W.

    2015-12-01

    Full Text Available This paper proposes a mapping between two product quality and software process models used in industry, the national CERTICS model and the international CMMI-DEV model. The stages of the mapping are presented step by step, together with the mapping review, which had the cooperation of a specialist in the CERTICS and CMMI-DEV models. The aim is to correlate the structures of the two models in order to facilitate implementation, reduce its time and costs, and stimulate multi-model implementations in software development companies.

  11. Fast and Lean Immutable Multi-Maps on the JVM based on Heterogeneous Hash-Array Mapped Tries

    NARCIS (Netherlands)

    M.J. Steindorfer (Michael); J.J. Vinju (Jurgen)

    2016-01-01

    textabstractAn immutable multi-map is a many-to-many thread-friendly map data structure with expected fast insert and lookup operations. This data structure is used for applications processing graphs or many-to-many relations as applied in static analysis of object-oriented systems. When

  12. Bim and Gis: when Parametric Modeling Meets Geospatial Data

    Science.gov (United States)

    Barazzetti, L.; Banfi, F.

    2017-12-01

    Geospatial data have a crucial role in several projects related to infrastructures and land management. GIS software are able to perform advanced geospatial analyses, but they lack several instruments and tools for parametric modelling typically available in BIM. At the same time, BIM software designed for buildings have limited tools to handle geospatial data. As things stand at the moment, BIM and GIS could appear as complementary solutions, notwithstanding research work is currently under development to ensure a better level of interoperability, especially at the scale of the building. On the other hand, the transition from the local (building) scale to the infrastructure (where geospatial data cannot be neglected) has already demonstrated that parametric modelling integrated with geoinformation is a powerful tool to simplify and speed up some phases of the design workflow. This paper reviews such mixed approaches with both simulated and real examples, demonstrating that integration is already a reality at specific scales, which are not dominated by "pure" GIS or BIM. The paper will also demonstrate that some traditional operations carried out with GIS software are also available in parametric modelling software for BIM, such as transformation between reference systems, DEM generation, feature extraction, and geospatial queries. A real case study is illustrated and discussed to show the advantage of a combined use of both technologies. BIM and GIS integration can generate greater usage of geospatial data in the AECOO (Architecture, Engineering, Construction, Owner and Operator) industry, as well as new solutions for parametric modelling with additional geoinformation.

  13. BIM AND GIS: WHEN PARAMETRIC MODELING MEETS GEOSPATIAL DATA

    Directory of Open Access Journals (Sweden)

    L. Barazzetti

    2017-12-01

    Full Text Available Geospatial data have a crucial role in several projects related to infrastructures and land management. GIS software are able to perform advanced geospatial analyses, but they lack several instruments and tools for parametric modelling typically available in BIM. At the same time, BIM software designed for buildings have limited tools to handle geospatial data. As things stand at the moment, BIM and GIS could appear as complementary solutions, notwithstanding research work is currently under development to ensure a better level of interoperability, especially at the scale of the building. On the other hand, the transition from the local (building) scale to the infrastructure (where geospatial data cannot be neglected) has already demonstrated that parametric modelling integrated with geoinformation is a powerful tool to simplify and speed up some phases of the design workflow. This paper reviews such mixed approaches with both simulated and real examples, demonstrating that integration is already a reality at specific scales, which are not dominated by “pure” GIS or BIM. The paper will also demonstrate that some traditional operations carried out with GIS software are also available in parametric modelling software for BIM, such as transformation between reference systems, DEM generation, feature extraction, and geospatial queries. A real case study is illustrated and discussed to show the advantage of a combined use of both technologies. BIM and GIS integration can generate greater usage of geospatial data in the AECOO (Architecture, Engineering, Construction, Owner and Operator) industry, as well as new solutions for parametric modelling with additional geoinformation.

  14. Adopting of Agile methods in Software Development Organizations: Systematic Mapping

    Directory of Open Access Journals (Sweden)

    Samia Abdalhamid

    2017-11-01

    Full Text Available Adoption of agile methods in software development organizations is considered a powerful solution for dealing with a quickly changing and constantly developing business environment and with well-informed customers whose expectations keep rising, such as shorter time periods and an extraordinary level of response and service. This study investigates the adoption of agile approaches in software development organizations by means of a systematic mapping. Six research questions are identified, and to answer them a number of research papers in electronic databases have been reviewed. Finally, 25 research papers are examined and answers to all research questions are provided.

  15. A NON-PARAMETRIC APPROACH TO CONSTRAIN THE TRANSFER FUNCTION IN REVERBERATION MAPPING

    Energy Technology Data Exchange (ETDEWEB)

    Li, Yan-Rong; Wang, Jian-Min [Key Laboratory for Particle Astrophysics, Institute of High Energy Physics, Chinese Academy of Sciences, 19B Yuquan Road, Beijing 100049 (China); Bai, Jin-Ming, E-mail: liyanrong@mail.ihep.ac.cn [Yunnan Observatories, Chinese Academy of Sciences, Kunming 650011 (China)

    2016-11-10

    Broad emission lines of active galactic nuclei stem from a spatially extended region (broad-line region, BLR) that is composed of discrete clouds and photoionized by the central ionizing continuum. The temporal behaviors of these emission lines are blurred echoes of continuum variations (i.e., reverberation mapping, RM) and directly reflect the structures and kinematic information of BLRs through the so-called transfer function (also known as the velocity-delay map). Based on the previous works of Rybicki and Press and Zu et al., we develop an extended, non-parametric approach to determine the transfer function for RM data, in which the transfer function is expressed as a sum of a family of relatively displaced Gaussian response functions. Therefore, arbitrary shapes of transfer functions associated with complicated BLR geometry can be seamlessly included, enabling us to relax the presumption of a specified transfer function frequently adopted in previous studies and to let it be determined by observation data. We formulate our approach in a previously well-established framework that incorporates the statistical modeling of continuum variations as a damped random walk process and takes into account long-term secular variations which are irrelevant to RM signals. The application to RM data shows the fidelity of our approach.
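
    To make the parameterization concrete, here is a sketch (not the authors' code) of a transfer function built as a sum of displaced Gaussian response functions and its convolution with a continuum light curve to predict the emission-line light curve; the node spacing, widths and amplitudes below are placeholders that would in practice be inferred from the RM data.

```python
import numpy as np

def transfer_function(tau, centers, widths, amplitudes):
    """Psi(tau) = sum_k A_k * exp(-(tau - tau_k)^2 / (2 w_k^2)): a family of displaced Gaussians."""
    tau = np.asarray(tau)[:, None]
    return (amplitudes * np.exp(-0.5 * ((tau - centers) / widths) ** 2)).sum(axis=1)

def predicted_line(continuum, dt, centers, widths, amplitudes, max_lag=100.0):
    """Line light curve as the convolution of the continuum with the transfer function."""
    lags = np.arange(0.0, max_lag, dt)
    psi = transfer_function(lags, centers, widths, amplitudes)
    return np.convolve(continuum, psi, mode="full")[: len(continuum)] * dt

# Placeholder configuration: 20 Gaussian nodes spread over lags of 0-50 days.
centers = np.linspace(0.0, 50.0, 20)
widths = np.full(20, 2.0)
amplitudes = np.random.rand(20)          # to be inferred from the data in a real analysis
continuum = 1.0 + 0.1 * np.random.randn(500)
line = predicted_line(continuum, dt=1.0, centers=centers, widths=widths, amplitudes=amplitudes)
```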

  16. A NON-PARAMETRIC APPROACH TO CONSTRAIN THE TRANSFER FUNCTION IN REVERBERATION MAPPING

    International Nuclear Information System (INIS)

    Li, Yan-Rong; Wang, Jian-Min; Bai, Jin-Ming

    2016-01-01

    Broad emission lines of active galactic nuclei stem from a spatially extended region (broad-line region, BLR) that is composed of discrete clouds and photoionized by the central ionizing continuum. The temporal behaviors of these emission lines are blurred echoes of continuum variations (i.e., reverberation mapping, RM) and directly reflect the structures and kinematic information of BLRs through the so-called transfer function (also known as the velocity-delay map). Based on the previous works of Rybicki and Press and Zu et al., we develop an extended, non-parametric approach to determine the transfer function for RM data, in which the transfer function is expressed as a sum of a family of relatively displaced Gaussian response functions. Therefore, arbitrary shapes of transfer functions associated with complicated BLR geometry can be seamlessly included, enabling us to relax the presumption of a specified transfer function frequently adopted in previous studies and to let it be determined by observation data. We formulate our approach in a previously well-established framework that incorporates the statistical modeling of continuum variations as a damped random walk process and takes into account long-term secular variations which are irrelevant to RM signals. The application to RM data shows the fidelity of our approach.

  17. Research on Mixer Parametric Modeling System Based on Redevelopment of ANSYS

    Directory of Open Access Journals (Sweden)

    Bin Zheng

    2015-01-01

    Full Text Available In this paper, mixer parametric modeling system software was developed using VB as the front-end development environment, combined with the ANSYS software to create the finite element model of the mixer blade and cylinder for subsequent numerical simulation of the flow field and parameter optimization of the mixer. The user interface was developed in VB and the pre-processing model was created by invoking the APDL of ANSYS in the background. The modeling, meshing and component-building operations for the mixer blade and cylinder were thus completed with APDL, and the graphics and text were output and displayed on the user interface of the mixer parametric modeling system developed in VB. Practice has shown that the mixer solid model created with the ANSYS parametric design language is convenient to modify, owing to its similar structure.

  18. On the multi-index (3 m-parametric) Mittag-Leffler functions, fractional calculus relations and series convergence

    Science.gov (United States)

    Paneva-Konovska, Jordanka

    2013-10-01

    In this paper we consider a family of 3m-index generalizations of the classical Mittag-Leffler function, called multi-index (3m-parametric) Mittag-Leffler functions. We survey the basic properties of these entire functions, find their order and type, and give new representations by means of Mellin-Barnes type contour integrals, Wright pΨq-functions and Fox H-functions, as well as asymptotic estimates. Formulas for integer and fractional order integration and differentiation are found, and these are extended also to the operators of the generalized fractional calculus (multiple Erdélyi-Kober operators). Some interesting particular cases of the multi-index Mittag-Leffler functions are discussed. The convergence of series of such functions in the complex plane is considered, and analogues of the Cauchy-Hadamard, Abel, Tauber and Littlewood theorems are provided.
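
    For orientation (this is background, not the paper's 3m-parametric definition, which carries an additional set of parameters), the classical Mittag-Leffler function and the 2m-parametric multi-index generalization that this family extends can be written as:

```latex
% Classical two-parametric Mittag-Leffler function
E_{\alpha,\beta}(z) \;=\; \sum_{k=0}^{\infty} \frac{z^{k}}{\Gamma(\alpha k + \beta)},
\qquad \alpha > 0,\ \beta \in \mathbb{C}.

% Multi-index (2m-parametric) Mittag-Leffler function with m pairs of indices
E_{(\alpha_i),(\beta_i)}^{(m)}(z) \;=\; \sum_{k=0}^{\infty}
\frac{z^{k}}{\prod_{i=1}^{m} \Gamma(\alpha_i k + \beta_i)}.
```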

  19. Characterizing Heterogeneity within Head and Neck Lesions Using Cluster Analysis of Multi-Parametric MRI Data.

    Directory of Open Access Journals (Sweden)

    Marco Borri

    Full Text Available To describe a methodology, based on cluster analysis, to partition multi-parametric functional imaging data into groups (or clusters) of similar functional characteristics, with the aim of characterizing functional heterogeneity within head and neck tumour volumes. To evaluate the performance of the proposed approach on a set of longitudinal MRI data, analysing the evolution of the obtained sub-sets with treatment. The cluster analysis workflow was applied to a combination of dynamic contrast-enhanced and diffusion-weighted imaging MRI data from a cohort of squamous cell carcinoma of the head and neck patients. Cumulative distributions of voxels, containing pre and post-treatment data and including both primary tumours and lymph nodes, were partitioned into k clusters (k = 2, 3 or 4). Principal component analysis and cluster validation were employed to investigate data composition and to independently determine the optimal number of clusters. The evolution of the resulting sub-regions with induction chemotherapy treatment was assessed relative to the number of clusters. The clustering algorithm was able to separate clusters which significantly reduced in voxel number following induction chemotherapy from clusters with a non-significant reduction. Partitioning with the optimal number of clusters (k = 4, determined with cluster validation) produced the best separation between reducing and non-reducing clusters. The proposed methodology was able to identify tumour sub-regions with distinct functional properties, independently separating clusters which were affected differently by treatment. This work demonstrates that unsupervised cluster analysis, with no prior knowledge of the data, can be employed to provide a multi-parametric characterization of functional heterogeneity within tumour volumes.
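
    The paper's own workflow is more elaborate; as a minimal illustration of unsupervised clustering of per-voxel multi-parametric features, a sketch using scikit-learn (the feature names, scaling choice and k = 4 are assumptions mirroring the abstract, and the data are synthetic) could be:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical per-voxel feature matrix: columns could be DCE parameters (e.g. Ktrans, ve)
# and the diffusion-weighted ADC value; rows are tumour voxels pooled over patients/time points.
features = np.random.rand(5000, 3)

# Standardize so no single parameter dominates the distance metric, then partition into k clusters.
scaled = StandardScaler().fit_transform(features)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scaled)

# Cluster sizes before/after treatment could then be compared to see which sub-regions shrink.
print(np.bincount(labels))
```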

  20. Software defined radio (SDR) architecture for concurrent multi-satellite communications

    Science.gov (United States)

    Maheshwarappa, Mamatha R.

    SDRs have emerged as a viable approach for space communications over the last decade by delivering low-cost hardware and flexible software solutions. The flexibility introduced by the SDR concept not only allows the realisation of concurrent multiple standards on one platform, but also promises to ease the implementation of one communication standard on differing SDR platforms by signal porting. This technology would facilitate implementing reconfigurable nodes for parallel satellite reception in Mobile/Deployable Ground Segments and Distributed Satellite Systems (DSS) for amateur radio/university satellite operations. This work outlines the recent advances in embedded technologies that can enable new communication architectures for concurrent multi-satellite or satellite-to-ground missions where multi-link challenges are associated. This research proposes a novel concept to run advanced parallelised SDR back-end technologies in a Commercial-Off-The-Shelf (COTS) embedded system that can support multi-signal processing for multi-satellite scenarios simultaneously. The initial SDR implementation could support only one receiver chain due to system saturation. However, the design was optimised to facilitate multiple signals within the limited resources available on an embedded system at any given time. This was achieved by providing a VHDL solution to the existing Python and C/C++ programming languages along with parallelisation so as to accelerate performance whilst maintaining the flexibility. The improvement in the performance was validated at every stage through profiling. Various cases of concurrent multiple signals with different standards such as frequency (with Doppler effect) and symbol rates were simulated in order to validate the novel architecture proposed in this research. Also, the architecture allows the system to be reconfigurable by providing the opportunity to change the communication standards in soft real-time. The chosen COTS solution provides a

  1. Software filtering method to suppress spike pulse interference in multi-channel scaler

    International Nuclear Information System (INIS)

    Huang Shun; Zhao Xiuliang; Li Zhiqiang; Zhao Yanhui

    2008-01-01

    In testing the anti-jamming performance of a multi-channel scaler, we found that spike pulse interference on the second-level counter, caused by motor start-stop operations, introduces a major counting error. The effective signal and the spike pulse interference have distinguishable characteristics, and a multi-channel hardware filtering circuit would be too bulky and would not filter thoroughly, so we designed a software filtering method. In this method, based on a C8051F020 MCU, the sampled values of one channel are dynamically stored in a single one-byte variable, and the rising and trailing edges of a genuine signal are distinguished from spike pulse interference by the pattern of changes in this variable. Tests showed that the software filtering method solves the counting error of the multi-channel scaler caused by motor start-stop operations. The flow chart and source code of the method are detailed in this paper. (authors)
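
    The paper's C8051F020 source code is not reproduced in the abstract; purely to illustrate the general idea of packing recent samples into one byte and rejecting short spikes, here is a hedged sketch in which the 8-sample window and the acceptance patterns are assumptions, not the authors' exact criteria.

```python
def update_filter(history_byte, new_sample):
    """Shift the latest 1-bit sample into an 8-bit history of one channel."""
    return ((history_byte << 1) | (new_sample & 1)) & 0xFF

def classify(history_byte):
    """Classify the stored pattern: a genuine edge settles at the new level, a spike does not.

    Assumed criteria: 0b00001111 -> clean rising edge, 0b11110000 -> clean trailing edge;
    short isolated runs of 1s (e.g. 0b00011000) are treated as spike interference.
    """
    if history_byte == 0b00001111:
        return "rising edge"
    if history_byte == 0b11110000:
        return "trailing edge"
    if history_byte in (0b00000000, 0b11111111):
        return "steady level"
    return "possible spike - ignore"

# Example: a two-sample glitch never produces the clean edge pattern, so it is not counted.
h = 0
for s in [0, 0, 0, 1, 1, 0, 0, 0]:
    h = update_filter(h, s)
    print(bin(h), classify(h))
```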

  2. Multi-Agent Software Engineering

    International Nuclear Information System (INIS)

    Mohamed, A.H.

    2014-01-01

    This paper proposes a multi-agent, map-based alarm-monitoring system for people. The system monitors the user's physical context using their mobile phone. The agents on the mobile phones are responsible for collecting, processing and sending data to the server; they determine the parameters of their environment through sensors. On the server side, a set of agents stores this data and checks the preconditions of the restrictions associated with the user in order to trigger the appropriate alarms. These alarms are sent not only to the user, who is warned so as to avoid the violated restriction, but also to his supervisor. The proposed system is a general-purpose alarm system that can be used in different critical application areas. It has been applied to monitoring workers at radiation sites, so that these workers can carry out their tasks in radiation environments safely.

  3. Real-time software for multi-isotopic source term estimation

    International Nuclear Information System (INIS)

    Goloubenkov, A.; Borodin, R.; Sohier, A.

    1996-01-01

    Consideration is given to the development of software for one of the crucial components of RODOS: assessment of the source rate (SR) from indirect measurements. Four components of the software are described in the paper. The first component is a GRID system, which allows stochastic meteorological and radioactivity fields to be prepared from measured data. The second is a model of atmospheric transport which can be adapted to emulate practically any gamma dose/spectrum detector. The third is a method which allows space-time and quantitative discrepancies between measured and modelled data to be taken into account simultaneously; it is based on a preference scheme selected by an expert. The last component is a special optimization method for calculation of the multi-isotopic SR and its uncertainties. Results of a validation of the software using tracer experiment data and a Chernobyl source estimation for the main dose-forming isotopes are included in the paper.
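    A toy sketch of the general inverse problem behind such source-rate estimation, assuming a linear detector-response matrix and non-negative least squares; it is not the RODOS optimization method, and the matrix values and isotope names are invented.

```python
# If modelled unit-release responses of each isotope at each detector are
# collected in a matrix A, measured signals y can be inverted for source
# rates q with a non-negativity constraint.
import numpy as np
from scipy.optimize import nnls

isotopes = ["I-131", "Cs-137", "Te-132"]

# A[i, j]: modelled signal at detector i per unit release rate of isotope j
A = np.array([[2.0, 0.5, 0.8],
              [1.2, 1.5, 0.4],
              [0.3, 2.2, 0.9],
              [0.9, 0.7, 1.6]])

q_true = np.array([3.0, 1.0, 0.5])                 # "true" release rates
y = A @ q_true + np.random.default_rng(1).normal(0, 0.05, size=4)

q_est, residual = nnls(A, y)                       # constrained q >= 0
for name, q in zip(isotopes, q_est):
    print(f"{name}: estimated source rate {q:.2f}")
```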

  4. Parametric Modelling of As-Built Beam Framed Structure in Bim Environment

    Science.gov (United States)

    Yang, X.; Koehl, M.; Grussenmeyer, P.

    2017-02-01

    A complete documentation and conservation of a historic timber roof requires the integration of geometry modelling, attribute and dynamic information management, and the results of structural analysis. The recently developed as-built Building Information Modelling (BIM) technique has the potential to provide a uniform platform that integrates traditional geometry modelling, parametric element management and structural analysis. The main objective of the project presented in this paper is to develop a parametric modelling tool for a timber roof structure whose elements form a leaning and crossing beam frame. Since Autodesk Revit, as typical BIM software, provides the platform for parametric modelling and information management, an API plugin was developed that automatically creates the parametric beam elements and links them together with strict relationships. The plugin under development is introduced in the paper; it obtains the parametric beam model from total station points and terrestrial laser scanning data via the Autodesk Revit API. The results show the potential of automating parametric modelling through interactive API development in a BIM environment. The approach also integrates the separate data processing steps and different platforms into the uniform Revit software.

  5. Effects of registration error on parametric response map analysis: a simulation study using liver CT-perfusion images

    International Nuclear Information System (INIS)

    Lausch, A; Lee, T Y; Wong, E; Jensen, N K G; Chen, J; Lock, M

    2014-01-01

    Purpose: To investigate the effects of registration error (RE) on parametric response map (PRM) analysis of pre- and post-radiotherapy (RT) functional images. Methods: Arterial blood flow (ABF) maps were generated from the CT-perfusion scans of 5 patients with hepatocellular carcinoma. ABF values within each patient map were modified to produce seven new ABF maps simulating 7 distinct post-RT functional change scenarios. Ground truth PRMs were generated for each patient by comparing the simulated and original ABF maps. Each simulated ABF map was then deformed by different magnitudes of realistic respiratory motion in order to simulate RE. PRMs were generated for each of the deformed maps and then compared to the ground truth PRMs to produce estimates of RE-induced misclassification. Main findings: The percentage of voxels misclassified as decreasing, no change, and increasing increased with RE. For all patients, increasing RE was observed to increase the number of high post-RT ABF voxels associated with low pre-RT ABF voxels and vice versa. An average tumour RE of 3 mm resulted in 18-45% tumour voxel misclassification rates. Conclusions: RE-induced misclassification poses challenges for PRM analysis in the liver, where registration accuracy tends to be lower. Quantitative understanding of the sensitivity of the PRM method to registration error is required if PRMs are to be used to guide radiation therapy dose painting techniques.
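    A minimal sketch of the basic PRM classification used as ground truth in this kind of study: voxels are labelled by the change in a functional value between registered pre- and post-RT maps relative to a fixed threshold. The maps and threshold below are synthetic and illustrative only.

```python
# Threshold-based PRM labelling of voxels into decrease / no change / increase.
import numpy as np

def prm_classify(pre, post, threshold=15.0):
    """Return -1 (decrease), 0 (no change), +1 (increase) per voxel."""
    diff = post - pre
    labels = np.zeros_like(diff, dtype=int)
    labels[diff > threshold] = 1
    labels[diff < -threshold] = -1
    return labels

rng = np.random.default_rng(0)
pre = rng.uniform(20, 120, size=(64, 64))          # pre-RT ABF map
post = pre + rng.normal(0, 20, size=pre.shape)     # simulated post-RT map

labels = prm_classify(pre, post)
fractions = {k: np.mean(labels == v) for k, v in
             {"decrease": -1, "no change": 0, "increase": 1}.items()}
print(fractions)
# Misclassification due to registration error could be probed by applying a
# small deformation to `post` before classification and re-comparing labels.
```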

  6. A New and General Formulation of the Parametric HFGMC Micromechanical Method for Three-Dimensional Multi-Phase Composites

    Science.gov (United States)

    Haj-Ali, Rami; Aboudi, Jacob

    2012-01-01

    The recent two-dimensional (2-D) parametric formulation of the high fidelity generalized method of cells (HFGMC) reported by the authors is generalized for the micromechanical analysis of three-dimensional (3-D) multiphase composites with periodic microstructure. Arbitrary hexahedral subcell geometry is developed to discretize a triply periodic repeating unit-cell (RUC). Linear parametric-geometric mapping is employed to transform the arbitrary hexahedral subcell shapes from the physical space to an auxiliary orthogonal shape, where a complete quadratic displacement expansion is performed. Previously, in the 2-D case, three additional equations were needed in the form of average moments of equilibrium as a result of the inclusion of the bilinear terms. However, the present 3-D parametric HFGMC formulation eliminates the need for such additional equations. This is achieved by expressing the coefficients of the full quadratic polynomial expansion of the subcell in terms of the side or face average-displacement vectors. The 2-D parametric and orthogonal HFGMC are special cases of the present 3-D formulation. The continuity of displacements and tractions, as well as the equilibrium equations, are imposed in the average (integral) sense as in the original HFGMC formulation. Each of the six sides (faces) of a subcell has an independent average displacement micro-variable vector which forms an energy-conjugate pair with the transformed average-traction vector. This allows the generation of symmetric stiffness matrices along with internal resisting vectors for the subcells, which enhances computational efficiency. The established new parametric 3-D HFGMC equations are formulated and solution implementations are addressed. Several applications for triply periodic 3-D composites are presented to demonstrate the general capability and versatility of the present parametric HFGMC method for refined micromechanical analysis by generating the spatial distributions of local stress fields
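    For readers unfamiliar with parametric geometric mapping, the standard trilinear map of an arbitrary hexahedral subcell onto an auxiliary cube is sketched below; this is the generic finite-element form, and the HFGMC-specific quadratic displacement expansion built on top of it is not reproduced here.

```latex
% Trilinear mapping of a hexahedral subcell with vertices \mathbf{x}_i
% (i = 1..8) onto the auxiliary cube -1 <= r, s, t <= 1.
\[
  \mathbf{x}(r,s,t) \;=\; \sum_{i=1}^{8} N_i(r,s,t)\,\mathbf{x}_i,
  \qquad
  N_i(r,s,t) \;=\; \tfrac{1}{8}\,\bigl(1 + r\,r_i\bigr)\bigl(1 + s\,s_i\bigr)\bigl(1 + t\,t_i\bigr),
\]
% where (r_i, s_i, t_i) \in \{-1,+1\}^3 are the vertex coordinates of the cube.
```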

  7. Brain SPECT analysis using statistical parametric mapping in patients with transient global amnesia

    Energy Technology Data Exchange (ETDEWEB)

    Kim, E. N.; Sohn, H. S.; Kim, S. H; Chung, S. K.; Yang, D. W. [College of Medicine, The Catholic Univ. of Korea, Seoul (Korea, Republic of)

    2001-07-01

    This study investigated alterations in regional cerebral blood flow (rCBF) in patients with transient global amnesia (TGA) using statistical parametric mapping 99 (SPM99). Noninvasive rCBF measurements using 99mTc-ethyl cysteinate dimer (ECD) SPECT were performed on 8 patients with TGA and 17 age-matched controls. The relative rCBF maps in patients with TGA and controls were compared. In patients with TGA, significantly decreased rCBF was found along the left superior temporal region extending to the left parietal region of the brain, and in the left thalamus. There were areas of increased rCBF in the right temporal region, right frontal region and right thalamus. Using SPM99, we could demonstrate decreased perfusion in the left cerebral hemisphere and increased perfusion in the right cerebral hemisphere in patients with TGA. The reciprocal change of rCBF between the right and left cerebral hemispheres in patients with TGA might suggest that imbalanced neuronal activity between the bilateral hemispheres plays an important role in the pathogenesis of TGA. For quantitative SPECT analysis in TGA patients, we recommend SPM99 rather than the ROI method because of its definitive advantages.

  8. Brain SPECT analysis using statistical parametric mapping in patients with transient global amnesia

    International Nuclear Information System (INIS)

    Kim, E. N.; Sohn, H. S.; Kim, S. H; Chung, S. K.; Yang, D. W.

    2001-01-01

    This study investigated alterations in regional cerebral blood flow (rCBF) in patients with transient global amnesia (TGA) using statistical parametric mapping 99 (SPM99). Noninvasive rCBF measurements using 99mTc-ethyl cysteinate dimer (ECD) SPECT were performed on 8 patients with TGA and 17 age-matched controls. The relative rCBF maps in patients with TGA and controls were compared. In patients with TGA, significantly decreased rCBF was found along the left superior temporal region extending to the left parietal region of the brain, and in the left thalamus. There were areas of increased rCBF in the right temporal region, right frontal region and right thalamus. Using SPM99, we could demonstrate decreased perfusion in the left cerebral hemisphere and increased perfusion in the right cerebral hemisphere in patients with TGA. The reciprocal change of rCBF between the right and left cerebral hemispheres in patients with TGA might suggest that imbalanced neuronal activity between the bilateral hemispheres plays an important role in the pathogenesis of TGA. For quantitative SPECT analysis in TGA patients, we recommend SPM99 rather than the ROI method because of its definitive advantages.

  9. Reference Architecture for Multi-Layer Software Defined Optical Data Center Networks

    Directory of Open Access Journals (Sweden)

    Casimer DeCusatis

    2015-09-01

    Full Text Available As cloud computing data centers grow larger and networking devices proliferate, many complex issues arise in the network management architecture. We propose a framework for multi-layer, multi-vendor optical network management using open standards-based software defined networking (SDN). Experimental results are demonstrated in a test bed consisting of three data centers interconnected by a 125 km metropolitan area network, running OpenStack with KVM and VMware components. Use cases include inter-data center connectivity via a packet-optical metropolitan area network, intra-data center connectivity using an optical mesh network, and SDN coordination of networking equipment within and between multiple data centers. We create and demonstrate original software to implement virtual network slicing and affinity policy-as-a-service offerings. Enhancements to synchronous storage backup, cloud exchanges, and Fibre Channel over Ethernet topologies are also discussed.

  10. Synchronized 2D/3D optical mapping for interactive exploration and real-time visualization of multi-function neurological images.

    Science.gov (United States)

    Zhang, Qi; Alexander, Murray; Ryner, Lawrence

    2013-01-01

    Efficient software with the ability to display multiple neurological image datasets simultaneously with full real-time interactivity is critical for brain disease diagnosis and image-guided planning. In this paper, we describe the creation and function of a new comprehensive software platform that integrates novel algorithms and functions for multiple medical image visualization, processing, and manipulation. We implement an opacity-adjustment algorithm to build 2D lookup tables for multiple slice image display and fusion, which achieves a better visual result than VTK-based methods. We also develop a new real-time 2D and 3D data synchronization scheme that optically maps and renders multi-function MR volume and slice images simultaneously through the same adjustment operation. All these methodologies are integrated into our software framework to provide users with an efficient tool for flexibly, intuitively, and rapidly exploring and analyzing functional and anatomical MR neurological data. Finally, we validate our new techniques and software platform with visual analysis and task-specific user studies.
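    A simple sketch of the general idea of fusing two co-registered slices with a per-intensity opacity lookup table; the specific 2D LUT algorithm of the platform is not reproduced, and the windowing values and images below are invented.

```python
# Per-voxel alpha blending of an anatomical slice with an overlay, where the
# overlay opacity is taken from an intensity-dependent lookup table.
import numpy as np

def make_lut(low, high):
    """Opacity ramps from 0 to 1 between two intensity breakpoints."""
    return np.clip((np.arange(256) - low) / float(high - low), 0.0, 1.0)

rng = np.random.default_rng(0)
anatomy = rng.integers(0, 256, size=(128, 128))      # e.g. T1w slice (uint8 range)
overlay = rng.integers(0, 256, size=(128, 128))      # e.g. functional map

alpha = make_lut(low=100, high=200)[overlay]         # opacity per overlay voxel
fused = (1.0 - alpha) * anatomy + alpha * overlay    # blended display slice
print(fused.shape, fused.min(), fused.max())
```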

  11. ARLearn and StreetLearn software for virtual reality and augmented reality multi user learning games

    NARCIS (Netherlands)

    Ternier, Stefaan; Klemke, Roland

    2012-01-01

    Ternier, S., & Klemke, R. (2011). ARLearn and StreetLearn software for virtual reality and augmented reality multi user learning games (Version 1.0) [Software Documentation]. Heerlen, The Netherlands: Open Universiteit in the Netherlands.

  12. ARLearn and StreetLearn software for virtual reality and augmented reality multi user learning games

    NARCIS (Netherlands)

    Ternier, Stefaan; Klemke, Roland

    2012-01-01

    Ternier, S., & Klemke, R. (2011). ARLearn and StreetLearn software for virtual reality and augmented reality multi user learning games (Version 1.0) [Computer software]. Heerlen, The Netherlands: Open Universiteit in the Netherlands.

  13. Multi-Synchronization Caused by Uniform Disorder for Globally Coupled Maps

    International Nuclear Information System (INIS)

    Jing-Hui, Li

    2008-01-01

    We investigate the motion of globally coupled maps (logistic maps) driven by uniform disorder. It is shown that this disorder can produce multi-synchronization in the globally coupled chaotic maps studied here. The disorder determines the synchronized dynamics, leading to the emergence of a wide range of new collective behaviour that the individual units in isolation are incapable of producing in the absence of the disorder. Our results imply that disorder can tame the collective motion of coupled chaotic maps
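    A sketch of globally (mean-field) coupled logistic maps with quenched disorder, for readers who want to experiment with cluster formation; the coupling strength, parameter range and the exact disorder scheme are illustrative assumptions, not the scheme used in the paper.

```python
# Globally coupled logistic maps with quenched parameter disorder.
import numpy as np

N, steps, eps = 200, 2000, 0.3
rng = np.random.default_rng(2)

a = rng.uniform(3.8, 4.0, size=N)        # quenched uniform disorder per unit
x = rng.uniform(0.0, 1.0, size=N)

def f(x, a):
    return a * x * (1.0 - x)             # logistic map

for _ in range(steps):
    fx = f(x, a)
    x = (1.0 - eps) * fx + eps * fx.mean()   # global (all-to-all) coupling

# Units whose states stay close form synchronized clusters; grouping states
# that differ by less than a tolerance gives a crude cluster count.
order = np.sort(x)
breaks = np.where(np.diff(order) > 1e-3)[0]
print("approximate number of synchronized clusters:", breaks.size + 1)
```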

  14. Exploiting High Resolution Multi-Seasonal Textural Measures and Spectral Information for Reedbed Mapping

    Directory of Open Access Journals (Sweden)

    Alex Okiemute Onojeghuo

    2016-02-01

    Full Text Available Reedbeds across the UK are amongst the most important habitats for rare and endangered birds, wildlife and organisms. However, over the past century, this valued wetland habitat has experienced a drastic reduction in quality and spatial coverage due to pressures from human-related activities. To this end, conservation organisations across the UK have been charged with the task of conserving and expanding this threatened habitat. With this backdrop, the study aimed to develop a methodology for accurate reedbed mapping through the combined use of multi-seasonal texture measures and spectral information contained in high resolution QuickBird satellite imagery. The key objectives were to determine the most effective single-date (autumn or summer) and multi-seasonal QuickBird imagery suitable for reedbed mapping over the study area; to evaluate the effectiveness of combining multi-seasonal texture measures and spectral information for reedbed mapping using a variety of combinations; and to evaluate the most suitable classification technique for reedbed mapping from three selected classification techniques, namely maximum likelihood classifier, spectral angular mapper and artificial neural network. Using two selected grey-level co-occurrence textural measures (entropy and angular second moment), a series of experiments were conducted using varied combinations of single-date and multi-seasonal QuickBird imagery. Overall, the results indicate the multi-seasonal pansharpened multispectral bands (eight layers) combined with all eight grey level co-occurrence matrix texture measures (entropy and angular second moment, computed using windows 3 × 3 and 7 × 7) produced the optimal reedbed (76.5%) and overall classification (78.1%) accuracies using the maximum likelihood classifier technique. Using the optimal 16 layer multi-seasonal pansharpened multispectral and texture combined image dataset, a total reedbed area of 9.8 hectares was successfully mapped over the
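    A sketch of computing the two texture measures named in the study (entropy and angular second moment) from a grey-level co-occurrence matrix with scikit-image; the window size, grey-level count, distances and angles below are illustrative choices, not the study's settings.

```python
# GLCM-based ASM and entropy for one image window (synthetic data).
import numpy as np
from skimage.feature import graycomatrix, graycoprops  # 'greycomatrix' in older releases

rng = np.random.default_rng(0)
window = rng.integers(0, 32, size=(7, 7), dtype=np.uint8)   # one 7x7 window

glcm = graycomatrix(window, distances=[1], angles=[0, np.pi / 2],
                    levels=32, symmetric=True, normed=True)

asm = graycoprops(glcm, "ASM").mean()        # angular second moment

# Entropy is not a built-in property; compute it per angle from the GLCM.
p = glcm.astype(float)
entropy = 0.0
for k in range(p.shape[3]):
    pk = p[:, :, 0, k]
    entropy += -np.sum(pk * np.log2(pk, where=pk > 0, out=np.zeros_like(pk)))
entropy /= p.shape[3]

print(f"ASM = {asm:.4f}, entropy = {entropy:.2f}")
```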

  15. Ultrasonic defect characterization using parametric-manifold mapping

    Science.gov (United States)

    Velichko, A.; Bai, L.; Drinkwater, B. W.

    2017-06-01

    The aim of ultrasonic non-destructive evaluation includes the detection and characterization of defects, and an understanding of the nature of defects is essential for the assessment of structural integrity in safety critical systems. In general, the defect characterization challenge involves an estimation of defect parameters from measured data. In this paper, we explore the extent to which defects can be characterized by their ultrasonic scattering behaviour. Given a number of ultrasonic measurements, we show that characterization information can be extracted by projecting the measurement onto a parametric manifold in principal component space. We show that this manifold represents the entirety of the characterization information available from far-field harmonic ultrasound. We seek to understand the nature of this information and hence provide definitive statements on the defect characterization performance that is, in principle, extractable from typical measurement scenarios. In experiments, the characterization problem of surface-breaking cracks and the more general problem of elliptical voids are studied, and a good agreement is achieved between the actual parameter values and the characterization results. The nature of the parametric manifold enables us to explain and quantify why some defects are relatively easy to characterize, whereas others are inherently challenging.
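    A conceptual sketch of characterization by projection onto a parametric manifold: responses simulated over a parameter grid are embedded with PCA, and a new measurement is characterized by its nearest point on that manifold in principal-component space. The forward model below is a made-up smooth function, not an ultrasonic scattering model, and the parameter names are assumptions.

```python
# Nearest-point lookup on a PCA-embedded parametric manifold.
import numpy as np
from sklearn.decomposition import PCA

def forward_model(length, angle, n=64):
    """Stand-in for a far-field scattering response of a defect."""
    t = np.linspace(0, 1, n)
    return np.sin(2 * np.pi * length * t) * np.cos(angle + 4 * np.pi * t)

lengths = np.linspace(0.5, 3.0, 40)
angles = np.linspace(0.0, np.pi / 2, 30)
grid = [(L, a) for L in lengths for a in angles]
responses = np.array([forward_model(L, a) for L, a in grid])

pca = PCA(n_components=3).fit(responses)
manifold = pca.transform(responses)          # the parametric manifold

# A noisy "measurement" from an unknown defect:
rng = np.random.default_rng(0)
measured = forward_model(1.7, 0.6) + rng.normal(0, 0.05, 64)
z = pca.transform(measured[None, :])

nearest = np.argmin(np.linalg.norm(manifold - z, axis=1))
print("estimated (length, angle):", grid[nearest])
```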

  16. Sonar: a multibase and parametric interface software for SDI

    International Nuclear Information System (INIS)

    Fonseca Passos, M.C.J. da

    1986-01-01

    Sonar, an automated service for selective dissemination of information (SDI) developed by the Centro de Informacoes Nucleares (CIN) of the Comissao Nacional de Energia Nuclear (CNEN), is described. Emphasis is given to the multibase feature of the system, which is based on the parametric interface between the system and an external database reading subroutine. (Author) [pt

  17. Multi-Layer Visualization of Mobile Mapping Data

    Directory of Open Access Journals (Sweden)

    D. Eggert

    2013-10-01

    Full Text Available For the visualization of mobile mapping data, various different visualization schemes are conceivable. This paper presents a multi-layer based visualization method, enabling fast browsing of mobile mapping data. In contrast to systems like Google Street View, the proposed visualization is not based on 360° panoramas, but on colored point clouds projected onto partially translucent images. Those images are rendered as overlapping textures, preserving the depth of the recorded data while still enabling fast rendering on any kind of platform. Furthermore, the proposed visualization allows the user to inspect the mobile mapping data in a panoramic fashion with an immersive depth illusion using the parallax scrolling technique.

  18. Fast, Sequence Adaptive Parcellation of Brain MR Using Parametric Models

    DEFF Research Database (Denmark)

    Puonti, Oula; Iglesias, Juan Eugenio; Van Leemput, Koen

    2013-01-01

    In this paper we propose a method for whole brain parcellation using the type of generative parametric models typically used in tissue classification. Compared to the non-parametric, multi-atlas segmentation techniques that have become popular in recent years, our method obtains state-of-the-art ...

  19. Logistic regression model for diagnosis of transition zone prostate cancer on multi-parametric MRI

    Energy Technology Data Exchange (ETDEWEB)

    Dikaios, Nikolaos; Halligan, Steve; Taylor, Stuart; Atkinson, David; Punwani, Shonit [University College London, Centre for Medical Imaging, London (United Kingdom); University College London Hospital, Departments of Radiology, London (United Kingdom); Alkalbani, Jokha; Sidhu, Harbir Singh; Fujiwara, Taiki [University College London, Centre for Medical Imaging, London (United Kingdom); Abd-Alazeez, Mohamed; Ahmed, Hashim; Emberton, Mark [University College London, Research Department of Urology, London (United Kingdom); Kirkham, Alex; Allen, Clare [University College London Hospital, Departments of Radiology, London (United Kingdom); Freeman, Alex [University College London Hospital, Department of Histopathology, London (United Kingdom)

    2014-09-17

    We aimed to develop logistic regression (LR) models for classifying prostate cancer within the transition zone on multi-parametric magnetic resonance imaging (mp-MRI). One hundred and fifty-five patients (training cohort, 70 patients; temporal validation cohort, 85 patients) underwent mp-MRI and transperineal-template-prostate-mapping (TPM) biopsy. Positive cores were classified by cancer definitions: (1) any-cancer; (2) definition-1 [≥Gleason 4 + 3 or ≥ 6 mm cancer core length (CCL)] [high risk significant]; and (3) definition-2 (≥Gleason 3 + 4 or ≥ 4 mm CCL) cancer [intermediate-high risk significant]. For each, logistic-regression mp-MRI models were derived from the training cohort and validated internally and with the temporal cohort. Sensitivity/specificity and the area under the receiver operating characteristic (ROC-AUC) curve were calculated. LR model performance was compared to radiologists' performance. Twenty-eight of 70 patients from the training cohort, and 25/85 patients from the temporal validation cohort had significant cancer on TPM. The ROC-AUC of the LR model for classification of cancer was 0.73/0.67 at internal/temporal validation. The radiologist A/B ROC-AUC was 0.65/0.74 (temporal cohort). For patients scored by radiologists as Prostate Imaging Reporting and Data System (Pi-RADS) score 3, sensitivity/specificity of radiologist A 'best guess' and LR model was 0.14/0.54 and 0.71/0.61, respectively; and radiologist B 'best guess' and LR model was 0.40/0.34 and 0.50/0.76, respectively. LR models can improve classification of Pi-RADS score 3 lesions similar to experienced radiologists. (orig.)
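    A hedged sketch of a logistic-regression classifier over mp-MRI derived features with ROC-AUC evaluation on a held-out cohort, in the spirit of the study design; the feature names and synthetic cohorts below are illustrative only and are not the study's data or predictors.

```python
# Train on one cohort, validate on a second ("temporal") cohort, report ROC-AUC.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
features = ["ADC", "T2_signal", "Ktrans", "early_enhancement"]

def make_cohort(n):
    X = rng.normal(size=(n, len(features)))
    logits = 1.2 * X[:, 0] - 0.8 * X[:, 2] + rng.normal(0, 1, n)
    y = (logits > 0.5).astype(int)          # "significant cancer" label
    return X, y

X_train, y_train = make_cohort(70)          # training cohort
X_test, y_test = make_cohort(85)            # validation cohort

model = LogisticRegression().fit(X_train, y_train)
p_test = model.predict_proba(X_test)[:, 1]
print(f"validation ROC-AUC: {roc_auc_score(y_test, p_test):.2f}")
```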

  20. Logistic regression model for diagnosis of transition zone prostate cancer on multi-parametric MRI

    International Nuclear Information System (INIS)

    Dikaios, Nikolaos; Halligan, Steve; Taylor, Stuart; Atkinson, David; Punwani, Shonit; Alkalbani, Jokha; Sidhu, Harbir Singh; Fujiwara, Taiki; Abd-Alazeez, Mohamed; Ahmed, Hashim; Emberton, Mark; Kirkham, Alex; Allen, Clare; Freeman, Alex

    2015-01-01

    We aimed to develop logistic regression (LR) models for classifying prostate cancer within the transition zone on multi-parametric magnetic resonance imaging (mp-MRI). One hundred and fifty-five patients (training cohort, 70 patients; temporal validation cohort, 85 patients) underwent mp-MRI and transperineal-template-prostate-mapping (TPM) biopsy. Positive cores were classified by cancer definitions: (1) any-cancer; (2) definition-1 [≥Gleason 4 + 3 or ≥ 6 mm cancer core length (CCL)] [high risk significant]; and (3) definition-2 (≥Gleason 3 + 4 or ≥ 4 mm CCL) cancer [intermediate-high risk significant]. For each, logistic-regression mp-MRI models were derived from the training cohort and validated internally and with the temporal cohort. Sensitivity/specificity and the area under the receiver operating characteristic (ROC-AUC) curve were calculated. LR model performance was compared to radiologists' performance. Twenty-eight of 70 patients from the training cohort, and 25/85 patients from the temporal validation cohort had significant cancer on TPM. The ROC-AUC of the LR model for classification of cancer was 0.73/0.67 at internal/temporal validation. The radiologist A/B ROC-AUC was 0.65/0.74 (temporal cohort). For patients scored by radiologists as Prostate Imaging Reporting and Data System (Pi-RADS) score 3, sensitivity/specificity of radiologist A 'best guess' and LR model was 0.14/0.54 and 0.71/0.61, respectively; and radiologist B 'best guess' and LR model was 0.40/0.34 and 0.50/0.76, respectively. LR models can improve classification of Pi-RADS score 3 lesions similar to experienced radiologists. (orig.)

  1. A Software Module for High-Accuracy Calibration of Rings and Cylinders on CMM using Multi-Orientation Techniques (Multi-Step and Reversal methods)

    DEFF Research Database (Denmark)

    Tosello, Guido; De Chiffre, Leonardo

    The Centre for Geometrical Metrology (CGM) at the Technical University of Denmark takes care of free-form measurements, in collaboration with DIMEG, University of Padova, Italy. The present report describes a software module, ROUNDCAL, to be used for high-accuracy calibration of rings and cylinders. The purpose of the software is to calculate the form error and the least-squares circle of rings and cylinders by averaging pointwise measurement results obtained from so-called multi-orientation techniques (both reversal and multi-step methods), in order to eliminate systematic errors of the CMM.
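    A sketch of the multi-step separation idea that such a module relies on: the ring is measured in k angular orientations on the CMM; the part form error rotates with the part while the machine's systematic error stays fixed, so averaging the back-rotated profiles cancels all machine-error harmonics that are not multiples of k. The profiles below are synthetic and the harmonic content is chosen for illustration.

```python
# Multi-step (multi-orientation) separation of part form error from a
# systematic machine error, using synthetic roundness profiles.
import numpy as np

n, k = 360, 8                                  # samples per profile, orientations
idx = np.arange(n)

part = 2e-3 * np.cos(3 * 2 * np.pi * idx / n)      # "true" form error (mm)
machine = 1e-3 * np.sin(2 * 2 * np.pi * idx / n)   # systematic CMM error (mm)

profiles = []
for j in range(k):
    shift = j * n // k                         # part rotated by j*(360/k) degrees
    profiles.append(np.roll(part, shift) + machine)

# Rotate each profile back into part coordinates and average:
part_est = np.mean([np.roll(p, -j * n // k) for j, p in enumerate(profiles)], axis=0)
print("max separation error (mm):", np.abs(part_est - part).max())
```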

  2. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan part 2 mappings for the ASC software quality engineering practices, version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Heaphy, Robert; Sturtevant, Judith E.; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR001.3.2 and CPR001.3.6 and to a Department of Energy document, "ASCI Software Quality Engineering: Goals, Principles, and Guidelines". This document also identifies ASC management and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  3. Performance evaluation of multi-stratum resources integration based on network function virtualization in software defined elastic data center optical interconnect.

    Science.gov (United States)

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; Tian, Rui; Han, Jianrui; Lee, Young

    2015-11-30

    Data center interconnection with elastic optical networks is a promising scenario for meeting the high burstiness and high-bandwidth requirements of data center services. In our previous work, we implemented multi-stratum resilience between IP and elastic optical networks to accommodate data center services. This study extends that work to consider resource integration by breaking the limits of individual network devices, which can enhance resource utilization. We propose a novel multi-stratum resources integration (MSRI) architecture based on network function virtualization in a software defined elastic data center optical interconnect. A resource integrated mapping (RIM) scheme for MSRI is introduced in the proposed architecture. The MSRI can accommodate data center services through resource integration when a single function or resource is too scarce to provision the services, and it enables globally integrated optimization of optical network and application resources. The overall feasibility and efficiency of the proposed architecture are experimentally verified on the control plane of an OpenFlow-based enhanced software defined networking (eSDN) testbed. The performance of the RIM scheme under a heavy traffic load scenario is also quantitatively evaluated on the MSRI architecture in terms of path blocking probability, provisioning latency and resource utilization, and compared with other provisioning schemes.

  4. Current-driven parametric resonance in magnetic multilayers

    International Nuclear Information System (INIS)

    Wang, C; Seinige, H; Tsoi, M

    2013-01-01

    Current-induced parametric excitations were observed in point-contact spin-valve nanodevices. Point contacts were used to inject high densities of direct and microwave currents into spin valves, thus producing oscillating spin-transfer and Oersted-field torques on magnetic moments. The resulting magnetodynamics were observed electrically by measuring rectified voltage signals across the contact. In addition to the spin-torque-driven ferromagnetic resonance we observe doubled-frequency signals which correspond to the parametric excitation of magnetic moments. Numerical simulations suggest that while both spin-transfer torque and ac Oersted field contribute to the parametrically excited dynamics, the ac spin torque dominates, and dc spin torque can switch it on and off. The dc bias dependence of the parametric resonance signal enabled the mapping of instability regions characterizing the nonlinearity of the oscillation. (paper)

  5. A software tool for advanced MRgFUS prostate therapy planning and follow up

    Science.gov (United States)

    van Straaten, Dörte; Hoogenboom, Martijn; van Amerongen, Martinus J.; Weiler, Florian; Issawi, Jumana Al; Günther, Matthias; Fütterer, Jurgen; Jenne, Jürgen W.

    2017-03-01

    US-guided HIFU/FUS ablation for the therapy of prostate cancer is a clinically established method, while MR-guided HIFU/FUS applications for the prostate have only recently entered clinical evaluation. Although MRI examination is an excellent diagnostic tool for prostate cancer, it is a time-consuming procedure and not practicable within an MRgFUS therapy session. The aim of our ongoing work is to develop software to support therapy planning and post-therapy follow-up for MRgFUS of localized prostate cancer, based on multi-parametric MR protocols. The clinical workflow of diagnosis, therapy and follow-up of MR-guided FUS on prostate cancer was analyzed in depth. Based on this, the image processing workflow was designed and all necessary components, e.g. GUI, viewer, registration tools etc., were defined and implemented. The software is based on MeVisLab, with several C++ modules implemented for the image processing tasks. The developed software, called LTC (Local Therapy Control), will automatically register and visualize all images (T1w, T2w, DWI etc.) and ADC or perfusion maps obtained from the diagnostic MRI session. This maximum of diagnostic information helps to segment all necessary ROIs, e.g. the tumor, for therapy planning. Final therapy planning will be performed based on these segmentation data in the subsequent MRgFUS therapy session. In addition, the developed software should help to evaluate therapy success by synchronizing and displaying pre-therapeutic, therapy and follow-up image data, including the therapy plan and thermal dose information. In this ongoing project, the first stand-alone prototype was completed and will be clinically evaluated.

  6. Parametric programming of industrial robots

    Directory of Open Access Journals (Sweden)

    Szulczyński Paweł

    2015-06-01

    Full Text Available This article proposes the use of parametric design software, commonly used by architects, to obtain complex trajectories and program code for industrial robots. The paper describes the drawbacks of existing solutions and proposes a new script to obtain a correct program. The result of the algorithm was verified experimentally.

  7. Fault Localization Method by Partitioning Memory Using Memory Map and the Stack for Automotive ECU Software Testing

    Directory of Open Access Journals (Sweden)

    Kwanhyo Kim

    2016-09-01

    Full Text Available Recently, the usage of automotive Electronic Control Units (ECUs) and their software in cars has been increasing. As the functional complexity of such software increases, so does the likelihood of software-related faults. It is therefore important to ensure the reliability of ECU software in order to ensure automobile safety, and systematic testing methods are required that can guarantee software quality. However, it is difficult to locate a fault during testing with the current ECU development system because a tester performs black-box testing using a Hardware-in-the-Loop (HiL) simulator. Consequently, developers spend a large amount of money and time on debugging because they debug without any information about the location of the fault. In this paper, we propose a method for localizing faults utilizing memory information during black-box testing. This is likely to be of use to developers who debug automotive software. In order to observe whether symbols stored in the memory have been updated, the memory is partitioned by a memory map and the stack, and thus the fault candidate region is reduced. The memory map method has the advantage of being able to finely partition the memory, while the stack method can partition the memory without a memory map. We validated these methods by applying them to HiL testing of the ECU for a body control system. The preliminary results indicate that the memory map and the stack reduce the possible fault locations to 22% and 19% of the updated memory, respectively.

  8. SPM analysis of parametric (R)-[11C]PK11195 binding images: plasma input versus reference tissue parametric methods.

    Science.gov (United States)

    Schuitemaker, Alie; van Berckel, Bart N M; Kropholler, Marc A; Veltman, Dick J; Scheltens, Philip; Jonker, Cees; Lammertsma, Adriaan A; Boellaard, Ronald

    2007-05-01

    (R)-[11C]PK11195 has been used for quantifying cerebral microglial activation in vivo. In previous studies, both plasma input and reference tissue methods have been used, usually in combination with a region of interest (ROI) approach. Definition of ROIs, however, can be laborious and prone to interobserver variation. In addition, results are only obtained for predefined areas and (unexpected) signals in undefined areas may be missed. On the other hand, standard pharmacokinetic models are too sensitive to noise to calculate (R)-[11C]PK11195 binding on a voxel-by-voxel basis. Linearised versions of both plasma input and reference tissue models have been described, and these are more suitable for parametric imaging. The purpose of this study was to compare the performance of these plasma input and reference tissue parametric methods on the outcome of statistical parametric mapping (SPM) analysis of (R)-[11C]PK11195 binding. Dynamic (R)-[11C]PK11195 PET scans with arterial blood sampling were performed in 7 younger and 11 elderly healthy subjects. Parametric images of volume of distribution (Vd) and binding potential (BP) were generated using linearised versions of plasma input (Logan) and reference tissue (Reference Parametric Mapping) models. Images were compared at the group level using SPM with a two-sample t-test per voxel, both with and without proportional scaling. Parametric BP images without scaling provided the most sensitive framework for determining differences in (R)-[11C]PK11195 binding between younger and elderly subjects. Vd images could only demonstrate differences in (R)-[11C]PK11195 binding when analysed with proportional scaling due to intersubject variation in K1/k2 (blood-brain barrier transport and non-specific binding).
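    A sketch of the Logan graphical analysis used for plasma-input parametric Vd images: after a suitable time t*, the plot of normalized integrated tissue activity versus normalized integrated plasma activity becomes linear with slope equal to the distribution volume. The time-activity curves, rate constants and t* below are synthetic, not (R)-[11C]PK11195 data.

```python
# Logan plot on a simulated one-tissue-compartment tracer (Vd = K1/k2).
import numpy as np

t = np.linspace(0.1, 60.0, 120)                          # minutes
Cp = 50.0 * np.exp(-0.3 * t) + 2.0 * np.exp(-0.02 * t)   # plasma input curve

K1, k2 = 0.1, 0.05                                       # true Vd = K1/k2 = 2.0
dt = t[1] - t[0]
Ct = K1 * dt * np.convolve(np.exp(-k2 * t), Cp)[: t.size]   # tissue curve

int_Ct = np.cumsum(Ct) * dt
int_Cp = np.cumsum(Cp) * dt

late = t > 20.0                                          # linear portion (t > t*)
x = int_Cp[late] / Ct[late]
y = int_Ct[late] / Ct[late]
slope, intercept = np.polyfit(x, y, 1)
print(f"Logan slope (Vd estimate): {slope:.2f}, true K1/k2 = {K1 / k2:.2f}")
```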

  9. RadMAP: The Radiological Multi-sensor Analysis Platform

    International Nuclear Information System (INIS)

    Bandstra, Mark S.; Aucott, Timothy J.; Brubaker, Erik; Chivers, Daniel H.; Cooper, Reynold J.; Curtis, Joseph C.; Davis, John R.; Joshi, Tenzing H.; Kua, John; Meyer, Ross; Negut, Victor; Quinlan, Michael; Quiter, Brian J.; Srinivasan, Shreyas; Zakhor, Avideh; Zhang, Richard; Vetter, Kai

    2016-01-01

    The variability of gamma-ray and neutron background during the operation of a mobile detector system greatly limits the ability of the system to detect weak radiological and nuclear threats. The natural radiation background measured by a mobile detector system is the result of many factors, including the radioactivity of nearby materials, the geometric configuration of those materials and the system, the presence of absorbing materials, and atmospheric conditions. Background variations tend to be highly non-Poissonian, making it difficult to set robust detection thresholds using knowledge of the mean background rate alone. The Radiological Multi-sensor Analysis Platform (RadMAP) system is designed to allow the systematic study of natural radiological background variations and to serve as a development platform for emerging concepts in mobile radiation detection and imaging. To do this, RadMAP has been used to acquire extensive, systematic background measurements and correlated contextual data that can be used to test algorithms and detector modalities at low false alarm rates. By combining gamma-ray and neutron detector systems with data from contextual sensors, the system enables the fusion of data from multiple sensors into novel data products. The data are curated in a common format that allows for rapid querying across all sensors, creating detailed multi-sensor datasets that are used to study correlations between radiological and contextual data, and develop and test novel techniques in mobile detection and imaging. In this paper we will describe the instruments that comprise the RadMAP system, the effort to curate and provide access to multi-sensor data, and some initial results on the fusion of contextual and radiological data.

  10. RadMAP: The Radiological Multi-sensor Analysis Platform

    Energy Technology Data Exchange (ETDEWEB)

    Bandstra, Mark S., E-mail: msbandstra@lbl.gov [Nuclear Science Division, Lawrence Berkeley National Laboratory, Berkeley, CA (United States); Aucott, Timothy J. [Department of Nuclear Engineering, University of California Berkeley, CA (United States); Brubaker, Erik [Sandia National Laboratory, Livermore, CA (United States); Chivers, Daniel H.; Cooper, Reynold J. [Nuclear Science Division, Lawrence Berkeley National Laboratory, Berkeley, CA (United States); Curtis, Joseph C. [Nuclear Science Division, Lawrence Berkeley National Laboratory, Berkeley, CA (United States); Department of Nuclear Engineering, University of California Berkeley, CA (United States); Davis, John R. [Department of Nuclear Engineering, University of California Berkeley, CA (United States); Joshi, Tenzing H.; Kua, John; Meyer, Ross; Negut, Victor; Quinlan, Michael; Quiter, Brian J. [Nuclear Science Division, Lawrence Berkeley National Laboratory, Berkeley, CA (United States); Srinivasan, Shreyas [Department of Nuclear Engineering, University of California Berkeley, CA (United States); Department of Electrical Engineering and Computer Science, University of California Berkeley, CA (United States); Zakhor, Avideh; Zhang, Richard [Department of Electrical Engineering and Computer Science, University of California Berkeley, CA (United States); Vetter, Kai [Nuclear Science Division, Lawrence Berkeley National Laboratory, Berkeley, CA (United States); Department of Nuclear Engineering, University of California Berkeley, CA (United States)

    2016-12-21

    The variability of gamma-ray and neutron background during the operation of a mobile detector system greatly limits the ability of the system to detect weak radiological and nuclear threats. The natural radiation background measured by a mobile detector system is the result of many factors, including the radioactivity of nearby materials, the geometric configuration of those materials and the system, the presence of absorbing materials, and atmospheric conditions. Background variations tend to be highly non-Poissonian, making it difficult to set robust detection thresholds using knowledge of the mean background rate alone. The Radiological Multi-sensor Analysis Platform (RadMAP) system is designed to allow the systematic study of natural radiological background variations and to serve as a development platform for emerging concepts in mobile radiation detection and imaging. To do this, RadMAP has been used to acquire extensive, systematic background measurements and correlated contextual data that can be used to test algorithms and detector modalities at low false alarm rates. By combining gamma-ray and neutron detector systems with data from contextual sensors, the system enables the fusion of data from multiple sensors into novel data products. The data are curated in a common format that allows for rapid querying across all sensors, creating detailed multi-sensor datasets that are used to study correlations between radiological and contextual data, and develop and test novel techniques in mobile detection and imaging. In this paper we will describe the instruments that comprise the RadMAP system, the effort to curate and provide access to multi-sensor data, and some initial results on the fusion of contextual and radiological data.

  11. Statistical dynamics of parametrically perturbed sine-square map

    Indian Academy of Sciences (India)

    Abstract. We discuss the emergence and destruction of complex, critical and completely chaotic attractors in a nonlinear system when subjected to a small parametric perturbation in trigonometric, hyperbolic or noise function forms. For this purpose, a hybrid optical bistable system, which is a nonlinear physical system, has ...
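    A sketch of iterating a sine-square map of the type commonly used to model hybrid optical bistable systems, x_{n+1} = A sin^2(x_n - x_B), with a small sinusoidal perturbation of the parameter A. The map form, parameter values and perturbation amplitude are assumptions made for illustration, not taken from the paper.

```python
# Parametrically perturbed sine-square map: compare orbits with and without
# a small modulation of the parameter A.
import numpy as np

A0, xB = 2.5, np.pi / 4        # nominal parameters
eps, omega = 0.01, 0.1         # perturbation amplitude and frequency

def iterate(n_steps, eps):
    x = 0.5
    orbit = np.empty(n_steps)
    for n in range(n_steps):
        A = A0 * (1.0 + eps * np.sin(omega * n))   # perturbed parameter
        x = A * np.sin(x - xB) ** 2
        orbit[n] = x
    return orbit

unperturbed = iterate(5000, 0.0)
perturbed = iterate(5000, eps)
# Comparing long-time statistics (e.g. histograms of the two orbits) shows
# how a small parametric perturbation reshapes the attractor.
print(np.histogram(perturbed[1000:], bins=10)[0])
```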

  12. Process simulation and parametric modeling for strategic project management

    CERN Document Server

    Morales, Peter J

    2013-01-01

    Process Simulation and Parametric Modeling for Strategic Project Management will offer CIOs, CTOs, software development managers and IT graduate students an introduction to a set of technologies that will help them understand how to better plan software development projects, manage risk and have better insight into the complexities of the software development process. A novel methodology will be introduced that allows a software development manager to better plan and assess risks in the early planning of a project. By providing a better model for early software development estimation and softw

  13. Methodology for qualitative content analysis with the technique of mind maps using Nvivo and FreeMind softwares

    Directory of Open Access Journals (Sweden)

    José Leonardo Oliveira Lima

    2016-12-01

    Full Text Available Introduction: In a survey, it is not enough to choose tools, resources and procedures; it is important to understand the method beyond the techniques and their relationship with philosophy, epistemology and methodology. Objective: To discuss theoretical and methodological concerns about qualitative research in Information Science and the process of Qualitative Content Analysis (QCA) in the User Studies field, and to show the path followed for QCA integrated with the mind map technique for developing categories and indicators, using Qualitative Data Analysis Software (QDAS) and mind map design tools. Methodology: The research was descriptive, methodological, bibliographical and fieldwork-based, conducted with open interviews that were processed using the QCA method with the support of the QDAS Nvivo and the FreeMind software for mind map design. Results: The theory of qualitative research and QCA is presented, along with a methodological path of QCA using the techniques and software mentioned above. Conclusions: When it comes to qualitative research, the theoretical framework suggests the need for more dialogue between Information Science and other disciplines. The process of QCA evidenced a viable path that might help further related investigations using QDAS, and the contribution of mind maps and their design software to developing the indicators and categories of QCA.

  14. Application of statistical parametric mapping to SPET in the assessment of intractable childhood epilepsy

    International Nuclear Information System (INIS)

    Bruggemann, Jason M.; Lawson, John A.; Cunningham, Anne M.; Som, Seu S.; Haindl, Walter; Bye, Ann M.E.

    2004-01-01

    Statistical parametric mapping (SPM) quantification and analysis has been successfully applied to functional imaging studies of partial epilepsy syndromes in adults. The present study evaluated whether localisation of the epileptogenic zone (determined by SPM) improves upon visually examined single-photon emission tomography (SPET) imaging in presurgical assessment of children with temporal lobe epilepsy (TLE) and frontal lobe epilepsy (FLE). The patient sample consisted of 24 children (15 males) aged 2.1-17.8 years (9.8±4.3 years; mean±SD) with intractable TLE or FLE. SPET imaging was acquired routinely in presurgical evaluation. All patient images were transformed into the standard stereotactic space of the adult SPM SPET template prior to SPM statistical analysis. Individual patient images were contrasted with an adult control group of 22 healthy adult females. Resultant statistical parametric maps were rendered over the SPM canonical magnetic resonance imaging (MRI). Two corresponding sets of ictal and interictal SPM and SPET images were then generated for each patient. Experienced clinicians independently reviewed the image sets, blinded to clinical details. Concordance of the reports between SPM and SPET images, syndrome classification and MRI abnormality was studied. A fair level of inter-rater reliability (kappa=0.73) was evident for SPM localisation. SPM was concordant with SPET in 71% of all patients, the majority of the discordance being from the FLE group. SPM and SPET localisation were concordant with epilepsy syndrome in 80% of the TLE cases. Concordant localisation to syndrome was worse for both SPM (33%) and SPET (44%) in the FLE group. Data from a small sample of patients with varied focal structural pathologies suggested that SPM performed poorly relative to SPET in these cases. Concordance of SPM and SPET with syndrome was lower in patients younger than 6 years than in those aged 6 years and above. SPM is effective in localising the potential

  15. Application of statistical parametric mapping to SPET in the assessment of intractable childhood epilepsy

    Energy Technology Data Exchange (ETDEWEB)

    Bruggemann, Jason M.; Lawson, John A.; Cunningham, Anne M. [Department of Neurology, Sydney Children's Hospital and School of Women's and Children's Health, Faculty of Medicine, University of New South Wales, Randwick, New South Wales (Australia); Som, Seu S.; Haindl, Walter [Department of Nuclear Medicine, Prince of Wales Hospital, Randwick, New South Wales (Australia); Bye, Ann M.E. [Department of Neurology, Sydney Children's Hospital and School of Women's and Children's Health, Faculty of Medicine, University of New South Wales, Randwick, New South Wales (Australia); Department of Neurology, Sydney Children's Hospital, High Street, 2031, Randwick, NSW (Australia)

    2004-03-01

    Statistical parametric mapping (SPM) quantification and analysis has been successfully applied to functional imaging studies of partial epilepsy syndromes in adults. The present study evaluated whether localisation of the epileptogenic zone (determined by SPM) improves upon visually examined single-photon emission tomography (SPET) imaging in presurgical assessment of children with temporal lobe epilepsy (TLE) and frontal lobe epilepsy (FLE). The patient sample consisted of 24 children (15 males) aged 2.1-17.8 years (9.8±4.3 years; mean±SD) with intractable TLE or FLE. SPET imaging was acquired routinely in presurgical evaluation. All patient images were transformed into the standard stereotactic space of the adult SPM SPET template prior to SPM statistical analysis. Individual patient images were contrasted with an adult control group of 22 healthy adult females. Resultant statistical parametric maps were rendered over the SPM canonical magnetic resonance imaging (MRI). Two corresponding sets of ictal and interictal SPM and SPET images were then generated for each patient. Experienced clinicians independently reviewed the image sets, blinded to clinical details. Concordance of the reports between SPM and SPET images, syndrome classification and MRI abnormality was studied. A fair level of inter-rater reliability (kappa=0.73) was evident for SPM localisation. SPM was concordant with SPET in 71% of all patients, the majority of the discordance being from the FLE group. SPM and SPET localisation were concordant with epilepsy syndrome in 80% of the TLE cases. Concordant localisation to syndrome was worse for both SPM (33%) and SPET (44%) in the FLE group. Data from a small sample of patients with varied focal structural pathologies suggested that SPM performed poorly relative to SPET in these cases. Concordance of SPM and SPET with syndrome was lower in patients younger than 6 years than in those aged 6 years and above. SPM is effective in localising the

  16. MR diffusion tensor analysis of schizophrenic brain using statistical parametric mapping

    International Nuclear Information System (INIS)

    Yamada, Haruyasu; Abe, Osamu; Kasai, Kiyoto

    2005-01-01

    The purpose of this study is to investigate diffusion anisotropy in the schizophrenic brain by voxel-based analysis of diffusion tensor imaging (DTI), using statistical parametric mapping (SPM). We studied 33 patients with schizophrenia diagnosed by Diagnostic and Statistical Manual of Mental Disorders (DSM)-IV criteria and 42 matched controls. The data were obtained with a 1.5 T MRI system. We used single-shot spin-echo planar sequences (repetition time/echo time (TR/TE)=5000/102 ms, 5 mm slice thickness and 1.5 mm gap, field of view (FOV)=21 x 21 cm², number of excitations (NEX)=4, 128 x 128 pixel matrix) for diffusion tensor acquisition. Diffusion gradients (b-value of 500 or 1000 s/mm²) were applied on two axes simultaneously. Diffusion properties were measured along 6 non-collinear directions. The structural distortion induced by the large diffusion gradients was corrected based on each T2-weighted echo-planar image (b=0 s/mm²). The fractional anisotropy (FA) maps were generated on a voxel-by-voxel basis. T2-weighted echo-planar images were then segmented into gray matter, white matter, and cerebrospinal fluid using SPM (Wellcome Department of Imaging, University College London, UK). All apparent diffusion coefficient (ADC) and FA maps in native space were transformed to stereotactic space by registering each of the images to the same template image. The normalized data were smoothed and analyzed using SPM. A significant FA decrease in the patient group was found in the uncinate fasciculus, parahippocampal white matter, anterior cingulum and other areas (corrected p<0.05). No significantly increased region was noted. Our results may reflect reduced diffusion anisotropy of the white matter pathways of the limbic system, as shown by the decreased FA. Manual region-of-interest analysis is usually more sensitive than voxel-based analysis, but it is subjective and difficult to perform with anatomical reproducibility. Voxel-based analysis of the diffusion tensor
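    A sketch of the voxel-wise fractional anisotropy (FA) computation underlying such maps: FA is derived from the eigenvalues of the 3x3 diffusion tensor. The tensor below is a single made-up voxel for illustration, not data from the study.

```python
# FA from the eigenvalues of a symmetric diffusion tensor.
import numpy as np

def fractional_anisotropy(tensor):
    """FA = sqrt(3/2) * ||lambda - mean(lambda)|| / ||lambda||."""
    ev = np.linalg.eigvalsh(tensor)                 # three eigenvalues
    md = ev.mean()                                  # mean diffusivity
    num = np.sqrt(1.5 * np.sum((ev - md) ** 2))
    den = np.sqrt(np.sum(ev ** 2))
    return num / den if den > 0 else 0.0

# Example voxel: strong diffusion along one axis (white-matter-like)
D = np.diag([1.7e-3, 0.3e-3, 0.3e-3])              # mm^2/s
print(f"FA = {fractional_anisotropy(D):.2f}")       # high FA => anisotropic
```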

  17. Explicit free parametrization of the modified tetrahedron equation

    CERN Document Server

    Gehlen, G V; Sergeev, S

    2003-01-01

    The modified tetrahedron equation (MTE) with affine Weyl quantum variables at the Nth root of unity is solved by a rational mapping operator which is obtained from the solution of a linear problem. We show that the solutions can be parametrized in terms of eight free parameters and 16 discrete phase choices, thus providing a broad starting point for the construction of three-dimensional integrable lattice models. The Fermat-curve points parametrizing the representation of the mapping operator in terms of cyclic functions are expressed in terms of the independent parameters. An explicit formula for the density factor of the MTE is derived. For the example N=2 we write the MTE in full detail.

  18. Explicit free parametrization of the modified tetrahedron equation

    International Nuclear Information System (INIS)

    Gehlen, G von; Pakuliak, S; Sergeev, S

    2003-01-01

    The modified tetrahedron equation (MTE) with affine Weyl quantum variables at the Nth root of unity is solved by a rational mapping operator which is obtained from the solution of a linear problem. We show that the solutions can be parametrized in terms of eight free parameters and 16 discrete phase choices, thus providing a broad starting point for the construction of three-dimensional integrable lattice models. The Fermat-curve points parametrizing the representation of the mapping operator in terms of cyclic functions are expressed in terms of the independent parameters. An explicit formula for the density factor of the MTE is derived. For the example N=2 we write the MTE in full detail

  19. IClinfMRI Software for Integrating Functional MRI Techniques in Presurgical Mapping and Clinical Studies.

    Science.gov (United States)

    Hsu, Ai-Ling; Hou, Ping; Johnson, Jason M; Wu, Changwei W; Noll, Kyle R; Prabhu, Sujit S; Ferguson, Sherise D; Kumar, Vinodh A; Schomer, Donald F; Hazle, John D; Chen, Jyh-Horng; Liu, Ho-Ling

    2018-01-01

    Task-evoked and resting-state (rs) functional magnetic resonance imaging (fMRI) techniques have been applied to the clinical management of neurological diseases, exemplified by presurgical localization of eloquent cortex, to assist neurosurgeons in maximizing resection while preserving brain functions. In addition, recent studies have recommended incorporating cerebrovascular reactivity (CVR) imaging into clinical fMRI to evaluate the risk of lesion-induced neurovascular uncoupling (NVU). Although each of these imaging techniques possesses its own advantage for presurgical mapping, a specialized clinical software that integrates the three complementary techniques and promptly outputs the analyzed results to radiology and surgical navigation systems in a clinical format is still lacking. We developed the Integrated fMRI for Clinical Research (IClinfMRI) software to facilitate these needs. Beyond the independent processing of task-fMRI, rs-fMRI, and CVR mapping, IClinfMRI encompasses three unique functions: (1) supporting the interactive rs-fMRI mapping while visualizing task-fMRI results (or results from published meta-analysis) as a guidance map, (2) indicating/visualizing the NVU potential on analyzed fMRI maps, and (3) exporting these advanced mapping results in a Digital Imaging and Communications in Medicine (DICOM) format that are ready to export to a picture archiving and communication system (PACS) and a surgical navigation system. In summary, IClinfMRI has the merits of efficiently translating and integrating state-of-the-art imaging techniques for presurgical functional mapping and clinical fMRI studies.

  20. Distance education course on spatial multi-hazard risk assessment, using Open Source software

    Science.gov (United States)

    van Westen, C. J.; Frigerio, S.

    2009-04-01

    As part of the capacity building activities of the United Nations University - ITC School on Disaster Geo-Information Management (UNU-ITC DGIM), the International Institute for Geoinformation Science and Earth Observation (ITC) has developed a distance education course on the application of Geographic Information Systems for multi-hazard risk assessment. This course is designed for academic staff as well as for professionals working in (non-)governmental organizations where knowledge of disaster risk management is essential. The course guides the participants through the entire process of risk assessment, on the basis of a case study of a city exposed to multiple hazards in a developing country. The course consists of eight modules, each with a guide book explaining the theoretical background and guiding the participants through spatial data requirements for risk assessment, hazard assessment procedures, generation of elements-at-risk databases, vulnerability assessment, qualitative and quantitative risk assessment methods, risk evaluation and risk reduction. Linked to the theory is a large set of exercises, with exercise descriptions, answer sheets, demos and GIS data. The exercises deal with four different types of hazards: earthquakes, flooding, technological hazards and landslides. One important consideration in designing the course was that people from developing countries should not be prevented from using it by the financial burden of software acquisition. Therefore the aim was to use Open Source software as a basis. The GIS exercises are written for the ILWIS software. All exercises have also been integrated into a WebGIS, using the Open Source software CartoWeb (released under the GNU license). It is modular and customizable thanks to its object-oriented architecture, and is based on a hierarchical structure to manage and organize every package of information for every step required in risk assessment. Different switches for every component of the risk assessment

  1. Multi-Modal, Multi-Touch Interaction with Maps in Disaster Management Applications

    Directory of Open Access Journals (Sweden)

    V. Paelke

    2012-07-01

    Full Text Available Multi-touch interaction has become popular in recent years and impressive advances in technology have been demonstrated, with the presentation of digital maps as a common presentation scenario. However, most existing systems are really technology demonstrators and have not been designed with real applications in mind. A critical factor in the management of disaster situations is the access to current and reliable data. New sensors and data acquisition platforms (e.g. satellites, UAVs, mobile sensor networks have improved the supply of spatial data tremendously. However, in many cases this data is not well integrated into current crisis management systems and the capabilities to analyze and use it lag behind sensor capabilities. Therefore, it is essential to develop techniques that allow the effective organization, use and management of heterogeneous data from a wide variety of data sources. Standard user interfaces are not well suited to provide this information to crisis managers. Especially in dynamic situations conventional cartographic displays and mouse based interaction techniques fail to address the need to review a situation rapidly and act on it as a team. The development of novel interaction techniques like multi-touch and tangible interaction in combination with large displays provides a promising base technology to provide crisis managers with an adequate overview of the situation and to share relevant information with other stakeholders in a collaborative setting. However, design expertise on the use of such techniques in interfaces for real-world applications is still very sparse. In this paper we report on interdisciplinary research with a user and application centric focus to establish real-world requirements, to design new multi-modal mapping interfaces, and to validate them in disaster management applications. Initial results show that tangible and pen-based interaction are well suited to provide an intuitive and visible way to

  2. Computer mapping software and geographic data base development: Oak Ridge National Laboratory user experience

    International Nuclear Information System (INIS)

    Honea, B.; Johnson, P.

    1978-01-01

    As users of computer display tools, our opinion is that the researcher's needs should guide and direct the computer scientist's development of mapping software and data bases. Computer graphic techniques developed for the sake of the computer graphics community tend to be esoteric and rarely suitable for user problems. Two types of users exist for computer graphic tools: the researcher who is generally satisfied with abstract but accurate displays for analysis purposes and the decision maker who requires synoptic and easily comprehended displays relevant to the issues he or she must address. Computer mapping software and data bases should be developed for the user in a generalized and standardized format for ease in transferring and to facilitate the linking or merging with larger analysis systems. Maximum utility of computer mapping tools is accomplished when linked to geographic information and analysis systems. Computer graphic techniques have varying degrees of utility depending upon whether they are used for data validation, analysis procedures or presenting research results

  3. Multi-threaded software framework development for the ATLAS experiment

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00226135; Baines, John; Bold, Tomasz; Calafiura, Paolo; Dotti, Andrea; Farrell, Steven; Leggett, Charles; Malon, David; Ritsch, Elmar; Snyder, Scott; Tsulaia, Vakhtang; van Gemmeren, Peter; Wynne, Benjamin

    2016-01-01

    ATLAS's current software framework, Gaudi/Athena, has been very successful for the experiment in LHC Runs 1 and 2. However, its single threaded design has been recognised for some time to be increasingly problematic as CPUs have increased core counts and decreased available memory per core. Even the multi-process version of Athena, AthenaMP, will not scale to the range of architectures we expect to use beyond Run2. ATLAS examined the requirements on an updated multi-threaded framework and laid out plans for a new framework, including better support for high level trigger (HLT) use cases, in 2014. In this paper we report on our progress in developing the new multi-threaded task parallel extension of Athena, AthenaMT. Implementing AthenaMT has required many significant code changes. Progress has been made in updating key concepts of the framework, to allow the incorporation of different levels of thread safety in algorithmic code (from un-migrated thread-unsafe code, to thread safe copyable code to reentrant co...

  4. Multi-threaded Software Framework Development for the ATLAS Experiment

    CERN Document Server

    Stewart, Graeme; The ATLAS collaboration; Baines, John; Calafiura, Paolo; Dotti, Andrea; Farrell, Steven; Leggett, Charles; Malon, David; Ritsch, Elmar; Snyder, Scott; Tsulaia, Vakhtang; van Gemmeren, Peter; Wynne, Benjamin

    2016-01-01

    ATLAS's current software framework, Gaudi/Athena, has been very successful for the experiment in LHC Runs 1 and 2. However, its single threaded design has been recognised for some time to be increasingly problematic as CPUs have increased core counts and decreased available memory per core. Even the multi-process version of Athena, AthenaMP, will not scale to the range of architectures we expect to use beyond Run2. ATLAS examined the requirements on an updated multi-threaded framework and laid out plans for a new framework, including better support for high level trigger (HLT) use cases, in 2014. In this paper we report on our progress in developing the new multi-threaded task parallel extension of Athena, AthenaMT. Implementing AthenaMT has required many significant code changes. Progress has been made in updating key concepts of the framework, to allow the incorporation of different levels of thread safety in algorithmic code (from un-migrated thread-unsafe code, to thread safe copyable code to reentrant c...

  5. Multi-Parametric MRI and Texture Analysis to Visualize Spatial Histologic Heterogeneity and Tumor Extent in Glioblastoma.

    Directory of Open Access Journals (Sweden)

    Leland S Hu

    Full Text Available Genetic profiling represents the future of neuro-oncology but suffers from inadequate biopsies in heterogeneous tumors like Glioblastoma (GBM). Contrast-enhanced MRI (CE-MRI) targets enhancing core (ENH) but yields adequate tumor in only ~60% of cases. Further, CE-MRI poorly localizes infiltrative tumor within surrounding non-enhancing parenchyma, or brain-around-tumor (BAT), despite the importance of characterizing this tumor segment, which universally recurs. In this study, we use multiple texture analysis and machine learning (ML) algorithms to analyze multi-parametric MRI, and produce new images indicating tumor-rich targets in GBM. We recruited primary GBM patients undergoing image-guided biopsies and acquired pre-operative MRI: CE-MRI, Dynamic-Susceptibility-weighted-Contrast-enhanced-MRI, and Diffusion Tensor Imaging. Following image coregistration and region of interest placement at biopsy locations, we compared MRI metrics and regional texture with histologic diagnoses of high- vs low-tumor content (≥80% vs <80% tumor nuclei) for corresponding samples. In a training set, we used three texture analysis algorithms and three ML methods to identify MRI-texture features that optimized model accuracy to distinguish tumor content. We confirmed model accuracy in a separate validation set. We collected 82 biopsies from 18 GBMs throughout ENH and BAT. The MRI-based model achieved 85% cross-validated accuracy to diagnose high- vs low-tumor in the training set (60 biopsies, 11 patients). The model achieved 81.8% accuracy in the validation set (22 biopsies, 7 patients). Multi-parametric MRI and texture analysis can help characterize and visualize GBM's spatial histologic heterogeneity to identify regional tumor-rich biopsy targets.
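    A minimal sketch of the general workflow described above, with synthetic stand-in data: per-ROI features are extracted from co-registered multi-parametric patches and a cross-validated classifier separates high- from low-tumor content. The feature set, classifier choice and data are assumptions for illustration, not the authors' texture pipeline.

```python
# Hedged sketch of the workflow (not the authors' pipeline): simple first-order
# statistics per contrast stand in for the study's texture features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def roi_features(patches):
    """First-order statistics per MRI contrast for one biopsy ROI.
    patches: dict of contrast name -> 2D numpy patch around the biopsy site."""
    feats = []
    for img in patches.values():
        feats += [img.mean(), img.std(), np.percentile(img, 90)]
    return feats

rng = np.random.default_rng(1)
X, y = [], []
for i in range(82):                      # 82 biopsies, as in the study
    high_tumor = i % 2
    patches = {c: rng.normal(loc=high_tumor * 0.5, scale=1.0, size=(16, 16))
               for c in ("CE-T1", "rCBV", "FA")}     # illustrative contrast names
    X.append(roi_features(patches))
    y.append(high_tumor)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("CV accuracy:", cross_val_score(clf, np.array(X), np.array(y), cv=5).mean())
```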

  6. Lessons in modern digital field geology: Open source software, 3D techniques, and the new world of digital mapping

    Science.gov (United States)

    Pavlis, Terry; Hurtado, Jose; Langford, Richard; Serpa, Laura

    2014-05-01

    Although many geologists refuse to admit it, it is time to put paper-based geologic mapping into the historical archives and move to the full potential of digital mapping techniques. For our group, flat map digital geologic mapping is now a routine operation in both research and instruction. Several software options are available, and basic proficiency with the software can be learned in a few hours of instruction and practice. The first practical field GIS software, ArcPad, remains a viable, stable option on Windows-based systems. However, the vendor seems to be moving away from ArcPad in favor of mobile software solutions that are difficult to implement without GIS specialists. Thus, we have pursued a second software option based on the open source program QGIS. Our QGIS system uses the same shapefile-centric data structure as our ArcPad system, including similar pop-up data entry forms and generic graphics for easy data management in the field. The advantage of QGIS is that the same software runs on virtually all common platforms except iOS, although the Android version remains unstable as of this writing. A third software option we are experimenting with for flat map-based field work is Fieldmove, a derivative of the 3D-capable program Move developed by Midland Valley. Our initial experiments with Fieldmove are positive, particularly with the new, inexpensive (potential for communicating the complexity of key exposures. For example, in studies of metamorphic structures we often search for days to find "Rosetta Stone" outcrops that display key geometric relationships. While conventional photographs rarely can capture the essence of the field exposure, capturing a true 3D representation of the exposure with multiple photos from many orientations can solve this communication problem. As spatial databases evolve these 3D models should be readily importable into the database.

  7. Security Awareness in Software-Defined Multi-Domain 5G Networks

    Directory of Open Access Journals (Sweden)

    Jani Suomalainen

    2018-03-01

    Full Text Available Fifth generation (5G technologies will boost the capacity and ease the management of mobile networks. Emerging virtualization and softwarization technologies enable more flexible customization of network services and facilitate cooperation between different actors. However, solutions are needed to enable users, operators, and service providers to gain an up-to-date awareness of the security and trustworthiness of 5G systems. We describe a novel framework and enablers for security monitoring, inferencing, and trust measuring. The framework leverages software-defined networking and big data technologies to customize monitoring for different applications. We present an approach for sharing security measurements across administrative domains. We describe scenarios where the correlation of multi-domain information improves the accuracy of security measures with respect to two threats: end-user location tracking and Internet of things (IoT authentication storms. We explore the security characteristics of data flows in software networks dedicated to different applications with a mobile network testbed.

  8. Planarity constrained multi-view depth map reconstruction for urban scenes

    Science.gov (United States)

    Hou, Yaolin; Peng, Jianwei; Hu, Zhihua; Tao, Pengjie; Shan, Jie

    2018-05-01

    Multi-view depth map reconstruction is regarded as a suitable approach for 3D generation of large-scale scenes due to its flexibility and scalability. However, there are challenges when this technique is applied to urban scenes where apparent man-made regular shapes may present. To address this need, this paper proposes a planarity constrained multi-view depth (PMVD) map reconstruction method. Starting with image segmentation and feature matching for each input image, the main procedure is iterative optimization under the constraints of planar geometry and smoothness. A set of candidate local planes are first generated by an extended PatchMatch method. The image matching costs are then computed and aggregated by an adaptive-manifold filter (AMF), whereby the smoothness constraint is applied to adjacent pixels through belief propagation. Finally, multiple criteria are used to eliminate image matching outliers. (Vertical) aerial images, oblique (aerial) images and ground images are used for qualitative and quantitative evaluations. The experiments demonstrated that the PMVD outperforms the popular multi-view depth map reconstruction with an accuracy two times better for the aerial datasets and achieves an outcome comparable to the state-of-the-art for ground images. As expected, PMVD is able to preserve the planarity for piecewise flat structures in urban scenes and restore the edges in depth discontinuous areas.
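    One building block of plane-based depth estimation such as PMVD can be sketched as follows: the depth a local plane hypothesis induces at a pixel is obtained by intersecting the pixel's viewing ray with that plane. The camera intrinsics and plane parameters below are illustrative assumptions, not values from the paper.

```python
# Hedged sketch: depth induced at a pixel by a local plane hypothesis n·X + d = 0
# in camera coordinates. Intrinsics and plane values are illustrative only.
import numpy as np

def plane_induced_depth(u, v, K, n, d):
    """Depth (along the optical axis) of the plane n·X + d = 0 seen at pixel (u, v)."""
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])   # direction of the viewing ray
    denom = n @ ray
    if abs(denom) < 1e-9:
        return np.inf                                # ray parallel to the plane
    t = -d / denom                                   # X = t * ray lies on the plane
    return t * ray[2]                                # z-component is the depth

K = np.array([[1000.0, 0, 320], [0, 1000.0, 240], [0, 0, 1]])
n, d = np.array([0.0, -0.2, -1.0]), 5.0              # a gently tilted facade-like plane
print(plane_induced_depth(320, 240, K, n, d))
```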

  9. Different uptake of {sup 99m}Tc-ECD and {sup 99m}Tc-HMPAO in the same brains: analysis by statistical parametric mapping

    Energy Technology Data Exchange (ETDEWEB)

    Hyun, I.Y. [Dept. of Nuclear Medicine, Inha University College of Medicine, Incheon (Korea); Lee, J.S.; Lee, D.S. [Dept. of Nuclear Medicine, Seoul National University College of Medicine, Seoul (Korea); Rha, J.H.; Lee, I.K.; Ha, C.K. [Dept. of Neurology, Inha University College of Medicine, Incheon (Korea)

    2001-02-01

    The purpose of this study was to investigate the differences between technetium-99m ethyl cysteinate dimer ({sup 99m}Tc-ECD) and technetium-99m hexamethylpropylene amine oxime ({sup 99m}Tc-HMPAO) uptake in the same brains by means of statistical parametric mapping (SPM) analysis. We examined 20 patients (9 male, 11 female, mean age 62{+-}12 years) using {sup 99m}Tc-ECD and {sup 99m}Tc-HMPAO single-photon emission tomography (SPET) and magnetic resonance imaging (MRI) of the brain less than 7 days after onset of stroke. MRI showed no cortical infarctions. Infarctions in the pons (6 patients) and medulla (1), ischaemic periventricular white matter lesions (13) and lacunar infarction (7) were found on MRI. Split-dose and sequential SPET techniques were used for {sup 99m}Tc-ECD and {sup 99m}Tc-HMPAO brain SPET, without repositioning of the patient. All of the SPET images were spatially transformed to standard space, smoothed and globally normalized. The differences between the {sup 99m}Tc-ECD and {sup 99m}Tc-HMPAO SPET images were statistically analysed using statistical parametric mapping (SPM) 96 software. The difference between two groups was considered significant at a threshold of uncorrected P values less than 0.01. Visual analysis showed no hypoperfused areas on either {sup 99m}Tc-ECD or {sup 99m}Tc-HMPAO SPET images. SPM analysis revealed significantly different uptake of {sup 99m}Tc-ECD and {sup 99m}Tc-HMPAO in the same brains. On the {sup 99m}Tc-ECD SPET images, relatively higher uptake was observed in the frontal, parietal and occipital lobes, in the left superior temporal lobe and in the superior region of the cerebellum. On the {sup 99m}Tc-HMPAO SPET images, relatively higher uptake was observed in the medial temporal lobes, thalami, periventricular white matter and brain stem. These differences in uptake of the two tracers in the same brains on SPM analysis suggest that interpretation of cerebral perfusion is possible using SPET with {sup 99m}Tc-ECD and
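    The voxel-wise comparison can be illustrated with a hedged sketch: after spatial normalization, smoothing and global normalization, the two tracer images are compared across subjects with a paired t-test at an uncorrected threshold, in the spirit of the SPM analysis above. The data below are synthetic, and the shapes and threshold are assumptions.

```python
# Hedged illustration (not the SPM 96 pipeline): voxel-wise paired comparison of two
# tracers across subjects, thresholded at uncorrected P < 0.01 as in the abstract.
import numpy as np
from scipy import stats

n_subjects, shape = 20, (16, 16, 8)
rng = np.random.default_rng(2)
ecd   = rng.normal(1.00, 0.05, (n_subjects,) + shape)   # normalised ECD images
hmpao = rng.normal(1.00, 0.05, (n_subjects,) + shape)   # normalised HMPAO images
hmpao[:, 4:8, 4:8, :] += 0.04                            # region of higher HMPAO uptake

t, p = stats.ttest_rel(ecd.reshape(n_subjects, -1),
                       hmpao.reshape(n_subjects, -1), axis=0)
sig = (p < 0.01).reshape(shape)                          # uncorrected, two-sided
print("voxels with different uptake:", int(sig.sum()))
```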

  10. Logistic regression model for diagnosis of transition zone prostate cancer on multi-parametric MRI.

    Science.gov (United States)

    Dikaios, Nikolaos; Alkalbani, Jokha; Sidhu, Harbir Singh; Fujiwara, Taiki; Abd-Alazeez, Mohamed; Kirkham, Alex; Allen, Clare; Ahmed, Hashim; Emberton, Mark; Freeman, Alex; Halligan, Steve; Taylor, Stuart; Atkinson, David; Punwani, Shonit

    2015-02-01

    We aimed to develop logistic regression (LR) models for classifying prostate cancer within the transition zone on multi-parametric magnetic resonance imaging (mp-MRI). One hundred and fifty-five patients (training cohort, 70 patients; temporal validation cohort, 85 patients) underwent mp-MRI and transperineal-template-prostate-mapping (TPM) biopsy. Positive cores were classified by cancer definitions: (1) any-cancer; (2) definition-1 [≥Gleason 4 + 3 or ≥ 6 mm cancer core length (CCL)] [high risk significant]; and (3) definition-2 (≥Gleason 3 + 4 or ≥ 4 mm CCL) cancer [intermediate-high risk significant]. For each, logistic-regression mp-MRI models were derived from the training cohort and validated internally and with the temporal cohort. Sensitivity/specificity and the area under the receiver operating characteristic (ROC-AUC) curve were calculated. LR model performance was compared to radiologists' performance. Twenty-eight of 70 patients from the training cohort, and 25/85 patients from the temporal validation cohort had significant cancer on TPM. The ROC-AUC of the LR model for classification of cancer was 0.73/0.67 at internal/temporal validation. The radiologist A/B ROC-AUC was 0.65/0.74 (temporal cohort). For patients scored by radiologists as Prostate Imaging Reporting and Data System (Pi-RADS) score 3, sensitivity/specificity of radiologist A 'best guess' and LR model was 0.14/0.54 and 0.71/0.61, respectively; and radiologist B 'best guess' and LR model was 0.40/0.34 and 0.50/0.76, respectively. LR models can improve classification of Pi-RADS score 3 lesions similar to experienced radiologists. • MRI helps find prostate cancer in the anterior of the gland • Logistic regression models based on mp-MRI can classify prostate cancer • Computers can help confirm cancer in areas doctors are uncertain about.
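    A minimal sketch of the modelling step, assuming synthetic data and illustrative mp-MRI derived predictors: fit a logistic regression on a training cohort and report ROC-AUC on a temporally separate validation cohort, mirroring the study design (70 training, 85 validation patients).

```python
# Hedged sketch with synthetic data; the predictor names are assumptions, not the
# study's exact feature set.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)

def make_cohort(n):
    X = rng.normal(size=(n, 3))                 # e.g. ADC, T2 signal, early enhancement
    logits = -0.5 + 1.2 * X[:, 0] - 0.8 * X[:, 1]
    y = rng.uniform(size=n) < 1 / (1 + np.exp(-logits))
    return X, y.astype(int)

X_train, y_train = make_cohort(70)              # training cohort size from the study
X_val, y_val = make_cohort(85)                  # temporal validation cohort size

model = LogisticRegression().fit(X_train, y_train)
print("validation ROC-AUC:", roc_auc_score(y_val, model.predict_proba(X_val)[:, 1]))
```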

  11. Organization of Multi-controller Interaction in Software Defined Networks

    Directory of Open Access Journals (Sweden)

    Sergey V. Morzhov

    2018-01-01

    Full Text Available Software Defined Networking (SDN is a promising paradigm for network management. It is a centralized network intelligence on a dedicated server, which runs network operating system, and is called SDN controller. It was assumed that such an architecture should have an improved network performance and monitoring. However, the centralized control architecture of the SDNs brings novel challenges to reliability, scalability, fault tolerance and interoperability. These problems are especially acute for large data center networks and can be solved by combining SDN controllers into clusters, called multi-controllers. Multi-controller architecture became very important for SDN-enabled networks nowadays. This paper gives a comprehensive overview of SDN multi-controller architectures. The authors review several most popular distributed controllers in order to indicate their strengths and weaknesses. They also investigate and classify approaches used. This paper explains in details the difference among various types of multi-controller architectures, the distribution method and the communication system. Furthermore, it provides already implemented architectures and some examples of architectures under consideration by describing their design, communication process, and performance results. In this paper, the authors show their own classification of multi-controllers and claim that, despite the existence of undeniable advantages, all reviewed controllers have serious drawbacks, which must be eliminated. These drawbacks hamper the development of multi-controllers and their widespread adoption in corporate networks. In the end, the authors conclude that now it is impossible to find a solution capable to solve all the tasks assigned to it adequately and fully. The article is published in the authors’ wording.

  12. A comparison of multi-spectral, multi-angular, and multi-temporal remote sensing datasets for fractional shrub canopy mapping in Arctic Alaska

    Science.gov (United States)

    Selkowitz, D.J.

    2010-01-01

    Shrub cover appears to be increasing across many areas of the Arctic tundra biome, and increasing shrub cover in the Arctic has the potential to significantly impact global carbon budgets and the global climate system. For most of the Arctic, however, there is no existing baseline inventory of shrub canopy cover, as existing maps of Arctic vegetation provide little information about the density of shrub cover at a moderate spatial resolution across the region. Remotely-sensed fractional shrub canopy maps can provide this necessary baseline inventory of shrub cover. In this study, we compare the accuracy of fractional shrub canopy (> 0.5 m tall) maps derived from multi-spectral, multi-angular, and multi-temporal datasets from Landsat imagery at 30 m spatial resolution, Moderate Resolution Imaging SpectroRadiometer (MODIS) imagery at 250 m and 500 m spatial resolution, and MultiAngle Imaging Spectroradiometer (MISR) imagery at 275 m spatial resolution for a 1067 km² study area in Arctic Alaska. The study area is centered at 69°N, ranges in elevation from 130 to 770 m, is composed primarily of rolling topography with gentle slopes less than 10°, and is free of glaciers and perennial snow cover. Shrubs > 0.5 m in height cover 2.9% of the study area and are primarily confined to patches associated with specific landscape features. Reference fractional shrub canopy is determined from in situ shrub canopy measurements and a high spatial resolution IKONOS image swath. Regression tree models are constructed to estimate fractional canopy cover at 250 m using different combinations of input data from Landsat, MODIS, and MISR. Results indicate that multi-spectral data provide substantially more accurate estimates of fractional shrub canopy cover than multi-angular or multi-temporal data. Higher spatial resolution datasets also provide more accurate estimates of fractional shrub canopy cover (aggregated to moderate spatial resolutions) than lower spatial resolution datasets
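    The regression-tree step can be sketched as follows, with synthetic predictors standing in for the Landsat/MODIS/MISR bands: a tree regressor is trained against reference canopy fractions (derived from IKONOS and field measurements in the study) and evaluated on held-out cells. The band names, the NDVI-driven reference and the tree depth are assumptions for illustration.

```python
# Hedged sketch of a regression tree for fractional shrub canopy (synthetic data).
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(4)
n = 2000
red  = rng.uniform(0.0, 0.3, n)
nir  = rng.uniform(0.1, 0.5, n)
swir = rng.uniform(0.0, 0.4, n)
ndvi = (nir - red) / (nir + red + 1e-6)
frac = np.clip(0.15 * ndvi + 0.05 * rng.normal(size=n), 0, 1)   # reference canopy fraction

X = np.column_stack([red, nir, swir, ndvi])
X_tr, X_te, y_tr, y_te = train_test_split(X, frac, random_state=0)
tree = DecisionTreeRegressor(max_depth=6, random_state=0).fit(X_tr, y_tr)
print("MAE on held-out cells:", mean_absolute_error(y_te, tree.predict(X_te)))
```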

  13. Metrics for vector quantization-based parametric speech enhancement and separation

    DEFF Research Database (Denmark)

    Christensen, Mads Græsbøll

    2013-01-01

    Speech enhancement and separation algorithms sometimes employ a two-stage processing scheme, wherein the signal is first mapped to an intermediate low-dimensional parametric description, after which the parameters are mapped to vectors in codebooks trained on, for example, individual noise...
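    The two-stage idea can be made concrete with a hedged sketch, not the paper's method: frames are first mapped to a low-dimensional parametric description (toy sub-band log-energies here) and those parameters are then quantized against a codebook trained with k-means.

```python
# Hedged sketch of the two-stage scheme: parametric description, then vector quantisation.
import numpy as np
from sklearn.cluster import KMeans

def frame_params(frames):
    """A toy parametric description: log energy in a few fixed sub-bands per frame."""
    spec = np.abs(np.fft.rfft(frames, axis=1)) ** 2
    bands = np.array_split(spec, 4, axis=1)
    return np.log(np.stack([b.sum(axis=1) for b in bands], axis=1) + 1e-12)

rng = np.random.default_rng(5)
train = frame_params(rng.normal(size=(500, 256)))     # training material
codebook = KMeans(n_clusters=16, n_init=10, random_state=0).fit(train)

test = frame_params(rng.normal(size=(10, 256)))
indices = codebook.predict(test)                      # nearest codebook vector per frame
print("codebook indices:", indices)
```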

  14. Software protocol design: Communication and control in a multi-task robot machine for ITER vacuum vessel assembly and maintenance

    International Nuclear Information System (INIS)

    Li, Ming; Wu, Huapeng; Handroos, Heikki; Yang, Guangyou; Wang, Yongbo

    2015-01-01

    Highlights: • A high-level protocol is proposed for the data inter-transmission. • The protocol design is task-oriented for the robot control in the software system. • The protocol functions as a role of middleware in the software. • The protocol running stand-alone as an independent process in the software provides greater security. • Providing a reference design protocol for the multi-task robot machine in the industry. - Abstract: A specific communication and control protocol for software design of a multi-task robot machine is proposed. In order to fulfill the requirements on the complicated multi machining functions and the high performance motion control, the software design of robot is divided into two main parts accordingly, which consists of the user-oriented HMI part and robot control-oriented real-time control system. The two parts of software are deployed in the different hardware for the consideration of run-time performance, which forms a client–server-control architecture. Therefore a high-level task-oriented protocol is designed for the data inter-communication between the HMI part and the control system part, in which all the transmitting data related to a machining task is divided into three categories: trajectory-oriented data, task control-oriented data and status monitoring-oriented data. The protocol consists of three sub-protocols accordingly – a trajectory protocol, task control protocol and status protocol – which are deployed over the Ethernet and run as independent processes in both the client and server computers. The protocols are able to manage the vast amounts of data streaming due to the multi machining functions in a more efficient way. Since the protocol is functioning in the software as a role of middleware, and providing the data interface standards for the developing groups of two parts of software, it also permits greater focus of both software parts developers on their own requirements-oriented design. By

  15. Software protocol design: Communication and control in a multi-task robot machine for ITER vacuum vessel assembly and maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Li, Ming, E-mail: ming.li@lut.fi [Laboratory of Intelligent Machines, Lappeenranta University of Technology (Finland); Wu, Huapeng; Handroos, Heikki [Laboratory of Intelligent Machines, Lappeenranta University of Technology (Finland); Yang, Guangyou [School of Mechanical Engineering, Hubei University of Technology, Wuhan (China); Wang, Yongbo [Laboratory of Intelligent Machines, Lappeenranta University of Technology (Finland)

    2015-10-15

    Highlights: • A high-level protocol is proposed for the data inter-transmission. • The protocol design is task-oriented for the robot control in the software system. • The protocol functions as a role of middleware in the software. • The protocol running stand-alone as an independent process in the software provides greater security. • Providing a reference design protocol for the multi-task robot machine in the industry. - Abstract: A specific communication and control protocol for software design of a multi-task robot machine is proposed. In order to fulfill the requirements on the complicated multi machining functions and the high performance motion control, the software design of robot is divided into two main parts accordingly, which consists of the user-oriented HMI part and robot control-oriented real-time control system. The two parts of software are deployed in the different hardware for the consideration of run-time performance, which forms a client–server-control architecture. Therefore a high-level task-oriented protocol is designed for the data inter-communication between the HMI part and the control system part, in which all the transmitting data related to a machining task is divided into three categories: trajectory-oriented data, task control-oriented data and status monitoring-oriented data. The protocol consists of three sub-protocols accordingly – a trajectory protocol, task control protocol and status protocol – which are deployed over the Ethernet and run as independent processes in both the client and server computers. The protocols are able to manage the vast amounts of data streaming due to the multi machining functions in a more efficient way. Since the protocol is functioning in the software as a role of middleware, and providing the data interface standards for the developing groups of two parts of software, it also permits greater focus of both software parts developers on their own requirements-oriented design. By
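    A much-simplified sketch of the kind of message tagging the three sub-protocols imply, with field names and the JSON encoding as assumptions rather than the ITER software's actual wire format: each message declares its category so that trajectory, task-control and status traffic can be routed to separate handler processes.

```python
# Hedged sketch (not the ITER software): tag each message with its data category so the
# corresponding sub-protocol handler can process it.
import json
from dataclasses import dataclass
from enum import Enum

class Category(str, Enum):
    TRAJECTORY = "trajectory"
    TASK_CONTROL = "task_control"
    STATUS = "status"

@dataclass
class Message:
    category: Category
    task_id: int
    payload: dict

def encode(msg: Message) -> bytes:
    """Serialise a message for transmission over the Ethernet link."""
    return json.dumps({"category": msg.category.value,
                       "task_id": msg.task_id,
                       "payload": msg.payload}).encode()

def route(raw: bytes, handlers: dict) -> None:
    """Dispatch a received message to the handler of its sub-protocol."""
    data = json.loads(raw.decode())
    handlers[Category(data["category"])](data)

handlers = {c: (lambda d, c=c: print(c.value, d["payload"])) for c in Category}
route(encode(Message(Category.STATUS, 7, {"spindle_temp_C": 41.2})), handlers)
```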

  16. Fuzzy Cognitive Map for Software Testing Using Artificial Intelligence Techniques

    OpenAIRE

    Larkman , Deane; Mohammadian , Masoud; Balachandran , Bala; Jentzsch , Ric

    2010-01-01

    International audience; This paper discusses a framework to assist test managers to evaluate the use of AI techniques as a potential tool in software testing. Fuzzy Cognitive Maps (FCMs) are employed to evaluate the framework and make decision analysis easier. A what-if analysis is presented that explores the general application of the framework. Simulations are performed to show the effectiveness of the proposed method. The framework proposed is innovative and it assists managers in making e...
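    The fuzzy cognitive map mechanics behind such a framework can be sketched briefly; the concepts, weights and squashing function below are illustrative assumptions, not the paper's model. Each concept's activation is updated from the weighted influence of the concepts pointing at it.

```python
# Hedged FCM sketch: a_i(t+1) = sigmoid(a_i(t) + sum_j w_ji * a_j(t)), iterated to a
# steady state. Concepts and weights are invented for illustration.
import numpy as np

def fcm_run(W, a0, steps=20):
    a = np.array(a0, dtype=float)
    for _ in range(steps):
        a = 1.0 / (1.0 + np.exp(-(a + W.T @ a)))   # sigmoid squashing
    return a

concepts = ["test coverage", "AI test generation", "defects found", "release risk"]
W = np.array([[0.0, 0.0, 0.6, -0.4],    # coverage -> more defects found, lower risk
              [0.5, 0.0, 0.4,  0.0],    # AI test generation -> coverage, defects found
              [0.0, 0.0, 0.0, -0.5],    # defects found (and fixed) -> lower risk
              [0.0, 0.0, 0.0,  0.0]])
state = fcm_run(W, a0=[0.3, 0.8, 0.1, 0.5])
print(dict(zip(concepts, state.round(2))))
```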

  17. Fast and Lean Immutable Multi-Maps on the JVM based on Heterogeneous Hash-Array Mapped Tries

    OpenAIRE

    Steindorfer, Michael J.; Vinju, Jurgen J.

    2016-01-01

    An immutable multi-map is a many-to-many thread-friendly map data structure with expected fast insert and lookup operations. This data structure is used for applications processing graphs or many-to-many relations as applied in static analysis of object-oriented systems. When processing such big data sets the memory overhead of the data structure encoding itself is a memory usage bottleneck. Motivated by reuse and type-safety, libraries for Java, Scala and Clojure typically implem...
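    A much-simplified sketch of immutable multi-map semantics (it is not a hash-array mapped trie and says nothing about the paper's memory layout): an insert returns a new map while the previous version remains valid and unchanged.

```python
# Hedged sketch of immutable multi-map *semantics* only; a real HAMT shares structure
# far more efficiently than this naive copy.
from types import MappingProxyType

class ImmutableMultiMap:
    def __init__(self, data=None):
        self._data = MappingProxyType(dict(data or {}))    # key -> frozenset of values

    def insert(self, key, value):
        new = dict(self._data)
        new[key] = self._data.get(key, frozenset()) | {value}
        return ImmutableMultiMap(new)                      # previous instance untouched

    def get(self, key):
        return self._data.get(key, frozenset())

m0 = ImmutableMultiMap()
m1 = m0.insert("calls", "foo").insert("calls", "bar")
print(sorted(m1.get("calls")), sorted(m0.get("calls")))    # ['bar', 'foo'] []
```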

  18. Software Development Initiatives to Identify and Mitigate Security Threats - Two Systematic Mapping Studies

    Directory of Open Access Journals (Sweden)

    Paulina Silva

    2016-12-01

    Full Text Available Software Security and development experts have addressed the problem of building secure software systems. There are several processes and initiatives to achieve secure software systems. However, most of these lack empirical evidence of their application and impact in building secure software systems. Two systematic mapping studies (SM) have been conducted to cover the existing initiatives for identification and mitigation of security threats. The SMs were executed in two steps: first in July 2015, then complemented through backward snowballing in July 2016. Integrated results of these two SM studies show that a total of 30 relevant sources were identified; 17 different initiatives covering threat identification and 14 covering the mitigation of threats were found. All the initiatives were associated with at least one activity of the Software Development Lifecycle (SDLC); while 6 showed signs of being applied in industrial settings, only 3 initiatives presented experimental evidence of their results through controlled experiments; some of the other selected studies presented case studies or proposals.

  19. NeuroMap: A spline-based interactive open-source software for spatiotemporal mapping of 2D and 3D MEA data

    Directory of Open Access Journals (Sweden)

    Oussama eAbdoun

    2011-01-01

    Full Text Available A major characteristic of neural networks is the complexity of their organization at various spatial scales, from microscopic local circuits to macroscopic brain-scale areas. Understanding how neural information is processed thus entails the ability to study them at multiple scales simultaneously. This is made possible using microelectrode array (MEA) technology. Indeed, high-density MEAs provide large-scale coverage (several mm²) of whole neural structures combined with microscopic resolution (about 50 µm) of unit activity. Yet, current options for spatiotemporal representation of MEA-collected data remain limited. Here we present NeuroMap, a new interactive Matlab-based software for spatiotemporal mapping of MEA data. NeuroMap uses thin plate spline interpolation, which provides several assets with respect to conventional mapping methods used currently. First, any MEA design can be considered, including 2D or 3D, regular or irregular, arrangements of electrodes. Second, spline interpolation allows the estimation of activity across the tissue with local extrema not necessarily at recording sites. Finally, this interpolation approach provides a straightforward analytical estimation of the spatial Laplacian for better current source localization. In this software, coregistration of 2D MEA data on the anatomy of the neural tissue is made possible by fine matching of anatomical data with electrode positions using rigid-deformation-based correction of anatomical pictures. Overall, NeuroMap provides substantial material for detailed spatiotemporal analysis of MEA data. The package is distributed under GNU General Public License (GPL) and available at http://sites.google.com/site/neuromapsoftware.

  20. NeuroMap: A Spline-Based Interactive Open-Source Software for Spatiotemporal Mapping of 2D and 3D MEA Data.

    Science.gov (United States)

    Abdoun, Oussama; Joucla, Sébastien; Mazzocco, Claire; Yvert, Blaise

    2011-01-01

    A major characteristic of neural networks is the complexity of their organization at various spatial scales, from microscopic local circuits to macroscopic brain-scale areas. Understanding how neural information is processed thus entails the ability to study them at multiple scales simultaneously. This is made possible using microelectrodes array (MEA) technology. Indeed, high-density MEAs provide large-scale coverage (several square millimeters) of whole neural structures combined with microscopic resolution (about 50 μm) of unit activity. Yet, current options for spatiotemporal representation of MEA-collected data remain limited. Here we present NeuroMap, a new interactive Matlab-based software for spatiotemporal mapping of MEA data. NeuroMap uses thin plate spline interpolation, which provides several assets with respect to conventional mapping methods used currently. First, any MEA design can be considered, including 2D or 3D, regular or irregular, arrangements of electrodes. Second, spline interpolation allows the estimation of activity across the tissue with local extrema not necessarily at recording sites. Finally, this interpolation approach provides a straightforward analytical estimation of the spatial Laplacian for better current sources localization. In this software, coregistration of 2D MEA data on the anatomy of the neural tissue is made possible by fine matching of anatomical data with electrode positions using rigid-deformation-based correction of anatomical pictures. Overall, NeuroMap provides substantial material for detailed spatiotemporal analysis of MEA data. The package is distributed under GNU General Public License and available at http://sites.google.com/site/neuromapsoftware.
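    The interpolation idea NeuroMap builds on can be sketched with SciPy's thin plate spline radial basis functions; this is not NeuroMap's MATLAB code, and the electrode layout and recorded values are synthetic. Activity is estimated between electrodes of an irregular array and evaluated on a fine grid, so extrema need not sit at recording sites.

```python
# Hedged sketch of thin plate spline mapping of MEA data using SciPy (not NeuroMap code).
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(6)
electrodes = rng.uniform(0, 1.0, size=(60, 2))            # irregular electrode layout (mm)
amplitude = np.sin(3 * electrodes[:, 0]) * np.cos(2 * electrodes[:, 1])  # recorded values

tps = RBFInterpolator(electrodes, amplitude, kernel="thin_plate_spline")
gx, gy = np.meshgrid(np.linspace(0, 1, 100), np.linspace(0, 1, 100))
grid = np.column_stack([gx.ravel(), gy.ravel()])
activity_map = tps(grid).reshape(gx.shape)                 # smooth spatial map of activity
print(activity_map.shape, float(activity_map.max()))
```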

  1. Association between pathology and texture features of multi parametric MRI of the prostate

    Science.gov (United States)

    Kuess, Peter; Andrzejewski, Piotr; Nilsson, David; Georg, Petra; Knoth, Johannes; Susani, Martin; Trygg, Johan; Helbich, Thomas H.; Polanec, Stephan H.; Georg, Dietmar; Nyholm, Tufve

    2017-10-01

    The role of multi-parametric (mp)MRI in the diagnosis and treatment of prostate cancer has increased considerably. An alternative to visual inspection of mpMRI is the evaluation using histogram-based (first order statistics) parameters and textural features (second order statistics). The aims of the present work were to investigate the relationship between benign and malignant sub-volumes of the prostate and textures obtained from mpMR images. The performance of tumor prediction was investigated based on the combination of histogram-based and textural parameters. Subsequently, the relative importance of mpMR images was assessed and the benefit of additional imaging analyzed. Finally, sub-structures based on the PI-RADS classification were investigated as potential regions to automatically detect malignant lesions. Twenty-five patients who received mpMRI prior to radical prostatectomy were included in the study. The imaging protocol included T2, DWI, and DCE. Delineation of tumor regions was performed based on pathological information. First and second order statistics were derived from each structure and for all image modalities. The resulting data were processed with multivariate analysis, using PCA (principal component analysis) and OPLS-DA (orthogonal partial least squares discriminant analysis) for separation of malignant and healthy tissue. PCA showed a clear difference between tumor and healthy regions in the peripheral zone for all investigated images. The predictive ability of the OPLS-DA models increased for all image modalities when first and second order statistics were combined. The predictive value reached a plateau after adding ADC and T2, and did not increase further with the addition of other image information. The present study indicates a distinct difference in the signatures between malignant and benign prostate tissue. This is an absolute prerequisite for automatic tumor segmentation, but only the first step in that direction. For the specific
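    A hedged sketch of the multivariate step: regional first- and second-order statistics are combined and a low-dimensional projection is used to separate malignant from benign tissue. OPLS-DA is replaced here by the more widely available PCA plus a logistic discriminant, and the features and data are synthetic assumptions, not the study's pipeline.

```python
# Hedged sketch: combine simple first- and second-order region statistics, project with
# PCA and discriminate malignant vs benign regions (synthetic data).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(7)

def region_features(patch):
    grad = np.abs(np.diff(patch, axis=1))                  # crude second-order texture
    return [patch.mean(), patch.std(), grad.mean(), grad.std()]

X, y = [], []
for i in range(100):                                       # 50 malignant, 50 benign regions
    malignant = i < 50
    patch = rng.normal(loc=1.0 - 0.3 * malignant,
                       scale=0.2 + 0.1 * malignant, size=(16, 16))
    X.append(region_features(patch))
    y.append(int(malignant))

model = make_pipeline(PCA(n_components=2), LogisticRegression())
print("CV accuracy:", cross_val_score(model, np.array(X), np.array(y), cv=5).mean())
```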

  2. MAPS-15504 - An automated methodology for software process assessment

    Directory of Open Access Journals (Sweden)

    Itana Maria de Souza Gimenes

    2000-05-01

    Full Text Available Owing to growing demands for quality, the software engineering community has produced several standards and presented several approaches concerning the quality of software products and processes. Most of these standards apply to the software process; among the most widely used are ISO 9000-3, ISO 12207, the CMM and ISO/IEC TR 15504 (the outcome of the SPICE project). Another result of the software engineering community's research is the process-centred software engineering environment (PSEE), which aims at automating the software process. This article presents MAPS-15504, an automated methodology for assessing software process quality based on ISO/IEC TR 15504. The software process assessment methodology was applied to a case study and implemented in the ExPSEE environment, an experimental environment developed at the Department of Informatics (DIN) of the State University of Maringá (UEM)

  3. COMPARISON OF A FIXED-WING AND MULTI-ROTOR UAV FOR ENVIRONMENTAL MAPPING APPLICATIONS: A CASE STUDY

    Directory of Open Access Journals (Sweden)

    M. A. Boon

    2017-08-01

    Full Text Available The advent and evolution of Unmanned Aerial Vehicles (UAVs) and photogrammetric techniques have provided the possibility for on-demand high-resolution environmental mapping. Orthoimages and three dimensional products such as Digital Surface Models (DSMs) are derived from the UAV imagery, which is amongst the most important spatial information tools for environmental planning. The two main types of UAVs in the commercial market are fixed-wing and multi-rotor. Both have their advantages and disadvantages including their suitability for certain applications. Fixed-wing UAVs normally have longer flight endurance capabilities while multi-rotors can provide for stable image capturing and easy vertical take-off and landing. Therefore, the objective of this study is to assess the performance of a fixed-wing versus a multi-rotor UAV for environmental mapping applications by conducting a specific case study. The aerial mapping of the Cors-Air model aircraft field, which includes a wetland ecosystem, was undertaken on the same day with a Skywalker fixed-wing UAV and a Raven X8 multi-rotor UAV equipped with similar sensor specifications (digital RGB camera) under the same weather conditions. We compared the derived datasets by applying the DTMs for basic environmental mapping purposes such as slope and contour mapping including utilising the orthoimages for identification of anthropogenic disturbances. The ground spatial resolution obtained was slightly higher for the multi-rotor, probably due to a slower flight speed and more images. The results in terms of the overall precision of the data were noticeably less accurate for the fixed-wing. In contrast, orthoimages derived from the two systems showed small variations. The multi-rotor imagery provided better representation of vegetation although the fixed-wing data was sufficient for the identification of environmental factors such as anthropogenic disturbances. Differences were observed utilising the respective DTMs

  4. Comparison of a Fixed-Wing and Multi-Rotor Uav for Environmental Mapping Applications: a Case Study

    Science.gov (United States)

    Boon, M. A.; Drijfhout, A. P.; Tesfamichael, S.

    2017-08-01

    The advent and evolution of Unmanned Aerial Vehicles (UAVs) and photogrammetric techniques have provided the possibility for on-demand high-resolution environmental mapping. Orthoimages and three dimensional products such as Digital Surface Models (DSMs) are derived from the UAV imagery which is amongst the most important spatial information tools for environmental planning. The two main types of UAVs in the commercial market are fixed-wing and multi-rotor. Both have their advantages and disadvantages including their suitability for certain applications. Fixed-wing UAVs normally have longer flight endurance capabilities while multi-rotors can provide for stable image capturing and easy vertical take-off and landing. Therefore, the objective of this study is to assess the performance of a fixed-wing versus a multi-rotor UAV for environmental mapping applications by conducting a specific case study. The aerial mapping of the Cors-Air model aircraft field which includes a wetland ecosystem was undertaken on the same day with a Skywalker fixed-wing UAV and a Raven X8 multi-rotor UAV equipped with similar sensor specifications (digital RGB camera) under the same weather conditions. We compared the derived datasets by applying the DTMs for basic environmental mapping purposes such as slope and contour mapping including utilising the orthoimages for identification of anthropogenic disturbances. The ground spatial resolution obtained was slightly higher for the multi-rotor probably due to a slower flight speed and more images. The results in terms of the overall precision of the data were noticeably less accurate for the fixed-wing. In contrast, orthoimages derived from the two systems showed small variations. The multi-rotor imagery provided better representation of vegetation although the fixed-wing data was sufficient for the identification of environmental factors such as anthropogenic disturbances. Differences were observed utilising the respective DTMs for the mapping

  5. Consistency of parametric registration in serial MRI studies of brain tumor progression

    International Nuclear Information System (INIS)

    Mang, Andreas; Buzug, Thorsten M.; Schnabel, Julia A.; Crum, William R.; Modat, Marc; Ourselin, Sebastien; Hawkes, David J.; Camara-Rey, Oscar; Palm, Christoph; Caseiras, Gisele Brasil; Jaeger, H.R.

    2008-01-01

    The consistency of parametric registration in multi-temporal magnetic resonance (MR) imaging studies was evaluated. Serial MRI scans of adult patients with a brain tumor (glioma) were aligned by parametric registration. The performance of low-order spatial alignment (6/9/12 degrees of freedom) of different 3D serial MR-weighted images is evaluated. A registration protocol for the alignment of all images to one reference coordinate system at baseline is presented. Registration results were evaluated for both multimodal intra-timepoint and mono-modal multi-temporal registration. The latter case might present a challenge to automatic intensity-based registration algorithms due to ill-defined correspondences. The performance of our algorithm was assessed by testing the inverse registration consistency. Four different similarity measures were evaluated to assess consistency. Careful visual inspection suggests that images are well aligned, but their consistency may be imperfect. Sub-voxel inconsistency within the brain was found for all similarity measures used for parametric multi-temporal registration. T1-weighted images were most reliable for establishing spatial correspondence between different timepoints. The parametric registration algorithm is feasible for use in this application. The sub-voxel resolution mean displacement error of registration transformations demonstrates that the algorithm converges to an almost identical solution for forward and reverse registration. (orig.)
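    The inverse-consistency test can be illustrated with toy 3D affine transforms (not the study's registration algorithm): the forward and reverse registrations are composed and the mean displacement of the round trip is measured over points inside the head.

```python
# Hedged sketch of an inverse-consistency check with toy affine transforms (in mm).
import numpy as np

def mean_inverse_consistency_error(A_fwd, A_rev, points):
    """Mean displacement of x -> A_rev(A_fwd(x)) relative to x."""
    homog = np.c_[points, np.ones(len(points))]
    round_trip = (A_rev @ A_fwd @ homog.T).T[:, :3]
    return np.linalg.norm(round_trip - points, axis=1).mean()

# A small rigid forward transform and a slightly imperfect reverse estimate.
A_fwd = np.eye(4); A_fwd[:3, 3] = [2.0, -1.0, 0.5]
A_rev = np.eye(4); A_rev[:3, 3] = [-1.9, 1.05, -0.5]

rng = np.random.default_rng(8)
brain_points = rng.uniform(-80, 80, size=(1000, 3))        # sample points within the head
print("mean ICE (mm):", mean_inverse_consistency_error(A_fwd, A_rev, brain_points))
```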

  6. Negotiation and Decision Making with Collaborative Software: How MarineMap 'Changed the Game' in California's Marine Life Protected Act Initiative.

    Science.gov (United States)

    Cravens, Amanda E

    2016-02-01

    Environmental managers and planners have become increasingly enthusiastic about the potential of decision support tools (DSTs) to improve environmental decision-making processes as information technology transforms many aspects of daily life. Discussions about DSTs, however, rarely recognize the range of ways software can influence users' negotiation, problem-solving, or decision-making strategies and incentives, in part because there are few empirical studies of completed processes that used technology. This mixed-methods study-which draws on data from approximately 60 semi-structured interviews and an online survey--examines how one geospatial DST influenced participants' experiences during a multi-year marine planning process in California. Results suggest that DSTs can facilitate communication by creating a common language, help users understand the geography and scientific criteria in play during the process, aid stakeholders in identifying shared or diverging interests, and facilitate joint problem solving. The same design features that enabled the tool to aid in decision making, however, also presented surprising challenges in certain circumstances by, for example, making it difficult for participants to discuss information that was not spatially represented on the map-based interface. The study also highlights the importance of the social context in which software is developed and implemented, suggesting that the relationship between the software development team and other participants may be as important as technical software design in shaping how DSTs add value. The paper concludes with considerations to inform the future use of DSTs in environmental decision-making processes.

  7. Negotiation and Decision Making with Collaborative Software: How MarineMap `Changed the Game' in California's Marine Life Protected Act Initiative

    Science.gov (United States)

    Cravens, Amanda E.

    2016-02-01

    Environmental managers and planners have become increasingly enthusiastic about the potential of decision support tools (DSTs) to improve environmental decision-making processes as information technology transforms many aspects of daily life. Discussions about DSTs, however, rarely recognize the range of ways software can influence users' negotiation, problem-solving, or decision-making strategies and incentives, in part because there are few empirical studies of completed processes that used technology. This mixed-methods study—which draws on data from approximately 60 semi-structured interviews and an online survey—examines how one geospatial DST influenced participants' experiences during a multi-year marine planning process in California. Results suggest that DSTs can facilitate communication by creating a common language, help users understand the geography and scientific criteria in play during the process, aid stakeholders in identifying shared or diverging interests, and facilitate joint problem solving. The same design features that enabled the tool to aid in decision making, however, also presented surprising challenges in certain circumstances by, for example, making it difficult for participants to discuss information that was not spatially represented on the map-based interface. The study also highlights the importance of the social context in which software is developed and implemented, suggesting that the relationship between the software development team and other participants may be as important as technical software design in shaping how DSTs add value. The paper concludes with considerations to inform the future use of DSTs in environmental decision-making processes.

  8. Study of Noise Map and its Features in an Indoor Work Environment through GIS-Based Software

    Directory of Open Access Journals (Sweden)

    Faramarz Majidi

    2016-06-01

    Full Text Available Background: Noise mapping in industry can be useful to assess the risks of harmful noise, or to monitor noise in machine rooms. Using GIS-based software to plot noise maps of a noisy indoor work environment can help occupational hygienists monitor noise pollution. Methods: This study was carried out in the noisy packaging unit of a food factory in the Ghazvin industrial zone to evaluate noise levels using a GIS technique. To this end, the floor of the packaging unit was divided into 2×2 meter squares and the center of each square was marked as a measurement station, following the NIOSH method. The sound pressure level at each station was measured and the values were then imported into Arc GIS software to plot the noise map. Results: Unlike the current method, the noise maps generated by the GIS technique are consistent with the nature of sound propagation. Conclusion: This study showed that for an indoor work environment, the application of GIS technology, rendering the assessment of noise levels in the form of noise maps, is more realistic and more accurate than the routine method currently used by occupational hygienists.
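    The mapping step can be sketched outside Arc GIS with a simple interpolation of the station measurements onto a finer raster, which is essentially what a GIS noise-map layer does; the station spacing, levels and interpolation method below are illustrative assumptions.

```python
# Hedged sketch: interpolate sound pressure levels measured at 2 m x 2 m grid-cell
# centres onto a finer raster (not the study's Arc GIS workflow).
import numpy as np
from scipy.interpolate import griddata

xs, ys = np.meshgrid(np.arange(1, 21, 2.0), np.arange(1, 11, 2.0))   # station centres (m)
stations = np.column_stack([xs.ravel(), ys.ravel()])
spl = 78 + 10 * np.exp(-((stations[:, 0] - 10) ** 2 + (stations[:, 1] - 5) ** 2) / 30)

gx, gy = np.meshgrid(np.linspace(0, 20, 200), np.linspace(0, 10, 100))
noise_map = griddata(stations, spl, (gx, gy), method="cubic")        # dB(A) raster
print("peak interpolated level:", round(float(np.nanmax(noise_map)), 1), "dB(A)")
```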

  9. Una introduzione ai software per il crime mapping / Observations préliminaires sur les logiciels du mappage du crime / Some introductory notes on crime mapping software

    Directory of Open Access Journals (Sweden)

    Ummarino Alessandro

    2013-03-01

    Full Text Available Abstract: Crime mapping, rather than a discipline in its own right, is nothing more than the application of statistical and geographical analysis techniques to the study of crime. Thanks to GIS (Geographic Information System) software, the exponential growth of computing and easy access to the web, the production of quality maps is now within the reach of any average user. These analysis techniques can be applied effectively with both commercial GIS software and free and open source GIS software. Anyone approaching this discipline, whether for tactical applications (planning of patrols, prevention activities, criminal investigations, etc.) or for sociological studies (criminality, deviance, widespread illegality, perception of safety, etc.), must first acquire a solid grounding in the use of GIS programs before drawing generalizations from the results using interpretive frameworks from the social sciences. Crime mapping can find valid application in general police work, especially at the local level, for managing security resources, for planning police services and, above all, as tactical support for activities aimed at the repression and prevention of specific criminal and unlawful acts.

  10. Volume of high-risk intratumoral subregions at multi-parametric MR imaging predicts overall survival and complements molecular analysis of glioblastoma

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Yi; Li, Ruijiang [Stanford University, Department of Radiation Oncology, Palo Alto, CA (United States); Hokkaido University, Global Station for Quantum Medical Science and Engineering, Global Institution for Collaborative Research and Education, Hokkaido (Japan); Ren, Shangjie [Tianjin University, School of Electrical Engineering and Automation, Tianjin Shi (China); Tha, Khin Khin; Shirato, Hiroki [Hokkaido University, Global Station for Quantum Medical Science and Engineering, Global Institution for Collaborative Research and Education, Hokkaido (Japan); Hokkaido University, Department of Radiology and Nuclear Medicine, Hokkaido (Japan); Wu, Jia [Stanford University, Department of Radiation Oncology, Palo Alto, CA (United States)

    2017-09-15

    To develop and validate a volume-based, quantitative imaging marker by integrating multi-parametric MR images for predicting glioblastoma survival, and to investigate its relationship and synergy with molecular characteristics. We retrospectively analysed 108 patients with primary glioblastoma. The discovery cohort consisted of 62 patients from the cancer genome atlas (TCGA). Another 46 patients comprising 30 from TCGA and 16 internally were used for independent validation. Based on integrated analyses of T1-weighted contrast-enhanced (T1-c) and diffusion-weighted MR images, we identified an intratumoral subregion with both high T1-c and low ADC, and accordingly defined a high-risk volume (HRV). We evaluated its prognostic value and biological significance with genomic data. On both discovery and validation cohorts, HRV predicted overall survival (OS) (concordance index: 0.642 and 0.653, P < 0.001 and P = 0.038, respectively). HRV stratified patients within the proneural molecular subtype (log-rank P = 0.040, hazard ratio = 2.787). We observed different OS among patients depending on their MGMT methylation status and HRV (log-rank P = 0.011). Patients with unmethylated MGMT and high HRV had significantly shorter survival (median survival: 9.3 vs. 18.4 months, log-rank P = 0.002). Volume of the high-risk intratumoral subregion identified on multi-parametric MRI predicts glioblastoma survival, and may provide complementary value to genomic information. (orig.)
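    The construction of the imaging marker can be sketched as follows, with synthetic volumes and illustrative thresholds (the paper derives the subregion from its own analysis rather than fixed cut-offs): the high-risk volume is the tumour subregion with both high T1-c enhancement and low ADC.

```python
# Hedged sketch of a high-risk volume (HRV) from T1-c and ADC maps; thresholds and
# volumes are illustrative assumptions, not the paper's derived values.
import numpy as np

def high_risk_volume(t1c, adc, tumor_mask, voxel_volume_ml, t1c_thresh, adc_thresh):
    hrv_mask = tumor_mask & (t1c > t1c_thresh) & (adc < adc_thresh)
    return hrv_mask, hrv_mask.sum() * voxel_volume_ml

rng = np.random.default_rng(10)
shape = (64, 64, 40)
t1c = rng.normal(100, 20, shape)          # arbitrary signal units
adc = rng.normal(1.2, 0.3, shape)         # arbitrary diffusion units
tumor = np.zeros(shape, dtype=bool)
tumor[20:44, 20:44, 10:30] = True

mask, volume_ml = high_risk_volume(t1c, adc, tumor, voxel_volume_ml=0.001,
                                   t1c_thresh=120, adc_thresh=1.0)
print("HRV:", round(volume_ml, 2), "ml")
```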

  11. The software and hardware architectural design of the vessel thermal map real-time system in JET

    International Nuclear Information System (INIS)

    Alves, D.; Neto, A.; Valcarcel, D.F.; Jachmich, S.; Arnoux, G.; Card, P.; Devaux, S.; Felton, R.; Goodyear, A.; Kinna, D.; Lomas, P; McCullen, P.; Stephen, A.; Zastrow, K.D.

    2012-01-01

    The installation of ITER-relevant materials for the Plasma Facing Components (PFCs) in the Joint European Torus (JET) is expected to have a strong impact on the operation and protection of the experiment. In particular, the use of all-beryllium tiles, which deteriorate at a substantially lower temperature than the formerly installed Carbon Fibre Composite (CFC) tiles, imposes strict thermal restrictions on the PFCs during operation. Prompt and precise responses are therefore required whenever anomalous temperatures are detected. The new Vessel Thermal Map (VTM) real-time application collects the temperature measurements provided by dedicated pyrometers and Infra-Red (IR) cameras, groups them according to spatial location and probable offending heat source and raises alarms that will trigger appropriate protective responses. In the context of JET's global scheme for the protection of the new wall, the system is required to run on a 10 millisecond cycle communicating with other systems through the Real-Time Data Network (RTDN). In order to meet these requirements a Commercial Off-The-Shelf (COTS) solution has been adopted based on standard x86 multi-core technology, Linux and the Multi-threaded Application Real-Time executor (MARTe) software framework. This paper presents an overview of the system with particular technical focus on the configuration of its real-time capability and the benefits of the modular development approach and advanced tools provided by the MARTe framework. (authors)
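    A much-simplified sketch of the kind of per-cycle logic described above; the wall regions, limits and alarm interface are assumptions, not the JET implementation. On each 10 ms cycle the measurements are grouped by region and an alarm is raised for any group whose hottest reading exceeds its limit.

```python
# Hedged sketch (not the JET VTM code): group per-cycle temperature measurements by
# wall region and flag regions exceeding an illustrative limit.
from collections import defaultdict

TILE_LIMITS_C = {"divertor": 1200.0, "inner_wall": 800.0, "limiter": 900.0}  # illustrative

def vtm_cycle(measurements):
    """measurements: list of (region, sensor_id, temperature_C) for one 10 ms cycle."""
    hottest = defaultdict(lambda: (None, float("-inf")))
    for region, sensor, temp in measurements:
        if temp > hottest[region][1]:
            hottest[region] = (sensor, temp)
    alarms = [(region, sensor, temp) for region, (sensor, temp) in hottest.items()
              if temp > TILE_LIMITS_C[region]]
    return alarms                      # handed to the protection system for a response

cycle = [("divertor", "pyro_03", 1250.0), ("inner_wall", "ir_12", 640.0),
         ("limiter", "ir_07", 910.0)]
print(vtm_cycle(cycle))
```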

  12. A microbiology-based multi-parametric approach towards assessing biological stability in drinking water distribution networks

    KAUST Repository

    Lautenschläger, Karin; Hwang, Chiachi; Liu, Wentso; Boon, Nico; Köster, Oliver; Vrouwenvelder, Johannes S.; Egli, Thomas; Hammes, Frederik A.

    2013-01-01

    Biological stability of drinking water implies that the concentration of bacterial cells and composition of the microbial community should not change during distribution. In this study, we used a multi-parametric approach that encompasses different aspects of microbial water quality including microbial growth potential, microbial abundance, and microbial community composition, to monitor biological stability in drinking water of the non-chlorinated distribution system of Zürich. Drinking water was collected directly after treatment from the reservoir and in the network at several locations with varied average hydraulic retention times (6-52 h) over a period of four months, with a single repetition two years later. Total cell concentrations (TCC) measured with flow cytometry remained remarkably stable at 9.5 (±0.6) × 10⁴ cells/ml from water in the reservoir throughout most of the distribution network, and during the whole time period. Conventional microbial methods like heterotrophic plate counts, the concentration of adenosine tri-phosphate, total organic carbon and assimilable organic carbon also remained constant. Samples taken two years apart showed more than 80% similarity for the microbial communities analysed with denaturing gradient gel electrophoresis and 454 pyrosequencing. Only the two sampling locations with the longest water retention times were the exceptions and, so far for unknown reasons, recorded a slight but significantly higher TCC (1.3 (±0.1) × 10⁵ cells/ml) compared to the other locations. This small change in microbial abundance detected by flow cytometry was also clearly observed in a shift in the microbial community profiles to a higher abundance of members from the Comamonadaceae (60% vs. 2% at other locations). Conventional microbial detection methods were not able to detect changes as observed with flow cytometric cell counts and microbial community analysis. Our findings demonstrate that the multi-parametric approach used provides a powerful

  13. A microbiology-based multi-parametric approach towards assessing biological stability in drinking water distribution networks.

    Science.gov (United States)

    Lautenschlager, Karin; Hwang, Chiachi; Liu, Wen-Tso; Boon, Nico; Köster, Oliver; Vrouwenvelder, Hans; Egli, Thomas; Hammes, Frederik

    2013-06-01

    Biological stability of drinking water implies that the concentration of bacterial cells and composition of the microbial community should not change during distribution. In this study, we used a multi-parametric approach that encompasses different aspects of microbial water quality including microbial growth potential, microbial abundance, and microbial community composition, to monitor biological stability in drinking water of the non-chlorinated distribution system of Zürich. Drinking water was collected directly after treatment from the reservoir and in the network at several locations with varied average hydraulic retention times (6-52 h) over a period of four months, with a single repetition two years later. Total cell concentrations (TCC) measured with flow cytometry remained remarkably stable at 9.5 (± 0.6) × 10⁴ cells/ml from water in the reservoir throughout most of the distribution network, and during the whole time period. Conventional microbial methods like heterotrophic plate counts, the concentration of adenosine tri-phosphate, total organic carbon and assimilable organic carbon also remained constant. Samples taken two years apart showed more than 80% similarity for the microbial communities analysed with denaturing gradient gel electrophoresis and 454 pyrosequencing. Only the two sampling locations with the longest water retention times were the exceptions and, so far for unknown reasons, recorded a slight but significantly higher TCC (1.3 (± 0.1) × 10⁵ cells/ml) compared to the other locations. This small change in microbial abundance detected by flow cytometry was also clearly observed in a shift in the microbial community profiles to a higher abundance of members from the Comamonadaceae (60% vs. 2% at other locations). Conventional microbial detection methods were not able to detect changes as observed with flow cytometric cell counts and microbial community analysis. Our findings demonstrate that the multi-parametric approach used

  14. A microbiology-based multi-parametric approach towards assessing biological stability in drinking water distribution networks

    KAUST Repository

    Lautenschläger, Karin

    2013-06-01

    Biological stability of drinking water implies that the concentration of bacterial cells and composition of the microbial community should not change during distribution. In this study, we used a multi-parametric approach that encompasses different aspects of microbial water quality including microbial growth potential, microbial abundance, and microbial community composition, to monitor biological stability in drinking water of the non-chlorinated distribution system of Zürich. Drinking water was collected directly after treatment from the reservoir and in the network at several locations with varied average hydraulic retention times (6-52 h) over a period of four months, with a single repetition two years later. Total cell concentrations (TCC) measured with flow cytometry remained remarkably stable at 9.5 (±0.6) × 10⁴ cells/ml from water in the reservoir throughout most of the distribution network, and during the whole time period. Conventional microbial methods like heterotrophic plate counts, the concentration of adenosine tri-phosphate, total organic carbon and assimilable organic carbon also remained constant. Samples taken two years apart showed more than 80% similarity for the microbial communities analysed with denaturing gradient gel electrophoresis and 454 pyrosequencing. Only the two sampling locations with the longest water retention times were the exceptions and, so far for unknown reasons, recorded a slight but significantly higher TCC (1.3 (±0.1) × 10⁵ cells/ml) compared to the other locations. This small change in microbial abundance detected by flow cytometry was also clearly observed in a shift in the microbial community profiles to a higher abundance of members from the Comamonadaceae (60% vs. 2% at other locations). Conventional microbial detection methods were not able to detect changes as observed with flow cytometric cell counts and microbial community analysis. Our findings demonstrate that the multi-parametric approach used provides a powerful

  15. Introducing students to digital geological mapping: A workflow based on cheap hardware and free software

    Science.gov (United States)

    Vrabec, Marko; Dolžan, Erazem

    2016-04-01

    The undergraduate field course in Geological Mapping at the University of Ljubljana involves 20-40 students per year, which precludes the use of specialized rugged digital field equipment as the costs would be way beyond the capabilities of the Department. A different mapping area is selected each year with the aim to provide typical conditions that a professional geologist might encounter when doing fieldwork in Slovenia, which includes rugged relief, dense tree cover, and moderately-well- to poorly-exposed bedrock due to vegetation and urbanization. It is therefore mandatory that the digital tools and workflows are combined with classical methods of fieldwork, since, for example, full-time precise GNSS positioning is not viable under such circumstances. Additionally, due to the prevailing combination of complex geological structure with generally poor exposure, students cannot be expected to produce line (vector) maps of geological contacts on the go, so there is no need for such functionality in hardware and software that we use in the field. Our workflow therefore still relies on paper base maps, but is strongly complemented with digital tools to provide robust positioning, track recording, and acquisition of various point-based data. Primary field hardware are students' Android-based smartphones and optionally tablets. For our purposes, the built-in GNSS chips provide adequate positioning precision most of the time, particularly if they are GLONASS-capable. We use Oruxmaps, a powerful free offline map viewer for the Android platform, which facilitates the use of custom-made geopositioned maps. For digital base maps, which we prepare in free Windows QGIS software, we use scanned topographic maps provided by the National Geodetic Authority, but also other maps such as aerial imagery, processed Digital Elevation Models, scans of existing geological maps, etc. Point data, like important outcrop locations or structural measurements, are entered into Oruxmaps as

  16. Sparse PDF maps for non-linear multi-resolution image operations

    KAUST Repository

    Hadwiger, Markus

    2012-11-01

    We introduce a new type of multi-resolution image pyramid for high-resolution images called sparse pdf maps (sPDF-maps). Each pyramid level consists of a sparse encoding of continuous probability density functions (pdfs) of pixel neighborhoods in the original image. The encoded pdfs enable the accurate computation of non-linear image operations directly in any pyramid level with proper pre-filtering for anti-aliasing, without accessing higher or lower resolutions. The sparsity of sPDF-maps makes them feasible for gigapixel images, while enabling direct evaluation of a variety of non-linear operators from the same representation. We illustrate this versatility for antialiased color mapping, O(n) local Laplacian filters, smoothed local histogram filters (e.g., median or mode filters), and bilateral filters. © 2012 ACM.
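
    The core idea, namely storing for every coarse pixel a discrete pdf of the fine-resolution pixels it covers so that non-linear operators can be evaluated at the coarse level, can be illustrated with a small sketch. This is not the authors' sparse encoding; it is a dense toy version in Python, and the block size and bin count are arbitrary choices.

```python
import numpy as np

def build_pdf_level(img, factor=4, bins=64):
    """One coarse pyramid level: a histogram (discrete pdf) per coarse pixel."""
    h, w = img.shape
    hc, wc = h // factor, w // factor
    edges = np.linspace(0.0, 1.0, bins + 1)
    pdfs = np.zeros((hc, wc, bins))
    for i in range(hc):
        for j in range(wc):
            block = img[i*factor:(i+1)*factor, j*factor:(j+1)*factor]
            hist, _ = np.histogram(block, bins=edges)
            pdfs[i, j] = hist / hist.sum()
    return pdfs, edges

def median_from_pdfs(pdfs, edges):
    """Evaluate a non-linear operator (here the median) directly from the pdfs,
    without going back to the full-resolution image."""
    cdf = np.cumsum(pdfs, axis=-1)
    idx = np.argmax(cdf >= 0.5, axis=-1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers[idx]

image = np.random.default_rng(0).random((256, 256))
pdfs, edges = build_pdf_level(image)
coarse_median = median_from_pdfs(pdfs, edges)   # median filter evaluated at 1/4 resolution
```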

  17. Empirical Studies on the Use of Social Software in Global Software Development - a Systematic Mapping Study

    DEFF Research Database (Denmark)

    Giuffrida, Rosalba; Dittrich, Yvonne

    2013-01-01

    of empirical studies on the usage of SoSo are available in related fields, there exists no comprehensive overview of what has been investigated to date across them. Objective: The aim of this review is to map empirical studies on the usage of SoSo in Software Engineering projects and in distributed teams...... for collaborative work, fostering awareness, knowledge management and coordination among team members. Contrary to the evident high importance of the social aspects offered by SoSo, socialization is not the most important usage reported. Conclusions: This review reports how SoSo is used in GSD and how it is capable...... of supporting GSD teams. Four emerging themes in global software engineering were identified: the appropriation and development of usage structures; understanding how an ecology of communication channels and tools are used by teams; the role played by SoSo either as a subtext or as an explicit goal; and finally...

  18. Ground-Based Telescope Parametric Cost Model

    Science.gov (United States)

    Stahl, H. Philip; Rowell, Ginger Holmes

    2004-01-01

    A parametric cost model for ground-based telescopes is developed using multi-variable statistical analysis. The model includes both engineering and performance parameters. While diameter continues to be the dominant cost driver, other significant factors include primary mirror radius of curvature and diffraction-limited wavelength. The model includes an explicit factor for primary mirror segmentation and/or duplication (i.e., multi-telescope phased-array systems). Additionally, single variable models based on aperture diameter are derived. This analysis indicates that recent mirror technology advances have indeed reduced the historical telescope cost curve.
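
    As a sketch of how a single-variable cost estimating relationship of this kind is typically derived, the snippet below fits a power law cost ≈ a·D^b to hypothetical (diameter, cost) pairs by ordinary least squares in log-log space. The numbers are made up for illustration and are not the study's data; the multi-variable model would add further regressors (radius of curvature, diffraction-limited wavelength, segmentation).

```python
import numpy as np

# Hypothetical (aperture diameter in m, cost in M$) pairs -- not the study's data.
D = np.array([2.4, 3.5, 4.2, 6.5, 8.1, 10.0])
cost = np.array([30.0, 80.0, 120.0, 320.0, 520.0, 900.0])

# Single-variable power-law CER: cost = a * D**b, fitted in log-log space.
b, log_a = np.polyfit(np.log(D), np.log(cost), 1)
a = np.exp(log_a)
print(f"cost ~ {a:.1f} * D^{b:.2f}   (M$, D in metres)")
```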

  19. Improving package structure of object-oriented software using multi-objective optimization and weighted class connections

    Directory of Open Access Journals (Sweden)

    Amarjeet

    2017-07-01

    Full Text Available The software maintenance activities performed without following the original design decisions about the package structure usually deteriorate the quality of software modularization, leading to decay of the quality of the system. One of the main reasons for such structural deterioration is inappropriate grouping of source code classes in software packages. To improve such grouping/modular structure, previous researchers formulated the software remodularization problem as an optimization problem and solved it using search-based meta-heuristic techniques. These optimization approaches aimed at improving the quality metrics values of the structure without considering the original package design decisions, often resulting in a totally new software modularization. The entirely changed software modularization becomes costly to realize as well as difficult to understand for the developers/maintainers. To alleviate this issue, we propose a multi-objective optimization approach to improve the modularization quality of an object-oriented system with minimum possible movement of classes between existing packages of the original software modularization. The optimization is performed using NSGA-II, a widely-accepted multi-objective evolutionary algorithm. In order to ensure minimum modification of the original package structure, a new approach of computing class relations using weighted strengths has been proposed here. The weights of relations among different classes are computed on the basis of the original package structure. A new objective function has been formulated using these weighted class relations. This objective function drives the optimization process toward better modularization quality while simultaneously ensuring preservation of the original structure. To evaluate the results of the proposed approach, a series of experiments are conducted over four real-world and two random software applications. The experimental results clearly indicate the effectiveness
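
    A minimal sketch of the kind of objective pair such an approach optimizes is given below: one objective rewards keeping strongly (weighted) related classes in the same package, the other penalizes moving classes away from their original packages. The formulas are illustrative stand-ins, not the exact metrics or weighting scheme defined in the paper; NSGA-II (e.g. via the pymoo or DEAP libraries) would then search over the assignment vector.

```python
import numpy as np

def remodularization_objectives(assignment, original, weights):
    """assignment, original: package id per class; weights: symmetric matrix of
    weighted class relations. Returns (cohesion, fraction_moved): the first is
    to be maximized and the second minimized, e.g. by NSGA-II."""
    same_pkg = assignment[:, None] == assignment[None, :]
    np.fill_diagonal(same_pkg, False)            # ignore self-relations
    total = weights.sum()
    cohesion = weights[same_pkg].sum() / total if total > 0 else 0.0
    fraction_moved = float(np.mean(assignment != original))
    return cohesion, fraction_moved
```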

  20. Software-based data path for raster-scanned multi-beam mask lithography

    Science.gov (United States)

    Rajagopalan, Archana; Agarwal, Ankita; Buck, Peter; Geller, Paul; Hamaker, H. Christopher; Rao, Nagswara

    2016-10-01

    According to the 2013 SEMATECH Mask Industry Survey, roughly half of all photomasks are produced using laser mask pattern generator ("LMPG") lithography. LMPG lithography can be used for all layers at mature technology nodes, and for many non-critical and semi-critical masks at advanced nodes. The extensive use of multi-patterning at the 14-nm node significantly increases the number of critical mask layers, and the transition in wafer lithography from positive tone resist to negative tone resist at the 14-nm design node enables the switch from advanced binary masks back to attenuated phase shifting masks that require second level writes to remove unwanted chrome. LMPG lithography is typically used for second level writes due to its high productivity, absence of charging effects, and versatile non-actinic alignment capability. As multi-patterning use expands from double to triple patterning and beyond, the number of LMPG second level writes increases correspondingly. The desire to reserve the limited capacity of advanced electron beam writers for use when essential is another factor driving the demand for LMPG capacity. The increasing demand for cost-effective productivity has kept most of the laser mask writers ever manufactured running in production, sometimes long past their projected lifespan, and new writers continue to be built based on hardware developed some years ago. The data path is a case in point. While state-of-the-art when first introduced, hardware-based data path systems are difficult to modify or add new features to meet the changing requirements of the market. As data volumes increase, design styles change, and new uses are found for laser writers, it is useful to consider a replacement for this critical subsystem. The availability of low-cost, high-performance, distributed computer systems combined with highly scalable EDA software lends itself well to creating an advanced data path system. EDA software, in routine production today, scales

  1. Analysis of brain SPECT with the statistical parametric mapping package SPM99

    International Nuclear Information System (INIS)

    Barnden, L.R.; Rowe, C.C.

    2000-01-01

    Full text: The Statistical Parametric Mapping (SPM) package of the Wellcome Department of Cognitive Neurology permits detection in the brain of regional uptake that differs between an individual subject, or a population of subjects, and a normal population. SPM does not require a-priori specification of regions of interest. Recently SPM has been upgraded from SPM96 to SPM99. Our aim was to vary brain SPECT processing options in the application of SPM to optimise the final statistical map in three clinical trials. The sensitivity of SPM depends on the fidelity of the preliminary spatial normalisation of each scan to the standard anatomical space defined by a template scan provided with SPM. We generated our own SPECT template and compared spatial normalisation to it and to SPM's internal PET template. We also investigated the effects of scatter subtraction, stripping of scalp activity, reconstruction algorithm, non-linear deformation and derivation of spatial normalisation parameters using co-registered MR. Use of our SPECT template yielded better results than with SPM's PET template. Accuracy of SPECT to MR co-registration was 2.5 mm with SPM96 and 1.2 mm with SPM99. Stripping of scalp activity improved results with SPM96 but was unnecessary with SPM99. Scatter subtraction increased the sensitivity of SPM. Non-linear deformation additional to linear (affine) transformation only marginally improved the final result. Use of the SPECT template yielded more significant results than those obtained when co-registered MR was used to derive the transformation parameters. SPM99 is more robust than SPM96 and optimum SPECT analysis requires a SPECT template. Copyright (2000) The Australian and New Zealand Society of Nuclear Medicine Inc
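
    The essence of an SPM-style analysis, once scans are spatially normalised and smoothed, is a massively univariate test at every voxel. The sketch below runs a voxel-wise two-sample t-test on synthetic SPECT-like volumes with SciPy; it stands in for, and does not reproduce, SPM99's general linear model and random-field-theory corrected inference.

```python
import numpy as np
from scipy import stats

# Synthetic, spatially normalised and smoothed volumes: (subjects, x, y, z).
rng = np.random.default_rng(1)
patients = rng.normal(1.00, 0.10, size=(12, 40, 48, 40))
controls = rng.normal(1.00, 0.10, size=(20, 40, 48, 40))

# Voxel-wise two-sample t-test -> a statistical parametric map of t values.
t_map, p_map = stats.ttest_ind(patients, controls, axis=0)

# Simple uncorrected threshold; SPM itself applies random field theory for
# multiple-comparison corrected inference, which is not reproduced here.
suprathreshold = p_map < 0.001
print(suprathreshold.sum(), "voxels exceed the uncorrected p < 0.001 threshold")
```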

  2. Lenstronomy: Multi-purpose gravitational lens modeling software package

    Science.gov (United States)

    Birrer, Simon; Amara, Adam

    2018-04-01

    Lenstronomy is a multi-purpose open-source gravitational lens modeling python package. Lenstronomy reconstructs the lens mass and surface brightness distributions of strong lensing systems using forward modelling and supports a wide range of analytic lens and light models in arbitrary combination. The software is also able to reconstruct complex extended sources as well as point sources. Lenstronomy is flexible and numerically accurate, with a clear user interface that could be deployed across different platforms. Lenstronomy has been used to derive constraints on dark matter properties in strong lenses, measure the expansion history of the universe with time-delay cosmography, measure cosmic shear with Einstein rings, and decompose quasar and host galaxy light.

  3. The parametric resonance—from LEGO Mindstorms to cold atoms

    Science.gov (United States)

    Kawalec, Tomasz; Sierant, Aleksandra

    2017-07-01

    We show an experimental setup based on a popular LEGO Mindstorms set, allowing us to both observe and investigate the parametric resonance phenomenon. The presented method is simple but covers a variety of student activities like embedded software development, conducting measurements, data collection and analysis. It may be used during science shows, as part of student projects and to illustrate the parametric resonance in mechanics or even quantum physics, during lectures or classes. The parametrically driven LEGO pendulum gains energy in a spectacular way, increasing its amplitude from 10° to about 100° within a few tens of seconds. We provide also a short description of a wireless absolute orientation sensor that may be used in quantitative analysis of driven or free pendulum movement.
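
    The energy gain described above can be reproduced numerically. The sketch below integrates a pendulum whose pivot oscillates vertically at roughly twice the natural frequency, so the effective gravity is modulated and the swing amplitude grows; the length, drive amplitude and damping are arbitrary illustrative values, not measurements of the LEGO setup.

```python
import numpy as np
from scipy.integrate import solve_ivp

g, length = 9.81, 0.30            # m/s^2, pendulum length (illustrative)
omega0 = np.sqrt(g / length)      # natural frequency of small oscillations
omega_d = 2.0 * omega0            # parametric drive near twice omega0
a, gamma = 0.02, 0.2              # pivot amplitude (m) and damping rate (1/s), assumed

def rhs(t, y):
    theta, dtheta = y
    # A vertically oscillating pivot modulates the effective gravity.
    g_eff = g + a * omega_d**2 * np.cos(omega_d * t)
    return [dtheta, -(g_eff / length) * np.sin(theta) - gamma * dtheta]

sol = solve_ivp(rhs, (0.0, 40.0), [np.radians(10.0), 0.0], max_step=1e-3)
print("maximum deflection reached: %.0f degrees" % np.degrees(np.abs(sol.y[0]).max()))
```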

  4. The parametric resonance—from LEGO Mindstorms to cold atoms

    International Nuclear Information System (INIS)

    Kawalec, Tomasz; Sierant, Aleksandra

    2017-01-01

    We show an experimental setup based on a popular LEGO Mindstorms set, allowing us to both observe and investigate the parametric resonance phenomenon. The presented method is simple but covers a variety of student activities like embedded software development, conducting measurements, data collection and analysis. It may be used during science shows, as part of student projects and to illustrate the parametric resonance in mechanics or even quantum physics, during lectures or classes. The parametrically driven LEGO pendulum gains energy in a spectacular way, increasing its amplitude from 10° to about 100° within a few tens of seconds. We provide also a short description of a wireless absolute orientation sensor that may be used in quantitative analysis of driven or free pendulum movement. (paper)

  5. Subgrid-scale stresses and scalar fluxes constructed by the multi-scale turnover Lagrangian map

    Science.gov (United States)

    AL-Bairmani, Sukaina; Li, Yi; Rosales, Carlos; Xie, Zheng-tong

    2017-04-01

    The multi-scale turnover Lagrangian map (MTLM) [C. Rosales and C. Meneveau, "Anomalous scaling and intermittency in three-dimensional synthetic turbulence," Phys. Rev. E 78, 016313 (2008)] uses nested multi-scale Lagrangian advection of fluid particles to distort a Gaussian velocity field and, as a result, generate non-Gaussian synthetic velocity fields. Passive scalar fields can be generated with the procedure when the fluid particles carry a scalar property [C. Rosales, "Synthetic three-dimensional turbulent passive scalar fields via the minimal Lagrangian map," Phys. Fluids 23, 075106 (2011)]. The synthetic fields have been shown to possess highly realistic statistics characterizing small scale intermittency, geometrical structures, and vortex dynamics. In this paper, we present a study of the synthetic fields using the filtering approach. This approach, which has not been pursued so far, provides insights on the potential applications of the synthetic fields in large eddy simulations and subgrid-scale (SGS) modelling. The MTLM method is first generalized to model scalar fields produced by an imposed linear mean profile. We then calculate the subgrid-scale stress, SGS scalar flux, SGS scalar variance, as well as related quantities from the synthetic fields. Comparison with direct numerical simulations (DNSs) shows that the synthetic fields reproduce the probability distributions of the SGS energy and scalar dissipation rather well. Related geometrical statistics also display close agreement with DNS results. The synthetic fields slightly under-estimate the mean SGS energy dissipation and slightly over-predict the mean SGS scalar variance dissipation. In general, the synthetic fields tend to slightly under-estimate the probability of large fluctuations for most quantities we have examined. Small scale anisotropy in the scalar field originated from the imposed mean gradient is captured. The sensitivity of the synthetic fields on the input spectra is assessed by
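
    For readers who want to reproduce the filtering approach on their own velocity fields (synthetic MTLM fields or DNS data), the sketch below computes the box-filtered subgrid-scale stress tau_ij = <u_i u_j> - <u_i><u_j> on a periodic grid with SciPy. It is a generic a-priori filtering snippet, not the authors' code, and the filter width is an arbitrary choice.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def sgs_stress(u, v, w, width=4):
    """Box-filtered SGS stress tensor tau_ij = <u_i u_j> - <u_i><u_j>.

    u, v, w: 3-D velocity components on a periodic grid; width is the filter
    width in grid points ('wrap' enforces periodic boundaries)."""
    filt = lambda f: uniform_filter(f, size=width, mode='wrap')
    vel = (u, v, w)
    tau = {}
    for i in range(3):
        for j in range(i, 3):
            tau[(i, j)] = filt(vel[i] * vel[j]) - filt(vel[i]) * filt(vel[j])
    return tau

# The SGS energy dissipation additionally needs the filtered strain rate,
# eps_sgs = -sum_ij tau_ij * S_ij, which is omitted from this sketch.
```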

  6. Towards Multi-Method Research Approach in Empirical Software Engineering

    Science.gov (United States)

    Mandić, Vladimir; Markkula, Jouni; Oivo, Markku

    This paper presents results of a literature analysis on Empirical Research Approaches in Software Engineering (SE). The analysis explores reasons why traditional methods, such as statistical hypothesis testing and experiment replication, are weakly utilized in the field of SE. It appears that the basic assumptions and preconditions of the traditional methods contradict the actual situation in SE. Furthermore, we have identified the main issues that should be considered by the researcher when selecting a research approach. Given the reasons for the weak utilization of traditional methods, we propose stronger use of the Multi-Method approach with Pragmatism as the philosophical standpoint.

  7. Method of development of the program of forming of parametrical drawings of details in the AutoCAD software product

    Science.gov (United States)

    Alshakova, E. L.

    2017-01-01

    A program in the AutoLISP language makes it possible to generate parametric drawings automatically while working in the AutoCAD software product. Students study the development of AutoLISP programs with the help of a methodical complex containing methodical instructions in which real examples of the creation of images and drawings are implemented. The methodical instructions contain the reference information necessary for carrying out the offered tasks. The method of step-by-step program development is the basis for training in AutoLISP programming: the program draws the elements of a part drawing by means of specially created functions whose argument values are supplied in the same sequence in which AutoCAD issues prompts when the corresponding command is executed in the editor. The process of program design is thus reduced to the step-by-step formation of functions and the sequence of their calls. The author considers the development of AutoLISP programs for the creation of parametric drawings of parts of a defined design, for which the user enters the dimensions of the part elements. These programs generate variants of the tasks for the graphic works performed in the educational process of the "Engineering graphics" and "Engineering and computer graphics" disciplines. Individual tasks allow students to develop skills of independent work in reading and creating drawings, as well as in 3D modeling.

  8. Rank-shaping regularization of exponential spectral analysis for application to functional parametric mapping

    International Nuclear Information System (INIS)

    Turkheimer, Federico E; Hinz, Rainer; Gunn, Roger N; Aston, John A D; Gunn, Steve R; Cunningham, Vincent J

    2003-01-01

    Compartmental models are widely used for the mathematical modelling of dynamic studies acquired with positron emission tomography (PET). The numerical problem involves the estimation of a sum of decaying real exponentials convolved with an input function. In exponential spectral analysis (SA), the nonlinear estimation of the exponential functions is replaced by the linear estimation of the coefficients of a predefined set of exponential basis functions. This set-up guarantees fast estimation and attainment of the global optimum. SA, however, is hampered by high sensitivity to noise and, because of the positivity constraints implemented in the algorithm, cannot be extended to reference region modelling. In this paper, SA limitations are addressed by a new rank-shaping (RS) estimator that defines an appropriate regularization over an unconstrained least-squares solution obtained through singular value decomposition of the exponential base. Shrinkage parameters are conditioned on the expected signal-to-noise ratio. Through application to simulated and real datasets, it is shown that RS ameliorates and extends SA properties in the case of the production of functional parametric maps from PET studies
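
    The construction can be sketched as follows: build the usual spectral-analysis design matrix (the input function convolved with a grid of decaying exponentials), take its singular value decomposition, and replace the hard positivity-constrained fit by an unconstrained solution whose singular components are shrunk by weights between 0 and 1. The snippet below shows that linear algebra only; the SNR-conditioned choice of the shrinkage weights, which is the paper's actual contribution, is left as an input.

```python
import numpy as np

def sa_design_matrix(t, cp, betas):
    """Columns: input function cp(t) convolved with exp(-beta*t) on a grid of betas."""
    dt = t[1] - t[0]
    cols = [np.convolve(cp, np.exp(-b * t))[:len(t)] * dt for b in betas]
    return np.column_stack(cols)

def rank_shaped_fit(t, tac, cp, betas, shrink):
    """Unconstrained spectral-analysis fit with per-component SVD shrinkage.

    shrink: array of weights in [0, 1], one per singular component (in the
    paper these are conditioned on the expected signal-to-noise ratio)."""
    A = sa_design_matrix(t, cp, betas)
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    coeffs = Vt.T @ (shrink * (U.T @ tac) / s)
    return coeffs, A @ coeffs          # spectral coefficients and fitted TAC
```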

  9. Influence of image reconstruction methods on statistical parametric mapping of brain PET images

    International Nuclear Information System (INIS)

    Yin Dayi; Chen Yingmao; Yao Shulin; Shao Mingzhe; Yin Ling; Tian Jiahe; Cui Hongyan

    2007-01-01

    Objective: Statistical parametric mapping (SPM) is widely recognized as a useful tool in brain function studies. The aim of this study was to investigate whether the image reconstruction algorithm used for PET images could influence SPM analysis of the brain. Methods: PET imaging of the whole brain was performed in six normal volunteers. Each volunteer had two scans, with true and false acupuncture. The PET scans were reconstructed using ordered subsets expectation maximization (OSEM) and filtered back projection (FBP) with 3 varied parameters respectively. The images were realigned, normalized and smoothed using the SPM program. The difference between true and false acupuncture scans was tested using a matched pair t test at every voxel. Results: (1) SPM corrected multiple comparison (P corrected uncorrected <0.001): SPMs derived from the images with different reconstruction methods were different. The largest difference, in number and position of the activated voxels, was noticed between the FBP and OSEM reconstruction algorithms. Conclusions: The method of PET image reconstruction could influence the results of SPM uncorrected multiple comparison. Attention should be paid when conclusions are drawn using SPM uncorrected multiple comparison. (authors)

  10. Multi-channel Analysis of Passive Surface Waves (MAPS)

    Science.gov (United States)

    Xia, J.; Cheng, F. Mr; Xu, Z.; Wang, L.; Shen, C.; Liu, R.; Pan, Y.; Mi, B.; Hu, Y.

    2017-12-01

    Urbanization is an inevitable trend in the modernization of human society. At the end of 2013 the Chinese Central Government launched a national urbanization plan—"Three 100 Million People", which aggressively and steadily pushes forward urbanization. Based on the plan, by 2020, approximately 100 million people from rural areas will permanently settle in towns, the dwelling conditions of about 100 million people in towns and villages will be improved, and about 100 million people in central and western China will permanently settle in towns. China's urbanization process will run at the highest speed in the country's urbanization history. Environmentally friendly, non-destructive and non-invasive geophysical assessment methods have played an important role in the urbanization process in China. Because of the human noise and electromagnetic fields due to industrial life, the geophysical methods already used in urban environments (gravity, magnetics, electricity, seismic) face great challenges. Human activity, however, provides an effective source for passive seismic methods. Claerbout pointed out that the wavefield received at one point with excitation at another point can be reconstructed by calculating the cross-correlation of noise records at two surface points. Based on this idea (cross-correlation of two noise records) and the virtual source method, we proposed Multi-channel Analysis of Passive Surface Waves (MAPS). MAPS mainly uses traffic noise recorded with a linear receiver array. Because Multi-channel Analysis of Surface Waves can produce a shear (S) wave velocity model with high resolution in the shallow part of the model, MAPS combines the acquisition and processing of active source and passive source data in the same flow, which does not require distinguishing between them. MAPS is also capable of real-time quality control of noise recording, which is important for near-surface applications in urban environments. The numerical and real-world examples demonstrated that MAPS can be
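
    The cross-correlation step at the heart of the virtual source idea is simple to write down. The sketch below correlates two ambient-noise records in the frequency domain; stacking the result over many noise windows and over all receiver pairs of the linear array yields virtual shot gathers that can be fed to the usual multi-channel surface-wave (dispersion) analysis. It is a generic illustration, not the MAPS processing code.

```python
import numpy as np

def virtual_source(rec_a, rec_b, nfft=None):
    """Cross-correlate two noise records (e.g. traffic noise) recorded at two
    receivers, approximating the inter-receiver response."""
    n = len(rec_a)
    nfft = nfft or 2 * n
    A = np.fft.rfft(rec_a, nfft)
    B = np.fft.rfft(rec_b, nfft)
    xcorr = np.fft.irfft(A * np.conj(B), nfft)
    return np.fft.fftshift(xcorr)      # lag axis roughly centred on zero lag

# Typical use: average virtual_source() over many noise windows, repeat for
# every receiver pair, then pick surface-wave dispersion from the virtual gather.
```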

  11. Parametric and Non-Parametric System Modelling

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg

    1999-01-01

    the focus is on combinations of parametric and non-parametric methods of regression. This combination can be in terms of additive models where e.g. one or more non-parametric term is added to a linear regression model. It can also be in terms of conditional parametric models where the coefficients...... considered. It is shown that adaptive estimation in conditional parametric models can be performed by combining the well known methods of local polynomial regression and recursive least squares with exponential forgetting. The approach used for estimation in conditional parametric models also highlights how...... networks is included. In this paper, neural networks are used for predicting the electricity production of a wind farm. The results are compared with results obtained using an adaptively estimated ARX-model. Finally, two papers on stochastic differential equations are included. In the first paper, among...

  12. Post traumatic brain perfusion SPECT analysis using reconstructed ROI maps of radioactive microsphere derived cerebral blood flow and statistical parametric mapping.

    Science.gov (United States)

    McGoron, Anthony J; Capille, Michael; Georgiou, Michael F; Sanchez, Pablo; Solano, Juan; Gonzalez-Brito, Manuel; Kuluz, John W

    2008-02-29

    Assessment of cerebral blood flow (CBF) by SPECT could be important in the management of patients with severe traumatic brain injury (TBI) because changes in regional CBF can affect outcome by promoting edema formation and intracranial pressure elevation (with cerebral hyperemia), or by causing secondary ischemic injury including post-traumatic stroke. The purpose of this study was to establish an improved method for evaluating regional CBF changes after TBI in piglets. The focal effects of moderate traumatic brain injury (TBI) on cerebral blood flow (CBF) by SPECT cerebral blood perfusion (CBP) imaging in an animal model were investigated by parallelized statistical techniques. Regional CBF was measured by radioactive microspheres and by SPECT 2 hours after injury in sham-operated piglets versus those receiving severe TBI by fluid-percussion injury to the left parietal lobe. Qualitative SPECT CBP accuracy was assessed against reference radioactive microsphere regional CBF measurements by map reconstruction, registration and smoothing. Cerebral hypoperfusion in the test group was identified at the voxel level using statistical parametric mapping (SPM). A significant area of hypoperfusion (P TBI. Statistical mapping of the reference microsphere CBF data confirms a focal decrease found with SPECT and SPM. The suitability of SPM for application to the experimental model and ability to provide insight into CBF changes in response to traumatic injury was validated by the SPECT SPM result of a decrease in CBP at the left parietal region injury area of the test group. Further study and correlation of this characteristic lesion with long-term outcomes and auxiliary diagnostic modalities is critical to developing more effective critical care treatment guidelines and automated medical imaging processing techniques.

  13. A GIS-based extended fuzzy multi-criteria evaluation for landslide susceptibility mapping.

    Science.gov (United States)

    Feizizadeh, Bakhtiar; Shadman Roodposhti, Majid; Jankowski, Piotr; Blaschke, Thomas

    2014-12-01

    Landslide susceptibility mapping (LSM) is making increasing use of GIS-based spatial analysis in combination with multi-criteria evaluation (MCE) methods. We have developed a new multi-criteria decision analysis (MCDA) method for LSM and applied it to the Izeh River basin in south-western Iran. Our method is based on fuzzy membership functions (FMFs) derived from GIS analysis. It makes use of nine causal landslide factors identified by local landslide experts. Fuzzy set theory was first integrated with an analytical hierarchy process (AHP) in order to use pairwise comparisons to compare LSM criteria for ranking purposes. FMFs were then applied in order to determine the criteria weights to be used in the development of a landslide susceptibility map. Finally, a landslide inventory database was used to validate the LSM map by comparing it with known landslides within the study area. Results indicated that the integration of fuzzy set theory with AHP produced significantly improved accuracies and a high level of reliability in the resulting landslide susceptibility map. Approximately 53% of known landslides within our study area fell within zones classified as having "very high susceptibility", with a further 31% falling into zones classified as having "high susceptibility".
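
    Two building blocks of such a GIS-based fuzzy-AHP workflow are easy to sketch: deriving criteria weights from a pairwise-comparison matrix (principal eigenvector) and combining fuzzy membership layers into a susceptibility map by a weighted overlay. The three criteria, the comparison values and the random layers below are purely illustrative; the study itself used nine expert-selected factors and fuzzy membership functions derived from GIS analysis.

```python
import numpy as np

def ahp_weights(pairwise):
    """Criteria weights from an AHP pairwise-comparison matrix
    (normalised principal eigenvector)."""
    vals, vecs = np.linalg.eig(pairwise)
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / w.sum()

# Illustrative 3-criteria comparison matrix (e.g. slope, lithology, distance to river).
pairwise = np.array([[1.0, 3.0, 5.0],
                     [1/3, 1.0, 2.0],
                     [1/5, 1/2, 1.0]])
weights = ahp_weights(pairwise)

# Fuzzy membership layers scaled to [0, 1], stacked as (criteria, rows, cols).
layers = np.random.default_rng(2).random((3, 100, 100))
susceptibility = np.tensordot(weights, layers, axes=1)   # weighted overlay map
```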

  14. Study on high gain broadband optical parametric chirped pulse amplification

    International Nuclear Information System (INIS)

    Zhang, S.K.; Fujita, M.; Yamanaka, C.; Yoshida, H.; Kodama, R.; Fujita, H.; Nakatsuka, M.; Izawa, Y.

    2000-01-01

    Optical parametric chirped pulse amplification has apparent advantages over the current schemes for high energy ultrashort pulse amplification. High gain in a single pass amplification, small B-integral, low heat deposition, high contrast ratio and, especially the extremely broad gain bandwidth with large-size crystals available bring people new hope for over multi-PW level at which the existing Nd:glass systems suffered difficulties. In this paper we present simulation and experimental studies for a high gain optical parametric chirped pulse amplification system which may be used as a preamplifier to replace the current complicated regenerative system or multi-pass Ti:sapphire amplifiers. Investigations on the amplification bandwidth and gain with BBO are performed. Analysis and discussions are also given. (author)

  15. Software Defined Networking (SDN) controlled all optical switching networks with multi-dimensional switching architecture

    Science.gov (United States)

    Zhao, Yongli; Ji, Yuefeng; Zhang, Jie; Li, Hui; Xiong, Qianjin; Qiu, Shaofeng

    2014-08-01

    Ultrahigh throughput capacity requirements are challenging the current optical switching nodes with the fast development of data center networks. Pbit/s level all optical switching networks need to be deployed soon, which will cause high complexity of the node architecture. How to control the future network and node equipment together will become a new problem. An enhanced Software Defined Networking (eSDN) control architecture is proposed in the paper, which consists of Provider NOX (P-NOX) and Node NOX (N-NOX). With the cooperation of P-NOX and N-NOX, flexible control of the entire network can be achieved. An all optical switching network testbed has been experimentally demonstrated with efficient control of the enhanced Software Defined Networking (eSDN). Pbit/s level all optical switching nodes in the testbed are implemented based on a multi-dimensional switching architecture, i.e. multi-level and multi-planar. Due to space and cost limitations, each optical switching node is only equipped with four input line boxes and four output line boxes respectively. Experimental results are given to verify the performance of our proposed control and switching architecture.

  16. Multi-sensors multi-baseline mapping system for mobile robot using stereovision camera and laser-range device

    Directory of Open Access Journals (Sweden)

    Mohammed Faisal

    2016-06-01

    Full Text Available Countless applications today use mobile robots, including autonomous navigation, security patrolling, housework, search-and-rescue operations, material handling, manufacturing, and automated transportation systems. Regardless of the application, a mobile robot must use a robust autonomous navigation system. Autonomous navigation remains one of the primary challenges in the mobile-robot industry; many control algorithms and techniques have been recently developed that aim to overcome this challenge. Among autonomous navigation methods, vision-based systems have been growing in recent years due to rapid gains in computational power and the reliability of visual sensors. The primary focus of research into vision-based navigation is to allow a mobile robot to navigate in an unstructured environment without collision. In recent years, several researchers have looked at methods for setting up autonomous mobile robots for navigational tasks. Among these methods, stereovision-based navigation is a promising approach for reliable and efficient navigation. In this article, we create and develop a novel mapping system for a robust autonomous navigation system. The main contribution of this article is the fusion of multi-baseline stereovision (narrow and wide baselines) and laser-range reading data to enhance the accuracy of the point cloud, to reduce the ambiguity of correspondence matching, and to extend the field of view of the proposed mapping system to 180°. Another contribution is the pruning of the region of interest of the three-dimensional point clouds to reduce the computational burden involved in the stereo process. Therefore, we call the proposed system a multi-sensors multi-baseline mapping system. The experimental results illustrate the robustness and accuracy of the proposed system.

  17. DiFX: A software correlator for very long baseline interferometry using multi-processor computing environments

    OpenAIRE

    Deller, A. T.; Tingay, S. J.; Bailes, M.; West, C.

    2007-01-01

    We describe the development of an FX style correlator for Very Long Baseline Interferometry (VLBI), implemented in software and intended to run in multi-processor computing environments, such as large clusters of commodity machines (Beowulf clusters) or computers specifically designed for high performance computing, such as multi-processor shared-memory machines. We outline the scientific and practical benefits for VLBI correlation, these chiefly being due to the inherent flexibility of softw...

  18. Experimental Evaluation of Multi-Round Matrix Multiplication on MapReduce

    DEFF Research Database (Denmark)

    Ceccarello, Matteo; Silvestri, Francesco

    2015-01-01

    required by reduce functions. Then, we present an extensive study of this library on an in-house cluster and on Amazon Web Services aiming at showing its performance and at comparing monolithic and multi-round approaches. The experiments show that, even without a low level optimization, it is possible...... not be the best approach in cloud systems. Indeed, multi-round algorithms may exploit some features of cloud platforms by suitably setting the round number according to the execution context. In this paper we carry out an experimental study of multi-round MapReduce algorithms aiming at investigating...... the performance of the multi-round approach. We use matrix multiplication as a case study. We first propose a scalable Hadoop library, named M3, for matrix multiplication in the dense and sparse cases which allows to tradeoff round number with the amount of data shuffled in each round and the amount of memory...
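
    For readers unfamiliar with how matrix multiplication maps onto the MapReduce model, the sketch below emulates the classic single-round (monolithic) formulation in plain Python: the map phase replicates each A and B entry to every output cell that needs it, and the reduce phase multiplies and sums the matching contributions. The multi-round algorithms studied in the paper (and the M3 library) instead partition the inner dimension across rounds to trade round count against shuffle volume and reducer memory; that tradeoff is not shown here.

```python
from itertools import groupby

N = 2                                   # toy matrix size

def map_phase(entry):
    """entry: ('A', i, k, value) or ('B', k, j, value).
    Emit ((i, j), tagged value) for every output cell that needs this entry."""
    name, r, c, v = entry
    if name == 'A':
        return [((r, j), ('A', c, v)) for j in range(N)]
    return [((i, c), ('B', r, v)) for i in range(N)]

def reduce_phase(key, values):
    """Sum over the inner index k of matching A and B contributions."""
    a = {k: v for tag, k, v in values if tag == 'A'}
    b = {k: v for tag, k, v in values if tag == 'B'}
    return key, sum(a[k] * b[k] for k in a if k in b)

A = [('A', i, k, i + k + 1) for i in range(N) for k in range(N)]
B = [('B', k, j, k * j + 1) for k in range(N) for j in range(N)]

pairs = sorted((p for e in A + B for p in map_phase(e)), key=lambda kv: kv[0])
C = dict(reduce_phase(key, [v for _, v in grp])
         for key, grp in groupby(pairs, key=lambda kv: kv[0]))
print(C)    # {(0, 0): 3, (0, 1): 5, (1, 0): 5, (1, 1): 8}
```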

  19. Parametric analysis of energy quality management for district in China using multi-objective optimization approach

    International Nuclear Information System (INIS)

    Lu, Hai; Yu, Zitao; Alanne, Kari; Xu, Xu; Fan, Liwu; Yu, Han; Zhang, Liang; Martinac, Ivo

    2014-01-01

    Highlights: • A time-effective multi-objective design optimization scheme is proposed. • The scheme aims at exploring suitable 3E energy systems for the specific case. • A realistic case located in China is used for the analysis. • A parametric study is conducted to test the effects of different parameters. - Abstract: Due to increasing energy demands and global warming, energy quality management (EQM) for districts has been gaining importance over the last few decades. The evaluation of the optimum energy systems for specific districts is an essential part of EQM. This paper presents a deep analysis of the optimum energy systems for a district sited in China. A multi-objective optimization approach based on a Genetic Algorithm (GA) is proposed for the analysis. The optimization process aims to search for suitable 3E (minimum economic cost and environmental burden as well as maximum efficiency) energy systems. Here, life cycle CO2 equivalent (LCCO2), life cycle cost (LCC) and exergy efficiency (EE) are set as optimization objectives. Then, the optimum energy systems for the Chinese case are presented. The final part of the work investigates the effects of different energy parameters. The results show that the optimum energy systems might vary significantly depending on some parameters

  20. EPANET Multi-Species Extension Software and User's Manual ...

    Science.gov (United States)

    Software and User's Manual EPANET is used in homeland security research to model contamination threats to water systems. Historically, EPANET has been limited to tracking the dynamics of a single chemical transported through a network of pipes and storage tanks, such as a fluoride used in a tracer study or free chlorine used in a disinfection decay study. Recently, the NHSRC released a new extension to EPANET called EPANET-MSX (Multi-Species eXtension) that allows for the consideration of multiple interacting species in the bulk flow and on the pipe walls. This capability has been incorporated into both a stand-alone executable program as well as a toolkit library of functions that programmers can use to build customized applications.

  1. Efficient prediction of ground noise from helicopters and parametric studies based on acoustic mapping

    Directory of Open Access Journals (Sweden)

    Fei WANG

    2018-02-01

    Full Text Available Based on acoustic mapping, a prediction model for the ground noise radiated from an in-flight helicopter is established. To enhance calculation efficiency, a high-efficiency second-level acoustic radiation model capable of taking the influence of atmospheric absorption on noise into account is first developed by combining the point-source idea and the rotor noise radiation characteristics. A comparison between the present model and the direct computation method of noise is made and the high efficiency of the model is validated. Rotor free-wake analysis and the Ffowcs Williams-Hawkings (FW-H) equation are applied to the aerodynamics and noise prediction in the present model. Secondly, a database of noise spheres with the characteristic parameters of advance ratio and tip-path-plane angle is established by the helicopter trim model together with a parametric modeling approach. Furthermore, based on acoustic mapping, a method for rapid simulation of the ground noise radiated from an in-flight helicopter is developed. The noise footprint for the AH-1 rotor is then calculated and the influence of parameters including advance ratio and flight path angle on ground noise is analyzed in depth using the developed model. The results suggest that with the increase of advance ratio and flight path angle, the peak noise levels on the ground first increase and then decrease; in the meantime, the maximum Sound Exposure Level (SEL) noise on the ground shifts toward the advancing side of the rotor. Besides, through the analysis of the effects of longitudinal forces on miss-distance and rotor Blade-Vortex Interaction (BVI) noise in descent flight, some meaningful results for reducing BVI noise on the ground are obtained. Keywords: Acoustic mapping, Helicopter, Noise footprint, Rotor noise, Second-level acoustic radiation model

  2. Direct parametric reconstruction in dynamic PET myocardial perfusion imaging: in vivo studies

    Science.gov (United States)

    Petibon, Yoann; Rakvongthai, Yothin; El Fakhri, Georges; Ouyang, Jinsong

    2017-05-01

    Dynamic PET myocardial perfusion imaging (MPI) used in conjunction with tracer kinetic modeling enables the quantification of absolute myocardial blood flow (MBF). However, MBF maps computed using the traditional indirect method (i.e. post-reconstruction voxel-wise fitting of kinetic model to PET time-activity-curves-TACs) suffer from poor signal-to-noise ratio (SNR). Direct reconstruction of kinetic parameters from raw PET projection data has been shown to offer parametric images with higher SNR compared to the indirect method. The aim of this study was to extend and evaluate the performance of a direct parametric reconstruction method using in vivo dynamic PET MPI data for the purpose of quantifying MBF. Dynamic PET MPI studies were performed on two healthy pigs using a Siemens Biograph mMR scanner. List-mode PET data for each animal were acquired following a bolus injection of ~7-8 mCi of 18F-flurpiridaz, a myocardial perfusion agent. Fully-3D dynamic PET sinograms were obtained by sorting the coincidence events into 16 temporal frames covering ~5 min after radiotracer administration. Additionally, eight independent noise realizations of both scans—each containing 1/8th of the total number of events—were generated from the original list-mode data. Dynamic sinograms were then used to compute parametric maps using the conventional indirect method and the proposed direct method. For both methods, a one-tissue compartment model accounting for spillover from the left and right ventricle blood-pools was used to describe the kinetics of 18F-flurpiridaz. An image-derived arterial input function obtained from a TAC taken in the left ventricle cavity was used for tracer kinetic analysis. For the indirect method, frame-by-frame images were estimated using two fully-3D reconstruction techniques: the standard ordered subset expectation maximization (OSEM) reconstruction algorithm on one side, and the one-step late maximum a posteriori (OSL-MAP) algorithm on the other
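
    The kinetic model underlying both reconstruction routes can be written compactly and, for the indirect route, fitted voxel-by-voxel after reconstruction. The sketch below fits a one-tissue compartment model with a single blood-volume (spillover) term to a time-activity curve using SciPy; it is a simplified stand-in (the paper models spillover from the left and right ventricle blood pools separately, and its direct method estimates the parameters from the sinogram data rather than from reconstructed TACs). Initial values and bounds are arbitrary.

```python
import numpy as np
from scipy.optimize import curve_fit

def one_tissue_tac(t, K1, k2, Vb, cp, cwb):
    """C_model(t) = (1 - Vb) * K1 * [exp(-k2 t) (*) Cp(t)] + Vb * Cwb(t),
    i.e. a one-tissue compartment model plus a single spillover term."""
    dt = t[1] - t[0]
    ct = K1 * np.convolve(cp, np.exp(-k2 * t))[:len(t)] * dt
    return (1.0 - Vb) * ct + Vb * cwb

def fit_voxel(t, tac, cp, cwb):
    """Indirect method: least-squares fit of one reconstructed voxel TAC."""
    model = lambda t, K1, k2, Vb: one_tissue_tac(t, K1, k2, Vb, cp, cwb)
    popt, _ = curve_fit(model, t, tac, p0=[0.5, 0.1, 0.05],
                        bounds=([0, 0, 0], [5, 5, 1]))
    return popt        # K1 (flow-related uptake), k2, blood fraction Vb
```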

  3. Direct parametric reconstruction in dynamic PET myocardial perfusion imaging: in-vivo studies

    Science.gov (United States)

    Petibon, Yoann; Rakvongthai, Yothin; Fakhri, Georges El; Ouyang, Jinsong

    2017-01-01

    Dynamic PET myocardial perfusion imaging (MPI) used in conjunction with tracer kinetic modeling enables the quantification of absolute myocardial blood flow (MBF). However, MBF maps computed using the traditional indirect method (i.e. post-reconstruction voxel-wise fitting of kinetic model to PET time-activity-curves -TACs) suffer from poor signal-to-noise ratio (SNR). Direct reconstruction of kinetic parameters from raw PET projection data has been shown to offer parametric images with higher SNR compared to the indirect method. The aim of this study was to extend and evaluate the performance of a direct parametric reconstruction method using in-vivo dynamic PET MPI data for the purpose of quantifying MBF. Dynamic PET MPI studies were performed on two healthy pigs using a Siemens Biograph mMR scanner. List-mode PET data for each animal were acquired following a bolus injection of ~7-8 mCi of 18F-flurpiridaz, a myocardial perfusion agent. Fully-3D dynamic PET sinograms were obtained by sorting the coincidence events into 16 temporal frames covering ~5 min after radiotracer administration. Additionally, eight independent noise realizations of both scans - each containing 1/8th of the total number of events - were generated from the original list-mode data. Dynamic sinograms were then used to compute parametric maps using the conventional indirect method and the proposed direct method. For both methods, a one-tissue compartment model accounting for spillover from the left and right ventricle blood-pools was used to describe the kinetics of 18F-flurpiridaz. An image-derived arterial input function obtained from a TAC taken in the left ventricle cavity was used for tracer kinetic analysis. For the indirect method, frame-by-frame images were estimated using two fully-3D reconstruction techniques: the standard Ordered Subset Expectation Maximization (OSEM) reconstruction algorithm on one side, and the One-Step Late Maximum a Posteriori (OSL-MAP) algorithm on the other

  4. Cyberphysical systems for epilepsy and related brain disorders multi-parametric monitoring and analysis for diagnosis and optimal disease management

    CERN Document Server

    Antonopoulos, Christos

    2015-01-01

    This book introduces a new cyberphysical system that combines clinical and basic neuroscience research with advanced data analysis and medical management tools for developing novel applications for the management of epilepsy. The authors describe the algorithms and architectures needed to provide ambulatory, diagnostic and long-term monitoring services, through multi-parametric data collection. Readers will see how to achieve in-hospital quality standards, addressing conventional “routine” clinic-based service purposes, at reduced cost, enhanced capability, and increased geographical availability. The cyberphysical system described in this book is flexible, can be optimized for each patient, and is demonstrated in several case studies.

  5. An ultra-dense integrated linkage map for hexaploid chrysanthemum enables multi-allelic QTL analysis

    NARCIS (Netherlands)

    Geest, van Geert; Bourke, Peter M.; Voorrips, Roeland E.; Marasek-Ciolakowska, Agnieszka; Liao, Yanlin; Post, Aike; Meeteren, van Uulke; Visser, Richard G.F.; Maliepaard, Chris; Arens, Paul

    2017-01-01

    Key message: We constructed the first integrated genetic linkage map in a polysomic hexaploid. This enabled us to estimate inheritance of parental haplotypes in the offspring and detect multi-allelic QTL. Abstract: Construction and use of linkage maps are challenging in hexaploids with polysomic

  6. Evaluating the scalability of HEP software and multi-core hardware

    CERN Document Server

    Jarp, S; Leduc, J; Nowak, A

    2011-01-01

    As researchers have reached the practical limits of processor performance improvements by frequency scaling, it is clear that the future of computing lies in the effective utilization of parallel and multi-core architectures. Since this significant change in computing is well underway, it is vital for HEP programmers to understand the scalability of their software on modern hardware and the opportunities for potential improvements. This work aims to quantify the benefit of new mainstream architectures to the HEP community through practical benchmarking on recent hardware solutions, including the usage of parallelized HEP applications.

  7. MODEL OF IMPROVING ENVIRONMENTAL MANAGEMENT SYSTEM BY MULTI - SOFTWARE

    Directory of Open Access Journals (Sweden)

    Jelena Jovanovic

    2009-03-01

    Full Text Available This paper is based on a doctoral dissertation oriented toward improving an environmental management system using multi-software. The doctoral dissertation uses the key results of a master thesis oriented toward the quantification of environmental aspects and impacts in organizations by means of an artificial neural network. This paper recommends improving the environmental management system in an organization by using the Balanced Scorecard model and the MCDM method AHP (Analytic Hierarchy Process) based on group decision-making. The BSC would be extended with elements of the environmental management system and used in the area of the strategic management system of the organization, and AHP would be used for checking the results obtained by quantifying environmental aspects and impacts.

  8. Statistical Parametric Mapping to Identify Differences between Consensus-Based Joint Patterns during Gait in Children with Cerebral Palsy.

    Science.gov (United States)

    Nieuwenhuys, Angela; Papageorgiou, Eirini; Desloovere, Kaat; Molenaers, Guy; De Laet, Tinne

    2017-01-01

    Experts recently identified 49 joint motion patterns in children with cerebral palsy during a Delphi consensus study. Pattern definitions were therefore the result of subjective expert opinion. The present study aims to provide objective, quantitative data supporting the identification of these consensus-based patterns. To do so, statistical parametric mapping was used to compare the mean kinematic waveforms of 154 trials of typically developing children (n = 56) to the mean kinematic waveforms of 1719 trials of children with cerebral palsy (n = 356), which were classified following the classification rules of the Delphi study. Three hypotheses stated that: (a) joint motion patterns with 'no or minor gait deviations' (n = 11 patterns) do not differ significantly from the gait pattern of typically developing children; (b) all other pathological joint motion patterns (n = 38 patterns) differ from typically developing gait and the locations of difference within the gait cycle, highlighted by statistical parametric mapping, concur with the consensus-based classification rules. (c) all joint motion patterns at the level of each joint (n = 49 patterns) differ from each other during at least one phase of the gait cycle. Results showed that: (a) ten patterns with 'no or minor gait deviations' differed somewhat unexpectedly from typically developing gait, but these differences were generally small (≤3°); (b) all other joint motion patterns (n = 38) differed from typically developing gait and the significant locations within the gait cycle that were indicated by the statistical analyses, coincided well with the classification rules; (c) joint motion patterns at the level of each joint significantly differed from each other, apart from two sagittal plane pelvic patterns. In addition to these results, for several joints, statistical analyses indicated other significant areas during the gait cycle that were not included in the pattern definitions of the consensus study

  9. Schedulability-Driven Partitioning and Mapping for Multi-Cluster Real-Time Systems

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Peng, Zebo

    2004-01-01

    We present an approach to partitioning and mapping for multi-cluster embedded systems consisting of time-triggered and event-triggered clusters, interconnected via gateways. We have proposed a schedulability analysis for such systems, including a worst-case queuing delay analysis for the gateways...

  10. Multi-user software of radio therapeutical calculation using a computational network

    International Nuclear Information System (INIS)

    Allaucca P, J.J.; Picon C, C.; Zaharia B, M.

    1998-01-01

    A hardware and software system has been designed for a radiotherapy department. It runs on a Novell Network operating system platform, sharing the existing resources and those of the server; it is centralized, multi-user and offers greater safety. It resolves a variety of problems and calculation needs, patient workflow and administration; it is very fast and versatile, and it contains a set of menus and options which may be selected with the mouse, the arrow keys or keyboard shortcuts. (Author)

  11. Module-based Hybrid Uncertainty Quantification for Multi-physics Applications: Theory and Software

    Energy Technology Data Exchange (ETDEWEB)

    Tong, Charles [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Chen, Xiao [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Iaccarino, Gianluca [Stanford Univ., CA (United States); Mittal, Akshay [Stanford Univ., CA (United States)

    2013-10-08

    In this project we proposed to develop an innovative uncertainty quantification methodology that captures the best of the two competing approaches in UQ, namely, intrusive and non-intrusive approaches. The idea is to develop the mathematics and the associated computational framework and algorithms to facilitate the use of intrusive or non-intrusive UQ methods in different modules of a multi-physics multi-module simulation model in a way that physics code developers for different modules are shielded (as much as possible) from the chores of accounting for the uncertainties introduced by the other modules. As a result of our research and development, we have produced a number of publications, conference presentations, and a software product.

  12. Parametric cost models for space telescopes

    Science.gov (United States)

    Stahl, H. Philip; Henrichs, Todd; Dollinger, Courtnay

    2017-11-01

    Multivariable parametric cost models for space telescopes provide several benefits to designers and space system project managers. They identify major architectural cost drivers and allow high-level design trades. They enable cost-benefit analysis for technology development investment. And, they provide a basis for estimating total project cost. A survey of historical models found that there is no definitive space telescope cost model. In fact, published models vary greatly [1]. Thus, there is a need for parametric space telescope cost models. An effort is underway to develop single variable [2] and multi-variable [3] parametric space telescope cost models based on the latest available data and applying rigorous analytical techniques. Specific cost estimating relationships (CERs) have been developed which show that aperture diameter is the primary cost driver for large space telescopes; technology development as a function of time reduces cost at the rate of 50% per 17 years; it costs less per square meter of collecting aperture to build a large telescope than a small telescope; and increasing mass reduces cost.

  13. Parametric Cost Models for Space Telescopes

    Science.gov (United States)

    Stahl, H. Philip; Henrichs, Todd; Dollinger, Courtney

    2010-01-01

    Multivariable parametric cost models for space telescopes provide several benefits to designers and space system project managers. They identify major architectural cost drivers and allow high-level design trades. They enable cost-benefit analysis for technology development investment. And, they provide a basis for estimating total project cost. A survey of historical models found that there is no definitive space telescope cost model. In fact, published models vary greatly [1]. Thus, there is a need for parametric space telescope cost models. An effort is underway to develop single variable [2] and multi-variable [3] parametric space telescope cost models based on the latest available data and applying rigorous analytical techniques. Specific cost estimating relationships (CERs) have been developed which show that aperture diameter is the primary cost driver for large space telescopes; technology development as a function of time reduces cost at the rate of 50% per 17 years; it costs less per square meter of collecting aperture to build a large telescope than a small telescope; and increasing mass reduces cost.

  14. A multi-resolution HEALPix data structure for spherically mapped point data

    Directory of Open Access Journals (Sweden)

    Robert W. Youngren

    2017-06-01

    Full Text Available Data describing entities with locations that are points on a sphere are described as spherically mapped. Several data structures designed for spherically mapped data have been developed. One of them, known as Hierarchical Equal Area iso-Latitude Pixelization (HEALPix), partitions the sphere into twelve diamond-shaped equal-area base cells and then recursively subdivides each cell into four diamond-shaped subcells, continuing to the desired level of resolution. Twelve quadtrees, one associated with each base cell, store the data records associated with that cell and its subcells. HEALPix has been used successfully for numerous applications, notably including cosmic microwave background data analysis. However, for applications involving sparse point data HEALPix has possible drawbacks, including inefficient memory utilization, overwriting of proximate points, and return of spurious points for certain queries. A multi-resolution variant of HEALPix specifically optimized for sparse point data was developed. The new data structure allows different areas of the sphere to be subdivided at different levels of resolution. It combines HEALPix's positive features with the advantages of multi-resolution, including reduced memory requirements and improved query performance. An implementation of the new Multi-Resolution HEALPix (MRH) data structure was tested using spherically mapped data from four different scientific applications (warhead fragmentation trajectories, weather station locations, galaxy locations, and synthetic locations). Four types of range queries were applied to each data structure for each dataset. Compared to HEALPix, MRH used two to four orders of magnitude less memory for the same data, and on average its queries executed 72% faster. Keywords: Computer science
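
    The record above describes a per-base-cell quadtree in which only dense regions of the sphere are subdivided further. The following Python sketch illustrates that multi-resolution idea with a point-threshold quadtree; the 12-cell lon/lat addressing used here is a simplified stand-in for the real HEALPix diamond pixelization, and the thresholds and example points are arbitrary.

```python
# Minimal multi-resolution quadtree sketch, loosely inspired by the MRH idea:
# each base cell holds an adaptive quadtree that splits only where points pile up.
MAX_POINTS = 4     # split a cell when it holds more points than this
MAX_DEPTH = 12     # deepest allowed subdivision level

class Node:
    def __init__(self, depth=0):
        self.depth = depth
        self.points = []          # (x, y, payload) tuples while this is a leaf
        self.children = None      # list of 4 child Nodes once split

    def insert(self, x, y, payload):
        if self.children is None:
            self.points.append((x, y, payload))
            if len(self.points) > MAX_POINTS and self.depth < MAX_DEPTH:
                self._split()
            return
        self._child(x, y).insert(x, y, payload)

    def _split(self):
        self.children = [Node(self.depth + 1) for _ in range(4)]
        stored, self.points = self.points, []
        for x, y, payload in stored:
            self._child(x, y).insert(x, y, payload)

    def _child(self, x, y):
        # x, y are coordinates in [0, 1) for the whole base cell; the child index
        # at this depth is read off the next binary digit of each coordinate
        scale = 2 ** (self.depth + 1)
        ix, iy = int(x * scale) % 2, int(y * scale) % 2
        return self.children[2 * iy + ix]

def base_cell(lon_deg, lat_deg):
    """Toy stand-in for HEALPix base-cell addressing: 12 lon/lat bins."""
    col = int((lon_deg % 360.0) // 90.0)                          # 4 columns
    row = 0 if lat_deg > 30.0 else (2 if lat_deg < -30.0 else 1)  # 3 rows
    return row * 4 + col

class MultiResolutionSphereIndex:
    def __init__(self):
        self.roots = [Node() for _ in range(12)]  # one adaptive quadtree per base cell

    def insert(self, lon_deg, lat_deg, payload):
        x = (lon_deg % 90.0) / 90.0                    # crude local coordinates
        y = min((lat_deg + 90.0) / 180.0, 0.999999)
        self.roots[base_cell(lon_deg, lat_deg)].insert(x, y, payload)

idx = MultiResolutionSphereIndex()
idx.insert(31.2, -12.5, "galaxy 42")
idx.insert(31.3, -12.4, "galaxy 43")
```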

  15. FRACOR-software toolbox for deterministic mapping of fracture corridors in oil fields on AutoCAD platform

    Science.gov (United States)

    Ozkaya, Sait I.

    2018-03-01

    Fracture corridors are interconnected large fractures in a narrow sub-vertical tabular array, which usually traverse the entire reservoir vertically and extend for several hundreds of meters laterally. Fracture corridors, with their huge conductivities, constitute an important element of many fractured reservoirs. Unlike small diffuse fractures, actual fracture corridors must be mapped deterministically for simulation or field development purposes. Fracture corridors can be identified and quantified definitively with borehole image logs and well testing. However, there are rarely sufficient image logs or well tests, and it is necessary to utilize various fracture corridor indicators with varying degrees of reliability. Integration of data from many different sources, in turn, requires a platform with powerful editing and layering capability. Available commercial reservoir characterization software packages, with layering and editing capabilities, can be cost intensive. CAD packages are far more affordable and may easily acquire the versatility and power of commercial software packages with the addition of a small software toolbox. The objective of this communication is to present FRACOR, a software toolbox which enables deterministic 2D fracture corridor mapping and modeling on the AutoCAD platform. The FRACOR toolbox is written in AutoLISP and contains several independent routines to import and integrate available fracture corridor data from an oil field, and export results as text files. The resulting fracture corridor maps consist mainly of fracture corridors with different confidence levels, derived from a combination of static and dynamic data, and exclusion zones where no fracture corridor can exist. The exported text file of fracture corridors from FRACOR can be imported into an upscaling program to generate a fracture grid for dual porosity simulation or used for field development and well planning.

  16. A GIS-based extended fuzzy multi-criteria evaluation for landslide susceptibility mapping

    Science.gov (United States)

    Feizizadeh, Bakhtiar; Shadman Roodposhti, Majid; Jankowski, Piotr; Blaschke, Thomas

    2014-01-01

    Landslide susceptibility mapping (LSM) is making increasing use of GIS-based spatial analysis in combination with multi-criteria evaluation (MCE) methods. We have developed a new multi-criteria decision analysis (MCDA) method for LSM and applied it to the Izeh River basin in south-western Iran. Our method is based on fuzzy membership functions (FMFs) derived from GIS analysis. It makes use of nine causal landslide factors identified by local landslide experts. Fuzzy set theory was first integrated with an analytical hierarchy process (AHP) in order to use pairwise comparisons to compare LSM criteria for ranking purposes. FMFs were then applied in order to determine the criteria weights to be used in the development of a landslide susceptibility map. Finally, a landslide inventory database was used to validate the LSM map by comparing it with known landslides within the study area. Results indicated that the integration of fuzzy set theory with AHP produced significantly improved accuracies and a high level of reliability in the resulting landslide susceptibility map. Approximately 53% of known landslides within our study area fell within zones classified as having “very high susceptibility”, with the further 31% falling into zones classified as having “high susceptibility”. PMID:26089577
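
    The AHP step described above turns pairwise comparisons of criteria into weights before the fuzzy membership functions are applied. A minimal sketch of that weight computation (principal-eigenvector weights plus the usual consistency-ratio check) is shown below; the 3x3 comparison matrix is a made-up example, not the nine-factor matrix used in the study.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three criteria (Saaty 1-9 scale).
# A[i, j] states how much more important criterion i is than criterion j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Criteria weights = normalized principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = eigvecs[:, k].real
weights /= weights.sum()

# Consistency ratio (CR < 0.1 is the usual acceptability threshold).
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}[n]   # random index table excerpt
cr = ci / ri

print("weights:", np.round(weights, 3), "CR:", round(cr, 3))
```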

  17. Post traumatic brain perfusion SPECT analysis using reconstructed ROI maps of radioactive microsphere derived cerebral blood flow and statistical parametric mapping

    International Nuclear Information System (INIS)

    McGoron, Anthony J; Capille, Michael; Georgiou, Michael F; Sanchez, Pablo; Solano, Juan; Gonzalez-Brito, Manuel; Kuluz, John W

    2008-01-01

    Assessment of cerebral blood flow (CBF) by SPECT could be important in the management of patients with severe traumatic brain injury (TBI) because changes in regional CBF can affect outcome by promoting edema formation and intracranial pressure elevation (with cerebral hyperemia), or by causing secondary ischemic injury including post-traumatic stroke. The purpose of this study was to establish an improved method for evaluating regional CBF changes after TBI in piglets. The focal effects of moderate traumatic brain injury (TBI) on cerebral blood flow (CBF) by SPECT cerebral blood perfusion (CBP) imaging in an animal model were investigated by parallelized statistical techniques. Regional CBF was measured by radioactive microspheres and by SPECT 2 hours after injury in sham-operated piglets versus those receiving severe TBI by fluid-percussion injury to the left parietal lobe. Qualitative SPECT CBP accuracy was assessed against reference radioactive microsphere regional CBF measurements by map reconstruction, registration and smoothing. Cerebral hypoperfusion in the test group was identified at the voxel level using statistical parametric mapping (SPM). A significant area of hypoperfusion (P < 0.01) was found as a response to the TBI. Statistical mapping of the reference microsphere CBF data confirms a focal decrease found with SPECT and SPM. The suitability of SPM for application to the experimental model and ability to provide insight into CBF changes in response to traumatic injury was validated by the SPECT SPM result of a decrease in CBP at the left parietal region injury area of the test group. Further study and correlation of this characteristic lesion with long-term outcomes and auxiliary diagnostic modalities is critical to developing more effective critical care treatment guidelines and automated medical imaging processing techniques

  18. Recent developments in multi-parametric three-dimensional stress field representation in plates weakened by cracks and notches

    Directory of Open Access Journals (Sweden)

    P. Lazzarin

    2013-07-01

    Full Text Available The paper deals with the three-dimensional nature and the multi-parametric representation of the stress field ahead of cracks and notches of different shape. Finite thickness plates are considered, under different loading conditions. Under certain hypotheses, the three-dimensional governing equations of elasticity can be reduced to a system where a bi-harmonic equation and a harmonic equation have to be simultaneously satisfied. The former provides the solution of the corresponding plane notch problem, the latter provides the solution of the corresponding out-of-plane shear notch problem. The analytical frame is applied to some notched and cracked geometries and its degree of accuracy is discussed comparing theoretical results and numerical data from 3D FE models.
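
    As a compact restatement of the decomposition mentioned in the abstract, the sketch below writes the two governing equations in the familiar form in which the plane (notch) part is governed by a bi-harmonic function and the out-of-plane shear part by a harmonic function; the symbols used here are generic choices, not necessarily the paper's notation.

```latex
% Under the hypotheses recalled in the abstract, the 3D elasticity problem
% splits into a plane problem and an out-of-plane shear problem:
\begin{align}
  \nabla^{4}\varphi &= 0, && \text{(bi-harmonic: in-plane / plane-notch part)} \\
  \nabla^{2}\psi    &= 0. && \text{(harmonic: out-of-plane shear part)}
\end{align}
```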

  19. Optimizing strategy software for repetitive construction projects within multi-mode resources

    Directory of Open Access Journals (Sweden)

    Remon Fayek Aziz

    2013-09-01

    Full Text Available Estimating tender data for a specific project is the most essential part of construction work from the contractor's point of view, covering items such as the proposed project duration with corresponding gross value and cash flows. This paper focuses on how to calculate tender data using Optimizing Strategy Software (OSS) for repetitive construction projects with identical activity durations in the case of a single crew, namely: project duration, project/bid price, project maximum working capital, and project net present value of the studied project. A simplified multi-objective optimization software (OSS) is presented that creates the best tender data for the contractor compared with more feasible options generated from multi-mode resources in a given project. OSS is intended to give more scenarios which provide practical support for typical construction contractors who need to optimize resource utilization in order to minimize project duration, project/bid price, and project maximum working capital while simultaneously maximizing net present value. OSS is written in Java and provides a number of new and unique capabilities, including: (1) ranking the obtained optimal plans according to a set of planner-specified weights representing the relative importance of duration, price, maximum working capital and net present value in the analyzed project; (2) visualizing and viewing the generated optimal trade-off; and (3) providing seamless integration with available project management calculations. In order to provide the aforementioned capabilities of OSS, the system is implemented and developed in four main modules: (1) a user interface module; (2) a database module; (3) a running module; and (4) a connecting module. At the end of the paper, an illustrative example is presented to demonstrate and verify the application of the proposed software (OSS) to an expressway repetitive construction project.

  20. Multi parametrical indicator test for urban wastewater influence

    Science.gov (United States)

    Humer, Franko; Weiss, Stefan; Reinnicke, Sandra; Clara, Manfred; Grath, Johannes; Windhofer, Georg

    2013-04-01

    Austria's drinking water is abstracted from groundwater. While 50 % of the Austrian population are supplied with spring water, the other 50 % get their drinking water from groundwater supplies, in part from enormous quaternary valley and basin deposits, subjected to intensive use by population, industry, agriculture and traffic/transport. Due to protected areas around drinking water wells and springs, there is no treatment necessary in most cases. Water bodies, however, can be affected by different pathways from natural, industrial and urban sources. Identification of anthropogenic sources is paramount for taking appropriate measures to safeguard the quality of drinking water supply. Common parameters like boron are widely used as tracers indicating anthropogenic impacts (e.g. wastewater contamination of groundwater systems). Unfortunately application of these conventional indicators is often limited due to high dilution. Another application where common parameters have their limits is the identification and quantification of the diffuse nitrogen input to water by the stable isotopes of nitrogen and oxygen in nitrate. Without any additional tracers the source distinction of nitrate from manure or waste water is still difficult. Even the application of boron isotopes can in some cases not avoid ambiguous interpretation. Therefore the Umweltbundesamt (Environment Agency Austria) developed a multi parametrical indicator test which shall allow for identification and quantification of anthropogenic pollutions. The test aims at analysing eight target substances which are well known to occur in wastewater: Acesulfame and sucralose (two artificial, calorie-free sweeteners), benzotriazole and tolyltriazole (two industrial chemicals/corrosion inhibitors), metoprolol, sotalol, carbamazepine and the metabolite 10,11-Dihydro-10,11-dihydroxycarbamazepin (pharmaceuticals). These substances are polar and degradation in the aquatic system by microbiological processes is not

  1. FPGA-Based Efficient Hardware/Software Co-Design for Industrial Systems with Consideration of Output Selection

    Science.gov (United States)

    Deliparaschos, Kyriakos M.; Michail, Konstantinos; Zolotas, Argyrios C.; Tzafestas, Spyros G.

    2016-05-01

    This work presents a field programmable gate array (FPGA)-based embedded software platform coupled with a software-based plant, forming a hardware-in-the-loop (HIL) that is used to validate a systematic sensor selection framework. The systematic sensor selection framework combines multi-objective optimization, linear-quadratic-Gaussian (LQG)-type control, and the nonlinear model of a maglev suspension. A robustness analysis of the closed-loop is followed (prior to implementation) supporting the appropriateness of the solution under parametric variation. The analysis also shows that quantization is robust under different controller gains. While the LQG controller is implemented on an FPGA, the physical process is realized in a high-level system modeling environment. FPGA technology enables rapid evaluation of the algorithms and test designs under realistic scenarios avoiding heavy time penalty associated with hardware description language (HDL) simulators. The HIL technique facilitates significant speed-up in the required execution time when compared to its software-based counterpart model.

  2. NHL and RCGA Based Multi-Relational Fuzzy Cognitive Map Modeling for Complex Systems

    Directory of Open Access Journals (Sweden)

    Zhen Peng

    2015-11-01

    Full Text Available In order to model complex systems with multiple dimensions and multiple granularities, this paper first proposes a kind of multi-relational Fuzzy Cognitive Map (FCM) to simulate the multi-relational system, together with an automatic construction algorithm integrating Nonlinear Hebbian Learning (NHL) and a Real Code Genetic Algorithm (RCGA). The multi-relational FCM is suited to modeling a complex system with multiple dimensions and granularities. The automatic construction algorithm can learn the multi-relational FCM from multi-relational data resources, eliminating human intervention. The Multi-Relational Data Mining (MRDM) algorithm integrates multi-instance-oriented NHL and the RCGA of the FCM. NHL is extended to mine the causal relationships between a coarse-granularity concept and its fine-granularity concepts, driven by multiple instances in the multi-relational system. The RCGA is used to establish a high-quality, high-level FCM driven by data. The multi-relational FCM and the integrated algorithm have been applied to the complex system of Mutagenesis. The experiment demonstrates not only that they achieve better classification accuracy, but also that they reveal the causal relationships among the concepts of the system.
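
    Nonlinear Hebbian Learning, as invoked above, iteratively adjusts FCM edge weights from concept activation data. The snippet below is a generic single-relational, Oja-style nonlinear Hebbian update combined with a standard sigmoid FCM inference step; it is an illustration of the technique, not the multi-instance, multi-relational extension the paper proposes, and the weights, activations and learning rate are invented.

```python
import numpy as np

def sigmoid(x, lam=1.0):
    return 1.0 / (1.0 + np.exp(-lam * x))

def fcm_step(activations, weights, lam=1.0):
    """One FCM inference step: A(t+1) = f(A(t) + A(t) . W)."""
    return sigmoid(activations + activations @ weights, lam)

def nonlinear_hebbian_update(activations, weights, eta=0.01):
    """Generic Oja-style nonlinear Hebbian update of FCM edge weights.
    Only existing (non-zero) causal links are adapted."""
    a = activations
    new_w = weights.copy()
    nz = weights != 0.0
    # dw_ij ~ eta * a_j * (a_i - a_j * w_ij), applied to existing edges only
    delta = np.outer(a, a) - (a[np.newaxis, :] ** 2) * weights
    new_w[nz] += eta * delta[nz]
    return np.clip(new_w, -1.0, 1.0)

# Toy example: three concepts, two causal edges (W[i, j] = influence of i on j).
W = np.array([[0.0, 0.4, 0.0],
              [0.0, 0.0, -0.3],
              [0.0, 0.0, 0.0]])
A = np.array([0.6, 0.4, 0.5])
for _ in range(20):
    A = fcm_step(A, W)
    W = nonlinear_hebbian_update(A, W)
print(np.round(W, 3))
```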

  3. Use of the Newton-Raphson Algorithm to Build Drug Dose Determination Software

    Directory of Open Access Journals (Sweden)

    Ibnu Gunawan

    2009-01-01

    Full Text Available The USCPACK software from the University of Carolina is one of the pioneers of computerized drug dosage systems. This software uses a Bayesian method; the algorithm used in it is known as NPEM (Non-Parametric Expectation Maximization). After studying how USCPACK works, we built new software that serves the same purpose as USCPACK but uses a new algorithm different from NPEM. This paper describes how the software was built based on the NPAG algorithm. Abstract in Bahasa Indonesia (translated): The USCPACK software from the University of Carolina is one of the pioneers that made it possible to determine drug doses per given time unit for patients in general by computer. The software works with the basic Bayesian method; the algorithm it uses is NPEM (Non-Parametric Expectation Maximization). After studying how USCPACK works, drug dosing software was built using a non-parametric algorithm other than NPEM. This paper discusses the development of drug dosing software using the Newton-Raphson algorithm for computerized dose determination. Keywords: computerized dosing, optimization, Bayesian, NPEM, Newton-Raphson, USCPACK
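
    The Newton-Raphson element named in the title can be illustrated with a small root-finding example: choosing a dose so that a predicted concentration hits a target. The one-compartment model, parameter values and target below are illustrative assumptions only; they are not the models or parameters used by USCPACK or by the authors' software.

```python
import math

# Illustrative one-compartment IV bolus model: C(t) = (dose / V) * exp(-k * t)
V, k, t = 40.0, 0.2, 8.0        # volume (L), elimination rate (1/h), time (h)
target = 2.0                    # desired concentration (mg/L) at time t

def f(dose):
    """Residual between predicted and target concentration."""
    return (dose / V) * math.exp(-k * t) - target

def f_prime(dose):
    """Derivative of the residual with respect to dose."""
    return math.exp(-k * t) / V

# Newton-Raphson iteration: dose_{n+1} = dose_n - f(dose_n) / f'(dose_n)
dose = 100.0                    # initial guess (mg)
for _ in range(20):
    step = f(dose) / f_prime(dose)
    dose -= step
    if abs(step) < 1e-6:
        break

conc = (dose / V) * math.exp(-k * t)
print(f"dose ~= {dose:.2f} mg gives C({t} h) = {conc:.2f} mg/L")
```

    For this linear toy model the iteration converges in a single step; Newton-Raphson only becomes interesting for the nonlinear dose-response relations that real dosing software has to handle.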

  4. Influence of Variable Acceleration on Parametric Roll Motion of a Container Ship

    Directory of Open Access Journals (Sweden)

    Emre PEŞMAN

    2016-09-01

    Full Text Available Ship operators increase or decrease the thrust force of ships to avoid parametric roll motion. These operations cause varying acceleration values. In this study, the influence of variable acceleration and deceleration of ships on roll motion is investigated in longitudinal waves. The method referred to as the Simple Model is utilized for the analysis. The Simple Model is a one-degree-of-freedom nonlinear parametric roll motion equation which contains ship velocity and the restoring moment in waves varying with respect to time. Ship velocities in waves are predicted by the XFlow software for various thrust forces. Results indicate that variable acceleration has a significant effect on the parametric roll phenomenon.
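
    One common textbook way to write a one-degree-of-freedom parametric roll model is a damped Mathieu-type equation in which the restoring term is modulated at the wave encounter frequency. The sketch below integrates such an equation with invented coefficients; it is a generic illustration, not the Simple Model of the paper nor its XFlow velocity coupling.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative coefficients (all made up for demonstration)
omega_n = 0.35    # natural roll frequency (rad/s)
zeta = 0.05       # linear damping ratio
h = 0.6           # relative restoring (GM) variation amplitude in waves
omega_e = 0.70    # wave encounter frequency (~2 * omega_n: principal resonance)
alpha = 0.4       # cubic restoring coefficient

def roll_ode(t, y):
    phi, phi_dot = y
    restoring = omega_n**2 * (1.0 + h * np.cos(omega_e * t)) * phi + alpha * phi**3
    return [phi_dot, -2.0 * zeta * omega_n * phi_dot - restoring]

sol = solve_ivp(roll_ode, (0.0, 600.0), [0.05, 0.0], max_step=0.1)
print(f"max roll angle ~= {np.degrees(np.max(np.abs(sol.y[0]))):.1f} deg")
```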

  5. Multi-temporal Land Use Mapping of Coastal Wetlands Area using Machine Learning in Google Earth Engine

    Science.gov (United States)

    Farda, N. M.

    2017-12-01

    Coastal wetlands provide ecosystem services essential to people and the environment. Changes in coastal wetlands, especially in land use, are important to monitor by utilizing multi-temporal imagery. Google Earth Engine (GEE) provides many machine learning algorithms (10 algorithms) that are very useful for extracting land use from imagery. The research objective is to explore machine learning in Google Earth Engine and its accuracy for multi-temporal land use mapping of a coastal wetland area. Landsat 3 MSS (1978), Landsat 5 TM (1991), Landsat 7 ETM+ (2001), and Landsat 8 OLI (2014) images located in the Segara Anakan lagoon were selected to represent multi-temporal images. The inputs for machine learning are the visible and near infrared bands, PCA bands, inverse PCA bands, bare soil index, vegetation index, wetness index, elevation from ASTER GDEM, and GLCM (Haralick) texture, together with polygon samples at 140 locations. Ten machine learning algorithms were applied to extract coastal wetland land use from the Landsat imagery: Fast Naive Bayes, CART (Classification and Regression Tree), Random Forests, GMO Max Entropy, Perceptron (Multi Class Perceptron), Winnow, Voting SVM, Margin SVM, Pegasos (Primal Estimated sub-GrAdient SOlver for SVM), and IKPamir (Intersection Kernel Passive Aggressive Method for Information Retrieval, SVM). Machine learning in Google Earth Engine is very helpful for multi-temporal land use mapping; the highest accuracy for land use mapping of the coastal wetland was obtained with CART, at 96.98% overall accuracy using K-fold cross validation (K = 10). GEE is particularly useful for multi-temporal land use mapping with ready-to-use imagery and classification algorithms, and is also very challenging for other applications.
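
    As a rough desktop analogue of the GEE CART classification and the K-fold accuracy assessment reported above, the sketch below trains a CART-style decision tree on a table of per-sample features and estimates overall accuracy with 10-fold cross-validation using scikit-learn; the feature array and labels are random placeholders, not the Landsat-derived inputs of the study, and this is not the GEE implementation itself.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

# Placeholder data: 140 samples x 10 features (bands, indices, texture, elevation...)
rng = np.random.default_rng(0)
X = rng.normal(size=(140, 10))
y = rng.integers(0, 5, size=140)      # 5 hypothetical land-use classes

# CART-style classifier, evaluated with 10-fold cross-validation
clf = DecisionTreeClassifier(random_state=0)
scores = cross_val_score(clf, X, y, cv=10, scoring="accuracy")
print(f"overall accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```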

  6. Concept Maps as a strategy to asses learning in biochemistry using educational softwares

    Directory of Open Access Journals (Sweden)

    A. M. P. Azevedo

    2005-07-01

    Full Text Available This abstract reports  the  use of concept  maps applied  to the evaluation of concepts  learned  through the use of an educational software to study  metabolic  pathways called Diagrama Metabolico Dinamico Virtual  do Ciclo de Krebs (DMDV.  Experience  with the use of this method  was carried  through  with two distinct groups  of students.  The  first  group  was composed  by 24 students (in  2003 who used DMDV during  the  classes (computer room.  The second group was formed by 36 students (in 2004 who could access DMDV software anytime  through  the intranet. The construction of the conceptual map by the student permits  the representation of knowledge, the mental  processes that were absorved and the adaptation during the study,  building new mental schemes that could be related to the concept of reflexioning  abstraction (Piaget, 1995 during  the  process of operation  with  these  concepts.   The evaluation of knowlegde was made by the analysis  of three conceptual  maps constructed by each one of them:   (a  one map  before initiating the  study  with  DMDV,  (b  the  second just  after  the  study and (c the third  one two months  later.  We used the following criteria  for the analysis:  predominance of associative  over classificatory  character; correct concepts  and  relationships; coherence;  number  of relationships;  creativity and  logic.   The  initial  maps  showed  that all  students had  some  previous mental scheme  about  the proposed  concept.    All final  concept maps  showed  an  expansion  of the concepts  as compared  to the initial  maps, something  which can be seen even by a mere glance at the size of graphics.  A purely visual comparison  between the maps indicated  that new elements have been added.   The  associative  character has been shown to predominate as compared  to the  classificatory one.  The

  7. AlignerBoost: A Generalized Software Toolkit for Boosting Next-Gen Sequencing Mapping Accuracy Using a Bayesian-Based Mapping Quality Framework.

    Directory of Open Access Journals (Sweden)

    Qi Zheng

    2016-10-01

    Full Text Available Accurate mapping of next-generation sequencing (NGS) reads to reference genomes is crucial for almost all NGS applications and downstream analyses. Various repetitive elements in human and other higher eukaryotic genomes contribute in large part to ambiguously (non-uniquely) mapped reads. Most available NGS aligners attempt to address this by either removing all non-uniquely mapping reads, or reporting one random or "best" hit based on simple heuristics. Accurate estimation of the mapping quality of NGS reads is therefore critical albeit completely lacking at present. Here we developed a generalized software toolkit "AlignerBoost", which utilizes a Bayesian-based framework to accurately estimate mapping quality of ambiguously mapped NGS reads. We tested AlignerBoost with both simulated and real DNA-seq and RNA-seq datasets at various thresholds. In most cases, but especially for reads falling within repetitive regions, AlignerBoost dramatically increases the mapping precision of modern NGS aligners without significantly compromising the sensitivity even without mapping quality filters. When using higher mapping quality cutoffs, AlignerBoost achieves a much lower false mapping rate while exhibiting comparable or higher sensitivity compared to the aligner default modes, therefore significantly boosting the detection power of NGS aligners even using extreme thresholds. AlignerBoost is also SNP-aware, and higher quality alignments can be achieved if provided with known SNPs. AlignerBoost's algorithm is computationally efficient, and can process one million alignments within 30 seconds on a typical desktop computer. AlignerBoost is implemented as a uniform Java application and is freely available at https://github.com/Grice-Lab/AlignerBoost.

  8. AlignerBoost: A Generalized Software Toolkit for Boosting Next-Gen Sequencing Mapping Accuracy Using a Bayesian-Based Mapping Quality Framework.

    Science.gov (United States)

    Zheng, Qi; Grice, Elizabeth A

    2016-10-01

    Accurate mapping of next-generation sequencing (NGS) reads to reference genomes is crucial for almost all NGS applications and downstream analyses. Various repetitive elements in human and other higher eukaryotic genomes contribute in large part to ambiguously (non-uniquely) mapped reads. Most available NGS aligners attempt to address this by either removing all non-uniquely mapping reads, or reporting one random or "best" hit based on simple heuristics. Accurate estimation of the mapping quality of NGS reads is therefore critical albeit completely lacking at present. Here we developed a generalized software toolkit "AlignerBoost", which utilizes a Bayesian-based framework to accurately estimate mapping quality of ambiguously mapped NGS reads. We tested AlignerBoost with both simulated and real DNA-seq and RNA-seq datasets at various thresholds. In most cases, but especially for reads falling within repetitive regions, AlignerBoost dramatically increases the mapping precision of modern NGS aligners without significantly compromising the sensitivity even without mapping quality filters. When using higher mapping quality cutoffs, AlignerBoost achieves a much lower false mapping rate while exhibiting comparable or higher sensitivity compared to the aligner default modes, therefore significantly boosting the detection power of NGS aligners even using extreme thresholds. AlignerBoost is also SNP-aware, and higher quality alignments can be achieved if provided with known SNPs. AlignerBoost's algorithm is computationally efficient, and can process one million alignments within 30 seconds on a typical desktop computer. AlignerBoost is implemented as a uniform Java application and is freely available at https://github.com/Grice-Lab/AlignerBoost.
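
    Both AlignerBoost records above describe estimating the mapping quality of a read from the posterior probability of each candidate alignment. A minimal sketch of that general idea (not AlignerBoost's actual model) is shown below: alignment scores are converted to likelihoods under a flat prior, normalized into posteriors, and the best hit's posterior is expressed as a Phred-scaled mapping quality.

```python
import math

def mapping_quality(alignment_scores, scale=1.0):
    """Toy Bayesian-style mapping quality for one multi-mapped read.

    alignment_scores: list of alignment scores, one per candidate locus.
    Returns (index_of_best_hit, phred_scaled_mapq). Flat prior over loci.
    """
    # Convert scores to unnormalized likelihoods (softmax with a scale factor)
    m = max(alignment_scores)
    likes = [math.exp(scale * (s - m)) for s in alignment_scores]
    z = sum(likes)
    posteriors = [l / z for l in likes]

    best = max(range(len(posteriors)), key=posteriors.__getitem__)
    p_wrong = max(1.0 - posteriors[best], 1e-10)   # avoid log(0)
    mapq = min(60, int(round(-10.0 * math.log10(p_wrong))))
    return best, mapq

# A read with three candidate alignments of similar score maps ambiguously...
print(mapping_quality([95, 94, 93]))
# ...while a clearly best alignment gets a high mapping quality.
print(mapping_quality([95, 60, 55]))
```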

  9. Efficient approach for simulating response of multi-body structure in reactor core subjected to seismic loading

    International Nuclear Information System (INIS)

    Zhang Hongkun; Cen Song; Wang Haitao; Cheng Huanyu

    2012-01-01

    An efficient 3D approach is proposed for simulating the complicated responses of the multi-body structure in a reactor core under seismic loading. By utilizing the rigid-body and connector functions of the software Abaqus, the multi-body structure of the reactor core is simplified as a mass-point system interlinked by spring-dashpot connectors, and reasonable schemes are used for determining the various connector coefficients. Furthermore, a scripting program is compiled for 3D parametric modeling. Numerical examples show that the proposed method not only produces results which satisfy the engineering requirements, but also improves the computational efficiency by more than a factor of 100. (authors)

  10. Circulation and Directional Amplification in the Josephson Parametric Converter

    Science.gov (United States)

    Hatridge, Michael

    Nonreciprocal transport and directional amplification of weak microwave signals are fundamental ingredients in performing efficient measurements of quantum states of flying microwave light. This challenge has been partly met, as quantum-limited amplification is now regularly achieved with parametrically-driven, Josephson-junction based superconducting circuits. However, these devices are typically non-directional, requiring external circulators to separate incoming and outgoing signals. Recently this limitation has been overcome by several proposals and experimental realizations of both directional amplifiers and circulators based on interference between several parametric processes in a single device. This new class of multi-parametrically driven devices holds the promise of achieving a variety of desirable characteristics simultaneously: directionality, reduced gain-bandwidth constraints and quantum-limited added noise. Such devices are good candidates for on-chip integration with other superconducting circuits such as qubits.

  11. On Parametric (and Non-Parametric Variation

    Directory of Open Access Journals (Sweden)

    Neil Smith

    2009-11-01

    Full Text Available This article raises the issue of the correct characterization of ‘Parametric Variation’ in syntax and phonology. After specifying their theoretical commitments, the authors outline the relevant parts of the Principles–and–Parameters framework, and draw a three-way distinction among Universal Principles, Parameters, and Accidents. The core of the contribution then consists of an attempt to provide identity criteria for parametric, as opposed to non-parametric, variation. Parametric choices must be antecedently known, and it is suggested that they must also satisfy seven individually necessary and jointly sufficient criteria. These are that they be cognitively represented, systematic, dependent on the input, deterministic, discrete, mutually exclusive, and irreversible.

  12. Generation of a landslide risk index map for Cuba using spatial multi-criteria evaluation

    NARCIS (Netherlands)

    Castellanos Abella, E.A.

    2007-01-01

    This paper explains the procedure for the generation of a landslide risk index map at the national level in Cuba, using a semi-quantitative model with ten indicator maps and a cell size of 90 × 90 m. The model was designed and implemented using spatial multi-criteria evaluation techniques in a GIS system.

  13. A Design and Development of Multi-Purpose CCD Camera System with Thermoelectric Cooling: Software

    Directory of Open Access Journals (Sweden)

    S. H. Oh

    2007-12-01

    Full Text Available We present the software which we developed for the multi-purpose CCD camera. This software can be used with all three types of CCD - KAF-0401E (768×512), KAF-1602E (1536×1024) and KAF-3200E (2184×1472) made by KODAK Co. For efficient CCD camera control, the software is operated with two independent processes: the CCD control program and the temperature/shutter operation program. The software is designed for fully automatic as well as manual operation under the LINUX system, and is controlled by LINUX user signal procedures. We plan to use this software for an all sky survey system and also for night sky monitoring and sky observation. The read-out times of the CCDs are about 15 sec, 64 sec and 134 sec for KAF-0401E, KAF-1602E and KAF-3200E respectively, because these times are limited by the data transmission speed of the parallel port. Larger format CCDs require higher data transmission speeds, so we are considering adapting this control software to use the USB port for high speed data transmission.

  14. A software framework for real-time multi-modal detection of microsleeps.

    Science.gov (United States)

    Knopp, Simon J; Bones, Philip J; Weddell, Stephen J; Jones, Richard D

    2017-09-01

    A software framework is described which was designed to process EEG, video of one eye, and head movement in real time, towards achieving early detection of microsleeps for prevention of fatal accidents, particularly in transport sectors. The framework is based around a pipeline structure with user-replaceable signal processing modules. This structure can encapsulate a wide variety of feature extraction and classification techniques and can be applied to detecting a variety of aspects of cognitive state. Users of the framework can implement signal processing plugins in C++ or Python. The framework also provides a graphical user interface and the ability to save and load data to and from arbitrary file formats. Two small studies are reported which demonstrate the capabilities of the framework in typical applications: monitoring eye closure and detecting simulated microsleeps. While specifically designed for microsleep detection/prediction, the software framework can be just as appropriately applied to (i) other measures of cognitive state and (ii) development of biomedical instruments for multi-modal real-time physiological monitoring and event detection in intensive care, anaesthesiology, cardiology, neurosurgery, etc. The software framework has been made freely available for researchers to use and modify under an open source licence.
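
    The framework described above is built around a pipeline of user-replaceable signal-processing modules. The sketch below shows one simple way such a plugin pipeline can be expressed in Python (an abstract stage interface plus a pipeline that chains stages); it is a generic illustration of the architecture, not the actual C++/Python API of the reported software, and the stage names and data keys are invented.

```python
from abc import ABC, abstractmethod
from typing import Any, Dict, List

class Stage(ABC):
    """A user-replaceable processing module: consumes and returns a data dict."""
    @abstractmethod
    def process(self, data: Dict[str, Any]) -> Dict[str, Any]:
        ...

class BandpassFilter(Stage):
    def process(self, data):
        # Placeholder "filtering": real code would band-pass data["eeg"] here
        data["eeg_filtered"] = [x * 0.9 for x in data["eeg"]]
        return data

class ThresholdDetector(Stage):
    def __init__(self, threshold: float):
        self.threshold = threshold
    def process(self, data):
        data["event"] = max(data["eeg_filtered"]) > self.threshold
        return data

class Pipeline:
    """Runs each stage in order, passing the growing data dict along."""
    def __init__(self, stages: List[Stage]):
        self.stages = stages
    def run(self, data: Dict[str, Any]) -> Dict[str, Any]:
        for stage in self.stages:
            data = stage.process(data)
        return data

pipeline = Pipeline([BandpassFilter(), ThresholdDetector(threshold=0.8)])
print(pipeline.run({"eeg": [0.2, 0.5, 1.1, 0.3]}))
```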

  15. Multi-wavelength and multi-colour temporal and spatial optical solitons

    DEFF Research Database (Denmark)

    Kivshar, Y. S.; Sukhorukov, A. A.; Ostrovskaya, E. A.

    2000-01-01

    We present an overview of several novel types of multi-component envelope solitary waves that appear in fiber and waveguide nonlinear optics. In particular, we describe multi-channel solitary waves in bit-parallel-wavelength fiber transmission systems for high performance computer networks, multi-color parametric spatial solitary waves due to cascaded nonlinearities of quadratic materials, and quasiperiodic envelope solitons in Fibonacci optical superlattices.

  16. CLIP-seq analysis of multi-mapped reads discovers novel functional RNA regulatory sites in the human transcriptome.

    Science.gov (United States)

    Zhang, Zijun; Xing, Yi

    2017-09-19

    Crosslinking or RNA immunoprecipitation followed by sequencing (CLIP-seq or RIP-seq) allows transcriptome-wide discovery of RNA regulatory sites. As CLIP-seq/RIP-seq reads are short, existing computational tools focus on uniquely mapped reads, while reads mapped to multiple loci are discarded. We present CLAM (CLIP-seq Analysis of Multi-mapped reads). CLAM uses an expectation-maximization algorithm to assign multi-mapped reads and calls peaks combining uniquely and multi-mapped reads. To demonstrate the utility of CLAM, we applied it to a wide range of public CLIP-seq/RIP-seq datasets involving numerous splicing factors, microRNAs and m6A RNA methylation. CLAM recovered a large number of novel RNA regulatory sites inaccessible by uniquely mapped reads. The functional significance of these sites was demonstrated by consensus motif patterns and association with alternative splicing (splicing factors), transcript abundance (AGO2) and mRNA half-life (m6A). CLAM provides a useful tool to discover novel protein-RNA interactions and RNA modification sites from CLIP-seq and RIP-seq data, and reveals the significant contribution of repetitive elements to the RNA regulatory landscape of the human transcriptome. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
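
    CLAM's core idea, per the abstract, is an expectation-maximization re-assignment of multi-mapped reads. The sketch below shows the generic EM scheme often used for this problem: locus abundances are estimated from the current fractional read assignments, and assignments are then re-weighted by those abundances. It is an illustration of the technique, not CLAM's implementation.

```python
def em_assign(read_loci, n_loci, n_iter=50):
    """read_loci: for each read, the list of candidate locus indices.
    Returns per-locus expected read counts after EM re-assignment."""
    # Start with uniform fractional assignment of every multi-mapped read
    assign = [{loc: 1.0 / len(loci) for loc in loci} for loci in read_loci]
    for _ in range(n_iter):
        # M-step: locus "abundance" = expected number of assigned reads
        abundance = [1e-12] * n_loci
        for frac in assign:
            for loc, w in frac.items():
                abundance[loc] += w
        # E-step: re-assign each read proportionally to locus abundance
        for i, loci in enumerate(read_loci):
            total = sum(abundance[loc] for loc in loci)
            assign[i] = {loc: abundance[loc] / total for loc in loci}
    counts = [0.0] * n_loci
    for frac in assign:
        for loc, w in frac.items():
            counts[loc] += w
    return counts

# Reads 0-2 map uniquely to locus 0; read 3 maps to both loci, so EM pushes
# most of its weight toward the better-supported locus 0.
print(em_assign([[0], [0], [0], [0, 1]], n_loci=2))
```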

  17. Multivariable Parametric Cost Model for Ground Optical Telescope Assembly

    Science.gov (United States)

    Stahl, H. Philip; Rowell, Ginger Holmes; Reese, Gayle; Byberg, Alicia

    2004-01-01

    A parametric cost model for ground-based telescopes is developed using multi-variable statistical analysis of both engineering and performance parameters. While diameter continues to be the dominant cost driver, diffraction limited wavelength is found to be a secondary driver. Other parameters such as radius of curvature were examined. The model includes an explicit factor for primary mirror segmentation and/or duplication (i.e. multi-telescope phased-array systems). Additionally, single variable models based on aperture diameter were derived.

  18. Dynamic whole-body PET parametric imaging: II. Task-oriented statistical estimation.

    Science.gov (United States)

    Karakatsanis, Nicolas A; Lodge, Martin A; Zhou, Y; Wahl, Richard L; Rahmim, Arman

    2013-10-21

    In the context of oncology, dynamic PET imaging coupled with standard graphical linear analysis has been previously employed to enable quantitative estimation of tracer kinetic parameters of physiological interest at the voxel level, thus, enabling quantitative PET parametric imaging. However, dynamic PET acquisition protocols have been confined to the limited axial field-of-view (~15-20 cm) of a single-bed position and have not been translated to the whole-body clinical imaging domain. On the contrary, standardized uptake value (SUV) PET imaging, considered as the routine approach in clinical oncology, commonly involves multi-bed acquisitions, but is performed statically, thus not allowing for dynamic tracking of the tracer distribution. Here, we pursue a transition to dynamic whole-body PET parametric imaging, by presenting, within a unified framework, clinically feasible multi-bed dynamic PET acquisition protocols and parametric imaging methods. In a companion study, we presented a novel clinically feasible dynamic (4D) multi-bed PET acquisition protocol as well as the concept of whole-body PET parametric imaging employing Patlak ordinary least squares (OLS) regression to estimate the quantitative parameters of tracer uptake rate Ki and total blood distribution volume V. In the present study, we propose an advanced hybrid linear regression framework, driven by Patlak kinetic voxel correlations, to achieve superior trade-off between contrast-to-noise ratio (CNR) and mean squared error (MSE) than provided by OLS for the final Ki parametric images, enabling task-based performance optimization. Overall, whether the observer's task is to detect a tumor or quantitatively assess treatment response, the proposed statistical estimation framework can be adapted to satisfy the specific task performance criteria, by adjusting the Patlak correlation-coefficient (WR) reference value. The multi-bed dynamic acquisition protocol, as optimized in the preceding companion study
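
    The Patlak OLS step that both PET records build on fits, for each voxel, a straight line whose slope is the uptake rate Ki and whose intercept is the distribution volume V. A minimal voxel-wise sketch of that regression is shown below; the frame times, plasma input function and voxel curve are synthetic placeholders, and the correlation-weighted hybrid estimator proposed in the paper is not reproduced here.

```python
import numpy as np

# Synthetic mid-frame times (min), plasma input Cp(t) and one voxel TAC Ct(t)
t = np.array([10., 15., 20., 25., 30., 40., 50., 60.])
cp = 100.0 * np.exp(-0.05 * t) + 5.0
cp_int = np.concatenate(([0.0],
                         np.cumsum(0.5 * (cp[1:] + cp[:-1]) * np.diff(t))))  # trapezoid
ki_true, v_true = 0.02, 0.4
ct = ki_true * cp_int + v_true * cp          # Patlak-linear voxel curve (noise-free)

# Patlak transform: y = Ct/Cp, x = integral(Cp)/Cp; an OLS line gives (Ki, V)
x, y = cp_int / cp, ct / cp
ki_hat, v_hat = np.polyfit(x, y, 1)          # slope, intercept
print(f"Ki = {ki_hat:.4f} /min, V = {v_hat:.3f}")
```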

  19. Dynamic whole-body PET parametric imaging: II. Task-oriented statistical estimation

    International Nuclear Information System (INIS)

    Karakatsanis, Nicolas A; Lodge, Martin A; Zhou, Y; Wahl, Richard L; Rahmim, Arman

    2013-01-01

    In the context of oncology, dynamic PET imaging coupled with standard graphical linear analysis has been previously employed to enable quantitative estimation of tracer kinetic parameters of physiological interest at the voxel level, thus, enabling quantitative PET parametric imaging. However, dynamic PET acquisition protocols have been confined to the limited axial field-of-view (∼15–20 cm) of a single-bed position and have not been translated to the whole-body clinical imaging domain. On the contrary, standardized uptake value (SUV) PET imaging, considered as the routine approach in clinical oncology, commonly involves multi-bed acquisitions, but is performed statically, thus not allowing for dynamic tracking of the tracer distribution. Here, we pursue a transition to dynamic whole-body PET parametric imaging, by presenting, within a unified framework, clinically feasible multi-bed dynamic PET acquisition protocols and parametric imaging methods. In a companion study, we presented a novel clinically feasible dynamic (4D) multi-bed PET acquisition protocol as well as the concept of whole-body PET parametric imaging employing Patlak ordinary least squares (OLS) regression to estimate the quantitative parameters of tracer uptake rate K i and total blood distribution volume V. In the present study, we propose an advanced hybrid linear regression framework, driven by Patlak kinetic voxel correlations, to achieve superior trade-off between contrast-to-noise ratio (CNR) and mean squared error (MSE) than provided by OLS for the final K i parametric images, enabling task-based performance optimization. Overall, whether the observer's task is to detect a tumor or quantitatively assess treatment response, the proposed statistical estimation framework can be adapted to satisfy the specific task performance criteria, by adjusting the Patlak correlation-coefficient (WR) reference value. The multi-bed dynamic acquisition protocol, as optimized in the preceding companion

  20. Software Epistemology

    Science.gov (United States)

    2016-03-01

    in-vitro decision to incubate a startup, Lexumo [7], which is developing a commercial Software as a Service (SaaS) vulnerability assessment... Acronyms: LTS - Label Transition System; MUSE - Mining and Understanding Software Enclaves; RTEMS - Real-Time Executive for Multi-processor Systems; SaaS - Software as a Service; SSA - Static Single Assignment; SWE - Software Epistemology; UD/DU - Def-Use/Use-Def Chains (Dataflow Graph)

  1. Design of a Multi-layer Lane-Level Map for Vehicle Route Planning

    Directory of Open Access Journals (Sweden)

    Liu Chaoran

    2017-01-01

    Full Text Available With the development of intelligent transportation systems, there is a growing demand for high precision localization and route planning, which the traditional road-level map fails to meet; this is the motivation for the present paper. In this paper, the three-layer lane-level map architecture for vehicle path guidance is established, and the mathematical models of the road-level layer, intermediate layer and lane-level layer are designed considering efficiency and precision. The geometric model of the lane-level layer of the map is characterized by Cubic Hermite Splines for continuity. A method of generating the lane geometry with fixed and variable control points is proposed, which can effectively ensure the accuracy with a limited number of control points. In the experimental part, a multi-layer map of an intersection is built to validate the map model, and an example of a local map is generated with the lane-level geometry.
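
    The lane-level geometric model above uses cubic Hermite splines, which interpolate between control points while matching prescribed tangents at each end, giving the continuity the abstract refers to. The sketch below evaluates one 2D Hermite segment with the standard basis functions; the control points and tangents are arbitrary examples, not data from the paper's map.

```python
import numpy as np

def hermite_segment(p0, p1, m0, m1, n=20):
    """Evaluate a cubic Hermite segment between points p0, p1 with tangents m0, m1."""
    p0, p1, m0, m1 = map(np.asarray, (p0, p1, m0, m1))
    t = np.linspace(0.0, 1.0, n)[:, None]
    h00 = 2*t**3 - 3*t**2 + 1          # standard cubic Hermite basis functions
    h10 = t**3 - 2*t**2 + t
    h01 = -2*t**3 + 3*t**2
    h11 = t**3 - t**2
    return h00*p0 + h10*m0 + h01*p1 + h11*m1

# One hypothetical lane segment: a 50 m stretch with a gentle left curve
pts = hermite_segment(p0=(0.0, 0.0), p1=(50.0, 5.0),
                      m0=(50.0, 0.0), m1=(50.0, 15.0))
print(pts[:3])   # first few sampled (x, y) points along the lane centerline
```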

  2. Application of an integrated multi-criteria decision making AHP-TOPSIS methodology for ETL software selection.

    Science.gov (United States)

    Hanine, Mohamed; Boutkhoum, Omar; Tikniouine, Abdessadek; Agouti, Tarik

    2016-01-01

    A range of ETL (Extract, Transform and Load) software is available, constituting a major investment market. Each ETL tool uses its own techniques for extracting, transforming and loading data into a data warehouse, which makes the task of evaluating ETL software very difficult. However, choosing the right ETL software is critical to the success or failure of any Business Intelligence project. As there are many factors impacting the selection of ETL software, the selection process is considered a complex multi-criteria decision making (MCDM) problem. In this study, an application of a decision-making methodology that employs two well-known MCDM techniques, namely the Analytic Hierarchy Process (AHP) and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), is designed. In this respect, the aim of using AHP is to analyze the structure of the ETL software selection problem and obtain the weights of the selected criteria. Then, the TOPSIS technique is used to calculate the alternatives' ratings. An example is given to illustrate the proposed methodology. Finally, a software prototype demonstrating both methods is implemented.
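
    The TOPSIS step referred to above ranks alternatives by their closeness to an ideal solution. The sketch below implements the textbook version of that calculation (vector normalization, weighting, ideal and anti-ideal points, closeness coefficient); the decision matrix, weights and benefit/cost flags are made-up values for three hypothetical ETL tools, not the study's data, and the weights would normally come from the AHP step.

```python
import numpy as np

# Rows = alternatives (ETL tools), columns = criteria
X = np.array([[7.0, 5.0, 300.0],
              [9.0, 4.0, 450.0],
              [6.0, 8.0, 250.0]])
weights = np.array([0.5, 0.3, 0.2])        # e.g. taken from an AHP step
benefit = np.array([True, True, False])    # the last column is a cost criterion

# 1) vector-normalize, 2) apply the criteria weights
R = X / np.linalg.norm(X, axis=0)
V = R * weights

# 3) ideal and anti-ideal points per criterion
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti = np.where(benefit, V.min(axis=0), V.max(axis=0))

# 4) distances to both points and the closeness coefficient (higher = better)
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)
print("ranking (best first):", np.argsort(-closeness))
```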

  3. Contribution to data acquisition software of Eurogram and Diamant multi detectors in an Unix/VXWorks environment

    International Nuclear Information System (INIS)

    Diarra, C.

    1994-06-01

    Questions about nuclear matter require new, high-performance equipment. Eurogram is a 4π gamma-radiation multi-detector and a precious tool in gamma spectroscopy, but it is necessary to use a charged-particle detector as well, and for this purpose Diamant is a partner of Eurogram. These two multi-detectors needed dedicated data acquisition software systems. The whole of the acquisition control and management is based on Sun stations running the Unix system. 56 figs., 64 refs

  4. Fine mapping of multiple QTL using combined linkage and linkage disequilibrium mapping – A comparison of single QTL and multi QTL methods

    Directory of Open Access Journals (Sweden)

    Meuwissen Theo HE

    2007-04-01

    Full Text Available Two previously described QTL mapping methods, which combine linkage analysis (LA) and linkage disequilibrium analysis (LD), were compared for their ability to detect and map multiple QTL. The methods were tested on five different simulated data sets in which the exact QTL positions were known. Every simulated data set contained two QTL, but the distances between these QTL were varied from 15 to 150 cM. The results show that the single QTL mapping method (LDLA) gave good results as long as the distance between the QTL was large (> 90 cM). When the distance between the QTL was reduced, the single QTL method had problems positioning the two QTL and tended to position only one QTL, i.e. a "ghost" QTL, in between the two real QTL positions. The multi QTL mapping method (MP-LDLA) gave good results for all evaluated distances between the QTL. For the large distances between the QTL (> 90 cM) the single QTL method more often positioned the QTL in the correct marker bracket, but considering the broader likelihood peaks of the single point method it could be argued that the multi QTL method was more precise. As the distances were reduced, the multi QTL method was clearly more accurate than the single QTL method. The two methods combine well, and together provide a good tool to position single or multiple QTL in practical situations, where the number of QTL and their positions are unknown.

  5. A multi-professional software tool for radiation therapy treatment verification

    International Nuclear Information System (INIS)

    Fox, Tim; Brooks, Ken; Davis, Larry

    1996-01-01

    Purpose: Verification of patient setup is important in conformal therapy because it provides a means of quality assurance for treatment delivery. Electronic portal imaging systems have led to software tools for performing digital comparison and verification of patient setup. However, these software tools are typically designed from a radiation oncologist's perspective even though treatment verification is a team effort involving oncologists, physicists, and therapists. A new software tool, Treatment Verification Tool (TVT), has been developed as an interactive, multi-professional application for reviewing and verifying treatment plan setup using conventional personal computers. This study will describe our approach to electronic treatment verification and demonstrate the features of TVT. Methods and Materials: TVT is an object-oriented software tool written in C++ using the PC-based Windows NT environment. The software utilizes the selection of a patient's images from a database. The software is also developed as a single window interface to reduce the amount of windows presented to the user. However, the user can select from four different possible views of the patient data. One of the views is side-by-side comparison of portal images (on-line portal images or digitized port film) with a prescription image (digitized simulator film or digitally reconstructed radiograph), and another view is a textual summary of the grades of each portal image. The grades of a portal image are assigned by a radiation oncologist using an evaluation method, and the physicists and therapists may only review these results. All users of TVT can perform image enhancement processes, measure distances, and perform semi-automated registration methods. An electronic dialogue can be established through a set of annotations and notes among the radiation oncologists and the technical staff. Results: Features of TVT include: 1) side-by-side comparison of portal images and a prescription image; 2

  6. Surgical planning of total hip arthroplasty: accuracy of computer-assisted EndoMap software in predicting component size

    International Nuclear Information System (INIS)

    Davila, Jesse A.; Kransdorf, Mark J.; Duffy, Gavan P.

    2006-01-01

    The purpose of our study was to assess the accuracy of computer-assisted templating in the surgical planning of patients undergoing total hip arthroplasty utilizing EndoMap software (Siemens AG, Medical Solutions, Erlangen, Germany). EndoMap software is an electronic program that uses DICOM images to analyze standard anteroposterior radiographs for determination of optimal prosthesis component size. We retrospectively reviewed the preoperative radiographs of 36 patients undergoing uncomplicated primary total hip arthroplasty, utilizing EndoMap software, Version VA20. DICOM anteroposterior radiographs were analyzed using standard manufacturer-supplied electronic templates to determine acetabular and femoral component sizes. No additional clinical information was reviewed. Acetabular and femoral component sizes were assessed by an orthopedic surgeon and two radiologists. Mean estimated component size was compared with component size as documented in operative reports. The mean estimated acetabular component size was 53 mm (range 48-60 mm), 1 mm larger than the mean implanted size of 52 mm (range 48-62 mm). Thirty-one of 36 acetabular component sizes (86%) were accurate within one size. The mean calculated femoral component size was 4 (range 2-7), 1 size smaller than the actual mean component size of 5 (range 2-9). Twenty-six of 36 femoral component sizes (72%) were accurate within one size, and accurate within two sizes in all but four cases (94%). EndoMap software predicted femoral component size well, with 72% within one component size of that used, and 94% within two sizes. Acetabular component size was predicted slightly better, with 86% within one component size and 94% within two component sizes. (orig.)

  7. Magnetic Multi-Scale Mapping to Characterize Anthropogenic Targets

    Science.gov (United States)

    Le Maire, P.; Munschy, M.

    2017-12-01

    The discovery of buried anthropic objects on construction sites can cause delays and/or danger for workers and for the public. Indeed, every year 500 tons of unexploded ordnance are discovered in France. Magnetic measurements are useful to localize magnetized objects. Moreover, it is the cheapest geophysical method, it does not impact the environment and it is relatively fast to perform. Fluxgate magnetometers (three components) are used to measure magnetic properties below the ground. These magnetic sensors are not absolute, so they need to be calibrated before the onset of the measurements. The advantage is that they allow magnetic compensation of the equipment attached to the sensor. The choice of this kind of sensor therefore gives the opportunity to install the equipment aboard different magnetized supports: boat, quad bike, unmanned aerial vehicle, aircraft, and so on. Indeed, this methodology permits magnetic mapping at different scales and different elevations above ground level. An old French aerial military plant was chosen to perform this multi-scale approach. The advantage of the site is that it contains a lot of different targets with variable sizes and depths, e.g. buildings, unexploded ordnance from the two world wars, trenches and pipes. By comparison between the different magnetic anomaly maps at different elevations, some of the geometric parameters of the magnetic sources can be characterized. The comparison between the measured maps at different elevations and the upward-continued map highlights the maximum distance at which a target can be detected.

  8. Post traumatic brain perfusion SPECT analysis using reconstructed ROI maps of radioactive microsphere derived cerebral blood flow and statistical parametric mapping

    Directory of Open Access Journals (Sweden)

    Gonzalez-Brito Manuel

    2008-02-01

    Full Text Available Background: Assessment of cerebral blood flow (CBF) by SPECT could be important in the management of patients with severe traumatic brain injury (TBI) because changes in regional CBF can affect outcome by promoting edema formation and intracranial pressure elevation (with cerebral hyperemia), or by causing secondary ischemic injury including post-traumatic stroke. The purpose of this study was to establish an improved method for evaluating regional CBF changes after TBI in piglets. Methods: The focal effects of moderate traumatic brain injury (TBI) on cerebral blood flow (CBF) by SPECT cerebral blood perfusion (CBP) imaging in an animal model were investigated by parallelized statistical techniques. Regional CBF was measured by radioactive microspheres and by SPECT 2 hours after injury in sham-operated piglets versus those receiving severe TBI by fluid-percussion injury to the left parietal lobe. Qualitative SPECT CBP accuracy was assessed against reference radioactive microsphere regional CBF measurements by map reconstruction, registration and smoothing. Cerebral hypoperfusion in the test group was identified at the voxel level using statistical parametric mapping (SPM). Results: A significant area of hypoperfusion (P < 0.01) was found as a response to the TBI. Conclusion: The suitability of SPM for application to the experimental model and its ability to provide insight into CBF changes in response to traumatic injury were validated by the SPECT SPM result of a decrease in CBP at the left parietal region injury area of the test group. Further study and correlation of this characteristic lesion with long-term outcomes and auxiliary diagnostic modalities is critical to developing more effective critical care treatment guidelines and automated medical imaging processing techniques.

  9. Automated detection of extended sources in radio maps: progress from the SCORPIO survey

    Science.gov (United States)

    Riggi, S.; Ingallinera, A.; Leto, P.; Cavallaro, F.; Bufano, F.; Schillirò, F.; Trigilio, C.; Umana, G.; Buemi, C. S.; Norris, R. P.

    2016-08-01

    Automated source extraction and parametrization represents a crucial challenge for the next-generation radio interferometer surveys, such as those performed with the Square Kilometre Array (SKA) and its precursors. In this paper, we present a new algorithm, called CAESAR (Compact And Extended Source Automated Recognition), to detect and parametrize extended sources in radio interferometric maps. It is based on a pre-filtering stage, allowing image denoising, compact source suppression and enhancement of diffuse emission, followed by an adaptive superpixel clustering stage for final source segmentation. A parametrization stage provides source flux information and a wide range of morphology estimators for post-processing analysis. We developed CAESAR in a modular software library, also including different methods for local background estimation and image filtering, along with alternative algorithms for both compact and diffuse source extraction. The method was applied to real radio continuum data collected at the Australian Telescope Compact Array (ATCA) within the SCORPIO project, a pathfinder of the Evolutionary Map of the Universe (EMU) survey at the Australian Square Kilometre Array Pathfinder (ASKAP). The source reconstruction capabilities were studied over different test fields in the presence of compact sources, imaging artefacts and diffuse emission from the Galactic plane and compared with existing algorithms. When compared to a human-driven analysis, the designed algorithm was found capable of detecting known target sources and regions of diffuse emission, outperforming alternative approaches over the considered fields.

  10. Parametric Optimization Design of Brake Block Based on Reverse Engineering

    Directory of Open Access Journals (Sweden)

    Jin Hua-wei

    2017-01-01

    Full Text Available As one of the key parts of an automotive brake, the performance of the brake block has a direct impact on the safety and comfort of cars. The brake block of a disc brake is modeled with reverse parameterization using reverse engineering software, and the reconstructed model is analyzed and optimized with CAE software. The scanned point cloud is processed in Geomagic Studio and the CAD model of the brake block is reconstructed with the parametric surface function of the software, then analyzed and optimized in Workbench. The example shows that the reverse parameterization method makes it quick to reconstruct the CAD model of parts and significantly reduces the part re-design and development cycle.

  11. Physics- and engineering knowledge-based geometry repair system for robust parametric CAD geometries

    OpenAIRE

    Li, Dong

    2012-01-01

    In modern multi-objective design optimisation, an effective geometry engine is becoming an essential tool and its performance has a significant impact on the entire process. Building a parametric geometry requires difficult compromises between the conflicting goals of robustness and flexibility. The work presents a solution for improving the robustness of parametric geometry models by capturing and modelling relative engineering knowledge into a surrogate model, and deploying it automatically...

  12. A retrospective critic Re-Debate on Stakeholders’ resistance checklist in software project management within multi-cultural, multi-ethnical and cosmopolitan society context: The Malaysian experience

    Directory of Open Access Journals (Sweden)

    Hamed Taherdoost

    2016-12-01

    Full Text Available Risks stemming from software projects have been extensively studied. However, software project risk management has rarely examined organizational risks within multi-cultural and multi-ethnical settings. Problems occur when the stakeholders' cultural and ethnical aspects are not addressed, especially in a multi-cultural, multi-ethnical, and cosmopolitan society such as Malaysia. To avoid analyzing something that has already been studied in detail, this study was conducted as an in-depth literature review based on keyword searches in subject-specific databases; journal articles published in reputed journals were reviewed. By employing Rumelt's resistance-to-change checklist and a culture gap tool, this paper develops an organizational risk framework considering cross-cultural and cross-ethnical critical factors in order to show how risks can be better comprehended and managed. The significance of bio-cultural dimensions was scrutinized as a vital criterion to be considered in the international project sphere, so that not only are the odds of project success increased but the risks can also be mitigated significantly. A review of the risk management process, Rumelt's checklist, and cultural issues in international project environments allows a better understanding of the importance of cultural dimensions in project spheres.

  13. Effect of thematic map misclassification on landscape multi-metric assessment.

    Science.gov (United States)

    Kleindl, William J; Powell, Scott L; Hauer, F Richard

    2015-06-01

    Advancements in remote sensing and computational tools have increased our awareness of large-scale environmental problems, thereby creating a need for monitoring, assessment, and management at these scales. Over the last decade, several watershed and regional multi-metric indices have been developed to assist decision-makers with planning actions of these scales. However, these tools use remote-sensing products that are subject to land-cover misclassification, and these errors are rarely incorporated in the assessment results. Here, we examined the sensitivity of a landscape-scale multi-metric index (MMI) to error from thematic land-cover misclassification and the implications of this uncertainty for resource management decisions. Through a case study, we used a simplified floodplain MMI assessment tool, whose metrics were derived from Landsat thematic maps, to initially provide results that were naive to thematic misclassification error. Using a Monte Carlo simulation model, we then incorporated map misclassification error into our MMI, resulting in four important conclusions: (1) each metric had a different sensitivity to error; (2) within each metric, the bias between the error-naive metric scores and simulated scores that incorporate potential error varied in magnitude and direction depending on the underlying land cover at each assessment site; (3) collectively, when the metrics were combined into a multi-metric index, the effects were attenuated; and (4) the index bias indicated that our naive assessment model may overestimate floodplain condition of sites with limited human impacts and, to a lesser extent, either over- or underestimated floodplain condition of sites with mixed land use.
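    The error-propagation idea behind the study can be sketched in a few lines: each pixel's mapped class is resampled according to an assumed confusion matrix, the metric is recomputed for every realization, and the spread and bias of the simulated scores are compared against the error-naive score. The confusion matrix, land-cover classes and "percent natural cover" metric below are illustrative assumptions, not the study's actual error model or MMI metrics.

        # Monte Carlo propagation of thematic-map misclassification error
        # into a simple landscape metric (toy example).
        import numpy as np

        rng = np.random.default_rng(42)

        # Toy land-cover map: 0 = natural, 1 = agriculture, 2 = developed.
        land_cover = rng.integers(0, 3, size=(200, 200))
        flat = land_cover.ravel()

        # Assumed confusion matrix: row = mapped class, columns = P(true class).
        confusion = np.array([[0.90, 0.07, 0.03],
                              [0.10, 0.85, 0.05],
                              [0.05, 0.05, 0.90]])

        def percent_natural(cover):
            """Example metric: fraction of pixels in the 'natural' class."""
            return float(np.mean(cover == 0))

        naive_score = percent_natural(flat)

        # Resample each pixel's true class conditional on its mapped class.
        n_sims = 500
        scores = np.empty(n_sims)
        for i in range(n_sims):
            simulated = np.empty_like(flat)
            for c in range(3):
                idx = np.where(flat == c)[0]
                simulated[idx] = rng.choice(3, size=idx.size, p=confusion[c])
            scores[i] = percent_natural(simulated)

        print(f"naive metric:          {naive_score:.3f}")
        print(f"simulated mean +/- sd: {scores.mean():.3f} +/- {scores.std():.3f}")
        print(f"estimated bias:        {scores.mean() - naive_score:+.3f}")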

  14. The BridgeDb framework: standardized access to gene, protein and metabolite identifier mapping services

    Directory of Open Access Journals (Sweden)

    Hanspers Kristina

    2010-01-01

    Full Text Available Abstract Background Many complementary solutions are available for the identifier mapping problem. This creates an opportunity for bioinformatics tool developers. Tools can be made to flexibly support multiple mapping services or mapping services could be combined to get broader coverage. This approach requires an interface layer between tools and mapping services. Results Here we present BridgeDb, a software framework for gene, protein and metabolite identifier mapping. This framework provides a standardized interface layer through which bioinformatics tools can be connected to different identifier mapping services. This approach makes it easier for tool developers to support identifier mapping. Mapping services can be combined or merged to support multi-omics experiments or to integrate custom microarray annotations. BridgeDb provides its own ready-to-go mapping services, both in webservice and local database forms. However, the framework is intended for customization and adaptation to any identifier mapping service. BridgeDb has already been integrated into several bioinformatics applications. Conclusion By uncoupling bioinformatics tools from mapping services, BridgeDb improves capability and flexibility of those tools. All described software is open source and available at http://www.bridgedb.org.
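    The design idea of an interface layer between tools and mapping services can be sketched as follows. This is not the BridgeDb API (BridgeDb itself is a Java framework with its own classes); every name below is invented purely to illustrate how uncoupling tools from services allows services to be swapped, stacked or merged.

        # Hypothetical sketch of an identifier-mapping interface layer in the
        # spirit of BridgeDb (NOT the BridgeDb API; all names are invented).
        from abc import ABC, abstractmethod

        class IDMapper(ABC):
            """Common interface that bioinformatics tools program against."""
            @abstractmethod
            def map_id(self, identifier: str, source: str, target: str) -> set[str]:
                ...

        class DictMapper(IDMapper):
            """A mapping service backed by an in-memory table, e.g. a custom
            microarray annotation loaded from file (target ignored in this toy)."""
            def __init__(self, table: dict[tuple[str, str], set[str]]):
                self._table = table
            def map_id(self, identifier, source, target):
                return self._table.get((source, identifier), set())

        class StackedMapper(IDMapper):
            """Combine several services to get broader coverage."""
            def __init__(self, *mappers: IDMapper):
                self._mappers = mappers
            def map_id(self, identifier, source, target):
                hits = set()
                for m in self._mappers:
                    hits |= m.map_id(identifier, source, target)
                return hits

        # Usage: a tool only sees the IDMapper interface, so services can be
        # swapped or merged without changing the tool's code.
        genes = DictMapper({("HGNC", "TP53"): {"ENSG00000141510"}})
        custom = DictMapper({("HGNC", "TP53"): {"affy:201746_at"}})
        mapper = StackedMapper(genes, custom)
        print(mapper.map_id("TP53", source="HGNC", target="Any"))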

  15. visPIG--a web tool for producing multi-region, multi-track, multi-scale plots of genetic data.

    Directory of Open Access Journals (Sweden)

    Matthew Scales

    Full Text Available We present VISual Plotting Interface for Genetics (visPIG; http://vispig.icr.ac.uk), a web application to produce multi-track, multi-scale, multi-region plots of genetic data. visPIG has been designed to allow users not well versed with mathematical software packages and/or programming languages such as R, Matlab®, Python, etc., to integrate data from multiple sources for interpretation and to easily create publication-ready figures. While web tools such as the UCSC Genome Browser or the WashU Epigenome Browser allow custom data uploads, such tools are primarily designed for data exploration. This is also true for the desktop-run Integrative Genomics Viewer (IGV). Other locally run data visualisation software such as Circos require significant computer skills of the user. The visPIG web application is a menu-based interface that allows users to upload custom data tracks and set track-specific parameters. Figures can be downloaded as PDF or PNG files. For sensitive data, the underlying R code can also be downloaded and run locally. visPIG is multi-track: it can display many different data types (e.g. association, functional annotation, intensity, interaction, heat map data, …). It also allows annotation of genes and other custom features in the plotted region(s). Data tracks can be plotted individually or on a single figure. visPIG is multi-region: it supports plotting multiple regions, be they kilo- or megabases apart or even on different chromosomes. Finally, visPIG is multi-scale: a sub-region of particular interest can be 'zoomed' in. We describe the various features of visPIG and illustrate its utility with examples. visPIG is freely available through http://vispig.icr.ac.uk under a GNU General Public License (GPLv3).

  16. Statistical parametric mapping in the detection of rCBF changes in mild Alzheimer's disease

    International Nuclear Information System (INIS)

    Rowe, C.; Barnden, L.; Boundy, K.; McKinnon, J.; Liptak, M.

    1998-01-01

    Full text: Reduction in temporoparietal regional cerebral blood flow (rCBF) is proportional to the degree of cognitive deficit in patients with Alzheimer's Disease (AD). The characteristic pattern is readily apparent in advanced disease but is often subtle in early stage AD, reducing the clinical value of SPECT in the management of this condition. We have previously reported that Statistical Parametric Mapping (SPM95) revealed significant temporoparietal hypoperfusion when 10 patients with mild AD (classified by the Clinical Dementia Rating Scale) were compared to 10 age matched normals. We have now begun to evaluate the sensitivity and specificity of SPM95 in individuals with mild AD by comparison to our bank of 39 normals (30 female, 9 male, age range 26 to 74, mean age 52). Preliminary results reveal low sensitivity (<40%) when the standard reference region for normalization (i.e. global brain counts) is used. Better results are expected from normalizing to the cerebellum or basal ganglia and this is under investigation. An objective method to improve the accuracy of rCBF imaging for the diagnosis of early AD would be very useful in clinical practice. This study will demonstrate whether SPM can fulfill this role

  17. SCT: Spinal Cord Toolbox, an open-source software for processing spinal cord MRI data.

    Science.gov (United States)

    De Leener, Benjamin; Lévy, Simon; Dupont, Sara M; Fonov, Vladimir S; Stikov, Nikola; Louis Collins, D; Callot, Virginie; Cohen-Adad, Julien

    2017-01-15

    For the past 25 years, the field of neuroimaging has witnessed the development of several software packages for processing multi-parametric magnetic resonance imaging (mpMRI) to study the brain. These software packages are now routinely used by researchers and clinicians, and have contributed to important breakthroughs for the understanding of brain anatomy and function. However, no software package exists to process mpMRI data of the spinal cord. Despite the numerous clinical needs for such advanced mpMRI protocols (multiple sclerosis, spinal cord injury, cervical spondylotic myelopathy, etc.), researchers have been developing specific tools that, while necessary, do not provide an integrative framework that is compatible with most usages and that is capable of reaching the community at large. This hinders cross-validation and the possibility to perform multi-center studies. In this study we introduce the Spinal Cord Toolbox (SCT), a comprehensive software dedicated to the processing of spinal cord MRI data. SCT builds on previously-validated methods and includes state-of-the-art MRI templates and atlases of the spinal cord, algorithms to segment and register new data to the templates, and motion correction methods for diffusion and functional time series. SCT is tailored towards standardization and automation of the processing pipeline, versatility, modularity, and it follows guidelines of software development and distribution. Preliminary applications of SCT cover a variety of studies, from cross-sectional area measures in large databases of patients, to the precise quantification of mpMRI metrics in specific spinal pathways. We anticipate that SCT will bring together the spinal cord neuroimaging community by establishing standard templates and analysis procedures. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. Nonlinear dynamics of parametrically driven particles in a Φ6 potential

    International Nuclear Information System (INIS)

    Tchawoua, C; Siewe Siewe, M; Tchatchueng, S; Moukam Kakmeni, F M

    2008-01-01

    A general parametrically excited mechanical system is considered. Approximate solutions are determined by applying the method of multiple time scales. It is shown that only combination parametric resonance of the additive type is possible for the system examined. For this case, the existence and stability properties of the fixed points of the averaged equations corresponding to the nontrivial periodic solutions of the original system are investigated. Thus, emphasis is placed on understanding the chaotic behaviour of the extended Duffing oscillator in the Φ6 potential under parametric excitation for a specific parameter choice. From the Melnikov-type technique, we obtain the conditions for the existence of homoclinic or heteroclinic bifurcation. Our analysis is carried out in the case of a triple well with a double hump which does not lead to unbounded motion; this analysis is complemented by numerical simulations from which we illustrate the fractality of the basins of attraction. The results show that the threshold amplitude of parametric excitation moves upwards as the parametric intensity increases. Numerical simulations including bifurcation diagrams, Lyapunov exponents, phase portraits and Poincaré maps are shown.
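    The abstract does not give the explicit equation of motion, so the sketch below uses a generic parametrically excited extended Duffing oscillator in a Φ6 potential; the functional form and every coefficient are assumptions (chosen here so the potential has a triple-well shape) meant only to illustrate the kind of simulation, stroboscopic sampling for a Poincaré map, that such studies rely on.

        # Parametrically excited extended Duffing oscillator in an assumed
        # Phi^6 potential; all coefficients are illustrative placeholders.
        import numpy as np
        from scipy.integrate import solve_ivp

        delta, w0 = 0.2, 1.0        # damping, linear stiffness
        alpha, beta = -1.0, 0.2     # cubic and quintic (Phi^6) terms
        f, Omega = 0.5, 2.0         # parametric drive amplitude and frequency

        def rhs(t, y):
            x, v = y
            # x'' + delta x' + (w0^2 + f cos(Omega t)) x + alpha x^3 + beta x^5 = 0
            return [v, -delta * v - (w0**2 + f * np.cos(Omega * t)) * x
                       - alpha * x**3 - beta * x**5]

        # Integrate and sample once per drive period to build a Poincare section.
        T = 2.0 * np.pi / Omega
        t_samples = np.arange(0, 500) * T
        sol = solve_ivp(rhs, (0.0, t_samples[-1]), [0.5, 0.0],
                        t_eval=t_samples, rtol=1e-8, atol=1e-10)

        poincare = sol.y[:, 100:]          # discard the transient
        print("Poincare section points:", poincare.shape[1])
        print("first few (x, v) samples:", np.round(poincare[:, :3].T, 4))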

  19. Dual parametrization of generalized parton distributions in two equivalent representations

    International Nuclear Information System (INIS)

    Müller, D.; Polyakov, M.V.; Semenov-Tian-Shansky, K.M.

    2015-01-01

    The dual parametrization and the Mellin-Barnes integral approach represent two frameworks for handling the double partial wave expansion of generalized parton distributions (GPDs) in the conformal partial waves and in the t-channel SO(3) partial waves. Within the dual parametrization framework, GPDs are represented as integral convolutions of forward-like functions whose Mellin moments generate the conformal moments of GPDs. The Mellin-Barnes integral approach is based on the analytic continuation of the GPD conformal moments to the complex values of the conformal spin. GPDs are then represented as the Mellin-Barnes-type integrals in the complex conformal spin plane. In this paper we explicitly show the equivalence of these two independently developed GPD representations. Furthermore, we clarify the notions of the J=0 fixed pole and the D-form factor. We also provide some insight into GPD modeling and map the phenomenologically successful Kumerički-Müller GPD model to the dual parametrization framework by presenting the set of the corresponding forward-like functions. We also build up the reparametrization procedure allowing to recast the double distribution representation of GPDs in the Mellin-Barnes integral framework and present the explicit formula for mapping double distributions into the space of double partial wave amplitudes with complex conformal spin.

  20. Whole-body direct 4D parametric PET imaging employing nested generalized Patlak expectation-maximization reconstruction

    NARCIS (Netherlands)

    Karakatsanis, Nicolas A.; Casey, Michael E.; Lodge, Martin A.; Rahmim, Arman; Zaidi, Habib

    2016-01-01

    Whole-body (WB) dynamic PET has recently demonstrated its potential in translating the quantitative benefits of parametric imaging to the clinic. Post-reconstruction standard Patlak (sPatlak) WB graphical analysis utilizes multi-bed multi-pass PET acquisition to produce quantitative WB images of the
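    The record is truncated, but for reference the standard Patlak graphical model that sPatlak analysis builds on is well known; the paper's generalized ("nested") variant extends it with a term accounting for tracer reversibility, whose exact form is not given in this record. In LaTeX:

        % Standard Patlak graphical model (background; not quoted from the paper).
        \[
          \frac{C_T(t)}{C_p(t)} \;=\; K_i\,\frac{\int_0^t C_p(\tau)\,\mathrm{d}\tau}{C_p(t)} \;+\; V,
          \qquad t > t^{*},
        \]

    where C_T is the tissue activity concentration, C_p the plasma input function, K_i the net uptake rate constant, V the apparent distribution volume, and t* the time after which the plot becomes linear.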

  1. Digital radiography: optimization of image quality and dose using multi-frequency software.

    Science.gov (United States)

    Precht, H; Gerke, O; Rosendahl, K; Tingberg, A; Waaler, D

    2012-09-01

    New developments in processing of digital radiographs (DR), including multi-frequency processing (MFP), allow optimization of image quality and radiation dose. This is particularly promising in children as they are believed to be more sensitive to ionizing radiation than adults. The aim was to examine whether the use of MFP software reduces the radiation dose without compromising quality at DR of the femur in 5-year-old-equivalent anthropomorphic and technical phantoms. A total of 110 images of an anthropomorphic phantom were acquired on a DR system (Canon DR with CXDI-50 C detector and MLT[S] software) and analyzed by three pediatric radiologists using Visual Grading Analysis. In addition, 3,500 images taken of a technical contrast-detail phantom (CDRAD 2.0) provided an objective image-quality assessment. Optimal image quality was maintained at a dose reduction of 61% with MLT(S)-optimized images. Even for images of diagnostic quality, MLT(S) provided a dose reduction of 88% as compared to the reference image. Software impact on image quality was found significant for dose (mAs), dynamic range dark region and frequency band. By optimizing image processing parameters, a significant dose reduction is possible without significant loss of image quality.

  2. The numerical model for parametric studies of forest haul roads pavements

    Directory of Open Access Journals (Sweden)

    Lenka Ševelová

    2010-01-01

    Full Text Available Forest road pavement structures are considered to be low-volume roads. These roads serve as a means of transporting wood and people. Besides, they are currently often used for recreational purposes. The construction of the pavements should be suitable for forest transportation irrespective of their low bearing capacity. These pavement structures are very specific because of the special unbound materials used in their construction. To meet the requirements of the pavement designs and simulation analysis, the FEM model in the software ANSYS was created. This paper compares two material models used to describe the behaviour of unbound materials. The first is linear elastic according to Hooke theory (H model) and the second is the nonlinear plastic Drucker-Prager model (D–P model). ANSYS software has been used to create a flexible model based on the principle of variable parameters. The model is parametric to allow repeated calculations useful for optimization analysis.

  3. Classification rates: non‐parametric versus parametric models using ...

    African Journals Online (AJOL)

    This research sought to establish whether non-parametric modeling achieves a higher correct classification ratio than a parametric model. The local likelihood technique was used to fit the data sets. The same sets of data were modeled using a parametric logit model, and the abilities of the two models to correctly predict the binary ...

  4. Evaluation of ictal brain SPET using statistical parametric mapping in temporal lobe epilepsy

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J.D.; Kim, H.-J.; Jeon, T.J.; Kim, M.J. [Div. of Nuclear Medicine, Yonsei University Medical College, Seoul (Korea); Lee, B.I.; Kim, O.J. [Dept. of Neurology, Yonsei University Medical College, Seoul (Korea)

    2000-11-01

    An automated voxel-based analysis of brain images using statistical parametric mapping (SPM) is accepted as a standard approach in the analysis of activation studies in positron emission tomography and functional magnetic resonance imaging. This study aimed to investigate whether or not SPM would increase the diagnostic yield of ictal brain single-photon emission tomography (SPET) in temporal lobe epilepsy (TLE). Twenty-one patients (age 27.14±5.79 years) with temporal lobe epilepsy (right in 8, left in 13) who had a successful seizure outcome after surgery and nine normal subjects were included in the study. The data of ictal and interictal brain SPET of the patients and baseline SPET of the normal control group were analysed using SPM96 software. The t statistic SPM(t) was transformed to SPM(Z) with various thresholds of P<0.05, 0.005 and 0.001, and corrected extent threshold P value of 0.05. The SPM data were compared with the conventional ictal and interictal subtraction method. On group comparison, ictal SPET showed increased uptake within the epileptogenic mesial temporal lobe. On single case analysis, ictal SPET images correctly lateralized the epileptogenic temporal lobe in 18 cases, falsely lateralized it in one and failed to lateralize it in two as compared with the mean image of the normal group at a significance level of P<0.05. Comparing the individual ictal images with the corresponding interictal group, 15 patients were correctly lateralized, one was falsely lateralized and four were not lateralized. At significance levels of P<0.005 and P<0.001, correct lateralization of the epileptogenic temporal lobe was achieved in 15 and 13 patients, respectively, as compared with the normal group. On the other hand, when comparison was made with the corresponding interictal group, only 7 out of 21 patients were correctly lateralized at the threshold of P<0.005 and five at P<0.001. The result of the subtraction method was close to the single case analysis on

  5. Evaluation of ictal brain SPET using statistical parametric mapping in temporal lobe epilepsy

    International Nuclear Information System (INIS)

    Lee, J.D.; Kim, H.-J.; Jeon, T.J.; Kim, M.J.; Lee, B.I.; Kim, O.J.

    2000-01-01

    An automated voxel-based analysis of brain images using statistical parametric mapping (SPM) is accepted as a standard approach in the analysis of activation studies in positron emission tomography and functional magnetic resonance imaging. This study aimed to investigate whether or not SPM would increase the diagnostic yield of ictal brain single-photon emission tomography (SPET) in temporal lobe epilepsy (TLE). Twenty-one patients (age 27.14±5.79 years) with temporal lobe epilepsy (right in 8, left in 13) who had a successful seizure outcome after surgery and nine normal subjects were included in the study. The data of ictal and interictal brain SPET of the patients and baseline SPET of the normal control group were analysed using SPM96 software. The t statistic SPM(t) was transformed to SPM(Z) with various thresholds of P<0.05, 0.005 and 0.001, and corrected extent threshold P value of 0.05. The SPM data were compared with the conventional ictal and interictal subtraction method. On group comparison, ictal SPET showed increased uptake within the epileptogenic mesial temporal lobe. On single case analysis, ictal SPET images correctly lateralized the epileptogenic temporal lobe in 18 cases, falsely lateralized it in one and failed to lateralize it in two as compared with the mean image of the normal group at a significance level of P<0.05. Comparing the individual ictal images with the corresponding interictal group, 15 patients were correctly lateralized, one was falsely lateralized and four were not lateralized. At significance levels of P<0.005 and P<0.001, correct lateralization of the epileptogenic temporal lobe was achieved in 15 and 13 patients, respectively, as compared with the normal group. On the other hand, when comparison was made with the corresponding interictal group, only 7 out of 21 patients were correctly lateralized at the threshold of P<0.005 and five at P<0.001. The result of the subtraction method was close to the single case analysis on

  6. A mixture model for robust point matching under multi-layer motion.

    Directory of Open Access Journals (Sweden)

    Jiayi Ma

    Full Text Available This paper proposes an efficient mixture model for establishing robust point correspondences between two sets of points under multi-layer motion. Our algorithm starts by creating a set of putative correspondences which can contain a number of false correspondences, or outliers, in addition to the true correspondences (inliers). Next we solve for correspondence by interpolating a set of spatial transformations on the putative correspondence set based on a mixture model, which involves estimating a consensus of inlier points whose matching follows a non-parametric geometrical constraint. We formulate this as a maximum a posteriori (MAP) estimation of a Bayesian model with hidden/latent variables indicating whether matches in the putative set are outliers or inliers. We impose non-parametric geometrical constraints on the correspondence, as a prior distribution, in a reproducing kernel Hilbert space (RKHS). MAP estimation is performed by the EM algorithm which, by also estimating the variance of the prior model (initialized to a large value), is able to obtain good estimates very quickly (e.g., avoiding many of the local minima inherent in this formulation). We further provide a fast implementation based on sparse approximation which can achieve a significant speed-up without much performance degradation. We illustrate the proposed method on 2D and 3D real images for sparse feature correspondence, as well as a publicly available dataset for shape matching. The quantitative results demonstrate that our method is robust to non-rigid deformation and multi-layer/large discontinuous motion.

  7. Parametric modeling for damped sinusoids from multiple channels

    DEFF Research Database (Denmark)

    Zhou, Zhenhua; So, Hing Cheung; Christensen, Mads Græsbøll

    2013-01-01

    The problem of parametric modeling for noisy damped sinusoidal signals from multiple channels is addressed. Utilizing the shift invariance property of the signal subspace, the number of distinct sinusoidal poles in the multiple channels is first determined. With the estimated number, the distinct frequencies and damping factors are then computed with the multi-channel weighted linear prediction method. The estimated sinusoidal poles are then matched to each channel according to the extreme value theory of distribution of random fields. Simulations are performed to show the performance advantages of the proposed multi-channel sinusoidal modeling methodology compared with existing methods.

  8. Regge parametrization of angular distributions for heavy-ion transfer reactions

    International Nuclear Information System (INIS)

    Carlson, B.V.; McVoy, K.W.

    1977-01-01

    A two-pole one-zero Regge parametrization of the l-window for transfer reactions is employed in conjunction with a chi-squared search program to obtain high-quality fits to a wide variety of transfer data. The data employed include both direct and multi-step transfers. (Auth.)

  9. A Systematic Mapping on Supporting Approaches for Requirements Traceability in the Context of Software Projects

    Directory of Open Access Journals (Sweden)

    MALCHER, P R.C.

    2015-12-01

    Full Text Available Requirements traceability is seen as a quality factor with regard to software development, being present in standards and quality models. In this context, several techniques, models, frameworks and tools have been used to support it. Thus, the purpose of this paper is to present a systematic mapping carried out in order to find in the literature approaches that support requirements traceability in the context of software projects, and to categorize the data found in order to demonstrate, by means of a reliable, accurate and auditable method, how this area has developed and which approaches are mainly used to implement it.

  10. How does Software Process Improvement Address Global Software Engineering?

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Diebold, Philipp; Münch, Jürgen

    2016-01-01

    For decades, Software Process Improvement (SPI) programs have been implemented, inter alia, to improve quality and speed of software development. To set up, guide, and carry out SPI projects, and to measure SPI state, impact, and success, a multitude of different SPI approaches and considerable experience are available. SPI addresses many aspects ranging from individual developer skills to entire organizations. It comprises for instance the optimization of specific activities in the software lifecycle as well as the creation of organization awareness and project culture. In the course of conducting a systematic mapping study on the state-of-the-art in SPI from a general perspective, we observed Global Software Engineering (GSE) becoming a topic of interest in recent years. Therefore, in this paper, we provide a detailed investigation of those papers from the overall systematic mapping study that were ...

  11. A unified theoretical framework for mapping models for the multi-state Hamiltonian.

    Science.gov (United States)

    Liu, Jian

    2016-11-28

    We propose a new unified theoretical framework to construct equivalent representations of the multi-state Hamiltonian operator and present several approaches for the mapping onto the Cartesian phase space. After mapping an F-dimensional Hamiltonian onto an F+1 dimensional space, creation and annihilation operators are defined such that the F+1 dimensional space is complete for any combined excitation. Commutation and anti-commutation relations are then naturally derived, which show that the underlying degrees of freedom are neither bosons nor fermions. This sets the scene for developing equivalent expressions of the Hamiltonian operator in quantum mechanics and their classical/semiclassical counterparts. Six mapping models are presented as examples. The framework also offers a novel way to derive mapping models such as the well-known Meyer-Miller model.
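    For reference (not quoted from the paper), the classical Meyer-Miller mapping Hamiltonian mentioned at the end of the abstract is commonly written, for an F-state electronic Hamiltonian with matrix elements H_nm, as:

        % Classical Meyer-Miller mapping Hamiltonian (standard form, for reference).
        \[
          H_{\mathrm{MM}}
          \;=\; \frac{1}{2}\sum_{n,m=1}^{F}
          \bigl(x_n x_m + p_n p_m - \gamma\,\delta_{nm}\bigr)\,H_{nm},
        \]

    where (x_n, p_n) are the Cartesian mapping variables associated with state n and γ is a zero-point-energy parameter (γ = 1 in the original Meyer-Miller model, often treated as adjustable in later variants).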

  12. Partnering with Youth to Map Their Neighborhood Environments: A Multi-Layered GIS Approach

    Science.gov (United States)

    Topmiller, Michael; Jacquez, Farrah; Vissman, Aaron T.; Raleigh, Kevin; Miller-Francis, Jenni

    2014-01-01

    Mapping approaches offer great potential for community-based participatory researchers interested in displaying youth perceptions and advocating for change. We describe a multi-layered approach for gaining local knowledge of neighborhood environments that engages youth as co-researchers and active knowledge producers. By integrating geographic information systems (GIS) with environmental audits, an interactive focus group, and sketch mapping, the approach provides a place-based understanding of physical activity resources from the situated experience of youth. Youth report safety and a lack of recreational resources as inhibiting physical activity. Maps reflecting youth perceptions aid policy-makers in making place-based improvements for youth neighborhood environments. PMID:25423245

  13. Parametric Cost Estimates for an International Competitive Edge

    International Nuclear Information System (INIS)

    Murphy, L.T.; Hickey, M.

    2006-01-01

    This paper summarizes the progress to date by CH2M HILL and the UKAEA in development of a parametric modelling capability for estimating the costs of large nuclear decommissioning projects in the United Kingdom (UK) and Europe. The ability to successfully apply parametric cost estimating techniques will be a key factor to commercial success in the UK and European multi-billion dollar waste management, decommissioning and environmental restoration markets. The most useful parametric models will be those that incorporate individual components representing major elements of work: reactor decommissioning, fuel cycle facility decommissioning, waste management facility decommissioning and environmental restoration. Models must be sufficiently robust to estimate indirect costs and overheads, permit pricing analysis and adjustment, and accommodate the intricacies of international monetary exchange, currency fluctuations and contingency. The development of a parametric cost estimating capability is also a key component in building a forward estimating strategy. The forward estimating strategy will enable the preparation of accurate and cost-effective out-year estimates, even when work scope is poorly defined or as yet indeterminate. Preparation of cost estimates for work outside the organizations current sites, for which detailed measurement is not possible and historical cost data does not exist, will also be facilitated. (authors)

  14. Multi-Parametric MRI and Texture Analysis to Visualize Spatial Histologic Heterogeneity and Tumor Extent in Glioblastoma.

    Science.gov (United States)

    Hu, Leland S; Ning, Shuluo; Eschbacher, Jennifer M; Gaw, Nathan; Dueck, Amylou C; Smith, Kris A; Nakaji, Peter; Plasencia, Jonathan; Ranjbar, Sara; Price, Stephen J; Tran, Nhan; Loftus, Joseph; Jenkins, Robert; O'Neill, Brian P; Elmquist, William; Baxter, Leslie C; Gao, Fei; Frakes, David; Karis, John P; Zwart, Christine; Swanson, Kristin R; Sarkaria, Jann; Wu, Teresa; Mitchell, J Ross; Li, Jing

    2015-01-01

    Genetic profiling represents the future of neuro-oncology but suffers from inadequate biopsies in heterogeneous tumors like Glioblastoma (GBM). Contrast-enhanced MRI (CE-MRI) targets enhancing core (ENH) but yields adequate tumor in only ~60% of cases. Further, CE-MRI poorly localizes infiltrative tumor within surrounding non-enhancing parenchyma, or brain-around-tumor (BAT), despite the importance of characterizing this tumor segment, which universally recurs. In this study, we use multiple texture analysis and machine learning (ML) algorithms to analyze multi-parametric MRI, and produce new images indicating tumor-rich targets in GBM. We recruited primary GBM patients undergoing image-guided biopsies and acquired pre-operative MRI: CE-MRI, Dynamic-Susceptibility-weighted-Contrast-enhanced-MRI, and Diffusion Tensor Imaging. Following image coregistration and region of interest placement at biopsy locations, we compared MRI metrics and regional texture with histologic diagnoses of high- vs low-tumor content (≥80% vs <80%). These image-derived models were used to characterize spatial histologic heterogeneity and to identify regional tumor-rich biopsy targets.

  15. Improving flood risk mapping in Italy: the FloodRisk open-source software

    Science.gov (United States)

    Albano, Raffaele; Mancusi, Leonardo; Craciun, Iulia; Sole, Aurelia; Ozunu, Alexandru

    2017-04-01

    Time and again, floods around the world illustrate the devastating impact they can have on societies. Furthermore, the expectation that the flood damages can increase over time with climate, land-use change and social growth in flood prone-areas has raised the public and other stakeholders' (governments, international organization, re-insurance companies and emergency responders) awareness for the need to manage risks in order to mitigate their causes and consequences. In this light, the choice of appropriate measures, the assessment of the costs and effects of such measures, and their prioritization are crucial for decision makers. As a result, a priori flood risk assessment has become a key part of flood management practices with the aim of minimizing the total costs related to the risk management cycle. In this context, The EU Flood Directive 2007/60 requires the delineation of flood risk maps on the bases of most appropriate and advanced tools, with particular attention on limiting required economic efforts. The main aim of these risk maps is to provide the required knowledge for the development of flood risk management plans (FRMPs) by considering both costs and benefits of alternatives and results from consultation with all interested parties. In this context, this research project developed a free and open-source (FOSS) GIS software, called FloodRisk, to operatively support stakeholders in their compliance with the FRMPs. FloodRisk aims to facilitate the development of risk maps and the evaluation and management of current and future flood risk for multi-purpose applications. This new approach overcomes the limits of the expert-drive qualitative (EDQ) approach currently adopted in several European countries, such as Italy, which does not permit a suitable evaluation of the effectiveness of risk mitigation strategies, because the vulnerability component cannot be properly assessed. Moreover, FloodRisk is also able to involve the citizens in the flood

  16. Rapid computation of single PET scan rest-stress myocardial blood flow parametric images by table look up.

    Science.gov (United States)

    Guehl, Nicolas J; Normandin, Marc D; Wooten, Dustin W; Rozen, Guy; Ruskin, Jeremy N; Shoup, Timothy M; Woo, Jonghye; Ptaszek, Leon M; Fakhri, Georges El; Alpert, Nathaniel M

    2017-09-01

    We have recently reported a method for measuring rest-stress myocardial blood flow (MBF) using a single, relatively short, PET scan session. The method requires two IV tracer injections, one to initiate rest imaging and one at peak stress. We previously validated absolute flow quantitation in ml/min/cc for standard bull's eye, segmental analysis. In this work, we extend the method for fast computation of rest-stress MBF parametric images. We provide an analytic solution to the single-scan rest-stress flow model which is then solved using a two-dimensional table lookup method (LM). Simulations were performed to compare the accuracy and precision of the lookup method with the original nonlinear method (NLM). Then the method was applied to 16 single scan rest/stress measurements made in 12 pigs: seven studied after infarction of the left anterior descending artery (LAD) territory, and nine imaged in the native state. Parametric maps of rest and stress MBF as well as maps of left (fLV) and right (fRV) ventricular spill-over fractions were generated. Regions of interest (ROIs) for 17 myocardial segments were defined in bull's eye fashion on the parametric maps. The mean of each ROI was then compared to the rest (K1r) and stress (K1s) MBF estimates obtained from fitting the 17 regional TACs with the NLM. In simulation, the LM performed as well as the NLM in terms of precision and accuracy. The simulation did not show that bias was introduced by the use of a predefined two-dimensional lookup table. In experimental data, parametric maps demonstrated good statistical quality and the LM was computationally much more efficient than the original NLM. Very good agreement was obtained between the mean MBF calculated on the parametric maps for each of the 17 ROIs and the regional MBF values estimated by the NLM (K1(map, LM) = 1.019 × K1(ROI, NLM) + 0.019, R² = 0.986; mean difference = 0.034 ± 0.036 mL/min/cc). We developed a table lookup method for fast
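    The two-dimensional table lookup itself can be sketched independently of the kinetic model: the forward model is evaluated once on a grid of candidate rest and stress flow values, and each voxel's measured quantities are then inverted by a grid search. The forward model, grid ranges and measured quantities below are placeholders, not the paper's analytic rest-stress solution.

        # Sketch of a 2-D table lookup: precompute two model-derived quantities
        # on a (K1_rest, K1_stress) grid, then invert a measured pair per voxel.
        import numpy as np

        k1r_grid = np.linspace(0.2, 2.0, 181)       # candidate rest flows
        k1s_grid = np.linspace(0.2, 4.0, 381)       # candidate stress flows

        def forward_model(k1r, k1s):
            """Placeholder mapping from flows to two measurable quantities
            (e.g. early and late uptake integrals); NOT the paper's model."""
            m1 = 1.0 - np.exp(-k1r) + 0.3 * k1s
            m2 = 0.5 * k1r + 1.0 - np.exp(-0.8 * k1s)
            return m1, m2

        # Precompute the lookup table once.
        K1R, K1S = np.meshgrid(k1r_grid, k1s_grid, indexing="ij")
        M1, M2 = forward_model(K1R, K1S)

        def lookup(m1_meas, m2_meas):
            """Return the (K1_rest, K1_stress) pair whose model output best
            matches the measurements (least-squares table search)."""
            cost = (M1 - m1_meas) ** 2 + (M2 - m2_meas) ** 2
            i, j = np.unravel_index(np.argmin(cost), cost.shape)
            return k1r_grid[i], k1s_grid[j]

        # Example voxel: generate measurements from known flows and recover them.
        true_flows = (0.9, 2.4)
        m1, m2 = forward_model(*true_flows)
        print("true:", true_flows, "-> recovered:", lookup(m1, m2))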

  17. Empirical validation of statistical parametric mapping for group imaging of fast neural activity using electrical impedance tomography.

    Science.gov (United States)

    Packham, B; Barnes, G; Dos Santos, G Sato; Aristovich, K; Gilad, O; Ghosh, A; Oh, T; Holder, D

    2016-06-01

    Electrical impedance tomography (EIT) allows for the reconstruction of internal conductivity from surface measurements. A change in conductivity occurs as ion channels open during neural activity, making EIT a potential tool for functional brain imaging. EIT images can have >10 000 voxels, which means statistical analysis of such images presents a substantial multiple testing problem. One way to optimally correct for these issues and still maintain the flexibility of complicated experimental designs is to use random field theory. This parametric method estimates the distribution of peaks one would expect by chance in a smooth random field of a given size. Random field theory has been used in several other neuroimaging techniques but never validated for EIT images of fast neural activity; such validation can be achieved using non-parametric techniques. Both parametric and non-parametric techniques were used to analyze a set of 22 images collected from 8 rats. Significant group activations were detected using both techniques (corrected p < 0.05). Both parametric and non-parametric analyses yielded similar results, although the latter was less conservative. These results demonstrate the first statistical analysis of such an image set and indicate that such an analysis is an approach for EIT images of neural activity.
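    The non-parametric validation referred to above is typically a max-statistic permutation (or sign-flip) test, which controls the family-wise error across voxels without random-field assumptions. The sketch below is a generic one-sample sign-flip test on toy images, not the authors' exact pipeline; the image size, effect size and permutation count are arbitrary.

        # Generic max-statistic sign-flip permutation test on toy group images.
        import numpy as np

        rng = np.random.default_rng(1)
        n_subjects, n_voxels = 22, 5000
        images = rng.normal(0.0, 1.0, (n_subjects, n_voxels))
        images[:, :50] += 0.8                       # a small "active" region

        def t_map(data):
            """One-sample t statistic at every voxel."""
            return data.mean(0) / (data.std(0, ddof=1) / np.sqrt(len(data)))

        observed = t_map(images)

        # Null distribution of the maximum t over voxels under sign flipping.
        n_perm = 2000
        max_null = np.empty(n_perm)
        for i in range(n_perm):
            signs = rng.choice([-1.0, 1.0], size=(n_subjects, 1))
            max_null[i] = t_map(images * signs).max()

        threshold = np.quantile(max_null, 0.95)     # FWE-corrected p < 0.05
        print(f"corrected threshold t = {threshold:.2f}")
        print("significant voxels:", int((observed > threshold).sum()))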

  18. Accelerated whole-brain multi-parameter mapping using blind compressed sensing.

    Science.gov (United States)

    Bhave, Sampada; Lingala, Sajan Goud; Johnson, Casey P; Magnotta, Vincent A; Jacob, Mathews

    2016-03-01

    To introduce a blind compressed sensing (BCS) framework to accelerate multi-parameter MR mapping, and demonstrate its feasibility in high-resolution, whole-brain T1ρ and T2 mapping. BCS models the evolution of magnetization at every pixel as a sparse linear combination of bases in a dictionary. Unlike compressed sensing, the dictionary and the sparse coefficients are jointly estimated from undersampled data. Large number of non-orthogonal bases in BCS accounts for more complex signals than low rank representations. The low degree of freedom of BCS, attributed to sparse coefficients, translates to fewer artifacts at high acceleration factors (R). From 2D retrospective undersampling experiments, the mean square errors in T1ρ and T2 maps were observed to be within 0.1% up to R = 10. BCS was observed to be more robust to patient-specific motion as compared to other compressed sensing schemes and resulted in minimal degradation of parameter maps in the presence of motion. Our results suggested that BCS can provide an acceleration factor of 8 in prospective 3D imaging with reasonable reconstructions. BCS considerably reduces scan time for multiparameter mapping of the whole brain with minimal artifacts, and is more robust to motion-induced signal changes compared to current compressed sensing and principal component analysis-based techniques. © 2015 Wiley Periodicals, Inc.
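    The core of BCS, jointly estimating a dictionary of temporal bases and sparse coefficients, can be illustrated on a toy, fully sampled problem; the actual reconstruction operates on undersampled (k,t)-space data and uses regularization details not given in the abstract, so the factor sizes, step size and penalty below are assumptions.

        # Toy blind-compressed-sensing factorization: fit X ≈ U V with sparse U
        # (coefficients) and a jointly estimated V (dictionary of temporal bases).
        import numpy as np

        rng = np.random.default_rng(0)
        n_pixels, n_time, n_atoms = 500, 40, 20
        X = rng.normal(size=(n_pixels, n_time))          # pixel-wise signal evolutions

        U = rng.normal(scale=0.1, size=(n_pixels, n_atoms))   # sparse coefficients
        V = rng.normal(size=(n_atoms, n_time))                # dictionary
        lam, step = 0.05, 1e-3                                 # L1 weight, ISTA step

        def soft_threshold(a, t):
            return np.sign(a) * np.maximum(np.abs(a) - t, 0.0)

        for it in range(200):
            R = U @ V - X                                # residual
            # Sparse-coefficient update: proximal gradient (ISTA) step on U.
            U = soft_threshold(U - step * R @ V.T, step * lam)
            # Dictionary update: regularized least squares for stability.
            V = np.linalg.solve(U.T @ U + 1e-6 * np.eye(n_atoms), U.T @ X)

        print("relative fit error:", np.linalg.norm(U @ V - X) / np.linalg.norm(X))
        print("fraction of nonzero coefficients:", float(np.mean(U != 0)))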

  19. Multi-Agent Based Beam Search for Real-Time Production Scheduling and Control Method, Software and Industrial Application

    CERN Document Server

    Kang, Shu Gang

    2013-01-01

    The Multi-Agent Based Beam Search (MABBS) method systematically integrates four major requirements of manufacturing production - representation capability, solution quality, computation efficiency, and implementation difficulty - within a unified framework to deal with the many challenges of complex real-world production planning and scheduling problems. Multi-agent Based Beam Search for Real-time Production Scheduling and Control introduces this method, together with its software implementation and industrial applications.  This book connects academic research with industrial practice, and develops a practical solution to production planning and scheduling problems. To simplify implementation, a reusable software platform is developed to build the MABBS method into a generic computation engine.  This engine is integrated with a script language, called the Embedded Extensible Application Script Language (EXASL), to provide a flexible and straightforward approach to representing complex real-world problems. ...

  20. Integrating Remote Sensing with Species Distribution Models; Mapping Tamarisk Invasions Using the Software for Assisted Habitat Modeling (SAHM).

    Science.gov (United States)

    West, Amanda M; Evangelista, Paul H; Jarnevich, Catherine S; Young, Nicholas E; Stohlgren, Thomas J; Talbert, Colin; Talbert, Marian; Morisette, Jeffrey; Anderson, Ryan

    2016-10-11

    Early detection of invasive plant species is vital for the management of natural resources and protection of ecosystem processes. The use of satellite remote sensing for mapping the distribution of invasive plants is becoming more common; however, conventional imaging software and classification methods have been shown to be unreliable. In this study, we test and evaluate the use of five species distribution model techniques fit with satellite remote sensing data to map invasive tamarisk (Tamarix spp.) along the Arkansas River in Southeastern Colorado. The models tested included boosted regression trees (BRT), Random Forest (RF), multivariate adaptive regression splines (MARS), generalized linear model (GLM), and Maxent. These analyses were conducted using a newly developed software package called the Software for Assisted Habitat Modeling (SAHM). All models were trained with 499 presence points, 10,000 pseudo-absence points, and predictor variables acquired from the Landsat 5 Thematic Mapper (TM) sensor over an eight-month period to distinguish tamarisk from native riparian vegetation using detection of phenological differences. From the Landsat scenes, we used individual bands and calculated Normalized Difference Vegetation Index (NDVI), Soil-Adjusted Vegetation Index (SAVI), and tasseled cap transformations. All five models identified current tamarisk distribution on the landscape successfully based on threshold independent and threshold dependent evaluation metrics with independent location data. To account for model specific differences, we produced an ensemble of all five models with map output highlighting areas of agreement and areas of uncertainty. Our results demonstrate the usefulness of species distribution models in analyzing remotely sensed data and the utility of ensemble mapping, and showcase the capability of SAHM in pre-processing and executing multiple complex models.
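    The two spectral indices named in the abstract are standard and easy to compute from the red and near-infrared bands; in the sketch below the band arrays are synthetic placeholders for Landsat 5 TM reflectance, and the soil-brightness factor L = 0.5 is the commonly used default rather than a value taken from the study.

        # Standard vegetation-index formulas used as predictors in the study
        # (band arrays here are placeholders for Landsat red/NIR reflectance).
        import numpy as np

        def ndvi(nir, red):
            """Normalized Difference Vegetation Index."""
            return (nir - red) / (nir + red + 1e-12)

        def savi(nir, red, L=0.5):
            """Soil-Adjusted Vegetation Index with soil-brightness factor L."""
            return (1.0 + L) * (nir - red) / (nir + red + L)

        rng = np.random.default_rng(3)
        red = rng.uniform(0.02, 0.25, size=(100, 100))   # toy reflectance images
        nir = rng.uniform(0.10, 0.55, size=(100, 100))

        print("NDVI range:", float(ndvi(nir, red).min()), float(ndvi(nir, red).max()))
        print("SAVI range:", float(savi(nir, red).min()), float(savi(nir, red).max()))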

  1. Integrating remote sensing with species distribution models; Mapping tamarisk invasions using the Software for Assisted Habitat Modeling (SAHM)

    Science.gov (United States)

    West, Amanda M.; Evangelista, Paul H.; Jarnevich, Catherine S.; Young, Nicholas E.; Stohlgren, Thomas J.; Talbert, Colin; Talbert, Marian; Morisette, Jeffrey; Anderson, Ryan

    2016-01-01

    Early detection of invasive plant species is vital for the management of natural resources and protection of ecosystem processes. The use of satellite remote sensing for mapping the distribution of invasive plants is becoming more common; however, conventional imaging software and classification methods have been shown to be unreliable. In this study, we test and evaluate the use of five species distribution model techniques fit with satellite remote sensing data to map invasive tamarisk (Tamarix spp.) along the Arkansas River in Southeastern Colorado. The models tested included boosted regression trees (BRT), Random Forest (RF), multivariate adaptive regression splines (MARS), generalized linear model (GLM), and Maxent. These analyses were conducted using a newly developed software package called the Software for Assisted Habitat Modeling (SAHM). All models were trained with 499 presence points, 10,000 pseudo-absence points, and predictor variables acquired from the Landsat 5 Thematic Mapper (TM) sensor over an eight-month period to distinguish tamarisk from native riparian vegetation using detection of phenological differences. From the Landsat scenes, we used individual bands and calculated Normalized Difference Vegetation Index (NDVI), Soil-Adjusted Vegetation Index (SAVI), and tasseled cap transformations. All five models identified current tamarisk distribution on the landscape successfully based on threshold independent and threshold dependent evaluation metrics with independent location data. To account for model specific differences, we produced an ensemble of all five models with map output highlighting areas of agreement and areas of uncertainty. Our results demonstrate the usefulness of species distribution models in analyzing remotely sensed data and the utility of ensemble mapping, and showcase the capability of SAHM in pre-processing and executing multiple complex models.

  2. Quantum theory of novel parametric devices

    International Nuclear Information System (INIS)

    Drummond, P.D.; Reid, M.D.; Dechoum, K.; Chaturvedi, S.; Olsen, M.; Kheruntsyan, K.; Bradley, A.

    2005-01-01

    While the parametric amplifier is a widely used and important source of entangled and squeezed photons, there are many possible ways to investigate the physics of intracavity parametric devices. The novel quantum theory of parametric devices in this talk will cover several new types of unconventional devices, including the following topics:
    - Critical intracavity paramp: we calculate intrinsic limits to entanglement of a quantum paramp, caused by nonlinear effects originating in phase noise of the pump.
    - Degenerate planar paramp: we obtain universal quantum critical fluctuations in a planar paramp device by mapping to the equations of magnetic Lifshitz points.
    - Nondegenerate planar paramp: the Mermin-Wagner theorem is used to demonstrate that there is no phase transition in the case of a nondegenerate planar device.
    - Coupled channel paramp: a robust and novel integrated entanglement source can be generated using type I waveguides coupled inside a cavity to generate spatial entanglement.
    - Cascade paramps: this possible 'GHZ-type' source is obtained by cascading successive down-conversion crystals inside the same cavity, giving two thresholds.
    - Parallel paramps: tripartite entanglement can be generated if three intracavity paramp crystals are operated in parallel, each idler mode acting as a signal for the next.
    Finally, we briefly treat the relevant experimental developments. (author)

  3. Multi-user software of radio therapeutical calculation using a computational network; Software multiusuario de calculo radioterapeutico usando una red de computo

    Energy Technology Data Exchange (ETDEWEB)

    Allaucca P, J.J.; Picon C, C.; Zaharia B, M. [Departamento de Radioterapia, Instituto de Enfermedades Neoplasicas, Av. Angamos Este 2520, Lima 34 (Peru)

    1998-12-31

    A hardware and software system has been designed for a radiotherapy department. It runs on a Novell network operating system platform, sharing the existing resources and those of the server; it is centralized, multi-user and offers greater safety. It addresses a variety of problems and calculation needs, patient procedures and administration; it is very fast and versatile, and it contains a set of menus and options which may be selected with the mouse, arrow keys or keyboard shortcuts. (Author)

  4. Multi-Sensor Integration to Map Odor Distribution for the Detection of Chemical Sources

    Directory of Open Access Journals (Sweden)

    Xiang Gao

    2016-07-01

    Full Text Available This paper addresses the problem of mapping the odor distribution derived from a chemical source using multi-sensor integration and reasoning system design. Odor localization is the problem of finding the source of an odor or other volatile chemical. Most localization methods require a mobile vehicle to follow an odor plume along its entire path, which is time-consuming and may be especially difficult in a cluttered environment. To solve both of the above challenges, this paper proposes a novel algorithm that combines data from odor and anemometer sensors and integrates sensor data collected at different positions. Initially, a multi-sensor integration method, together with the path of airflow, was used to map the pattern of odor particle movement. Then, more sensors are introduced at specific regions to determine the probable location of the odor source. Finally, the results of an odor source location simulation and a real experiment are presented.

  5. GRAPES: a software for parallel searching on biological graphs targeting multi-core architectures.

    Directory of Open Access Journals (Sweden)

    Rosalba Giugno

    Full Text Available Biological applications, from genomics to ecology, deal with graphs that represent the structure of interactions. Analyzing such data requires searching for subgraphs in collections of graphs. This task is computationally expensive. Even though multicore architectures, from commodity computers to more advanced symmetric multiprocessing (SMP), offer scalable computing power, currently published software implementations for indexing and graph matching are fundamentally sequential. As a consequence, such software implementations (i) do not fully exploit available parallel computing power and (ii) do not scale with respect to the size of graphs in the database. We present GRAPES, software for parallel searching on databases of large biological graphs. GRAPES implements a parallel version of well-established graph searching algorithms, and introduces new strategies which naturally lead to a faster parallel searching system especially for large graphs. GRAPES decomposes graphs into subcomponents that can be efficiently searched in parallel. We show the performance of GRAPES on representative biological datasets containing antiviral chemical compounds, DNA, RNA, proteins, protein contact maps and protein interaction networks.

  6. Cerebral gray matter volume losses in essential tremor: A case-control study using high resolution tissue probability maps.

    Science.gov (United States)

    Cameron, Eric; Dyke, Jonathan P; Hernandez, Nora; Louis, Elan D; Dydak, Ulrike

    2018-03-10

    Essential tremor (ET) is increasingly recognized as a multi-dimensional disorder with both motor and non-motor features. For this reason, imaging studies are more broadly examining regions outside the cerebellar motor loop. Reliable detection of cerebral gray matter (GM) atrophy requires optimized processing, adapted to high-resolution magnetic resonance imaging (MRI). We investigated cerebral GM volume loss in ET cases using automated segmentation of MRI T1-weighted images. MRI was acquired on 47 ET cases and 36 controls. Automated segmentation and voxel-wise comparisons of volume were performed using Statistical Parametric Mapping (SPM) software. To improve upon standard protocols, the high-resolution International Consortium for Brain Mapping (ICBM) 2009a atlas and tissue probability maps were used to process each subject image. Group comparisons were performed: all ET vs. Controls, ET with head tremor (ETH) vs. Controls, and severe ET vs. Controls. An analysis of variance (ANOVA) was performed between ET with and without head tremor and controls. Age, sex, and Montreal Cognitive Assessment (MoCA) score were regressed out from each comparison. We were able to consistently identify regions of cerebral GM volume loss in ET and in ET subgroups in the posterior insula, superior temporal gyri, cingulate cortex, inferior frontal gyri and other occipital and parietal regions. There were no significant increases in GM volume in ET in any comparisons with controls. This study, which uses improved methodologies, provides evidence that GM volume loss in ET is present beyond the cerebellum, and in fact, is widespread throughout the cerebrum as well. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. Integrated multi-sensor fusion for mapping and localization in outdoor environments for mobile robots

    Science.gov (United States)

    Emter, Thomas; Petereit, Janko

    2014-05-01

    An integrated multi-sensor fusion framework for localization and mapping for autonomous navigation in unstructured outdoor environments based on extended Kalman filters (EKF) is presented. The sensors for localization include an inertial measurement unit, a GPS, a fiber optic gyroscope, and wheel odometry. Additionally, a 3D LIDAR is used for simultaneous localization and mapping (SLAM). A 3D map is built while concurrently a localization in the so far established 2D map is estimated with the current scan of the LIDAR. Despite the longer run-time of the SLAM algorithm compared to the EKF update, a high update rate is still guaranteed by sophisticated joining and synchronizing of the two parallel localization estimators.
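    The predict/update cycle at the heart of such EKF-based fusion can be shown with a minimal example. The sketch below uses a linear constant-velocity model with GPS-like position updates (so the Jacobians reduce to the model matrices), and all state definitions, rates and noise values are invented for illustration rather than taken from the presented framework.

        # Minimal Kalman-filter predict/update skeleton (constant-velocity model
        # with low-rate GPS-like position measurements; purely illustrative).
        import numpy as np

        dt = 0.1
        F = np.array([[1, 0, dt, 0],
                      [0, 1, 0, dt],
                      [0, 0, 1,  0],
                      [0, 0, 0,  1]], float)        # motion model
        H = np.array([[1, 0, 0, 0],
                      [0, 1, 0, 0]], float)         # GPS measures position only
        Q = np.diag([1e-3, 1e-3, 1e-2, 1e-2])       # process noise
        R = np.diag([0.5, 0.5])                     # measurement noise

        x = np.zeros(4)                             # state: [px, py, vx, vy]
        P = np.eye(4)

        def predict(x, P):
            x = F @ x
            P = F @ P @ F.T + Q
            return x, P

        def update(x, P, z):
            y = z - H @ x                           # innovation
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
            x = x + K @ y
            P = (np.eye(4) - K @ H) @ P
            return x, P

        rng = np.random.default_rng(7)
        for step in range(50):
            x, P = predict(x, P)
            if step % 10 == 0:                      # GPS arrives at a lower rate
                z = np.array([step * dt, 0.0]) + rng.normal(0.0, 0.7, 2)
                x, P = update(x, P, z)
        print("final state estimate:", np.round(x, 2))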

  8. Multi-temporal maps of the Montaguto earth flow in southern Italy from 1954 to 2010

    Science.gov (United States)

    Guerriero, Luigi; Revellino, Paola; Coe, Jeffrey A.; Focareta, Mariano; Grelle, Gerardo; Albanese, Vincenzo; Corazza, Angelo; Guadagno, Francesco M.

    2013-01-01

    Historical movement of the Montaguto earth flow in southern Italy has periodically destroyed residences and farmland, and damaged the Italian National Road SS90 and the Benevento-Foggia National Railway. This paper provides maps from an investigation into the evolution of the Montaguto earth flow from 1954 to 2010. We used aerial photos, topographic maps, LiDAR data, satellite images, and field observations to produce multi-temporal maps. The maps show the spatial and temporal distribution of back-tilted surfaces, flank ridges, and normal, thrust, and strike-slip faults. Springs, creeks, and ponds are also shown on the maps. The maps provide a basis for interpreting how basal and lateral boundary geometries influence earth-flow behavior and surface-water hydrology.

  9. Large-Area, High-Resolution Tree Cover Mapping with Multi-Temporal SPOT5 Imagery, New South Wales, Australia

    Directory of Open Access Journals (Sweden)

    Adrian Fisher

    2016-06-01

    Full Text Available Tree cover maps are used for many purposes, such as vegetation mapping, habitat connectivity and fragmentation studies. Small remnant patches of native vegetation are recognised as ecologically important, yet they are underestimated in remote sensing products derived from Landsat. High spatial resolution sensors are capable of mapping small patches of trees, but their use in large-area mapping has been limited. In this study, multi-temporal Satellite pour l’Observation de la Terre 5 (SPOT5) High Resolution Geometrical data was pan-sharpened to 5 m resolution and used to map tree cover for the Australian state of New South Wales (NSW), an area of over 800,000 km². Complete coverages of SPOT5 panchromatic and multispectral data over NSW were acquired during four consecutive summers (2008–2011) for a total of 1256 images. After pre-processing, the imagery was used to model foliage projective cover (FPC), a measure of tree canopy density commonly used in Australia. The multi-temporal imagery, FPC models and 26,579 training pixels were used in a binomial logistic regression model to estimate the probability of each pixel containing trees. The probability images were classified into a binary map of tree cover using local thresholds, and then visually edited to reduce errors. The final tree map was then attributed with the mean FPC value from the multi-temporal imagery. Validation of the binary map based on visually assessed high resolution reference imagery revealed an overall accuracy of 88% (±0.51% standard error), while comparison against airborne lidar derived data also resulted in an overall accuracy of 88%. A preliminary assessment of the FPC map by comparing against 76 field measurements showed a very good agreement (r² = 0.90) with a root mean square error of 8.57%, although this may not be representative due to the opportunistic sampling design. The map represents a regionally consistent and locally relevant record of tree cover for NSW, and
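
    A toy illustration of the binomial logistic-regression step described above, assuming per-pixel multi-temporal features (e.g., stacked FPC estimates) and labelled training pixels; the synthetic feature values, labels, and the 0.5 cut-off are assumptions for the sketch, not the published workflow, which used locally varying thresholds and manual editing.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: rows = pixels, columns = multi-temporal features
# (e.g., modelled FPC for each of the four summers); labels: 1 = tree, 0 = non-tree.
rng = np.random.default_rng(0)
X_train = rng.uniform(0, 100, size=(500, 4))
y_train = (X_train.mean(axis=1) > 50).astype(int)   # placeholder labels

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# Probability of tree cover for new pixels, then a binary map via a threshold
# (0.5 here is only illustrative).
X_pixels = rng.uniform(0, 100, size=(10, 4))
p_tree = model.predict_proba(X_pixels)[:, 1]
tree_map = (p_tree >= 0.5).astype(np.uint8)
print(p_tree.round(2), tree_map)
```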

  10. Integrating Remote Sensing with Species Distribution Models; Mapping Tamarisk Invasions Using the Software for Assisted Habitat Modeling (SAHM)

    OpenAIRE

    West, Amanda M.; Evangelista, Paul H.; Jarnevich, Catherine S.; Young, Nicholas E.; Stohlgren, Thomas J.; Talbert, Colin; Talbert, Marian; Morisette, Jeffrey; Anderson, Ryan

    2016-01-01

    Early detection of invasive plant species is vital for the management of natural resources and protection of ecosystem processes. The use of satellite remote sensing for mapping the distribution of invasive plants is becoming more common; however, conventional imaging software and classification methods have been shown to be unreliable. In this study, we test and evaluate the use of five species distribution model techniques fit with satellite remote sensing data to map invasive tamarisk (Tama...

  11. Jansen-MIDAS: A multi-level photomicrograph segmentation software based on isotropic undecimated wavelets.

    Science.gov (United States)

    de Siqueira, Alexandre Fioravante; Cabrera, Flávio Camargo; Nakasuga, Wagner Massayuki; Pagamisse, Aylton; Job, Aldo Eloizo

    2018-01-01

    Image segmentation, the process of separating the elements within a picture, is frequently used for obtaining information from photomicrographs. Segmentation methods should be used with caution, since incorrect results can be misleading when interpreting regions of interest (ROI) and decrease the success rate of subsequent procedures. Multi-Level Starlet Segmentation (MLSS) and Multi-Level Starlet Optimal Segmentation (MLSOS) were developed as an alternative to general segmentation tools. These methods gave rise to Jansen-MIDAS, an open-source software package. A scientist can use it to obtain several segmentations of his or her photomicrographs. It is a reliable alternative for processing different types of photomicrographs: previous versions of Jansen-MIDAS were used to segment ROIs in photomicrographs of two different materials, with an accuracy superior to 89%. © 2017 Wiley Periodicals, Inc.
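
    A sketch of the isotropic undecimated ("starlet", à trous) wavelet decomposition that underlies MLSS/MLSOS, assuming the standard B3-spline kernel; the multi-level segmentation and the optimal-level selection of the actual software are not reproduced here.

```python
import numpy as np
from scipy.ndimage import convolve1d

def starlet_transform(image, levels=3):
    """Isotropic undecimated (a trous / starlet) wavelet decomposition.

    Returns the detail planes w_1..w_levels plus the final smooth plane,
    so that image == sum(details) + smooth.
    """
    h = np.array([1, 4, 6, 4, 1], dtype=float) / 16.0   # B3-spline kernel
    c = image.astype(float)
    details = []
    for j in range(levels):
        # Dilate the kernel by inserting 2**j - 1 zeros between taps ("holes")
        step = 2 ** j
        hj = np.zeros(len(h) + (len(h) - 1) * (step - 1))
        hj[::step] = h
        # Separable 2D smoothing
        smooth = convolve1d(convolve1d(c, hj, axis=0, mode='reflect'),
                            hj, axis=1, mode='reflect')
        details.append(c - smooth)   # wavelet (detail) plane at scale j+1
        c = smooth
    return details, c

# Example: decompose a random "photomicrograph" and check exact reconstruction
img = np.random.rand(64, 64)
details, smooth = starlet_transform(img, levels=3)
assert np.allclose(img, sum(details) + smooth)
```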

  12. DESIGN of MICRO CANTILEVER BEAM for VAPOUR DETECTION USING COMSOL MULTI PHYSICS SOFTWARE

    OpenAIRE

    Sivacoumar R; Parvathy JM; Pratishtha Deep

    2015-01-01

    This paper gives an overview of micro cantilever beams of various shapes and materials for vapour detection. The design, analysis and simulation of the micro cantilever beam are done for each shape. The simulation is done using COMSOL Multiphysics software with the structural mechanics and chemical modules. The simulation results of applied force and resulting eigenfrequencies will be analyzed for different beam structures. The vapour analysis is done using a flow cell that consists of chemical pill...

  13. Evaluation of seizure propagation on ictal brain SPECT using statistical parametric mapping in temporal lobe epilepsy

    International Nuclear Information System (INIS)

    Jeon, Tae Joo; Lee, Jong Doo; Kim, Hee Joung; Lee, Byung In; Kim, Ok Joon; Kim, Min Jung; Jeon, Jeong Dong

    1999-01-01

    Ictal brain SPECT has a high diagnostic sensitivity exceeding 90% in the localization of the seizure focus; however, it often shows increased uptake within extratemporal areas due to early propagation of the seizure discharge. This study aimed to evaluate seizure propagation on ictal brain SPECT in patients with temporal lobe epilepsy (TLE) using statistical parametric mapping (SPM). Twenty-one patients (age 27.14 ± 5.79 y) with temporal lobe epilepsy (right in 8, left in 13) who had a successful seizure outcome after surgery and nine normal controls were included. The data of ictal and interictal brain SPECT of the patients and baseline SPECT of the normal control group were analyzed using automatic image registration and SPM96 software. The statistical analysis was performed to compare the mean SPECT image of the normal group with each individual ictal SPECT, and each mean image of the interictal groups of right or left TLE with the individual ictal scans. The t statistic SPM [t] was transformed to SPM [Z] with a threshold of 1.64. The statistical results were displayed and rendered on reference 3-dimensional MRI images with a P value of 0.05 and an uncorrected extent threshold P value of 0.5 for SPM [Z]. SPM data demonstrated increased uptake within the epileptic lesion in 19 patients (90.4%). Among them, localized increased uptake confined to the epileptogenic lesion was seen in only 4 (19%), while 15 patients (71.4%) showed hyperperfusion within propagation sites. Bi-temporal hyperperfusion was observed in 11 of 19 patients (57.9%; 5 right, 6 left): higher uptake within the lesion than the contralateral side in 9, similar activity in 1, and higher uptake within the contralateral lobe in 1. Extra-temporal hyperperfusion was observed in 8 (2 right, 3 left, 3 bilateral): unilateral hyperperfusion within the epileptogenic temporal lobe and an extra-temporal area in 4, and bi-temporal with extra-temporal hyperperfusion in the remaining 4. Ictal brain SPECT is highly
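
    A conceptual sketch of the voxel-wise comparison behind such SPM analyses, assuming co-registered, intensity-normalized SPECT volumes and the Z > 1.64 cut-off mentioned above; the array shapes, noise model, and one-sided t-to-Z conversion are assumptions for illustration, not the SPM96 pipeline itself.

```python
import numpy as np
from scipy import stats

# Hypothetical co-registered, normalized SPECT volumes: a control group (n, X, Y, Z)
# and one ictal scan with a synthetic hyperperfusion focus.
rng = np.random.default_rng(1)
controls = rng.normal(100, 10, size=(9, 32, 32, 16))
ictal = rng.normal(100, 10, size=(32, 32, 16))
ictal[10:14, 10:14, 6:9] += 40

n = controls.shape[0]
mean_c = controls.mean(axis=0)
sd_c = controls.std(axis=0, ddof=1)

# Single-scan-vs-group t statistic per voxel, then transform to Z scores
t = (ictal - mean_c) / (sd_c * np.sqrt(1.0 + 1.0 / n))
p = stats.t.sf(t, df=n - 1)             # one-sided p-values
z = stats.norm.isf(p)                   # SPM[t] -> SPM[Z]

hyper = z > 1.64                        # threshold used in the study
print("supra-threshold voxels:", int(hyper.sum()))
```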

  14. Multi-Mission Power Analysis Tool (MMPAT) Version 3

    Science.gov (United States)

    Wood, Eric G.; Chang, George W.; Chen, Fannie C.

    2012-01-01

    The Multi-Mission Power Analysis Tool (MMPAT) simulates a spacecraft power subsystem including the power source (solar array and/or radioisotope thermoelectric generator), bus-voltage control, secondary battery (lithium-ion or nickel-hydrogen), thermostatic heaters, and power-consuming equipment. It handles multiple mission types including heliocentric orbiters, planetary orbiters, and surface operations. Being parametrically driven, along with its user-programmable features, can reduce or even eliminate any need for software modifications when configuring it for a particular spacecraft. It provides multiple levels of fidelity, thereby fulfilling the vast majority of a project's power simulation needs throughout the lifecycle. It can operate in a stand-alone mode with a graphical user interface, in batch mode, or as a library linked with other tools. This software can simulate all major aspects of a spacecraft power subsystem. It is parametrically driven to reduce or eliminate the need for a programmer. Added flexibility is provided through user-designed state models and table-driven parameters. MMPAT is designed to be used by a variety of users, such as power subsystem engineers for sizing power subsystem components; mission planners for adjusting mission scenarios using power profiles generated by the model; system engineers for performing system-level trade studies using the results of the model during the early design phases of a spacecraft; and operations personnel for high-fidelity modeling of the essential power aspect of the planning picture.

  15. The "neuro-mapping locator" software. A real-time intraoperative objective paraesthesia mapping tool to evaluate paraesthesia coverage of the painful zone in patients undergoing spinal cord stimulation lead implantation.

    Science.gov (United States)

    Guetarni, F; Rigoard, P

    2015-03-01

    Conventional spinal cord stimulation (SCS) generates paraesthesia, as the efficacy of this technique is based on the relationship between the paraesthesia provided by SCS on the painful zone and an analgesic effect on the stimulated zone. Although this basic postulate is based on clinical evidence, it is clear that this relationship has never been formally demonstrated by scientific studies. There is a need for objective evaluation tools ("transducers") to transpose electrical signals to clinical effects and to guide therapeutic choices. We have developed software at Poitiers University Hospital allowing real-time objective mapping of the paraesthesia generated by SCS lead placement and programming during the implantation procedure itself, on a touch screen interface. The purpose of this article is to describe this intraoperative mapping software, in terms of its concept and technical aspects. The Neuro-Mapping Locator (NML) software enables patients with failed back surgery syndrome, candidates for SCS lead implantation, to actively participate in the implantation procedure. Real-time geographical localization of the paraesthesia generated by a percutaneous or multicolumn surgical SCS lead implanted under awake anaesthesia allows lead programming, and possibly lead positioning, to be modified intraoperatively with the patient's cooperation. Software updates should enable us to refine objectives related to the use of this tool and minimize observational biases. The ultimate goals of the NML software should not be limited to optimizing one specific device implantation in a patient, but should also allow various stimulation strategies to be compared instantaneously, by characterizing new technical parameters such as "coverage efficacy" and "device specificity" in selected subgroups of patients. Another longer-term objective would be to organize these predictive factors into computer science ontologies, which could constitute robust and helpful data for device selection and programming

  16. Delay Bounded Multi-Source Multicast in Software-Defined Networking

    Directory of Open Access Journals (Sweden)

    Thabo Semong

    2018-01-01

    Full Text Available Software-Defined Networking (SDN) is the next-generation network architecture with exciting application prospects. The control function in SDN is decoupled from the data forwarding plane, hence it provides a new centralized architecture with flexible network resource management. Although SDN is attracting much attention from both industry and research, its advantage over traditional networks has not been fully utilized. Multicast is designed to deliver content to multiple destinations. Current traffic engineering in SDN focuses mainly on unicast; however, multicast can effectively reduce network resource consumption by serving multiple clients. This paper studies a novel delay-bounded multi-source multicast SDN problem, in which, among the set of potential sources, we select a source to build the multicast tree under the constraint that the transmission delay to every destination is bounded. This problem is more difficult than the traditional Steiner minimum tree (SMT) problem, since it needs to find a source from the set of all potential sources. We model the problem as a mixed-integer linear program (MILP) and prove its NP-hardness. To solve the problem, a delay-bounded multi-source (DBMS) scheme is proposed, which includes a DBMS algorithm to build a minimum-delay-cost DBMS-Forest. Through a MATLAB experiment, we demonstrate that DBMS is significantly more efficient and outperforms other existing algorithms in the literature.
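
    A simplified greedy stand-in for the delay-bounded multi-source multicast idea described above: for every candidate source, take shortest-delay paths to all destinations, keep the source only if every destination meets the delay bound, and pick the candidate whose resulting tree (the union of those paths) has the lowest total link cost. The paper's DBMS algorithm is more elaborate; the graph, delay bound, and greedy tree construction here are illustrative assumptions only.

```python
import heapq

def dijkstra(graph, src):
    """graph: {u: {v: (delay, cost)}} -> (delay-to-node, predecessor) maps."""
    dist, prev = {src: 0.0}, {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue
        for v, (delay, _cost) in graph.get(u, {}).items():
            nd = d + delay
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    return dist, prev

def best_source(graph, sources, destinations, delay_bound):
    best = None
    for s in sources:
        dist, prev = dijkstra(graph, s)
        if any(dist.get(d, float("inf")) > delay_bound for d in destinations):
            continue                       # delay bound violated for some destination
        edges = set()
        for d in destinations:             # union of shortest-delay paths = tree
            while d != s:
                edges.add((prev[d], d))
                d = prev[d]
        tree_cost = sum(graph[u][v][1] for u, v in edges)
        if best is None or tree_cost < best[1]:
            best = (s, tree_cost, edges)
    return best

# Tiny example topology: edge -> (delay, cost)
g = {"s1": {"a": (1, 2)}, "s2": {"a": (3, 1)},
     "a": {"d1": (1, 1), "d2": (2, 2)}, "d1": {}, "d2": {}}
print(best_source(g, ["s1", "s2"], ["d1", "d2"], delay_bound=4))
```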

  17. Hardware/Software Co-design for Heterogeneous Multi-core Platforms: The hArtes Toolchain

    CERN Document Server

    2012-01-01

    This book describes the results and outcome of the FP6 project known as hArtes, which focuses on the development of an integrated tool chain targeting a heterogeneous multi-core platform comprising a general-purpose processor (ARM or PowerPC), a DSP (the Diopsis) and an FPGA. The tool chain takes existing source code and proposes transformations and mappings such that legacy code can easily be ported to a modern, multi-core platform. Benefits of the hArtes approach, described in this book, include: Uses a familiar programming paradigm: hArtes proposes a familiar programming paradigm which is compatible with widely used programming practice, irrespective of the target platform. Enables users to view multiple cores as a single processor: the hArtes approach abstracts away the heterogeneity as well as the multi-core aspect of the underlying hardware so the developer can view the platform as consisting of a single, general-purpose processor. Facilitates easy porting of existing applications: hArtes provid...

  18. SPECT image analysis using statistical parametric mapping in patients with temporal lobe epilepsy associated with hippocampal sclerosis

    International Nuclear Information System (INIS)

    Shiraki, Junko

    2004-01-01

    The author examined interictal 123I-IMP SPECT images using statistical parametric mapping (SPM) in 19 temporal lobe epilepsy patients who showed hippocampal sclerosis on MRI. Decreased regional cerebral blood flow (rCBF) was shown in the medial temporal lobe for eight patients, in the lateral temporal lobe for six patients, and in both the medial and lateral temporal lobes for five patients. These patients were classified into two types: the medial type, with decreased rCBF only in the medial area, and the lateral type, with decreased rCBF in the other temporal areas. Regarding the correlation of rCBF with clinical parameters, age at seizure onset in the lateral type was significantly older (p=0.0098, t-test) than in the medial type. SPM analysis of interictal SPECT in temporal lobe epilepsy clarified the location of decreased rCBF and revealed correlations with clinical characteristics. In addition, SPM analysis of SPECT was useful for understanding the pathophysiology of the epilepsy. (author)

  19. On generic obstructions to recovering correct statistics from climate simulations: Homogenization for deterministic maps and multiplicative noise

    Science.gov (United States)

    Gottwald, Georg; Melbourne, Ian

    2013-04-01

    Whereas diffusion limits of stochastic multi-scale systems have a long and successful history, the case of constructing stochastic parametrizations of chaotic deterministic systems has been much less studied. We present rigorous results on the convergence of a chaotic slow-fast system to a stochastic differential equation with multiplicative noise. Furthermore, we present rigorous results for chaotic slow-fast maps, occurring as numerical discretizations of continuous-time systems. This raises the issue of how to interpret certain stochastic integrals; surprisingly, the resulting integrals of the stochastic limit system are generically neither of Stratonovich nor of Ito type in the case of maps. It is shown that the limit system of a numerical discretisation is different from the associated continuous-time system. This has important consequences when interpreting the statistics of long-time simulations of multi-scale systems - they may be very different from those of the original continuous-time system which we set out to study.

  20. OBJECT-SPACE MULTI-IMAGE MATCHING OF MOBILE-MAPPING-SYSTEM IMAGE SEQUENCES

    Directory of Open Access Journals (Sweden)

    Y. C. Chen

    2012-07-01

    Full Text Available This paper proposes an object-space multi-image matching procedure for terrestrial MMS (Mobile Mapping System) image sequences to determine the coordinates of an object point automatically and reliably. This image matching procedure can be applied to find conjugate points of MMS image sequences efficiently. Conventional area-based image matching methods are not reliable enough to deliver accurate matching results for this application due to image scale variations, viewing angle variations, and object occlusions. In order to deal with these three matching problems, an object-space multi-image matching is proposed. A modified NCC (Normalized Cross Correlation) coefficient is proposed to measure the similarity of image patches. A modified multi-window matching procedure will also be introduced to solve the problem of object occlusion. A coarse-to-fine procedure with a combination of object-space multi-image matching and multi-window matching is adopted. The proposed procedure has been implemented for the purpose of matching terrestrial MMS image sequences. The ratio of correct matches in this experiment was about 80%. By manually providing an approximate conjugate point in an overlapping image, most of the incorrect matches could be fixed properly and the ratio of correct matches was improved to 98%.
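
    For reference, a plain normalized cross-correlation (NCC) between two equally sized patches, the similarity measure that the paper modifies for its object-space multi-image matching; the authors' modified NCC and multi-window scheme are not reproduced here, and the example patches are random placeholders.

```python
import numpy as np

def ncc(patch_a, patch_b):
    """Return the NCC coefficient in [-1, 1] for two equally sized patches."""
    a = patch_a.astype(float).ravel()
    b = patch_b.astype(float).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

# Example: a patch correlates perfectly with itself and less with a noisy copy
rng = np.random.default_rng(0)
p = rng.random((11, 11))
print(ncc(p, p))                                     # 1.0
print(ncc(p, p + rng.normal(0, 0.2, p.shape)))       # < 1.0
```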

  1. Comparison of normal adult and children brain SPECT imaging using statistical parametric mapping (SPM)

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Myoung Hoon; Yoon, Seok Nam; Joh, Chul Woo; Lee, Dong Soo [Ajou University School of Medicine, Suwon (Korea, Republic of); Lee, Jae Sung [Seoul national University College of Medicine, Seoul (Korea, Republic of)

    2002-07-01

    This study compared rCBF patterns in normal adults and normal children using statistical parametric mapping (SPM). The purpose of this study was to determine distribution patterns not seen on visual analysis in the two groups. Tc-99m ECD brain SPECT was performed in 12 normal adults (M:F=11:1, average age 35 years) and 6 normal control children (M:F=4:2, 10.5±3.1 y) who visited the psychiatry clinic for evaluation of ADHD. Their brain SPECT revealed normal rCBF patterns on visual analysis and they were diagnosed as clinically normal. Using the SPM method, we compared the normal adult group's SPECT images with those of the 6 normal children and measured the extent of the areas with significant hypoperfusion and hyperperfusion (p<0.001, extent threshold=16). The areas of both angular gyri, both postcentral gyri, both superior frontal gyri, and both superior parietal lobes showed significant hyperperfusion in the normal adult group compared with the normal children group. The areas of the left amygdala, brain stem, both cerebellar hemispheres, left globus pallidus, both hippocampal formations, both parahippocampal gyri, both thalami, both unci, and both lateral and medial occipitotemporal gyri showed significant hyperperfusion in the children. These results demonstrate that SPM can identify precise anatomical differences not seen on visual analysis.

  2. Comparison of normal adult and children brain SPECT imaging using statistical parametric mapping (SPM)

    International Nuclear Information System (INIS)

    Lee, Myoung Hoon; Yoon, Seok Nam; Joh, Chul Woo; Lee, Dong Soo; Lee, Jae Sung

    2002-01-01

    This study compared rCBF patterns in normal adults and normal children using statistical parametric mapping (SPM). The purpose of this study was to determine distribution patterns not seen on visual analysis in the two groups. Tc-99m ECD brain SPECT was performed in 12 normal adults (M:F=11:1, average age 35 years) and 6 normal control children (M:F=4:2, 10.5±3.1 y) who visited the psychiatry clinic for evaluation of ADHD. Their brain SPECT revealed normal rCBF patterns on visual analysis and they were diagnosed as clinically normal. Using the SPM method, we compared the normal adult group's SPECT images with those of the 6 normal children and measured the extent of the areas with significant hypoperfusion and hyperperfusion (p<0.001, extent threshold=16). The areas of both angular gyri, both postcentral gyri, both superior frontal gyri, and both superior parietal lobes showed significant hyperperfusion in the normal adult group compared with the normal children group. The areas of the left amygdala, brain stem, both cerebellar hemispheres, left globus pallidus, both hippocampal formations, both parahippocampal gyri, both thalami, both unci, and both lateral and medial occipitotemporal gyri showed significant hyperperfusion in the children. These results demonstrate that SPM can identify precise anatomical differences not seen on visual analysis.

  3. Towards a publicly available, map-based regional software tool to estimate unregulated daily streamflow at ungauged rivers

    Directory of Open Access Journals (Sweden)

    S. A. Archfield

    2013-01-01

    Full Text Available Streamflow information is critical for addressing any number of hydrologic problems. Often, streamflow information is needed at locations that are ungauged and, therefore, have no observations on which to base water management decisions. Furthermore, there has been increasing need for daily streamflow time series to manage rivers for both human and ecological functions. To facilitate negotiation between human and ecological demands for water, this paper presents the first publicly available, map-based, regional software tool to estimate historical, unregulated, daily streamflow time series (streamflow not affected by human alteration such as dams or water withdrawals) at any user-selected ungauged river location. The map interface allows users to locate and click on a river location, which then links to a spreadsheet-based program that computes estimates of daily streamflow for the river location selected. For a demonstration region in the northeast United States, daily streamflow was, in general, shown to be reliably estimated by the software tool. Estimating the highest and lowest streamflows that occurred in the demonstration region over the period from 1960 through 2004 also was accomplished but with more difficulty and limitations. The software tool provides a general framework that can be applied to other regions for which daily streamflow estimates are needed.

  4. Towards a publicly available, map-based regional software tool to estimate unregulated daily streamflow at ungauged rivers

    Science.gov (United States)

    Archfield, Stacey A.; Steeves, Peter A.; Guthrie, John D.; Ries, Kernell G.

    2013-01-01

    Streamflow information is critical for addressing any number of hydrologic problems. Often, streamflow information is needed at locations that are ungauged and, therefore, have no observations on which to base water management decisions. Furthermore, there has been increasing need for daily streamflow time series to manage rivers for both human and ecological functions. To facilitate negotiation between human and ecological demands for water, this paper presents the first publicly available, map-based, regional software tool to estimate historical, unregulated, daily streamflow time series (streamflow not affected by human alteration such as dams or water withdrawals) at any user-selected ungauged river location. The map interface allows users to locate and click on a river location, which then links to a spreadsheet-based program that computes estimates of daily streamflow for the river location selected. For a demonstration region in the northeast United States, daily streamflow was, in general, shown to be reliably estimated by the software tool. Estimating the highest and lowest streamflows that occurred in the demonstration region over the period from 1960 through 2004 also was accomplished but with more difficulty and limitations. The software tool provides a general framework that can be applied to other regions for which daily streamflow estimates are needed.

  5. Parametric design of silo steel framework of concrete mixing station based on the finite element method and MATLAB

    Directory of Open Access Journals (Sweden)

    Long Hui

    2016-01-01

    Full Text Available When the structure of the silo steel framework of a concrete mixing station is designed, in most cases the dimension, shape and position parameters of the silo steel framework beams change as the productivity of the concrete mixing station is adjusted, but the structural type of the silo steel framework remains the same. In order to obtain the strength of the silo steel framework rapidly and efficiently, specialized parametric strength computation software is needed for engineering staff who do not use three-dimensional software such as PROE or finite element analysis software. Using the finite element method (FEM), a parametric stress calculation model of the silo steel framework of the concrete mixing station is established, which includes the dimension, shape, position and applied load parameters of each beam, and the parametric calculation program is then written in MATLAB. The stress equations reflect the internal relationship between the stress of the silo steel frames and the dimension, shape, position and load parameters. Finally, an example is presented; the calculation results show the stress of all members and the size and location of the maximum stress, which agrees well with realistic cases.

  6. Time-bin entangled photon pairs from spontaneous parametric down-conversion pumped by a cw multi-mode diode laser.

    Science.gov (United States)

    Kwon, Osung; Park, Kwang-Kyoon; Ra, Young-Sik; Kim, Yong-Su; Kim, Yoon-Ho

    2013-10-21

    Generation of time-bin entangled photon pairs requires the use of the Franson interferometer, which consists of two spatially separated unbalanced Mach-Zehnder interferometers through which the signal and idler photons from spontaneous parametric down-conversion (SPDC) are made to transmit individually. There have been two SPDC pumping regimes in which the scheme works: the narrowband regime and the double-pulse regime. In the narrowband regime, the SPDC process is pumped by a narrowband cw laser with a coherence length much longer than the path length difference of the Franson interferometer. In the double-pulse regime, the longitudinal separation between the pulse pair is made equal to the path length difference of the Franson interferometer. In this paper, we propose another regime in which the generation of time-bin entanglement is possible and demonstrate the scheme experimentally. In our scheme, differently from the previous approaches, the SPDC process is pumped by a cw multi-mode (i.e., short coherence length) laser and makes use of the coherence revival property of such a laser. The high-visibility two-photon Franson interference demonstrates clearly that a high-quality time-bin entanglement source can be developed using inexpensive cw multi-mode diode lasers for various quantum communication applications.

  7. Mapping Plastic-Mulched Farmland with Multi-Temporal Landsat-8 Data

    Directory of Open Access Journals (Sweden)

    Hasituya

    2017-06-01

    Full Text Available The use of plastic mulching for farmland is booming around the world. Despite its benefit of protecting crops from unfavorable conditions and increasing crop yield, the massive use of the plastic-mulching technique causes many environmental problems. Therefore, timely and effective mapping of plastic-mulched farmland (PMF) is of great interest to policy-makers in order to balance the trade-off between economic profit and adverse environmental impacts. However, it is still challenging to implement remote-sensing-based PMF mapping because its spectral characteristics change with the growing seasons of crops and across geographic regions. In this study, we examined the potential of multi-temporal Landsat-8 imagery for mapping PMF. To this end, we fed spectral, textural, index, and thermal features into random forest (RF) and support vector machine (SVM) algorithms in order to select the common characteristics for distinguishing PMF from other land cover types. The experiment was conducted in Jizhou, Hebei Province. The results demonstrated that the spectral features, the index features NDVI (normalized difference vegetation index) and GI (greenness index), and the mean textural feature are more important than the other features for mapping PMF in Jizhou. The optimal period for mapping PMF is April, followed by May; a combination of these two dates (April and May) is better than later in the season. The highest overall, producer's, and user's accuracies achieved were 97.01%, 92.48%, and 96.40% in Jizhou, respectively.
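
    A schematic version of the feature-based random forest classification described above, assuming a per-pixel table of multi-temporal spectral, index and texture features with PMF/non-PMF labels; the synthetic data, feature count and model settings are assumptions for illustration, not the Landsat-8 experiment.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical per-pixel feature table (band reflectances, NDVI, GI, mean texture);
# labels: 1 = plastic-mulched farmland, 0 = other land cover. Values are placeholders.
rng = np.random.default_rng(42)
n_pixels, n_features = 2000, 12
X = rng.random((n_pixels, n_features))
y = (X[:, 0] + 0.5 * X[:, 3] > 0.9).astype(int)   # synthetic decision rule

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0)
rf.fit(X_tr, y_tr)

print("overall accuracy:", accuracy_score(y_te, rf.predict(X_te)))
# Feature importances are one way to see which inputs matter most, analogous to
# the paper's finding that spectral, NDVI/GI and mean-texture features dominate.
print("top features:", np.argsort(rf.feature_importances_)[::-1][:3])
```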

  8. Mapping Deforestation in North Korea Using Phenology-Based Multi-Index and Random Forest

    Directory of Open Access Journals (Sweden)

    Yihua Jin

    2016-12-01

    Full Text Available Phenology-based multi-index with the random forest (RF) algorithm can be used to overcome the shortcomings of traditional deforestation mapping that involves pixel-based classification, such as ISODATA or decision trees, and single images. The purpose of this study was to investigate methods to identify specific types of deforestation in North Korea, and to increase the accuracy of classification, using phenological characteristics extracted with multi-index and random forest algorithms. The mapping of deforestation areas based on RF was carried out by merging phenology-based multi-indices (i.e., the normalized difference vegetation index (NDVI), normalized difference water index (NDWI), and normalized difference soil index (NDSI)) derived from MODIS (Moderate Resolution Imaging Spectroradiometer) products with topographical variables. Our results showed an overall classification accuracy of 89.38%, with a corresponding kappa coefficient of 0.87. In particular, for forest and farmland categories with similar phenological characteristics (e.g., paddy, plateau vegetation, unstocked forest, hillside field), this approach improved the classification accuracy in comparison with pixel-based methods and other classes. The deforestation types were identified by incorporating point data from high-resolution imagery, outcomes of image classification, and slope data. Our study demonstrated that the proposed methodology could be used for deciding on the restoration priority and monitoring the expansion of deforestation areas.

  9. MultiElec: A MATLAB Based Application for MEA Data Analysis.

    Science.gov (United States)

    Georgiadis, Vassilis; Stephanou, Anastasis; Townsend, Paul A; Jackson, Thomas R

    2015-01-01

    We present MultiElec, an open-source MATLAB-based application for data analysis of microelectrode array (MEA) recordings. MultiElec provides an extremely user-friendly graphical user interface (GUI) that allows the simultaneous display and analysis of voltage traces for 60 electrodes and includes functions for activation-time determination and the production of activation-time heat maps with activation-time and isoline display. Furthermore, local conduction velocities are semi-automatically calculated along with their corresponding vector plots. MultiElec allows ad hoc signal suppression, enabling the user to easily and efficiently handle signal artefacts and analyse incomplete data sets. Voltage traces and heat maps can be simply exported for figure production and presentation. In addition, our platform is able to produce 3D videos of signal progression over all 60 electrodes. Functions are controlled entirely by a single GUI with no need for command line input or any understanding of MATLAB code. MultiElec is open source under the terms of the GNU General Public License as published by the Free Software Foundation, version 3. Both the program and source code are available to download from http://www.cancer.manchester.ac.uk/MultiElec/.

  10. Dose mapping of the multi-purpose gamma irradiation facility

    Energy Technology Data Exchange (ETDEWEB)

    Cabalfin, E G; Lanuza, L G; Villamater, D T [Irradiation Services, Nuclear Services and Training Division, Philippine Nuclear Research Institute, Quezon City (Philippines)

    1989-12-01

    In radiation processing, reliable dosimetry constitutes a very important part of process control and quality assurance. Radiation dosimetry is the only acceptable method to guarantee that the irradiated product has undergone the correct radiation treatment. In preparation therefore, for the routine operation of the newly installed multi-purpose gamma irradiation facility at the Philippine Nuclear Research Institute (PNRI), dose mapping distribution studies were undertaken. Results of dose distribution in air as well as in dummy product are presented. The effects of product bulk density, product geometry and product to source distance on minimum absorbed dose and uniformity ratio have been determined. (Author).

  11. Dose mapping of the multi-purpose gamma irradiation facility

    International Nuclear Information System (INIS)

    Cabalfin, E.G.; Lanuza, L.G.; Villamater, D.T.

    1989-01-01

    In radiation processing, reliable dosimetry constitutes a very important part of process control and quality assurance. Radiation dosimetry is the only acceptable method to guarantee that the irradiated product has undergone the correct radiation treatment. In preparation therefore, for the routine operation of the newly installed multi-purpose gamma irradiation facility at the Philippine Nuclear Research Institute (PNRI), dose mapping distribution studies were undertaken. Results of dose distribution in air as well as in dummy product are presented. The effects of product bulk density, product geometry and product to source distance on minimum absorbed dose and uniformity ratio have been determined. (Author)

  12. Level-statistics in Disordered Systems: A single parametric scaling and Connection to Brownian Ensembles

    OpenAIRE

    Shukla, Pragya

    2004-01-01

    We find that the statistics of levels undergoing a metal-insulator transition in systems with multi-parametric Gaussian disorders and non-interacting electrons behaves in a way similar to that of the single-parametric Brownian ensembles [dy]. The latter appear during a Poisson → Wigner-Dyson transition, driven by a random perturbation. The analogy provides analytical evidence for the single-parameter scaling of the level correlations in disordered systems as well as a tool to obtai...

  13. Developing integrated parametric planning models for budgeting and managing complex projects

    Science.gov (United States)

    Etnyre, Vance A.; Black, Ken U.

    1988-01-01

    The applicability of integrated parametric models for the budgeting and management of complex projects is investigated. Methods for building a very flexible, interactive prototype for a project planning system, and software resources available for this purpose, are discussed and evaluated. The prototype is required to be sensitive to changing objectives, changing target dates, changing cost relationships, and changing budget constraints. To achieve the integration of costs with project and task durations, parametric cost functions are defined by a process of trapezoidal segmentation, where the total cost for the project is the sum of the various project cost segments, and each project cost segment is the integral of a linearly segmented cost loading function over a specific interval. The cost can thus be expressed algebraically. The prototype was designed using Lotus-123 as the primary software tool. This prototype implements a methodology for interactive project scheduling that provides a model of a system that meets most of the goals for the first phase of the study and some of the goals for the second phase.
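
    A small numerical illustration of the trapezoidal-segmentation idea described above: the cost of a project segment is the integral of a piecewise-linear cost-loading function over the segment's time interval, which for linear pieces reduces to exact trapezoid areas. The breakpoints and rates below are made-up values, not taken from the original planning models.

```python
# Cost of one project segment = integral of a linearly segmented (piecewise-linear)
# cost-loading rate r(t) over [t_start, t_end]; the trapezoid rule is exact here.

def loading_rate(t, breakpoints, rates):
    """Piecewise-linear cost-loading rate r(t) defined by breakpoints and rates."""
    for (t0, r0), (t1, r1) in zip(zip(breakpoints, rates),
                                  zip(breakpoints[1:], rates[1:])):
        if t0 <= t <= t1:
            return r0 + (r1 - r0) * (t - t0) / (t1 - t0)
    raise ValueError("t outside the defined loading function")

def segment_cost(t_start, t_end, breakpoints, rates):
    """Integrate r(t) over [t_start, t_end], piece by piece."""
    knots = sorted({t_start, t_end, *[b for b in breakpoints if t_start < b < t_end]})
    total = 0.0
    for a, b in zip(knots, knots[1:]):
        total += 0.5 * (loading_rate(a, breakpoints, rates) +
                        loading_rate(b, breakpoints, rates)) * (b - a)
    return total

# Example: ramp-up, plateau, ramp-down loading over 10 time units (made-up numbers)
bp, r = [0, 2, 8, 10], [0.0, 5.0, 5.0, 0.0]
print(segment_cost(0, 10, bp, r))   # 2*5/2 + 6*5 + 2*5/2 = 40.0
```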

  14. Integrating science and education during an international, multi-parametric investigation of volcanic activity at Santiaguito volcano, Guatemala

    Science.gov (United States)

    Lavallée, Yan; Johnson, Jeffrey; Andrews, Benjamin; Wolf, Rudiger; Rose, William; Chigna, Gustavo; Pineda, Armand

    2016-04-01

    In January 2016, we held the first scientific/educational Workshops on Volcanoes (WoV). The workshop took place at Santiaguito volcano - the most active volcano in Guatemala. 69 international scientists of all ages participated in this intensive, multi-parametric investigation of the volcanic activity, which included the deployment of seismometers, tiltmeters, infrasound microphones and mini-DOAS as well as optical, thermographic, UV and FTIR cameras around the active vent. These instruments recorded volcanic activity in concert over a period of 3 to 9 days. Here we review the research activities and present some of the spectacular observations made through this interdisciplinary effort. Observations range from high-resolution drone and IR footage of explosions to the monitoring of rock falls, the quantification of the erupted mass of different gases and ash, and morphological changes in the dome caused by recurring explosions (amongst many other volcanic processes). We will discuss the success of such integrative ventures in furthering science frontiers and developing the next generation of geoscientists.

  15. Parametric Accuracy: Building Information Modeling Process Applied to the Cultural Heritage Preservation

    Science.gov (United States)

    Garagnani, S.; Manferdini, A. M.

    2013-02-01

    Since their introduction, modeling tools aimed at architectural design have evolved into today's "digital multi-purpose drawing boards" based on enhanced parametric elements able to originate whole buildings within virtual environments. Semantic splitting and element topology are features that allow objects to be "intelligent" (i.e. self-aware of what kind of element they are and with whom they can interact), thus representing the basics of Building Information Modeling (BIM): a coordinated, consistent and always up-to-date workflow intended to deliver higher quality, reliability and cost reductions across the design process. Even though BIM was originally intended for new architectures, its capacity to store semantically inter-related information can be successfully applied to existing buildings as well, especially if they deserve particular care such as Cultural Heritage sites. BIM engines can easily manage simple parametric geometries, collapsing them to standard primitives connected through hierarchical relationships; however, when components are generated from existing morphologies, for example point clouds acquired by digital photogrammetry or laser scanning equipment, complex abstractions have to be introduced while remodeling elements by hand, since automatic feature extraction in available software is still not effective. In order to introduce a methodology destined to process point cloud data in a BIM environment with high accuracy, this paper describes some experiences in documenting monumental sites, carried out with a plug-in written for Autodesk Revit and codenamed GreenSpider after its capability to lay out points in space as if they were nodes of an ideal cobweb.

  16. Symplectic Tracking of Multi-Isotopic Heavy-Ion Beams in SixTrack

    CERN Document Server

    Hermes, Pascal; De Maria, Riccardo

    2016-01-01

    The software SixTrack provides symplectic proton tracking over a large number of turns. The code is used for the tracking of beam halo particles and the simulation of their interaction with the collimators to study the efficiency of the LHC collimation system. Tracking simulations for heavy-ion beams require taking into account the mass to charge ratio of each particle because heavy ions can be subject to fragmentation at their passage through the collimators. In this paper we present the derivation of a Hamiltonian for multi-isotopic heavy-ion beams and symplectic tracking maps derived from it. The resulting tracking maps were implemented in the tracking software SixTrack. With this modification, SixTrack can be used to natively track heavy-ion beams of multiple isotopes through a magnetic accelerator lattice.

  17. AUTOMATIC TEXTURE MAPPING OF ARCHITECTURAL AND ARCHAEOLOGICAL 3D MODELS

    Directory of Open Access Journals (Sweden)

    T. P. Kersten

    2012-07-01

    Full Text Available Today, detailed, complete and exact 3D models with photo-realistic textures are increasingly demanded for numerous applications in architecture and archaeology. Manual texture mapping of 3D models from digital photographs with software packages such as Maxon Cinema 4D, Autodesk 3ds Max or Maya still requires a complex and time-consuming workflow. So, procedures for automatic texture mapping of 3D models are in demand. In this paper two automatic procedures are presented. The first procedure generates 3D surface models with textures by web services, while the second procedure textures already existing 3D models with the software tmapper. The program tmapper is based on the Multi Layer 3D image (ML3DImage) algorithm and developed in the programming language C++. The studies show that the visibility analysis using the ML3DImage algorithm is not sufficient to obtain acceptable results of automatic texture mapping. To overcome the visibility problem the Point Cloud Painter algorithm in combination with the Z-buffer procedure will be applied in the future.

  18. Automatic Texture Mapping of Architectural and Archaeological 3d Models

    Science.gov (United States)

    Kersten, T. P.; Stallmann, D.

    2012-07-01

    Today, detailed, complete and exact 3D models with photo-realistic textures are increasingly demanded for numerous applications in architecture and archaeology. Manual texture mapping of 3D models from digital photographs with software packages such as Maxon Cinema 4D, Autodesk 3ds Max or Maya still requires a complex and time-consuming workflow. So, procedures for automatic texture mapping of 3D models are in demand. In this paper two automatic procedures are presented. The first procedure generates 3D surface models with textures by web services, while the second procedure textures already existing 3D models with the software tmapper. The program tmapper is based on the Multi Layer 3D image (ML3DImage) algorithm and developed in the programming language C++. The studies show that the visibility analysis using the ML3DImage algorithm is not sufficient to obtain acceptable results of automatic texture mapping. To overcome the visibility problem the Point Cloud Painter algorithm in combination with the Z-buffer procedure will be applied in the future.

  19. Multi-state modelling of repeated hospitalisation and death in patients with heart failure: The use of large administrative databases in clinical epidemiology.

    Science.gov (United States)

    Ieva, Francesca; Jackson, Christopher H; Sharples, Linda D

    2017-06-01

    In chronic diseases like heart failure (HF), the disease course and associated clinical event histories for the patient population vary widely. To improve understanding of the prognosis of patients and enable health care providers to assess and manage resources, we wish to jointly model disease progression, mortality and their relation with patient characteristics. We show how episodes of hospitalisation for disease-related events, obtained from administrative data, can be used as a surrogate for disease status. We propose flexible multi-state models for serial hospital admissions and death in HF patients, that are able to accommodate important features of disease progression, such as multiple ordered events and competing risks. Fully parametric and semi-parametric semi-Markov models are implemented using freely available software in R. The models were applied to a dataset from the administrative data bank of the Lombardia region in Northern Italy, which included 15,298 patients who had a first hospitalisation ending in 2006 and 4 years of follow-up thereafter. This provided estimates of the associations of age and gender with rates of hospital admission and length of stay in hospital, and estimates of the expected total time spent in hospital over five years. For example, older patients and men were readmitted more frequently, though the total time in hospital was roughly constant with age. We also discuss the relative merits of parametric and semi-parametric multi-state models, and model assessment and comparison.

  20. A Review of Some Superconducting Technologies for AtLAST: Parametric Amplifiers, Kinetic Inductance Detectors, and On-Chip Spectrometers

    Science.gov (United States)

    Noroozian, Omid

    2018-01-01

    The current state of the art for some superconducting technologies will be reviewed in the context of a future single-dish submillimeter telescope called AtLAST. The technologies reviewed include: 1) Kinetic Inductance Detectors (KIDs), which have now been demonstrated in large-format kilo-pixel arrays with photon background-limited sensitivity suitable for large field of view cameras for wide-field imaging. 2) Parametric amplifiers - specifically the Traveling-Wave Kinetic Inductance (TKIP) amplifier - which has enormous potential to increase sensitivity, bandwidth, and mapping speed of heterodyne receivers, and 3) On-chip spectrometers, which combined with sensitive direct detectors such as KIDs or TESs could be used as Multi-Object Spectrometers on the AtLAST focal plane, and could provide low-medium resolution spectroscopy of 100 objects at a time in each field of view.

  1. Dissecting hemisphere-specific contributions to visual spatial imagery using parametric brain mapping.

    Science.gov (United States)

    Bien, Nina; Sack, Alexander T

    2014-07-01

    In the current study we aimed to empirically test previously proposed accounts of a division of labour between the left and right posterior parietal cortices during visuospatial mental imagery. The representation of mental images in the brain has been a topic of debate for several decades. Although the posterior parietal cortex is involved bilaterally, previous studies have postulated that hemispheric specialisation might result in a division of labour between the left and right parietal cortices. In the current fMRI study, we used an elaborated version of a behaviourally controlled spatial imagery paradigm, the mental clock task, which involves mental image generation and a subsequent spatial comparison between two angles. By systematically varying the difference between the two angles that are mentally compared, we induced a symbolic distance effect: smaller differences between the two angles result in higher task difficulty. We employed parametrically weighted brain imaging to reveal brain areas showing a graded activation pattern in accordance with the induced distance effect. The parametric difficulty manipulation influenced behavioural data and brain activation patterns in a similar manner. Moreover, since this difficulty manipulation only starts to play a role from the angle comparison phase onwards, it allows for a top-down dissociation between the initial mental image formation and the subsequent angle comparison phase of the spatial imagery task. Employing parametrically weighted fMRI analysis enabled us to disentangle, in a top-down manner, brain activation related to mental image formation from activation reflecting spatial angle comparison. The results provide the first empirical evidence for the repeatedly proposed division of labour between the left and right posterior parietal cortices during spatial imagery. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. Utilizing a Multi-Source Forest Inventory Technique, MODIS Data and Landsat TM Images in the Production of Forest Cover and Volume Maps for the Terai Physiographic Zone in Nepal

    Directory of Open Access Journals (Sweden)

    Kalle Eerikäinen

    2012-12-01

    Full Text Available An approach based on nearest neighbour techniques is presented for producing thematic maps of forest cover (forest/non-forest) and total stand volume for the Terai region in southern Nepal. To create the forest cover map, we used a combination of Landsat TM satellite data and visual interpretation data, i.e., a sample grid of visual interpretation plots for which we obtained the land use classification according to the FAO standard. These visual interpretation plots, together with the field plots for volume mapping, originate from an operative forest inventory project, i.e., the Forest Resource Assessment of Nepal (FRA Nepal) project. The field plots were also used in checking the classification accuracy. MODIS satellite data were used as a reference in a local correction approach conducted for the relative calibration of Landsat TM images. This study applied a non-parametric k-nearest neighbor technique (k-NN) to the forest cover and volume mapping. A tree height prediction approach based on a nonlinear mixed-effects (NLME) modeling procedure is presented in the Appendix. The MODIS image data performed well as reference data for the calibration approach applied to make the Landsat image mosaic. The agreement between the forest cover map and the field-observed values of forest cover was substantial in Western Terai (KHAT 0.745) and strong in Eastern Terai (KHAT 0.825). The forest cover and volume maps that were estimated using the k-NN method and the inventory data from the FRA Nepal project are already appropriate and valuable data for research purposes and for the planning of forthcoming forest inventories. Adaptation of the methods and techniques was carried out using Open Source software tools.
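
    A hypothetical illustration of the k-NN imputation idea used in multi-source forest inventories of this kind: field plots with measured stand volume are matched to image pixels by their spectral feature vectors, and a pixel's volume is the (distance-weighted) mean of its k nearest field plots in feature space. The band values, volumes, and k below are placeholders, not the FRA Nepal data.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

# Synthetic field plots: spectral features (e.g., calibrated TM band values)
# and measured stand volume in m3/ha.
rng = np.random.default_rng(7)
plot_features = rng.random((200, 6))
plot_volume = 300 * plot_features[:, 0] + 50 * rng.random(200)

# k-NN regressor: a pixel's volume = distance-weighted mean of its 5 nearest plots
knn = KNeighborsRegressor(n_neighbors=5, weights="distance")
knn.fit(plot_features, plot_volume)

pixel_features = rng.random((10, 6))     # pixels to be mapped
volume_map = knn.predict(pixel_features)
print(volume_map.round(1))
```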

  3. CCLab--a multi-objective genetic algorithm based combinatorial library design software and an application for histone deacetylase inhibitor design.

    Science.gov (United States)

    Fang, Guanghua; Xue, Mengzhu; Su, Mingbo; Hu, Dingyu; Li, Yanlian; Xiong, Bing; Ma, Lanping; Meng, Tao; Chen, Yuelei; Li, Jingya; Li, Jia; Shen, Jingkang

    2012-07-15

    The introduction of multi-objective optimization has dramatically changed virtual combinatorial library design, which can now consider many objectives simultaneously, such as synthesis cost and drug-likeness, and thus may increase the positive rate of biologically active compounds. Here we describe software called CCLab (Combinatorial Chemistry Laboratory) for combinatorial library design based on a multi-objective genetic algorithm. Tests of the convergence ability and of the ratio of building blocks recovered from a reference library were conducted to assess the software in silico, and it was then applied to a real case of designing a 5×6 HDAC inhibitor library. Sixteen compounds in the resulting library were synthesized, and histone deacetylase (HDAC) enzymatic assays showed that 14 compounds had inhibitory ratios above 50% against the 3 tested HDAC enzymes at a concentration of 20 μg/mL, with IC50 values of 3 compounds comparable to SAHA. These results demonstrated that the CCLab software could enhance the hit rate of the designed library and would help medicinal chemists design focused libraries in drug development (the software can be downloaded at: http://202.127.30.184:8080/drugdesign.html). Copyright © 2012 Elsevier Ltd. All rights reserved.
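
    A bare-bones multi-objective genetic algorithm over combinatorial libraries, in the spirit of the approach described above: an individual is a choice of R1 and R2 building blocks for a 5×6 library, and two made-up objectives (synthesis cost to be minimized, a predicted score to be maximized) are handled via Pareto dominance. The objective functions, pool sizes, and GA settings are illustrative assumptions, not the CCLab implementation.

```python
import random

random.seed(0)
R1_POOL, R2_POOL = list(range(20)), list(range(30))
LIB_R1, LIB_R2 = 5, 6                      # a 5 x 6 library, as in the paper

def objectives(ind):
    r1, r2 = ind
    cost = sum(r1) + sum(r2)               # pretend cheaper blocks have low indices
    score = sum((i * 7) % 10 for i in r1) + sum((j * 3) % 10 for j in r2)
    return cost, -score                    # both objectives are minimized

def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def random_individual():
    return (tuple(sorted(random.sample(R1_POOL, LIB_R1))),
            tuple(sorted(random.sample(R2_POOL, LIB_R2))))

def mutate(ind):
    r1, r2 = list(ind[0]), list(ind[1])
    r1[random.randrange(LIB_R1)] = random.choice(R1_POOL)
    r2[random.randrange(LIB_R2)] = random.choice(R2_POOL)
    # keep the original block set if mutation created a duplicate building block
    new_r1 = tuple(sorted(set(r1))) if len(set(r1)) == LIB_R1 else ind[0]
    new_r2 = tuple(sorted(set(r2))) if len(set(r2)) == LIB_R2 else ind[1]
    return (new_r1, new_r2)

pop = [random_individual() for _ in range(40)]
for _ in range(50):                        # generations
    scored = [(ind, objectives(ind)) for ind in pop]
    front = [i for i, fi in scored
             if not any(dominates(fj, fi) for _, fj in scored if fj != fi)]
    pop = front + [mutate(random.choice(front)) for _ in range(40 - len(front))]

print("non-dominated library designs found:", len(front))
print("example design:", front[0], "objectives:", objectives(front[0]))
```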

  4. Ellipsoidal terrain correction based on multi-cylindrical equal-area map projection of the reference ellipsoid

    Science.gov (United States)

    Ardalan, A. A.; Safari, A.

    2004-09-01

    An operational algorithm for computation of terrain correction (or local gravity field modeling) based on application of the closed-form solution of the Newton integral in terms of Cartesian coordinates in a multi-cylindrical equal-area map projection of the reference ellipsoid is presented. Multi-cylindrical equal-area map projection of the reference ellipsoid has been derived and is described in detail for the first time. Ellipsoidal mass elements with various sizes on the surface of the reference ellipsoid are selected and the gravitational potential and vector of gravitational intensity (i.e. gravitational acceleration) of the mass elements are computed via numerical solution of the Newton integral in terms of geodetic coordinates {λ,ϕ,h}. Four base-edge points of the ellipsoidal mass elements are transformed into a multi-cylindrical equal-area map projection surface to build Cartesian mass elements by associating the height of the corresponding ellipsoidal mass elements to the transformed area elements. Using the closed-form solution of the Newton integral in terms of Cartesian coordinates, the gravitational potential and vector of gravitational intensity of the transformed Cartesian mass elements are computed and compared with those of the numerical solution of the Newton integral for the ellipsoidal mass elements in terms of geodetic coordinates. Numerical tests indicate that the difference between the two computations, i.e. numerical solution of the Newton integral for ellipsoidal mass elements in terms of geodetic coordinates and closed-form solution of the Newton integral in terms of Cartesian coordinates in a multi-cylindrical equal-area map projection, is less than 1.6×10⁻⁸ m²/s² for a mass element with a cross section area of 10×10 m and a height of 10,000 m. For a mass element with a cross section area of 1×1 km and a height of 10,000 m the difference is less than 1.5×10⁻⁴ m²/s². Since 1.5×10⁻⁴ m²/s² is equivalent to 1.5×10⁻⁵ m in the vertical

  5. Mapping paddy rice distribution using multi-temporal Landsat imagery in the Sanjiang Plain, northeast China

    Science.gov (United States)

    XIAO, Xiangming; DONG, Jinwei; QIN, Yuanwei; WANG, Zongming

    2016-01-01

    Information on paddy rice distribution is essential for food production and methane emission calculation. Phenology-based algorithms have been utilized in the mapping of paddy rice fields by identifying the unique flooding and seedling transplanting phases using multi-temporal moderate-resolution (500 m to 1 km) images. In this study, we developed simple algorithms to identify paddy rice at a fine resolution at the regional scale using multi-temporal Landsat imagery. Sixteen Landsat images from 2010–2012 were used to generate the 30 m paddy rice map of the Sanjiang Plain, northeast China—one of the major paddy rice cultivation regions in China. Three vegetation indices, the Normalized Difference Vegetation Index (NDVI), Enhanced Vegetation Index (EVI), and Land Surface Water Index (LSWI), were used to identify rice fields during the flooding/transplanting and ripening phases. The user and producer accuracies of paddy rice on the resultant Landsat-based paddy rice map were 90% and 94%, respectively. The Landsat-based paddy rice map was an improvement over the paddy rice layer of the National Land Cover Dataset, which was generated through visual interpretation and digitization of fine-resolution images. The agricultural census data substantially underreported paddy rice area, raising serious concern about its use for studies on food security. PMID:27695637
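
    An illustrative computation of the three indices named above from surface reflectance bands, plus a simple flooding/transplanting test of the kind used in phenology-based rice mapping (LSWI approaching or exceeding NDVI/EVI during the flooding window). The band values and the 0.05 margin are assumptions for the sketch, not necessarily the exact rule applied in this study.

```python
import numpy as np

def indices(nir, red, blue, swir):
    """NDVI, EVI and LSWI from surface reflectance arrays (fractions 0..1)."""
    ndvi = (nir - red) / (nir + red)
    evi = 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)
    lswi = (nir - swir) / (nir + swir)
    return ndvi, evi, lswi

def flooded(nir, red, blue, swir, margin=0.05):
    """Pixel-wise flooding signal: LSWI + margin >= NDVI or EVI."""
    ndvi, evi, lswi = indices(nir, red, blue, swir)
    return (lswi + margin >= ndvi) | (lswi + margin >= evi)

# Toy reflectance values for two pixels from one acquisition during transplanting
nir = np.array([0.25, 0.40])
red = np.array([0.10, 0.06])
blue = np.array([0.08, 0.04])
swir = np.array([0.12, 0.30])
print(flooded(nir, red, blue, swir))   # array of flooding flags per pixel
```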

  6. Development of a computer aided diagnosis model for prostate cancer classification on multi-parametric MRI

    Science.gov (United States)

    Alfano, R.; Soetemans, D.; Bauman, G. S.; Gibson, E.; Gaed, M.; Moussa, M.; Gomez, J. A.; Chin, J. L.; Pautler, S.; Ward, A. D.

    2018-02-01

    Multi-parametric MRI (mp-MRI) is becoming a standard in contemporary prostate cancer screening and diagnosis, and has been shown to aid physicians in cancer detection. It offers many advantages over traditional systematic biopsy, which has been shown to have very high clinical false-negative rates of up to 23% at all stages of the disease. However beneficial, mp-MRI is relatively complex to interpret and suffers from inter-observer variability in lesion localization and grading. Computer-aided diagnosis (CAD) systems have been developed as a solution, as they have the power to perform deterministic quantitative image analysis. We measured the accuracy of such a system validated using accurately co-registered whole-mount digitized histology. We trained a logistic linear classifier (LOGLC), support vector machine (SVC), k-nearest neighbour (KNN) and random forest classifier (RFC) in a four-part ROI-based experiment against: 1) cancer vs. non-cancer, 2) high-grade (Gleason score ≥4+3) vs. low-grade cancer (Gleason score ... This work will form the basis for a tool that enhances the radiologist's ability to detect malignancies, potentially improving biopsy guidance, treatment selection, and focal therapy for prostate cancer patients, maximizing the potential for cure and increasing quality of life.

  7. AROSICS: An Automated and Robust Open-Source Image Co-Registration Software for Multi-Sensor Satellite Data

    Directory of Open Access Journals (Sweden)

    Daniel Scheffler

    2017-07-01

    Full Text Available Geospatial co-registration is a mandatory prerequisite when dealing with remote sensing data. Inter- or intra-sensoral misregistration will negatively affect any subsequent image analysis, specifically when processing multi-sensoral or multi-temporal data. In recent decades, many algorithms have been developed to enable manual, semi- or fully automatic displacement correction. Especially in the context of big data processing and the development of automated processing chains that aim to be applicable to different remote sensing systems, there is a strong need for efficient, accurate and generally usable co-registration. Here, we present AROSICS (Automated and Robust Open-Source Image Co-Registration Software), a Python-based open-source software including an easy-to-use user interface for automatic detection and correction of sub-pixel misalignments between various remote sensing datasets. It is independent of spatial or spectral characteristics and robust against high degrees of cloud coverage and spectral and temporal land cover dynamics. The co-registration is based on phase correlation for sub-pixel shift estimation in the frequency domain utilizing the Fourier shift theorem in a moving-window manner. A dense grid of spatial shift vectors can be created and automatically filtered by combining various validation and quality estimation metrics. Additionally, the software supports the masking of, e.g., clouds and cloud shadows to exclude such areas from spatial shift detection. The software has been tested on more than 9000 satellite images acquired by different sensors. The results are evaluated exemplarily for two inter-sensoral and two intra-sensoral use cases and show registration results in the sub-pixel range with root mean square errors around 0.3 pixels or better.
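
    The core idea of phase-correlation shift estimation can be sketched in a few lines of NumPy; this is a generic illustration of the Fourier-shift-theorem approach, not AROSICS code, and it recovers only integer shifts (AROSICS additionally refines to sub-pixel precision and works window-wise).

      import numpy as np

      def phase_correlation_shift(ref, moved):
          """Estimate the (row, col) translation mapping `ref` onto `moved`."""
          F1, F2 = np.fft.fft2(ref), np.fft.fft2(moved)
          cross_power = np.conj(F1) * F2
          cross_power /= np.abs(cross_power) + 1e-12     # keep phase only
          corr = np.abs(np.fft.ifft2(cross_power))
          peak = np.unravel_index(np.argmax(corr), corr.shape)
          # Peaks beyond half the image size correspond to negative shifts.
          return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

      rng = np.random.default_rng(1)
      img = rng.random((128, 128))
      print(phase_correlation_shift(img, np.roll(img, (5, -3), axis=(0, 1))))  # (5, -3)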

  8. Rapid parametric mapping of the longitudinal relaxation time T1 using two-dimensional variable flip angle magnetic resonance imaging at 1.5 Tesla, 3 Tesla, and 7 Tesla.

    Science.gov (United States)

    Dieringer, Matthias A; Deimling, Michael; Santoro, Davide; Wuerfel, Jens; Madai, Vince I; Sobesky, Jan; von Knobelsdorff-Brenkenhoff, Florian; Schulz-Menger, Jeanette; Niendorf, Thoralf

    2014-01-01

    Visual but subjective reading of longitudinal relaxation time (T1) weighted magnetic resonance images is commonly used for the detection of brain pathologies. For this non-quantitative measure, diagnostic quality depends on hardware configuration, imaging parameters, radio frequency transmission field (B1+) uniformity, as well as observer experience. Parametric quantification of the tissue T1 relaxation parameter offsets the propensity for these effects, but is typically time consuming. For this reason, this study examines the feasibility of rapid 2D T1 quantification using a variable flip angle (VFA) approach at magnetic field strengths of 1.5 Tesla, 3 Tesla, and 7 Tesla. These efforts include validation in phantom experiments and application for brain T1 mapping. T1 quantification included simulations of the Bloch equations to correct for slice profile imperfections, and a correction for B1+. Fast gradient echo acquisitions were conducted using three adjusted flip angles for the proposed T1 quantification approach that was benchmarked against slice profile uncorrected 2D VFA and an inversion-recovery spin-echo based reference method. Brain T1 mapping was performed in six healthy subjects, one multiple sclerosis patient, and one stroke patient. Phantom experiments showed a mean T1 estimation error of (-63±1.5)% for slice profile uncorrected 2D VFA and (0.2±1.4)% for the proposed approach compared to the reference method. Scan time for single slice T1 mapping including B1+ mapping could be reduced to 5 seconds using an in-plane resolution of (2×2) mm², which equals a scan time reduction of more than 99% compared to the reference method. Our results demonstrate that rapid 2D T1 quantification using a variable flip angle approach is feasible at 1.5T/3T/7T. It represents a valuable alternative for rapid T1 mapping due to the gain in speed versus conventional approaches. This progress may serve to enhance the capabilities of parametric MR based lesion detection and
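
    The basic (uncorrected) VFA estimation can be illustrated with the usual DESPOT1-style linearisation of the spoiled gradient-echo signal equation; the sketch below omits the B1+ and slice-profile corrections that the study shows are essential in 2D, and the flip angles are illustrative values.

      import numpy as np

      def vfa_t1(signals, flip_deg, tr):
          """T1 from spoiled GRE signals at several flip angles via the
          linearisation S/sin(a) = E1 * S/tan(a) + M0 * (1 - E1)."""
          a = np.deg2rad(np.asarray(flip_deg, float))
          s = np.asarray(signals, float)
          slope, intercept = np.polyfit(s / np.tan(a), s / np.sin(a), 1)
          t1 = -tr / np.log(slope)                 # slope = E1 = exp(-TR/T1)
          m0 = intercept / (1.0 - slope)
          return t1, m0

      # Self-check: simulate signals for T1 = 1000 ms with TR = 5 ms.
      tr, t1_true, m0_true = 5.0, 1000.0, 100.0
      ang = np.deg2rad(np.array([3.0, 10.0, 17.0]))
      e1 = np.exp(-tr / t1_true)
      sig = m0_true * np.sin(ang) * (1 - e1) / (1 - e1 * np.cos(ang))
      print(vfa_t1(sig, np.degrees(ang), tr))      # approximately (1000.0, 100.0)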

  9. Complex mapping of aerofoils - a different perspective

    Science.gov (United States)

    Matthews, Miccal T.

    2012-01-01

    In this article an application of conformal mapping to aerofoil theory is studied from a geometric and calculus point of view. The problem is suitable for undergraduate teaching in terms of a project or extended piece of work, and brings together the concepts of geometric mapping, parametric equations, complex numbers and calculus. The Joukowski and Karman-Trefftz aerofoils are studied, and it is shown that the Karman-Trefftz aerofoil is an improvement over the Joukowski aerofoil from a practical point of view. For the most part only a spreadsheet program and pen and paper are required; a symbolic computer package is employed only for the last portion of the study of the Karman-Trefftz aerofoils. Ignoring the concept of a conformal mapping and instead viewing the problem from a parametric point of view, some interesting mappings are obtained. By considering the derivative of the mapped mapping via the chain rule, some new and interesting analytical results are obtained for the Joukowski aerofoil, and numerical results for the Karman-Trefftz aerofoil.
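
    The parametric viewpoint mentioned above is easy to reproduce: the Joukowski transform z = ζ + a²/ζ applied to a circle passing through ζ = a yields the aerofoil directly from the parametric equation of the circle. The circle centre below is an arbitrary illustrative choice.

      import numpy as np

      # Joukowski transform of a circle passing through zeta = +a; the offset
      # of the circle centre controls the thickness and camber of the aerofoil.
      a = 1.0
      centre = -0.1 + 0.1j                  # illustrative choice
      radius = abs(a - centre)              # ensures the circle passes through +a
      theta = np.linspace(0.0, 2.0 * np.pi, 400)
      zeta = centre + radius * np.exp(1j * theta)
      z = zeta + a**2 / zeta                # aerofoil contour in the z-plane

      print(z.real.min(), z.real.max())     # chordwise extent
      # With matplotlib available: plt.plot(z.real, z.imag) draws the profile.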

  10. Short-pulse propagation in fiber optical parametric amplifiers

    DEFF Research Database (Denmark)

    Cristofori, Valentina

    Fiber optical parametric amplifiers (FOPAs) are attractive because they can provide large gain over a broad range of central wavelengths, depending only on the availability of a suitable pump laser. In addition, FOPAs are suitable for the realization of all-optical signal processing functionalities...... transfer can be reduced in saturated FOPAs. In order to characterize propagation impairments such as dispersion and Kerr effect, affecting signals reaching multi-terabit per second per channel, short pulses on the order of 500 fs need to be considered. Therefore, a short-pulse fiber laser source...... is implemented to obtain an all-fiber system. The advantages of all-fiber systems are related to their reliability, long-term stability and compactness. Fiber optical parametric chirped pulse amplification is promising for the amplification of such signals thanks to the inherent compatibility of FOPAs with fiber...

  11. Cerebral blood flow and related factors in hyperthyroidism patients by SPECT imaging and statistical parametric mapping analysis

    International Nuclear Information System (INIS)

    Xiu Yan; Shi Hongcheng; Liu Wenguan; Chen Xuefen; Gu Yushen; Chen Shuguang; Yu Haojun; Yu Yiping

    2010-01-01

    Objective: To investigate the cerebral blood flow (CBF) perfusion patterns and related factors in hyperthyroidism patients. Methods: Twenty-five patients with hyperthyroidism and twenty-two healthy controls matched for age, sex and education were enrolled. 99mTc-ethylene cysteinate dimer (ECD) SPECT CBF perfusion imaging was performed at rest. Statistical parametric mapping 5.0 software (SPM5) was used for the voxel-wise analysis. The relationships of regional CBF (rCBF) with serum free thyroid hormones (FT3, FT4), sensitive thyroid stimulating hormone (sTSH) and thyroid autoimmune antibodies, thyroid peroxidase antibody (TPOAb) and TSH receptor antibody (TRAb), were analyzed by Pearson analysis, and with disease duration by Spearman analysis. Results: rCBF was decreased significantly in the limbic system and frontal lobe, including parahippocampal gyrus, uncus (posterior entorhinal cortex, posterior parolfactory cortex, parahippocampal cortex, anterior cingulate, right inferior temporal gyrus), left hypothalamus and caudate nucleus (P<0.05). rCBF in some regions was negatively correlated with FT3 (r=-0.468, -0.417, both P<0.05) and with FT4 (r=-0.4M, -0.418, -0.415, -0.459, all P<0.05), while rCBF in other regions was positively correlated with FT4 (r=0.419, 0.412, both P<0.05). rCBF in left insula was negatively correlated with concentration of sTSH, and right auditory associated cortex was positively correlated with concentration of sTSH (r=-0.504, 0.429, both P<0.05). rCBF in left middle temporal gyrus, left angular gyrus was positively correlated with concentration of TRAb while that in right thalamus, right hypothalamus, left anterior nucleus, left ventralis nucleus was negatively correlated with concentration of TRAb (r=0.750, 0.862, -0.691, -0.835, -0.713, -0.759, all P<0.05). rCBF in right anterior cingulate, right cuneus, right rectus gyrus, right superior marginal gyrus was positively correlated with concentration of TPOAb (r=0.696, 0.581, 0.779, 0.683, all P<0.05). rCBF in postcentral gyrus, temporal gyrus, left superior marginal gyrus and auditory associated cortex was positively correlated with disease duration (r=0.502, 0.457, 0.524, 0.440, all P<0.05). Conclusion: Hypoperfusions in

  12. Off-line mapping of multi-rate dependent task sets to many-core platforms

    DEFF Research Database (Denmark)

    Puffitsch, Wolfgang; Noulard, Eric; Pagetti, Claire

    2015-01-01

    This paper presents an approach to execute safety-critical applications on multi- and many-core processors in a predictable manner. We investigate three concrete platforms: the Intel Single-chip Cloud Computer, the Texas Instruments TMS320C6678 and the Tilera TILEmpower-Gx36. We define an execution...... model to safely execute dependent periodic task sets on these platforms. The four rules of the execution model entail that an off-line mapping of the application to the platform must be computed. The paper details our approach to automatically compute a valid mapping. Furthermore, we evaluate our...

  13. Rapid development of scalable scientific software using a process oriented approach

    DEFF Research Database (Denmark)

    Friborg, Rune Møllegaard; Vinter, Brian

    2011-01-01

    Scientific applications are often not written with multiprocessing, cluster computing or grid computing in mind. This paper suggests using Python and PyCSP to structure scientific software through Communicating Sequential Processes. Three scientific applications are used to demonstrate the features...... of PyCSP and how networks of processes may easily be mapped into a visual representation for better understanding of the process workflow. We show that for many sequential solutions, the difficulty in implementing a parallel application is removed. The use of standard multi-threading mechanisms...
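
    The process-oriented structuring idea can be sketched with the standard library alone; the snippet below uses multiprocessing queues as stand-in channels rather than the actual PyCSP API (whose names differ), purely to illustrate a producer/worker/collector network.

      from multiprocessing import Process, Queue

      def producer(out_q, n):
          for i in range(n):
              out_q.put(i)
          out_q.put(None)                    # sentinel shuts the network down

      def worker(in_q, out_q):
          while True:
              item = in_q.get()
              if item is None:
                  out_q.put(None)
                  break
              out_q.put(item * item)         # stand-in for the scientific kernel

      def collector(in_q):
          total = 0
          while True:
              item = in_q.get()
              if item is None:
                  break
              total += item
          print("sum of squares:", total)

      if __name__ == "__main__":
          a, b = Queue(), Queue()
          procs = [Process(target=producer, args=(a, 10)),
                   Process(target=worker, args=(a, b)),
                   Process(target=collector, args=(b,))]
          for p in procs:
              p.start()
          for p in procs:
              p.join()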

  14. Bamboo mapping of Ethiopia, Kenya and Uganda for the year 2016 using multi-temporal Landsat imagery

    Science.gov (United States)

    Zhao, Yuanyuan; Feng, Duole; Jayaraman, Durai; Belay, Daniel; Sebrala, Heiru; Ngugi, John; Maina, Eunice; Akombo, Rose; Otuoma, John; Mutyaba, Joseph; Kissa, Sam; Qi, Shuhua; Assefa, Fiker; Oduor, Nellie Mugure; Ndawula, Andrew Kalema; Li, Yanxia; Gong, Peng

    2018-04-01

    Mapping the spatial distribution of bamboo in East Africa is necessary for biodiversity conservation, resource management and policy making for rural poverty reduction. In this study, we produced a contemporary bamboo cover map of Ethiopia, Kenya and Uganda for the year 2016 using multi-temporal Landsat imagery series at 30 m spatial resolution. This is the first bamboo map generated using remotely sensed data for these three East African countries that possess most of the African bamboo resource. The producer's and user's accuracies of bamboos are 79.2% and 84.0%, respectively. The hotspots with large amounts of bamboo were identified and the area of bamboo coverage for each region was estimated according to the map. The seasonal growth status of two typical bamboo zones (one highland bamboo and one lowland bamboo) were analyzed and the multi-temporal imagery proved to be useful in differentiating bamboo from other vegetation classes. The images acquired in September to February are less contaminated by clouds and shadows, and the image series cover the dying back process of lowland bamboo, which were helpful for bamboo identification in East Africa.

  15. Separability Analysis of Sentinel-2A Multi-Spectral Instrument (MSI) Data for Burned Area Discrimination

    Directory of Open Access Journals (Sweden)

    Haiyan Huang

    2016-10-01

    Full Text Available Biomass burning is a global phenomenon and systematic burned area mapping is of increasing importance for science and applications. With high spatial resolution and novelty in band design, the recently launched Sentinel-2A satellite provides a new opportunity for moderate spatial resolution burned area mapping. This study examines the performance of the Sentinel-2A Multi Spectral Instrument (MSI) bands and derived spectral indices to differentiate between unburned and burned areas. For this purpose, five pairs of pre-fire and post-fire top of atmosphere (TOA) reflectance and atmospherically corrected (surface) reflectance images were studied. The pixel values of locations that were unburned in the first image and burned in the second image, as well as the values of locations that were unburned in both images which served as a control, were compared and the discrimination of individual bands and spectral indices were evaluated using parametric (transformed divergence) and non-parametric (decision tree) approaches. Based on the results, the most suitable MSI bands to detect burned areas are the 20 m near-infrared, short wave infrared and red-edge bands, while the performance of the spectral indices varied with location. The atmospheric correction only significantly influenced the separability of the visible wavelength bands. The results provide insights that are useful for developing Sentinel-2 burned area mapping algorithms.
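
    For reference, the parametric separability measure named above (transformed divergence) has a compact closed form for Gaussian class models; the sketch below computes it on the 0-2 scale for two sets of band samples, with made-up class statistics.

      import numpy as np

      def transformed_divergence(x_a, x_b):
          """Transformed divergence between two classes given their samples
          (n_pixels x n_bands); values near 2 indicate good separability
          (multiply by 1000 for the 0-2000 convention)."""
          mu_a, mu_b = x_a.mean(axis=0), x_b.mean(axis=0)
          c_a, c_b = np.cov(x_a, rowvar=False), np.cov(x_b, rowvar=False)
          ia, ib = np.linalg.inv(c_a), np.linalg.inv(c_b)
          dm = (mu_a - mu_b)[:, None]
          d = 0.5 * np.trace((c_a - c_b) @ (ib - ia)) \
              + 0.5 * np.trace((ia + ib) @ dm @ dm.T)
          return 2.0 * (1.0 - np.exp(-d / 8.0))

      rng = np.random.default_rng(0)
      burned = rng.normal([0.05, 0.10], 0.02, size=(500, 2))      # two "bands"
      unburned = rng.normal([0.25, 0.30], 0.03, size=(500, 2))
      print(transformed_divergence(burned, unburned))             # close to 2.0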

  16. Parametric study on the behaviour of bolted composite connections

    Directory of Open Access Journals (Sweden)

    M. N. Kataoka

    Full Text Available The studied connections are composed of concrete-filled steel tubes (CFT) connected to composite beams by passing-through bolts, endplates and steel deck, which also contributes to supporting the applied loads. The parametric analysis presented in this work is based on numerical simulations performed with the software TNO Diana, using experimental results to calibrate the reference numerical model. The influence of three main parameters, namely the bolt diameter, the slab height and the beam cross-section, was evaluated. According to the obtained bending moment versus rotation curves, it was concluded that, among the three parameters analyzed, the most important one was the bolt diameter. Regarding the beam cross-section, the results were inconclusive, probably due to the incompatibility between the 16 mm bolts and the robust beam cross-sections considered in the parametric analysis.

  17. MRPack: Multi-Algorithm Execution Using Compute-Intensive Approach in MapReduce

    Science.gov (United States)

    2015-01-01

    Large quantities of data have been generated from multiple sources at exponential rates in the last few years. These data are generated at high velocity as real time and streaming data in variety of formats. These characteristics give rise to challenges in its modeling, computation, and processing. Hadoop MapReduce (MR) is a well known data-intensive distributed processing framework using the distributed file system (DFS) for Big Data. Current implementations of MR only support execution of a single algorithm in the entire Hadoop cluster. In this paper, we propose MapReducePack (MRPack), a variation of MR that supports execution of a set of related algorithms in a single MR job. We exploit the computational capability of a cluster by increasing the compute-intensiveness of MapReduce while maintaining its data-intensive approach. It uses the available computing resources by dynamically managing the task assignment and intermediate data. Intermediate data from multiple algorithms are managed using multi-key and skew mitigation strategies. The performance study of the proposed system shows that it is time, I/O, and memory efficient compared to the default MapReduce. The proposed approach reduces the execution time by 200% with an approximate 50% decrease in I/O cost. Complexity and qualitative results analysis shows significant performance improvement. PMID:26305223

  18. Effect of Software Designed by Computer Conceptual Map Method in Mobile Environment on Learning Level of Nursing Students

    Directory of Open Access Journals (Sweden)

    Salmani N

    2015-12-01

    Full Text Available Aims: In order to sustain its progress, nursing education has to utilize new training methods, such that the teaching methods used by nursing instructors promote significant learning and prevent superficial learning in the students. The conceptual map method is one of the new training strategies playing an important role in this field. The aim of this study was to investigate the effectiveness of software designed with the computer conceptual map method in a mobile phone environment on the learning level of nursing students. Materials & Methods: In this semi-experimental study with a pretest-posttest design, 60 fifth-semester students were studied in the first semester of 2015-16. The experimental group (n=30, from Meibod Nursing Faculty) and the control group (n=30, from Yazd Shahid Sadoughi Nursing Faculty) were trained during the first 4 weeks of the semester using the computer conceptual map method and the computer conceptual map method in a mobile phone environment. Data were collected using a researcher-made academic progress test covering "knowledge" and "significant learning", and analyzed in SPSS 21 software using independent t, paired t, and Fisher tests. Findings: There were significant increases in the mean scores of knowledge and significant learning in both groups before and after the intervention (p<0.05), whereas the difference between the two groups was not significant (p>0.05). Nevertheless, the change in significant learning scores over time differed significantly between the groups (p<0.05). Conclusion: Presenting the course content as a conceptual map in a mobile phone environment positively affects the significant learning of nursing students.

  19. Optical parametric amplification of arbitrarily polarized light in periodically poled LiNbO3.

    Science.gov (United States)

    Shao, Guang-hao; Song, Xiao-shi; Xu, Fei; Lu, Yan-qing

    2012-08-13

    Optical parametric amplification (OPA) of arbitrarily polarized light is proposed in a multi-section periodically poled Lithium Niobate (PPLN). External electric field is applied on selected sections to induce the polarization rotation of involved lights, thus the quasi-phase matched optical parametric processes exhibit polarization insensitivity under suitable voltage. In addition to the amplified signal wave, an idler wave with the same polarization is generated simultaneously. As an example, a ~10 times OPA showing polarization independency is simulated. Applications of this technology are also discussed.

  20. Multi-parametric ultrasound criteria for internal carotid artery disease - comparison with CT angiography

    International Nuclear Information System (INIS)

    Barlinn, Kristian; Kepplinger, Jessica; Siepmann, Timo; Pallesen, Lars-Peder; Bodechtel, Ulf; Reichmann, Heinz; Puetz, Volker; Floegel, Thomas; Kitzler, Hagen H.; Alexandrov, Andrei V.

    2016-01-01

    The German Society of Ultrasound in Medicine (known by its acronym DEGUM) recently proposed a novel multi-parametric ultrasound approach for comprehensive and accurate assessment of extracranial internal carotid artery (ICA) steno-occlusive disease. We determined the agreement between duplex ultrasonography (DUS) interpreted by the DEGUM criteria and CT angiography (CTA) for grading of extracranial ICA steno-occlusive disease. Consecutive patients with acute cerebral ischemia underwent DUS and CTA. Internal carotid artery stenosis was graded according to the DEGUM-recommended criteria for DUS. Independent readers manually performed North American Symptomatic Carotid Endarterectomy Trial-type measurements on axial CTA source images. Both modalities were compared using Spearman's correlation and Bland-Altman analyses. A total of 303 acute cerebral ischemia patients (mean age, 72 ± 12 years; 58 % men; median baseline National Institutes of Health Stroke Scale score, 4 [interquartile range 7]) provided 593 DUS and CTA vessel pairs for comparison. There was a positive correlation between DUS and CTA (rs = 0.783, p < 0.001) with mean difference in degree of stenosis measurement of 3.57 %. Bland-Altman analysis further revealed widely varying differences (95 % limits of agreement -29.26 to 22.84) between the two modalities. Although the novel DEGUM criteria showed overall good agreement between DUS and CTA across all stenosis ranges, potential for wide incongruence with CTA underscores the need for local laboratory validation to avoid false screening results. (orig.)

  1. Multi-parametric ultrasound criteria for internal carotid artery disease - comparison with CT angiography

    Energy Technology Data Exchange (ETDEWEB)

    Barlinn, Kristian; Kepplinger, Jessica; Siepmann, Timo; Pallesen, Lars-Peder; Bodechtel, Ulf; Reichmann, Heinz; Puetz, Volker [Carl Gustav Carus University Hospital, Technische Universitaet Dresden, Department of Neurology, Dresden (Germany); Floegel, Thomas [Carl Gustav Carus University Hospital, Technische Universitaet Dresden, Department of Neurology, Dresden (Germany); Carl Gustav Carus University Hospital, Technische Universitaet Dresden, Department of Neuroradiology, Dresden (Germany); Kitzler, Hagen H. [Carl Gustav Carus University Hospital, Technische Universitaet Dresden, Department of Neuroradiology, Dresden (Germany); Alexandrov, Andrei V. [The University of Tennessee Health Science Center, Department of Neurology, Memphis, TN (United States)

    2016-09-15

    The German Society of Ultrasound in Medicine (known by its acronym DEGUM) recently proposed a novel multi-parametric ultrasound approach for comprehensive and accurate assessment of extracranial internal carotid artery (ICA) steno-occlusive disease. We determined the agreement between duplex ultrasonography (DUS) interpreted by the DEGUM criteria and CT angiography (CTA) for grading of extracranial ICA steno-occlusive disease. Consecutive patients with acute cerebral ischemia underwent DUS and CTA. Internal carotid artery stenosis was graded according to the DEGUM-recommended criteria for DUS. Independent readers manually performed North American Symptomatic Carotid Endarterectomy Trial-type measurements on axial CTA source images. Both modalities were compared using Spearman's correlation and Bland-Altman analyses. A total of 303 acute cerebral ischemia patients (mean age, 72 ± 12 years; 58 % men; median baseline National Institutes of Health Stroke Scale score, 4 [interquartile range 7]) provided 593 DUS and CTA vessel pairs for comparison. There was a positive correlation between DUS and CTA (rs = 0.783, p < 0.001) with mean difference in degree of stenosis measurement of 3.57 %. Bland-Altman analysis further revealed widely varying differences (95 % limits of agreement -29.26 to 22.84) between the two modalities. Although the novel DEGUM criteria showed overall good agreement between DUS and CTA across all stenosis ranges, potential for wide incongruence with CTA underscores the need for local laboratory validation to avoid false screening results. (orig.)
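
    The Bland-Altman agreement analysis used in the records above is straightforward to reproduce; the sketch below computes the mean difference (bias) and 95% limits of agreement for hypothetical paired stenosis gradings (the numbers are invented).

      import numpy as np

      def bland_altman(a, b):
          """Bias and 95% limits of agreement between two measurement methods."""
          diff = np.asarray(a, float) - np.asarray(b, float)
          bias, sd = diff.mean(), diff.std(ddof=1)
          return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

      dus = [70, 55, 90, 40, 65, 80]      # hypothetical percent stenosis by DUS
      cta = [68, 60, 85, 45, 60, 82]      # hypothetical percent stenosis by CTA
      bias, loa = bland_altman(dus, cta)
      print(round(bias, 2), tuple(round(v, 2) for v in loa))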

  2. Information rich mapping requirement to product architecture through functional system deployment: The multi entity domain approach

    DEFF Research Database (Denmark)

    Hauksdóttir, Dagný; Mortensen, Niels Henrik

    2017-01-01

    may impede the ability to evolve, maintain or reuse systems. In this paper the Multi Entity Domain Approach (MEDA) is presented. The approach combines different design information within the domain views, incorporates both Software and Hardware design and supports iterative requirements definition...

  3. Preliminary Multi-Variable Parametric Cost Model for Space Telescopes

    Science.gov (United States)

    Stahl, H. Philip; Hendrichs, Todd

    2010-01-01

    This slide presentation reviews creating a preliminary multi-variable cost model for the contract costs of making a space telescope. There is discussion of the methodology for collecting the data, definition of the statistical analysis methodology, single variable model results, testing of historical models and an introduction of the multi variable models.

  4. Experimental demonstration of bandwidth on demand (BoD) provisioning based on time scheduling in software-defined multi-domain optical networks

    Science.gov (United States)

    Zhao, Yongli; Li, Yajie; Wang, Xinbo; Chen, Bowen; Zhang, Jie

    2016-09-01

    A hierarchical software-defined networking (SDN) control architecture is designed for multi-domain optical networks with the Open Daylight (ODL) controller. The OpenFlow-based Control Virtual Network Interface (CVNI) protocol is deployed between the network orchestrator and the domain controllers. Then, a dynamic bandwidth on demand (BoD) provisioning solution is proposed based on time scheduling in software-defined multi-domain optical networks (SD-MDON). Shared Risk Link Groups (SRLG)-disjoint routing schemes are adopted to separate each tenant for reliability. The SD-MDON testbed is built based on the proposed hierarchical control architecture. Then the proposed time scheduling-based BoD (Ts-BoD) solution is experimentally demonstrated on the testbed. The performance of the Ts-BoD solution is evaluated with respect to blocking probability, resource utilization, and lightpath setup latency.

  5. Satellite SAR interferometric techniques applied to emergency mapping

    Science.gov (United States)

    Stefanova Vassileva, Magdalena; Riccardi, Paolo; Lecci, Daniele; Giulio Tonolo, Fabio; Boccardo Boccardo, Piero; Chiesa, Giuliana; Angeluccetti, Irene

    2017-04-01

    This paper aims to investigate the capabilities of the currently available SAR interferometric algorithms in the field of emergency mapping. Several tests have been performed exploiting the Copernicus Sentinel-1 data using the COTS software ENVI/SARscape 5.3. Emergency Mapping can be defined as "creation of maps, geo-information products and spatial analyses dedicated to providing situational awareness for emergency management and immediate crisis information for response by means of extraction of reference (pre-event) and crisis (post-event) geographic information/data from satellite or aerial imagery". The conventional differential SAR interferometric technique (DInSAR) and the two currently available multi-temporal SAR interferometric approaches, i.e. Permanent Scatterer Interferometry (PSI) and Small BAseline Subset (SBAS), have been applied to provide crisis information useful for the emergency management activities. Depending on the Emergency Management phase considered, a distinction may be made between rapid mapping, i.e. fast provision of geospatial data regarding the area affected for the immediate emergency response, and monitoring mapping, i.e. detection of phenomena for risk prevention and mitigation activities. In order to evaluate the potential and limitations of the aforementioned SAR interferometric approaches for the specific rapid and monitoring mapping application, five main factors have been taken into account: crisis information extracted, input data required, processing time and expected accuracy. The results highlight that DInSAR has the capacity to delineate areas affected by large and sudden deformations and fulfills most of the immediate response requirements. The main limiting factor of interferometry is the availability of suitable SAR acquisitions immediately after the event (e.g. the Sentinel-1 mission, characterized by a 6-day revisiting time, may not always satisfy the immediate emergency request). PSI and SBAS techniques are suitable to produce

  6. DAFNE: A Matlab toolbox for Bayesian multi-source remote sensing and ancillary data fusion, with application to flood mapping

    Science.gov (United States)

    D'Addabbo, Annarita; Refice, Alberto; Lovergine, Francesco P.; Pasquariello, Guido

    2018-03-01

    High-resolution, remotely sensed images of the Earth surface have been proven to be of help in producing detailed flood maps, thanks to their synoptic overview of the flooded area and frequent revisits. However, flood scenarios can be complex situations, requiring the integration of different data in order to provide accurate and robust flood information. Several processing approaches have been recently proposed to efficiently combine and integrate heterogeneous information sources. In this paper, we introduce DAFNE, a Matlab®-based, open source toolbox, conceived to produce flood maps from remotely sensed and other ancillary information, through a data fusion approach. DAFNE is based on Bayesian Networks, and is composed of several independent modules, each one performing a different task. Multi-temporal and multi-sensor data can be easily handled, with the possibility of following the evolution of an event through multi-temporal output flood maps. Each DAFNE module can be easily modified or upgraded to meet different user needs. The DAFNE suite is presented together with an example of its application.
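
    A drastically simplified version of the fusion idea, useful only to fix intuition, is a per-pixel naive-Bayes combination of sensor likelihoods (DAFNE itself uses richer Bayesian networks); all probabilities below are invented.

      import numpy as np

      def flood_posterior(prior, likelihoods):
          """Posterior probability of 'flooded' assuming the sensor observations
          are conditionally independent given the class (naive Bayes)."""
          num = np.asarray(prior, float)
          den = 1.0 - num
          for p_flood, p_dry in likelihoods:
              num = num * p_flood
              den = den * p_dry
          return num / (num + den)

      # One pixel, two sources: SAR strongly suggests water, optical is ambiguous.
      post = flood_posterior(prior=0.1,
                             likelihoods=[(0.9, 0.2),    # SAR
                                          (0.6, 0.5)])   # optical
      print(round(float(post), 3))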

  7. A Transformational Approach to Parametric Accumulated-Cost Static Profiling

    DEFF Research Database (Denmark)

    Haemmerlé, Rémy; López García, Pedro; Liqat, Umer

    2016-01-01

    Traditional static resource analyses estimate the total resource usage of a program, without executing it. In this paper we present a novel resource analysis whose aim is instead the static profiling of accumulated cost, i.e., to discover, for selected parts of the program, an estimate or bound...... of the resource usage accumulated in each of those parts. Traditional resource analyses are parametric in the sense that the results can be functions on input data sizes. Our static profiling is also parametric, i.e., our accumulated cost estimates are also parameterized by input data sizes. Our proposal is based...... on the concept of cost centers and a program transformation that allows the static inference of functions that return bounds on these accumulated costs depending on input data sizes, for each cost center of interest. Such information is much more useful to the software developer than the traditional resource...

  8. Integrating acoustic analysis in the architectural design process using parametric modelling

    DEFF Research Database (Denmark)

    Peters, Brady

    2011-01-01

    This paper discusses how parametric modeling techniques can be used to provide architectural designers with a better understanding of the acoustic performance of their designs and provide acoustic engineers with models that can be analyzed using computational acoustic analysis software. Architects......, acoustic performance can inform the geometry and material logic of the design. In this way, the architectural design and the acoustic analysis model become linked....

  9. Intensity correction method customized for multi-animal abdominal MR imaging with 3 T clinical scanner and multi-array coil

    International Nuclear Information System (INIS)

    Mitsuda, Minoru; Yamaguchi, Masayuki; Nakagami, Ryutaro; Furuta, Toshihiro; Fujii, Hirofumi; Sekine, Norio; Niitsu, Mamoru; Moriyama, Noriyuki

    2013-01-01

    Simultaneous magnetic resonance (MR) imaging of multiple small animals in a single session increases throughput of preclinical imaging experiments. Such imaging using a 3-tesla clinical scanner with multi-array coil requires correction of intensity variation caused by the inhomogeneous sensitivity profile of the coil. We explored a method for correcting intensity that we customized for multi-animal MR imaging, especially abdominal imaging. Our institutional committee for animal experimentation approved the protocol. We acquired high resolution T1-, T2-, and T2*-weighted images and low resolution proton density-weighted images (PDWIs) of 4 rat abdomens simultaneously using a 3T clinical scanner and custom-made multi-array coil. For comparison, we also acquired T1-, T2-, and T2*-weighted volume coil images in the same rats in 4 separate sessions. We used software created in-house to correct intensity variation. We applied thresholding to the PDWIs to produce binary images that displayed only a signal-producing area, calculated multi-array coil sensitivity maps by dividing low-pass filtered PDWIs by low-pass filtered binary images pixel by pixel, and divided uncorrected T1-, T2-, or T2*-weighted images by those maps to obtain intensity-corrected images. We compared tissue contrast among the liver, spinal canal, and muscle between intensity-corrected multi-array coil images and volume coil images. Our intensity correction method performed well for all pulse sequences studied and corrected variation in original multi-array coil images without deteriorating the throughput of animal experiments. Tissue contrasts were comparable between intensity-corrected multi-array coil images and volume coil images. Our intensity correction method customized for multi-animal abdominal MR imaging using a 3T clinical scanner and dedicated multi-array coil could facilitate image interpretation. (author)
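
    The correction scheme described above (threshold the PDWI, low-pass filter both the PDWI and the binary mask, divide to get a sensitivity map, then divide the weighted image by that map) can be sketched with NumPy/SciPy; the threshold and filter width below are arbitrary choices, not the study's settings.

      import numpy as np
      from scipy.ndimage import gaussian_filter

      def correct_intensity(weighted, pdwi, threshold, sigma=15.0):
          """Divide a weighted image by a coil sensitivity map estimated as
          low-pass(PDWI) / low-pass(binary signal mask)."""
          mask = (pdwi > threshold).astype(float)
          sens = gaussian_filter(pdwi, sigma) / (gaussian_filter(mask, sigma) + 1e-6)
          sens /= sens[mask > 0].mean()            # normalise map to ~1 in tissue
          return weighted / (sens + 1e-6), sens

      # Synthetic check: a flat phantom modulated by a smooth coil profile.
      y, x = np.mgrid[0:128, 0:128]
      coil = np.exp(-((x - 30.0) ** 2 + (y - 64.0) ** 2) / (2 * 60.0 ** 2))
      phantom = np.zeros((128, 128))
      phantom[32:96, 32:96] = 1.0
      corrected, _ = correct_intensity(phantom * coil, phantom * coil, 0.05)
      print(corrected[32:96, 32:96].std())         # much flatter than phantom*coil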

  10. Real-time SHVC software decoding with multi-threaded parallel processing

    Science.gov (United States)

    Gudumasu, Srinivas; He, Yuwen; Ye, Yan; He, Yong; Ryu, Eun-Seok; Dong, Jie; Xiu, Xiaoyu

    2014-09-01

    This paper proposes a parallel decoding framework for scalable HEVC (SHVC). Various optimization technologies are implemented on the basis of SHVC reference software SHM-2.0 to achieve real-time decoding speed for the two layer spatial scalability configuration. SHVC decoder complexity is analyzed with profiling information. The decoding process at each layer and the up-sampling process are designed in parallel and scheduled by a high level application task manager. Within each layer, multi-threaded decoding is applied to accelerate the layer decoding speed. Entropy decoding, reconstruction, and in-loop processing are pipeline designed with multiple threads based on groups of coding tree units (CTU). A group of CTUs is treated as a processing unit in each pipeline stage to achieve a better trade-off between parallelism and synchronization. Motion compensation, inverse quantization, and inverse transform modules are further optimized with SSE4 SIMD instructions. Simulations on a desktop with an Intel i7 processor 2600 running at 3.4 GHz show that the parallel SHVC software decoder is able to decode 1080p spatial 2x at up to 60 fps (frames per second) and 1080p spatial 1.5x at up to 50 fps for those bitstreams generated with SHVC common test conditions in the JCT-VC standardization group. The decoding performance at various bitrates with different optimization technologies and different numbers of threads are compared in terms of decoding speed and resource usage, including processor and memory.

  11. Software defined multi-OLT passive optical network for flexible traffic allocation

    Science.gov (United States)

    Zhang, Shizong; Gu, Rentao; Ji, Yuefeng; Zhang, Jiawei; Li, Hui

    2016-10-01

    With the rapid growth of 4G mobile network and vehicular network services mobile terminal users have increasing demand on data sharing among different radio remote units (RRUs) and roadside units (RSUs). Meanwhile, commercial video-streaming, video/voice conference applications delivered through peer-to-peer (P2P) technology are still keep on stimulating the sharp increment of bandwidth demand in both business and residential subscribers. However, a significant issue is that, although wavelength division multiplexing (WDM) and orthogonal frequency division multiplexing (OFDM) technology have been proposed to fulfil the ever-increasing bandwidth demand in access network, the bandwidth of optical fiber is not unlimited due to the restriction of optical component properties and modulation/demodulation technology, and blindly increase the wavelength cannot meet the cost-sensitive characteristic of the access network. In this paper, we propose a software defined multi-OLT PON architecture to support efficient scheduling of access network traffic. By introducing software defined networking technology and wavelength selective switch into TWDM PON system in central office, multiple OLTs can be considered as a bandwidth resource pool and support flexible traffic allocation for optical network units (ONUs). Moreover, under the configuration of the control plane, ONUs have the capability of changing affiliation between different OLTs under different traffic situations, thus the inter-OLT traffic can be localized and the data exchange pressure of the core network can be released. Considering this architecture is designed to be maximum following the TWDM PON specification, the existing optical distribution network (ODN) investment can be saved and conventional EPON/GPON equipment can be compatible with the proposed architecture. What's more, based on this architecture, we propose a dynamic wavelength scheduling algorithm, which can be deployed as an application on control plane

  12. Low cost, multiscale and multi-sensor application for flooded area mapping

    Directory of Open Access Journals (Sweden)

    D. Giordan

    2018-05-01

    Full Text Available Flood mapping and estimation of the maximum water depth are essential elements for the first damage evaluation, civil protection intervention planning and detection of areas where remediation is needed. In this work, we present and discuss a methodology for mapping and quantifying flood severity over floodplains. The proposed methodology considers a multiscale and multi-sensor approach using free or low-cost data and sensors. We applied this method to the November 2016 Piedmont (northwestern Italy) flood. We first mapped the flooded areas at the basin scale using free satellite data from low to medium-high resolution from both SAR (Sentinel-1, COSMO-SkyMed) and multispectral sensors (MODIS, Sentinel-2). Using very- and ultra-high-resolution images from the low-cost aerial platform and remotely piloted aerial system, we refined the flooded zone and detected the most damaged sector. The presented method considers both urbanised and non-urbanised areas. Nadiral images have several limitations, in particular in urbanised areas, where the use of terrestrial images solved this limitation. Very- and ultra-high-resolution images were processed with structure from motion (SfM) for the realisation of 3-D models. These data, combined with an available digital terrain model, allowed us to obtain maps of the flooded area, maximum high water area and damaged infrastructures.

  13. A Systematic Mapping Study of Tools for Distributed Software Development Teams

    DEFF Research Database (Denmark)

    Tell, Paolo; Ali Babar, Muhammad

    Context: A wide variety of technologies have been developed to support Global Software Development (GSD). However, the information about the dozens of available solutions is quite diverse and scattered, making it quite difficult to have an overview able to identify common trends and unveil research gaps. Objective: The objective of this research is to systematically identify and classify a comprehensive list of the technologies that have been developed and/or used for supporting GSD teams. Method: This study has been undertaken as a Systematic Mapping Study (SMS). Our searches identified 1958...... Conclusions: The findings show that whilst commercial and open source solutions are predominantly...... schemas for providing a framework that can help identify the categories that have attracted significant amount of research and commercial efforts, and the research areas where there are gaps to be filled.

  14. Parametric Room Acoustic Workflows

    DEFF Research Database (Denmark)

    Parigi, Dario; Svidt, Kjeld; Molin, Erik

    2017-01-01

    The paper investigates and assesses different room acoustics software and the opportunities they offer to engage in parametric acoustics workflows and to influence architectural designs. The first step consists in the testing and benchmarking of different tools on the basis of accuracy, speed...... and interoperability with Grasshopper 3d. The focus is placed on the benchmarking of three different acoustic analysis tools based on raytracing. To compare the accuracy and speed of the acoustic evaluation across different tools, a homogeneous set of acoustic parameters is chosen. The room acoustics parameters...... included in the set are reverberation time (EDT, RT30), clarity (C50), loudness (G), and definition (D50). Scenarios are discussed for determining at different design stages the most suitable acoustic tool. Those scenarios are characterized by the use of less accurate but fast evaluation tools to be used

  15. Change of diffusion anisotropy in patients with acute cerebral infarction using statistical parametric analysis

    International Nuclear Information System (INIS)

    Morita, Naomi; Harada, Masafumi; Uno, Masaaki; Furutani, Kaori; Nishitani, Hiromu

    2006-01-01

    We conducted statistical parametric comparison of fractional anisotropy (FA) images and quantified FA values to determine whether significant change occurs in the ischemic region. The subjects were 20 patients seen within 24 h after onset of ischemia. For statistical comparison of FA images, a sample FA image was registered to the Talairach template, and each FA map was normalized. Statistical comparison was conducted using statistical parametric mapping (SPM) 99. Regions of interest were set in the same region on apparent diffusion coefficient (ADC) and FA maps, the region being consistent with the hyperintense region on diffusion-weighted images (DWIs). The contralateral region was also measured to obtain asymmetry ratios of ADC and FA. Regions with areas of statistical significance on FA images were found only in the white matter of three patients, although the regions were smaller than the hyperintense regions on DWIs. The mean ADC and FA ratios were 0.64±0.16 and 0.93±0.09, respectively, and the degree of FA change was less than that of the ADC change. Significant change in diffusion anisotropy was limited to the severely infarcted core of the white matter. We believe statistical comparison of FA maps to be useful for detecting different regions of diffusion anisotropy. (author)

  16. Ab initio and template-based prediction of multi-class distance maps by two-dimensional recursive neural networks

    Directory of Open Access Journals (Sweden)

    Martin Alberto JM

    2009-01-01

    Full Text Available Abstract Background Prediction of protein structures from their sequences is still one of the open grand challenges of computational biology. Some approaches to protein structure prediction, especially ab initio ones, rely to some extent on the prediction of residue contact maps. Residue contact map predictions have been assessed at the CASP competition for several years now. Although it has been shown that exact contact maps generally yield correct three-dimensional structures, this is true only at a relatively low resolution (3–4 Å from the native structure. Another known weakness of contact maps is that they are generally predicted ab initio, that is not exploiting information about potential homologues of known structure. Results We introduce a new class of distance restraints for protein structures: multi-class distance maps. We show that Cα trace reconstructions based on 4-class native maps are significantly better than those from residue contact maps. We then build two predictors of 4-class maps based on recursive neural networks: one ab initio, or relying on the sequence and on evolutionary information; one template-based, or in which homology information to known structures is provided as a further input. We show that virtually any level of sequence similarity to structural templates (down to less than 10% yields more accurate 4-class maps than the ab initio predictor. We show that template-based predictions by recursive neural networks are consistently better than the best template and than a number of combinations of the best available templates. We also extract binary residue contact maps at an 8 Å threshold (as per CASP assessment from the 4-class predictors and show that the template-based version is also more accurate than the best template and consistently better than the ab initio one, down to very low levels of sequence identity to structural templates. Furthermore, we test both ab-initio and template-based 8
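
    To make the notion of a multi-class distance map concrete, the sketch below bins pairwise Cα-Cα distances into four classes; the class boundaries are illustrative assumptions (the first one set to the 8 Å contact cut-off mentioned above), not necessarily the thresholds used by the predictors.

      import numpy as np

      def multiclass_distance_map(ca_coords, thresholds=(8.0, 13.0, 19.0)):
          """4-class residue-residue distance map from C-alpha coordinates
          (N x 3, Angstrom). Class 0 corresponds to the 8 A contact definition;
          the remaining boundaries are assumed for illustration."""
          diff = ca_coords[:, None, :] - ca_coords[None, :, :]
          dist = np.sqrt((diff ** 2).sum(axis=-1))
          return np.digitize(dist, thresholds)          # values in {0, 1, 2, 3}

      # A random-walk "backbone" standing in for a 50-residue chain.
      rng = np.random.default_rng(0)
      coords = np.cumsum(rng.normal(0.0, 2.0, size=(50, 3)), axis=0)
      print(np.bincount(multiclass_distance_map(coords).ravel(), minlength=4))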

  17. A novel approach for multiple mobile objects path planning: Parametrization method and conflict resolution strategy

    International Nuclear Information System (INIS)

    Ma, Yong; Wang, Hongwei; Zamirian, M.

    2012-01-01

    We present a new approach containing two steps to determine conflict-free paths for mobile objects in two and three dimensions with moving obstacles. Firstly, the shortest path of each object is set as goal function which is subject to collision-avoidance criterion, path smoothness, and velocity and acceleration constraints. This problem is formulated as calculus of variation problem (CVP). Using parametrization method, CVP is converted to time-varying nonlinear programming problems (TNLPP) and then resolved. Secondly, move sequence of object is assigned by priority scheme; conflicts are resolved by multilevel conflict resolution strategy. Approach efficiency is confirmed by numerical examples. -- Highlights: ► Approach with parametrization method and conflict resolution strategy is proposed. ► Approach fits for multi-object paths planning in two and three dimensions. ► Single object path planning and multi-object conflict resolution are orderly used. ► Path of each object obtained with parameterization method in the first phase. ► Conflict-free paths gained by multi-object conflict resolution in the second phase.

  18. A New Multi-Step Iterative Algorithm for Approximating Common Fixed Points of a Finite Family of Multi-Valued Bregman Relatively Nonexpansive Mappings

    Directory of Open Access Journals (Sweden)

    Wiyada Kumam

    2016-05-01

    Full Text Available In this article, we introduce a new multi-step iteration for approximating a common fixed point of a finite class of multi-valued Bregman relatively nonexpansive mappings in the setting of reflexive Banach spaces. We prove a strong convergence theorem for the proposed iterative algorithm under certain hypotheses. Additionally, we also use our results for the solution of variational inequality problems and to find the zero points of maximal monotone operators. The theorems furnished in this work are new and well-established and generalize many well-known recent research works in this field.

  19. ECG strain pattern in hypertension is associated with myocardial cellular expansion and diffuse interstitial fibrosis: a multi-parametric cardiac magnetic resonance study.

    Science.gov (United States)

    Rodrigues, Jonathan C L; Amadu, Antonio Matteo; Ghosh Dastidar, Amardeep; McIntyre, Bethannie; Szantho, Gergley V; Lyen, Stephen; Godsave, Cattleya; Ratcliffe, Laura E K; Burchell, Amy E; Hart, Emma C; Hamilton, Mark C K; Nightingale, Angus K; Paton, Julian F R; Manghat, Nathan E; Bucciarelli-Ducci, Chiara

    2017-04-01

    In hypertension, the presence of left ventricular (LV) strain pattern on 12-lead electrocardiogram (ECG) carries adverse cardiovascular prognosis. The underlying mechanisms are poorly understood. We investigated whether hypertensive ECG strain is associated with myocardial interstitial fibrosis and impaired myocardial strain, assessed by multi-parametric cardiac magnetic resonance (CMR). A total of 100 hypertensive patients [50 ± 14 years, male: 58%, office systolic blood pressure (SBP): 170 ± 30 mmHg, office diastolic blood pressure (DBP): 97 ± 14 mmHg] underwent ECG and 1.5T CMR and were compared with 25 normotensive controls (46 ± 14 years, 60% male, SBP: 124 ± 8 mmHg, DBP: 76 ± 7 mmHg). Native T1 and extracellular volume fraction (ECV) were calculated with the modified look-locker inversion-recovery sequence. Myocardial strain values were estimated with voxel-tracking software. ECG strain (n = 20) was associated with significantly higher indexed LV mass (LVM) (119 ± 32 vs. 80 ± 17 g/m2) than in hypertensive subjects without ECG strain (n = 80). ECG strain subjects had significantly impaired circumferential strain compared with hypertensive subjects without ECG strain and controls (-15.2 ± 4.7 vs. -17.0 ± 3.3 vs. -17.3 ± 2.4%). Comparing ECG strain subjects to hypertensive subjects with elevated LVM but no ECG strain, a significantly higher ECV (30 ± 4 vs. 28 ± 3%) was observed, and this remained independently associated with ECG strain in multivariate logistic regression analysis [odds ratio (95% confidence interval): 1.07 (1.02-1.12)]. ECG strain is a marker of advanced LVH associated with increased interstitial fibrosis and with significant myocardial circumferential strain impairment. © The Author 2016. Published by Oxford University Press on behalf of the European Society of Cardiology.

  20. Multi-Modal Detection and Mapping of Static and Dynamic Obstacles in Agriculture for Process Evaluation

    Directory of Open Access Journals (Sweden)

    Timo Korthals

    2018-03-01

    Full Text Available Today, agricultural vehicles are available that can automatically perform tasks such as weed detection and spraying, mowing, and sowing while being steered automatically. However, for such systems to be fully autonomous and self-driven, not only their specific agricultural tasks must be automated. An accurate and robust perception system automatically detecting and avoiding all obstacles must also be realized to ensure safety of humans, animals, and other surroundings. In this paper, we present a multi-modal obstacle and environment detection and recognition approach for process evaluation in agricultural fields. The proposed pipeline detects and maps static and dynamic obstacles globally, while providing process-relevant information along the traversed trajectory. Detection algorithms are introduced for a variety of sensor technologies, including range sensors (lidar and radar and cameras (stereo and thermal. Detection information is mapped globally into semantical occupancy grid maps and fused across all sensors with late fusion, resulting in accurate traversability assessment and semantical mapping of process-relevant categories (e.g., crop, ground, and obstacles. Finally, a decoding step uses a Hidden Markov model to extract relevant process-specific parameters along the trajectory of the vehicle, thus informing a potential control system of unexpected structures in the planned path. The method is evaluated on a public dataset for multi-modal obstacle detection in agricultural fields. Results show that a combination of multiple sensor modalities increases detection performance and that different fusion strategies must be applied between algorithms detecting similar and dissimilar classes.
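
    The "late fusion into occupancy grid maps" step can be illustrated with the standard log-odds combination of per-sensor occupancy probabilities on a common grid; this is a generic sketch of the principle, not the paper's semantical fusion pipeline, and the numbers are invented.

      import numpy as np

      def fuse_occupancy(prob_maps):
          """Fuse per-sensor occupancy probabilities by summing log-odds;
          cells reported at 0.5 (unknown) contribute nothing."""
          logodds = np.zeros_like(np.asarray(prob_maps[0], dtype=float))
          for p in prob_maps:
              p = np.clip(np.asarray(p, dtype=float), 1e-3, 1.0 - 1e-3)
              logodds += np.log(p / (1.0 - p))
          return 1.0 / (1.0 + np.exp(-logodds))

      lidar  = np.array([[0.9, 0.5], [0.2, 0.5]])
      radar  = np.array([[0.7, 0.5], [0.4, 0.5]])
      camera = np.array([[0.8, 0.6], [0.5, 0.3]])
      print(np.round(fuse_occupancy([lidar, radar, camera]), 2))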

  1. Direct Parametric Reconstruction With Joint Motion Estimation/Correction for Dynamic Brain PET Data.

    Science.gov (United States)

    Jiao, Jieqing; Bousse, Alexandre; Thielemans, Kris; Burgos, Ninon; Weston, Philip S J; Schott, Jonathan M; Atkinson, David; Arridge, Simon R; Hutton, Brian F; Markiewicz, Pawel; Ourselin, Sebastien

    2017-01-01

    Direct reconstruction of parametric images from raw photon counts has been shown to improve the quantitative analysis of dynamic positron emission tomography (PET) data. However it suffers from subject motion which is inevitable during the typical acquisition time of 1-2 hours. In this work we propose a framework to jointly estimate subject head motion and reconstruct the motion-corrected parametric images directly from raw PET data, so that the effects of distorted tissue-to-voxel mapping due to subject motion can be reduced in reconstructing the parametric images with motion-compensated attenuation correction and spatially aligned temporal PET data. The proposed approach is formulated within the maximum likelihood framework, and efficient solutions are derived for estimating subject motion and kinetic parameters from raw PET photon count data. Results from evaluations on simulated [11C]raclopride data using the Zubal brain phantom and real clinical [18F]florbetapir data of a patient with Alzheimer's disease show that the proposed joint direct parametric reconstruction motion correction approach can improve the accuracy of quantifying dynamic PET data with large subject motion.

  2. Multi-component optical solitary waves

    DEFF Research Database (Denmark)

    Kivshar, Y. S.; Sukhorukov, A. A.; Ostrovskaya, E. A.

    2000-01-01

    We discuss several novel types of multi-component (temporal and spatial) envelope solitary waves that appear in fiber and waveguide nonlinear optics. In particular, we describe multi-channel solitary waves in bit-parallel-wavelength fiber transmission systems for highperformance computer networks......, multi-color parametric spatial solitary waves due to cascaded nonlinearities of quadratic materials, and quasiperiodic envelope solitons due to quasi-phase-matching in Fibonacci optical superlattices. (C) 2000 Elsevier Science B.V. All rights reserved....

  3. Web mapping: tools and solutions for creating interactive maps of forestry interest

    Directory of Open Access Journals (Sweden)

    Notarangelo G

    2011-12-01

    Full Text Available The spread of geobrowsers as tools for displaying geographically referenced information provides insights and opportunities to those who, while not specialists in Geographic Information Systems, want to take advantage of the exploration and communication power offered by this kind of software. Through the use of web services such as Google Maps and of suitable markup languages, one can create interactive maps starting from highly heterogeneous data and information. These interactive maps can also be easily distributed and shared with Internet users, because they require neither proprietary software nor special skills, but only a web browser. Unlike the maps created with GIS, whose output is usually a static image, interactive maps retain all their features to the users' advantage. This paper describes a web application that, using the Keyhole Markup Language and the free service of Google Maps, produces choropleth maps of some forest indicators estimated by the last Italian National Forest Inventory. The creation of a map is done through a simple and intuitive interface. The maps created by users can be downloaded as KML files and can be viewed or modified via the freeware application Google Earth or free and open-source GIS software such as Quantum GIS. The web application is free and available at www.ricercaforestale.it.
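
    The kind of KML the article relies on can be produced without any proprietary tool; the sketch below writes a single placemark by plain string formatting in Python (the coordinates and attribute value are invented, and a real choropleth would use Polygon geometries and styles instead of a Point).

      KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
      <kml xmlns="http://www.opengis.net/kml/2.2">
        <Document>
          <Placemark>
            <name>{name}</name>
            <description>{description}</description>
            <Point><coordinates>{lon},{lat},0</coordinates></Point>
          </Placemark>
        </Document>
      </kml>
      """

      doc = KML_TEMPLATE.format(name="Sample forest plot",
                                description="Growing stock: 150 m3/ha (example value)",
                                lat=41.9, lon=12.5)
      with open("forest_plot.kml", "w", encoding="utf-8") as fh:
          fh.write(doc)   # openable in Google Earth or Quantum GIS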

  4. Parametric Study of Sealant Nozzle

    Science.gov (United States)

    Yamamoto, Yoshimi

    Recent years have seen considerable advancement of manufacturing processes in the aerospace industry. Sealant nozzles are critical devices in fuel tank applications for optimal bonds and for ground service support and repair. Sealant flow has always been a challenging area to optimize and understand. A parametric study was conducted to better understand the geometric effects on sealant flow and to determine whether the sealant rheology can be numerically modeled. The Star-CCM+ software was used to successfully develop the parametric model, material model, physics continua, and simulate the fluid flow for the sealant nozzle. The simulation results of Semco sealant nozzles showed the geometric effects on fluid flow patterns and the influences of the conical area reduction, tip length, inlet diameter, and tip angle parameters. A smaller outlet diameter induced maximum outlet velocity at the exit and contributed to a high pressure drop. The conical area reduction, tip angle and inlet diameter contributed most to the viscosity variation phenomenon. Developing and simulating 2 different flow models (Segregated Flow and Viscous Flow) proved that both can be used to obtain comparable velocity and pressure drop results; however, differences are seen visually in the non-uniformity of the velocity and viscosity fields for the Viscous Flow Model (VFM). A comprehensive simulation setup for sealant nozzles was developed so other analysts can utilize the data.

  5. Nonlinear Dynamical Analysis for the Cable Excited with Parametric and Forced Excitation

    Directory of Open Access Journals (Sweden)

    C. Z. Qian

    2014-01-01

    Full Text Available Considering the effect of deck vibration on the cables of a cable-stayed bridge, and using nonlinear structural dynamics theory, a nonlinear dynamical equation for a stay cable excited by deck vibration is proposed. The analysis shows that the vertical vibration of the deck has a combined parametric and forced excitation effect on the cable when the inclination angle of the cable is taken into consideration. Using the multiscale method, the 1/2 principal parametric resonance is studied and the bifurcation equation is obtained. In addition to the parameter analysis, the bifurcation characteristics of the dynamical system are studied. Finally, by means of numerical methods and the software MATHMATIC, the influence of the system parameters on the dynamical behavior of the system is studied, and some useful conclusions are obtained.
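
    As a purely schematic illustration of a system under combined parametric and forced excitation, the following sketch integrates a single-mode, Mathieu-type cable equation with an additional direct forcing term near the 1/2 principal parametric resonance. The equation form and all coefficients are illustrative assumptions, not those derived in the paper.

        # Schematic single-mode model with combined parametric and forced excitation
        # (coefficients are arbitrary illustrative values, not taken from the paper).
        import numpy as np
        from scipy.integrate import solve_ivp

        zeta, omega, eps, gamma, F = 0.01, 1.0, 0.3, 0.5, 0.05
        Omega = 2.0 * omega          # near the 1/2 (principal) parametric resonance

        def rhs(t, y):
            x, v = y
            dvdt = (-2.0 * zeta * omega * v
                    - omega**2 * (1.0 + eps * np.cos(Omega * t)) * x
                    - gamma * x**3
                    + F * np.cos(Omega * t))
            return [v, dvdt]

        sol = solve_ivp(rhs, (0.0, 400.0), [0.01, 0.0], max_step=0.05)
        print("steady-state amplitude ~", np.abs(sol.y[0][-2000:]).max())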

  6. The problem of low variance voxels in statistical parametric mapping; a new hat avoids a 'haircut'.

    Science.gov (United States)

    Ridgway, Gerard R; Litvak, Vladimir; Flandin, Guillaume; Friston, Karl J; Penny, Will D

    2012-02-01

    Statistical parametric mapping (SPM) locates significant clusters based on a ratio of signal to noise (a 'contrast' of the parameters divided by its standard error), meaning that very low noise regions, for example outside the brain, can attain artefactually high statistical values. Similarly, the commonly applied preprocessing step of Gaussian spatial smoothing can shift the peak statistical significance away from the peak of the contrast and towards regions of lower variance. These problems have previously been identified in positron emission tomography (PET) (Reimold et al., 2006) and voxel-based morphometry (VBM) (Acosta-Cabronero et al., 2008), but can also appear in functional magnetic resonance imaging (fMRI) studies. Additionally, for source-reconstructed magneto- and electro-encephalography (M/EEG), the problems are particularly severe because sparsity-favouring priors constrain meaningfully large signal and variance to a small set of compactly supported regions within the brain. Acosta-Cabronero et al. (2008) suggested adding noise to background voxels (the 'haircut'), effectively increasing their noise variance, but at the cost of contaminating neighbouring regions with the added noise once smoothed. Following theory and simulations, we propose to modify--directly and solely--the noise variance estimate, and investigate this solution on real imaging data from a range of modalities. Copyright © 2011 Elsevier Inc. All rights reserved.
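
    The basic problem — a contrast divided by a near-zero standard error producing artefactually large statistics — and the flavour of a variance-only remedy can be reproduced with a toy voxel-wise GLM. The variance floor used below is an arbitrary illustrative choice, not the estimator proposed in the paper.

        # Toy illustration of low-variance voxels inflating t-values, and of damping
        # them by regularising only the variance estimate (illustrative floor only).
        import numpy as np

        rng = np.random.default_rng(0)
        n_scans, n_vox = 40, 1000
        signal = np.zeros(n_vox)
        signal[:50] = 0.5             # truly active voxels
        signal[900:] = 0.01           # tiny signal leaked into background (e.g. by smoothing)
        noise_sd = np.full(n_vox, 1.0)
        noise_sd[900:] = 1e-3         # near-noiseless "background" voxels
        X = rng.normal(size=(n_scans, 1))                  # single regressor
        Y = X @ signal[None, :] + rng.normal(size=(n_scans, n_vox)) * noise_sd

        beta, rss, *_ = np.linalg.lstsq(X, Y, rcond=None)
        sigma2 = rss / (n_scans - 1)                       # per-voxel residual variance
        t_naive = beta[0] / np.sqrt(sigma2 / (X ** 2).sum())

        sigma2_reg = np.maximum(sigma2, 0.1 * np.median(sigma2))   # illustrative variance floor
        t_reg = beta[0] / np.sqrt(sigma2_reg / (X ** 2).sum())
        print("max |t| in background: naive %.1f, regularised %.1f"
              % (np.abs(t_naive[900:]).max(), np.abs(t_reg[900:]).max()))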

  7. Land use/land cover mapping using multi-scale texture processing of high resolution data

    Science.gov (United States)

    Wong, S. N.; Sarker, M. L. R.

    2014-02-01

    Land use/land cover (LULC) maps are useful for many purposes, and remote sensing techniques have long been used for LULC mapping with different types of data and image processing techniques. In this research, high resolution satellite data from IKONOS were used to perform land use/land cover mapping in Johor Bahru city and adjacent areas (Malaysia). Spatial image processing was carried out using six texture algorithms (mean, variance, contrast, homogeneity, entropy, and GLDV angular second moment) with five different window sizes (from 3×3 to 11×11). Three different classifiers, i.e. the Maximum Likelihood Classifier (MLC), Artificial Neural Network (ANN) and Support Vector Machine (SVM), were used to classify the texture parameters of the spectral bands individually and of all bands together, using the same training and validation samples. Results indicated that texture parameters of all bands together generally showed a better performance (overall accuracy = 90.10%) for LULC mapping, whereas a single spectral band could only achieve an overall accuracy of 72.67%. This research also found an improvement of the overall accuracy (OA) using a single-texture multi-scale approach (OA = 89.10%) and a single-scale multi-texture approach (OA = 90.10%) compared with all original bands (OA = 84.02%), because of the complementary information from different bands and different texture algorithms. All three classifiers showed high accuracy when using the different texture approaches, but SVM generally showed higher accuracy (90.10%) compared to MLC (89.10%) and ANN (89.67%), especially for complex classes such as urban and road.

  8. Land use/land cover mapping using multi-scale texture processing of high resolution data

    International Nuclear Information System (INIS)

    Wong, S N; Sarker, M L R

    2014-01-01

    Land use/land cover (LULC) maps are useful for many purposes, and remote sensing techniques have long been used for LULC mapping with different types of data and image processing techniques. In this research, high resolution satellite data from IKONOS were used to perform land use/land cover mapping in Johor Bahru city and adjacent areas (Malaysia). Spatial image processing was carried out using six texture algorithms (mean, variance, contrast, homogeneity, entropy, and GLDV angular second moment) with five different window sizes (from 3×3 to 11×11). Three different classifiers, i.e. the Maximum Likelihood Classifier (MLC), Artificial Neural Network (ANN) and Support Vector Machine (SVM), were used to classify the texture parameters of the spectral bands individually and of all bands together, using the same training and validation samples. Results indicated that texture parameters of all bands together generally showed a better performance (overall accuracy = 90.10%) for LULC mapping, whereas a single spectral band could only achieve an overall accuracy of 72.67%. This research also found an improvement of the overall accuracy (OA) using a single-texture multi-scale approach (OA = 89.10%) and a single-scale multi-texture approach (OA = 90.10%) compared with all original bands (OA = 84.02%), because of the complementary information from different bands and different texture algorithms. All three classifiers showed high accuracy when using the different texture approaches, but SVM generally showed higher accuracy (90.10%) compared to MLC (89.10%) and ANN (89.67%), especially for complex classes such as urban and road
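
    A simplified stand-in for the texture-plus-spectral classification workflow is sketched below: local mean and variance computed over several window sizes replace the GLCM/GLDV measures, synthetic arrays replace the IKONOS bands, and scikit-learn's SVC is used as the SVM classifier.

        # Simplified sketch of multi-scale texture + spectral classification
        # (local mean/variance stand in for the GLCM measures used in the study).
        import numpy as np
        from scipy.ndimage import uniform_filter
        from sklearn.svm import SVC

        def texture_stack(band, windows=(3, 7, 11)):
            """Local mean and variance of one band for several window sizes."""
            feats = []
            for w in windows:
                mean = uniform_filter(band, size=w)
                var = uniform_filter(band**2, size=w) - mean**2
                feats += [mean, var]
            return np.stack(feats, axis=-1)

        rng = np.random.default_rng(1)
        bands = rng.random((4, 200, 200))                    # stand-in for 4 IKONOS bands
        labels = (bands[0] > 0.5).astype(int)                # toy "ground truth"

        features = np.concatenate(
            [np.moveaxis(bands, 0, -1)] + [texture_stack(b) for b in bands], axis=-1)
        X = features.reshape(-1, features.shape[-1])
        y = labels.ravel()

        train = rng.random(y.size) < 0.01                    # small training sample
        clf = SVC(kernel="rbf").fit(X[train], y[train])
        print("overall accuracy:", (clf.predict(X[~train]) == y[~train]).mean())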

  9. Multivariable Parametric Cost Model for Ground Optical Telescope Assembly

    Science.gov (United States)

    Stahl, H. Philip; Rowell, Ginger Holmes; Reese, Gayle; Byberg, Alicia

    2005-01-01

    A parametric cost model for ground-based telescopes is developed using multivariable statistical analysis of both engineering and performance parameters. While diameter continues to be the dominant cost driver, diffraction-limited wavelength is found to be a secondary driver. Other parameters such as radius of curvature are examined. The model includes an explicit factor for primary mirror segmentation and/or duplication (i.e., multi-telescope phased-array systems). Additionally, single-variable models based on aperture diameter are derived.
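
    Multivariable parametric cost models of this kind are commonly expressed as power laws and fitted by least squares in log space; the sketch below shows that generic procedure on invented data and does not reproduce the coefficients of the model described above.

        # Illustrative power-law cost model cost = a * D^b * lambda^c fitted in log
        # space (all numbers are invented; not the coefficients of the NASA model).
        import numpy as np

        rng = np.random.default_rng(2)
        D = rng.uniform(1.0, 10.0, 40)            # aperture diameter [m]
        lam = rng.uniform(0.5, 5.0, 40)           # diffraction-limited wavelength [um]
        cost = 2.0 * D**2.5 * lam**-0.4 * rng.lognormal(sigma=0.2, size=40)

        A = np.column_stack([np.ones_like(D), np.log(D), np.log(lam)])
        coef, *_ = np.linalg.lstsq(A, np.log(cost), rcond=None)
        a, b, c = np.exp(coef[0]), coef[1], coef[2]
        print(f"cost ~ {a:.2f} * D^{b:.2f} * lambda^{c:.2f}")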

  10. Vectoring of parallel synthetic jets: A parametric study

    Science.gov (United States)

    Berk, Tim; Gomit, Guillaume; Ganapathisubramani, Bharathram

    2016-11-01

    The vectoring of a pair of parallel synthetic jets can be described using five dimensionless parameters: the aspect ratio of the slots, the Strouhal number, the Reynolds number, the phase difference between the jets and the spacing between the slots. In the present study, the influence of the latter four on the vectoring behaviour of the jets is examined experimentally using particle image velocimetry. Time-averaged velocity maps are used to study the variations in vectoring behaviour for a parametric sweep of each of the four parameters independently. A topological map is constructed for the full four-dimensional parameter space. The vectoring behaviour is described both qualitatively and quantitatively. A vectoring mechanism is proposed, based on measured vortex positions. We acknowledge the financial support from the European Research Council (ERC Grant Agreement No. 277472).

  11. Optimized statistical parametric mapping procedure for NIRS data contaminated by motion artifacts : Neurometric analysis of body schema extension.

    Science.gov (United States)

    Suzuki, Satoshi

    2017-09-01

    This study investigated the spatial distribution of brain activity on body schema (BS) modification induced by natural body motion using two versions of a hand-tracing task. In Task 1, participants traced Japanese Hiragana characters using the right forefinger, requiring no BS expansion. In Task 2, participants performed the tracing task with a long stick, requiring BS expansion. Spatial distribution was analyzed using general linear model (GLM)-based statistical parametric mapping of near-infrared spectroscopy data contaminated with motion artifacts caused by the hand-tracing task. Three methods were utilized in series to counter the artifacts, and optimal conditions and modifications were investigated: a model-free method (Step 1), a convolution matrix method (Step 2), and a boxcar-function-based Gaussian convolution method (Step 3). The results revealed four methodological findings: (1) Deoxyhemoglobin was suitable for the GLM because both Akaike information criterion and the variance against the averaged hemodynamic response function were smaller than for other signals, (2) a high-pass filter with a cutoff frequency of .014 Hz was effective, (3) the hemodynamic response function computed from a Gaussian kernel function and its first- and second-derivative terms should be included in the GLM model, and (4) correction of non-autocorrelation and use of effective degrees of freedom were critical. Investigating z-maps computed according to these guidelines revealed that contiguous areas of BA7-BA40-BA21 in the right hemisphere became significantly activated ([Formula: see text], [Formula: see text], and [Formula: see text], respectively) during BS modification while performing the hand-tracing task.
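
    Guideline (3) — a hemodynamic response regressor built from a Gaussian kernel plus its first- and second-derivative terms, convolved with the task boxcar and fitted as a GLM — can be sketched as follows. The sampling rate, block timing, kernel width and noise are illustrative, and the autocorrelation/effective-degrees-of-freedom correction of guideline (4) is omitted.

        # Sketch of guideline (3): boxcar task regressor convolved with a Gaussian
        # kernel and its first/second temporal derivatives, fitted as a GLM
        # (sampling rate, task timing and kernel width are illustrative only).
        import numpy as np

        fs, n = 10.0, 3000                       # 10 Hz sampling, 300 s of data
        t = np.arange(n) / fs
        boxcar = ((t % 60) < 30).astype(float)   # 30 s task / 30 s rest blocks

        tau = np.arange(0, 15, 1 / fs)           # kernel support, 15 s
        g = np.exp(-0.5 * ((tau - 5.0) / 2.0) ** 2)
        kernels = [g, np.gradient(g), np.gradient(np.gradient(g))]

        X = np.column_stack(
            [np.convolve(boxcar, k)[:n] for k in kernels] + [np.ones(n)])

        rng = np.random.default_rng(3)
        deoxy = -0.02 * X[:, 0] + rng.normal(scale=0.5, size=n)   # synthetic deoxy-Hb

        beta, rss, *_ = np.linalg.lstsq(X, deoxy, rcond=None)
        dof = n - np.linalg.matrix_rank(X)
        se = np.sqrt(rss[0] / dof * np.linalg.inv(X.T @ X)[0, 0])
        print("t-value of HRF regressor:", beta[0] / se)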

  12. Statistical parametric mapping of the regional distribution and ontogenetic scaling of foot pressures during walking in Asian elephants (Elephas maximus).

    Science.gov (United States)

    Panagiotopoulou, Olga; Pataky, Todd C; Hill, Zoe; Hutchinson, John R

    2012-05-01

    Foot pressure distributions during locomotion have causal links with the anatomical and structural configurations of the foot tissues and the mechanics of locomotion. Elephant feet have five toes bound in a flexible pad of fibrous tissue (digital cushion). Does this specialized foot design control peak foot pressures in such giant animals? And how does body size, such as during ontogenetic growth, influence foot pressures? We addressed these questions by studying foot pressure distributions in elephant feet and their correlation with body mass and centre of pressure trajectories, using statistical parametric mapping (SPM), a neuro-imaging technology. Our results show a positive correlation between body mass and peak pressures, with the highest pressures dominated by the distal ends of the lateral toes (digits 3, 4 and 5). We also demonstrate that pressure reduction in the elephant digital cushion is a complex interaction of its viscoelastic tissue structure and its centre of pressure trajectories, because there is a tendency to avoid rear 'heel' contact as an elephant grows. Using SPM, we present a complete map of pressure distributions in elephant feet during ontogeny by performing statistical analysis at the pixel level across the entire plantar/palmar surface. We hope that our study will build confidence in the potential clinical and scaling applications of mammalian foot pressures, given our findings in support of a link between regional peak pressures and pathogenesis in elephant feet.

  13. Desktop mapping using GPS. SAHTI - a software package for environmental monitoring. Report on task JNTB898 on the Finnish support programme to IAEA safeguards

    Energy Technology Data Exchange (ETDEWEB)

    Ilander, T; Kansanaho, A; Toivonen, H

    1996-02-01

    Environmental sampling is the key method of the IAEA in searching for signatures of a covert nuclear programme. However, it is not always easy to know the exact location of the sampling site. The satellite navigation system, utilizing a small receiver (GPS) and a PC, makes independent positioning data easily available. The present task under the Finnish Support Programme was launched to create software that merges sampling and positioning information. The system is built on top of a desktop mapping software package. However, the result of the development goes beyond the initial goal: the software can be used for real-time positioning in a mobile unit, utilizing maps that can be purchased or produced by the user. In addition, the system can easily be extended to visualize data in real time from mobile environmental monitors, such as a Geiger counter, a pressurized ionisation chamber or a gamma-ray spectrometer. (orig.) (7 figs.).

  14. Desktop mapping using GPS. SAHTI - a software package for environmental monitoring. Report on task JNTB898 on the Finnish support programme to IAEA safeguards

    International Nuclear Information System (INIS)

    Ilander, T.; Kansanaho, A.; Toivonen, H.

    1996-02-01

    Environmental sampling is the key method of the IAEA in searching for signatures of a covert nuclear programme. However, it is not always easy to know the exact location of the sampling site. The satellite navigation system, utilizing a small receiver (GPS) and a PC, makes independent positioning data easily available. The present task under the Finnish Support Programme was launched to create software that merges sampling and positioning information. The system is built on top of a desktop mapping software package. However, the result of the development goes beyond the initial goal: the software can be used for real-time positioning in a mobile unit, utilizing maps that can be purchased or produced by the user. In addition, the system can easily be extended to visualize data in real time from mobile environmental monitors, such as a Geiger counter, a pressurized ionisation chamber or a gamma-ray spectrometer. (orig.) (7 figs.)

  15. Parametric modelling design applied to weft knitted surfaces and its effects in their physical properties

    Science.gov (United States)

    Oliveira, N. P.; Maciel, L.; Catarino, A. P.; Rocha, A. M.

    2017-10-01

    This work proposes the creation of models of surfaces using a parametric computer modelling software to obtain three-dimensional structures in weft knitted fabrics produced on single needle system machines. Digital prototyping, another feature of digital modelling software, was also explored in three-dimensional drawings generated using the Rhinoceros software. With this approach, different 3D structures were developed and produced. Physical characterization tests were then performed on the resulting 3D weft knitted structures to assess their ability to promote comfort. From the obtained results, it is apparent that the developed structures have potential for application in different market segments, such as clothing and interior textiles.

  16. Colour segmentation of multi variants tuberculosis sputum images using self organizing map

    Science.gov (United States)

    Rulaningtyas, Riries; Suksmono, Andriyan B.; Mengko, Tati L. R.; Saptawati, Putri

    2017-05-01

    Lung tuberculosis is still detected from Ziehl-Neelsen sputum smear images in low- and middle-income countries. Clinicians grade the disease by manually counting the number of tuberculosis bacilli. This is very tedious for clinicians given the large number of patients and the lack of a standardized sputum staining procedure. Because staining is not standardized, the tuberculosis sputum images show highly variable colour characteristics; the colours vary widely and are difficult to identify. To help clinicians, this research examined the Self Organizing Map method for colour-based segmentation of sputum images through colour clustering. This method performed better than k-means clustering, which was also tried in this research. The Self Organizing Map segmented the sputum images with good results and clustered the colours adaptively.
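
    A minimal self-organizing map over RGB pixel values, written directly in NumPy, illustrates the colour-clustering idea; the grid size, learning-rate schedule and the random stand-in image are arbitrary choices, not the settings used in the study.

        # Minimal self-organizing map over RGB pixel values for colour segmentation
        # (grid size, learning rate and schedule are illustrative choices only).
        import numpy as np

        def train_som(pixels, grid=(3, 3), iters=5000, lr0=0.5, sigma0=1.5, seed=0):
            rng = np.random.default_rng(seed)
            rows, cols = grid
            w = rng.random((rows * cols, pixels.shape[1]))        # node weights
            yx = np.array([(r, c) for r in range(rows) for c in range(cols)], float)
            for i in range(iters):
                x = pixels[rng.integers(len(pixels))]
                bmu = np.argmin(((w - x) ** 2).sum(axis=1))        # best-matching unit
                frac = i / iters
                lr, sigma = lr0 * (1 - frac), sigma0 * (1 - frac) + 0.5
                h = np.exp(-((yx - yx[bmu]) ** 2).sum(axis=1) / (2 * sigma ** 2))
                w += lr * h[:, None] * (x - w)                     # pull neighbours
            return w

        def segment(image, w):
            flat = image.reshape(-1, 3)
            labels = ((flat[:, None, :] - w[None]) ** 2).sum(axis=2).argmin(axis=1)
            return labels.reshape(image.shape[:2])

        rng = np.random.default_rng(4)
        img = rng.random((64, 64, 3))                              # stand-in sputum image
        weights = train_som(img.reshape(-1, 3))
        print("cluster counts:", np.bincount(segment(img, weights).ravel()))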

  17. Arguing for a multi-hazard mapping program in Newfoundland and Labrador, Canada

    Science.gov (United States)

    Batterson, Martin; Neil, Stapleton

    2010-05-01

    This poster describes efforts to implement a Provincial multi-hazard mapping program, and will explore the challenges associated with this process. Newfoundland and Labrador is on the eastern edge of North America, has a large land area (405,212 km2) and a small population (510,000; 2009 estimate). The province currently has no legislative framework to control development in hazardous areas, but recent landslides in the communities of Daniel's Harbour and Trout River, both of which forced the relocation of residents, emphasize the need for action. There are two factors which confirm the need for a natural hazard mapping program: the documented history of natural disasters, and the future potential impacts of climate change. Despite being relatively far removed from the impacts of earthquake and volcanic activity, Newfoundland and Labrador has a long history of natural disasters. Rockfall, landslide, avalanche and flood events have killed at least 176 people over the past 225 years, many in their own homes. Some of the fatalities resulted from the adjacency of homes to places of employment, and of communities and roads to steep slopes. Others were likely the result of chance, and were thus unavoidable. Still others were the result of poor planning, albeit unwitting. Increasingly however, aesthetics have replaced pragmatism as a selection criterion for housing developments, with residential construction being contemplated for many coastal areas. The issue is exacerbated by the impacts of climate change, which while not a universal bane for the Province, will likely result in rising sea level and enhanced coastal erosion. Much of the Province's coastline is receding at up to 30 cm (and locally higher) per year. Sea level is anticipated to rise by 70cm to over 100 cm by 2099, based on IPCC predictions, plus the effects of enhanced ice sheet melting, plus (or minus) continued local isostatic adjustment. The history of geological disasters, coupled with pressures on

  18. Multi-criteria decision analysis methods and software

    CERN Document Server

    Ishizaka, Alessio

    2013-01-01

    This book presents an introduction to MCDA followed by more detailed chapters about each of the leading methods used in this field. Comparison of methods and software is also featured to enable readers to choose the most appropriate method needed in their research. Worked examples as well as the software featured in the book are available on an accompanying website.

  19. A physiology-based parametric imaging method for FDG-PET data

    Science.gov (United States)

    Scussolini, Mara; Garbarino, Sara; Sambuceti, Gianmario; Caviglia, Giacomo; Piana, Michele

    2017-12-01

    Parametric imaging is a compartmental approach that processes nuclear imaging data to estimate the spatial distribution of the kinetic parameters governing tracer flow. The present paper proposes a novel and efficient computational method for parametric imaging which is potentially applicable to several compartmental models of diverse complexity and which is effective in the determination of the parametric maps of all kinetic coefficients. We consider applications to [18 F]-fluorodeoxyglucose positron emission tomography (FDG-PET) data and analyze the two-compartment catenary model describing the standard FDG metabolization by an homogeneous tissue and the three-compartment non-catenary model representing the renal physiology. We show uniqueness theorems for both models. The proposed imaging method starts from the reconstructed FDG-PET images of tracer concentration and preliminarily applies image processing algorithms for noise reduction and image segmentation. The optimization procedure solves pixel-wise the non-linear inverse problem of determining the kinetic parameters from dynamic concentration data through a regularized Gauss-Newton iterative algorithm. The reliability of the method is validated against synthetic data, for the two-compartment system, and experimental real data of murine models, for the renal three-compartment system.
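
    The pixel-wise fitting step for the irreversible two-compartment FDG model can be sketched with a generic nonlinear least-squares call; this stands in for, and is not, the regularised Gauss-Newton scheme of the paper, and the input function, noise level and rate constants are synthetic.

        # Sketch of the pixel-wise kinetic fitting step for an irreversible
        # two-compartment FDG model with a known arterial input function; a generic
        # nonlinear least-squares call stands in for the paper's regularised
        # Gauss-Newton scheme, and all numbers are synthetic.
        import numpy as np
        from scipy.optimize import least_squares

        dt = 0.1
        t = np.arange(0, 60, dt)                              # minutes
        Cp = 10 * t * np.exp(-t / 2.0)                        # synthetic input function

        def tissue_curve(params):
            K1, k2, k3 = params
            h = K1 / (k2 + k3) * (k3 + k2 * np.exp(-(k2 + k3) * t))
            return np.convolve(h, Cp)[:t.size] * dt           # CT = h (*) Cp

        true = (0.1, 0.15, 0.05)
        rng = np.random.default_rng(5)
        measured = tissue_curve(true) + rng.normal(scale=0.05, size=t.size)

        fit = least_squares(lambda p: tissue_curve(p) - measured,
                            x0=(0.05, 0.1, 0.1), bounds=(1e-4, 2.0))
        K1, k2, k3 = fit.x
        print("fitted K1, k2, k3:", K1, k2, k3, " Ki =", K1 * k3 / (k2 + k3))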

  20. CONSTRUCTION THE BRIDGE PIER AS PARAMETRIC OBJECT USING AUTODESK REVIT

    Directory of Open Access Journals (Sweden)

    K. I. Hladka

    2017-08-01

    Full Text Available Purpose. The work is aimed to solve the following tasks: 1 to investigate the possibilities of Autodesk Revit to create parametric objects; 2 to create an information model of the bridge pier with the possibility of changing the model size without changing geometry of the object; 3 to trace the complexity and feasibility of using parametric models when designing the elements of bridges. Methodology. The studies were carried out using spatial modeling in the Autodesk Revit system. The ratio of the parameters of the object was set, the relationship between individual geometric elements was determined and the changes that were made to the model with the change of the specified parameters were checked. Findings. Support model of two types has been created: for railway bridges and for road bridges. Both types of models change the dimensions and the number of constituent elements in accordance with the entered parameters. The performed work confirms the possibility of creating information parametric models of complex form and the expediency of using them in the design of bridges and not only. Originality. Creation of information models is a modern and relevant topic. But both in the literature and in Internet resources, parametrization is considered on the example of simple objects. The model proposed in the article consists of several dependent geometric bodies; therefore, it opens the topic of objects parameterization more fully and in detail, in comparison with the existing sources. As for the creation of parametric models of the bridge elements - such information is not found in the literature, that is, it is proposed for the first time. Practical value. Parametrization of spatial models allows significantly to accelerate and simplify the process of designing any objects due to the use of typical parametric models in many projects. Especially it concerns the design of bridges, since the standard elements for them, such as support or span are not

  1. Parametric Resonance in Dynamical Systems

    CERN Document Server

    Nijmeijer, Henk

    2012-01-01

    Parametric Resonance in Dynamical Systems discusses the phenomenon of parametric resonance and its occurrence in mechanical systems,vehicles, motorcycles, aircraft and marine craft, and micro-electro-mechanical systems. The contributors provide an introduction to the root causes of this phenomenon and its mathematical equivalent, the Mathieu-Hill equation. Also included is a discussion of how parametric resonance occurs on ships and offshore systems and its frequency in mechanical and electrical systems. This book also: Presents the theory and principles behind parametric resonance Provides a unique collection of the different fields where parametric resonance appears including ships and offshore structures, automotive vehicles and mechanical systems Discusses ways to combat, cope with and prevent parametric resonance including passive design measures and active control methods Parametric Resonance in Dynamical Systems is ideal for researchers and mechanical engineers working in application fields such as MEM...

  2. Multi-Collinearity Based Model Selection for Landslide Susceptibility Mapping: A Case Study from Ulus District of Karabuk, Turkey

    Science.gov (United States)

    Sahin, E. K.; Colkesen, I., , Dr; Kavzoglu, T.

    2017-12-01

    Identification of localities prone to landslides plays an important role in emergency planning, disaster management and recovery planning. Because of its great importance for disaster management, producing accurate and up-to-date landslide susceptibility maps is essential for hazard mitigation and regional planning. The main objective of the present study was to apply a multi-collinearity-based model selection approach for the production of a landslide susceptibility map of the Ulus district of Karabuk, Turkey. When the factors are highly correlated with each other, the data do not contain enough information to describe the problem under consideration; in such cases, choosing a subset of the original features will often lead to better performance. This paper presents a multi-collinearity-based model selection approach to deal with the high correlation within the dataset. Two collinearity diagnostics, the Tolerance (TOL) and the Variance Inflation Factor (VIF), are commonly used to identify multi-collinearity. Values of VIF that exceed 10.0 and TOL values less than 1.0 are often regarded as indicating multi-collinearity. Five causative factors (slope length, curvature, plan curvature, profile curvature and topographical roughness index) were found to be highly correlated with each other among the 15 factors available for the study area. As a result, the five correlated factors were removed from the model estimation, and the performances of models including the remaining 10 factors (aspect, drainage density, elevation, lithology, land use/land cover, NDVI, slope, sediment transport index, topographical position index and topographical wetness index) were evaluated using logistic regression. The performance of the prediction model constructed with 10 factors was compared to that of the 15-factor model. The prediction performance of the two susceptibility maps was evaluated by overall accuracy and the area under the ROC curve (AUC) values. Results showed that overall
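
    The VIF screening step described above is easy to reproduce: regress each candidate factor on the others, convert the resulting R² into a variance inflation factor, and drop factors whose VIF exceeds 10. The sketch below uses synthetic factors in place of the terrain attributes.

        # Sketch of the VIF-based screening described above: regress each factor on
        # the others and drop factors whose VIF exceeds 10 (synthetic factors here).
        import numpy as np

        def vif(X):
            """Variance inflation factor of each column of X."""
            out = []
            for j in range(X.shape[1]):
                y, others = X[:, j], np.delete(X, j, axis=1)
                A = np.column_stack([np.ones(len(y)), others])
                beta, *_ = np.linalg.lstsq(A, y, rcond=None)
                resid = y - A @ beta
                r2 = 1.0 - resid.var() / y.var()
                out.append(1.0 / max(1.0 - r2, 1e-12))
            return np.array(out)

        rng = np.random.default_rng(6)
        slope = rng.normal(size=500)
        factors = np.column_stack([
            slope,
            slope * 0.95 + rng.normal(scale=0.1, size=500),   # nearly collinear factor
            rng.normal(size=500),                             # independent factor
        ])
        v = vif(factors)
        keep = v < 10.0
        print("VIF:", np.round(v, 1), "-> keep columns", np.where(keep)[0])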

  3. A multi-GPU real-time dose simulation software framework for lung radiotherapy.

    Science.gov (United States)

    Santhanam, A P; Min, Y; Neelakkantan, H; Papp, N; Meeks, S L; Kupelian, P A

    2012-09-01

    Medical simulation frameworks facilitate both the preoperative and postoperative analysis of the patient's pathophysical condition. Of particular importance is the simulation of radiation dose delivery for real-time radiotherapy monitoring and retrospective analyses of the patient's treatment. In this paper, a software framework tailored for the development of simulation-based real-time radiation dose monitoring medical applications is discussed. A multi-GPU-based computational framework coupled with inter-process communication methods is introduced for simulating the radiation dose delivery on a deformable 3D volumetric lung model and its real-time visualization. The model deformation and the corresponding dose calculation are allocated among the GPUs in a task-specific manner and performed in a pipelined fashion. Radiation dose calculations are computed on two different GPU hardware architectures. The integration of this computational framework with a front-end software layer and back-end patient database repository is also discussed. Real-time simulation of the dose delivered is achieved once every 120 ms using the proposed framework. With a linear increase in the number of GPU cores, the computational time of the simulation decreased linearly. The inter-process communication time also improved with an increase in the hardware memory. Variations in the delivered dose and computational speedup for variations in the data dimensions are investigated using D70 and D90 as well as gEUD as metrics for a set of 14 patients. Computational speed-up increased with an increase in the beam dimensions when compared with CPU-based commercial software, while the error in the dose calculation remained small; the proposed multi-GPU deformable lung model-based framework is thus an effective tool for performing both real-time and retrospective radiotherapy analyses.

  4. Parametric optimization in virtual prototyping environment of the control device for a robotic system used in thin layers deposition

    Science.gov (United States)

    Enescu (Balaş), M. L.; Alexandru, C.

    2016-08-01

    The paper deals with the optimal design of the control system for a 6-DOF robot used in thin layers deposition. The optimization is based on a parametric technique, by modelling the design objective as a numerical function and then establishing the optimal values of the design variables so as to minimize the objective function. The robotic system is a mechatronic product, which integrates the mechanical device and the controlled operating device. The mechanical device of the robot was designed in the CAD (Computer Aided Design) software CATIA, the 3D model being then transferred to the MBS (Multi-Body Systems) environment ADAMS/View. The control system was developed in the concurrent engineering concept, through integration with the MBS mechanical model, by using the DFC (Design for Control) software solution EASY5. The necessary angular motions in the six joints of the robot, required to obtain the imposed trajectory of the end-effector, have been established by performing the inverse kinematic analysis. The positioning error in each joint of the robot is used as the design objective, the optimization goal being to minimize its root mean square during simulation, which is a measure of the magnitude of the varying positioning error.

  5. Interim report on the development and application of environmental mapped data digitization, encoding, analysis, and display software for the ALICE system. Volume II. [MAP, CHAIN, FIX, and DOUT, in FORTRAN IV for PDP-10

    Energy Technology Data Exchange (ETDEWEB)

    Amiot, L.W.; Lima, R.J.; Scholbrock, S.D.; Shelman, C.B.; Wehman, R.H.

    1979-06-01

    Volume I of An Interim Report on the Development and Application of Environmental Mapped Data Digitization, Encoding, Analysis, and Display Software for the ALICE System provided an overall description of the software developed for the ALICE System and presented an example of its application. The scope of the information presented in Volume I was directed both to the users and developers of digitization, encoding, analysis, and display software. Volume II presents information which is directly related to the actual computer code and operational characteristics (keys and subroutines) of the software. Volume II will be of more interest to developers of software than to users of the software. However, developers of software should be aware that the code developed for the ALICE System operates in an environment where much of the peripheral hardware to the PDP-10 is ANL/AMD built. For this reason, portions of the code may have to be modified for implementation on other computer system configurations. 11 tables.

  6. Simulation of parametric model towards the fixed covariate of right censored lung cancer data

    Science.gov (United States)

    Afiqah Muhamad Jamil, Siti; Asrul Affendi Abdullah, M.; Kek, Sie Long; Ridwan Olaniran, Oyebayo; Enera Amran, Syahila

    2017-09-01

    In this study, a simulation procedure was applied to measure the fixed covariate of right-censored data using a parametric survival model. The scale and shape parameters were varied to differentiate the analyses of the parametric regression survival model. Statistically, the biases, mean biases and the coverage probability were used in this analysis. Different sample sizes (50, 100, 150 and 200) were employed to distinguish the impact of the parametric regression model on right-censored data. The R statistical software was used to code the simulation of right-censored data. In addition, the final model of the right-censored simulation was compared with right-censored lung cancer data from Malaysia. It was found that varying the shape and scale parameters together with the sample size helps to improve the simulation strategy for right-censored data, and that the Weibull regression survival model is a suitable fit for the survival data of lung cancer patients in Malaysia.
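
    One replicate of such a simulation — generating Weibull event times, imposing random right censoring, and recovering the shape and scale parameters by maximum likelihood — might look as follows; all parameter values and the censoring scheme are illustrative assumptions.

        # Sketch of one simulation replicate: generate Weibull event times, impose
        # right censoring, and fit shape/scale by maximum likelihood (values invented).
        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(7)
        n, shape_true, scale_true = 150, 1.5, 12.0
        event = scale_true * rng.weibull(shape_true, size=n)   # true survival times
        censor = rng.uniform(5.0, 25.0, size=n)                # censoring times
        time = np.minimum(event, censor)
        observed = (event <= censor).astype(float)             # 1 = event, 0 = censored

        def negloglik(params):
            shape, scale = np.exp(params)                      # keep parameters positive
            z = time / scale
            logf = np.log(shape / scale) + (shape - 1) * np.log(z) - z**shape
            logS = -z**shape
            return -(observed * logf + (1 - observed) * logS).sum()

        fit = minimize(negloglik, x0=np.log([1.0, 10.0]), method="Nelder-Mead")
        print("estimated shape, scale:", np.exp(fit.x), " bias:",
              np.exp(fit.x) - [shape_true, scale_true])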

  7. VIBA-LAB2: a virtual ion beam analysis laboratory software package incorporating elemental map simulations

    International Nuclear Information System (INIS)

    Zhou, S.J.; Orlic, I.; Sanchez, J.L.; Watt, F.

    1999-01-01

    The software package VIBA-lab1, which incorporates PIXE and RBS energy spectra simulation, has now been extended to include the simulation of elemental maps from 3D structures. VIBA-lab1 allows the user to define a wide variety of experimental parameters, e.g. the energy and species of the incident ions, the excitation and detection geometry, etc. When the relevant experimental parameters as well as the target composition are defined, the program can then simulate the corresponding PIXE and RBS spectra. VIBA-LAB2 has been written with applications in nuclear microscopy in mind. A set of drag-and-drop tools has been incorporated to allow the user to define a three-dimensional sample object of mixed elemental composition. PIXE energy spectra simulations are then carried out on a pixel-by-pixel basis and the corresponding intensity distributions or elemental maps can be computed. Several simulated intensity distributions for some 3D objects are demonstrated, and simulations obtained from a simple IC are compared with experimental results

  8. Underwater Multi-Vehicle Trajectory Alignment and Mapping Using Acoustic and Optical Constraints

    Directory of Open Access Journals (Sweden)

    Ricard Campos

    2016-03-01

    Full Text Available Multi-robot formations are an important advance in recent robotic developments, as they allow a group of robots to merge their capacities and perform surveys in a more convenient way. With the aim of keeping the costs and acoustic communications to a minimum, cooperative navigation of multiple underwater vehicles is usually performed at the control level. In order to maintain the desired formation, individual robots just react to simple control directives extracted from range measurements or ultra-short baseline (USBL systems. Thus, the robots are unaware of their global positioning, which presents a problem for the further processing of the collected data. The aim of this paper is two-fold. First, we present a global alignment method to correct the dead reckoning trajectories of multiple vehicles to resemble the paths followed during the mission using the acoustic messages passed between vehicles. Second, we focus on the optical mapping application of these types of formations and extend the optimization framework to allow for multi-vehicle geo-referenced optical 3D mapping using monocular cameras. The inclusion of optical constraints is not performed using the common bundle adjustment techniques, but in a form improving the computational efficiency of the resulting optimization problem and presenting a generic process to fuse optical reconstructions with navigation data. We show the performance of the proposed method on real datasets collected within the Morph EU-FP7 project.

  9. Uncertainty assessment of PM2.5 contamination mapping using spatiotemporal sequential indicator simulations and multi-temporal monitoring data

    Science.gov (United States)

    Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang

    2016-04-01

    Because of the rapid economic growth in China, many regions are subjected to severe particulate matter pollution. Thus, improving the methods of determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China were used as the experimental dataset. Based on the number of STSIS procedures, we assessed various types of mapping uncertainties, including single-location uncertainties over one day and multiple days and multi-location uncertainties over one day and multiple days. A comparison of the STSIS technique with the SIS technique indicate that a better performance was obtained with the STSIS method.

  10. Hyperspectral Soil Mapper (HYSOMA) software interface: Review and future plans

    Science.gov (United States)

    Chabrillat, Sabine; Guillaso, Stephane; Eisele, Andreas; Rogass, Christian

    2014-05-01

    With the upcoming launch of the next generation of hyperspectral satellites that will routinely deliver high spectral resolution images for the entire globe (e.g. EnMAP, HISUI, HyspIRI, HypXIM, PRISMA), an increasing demand for the availability/accessibility of hyperspectral soil products is coming from the geoscience community. Indeed, many robust methods for the prediction of soil properties based on imaging spectroscopy already exist and have been successfully used for a wide range of soil mapping airborne applications. Nevertheless, these methods require expert know-how and fine-tuning, which makes them used sparingly. More developments are needed toward easy-to-access soil toolboxes as a major step toward the operational use of hyperspectral soil products for Earth's surface processes monitoring and modelling, to allow non-experienced users to obtain new information based on non-expensive software packages where repeatability of the results is an important prerequisite. In this frame, based on the EU-FP7 EUFAR (European Facility for Airborne Research) project and EnMAP satellite science program, higher performing soil algorithms were developed at the GFZ German Research Center for Geosciences as demonstrators for end-to-end processing chains with harmonized quality measures. The algorithms were built-in into the HYSOMA (Hyperspectral SOil MApper) software interface, providing an experimental platform for soil mapping applications of hyperspectral imagery that gives the choice of multiple algorithms for each soil parameter. The software interface focuses on fully automatic generation of semi-quantitative soil maps such as soil moisture, soil organic matter, iron oxide, clay content, and carbonate content. Additionally, a field calibration option calculates fully quantitative soil maps provided ground truth soil data are available. Implemented soil algorithms have been tested and validated using extensive in-situ ground truth data sets. The source of the HYSOMA

  11. MulRF: a software package for phylogenetic analysis using multi-copy gene trees.

    Science.gov (United States)

    Chaudhary, Ruchi; Fernández-Baca, David; Burleigh, John Gordon

    2015-02-01

    MulRF is a platform-independent software package for phylogenetic analysis using multi-copy gene trees. It seeks the species tree that minimizes the Robinson-Foulds (RF) distance to the input trees using a generalization of the RF distance to multi-labeled trees. The underlying generic tree distance measure and fast running time make MulRF useful for inferring phylogenies from large collections of gene trees, in which multiple evolutionary processes as well as phylogenetic error may contribute to gene tree discord. MulRF implements several features for customizing the species tree search and assessing the results, and it provides a user-friendly graphical user interface (GUI) with tree visualization. The species tree search is implemented in C++ and the GUI in Java Swing. MulRF's executable as well as sample datasets and manual are available at http://genome.cs.iastate.edu/CBL/MulRF/, and the source code is available at https://github.com/ruchiherself/MulRFRepo. ruchic@ufl.edu Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  12. Joint reconstruction of dynamic PET activity and kinetic parametric images using total variation constrained dictionary sparse coding

    Science.gov (United States)

    Yu, Haiqing; Chen, Shuhang; Chen, Yunmei; Liu, Huafeng

    2017-05-01

    Dynamic positron emission tomography (PET) is capable of providing both spatial and temporal information of radio tracers in vivo. In this paper, we present a novel joint estimation framework to reconstruct temporal sequences of dynamic PET images and the coefficients characterizing the system impulse response function, from which the associated parametric images of the system macro parameters for tracer kinetics can be estimated. The proposed algorithm, which combines statistical data measurement and tracer kinetic models, integrates a dictionary sparse coding (DSC) into a total variational minimization based algorithm for simultaneous reconstruction of the activity distribution and parametric map from measured emission sinograms. DSC, based on the compartmental theory, provides biologically meaningful regularization, and total variation regularization is incorporated to provide edge-preserving guidance. We rely on techniques from minimization algorithms (the alternating direction method of multipliers) to first generate the estimated activity distributions with sub-optimal kinetic parameter estimates, and then recover the parametric maps given these activity estimates. These coupled iterative steps are repeated as necessary until convergence. Experiments with synthetic, Monte Carlo generated data, and real patient data have been conducted, and the results are very promising.

  13. Parametric response mapping of dynamic CT for predicting intrahepatic recurrence of hepatocellular carcinoma after conventional transcatheter arterial chemoembolization

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Seung Joon; Kim, Hyung Sik [Gachon University Gil Hospital, Department of Radiology, Incheon (Korea, Republic of); Kim, Jonghoon [Sungkyunkwan University, Department of Electronic Electrical and Computer Engineering, Suwon (Korea, Republic of); Seo, Jongbum [Yonsei University, Department of Biomedical Engineering, Wonju (Korea, Republic of); Lee, Jong-min [Hanyang University, Department of Biomedical Engineering, Seoul (Korea, Republic of); Park, Hyunjin [Sungkyunwkan University, School of Electronic and Electrical Engineering, Suwon (Korea, Republic of)

    2016-01-15

    The aim of our study was to determine the diagnostic value of a novel image analysis method called parametric response mapping (PRM) for prediction of intrahepatic recurrence of hepatocellular carcinoma (HCC) treated with conventional transcatheter arterial chemoembolization (TACE). This retrospective study was approved by the IRB. We recruited 55 HCC patients who achieved complete remission (CR) after TACE and received longitudinal multiphasic liver computed tomography (CT). The patients fell into two groups: the recurrent tumour group (n = 29) and the non-recurrent tumour group (n = 26). We applied the PRM analysis to see if this technique could distinguish between the two groups. The results of the PRM analysis were incorporated into a prediction algorithm. We retrospectively removed data from the last time point and attempted to predict the response to therapy of the removed data. The PRM analysis was able to distinguish between the non-recurrent and recurrent groups successfully. The prediction algorithm detected response to therapy with an area under the curve (AUC) of 0.76, while the manual approach had AUC 0.64. Adopting PRM analysis can potentially distinguish between recurrent and non-recurrent HCCs and allow for prediction of response to therapy after TACE. (orig.)
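
    The generic parametric-response-map step — classifying each co-registered voxel as increased, decreased or unchanged according to whether its change exceeds a threshold — can be written in a few lines; the threshold and data below are placeholders, not the values used for the dynamic CT maps in this study.

        # Generic parametric-response-map classification of co-registered baseline and
        # follow-up maps into increased / decreased / unchanged voxels (the threshold
        # value is a placeholder, not the one used in the study).
        import numpy as np

        def prm_fractions(baseline, followup, threshold):
            diff = followup - baseline
            increased = diff > threshold
            decreased = diff < -threshold
            unchanged = ~(increased | decreased)
            n = diff.size
            return increased.sum() / n, decreased.sum() / n, unchanged.sum() / n

        rng = np.random.default_rng(8)
        base = rng.normal(100.0, 10.0, size=(64, 64, 20))      # e.g. baseline HU map
        follow = base + rng.normal(0.0, 5.0, size=base.shape)
        follow[:20] += 15.0                                    # a region that "responded"
        print("fractions (up, down, stable):", prm_fractions(base, follow, 10.0))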

  14. Parametric imaging of the rate constant Ki using 18Fluoro-L-dopa positron emission tomography in progressive supranuclear palsy

    Energy Technology Data Exchange (ETDEWEB)

    Cordes, M. (Neurodegenerative Disorders Centre, Univ. Hospital, Univ. of British Columbia, Vancouver, BC (Canada) Strahlenklinik und Poliklinik, Universitaetsklinikum Rudolf-Virchow, Berlin (Germany)); Snow, B.J. (Neurodegenerative Disorders Centre, Univ. Hospital, Univ. of British Columbia, Vancouver, BC (Canada)); Morrison, S. (TRIUMF, Univ. of British Columbia, Vancouver, BC (Canada)); Sossi, V. (TRIUMF, Univ. of British Columbia, Vancouver, BC (Canada)); Ruth, T.J. (TRIUMF, Univ. of British Columbia, Vancouver, BC (Canada)); Calne, D.B. (Neurodegenerative Disorders Centre, Univ. Hospital, Univ. of British Columbia, Vancouver, BC (Canada))

    1993-01-01

    Positron emission tomography (PET) studies using 18F-L-dopa were carried out in 9 patients with supranuclear palsy and 13 controls. For quantification of the PET data, a rate constant Ki was calculated for the radiotracer using a graphical method. Corrections for nonspecific activity were performed in both arterial plasma and brain tissue. The purpose of this study was to test the hypothesis that parametric images of the rate constant Ki can be obtained on a pixel-by-pixel basis using an appropriate mathematical algorithm. Ki values from these parametric images and from the graphical approach were compared. Both correlated closely, with y = 0.013 + 0.947*x, r = 0.992 and y = -0.052 + 1.048*x, r = 0.965 in patients and controls, respectively. Contrast measurements were also performed and showed a striking increase in contrast on the parametric images. Ki mapping offers several advantages over the graphical approach, since parametric images are time-independent, i.e. one image represents the quantitative result of the study. In addition, parametric images of the rate constant are normalized to arterial plasma radioactivity and corrected for tissue metabolites. Thus, parametric images of Ki in different individuals can be compared directly, without further processing, in order to assess nigrostriatal integrity. (orig.)
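
    A graphical (Patlak-type) estimate of Ki is typically the slope of normalised tissue activity against normalised integrated plasma activity over the late frames, and it can be computed on a pixel-by-pixel basis with one linear least-squares solve; the sketch below uses synthetic curves and omits the metabolite and nonspecific-activity corrections described above.

        # Generic pixel-by-pixel Patlak-type graphical estimate of the rate constant
        # Ki (slope of normalised tissue activity vs. normalised integrated plasma
        # activity over late frames); synthetic data, not the authors' pipeline.
        import numpy as np

        t = np.linspace(1, 90, 30)                      # frame mid-times [min]
        Cp = 50 * np.exp(-t / 20) + 5                   # synthetic plasma input
        intCp = np.cumsum(Cp) * (t[1] - t[0])           # running integral of Cp

        rng = np.random.default_rng(9)
        Ki_true = rng.uniform(0.002, 0.02, size=(32, 32))
        V0 = 0.05
        tissue = Ki_true[..., None] * intCp + V0 * Cp + rng.normal(0, 0.5, (32, 32, t.size))

        x = intCp / Cp                                  # Patlak abscissa
        y = tissue / Cp                                 # Patlak ordinate
        late = t > 30                                   # use only the late, linear portion
        A = np.column_stack([x[late], np.ones(late.sum())])
        coef, *_ = np.linalg.lstsq(A, y[..., late].reshape(-1, late.sum()).T, rcond=None)
        Ki_map = coef[0].reshape(32, 32)
        print("mean absolute Ki error:", np.abs(Ki_map - Ki_true).mean())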

  15. Software Tools for Battery Design | Transportation Research | NREL

    Science.gov (United States)

    Under the Computer-Aided Engineering for Electric Drive Vehicle Batteries (CAEBAT) project, NREL has developed software tools to help with battery design using CAEBAT software tools. Knowledge of the interplay of multi-physics at varied scales is imperative

  16. PERFORMANCE ENHANCEMENT OF A MINIATURE STIRLING CRYOCOOLER WITH A MULTI MESH REGENERATOR DESIGN

    Directory of Open Access Journals (Sweden)

    KISHOR KUMAR V. V.

    2017-06-01

    Full Text Available A parametric study has been carried out using the software REGEN 3.3 to optimize the regenerator of a miniature Stirling cryocooler operating with a warm end temperature of 300 K and a cold end temperature of 80 K. The regenerator design which produces the maximum coefficient of performance (COP) of the system is considered the optimized regenerator. The length and diameter of the regenerator were fixed by the cooler system requirements. Single-mesh regenerators made of 200, 250, 300, 400 and 450 stainless steel wire meshes were considered, and the optimum phase angle and mesh size were obtained. A maximum COP of 0.1475 was obtained for the 300 mesh regenerator at a 70° phase angle. Multi-mesh regenerators were then considered, with finer mesh at the cold end and coarser mesh at the hot end. The optimum size and length of each mesh in the multi-mesh regenerator and the optimum phase angle were calculated. The maximum COP of 0.156 was obtained for the 200-300-400 multi-mesh regenerator at a 70° phase angle. The COP and net refrigeration obtained for an optimized multi-mesh regenerator were found to be significantly higher than those of a single-mesh regenerator. Thus a multi-mesh regenerator design with a proper combination of mesh sizes and lengths can enhance the regenerator effectiveness.

  17. Cerebral Blood Flow Changes after Shunt in Hydrocephalus after Aneurysmal Subarachnoid Hemorrhage: Analysis by statistical Parametric Mapping

    International Nuclear Information System (INIS)

    Hyun, I. Y.; Choi, W. S.; Pak, H. S.

    2003-01-01

    The purpose of this study was to evaluate the changes of regional cerebral blood flow (rCBF) after shunt operation in patients with hydrocephalus after aneurysmal subarachnoid hemorrhage by statistical parametric mapping (SPM). Seven patients (4 male, mean age 54 years) with hydrocephalus after aneurysmal subarachnoid hemorrhage underwent a shunt operation. Tc-99m HMPAO SPECT was performed within 1 week before, and 2 weeks after, the shunt operation. All of the SPECT images were spatially transformed to standard space, smoothed, and globally normalized. After spatial and count normalization, the rCBF of pre- and post-shunting Tc-99m HMPAO SPECT was compared at every voxel using t statistics. Voxels with a P value of less than 0.001 were considered significantly different. The shunt operation was effective in all patients. Pre-shunting Tc-99m HMPAO SPECT showed hypoperfusion, predominantly in the periventricular area. After the shunt operation, the periventricular low perfusion disappeared. The results of this study show that periventricular CBF is impaired in hydrocephalus after aneurysmal subarachnoid hemorrhage. The significant increase of periventricular CBF after shunt operation suggests that the evaluation of periventricular CBF by SPM might be of value for the prediction of shunt effectiveness in hydrocephalus

  18. Cerebral Blood Flow Changes after Shunt in Hydrocephalus after Aneurysmal Subarachnoid Hemorrhage: Analysis by statistical Parametric Mapping

    Energy Technology Data Exchange (ETDEWEB)

    Hyun, I. Y.; Choi, W. S.; Pak, H. S. [College of Medicine, Univ. of Inhwa, Incheon (Korea, Republic of)

    2003-07-01

    The purpose of this study was to evaluate the changes of regional cerebral blood flow (rCBF) after shunt operation in patients with hydrocephalus after aneurysmal subarachnoid hemorrhage by statistical parametric mapping (SPM). Seven patients (4 male, mean age 54 years) with hydrocephalus after aneurysmal subarachnoid hemorrhage underwent a shunt operation. Tc-99m HMPAO SPECT was performed within 1 week before, and 2 weeks after, the shunt operation. All of the SPECT images were spatially transformed to standard space, smoothed, and globally normalized. After spatial and count normalization, the rCBF of pre- and post-shunting Tc-99m HMPAO SPECT was compared at every voxel using t statistics. Voxels with a P value of less than 0.001 were considered significantly different. The shunt operation was effective in all patients. Pre-shunting Tc-99m HMPAO SPECT showed hypoperfusion, predominantly in the periventricular area. After the shunt operation, the periventricular low perfusion disappeared. The results of this study show that periventricular CBF is impaired in hydrocephalus after aneurysmal subarachnoid hemorrhage. The significant increase of periventricular CBF after shunt operation suggests that the evaluation of periventricular CBF by SPM might be of value for the prediction of shunt effectiveness in hydrocephalus.
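
    After spatial and count normalisation, the voxel-wise comparison amounts to a paired t-test per voxel with a p < 0.001 threshold; the sketch below shows that generic computation on synthetic volumes and is not the SPM implementation.

        # Generic voxel-wise paired t-test between pre- and post-shunt normalised
        # SPECT volumes across subjects (synthetic data; not the SPM implementation).
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(10)
        n_subj, shape = 7, (32, 32, 16)
        pre = rng.normal(50, 5, size=(n_subj,) + shape)
        post = pre + rng.normal(0, 2, size=pre.shape)
        post[:, 10:14, 10:14, :] += 6.0                 # "periventricular" CBF increase

        diff = post - pre
        tvals = diff.mean(axis=0) / (diff.std(axis=0, ddof=1) / np.sqrt(n_subj))
        pvals = stats.t.sf(tvals, df=n_subj - 1)        # one-sided: post > pre
        print("voxels with p < 0.001:", int((pvals < 1e-3).sum()))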

  19. MapFactory - Towards a mapping design pattern for big geospatial data

    Science.gov (United States)

    Rautenbach, Victoria; Coetzee, Serena

    2018-05-01

    With big geospatial data emerging, cartographers and geographic information scientists have to find new ways of dealing with the volume, variety, velocity, and veracity (4Vs) of the data. This requires the development of tools that allow processing, filtering, analysing, and visualising of big data through multidisciplinary collaboration. In this paper, we present the MapFactory design pattern that will be used for the creation of different maps according to the (input) design specification for big geospatial data. The design specification is based on elements from ISO19115-1:2014 Geographic information - Metadata - Part 1: Fundamentals that would guide the design and development of the map or set of maps to be produced. The results of the exploratory research suggest that the MapFactory design pattern will help with software reuse and communication. The MapFactory design pattern will aid software developers to build the tools that are required to automate map making with big geospatial data. The resulting maps would assist cartographers and others to make sense of big geospatial data.
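
    The factory idea behind such a design pattern — a design specification selecting which concrete map product gets built — can be illustrated with a minimal sketch; the class and field names are invented and do not reflect the MapFactory API.

        # Minimal factory-pattern sketch: a design specification selects which map
        # product gets built (class and field names are invented, not the MapFactory API).
        from dataclasses import dataclass

        @dataclass
        class MapSpec:            # stands in for an ISO 19115-style design specification
            map_type: str
            title: str
            extent: tuple

        class ChoroplethMap:
            def __init__(self, spec): self.spec = spec
            def render(self): return f"choropleth '{self.spec.title}' over {self.spec.extent}"

        class HeatMap:
            def __init__(self, spec): self.spec = spec
            def render(self): return f"heat map '{self.spec.title}' over {self.spec.extent}"

        class MapFactory:
            _products = {"choropleth": ChoroplethMap, "heat": HeatMap}

            @classmethod
            def create(cls, spec: MapSpec):
                try:
                    return cls._products[spec.map_type](spec)
                except KeyError:
                    raise ValueError(f"no map product registered for '{spec.map_type}'")

        spec = MapSpec("choropleth", "Population density 2018", (16.0, -35.0, 33.0, -22.0))
        print(MapFactory.create(spec).render())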

  20. A new software routine that automates the fitting of protein X-ray crystallographic electron-density maps.

    Science.gov (United States)

    Levitt, D G

    2001-07-01

    The classical approach to building the amino-acid residues into the initial electron-density map requires days to weeks of a skilled investigator's time. Automating this procedure should not only save time, but has the potential to provide a more accurate starting model for input to refinement programs. The new software routine MAID builds the protein structure into the electron-density map in a series of sequential steps. The first step is the fitting of the secondary alpha-helix and beta-sheet structures. These 'fits' are then used to determine the local amino-acid sequence assignment. These assigned fits are then extended through the loop regions and fused with the neighboring sheet or helix. The program was tested on the unaveraged 2.5 A selenomethionine multiple-wavelength anomalous dispersion (SMAD) electron-density map that was originally used to solve the structure of the 291-residue protein human heart short-chain L-3-hydroxyacyl-CoA dehydrogenase (SHAD). Inputting just the map density and the amino-acid sequence, MAID fitted 80% of the residues with an r.m.s.d. error of 0.43 A for the main-chain atoms and 1.0 A for all atoms without any user intervention. When tested on a higher quality 1.9 A SMAD map, MAID correctly fitted 100% (418) of the residues. A major advantage of the MAID fitting procedure is that it maintains ideal bond lengths and angles and constrains phi/psi angles to the appropriate Ramachandran regions. Recycling the output of this new routine through a partial structure-refinement program may have the potential to completely automate the fitting of electron-density maps.

  1. QuBiLS-MIDAS: a parallel free-software for molecular descriptors computation based on multilinear algebraic maps.

    Science.gov (United States)

    García-Jacas, César R; Marrero-Ponce, Yovani; Acevedo-Martínez, Liesner; Barigye, Stephen J; Valdés-Martiní, José R; Contreras-Torres, Ernesto

    2014-07-05

    The present report introduces the QuBiLS-MIDAS software, belonging to the ToMoCoMD-CARDD suite, for the calculation of three-dimensional molecular descriptors (MDs) based on two-linear (bilinear), three-linear, and four-linear (multilinear or N-linear) algebraic forms. Thus, it is unique software that computes these tensor-based indices. These descriptors establish relations for two, three, and four atoms by using several (dis-)similarity metrics or multimetrics, matrix transformations, cutoffs, local calculations and aggregation operators. The theoretical background of these N-linear indices is also presented. The QuBiLS-MIDAS software was developed in the Java programming language and employs the Chemical Development Kit library for the manipulation of the chemical structures and the calculation of the atomic properties. The software consists of a user-friendly desktop interface and an Abstract Programming Interface library. The former was created to simplify the configuration of the different options of the MDs, whereas the library was designed to allow its easy integration into other software for chemoinformatics applications. The program provides functionalities for data cleaning tasks and for batch processing of the molecular indices. In addition, it offers parallel calculation of the MDs by using all available processors in current computers. The complexity studies of the main algorithms demonstrate that they were implemented efficiently with respect to their trivial implementations. Lastly, the performance tests reveal that the software behaves suitably as the number of processors is increased. Therefore, the QuBiLS-MIDAS software constitutes a useful application for the computation of molecular indices based on N-linear algebraic maps, and it can be used freely to perform chemoinformatics studies. Copyright © 2014 Wiley Periodicals, Inc.

  2. Cluster analysis of quantitative parametric maps from DCE-MRI: application in evaluating heterogeneity of tumor response to antiangiogenic treatment.

    Science.gov (United States)

    Longo, Dario Livio; Dastrù, Walter; Consolino, Lorena; Espak, Miklos; Arigoni, Maddalena; Cavallo, Federica; Aime, Silvio

    2015-07-01

    The objective of this study was to compare a clustering approach with conventional analysis methods for assessing changes in pharmacokinetic parameters obtained from dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) during antiangiogenic treatment in a breast cancer model. BALB/c mice bearing established transplantable her2+ tumors were treated with a DNA-based antiangiogenic vaccine or with an empty plasmid (untreated group). DCE-MRI was carried out at 1 T by administering a dose of 0.05 mmol/kg of Gadocoletic acid trisodium salt, a Gd-based blood-pool contrast agent (CA). Changes in pharmacokinetic estimates (K(trans) and vp) over a nine-day interval were compared between treated and untreated groups in a voxel-by-voxel analysis. The tumor response to therapy was assessed by a clustering approach and compared with conventional summary statistics, with sub-region analysis and with histogram analysis. Both the K(trans) and vp estimates, following blood-pool CA injection, showed marked and spatially heterogeneous changes with antiangiogenic treatment. Averaged values for the whole tumor region, as well as those from the rim/core sub-region analysis, were unable to detect the antiangiogenic response. Histogram analysis resulted in significant changes only in the vp estimates, whereas the clustering approach depicted marked changes in both the K(trans) and vp estimates, with significant spatial heterogeneity in the vp maps in response to treatment when voxels were clustered into three or four sub-regions. This study demonstrated the value of cluster analysis applied to pharmacokinetic DCE-MRI parametric maps for assessing tumor response to antiangiogenic therapy. Copyright © 2015 Elsevier Inc. All rights reserved.
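
    A minimal sketch of the kind of voxel-wise clustering described above, assuming scikit-learn is available; the parameter maps, tumour mask and number of clusters are synthetic stand-ins, not the study's data or pipeline.

```python
import numpy as np
from sklearn.cluster import KMeans  # assumed dependency for this sketch

# Cluster tumour voxels by their pharmacokinetic estimates (Ktrans, vp) so that
# treatment response can be assessed per sub-region rather than from whole-tumour averages.
rng = np.random.default_rng(0)
ktrans_map = rng.gamma(shape=2.0, scale=0.05, size=(64, 64))  # min^-1, synthetic
vp_map = rng.beta(a=2.0, b=20.0, size=(64, 64))               # fraction, synthetic
tumour_mask = np.zeros((64, 64), dtype=bool)
tumour_mask[16:48, 16:48] = True

features = np.column_stack([ktrans_map[tumour_mask], vp_map[tumour_mask]])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)

cluster_map = np.full(ktrans_map.shape, -1)
cluster_map[tumour_mask] = labels  # voxel-wise sub-region assignment
for k in range(3):
    sel = labels == k
    print(f"cluster {k}: mean Ktrans={features[sel, 0].mean():.3f}, "
          f"mean vp={features[sel, 1].mean():.3f}, n={int(sel.sum())}")
```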

  3. Efficiency Analysis of German Electricity Distribution Utilities : Non-Parametric and Parametric Tests

    OpenAIRE

    von Hirschhausen, Christian R.; Cullmann, Astrid

    2005-01-01

    Abstract This paper applies non-parametric and parametric tests to assess the efficiency of electricity distribution companies in Germany. We address traditional issues in electricity sector benchmarking, such as the role of scale effects and optimal utility size, as well as new evidence specific to the situation in Germany. We use labour, capital, and peak load capacity as inputs, and units sold and the number of customers as outputs. The data cover 307 (out of 553) ...

  4. Damage mapping in structural health monitoring using a multi-grid architecture

    Energy Technology Data Exchange (ETDEWEB)

    Mathews, V. John [Dept. of Electrical and Computer Engineering, University of Utah, Salt Lake City, UT 84112 (United States)

    2015-03-31

    This paper presents a multi-grid architecture for tomography-based damage mapping of composite aerospace structures. The system employs an array of piezo-electric transducers bonded on the structure. Each transducer may be used as an actuator as well as a sensor. The structure is excited sequentially using the actuators and the guided waves arriving at the sensors in response to the excitations are recorded for further analysis. The sensor signals are compared to their baseline counterparts and a damage index is computed for each actuator-sensor pair. These damage indices are then used as inputs to the tomographic reconstruction system. Preliminary damage maps are reconstructed on multiple coordinate grids defined on the structure. These grids are shifted versions of each other where the shift is a fraction of the spatial sampling interval associated with each grid. These preliminary damage maps are then combined to provide a reconstruction that is more robust to measurement noise in the sensor signals and the ill-conditioned problem formulation for single-grid algorithms. Experimental results on a composite structure with complexity that is representative of aerospace structures included in the paper demonstrate that for sufficiently high sensor densities, the algorithm of this paper is capable of providing damage detection and characterization with accuracy comparable to traditional C-scan and A-scan-based ultrasound non-destructive inspection systems quickly and without human supervision.

  5. A multi-agent approach to professional software engineering

    NARCIS (Netherlands)

    M. Lützenberger; T. Küster; T. Konnerth; A. Thiele; N. Masuch; A. Heßler; J. Keiser; M. Burkhardt; S. Kaiser (Silvan); J. Tonn; M. Kaisers (Michael); S. Albayrak; M. Cossentino; A. Seghrouchni; M. Winikoff

    2013-01-01

    The community of agent researchers and engineers has produced a number of interesting and mature results. However, agent technology is still not widely adopted by industrial software developers or software companies - possibly because existing frameworks are infused with academic

  6. Tool Support for Parametric Analysis of Large Software Simulation Systems

    Science.gov (United States)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
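
    The n-factor combinatorial parameter variation mentioned above can be illustrated with a small greedy pairwise (2-factor) generator; this is only a sketch of the general technique, not the NASA tool, and the parameter names and levels are hypothetical.

```python
import itertools
import random

# Generate test cases until every value pair of every parameter pair is covered
# at least once (2-factor / pairwise coverage), via a simple greedy random cover.
parameters = {
    "mass_kg": [400, 500, 600],
    "thrust_N": [10, 20],
    "controller_gain": [0.5, 1.0, 2.0],
    "sensor_noise": ["low", "high"],
}

names = list(parameters)
uncovered = {
    ((a, va), (b, vb))
    for a, b in itertools.combinations(names, 2)
    for va in parameters[a]
    for vb in parameters[b]
}

random.seed(0)
test_cases = []
while uncovered:
    case = {n: random.choice(parameters[n]) for n in names}
    pairs = {((a, case[a]), (b, case[b])) for a, b in itertools.combinations(names, 2)}
    if pairs & uncovered:            # keep the case only if it covers something new
        uncovered -= pairs
        test_cases.append(case)

full_factorial = len(list(itertools.product(*parameters.values())))
print(f"{len(test_cases)} cases cover all pairs (full factorial would need {full_factorial})")
```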

  7. AthenaMT: upgrading the ATLAS software framework for the many-core world with multi-threading

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00100895; The ATLAS collaboration; Baines, John; Bold, Tomasz; Calafiura, Paolo; Farrell, Steven; Malon, David; Ritsch, Elmar; Stewart, Graeme; Snyder, Scott; Tsulaia, Vakhtang; Wynne, Benjamin; van Gemmeren, Peter

    2017-01-01

    ATLAS’s current software framework, Gaudi/Athena, has been very successful for the experiment in LHC Runs 1 and 2. However, its single threaded design has been recognized for some time to be increasingly problematic as CPUs have increased core counts and decreased available memory per core. Even the multi-process version of Athena, AthenaMP, will not scale to the range of architectures we expect to use beyond Run2. After concluding a rigorous requirements phase, where many design components were examined in detail, ATLAS has begun the migration to a new data-flow driven, multi-threaded framework, which enables the simultaneous processing of singleton, thread unsafe legacy Algorithms, cloned Algorithms that execute concurrently in their own threads with different Event contexts, and fully re-entrant, thread safe Algorithms. In this paper we report on the process of modifying the framework to safely process multiple concurrent events in different threads, which entails significant changes in the underlying ha...

  8. AthenaMT: Upgrading the ATLAS Software Framework for the Many-Core World with Multi-Threading

    CERN Document Server

    Leggett, Charles; The ATLAS collaboration; Bold, Tomasz; Calafiura, Paolo; Farrell, Steven; Malon, David; Ritsch, Elmar; Stewart, Graeme; Snyder, Scott; Tsulaia, Vakhtang; Wynne, Benjamin; van Gemmeren, Peter

    2016-01-01

    ATLAS's current software framework, Gaudi/Athena, has been very successful for the experiment in LHC Runs 1 and 2. However, its single threaded design has been recognised for some time to be increasingly problematic as CPUs have increased core counts and decreased available memory per core. Even the multi-process version of Athena, AthenaMP, will not scale to the range of architectures we expect to use beyond Run2. After concluding a rigorous requirements phase, where many design components were examined in detail, ATLAS has begun the migration to a new data-flow driven, multi-threaded framework, which enables the simultaneous processing of singleton, thread unsafe legacy Algorithms, cloned Algorithms that execute concurrently in their own threads with different Event contexts, and fully re-entrant, thread safe Algorithms. In this paper we will report on the process of modifying the framework to safely process multiple concurrent events in different threads, which entails significant changes in the underlying...

  9. Petroleum software profiles

    International Nuclear Information System (INIS)

    Anon.

    1996-01-01

    A profile of twenty-two software packages designed for petroleum exploration and production was provided. Some focussed on the oil and gas engineering industry, and others on mapping systems containing well history files and well data summaries. Still other programs provided accounting systems designed to address the complexities of the oil and gas industry. The software packages reviewed were developed by some of the best-known groups involved in software development for the oil and gas industry, including among others, Geoquest, the Can Tek Group, Applied Terravision Systems Inc., Neotechnology Consultants Ltd., OGCI Software Inc., Oracle Energy, Production Revenue Information Systems Management, Virtual Computing Services Ltd., and geoLogic Systems Ltd.

  10. Pixelman: a multi-platform data acquisition and processing software package for Medipix2, Timepix and Medipix3 detectors

    International Nuclear Information System (INIS)

    Turecek, D; Holy, T; Jakubek, J; Pospisil, S; Vykydal, Z

    2011-01-01

    The semiconductor pixel detectors Medipix2, Timepix and Medipix3 (256x256 square pixels, 55x55 μm each) are superior imaging devices in terms of spatial resolution, linearity and dynamic range. This makes them suitable for various applications such as radiography, neutronography, micro-tomography and X-ray dynamic defectoscopy. In order to control and manage such complex measurements a multi-platform software package for acquisition and data processing with a Java graphical user interface has been developed. The functionality of the original version of Pixelman package has been upgraded and extended to include the new medipix devices. The software package can be run on Microsoft Windows, Linux and Mac OS X operating systems. The architecture is very flexible and the functionality can be extended by plugins in C++, Java or combinations of both. The software package may be used as a distributed acquisition system using computers with different operating systems over a local network or the Internet.

  11. Pixelman: a multi-platform data acquisition and processing software package for Medipix2, Timepix and Medipix3 detectors

    Energy Technology Data Exchange (ETDEWEB)

    Turecek, D; Holy, T; Jakubek, J; Pospisil, S; Vykydal, Z, E-mail: daniel.turecek@utef.cvut.cz [Institute of Experimental and Applied Physics, Czech Technical University in Prague, Horska 3a/22, 12800 Prague 2 (Czech Republic)

    2011-01-15

    The semiconductor pixel detectors Medipix2, Timepix and Medipix3 (256x256 square pixels, 55x55 {mu}m each) are superior imaging devices in terms of spatial resolution, linearity and dynamic range. This makes them suitable for various applications such as radiography, neutronography, micro-tomography and X-ray dynamic defectoscopy. In order to control and manage such complex measurements a multi-platform software package for acquisition and data processing with a Java graphical user interface has been developed. The functionality of the original version of Pixelman package has been upgraded and extended to include the new medipix devices. The software package can be run on Microsoft Windows, Linux and Mac OS X operating systems. The architecture is very flexible and the functionality can be extended by plugins in C++, Java or combinations of both. The software package may be used as a distributed acquisition system using computers with different operating systems over a local network or the Internet.

  12. Primitive-path statistics of entangled polymers: mapping multi-chain simulations onto single-chain mean-field models

    International Nuclear Information System (INIS)

    Steenbakkers, Rudi J A; Schieber, Jay D; Tzoumanekas, Christos; Li, Ying; Liu, Wing Kam; Kröger, Martin

    2014-01-01

    We present a method to map the full equilibrium distribution of the primitive-path (PP) length, obtained from multi-chain simulations of polymer melts, onto a single-chain mean-field ‘target’ model. Most previous works used the Doi–Edwards tube model as a target. However, the average number of monomers per PP segment, obtained from multi-chain PP networks, has consistently shown a discrepancy of a factor of two with respect to tube-model estimates. Part of the problem is that the tube model neglects fluctuations in the lengths of PP segments, the number of entanglements per chain and the distribution of monomers among PP segments, while all these fluctuations are observed in multi-chain simulations. Here we use a recently proposed slip-link model, which includes fluctuations in all these variables as well as in the spatial positions of the entanglements. This turns out to be essential to obtain qualitative and quantitative agreement with the equilibrium PP-length distribution obtained from multi-chain simulations. By fitting this distribution, we are able to determine two of the three parameters of the model, which govern its equilibrium properties. This mapping is executed for four different linear polymers and for different molecular weights. The two parameters are found to depend on chemistry, but not on molecular weight. The model predicts a constant plateau modulus minus a correction inversely proportional to molecular weight. The value for well-entangled chains, with the parameters determined ab initio, lies in the range of experimental data for the materials investigated. (paper)

  13. [MapDraw: a microsoft excel macro for drawing genetic linkage maps based on given genetic linkage data].

    Science.gov (United States)

    Liu, Ren-Hu; Meng, Jin-Ling

    2003-05-01

    MAPMAKER is one of the most widely used computer software packages for constructing genetic linkage maps. However, the PC version, MAPMAKER 3.0 for PC, cannot draw the genetic linkage maps that its Macintosh version, MAPMAKER 3.0 for Macintosh, is able to draw. Especially in recent years, the Macintosh computer has become much less popular than the PC, and most geneticists use a PC to analyze their genetic linkage data, so a program that can draw on the PC the same genetic linkage maps that MAPMAKER for Macintosh draws on the Macintosh has long been needed. Microsoft Excel, one component of the Microsoft Office package, is one of the most popular programs for laboratory data processing, and Microsoft Visual Basic for Applications (VBA) is one of its most powerful features. Using this programming language, we can take creative control of Excel, including genetic linkage map construction, automatic data processing and more. In this paper, a Microsoft Excel macro called MapDraw is constructed to draw genetic linkage maps on a PC based on given genetic linkage data. Using this software, you can freely construct a genetic linkage map in Excel and freely edit it and copy it to Word or other applications. This software is just an Excel-format file. You can freely copy it from ftp://211.69.140.177 or ftp://brassica.hzau.edu.cn, and the source code can be found in Excel's Visual Basic Editor.
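
    For readers outside Excel, the same concept (drawing a linkage group from marker names and centimorgan positions) can be sketched in Python/matplotlib; the marker data below are hypothetical and this is not a port of the MapDraw macro.

```python
import matplotlib.pyplot as plt

# Draw one linkage group: a vertical chromosome bar with ticks at marker loci,
# cM positions on the left and marker names on the right. Example data only.
markers = [("RM001", 0.0), ("RM034", 12.4), ("RM127", 27.9), ("RM220", 45.3), ("RM318", 61.0)]

fig, ax = plt.subplots(figsize=(2.5, 6))
ax.plot([0, 0], [0, max(pos for _, pos in markers)], color="black",
        linewidth=6, solid_capstyle="round")            # the chromosome bar
for name, pos in markers:
    ax.plot([-0.05, 0.05], [pos, pos], color="black")            # tick at the locus
    ax.text(-0.1, pos, f"{pos:.1f}", ha="right", va="center")    # cM position
    ax.text(0.1, pos, name, ha="left", va="center")              # marker name
ax.invert_yaxis()
ax.axis("off")
fig.savefig("linkage_group.png", dpi=150, bbox_inches="tight")
```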

  14. Statistical parametric mapping for analyzing interictal magnetoencephalography in patients with left frontal lobe epilepsy.

    Science.gov (United States)

    Zhu, Haitao; Zhu, Jinlong; Bao, Forrest Sheng; Liu, Hongyi; Zhu, Xuchuang; Wu, Ting; Yang, Lu; Zou, Yuanjie; Zhang, Rui; Zheng, Gang

    2016-01-01

    Frontal lobe epilepsy is a common epileptic disorder characterized by recurring seizures that arise in the frontal lobes. The purpose of this study is to identify the epileptogenic regions and other abnormal regions in patients with left frontal lobe epilepsy (LFLE) based on magnetoencephalography (MEG), and to understand the effects of clinical variables on brain activity in patients with LFLE. Fifteen patients with LFLE (23.20 ± 8.68 years, 6 female and 9 male) and 16 healthy controls (23.13 ± 7.66 years, 6 female and 10 male) underwent resting-state MEG examinations. Epileptogenic regions of the LFLE patients were confirmed by surgery. Regional brain activations were quantified using statistical parametric mapping (SPM). The correlations between the activations of the abnormal brain regions and the clinical seizure parameters were computed for the LFLE patients. Brain activations of LFLE patients were significantly elevated in the left superior/middle/inferior frontal gyri, postcentral gyrus, inferior temporal gyrus, insula, parahippocampal gyrus and amygdala, including the epileptogenic regions. Remarkably decreased activations were found mainly in the left parietal gyrus and precuneus. There was a positive correlation between the duration of epilepsy (in months) and the activations of the abnormal regions, while no relation was found between the activations and the age of seizure onset (in years) or seizure frequency. Our findings suggest that the aberrant brain activities of LFLE patients were not restricted to the epileptogenic zones. Long duration of epilepsy might induce further functional damage in patients with LFLE. Copyright © 2015 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.

  15. Mapping Infected Area after a Flash-Flooding Storm Using Multi Criteria Analysis and Spectral Indices

    Science.gov (United States)

    Al-Akad, S.; Akensous, Y.; Hakdaoui, M.

    2017-11-01

    This research article summarizes the application of remote sensing and GIS to the study of urban flood risk in Al Mukalla. Satellite acquisition of a flood event in October 2015 in Al Mukalla (Yemen) and flood risk mapping techniques illustrate the potential risk present in this city. Satellite images (the Landsat and DEM data were atmospherically and radiometrically corrected, and geometric and topographic distortions were rectified) are used for flood risk mapping to produce a hazard (vulnerability) map. This map is obtained by applying image-processing techniques in a geographic information system (GIS) environment, together with the NDVI and NDWI indices and a method to estimate the flood-hazard areas. The following factors were considered in order to estimate the spatial distribution of the hazardous areas: flow accumulation, slope, land use, geology and elevation. The multi-criteria analysis makes it possible to assess vulnerability to flooding and to map the areas of the city of Al Mukalla at risk of flooding. The main objective of this research is to provide a simple and rapid method to reduce and manage flood risk in Yemen, taking the city of Al Mukalla as an example.
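
    The NDVI and NDWI indices mentioned above are simple band ratios; the sketch below shows their standard definitions on hypothetical reflectance arrays standing in for Landsat bands, with an illustrative (not study-specific) water threshold.

```python
import numpy as np

# NDVI = (NIR - Red) / (NIR + Red); NDWI = (Green - NIR) / (Green + NIR).
def ndvi(nir, red):
    return (nir - red) / (nir + red + 1e-10)

def ndwi(green, nir):
    return (green - nir) / (green + nir + 1e-10)

rng = np.random.default_rng(1)
green = rng.uniform(0.02, 0.3, size=(100, 100))   # synthetic reflectance rasters
red = rng.uniform(0.02, 0.3, size=(100, 100))
nir = rng.uniform(0.05, 0.5, size=(100, 100))

water_mask = ndwi(green, nir) > 0.0               # a common, simple open-water threshold
sparse_vegetation = ndvi(nir, red) < 0.2
print("water pixels:", int(water_mask.sum()),
      "sparse-vegetation pixels:", int(sparse_vegetation.sum()))
```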

  16. Software Defined Multiband EVA Radio, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this research is to propose a reliable, lightweight, programmable, multi-band, multi-mode, miniaturized frequency-agile EVA software defined radio...

  17. Estrategia para el diseño paramétrico basado en modelos. // Strategy for model-based parametric design.

    Directory of Open Access Journals (Sweden)

    S. A. Marrero Osorio

    2008-09-01

    Full Text Available This article presents a way of designing parametrically using the computer programs (CAD, CAE, PMS) that have spread among designers over the last 20 years. The proposal is based on mathematical models that capture the engineering knowledge about the design object and everything related to building its three-dimensional virtual model, drawings and other aspects, using the dichromatic graph method to solve the computational problems that arise in parametric design. The points of view of different authors on the general design process are analysed, parametric design is located within that process, and a formal explanation is given that leads to interesting conclusions. Keywords: parametric design, computer-aided design (CAD), computer-aided engineering (CAE), parametric modeling software (PMS), problem solving.

  18. kruX: matrix-based non-parametric eQTL discovery.

    Science.gov (United States)

    Qi, Jianlong; Asl, Hassan Foroughi; Björkegren, Johan; Michoel, Tom

    2014-01-14

    The Kruskal-Wallis test is a popular non-parametric statistical test for identifying expression quantitative trait loci (eQTLs) from genome-wide data due to its robustness against variations in the underlying genetic model and expression trait distribution, but testing billions of marker-trait combinations one-by-one can become computationally prohibitive. We developed kruX, an algorithm implemented in Matlab, Python and R that uses matrix multiplications to simultaneously calculate the Kruskal-Wallis test statistic for several millions of marker-trait combinations at once. KruX is more than ten thousand times faster than computing associations one-by-one on a typical human dataset. We used kruX and a dataset of more than 500k SNPs and 20k expression traits measured in 102 human blood samples to compare eQTLs detected by the Kruskal-Wallis test to eQTLs detected by the parametric ANOVA and linear model methods. We found that the Kruskal-Wallis test is more robust against data outliers and heterogeneous genotype group sizes and detects a higher proportion of non-linear associations, but is more conservative for calling additive linear associations. kruX enables the use of robust non-parametric methods for massive eQTL mapping without the need for a high-performance computing infrastructure and is freely available from http://krux.googlecode.com.
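
    A minimal sketch of the matrix trick described above, assuming NumPy/SciPy: rank each trait once, then obtain per-genotype rank sums for all traits with one matrix product. Tie correction, missing data and the multi-marker loop of kruX are omitted for brevity.

```python
import numpy as np
from scipy.stats import rankdata, chi2  # assumed dependencies for this sketch

rng = np.random.default_rng(0)
n_samples, n_traits = 100, 5000
expr = rng.normal(size=(n_traits, n_samples))          # expression traits x samples
genotypes = rng.integers(0, 3, size=n_samples)         # one marker, coded 0/1/2

ranks = np.apply_along_axis(rankdata, 1, expr)         # rank within each trait
onehot = np.eye(3)[genotypes]                          # samples x 3 indicator matrix
group_sizes = onehot.sum(axis=0)                       # n_g per genotype group
rank_sums = ranks @ onehot                             # traits x 3, all traits at once

n = n_samples
H = 12.0 / (n * (n + 1)) * ((rank_sums ** 2) / group_sizes).sum(axis=1) - 3 * (n + 1)
p_values = chi2.sf(H, df=2)                            # df = number of groups - 1
print("smallest p-value:", p_values.min())
```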

  19. YouGenMap: a web platform for dynamic multi-comparative mapping and visualization of genetic maps

    Science.gov (United States)

    Keith Batesole; Kokulapalan Wimalanathan; Lin Liu; Fan Zhang; Craig S. Echt; Chun Liang

    2014-01-01

    Comparative genetic maps are used in examination of genome organization, detection of conserved gene order, and exploration of marker order variations. YouGenMap is an open-source web tool that offers dynamic comparative mapping capability of users' own genetic mapping between 2 or more map sets. Users' genetic map data and optional gene annotations are...

  20. Open software architecture for east articulated maintenance arm

    International Nuclear Information System (INIS)

    Wu, Jing; Wu, Huapeng; Song, Yuntao; Li, Ming; Yang, Yang; Alcina, Daniel A.M.

    2016-01-01

    Highlights: • The software requirements of a serial-articulated robot for EAST assembly and maintenance are presented. • An open software architecture for the robot is developed. • A component-based model distribution system with real-time communication for the robot is constructed. - Abstract: For the inside inspection and maintenance of the vacuum vessel in EAST, an articulated maintenance arm has been developed. This article describes an open software architecture developed for the EAST articulated maintenance arm (EAMA), which offers robust, proper performance and an easy-going user experience based on the standard open robotic platform OROCOS. The paper presents a component-based software architecture using a multi-layer structure: the end layer, the up layer, the middle layer, and the down layer. In the end layer the components are defined off-line in the task-planner manner. The components in the up layer complete the trajectory-planning function. CORBA, as a communication framework, is adopted to exchange data between the distributed components. The contributors use Real-Time Workshop from MATLAB/Simulink to generate the components in the middle layer, and the Real-Time Toolkit guarantees that control applications run in hard real-time mode. Ethernet and the CAN bus are used for data transfer in the down layer, where the components implement the hardware functions. The distributed architecture of the control system associates each processing node with each joint, which is mapped to a component with all the functioning features of the framework.

  1. Open software architecture for east articulated maintenance arm

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Jing, E-mail: wujing@ipp.ac.cn [Institute of Plasma Physics Chinese Academy of Sciences, 350 Shushanhu Rd Hefei Anhui (China); Lappeenranta University of Technology, Skinnarilankatu 34 Lappeenranta (Finland); Wu, Huapeng [Lappeenranta University of Technology, Skinnarilankatu 34 Lappeenranta (Finland); Song, Yuntao [Institute of Plasma Physics Chinese Academy of Sciences, 350 Shushanhu Rd Hefei Anhui (China); Li, Ming [Lappeenranta University of Technology, Skinnarilankatu 34 Lappeenranta (Finland); Yang, Yang [Institute of Plasma Physics Chinese Academy of Sciences, 350 Shushanhu Rd Hefei Anhui (China); Alcina, Daniel A.M. [Lappeenranta University of Technology, Skinnarilankatu 34 Lappeenranta (Finland)

    2016-11-01

    Highlights: • The software requirements of a serial-articulated robot for EAST assembly and maintenance are presented. • An open software architecture for the robot is developed. • A component-based model distribution system with real-time communication for the robot is constructed. - Abstract: For the inside inspection and maintenance of the vacuum vessel in EAST, an articulated maintenance arm has been developed. This article describes an open software architecture developed for the EAST articulated maintenance arm (EAMA), which offers robust, proper performance and an easy-going user experience based on the standard open robotic platform OROCOS. The paper presents a component-based software architecture using a multi-layer structure: the end layer, the up layer, the middle layer, and the down layer. In the end layer the components are defined off-line in the task-planner manner. The components in the up layer complete the trajectory-planning function. CORBA, as a communication framework, is adopted to exchange data between the distributed components. The contributors use Real-Time Workshop from MATLAB/Simulink to generate the components in the middle layer, and the Real-Time Toolkit guarantees that control applications run in hard real-time mode. Ethernet and the CAN bus are used for data transfer in the down layer, where the components implement the hardware functions. The distributed architecture of the control system associates each processing node with each joint, which is mapped to a component with all the functioning features of the framework.

  2. A parametric level-set approach for topology optimization of flow domains

    DEFF Research Database (Denmark)

    Pingen, Georg; Waidmann, Matthias; Evgrafov, Anton

    2010-01-01

    of the design variables in the traditional approaches is seen as a possible cause for the slow convergence. Non-smooth material distributions are suspected to trigger premature onset of instationary flows which cannot be treated by steady-state flow models. In the present work, we study whether the convergence and the versatility of topology optimization methods for fluidic systems can be improved by employing a parametric level-set description. In general, level-set methods allow controlling the smoothness of boundaries, yield a non-local influence of design variables, and decouple the material description from the flow field discretization. The parametric level-set method used in this study utilizes a material distribution approach to represent flow boundaries, resulting in a non-trivial mapping between design variables and local material properties. Using a hydrodynamic lattice Boltzmann method, we study

  3. Towards Stabilizing Parametric Active Contours

    DEFF Research Database (Denmark)

    Liu, Jinchao; Fan, Zhun; Olsen, Søren Ingvor

    2014-01-01

    Numerical instability often occurs in the evolution of parametric active contours, mainly due to undesired changes of the parametrization during evolution. In this paper, we propose a new tangential diffusion term to compensate for this undesired change. As a result, the parametrization will converge...

  4. Quantitative Multi-Parametric Magnetic Resonance Imaging of Tumor Response to Photodynamic Therapy.

    Directory of Open Access Journals (Sweden)

    Tom J L Schreurs

    Full Text Available The aim of this study was to characterize response to photodynamic therapy (PDT) in a mouse cancer model using a multi-parametric quantitative MRI protocol and to identify MR parameters as potential biomarkers for early assessment of treatment outcome. CT26.WT colon carcinoma tumors were grown subcutaneously in the hind limb of BALB/c mice. Therapy consisted of intravenous injection of the photosensitizer Bremachlorin, followed by 10 min laser illumination (200 mW/cm2) of the tumor 6 h post injection. MRI at 7 T was performed at baseline, directly after PDT, as well as at 24 h, and 72 h. Tumor relaxation time constants (T1 and T2) and apparent diffusion coefficient (ADC) were quantified at each time point. Additionally, Gd-DOTA dynamic contrast-enhanced (DCE) MRI was performed to estimate transfer constants (Ktrans) and volume fractions of the extravascular extracellular space (ve) using standard Tofts-Kermode tracer kinetic modeling. At the end of the experiment, tumor viability was characterized by histology using NADH-diaphorase staining. The therapy induced extensive cell death in the tumor and resulted in significant reduction in tumor growth, as compared to untreated controls. Tumor T1 and T2 relaxation times remained unchanged up to 24 h, but decreased at 72 h after treatment. Tumor ADC values significantly increased at 24 h and 72 h. DCE-MRI derived tracer kinetic parameters displayed an early response to the treatment. Directly after PDT complete vascular shutdown was observed in large parts of the tumors and reduced uptake (decreased Ktrans) in remaining tumor tissue. At 24 h, contrast uptake in most tumors was essentially absent. Out of 5 animals that were monitored for 2 weeks after treatment, 3 had tumor recurrence, in locations that showed strong contrast uptake at 72 h. DCE-MRI is an effective tool for visualization of vascular effects directly after PDT. Endogenous contrast parameters T1, T2, and ADC, measured at 24 to 72 h after PDT, are
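
    A minimal sketch of the standard Tofts-Kermode model referred to above, Ct(t) = Ktrans ∫ Cp(τ) exp(-(Ktrans/ve)(t−τ)) dτ, with a hypothetical bi-exponential arterial input function; this is not the study's fitting code.

```python
import numpy as np

def tofts_tissue_curve(t, cp, ktrans, ve):
    """Discrete convolution of the AIF cp(t) with the Tofts impulse response."""
    dt = t[1] - t[0]
    kep = ktrans / ve
    impulse = ktrans * np.exp(-kep * t)
    return np.convolve(cp, impulse)[: len(t)] * dt

t = np.arange(0, 300, 1.0)                         # seconds
cp = 5.0 * (np.exp(-0.01 * t) - np.exp(-0.1 * t))  # hypothetical AIF shape, mM
ct = tofts_tissue_curve(t, cp, ktrans=0.25 / 60, ve=0.3)  # Ktrans given in s^-1
print("peak tissue concentration (a.u.):", ct.max())
```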

  5. The software design of multi-branch, multi-point remote monitoring system for temperature measurement based on MSP430 and DS18B20

    International Nuclear Information System (INIS)

    Yu Jun; Yan Yu

    2009-01-01

    This paper presents a system that can acquire remote temperature measurement data from 40 monitoring points through the RS-232 serial port and the Intranet. The system's hardware consists of TI's MSP430F149 mixed-signal processor and the UA7000A network module. Digital DS18B20 temperature sensors are used, so the structure is simple and easy to expand and the sensors send out the temperature data directly; the MSP430F149 has the advantages of ultra-low power consumption and a high degree of integration. Built around the MSP430F149, the multi-branch, multi-point temperature measurement system is powerful, simple in structure, highly reliable and strongly resistant to interference. The client software is user-friendly and easy to use; it is designed in the Microsoft Visual C++ 6.0 environment. The monitoring system performs real-time remote monitoring of a total of 40 temperature measurement points across 4 branches. (authors)

  6. A Component-based Software Development and Execution Framework for CAx Applications

    Directory of Open Access Journals (Sweden)

    N. Matsuki

    2004-01-01

    Full Text Available Digitalization of the manufacturing process and technologies is regarded as the key to increased competitive ability. The MZ-Platform infrastructure is a component-based software development framework designed to support enterprises in enhancing digitalized technologies using software tools and CAx components in a self-innovative way. In the paper we show the algorithm, system architecture, and a CAx application example on MZ-Platform. We also propose a new parametric data structure based on MZ-Platform.

  7. The NOD3 software package: A graphical user interface-supported reduction package for single-dish radio continuum and polarisation observations

    Science.gov (United States)

    Müller, Peter; Krause, Marita; Beck, Rainer; Schmidt, Philip

    2017-10-01

    Context. The venerable NOD2 data reduction software package for single-dish radio continuum observations, which was developed for use at the 100-m Effelsberg radio telescope, has been successfully applied over many decades. Modern computing facilities, however, call for a new design. Aims: We aim to develop an interactive software tool with a graphical user interface for the reduction of single-dish radio continuum maps. We make a special effort to reduce the distortions along the scanning direction (scanning effects) by combining maps scanned in orthogonal directions or dual- or multiple-horn observations that need to be processed in a restoration procedure. The package should also process polarisation data and offer the possibility to include special tasks written by the individual user. Methods: Based on the ideas of the NOD2 package we developed NOD3, which includes all necessary tasks from the raw maps to the final maps in total intensity and linear polarisation. Furthermore, plot routines and several methods for map analysis are available. The NOD3 package is written in Python, which allows the extension of the package via additional tasks. The required data format for the input maps is FITS. Results: The NOD3 package is a sophisticated tool to process and analyse maps from single-dish observations that are affected by scanning effects from clouds, receiver instabilities, or radio-frequency interference. The "basket-weaving" tool combines orthogonally scanned maps into a final map that is almost free of scanning effects. The new restoration tool for dual-beam observations reduces the noise by a factor of about two compared to the NOD2 version. Combining single-dish with interferometer data in the map plane ensures the full recovery of the total flux density. Conclusions: This software package is available under the open source license GPL for free use at other single-dish radio telescopes of the astronomical community. The NOD3 package is designed to be
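
    The basket-weaving step can be illustrated with a simple Fourier-domain combination of two orthogonally scanned maps, down-weighting each map at the spatial frequencies where its scanning stripes live. This follows the general idea only; the weighting below is an illustrative choice, not the NOD3 implementation.

```python
import numpy as np

def basket_weave(map_x_scan, map_y_scan, eps=1e-6):
    """Combine an x-scanned and a y-scanned map of the same field in Fourier space."""
    fx_map = np.fft.fft2(map_x_scan)
    fy_map = np.fft.fft2(map_y_scan)
    ky, kx = np.meshgrid(np.fft.fftfreq(map_x_scan.shape[0]),
                         np.fft.fftfreq(map_x_scan.shape[1]), indexing="ij")
    wx = kx ** 2 / (kx ** 2 + ky ** 2 + eps)   # suppress the x-scanned map at small |kx| (its stripes)
    wy = ky ** 2 / (kx ** 2 + ky ** 2 + eps)   # suppress the y-scanned map at small |ky|
    combined = (wx * fx_map + wy * fy_map) / (wx + wy + eps)
    combined[0, 0] = 0.5 * (fx_map[0, 0] + fy_map[0, 0])  # keep the mean level
    return np.fft.ifft2(combined).real

# Synthetic demonstration: a point source plus per-row / per-column baseline offsets.
rng = np.random.default_rng(0)
truth = np.zeros((128, 128)); truth[60:68, 60:68] = 1.0
map_x = truth + rng.normal(0, 0.2, size=(128, 1)) * np.ones((1, 128))  # x-scan stripes
map_y = truth + rng.normal(0, 0.2, size=(1, 128)) * np.ones((128, 1))  # y-scan stripes
print("residual rms after weaving:", float(np.std(basket_weave(map_x, map_y) - truth)))
```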

  8. A multi-instrument non-parametric reconstruction of the electron pressure profile in the galaxy cluster CLJ1226.9+3332

    Science.gov (United States)

    Romero, C.; McWilliam, M.; Macías-Pérez, J.-F.; Adam, R.; Ade, P.; André, P.; Aussel, H.; Beelen, A.; Benoît, A.; Bideaud, A.; Billot, N.; Bourrion, O.; Calvo, M.; Catalano, A.; Coiffard, G.; Comis, B.; de Petris, M.; Désert, F.-X.; Doyle, S.; Goupy, J.; Kramer, C.; Lagache, G.; Leclercq, S.; Lestrade, J.-F.; Mauskopf, P.; Mayet, F.; Monfardini, A.; Pascale, E.; Perotto, L.; Pisano, G.; Ponthieu, N.; Revéret, V.; Ritacco, A.; Roussel, H.; Ruppin, F.; Schuster, K.; Sievers, A.; Triqueneaux, S.; Tucker, C.; Zylka, R.

    2018-04-01

    Context. In the past decade, sensitive, resolved Sunyaev-Zel'dovich (SZ) studies of galaxy clusters have become common. Whereas many previous SZ studies have parameterized the pressure profiles of galaxy clusters, non-parametric reconstructions will provide insights into the thermodynamic state of the intracluster medium. Aim. We seek to recover the non-parametric pressure profiles of the high redshift (z = 0.89) galaxy cluster CLJ 1226.9+3332 as inferred from SZ data from the MUSTANG, NIKA, Bolocam, and Planck instruments, which all probe different angular scales. Methods: Our non-parametric algorithm makes use of logarithmic interpolation, which under the assumption of ellipsoidal symmetry is analytically integrable. For MUSTANG, NIKA, and Bolocam we derive a non-parametric pressure profile independently and find good agreement among the instruments. In particular, we find that the non-parametric profiles are consistent with a fitted generalized Navaro-Frenk-White (gNFW) profile. Given the ability of Planck to constrain the total signal, we include a prior on the integrated Compton Y parameter as determined by Planck. Results: For a given instrument, constraints on the pressure profile diminish rapidly beyond the field of view. The overlap in spatial scales probed by these four datasets is therefore critical in checking for consistency between instruments. By using multiple instruments, our analysis of CLJ 1226.9+3332 covers a large radial range, from the central regions (down to 0.05 R500) to the cluster outskirts, and is of relevance for the next generation of SZ instruments such as NIKA2 and MUSTANG2.
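
    The logarithmic interpolation mentioned above amounts to treating the pressure profile as a power law between radial bins, i.e. linear interpolation in log-log space; the sketch below uses hypothetical bin values.

```python
import numpy as np

def log_interp(r, r_bins, p_bins):
    """Power-law (log-log) interpolation of a binned pressure profile."""
    return np.exp(np.interp(np.log(r), np.log(r_bins), np.log(p_bins)))

r_bins = np.array([0.05, 0.15, 0.4, 1.0])   # radii in units of R500 (hypothetical bins)
p_bins = np.array([8.0, 3.0, 0.8, 0.1])     # pressure in arbitrary units (hypothetical)
r = np.geomspace(0.05, 1.0, 8)
print(np.round(log_interp(r, r_bins, p_bins), 3))
```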

  9. High-resolution brain SPECT imaging in attention deficit hyperactivity disorder children without comorbidity: quantitative analysis using statistical parametric mapping(SPM)

    International Nuclear Information System (INIS)

    Lee, Myoung Hoon; Yoon, Seok Nam; Oh, Eun Young; Chung, Young Ki; Hwang, Isaac; Lee, Jae Sung

    2002-01-01

    We examined abnormalities of regional cerebral blood flow (rCBF) in children with attention deficit hyperactivity disorder (ADHD) without comorbidity using the statistical parametric mapping (SPM) method. Children who did not meet the DSM-IV diagnostic criteria for ADHD and who showed a normal rCBF pattern on visual analysis were used as normal controls. Tc-99m ECD brain SPECT was performed on 75 patients (M:F=64:11, 10.0±2.5 y) meeting the DSM-IV diagnostic criteria for ADHD and 13 normal control children (M:F=9:4, 10.3±4.1 y). Using the SPM method, we compared the patient group's SPECT images with those of the 13 control subjects and measured the extent of the area with significant hypoperfusion (p<0.01) in 34 predefined cerebral regions. Only one area, in the left temporal lobe, showed significant hypoperfusion in ADHD patients without comorbidity (n=75) compared with control subjects (n=13) (p<0.01, extent threshold=16). rCBF in the left temporal area was decreased in the ADHD group without comorbidity (such as tics) compared with the control group.

  10. High-resolution brain SPECT imaging in attention deficit hyperactivity disorder children without comorbidity: quantitative analysis using statistical parametric mapping(SPM)

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Myoung Hoon; Yoon, Seok Nam; Oh, Eun Young [Ajou University School of Medicine, Suwon (Korea, Republic of); Chung, Young Ki; Hwang, Isaac; Lee, Jae Sung [Seoul National University College of Medicine, Seoul (Korea, Republic of)

    2002-07-01

    We examined abnormalities of regional cerebral blood flow (rCBF) in children with attention deficit hyperactivity disorder (ADHD) without comorbidity using the statistical parametric mapping (SPM) method. Children who did not meet the DSM-IV diagnostic criteria for ADHD and who showed a normal rCBF pattern on visual analysis were used as normal controls. Tc-99m ECD brain SPECT was performed on 75 patients (M:F=64:11, 10.0±2.5 y) meeting the DSM-IV diagnostic criteria for ADHD and 13 normal control children (M:F=9:4, 10.3±4.1 y). Using the SPM method, we compared the patient group's SPECT images with those of the 13 control subjects and measured the extent of the area with significant hypoperfusion (p<0.01) in 34 predefined cerebral regions. Only one area, in the left temporal lobe, showed significant hypoperfusion in ADHD patients without comorbidity (n=75) compared with control subjects (n=13) (p<0.01, extent threshold=16). rCBF in the left temporal area was decreased in the ADHD group without comorbidity (such as tics) compared with the control group.

  11. Systematic Software Development

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Méndez Fernández, Daniel

    2015-01-01

    The speed of innovation and the global allocation of resources to accelerate development or to reduce cost put pressure on the software industry. In the global competition, especially so-called high-price countries have to present arguments why the higher development cost is justified and what makes these countries an attractive host for software companies. Often, high-quality engineering and excellent quality of products, e.g., machinery and equipment, are mentioned. Yet, the question is: can such arguments also be found for the software industry? We aim at investigating the degree of professionalism and systematization of software development to draw a map of strengths and weaknesses. To this end, we conducted as a first step an exploratory survey in Germany, presented in this paper. In this survey, we focused on the perceived importance of the two general software engineering process areas

  12. Software survey: VOSviewer, a computer program for bibliometric mapping

    NARCIS (Netherlands)

    N.J.P. van Eck (Nees Jan); L. Waltman (Ludo)

    2010-01-01

    textabstractWe present VOSviewer, a freely available computer program that we have developed for constructing and viewing bibliometric maps. Unlike most computer programs that are used for bibliometric mapping, VOSviewer pays special attention to the graphical representation of bibliometric maps.

  13. The Performance Evaluation of Multi-Image 3d Reconstruction Software with Different Sensors

    Science.gov (United States)

    Mousavi, V.; Khosravi, M.; Ahmadi, M.; Noori, N.; Naveh, A. Hosseini; Varshosaz, M.

    2015-12-01

    Today, multi-image 3D reconstruction is an active research field, and generating three-dimensional models of objects is one of the most discussed issues in photogrammetry and computer vision; it can be accomplished using range-based or image-based methods. The very accurate and dense point clouds generated by range-based methods such as structured-light systems and laser scanners have established them as reliable tools in industry. Image-based 3D digitization methodologies offer the option of reconstructing an object from a set of unordered images that depict it from different viewpoints. As their hardware requirements are narrowed down to a digital camera and a computer system, they constitute an attractive 3D digitization approach; consequently, although range-based methods are generally very accurate, image-based methods are low-cost and can easily be used by non-professional users. One of the factors affecting the accuracy of the obtained model in image-based methods is the software and algorithm used to generate the three-dimensional model. These algorithms are provided in the form of commercial software, open-source software and web-based services. Another important factor in the accuracy of the obtained model is the type of sensor used. Given the availability of mobile sensors to the public, the popularity of professional sensors and the advent of stereo sensors, a comparison of these three sensor types plays an effective role in evaluating and finding the optimal method to generate three-dimensional models. A lot of research has been carried out to identify suitable software and algorithms for achieving an accurate and complete model, but little attention has been paid to the type of sensor used and its effect on the quality of the final model. The purpose of this paper is the deliberation and introduction of an appropriate combination of a sensor and software to provide a complete model with the highest accuracy. To do this, different software, used in previous studies, were compared and

  14. Parametric imaging of tumor perfusion and neovascular morphology using ultrasound

    Science.gov (United States)

    Hoyt, Kenneth

    2015-03-01

    A new image processing strategy is detailed for the simultaneous measurement of tumor perfusion and neovascular morphology parameters from a sequence of dynamic contrast-enhanced ultrasound (DCE-US) images. A technique for locally mapping tumor perfusion parameters using skeletonized neovascular data is also introduced. Simulated images were used to test the neovascular skeletonization technique and the variance (error) of relevant parametric estimates. Preliminary DCE-US image datasets were collected in 6 female patients diagnosed with invasive breast cancer using a Philips iU22 ultrasound system equipped with a L9-3 MHz transducer and Definity contrast agent. Simulation data demonstrate that neovascular morphology parametric estimation is reproducible, although measurement error can occur at lower signal-to-noise ratios (SNR). Experimental results indicate the feasibility of our approach to performing both tumor perfusion and neovascular morphology measurements from DCE-US images. Future work will expand on our initial clinical findings and also extend our image processing strategy to 3-dimensional space to allow whole tumor characterization.
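
    A minimal sketch of voxel-wise parametric perfusion mapping of the kind described above, on a synthetic time-intensity series: peak enhancement and time-to-peak per pixel. The data and parameters are stand-ins, not the DCE-US pipeline of the study.

```python
import numpy as np

# Build a synthetic, motion-free contrast time-intensity series and extract two
# simple perfusion parametric maps: peak enhancement and time-to-peak.
rng = np.random.default_rng(0)
n_frames, ny, nx = 60, 64, 64
t = np.arange(n_frames) * 0.5                         # seconds between frames
ttp_true = rng.uniform(5, 20, size=(ny, nx))          # hypothetical wash-in variation
series = (np.exp(-((t[:, None, None] - ttp_true) ** 2) / 20.0)
          + 0.05 * rng.normal(size=(n_frames, ny, nx)))

peak_enhancement = series.max(axis=0)                 # parametric map 1
time_to_peak = t[series.argmax(axis=0)]               # parametric map 2
print("median time-to-peak (s):", float(np.median(time_to_peak)))
```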

  15. Immobilization of Cellulase from Bacillus subtilis UniMAP-KB01 on Multi-walled Carbon Nanotubes for Biofuel Production

    Science.gov (United States)

    Naresh, Sandrasekaran; Hoong Shuit, Siew; Kunasundari, Balakrishnan; Hoo Peng, Yong; Qi, Hwa Ng; Teoh, Yi Peng

    2018-03-01

    Bacillus subtilis UniMAP-KB01, a cellulase producer, was isolated from Malaysian mangrove soil. Through morphological identification it was observed that the B. subtilis is rod-shaped, and it was identified as a Gram-positive bacterium. The growth profile of the isolated B. subtilis was established by measuring optical density (OD) at 600 nm at 1-hour intervals. Polymath software was employed to plot the growth profile, and the non-linear fit gave a linear regression coefficient R2 of 0.9602, a root mean square deviation (RMSD) of 0.0176 and a variance of 0.0025. The hydrolysis capacity test revealed a cellulolytic index of 2.83 ± 0.46 after staining with Gram's iodine. The crude enzyme harvested after 24 hours of incubation in carboxymethylcellulose (CMC) broth at 45°C and 100 RPM was tested for enzyme activity. Through the Filter Paper Assay (FPA), the cellulase activity was calculated to be 0.05 U/mL. The hydrolysis capacity test and FPA showed acceptable values for thermophilic bacterial enzyme activity. Thus, this isolated strain is considered to have potential for producing thermostable cellulase, which will be immobilized onto multi-walled carbon nanotubes, and its cellulolytic activity will be characterized for biofuel production.

  16. Two-phase flow operational maps for multi-microchannel evaporators

    International Nuclear Information System (INIS)

    Szczukiewicz, Sylwia; Borhani, Navid; Thome, John Richard

    2013-01-01

    Highlights: • New operational maps for several different micro-evaporators are presented. • Inlet micro-orifices prevented flow instability, back flow, and flow maldistribution. • Eight different operating regimes were distinguished. • The flashing two-phase flow without back flow operating regime is preferred. -- Abstract: The current paper presents new operational maps for several different multi-microchannel evaporators, with and without any inlet restrictions (micro-orifices), for the two-phase flow of refrigerants R245fa, R236fa, and R1234ze(E). The test fluids flowed in 67 parallel channels, each having a cross-sectional area of 100 × 100 μm2. In order to emulate the power dissipated by active components in a 3D CMOS CPU chip, two aluminium microheaters were sputtered onto the back-side of the test section, each providing an area of 0.5 cm2. Without any inlet restrictions in the micro-evaporator, significant parallel channel flow instabilities, vapor back flow, and flow maldistribution led to high-amplitude and high-frequency temperature and pressure oscillations. Such undesired phenomena were then prevented by placing restrictions at the inlet of each channel. High-speed flow visualization distinguished eight different operating regimes of the two-phase flow depending on the tested operating conditions. Therefore, the preferred operating regimes can be easily traced. In particular, flashing two-phase flow without back flow appeared to be the best operating regime without any flow and temperature instabilities

  17. Parametric model of the scala tympani for haptic-rendered cochlear implantation.

    Science.gov (United States)

    Todd, Catherine; Naghdy, Fazel

    2005-01-01

    A parametric model of the human scala tympani has been designed for use in a haptic-rendered computer simulation of cochlear implant surgery. It will be the first surgical simulator of this kind. A geometric model of the Scala Tympani has been derived from measured data for this purpose. The model is compared with two existing descriptions of the cochlear spiral. A first approximation of the basilar membrane is also produced. The structures are imported into a force-rendering software application for system development.

  18. A novel SURE-based criterion for parametric PSF estimation.

    Science.gov (United States)

    Xue, Feng; Blu, Thierry

    2015-02-01

    We propose an unbiased estimate of a filtered version of the mean squared error--the blur-SURE (Stein's unbiased risk estimate)--as a novel criterion for estimating an unknown point spread function (PSF) from the degraded image only. The PSF is obtained by minimizing this new objective functional over a family of Wiener processings. Based on this estimated blur kernel, we then perform nonblind deconvolution using our recently developed algorithm. The SURE-based framework is exemplified with a number of parametric PSF, involving a scaling factor that controls the blur size. A typical example of such parametrization is the Gaussian kernel. The experimental results demonstrate that minimizing the blur-SURE yields highly accurate estimates of the PSF parameters, which also result in a restoration quality that is very similar to the one obtained with the exact PSF, when plugged into our recent multi-Wiener SURE-LET deconvolution algorithm. The highly competitive results obtained outline the great potential of developing more powerful blind deconvolution algorithms based on SURE-like estimates.

  19. A Method of Vector Map Multi-scale Representation Considering User Interest on Subdivision Gird

    Directory of Open Access Journals (Sweden)

    YU Tong

    2016-12-01

    Full Text Available Compared with traditional spatial data models and methods, global subdivision grids show a great advantage in the organization and expression of massive spatial data. In view of this, a method of vector map multi-scale representation considering user interest on a subdivision grid is proposed. First, a spatial interest field is built using a large number of POI data to describe the spatial distribution of user interest in geographic information. Second, spatial factors are classified and graded, and their representation scale ranges are determined. Finally, different levels of subdivision surfaces are divided based on the GeoSOT subdivision theory, and the corresponding relation between subdivision level and scale is established. According to the user interest of the subdivision surfaces, spatial features can be expressed at different degrees of detail, realizing multi-scale representation of spatial data based on user interest. The experimental results show that this method can not only satisfy users' general-to-detail and important-to-secondary spatial cognitive demands, but also achieve a better multi-scale representation effect.

  20. Tropical land use land cover mapping in Pará (Brazil) using discriminative Markov random fields and multi-temporal TerraSAR-X data

    Science.gov (United States)

    Hagensieker, Ron; Roscher, Ribana; Rosentreter, Johannes; Jakimow, Benjamin; Waske, Björn

    2017-12-01

    Remote sensing satellite data offer the unique possibility to map land use land cover transformations by providing spatially explicit information. However, detection of short-term processes and land use patterns of high spatial-temporal variability is a challenging task. We present a novel framework using multi-temporal TerraSAR-X data and machine learning techniques, namely discriminative Markov random fields with spatio-temporal priors, and import vector machines, in order to advance the mapping of land cover characterized by short-term changes. Our study region covers a current deforestation frontier in the Brazilian state Pará with land cover dominated by primary forests, different types of pasture land and secondary vegetation, and land use dominated by short-term processes such as slash-and-burn activities. The data set comprises multi-temporal TerraSAR-X imagery acquired over the course of the 2014 dry season, as well as optical data (RapidEye, Landsat) for reference. Results show that land use land cover is reliably mapped, resulting in spatially adjusted overall accuracies of up to 79% in a five class setting, yet limitations for the differentiation of different pasture types remain. The proposed method is applicable on multi-temporal data sets, and constitutes a feasible approach to map land use land cover in regions that are affected by high-frequent temporal changes.