WorldWideScience

Sample records for multi-parametric mapping software

  1. A generalized parametric response mapping method for analysis of multi-parametric imaging: A feasibility study with application to glioblastoma.

    Science.gov (United States)

    Lausch, Anthony; Yeung, Timothy Pok-Chi; Chen, Jeff; Law, Elton; Wang, Yong; Urbini, Benedetta; Donelli, Filippo; Manco, Luigi; Fainardi, Enrico; Lee, Ting-Yim; Wong, Eugene

    2017-11-01

    Parametric response map (PRM) analysis of functional imaging has been shown to be an effective tool for early prediction of cancer treatment outcomes and may also be well suited to guiding personalized adaptive radiotherapy (RT) strategies such as sub-volume boosting. However, the PRM method was primarily designed for analysis of longitudinally acquired pairs of single-parameter image data. The purpose of this study was to demonstrate the feasibility of a generalized parametric response map analysis framework, which enables analysis of multi-parametric data while maintaining the key advantages of the original PRM method. MRI-derived apparent diffusion coefficient (ADC) and relative cerebral blood volume (rCBV) maps acquired at 1 and 3 months post-RT for 19 patients with high-grade glioma were used to demonstrate the algorithm. Images were first co-registered and then standardized using normal tissue image intensity values. Tumor voxels were then plotted in a four-dimensional Cartesian space with coordinate values equal to a voxel's image intensity in each of the image volumes and an origin defined as the multi-parametric mean of normal tissue image intensity values. Voxel positions were orthogonally projected onto a line defined by the origin and a pre-determined response vector. Voxels were subsequently classified as positive, negative or nil, according to whether their projected positions along the response vector exceeded a threshold distance from the origin. The response vector was selected by identifying the direction in which the standard deviation of tumor image intensity values was maximally different between responding and non-responding patients within a training dataset. Voxel classifications were visualized via familiar three-class response maps, and the fraction of tumor voxels associated with each of the classes was then investigated for predictive utility, analogous to the original PRM method. Independent PRM and MPRM analyses of the contrast
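
    A minimal NumPy sketch of the projection-and-threshold step described above (function and variable names are ours, and the symmetric threshold is an assumption; the paper selects its own response vector and threshold from training data):

    ```python
    import numpy as np

    def classify_voxels(intensities, origin, response_vec, threshold):
        """Project tumor voxels onto the response vector and classify them.

        intensities  : (n_voxels, n_params) array, e.g. ADC and rCBV at 1 and 3 months
        origin       : (n_params,) multi-parametric mean of normal-tissue intensities
        response_vec : (n_params,) pre-determined response direction
        threshold    : distance from the origin separating the classes
        """
        v = response_vec / np.linalg.norm(response_vec)
        proj = (intensities - origin) @ v        # signed projected distance per voxel
        labels = np.zeros(len(proj), dtype=int)  # 0 = nil
        labels[proj > threshold] = 1             # positive response
        labels[proj < -threshold] = -1           # negative response
        return labels, proj
    ```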

  2. SU-G-JeP2-02: A Unifying Multi-Atlas Approach to Electron Density Mapping Using Multi-Parametric MRI for Radiation Treatment Planning

    Energy Technology Data Exchange (ETDEWEB)

    Ren, S [Stanford University, Stanford, CA (United States); Tianjin University, Tianjin (China); Hara, W; Le, Q; Wang, L; Xing, L; Li, R [Stanford University, Stanford, CA (United States)

    2016-06-15

    Purpose: MRI has a number of advantages over CT as a primary modality for radiation treatment planning (RTP). However, one key bottleneck remains: the lack of electron density information in MRI. In this work, a reliable method to map electron density is developed by leveraging the differential contrast of multi-parametric MRI. Methods: We propose a probabilistic Bayesian approach for electron density mapping based on T1 and T2-weighted MRI, using multiple patients as atlases. For each voxel, we compute two conditional probabilities: (1) electron density given its image intensity on T1 and T2-weighted MR images, and (2) electron density given its geometric location in a reference anatomy. The two sources of information (image intensity and spatial location) are combined into a unifying posterior probability density function using the Bayesian formalism. The mean value of the posterior probability density function provides the estimated electron density. Results: We evaluated the method on 10 head and neck patients and performed leave-one-out cross validation (9 patients as atlases and the remaining 1 as test). The proposed method significantly reduced the errors in electron density estimation, with a mean absolute HU error of 138, compared with 193 for the T1-weighted intensity approach and 261 without density correction. For bone detection (HU>200), the proposed method had an accuracy of 84% and a sensitivity of 73% at a specificity of 90% (AUC = 87%). In comparison, the AUC for bone detection was 73% and 50% using the intensity approach and without density correction, respectively. Conclusion: The proposed unifying method provides accurate electron density estimation and bone detection based on multi-parametric MRI of the head with highly heterogeneous anatomy. This could allow for accurate dose calculation and reference image generation for patient setup in MRI-based radiation treatment planning.
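
    A toy sketch of the combination step, assuming the two conditional densities are multiplied and normalized over a grid of candidate values (the abstract says they are combined via the Bayesian formalism but does not give the exact form; names are ours):

    ```python
    import numpy as np

    def posterior_electron_density(hu_grid, p_given_intensity, p_given_location):
        """Combine the intensity- and location-conditioned densities over a
        grid of candidate HU values; the posterior mean is the estimate."""
        post = p_given_intensity * p_given_location   # assumed product form
        post /= np.trapz(post, hu_grid)               # normalize
        return np.trapz(hu_grid * post, hu_grid)      # posterior mean
    ```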

  3. SU-G-JeP2-02: A Unifying Multi-Atlas Approach to Electron Density Mapping Using Multi-Parametric MRI for Radiation Treatment Planning

    International Nuclear Information System (INIS)

    Ren, S; Hara, W; Le, Q; Wang, L; Xing, L; Li, R

    2016-01-01

    Purpose: MRI has a number of advantages over CT as a primary modality for radiation treatment planning (RTP). However, one key bottleneck remains: the lack of electron density information in MRI. In this work, a reliable method to map electron density is developed by leveraging the differential contrast of multi-parametric MRI. Methods: We propose a probabilistic Bayesian approach for electron density mapping based on T1 and T2-weighted MRI, using multiple patients as atlases. For each voxel, we compute two conditional probabilities: (1) electron density given its image intensity on T1 and T2-weighted MR images, and (2) electron density given its geometric location in a reference anatomy. The two sources of information (image intensity and spatial location) are combined into a unifying posterior probability density function using the Bayesian formalism. The mean value of the posterior probability density function provides the estimated electron density. Results: We evaluated the method on 10 head and neck patients and performed leave-one-out cross validation (9 patients as atlases and the remaining 1 as test). The proposed method significantly reduced the errors in electron density estimation, with a mean absolute HU error of 138, compared with 193 for the T1-weighted intensity approach and 261 without density correction. For bone detection (HU>200), the proposed method had an accuracy of 84% and a sensitivity of 73% at a specificity of 90% (AUC = 87%). In comparison, the AUC for bone detection was 73% and 50% using the intensity approach and without density correction, respectively. Conclusion: The proposed unifying method provides accurate electron density estimation and bone detection based on multi-parametric MRI of the head with highly heterogeneous anatomy. This could allow for accurate dose calculation and reference image generation for patient setup in MRI-based radiation treatment planning.

  4. WE-AB-202-12: Voxel-Wise Analysis of Apparent Diffusion Coefficient and Perfusion Maps in Multi-Parametric MRI of Prostate Cancer

    International Nuclear Information System (INIS)

    Engstroem, K; Casares-Magaz, O; Muren, L; Roervik, J; Andersen, E

    2016-01-01

    Purpose: Multi-parametric MRI (mp-MRI) is being introduced in radiotherapy (RT) of prostate cancer, including for tumour delineation in focal boosting strategies. We recently developed an image-based tumour control probability model, based on cell density distributions derived from apparent diffusion coefficient (ADC) maps. Beyond tumour volume and cell densities, tumour hypoxia is also an important determinant of RT response. Since tissue perfusion from mp-MRI has been related to hypoxia, we have explored the patterns of ADC and perfusion maps, and the relations between them, inside and outside prostate index lesions. Methods: ADC and perfusion maps from 20 prostate cancer patients were used, with the prostate and index lesion delineated by a dedicated uro-radiologist. To reduce noise, the maps were averaged over a 3×3×3 voxel cube. Associations between different ADC and perfusion histogram parameters within the prostate, inside and outside the index lesion, were evaluated with Pearson's correlation coefficient. In the voxel-wise analysis, scatter plots of ADC vs perfusion were analysed for voxels in the prostate, inside and outside of the index lesion, again with the associations quantified with Pearson's correlation coefficient. Results: Overall, ADC was lower inside the index lesion than in the normal prostate, whereas ktrans was higher inside the index lesion than outside. In the histogram analysis, the minimum ktrans was significantly correlated with the maximum ADC (Pearson=0.47; p=0.03). At the voxel level, 15 of the 20 cases had a statistically significant inverse correlation between ADC and perfusion inside the index lesion; ten of the cases had a Pearson < −0.4. Conclusion: The minimum value of ktrans across the tumour was correlated to the maximum ADC. However, at the voxel level, the ‘local’ ktrans in the index lesion is inversely (i.e. negatively) correlated to the ‘local’ ADC in most patients. Research agreement with
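
    The voxel-wise analysis step maps directly onto SciPy; a minimal sketch under the stated choices (3×3×3 averaging, Pearson correlation), with array and function names ours:

    ```python
    import numpy as np
    from scipy import stats
    from scipy.ndimage import uniform_filter

    def voxelwise_correlation(adc, ktrans, mask):
        """Pearson correlation between ADC and ktrans over the voxels of a
        region of interest (e.g. inside the index lesion)."""
        adc_s = uniform_filter(adc.astype(float), size=3)   # 3x3x3 mean filter
        kt_s = uniform_filter(ktrans.astype(float), size=3)
        return stats.pearsonr(adc_s[mask], kt_s[mask])      # (r, p-value)
    ```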

  5. Integrable multi parametric SU(N) chain

    International Nuclear Information System (INIS)

    Foerster, Angela; Roditi, Itzhak; Rodrigues, Ligia M.C.S.

    1996-03-01

    We analyse integrable models associated with a multi parametric SU(N) R-matrix. We show that the Hamiltonians describe SU(N) chains with twisted boundary conditions and that the underlying algebraic structure is the multi parametric deformation of SU(N) enlarged by the introduction of a central element. (author). 15 refs

  6. MODIS-based multi-parametric platform for mapping of flood affected areas. Case study: 2006 Danube extreme flood in Romania

    Directory of Open Access Journals (Sweden)

    Craciunescu Vasile

    2016-12-01

    Flooding remains the most widely distributed natural hazard in Europe, leading to significant economic and social impact. Earth observation data is presently capable of making fundamental contributions towards reducing the detrimental effects of extreme floods. Technological advances make it possible to develop online services that process high volumes of satellite data without the need for dedicated desktop software licenses. The main objective of the case study is to present and evaluate a methodology for mapping flooded areas based on indices derived from MODIS satellite images, using state-of-the-art geospatial web services. The methodology and the developed platform were tested with data for the historical flood event that affected the Danube floodplain in 2006 in Romania. The results proved that, despite the relatively coarse resolution, MODIS data are very useful for mapping the development of the flooded area in large plain floods. Moreover, it was shown that the ability to adapt and combine the existing global algorithms for flood detection to fit local conditions is extremely important for obtaining accurate results.
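
    The abstract does not name the specific MODIS-derived indices, but flood mapping of this kind typically thresholds a normalized-difference index computed from two reflectance bands; a hedged sketch in which the band pairing and threshold are assumptions to be calibrated to local conditions:

    ```python
    import numpy as np

    def normalized_difference(band_a, band_b):
        """Generic normalized-difference index from two reflectance bands
        (NDVI/NDWI-style; the study's exact index is not named here)."""
        a, b = band_a.astype(float), band_b.astype(float)
        return (a - b) / np.clip(a + b, 1e-6, None)

    def flood_mask(index, threshold):
        # Water/flood pixels fall below a locally calibrated threshold
        return index < threshold
    ```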

  7. Multi parametric system for the acquisition and processing of nuclear data on a personal computer

    International Nuclear Information System (INIS)

    Toledo Acosta, R. B.; Osorio Deliz, J. F.; Arista Romeu, E.; Perez Sanchez, R.; Lopez Torres, E.

    1997-01-01

    A four-parameter multi parametric system for the acquisition and processing of nuclear data is described. It is characterized by its flexibility and relatively low cost, while also guaranteeing a high acquisition capacity. The system can operate in multi-parametric mode, in pulse-height analysis mode, or in any combination of the two per parameter. The hardware and the software of the system are described, and a general explanation of the operation and characteristics of the system is offered. (author)

  8. A microcomputer based multi parametric system for nuclear data acquisition and processing

    International Nuclear Information System (INIS)

    Toledo Acosta B, Rene; Osorio Deliz F, Juan; Arista Romeu, Eduardo; Perez Sanchez, Reinaldo; Lopes Torres, E.

    1997-01-01

    A four-parameter multi parametric system for the acquisition and processing of nuclear data is described. It is characterized by its flexibility and relatively low cost, while also guaranteeing a high acquisition capacity. The system can operate in multi-parametric mode, in pulse-height analysis mode, or in any combination of the two per parameter. The hardware and the software of the system are described.

  9. Massively multi-parametric immunoassays using ICPMS

    International Nuclear Information System (INIS)

    Tanner, S.D.; Ornatsky, O.; Bandura, D.R.; Baranov, V.I.

    2009-01-01

    The use of stable isotopes as tags in immunoassays, and their determination by ICPMS, is poised to have a huge impact on multi-parametric bioanalysis. A new technology, which we term 'mass cytometry', enables high throughput, highly multiplexed individual cell analysis. Preliminary results for T-cell immunophenotyping in peripheral blood mononuclear cells (PBMC), agonist influence on concomitant phosphorylation pathways, and sub-classification of acute myeloid leukemia patients' samples will be presented. The significance of individual cell analysis is demonstrated by the identification of populations of rogue cells in PBMC samples through the use of multidimensional neural network cluster analysis. (author)

  10. Software patterns, knowledge maps, and domain analysis

    CERN Document Server

    Fayad, Mohamed E; Hegde, Srikanth GK; Basia, Anshu; Vakil, Ashka

    2014-01-01

    Preface; Acknowledgments; Authors; Introduction: An Overview of Knowledge Maps; Key Concepts: Software Stable Models, Knowledge Maps, Pattern Language, Goals, Capabilities (Enduring Business Themes + Business Objects); The Motivation; The Problem; The Objectives; Overview of Software Stability Concepts; Overview of Knowledge Maps; Pattern Languages versus Knowledge Maps: A Brief Comparison; The Solution; Knowledge Maps Methodology or Concurrent Software Development Model; Why Knowledge Maps?; Research Methodology Undertaken; Research Verification and Validation; The Stratification of This Book; Summary

  11. Mapping social networks in software process improvement

    DEFF Research Database (Denmark)

    Tjørnehøj, Gitte; Nielsen, Peter Axel

    2005-01-01

    Software process improvement in small, agile organizations is often problematic. Model-based approaches seem to overlook problems. We have been seeking an alternative approach to overcome this through action research. Here we report on a piece of action research from which we developed an approach to map social networks and suggest how it can be used in software process improvement. We applied the mapping approach in a small software company to support the realization of new ways of improving software processes. The mapping approach was found useful in improving social networks, and thus furthers software process improvement.

  12. Explicit/multi-parametric model predictive control (MPC) of linear discrete-time systems by dynamic and multi-parametric programming

    KAUST Repository

    Kouramas, K.I.; Faí sca, N.P.; Panos, C.; Pistikopoulos, E.N.

    2011-01-01

    This work presents a new algorithm for solving the explicit/multi-parametric model predictive control (or mp-MPC) problem for linear, time-invariant discrete-time systems, based on dynamic programming and multi-parametric programming techniques.

  13. Theoretical and algorithmic advances in multi-parametric programming and control

    KAUST Repository

    Pistikopoulos, Efstratios N.; Dominguez, Luis; Panos, Christos; Kouramas, Konstantinos; Chinchuluun, Altannar

    2012-01-01

    This paper presents an overview of recent theoretical and algorithmic advances, and applications in the areas of multi-parametric programming and explicit/multi-parametric model predictive control (mp-MPC). In multi-parametric programming, advances include areas such as nonlinear multi-parametric programming (mp-NLP), bi-level programming, dynamic programming and global optimization for multi-parametric mixed-integer linear programming problems (mp-MILPs). In multi-parametric/explicit MPC (mp-MPC), advances include areas such as robust multi-parametric control, multi-parametric nonlinear MPC (mp-NMPC) and model reduction in mp-MPC. A comprehensive framework for multi-parametric programming and control is also presented. Recent applications include a hydrogen storage device, a fuel cell power generation system, an unmanned autonomous vehicle (UAV) and a hybrid pressure swing adsorption (PSA) system. © 2012 Springer-Verlag.

  14. Theoretical and algorithmic advances in multi-parametric programming and control

    KAUST Repository

    Pistikopoulos, Efstratios N.

    2012-04-21

    This paper presents an overview of recent theoretical and algorithmic advances, and applications in the areas of multi-parametric programming and explicit/multi-parametric model predictive control (mp-MPC). In multi-parametric programming, advances include areas such as nonlinear multi-parametric programming (mp-NLP), bi-level programming, dynamic programming and global optimization for multi-parametric mixed-integer linear programming problems (mp-MILPs). In multi-parametric/explicit MPC (mp-MPC), advances include areas such as robust multi-parametric control, multi-parametric nonlinear MPC (mp-NMPC) and model reduction in mp-MPC. A comprehensive framework for multi-parametric programming and control is also presented. Recent applications include a hydrogen storage device, a fuel cell power generation system, an unmanned autonomous vehicle (UAV) and a hybrid pressure swing adsorption (PSA) system. © 2012 Springer-Verlag.

  15. Personalized precision radiotherapy by integration of multi-parametric functional and biological imaging in prostate cancer. A feasibility study

    Energy Technology Data Exchange (ETDEWEB)

    Thorwarth, Daniela [Tuebingen Univ. (Germany). Section for Biomedical Physics; Notohamiprodjo, Mike [Tuebingen Univ. (Germany). Dept. of Diagnostic and Interventional Radiology; Zips, Daniel; Mueller, Arndt-Christan [Tuebingen Univ. (Germany). Dept. of Radiation Oncology

    2017-05-01

    To increase the tumour control probability (TCP) in prostate cancer, a method was developed that integrates multi-parametric functional and biological information into a dose painting treatment plan aiming at focal dose escalation of tumour sub-volumes. A dose-escalation map was derived considering individual, multi-parametrically estimated tumour aggressiveness. Multi-parametric functional imaging (MRI, Choline-/PSMA-/FMISO-PET/CT) was acquired for a high-risk prostate cancer patient with a high tumour load (cT3b cN0 cM0), indicated by subtotal involvement of the prostate including the right seminal vesicle and by a PSA level >100. The probability of tumour presence was determined by a combination of multi-parametric functional image information, resulting in a voxel-based map of tumour aggressiveness. This probability map was directly integrated into dose optimization in order to plan for inhomogeneous, biological-imaging-based dose painting. Histograms of the multi-parametric prescription function were generated in addition to a differential histogram of the planned inhomogeneous doses. Comparison of prescribed doses with planned doses at the voxel level was realized using an effective DVH, containing the ratio of prescribed vs. planned dose for each tumour voxel. Multi-parametric imaging data of PSMA, Choline and FMISO PET/CT as well as ADC maps derived from diffusion-weighted MRI were combined into an individual probability map of tumour presence. Voxel-based prescription doses ranged from 75.3 Gy up to 93.4 Gy (median: 79.6 Gy), whereas the planned dose painting doses varied only between 72.5 and 80.0 Gy with a median dose of 75.7 Gy. However, inhomogeneous voxel-based dose prescriptions can only be implemented in a treatment plan up to a certain level. Multi-parametric probability-based dose painting in prostate cancer is technically and clinically feasible. However, detailed calibration functions to define the necessary probability functions need to be assessed in future
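
    A minimal sketch of the 'effective DVH' comparison described above, i.e. a histogram of the per-voxel ratio of prescribed to planned dose inside the tumour (array names are ours; a ratio of 1.0 means the prescription is met exactly):

    ```python
    import numpy as np

    def effective_dvh(prescribed, planned, bins=50):
        """Per-voxel ratio of prescribed vs. planned dose and its histogram."""
        ratio = np.asarray(prescribed, float) / np.asarray(planned, float)
        hist, edges = np.histogram(ratio, bins=bins)
        return ratio, hist, edges
    ```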

  16. PET image reconstruction using multi-parametric anato-functional priors

    Science.gov (United States)

    Mehranian, Abolfazl; Belzunce, Martin A.; Niccolini, Flavia; Politis, Marios; Prieto, Claudia; Turkheimer, Federico; Hammers, Alexander; Reader, Andrew J.

    2017-08-01

    In this study, we investigate the application of multi-parametric anato-functional (MR-PET) priors for the maximum a posteriori (MAP) reconstruction of brain PET data in order to address the limitations of conventional anatomical priors in the presence of PET-MR mismatches. In addition to partial volume correction benefits, the suitability of these priors for reconstruction of low-count PET data is also introduced and demonstrated, compared to standard maximum-likelihood (ML) reconstruction of high-count data. The conventional local Tikhonov and total variation (TV) priors and current state-of-the-art anatomical priors, including the Kaipio and non-local Tikhonov priors with Bowsher and Gaussian similarity kernels, are investigated and presented in a unified framework. The Gaussian kernels are calculated using both voxel- and patch-based feature vectors. To cope with PET and MR mismatches, the Bowsher and Gaussian priors are extended to multi-parametric priors. In addition, we propose a modified joint Burg entropy prior that by definition exploits all parametric information in the MAP reconstruction of PET data. The performance of the priors was extensively evaluated using 3D simulations and two clinical brain datasets of [18F]florbetaben and [18F]FDG radiotracers. For the simulations, several anato-functional mismatches were intentionally introduced between the PET and MR images, and furthermore, for the FDG clinical dataset, two PET-unique active tumours were embedded in the PET data. Our simulation results showed that the joint Burg entropy prior far outperformed the conventional anatomical priors in terms of preserving PET-unique lesions, while still reconstructing functional boundaries with corresponding MR boundaries. In addition, the multi-parametric extension of the Gaussian and Bowsher priors led to enhanced preservation of edge and PET-unique features and also an improved bias-variance performance. In agreement with the simulation results, the clinical results
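
    For orientation, the Gaussian similarity kernels mentioned above weight neighbouring voxels by the closeness of their (voxel- or patch-based) feature vectors; in the multi-parametric extension the feature vector stacks several contrasts. A schematic sketch, not the paper's exact formulation:

    ```python
    import numpy as np

    def gaussian_similarity_weights(features, centre, neighbours, sigma):
        """Normalized Gaussian weights between one voxel and its neighbours;
        `features` is (n_voxels, n_features), and the indices select rows."""
        diff = features[neighbours] - features[centre]
        w = np.exp(-np.sum(diff ** 2, axis=-1) / (2.0 * sigma ** 2))
        return w / w.sum()
    ```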

  17. Explicit/multi-parametric model predictive control (MPC) of linear discrete-time systems by dynamic and multi-parametric programming

    KAUST Repository

    Kouramas, K.I.

    2011-08-01

    This work presents a new algorithm for solving the explicit/multi-parametric model predictive control (or mp-MPC) problem for linear, time-invariant discrete-time systems, based on dynamic programming and multi-parametric programming techniques. The algorithm features two key steps: (i) a dynamic programming step, in which the mp-MPC problem is decomposed into a set of smaller subproblems in which only the current control, state variables, and constraints are considered, and (ii) a multi-parametric programming step, in which each subproblem is solved as a convex multi-parametric programming problem, to derive the control variables as an explicit function of the states. The key feature of the proposed method is that it overcomes potential limitations of previous methods for solving multi-parametric programming problems with dynamic programming, such as the need for global optimization for each subproblem of the dynamic programming step. © 2011 Elsevier Ltd. All rights reserved.
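
    The product of the multi-parametric programming step is a set of polyhedral critical regions, each carrying an affine control law, so the online controller reduces to a region lookup; a generic sketch (the data layout is ours):

    ```python
    import numpy as np

    def explicit_mpc_control(x, regions):
        """Evaluate an explicit MPC law: find the critical region
        {x : A x <= b} containing the state and return u = K x + c."""
        for r in regions:
            if np.all(r['A'] @ x <= r['b'] + 1e-9):
                return r['K'] @ x + r['c']
        raise ValueError("state outside the explored parameter space")
    ```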

  18. Documenting the location of systematic transrectal ultrasound-guided prostate biopsies: correlation with multi-parametric MRI.

    Science.gov (United States)

    Turkbey, Baris; Xu, Sheng; Kruecker, Jochen; Locklin, Julia; Pang, Yuxi; Shah, Vijay; Bernardo, Marcelino; Baccala, Angelo; Rastinehad, Ardeshir; Benjamin, Compton; Merino, Maria J; Wood, Bradford J; Choyke, Peter L; Pinto, Peter A

    2011-03-29

    During transrectal ultrasound (TRUS)-guided prostate biopsies, the actual location of the biopsy site is rarely documented. Here, we demonstrate the capability of TRUS-magnetic resonance imaging (MRI) image fusion to document the biopsy site and correlate biopsy results with multi-parametric MRI findings. Fifty consecutive patients (median age 61 years) with a median prostate-specific antigen (PSA) level of 5.8 ng/ml underwent 12-core TRUS-guided biopsy of the prostate. Pre-procedural T2-weighted magnetic resonance images were fused to TRUS. A disposable needle guide with miniature tracking sensors was attached to the TRUS probe to enable fusion with MRI. Real-time TRUS images during biopsy and the corresponding tracking information were recorded. Each biopsy site was superimposed onto the MRI. Each biopsy site was classified as positive or negative for cancer based on the results of each MRI sequence. Sensitivity, specificity, and receiver operating characteristic (ROC) area under the curve (AUC) values were calculated for multi-parametric MRI. Gleason scores for each multi-parametric MRI pattern were also evaluated. Six hundred and five systematic biopsy cores were analyzed in 50 patients, of whom 20 patients had 56 positive cores. MRI identified 34 of 56 positive cores. Overall sensitivity, specificity, and ROC area values for multi-parametric MRI were 0.607, 0.727, and 0.667, respectively. TRUS-MRI fusion after biopsy can be used to document the location of each biopsy site, which can then be correlated with MRI findings. Based on correlation with tracked biopsies, T2-weighted MRI and apparent diffusion coefficient maps derived from diffusion-weighted MRI are the most sensitive sequences, whereas the addition of delayed contrast enhancement MRI and three-dimensional magnetic resonance spectroscopy demonstrated higher specificity, consistent with results obtained using radical prostatectomy specimens.
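
    The reported overall sensitivity can be checked directly from the counts given above (34 of the 56 cancer-positive cores were identified by MRI):

    ```python
    # Sensitivity from the reported counts, matching the reported 0.607
    tp, positives = 34, 56
    print(f"sensitivity = {tp / positives:.3f}")  # 0.607
    ```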

  19. Multi-parametric variational data assimilation for hydrological forecasting

    Science.gov (United States)

    Alvarado-Montero, R.; Schwanenberg, D.; Krahe, P.; Helmke, P.; Klein, B.

    2017-12-01

    Ensemble forecasting is increasingly applied in flow forecasting systems to provide users with a better understanding of forecast uncertainty and consequently to take better-informed decisions. A common practice in probabilistic streamflow forecasting is to force a deterministic hydrological model with an ensemble of numerical weather predictions. This approach aims at representing meteorological uncertainty but neglects the uncertainty of the hydrological model as well as that of its initial conditions. Complementary approaches use probabilistic data assimilation techniques to obtain a variety of initial states, or represent model uncertainty by model pools instead of single deterministic models. This paper introduces a novel approach that extends a variational data assimilation based on Moving Horizon Estimation to enable the assimilation of observations into multi-parametric model pools. It results in a probabilistic estimate of initial model states that takes into account the parametric model uncertainty in the data assimilation. The assimilation technique is applied to the uppermost area of the River Main in Germany. We use different parametric pools, each of them with five parameter sets, to assimilate streamflow data, as well as remotely sensed data from the H-SAF project. We assess the impact of the assimilation on the lead-time performance of perfect forecasts (i.e. observed data as forcing variables) as well as of deterministic and probabilistic forecasts from ECMWF. The multi-parametric assimilation shows an improvement of up to 23% in CRPS performance and approximately 20% in Brier Skill Scores with respect to the deterministic approach. It also improves the skill of the forecast in terms of the rank histogram and produces a narrower ensemble spread.
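
    The CRPS improvement quoted above can be measured with the standard sample-based estimator for an ensemble forecast against a single observation; a minimal sketch (the function name is ours):

    ```python
    import numpy as np

    def crps_ensemble(members, obs):
        """Sample-based CRPS: E|X - y| - 0.5 * E|X - X'| (lower is better)."""
        members = np.asarray(members, dtype=float)
        spread = np.abs(members[:, None] - members[None, :]).mean()
        return np.abs(members - obs).mean() - 0.5 * spread
    ```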

  20. Computer-aided diagnosis of prostate cancer using multi-parametric MRI: comparison between PUN and Tofts models

    Science.gov (United States)

    Mazzetti, S.; Giannini, V.; Russo, F.; Regge, D.

    2018-05-01

    Computer-aided diagnosis (CAD) systems are increasingly being used in clinical settings to report multi-parametric magnetic resonance imaging (mp-MRI) of the prostate. Usually, CAD systems automatically highlight cancer-suspicious regions to the radiologist, reducing reader variability and interpretation errors. Nevertheless, implementing this software requires the selection of which mp-MRI parameters can best discriminate between malignant and non-malignant regions. To exploit functional information, some parameters are derived from dynamic contrast-enhanced (DCE) acquisitions. In particular, much CAD software employs pharmacokinetic features, such as Ktrans and kep, derived from the Tofts model, to estimate a likelihood map of malignancy. However, non-pharmacokinetic models can also be used to describe DCE-MRI curves, without any requirement for prior knowledge or measurement of the arterial input function, which could potentially lead to large errors in parameter estimation. In this work, we implemented an empirical function derived from the phenomenological universalities (PUN) class to fit DCE-MRI. The parameters of the PUN model are used in combination with T2-weighted and diffusion-weighted acquisitions to feed a support vector machine classifier to produce a voxel-wise malignancy likelihood map of the prostate. The results were all compared to those for a CAD system based on Tofts pharmacokinetic features to describe DCE-MRI curves, using different quality aspects of image segmentation, while also evaluating the number and size of false positive (FP) candidate regions. This study included 61 patients with 70 biopsy-proven prostate cancers (PCa). The metrics used to evaluate segmentation quality between the two CAD systems were not statistically different, although the PUN-based CAD reported a lower number of FPs, with reduced size compared to the Tofts-based CAD. In conclusion, the CAD software based on PUN parameters is a feasible means with which to
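
    The abstract does not give the functional form of the PUN class, so the sketch below fits a generic empirical enhancement curve as an illustrative stand-in; the key point carried over is that the curve is fitted directly to the voxel signal, with no arterial input function required:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def empirical_uptake(t, a, b, c):
        # Illustrative stand-in, NOT the actual PUN function from the paper
        return a * (1.0 - np.exp(-b * t)) * np.exp(-c * t)

    def fit_voxel_curve(t, signal):
        """Fit the empirical curve directly to a voxel's DCE time course;
        the fitted parameters then feed the SVM classifier."""
        p0 = [float(np.max(signal)), 1.0, 0.01]
        popt, _ = curve_fit(empirical_uptake, t, signal, p0=p0, maxfev=5000)
        return popt
    ```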

  1. Multi parametrical indicator test for urban wastewater influence

    Science.gov (United States)

    Humer, Franko; Weiss, Stefan; Reinnicke, Sandra; Clara, Manfred; Grath, Johannes; Windhofer, Georg

    2013-04-01

    Austria's drinking water is abstracted from groundwater. While 50 % of the Austrian population are supplied with spring water, the other 50 % get their drinking water from groundwater supplies, in part from enormous Quaternary valley and basin deposits that are subject to intensive use by population, industry, agriculture and traffic/transport. Due to protected areas around drinking water wells and springs, no treatment is necessary in most cases. Water bodies, however, can be affected through different pathways by natural, industrial and urban sources. Identification of anthropogenic sources is paramount for taking appropriate measures to safeguard the quality of drinking water supply. Common parameters like boron are widely used as tracers indicating anthropogenic impacts (e.g. wastewater contamination of groundwater systems). Unfortunately, the application of these conventional indicators is often limited due to high dilution. Another application where common parameters have their limits is the identification and quantification of diffuse nitrogen input to water by the stable isotopes of nitrogen and oxygen in nitrate. Without additional tracers, distinguishing nitrate from manure from nitrate from waste water is still difficult. Even the application of boron isotopes can in some cases not avoid ambiguous interpretation. Therefore the Umweltbundesamt (Environment Agency Austria) developed a multi parametrical indicator test which should allow the identification and quantification of anthropogenic pollution. The test aims at analysing eight target substances which are well known to occur in wastewater: acesulfame and sucralose (two artificial, calorie-free sweeteners), benzotriazole and tolyltriazole (two industrial chemicals/corrosion inhibitors), and metoprolol, sotalol, carbamazepine and the metabolite 10,11-dihydro-10,11-dihydroxycarbamazepine (pharmaceuticals). These substances are polar and degradation in the aquatic system by microbiological processes is not

  2. Multi-parametric MRI findings of granulomatous prostatitis developing after intravesical bacillus calmette-guérin therapy.

    Science.gov (United States)

    Gottlieb, Josh; Princenthal, Robert; Cohen, Martin I

    2017-07-01

    To evaluate the multi-parametric MRI (mpMRI) findings in patients with biopsy-proven granulomatous prostatitis and prior Bacillus Calmette-Guérin (BCG) exposure. MRI was performed in six patients with pathologically proven granulomatous prostatitis and a prior history of bladder cancer treated with intravesical BCG therapy. Multi-parametric prostate MRI images were recorded on a GE 750W or Philips Achieva 3.0 Tesla MRI scanner with high-resolution, small-field-of-view imaging consisting of axial T2, axial T1, coronal T2, sagittal T2, axial multiple b-value diffusion (multiple values up to 1200 or 1400), and a dynamic contrast-enhanced 3D axial T1 with fat suppression sequence. Two different patterns of MR findings were observed. Five of the six patients had lesions with a low mean ADC value. The other pattern, seen in one of the six patients, was decreased signal on the ADC map images with increased signal on the high-b-value sequence, revealing true restricted diffusion indistinguishable from aggressive prostate cancer. This patient had biopsy-confirmed acute BCG prostatitis. Our study suggests that patients with known BCG exposure and PI-RADS v2 scores ≤3, showing similar mpMRI findings as demonstrated, may not require prostate biopsy.

  3. Logistic regression model for diagnosis of transition zone prostate cancer on multi-parametric MRI

    Energy Technology Data Exchange (ETDEWEB)

    Dikaios, Nikolaos; Halligan, Steve; Taylor, Stuart; Atkinson, David; Punwani, Shonit [University College London, Centre for Medical Imaging, London (United Kingdom); University College London Hospital, Departments of Radiology, London (United Kingdom); Alkalbani, Jokha; Sidhu, Harbir Singh; Fujiwara, Taiki [University College London, Centre for Medical Imaging, London (United Kingdom); Abd-Alazeez, Mohamed; Ahmed, Hashim; Emberton, Mark [University College London, Research Department of Urology, London (United Kingdom); Kirkham, Alex; Allen, Clare [University College London Hospital, Departments of Radiology, London (United Kingdom); Freeman, Alex [University College London Hospital, Department of Histopathology, London (United Kingdom)

    2014-09-17

    We aimed to develop logistic regression (LR) models for classifying prostate cancer within the transition zone on multi-parametric magnetic resonance imaging (mp-MRI). One hundred and fifty-five patients (training cohort, 70 patients; temporal validation cohort, 85 patients) underwent mp-MRI and transperineal-template-prostate-mapping (TPM) biopsy. Positive cores were classified by cancer definitions: (1) any-cancer; (2) definition-1 [≥Gleason 4 + 3 or ≥ 6 mm cancer core length (CCL)] [high risk significant]; and (3) definition-2 (≥Gleason 3 + 4 or ≥ 4 mm CCL) cancer [intermediate-high risk significant]. For each, logistic-regression mp-MRI models were derived from the training cohort and validated internally and with the temporal cohort. Sensitivity/specificity and the area under the receiver operating characteristic (ROC-AUC) curve were calculated. LR model performance was compared to radiologists' performance. Twenty-eight of 70 patients from the training cohort, and 25/85 patients from the temporal validation cohort had significant cancer on TPM. The ROC-AUC of the LR model for classification of cancer was 0.73/0.67 at internal/temporal validation. The radiologist A/B ROC-AUC was 0.65/0.74 (temporal cohort). For patients scored by radiologists as Prostate Imaging Reporting and Data System (Pi-RADS) score 3, sensitivity/specificity of radiologist A 'best guess' and LR model was 0.14/0.54 and 0.71/0.61, respectively; and radiologist B 'best guess' and LR model was 0.40/0.34 and 0.50/0.76, respectively. LR models can improve classification of Pi-RADS score 3 lesions similar to experienced radiologists. (orig.)
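
    A schematic of the modelling step with scikit-learn (the feature content is hypothetical, and resubstitution AUC stands in for the study's internal validation procedure, which is not reproduced here):

    ```python
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    def train_and_validate(X_train, y_train, X_temporal, y_temporal):
        """Fit an LR model on the training cohort and report ROC-AUC on both
        cohorts; rows are lesions/regions, columns are mp-MRI derived
        features (e.g. T2 signal, ADC, DCE parameters)."""
        model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
        auc_int = roc_auc_score(y_train, model.predict_proba(X_train)[:, 1])
        auc_temp = roc_auc_score(y_temporal, model.predict_proba(X_temporal)[:, 1])
        return model, auc_int, auc_temp
    ```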

  4. Logistic regression model for diagnosis of transition zone prostate cancer on multi-parametric MRI

    International Nuclear Information System (INIS)

    Dikaios, Nikolaos; Halligan, Steve; Taylor, Stuart; Atkinson, David; Punwani, Shonit; Alkalbani, Jokha; Sidhu, Harbir Singh; Fujiwara, Taiki; Abd-Alazeez, Mohamed; Ahmed, Hashim; Emberton, Mark; Kirkham, Alex; Allen, Clare; Freeman, Alex

    2015-01-01

    We aimed to develop logistic regression (LR) models for classifying prostate cancer within the transition zone on multi-parametric magnetic resonance imaging (mp-MRI). One hundred and fifty-five patients (training cohort, 70 patients; temporal validation cohort, 85 patients) underwent mp-MRI and transperineal-template-prostate-mapping (TPM) biopsy. Positive cores were classified by cancer definitions: (1) any-cancer; (2) definition-1 [≥Gleason 4 + 3 or ≥ 6 mm cancer core length (CCL)] [high risk significant]; and (3) definition-2 (≥Gleason 3 + 4 or ≥ 4 mm CCL) cancer [intermediate-high risk significant]. For each, logistic-regression mp-MRI models were derived from the training cohort and validated internally and with the temporal cohort. Sensitivity/specificity and the area under the receiver operating characteristic (ROC-AUC) curve were calculated. LR model performance was compared to radiologists' performance. Twenty-eight of 70 patients from the training cohort, and 25/85 patients from the temporal validation cohort had significant cancer on TPM. The ROC-AUC of the LR model for classification of cancer was 0.73/0.67 at internal/temporal validation. The radiologist A/B ROC-AUC was 0.65/0.74 (temporal cohort). For patients scored by radiologists as Prostate Imaging Reporting and Data System (Pi-RADS) score 3, sensitivity/specificity of radiologist A 'best guess' and LR model was 0.14/0.54 and 0.71/0.61, respectively; and radiologist B 'best guess' and LR model was 0.40/0.34 and 0.50/0.76, respectively. LR models can improve classification of Pi-RADS score 3 lesions similar to experienced radiologists. (orig.)

  5. Association between pathology and texture features of multi parametric MRI of the prostate

    Science.gov (United States)

    Kuess, Peter; Andrzejewski, Piotr; Nilsson, David; Georg, Petra; Knoth, Johannes; Susani, Martin; Trygg, Johan; Helbich, Thomas H.; Polanec, Stephan H.; Georg, Dietmar; Nyholm, Tufve

    2017-10-01

    The role of multi-parametric (mp)MRI in the diagnosis and treatment of prostate cancer has increased considerably. An alternative to visual inspection of mpMRI is evaluation using histogram-based (first order statistics) parameters and textural features (second order statistics). The aims of the present work were to investigate the relationship between benign and malignant sub-volumes of the prostate and textures obtained from mpMR images. The performance of tumor prediction was investigated based on the combination of histogram-based and textural parameters. Subsequently, the relative importance of mpMR images was assessed and the benefit of additional imaging analyzed. Finally, sub-structures based on the PI-RADS classification were investigated as potential regions to automatically detect malignant lesions. Twenty-five patients who received mpMRI prior to radical prostatectomy were included in the study. The imaging protocol included T2, DWI, and DCE. Delineation of tumor regions was performed based on pathological information. First and second order statistics were derived from each structure and for all image modalities. The resulting data were processed with multivariate analysis, using PCA (principal component analysis) and OPLS-DA (orthogonal partial least squares discriminant analysis) for separation of malignant and healthy tissue. PCA showed a clear difference between tumor and healthy regions in the peripheral zone for all investigated images. The predictive ability of the OPLS-DA models increased for all image modalities when first and second order statistics were combined. The predictive value reached a plateau after adding ADC and T2, and did not increase further with the addition of other image information. The present study indicates a distinct difference in the signatures between malignant and benign prostate tissue. This is an absolute prerequisite for automatic tumor segmentation, but only the first step in that direction. For the specific
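
    A sketch of the first-order (histogram-based) feature extraction and the PCA inspection described above; the exact feature set and preprocessing of the study are not reproduced, and second-order (texture) features are omitted for brevity:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    def first_order_features(values):
        """Histogram-based statistics of the voxel values in one
        delineated structure for one image modality."""
        m, s = values.mean(), values.std()
        return [m, s,
                ((values - m) ** 3).mean() / s ** 3,  # skewness
                np.percentile(values, 10),
                np.percentile(values, 90)]

    # Stack features from all structures and modalities into X (one row per
    # structure), then inspect tumour-vs-healthy separation in PC space:
    def pca_scores(X, n_components=2):
        return PCA(n_components=n_components).fit_transform(X)
    ```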

  6. Development of a specific geological mapping software under MAPGIS

    International Nuclear Information System (INIS)

    Zhang Wenkai

    2010-01-01

    The mapping software most often used in geological exploration is the MAPGIS system, and the related standard has been established based on it. The system is flexible but has the following shortcomings: many parameters to select, difficulty of mastering, different parameter choices between users, and low efficiency. As a result, a specific software for geological mapping was developed using VC++ on the MAPGIS platform. Following the standards, toolbars were built for strata, rock, geographic information, materials, etc. Pressing a button selects the corresponding parameters; the toolbar menus can be modified to select parameters for each working area, and legends can be sorted automatically. Thus, the mapping speed is improved greatly and the parameters remain consistent. The software can convert between Gauss coordinates and longitude-latitude coordinates; draw points and frames by longitude and latitude; and produce responsibility tables, plan diagrams and profiles, etc. The software also improves the methods for clipping, topologizing and node catching. Application of the software indicates that it can greatly improve the speed of geological mapping and the degree of standardization of the final maps. (authors)

  7. Multi-parametric MR imaging for prostate carcinoma; Multiparametrische MR-Bildgebung beim Prostatakarzinom

    Energy Technology Data Exchange (ETDEWEB)

    Schlemmer, Heinz-Peter [Deutsches Krebsforschungszentrum, Heidelberg (Germany). Abt. Radiologie

    2017-03-15

    Multi-parametric MR imaging of prostate carcinoma can improve diagnostics, allows reliable prognostic estimates, and helps to find the optimal individual therapy. The contribution focuses on delivering the methodological tools and background knowledge needed for daily routine.

  8. Some software issues in mapping of power distribution feeders

    International Nuclear Information System (INIS)

    Mufti, I.A.

    1994-01-01

    This paper is about the in-house developed software for a distribution feeder mapping project. It first gives a bird's-eye view of the project, highlighting the technical complexity in management and logistics introduced by the sheer size of the project. It then gives an overview of the software developed and moves on to describe circuit tracing, the circuit model, leaf isolation (for the tree-structured network) and backtracking in more detail, among the many different parts of the software, a full description of which is not possible because of space limitations. (author)
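
    As a rough illustration of circuit tracing with leaf isolation and backtracking on a tree-structured feeder (the paper does not describe its data model; the adjacency-dict representation here is hypothetical):

    ```python
    def trace_circuit(tree, root):
        """Depth-first trace of a tree-structured feeder network.
        `tree` maps each node to a list of child nodes; returns the
        traced segments and the isolated leaves."""
        segments, leaves = [], []
        stack = [(root, iter(tree.get(root, [])))]
        while stack:
            node, children = stack[-1]
            child = next(children, None)
            if child is None:                 # children exhausted: backtrack
                if not tree.get(node):
                    leaves.append(node)       # leaf isolation
                stack.pop()
            else:
                segments.append((node, child))
                stack.append((child, iter(tree.get(child, []))))
        return segments, leaves
    ```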

  9. ActionMap: A web-based software that automates loci assignments to framework maps.

    Science.gov (United States)

    Albini, Guillaume; Falque, Matthieu; Joets, Johann

    2003-07-01

    Genetic linkage computation may be a repetitive and time consuming task, especially when numerous loci are assigned to a framework map. We thus developed ActionMap, a web-based software that automates genetic mapping on a fixed framework map without adding the new markers to the map. Using this tool, hundreds of loci may be automatically assigned to the framework in a single process. ActionMap was initially developed to map numerous ESTs with a small plant mapping population and is limited to inbred lines and backcrosses. ActionMap is highly configurable and consists of Perl and PHP scripts that automate command steps for the MapMaker program. A set of web forms were designed for data import and mapping settings. Results of automatic mapping can be displayed as tables or drawings of maps and may be exported. The user may create personal access-restricted projects to store raw data, settings and mapping results. All data may be edited, updated or deleted. ActionMap may be used either online or downloaded for free (http://moulon.inra.fr/~bioinfo/).

  10. Logistic regression model for diagnosis of transition zone prostate cancer on multi-parametric MRI.

    Science.gov (United States)

    Dikaios, Nikolaos; Alkalbani, Jokha; Sidhu, Harbir Singh; Fujiwara, Taiki; Abd-Alazeez, Mohamed; Kirkham, Alex; Allen, Clare; Ahmed, Hashim; Emberton, Mark; Freeman, Alex; Halligan, Steve; Taylor, Stuart; Atkinson, David; Punwani, Shonit

    2015-02-01

    We aimed to develop logistic regression (LR) models for classifying prostate cancer within the transition zone on multi-parametric magnetic resonance imaging (mp-MRI). One hundred and fifty-five patients (training cohort, 70 patients; temporal validation cohort, 85 patients) underwent mp-MRI and transperineal-template-prostate-mapping (TPM) biopsy. Positive cores were classified by cancer definitions: (1) any-cancer; (2) definition-1 [≥Gleason 4 + 3 or ≥ 6 mm cancer core length (CCL)] [high risk significant]; and (3) definition-2 (≥Gleason 3 + 4 or ≥ 4 mm CCL) cancer [intermediate-high risk significant]. For each, logistic-regression mp-MRI models were derived from the training cohort and validated internally and with the temporal cohort. Sensitivity/specificity and the area under the receiver operating characteristic (ROC-AUC) curve were calculated. LR model performance was compared to radiologists' performance. Twenty-eight of 70 patients from the training cohort, and 25/85 patients from the temporal validation cohort had significant cancer on TPM. The ROC-AUC of the LR model for classification of cancer was 0.73/0.67 at internal/temporal validation. The radiologist A/B ROC-AUC was 0.65/0.74 (temporal cohort). For patients scored by radiologists as Prostate Imaging Reporting and Data System (Pi-RADS) score 3, sensitivity/specificity of radiologist A 'best guess' and LR model was 0.14/0.54 and 0.71/0.61, respectively; and radiologist B 'best guess' and LR model was 0.40/0.34 and 0.50/0.76, respectively. LR models can improve classification of Pi-RADS score 3 lesions similar to experienced radiologists. • MRI helps find prostate cancer in the anterior of the gland • Logistic regression models based on mp-MRI can classify prostate cancer • Computers can help confirm cancer in areas doctors are uncertain about.

  11. Managing mapping data using commercial data base management software.

    Science.gov (United States)

    Elassal, A.A.

    1985-01-01

    Electronic computers are involved in almost every aspect of the map making process. This involvement has become so thorough that it is practically impossible to find a recently developed process or device in the mapping field which does not employ digital processing in some form or another. This trend, which has been evolving over two decades, is accelerated by the significant improvements in capability, reliability, and cost-effectiveness of electronic devices. Computerized mapping processes and devices share a common need for machine-readable data. Integrating groups of these components into automated mapping systems requires careful planning for data flow amongst them. Exploring the utility of commercial data base management software to assist in this task is the subject of this paper. -Author

  12. Adopting of Agile methods in Software Development Organizations: Systematic Mapping

    Directory of Open Access Journals (Sweden)

    Samia Abdalhamid

    2017-11-01

    Adoption of agile methods in software development organizations is considered a powerful solution for dealing with a quickly changing and continually developing business environment and with fully informed customers whose expectations constantly rise, such as shorter delivery times and an extraordinary level of response and service. This study investigates the adoption of agile approaches in software development organizations by using systematic mapping. Six research questions are identified, and to answer these questions a number of research papers were reviewed in electronic databases. Finally, 25 research papers are examined and answers to all research questions are provided.

  13. Fuzzy Cognitive Map for Software Testing Using Artificial Intelligence Techniques

    OpenAIRE

    Larkman, Deane; Mohammadian, Masoud; Balachandran, Bala; Jentzsch, Ric

    2010-01-01

    This paper discusses a framework to assist test managers to evaluate the use of AI techniques as a potential tool in software testing. Fuzzy Cognitive Maps (FCMs) are employed to evaluate the framework and make decision analysis easier. A what-if analysis is presented that explores the general application of the framework. Simulations are performed to show the effectiveness of the proposed method. The framework proposed is innovative and it assists managers in making e...
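
    For context, a fuzzy cognitive map is usually run by iterating a squashed weighted sum of concept activations until the map settles; a generic sketch (the sigmoid squashing function is one common choice, not necessarily the one used in the paper):

    ```python
    import numpy as np

    def fcm_run(state, W, lam=1.0, n_iter=100, tol=1e-6):
        """Iterate x <- sigmoid(W @ x) until a fixed point (or n_iter).
        W[i, j] is the causal weight of concept j on concept i."""
        for _ in range(n_iter):
            new = 1.0 / (1.0 + np.exp(-lam * (W @ state)))
            if np.max(np.abs(new - state)) < tol:
                return new
            state = new
        return state
    ```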

  14. A software tool for rapid flood inundation mapping

    Science.gov (United States)

    Verdin, James; Verdin, Kristine; Mathis, Melissa L.; Magadzire, Tamuka; Kabuchanga, Eric; Woodbury, Mark; Gadain, Hussein

    2016-06-02

    The GIS Flood Tool (GFT) was developed by the U.S. Geological Survey with support from the U.S. Agency for International Development’s Office of U.S. Foreign Disaster Assistance to provide a means for production of reconnaissance-level flood inundation mapping for data-sparse and resource-limited areas of the world. The GFT has also attracted interest as a tool for rapid assessment flood inundation mapping for the Flood Inundation Mapping Program of the U.S. Geological Survey. The GFT can fill an important gap for communities that lack flood inundation mapping by providing a first-estimate of inundation zones, pending availability of resources to complete an engineering study. The tool can also help identify priority areas for application of scarce flood inundation mapping resources. The technical basis of the GFT is an application of the Manning equation for steady flow in an open channel, operating on specially processed digital elevation data. The GFT is implemented as a software extension in ArcGIS. Output maps from the GFT were validated at 11 sites with inundation maps produced previously by the Flood Inundation Mapping Program using standard one-dimensional hydraulic modeling techniques. In 80 percent of the cases, the GFT inundation patterns matched 75 percent or more of the one-dimensional hydraulic model inundation patterns. Lower rates of pattern agreement were seen at sites with low relief and subtle surface water divides. Although the GFT is simple to use, it should be applied with the oversight or review of a qualified hydraulic engineer who understands the simplifying assumptions of the approach.
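
    The Manning equation that underlies the GFT gives steady open-channel discharge as Q = (1/n)·A·R^(2/3)·S^(1/2) in SI units; a worked example (the parameter values are illustrative only):

    ```python
    def manning_discharge(n, area, hydraulic_radius, slope):
        """Discharge Q [m^3/s] for roughness n, flow area A [m^2],
        hydraulic radius R [m] and channel slope S [m/m]."""
        return (1.0 / n) * area * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5

    # e.g. a natural channel: n=0.035, A=120 m^2, R=2.5 m, S=0.001
    print(manning_discharge(0.035, 120.0, 2.5, 0.001))  # ~200 m^3/s
    ```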

  15. Applying the metro map to software development management

    Science.gov (United States)

    Aguirregoitia, Amaia; Dolado, J. Javier; Presedo, Concepción

    2010-01-01

    This paper presents MetroMap, a new graphical representation model for controlling and managing the software development process. MetroMap uses metaphors and visual representation techniques to explore several key indicators in order to support problem detection and resolution. The resulting visualization addresses diverse management tasks, such as tracking of deviations from the plan, analysis of patterns of failure detection and correction, overall assessment of change management policies, and estimation of product quality. The proposed visualization uses a metro map metaphor along with various interactive techniques to represent information concerning the software development process and to deal efficiently with multivariate visual queries. Finally, the paper shows the implementation of the tool in JavaFX with data from a real project and the results of testing the tool with the aforementioned data and users attempting several information retrieval tasks. The conclusion presents the results of analyzing user response time and efficiency using the MetroMap visualization system. The utility of the tool was positively evaluated.

  16. Incorporating Oxygen-Enhanced MRI into Multi-Parametric Assessment of Human Prostate Cancer

    Directory of Open Access Journals (Sweden)

    Heling Zhou

    2017-08-01

    Hypoxia is associated with prostate tumor aggressiveness, local recurrence, and biochemical failure. Magnetic resonance imaging (MRI) offers insight into tumor pathophysiology, and recent reports have related transverse relaxation rate (R2*) and longitudinal relaxation rate (R1) measurements to tumor hypoxia. We have investigated the inclusion of oxygen-enhanced MRI for multi-parametric evaluation of tumor malignancy. Multi-parametric MRI sequences at 3 Tesla were evaluated in 10 patients to investigate hypoxia in prostate cancer prior to radical prostatectomy. Blood oxygen level dependent (BOLD), tissue oxygen level dependent (TOLD), dynamic contrast enhanced (DCE), and diffusion weighted imaging MRI were intercorrelated and compared with the Gleason score. The apparent diffusion coefficient (ADC) was significantly lower in tumor than normal prostate. Baseline R2* (BOLD contrast) was significantly higher in tumor than normal prostate. Upon the oxygen breathing challenge, R2* decreased significantly in the tumor tissue, suggesting improved vascular oxygenation; however, changes in R1 were minimal. R2* of contralateral normal prostate decreased in most cases upon oxygen challenge, although the differences were not significant. Moderate correlation was found between ADC and Gleason score. ADC and R2* were correlated, and trends were found between Gleason score and R2*, as well as maximum-intensity-projection and area-under-the-curve values calculated from DCE. Tumor ADC and R2* have been associated with tumor hypoxia, and thus the correlations are of particular interest. A multi-parametric approach including oxygen-enhanced MRI is feasible and promises further insights into the pathophysiological information of the tumor microenvironment.

  17. Software project management tools in global software development: a systematic mapping study.

    Science.gov (United States)

    Chadli, Saad Yasser; Idri, Ali; Ros, Joaquín Nicolás; Fernández-Alemán, José Luis; de Gea, Juan M Carrillo; Toval, Ambrosio

    2016-01-01

    Global software development (GSD) which is a growing trend in the software industry is characterized by a highly distributed environment. Performing software project management (SPM) in such conditions implies the need to overcome new limitations resulting from cultural, temporal and geographic separation. The aim of this research is to discover and classify the various tools mentioned in literature that provide GSD project managers with support and to identify in what way they support group interaction. A systematic mapping study has been performed by means of automatic searches in five sources. We have then synthesized the data extracted and presented the results of this study. A total of 102 tools were identified as being used in SPM activities in GSD. We have classified these tools, according to the software life cycle process on which they focus and how they support the 3C collaboration model (communication, coordination and cooperation). The majority of the tools found are standalone tools (77%). A small number of platforms (8%) also offer a set of interacting tools that cover the software development lifecycle. Results also indicate that SPM areas in GSD are not adequately supported by corresponding tools and deserve more attention from tool builders.

  18. Multi parametric card to personal computers interface based in ispLSI1016 circuits

    International Nuclear Information System (INIS)

    Osorio Deliz, J.F.; Toledo Acosta, R.B.; Arista Romeu, E.

    1997-01-01

    The design and principal characteristics of the interface circuit for a 16-bit multi parametric add-on card for IBM or compatible microcomputers are described. The card contains two communication channels for direct-memory-access and bidirectional transfers between the card and the computer, an interrupt controller, a programmable address register, a default address register for the card, and a four-channel multiplexer, as well as the decoder logic for the 80C186 and the computer. The circuit was designed with two programmable logic devices ispLSI1016, which made it possible to drastically reduce the number of components used and to obtain a more flexible design with better characteristics in less time.

  19. Dynamic modeling and explicit/multi-parametric MPC control of pressure swing adsorption systems

    KAUST Repository

    Khajuria, Harish

    2011-01-01

    Pressure swing adsorption (PSA) is a flexible, albeit complex, gas separation system. Due to its inherent nonlinear nature and discontinuous operation, the design of a model-based PSA controller, especially with varying operating conditions, is a challenging task. This work focuses on the design of an explicit/multi-parametric model predictive controller for a PSA system. Based on a system involving four adsorbent beds separating a 70% H2, 30% CH4 mixture into high-purity hydrogen, the key controller objective is to rapidly track H2 purity to a set-point value of 99.99%. To perform this task, a rigorous and systematic framework is employed. First, a high-fidelity detailed dynamic model is built to represent the system's real operation and to understand its dynamic behavior. The model is then used to derive appropriate linear models by applying suitable system identification techniques. For the reduced models, a model predictive control (MPC) problem is formulated, where the latest developments in multi-parametric programming and control are applied to derive a novel explicit MPC controller. To test the performance of the designed controller, closed-loop simulations are performed in which the dynamic model is used as the virtual plant. Comparison studies of the derived explicit MPC controller are also performed against conventional PID controllers. © 2010 Elsevier Ltd. All rights reserved.
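
    As a rough illustration of why explicit MPC suits plants like PSA: the multi-parametric programme is solved offline into polyhedral state-space regions, each carrying an affine control law, so the online controller reduces to a table lookup. The regions and gains below are hypothetical placeholders, not the controller derived in the paper.

    ```python
    # Minimal sketch of the online phase of explicit MPC: offline, the
    # multi-parametric programme yields regions {x : H @ x <= h}, each with
    # an affine law u = K @ x + g; online control is polyhedral point location.
    import numpy as np

    regions = [
        # (H, h, K, g) -- hypothetical placeholder values
        (np.array([[1.0, 0.0], [-1.0, 0.0]]), np.array([0.5, 0.5]),
         np.array([[-1.2, -0.4]]), np.array([0.0])),
        (np.array([[-1.0, 0.0]]), np.array([-0.5]),
         np.array([[0.0, 0.0]]), np.array([-0.6])),
        (np.array([[1.0, 0.0]]), np.array([-0.5]),
         np.array([[0.0, 0.0]]), np.array([0.6])),
    ]

    def explicit_mpc(x):
        """Return the control move for state x by region lookup."""
        for H, h, K, g in regions:
            if np.all(H @ x <= h + 1e-9):
                return K @ x + g
        raise ValueError("state outside the explored parameter space")

    u = explicit_mpc(np.array([0.2, -0.1]))  # -> array([-0.2])
    ```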

  20. Empirical Studies on the Use of Social Software in Global Software Development - a Systematic Mapping Study

    DEFF Research Database (Denmark)

    Giuffrida, Rosalba; Dittrich, Yvonne

    2013-01-01

    Context: While a wealth of empirical studies on the usage of SoSo are available in related fields, there exists no comprehensive overview of what has been investigated to date across them. Objective: The aim of this review is to map empirical studies on the usage of SoSo in Software Engineering projects and in distributed teams… Results: SoSo is reported as being used for collaborative work, fostering awareness, knowledge management and coordination among team members. Contrary to the evident high importance of the social aspects offered by SoSo, socialization is not the most important usage reported. Conclusions: This review reports how SoSo is used in GSD and how it is capable of supporting GSD teams. Four emerging themes in global software engineering were identified: the appropriation and development of usage structures; understanding how an ecology of communication channels and tools are used by teams; the role played by SoSo either as a subtext or as an explicit goal; and finally…

  1. Optimizing a machine learning based glioma grading system using multi-parametric MRI histogram and texture features.

    Science.gov (United States)

    Zhang, Xin; Yan, Lin-Feng; Hu, Yu-Chuan; Li, Gang; Yang, Yang; Han, Yu; Sun, Ying-Zhi; Liu, Zhi-Cheng; Tian, Qiang; Han, Zi-Yang; Liu, Le-De; Hu, Bin-Quan; Qiu, Zi-Yu; Wang, Wen; Cui, Guang-Bin

    2017-07-18

    Current machine learning techniques provide the opportunity to develop noninvasive and automated glioma grading tools by utilizing quantitative parameters derived from multi-modal magnetic resonance imaging (MRI) data. However, the efficacies of different machine learning methods in glioma grading have not been investigated. A comprehensive comparison of various machine learning methods in differentiating low-grade gliomas (LGGs) and high-grade gliomas (HGGs), as well as WHO grade II, III and IV gliomas, based on multi-parametric MRI images was undertaken in the current study. The parametric histogram and image texture attributes of 120 glioma patients were extracted from the perfusion, diffusion and permeability parametric maps of preoperative MRI. Then, 25 commonly used machine learning classifiers combined with 8 independent attribute selection methods were applied and evaluated using a leave-one-out cross validation (LOOCV) strategy. In addition, the influences of parameter selection on the classifying performances were investigated. We found that the support vector machine (SVM) exhibited superior performance to the other classifiers. By combining all tumor attributes with the synthetic minority over-sampling technique (SMOTE), the highest classifying accuracy of 0.945 or 0.961 for LGG vs. HGG or grade II, III and IV gliomas, respectively, was achieved. Application of the Recursive Feature Elimination (RFE) attribute selection strategy further improved the classifying accuracies. Moreover, the performances of the LibSVM, SMO and IBk classifiers were influenced by key parameters such as kernel type, C, gamma and K. SVM is a promising tool for developing an automated preoperative glioma grading system, especially when combined with the RFE strategy. Model parameters should be considered in glioma grading model optimization.
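
    A hedged sketch of the best-performing combination described above (SMOTE oversampling, RFE attribute selection and an SVM, scored with LOOCV) might look as follows. The feature files are hypothetical stand-ins and the hyper-parameters are illustrative, not the authors' settings; wrapping SMOTE in an imblearn Pipeline keeps the oversampling inside each cross-validation fold.

    ```python
    # Sketch: SMOTE + RFE + SVM evaluated with leave-one-out cross validation.
    import numpy as np
    from imblearn.over_sampling import SMOTE
    from imblearn.pipeline import Pipeline
    from sklearn.feature_selection import RFE
    from sklearn.model_selection import LeaveOneOut, cross_val_score
    from sklearn.svm import SVC

    X = np.load("glioma_features.npy")   # hypothetical file: (n_patients, n_features)
    y = np.load("glioma_grades.npy")     # hypothetical file: 0 = LGG, 1 = HGG

    pipe = Pipeline([
        ("smote", SMOTE(random_state=0)),                       # oversample minority class
        ("rfe", RFE(SVC(kernel="linear"), n_features_to_select=20)),  # recursive elimination
        ("svm", SVC(kernel="rbf", C=1.0, gamma="scale")),
    ])
    acc = cross_val_score(pipe, X, y, cv=LeaveOneOut()).mean()
    print(f"LOOCV accuracy: {acc:.3f}")
    ```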

  2. Characterizing Heterogeneity within Head and Neck Lesions Using Cluster Analysis of Multi-Parametric MRI Data.

    Directory of Open Access Journals (Sweden)

    Marco Borri

    To describe a methodology, based on cluster analysis, to partition multi-parametric functional imaging data into groups (or clusters) of similar functional characteristics, with the aim of characterizing functional heterogeneity within head and neck tumour volumes; and to evaluate the performance of the proposed approach on a set of longitudinal MRI data, analysing the evolution of the obtained sub-sets with treatment. The cluster analysis workflow was applied to a combination of dynamic contrast-enhanced and diffusion-weighted imaging MRI data from a cohort of squamous cell carcinoma of the head and neck patients. Cumulative distributions of voxels, containing pre- and post-treatment data and including both primary tumours and lymph nodes, were partitioned into k clusters (k = 2, 3 or 4). Principal component analysis and cluster validation were employed to investigate data composition and to independently determine the optimal number of clusters. The evolution of the resulting sub-regions with induction chemotherapy treatment was assessed relative to the number of clusters. The clustering algorithm was able to separate clusters which significantly reduced in voxel number following induction chemotherapy from clusters with a non-significant reduction. Partitioning with the optimal number of clusters (k = 4), determined with cluster validation, produced the best separation between reducing and non-reducing clusters. The proposed methodology was able to identify tumour sub-regions with distinct functional properties, independently separating clusters which were affected differently by treatment. This work demonstrates that unsupervised cluster analysis, with no prior knowledge of the data, can be employed to provide a multi-parametric characterization of functional heterogeneity within tumour volumes.
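
    The partitioning step can be sketched as below. The feature columns, the use of k-means and the silhouette criterion are assumptions for illustration; the paper's own cluster-validation index may differ.

    ```python
    # Sketch: cluster voxel-wise DCE/DWI feature vectors and compare k = 2..4.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.metrics import silhouette_score
    from sklearn.preprocessing import StandardScaler

    # Hypothetical voxel features: columns e.g. [Ktrans, ve, ADC]
    voxels = np.random.rand(5000, 3)
    X = StandardScaler().fit_transform(voxels)

    for k in (2, 3, 4):
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
        # subsample for the O(n^2) silhouette computation
        score = silhouette_score(X, labels, sample_size=2000, random_state=0)
        print(f"k={k}: silhouette={score:.3f}")
    ```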

  3. Software survey: VOSviewer, a computer program for bibliometric mapping

    NARCIS (Netherlands)

    N.J.P. van Eck (Nees Jan); L. Waltman (Ludo)

    2010-01-01

    We present VOSviewer, a freely available computer program that we have developed for constructing and viewing bibliometric maps. Unlike most computer programs that are used for bibliometric mapping, VOSviewer pays special attention to the graphical representation of bibliometric maps.

  4. Multi-parametric ultrasound criteria for internal carotid artery disease - comparison with CT angiography

    International Nuclear Information System (INIS)

    Barlinn, Kristian; Kepplinger, Jessica; Siepmann, Timo; Pallesen, Lars-Peder; Bodechtel, Ulf; Reichmann, Heinz; Puetz, Volker; Floegel, Thomas; Kitzler, Hagen H.; Alexandrov, Andrei V.

    2016-01-01

    The German Society of Ultrasound in Medicine (known by its acronym DEGUM) recently proposed a novel multi-parametric ultrasound approach for comprehensive and accurate assessment of extracranial internal carotid artery (ICA) steno-occlusive disease. We determined the agreement between duplex ultrasonography (DUS) interpreted by the DEGUM criteria and CT angiography (CTA) for grading of extracranial ICA steno-occlusive disease. Consecutive patients with acute cerebral ischemia underwent DUS and CTA. Internal carotid artery stenosis was graded according to the DEGUM-recommended criteria for DUS. Independent readers manually performed North American Symptomatic Carotid Endarterectomy Trial-type measurements on axial CTA source images. Both modalities were compared using Spearman's correlation and Bland-Altman analyses. A total of 303 acute cerebral ischemia patients (mean age, 72 ± 12 years; 58% men; median baseline National Institutes of Health Stroke Scale score, 4 [interquartile range 7]) provided 593 DUS and CTA vessel pairs for comparison. There was a positive correlation between DUS and CTA (r_s = 0.783, p < 0.001), with a mean difference in degree of stenosis measurement of 3.57%. Bland-Altman analysis further revealed widely varying differences (95% limits of agreement: −29.26 to 22.84) between the two modalities. Although the novel DEGUM criteria showed overall good agreement between DUS and CTA across all stenosis ranges, the potential for wide incongruence with CTA underscores the need for local laboratory validation to avoid false screening results. (orig.)
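
    The agreement statistics used above (Spearman correlation plus the Bland-Altman bias and 95% limits of agreement) are straightforward to reproduce on paired stenosis grades; the arrays below are hypothetical examples, not study data.

    ```python
    # Sketch: Spearman correlation and Bland-Altman statistics for paired grades.
    import numpy as np
    from scipy import stats

    dus = np.array([70.0, 55.0, 80.0, 40.0, 90.0, 65.0])  # % stenosis by DUS
    cta = np.array([65.0, 60.0, 75.0, 45.0, 85.0, 70.0])  # % stenosis by CTA

    rho, p = stats.spearmanr(dus, cta)
    diff = dus - cta
    bias = diff.mean()                       # mean difference between methods
    loa = 1.96 * diff.std(ddof=1)            # half-width of 95% limits of agreement
    print(f"rho={rho:.3f} (p={p:.4f}); bias={bias:.2f}%, "
          f"LoA=[{bias - loa:.2f}, {bias + loa:.2f}]")
    ```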

  5. Development of a computer aided diagnosis model for prostate cancer classification on multi-parametric MRI

    Science.gov (United States)

    Alfano, R.; Soetemans, D.; Bauman, G. S.; Gibson, E.; Gaed, M.; Moussa, M.; Gomez, J. A.; Chin, J. L.; Pautler, S.; Ward, A. D.

    2018-02-01

    Multi-parametric MRI (mp-MRI) is becoming a standard in contemporary prostate cancer screening and diagnosis, and has been shown to aid physicians in cancer detection. It offers many advantages over traditional systematic biopsy, which has been shown to have very high clinical false-negative rates of up to 23% at all stages of the disease. However beneficial, mp-MRI is relatively complex to interpret and suffers from inter-observer variability in lesion localization and grading. Computer-aided diagnosis (CAD) systems have been developed as a solution, as they have the power to perform deterministic quantitative image analysis. We measured the accuracy of such a system validated using accurately co-registered whole-mount digitized histology. We trained a logistic linear classifier (LOGLC), support vector machine (SVC), k-nearest neighbour (KNN) and random forest classifier (RFC) in a four-part ROI-based experiment against: 1) cancer vs. non-cancer, 2) high-grade (Gleason score ≥ 4+3) vs. low-grade cancer. This work will form the basis for a tool that enhances the radiologist's ability to detect malignancies, potentially improving biopsy guidance, treatment selection, and focal therapy for prostate cancer patients, maximizing the potential for cure and increasing quality of life.
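
    A minimal sketch of the four-classifier ROI experiment, assuming feature and label arrays are prepared elsewhere; the file names and hyper-parameters are hypothetical, and the authors' training details may differ.

    ```python
    # Sketch: compare the four classifier families named above on ROI features.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.svm import SVC

    X = np.load("roi_features.npy")  # hypothetical (n_rois, n_features)
    y = np.load("roi_labels.npy")    # hypothetical: 1 = cancer, 0 = non-cancer

    classifiers = {
        "LOGLC": LogisticRegression(max_iter=1000),
        "SVC": SVC(kernel="rbf", gamma="scale"),
        "KNN": KNeighborsClassifier(n_neighbors=5),
        "RFC": RandomForestClassifier(n_estimators=200, random_state=0),
    }
    for name, clf in classifiers.items():
        print(name, cross_val_score(clf, X, y, cv=5).mean())
    ```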

  6. Multi-parametric ultrasound criteria for internal carotid artery disease - comparison with CT angiography

    Energy Technology Data Exchange (ETDEWEB)

    Barlinn, Kristian; Kepplinger, Jessica; Siepmann, Timo; Pallesen, Lars-Peder; Bodechtel, Ulf; Reichmann, Heinz; Puetz, Volker [Carl Gustav Carus University Hospital, Technische Universitaet Dresden, Department of Neurology, Dresden (Germany); Floegel, Thomas [Carl Gustav Carus University Hospital, Technische Universitaet Dresden, Department of Neurology, Dresden (Germany); Carl Gustav Carus University Hospital, Technische Universitaet Dresden, Department of Neuroradiology, Dresden (Germany); Kitzler, Hagen H. [Carl Gustav Carus University Hospital, Technische Universitaet Dresden, Department of Neuroradiology, Dresden (Germany); Alexandrov, Andrei V. [The University of Tennessee Health Science Center, Department of Neurology, Memphis, TN (United States)

    2016-09-15

    The German Society of Ultrasound in Medicine (known by its acronym DEGUM) recently proposed a novel multi-parametric ultrasound approach for comprehensive and accurate assessment of extracranial internal carotid artery (ICA) steno-occlusive disease. We determined the agreement between duplex ultrasonography (DUS) interpreted by the DEGUM criteria and CT angiography (CTA) for grading of extracranial ICA steno-occlusive disease. Consecutive patients with acute cerebral ischemia underwent DUS and CTA. Internal carotid artery stenosis was graded according to the DEGUM-recommended criteria for DUS. Independent readers manually performed North American Symptomatic Carotid Endarterectomy Trial-type measurements on axial CTA source images. Both modalities were compared using Spearman's correlation and Bland-Altman analyses. A total of 303 acute cerebral ischemia patients (mean age, 72 ± 12 years; 58% men; median baseline National Institutes of Health Stroke Scale score, 4 [interquartile range 7]) provided 593 DUS and CTA vessel pairs for comparison. There was a positive correlation between DUS and CTA (r_s = 0.783, p < 0.001), with a mean difference in degree of stenosis measurement of 3.57%. Bland-Altman analysis further revealed widely varying differences (95% limits of agreement: −29.26 to 22.84) between the two modalities. Although the novel DEGUM criteria showed overall good agreement between DUS and CTA across all stenosis ranges, the potential for wide incongruence with CTA underscores the need for local laboratory validation to avoid false screening results. (orig.)

  7. Multi-parametric monitoring and assessment of high-intensity focused ultrasound (HIFU) boiling by harmonic motion imaging for focused ultrasound (HMIFU): an ex vivo feasibility study

    International Nuclear Information System (INIS)

    Hou, Gary Y; Marquet, Fabrice; Wang, Shutao; Konofagou, Elisa E

    2014-01-01

    Harmonic motion imaging for focused ultrasound (HMIFU) is a recently developed high-intensity focused ultrasound (HIFU) treatment monitoring method whose feasibility has been demonstrated in vitro and in vivo. Here, a multi-parametric study is performed to investigate both elastic and acoustics-independent viscoelastic tissue changes using the Harmonic Motion Imaging (HMI) displacement, axial compressive strain and change in relative phase shift during high-energy HIFU treatment with tissue boiling. Forty-three (n = 43) thermal lesions were formed in ex vivo canine liver specimens (n = 28). Two-dimensional (2D) transverse HMI displacement maps were also obtained before and after lesion formation. The same method was repeated with 10 s, 20 s and 30 s HIFU durations at three different acoustic powers of 8, 10 and 11 W, which were selected and verified as treatment parameters capable of inducing boiling using both thermocouple and passive cavitation detection (PCD) measurements. Although a steady decrease in the displacement, compressive strain, and relative change in the focal phase shift (Δϕ) was obtained in numerous cases, indicating an overall increase in relative stiffness, the study outcomes also showed that during boiling a reversed lesion-to-background displacement contrast was detected, indicating a potential change in tissue absorption, geometrical change and/or mechanical gelatification or pulverization. Following treatment, corresponding 2D HMI displacement images of the thermal lesions also mapped a consistent discrepancy in the lesion-to-background displacement contrast. Despite the expectedly chaotic changes in acoustic properties with boiling, the relative change in phase shift showed a consistent decrease, indicating its robustness for monitoring biomechanical properties independently of the acoustic property changes throughout the HIFU treatment. In addition, the 2D HMI displacement images confirmed and indicated the increase in the thermal lesion size with

  8. [The primary research and development of software oversampling mapping system for electrocardiogram].

    Science.gov (United States)

    Zhou, Yu; Ren, Jie

    2011-04-01

    We put forward a new concept of a software oversampling mapping system for the electrocardiogram (ECG) to assist research on the ECG inverse problem and to improve the generality of the mapping system and the quality of the mapped signals. We then developed a conceptual system based on a traditional ECG detection circuit, LabVIEW and a DAQ card produced by National Instruments, and at the same time incorporated the newly developed oversampling method into the system. The results indicated that the system could map ECG signals accurately and that the quality of the signals was good. The improvement of the hardware and enhancement of the software made the system suitable for mapping in different situations. The primary development of the software for the oversampling mapping system was thus successful, and further research and development can make the system a powerful tool for studying the ECG inverse problem.
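
    The oversampling idea itself can be illustrated independently of the LabVIEW/DAQ implementation: each output sample of the mapping signal is the average of a block of high-rate acquisitions, which suppresses uncorrelated noise by roughly the square root of the block size. The rates and signal below are hypothetical toys.

    ```python
    # Conceptual sketch of software oversampling: average blocks of N high-rate
    # samples into each output sample (white noise drops by ~sqrt(N)).
    import numpy as np

    def oversample_average(signal, factor):
        """Decimate a high-rate signal by averaging blocks of `factor` samples."""
        n = len(signal) // factor * factor
        return signal[:n].reshape(-1, factor).mean(axis=1)

    fs = 8000                                    # hypothetical oversampled rate (Hz)
    t = np.arange(0, 2.0, 1.0 / fs)
    clean = np.sin(2 * np.pi * 1.2 * t)          # toy stand-in for an ECG trace
    noisy = clean + 0.2 * np.random.randn(t.size)
    ecg_500hz = oversample_average(noisy, 16)    # 8 kHz -> 500 Hz mapping signal
    ```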

  9. Multi-parametric MRI in cervical cancer. Early prediction of response to concurrent chemoradiotherapy in combination with clinical prognostic factors

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Wei; Chen, Bing; Wang, Ai Jun; Zhao, Jian Guo [The General Hospital of Ningxia Medical University, Department of Radiology, Yinchuan (China); Qiang, Jin Wei [Fudan University, Department of Radiology, Jinshan Hospital, Shanghai (China); Tian, Hai Ping [The General Hospital of Ningxia Medical University, Department of Pathology, Yinchuan (China)

    2018-01-15

    To investigate the prediction of response to concurrent chemoradiotherapy (CCRT) through a combination of pretreatment multi-parametric magnetic resonance imaging (MRI) with clinical prognostic factors (CPF) in cervical cancer patients. Sixty-five patients underwent conventional MRI, diffusion-weighted imaging (DWI), and dynamic contrast-enhanced MRI (DCE-MRI) before CCRT. The patients were divided into non-residual and residual tumour groups according to post-treatment MRI. Pretreatment MRI parameters and CPF between the two groups were compared, and prognostic factors, optimal thresholds, and predictive performance for post-treatment residual tumour occurrence were estimated. The residual group showed a lower maximum slope of increase (MSI_L) and signal enhancement ratio (SER_L) in low-perfusion subregions, a higher apparent diffusion coefficient (ADC) value, and a higher stage than the non-residual tumour group (p < 0.001, p = 0.003, p < 0.001, and p < 0.001, respectively). MSI_L and ADC were independent prognostic factors. The combination of both measures improved the diagnostic performance compared with individual MRI parameters. A further combination of these two factors with CPF exhibited the highest predictive performance. Pretreatment MSI_L and ADC were independent prognostic factors for cervical cancer. The predictive capacity of multi-parametric MRI was superior to that of individual MRI parameters. The combination of multi-parametric MRI with CPF further improved the predictive performance. (orig.)

  10. A multi-parametric particle-pairing algorithm for particle tracking in single and multiphase flows

    International Nuclear Information System (INIS)

    Cardwell, Nicholas D; Vlachos, Pavlos P; Thole, Karen A

    2011-01-01

    Multiphase flows (MPFs) offer a rich area of fundamental study with many practical applications. Examples of such flows range from the ingestion of foreign particulates in gas turbines to the transport of particles within the human body. Experimental investigation of MPFs, however, is challenging and requires techniques that simultaneously resolve both the carrier and discrete phases present in the flowfield. This paper presents a new multi-parametric particle-pairing algorithm for particle tracking velocimetry (MP3-PTV) in MPFs. MP3-PTV improves upon previous particle tracking algorithms by employing a novel variable pair-matching algorithm which utilizes displacement preconditioning in combination with estimated particle size and intensity to more effectively and accurately match particle pairs between successive images. To improve the method's efficiency, a new particle identification and segmentation routine was also developed. Validation of the new method was initially performed on two artificial data sets: a traditional single-phase flow published by the Visualization Society of Japan (VSJ) and an in-house generated MPF data set having a bi-modal distribution of particle diameters. Metrics of the measurement yield, reliability and overall tracking efficiency were used for method comparison. On the VSJ data set, the newly presented segmentation routine delivered a twofold improvement in identifying particles when compared to other published methods. For the simulated MPF data set, the measurement efficiency of the carrier phase improved from 9% to 41% for MP3-PTV as compared to a traditional hybrid PTV. When employed on experimental data of a gas–solid flow, the MP3-PTV effectively identified the two particle populations and reported a vector efficiency and velocity measurement error comparable to measurements for the single-phase flow images. Simultaneous measurement of the dispersed particle and the carrier flowfield velocities allowed for the calculation of
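
    The pair-matching idea can be sketched as a combined cost over displacement-preconditioned position, particle size and intensity. This is a simplified illustration of the approach described above, with hypothetical weights, not the MP3-PTV source.

    ```python
    # Sketch: multi-parametric pair matching between two particle frames.
    import numpy as np

    def match_particles(frame1, frame2, predicted_disp, search_radius,
                        w_pos=1.0, w_size=0.5, w_int=0.5):
        """frameN: array of rows [x, y, diameter, intensity]. Returns index pairs."""
        pairs = []
        for i, p in enumerate(frame1):
            target = p[:2] + predicted_disp          # displacement preconditioning
            d_pos = np.linalg.norm(frame2[:, :2] - target, axis=1)
            candidates = np.where(d_pos < search_radius)[0]
            if candidates.size == 0:
                continue
            # combined, normalized cost on position, size and intensity mismatch
            cost = (w_pos * d_pos[candidates] / search_radius
                    + w_size * np.abs(frame2[candidates, 2] - p[2]) / p[2]
                    + w_int * np.abs(frame2[candidates, 3] - p[3]) / p[3])
            pairs.append((i, candidates[np.argmin(cost)]))
        return pairs
    ```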

  11. Multi-parametric survey for archaeology: how and why, or how and why not?

    Science.gov (United States)

    Hesse, Albert

    1999-03-01

    Many papers and conference presentations, particularly over the last ten years, have referred to multi-parametric geophysical surveys and integrated interpretations in archaeological prospection. Several experiments of this kind have been undertaken by our laboratory, with mostly fascinating results, but our experience leads us to be rather suspicious of the over-systematic choice of extreme solutions, and we would recommend an appropriate and balanced choice, within the limits of the budget available for an operation, between the two following procedures. 1) Routine survey with an extremely large variety of instruments: this allows a better understanding of the underground situation than survey with a single instrument, but reduces the area that can be surveyed. A limited number of specific circumstances should lead one to adopt this option. They include: previous knowledge, or equally previous ignorance, of the targets under investigation; preliminary selection of the most efficient method on a scientific and economic basis; comparative experiments for the validation of new tools; specific detection of targets of differing natures in the ground; as well as uncertainty about the efficiency of each available method for the actual nature of the investigated site. 2) Survey of a much larger area with only one method, chosen because it is particularly fast and efficient: there is an obvious value in extensive exploration in order to evaluate the size, distribution and limits of a large number of archaeological features. The strict selection of appropriate methods, chosen to meet the aims of a project, should consider not only geophysics but all kinds of conventional or non-conventional archaeological methods as well, brought together to permit an integrated interpretation. This highly specialized job does not fall within the normal experience of exploration geophysicists, who usually deal with geological features, or most field archaeologists, who are mainly involved in

  12. Open Source Projects in Software Engineering Education: A Mapping Study

    Science.gov (United States)

    Nascimento, Debora M. C.; Almeida Bittencourt, Roberto; Chavez, Christina

    2015-01-01

    Context: It is common practice in academia to have students work with "toy" projects in software engineering (SE) courses. One way to make such courses more realistic and reduce the gap between academic courses and industry needs is getting students involved in open source projects (OSP) with faculty supervision. Objective: This study…

  13. Automated prostate cancer detection via comprehensive multi-parametric magnetic resonance imaging texture feature models

    International Nuclear Information System (INIS)

    Khalvati, Farzad; Wong, Alexander; Haider, Masoom A.

    2015-01-01

    Prostate cancer is the most common form of cancer and the second leading cause of cancer death in North America. Auto-detection of prostate cancer can play a major role in early detection of prostate cancer, which has a significant impact on patient survival rates. While multi-parametric magnetic resonance imaging (MP-MRI) has shown promise in the diagnosis of prostate cancer, existing auto-detection algorithms do not take advantage of the abundance of data available in MP-MRI to improve detection accuracy. The goal of this research was to design a radiomics-based auto-detection method for prostate cancer utilizing MP-MRI data. In this work, we present new MP-MRI texture feature models for radiomics-driven detection of prostate cancer. In addition to the commonly used non-invasive imaging sequences in conventional MP-MRI, namely T2-weighted MRI (T2w) and diffusion-weighted imaging (DWI), our proposed MP-MRI texture feature models incorporate computed high-b DWI (CHB-DWI) and a new diffusion imaging modality called correlated diffusion imaging (CDI). Moreover, the proposed texture feature models incorporate features from individual b-value images. A comprehensive set of texture features was calculated for both the conventional MP-MRI and the new MP-MRI texture feature models. We performed feature selection analysis for each individual modality and then combined the best features from each modality to construct the optimized texture feature models. The performance of the proposed MP-MRI texture feature models was evaluated via leave-one-patient-out cross-validation using a support vector machine (SVM) classifier trained on 40,975 cancerous and healthy tissue samples obtained from real clinical MP-MRI datasets. The proposed MP-MRI texture feature models outperformed the conventional model (i.e., T2w+DWI) with regard to cancer detection accuracy. Comprehensive texture feature models were developed for improved radiomics-driven detection of prostate cancer using MP-MRI. Using a
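
    Leave-one-patient-out cross-validation, as used above, differs from plain leave-one-out in that all tissue samples of a patient are held out together, preventing patient-level leakage across folds. A sketch with sklearn's LeaveOneGroupOut follows; the file names and SVM settings are hypothetical.

    ```python
    # Sketch: leave-one-patient-out evaluation of an SVM on texture features.
    import numpy as np
    from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
    from sklearn.svm import SVC

    X = np.load("mpmri_texture_features.npy")  # hypothetical (n_samples, n_feats)
    y = np.load("tissue_labels.npy")           # hypothetical: 1 = cancer, 0 = healthy
    patient = np.load("patient_ids.npy")       # hypothetical patient id per sample

    acc = cross_val_score(SVC(kernel="rbf", gamma="scale"), X, y,
                          groups=patient, cv=LeaveOneGroupOut()).mean()
    print(f"leave-one-patient-out accuracy: {acc:.3f}")
    ```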

  14. Assessment of mechanical properties of isolated bovine intervertebral discs from multi-parametric magnetic resonance imaging.

    Science.gov (United States)

    Recuerda, Maximilien; Périé, Delphine; Gilbert, Guillaume; Beaudoin, Gilles

    2012-10-12

    The treatment planning of spine pathologies requires information on the rigidity and permeability of the intervertebral discs (IVDs). Magnetic resonance imaging (MRI) offers great potential as a sensitive and non-invasive technique for describing the mechanical properties of IVDs. However, the literature has reported small correlation coefficients between mechanical properties and MRI parameters. Our hypothesis is that the compressive modulus and the permeability of the IVD can be predicted by a linear combination of MRI parameters. Sixty IVDs were harvested from bovine tails and randomly separated into four groups (in-situ, digested-6h, digested-18h, digested-24h). Multi-parametric MRI acquisitions were used to quantify the relaxation times T1 and T2, the magnetization transfer ratio MTR, the apparent diffusion coefficient ADC, and the fractional anisotropy FA. Unconfined compression, confined compression and direct permeability measurements were performed to quantify the compressive moduli and the hydraulic permeabilities. Differences between groups were evaluated with a one-way ANOVA. Multilinear regressions were performed between dependent mechanical properties and independent MRI parameters to verify our hypothesis. A principal component analysis was used to convert the set of possibly correlated variables into a set of linearly uncorrelated variables. Agglomerative hierarchical clustering was performed on the 3 principal components. Multilinear regressions showed that 45 to 80% of the Young's modulus E, the aggregate modulus in absence of deformation HA0, the radial permeability kr and the axial permeability in absence of deformation k0 can be explained by the MRI parameters within both the nucleus pulposus and the annulus fibrosus. The principal component analysis reduced our variables to two principal components with a cumulative variability of 52-65%, which increased to 70-82% when considering the third principal component. The dendrograms showed a natural
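
    The regression and PCA steps can be sketched as follows; the arrays are random placeholders standing in for the five MRI parameters and one mechanical property, and the sklearn tooling is an assumption, not the authors' software.

    ```python
    # Sketch: multiple linear regression of a modulus on MRI parameters, plus PCA.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression

    mri = np.random.rand(60, 5)          # columns stand in for T1, T2, MTR, ADC, FA
    modulus = np.random.rand(60)         # stand-in for e.g. aggregate modulus HA0

    reg = LinearRegression().fit(mri, modulus)
    print(f"R^2 = {reg.score(mri, modulus):.2f}")   # share of variance explained

    pca = PCA(n_components=3).fit(mri)
    print("cumulative variance:", np.cumsum(pca.explained_variance_ratio_))
    ```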

  15. Quantitative Multi-Parametric Magnetic Resonance Imaging of Tumor Response to Photodynamic Therapy.

    Directory of Open Access Journals (Sweden)

    Tom J L Schreurs

    The aim of this study was to characterize the response to photodynamic therapy (PDT) in a mouse cancer model using a multi-parametric quantitative MRI protocol and to identify MR parameters as potential biomarkers for early assessment of treatment outcome. CT26.WT colon carcinoma tumors were grown subcutaneously in the hind limb of BALB/c mice. Therapy consisted of intravenous injection of the photosensitizer Bremachlorin, followed by 10 min laser illumination (200 mW/cm²) of the tumor 6 h post injection. MRI at 7 T was performed at baseline, directly after PDT, as well as at 24 h and 72 h. Tumor relaxation time constants (T1 and T2) and the apparent diffusion coefficient (ADC) were quantified at each time point. Additionally, Gd-DOTA dynamic contrast-enhanced (DCE) MRI was performed to estimate transfer constants (Ktrans) and volume fractions of the extravascular extracellular space (ve) using standard Tofts-Kermode tracer kinetic modeling. At the end of the experiment, tumor viability was characterized by histology using NADH-diaphorase staining. The therapy induced extensive cell death in the tumor and resulted in significant reduction in tumor growth, as compared to untreated controls. Tumor T1 and T2 relaxation times remained unchanged up to 24 h, but decreased at 72 h after treatment. Tumor ADC values significantly increased at 24 h and 72 h. DCE-MRI derived tracer kinetic parameters displayed an early response to the treatment. Directly after PDT, complete vascular shutdown was observed in large parts of the tumors, with reduced uptake (decreased Ktrans) in the remaining tumor tissue. At 24 h, contrast uptake in most tumors was essentially absent. Out of 5 animals that were monitored for 2 weeks after treatment, 3 had tumor recurrence, in locations that showed strong contrast uptake at 72 h. DCE-MRI is an effective tool for visualization of vascular effects directly after PDT. Endogenous contrast parameters T1, T2, and ADC, measured at 24 to 72 h after PDT, are
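
    A minimal sketch of standard Tofts-Kermode fitting, assuming a known arterial input function: the tissue concentration curve is modelled as the plasma curve convolved with an exponential kernel parameterized by Ktrans and ve. The AIF, noise level and starting values below are toy placeholders.

    ```python
    # Sketch: fit Ktrans and ve with the standard Tofts model,
    # Ct(t) = Ktrans * integral_0^t Cp(tau) * exp(-Ktrans*(t-tau)/ve) dtau.
    import numpy as np
    from scipy.optimize import curve_fit

    t = np.linspace(0, 300, 150)                     # time (s)
    cp = 5.0 * (t / 60.0) * np.exp(-t / 60.0)        # toy arterial input function

    def tofts(t, ktrans, ve):
        dt = t[1] - t[0]
        kernel = np.exp(-ktrans * t / ve)
        return ktrans * np.convolve(cp, kernel)[:t.size] * dt

    # synthetic "measured" curve with noise, then the fit itself
    ct_measured = tofts(t, 0.08, 0.3) + 0.01 * np.random.randn(t.size)
    (ktrans, ve), _ = curve_fit(tofts, t, ct_measured, p0=[0.05, 0.2],
                                bounds=([1e-4, 1e-3], [1.0, 1.0]))
    print(f"Ktrans={ktrans:.3f} 1/s, ve={ve:.2f}")
    ```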

  16. Una introduzione ai software per il crime mapping / Observations préliminaires sur les logiciels du mappage du crime / Some introductory notes on crime mapping software

    OpenAIRE

    Ummarino Alessandro

    2013-01-01

    Crime mapping, rather than a discipline in its own right, is simply the application of statistical-geographical analysis techniques to the study of crime. Thanks to the use of GIS (Geographic Information System) software, the exponential development of computing and easy access to the web, the production of quality maps is now within the reach of any average user. The possibility of applying these analysis techniques is effectively offered by commercial GIS software…

  17. Ionospheric Mapping Software Ensures Accuracy of Pilots' GPS

    Science.gov (United States)

    2015-01-01

    IonoSTAGE and SuperTruth software are part of a suite created at the Jet Propulsion Laboratory to enable the Federal Aviation Administration's Wide Area Augmentation System, which provides pinpoint accuracy in aircraft GPS units. The system, used by more than 73,000 planes, facilitates landings under adverse conditions at small airports. In 2013, IonoSTAGE and SuperTruth found their first commercial license when NEC, based in Japan, with US headquarters in Irving, Texas, licensed the entire suite.

  18. Cyberphysical systems for epilepsy and related brain disorders multi-parametric monitoring and analysis for diagnosis and optimal disease management

    CERN Document Server

    Antonopoulos, Christos

    2015-01-01

    This book introduces a new cyberphysical system that combines clinical and basic neuroscience research with advanced data analysis and medical management tools for developing novel applications for the management of epilepsy. The authors describe the algorithms and architectures needed to provide ambulatory, diagnostic and long-term monitoring services through multi-parametric data collection. Readers will see how to achieve in-hospital quality standards, addressing conventional "routine" clinic-based service purposes, at reduced cost, with enhanced capability and increased geographical availability. The cyberphysical system described in this book is flexible, can be optimized for each patient, and is demonstrated in several case studies.

  19. Development of a reconstruction software of elemental maps by micro X-ray fluorescence

    International Nuclear Information System (INIS)

    Almeida, Andre Pereira de; Braz, Delson; Mota, Carla Lemos; Oliveira, Luis Fernando de; Barroso, Regina Cely; Pinto, Nivia Graciele Villela; Cardoso, Simone Coutinho; Moreira, Silvana

    2009-01-01

    The technique of X-ray fluorescence (XRF) using SR microbeams is a powerful analysis tool for studying elemental composition in many kinds of samples. One application of this technique is analysis through the mapping of chemical elements, forming a matrix of data. The aim of this work is the presentation of the program MapXRF, an in-house software package designed to streamline the processing and mapping of fluorescence intensity data. This program uses spectra generated by QXAS as input data and separates the intensities of each chemical element found in the fluorescence spectra into per-element files. From these files, the program generates intensity maps that can be visualized in any image processing program. The proposed software was tested using fluorescence data obtained at the XRF beamline of the National Synchrotron Light Laboratory (LNLS), Brazil. Automatic 2D scans were performed and element distribution maps were obtained in the form of a matrix of data. (author)
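
    The core reconstruction step (not the MapXRF source, just an illustration of the idea) amounts to pivoting per-point, per-element intensities from a 2D scan into image matrices:

    ```python
    # Sketch: build one intensity map per element from a raster-scan table.
    import numpy as np

    # Hypothetical scan: one intensity per pixel per element, raster-ordered.
    nx, ny = 40, 30
    elements = ["Fe", "Cu", "Zn"]
    scan = {el: np.random.rand(nx * ny) for el in elements}
    ix = np.tile(np.arange(nx), ny)       # pixel column of each scan point
    iy = np.repeat(np.arange(ny), nx)     # pixel row of each scan point

    maps = {}
    for el in elements:
        img = np.zeros((ny, nx))
        img[iy, ix] = scan[el]            # place each intensity at its pixel
        maps[el] = img                    # matrix of data, one map per element
    ```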

  20. Development of a software for reconstruction of X-ray fluorescence intensity maps

    International Nuclear Information System (INIS)

    Almeida, Andre Pereira de; Braz, Delson; Mota, Carla Lemos; Oliveira, Luis Fernando de; Barroso, Regina Cely; Pinto, Nivia Graciele Villela; Cardoso, Simone Coutinho; Moreira, Silvana

    2009-01-01

    The technique of X-ray fluorescence (XRF) using SR microbeams is a powerful analysis tool for studying elemental composition in many kinds of samples. One application of this technique is analysis through the mapping of chemical elements, forming a matrix of data. The aim of this work is the presentation of the program MapXRF, an in-house software package designed to streamline the processing and mapping of fluorescence intensity data. This program uses spectra generated by QXAS as input data and separates the intensities of each chemical element found in the fluorescence spectra into per-element files. From these files, the program generates intensity maps that can be visualized in any image processing program. The proposed software was tested using fluorescence data obtained at the XRF beamline of the National Synchrotron Light Laboratory (LNLS), Brazil. Automatic 2D scans were performed and element distribution maps were obtained in the form of a matrix of data. (author)

  1. Development of a software for reconstruction of X-ray fluorescence intensity maps

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, Andre Pereira de; Braz, Delson; Mota, Carla Lemos, E-mail: apalmeid@gmail.com, E-mail: delson@lin.ufrj.br, E-mail: clemos@con.ufrj.br [Coordenacao dos Programas de Pos-graduacao de Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Nuclear; Oliveira, Luis Fernando de; Barroso, Regina Cely; Pinto, Nivia Graciele Villela, E-mail: cely@uerj.br, E-mail: lfolive@uerj.br, E-mail: nitatag@gmail.com [Universidade do Estado do Rio de Janeiro (UERJ), Rio de Janeiro, RJ (Brazil). Inst. de Fisica; Cardoso, Simone Coutinho, E-mail: simone@if.ufrj.br [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Inst. de Fisica; Moreira, Silvana, E-mail: silvana@fec.unicamp.br [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Faculdade de Engenharia Civil, Arquitetura e Urbanismo

    2009-07-01

    The technique of X-ray fluorescence (XRF) using SR microbeams is a powerful analysis tool for studying elemental composition in many kinds of samples. One application of this technique is analysis through the mapping of chemical elements, forming a matrix of data. The aim of this work is the presentation of the program MapXRF, an in-house software package designed to streamline the processing and mapping of fluorescence intensity data. This program uses spectra generated by QXAS as input data and separates the intensities of each chemical element found in the fluorescence spectra into per-element files. From these files, the program generates intensity maps that can be visualized in any image processing program. The proposed software was tested using fluorescence data obtained at the XRF beamline of the National Synchrotron Light Laboratory (LNLS), Brazil. Automatic 2D scans were performed and element distribution maps were obtained in the form of a matrix of data. (author)

  2. Development of a reconstruction software of elemental maps by micro X-ray fluorescence

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, Andre Pereira de; Braz, Delson; Mota, Carla Lemos, E-mail: apalmeid@gmail.com, E-mail: delson@lin.ufrj.br, E-mail: clemos@con.ufrj.br [Universidade Federal do Rio de Janeiro (PEN/COPPE/UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-Graduacao de Engenharia. Programa de Energia Nuclear; Oliveira, Luis Fernando de; Barroso, Regina Cely; Pinto, Nivia Graciele Villela, E-mail: cely@uerj.br, E-mail: lfolive@uerj.br, E-mail: nitatag@gmail.com [Universidade do Estado do Rio de Janeiro (IF/UERJ), RJ (Brazil). Inst. de Fisica; Cardoso, Simone Coutinho [Universidade Federal do Rio de Janeiro (IF/UFRJ), RJ (Brazil). Inst. de Fisica; Moreira, Silvana [Universidade Estadual de Campinas (FEC/UNICAMP), SP (Brazil). Faculdade de Engenharia Civil, Arquitetura e Urbanismo

    2009-07-01

    The technique of X-ray fluorescence (XRF) using SR microbeams is a powerful analysis tool for studying elemental composition in many kinds of samples. One application of this technique is analysis through the mapping of chemical elements, forming a matrix of data. The aim of this work is the presentation of the program MapXRF, an in-house software package designed to streamline the processing and mapping of fluorescence intensity data. This program uses spectra generated by QXAS as input data and separates the intensities of each chemical element found in the fluorescence spectra into per-element files. From these files, the program generates intensity maps that can be visualized in any image processing program. The proposed software was tested using fluorescence data obtained at the XRF beamline of the National Synchrotron Light Laboratory (LNLS), Brazil. Automatic 2D scans were performed and element distribution maps were obtained in the form of a matrix of data. (author)

  3. Automated diagnosis of prostate cancer in multi-parametric MRI based on multimodal convolutional neural networks

    Science.gov (United States)

    Le, Minh Hung; Chen, Jingyu; Wang, Liang; Wang, Zhiwei; Liu, Wenyu; Cheng, Kwang-Ting (Tim); Yang, Xin

    2017-08-01

    Automated methods for prostate cancer (PCa) diagnosis in multi-parametric magnetic resonance imaging (MP-MRI) are critical for alleviating the burden of radiograph interpretation while helping to improve diagnostic accuracy (Artan et al 2010 IEEE Trans. Image Process. 19 2444-55, Litjens et al 2014 IEEE Trans. Med. Imaging 33 1083-92, Liu et al 2013 SPIE Medical Imaging (International Society for Optics and Photonics) p 86701G, Moradi et al 2012 J. Magn. Reson. Imaging 35 1403-13, Niaf et al 2014 IEEE Trans. Image Process. 23 979-91, Niaf et al 2012 Phys. Med. Biol. 57 3833, Peng et al 2013a SPIE Medical Imaging (International Society for Optics and Photonics) p 86701H, Peng et al 2013b Radiology 267 787-96, Wang et al 2014 BioMed. Res. Int. 2014). This paper presents an automated method based on multimodal convolutional neural networks (CNNs) for two PCa diagnostic tasks: (1) distinguishing between cancerous and noncancerous tissues and (2) distinguishing between clinically significant (CS) and indolent PCa. Specifically, our multimodal CNNs effectively fuse apparent diffusion coefficient (ADC) maps and T2-weighted MP-MRI images (T2WIs). To effectively fuse ADCs and T2WIs, we design a new similarity loss function to enforce consistent features being extracted from both ADCs and T2WIs. The similarity loss is combined with the conventional classification loss functions and integrated into the back-propagation procedure of CNN training. The similarity loss enables better fusion results than existing methods, as the feature learning processes of both modalities are mutually guided, jointly facilitating the CNN to 'see' the true visual patterns of PCa. The classification results of the multimodal CNNs are further combined with results based on handcrafted features using a support vector machine classifier. To achieve a satisfactory accuracy for clinical use, we comprehensively investigate three critical factors which could greatly affect the performance of our
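
    The similarity-loss idea can be sketched in PyTorch: the two branch feature vectors are normalized and pulled together with an MSE term on top of the per-branch classification losses. The exact loss form and weighting in the paper may differ; lam is an illustrative hyper-parameter.

    ```python
    # Sketch: per-branch classification losses plus a feature-similarity term
    # that mutually guides the ADC and T2WI feature extractors.
    import torch
    import torch.nn.functional as F

    def multimodal_loss(feat_adc, feat_t2w, logits_adc, logits_t2w, target,
                        lam=0.5):
        """Classification loss on both branches plus a similarity penalty."""
        cls = (F.cross_entropy(logits_adc, target)
               + F.cross_entropy(logits_t2w, target))
        sim = F.mse_loss(F.normalize(feat_adc, dim=1),
                         F.normalize(feat_t2w, dim=1))
        return cls + lam * sim
    ```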

  4. IDAS, software support for mathematical models and map-based graphics

    International Nuclear Information System (INIS)

    Birnbaum, M.D.; Wecker, D.B.

    1984-01-01

    IDAS (Intermediate Dose Assessment System) was developed for the U.S. Nuclear Regulatory Commission as a hardware/software host for radiological models and display of map-based plume graphics at the Operations Center (HQ), regional incident response centers, and site emergency facilities. IDAS design goals acknowledged the likelihood of future changes in the suite of models and the composition of map features for analysis and graphical display. IDAS provides a generalized software support environment to programmers and users of modeling programs. A database manager process provides multi-user access control to all input and output data for modeling programs. A programmer-created data description file (schema) specifies data field names, data types, legal and recommended ranges, default values, preferred units of measurement, and "help" text. Subroutine calls to IDAS from a model program invoke a consistent user interface which can show any of the schema contents, convert units of measurement, and route data to multiple logical devices, including the database. A stand-alone data editor allows the user to read and write model data records without execution of a model. IDAS stores digitized map features in a 4-level naming hierarchy. A user can select the map icon, color, and whether to show a stored name tag, for each map feature. The user also selects the image scale (zoom) within limits set by map digitization. The resulting image combines static map information, computed analytic modeling results, and the user's feature selections for display to decision-makers.
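
    The schema concept can be illustrated with a small hypothetical data description and a generic validation routine; the field names, ranges and units below are invented for illustration and are not IDAS's actual schema format.

    ```python
    # Sketch: a data description (schema) driving generic input validation.
    SCHEMA = {
        "release_height": {"type": float, "range": (0.0, 500.0), "default": 30.0,
                           "units": "m", "help": "effective stack release height"},
        "wind_speed": {"type": float, "range": (0.1, 60.0), "default": 5.0,
                       "units": "m/s", "help": "wind speed at release height"},
    }

    def validate(record):
        """Coerce, default and range-check a model input record against SCHEMA."""
        out = {}
        for name, spec in SCHEMA.items():
            value = spec["type"](record.get(name, spec["default"]))
            lo, hi = spec["range"]
            if not lo <= value <= hi:
                raise ValueError(f"{name}={value} {spec['units']} outside {spec['range']}")
            out[name] = value
        return out

    print(validate({"release_height": 45.0}))  # wind_speed falls back to default
    ```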

  5. A Systematic Mapping Study of Software Architectures for Cloud Based Systems

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali

    2014-01-01

    Context: Cloud computing has gained significant attention from researchers and practitioners. This emerging paradigm is being used to provide solutions in multiple domains without huge upfront investment because of its on-demand resource-provisioning model. However, the information about how software… The objective of this study is to systematically identify and analyze the currently published research on the topics related to software architectures for cloud-based systems in order to identify architecture solutions for achieving quality requirements. Method: We decided to carry out a systematic mapping study to find as much peer-reviewed literature on the topics related to software architectures for cloud-based systems as possible. This study has been carried out by following the guidelines for conducting systematic literature reviews and systematic mapping studies as reported in the literature. Based on our paper…

  6. MAPS-15504 - An automated methodology for software process evaluation

    Directory of Open Access Journals (Sweden)

    Itana Maria de Souza Gimenes

    2000-05-01

    Due to growing demands for quality, the software engineering community has produced several standards and presented several approaches concerning the quality of software products and processes. Most of these standards apply to the software process, among which ISO 9000-3, ISO 12207, the CMM, and ISO/IEC TR 15504 (the result of the SPICE project) stand out for their wide use. Another outcome of software engineering research is process-centered software engineering environments (PSEEs), which aim at the automation of the software process. This article presents MAPS-15504, an automated methodology for evaluating software process quality based on ISO/IEC TR 15504. The software process evaluation methodology was applied to a case study and implemented in ExPSEE, an experimental environment developed at the Department of Informatics (DIN) of the State University of Maringá (UEM).

  7. ECG strain pattern in hypertension is associated with myocardial cellular expansion and diffuse interstitial fibrosis: a multi-parametric cardiac magnetic resonance study.

    Science.gov (United States)

    Rodrigues, Jonathan C L; Amadu, Antonio Matteo; Ghosh Dastidar, Amardeep; McIntyre, Bethannie; Szantho, Gergley V; Lyen, Stephen; Godsave, Cattleya; Ratcliffe, Laura E K; Burchell, Amy E; Hart, Emma C; Hamilton, Mark C K; Nightingale, Angus K; Paton, Julian F R; Manghat, Nathan E; Bucciarelli-Ducci, Chiara

    2017-04-01

    In hypertension, the presence of a left ventricular (LV) strain pattern on the 12-lead electrocardiogram (ECG) carries adverse cardiovascular prognosis. The underlying mechanisms are poorly understood. We investigated whether hypertensive ECG strain is associated with myocardial interstitial fibrosis and impaired myocardial strain, assessed by multi-parametric cardiac magnetic resonance (CMR). A total of 100 hypertensive patients [50 ± 14 years, male: 58%, office systolic blood pressure (SBP): 170 ± 30 mmHg, office diastolic blood pressure (DBP): 97 ± 14 mmHg] underwent ECG and 1.5T CMR and were compared with 25 normotensive controls (46 ± 14 years, 60% male, SBP: 124 ± 8 mmHg, DBP: 76 ± 7 mmHg). Native T1 and extracellular volume fraction (ECV) were calculated with the modified look-locker inversion-recovery sequence. Myocardial strain values were estimated with voxel-tracking software. ECG strain (n = 20) was associated with significantly higher indexed LV mass (LVM) (119 ± 32 vs. 80 ± 17 g/m²) than in hypertensive subjects without ECG strain (n = 80). ECG strain subjects had significantly impaired circumferential strain compared with hypertensive subjects without ECG strain and controls (−15.2 ± 4.7 vs. −17.0 ± 3.3 vs. −17.3 ± 2.4%). Comparing ECG strain subjects to hypertensive subjects with elevated LVM but no ECG strain, a significantly higher ECV (30 ± 4 vs. 28 ± 3%) was found, and ECV remained associated with ECG strain in multivariate logistic regression analysis [odds ratio (95% confidence interval): 1.07 (1.02-1.12)]. ECG strain is a marker of advanced LVH associated with increased interstitial fibrosis and with significant myocardial circumferential strain impairment. © The Author 2016. Published by Oxford University Press on behalf of the European Society of Cardiology.
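
    The ECV computation referenced above follows the standard T1-mapping formula, ECV = (1 − Hct) · ΔR1_myocardium / ΔR1_blood with R1 = 1/T1; the sketch below uses hypothetical example values.

    ```python
    # Sketch: extracellular volume fraction from native and post-contrast T1.
    def ecv(t1_myo_pre, t1_myo_post, t1_blood_pre, t1_blood_post, hct):
        """ECV = (1 - Hct) * (dR1_myocardium / dR1_blood), with R1 = 1/T1."""
        dr1_myo = 1.0 / t1_myo_post - 1.0 / t1_myo_pre
        dr1_blood = 1.0 / t1_blood_post - 1.0 / t1_blood_pre
        return (1.0 - hct) * dr1_myo / dr1_blood

    # Hypothetical T1 values in ms and haematocrit 0.42 -> ECV around 27%
    print(f"ECV = {ecv(1050, 450, 1650, 300, 0.42):.1%}")
    ```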

  8. Software support for SBGN maps: SBGN-ML and LibSBGN.

    Science.gov (United States)

    van Iersel, Martijn P; Villéger, Alice C; Czauderna, Tobias; Boyd, Sarah E; Bergmann, Frank T; Luna, Augustin; Demir, Emek; Sorokin, Anatoly; Dogrusoz, Ugur; Matsuoka, Yukiko; Funahashi, Akira; Aladjem, Mirit I; Mi, Huaiyu; Moodie, Stuart L; Kitano, Hiroaki; Le Novère, Nicolas; Schreiber, Falk

    2012-08-01

    LibSBGN is a software library for reading, writing and manipulating Systems Biology Graphical Notation (SBGN) maps stored using the recently developed SBGN-ML file format. The library (available in C++ and Java) makes it easy for developers to add SBGN support to their tools, whereas the file format facilitates the exchange of maps between compatible software applications. The library also supports validation of maps, which simplifies the task of ensuring compliance with the detailed SBGN specifications. With this effort we hope to increase the adoption of SBGN in bioinformatics tools, ultimately enabling more researchers to visualize biological knowledge in a precise and unambiguous manner. Milestone 2 was released in December 2011. Source code, example files and binaries are freely available under the terms of either the LGPL v2.1+ or Apache v2.0 open source licenses from http://libsbgn.sourceforge.net. sbgn-libsbgn@lists.sourceforge.net.

  9. Recent developments in multi-parametric three-dimensional stress field representation in plates weakened by cracks and notches

    Directory of Open Access Journals (Sweden)

    P. Lazzarin

    2013-07-01

    The paper deals with the three-dimensional nature and the multi-parametric representation of the stress field ahead of cracks and notches of different shapes. Finite-thickness plates are considered, under different loading conditions. Under certain hypotheses, the three-dimensional governing equations of elasticity can be reduced to a system where a bi-harmonic equation and a harmonic equation have to be simultaneously satisfied. The former provides the solution of the corresponding plane notch problem; the latter provides the solution of the corresponding out-of-plane shear notch problem. The analytical framework is applied to some notched and cracked geometries, and its degree of accuracy is discussed by comparing theoretical results with numerical data from 3D FE models.
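
    Schematically, the reduced system described above can be written as below; the potentials Φ and ψ are this sketch's notation rather than necessarily the paper's symbols.

    ```latex
    % Reduced 3D governing equations: a bi-harmonic equation for the in-plane
    % problem and a harmonic equation for the out-of-plane shear problem.
    \[
    \nabla^{4}\Phi(x,y) = 0 \qquad \text{(bi-harmonic: corresponding plane notch problem)}
    \]
    \[
    \nabla^{2}\psi(x,y) = 0 \qquad \text{(harmonic: corresponding out-of-plane shear notch problem)}
    \]
    ```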

  10. Multi-Parametric MRI and Texture Analysis to Visualize Spatial Histologic Heterogeneity and Tumor Extent in Glioblastoma.

    Directory of Open Access Journals (Sweden)

    Leland S Hu

    Genetic profiling represents the future of neuro-oncology but suffers from inadequate biopsies in heterogeneous tumors like glioblastoma (GBM). Contrast-enhanced MRI (CE-MRI) targets the enhancing core (ENH) but yields adequate tumor in only ~60% of cases. Further, CE-MRI poorly localizes infiltrative tumor within the surrounding non-enhancing parenchyma, or brain-around-tumor (BAT), despite the importance of characterizing this tumor segment, which universally recurs. In this study, we use multiple texture analysis and machine learning (ML) algorithms to analyze multi-parametric MRI and produce new images indicating tumor-rich targets in GBM. We recruited primary GBM patients undergoing image-guided biopsies and acquired pre-operative MRI: CE-MRI, dynamic susceptibility-weighted contrast-enhanced MRI, and diffusion tensor imaging. Following image coregistration and region-of-interest placement at biopsy locations, we compared MRI metrics and regional texture with histologic diagnoses of high- vs. low-tumor content (≥80% vs. <80% tumor nuclei) for corresponding samples. In a training set, we used three texture analysis algorithms and three ML methods to identify MRI-texture features that optimized model accuracy to distinguish tumor content. We confirmed model accuracy in a separate validation set. We collected 82 biopsies from 18 GBMs throughout ENH and BAT. The MRI-based model achieved 85% cross-validated accuracy to diagnose high- vs. low-tumor content in the training set (60 biopsies, 11 patients). The model achieved 81.8% accuracy in the validation set (22 biopsies, 7 patients). Multi-parametric MRI and texture analysis can help characterize and visualize GBM's spatial histologic heterogeneity to identify regional tumor-rich biopsy targets.

  11. A microbiology-based multi-parametric approach towards assessing biological stability in drinking water distribution networks

    KAUST Repository

    Lautenschläger, Karin; Hwang, Chiachi; Liu, Wentso; Boon, Nico; Köster, Oliver; Vrouwenvelder, Johannes S.; Egli, Thomas; Hammes, Frederik A.

    2013-01-01

    Biological stability of drinking water implies that the concentration of bacterial cells and the composition of the microbial community should not change during distribution. In this study, we used a multi-parametric approach that encompasses different aspects of microbial water quality, including microbial growth potential, microbial abundance, and microbial community composition, to monitor biological stability in drinking water of the non-chlorinated distribution system of Zürich. Drinking water was collected directly after treatment from the reservoir and in the network at several locations with varied average hydraulic retention times (6-52 h) over a period of four months, with a single repetition two years later. Total cell concentrations (TCC) measured with flow cytometry remained remarkably stable at 9.5 (±0.6) × 10^4 cells/ml from water in the reservoir throughout most of the distribution network, and during the whole time period. Conventional microbial parameters like heterotrophic plate counts, the concentration of adenosine triphosphate, total organic carbon and assimilable organic carbon also remained constant. Samples taken two years apart showed more than 80% similarity for the microbial communities analysed with denaturing gradient gel electrophoresis and 454 pyrosequencing. Only the two sampling locations with the longest water retention times were the exceptions and, so far for unknown reasons, recorded a slight but significantly higher TCC (1.3 (±0.1) × 10^5 cells/ml) compared to the other locations. This small change in microbial abundance detected by flow cytometry was also clearly observed in a shift in the microbial community profiles to a higher abundance of members from the Comamonadaceae (60% vs. 2% at other locations). Conventional microbial detection methods were not able to detect changes as observed with flow cytometric cell counts and microbial community analysis. Our findings demonstrate that the multi-parametric approach used provides a powerful

  12. A microbiology-based multi-parametric approach towards assessing biological stability in drinking water distribution networks

    KAUST Repository

    Lautenschläger, Karin

    2013-06-01

    Biological stability of drinking water implies that the concentration of bacterial cells and the composition of the microbial community should not change during distribution. In this study, we used a multi-parametric approach that encompasses different aspects of microbial water quality, including microbial growth potential, microbial abundance, and microbial community composition, to monitor biological stability in drinking water of the non-chlorinated distribution system of Zürich. Drinking water was collected directly after treatment from the reservoir and in the network at several locations with varied average hydraulic retention times (6-52 h) over a period of four months, with a single repetition two years later. Total cell concentrations (TCC) measured with flow cytometry remained remarkably stable at 9.5 (±0.6) × 10^4 cells/ml from water in the reservoir throughout most of the distribution network, and during the whole time period. Conventional microbial parameters like heterotrophic plate counts, the concentration of adenosine triphosphate, total organic carbon and assimilable organic carbon also remained constant. Samples taken two years apart showed more than 80% similarity for the microbial communities analysed with denaturing gradient gel electrophoresis and 454 pyrosequencing. Only the two sampling locations with the longest water retention times were the exceptions and, so far for unknown reasons, recorded a slight but significantly higher TCC (1.3 (±0.1) × 10^5 cells/ml) compared to the other locations. This small change in microbial abundance detected by flow cytometry was also clearly observed in a shift in the microbial community profiles to a higher abundance of members from the Comamonadaceae (60% vs. 2% at other locations). Conventional microbial detection methods were not able to detect changes as observed with flow cytometric cell counts and microbial community analysis. Our findings demonstrate that the multi-parametric approach used provides a powerful

  13. Introducing students to digital geological mapping: A workflow based on cheap hardware and free software

    Science.gov (United States)

    Vrabec, Marko; Dolžan, Erazem

    2016-04-01

    The undergraduate field course in Geological Mapping at the University of Ljubljana involves 20-40 students per year, which precludes the use of specialized rugged digital field equipment as the costs would be way beyond the capabilities of the Department. A different mapping area is selected each year with the aim to provide typical conditions that a professional geologist might encounter when doing fieldwork in Slovenia, which includes rugged relief, dense tree cover, and moderately-well- to poorly-exposed bedrock due to vegetation and urbanization. It is therefore mandatory that the digital tools and workflows are combined with classical methods of fieldwork, since, for example, full-time precise GNSS positioning is not viable under such circumstances. Additionally, due to the prevailing combination of complex geological structure with generally poor exposure, students cannot be expected to produce line (vector) maps of geological contacts on the go, so there is no need for such functionality in hardware and software that we use in the field. Our workflow therefore still relies on paper base maps, but is strongly complemented with digital tools to provide robust positioning, track recording, and acquisition of various point-based data. Primary field hardware are students' Android-based smartphones and optionally tablets. For our purposes, the built-in GNSS chips provide adequate positioning precision most of the time, particularly if they are GLONASS-capable. We use Oruxmaps, a powerful free offline map viewer for the Android platform, which facilitates the use of custom-made geopositioned maps. For digital base maps, which we prepare in free Windows QGIS software, we use scanned topographic maps provided by the National Geodetic Authority, but also other maps such as aerial imagery, processed Digital Elevation Models, scans of existing geological maps, etc. Point data, like important outcrop locations or structural measurements, are entered into Oruxmaps as

  14. Close-range geophotogrammetric mapping of trench walls using multi-model stereo restitution software

    International Nuclear Information System (INIS)

    Coe, J.A.; Taylor, E.M.; Schilling, S.P.

    1991-01-01

    Methods for mapping geologic features exposed on trench walls have advanced from conventional gridding and sketch mapping to precise close-range photogrammetric mapping. In our study, two strips of small-format (60 × 60) stereo pairs, each containing 42 photos and covering approximately 60 m of nearly vertical trench wall (2-4 m high), were contact printed onto eight 205 × 255-mm transparent film sheets. Each strip was oriented in a Kern DSR15 analytical plotter using the bundle adjustment module of Multi-Model Stereo Restitution Software (MMSRS). We experimented with several systematic-control-point configurations to evaluate orientation accuracies as a function of the number and position of control points. We recommend establishing control-point columns (each containing 2-3 points) in every 5th photo to achieve the 7-mm Root Mean Square Error (RMSE) accuracy required by our trench-mapping project. 7 refs., 8 figs., 1 tab.

  15. Close-range geophotogrammetric mapping of trench walls using multi-model stereo restitution software

    Energy Technology Data Exchange (ETDEWEB)

    Coe, J.A.; Taylor, E.M.; Schilling, S.P.

    1991-06-01

    Methods for mapping geologic features exposed on trench walls have advanced from conventional gridding and sketch mapping to precise close-range photogrammetric mapping. In our study, two strips of small-format (60 × 60) stereo pairs, each containing 42 photos and covering approximately 60 m of nearly vertical trench wall (2-4 m high), were contact printed onto eight 205 × 255-mm transparent film sheets. Each strip was oriented in a Kern DSR15 analytical plotter using the bundle adjustment module of Multi-Model Stereo Restitution Software (MMSRS). We experimented with several systematic-control-point configurations to evaluate orientation accuracies as a function of the number and position of control points. We recommend establishing control-point columns (each containing 2-3 points) in every 5th photo to achieve the 7-mm Root Mean Square Error (RMSE) accuracy required by our trench-mapping project. 7 refs., 8 figs., 1 tab.
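
    The 7-mm accuracy figure quoted in both records is a root-mean-square error over control-point residuals after orientation. A minimal sketch of that computation (the residuals are invented; the actual MMSRS adjustment output is not given in the abstract):

```python
import numpy as np

# Hypothetical 3-D residuals (mm) at check points after bundle adjustment;
# values are invented for illustration.
residuals = np.array([
    [3.1, -2.4, 4.0],
    [-1.2, 5.0, -3.3],
    [2.2, 1.8, -2.9],
])

# RMSE = sqrt(mean of squared coordinate residuals).
rmse = np.sqrt(np.mean(residuals ** 2))
print(f"RMSE = {rmse:.1f} mm")  # should be <= 7 mm to meet the project spec
```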

  16. A Systematic Mapping on Supporting Approaches for Requirements Traceability in the Context of Software Projects

    Directory of Open Access Journals (Sweden)

    MALCHER, P R.C.

    2015-12-01

    Full Text Available Requirements traceability is seen as a quality factor with regard to software development, being present in standards and quality models. In this context, several techniques, models, frameworks and tools have been used to support it. Thus, the purpose of this paper is to present a systematic mapping carried out to find approaches in the literature that support requirements traceability in the context of software projects, and to categorize the data found in order to demonstrate, by means of a reliable, accurate and auditable method, how this area has developed and which main approaches are used to implement it.

  17. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan part 2 mappings for the ASC software quality engineering practices, version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Heaphy, Robert; Sturtevant, Judith E.; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR001.3.2 and CPR001.3.6 and to a Department of Energy document, "ASCI Software Quality Engineering: Goals, Principles, and Guidelines". This document also identifies ASC management and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  18. IClinfMRI Software for Integrating Functional MRI Techniques in Presurgical Mapping and Clinical Studies.

    Science.gov (United States)

    Hsu, Ai-Ling; Hou, Ping; Johnson, Jason M; Wu, Changwei W; Noll, Kyle R; Prabhu, Sujit S; Ferguson, Sherise D; Kumar, Vinodh A; Schomer, Donald F; Hazle, John D; Chen, Jyh-Horng; Liu, Ho-Ling

    2018-01-01

    Task-evoked and resting-state (rs) functional magnetic resonance imaging (fMRI) techniques have been applied to the clinical management of neurological diseases, exemplified by presurgical localization of eloquent cortex, to assist neurosurgeons in maximizing resection while preserving brain functions. In addition, recent studies have recommended incorporating cerebrovascular reactivity (CVR) imaging into clinical fMRI to evaluate the risk of lesion-induced neurovascular uncoupling (NVU). Although each of these imaging techniques possesses its own advantages for presurgical mapping, specialized clinical software that integrates the three complementary techniques and promptly outputs the analyzed results to radiology and surgical navigation systems in a clinical format has been lacking. We developed the Integrated fMRI for Clinical Research (IClinfMRI) software to facilitate these needs. Beyond the independent processing of task-fMRI, rs-fMRI, and CVR mapping, IClinfMRI encompasses three unique functions: (1) supporting interactive rs-fMRI mapping while visualizing task-fMRI results (or results from published meta-analyses) as a guidance map, (2) indicating/visualizing the NVU potential on analyzed fMRI maps, and (3) exporting these advanced mapping results in Digital Imaging and Communications in Medicine (DICOM) format, ready for export to a picture archiving and communication system (PACS) and a surgical navigation system. In summary, IClinfMRI has the merits of efficiently translating and integrating state-of-the-art imaging techniques for presurgical functional mapping and clinical fMRI studies.

  19. A microbiology-based multi-parametric approach towards assessing biological stability in drinking water distribution networks.

    Science.gov (United States)

    Lautenschlager, Karin; Hwang, Chiachi; Liu, Wen-Tso; Boon, Nico; Köster, Oliver; Vrouwenvelder, Hans; Egli, Thomas; Hammes, Frederik

    2013-06-01

    Biological stability of drinking water implies that the concentration of bacterial cells and the composition of the microbial community should not change during distribution. In this study, we used a multi-parametric approach that encompasses different aspects of microbial water quality, including microbial growth potential, microbial abundance, and microbial community composition, to monitor biological stability in the drinking water of the non-chlorinated distribution system of Zürich. Drinking water was collected directly after treatment from the reservoir and in the network at several locations with varied average hydraulic retention times (6-52 h) over a period of four months, with a single repetition two years later. Total cell concentrations (TCC) measured with flow cytometry remained remarkably stable at 9.5 (±0.6) × 10⁴ cells/ml from water in the reservoir throughout most of the distribution network, and during the whole time period. Conventional microbial parameters such as heterotrophic plate counts, adenosine tri-phosphate concentration, total organic carbon and assimilable organic carbon also remained constant. Samples taken two years apart showed more than 80% similarity for the microbial communities analysed with denaturing gradient gel electrophoresis and 454 pyrosequencing. Only the two sampling locations with the longest water retention times were the exceptions and, so far for unknown reasons, recorded a slightly but significantly higher TCC (1.3 (±0.1) × 10⁵ cells/ml) compared to the other locations. This small change in microbial abundance detected by flow cytometry was also clearly observed as a shift in the microbial community profiles towards a higher abundance of members of the Comamonadaceae (60% vs. 2% at other locations). Conventional microbial detection methods were not able to detect the changes observed with flow cytometric cell counts and microbial community analysis. Our findings demonstrate that the multi-parametric approach used provides a powerful tool for assessing biological stability in drinking water distribution networks.

  20. Software Development Initiatives to Identify and Mitigate Security Threats - Two Systematic Mapping Studies

    Directory of Open Access Journals (Sweden)

    Paulina Silva

    2016-12-01

    Full Text Available Software security and development experts have addressed the problem of building secure software systems. There are several processes and initiatives for achieving secure software systems. However, most of these lack empirical evidence of their application and impact in building secure software systems. Two systematic mapping (SM) studies have been conducted to cover the existing initiatives for identification and mitigation of security threats. The SMs were executed in two steps: first in July 2015, and complemented through backward snowballing in July 2016. Integrated results of these two SM studies show that a total of 30 relevant sources were identified; 17 different initiatives covering the identification of threats and 14 covering the mitigation of threats were found. All the initiatives were associated with at least one activity of the Software Development Lifecycle (SDLC); while 6 showed signs of being applied in industrial settings, only 3 initiatives presented experimental evidence of their results through controlled experiments; some of the other selected studies presented case studies or proposals.

  1. Una introduzione ai software per il crime mapping / Observations préliminaires sur les logiciels du mappage du crime / Some introductory notes on crime mapping software

    Directory of Open Access Journals (Sweden)

    Ummarino Alessandro

    2013-03-01

    Full Text Available Crime mapping, more than a discipline in its own right, is simply the application of statistical-geographical analysis techniques to the study of crime. Thanks to GIS (Geographic Information System) software, the exponential development of information technology, and easy access to the web, the production of quality maps is now within the reach of any average user. The possibility of applying such analysis techniques is effectively offered both by commercial GIS software and by free and open-source GIS software. Anyone wishing to approach this discipline, whether intending to proceed with tactical applications (planning of patrols, prevention activities, judicial investigations, etc.) or to carry out sociological studies (criminality, deviance, widespread illegality, perception of security, etc.), must in any case acquire a solid grounding in the use of GIS programs before inferring generalizations from the results using interpretive keys drawn from the social sciences. Crime mapping can find valid application within general police work, especially at the local level, for managing the resources devoted to security, for planning police services and, above all, as tactical support for activities aimed at the repression and prevention of specific criminal and unlawful acts.

  2. Histological correlation of 7 T multi-parametric MRI performed in ex-vivo Achilles tendon

    Energy Technology Data Exchange (ETDEWEB)

    Juras, Vladimir [Center of Excellence for High Field MR, Department of Radiology, Medical University of Vienna Waehringer Guertel 18-20, A-1090, Vienna (Austria); Institute of Measurement Science, Department of Imaging Methods, Dubravska cesta 9, 84104, Bratislava (Slovakia); Apprich, Sebastian; Pressl, Christina; Zbyn, Stefan [Center of Excellence for High Field MR, Department of Radiology, Medical University of Vienna Waehringer Guertel 18-20, A-1090, Vienna (Austria); Szomolanyi, Pavol [Center of Excellence for High Field MR, Department of Radiology, Medical University of Vienna Waehringer Guertel 18-20, A-1090, Vienna (Austria); Institute of Measurement Science, Department of Imaging Methods, Dubravska cesta 9, 84104, Bratislava (Slovakia); Domayer, Stephan; Hofstaetter, Jochen G. [Department of Orthopedic Surgery, Vienna General Hospital, Medical University of Vienna, A-1090 Vienna (Austria); Trattnig, Siegfried, E-mail: siegfried.trattnig@meduniwien.ac.at [Center of Excellence for High Field MR, Department of Radiology, Medical University of Vienna Waehringer Guertel 18-20, A-1090, Vienna (Austria)

    2013-05-15

    Introduction: The goal of this in vitro validation study was to investigate the feasibility of biochemical MRI techniques, such as sodium imaging, T2 mapping, fast imaging with steady state precession (FISP), and reversed FISP (PSIF), as potential markers for collagen, glycosaminoglycan and water content in the Achilles tendon. Materials and methods: Five fresh cadaver ankles acquired from a local anatomy department were used in the study. To acquire a sodium signal from the Achilles tendon, a 3D-gradient-echo sequence, optimized for sodium imaging, was used with TE = 7.71 ms and TR = 17 ms. The T2 relaxation times were obtained using a multi-echo, spin-echo technique with a repetition time (TR) of 1200 ms and six echo times. A 3D, partially balanced, steady-state gradient echo pulse sequence was used to acquire FISP and PSIF images, with TR/TE = 6.96/2.46 ms. MRI parameters were correlated with each other, as well as with histologically assessed glycosaminoglycan and water content in cadaver Achilles tendons. Results: The highest relevant Pearson correlation coefficient was found between sodium SNR and glycosaminoglycan content (r = 0.71, p = 0.007). Relatively high correlation was found between the PSIF signal and T2 values (r = 0.51, p = 0.036), and between the FISP signal and T2 values (r = 0.56, p = 0.047). Other correlations were found to be below the moderate level. Conclusion: This study demonstrated the feasibility of progressive biochemical MRI methods for the imaging of the AT. A GAG-specific, contrast-free method (sodium imaging), as well as collagen- and water-sensitive methods (T2 mapping, FISP, PSIF), may be used in fast-relaxing tissues, such as tendons, in reasonable scan times.

  3. Histological correlation of 7 T multi-parametric MRI performed in ex-vivo Achilles tendon

    International Nuclear Information System (INIS)

    Juras, Vladimir; Apprich, Sebastian; Pressl, Christina; Zbyn, Stefan; Szomolanyi, Pavol; Domayer, Stephan; Hofstaetter, Jochen G.; Trattnig, Siegfried

    2013-01-01

    Introduction: The goal of this in vitro validation study was to investigate the feasibility of biochemical MRI techniques, such as sodium imaging, T2 mapping, fast imaging with steady state precession (FISP), and reversed FISP (PSIF), as potential markers for collagen, glycosaminoglycan and water content in the Achilles tendon. Materials and methods: Five fresh cadaver ankles acquired from a local anatomy department were used in the study. To acquire a sodium signal from the Achilles tendon, a 3D-gradient-echo sequence, optimized for sodium imaging, was used with TE = 7.71 ms and TR = 17 ms. The T2 relaxation times were obtained using a multi-echo, spin-echo technique with a repetition time (TR) of 1200 ms and six echo times. A 3D, partially balanced, steady-state gradient echo pulse sequence was used to acquire FISP and PSIF images, with TR/TE = 6.96/2.46 ms. MRI parameters were correlated with each other, as well as with histologically assessed glycosaminoglycan and water content in cadaver Achilles tendons. Results: The highest relevant Pearson correlation coefficient was found between sodium SNR and glycosaminoglycan content (r = 0.71, p = 0.007). Relatively high correlation was found between the PSIF signal and T2 values (r = 0.51, p = 0.036), and between the FISP signal and T2 values (r = 0.56, p = 0.047). Other correlations were found to be below the moderate level. Conclusion: This study demonstrated the feasibility of progressive biochemical MRI methods for the imaging of the AT. A GAG-specific, contrast-free method (sodium imaging), as well as collagen- and water-sensitive methods (T2 mapping, FISP, PSIF), may be used in fast-relaxing tissues, such as tendons, in reasonable scan times.

  4. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed software to help manage large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; core photos and image analyses; waveforms and NMR; and external file documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board; the product includes emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software that features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In

  5. Integrating science and education during an international, multi-parametric investigation of volcanic activity at Santiaguito volcano, Guatemala

    Science.gov (United States)

    Lavallée, Yan; Johnson, Jeffrey; Andrews, Benjamin; Wolf, Rudiger; Rose, William; Chigna, Gustavo; Pineda, Armand

    2016-04-01

    In January 2016, we held the first scientific/educational Workshops on Volcanoes (WoV). The workshop took place at Santiaguito volcano - the most active volcano in Guatemala. 69 international scientists of all ages participated in this intensive, multi-parametric investigation of the volcanic activity, which included the deployment of seismometers, tiltmeters, infrasound microphones and mini-DOAS as well as optical, thermographic, UV and FTIR cameras around the active vent. These instruments recorded volcanic activity in concert over a period of 3 to 9 days. Here we review the research activities and present some of the spectacular observations made through this interdisciplinary effort. Observations range from high-resolution drone and IR footage of explosions, to monitoring of rock falls, quantification of the erupted mass of different gases and ash, and morphological changes in the dome caused by recurring explosions (amongst many other volcanic processes). We will discuss the success of such integrative ventures in furthering science frontiers and developing the next generation of geoscientists.

  6. Multi-Parametric MRI and Texture Analysis to Visualize Spatial Histologic Heterogeneity and Tumor Extent in Glioblastoma.

    Science.gov (United States)

    Hu, Leland S; Ning, Shuluo; Eschbacher, Jennifer M; Gaw, Nathan; Dueck, Amylou C; Smith, Kris A; Nakaji, Peter; Plasencia, Jonathan; Ranjbar, Sara; Price, Stephen J; Tran, Nhan; Loftus, Joseph; Jenkins, Robert; O'Neill, Brian P; Elmquist, William; Baxter, Leslie C; Gao, Fei; Frakes, David; Karis, John P; Zwart, Christine; Swanson, Kristin R; Sarkaria, Jann; Wu, Teresa; Mitchell, J Ross; Li, Jing

    2015-01-01

    Genetic profiling represents the future of neuro-oncology but suffers from inadequate biopsies in heterogeneous tumors like Glioblastoma (GBM). Contrast-enhanced MRI (CE-MRI) targets enhancing core (ENH) but yields adequate tumor in only ~60% of cases. Further, CE-MRI poorly localizes infiltrative tumor within surrounding non-enhancing parenchyma, or brain-around-tumor (BAT), despite the importance of characterizing this tumor segment, which universally recurs. In this study, we use multiple texture analysis and machine learning (ML) algorithms to analyze multi-parametric MRI, and produce new images indicating tumor-rich targets in GBM. We recruited primary GBM patients undergoing image-guided biopsies and acquired pre-operative MRI: CE-MRI, Dynamic-Susceptibility-weighted-Contrast-enhanced-MRI, and Diffusion Tensor Imaging. Following image coregistration and region of interest placement at biopsy locations, we compared MRI metrics and regional texture with histologic diagnoses of high- vs low-tumor content (≥80% vs <80%), leveraging regional heterogeneity to identify regional tumor-rich biopsy targets.

  7. Computer mapping software and geographic data base development: Oak Ridge National Laboratory user experience

    International Nuclear Information System (INIS)

    Honea, B.; Johnson, P.

    1978-01-01

    As users of computer display tools, our opinion is that the researcher's needs should guide and direct the computer scientist's development of mapping software and data bases. Computer graphic techniques developed for the sake of the computer graphics community tend to be esoteric and rarely suitable for user problems. Two types of users exist for computer graphic tools: the researcher, who is generally satisfied with abstract but accurate displays for analysis purposes, and the decision maker, who requires synoptic and easily comprehended displays relevant to the issues he or she must address. Computer mapping software and data bases should be developed for the user in a generalized and standardized format for ease of transfer and to facilitate linking or merging with larger analysis systems. Maximum utility of computer mapping tools is accomplished when they are linked to geographic information and analysis systems. Computer graphic techniques have varying degrees of utility depending upon whether they are used for data validation, analysis procedures or presenting research results.

  8. Zone-specific logistic regression models improve classification of prostate cancer on multi-parametric MRI

    Energy Technology Data Exchange (ETDEWEB)

    Dikaios, Nikolaos; Halligan, Steve; Taylor, Stuart; Atkinson, David; Punwani, Shonit [University College London, Centre for Medical Imaging, London (United Kingdom); University College London Hospital, Departments of Radiology, London (United Kingdom); Alkalbani, Jokha; Sidhu, Harbir Singh [University College London, Centre for Medical Imaging, London (United Kingdom); Abd-Alazeez, Mohamed; Ahmed, Hashim U.; Emberton, Mark [University College London, Research Department of Urology, Division of Surgery and Interventional Science, London (United Kingdom); Kirkham, Alex [University College London Hospital, Departments of Radiology, London (United Kingdom); Freeman, Alex [University College London Hospital, Department of Histopathology, London (United Kingdom)

    2015-09-15

    To assess the interchangeability of zone-specific (peripheral-zone (PZ) and transition-zone (TZ)) multiparametric-MRI (mp-MRI) logistic-regression (LR) models for classification of prostate cancer. Two hundred and thirty-one patients (70 TZ training-cohort; 76 PZ training-cohort; 85 TZ temporal validation-cohort) underwent mp-MRI and transperineal-template-prostate-mapping biopsy. PZ and TZ uni/multi-variate mp-MRI LR-models for classification of significant cancer (any cancer-core-length (CCL) with Gleason > 3 + 3 or any grade with CCL ≥ 4 mm) were derived from the respective cohorts and validated within the same zone by leave-one-out analysis. Inter-zonal performance was tested by applying TZ models to the PZ training-cohort and vice versa. Classification performance of TZ models for TZ cancer was further assessed in the TZ validation-cohort. ROC area-under-curve (ROC-AUC) analysis was used to compare models. The univariate parameters with the best classification performance were the normalised T2 signal (T2nSI) within the TZ (ROC-AUC = 0.77) and the normalised early contrast-enhanced T1 signal (DCE-nSI) within the PZ (ROC-AUC = 0.79). Performance was not significantly improved by bi-variate/tri-variate modelling. PZ models that contained DCE-nSI performed poorly in classification of TZ cancer. The TZ model based solely on maximum-enhancement poorly classified PZ cancer. LR-models dependent on DCE-MRI parameters alone are not interchangeable between prostatic zones; however, models based exclusively on T2 and/or ADC are more robust for inter-zonal application. (orig.)
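
    The leave-one-out validation of a univariate logistic-regression classifier described above can be sketched as follows; scikit-learn is assumed, and the feature/label arrays are placeholders, not the study's data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import LeaveOneOut, cross_val_predict

# Placeholder data: one normalised MR parameter (e.g. T2nSI) per lesion,
# with 0 = benign, 1 = significant cancer. Values invented for illustration.
rng = np.random.default_rng(42)
X = np.concatenate([rng.normal(0.8, 0.2, 40), rng.normal(1.2, 0.2, 40)]).reshape(-1, 1)
y = np.concatenate([np.zeros(40), np.ones(40)])

# Leave-one-out: each case is scored by a model trained on all other cases.
probs = cross_val_predict(LogisticRegression(), X, y,
                          cv=LeaveOneOut(), method="predict_proba")[:, 1]
print(f"ROC-AUC = {roc_auc_score(y, probs):.2f}")
```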

  9. Comparison of Absolute Apparent Diffusion Coefficient (ADC) Values in ADC Maps Generated Across Different Postprocessing Software: Reproducibility in Endometrial Carcinoma.

    Science.gov (United States)

    Ghosh, Adarsh; Singh, Tulika; Singla, Veenu; Bagga, Rashmi; Khandelwal, Niranjan

    2017-12-01

    Apparent diffusion coefficient (ADC) maps are usually generated by built-in software provided by the MRI scanner vendors; however, various open-source postprocessing software packages are available for image manipulation and parametric map generation. The purpose of this study is to establish the reproducibility of absolute ADC values obtained using different postprocessing software programs. DW images with three b-values were acquired on a 1.5-T MRI scanner, and trace images were generated. ADC maps were automatically generated by the in-line software provided by the vendor during image generation and were also separately generated with postprocessing software. These ADC maps were compared on the basis of ROIs using the paired t-test, Bland-Altman plots, mountain plots, and Passing-Bablok regression. There was a statistically significant difference in the mean ADC values obtained from the different postprocessing software programs when the same baseline trace DW images were used for ADC map generation. For using ADC values as a quantitative cutoff for histologic characterization of tissues, standardization of the postprocessing algorithm across software packages is essential, especially in view of the implementation of vendor-neutral archiving.
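
    The discrepancies reported above arise in the ADC fit itself. For the mono-exponential model S(b) = S0·exp(-b·ADC), a two-b-value ADC map reduces to a closed-form expression per pixel; a sketch on synthetic data (real packages differ in masking, averaging and fitting details, which is exactly what the study probes):

```python
import numpy as np

# Synthetic trace DW images at two b-values (s/mm^2); values illustrative.
rng = np.random.default_rng(0)
b0, b1 = 0.0, 800.0
true_adc = 1.0e-3  # mm^2/s, a typical soft-tissue value
s_b0 = 1000.0 + rng.normal(0, 5, (64, 64))
s_b1 = 1000.0 * np.exp(-b1 * true_adc) + rng.normal(0, 5, (64, 64))

# Mono-exponential model S(b) = S0 * exp(-b * ADC) solved for ADC.
adc_map = np.log(np.clip(s_b0, 1e-6, None) / np.clip(s_b1, 1e-6, None)) / (b1 - b0)
print(f"mean ADC = {adc_map.mean():.2e} mm^2/s")
```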

  10. The role of multi-parametric MR imaging in the detection of early inflammatory sacroiliitis according to ASAS criteria

    Energy Technology Data Exchange (ETDEWEB)

    Boy, Fatma Nur, E-mail: nursoylu@yahoo.com [Department of Radiology, Fatih Sultan Mehmet Training and Research Hospital, E-5 Karayolu Uzeri, 34752 Atasehir, Istanbul (Turkey); Kayhan, Arda, E-mail: arda_kayhan@yahoo.com [Health Services Vocational School, Esenyurt University, Dogan Arasli Bulvari No 120, Esenyurt, Istanbul (Turkey); Karakas, Hakki Muammer, E-mail: hakki.karakas@gmail.com [Department of Radiology, Fatih Sultan Mehmet Training and Research Hospital, E-5 Karayolu Uzeri, 34752 Atasehir, Istanbul (Turkey); Unlu-Ozkan, Feyza, E-mail: feyzamd@yahoo.com [Department of Physical Therapy and Rehabilitation, Fatih Sultan Mehmet Training and Research Hospital, E-5 Karayolu Uzeri, 34752 Atasehir, Istanbul (Turkey); Silte, Duygu, E-mail: drduygusilte@hotmail.com [Department of Physical Therapy and Rehabilitation, Fatih Sultan Mehmet Training and Research Hospital, E-5 Karayolu Uzeri, 34752 Atasehir, Istanbul (Turkey); Aktas, İlknur, E-mail: iaktas@hotmail.com [Department of Physical Therapy and Rehabilitation, Fatih Sultan Mehmet Training and Research Hospital, E-5 Karayolu Uzeri, 34752 Atasehir, Istanbul (Turkey)

    2014-06-15

    Purpose: To retrospectively evaluate the accuracy of multi-parametric magnetic resonance (MR) imaging including fat saturated (FS) T2-weighted, short-tau inversion recovery (STIR), diffusion-weighted (DW-MR), and dynamic-contrast-enhanced MR (DCE-MR) imaging techniques in the diagnosis of early inflammatory sacroiliitis and determine the additional value of DW-MR and DCE-MR images according to the recently defined ‘Assessment in SpondyloArthritis international Society’ criteria. Materials and methods: The study included 45 patients with back pain. Two radiologists estimated the likelihood of osteitis in 4 independent viewing sessions including FS T2-weighted, STIR, DW-MR and DCE-MR images. Sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and area under the receiver operating characteristic (ROC) curve (AUC) were calculated. Results: Of the 45 patients, 31 had inflammatory back pain. Of these 31, 28 (90.3%) patients had inflammatory sacroiliitis diagnosed by clinical and laboratory analysis. FS T2-weighted MR images had the highest sensitivity (42.8% for both radiologists) for detecting osteitis in patients with inflammatory sacroiliitis when compared to other imaging sequences. For specificity, PPV, NPV, accuracy, and AUC levels there was no statistically significant difference between image viewing settings. Moreover, adding STIR, DW-MR and DCE-MR images to the FS T2-weighted MR images did not improve the above-stated indices. Conclusion: FS T2-weighted MR imaging had the highest sensitivity when compared to other imaging sequences. The addition of DW-MR and DCE-MR images did not significantly improve the diagnostic value of MR imaging in the diagnosis of osteitis for both experienced and less experienced radiologists.

  11. A multi-parametric imaging investigation of the response of C6 glioma xenografts to MLN0518 (tandutinib treatment.

    Directory of Open Access Journals (Sweden)

    Jessica K R Boult

    Full Text Available Angiogenesis, the development of new blood vessels, is essential for tumour growth; this process is stimulated by the secretion of numerous growth factors, including platelet derived growth factor (PDGF). PDGF signalling, through its receptor platelet derived growth factor receptor (PDGFR), is involved in vessel maturation, stimulation of angiogenesis and upregulation of other angiogenic factors, including vascular endothelial growth factor (VEGF). PDGFR is a promising target for anti-cancer therapy because it is expressed on both tumour cells and stromal cells associated with the vasculature. MLN0518 (tandutinib) is a potent inhibitor of type III receptor tyrosine kinases that demonstrates activity against PDGFRα/β, FLT3 and c-KIT. In this study a multi-parametric MRI and histopathological approach was used to interrogate changes in vascular haemodynamics, structural response and hypoxia in C6 glioma xenografts in response to treatment with MLN0518. The doubling time of tumours in mice treated with MLN0518 was significantly longer than tumours in vehicle treated mice. The perfused vessel area, number of alpha smooth muscle actin positive vessels and hypoxic area in MLN0518 treated tumours were also significantly lower after 10 days treatment. These changes were not accompanied by alterations in vessel calibre or fractional blood volume as assessed using susceptibility contrast MRI. Histological assessment of vessel size and total perfused area did not demonstrate any change with treatment. Intrinsic susceptibility MRI did not reveal any difference in baseline R2* or carbogen-induced change in R2*. Dynamic contrast-enhanced MRI revealed anti-vascular effects of MLN0518 following 3 days treatment. Hypoxia confers chemo- and radio-resistance, and alongside PDGF, is implicated in evasive resistance to agents targeted against VEGF signalling. PDGFR antagonists may improve potency and efficacy of other therapeutics in combination. This study highlights

  12. A multi-parametric imaging investigation of the response of C6 glioma xenografts to MLN0518 (tandutinib) treatment.

    Science.gov (United States)

    Boult, Jessica K R; Terkelsen, Jennifer; Walker-Samuel, Simon; Bradley, Daniel P; Robinson, Simon P

    2013-01-01

    Angiogenesis, the development of new blood vessels, is essential for tumour growth; this process is stimulated by the secretion of numerous growth factors including platelet derived growth factor (PDGF). PDGF signalling, through its receptor platelet derived growth factor receptor (PDGFR), is involved in vessel maturation, stimulation of angiogenesis and upregulation of other angiogenic factors, including vascular endothelial growth factor (VEGF). PDGFR is a promising target for anti-cancer therapy because it is expressed on both tumour cells and stromal cells associated with the vasculature. MLN0518 (tandutinib) is a potent inhibitor of type III receptor tyrosine kinases that demonstrates activity against PDGFRα/β, FLT3 and c-KIT. In this study a multi-parametric MRI and histopathological approach was used to interrogate changes in vascular haemodynamics, structural response and hypoxia in C6 glioma xenografts in response to treatment with MLN0518. The doubling time of tumours in mice treated with MLN0518 was significantly longer than tumours in vehicle treated mice. The perfused vessel area, number of alpha smooth muscle actin positive vessels and hypoxic area in MLN0518 treated tumours were also significantly lower after 10 days treatment. These changes were not accompanied by alterations in vessel calibre or fractional blood volume as assessed using susceptibility contrast MRI. Histological assessment of vessel size and total perfused area did not demonstrate any change with treatment. Intrinsic susceptibility MRI did not reveal any difference in baseline R2* or carbogen-induced change in R2*. Dynamic contrast-enhanced MRI revealed anti-vascular effects of MLN0518 following 3 days treatment. Hypoxia confers chemo- and radio-resistance, and alongside PDGF, is implicated in evasive resistance to agents targeted against VEGF signalling. PDGFR antagonists may improve potency and efficacy of other therapeutics in combination. This study highlights the challenges

  13. The role of multi-parametric MR imaging in the detection of early inflammatory sacroiliitis according to ASAS criteria

    International Nuclear Information System (INIS)

    Boy, Fatma Nur; Kayhan, Arda; Karakas, Hakki Muammer; Unlu-Ozkan, Feyza; Silte, Duygu; Aktas, İlknur

    2014-01-01

    Purpose: To retrospectively evaluate the accuracy of multi-parametric magnetic resonance (MR) imaging including fat saturated (FS) T2-weighted, short-tau inversion recovery (STIR), diffusion-weighted (DW-MR), and dynamic-contrast-enhanced MR (DCE-MR) imaging techniques in the diagnosis of early inflammatory sacroiliitis and determine the additional value of DW-MR and DCE-MR images according to the recently defined ‘Assessment in SpondyloArthritis international Society’ criteria. Materials and methods: The study included 45 patients with back pain. Two radiologists estimated the likelihood of osteitis in 4 independent viewing sessions including FS T2-weighted, STIR, DW-MR and DCE-MR images. Sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and area under the receiver operating characteristic (ROC) curve (AUC) were calculated. Results: Of the 45 patients, 31 had inflammatory back pain. Of these 31, 28 (90.3%) patients had inflammatory sacroiliitis diagnosed by clinical and laboratory analysis. FS T2-weighted MR images had the highest sensitivity (42.8% for both radiologists) for detecting osteitis in patients with inflammatory sacroiliitis when compared to other imaging sequences. For specificity, PPV, NPV, accuracy, and AUC levels there was no statistically significant difference between image viewing settings. Moreover, adding STIR, DW-MR and DCE-MR images to the FS T2-weighted MR images did not improve the above-stated indices. Conclusion: FS T2-weighted MR imaging had the highest sensitivity when compared to other imaging sequences. The addition of DW-MR and DCE-MR images did not significantly improve the diagnostic value of MR imaging in the diagnosis of osteitis for both experienced and less experienced radiologists.

  14. Multi-parametric MRI at 14T for muscular dystrophy mice treated with AAV vector-mediated gene therapy.

    Directory of Open Access Journals (Sweden)

    Joshua Park

    Full Text Available The objective of this study was to investigate the efficacy of using quantitative magnetic resonance imaging (MRI) as a non-invasive tool for the monitoring of gene therapy for muscular dystrophy. The clinical investigations for this family of diseases often involve surgical biopsy, which limits the amount of information that can be obtained due to the invasive nature of the procedure. Thus, other non-invasive tools may provide more opportunities for disease assessment and treatment responses. In order to explore this, dystrophic mdx4cv mice were systemically treated with a recombinant adeno-associated viral (AAV) vector containing a codon-optimized micro-dystrophin gene. Multi-parametric MRI of T2, magnetization transfer, and diffusion effects alongside 3-D volume measurements were then utilized to monitor disease/treatment progression. Mice were imaged at 10 weeks of age for pre-treatment, then again post-treatment at 8, 16, and 24 week time points. The efficacy of treatment was assessed by physiological assays for improvements in function and quantification of expression. Tissues from the hindlimbs were collected for histological analysis after the final time point for comparison with MRI results. We found that introduction of the micro-dystrophin gene restored some aspects of normal muscle histology and pathology, such as decreased necrosis and resistance to contraction-induced injury. T2 relaxation values showed percentage decreases across all muscle types measured (tibialis anterior, gastrocnemius, and soleus) when treated groups were compared to untreated groups. Additionally, the differences between groups were statistically significant for the tibialis anterior as well. The diffusion measurements showed a wider range of percentage changes and less statistical significance, while the magnetization transfer effect measurements showed minimal change. MR images displayed hyper-intense regions of muscle that correlated with muscle pathology in

  15. Introduction to SNPP/VIIRS Flood Mapping Software Version 1.0

    Science.gov (United States)

    Li, S.; Sun, D.; Goldberg, M.; Sjoberg, W.; Santek, D.; Hoffman, J.

    2017-12-01

    Near real-time satellite-derived flood maps are invaluable to river forecasters and decision-makers for disaster monitoring and relief efforts. With support from the JPSS (Joint Polar Satellite System) Proving Ground and Risk Reduction (PGRR) Program, flood detection software has been developed using Suomi-NPP/VIIRS (Suomi National Polar-orbiting Partnership/Visible Infrared Imaging Radiometer Suite) imagery to automatically generate near real-time flood maps for National Weather Service (NWS) River Forecast Centers (RFC) in the USA. The software, which is called VIIRS NOAA GMU Flood Version 1.0 (hereafter referred to as VNG Flood V1.0), consists of a series of algorithms that include water detection, cloud shadow removal, terrain shadow removal, minor flood detection, water fraction retrieval, and floodwater determination. The software is designed for flood detection in any land region between 80°S and 80°N, and it has been running routinely with direct broadcast SNPP/VIIRS data at the Space Science and Engineering Center at the University of Wisconsin-Madison (UW/SSEC) and the Geographic Information Network of Alaska at the University of Alaska-Fairbanks (UAF/GINA) since 2014. Near real-time flood maps are distributed via the Unidata Local Data Manager (LDM), reviewed by river forecasters in AWIPS-II (the second generation of the Advanced Weather Interactive Processing System) and applied in flood operations. Initial feedback from operational forecasters on the product accuracy and performance has been largely positive. The software capability has also been extended to areas outside of the USA via a case-driven mode to detect major floods all over the world. Offline validation efforts include the visual inspection of over 10,000 VIIRS false-color composite images, an inter-comparison with MODIS automatic flood products and a quantitative evaluation using Landsat imagery. The steady performance from the 3-year routine process and the promising validation results
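
    One step in such pipelines, water-fraction retrieval, is often posed as a two-endmember linear mixture of water and land reflectance. A minimal sketch of that idea (the endmember values and the unmixing form are illustrative assumptions, not the VNG Flood V1.0 algorithm):

```python
import numpy as np

# Hypothetical near-infrared reflectance image; water is dark in the NIR.
rng = np.random.default_rng(3)
nir = rng.uniform(0.02, 0.35, size=(100, 100))

# Assumed pure-pixel endmember reflectances (illustrative values only).
R_WATER, R_LAND = 0.03, 0.30

# Linear mixing: R_pixel = f * R_water + (1 - f) * R_land; solve for f.
water_fraction = np.clip((R_LAND - nir) / (R_LAND - R_WATER), 0.0, 1.0)

flood_mask = water_fraction > 0.5  # simple majority-water threshold
print(f"flooded pixels: {flood_mask.sum()} of {flood_mask.size}")
```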

  16. Concept Maps as a strategy to asses learning in biochemistry using educational softwares

    Directory of Open Access Journals (Sweden)

    A. M. P. Azevedo

    2005-07-01

    Full Text Available This abstract reports the use of concept maps applied to the evaluation of concepts learned through an educational software package for studying metabolic pathways, called Diagrama Metabolico Dinamico Virtual do Ciclo de Krebs (DMDV). Experience with the use of this method was gained with two distinct groups of students. The first group was composed of 24 students (in 2003) who used DMDV during classes (computer room). The second group was formed by 36 students (in 2004) who could access the DMDV software at any time through the intranet. The construction of a conceptual map by the student permits the representation of knowledge, of the mental processes that were absorbed, and of the adaptation that occurs during study, building new mental schemes that can be related to the concept of reflective abstraction (Piaget, 1995) during the process of operating with these concepts. The evaluation of knowledge was made by the analysis of three conceptual maps constructed by each student: (a) one map before initiating the study with DMDV, (b) a second just after the study, and (c) a third two months later. We used the following criteria for the analysis: predominance of associative over classificatory character; correct concepts and relationships; coherence; number of relationships; creativity; and logic. The initial maps showed that all students had some previous mental scheme about the proposed concept. All final concept maps showed an expansion of the concepts as compared to the initial maps, something which can be seen even by a mere glance at the size of the graphics. A purely visual comparison between the maps indicated that new elements had been added. The associative character has been shown to predominate over the classificatory one. The

  17. A Systematic Mapping Study of Tools for Distributed Software Development Teams

    DEFF Research Database (Denmark)

    Tell, Paolo; Ali Babar, Muhammad

    Context: A wide variety of technologies have been developed to support Global Software Development (GSD). However, the information about the dozens of available solutions is quite diverse and scattered, making it quite difficult to have an overview able to identify common trends and unveil research gaps. Objective: The objective of this research is to systematically identify and classify a comprehensive list of the technologies that have been developed and/or used for supporting GSD teams. Method: This study has been undertaken as a Systematic Mapping Study (SMS). Our searches identified 1958 ... schemas for providing a framework that can help identify the categories that have attracted a significant amount of research and commercial effort, and the research areas where there are gaps to be filled. Conclusions: The findings show that whilst commercial and open source solutions are predominantly...

  18. [Application evaluation of multi-parametric MRI in the diagnosis and differential diagnosis of early prostate cancer and prostatitis].

    Science.gov (United States)

    Li, P; Huang, Y; Li, Y; Cai, L; Ji, G H; Zheng, Y; Chen, Z Q

    2016-10-11

    Objective: To evaluate the value of multi-parametric MRI (Mp-MRI) in the diagnosis and differential diagnosis of early prostate cancer (PCa) in the peripheral zone (PZ) and prostatitis with low T2WI signal intensity. Methods: A total of 40 patients with early PZ PCa and 37 with prostatitis of hypointense T2WI signal in the PZ were retrospectively analyzed; the cases were collected at the General Hospital of Ningxia Medical University from January 2009 to June 2015, underwent T2WI, DWI, and DCE-MRI examination, and were all confirmed by pathology. All data were transferred to a GE Advanced Workstation AW4.3, and indexes for cancerous and prostatitis regions were calculated with Functool2 from the signal intensity-time (SI-T) curve and the ADC value, including the time to minimum (Tmax) and the whole enhancement degree (SImax). An ROC curve was used to determine the cutoff ADC value for PCa detection. Results: On T2WI, 57.5% of PCa (23/40) showed focal nodular homogeneous low signal intensity, while 70.3% of prostatitis (26/37) showed diffuse inhomogeneous low signal intensity. On DCE-MRI, the distribution of curve types for malignant tumors was type Ⅰ 2.5% (1/40), type Ⅱ 32.5% (13/40) and type Ⅲ 65.0% (26/40), while for prostatitis it was type Ⅰ 16.2% (6/37), type Ⅱ 56.8% (21/37) and type Ⅲ 27.0% (10/37), respectively. The patterns of curve types in malignant lesions differed significantly from those in benign lesions (χ²=12.32, P<0.05). The Tmax and SImax of cancerous and prostatitis regions were (17.96±2.91) s, 1.76%±0.23% and (21.19±3.59) s, 1.53%±0.18%, respectively (t=5.37, 6.10; P<0.05). The ADC values of cancerous and prostatitis regions were (0.95±0.13)×10⁻³ mm²/s and (1.12±0.13)×10⁻³ mm²/s, respectively (t=7.10, P<0.05). Conclusion: Mp-MRI is valuable for differentiating hypointense prostatitis from early PCa.
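
    An ADC cutoff of the kind used above is typically read off the ROC curve, e.g. by maximising the Youden index (sensitivity + specificity - 1). A sketch with invented ADC values (scikit-learn assumed; this is not the authors' software):

```python
import numpy as np
from sklearn.metrics import roc_curve

# Invented ADC values (x10^-3 mm^2/s): cancers lower, prostatitis higher.
rng = np.random.default_rng(7)
adc = np.concatenate([rng.normal(0.95, 0.13, 40), rng.normal(1.12, 0.13, 37)])
is_cancer = np.concatenate([np.ones(40), np.zeros(37)])

# Lower ADC suggests cancer, so negate ADC to fit roc_curve's convention.
fpr, tpr, thresholds = roc_curve(is_cancer, -adc)
youden = tpr - fpr
cutoff = -thresholds[np.argmax(youden)]
print(f"optimal ADC cutoff ~ {cutoff:.2f} x10^-3 mm^2/s")
```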

  19. Modelling the presence of myelin and oedema in the brain based on multi-parametric quantitative MRI

    Directory of Open Access Journals (Sweden)

    Marcel eWarntjes

    2016-02-01

    Full Text Available The aim of this study was to present a model that uses multi-parametric quantitative MRI to estimate the presence of myelin and oedema in the brain. The model relates simultaneous measurement of R1 and R2 relaxation rates and proton density to four partial volume compartments, consisting of myelin partial volume, cellular partial volume, free water partial volume and excess parenchymal water partial volume. The model parameters were obtained using spatially normalised brain images of a group of 20 healthy controls. The pathological brain was modelled in terms of the reduction of myelin content and presence of excess parenchymal water, which indicates the degree of oedema. The method was tested on spatially normalised brain images of a group of 20 age-matched multiple sclerosis (MS) patients. Clear differences were observed with respect to the healthy controls: the MS group had a 79 mL smaller brain volume (1069 vs. 1148 mL), a 38 mL smaller myelin volume (119 vs. 157 mL) and a 21 mL larger excess parenchymal water volume (78 vs. 57 mL). Template regions of interest of various brain structures indicated that the myelin partial volume in the MS group was 1.6±1.5% lower for grey matter (GM) structures and 2.8±1.0% lower for white matter (WM) structures. The excess parenchymal water partial volume was 9±10% larger for GM and 5±2% larger for WM. Manually placed ROIs indicated that the results using the template ROIs may have suffered from loss of anatomical detail due to the spatial normalization process. Examples of the application of the method on high-resolution images are provided for three individual subjects: a 45-year-old healthy subject, a 72-year-old healthy subject and a 45-year-old MS patient. The observed results agreed with the expected behaviour considering both age and disease. In conclusion, the proposed model may provide clinically important parameters such as the total brain volume, degree of myelination and degree of oedema, based on
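
    Read schematically, the four-compartment model is a small per-voxel inverse problem: the measured (R1, R2, PD) triple is approximated as a volume-weighted sum of fixed compartment signatures, with non-negative fractions that sum to one. A sketch with invented signature values (the paper's actual model and calibration differ):

```python
import numpy as np
from scipy.optimize import lsq_linear

# Assumed per-compartment signatures [rows: R1 (1/s), R2 (1/s), PD] for
# myelin, cellular, free water and excess parenchymal water volumes.
# All numbers are invented for illustration.
A = np.array([
    [4.0,  1.2,  0.8, 0.9],  # R1 of each compartment
    [60.0, 12.0, 1.5, 2.0],  # R2 of each compartment
    [0.6,  0.8,  1.0, 1.0],  # PD of each compartment
    [1.0,  1.0,  1.0, 1.0],  # soft constraint: fractions sum to 1
])
measured = np.array([1.6, 14.0, 0.85, 1.0])  # one voxel: R1, R2, PD, 1

# Bounded least squares yields the four partial volume fractions.
res = lsq_linear(A, measured, bounds=(0.0, 1.0))
myelin, cellular, free_w, excess_w = res.x
print(f"myelin={myelin:.2f} cellular={cellular:.2f} "
      f"free={free_w:.2f} excess={excess_w:.2f}")
```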

  20. Multi-parametric MRI of rectal cancer – Do quantitative functional MR measurements correlate with radiologic and pathologic tumor stages?

    International Nuclear Information System (INIS)

    Attenberger, U.I.; Pilz, L.R.; Morelli, J.N.; Hausmann, D.; Doyon, F.; Hofheinz, R.; Kienle, P.; Post, S.; Michaely, H.J.; Schoenberg, S.O.; Dinter, D.J.

    2014-01-01

    Purpose: The purpose of this study is two-fold: first, to evaluate whether functional rectal MRI techniques can be analyzed in a reproducible manner by different readers, and second, to assess whether different clinical and pathologic T and N stages can be differentiated by functional MRI measurements. Materials and methods: 54 patients (38 men, 16 women; mean age 63.2 ± 12.2 years) with pathologically proven rectal cancer were included in this retrospective IRB-approved study. All patients were referred for a multi-parametric MRI protocol on a 3 Tesla MR system, consisting of a high-resolution, axial T2 TSE sequence, DWI and perfusion imaging (plasma flow, PFTumor) prior to any treatment. Two experienced radiologists evaluated the MRI measurements, blinded to clinical data and outcome. Inter-reader correlation and the association of functional MRI parameters with c- and p-staging were analyzed. Results: The inter-reader correlation for lymph node (ρ 0.76–0.94; p < 0.0002) and primary tumor (ρ 0.78–0.92; p < 0.0001) apparent diffusion coefficient and plasma flow (PF) values was good to very good. PFTumor values decreased with cT stage, with significant differences identified between cT2 and cT3 tumors (229 versus 107.6 ml/100 ml/min; p = 0.05). ADCTumor values did not differ significantly. No substantial discrepancies in lymph node ADCLn values or short axis diameter were found among cN1-3 stages, whereas PFLn values were distinct between cN1 versus cN2 stages (p = 0.03). In the patients without neoadjuvant RCT, no statistically significant differences in the assessed functional parameters on the basis of pathologic stage were found. Conclusion: This study illustrates that ADC as well as MR perfusion values can be analyzed with good interobserver agreement in patients with rectal cancer. Moreover, MR perfusion parameters may allow accurate differentiation of tumor stages. Both findings suggest that functional MRI parameters may help to discriminate
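
    Inter-reader correlations of the kind quoted (ρ 0.76-0.94) are rank correlations between the two radiologists' per-patient measurements. A minimal sketch with invented perfusion readings (SciPy assumed):

```python
import numpy as np
from scipy.stats import spearmanr

# Invented tumour plasma-flow readings (ml/100 ml/min) from two readers.
rng = np.random.default_rng(11)
reader1 = rng.uniform(80, 250, size=54)
reader2 = reader1 + rng.normal(0, 15, size=54)  # reader 2 tracks reader 1

rho, p = spearmanr(reader1, reader2)
print(f"inter-reader rho = {rho:.2f} (p = {p:.1e})")
```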

  1. Lessons in modern digital field geology: Open source software, 3D techniques, and the new world of digital mapping

    Science.gov (United States)

    Pavlis, Terry; Hurtado, Jose; Langford, Richard; Serpa, Laura

    2014-05-01

    Although many geologists refuse to admit it, it is time to put paper-based geologic mapping into the historical archives and move to the full potential of digital mapping techniques. For our group, flat-map digital geologic mapping is now a routine operation in both research and instruction. Several software options are available, and basic proficiency with the software can be learned in a few hours of instruction and practice. The first practical field GIS software, ArcPad, remains a viable, stable option on Windows-based systems. However, the vendor seems to be moving away from ArcPad in favor of mobile software solutions that are difficult to implement without GIS specialists. Thus, we have pursued a second software option based on the open source program QGIS. Our QGIS system uses the same shapefile-centric data structure as our ArcPad system, including similar pop-up data entry forms and generic graphics for easy data management in the field. The advantage of QGIS is that the same software runs on virtually all common platforms except iOS, although the Android version remains unstable as of this writing. A third software option we are experimenting with for flat-map-based field work is Fieldmove, a derivative of the 3D-capable program Move developed by Midland Valley. Our initial experiments with Fieldmove are positive, particularly on the new generation of inexpensive Windows tablets. Beyond flat maps, 3D imaging techniques offer great potential for communicating the complexity of key exposures. For example, in studies of metamorphic structures we often search for days to find "Rosetta Stone" outcrops that display key geometric relationships. While conventional photographs rarely can capture the essence of the field exposure, capturing a true 3D representation of the exposure with multiple photos from many orientations can solve this communication problem. As spatial databases evolve, these 3D models should be readily importable into the database.

  2. Mapping and Quantification of Vascular Branching in Plants, Animals and Humans by VESGEN Software

    Science.gov (United States)

    Parsons-Wingerter, P. A.; Vickerman, M. B.; Keith, P. A.

    2010-01-01

    Humans face daunting challenges in the successful exploration and colonization of space, including adverse alterations in gravity and radiation. The Earth-determined biology of plants, animals and humans is significantly modified in such extraterrestrial environments. One physiological requirement shared by larger plants and animals with humans is a complex, highly branching vascular system that is dynamically responsive to cellular metabolism, immunological protection and specialized cellular/tissue function. VESsel GENeration (VESGEN) Analysis has been developed as a mature beta version, pre-release research software for mapping and quantification of the fractal-based complexity of vascular branching. Alterations in vascular branching pattern can provide informative read-outs of altered vascular regulation. Originally developed for biomedical applications in angiogenesis, VESGEN 2D has provided novel insights into the cytokine, transgenic and therapeutic regulation of angiogenesis, lymphangiogenesis and other microvascular remodeling phenomena. Vascular trees, networks and tree-network composites are mapped and quantified. Applications include disease progression from clinical ophthalmic images of the human retina; experimental regulation of vascular remodeling in the mouse retina; avian and mouse coronary vasculature; and other experimental models in vivo. We envision that altered branching in the leaves of plants such as Arabidopsis thaliana studied on the ISS can also be analyzed.
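
    Fractal-based branching complexity of the kind VESGEN quantifies is often summarised by a box-counting dimension of the binary vessel mask; a generic sketch (not VESGEN's implementation) follows:

```python
import numpy as np

def box_counting_dimension(mask, sizes=(2, 4, 8, 16, 32)):
    """Estimate the fractal dimension of a 2-D binary mask by box counting."""
    counts = []
    for s in sizes:
        # Trim so the image tiles exactly into s x s boxes, then count
        # boxes containing at least one foreground pixel.
        h, w = (mask.shape[0] // s) * s, (mask.shape[1] // s) * s
        boxes = mask[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(np.count_nonzero(boxes.any(axis=(1, 3))))
    # The dimension is the slope of log(count) versus log(1/size).
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Toy "vessel": a diagonal line should give a dimension close to 1.
mask = np.eye(128, dtype=bool)
print(f"estimated dimension ~ {box_counting_dimension(mask):.2f}")
```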

  3. VIBA-LAB2: a virtual ion beam analysis laboratory software package incorporating elemental map simulations

    International Nuclear Information System (INIS)

    Zhou, S.J.; Orlic, I.; Sanchez, J.L.; Watt, F.

    1999-01-01

    The software package VIBA-lab1, which incorporates PIXE and RBS energy-spectra simulation, has now been extended to include the simulation of elemental maps from 3D structures. VIBA-lab1 allows the user to define a wide variety of experimental parameters, e.g. energy and species of incident ions, excitation and detection geometry, etc. When the relevant experimental parameters as well as the target composition are defined, the program can then simulate the corresponding PIXE and RBS spectra. VIBA-LAB2 has been written with applications in nuclear microscopy in mind. A set of drag-and-drop tools has been incorporated to allow the user to define a three-dimensional sample object of mixed elemental composition. PIXE energy-spectra simulations are then carried out on a pixel-by-pixel basis, and the corresponding intensity distributions or elemental maps can be computed. Several simulated intensity distributions for some 3D objects are demonstrated, and simulations obtained from a simple IC are compared with experimental results.

  4. Integrating Remote Sensing with Species Distribution Models; Mapping Tamarisk Invasions Using the Software for Assisted Habitat Modeling (SAHM)

    OpenAIRE

    West, Amanda M.; Evangelista, Paul H.; Jarnevich, Catherine S.; Young, Nicholas E.; Stohlgren, Thomas J.; Talbert, Colin; Talbert, Marian; Morisette, Jeffrey; Anderson, Ryan

    2016-01-01

    Early detection of invasive plant species is vital for the management of natural resources and protection of ecosystem processes. The use of satellite remote sensing for mapping the distribution of invasive plants is becoming more common; however, conventional imaging software and classification methods have been shown to be unreliable. In this study, we test and evaluate the use of five species distribution model techniques fit with satellite remote sensing data to map invasive tamarisk (Tama...

  5. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    International Nuclear Information System (INIS)

    Messroghli, Daniel R; Rudolph, Andre; Abdel-Aty, Hassan; Wassmuth, Ralf; Kühne, Titus; Dietz, Rainer; Schulz-Menger, Jeanette

    2010-01-01

    In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different types of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet.

  6. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    Directory of Open Access Journals (Sweden)

    Kühne Titus

    2010-07-01

    Background: In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. Results: After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different types of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. Conclusions: MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet.
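
    Records 5 and 6 above describe pixel-wise relaxation-time mapping. As a generic illustration of the kind of computation involved (not MRmap's actual code), the sketch below estimates a T2 map from multi-echo magnitude images with a log-linear least-squares fit; the echo times and the uniform-T2 phantom are invented for the example.

```python
import numpy as np

def t2_map(echoes, echo_times):
    """Pixel-wise T2 estimation from multi-echo data via a log-linear fit.

    echoes: array of shape (n_echoes, ny, nx); echo_times: in ms.
    Assumes mono-exponential decay S(TE) = S0 * exp(-TE / T2).
    """
    n, ny, nx = echoes.shape
    te = np.asarray(echo_times, dtype=float)
    y = np.log(np.maximum(echoes.reshape(n, -1), 1e-6))  # avoid log(0)
    # Linear model: log S = log S0 - TE / T2, solved by least squares.
    A = np.stack([np.ones_like(te), -te], axis=1)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    with np.errstate(divide="ignore"):
        t2 = 1.0 / coef[1]
    return t2.reshape(ny, nx)

# Synthetic 4-echo acquisition with a uniform T2 of 50 ms.
te = [10, 20, 40, 80]
truth = 50.0
sim = np.array([np.full((4, 4), np.exp(-t / truth)) for t in te])
print(t2_map(sim, te))  # ~50 everywhere
```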

  7. Methodology for qualitative content analysis with the technique of mind maps using Nvivo and FreeMind softwares

    Directory of Open Access Journals (Sweden)

    José Leonardo Oliveira Lima

    2016-12-01

    Introduction: In a survey it is not enough to choose tools, resources and procedures; it is important to understand the method beyond the techniques and their relationship with philosophy, epistemology and methodology. Objective: To discuss theoretical and methodological concerns on qualitative research in Information Science and the process of Qualitative Content Analysis (QCA) in the User Studies field, and to show a path of QCA integrated with the mind-map technique for developing categories and indicators, using Qualitative Data Analysis Software (QDAS) and mind-map design tools. Methodology: The research was descriptive, methodological, bibliographical and fieldwork-based, conducted with open interviews that were processed using the QCA method with the support of the QDAS Nvivo and the FreeMind software for mind-map design. Results: The theory of qualitative research and QCA is presented, together with a methodological path of QCA using the techniques and software mentioned above. Conclusions: When it comes to qualitative research, the theoretical framework suggests the need for more dialogue between Information Science and other disciplines. The process of QCA evidenced a viable path that might help further related investigations using QDAS, and the contribution of mind maps and their design software to the development of the indicators and categories of QCA.

  8. ANALYSIS, THEMATIC MAPS AND DATA MINING FROM POINT CLOUD TO ONTOLOGY FOR SOFTWARE DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    R. Nespeca

    2016-06-01

    The primary purpose of the survey for the restoration of Cultural Heritage is the interpretation of the state of building preservation. For this, the advantages of remote sensing systems that generate dense point clouds (range-based or image-based) are not limited to the acquired data alone. The paper shows that it is possible to extract very useful diagnostic information using spatial annotation, with algorithms already implemented in open-source software. Generally, the drawing of degradation maps is the result of manual work, and is therefore dependent on the subjectivity of the operator. This paper describes a method of extraction and visualization of information obtained by mathematical procedures that are quantitative, repeatable and verifiable. The case study is a part of the east facade of the Eglise collégiale Saint-Maurice, also called Notre Dame des Grâces, in Caromb, in southern France. The work was conducted on the matrix of information contained in the point cloud in ASCII format. The first result is the extraction of new geometric descriptors. First, we create digital maps of the calculated quantities. Subsequently, we move to semi-quantitative analyses that transform the new data into useful information. We have written algorithms for accurate selection, for the segmentation of the point cloud, and for automatic calculation of the real surface area and volume. Furthermore, we have created graphs of the spatial distribution of the descriptors. This work shows that by working during data processing we can transform the point cloud into an enriched database: the use, management and mining of the data become easy, fast and effective for everyone involved in the restoration process.
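
    The record above mentions deriving new geometric descriptors from the point-cloud matrix without detailing them. A minimal sketch of one such descriptor (a hypothetical choice, not necessarily the one used in the paper) is the local roughness of each point, i.e., its distance to the best-fit plane of its neighborhood, which can then be color-mapped onto the cloud as a thematic map:

```python
import numpy as np
from scipy.spatial import cKDTree

def roughness(points, radius=0.2):
    """Per-point roughness: distance of each point to the best-fit
    plane of its local neighborhood (a simple geometric descriptor)."""
    tree = cKDTree(points)
    out = np.zeros(len(points))
    for i, p in enumerate(points):
        idx = tree.query_ball_point(p, radius)
        if len(idx) < 4:
            continue  # not enough neighbors to fit a plane
        nb = points[idx]
        centered = nb - nb.mean(axis=0)
        # Plane normal = direction of least variance (last singular vector).
        normal = np.linalg.svd(centered)[2][-1]
        out[i] = abs(np.dot(p - nb.mean(axis=0), normal))
    return out

# Toy cloud: a nearly planar slab with slight vertical scatter.
pts = np.random.default_rng(3).uniform(size=(500, 3))
pts[:, 2] *= 0.01
print(roughness(pts).max())
```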

  9. Volume of high-risk intratumoral subregions at multi-parametric MR imaging predicts overall survival and complements molecular analysis of glioblastoma

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Yi; Li, Ruijiang [Stanford University, Department of Radiation Oncology, Palo Alto, CA (United States); Hokkaido University, Global Station for Quantum Medical Science and Engineering, Global Institution for Collaborative Research and Education, Hokkaido (Japan); Ren, Shangjie [Tianjin University, School of Electrical Engineering and Automation, Tianjin Shi (China); Tha, Khin Khin; Shirato, Hiroki [Hokkaido University, Global Station for Quantum Medical Science and Engineering, Global Institution for Collaborative Research and Education, Hokkaido (Japan); Hokkaido University, Department of Radiology and Nuclear Medicine, Hokkaido (Japan); Wu, Jia [Stanford University, Department of Radiation Oncology, Palo Alto, CA (United States)

    2017-09-15

    To develop and validate a volume-based, quantitative imaging marker by integrating multi-parametric MR images for predicting glioblastoma survival, and to investigate its relationship and synergy with molecular characteristics. We retrospectively analysed 108 patients with primary glioblastoma. The discovery cohort consisted of 62 patients from the cancer genome atlas (TCGA). Another 46 patients, comprising 30 from TCGA and 16 recruited internally, were used for independent validation. Based on integrated analyses of T1-weighted contrast-enhanced (T1-c) and diffusion-weighted MR images, we identified an intratumoral subregion with both high T1-c and low ADC, and accordingly defined a high-risk volume (HRV). We evaluated its prognostic value and biological significance with genomic data. On both discovery and validation cohorts, HRV predicted overall survival (OS) (concordance index: 0.642 and 0.653, P < 0.001 and P = 0.038, respectively). HRV stratified patients within the proneural molecular subtype (log-rank P = 0.040, hazard ratio = 2.787). We observed different OS among patients depending on their MGMT methylation status and HRV (log-rank P = 0.011). Patients with unmethylated MGMT and high HRV had significantly shorter survival (median survival: 9.3 vs. 18.4 months, log-rank P = 0.002). Volume of the high-risk intratumoral subregion identified on multi-parametric MRI predicts glioblastoma survival, and may provide complementary value to genomic information. (orig.)
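
    The high-risk volume described above is defined by jointly thresholding co-registered T1-c and ADC maps within the tumor. The record does not give the thresholds, so in the hedged sketch below they are free parameters and the input maps are random toy data:

```python
import numpy as np

def high_risk_volume(t1c, adc, tumor_mask, voxel_volume_mm3,
                     t1c_thresh, adc_thresh):
    """Volume (cm^3) of tumor voxels with high T1-c AND low ADC.

    Thresholds are study-specific assumptions, passed in explicitly.
    """
    high_risk = tumor_mask & (t1c > t1c_thresh) & (adc < adc_thresh)
    return high_risk.sum() * voxel_volume_mm3 / 1000.0

# Toy example: random maps, 1 mm isotropic voxels, whole-volume mask.
rng = np.random.default_rng(0)
t1c = rng.normal(size=(64, 64, 32))
adc = rng.normal(size=(64, 64, 32))
mask = np.ones(t1c.shape, dtype=bool)
print(high_risk_volume(t1c, adc, mask, 1.0, 1.0, -1.0))
```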

  10. Fault Localization Method by Partitioning Memory Using Memory Map and the Stack for Automotive ECU Software Testing

    Directory of Open Access Journals (Sweden)

    Kwanhyo Kim

    2016-09-01

    Recently, the use of the automotive Electronic Control Unit (ECU) and its software in cars is increasing. As the functional complexity of such software grows, so does the likelihood of software-related faults. It is therefore important to ensure the reliability of ECU software in order to ensure automobile safety. For this reason, systematic testing methods are required that can guarantee software quality. However, it is difficult to locate a fault during testing with the current ECU development system, because the tester performs black-box testing using a Hardware-in-the-Loop (HiL) simulator. Consequently, developers spend a large amount of money and time on debugging, because they debug without any information about the location of the fault. In this paper, we propose a method for localizing faults utilizing memory information during black-box testing. This is likely to be of use to developers who debug automotive software. In order to observe whether symbols stored in memory have been updated, the memory is partitioned by a memory map and the stack, thus reducing the fault-candidate region. The memory-map method has the advantage of being able to partition the memory finely, while the stack method can partition the memory without a memory map. We validated these methods by applying them to HiL testing of the ECU for a body control system. The preliminary results indicate that a memory map and the stack reduce the possible fault locations to 22% and 19% of the updated memory, respectively.
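
    As a schematic illustration of the partitioning idea (not the authors' implementation), the sketch below compares before/after snapshots of ECU memory region by region, using (name, start, end) entries such as a linker map file would provide, to narrow the fault-candidate region to the regions that were updated:

```python
def updated_regions(before, after, memory_map):
    """Flag memory-map regions whose bytes changed during a test run.

    before/after: bytes snapshots of ECU memory; memory_map: list of
    (name, start, end) tuples, e.g. parsed from a linker map file.
    """
    flagged = []
    for name, start, end in memory_map:
        if before[start:end] != after[start:end]:
            flagged.append(name)
    return flagged

# Toy example: three regions, one updated during the test.
before = bytes(16)
after = bytes(8) + b"\x01" + bytes(7)
memmap = [(".data", 0, 4), (".bss", 4, 12), (".stack", 12, 16)]
print(updated_regions(before, after, memmap))  # ['.bss']
```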

  11. Development of Airport Noise Mapping using Matlab Software (Case Study: Adi Soemarmo Airport - Boyolali, Indonesia)

    Science.gov (United States)

    Andarani, Pertiwi; Setiyo Huboyo, Haryono; Setyanti, Diny; Budiawan, Wiwik

    2018-02-01

    Noise is considered one of the main environmental impacts of Adi Soemarmo International Airport (ASIA), the second largest airport in Central Java Province, Indonesia. In order to manage the noise of the airport, airport noise mapping is necessary. However, a model that requires simple input but is still reliable was not available at ASIA. Therefore, the objectives of this study are to develop a model using Matlab software, to verify its reliability by measuring actual noise exposure, and to analyze the areas of the noise levels. The model was developed based on interpolation or extrapolation of identified Noise-Power-Distance (NPD) data. In accordance with Indonesian Government Ordinance No. 40/2012, the noise metric used is WECPNL (Weighted Equivalent Continuous Perceived Noise Level). Based on the model simulation, there are residential areas in the regions of noise levels II (1.912 km2) and III (1.16 km2), and 18 school buildings in the areas of noise levels I, II, and III. These land uses are actually prohibited unless noise insulation is installed. The model using Matlab in the case of Adi Soemarmo International Airport is valid based on comparison with field measurements (6 sampling points). However, it is important to validate the model again once the case study (the airport) is changed.
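
    The record states that the model interpolates or extrapolates Noise-Power-Distance (NPD) data but not how. A conventional choice, assumed in the sketch below, is interpolation that is linear in log-distance; the example NPD values are invented.

```python
import numpy as np

def npd_level(distances_m, levels_db, query_m):
    """Interpolate a noise level from an NPD table, linear in
    log10(distance); np.interp clamps beyond the table ends."""
    logd = np.log10(np.asarray(distances_m, dtype=float))
    return float(np.interp(np.log10(query_m), logd, levels_db))

# Invented NPD segment for one engine-power setting.
print(npd_level([200.0, 400.0, 800.0], [85.0, 79.0, 73.0], 600.0))
```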

  12. Improving flood risk mapping in Italy: the FloodRisk open-source software

    Science.gov (United States)

    Albano, Raffaele; Mancusi, Leonardo; Craciun, Iulia; Sole, Aurelia; Ozunu, Alexandru

    2017-04-01

    Time and again, floods around the world illustrate the devastating impact they can have on societies. Furthermore, the expectation that flood damages can increase over time with climate change, land-use change and social growth in flood-prone areas has raised the awareness of the public and of other stakeholders (governments, international organizations, re-insurance companies and emergency responders) of the need to manage risks in order to mitigate their causes and consequences. In this light, the choice of appropriate measures, the assessment of the costs and effects of such measures, and their prioritization are crucial for decision makers. As a result, a priori flood risk assessment has become a key part of flood management practices, with the aim of minimizing the total costs related to the risk management cycle. In this context, the EU Floods Directive 2007/60 requires the delineation of flood risk maps on the basis of the most appropriate and advanced tools, with particular attention to limiting the required economic effort. The main aim of these risk maps is to provide the knowledge required for the development of flood risk management plans (FRMPs) by considering both costs and benefits of alternatives and results from consultation with all interested parties. In this context, this research project developed a free and open-source (FOSS) GIS software, called FloodRisk, to operatively support stakeholders in their compliance with the FRMPs. FloodRisk aims to facilitate the development of risk maps and the evaluation and management of current and future flood risk for multi-purpose applications. This new approach overcomes the limits of the expert-driven qualitative (EDQ) approach currently adopted in several European countries, such as Italy, which does not permit a suitable evaluation of the effectiveness of risk mitigation strategies, because the vulnerability component cannot be properly assessed. Moreover, FloodRisk is also able to involve the citizens in the flood

  13. AlignerBoost: A Generalized Software Toolkit for Boosting Next-Gen Sequencing Mapping Accuracy Using a Bayesian-Based Mapping Quality Framework.

    Directory of Open Access Journals (Sweden)

    Qi Zheng

    2016-10-01

    Accurate mapping of next-generation sequencing (NGS) reads to reference genomes is crucial for almost all NGS applications and downstream analyses. Various repetitive elements in human and other higher eukaryotic genomes contribute in large part to ambiguously (non-uniquely) mapped reads. Most available NGS aligners attempt to address this by either removing all non-uniquely mapping reads, or reporting one random or "best" hit based on simple heuristics. Accurate estimation of the mapping quality of NGS reads is therefore critical albeit completely lacking at present. Here we developed a generalized software toolkit "AlignerBoost", which utilizes a Bayesian-based framework to accurately estimate mapping quality of ambiguously mapped NGS reads. We tested AlignerBoost with both simulated and real DNA-seq and RNA-seq datasets at various thresholds. In most cases, but especially for reads falling within repetitive regions, AlignerBoost dramatically increases the mapping precision of modern NGS aligners without significantly compromising the sensitivity even without mapping quality filters. When using higher mapping quality cutoffs, AlignerBoost achieves a much lower false mapping rate while exhibiting comparable or higher sensitivity compared to the aligner default modes, therefore significantly boosting the detection power of NGS aligners even using extreme thresholds. AlignerBoost is also SNP-aware, and higher quality alignments can be achieved if provided with known SNPs. AlignerBoost's algorithm is computationally efficient, and can process one million alignments within 30 seconds on a typical desktop computer. AlignerBoost is implemented as a uniform Java application and is freely available at https://github.com/Grice-Lab/AlignerBoost.

  14. AlignerBoost: A Generalized Software Toolkit for Boosting Next-Gen Sequencing Mapping Accuracy Using a Bayesian-Based Mapping Quality Framework.

    Science.gov (United States)

    Zheng, Qi; Grice, Elizabeth A

    2016-10-01

    Accurate mapping of next-generation sequencing (NGS) reads to reference genomes is crucial for almost all NGS applications and downstream analyses. Various repetitive elements in human and other higher eukaryotic genomes contribute in large part to ambiguously (non-uniquely) mapped reads. Most available NGS aligners attempt to address this by either removing all non-uniquely mapping reads, or reporting one random or "best" hit based on simple heuristics. Accurate estimation of the mapping quality of NGS reads is therefore critical albeit completely lacking at present. Here we developed a generalized software toolkit "AlignerBoost", which utilizes a Bayesian-based framework to accurately estimate mapping quality of ambiguously mapped NGS reads. We tested AlignerBoost with both simulated and real DNA-seq and RNA-seq datasets at various thresholds. In most cases, but especially for reads falling within repetitive regions, AlignerBoost dramatically increases the mapping precision of modern NGS aligners without significantly compromising the sensitivity even without mapping quality filters. When using higher mapping quality cutoffs, AlignerBoost achieves a much lower false mapping rate while exhibiting comparable or higher sensitivity compared to the aligner default modes, therefore significantly boosting the detection power of NGS aligners even using extreme thresholds. AlignerBoost is also SNP-aware, and higher quality alignments can be achieved if provided with known SNPs. AlignerBoost's algorithm is computationally efficient, and can process one million alignments within 30 seconds on a typical desktop computer. AlignerBoost is implemented as a uniform Java application and is freely available at https://github.com/Grice-Lab/AlignerBoost.
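
    Records 13 and 14 describe Bayesian estimation of mapping quality; AlignerBoost's published model is more elaborate than this record can convey. As a simplified stand-in, the sketch below treats exponentiated alignment scores as unnormalized likelihoods, takes the posterior probability of the best hit, and Phred-scales it into a mapping quality.

```python
import math

def mapping_quality(alignment_scores, temperature=1.0):
    """Phred-scaled mapping quality of the best hit among candidates.

    Treats exp(score / temperature) as an unnormalized likelihood for
    each candidate alignment; temperature is a tuning assumption.
    """
    weights = [math.exp(s / temperature) for s in alignment_scores]
    p_best = max(weights) / sum(weights)
    p_wrong = max(1.0 - p_best, 1e-10)  # avoid log10(0)
    return min(-10.0 * math.log10(p_wrong), 100.0)

# One strong hit plus two weak repeat hits maps confidently:
print(mapping_quality([60.0, 30.0, 28.0]))
# Two near-identical hits yield a low mapping quality:
print(mapping_quality([60.0, 59.5]))
```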

  15. MOST-visualization: software for producing automated textbook-style maps of genome-scale metabolic networks.

    Science.gov (United States)

    Kelley, James J; Maor, Shay; Kim, Min Kyung; Lane, Anatoliy; Lun, Desmond S

    2017-08-15

    Visualization of metabolites, reactions and pathways in genome-scale metabolic networks (GEMs) can assist in understanding cellular metabolism. Three attributes are desirable in software used for visualizing GEMs: (i) automation, since GEMs can be quite large; (ii) production of understandable maps that provide ease in identification of pathways, reactions and metabolites; and (iii) visualization of the entire network to show how pathways are interconnected. No software currently exists for visualizing GEMs that satisfies all three characteristics, but MOST-Visualization, an extension of the software package MOST (Metabolic Optimization and Simulation Tool), satisfies (i), and by using a pre-drawn overview map of metabolism based on the Roche map satisfies (ii) and comes close to satisfying (iii). MOST is distributed for free under the GNU General Public License. The software and full documentation are available at http://most.ccib.rutgers.edu/.

  16. Software process improvement: a systematic mapping study on the state of the art

    Directory of Open Access Journals (Sweden)

    Marco Kuhrmann

    2016-05-01

    Software process improvement (SPI) has been around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out there? Are there new trends and emerging approaches? What are the open issues? Still, we struggle to answer these questions about the current state of SPI and related research. In this article, we present results from an updated systematic mapping study to shed light on the field of SPI, to develop a big picture of the state of the art, and to draw conclusions for future research directions. An analysis of 769 publications draws a big picture of SPI-related research of the past quarter-century. Our study shows a high number of solution proposals, experience reports, and secondary studies, but only few theories and models on SPI in general. In particular, standard SPI models like CMMI and ISO/IEC 15504 are analyzed, enhanced, and evaluated for applicability in practice, but these standards are also critically discussed, e.g., from the perspective of SPI in small-to-medium-sized companies, which leads to new specialized frameworks. New and specialized frameworks account for the majority of the contributions found (approx. 38%). Furthermore, we find a growing interest in success factors (approx. 16%) to aid companies in conducting SPI and in adapting agile principles and practices for SPI (approx. 10%). Beyond these specific topics, the study results also show an increasing interest in secondary studies with the purpose of aggregating and structuring SPI-related knowledge. Finally, the present study helps direct future research by identifying under-researched topics awaiting further investigation.

  17. Study of Noise Map and its Features in an Indoor Work Environment through GIS-Based Software

    Directory of Open Access Journals (Sweden)

    Faramarz Majidi

    2016-06-01

    Background: Noise mapping in industry can be useful for assessing the risks of harmful noise and for monitoring noise in machine rooms. Using GIS-based software for plotting noise maps in an indoor noisy work environment can help occupational hygienists monitor noise pollution. Methods: This study was carried out in a noisy packaging unit of a food industry in the Ghazvin industrial zone, to evaluate noise levels by the GIS technique. To this end, the floor of the packaging unit was divided into squares of 2×2 meters and the center of each square was marked as a measurement station, based on the NIOSH method. The sound pressure level at each station was measured, and the measured values were then imported into ArcGIS software to plot the noise map. Results: Unlike the routine method, the noise maps generated by the GIS technique are consistent with the nature of sound propagation. Conclusion: This study showed that for an indoor work environment, the application of GIS technology, rendering the assessment of noise levels in the form of noise maps, is more realistic and more accurate than the routine method now being used by occupational hygienists.

  18. Mapping modern software process engineering techniques onto an HEP development environment

    International Nuclear Information System (INIS)

    Wellisch, J.P.

    2003-01-01

    One of the most challenging issues faced in HEP in recent years is the question of how to capitalise on software development and maintenance experience in a continuous manner. To capitalise means in our context to evaluate and apply new process technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards. The CMS off-line software improvement effort aims at continual software quality improvement, and continual improvement in the efficiency of the working environment, with the goal of facilitating great new physics. To achieve this, we followed a process improvement program based on ISO-15504 and the Rational Unified Process. This experiment in software process improvement in HEP has now been progressing for a period of 3 years. Taking previous experience from ATLAS and SPIDER into account, we used a soft approach of continuous change within the limits of the current culture to create de facto software process standards within the CMS off-line community, as the only viable route to a successful software process improvement program in HEP. We will present the CMS approach to software process improvement in this process R&D, describe lessons learned, and mistakes made. We will demonstrate the benefits gained, and the current status of the software processes established in CMS off-line software.

  19. Mapping modern software process engineering techniques onto an HEP development environment

    Science.gov (United States)

    Wellisch, J. P.

    2003-04-01

    One of the most challenging issues faced in HEP in recent years is the question of how to capitalise on software development and maintenance experience in a continuous manner. To capitalise means in our context to evaluate and apply new process technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards. The CMS off-line software improvement effort aims at continual software quality improvement, and continual improvement in the efficiency of the working environment, with the goal of facilitating great new physics. To achieve this, we followed a process improvement program based on ISO-15504 and the Rational Unified Process. This experiment in software process improvement in HEP has now been progressing for a period of 3 years. Taking previous experience from ATLAS and SPIDER into account, we used a soft approach of continuous change within the limits of the current culture to create de facto software process standards within the CMS off-line community, as the only viable route to a successful software process improvement program in HEP. We will present the CMS approach to software process improvement in this process R&D, describe lessons learned, and mistakes made. We will demonstrate the benefits gained, and the current status of the software processes established in CMS off-line software.

  20. Application of Open Source Software by the Lunar Mapping and Modeling Project

    Science.gov (United States)

    Ramirez, P.; Goodale, C. E.; Bui, B.; Chang, G.; Kim, R. M.; Law, E.; Malhotra, S.; Rodriguez, L.; Sadaqathullah, S.; Mattmann, C. A.; Crichton, D. J.

    2011-12-01

    The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is responsible for the development of an information system to support lunar exploration, decision analysis, and release of lunar data to the public. The data available through the lunar portal is predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). This project has created a gold source of data, models, and tools for lunar explorers to exercise and incorporate into their activities. At the Jet Propulsion Laboratory (JPL), we focused on engineering and building the infrastructure to support cataloging, archiving, accessing, and delivery of lunar data. We decided to use a RESTful service-oriented architecture to enable us to abstract from the underlying technology choices and focus on interfaces to be used internally and externally. This decision allowed us to leverage several open source software components and integrate them by either writing a thin REST service layer or relying on the API they provided; the approach chosen depended on the targeted consumer of a given interface. We will discuss our varying experience using open source products, namely Apache OODT, Oracle Berkeley DB XML, Apache Solr, and Oracle OpenSSO (now named OpenAM). Apache OODT, developed at NASA's Jet Propulsion Laboratory and recently migrated over to Apache, provided the means for ingestion and cataloguing of products within the infrastructure. Its usage was based upon team experience with the project and past benefit received on other projects internal and external to JPL. Berkeley DB XML, distributed by Oracle for both commercial and open source use, was the storage technology chosen for our metadata. This decision was based in part on our use of Federal Geographic Data Committee (FGDC) metadata, which is expressed in XML, and the desire to keep it in its native form and exploit other technologies built on

  1. NeuroMap: A spline-based interactive open-source software for spatiotemporal mapping of 2D and 3D MEA data

    Directory of Open Access Journals (Sweden)

    Oussama Abdoun

    2011-01-01

    A major characteristic of neural networks is the complexity of their organization at various spatial scales, from microscopic local circuits to macroscopic brain-scale areas. Understanding how neural information is processed thus entails the ability to study them at multiple scales simultaneously. This is made possible using microelectrode array (MEA) technology. Indeed, high-density MEAs provide large-scale coverage (several mm²) of whole neural structures combined with microscopic resolution (about 50 µm) of unit activity. Yet, current options for spatiotemporal representation of MEA-collected data remain limited. Here we present NeuroMap, a new interactive Matlab-based software for spatiotemporal mapping of MEA data. NeuroMap uses thin plate spline interpolation, which provides several assets with respect to conventional mapping methods used currently. First, any MEA design can be considered, including 2D or 3D, regular or irregular, arrangements of electrodes. Second, spline interpolation allows the estimation of activity across the tissue with local extrema not necessarily at recording sites. Finally, this interpolation approach provides a straightforward analytical estimation of the spatial Laplacian for better current sources localization. In this software, coregistration of 2D MEA data on the anatomy of the neural tissue is made possible by fine matching of anatomical data with electrode positions using rigid-deformation-based correction of anatomical pictures. Overall, NeuroMap provides substantial material for detailed spatiotemporal analysis of MEA data. The package is distributed under the GNU General Public License (GPL) and available at http://sites.google.com/site/neuromapsoftware.

  2. NeuroMap: A Spline-Based Interactive Open-Source Software for Spatiotemporal Mapping of 2D and 3D MEA Data.

    Science.gov (United States)

    Abdoun, Oussama; Joucla, Sébastien; Mazzocco, Claire; Yvert, Blaise

    2011-01-01

    A major characteristic of neural networks is the complexity of their organization at various spatial scales, from microscopic local circuits to macroscopic brain-scale areas. Understanding how neural information is processed thus entails the ability to study them at multiple scales simultaneously. This is made possible using microelectrodes array (MEA) technology. Indeed, high-density MEAs provide large-scale coverage (several square millimeters) of whole neural structures combined with microscopic resolution (about 50 μm) of unit activity. Yet, current options for spatiotemporal representation of MEA-collected data remain limited. Here we present NeuroMap, a new interactive Matlab-based software for spatiotemporal mapping of MEA data. NeuroMap uses thin plate spline interpolation, which provides several assets with respect to conventional mapping methods used currently. First, any MEA design can be considered, including 2D or 3D, regular or irregular, arrangements of electrodes. Second, spline interpolation allows the estimation of activity across the tissue with local extrema not necessarily at recording sites. Finally, this interpolation approach provides a straightforward analytical estimation of the spatial Laplacian for better current sources localization. In this software, coregistration of 2D MEA data on the anatomy of the neural tissue is made possible by fine matching of anatomical data with electrode positions using rigid-deformation-based correction of anatomical pictures. Overall, NeuroMap provides substantial material for detailed spatiotemporal analysis of MEA data. The package is distributed under GNU General Public License and available at http://sites.google.com/site/neuromapsoftware.
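
    The thin plate spline interpolation at the core of the two NeuroMap records above can be reproduced in outline with SciPy's RBFInterpolator, as in the hedged sketch below. The electrode positions and potentials are synthetic; NeuroMap itself is Matlab-based and also derives the spatial Laplacian analytically, which is not shown here.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Synthetic electrode positions (x, y in mm) and one sample per site.
rng = np.random.default_rng(1)
electrodes = rng.uniform(0.0, 2.0, size=(60, 2))
potentials = np.sin(electrodes[:, 0]) * np.cos(electrodes[:, 1])

# Thin plate spline interpolant over the recording sites.
tps = RBFInterpolator(electrodes, potentials, kernel="thin_plate_spline")

# Evaluate on a dense grid; extrema may fall between electrodes.
gx, gy = np.meshgrid(np.linspace(0, 2, 100), np.linspace(0, 2, 100))
grid = np.column_stack([gx.ravel(), gy.ravel()])
field = tps(grid).reshape(gx.shape)
print(field.min(), field.max())
```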

  3. Mapping modern software process engineering techniques onto an HEP development environment

    CERN Document Server

    Wellisch, J P

    2003-01-01

    One of the most challenging issues faced in HEP in recent years is the question of how to capitalise on software development and maintenance experience in a continuous manner. To capitalise means in our context to evaluate and apply new process technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards. The CMS off- line software improvement effort aims at continual software quality improvement, and continual improvement in the efficiency of the working environment with the goal to facilitate doing great new physics. To achieve this, we followed a process improvement program based on ISO-15504, and Rational Unified Process. This experiment in software process improvement in HEP has been progressing now for a period of 3 years. Taking previous experience from ATLAS and SPIDER into account, we used a soft approach of continuous change within the limits of current culture to create of de facto software process standards within th...

  4. FRACOR-software toolbox for deterministic mapping of fracture corridors in oil fields on AutoCAD platform

    Science.gov (United States)

    Ozkaya, Sait I.

    2018-03-01

    Fracture corridors are interconnected large fractures in a narrow, subvertical, tabular array, which usually traverse the entire reservoir vertically and extend for several hundreds of meters laterally. Fracture corridors, with their huge conductivities, constitute an important element of many fractured reservoirs. Unlike small diffuse fractures, actual fracture corridors must be mapped deterministically for simulation or field development purposes. Fracture corridors can be identified and quantified definitively with borehole image logs and well testing. However, there are rarely sufficient image logs or well tests, and it is necessary to utilize various fracture corridor indicators with varying degrees of reliability. Integration of data from many different sources, in turn, requires a platform with powerful editing and layering capability. Available commercial reservoir characterization software packages, with layering and editing capabilities, can be cost intensive. CAD packages are far more affordable and may easily acquire the versatility and power of commercial software packages with the addition of a small software toolbox. The objective of this communication is to present FRACOR, a software toolbox which enables deterministic 2D fracture corridor mapping and modeling on the AutoCAD platform. The FRACOR toolbox is written in AutoLISP and contains several independent routines to import and integrate available fracture corridor data from an oil field and export the results as text files. The resulting fracture corridor maps consist mainly of fracture corridors with different confidence levels, derived from a combination of static and dynamic data, and exclusion zones where no fracture corridor can exist. The exported text file of fracture corridors from FRACOR can be imported into an upscaling program to generate a fracture grid for dual-porosity simulation, or used for field development and well planning.

  5. Surgical planning of total hip arthroplasty: accuracy of computer-assisted EndoMap software in predicting component size

    International Nuclear Information System (INIS)

    Davila, Jesse A.; Kransdorf, Mark J.; Duffy, Gavan P.

    2006-01-01

    The purpose of our study was to assess the accuracy of computer-assisted templating in the surgical planning of patients undergoing total hip arthroplasty utilizing EndoMap software (Siemens AG, Medical Solutions, Erlangen, Germany). EndoMap software is an electronic program that uses DICOM images to analyze standard anteroposterior radiographs for determination of optimal prosthesis component size. We retrospectively reviewed the preoperative radiographs of 36 patients undergoing uncomplicated primary total hip arthroplasty, utilizing EndoMap software, Version VA20. DICOM anteroposterior radiographs were analyzed using standard manufacturer-supplied electronic templates to determine acetabular and femoral component sizes. No additional clinical information was reviewed. Acetabular and femoral component sizes were assessed by an orthopedic surgeon and two radiologists. The mean estimated component size was compared with the component size documented in operative reports. The mean estimated acetabular component size was 53 mm (range 48-60 mm), 1 mm larger than the mean implanted size of 52 mm (range 48-62 mm). Thirty-one of 36 acetabular component sizes (86%) were accurate within one size. The mean calculated femoral component size was 4 (range 2-7), 1 size smaller than the actual mean component size of 5 (range 2-9). Twenty-six of 36 femoral component sizes (72%) were accurate within one size, and accurate within two sizes in all but four cases (94%). EndoMap software predicted femoral component size well, with 72% within one component size of that used, and 94% within two sizes. Acetabular component size was predicted slightly better, with 86% within one component size and 94% within two component sizes. (orig.)

  6. QuBiLS-MIDAS: a parallel free-software for molecular descriptors computation based on multilinear algebraic maps.

    Science.gov (United States)

    García-Jacas, César R; Marrero-Ponce, Yovani; Acevedo-Martínez, Liesner; Barigye, Stephen J; Valdés-Martiní, José R; Contreras-Torres, Ernesto

    2014-07-05

    The present report introduces the QuBiLS-MIDAS software, belonging to the ToMoCoMD-CARDD suite, for the calculation of three-dimensional molecular descriptors (MDs) based on two-linear (bilinear), three-linear, and four-linear (multilinear or N-linear) algebraic forms. It is thus unique software that computes these tensor-based indices. These descriptors establish relations for two, three, and four atoms by using several (dis-)similarity metrics or multimetrics, matrix transformations, cutoffs, local calculations and aggregation operators. The theoretical background of these N-linear indices is also presented. The QuBiLS-MIDAS software was developed in the Java programming language and employs the Chemistry Development Kit library for the manipulation of chemical structures and the calculation of atomic properties. The software comprises a user-friendly desktop interface and an Abstract Programming Interface library. The former was created to simplify the configuration of the different options of the MDs, whereas the library was designed to allow its easy integration into other software for chemoinformatics applications. The program provides functionalities for data-cleaning tasks and for batch processing of the molecular indices. In addition, it offers parallel calculation of the MDs through the use of all available processors in current computers. Complexity analyses of the main algorithms demonstrate that they were implemented efficiently relative to a trivial implementation. Lastly, performance tests reveal that the software behaves suitably as the number of processors increases. The QuBiLS-MIDAS software therefore constitutes a useful application for the computation of molecular indices based on N-linear algebraic maps, and it can be used freely to perform chemoinformatics studies.
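
    At the heart of the two-linear indices described above is a bilinear algebraic form over vectors of atomic properties. The sketch below shows this computation on an invented three-atom example; the relation matrix and property values are hypothetical, and the real software adds metrics, cutoffs and aggregation operators.

```python
import numpy as np

def bilinear_index(relation_matrix, prop_x, prop_y):
    """Two-linear (bilinear) molecular index: x^T M y.

    relation_matrix encodes pairwise atomic relations (e.g., an
    adjacency- or distance-derived matrix); prop_x and prop_y are
    vectors of atomic properties such as mass or electronegativity.
    """
    M = np.asarray(relation_matrix, dtype=float)
    return float(np.asarray(prop_x) @ M @ np.asarray(prop_y))

# Hypothetical 3-atom example with a symmetric relation matrix.
M = np.array([[0.0, 1.0, 0.5],
              [1.0, 0.0, 1.0],
              [0.5, 1.0, 0.0]])
mass = [12.011, 1.008, 15.999]     # invented property vector 1
electroneg = [2.55, 2.20, 3.44]    # invented property vector 2
print(bilinear_index(M, mass, electroneg))
```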

  7. Integrating Remote Sensing with Species Distribution Models; Mapping Tamarisk Invasions Using the Software for Assisted Habitat Modeling (SAHM).

    Science.gov (United States)

    West, Amanda M; Evangelista, Paul H; Jarnevich, Catherine S; Young, Nicholas E; Stohlgren, Thomas J; Talbert, Colin; Talbert, Marian; Morisette, Jeffrey; Anderson, Ryan

    2016-10-11

    Early detection of invasive plant species is vital for the management of natural resources and protection of ecosystem processes. The use of satellite remote sensing for mapping the distribution of invasive plants is becoming more common; however, conventional imaging software and classification methods have been shown to be unreliable. In this study, we test and evaluate the use of five species distribution model techniques fit with satellite remote sensing data to map invasive tamarisk (Tamarix spp.) along the Arkansas River in Southeastern Colorado. The models tested included boosted regression trees (BRT), Random Forest (RF), multivariate adaptive regression splines (MARS), generalized linear model (GLM), and Maxent. These analyses were conducted using a newly developed software package called the Software for Assisted Habitat Modeling (SAHM). All models were trained with 499 presence points, 10,000 pseudo-absence points, and predictor variables acquired from the Landsat 5 Thematic Mapper (TM) sensor over an eight-month period to distinguish tamarisk from native riparian vegetation using detection of phenological differences. From the Landsat scenes, we used individual bands and calculated Normalized Difference Vegetation Index (NDVI), Soil-Adjusted Vegetation Index (SAVI), and tasseled cap transformations. All five models identified current tamarisk distribution on the landscape successfully based on threshold independent and threshold dependent evaluation metrics with independent location data. To account for model specific differences, we produced an ensemble of all five models with map output highlighting areas of agreement and areas of uncertainty. Our results demonstrate the usefulness of species distribution models in analyzing remotely sensed data and the utility of ensemble mapping, and showcase the capability of SAHM in pre-processing and executing multiple complex models.

  8. Integrating remote sensing with species distribution models; Mapping tamarisk invasions using the Software for Assisted Habitat Modeling (SAHM)

    Science.gov (United States)

    West, Amanda M.; Evangelista, Paul H.; Jarnevich, Catherine S.; Young, Nicholas E.; Stohlgren, Thomas J.; Talbert, Colin; Talbert, Marian; Morisette, Jeffrey; Anderson, Ryan

    2016-01-01

    Early detection of invasive plant species is vital for the management of natural resources and protection of ecosystem processes. The use of satellite remote sensing for mapping the distribution of invasive plants is becoming more common; however, conventional imaging software and classification methods have been shown to be unreliable. In this study, we test and evaluate the use of five species distribution model techniques fit with satellite remote sensing data to map invasive tamarisk (Tamarix spp.) along the Arkansas River in Southeastern Colorado. The models tested included boosted regression trees (BRT), Random Forest (RF), multivariate adaptive regression splines (MARS), generalized linear model (GLM), and Maxent. These analyses were conducted using a newly developed software package called the Software for Assisted Habitat Modeling (SAHM). All models were trained with 499 presence points, 10,000 pseudo-absence points, and predictor variables acquired from the Landsat 5 Thematic Mapper (TM) sensor over an eight-month period to distinguish tamarisk from native riparian vegetation using detection of phenological differences. From the Landsat scenes, we used individual bands and calculated Normalized Difference Vegetation Index (NDVI), Soil-Adjusted Vegetation Index (SAVI), and tasseled cap transformations. All five models identified current tamarisk distribution on the landscape successfully based on threshold independent and threshold dependent evaluation metrics with independent location data. To account for model specific differences, we produced an ensemble of all five models with map output highlighting areas of agreement and areas of uncertainty. Our results demonstrate the usefulness of species distribution models in analyzing remotely sensed data and the utility of ensemble mapping, and showcase the capability of SAHM in pre-processing and executing multiple complex models.
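
    Two of the Landsat-derived predictor variables mentioned in records 7 and 8 are straightforward to compute from reflectance bands. The sketch below implements NDVI and SAVI (with the customary soil-adjustment factor L = 0.5) on invented reflectance values, using the Landsat 5 TM convention of band 3 as red and band 4 as near-infrared.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red + 1e-12)  # guard against 0/0

def savi(nir, red, L=0.5):
    """Soil-Adjusted Vegetation Index with soil correction factor L."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (1.0 + L) * (nir - red) / (nir + red + L)

# Invented reflectance values (Landsat 5 TM: band 3 red, band 4 NIR).
red = np.array([[0.08, 0.12], [0.30, 0.25]])
nir = np.array([[0.45, 0.40], [0.32, 0.27]])
print(ndvi(nir, red))
print(savi(nir, red))
```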

  9. Perceptual Mapping Software as a Tool for Facilitating School-Based Consultation

    Science.gov (United States)

    Rush, S. Craig; Kalish, Ashley; Wheeler, Joanna

    2013-01-01

    Perceptual mapping is a systematic method for collecting, analyzing, and presenting group perceptions that is potentially useful in consultation. With input and feedback from a consultee group, perceptual mapping allows the consultant to capture the group's collective perceptions and display them as an organized image that may foster…

  10. Towards a publicly available, map-based regional software tool to estimate unregulated daily streamflow at ungauged rivers

    Directory of Open Access Journals (Sweden)

    S. A. Archfield

    2013-01-01

    Streamflow information is critical for addressing any number of hydrologic problems. Often, streamflow information is needed at locations that are ungauged and, therefore, have no observations on which to base water management decisions. Furthermore, there has been increasing need for daily streamflow time series to manage rivers for both human and ecological functions. To facilitate negotiation between human and ecological demands for water, this paper presents the first publicly available, map-based, regional software tool to estimate historical, unregulated, daily streamflow time series (streamflow not affected by human alteration such as dams or water withdrawals) at any user-selected ungauged river location. The map interface allows users to locate and click on a river location, which then links to a spreadsheet-based program that computes estimates of daily streamflow for the river location selected. For a demonstration region in the northeast United States, daily streamflow was, in general, shown to be reliably estimated by the software tool. Estimating the highest and lowest streamflows that occurred in the demonstration region over the period from 1960 through 2004 also was accomplished but with more difficulty and limitations. The software tool provides a general framework that can be applied to other regions for which daily streamflow estimates are needed.

  11. Towards a publicly available, map-based regional software tool to estimate unregulated daily streamflow at ungauged rivers

    Science.gov (United States)

    Archfield, Stacey A.; Steeves, Peter A.; Guthrie, John D.; Ries, Kernell G.

    2013-01-01

    Streamflow information is critical for addressing any number of hydrologic problems. Often, streamflow information is needed at locations that are ungauged and, therefore, have no observations on which to base water management decisions. Furthermore, there has been increasing need for daily streamflow time series to manage rivers for both human and ecological functions. To facilitate negotiation between human and ecological demands for water, this paper presents the first publicly available, map-based, regional software tool to estimate historical, unregulated, daily streamflow time series (streamflow not affected by human alteration such as dams or water withdrawals) at any user-selected ungauged river location. The map interface allows users to locate and click on a river location, which then links to a spreadsheet-based program that computes estimates of daily streamflow for the river location selected. For a demonstration region in the northeast United States, daily streamflow was, in general, shown to be reliably estimated by the software tool. Estimating the highest and lowest streamflows that occurred in the demonstration region over the period from 1960 through 2004 also was accomplished but with more difficulty and limitations. The software tool provides a general framework that can be applied to other regions for which daily streamflow estimates are needed.
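
    Records 10 and 11 do not state the estimation method used by the tool. A common, simple baseline for transferring daily flows from a gauged to an ungauged site is the drainage-area ratio, sketched below; this is an illustrative assumption (the published tool's actual method is more elaborate), and all numbers are invented.

```python
import numpy as np

def drainage_area_ratio(gauged_flow, gauged_area_km2, ungauged_area_km2):
    """Transfer a daily streamflow series from a gauged to an ungauged
    site by scaling with the ratio of drainage areas."""
    scale = ungauged_area_km2 / gauged_area_km2
    return np.asarray(gauged_flow, dtype=float) * scale

# Five days of reference flows (m^3/s) at a 250 km^2 gauged basin,
# transferred to a 100 km^2 ungauged basin downstream analog.
print(drainage_area_ratio([12.0, 9.5, 8.1, 20.3, 15.7], 250.0, 100.0))
```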

  12. Effect of Software Designed by Computer Conceptual Map Method in Mobile Environment on Learning Level of Nursing Students

    Directory of Open Access Journals (Sweden)

    Salmani N

    2015-12-01

    Aims: In order to keep making progress, nursing education has to utilize new training methods, such that the teaching methods used by nursing instructors promote meaningful learning and prevent superficial learning in students. The conceptual map method is one of the new training strategies playing an important role in this field. The aim of this study was to investigate the effectiveness of software designed around computer conceptual maps in a mobile-phone environment on the learning level of nursing students. Materials & Methods: In this semi-experimental study with a pretest-posttest design, 60 students studying in the 5th semester were investigated during the first semester of 2015-16. The experimental group (n=30), from Meibod Nursing Faculty, and the control group (n=30), from Yazd Shahid Sadoughi Nursing Faculty, were trained during the first 4 weeks of the semester using the computer conceptual map method and the computer conceptual map method in a mobile-phone environment. Data was collected using a researcher-made academic progress test covering "knowledge" and "meaningful learning", and analyzed in SPSS 21 software using independent-t, paired-t, and Fisher tests. Findings: There were significant increases in the mean scores of knowledge and meaningful learning in both groups before and after the intervention (p<0.05). Moreover, the change in the scores of meaningful learning between the groups was statistically significant (p<0.05). Conclusion: Presenting course content as a conceptual map in a mobile-phone environment positively affects the meaningful learning of nursing students.

  13. Analysis of pairwise correlations in multi-parametric PET/MR data for biological tumor characterization and treatment individualization strategies

    Energy Technology Data Exchange (ETDEWEB)

    Leibfarth, Sara; Moennich, David; Thorwarth, Daniela [University Hospital Tuebingen, Section for Biomedical Physics, Department of Radiation Oncology, Tuebingen (Germany); Simoncic, Urban [University Hospital Tuebingen, Section for Biomedical Physics, Department of Radiation Oncology, Tuebingen (Germany); University of Ljubljana, Faculty of Mathematics and Physics, Ljubljana (Slovenia); Jozef Stefan Institute, Ljubljana (Slovenia); Welz, Stefan; Zips, Daniel [University Hospital Tuebingen, Department of Radiation Oncology, Tuebingen (Germany); Schmidt, Holger; Schwenzer, Nina [University Hospital Tuebingen, Department of Diagnostic and Interventional Radiology, Tuebingen (Germany)

    2016-07-15

    The aim of this pilot study was to explore simultaneous functional PET/MR for biological characterization of tumors and potential future treatment adaptations. To investigate the extent of complementarity between different PET/MR-based functional datasets, a pairwise correlation analysis was performed. Functional datasets of N=15 head and neck (HN) cancer patients were evaluated. For patients of group A (N=7), combined PET/MR datasets including FDG-PET and ADC maps were available. Patients of group B (N=8) had FMISO-PET, DCE-MRI and ADC maps from combined PET/MRI, an additional dynamic FMISO-PET/CT acquired directly after FMISO tracer injection, as well as an FDG-PET/CT acquired a few days earlier. From DCE-MR, the parameter maps K^trans, v_e and v_p were obtained with the extended Tofts model. Moreover, parameter maps of mean DCE enhancement, ΔS_DCE, and of the mean FMISO signal 0-4 min p.i., Ā_FMISO, were derived. Pairwise correlations were quantified using the Spearman correlation coefficient (r) on both a voxel and a regional level within the gross tumor volume. Between some pairs of functional imaging modalities, moderate correlations were observed with respect to the median over all patient datasets, whereas distinct correlations were only present on an individual basis. The highest inter-modality median correlations on the voxel level were obtained for FDG/FMISO (r = 0.56), FDG/Ā_FMISO (r = 0.55), Ā_FMISO/ΔS_DCE (r = 0.46), and FDG/ADC (r = -0.39). Correlations on the regional level showed comparable results. The results of this study suggest that the examined functional datasets provide complementary information. However, only pairwise correlations were examined, and correlations could still exist between combinations of three or more datasets. These results might contribute to the future design of individually adapted treatment approaches based on multiparametric functional imaging.
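
    The voxel-level analysis described above amounts to computing a rank correlation between two co-registered parameter maps restricted to the gross tumor volume. The sketch below does this with SciPy on synthetic maps and a spherical toy GTV mask; all data and dimensions are invented.

```python
import numpy as np
from scipy.stats import spearmanr

def gtv_correlation(map_a, map_b, gtv_mask):
    """Spearman correlation between two co-registered parameter maps,
    restricted to voxels inside the gross tumor volume."""
    rho, p = spearmanr(map_a[gtv_mask], map_b[gtv_mask])
    return rho, p

# Toy example: two weakly related 3D maps and a spherical GTV mask.
rng = np.random.default_rng(42)
fdg = rng.normal(size=(32, 32, 16))
adc = -0.4 * fdg + rng.normal(size=fdg.shape)  # built-in negative link
z, y, x = np.ogrid[:32, :32, :16]
gtv = (z - 16) ** 2 + (y - 16) ** 2 + (x - 8) ** 2 < 8 ** 2
print(gtv_correlation(fdg, adc, gtv))
```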

  14. A new software routine that automates the fitting of protein X-ray crystallographic electron-density maps.

    Science.gov (United States)

    Levitt, D G

    2001-07-01

    The classical approach to building the amino-acid residues into the initial electron-density map requires days to weeks of a skilled investigator's time. Automating this procedure should not only save time, but has the potential to provide a more accurate starting model for input to refinement programs. The new software routine MAID builds the protein structure into the electron-density map in a series of sequential steps. The first step is the fitting of the secondary alpha-helix and beta-sheet structures. These 'fits' are then used to determine the local amino-acid sequence assignment. These assigned fits are then extended through the loop regions and fused with the neighboring sheet or helix. The program was tested on the unaveraged 2.5 Å selenomethionine multiple-wavelength anomalous dispersion (SMAD) electron-density map that was originally used to solve the structure of the 291-residue protein human heart short-chain L-3-hydroxyacyl-CoA dehydrogenase (SHAD). Inputting just the map density and the amino-acid sequence, MAID fitted 80% of the residues with an r.m.s.d. error of 0.43 Å for the main-chain atoms and 1.0 Å for all atoms without any user intervention. When tested on a higher quality 1.9 Å SMAD map, MAID correctly fitted 100% (418) of the residues. A major advantage of the MAID fitting procedure is that it maintains ideal bond lengths and angles and constrains phi/psi angles to the appropriate Ramachandran regions. Recycling the output of this new routine through a partial structure-refinement program may have the potential to completely automate the fitting of electron-density maps.
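
    For context, the r.m.s.d. figures quoted above are root-mean-square deviations between fitted and reference atomic coordinates. A minimal sketch of that metric (the Nx3 coordinate arrays and file names are hypothetical, and the models are assumed already superposed on the same frame) is:

      import numpy as np

      def rmsd(fitted: np.ndarray, reference: np.ndarray) -> float:
          """Root-mean-square deviation between two (N, 3) coordinate arrays,
          assumed already superposed on the same frame (in Angstroms)."""
          diff = fitted - reference
          return float(np.sqrt((diff ** 2).sum(axis=1).mean()))

      # Example: main-chain atoms of the fitted model vs. the refined structure
      fitted_xyz = np.load("maid_mainchain.npy")        # hypothetical file
      reference_xyz = np.load("refined_mainchain.npy")  # hypothetical file
      print(f"main-chain r.m.s.d. = {rmsd(fitted_xyz, reference_xyz):.2f} Angstrom")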

  15. Design and Implement a MapReduce Framework for Executing Standalone Software Packages in Hadoop-based Distributed Environments

    Directory of Open Access Journals (Sweden)

    Chao-Chun Chen

    2013-12-01

    Full Text Available The Hadoop MapReduce is the programming model for designing automatically scalable distributed computing applications. It provides developers with an effective environment for attaining automatic parallelization. However, most existing manufacturing systems are arduous and restrictive to migrate to a MapReduce private cloud, due to platform incompatibility and the tremendous complexity of system reconstruction. To increase the efficiency of manufacturing systems with minimum modification of the existing systems, we design a framework in this work, called MC-Framework: Multi-uses-based Cloudizing-Application Framework. It provides a simple interface with which users can fairly execute requested tasks that work with traditional standalone software packages in MapReduce-based private cloud environments. Moreover, this work focuses on multiuser workloads, under which the default Hadoop scheduling scheme, i.e., FIFO, would increase delay; hence, we also propose a new scheduling mechanism, called Job-Sharing Scheduling, to distribute jobs fairly across the machines of the MapReduce-based private cloud. We then prototype an experimental virtual-metrology module of a manufacturing system as a case study to verify and analyze the proposed MC-Framework. The results of our experiments indicate that the proposed framework greatly improves time performance compared with the original package.
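
    The general pattern the framework automates, wrapping a standalone executable so Hadoop can run many instances in parallel, can be approximated with Hadoop Streaming. Below is a minimal, hypothetical Python mapper; the package name vm_tool and its command-line flags are placeholders for illustration, not part of MC-Framework:

      #!/usr/bin/env python3
      # Minimal Hadoop Streaming mapper: each input line names one work item
      # (e.g. a lot of wafer measurements) to be processed by a legacy
      # standalone package. Hadoop runs many copies of this mapper in parallel.
      import subprocess
      import sys

      for line in sys.stdin:
          item = line.strip()
          if not item:
              continue
          # Invoke the unmodified standalone package on this work item.
          # "vm_tool" and its flags are hypothetical placeholders.
          result = subprocess.run(
              ["vm_tool", "--input", item],
              capture_output=True, text=True, check=True,
          )
          # Emit key<TAB>value pairs for the (optional) reducer.
          print(f"{item}\t{result.stdout.strip()}")

    Such a script would be launched with the standard hadoop-streaming jar; the value of a framework like MC-Framework is performing this wrapping, plus multiuser job-sharing scheduling, without per-package manual work.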

  16. Co-trained convolutional neural networks for automated detection of prostate cancer in multi-parametric MRI.

    Science.gov (United States)

    Yang, Xin; Liu, Chaoyue; Wang, Zhiwei; Yang, Jun; Min, Hung Le; Wang, Liang; Cheng, Kwang-Ting Tim

    2017-12-01

    Multi-parameter magnetic resonance imaging (mp-MRI) is increasingly popular for prostate cancer (PCa) detection and diagnosis. However, interpreting mp-MRI data, which typically contains multiple unregistered 3D sequences, e.g. apparent diffusion coefficient (ADC) and T2-weighted (T2w) images, is time-consuming and demands special expertise, limiting its usage for large-scale PCa screening. Therefore, solutions for computer-aided detection of PCa in mp-MRI images are highly desirable. Most recent advances in automated methods for PCa detection employ a handcrafted-feature-based two-stage classification flow, i.e. voxel-level classification followed by region-level classification. This work presents an automated PCa detection system which can concurrently identify the presence of PCa in an image and localize lesions based on deep convolutional neural network (CNN) features and a single-stage SVM classifier. Specifically, the developed co-trained CNNs consist of two parallel convolutional networks for ADC and T2w images, respectively. Each network is trained using images of a single modality in a weakly supervised manner, by providing a set of prostate images with image-level labels indicating only the presence of PCa, without priors on the lesions' locations. Discriminative visual patterns of lesions can be learned effectively from the clutter of prostate and surrounding tissues. A cancer response map, with each pixel indicating the likelihood of being cancerous, is explicitly generated at the last convolutional layer of the network for each modality. A new back-propagated error E is defined to enforce both optimized classification results and consistent cancer response maps for the different modalities, which helps capture highly representative PCa-relevant features during the CNN feature learning process. The CNN features of each modality are concatenated and fed into a SVM classifier. For images which are classified to contain cancers, non-maximum suppression and adaptive
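
    A simplified PyTorch sketch of the co-training idea follows. The network sizes, the global-max-pooling weak supervision, and the toy consistency term (which assumes roughly aligned ADC/T2w inputs, whereas the paper's error E is designed for unregistered sequences) are illustrative assumptions; the downstream SVM on concatenated CNN features is omitted:

      import torch
      import torch.nn as nn

      class ResponseNet(nn.Module):
          """Small fully convolutional branch for one modality (ADC or T2w).
          The last conv layer produces a single-channel cancer response map."""
          def __init__(self):
              super().__init__()
              self.features = nn.Sequential(
                  nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
                  nn.MaxPool2d(2),
                  nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
                  nn.MaxPool2d(2),
                  nn.Conv2d(32, 1, 1),  # cancer response map (logits)
              )

          def forward(self, x):
              rmap = self.features(x)                    # (B, 1, H/4, W/4)
              score = rmap.flatten(1).max(dim=1).values  # image-level score (weak supervision)
              return rmap, score

      adc_net, t2w_net = ResponseNet(), ResponseNet()
      bce = nn.BCEWithLogitsLoss()

      def co_training_loss(adc_img, t2w_img, label):
          """label: (B,) float tensor, 1 if the image contains PCa."""
          rmap_a, score_a = adc_net(adc_img)
          rmap_t, score_t = t2w_net(t2w_img)
          cls_loss = bce(score_a, label) + bce(score_t, label)
          # Consistency term: the two modalities' response maps should agree.
          consistency = ((torch.sigmoid(rmap_a) - torch.sigmoid(rmap_t)) ** 2).mean()
          return cls_loss + 0.1 * consistency  # 0.1 is an arbitrary weight

      # Toy usage with random tensors standing in for mp-MRI slices
      loss = co_training_loss(torch.randn(4, 1, 64, 64),
                              torch.randn(4, 1, 64, 64),
                              torch.tensor([1., 0., 1., 0.]))
      loss.backward()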

  17. An evaluation of morphological and functional multi-parametric MRI sequences in classifying non-muscle and muscle invasive bladder cancer

    International Nuclear Information System (INIS)

    Panebianco, Valeria; Barchetti, Giovanni; Grompone, Marcello Domenico; Del Monte, Maurizio; Carano, Davide; Catalano, Carlo; De Berardinis, Ettore; Leonardo, Constantino; Simone, Giuseppe; Gallucci, Michele; National Cancer Institute, Rome; Catto, James

    2017-01-01

    Our goal is to determine the ability of multi-parametric magnetic resonance imaging (mpMRI) to differentiate muscle invasive bladder cancer (MIBC) from non-muscle invasive bladder cancer (NMIBC). Patients underwent mpMRI before tumour resection. Four MRI sets, i.e. T2-weighted (T2W) + perfusion-weighted imaging (PWI), T2W + diffusion-weighted imaging (DWI), T2W + DWI + PWI, and T2W + DWI + PWI + diffusion tensor imaging (DTI), were interpreted qualitatively by two radiologists, blinded to histology results. PWI, DWI and DTI were also analysed quantitatively. Accuracy was determined using histopathology as the reference standard. A total of 82 tumours were analysed. Ninety-six percent of the tumours labelled as T1 by the T2W + DWI + PWI image set were confirmed to be NMIBC at histopathology. The overall accuracy of the complete mpMRI protocol was 94% in differentiating NMIBC from MIBC. PWI, DWI and DTI quantitative parameters were shown to be significantly different in cancerous versus non-cancerous areas within the bladder wall in T2-labelled lesions. MpMRI with DWI and DTI appears a reliable staging tool for bladder cancer. If our data are validated, then mpMRI could precede cystoscopic resection to allow faster recognition of MIBC and accelerated treatment pathways. (orig.)

  18. Development and validation of a logistic regression model to distinguish transition zone cancers from benign prostatic hyperplasia on multi-parametric prostate MRI

    Energy Technology Data Exchange (ETDEWEB)

    Iyama, Yuji [Kumamoto Chuo Hospital, Department of Diagnostic Radiology, Kumamoto, Kumamoto (Japan); Kumamoto University, Department of Diagnostic Radiology, Graduate School of Medical Sciences, Kumamoto, Kumamoto (Japan); Nakaura, Takeshi; Nagayama, Yasunori; Utsunomiya, Daisuke; Yamashita, Yasuyuki [Kumamoto University, Department of Diagnostic Radiology, Graduate School of Medical Sciences, Kumamoto, Kumamoto (Japan); Katahira, Kazuhiro; Oda, Seitaro [Kumamoto Chuo Hospital, Department of Diagnostic Radiology, Kumamoto, Kumamoto (Japan); Iyama, Ayumi [National Hospital Organization Kumamoto Medical Center, Department of Diagnostic Radiology, Kumamoto, Kumamoto (Japan)

    2017-09-15

    To develop a prediction model to distinguish between transition zone (TZ) cancers and benign prostatic hyperplasia (BPH) on multi-parametric prostate magnetic resonance imaging (mp-MRI). This retrospective study enrolled 60 patients with either BPH or TZ cancer, who had undergone 3 T MRI. We generated ten parameters from T2-weighted images (T2WI), diffusion-weighted images (DWI) and dynamic MRI. Using a t-test and multivariate logistic regression (LR) analysis to evaluate the parameters' accuracy, we developed LR models. We calculated the area under the receiver operating characteristic curve (AUC) of the LR models by a leave-one-out cross-validation procedure, and the LR models' performance was compared with the performance of radiologists, based on their subjective opinion and on the Prostate Imaging Reporting and Data System (PI-RADS v2) score. Multivariate LR analysis showed that only the standardized T2WI signal and the mean apparent diffusion coefficient (ADC) retained independent value (P < 0.001). The validation analysis showed that the AUC of the final LR model was comparable to that of board-certified radiologists, and superior to that of PI-RADS scores. A standardized T2WI signal and mean ADC were independent factors for distinguishing between BPH and TZ cancer. The performance of the LR model was comparable to that of experienced radiologists. (orig.)
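
    The validation scheme described above, a logistic regression scored by leave-one-out cross-validation and AUC, can be sketched in a few lines of Python with scikit-learn; the feature files (e.g. standardized T2WI signal and mean ADC per lesion) are hypothetical:

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import LeaveOneOut
      from sklearn.metrics import roc_auc_score

      # Hypothetical feature matrix: one row per lesion, columns such as
      # standardized T2WI signal and mean ADC; y = 1 for TZ cancer, 0 for BPH.
      X = np.load("features.npy")
      y = np.load("labels.npy")

      scores = np.empty_like(y, dtype=float)
      for train_idx, test_idx in LeaveOneOut().split(X):
          model = LogisticRegression().fit(X[train_idx], y[train_idx])
          scores[test_idx] = model.predict_proba(X[test_idx])[:, 1]

      print(f"leave-one-out AUC = {roc_auc_score(y, scores):.2f}")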

  19. Multi-parametric study of temperature and thermal damage of tumor exposed to high-frequency nanosecond-pulsed electric fields based on finite element simulation.

    Science.gov (United States)

    Mi, Yan; Rui, Shaoqin; Li, Chengxiang; Yao, Chenguo; Xu, Jin; Bian, Changhao; Tang, Xuefeng

    2017-07-01

    High-frequency nanosecond-pulsed electric fields were recently introduced for tumor or abnormal tissue ablation to solve some problems of conventional electroporation. However, it is necessary to study the thermal effects of high-field-intensity nanosecond pulses inside tissues. The multi-parametric analysis performed here is based on a finite element model of liver tissue with a tumor that has been punctured by a pair of needle electrodes. The pulse voltage used in this study ranges from 1 to 4 kV, the pulse width ranges from 50 to 500 ns, and the repetition frequency is between 100 kHz and 1 MHz. The total pulse length is 100 μs, and the pulse burst repetition frequency is 1 Hz. Blood flow and metabolic heat generation have also been considered. Results indicate that the maximum instantaneous temperature at 100 μs can reach 49 °C, with a maximum instantaneous temperature at 1 s of 40 °C, which will not cause thermal damage during a single pulse burst. By parameter fitting, we can obtain the maximum instantaneous temperatures at 100 μs and 1 s for any parameter values. However, higher temperatures will be reached and may cause thermal damage when multiple pulse bursts are applied. These results provide a theoretical basis for pulse parameter selection in future experimental research.
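
    The physics underlying such tissue-heating simulations is the Pennes bioheat equation (conduction plus blood perfusion, metabolic heat, and an external Joule-heating source). A crude 1D finite-difference sketch is given below; the paper itself uses a 3D finite element model, and all constants here are assumed typical literature values, not the authors' parameters:

      import numpy as np

      # 1D explicit finite-difference sketch of the Pennes bioheat equation:
      #   rho*c dT/dt = k d2T/dx2 + w_b*rho_b*c_b*(T_a - T) + Q_met + Q_pulse
      rho, c, k = 1060.0, 3600.0, 0.52          # kg/m3, J/(kg K), W/(m K)
      w_b, rho_b, c_b = 0.0005, 1000.0, 4180.0  # blood perfusion terms (assumed)
      T_a, Q_met = 37.0, 420.0                  # arterial temp (C), metabolic heat (W/m3)

      n, dx, dt = 200, 1e-4, 1e-3               # grid points, spacing (m), step (s)
      T = np.full(n, 37.0)
      Q_pulse = np.zeros(n)
      Q_pulse[90:110] = 5e5                     # hypothetical Joule heating near electrodes (W/m3)

      for _ in range(int(1.0 / dt)):            # simulate 1 s
          lap = (np.roll(T, 1) - 2 * T + np.roll(T, -1)) / dx**2
          lap[0] = lap[-1] = 0.0                # crude boundary treatment (ignore edge diffusion)
          T += dt / (rho * c) * (k * lap + w_b * rho_b * c_b * (T_a - T) + Q_met + Q_pulse)

      print(f"peak temperature after 1 s: {T.max():.2f} C")

    The explicit scheme is stable here because dt is well below dx^2*rho*c/(2k) (about 37 ms for these values).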

  20. An evaluation of morphological and functional multi-parametric MRI sequences in classifying non-muscle and muscle invasive bladder cancer

    Energy Technology Data Exchange (ETDEWEB)

    Panebianco, Valeria; Barchetti, Giovanni; Grompone, Marcello Domenico; Del Monte, Maurizio; Carano, Davide; Catalano, Carlo [Sapienza Univ. Rome (Italy). Dept. of Radiological Sciences, Oncology and Pathology; De Berardinis, Ettore; Leonardo, Constantino [Sapienza Univ. Rome (Italy). Dept. of Gynaecological-Obstetric and Urological Sciences; Simone, Giuseppe; Gallucci, Michele [''Regina Elena'' National Cancer Institute, Rome (Italy). Dept. of Urology; Catto, James [Sheffield Univ. (United Kingdom). Academic Urology Unit

    2017-09-15

    Our goal is to determine the ability of multi-parametric magnetic resonance imaging (mpMRI) to differentiate muscle invasive bladder cancer (MIBC) from non-muscle invasive bladder cancer (NMIBC). Patients underwent mpMRI before tumour resection. Four MRI sets, i.e. T2-weighted (T2W) + perfusion-weighted imaging (PWI), T2W + diffusion-weighted imaging (DWI), T2W + DWI + PWI, and T2W + DWI + PWI + diffusion tensor imaging (DTI), were interpreted qualitatively by two radiologists, blinded to histology results. PWI, DWI and DTI were also analysed quantitatively. Accuracy was determined using histopathology as the reference standard. A total of 82 tumours were analysed. Ninety-six percent of the tumours labelled as T1 by the T2W + DWI + PWI image set were confirmed to be NMIBC at histopathology. The overall accuracy of the complete mpMRI protocol was 94% in differentiating NMIBC from MIBC. PWI, DWI and DTI quantitative parameters were shown to be significantly different in cancerous versus non-cancerous areas within the bladder wall in T2-labelled lesions. MpMRI with DWI and DTI appears a reliable staging tool for bladder cancer. If our data are validated, then mpMRI could precede cystoscopic resection to allow faster recognition of MIBC and accelerated treatment pathways. (orig.)

  1. Predicting final extent of ischemic infarction using artificial neural network analysis of multi-parametric MRI in patients with stroke.

    Directory of Open Access Journals (Sweden)

    Hassan Bagher-Ebadian

    Full Text Available In hemispheric ischemic stroke, the final size of the ischemic lesion is the most important correlate of clinical functional outcome. Using a set of acute-phase MR images (diffusion-weighted DWI, T1-weighted T1WI, T2-weighted T2WI, and proton-density-weighted PDWI) for inputs, and the chronic T2WI at 3 months as an outcome measure, an Artificial Neural Network (ANN) was trained to predict the 3-month outcome in the form of a voxel-by-voxel forecast of the chronic T2WI. The ANN was trained and tested using 12 subjects (with 83 slices and 140,218 voxels) using a leave-one-out cross-validation method, with calculation of the Area Under the Receiver Operating Characteristic curve (AUROC) for training, testing and optimization of the ANN. After training and optimization, the ANN produced maps of predicted outcome that were well correlated (r = 0.80, p<0.0001) with the T2WI at 3 months for all 12 patients. This result implies that the trained ANN can provide an estimate of the 3-month ischemic lesion on T2WI in a stable and accurate manner (AUROC = 0.89).
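
    The voxel-wise prediction scheme can be sketched with a small multilayer perceptron in scikit-learn; the hidden-layer size, the pooled voxel tables, and the file names are illustrative assumptions standing in for the paper's trained ANN:

      import numpy as np
      from sklearn.neural_network import MLPClassifier
      from sklearn.metrics import roc_auc_score

      # Hypothetical voxel table: rows = voxels pooled over training patients,
      # columns = acute-phase intensities (DWI, T1WI, T2WI, PDWI);
      # y = 1 if the voxel is lesional on the 3-month chronic T2WI.
      X_train = np.load("acute_voxels_train.npy")
      y_train = np.load("chronic_labels_train.npy")
      X_test = np.load("acute_voxels_test.npy")   # voxels of the left-out patient
      y_test = np.load("chronic_labels_test.npy")

      ann = MLPClassifier(hidden_layer_sizes=(10,), max_iter=500)
      ann.fit(X_train, y_train)
      p = ann.predict_proba(X_test)[:, 1]         # voxel-wise lesion probability map
      print(f"AUROC on left-out patient: {roc_auc_score(y_test, p):.2f}")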

  2. Identify and Classify Critical Success Factor of Agile Software Development Methodology Using Mind Map

    OpenAIRE

    Tasneem Abd El Hameed; Mahmoud Abd EL Latif; Sherif Kholief

    2016-01-01

    Selecting the right method, the right personnel and the right practices, and applying them adequately, determines the success of software development. In this paper, a qualitative study is carried out on the critical success factors identified in previous studies. The success factors are matched with their related agile principles to illustrate the most valuable factors for the success of the agile approach; this paper also shows that the twelve principles are poorly identified for a few factors resulting from qualitative and qua...

  3. WEB MAPPING ARCHITECTURES BASED ON OPEN SPECIFICATIONS AND FREE AND OPEN SOURCE SOFTWARE IN THE WATER DOMAIN

    Directory of Open Access Journals (Sweden)

    C. Arias Muñoz

    2017-09-01

    Full Text Available The availability of water-related data and information across different geographical and jurisdictional scales is of critical importance for the conservation and management of water resources in the 21st century. Today information assets are often found fragmented across multiple agencies that use incompatible data formats and procedures for data collection, storage, maintenance, analysis, and distribution. The growing adoption of Web mapping systems in the water domain is reducing the gap between data availability and its practical use and accessibility. Nevertheless, more attention must be given to the design and development of these systems to achieve high levels of interoperability and usability while fulfilling different end user informational needs. This paper first presents a brief overview of technologies used in the water domain, and then presents three examples of Web mapping architectures based on free and open source software (FOSS and the use of open specifications (OS that address different users’ needs for data sharing, visualization, manipulation, scenario simulations, and map production. The purpose of the paper is to illustrate how the latest developments in OS for geospatial and water-related data collection, storage, and sharing, combined with the use of mature FOSS projects facilitate the creation of sophisticated interoperable Web-based information systems in the water domain.

  4. Web Mapping Architectures Based on Open Specifications and Free and Open Source Software in the Water Domain

    Science.gov (United States)

    Arias Muñoz, C.; Brovelli, M. A.; Kilsedar, C. E.; Moreno-Sanchez, R.; Oxoli, D.

    2017-09-01

    The availability of water-related data and information across different geographical and jurisdictional scales is of critical importance for the conservation and management of water resources in the 21st century. Today information assets are often found fragmented across multiple agencies that use incompatible data formats and procedures for data collection, storage, maintenance, analysis, and distribution. The growing adoption of Web mapping systems in the water domain is reducing the gap between data availability and its practical use and accessibility. Nevertheless, more attention must be given to the design and development of these systems to achieve high levels of interoperability and usability while fulfilling different end user informational needs. This paper first presents a brief overview of technologies used in the water domain, and then presents three examples of Web mapping architectures based on free and open source software (FOSS) and the use of open specifications (OS) that address different users' needs for data sharing, visualization, manipulation, scenario simulations, and map production. The purpose of the paper is to illustrate how the latest developments in OS for geospatial and water-related data collection, storage, and sharing, combined with the use of mature FOSS projects facilitate the creation of sophisticated interoperable Web-based information systems in the water domain.
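
    One concrete building block of such interoperable architectures is the OGC Web Map Service (WMS), a widely used open specification. A minimal client-side sketch using the OWSLib Python library follows; the service URL and layer name are hypothetical placeholders, not endpoints from the paper:

      from owslib.wms import WebMapService

      wms = WebMapService("https://example.org/geoserver/wms", version="1.1.1")
      print(list(wms.contents))            # layers advertised by the service

      img = wms.getmap(
          layers=["water:river_network"],  # hypothetical layer
          srs="EPSG:4326",
          bbox=(8.4, 44.9, 10.6, 46.6),    # lon/lat bounding box
          size=(800, 600),
          format="image/png",
      )
      with open("river_network.png", "wb") as f:
          f.write(img.read())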

  5. Panoramic Images Mapping Tools Integrated Within the ESRI ArcGIS Software

    International Nuclear Information System (INIS)

    Guo, Jiao; Zhong, Ruofei; Zeng, Fanyang

    2014-01-01

    Panoramic images have been widely studied since the appearance of Google Street View. Beyond 360-degree viewing of streets, further applications can be realized on top of panoramic images. This paper presents a toolkit plugged into ArcGIS which can display panoramic photographs at street level directly from ArcMap and measure and capture all visible elements, such as frontages, trees and bridges. We use a series of panoramic images georeferenced with absolute coordinates obtained from GPS and IMU. Two methods are used in this paper to measure objects from these panoramic images: one intersects the object position from a stereo pair; the other uses multi-image matching involving three or more images that all cover the object. When a user wants to measure an object, any two panoramic images that both contain the object can be chosen and displayed in ArcMap. The correlation coefficient of the two chosen panoramic images is then computed in order to match corresponding points and calculate the coordinates of the object. Our study tests different patterns of panoramic pairs and compares the measurement results to the true values of the objects, in order to recommend the best pair selection. The article mainly elaborates the principles of computing the correlation coefficient and of multi-image matching.
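
    The stereo-pair intersection step amounts to intersecting two bearing rays from the known camera stations. A minimal 2D (ground-plane) least-squares sketch follows; the station positions and bearings are toy values, and the full problem is of course 3D:

      import numpy as np

      def intersect_rays(p1, b1, p2, b2):
          """Least-squares intersection of two ground-plane bearing rays.
          p1, p2: (2,) camera positions (projected map coordinates);
          b1, b2: bearing angles in radians (from GPS/IMU plus image azimuth).
          Returns the point minimizing summed squared distance to both rays."""
          d1 = np.array([np.sin(b1), np.cos(b1)])
          d2 = np.array([np.sin(b2), np.cos(b2)])
          # A point x on ray i satisfies (I - d_i d_i^T)(x - p_i) = 0; stack both.
          A = np.zeros((2, 2)); rhs = np.zeros(2)
          for p, d in ((np.asarray(p1), d1), (np.asarray(p2), d2)):
              P = np.eye(2) - np.outer(d, d)
              A += P
              rhs += P @ p
          return np.linalg.solve(A, rhs)

      # Two hypothetical panorama stations 20 m apart, both sighting one tree
      print(intersect_rays((0.0, 0.0), np.radians(45), (20.0, 0.0), np.radians(-45)))
      # -> [10. 10.]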

  6. Interim report on the development and application of environmental mapped data digitization, encoding, analysis, and display software for the ALICE system. Volume II. [MAP, CHAIN, FIX, and DOUT, in FORTRAN IV for PDP-10

    Energy Technology Data Exchange (ETDEWEB)

    Amiot, L.W.; Lima, R.J.; Scholbrock, S.D.; Shelman, C.B.; Wehman, R.H.

    1979-06-01

    Volume I of An Interim Report on the Development and Application of Environmental Mapped Data Digitization, Encoding, Analysis, and Display Software for the ALICE System provided an overall description of the software developed for the ALICE System and presented an example of its application. The scope of the information presented in Volume I was directed both to the users and developers of digitization, encoding, analysis, and display software. Volume II presents information which is directly related to the actual computer code and operational characteristics (keys and subroutines) of the software. Volume II will be of more interest to developers of software than to users of the software. However, developers of software should be aware that the code developed for the ALICE System operates in an environment where much of the peripheral hardware to the PDP-10 is ANL/AMD built. For this reason, portions of the code may have to be modified for implementation on other computer system configurations. 11 tables.

  7. Local Dynamic Map as a modular software framework for driver assistance systems

    Science.gov (United States)

    Reisdorf, P.; Auerswald, A.; Wanielik, G.

    2015-11-01

    Modern driver assistance systems are based on the processing of information obtained through environment perception with various sensors. In addition to the information from the vehicle itself, different communication channels (Car2Car, Car2X, ...) provide an extended perception of the environment (see Fig. 1). These data must be prepared and made available to an application in a goal-oriented manner, which can be accomplished with the help of a Local Dynamic Map (LDM). This publication describes the structure, purpose and properties of a developed LDM and discusses several applications that were realized with its help.
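
    An LDM is conventionally organized in four layers, from static road geometry to highly dynamic objects. The Python sketch below illustrates that layering; the layer split follows the common ETSI-style convention, but the classes, fields and query are simplified assumptions, not the authors' framework:

      from dataclasses import dataclass, field

      @dataclass
      class LDMObject:
          obj_id: str
          x: float           # position in a local map frame (m)
          y: float
          source: str        # e.g. "ego-radar", "Car2Car", "Car2X"
          timestamp: float   # seconds; dynamic entries expire

      @dataclass
      class LocalDynamicMap:
          static_map: list = field(default_factory=list)    # layer 1: road geometry
          quasi_static: list = field(default_factory=list)  # layer 2: signs, landmarks
          transient: list = field(default_factory=list)     # layer 3: roadworks, weather
          dynamic: list = field(default_factory=list)       # layer 4: vehicles, pedestrians

          def insert(self, obj: LDMObject, layer: str) -> None:
              getattr(self, layer).append(obj)

          def query_radius(self, x: float, y: float, r: float) -> list:
              """Return all objects of all layers within radius r of (x, y)."""
              everything = (self.static_map + self.quasi_static
                            + self.transient + self.dynamic)
              return [o for o in everything
                      if (o.x - x) ** 2 + (o.y - y) ** 2 <= r ** 2]

      ldm = LocalDynamicMap()
      ldm.insert(LDMObject("veh-42", 12.0, 3.5, "Car2Car", 17.20), "dynamic")
      print(ldm.query_radius(0.0, 0.0, 20.0))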

  8. Identification of RNA molecules by specific enzyme digestion and mass spectrometry: software for and implementation of RNA mass mapping

    DEFF Research Database (Denmark)

    Matthiesen, Rune; Kirpekar, Finn

    2009-01-01

    The idea of identifying or characterizing an RNA molecule based on a mass spectrum of specifically generated RNA fragments has been used in various forms for well over a decade. We have developed software, named RRM for 'RNA mass mapping', which can search whole prokaryotic genomes or RNA FASTA sequence databases to identify the origin of a given RNA based on a mass spectrum of RNA fragments. As input, the program uses the masses from specific RNase cleavage of the RNA under investigation. RNase T1 digestion is used here as a demonstration of the usability of the method for RNA identification. The concept for identification is that the masses of the digestion products constitute a specific fingerprint which characterizes the given RNA. The search algorithm is based on the same principles as those used in peptide mass fingerprinting, but has here been extended to work for both RNA sequence databases
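
    The fingerprinting idea can be illustrated with a small in-silico RNase T1 digestion in Python. The residue masses are standard monoisotopic values; the assumed end chemistry (5'-OH with 3'-phosphate for internal fragments, 3'-OH for the 3'-terminal fragment) is a simplification of the real product chemistry:

      # In-silico RNase T1 digestion: T1 cleaves 3' of guanosine.
      RESIDUE = {"A": 329.05252, "C": 305.04129, "G": 345.04744, "U": 306.02530}
      H2O, HPO3 = 18.01056, 79.96633

      def rnase_t1_fragments(seq: str):
          frags, start = [], 0
          for i, base in enumerate(seq):
              if base == "G":                       # cleave after every G
                  frags.append(seq[start:i + 1])
                  start = i + 1
          if start < len(seq):
              frags.append(seq[start:])             # 3'-terminal fragment
          return frags

      def fragment_mass(frag: str, terminal: bool) -> float:
          mass = sum(RESIDUE[b] for b in frag) + H2O
          return mass - HPO3 if terminal else mass  # terminal fragment lacks the 3'-phosphate

      seq = "AUCGGAUACGUUAG"                        # toy sequence
      frags = rnase_t1_fragments(seq)
      for i, f in enumerate(frags):
          print(f, round(fragment_mass(f, terminal=(i == len(frags) - 1)), 4))

    Matching such a theoretical mass list against the measured spectrum, over every candidate sequence in a genome or database, is the core of the RRM search.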

  9. Sparse representation of multi parametric DCE-MRI features using K-SVD for classifying gene expression based breast cancer recurrence risk

    Science.gov (United States)

    Mahrooghy, Majid; Ashraf, Ahmed B.; Daye, Dania; Mies, Carolyn; Rosen, Mark; Feldman, Michael; Kontos, Despina

    2014-03-01

    We evaluate the prognostic value of sparse representation-based features by applying the K-SVD algorithm on multiparametric kinetic, textural, and morphologic features in breast dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI). K-SVD is an iterative dimensionality reduction method that optimally reduces the initial feature space by updating the dictionary columns jointly with the sparse representation coefficients. Therefore, by using K-SVD, we not only provide a sparse representation of the features and condense the information in a few coefficients, but also reduce the dimensionality. The extracted K-SVD features are evaluated by a machine learning algorithm including a logistic regression classifier for the task of classifying high versus low breast cancer recurrence risk as determined by a validated gene expression assay. The features are evaluated using ROC curve analysis and leave-one-out cross validation for different sparse representation and dimensionality reduction numbers. Optimal sparse representation is obtained when the number of dictionary elements is 4 (K=4) and the maximum number of non-zero coefficients is 2 (L=2). We compare K-SVD with ANOVA-based feature selection for the same prognostic features. The ROC results show that the AUCs of the K-SVD based (K=4, L=2), the ANOVA-based, and the original features (i.e., no dimensionality reduction) are 0.78, 0.71, and 0.68, respectively. From the results, it can be inferred that by using a sparse representation of the originally extracted multi-parametric, high-dimensional data, we can condense the information into a few coefficients with the highest predictive value. In addition, the dimensionality reduction introduced by K-SVD can prevent models from over-fitting.
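
    The core K-SVD loop, sparse coding followed by rank-1 dictionary-atom updates, fits in a short NumPy/scikit-learn sketch. K=4 and L=2 mirror the optimal settings reported above; the toy data dimensions are assumptions:

      import numpy as np
      from sklearn.linear_model import orthogonal_mp

      def ksvd(Y, n_atoms=4, n_nonzero=2, n_iter=20, seed=0):
          """Minimal K-SVD sketch: Y is (n_features, n_signals). Returns a
          dictionary D (n_features, n_atoms) and codes X (n_atoms, n_signals)."""
          rng = np.random.default_rng(seed)
          D = rng.standard_normal((Y.shape[0], n_atoms))
          D /= np.linalg.norm(D, axis=0)
          for _ in range(n_iter):
              # Sparse coding step (orthogonal matching pursuit)
              X = orthogonal_mp(D, Y, n_nonzero_coefs=n_nonzero)
              # Dictionary update: refit one atom at a time via rank-1 SVD
              for k in range(n_atoms):
                  used = np.flatnonzero(X[k])
                  if used.size == 0:
                      continue
                  E = Y[:, used] - D @ X[:, used] + np.outer(D[:, k], X[k, used])
                  U, s, Vt = np.linalg.svd(E, full_matrices=False)
                  D[:, k] = U[:, 0]
                  X[k, used] = s[0] * Vt[0]
          return D, X

      # Toy usage: 30 lesions x 12 kinetic/textural/morphologic features
      Y = np.random.default_rng(1).standard_normal((12, 30))
      D, X = ksvd(Y)
      print(D.shape, X.shape)  # (12, 4) (4, 30)

    The resulting sparse codes X (here 4 coefficients per lesion, at most 2 non-zero) would then feed the logistic regression classifier.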

  10. Deriving stable multi-parametric MRI radiomic signatures in the presence of inter-scanner variations: survival prediction of glioblastoma via imaging pattern analysis and machine learning techniques

    Science.gov (United States)

    Rathore, Saima; Bakas, Spyridon; Akbari, Hamed; Shukla, Gaurav; Rozycki, Martin; Davatzikos, Christos

    2018-02-01

    There is mounting evidence that assessment of multi-parametric magnetic resonance imaging (mpMRI) profiles can noninvasively predict survival in many cancers, including glioblastoma. The clinical adoption of mpMRI as a prognostic biomarker, however, depends on its applicability in a multicenter setting, which is hampered by inter-scanner variations; this issue has not been addressed in existing studies. We developed a comprehensive set of within-patient normalized tumor features, such as intensity profile, shape, volume, and tumor location, extracted from multicenter mpMRI of two large cohorts (353 patients in total), comprising the Hospital of the University of Pennsylvania (HUP: 252 patients, 3 scanners) and The Cancer Imaging Archive (TCIA: 101 patients, 8 scanners). Inter-scanner harmonization was conducted by normalizing the tumor intensity profile with that of the contralateral healthy tissue. The extracted features were integrated by support vector machines to derive survival predictors. The predictors' generalizability was evaluated within each cohort by two cross-validation configurations: i) pooled/scanner-agnostic, and ii) across scanners (training on multiple scanners and testing on one). The median survival in each configuration was used as a cut-off to divide patients into long- and short-survivors. The accuracy (ACC) for predicting long- versus short-survivors in these configurations was ACC(pooled) = 79.06% and 84.7%, and ACC(across) = 73.55% and 74.76%, in the HUP and TCIA datasets, respectively. The hazard ratio at the 95% confidence interval was 3.87 (2.87-5.20, P<0.001) and 6.65 (3.57-12.36, P<0.001) for the HUP and TCIA datasets, respectively. Our findings suggest that adequate data normalization coupled with machine learning classification allows robust prediction of survival estimates on mpMRI acquired by multiple scanners.
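
    The two key steps, within-patient intensity harmonization against contralateral healthy tissue, followed by SVM classification of long versus short survival, can be sketched as follows in Python; the normalization statistic (median/IQR), the arrays, and the file names are assumptions for illustration:

      import numpy as np
      from sklearn.svm import SVC
      from sklearn.model_selection import cross_val_score

      def harmonize(tumor_intensities, healthy_intensities):
          """Normalize a patient's tumor intensity profile by the location and
          scale of the contralateral healthy tissue on the same scanner."""
          mu = np.median(healthy_intensities)
          iqr = (np.percentile(healthy_intensities, 75)
                 - np.percentile(healthy_intensities, 25))
          return (tumor_intensities - mu) / (iqr if iqr > 0 else 1.0)

      # One row per patient, features assumed already harmonized via harmonize()
      features = np.load("harmonized_features.npy")
      survival_days = np.load("survival_days.npy")
      labels = (survival_days > np.median(survival_days)).astype(int)  # long vs. short

      svm = SVC(kernel="linear")
      acc = cross_val_score(svm, features, labels, cv=5).mean()  # pooled configuration
      print(f"pooled cross-validated accuracy: {acc:.1%}")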

  11. Fracture network evaluation program (FraNEP): A software for analyzing 2D fracture trace-line maps

    Science.gov (United States)

    Zeeb, Conny; Gomez-Rivas, Enrique; Bons, Paul D.; Virgo, Simon; Blum, Philipp

    2013-10-01

    Fractures, such as joints, faults and veins, strongly influence the transport of fluids through rocks by either enhancing or inhibiting flow. Techniques used for the automatic detection of lineaments from satellite images and aerial photographs, LIDAR technologies and borehole televiewers significantly enhanced data acquisition. The analysis of such data is often performed manually or with different analysis software. Here we present a novel program for the analysis of 2D fracture networks called FraNEP (Fracture Network Evaluation Program). The program was developed using Visual Basic for Applications in Microsoft Excel™ and combines features from different existing software and characterization techniques. The main novelty of FraNEP is the possibility to analyse trace-line maps of fracture networks applying the (1) scanline sampling, (2) window sampling or (3) circular scanline and window method, without the need to switch programs. Additionally, binning problems are avoided by using cumulative distributions, rather than probability density functions. FraNEP is a time-efficient tool for the characterisation of fracture network parameters, such as density, intensity and mean length. Furthermore, fracture strikes can be visualized using rose diagrams and a fitting routine evaluates the distribution of fracture lengths. As an example of its application, we use FraNEP to analyse a case study of lineament data from a satellite image of the Oman Mountains.
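
    The scanline sampling method named above reduces to counting trace intersections per unit scanline length. A minimal Python sketch with shapely (toy trace coordinates, not data from the paper) is:

      from shapely.geometry import LineString

      # Hypothetical fracture trace lines from a 2D trace-line map
      traces = [
          LineString([(0, 0), (4, 6)]),
          LineString([(1, 5), (6, 1)]),
          LineString([(2, 0), (2, 7)]),
      ]
      scanline = LineString([(0, 3), (7, 3)])

      n_hits = sum(1 for t in traces if t.intersects(scanline))
      intensity = n_hits / scanline.length      # fractures per unit length (P10)
      mean_trace_length = sum(t.length for t in traces) / len(traces)
      print(f"P10 intensity = {intensity:.2f} per m, "
            f"mean trace length = {mean_trace_length:.2f} m")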

  12. Mining the Geophysical Research Abstracts Corpus: Mapping the impact of Free and Open Source Software on the EGU Divisions

    Science.gov (United States)

    Löwe, Peter; Klump, Jens; Robertson, Jesse

    2015-04-01

    Text mining is commonly employed as a tool in data science to investigate and chart emergent information from corpora of research abstracts, such as the Geophysical Research Abstracts (GRA) published by Copernicus. In this context, current standards, such as persistent identifiers like DOI and ORCID, allow us to trace, cite and map links between journal publications, the underlying research data and scientific software. This network can be expressed as a directed graph which enables us to chart networks of cooperation and innovation, thematic foci and the locations of research communities in time and space. However, this approach to data science, focusing on the research process in a self-referential manner rather than on the topical work, is still at a developing stage. Scientific work presented at the EGU General Assembly is often the first step towards new approaches and innovative ideas for the geospatial community. It represents a rich, deep and heterogeneous source of geoscientific thought. This corpus is a significant data source for data science, which has not been analysed on this scale previously. In this work, the corpus of the Geophysical Research Abstracts is used for the first time as a database for topical text-mining analyses. For this, we used a sturdy and customizable software framework, based on the work of Schmitt et al. [1]. For the analysis we used the High Performance Computing infrastructure of the German Research Centre for Geosciences GFZ in Potsdam, Germany. Here, we report on the first results from the analysis of the continuing spread of the use of Free and Open Source Software (FOSS) tools within the EGU communities, mapping the general increase of FOSS-themed GRA articles in the last decade and the developing spatial patterns of involved parties and FOSS topics. References: [1] Schmitt, L. M., Christianson, K. T., Gupta, R.: Linguistic Computing with UNIX Tools, in Kao, A., Poteet, S. R. (Eds.): Natural Language Processing and Text

  13. EpiTools, A software suite for presurgical brain mapping in epilepsy: Intracerebral EEG.

    Science.gov (United States)

    Medina Villalon, S; Paz, R; Roehri, N; Lagarde, S; Pizzo, F; Colombet, B; Bartolomei, F; Carron, R; Bénar, C-G

    2018-03-29

    In pharmacoresistant epilepsy, exploration with depth electrodes can be needed to precisely define the epileptogenic zone. Accurate localization of these electrodes is thus essential for the interpretation of stereotaxic EEG (SEEG) signals. As SEEG analysis increasingly relies on signal processing, it is crucial to link these results to the patient's anatomy. Our aims were thus to develop a suite of software tools, called "EpiTools", able to i) precisely and automatically localize the position of each SEEG contact and ii) display the results of signal analysis in each patient's anatomy. The first tool, GARDEL (GUI for Automatic Registration and Depth Electrode Localization), is able to automatically localize SEEG contacts and to label each contact according to a pre-specified nomenclature (for instance that of FreeSurfer or MarsAtlas). The second tool, 3Dviewer, enables visualization, within the 3D anatomy of the patient, of the origin of signal processing results such as rates of biomarkers, connectivity graphs or the Epileptogenicity Index. GARDEL was validated in 30 patients by clinicians and proved to be highly reliable in determining the actual location of contacts within the patient's individual anatomy. GARDEL is a fully automatic electrode localization tool needing limited user interaction (only for electrode naming or contact correction). The 3Dviewer is able to read signal processing results and to display them in relation to the patient's anatomy. EpiTools can help speed up the interpretation of SEEG data and improve its precision. Copyright © 2018 Elsevier B.V. All rights reserved.

  14. Global Ionosphere Mapping and Differential Code Bias Estimation during Low and High Solar Activity Periods with GIMAS Software

    Directory of Open Access Journals (Sweden)

    Qiang Zhang

    2018-05-01

    Full Text Available Ionosphere research using Global Navigation Satellite Systems (GNSS) techniques is a hot topic, owing to their unprecedentedly high temporal and spatial sampling rate. We introduce a new GNSS Ionosphere Monitoring and Analysis Software (GIMAS) in order to model global ionosphere vertical total electron content (VTEC) maps and to estimate the GPS and GLObalnaya NAvigatsionnaya Sputnikovaya Sistema (GLONASS) satellite and receiver differential code biases (DCBs). The GIMAS-based Global Ionosphere Map (GIM) products during low (day of year 202 to 231, in 2008) and high (day of year 050 to 079, in 2014) solar activity periods were investigated and assessed. The results showed that the biases of the GIMAS-based VTEC maps relative to the International GNSS Service (IGS) Ionosphere Associate Analysis Centers (IAACs) VTEC maps ranged from −3.0 to 1.0 TECU (TEC unit; 1 TECU = 1 × 10^16 electrons/m^2). The standard deviations (STDs) ranged from 0.7 to 1.9 TECU in 2008, and from 2.0 to 8.0 TECU in 2014. The STDs at low latitudes were significantly larger than those at middle and high latitudes, as a result of the ionospheric latitudinal gradients. When compared with the Jason-2 VTEC measurements, the GIMAS-based VTEC maps showed a negative systematic bias of about −1.8 TECU in 2008, and a positive systematic bias of about +2.2 TECU in 2014. The STDs were about 2.0 TECU in 2008, and ranged from 2.2 to 8.5 TECU in 2014. Furthermore, the aforementioned characteristics were strongly related to the conditions of ionosphere variation and to geographic latitude. The GPS and GLONASS satellite and receiver P1-P2 DCBs were compared with the IAACs DCBs. The root mean squares (RMSs) were 0.16–0.20 ns in 2008 and 0.13–0.25 ns in 2014 for the GPS satellites, and 0.26–0.31 ns in 2014 for the GLONASS satellites. The RMSs of receiver DCBs were 0.21–0.42 ns in 2008 and 0.33–1.47 ns in 2014 for GPS, and 0.67–0.96 ns in 2014 for GLONASS. The monthly

  15. A software tool for advanced MRgFUS prostate therapy planning and follow up

    Science.gov (United States)

    van Straaten, Dörte; Hoogenboom, Martijn; van Amerongen, Martinus J.; Weiler, Florian; Issawi, Jumana Al; Günther, Matthias; Fütterer, Jurgen; Jenne, Jürgen W.

    2017-03-01

    US-guided HIFU/FUS ablation for the therapy of prostate cancer is a clinically established method, while MR-guided HIFU/FUS applications for the prostate have recently entered clinical evaluation. Even though MRI examination is an excellent diagnostic tool for prostate cancer, it is a time-consuming procedure and not practicable within an MRgFUS therapy session. The aim of our ongoing work is to develop software to support therapy planning and post-therapy follow-up for MRgFUS on localized prostate cancer, based on multi-parametric MR protocols. The clinical workflow of diagnosis, therapy and follow-up of MR-guided FUS on prostate cancer was analyzed in depth. Based on this, the image processing workflow was designed and all necessary components, e.g. GUI, viewer, registration tools etc., were defined and implemented. The software is based on MeVisLab, with several C++ modules implemented for the image processing tasks. The developed software, called LTC (Local Therapy Control), will automatically register and visualize all images (T1w, T2w, DWI etc.) and ADC or perfusion maps gained from the diagnostic MRI session. This maximum of diagnostic information helps to segment all necessary ROIs, e.g. the tumor, for therapy planning. Final therapy planning will be performed based on these segmentation data in the following MRgFUS therapy session. In addition, the developed software should help to evaluate the therapy success, by synchronization and display of pre-therapeutic, therapy and follow-up image data, including the therapy plan and thermal dose information. In this ongoing project, the first stand-alone prototype was completed and will be clinically evaluated.
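
    A minimal sketch of the kind of registration step such planning software performs, aligning an ADC map to the T2w planning reference, is shown below using SimpleITK; the file names are hypothetical, and the LTC tool itself is built on MeVisLab, not on this code:

      import SimpleITK as sitk

      fixed = sitk.ReadImage("t2w.nii.gz", sitk.sitkFloat32)    # planning reference
      moving = sitk.ReadImage("adc.nii.gz", sitk.sitkFloat32)   # map to align

      reg = sitk.ImageRegistrationMethod()
      reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
      reg.SetOptimizerAsRegularStepGradientDescent(
          learningRate=1.0, minStep=1e-4, numberOfIterations=200)
      reg.SetInterpolator(sitk.sitkLinear)
      reg.SetInitialTransform(sitk.CenteredTransformInitializer(
          fixed, moving, sitk.Euler3DTransform(),
          sitk.CenteredTransformInitializerFilter.GEOMETRY))

      transform = reg.Execute(fixed, moving)
      aligned = sitk.Resample(moving, fixed, transform, sitk.sitkLinear, 0.0)
      sitk.WriteImage(aligned, "adc_on_t2w.nii.gz")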

  16. Assessment of radiation damage - the need for a multi-parametric and integrative approach with the help of both clinical and biological dosimetry

    International Nuclear Information System (INIS)

    Meineke, Viktor

    2008-01-01

    To provide the best basis for triage and for the planning and provision of medical treatment after accidental radiation exposure, different and independent diagnostic procedures integrating all clinical aspects as well as different biological indicators always have to be applied. Up to now, this multi-parametric approach has been missing in medical radiation accident management. A new integrative concept is shown and discussed. (author)

  17. Desktop mapping using GPS. SAHTI - a software package for environmental monitoring. Report on task JNTB898 on the Finnish support programme to IAEA safeguards

    Energy Technology Data Exchange (ETDEWEB)

    Ilander, T; Kansanaho, A; Toivonen, H

    1996-02-01

    Environmental sampling is the key method of the IAEA in searching for signatures of a covert nuclear programme. However, it is not always easy to know the exact location of the sampling site. The satellite navigation system, utilizing a small receiver (GPS) and a PC, makes it easy to obtain independent positioning data. The present task on the Finnish Support Programme was launched to create software to merge information about sampling and positioning. The system is built on top of a desktop mapping software package. However, the result of the development goes beyond the initial goal: the software can be used for real-time positioning in a mobile unit, utilizing maps that can be purchased or produced by the user. In addition, the system can easily be extended to visualize data in real time from mobile environmental monitors, such as a Geiger counter, a pressurized ionisation chamber or a gamma-ray spectrometer. (orig.) (7 figs.).

  18. Desktop mapping using GPS. SAHTI - a software package for environmental monitoring. Report on task JNTB898 on the Finnish support programme to IAEA safeguards

    International Nuclear Information System (INIS)

    Ilander, T.; Kansanaho, A.; Toivonen, H.

    1996-02-01

    Environmental sampling is the key method of the IAEA in searching for signatures of a covert nuclear programme. However, it is not always easy to know the exact location of the sampling site. The satellite navigation system, utilizing a small receiver (GPS) and a PC, makes it easy to obtain independent positioning data. The present task on the Finnish Support Programme was launched to create software to merge information about sampling and positioning. The system is built on top of a desktop mapping software package. However, the result of the development goes beyond the initial goal: the software can be used for real-time positioning in a mobile unit, utilizing maps that can be purchased or produced by the user. In addition, the system can easily be extended to visualize data in real time from mobile environmental monitors, such as a Geiger counter, a pressurized ionisation chamber or a gamma-ray spectrometer. (orig.) (7 figs.)
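
    The core merge the report describes, tagging each environmental sample with a GPS position, can be sketched today with the pynmea2 and pyserial Python libraries; the serial port name and sample identifier below are hypothetical:

      import pynmea2
      import serial

      gps = serial.Serial("/dev/ttyUSB0", 4800, timeout=1)

      def current_fix():
          """Block until a GGA sentence with a valid fix arrives."""
          while True:
              line = gps.readline().decode("ascii", errors="replace").strip()
              if line.startswith("$GPGGA"):
                  msg = pynmea2.parse(line)
                  if int(msg.gps_qual) > 0:      # 0 = no fix
                      return msg.latitude, msg.longitude

      sample_id = "SWIPE-0042"                   # hypothetical sample label
      lat, lon = current_fix()
      print(f"{sample_id}: {lat:.5f}, {lon:.5f}")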

  19. The software and hardware architectural design of the vessel thermal map real-time system in JET

    International Nuclear Information System (INIS)

    Alves, D.; Neto, A.; Valcarcel, D.F.; Jachmich, S.; Arnoux, G.; Card, P.; Devaux, S.; Felton, R.; Goodyear, A.; Kinna, D.; Lomas, P.; McCullen, P.; Stephen, A.; Zastrow, K.D.

    2012-01-01

    The installation of ITER-relevant materials for the Plasma Facing Components (PFCs) in the Joint European Torus (JET) is expected to have a strong impact on the operation and protection of the experiment. In particular, the use of all-beryllium tiles, which deteriorate at a substantially lower temperature than the formerly installed Carbon Fibre Composite (CFC) tiles, imposes strict thermal restrictions on the PFCs during operation. Prompt and precise responses are therefore required whenever anomalous temperatures are detected. The new Vessel Thermal Map (VTM) real-time application collects the temperature measurements provided by dedicated pyrometers and Infra-Red (IR) cameras, groups them according to spatial location and probable offending heat source, and raises alarms that will trigger appropriate protective responses. In the context of JET's global scheme for the protection of the new wall, the system is required to run on a 10 millisecond cycle, communicating with other systems through the Real-Time Data Network (RTDN). In order to meet these requirements, a Commercial Off-The-Shelf (COTS) solution has been adopted based on standard x86 multi-core technology, Linux and the Multi-threaded Application Real-Time executor (MARTe) software framework. This paper presents an overview of the system, with particular technical focus on the configuration of its real-time capability and the benefits of the modular development approach and the advanced tools provided by the MARTe framework. (authors)
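
    The VTM's periodic duty, collect temperatures, group them by wall region, raise alarms on limit violations, can be caricatured in a few lines of Python. Sensor names, limits, and the read function below are invented for illustration; the real system runs inside the MARTe framework on a hard 10 ms cycle:

      import time

      LIMITS_C = {"divertor": 1200.0, "inner-wall": 800.0, "limiter": 900.0}

      def read_sensors():
          """Stand-in for pyrometer / IR-camera acquisition (hypothetical values)."""
          return {"divertor": [640.0, 1251.0], "inner-wall": [420.0], "limiter": [512.0]}

      CYCLE_S = 0.010                    # 10 ms protection cycle
      for _ in range(3):                 # a few cycles for demonstration
          t0 = time.monotonic()
          for region, temps in read_sensors().items():
              hottest = max(temps)
              if hottest > LIMITS_C[region]:
                  print(f"ALARM {region}: {hottest:.0f} C > {LIMITS_C[region]:.0f} C")
          # Sleep out the remainder of the cycle (a soft-real-time approximation)
          time.sleep(max(0.0, CYCLE_S - (time.monotonic() - t0)))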

  20. CGI: Java software for mapping and visualizing data from array-based comparative genomic hybridization and expression profiling.

    Science.gov (United States)

    Gu, Joyce Xiuweu-Xu; Wei, Michael Yang; Rao, Pulivarthi H; Lau, Ching C; Behl, Sanjiv; Man, Tsz-Kwong

    2007-10-06

    With the increasing application of various genomic technologies in biomedical research, there is a need to integrate these data to correlate candidate genes/regions that are identified by different genomic platforms. Although there are tools that can analyze data from individual platforms, essential software for integration of genomic data is still lacking. Here, we present a novel Java-based program called CGI (Cytogenetics-Genomics Integrator) that matches the BAC clones from array-based comparative genomic hybridization (aCGH) to genes from RNA expression profiling datasets. The matching is computed via a fast, backend MySQL database containing UCSC Genome Browser annotations. This program also provides an easy-to-use graphical user interface for visualizing and summarizing the correlation of DNA copy number changes and RNA expression patterns from a set of experiments. In addition, CGI uses a Java applet to display the copy number values of a specific BAC clone in aCGH experiments side by side with the expression levels of genes that are mapped back to that BAC clone from the microarray experiments. The CGI program is built on top of extensible, reusable graphic components specifically designed for biologists. It is cross-platform compatible and the source code is freely available under the General Public License.
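
    The matching CGI automates, mapping aCGH BAC clones to expression-array genes by genomic-interval overlap, reduces to a simple interval test. The Python sketch below uses toy coordinates; the real tool resolves positions via a MySQL mirror of UCSC Genome Browser annotations:

      # Toy clone and gene coordinates: (chromosome, start, end)
      bacs = {"RP11-62M23": ("chr8", 128_650_000, 128_850_000)}
      genes = {"MYC": ("chr8", 128_747_000, 128_754_000),
               "GATA3": ("chr10", 8_045_000, 8_076_000)}

      def overlaps(a, b):
          chrom_a, start_a, end_a = a
          chrom_b, start_b, end_b = b
          return chrom_a == chrom_b and start_a < end_b and start_b < end_a

      for bac, bac_iv in bacs.items():
          hits = [g for g, g_iv in genes.items() if overlaps(bac_iv, g_iv)]
          print(f"{bac} -> {hits}")  # genes whose expression is shown beside the clone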

  1. CGI: Java Software for Mapping and Visualizing Data from Array-based Comparative Genomic Hybridization and Expression Profiling

    Directory of Open Access Journals (Sweden)

    Joyce Xiuweu-Xu Gu

    2007-01-01

    Full Text Available With the increasing application of various genomic technologies in biomedical research, there is a need to integrate these data to correlate candidate genes/regions that are identified by different genomic platforms. Although there are tools that can analyze data from individual platforms, essential software for integration of genomic data is still lacking. Here, we present a novel Java-based program called CGI (Cytogenetics-Genomics Integrator) that matches the BAC clones from array-based comparative genomic hybridization (aCGH) to genes from RNA expression profiling datasets. The matching is computed via a fast, backend MySQL database containing UCSC Genome Browser annotations. This program also provides an easy-to-use graphical user interface for visualizing and summarizing the correlation of DNA copy number changes and RNA expression patterns from a set of experiments. In addition, CGI uses a Java applet to display the copy number values of a specific BAC clone in aCGH experiments side by side with the expression levels of genes that are mapped back to that BAC clone from the microarray experiments. The CGI program is built on top of extensible, reusable graphic components specifically designed for biologists. It is cross-platform compatible and the source code is freely available under the General Public License.

  2. VESsel GENeration Analysis (VESGEN): Innovative Vascular Mappings for Astronaut Exploration Health Risks and Human Terrestrial Medicine

    Science.gov (United States)

    Parsons-Wingerter, Patricia; Kao, David; Valizadegan, Hamed; Martin, Rodney; Murray, Matthew C.; Ramesh, Sneha; Sekaran, Srinivaas

    2017-01-01

    Astronauts face significant health risks in future long-duration exploration missions, such as colonizing the Moon and traveling to Mars. Numerous risks include greatly increased radiation exposures beyond the low Earth orbit (LEO) of the ISS, and visual and ocular impairments in response to microgravity environments. The cardiovascular system is a key mediator in human physiological responses to radiation and microgravity. Moreover, blood vessels are necessarily involved in the progression and treatment of vascular-dependent terrestrial diseases such as cancer, coronary vessel disease, wound healing, reproductive disorders, and diabetes. NASA developed an innovative, globally requested beta-level software, VESsel GENeration Analysis (VESGEN), to map and quantify vascular remodeling for application to astronaut and terrestrial health challenges. VESGEN mappings of branching vascular trees and networks are based on a weighted multi-parametric analysis derived from vascular physiological branching rules. Complex vascular branching patterns are determined by biological signaling mechanisms together with the fluid mechanics of multi-phase laminar blood flow.

  3. The "neuro-mapping locator" software. A real-time intraoperative objective paraesthesia mapping tool to evaluate paraesthesia coverage of the painful zone in patients undergoing spinal cord stimulation lead implantation.

    Science.gov (United States)

    Guetarni, F; Rigoard, P

    2015-03-01

    Conventional spinal cord stimulation (SCS) generates paraesthesia, as the efficacy of this technique is based on the relationship between the paraesthesia provided by SCS over the painful zone and an analgesic effect on the stimulated zone. Although this basic postulate is based on clinical evidence, it is clear that this relationship has never been formally demonstrated by scientific studies. There is a need for objective evaluation tools ("transducers") to transpose electrical signals into clinical effects and to guide therapeutic choices. We have developed software at Poitiers University Hospital allowing real-time objective mapping of the paraesthesia generated by SCS lead placement and programming during the implantation procedure itself, on a touch-screen interface. The purpose of this article is to describe this intraoperative mapping software, in terms of its concept and technical aspects. The Neuro-Mapping Locator (NML) software is dedicated to patients with failed back surgery syndrome who are candidates for SCS lead implantation, allowing them to actively participate in the implantation procedure. Real-time geographical localization of the paraesthesia generated by percutaneous or multicolumn surgical SCS leads implanted under awake anaesthesia allows intraoperative lead programming, and possibly lead positioning, to be modified with the patient's cooperation. Software updates should enable us to refine objectives related to the use of this tool and minimize observational biases. The ultimate goals of the NML software should not be limited to optimizing one specific device implantation in a patient, but should also allow instantaneous comparison of various stimulation strategies, by characterizing new technical parameters such as "coverage efficacy" and "device specificity" in selected subgroups of patients. Another longer-term objective would be to organize these predictive factors into computer science ontologies, which could constitute robust and helpful data for device selection and programming.

  4. Why and Where do We Miss Significant Prostate Cancer with Multi-parametric Magnetic Resonance Imaging followed by Magnetic Resonance-guided and Transrectal Ultrasound-guided Biopsy in Biopsy-naïve Men?

    Science.gov (United States)

    Schouten, Martijn G; van der Leest, Marloes; Pokorny, Morgan; Hoogenboom, Martijn; Barentsz, Jelle O; Thompson, Les C; Fütterer, Jurgen J

    2017-06-01

    Knowledge of the locations of significant prostate cancer (sPCa) lesions missed with magnetic resonance (MR)- and transrectal ultrasound (TRUS)-guided biopsy (Bx) may help to improve these techniques. Our aim was to identify the locations of sPCa lesions missed with MR- and TRUS-Bx. In a referral center, 223 consecutive Bx-naive men with an elevated prostate-specific antigen level and/or an abnormal digital rectal examination were included. Histopathologically proven cancer locations, Gleason score, and tumor length were determined. All patients underwent multi-parametric MRI and 12-core systematic TRUS-Bx. MR-Bx was performed in all patients with suspicion of PCa on multi-parametric MRI (n=142). Cancer locations were compared between MR- and TRUS-Bx. Proportions were expressed as percentages, and the corresponding 95% confidence intervals were calculated. In total, 191 lesions were found in 108 patients with sPCa. Of these lesions, 74% (141/191) were defined as sPCa on either MR- or TRUS-Bx. MR-Bx detected 74% (105/141) of these lesions, and TRUS-Bx detected 61% (86/141). TRUS-Bx detected more lesions compared with MR-Bx (140 vs 109); however, these lesions were often low risk (39%). Significant lesions missed with MR-Bx most often had involvement of the dorsolateral (58%) and apical (37%) segments, while segments missed with TRUS-Bx were located anteriorly (79%), in the anterior midprostate (50%), and in the anterior apex (23%). Both techniques have difficulties in detecting apical lesions. MR-Bx most often missed cancer with involvement of the dorsolateral part (58%) and TRUS-Bx with involvement of the anterior part (79%). Both biopsy techniques miss cancer in specific locations within the prostate. Identification of these lesions may help to improve these techniques. Copyright © 2016 European Association of Urology. Published by Elsevier B.V. All rights reserved.

  5. Negotiation and Decision Making with Collaborative Software: How MarineMap 'Changed the Game' in California's Marine Life Protection Act Initiative.

    Science.gov (United States)

    Cravens, Amanda E

    2016-02-01

    Environmental managers and planners have become increasingly enthusiastic about the potential of decision support tools (DSTs) to improve environmental decision-making processes as information technology transforms many aspects of daily life. Discussions about DSTs, however, rarely recognize the range of ways software can influence users' negotiation, problem-solving, or decision-making strategies and incentives, in part because there are few empirical studies of completed processes that used technology. This mixed-methods study, which draws on data from approximately 60 semi-structured interviews and an online survey, examines how one geospatial DST influenced participants' experiences during a multi-year marine planning process in California. Results suggest that DSTs can facilitate communication by creating a common language, help users understand the geography and scientific criteria in play during the process, aid stakeholders in identifying shared or diverging interests, and facilitate joint problem solving. The same design features that enabled the tool to aid in decision making, however, also presented surprising challenges in certain circumstances by, for example, making it difficult for participants to discuss information that was not spatially represented on the map-based interface. The study also highlights the importance of the social context in which software is developed and implemented, suggesting that the relationship between the software development team and other participants may be as important as technical software design in shaping how DSTs add value. The paper concludes with considerations to inform the future use of DSTs in environmental decision-making processes.

  6. Negotiation and Decision Making with Collaborative Software: How MarineMap 'Changed the Game' in California's Marine Life Protection Act Initiative

    Science.gov (United States)

    Cravens, Amanda E.

    2016-02-01

    Environmental managers and planners have become increasingly enthusiastic about the potential of decision support tools (DSTs) to improve environmental decision-making processes as information technology transforms many aspects of daily life. Discussions about DSTs, however, rarely recognize the range of ways software can influence users' negotiation, problem-solving, or decision-making strategies and incentives, in part because there are few empirical studies of completed processes that used technology. This mixed-methods study—which draws on data from approximately 60 semi-structured interviews and an online survey—examines how one geospatial DST influenced participants' experiences during a multi-year marine planning process in California. Results suggest that DSTs can facilitate communication by creating a common language, help users understand the geography and scientific criteria in play during the process, aid stakeholders in identifying shared or diverging interests, and facilitate joint problem solving. The same design features that enabled the tool to aid in decision making, however, also presented surprising challenges in certain circumstances by, for example, making it difficult for participants to discuss information that was not spatially represented on the map-based interface. The study also highlights the importance of the social context in which software is developed and implemented, suggesting that the relationship between the software development team and other participants may be as important as technical software design in shaping how DSTs add value. The paper concludes with considerations to inform the future use of DSTs in environmental decision-making processes.

  7. Visually directed vs. software-based targeted biopsy compared to transperineal template mapping biopsy in the detection of clinically significant prostate cancer.

    Science.gov (United States)

    Valerio, Massimo; McCartan, Neil; Freeman, Alex; Punwani, Shonit; Emberton, Mark; Ahmed, Hashim U

    2015-10-01

    Targeted biopsy based on cognitive or software magnetic resonance imaging (MRI) to transrectal ultrasound registration seems to increase the detection rate of clinically significant prostate cancer as compared with standard biopsy. However, these strategies have not yet been directly compared against an accurate reference test. The aim of this study was to obtain pilot data on the diagnostic ability of visually directed targeted biopsy vs. software-based targeted biopsy, considering transperineal template mapping (TPM) biopsy as the reference test. This prospective paired cohort study included 50 consecutive men undergoing TPM with one or more visible targets detected on preoperative multiparametric MRI. Targets were contoured on the Biojet software. Patients initially underwent software-based targeted biopsies, then visually directed targeted biopsies, and finally systematic TPM. The detection rate of clinically significant disease (Gleason score ≥3+4 and/or maximum cancer core length ≥4 mm) of one strategy against another was compared using 3×3 contingency tables. Secondary analyses were performed using a less stringent threshold of significance (Gleason score ≥4+3 and/or maximum cancer core length ≥6 mm). Median age was 68 years (interquartile range: 63-73); the median prostate-specific antigen level was 7.9 ng/mL (6.4-10.2). A total of 79 targets were detected, with a mean of 1.6 targets per patient. Of these, 27 (34%), 28 (35%), and 24 (31%) were scored 3, 4, and 5, respectively. At a patient level, the detection rate was 32 (64%), 34 (68%), and 38 (76%) for visually directed targeted biopsy, software-based biopsy, and TPM, respectively. Combining the 2 targeted strategies would have led to a detection rate of 39 (78%). At a patient level and at a target level, software-based targeted biopsy found more clinically significant disease than did visually directed targeted biopsy, although this was not statistically significant (22% vs. 14%, P = 0.48; 51.9% vs. 44.3%, P = 0.24). Secondary

  8. National Insect and Disease Risk Map (NIDRM)--cutting edge software for rapid insect and disease risk model development

    Science.gov (United States)

    Frank J. Krist

    2010-01-01

    The Forest Health Technology Enterprise Team (FHTET) of the U.S. Forest Service is leading an effort to produce the next version of the National Insect and Disease Risk Map (NIDRM) for targeted release in 2011. The goal of this effort is to update spatial depictions of risk of tree mortality based on: (1) newly derived 240-m geospatial information depicting the...

  9. TU-PIS-Exhibit Hall-2: Deformable Image Registration, Contour Propagation, and Dose Mapping in MIM Maestro - MIM Software

    International Nuclear Information System (INIS)

    Piper, J.

    2015-01-01

    Brachytherapy devices and software are designed to last for a certain period of time. Due to a number of considerations, such as material factors, wear-and-tear, backwards compatibility, and others, they all reach a date when they are no longer supported by the manufacturer. Most of these products have a limited duration for their use, and the information is provided to the user at time of purchase. Because of issues or concerns determined by the manufacturer, certain products are retired sooner than the anticipated date, and the user is immediately notified. In these situations, the institution faces some difficult choices: remove these products from the clinic, or perform tests and continue their usage. Both of these choices come with a financial burden: replacing the product or assuming a potential medicolegal liability. This session will provide attendees with the knowledge and tools to make better decisions when facing these issues. Learning Objectives: Understand the meaning of “end-of-life” or “life expectancy” for brachytherapy devices and software; review items (devices and software) affected by “end-of-life” restrictions; learn how to effectively formulate “end-of-life” policies at your institution; learn about possible implications of an “end-of-life” policy; review other possible approaches to the “end-of-life” issue.

  11. Accurately Diagnosing Uric Acid Stones from Conventional Computerized Tomography Imaging: Development and Preliminary Assessment of a Pixel Mapping Software.

    Science.gov (United States)

    Ganesan, Vishnu; De, Shubha; Shkumat, Nicholas; Marchini, Giovanni; Monga, Manoj

    2018-02-01

    Preoperative determination of uric acid stones from computerized tomography imaging would be of tremendous clinical use. We sought to design a software algorithm that could apply data from noncontrast computerized tomography to predict the presence of uric acid stones. Patients with pure uric acid and calcium oxalate stones were identified from our stone registry. Only stones greater than 4 mm which were clearly traceable from initial computerized tomography to final composition were included in the analysis. A semiautomated computer algorithm was used to process image data. Average and maximum HU, eccentricity (deviation from a circle) and kurtosis (peakedness vs flatness) were automatically generated. These parameters were examined in several mathematical models to predict the presence of uric acid stones. A total of 100 patients, of whom 52 had calcium oxalate and 48 had uric acid stones, were included in the final analysis. Uric acid stones were significantly larger (12.2 vs 9.0 mm, p = 0.03), but calcium oxalate stones had higher mean attenuation (457 vs 315 HU, p = 0.001) and maximum attenuation (918 vs 553 HU) than uric acid stones. A combination of stone size, attenuation intensity and attenuation pattern from conventional computerized tomography can distinguish uric acid stones from calcium oxalate stones with high sensitivity and specificity. Copyright © 2018 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
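
    A minimal sketch of the kind of model the record describes, combining stone size, attenuation, and attenuation-pattern features. The data below are synthetic, generated around the group means quoted in the abstract, and the choice of logistic regression is an assumption; the abstract does not name the exact model:

```python
# Sketch of a stone-type classifier from CT-derived features. Synthetic data;
# the model choice (logistic regression) is an assumption, not the paper's.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 100

# Features per stone: size (mm), mean HU, max HU, eccentricity, kurtosis.
# Uric acid stones (label 1): larger, lower attenuation, per the abstract.
uric = np.column_stack([
    rng.normal(12.2, 3, n), rng.normal(315, 60, n), rng.normal(553, 120, n),
    rng.normal(0.5, 0.1, n), rng.normal(2.5, 0.5, n),
])
caox = np.column_stack([
    rng.normal(9.0, 3, n), rng.normal(457, 60, n), rng.normal(918, 120, n),
    rng.normal(0.5, 0.1, n), rng.normal(3.0, 0.5, n),
])
X = np.vstack([uric, caox])
y = np.array([1] * n + [0] * n)  # 1 = uric acid, 0 = calcium oxalate

model = LogisticRegression(max_iter=1000)
print("cross-validated accuracy:", cross_val_score(model, X, y, cv=5).mean())
```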

  12. An approach to the analysis of SDSS spectroscopic outliers based on self-organizing maps. Designing the outlier analysis software package for the next Gaia survey

    Science.gov (United States)

    Fustes, D.; Manteiga, M.; Dafonte, C.; Arcay, B.; Ulla, A.; Smith, K.; Borrachero, R.; Sordo, R.

    2013-11-01

    Aims: A new method applied to the segmentation and further analysis of the outliers resulting from the classification of astronomical objects in large databases is discussed. The method is being used in the framework of the Gaia satellite Data Processing and Analysis Consortium (DPAC) activities to prepare automated software tools that will be used to derive basic astrophysical information to be included in the final Gaia archive. Methods: Our algorithm has been tested by means of simulated Gaia spectrophotometry, which is based on SDSS observations and theoretical spectral libraries covering a wide sample of astronomical objects. Self-organizing map (SOM) networks are used to organize the information in clusters of objects, as homogeneously as possible according to their spectral energy distributions, and to project them onto a 2D grid where the data structure can be visualized. Results: We demonstrate the usefulness of the method by analyzing the spectra that were rejected by the SDSS spectroscopic classification pipeline and thus classified as "UNKNOWN". First, our method can help distinguish between astrophysical objects and instrumental artifacts. Second, the application of our algorithm to SDSS objects of unknown nature has allowed us to identify classes of objects with similar astrophysical natures, and it allows for the potential discovery of hundreds of new objects, such as white dwarfs and quasars. The proposed method is therefore very promising for data exploration and knowledge discovery in very large astronomical databases, such as the archive from the upcoming Gaia mission.
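
    A self-organizing map pulls high-dimensional spectra onto a 2D grid while preserving neighborhood structure, which is what enables the cluster visualization described above. A toy numpy sketch of the core training loop (not the DPAC implementation; the data are random stand-ins for spectra):

```python
# Toy self-organizing map: projects high-dimensional "spectra" onto a 2D grid.
import numpy as np

rng = np.random.default_rng(1)
data = rng.random((500, 30))           # 500 toy spectra, 30 flux bins each
grid_w, grid_h, dim = 8, 8, data.shape[1]
weights = rng.random((grid_w, grid_h, dim))

# Grid coordinates of each unit, used for the neighborhood function.
gx, gy = np.meshgrid(np.arange(grid_w), np.arange(grid_h), indexing="ij")

for t in range(2000):
    lr = 0.5 * (1 - t / 2000)           # decaying learning rate
    sigma = 3.0 * (1 - t / 2000) + 0.5  # decaying neighborhood radius
    x = data[rng.integers(len(data))]
    # Best-matching unit (BMU): the grid cell whose weights are closest to x.
    dists = np.linalg.norm(weights - x, axis=2)
    bx, by = np.unravel_index(np.argmin(dists), dists.shape)
    # Pull every unit toward x, weighted by its grid distance to the BMU.
    h = np.exp(-((gx - bx) ** 2 + (gy - by) ** 2) / (2 * sigma ** 2))
    weights += lr * h[:, :, None] * (x - weights)

# Each input can now be assigned to its best-matching cell for clustering.
bmus = [np.unravel_index(np.argmin(np.linalg.norm(weights - x, axis=2)),
                         (grid_w, grid_h)) for x in data]
print("first five BMU assignments:", bmus[:5])
```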

  13. Systematic Software Development

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Méndez Fernández, Daniel

    2015-01-01

    The speed of innovation and the global allocation of resources to accelerate development or to reduce cost put pressure on the software industry. In the global competition, especially so-called high-price countries have to present arguments why the higher development cost is justified and what makes these countries an attractive host for software companies. Often, high-quality engineering and excellent quality of products, e.g., machinery and equipment, are mentioned. Yet, the question is: can such arguments also be found for the software industry? We aim at investigating the degree of professionalism and systematization of software development to draw a map of strengths and weaknesses. To this end, we conducted as a first step an exploratory survey in Germany, presented in this paper. In this survey, we focused on the perceived importance of the two general software engineering process areas...

  14. Software engineering

    CERN Document Server

    Sommerville, Ian

    2010-01-01

    The ninth edition of Software Engineering presents a broad perspective of software engineering, focusing on the processes and techniques fundamental to the creation of reliable software systems. Increased coverage of agile methods and software reuse, along with coverage of 'traditional' plan-driven software engineering, gives readers the most up-to-date view of the field currently available. Practical case studies, a full set of easy-to-access supplements, and extensive web resources make teaching the course easier than ever.

  15. SOFTWARE OPEN SOURCE, SOFTWARE GRATIS?

    Directory of Open Access Journals (Sweden)

    Nur Aini Rakhmawati

    2006-01-01

    The enactment of the Intellectual Property Rights Law (HAKI) has given rise to a new alternative: using open source software. The use of open source software is spreading along with current global issues in Information and Communication Technology (ICT). Several organizations and companies have begun to take open source software into consideration. There are many conceptions of open source software, ranging from software that is free of charge to software that is unlicensed. These notions are not entirely accurate, so the concept of open source software needs to be introduced, covering its history, licenses and how to choose a license, and considerations in selecting among the available open source software. Keywords: license, open source, HAKI

  16. Software Epistemology

    Science.gov (United States)

    2016-03-01

    ...in-vitro decision to incubate a startup, Lexumo [7], which is developing a commercial Software as a Service (SaaS) vulnerability assessment... Acronyms from the report: LTS (Label Transition System), MUSE (Mining and Understanding Software Enclaves), RTEMS (Real-Time Executive for Multi-processor Systems), SaaS (Software as a Service), SSA (Static Single Assignment), SWE (Software Epistemology), UD/DU (Def-Use/Use-Def Chains, dataflow graph).

  17. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  18. Petroleum software profiles

    International Nuclear Information System (INIS)

    Anon.

    1996-01-01

    A profile of twenty-two software packages designed for petroleum exploration and production was provided. Some focussed on the oil and gas engineering industry, and others on mapping systems containing well history files and well data summaries. Still other programs provided accounting systems designed to address the complexities of the oil and gas industry. The software packages reviewed were developed by some of the best-known groups involved in software development for the oil and gas industry, including among others, Geoquest, the Can Tek Group, Applied Terravision Systems Inc., Neotechnology Consultants Ltd., OGCI Software Inc., Oracle Energy, Production Revenue Information Systems Management, Virtual Computing Services Ltd., and geoLogic Systems Ltd

  19. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  20. Staff Recall Travel Time for ST Elevation Myocardial Infarction Impacted by Traffic Congestion and Distance: A Digitally Integrated Map Software Study.

    Science.gov (United States)

    Cole, Justin; Beare, Richard; Phan, Thanh G; Srikanth, Velandai; MacIsaac, Andrew; Tan, Christianne; Tong, David; Yee, Susan; Ho, Jesslyn; Layland, Jamie

    2017-01-01

    Recent evidence suggests hospitals fail to meet guideline-specified time to percutaneous coronary intervention (PCI) for a proportion of ST elevation myocardial infarction (STEMI) presentations. Implicit in achieving this time is the rapid assembly of crucial catheter laboratory staff. As a proof-of-concept, we set out to create regional maps that graphically show the impact of traffic congestion and distance to destination on staff recall travel times for STEMI, thereby producing a resource that could be used by staff to improve reperfusion time for STEMI. Travel times for staff recalled to one inner and one outer metropolitan hospital at midnight, 6 p.m., and 7 a.m. were estimated using the Google Maps Application Programming Interface. Computer modeling predictions were overlaid on metropolitan maps showing color-coded staff recall travel times for STEMI occurring within non-peak and peak-hour traffic congestion times. Inner metropolitan hospital staff recall travel times were more affected by traffic congestion than outer metropolitan times, and the latter were more affected by distance. The estimated mean travel times to hospital during peak hour were greater than midnight travel times, by 13.4 min to the inner and 6.0 min to the outer metropolitan hospital at 6 p.m. Estimated travel time can predict the optimal residence of staff when on-call for PCI.
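
    The study used the Google Maps Application Programming Interface to estimate travel times. A minimal sketch of a traffic-aware travel-time query against the Google Maps Distance Matrix web service; the addresses and YOUR_API_KEY are placeholders, and the exact endpoint options the study used are not stated in the record:

```python
# Sketch of estimating traffic-dependent travel time with the Google Maps
# Distance Matrix web API. Placeholder inputs; not the study's actual code.
import time
import requests

URL = "https://maps.googleapis.com/maps/api/distancematrix/json"

def staff_travel_minutes(origin, hospital, departure_epoch, api_key):
    params = {
        "origins": origin,
        "destinations": hospital,
        "mode": "driving",
        "departure_time": departure_epoch,  # enables traffic-aware estimates
        "key": api_key,
    }
    element = requests.get(URL, params=params).json()["rows"][0]["elements"][0]
    # With a departure time, the API may return a traffic-aware duration.
    duration = element.get("duration_in_traffic", element["duration"])
    return duration["value"] / 60

# Example: estimate a peak-hour recall trip (placeholder address and key).
# print(staff_travel_minutes("Some St, Melbourne", "St Vincent's Hospital",
#                            int(time.time()), "YOUR_API_KEY"))
```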

  1. Benthic Photo Survey: Software for Geotagging, Depth-tagging, and Classifying Photos from Survey Data and Producing Shapefiles for Habitat Mapping in GIS

    Directory of Open Access Journals (Sweden)

    Jared Kibele

    2016-03-01

    Photo survey techniques are common for resource management, ecological research, and ground truthing for remote sensing, but current data processing methods are cumbersome and inefficient. The Benthic Photo Survey (BPS) software described here was created to simplify the data processing and management tasks associated with photo surveys of underwater habitats. BPS is free and open source software written in Python with a Qt graphical user interface. BPS takes a GPS log and jpeg images acquired by a diver or drop camera and assigns the GPS position to each photo based on time-stamps (i.e. geotagging). Depth and temperature can be assigned in a similar fashion (i.e. depth-tagging) using log files from an inexpensive consumer-grade depth/temperature logger that can be attached to the camera. BPS provides the user with a simple interface to assign quantitative habitat and substrate classifications to each photo. Location, depth, temperature, habitat, and substrate data are all stored with the jpeg metadata in Exchangeable image file format (Exif). BPS can then export all of these data in a spatially explicit point shapefile format for use in GIS. BPS greatly reduces the time and skill required to turn photos into usable data, thereby making photo survey methods more efficient and cost effective. BPS can also be used, as is, for other photo sampling techniques in terrestrial and aquatic environments, and the open source code base offers numerous opportunities for expansion and customization.
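
    The geotagging step described above reduces to matching each photo's timestamp against a time-sorted GPS log. A minimal sketch with toy data; the file formats and field names are assumptions rather than BPS's actual implementation:

```python
# Timestamp-matching ("geotagging"): assign each photo the GPS fix nearest
# in time. Toy data; real logs and Exif writing are left out of the sketch.
import bisect
from datetime import datetime

# GPS log: (timestamp, lat, lon) tuples, sorted by time.
gps_log = [
    (datetime(2016, 3, 1, 10, 0, 0), -36.8500, 174.7600),
    (datetime(2016, 3, 1, 10, 0, 30), -36.8502, 174.7604),
    (datetime(2016, 3, 1, 10, 1, 0), -36.8505, 174.7609),
]
photos = {"IMG_0001.jpg": datetime(2016, 3, 1, 10, 0, 29)}

times = [t for t, _, _ in gps_log]

def nearest_fix(ts):
    """Return the index of the GPS fix closest in time to a photo timestamp."""
    i = bisect.bisect_left(times, ts)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(gps_log)]
    return min(candidates, key=lambda j: abs((times[j] - ts).total_seconds()))

for name, ts in photos.items():
    _, lat, lon = gps_log[nearest_fix(ts)]
    print(f"{name}: lat={lat}, lon={lon}")  # would be written to Exif/shapefile
```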

  2. Software Innovation

    DEFF Research Database (Denmark)

    Rose, Jeremy

      Innovation is the forgotten key to modern systems development - the element that defines the enterprising engineer, the thriving software firm and the cutting edge software application.  Traditional forms of technical education pay little attention to creativity - often encouraging overly...

  3. Software engineering

    CERN Document Server

    Sommerville, Ian

    2016-01-01

    For courses in computer science and software engineering The Fundamental Practice of Software Engineering Software Engineering introduces readers to the overwhelmingly important subject of software programming and development. In the past few years, computer systems have come to dominate not just our technological growth, but the foundations of our world's major industries. This text seeks to lay out the fundamental concepts of this huge and continually growing subject area in a clear and comprehensive manner. The Tenth Edition contains new information that highlights various technological updates of recent years, providing readers with highly relevant and current information. Sommerville's experience in system dependability and systems engineering guides the text through a traditional plan-based approach that incorporates some novel agile methods. The text strives to teach the innovators of tomorrow how to create software that will make our world a better, safer, and more advanced place to live.

  4. Software requirements

    CERN Document Server

    Wiegers, Karl E

    2003-01-01

    Without formal, verifiable software requirements (and an effective system for managing them), the programs that developers think they've agreed to build often will not be the same products their customers are expecting. In SOFTWARE REQUIREMENTS, Second Edition, requirements engineering authority Karl Wiegers amplifies the best practices presented in his original award-winning text, now a mainstay for anyone participating in the software development process. In this book, you'll discover effective techniques for managing the requirements engineering process all the way through the development cy

  5. MARPLOT Software

    Science.gov (United States)

    Part of the CAMEO suite, MARPLOT® is a mapping application that people can use to quickly create, view, and modify maps. Users can create their own objects in MARPLOT (e.g., facilities, schools, response assets) and display them on top of a basemap.

  6. Programming with Hierarchical Maps

    DEFF Research Database (Denmark)

    Ørbæk, Peter

    This report describes the hierarchical maps used as a central data structure in the Corundum framework. We describe their most prominent features, argue for their usefulness, and briefly describe some of the software prototypes implemented using the technology.
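
    The report's hierarchical map structure is not specified in detail in this record; the following is a minimal sketch of the general idea, nested maps addressed by key paths, and not the Corundum implementation:

```python
# Minimal hierarchical map: nested dictionaries addressed by key paths.
class HMap:
    def __init__(self):
        self._root = {}

    def set(self, path, value):
        """Set a value at a path like ('config', 'net', 'port')."""
        node = self._root
        for key in path[:-1]:
            node = node.setdefault(key, {})
        node[path[-1]] = value

    def get(self, path, default=None):
        """Walk the nested maps along the path, returning default on a miss."""
        node = self._root
        for key in path:
            if not isinstance(node, dict) or key not in node:
                return default
            node = node[key]
        return node

m = HMap()
m.set(("config", "net", "port"), 8080)
print(m.get(("config", "net", "port")))     # 8080
print(m.get(("config", "missing"), "n/a"))  # n/a
```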

  7. Software Reviews.

    Science.gov (United States)

    Dwyer, Donna; And Others

    1989-01-01

    Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)

  8. Software Reviews.

    Science.gov (United States)

    Davis, Shelly J., Ed.; Knaupp, Jon, Ed.

    1984-01-01

    Reviewed is computer software on: (1) classification of living things, a tutorial program for grades 5-10; and (2) polynomial practice using tiles, a drill-and-practice program for algebra students. (MNS)

  9. Software Reviews.

    Science.gov (United States)

    Miller, Anne, Ed.; Radziemski, Cathy, Ed.

    1988-01-01

    Three pieces of computer software are described and reviewed: HyperCard, to build and use varied applications; Iggy's Gnees, for problem solving with shapes in grades kindergarten-two; and Algebra Shop, for practicing skills and problem solving. (MNS)

  10. Modular Software-Defined Radio

    Directory of Open Access Journals (Sweden)

    Rhiemeier Arnd-Ragnar

    2005-01-01

    In view of the technical and commercial boundary conditions for software-defined radio (SDR), it is worth reconsidering the concept from an unconventional point of view. The organizational principles of signal processing (rather than the signal processing algorithms themselves) are the main focus of this work on modular software-defined radio. Modularity and flexibility are just two key characteristics of the SDR environment which extend smoothly into the modeling of hardware and software. In particular, the proposed model of signal processing software includes irregular, connected, directed, acyclic graphs with random node weights and random edges. Several approaches for mapping such software to a given hardware are discussed. Taking into account previous findings as well as new results from system simulations presented here, the paper finally concludes with the utility of pipelining as a general design guideline for modular software-defined radio.
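
    A minimal sketch of the software model the paper proposes, a directed acyclic graph of signal-processing tasks with random node weights, together with one simple (greedy, load-balancing) way to map tasks onto processing elements. The topology, weights, and mapping heuristic here are illustrative assumptions, not the paper's algorithms:

```python
# Random task DAG with node weights, mapped greedily onto processing elements.
import random

random.seed(0)
n_tasks, n_pes = 10, 3
# Random DAG: an edge (i, j) only for i < j, which guarantees acyclicity.
edges = [(i, j) for i in range(n_tasks) for j in range(i + 1, n_tasks)
         if random.random() < 0.3]
cost = {t: random.randint(1, 9) for t in range(n_tasks)}  # random node weights

load = [0] * n_pes            # accumulated work per processing element
assignment = {}

# Greedy list scheduling in topological (index) order: place each task on the
# currently least-loaded PE. A real mapping would also weigh the communication
# cost of the edges that end up crossing PEs.
for t in range(n_tasks):
    pe = min(range(n_pes), key=lambda p: load[p])
    assignment[t] = pe
    load[pe] += cost[t]

cut = sum(1 for i, j in edges if assignment[i] != assignment[j])
print("assignment:", assignment)
print("per-PE load:", load)
print("edges crossing PEs (communication):", cut)
```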

  11. How does Software Process Improvement Address Global Software Engineering?

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Diebold, Philipp; Münch, Jürgen

    2016-01-01

    For decades, Software Process Improvement (SPI) programs have been implemented, inter alia, to improve quality and speed of software development. To set up, guide, and carry out SPI projects, and to measure SPI state, impact, and success, a multitude of different SPI approaches and considerable experience are available. SPI addresses many aspects, ranging from individual developer skills to entire organizations. It comprises, for instance, the optimization of specific activities in the software lifecycle as well as the creation of organization awareness and project culture. In the course of conducting a systematic mapping study on the state-of-the-art in SPI from a general perspective, we observed Global Software Engineering (GSE) becoming a topic of interest in recent years. Therefore, in this paper, we provide a detailed investigation of those papers from the overall systematic mapping study that were...

  12. Three-dimensional mapping of mechanical activation patterns, contractile dyssynchrony and dyscoordination by two-dimensional strain echocardiography: Rationale and design of a novel software toolbox

    Directory of Open Access Journals (Sweden)

    Cramer Maarten J

    2008-05-01

    The integration of local 2-D echocardiographic deformation data into a 3-D model by dedicated software allows a comprehensive analysis of spatio-temporal distribution patterns of myocardial dyssynchrony, of the global left ventricular deformation, and of newer indices that may better reflect myocardial dyscoordination and/or impaired ventricular contractile efficiency. The potential value of such an analysis is highlighted in two dyssynchronous pathologies that impose particular challenges to deformation imaging.

  13. Software reengineering

    Science.gov (United States)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost-effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost-effective manner than using older technologies. A beta version of the environment was released in March 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.

  14. Software Authentication

    International Nuclear Information System (INIS)

    Wolford, J.K.; Geelhood, B.D.; Hamilton, V.A.; Ingraham, J.; MacArthur, D.W.; Mitchell, D.J.; Mullens, J.A.; Vanier, P. E.; White, G.K.; Whiteson, R.

    2001-01-01

    The effort to define guidance for authentication of software for arms control and nuclear material transparency measurements draws on a variety of disciplines and has involved synthesizing established criteria and practices with newer methods. Challenges include the need to protect classified information that the software manipulates as well as deal with the rapid pace of innovation in the technology of nuclear material monitoring. The resulting guidance will shape the design of future systems and inform the process of authentication of instruments now being developed. This paper explores the technical issues underlying the guidance and presents its major tenets

  15. Software engineering

    CERN Document Server

    Thorin, Marc

    1985-01-01

    Software Engineering describes the conceptual bases as well as the main methods and rules on computer programming. This book presents software engineering as a coherent and logically built synthesis and makes it possible to properly carry out an application of small or medium difficulty that can later be developed and adapted to more complex cases. This text is comprised of six chapters and begins by introducing the reader to the fundamental notions of entities, actions, and programming. The next two chapters elaborate on the concepts of information and consistency domains and show that a proc

  16. Reviews, Software.

    Science.gov (United States)

    Science Teacher, 1988

    1988-01-01

    Reviews two computer software packages for use in physical science, physics, and chemistry classes. Includes "Physics of Model Rocketry" for Apple II, and "Black Box" for Apple II and IBM compatible computers. "Black Box" is designed to help students understand the concept of indirect evidence. (CW)

  17. Software Reviews.

    Science.gov (United States)

    Kinnaman, Daniel E.; And Others

    1988-01-01

    Reviews four educational software packages for Apple, IBM, and Tandy computers. Includes "How the West was One + Three x Four,""Mavis Beacon Teaches Typing,""Math and Me," and "Write On." Reviews list hardware requirements, emphasis, levels, publisher, purchase agreements, and price. Discusses the strengths…

  18. Software Review.

    Science.gov (United States)

    McGrath, Diane, Ed.

    1989-01-01

    Reviewed is a computer software package entitled "Audubon Wildlife Adventures: Grizzly Bears" for Apple II and IBM microcomputers. Included are availability, hardware requirements, cost, and a description of the program. The murder-mystery flavor of the program is stressed in this program that focuses on illegal hunting and game…

  19. Software Reviews.

    Science.gov (United States)

    Teles, Elizabeth, Ed.; And Others

    1990-01-01

    Reviewed are two computer software packages for Macintosh microcomputers including "Phase Portraits," an exploratory graphics tool for studying first-order planar systems; and "MacMath," a set of programs for exploring differential equations, linear algebra, and other mathematical topics. Features, ease of use, cost, availability, and hardware…

  20. MIAWARE Software

    DEFF Research Database (Denmark)

    Wilkowski, Bartlomiej; Pereira, Oscar N. M.; Dias, Paulo

    2008-01-01

    is automatically generated. Furthermore, the MIAWARE software is accompanied by an intelligent search engine for medical reports, based on the relations between parts of the lungs. A logical structure of the lungs is introduced to the search algorithm through a specially developed ontology. As a result...

  1. Application software profiles 2010

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2010-04-15

    This article presented information on new software applications designed to facilitate petroleum exploration, drilling and production activities. Computer modelling and analysis enables oil and gas producers to characterize reservoirs, estimate reserves forecast production, plan operations and manage assets. Seven Calgary-based organizations were highlighted along with their sophisticated software tools, the applications and the new features available in each product. The geoSCOUT version 7.7 by GeoLOGIC Systems Ltd. integrates public and proprietary data on wells, well logs, reserves, pipelines, production, ownership and seismic location data. The Value Navigator and AFE Navigator by Energy Navigator provides control over reserves, production and cash flow forecasting. FAST Harmony, FAST Evolution, FAST CBM, FAST FieldNotes, Fast Piper, FAST RTA, FAST VirtuWell and FAST WellTest by Fekete Associates Inc. provide reserve evaluations for reservoir engineering projects and production data analysis. The esi.manage software program by 3esi improves business results for upstream oil and gas companies through enhanced decision making and workforce effectiveness. WELLFLO, PIPEFLO, FORGAS, OLGA, Drillbench, and MEPO wellbore solutions by Neotec provide unique platforms for flow simulation to optimize oil and gas production systems. Petrel, ECLIPSE, Avocet, PipeSim and Merak software tools by Schlumberger Information Solutions are petroleum systems modelling tools for geologic mapping, visualization modelling and reservoir engineering. StudioSL by Streamsim Technologies Inc. is a modelling tool for optimizing flood management. figs.

  2. Software Tools for Software Maintenance

    Science.gov (United States)

    1988-10-01

    Software Tools for Software Maintenance (ASQBG-1-89-001), Army Institute for Research in Management Information, Communications, and Computer Sciences (AIRMICS), October 1988. [The remainder of this record is OCR residue from the report's tool listings; recoverable entries include a program analyzer, a Cobol Structuring Facility for VS Cobol II, F-Scan for Fortran, and static code analyzers for Cobol and Fortran.]

  3. SU-F-J-173: Online Replanning for Dose Painting Based On Changing ADC Map of Pancreas Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Ates, O; Ahunbay, E; Erickson, B; Li, X [Medical College of Wisconsin, Milwaukee, WI (United States)

    2016-06-15

    Purpose: The introduction of MR-guided radiation therapy (RT), e.g., MR-Linac, would allow dose painting to adapt to the spatial RT response revealed by MRI data during RT delivery. The purpose of this study is to investigate the use of an online replanning method to adapt dose painting from the MRI Apparent Diffusion Coefficient (ADC) map acquired during the delivery of RT for pancreatic cancers. Methods: Original dose painting plans were created based on multi-parametric simulation MRI, including T1, T2 and ADC, using a treatment planning system (MONACO, Elekta) equipped with an online replanning algorithm (WSO, warm start optimization). Multiple GTVs, identified based on various ADC levels, were prescribed to different doses ranging from 50-70 Gy with simultaneous integrated boost in 28 fractions. The MRI scans acquired after RT were used to mimic weekly MRIs, on which the changing GTVs, pancreatic head, and other organs-at-risk (OARs) (duodenum, stomach, small bowel) were delineated. The adaptive plan was generated by applying the WSO algorithm, starting from the deformed original plan based on the weekly MRI, using a deformable image registration (DIR) software (ADMIRE, Elekta). The online replanning method takes <10 min, including DIR, target delineation, WSO execution, and final dose calculation. Standard IGRT repositioning and full-blown reoptimization plans were also generated to compare with the adaptive plans. Results: The online replanning method significantly improved the coverage of the multiple targets and OAR sparing for pancreatic cancers. For example, for a case with two GTVs with prescriptions of 60 and 70 Gy in the pancreatic head, V100-GTV70 (the volume covered by 100% of the prescription dose for the 70-Gy GTV)/V100-GTV60/V100-CTV50/V45-duodenum were (95.1/22.2/69.5/85.7), (95.0/97.0/98.6/34.3), and (95.0/98.1/100.0/38.7) for the IGRT, adaptive, and reoptimization plans, respectively. Conclusion: The introduced online adaptive replanning method can effectively account for
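
    A minimal sketch of the dose-painting idea above: thresholding a toy ADC map into sub-volumes and attaching an escalating prescription to each. The thresholds and dose levels are illustrative assumptions, not the study's values:

```python
# Threshold a toy ADC map into GTV sub-volumes with escalating doses.
import numpy as np

rng = np.random.default_rng(2)
adc = rng.normal(1.5e-3, 0.4e-3, size=(64, 64))  # toy ADC map (mm^2/s)
tumor_mask = np.zeros_like(adc, dtype=bool)
tumor_mask[20:44, 20:44] = True                  # toy tumor contour

# Lower ADC is often read as denser, more aggressive tissue; map ADC bands
# inside the tumor to boost levels (Gy), analogous to 50-70 Gy painting.
bands = [(0.0, 1.1e-3, 70), (1.1e-3, 1.4e-3, 60), (1.4e-3, np.inf, 50)]
dose = np.zeros_like(adc)
for lo, hi, gy in bands:
    sel = tumor_mask & (adc >= lo) & (adc < hi)
    dose[sel] = gy

for _, _, gy in bands:
    print(f"{gy} Gy sub-volume: {np.sum(dose == gy)} voxels")
```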

  4. EPIQR software

    Energy Technology Data Exchange (ETDEWEB)

    Flourentzos, F. [Federal Institute of Technology, Lausanne (Switzerland); Droutsa, K. [National Observatory of Athens, Athens (Greece); Wittchen, K.B. [Danish Building Research Institute, Hoersholm (Denmark)

    1999-11-01

    The EPIQR method is supported by a multimedia computer program. Several modules help users of the method to treat the data collected during a diagnosis survey, to set up refurbishment scenarios and calculate their cost or energy performance, and finally to visualize the results in a comprehensive way and prepare quality reports. This article presents the structure and the main features of the software. (au)

  5. Software preservation

    Directory of Open Access Journals (Sweden)

    Tadej Vodopivec

    2011-01-01

    Comtrade Ltd. covers a wide range of activities related to information and communication technologies; its deliverables include web applications, locally installed programs, system software, drivers, and embedded software (used e.g. in medical devices, auto parts, and communication switchboards). The company has also acquired extensive knowledge and practical experience of digital long-term preservation technologies. This wide spectrum of activities puts us in a position to discuss an often overlooked aspect of digital preservation: the preservation of software programs. There are many resources dedicated to the digital preservation of data, documents, and multimedia records, but not so many about how to preserve the functionalities and features of computer programs. Exactly these functionalities, the dynamic response to inputs, make computer programs rich compared to documents or linear multimedia. The article opens questions on how to begin the road to permanent digital preservation. The purpose is to find a way in the right direction, where all relevant aspects are covered in proper balance. The following questions are asked: why preserve computer programs permanently at all, who should do this and for whom, when should we think about permanent program preservation, what should be preserved (such as source code, screenshots, documentation, and the social context of the program, e.g. media response to it), and where and how? To illustrate the theoretical concepts, the idea of a virtual national museum of electronic banking is also presented.

  6. Establishing software quality assurance

    International Nuclear Information System (INIS)

    Malsbury, J.

    1983-01-01

    This paper is concerned with four questions about establishing software QA: What is software QA? Why have software QA? What is the role of software QA? What is necessary to ensure the success of software QA?

  7. Mapping out Map Libraries

    Directory of Open Access Journals (Sweden)

    Ferjan Ormeling

    2008-09-01

    Discussing the requirements for map data quality, map users, and their library/archives environment, the paper focuses on the metadata the user would need for a correct and efficient interpretation of the map data. For such a correct interpretation, knowledge of the rules and guidelines according to which the topographers/cartographers work (such as the kinds of data categories to be collected), and of the degree to which these rules and guidelines were indeed followed, is essential. This is not only valid for the old maps stored in our libraries and archives, but perhaps even more so for the new digital files, as the format in which we now have to access our geospatial data. As this would be too much to ask from map librarians/curators, some sort of web 2.0 environment is sought where comments about data quality, completeness, and up-to-dateness from knowledgeable map users regarding the specific maps or map series studied can be collected and tagged to scanned versions of these maps on the web. In order not to be subject to the same disadvantages as Wikipedia, where the 'communis opinio', rather than scholarship, seems to be decisive, some checking by map curators of this tagged map-use information would still be needed. Cooperation between map curators and the International Cartographic Association (ICA) map and spatial data use commission to this end is suggested.

  8. Software Prototyping

    Science.gov (United States)

    Del Fiol, Guilherme; Hanseler, Haley; Crouch, Barbara Insley; Cummins, Mollie R.

    2016-01-01

    Background: Health information exchange (HIE) between Poison Control Centers (PCCs) and Emergency Departments (EDs) could improve care of poisoned patients. However, PCC information systems are not designed to facilitate HIE with EDs; therefore, we are developing specialized software to support HIE within the normal workflow of the PCC using user-centered design and rapid prototyping. Objective: To describe the design of an HIE dashboard and the refinement of user requirements through rapid prototyping. Methods: Using previously elicited user requirements, we designed low-fidelity sketches of designs on paper with iterative refinement. Next, we designed an interactive high-fidelity prototype and conducted scenario-based usability tests with end users. Users were asked to think aloud while accomplishing tasks related to a case vignette. After testing, the users provided feedback and evaluated the prototype using the System Usability Scale (SUS). Results: Survey results from three users provided useful feedback that was then incorporated into the design. After achieving a stable design, we used the prototype itself as the specification for development of the actual software. Benefits of prototyping included 1) having subject-matter experts heavily involved with the design; 2) flexibility to make rapid changes; 3) the ability to minimize software development efforts early in the design stage; 4) rapid finalization of requirements; 5) early visualization of designs; and 6) a powerful vehicle for communication of the design to the programmers. Challenges included 1) time and effort to develop the prototypes and case scenarios; 2) no simulation of system performance; 3) not having all proposed functionality available in the final product; and 4) missing needed data elements in the PCC information system. PMID:27081404
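
    The prototype above was scored with the System Usability Scale. The standard SUS computation (ten 1-5 ratings; odd items contribute rating minus 1, even items 5 minus rating; the sum is scaled by 2.5 to a 0-100 range) fits in a few lines:

```python
# Standard System Usability Scale (SUS) scoring.
def sus_score(ratings):
    if len(ratings) != 10 or not all(1 <= r <= 5 for r in ratings):
        raise ValueError("SUS needs ten ratings on a 1-5 scale")
    total = sum(r - 1 if i % 2 == 0 else 5 - r  # i=0 is item 1 (an odd item)
                for i, r in enumerate(ratings))
    return total * 2.5

# Example respondent (made-up ratings):
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0
```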

  9. 7. annual software survey 2009

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2009-07-15

    This article presented a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In addition to a description of the software application, this article listed the name of software providers and the new features available in each product. The featured software developed by Calgary-based providers included: OpenInvoice software developed by DO2 Technologies Inc; oil and gas solutions by Energy Navigator; WellSpring planning system by Enersight; Entero MOSAIC and Entero ONE software packages by Entero Corporation; Emission Manager by Envirosoft Corporation; ResSurveil, ResBalance and ResAssist by Epic Consulting Services Ltd.; OMNI 3D, VISTA 2D/3D seismic software by Gedco; geoSCOUT, petroCUBE and gDC by GeoLOGIC Systems Ltd.; IHS AccuMap and PETRA by IHS; WELLFLO, PIPEFLO and FORGAS wellbore solutions by Neotec; AFENexus, FANexus, GeoNexus, JVNexus, PANexus software by Pandell Technology Corporation; Oil and gas solutions by the Risk Advisory division of SAS; Petrel, ECLIPSE, Avocet, Osprey and Merak by Schlumberger Information Solutions; esi.manage and esi.executive by 3esi; and STABView, ROCKSBank by Weatherford Advanced Geotechnology. The featured software developed by Texas-based providers included the HTRI Xchanger Suite by Heat Transfer Research Inc.; the RFID-based asset tracking system by Merrick Systems; oil and gas solutions by Neuralog Inc.; geoscience data programs by OpenSpirit; and oil and gas solutions by Seismic Micro-Technology Inc. The featured software developed by Vancouver-based providers included the oil and gas solutions by Sustainet Software Solutions Inc.

  10. Software Maintenance Management Evaluation and Continuous Improvement

    CERN Document Server

    April, Alain

    2008-01-01

    This book explores the domain of software maintenance management and provides road maps for improving software maintenance organizations. It describes full maintenance maturity models organized by levels 1, 2, and 3, which allow for benchmarking and continuous improvement paths. Goals for each key practice area are also provided, and the model presented is fully aligned with the architecture and framework of software development maturity models of CMMI and ISO 15504. It is complete with case studies, figures, tables, and graphs.

  11. Global Software Engineering: A Software Process Approach

    Science.gov (United States)

    Richardson, Ita; Casey, Valentine; Burton, John; McCaffery, Fergal

    Our research has shown that many companies are struggling with the successful implementation of global software engineering, due to temporal, cultural and geographical distance, which causes a range of factors to come into play. For example, cultural, project management, and communication difficulties continually cause problems for software engineers and project managers. While the implementation of efficient software processes can be used to improve the quality of the software product, published software process models do not cater explicitly for the recent growth in global software engineering. Our thesis is that global software engineering factors should be included in software process models to ensure their continued usefulness in global organisations. Based on extensive global software engineering research, we have developed a software process, Global Teaming, which includes specific practices and sub-practices. The purpose is to ensure that requirements for successful global software engineering are stipulated so that organisations can ensure successful implementation of global software engineering.

  12. Software system safety

    Science.gov (United States)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  13. Multi parametric deformed Heisenberg algebras: a route to complexity

    International Nuclear Information System (INIS)

    Curado, E.M.F.; Rego-Monteiro, M.A.

    2000-09-01

    We introduce a generalization of the Heisenberg algebra which is written in terms of a functional of one generator of the algebra, f(J₀), which can be any analytical function. When f is linear with slope θ, we show that the algebra in this case corresponds to q-oscillators with q² = tan θ. The case where f is a polynomial of order n in J₀ corresponds to an n-parameter Heisenberg algebra. The representations of the algebra, when f is any analytical function, are shown to be obtained through the study of the stability of the fixed points of f and their composed functions. The case when f is a quadratic polynomial in J₀, the simplest non-linear scheme able to create chaotic behavior, is analyzed in detail; special regions in the parameter space give representations that cannot be continuously deformed to representations of the Heisenberg algebra. (author)
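
    The record ties the algebra's representations to the stability of the fixed points of f and its composed functions. A small numerical sketch for an illustrative quadratic choice f(x) = x² + c (the paper's exact parametrization is not given in this record):

```python
# Fixed points of f(x) = x^2 + c and their linear stability |f'(x*)| < 1.
import numpy as np

def fixed_points_quadratic(c):
    # Solve f(x) = x for f(x) = x^2 + c, i.e. x^2 - x + c = 0.
    return np.roots([1.0, -1.0, c])

def classify(x):
    # A fixed point x* is stable (attracting) when |f'(x*)| = |2 x*| < 1.
    return "stable" if abs(2 * x) < 1 else "unstable"

for c in (0.1, 0.24, 0.3):
    for x in fixed_points_quadratic(c):
        if abs(x.imag) < 1e-12:
            print(f"c={c}: fixed point {x.real:+.4f} is {classify(x.real)}")
        else:
            print(f"c={c}: complex root {x:.4f} (no real fixed point)")
```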

  14. Hybrid fiber grating cavity for multi-parametric sensing.

    Science.gov (United States)

    Paladino, Domenico; Quero, Giuseppe; Caucheteur, Christophe; Mégret, Patrice; Cusano, Andrea

    2010-05-10

    We propose an all-fiber hybrid cavity involving two unbalanced uniform fiber Bragg gratings (FBGs) written at both sides of a tilted FBG (TFBG) to form an all-fiber interferometer. This configuration provides a wavelength gated reflection signal with interference fringes depending on the cavity features modulated by spectral dips associated to the wavelength dependent optical losses due to cladding mode coupling occurring along the TFBG. Such a robust structure preserves the advantages of uniform FBGs in terms of interrogation methods and allows the possibility of simultaneous physical and chemical sensing. (c) 2010 Optical Society of America.

  15. SCT: Spinal Cord Toolbox, an open-source software for processing spinal cord MRI data.

    Science.gov (United States)

    De Leener, Benjamin; Lévy, Simon; Dupont, Sara M; Fonov, Vladimir S; Stikov, Nikola; Louis Collins, D; Callot, Virginie; Cohen-Adad, Julien

    2017-01-15

    For the past 25 years, the field of neuroimaging has witnessed the development of several software packages for processing multi-parametric magnetic resonance imaging (mpMRI) to study the brain. These software packages are now routinely used by researchers and clinicians, and have contributed to important breakthroughs for the understanding of brain anatomy and function. However, no software package exists to process mpMRI data of the spinal cord. Despite the numerous clinical needs for such advanced mpMRI protocols (multiple sclerosis, spinal cord injury, cervical spondylotic myelopathy, etc.), researchers have been developing specific tools that, while necessary, do not provide an integrative framework that is compatible with most usages and that is capable of reaching the community at large. This hinders cross-validation and the possibility to perform multi-center studies. In this study we introduce the Spinal Cord Toolbox (SCT), a comprehensive software dedicated to the processing of spinal cord MRI data. SCT builds on previously-validated methods and includes state-of-the-art MRI templates and atlases of the spinal cord, algorithms to segment and register new data to the templates, and motion correction methods for diffusion and functional time series. SCT is tailored towards standardization and automation of the processing pipeline, versatility, modularity, and it follows guidelines of software development and distribution. Preliminary applications of SCT cover a variety of studies, from cross-sectional area measures in large databases of patients, to the precise quantification of mpMRI metrics in specific spinal pathways. We anticipate that SCT will bring together the spinal cord neuroimaging community by establishing standard templates and analysis procedures. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Robotic Software Integration Using MARIE

    Directory of Open Access Journals (Sweden)

    Carle Côté

    2006-03-01

    This paper presents MARIE, a middleware framework oriented towards developing and integrating new and existing software for robotic systems. By using a generic communication framework, MARIE aims to create a flexible distributed component system that allows robotics developers to share software programs and algorithms, and to design prototypes rapidly based on their own integration needs. The use of MARIE is illustrated with the design of a socially interactive autonomous mobile robot platform capable of map building, localization, navigation, task scheduling, sound source localization, tracking and separation, speech recognition and generation, visual tracking, message reading, and graphical interaction using a touch screen interface.
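
    MARIE's actual API is not shown in this record; the following is a minimal sketch of the general pattern it describes, components exchanging messages through a shared mediator so that robotics modules stay loosely coupled:

```python
# Mediator-style message passing between loosely coupled robot components.
from collections import defaultdict

class Mediator:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, message):
        # Deliver the message to every component subscribed to the topic.
        for handler in self._subscribers[topic]:
            handler(message)

bus = Mediator()
bus.subscribe("pose", lambda msg: print("mapper got", msg))
bus.subscribe("pose", lambda msg: print("navigator got", msg))
bus.publish("pose", {"x": 1.0, "y": 2.5, "theta": 0.3})
```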

  17. Sandia software guidelines: Software quality planning

    Energy Technology Data Exchange (ETDEWEB)

    1987-08-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standard for Software Quality Assurance Plans, this volume identifies procedures to follow in producing a Software Quality Assurance Plan for an organization or a project, and provides an example project SQA plan. 2 figs., 4 tabs.

  18. Avoidable Software Procurements

    Science.gov (United States)

    2012-09-01

    Keywords: software license, software usage, ELA, Software as a Service, SaaS, Software Asset... Acronyms from the report include PaaS (Platform as a Service), SaaS (Software as a Service), SAM (Software Asset Management), SMS (System Management Server), and SEWP (Solutions for Enterprise Wide...). Excerpt: "...delivery of full Cloud Services, we will see the transition of the Cloud Computing service model from IaaS to SaaS, or Software as a Service."

  19. Software engineering architecture-driven software development

    CERN Document Server

    Schmidt, Richard F

    2013-01-01

    Software Engineering: Architecture-driven Software Development is the first comprehensive guide to the underlying skills embodied in the IEEE's Software Engineering Body of Knowledge (SWEBOK) standard. Standards expert Richard Schmidt explains the traditional software engineering practices recognized for developing projects for government or corporate systems. Software engineering education often lacks standardization, with many institutions focusing on implementation rather than design as it impacts product architecture. Many graduates join the workforce with incomplete skil

  20. Software development without languages

    Science.gov (United States)

    Osborne, Haywood S.

    1988-01-01

    Automatic programming generally involves the construction of a formal specification; i.e., one which allows unambiguous interpretation by tools for the subsequent production of the corresponding software. Previous practical efforts in this direction have focused on the serious problems of: (1) designing the optimum specification language; and (2) mapping (translating or compiling) from this specification language to the program itself. The approach proposed bypasses the above problems. It postulates that the specification proper should be an intermediate form, with the sole function of containing information sufficient to facilitate construction of programs and also of matching documentation. Thus, the means of forming the intermediary becomes a human factors task rather than a linguistic one; human users will read documents generated from the specification, rather than the specification itself.

  1. Software Process Improvement

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Konopka, Claudia; Nellemann, Peter

    2016-01-01

    Software process improvement (SPI) has been around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out there? Are there new emerging approaches? What are the open issues? Still, we struggle to answer the question: what is the current state of SPI and related research? We present initial results from a systematic mapping study to shed light on the field of SPI and to draw conclusions for future research directions. An analysis of 635 publications draws a big picture of SPI-related research of the past 25 years. Our study shows a high number of solution proposals, experience reports, and secondary studies, but only a few theories. In particular, standard SPI models are analyzed and evaluated for applicability...

  2. The software life cycle

    CERN Document Server

    Ince, Darrel

    1990-01-01

    The Software Life Cycle deals with the software lifecycle, that is, what exactly happens when software is developed. Topics covered include aspects of software engineering, structured techniques of software development, and software project management. The use of mathematics to design and develop computer systems is also discussed. This book is comprised of 20 chapters divided into four sections and begins with an overview of software engineering and software development, paying particular attention to the birth of software engineering and the introduction of formal methods of software develop

  3. Architecting for Sustainable Software Delivery

    Science.gov (United States)

    2012-06-01

    Architecting for Sustainable Software Delivery, Ronald J. Koontz (Boeing) and Robert L. Nord, CrossTalk, May/June 2012. [The record consists of page-header residue and fragments:] "...Figure 2, and additional architecture documentation can be found in the work of Koontz [9, 10, 11]. Designing for extensibility promotes continued..." plus a table titled "Mapping of Practices to Agile and Architecture Criteria."

  4. Operational excellence (six sigma) philosophy: Application to software quality assurance

    Energy Technology Data Exchange (ETDEWEB)

    Lackner, M.

    1997-11-01

    This report contains viewgraphs on the operational excellence (six sigma) philosophy applied to software quality assurance. The report outlines the following: the goal of six sigma; six sigma tools; manufacturing vs. administrative processes; software quality assurance document inspections; mapping software quality assurance requirements documents; failure mode effects analysis for requirements documents; measuring the right response variables; and questions.

  5. The software environment of RODOS

    International Nuclear Information System (INIS)

    Schuele, O.; Rafat, M.; Kossykh, V.

    1996-01-01

    The Software Environment of RODOS provides tools for processing and managing a large variety of different types of information, including those which are categorized in terms of meteorology, radiology, economy, emergency actions and countermeasures, rules, preferences, facts, maps, statistics, catalogues, models and methods. The main tasks of the Operating Subsystem OSY, which is based on the Client-Server Model, are the control of system operation, data management, and the exchange of information among various modules as well as the interaction with users in distributed computer systems. The paper describes the software environment of RODOS, in particular, the individual modules of its Operating Subsystem OSY, its distributed database, the geographical information system RoGIS, the on-line connections to radiological and meteorological networks and the software environment for the integration of external programs into the RODOS system

  6. The software environment of RODOS

    International Nuclear Information System (INIS)

    Schuele, O.; Rafat, M.

    1998-01-01

    The Software Environment of RODOS provides tools for processing and managing a large variety of different types of information, including those which are categorised in terms of meteorology, radiology, economy, emergency actions and countermeasures, rules, preferences, facts, maps, statistics, catalogues, models and methods. The main tasks of the Operating Subsystem OSY, which is based on the Client-Server Model, are the control of system operation, data management, and the exchange of information among various modules as well as the interaction with users in distributed computer systems. The paper describes the software environment of RODOS, in particular, the individual modules of its Operating Subsystem OSY, its distributed database, the geographical information system RoGIS, the on-line connections to radiological and meteorological networks and the software environment for the integration of external programs into the RODOS system. (orig.)

  7. The software environment of RODOS

    Energy Technology Data Exchange (ETDEWEB)

    Schuele, O; Rafat, M [Forschungszentrum Karlsruhe, Institut fuer Neutronenphysik und Reaktortechnik, Karlsruhe (Germany); Kossykh, V [Scientific Production Association ' TYPHOON' , Emergency Centre, Obninsk (Russian Federation)

    1996-07-01

    The Software Environment of RODOS provides tools for processing and managing a large variety of different types of information, including those which are categorized in terms of meteorology, radiology, economy, emergency actions and countermeasures, rules, preferences, facts, maps, statistics, catalogues, models and methods. The main tasks of the Operating Subsystem OSY, which is based on the Client-Server Model, are the control of system operation, data management, and the exchange of information among various modules as well as the interaction with users in distributed computer systems. The paper describes the software environment of RODOS, in particular, the individual modules of its Operating Subsystem OSY, its distributed database, the geographical information system RoGIS, the on-line connections to radiological and meteorological networks and the software environment for the integration of external programs into the RODOS system.

  8. Measuring the Software Product Quality during the Software Development Life-Cycle: An ISO Standards Perspective

    OpenAIRE

    Rafa E. Al-Qutaish

    2009-01-01

    Problem statement: The International Organization for Standardization (ISO) has published a set of international standards related to software engineering, such as ISO 12207 and ISO 9126. However, there is a set of cross-references between the two standards. Approach: ISO 9126 on software product quality and ISO 12207 on software life cycle processes were analyzed to investigate the relationships between them and to make a mapping from the ISO 9126 quality characteristics to the ISO 12207 life cycle processes...

  9. Concept mapping instrumental support for problem solving

    NARCIS (Netherlands)

    Stoyanov, S.; Stoyanov, Slavi; Kommers, Petrus A.M.

    2008-01-01

    The main theoretical position of this paper is that it is the explicit problem-solving support in concept mapping software that produces a stronger effect on problem-solving performance than the implicit support afforded by the graphical functionality of concept mapping software.

  10. ESTSC - Software Best Practices

    Science.gov (United States)

    DOE Scientific and Technical Software Best Practices, December 2010. Table of contents: 1.0 Introduction; 2.0 Responsibilities (2.1 OSTI/ESTSC; 2.2 SIACs; 2.3 Software Submitting Sites/Creators; 2.4 Software Sensitivity Review); 3.0 Software Announcement and Submission (3.1 STI Software Appropriate for Announcement; 3.2 ...)

  11. Software Assurance Competency Model

    Science.gov (United States)

    2013-03-01

    ...(COTS) software, and software as a service (SaaS). L2: Define and analyze risks in the acquisition of contracted software, COTS software, and SaaS... [2010a]: Application of technologies and processes to achieve a required level of confidence that software systems and services function in the...

  12. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.
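    SAVAnT checks an executing program against constraints written in SAGE; the sketch below illustrates only the general idea of runtime requirement monitoring, in plain Python rather than SAGE, with entirely hypothetical requirement predicates and function names:

    ```python
    # Illustrative sketch (not SAVAnT/SAGE): monitor a running function against
    # prespecified requirement constraints, flagging any violation at execution time.

    def monitored(precondition, postcondition):
        """Wrap a function so each call is checked against requirement predicates."""
        def wrap(func):
            def inner(*args, **kwargs):
                assert precondition(*args, **kwargs), f"{func.__name__}: precondition violated"
                result = func(*args, **kwargs)
                assert postcondition(result), f"{func.__name__}: postcondition violated"
                return result
            return inner
        return wrap

    @monitored(precondition=lambda flow: flow >= 0.0,
               postcondition=lambda valve: valve in ("OPEN", "CLOSED"))
    def set_relief_valve(flow):
        # Hypothetical control rule: open the relief valve above 100.0 units.
        return "OPEN" if flow > 100.0 else "CLOSED"

    print(set_relief_valve(42.0))   # CLOSED; both checks pass
    ```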

  13. Information technology road map 2015

    International Nuclear Information System (INIS)

    2009-09-01

    This book introduces the Information Technology Road Map 2015, covering its presentation, process, plan, and conclusions. It also introduces the IT road map by field: next-generation semiconductors, displays, light-emitting diodes and the light industry, home networks and home electronic appliances, digital TV and broadcasting, radio technology, satellite communications, next-generation mobile communications, the BcN field, software, next-generation computing, and knowledge-information security.

  14. Reliability of software

    International Nuclear Information System (INIS)

    Kopetz, H.

    1980-01-01

    Common factors and differences in the reliability of hardware and software; increasing reliability by means of software redundancy; maintenance of software for long-term operating behavior. (HP)

  15. Modernising ATLAS Software Build Infrastructure

    CERN Document Server

    Ritsch, Elmar; The ATLAS collaboration

    2017-01-01

    In the last year ATLAS has radically updated its software development infrastructure, hugely reducing the complexity of building releases and greatly improving build speed, flexibility and code testing. The first step in this transition was the adoption of CMake as the software build system over the older CMT. This required the development of an automated translation from the old system to the new, followed by extensive testing and improvements. This resulted in a far more standard build process that was married to the method of building ATLAS software as a series of 12 separate projects from Subversion. We then proceeded with a migration of the code base from Subversion to Git. As the Subversion repository had been structured to manage each package more or less independently, there was no simple mapping that could be used to manage the migration into Git. Instead a specialist set of scripts that captured the software changes across official software releases was developed. With some clean up of the repository...

  16. Modernising ATLAS Software Build Infrastructure

    CERN Document Server

    Gaycken, Goetz; The ATLAS collaboration

    2017-01-01

    In the last year ATLAS has radically updated its software development infrastructure, hugely reducing the complexity of building releases and greatly improving build speed, flexibility and code testing. The first step in this transition was the adoption of CMake as the software build system over the older CMT. This required the development of an automated translation from the old system to the new, followed by extensive testing and improvements. This resulted in a far more standard build process that was married to the method of building ATLAS software as a series of 12 separate projects from SVN. We then proceeded with a migration of its code base from SVN to git. As the SVN repository had been structured to manage each package more or less independently, there was no simple mapping that could be used to manage the migration into git. Instead a specialist set of scripts that captured the software changes across official software releases was developed. With some clean up of the repository and the policy of only...

  17. Space Flight Software Development Software for Intelligent System Health Management

    Science.gov (United States)

    Trevino, Luis C.; Crumbley, Tim

    2004-01-01

    The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.

  18. Software Engineering Guidebook

    Science.gov (United States)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  19. Map of Nasca Geoglyphs

    Science.gov (United States)

    Hanzalová, K.; Pavelka, K.

    2013-07-01

    The Czech Technical University in Prague, in cooperation with the University of Applied Sciences in Dresden (Germany), works on the Nasca Project. The cooperation started in 2004 and much work has been done since then. All of the work is connected with the Nasca lines in southern Peru. The Nasca Project started in 1995 and its main target is the documentation and conservation of the Nasca lines. Most of the project results are presented as a WebGIS application via the Internet. In the face of the impending destruction of the soil drawings, it is possible to preserve this world cultural heritage for posterity, at least in digital form. Creating a map of the Nasca lines is very useful. The map is in digital form and is also available as a paper map. The map contains the planimetric component, map lettering, and altimetry. The thematic layer in this map is a vector layer of the geoglyphs in Nasca, Peru. The basis for the planimetry is georeferenced satellite imagery; the altimetry is created from a digital elevation model. The map was created in ArcGIS software.

  20. MAP OF NASCA GEOGLYPHS

    Directory of Open Access Journals (Sweden)

    K. Hanzalová

    2013-07-01

    The Czech Technical University in Prague, in cooperation with the University of Applied Sciences in Dresden (Germany), works on the Nasca Project. The cooperation started in 2004 and much work has been done since then. All of the work is connected with the Nasca lines in southern Peru. The Nasca Project started in 1995 and its main target is the documentation and conservation of the Nasca lines. Most of the project results are presented as a WebGIS application via the Internet. In the face of the impending destruction of the soil drawings, it is possible to preserve this world cultural heritage for posterity, at least in digital form. Creating a map of the Nasca lines is very useful. The map is in digital form and is also available as a paper map. The map contains the planimetric component, map lettering, and altimetry. The thematic layer in this map is a vector layer of the geoglyphs in Nasca, Peru. The basis for the planimetry is georeferenced satellite imagery; the altimetry is created from a digital elevation model. The map was created in ArcGIS software.

  1. Ensuring Software IP Cleanliness

    Directory of Open Access Journals (Sweden)

    Mahshad Koohgoli

    2007-12-01

    At many points in the life of a software enterprise, determination of intellectual property (IP) cleanliness becomes critical. The value of an enterprise that develops and sells software may depend on how clean the software is from the IP perspective. This article examines various methods of ensuring software IP cleanliness and discusses some of the benefits and shortcomings of current solutions.

  2. Commercial Literacy Software.

    Science.gov (United States)

    Balajthy, Ernest

    1997-01-01

    Presents the first year's results of a continuing project to monitor the availability of software of relevance for literacy education purposes. Concludes that there is an enormous amount of software available for use by teachers of reading and literacy; whereas drill-and-practice software is the largest category of software available, large numbers of...

  3. Ensuring Software IP Cleanliness

    OpenAIRE

    Mahshad Koohgoli; Richard Mayer

    2007-01-01

    At many points in the life of a software enterprise, determination of intellectual property (IP) cleanliness becomes critical. The value of an enterprise that develops and sells software may depend on how clean the software is from the IP perspective. This article examines various methods of ensuring software IP cleanliness and discusses some of the benefits and shortcomings of current solutions.

  4. Statistical Software Engineering

    Science.gov (United States)

    1998-04-13

    ...multiversion software subject to coincident errors. IEEE Trans. Software Eng. SE-11:1511-1517. Eckhardt, D.E., A.K. Caglayan, J.C. Knight, L.D. Lee, D.F. ... J.C. and N.G. Leveson. 1986. Experimental evaluation of the assumption of independence in multiversion software. IEEE Trans. Software Eng. ...

  5. Agile Software Development

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  6. Improving Software Developer's Competence

    DEFF Research Database (Denmark)

    Abrahamsson, Pekka; Kautz, Karlheinz; Sieppi, Heikki

    2002-01-01

    Emerging agile software development methods are people oriented development approaches to be used by the software industry. The personal software process (PSP) is an accepted method for improving the capabilities of a single software engineer. Five original hypotheses regarding the impact...

  7. Software - Naval Oceanography Portal

    Science.gov (United States)

    USNO Earth Orientation software portal: Auxiliary Software, Supporting Software, and the Earth Orientation Matrix Calculator.

  8. Software Engineering Education Directory

    Science.gov (United States)

    1990-04-01

    ...and Engineering (CMSC 735) Codes: GPEV2 * Textbooks: IEEE Tutorial on Models and Metrics for Software Management and Engineering by Basili, Victor R. ... Software Engineering (Comp 227) Codes: GPRY5 Textbooks: IEEE Tutorial on Software Design Techniques by Freeman, Peter and Wasserman, Anthony...

  9. Software Atom: An approach towards software components structuring to improve reusability

    Directory of Open Access Journals (Sweden)

    Muhammad Hussain Mughal

    2017-12-01

    The diversity of application domains compels the design of a sustainable classification scheme for a steadily growing software repository. Atomic reusable software components are articulated to improve software component reusability in a volatile industry. Numerous approaches to software classification have been proposed over past decades, each with limitations related to coupling and cohesion. In this paper, we propose a novel approach that constitutes software from radical functionalities to improve reusability. We analyze the semantics of elements in the periodic table used in chemistry to design our classification approach, present it as a tree-based classification that curtails the search-space complexity of the software repository, and further refine it with semantic search techniques. We developed a globally unique identifier (GUID) for indexing the functions and related components, exploiting the correlation between chemical elements and software elements to simulate a one-to-one mapping between them. Inspired by the periodic table, we propose a software periodic table (SPT) representing atomic software components extracted from real application software. Parsing and extraction over the SPT-classified repository tree enable users to program their software by customizing the ingredients of their software requirements, and enable requirements engineers to develop rapid large-scale prototypes. Furthermore, we predict the usability of the categorized repository based on user feedback. The proposed repository will be continuously fine-tuned based on utilization, and the SPT will be gradually optimized by ant colony optimization techniques, ultimately helping to automate the software development process.
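    As a toy illustration of the repository structure the abstract describes (a classification tree whose components are indexed by GUIDs), the following sketch uses Python's uuid module; all class and method names are hypothetical, not taken from the paper:

    ```python
    # Toy sketch of a GUID-indexed classification tree for software components
    # (names are hypothetical illustrations, not the paper's actual design).
    import uuid

    class Node:
        def __init__(self, category):
            self.category = category
            self.children = {}       # sub-category name -> Node
            self.components = {}     # GUID -> component name

        def file_component(self, path, name):
            """Walk/extend the category path and register the component under a GUID."""
            node = self
            for category in path:
                node = node.children.setdefault(category, Node(category))
            guid = str(uuid.uuid4())
            node.components[guid] = name
            return guid

    root = Node("software")
    guid = root.file_component(["io", "parsing"], "csv_reader")
    print(guid, "->", root.children["io"].children["parsing"].components[guid])
    ```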

  10. Great software debates

    CERN Document Server

    Davis, A

    2004-01-01

    The industry’s most outspoken and insightful critic explains how the software industry REALLY works. In Great Software Debates, Al Davis, shares what he has learned about the difference between the theory and the realities of business and encourages you to question and think about software engineering in ways that will help you succeed where others fail. In short, provocative essays, Davis fearlessly reveals the truth about process improvement, productivity, software quality, metrics, agile development, requirements documentation, modeling, software marketing and sales, empiricism, start-up financing, software research, requirements triage, software estimation, and entrepreneurship.

  11. Mapping the HISS Dipole

    International Nuclear Information System (INIS)

    McParland, C.; Bieser, F.

    1984-01-01

    The principal component of the Bevalac HISS facility is a large super-conducting 3 Tesla dipole. The facility's need for a large magnetic volume spectrometer resulted in a large gap geometry - a 2 meter pole tip diameter and a 1 meter pole gap. Obviously, the field required detailed mapping for effective use as a spectrometer. The mapping device was designed with several major features in mind. The device would measure field values on a grid which described a closed rectangular solid. The grid would be regular, with the exact measurement intervals adjustable by software. The device would function unattended over the long period of time required to complete a field map. During this time, the progress of the map could be monitored by anyone with access to the HISS VAX computer. Details of the mechanical, electrical, and control design follow
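    The grid logic described above (a regular lattice spanning a closed rectangular solid, with software-adjustable intervals) can be sketched in a few lines; the extents, step size, and probe read-out below are illustrative placeholders, not the HISS mapper's actual parameters:

    ```python
    # Sketch of a software-adjustable measurement grid: points covering a closed
    # rectangular solid at regular, configurable intervals (placeholder values).
    import numpy as np

    def measurement_grid(extent, step):
        """Regular grid over [-extent, +extent] in x, y, z with the given step (metres)."""
        axis = np.arange(-extent, extent + step, step)
        return np.stack(np.meshgrid(axis, axis, axis, indexing="ij"), axis=-1)

    grid = measurement_grid(extent=1.0, step=0.25)   # software-adjustable interval
    print(grid.shape)  # (9, 9, 9, 3): 9 stations per axis, each an (x, y, z) triple
    # A mapping run would visit each grid point and record the probe reading there.
    ```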

  12. Regulated software meets DevOps

    DEFF Research Database (Denmark)

    Laukkarinen, Teemu; Kuusinen, Kati; Mikkonen, Tommi

    2018-01-01

    Context: Regulatory authorities require proof from critical systems manufacturers that the software in their products is developed in accordance with prescribed development practices before accepting the product to the markets. This is challenging when using DevOps, where continuous integration and deployment are the default practices, which are not a good match with regulatory software development standards. Objective: We aim to bring DevOps and regulated software development closer to each other. First, we want to make it easier for developers to develop regulated software with tools and practices they are familiar with. Second, we want to allow regulatory authorities to build confidence in solutions provided by manufacturers by defining a mapping between DevOps and regulatory software development. Method: We performed a literature survey and created research suggestions using exploratory...

  13. Views on Software Testability

    OpenAIRE

    Shimeall, Timothy; Friedman, Michael; Chilenski, John; Voas, Jeffrey

    1994-01-01

    The field of testability is an active, well-established part of the engineering of modern computer systems. However, only recently have technologies for software testability begun to be developed. These technologies focus on assessing the aspects of software that improve or depreciate the ease of testing. As both the size of implemented software and the amount of effort required to test that software increase, so will the importance of software testability technologies in influencing the software...

  14. Agile software assessment

    OpenAIRE

    Nierstrasz Oscar; Lungu Mircea

    2012-01-01

    Informed decision making is a critical activity in software development but it is poorly supported by common development environments which focus mainly on low level programming tasks. We posit the need for agile software assessment which aims to support decision making by enabling rapid and effective construction of software models and custom analyses. Agile software assessment entails gathering and exploiting the broader context of software information related to the system at hand as well ...

  15. Software component quality evaluation

    Science.gov (United States)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  16. MAUS: MICE Analysis User Software

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The Muon Ionization Cooling Experiment (MICE) has developed the MICE Analysis User Software (MAUS) to simulate and analyse experimental data. It serves as the primary codebase for the experiment, providing for online data quality checks and offline batch simulation and reconstruction. The code is structured in a Map-Reduce framework to allow parallelization whether on a personal machine or in the control room. Various software engineering practices from industry are also used to ensure correct and maintainable physics code, which include unit, functional and integration tests, continuous integration and load testing, code reviews, and distributed version control systems. Lastly, there are various small design decisions like using JSON as the data structure, using SWIG to allow developers to write components in either Python or C++, or using the SCons python-based build system that may be of interest to other experiments.
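    The Map-Reduce structure described above can be illustrated with a minimal stand-in (this is not the MAUS API; the spill contents and function names are hypothetical):

    ```python
    # Illustration of a Map-Reduce analysis structure: map a per-spill transform
    # over detector data in parallel, then reduce the results into a summary.
    from multiprocessing import Pool
    from functools import reduce

    def mapper(spill):
        """Per-spill processing, e.g. reconstruct and count hits (hypothetical)."""
        return {"hits": sum(spill)}

    def reducer(total, mapped):
        """Merge per-spill results into a running summary."""
        total["hits"] += mapped["hits"]
        return total

    if __name__ == "__main__":
        spills = [[1, 2, 3], [4, 5], [6]]            # stand-in for detector data
        with Pool(processes=2) as pool:              # parallel on a personal machine
            mapped = pool.map(mapper, spills)
        print(reduce(reducer, mapped, {"hits": 0}))  # {'hits': 21}
    ```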

  17. Software Quality Assurance Metrics

    Science.gov (United States)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process that is used to develop a product. The process is measured to improve it, and the product is measured to increase quality throughout the life cycle of software. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If software metrics are implemented in software development, they can save time and money, and allow the organization to identify the causes of defects which have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that have been used in other projects but are not currently being used by the SA team, and report them to the Software Assurance team to see if any can be implemented in their software assurance life cycle process.

  18. Open source clustering software.

    Science.gov (United States)

    de Hoon, M J L; Imoto, S; Nolan, J; Miyano, S

    2004-06-12

    We have implemented k-means clustering, hierarchical clustering and self-organizing maps in a single multipurpose open-source library of C routines, callable from other C and C++ programs. Using this library, we have created an improved version of Michael Eisen's well-known Cluster program for Windows, Mac OS X and Linux/Unix. In addition, we generated a Python and a Perl interface to the C Clustering Library, thereby combining the flexibility of a scripting language with the speed of C. The C Clustering Library and the corresponding Python C extension module Pycluster were released under the Python License, while the Perl module Algorithm::Cluster was released under the Artistic License. The GUI code Cluster 3.0 for Windows, Macintosh and Linux/Unix, as well as the corresponding command-line program, were released under the same license as the original Cluster code. The complete source code is available at http://bonsai.ims.u-tokyo.ac.jp/mdehoon/software/cluster. Alternatively, Algorithm::Cluster can be downloaded from CPAN, while Pycluster is also available as part of the Biopython distribution.
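    A minimal usage sketch of the library's k-means entry point, assuming the Pycluster extension module is installed and following its documented kcluster interface (the data below are made up):

    ```python
    # Minimal k-means sketch against the Pycluster interface described above
    # (assumes Pycluster is installed; argument names follow its documentation).
    import numpy as np
    import Pycluster

    data = np.array([[0.0, 0.1], [0.2, 0.0],      # two loose groups of points
                     [5.0, 5.1], [5.2, 4.9]])

    # kcluster runs k-means `npass` times and keeps the best solution found.
    clusterid, error, nfound = Pycluster.kcluster(data, nclusters=2, npass=10)
    print(clusterid)   # e.g. [0 0 1 1]: cluster assignment per row
    ```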

  19. Software Engineering Program: Software Process Improvement Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  20. From Software Development to Software Assembly

    NARCIS (Netherlands)

    Sneed, Harry M.; Verhoef, Chris

    2016-01-01

    The lack of skilled programming personnel and the growing burden of maintaining customized software are forcing organizations to quit producing their own software. It's high time they turned to ready-made, standard components to fulfill their business requirements. Cloud services might be one way to...

  1. Software Engineering Improvement Plan

    Science.gov (United States)

    2006-01-01

    In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.

  2. Spotting software errors sooner

    International Nuclear Information System (INIS)

    Munro, D.

    1989-01-01

    Static analysis is helping to identify software errors at an earlier stage and more cheaply than conventional methods of testing. RTP Software's MALPAS system also has the ability to check that a code conforms to its original specification. (author)

  3. Avionics and Software Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of the AES Avionics and Software (A&S) project is to develop a reference avionics and software architecture that is based on standards and that can be...

  4. Paladin Software Support Lab

    Data.gov (United States)

    Federal Laboratory Consortium — The Paladin Software Support Environment (SSE) occupies 2,241 square feet. It contains the hardware and software tools required to support the Paladin Automatic Fire...

  5. Pragmatic Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan; Jensen, Rikke Hagensby

    2014-01-01

    We understand software innovation as concerned with introducing innovation into the development of software intensive systems, i.e. systems in which software development and/or integration are dominant considerations. Innovation is key in almost any strategy for competitiveness in existing markets, for creating new markets, or for curbing rising public expenses, and software intensive systems are core elements in most such strategies. Software innovation therefore is vital for about every sector of the economy. Changes in software technologies over the last decades have opened up for experimentation, learning, and flexibility in ongoing software projects, but how can this change be used to facilitate software innovation? How can a team systematically identify and pursue opportunities to create added value in ongoing projects? In this paper, we describe Deweyan pragmatism as the philosophical foundation...

  6. Process mining software repositories

    NARCIS (Netherlands)

    Poncin, W.; Serebrenik, A.; Brand, van den M.G.J.

    2011-01-01

    Software developers' activities are in general recorded in software repositories such as version control systems, bug trackers and mail archives. While abundant information is usually present in such repositories, successful information extraction is often challenged by the necessity to...

  7. Optimization of Antivirus Software

    OpenAIRE

    Catalin BOJA; Adrian VISOIU

    2007-01-01

    The paper describes the main techniques used in the development of computer antivirus software applications. For this particular category of software, optimum criteria are identified and defined to help determine which solution is better and what the objectives of the optimization process are. From the general viewpoint of software optimization, methods and techniques applied at the code development level are presented. Regarding the particularities of antivirus software, the paper analyzes...

  8. Open Source Software Development

    Science.gov (United States)

    2011-01-01

    ...appropriate to refer to FOSS or FLOSS (L for Libre, where the alternative term "libre software" has popularity in some parts of the world) in order... Applying Social Network Analysis to Community-Driven Libre Software Projects, Intern. J. Info. Tech. and Web Engineering, 2006, 1(3), 27-28. ... Open Source Software Development, Walt Scacchi, Institute for Software Research, University of California, Irvine, CA 92697-3455, USA. Abstract...

  9. Gammasphere software development

    International Nuclear Information System (INIS)

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, the group has established a new public ftp archive to distribute software and software development tools and information

  10. Software engineer's pocket book

    CERN Document Server

    Tooley, Michael

    2013-01-01

    Software Engineer's Pocket Book provides a concise discussion on various aspects of software engineering. The book is comprised of six chapters that tackle various areas of concern in software engineering. Chapter 1 discusses software development, and Chapter 2 covers programming languages. Chapter 3 deals with operating systems. The book also tackles discrete mathematics and numerical computation. Data structures and algorithms are also explained. The text will be of great use to individuals involved in the specification, design, development, implementation, testing, maintenance, and quality...

  11. Software Testing Requires Variability

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    2003-01-01

    Software variability is the ability of a software system or artefact to be changed, customized or configured for use in a particular context. Variability in software systems is important from a number of perspectives. Some perspectives rightly receive much attention due to their direct economic impact in software production. As is also apparent from the call for papers, these perspectives focus on qualities such as reuse, adaptability, and maintainability.

  12. Computer software quality assurance

    International Nuclear Information System (INIS)

    Ives, K.A.

    1986-06-01

    The author defines some criteria for the evaluation of software quality assurance elements for applicability to the regulation of the nuclear industry. The author then analyses a number of software quality assurance (SQA) standards. The major extracted SQA elements are then discussed, and finally specific software quality assurance recommendations are made for the nuclear industry

  13. Software Architecture Evolution

    Science.gov (United States)

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  14. XES Software Communication Extension

    NARCIS (Netherlands)

    Leemans, M.; Liu, C.

    2017-01-01

    During the execution of software, execution data can be recorded. With the development of process mining techniques on the one hand, and the growing availability of software execution data on the other hand, a new form of software analytics comes into reach. That is, applying process mining

  15. Neutron Scattering Software

    Science.gov (United States)

    A new portal for neutron scattering has just been established. Software sets include KUPLOT: data plotting and fitting software; ILL/TAS: Matlab programs for analyzing triple-axis data.

  16. XES Software Event Extension

    NARCIS (Netherlands)

    Leemans, M.; Liu, C.

    2017-01-01

    During the execution of software, execution data can be recorded. With the development of process mining techniques on the one hand, and the growing availability of software execution data on the other hand, a new form of software analytics comes into reach. That is, applying process mining

  17. ARC Software and Models

    Science.gov (United States)

    Most research conducted at the ARC produces software code and methodologies that are transferred to TARDEC and industry partners. These...

  18. XES Software Telemetry Extension

    NARCIS (Netherlands)

    Leemans, M.; Liu, C.

    2017-01-01

    During the execution of software, execution data can be recorded. With the development of process mining techniques on the one hand, and the growing availability of software execution data on the other hand, a new form of software analytics comes into reach. That is, applying process mining

  19. Specifications in software prototyping

    OpenAIRE

    Luqi; Chang, Carl K.; Zhu, Hong

    1998-01-01

    We explore the use of software specifications for software prototyping. This paper describes a process model for software prototyping, and shows how specifications can be used to support such a process via a cellular mobile phone switch example.

  20. Software Engineering for Portability.

    Science.gov (United States)

    Stanchev, Ivan

    1990-01-01

    Discussion of the portability of educational software focuses on the software design and development process. Topics discussed include levels of portability; the user-computer dialog; software engineering principles; design techniques for student performance records; techniques of courseware programing; and suggestions for further research and…

  1. Software Acquisition and Software Engineering Best Practices

    National Research Council Canada - National Science Library

    Eslinger, S

    1999-01-01

    The purpose of this white paper is to address the issues raised in the recently published Senate Armed Services Committee Report 106-50 concerning Software Management Improvements for the Department of Defense (DoD...

  2. Understanding software faults and their role in software reliability modeling

    Science.gov (United States)

    Munson, John C.

    1994-01-01

    regression equation. Since most of the existing metrics have common elements and are linear combinations of these common elements, it seems reasonable to investigate the structure of the underlying common factors or components that make up the raw metrics. The technique we have chosen to use to explore this structure is a procedure called principal components analysis. Principal components analysis is a decomposition technique that may be used to detect and analyze collinearity in software metrics. When confronted with a large number of metrics measuring a single construct, it may be desirable to represent the set by some smaller number of variables that convey all, or most, of the information in the original set. Principal components are linear transformations of a set of random variables that summarize the information contained in the variables. The transformations are chosen so that the first component accounts for the maximal amount of variation of the measures of any possible linear transform; the second component accounts for the maximal amount of residual variation; and so on. The principal components are constructed so that they represent transformed scores on dimensions that are orthogonal. Through the use of principal components analysis, it is possible to have a set of highly related software attributes mapped into a small number of uncorrelated attribute domains. This definitively solves the problem of multi-collinearity in subsequent regression analysis. There are many software metrics in the literature, but principal component analysis reveals that there are few distinct sources of variation, i.e. dimensions, in this set of metrics. It would appear perfectly reasonable to characterize the measurable attributes of a program with a simple function of a small number of orthogonal metrics each of which represents a distinct software attribute domain.
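    The decomposition described above can be sketched with a standardized metrics matrix and an eigendecomposition of its correlation structure; the four collinear "metrics" below are synthetic stand-ins driven by a single latent size factor:

    ```python
    # Sketch of principal components on correlated software metrics (made-up data):
    # rows are modules, columns are raw metrics (e.g. LOC, V(g), operand count).
    import numpy as np

    rng = np.random.default_rng(0)
    size = rng.normal(100, 20, 200)                      # one latent "size" factor
    metrics = np.column_stack([size + rng.normal(0, 5, 200) for _ in range(4)])

    standardized = (metrics - metrics.mean(0)) / metrics.std(0)
    cov = np.cov(standardized, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)               # ascending eigenvalues

    explained = eigvals[::-1] / eigvals.sum()
    print(explained.round(3))  # first component carries nearly all the variance
    scores = standardized @ eigvecs[:, ::-1]             # orthogonal, uncorrelated
    ```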

  3. Software Quality Assurance in Software Projects: A Study of Pakistan

    OpenAIRE

    Faisal Shafique Butt; Sundus Shaukat; M. Wasif Nisar; Ehsan Ullah Munir; Muhammad Waseem; Kashif Ayyub

    2013-01-01

    Software quality is a specific property which tells what kind of standard software should have. In a software project, quality is a key factor in the success or decline of a software-related organization. Much research has been done regarding software quality. Software-related organizations follow standards introduced by Capability Maturity Model Integration (CMMI) to achieve good quality software. Quality is divided into three main layers, which are Software Quality Assurance (SQA), Software Qu...

  4. Licensing safety critical software

    International Nuclear Information System (INIS)

    Archinoff, G.H.; Brown, R.A.

    1990-01-01

    Licensing difficulties with the shutdown system software at the Darlington Nuclear Generating Station contributed to delays in starting up the station. Even though the station has now been given approval by the Atomic Energy Control Board (AECB) to operate, the software issue has not disappeared - Ontario Hydro has been instructed by the AECB to redesign the software. This article attempts to explain why software based shutdown systems were chosen for Darlington, why there was so much difficulty licensing them, and what the implications are for other safety related software based applications

  5. Introducing a New Software for Geodetic Analysis

    Science.gov (United States)

    Hjelle, Geir Arne; Dähnn, Michael; Fausk, Ingrid; Kirkvik, Ann-Silje; Mysen, Eirik

    2017-04-01

    At the Norwegian Mapping Authority, we are currently developing Where, a new software for geodetic analysis. Where is built on our experiences with the Geosat software, and will be able to analyse and combine data from VLBI, SLR, GNSS and DORIS. The software is mainly written in Python, which has proved very fruitful. The code is quick to write and the architecture is easily extendable and maintainable, while at the same time taking advantage of well-tested code like the SOFA and IERS libraries. This presentation will show some of the current capabilities of Where, including benchmarks against other software packages, and outline our plans for further progress. In addition we will report on some investigations we have done experimenting with alternative weighting strategies for VLBI.

  6. HAZARD ANALYSIS SOFTWARE

    International Nuclear Information System (INIS)

    Sommer, S; Tinh Tran, T.

    2008-01-01

    Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process, which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements was developed and compiled into a requirements traceability matrix from which software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. The software is being used to support the development of 14 hazard analyses. User responses have been positive, with a number of suggestions for improvement which are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures. The software has significantly improved the efficiency and standardization of the hazard analysis process

  7. Software Validation in ATLAS

    International Nuclear Information System (INIS)

    Hodgkinson, Mark; Seuster, Rolf; Simmons, Brinick; Sherwood, Peter; Rousseau, David

    2012-01-01

    The ATLAS collaboration operates an extensive set of protocols to validate the quality of the offline software in a timely manner. This is essential in order to process the large amounts of data being collected by the ATLAS detector in 2011 without complications on the offline software side. We will discuss a number of different strategies used to validate the ATLAS offline software; running the ATLAS framework software, Athena, in a variety of configurations daily on each nightly build via the ATLAS Nightly System (ATN) and Run Time Tester (RTT) systems; the monitoring of these tests and checking the compilation of the software via distributed teams of rotating shifters; monitoring of and follow up on bug reports by the shifter teams and periodic software cleaning weeks to improve the quality of the offline software further.

  8. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1: ASC software quality engineering practices, Version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr. (,; .); Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines. This document also identifies ASC management and software project teams responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  9. Bootstrapped neural nets versus regression kriging in the digital mapping of pedological attributes: the automatic and time-consuming perspectives

    Science.gov (United States)

    Langella, Giuliano; Basile, Angelo; Bonfante, Antonello; Manna, Piero; Terribile, Fabio

    2013-04-01

    Digital soil mapping procedures are widely used to build two-dimensional continuous maps of several pedological attributes. Our work compared a regression kriging (RK) technique and a bootstrapped artificial neural network approach in order to evaluate (i) the accuracy of prediction, (ii) the suitability for inclusion in automatic engines (e.g. to constitute web processing services), and (iii) the time cost of calibrating models and of making predictions. Regression kriging is perhaps the most widely used geostatistical technique in the digital soil mapping literature; here we applied EBLUP regression kriging, as it is deemed by pedometricians to be the most statistically sound RK flavor. The machine learning alternative is an unusual multi-parametric and nonlinear approach called BAGAP (Bootstrap aggregating Artificial neural networks with Genetic Algorithms and Principal component regression), which combines a selected set of weighted neural nets having specified characteristics to yield an ensemble response. The purpose of applying these two particular models is to ascertain whether, and by how much, a more cumbersome machine learning method is more promising in making accurate and precise predictions. Aware of the difficulty of handling EBLUP-RK and BAGAP objects embedded in environmental applications, we also explore their suitability for being wrapped within Web Processing Services. Two further aspects are considered for an exhaustive evaluation and comparison: automaticity, and computation time with and without high-performance computing leverage.
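    BAGAP couples bootstrap-aggregated neural nets with genetic selection and principal component regression; as a far simpler stand-in, the sketch below shows plain bootstrap aggregation of small scikit-learn networks on synthetic covariates (it does not reproduce the authors' actual pipeline):

    ```python
    # Simplified stand-in for bootstrap-aggregated neural nets: plain bagging,
    # without BAGAP's genetic selection or principal component regression stage.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(1)
    X = rng.uniform(0, 10, (200, 2))                    # e.g. terrain covariates
    y = np.sin(X[:, 0]) + 0.1 * X[:, 1] + rng.normal(0, 0.1, 200)

    ensemble = []
    for seed in range(10):                              # one net per bootstrap sample
        idx = rng.integers(0, len(X), len(X))           # resample rows with replacement
        net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=seed)
        ensemble.append(net.fit(X[idx], y[idx]))

    predictions = np.mean([net.predict(X) for net in ensemble], axis=0)
    print(f"ensemble RMSE: {np.sqrt(np.mean((predictions - y) ** 2)):.3f}")
    ```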

  10. NASA software documentation standard software engineering program

    Science.gov (United States)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  11. Science and Software

    Science.gov (United States)

    Zelt, C. A.

    2017-12-01

    Earth science attempts to understand how the earth works. This research often depends on software for modeling, processing, inverting or imaging. Freely sharing open-source software is essential to prevent reinventing the wheel and allows software to be improved and applied in ways the original author may never have envisioned. For young scientists, releasing software can increase their name ID when applying for jobs and funding, and create opportunities for collaborations when scientists who collect data want the software's creator to be involved in their project. However, we frequently hear scientists say software is a tool, it's not science. Creating software that implements a new or better way of earth modeling or geophysical processing, inverting or imaging should be viewed as earth science. Creating software for things like data visualization, format conversion, storage, or transmission, or programming to enhance computational performance, may be viewed as computer science. The former, ideally with an application to real data, can be published in earth science journals, the latter possibly in computer science journals. Citations in either case should accurately reflect the impact of the software on the community. Funding agencies need to support more software development and open-source releasing, and the community should give more high-profile awards for developing impactful open-source software. Funding support and community recognition for software development can have far reaching benefits when the software is used in foreseen and unforeseen ways, potentially for years after the original investment in the software development. For funding, an open-source release that is well documented should be required, with example input and output files. Appropriate funding will provide the incentive and time to release user-friendly software, and minimize the need for others to duplicate the effort. All funded software should be available through a single web site

  12. Software Maintenance and Evolution: The Implication for Software ...

    African Journals Online (AJOL)

    Software Maintenance and Evolution: The Implication for Software Development. ... Software maintenance is the process of modifying existing operational software by correcting errors, ...

  13. Planetary Geologic Mapping Handbook - 2009

    Science.gov (United States)

    Tanaka, K. L.; Skinner, J. A.; Hare, T. M.

    2009-01-01

    Geologic maps present, in an historical context, fundamental syntheses of interpretations of the materials, landforms, structures, and processes that characterize planetary surfaces and shallow subsurfaces (e.g., Varnes, 1974). Such maps also provide a contextual framework for summarizing and evaluating thematic research for a given region or body. In planetary exploration, for example, geologic maps are used for specialized investigations such as targeting regions of interest for data collection and for characterizing sites for landed missions. Whereas most modern terrestrial geologic maps are constructed from regional views provided by remote sensing data and supplemented in detail by field-based observations and measurements, planetary maps have been largely based on analyses of orbital photography. For planetary bodies in particular, geologic maps commonly represent a snapshot of a surface, because they are based on available information at a time when new data are still being acquired. Thus the field of planetary geologic mapping has been evolving rapidly to embrace the use of new data and modern technology and to accommodate the growing needs of planetary exploration. Planetary geologic maps have been published by the U.S. Geological Survey (USGS) since 1962 (Hackman, 1962). Over this time, numerous maps of several planetary bodies have been prepared at a variety of scales and projections using the best available image and topographic bases. Early geologic map bases commonly consisted of hand-mosaicked photographs or airbrushed shaded-relief views and geologic linework was manually drafted using mylar bases and ink drafting pens. Map publishing required a tedious process of scribing, color peel-coat preparation, typesetting, and photo-laboratory work. Beginning in the 1990s, inexpensive computing, display capability and user-friendly illustration software allowed maps to be drawn using digital tools rather than pen and ink, and mylar bases became obsolete

  14. Software Defined Networking Demands on Software Technologies

    DEFF Research Database (Denmark)

    Galinac Grbac, T.; Caba, Cosmin Marius; Soler, José

    2015-01-01

    Software Defined Networking (SDN) is a networking approach based on a centralized control plane architecture with standardised interfaces between control and data planes. SDN enables fast configuration and reconfiguration of the network to enhance resource utilization and service performance. This new approach enables a more dynamic and flexible network, which may adapt to user needs and application requirements. To this end, systemized solutions must be implemented in network software, aiming to provide secure network services that meet the required service performance levels. In this paper, we review this new approach to networking from an architectural point of view, and identify and discuss some critical quality issues that require new developments in software technologies. We discuss these issues along with use case scenarios. Here in this paper we aim to identify challenges...

  15. Topographic mapping

    Science.gov (United States)

    ,

    2008-01-01

    The U.S. Geological Survey (USGS) produced its first topographic map in 1879, the same year it was established. Today, more than 100 years and millions of map copies later, topographic mapping is still a central activity for the USGS. The topographic map remains an indispensable tool for government, science, industry, and leisure. Much has changed since early topographers traveled the unsettled West and carefully plotted the first USGS maps by hand. Advances in survey techniques, instrumentation, and design and printing technologies, as well as the use of aerial photography and satellite data, have dramatically improved mapping coverage, accuracy, and efficiency. Yet cartography, the art and science of mapping, may never before have undergone change more profound than today.

  16. Software engineering in industry

    Science.gov (United States)

    Story, C. M.

    1989-12-01

    Can software be "engineered"? Can a few people with limited resources and a negligible budget produce high-quality software solutions to complex software problems? Is it possible to resolve the conflict between research activities and the necessity to view software development as a means to an end rather than as an end in itself? The aim of this paper is to encourage further thought and discussion on various topics which, in the author's experience, are becoming increasingly critical in current large software production and development projects, inside and outside high energy physics (HEP). This is done by briefly exploring some of the software engineering ideas and technologies now used in the information industry, using, as a case study, a project with many similarities to those currently under way in HEP.

  17. Functional mapping in biology and medicine

    International Nuclear Information System (INIS)

    McEachron, D.L.

    1986-01-01

    This book contains 10 selections. Some of the titles are: Two Views of Functional Mapping and Autoradiography; Quantitative Analysis of Autoradiographs; Hardware and Software Design Considerations in Engineering an Image Processing Workstation: Autoradiographic Analysis with DUMAS and the BRAIN Autoradiograph Analysis Software Package (with 1 color plate); and Quantitative Autoradiography and in vitro Radioligand Binding

  18. A software product certification model

    NARCIS (Netherlands)

    Heck, P.M.; Klabbers, M.D.; van Eekelen, Marko

    2010-01-01

    Certification of software artifacts offers organizations more certainty and confidence about software. Certification of software helps software sales, acquisition, and can be used to certify legislative compliance or to achieve acceptable deliverables in outsourcing. In this article, we present a

  19. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    The paper first considers why verification of software products throughout the software life cycle is necessary. Concepts of verification, software verification planning, and some verification methodologies for products generated throughout the software life cycle are then discussed

  20. Software evolution and maintenance

    CERN Document Server

    Tripathy, Priyadarshi

    2014-01-01

    Software Evolution and Maintenance: A Practitioner's Approach is an accessible textbook for students and professionals, which collates the advances in software development and provides the most current models and techniques in maintenance. It explains two maintenance standards (IEEE/EIA 1219 and ISO/IEC 14764), discusses several commercial reverse and domain engineering toolkits, and bases its information on the IEEE SWEBOK (Software Engineering Body of Knowledge). Slides for instructors are available online.

  1. Software for microcircuit systems

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1978-10-01

    Modern Large Scale Integration (LSI) microcircuits are meant to be programmed in order to control the function that they perform. The basics of microprogramming and new microcircuits have already been discussed. In this course, the methods of developing software for these microcircuits are explored. This generally requires a package of support software in order to assemble the microprogram, and also some amount of support software to test the microprograms and to test the microprogrammed circuit itself. 15 figures, 2 tables

  2. Hospital Management Software Development

    OpenAIRE

    sobogunGod, olawale

    2012-01-01

    The purpose of this thesis was to implement hospital management software suitable for small private hospitals in Nigeria, especially those that use a file-based system for storing information rather than having it stored in a more efficient and safer environment like databases or Excel. The software developed within this thesis project was specifically designed for the Rainbow specialist hospital, which is based in Lagos, the commercial neurological cente...

  3. Computer software configuration management

    International Nuclear Information System (INIS)

    Pelletier, G.

    1987-08-01

    This report reviews the basic elements of software configuration management (SCM) as defined by military and industry standards. Several software configuration management standards are evaluated given the requirements of the nuclear industry. A survey is included of available automated tools for supporting SCM activities. Some information is given on the experience of establishing and using SCM plans of other organizations that manage critical software. The report concludes with recommendations of practices that would be most appropriate for the nuclear power industry in Canada

  4. Gammasphere software development

    International Nuclear Information System (INIS)

    Piercey, R.B.

    1993-01-01

    Activities of the nuclear physics group are described. Progress was made in organizing the Gammasphere Software Working Group, establishing a nuclear computing facility, participating in software development at Lawrence Berkeley, developing a common data file format, and adapting the ORNL UPAK software to run at Gammasphere. A universal histogram object was developed that defines a file format and provides for an object-oriented programming model. An automated liquid nitrogen fill system was developed for Gammasphere (110 Ge detectors comprise the sphere)

  5. Software quality management

    International Nuclear Information System (INIS)

    Bishop, D.C.; Pymm, P.

    1991-01-01

    As programmable electronic (software-based) systems are increasingly being proposed as design solutions for high integrity applications in nuclear power stations, the need to adopt suitable quality management arrangements is paramount. The authors describe Scottish Nuclear's strategy for software quality management and, using the main on-line monitoring system at Torness Power Station as an example, explain how this strategy is put into practice. Particular attention is given to the topics of software quality planning and change control. (author)

  6. Software Process Improvement Defined

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2002-01-01

    This paper argues in favor of the development of explanatory theory on software process improvement. The last one or two decades' commitment to prescriptive approaches in software process improvement theory may contribute to the emergence of a gulf dividing theorists and practitioners....... It is proposed that this divide be met by the development of theory evaluating prescriptive approaches and informing practice, with a focus on the software process policymaking and process control aspects of improvement efforts...

  7. Assuring Software Reliability

    Science.gov (United States)

    2014-08-01

    technologies and processes to achieve a required level of confidence that software systems and services function in the intended manner. 1.3 Security Example...that took three high-voltage lines out of service and a software failure (a race condition) that disabled the computing service that notified the... service had failed. Instead of analyzing the details of the alarm server failure, the reviewers asked why the following software assurance claim had

  8. Software evolution with XVCL

    DEFF Research Database (Denmark)

    Zhang, Weishan; Jarzabek, Stan; Zhang, Hongyu

    2004-01-01

    This chapter introduces software evolution with XVCL (XML-based Variant Configuration Language), which is an XML-based metaprogramming technique. As the software evolves, a large number of variants may arise, especially when such evolution spans multiple platforms, as shown in our...... case study. Handling variants and tracing the impact of variants across the development lifecycle is a challenge. This chapter shows how we can maintain different versions of software in a reuse-based way....
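
    XVCL itself is an XML-based notation, but the underlying idea, one shared template (an "x-frame") adapted by per-variant settings, can be sketched in Python as a stand-in. Everything below (the template, the variant table, the adapt function) is invented for illustration and is not XVCL syntax.

      # Python stand-in for the XVCL idea: a single reusable template is
      # adapted with variant-specific values to generate per-platform code.
      TEMPLATE = """\
      class {cls}:
          def save(self, data):
              backend = "{backend}"      # platform-specific variant point
              print("saving via " + backend)
      """

      VARIANTS = {
          "desktop": {"cls": "DesktopStore", "backend": "sqlite"},
          "mobile":  {"cls": "MobileStore",  "backend": "flatfile"},
      }

      def adapt(platform):
          """Instantiate the shared template for one platform variant."""
          settings = VARIANTS[platform]
          namespace = {}
          exec(TEMPLATE.format(**settings), namespace)
          return namespace[settings["cls"]]

      adapt("mobile")().save(b"payload")   # prints: saving via flatfile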

  9. FASTBUS software status

    International Nuclear Information System (INIS)

    Gustavson, D.B.

    1980-10-01

    Computer software will be needed in addition to the mechanical, electrical, protocol and timing specifications of the FASTBUS, in order to facilitate the use of this flexible new multiprocessor and multisegment data acquisition and processing system. Software considerations have been important in the FASTBUS design, but standard subroutines and recommended algorithms will be needed as the FASTBUS comes into use. This paper summarizes current FASTBUS software projects, goals and status

  10. Software configuration management

    CERN Document Server

    Keyes, Jessica

    2004-01-01

    Software Configuration Management discusses the framework from a standards viewpoint, using the original DoD MIL-STD-973 and EIA-649 standards to describe the elements of configuration management within a software engineering perspective. Divided into two parts, the first section is composed of 14 chapters that explain every facet of configuration management related to software engineering. The second section consists of 25 appendices that contain many valuable real world CM templates.

  11. Solar Asset Management Software

    Energy Technology Data Exchange (ETDEWEB)

    Iverson, Aaron [Ra Power Management, Inc., Oakland, CA (United States); Zviagin, George [Ra Power Management, Inc., Oakland, CA (United States)

    2016-09-30

    Ra Power Management (RPM) has developed a cloud based software platform that manages the financial and operational functions of third party financed solar projects throughout their lifecycle. RPM’s software streamlines and automates the sales, financing, and management of a portfolio of solar assets. The software helps solar developers automate the most difficult aspects of asset management, leading to increased transparency, efficiency, and reduction in human error. More importantly, our platform will help developers save money by improving their operating margins.

  12. Essential software architecture

    CERN Document Server

    Gorton, Ian

    2011-01-01

    Job titles like "Technical Architect" and "Chief Architect" nowadays abound in the software industry, yet many people suspect that "architecture" is one of the most overused and least understood terms in professional software development. Gorton's book tries to resolve this dilemma. It concisely describes the essential elements of knowledge and key skills required to be a software architect. The explanations encompass the essentials of architecture thinking, practices, and supporting technologies. They range from a general understanding of structure and quality attributes through technical i

  13. Software engineering the current practice

    CERN Document Server

    Rajlich, Vaclav

    2011-01-01

    INTRODUCTION: History of Software Engineering; Software Properties; Origins of Software; Birth of Software Engineering; Third Paradigm: Iterative Approach; Software Life Span Models; Staged Model; Variants of Staged Model. SOFTWARE TECHNOLOGIES: Programming Languages and Compilers; Object-Oriented Technology; Version Control System; Software Models; Class Diagrams; UML Activity Diagrams; Class Dependency Graphs and Contracts. SOFTWARE CHANGE: Introduction to Software Change; Characteristics of Software Change; Phases of Software Change; Requirements and Their Elicitation; Requirements Analysis and Change Initiation; Concepts and Concept

  14. Agile software development

    CERN Document Server

    Dingsoyr, Torgeir; Moe, Nils Brede

    2010-01-01

    Agile software development has become an umbrella term for a number of changes in how software developers plan and coordinate their work, how they communicate with customers and external stakeholders, and how software development is organized in small, medium, and large companies, from the telecom and healthcare sectors to games and interactive media. Still, after a decade of research, agile software development is the source of continued debate due to its multifaceted nature and insufficient synthesis of research results. Dingsoyr, Dyba, and Moe now present a comprehensive snapshot of the kno

  15. Optimization of Antivirus Software

    Directory of Open Access Journals (Sweden)

    2007-01-01

    The paper describes the main techniques used in the development of computer antivirus software applications. For this particular category of software, optimum criteria are identified and defined that help determine which solution is better and what the objectives of the optimization process are. From the general viewpoint of software optimization, methods and techniques that are applied at the code development level are presented. Regarding the particularities of antivirus software, the paper analyzes some of the optimization concepts applied to this category of applications.

  16. Dtest Testing Software

    Science.gov (United States)

    Jain, Abhinandan; Cameron, Jonathan M.; Myint, Steven

    2013-01-01

    This software runs a suite of arbitrary software tests spanning various software languages and types of tests (unit level, system level, or file comparison tests). The dtest utility can be set to automate periodic testing of large suites of software, as well as running individual tests. It supports distributing multiple tests over multiple CPU cores, if available. The dtest tool is a utility program (written in Python) that scans through a directory (and its subdirectories) and finds all directories that match a certain pattern and then executes any tests in that directory as described in simple configuration files.
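
    As a rough sketch of the behaviour described above, the snippet below walks a directory tree, reads test commands from simple configuration files, and distributes them over CPU cores. It is an illustration only: the configuration file name ("DTESTDEFS") and its one-command-per-line format are assumptions, not the actual dtest conventions.

      # Illustrative dtest-like runner: find config files under a root
      # directory and execute the test commands they list, in parallel.
      import pathlib
      import subprocess
      from multiprocessing import Pool

      def find_tests(root, pattern="DTESTDEFS"):
          """Collect (directory, command) pairs from config files under root."""
          jobs = []
          for cfg in pathlib.Path(root).rglob(pattern):
              for line in cfg.read_text().splitlines():
                  line = line.strip()
                  if line and not line.startswith("#"):
                      jobs.append((str(cfg.parent), line))
          return jobs

      def run_test(job):
          directory, command = job
          result = subprocess.run(command, shell=True, cwd=directory)
          return command, result.returncode == 0

      if __name__ == "__main__":
          with Pool() as pool:                  # spread tests over CPU cores
              for command, ok in pool.map(run_test, find_tests(".")):
                  print("PASS" if ok else "FAIL", command)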

  17. Software quality assurance

    CERN Document Server

    Laporte, Claude Y

    2018-01-01

    This book introduces Software Quality Assurance (SQA) and provides an overview of standards used to implement SQA. It defines ways to assess the effectiveness of how one approaches software quality across key industry sectors such as telecommunications, transport, defense, and aerospace. It includes a supplementary website with an instructor's guide and solutions, applies IEEE software standards as well as the Capability Maturity Model Integration for Development (CMMI), and illustrates the application of software quality assurance practices through the use of practical examples, quotes from experts, and tips from the authors.

  18. Software architecture 2

    CERN Document Server

    Oussalah, Mourad Chabanne

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural templa

  19. Software as quality product

    International Nuclear Information System (INIS)

    Enders, A.

    1975-01-01

    In many discussions on the reliability of computer systems, software is presented as the weak link in the chain. The contribution attempts to identify the reasons for this situation as seen from software development. The concepts of correctness and reliability of programmes are explained as they are understood in the specialist discussion of today. Measures and methods are discussed which are particularly relevant as far as obtaining fault-free and reliable programmes is concerned. Conclusions are drawn for the user of software so that he is in a position to judge for himself what can justly be expected from the product software compared to other products. (orig./LH) [de]

  20. Essence: Facilitating Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2008-01-01

    This paper suggests ways to facilitate creativity and innovation in software development. The paper applies four perspectives – Product, Project, Process, and People – to identify an outlook for software innovation. The paper then describes a new facility – the Software Innovation Research Lab (SIRL......) – and a new method concept for software innovation – Essence – based on views, modes, and team roles. Finally, the paper reports from an early experiment using SIRL and Essence and identifies further research....

  1. Global Software Engineering

    DEFF Research Database (Denmark)

    Ebert, Christof; Kuhrmann, Marco; Prikladnicki, Rafael

    2016-01-01

    Professional software products and IT systems and services today are developed mostly by globally distributed teams, projects, and companies. Successfully orchestrating Global Software Engineering (GSE) has become the major success factor both for organizations and practitioners. Yet, more than...... and experience reported at the IEEE International Conference on Global Software Engineering (ICGSE) series. The outcomes of our analysis show GSE as a field highly attached to industry and, thus, a considerable share of ICGSE papers address the transfer of Software Engineering concepts and solutions to the global stage...

  2. Software Intensive Systems

    National Research Council Canada - National Science Library

    Horvitz, E; Katz, D. J; Rumpf, R. L; Shrobe, H; Smith, T. B; Webber, G. E; Williamson, W. E; Winston, P. H; Wolbarsht, James L

    2006-01-01

    .... Additionally, recommend that DoN invest in software engineering, particularly as it complements commercial industry developments and promotes the application of systems engineering methodology...

  3. Contractor Software Charges

    National Research Council Canada - National Science Library

    Granetto, Paul

    1994-01-01

    .... Examples of computer software costs that contractors charge through indirect rates are material management systems, security systems, labor accounting systems, and computer-aided design and manufacturing...

  4. Decentralized Software Architecture

    National Research Council Canada - National Science Library

    Khare, Rohit

    2002-01-01

    .... While the term "decentralization" is familiar from political and economic contexts, it has been applied extensively, if indiscriminately, to describe recent trends in software architecture towards...

  5. Software architecture 1

    CERN Document Server

    Oussalah , Mourad Chabane

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural template

  6. Software Radar Technology

    Directory of Open Access Journals (Sweden)

    Tang Jun

    2015-08-01

    In this paper, the definition and the key features of Software Radar, which is a new concept, are proposed and discussed. We consider the development of modern radar system technology to be divided into three stages: Digital Radar, Software Radar, and Intelligent Radar; the second stage is just commencing now. A Software Radar system should be a combination of various modern digital modular components conforming to certain software and hardware standards. Moreover, a Software Radar system with an open system architecture that decouples application software from low-level hardware can easily adopt a "user requirements-oriented" development methodology instead of the traditional "specific function-oriented" development methodology. Compared with traditional Digital Radar, a Software Radar system can be easily reconfigured and scaled up or down to adapt to changes in requirements and technologies. A demonstration Software Radar signal processing system, RadarLab 2.0, which has been developed by Tsinghua University, is introduced in this paper, and suggestions for the future development of Software Radar in China are also given in the conclusion.
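
    The decoupling the authors emphasize can be pictured as a processing chain whose stages are interchangeable software components. The sketch below is our own illustration in Python, not code from RadarLab 2.0; the stage names, parameters, and signal model are invented.

      # Sketch of a reconfigurable software-radar chain: each stage is a
      # plain function, so the chain can be rebuilt when requirements change.
      import numpy as np

      def remove_dc(samples):
          return samples - samples.mean()

      def matched_filter(reference):
          def stage(samples):
              return np.convolve(samples, np.conj(reference[::-1]), mode="same")
          return stage

      def detect(threshold):
          def stage(samples):
              return np.abs(samples) > threshold
          return stage

      def run_chain(samples, stages):
          for stage in stages:          # reconfigure by editing this list
              samples = stage(samples)
          return samples

      rng = np.random.default_rng(0)
      echo = rng.normal(size=1024) + 0j               # fake received samples
      chain = [remove_dc, matched_filter(np.ones(16, dtype=complex)), detect(8.0)]
      print(int(run_chain(echo, chain).sum()), "range cells above threshold")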

  7. Social software in global software development

    DEFF Research Database (Denmark)

    Giuffrida, Rosalba; Dittrich, Yvonne

    2010-01-01

    variety of tools such as: instant messaging, internet forums, mailing lists, blogs, wikis, social network sites, social bookmarking, social libraries, virtual worlds. Though normally rather belonging to the private realm, the use of social software in corporate context has been reported, e.g. as a way...

  8. NASA's Software Safety Standard

    Science.gov (United States)

    Ramsay, Christopher M.

    2007-01-01

    NASA relies more and more on software to control, monitor, and verify its safety critical systems, facilities and operations. Since the 1960s there has hardly been a spacecraft launched that does not have a computer on board that will provide command and control services. There have been recent incidents where software has played a role in high-profile mission failures and hazardous incidents. For example, the Mars Orbiter, Mars Polar Lander, the DART (Demonstration of Autonomous Rendezvous Technology), and MER (Mars Exploration Rover) Spirit anomalies were all caused or contributed to by software. The Mission Control Centers for the Shuttle, ISS, and unmanned programs are highly dependent on software for data displays, analysis, and mission planning. Despite this growing dependence on software control and monitoring, there has been little to no consistent application of software safety practices and methodology to NASA's projects with safety critical software. Meanwhile, academia and private industry have been stepping forward with procedures and standards for safety critical systems and software, for example Dr. Nancy Leveson's book Safeware: System Safety and Computers. The NASA Software Safety Standard, originally published in 1997, was widely ignored due to its complexity and poor organization. It also focused on concepts rather than definite procedural requirements organized around a software project lifecycle. Led by the NASA Headquarters Office of Safety and Mission Assurance, the NASA Software Safety Standard has recently undergone a significant update. This new standard provides the procedures and guidelines for evaluating a project for safety criticality and then lays out the minimum project lifecycle requirements to assure the software is created, operated, and maintained in the safest possible manner. This update of the standard clearly delineates the minimum set of software safety requirements for a project without detailing the implementation for those

  9. Customer Interaction in Software Development: A Comparison of Software Methodologies Deployed in Namibian Software Firms

    CSIR Research Space (South Africa)

    Iyawa, GE

    2016-01-01

    within the Namibian context. An implication for software project managers and software developers is that customer interaction should be properly managed to ensure that the software methodologies for improving software development processes...

  10. Marketing Mix del Software.

    Directory of Open Access Journals (Sweden)

    Yudith del Carmen Rodríguez Pérez

    2006-03-01

    Accordingly, this paper defines the concept of the software product, characterizes it, and sets out its quality attributes. It also addresses the marketing mix that software needs, different from that of other products, in order to succeed in the market.

  11. Sustainability in Software Engineering

    NARCIS (Netherlands)

    Wolfram, N.J.E.; Lago, P.; Osborne, Francesco

    2017-01-01

    The intersection between software engineering research and issues related to sustainability and green IT has been the subject of increasing attention. In spite of that, we observe that sustainability is still not clearly defined, or understood, in the field of software engineering. This lack of

  12. Software cost estimation

    NARCIS (Netherlands)

    Heemstra, F.J.

    1992-01-01

    The paper gives an overview of the state of the art of software cost estimation (SCE). The main questions to be answered in the paper are: (1) What are the reasons for overruns of budgets and planned durations? (2) What are the prerequisites for estimating? (3) How can software development effort be

  13. Software cost estimation

    NARCIS (Netherlands)

    Heemstra, F.J.

    1993-01-01

    The paper gives an overview of the state of the art of software cost estimation (SCE). The main questions to be answered in the paper are: (1) What are the reasons for overruns of budgets and planned durations? (2) What are the prerequisites for estimating? (3) How can software development effort be

  14. Software engineering ethics

    Science.gov (United States)

    Bown, Rodney L.

    1991-01-01

    Software engineering ethics is reviewed. The following subject areas are covered: lack of a system viewpoint; arrogance of PC DOS software vendors; violation of upward compatibility; internet worm; internet worm revisited; student cheating and company hiring interviews; computing practitioners and the commodity market; new projects and old programming languages; schedule and budget; and recent public domain comments.

  15. Computer Software Reviews.

    Science.gov (United States)

    Hawaii State Dept. of Education, Honolulu. Office of Instructional Services.

    Intended to provide guidance in the selection of the best computer software available to support instruction and to make optimal use of schools' financial resources, this publication provides a listing of computer software programs that have been evaluated according to their currency, relevance, and value to Hawaii's educational programs. The…

  16. Software product family evaluation

    NARCIS (Netherlands)

    van der Linden, F; Bosch, J; Kamsties, E; Kansala, K; Krzanik, L; Obbink, H; VanDerLinden, F

    2004-01-01

    This paper proposes a 4-dimensional software product family engineering evaluation model. The 4 dimensions relate to the software engineering concerns of business, architecture, organisation and process. The evaluation model is meant to be used within organisations to determine the status of their

  17. Selecting the Right Software.

    Science.gov (United States)

    Shearn, Joseph

    1987-01-01

    Selection of administrative software requires analyzing present needs and, to meet future needs, choosing software that will function with a more powerful computer system. Other important factors to include are a professional system demonstration, maintenance and training, and financial considerations that allow leasing or renting alternatives.…

  18. Method and system for a network mapping service

    Science.gov (United States)

    Bynum, Leo

    2017-10-17

    A method and system of publishing a map includes providing access to a plurality of map data files or mapping services between at least one publisher and at least one subscriber; defining a map in a map context comprising parameters and descriptors to substantially duplicate a map by reference to mutually accessible data or mapping services; publishing a map to a channel in a table file on a server; accessing the channel by at least one subscriber; transmitting the mapping context from the server to the at least one subscriber; executing the map context by the at least one subscriber; and generating the map on display software associated with the at least one subscriber by reconstituting the map from the references and other data in the mapping context.
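
    The claim can be paraphrased in code: the publisher shares only a compact "map context" (references plus display parameters), and the subscriber reconstitutes the map from data both sides can reach. The sketch below is a loose Python paraphrase of that idea; the channel table, field names, and URL are invented, not taken from the patent.

      # Loose sketch of the publish/subscribe map-context idea.
      import json

      CHANNELS = {}          # stands in for the channel table file on a server

      def publish(channel, layers, center, zoom):
          context = {"layers": layers,        # references to shared services
                     "center": center, "zoom": zoom}
          CHANNELS[channel] = json.dumps(context)

      def subscribe(channel):
          context = json.loads(CHANNELS[channel])
          # "Execute" the context: fetch each referenced layer and draw it.
          for layer in context["layers"]:
              print("rendering", layer, "at", context["center"],
                    "zoom", context["zoom"])

      publish("ops-maps", ["https://example.org/wms/roads"], (46.6, -112.0), 9)
      subscribe("ops-maps")

    Because only references and parameters travel, the two sides stay synchronized as long as the underlying data and services remain mutually accessible, which is the premise of the method.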

  19. Participatory Maps

    DEFF Research Database (Denmark)

    Salovaara-Moring, Inka

    2016-01-01

    practice. In particular, mapping environmental damage, endangered species, and human-made disasters has become one focal point for environmental knowledge production. This type of digital map has been highlighted as a processual turn in critical cartography, whereas in related computational journalism...... of a geo-visualization within information mapping that enhances embodiment in the experience of the information. InfoAmazonia is defined as a digitally created map-space within which journalistic practice can be seen as dynamic, performative interactions between journalists, ecosystems, space, and species...

  20. Data structure and software engineering challenges and improvements

    CERN Document Server

    Antonakos, James L

    2011-01-01

    Data structure and software engineering is an integral part of computer science. This volume presents new approaches and methods to knowledge sharing, brain mapping, data integration, and data storage. The author describes how to manage an organization's business process and domain data and presents new software and hardware testing methods. The book introduces a game development framework used as a learning aid in a software engineering at the university level. It also features a review of social software engineering metrics and methods for processing business information. It explains how to

  1. Trends in software testing

    CERN Document Server

    Mohanty, J; Balakrishnan, Arunkumar

    2017-01-01

    This book is focused on the advancements in the field of software testing and the innovative practices that the industry is adopting. Considering the widely varied nature of software testing, the book addresses contemporary aspects that are important for both academia and industry. There are dedicated chapters on seamless high-efficiency frameworks, automation on regression testing, software by search, and system evolution management. There are a host of mathematical models that are promising for software quality improvement by model-based testing. There are three chapters addressing this concern. Students and researchers in particular will find these chapters useful for their mathematical strength and rigor. Other topics covered include uncertainty in testing, software security testing, testing as a service, test technical debt (or test debt), disruption caused by digital advancement (social media, cloud computing, mobile application and data analytics), and challenges and benefits of outsourcing. The book w...

  2. Revisiting software ecosystems research

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    2016-01-01

    'Software ecosystems' is argued to have first appeared as a concept more than 10 years ago, and software ecosystem research started to take off in 2010. We conduct a systematic literature study, based on the most extensive literature review in the field to date, with two primary aims: (a) to provide an updated overview of the field and (b) to document evolution in the field. In total, we analyze 231 papers from 2007 until 2014 and provide an overview of the research in software ecosystems. Our analysis reveals a field that is rapidly growing both in volume and empirical focus while becoming more mature...... from evolving. We propose means for future research and the community to address them. Finally, our analysis shapes the view of the field having evolved outside the existing definitions of software ecosystems, and we thus propose an update of the definition of software ecosystems....

  3. Software safety hazard analysis

    International Nuclear Information System (INIS)

    Lawrence, J.D.

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper

  4. Software architecture evolution

    DEFF Research Database (Denmark)

    Barais, Olivier; Le Meur, Anne-Francoise; Duchien, Laurence

    2008-01-01

    Software architectures must frequently evolve to cope with changing requirements, and this evolution often implies integrating new concerns. Unfortunately, when the new concerns are crosscutting, existing architecture description languages provide little or no support for this kind of evolution....... The software architect must modify multiple elements of the architecture manually, which risks introducing inconsistencies. This chapter provides an overview, comparison and detailed treatment of the various state-of-the-art approaches to describing and evolving software architectures. Furthermore, we discuss...... one particular framework named TranSAT, which addresses the above problems of software architecture evolution. TranSAT provides a new element in the software architecture description language, called an architectural aspect, for describing new concerns and their integration into an existing...

  5. Software quality in 1997

    Energy Technology Data Exchange (ETDEWEB)

    Jones, C. [Software Productivity Research, Inc., Burlington, MA (United States)

    1997-11-01

    For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD), the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to parity with manufacturing quality levels. Since software is on the critical path for many engineered products, and for internal business systems as well, the new approaches are starting to affect global competition and attract widespread international interest. It can be hypothesized that success in mastering software quality will be a key strategy for dominating global software markets in the 21st century.

  6. Developing Software Simulations

    Directory of Open Access Journals (Sweden)

    Tom Hall

    2007-06-01

    Programs in education and business often require learners to develop and demonstrate competence in specified areas and then be able to effectively apply this knowledge. One method to aid in developing a skill set in these areas is through the use of software simulations. These simulations can be used for learner demonstrations of competencies in a specified course as well as for a review of the basic skills at the beginning of subsequent courses. The first section of this paper discusses ToolBook, the software used to develop our software simulations. The second section discusses the process of developing software simulations. The third part discusses how we have used software simulations to assess student knowledge of research design by providing simulations that allow the student to practice using SPSS and Excel.

  7. Software licenses: Stay honest!

    CERN Multimedia

    Computer Security Team

    2012-01-01

    Do you recall our article about copyright violation in the last issue of the CERN Bulletin, “Music, videos and the risk for CERN”? Now let’s be more precise. “Violating copyright” not only means the illegal download of music and videos, it also applies to software packages and applications.   Users must respect proprietary rights in compliance with the CERN Computing Rules (OC5). Not having legitimately obtained a program or the required licenses to run that software is not a minor offense. It violates CERN rules and puts the Organization at risk! Vendors deserve credit and compensation. Therefore, make sure that you have the right to use their software. In other words, you have bought the software via legitimate channels and use a valid and honestly obtained license. This also applies to “Shareware” and software under open licenses, which might also come with a cost. Usually, only “Freeware” is complete...

  8. On the map: Nature and Science editorials

    OpenAIRE

    Waaijer, C.J.F.; van Bochove, C.A.; van Eck, N.J.P.

    2011-01-01

    Bibliometric mapping of scientific articles based on keywords and technical terms in abstracts is now frequently used to chart scientific fields. In contrast, no significant mapping has been applied to the full texts of non-specialist documents. Editorials in Nature and Science are such non-specialist documents, reflecting the views of the two most read scientific journals on science, technology and policy issues. We use the VOSviewer mapping software to chart the topics of these editorials. ...

  9. Automatic Mapping of NES Games with Mappy

    OpenAIRE

    Osborn, Joseph C.; Summerville, Adam; Mateas, Michael

    2017-01-01

    Game maps are useful for human players, general-game-playing agents, and data-driven procedural content generation. These maps are generally made by hand-assembling manually-created screenshots of game levels. Besides being tedious and error-prone, this approach requires additional effort for each new game and level to be mapped. The results can still be hard for humans or computational systems to make use of, privileging visual appearance over semantic information. We describe a software sys...

  10. CT perfusion mapping of hemodynamic disturbances associated to acute spontaneous intracerebral hemorrhage

    International Nuclear Information System (INIS)

    Fainardi, Enrico; Borrelli, Massimo; Saletti, Andrea; Ceruti, Stefano; Tamarozzi, Riccardo; Schivalocchi, Roberta; Cavallo, Michele; Azzini, Cristiano; Chieregato, Arturo

    2008-01-01

    We sought to quantify perfusion changes associated with acute spontaneous intracerebral hemorrhage (SICH) by means of computed tomography perfusion (CTP) imaging. We studied 89 patients with supratentorial SICH at admission CT by using CTP scanning obtained within 24 h after symptom onset. Regional cerebral blood flow (rCBF), cerebral blood volume (rCBV) and mean transit time (rMTT) levels were measured in four different regions of interest manually outlined on CT scan: (1) hemorrhagic core; (2) perihematomal low-density area; (3) 1 cm rim of normal-appearing brain tissue surrounding the perilesional area; and (4) a mirrored area, including the clot and the perihematomal region, located in the non-lesioned contralateral hemisphere. rCBF, rCBV, and rMTT mean levels showed a centrifugal distribution with a gradual increase from the core to the periphery (p ...) ... large (> 20 ml) hematomas (p < 0.01 and p < 0.02, respectively). Multi-parametric CTP mapping of acute SICH indicates that perfusion values show a progressive improvement from the core to the periphery. In the first 24 h, the perihemorrhagic region was hypoperfused, with CTP values that were not suggestive of ischemic penumbra destined to survive but more likely indicative of edema formation. These findings also argue for a potential influence of early amounts of bleeding on perihematomal hemodynamic abnormalities. (orig.)
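
    The measurement design above, mean perfusion values inside manually outlined regions of interest, reduces to masked averaging over the parameter maps. The snippet below is a schematic illustration with synthetic data, not the study's software; the ROI shapes and values are invented.

      # Schematic ROI analysis of a perfusion map using boolean masks.
      import numpy as np

      rng = np.random.default_rng(1)
      rcbf_map = rng.normal(50.0, 5.0, size=(256, 256))   # synthetic rCBF map

      # In practice these masks come from manually drawn outlines.
      core = np.zeros(rcbf_map.shape, dtype=bool)
      core[100:120, 100:120] = True
      perihematomal = np.zeros_like(core)
      perihematomal[90:130, 90:130] = True
      perihematomal &= ~core                # ring surrounding the core

      for name, mask in [("core", core), ("perihematomal rim", perihematomal)]:
          print(name, "mean rCBF:", round(float(rcbf_map[mask].mean()), 1))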

  11. Concept Mapping

    Science.gov (United States)

    Technology & Learning, 2005

    2005-01-01

    Concept maps are graphical ways of working with ideas and presenting information. They reveal patterns and relationships and help students to clarify their thinking, and to process, organize and prioritize. Displaying information visually--in concept maps, word webs, or diagrams--stimulates creativity. Being able to think logically teaches…

  12. Towards an Ontology of Software

    OpenAIRE

    Wang, Xiaowei

    2016-01-01

    Software is permeating every aspect of our personal and social life. And yet, the cluster of concepts around the notion of software, such as the notions of a software product, software requirements, software specifications, are still poorly understood with no consensus on the horizon. For many, software is just code, something intangible best defined in contrast with hardware, but it is not particularly illuminating. This erroneous notion, software is just code, presents both in the ontology ...

  13. Development methodology for the software life cycle process of the safety software

    International Nuclear Information System (INIS)

    Kim, D. H.; Lee, S. S.; Cha, K. H.; Lee, C. S.; Kwon, K. C.; Han, H. B.

    2002-01-01

    A methodology for developing software life cycle processes (SLCP) is proposed to successfully develop the digital safety-critical Engineered Safety Features - Component Control System (ESF-CCS). A software life cycle model is selected as a hybrid model mixing the waterfall, prototyping, and spiral models, and is composed of two stages: development of a prototype of the ESF-CCS, followed by development of the ESF-CCS itself. To produce the software life cycle (SLC) for the development of the digital reactor safety system, the Activities referenced in IEEE Std. 1074-1997 are mapped onto the hybrid model. The SLCP is established after the available OPAs (Organizational Process Assets) are applied to the SLC Activities, and the known constraints are reconciled. The established SLCP describes well the software life cycle activities that are provided to the Regulatory Authority.

  14. Development methodology for the software life cycle process of the safety software

    Energy Technology Data Exchange (ETDEWEB)

    Kim, D. H.; Lee, S. S. [BNF Technology, Taejon (Korea, Republic of); Cha, K. H.; Lee, C. S.; Kwon, K. C.; Han, H. B. [KAERI, Taejon (Korea, Republic of)

    2002-05-01

    A methodology for developing software life cycle processes (SLCP) is proposed to successfully develop the digital safety-critical Engineered Safety Features - Component Control System (ESF-CCS). A software life cycle model is selected as a hybrid model mixing the waterfall, prototyping, and spiral models, and is composed of two stages: development of a prototype of the ESF-CCS, followed by development of the ESF-CCS itself. To produce the software life cycle (SLC) for the development of the digital reactor safety system, the Activities referenced in IEEE Std. 1074-1997 are mapped onto the hybrid model. The SLCP is established after the available OPAs (Organizational Process Assets) are applied to the SLC Activities, and the known constraints are reconciled. The established SLCP describes well the software life cycle activities that are provided to the Regulatory Authority.

  15. LDUA software custodian's notebook

    International Nuclear Information System (INIS)

    Aftanas, B.L.

    1998-01-01

    This plan describes the activities to be performed and controls to be applied to the process of specifying, obtaining, and qualifying the control and data acquisition software for the Light Duty Utility Arm (LDUA) System. It serves the purpose of a software quality assurance plan, a verification and validation plan, and a configuration management plan. This plan applies to all software that is an integral part of the LDUA control and data acquisition system, that is, software that is installed in the computers that are part of the LDUA system as it is deployed in the field. This plan applies to the entire development process, including: requirements; design; implementation; and operations and maintenance. This plan does not apply to any software that is not integral with the LDUA system. This plan has been prepared in accordance with WHC-CM-6-1 Engineering Practices, EP-2.1; WHC-CM-3-10 Software Practices; and WHC-CM-4-2, QR 19.0, Software Quality Assurance Requirements

  16. Software Formal Inspections Guidebook

    Science.gov (United States)

    1993-01-01

    The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.

  17. The STARLINK software collection

    Science.gov (United States)

    Penny, A. J.; Wallace, P. T.; Sherman, J. C.; Terret, D. L.

    1993-12-01

    A demonstration will be given of some recent Starlink software. STARLINK is: a network of computers used by UK astronomers; a collection of programs for the calibration and analysis of astronomical data; a team of people giving hardware, software and administrative support. The Starlink Project has been in operation since 1980 to provide UK astronomers with interactive image processing and data reduction facilities. There are now Starlink computer systems at 25 UK locations, serving about 1500 registered users. The Starlink software collection now has about 25 major packages covering a wide range of astronomical data reduction and analysis techniques, as well as many smaller programs and utilities. At the core of most of the packages is a common 'software environment', which provides many of the functions which applications need and offers standardized methods of structuring and accessing data. The software environment simplifies programming and support, and makes it easy to use different packages for different stages of the data reduction. Users see a consistent style, and can mix applications without hitting problems of differing data formats. The Project group coordinates the writing and distribution of this software collection, which is Unix based. Outside the UK, Starlink is used at a large number of places, which range from installations at major UK telescopes, which are Starlink-compatible and managed like Starlink sites, to individuals who run only small parts of the Starlink software collection.

  18. Workflow for high-content, individual cell quantification of fluorescent markers from universal microscope data, supported by open source software.

    Science.gov (United States)

    Stockwell, Simon R; Mittnacht, Sibylle

    2014-12-16

    Advances in understanding the control mechanisms governing the behavior of cells in adherent mammalian tissue culture models are becoming increasingly dependent on modes of single-cell analysis. Methods which deliver composite data reflecting the mean values of biomarkers from cell populations risk losing subpopulation dynamics that reflect the heterogeneity of the studied biological system. In keeping with this, traditional approaches are being replaced by, or supported with, more sophisticated forms of cellular assay developed to allow assessment by high-content microscopy. These assays potentially generate large numbers of images of fluorescent biomarkers which, enabled by accompanying proprietary software packages, allow for multi-parametric measurements per cell. However, the relatively high capital costs and overspecialization of many of these devices have prevented their accessibility to many investigators. Described here is a universally applicable workflow for the quantification of multiple fluorescent marker intensities from specific subcellular regions of individual cells, suitable for use with images from most fluorescent microscopes. Key to this workflow is the implementation of the freely available CellProfiler software (1) to distinguish individual cells in these images, segment them into defined subcellular regions, and deliver fluorescence marker intensity values specific to these regions. The extraction of individual cell intensity values from image data is the central purpose of this workflow and will be illustrated with the analysis of control data from a siRNA screen for G1 checkpoint regulators in adherent human cells. However, the workflow presented here can be applied to the analysis of data from other means of cell perturbation (e.g., compound screens) and other forms of fluorescence-based cellular markers and thus should be useful for a wide range of laboratories.
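
    The central step of the workflow, reading a marker intensity per segmented cell region, can be sketched with the open source scikit-image library in place of CellProfiler. The snippet below is a minimal illustration on synthetic images; the thresholding choice and the synthetic channels are assumptions, not the published protocol.

      # Minimal per-cell intensity measurement: segment nuclei in one
      # channel, then read mean marker intensity per labelled region.
      import numpy as np
      from skimage import filters, measure

      rng = np.random.default_rng(2)
      nuclei = np.zeros((128, 128))
      nuclei[30:50, 30:50] = 1.0                       # one fake nucleus
      nuclei += rng.normal(0.0, 0.05, nuclei.shape)    # imaging noise
      marker = rng.random(nuclei.shape)                # fake marker channel

      mask = nuclei > filters.threshold_otsu(nuclei)   # segment the nuclei
      labels = measure.label(mask)                     # one label per cell

      for cell in measure.regionprops(labels, intensity_image=marker):
          print("cell", cell.label, "mean marker intensity",
                round(float(cell.mean_intensity), 3), "over", cell.area, "px")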

  19. Evidence synthesis software.

    Science.gov (United States)

    Park, Sophie Elizabeth; Thomas, James

    2018-06-07

    It can be challenging to decide which evidence synthesis software to choose when doing a systematic review. This article discusses some of the important questions to consider in relation to the chosen method and synthesis approach. Software can support researchers in a range of ways, and different review conditions call for different software solutions: for example, facilitating contemporaneous collaboration across time and geographical space; in-built bias assessment tools; and line-by-line coding for qualitative textual analysis. EPPI-Reviewer is review software for research synthesis managed by the EPPI-Centre, UCL Institute of Education. EPPI-Reviewer has text mining automation technologies. Version 5 supports data sharing and re-use across the systematic review community. Open source software will soon be released. The EPPI-Centre will continue to offer the software as a cloud-based service. The software is offered via a subscription with a one-month (extendible) trial available and volume discounts for 'site licences'. It is free to use for Cochrane and Campbell reviews. The next EPPI-Reviewer version is being built in collaboration with the National Institute for Health and Care Excellence, using 'surveillance' of newly published research to support 'living' iterative reviews. This is achieved using a combination of machine learning and traditional information retrieval technologies to identify the type of research each new publication describes and determine its relevance for a particular review, domain or guideline. While the amount of available knowledge and research is constantly increasing, the ways in which software can support the focus and relevance of data identification are also developing fast. Software advances are maximising the opportunities for the production of relevant and timely reviews.
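
    The 'surveillance' step described above, deciding automatically whether a newly published record is relevant to a review, is in essence a text-classification problem. The sketch below shows the generic pattern with scikit-learn; it is not EPPI-Reviewer's implementation, and the tiny training set is invented.

      # Generic relevance screening: learn from previously screened titles.
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline

      titles = [
          "statin therapy randomized controlled trial",
          "qualitative study of patient experiences of statins",
          "statin pharmacokinetics in mice",
          "stock market volatility modelling",
      ]
      relevant = [1, 1, 0, 0]        # labels from earlier human screening

      screener = make_pipeline(TfidfVectorizer(), LogisticRegression())
      screener.fit(titles, relevant)

      new_records = ["trial of statin adherence interventions"]
      print("probability relevant:",
            round(float(screener.predict_proba(new_records)[0, 1]), 2))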

  20. Characterization of Morphology using MAMA Software

    Energy Technology Data Exchange (ETDEWEB)

    Gravelle, Julie [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-02

    The MAMA (Morphological Analysis for Material Attribution) software was developed at the Los Alamos National Laboratory, funded through the National Technical Nuclear Forensics Center in the Department of Homeland Security. The software allows images to be analysed and quantified. The largest project I worked on was to quantify images of plutonium oxides and ammonium diuranates prepared by the group with the software and to provide analyses of the particles in each sample. Images were quantified through MAMA, with a color analysis, a lexicon description, and powder X-ray diffraction. Through this we were able to visually see a difference between some of the syntheses. An additional project was to revise the manual for MAMA to help streamline training and provide useful tips so that users more quickly become acclimated to the software. The third project investigated expanding the scope of MAMA and finding a statistically relevant baseline for the particulates through the analysis of maps in the software and using known measurements to compare the error associated with the software. During this internship, I worked on several different projects dealing with the MAMA software. The revision of the user manual for the MAMA software was the first project I was able to work on and collaborate on. I first learned how to use the software by getting instruction from a skilled user at the laboratory, Dan Schwartz, and by using the existing user manual and examples. After becoming accustomed to the program, I started to go over the manual to correct and change items that were not as useful or descriptive as they could have been. I also added tips that I learned as I explored the software. The updated manual was also worked on by several others who have been developing the program. The goal of these revisions was to ensure the most concise and simple directions to the software were available to future users. By incorporating tricks and shortcuts that I discovered and picked up

  1. Mapping Information

    Data.gov (United States)

    Department of Homeland Security — ArcGIS is a system that provides an integrated collection of GIS software products that provides a standards-based platform for spatial analysis, data management,...

  2. Beginning software engineering

    CERN Document Server

    Stephens, Rod

    2015-01-01

    Beginning Software Engineering demystifies the software engineering methodologies and techniques that professional developers use to design and build robust, efficient, and consistently reliable software. Free of jargon and assuming no previous programming, development, or management experience, this accessible guide explains important concepts and techniques that can be applied to any programming language. Each chapter ends with exercises that let you test your understanding and help you elaborate on the chapter's main concepts. Everything you need to understand waterfall, Sashimi, agile, RAD, Scrum, Kanban, Extreme Programming, and many other development models is inside!

  3. Software industrial flexible

    OpenAIRE

    Díaz Araya, Daniel; Muñoz, Leandro; Sirerol, Daniel; Oviedo, Sandra; Ibáñez, Francisco S.

    2012-01-01

    This work aims to investigate and propose techniques, methods, and technologies that enable the development of flexible software in industrial environments. The objective is to generate methods and techniques that facilitate the development of flexible software in industrial settings. The research areas are production scheduling systems, software generation for open hardware platforms, and innovation.

  4. Thyroid uptake software

    International Nuclear Information System (INIS)

    Alonso, Dolores; Arista, Eduardo

    2003-01-01

    The DETEC-PC software was developed as a complement to a measurement system (hardware) able to perform iodine thyroid uptake studies. The software was designed according to the principles of object-oriented programming, using the C++ language. The software automatically sets spectrometric measurement parameters and, besides patient measurements, also performs statistical analysis of a batch of samples. It includes a PARADOX database with all the information on measured patients, and a help system covering the system options and the medical concepts related to the thyroid uptake study.

  5. Criteria for software modularization

    Science.gov (United States)

    Card, David N.; Page, Gerald T.; Mcgarry, Frank E.

    1985-01-01

    A central issue in programming practice involves determining the appropriate size and information content of a software module. This study attempted to determine the effectiveness of two widely used criteria for software modularization, strength and size, in reducing fault rate and development cost. Data from 453 FORTRAN modules developed by professional programmers were analyzed. The results indicated that module strength is a good criterion with respect to fault rate, whereas arbitrary module size limitations inhibit programmer productivity. This analysis is a first step toward defining empirically based standards for software modularization.

  6. Error Free Software

    Science.gov (United States)

    1985-01-01

    A mathematical theory for development of "higher order" software to catch computer mistakes resulted from a Johnson Space Center contract for Apollo spacecraft navigation. Two women who were involved in the project formed Higher Order Software, Inc. to develop and market the system of error analysis and correction. They designed software which is logically error-free, which, in one instance, was found to increase productivity by 600%. USE.IT defines its objectives using AXES -- a user can write in English and the system converts to computer languages. It is employed by several large corporations.

  7. Upgrade Software and Computing

    CERN Document Server

    The LHCb Collaboration, CERN

    2018-01-01

    This document reports the Research and Development activities that are carried out in the software and computing domains in view of the upgrade of the LHCb experiment. The implementation of a full software trigger implies major changes in the core software framework, in the event data model, and in the reconstruction algorithms. The increase of the data volumes for both real and simulated datasets requires a corresponding scaling of the distributed computing infrastructure. An implementation plan in both domains is presented, together with a risk assessment analysis.

  8. Global Software Engineering

    DEFF Research Database (Denmark)

    Ebert, Christof; Kuhrmann, Marco; Prikladnicki, Rafael

    2016-01-01

    SOFTWARE, LIKE ALL industry products, is the result of complex multinational supply chains with many partners from concept to development to production and maintenance. Global software engineering (GSE), IT outsourcing, and business process outsourcing have shown growth rates of 10 to 20 percent per year during the past decade. This instalment of Practitioner’s Digest summarizes experiences and guidance from industry to facilitate knowledge and technology transfer for GSE. It’s based on industry feedback from the annual IEEE International Conference on Global Software Engineering, which had

  9. Software for microcircuit systems

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1978-01-01

    Modern Large Scale Integration (LSI) microcircuits are meant to be programmed in order to control the function that they perform. In the previous paper the author has already discussed the basics of microprogramming and has studied in some detail two types of new microcircuits. In this paper, methods of developing software for these microcircuits are explored. This generally requires a package of support software to assemble the microprogram, and also some support software to test the microprograms and the microprogrammed circuit itself. (Auth.)

  10. Guide to software export

    CERN Document Server

    Philips, Roger A

    2014-01-01

    An ideal reference source for CEOs, marketing and sales managers, sales consultants, and students of international marketing, Guide to Software Export provides a step-by-step approach to initiating or expanding international software sales. It teaches you how to examine critically your candidate product for exportability; how to find distributors, agents, and resellers abroad; how to identify the best distribution structure for export; and much, much more!Not content with providing just the guidelines for setting up, expanding, and managing your international sales channels, Guide to Software

  11. Software takes command

    CERN Document Server

    Manovich, Lev

    2013-01-01

    Software has replaced a diverse array of physical, mechanical, and electronic technologies used before the 21st century to create, store, distribute and interact with cultural artifacts. It has become our interface to the world, to others, to our memory and our imagination - a universal language through which the world speaks, and a universal engine on which the world runs. What electricity and the combustion engine were to the early 20th century, software is to the early 21st century. Offering the first theoretical and historical account of software for media authoring and its effects on the prac

  12. Software quality assurance handbook

    Energy Technology Data Exchange (ETDEWEB)

    1990-09-01

    There are two important reasons for Software Quality Assurance (SQA) at Allied-Signal Inc., Kansas City Division (KCD): First, the benefits from SQA make good business sense. Second, the Department of Energy has requested SQA. This handbook is one of the first steps in a plant-wide implementation of Software Quality Assurance at KCD. The handbook has two main purposes. The first is to provide information that you will need to perform software quality assurance activities. The second is to provide a common thread to unify the approach to SQA at KCD. 2 figs.

  13. Sobre software libre

    OpenAIRE

    Matellán Olivera, Vicente; González Barahona, Jesús; Heras Quirós, Pedro de las; Robles Martínez, Gregorio

    2004-01-01

    220 p. "Sobre software libre" reune casi una treintena de ensayos sobre temas de candente actualidad relacionados con el software libre (del cual Linux es su ex- ponente más conocido). Los ensayos que el lector encontrará están divididos en bloques temáticos que van desde la propiedad intelectual o las cuestiones económicas y sociales de este modelo hasta su uso en la educación y las administraciones publicas, pasando por alguno que repasa la historia del software libre en l...

  14. Intellectual Property Protection of Software – At the Crossroads of Software Patents and Open Source Software

    OpenAIRE

    Tantarimäki, Maria

    2018-01-01

    The thesis considers the intellectual property protection of software in Europe and in the US, an increasingly important subject as the world is globalizing and digitalizing. The special nature of software has challenged intellectual property rights. The current protection of software is based on copyright protection, but in this thesis two other options are considered: software patents and open source software. Software patents provide strong protection for software, whereas the pur...

  15. Center for Adaptive Optics | Software

    Science.gov (United States)

    The Center for Adaptive Optics acts as a clearing house for distributing software to institutes; it gives specialists in Adaptive Optics a place to distribute their software. All software is shared on an "as-is" basis, and users should consult the software authors with any

  16. A concept of software testing for SMART MMIS software

    International Nuclear Information System (INIS)

    Seo, Yong Seok; Seong, Seung Hwan; Park, Keun Ok; Hur, Sub; Kim, Dong Hoon

    2001-01-01

    In order to achieve high quality of SMART MMIS software, a well-constructed software testing concept is required. This paper establishes a software testing concept to be applied to SMART MMIS software in terms of software testing organization, documentation, procedure, and methods. The software testing methods are classified into source code static analysis and dynamic testing. The software dynamic testing methods are discussed from two aspects: white-box and black-box testing. As the software testing concept introduced in this paper is applied to the SMART MMIS software, high quality software will be produced. In the future, software failure data will be collected through the construction of the SMART MMIS prototyping facility, to which the software testing concept of this paper is applied
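
    To make the white-box/black-box distinction concrete, here is a minimal Python sketch (not from the paper; the trip-signal function and all names are hypothetical). The black-box test is derived from the requirement alone, while the white-box test is chosen by inspecting the implementation to cover both branches, including the boundary value:

        def trip_signal(pressure: float, setpoint: float) -> bool:
            """Return True when the measured pressure exceeds its setpoint."""
            return pressure > setpoint

        def test_black_box():
            # Derived from the specification only, with no knowledge of the code.
            assert trip_signal(pressure=16.0, setpoint=15.5) is True
            assert trip_signal(pressure=15.0, setpoint=15.5) is False

        def test_white_box():
            # Exercise both branches of the comparison, including the boundary
            # case (equal values are not strictly greater).
            assert trip_signal(15.5, 15.5) is False
            assert trip_signal(15.6, 15.5) is True

        if __name__ == "__main__":
            test_black_box()
            test_white_box()
            print("all tests passed")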

  17. Computer Center: Software Review.

    Science.gov (United States)

    Duhrkopf, Richard, Ed.; Belshe, John F., Ed.

    1988-01-01

    Reviews a software package, "Mitosis-Meiosis," available for Apple II or IBM computers with colorgraphics capabilities. Describes the documentation, presentation and flexibility of the program. Rates the program based on graphics and usability in a biology classroom. (CW)

  18. Next Generation Software Development

    National Research Council Canada - National Science Library

    Manna, Zohar

    2005-01-01

    Under this grant we have studied the development of a scientifically sound basis for software development that builds on widely used pragmatic methods but is firmly grounded in well-established formal...

  19. Managing Distributed Software Projects

    DEFF Research Database (Denmark)

    Persson, John Stouby

    Increasingly, software projects are becoming geographically distributed, with limited face-to-face interaction between participants. These projects face particular challenges that need careful managerial attention. This PhD study reports on how we can understand and support the management of distributed software projects, based on a literature study and a case study. The main emphasis of the literature study was on how to support the management of distributed software projects, but it also contributed to an understanding of these projects. The main emphasis of the case study was on how to understand the management of distributed software projects, but it also contributed to supporting the management of these projects. The literature study integrates what we know about risks and risk-resolution techniques into a framework for managing risks in distributed contexts. This framework was developed iteratively...

  20. eSoftwareList

    Data.gov (United States)

    US Agency for International Development — USAID Software Database reporting tool created in Oracle Application Express (APEX). This version provides read only access to a database view of the JIRA SAR...

  1. Software didattico: integrazione scolastica

    Directory of Open Access Journals (Sweden)

    Lucia Ferlino

    1996-01-01

    Discussion of the use of educational software for school integration, which requires being aware of its potential effectiveness and knowing that effectiveness also lies in the choice of functional products.

  2. Core Flight Software

    Data.gov (United States)

    National Aeronautics and Space Administration — The AES Core Flight Software (CFS) project purpose is to analyze applicability, and evolve and extend the reusability of the CFS system originally developed by...

  3. Tier2 Submit Software

    Science.gov (United States)

    Download this tool for Windows or Mac, which helps facilities prepare a Tier II electronic chemical inventory report. The data can also be exported into the CAMEOfm (Computer-Aided Management of Emergency Operations) emergency planning software.

  4. SEER Data & Software

    Science.gov (United States)

    Options for accessing datasets for incidence, mortality, county populations, standard populations, expected survival, and SEER-linked and specialized data. Plus variable definitions, documentation for reporting and using datasets, statistical software (SEER*Stat), and observational research resources.

  5. Managing Software Process Evolution

    DEFF Research Database (Denmark)

    This book focuses on the design, development, management, governance and application of evolving software processes that are aligned with changing business objectives, such as expansion to new domains or shifting to global production. In the context of an evolving business world, it examines the complete software process lifecycle, from the initial definition of a product to its systematic improvement. In doing so, it addresses difficult problems, such as how to implement processes in highly regulated domains or where to find a suitable notation system for documenting processes, and provides essential insights and tips to help readers manage process evolutions. And last but not least, it provides a wealth of examples and cases on how to deal with software evolution in practice. Reflecting these topics, the book is divided into three parts. Part 1 focuses on software business transformation...

  6. Software for nuclear spectrometry

    International Nuclear Information System (INIS)

    1998-10-01

    The Advisory Group Meeting (AGM) on Software for Nuclear Spectrometry was dedicated to reviewing the present status of software for nuclear spectrometry and to advising on future activities in this field. Because similar AGMs and consultants' meetings had been held in the past, and in an attempt to be more focused, this AGM was devoted to the specific field of software for gamma-ray spectrometry. Nevertheless, many of the issues discussed and the recommendations made are of general concern for any software for nuclear spectrometry. The report is organized in sections. The 'Summary' gives the conclusions and recommendations adopted at the AGM. These conclusions and recommendations resulted from the discussions held during and after the presentations of the scientific and technical papers. The papers are reported here in their full form in the following sections

  7. Software for radiation protection

    International Nuclear Information System (INIS)

    Graffunder, H.

    2002-01-01

    The software products presented are universally usable programs for radiation protection. The systems were designed to establish a comprehensive database specific to radiation protection and, on this basis, to model radiation protection subjects in programs. Development initially focused on the creation of the database. Each software product was to access the same nuclide-specific data; input errors and differences in spelling were to be excluded from the outset. This makes the products more compatible with each other and able to exchange data among each other. The software products are modular in design. Functions recurring in radiation protection are always treated the same way in different programs, and are also represented the same way on the program surface. The recognition effect makes it easy for users to familiarize themselves with the products quickly. All software products are written in German and are tailored to the administrative needs and codes and regulations in Germany and Switzerland. (orig.) [de

  8. ITSY Handheld Software Radio

    National Research Council Canada - National Science Library

    Bose, Vanu

    2001-01-01

    .... A handheld software radio platform would enable the construction of devices that could inter-operate with multiple legacy systems, download new waveforms and be used to construct adhoc networks...

  9. Error-Free Software

    Science.gov (United States)

    1989-01-01

    001 is an integrated tool suited for automatically developing ultra reliable models, simulations and software systems. Developed and marketed by Hamilton Technologies, Inc. (HTI), it has been applied in engineering, manufacturing, banking and software tools development. The software provides the ability to simplify the complex. A system developed with 001 can be a prototype or fully developed with production quality code. It is free of interface errors, consistent, logically complete and has no data or control flow errors. Systems can be designed, developed and maintained with maximum productivity. Margaret Hamilton, President of Hamilton Technologies, also directed the research and development of USE.IT, an earlier product which was the first computer aided software engineering product in the industry to concentrate on automatically supporting the development of an ultrareliable system throughout its life cycle. Both products originated in NASA technology developed under a Johnson Space Center contract.

  10. MARS software package status

    International Nuclear Information System (INIS)

    Azhgirej, I.L.; Talanov, V.V.

    2000-01-01

    The MARS software package is intended for simulating nuclear-electromagnetic cascades and the transport of secondary neutrons and muons in heterogeneous media of arbitrary complexity in the presence of magnetic fields. The package realizes an inclusive approach to describing particle production in nuclear and electromagnetic interactions and in the decay of unstable particles. The MARS software package has been actively applied to solving various radiation physics problems [ru

  11. MAGIC user's group software

    International Nuclear Information System (INIS)

    Warren, G.; Ludeking, L.; McDonald, J.; Nguyen, K.; Goplen, B.

    1990-01-01

    The MAGIC User's Group has been established to facilitate the use of electromagnetic particle-in-cell software by universities, government agencies, and industrial firms. The software consists of a series of independent executables that are capable of inter-communication. MAGIC, SOS, and μSOS are used to perform electromagnetic simulations, while POSTER is used to provide post-processing capabilities. Each is described in the paper. Use of the codes for Klystrode simulation is discussed

  12. Global software development

    DEFF Research Database (Denmark)

    Matthiesen, Stina

    2016-01-01

    This overview presents the mid stages of my doctoral research, based on ethnographic work conducted in IT companies in India and in Denmark, on collaborative work within global software development (GSD). In the following I briefly introduce how this research seeks to spark a debate in CSCW by challenging contemporary ideals about software development outsourcing through the exploration of the multiplicities and asymmetric dynamics inherent in the collaborative work of GSD.

  13. Principles of Antifragile Software

    OpenAIRE

    Monperrus, Martin

    2014-01-01

    The goal of this paper is to study and define the concept of "antifragile software". For this, I start from Taleb's statement that antifragile systems love errors, and discuss whether traditional software dependability fits into this class. The answer is somewhat negative, although adaptive fault tolerance is antifragile: the system learns something when an error happens, and always improves. Automatic runtime bug fixing is changing the code in response to errors, fault injection in productio...

  14. Software product quality measurement

    OpenAIRE

    Godliauskas, Eimantas

    2016-01-01

    This paper analyses Ruby product quality measures, suggesting three new measures for the Ruby product quality measurement tool Rubocop to measure the Ruby product quality characteristics defined in the ISO 2502n standard series. This paper consists of four main chapters. The first chapter gives a brief view of software product quality and software product quality measurement. The second chapter analyses object-oriented quality measures. The third chapter gives a brief view of the most popular Ruby qualit...

  15. Mapping racism.

    Science.gov (United States)

    Moss, Donald B

    2006-01-01

    The author uses the metaphor of mapping to illuminate a structural feature of racist thought, locating the degraded object along vertical and horizontal axes. These axes establish coordinates of hierarchy and of distance. With the coordinates in place, racist thought begins to seem grounded in natural processes. The other's identity becomes consolidated, and parochialism results. The use of this kind of mapping is illustrated via two patient vignettes. The author presents Freud's (1905, 1927) views in relation to such a "mapping" process, as well as Adorno's (1951) and Baldwin's (1965). Finally, the author conceptualizes the crucial status of primitivity in the workings of racist thought.

  16. Analyzing the Scientific Evolution of Social Work Using Science Mapping

    Science.gov (United States)

    Martínez, Ma Angeles; Cobo, Manuel Jesús; Herrera, Manuel; Herrera-Viedma, Enrique

    2015-01-01

    Objectives: This article reports the first science mapping analysis of the social work field, which shows its conceptual structure and scientific evolution. Methods: Science Mapping Analysis Software Tool, a bibliometric science mapping tool based on co-word analysis and h-index, is applied using a sample of 18,794 research articles published from…

  17. Profiling a Mind Map User: A Descriptive Appraisal

    Science.gov (United States)

    Tucker, Joanne M.; Armstrong, Gary R.; Massad, Victor J.

    2010-01-01

    Whether manually or through the use of software, a non-linear information organization framework known as mind mapping offers an alternative method for capturing thoughts, ideas and information to linear thinking modes such as outlining. Mind mapping is brainstorming, organizing, and problem solving. This paper examines mind mapping techniques,…

  18. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report describes software safety analysis techniques and engineering guidelines for developing safety critical software, in order to identify the state of the art in this field and to give the software safety engineer a trail map between the code and standards layer and the design methodology and documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Modes and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures, high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided D-in-D and D analysis guidelines.

  19. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    International Nuclear Information System (INIS)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report describes software safety analysis techniques and engineering guidelines for developing safety critical software, in order to identify the state of the art in this field and to give the software safety engineer a trail map between the code and standards layer and the design methodology and documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Modes and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures, high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided D-in-D and D analysis guidelines

  20. Examining software complexity and quality for scientific software

    International Nuclear Information System (INIS)

    Kelly, D.; Shepard, T.

    2005-01-01

    Research has not found a simple relationship between software complexity and software quality, and in particular no relationship between commonly used software complexity metrics and the occurrence of software faults. A study with an example of scientific software from the nuclear power industry illustrates the importance of addressing cognitive complexity, the complexity related to understanding the intellectual content of the software. Simple practices such as aptly-named variables contribute more to high quality software than limiting code sizes. This paper examines the research into complexity and quality and reports on a longitudinal study using the example of nuclear software. (author)

  1. Arctic Basemaps In Google Maps

    DEFF Research Database (Denmark)

    Muggah, J.; Mioc, Darka

    2010-01-01

    The Ocean Mapping Group (OMG) has been collecting data in the Arctic since 2003 and there are approximately 2,000 basemaps. In the current online storage format used by the OMG, it is difficult to view the data and users cannot easily pan and zoom. The purpose of this research is to investigate the advantages of using Google Maps to display the OMG's Arctic data. The map should load the large Arctic dataset in a reasonable time. The bathymetric images were created using software written by the OMG for Linux, and a step-by-step process was used to create images from the multibeam data collected by the OMG in the Arctic. The website was also created using the Linux operating system. The projection needed to be changed from Lambert Conformal Conic (useful at higher latitudes) to Mercator (used by Google Maps) and the data needed to have a common colour scheme. After creating and testing...
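
    To illustrate the reprojection step described above, here is a minimal sketch (not the OMG's actual tools) of the spherical Web Mercator forward projection that Google Maps uses; note that the y-coordinate grows rapidly at the high latitudes covered by the Arctic data, which is why Lambert Conformal Conic is preferred there in the first place:

        import math

        R = 6378137.0  # sphere radius used by Web Mercator (EPSG:3857), in metres

        def to_web_mercator(lon_deg: float, lat_deg: float):
            """Project geographic coordinates (degrees) to Web Mercator metres."""
            x = R * math.radians(lon_deg)
            y = R * math.log(math.tan(math.pi / 4.0 + math.radians(lat_deg) / 2.0))
            return x, y

        # Example: a hypothetical sounding location in the Canadian Arctic.
        print(to_web_mercator(-95.0, 74.0))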

  2. Genetic Mapping

    Science.gov (United States)

    ... greatly advanced genetics research. The improved quality of genetic data has reduced the time required to identify a ... cases, a matter of months or even weeks. Genetic mapping data generated by the HGP's laboratories is freely accessible ...

  3. Hyperspectral Soil Mapper (HYSOMA) software interface: Review and future plans

    Science.gov (United States)

    Chabrillat, Sabine; Guillaso, Stephane; Eisele, Andreas; Rogass, Christian

    2014-05-01

    With the upcoming launch of the next generation of hyperspectral satellites that will routinely deliver high spectral resolution images for the entire globe (e.g. EnMAP, HISUI, HyspIRI, HypXIM, PRISMA), an increasing demand for the availability/accessibility of hyperspectral soil products is coming from the geoscience community. Indeed, many robust methods for the prediction of soil properties based on imaging spectroscopy already exist and have been successfully used for a wide range of airborne soil mapping applications. Nevertheless, these methods require expert know-how and fine-tuning, which means they are used sparingly. More development is needed toward easy-to-access soil toolboxes as a major step toward the operational use of hyperspectral soil products for monitoring and modelling Earth's surface processes, allowing non-experienced users to obtain new information based on inexpensive software packages where repeatability of the results is an important prerequisite. In this frame, based on the EU-FP7 EUFAR (European Facility for Airborne Research) project and the EnMAP satellite science program, higher-performing soil algorithms were developed at the GFZ German Research Center for Geosciences as demonstrators for end-to-end processing chains with harmonized quality measures. The algorithms were built into the HYSOMA (Hyperspectral SOil MApper) software interface, providing an experimental platform for soil mapping applications of hyperspectral imagery that gives the choice of multiple algorithms for each soil parameter. The software interface focuses on fully automatic generation of semi-quantitative soil maps such as soil moisture, soil organic matter, iron oxide, clay content, and carbonate content. Additionally, a field calibration option calculates fully quantitative soil maps provided ground truth soil data are available. The implemented soil algorithms have been tested and validated using extensive in-situ ground truth data sets. The source of the HYSOMA
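
    The field calibration option mentioned above can be sketched as follows (hypothetical data and feature; not HYSOMA code): a per-pixel spectral feature is calibrated against lab-measured soil values at sample locations with a least-squares linear fit, turning a semi-quantitative map into a quantitative one:

        import numpy as np

        # Hypothetical per-pixel spectral feature (e.g., an absorption-band depth).
        rng = np.random.default_rng(0)
        feature_map = rng.random((100, 100))

        # Ground-truth samples: pixel locations and lab-measured soil values.
        rows = np.array([5, 20, 40, 60, 80])
        cols = np.array([10, 30, 50, 70, 90])
        measured = np.array([1.2, 2.8, 4.1, 5.0, 6.3])  # e.g., clay content in %

        # Least-squares linear calibration: measured ~= a * feature + b.
        a, b = np.polyfit(feature_map[rows, cols], measured, deg=1)

        # Apply the calibration to the whole scene to obtain a quantitative map.
        quantitative_map = a * feature_map + b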

  4. Computing and software

    Directory of Open Access Journals (Sweden)

    White, G. C.

    2004-06-01

    The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementations of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK requires that each new model's likelihood must be programmed specifically for that model. They wishfully think that future software might allow the user to combine
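
    The shared nest-survival likelihood that Rotella et al. compare can be sketched as follows (a simplified, hypothetical-data illustration of an interval-censored daily-survival-rate likelihood; the actual SAS and Program MARK implementations differ in parameterization and covariate handling):

        import math

        # One record per observation interval: (days between visits, nest survived?).
        intervals = [(5, True), (7, True), (3, False), (6, True), (4, False)]

        def log_likelihood(s: float) -> float:
            """Log-likelihood of daily survival rate s.

            Surviving an interval of t days contributes s**t; a failure
            contributes 1 - s**t (the nest failed somewhere in the interval)."""
            ll = 0.0
            for t, survived in intervals:
                p = s ** t
                ll += math.log(p if survived else 1.0 - p)
            return ll

        # Crude grid search for the maximum-likelihood daily survival rate.
        _, s_hat = max((log_likelihood(s / 1000.0), s / 1000.0)
                       for s in range(1, 1000))
        print(f"estimated daily survival rate: {s_hat:.3f}")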

  5. Software Engineering Reviews and Audits

    CERN Document Server

    Summers, Boyd L

    2011-01-01

    Accurate software engineering reviews and audits have become essential to the success of software companies and military and aerospace programs. These reviews and audits define the framework and specific requirements for verifying software development efforts. Authored by an industry professional with three decades of experience, Software Engineering Reviews and Audits offers authoritative guidance for conducting and performing software first article inspections, and functional and physical configuration software audits. It prepares readers to answer common questions for conducting and perform

  6. Business Management Software Axolon ERP

    OpenAIRE

    Axolon ERP Solution

    2018-01-01

    Axolon ERP (www.axolonerp.com), by Micromind, is a comprehensive business management software solution for businesses. We deliver business management software in Dubai, the UAE and the GCC countries; products also include ERP software, HR and payroll, inventory software, and project management, along with software development, solutions and services in Dubai, UAE, for small and medium-sized enterprises (SMEs) in the Middle East, with an easy-to-use, secure and efficient business management...

  7. Software FMEA analysis for safety-related application software

    International Nuclear Information System (INIS)

    Park, Gee-Yong; Kim, Dong Hoon; Lee, Dong Young

    2014-01-01

    Highlights: • We develop a modified FMEA analysis suited for application to software architecture. • A template for failure modes in a specific software language is established. • A detailed-level software FMEA analysis of nuclear safety software is presented. - Abstract: A method of software safety analysis is described in this paper for safety-related application software. The target software system is the software code installed in the Automatic Test and Interface Processor (ATIP) of a digital reactor protection system (DRPS). For the ATIP software safety analysis, first an overall safety or hazard analysis is performed over the software architecture and modules, and then a detailed safety analysis based on the software FMEA (Failure Modes and Effects Analysis) method is applied to the ATIP program. For an efficient analysis, the software FMEA analysis is carried out based on a so-called failure-mode template extracted from the function blocks used in the function block diagram (FBD) of the ATIP software. The software safety analysis by software FMEA, applied to the ATIP software code, which has been integrated and passed through a very rigorous system test procedure, proves able to provide very valuable results (i.e., software defects) that could not be identified during the various system tests
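
    The failure-mode-template idea described above can be sketched in a few lines (a hypothetical, simplified illustration; not the actual ATIP template): each FBD function-block type carries a reusable list of failure modes, and FMEA worksheet rows are generated by instantiating the template for every block used in the program:

        # Reusable failure modes per function-block type (all entries hypothetical).
        FAILURE_MODE_TEMPLATE = {
            "COMPARATOR": ["stuck output high", "stuck output low", "wrong setpoint used"],
            "TIMER":      ["never expires", "expires early", "expires late"],
            "AND_GATE":   ["output stuck TRUE", "output stuck FALSE"],
        }

        # Blocks actually used in a hypothetical ATIP-style FBD program.
        program_blocks = [
            ("CMP_PRESSURE", "COMPARATOR"),
            ("TMR_WATCHDOG", "TIMER"),
            ("AND_TRIP", "AND_GATE"),
        ]

        def fmea_worksheet(blocks):
            """Yield one FMEA row (block, failure mode) per template entry."""
            for name, block_type in blocks:
                for mode in FAILURE_MODE_TEMPLATE[block_type]:
                    yield {"block": name, "failure_mode": mode,
                           "effect": "TBD", "severity": "TBD"}

        for row in fmea_worksheet(program_blocks):
            print(row)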

  8. The Ettention software package

    International Nuclear Information System (INIS)

    Dahmen, Tim; Marsalek, Lukas; Marniok, Nico; Turoňová, Beata; Bogachev, Sviatoslav; Trampert, Patrick; Nickels, Stefan; Slusallek, Philipp

    2016-01-01

    We present a novel software package for the problem “reconstruction from projections” in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block-iterative reconstruction method based on the Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and the eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. - Highlights: • Novel software package for “reconstruction from projections” in electron microscopy. • Support for high-resolution reconstructions on iterative reconstruction algorithms. • Support for CPU, GPU and Xeon Phi. • Integration in the IMOD software. • Platform for algorithm researchers: object oriented, modular design.
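
    For readers unfamiliar with the Kaczmarz scheme that Ettention builds on, here is a minimal dense-matrix sketch (a toy version; Ettention's actual implementation is block-iterative and GPU-optimized). Each step projects the current estimate onto the hyperplane defined by one row of the system A x = b:

        import numpy as np

        def kaczmarz(A: np.ndarray, b: np.ndarray, sweeps: int = 50) -> np.ndarray:
            """Toy Kaczmarz solver for A x = b.

            Row-by-row projection: x <- x + (b_i - a_i . x) / ||a_i||**2 * a_i."""
            x = np.zeros(A.shape[1])
            row_norms = (A * A).sum(axis=1)
            for _ in range(sweeps):
                for i in range(A.shape[0]):
                    if row_norms[i] > 0.0:
                        x += (b[i] - A[i] @ x) / row_norms[i] * A[i]
            return x

        # Toy system standing in for a (projection matrix, measured sinogram) pair.
        A = np.array([[1.0, 1.0], [1.0, -1.0], [2.0, 1.0]])
        x_true = np.array([0.5, 2.0])
        print(kaczmarz(A, A @ x_true))  # converges toward x_true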

  9. Software reliability assessment

    International Nuclear Information System (INIS)

    Barnes, M.; Bradley, P.A.; Brewer, M.A.

    1994-01-01

    The increased usage and sophistication of computers applied to real time safety-related systems in the United Kingdom has spurred on the desire to provide a standard framework within which to assess dependable computing systems. Recent accidents and ensuing legislation have acted as a catalyst in this area. One particular aspect of dependable computing systems is that of software, which is usually designed to reduce risk at the system level, but which can increase risk if it is unreliable. Various organizations have recognized the problem of assessing the risk imposed to the system by unreliable software, and have taken initial steps to develop and use such assessment frameworks. This paper relates the approach of Consultancy Services of AEA Technology in developing a framework to assess the risk imposed by unreliable software. In addition, the paper discusses the experiences gained by Consultancy Services in applying the assessment framework to commercial and research projects. The framework is applicable to software used in safety applications, including proprietary software. Although the paper is written with Nuclear Reactor Safety applications in mind, the principles discussed can be applied to safety applications in all industries

  10. The Ettention software package

    Energy Technology Data Exchange (ETDEWEB)

    Dahmen, Tim, E-mail: Tim.Dahmen@dfki.de [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Saarland University, 66123 Saarbrücken (Germany); Marsalek, Lukas [Eyen SE, Na Nivách 1043/16, 141 00 Praha 4 (Czech Republic); Saarland University, 66123 Saarbrücken (Germany); Marniok, Nico [Saarland University, 66123 Saarbrücken (Germany); Turoňová, Beata [Saarland University, 66123 Saarbrücken (Germany); IMPRS-CS, Max-Planck Institute for Informatics, Campus E 1.4, 66123 Saarbrücken (Germany); Bogachev, Sviatoslav [Saarland University, 66123 Saarbrücken (Germany); Trampert, Patrick; Nickels, Stefan [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Slusallek, Philipp [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Saarland University, 66123 Saarbrücken (Germany)

    2016-02-15

    We present a novel software package for the problem “reconstruction from projections” in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block-iterative reconstruction method based on the Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and the eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. - Highlights: • Novel software package for “reconstruction from projections” in electron microscopy. • Support for high-resolution reconstructions on iterative reconstruction algorithms. • Support for CPU, GPU and Xeon Phi. • Integration in the IMOD software. • Platform for algorithm researchers: object oriented, modular design.

  11. The contribution of instrumentation and control software to system reliability

    International Nuclear Information System (INIS)

    Fryer, M.O.

    1984-01-01

    Advanced instrumentation and control systems are usually implemented using computers that monitor the instrumentation and issue commands to control elements. The control commands are based on instrument readings and software control logic. The reliability of the total system will be affected by the software design. When comparing software designs, an evaluation of how each design can contribute to the reliability of the system is desirable. Unfortunately, the science of reliability assessment of combined hardware and software systems is in its infancy. Reliability assessment of combined hardware/software systems is often based on over-simplified assumptions about software behavior. A new method of reliability assessment of combined software/hardware systems is presented. The method is based on a procedure called fault tree analysis which determines how component failures can contribute to system failure. Fault tree analysis is a well developed method for reliability assessment of hardware systems and produces quantitative estimates of failure probability based on component failure rates. It is shown how software control logic can be mapped into a fault tree that depicts both software and hardware contributions to system failure. The new method is important because it provides a way for quantitatively evaluating the reliability contribution of software designs. In many applications, this can help guide designers in producing safer and more reliable systems. An application to the nuclear power research industry is discussed
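
    The fault-tree evaluation that this method builds on can be sketched in a few lines (a hypothetical fragment with made-up failure probabilities, assuming independent basic events; the paper's contribution is the mapping of software control logic into such trees, not this arithmetic):

        from math import prod

        def and_gate(probs):
            """All inputs must fail (independent basic events)."""
            return prod(probs)

        def or_gate(probs):
            """At least one input fails (independent basic events)."""
            return 1.0 - prod(1.0 - p for p in probs)

        # Hypothetical fragment: the control command fails if the software
        # control-logic path fails OR both redundant output drivers fail.
        p_software_path = 1e-4   # software control-logic branch
        p_driver_a = 1e-3        # hardware output driver A
        p_driver_b = 1e-3        # hardware output driver B

        p_top = or_gate([p_software_path, and_gate([p_driver_a, p_driver_b])])
        print(f"top event probability = {p_top:.3e}")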

  12. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 14

    Science.gov (United States)

    1996-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  13. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 15

    Science.gov (United States)

    1997-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  14. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 13

    Science.gov (United States)

    1995-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  15. Software cost/resource modeling: Software quality tradeoff measurement

    Science.gov (United States)

    Lawler, R. W.

    1980-01-01

    A conceptual framework for treating software quality from a total system perspective is developed. Examples are given to show how system quality objectives may be allocated to hardware and software; to illustrate trades among quality factors, both hardware and software, to achieve system performance objectives; and to illustrate the impact of certain design choices on software functionality.

  16. Software architecture analysis tool : software architecture metrics collection

    NARCIS (Netherlands)

    Muskens, J.; Chaudron, M.R.V.; Westgeest, R.

    2002-01-01

    The Software Engineering discipline lacks the ability to evaluate software architectures. Here we describe a tool for software architecture analysis that is based on metrics. Metrics can be used to detect possible problems and bottlenecks in software architectures. Even though metrics do not give a

  17. Impact of Agile Software Development Model on Software Maintainability

    Science.gov (United States)

    Gawali, Ajay R.

    2012-01-01

    Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burden tightly budgeted information technology (IT) organizations. The agile software development approach delivers business value early, but its implications for software maintainability are still unknown. The purpose of this quantitative study…

  18. Belle II Software

    International Nuclear Information System (INIS)

    Kuhr, T; Ritter, M

    2016-01-01

    Belle II is a next generation B factory experiment that will collect 50 times more data than its predecessor, Belle. The higher luminosity at the SuperKEKB accelerator leads to higher background levels and requires a major upgrade of the detector. As a consequence, the simulation, reconstruction, and analysis software must also be upgraded substantially. Most of the software has been redesigned from scratch, taking into account the experience from Belle and other experiments and utilizing new technologies. The large amount of experimental and simulated data requires a high level of reliability and reproducibility, even in parallel environments. Several technologies, tools, and organizational measures are employed to evaluate and monitor the performance of the software during development. (paper)

  19. New Media as Software

    Directory of Open Access Journals (Sweden)

    Manuel Portela

    2014-03-01

    Review of Lev Manovich, Software Takes Command: Extending the Language of New Media. London: Bloomsbury, 2013, 358 pp. ISBN 978-1-6235-6817-7. In Lev Manovich’s most recent book, this programmatic interrogation of our medial condition leads to the following question: do media still exist after software? This is the question that triggers Manovich’s dialogue both with computing history and with theories of digital media of recent decades, including the extension of his own previous formulations in The Language of New Media, published in 2001, which became a major reference work in the field. The subtitle of the new book points precisely to this critical revisiting of his earlier work in the context of ubiquitous computing and the accelerated transcoding of social, cultural and artistic practices by software.

  20. LHCb software strategy

    CERN Document Server

    Van Herwijnen, Eric

    1998-01-01

    This document describes the software strategy of the LHCb experiment. The main objective is to reuse designs and code wherever possible; we will implement an architecturally driven design process; this architectural process will be implemented using Object Technology; we aim for platform independence; we will try to take advantage of distributed computing and will use industry standards, commercial software and profit from HEP developments; we will implement a common software process and development environment. One of the major problems that we are immediately faced with is the conversion of our current code from Fortran into an Object Oriented language and the conversion of our current developers to Object technology. Some technical terms related to OO programming are defined in Annex A.1

  1. Test af Software

    DEFF Research Database (Denmark)

    This document constitutes the final report of the network collaboration "Testnet", carried out in the period 1 April 2006 to 31 December 2008. The network deals primarily with topics within the testing of embedded and technical software, but a number of examples of problems and solutions connected with the testing of administrative software are also included. The report is divided into the following 3 parts: Overview. Here we give a summary of the network's purpose, activities and results. The state of the art of software testing is outlined. We mention that CISS and the network are taking new initiatives. The Network. Purpose, participants and topics treated at...

  2. ORNL's DCAL software package

    International Nuclear Information System (INIS)

    Eckerman, K.F.

    2007-01-01

    Oak Ridge National Laboratory has released its Dose and Risk Calculation software, DCAL. DCAL, developed with the support of the U.S. Environmental Protection Agency, consists of a series of computational modules, driven in either an interactive or a batch mode for computation of dose and risk coefficients from intakes of radionuclides or exposure to radionuclides in environmental media. The software package includes extensive libraries of biokinetic and dosimetric data that represent the current state of the art. The software has unique capability for addressing intakes of radionuclides by non-adults. DCAL runs as 32-bit extended DOS and console applications under Windows 98/NT/2000/XP. It is intended for users familiar with the basic elements of computational radiation dosimetry. Components of DCAL have been used to prepare U.S. Environmental Protection Agency's Federal Guidance Reports 12 and 13 and several publications of the International Commission on Radiological Protection. (author)

  3. Aircraft Design Software

    Science.gov (United States)

    1997-01-01

    Successful commercialization of the AirCraft SYNThesis (ACSYNT) tool has resulted in the creation of Phoenix Integration, Inc. ACSYNT has been exclusively licensed to the company, an outcome of a seven-year, $3 million effort to provide unique software technology to a focused design engineering market. Ames Research Center formulated ACSYNT and, working with the Virginia Polytechnic Institute CAD Laboratory, began to design and code a computer-aided design for ACSYNT. Using a Joint Sponsored Research Agreement, Ames formed an industry-government-university alliance to improve and foster research and development for the software. As a result of the ACSYNT Institute, the software is becoming a predominant tool for aircraft conceptual design. ACSYNT has been successfully applied to high-speed civil transport configuration, subsonic transports, and supersonic fighters.

  4. Lecture 2: Software Security

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    Computer security has been an increasing concern for IT professionals for a number of years, yet despite all the efforts, computer systems and networks remain highly vulnerable to attacks of different kinds. Design flaws and security bugs in the underlying software are among the main reasons for this. This lecture addresses the following question: how to create secure software? The lecture starts with a definition of computer security and an explanation of why it is so difficult to achieve. It then introduces the main security principles (like least-privilege, or defense-in-depth) and discusses security in different phases of the software development cycle. The emphasis is put on the implementation part: most common pitfalls and security bugs are listed, followed by advice on best practice for security development, testing and deployment. Sebastian Lopienski is CERN’s deputy Computer Security Officer. He works on security strategy and policies; offers internal consultancy and audit services; develops and ...

  5. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    This article is devoted to the problem of software (SW) verification. SW verification methods are designed to check software for compliance with stated requirements such as correctness, system security, adaptability to small changes in the environment, portability, compatibility, etc. The methods vary both in how they operate and in how they achieve their results. The article describes static and dynamic methods of software verification and pays particular attention to the method of symbolic execution. In its review of static analysis, the deductive method and model-checking techniques are discussed and described, and the pros and cons of each method are emphasized. The article considers a classification of test techniques for each method. We present and analyze the characteristics and mechanisms of static dependency analysis, as well as the kinds of dependencies, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either on different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, the elements of composite variables (structure fields, array elements), the sizes of heap areas, the lengths of strings, and the number of initialized array elements in code verified using static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of inference tools. Dynamic analysis methods such as testing, monitoring and profiling are presented and analyzed, along with some kinds of tools that can be applied to software when using dynamic analysis methods. Based on this work a conclusion is drawn, which describes the most relevant problems of analysis techniques, methods for their solution, and
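
    The symbolic execution method highlighted above can be illustrated with a toy sketch (not from the article): enumerate the feasible paths of a small function, record each path constraint, and find a concrete input satisfying it. Here a brute-force search stands in for the constraint solver a real tool would use:

        def program(x: int) -> str:
            if x > 10:
                if x % 2 == 0:
                    return "A"
                return "B"
            return "C"

        # One (path constraint, expected result) pair per execution path.
        paths = [
            ("x > 10 and x % 2 == 0", "A"),
            ("x > 10 and x % 2 != 0", "B"),
            ("x <= 10", "C"),
        ]

        for constraint, expected in paths:
            witness = None
            for x in range(-100, 100):   # brute force stands in for an SMT solver
                if eval(constraint):
                    witness = x
                    break
            assert program(witness) == expected
            print(f"path '{expected}': constraint [{constraint}], witness x = {witness}")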

  6. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan : ASC software quality engineering practices Version 3.0.

    Energy Technology Data Exchange (ETDEWEB)

    Turgeon, Jennifer L.; Minana, Molly A.; Hackney, Patricia; Pilch, Martin M.

    2009-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in the US Department of Energy/National Nuclear Security Agency (DOE/NNSA) Quality Criteria, Revision 10 (QC-1) as 'conformance to customer requirements and expectations'. This quality plan defines the SNL ASC Program software quality engineering (SQE) practices and provides a mapping of these practices to the SNL Corporate Process Requirement (CPR) 001.3.6, 'Corporate Software Engineering Excellence'. This plan also identifies ASC management's and the software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals. This SNL ASC Software Quality Plan establishes the signatories' commitments to improving software products by applying cost-effective SQE practices. This plan enumerates the SQE practices that comprise the development of SNL ASC's software products and explains the project teams' opportunities for tailoring and implementing the practices.

  7. A SOFTWARE RELIABILITY ESTIMATION METHOD TO NUCLEAR SAFETY SOFTWARE

    Directory of Open Access Journals (Sweden)

    GEE-YONG PARK

    2014-02-01

    A method for estimating software reliability for nuclear safety software is proposed in this paper. This method is based on the software reliability growth model (SRGM), where the behavior of software failure is assumed to follow a non-homogeneous Poisson process. Two types of modeling schemes based on a particular underlying method are proposed in order to more precisely estimate and predict the number of software defects based on very rare software failure data. Bayesian statistical inference is employed to estimate the model parameters by incorporating software test cases as a covariate into the model. It was identified that these models are capable of reasonably estimating the remaining number of software defects, which directly affects the reactor trip functions. The software reliability might be estimated from these modeling equations, and one approach to obtaining a software reliability value is proposed in this paper.
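
    As a sketch of the SRGM idea (hypothetical data; the paper's Bayesian scheme with test-case covariates is not reproduced here), a classic NHPP mean value function such as Goel-Okumoto, m(t) = a(1 - e^(-b t)), can be fit to cumulative defect counts, after which a - m(t_last) estimates the defects remaining:

        import numpy as np

        # Hypothetical cumulative defect counts n observed at test times t.
        t = np.array([10.0, 20.0, 30.0, 40.0, 50.0, 60.0])
        n = np.array([8.0, 14.0, 18.0, 21.0, 23.0, 24.0])

        def mean_value(time, a, b):
            """Goel-Okumoto NHPP mean value function m(t) = a * (1 - exp(-b t));
            a is the expected total defect count, b the detection rate."""
            return a * (1.0 - np.exp(-b * time))

        # Crude least-squares grid search (the paper uses Bayesian inference).
        best = (np.inf, None, None)
        for a in np.linspace(20.0, 60.0, 200):
            for b in np.linspace(0.005, 0.2, 200):
                sse = float(np.sum((mean_value(t, a, b) - n) ** 2))
                if sse < best[0]:
                    best = (sse, a, b)
        _, a_hat, b_hat = best
        print(f"a = {a_hat:.1f}, b = {b_hat:.3f}, "
              f"expected remaining defects = {a_hat - n[-1]:.1f}")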

  8. Agile software development

    CERN Document Server

    Stober, Thomas

    2009-01-01

    Software development is moving towards a more agile and more flexible approach. It turns out that the traditional 'waterfall' model is not supportive in an environment where technical, financial and strategic constraints are changing almost every day. But what is agility? What are today's major approaches? And especially: what is the impact of agile development principles on the development teams, on project management and on software architects? How can large enterprises become more agile and improve their business processes, which have existed for many, many years? What are the limit

  9. Software Testing as Science

    Directory of Open Access Journals (Sweden)

    Ingrid Gallesdic

    2013-06-01

    The most widespread opinion among people who have some connection with software testing is that this activity is an art. In fact, books have been widely published whose titles refer to it as an art, role or process. But because software complexity is increasing every year, this paper proposes a new approach, conceiving of testing as a science. This is because the processes by which tests are applied follow the steps of the scientific method: inputs, processes, outputs. This paper examines the similarities between testing and science and the characteristics of testing as a science.

  10. Provider software buyer's guide.

    Science.gov (United States)

    1994-03-01

    To help long term care providers find new ways to improve quality of care and efficiency, Provider magazine presents the fourth annual listing of software firms marketing computer programs for all areas of nursing facility operations. On the following five pages, more than 80 software firms display their wares, with programs such as minimum data set and care planning, dietary, accounting and financials, case mix, and medication administration records. The guide also charts compatible hardware, integration ability, telephone numbers, company contacts, and easy-to-use reader service numbers.

  11. Model of software quality

    OpenAIRE

    Valencia Ayala, Luz Estela; Villa Sánchez, Paula Andréa; Ocampo S., Carlos Alberto

    2009-01-01

    In a globalized market where companies must innovate and continually improve in order to grow and become more competitive, it is necessary to have access to international quality certifications that back them and allow them to stay in this market. Quality certifications in the software industry help companies become more productive by reducing the cost and time of their development work. The software development companies of our country are mostly micro and small...

  12. Security System Software

    Science.gov (United States)

    1993-01-01

    C Language Integrated Production System (CLIPS), a NASA-developed expert systems program, has enabled a security systems manufacturer to design a new generation of hardware. C.CURESystem 1 Plus, manufactured by Software House, is a software-based system that is used with a variety of access control hardware at installations around the world. Users can manage large amounts of information, solve unique security problems and control entry and time scheduling. CLIPS acts as an information management tool when accessed by C.CURESystem 1 Plus. It asks questions about the hardware and, when given the answers, recommends quick solutions that can be implemented by non-expert persons.

  13. Software product quality control

    CERN Document Server

    Wagner, Stefan

    2013-01-01

    Quality is not a fixed or universal property of software; it depends on the context and goals of its stakeholders. Hence, when you want to develop a high-quality software system, the first step must be a clear and precise specification of quality. Yet even if you get it right and complete, you can be sure that it will become invalid over time. So the only solution is continuous quality control: the steady and explicit evaluation of a product's properties with respect to its updated quality goals. This book guides you in setting up and running continuous quality control in your environment. Star

  14. Software Safety and Security

    CERN Document Server

    Nipkow, T; Hauptmann, B

    2012-01-01

    Recent decades have seen major advances in methods and tools for checking the safety and security of software systems. Automatic tools can now detect security flaws not only in programs of the order of a million lines of code, but also in high-level protocol descriptions. There has also been something of a breakthrough in the area of operating system verification. This book presents the lectures from the NATO Advanced Study Institute on Tools for Analysis and Verification of Software Safety and Security; a summer school held at Bayrischzell, Germany, in 2011. This Advanced Study Institute was

  15. Maintenance simulation: Software issues

    Energy Technology Data Exchange (ETDEWEB)

    Luk, C.H.; Jette, M.A.

    1995-07-01

    The maintenance of a distributed software system in a production environment involves: (1) maintaining software integrity, (2) maintaining database integrity, (3) adding new features, and (4) adding new systems. These issues will be discussed in general: what they are and how they are handled. This paper will present our experience with a distributed resource management system that accounts for resources consumed, in real-time, on a network of heterogeneous computers. The simulated environments used to maintain this system will be presented as they relate to the four maintenance areas.

  16. Processeringsoptimering med Canons software

    DEFF Research Database (Denmark)

    Precht, Helle

    2009-01-01

    . The possibilities of software optimization were studied in relation to optimal image quality and control exposures, to examine whether it was possible to accept diagnostic image quality and thereby take ALARA as the starting point. Methods and materials: a quantitative experimental study based on trials with a technical and...... a human phantom. The CD Rad phantom was used as the technical phantom, with the images analyzed using the CD Rad software to yield an objective IQF value. The human phantom was a lamb pelvis with femur, which according to the NRPB is comparable to the absorption of a five-year-old child. The human trial images were...

  17. Agile distributed software development

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars; Aaen, Ivan

    2012-01-01

    While face-to-face interaction is fundamental in agile software development, distributed environments must rely extensively on mediated interactions. Practicing agile principles in distributed environments therefore poses particular control challenges related to balancing fixed vs. evolving quality...... requirements and people vs. process-based collaboration. To investigate these challenges, we conducted an in-depth case study of a successful agile distributed software project with participants from a Russian firm and a Danish firm. Applying Kirsch’s elements of control framework, we offer an analysis of how...

  18. Six Sigma software development

    CERN Document Server

    Tayntor, Christine B

    2002-01-01

    Since Six Sigma has had marked success in improving quality in other settings, and since the quality of software remains poor, it seems a natural evolution to apply the concepts and tools of Six Sigma to system development and the IT department. Until now, however, there were no books available that applied these concepts to the system development process. Six Sigma Software Development fills this void and illustrates how Six Sigma concepts can be applied to all aspects of the evolving system development process. It includes the traditional waterfall model and the support of legacy systems,

  19. Inventory of safeguards software

    International Nuclear Information System (INIS)

    Suzuki, Mitsutoshi; Horino, Koichi

    2009-03-01

    This survey activity will serve as a basis for determining what needs may exist in this arena for the development of next-generation safeguards systems and approaches. 23 software tools are surveyed by JAEA and NMCC. Exchanging information regarding existing software tools for safeguards and discussing a next R&D program for developing a general-purpose safeguards tool should be beneficial to safeguards system design and indispensable for evaluating a safeguards system for future nuclear fuel facilities. (author)

  20. Machine Tool Software

    Science.gov (United States)

    1988-01-01

    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using Automatically Programmed Tool (APT) software since 1969 in his CAD/CAM (Computer Aided Design and Manufacturing) curriculum. Professor Hack teaches the use of APT programming languages for control of metal cutting machines. Machine tool instructions are geometry definitions written in APT Language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  1. Green in software engineering

    CERN Document Server

    Calero Munoz, Coral

    2015-01-01

    This is the first book that presents a comprehensive overview of sustainability aspects in software engineering. Its format follows the structure of the SWEBOK and covers the key areas involved in the incorporation of green aspects in software engineering, encompassing topics from requirement elicitation to quality assurance and maintenance, while also considering professional practices and economic aspects. The book consists of thirteen chapters, which are structured in five parts. First the "Introduction" gives an overview of the primary general concepts related to Green IT, discussing wha

  2. Web mapping: tools and solutions for creating interactive maps of forestry interest

    Directory of Open Access Journals (Sweden)

    Notarangelo G

    2011-12-01

    Full Text Available The spread of geobrowsers as tools for displaying geographically referenced information provides insights and opportunities to those who, not being specialists in Geographic Information Systems, want to take advantage of the exploration and communication power offered by this software. Through the use of web services such as Google Maps and the use of suitable markup languages, one can create interactive maps starting from highly heterogeneous data and information. These interactive maps can also be easily distributed and shared with Internet users, because they do not require proprietary software or special skills but only a web browser. Unlike maps created with GIS, whose output is usually a static image, interactive maps retain all their features to the users' advantage. This paper describes a web application that, using the Keyhole Markup Language and the free service of Google Maps, produces choropleth maps for some forest indicators estimated by the last Italian National Forest Inventory. The creation of a map is done through a simple and intuitive interface. The maps created by users can be downloaded as KML files and can be viewed or modified via the freeware application Google Earth or free and open source GIS software like Quantum GIS. The web application is free and available at www.ricercaforestale.it.
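
    The kind of KML generation such an application performs can be sketched in a few lines of Python. This is a minimal sketch only: the region name, coordinates, indicator value, and output file name below are hypothetical, and a real choropleth map would emit one styled polygon per inventory region.

```python
# Minimal sketch of generating a KML polygon colored by an indicator value,
# the core of a choropleth map viewable in Google Earth or Quantum GIS.
# Region name, coordinates, color, and value are hypothetical.

def kml_region(name, coords, abgr_color):
    # KML expects "lon,lat,alt" triplets; colors are aabbggrr hex strings.
    ring = " ".join(f"{lon},{lat},0" for lon, lat in coords)
    return (f"<Placemark><name>{name}</name>"
            f"<Style><PolyStyle><color>{abgr_color}</color></PolyStyle></Style>"
            f"<Polygon><outerBoundaryIs><LinearRing>"
            f"<coordinates>{ring}</coordinates>"
            f"</LinearRing></outerBoundaryIs></Polygon></Placemark>")

placemark = kml_region(
    "Region A (forest cover 62%)",
    [(12.0, 43.0), (12.5, 43.0), (12.5, 43.4), (12.0, 43.4), (12.0, 43.0)],
    "7f00aa00",  # semi-transparent green
)
doc = ('<?xml version="1.0" encoding="UTF-8"?>\n'
       '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>\n'
       f"{placemark}\n"
       "</Document></kml>\n")

with open("forest_indicator.kml", "w") as f:
    f.write(doc)
```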

  3. Projective mapping

    DEFF Research Database (Denmark)

    Dehlholm, Christian; Brockhoff, Per B.; Bredie, Wender Laurentius Petrus

    2012-01-01

    by the practical testing environment. As a result of the changes, a reasonable assumption would be to question the consequences caused by the variations in method procedures. Here, the aim is to highlight the proven or hypothetical consequences of variations of Projective Mapping. Presented variations will include...... instructions and influence heavily the product placements and the descriptive vocabulary (Dehlholm et al., 2012b). The type of assessors performing the method influences results with an extra aspect in Projective Mapping compared to more analytical tests, as the given spontaneous perceptions are much dependent......Projective Mapping (Risvik et al., 1994) and its Napping (Pagès, 2003) variations have become increasingly popular in the sensory field for rapid collection of spontaneous product perceptions. It has been applied in variations which sometimes are caused by the purpose of the analysis and sometimes

  4. Model-driven software engineering

    NARCIS (Netherlands)

    Amstel, van M.F.; Brand, van den M.G.J.; Protic, Z.; Verhoeff, T.; Hamberg, R.; Verriet, J.

    2014-01-01

    Software plays an important role in designing and operating warehouses. However, traditional software engineering methods for designing warehouse software are not able to cope with the complexity, size, and increase of automation in modern warehouses. This chapter describes Model-Driven Software

  5. Package-based software development

    NARCIS (Netherlands)

    Jonge, de M.; Chroust, G.; Hofer, C.

    2003-01-01

    The main goal of component-based software engineering is to decrease development time and development costs of software systems, by reusing prefabricated building blocks. Here we focus on software reuse within the implementation of such component-based applications, and on the corresponding software

  6. The fallacy of Software Patents

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Software patents are usually presented as an argument for innovation, but do they really promote innovation? Who really benefits from software patents? This talk attempts to show the problems with software patents and how they can actually harm innovation, having little value for software users and our society in general.

  7. A methodology for software documentation

    OpenAIRE

    Torres Júnior, Roberto Dias; Ahlert, Hubert

    2000-01-01

    With the growing complexity of window-based software and the use of object orientation, the development of software is getting more complex than ever. Based on that, this article intends to present a methodology for software documentation and to analyze our experience and how this methodology can aid software maintenance

  8. Affective Maps

    DEFF Research Database (Denmark)

    Salovaara-Moring, Inka

    . In particular, mapping environmental damage, endangered species, and human-made disasters has become one of the focal points of affective knowledge production. These ‘more-than-human geographies’ practices include notions of species, space and territory, and movement towards a new political ecology. This type...... of digital cartographies has been highlighted as the ‘processual turn’ in critical cartography, whereas in related computational journalism it can be seen as an interactive and iterative process of mapping complex and fragile ecological developments. This paper looks at computer-assisted cartography as part...

  9. CDC field mapping device - ''ROTOTRACK''

    International Nuclear Information System (INIS)

    Yamada, R.; Hawtree, J.; Kaczar, K.; Leverence, R.; McGuire, K.; Newman-Holmes, C.; Schmidt, E.E.; Shallenberger, J.

    1985-10-01

    A field mapping device for the magnet of the Collider Detector at Fermilab (CDF) was constructed. The device was used for extensive study of the CDF magnetic field distribution. The mechanical and electrical features of the device, as well as the data acquisition system and software, are described. The mechanical system was designed so that the errors on the position and angle of the probe were ±0.75 mm and ±1 mrad, respectively.

  10. a comparative survey on mind mapping tools

    Directory of Open Access Journals (Sweden)

    Avgoustos A. TSINAKOS

    2009-07-01

    Full Text Available Mind Mapping is an important technique that improves the way you take notes and enhances your creative problem solving. By using Mind Maps, you can quickly identify and understand the structure of a subject and the way that pieces of information fit together, as well as recording the raw facts contained in normal notes. It can also be used as a complementary tool for knowledge construction and sharing. Their suitability as a pedagogical tool for education, e-learning and training increases their importance. Also, in a world of information overload and businesses struggling to keep up with the pace of change, knowledge workers need effective tools to organize, analyze, brainstorm and collaborate on ideas. In recent years, a wide variety of mind mapping software tools have been developed. A question that often comes up, due to this plethora of software tools, is “which is the best mind mapping software?” Anyone who gives you an immediate answer either knows you and your mind mapping activities very well or their answer is not worth a lot. The “best” depends very much on how you use mind maps. In this paper we investigate different user profiles, identify various axes of comparison among mind mapping tools suitable for a specific user profile, describe each axis, and then analyze each tool.

  11. Software testing concepts and operations

    CERN Document Server

    Mili, Ali

    2015-01-01

    Explores and identifies the main issues, concepts, principles and evolution of software testing, including software quality engineering and testing concepts, test data generation, test deployment analysis, and software test management. This book examines the principles, concepts, and processes that are fundamental to the software testing function. This book is divided into five broad parts. Part I introduces software testing in the broader context of software engineering and explores the qualities that testing aims to achieve or ascertain, as well as the lifecycle of software testing. Part II c

  12. Patterns for Parallel Software Design

    CERN Document Server

    Ortega-Arjona, Jorge Luis

    2010-01-01

    Essential reading to understand patterns for parallel programming Software patterns have revolutionized the way we think about how software is designed, built, and documented, and the design of parallel software requires you to consider other particular design aspects and special skills. From clusters to supercomputers, success heavily depends on the design skills of software developers. Patterns for Parallel Software Design presents a pattern-oriented software architecture approach to parallel software design. This approach is not a design method in the classic sense, but a new way of managin

  13. The Art of Software Testing

    CERN Document Server

    Myers, Glenford J; Badgett, Tom

    2011-01-01

    The classic, landmark work on software testing The hardware and software of computing have changed markedly in the three decades since the first edition of The Art of Software Testing, but this book's powerful underlying analysis has stood the test of time. Whereas most books on software testing target particular development techniques, languages, or testing methods, The Art of Software Testing, Third Edition provides a brief but powerful and comprehensive presentation of time-proven software testing approaches. If your software development project is mission critical, this book is an investme

  14. What Counts in Software Process?

    DEFF Research Database (Denmark)

    Cohn, Marisa

    2009-01-01

    and conversations in negotiating between prescriptions from a model and the contingencies that arise in an enactment. A qualitative field study at two Agile software development companies was conducted to investigate the role of artifacts in the software development work and the relationship between these artifacts...... and the Software Process. Documentation of software requirements is a major concern among software developers and software researchers. Agile software development denotes a different relationship to documentation, one that warrants investigation. Empirical findings are presented which suggest a new understanding...

  15. Software for noise measurements

    International Nuclear Information System (INIS)

    Zyryanov, V.A.

    1987-01-01

    The CURS program library, comprising 38 Fortran programs designed for processing discrete experimental data in the form of random or deterministic periodic processes, is described. The library is based on the modular construction principle, which allows one to build from it any set of programs to solve tasks related to NPP operation and to develop special software

  16. Software complex "remember me"

    OpenAIRE

    Kosheutova, N. V.; Osina, P. M.

    2016-01-01

    The article describes the importance of time management and effective planning in modern society and is devoted to the development of an Android OS application. It points out the main features of a mobile application, such as cross-platform capability and synchronization. Much attention is given to the software architecture as well as to user data protection via password hashing methods.
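
    The abstract does not specify which hashing scheme the application uses; the sketch below shows one standard approach to salted password hashing with a key-derivation function, using only the Python standard library, as a minimal illustration of the idea.

```python
# Minimal sketch of salted password hashing with a key-derivation function.
# The app's actual scheme is not specified in the abstract; PBKDF2 here is
# just one standard choice.
import hashlib
import hmac
import os

def hash_password(password, iterations=200_000):
    salt = os.urandom(16)  # unique random salt per user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password, salt, digest, iterations=200_000):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

salt, digest = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, digest)
assert not verify_password("wrong password", salt, digest)
```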

  17. Software management issues

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1990-06-01

    The difficulty of managing the software in large HEP collaborations appears to becoming progressively worst with each new generation of detector. If one were to extrapolate to the SSC, it will become a major problem. This paper explores the possible causes of the difficulty and makes suggestions on what corrective actions should be taken

  18. Software Geometry in Simulations

    Science.gov (United States)

    Alion, Tyler; Viren, Brett; Junk, Tom

    2015-04-01

    The Long Baseline Neutrino Experiment (LBNE) involves many detectors. The experiment's near detector (ND) facility may ultimately involve several detectors. The far detector (FD) will be significantly larger than any other Liquid Argon (LAr) detector yet constructed; many prototype detectors are being constructed and studied to motivate a plethora of proposed FD designs. Whether it be a constructed prototype or a proposed ND/FD design, every design must be simulated and analyzed. This presents a considerable challenge to LBNE software experts; each detector geometry must be described to the simulation software in an efficient way which allows multiple authors to collaborate easily. Furthermore, different geometry versions must be tracked throughout their use. We present a framework called General Geometry Description (GGD), written and developed by LBNE software collaborators for managing software to generate geometries. Though GGD is flexible enough to be used by any experiment working with detectors, we present its first use in generating Geometry Description Markup Language (GDML) files to interface with LArSoft, a framework of detector simulations, event reconstruction, and data analyses written for all LAr technology users at Fermilab. Brett is the author of the framework discussed here, the General Geometry Description (GGD).

  19. Software configuration management

    International Nuclear Information System (INIS)

    Arribas Peces, E.; Martin Faraldo, P.

    1993-01-01

    Software Configuration Management is directed towards identifying system configuration at specific points of its life cycle, so as to control changes to the configuration and to maintain the integrity and traceability of the configuration throughout its life. SCM functions and tasks are presented in the paper

  20. Patterns in Software Development

    DEFF Research Database (Denmark)

    Corry, Aino Vonge

    the university and I entered a project to industry within Center for Object Technology (COT). I focused on promoting the pattern concept to the Danish software industry in order to help them take advantage of the benefits of applying patterns in system development. In the obligatory stay abroad, I chose to visit...

  1. Open Source Software Acquisition

    DEFF Research Database (Denmark)

    Holck, Jesper; Kühn Pedersen, Mogens; Holm Larsen, Michael

    2005-01-01

    Lately we have seen a growing interest from both public and private organisations to adopt Open Source Software (OSS), not only for a few, specific applications but also on a more general level throughout the organisation. As a consequence, the organisations' decisions on adoption of OSS are becoming...

  2. SEER*Stat Software

    Science.gov (United States)

    If you have access to SEER Research Data, use SEER*Stat to analyze SEER and other cancer-related databases. View individual records and produce statistics including incidence, mortality, survival, prevalence, and multiple primary. Tutorials and related analytic software tools are available.

  3. Improving Agile Software Practice

    DEFF Research Database (Denmark)

    Tjørnehøj, Gitte

    2006-01-01

    Software process improvement in small and agile organizations is often problematic, but achieving good SPI assessments can still be necessary to stay in the market or to meet the demands of multinational owners. The traditional norm-driven, centralized and control-centered improvement approaches have...

  4. Software Defined Coded Networking

    DEFF Research Database (Denmark)

    Di Paola, Carla; Roetter, Daniel Enrique Lucani; Palazzo, Sergio

    2017-01-01

    the quality of each link and even across neighbouring links, using simulations to show that an additional reduction of packet transmissions on the order of 40% is possible. Second, to advocate for the use of network coding (NC) jointly with software defined networking (SDN), providing an implementation...

  5. MOCASSIN-prot software

    Science.gov (United States)

    MOCASSIN-prot is a software package, implemented in Perl and Matlab, for constructing protein similarity networks to classify proteins. Both domain composition and quantitative sequence similarity information are utilized in constructing the directed protein similarity networks. For each reference protein i

  6. Writing testable software requirements

    Energy Technology Data Exchange (ETDEWEB)

    Knirk, D. [Sandia National Labs., Albuquerque, NM (United States)

    1997-11-01

    This tutorial identifies common problems in analyzing problem requirements and constructing a written specification of what the software is to do. It deals with two main problem areas: identifying and describing problem requirements, and analyzing and describing behavior specifications.

  7. Green Software Products

    NARCIS (Netherlands)

    Jagroep, E.A.

    2017-01-01

    The rising energy consumption of the ICT industry has triggered a quest for more green, energy efficient ICT solutions. The role of software as the true consumer of power and its potential contribution to reach sustainability goals has increasingly been acknowledged. At the same time, it is shown to

  8. Iterative software kernels

    Energy Technology Data Exchange (ETDEWEB)

    Duff, I.

    1994-12-31

    This workshop focuses on kernels for iterative software packages. Specifically, the three speakers discuss various aspects of sparse BLAS kernels. Their topics are: 'Current status of user level sparse BLAS'; 'Current status of the sparse BLAS toolkit'; and 'Adding matrix-matrix and matrix-matrix-matrix multiply to the sparse BLAS toolkit'.
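
    A user-level sparse kernel of the kind discussed can be illustrated with a compressed sparse row (CSR) matrix-vector multiply. The sketch below is a generic illustration of such a kernel, not code from the sparse BLAS toolkit itself.

```python
# Minimal sketch of a user-level sparse kernel: CSR matrix-vector multiply,
# y = A @ x. Generic illustration, not code from the sparse BLAS toolkit.
import numpy as np

def csr_matvec(data, indices, indptr, x):
    # data: nonzero values; indices: their column indices;
    # indptr[i]:indptr[i+1] delimits the nonzeros of row i.
    y = np.zeros(len(indptr) - 1)
    for i in range(len(y)):
        start, end = indptr[i], indptr[i + 1]
        y[i] = np.dot(data[start:end], x[indices[start:end]])
    return y

# A = [[4, 0, 1],
#      [0, 3, 0],
#      [2, 0, 5]] stored in CSR form:
data = np.array([4.0, 1.0, 3.0, 2.0, 5.0])
indices = np.array([0, 2, 1, 0, 2])
indptr = np.array([0, 2, 3, 5])

print(csr_matvec(data, indices, indptr, np.array([1.0, 2.0, 3.0])))  # [ 7.  6. 17.]
```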

  9. Flight Software Math Library

    Science.gov (United States)

    McComas, David

    2013-01-01

    The flight software (FSW) math library is a collection of reusable math components that provides typical math utilities required by spacecraft flight software. These utilities are intended to increase flight software quality, reusability, and maintainability by providing a set of consistent, well-documented, and tested math utilities. This library only has dependencies on ANSI C, so it is easily ported. Prior to this library, each mission typically created its own math utilities using ideas/code from previous missions. Part of the reason for this is that math libraries can be written with different strategies in areas like error handling, parameter orders, naming conventions, etc. Changing the utilities for each mission introduces risks and costs. The obvious risks and costs are that the utilities must be coded and revalidated. The hidden risks and costs arise in miscommunication between engineers. These utilities must be understood by both the flight software engineers and other subsystem engineers (primarily guidance, navigation and control). The FSW math library is part of a larger goal to produce a library of reusable Guidance Navigation and Control (GN&C) FSW components. A GN&C FSW library cannot be created unless a standardized math basis is created. This library solves the standardization problem by defining a common feature set and establishing policies for the library's design. This allows the libraries to be maintained with the same strategy used in their initial development, which supports a library of reusable GN&C FSW components. The FSW math library is written for an embedded software environment in C. This places restrictions on the language features that can be used by the library. Another advantage of the FSW math library is that it can be used in the FSW as well as in other environments, like the GN&C analysts' simulators. This helps communication between the teams because they can use the same utilities with the same feature set and syntax.

  10. Sandia National Laboratories ASCI Applications Software Quality Engineering Practices; TOPICAL

    International Nuclear Information System (INIS)

    ZEPPER, JOHN D.; ARAGON, KATHRYN MARY; ELLIS, MOLLY A.; BYLE, KATHLEEN A.; EATON, DONNA SUE

    2002-01-01

    This document provides a guide to the deployment of the software verification activities, software engineering practices, and project management principles that guide the development of Accelerated Strategic Computing Initiative (ASCI) applications software at Sandia National Laboratories (Sandia). The goal of this document is to identify practices and activities that will foster the development of reliable and trusted products produced by the ASCI Applications program. Document contents include an explanation of the structure and purpose of the ASCI Quality Management Council, an overview of the software development lifecycle, an outline of the practices and activities that should be followed, and an assessment tool. These sections map practices and activities at Sandia to the ASCI Software Quality Engineering: Goals, Principles, and Guidelines, a Department of Energy document

  11. TWRS engineering bibliography software listing

    International Nuclear Information System (INIS)

    Husa, E.I.

    1995-01-01

    This document contains the computer software listing for the Engineering Bibliography software, developed by E. Ivar Husa. This software is in the working prototype stage of development. The code has not been tested to requirements. TWRS Engineering created this software for engineers to share bibliographic references across the Hanford site network (HLAN). This software is intended to store several hundred to several thousand references (a compendium with limited range). Code changes are needed to support a larger number of references

  12. Interface-based software testing

    OpenAIRE

    Aziz Ahmad Rais

    2016-01-01

    Software quality is determined by assessing the characteristics that specify how it should work, which are verified through testing. If it were possible to touch, see, or measure software, it would be easier to analyze and prove its quality. Unfortunately, software is an intangible asset, which makes testing complex. This is especially true when software quality is not a question of particular functions that can be tested through a graphical user interface. The primary objective of softwar...

  13. Self-assembling software generator

    Science.gov (United States)

    Bouchard, Ann M [Albuquerque, NM; Osbourn, Gordon C [Albuquerque, NM

    2011-11-25

    A technique to generate an executable task includes inspecting a task specification data structure to determine what software entities are to be generated to create the executable task, inspecting the task specification data structure to determine how the software entities will be linked after generating the software entities, inspecting the task specification data structure to determine logic to be executed by the software entities, and generating the software entities to create the executable task.
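
    The abstract describes inspecting a task specification data structure to determine which software entities to generate, how to link them, and what logic they execute. A toy interpretation of that flow is sketched below; the spec format, entity names, and chain-style linking are hypothetical readings of the abstract, not the patented mechanism.

```python
# Toy sketch of generating an executable task from a task specification data
# structure. The spec format, entity names, and linking scheme are hypothetical
# interpretations of the abstract, not the patented mechanism.
task_spec = {
    "entities": {"reader": "load", "doubler": "double", "printer": "emit"},
    "links": [("reader", "doubler"), ("doubler", "printer")],
}

LOGIC = {  # logic each generated entity will execute
    "load": lambda _: [1, 2, 3],
    "double": lambda xs: [2 * x for x in xs],
    "emit": lambda xs: print(xs) or xs,
}

def generate_task(spec):
    # 1) Inspect the spec to determine which entities to generate.
    entities = {name: LOGIC[kind] for name, kind in spec["entities"].items()}
    # 2) Inspect the spec to determine how the entities are linked
    #    (a simple linear chain is assumed here).
    order = [src for src, _ in spec["links"]] + [spec["links"][-1][1]]
    # 3) Return the assembled executable task.
    def task():
        value = None
        for name in order:
            value = entities[name](value)
        return value
    return task

generate_task(task_spec)()  # prints [2, 4, 6]
```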

  14. Product-oriented Software Certification Process for Software Synthesis

    Science.gov (United States)

    Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil

    2004-01-01

    The purpose of this document is to propose a product-oriented software certification process to facilitate use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until deemed bug-free rather than proving that certain software properties exist. This approach has worked well in most cases, but unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics like set theory, automata theory and formal logic as opposed to continuous mathematics like calculus) and software synthesis, it is possible to reduce this risk by proving certain software properties. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle resulting in a more streamlined and accurate development process.

  15. Energetic map

    International Nuclear Information System (INIS)

    2012-01-01

    This report explains the energetic map of Uruguay as well as the different systems that delimit political frontiers in the region. The importance of the electrical system lies in the study of the potential of electricity, oil and derivatives, natural gas, biofuels, and wind and solar energy

  16. Necklace maps

    NARCIS (Netherlands)

    Speckmann, B.; Verbeek, K.A.B.

    2010-01-01

    Statistical data associated with geographic regions is nowadays globally available in large amounts and hence automated methods to visually display these data are in high demand. There are several well-established thematic map types for quantitative data on the ratio-scale associated with regions:

  17. Participatory maps

    DEFF Research Database (Denmark)

    Salovaara-Moring, Inka

    towards a new political ecology. This type of digital cartographies has been highlighted as the ‘processual turn’ in critical cartography, whereas in related computational journalism it can be seen as an interactive and iterative process of mapping complex and fragile ecological developments. This paper...

  18. Interface-based software testing

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2016-10-01

    Full Text Available Software quality is determined by assessing the characteristics that specify how it should work, which are verified through testing. If it were possible to touch, see, or measure software, it would be easier to analyze and prove its quality. Unfortunately, software is an intangible asset, which makes testing complex. This is especially true when software quality is not a question of particular functions that can be tested through a graphical user interface. The primary objective of software architecture is to design software quality through modeling and visualization. There are many methods and standards that define how to control and manage quality. However, many IT software development projects still fail due to the difficulties involved in measuring, controlling, and managing software quality. Software quality failure factors are numerous. Examples include beginning to test software too late in the development process, or failing to properly understand, or design, the software architecture and the software component structure. The goal of this article is to provide an interface-based software testing technique that better measures software quality, automates software quality testing, encourages early testing, and increases the software's overall testability.
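
    One way to read "interface-based testing" is to write a contract test once against an abstract interface, so that every implementation is exercised by the same suite. The sketch below illustrates that reading under stated assumptions; the interface and implementation are invented for illustration and are not taken from the article.

```python
# Minimal sketch of interface-based testing: the contract test is written once
# against the interface, then run against every implementation. The interface
# and the implementation below are hypothetical, not from the article.
from abc import ABC, abstractmethod

class KeyValueStore(ABC):
    @abstractmethod
    def put(self, key, value): ...
    @abstractmethod
    def get(self, key): ...

class DictStore(KeyValueStore):
    def __init__(self):
        self._d = {}
    def put(self, key, value):
        self._d[key] = value
    def get(self, key):
        return self._d.get(key)

def check_store_contract(store):
    # Contract test: any implementation of the interface must pass these checks.
    assert store.get("missing") is None
    store.put("k", "v1")
    store.put("k", "v2")          # overwrite semantics
    assert store.get("k") == "v2"

check_store_contract(DictStore())  # run the same suite on each implementation
```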

  19. Evaluating Predictive Models of Software Quality

    Science.gov (United States)

    Ciaschini, V.; Canaparo, M.; Ronchieri, E.; Salomoni, D.

    2014-06-01

    Applications from the High Energy Physics scientific community are constantly growing and implemented by a large number of developers. This implies a strong churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this. Stability and predictability are of paramount importance; in addition, a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposite foci is to use a software quality model to obtain an approximation of the risk before releasing a program, so as to only deliver software with a risk lower than an agreed threshold. In this article we evaluated two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent of discovering the risk factor of each product and comparing it with its real history. We attempted to determine whether the models reasonably map reality for the applications under evaluation, and finally we concluded by suggesting directions for further studies.

  20. Evaluating predictive models of software quality

    International Nuclear Information System (INIS)

    Ciaschini, V; Canaparo, M; Ronchieri, E; Salomoni, D

    2014-01-01

    Applications from the High Energy Physics scientific community are constantly growing and implemented by a large number of developers. This implies a strong churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this. Stability and predictability are of paramount importance; in addition, a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposite foci is to use a software quality model to obtain an approximation of the risk before releasing a program, so as to only deliver software with a risk lower than an agreed threshold. In this article we evaluated two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent of discovering the risk factor of each product and comparing it with its real history. We attempted to determine whether the models reasonably map reality for the applications under evaluation, and finally we concluded by suggesting directions for further studies.
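
    The release gate these records describe, shipping only when the model's estimated risk is below an agreed threshold, can be sketched as follows. The metric names, weights, and threshold are invented for illustration; they are not the models evaluated in the article.

```python
# Toy sketch of the release gate described above: estimate a risk score from
# code metrics and only release when it is below an agreed threshold.
# Metric names, weights, and threshold are invented, not the evaluated models.
import math

WEIGHTS = {"churn": 0.8, "complexity": 0.5, "past_faults": 1.2}
BIAS = -3.0
RISK_THRESHOLD = 0.3

def release_risk(metrics):
    z = BIAS + sum(WEIGHTS[k] * v for k, v in metrics.items())
    return 1.0 / (1.0 + math.exp(-z))  # logistic link -> probability-like score

def may_release(metrics):
    return release_risk(metrics) < RISK_THRESHOLD

print(may_release({"churn": 0.4, "complexity": 1.1, "past_faults": 0.2}))  # True
print(may_release({"churn": 2.5, "complexity": 2.0, "past_faults": 1.5}))  # False
```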

  1. GPS Software Packages Deliver Positioning Solutions

    Science.gov (United States)

    2010-01-01

    "To determine a spacecraft s position, the Jet Propulsion Laboratory (JPL) developed an innovative software program called the GPS (global positioning system)-Inferred Positioning System and Orbit Analysis Simulation Software, abbreviated as GIPSY-OASIS, and also developed Real-Time GIPSY (RTG) for certain time-critical applications. First featured in Spinoff 1999, JPL has released hundreds of licenses for GIPSY and RTG, including to Longmont, Colorado-based DigitalGlobe. Using the technology, DigitalGlobe produces satellite imagery with highly precise latitude and longitude coordinates and then supplies it for uses within defense and intelligence, civil agencies, mapping and analysis, environmental monitoring, oil and gas exploration, infrastructure management, Internet portals, and navigation technology."

  2. Redrawing the solar map of South Africa for photovoltaic applications

    Energy Technology Data Exchange (ETDEWEB)

    Munzhedzi, R.; Sebitosi, A.B. [Electrical Engineering, University of Cape Town, Private Bag, Rm 522.2 Menzies Building, Rondebosch 7701, Cape Town (South Africa)

    2009-01-15

    The South African solar map has been redrawn to make it applicable to photovoltaic installations. This has been done with the aim of reducing the cost of solar PV installations in South Africa through accurate energy resource assessment and competent system design. Climate data software as well as solar design software were used to aid this process. The new map provides an alternative to the map in current use, which only considers radiation, whereas many more factors affect the output of a panel, such as wind, cloud cover and humidity. All of these are taken into account in drawing the new map. (author)

  3. Digital Mapping and Land Information Systems - Volume 6

    DEFF Research Database (Denmark)

    Frederiksen, Poul

    1998-01-01

    Introduction of digital mapping techniques in the 28 counties of Latvia related to the offices of the national mapping agency (State Land Service). Major components are: Training of regional staff, procurement of hard- and software, training of technical staff from State Land Service, HQ. Develop...

  4. Research on Topographic Map Updating

    Directory of Open Access Journals (Sweden)

    Ivana Javorović

    2013-04-01

    Full Text Available The investigation of the interpretability of the panchromatic satellite image IRS-1C integrated with the multispectral Landsat TM image, with the purpose of updating the topographic map sheet at the scale of 1:25 000, is described. The geocoding of the source map was based on trigonometric points of the map sheet. Satellite images were geocoded using control points selected from the map. The contents of the map were vectorized and a topographic database designed. Digital image processing improved the interpretability of the images. Then, the vectorization of new contents was made. Change detection for the forest and water areas was performed using unsupervised classification of spatially and spectrally merged images. Verification of the results was made using corresponding aerial photographs. Although this methodology could not ensure the complete updating of the topographic map at the scale of 1:25 000, the database has been updated with a huge amount of data. Erdas Imagine 8.3 software was used.

  5. A software engineering process for safety-critical software application

    International Nuclear Information System (INIS)

    Kang, Byung Heon; Kim, Hang Bae; Chang, Hoon Seon; Jeon, Jong Sun

    1995-01-01

    Application of computer software to safety-critical systems is on the increase. To be successful, the software must be designed and constructed to meet the functional and performance requirements of the system. For safety reasons, the software must be demonstrated not only to meet these requirements, but also to operate safely as a component within the system. For longer-term cost considerations, the software must be designed and structured to ease future maintenance and modifications. This paper presents a software engineering process for the production of safety-critical software for a nuclear power plant. The presentation is expository in nature, describing a viable, high-quality safety-critical software development process. It is based on the ideas of a rational design process and on the experience of adapting such a process in the production of the safety-critical software for shutdown system number two of the Wolsung 2, 3 and 4 nuclear power generation plants. This process is significantly different from a conventional process in terms of rigorous software development phases and software design techniques. The process covers documentation, design, verification and testing, using mathematically precise notations and a highly reviewable tabular format to specify software requirements, and verifying software requirements and code against the software design using static analysis. The software engineering process described in this paper applies the principle of information-hiding decomposition in software design using a modular design technique so that when a change is required or an error is detected, the affected scope can be readily and confidently located. It also fosters a high degree of confidence in the 'correctness' of the software production, and provides a relatively simple and straightforward code implementation effort. 1 figs., 10 refs. (Author)

  6. Fingerprinting Software Defined Networks and Controllers

    Science.gov (United States)

    2015-03-01

    Abbreviations: rps, requests per second; RTT, Round-Trip Time; SDN, Software Defined Networking; SOM, Self-Organizing Map; STP, Spanning Tree Protocol; TRW-CB, Threshold Random... frames carrying Spanning Tree Protocol (STP) updates will be "punted" from the forwarding lookup process and processed by the route processor [9]. The act of... environment to accomplish the needs of B4. In addition to Google, the SDN market is expected to grow beyond $35 billion by April 2018 [31]. The rate

  7. TOUGH2 software qualification

    International Nuclear Information System (INIS)

    Pruess, K.; Simmons, A.; Wu, Y.S.; Moridis, G.

    1996-02-01

    TOUGH2 is a numerical simulation code for multi-dimensional coupled fluid and heat flow of multiphase, multicomponent fluid mixtures in porous and fractured media. It belongs to the MULKOM ("MULti-KOMponent") family of codes and is a more general version of the TOUGH simulator. The MULKOM family of codes was originally developed with a focus on geothermal reservoir simulation. They are suited to modeling systems which contain different fluid mixtures, with applications to flow problems arising in the context of high-level nuclear waste isolation, oil and gas recovery and storage, and groundwater resource protection. TOUGH2 is essentially a subset of MULKOM, consisting of a selection of the better tested and documented MULKOM program modules. The purpose of this package of reports is to provide all software baseline documents necessary for the software qualification of TOUGH2

  8. Software Configurable Multichannel Transceiver

    Science.gov (United States)

    Freudinger, Lawrence C.; Cornelius, Harold; Hickling, Ron; Brooks, Walter

    2009-01-01

    Emerging test instrumentation and test scenarios increasingly require network communication to manage complexity. Adapting wireless communication infrastructure to accommodate challenging testing needs can benefit from reconfigurable radio technology. A fundamental requirement for a software-definable radio system is independence from carrier frequencies, one of the radio components that to date has seen only limited progress toward programmability. This paper gives an overview of an ongoing project to validate the viability of a promising chipset that converts radio frequency (RF) signals directly into digital data for the wireless receiver and, for the transmitter, converts digital data into RF signals. The Software Configurable Multichannel Transceiver (SCMT) packs four transmitters and four receivers into a single unit the size of a commodity disk drive, programmable for any frequency band between 1 MHz and 6 GHz.

  9. Implementing Software Defined Radio

    CERN Document Server

    Grayver, Eugene

    2013-01-01

    Software Defined Radio makes wireless communications easier, more efficient, and more reliable. This book bridges the gap between academic research and practical implementation. When beginning a project, practicing engineers, technical managers, and graduate students can save countless hours by considering the concepts presented in these pages. The author covers the myriad options and trade-offs available when selecting an appropriate hardware architecture. As demonstrated here, the choice between hardware- and software-centric architecture can mean the difference between meeting an aggressive schedule and bogging down in endless design iterations. Because of the author’s experience overseeing dozens of failed and successful developments, he is able to present many real-life examples. Some of the key concepts covered are: Choosing the right architecture for the market – laboratory, military, or commercial Hardware platforms – FPGAs, GPPs, specialized and hybrid devices Standardization efforts to ens...

  10. TOUGH2 software qualification

    Energy Technology Data Exchange (ETDEWEB)

    Pruess, K.; Simmons, A.; Wu, Y.S.; Moridis, G.

    1996-02-01

    TOUGH2 is a numerical simulation code for multi-dimensional coupled fluid and heat flow of multiphase, multicomponent fluid mixtures in porous and fractured media. It belongs to the MULKOM ("MULti-KOMponent") family of codes and is a more general version of the TOUGH simulator. The MULKOM family of codes was originally developed with a focus on geothermal reservoir simulation. They are suited to modeling systems which contain different fluid mixtures, with applications to flow problems arising in the context of high-level nuclear waste isolation, oil and gas recovery and storage, and groundwater resource protection. TOUGH2 is essentially a subset of MULKOM, consisting of a selection of the better tested and documented MULKOM program modules. The purpose of this package of reports is to provide all software baseline documents necessary for the software qualification of TOUGH2.

  11. Beebook: light field mapping app

    Science.gov (United States)

    De Donatis, Mauro; Di Pietro, Gianfranco; Rinnone, Fabio

    2014-05-01

    In the last decade, mobile systems for digital field mapping were developed (see Wikipedia on "Digital geologic mapping"), despite the skepticism of many traditional geologists. Until now, hardware was often heavy (tablet PCs) and software sometimes difficult even for expert GIS users. At present, the advent of light tablets and applications makes things easier, but we are far from finding a complete solution for a complex survey like the geological one, where you have to manage complexities such as information, hypotheses, data, and interpretation. Beebook, a new app for Android devices, has been developed for fast and easy mapping work in the field, trying to solve this problem. The main features are: • off-line raster management (GeoTIFF and other raster formats); • on-line map visualisation (Google Maps, OSM, WMS, WFS); • SR management and conversion using PROJ.4; • vector file mash-up (KML and SQLite formats); • editing of vector data on the map (lines, points, polygons); • augmented reality using the "Mixare" platform; • export of vector data in KML, CSV, and SQLite (Spatialite) formats; • notes: GPS or manual point insertion linked to other application files (pictures, spreadsheets, etc.); • forms: creation, editing and filling of customized forms; • GPS: status control, tracking and positioning on the map; • sharing: synchronization and sharing of data, forms, positioning and other information among users. Input methods range from the digital keyboard to finger touch, from voice recording to the stylus. In particular, the most efficient way of inserting information is the stylus (or pen): field geologists are familiar with annotations and sketches. Therefore we suggest the use of devices with a stylus. The main point is that Beebook is the first "transparent" mobile GIS for tablets and smartphones, deriving from previous experience in traditional mapping and the ideation and development of earlier digital mapping software (MapIT, BeeGIS, Geopaparazzi

  12. Guidance and Control Software,

    Science.gov (United States)

    1980-05-01

    user, by forcing him subconsciously to make faster decisions than necessary and giving him fewer choices than possible. It may be compared to the... reprogramming, and two real-time references. Interfaced to the main computer but still within the same physical case are a 12-bit HUD processor, an HDD...redesigned and reprogrammed many areas of the UPDATE I Mission Software to rectify this problem. The lesson learned was that the in-house staff must devise

  13. ThermalTracker Software

    Energy Technology Data Exchange (ETDEWEB)

    2016-08-10

    The software processes recorded thermal video and detects the flight tracks of birds and bats that passed through the camera's field of view. The output is a set of images that show complete flight tracks for any detections, with the direction of travel indicated and the thermal image of the animal delineated. A report of the descriptive features of each detected track is also output in the form of a comma-separated value text file.

  14. Antenna Controller Replacement Software

    Science.gov (United States)

    Chao, Roger Y.; Morgan, Scott C.; Strain, Martha M.; Rockwell, Stephen T.; Shimizu, Kenneth J.; Tehrani, Barzia J.; Kwok, Jaclyn H.; Tuazon-Wong, Michelle; Valtier, Henry; Nalbandi, Reza; hide

    2010-01-01

    The Antenna Controller Replacement (ACR) software accurately points and monitors the Deep Space Network (DSN) 70-m and 34-m high-efficiency (HEF) ground-based antennas that are used to track primarily spacecraft and, periodically, celestial targets. To track a spacecraft, or other targets, the antenna must be accurately pointed at the spacecraft, which can be very far away with very weak signals. ACR's conical scanning capability collects the signal in a circular pattern around the target, calculates the location of the strongest signal, and adjusts the antenna pointing to point directly at the spacecraft. A real-time, closed-loop servo control algorithm performed every 0.02 second allows accurate positioning of the antenna in order to track these distant spacecraft. Additionally, this advanced servo control algorithm provides better antenna pointing performance in windy conditions. The ACR software provides high-level commands that make for a very easy user interface for the DSN operator. The operator only needs to enter two commands to start the antenna and subreflector, and Master Equatorial tracking. The most accurate antenna pointing is accomplished by aligning the antenna to the Master Equatorial, which, because of its small size and sheltered location, has the most stable pointing. The antenna has hundreds of digital and analog monitor points. The ACR software provides compact displays to summarize the status of the antenna, subreflector, and the Master Equatorial. The ACR software has two major functions. First, it performs all of the steps required to accurately point the antenna (and subreflector and Master Equatorial) at the spacecraft (or celestial target). This involves controlling the antenna/subreflector/Master-Equatorial hardware, initiating and monitoring the correct sequence of operations, calculating the position of the spacecraft relative to the antenna, executing the real-time servo control algorithm to maintain the correct position, and
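
    The conical-scan correction described above can be sketched numerically: sample the received power around the scan circle, then step toward the signal-weighted direction. The sketch below is a generic illustration with synthetic numbers, not the actual DSN algorithm; the proportional gain and sample counts are assumptions.

```python
# Generic sketch of a conical-scan pointing correction: sample signal power
# around a small circle about the current boresight, then step toward the
# signal-weighted direction. Synthetic numbers; not the actual DSN algorithm.
import numpy as np

def conscan_correction(scan_angles_rad, powers, scan_radius_deg):
    # Weight each scan direction by its power above the mean; the resulting
    # vector points toward the location of the strongest signal.
    w = powers - powers.mean()
    ex = np.sum(w * np.cos(scan_angles_rad))
    ey = np.sum(w * np.sin(scan_angles_rad))
    norm = np.hypot(ex, ey)
    if norm == 0.0:
        return 0.0, 0.0  # already centered on the target
    gain = scan_radius_deg  # simple proportional step (assumed)
    return gain * ex / norm, gain * ey / norm

# Synthetic scan: target offset toward +x, so power peaks near scan angle 0.
angles = np.linspace(0.0, 2.0 * np.pi, 16, endpoint=False)
powers = 1.0 + 0.2 * np.cos(angles)  # strongest signal at angle 0
dx, dy = conscan_correction(angles, powers, scan_radius_deg=0.01)
print(f"pointing correction: dAz={dx:.4f} deg, dEl={dy:.4f} deg")
```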

  15. Unified Engineering Software System

    Science.gov (United States)

    Purves, L. R.; Gordon, S.; Peltzman, A.; Dube, M.

    1989-01-01

    Collection of computer programs performs diverse functions in prototype engineering. NEXUS, NASA Engineering Extendible Unified Software system, is research set of computer programs designed to support full sequence of activities encountered in NASA engineering projects. Sequence spans preliminary design, design analysis, detailed design, manufacturing, assembly, and testing. Primarily addresses process of prototype engineering, task of getting single or small number of copies of product to work. Written in FORTRAN 77 and PROLOG.

  16. Software trace cache

    OpenAIRE

    Ramírez Bellido, Alejandro; Larriba Pey, Josep; Valero Cortés, Mateo

    2005-01-01

    We explore the use of compiler optimizations, which optimize the layout of instructions in memory. The target is to enable the code to make better use of the underlying hardware resources regardless of the specific details of the processor/architecture in order to increase fetch performance. The Software Trace Cache (STC) is a code layout algorithm with a broader target than previous layout optimizations. We target not only an improvement in the instruction cache hit rate, but also an increas...

  17. Software for Avionics.

    Science.gov (United States)

    1983-01-01

    ...the general functions and utilities provided, in particular thanks to UNIX, are integrated according to various points of view: through their access via the...

  18. Real World Software Engineering

    Science.gov (United States)

    1994-07-15

    You put the new kid there and their first promotion is out of maintenance. Maintenance is not sufficiently emphasized as an important criterion for...the successful material from Koffman's CS1 pedagogy with a software-engineering-oriented Ada presentation order. Packages are introduced early and...Shumate, K. Understanding Ada. 2nd edition, John Wiley & Sons. This would make a CS1 book if it included more overall pedagogy, independent of language

  19. Hardening Software Defined Networks

    Science.gov (United States)

    2014-07-01

    ...balancers, traffic-shapers, and so on. SDN brings software and processing power to bear on all this complexity. While a large Data Center may be

  20. Addressing Software Security

    Science.gov (United States)

    Bailey, Brandon

    2015-01-01

    Historically, security within organizations was thought of as an IT function (web sites/servers, email, workstation patching, etc.). The threat landscape has evolved (script kiddies, hackers, Advanced Persistent Threats (APT), nation states, etc.), and the attack surface has expanded as networks have become interconnected. Some security posture factors: the network layer (routers, firewalls, etc.); computer network defense (IPS/IDS, sensors, continuous monitoring, etc.); industrial control systems (ICS); and software security (COTS, FOSS, custom, etc.)