WorldWideScience

Sample records for multi-parametric mapping software

  1. Predicting a multi-parametric probability map of active tumor extent using random forests.

    Science.gov (United States)

    Prior, Fred W; Fouke, Sarah J; Benzinger, Tammie; Boyd, Alicia; Chicoine, Michael; Cholleti, Sharath; Kelsey, Matthew; Keogh, Bart; Kim, Lauren; Milchenko, Mikhail; Politte, David G; Tyree, Stephen; Weinberger, Kilian; Marcus, Daniel

    2013-01-01

    Glioblastoma Multiforme is highly infiltrative, making precise delineation of the tumor margin difficult. Multimodality or multi-parametric MR imaging sequences promise an advantage over anatomic sequences such as post-contrast enhancement as methods for determining the spatial extent of tumor involvement. In considering multi-parametric imaging sequences, however, manual image segmentation and classification are time-consuming and prone to error. As a preliminary step toward integration of multi-parametric imaging into clinical assessments of primary brain tumors, we propose a machine-learning-based multi-parametric approach that uses radiologist-generated labels to train a classifier that is able to classify tissue on a voxel-wise basis and automatically generate a tumor segmentation. A random forests classifier was trained using a leave-one-out experimental paradigm. A simple linear classifier was also trained for comparison. The random forests classifier accurately predicted radiologist-generated segmentations and tumor extent.
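
    As a concrete illustration of the approach described above, the following is a minimal sketch (not the authors' code) of voxel-wise random-forest classification with leave-one-patient-out validation; the arrays, sizes, and parameters are hypothetical placeholders.

    ```python
    # Illustrative sketch: voxel-wise tumor classification from multi-parametric
    # MRI with a random forest, validated leave-one-patient-out.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import LeaveOneGroupOut

    # X: one row per voxel, one column per MR parameter (e.g., T1, T2, FLAIR, ADC)
    # y: radiologist-generated voxel labels (1 = tumor, 0 = normal)
    # groups: patient ID per voxel, so a fold never splits a patient
    rng = np.random.default_rng(0)
    X = rng.random((5000, 4))
    y = rng.integers(0, 2, 5000)
    groups = rng.integers(0, 10, 5000)

    for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups):
        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(X[train_idx], y[train_idx])
        # per-voxel tumor probabilities; reshaped to the image grid they
        # form the probability map of active tumor extent
        prob = clf.predict_proba(X[test_idx])[:, 1]
    ```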

  2. Predicting a Multi-Parametric Probability Map of Active Tumor Extent Using Random Forests

    Science.gov (United States)

    Prior, Fred W.; Fouke, Sarah J.; Benzinger, Tammie; Boyd, Alicia; Chicoine, Michael; Cholleti, Sharath; Kelsey, Matthew; Keogh, Bart; Kim, Lauren; Milchenko, Mikhail; Politte, David G.; Tyree, Stephen; Weinberger, Kilian; Marcus, Daniel

    2014-01-01

    Glioblastoma Multiforme is highly infiltrative, making precise delineation of the tumor margin difficult. Multimodality or multi-parametric MR imaging sequences promise an advantage over anatomic sequences such as post-contrast enhancement as methods for determining the spatial extent of tumor involvement. In considering multi-parametric imaging sequences, however, manual image segmentation and classification are time-consuming and prone to error. As a preliminary step toward integration of multi-parametric imaging into clinical assessments of primary brain tumors, we propose a machine-learning-based multi-parametric approach that uses radiologist-generated labels to train a classifier that is able to classify tissue on a voxel-wise basis and automatically generate a tumor segmentation. A random forests classifier was trained using a leave-one-out experimental paradigm. A simple linear classifier was also trained for comparison. The random forests classifier accurately predicted radiologist-generated segmentations and tumor extent. PMID: 24111225

  3. Urban Flood Vulnerability and Risk Mapping Using Integrated Multi-Parametric AHP and GIS: Methodological Overview and Case Study Assessment

    OpenAIRE

    2014-01-01

    This study aims at providing expertise for preparing public-based flood mapping and estimating flood risks in growing urban areas. To model and predict the magnitude of flood risk areas, integrated Analytical Hierarchy Process (AHP) and Geographic Information System (GIS) analysis techniques are used for the case of Eldoret Municipality in Kenya. The flood risk vulnerability mapping follows a multi-parametric approach and integrates some of the flooding causative factors such as rainfall d...
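
    The AHP step can be made concrete with a short sketch (an assumed, generic implementation, not the paper's code): factor weights are taken from the principal eigenvector of a Saaty pairwise comparison matrix and checked for consistency before the weighted layers are combined in GIS.

    ```python
    # Generic AHP weighting sketch; the comparison values are illustrative.
    import numpy as np

    # Hypothetical Saaty-scale comparison of three flood factors
    # (rainfall, elevation, drainage density)
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 3.0],
                  [1/5, 1/3, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                      # normalized factor weights

    n = A.shape[0]
    CI = (eigvals.real[k] - n) / (n - 1)
    CR = CI / 0.58                    # random index RI = 0.58 for n = 3
    print(w, CR)                      # CR < 0.1 is conventionally acceptable
    ```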

  4. MODIS-based multi-parametric platform for mapping of flood affected areas. Case study: 2006 Danube extreme flood in Romania

    Directory of Open Access Journals (Sweden)

    Craciunescu Vasile

    2016-12-01

    Flooding remains the most widely distributed natural hazard in Europe, leading to significant economic and social impact. Earth observation data is presently capable of making fundamental contributions towards reducing the detrimental effects of extreme floods. Technological advances make it possible to develop online services that process high volumes of satellite data without the need for dedicated desktop software licenses. The main objective of the case study is to present and evaluate a methodology for mapping flooded areas based on indices derived from MODIS satellite images, using state-of-the-art geospatial web services. The methodology and the developed platform were tested with data for the historical flood event that affected the Danube floodplain in 2006 in Romania. The results proved that, despite the relatively coarse resolution, MODIS data is very useful for mapping the development of the flooded area in large plain floods. Moreover, it was shown that the ability to adapt and combine existing global flood-detection algorithms to fit local conditions is extremely important for obtaining accurate results.
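
    The index-based detection at the core of such a platform can be sketched as follows (an assumed illustration; the paper combines and locally tunes several global algorithms, while this shows only a single common water index with a placeholder threshold).

    ```python
    # Minimal water-index flood masking on MODIS-like reflectance bands.
    import numpy as np

    def ndwi(green, nir):
        """McFeeters NDWI = (green - NIR) / (green + NIR); water tends to be > 0."""
        return (green - nir) / (green + nir + 1e-9)

    green = np.random.rand(100, 100)   # placeholder surface reflectance
    nir = np.random.rand(100, 100)

    # Adapting global algorithms to local conditions, as stressed above,
    # largely means tuning thresholds and index combinations per region.
    flood_mask = ndwi(green, nir) > 0.0
    ```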

  5. Multi-Parametric Toolbox 3.0

    OpenAIRE

    Herceg, Martin; Kvasnica, Michal; Jones, Colin; Morari, Manfred

    2013-01-01

    The Multi-Parametric Toolbox is a collection of algorithms for modeling, control, analysis, and deployment of constrained optimal controllers developed under Matlab. It features a powerful geometric library that extends the application of the toolbox beyond optimal control to various problems arising in computational geometry. The new version 3.0 is a complete rewrite of the original toolbox with a more flexible structure that offers faster integration of new algorithms. The numerical sid...

  6. Software patterns, knowledge maps, and domain analysis

    CERN Document Server

    Fayad, Mohamed E; Hegde, Srikanth GK; Basia, Anshu; Vakil, Ashka

    2014-01-01

    Preface; Acknowledgments; Authors. INTRODUCTION: An Overview of Knowledge Maps. Introduction: Key Concepts - Software Stable Models, Knowledge Maps, Pattern Language, Goals, Capabilities (Enduring Business Themes + Business Objects); The Motivation; The Problem; The Objectives; Overview of Software Stability Concepts; Overview of Knowledge Maps; Pattern Languages versus Knowledge Maps: A Brief Comparison; The Solution; Knowledge Maps Methodology or Concurrent Software Development Model; Why Knowledge Maps?; Research Methodology Undertaken; Research Verification and Validation; The Stratification of This Book; Summary.

  7. Mapping social networks in software process improvement

    DEFF Research Database (Denmark)

    Tjørnehøj, Gitte; Nielsen, Peter Axel

    2005-01-01

    to map social networks and suggest how it can be used in software process improvement. We applied the mapping approach in a small software company to support the realization of new ways of improving software processes. The mapping approach was found useful in improving social networks, and thus furthers...

  8. Personalized precision radiotherapy by integration of multi-parametric functional and biological imaging in prostate cancer. A feasibility study

    Energy Technology Data Exchange (ETDEWEB)

    Thorwarth, Daniela [Tuebingen Univ. (Germany). Section for Biomedical Physics; Notohamiprodjo, Mike [Tuebingen Univ. (Germany). Dept. of Diagnostic and Interventional Radiology; Zips, Daniel; Mueller, Arndt-Christan [Tuebingen Univ. (Germany). Dept. of Radiation Oncology

    2017-05-01

    To increase tumour control probability (TCP) in prostate cancer, a method was developed that integrates multi-parametric functional and biological information into a dose painting treatment plan aiming at focal dose escalation to tumour sub-volumes. A dose-escalation map was derived considering individual, multi-parametrically estimated tumour aggressiveness. Multi-parametric functional imaging (MRI, Choline-/PSMA-/FMISO-PET/CT) was acquired for a high-risk prostate cancer patient with a high tumour load (cT3b cN0 cM0), indicated by subtotal involvement of the prostate including the right seminal vesicle and by a PSA level >100. The probability of tumour presence was determined by a combination of the multi-parametric functional image information, resulting in a voxel-based map of tumour aggressiveness. This probability map was directly integrated into dose optimization in order to plan inhomogeneous, biological-imaging-based dose painting. Histograms of the multi-parametric prescription function were generated in addition to a differential histogram of the planned inhomogeneous doses. Comparison of prescribed with planned doses on a voxel level was realized using an effective DVH containing the ratio of prescribed vs. planned dose for each tumour voxel. Multi-parametric imaging data of PSMA, Choline and FMISO PET/CT as well as ADC maps derived from diffusion-weighted MRI were combined into an individual probability map of tumour presence. Voxel-based prescription doses ranged from 75.3 Gy up to 93.4 Gy (median: 79.6 Gy), whereas the planned dose painting doses varied only between 72.5 and 80.0 Gy with a median dose of 75.7 Gy. However, inhomogeneous voxel-based dose prescriptions can only be implemented into a treatment plan up to a certain level. Multi-parametric probability-based dose painting in prostate cancer is technically and clinically feasible. However, detailed calibration functions to define the necessary probability functions need to be assessed in future

  9. Theoretical and algorithmic advances in multi-parametric programming and control

    KAUST Repository

    Pistikopoulos, Efstratios N.

    2012-04-21

    This paper presents an overview of recent theoretical and algorithmic advances, and applications in the areas of multi-parametric programming and explicit/multi-parametric model predictive control (mp-MPC). In multi-parametric programming, advances include areas such as nonlinear multi-parametric programming (mp-NLP), bi-level programming, dynamic programming and global optimization for multi-parametric mixed-integer linear programming problems (mp-MILPs). In multi-parametric/explicit MPC (mp-MPC), advances include areas such as robust multi-parametric control, multi-parametric nonlinear MPC (mp-NMPC) and model reduction in mp-MPC. A comprehensive framework for multi-parametric programming and control is also presented. Recent applications include a hydrogen storage device, a fuel cell power generation system, an unmanned autonomous vehicle (UAV) and a hybrid pressure swing adsorption (PSA) system. © 2012 Springer-Verlag.
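
    What "explicit" means here can be seen on a toy one-dimensional multi-parametric QP (an illustrative example, not from the paper): the optimizer is precomputed offline as a piecewise-affine function of the parameter, and online use reduces to a region lookup.

    ```python
    # Toy mp-QP:  min_x 0.5*x**2  subject to  x >= 1 - theta.
    # Solving for all theta gives two critical regions:
    #   x*(theta) = 1 - theta  if theta <= 1  (constraint active)
    #   x*(theta) = 0          if theta >  1  (constraint inactive)
    def x_star(theta: float) -> float:
        return 1.0 - theta if theta <= 1.0 else 0.0

    for theta in (0.0, 0.5, 2.0):
        print(theta, x_star(theta))   # online evaluation is just a lookup
    ```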

  10. PET image reconstruction using multi-parametric anato-functional priors

    Science.gov (United States)

    Mehranian, Abolfazl; Belzunce, Martin A.; Niccolini, Flavia; Politis, Marios; Prieto, Claudia; Turkheimer, Federico; Hammers, Alexander; Reader, Andrew J.

    2017-08-01

    In this study, we investigate the application of multi-parametric anato-functional (MR-PET) priors for the maximum a posteriori (MAP) reconstruction of brain PET data in order to address the limitations of the conventional anatomical priors in the presence of PET-MR mismatches. In addition to partial volume correction benefits, the suitability of these priors for reconstruction of low-count PET data is also introduced and demonstrated, compared with standard maximum-likelihood (ML) reconstruction of high-count data. The conventional local Tikhonov and total variation (TV) priors and current state-of-the-art anatomical priors including the Kaipio prior and non-local Tikhonov priors with Bowsher and Gaussian similarity kernels are investigated and presented in a unified framework. The Gaussian kernels are calculated using both voxel- and patch-based feature vectors. To cope with PET and MR mismatches, the Bowsher and Gaussian priors are extended to multi-parametric priors. In addition, we propose a modified joint Burg entropy prior that by definition exploits all parametric information in the MAP reconstruction of PET data. The performance of the priors was extensively evaluated using 3D simulations and two clinical brain datasets of [18F]florbetaben and [18F]FDG radiotracers. For simulations, several anato-functional mismatches were intentionally introduced between the PET and MR images, and furthermore, for the FDG clinical dataset, two PET-unique active tumours were embedded in the PET data. Our simulation results showed that the joint Burg entropy prior far outperformed the conventional anatomical priors in terms of preserving PET unique lesions, while still reconstructing functional boundaries with corresponding MR boundaries. In addition, the multi-parametric extension of the Gaussian and Bowsher priors led to enhanced preservation of edge and PET unique features and also an improved bias-variance performance. In agreement with the simulation results, the clinical results

  11. AIRS Maps from Space Processing Software

    Science.gov (United States)

    Thompson, Charles K.; Licata, Stephen J.

    2012-01-01

    This software package processes Atmospheric Infrared Sounder (AIRS) Level 2 swath standard product geophysical parameters, and generates global, colorized, annotated maps. It automatically generates daily and multi-day averaged colorized and annotated maps of various AIRS Level 2 swath geophysical parameters. It also generates AIRS input data sets for Eyes on Earth, Puffer-sphere, and Magic Planet. This program is tailored to AIRS Level 2 data products. It re-projects data into 1/4-degree grids that can be combined and averaged for any number of days. The software scales and colorizes global grids utilizing AIRS-specific color tables, and annotates images with title and color bar. This software can be tailored for use with other swath data products for the purposes of visualization.
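
    The gridding/averaging step described above might look like the following sketch (an assumed implementation; the grid size follows the 1/4-degree resolution mentioned, everything else is placeholder).

    ```python
    # Bin swath samples into a global 0.25-degree grid and average them;
    # multi-day averages combine the per-day sums and counts the same way.
    import numpy as np

    def grid_swath(lat, lon, val, res=0.25):
        ny, nx = int(180 / res), int(360 / res)
        ssum = np.zeros((ny, nx))
        cnt = np.zeros((ny, nx))
        iy = np.clip(((lat + 90.0) / res).astype(int), 0, ny - 1)
        ix = np.clip(((lon + 180.0) / res).astype(int), 0, nx - 1)
        np.add.at(ssum, (iy, ix), val)   # accumulate repeated cells correctly
        np.add.at(cnt, (iy, ix), 1)
        with np.errstate(invalid="ignore"):
            return ssum / cnt            # NaN where no samples fell

    lat = np.random.uniform(-90, 90, 10000)
    lon = np.random.uniform(-180, 180, 10000)
    val = np.random.rand(10000)          # placeholder geophysical parameter
    grid = grid_swath(lat, lon, val)
    ```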

  12. Explicit/multi-parametric model predictive control (MPC) of linear discrete-time systems by dynamic and multi-parametric programming

    KAUST Repository

    Kouramas, K.I.

    2011-08-01

    This work presents a new algorithm for solving the explicit/multi-parametric model predictive control (or mp-MPC) problem for linear, time-invariant discrete-time systems, based on dynamic programming and multi-parametric programming techniques. The algorithm features two key steps: (i) a dynamic programming step, in which the mp-MPC problem is decomposed into a set of smaller subproblems in which only the current control, state variables, and constraints are considered, and (ii) a multi-parametric programming step, in which each subproblem is solved as a convex multi-parametric programming problem, to derive the control variables as an explicit function of the states. The key feature of the proposed method is that it overcomes potential limitations of previous methods for solving multi-parametric programming problems with dynamic programming, such as the need for global optimization for each subproblem of the dynamic programming step. © 2011 Elsevier Ltd. All rights reserved.

  13. Documenting the location of systematic transrectal ultrasound-guided prostate biopsies: correlation with multi-parametric MRI

    Science.gov (United States)

    Xu, Sheng; Kruecker, Jochen; Locklin, Julia; Pang, Yuxi; Shah, Vijay; Bernardo, Marcelino; Baccala, Angelo; Rastinehad, Ardeshir; Benjamin, Compton; Merino, Maria J.; Wood, Bradford J.; Choyke, Peter L.; Pinto, Peter A.

    2011-01-01

    During transrectal ultrasound (TRUS)-guided prostate biopsies, the actual location of the biopsy site is rarely documented. Here, we demonstrate the capability of TRUS-magnetic resonance imaging (MRI) image fusion to document the biopsy site and correlate biopsy results with multi-parametric MRI findings. Fifty consecutive patients (median age 61 years) with a median prostate-specific antigen (PSA) level of 5.8 ng/ml underwent 12-core TRUS-guided biopsy of the prostate. Pre-procedural T2-weighted magnetic resonance images were fused to TRUS. A disposable needle guide with miniature tracking sensors was attached to the TRUS probe to enable fusion with MRI. Real-time TRUS images during biopsy and the corresponding tracking information were recorded. Each biopsy site was superimposed onto the MRI. Each biopsy site was classified as positive or negative for cancer based on the results of each MRI sequence. Sensitivity, specificity, and receiver operating characteristic (ROC) area under the curve (AUC) values were calculated for multi-parametric MRI. Gleason scores for each multi-parametric MRI pattern were also evaluated. Six hundred and five systematic biopsy cores were analyzed in 50 patients, of whom 20 had 56 positive cores. MRI identified 34 of 56 positive cores. Overall, sensitivity, specificity, and ROC area values for multi-parametric MRI were 0.607, 0.727, and 0.667, respectively. TRUS-MRI fusion after biopsy can be used to document the location of each biopsy site, which can then be correlated with MRI findings. Based on correlation with tracked biopsies, T2-weighted MRI and apparent diffusion coefficient maps derived from diffusion-weighted MRI are the most sensitive sequences, whereas the addition of delayed contrast enhancement MRI and three-dimensional magnetic resonance spectroscopy demonstrated higher specificity consistent with results obtained using radical prostatectomy specimens. PMID:21450548

  14. Computer Software for Displaying Map Projections and Comparing Distortions.

    Science.gov (United States)

    Wikle, Thomas

    1991-01-01

    Discusses software that educators can use to teach about distortions associated with alternative map projections. Examines the Projection, MicroCAM, World, and Atlas-GIS software programs. Suggests using the software in either introductory or more advanced courses dealing with map design or thematic cartography. Explains how to obtain the…

  15. High-resolution multi-parametric quantitative magnetic resonance imaging of the human cervical spinal cord at 7T.

    Science.gov (United States)

    Massire, Aurélien; Taso, Manuel; Besson, Pierre; Guye, Maxime; Ranjeva, Jean-Philippe; Callot, Virginie

    2016-12-01

    Quantitative MRI techniques have the potential to characterize spinal cord tissue impairments occurring in various pathologies, from both microstructural and functional perspectives. By enabling very high image resolution and enhanced tissue contrast, ultra-high field imaging may offer further opportunities for such characterization. In this study, a multi-parametric high-resolution quantitative MRI protocol is proposed to characterize in vivo the human cervical spinal cord at 7T. Multi-parametric quantitative MRI acquisitions including T1, T2(*) relaxometry mapping and axial diffusion MRI were performed on ten healthy volunteers with a whole-body 7T system using a commercial prototype coil-array dedicated to cervical spinal cord imaging. Automatic cord segmentation and multi-parametric data registration to spinal cord templates enabled robust regional studies within atlas-based WM tracts and GM horns at the C3 cervical level. T1 value, cross-sectional area and GM/WM ratio evolutions along the cervical cord were also reported. An original correction method for the B1(+)-biased T1 mapping sequence was additionally proposed and validated on a phantom. As a result, relaxometry and diffusion parameters derived from high-resolution quantitative MRI acquisitions were reported at 7T for the first time. Obtained images, with unmatched resolutions compared to lower field investigations, provided exquisite anatomical details and clear delineation of the spinal cord substructures within an acquisition time of 30 min, compatible with clinical investigations. Regional statistically significant differences were highlighted between WM and GM based on T1 and T2* maps. High-resolution multi-parametric quantitative MRI at 7T is feasible and lays the groundwork for future clinical investigations of degenerative spinal cord pathologies. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Choropleth Mapping on Personal Computers: Software Sources and Hardware Requirements.

    Science.gov (United States)

    Lewis, Lawrence T.

    1986-01-01

    Describes the hardware and some of the choropleth mapping software available for the IBM-PC, PC compatible and Apple II microcomputers. Reviewed are: Micromap II, Documap, Desktop Information Display System (DIDS), Multimap, Execuvision, Iris GIS, Mapmaker, PC Map, Statmap, and Atlas Map. Vendors' addresses are provided. (JDH)

  17. Multi-parametric Effect of Solar Activity on Cosmic Rays

    Indian Academy of Sciences (India)

    V. K. Mishra; Meera Gupta; B. N. Mishra; S. K. Nigam; A. P. Mishra

    2008-03-01

    The long-term modulation of cosmic ray intensity (CRI) by different solar activity (SA) parameters and an inverse correlation between individual SA parameters and CRI are well known. Earlier, it has been suggested that the concept of multi-parametric modulation of CRI may play an important role in the study of long-term modulation of CRI. In the present study, we have tried to investigate the combined effect of a set of two SA parameters on the long-term modulation of CRI. For this purpose, we have used a new statistical technique called the “running multiple correlation method”, based on the “running cross correlation method”. The running multiple correlation functions among different sets of two SA parameters (e.g., sunspot numbers and solar flux, sunspot numbers and coronal index, sunspot numbers and grouped solar flares, etc.) and CRI have been computed separately. It is found that the strength of the multiple correlation (among two SA parameters and CRI) and the cross correlation (between individual SA parameters and CRI) is almost similar throughout the period of investigation (1955–2005). It is also found that the multiple correlations among various SA parameters and CRI are stronger during ascending and descending phases of the solar cycles and become weaker during maxima and minima of the solar cycles, which is in accordance with the linear relationship between SA parameters and CRI. The values of the multiple correlation functions among different sets of SA parameters and CRI fall well within the 95% confidence interval. In view of the odd–even hypothesis of solar cycles, the strange behaviour of the present cycle 23 (an odd cycle), which is characterized by many peculiarities, with double peaks and many quiet periods (Gnevyshev gaps) that interrupted the solar activity (for example April 2001, October–November 2003 and January 2005), leads us to speculate that solar cycle 24 (an even cycle) might be of an exceptional nature.
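
    The "running multiple correlation" idea can be sketched as follows (an assumed implementation consistent with the description above: within each sliding window, CRI is regressed on two SA parameters and the multiple correlation coefficient R is recorded; all data are synthetic).

    ```python
    # Sliding-window multiple correlation of CRI with two SA parameters.
    import numpy as np

    def running_multiple_corr(y, x1, x2, win=11):
        out = []
        for i in range(len(y) - win + 1):
            ys = y[i:i + win]
            X = np.column_stack([np.ones(win), x1[i:i + win], x2[i:i + win]])
            beta, *_ = np.linalg.lstsq(X, ys, rcond=None)
            resid = ys - X @ beta
            r2 = 1.0 - np.sum(resid**2) / np.sum((ys - ys.mean())**2)
            out.append(np.sqrt(max(0.0, r2)))
        return np.array(out)

    rng = np.random.default_rng(0)
    ssn = rng.random(51)                                   # sunspot numbers
    flux = rng.random(51)                                  # solar flux
    cri = -0.5 * ssn - 0.3 * flux + 0.1 * rng.random(51)   # synthetic CRI
    R = running_multiple_corr(cri, ssn, flux)              # one R per window
    ```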

  18. Fusion of multi-parametric MRI and temporal ultrasound for characterization of prostate cancer: in vivo feasibility study

    Science.gov (United States)

    Imani, Farhad; Ghavidel, Sahar; Abolmaesumi, Purang; Khallaghi, Siavash; Gibson, Eli; Khojaste, Amir; Gaed, Mena; Moussa, Madeleine; Gomez, Jose A.; Romagnoli, Cesare; Cool, Derek W.; Bastian-Jordan, Matthew; Kassam, Zahra; Siemens, D. Robert; Leveridge, Michael; Chang, Silvia; Fenster, Aaron; Ward, Aaron D.; Mousavi, Parvin

    2016-03-01

    Recently, multi-parametric Magnetic Resonance Imaging (mp-MRI) has been used to improve the sensitivity of detecting high-risk prostate cancer (PCa). Prior to biopsy, primary and secondary cancer lesions are identified on mp-MRI. The lesions are then targeted using TRUS guidance. In this paper, for the first time, we present a fused mp-MRI-temporal-ultrasound framework for characterization of PCa, in vivo. Cancer classification results obtained using temporal ultrasound are fused with those achieved using consolidated mp-MRI maps determined by multiple observers. We verify the outcome of our study using histopathology following deformable registration of ultrasound and histology images. Fusion of temporal ultrasound and mp-MRI for characterization of the PCa results in an area under the receiver operating characteristic curve (AUC) of 0.86 for cancerous regions with Gleason scores (GSs) >= 3+3, and an AUC of 0.89 for those with GSs >= 3+4.

  19. Logistic regression model for diagnosis of transition zone prostate cancer on multi-parametric MRI

    Energy Technology Data Exchange (ETDEWEB)

    Dikaios, Nikolaos; Halligan, Steve; Taylor, Stuart; Atkinson, David; Punwani, Shonit [University College London, Centre for Medical Imaging, London (United Kingdom); University College London Hospital, Departments of Radiology, London (United Kingdom); Alkalbani, Jokha; Sidhu, Harbir Singh; Fujiwara, Taiki [University College London, Centre for Medical Imaging, London (United Kingdom); Abd-Alazeez, Mohamed; Ahmed, Hashim; Emberton, Mark [University College London, Research Department of Urology, London (United Kingdom); Kirkham, Alex; Allen, Clare [University College London Hospital, Departments of Radiology, London (United Kingdom); Freeman, Alex [University College London Hospital, Department of Histopathology, London (United Kingdom)

    2014-09-17

    We aimed to develop logistic regression (LR) models for classifying prostate cancer within the transition zone on multi-parametric magnetic resonance imaging (mp-MRI). One hundred and fifty-five patients (training cohort, 70 patients; temporal validation cohort, 85 patients) underwent mp-MRI and transperineal-template-prostate-mapping (TPM) biopsy. Positive cores were classified by cancer definitions: (1) any-cancer; (2) definition-1 [≥Gleason 4 + 3 or ≥ 6 mm cancer core length (CCL)] [high-risk significant]; and (3) definition-2 (≥Gleason 3 + 4 or ≥ 4 mm CCL) cancer [intermediate-high-risk significant]. For each, logistic-regression mp-MRI models were derived from the training cohort and validated internally and with the temporal cohort. Sensitivity/specificity and the area under the receiver operating characteristic curve (ROC-AUC) were calculated. LR model performance was compared to radiologists' performance. Twenty-eight of 70 patients from the training cohort, and 25/85 patients from the temporal validation cohort, had significant cancer on TPM. The ROC-AUC of the LR model for classification of cancer was 0.73/0.67 at internal/temporal validation. The radiologist A/B ROC-AUC was 0.65/0.74 (temporal cohort). For patients scored by radiologists as Prostate Imaging Reporting and Data System (Pi-RADS) score 3, sensitivity/specificity of radiologist A's 'best guess' and the LR model were 0.14/0.54 and 0.71/0.61, respectively; and of radiologist B's 'best guess' and the LR model were 0.40/0.34 and 0.50/0.76, respectively. LR models can improve classification of Pi-RADS score 3 lesions similarly to experienced radiologists. (orig.)
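
    A minimal sketch of deriving and temporally validating such an LR classifier follows (scikit-learn based; the feature columns and cohort sizes are placeholders, not the study's data).

    ```python
    # Train an LR model on one cohort, validate on a later (temporal) cohort.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(1)
    # columns might be ADC, T2 signal, Ktrans, etc., one row per zone/patient
    X_train, y_train = rng.random((70, 4)), rng.integers(0, 2, 70)
    X_val, y_val = rng.random((85, 4)), rng.integers(0, 2, 85)

    lr = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    p = lr.predict_proba(X_val)[:, 1]
    print("temporal-validation ROC-AUC:", roc_auc_score(y_val, p))
    ```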

  1. Accuracy and variability of tumor burden measurement on multi-parametric MRI

    Science.gov (United States)

    Salarian, Mehrnoush; Gibson, Eli; Shahedi, Maysam; Gaed, Mena; Gómez, José A.; Moussa, Madeleine; Romagnoli, Cesare; Cool, Derek W.; Bastian-Jordan, Matthew; Chin, Joseph L.; Pautler, Stephen; Bauman, Glenn S.; Ward, Aaron D.

    2014-03-01

    Measurement of prostate tumour volume can inform prognosis and treatment selection, including an assessment of the suitability and feasibility of focal therapy, which can potentially spare patients the deleterious side effects of radical treatment. Prostate biopsy is the clinical standard for diagnosis but provides limited information regarding tumour volume due to sparse tissue sampling. A non-invasive means for accurate determination of tumour burden could be of clinical value and an important step toward reduction of overtreatment. Multi-parametric magnetic resonance imaging (MPMRI) is showing promise for prostate cancer diagnosis. However, the accuracy and inter-observer variability of prostate tumour volume estimation based on separate expert contouring of T2-weighted (T2W), dynamic contrast-enhanced (DCE), and diffusion-weighted (DW) MRI sequences acquired using an endorectal coil at 3T is currently unknown. We investigated this question using a histologic reference standard based on a highly accurate MPMRI-histology image registration and a smooth interpolation of planimetric tumour measurements on histology. Our results showed that prostate tumour volumes estimated based on MPMRI consistently overestimated histological reference tumour volumes. The variability of tumour volume estimates across the different pulse sequences exceeded inter-observer variability within any sequence. Tumour volume estimates on DCE MRI provided the lowest inter-observer variability and the highest correlation with histology tumour volumes, whereas the apparent diffusion coefficient (ADC) maps provided the lowest volume estimation error. If validated on a larger data set, the observed correlations could support the development of automated prostate tumour volume segmentation algorithms as well as correction schemes for tumour burden estimation on MPMRI.

  2. Multi-parametric cytometry from a complex cellular sample: Improvements and limits of manual versus computational-based interactive analyses.

    Science.gov (United States)

    Gondois-Rey, F; Granjeaud, S; Rouillier, P; Rioualen, C; Bidaut, G; Olive, D

    2016-05-01

    The wide possibilities opened by the development of multi-parametric cytometry are limited by the inadequacy of the classical methods of analysis for the multi-dimensional characteristics of the data. While new computational tools seem ideally adapted and have been applied successfully, their adoption is still low among flow cytometrists. With the aim of integrating unsupervised computational tools for the management of multi-stained samples, we investigated their advantages and limits by comparison with manual gating on a typical sample analyzed in routine immunomonitoring. A single tube of PBMC, containing 11 populations characterized by different sizes and stained with 9 fluorescent markers, was used. We investigated the impact of the strategy choice on manual gating variability, an undocumented pitfall of the analysis process, and we identified rules to optimize it. While assessing automatic gating as an alternative, we introduced the Multi-Experiment Viewer software (MeV) and validated it for merging clusters and interactively annotating populations. This procedure allowed both targeted and unexpected populations to be found. However, careful examination of the computed clusters in standard dot plots revealed some heterogeneity, often below 10%, that was overcome by increasing the number of clusters to be computed. MeV facilitated the identification of populations by displaying both the MFI and the marker signature of the dataset simultaneously. The procedure described here appears fully adapted to the homogeneous management of a high number of multi-stained samples and improves multi-parametric analyses in a way close to the classic approach. © 2016 International Society for Advancement of Cytometry.

  3. Multi-parametric MR imaging for prostate carcinoma

    Energy Technology Data Exchange (ETDEWEB)

    Schlemmer, Heinz-Peter [Deutsches Krebsforschungszentrum, Heidelberg (Germany). Dept. of Radiology]

    2017-03-15

    Multi-parametric MR imaging of prostate carcinoma can improve diagnostics, allows reliable prognostic estimation, and helps to find the optimal individual therapy. This contribution focuses on delivering the methodological tools and background knowledge needed for daily routine practice.

  4. Single-Frame Terrain Mapping Software for Robotic Vehicles

    Science.gov (United States)

    Rankin, Arturo L.

    2011-01-01

    This software is a component in an unmanned ground vehicle (UGV) perception system that builds compact, single-frame terrain maps for distribution to other systems, such as a world model or an operator control unit, over a local area network (LAN). Each cell in the map encodes an elevation value, terrain classification, object classification, terrain traversability, terrain roughness, and a confidence value into four bytes of memory. The input to this software component is a range image (from a lidar or stereo vision system), and optionally a terrain classification image and an object classification image, both registered to the range image. The single-frame terrain map generates estimates of the support surface elevation, ground cover elevation, and minimum canopy elevation; generates terrain traversability cost; detects low overhangs and high-density obstacles; and can perform geometry-based terrain classification (ground, ground cover, unknown). A new origin is automatically selected for each single-frame terrain map in global coordinates such that it coincides with the corner of a world map cell. That way, single-frame terrain maps correctly line up with the world map, facilitating the merging of map data into the world map. Instead of using 32 bits to store the floating-point elevation for a map cell, the map origin elevation is set to the vehicle elevation, and each cell reports its change in elevation (from the origin elevation) as a number of discrete steps. The single-frame terrain map elevation resolution is 2 cm. At that resolution, terrain elevation from -20.5 to 20.5 m (with respect to the vehicle's elevation) is encoded into 11 bits. For each four-byte map cell, bits are assigned to encode elevation, terrain roughness, terrain classification, object classification, terrain traversability cost, and a confidence value. The vehicle's current position and orientation, the map origin, and the map cell resolution are all included in a header for each
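
    The four-byte cell encoding can be illustrated with a small bit-packing sketch; the 11-bit, 2-cm elevation field follows the description above, while the widths and positions of the remaining fields are assumptions for illustration.

    ```python
    # Pack one terrain-map cell into a 32-bit word (field layout hypothetical
    # except for the 11-bit elevation at 2 cm resolution).
    def pack_cell(elev_m, rough, terrain_cls, obj_cls, cost, conf):
        steps = int(round(elev_m / 0.02)) + 1024   # ~-20.5..20.5 m -> 0..2047
        word = max(0, min(2047, steps))            # 11 bits: elevation steps
        word |= (rough & 0x7) << 11                # 3-bit roughness (assumed)
        word |= (terrain_cls & 0xF) << 14          # 4-bit terrain class (assumed)
        word |= (obj_cls & 0xF) << 18              # 4-bit object class (assumed)
        word |= (cost & 0xFF) << 22                # 8-bit traversability (assumed)
        word |= (conf & 0x3) << 30                 # 2-bit confidence (assumed)
        return word

    def unpack_elev(word):
        return ((word & 0x7FF) - 1024) * 0.02

    w = pack_cell(-1.50, rough=2, terrain_cls=1, obj_cls=0, cost=37, conf=3)
    assert abs(unpack_elev(w) + 1.50) < 0.01
    ```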

  5. CosmoQuest: A software platform for surface feature mapping

    Science.gov (United States)

    Gay, Pamela

    2016-07-01

    While many tools exist for allowing individuals to mark features in images, it has previously been unwieldy to get entire teams collaboratively mapping out surface features, and to statistically compare each team member's contributions. Our CSB software was initially developed to facilitate crowd-sourcing projects, including CosmoQuest's "Moon Mappers" project. Statistical study of its results (Robbins et al. 2014) has shown that professionals using this software get results that are as good as those they get using other commonly used software packages. This has led to an expansion of the software to facilitate professional science use. In order to allow the greatest use of CSB, and to facilitate better science collaboration, CosmoQuest now allows teams to create private projects. Basic features include: using their own data sets, allowing multiple team members to annotate the images, performing basic statistics on the resulting data, and downloading all results in either .sql or .csv formats. In this presentation, we will overview how best to use CSB to improve your own science collaboration. Current applications include surface science and transient object identification, and published results include both crater maps and the discovery of KBOs.

  6. Incorporating Oxygen-Enhanced MRI into Multi-Parametric Assessment of Human Prostate Cancer.

    Science.gov (United States)

    Zhou, Heling; Hallac, Rami R; Yuan, Qing; Ding, Yao; Zhang, Zhongwei; Xie, Xian-Jin; Francis, Franto; Roehrborn, Claus G; Sims, R Douglas; Costa, Daniel N; Raj, Ganesh V; Mason, Ralph P

    2017-08-24

    Hypoxia is associated with prostate tumor aggressiveness, local recurrence, and biochemical failure. Magnetic resonance imaging (MRI) offers insight into tumor pathophysiology and recent reports have related transverse relaxation rate (R₂*) and longitudinal relaxation rate (R₁) measurements to tumor hypoxia. We have investigated the inclusion of oxygen-enhanced MRI for multi-parametric evaluation of tumor malignancy. Multi-parametric MRI sequences at 3 Tesla were evaluated in 10 patients to investigate hypoxia in prostate cancer prior to radical prostatectomy. Blood oxygen level dependent (BOLD), tissue oxygen level dependent (TOLD), dynamic contrast enhanced (DCE), and diffusion weighted imaging MRI were intercorrelated and compared with the Gleason score. The apparent diffusion coefficient (ADC) was significantly lower in tumor than normal prostate. Baseline R₂* (BOLD-contrast) was significantly higher in tumor than normal prostate. Upon the oxygen breathing challenge, R₂* decreased significantly in the tumor tissue, suggesting improved vascular oxygenation, however changes in R₁ were minimal. R₂* of contralateral normal prostate decreased in most cases upon oxygen challenge, although the differences were not significant. Moderate correlation was found between ADC and Gleason score. ADC and R₂* were correlated and trends were found between Gleason score and R₂*, as well as maximum-intensity-projection and area-under-the-curve calculated from DCE. Tumor ADC and R₂* have been associated with tumor hypoxia, and thus the correlations are of particular interest. A multi-parametric approach including oxygen-enhanced MRI is feasible and promises further insights into the pathophysiological information of tumor microenvironment.
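
    The R₂* measurements behind the BOLD contrast can be sketched generically (a textbook mono-exponential estimate from a multi-echo acquisition, not the study's pipeline; echo times and values are synthetic).

    ```python
    # Voxel R2* from multi-echo data: S(TE) = S0 * exp(-TE * R2*),
    # so ln S is linear in TE and R2* is minus the slope.
    import numpy as np

    TE = np.array([0.005, 0.010, 0.015, 0.020])   # echo times (s)
    S = 1000.0 * np.exp(-TE * 40.0)               # synthetic signal, R2* = 40/s

    slope, _ = np.polyfit(TE, np.log(S), 1)
    R2_star = -slope                              # ~40 s^-1
    # improved vascular oxygenation lowers deoxyhemoglobin, which is why
    # R2* decreased in tumors during the oxygen breathing challenge
    print(R2_star)
    ```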

  7. A software tool for rapid flood inundation mapping

    Science.gov (United States)

    Verdin, James; Verdin, Kristine; Mathis, Melissa L.; Magadzire, Tamuka; Kabuchanga, Eric; Woodbury, Mark; Gadain, Hussein

    2016-06-02

    The GIS Flood Tool (GFT) was developed by the U.S. Geological Survey with support from the U.S. Agency for International Development’s Office of U.S. Foreign Disaster Assistance to provide a means for production of reconnaissance-level flood inundation mapping for data-sparse and resource-limited areas of the world. The GFT has also attracted interest as a tool for rapid assessment flood inundation mapping for the Flood Inundation Mapping Program of the U.S. Geological Survey. The GFT can fill an important gap for communities that lack flood inundation mapping by providing a first estimate of inundation zones, pending availability of resources to complete an engineering study. The tool can also help identify priority areas for application of scarce flood inundation mapping resources. The technical basis of the GFT is an application of the Manning equation for steady flow in an open channel, operating on specially processed digital elevation data. The GFT is implemented as a software extension in ArcGIS. Output maps from the GFT were validated at 11 sites with inundation maps produced previously by the Flood Inundation Mapping Program using standard one-dimensional hydraulic modeling techniques. In 80 percent of the cases, the GFT inundation patterns matched 75 percent or more of the one-dimensional hydraulic model inundation patterns. Lower rates of pattern agreement were seen at sites with low relief and subtle surface water divides. Although the GFT is simple to use, it should be applied with the oversight or review of a qualified hydraulic engineer who understands the simplifying assumptions of the approach.
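
    The Manning relation at the heart of the GFT is compact enough to show directly (the textbook formula in SI units, not the tool's internals; the example numbers are illustrative).

    ```python
    # Manning equation for steady open-channel flow:
    #   V = (1/n) * R^(2/3) * S^(1/2),   Q = V * A
    def manning_velocity(n, hydraulic_radius_m, slope):
        """Mean velocity (m/s) given roughness n, hydraulic radius R, slope S."""
        return (1.0 / n) * hydraulic_radius_m ** (2.0 / 3.0) * slope ** 0.5

    V = manning_velocity(0.035, 2.0, 0.001)   # natural channel, R = 2 m
    Q = V * 50.0                              # discharge for a 50 m^2 section
    print(V, Q)
    ```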

  8. Open source projects in software engineering education: a mapping study

    Science.gov (United States)

    Nascimento, Debora M. C.; Almeida Bittencourt, Roberto; Chavez, Christina

    2015-01-01

    Context: It is common practice in academia to have students work with "toy" projects in software engineering (SE) courses. One way to make such courses more realistic and reduce the gap between academic courses and industry needs is getting students involved in open source projects (OSP) with faculty supervision. Objective: This study aims to summarize the literature on how OSP have been used to facilitate students' learning of SE. Method: A systematic mapping study was undertaken by identifying, filtering and classifying primary studies using a predefined strategy. Results: 72 papers were selected and classified. The main results were: (a) most studies focused on comprehensive SE courses, although some dealt with specific areas; (b) the most prevalent approach was the traditional project method; (c) studies' general goals were: learning SE concepts and principles by using OSP, learning open source software or both; (d) most studies tried out ideas in regular courses within the curriculum; (e) in general, students had to work with predefined projects; (f) there was a balance between approaches where instructors had either inside control or no control on the activities performed by students; (g) when learning was assessed, software artefacts, reports and presentations were the main instruments used by teachers, while surveys were widely used for students' self-assessment; (h) most studies were published in the last seven years. Conclusions: The resulting map gives an overview of the existing initiatives in this context and shows gaps where further research can be pursued.

  9. QTL IciMapping: Integrated software for genetic linkage map construction and quantitative trait locus mapping in biparental populations

    Institute of Scientific and Technical Information of China (English)

    Lei Meng; Huihui Li; Luyan Zhang; Jiankang Wang

    2015-01-01

    QTL IciMapping is freely available public software capable of building high-density linkage maps and mapping quantitative trait loci (QTL) in biparental populations. Eight functionalities are integrated in this software package: (1) BIN: binning of redundant markers; (2) MAP: construction of linkage maps in biparental populations; (3) CMP: consensus map construction from multiple linkage maps sharing common markers; (4) SDL: mapping of segregation distortion loci; (5) BIP: mapping of additive, dominant, and digenic epistasis genes; (6) MET: QTL-by-environment interaction analysis; (7) CSL: mapping of additive and digenic epistasis genes with chromosome segment substitution lines; and (8) NAM: QTL mapping in NAM populations. Input files can be arranged in plain text, MS Excel 2003, or MS Excel 2007 formats. Output files have the same prefix name as the input but with different extensions. As examples, there are two output files in BIN, one for summarizing the identified bin groups and deleted markers in each bin, and the other for using the MAP functionality. Eight output files are generated by MAP, including summary of the completed linkage maps, Mendelian ratio test of individual markers, estimates of recombination frequencies, LOD scores, and genetic distances, and the input files for using the BIP, SDL, and MET functionalities. More than 30 output files are generated by BIP, including results at all scanning positions, identified QTL, permutation tests, and detection powers for up to six mapping methods. Three supplementary tools have also been developed to display completed genetic linkage maps, to estimate recombination frequency between two loci, and to perform analysis of variance for multi-environmental trials.
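
    The recombination-frequency-to-map-distance conversion such tools perform can be illustrated with the standard mapping functions (textbook formulas; whether IciMapping uses these exact functions internally is not stated here).

    ```python
    # Haldane and Kosambi map distances (cM) from recombination frequency r.
    import math

    def haldane_cM(r):
        return -50.0 * math.log(1.0 - 2.0 * r)

    def kosambi_cM(r):
        # Kosambi's function accounts for crossover interference
        return 25.0 * math.log((1.0 + 2.0 * r) / (1.0 - 2.0 * r))

    r = 0.10                                  # 10% recombinant gametes
    print(haldane_cM(r), kosambi_cM(r))       # ~11.2 cM and ~10.1 cM
    ```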

  10. Applying the metro map to software development management

    Science.gov (United States)

    Aguirregoitia, Amaia; Dolado, J. Javier; Presedo, Concepción

    2010-01-01

    This paper presents MetroMap, a new graphical representation model for controlling and managing the software development process. MetroMap uses metaphors and visual representation techniques to explore several key indicators in order to support problem detection and resolution. The resulting visualization addresses diverse management tasks, such as tracking of deviations from the plan, analysis of patterns of failure detection and correction, overall assessment of change management policies, and estimation of product quality. The proposed visualization uses a metro map metaphor along with various interactive techniques to represent information concerning the software development process and to deal efficiently with multivariate visual queries. Finally, the paper shows the implementation of the tool in JavaFX with data from a real project and the results of testing the tool with the aforementioned data and users attempting several information retrieval tasks. The conclusion shows the results of analyzing user response time and efficiency using the MetroMap visualization system. The utility of the tool was positively evaluated.

  11. Empirical Studies on the Use of Social Software in Global Software Development - a Systematic Mapping Study

    DEFF Research Database (Denmark)

    Giuffrida, Rosalba; Dittrich, Yvonne

    2013-01-01

    Context: While a number of empirical studies on the usage of SoSo are available in related fields, there exists no comprehensive overview of what has been investigated to date across them. Objective: The aim of this review is to map empirical studies on the usage of SoSo in Software Engineering projects and in distributed teams, and to highlight the findings of research works which could prove to be beneficial for GSD researchers and practitioners. Method: A Systematic Mapping Study is conducted using a broad search string that allows identifying a variety of studies which can be beneficial for GSD. Papers have been retrieved through a combination of automatic search and snowballing, hence a wide quantitative map of the research area is provided. Additionally, text extracts from the studies are qualitatively synthesised to investigate benefits and challenges of the use of SoSo. Results: SoSo is reported as being chiefly used as a support...

  12. Software project management tools in global software development: a systematic mapping study.

    Science.gov (United States)

    Chadli, Saad Yasser; Idri, Ali; Ros, Joaquín Nicolás; Fernández-Alemán, José Luis; de Gea, Juan M Carrillo; Toval, Ambrosio

    2016-01-01

    Global software development (GSD) which is a growing trend in the software industry is characterized by a highly distributed environment. Performing software project management (SPM) in such conditions implies the need to overcome new limitations resulting from cultural, temporal and geographic separation. The aim of this research is to discover and classify the various tools mentioned in literature that provide GSD project managers with support and to identify in what way they support group interaction. A systematic mapping study has been performed by means of automatic searches in five sources. We have then synthesized the data extracted and presented the results of this study. A total of 102 tools were identified as being used in SPM activities in GSD. We have classified these tools, according to the software life cycle process on which they focus and how they support the 3C collaboration model (communication, coordination and cooperation). The majority of the tools found are standalone tools (77%). A small number of platforms (8%) also offer a set of interacting tools that cover the software development lifecycle. Results also indicate that SPM areas in GSD are not adequately supported by corresponding tools and deserve more attention from tool builders.

  13. MAPPING OF TRADITIONAL SOFTWARE DEVELOPMENT METHODS TO AGILE METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Rashmi Popli

    2013-02-01

    Agility brings responsibility and ownership to individuals, which eventually brings out effectiveness and efficiency in deliverables. Companies are drifting from traditional Software Development Life Cycle models to Agile environments for the purpose of attaining quality and saving cost and time. In traditional models, the life cycle is properly defined and the phases are elaborated by specifying the needed input and output parameters. In an Agile environment, on the other hand, the phases are specific to the particular Agile methodology, such as Extreme Programming. In this paper a common life cycle approach is proposed that is applicable to different kinds of methods. The paper also describes a mapping function for mapping traditional methods to Agile methods.

  14. VESGEN Software for Mapping and Quantification of Vascular Regulators

    Science.gov (United States)

    Parsons-Wingerter, Patricia A.; Vickerman, Mary B.; Keith, Patricia A.

    2012-01-01

    VESsel GENeration (VESGEN) Analysis is automated software that maps and quantifies effects of vascular regulators on vascular morphology by analyzing important vessel parameters. Quantification parameters include vessel diameter, length, branch points, density, and fractal dimension. For vascular trees, measurements are reported as dependent functions of vessel branching generation. VESGEN maps and quantifies vascular morphological events according to fractal-based vascular branching generation. It also relies on careful imaging of branching and networked vascular form. It was developed as a plug-in for ImageJ (National Institutes of Health, USA). VESGEN uses image-processing concepts of 8-neighbor pixel connectivity, skeleton, and distance map to analyze 2D, black-and-white (binary) images of vascular trees, networks, and tree-network composites. VESGEN maps typically 5 to 12 (or more) generations of vascular branching, starting from a single parent vessel. These generations are tracked and measured for critical vascular parameters that include vessel diameter, length, density and number, and tortuosity per branching generation. The effects of vascular therapeutics and regulators on vascular morphology and branching tested in human clinical or laboratory animal experimental studies are quantified by comparing vascular parameters with control groups. VESGEN provides a user interface to both guide and allow control over the user's vascular analysis process. An option is provided to select a morphological tissue type of vascular trees, networks, or tree-network composites, which determines the general collections of algorithms, intermediate images, and output images and measurements that will be produced.
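
    The skeleton and distance-map concepts VESGEN builds on can be sketched with generic SciPy/scikit-image operations (an illustration of the idea, not the plug-in's actual code).

    ```python
    # Estimate local vessel diameter along the centerline of a binary image:
    # diameter ~ 2 x Euclidean distance to background, sampled on the skeleton.
    import numpy as np
    from scipy.ndimage import distance_transform_edt
    from skimage.morphology import skeletonize

    vessels = np.zeros((64, 64), dtype=bool)
    vessels[30:34, 5:60] = True               # toy vessel segment, 4 px wide

    skel = skeletonize(vessels)               # 1-px centerline
    dist = distance_transform_edt(vessels)    # distance to nearest background
    diameters = 2.0 * dist[skel]              # per-skeleton-pixel diameter
    print(diameters.mean())                   # ~4 px for this toy vessel
    ```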

  15. Multi-parametric R-matrix for the sl(2|1) Yangian

    CERN Document Server

    Babichenko, Andrei

    2012-01-01

    We study the Yangian of the sl(2|1) Lie superalgebra in a multi-parametric four-dimensional representation. We use Drinfeld's second realization to derive the R-matrix, the antiparticle representation, the crossing and unitarity condition. We consistently apply the Yangian antipode and its inverse to the individual particles involved in the scattering. We explicitly find a scalar factor solving the crossing and unitarity conditions. The formulas we obtain bear some similarities with those familiar from the study of integrable structures in the AdS/CFT correspondence, although they present obvious crucial differences.

  16. Study protocol: multi-parametric magnetic resonance imaging for therapeutic response prediction in rectal cancer.

    Science.gov (United States)

    Pham, Trang Thanh; Liney, Gary; Wong, Karen; Rai, Robba; Lee, Mark; Moses, Daniel; Henderson, Christopher; Lin, Michael; Shin, Joo-Shik; Barton, Michael Bernard

    2017-07-04

    Response to neoadjuvant chemoradiotherapy (CRT) of rectal cancer is variable. Accurate imaging for prediction and early assessment of response would enable appropriate stratification of management to reduce treatment morbidity and improve therapeutic outcomes. Use of either diffusion weighted imaging (DWI) or dynamic contrast enhanced (DCE) imaging alone currently lacks sufficient sensitivity and specificity for clinical use to guide individualized treatment in rectal cancer. Multi-parametric MRI and analysis combining DWI and DCE may have potential to improve the accuracy of therapeutic response prediction and assessment. This protocol describes a prospective non-interventional single-arm clinical study. Patients with locally advanced rectal cancer undergoing preoperative CRT will prospectively undergo multi-parametric MRI pre-CRT, week 3 CRT, and post-CRT. The protocol consists of DWI using a read-out segmented sequence (RESOLVE), and DCE with pre-contrast T1-weighted (VIBE) scans for T1 calculation, followed by 60 phases at high temporal resolution (TWIST) after gadoversetamide injection. A 3-dimensional voxel-by-voxel technique will be used to produce colour-coded ADC and K(trans) histograms, and data evaluated in combination using scatter plots. MRI parameters will be correlated with surgical histopathology. Histopathology analysis will be standardized, with chemoradiotherapy response defined according to AJCC 7th Edition Tumour Regression Grade (TRG) criteria. Good response will be defined as TRG 0-1, and poor response will be defined as TRG 2-3. The combination of DWI and DCE can provide information on physiological tumour factors such as cellularity and perfusion that may affect radiotherapy response. If validated, multi-parametric MRI combining DWI and DCE can be used to stratify management in rectal cancer patients. Accurate imaging prediction of patients with a complete response to CRT would enable a 'watch and wait' approach, avoiding surgical morbidity
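
    The ADC maps central to the DWI arm of the protocol come from a standard mono-exponential model, sketched here with a two-point estimate (generic textbook computation; b-values and images are placeholders).

    ```python
    # Voxel-wise ADC: S(b) = S0 * exp(-b * ADC)  =>  ADC = ln(S0/Sb) / b
    import numpy as np

    b = 800.0                                   # s/mm^2 (assumed high b-value)
    S0 = np.random.rand(64, 64) * 1000 + 500    # placeholder b = 0 image
    Sb = S0 * np.exp(-b * 1.0e-3)               # synthetic tissue, ADC = 1e-3

    ADC = np.log(S0 / Sb) / b                   # mm^2/s per voxel
    # a rise in ADC during CRT is commonly read as falling tumour cellularity
    print(ADC.mean())                           # ~1.0e-3
    ```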

  19. Dynamic modeling and explicit/multi-parametric MPC control of pressure swing adsorption systems

    KAUST Repository

    Khajuria, Harish

    2011-01-01

    Pressure swing adsorption (PSA) is a flexible, albeit complex, gas separation system. Due to its inherently nonlinear nature and discontinuous operation, the design of a model-based PSA controller, especially under varying operating conditions, is a challenging task. This work focuses on the design of an explicit/multi-parametric model predictive controller for a PSA system. Based on a system involving four adsorbent beds separating a 70% H2, 30% CH4 mixture into high-purity hydrogen, the key controller objective is to rapidly track H2 purity to a set point value of 99.99%. To perform this task, a rigorous and systematic framework is employed. First, a high-fidelity detailed dynamic model is built to represent the system's real operation and to understand its dynamic behavior. The model is then used to derive appropriate linear models by applying suitable system identification techniques. For the reduced models, a model predictive control (MPC) problem is formulated, where the latest developments in multi-parametric programming and control are applied to derive a novel explicit MPC controller. To test the performance of the designed controller, closed-loop simulations are performed in which the dynamic model is used as the virtual plant. Comparison studies of the derived explicit MPC controller with conventional PID controllers are also performed. © 2010 Elsevier Ltd. All rights reserved.
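
    An explicit/multi-parametric MPC controller pre-solves the optimization offline as a piecewise-affine function of the measured state; the online problem it replaces is a receding-horizon program like the generic sketch below (placeholder model matrices, weights, and limits, not the paper's PSA model; uses the cvxpy library):

        import numpy as np
        import cvxpy as cp

        A = np.array([[0.9, 0.1], [0.0, 0.8]])     # placeholder identified model
        B = np.array([[0.0], [0.5]])
        N, x0 = 10, np.array([0.2, -0.1])          # horizon, current deviation state

        x, u = cp.Variable((2, N + 1)), cp.Variable((1, N))
        cost, cons = 0, [x[:, 0] == x0]
        for k in range(N):
            cost += cp.sum_squares(x[:, k + 1]) + 0.01 * cp.sum_squares(u[:, k])
            cons += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k],
                     cp.abs(u[:, k]) <= 1.0]       # actuator limits
        cp.Problem(cp.Minimize(cost), cons).solve()
        print("first control move:", u.value[:, 0])  # apply, shift horizon, re-solve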

  20. Multi-parametric relationships between PAM measurements and carbon incorporation, an in situ approach.

    Science.gov (United States)

    Napoléon, Camille; Claquin, Pascal

    2012-01-01

    Primary production (PP) in the English Channel was measured using (13)C uptake and compared to the electron transport rate (ETR) measured using PAM (pulse amplitude modulated) fluorometry. The relationship between carbon incorporation (P(obs)) and ETR was not linear but logarithmic. This result can be explained by alternative electron sinks at high irradiance which protect the phytoplankton from photoinhibition. A multi-parametric model was developed to estimate PP from ETR. This approach highlighted the importance of taking physicochemical parameters such as incident light and nutrient concentrations into account. The variation in the ETR/P(obs) ratio as a function of light revealed different trends which were characterized by three parameters (R(max), the maximum value of ETR/P(obs); E(Rmax), the light intensity at which R(max) is measured; and γ, the initial slope of the curve). Based on the values of these three parameters, the data were divided into six groups which were highly dependent on the seasons and on the physicochemical conditions. Using the multi-parametric model, defined from P(obs) and ETR measurements at low frequency, high-frequency measurements of ETR enabled us to estimate primary production capacity between November 2009 and December 2010 at high temporal and spatial scales.
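
    A saturating, logarithmic ETR-to-carbon-uptake relationship of this kind can be fitted as sketched below; the functional form P = a*ln(1 + b*ETR) and all numbers are illustrative stand-ins, not the published model:

        import numpy as np
        from scipy.optimize import curve_fit

        def log_model(etr, a, b):                  # logarithmic P(ETR) relationship
            return a * np.log1p(b * etr)

        etr = np.array([5.0, 10, 20, 40, 80, 160])         # example ETR values
        p_obs = np.array([1.0, 1.8, 2.9, 4.1, 5.2, 6.0])   # example 13C uptake
        (a, b), _ = curve_fit(log_model, etr, p_obs, p0=(1.0, 0.1))
        pp_estimate = log_model(etr, a, b)   # PP inferred from high-frequency ETR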

  1. Optimizing a machine learning based glioma grading system using multi-parametric MRI histogram and texture features.

    Science.gov (United States)

    Zhang, Xin; Yan, Lin-Feng; Hu, Yu-Chuan; Li, Gang; Yang, Yang; Han, Yu; Sun, Ying-Zhi; Liu, Zhi-Cheng; Tian, Qiang; Han, Zi-Yang; Liu, Le-De; Hu, Bin-Quan; Qiu, Zi-Yu; Wang, Wen; Cui, Guang-Bin

    2017-07-18

    Current machine learning techniques provide the opportunity to develop noninvasive and automated glioma grading tools, by utilizing quantitative parameters derived from multi-modal magnetic resonance imaging (MRI) data. However, the efficacies of different machine learning methods in glioma grading have not been investigated. A comprehensive comparison of varied machine learning methods in differentiating low-grade gliomas (LGGs) and high-grade gliomas (HGGs), as well as WHO grade II, III and IV gliomas, based on multi-parametric MRI images was performed in the current study. The parametric histogram and image texture attributes of 120 glioma patients were extracted from the perfusion, diffusion and permeability parametric maps of preoperative MRI. Then, 25 commonly used machine learning classifiers combined with 8 independent attribute selection methods were applied and evaluated using a leave-one-out cross validation (LOOCV) strategy. In addition, the influences of parameter selection on the classifying performances were investigated. We found that the support vector machine (SVM) exhibited superior performance to other classifiers. By combining all tumor attributes with the synthetic minority over-sampling technique (SMOTE), the highest classifying accuracy of 0.945 or 0.961 for LGG versus HGG or grade II, III and IV gliomas, respectively, was achieved. Application of the Recursive Feature Elimination (RFE) attribute selection strategy further improved the classifying accuracies. Moreover, the performances of the LibSVM, SMO and IBk classifiers were influenced by key parameters such as kernel type, C, gamma and K. SVM is a promising tool for developing an automated preoperative glioma grading system, especially when combined with the RFE strategy. Model parameters should be considered in glioma grading model optimization.
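
    A minimal scikit-learn sketch of the best-performing configuration reported here, an SVM with RFE attribute selection evaluated under LOOCV (the feature matrix is synthetic and the SMOTE over-sampling step is omitted for brevity):

        import numpy as np
        from sklearn.feature_selection import RFE
        from sklearn.model_selection import LeaveOneOut, cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        X = rng.normal(size=(60, 40))      # 60 "patients", 40 histogram/texture attributes
        y = rng.integers(0, 2, size=60)    # 0 = LGG, 1 = HGG

        model = make_pipeline(
            StandardScaler(),
            RFE(SVC(kernel="linear"), n_features_to_select=10),  # RFE needs a linear kernel
            SVC(kernel="rbf", C=1.0, gamma="scale"),
        )
        print("LOOCV accuracy:", cross_val_score(model, X, y, cv=LeaveOneOut()).mean())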

  2. Multi-parametric MRI-pathologic correlation of prostate cancer using tracked biopsies

    Science.gov (United States)

    Xu, Sheng; Turkbey, Baris; Kruecker, Jochen; Yan, Pingkun; Locklin, Julia; Pinto, Peter; Choyke, Peter; Wood, Bradford

    2010-02-01

    MRI is currently the most promising imaging modality for prostate cancer diagnosis due to its high resolution and multi-parametric nature. However, there is currently no standard for the integration of diagnostic information from different MRI sequences. We propose a method to increase the diagnostic accuracy of MRI by correlating biopsy specimens with four MRI sequences: T2-weighted MRI, diffusion weighted imaging, dynamic contrast enhanced MRI and MR spectroscopy. This method uses device tracking and image fusion to determine the specimen's position on MRI images. The proposed method is unbiased and cost-effective. It does not substantially interfere with the standard biopsy workflow, allowing it to be easily accepted by physicians. A study of 41 patients was carried out to validate the approach. The performance of all four MRI sequences in various combinations is reported. Guidelines are given for multi-parametric imaging and tracked biopsy of prostate cancer.

  3. Impact of state updating and multi-parametric ensemble for streamflow hindcasting in European river basins

    Science.gov (United States)

    Noh, S. J.; Rakovec, O.; Kumar, R.; Samaniego, L. E.

    2015-12-01

    Accurate and reliable streamflow prediction is essential to mitigate the social and economic damage caused by water-related disasters such as floods and droughts. Sequential data assimilation (DA) may facilitate improved streamflow prediction by using real-time observations to correct internal model states. In conventional DA methods such as state updating, parametric uncertainty is often ignored, mainly due to practical limitations of methodologies for specifying modeling uncertainty with a limited number of ensemble members. However, if parametric uncertainty related to routing and runoff components is not incorporated properly, the predictive uncertainty of the model ensemble may be insufficient to capture the dynamics of observations, which may degrade predictability. Recently, a multi-scale parameter regionalization (MPR) method was proposed to make hydrologic predictions at different scales using the same set of model parameters without losing much of the model performance. The MPR method incorporated within the mesoscale hydrologic model (mHM, http://www.ufz.de/mhm) can effectively represent and control the uncertainty of high-dimensional parameters in a distributed model using global parameters. In this study, we evaluate the impacts of streamflow data assimilation over European river basins. In particular, a multi-parametric ensemble approach is tested to account for the effects of parametric uncertainty in DA. Because augmentation of parameters is not required within an assimilation window, the approach can be more stable with limited ensemble members and has potential for operational use. To account for the response times and non-Gaussian characteristics of internal hydrologic processes, lagged particle filtering is utilized. The presentation will focus on the gains and limitations of streamflow data assimilation and the multi-parametric ensemble method over large-scale basins.
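
    The multi-parametric ensemble idea, in which each ensemble member carries its own parameter set so that resampling updates states and parameters together without in-window augmentation, can be sketched with a toy bootstrap particle filter (the one-parameter storage model and all numbers are illustrative, not mHM):

        import numpy as np

        rng = np.random.default_rng(1)
        n_part = 500
        x = rng.uniform(0.0, 1.0, n_part)      # initial storage states
        k = rng.uniform(0.7, 0.99, n_part)     # per-particle recession parameter

        sigma_obs = 0.05
        for q_obs in [0.62, 0.55, 0.50]:       # streamflow observations
            x = k * x + rng.normal(0.0, 0.02, n_part)            # toy runoff model
            w = np.exp(-0.5 * ((q_obs - x) / sigma_obs) ** 2)    # Gaussian likelihood
            w /= w.sum()
            idx = rng.choice(n_part, n_part, p=w)   # resample states AND parameters
            x, k = x[idx], k[idx]
        print("posterior means:", x.mean(), k.mean())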

  4. Characterizing Heterogeneity within Head and Neck Lesions Using Cluster Analysis of Multi-Parametric MRI Data.

    Directory of Open Access Journals (Sweden)

    Marco Borri

    Full Text Available To describe a methodology, based on cluster analysis, to partition multi-parametric functional imaging data into groups (or clusters) of similar functional characteristics, with the aim of characterizing functional heterogeneity within head and neck tumour volumes. To evaluate the performance of the proposed approach on a set of longitudinal MRI data, analysing the evolution of the obtained sub-sets with treatment. The cluster analysis workflow was applied to a combination of dynamic contrast-enhanced and diffusion-weighted imaging MRI data from a cohort of squamous cell carcinoma of the head and neck patients. Cumulative distributions of voxels, containing pre and post-treatment data and including both primary tumours and lymph nodes, were partitioned into k clusters (k = 2, 3 or 4). Principal component analysis and cluster validation were employed to investigate data composition and to independently determine the optimal number of clusters. The evolution of the resulting sub-regions with induction chemotherapy treatment was assessed relative to the number of clusters. The clustering algorithm was able to separate clusters which significantly reduced in voxel number following induction chemotherapy from clusters with a non-significant reduction. Partitioning with the optimal number of clusters (k = 4), determined with cluster validation, produced the best separation between reducing and non-reducing clusters. The proposed methodology was able to identify tumour sub-regions with distinct functional properties, independently separating clusters which were affected differently by treatment. This work demonstrates that unsupervised cluster analysis, with no prior knowledge of the data, can be employed to provide a multi-parametric characterization of functional heterogeneity within tumour volumes.

  5. Characterizing Heterogeneity within Head and Neck Lesions Using Cluster Analysis of Multi-Parametric MRI Data.

    Science.gov (United States)

    Borri, Marco; Schmidt, Maria A; Powell, Ceri; Koh, Dow-Mu; Riddell, Angela M; Partridge, Mike; Bhide, Shreerang A; Nutting, Christopher M; Harrington, Kevin J; Newbold, Katie L; Leach, Martin O

    2015-01-01

    To describe a methodology, based on cluster analysis, to partition multi-parametric functional imaging data into groups (or clusters) of similar functional characteristics, with the aim of characterizing functional heterogeneity within head and neck tumour volumes. To evaluate the performance of the proposed approach on a set of longitudinal MRI data, analysing the evolution of the obtained sub-sets with treatment. The cluster analysis workflow was applied to a combination of dynamic contrast-enhanced and diffusion-weighted imaging MRI data from a cohort of squamous cell carcinoma of the head and neck patients. Cumulative distributions of voxels, containing pre and post-treatment data and including both primary tumours and lymph nodes, were partitioned into k clusters (k = 2, 3 or 4). Principal component analysis and cluster validation were employed to investigate data composition and to independently determine the optimal number of clusters. The evolution of the resulting sub-regions with induction chemotherapy treatment was assessed relative to the number of clusters. The clustering algorithm was able to separate clusters which significantly reduced in voxel number following induction chemotherapy from clusters with a non-significant reduction. Partitioning with the optimal number of clusters (k = 4), determined with cluster validation, produced the best separation between reducing and non-reducing clusters. The proposed methodology was able to identify tumour sub-regions with distinct functional properties, independently separating clusters which were affected differently by treatment. This work demonstrates that unsupervised cluster analysis, with no prior knowledge of the data, can be employed to provide a multi-parametric characterization of functional heterogeneity within tumour volumes.
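
    A compact sketch of this workflow with synthetic data: standardize the voxel-wise DCE/DWI parameters, inspect composition with PCA, partition with k-means, and choose k with a validation index (silhouette here, as a stand-in for the paper's cluster-validation step):

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.decomposition import PCA
        from sklearn.metrics import silhouette_score
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(2)
        voxels = rng.normal(size=(5000, 4))    # e.g., Ktrans, ve, ADC, T2 per voxel

        Xs = StandardScaler().fit_transform(voxels)
        print("PCA composition:", PCA().fit(Xs).explained_variance_ratio_)
        scores = {k: silhouette_score(Xs, KMeans(n_clusters=k, n_init=10,
                                                 random_state=0).fit_predict(Xs))
                  for k in (2, 3, 4)}
        best_k = max(scores, key=scores.get)   # cluster-validation step
        labels = KMeans(n_clusters=best_k, n_init=10, random_state=0).fit_predict(Xs)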

  6. Mapping Pedagogical Opportunities Provided by Mathematics Analysis Software

    Science.gov (United States)

    Pierce, Robyn; Stacey, Kaye

    2010-01-01

    This paper proposes a taxonomy of the pedagogical opportunities that are offered by mathematics analysis software such as computer algebra systems, graphics calculators, dynamic geometry or statistical packages. Mathematics analysis software is software for purposes such as calculating, drawing graphs and making accurate diagrams. However, its…

  7. Software survey: VOSviewer, a computer program for bibliometric mapping

    NARCIS (Netherlands)

    N.J.P. van Eck (Nees Jan); L. Waltman (Ludo)

    2010-01-01

    We present VOSviewer, a freely available computer program that we have developed for constructing and viewing bibliometric maps. Unlike most computer programs that are used for bibliometric mapping, VOSviewer pays special attention to the graphical representation of bibliometric maps.

  8. Automated multi-parametric sorting of micron-sized particles via multi-trap laser tweezers

    Science.gov (United States)

    Kaputa, Daniel S.

    The capabilities of laser tweezers have rapidly expanded since the first demonstration by Ashkin and co-workers in 1970 of the ability to trap particles using optical energy. Laser tweezers have been used to measure piconewton forces in many biological and material science applications, sort bacteria, measure DNA bond strength, and even perform microsurgery. The laser tweezers system developed for this dissertation foreshadows the next generation of laser tweezer systems, which will provide automated particle sorting based upon multiple criteria. Many laser tweezer sorting applications today entail the operator sorting cells from a bulk sample, one by one. This dissertation demonstrates the technologies of pattern recognition and image processing that allow an entire microscope slide to be sorted without any operator intervention. We already live in an automated world where the cars we drive are built by machines instead of humans. The technology is there, and the only factor limiting the advancement of fully automated biological instrumentation is the lack of developers with the appropriate knowledge sets. This dissertation introduces the concept of sorting particles via a multi-parametric approach where several parameters such as size, fluorescence, and Raman spectra are used as sorting criteria. Since the advent of laser tweezers, several groups have demonstrated the ability to sort cells and other particles by size, or by fluorescence, or by any other single parameter, but to our knowledge there does not exist a laser tweezer sorting system that can sort particles based upon multiple parameters. Sorting via a single parameter can be a severe limitation, as the method lacks the robustness and class specificity that exist when sorting based upon multiple parameters. Simply put, it makes more sense to determine the worth of a baseball card by considering its condition as well as its age, rather than solely upon its condition. By adding another parameter such as the name of

  9. Classifying Glioblastoma Multiforme Follow-Up Progressive vs. Responsive Forms Using Multi-Parametric MRI Features

    Science.gov (United States)

    Ion-Mărgineanu, Adrian; Van Cauter, Sofie; Sima, Diana M.; Maes, Frederik; Sunaert, Stefan; Himmelreich, Uwe; Van Huffel, Sabine

    2017-01-01

    Purpose: The purpose of this paper is to discriminate between tumor progression and response to treatment based on follow-up multi-parametric magnetic resonance imaging (MRI) data retrieved from glioblastoma multiforme (GBM) patients. Materials and Methods: Multi-parametric MRI data consisting of conventional MRI (cMRI) and advanced MRI [i.e., perfusion weighted MRI (PWI) and diffusion kurtosis MRI (DKI)] were acquired from 29 GBM patients treated with adjuvant therapy after surgery. We propose an automatic pipeline for processing advanced MRI data and extracting intensity-based histogram features and 3-D texture features using manually and semi-manually delineated regions of interest (ROIs). Classifiers are trained using a leave-one-patient-out cross validation scheme on complete MRI data. Balanced accuracy rate (BAR) values are computed and compared between different ROIs, MR modalities, and classifiers, using non-parametric multiple comparison tests. Results: Maximum BAR values using manual delineations are 0.956, 0.85, 0.879, and 0.932, for cMRI, PWI, DKI, and all three MRI modalities combined, respectively. Maximum BAR values using semi-manual delineations are 0.932, 0.894, 0.885, and 0.947, for cMRI, PWI, DKI, and all three MR modalities combined, respectively. After statistical testing using Kruskal-Wallis and post-hoc Dunn-Šidák analysis we conclude that training a RUSBoost classifier on features extracted using semi-manual delineations on cMRI or on all MRI modalities combined performs best. Conclusions: We present two main conclusions: (1) using T1 post-contrast (T1pc) features extracted from manual total delineations, AdaBoost achieves the highest BAR value, 0.956; (2) using T1pc-average, T1pc-90th percentile, and Cerebral Blood Volume (CBV) 90th percentile extracted from semi-manually delineated contrast enhancing ROIs, SVM-rbf, and RUSBoost achieve BAR values of 0.947 and 0.932, respectively. Our findings show that AdaBoost, SVM-rbf, and

  10. [The primary research and development of software oversampling mapping system for electrocardiogram].

    Science.gov (United States)

    Zhou, Yu; Ren, Jie

    2011-04-01

    We put forward a new concept, a software oversampling mapping system for the electrocardiogram (ECG), to assist research on the ECG inverse problem by improving the generality of the mapping system and the quality of the mapped signals. We then developed a conceptual system based on a traditional ECG detection circuit, LabVIEW and a DAQ card produced by National Instruments, and incorporated the newly developed oversampling method into the system. The results indicated that the system could map ECG signals accurately and that the quality of the signals was good. The improvement of the hardware and enhancement of the software make the system suitable for mapping in different situations. The primary development of the software oversampling mapping system was therefore successful, and further research and development can make the system a powerful tool for studying the ECG inverse problem.
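
    The oversampling principle at work here, acquiring well above the target rate and averaging each block of samples to trade excess bandwidth for lower noise, can be sketched as follows (toy waveform, rates, and noise level are invented):

        import numpy as np

        fs_target, m = 500, 16                 # target rate (Hz), oversampling factor
        t = np.arange(0, 2, 1.0 / (fs_target * m))
        rng = np.random.default_rng(8)
        raw = np.sin(2 * np.pi * 1.2 * t) + rng.normal(0, 0.05, t.size)  # noisy "ECG"

        decimated = raw.reshape(-1, m).mean(axis=1)   # oversample-and-average
        # noise std drops by ~sqrt(m) relative to a single raw sample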

  11. Individual component analysis of the multi-parametric cardiovascular magnetic resonance protocol in the CE-MARC trial.

    Science.gov (United States)

    Ripley, David P; Motwani, Manish; Brown, Julia M; Nixon, Jane; Everett, Colin C; Bijsterveld, Petra; Maredia, Neil; Plein, Sven; Greenwood, John P

    2015-07-15

    The CE-MARC study assessed the diagnostic performance of cardiovascular magnetic resonance (CMR) in patients with suspected coronary artery disease (CAD). The study used a multi-parametric CMR protocol assessing 4 components: i) left ventricular function; ii) myocardial perfusion; iii) viability (late gadolinium enhancement (LGE)); and iv) coronary magnetic resonance angiography (MRA). In this pre-specified CE-MARC sub-study we assessed the diagnostic accuracy of the individual CMR components and their combinations. All patients from the CE-MARC population (n = 752) were included using data from the original blinded read. The four individual core components of the CMR protocol were determined separately and then in paired and triplet combinations. Results were then compared to the full multi-parametric protocol. CMR and X-ray angiography results were available in 676 patients. The maximum sensitivity for the detection of significant CAD by CMR was achieved when all four components were used (86.5%). The specificity of perfusion (91.8%), function (93.7%) and LGE (95.8%) alone was significantly better than the specificity of the multi-parametric protocol (83.4%) (all P values significant). The full multi-parametric protocol was the optimum to rule out significant CAD (negative likelihood ratio (LR-) 0.16) and the LGE component alone was the best to rule in CAD (LR+ 9.81). Overall diagnostic accuracy was similar for the full multi-parametric protocol (85.9%) compared to paired and triplet combinations. The use of coronary MRA within the full multi-parametric protocol had no additional diagnostic benefit compared to the perfusion/function/LGE combination (overall accuracy 84.6% vs. 84.2% (P = 0.5316); LR- 0.16 vs. 0.21; LR+ 5.21 vs. 5.77). From this pre-specified sub-analysis of the CE-MARC study, the full multi-parametric protocol had the highest sensitivity and was the optimal approach to rule out significant CAD. The LGE component alone was the optimal rule-in strategy.
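
    For reference, the rule-in/rule-out figures quoted above follow from a standard 2x2 analysis; a minimal sketch with placeholder counts (not the CE-MARC data):

        def diagnostic_stats(tp, fn, fp, tn):
            sens = tp / (tp + fn)
            spec = tn / (tn + fp)
            lr_pos = sens / (1 - spec)    # rule-in strength
            lr_neg = (1 - sens) / spec    # rule-out strength
            return sens, spec, lr_pos, lr_neg

        print(diagnostic_stats(tp=180, fn=28, fp=75, tn=393))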

  12. Evaluation criteria for software classification inventories, accuracies, and maps

    Science.gov (United States)

    Jayroe, R. R., Jr.

    1976-01-01

    Statistical criteria are presented for modifying the contingency table used to evaluate tabular classification results obtained from remote sensing and ground truth maps. This classification technique contains information on the spatial complexity of the test site, on the relative location of classification errors, on agreement of the classification maps with ground truth maps, and reduces back to the original information normally found in a contingency table.

  13. Multi-parametric ultrasound criteria for internal carotid artery disease - comparison with CT angiography

    Energy Technology Data Exchange (ETDEWEB)

    Barlinn, Kristian; Kepplinger, Jessica; Siepmann, Timo; Pallesen, Lars-Peder; Bodechtel, Ulf; Reichmann, Heinz; Puetz, Volker [Carl Gustav Carus University Hospital, Technische Universitaet Dresden, Department of Neurology, Dresden (Germany); Floegel, Thomas [Carl Gustav Carus University Hospital, Technische Universitaet Dresden, Department of Neurology, Dresden (Germany); Carl Gustav Carus University Hospital, Technische Universitaet Dresden, Department of Neuroradiology, Dresden (Germany); Kitzler, Hagen H. [Carl Gustav Carus University Hospital, Technische Universitaet Dresden, Department of Neuroradiology, Dresden (Germany); Alexandrov, Andrei V. [The University of Tennessee Health Science Center, Department of Neurology, Memphis, TN (United States)

    2016-09-15

    The German Society of Ultrasound in Medicine (known by its acronym DEGUM) recently proposed a novel multi-parametric ultrasound approach for comprehensive and accurate assessment of extracranial internal carotid artery (ICA) steno-occlusive disease. We determined the agreement between duplex ultrasonography (DUS) interpreted by the DEGUM criteria and CT angiography (CTA) for grading of extracranial ICA steno-occlusive disease. Consecutive patients with acute cerebral ischemia underwent DUS and CTA. Internal carotid artery stenosis was graded according to the DEGUM-recommended criteria for DUS. Independent readers manually performed North American Symptomatic Carotid Endarterectomy Trial-type measurements on axial CTA source images. Both modalities were compared using Spearman's correlation and Bland-Altman analyses. A total of 303 acute cerebral ischemia patients (mean age, 72 ± 12 years; 58% men; median baseline National Institutes of Health Stroke Scale score, 4 [interquartile range 7]) provided 593 DUS and CTA vessel pairs for comparison. There was a positive correlation between DUS and CTA (r_s = 0.783, p < 0.001), with a mean difference in degree of stenosis measurement of 3.57%. Bland-Altman analysis further revealed widely varying differences (95% limits of agreement: -29.26 to 22.84) between the two modalities. Although the novel DEGUM criteria showed overall good agreement between DUS and CTA across all stenosis ranges, the potential for wide incongruence with CTA underscores the need for local laboratory validation to avoid false screening results. (orig.)
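
    The two comparison statistics used in this study can be reproduced in a few lines (synthetic stenosis grades, not the study data):

        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(3)
        dus = rng.uniform(0, 100, 200)             # DUS % stenosis
        cta = dus + rng.normal(0, 13, 200)         # CTA % stenosis

        rho, p = spearmanr(dus, cta)
        diff = dus - cta
        bias = diff.mean()
        loa = (bias - 1.96 * diff.std(ddof=1),     # 95% limits of agreement
               bias + 1.96 * diff.std(ddof=1))
        print(f"rho={rho:.3f}, bias={bias:.2f}%, LoA={loa}")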

  14. Multi-parametric analysis and modeling of relationships between mitochondrial morphology and apoptosis.

    Science.gov (United States)

    Reis, Yara; Bernardo-Faura, Marti; Richter, Daniela; Wolf, Thomas; Brors, Benedikt; Hamacher-Brady, Anne; Eils, Roland; Brady, Nathan R

    2012-01-01

    Mitochondria exist as a network of interconnected organelles undergoing constant fission and fusion. Current approaches to study mitochondrial morphology are limited by low data sampling coupled with manual identification and classification of complex morphological phenotypes. Here we propose an integrated mechanistic and data-driven modeling approach to analyze heterogeneous, quantified datasets and infer relations between mitochondrial morphology and apoptotic events. We initially performed high-content, multi-parametric measurements of mitochondrial morphological, apoptotic, and energetic states by high-resolution imaging of human breast carcinoma MCF-7 cells. Subsequently, decision tree-based analysis was used to automatically classify networked, fragmented, and swollen mitochondrial subpopulations, at the single-cell level and within cell populations. Our results revealed subtle but significant differences in morphology class distributions in response to various apoptotic stimuli. Furthermore, key mitochondrial functional parameters including mitochondrial membrane potential and Bax activation, were measured under matched conditions. Data-driven fuzzy logic modeling was used to explore the non-linear relationships between mitochondrial morphology and apoptotic signaling, combining morphological and functional data as a single model. Modeling results are in accordance with previous studies, where Bax regulates mitochondrial fragmentation, and mitochondrial morphology influences mitochondrial membrane potential. In summary, we established and validated a platform for mitochondrial morphological and functional analysis that can be readily extended with additional datasets. We further discuss the benefits of a flexible systematic approach for elucidating specific and general relationships between mitochondrial morphology and apoptosis.
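
    The decision-tree classification step might look like the sketch below; the per-object shape descriptors and labels are synthetic placeholders, and the paper's actual feature set is richer:

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(4)
        X = rng.normal(size=(900, 3))        # e.g., area, elongation, branch count
        y = rng.integers(0, 3, size=900)     # 0 networked, 1 fragmented, 2 swollen

        Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0)
        tree = DecisionTreeClassifier(max_depth=4).fit(Xtr, ytr)
        print("held-out accuracy:", tree.score(Xte, yte))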

  15. Earthquake precursory research in western Himalaya based on the multi-parametric geophysical observatory data

    Science.gov (United States)

    Kumar, Naresh; Rawat, Gautam; Choubey, Vinay; Hazarika, Devajit

    2013-08-01

    The opening of cracks and influx of fluids in the dilatancy zone of an impending earthquake are expected to induce short-term changes in physical, chemical and hydrological properties during the earthquake build-up cycle, which should be reflected in time-varying geophysical fields. With this rationale, eleven geophysical parameters are being recorded in continuous mode at the Multi-Parametric Geophysical Observatory (MPGO) in Ghuttu, Garhwal Himalaya, for earthquake precursory research. Critical analysis of the various geophysical time series indicates anomalous behavior on a few occasions; however, the data are also influenced by many external forces. These external influences are the major deterrent to the isolation of precursory signals. Recent work has focused on data-adaptive techniques to estimate and eliminate the effects of solar-terrestrial and hydrological/environmental factors, delimiting the data to identify short-term precursors. Although no significant earthquake has been reported close to the observatory, some weak precursory signals and coseismic changes related to the occurrence of moderate and strong earthquakes have been identified in a few parameters.

  16. Multi-parametric analysis and modeling of relationships between mitochondrial morphology and apoptosis.

    Directory of Open Access Journals (Sweden)

    Yara Reis

    Full Text Available Mitochondria exist as a network of interconnected organelles undergoing constant fission and fusion. Current approaches to study mitochondrial morphology are limited by low data sampling coupled with manual identification and classification of complex morphological phenotypes. Here we propose an integrated mechanistic and data-driven modeling approach to analyze heterogeneous, quantified datasets and infer relations between mitochondrial morphology and apoptotic events. We initially performed high-content, multi-parametric measurements of mitochondrial morphological, apoptotic, and energetic states by high-resolution imaging of human breast carcinoma MCF-7 cells. Subsequently, decision tree-based analysis was used to automatically classify networked, fragmented, and swollen mitochondrial subpopulations, at the single-cell level and within cell populations. Our results revealed subtle but significant differences in morphology class distributions in response to various apoptotic stimuli. Furthermore, key mitochondrial functional parameters including mitochondrial membrane potential and Bax activation, were measured under matched conditions. Data-driven fuzzy logic modeling was used to explore the non-linear relationships between mitochondrial morphology and apoptotic signaling, combining morphological and functional data as a single model. Modeling results are in accordance with previous studies, where Bax regulates mitochondrial fragmentation, and mitochondrial morphology influences mitochondrial membrane potential. In summary, we established and validated a platform for mitochondrial morphological and functional analysis that can be readily extended with additional datasets. We further discuss the benefits of a flexible systematic approach for elucidating specific and general relationships between mitochondrial morphology and apoptosis.

  17. Comparison of unsupervised classification methods for brain tumor segmentation using multi-parametric MRI

    Directory of Open Access Journals (Sweden)

    N. Sauwen

    2016-01-01

    Full Text Available Tumor segmentation is a particularly challenging task in high-grade gliomas (HGGs), as they are among the most heterogeneous tumors in oncology. An accurate delineation of the lesion and its main subcomponents contributes to optimal treatment planning, prognosis and follow-up. Conventional MRI (cMRI) is the imaging modality of choice for manual segmentation, and is also considered in the vast majority of automated segmentation studies. Advanced MRI modalities such as perfusion-weighted imaging (PWI), diffusion-weighted imaging (DWI) and magnetic resonance spectroscopic imaging (MRSI) have already shown their added value in tumor tissue characterization, hence there have been recent suggestions of combining different MRI modalities into a multi-parametric MRI (MP-MRI) approach for brain tumor segmentation. In this paper, we compare the performance of several unsupervised classification methods for HGG segmentation based on MP-MRI data including cMRI, DWI, MRSI and PWI. Two independent MP-MRI datasets with a different acquisition protocol were available from different hospitals. We demonstrate that a hierarchical non-negative matrix factorization variant which was previously introduced for MP-MRI tumor segmentation gives the best performance in terms of mean Dice-scores for the pathologic tissue classes on both datasets.
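
    As an illustration of the matrix-factorization idea (a plain single-level scikit-learn NMF, not the hierarchical variant the paper finds best; the voxel-by-feature matrix is synthetic and non-negative, as NMF requires):

        import numpy as np
        from sklearn.decomposition import NMF

        rng = np.random.default_rng(5)
        V = np.abs(rng.normal(size=(10000, 12)))   # voxels x MP-MRI features

        model = NMF(n_components=4, init="nndsvda", max_iter=500, random_state=0)
        W = model.fit_transform(V)       # per-voxel abundance of each tissue class
        H = model.components_            # parametric signature of each tissue class
        labels = W.argmax(axis=1)        # hard segmentation: dominant class per voxel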

  18. Multi-parametric clustering for sensor node coordination in cognitive wireless sensor networks.

    Science.gov (United States)

    Wang, Xiao Yu; Wong, Alexander

    2013-01-01

    The deployment of wireless sensor networks for healthcare applications has been motivated and driven by the increasing demand for real-time monitoring of patients in hospital and large disaster response environments. A major challenge in developing such sensor networks is the need to coordinate a large number of randomly deployed sensor nodes. In this study, we propose a multi-parametric clustering scheme designed to aid in the coordination of sensor nodes within cognitive wireless sensor networks. In the proposed scheme, sensor nodes are clustered together based on similar network behaviour across multiple network parameters, such as channel availability, interference characteristics, and topological characteristics, followed by mechanisms for forming, joining and switching clusters. An extensive performance evaluation is conducted to study important factors such as clustering overhead, cluster-joining estimation error, interference probability, and the probability of reclustering. Results show that the proposed clustering scheme can be an excellent candidate for use in large-scale cognitive wireless sensor network deployments with high dynamics.

  19. Multi-parametric monitoring and assessment of high-intensity focused ultrasound (HIFU) boiling by harmonic motion imaging for focused ultrasound (HMIFU): an ex vivo feasibility study.

    Science.gov (United States)

    Hou, Gary Y; Marquet, Fabrice; Wang, Shutao; Konofagou, Elisa E

    2014-03-07

    Harmonic motion imaging for focused ultrasound (HMIFU) is a recently developed high-intensity focused ultrasound (HIFU) treatment monitoring method with feasibilities demonstrated in vitro and in vivo. Here, a multi-parametric study is performed to investigate both elastic and acoustics-independent viscoelastic tissue changes using the Harmonic Motion Imaging (HMI) displacement, axial compressive strain and change in relative phase shift during high energy HIFU treatment with tissue boiling. Forty three (n = 43) thermal lesions were formed in ex vivo canine liver specimens (n = 28). Two-dimensional (2D) transverse HMI displacement maps were also obtained before and after lesion formation. The same method was repeated in 10 s, 20 s and 30 s HIFU durations at three different acoustic powers of 8, 10, and 11 W, which were selected and verified as treatment parameters capable of inducing boiling using both thermocouple and passive cavitation detection (PCD) measurements. Although a steady decrease in the displacement, compressive strain, and relative change in the focal phase shift (Δϕ) were obtained in numerous cases, indicating an overall increase in relative stiffness, the study outcomes also showed that during boiling, a reverse lesion-to-background displacement contrast was detected, indicating potential change in tissue absorption, geometrical change and/or, mechanical gelatification or pulverization. Following treatment, corresponding 2D HMI displacement images of the thermal lesions also mapped consistent discrepancy in the lesion-to-background displacement contrast. Despite the expectedly chaotic changes in acoustic properties with boiling, the relative change in phase shift showed a consistent decrease, indicating its robustness to monitor biomechanical properties independent of the acoustic property changes throughout the HIFU treatment. In addition, the 2D HMI displacement images confirmed and indicated the increase in the thermal lesion size with

  20. Multi-parametric monitoring and assessment of High Intensity Focused Ultrasound (HIFU) boiling by Harmonic Motion Imaging for Focused Ultrasound (HMIFU): An ex vivo feasibility study

    Science.gov (United States)

    Hou, Gary Y.; Marquet, Fabrice; Wang, Shutao; Konofagou, Elisa E.

    2014-01-01

    Harmonic Motion Imaging for Focused Ultrasound (HMIFU) is a recently developed high-intensity focused ultrasound (HIFU) treatment monitoring method with feasibilities demonstrated in vitro and in vivo. Here, a multi-parametric study is performed to investigate both elastic and acoustics-independent viscoelastic tissue changes using the Harmonic Motion Imaging (HMI) displacement, axial compressive strain and change in relative phase-shift during high energy HIFU treatment with tissue boiling. Forty three (n=43) thermal lesions were formed in ex vivo canine liver specimens (n=28). Two dimensional (2D) transverse HMI displacement maps were also obtained before and after lesion formation. The same method was repeated in 10-s, 20-s and 30-s HIFU durations at three different acoustic powers of 8, 10, and 11W, which were selected and verified as treatment parameters capable of inducing boiling using both thermocouple and Passive Cavitation Detection (PCD) measurements. Although a steady decrease in the displacement, compressive strain, and relative change in the focal phase shift (Δφ) were obtained in numerous cases, indicating an overall increase in relative stiffness, the study outcomes also showed that during boiling, a reverse lesion-to-background displacement contrast was detected, indicating potential change in tissue absorption, geometrical change and/or, mechanical gelatification or pulverization. Following treatment, corresponding 2D HMI displacement images of the thermal lesions also mapped consistent discrepancy in the lesion-to-background displacement contrast. Despite unpredictable changes in acoustic properties with boiling, the relative change in phase shift showed a consistent decrease, indicating its robustness to monitor biomechanical properties independent of the acoustic property change throughout the HIFU treatment. In addition, the 2D HMI displacement images confirmed and indicated the increase in the thermal lesion size with treatment duration

  1. Mapping Social Network to Software Architecture to Detect Structure Clashes in Agile Software Development

    NARCIS (Netherlands)

    Amrit, Chintan Amrit; van Hillegersberg, Jos

    2007-01-01

    Software development is rarely an individual effort and generally involves teams of developers collaborating together in order to generate reliable code. Such collaborations require proper communication and regular coordination among the team members. In addition, coordination is required to sort

  2. Open Source Projects in Software Engineering Education: A Mapping Study

    Science.gov (United States)

    Nascimento, Debora M. C.; Almeida Bittencourt, Roberto; Chavez, Christina

    2015-01-01

    Context: It is common practice in academia to have students work with "toy" projects in software engineering (SE) courses. One way to make such courses more realistic and reduce the gap between academic courses and industry needs is getting students involved in open source projects (OSP) with faculty supervision. Objective: This study…

  3. Texture descriptors to distinguish radiation necrosis from recurrent brain tumors on multi-parametric MRI

    Science.gov (United States)

    Tiwari, Pallavi; Prasanna, Prateek; Rogers, Lisa; Wolansky, Leo; Badve, Chaitra; Sloan, Andrew; Cohen, Mark; Madabhushi, Anant

    2014-03-01

    Differentiating radiation necrosis (a radiation-induced treatment effect) from recurrent brain tumors (rBT) is currently one of the most clinically challenging problems in the care and management of brain tumor (BT) patients. Both radiation necrosis (RN) and rBT exhibit similar morphological appearance on standard MRI, making non-invasive diagnosis extremely challenging for clinicians, with surgical intervention being the only course for obtaining definitive "ground truth". Recent studies have reported that the underlying biological pathways defining RN and rBT are fundamentally different. This strongly suggests that there might be phenotypic differences, and hence cues on multi-parametric MRI, that can distinguish between the two pathologies. One challenge is that these differences, if they exist, might be too subtle to distinguish by the human observer. In this work, we explore the utility of computer-extracted texture descriptors on multi-parametric MRI (MP-MRI) to provide alternate representations of MRI that may be capable of accentuating subtle micro-architectural differences between RN and rBT for primary and metastatic (MET) BT patients. We further explore the utility of texture descriptors in identifying the MRI protocol (from amongst T1-w, T2-w and FLAIR) that best distinguishes RN and rBT across two independent cohorts of primary and MET patients. A set of 119 texture descriptors (co-occurrence matrix homogeneity, neighboring gray-level dependence matrix, multi-scale Gaussian derivatives, Laws features, and histogram of gradient orientations (HoG)) for modeling different macro- and micro-scale morphologic changes within the treated lesion area for each MRI protocol were extracted. Principal component analysis based variable importance projection (PCA-VIP), a feature selection method previously developed in our group, was employed to identify the importance of every texture descriptor in distinguishing RN and rBT on MP-MRI. PCA-VIP employs regression analysis to provide
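
    One of the descriptor families listed, co-occurrence matrix homogeneity, can be computed with scikit-image as below; the 8-bit ROI is synthetic and the distance/angle settings are assumed examples:

        import numpy as np
        from skimage.feature import graycomatrix, graycoprops

        rng = np.random.default_rng(6)
        patch = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)   # lesion ROI

        glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi / 2],
                            levels=256, symmetric=True, normed=True)
        print("GLCM homogeneity:", graycoprops(glcm, "homogeneity").mean())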

  4. Spatial data software integration - Merging CAD/CAM/mapping with GIS and image processing

    Science.gov (United States)

    Logan, Thomas L.; Bryant, Nevin A.

    1987-01-01

    The integration of CAD/CAM/mapping with image processing using geographic information systems (GISs) as the interface is examined. Particular emphasis is given to the development of software interfaces between JPL's Video Image Communication and Retrieval (VICAR)/Image Based Information System (IBIS) raster-based GIS and the CAD/CAM/mapping system. The design and functions of the VICAR and IBIS are described. Vector data capture and editing are studied. Various software programs for interfacing between the VICAR/IBIS and CAD/CAM/mapping are presented and analyzed.

  5. Usefulness of Multi-Parametric MRI for the Investigation of Posterior Cortical Atrophy.

    Directory of Open Access Journals (Sweden)

    Andrea Arighi

    Full Text Available Posterior Cortical Atrophy (PCA) is a neurodegenerative disease characterized by a progressive decline in selective cognitive functions anatomically referred to occipital, parietal and temporal brain regions, whose diagnosis is rather challenging for clinicians. The aim of this study was to assess, using quantitative Magnetic Resonance Imaging techniques, the pattern of regional grey matter loss and metabolism in individuals with PCA to improve pathophysiological comprehension and diagnostic confidence. We enrolled 5 patients with PCA and 5 matched controls who all underwent magnetic resonance imaging (MRI) and spectroscopy (MRS). Patients also underwent neuropsychological and cerebrospinal fluid (CSF) assessments. MRI data were used for unbiased assessment of regional grey matter loss in PCA patients compared to controls. MRS data were obtained from a set of brain regions, including the occipital lobe and the centrum semiovale bilaterally, and the posterior and anterior cingulate. VBM analysis documented the presence of focal brain atrophy in the occipital lobes and in the posterior parietal and temporal lobes bilaterally, but more pronounced on the right hemisphere. MRS revealed, in the occipital lobes and in the posterior cingulate cortex of PCA patients, reduced levels of N-Acetyl Aspartate (NAA), a marker of neurodegeneration, and increased levels of Myo-Inositol (Ins), a glial marker, with no hemispheric lateralization. The bilateral but asymmetric pattern of regional grey matter loss is consistent with patients' clinical and neuropsychological features and with previous literature. The MRS findings reveal different stages of neurodegeneration (neuronal loss; gliosis), which coexist and likely precede the occurrence of brain tissue loss, and might represent early biomarkers. In conclusion, this study indicates the potential usefulness of a multi-parametric MRI approach for an early diagnosis and staging of patients with PCA.

  6. Multi-parametric approach to identify coffee components that regulate mechanisms of gastric acid secretion.

    Science.gov (United States)

    Rubach, Malte; Lang, Roman; Seebach, Elisabeth; Somoza, Mark M; Hofmann, Thomas; Somoza, Veronika

    2012-02-01

    Chlorogenic acid (CA), caffeine (CAFF), pyrogallol (PYR), catechol (CAT), (β)N-alkanoyl-hydroxytryptamides (C5HT) and N-methylpyridinium (N-MP) were evaluated for their influence on mechanisms of gastric acid secretion as single compounds and in biomimetic mixtures. Compounds were tested in coffee representative concentrations. Human gastric cancer cells (HGT-1) were used to study the proton secretory activity by Ussing chamber experiments and FACS analysis. For activation of EGFr, Akt1, ERK1/2, ATF-2 and cAMP levels, we performed pathway screening assays. Time-dependent expression of related genes were determined by real-time PCR. Part of the data was used for neural network modeling to identify the most relevant compounds. N-MP increased the expression of the anti-secretory somatostatin receptor by 114%, whereas C5HT decreased its expression by 52%. N-MP down-regulated the pro-secretory CHRM3 receptor by 36% and the H⁺,K⁺-ATPase by 36%. CAFF stimulated the secretory activity in the functional assays, whereas N-MP and CA decreased proton secretion. After applying a pathway analysis, we were able to discriminate between CAFF, CA, CAT, C5HT, PYR and histamine-activating EGFr signaling and N-MP-associated ERK1/2 signaling. By applying a multi-parametric approach, N-MP was shown to effectively down-regulate mechanisms of gastric acid secretion in human parietal gastric cells. © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Prostate cancer on computed tomography: A direct comparison with multi-parametric magnetic resonance imaging and tissue pathology.

    Science.gov (United States)

    Jia, Jemianne Bautista; Houshyar, Roozbeh; Verma, Sadhna; Uchio, Edward; Lall, Chandana

    2016-01-01

    Multi-parametric prostate magnetic resonance imaging (MRI) is considered the current imaging standard for detection and staging of prostate cancer. The combination of anatomical and functional imaging provided in this exam significantly increases the accuracy of prostate cancer detection. Computed tomography (CT) imaging has so far been found to be lacking in this regard; however, observations at our academic institution as well as evidence in the literature support the proposition that CT could indeed be helpful in detecting prostate abnormalities that correspond to neoplasm. The purpose of this study was to show that areas of focal mass-like enhancement on CT imaging directly correlate with prostate neoplasms as revealed on multi-parametric MRI and follow-up targeted biopsy. This was a single-institution retrospective study with 27 male subjects. Inclusion criteria required subjects to have a multi-parametric MRI of the prostate between January 1, 2014 and June 1, 2015 and a pelvic venous phase contrast-enhanced CT study between January 1, 2000 and June 1, 2015. Two blinded radiologists read subjects' CT scans for any abnormalities of the prostate. CT and multi-parametric MRI results were compared and were considered concordant if focal or mass-like enhancement to a greater degree than the background parenchyma was detected in the same areas of the prostate on CT scan as areas of decreased T2 signal, perfusion abnormalities, and restricted diffusion on multi-parametric MRI. CT results were directly compared to multi-parametric MRI findings and biopsy results. The overall agreement of MRI and CT is 85.19% (95% CI: 67.52-94.08%). The positive percent agreement is 78.95% (95% CI: 54.43-93.95%) and the negative percent agreement is 100.0% (95% CI: 63.06-100.0%). When CT results are directly compared to biopsy results, the sensitivity and specificity of CT are 63.64% (95% CI: 30.79-89.07%) and 100.0% (95% CI: 47.82-100.0%). The positive predictive value (PPV) is

  8. [Software CMAP TOOLS ™ to build concept maps: an evaluation by nursing students].

    Science.gov (United States)

    Ferreira, Paula Barreto; Cohrs, Cibelli Rizzo; De Domenico, Edvane Birelo Lopes

    2012-08-01

    Concept mapping (CM) is a teaching strategy that can be used to solve clinical cases, but the maps are difficult to construct. The objective of this study was to describe the challenges and contributions of the Cmap Tools® software in building concept maps to solve clinical cases. To this end, a descriptive and qualitative method was used with junior nursing students from the Federal University of São Paulo. The teaching strategy was applied and the data were collected using the focus group technique. The results showed that the software facilitates and ensures the organization, visualization, and correlation of the data, although there are initial difficulties in handling its tools. In conclusion, the formatting and auto-formatting resources of Cmap Tools® facilitated the construction of concept maps; however, orientation strategies should be implemented for the initial stage of software use.

  9. Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan. Part 2, Mappings for the ASC software quality engineering practices. Version 1.0.

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, Molly A.; Heaphy, Robert; Sturtevant, Judith E.; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2005-01-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, 'ASCI Software Quality Engineering: Goals, Principles, and Guidelines'. This document also identifies ASC management and software project teams' responsibilities for implementing the software quality practices and for assessing progress towards achieving their software quality goals.

  10. Mapping social network to software architecture to detect structure clashes in agile software development

    NARCIS (Netherlands)

    Amrit, Chintan; Hillegersberg, van Jos

    2007-01-01

    Software development is rarely an individual effort and generally involves teams of developers collaborating together in order to generate reliable code. Such collaborations require proper communication and regular coordination among the team members. In addition, coordination is required to sort out

  11. Ionospheric Mapping Software Ensures Accuracy of Pilots GPS

    Science.gov (United States)

    2015-01-01

    IonoSTAGE and SuperTruth software are part of a suite created at the Jet Propulsion Laboratory to enable the Federal Aviation Administration's Wide Area Augmentation System, which provides pinpoint accuracy in aircraft GPS units. The system, used by more than 73,000 planes, facilitates landings under adverse conditions at small airports. In 2013, IonoSTAGE and SuperTruth found their first commercial license when NEC, based in Japan, with US headquarters in Irving, Texas, licensed the entire suite.

  12. Development of a reconstruction software of elemental maps by micro X-ray fluorescence

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, Andre Pereira de; Braz, Delson; Mota, Carla Lemos, E-mail: apalmeid@gmail.co, E-mail: delson@lin.ufrj.b, E-mail: clemos@con.ufrj.b [Universidade Federal do Rio de Janeiro (PEN/COPPE/UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-Graduacao de Engenharia. Programa de Energia Nuclear; Oliveira, Luis Fernando de; Barroso, Regina Cely; Pinto, Nivia Graciele Villela, E-mail: cely@uerj.b, E-mail: lfolive@uerj.b, E-mail: nitatag@gmail.co [Universidade do Estado do Rio de Janeiro (IF/UERJ), RJ (Brazil). Inst. de Fisica; Cardoso, Simone Coutinho [Universidade Federal do Rio de Janeiro (IF/UFRJ), RJ (Brazil). Inst. de Fisica; Moreira, Silvana [Universidade Estadual de Campinas (FEC/UNICAMP), SP (Brazil) Faculdade de Engenharia Civil, Arquitetura e Urbanismo

    2009-07-01

    The technique of X-ray fluorescence (XRF) using SR microbeams is a powerful analysis tool for studying elemental composition in several kinds of samples. One application of this technique is analysis done through the mapping of chemical elements, forming a matrix of data. The aim of this work is the presentation of the program MapXRF, an in-house software tool designed to optimize the processing and mapping of fluorescence intensity data. This program uses spectra generated by QXAS as input data and separates the intensities of each chemical element found in the fluorescence spectra into its own file. From these files, the program generates intensity maps that can be visualized in any image-processing program. The proposed software was tested using fluorescence data obtained at the XRF beamline of the National Synchrotron Light Laboratory (LNLS), Brazil. Automatic 2D scans were performed and element distribution maps were obtained in the form of a matrix of data. (author)
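
    The core reconstruction step, turning per-point element intensities from a raster scan into a displayable matrix, reduces to a reshape; the file name, scan dimensions, and raster order below are assumptions for illustration, not MapXRF's actual format:

        import numpy as np
        import matplotlib.pyplot as plt

        n_rows, n_cols = 40, 50
        fe = np.loadtxt("Fe_intensities.txt")    # one intensity per scan point
        fe_map = fe.reshape(n_rows, n_cols)      # row-major raster order assumed

        plt.imshow(fe_map, cmap="inferno", origin="lower")
        plt.colorbar(label="Fe Ka intensity (counts)")
        plt.savefig("fe_map.png", dpi=150)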

  13. Development of a software for reconstruction of X-ray fluorescence intensity maps

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, Andre Pereira de; Braz, Delson; Mota, Carla Lemos, E-mail: apalmeid@gmail.co, E-mail: delson@lin.ufrj.b, E-mail: clemos@con.ufrj.b [Coordenacao dos Programas de Pos-graduacao de Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Nuclear; Oliveira, Luis Fernando de; Barroso, Regina Cely; Pinto, Nivia Graciele Villela, E-mail: cely@uerj.b, E-mail: lfolive@uerj.b, E-mail: nitatag@gmail.co [Universidade do Estado do Rio de Janeiro (UERJ), Rio de Janeiro, RJ (Brazil). Inst. de Fisica; Cardoso, Simone Coutinho, E-mail: simone@if.ufrj.b [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Inst. de Fisica; Moreira, Silvana, E-mail: silvana@fec.unicamp.b [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Faculdade de Engenharia Civil, Arquitetura e Urbanismo

    2009-07-01

    The technique of X-ray fluorescence (XRF) using SR microbeams is a powerful analysis tool for studying elemental composition in several kinds of samples. One application of this technique is analysis done through the mapping of chemical elements, forming a matrix of data. The aim of this work is the presentation of the program MapXRF, an in-house software tool designed to optimize the processing and mapping of fluorescence intensity data. This program uses spectra generated by QXAS as input data and separates the intensities of each chemical element found in the fluorescence spectra into its own file. From these files, the program generates intensity maps that can be visualized in any image-processing program. The proposed software was tested using fluorescence data obtained at the XRF beamline of the National Synchrotron Light Laboratory (LNLS), Brazil. Automatic 2D scans were performed and element distribution maps were obtained in the form of a matrix of data. (author)

  16. Smoothing point data into maps using SAS/GRAPH (trade name) software. Forest Service research note

    Energy Technology Data Exchange (ETDEWEB)

    Chojnacky, D.C.; Rubey, M.E.

    1996-01-01

    Point or plot data are commonly available for mapping forest landscapes. Because such data are sampled, mapping a complete coverage usually requires some type of interpolation between plots. SAS/GRAPH software includes the G3GRID procedure for interpolating or smoothing this type of data for mapping with the G3D or GCONTOUR procedures. However, the smoothing process in G3GRID is not easily controlled, nor can it be used to display missing data within rectangular grid maps. These shortcomings motivated the development of SAS code that prepares point data for display in mapping units. This code links well with the rest of the SAS system to allow for powerful, easily controlled data analysis within mapping units. Examples are given for mapping forest vegetation with the GMAP procedure.
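
    For readers outside the SAS ecosystem, the G3GRID interpolation step can be sketched with SciPy; the plot locations and attribute values below are synthetic:

    import numpy as np
    from scipy.interpolate import griddata

    rng = np.random.default_rng(0)
    x, y = rng.uniform(0, 100, 200), rng.uniform(0, 100, 200)   # sampled plot locations
    z = np.sin(x / 15) + np.cos(y / 20)                         # plot attribute, e.g. basal area

    gx, gy = np.meshgrid(np.linspace(0, 100, 50), np.linspace(0, 100, 50))
    smooth = griddata((x, y), z, (gx, gy), method="cubic")      # interpolate between plots
    # Cells outside the convex hull of the plots stay NaN, so missing
    # data remain visible in the gridded map, unlike with G3GRID.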

  17. Hypertrophic cardiomyopathy: Cardiac structural and microvascular abnormalities as evaluated with multi-parametric MRI

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Yu-Dong, E-mail: njmu_zyd@163.com [Department of Radiology, the First Affiliated Hospital with Nanjing Medical University (China); Li, Meijiao, E-mail: newgljyk@163.com [Department of Radiology, Peking University First Hospital (China); Qi, Liang, E-mail: qiliang1120@126.com [Department of Radiology, the First Affiliated Hospital with Nanjing Medical University (China); Wu, Chen-Jiang, E-mail: njmu_wcj@163.com [Department of Radiology, the First Affiliated Hospital with Nanjing Medical University (China); Wang, Xiaoying, E-mail: cjr.wangxiaoying@vip.163.com [Department of Radiology, Peking University First Hospital (China)

    2015-08-15

    Highlights: • LGE-present HCM had lower K{sup trans} and higher V{sub e} and MTT than LGE-absent HCM and the normal group. • LGE-absent HCM had significantly higher V{sub e} and MTT than the normal group. • K{sup trans} was not changed between the LGE-absent and normal groups. • Microcirculatory dysfunction in HCM closely correlated to structural abnormality. - Abstract: Purpose: To determine the relationship between myocardial structural and microvascular abnormality in hypertrophic cardiomyopathy (HCM) by multi-parametric cardiac MRI. Materials and methods: Twenty-four HCM patients and eighteen controls were retrospectively included. Left ventricle mass (LVM), LV end-systolic and end-diastolic volume (LVESV, LVEDV), LV ejection fraction (LVEF), and 16-segment wall thickness at ES and ED (SESWT, SEDWT) were assessed with 2D cine-MRI. Myocardial perfusion (reflected by K{sup trans}), interstitial volume (V{sub e}) and mean transit time (MTT) were evaluated with a model-dependent dynamic contrast-enhanced MRI. Myocardial fibrosis was assessed with late gadolinium enhancement (LGE) imaging. Results: K{sup trans} was significantly decreased in LGE-present (0.74 ± 0.15 mL/g/min) against LGE-absent (0.55 ± 0.14 mL/g/min, p = 0.030) and normal group (0.81 ± 0.32 mL/g/min, p < 0.001), but was unchanged in LGE-absent against normal group (p > 0.05). V{sub e} and MTT were significantly increased in LGE-present (V{sub e}: 26.7 ± 15.7%; MTT: 28.6 ± 21.3 s) against LGE-absent (37.6 ± 18.3%; 49.8 ± 30.5 s) and normal group (19.7 ± 6.9%; 15.1 ± 3.9 s; all p < 0.001), and were significantly increased in LGE-absent against normal group (p < 0.001). LGE significantly correlated to K{sup trans}, V{sub e}, MTT, and SESWT (ρ = 0.232, −0.247, −0.443, and −0.207, respectively). K{sup trans} negatively correlated to SEDWT and SESWT (ρ = −0.224 and −0.231). V{sub e} and MTT positively correlated to SEDWT (V{sub e}: ρ = 0.223; MTT: ρ = 0.239) and SESWT (V{sub e}: ρ = 0.248; MTT:
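
    As an illustration of what a "model-dependent" DCE analysis involves, the sketch below fits K{sup trans} and V{sub e} with the standard Tofts model on synthetic data; the paper's exact kinetic model and arterial input function are not given in the abstract, so every value here is an assumption:

    import numpy as np
    from scipy.optimize import curve_fit

    t = np.linspace(0, 5, 120)                        # minutes
    dt = t[1] - t[0]
    cp = 5.0 * t * np.exp(-t / 0.8)                   # toy arterial input function

    def tofts(t, ktrans, ve):
        irf = ktrans * np.exp(-(ktrans / ve) * t)     # tissue impulse response
        return np.convolve(cp, irf)[: t.size] * dt    # tissue concentration curve

    ct = tofts(t, 0.6, 0.25) + np.random.default_rng(1).normal(0, 0.01, t.size)
    (ktrans, ve), _ = curve_fit(tofts, t, ct, p0=(0.3, 0.2))
    print(f"Ktrans = {ktrans:.2f} /min, ve = {ve:.2f}")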

  18. Investigating differential dynamics of the MAPK signaling cascade using a multi-parametric global sensitivity analysis.

    Directory of Open Access Journals (Sweden)

    Jeongah Yoon

    Cell growth critically depends on signalling pathways whose regulation is the focus of intense research. Without utilizing a priori knowledge of the relative importance of pathway components, we have applied in silico computational methods to the EGF-induced MAPK cascade. Specifically, we systematically perturbed the entire parameter space, including initial conditions, using a Monte Carlo approach, and investigated which protein components or kinetic reaction steps contribute to the differentiation of ERK responses. The model, based on previous work by Brightman and Fell (2000), is composed of 28 reactions, 27 protein molecules, and 48 parameters from both mass action and Michaelis-Menten kinetics. Our multi-parametric systems analysis confirms that Raf inactivation is one of the key steps regulating ERK responses to be either transient or sustained. Furthermore, the amplitude-differential ERK phosphorylations within the transient case are mainly attributed to the balance between activation and inactivation of Ras, while duration-differential ERK responses in the sustained case are, in addition to Ras, markedly affected by dephosphorylation/phosphorylation of both MEK and ERK. Our sub-module perturbations showed that MEK's and ERK's contribution to this differential ERK activation originates from fluctuations in intermediate pathway module components such as Ras and Raf, implicating a cooperative regulatory mode among the key components. The initial protein concentrations of corresponding reactions, such as Ras, GAP, and Raf, also influence the distinct signalling outputs of ERK activation. We then compare these results with those obtained from a single-parametric perturbation approach using an overall state sensitivity (OSS) analysis. The OSS findings indicate a more pronounced role of ERK's inhibitory feedback effect on catalysing the dissociation of the SOS complex. Both approaches reveal the presence of multiple specific reactions involved in
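
    The perturbation strategy can be illustrated on a toy two-variable cascade (not the 28-reaction Brightman and Fell model): sample every parameter at random, simulate, classify each response as transient or sustained, and ask which parameters separate the two classes:

    import numpy as np
    from scipy.integrate import solve_ivp

    def cascade(t, y, k_dec, k_inact, k_p, k_dp):
        raf, erk = y
        signal = np.exp(-k_dec * t)                   # decaying upstream stimulus
        return [signal * (1 - raf) - k_inact * raf,
                k_p * raf * (1 - erk) - k_dp * erk]

    rng = np.random.default_rng(7)
    t_eval = np.linspace(0, 100, 201)
    params, sustained = [], []
    for _ in range(500):
        p = 10 ** rng.uniform(-1.5, 0.5, 4)           # perturb all parameters over two decades
        erk = solve_ivp(cascade, (0, 100), [0.0, 0.0], args=tuple(p), t_eval=t_eval).y[1]
        params.append(np.log10(p))
        sustained.append(erk[-1] > 0.5 * erk.max())   # sustained vs transient output

    params, sustained = np.array(params), np.array(sustained)
    # Parameters whose sampled values differ most between the two classes
    # are the ones controlling transient-vs-sustained behaviour.
    effect = np.abs(params[sustained].mean(axis=0) - params[~sustained].mean(axis=0))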

  19. Assessment of mechanical properties of isolated bovine intervertebral discs from multi-parametric magnetic resonance imaging

    Directory of Open Access Journals (Sweden)

    Recuerda Maximilien

    2012-10-01

    Background: The treatment planning of spine pathologies requires information on the rigidity and permeability of the intervertebral discs (IVDs). Magnetic resonance imaging (MRI) offers great potential as a sensitive and non-invasive technique for describing the mechanical properties of IVDs. However, the literature reports small correlation coefficients between mechanical properties and MRI parameters. Our hypothesis is that the compressive modulus and the permeability of the IVD can be predicted by a linear combination of MRI parameters. Methods: Sixty IVDs were harvested from bovine tails and randomly separated into four groups (in-situ, digested-6h, digested-18h, digested-24h). Multi-parametric MRI acquisitions were used to quantify the relaxation times T1 and T2, the magnetization transfer ratio MTR, the apparent diffusion coefficient ADC and the fractional anisotropy FA. Unconfined compression, confined compression and direct permeability measurements were performed to quantify the compressive moduli and the hydraulic permeabilities. Differences between groups were evaluated with a one-way ANOVA. Multilinear regressions were performed between dependent mechanical properties and independent MRI parameters to verify our hypothesis. A principal component analysis was used to convert the set of possibly correlated variables into a set of linearly uncorrelated variables. Agglomerative hierarchical clustering was performed on the 3 principal components. Results: Multilinear regressions showed that 45 to 80% of the variance of the Young's modulus E, the aggregate modulus in absence of deformation HA0, the radial permeability kr and the axial permeability in absence of deformation k0 can be explained by the MRI parameters within both the nucleus pulposus and the annulus fibrosus. The principal component analysis reduced our variables to two principal components with a cumulative variability of 52-65%, which increased to 70-82% when considering the third
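
    The two statistical steps, a multilinear regression of each mechanical property on the five MRI parameters followed by principal component analysis, are easy to reproduce in outline; the sketch below uses random placeholder data, not the bovine disc measurements:

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(3)
    mri = rng.normal(size=(60, 5))                    # columns: T1, T2, MTR, ADC, FA
    modulus = mri @ [0.5, -0.3, 0.2, 0.8, 0.1] + rng.normal(0, 0.3, 60)

    fit = LinearRegression().fit(mri, modulus)
    print(f"R^2 = {fit.score(mri, modulus):.2f}")     # share of E explained by MRI parameters

    pca = PCA(n_components=3).fit(mri)
    print("cumulative variance:", pca.explained_variance_ratio_.cumsum())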

  20. Cyberphysical systems for epilepsy and related brain disorders multi-parametric monitoring and analysis for diagnosis and optimal disease management

    CERN Document Server

    Antonopoulos, Christos

    2015-01-01

    This book introduces a new cyberphysical system that combines clinical and basic neuroscience research with advanced data analysis and medical management tools for developing novel applications for the management of epilepsy. The authors describe the algorithms and architectures needed to provide ambulatory, diagnostic and long-term monitoring services through multi-parametric data collection. Readers will see how to achieve in-hospital quality standards, addressing conventional “routine” clinic-based service purposes, at reduced cost, with enhanced capability and increased geographical availability. The cyberphysical system described in this book is flexible, can be optimized for each patient, and is demonstrated in several case studies.

  1. Modern methods of documentation for conservation - digital mapping in metigo® MAP, Software for documentation, mapping and quantity survey and analysis

    Science.gov (United States)

    Siedler, Gunnar; Vetter, Sebastian

    2015-04-01

    Several years of experience in heritage documentation have provided the background for developing methods of cartography and digital evaluation. The outcome is a 2D-mapping software with integrated image rectification, developed over a period of more than 10 years, which became the state-of-the-art software for conservation and restoration projects, first in Germany and now elsewhere. If no mapping basis (image plan or CAD drawing) exists, users can create their own image plans using different types of rectification functions. Based on true-to-scale mappings, quantity surveys of areas and lines can be calculated automatically. Digital maps are used for the documentation and analysis of materials and damages, for planning required action, and for calculating costs. With the help of the project hierarchy, even large mapping projects with many sub-projects can be managed. The results of quantification can be exported to Excel spreadsheets for further processing. The combination of image processing and CAD functionality makes operation of the program user-friendly, both in the office and on-site. metigo MAP was developed in close cooperation with conservators and restorers. Based on simple equipment consisting of a digital camera, a laser measuring instrument for measuring distances or a total station, and a standard notebook, the mapping software is used in many restoration companies.
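
    The projective rectification at the heart of such 2D-mapping tools can be sketched with OpenCV: four image points with known real-world coordinates define a homography that maps the photograph onto a true-to-scale image plan. The file name and point coordinates below are illustrative, not taken from metigo MAP:

    import cv2
    import numpy as np

    photo = cv2.imread("facade.jpg")                                 # hypothetical input photograph
    src = np.float32([[102, 80], [890, 95], [870, 640], [95, 600]])  # pixel corners in the photo
    dst = np.float32([[0, 0], [800, 0], [800, 600], [0, 600]])       # plan coordinates, 1 px = 1 cm

    H = cv2.getPerspectiveTransform(src, dst)                   # homography from 4 correspondences
    plan = cv2.warpPerspective(photo, H, (800, 600))            # rectified, true-to-scale image plan
    cv2.imwrite("facade_plan.png", plan)
    # Lengths and areas measured on `plan` are now true to scale, which is
    # what makes automatic quantity surveys of areas and lines possible.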

  2. Automated diagnosis of prostate cancer in multi-parametric MRI based on multimodal convolutional neural networks

    Science.gov (United States)

    Le, Minh Hung; Chen, Jingyu; Wang, Liang; Wang, Zhiwei; Liu, Wenyu; Cheng, Kwang-Ting (Tim); Yang, Xin

    2017-08-01

    Automated methods for prostate cancer (PCa) diagnosis in multi-parametric magnetic resonance imaging (MP-MRI) are critical for alleviating the burden of radiograph interpretation while helping to improve diagnostic accuracy (Artan et al 2010 IEEE Trans. Image Process. 19 2444-55, Litjens et al 2014 IEEE Trans. Med. Imaging 33 1083-92, Liu et al 2013 SPIE Medical Imaging (International Society for Optics and Photonics) p 86701G, Moradi et al 2012 J. Magn. Reson. Imaging 35 1403-13, Niaf et al 2014 IEEE Trans. Image Process. 23 979-91, Niaf et al 2012 Phys. Med. Biol. 57 3833, Peng et al 2013a SPIE Medical Imaging (International Society for Optics and Photonics) p 86701H, Peng et al 2013b Radiology 267 787-96, Wang et al 2014 BioMed. Res. Int. 2014). This paper presents an automated method based on multimodal convolutional neural networks (CNNs) for two PCa diagnostic tasks: (1) distinguishing between cancerous and noncancerous tissues and (2) distinguishing between clinically significant (CS) and indolent PCa. Specifically, our multimodal CNNs effectively fuse apparent diffusion coefficients (ADCs) and T2-weighted MP-MRI images (T2WIs). To fuse ADCs and T2WIs effectively, we design a new similarity loss function that enforces consistent features to be extracted from both modalities. The similarity loss is combined with the conventional classification loss functions and integrated into the back-propagation procedure of CNN training. The similarity loss enables better fusion than existing methods because the feature learning processes of the two modalities are mutually guided, jointly helping the CNN ‘see’ the true visual patterns of PCa. The classification results of the multimodal CNNs are further combined with results based on handcrafted features using a support vector machine classifier. To achieve a satisfactory accuracy for clinical use, we comprehensively investigate three critical factors which could greatly affect the performance of our
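
    A minimal sketch of the similarity-loss mechanism with a toy two-branch network in PyTorch; the paper's actual architecture, loss weighting and training procedure differ:

    import torch
    import torch.nn as nn

    class ModalityNet(nn.Module):
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(4), nn.Flatten(),
                nn.Linear(16 * 4 * 4, 64))
            self.head = nn.Linear(64, 2)              # cancerous vs noncancerous

    adc_net, t2_net = ModalityNet(), ModalityNet()    # one branch per modality
    xent = nn.CrossEntropyLoss()

    def multimodal_loss(adc, t2, labels, lam=0.1):
        f_adc, f_t2 = adc_net.features(adc), t2_net.features(t2)
        cls = xent(adc_net.head(f_adc), labels) + xent(t2_net.head(f_t2), labels)
        sim = ((f_adc - f_t2) ** 2).mean()            # pull the two feature spaces together
        return cls + lam * sim                        # joint objective, backpropagated as usual

    loss = multimodal_loss(torch.randn(8, 1, 64, 64), torch.randn(8, 1, 64, 64),
                           torch.randint(0, 2, (8,)))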

  3. Investigating the Impact of Concept Mapping Software on Greek Students with Attention Deficit (AD)

    Science.gov (United States)

    Riga, Asimina; Papayiannis, Nikolaos

    2015-01-01

    The present study investigates whether the use of concept mapping software has a positive effect on students with Attention Deficit (AD) learning descriptive writing at the secondary level of education. It also examines what kinds of difficulties AD students may encounter during this learning procedure. Sample students were selected…

  4. Inspection design using 2D phased array, TFM and cueMAP software

    Energy Technology Data Exchange (ETDEWEB)

    McGilp, Ailidh; Dziewierz, Jerzy; Lardner, Tim; Mackersie, John; Gachagan, Anthony [Centre for Ultrasonic Engineering, University of Strathclyde, Glasgow (United Kingdom)

    2014-02-18

    A simulation suite, cueMAP, has been developed to facilitate the design of inspection processes and sparse 2D array configurations. At the core of cueMAP is a Total Focusing Method (TFM) imaging algorithm that enables computer-assisted design of ultrasonic inspection scenarios, including the design of bespoke array configurations to match the inspection criteria. This in-house developed TFM code allows for interactive evaluation of image quality indicators of ultrasonic imaging performance when utilizing a 2D phased array working in FMC/TFM mode. The cueMAP software uses a series of TFM images to build a map of the resolution, contrast and sensitivity of imaging performance for a simulated reflector swept across the inspection volume. The software takes into account probe properties, wedge or water standoff, and effects of specimen curvature. In the validation process of this new software package, two 2D arrays have been evaluated on 304N stainless steel samples, typical of the primary circuit in nuclear plants. Thick-section samples have been inspected using a 1 MHz 2D matrix array. Due to the processing efficiency of the software, the data collected from these array configurations has been used to investigate the influence of sub-aperture operation on inspection performance.

  5. Inspection design using 2D phased array, TFM and cueMAP software

    Science.gov (United States)

    McGilp, Ailidh; Dziewierz, Jerzy; Lardner, Tim; Mackersie, John; Gachagan, Anthony

    2014-02-01

    A simulation suite, cueMAP, has been developed to facilitate the design of inspection processes and sparse 2D array configurations. At the core of cueMAP is a Total Focusing Method (TFM) imaging algorithm that enables computer-assisted design of ultrasonic inspection scenarios, including the design of bespoke array configurations to match the inspection criteria. This in-house developed TFM code allows for interactive evaluation of image quality indicators of ultrasonic imaging performance when utilizing a 2D phased array working in FMC/TFM mode. The cueMAP software uses a series of TFM images to build a map of the resolution, contrast and sensitivity of imaging performance for a simulated reflector swept across the inspection volume. The software takes into account probe properties, wedge or water standoff, and effects of specimen curvature. In the validation process of this new software package, two 2D arrays have been evaluated on 304N stainless steel samples, typical of the primary circuit in nuclear plants. Thick-section samples have been inspected using a 1 MHz 2D matrix array. Due to the processing efficiency of the software, the data collected from these array configurations has been used to investigate the influence of sub-aperture operation on inspection performance.
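
    The TFM algorithm itself is compact: for every image pixel, sum the A-scan samples whose time of flight matches the transmit-receive path. A NumPy sketch on placeholder full matrix capture (FMC) data with toy array geometry; cueMAP's own implementation is not public here:

    import numpy as np

    def tfm(fmc, elem_x, fs, c, xs, zs):
        """fmc[tx, rx, t] -> image[z, x] by delay-and-sum focusing."""
        image = np.zeros((zs.size, xs.size))
        for tx in range(elem_x.size):
            for rx in range(elem_x.size):
                for ix, x in enumerate(xs):
                    d = (np.hypot(x - elem_x[tx], zs) +
                         np.hypot(x - elem_x[rx], zs))          # outgoing plus return path
                    idx = np.clip((d / c * fs).astype(int), 0, fmc.shape[2] - 1)
                    image[:, ix] += fmc[tx, rx, idx]
        return np.abs(image)

    elem_x = np.arange(16) * 0.6e-3                    # 16 elements at 0.6 mm pitch (toy values)
    fmc = np.random.default_rng(0).normal(size=(16, 16, 2000))  # placeholder A-scans
    img = tfm(fmc, elem_x, fs=50e6, c=5900.0,          # ~5900 m/s longitudinal velocity in steel
              xs=np.linspace(-5e-3, 5e-3, 64), zs=np.linspace(1e-3, 30e-3, 128))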

  6. Development of PowerMap: a software package for statistical power calculation in neuroimaging studies.

    Science.gov (United States)

    Joyce, Karen E; Hayasaka, Satoru

    2012-10-01

    Although there are a number of statistical software tools for voxel-based massively univariate analysis of neuroimaging data, such as fMRI (functional MRI), PET (positron emission tomography), and VBM (voxel-based morphometry), very few software tools exist for power and sample size calculation for neuroimaging studies. Unlike typical biomedical studies, outcomes from neuroimaging studies are 3D images of correlated voxels, requiring a correction for massive multiple comparisons. Thus, a specialized power calculation tool is needed for planning neuroimaging studies. To facilitate this process, we developed a software tool specifically designed for neuroimaging data. The software tool, called PowerMap, implements theoretical power calculation algorithms based on non-central random field theory. It can also calculate power for statistical analyses with FDR (false discovery rate) corrections. This GUI (graphical user interface)-based tool enables neuroimaging researchers without advanced knowledge of imaging statistics to calculate power and sample size in the form of 3D images. In this paper, we provide an overview of the statistical framework behind the PowerMap tool. Three worked examples are also provided, a regression analysis, an ANOVA (analysis of variance), and a two-sample t-test, in order to demonstrate the study planning process with PowerMap. We envision that PowerMap will be a great aid for future neuroimaging research.
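
    The single-test core of such a calculation, the power of a two-sample t-test obtained from the noncentral t distribution, is sketched below; PowerMap itself adds the voxel-wise random field and FDR machinery on top of this:

    import numpy as np
    from scipy import stats

    effect = 0.5                                       # assumed effect size (Cohen's d)
    alpha = 0.001                                      # strict threshold, as common in imaging
    for n in (10, 20, 40, 80):                         # per-group sample size
        df = 2 * n - 2
        ncp = effect * np.sqrt(n / 2)                  # noncentrality parameter
        t_crit = stats.t.ppf(1 - alpha / 2, df)
        power = (1 - stats.nct.cdf(t_crit, df, ncp)) + stats.nct.cdf(-t_crit, df, ncp)
        print(f"n = {n:3d}  power = {power:.3f}")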

  7. MAPS-15504 - An automated methodology for software process assessment

    Directory of Open Access Journals (Sweden)

    Itana Maria de Souza Gimenes

    2000-05-01

    Due to growing demands for quality, the software engineering community has produced several standards and approaches concerning the quality of software products and processes. A large part of these standards applies to the software process, among which ISO 9000-3, ISO 12207, CMM and ISO/IEC TR 15504 (the outcome of the SPICE project) stand out for their wide use. Another result of the software engineering community's research is process-centered software engineering environments (PSEEs), which aim at automating the software process. This article presents MAPS-15504, an automated methodology for assessing software process quality based on ISO/IEC TR 15504. The software process assessment methodology was applied to a case study and implemented in ExPSEE, an experimental environment developed at the Department of Informatics (DIN) of the State University of Maringá (UEM).

  8. A Systematic Mapping Study of Software Architectures for Cloud Based Systems

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali

    2014-01-01

    Context: Cloud computing has gained significant attention from researchers and practitioners. This emerging paradigm is being used to provide solutions in multiple domains without huge upfront investment because of its on-demand resource-provisioning model. However, the information about how software...... of this study is to systematically identify and analyze the currently published research on the topics related to software architectures for cloud-based systems in order to identify architecture solutions for achieving quality requirements. Method: We decided to carry out a systematic mapping study to find...... as much peer-reviewed literature on the topics related to software architectures for cloud-based systems as possible. This study has been carried out by following the guidelines for conducting systematic literature reviews and systematic mapping studies as reported in the literature. Based on our paper

  9. Introducing students to digital geological mapping: A workflow based on cheap hardware and free software

    Science.gov (United States)

    Vrabec, Marko; Dolžan, Erazem

    2016-04-01

    The undergraduate field course in Geological Mapping at the University of Ljubljana involves 20-40 students per year, which precludes the use of specialized rugged digital field equipment, as the costs would be far beyond the capabilities of the Department. A different mapping area is selected each year with the aim of providing the typical conditions that a professional geologist might encounter when doing fieldwork in Slovenia, which include rugged relief, dense tree cover, and moderately-well- to poorly-exposed bedrock due to vegetation and urbanization. It is therefore mandatory that digital tools and workflows be combined with classical methods of fieldwork since, for example, full-time precise GNSS positioning is not viable under such circumstances. Additionally, due to the prevailing combination of complex geological structure with generally poor exposure, students cannot be expected to produce line (vector) maps of geological contacts on the go, so there is no need for such functionality in the hardware and software that we use in the field. Our workflow therefore still relies on paper base maps, but is strongly complemented with digital tools to provide robust positioning, track recording, and acquisition of various point-based data. The primary field hardware is students' Android-based smartphones and, optionally, tablets. For our purposes, the built-in GNSS chips provide adequate positioning precision most of the time, particularly if they are GLONASS-capable. We use Oruxmaps, a powerful free offline map viewer for the Android platform, which facilitates the use of custom-made geopositioned maps. For digital base maps, which we prepare in the free QGIS software on Windows, we use scanned topographic maps provided by the National Geodetic Authority, but also other maps such as aerial imagery, processed Digital Elevation Models, scans of existing geological maps, etc. Point data, like important outcrop locations or structural measurements, are entered into Oruxmaps as

  10. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan part 2 mappings for the ASC software quality engineering practices, version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Heaphy, Robert; Sturtevant, Judith E.; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr. (,; .); Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR001.3.2 and CPR001.3.6 and to a Department of Energy document, "ASCI Software Quality Engineering: Goals, Principles, and Guidelines". This document also identifies ASC management and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  11. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan part 2 mappings for the ASC software quality engineering practices, version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Heaphy, Robert; Sturtevant, Judith E.; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr. (,; .); Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR001.3.2 and CPR001.3.6 and to a Department of Energy document, "ASCI Software Quality Engineering: Goals, Principles, and Guidelines". This document also identifies ASC management and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  12. The use of digital signal processors (DSPs) in real-time processing of multi-parametric bioelectronic signals.

    Science.gov (United States)

    Ressler, Johann; Dirscherl, Andreas; Grothe, Helmut; Wolf, Bernhard

    2007-02-01

    In many cases of bioanalytical measurement, the computation of large amounts of data, the analysis of complex signal waveforms, or the required signal speed can overwhelm the performance of microcontrollers, analog electronic circuits, or even PCs. One method of obtaining results in real time is to apply a digital signal processor (DSP) to the analysis or processing of measurement data. In this paper we show how DSP-supported multiply-and-accumulate (MAC) operations, such as time/frequency transformation, pattern recognition by correlation, convolution or filter algorithms, can optimize the processing of bioanalytical data. Discrete integral calculations are applied to the acquisition of impedance values as part of multi-parametric sensor chips, to pH monitoring using light-addressable potentiometric sensors (LAPS), and to the analysis of rapidly changing signal shapes, such as action potentials of cultured neuronal networks, as examples of DSP capability.
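
    The kind of MAC-dominated workload the paper offloads to a DSP, sliding correlation of a recording against a known waveform template, can be sketched in NumPy on synthetic data:

    import numpy as np

    fs = 20_000                                         # sample rate in Hz
    template = np.diff(np.exp(-((np.arange(40) - 10) / 4.0) ** 2))  # spike-like template
    signal = np.random.default_rng(2).normal(0, 0.2, fs)            # one second of noise
    signal[5000:5000 + template.size] += template       # bury one event in the noise

    corr = np.correlate(signal, template, mode="valid") # one MAC per lag and per tap
    print(f"event detected near sample {np.argmax(corr)} (planted at 5000)")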

  13. Recent developments in multi-parametric three-dimensional stress field representation in plates weakened by cracks and notches

    Directory of Open Access Journals (Sweden)

    P. Lazzarin

    2013-07-01

    The paper deals with the three-dimensional nature and the multi-parametric representation of the stress field ahead of cracks and notches of different shapes. Plates of finite thickness are considered, under different loading conditions. Under certain hypotheses, the three-dimensional governing equations of elasticity can be reduced to a system in which a bi-harmonic equation and a harmonic equation have to be satisfied simultaneously. The former provides the solution of the corresponding plane notch problem; the latter provides the solution of the corresponding out-of-plane shear notch problem. The analytical framework is applied to some notched and cracked geometries, and its degree of accuracy is discussed by comparing theoretical results with numerical data from 3D FE models.

  14. Prostate Cancer: Multi-Parametric Magnetic Resonance Imaging of the Prostate re-classifies patients eligible for Active Surveillance

    DEFF Research Database (Denmark)

    Elkjær, Maria; Pedersen, Bodil Ginnerup; Høyer, Søren

    2015-01-01

    Introduction and Objectives: Pathological examination of the extracted prostate after radical prostatectomy often reveals that trans-rectal ultrasound (TRUS) and TRUS-guided biopsy (TRUS-bx) underestimated the tumor's true size and aggressiveness. This clinical problem seems to defy a satisfactory... solution to the dilemma of whether to enroll patients in active surveillance (AS) or to offer the patient active treatment. We present the preliminary results of an ongoing trial in which we investigate whether multi-parametric magnetic resonance imaging (mpMRI) of the prostate detects significant prostate... cancer (PC) better than TRUS and TRUS-bx, and whether we may in this way make the selection of patients for active surveillance safer. Materials and Methods: From November 2014, patients enrolled in an AS program at Aarhus University Hospital, Denmark, were offered an mpMRI 8 weeks after TRUS-bx. All patients were

  15. Assessment of the classification abilities of the CNS multi-parametric optimization approach by the method of logistic regression.

    Science.gov (United States)

    Raevsky, O A; Polianczyk, D E; Mukhametov, A; Grigorev, V Y

    2016-08-01

    Assessment of "CNS drugs/CNS candidates" classification abilities of the multi-parametric optimization (CNS MPO) approach was performed by logistic regression. It was found that the five out of the six separately used physical-chemical properties (topological polar surface area, number of hydrogen-bonded donor atoms, basicity, lipophilicity of compound in neutral form and at pH = 7.4) provided accuracy of recognition below 60%. Only the descriptor of molecular weight (MW) could correctly classify two-thirds of the studied compounds. Aggregation of all six properties in the MPOscore did not improve the classification, which was worse than the classification using only MW. The results of our study demonstrate the imperfection of the CNS MPO approach; in its current form it is not very useful for computer design of new, effective CNS drugs.

  16. Multi-Parametric MRI and Texture Analysis to Visualize Spatial Histologic Heterogeneity and Tumor Extent in Glioblastoma.

    Directory of Open Access Journals (Sweden)

    Leland S Hu

    Genetic profiling represents the future of neuro-oncology but suffers from inadequate biopsies in heterogeneous tumors like glioblastoma (GBM). Contrast-enhanced MRI (CE-MRI) targets the enhancing core (ENH) but yields adequate tumor in only ~60% of cases. Further, CE-MRI poorly localizes infiltrative tumor within the surrounding non-enhancing parenchyma, or brain-around-tumor (BAT), despite the importance of characterizing this tumor segment, which universally recurs. In this study, we use multiple texture analysis and machine learning (ML) algorithms to analyze multi-parametric MRI and produce new images indicating tumor-rich targets in GBM. We recruited primary GBM patients undergoing image-guided biopsies and acquired pre-operative MRI: CE-MRI, dynamic-susceptibility-weighted contrast-enhanced MRI, and diffusion tensor imaging. Following image coregistration and region-of-interest placement at biopsy locations, we compared MRI metrics and regional texture with histologic diagnoses of high- vs low-tumor content (≥80% vs <80% tumor nuclei) for corresponding samples. In a training set, we used three texture analysis algorithms and three ML methods to identify MRI-texture features that optimized model accuracy in distinguishing tumor content. We confirmed model accuracy in a separate validation set. We collected 82 biopsies from 18 GBMs throughout ENH and BAT. The MRI-based model achieved 85% cross-validated accuracy in diagnosing high- vs low-tumor content in the training set (60 biopsies, 11 patients). The model achieved 81.8% accuracy in the validation set (22 biopsies, 7 patients). Multi-parametric MRI and texture analysis can help characterize and visualize GBM's spatial histologic heterogeneity to identify regional tumor-rich biopsy targets.
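
    One typical texture-analysis step, gray-level co-occurrence matrix (GLCM) features computed over a region of interest, can be sketched with scikit-image; the ROI below is a random patch, not MRI data, and the feature set is only an example of what such models consume:

    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    roi = np.random.default_rng(4).integers(0, 64, (32, 32), dtype=np.uint8)
    glcm = graycomatrix(roi, distances=[1, 2], angles=[0, np.pi / 2],
                        levels=64, symmetric=True, normed=True)
    features = {prop: graycoprops(glcm, prop).mean()
                for prop in ("contrast", "homogeneity", "energy", "correlation")}
    print(features)      # one feature row per biopsy ROI, fed to the ML classifier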

  17. A microbiology-based multi-parametric approach towards assessing biological stability in drinking water distribution networks

    KAUST Repository

    Lautenschläger, Karin

    2013-06-01

    Biological stability of drinking water implies that the concentration of bacterial cells and the composition of the microbial community should not change during distribution. In this study, we used a multi-parametric approach that encompasses different aspects of microbial water quality, including microbial growth potential, microbial abundance, and microbial community composition, to monitor biological stability in drinking water of the non-chlorinated distribution system of Zürich. Drinking water was collected directly after treatment from the reservoir and in the network at several locations with varied average hydraulic retention times (6-52 h) over a period of four months, with a single repetition two years later. Total cell concentrations (TCC) measured with flow cytometry remained remarkably stable at 9.5 (± 0.6) × 10^4 cells/ml from water in the reservoir throughout most of the distribution network, and during the whole time period. Conventional microbial methods like heterotrophic plate counts, the concentration of adenosine tri-phosphate, total organic carbon and assimilable organic carbon also remained constant. Samples taken two years apart showed more than 80% similarity for the microbial communities analysed with denaturing gradient gel electrophoresis and 454 pyrosequencing. Only the two sampling locations with the longest water retention times were the exceptions and, so far for unknown reasons, recorded a slight but significantly higher TCC (1.3 (± 0.1) × 10^5 cells/ml) compared to the other locations. This small change in microbial abundance detected by flow cytometry was also clearly observed in a shift in the microbial community profiles to a higher abundance of members from the Comamonadaceae (60% vs. 2% at other locations). Conventional microbial detection methods were not able to detect changes as observed with flow cytometric cell counts and microbial community analysis. Our findings demonstrate that the multi-parametric approach used provides a powerful

  18. Program Setup Time and Learning Curves associated with "ready to fly" Drone Mapping Hardware and Software.

    Science.gov (United States)

    Wilcox, T.

    2016-12-01

    How quickly can students (and educators) get started using a "ready to fly" UAS and popular publicly available photogrammetric mapping software for student research at the undergraduate level? This poster presentation focuses on the challenges of starting up your own drone-mapping program for undergraduate research in a compressed timescale of three months. Particular focus will be given to learning the operation of the platforms, hardware and software interface challenges, and using these electronic systems in real-world field settings that pose a range of physical challenges to both operators and equipment. We will be using a combination of the popular DJI Phantom UAS and Pix4D mapping software to investigate mass wasting processes and potential hazards present in public lands popular with recreational users. Projects are aimed at characterizing active geological hazards that operate on short timescales and may include gully headwall erosion in Flaming Geyser State Park and potential landslide instability within Capital State Forest, both in the Puget Sound region of Washington State.

  19. R-CMap-An open-source software for concept mapping.

    Science.gov (United States)

    Bar, Haim; Mentch, Lucas

    2017-02-01

    Planning and evaluating projects often involves input from many stakeholders. Fusing and organizing many different ideas, opinions, and interpretations into a coherent and acceptable plan or project evaluation is challenging. This is especially true when seeking contributions from a large number of participants, particularly when not all can participate in group discussions or when some prefer to contribute their perspectives anonymously. One of the major breakthroughs in the area of evaluation and program planning has been the use of graphical tools to represent the brainstorming process. This provides a quantitative framework for organizing ideas and general concepts into simple-to-interpret graphs. We developed a new, open-source concept mapping software called R-CMap, which is implemented in R. This software provides a graphical user interface to guide users through the analytical process of concept mapping. The R-CMap software allows users to generate a variety of plots, including cluster maps, point rating and cluster rating maps, as well as pattern matching and go-zone plots. Additionally, R-CMap is capable of generating detailed reports that contain useful statistical summaries of the data. The plots and reports can be embedded in Microsoft Office tools such as Word and PowerPoint, where users may manually adjust various plot and table features to achieve the best visual results in their presentations and official reports. The graphical user interface of R-CMap allows users to define cluster names, change the number of clusters, select rating variables for relevant plots, and importantly, select subsets of respondents by demographic criteria. The latter is particularly useful to project managers in order to identify different patterns of preferences by subpopulations. R-CMap is user-friendly, and does not require any programming experience. However, proficient R users can add to its functionality by directly accessing built-in functions in R and sharing new
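
    The computation R-CMap automates is the classical concept-mapping pipeline: aggregate participants' card sorts into a co-occurrence matrix, embed the statements with multidimensional scaling, then cluster. A Python sketch of that pipeline on simulated sorts (R-CMap itself is written in R, and its exact algorithms may differ):

    import numpy as np
    from sklearn.manifold import MDS
    from scipy.cluster.hierarchy import linkage, fcluster

    n_statements, n_sorters = 20, 12
    rng = np.random.default_rng(9)
    co = np.zeros((n_statements, n_statements))
    for _ in range(n_sorters):
        piles = rng.integers(0, 4, n_statements)     # each sorter groups statements into piles
        co += piles[:, None] == piles[None, :]       # count co-sorted pairs

    dissim = 1 - co / n_sorters                      # co-sort frequency -> dissimilarity
    xy = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(dissim)   # 2D point map of statements
    clusters = fcluster(linkage(xy, method="ward"), t=5, criterion="maxclust")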

  20. AV-8B Map System II: Moving Map Composer Software Users Manual

    Science.gov (United States)

    2007-11-02

    [The record's description consists of fragmented excerpts from the users manual: acknowledgments to the AV-8B Program Office, hardware setup steps for the Moving Map Composer workstation (placement of monitor, keyboard, mouse, and scanner), and a project staff list.]

  1. User's Guide for the MapImage Reprojection Software Package, Version 1.01

    Science.gov (United States)

    Finn, Michael P.; Trent, Jason R.

    2004-01-01

    Scientists routinely accomplish small-scale geospatial modeling in the raster domain, using high-resolution datasets (such as 30-m data) for large parts of continents and low-resolution to high-resolution datasets for the entire globe. Recently, Usery and others (2003a) expanded on the previously limited empirical work with real geographic data by compiling and tabulating the accuracy of categorical areas in projected raster datasets of global extent. Geographers and applications programmers at the U.S. Geological Survey's (USGS) Mid-Continent Mapping Center (MCMC) undertook an effort to expand and evolve an internal USGS software package, MapImage, or mapimg, for raster map projection transformation (Usery and others, 2003a). Daniel R. Steinwand of Science Applications International Corporation, Earth Resources Observation Systems Data Center in Sioux Falls, S. Dak., originally developed mapimg for the USGS, basing it on the USGS's General Cartographic Transformation Package (GCTP). It operated as a command-line program on the Unix operating system. Through efforts at MCMC, and in coordination with Mr. Steinwand, the program has been transformed from a command-line application into a software package with a graphical user interface for Windows, Linux, and Unix machines. Usery and others (2003b) pointed out that many commercial software packages do not use exact projection equations and that, even when exact projection equations are used, the software often produces errors and sometimes does not complete the transformation for specific projections, at specific resampling resolutions, and for specific singularities. Direct implementation of point-to-point transformation with appropriate functions yields the variety of projections available in these software packages, but implementation with data other than points requires specific adaptation of the equations or prior preparation of the data to allow the transformation to succeed. Additional
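
    The point-to-point transformation underlying raster reprojection can be illustrated with pyproj rather than GCTP, assuming PROJ's ESRI:54012 code for the World Eckert IV projection:

    import numpy as np
    from pyproj import Transformer

    lon, lat = np.meshgrid(np.linspace(-180, 180, 720), np.linspace(-90, 90, 360))
    fwd = Transformer.from_crs("EPSG:4326", "ESRI:54012", always_xy=True)
    x, y = fwd.transform(lon, lat)                   # project every cell's coordinates
    # A raster package must then resample values onto a regular grid in (x, y);
    # as the abstract notes, it is that step, not the projection equations,
    # where categorical areas get distorted or transformations fail near singularities.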

  2. Open Source Software for Mapping Human Impacts on Marine Ecosystems with an Additive Model

    Directory of Open Access Journals (Sweden)

    Andy Stock

    2016-06-01

    This paper describes an easy-to-use open source software tool implementing a commonly used additive model (Halpern et al., 'Science', 2008) for mapping human impacts on marine ecosystems. The tool has been used to map the potential for cumulative human impacts in Arctic marine waters and can support future human impact mapping projects by (1) making the model easier to use; (2) making updates of model results straightforward when better input data become available; (3) storing input data and information about processing steps in a defined format, thus facilitating data sharing and reproduction of modeling results; (4) supporting basic visualization of model inputs and outputs without the need for advanced technical skills. The tool, called EcoImpactMapper, was implemented in Java and is thus platform-independent. A tutorial, example data, the tool and the source code are available online.
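
    The additive model is a weighted triple sum: for each grid cell, cumulative impact is the sum over stressors and ecosystems of stressor intensity times ecosystem presence times a vulnerability weight. A NumPy sketch on small random rasters (EcoImpactMapper itself is written in Java):

    import numpy as np

    rng = np.random.default_rng(8)
    n_stress, n_eco, h, w = 3, 2, 50, 50
    stressors = rng.random((n_stress, h, w))        # normalized stressor intensity layers
    ecosystems = rng.random((n_eco, h, w)) > 0.5    # ecosystem presence/absence layers
    weights = rng.random((n_stress, n_eco))         # vulnerability weight matrix

    # impact[y, x] = sum over s, e of stressor * presence * weight
    impact = np.einsum('shw,ehw,se->hw', stressors, ecosystems, weights)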

  3. ShakeMap manual: technical manual, user's guide, and software guide

    Science.gov (United States)

    Wald, David J.; Worden, Bruce C.; Quitoriano, Vincent; Pankow, Kris L.

    2005-01-01

    ShakeMap (http://earthquake.usgs.gov/shakemap) --rapidly, automatically generated shaking and intensity maps--combines instrumental measurements of shaking with information about local geology and earthquake location and magnitude to estimate shaking variations throughout a geographic area. The results are rapidly available via the Web through a variety of map formats, including Geographic Information System (GIS) coverages. These maps have become a valuable tool for emergency response, public information, loss estimation, earthquake planning, and post-earthquake engineering and scientific analyses. With the adoption of ShakeMap as a standard tool for a wide array of users and uses came an impressive demand for up-to-date technical documentation and more general guidelines for users and software developers. This manual is meant to address this need. ShakeMap, and associated Web and data products, are rapidly evolving as new advances in communications, earthquake science, and user needs drive improvements. As such, this documentation is organic in nature. We will make every effort to keep it current, but undoubtedly necessary changes in operational systems take precedence over producing and making documentation publishable.

  4. Software Development Initiatives to Identify and Mitigate Security Threats - Two Systematic Mapping Studies

    Directory of Open Access Journals (Sweden)

    Paulina Silva

    2016-12-01

    Software security and development experts have addressed the problem of building secure software systems. There are several processes and initiatives for achieving secure software systems. However, most of these lack empirical evidence of their application and impact in building secure software systems. Two systematic mapping studies (SMs) have been conducted to cover the existing initiatives for the identification and mitigation of security threats. The SMs were executed in two steps: first in July 2015, then complemented through backward snowballing in July 2016. The integrated results of these two SM studies show that a total of 30 relevant sources were identified; 17 different initiatives covering the identification of threats and 14 covering the mitigation of threats were found. All the initiatives were associated with at least one activity of the Software Development Lifecycle (SDLC); while 6 showed signs of being applied in industrial settings, only 3 initiatives presented experimental evidence of their results through controlled experiments; some of the other selected studies presented case studies or proposals.

  5. Una introduzione ai software per il crime mapping / Observations préliminaires sur les logiciels du mappage du crime / Some introductory notes on crime mapping software

    Directory of Open Access Journals (Sweden)

    Ummarino Alessandro

    2013-03-01

    Full Text Available RiassuntoIl Crime Mapping più che una disciplina a se stante non è altro che l’applicazione di tecniche di analisi statistico-geografica allo studio dei reati. Grazie all’utilizzo dei software GIS (Geographic Information System, all’esponenziale sviluppo dell’informatica e alla facile accessibilità al web, la produzione di mappe di qualità è ormai alla portata di un qualunque utente medio. La possibilità di applicare tali tecniche di analisi è offerta in modo efficace da software GIS commerciali e da software GIS free e open source. Chi si vuole avvicinare a questa disciplina, sia che intenda procedere con applicazioni di tipo tattico (pianificazione dei controlli, attività di prevenzione, investigazioni giudiziarie, etc. sia che intenda svolgere degli studi di tipo sociologico (criminalità, devianza, illegalità diffusa, percezione della sicurezza, etc., deve comunque acquisire una solida preparazione di base nell’utilizzo di programmi GIS prima di inferire generalizzazioni dai risultati utilizzando chiavi di lettura provenienti dalle scienze sociali. Il Crime Mapping può trovare una valida applicazione nell’ambito di una generale attività di polizia, soprattutto a livello locale, per la gestione delle risorse destinate alla sicurezza, per la programmazione dei servizi di polizia e soprattutto quale supporto di tipo tattico nell’ambito di attività mirate alla repressione e alla prevenzione di specifici atti criminosi e illeciti. Le mappage du crime n’est pas simplement une discipline en soi, mais une application de techniques d’analyse statistiques et géographiques à l’étude du crime. Grâce au développement exponentiel de l’informatique et à l’accessibilité du Web , tous les utilisateurs moyens ont désormais la possibilité de produire des cartes des crimes de qualité avec le logiciel SIG (système d'information géographique (GIS - Geographic Information System. Aujourd’hui la possibilité de se

  6. An expert-based landslide susceptibility mapping (LSM) module developed for Netcad Architect Software

    Science.gov (United States)

    Sezer, E. A.; Nefeslioglu, H. A.; Osna, T.

    2017-01-01

    The main purpose of this study is to introduce an expert-based LSM module developed for the Netcad Architect software. A landslide-prone area located in the eastern Black Sea region of Turkey was selected as the experimental site for this study. The investigations were performed in four stages: (i) introducing the technical details of the LSM module and the theoretical background of the methods implemented in it, (ii) experiments: landslide susceptibility evaluations applying the M-AHP and Mamdani-type FIS methods in the expert-based LSM module, (iii) map similarity assessments and evaluations of the generalization capacities of the expert-based models, and (iv) performance assessments of the LSM module. Considering the areal distributions of the matching ratios obtained from the map similarity evaluations, M-AHP is the more pessimistic method and assigns a greater area to higher hazard classes, whereas the Mamdani-type FIS behaves more optimistically and restricts the area of higher hazard classes in the experimental site. According to the Receiver Operating Characteristic (ROC) curve analyses, the Area Under the ROC Curve (AUC) was 0.66 for the map produced with the Mamdani-type FIS and 0.82 for the map produced with M-AHP. To compare the time consumption of the expert methods, timing experiments were carried out: the Mamdani-type FIS completes its task in 3 h and 39 min, whereas M-AHP requires only 47 s. In conclusion, (i) the LSM module developed for the Netcad Architect software presents full-featured expert-based landslide susceptibility mapping abilities, and (ii) M-AHP is a useful method for capturing expert opinion and modeling landslide susceptibility.
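
    The ROC validation step can be reproduced in a few lines; the inventory labels and susceptibility scores below are simulated, not the study's data:

    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(6)
    landslide = rng.integers(0, 2, 5000)             # 1 = cell in the landslide inventory
    susceptibility = np.clip(landslide * 0.4 + rng.random(5000), 0, 1)  # toy model output

    print(f"AUC = {roc_auc_score(landslide, susceptibility):.2f}")  # cf. 0.82 vs 0.66 in the study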

  7. Lessons in modern digital field geology: Open source software, 3D techniques, and the new world of digital mapping

    Science.gov (United States)

    Pavlis, Terry; Hurtado, Jose; Langford, Richard; Serpa, Laura

    2014-05-01

    Although many geologists refuse to admit it, it is time to put paper-based geologic mapping into the historical archives and move to the full potential of digital mapping techniques. For our group, flat-map digital geologic mapping is now a routine operation in both research and instruction. Several software options are available, and basic proficiency with the software can be learned in a few hours of instruction and practice. The first practical field GIS software, ArcPad, remains a viable, stable option on Windows-based systems. However, the vendor seems to be moving away from ArcPad in favor of mobile software solutions that are difficult to implement without GIS specialists. Thus, we have pursued a second software option based on the open source program QGIS. Our QGIS system uses the same shapefile-centric data structure as our ArcPad system, including similar pop-up data entry forms and generic graphics for easy data management in the field. The advantage of QGIS is that the same software runs on virtually all common platforms except iOS, although the Android version remains unstable as of this writing. A third software option we are experimenting with for flat-map-based field work is Fieldmove, a derivative of the 3D-capable program Move developed by Midland Valley. Our initial experiments with Fieldmove are positive, particularly with the new, inexpensive (problem. As spatial databases evolve, these 3D models should be readily importable into the database.

  8. A microbiology-based multi-parametric approach towards assessing biological stability in drinking water distribution networks.

    Science.gov (United States)

    Lautenschlager, Karin; Hwang, Chiachi; Liu, Wen-Tso; Boon, Nico; Köster, Oliver; Vrouwenvelder, Hans; Egli, Thomas; Hammes, Frederik

    2013-06-01

    Biological stability of drinking water implies that the concentration of bacterial cells and composition of the microbial community should not change during distribution. In this study, we used a multi-parametric approach that encompasses different aspects of microbial water quality including microbial growth potential, microbial abundance, and microbial community composition, to monitor biological stability in drinking water of the non-chlorinated distribution system of Zürich. Drinking water was collected directly after treatment from the reservoir and in the network at several locations with varied average hydraulic retention times (6-52 h) over a period of four months, with a single repetition two years later. Total cell concentrations (TCC) measured with flow cytometry remained remarkably stable at 9.5 (± 0.6) × 10(4) cells/ml from water in the reservoir throughout most of the distribution network, and during the whole time period. Conventional microbial methods like heterotrophic plate counts, the concentration of adenosine tri-phosphate, total organic carbon and assimilable organic carbon remained also constant. Samples taken two years apart showed more than 80% similarity for the microbial communities analysed with denaturing gradient gel electrophoresis and 454 pyrosequencing. Only the two sampling locations with the longest water retention times were the exceptions and, so far for unknown reasons, recorded a slight but significantly higher TCC (1.3 (± 0.1) × 10(5) cells/ml) compared to the other locations. This small change in microbial abundance detected by flow cytometry was also clearly observed in a shift in the microbial community profiles to a higher abundance of members from the Comamonadaceae (60% vs. 2% at other locations). Conventional microbial detection methods were not able to detect changes as observed with flow cytometric cell counts and microbial community analysis. Our findings demonstrate that the multi-parametric approach used

  9. Real-Time Bathymetry and Backscatter Mosaic Software for Towed and Shipboard Mapping Sonars

    Science.gov (United States)

    Davis, R. B.; Appelgate, T. B.; Johnson, P. D.

    2006-12-01

    The Hawaii Mapping Research Group (HMRG) of the University of Hawaii has developed a bathymetry and backscatter mosaic software package that is suitable for use either in real-time survey operations or non-real-time processing applications. The system was originally developed for use with HMRG's own towed sonar systems but is also compatible with shipboard multibeam systems. It has been operational on R/V Kilo Moana under the supervision of the University's Ocean Technology Group marine science technicians for well over a year in support of that vessel's Kongsberg/Simrad EM120 and EM1002 multibeams. The software grids and renders incoming data in geo-referenced chart-like displays. Data are typically processed in three operator-selectable resolutions limited only by the processing and storage capabilities of the Linux- or Unix-based host computer system. A master navigation chart shows survey tracklines and swath coverage -- from this chart the user can simultaneously open multiple mosaic views into the underlying bathymetry and backscatter datasets with independent resolutions and display properties (e.g., contour color key and interval for bathymetry, grayscale mapping for backscatter, region of interest, etc.) All displays are optimized for quick pan and zoom operations and include on-screen interface control features which permit numerous display parameters to be rapidly changed. All mosaic views, as well as the master navigation display, can be updated in real-time as new data are supplied by the mapping instrument and can also display a background reference dataset for comparison with incoming instrument data. The software also includes survey-planning features which permit new survey waypoints to be generated interactively with reference to incoming and/or historical background data. Aboard R/V Kilo Moana a pair of dual-monitor computer systems, one for the EM120 and one for the EM1002, are available for the processing and display of incoming data from

  10. Identify and Classify Critical Success Factor of Agile Software Development Methodology Using Mind Map

    Directory of Open Access Journals (Sweden)

    Tasneem Abd El Hameed

    2016-05-01

    Selecting the right method, the right personnel and the right practices, and applying them adequately, determines the success of software development. In this paper, a qualitative study is carried out on the critical success factors identified in previous studies. The success factors are matched with their related agile principles to illustrate the most valuable factors for the success of the agile approach; the paper also shows that the twelve principles are poorly identified for a few of the factors resulting from past qualitative and quantitative studies. Dimensions and factors are presented using a Critical Success Dimensions and Factors Mind Map Model.

  11. A Mapping Model for Transforming Traditional Software Development Methods to Agile Methodology

    Directory of Open Access Journals (Sweden)

    Rashmi Popli

    2013-07-01

    Agility brings responsibility and ownership to individuals, which eventually brings out effectiveness and efficiency in deliverables. The Agile model is growing in the market at a very good pace. Companies are drifting from traditional Software Development Life Cycle models to Agile environments for the purpose of attaining quality and for the sake of saving cost and time. The nimble nature of Agile is helpful for frequent releases so as to satisfy the customer by providing frequent dual feedback. In traditional models, the life cycle is properly defined and the phases are elaborated by specifying the needed input and output parameters. On the other hand, in an Agile environment, phases are specific to the Agile methodologies - Extreme Programming, etc. In this paper a common life cycle approach is proposed that is applicable to different kinds of teams. The paper aims to describe a mapping function for the mapping of traditional methods to Agile methods.

  12. Integrating science and education during an international, multi-parametric investigation of volcanic activity at Santiaguito volcano, Guatemala

    Science.gov (United States)

    Lavallée, Yan; Johnson, Jeffrey; Andrews, Benjamin; Wolf, Rudiger; Rose, William; Chigna, Gustavo; Pineda, Armand

    2016-04-01

    In January 2016, we held the first scientific/educational Workshops on Volcanoes (WoV). The workshop took place at Santiaguito volcano - the most active volcano in Guatemala. 69 international scientists of all ages participated in this intensive, multi-parametric investigation of the volcanic activity, which included the deployment of seismometers, tiltmeters, infrasound microphones and mini-DOAS as well as optical, thermographic, UV and FTIR cameras around the active vent. These instruments recorded volcanic activity in concert over a period of 3 to 9 days. Here we review the research activities and present some of the spectacular observations made through this interdisciplinary effort. Observations range from high-resolution drone and IR footage of explosions, to monitoring of rock falls, to quantification of the erupted mass of different gases and ash, as well as morphological changes in the dome caused by recurring explosions (amongst many other volcanic processes). We will discuss the success of such integrative ventures in furthering science frontiers and developing the next generation of geoscientists.

  13. Multi-Parametric Profiling Network Based on Gene Expression and Phenotype Data: A Novel Approach to Developmental Neurotoxicity Testing

    Directory of Open Access Journals (Sweden)

    Hideko Sone

    2011-12-01

    The establishment of more efficient approaches for developmental neurotoxicity testing (DNT) has been an emerging issue for children's environmental health. Here we describe a systematic approach for DNT using the neuronal differentiation of mouse embryonic stem cells (mESCs) as a model of fetal programming. During embryoid body (EB) formation, mESCs were exposed to 12 chemicals for 24 h and then global gene expression profiling was performed using whole-genome microarray analysis. Gene expression signatures for seven kinds of gene sets related to neuronal development and neuronal diseases were selected for further analysis. At the later stages of neuronal cell differentiation from EBs, neuronal phenotypic parameters were determined using a high-content image analyzer. Bayesian network analysis was then performed based on the global gene expression and neuronal phenotypic data to generate comprehensive networks with a linkage between early events and later effects. Furthermore, the probability distribution values for the strength of the linkage between parameters in each network were calculated and then used in principal component analysis. The characterization of chemicals according to their neurotoxic potential reveals that multi-parametric analysis based on phenotype and gene expression profiling during neuronal differentiation of mESCs can provide a useful tool to monitor fetal programming and to predict developmentally neurotoxic compounds.

  14. Multi-parametric profiling network based on gene expression and phenotype data: a novel approach to developmental neurotoxicity testing.

    Science.gov (United States)

    Nagano, Reiko; Akanuma, Hiromi; Qin, Xian-Yang; Imanishi, Satoshi; Toyoshiba, Hiroyoshi; Yoshinaga, Jun; Ohsako, Seiichiroh; Sone, Hideko

    2012-01-01

    The establishment of more efficient approaches for developmental neurotoxicity testing (DNT) has been an emerging issue for children's environmental health. Here we describe a systematic approach for DNT using the neuronal differentiation of mouse embryonic stem cells (mESCs) as a model of fetal programming. During embryoid body (EB) formation, mESCs were exposed to 12 chemicals for 24 h and then global gene expression profiling was performed using whole-genome microarray analysis. Gene expression signatures for seven kinds of gene sets related to neuronal development and neuronal diseases were selected for further analysis. At the later stages of neuronal cell differentiation from EBs, neuronal phenotypic parameters were determined using a high-content image analyzer. Bayesian network analysis was then performed based on the global gene expression and neuronal phenotypic data to generate comprehensive networks with a linkage between early events and later effects. Furthermore, the probability distribution values for the strength of the linkage between parameters in each network were calculated and then used in principal component analysis. The characterization of chemicals according to their neurotoxic potential reveals that multi-parametric analysis based on phenotype and gene expression profiling during neuronal differentiation of mESCs can provide a useful tool to monitor fetal programming and to predict developmentally neurotoxic compounds.
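
    As a rough sketch of the final analysis step described in both versions of this abstract, the fragment below arranges per-chemical linkage-strength values into a matrix and applies principal component analysis; the data are random placeholders, not the study's measurements:

        # Hedged sketch: PCA over per-chemical linkage strengths (edge
        # probabilities from the Bayesian networks). Values are synthetic.
        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        n_chemicals, n_linkages = 12, 40
        strengths = rng.random((n_chemicals, n_linkages))   # rows: chemicals

        pca = PCA(n_components=2)
        scores = pca.fit_transform(strengths)   # chemicals in 2-D component space
        print(pca.explained_variance_ratio_, scores.shape)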

  15. Multi-parametric surface plasmon resonance platform for studying liposome-serum interactions and protein corona formation.

    Science.gov (United States)

    Kari, Otto K; Rojalin, Tatu; Salmaso, Stefano; Barattin, Michela; Jarva, Hanna; Meri, Seppo; Yliperttula, Marjo; Viitala, Tapani; Urtti, Arto

    2017-04-01

    When nanocarriers are administered into the blood circulation, a complex biomolecular layer known as the "protein corona" associates with their surface. Although the drivers of corona formation are not known, it is widely accepted that this layer mediates the biological interactions of the nanocarrier with its surroundings. Label-free optical methods can be used to study protein corona formation without interfering with its dynamics. We demonstrate a proof-of-concept for a multi-parametric surface plasmon resonance (MP-SPR) technique in monitoring the formation of a protein corona on surface-immobilized liposomes subjected to flowing 100% human serum. We observed the formation of formulation-dependent "hard" and "soft" coronas with distinct refractive indices, layer thicknesses, and surface mass densities. MP-SPR was also employed to determine the affinity (KD) of a complement system molecule (C3b) for cationic liposomes with and without polyethylene glycol. A tendency to create a thick corona correlated with a higher affinity of opsonin C3b for the surface. The label-free platform provides a fast and robust preclinical tool for tuning nanocarrier surface architecture and composition to control protein corona formation.
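
    The affinity determination mentioned above can be illustrated with a generic 1:1 Langmuir steady-state model, R_eq = R_max·C/(K_D + C); the sketch below uses invented concentrations and responses and is not the authors' analysis pipeline:

        # Illustrative K_D estimation from steady-state SPR responses with a
        # 1:1 Langmuir binding model. All numbers are hypothetical.
        import numpy as np
        from scipy.optimize import curve_fit

        def langmuir(conc, r_max, k_d):
            return r_max * conc / (k_d + conc)

        conc = np.array([5, 10, 25, 50, 100, 200.0])   # nM, hypothetical
        r_eq = np.array([12, 21, 38, 52, 63, 71.0])    # response units, hypothetical

        (r_max, k_d), _ = curve_fit(langmuir, conc, r_eq, p0=(80.0, 50.0))
        print(f"R_max = {r_max:.1f} RU, K_D = {k_d:.1f} nM")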

  16. Incorporating endorectal MR elastography into multi-parametric MRI for prostate cancer imaging: Initial feasibility in volunteers.

    Science.gov (United States)

    Arani, Arvin; Da Rosa, Michael; Ramsay, Elizabeth; Plewes, Don B; Haider, Masoom A; Chopra, Rajiv

    2013-11-01

    To investigate the tolerability and technical feasibility of performing endorectal MR elastography (eMRE) in human volunteers within the age group commonly affected by prostate cancer. Endorectal MRE was conducted on seven volunteers in a 1.5 Tesla (T) MR imager using a rigid endorectal coil. Another five volunteers were imaged on a 3T MR imager using an inflatable balloon-type endorectal coil. Tolerability was assessed for vibration amplitudes of ±1-50 μm and for frequencies of 100-300 Hz. All 12 volunteers tolerated the displacements necessary to successfully perform eMRE. Shear waves with frequencies up to 300 Hz could propagate across the entire prostate using both coil designs. The results of this study motivate further investigation of eMRE in prostate cancer patients to help determine whether there is added value in integrating eMRE into existing multi-parametric prostate MRI exams.

  17. COMPETENCY MAPPING IN A SMALL SOFTWARE BUSINESS: THE CASE OF ABC LTD

    Directory of Open Access Journals (Sweden)

    Ariel Behr

    2010-06-01

    Small businesses in Brazil are among the most managerially challenged in the country, principally because their small size requires a focus on day-to-day operations, leaving minimal resources for their management. This paper proposes a methodology for mapping individual, functional, and organizational competencies as they relate to the firm's strategy, and applies it in a small software firm. Our method is based on the systematic integration of "ends activities" (strategic objectives) and "means activities" (competencies) necessary to the organization, its functional areas, and its professionals. Put differently, this paper presents techniques for mapping the relation between strategy and organizational, functional, and individual competencies. Data collection involved four phases: (1) an initial interview with the owner; (2) analysis of commercial documents; (3) interviews with all managers; and (4) interviews with selected clients. Our study is unique in that it provides concrete guidelines for firms in the software industry, while at the same time bringing together previously isolated knowledge from different disciplines and different levels of analysis.

  18. A multi-parametric microarray for protein profiling: simultaneous analysis of 8 different cytochromes via differentially element tagged antibodies and laser ablation ICP-MS.

    Science.gov (United States)

    Waentig, Larissa; Techritz, Sandra; Jakubowski, Norbert; Roos, Peter H

    2013-11-07

    The paper presents a new multi-parametric protein microarray embracing the multi-analyte capabilities of laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS). The combination of high-throughput reverse-phase protein microarrays with element-tagged antibodies and LA-ICP-MS makes it possible to detect and quantify many proteins or biomarkers in multiple samples simultaneously. A proof-of-concept experiment is performed for the analysis of cytochromes, particularly of cytochrome P450 enzymes, which play an important role in the metabolism of xenobiotics such as toxicants and drugs. With the aid of the LA-ICP-MS based multi-parametric reverse-phase protein microarray it was possible to analyse 8 cytochromes in 14 different proteomes in one run. The methodology shows excellent detection limits in the lower amol range and a very good linearity of R² ≥ 0.9996, which is a prerequisite for the development of further quantification strategies.

  19. Multi-parametric (ADC/PWI/T2-w) image fusion approach for accurate semi-automatic segmentation of tumorous regions in glioblastoma multiforme.

    Science.gov (United States)

    Fathi Kazerooni, Anahita; Mohseni, Meysam; Rezaei, Sahar; Bakhshandehpour, Gholamreza; Saligheh Rad, Hamidreza

    2015-02-01

    Glioblastoma multiforme (GBM) brain tumor is heterogeneous in nature, so its quantification depends on accurately segmenting the different parts of the tumor, i.e. viable tumor, edema and necrosis. This procedure becomes more effective when metabolic and functional information, provided by physiological magnetic resonance (MR) imaging modalities such as diffusion-weighted imaging (DWI) and perfusion-weighted imaging (PWI), is incorporated with anatomical magnetic resonance imaging (MRI). In this preliminary tumor quantification work, the idea is to characterize different regions of GBM tumors in an MRI-based semi-automatic multi-parametric approach to achieve more accurate characterization of pathogenic regions. For this purpose, three MR sequences, namely T2-weighted imaging (anatomical MR imaging), PWI and DWI, of thirteen GBM patients were acquired. To enhance the delineation of the boundaries of each pathogenic region (peri-tumoral edema, viable tumor and necrosis), the spatial fuzzy C-means algorithm is combined with the region growing method. The results show that exploiting the multi-parametric approach along with the proposed semi-automatic segmentation method can differentiate various tumorous regions with over 80% sensitivity, specificity and Dice score. The proposed MRI-based multi-parametric segmentation approach has the potential to accurately segment tumorous regions, leading to an efficient design of pre-surgical treatment planning.
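
    The clustering idea behind the proposed segmentation can be illustrated with a plain fuzzy C-means loop; this sketch omits the paper's spatial regularization and region-growing steps and runs on random stand-in feature vectors:

        # Minimal fuzzy C-means sketch (intensities only, no spatial term).
        import numpy as np

        def fuzzy_cmeans(x, c=3, m=2.0, n_iter=50, seed=0):
            """x: (n_voxels, n_features) multi-parametric intensities."""
            rng = np.random.default_rng(seed)
            u = rng.random((len(x), c))
            u /= u.sum(axis=1, keepdims=True)            # fuzzy memberships
            for _ in range(n_iter):
                w = u ** m
                centers = (w.T @ x) / w.sum(axis=0)[:, None]
                d = np.linalg.norm(x[:, None, :] - centers[None, :, :], axis=2) + 1e-12
                u = 1.0 / (d ** (2 / (m - 1)))           # inverse-distance update
                u /= u.sum(axis=1, keepdims=True)
            return u, centers

        x = np.random.default_rng(1).random((1000, 3))   # e.g. ADC/PWI/T2 features
        u, centers = fuzzy_cmeans(x)
        labels = u.argmax(axis=1)                        # hard labels from memberships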

  20. Detection of sub-basaltic sediments by a multi-parametric joint inversion approach

    Indian Academy of Sciences (India)

    Ajay Manglik; Saurabh K Verma; K H Singh

    2009-10-01

    In many parts of the world, sedimentary horizons with potential for hydrocarbon are located below flood basalt provinces. However, the presence of high velocity basaltic overburden makes delineation of sediments difficult due to the low velocity layer problem. Electrical and electromagnetic methods have been used in such scenarios because of the good electrical conductivity contrast between basalts and underlying sediments. However, mapping of the target sediments becomes difficult when the layer is thin, as the data errors due to inherent noise lead to equivalent solutions. To tackle such difficult situations, a joint inversion scheme incorporating seismic reflection and refraction, magnetotelluric and deep electrical resistivity datasets is presented. Efficacy of the scheme is tested for a model comprising a thin sedimentary layer sandwiched between a thick basalt cover and a granitic basement. The results indicate that the parameters of the target sedimentary layer are either poorly resolved or equivalent solutions are obtained by the inversion of individual datasets. Joint inversions of seismic reflection (RFLS) and refraction (RFRS), or DC and MT dataset pairs, provide improved results and the range of equivalent solutions is narrowed down. Combination of any three of the above datasets leads to further narrowing of this range and improvements in mean model estimates. Joint inversion incorporating all the datasets is found to yield good estimates of the structure. Resolution analysis is carried out to appraise estimates of various model parameters obtained by jointly inverting different combinations of datasets.

  1. Zone-specific logistic regression models improve classification of prostate cancer on multi-parametric MRI

    Energy Technology Data Exchange (ETDEWEB)

    Dikaios, Nikolaos; Halligan, Steve; Taylor, Stuart; Atkinson, David; Punwani, Shonit [University College London, Centre for Medical Imaging, London (United Kingdom); University College London Hospital, Departments of Radiology, London (United Kingdom); Alkalbani, Jokha; Sidhu, Harbir Singh [University College London, Centre for Medical Imaging, London (United Kingdom); Abd-Alazeez, Mohamed; Ahmed, Hashim U.; Emberton, Mark [University College London, Research Department of Urology, Division of Surgery and Interventional Science, London (United Kingdom); Kirkham, Alex [University College London Hospital, Departments of Radiology, London (United Kingdom); Freeman, Alex [University College London Hospital, Department of Histopathology, London (United Kingdom)

    2015-09-15

    To assess the interchangeability of zone-specific (peripheral-zone (PZ) and transition-zone (TZ)) multiparametric-MRI (mp-MRI) logistic-regression (LR) models for classification of prostate cancer. Two hundred and thirty-one patients (70 TZ training-cohort; 76 PZ training-cohort; 85 TZ temporal validation-cohort) underwent mp-MRI and transperineal-template-prostate-mapping biopsy. PZ and TZ uni-/multi-variate mp-MRI LR models for classification of significant cancer (any cancer-core-length (CCL) with Gleason > 3 + 3 or any grade with CCL ≥ 4 mm) were derived from the respective cohorts and validated within the same zone by leave-one-out analysis. Inter-zonal performance was tested by applying TZ models to the PZ training-cohort and vice versa. Classification performance of TZ models for TZ cancer was further assessed in the TZ validation-cohort. ROC area-under-curve (ROC-AUC) analysis was used to compare models. The univariate parameters with the best classification performance were the normalised T2 signal (T2nSI) within the TZ (ROC-AUC = 0.77) and the normalised early contrast-enhanced T1 signal (DCE-nSI) within the PZ (ROC-AUC = 0.79). Performance was not significantly improved by bi-variate/tri-variate modelling. PZ models that contained DCE-nSI performed poorly in classification of TZ cancer, and the TZ model based solely on maximum enhancement poorly classified PZ cancer. LR models dependent on DCE-MRI parameters alone are not interchangeable between prostatic zones; however, models based exclusively on T2 and/or ADC are more robust for inter-zonal application.
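
    A minimal sketch of the modelling strategy described, with synthetic stand-ins for the mp-MRI features, fits a logistic regression and scores it by leave-one-out ROC-AUC:

        # Hedged sketch: zone-specific logistic regression with leave-one-out
        # validation. Feature values are synthetic placeholders.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import LeaveOneOut, cross_val_predict
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(76, 2))          # e.g. T2nSI and ADC per patient
        y = (X[:, 0] + rng.normal(scale=1.0, size=76) > 0).astype(int)

        model = LogisticRegression()
        proba = cross_val_predict(model, X, y, cv=LeaveOneOut(),
                                  method="predict_proba")
        print("LOO ROC-AUC:", roc_auc_score(y, proba[:, 1]))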

  2. Concept Maps as a strategy to assess learning in biochemistry using educational software

    Directory of Open Access Journals (Sweden)

    A. M. P. Azevedo

    2005-07-01

    This abstract reports the use of concept maps for the evaluation of concepts learned through an educational software package for studying metabolic pathways, called Diagrama Metabolico Dinamico Virtual do Ciclo de Krebs (DMDV). The method was tried out with two distinct groups of students. The first group was composed of 24 students (in 2003) who used DMDV during classes (in the computer room). The second group was formed by 36 students (in 2004) who could access the DMDV software at any time through the intranet. The construction of a conceptual map by the student permits the representation of knowledge, of the mental processes absorbed and of the adaptation during the study, building new mental schemes that can be related to the concept of reflective abstraction (Piaget, 1995) during the process of operating with these concepts. Knowledge was evaluated by analysing three conceptual maps constructed by each student: (a) one map before initiating the study with DMDV, (b) a second just after the study, and (c) a third two months later. We used the following criteria for the analysis: predominance of associative over classificatory character; correct concepts and relationships; coherence; number of relationships; creativity; and logic. The initial maps showed that all students had some previous mental scheme of the proposed concept. All final concept maps showed an expansion of the concepts compared to the initial maps, visible even from a mere glance at the size of the graphics. A purely visual comparison between the maps indicated that new elements had been added, and the associative character was shown to predominate over the classificatory one.

  3. Streamflow hindcasting in European river basins via multi-parametric ensemble of the mesoscale hydrologic model (mHM)

    Science.gov (United States)

    Noh, Seong Jin; Rakovec, Oldrich; Kumar, Rohini; Samaniego, Luis

    2016-04-01

    There have been tremendous improvements in distributed hydrologic modeling (DHM) which have made process-based simulation with a high spatiotemporal resolution applicable on a large spatial scale. Despite increasing information on the heterogeneous properties of catchments, DHM is still subject to uncertainties inherently coming from model structure, parameters and input forcing. Sequential data assimilation (DA) may facilitate improved streamflow prediction via DHM by using real-time observations to correct internal model states. In conventional DA methods such as state updating, however, parametric uncertainty is often ignored, mainly due to practical limitations of methodology to specify modeling uncertainty with limited ensemble members. If parametric uncertainty related to routing and runoff components is not incorporated properly, the predictive uncertainty of DHM may be insufficient to capture the dynamics of observations, which may deteriorate predictability. Recently, a multi-scale parameter regionalization (MPR) method was proposed to make hydrologic predictions at different scales using the same set of model parameters without losing much of the model performance. The MPR method incorporated within the mesoscale hydrologic model (mHM, http://www.ufz.de/mhm) can effectively represent and control the uncertainty of high-dimensional parameters in a distributed model using global parameters. In this study, we present a global multi-parametric ensemble approach to incorporate the parametric uncertainty of DHM in DA to improve streamflow predictions. To effectively represent and control the uncertainty of high-dimensional parameters with a limited number of ensemble members, the MPR method is incorporated with DA. Lagged particle filtering is utilized to consider the response times and non-Gaussian characteristics of internal hydrologic processes. Hindcasting experiments are implemented to evaluate the impacts of the proposed DA method on streamflow predictions in multiple European river basins.
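
    The core particle-filter update used in this kind of data assimilation can be sketched in a few lines; the fragment below assumes a Gaussian observation likelihood and multinomial resampling, with fabricated discharge values, and does not reproduce the lagged filter coupled to mHM:

        # Bare-bones particle-filter step for streamflow DA (illustrative only).
        import numpy as np

        def pf_step(particles, obs, obs_std, rng):
            """particles: ensemble of simulated discharges for one time step."""
            w = np.exp(-0.5 * ((particles - obs) / obs_std) ** 2)  # likelihood
            w /= w.sum()
            idx = rng.choice(len(particles), size=len(particles), p=w)  # resample
            return particles[idx]

        rng = np.random.default_rng(0)
        ens = rng.normal(loc=95.0, scale=10.0, size=200)   # prior ensemble (m^3/s)
        ens = pf_step(ens, obs=100.0, obs_std=5.0, rng=rng)
        print(ens.mean())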

  4. The role of multi-parametric MR imaging in the detection of early inflammatory sacroiliitis according to ASAS criteria

    Energy Technology Data Exchange (ETDEWEB)

    Boy, Fatma Nur, E-mail: nursoylu@yahoo.com [Department of Radiology, Fatih Sultan Mehmet Training and Research Hospital, E-5 Karayolu Uzeri, 34752 Atasehir, Istanbul (Turkey); Kayhan, Arda, E-mail: arda_kayhan@yahoo.com [Health Services Vocational School, Esenyurt University, Dogan Arasli Bulvari No 120, Esenyurt, Istanbul (Turkey); Karakas, Hakki Muammer, E-mail: hakki.karakas@gmail.com [Department of Radiology, Fatih Sultan Mehmet Training and Research Hospital, E-5 Karayolu Uzeri, 34752 Atasehir, Istanbul (Turkey); Unlu-Ozkan, Feyza, E-mail: feyzamd@yahoo.com [Department of Physical Therapy and Rehabilitation, Fatih Sultan Mehmet Training and Research Hospital, E-5 Karayolu Uzeri, 34752 Atasehir, Istanbul (Turkey); Silte, Duygu, E-mail: drduygusilte@hotmail.com [Department of Physical Therapy and Rehabilitation, Fatih Sultan Mehmet Training and Research Hospital, E-5 Karayolu Uzeri, 34752 Atasehir, Istanbul (Turkey); Aktas, İlknur, E-mail: iaktas@hotmail.com [Department of Physical Therapy and Rehabilitation, Fatih Sultan Mehmet Training and Research Hospital, E-5 Karayolu Uzeri, 34752 Atasehir, Istanbul (Turkey)

    2014-06-15

    Purpose: To retrospectively evaluate the accuracy of multi-parametric magnetic resonance (MR) imaging including fat-saturated (FS) T2-weighted, short-tau inversion recovery (STIR), diffusion-weighted (DW-MR), and dynamic contrast-enhanced MR (DCE-MR) imaging techniques in the diagnosis of early inflammatory sacroiliitis, and to determine the additional value of DW-MR and DCE-MR images according to the recently defined 'Assessment in SpondyloArthritis international Society' criteria. Materials and methods: The study included 45 patients with back pain. Two radiologists estimated the likelihood of osteitis in 4 independent viewing sessions including FS T2-weighted, STIR, DW-MR and DCE-MR images. Sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and area under the receiver operating characteristic (ROC) curve (AUC) were calculated. Results: Of the 45 patients, 31 had inflammatory back pain. Of these 31, 28 (90.3%) patients had inflammatory sacroiliitis diagnosed by clinical and laboratory analysis. FS T2-weighted MR images had the highest sensitivity (42.8% for both radiologists) for detecting osteitis in patients with inflammatory sacroiliitis when compared to the other imaging sequences. For specificity, PPV, NPV, accuracy, and AUC levels there was no statistically significant difference between image viewing settings; adding STIR, DW-MR and DCE-MR images to the FS T2-weighted MR images did not improve the above-stated indices. Conclusion: FS T2-weighted MR imaging had the highest sensitivity when compared to the other imaging sequences. The addition of DW-MR and DCE-MR images did not significantly improve the diagnostic value of MR imaging in the diagnosis of osteitis for either experienced or less experienced radiologists.

  5. Multi-parametric MRI at 14T for muscular dystrophy mice treated with AAV vector-mediated gene therapy.

    Directory of Open Access Journals (Sweden)

    Joshua Park

    The objective of this study was to investigate the efficacy of quantitative magnetic resonance imaging (MRI) as a non-invasive tool for monitoring gene therapy for muscular dystrophy. Clinical investigations of this family of diseases often involve surgical biopsy, which limits the amount of information that can be obtained due to the invasive nature of the procedure. Thus, other non-invasive tools may provide more opportunities for assessing the disease and treatment responses. To explore this, dystrophic mdx4cv mice were systemically treated with a recombinant adeno-associated viral (AAV) vector containing a codon-optimized micro-dystrophin gene. Multi-parametric MRI of T2, magnetization transfer, and diffusion effects alongside 3-D volume measurements was then utilized to monitor disease/treatment progression. Mice were imaged at 10 weeks of age for pre-treatment, then again post-treatment at 8, 16, and 24 week time points. The efficacy of treatment was assessed by physiological assays for improvements in function and by quantification of expression. Tissues from the hindlimbs were collected for histological analysis after the final time point for comparison with the MRI results. We found that introduction of the micro-dystrophin gene restored some aspects of normal muscle histology and pathology, such as decreased necrosis and resistance to contraction-induced injury. T2 relaxation values showed percentage decreases across all muscle types measured (tibialis anterior, gastrocnemius, and soleus) when treated groups were compared to untreated groups, and the differences between groups were statistically significant for the tibialis anterior. The diffusion measurements showed a wider range of percentage changes and less statistical significance, while the magnetization transfer effect measurements showed minimal change. MR images displayed hyper-intense regions of muscle that correlated with muscle pathology.

  6. A Systematic Mapping Study of Tools for Distributed Software Development Teams

    DEFF Research Database (Denmark)

    Tell, Paolo; Ali Babar, Muhammad

    Context: A wide variety of technologies have been developed to support Global Software Development (GSD). However, the information about the dozens of available solutions is quite diverse and scattered, making it difficult to form an overview able to identify common trends and unveil research gaps. Objective: The objective of this research is to systematically identify and classify a comprehensive list of the technologies that have been developed and/or used for supporting GSD teams. Method: This study has been undertaken as a Systematic Mapping Study (SMS). Our searches identified 1,958 papers, out of which 182 were found suitable for inclusion in this study. Results: We have identified 412 technologies reported to be developed and/or used for supporting GSD teams from the 182 papers. The identified technologies have been analyzed in order to categorize them using four main classification...

  7. Scanamorphos: a map-making software for Herschel and similar scanning bolometer arrays

    CERN Document Server

    Roussel, Hélène

    2012-01-01

    Scanamorphos is one of the public software packages available to post-process scan observations performed with the Herschel photometer arrays. This post-processing mainly consists of subtracting the total low-frequency noise (both its thermal and non-thermal components), masking cosmic-ray hit residuals, and projecting the data onto a map. Although it was developed for Herschel, it is also applicable with minimal adjustment to scan observations made with other bolometer arrays, provided they entail sufficient redundancy; it was successfully applied to P-Artemis, an instrument operating on the APEX telescope. Contrary to most other algorithms (first developed for microwave background experiments and later adapted to Herschel), Scanamorphos does not assume any particular noise model and does not apply any Fourier-space filtering to the data; it is an empirical tool using purely the redundancy built into the observations, taking advantage of the fact that each portion of the sky is sampled at multiple times by multiple bolometers.

  8. Fuzzy cognitive map software tool for treatment management of uncomplicated urinary tract infection.

    Science.gov (United States)

    Papageorgiou, Elpiniki I

    2012-03-01

    Uncomplicated urinary tract infection (uUTI) is a bacterial infection that affects individuals with structurally and functionally normal urinary tracts. Suggesting the appropriate antibiotics and treatment to individuals suffering from uUTI is an important and complex task that demands special attention; how to decrease the unsafe use and overall consumption of antibiotics is an important issue in medical treatment. To model medical decision making for uUTI treatment, an innovative and flexible approach called fuzzy cognitive maps (FCMs) is proposed to handle uncertainty and missing information. The FCM is a promising technique for modeling knowledge and/or medical guidelines/treatment suggestions and reasoning with them. A software tool, namely FCM-uUTI DSS, is investigated in this work to produce a decision support module for uUTI treatment management. The software tool was tested (evaluated) on 38 patient cases, showing its functionality and demonstrating that the use of FCMs as dynamic models is reliable. The results show that the suggested FCM-uUTI tool gives a front-end decision on antibiotic suggestions for uUTI treatment, and its outputs are considered helpful references for physicians and patients. Owing to its easy graphical representation and simulation process, the proposed FCM formalization could be used to make medical knowledge widely available through computer consultation systems.
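
    The inference mechanism of an FCM is compact enough to sketch directly; the weight matrix below is a toy example, not the uUTI knowledge model of the paper:

        # Minimal fuzzy-cognitive-map inference: concept activations iterate
        # through weighted causal links with a sigmoid squashing function.
        import numpy as np

        def fcm_infer(weights, state, n_iter=20, lam=1.0):
            for _ in range(n_iter):
                state = 1.0 / (1.0 + np.exp(-lam * (state + state @ weights)))
            return state

        w = np.array([[0.0, 0.6, -0.3],
                      [0.0, 0.0,  0.8],
                      [0.0, 0.0,  0.0]])   # causal influence concept_i -> concept_j
        a0 = np.array([1.0, 0.0, 0.0])     # initial activation (e.g., a symptom)
        print(fcm_infer(w, a0))            # converged activations of all concepts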

  9. Mapping and Quantification of Vascular Branching in Plants, Animals and Humans by VESGEN Software

    Science.gov (United States)

    Parsons-Wingerter, P. A.; Vickerman, M. B.; Keith, P. A.

    2010-01-01

    Humans face daunting challenges in the successful exploration and colonization of space, including adverse alterations in gravity and radiation. The Earth-determined biology of plants, animals and humans is significantly modified in such extraterrestrial environments. One physiological requirement shared by larger plants and animals with humans is a complex, highly branching vascular system that is dynamically responsive to cellular metabolism, immunological protection and specialized cellular/tissue function. VESsel GENeration (VESGEN) Analysis has been developed as a mature beta version, pre-release research software for mapping and quantification of the fractal-based complexity of vascular branching. Alterations in vascular branching pattern can provide informative read-outs of altered vascular regulation. Originally developed for biomedical applications in angiogenesis, VESGEN 2D has provided novel insights into the cytokine, transgenic and therapeutic regulation of angiogenesis, lymphangiogenesis and other microvascular remodeling phenomena. Vascular trees, networks and tree-network composites are mapped and quantified. Applications include disease progression from clinical ophthalmic images of the human retina; experimental regulation of vascular remodeling in the mouse retina; avian and mouse coronary vasculature; and other experimental models in vivo. We envision that altered branching in the leaves of plants studied on the ISS, such as Arabidopsis thaliana, can also be analyzed.

  10. The Architecture and Object Model of Map Publishing Software

    Institute of Scientific and Technical Information of China (English)

    尹伟强

    2000-01-01

    This paper introduces the architecture and object model of map publishing software, based on our experience in the analysis and design of our own map publishing system. We also discuss some key problems encountered during implementation.

  11. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    Directory of Open Access Journals (Sweden)

    Kühne Titus

    2010-07-01

    Background: In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now no universal software program has been available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. Results: After defining the requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different types of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for the analysis of clinical and research MR data. Conclusions: MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet.
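
    The pixel-wise fitting that such a tool automates can be illustrated for a single pixel of multi-echo T2 data, S(TE) = S0·exp(-TE/T2); echo times and signals below are synthetic:

        # Illustrative mono-exponential T2 fit for one pixel; repeated over all
        # pixels this builds a T2 map. Data are synthetic, not from MRmap.
        import numpy as np
        from scipy.optimize import curve_fit

        def decay(te, s0, t2):
            return s0 * np.exp(-te / t2)

        te = np.array([10, 20, 40, 60, 80.0])   # echo times in ms
        sig = 1000 * np.exp(-te / 45.0) \
              + np.random.default_rng(0).normal(0, 5, te.size)   # noisy signal

        (s0, t2), _ = curve_fit(decay, te, sig, p0=(sig[0], 50.0))
        print(f"T2 = {t2:.1f} ms")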

  12. Modelling the presence of myelin and oedema in the brain based on multi-parametric quantitative MRI

    Directory of Open Access Journals (Sweden)

    Marcel eWarntjes

    2016-02-01

    The aim of this study was to present a model that uses multi-parametric quantitative MRI to estimate the presence of myelin and oedema in the brain. The model relates the simultaneous measurement of R1 and R2 relaxation rates and proton density to four partial volume compartments, consisting of myelin partial volume, cellular partial volume, free water partial volume and excess parenchymal water partial volume. The model parameters were obtained using spatially normalised brain images of a group of 20 healthy controls. The pathological brain was modelled in terms of the reduction of myelin content and the presence of excess parenchymal water, which indicates the degree of oedema. The method was tested on spatially normalised brain images of a group of 20 age-matched multiple sclerosis (MS) patients. Clear differences were observed with respect to the healthy controls: the MS group had a 79 mL smaller brain volume (1069 vs. 1148 mL), a 38 mL smaller myelin volume (119 vs. 157 mL) and a 21 mL larger excess parenchymal water volume (78 vs. 57 mL). Template regions of interest of various brain structures indicated that the myelin partial volume in the MS group was 1.6±1.5% lower for grey matter (GM) structures and 2.8±1.0% lower for white matter (WM) structures. The excess parenchymal water partial volume was 9±10% larger for GM and 5±2% larger for WM. Manually placed ROIs indicated that the results using the template ROIs may have suffered from loss of anatomical detail due to the spatial normalisation process. Examples of the application of the method on high-resolution images are provided for three individual subjects: a 45-year-old healthy subject, a 72-year-old healthy subject and a 45-year-old MS patient. The observed results agreed with the expected behaviour considering both age and disease. In conclusion, the proposed model may provide clinically important parameters such as the total brain volume, degree of myelination and degree of oedema, based on multi-parametric quantitative MRI.

  13. ANALYSIS, THEMATIC MAPS AND DATA MINING FROM POINT CLOUD TO ONTOLOGY FOR SOFTWARE DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    R. Nespeca

    2016-06-01

    The primary purpose of the survey for the restoration of Cultural Heritage is the interpretation of the state of building preservation. For this, the advantages of remote sensing systems that generate dense point clouds (range-based or image-based) are not limited to the acquired data alone. The paper shows that it is possible to extrapolate very useful information for diagnostics using spatial annotation, with algorithms already implemented in open-source software. Generally, the drawing of degradation maps is the result of manual work, and so depends on the subjectivity of the operator. This paper describes a method of extraction and visualization of information obtained by mathematical procedures that are quantitative, repeatable and verifiable. The case study is a part of the east facade of the Eglise collégiale Saint-Maurice, also called Notre Dame des Grâces, in Caromb, in southern France. The work was conducted on the matrix of information contained in the point cloud in ASCII format. The first result is the extrapolation of new geometric descriptors. First, we create digital maps with the calculated quantities. Subsequently, we move to semi-quantitative analyses that transform the new data into useful information. We have written algorithms for accurate selection, for the segmentation of the point cloud, and for automatic calculation of the real surface and the volume. Furthermore, we have created the graph of the spatial distribution of the descriptors. This work shows that by working on the data during processing we can transform the point cloud into an enriched database: its use, management and mining are easy, fast and effective for everyone involved in the restoration process.

  14. Analysis, Thematic Maps and Data Mining from Point Cloud to Ontology for Software Development

    Science.gov (United States)

    Nespeca, R.; De Luca, L.

    2016-06-01

    The primary purpose of the survey for the restoration of Cultural Heritage is the interpretation of the state of building preservation. For this, the advantages of remote sensing systems that generate dense point clouds (range-based or image-based) are not limited to the acquired data alone. The paper shows that it is possible to extrapolate very useful information for diagnostics using spatial annotation, with algorithms already implemented in open-source software. Generally, the drawing of degradation maps is the result of manual work, and so depends on the subjectivity of the operator. This paper describes a method of extraction and visualization of information obtained by mathematical procedures that are quantitative, repeatable and verifiable. The case study is a part of the east facade of the Eglise collégiale Saint-Maurice, also called Notre Dame des Grâces, in Caromb, in southern France. The work was conducted on the matrix of information contained in the point cloud in ASCII format. The first result is the extrapolation of new geometric descriptors. First, we create digital maps with the calculated quantities. Subsequently, we move to semi-quantitative analyses that transform the new data into useful information. We have written algorithms for accurate selection, for the segmentation of the point cloud, and for automatic calculation of the real surface and the volume. Furthermore, we have created the graph of the spatial distribution of the descriptors. This work shows that by working on the data during processing we can transform the point cloud into an enriched database: its use, management and mining are easy, fast and effective for everyone involved in the restoration process.

  15. Fault Localization Method by Partitioning Memory Using Memory Map and the Stack for Automotive ECU Software Testing

    Directory of Open Access Journals (Sweden)

    Kwanhyo Kim

    2016-09-01

    Recently, the usage of the automotive Electronic Control Unit (ECU) and its software in cars has been increasing, and as the functional complexity of such software grows, so does the likelihood of software-related faults. It is therefore important to ensure the reliability of ECU software in order to ensure automobile safety, and systematic testing methods are required that can guarantee software quality. However, it is difficult to locate a fault during testing with the current ECU development system because the tester performs black-box testing using a Hardware-in-the-Loop (HiL) simulator. Consequently, developers spend a large amount of money and time on debugging because they debug without any information about the location of the fault. In this paper, we propose a method for localizing a fault by utilizing memory information during black-box testing. This is likely to be of use to developers who debug automotive software. In order to observe whether symbols stored in the memory have been updated, the memory is partitioned by a memory map and by the stack, thus reducing the fault candidate region. The memory map method has the advantage of being able to partition the memory finely, while the stack method can partition the memory without a memory map. We validated these methods by applying them to HiL testing of the ECU for a body control system. The preliminary results indicate that the memory map and the stack reduce the possible fault locations to 22% and 19% of the updated memory, respectively.

  16. Algorithms and software tools for ordering clone libraries: application to the mapping of the genome of Schizosaccharomyces pombe.

    Science.gov (United States)

    Mott, R; Grigoriev, A; Maier, E; Hoheisel, J; Lehrach, H

    1993-04-25

    A complete set of software tools to aid the physical mapping of a genome has been developed and successfully applied to the genomic mapping of the fission yeast Schizosaccharomyces pombe. Two approaches were used for ordering single-copy hybridisation probes: one was based on the simulated annealing algorithm to order all probes, and another on inferring the minimum-spanning subset of the probes using a heuristic filtering procedure. Both algorithms produced almost identical maps, with minor differences in the order of repetitive probes and those having identical hybridisation patterns. A separate algorithm fitted the clones to the established probe order. Approaches for handling experimental noise and repetitive elements are discussed. In addition to these programs and the database management software, tools for visualizing and editing the data are described. The issues of combining the information from different libraries are addressed. Also, ways of handling multiple-copy probes and non-hybridisation data are discussed.
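
    The simulated-annealing idea for probe ordering can be sketched as an ordering that minimizes the summed Hamming distance between neighbouring hybridisation fingerprints; the data, move set and cooling schedule below are toy choices, not the published implementation:

        # Toy simulated annealing for ordering hybridisation fingerprints.
        import numpy as np

        rng = np.random.default_rng(0)
        probes = rng.integers(0, 2, size=(20, 30))   # 20 probes x 30 clones (fake)

        def cost(order):
            # total Hamming distance between adjacent probes in the order
            return np.abs(np.diff(probes[order], axis=0)).sum()

        order = np.arange(len(probes))
        temp = 10.0
        for step in range(20000):
            i, j = rng.integers(0, len(order), 2)
            new = order.copy()
            new[i], new[j] = new[j], new[i]          # propose swapping two probes
            delta = cost(new) - cost(order)
            if delta < 0 or rng.random() < np.exp(-delta / temp):
                order = new                          # accept downhill or by chance
            temp *= 0.9995                           # geometric cooling
        print(cost(order))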

  17. Improving flood risk mapping in Italy: the FloodRisk open-source software

    Science.gov (United States)

    Albano, Raffaele; Mancusi, Leonardo; Craciun, Iulia; Sole, Aurelia; Ozunu, Alexandru

    2017-04-01

    Time and again, floods around the world illustrate the devastating impact they can have on societies. Furthermore, the expectation that flood damages may increase over time with climate change, land-use change and social growth in flood-prone areas has raised awareness among the public and other stakeholders (governments, international organizations, re-insurance companies and emergency responders) of the need to manage risks in order to mitigate their causes and consequences. In this light, the choice of appropriate measures, the assessment of the costs and effects of such measures, and their prioritization are crucial for decision makers. As a result, a priori flood risk assessment has become a key part of flood management practices, with the aim of minimizing the total costs related to the risk management cycle. In this context, the EU Floods Directive 2007/60 requires the delineation of flood risk maps on the basis of the most appropriate and advanced tools, with particular attention to limiting the required economic effort. The main aim of these risk maps is to provide the knowledge required for the development of flood risk management plans (FRMPs) by considering both the costs and benefits of alternatives and the results of consultation with all interested parties. In this context, this research project developed a free and open-source (FOSS) GIS software package, called FloodRisk, to operatively support stakeholders in their compliance with the FRMPs. FloodRisk aims to facilitate the development of risk maps and the evaluation and management of current and future flood risk for multi-purpose applications. This new approach overcomes the limits of the expert-driven qualitative (EDQ) approach currently adopted in several European countries, such as Italy, which does not permit a suitable evaluation of the effectiveness of risk mitigation strategies because the vulnerability component cannot be properly assessed. Moreover, FloodRisk is also able to involve citizens in the flood risk management process.

  18. Application of knowledge-based approaches in software architecture : A systematic mapping study

    NARCIS (Netherlands)

    Li, Zengyang; Liang, Peng; Avgeriou, Paris

    2013-01-01

    Context: Knowledge management technologies have been employed across software engineering activities for more than two decades. Knowledge-based approaches can be used to facilitate software architecting activities (e.g., architectural evaluation). However, there is no comprehensive understanding of how knowledge-based approaches are applied in software architecture.

  19. [Application evaluation of multi-parametric MRI in the diagnosis and differential diagnosis of early prostate cancer and prostatitis].

    Science.gov (United States)

    Li, P; Huang, Y; Li, Y; Cai, L; Ji, G H; Zheng, Y; Chen, Z Q

    2016-10-11

    Objective: To evaluate the value of multi-parametric MRI (mp-MRI) in the diagnosis and differential diagnosis of early prostate cancer (PCa) in the peripheral zone (PZ) and prostatitis with low T2WI signal intensity. Methods: A total of 40 patients with early PZ PCa and 37 with prostatitis showing hypointense T2WI signal in the PZ, collected at the General Hospital of Ningxia Medical University from January 2009 to June 2015, were retrospectively analyzed. All underwent T2WI, DWI, and DCE-MRI examination and all cases were confirmed by pathology. All data were transferred to a GE Advanced Workstation AW4.3, and the indexes for cancerous and prostatitis regions were calculated with Functool2 from the signal intensity-time (SI-T) curve and the ADC value, including the time to maximum (Tmax) and the overall enhancement degree (SImax). An ROC curve was used to determine the ADC cutoff value for PCa detection. Results: On T2WI, 57.5% of PCa (23/40) showed focal nodular homogeneous low signal intensity, while 70.3% of prostatitis (26/37) showed diffuse inhomogeneous low signal intensity. On DCE-MRI, the distribution of curve types for malignant tumors was type Ⅰ 2.5% (1/40), type Ⅱ 32.5% (13/40) and type Ⅲ 65.0% (26/40), while for prostatitis it was type Ⅰ 16.2% (6/37), type Ⅱ 56.8% (21/37) and type Ⅲ 27.0% (10/37); the patterns of curve types in malignant lesions differed significantly from those in benign lesions (χ² = 12.32, P<0.01). The mean values of Tmax and SImax in cancerous and prostatitis regions were (17.96±2.91) s and 1.76%±0.23% versus (21.19±3.59) s and 1.53%±0.18%, respectively (t=5.37, 6.10; P<0.01). On DWI, the mean ADC values in cancerous and prostatitis regions were (0.95±0.13)×10⁻³ mm²/s and (1.12±0.13)×10⁻³ mm²/s, respectively (t=7.10, P<0.01). According to the ROC analysis, with a cutoff value of 1.01×10⁻³ mm²/s, the diagnostic sensitivity, specificity and accuracy for early PCa were 79.1%, 72.7% and 76.1%, respectively.

  20. Evaluation of automated multi-parametric indirect immunofluorescence assays to detect anti-neutrophil cytoplasmic antibodies (ANCA) in granulomatosis with polyangiitis (GPA) and microscopic polyangiitis (MPA).

    Science.gov (United States)

    Csernok, Elena; Damoiseaux, Jan; Rasmussen, Niels; Hellmich, Bernhard; van Paassen, Pieter; Vermeersch, Pieter; Blockmans, Daniel; Cohen Tervaert, Jan-Willem; Bossuyt, Xavier

    2016-07-01

    The aim of this multicenter EUVAS study was to evaluate the diagnostic performance of multi-parametric indirect immunofluorescence (IIF) assays to detect anti-neutrophil cytoplasmic antibodies (ANCA) in granulomatosis with polyangiitis (GPA) and microscopic polyangiitis (MPA). The study included 912 samples from diseased controls and 249 diagnostic samples from GPA (n=183) and MPA (n=66) patients. The performance of two automated multi-parametric assays [Aklides (Medipan/Generic Assays) and EuroPattern (Euroimmun)] combining IIF on cellular and purified antigen substrates was compared with two manual IIF analyses and with commercially available ELISAs for MPO- and PR3-ANCA (Euroimmun). The area under the curve (AUC) of the receiver operating characteristic (ROC) curve to discriminate AAV from controls was 0.925, 0.848, 0.855 and 0.904 for the two manual analyses, Aklides and EuroPattern, respectively, and 0.959, 0.921 and 0.886 for antigen-specific ELISA, antigen-coated beads, and microdot, respectively. Variation in pattern assignment between IIF methods was observed. The performance of IIF depends on the substrate used and the definition of IIF patterns. The performance of automated IIF is improved by multi-parameter testing (combined IIF and antigen-specific testing). Given the variability between IIF methods, the diagnostic importance of this technique is questioned.

  1. AlignerBoost: A Generalized Software Toolkit for Boosting Next-Gen Sequencing Mapping Accuracy Using a Bayesian-Based Mapping Quality Framework.

    Directory of Open Access Journals (Sweden)

    Qi Zheng

    2016-10-01

    Accurate mapping of next-generation sequencing (NGS) reads to reference genomes is crucial for almost all NGS applications and downstream analyses. Various repetitive elements in human and other higher eukaryotic genomes contribute in large part to ambiguously (non-uniquely) mapped reads. Most available NGS aligners attempt to address this by either removing all non-uniquely mapping reads or reporting one random or "best" hit based on simple heuristics. Accurate estimation of the mapping quality of NGS reads is therefore critical, albeit completely lacking at present. Here we developed a generalized software toolkit, AlignerBoost, which utilizes a Bayesian-based framework to accurately estimate the mapping quality of ambiguously mapped NGS reads. We tested AlignerBoost with both simulated and real DNA-seq and RNA-seq datasets at various thresholds. In most cases, but especially for reads falling within repetitive regions, AlignerBoost dramatically increases the mapping precision of modern NGS aligners without significantly compromising sensitivity, even without mapping quality filters. When using higher mapping quality cutoffs, AlignerBoost achieves a much lower false mapping rate while exhibiting comparable or higher sensitivity compared to the aligner default modes, therefore significantly boosting the detection power of NGS aligners even at extreme thresholds. AlignerBoost is also SNP-aware, and higher quality alignments can be achieved if it is provided with known SNPs. AlignerBoost's algorithm is computationally efficient, and can process one million alignments within 30 seconds on a typical desktop computer. AlignerBoost is implemented as a uniform Java application and is freely available at https://github.com/Grice-Lab/AlignerBoost.
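
    The Bayesian idea at the heart of such a framework can be sketched by normalizing per-candidate alignment likelihoods into a posterior and Phred-scaling the residual error probability; this simplification is not AlignerBoost's exact model:

        # Hedged sketch: posterior mapping quality for the best hit of a read.
        import numpy as np

        def mapping_quality(log_likelihoods):
            """log_likelihoods: one value per candidate alignment of a read."""
            ll = np.asarray(log_likelihoods, dtype=float)
            p = np.exp(ll - ll.max())
            p /= p.sum()                       # posterior over candidate loci
            p_err = max(1.0 - p.max(), 1e-25)  # avoid log(0) for unique hits
            return -10.0 * np.log10(p_err)     # Phred-scaled mapping quality

        print(mapping_quality([-5.0, -9.0, -12.0]))   # ambiguous read, modest MAPQ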

  2. Software process improvement: a systematic mapping study on the state of the art

    Directory of Open Access Journals (Sweden)

    Marco Kuhrmann

    2016-05-01

    Software process improvement (SPI) has been around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out there? Are there new trends and emerging approaches? What are the open issues? Still, we struggle to answer these questions about the current state of SPI and related research. In this article, we present results from an updated systematic mapping study to shed light on the field of SPI, to develop a big picture of the state of the art, and to draw conclusions for future research directions. An analysis of 769 publications draws a big picture of SPI-related research of the past quarter-century. Our study shows a high number of solution proposals, experience reports, and secondary studies, but only few theories and models on SPI in general. In particular, standard SPI models like CMMI and ISO/IEC 15504 are analyzed, enhanced, and evaluated for applicability in practice, but these standards are also critically discussed, e.g., from the perspective of SPI in small-to-medium-sized companies, which leads to new specialized frameworks. New and specialized frameworks account for the majority of the contributions found (approx. 38%). Furthermore, we find a growing interest in success factors (approx. 16%) to aid companies in conducting SPI and in adapting agile principles and practices for SPI (approx. 10%). Beyond these specific topics, the study results also show an increasing interest in secondary studies with the purpose of aggregating and structuring SPI-related knowledge. Finally, the present study helps direct future research by identifying under-researched topics awaiting further investigation.

  3. Study of Noise Map and its Features in an Indoor Work Environment through GIS-Based Software

    Directory of Open Access Journals (Sweden)

    Faramarz Majidi

    2016-06-01

    Background: Noise mapping in industry can be useful to assess the risks of harmful noise or to monitor noise in machine rooms. Using GIS-based software for plotting noise maps of an indoor noisy work environment can help occupational hygienists monitor noise pollution. Methods: This study was carried out in a noisy packaging unit of a food factory in the Ghazvin industrial zone, to evaluate noise levels by the GIS technique. The floor of the packaging unit was divided into squares of 2×2 meters and the center of each square was marked as a measurement station, based on the NIOSH method. The sound pressure level at each station was measured and the measurement values were then imported into ArcGIS software to plot the noise map. Results: Unlike those produced by the current method, the noise maps generated by the GIS technique are consistent with the nature of sound propagation. Conclusion: This study showed that, for an indoor work environment, applying GIS technology to render noise levels in the form of noise maps is more realistic and more accurate than the routine method now being used by occupational hygienists.
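
    The interpolation step that turns station readings into a continuous noise map can be sketched with plain inverse-distance weighting, one of the interpolators GIS packages such as ArcGIS offer; the coordinates and sound-pressure levels below are invented:

        # Illustrative inverse-distance-weighted interpolation of SPL readings.
        import numpy as np

        def idw(xy_obs, z_obs, xy_grid, power=2.0):
            d = np.linalg.norm(xy_grid[:, None, :] - xy_obs[None, :, :], axis=2)
            d = np.maximum(d, 1e-9)            # guard against zero distance
            w = 1.0 / d ** power
            return (w * z_obs).sum(axis=1) / w.sum(axis=1)

        stations = np.array([[0, 0], [2, 0], [0, 2], [2, 2.0]])   # 2 m spacing
        spl = np.array([92.0, 88.5, 90.2, 86.7])                  # dB(A) readings
        gx, gy = np.meshgrid(np.linspace(0, 2, 5), np.linspace(0, 2, 5))
        grid = np.column_stack([gx.ravel(), gy.ravel()])
        noise_map = idw(stations, spl, grid).reshape(gx.shape)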

  4. Application of Open Source Software by the Lunar Mapping and Modeling Project

    Science.gov (United States)

    Ramirez, P.; Goodale, C. E.; Bui, B.; Chang, G.; Kim, R. M.; Law, E.; Malhotra, S.; Rodriguez, L.; Sadaqathullah, S.; Mattmann, C. A.; Crichton, D. J.

    2011-12-01

    The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is responsible for the development of an information system to support lunar exploration, decision analysis, and release of lunar data to the public. The data available through the lunar portal is predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). This project has created a gold source of data, models, and tools for lunar explorers to exercise and incorporate into their activities. At the Jet Propulsion Laboratory (JPL), we focused on engineering and building the infrastructure to support the cataloging, archiving, access, and delivery of lunar data. We decided to use a RESTful service-oriented architecture to enable us to abstract away from the underlying technology choices and focus on the interfaces to be used internally and externally. This decision allowed us to leverage several open source software components and integrate them by either writing a thin REST service layer or relying on the API they provided; the approach chosen depended on the targeted consumer of a given interface. We will discuss our varying experience using open source products, namely Apache OODT, Oracle Berkeley DB XML, Apache Solr, and Oracle OpenSSO (now named OpenAM). Apache OODT, developed at NASA's Jet Propulsion Laboratory and recently migrated over to Apache, provided the means for ingestion and cataloguing of products within the infrastructure. Its use was based upon team experience with the project and past benefit received on other projects internal and external to JPL. Berkeley DB XML, distributed by Oracle for both commercial and open source use, was the storage technology chosen for our metadata. This decision was in part based on our use of Federal Geographic Data Committee (FGDC) metadata, which is expressed in XML, and the desire to keep it in its native form and exploit other technologies built on XML.

  5. Metric Evolution Maps : Multidimensional Attribute-driven Exploration of Software Repositories

    NARCIS (Netherlands)

    Rodrigues Oliveira da Silva, Renato; Faccin Vernier, Eduardo; Rauber, Paulo; Comba, J. L. D.; Minghim, Rosane; Telea, Alexandru

    2016-01-01

    Understanding how software entities in a repository evolve over time is challenging, as an entity has many aspects that undergo such changes. We cast this problem in a multidimensional visualization context: first, we capture change by extracting quality metrics from all software entities in all revisions...

  6. Mapping modern software process engineering techniques onto an HEP development environment

    CERN Document Server

    Wellisch, J P

    2003-01-01

    One of the most challenging issues faced in HEP in recent years is the question of how to capitalise on software development and maintenance experience in a continuous manner. To capitalise means in our context to evaluate and apply new process technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards. The CMS off-line software improvement effort aims at continual software quality improvement and continual improvement in the efficiency of the working environment, with the goal of facilitating great new physics. To achieve this, we followed a process improvement program based on ISO-15504 and the Rational Unified Process. This experiment in software process improvement in HEP has now been progressing for a period of 3 years. Taking previous experience from ATLAS and SPIDER into account, we used a soft approach of continuous change within the limits of the current culture to create de facto software process standards within th...

  7. QuBiLS-MIDAS: a parallel free-software for molecular descriptors computation based on multilinear algebraic maps.

    Science.gov (United States)

    García-Jacas, César R; Marrero-Ponce, Yovani; Acevedo-Martínez, Liesner; Barigye, Stephen J; Valdés-Martiní, José R; Contreras-Torres, Ernesto

    2014-07-05

    The present report introduces the QuBiLS-MIDAS software, belonging to the ToMoCoMD-CARDD suite, for the calculation of three-dimensional molecular descriptors (MDs) based on two-linear (bilinear), three-linear, and four-linear (multilinear or N-linear) algebraic forms. Thus, it is unique in computing these tensor-based indices. These descriptors establish relations for two, three, and four atoms by using several (dis-)similarity metrics or multimetrics, matrix transformations, cutoffs, local calculations, and aggregation operators. The theoretical background of these N-linear indices is also presented. The QuBiLS-MIDAS software was developed in the Java programming language and employs the Chemistry Development Kit library for the manipulation of chemical structures and the calculation of atomic properties. The software is composed of a user-friendly desktop interface and an Application Programming Interface (API) library. The former was created to simplify the configuration of the different options of the MDs, whereas the library was designed to allow its easy integration into other chemoinformatics software. The program provides functionalities for data-cleaning tasks and for batch processing of the molecular indices. In addition, it offers parallel calculation of the MDs using all processors available on current computers. Complexity analyses of the main algorithms demonstrate that they were implemented efficiently relative to a trivial implementation. Lastly, performance tests reveal that the software scales suitably as the number of processors is increased. The QuBiLS-MIDAS software therefore constitutes a useful application for the computation of molecular indices based on N-linear algebraic maps, and it can be used freely to perform chemoinformatics studies.
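
To make the algebraic forms concrete, here is a small sketch of how a bilinear and a trilinear index can be computed from atomic property vectors with numpy; the property values and relation matrices are invented for illustration and do not reproduce QuBiLS-MIDAS's actual metrics:

```python
import numpy as np

# Toy molecule with 4 atoms: x, y, z are atomic property vectors
# (e.g. hypothetical charges, polarizabilities, van der Waals volumes).
x = np.array([0.12, -0.34, 0.05, 0.17])
y = np.array([1.10, 0.89, 1.32, 0.95])
z = np.array([0.40, 0.52, 0.38, 0.61])

# Pairwise relation matrix M (e.g. derived from interatomic distances)
# and a third-order tensor T encoding three-atom relations.
rng = np.random.default_rng(0)
M = rng.random((4, 4))
T = rng.random((4, 4, 4))

bilinear_index = x @ M @ y                              # two-linear form x^T M y
trilinear_index = np.einsum("ijk,i,j,k->", T, x, y, z)  # three-linear form
```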

  8. NeuroMap: A Spline-Based Interactive Open-Source Software for Spatiotemporal Mapping of 2D and 3D MEA Data.

    Science.gov (United States)

    Abdoun, Oussama; Joucla, Sébastien; Mazzocco, Claire; Yvert, Blaise

    2011-01-01

    A major characteristic of neural networks is the complexity of their organization at various spatial scales, from microscopic local circuits to macroscopic brain-scale areas. Understanding how neural information is processed thus entails the ability to study them at multiple scales simultaneously. This is made possible using microelectrode array (MEA) technology. Indeed, high-density MEAs provide large-scale coverage (several square millimeters) of whole neural structures combined with microscopic resolution (about 50 μm) of unit activity. Yet, current options for spatiotemporal representation of MEA-collected data remain limited. Here we present NeuroMap, a new interactive Matlab-based software for spatiotemporal mapping of MEA data. NeuroMap uses thin plate spline interpolation, which provides several assets with respect to conventional mapping methods used currently. First, any MEA design can be considered, including 2D or 3D, regular or irregular, arrangements of electrodes. Second, spline interpolation allows the estimation of activity across the tissue with local extrema not necessarily at recording sites. Finally, this interpolation approach provides a straightforward analytical estimation of the spatial Laplacian for better current source localization. In this software, coregistration of 2D MEA data on the anatomy of the neural tissue is made possible by fine matching of anatomical data with electrode positions using rigid-deformation-based correction of anatomical pictures. Overall, NeuroMap provides substantial material for detailed spatiotemporal analysis of MEA data. The package is distributed under the GNU General Public License and available at http://sites.google.com/site/neuromapsoftware.
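
A brief sketch of the thin plate spline idea using SciPy's RBFInterpolator rather than NeuroMap's Matlab implementation; the electrode layout and potentials below are hypothetical:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical electrode layout (possibly irregular) and one time sample
# of recorded potentials (microvolts).
electrodes = np.array([[0.0, 0.0], [0.3, 0.1], [0.1, 0.4],
                       [0.5, 0.5], [0.2, 0.8], [0.7, 0.2]])
potentials = np.array([12.0, -3.5, 7.2, 0.8, -6.1, 4.4])

tps = RBFInterpolator(electrodes, potentials, kernel="thin_plate_spline")

# Evaluate on a dense grid; extrema may fall between recording sites.
gx, gy = np.meshgrid(np.linspace(0, 0.8, 100), np.linspace(0, 0.9, 100))
field = tps(np.column_stack([gx.ravel(), gy.ravel()])).reshape(gx.shape)
```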

  9. NeuroMap: A spline-based interactive open-source software for spatiotemporal mapping of 2D and 3D MEA data

    Directory of Open Access Journals (Sweden)

    Oussama eAbdoun

    2011-01-01

    A major characteristic of neural networks is the complexity of their organization at various spatial scales, from microscopic local circuits to macroscopic brain-scale areas. Understanding how neural information is processed thus entails the ability to study them at multiple scales simultaneously. This is made possible using microelectrode array (MEA) technology. Indeed, high-density MEAs provide large-scale coverage (several mm²) of whole neural structures combined with microscopic resolution (about 50 µm) of unit activity. Yet, current options for spatiotemporal representation of MEA-collected data remain limited. Here we present NeuroMap, a new interactive Matlab-based software for spatiotemporal mapping of MEA data. NeuroMap uses thin plate spline interpolation, which provides several assets with respect to conventional mapping methods used currently. First, any MEA design can be considered, including 2D or 3D, regular or irregular, arrangements of electrodes. Second, spline interpolation allows the estimation of activity across the tissue with local extrema not necessarily at recording sites. Finally, this interpolation approach provides a straightforward analytical estimation of the spatial Laplacian for better current source localization. In this software, coregistration of 2D MEA data on the anatomy of the neural tissue is made possible by fine matching of anatomical data with electrode positions using rigid-deformation-based correction of anatomical pictures. Overall, NeuroMap provides substantial material for detailed spatiotemporal analysis of MEA data. The package is distributed under the GNU General Public License (GPL) and available at http://sites.google.com/site/neuromapsoftware.

  10. Monitoring tumor response of prostate cancer to radiation therapy by multi-parametric 1H and hyperpolarized 13C magnetic resonance imaging

    Science.gov (United States)

    Zhang, Vickie Yi

    Radiation therapy is one of the most common curative therapies for patients with localized prostate cancer, but despite excellent success rates, a significant number of patients suffer post-treatment cancer recurrence. The accurate characterization of early tumor response remains a major challenge for the clinical management of these patients. Multi-parametric MRI/1H MR spectroscopy imaging (MRSI) has been shown to increase the diagnostic performance in evaluating the effectiveness of radiation therapy. 1H MRSI can detect altered metabolic profiles in cancerous tissue. In this project, the concentrations of prostate metabolites from snap-frozen biopsies of recurrent cancer after failed radiation therapy were correlated with histopathological findings to identify quantitative biomarkers that predict for residual aggressive versus indolent cancer. The total choline to creatine ratio was significantly higher in recurrent aggressive versus indolent cancer, suggesting that use of a higher threshold tCho/Cr ratio in future in vivo 1H MRSI studies could improve the selection and therapeutic planning for patients after failed radiation therapy. Varying radiation doses may cause a diverse effect on prostate cancer micro-environment and metabolism, which could hold the key to improving treatment protocols for individual patients. The recent development and clinical translation of hyperpolarized 13C MRI have provided the ability to monitor both changes in the tumor micro-environment and its metabolism using a multi-probe approach, [1-13C]pyruvate and 13C urea, combined with 1H multi-parametric MRI. In this thesis, hyperpolarized 13C MRI, 1H dynamic contrast enhancement, and diffusion weighted imaging were used to identify early radiation dose response in a transgenic prostate cancer model. Hyperpolarized pyruvate to lactate metabolism significantly decreased in a dose dependent fashion by 1 day after radiation therapy, prior to any changes observed using 1H DCE and diffusion

  11. Integrating remote sensing with species distribution models; Mapping tamarisk invasions using the Software for Assisted Habitat Modeling (SAHM)

    Science.gov (United States)

    West, Amanda M.; Evangelista, Paul H.; Jarnevich, Catherine S.; Young, Nicholas E.; Stohlgren, Thomas J.; Talbert, Colin; Talbert, Marian K.; Morisette, Jeffrey; Anderson, Ryan

    2016-01-01

    Early detection of invasive plant species is vital for the management of natural resources and protection of ecosystem processes. The use of satellite remote sensing for mapping the distribution of invasive plants is becoming more common; however, conventional imaging software and classification methods have been shown to be unreliable. In this study, we test and evaluate the use of five species distribution model techniques fit with satellite remote sensing data to map invasive tamarisk (Tamarix spp.) along the Arkansas River in Southeastern Colorado. The models tested included boosted regression trees (BRT), Random Forest (RF), multivariate adaptive regression splines (MARS), generalized linear model (GLM), and Maxent. These analyses were conducted using a newly developed software package called the Software for Assisted Habitat Modeling (SAHM). All models were trained with 499 presence points, 10,000 pseudo-absence points, and predictor variables acquired from the Landsat 5 Thematic Mapper (TM) sensor over an eight-month period to distinguish tamarisk from native riparian vegetation using detection of phenological differences. From the Landsat scenes, we used individual bands and calculated Normalized Difference Vegetation Index (NDVI), Soil-Adjusted Vegetation Index (SAVI), and tasseled cap transformations. All five models successfully identified current tamarisk distribution on the landscape, based on threshold-independent and threshold-dependent evaluation metrics with independent location data. To account for model-specific differences, we produced an ensemble of all five models with map output highlighting areas of agreement and areas of uncertainty. Our results demonstrate the usefulness of species distribution models in analyzing remotely sensed data and the utility of ensemble mapping, and showcase the capability of SAHM in pre-processing and executing multiple complex models.
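
The ensemble-mapping step can be sketched as follows, assuming five co-registered habitat-suitability rasters (one per model) and a hypothetical presence threshold of 0.5; this illustrates the idea only and is not SAHM's actual implementation:

```python
import numpy as np

# Hypothetical habitat-suitability rasters (values in [0, 1]) produced by
# five SDMs (BRT, RF, MARS, GLM, Maxent) on the same grid.
rng = np.random.default_rng(1)
maps = rng.random((5, 200, 300))

ensemble_mean = maps.mean(axis=0)          # consensus suitability
ensemble_std = maps.std(axis=0)            # disagreement / uncertainty

# Agreement map: number of models predicting presence at the
# hypothetical threshold of 0.5 (0 = none agree, 5 = all agree).
agreement = (maps > 0.5).sum(axis=0)
```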

  12. Integrating Remote Sensing with Species Distribution Models; Mapping Tamarisk Invasions Using the Software for Assisted Habitat Modeling (SAHM).

    Science.gov (United States)

    West, Amanda M; Evangelista, Paul H; Jarnevich, Catherine S; Young, Nicholas E; Stohlgren, Thomas J; Talbert, Colin; Talbert, Marian; Morisette, Jeffrey; Anderson, Ryan

    2016-10-11

    Early detection of invasive plant species is vital for the management of natural resources and protection of ecosystem processes. The use of satellite remote sensing for mapping the distribution of invasive plants is becoming more common; however, conventional imaging software and classification methods have been shown to be unreliable. In this study, we test and evaluate the use of five species distribution model techniques fit with satellite remote sensing data to map invasive tamarisk (Tamarix spp.) along the Arkansas River in Southeastern Colorado. The models tested included boosted regression trees (BRT), Random Forest (RF), multivariate adaptive regression splines (MARS), generalized linear model (GLM), and Maxent. These analyses were conducted using a newly developed software package called the Software for Assisted Habitat Modeling (SAHM). All models were trained with 499 presence points, 10,000 pseudo-absence points, and predictor variables acquired from the Landsat 5 Thematic Mapper (TM) sensor over an eight-month period to distinguish tamarisk from native riparian vegetation using detection of phenological differences. From the Landsat scenes, we used individual bands and calculated Normalized Difference Vegetation Index (NDVI), Soil-Adjusted Vegetation Index (SAVI), and tasseled cap transformations. All five models successfully identified current tamarisk distribution on the landscape, based on threshold-independent and threshold-dependent evaluation metrics with independent location data. To account for model-specific differences, we produced an ensemble of all five models with map output highlighting areas of agreement and areas of uncertainty. Our results demonstrate the usefulness of species distribution models in analyzing remotely sensed data and the utility of ensemble mapping, and showcase the capability of SAHM in pre-processing and executing multiple complex models.

  13. A Mapping Model for Transforming Traditional Software Development Methods to Agile Methodology

    National Research Council Canada - National Science Library

    Rashmi Popli; Anita; Naresh Chauhan

    2013-01-01

    ... The Agile model is growing in the market at a very good pace. Companies are drifting from traditional Software Development Life Cycle models to an Agile environment for the purpose of attaining quality and for the sake of saving cost and time...

  14. A Mathematical Analysis of Semantic Maps, with Theoretical and Applied Implications for Blended Learning Software

    Science.gov (United States)

    Tang, Michael; David, Hyerle; Byrne, Roxanne; Tran, John

    2012-01-01

    This paper is a mathematical (Boolean) analysis of a set of cognitive maps called Thinking Maps[R], based on Albert Upton's semantic principles developed in his seminal works, Design for Thinking (1961) and Creative Analysis (1961). Albert Upton can be seen as a brilliant thinker who was before his time or after his time depending on the future of…

  16. Strategy for the mapping of interactive genes using bulked segregant analysis method and Mapmaker/Exp software

    Institute of Scientific and Technical Information of China (English)

    WU Weiren; HUANG Biguang

    2006-01-01

    A qualitative trait is usually controlled by a single gene, but it may sometimes be controlled by two or even more genes. This phenomenon is called gene interaction. Rapidly searching for linked molecular markers via bulked segregant analysis (BSA) and then constructing a regional linkage map with Mapmaker/Exp has become a common approach to mapping single major genes. However, methods and computer programs developed for mapping single major genes cannot simply be applied to interactive genes, because the genetic patterns of gene interactions are quite different from those of single-gene inheritance. Up to now, experimental methods for quickly screening molecular markers linked to interactive genes, and statistical methods and corresponding computer software for simultaneously analyzing the linkage relationships of multiple molecular markers to an interactive gene, have not been available. To solve this problem, in this paper we propose a strategy for mapping interactive genes using BSA and Mapmaker/Exp. We demonstrate the strategy using the F2 generation (in a few cases, the F3 generation is also needed). As BSA and Mapmaker/Exp have been broadly used in gene mapping studies and are well known by many researchers, the strategy proposed in this paper will be useful for practical research.

  17. Effect of Software Designed by Computer Conceptual Map Method in Mobile Environment on Learning Level of Nursing Students

    Directory of Open Access Journals (Sweden)

    Salmani N

    2015-12-01

    Aims: To sustain its progress, nursing education has to utilize new training methods, such that the teaching methods used by nursing instructors enhance significant learning and prevent superficial learning in students. The conceptual map method is one of the new training strategies playing an important role in this field. The aim of this study was to investigate the effect of software designed with the computer conceptual map method in a mobile phone environment on the learning level of nursing students. Materials & Methods: In this semi-experimental study with a pretest-posttest design, 60 fifth-semester students were studied during the first semester of 2015-16. The experimental group (n=30) from Meibod Nursing Faculty and the control group (n=30) from Yazd Shahid Sadoughi Nursing Faculty were trained during the first 4 weeks of the semester using the computer conceptual map method in a mobile phone environment and the conventional computer conceptual map method, respectively. Data were collected using a researcher-made academic progress test covering "knowledge" and "significant learning", and were analyzed in SPSS 21 software using independent t, paired t, and Fisher tests. Findings: There were significant increases in the mean scores of knowledge and significant learning in both groups before and after the intervention (p<0.05). Moreover, the change in the scores of the significant learning level between the groups was statistically significant (p<0.05). Conclusion: Presenting course content as a conceptual map in a mobile phone environment positively affects the significant learning of nursing students.

  18. Identification of RNA molecules by specific enzyme digestion and mass spectrometry: software for and implementation of RNA mass mapping

    DEFF Research Database (Denmark)

    Matthiesen, Rune; Kirpekar, Finn

    2009-01-01

    The idea of identifying or characterizing an RNA molecule based on a mass spectrum of specifically generated RNA fragments has been used in various forms for well over a decade. We have developed software, named RRM for 'RNA mass mapping', which can search whole prokaryotic genomes or RNA FASTA ... and for genome searches. A simple and powerful probability model for ranking RNA matches is proposed. We demonstrate the viability of the entire setup by identifying the DNA template of a series of RNAs of biological and of in vitro transcriptional origin in complete microbial genomes, and by identifying authentic 16...

  19. Design and Implement a MapReduce Framework for Executing Standalone Software Packages in Hadoop-based Distributed Environments

    Directory of Open Access Journals (Sweden)

    Chao-Chun Chen

    2013-12-01

    Hadoop MapReduce is the programming model for designing automatically scalable distributed computing applications. It provides developers with an effective environment for attaining automatic parallelization. However, most existing manufacturing systems are arduous and restrictive to migrate to MapReduce private clouds, due to platform incompatibility and the tremendous complexity of system reconstruction. To increase the efficiency of manufacturing systems with minimal modification of the existing systems, we design a framework in this work, called MC-Framework (Multi-user-based Cloudizing-Application Framework). It provides a simple interface for users to fairly execute requested tasks that work with traditional standalone software packages in MapReduce-based private cloud environments. Moreover, this work focuses on multiuser workloads, for which the default Hadoop scheduling scheme, i.e., FIFO, would increase delay. Hence, we also propose a new scheduling mechanism, called Job-Sharing Scheduling, to explore and fairly share jobs across machines in the MapReduce-based private cloud. We then prototype an experimental virtual-metrology module of a manufacturing system as a case study to verify and analyze the proposed MC-Framework. The results of our experiments indicate that the proposed framework greatly improves time performance compared with the original package.
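
For readers unfamiliar with the MapReduce programming model the framework builds on, here is a toy pure-Python rendering of the map, shuffle and reduce phases; Hadoop itself distributes these steps across a cluster:

```python
from collections import defaultdict
from itertools import chain

def map_phase(record):
    # Hypothetical mapper: emit (key, value) pairs for one input record.
    for word in record.split():
        yield word, 1

def reduce_phase(key, values):
    # Reducer: aggregate all values observed for one key.
    return key, sum(values)

def mapreduce(records):
    # Shuffle: group intermediate pairs by key, as Hadoop does between phases.
    groups = defaultdict(list)
    for k, v in chain.from_iterable(map_phase(r) for r in records):
        groups[k].append(v)
    return dict(reduce_phase(k, vs) for k, vs in groups.items())

print(mapreduce(["lot data lot compute", "data data compute"]))
```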

  20. Forest Distribution Mapping of Hubei Province based on MapInfo Software

    Institute of Scientific and Technical Information of China (English)

    肖微

    2006-01-01

    Using the geographic information system software MapInfo as the design platform, this paper introduces the methods and steps for producing a forest distribution map of Hubei Province from remote sensing (RS) satellite image interpretation data. It focuses on the cartographic functions of MapInfo and the techniques for applying satellite image interpretation data on the MapInfo platform.

  1. Surface Mapping with an AFM-CMM Integrated System and Stitching Software

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo; Marinello, Francesco; Bariani, Paolo

    2004-01-01

    AFM probes, owing to the sharpness of their tip combined with piezoelectric actuation, provide sub-nanometre vertical resolution and a few nanometres lateral resolution. On the other hand, the scanning range is limited to a few hundred micrometers. In this respect, mapping is a solution. During the past few years, an integrated instrument has been developed at IPL based

  2. TU-CD-BRB-09: Prediction of Chemo-Radiation Outcome for Rectal Cancer Based On Radiomics of Tumor Clinical Characteristics and Multi-Parametric MRI

    Energy Technology Data Exchange (ETDEWEB)

    Nie, K; Yue, N [Department of Radiaiton Oncology, Rutgers Cancer Institute of New Jersey, New Brunswick, NJ (United States); Shi, L; Hu, X; Chen, Q; Sun, X; Niu, T [Sir RunRun Shaw Hospital, College of Medicine, Zhejiang University, Hangzhou, Zhejiang (China)

    2015-06-15

    Purpose: To evaluate tumor clinical characteristics and quantitative multi-parametric MR imaging features for prediction of response to chemo-radiation treatment (CRT) in locally advanced rectal cancer (LARC). Methods: Forty-three consecutive patients (59.7±6.9 years, from 09/2013 – 06/2014) receiving neoadjuvant CRT followed by surgery were enrolled. All underwent MRI including anatomical T1/T2, Dynamic Contrast Enhanced (DCE)-MRI and Diffusion-Weighted MRI (DWI) prior to treatment. A total of 151 quantitative features, including morphology/Gray Level Co-occurrence Matrix (GLCM) texture from T1/T2, enhancement kinetics and the voxelized distribution from DCE-MRI, and apparent diffusion coefficient (ADC) from DWI, along with clinical information (carcinoembryonic antigen CEA level, TNM staging etc.), were extracted for each patient. Response groups were separated based on down-staging, good response and pathological complete response (pCR) status. Logistic regression analysis (LRA) was used to select the best predictors to classify the different groups, and the predictive performance was calculated using receiver operating characteristic (ROC) analysis. Results: Each individual imaging category or the clinical characteristics alone yielded a certain level of power in assessing response; however, the combined model outperformed any single category in prediction. With the selected features Volume, GLCM AutoCorrelation (T2), MaxEnhancementProbability (DCE-MRI), and MeanADC (DWI), the down-staging prediction accuracy (area under the ROC curve, AUC) reached 0.95, better than individual tumor metrics with AUCs from 0.53–0.85. For pCR prediction, the best set included CEA (clinical characteristics), Homogeneity (DCE-MRI) and MeanADC (DWI), with an AUC of 0.89, more favorable than conventional tumor metrics with AUCs ranging from 0.511–0.79. Conclusion: Through a systematic analysis of multi-parametric MR imaging features, we are able to build models with
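
A compact sketch of the modelling step, using scikit-learn's logistic regression with cross-validated ROC analysis; the feature values and labels below are synthetic placeholders standing in for the study's selected features, not its data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import roc_auc_score

# Hypothetical feature matrix for 43 patients: columns standing in for
# Volume, GLCM autocorrelation (T2), max enhancement probability (DCE-MRI)
# and mean ADC (DWI); y = 1 for responders (e.g. down-staging), else 0.
rng = np.random.default_rng(42)
X = rng.normal(size=(43, 4))
y = np.arange(43) % 2            # placeholder response labels

model = LogisticRegression()
probs = cross_val_predict(model, X, y, cv=5, method="predict_proba")[:, 1]
print("AUC:", roc_auc_score(y, probs))
```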

  3. Using Commercial Digital Cameras and Structure-from-Motion Software to Map Snow Cover Depth from Small Aircraft

    Science.gov (United States)

    Sturm, M.; Nolan, M.; Larsen, C. F.

    2014-12-01

    A long-standing goal in snow hydrology has been to map snow cover in detail, mapping either snow depth or snow water equivalent (SWE) at sub-meter resolution. Airborne LiDAR and aerial photogrammetry have been used successfully for this purpose, but both require significant investments in equipment and substantial processing effort. Here we detail a relatively inexpensive and simple airborne photogrammetric technique that can be used to measure snow depth. The main airborne hardware consists of a consumer-grade digital camera attached to a survey-quality, dual-frequency GPS. Photogrammetric processing is done using commercially available Structure-from-Motion (SfM) software that does not require ground control points. Digital elevation models (DEMs) are made from snow-free acquisitions in summer and snow-covered acquisitions in winter, and the maps are then differenced to arrive at snow thickness. We tested the accuracy and precision of snow depths measured with this system through 1) a comparison with airborne scanning LiDAR, 2) a comparison of results from two independent and slightly different photogrammetric systems, and 3) a comparison to extensive on-the-ground measured snow depths. Vertical accuracy and precision are on the order of ±30 cm and ±8 cm, respectively. The accuracy can be made to approach the precision if suitable snow-free ground control points exist and are used to co-register the summer and winter DEMs. Final snow depth accuracy from our series of tests was on the order of ±15 cm. This photogrammetric method substantially lowers the economic and expertise barriers to entry for mapping snow.
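
The core DEM-differencing step can be sketched in a few lines, assuming two already co-registered elevation grids (synthetic here):

```python
import numpy as np

# Hypothetical co-registered DEMs (metres) from SfM processing of the
# snow-free (summer) and snow-covered (winter) acquisitions.
rng = np.random.default_rng(7)
dem_summer = 100 + rng.normal(0, 0.05, size=(500, 500))
dem_winter = dem_summer + 1.2 + rng.normal(0, 0.08, size=(500, 500))

snow_depth = dem_winter - dem_summer     # per-pixel snow thickness

# Negative depths are physically impossible and flag residual
# co-registration or reconstruction errors.
snow_depth = np.where(snow_depth < 0, np.nan, snow_depth)
print("mean depth (m):", np.nanmean(snow_depth))
```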

  4. Analysis of pairwise correlations in multi-parametric PET/MR data for biological tumor characterization and treatment individualization strategies

    Energy Technology Data Exchange (ETDEWEB)

    Leibfarth, Sara; Moennich, David; Thorwarth, Daniela [University Hospital Tuebingen, Section for Biomedical Physics, Department of Radiation Oncology, Tuebingen (Germany); Simoncic, Urban [University Hospital Tuebingen, Section for Biomedical Physics, Department of Radiation Oncology, Tuebingen (Germany); University of Ljubljana, Faculty of Mathematics and Physics, Ljubljana (Slovenia); Jozef Stefan Institute, Ljubljana (Slovenia); Welz, Stefan; Zips, Daniel [University Hospital Tuebingen, Department of Radiation Oncology, Tuebingen (Germany); Schmidt, Holger; Schwenzer, Nina [University Hospital Tuebingen, Department of Diagnostic and Interventional Radiology, Tuebingen (Germany)

    2016-07-15

    The aim of this pilot study was to explore simultaneous functional PET/MR for biological characterization of tumors and potential future treatment adaptations. To investigate the extent of complementarity between different PET/MR-based functional datasets, a pairwise correlation analysis was performed. Functional datasets of N=15 head and neck (HN) cancer patients were evaluated. For patients of group A (N=7), combined PET/MR datasets including FDG-PET and ADC maps were available. Patients of group B (N=8) had FMISO-PET, DCE-MRI and ADC maps from combined PET/MRI, an additional dynamic FMISO-PET/CT acquired directly after FMISO tracer injection as well as an FDG-PET/CT acquired a few days earlier. From DCE-MRI, parameter maps K^trans, v_e and v_p were obtained with the extended Tofts model. Moreover, parameter maps of mean DCE enhancement, ΔS_DCE, and mean FMISO signal 0-4 min p.i., Ā_FMISO, were derived. Pairwise correlations were quantified using the Spearman correlation coefficient (r) on both a voxel and a regional level within the gross tumor volume. Between some pairs of functional imaging modalities moderate correlations were observed with respect to the median over all patient datasets, whereas distinct correlations were only present on an individual basis. The highest inter-modality median correlations on the voxel level were obtained for FDG/FMISO (r = 0.56), FDG/Ā_FMISO (r = 0.55), Ā_FMISO/ΔS_DCE (r = 0.46), and FDG/ADC (r = -0.39). Correlations on the regional level showed comparable results. The results of this study suggest that the examined functional datasets provide complementary information. However, only pairwise correlations were examined, and correlations could still exist between combinations of three or more datasets. These results might contribute to the future design of individually adapted treatment approaches based on multiparametric functional imaging.
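
A minimal sketch of the voxel-level correlation analysis, assuming two functional parameter maps already resampled to a common grid and masked to the gross tumor volume; the values are synthetic:

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical voxel values of two functional parameters sampled within
# one patient's gross tumor volume (after resampling to a common grid).
rng = np.random.default_rng(3)
fdg_suv = rng.gamma(2.0, 1.5, size=5000)                 # e.g. FDG uptake
adc = 1.2 - 0.05 * fdg_suv + rng.normal(0, 0.3, 5000)    # e.g. ADC map

r, p = spearmanr(fdg_suv, adc)
print(f"Spearman r = {r:.2f}, p = {p:.1e}")
```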

  5. Interim report on the development and application of environmental mapped data digitization, encoding, analysis, and display software for the ALICE system. Volume II. [MAP, CHAIN, FIX, and DOUT, in FORTRAN IV for PDP-10

    Energy Technology Data Exchange (ETDEWEB)

    Amiot, L.W.; Lima, R.J.; Scholbrock, S.D.; Shelman, C.B.; Wehman, R.H.

    1979-06-01

    Volume I of An Interim Report on the Development and Application of Environmental Mapped Data Digitization, Encoding, Analysis, and Display Software for the ALICE System provided an overall description of the software developed for the ALICE System and presented an example of its application. The scope of the information presented in Volume I was directed to both users and developers of digitization, encoding, analysis, and display software. Volume II presents information directly related to the actual computer code and operational characteristics (keys and subroutines) of the software, and will therefore be of more interest to software developers than to users. However, developers should be aware that the code developed for the ALICE System operates in an environment where much of the peripheral hardware to the PDP-10 is ANL/AMD built. For this reason, portions of the code may have to be modified for implementation on other computer system configurations. 11 tables.

  6. The gastrointestinal electrical mapping suite (GEMS): software for analyzing and visualizing high-resolution (multi-electrode) recordings in spatiotemporal detail

    Directory of Open Access Journals (Sweden)

    Yassi Rita

    2012-06-01

    Abstract Background: Gastrointestinal contractions are controlled by an underlying bioelectrical activity. High-resolution spatiotemporal electrical mapping has become an important advance for investigating gastrointestinal electrical behaviors in health and motility disorders. However, research progress has been constrained by the low efficiency of the data analysis tasks. This work introduces a new efficient software package, GEMS (Gastrointestinal Electrical Mapping Suite), for analyzing and visualizing high-resolution multi-electrode gastrointestinal mapping data in spatiotemporal detail. Results: GEMS incorporates a number of new and previously validated automated analytical and visualization methods into a coherent framework coupled to an intuitive and user-friendly graphical user interface. GEMS is implemented in MATLAB®, which combines sophisticated mathematical operations with GUI compatibility. Recorded slow wave data can be filtered via a range of inbuilt techniques and efficiently analyzed via automated event-detection and cycle clustering algorithms, and high quality isochronal activation maps, velocity field maps, amplitude maps, frequency (time interval) maps and data animations can be rapidly generated. Normal and dysrhythmic activities can be analyzed, including initiation and conduction abnormalities. The software is distributed free to academics via a community user website and forum (http://sites.google.com/site/gimappingsuite). Conclusions: This software allows for the rapid analysis and generation of critical results from gastrointestinal high-resolution electrical mapping data, including quantitative analysis and graphical outputs for qualitative analysis. The software is designed to be used by non-experts in data and signal processing, and is intended to be used by clinical researchers as well as physiologists and bioengineers. The use and distribution of this software package will greatly accelerate efforts to improve the

  7. Longitudinal Assessment of Amyloid Pathology in Transgenic ArcAβ Mice Using Multi-Parametric Magnetic Resonance Imaging.

    Directory of Open Access Journals (Sweden)

    Jan Klohs

    Magnetic resonance imaging (MRI) can be used to monitor pathological changes in Alzheimer's disease (AD). The objective of this longitudinal study was to assess the effects of progressive amyloid-related pathology on multiple MRI parameters in transgenic arcAβ mice, a mouse model of cerebral amyloidosis. Diffusion-weighted imaging (DWI), T1-mapping and quantitative susceptibility mapping (QSM), a novel MRI-based technique, were applied to monitor structural alterations and changes in tissue composition imposed by the pathology over time. Vascular function and integrity was studied by assessing blood-brain barrier integrity with dynamic contrast-enhanced MRI and cerebral microbleed (CMB) load with susceptibility-weighted imaging and QSM. A linear mixed effects model was built for each MRI parameter to incorporate effects within and between groups (i.e. genotype) and to account for changes unrelated to the disease pathology. Linear mixed effects modelling revealed a strong association of all investigated MRI parameters with age. DWI and QSM in addition revealed differences between arcAβ and wild-type mice over time. CMBs became apparent in arcAβ mice at 9 months of age, and the CMB load reflected disease stage. This study demonstrates the benefits of linear mixed effects modelling of longitudinal imaging data. Moreover, the diagnostic utility of QSM and assessment of CMB load should be exploited further in studies of AD.
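
A sketch of a linear mixed effects model of the kind described, using statsmodels with a random intercept per animal; the dataset below is simulated and the variable names are hypothetical:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical longitudinal dataset: one MRI parameter (e.g. a QSM value)
# measured repeatedly in transgenic (tg=1) and wild-type (tg=0) mice.
rng = np.random.default_rng(5)
n_mice, n_visits = 20, 4
df = pd.DataFrame({
    "mouse": np.repeat(np.arange(n_mice), n_visits),
    "age": np.tile([3, 6, 9, 12], n_mice),
    "tg": np.repeat(rng.integers(0, 2, n_mice), n_visits),
})
df["qsm"] = (0.02 * df["age"] + 0.03 * df["tg"] * df["age"]
             + rng.normal(0, 0.05, len(df)))

# Random intercept per animal accounts for within-subject correlation;
# the age x genotype interaction tests for group differences over time.
result = smf.mixedlm("qsm ~ age * tg", df, groups=df["mouse"]).fit()
print(result.summary())
```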

  8. Web Mapping Architectures Based on Open Specifications and Free and Open Source Software in the Water Domain

    Science.gov (United States)

    Arias Muñoz, C.; Brovelli, M. A.; Kilsedar, C. E.; Moreno-Sanchez, R.; Oxoli, D.

    2017-09-01

    The availability of water-related data and information across different geographical and jurisdictional scales is of critical importance for the conservation and management of water resources in the 21st century. Today information assets are often found fragmented across multiple agencies that use incompatible data formats and procedures for data collection, storage, maintenance, analysis, and distribution. The growing adoption of Web mapping systems in the water domain is reducing the gap between data availability and its practical use and accessibility. Nevertheless, more attention must be given to the design and development of these systems to achieve high levels of interoperability and usability while fulfilling different end user informational needs. This paper first presents a brief overview of technologies used in the water domain, and then presents three examples of Web mapping architectures based on free and open source software (FOSS) and the use of open specifications (OS) that address different users' needs for data sharing, visualization, manipulation, scenario simulations, and map production. The purpose of the paper is to illustrate how the latest developments in OS for geospatial and water-related data collection, storage, and sharing, combined with the use of mature FOSS projects facilitate the creation of sophisticated interoperable Web-based information systems in the water domain.

  9. WEB MAPPING ARCHITECTURES BASED ON OPEN SPECIFICATIONS AND FREE AND OPEN SOURCE SOFTWARE IN THE WATER DOMAIN

    Directory of Open Access Journals (Sweden)

    C. Arias Muñoz

    2017-09-01

    The availability of water-related data and information across different geographical and jurisdictional scales is of critical importance for the conservation and management of water resources in the 21st century. Today information assets are often found fragmented across multiple agencies that use incompatible data formats and procedures for data collection, storage, maintenance, analysis, and distribution. The growing adoption of Web mapping systems in the water domain is reducing the gap between data availability and its practical use and accessibility. Nevertheless, more attention must be given to the design and development of these systems to achieve high levels of interoperability and usability while fulfilling different end user informational needs. This paper first presents a brief overview of technologies used in the water domain, and then presents three examples of Web mapping architectures based on free and open source software (FOSS) and the use of open specifications (OS) that address different users' needs for data sharing, visualization, manipulation, scenario simulations, and map production. The purpose of the paper is to illustrate how the latest developments in OS for geospatial and water-related data collection, storage, and sharing, combined with the use of mature FOSS projects facilitate the creation of sophisticated interoperable Web-based information systems in the water domain.

  10. Prediction of cardiac complications for thalassemia major in the widespread cardiac magnetic resonance era: a prospective multicentre study by a multi-parametric approach.

    Science.gov (United States)

    Pepe, Alessia; Meloni, Antonella; Rossi, Giuseppe; Midiri, Massimo; Missere, Massimiliano; Valeri, Gianluca; Sorrentino, Francesco; D'Ascola, Domenico Giuseppe; Spasiano, Anna; Filosa, Aldo; Cuccia, Liana; Dello Iacono, Nicola; Forni, Gianluca; Caruso, Vincenzo; Maggio, Aurelio; Pitrolo, Lorella; Peluso, Angelo; De Marchi, Daniele; Positano, Vincenzo; Wood, John C

    2017-02-14

    Cardiovascular magnetic resonance (CMR) has dramatically changed clinical practice in thalassemia major (TM), lowering cardiac complications. We prospectively reassessed the predictive value of CMR parameters for heart failure (HF) and arrhythmias in TM. We considered 481 white TM patients (29.48 ± 8.93 years, 263 females) enrolled in the Myocardial Iron Overload in Thalassemia (MIOT) network. Myocardial and liver iron overload were measured by the T2* multiecho technique. Atrial dimensions and biventricular function were quantified from cine images. Late gadolinium enhancement images were acquired to detect myocardial fibrosis. Mean follow-up was 57.91 ± 18.23 months. After the first CMR scan, 69.6% of the patients changed chelation regimen. We recorded 18 episodes of HF. In the multivariate analysis the independent predictive factors were myocardial fibrosis (HR = 10.94, 95% CI = 3.28–36.43, P […]) […] risk of iron-mediated HF and of arrhythmias than previously reported. Homogeneous MIO remained a risk factor for HF, but myocardial fibrosis and ventricular dysfunction also identified patients at high risk. Arrhythmias were independent of MIO but increased with atrial dilatation. CMR by a multi-parametric approach dramatically improves cardiac outcomes and provides prognostic information beyond cardiac iron estimation.

  11. Multi-Parametric MRI-Directed Focal Salvage Permanent Interstitial Brachytherapy for Locally Recurrent Adenocarcinoma of the Prostate: A Novel Approach.

    Science.gov (United States)

    Wallace, T; Avital, I; Stojadinovic, A; Brücher, B L D M; Cote, E; Yu, J

    2013-01-01

    Even with the technological advances of dose-escalated IMRT and the addition of the latest image-guidance technologies, local failures still occur. The combination of MRI-based imaging techniques can yield quantitative information that reflects the biological properties of prostatic tissues. These techniques provide unique information that can be used for tumor detection in the treated gland. With the advent of these improved imaging modalities, it has become possible to image local recurrences within the prostate gland more effectively. With better imaging, these focal recurrences can be differentially targeted with salvage brachytherapy, minimizing rectal and bladder toxicity. Here we report a novel use of MRI-directed focal brachytherapy after local recurrence. This technique offers a unique opportunity to safely and successfully treat recurrent prostate cancer previously treated with definitive radiation therapy. The use of multi-parametric MRI-directed focal salvage permanent interstitial brachytherapy for locally recurrent adenocarcinoma of the prostate is a promising strategy to avoid more aggressive and expensive treatments that are associated with increased morbidity, potentially improving survival at potentially lower costs.

  12. Local Dynamic Map as a Modular Software Framework for Driver Assistance Systems

    Science.gov (United States)

    Reisdorf, P.; Auerswald, A.; Wanielik, G.

    2015-11-01

    Modern driver assistance systems are based on the processing of information obtained through environment perception with various types of sensors. In addition to information from the vehicle itself, different communication channels (Car2Car, Car2X, ...) enable an extended perception of the environment (see Fig. 1). These data must be prepared and made available to an application in a targeted way, which can be accomplished with a Local Dynamic Map (LDM). This publication describes the structure, purpose, and properties of an LDM we have developed, and discusses several applications realized with it.

  13. Surface Mapping with an AFM-CMM Integrated System and Stitching Software

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo; Marinello, Francesco; Bariani, Paolo

    2004-01-01

    In the context of micro-technology, dimensions on the order of hundreds of micrometers often have to be measured, while the detection of the finest details and the analysis of nano-roughness call for the use of highly resolving sensors. AFM probes, owing to the sharpness of their tip combined with piezoelectric actuation, provide sub-nanometre vertical resolution and a few nanometres lateral resolution. On the other hand, the scanning range is limited to a few hundred micrometers. In this respect, mapping is a solution. During the past few years, an integrated instrument has been developed at IPL based

  14. SNPFile - A software library and file format for large scale association mapping and population genetics studies

    DEFF Research Database (Denmark)

    Nielsen, Jesper; Mailund, Thomas

    2008-01-01

    and manipulate the data. While spreadsheets and flat text files were adequate solutions earlier, the increased data size mandates more efficient solutions. Results: We describe a new binary file format for SNP data, together with a software library for file manipulation. The file format stores genotype data together with any kind of additional data, using a flexible serialisation mechanism. The format is designed to be IO efficient for the access patterns of most multi-locus analysis methods. Conclusion: The new file format has been very useful for our own studies, where it has significantly reduced the informatics burden in keeping track of various secondary data, and where the memory and IO efficiency has greatly simplified analysis runs. A main limitation of the file format is that it is only supported by the very limited set of analysis tools developed in our own lab. This is somewhat alleviated
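
A toy illustration of the general idea of a binary genotype container with a small header; the layout shown is invented for illustration and is not the actual SNPFile format:

```python
import struct

# Minimal sketch of a binary genotype container: a short header followed
# by one byte per genotype (0/1/2 minor-allele counts, 3 = missing).
MAGIC, VERSION = b"SNP1", 1

def write_snpfile(path, n_individuals, genotypes):
    with open(path, "wb") as f:
        f.write(MAGIC)
        f.write(struct.pack("<HI", VERSION, n_individuals))
        f.write(bytes(genotypes))          # row-major: loci x individuals

def read_snpfile(path):
    with open(path, "rb") as f:
        assert f.read(4) == MAGIC
        version, n = struct.unpack("<HI", f.read(6))
        data = f.read()
    # Slice the flat byte string back into one row per locus.
    return n, [data[i:i + n] for i in range(0, len(data), n)]

write_snpfile("toy.snp", 4, [0, 1, 2, 3, 2, 2, 0, 1])
print(read_snpfile("toy.snp"))
```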

  15. The GEMPAK Barnes interactive objective map analysis scheme. [General Meteorological Software Package

    Science.gov (United States)

    Koch, S. E.; Kocin, P. J.; Desjardins, M.

    1983-01-01

    The analysis scheme and meteorological applications of the GEMPAK data analysis and display software system developed by NASA are described. The program was devised to permit objective, versatile, and practical analysis of satellite meteorological data using a minicomputer and a display system with graphics capability. A data area can be selected within the data file for the globe, and data-sparse regions can be avoided. Distances between observations and the nearest observation points are calculated in order to avoid errors when determining synoptic weather conditions. The Barnes (1973) successive correction method is employed to restore the amplitude of small yet resolvable wavelengths suppressed in an initial filtering pass. The rms deviation is then calculated in relation to available measured data. Examples are provided of treatment of VISSR data from the GOES satellite and a study of the impact of incorrect cloud height data on synoptic weather field analysis.
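
A compact sketch of the two-pass Barnes (1973) successive-correction scheme on scattered observations; the length-scale and convergence parameters (kappa, gamma) and the observations themselves are illustrative:

```python
import numpy as np

def barnes(obs_xy, obs_val, grid_xy, kappa=2.0, gamma=0.3):
    """Two-pass Barnes (1973) successive-correction analysis."""
    def gauss_pass(targets, values, length):
        d2 = ((targets[:, None, :] - obs_xy[None, :, :]) ** 2).sum(axis=2)
        w = np.exp(-d2 / length)
        return (w * values).sum(axis=1) / w.sum(axis=1)

    first = gauss_pass(grid_xy, obs_val, kappa)   # smooth first guess
    at_obs = gauss_pass(obs_xy, obs_val, kappa)   # first guess at stations
    resid = obs_val - at_obs                      # unresolved residuals
    # Second pass with a reduced length scale restores the amplitude of
    # small yet resolvable wavelengths suppressed by the first pass.
    return first + gauss_pass(grid_xy, resid, gamma * kappa)

# Hypothetical scattered observations and analysis grid.
rng = np.random.default_rng(11)
obs_xy = rng.uniform(0, 10, size=(50, 2))
obs_val = np.sin(obs_xy[:, 0]) + 0.1 * rng.normal(size=50)
gx, gy = np.meshgrid(np.linspace(0, 10, 60), np.linspace(0, 10, 60))
grid_xy = np.column_stack([gx.ravel(), gy.ravel()])
analysis = barnes(obs_xy, obs_val, grid_xy).reshape(gx.shape)
```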

  16. Fracture network evaluation program (FraNEP): A software for analyzing 2D fracture trace-line maps

    Science.gov (United States)

    Zeeb, Conny; Gomez-Rivas, Enrique; Bons, Paul D.; Virgo, Simon; Blum, Philipp

    2013-10-01

    Fractures, such as joints, faults and veins, strongly influence the transport of fluids through rocks by either enhancing or inhibiting flow. Techniques used for the automatic detection of lineaments from satellite images and aerial photographs, LIDAR technologies and borehole televiewers have significantly enhanced data acquisition. The analysis of such data is often performed manually or with different analysis software. Here we present a novel program for the analysis of 2D fracture networks called FraNEP (Fracture Network Evaluation Program). The program was developed using Visual Basic for Applications in Microsoft Excel™ and combines features from different existing software and characterization techniques. The main novelty of FraNEP is the ability to analyse trace-line maps of fracture networks using (1) scanline sampling, (2) window sampling or (3) the circular scanline and window method, without the need to switch programs. Additionally, binning problems are avoided by using cumulative distributions rather than probability density functions. FraNEP is a time-efficient tool for the characterisation of fracture network parameters, such as density, intensity and mean length. Furthermore, fracture strikes can be visualized using rose diagrams, and a fitting routine evaluates the distribution of fracture lengths. As an example of its application, we use FraNEP to analyse a case study of lineament data from a satellite image of the Oman Mountains.
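
A small sketch of the kind of scanline-based statistics such a program reports, e.g. linear fracture frequency, mean trace length and a cumulative length distribution; the trace data below are invented:

```python
import numpy as np

# Hypothetical fracture trace data: intersection distances (m) along one
# scanline and the measured trace lengths (m) of the same fractures.
distances = np.array([0.4, 1.1, 2.3, 2.9, 4.2, 5.0, 6.7, 8.1])
lengths = np.array([0.8, 2.5, 1.2, 4.0, 0.6, 3.1, 1.9, 2.2])
scanline_length = 10.0

p10 = len(distances) / scanline_length    # linear fracture frequency (1/m)
mean_length = lengths.mean()              # mean trace length (m)

# Cumulative distribution of lengths (fraction of fractures longer than
# each value), which avoids the binning problems mentioned above.
sorted_len = np.sort(lengths)
frac_longer = 1.0 - np.arange(1, len(sorted_len) + 1) / len(sorted_len)
```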

  17. A software tool for advanced MRgFUS prostate therapy planning and follow up

    Science.gov (United States)

    van Straaten, Dörte; Hoogenboom, Martijn; van Amerongen, Martinus J.; Weiler, Florian; Issawi, Jumana Al; Günther, Matthias; Fütterer, Jurgen; Jenne, Jürgen W.

    2017-03-01

    US-guided HIFU/FUS ablation for the therapy of prostate cancer is a clinically established method, while MR-guided HIFU/FUS applications for the prostate have only recently entered clinical evaluation. Although MRI examination is an excellent diagnostic tool for prostate cancer, it is a time-consuming procedure and not practicable within an MRgFUS therapy session. The aim of our ongoing work is to develop software to support therapy planning and post-therapy follow-up for MRgFUS on localized prostate cancer, based on multi-parametric MR protocols. The clinical workflow of diagnosis, therapy and follow-up of MR-guided FUS on prostate cancer was analyzed in depth. Based on this, the image processing workflow was designed, and all necessary components, e.g. GUI, viewer and registration tools, were defined and implemented. The software is based on MeVisLab, with several C++ modules implemented for the image processing tasks. The developed software, called LTC (Local Therapy Control), will automatically register and visualize all images (T1w, T2w, DWI, etc.) and ADC or perfusion maps obtained from the diagnostic MRI session. This maximum of diagnostic information helps to segment all necessary ROIs, e.g. the tumor, for therapy planning. Final therapy planning will be performed based on these segmentation data in the following MRgFUS therapy session. In addition, the software should help to evaluate therapy success through synchronization and display of pre-therapeutic, therapy and follow-up image data, including the therapy plan and thermal dose information. In this ongoing project, the first stand-alone prototype has been completed and will be clinically evaluated.

  18. Complex interplay between brain function and structure during cerebral amyloidosis in APP transgenic mouse strains revealed by multi-parametric MRI comparison.

    Science.gov (United States)

    Grandjean, Joanes; Derungs, Rebecca; Kulic, Luka; Welt, Tobias; Henkelman, Mark; Nitsch, Roger M; Rudin, Markus

    2016-07-01

    Alzheimer's disease is a fatal neurodegenerative disorder affecting the aging population. Neuroimaging methods, in particular magnetic resonance imaging (MRI), have helped reveal alterations in the brain structure, metabolism, and function of patients and in groups at risk of developing AD, yet the nature of these alterations is poorly understood. Neuroimaging in mice is attractive for investigating mechanisms underlying functional and structural changes associated with AD pathology. Several preclinical murine models of AD have been generated based on transgenic insertion of human mutated APP genes. Depending on the specific mutations, mouse strains express different aspects of amyloid pathology, e.g. intracellular amyloid-β (Aβ) aggregates, parenchymal plaques, or cerebral amyloid angiopathy. We have applied multi-parametric MRI in three transgenic mouse lines to compare changes in brain function with resting-state fMRI and in structure with diffusion tensor imaging and high-resolution anatomical imaging. E22ΔAβ mice, which develop intracellular Aβ aggregates, did not present functional or structural alterations compared to their wild-type littermates. PSAPP mice, which display parenchymal amyloid plaques, showed mild functional changes within the supplementary and barrel field cortices and increased isocortical volume relative to controls. Extensive reductions in functional connectivity in the sensory-motor cortices and within the default mode network, as well as a local volume increase in the midbrain relative to wild-type, were observed in ArcAβ mice, which bear intracellular Aβ aggregates as well as parenchymal and vascular amyloid deposits. Patterns of functional and structural changes thus appear to be strain-specific and not directly related to amyloid deposition.

  19. Multi-field-of-view strategy for image-based outcome prediction of multi-parametric estrogen receptor-positive breast cancer histopathology: Comparison to Oncotype DX

    Directory of Open Access Journals (Sweden)

    Ajay Basavanhally

    2011-01-01

    In this paper, we attempt to quantify the prognostic information embedded in multi-parametric histologic biopsy images to predict disease aggressiveness in estrogen receptor-positive (ER+) breast cancers (BCa). The novel methodological contribution is in the use of a multi-field-of-view (multi-FOV) framework for integrating image-based information from differently stained histopathology slides. The multi-FOV approach involves a fixed image resolution while simultaneously integrating image descriptors from many FOVs corresponding to different sizes. For each study, the corresponding risk score (high scores reflecting aggressive disease and vice versa), predicted by a molecular assay (Oncotype DX), is available and serves as the surrogate ground truth for long-term patient outcome. Using the risk scores, a trained classifier is used to identify disease aggressiveness for each FOV size. The predictions for each FOV are then combined to yield the final prediction of disease aggressiveness (good, intermediate, or poor outcome). Independent multi-FOV classifiers are constructed for (1) 50 image features describing the spatial arrangement of cancer nuclei (via Voronoi diagram, Delaunay triangulation, and minimum spanning tree graphs) in H and E stained histopathology and (2) one image feature describing the vascular density in CD34 IHC stained histopathology. In a cohort of 29 patients, the multi-FOV classifiers obtained by combining information from the H and E and CD34 IHC stained channels were able to distinguish low- and high-risk patients with an accuracy of 0.91 ± 0.02 and a positive predictive value of 0.94 ± 0.10, suggesting that a purely image-based assay could potentially replace more expensive molecular assays for making disease prognostic predictions.

  20. Multi-field-of-view strategy for image-based outcome prediction of multi-parametric estrogen receptor-positive breast cancer histopathology: Comparison to Oncotype DX.

    Science.gov (United States)

    Basavanhally, Ajay; Feldman, Michael; Shih, Natalie; Mies, Carolyn; Tomaszewski, John; Ganesan, Shridar; Madabhushi, Anant

    2011-01-01

    In this paper, we attempt to quantify the prognostic information embedded in multi-parametric histologic biopsy images to predict disease aggressiveness in estrogen receptor-positive (ER+) breast cancers (BCa). The novel methodological contribution is in the use of a multi-field-of-view (multi-FOV) framework for integrating image-based information from differently stained histopathology slides. The multi-FOV approach involves a fixed image resolution while simultaneously integrating image descriptors from many FOVs corresponding to different sizes. For each study, the corresponding risk score (high scores reflecting aggressive disease and vice versa), predicted by a molecular assay (Oncotype DX), is available and serves as the surrogate ground truth for long-term patient outcome. Using the risk scores, a trained classifier is used to identify disease aggressiveness for each FOV size. The predictions for each FOV are then combined to yield the final prediction of disease aggressiveness (good, intermediate, or poor outcome). Independent multi-FOV classifiers are constructed for (1) 50 image features describing the spatial arrangement of cancer nuclei (via Voronoi diagram, Delaunay triangulation, and minimum spanning tree graphs) in H and E stained histopathology and (2) one image feature describing the vascular density in CD34 IHC stained histopathology. In a cohort of 29 patients, the multi-FOV classifiers obtained by combining information from the H and E and CD34 IHC stained channels were able to distinguish low- and high-risk patients with an accuracy of 0.91 ± 0.02 and a positive predictive value of 0.94 ± 0.10, suggesting that a purely image-based assay could potentially replace more expensive molecular assays for making disease prognostic predictions.
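
The multi-FOV combination step can be sketched as one classifier per FOV size whose predicted probabilities are averaged; the features, labels and FOV sizes below are placeholders, and a real analysis would use cross-validation rather than the in-sample predictions shown here:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical per-FOV feature matrices: for each FOV size, one feature
# vector per patient, plus binary labels derived from the risk scores.
rng = np.random.default_rng(9)
fov_sizes = [250, 500, 1000]               # illustrative FOV edge lengths
X_by_fov = {s: rng.normal(size=(29, 50)) for s in fov_sizes}
y = np.arange(29) % 2                      # placeholder risk labels

# One classifier per FOV size; the final call averages their probabilities.
probs = []
for s in fov_sizes:
    clf = RandomForestClassifier(random_state=0).fit(X_by_fov[s], y)
    probs.append(clf.predict_proba(X_by_fov[s])[:, 1])
combined = np.mean(probs, axis=0)          # consensus aggressiveness score
```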

  1. An evaluation of morphological and functional multi-parametric MRI sequences in classifying non-muscle and muscle invasive bladder cancer.

    Science.gov (United States)

    Panebianco, Valeria; De Berardinis, Ettore; Barchetti, Giovanni; Simone, Giuseppe; Leonardo, Constantino; Grompone, Marcello Domenico; Del Monte, Maurizio; Carano, Davide; Gallucci, Michele; Catto, James; Catalano, Carlo

    2017-09-01

    Our goal is to determine the ability of multi-parametric magnetic resonance imaging (mpMRI) to differentiate muscle-invasive bladder cancer (MIBC) from non-muscle-invasive bladder cancer (NMIBC). Patients underwent mpMRI before tumour resection. Four MRI sets, i.e. T2-weighted (T2W) + perfusion-weighted imaging (PWI), T2W + diffusion-weighted imaging (DWI), T2W + DWI + PWI, and T2W + DWI + PWI + diffusion tensor imaging (DTI), were interpreted qualitatively by two radiologists blinded to the histology results. PWI, DWI and DTI were also analysed quantitatively. Accuracy was determined using histopathology as the reference standard. A total of 82 tumours were analysed. Ninety-six percent of tumours labelled T1 by the T2W + DWI + PWI image set were confirmed to be NMIBC at histopathology. The overall accuracy of the complete mpMRI protocol was 94% in differentiating NMIBC from MIBC. PWI, DWI and DTI quantitative parameters were shown to be significantly different in cancerous versus non-cancerous areas within the bladder wall in T2-labelled lesions. mpMRI with DWI and DTI appears to be a reliable staging tool for bladder cancer. If our data are validated, then mpMRI could precede cystoscopic resection to allow faster recognition of MIBC and accelerated treatment pathways. • A critical step in BCa staging is to differentiate NMIBC from MIBC. • Morphological and functional sequences are reliable techniques for differentiating NMIBC from MIBC. • Diffusion tensor imaging could be an additional tool in BCa staging.

  2. Sparse representation of multi-parametric DCE-MRI features using K-SVD for classifying gene expression-based breast cancer recurrence risk

    Science.gov (United States)

    Mahrooghy, Majid; Ashraf, Ahmed B.; Daye, Dania; Mies, Carolyn; Rosen, Mark; Feldman, Michael; Kontos, Despina

    2014-03-01

    We evaluate the prognostic value of sparse representation-based features by applying the K-SVD algorithm to multi-parametric kinetic, textural, and morphologic features in breast dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI). K-SVD is an iterative dimensionality reduction method that optimally reduces the initial feature space by updating the dictionary columns jointly with the sparse representation coefficients. By using K-SVD, we therefore not only obtain a sparse representation that condenses the information into a few coefficients but also reduce the dimensionality. The extracted K-SVD features are evaluated with a logistic regression classifier on the task of classifying high versus low breast cancer recurrence risk as determined by a validated gene expression assay. The features are evaluated using ROC curve analysis and leave-one-out cross-validation for different sparse representation and dimensionality reduction settings. Optimal sparse representation is obtained when the number of dictionary elements is 4 (K=4) and the maximum number of non-zero coefficients is 2 (L=2). We compare K-SVD with ANOVA-based feature selection for the same prognostic features. The ROC results show that the AUCs of the K-SVD based (K=4, L=2), the ANOVA-based, and the original features (i.e., no dimensionality reduction) are 0.78, 0.71, and 0.68, respectively. From these results, it can be inferred that by using a sparse representation of the originally extracted multi-parametric, high-dimensional data, we can condense the information into a few coefficients with the highest predictive value. In addition, the dimensionality reduction introduced by K-SVD can prevent models from over-fitting.
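
    As a rough illustration of the sparse-coding step: scikit-learn's DictionaryLearning can learn a K=4 dictionary and OMP codes with at most L=2 non-zero coefficients, matching the settings reported above. This is a stand-in sketch (K-SVD itself is not in scikit-learn) and the feature matrix is a placeholder.

        import numpy as np
        from sklearn.decomposition import DictionaryLearning

        X = np.random.rand(50, 20)   # placeholder: 50 lesions x 20 kinetic/textural/morphologic features

        dico = DictionaryLearning(n_components=4,               # K = 4 dictionary atoms
                                  transform_algorithm="omp",
                                  transform_n_nonzero_coefs=2,  # L = 2 non-zero coefficients
                                  random_state=0)
        codes = dico.fit_transform(X)  # sparse codes, e.g. inputs to a logistic regression classifier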

  3. Simple and cost-effective hardware and software for functional brain mapping using intrinsic optical signal imaging.

    Science.gov (United States)

    Harrison, Thomas C; Sigler, Albrecht; Murphy, Timothy H

    2009-09-15

    We describe a simple and low-cost system for intrinsic optical signal (IOS) imaging using stable LED light sources, basic microscopes, and commonly available CCD cameras. IOS imaging measures activity-dependent changes in the light reflectance of brain tissue, and can be performed with a minimum of specialized equipment. Our system uses LED ring lights that can be mounted on standard microscope objectives or video lenses to provide a homogeneous and stable light source, with less than 0.003% fluctuation across images averaged from 40 trials. We describe the equipment and surgical techniques necessary for both acute and chronic mouse preparations, and provide software that can create maps of sensory representations from images captured by inexpensive 8-bit or 12-bit cameras. The IOS imaging system can be adapted to commercial upright microscopes or custom macroscopes, eliminating the need for dedicated equipment or complex optical paths. This method can be combined with parallel high-resolution imaging techniques such as two-photon microscopy.
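
    The core computation behind such maps is the fractional change in reflectance. A minimal sketch, assuming a trial-organized image stack and hand-chosen baseline and stimulus windows (the paper's actual analysis pipeline is richer):

        import numpy as np

        def ios_response_map(trials, baseline_frames, stim_frames):
            # trials: (n_trials, n_frames, h, w) reflectance stack from the CCD camera.
            # baseline_frames / stim_frames: frame ranges, e.g. slice(0, 10) and slice(15, 30).
            mean_stack = trials.mean(axis=0)                  # average over trials
            r0 = mean_stack[baseline_frames].mean(axis=0)     # pre-stimulus baseline reflectance
            r_stim = mean_stack[stim_frames].mean(axis=0)     # response-window reflectance
            return (r_stim - r0) / r0                         # activity-dependent dR/R map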

  4. General Purpose Real-time Data Analysis and Visualization Software for Volcano Observatories

    Science.gov (United States)

    Cervelli, P. F.; Miklius, A.; Antolik, L.; Parker, T.; Cervelli, D.

    2011-12-01

    In 2002, the USGS developed the Valve software for management, visualization, and analysis of volcano monitoring data. In 2004, the USGS developed similar software, called Swarm, for the same purpose but specifically tailored for seismic waveform data. Since then, both of these programs have become ubiquitous at US volcano observatories, and in the case of Swarm, common at volcano observatories across the globe. Though innovative from the perspective of software design, neither program is methodologically novel. Indeed, the software can perform little more than elementary 2D graphing, along with basic geophysical analysis. So, why is the software successful? The answer is that both of these programs take data from the realm of discipline specialists and make them universally available to all observatory scientists. In short, the software creates additional value from existing data by leveraging the observatory's entire intellectual capacity. It enables rapid access to different data streams, and allows anyone to compare these data on a common time scale or map base. It frees discipline specialists from routine tasks like preparing graphics or compiling data tables, thereby making more time for interpretive efforts. It helps observatory scientists browse through data, and streamlines routine checks for unusual activity. It encourages a multi-parametric approach to volcano monitoring. And, by means of its own usefulness, it creates incentive to organize and capture data streams not yet available. Valve and Swarm are both written in Java, open-source, and freely available. Swarm is a stand-alone Java application. Valve is a system consisting of three parts: a web-based user interface, a graphing and analysis engine, and a data server. Both can be used non-interactively (e.g., via scripts) to generate graphs or to dump raw data. Swarm has a simple, built-in alarm capability. Several alarm algorithms have been built around Valve. Both programs remain under active development.

  5. sfDM: Open-Source Software for Temporal Analysis and Visualization of Brain Tumor Diffusion MR Using Serial Functional Diffusion Mapping.

    Science.gov (United States)

    Ceschin, Rafael; Panigrahy, Ashok; Gopalakrishnan, Vanathi

    2015-01-01

    A major challenge in the diagnosis and treatment of brain tumors is tissue heterogeneity, which leads to mixed treatment response. In addition, these tumors are often difficult or very high-risk to biopsy, further hindering clinical management. To overcome this, novel advanced imaging methods are increasingly being adapted clinically to identify useful noninvasive biomarkers capable of disease stage characterization and treatment response prediction. One promising technique is called functional diffusion mapping (fDM), which uses diffusion-weighted imaging (DWI) to generate parametric maps between two imaging time points in order to identify significant voxel-wise changes in water diffusion within the tumor tissue. Here we introduce serial functional diffusion mapping (sfDM), an extension of existing fDM methods, to analyze the entire tumor diffusion profile along the temporal course of the disease. sfDM provides the tools necessary to analyze a tumor data set in the context of spatiotemporal parametric mapping: the image registration pipeline, biomarker extraction, and visualization tools. We present the general workflow of the pipeline, along with a typical use case for the software. sfDM is written in Python and is freely available as an open-source package under the Berkeley Software Distribution (BSD) license to promote transparency and reproducibility.
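
    The fDM idea reduces to a voxel-wise comparison of co-registered diffusion maps. A minimal sketch, with an illustrative threshold rather than sfDM's actual default:

        import numpy as np

        def fdm_classify(adc_t0, adc_t1, threshold=0.4e-3):
            # Voxel-wise fDM classification of two co-registered ADC maps (mm^2/s).
            # Returns +1 where diffusion increased, -1 where it decreased, 0 otherwise.
            # The threshold here is a placeholder, not sfDM's published value.
            delta = adc_t1 - adc_t0
            labels = np.zeros_like(delta, dtype=int)
            labels[delta > threshold] = 1
            labels[delta < -threshold] = -1
            return labels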

  6. Thoughts on Stock Map Cartography with Computer Software

    Institute of Scientific and Technical Information of China (English)

    陆蓉

    2009-01-01

    The process of compiling a stock map with computer software comprises data preparation and input; layer editing, analysis, query, and indexing; classified display of data and layers in ArcMap; and map decoration and layer output. Throughout the mapping process, the most important and most easily overlooked steps are the accurate collection and linking of the database and the overall decoration of the map sheet.

  7. Forestry Mapping Based on AutoCAD and Adobe Photoshop

    Institute of Scientific and Technical Information of China (English)

    罗超; 肖泽鑫; 彭剑华; 邹桂逢; 赖焕武

    2012-01-01

    This paper introduces the features of the AutoCAD and Photoshop software packages and describes methods and techniques for applying them to forestry mapping, covering drawing-space setup, base-map input and adjustment, base-map vectorization, database creation, drawing of various thematic maps, drawing-scale settings, post-processing of effects, and map output. The advantages and disadvantages of the two mapping programs, as well as the benefits of using them in combination, are discussed.

  8. Integration of Balanced Scorecard (BSC), Strategy Map, and Fuzzy Analytic Hierarchy Process (FAHP) for a Sustainability Business Framework: A Case Study of a Spanish Software Factory in the Financial Sector

    National Research Council Canada - National Science Library

    Cesar Alvarez Perez; Vicente Rodríguez Montequín; Francisco Ortega Fernandez; Joaquín Villanueva Balsera

    2017-01-01

    This paper presents a case study of how a Spanish financial software factory (FSF) has determined the weights of the indicators and objectives included in their strategy map with the aim of ensuring its business sustainability...

  9. Application of the Golden Software Surfer mapping software for automation of visualisation of meteorological and oceanographic data in IMGW Maritime Branch.

    Science.gov (United States)

    Piliczewski, B.

    2003-04-01

    The Golden Software Surfer package has been used in the IMGW Maritime Branch for more than ten years. This tool provides ActiveX Automation objects, which allow scripts to control practically every feature of Surfer. These objects can be accessed from any Automation-enabled environment, such as Visual Basic or Excel. Several applications based on Surfer have been developed at IMGW. The first example is an on-line oceanographic service, which presents forecasts of water temperature, sea level and currents originating from the HIROMB model and is automatically updated every day. Surfer was also utilised in MERMAID, an international project supported by the EC under the 5th Framework Programme. The main aim of this project was to create a prototype of an Internet-based data brokerage system, which would enable users to search, extract, buy and download datasets containing meteorological or oceanographic data. During the project IMGW developed an online application, called Mermaid Viewer, which enables communication with the data broker and automatic visualisation of the downloaded data using Surfer. Both of the above-mentioned applications were developed in Visual Basic. Adoption of Surfer for the monitoring service, which provides access to data collected in monitoring of the Baltic Sea environment, is currently under consideration.
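
    The same Automation objects can be driven from Python as readily as from Visual Basic. A minimal sketch, assuming a Windows machine with Surfer installed; the file paths are placeholders, and the method names follow Surfer's Automation model but should be checked against the Automation reference for the installed Surfer version:

        import win32com.client

        # Attach to Surfer via its COM Automation interface.
        app = win32com.client.Dispatch("Surfer.Application")
        app.Visible = True

        # Grid scattered observations, then contour the resulting grid
        # (paths and gridding options are illustrative).
        app.GridData(DataFile=r"C:\data\sea_level.dat", OutGrid=r"C:\data\sea_level.grd")
        plot = app.Documents.Add()
        plot.Shapes.AddContourMap(GridFileName=r"C:\data\sea_level.grd")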

  10. Negotiation and Decision Making with Collaborative Software: How MarineMap 'Changed the Game' in California's Marine Life Protected Act Initiative

    Science.gov (United States)

    Cravens, Amanda E.

    2016-02-01

    Environmental managers and planners have become increasingly enthusiastic about the potential of decision support tools (DSTs) to improve environmental decision-making processes as information technology transforms many aspects of daily life. Discussions about DSTs, however, rarely recognize the range of ways software can influence users' negotiation, problem-solving, or decision-making strategies and incentives, in part because there are few empirical studies of completed processes that used technology. This mixed-methods study—which draws on data from approximately 60 semi-structured interviews and an online survey—examines how one geospatial DST influenced participants' experiences during a multi-year marine planning process in California. Results suggest that DSTs can facilitate communication by creating a common language, help users understand the geography and scientific criteria in play during the process, aid stakeholders in identifying shared or diverging interests, and facilitate joint problem solving. The same design features that enabled the tool to aid in decision making, however, also presented surprising challenges in certain circumstances by, for example, making it difficult for participants to discuss information that was not spatially represented on the map-based interface. The study also highlights the importance of the social context in which software is developed and implemented, suggesting that the relationship between the software development team and other participants may be as important as technical software design in shaping how DSTs add value. The paper concludes with considerations to inform the future use of DSTs in environmental decision-making processes.

  11. San Bernardino National Wildlife Refuge: Vegetation and Landcover Mapping Using Object-Based Image Analysis and Open Source Software

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — In May 2014, staff at the San Bernardino National Wildlife Refuge (SBNWR) requested the production of a vegetation map to document the ongoing restoration of the...

  12. A Method for Reverse-Digitizing Contour Maps into Numerical Values Using MapGIS Software

    Institute of Scientific and Technical Information of China (English)

    罗立红; 邵兴

    2015-01-01

    When MapGIS is used to process geological maps that lack underlying base data, the resulting drawings are difficult to use at later stages: their accuracy is limited and drawing is slow. This paper summarizes an effective method for converting graphic contour-line files back into numerical data files, which greatly facilitates subsequent data analysis. Taking as an example the map of average subsidence rate over five years required in geological hazard assessment work, the paper demonstrates the detailed process of converting a contour map into numerical values.

  13. TU-PIS-Exhibit Hall-2: Deformable Image Registration, Contour Propagation, and Dose Mapping in MIM Maestro - MIM Software

    Energy Technology Data Exchange (ETDEWEB)

    Piper, J. [MIM Software, Inc. (United States)

    2015-06-15

    Brachytherapy devices and software are designed to last for a certain period of time. Due to a number of considerations, such as material factors, wear-and-tear, backwards compatibility, and others, they all reach a date when they are no longer supported by the manufacturer. Most of these products have a limited duration for their use, and the information is provided to the user at time of purchase. Because of issues or concerns determined by the manufacturer, certain products are retired sooner than the anticipated date, and the user is immediately notified. In these situations, the institution is facing some difficult choices: remove these products from the clinic or perform tests and continue their usage. Both of these choices come with a financial burden: replacing the product or assuming a potential medicolegal liability. This session will provide attendees with the knowledge and tools to make better decisions when facing these issues. Learning Objectives: Understand the meaning of "end-of-life" or "life expectancy" for brachytherapy devices and software. Review items (devices and software) affected by "end-of-life" restrictions. Learn how to effectively formulate "end-of-life" policies at your institution. Learn about possible implications of an "end-of-life" policy. Review other possible approaches to the "end-of-life" issue.

  14. National Insect and Disease Risk Map (NIDRM)--cutting-edge software for rapid insect and disease risk model development

    Science.gov (United States)

    Frank J. Krist

    2010-01-01

    The Forest Health Technology Enterprise Team (FHTET) of the U.S. Forest Service is leading an effort to produce the next version of the National Insect and Disease Risk Map (NIDRM) for targeted release in 2011. The goal of this effort is to update spatial depictions of risk of tree mortality based on: (1) newly derived 240-m geospatial information depicting the...

  15. PETROMAP: MS-DOS software package for quantitative processing of X-ray maps of zoned minerals

    Science.gov (United States)

    Cossio, Roberto; Borghi, Alessandro

    1998-10-01

    This paper shows an application of energy dispersive spectrometry (EDS) for digital acquisition of multi-element X-ray compositional maps of minerals in polished thin sections. A square matrix of n EDS spectra with known X, Y coordinates is collected, converted and exported to a personal computer. Each spectrum of the matrix is processed and the apparent concentration of each analyzed element is calculated by means of PETROMAP, a program written in Quick-Basic which applies a quantitative ZAF/FLS correction. The results of processing are comparable to conventional quantitative microprobe analyses, with similar counting statistics. The output is a numerical matrix, compatible with the most popular graphic and spreadsheet programs, from which it is possible to produce two-dimensional maps of wt% oxides, mole fractions, and mineral end-members, rendered in pseudocolor or black and white. The procedure has been tested using a metamorphic garnet from the medium-grade Stilo unit (Calabrian Arc, Southern Italy).

  16. Pilot study using 3D-longitudinal strain computation in a multi-parametric approach for best selecting responders to cardiac resynchronization therapy.

    Science.gov (United States)

    Fournet, Maxime; Bernard, Anne; Marechaux, Sylvestre; Galli, Elena; Martins, Raphael; Mabo, Philippe; Daubert, J Claude; Leclercq, Christophe; Hernandez, Alfredo; Donal, Erwan

    2017-06-17

    Almost all attempts to improve patient selection for cardiac resynchronization therapy (CRT) using echo-derived indices have failed so far. We sought to assess the performance of homemade software for the automatic quantification of integrals of 3D regional longitudinal strain curves exploring left ventricular (LV) mechanics, and the potential value of this tool to predict CRT response. Forty-eight heart failure patients in sinus rhythm, referred for CRT implantation (mean age: 65 years; LV ejection fraction: 26%; QRS duration: 160 milliseconds), were prospectively explored. Thirty-four patients (71%) had positive responses, defined as an LV end-systolic volume decrease ≥15% at 6 months. 3D longitudinal strain curves were exported for analysis using custom-made algorithms. The integrals of the longitudinal strain signals (I(L,peak)) were automatically measured and calculated for all 17 LV segments. The standard deviation of I(L,peak) over all 17 LV segments (SD-I(L,peak)) was greater in CRT responders than non-responders (1.18 %·s(-1) [0.96; 1.35] versus 0.83 %·s(-1) [0.55; 0.99], p = 0.007). The optimal cut-off value of SD-I(L,peak) to predict response was 1.037 %·s(-1). In the 18 patients without septal flash, SD-I(L,peak) was significantly higher in the CRT responders. This new automatic software for analyzing 3D longitudinal strain curves avoids previous limitations of imaging techniques for assessing dyssynchrony, and its value will have to be tested in a larger group of patients.
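
    The index itself is simple to state. A minimal sketch of the computation, assuming uniformly resampled strain curves for the 17 LV segments (the software's preprocessing and curve extraction are not reproduced here):

        import numpy as np

        def sd_il_peak(strain_curves, dt):
            # strain_curves: (17, n_samples) longitudinal strain signals, one row per LV segment.
            integrals = np.trapz(strain_curves, dx=dt, axis=1)   # I(L,peak) per segment
            return integrals.std()                               # dispersion across the 17 segments

        # Illustrative use of the reported cut-off:
        # responder_predicted = sd_il_peak(curves, dt=0.02) > 1.037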

  17. SNPFile – A software library and file format for large scale association mapping and population genetics studies

    Directory of Open Access Journals (Sweden)

    Nielsen Jesper

    2008-12-01

    Background: High-throughput genotyping technology has enabled cost-effective typing of thousands of individuals at hundreds of thousands of markers for use in genome-wide studies. This vast improvement in data acquisition technology makes it an informatics challenge to efficiently store and manipulate the data. While spreadsheets and flat text files were adequate solutions earlier, the increased data size mandates more efficient solutions. Results: We describe a new binary file format for SNP data, together with a software library for file manipulation. The file format stores genotype data together with any kind of additional data, using a flexible serialisation mechanism. The format is designed to be I/O efficient for the access patterns of most multi-locus analysis methods. Conclusion: The new file format has been very useful for our own studies, where it has significantly reduced the informatics burden of keeping track of various secondary data, and where the memory and I/O efficiency has greatly simplified analysis runs. A main limitation of the file format is that it is only supported by the very limited set of analysis tools developed in our own lab. This is somewhat alleviated by a scripting interface that makes it easy to write converters to and from the format.
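
    SNPFile's on-disk layout and serialisation mechanism are its own, but the general point (binary, memory-efficient, column-accessible genotype storage) can be sketched with a memory-mapped matrix; sizes and encoding below are illustrative:

        import numpy as np

        # Genotypes as a compact int8 matrix (0/1/2 minor-allele counts, -1 = missing),
        # memory-mapped from disk so analyses touch only the markers they need.
        n_individuals, n_markers = 1000, 50000
        genotypes = np.memmap("genotypes.bin", dtype=np.int8, mode="w+",
                              shape=(n_individuals, n_markers))
        genotypes[:, 0] = 1                            # write the genotype column for marker 0
        allele_freq = genotypes[:, 0].mean() / 2.0     # column access without loading the whole file
        genotypes.flush()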

  18. Digital Campus Map Navigation Software Based on the Android Platform

    Institute of Scientific and Technical Information of China (English)

    李涵; 韦程

    2016-01-01

    Android is currently the most popular mobile operating system and is widely used in smartphones and tablet computers. This project designs and develops a campus mobile navigation application for the Android platform, aimed at helping freshmen who enter an unfamiliar campus to gain a general and accurate understanding of their surroundings quickly. The software consists of a positioning module, a map display module, a communication module, and a compass module. Using the phone's GPS chip, users can quickly determine their own position and find a specified campus location; through the network, they can communicate with other users and obtain the latest campus information; and the phone's sensors allow them to determine their current heading. Running and testing show that the software has a simple and friendly interface, is easy to operate, and offers powerful functionality.

  19. Design and Implementation of Experimental Data Plotting Software

    Institute of Scientific and Technical Information of China (English)

    陆道明; 毛燕华

    2014-01-01

    To make plotting charts of experimental data quick and convenient, experimental data plotting software was developed on the Visual Studio 2010 platform by introducing the Chart control from the .NET Framework and customizing the control's chart properties and code. The software comprises a data plotting module, a data analysis module, and an image export module, and is offered as a free download over the internet, providing document editors with a simple and handy data plotting tool.

  1. Software engineering

    CERN Document Server

    Sommerville, Ian

    2010-01-01

    The ninth edition of Software Engineering presents a broad perspective of software engineering, focusing on the processes and techniques fundamental to the creation of reliable software systems. Increased coverage of agile methods and software reuse, along with coverage of 'traditional' plan-driven software engineering, gives readers the most up-to-date view of the field currently available. Practical case studies, a full set of easy-to-access supplements, and extensive web resources make teaching the course easier than ever.

  2. Systematic Software Development

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Méndez Fernández, Daniel

    2015-01-01

    The speed of innovation and the global allocation of resources to accelerate development or to reduce cost put pressure on the software industry. In the global competition, especially so-called high-price countries have to present arguments why the higher development cost is justified and what makes these countries an attractive host for software companies. Often, high-quality engineering and excellent quality of products, e.g., machinery and equipment, are mentioned. Yet, the question is: Can such arguments also be found for the software industry? We aim at investigating the degree of professionalism and systematization of software development to draw a map of strengths and weaknesses. To this end, we conducted as a first step an exploratory survey in Germany, presented in this paper. In this survey, we focused on the perceived importance of the two general software engineering process areas

  3. Remote Viewer for Maritime Robotics Software

    Science.gov (United States)

    Kuwata, Yoshiaki; Wolf, Michael; Huntsberger, Terrance L.; Howard, Andrew B.

    2013-01-01

    This software is a viewer program for maritime robotics software that provides a 3D visualization of the boat pose, its position history, ENC (Electrical Nautical Chart) information, camera images, map overlay, and detected tracks.

  4. OPEN SOURCE SOFTWARE, FREE SOFTWARE?

    Directory of Open Access Journals (Sweden)

    Nur Aini Rakhmawati

    2006-01-01

    The enactment of the Intellectual Property Rights Act (HAKI) has opened up a new alternative: using open source software. The use of open source software is spreading in step with current global issues in Information and Communication Technology (ICT). Several organizations and companies have begun to take open source software into consideration. There are many conceptions of open source software, ranging from software that is free of charge to software that is unlicensed. These notions are not entirely accurate, so the concept of open source software needs to be introduced, covering its history, licences and how to choose a licence, and considerations in choosing among the available open source software. Keywords: licence, open source, HAKI

  5. Software Reviews.

    Science.gov (United States)

    Smith, Richard L., Ed.

    1985-01-01

    Reviews software packages by providing extensive descriptions and discussions of their strengths and weaknesses. Software reviewed include (1) "VISIFROG: Vertebrate Anatomy" (grade seven-adult); (2) "Fraction Bars Computer Program" (grades three to six) and (3) four telecommunications utilities. (JN)

  6. Capturing the impact of software

    Science.gov (United States)

    Piwowar, Heather

    2017-01-01

    Research software is undervalued in funding and tenure decisions because its impact is poorly evaluated within the traditional paper-based ecosystem. The talk presents the NSF-funded Depsy project (http://depsy.org) -- a proof-of-concept system designed to address this problem by tracking the impact of software in software-native ways. Depsy finds mentions of software itself in the literature, rather than just counting citations to a wrapper paper about the software. It discovers how software gets reused by other software, even when it's not cited at all. And finally Depsy attempts to represent the full complexity of software authorship, where one project can involve hundreds of contributors in multiple roles that don't map to traditional paper authorship.

  7. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth models.

  8. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  9. Enhancing the usability and performance of structured association mapping algorithms using automation, parallelization, and visualization in the GenAMap software system

    Directory of Open Access Journals (Sweden)

    Curtis Ross E

    2012-04-01

    Background: Structured association mapping is proving to be a powerful strategy to find genetic polymorphisms associated with disease. However, these algorithms are often distributed as command line implementations that require expertise and effort to customize and put into practice. Because of the difficulty required to use these cutting-edge techniques, geneticists often revert to simpler, less powerful methods. Results: To make structured association mapping more accessible to geneticists, we have developed an automatic processing system called Auto-SAM. Auto-SAM enables geneticists to run structured association mapping algorithms automatically, using parallelization. Auto-SAM includes algorithms to discover gene-networks and find population structure. Auto-SAM can also run popular association mapping algorithms, in addition to five structured association mapping algorithms. Conclusions: Auto-SAM is available through GenAMap, a front-end desktop visualization tool. GenAMap and Auto-SAM are implemented in JAVA; binaries for GenAMap can be downloaded from http://sailing.cs.cmu.edu/genamap.

  10. Discussion on the Application of MapGIS K9 Software in Digital Geological Mapping

    Institute of Scientific and Technical Information of China (English)

    代丽霞; 胡红霞; 朱忠梅

    2013-01-01

    From a cartographic point of view and drawing on working practice, this paper describes the application of MapGIS K9 to the drawing of topographic and geological maps and to the construction of graphic databases. It is intended to give cartographic professionals a better understanding of MapGIS K9, raising their grasp of its basic functions and applications from an intuitive to a systematic level, so that they can master the software more fully and improve the quality and accuracy of their maps.

  11. Software piracy

    OpenAIRE

    Kráčmer, Stanislav

    2011-01-01

    The objective of the present thesis is to clarify the term software piracy and to determine the responsibility of individual entities in the actual realization of software piracy. First, the thesis focuses on the computer program: the causes, realization and pitfalls of its inclusion under copyright protection. Subsequently, it examines methods of legal usage of a computer program. This is the point of departure for the following attempt to define software piracy, accompanied by methods of actu...

  12. A Mapping of an Agile Software Development Method to the Personal Productivity of the Knowledge Worker. A Systematic Review of Self-Help Books

    OpenAIRE

    Helga Guðrún Óskarsdóttir

    2014-01-01

    This work explores the problem of how to increase knowledge worker productivity by performing a systematic literature review of personal productivity self-help books. The assumption was that personal productivity self-help books are based on the same underlying concepts and that these concepts can give insight into the personal productivity of the knowledge worker. The intent was to identify these concepts, compare them to the state-of-the-art on knowledge worker productivity and the software...

  13. Pash 3.0: A versatile software package for read mapping and integrative analysis of genomic and epigenomic variation using massively parallel DNA sequencing

    Directory of Open Access Journals (Sweden)

    Chen Zuozhou

    2010-11-01

    Background: Massively parallel sequencing readouts of epigenomic assays are enabling integrative genome-wide analyses of genomic and epigenomic variation. Pash 3.0 performs sequence comparison and read mapping and can be employed as a module within diverse configurable analysis pipelines, including ChIP-Seq and methylome mapping by whole-genome bisulfite sequencing. Results: Pash 3.0 generally matches the accuracy and speed of niche programs for fast mapping of short reads, and exceeds their performance on longer reads generated by a new generation of massively parallel sequencing technologies. By exploiting longer read lengths, Pash 3.0 maps reads onto the large fraction of genomic DNA that contains repetitive elements and polymorphic sites, including indel polymorphisms. Conclusions: We demonstrate the versatility of Pash 3.0 by analyzing the interaction between CpG methylation, CpG SNPs, and imprinting based on publicly available whole-genome shotgun bisulfite sequencing data. Pash 3.0 makes use of gapped k-mer alignment, a non-seed based comparison method, which is implemented using multi-positional hash tables. This allows Pash 3.0 to run on diverse hardware platforms, including individual computers with standard RAM capacity, multi-core hardware architectures and large clusters.
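
    A toy illustration of the gapped k-mer idea: sample reference bases through a fixed mask, index the resulting keys in a hash table, and look up read k-mers to propose alignment start positions. Pash 3.0's multi-positional hash tables and scoring are considerably more elaborate; the mask below is an arbitrary example.

        from collections import defaultdict

        MASK = (0, 1, 3, 4, 6, 7, 9, 10)   # illustrative gapped sampling pattern

        def gapped_kmer(seq, pos, mask=MASK):
            return "".join(seq[pos + off] for off in mask)

        def index_reference(ref, mask=MASK):
            # Hash every gapped k-mer of the reference to its start positions.
            table = defaultdict(list)
            span = mask[-1] + 1
            for i in range(len(ref) - span + 1):
                table[gapped_kmer(ref, i, mask)].append(i)
            return table

        def candidate_positions(read, table, mask=MASK):
            # Each shared gapped k-mer implies a candidate alignment start on the reference.
            span = mask[-1] + 1
            hits = set()
            for j in range(len(read) - span + 1):
                for ref_pos in table.get(gapped_kmer(read, j, mask), []):
                    hits.add(ref_pos - j)
            return hits

        table = index_reference("ACGTACGTACGTACGT")
        print(candidate_positions("ACGTACGTACG", table))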

  14. Software engineering

    CERN Document Server

    Sommerville, Ian

    2016-01-01

    For courses in computer science and software engineering The Fundamental Practice of Software Engineering Software Engineering introduces readers to the overwhelmingly important subject of software programming and development. In the past few years, computer systems have come to dominate not just our technological growth, but the foundations of our world's major industries. This text seeks to lay out the fundamental concepts of this huge and continually growing subject area in a clear and comprehensive manner. The Tenth Edition contains new information that highlights various technological updates of recent years, providing readers with highly relevant and current information. Sommerville's experience in system dependability and systems engineering guides the text through a traditional plan-based approach that incorporates some novel agile methods. The text strives to teach the innovators of tomorrow how to create software that will make our world a better, safer, and more advanced place to live.

  15. Benthic Photo Survey: Software for Geotagging, Depth-tagging, and Classifying Photos from Survey Data and Producing Shapefiles for Habitat Mapping in GIS

    Directory of Open Access Journals (Sweden)

    Jared Kibele

    2016-03-01

    Photo survey techniques are common for resource management, ecological research, and ground truthing for remote sensing, but current data processing methods are cumbersome and inefficient. The Benthic Photo Survey (BPS) software described here was created to simplify the data processing and management tasks associated with photo surveys of underwater habitats. BPS is free and open source software written in Python with a Qt graphical user interface. BPS takes a GPS log and jpeg images acquired by a diver or drop camera and assigns the GPS position to each photo based on time-stamps (i.e. geotagging). Depth and temperature can be assigned in a similar fashion (i.e. depth-tagging) using log files from an inexpensive consumer-grade depth/temperature logger that can be attached to the camera. BPS provides the user with a simple interface to assign quantitative habitat and substrate classifications to each photo. Location, depth, temperature, habitat, and substrate data are all stored with the jpeg metadata in Exchangeable image file format (Exif). BPS can then export all of these data in a spatially explicit point shapefile format for use in GIS. BPS greatly reduces the time and skill required to turn photos into usable data, thereby making photo survey methods more efficient and cost effective. BPS can also be used, as is, for other photo sampling techniques in terrestrial and aquatic environments, and the open source code base offers numerous opportunities for expansion and customization.
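
    The geotagging step is essentially nearest-timestamp matching between the camera's EXIF clock and the GPS log. A minimal sketch with illustrative names and a 30-second matching window (BPS's actual matching rules may differ):

        import bisect
        from datetime import timedelta

        def geotag(photo_times, gps_log, max_gap=timedelta(seconds=30)):
            # photo_times: sorted list of datetimes; gps_log: sorted list of (datetime, lat, lon).
            fix_times = [t for t, _, _ in gps_log]
            tagged = []
            for pt in photo_times:
                i = bisect.bisect_left(fix_times, pt)
                # Pick the neighbouring GPS fix whose time-stamp is closest to the photo's.
                best = min((j for j in (i - 1, i) if 0 <= j < len(gps_log)),
                           key=lambda j: abs(fix_times[j] - pt))
                t, lat, lon = gps_log[best]
                tagged.append((pt, lat, lon) if abs(t - pt) <= max_gap else (pt, None, None))
            return tagged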

  16. Software for Spatial Statistics

    Directory of Open Access Journals (Sweden)

    Edzer Pebesma

    2015-02-01

    We give an overview of the papers published in this special issue on spatial statistics of the Journal of Statistical Software. 21 papers address issues covering visualization (micromaps, links to Google Maps or Google Earth), point pattern analysis, geostatistics, analysis of areal aggregated or lattice data, spatio-temporal statistics, Bayesian spatial statistics, and Laplace approximations. We also point to earlier publications in this journal on the same topic.

  17. An Integrated Map of Soybean Physical Map and Genetic Map

    Institute of Scientific and Technical Information of China (English)

    QI Zhaoming; LI Hui; WU Qiong; SUN Yanan; LIU Chunyan; HU Guohua; CHEN Qingshan

    2009-01-01

    Soybean is a major crop in the world and a main source of plant protein and oil. Many soybean genetic maps and physical maps have been constructed, but no integrated map linking the soybean physical and genetic maps has been available. In this study, soybean genome sequence data released by JGI (the US Department of Energy's Joint Genome Institute) were downloaded. Using the software BLAST 2.2.16, a total of 161 super sequences were mapped onto the soybean public genetic map to construct an integrated map. The length of these super sequences accounted for 73.08% of the whole genome sequence. This integrated map could be used for gene cloning, gene mining, and comparative genomics of legumes.

  18. Software requirements

    CERN Document Server

    Wiegers, Karl E

    2003-01-01

    Without formal, verifiable software requirements - and an effective system for managing them - the programs that developers think they've agreed to build often will not be the same products their customers are expecting. In SOFTWARE REQUIREMENTS, Second Edition, requirements engineering authority Karl Wiegers amplifies the best practices presented in his original award-winning text, now a mainstay for anyone participating in the software development process. In this book, you'll discover effective techniques for managing the requirements engineering process all the way through the development cycle.

  19. Software Innovation

    DEFF Research Database (Denmark)

    Rose, Jeremy

    Innovation is the forgotten key to modern systems development - the element that defines the enterprising engineer, the thriving software firm and the cutting edge software application. Traditional forms of technical education pay little attention to creativity - often encouraging overly rationalistic ways of thinking which stifle the ability to innovate. Professional software developers are often drowned in commercial drudgery and overwhelmed by work pressure and deadlines. The topic that will both ensure success in the market and revitalize their work lives is never addressed. This book sets out the new field of software innovation. It organizes the existing scientific research into eight simple heuristics - guiding principles for organizing a system developer's work-life so that it focuses on innovation.

  1. Software Reviews.

    Science.gov (United States)

    Classroom Computer Learning, 1990

    1990-01-01

    Reviewed are three computer software packages including "Martin Luther King, Jr.: Instant Replay of History,""Weeds to Trees," and "The New Print Shop, School Edition." Discussed are hardware requirements, costs, grade levels, availability, emphasis, strengths, and weaknesses. (CW)

  2. Software Reviews.

    Science.gov (United States)

    Wulfson, Stephen, Ed.

    1987-01-01

    Reviews seven computer software programs that can be used in science education programs. Describes courseware which deals with muscles and bones, terminology, classifying animals without backbones, molecular structures, drugs, genetics, and shaping the earth's surface. (TW)

  3. Software Reviews.

    Science.gov (United States)

    Dwyer, Donna; And Others

    1989-01-01

    Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; "Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)

  4. Modular Software-Defined Radio

    Directory of Open Access Journals (Sweden)

    Rhiemeier Arnd-Ragnar

    2005-01-01

    Full Text Available In view of the technical and commercial boundary conditions for software-defined radio (SDR, it is suggestive to reconsider the concept anew from an unconventional point of view. The organizational principles of signal processing (rather than the signal processing algorithms themselves are the main focus of this work on modular software-defined radio. Modularity and flexibility are just two key characteristics of the SDR environment which extend smoothly into the modeling of hardware and software. In particular, the proposed model of signal processing software includes irregular, connected, directed, acyclic graphs with random node weights and random edges. Several approaches for mapping such software to a given hardware are discussed. Taking into account previous findings as well as new results from system simulations presented here, the paper finally concludes with the utility of pipelining as a general design guideline for modular software-defined radio.

  5. Reusable Software.

    Science.gov (United States)

    1984-03-01

    ...overseeing reusable software, the Reusable Software Organization (RUSO). This author does not feel at this time that establishment of such a specific... [49] have not been accompanied by establishment of RUSO-like activities. There is need, however, for assurance that functions which a RUSO might be... assurance; 6. establishment and maintenance of reuse archival facilities and activities. Actual establishment of a RUSO is best dictated by size of the

  6. Software Epistemology

    Science.gov (United States)

    2016-03-01

    ...comprehensive approach for determining software epistemology which significantly advances the state of the art in automated vulnerability discovery... straightforward. First, internet-based repositories of open source software (e.g., FreeBSD ports, GitHub, SourceForge, etc.) are mined... the fix delta, we attempted to perform the same process to determine if the firmware release present in an Internet-of-Things (IoT) streaming camera

  7. MARPLOT Software

    Science.gov (United States)

    Part of the CAMEO suite, MARPLOT® is a mapping application that people can use to quickly create, view, and modify maps. Users can create their own objects in MARPLOT (e.g., facilities, schools, response assets) and display them on top of a basemap.

  8. How does Software Process Improvement Address Global Software Engineering?

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Diebold, Philipp; Münch, Jürgen

    2016-01-01

    For decades, Software Process Improvement (SPI) programs have been implemented, inter alia, to improve quality and speed of software development. To set up, guide, and carry out SPI projects, and to measure SPI state, impact, and success, a multitude of different SPI approaches and considerable experience are available. SPI addresses many aspects ranging from individual developer skills to entire organizations. It comprises for instance the optimization of specific activities in the software lifecycle as well as the creation of organization awareness and project culture. In the course of conducting a systematic mapping study on the state-of-the-art in SPI from a general perspective, we observed Global Software Engineering (GSE) becoming a topic of interest in recent years. Therefore, in this paper, we provide a detailed investigation of those papers from the overall systematic mapping study that were...

  9. MIAWARE Software

    DEFF Research Database (Denmark)

    Wilkowski, Bartlomiej; Pereira, Oscar N. M.; Dias, Paulo

    2008-01-01

    This article presents MIAWARE, a software for Medical Image Analysis With Automated Reporting Engine, which was designed and developed for doctor/radiologist assistance. It allows the user to analyze an image stack from a computed axial tomography scan of the lungs (thorax) and, at the same time, to mark all pathologies on images and report their characteristics. The reporting process is normalized - radiologists cannot describe pathological changes with their own words, but can only use terms from a specific vocabulary set provided by the software. Consequently, a normalized radiological report is automatically generated. Furthermore, the MIAWARE software is accompanied by an intelligent search engine for medical reports, based on the relations between parts of the lungs. A logical structure of the lungs is introduced to the search algorithm through a specially developed ontology. As a result...

  10. Design and Development of the Software Package Intelligent Mapping Tools (IMAT)

    Institute of Scientific and Technical Information of China (English)

    张维理

    2014-01-01

    ...complex through multiple channels and from multiple points of view. It is a useful tool for examining and amending assumptions about the world by taking advantage of new data and evidence. The main difficulty of big geo-data analysis comes not only from treating large data volumes, but also from extracting, integrating and expressing information based on heterogeneous data resources. Since no mainstream software tool has been available for big geo-data analysis, the purpose of this study is to develop a professional tool for extracting and integrating heterologous massive geo-information from different resources, as well as for making high-resolution thematic maps at large scales. The intelligent software package should carry out data processing and mapping automatically or through human-computer interaction. [Method] Principles and rules of methodology for massive geo-data analysis and software design were applied to the design of IMAT (Intelligent Mapping Tools). The whole design consisted of three parts: the system architecture, the system data supporting platform, and the design of system modules and models. For IMAT software development, C# was used as the programming language, .NET Framework 4 Extended was the development environment, and functions and components from the ArcGIS, Access and DotNetBar software packages were called. [Result] With 38 independent functional modules, IMAT provides the main functions for analyzing massive spatial information and for cartographic representation required in agricultural and environmental research. Each module can be used independently to conduct a given data analysis or processing task, for example big data loading and storing, statistical analysis, classification and coding of spatial elements, data selection, integration and mapping, etc. Modules can also be combined to complete more complex tasks of data extraction and expression. IMAT has made up for the lack of functions to

  11. Software engineering

    CERN Document Server

    Thorin, Marc

    1985-01-01

    Software Engineering describes the conceptual bases as well as the main methods and rules of computer programming. This book presents software engineering as a coherent and logically built synthesis and makes it possible to properly carry out an application of small or medium difficulty that can later be developed and adapted to more complex cases. This text comprises six chapters and begins by introducing the reader to the fundamental notions of entities, actions, and programming. The next two chapters elaborate on the concepts of information and consistency domains and show that a proc

  12. [Software version and medical device software supervision].

    Science.gov (United States)

    Peng, Liang; Liu, Xiaoyan

    2015-01-01

    The importance of the software version in medical device software supervision does not receive enough attention at present. First, the role of the software version in medical device software supervision is discussed; then the necessity of considering the software version is analyzed, addressing common misunderstandings about software versions. Finally, concrete suggestions are proposed on software version naming rules, supervision of the version of software embedded in medical devices, and a software version supervision scheme.

  13. Three-dimensional mapping of mechanical activation patterns, contractile dyssynchrony and dyscoordination by two-dimensional strain echocardiography: Rationale and design of a novel software toolbox

    Directory of Open Access Journals (Sweden)

    Cramer Maarten J

    2008-05-01

    Integration of local 2-D echocardiographic deformation data into a 3-D model by dedicated software allows a comprehensive analysis of spatio-temporal distribution patterns of myocardial dyssynchrony, of the global left ventricular deformation and of newer indices that may better reflect myocardial dyscoordination and/or impaired ventricular contractile efficiency. The potential value of such an analysis is highlighted in two dyssynchronous pathologies that impose particular challenges to deformation imaging.

  14. Educational Software.

    Science.gov (United States)

    Northwest Regional Educational Lab., Portland, OR.

    The third session of IT@EDU98 consisted of five papers on educational software and was chaired by Tran Van Hao (University of Education, Ho Chi Minh City, Vietnam). "Courseware Engineering" (Nguyen Thanh Son, Ngo Ngoc Bao Tran, Quan Thanh Tho, Nguyen Hong Lam) briefly describes the use of courseware. "Machine Discovery Theorems in Geometry: A…

  15. Software Patents.

    Science.gov (United States)

    Burke, Edmund B.

    1994-01-01

    Outlines basic patent law information that pertains to computer software programs. Topics addressed include protection in other countries; how to obtain patents; kinds of patents; duration; classes of patentable subject matter, including machines and processes; patentability searches; experimental use prior to obtaining a patent; and patent…

  16. Software Systems

    Institute of Scientific and Technical Information of China (English)

    崔涛; 周淼

    1996-01-01

    The information used with computers is known as software and includes programs and data. Programs are sets of instructions telling the computer what operations have to be carried out and in what order they should be done. Specialised programs which enable the computer to be used for particular purposes are called applications programs. A collection of these programs kept

  17. Software Reviews.

    Science.gov (United States)

    Science and Children, 1990

    1990-01-01

    Reviewed are seven computer software packages for IBM and/or Apple Computers. Included are "Windows on Science: Volume 1--Physical Science"; "Science Probe--Physical Science"; "Wildlife Adventures--Grizzly Bears"; "Science Skills--Development Programs"; "The Clean Machine"; "Rock Doctor"; and "Geology Search." Cost, quality, hardware, and…

  18. Software Review.

    Science.gov (United States)

    McGrath, Diane, Ed.

    1989-01-01

    Reviewed is a computer software package entitled "Audubon Wildlife Adventures: Grizzly Bears" for Apple II and IBM microcomputers. Included are availability, hardware requirements, cost, and a description of the program. The murder-mystery flavor of the program is stressed in this program that focuses on illegal hunting and game management. (CW)

  19. Programming with Hierarchical Maps

    DEFF Research Database (Denmark)

    Ørbæk, Peter

    This report describes the hierarchical maps used as a central data structure in the Corundum framework. We describe its most prominent features, argue for its usefulness and briefly describe some of the software prototypes implemented using the technology.
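
    The report's central data structure can be sketched as nested string-keyed maps addressed by key paths; this is a generic illustration, not the Corundum implementation:

        class HMap:
            # A node holding an optional value and named child maps.
            def __init__(self):
                self.children = {}
                self.value = None

            def set(self, path, value):
                node = self
                for key in path:
                    node = node.children.setdefault(key, HMap())
                node.value = value

            def get(self, path):
                node = self
                for key in path:
                    node = node.children[key]
                return node.value

        cfg = HMap()
        cfg.set(("server", "port"), 8080)
        assert cfg.get(("server", "port")) == 8080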

  20. A networked modular hardware and software system for MRI-guided robotic prostate interventions

    Science.gov (United States)

    Su, Hao; Shang, Weijian; Harrington, Kevin; Camilo, Alex; Cole, Gregory; Tokuda, Junichi; Hata, Nobuhiko; Tempany, Clare; Fischer, Gregory S.

    2012-02-01

    Magnetic resonance imaging (MRI) provides high-resolution multi-parametric imaging, large soft tissue contrast, and interactive image updates, making it an ideal modality for diagnosing prostate cancer and guiding surgical tools. Although a substantial armamentarium of apparatuses and systems has been developed to assist surgical diagnosis and therapy in MRI-guided procedures over the last decade, a unified method for developing robotic systems with high fidelity in terms of accuracy, dynamic performance, size, robustness and modularity, able to work inside a closed-bore MRI scanner, still remains a challenge. In this work, we develop and evaluate an integrated modular hardware and software system to support the surgical workflow of intra-operative MRI, with percutaneous prostate intervention as an illustrative case. Specifically, the distinct apparatuses and methods include: 1) a robot controller system for precision closed-loop control of piezoelectric motors, 2) a robot control interface software that connects the 3D Slicer navigation software and the robot controller to exchange robot commands and coordinates using the OpenIGTLink open network communication protocol, and 3) MRI scan plane alignment to the planned path and imaging of the needle as it is inserted into the target location. A preliminary experiment with an ex-vivo phantom validates the system workflow and MRI-compatibility, and shows that the robotic system has a better than 0.01mm positioning accuracy.
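
    The OpenIGTLink exchange in item 2 can be illustrated by packing a version-1 TRANSFORM message: a 58-byte big-endian header followed by a 3x4 matrix as 12 float32 values in column-major order. In this simplified sketch the CRC-64 checksum is left at zero, so strict receivers would reject the message; a conforming sender must compute it.

        import struct
        import time

        def pack_transform(device_name, matrix3x4):
            # Body: 12 big-endian float32 values, column-major (R columns, then translation).
            body = struct.pack(">12f", *[matrix3x4[r][c] for c in range(4) for r in range(3)])
            header = struct.pack(">H12s20sQQQ",
                                 1,                          # protocol version
                                 b"TRANSFORM",               # message type, null-padded to 12 bytes
                                 device_name.encode()[:20],  # device name, null-padded to 20 bytes
                                 int(time.time()) << 32,     # coarse timestamp (seconds in upper bits)
                                 len(body),                  # body size in bytes
                                 0)                          # CRC64 checksum (omitted here)
            return header + body

        identity = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0]]
        msg = pack_transform("NeedleTip", identity)   # 58-byte header + 48-byte body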

  1. EPIQR software

    Energy Technology Data Exchange (ETDEWEB)

    Flourentzos, F. [Federal Institute of Technology, Lausanne (Switzerland); Droutsa, K. [National Observatory of Athens, Athens (Greece); Wittchen, K.B. [Danish Building Research Institute, Hoersholm (Denmark)

    1999-11-01

    The support of the EPIQR method is a multimedia computer program. Several modules help the users of the method to treat the data collected during a diagnosis survey, to set up refurbishment scenario and calculate their cost or energy performance, and finally to visualize the results in a comprehensive way and to prepare quality reports. This article presents the structure and the main features of the software. (au)

  2. EPIQR software

    Energy Technology Data Exchange (ETDEWEB)

    Flourentzos, F. [Federal Institute of Technology-Lausanne (EPFL), Solar Energy and Building Physics Laboratory (LESO-PB), Lausanne (Switzerland); Droutsa, K. [National Observatory of Athens, Institute of Meteorology and Physics of Atmospheric Environment, Group Energy Conservation, Athens (Greece); Wittchen, K.B. [Danish Building Research Institute, Division of Energy and Indoor Environment, Hoersholm, (Denmark)

    2000-07-01

    The support of the EPIQR method is a multimedia computer program. Several modules help the users of the method to treat the data collected during a diagnosis survey, to set up refurbishment scenarios and calculate their cost or energy performance, and finally to visualize the results in a comprehensive way and to prepare quality reports. This article presents the structure and the main features of the software. (author)

  3. Software preservation

    Directory of Open Access Journals (Sweden)

    Tadej Vodopivec

    2011-01-01

    Full Text Available Comtrade Ltd. covers a wide range of activities related to information and communication technologies; its deliverables include web applications, locally installed programs, system software, drivers, and embedded software (used e.g. in medical devices, auto parts, and communication switchboards). The company has also acquired extensive knowledge and practical experience of digital long-term preservation technologies. This wide spectrum of activities puts us in the position to discuss an often overlooked aspect of digital preservation: the preservation of software programs. There are many resources dedicated to the digital preservation of data, documents and multimedia records, but not so many about how to preserve the functionalities and features of computer programs. Exactly these functionalities - dynamic response to inputs - render computer programs rich compared to documents or linear multimedia. The article opens the questions that stand at the beginning of the road to permanent digital preservation. The purpose is to find a way in the right direction, where all relevant aspects will be covered in proper balance. The following questions are asked: why preserve computer programs permanently at all, who should do this and for whom, when should we think about permanent program preservation, what should be preserved (such as source code, screenshots, documentation, and the social context of the program - e.g. media response to it ..., where and how? To illustrate the theoretical concepts, the idea of a virtual national museum of electronic banking is also presented.

  4. Software Maintenance.

    Science.gov (United States)

    Cannon, Glenn; Jobe, Holly

    Proper cleaning and storage of audiovisual aids is outlined in this brief guide. Materials and equipment needed for first line maintenance are listed, as well as maintenance procedures for records, audio and video tape, film, filmstrips, slides, realia, models, prints, graphics, maps, and overhead transparencies. A 15-item quiz on software…

  5. Software Engineering to Professionalize Software Development

    Directory of Open Access Journals (Sweden)

    Juan Miguel Alonso

    2011-12-01

    Full Text Available The increasingly important role that software plays in systems with widespread effects presents new challenges for the education of software engineers. Not only is society's dependence on software increasing, but the character of software development is also changing, and with it the demand for certified software developers. This paper proposes some challenges and aspirations to guide learning processes in Software Engineering and to help identify the need to train professionals in software development.

  6. BenMAP Downloads

    Science.gov (United States)

    Download the current and legacy versions of the BenMAP program. Download configuration and aggregation/pooling/valuation files to estimate benefits. BenMAP-CE is free and open source software, and the source code is available upon request.

  7. Petroleum software profiles: an update

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1998-05-01

    Computer software packages for the petroleum industry, developed by various consulting and oil field services companies, are described. A system for mapping and analysis in real time (S.M.A.R.T.), combining advanced information technology, advanced geostatistics and a high level of 3-D graphics capability, was developed by United Oil and Gas Consulting Ltd of Calgary. It is a multi-disciplinary tool, incorporating geoscientific concepts to build a spatial earth model of reservoirs, simplifying horizontal and vertical well planning and analysis, and allowing for the incorporation of 3-D seismic structure. Another company, Schlumberger Geoquest, provides the industry with data services and software products to maximize the value of exploration and production data. Its 'Geoframe' systems cover a spectrum of reservoir characterization disciplines to enable detailed interpretation and integration of data from petrophysics, geology and geophysics. Among the Geoquest software solutions, a number of reservoir simulators (Eclipse 100, 300, 500), well log analysis software (QLA), software for integrating solutions for rock and reserves (CADE Office), and software for well analysis applications (PA/OFM) were highlighted. Geoquest's wide range of data processing and interpretation, reservoir pressure and flow evaluation, mapping and cartography services, and leased systems were also reviewed.

  8. Space Software

    Science.gov (United States)

    1990-01-01

    Xontech, Inc.'s software package, XonVu, simulates the missions of Voyager 1 at Jupiter and Saturn, Voyager 2 at Jupiter, Saturn, Uranus and Neptune, and Giotto in close encounter with Comet Halley. With the program, the user can generate scenes of the planets, moons, stars or Halley's nucleus and tail as seen by Giotto, all graphically reproduced with high accuracy in wireframe representation. Program can be used on a wide range of computers, including PCs. User friendly and interactive, with many options, XonVu can be used by a space novice or a professional astronomer. With a companion user's manual, it sells for $79.

  9. Software architecture

    CERN Document Server

    Vogel, Oliver; Chughtai, Arif

    2011-01-01

    As a software architect you work in a wide-ranging and dynamic environment. You have to understand the needs of your customer, design architectures that satisfy both functional and non-functional requirements, and lead development teams in implementing the architecture. And it is an environment that is constantly changing: trends such as cloud computing, service orientation, and model-driven procedures open up new architectural possibilities. This book will help you to develop a holistic architectural awareness and knowledge base that extends beyond concrete methods, techniques, and technologies…

  10. XML Entity Architecture for Efficient Software Integration

    OpenAIRE

    Patwardhan, Amol; Patwardhan, Rahul

    2016-01-01

    This paper proposes an XML-entity-based architectural implementation to improve integration between multiple third-party vendor software systems with incompatible XML schemas. The XML entity architecture implementation showed that the lines of code changed for mapping the schema between the in-house software and three other vendor schemas decreased by 5.2%, indicating an improvement in quality. The schema mapping development time decreased by 3.8% and overall release time decreased by 5.3%,...
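
    The mapping code itself is not given in the abstract; the following minimal sketch illustrates the general idea of translating one vendor's XML tags onto an in-house schema through a lookup table (the tag names here are hypothetical):

```python
# Sketch of mapping one vendor's XML schema onto an in-house schema via a
# lookup table of tag names. The tag names are invented examples.
import xml.etree.ElementTree as ET

TAG_MAP = {"custName": "customer_name", "ordTotal": "order_total"}  # vendor -> in-house

def translate(vendor_xml: str) -> str:
    root = ET.fromstring(vendor_xml)
    for elem in root.iter():
        elem.tag = TAG_MAP.get(elem.tag, elem.tag)  # rename known tags, keep the rest
    return ET.tostring(root, encoding="unicode")

print(translate("<order><custName>Ada</custName><ordTotal>9.99</ordTotal></order>"))
# -> <order><customer_name>Ada</customer_name><order_total>9.99</order_total></order>
```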

  11. Infrared Imaging Data Reduction Software and Techniques

    CERN Document Server

    Sabbey, Chris N.; McMahon, Richard G.; Lewis, James R.; Irwin, Mike J.

    2001-01-01

    We describe the InfraRed Data Reduction (IRDR) software package, a small ANSI C library of fast image processing routines for automated pipeline reduction of infrared (dithered) observations. We developed the software to satisfy certain design requirements not met in existing packages (e.g., full weight map handling) and to optimize the software for large data sets (non-interactive tasks that are CPU and disk efficient). The software includes stand-alone C programs for tasks such as running sky frame subtraction with object masking, image registration and coaddition with weight maps, dither offset measurement using cross-correlation, and object mask dilation. Although we currently use the software to process data taken with CIRSI (a near-IR mosaic imager), the software is modular and concise and should be easy to adapt/reuse for other work. IRDR is available from anonymous ftp to ftp.ast.cam.ac.uk in pub/sabbey.
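
    As an illustration of the coaddition-with-weight-maps step mentioned above, here is a minimal numpy sketch (an illustrative re-implementation, not IRDR's actual C code), where a weight of zero marks masked pixels:

```python
# Weighted coaddition of registered, sky-subtracted dither frames.
import numpy as np

def coadd(frames: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Weighted mean of a stack of frames.

    frames, weights: arrays of shape (n_frames, ny, nx); a weight of 0
    marks masked or bad pixels in that frame.
    """
    wsum = weights.sum(axis=0)
    with np.errstate(invalid="ignore", divide="ignore"):
        combined = (frames * weights).sum(axis=0) / wsum
    return np.where(wsum > 0, combined, 0.0)  # zero where no frame contributes

rng = np.random.default_rng(0)
stack = rng.normal(100.0, 5.0, size=(4, 64, 64))  # synthetic dither frames
w = np.ones_like(stack)
w[0, :10, :10] = 0.0                              # mask a bad region in frame 0
print(coadd(stack, w).shape)                      # (64, 64)
```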

  12. Global Software Engineering: A Software Process Approach

    Science.gov (United States)

    Richardson, Ita; Casey, Valentine; Burton, John; McCaffery, Fergal

    Our research has shown that many companies are struggling with the successful implementation of global software engineering, due to temporal, cultural and geographical distance, which causes a range of factors to come into play. For example, cultural, project management and communication difficulties continually cause problems for software engineers and project managers. While the implementation of efficient software processes can be used to improve the quality of the software product, published software process models do not cater explicitly for the recent growth in global software engineering. Our thesis is that global software engineering factors should be included in software process models to ensure their continued usefulness in global organisations. Based on extensive global software engineering research, we have developed a software process, Global Teaming, which includes specific practices and sub-practices. The purpose is to ensure that requirements for successful global software engineering are stipulated so that organisations can ensure successful implementation of global software engineering.

  13. Software Maintenance Management Evaluation and Continuous Improvement

    CERN Document Server

    April, Alain

    2008-01-01

    This book explores the domain of software maintenance management and provides road maps for improving software maintenance organizations. It describes full maintenance maturity models organized by levels 1, 2, and 3, which allow for benchmarking and continuous improvement paths. Goals for each key practice area are also provided, and the model presented is fully aligned with the architecture and framework of software development maturity models of CMMI and ISO 15504. It is complete with case studies, figures, tables, and graphs.

  14. The Biomes of Homewood: Interactive Map Software

    Science.gov (United States)

    Shingles, Richard; Feist, Theron; Brosnan, Rae

    2005-01-01

    To build a learning community, the General Biology faculty at Johns Hopkins University conducted collaborative, problem-based learning assignments outside of class in which students are assigned to specific areas on campus, and gather and report data about their area. To overcome the logistics challenges presented by conducting such assignments in…

  15. SOFTWARE METRICS VALIDATION METHODOLOGIES IN SOFTWARE ENGINEERING

    Directory of Open Access Journals (Sweden)

    K.P. Srinivasan

    2014-12-01

    Full Text Available In software measurement validation, assessing the validity of software metrics is a very difficult task, due to the lack of both theoretical and empirical methodology [41, 44, 45]. During recent years, a number of researchers have addressed the issue of validating software metrics. At present, software metrics are validated theoretically using properties of measures. Software measurement plays an important role in understanding and controlling software development practices and products. The major requirement in software measurement is that the measures must accurately represent the attributes they purport to quantify, and validation is critical to the success of software measurement. Normally, validation is a collection of analysis and testing activities across the full life cycle that complements the efforts of other quality engineering functions, and it is a critical task in any engineering project. Its objective is to discover defects in a system and to assess whether or not the system is useful and usable in an operational situation. In software engineering, validation is one of the disciplines that help build quality into software. The major objective of the software validation process is to determine that the software performs its intended functions correctly and to provide information about its quality and reliability. This paper discusses the validation methodologies, techniques and different properties of measures that are used for software metrics validation. In most cases, theoretical and empirical validations are conducted for software metrics validation in software engineering [1-50].

  16. Sandia software guidelines: Software quality planning

    Energy Technology Data Exchange (ETDEWEB)

    1987-08-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standard for Software Quality Assurance Plans, this volume identifies procedures to follow in producing a Software Quality Assurance Plan for an organization or a project, and provides an example project SQA plan. 2 figs., 4 tabs.

  17. Robotic Software Integration Using MARIE

    Directory of Open Access Journals (Sweden)

    Dominic Letourneau; Clement Raievsky

    2008-11-01

    Full Text Available This paper presents MARIE, a middleware framework oriented towards developing and integrating new and existing software for robotic systems. By using a generic communication framework, MARIE aims to create a flexible distributed component system that allows robotics developers to share software programs and algorithms, and design prototypes rapidly based on their own integration needs. The use of MARIE is illustrated with the design of a socially interactive autonomous mobile robot platform capable of map building, localization, navigation, task scheduling, sound source localization, tracking and separation, speech recognition and generation, visual tracking, message reading and graphical interaction using a touch screen interface.

  18. Robotic Software Integration Using MARIE

    Directory of Open Access Journals (Sweden)

    Carle Côté

    2006-03-01

    Full Text Available This paper presents MARIE, a middleware framework oriented towards developing and integrating new and existing software for robotic systems. By using a generic communication framework, MARIE aims to create a flexible distributed component system that allows robotics developers to share software programs and algorithms, and design prototypes rapidly based on their own integration needs. The use of MARIE is illustrated with the design of a socially interactive autonomous mobile robot platform capable of map building, localization, navigation, task scheduling, sound source localization, tracking and separation, speech recognition and generation, visual tracking, message reading and graphical interaction using a touch screen interface.

  19. Software engineering architecture-driven software development

    CERN Document Server

    Schmidt, Richard F

    2013-01-01

    Software Engineering: Architecture-driven Software Development is the first comprehensive guide to the underlying skills embodied in the IEEE's Software Engineering Body of Knowledge (SWEBOK) standard. Standards expert Richard Schmidt explains the traditional software engineering practices recognized for developing projects for government or corporate systems. Software engineering education often lacks standardization, with many institutions focusing on implementation rather than design as it impacts product architecture. Many graduates join the workforce with incomplete skills…

  20. Software Metrics to Estimate Software Quality using Software Component Reusability

    Directory of Open Access Journals (Sweden)

    Prakriti Trivedi

    2012-03-01

    Full Text Available Today most applications are developed using existing libraries, code, open source projects, etc. As such code is accessed in a program, it is represented as a software component; Java beans and .NET ActiveX controls, for example, are software components. These components are ready-to-use programming code or controls that accelerate code development. A component based software system builds on the concept of software reusability. While using these components, the main question that arises is whether using such a component is beneficial or not. In this proposed work we try to present an answer to that question. We present a set of software metrics that check the interconnection between a software component and the application; how strong this relation is determines the software quality after using the component. The overall metrics return the final result in terms of how tightly the component is bound to the application.
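
    The paper's exact metric definitions are not reproduced in the abstract; the sketch below shows one simple interconnection measure in the same spirit, the fraction of a component's exported API actually called by the application (names and data are invented):

```python
# Illustrative interconnection measure between an application and a reused
# component: the share of the component's public API the application uses.
# This is a sketch in the spirit of the abstract, not the paper's metrics.
def api_usage_ratio(component_api: set, calls_made: set) -> float:
    """0.0 = component unused; 1.0 = every exported entry point is used."""
    return len(component_api & calls_made) / len(component_api)

component_api = {"open", "read", "write", "close", "seek"}   # exported by component
calls_made = {"open", "read", "close", "log"}                # found in app code
print(f"usage ratio: {api_usage_ratio(component_api, calls_made):.2f}")  # 0.60
```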

  1. The software life cycle

    CERN Document Server

    Ince, Darrel

    1990-01-01

    The Software Life Cycle deals with the software lifecycle, that is, what exactly happens when software is developed. Topics covered include aspects of software engineering, structured techniques of software development, and software project management. The use of mathematics to design and develop computer systems is also discussed. This book is comprised of 20 chapters divided into four sections and begins with an overview of software engineering and software development, paying particular attention to the birth of software engineering and the introduction of formal methods of software development…

  2. Operational excellence (six sigma) philosophy: Application to software quality assurance

    Energy Technology Data Exchange (ETDEWEB)

    Lackner, M.

    1997-11-01

    This report contains viewgraphs on the operational excellence (six sigma) philosophy as applied to software quality assurance. It outlines the following: the goal of six sigma; six sigma tools; manufacturing vs. administrative processes; software quality assurance document inspections; mapping the software quality assurance requirements document; failure mode effects analysis for the requirements document; measuring the right response variables; and questions.
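
    For the failure mode effects analysis item listed above, the standard FMEA arithmetic assigns each failure mode a Risk Priority Number, RPN = severity x occurrence x detection (each typically scored 1-10); a minimal sketch with invented failure modes:

```python
# FMEA arithmetic applied to a requirements document: rank failure modes
# by Risk Priority Number (RPN = severity * occurrence * detection).
# The failure modes and scores below are invented examples.
failure_modes = [
    # (description, severity, occurrence, detection)
    ("ambiguous requirement wording", 7, 6, 4),
    ("missing interface requirement", 9, 3, 5),
    ("untestable acceptance criterion", 6, 4, 6),
]

# Print highest-risk failure modes first.
for desc, sev, occ, det in sorted(failure_modes, key=lambda m: -(m[1] * m[2] * m[3])):
    print(f"RPN {sev * occ * det:4d}  {desc}")
```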

  3. Amalgamation of Personal Software Process in Software ...

    African Journals Online (AJOL)

    evolutionary series of personal software engineering techniques that an engineer learns and ... Article History: Received : 30-04- ... began to realize that software process, plans and methodologies for ..... Executive Strategy. Addison-Wesley ...

  4. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.

  5. Ontologies for software engineering and software technology

    CERN Document Server

    Calero, Coral; Piattini, Mario

    2006-01-01

    Covers two applications of ontologies in software engineering and software technology: sharing knowledge of the problem domain and using a common terminology among all stakeholders; and filtering the knowledge when defining models and metamodels. This book is of benefit to software engineering researchers in both academia and industry.

  6. Multi-parametric MRI characterization of enzymatically degraded articular cartilage.

    Science.gov (United States)

    Nissi, Mikko J; Salo, Elli-Noora; Tiitu, Virpi; Liimatainen, Timo; Michaeli, Shalom; Mangia, Silvia; Ellermann, Jutta; Nieminen, Miika T

    2016-07-01

    Several laboratory and rotating frame quantitative MRI parameters were evaluated and compared for detection of changes in articular cartilage following selective enzymatic digestion. Bovine osteochondral specimens were subjected to 44 h incubation in control medium or in collagenase or chondroitinase ABC to induce superficial collagen or proteoglycan (glycosaminoglycan) alterations. The samples were scanned at 9.4 T for T1, T1Gd (dGEMRIC), T2, adiabatic T1ρ, adiabatic T2ρ, continuous-wave T1ρ, TRAFF2, and T1sat relaxation times and for magnetization transfer ratio (MTR). For reference, glycosaminoglycan content, collagen fibril orientation and biomechanical properties were determined. Changes primarily in the superficial cartilage were noted after enzymatic degradation. Most of the studied parameters were sensitive to the destruction of the collagen network, whereas glycosaminoglycan depletion was detected only by native T1 and T1Gd relaxation time constants throughout the tissue and by MTR superficially. T1, adiabatic T1ρ, adiabatic T2ρ, continuous-wave T1ρ, and T1sat correlated significantly with the biomechanical properties, while T1Gd correlated with glycosaminoglycan staining. The findings indicated that most of the studied MRI parameters were sensitive to both glycosaminoglycan content and collagen network integrity, with changes due to enzymatic treatment detected primarily in the superficial tissue. The strong correlation of T1, adiabatic T1ρ, adiabatic T2ρ, continuous-wave T1ρ, and T1sat with the altered biomechanical properties reflects that these parameters were sensitive to critical functional properties of cartilage. © 2015 Orthopaedic Research Society. Published by Wiley Periodicals, Inc. J Orthop Res 34:1111-1120, 2016.
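
    As background for the relaxation time constants compared above, a time constant such as T2 is typically extracted voxel-wise by fitting a monoexponential decay S(TE) = S0 exp(-TE/T2) to signals measured at several echo times; a minimal fitting sketch on synthetic data (the generic step, not this paper's acquisition protocol):

```python
# Monoexponential relaxation fit for a single voxel, S(TE) = S0*exp(-TE/T2).
import numpy as np
from scipy.optimize import curve_fit

def mono_exp(te, s0, t2):
    return s0 * np.exp(-te / t2)

te_ms = np.array([10.0, 20.0, 40.0, 80.0])     # echo times in ms
signal = mono_exp(te_ms, 1000.0, 35.0)          # noiseless synthetic voxel signal
(s0_fit, t2_fit), _ = curve_fit(mono_exp, te_ms, signal, p0=(signal[0], 50.0))
print(f"fitted T2 = {t2_fit:.1f} ms")           # ~35.0 ms
```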

  7. Discovery Radiomics for Multi-Parametric MRI Prostate Cancer Detection

    CERN Document Server

    Chung, Audrey G; Kumar, Devinder; Khalvati, Farzad; Haider, Masoom A; Wong, Alexander

    2015-01-01

    Prostate cancer is the most diagnosed form of cancer in Canadian men, and is the third leading cause of cancer death. Despite these statistics, prognosis is relatively good with a sufficiently early diagnosis, making fast and reliable prostate cancer detection crucial. As imaging-based prostate cancer screening, such as magnetic resonance imaging (MRI), requires an experienced medical professional to extensively review the data and perform a diagnosis, radiomics-driven methods help streamline the process and have the potential to significantly improve diagnostic accuracy and efficiency, thus improving patient survival rates. These radiomics-driven methods currently rely on hand-crafted sets of quantitative imaging-based features, which are selected manually and can limit their ability to fully characterize unique prostate cancer tumour phenotype. In this study, we propose a novel discovery radiomics framework for generating custom radiomic sequences tailored for prostate cancer detection. Discover...

  8. Controlling Software Piracy.

    Science.gov (United States)

    King, Albert S.

    1992-01-01

    Explains what software manufacturers are doing to combat software piracy, recommends how managers should deal with this problem, and provides a role-playing exercise to help students understand the issues in software piracy. (SR)

  9. Space Flight Software Development Software for Intelligent System Health Management

    Science.gov (United States)

    Trevino, Luis C.; Crumbley, Tim

    2004-01-01

    The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.

  10. Cortical cartography and Caret software.

    Science.gov (United States)

    Van Essen, David C

    2012-08-15

    Caret software is widely used for analyzing and visualizing many types of fMRI data, often in conjunction with experimental data from other modalities. This article places Caret's development in a historical context that spans three decades of brain mapping--from the early days of manually generated flat maps to the nascent field of human connectomics. It also highlights some of Caret's distinctive capabilities. This includes the ease of visualizing data on surfaces and/or volumes and on atlases as well as individual subjects. Caret can display many types of experimental data using various combinations of overlays (e.g., fMRI activation maps, cortical parcellations, areal boundaries), and it has other features that facilitate the analysis and visualization of complex neuroimaging datasets. Copyright © 2011 Elsevier Inc. All rights reserved.

  11. Software Engineering Guidebook

    Science.gov (United States)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  12. Modernising ATLAS Software Build Infrastructure

    CERN Document Server

    Ritsch, Elmar; The ATLAS collaboration

    2017-01-01

    In the last year ATLAS has radically updated its software development infrastructure, hugely reducing the complexity of building releases and greatly improving build speed, flexibility and code testing. The first step in this transition was the adoption of CMake as the software build system in place of the older CMT. This required the development of an automated translation from the old system to the new, followed by extensive testing and improvements. The result was a far more standard build process, married to the method of building ATLAS software as a series of 12 separate projects from Subversion. We then proceeded with a migration of the code base from Subversion to Git. As the Subversion repository had been structured to manage each package more or less independently, there was no simple mapping that could be used to manage the migration into Git. Instead, a specialist set of scripts that captured the software changes across official software releases was developed. With some clean up of the repositor...
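
    The migration scripts themselves are not shown in the abstract; the sketch below illustrates the general release-replay pattern they suggest (materialise each official release snapshot, then record it as a tagged Git commit), with invented paths and tag names:

```python
# Illustrative release-replay pattern: import each official release
# snapshot into Git as a tagged commit, oldest first. The paths, tags,
# and export step are invented stand-ins, not ATLAS's actual scripts.
import subprocess

def run(*cmd, cwd):
    subprocess.run(cmd, cwd=cwd, check=True)

def replay_releases(snapshots, repo):
    """snapshots: list of (release_tag, exported_source_dir) pairs, oldest
    first; repo: path to an initialised Git work tree."""
    for tag, src in snapshots:
        # Make the work tree match this release snapshot exactly.
        run("rsync", "-a", "--delete", "--exclude=.git", f"{src}/", f"{repo}/", cwd=repo)
        run("git", "add", "-A", cwd=repo)
        # --allow-empty keeps the history linear even if nothing changed.
        run("git", "commit", "--allow-empty", "-m", f"Import release {tag}", cwd=repo)
        run("git", "tag", tag, cwd=repo)

# replay_releases([("release/21.0.1", "/exports/21.0.1")], "/work/athena-git")
```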

  13. Modernising ATLAS Software Build Infrastructure

    CERN Document Server

    Gaycken, Goetz; The ATLAS collaboration

    2017-01-01

    In the last year ATLAS has radically updated its software development infrastructure, hugely reducing the complexity of building releases and greatly improving build speed, flexibility and code testing. The first step in this transition was the adoption of CMake as the software build system in place of the older CMT. This required the development of an automated translation from the old system to the new, followed by extensive testing and improvements. The result was a far more standard build process, married to the method of building ATLAS software as a series of 12 separate projects from SVN. We then proceeded with a migration of the code base from SVN to git. As the SVN repository had been structured to manage each package more or less independently, there was no simple mapping that could be used to manage the migration into git. Instead, a specialist set of scripts that captured the software changes across official software releases was developed. With some clean up of the repository and the policy of onl...

  14. Software and systems traceability

    CERN Document Server

    Cleland-Huang, Jane; Zisman, Andrea

    2012-01-01

    ""Software and Systems Traceability"" provides a comprehensive description of the practices and theories of software traceability across all phases of the software development lifecycle. The term software traceability is derived from the concept of requirements traceability. Requirements traceability is the ability to track a requirement all the way from its origins to the downstream work products that implement that requirement in a software system. Software traceability is defined as the ability to relate the various types of software artefacts created during the development of software syst

  15. Maximizing ROI on software development

    CERN Document Server

    Sikka, Vijay

    2004-01-01

    A brief review of software development history. Software complexity crisis. Software development ROI. The case for global software development and testing. Software quality and test ROI. How do you implement global software development and testing. Case studies.

  16. Model Mapping Approach Based on Ontology Semantics

    Directory of Open Access Journals (Sweden)

    Jinkui Hou

    2013-09-01

    Full Text Available The mapping relations between different models are the foundation for model transformation in model-driven software development. On the basis of ontology semantics, model mappings between different levels are classified using the structural semantics of modeling languages. The general process for defining mapping relations is explored, and principles of structure mapping are subsequently proposed. The approach is further illustrated by the mapping relations from the class model of an object-oriented modeling language to C programming code. The application research shows that the approach provides theoretical guidance for the realization of model mapping, and thus effective support for model-driven software development.
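
    As a toy illustration of such a structure mapping, the sketch below maps a small class description onto C code, with attributes becoming struct fields and operations becoming functions that take the struct as their first argument (the model format here is invented):

```python
# Toy structure mapping from a class model to C code: attributes map to
# struct fields, operations to functions on the struct. Illustrative only;
# the model format and type table are invented for this sketch.
C_TYPES = {"int": "int", "float": "double", "string": "const char *"}

def emit_c(cls: dict) -> str:
    fields = "\n".join(f"    {C_TYPES[t]} {n};" for n, t in cls["attributes"])
    funcs = "\n".join(
        f"void {cls['name']}_{op}(struct {cls['name']} *self);"
        for op in cls["operations"]
    )
    return f"struct {cls['name']} {{\n{fields}\n}};\n\n{funcs}"

account = {
    "name": "Account",
    "attributes": [("balance", "float"), ("owner", "string")],
    "operations": ["deposit", "withdraw"],
}
print(emit_c(account))  # emits the struct definition and two prototypes
```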

  17. Concept mapping instrumental support for problem solving

    NARCIS (Netherlands)

    Stoyanov, Slavi; Kommers, Piet

    2008-01-01

    The main theoretical position of this paper is that it is the explicit problem-solving support in concept mapping software that produces a stronger effect in problem-solving performance than the implicit support afforded by the graphical functionality of concept mapping software. Explicit problem-so…

  18. Improving Software Developer's Competence

    DEFF Research Database (Denmark)

    Abrahamsson, Pekka; Kautz, Karlheinz; Sieppi, Heikki

    2002-01-01

    Emerging agile software development methods are people oriented development approaches to be used by the software industry. The personal software process (PSP) is an accepted method for improving the capabilities of a single software engineer. Five original hypotheses regarding the impact...

  19. Ensuring Software IP Cleanliness

    Directory of Open Access Journals (Sweden)

    Mahshad Koohgoli

    2007-12-01

    Full Text Available At many points in the life of a software enterprise, determination of intellectual property (IP) cleanliness becomes critical. The value of an enterprise that develops and sells software may depend on how clean the software is from the IP perspective. This article examines various methods of ensuring software IP cleanliness and discusses some of the benefits and shortcomings of current solutions.

  20. Improving Software Developer's Competence

    DEFF Research Database (Denmark)

    Abrahamsson, Pekka; Kautz, Karlheinz; Sieppi, Heikki

    2002-01-01

    Emerging agile software development methods are people oriented development approaches to be used by the software industry. The personal software process (PSP) is an accepted method for improving the capabilities of a single software engineer. Five original hypotheses regarding the impact...

  1. Agile Software Development

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  2. Software distribution using xnetlib

    Energy Technology Data Exchange (ETDEWEB)

    Dongarra, J.J. [Univ. of Tennessee, Knoxville, TN (US). Dept. of Computer Science; Oak Ridge National Lab., TN (US)]; Rowan, T.H. [Oak Ridge National Lab., TN (US)]; Wade, R.C. [Univ. of Tennessee, Knoxville, TN (US). Dept. of Computer Science]

    1993-06-01

    Xnetlib is a new tool for software distribution. Whereas its predecessor netlib uses e-mail as the user interface to its large collection of public-domain mathematical software, xnetlib uses an X Window interface and socket-based communication. Xnetlib makes it easy to search through a large distributed collection of software and to retrieve requested software in seconds.
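
    As a toy illustration of the request/response idea behind socket-based retrieval, here is a minimal mock client; this is not netlib/xnetlib's actual protocol, and the host, port, and command are invented:

```python
# Toy mock of a socket-based software-retrieval client. Illustrative only:
# NOT netlib/xnetlib's real protocol; endpoint and command are invented.
import socket

def fetch_listing(host: str, port: int, query: str) -> str:
    """Send a one-line search query; return the server's text reply."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall((query + "\n").encode())
        chunks = []
        while data := sock.recv(4096):   # read until the server closes
            chunks.append(data)
    return b"".join(chunks).decode()

# print(fetch_listing("netlib.example.org", 7070, "search linpack"))
```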

  3. Image Processing Software

    Science.gov (United States)

    Bosio, M. A.

    1990-11-01

    ABSTRACT: A brief description of astronomical image processing software is presented. This software was developed on a Digital MicroVAX II computer system. DATA ANALYSIS - IMAGE PROCESSING

  4. Agile Software Development

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  5. Software productivity improvement through software engineering technology

    Science.gov (United States)

    Mcgarry, F. E.

    1985-01-01

    It has been estimated that NASA expends anywhere from 6 to 10 percent of its annual budget on the acquisition, implementation and maintenance of computer software. Although researchers have produced numerous software engineering approaches over the past 5-10 years, each claiming to be more effective than the others, there is very limited quantitative information verifying the measurable impact that any of these technologies may have in a production environment. At NASA/GSFC, an extended research effort aimed at identifying and measuring software techniques that favorably impact the productivity of software development has been active over the past 8 years. Specific, measurable software development technologies have been applied and measured in a production environment. The resulting software development approaches have been shown to be effective in improving both quality and productivity in this one environment.

  6. Great software debates

    CERN Document Server

    Davis, A

    2004-01-01

    The industry’s most outspoken and insightful critic explains how the software industry REALLY works. In Great Software Debates, Al Davis, shares what he has learned about the difference between the theory and the realities of business and encourages you to question and think about software engineering in ways that will help you succeed where others fail. In short, provocative essays, Davis fearlessly reveals the truth about process improvement, productivity, software quality, metrics, agile development, requirements documentation, modeling, software marketing and sales, empiricism, start-up financing, software research, requirements triage, software estimation, and entrepreneurship.

  7. Software Engineering for Practiced Software Enhancement

    Directory of Open Access Journals (Sweden)

    Rashmi Yadav

    2011-03-01

    Full Text Available The software development scenario, particularly in IT industries, is very competitive and demands development with minimum resources. Software development started, and prevailed to an extent in industry, without the use of software engineering practices, which were perceived as an overhead. This approach causes overuse of resources such as money, man-hours and hardware components. This paper attempts to present the causes of these inefficiencies in an almost exhaustive way. Further, an attempt has been made to elaborate software engineering methods as remedies against the listed causes of inefficiency in development.

  8. Software Metrics for Identifying Software Size in Software Development Projects

    Directory of Open Access Journals (Sweden)

    V.S.P Vidanapathirana

    2015-11-01

    Full Text Available Measurements are fundamental to any engineering discipline. They indicate the amount, extent, dimension or capacity of an attribute or a product in a quantitative manner. Metrics are the analyzed results of measured data: a quantitative representation of the degree to which a system, component, or process possesses a given attribute. When it comes to software, metrics cover a wide scope of measurements of computer programs. Size-oriented metrics take a main role among them, since they can be used as the key to better estimation, to improved trust and confidence, and to better control over software products. Software professionals have traditionally measured the size of software applications using several methods. In this paper the researchers discuss software size metrics for identifying software size, focusing mainly on software development projects in today's Information Technology (IT) industry.
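
    As a concrete instance of the simplest size-oriented measure, the sketch below counts physical source lines of code (SLOC) as non-blank, non-comment lines; real size measures such as function points are considerably more involved:

```python
# Minimal size-oriented metric: physical source lines of code (SLOC),
# counted here as non-blank lines that are not full-line '#' comments
# in a Python source file.
def count_sloc(path: str) -> int:
    sloc = 0
    with open(path, encoding="utf-8") as f:
        for line in f:
            stripped = line.strip()
            if stripped and not stripped.startswith("#"):
                sloc += 1
    return sloc

# print(count_sloc("my_module.py"))  # e.g. 142
```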

  9. Software Cost Estimation Review

    OpenAIRE

    Ongere, Alphonce

    2013-01-01

    Software cost estimation is the process of predicting the effort, the time and the cost required to complete a software project successfully. It involves size measurement of the software product to be produced, estimating and allocating the effort, drawing up the project schedules, and finally, estimating the overall cost of the project. Accurate estimation of software project cost is an important factor for business and the welfare of a software organization in general. If cost and effort estimat...

  10. Software Partitioning Technologies

    Science.gov (United States)

    2001-05-29

    Presentation slides by Tim Skutt, Smiths Aerospace, 3290 Patterson Ave. SE, Grand Rapids, MI 49512-1991, (616) 241-8645, skutt_timothy... Agenda: software partitioning overview; Smiths software partitioning technology; software partitioning... Slide topics include the partition-level OS, core module-level OS, timers, MMU, I/O, and API-layer partitioning services. Smiths has developed...

  11. Impact of Growing Business on Software Processes

    Science.gov (United States)

    Nikitina, Natalja; Kajko-Mattsson, Mira

    When growing their businesses, software organizations should not only put effort into developing and executing their business strategies, but also into managing and improving their internal software development processes and aligning them with their business growth strategies. Only in this way can they ensure that their businesses grow in a healthy and sustainable way. In this paper, we map out one software company's business growth over the course of its historical events and identify its impact on the company's software production processes and capabilities. The impact concerns benefits, challenges, problems and lessons learned. The most important lesson learned is that although business growth became a stimulus for thinking about and improving software processes, the organization lacked guidelines to aid it in aligning process improvement with business growth. Finally, the paper generates research questions providing a platform for future research.

  12. Software Engineering Program: Software Process Improvement Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  13. Payload software technology: Software technology development plan

    Science.gov (United States)

    1977-01-01

    Programmatic requirements for the advancement of software technology are identified for meeting the space flight requirements in the 1980 to 1990 time period. The development items are described, and software technology item derivation worksheets are presented along with the cost/time/priority assessments.

  14. Pragmatic Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan; Jensen, Rikke Hagensby

    2014-01-01

    We understand software innovation as concerned with introducing innovation into the development of software intensive systems, i.e. systems in which software development and/or integration are dominant considerations. Innovation is key in almost any strategy for competitiveness in existing markets, for creating new markets, or for curbing rising public expenses, and software intensive systems are core elements in most such strategies. Software innovation therefore is vital for about every sector of the economy. Changes in software technologies over the last decades have opened up for experimentation…

  15. Pragmatic Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan; Jensen, Rikke Hagensby

    2014-01-01

    We understand software innovation as concerned with introducing innovation into the development of software intensive systems, i.e. systems in which software development and/or integration are dominant considerations. Innovation is key in almost any strategy for competitiveness in existing markets, for creating new markets, or for curbing rising public expenses, and software intensive systems are core elements in most such strategies. Software innovation therefore is vital for about every sector of the economy. Changes in software technologies over the last decades have opened up for experimentation, learning, and flexibility in ongoing software projects, but how can this change be used to facilitate software innovation? How can a team systematically identify and pursue opportunities to create added value in ongoing projects? In this paper, we describe Deweyan pragmatism as the philosophical foundation…

  16. Software Engineering Improvement Plan

    Science.gov (United States)

    2006-01-01

    In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.

  17. Paladin Software Support Lab

    Data.gov (United States)

    Federal Laboratory Consortium — The Paladin Software Support Environment (SSE) occupies 2,241 square feet. It contains the hardware and software tools required to support the Paladin Automatic Fire...

  18. ATLAS software packaging

    CERN Document Server

    Rybkin, G

    2012-01-01

    Software packaging is an indispensable part of the build process and a prerequisite for deployment. The full ATLAS software stack consists of TDAQ, HLT, and Offline software. These software groups depend on some 80 external software packages. We present the tools, in the package PackDist, developed and used to package all this software except for the TDAQ project. PackDist is based on and driven by CMT, the ATLAS software configuration and build tool, and consists of shell and Python scripts. The packaging unit used is the CMT project. Each CMT project is packaged as several packages - platform dependent (one per platform available), source code excluding header files, other platform independent files, documentation, and debug information packages (the last two being built optionally). Packaging can be done recursively to package all the dependencies. The whole set of packages for one software release, the distribution kit, also includes configuration packages and contains some 120 packages for one platform. Also packaged are physics analysis pro...

  19. Commercial Data Mining Software

    Science.gov (United States)

    Zhang, Qingyu; Segall, Richard S.

    This chapter discusses selected commercial software for data mining, supercomputing data mining, text mining, and web mining. The selected packages are compared in terms of their features and also applied to available data sets. The software for data mining are SAS Enterprise Miner, Megaputer PolyAnalyst 5.0, PASW (formerly SPSS Clementine), IBM Intelligent Miner, and BioDiscovery GeneSight. The software for supercomputing are Avizo by Visualization Science Group and JMP Genomics from SAS Institute. The software for text mining are SAS Text Miner and Megaputer PolyAnalyst 5.0. The software for web mining are Megaputer PolyAnalyst and SPSS Clementine. Background on related literature and software is presented. Screen shots of each of the selected packages are presented, as are conclusions and future directions.

  20. Image processing and enhancement provided by commercial dental software programs

    National Research Council Canada - National Science Library

    Lehmann, T M; Troeltsch, E; Spitzer, K

    2002-01-01

    To identify and analyse methods/algorithms for image processing provided by various commercial software programs used in direct digital dental imaging and to map them onto a standardized nomenclature...

  1. Software Testing Requires Variability

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    2003-01-01

    Software variability is the ability of a software system or artefact to be changed, customized or configured for use in a particular context. Variability in software systems is important from a number of perspectives. Some perspectives rightly receive much attention due to their direct economic impact in software production. As is also apparent from the call for papers, these perspectives focus on qualities such as reuse, adaptability, and maintainability…

  2. Software engineer's pocket book

    CERN Document Server

    Tooley, Michael

    2013-01-01

    Software Engineer's Pocket Book provides a concise discussion on various aspects of software engineering. The book is comprised of six chapters that tackle various areas of concern in software engineering. Chapter 1 discusses software development, and Chapter 2 covers programming languages. Chapter 3 deals with operating systems. The book also tackles discrete mathematics and numerical computation. Data structures and algorithms are also explained. The text will be of great use to individuals involved in the specification, design, development, implementation, testing, maintenance, and quality…

  3. Software Testing Requires Variability

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    2003-01-01

    Software variability is the ability of a software system or artefact to be changed, customized or configured for use in a particular context. Variability in software systems is important from a number of perspectives. Some perspectives rightly receive much attention due to their direct economic impact in software production. As is also apparent from the call for papers, these perspectives focus on qualities such as reuse, adaptability, and maintainability…

  4. Software engineering measurement

    CERN Document Server

    Munson, PhD, John C

    2003-01-01

    By demonstrating how to develop simple experiments for the empirical validation of theoretical research and showing how to convert measurement data into meaningful and valuable information, this text fosters more precise use of software measurement in the computer science and software engineering literature. Software Engineering Measurement shows you how to convert your measurement data to valuable information that can be used immediately for software process improvement.

  5. Innovation Drivers and Outputs for Software Firms: Literature Review and Concept Development

    Directory of Open Access Journals (Sweden)

    Jeremy Rose

    2016-01-01

    Full Text Available Software innovation, the ability to produce novel and useful software systems, is an important capability for software development organizations and information system developers alike. However, the software development literature has traditionally focused on automation and efficiency while the innovation literature has given relatively little consideration to the software development context. As a result, there is a gap in our understanding of how software product and process innovation can be managed. Specifically, little attention has been directed toward synthesizing prior learning or providing an integrative perspective on the key concepts and focus of software innovation research. We therefore identify 93 journal articles and conference papers within the domain of software innovation and analyse repeating patterns in this literature using content analysis and causal mapping. We identify drivers and outputs for software innovation and develop an integrated theory-oriented concept map. We then discuss the implications of this map for future research.

  6. Software variability management

    NARCIS (Netherlands)

    Bosch, J; Nord, RL

    2004-01-01

    During recent years, the amount of variability that has to be supported by a software artefact is growing considerably and its management is evolving into a major challenge during development, usage, and evolution of software artefacts. Successful management of variability in software leads to

  7. Software Language Evolution

    NARCIS (Netherlands)

    Vermolen, S.D.

    2012-01-01

    Software plays a critical role in our daily life. Vast amounts of money are spent on more and more complex systems. All software, regardless of whether it controls a plane or a game on your phone, is never finished. Software changes when it contains bugs or when new functionality is added. This process of…

  8. Software Architecture Evolution

    Science.gov (United States)

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  9. Java for flight software

    Science.gov (United States)

    Benowitz, E.; Niessner, A.

    2003-01-01

    This work involves developing representative mission-critical spacecraft software using the Real-Time Specification for Java (RTSJ). It currently leverages actual flight software used in NASA's Deep Space 1 (DS1) mission, which flew in 1998.

  10. Software Language Evolution

    NARCIS (Netherlands)

    Vermolen, S.D.

    2012-01-01

    Software plays a critical role in our daily life. Vast amounts of money are spent on more and more complex systems. All software, regardless of whether it controls a plane or a game on your phone, is never finished. Software changes when it contains bugs or when new functionality is added. This process of…

  11. Software Engineering for Portability.

    Science.gov (United States)

    Stanchev, Ivan

    1990-01-01

    Discussion of the portability of educational software focuses on the software design and development process. Topics discussed include levels of portability; the user-computer dialog; software engineering principles; design techniques for student performance records; techniques of courseware programing; and suggestions for further research and…

  12. Software Architecture Evolution

    Science.gov (United States)

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  13. Software Maintenance Success Recipes

    CERN Document Server

    Reifer, Donald J

    2011-01-01

    Dispelling much of the folklore surrounding software maintenance, Software Maintenance Success Recipes identifies actionable formulas for success based on in-depth analysis of more than 200 real-world maintenance projects. It details the set of factors that are usually present when effective software maintenance teams do their work and instructs on the methods required to achieve success. Donald J. Reifer--an award winner for his contributions to the field of software engineering and whose experience includes managing the DoD Software Initiatives Office--provides step-by-step guidance on how t

  14. Funding Research Software Development

    Science.gov (United States)

    Momcheva, Ivelina G.

    2017-01-01

    Astronomical software is used by each and every member of our scientific community. Purpose-built software is becoming ever more critical as we enter the regime of large datasets and simulations of increasing complexity. However, financial investment in building, maintaining and renovating the software infrastructure has been uneven. In this talk I will summarize past and current funding sources for astronomical software development, discuss other models of funding and introduce a new initiative for supporting community software at STScI. The purpose of this talk is to prompt discussion about how we allocate resources to this vital infrastructure.

  15. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1: ASC software quality engineering practices, Version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr. (,; .); Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines. This document also identifies ASC management and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  16. Software Maintenance and Evolution: The Implication for Software ...

    African Journals Online (AJOL)

    PROF. O. E. OSUAGWU

    2013-06-01

    Only abstract fragments survive. Keywords: software, software maintenance, software evolution, reverse engineering.

  17. NASA software documentation standard software engineering program

    Science.gov (United States)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.
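
    To make the Standard's four-volume framework concrete, a minimal sketch follows, modeling the volumes as a completeness checklist. This is illustrative only; the data structure and helper name are hypothetical and not part of any NASA tooling.

        # Hypothetical sketch: the Standard's four documentation volumes as a
        # simple completeness checklist (contents paraphrased from the record).
        REQUIRED_VOLUMES = {
            "Management Plan": "planning and business aspects, incl. assurance planning",
            "Product Specification": "software requirements and design",
            "Assurance and Test Procedures": "test, QA, and V&V information",
            "Management, Engineering, and Assurance Reports": "all project reports",
        }

        def missing_volumes(project_docs):
            """Return the required volumes a project has not yet produced."""
            return [name for name in REQUIRED_VOLUMES if name not in project_docs]

        print(missing_volumes({"Management Plan", "Product Specification"}))
        # ['Assurance and Test Procedures',
        #  'Management, Engineering, and Assurance Reports']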

  18. Development and Application of Intelligent Mapping Software for Escalator Installation Technology

    Institute of Scientific and Technical Information of China (English)

    李成洋

    2015-01-01

    To ensure the correct installation of escalators in urban rail transit and to improve design quality and efficiency, this paper studies the interfaces between escalators and related disciplines and summarizes the drawing elements of escalator installation process diagrams. Using a modular, workflow-oriented three-layer software framework, a set of intelligent drawing software for escalator installation process diagrams was developed, making the process design of urban rail transit escalators simpler, standardized, and intelligent.

  19. Application of FontCreator Software in Making Point Symbols in Forestry Computer Mapping

    Institute of Scientific and Technical Information of China (English)

    杨璇玺

    2013-01-01

    Taking the point symbols specified in LY/T 2009-2012, "Cartographic Norms for County-Level Forest Land Protection and Utilization Planning", as an example, this paper describes the application of the FontCreator software to making map symbols, covering creating symbol files, editing them, saving and installing them, and using them. Point symbols made with FontCreator are simple and fast to produce, and the resulting symbols are clear, standardized, and meet the requirements.

  20. Software Defined Networking Demands on Software Technologies

    DEFF Research Database (Denmark)

    Galinac Grbac, T.; Caba, Cosmin Marius; Soler, José

    2015-01-01

    Software Defined Networking (SDN) is a networking approach based on a centralized control plane architecture with standardised interfaces between control and data planes. SDN enables fast configuration and reconfiguration of the network to enhance resource utilization and service performances....... This new approach enables a more dynamic and flexible network, which may adapt to user needs and application requirements. To this end, systemized solutions must be implemented in network software, aiming to provide secure network services that meet the required service performance levels. In this paper......, we review this new approach to networking from an architectural point of view, and identify and discuss some critical quality issues that require new developments in software technologies. These issues we discuss along with use case scenarios. Here in this paper we aim to identify challenges...

  1. Trace Software Pipelining

    Institute of Scientific and Technical Information of China (English)

    王剑; Andreas Krall; et al.

    1995-01-01

    Global software pipelining is a complex but efficient compilation technique to exploit instruction-level parallelism for loops with branches. This paper presents a novel global software pipelining technique, called Trace Software Pipelining, targeted to instruction-level parallel processors such as Very Long Instruction Word (VLIW) and superscalar machines. Trace software pipelining applies a global code scheduling technique to compact the original loop body. The resulting loop is called a trace software pipelined (TSP) code. The TSP code can be directly executed with special architectural support or can be transformed into a globally software pipelined loop for current VLIW and superscalar processors. Thus, exploiting parallelism across all iterations of a loop can be achieved by compacting the original loop body with any global code scheduling technique. This makes our new technique very promising for practical compilers. Finally, we also present preliminary experimental results to support our new approach.
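
    The prologue/kernel/epilogue structure that pipelining produces can be sketched in a few lines. The Python below contrasts a plain three-stage loop with a software-pipelined version that overlaps stages from different iterations in one kernel; it illustrates classic software pipelining in general, under made-up stage functions, not the paper's trace-based scheduling algorithm.

        # Each iteration runs three dependent stages A -> B -> C. The pipelined
        # loop executes C of iteration i-2, B of i-1 and A of i in one kernel,
        # which a VLIW/superscalar machine could issue in parallel.
        def a(i): return i * 2        # stage A, e.g. a load
        def b(x): return x + 1        # stage B, e.g. a computation
        def c(x): return x            # stage C, e.g. a store

        def plain_loop(n):
            return [c(b(a(i))) for i in range(n)]

        def pipelined_loop(n):        # assumes n >= 2
            out = []
            x0 = a(0); x1 = b(x0); x0 = a(1)          # prologue: fill pipeline
            for i in range(2, n):                     # kernel: one compacted body
                out.append(c(x1)); x1 = b(x0); x0 = a(i)
            out.append(c(x1)); out.append(c(b(x0)))   # epilogue: drain pipeline
            return out

        assert plain_loop(10) == pipelined_loop(10)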

  2. COTS software selection process.

    Energy Technology Data Exchange (ETDEWEB)

    Watkins, William M. (Strike Wire Technologies, Louisville, CO); Lin, Han Wei; McClelland, Kelly (U.S. Security Associates, Livermore, CA); Ullrich, Rebecca Ann; Khanjenoori, Soheil; Dalton, Karen; Lai, Anh Tri; Kuca, Michal; Pacheco, Sandra; Shaffer-Gant, Jessica

    2006-05-01

    Today's need for rapid software development has generated great interest in employing Commercial-Off-The-Shelf (COTS) software products as a way of managing cost, development time, and effort. With an abundance of COTS software packages to choose from, the problem now is how to systematically evaluate, rank, and select a COTS product that best meets the software project requirements and at the same time can leverage the current corporate information technology architectural environment. This paper describes a systematic process for decision support in evaluating and ranking COTS software. Performed right after the requirements analysis, this process provides the evaluators with concise, structured, and step-by-step activities for determining the best COTS software product with manageable risk. In addition, the process is presented in phases that are flexible enough to allow for customization or tailoring to meet various projects' requirements.
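
    One step such a process typically includes is scoring candidates against weighted requirement criteria; a minimal weighted-sum sketch is below. The criteria, weights, and scores are invented placeholders, not the paper's actual evaluation worksheet.

        # Toy weighted-sum ranking of COTS candidates (all values hypothetical).
        weights = {"functionality": 0.4, "cost": 0.2,
                   "vendor support": 0.2, "architecture fit": 0.2}

        candidates = {
            "Package A": {"functionality": 8, "cost": 6,
                          "vendor support": 7, "architecture fit": 9},
            "Package B": {"functionality": 9, "cost": 4,
                          "vendor support": 6, "architecture fit": 5},
        }

        def score(pkg):
            return sum(w * candidates[pkg][c] for c, w in weights.items())

        for pkg in sorted(candidates, key=score, reverse=True):
            print(f"{pkg}: {score(pkg):.2f}")   # Package A: 7.60, Package B: 6.60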

  3. Social software in global software development

    DEFF Research Database (Denmark)

    2010-01-01

    Social software (SoSo) is defined by Farkas as tools that (1) allow people to communicate, collaborate, and build community online (2) can be syndicated, shared, reused or remixed and (3) let people learn easily from and capitalize on the behavior and knowledge of others. [1]. SoSo include a wide...... variety of tools such as: instant messaging, internet forums, mailing lists, blogs, wikis, social network sites, social bookmarking, social libraries, virtual worlds. Though normally rather belonging to the private realm, the use of social software in corporate context has been reported, e.g. as a way...

  5. Software Quality Assurance in Software Projects: A Study of Pakistan

    Directory of Open Access Journals (Sweden)

    Faisal Shafique Butt

    2013-05-01

    Full Text Available Software quality is specific property which tells what kind of standard software should have. In a software project, quality is the key factor of success and decline of software related organization. Many researches have been done regarding software quality. Software related organization follows standards introduced by Capability Maturity Model Integration (CMMI to achieve good quality software. Quality is divided into three main layers which are Software Quality Assurance (SQA, Software Quality Plan (SQP and Software Quality Control (SQC. So In this study, we are discussing the quality standards and principles of software projects in Pakistan software Industry and how these implemented quality standards are measured and managed. In this study, we will see how many software firms are following the rules of CMMI to create software. How many are reaching international standards and how many firms are measuring the quality of their projects. The results show some of the companies are using software quality assurance techniques in Pakstan.

  6. Software engineering the current practice

    CERN Document Server

    Rajlich, Vaclav

    2011-01-01

    INTRODUCTION: History of Software Engineering; Software Properties; Origins of Software; Birth of Software Engineering; Third Paradigm: Iterative Approach; Software Life Span Models; Staged Model; Variants of Staged Model. SOFTWARE TECHNOLOGIES: Programming Languages and Compilers; Object-Oriented Technology; Version Control System; Software Models; Class Diagrams; UML Activity Diagrams; Class Dependency Graphs and Contracts. SOFTWARE CHANGE: Introduction to Software Change; Characteristics of Software Change; Phases of Software Change; Requirements and Their Elicitation; Requirements Analysis and Change Initiation; Concepts and Concept

  7. Real World Software Engineering

    Science.gov (United States)

    1994-07-15

    Only OCR fragments of this report survive: a later semester addresses the remaining principles of a complete, mature software development process [Humphrey 88], and copies of all the forms mentioned are available via electronic mail from the authors.

  8. Software configuration management

    CERN Document Server

    Keyes, Jessica

    2004-01-01

    Software Configuration Management discusses the framework from a standards viewpoint, using the original DoD MIL-STD-973 and EIA-649 standards to describe the elements of configuration management within a software engineering perspective. Divided into two parts, the first section is composed of 14 chapters that explain every facet of configuration management related to software engineering. The second section consists of 25 appendices that contain many valuable real world CM templates.

  9. Software Process Improvement Defined

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2002-01-01

    This paper argues in favor of the development of explanatory theory on software process improvement. The last one or two decades' commitment to prescriptive approaches in software process improvement theory may contribute to the emergence of a gulf dividing theorists and practitioners. It is proposed that this divide be met by the development of theory evaluating prescriptive approaches and informing practice, with a focus on the software process policymaking and process control aspects of improvement efforts.

  10. Software evolution with XVCL

    DEFF Research Database (Denmark)

    Zhang, Weishan; Jarzabek, Stan; Zhang, Hongyu

    2004-01-01

    This chapter introduces software evolution with XVCL (XML-based Variant Configuration Language), which is an XML-based metaprogramming technique. As the software evolves, a large number of variants may arise, especially when such evolutions are related to multiple platforms, as shown in our case study. Handling variants and tracing the impact of variants across the development lifecycle is a challenge. This chapter shows how we can maintain different versions of software in a reuse-based way.

  11. Software systems for astronomy

    CERN Document Server

    Conrad, Albert R

    2014-01-01

    This book covers the use and development of software for astronomy. It describes the control systems used to point the telescope and operate its cameras and spectrographs, as well as the web-based tools used to plan those observations. In addition, the book also covers the analysis and archiving of astronomical data once it has been acquired. Readers will learn about existing software tools and packages, develop their own software tools, and analyze real data sets.

  12. Lean software development

    OpenAIRE

    Hefnerová, Lucie

    2011-01-01

    The main goal of this bachelor thesis is the emergence of the clear Czech written material concerning the concept of Lean Software Development, which has been gaining significant attention in the field of software development, recently. Another goal of this thesis is to summarize the possible approaches of categorizing the concept and to summarize the possible approaches of defining the relationship between Lean and Agile software development. The detailed categorization of the tools potentia...

  13. Essential software architecture

    CERN Document Server

    Gorton, Ian

    2011-01-01

    Job titles like "Technical Architect" and "Chief Architect" nowadays abound in the software industry, yet many people suspect that "architecture" is one of the most overused and least understood terms in professional software development. Gorton's book tries to resolve this dilemma. It concisely describes the essential elements of knowledge and key skills required to be a software architect. The explanations encompass the essentials of architecture thinking, practices, and supporting technologies. They range from a general understanding of structure and quality attributes through technical i

  15. Solar Asset Management Software

    Energy Technology Data Exchange (ETDEWEB)

    Iverson, Aaron [Ra Power Management, Inc., Oakland, CA (United States); Zviagin, George [Ra Power Management, Inc., Oakland, CA (United States)

    2016-09-30

    Ra Power Management (RPM) has developed a cloud based software platform that manages the financial and operational functions of third party financed solar projects throughout their lifecycle. RPM’s software streamlines and automates the sales, financing, and management of a portfolio of solar assets. The software helps solar developers automate the most difficult aspects of asset management, leading to increased transparency, efficiency, and reduction in human error. More importantly, our platform will help developers save money by improving their operating margins.

  16. Software evolution and maintenance

    CERN Document Server

    Tripathy, Priyadarshi

    2014-01-01

    Software Evolution and Maintenance: A Practitioner's Approach is an accessible textbook for students and professionals, which collates the advances in software development and provides the most current models and techniques in maintenance. It explains two maintenance standards, IEEE/EIA 1219 and ISO/IEC 14764; discusses several commercial reverse and domain engineering toolkits; and bases its information on the IEEE SWEBOK (Software Engineering Body of Knowledge). Slides for instructors are available online.

  17. Software Architecture Technology Initiative

    Science.gov (United States)

    2008-04-01

    Presented at the SEI Software Architecture Technology User Network (SATURN) Workshop, 30 April to 1 May 2008, Pittsburgh, PA.

  18. Gammasphere software development

    Energy Technology Data Exchange (ETDEWEB)

    Piercey, R.B.

    1993-01-01

    Activities of the nuclear physics group are described. Progress was made in organizing the Gammasphere Software Working Group, establishing a nuclear computing facility, participating in software development at Lawrence Berkeley, developing a common data file format, and adapting the ORNL UPAK software to run at Gammasphere. A universal histogram object was developed that defines a file format and provides for an object-oriented programming model. An automated liquid nitrogen fill system was developed for Gammasphere (110 Ge detectors comprise the sphere).

  19. Parallel Software Model Checking

    Science.gov (United States)

    2015-01-08

    Only report-form fragments survive; the team members listed are Sagar Chaki and Arie Gurfinkel.

  1. Essence: Facilitating Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2008-01-01

      This paper suggests ways to facilitate creativity and innovation in software development. The paper applies four perspectives – Product, Project, Process, and People –to identify an outlook for software innovation. The paper then describes a new facility–Software Innovation Research Lab (SIRL......) – and a new method concept for software innovation – Essence – based on views, modes, and team roles. Finally, the paper reports from an early experiment using SIRL and Essence and identifies further research....

  2. Global Software Engineering

    DEFF Research Database (Denmark)

    Ebert, Christof; Kuhrmann, Marco; Prikladnicki, Rafael

    2016-01-01

    Professional software products and IT systems and services today are developed mostly by globally distributed teams, projects, and companies. Successfully orchestrating Global Software Engineering (GSE) has become the major success factor both for organizations and practitioners. Yet, more than... and experience reported at the IEEE International Conference on Global Software Engineering (ICGSE) series. The outcomes of our analysis show GSE as a field highly attached to industry and, thus, a considerable share of ICGSE papers address the transfer of Software Engineering concepts and solutions to the global stage...

  3. Agile software development

    CERN Document Server

    Dingsoyr, Torgeir; Moe, Nils Brede

    2010-01-01

    Agile software development has become an umbrella term for a number of changes in how software developers plan and coordinate their work, how they communicate with customers and external stakeholders, and how software development is organized in small, medium, and large companies, from the telecom and healthcare sectors to games and interactive media. Still, after a decade of research, agile software development is the source of continued debate due to its multifaceted nature and insufficient synthesis of research results. Dingsoyr, Dyba, and Moe now present a comprehensive snapshot of the kno

  4. Software architecture 1

    CERN Document Server

    Oussalah , Mourad Chabane

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural template

  5. Managing Software Process Evolution

    DEFF Research Database (Denmark)

    This book focuses on the design, development, management, governance and application of evolving software processes that are aligned with changing business objectives, such as expansion to new domains or shifting to global production. In the context of an evolving business world, it examines...... essential insights and tips to help readers manage process evolutions. And last but not least, it provides a wealth of examples and cases on how to deal with software evolution in practice. Reflecting these topics, the book is divided into three parts. Part 1 focuses on software business transformation...... the organization and management of (software development) projects and process improvements projects....

  7. Software architecture 2

    CERN Document Server

    Oussalah, Mourad Chabanne

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural templa

  8. MYOB software for dummies

    CERN Document Server

    Curtis, Veechi

    2012-01-01

    Your complete guide to MYOB® AccountRight software. Now in its seventh edition, MYOB® Software For Dummies walks you through everything you need to know, from starting your MYOB® file from scratch and recording payments and receipts, to tracking profit and analysing sales. This new edition includes all the information you need on the new generation of MYOB® AccountRight software, including the new cloud computing features: set up MYOB® software and understand how to make it work the first time; keep track of purchases and sales; monitor customer accounts and ensure you get pai

  9. Genetic Mapping

    Science.gov (United States)

    Fact sheet (also available in Spanish as "Mapeo Genético") addressing: What is genetic mapping? How do researchers create a genetic map? What are genetic markers?

  10. Teaching Social Software with Social Software

    Science.gov (United States)

    Mejias, Ulises

    2006-01-01

    Ulises Mejias examines how social software--information and communications technologies that facilitate the collaboration and exchange of ideas--enables students to participate in distributed research, an approach to learning in which knowledge is collectively constructed and shared. During Fall 2005, Mejias taught a graduate seminar that provided…

  11. Differential diagnosis of diffuse lesions in the peripheral zone of the prostate using multi-parametric MRI

    Institute of Scientific and Technical Information of China (English)

    王伟; 邵志红; 王国良; 吴登龙; 高晓龙; 王培军

    2016-01-01

    Objective: To determine the utility of multi-parametric MR imaging (Mp-MRI) for the differential diagnosis of diffuse prostate cancer and prostatitis in the peripheral zone. Methods: A retrospective study of 33 cases of prostate disease was conducted, including 18 cases of prostatitis and 15 cases of prostate cancer; all cases were confirmed histopathologically by transrectal ultrasound (TRUS)-guided saturation biopsy or radical prostatectomy. All MR examinations were performed on a 3.0 T Siemens Verio scanner. The imaging protocol consisted of high-resolution axial and sagittal T2-weighted scans, axial diffusion-weighted imaging (DWI), and dynamic contrast-enhanced (DCE) scans. Results: All cases of prostatitis were homogeneously or heterogeneously hypointense or slightly hypointense on T2WI, with a clear boundary with the transitional zone and an intact capsule in 14 cases. Prostatitis showed slight hyperintensity or hyperintensity on DWI, with a mean ADC value of (1.12±0.15)×10⁻³ mm²/s. On DCE, 9 cases of prostatitis showed plateau enhancement and 6 showed progressive enhancement; 3 cases showed heterogeneous enhancement with a small abscess or cyst. All prostate cancer cases showed uniform hypointensity on T2WI, with an unclear boundary with the transitional zone and capsular invasion in 9 cases. Prostate cancer presented hyperintensity or slight hyperintensity on DWI, with a mean ADC value of (0.85±0.19)×10⁻³ mm²/s. The mean ADC value of prostate cancer was significantly lower than that of prostatitis (t=4.563, P<0.01). On DCE, 10 cases of prostate cancer showed wash-out enhancement and 5 showed plateau enhancement. Conclusion: Mp-MRI is helpful in the differential diagnosis of diffuse prostate cancer and prostatitis in the peripheral zone.
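
    As an arithmetic check, the reported statistic follows from the summary data in the abstract: a pooled-variance two-sample t-test on the mean ADC values, (1.12±0.15)×10⁻³ mm²/s (n=18) versus (0.85±0.19)×10⁻³ mm²/s (n=15), reproduces t=4.563. A short SciPy verification:

        # Recompute the abstract's t statistic from its summary statistics.
        from scipy.stats import ttest_ind_from_stats

        t, p = ttest_ind_from_stats(mean1=1.12, std1=0.15, nobs1=18,
                                    mean2=0.85, std2=0.19, nobs2=15)
        print(round(t, 3), p < 0.01)   # 4.563 True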

  12. ATLAS software packaging

    Science.gov (United States)

    Rybkin, Grigory

    2012-12-01

    Software packaging is an indispensable part of the build process and a prerequisite for deployment. The full ATLAS software stack consists of TDAQ, HLT, and Offline software. These software groups depend on some 80 external software packages. We present the tools, the package PackDist, developed and used to package all this software except for the TDAQ project. PackDist is based on and driven by CMT, the ATLAS software configuration and build tool, and consists of shell and Python scripts. The packaging unit used is the CMT project. Each CMT project is packaged as several packages: platform-dependent (one per available platform), source code excluding header files, other platform-independent files, documentation, and debug information packages (the last two being built optionally). Packaging can be done recursively to package all the dependencies. The whole set of packages for one software release, the distribution kit, also includes configuration packages and contains some 120 packages for one platform. Also packaged are physics analysis projects (currently 6) used by particular physics groups on top of the full release. The tools provide an installation test for the full distribution kit. Packaging is done in two formats, for use with the Pacman and RPM package managers. The tools are functional on the platforms supported by ATLAS: GNU/Linux and Mac OS X. The packaged software is used for software deployment on all ATLAS computing resources, from the detector and trigger computing farms, collaboration laboratories' computing centres, grid sites, to physicist laptops and CERN VMFS, and covers the use cases of running all applications as well as of software development.
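
    The recursive packaging step can be pictured as a depth-first walk of the CMT project dependency graph, packaging each project exactly once. The sketch below is conceptual only: the project names and the packaging action are hypothetical, and PackDist itself is CMT-driven shell and Python scripting rather than this code.

        # Toy model of "packaging can be done recursively": dependencies first,
        # each project once. Graph and actions are made up for illustration.
        DEPENDS = {
            "AtlasOffline": ["AtlasEvent", "AtlasCore"],
            "AtlasEvent": ["AtlasCore"],
            "AtlasCore": [],
        }

        def package(project, done=None):
            done = set() if done is None else done
            if project in done:
                return done
            for dep in DEPENDS[project]:          # recurse into dependencies
                package(dep, done)
            print(f"packaging {project}: platform, source, doc, debug packages")
            done.add(project)
            return done

        package("AtlasOffline")   # packages AtlasCore, AtlasEvent, AtlasOffline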

  13. Concept Maps

    OpenAIRE

    Schwendimann, Beat Adrian

    2014-01-01

    A concept map is a node-link diagram showing the semantic relationships among concepts. The technique for constructing concept maps is called "concept mapping". A concept map consists of nodes, arrows as linking lines, and linking phrases that describe the relationship between nodes. Two nodes connected with a labeled arrow are called a proposition. Concept maps are versatile graphic organizers that can represent many different forms of relationships between concepts. The relationship between...
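
    The node-link structure just described maps naturally onto a small data structure: a concept map is a set of propositions, each a (node, linking phrase, node) triple. A minimal sketch with illustrative propositions:

        # A concept map as labeled node-link propositions.
        propositions = [
            ("concept map", "consists of", "nodes"),
            ("concept map", "consists of", "linking phrases"),
            ("two linked nodes", "form", "a proposition"),
        ]

        for subject, link, obj in propositions:
            print(f"{subject} --[{link}]--> {obj}")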

  14. Sand Lake WMD vegetation mapping project update

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — Final report on the vegetation mapping project at Sand Lake Wetland Management District. This project is being completed by the use of SPRING software and ground...

  15. Software for multistate analysis

    NARCIS (Netherlands)

    Willekens, Frans; Putter, H.

    2014-01-01

    Background: The growing interest in pathways, the increased availability of life-history data, innovations in statistical and demographic techniques, and advances in software technology have stimulated the development of software packages for multistate modeling of life histories. Objective: In the

  18. Software Project Management

    Science.gov (United States)

    1989-07-01

    Only OCR fragments survive: the report presents software management obstacles and ways to cope with them, and discusses algorithmic models used to estimate software costs (SLIM, COCOMO, Function Points).

  19. Software Quality Metrics

    Science.gov (United States)

    1991-07-01

    Only citation fragments survive, including Gorla, Benander, and Benander, "Debugging Effort Estimation Using Software Metrics", IEEE, and IEEE Std 982.2-1988, IEEE Guide for the Use of IEEE Standard Dictionary of Measures to Produce Reliable Software (June 1989).

  20. Software Engineering Education Directory

    Science.gov (United States)

    1989-02-01

    Only directory fragments survive: entries list course textbooks and environments, for example The C Programming Language by Kernighan and Ritchie, Software Testing Techniques by Beizer, languages such as Ada and Eiffel, and hardware such as the NCR Tower 32/600 running UNIX System V.

  1. Marketing Mix del Software.

    Directory of Open Access Journals (Sweden)

    Yudith del Carmen Rodríguez Pérez

    2006-03-01

    Therefore, this work defines the concept of the software product, characterizes it, and presents its quality attributes. It also addresses the marketing mix that software requires, which differs from that of other products, for it to succeed in the market.

  2. Software engineering ethics

    Science.gov (United States)

    Bown, Rodney L.

    1991-01-01

    Software engineering ethics is reviewed. The following subject areas are covered: lack of a system viewpoint; arrogance of PC DOS software vendors; violation of upward compatibility; internet worm; internet worm revisited; student cheating and company hiring interviews; computing practitioners and the commodity market; new projects and old programming languages; schedule and budget; and recent public domain comments.

  3. Software Assurance Competency Model

    Science.gov (United States)

    2013-03-01

    Only fragments survive: the model describes software assurance as the application of technologies and processes to achieve a required level of confidence that software systems and services function in the... for specific projects, and specifies competency levels such as, for Assured Software Development, L5: analyze assurance technologies and contribute to the development of new ones.

  4. Threats to Bitcoin Software

    OpenAIRE

    Kateraas, Christian H

    2014-01-01

    Collect and analyse threat models to the Bitcoin ecosystem and its software, then create misuse cases, attack trees, and sequence diagrams of the threats. Create a malicious client from the gathered threat models. Once the development of the client is complete, test the client and evaluate its performance. From this, assess the security of the Bitcoin software.

  5. Software cost estimation

    NARCIS (Netherlands)

    Heemstra, F.J.

    1992-01-01

    The paper gives an overview of the state of the art of software cost estimation (SCE). The main questions to be answered in the paper are: (1) What are the reasons for overruns of budgets and planned durations? (2) What are the prerequisites for estimating? (3) How can software development effort be

  6. Cactus: Software Priorities

    Science.gov (United States)

    Hyde, Hartley

    2009-01-01

    The early eighties saw a period of rapid change in computing and teachers lost control of how they used computers in their classrooms. Software companies produced computer tools that looked so good that teachers forgot about writing their own classroom materials and happily purchased software--that offered much more than teachers needed--from…

  7. Selecting the Right Software.

    Science.gov (United States)

    Shearn, Joseph

    1987-01-01

    Selection of administrative software requires analyzing present needs and, to meet future needs, choosing software that will function with a more powerful computer system. Other important factors to include are a professional system demonstration, maintenance and training, and financial considerations that allow leasing or renting alternatives.…

  8. Software architecture evolution

    DEFF Research Database (Denmark)

    Barais, Olivier; Le Meur, Anne-Francoise; Duchien, Laurence

    2008-01-01

    Software architectures must frequently evolve to cope with changing requirements, and this evolution often implies integrating new concerns. Unfortunately, when the new concerns are crosscutting, existing architecture description languages provide little or no support for this kind of evolution. The software architect must modify multiple elements of the architecture manually, which risks introducing inconsistencies. This chapter provides an overview, comparison, and detailed treatment of the various state-of-the-art approaches to describing and evolving software architectures. Furthermore, we discuss one particular framework named TranSAT, which addresses the above problems of software architecture evolution. TranSAT provides a new element in the software architecture description language, called an architectural aspect, for describing new concerns and their integration into an existing...

  9. Software quality in 1997

    Energy Technology Data Exchange (ETDEWEB)

    Jones, C. [Software Productivity Research, Inc., Burlington, MA (United States)

    1997-11-01

    For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD) the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to a parity with manufacturing quality levels. Since software is on the critical path for many engineered products, and for internal business systems as well, the new approaches are starting to affect global competition and attract widespread international interest. It can be hypothesized that success in mastering software quality will be a key strategy for dominating global software markets in the 21st century.

  10. Revisiting software ecosystems research

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    2016-01-01

    'Software ecosystems' is argued to have first appeared as a concept more than 10 years ago, and software ecosystem research started to take off in 2010. We conduct a systematic literature study, based on the most extensive literature review in the field to date, with two primary aims: (a) to provide an updated overview of the field and (b) to document evolution in the field. In total, we analyze 231 papers from 2007 until 2014 and provide an overview of the research in software ecosystems. Our analysis reveals a field that is rapidly growing both in volume and empirical focus while becoming more mature... from evolving. We propose means for future research and the community to address them. Finally, our analysis shapes the view of the field as having evolved outside the existing definitions of software ecosystems, and we thus propose an update of the definition of software ecosystems.

  11. Software Requirements Management

    Directory of Open Access Journals (Sweden)

    Ali Altalbe

    2015-04-01

    Full Text Available Requirements are defined as the desired set of characteristics of a product or a service. In the world of software development, it is estimated that more than half of the failures are attributed to poor requirements management. This means that although the software functions correctly, it is not what the client requested. Modern software requirements management methodologies are available to reduce the occurrence of such incidents. This paper performs a review of the available literature in the area while tabulating possible methods of managing requirements. It also highlights the benefits of following a proper guideline for the requirements management task. With the introduction of specific software tools for the requirements management task, better software products are now being developed with fewer resources.

  12. Software licenses: Stay honest!

    CERN Document Server

    Computer Security Team

    2012-01-01

    Do you recall our article about copyright violation in the last issue of the CERN Bulletin, “Music, videos and the risk for CERN”? Now let’s be more precise. “Violating copyright” not only means the illegal download of music and videos, it also applies to software packages and applications.   Users must respect proprietary rights in compliance with the CERN Computing Rules (OC5). Not having legitimately obtained a program or the required licenses to run that software is not a minor offense. It violates CERN rules and puts the Organization at risk! Vendors deserve credit and compensation. Therefore, make sure that you have the right to use their software. In other words, you have bought the software via legitimate channels and use a valid and honestly obtained license. This also applies to “Shareware” and software under open licenses, which might also come with a cost. Usually, only “Freeware” is complete...

  13. Software safety hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lawrence, J.D. [Lawrence Livermore National Lab., CA (United States)

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper.

  14. Developing Software Simulations

    Directory of Open Access Journals (Sweden)

    Tom Hall

    2007-06-01

    Full Text Available Programs in education and business often require learners to develop and demonstrate competence in specified areas and then be able to effectively apply this knowledge. One method to aid in developing a skill set in these areas is through the use of software simulations. These simulations can be used for learner demonstrations of competencies in a specified course as well as a review of the basic skills at the beginning of subsequent courses. The first section of this paper discusses ToolBook, the software used to develop our software simulations. The second section discusses the process of developing software simulations. The third part discusses how we have used software simulations to assess student knowledge of research design by providing simulations that allow the student to practice using SPSS and Excel.

  15. DIVERSIFICATION IN SOFTWARE ENGINEERING

    Directory of Open Access Journals (Sweden)

    Er.Kirtesh Jailia,

    2010-06-01

    Full Text Available In this paper we examine the factors that have promoted the diversification of software process models. The intention is to understand more clearly the problem-solving process in software engineering and to find an efficient way to manage risk. A review of software process modeling is given first, followed by a discussion of process evaluation techniques. A taxonomy for categorizing process models, based on establishing decision criteria, is identified that can guide selection of the appropriate model from a set of alternatives on the basis of model characteristics and software project needs. We propose a model in this paper for dealing with diversification in software engineering.

  16. Trends in software testing

    CERN Document Server

    Mohanty, J; Balakrishnan, Arunkumar

    2017-01-01

    This book is focused on the advancements in the field of software testing and the innovative practices that the industry is adopting. Considering the widely varied nature of software testing, the book addresses contemporary aspects that are important for both academia and industry. There are dedicated chapters on seamless high-efficiency frameworks, automation on regression testing, software by search, and system evolution management. There are a host of mathematical models that are promising for software quality improvement by model-based testing. There are three chapters addressing this concern. Students and researchers in particular will find these chapters useful for their mathematical strength and rigor. Other topics covered include uncertainty in testing, software security testing, testing as a service, test technical debt (or test debt), disruption caused by digital advancement (social media, cloud computing, mobile application and data analytics), and challenges and benefits of outsourcing. The book w...

  17. AUTOMATED SOFTWARE DISTRIBUTION

    Directory of Open Access Journals (Sweden)

    J.J. Strasheim

    2012-01-01

    Full Text Available

    Automated distribution of computer software via electronic means in large corporate networks is growing in popularity. The relative importance of personal computer software, in financial and logistical terms, is described, and the developing need for automated software distribution explained. An actual comparative example of alternative software distribution strategies is presented and discussed, proving the viability of Electronic Software Distribution.

  19. Data structure and software engineering challenges and improvements

    CERN Document Server

    Antonakos, James L

    2011-01-01

    Data structure and software engineering is an integral part of computer science. This volume presents new approaches and methods to knowledge sharing, brain mapping, data integration, and data storage. The author describes how to manage an organization's business process and domain data and presents new software and hardware testing methods. The book introduces a game development framework used as a learning aid in a software engineering at the university level. It also features a review of social software engineering metrics and methods for processing business information. It explains how to

  20. Scientific Software Component Technology

    Energy Technology Data Exchange (ETDEWEB)

    Kohn, S.; Dykman, N.; Kumfert, G.; Smolinski, B.

    2000-02-16

    We are developing new software component technology for high-performance parallel scientific computing to address issues of complexity, re-use, and interoperability for laboratory software. Component technology enables cross-project code re-use, reduces software development costs, and provides additional simulation capabilities for massively parallel laboratory application codes. The success of our approach will be measured by its impact on DOE mathematical and scientific software efforts. Thus, we are collaborating closely with library developers and application scientists in the Common Component Architecture forum, the Equation Solver Interface forum, and other DOE mathematical software groups to gather requirements, write and adopt a variety of design specifications, and develop demonstration projects to validate our approach. Numerical simulation is essential to the science mission at the laboratory. However, it is becoming increasingly difficult to manage the complexity of modern simulation software. Computational scientists develop complex, three-dimensional, massively parallel, full-physics simulations that require the integration of diverse software packages written by outside development teams. Currently, the integration of a new software package, such as a new linear solver library, can require several months of effort. Current industry component technologies such as CORBA, JavaBeans, and COM have all been used successfully in the business domain to reduce software development costs and increase software quality. However, these existing industry component infrastructures will not scale to support massively parallel applications in science and engineering. In particular, they do not address issues related to high-performance parallel computing on ASCI-class machines, such as fast in-process connections between components, language interoperability for scientific languages such as Fortran, parallel data redistribution between components, and massively

  1. Holographic Software for Quantum Networks

    CERN Document Server

    Jaffe, Arthur; Wozniakowski, Alex

    2016-01-01

    We introduce diagrammatic protocols and holographic software for quantum information. We give a dictionary to translate between diagrammatic protocols and the usual algebraic protocols. In particular we describe the intuitive diagrammatic protocol for teleportation. We introduce the string Fourier transform $\\mathfrak{F}_{s}$ in quantum information, which gives a topological quantum computer. We explain why the string Fourier transform maps the zero particle state to the multiple-qudit resource state, which maximizes the entanglement entropy. We give a protocol to construct this $n$-qudit resource state $|Max \\rangle$, which uses minimal cost. We study Pauli $X,Y,Z$ matrices, and their relation with diagrammatic protocols. This work provides bridges between the new theory of planar para algebras and quantum information, especially in questions involving communication in quantum networks.
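
    The Pauli matrices the abstract studies are standard, and their algebra can be checked numerically. The NumPy sketch below verifies XY = iZ (and its cyclic permutations) and, for the qubit case only, builds a GHZ-type maximally entangled n-party state as a stand-in for the n-qudit resource state |Max>; the paper's qudit and planar-para-algebra machinery is not reproduced here.

        import numpy as np

        # Pauli matrices and the relation X @ Y = iZ (cyclically).
        X = np.array([[0, 1], [1, 0]], dtype=complex)
        Y = np.array([[0, -1j], [1j, 0]])
        Z = np.array([[1, 0], [0, -1]], dtype=complex)

        assert np.allclose(X @ Y, 1j * Z)
        assert np.allclose(Y @ Z, 1j * X)
        assert np.allclose(Z @ X, 1j * Y)

        # Qubit analog of a maximally entangled resource state:
        # the n-qubit GHZ state (|0...0> + |1...1>) / sqrt(2).
        n = 3
        ghz = np.zeros(2 ** n, dtype=complex)
        ghz[0] = ghz[-1] = 1 / np.sqrt(2)
        print(np.round(ghz.real, 3))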

  2. Biochemical software: Carbohydrates on Laboratory

    Directory of Open Access Journals (Sweden)

    D.N. Heidrich

    2005-07-01

    Full Text Available Educators around the world are being challenged to develop and design better and more effective strategies for student learning using a variety of modern resources. In the present work, an educational hypermedia software package was constructed as a support tool for biochemistry teaching. The occurrence, structure, main characteristics, and biological function of the biomolecule carbohydrates were presented through modules. The software was developed using concept maps, ISIS-Draw, and the FLASH-MX animation program. The chapter "Carbohydrates on Laboratory" illustrates experimental methods of carbohydrate characterization through the animation of a laboratory scene. The subject was developed showing reactions such as Bial, Benedict, Seliwanoff, Barfoed, phenol-sulphuric, and iodine, as well as enzymatic reactions with glucose oxidase and amylase. There are also links to short texts to help the understanding of the contents and principles of laboratory practice, as well as background reactions. Application of the software to undergraduate students and high school teachers showed excellent acceptance. All of them considered the software a very good learning tool. Both teachers and students welcomed this program as it is more flexible and allows learning at a more individual rhythm. In addition, application of the software is suitable for more effective learning and is less expensive than conventional experimental teaching.

  3. Design and Implementation of "World Map Changzhou" Mobile Software Products Based on Intelligent Terminals

    Institute of Scientific and Technical Information of China (English)

    宋法奇; 刘波; 蔡勇

    2016-01-01

    The "Tianditu Changzhou" mobile map software products support the iOS and Android platforms in three versions: iPad, iPhone, and Android tablet. They support offline maps and local caching, and provide iOS and Flex SDKs for secondary development. The software aggregates life-information services such as route planning, group buying, and real estate. The server side is deployed in the Changzhou cloud computing application service center.

  4. Software for producing trichromatic images in astronomy

    CERN Document Server

    Morel, S; Morel, Sebastien; Davoust, Emmanuel

    1995-01-01

    We present a software package for combining three monochromatic images of an astronomical object into a trichromatic color image. We first discuss the meaning of "true" colors in astronomical images. We then describe the different steps of our method, choosing the relevant dynamic intensity range in each filter, inventorying the different colors, optimizing the color map, modifying the balance of colors, and enhancing contrasts at low intensity levels. While the first steps are automatic, the last two are interactive.
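
    The first step the authors list, choosing a dynamic intensity range per filter, amounts to clipping each monochromatic image to a chosen [lo, hi] interval and rescaling before stacking the three channels into RGB. A bare-bones NumPy sketch, with random stand-in images and placeholder ranges (the package's color inventory and color-map optimization steps are omitted):

        import numpy as np

        def to_channel(img, lo, hi):
            """Clip to [lo, hi] and rescale linearly onto [0, 1]."""
            return np.clip((img - lo) / (hi - lo), 0.0, 1.0)

        rng = np.random.default_rng(0)
        bands = [rng.gamma(2.0, size=(64, 64)) for _ in range(3)]  # stand-ins

        rgb = np.dstack([to_channel(band, 0.5, 6.0) for band in bands])
        print(rgb.shape, rgb.min(), rgb.max())   # (64, 64, 3), values in [0, 1]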

  5. Characterization of Morphology using MAMA Software

    Energy Technology Data Exchange (ETDEWEB)

    Gravelle, Julie [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-02

    The MAMA (Morphological Analysis for Material Attribution) software was developed at Los Alamos National Laboratory, funded through the National Technical Nuclear Forensics Center in the Department of Homeland Security. The software allows images to be analysed and quantified. The largest project I worked on was to quantify images of plutonium oxides and ammonium diuranates prepared by the group and to provide analyses of the particles in each sample. Images were quantified through MAMA, together with a color analysis, a lexicon description, and powder X-ray diffraction. Through this we were able to see visual differences between some of the syntheses. An additional project was to revise the manual for MAMA to help streamline training and provide useful tips so that users become acclimated to the software more quickly. The third project investigated expanding the scope of MAMA and finding a statistically relevant baseline for the particulates through the analysis of maps in the software, using known measurements to estimate the error associated with the software. During this internship, I worked on several different projects dealing with the MAMA software. The revision of the user manual for the MAMA software was the first project I was able to work and collaborate on. I first learned how to use the software by getting instruction from a skilled user at the laboratory, Dan Schwartz, and by using the existing user manual and examples. After becoming accustomed to the program, I went over the manual to correct and change items that were not as useful or descriptive as they could have been. I also added tips that I learned as I explored the software. The updated manual was also worked on by several others who have been developing the program. The goal of these revisions was to ensure that the most concise and simple directions to the software were available to future users. By incorporating tricks and shortcuts that I discovered and picked up

  6. On the Role of Software Quality Management in Software Process Improvement

    DEFF Research Database (Denmark)

    Wiedemann Jacobsen, Jan; Kuhrmann, Marco; Münch, Jürgen

    2016-01-01

    Software Process Improvement (SPI) programs have been implemented, inter alia, to improve quality and speed of software development. SPI addresses many aspects ranging from individual developer skills to entire organizations. It comprises, for instance, the optimization of specific activities...... in the software lifecycle as well as the creation of organizational awareness and project culture. In the course of conducting a systematic mapping study on the state-of-the-art in SPI from a general perspective, we observed Software Quality Management (SQM) being of certain relevance in SPI programs...... and to develop an initial picture of how these topics are addressed in SPI. Our findings show a fairly pragmatic contribution set in which different solutions are proposed, discussed, and evaluated. Among others, our findings indicate a certain reluctance towards standard quality or (test) maturity models...

  8. Global Software Engineering

    DEFF Research Database (Denmark)

    Ebert, Christof; Kuhrmann, Marco; Prikladnicki, Rafael

    2016-01-01

    SOFTWARE, LIKE ALL industry products, is the result of complex multinational supply chains with many partners from concept to development to production and maintenance. Global software engineering (GSE), IT outsourcing, and business process outsourcing during the past decade have shown growth...... rates of 10 to 20 percent per year. This instalment of Practitioner’s Digest summarizes experiences and guidance from industry to facilitate knowledge and technology transfer for GSE. It’s based on industry feedback from the annual IEEE International Conference on Global Software Engineering, which had...

  9. Management systems and software.

    Science.gov (United States)

    Levin, R P

    2001-02-01

    To ensure that your software optimizes your practice management systems, design systems that allow you and your team to achieve your goals and provide high levels of quality dentistry and customer service to your patients. Then use your current software system or purchase a new practice management software program that will allow your practice to operate within the guidelines of the systems which you have established. You can be certain that taking these steps will allow you to practice dentistry with maximum profitability and minimum stress for the remainder of your career.

  10. Advanced fingerprint verification software

    Science.gov (United States)

    Baradarani, A.; Taylor, J. R. B.; Severin, F.; Maev, R. Gr.

    2016-05-01

    We have developed a fingerprint software package that can be used in a wide range of applications, from law enforcement to public and private security systems, and to personal devices such as laptops, vehicles, and door-locks. The software and processing units are a unique implementation of new and sophisticated algorithms that compete with the current best systems in the world. Development of the software package has been in line with the third generation of our ultrasonic fingerprinting machine [1]. Solid and robust performance is achieved in the presence of misplaced and low-quality fingerprints.

  11. Software takes command

    CERN Document Server

    Manovich, Lev

    2013-01-01

    Software has replaced a diverse array of physical, mechanical, and electronic technologies used before the 21st century to create, store, distribute and interact with cultural artifacts. It has become our interface to the world, to others, to our memory and our imagination - a universal language through which the world speaks, and a universal engine on which the world runs. What electricity and the combustion engine were to the early 20th century, software is to the early 21st century. Offering the first theoretical and historical account of software for media authoring and its effects on the prac

  12. CNEOST Control Software System

    Science.gov (United States)

    Wang, X.; Zhao, H. B.; Xia, Y.; Lu, H.; Li, B.

    2015-03-01

    In 2013, CNEOST (China Near Earth Object Survey Telescope) adapted its hardware system for the new CCD camera. Based on the new system architecture, the control software was re-designed and implemented. The software system adopts a message-passing mechanism via the WebSocket protocol, improving its flexibility, expansibility, and scalability. The user interface, built with responsive web design, enables remote operation from both desktop and mobile devices. The stable operation of the software system has greatly enhanced operating efficiency while reducing complexity, and has also been a successful trial for the future system design of telescopes and the telescope cloud.
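
    A hedged illustration of the message-passing style described: one JSON command sent over a WebSocket connection using the third-party websockets package. The endpoint and message schema are hypothetical; the record does not specify CNEOST's actual protocol fields.

        import asyncio
        import json
        import websockets  # third-party: pip install websockets

        async def send_command(uri, device, action, **params):
            """Send one JSON-encoded control message and await the reply."""
            async with websockets.connect(uri) as ws:
                await ws.send(json.dumps({"device": device, "action": action,
                                          "params": params}))
                return json.loads(await ws.recv())

        # Hypothetical endpoint and schema, for illustration only:
        # asyncio.run(send_command("ws://localhost:8765", "camera", "expose", seconds=30))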

  13. Speakeasy software development

    Science.gov (United States)

    Baskinger, Patricia J.; Ozarow, Larry; Chruscicki, Mary C.

    1993-08-01

    The Speakeasy Software Development Project had three primary objectives. The first objective was to perform Independent Verification and Validation (IV & V) of the software and documentation associated with the signal processor being developed by Hazeltine and TRW under the Speakeasy program. The IV & V task also included an analysis and assessment of the ability of the signal processor software to provide LPI communications functions. The second objective was to assist in the enhancement and modification of an existing Rome Lab signal processor workstation. Finally, TASC developed project management support tools and provided program management support to the Speakeasy Program Office.

  14. Error Free Software

    Science.gov (United States)

    1985-01-01

    A mathematical theory for the development of "higher order" software to catch computer mistakes resulted from a Johnson Space Center contract for Apollo spacecraft navigation. Two women who were involved in the project formed Higher Order Software, Inc. to develop and market the system of error analysis and correction. They designed software that is logically error-free and that, in one instance, was found to increase productivity by 600%. USE.IT defines its objectives using AXES -- a user can write in English and the system converts it to computer languages. It is employed by several large corporations.

  15. Guide to software export

    CERN Document Server

    Philips, Roger A

    2014-01-01

    An ideal reference source for CEOs, marketing and sales managers, sales consultants, and students of international marketing, Guide to Software Export provides a step-by-step approach to initiating or expanding international software sales. It teaches you how to critically examine your candidate product for exportability; how to find distributors, agents, and resellers abroad; how to identify the best distribution structure for export; and much, much more! Not content with providing just the guidelines for setting up, expanding, and managing your international sales channels, Guide to Software

  16. Software quality assurance handbook

    Energy Technology Data Exchange (ETDEWEB)

    1990-09-01

    There are two important reasons for Software Quality Assurance (SQA) at Allied-Signal Inc., Kansas City Division (KCD): First, the benefits from SQA make good business sense. Second, the Department of Energy has requested SQA. This handbook is one of the first steps in a plant-wide implementation of Software Quality Assurance at KCD. The handbook has two main purposes. The first is to provide information that you will need to perform software quality assurance activities. The second is to provide a common thread to unify the approach to SQA at KCD. 2 figs.

  17. Calidad de componentes software

    OpenAIRE

    2010-01-01

    In recent years there has been a growing tendency for organizations to develop their software systems by combining components, rather than building those systems from scratch. This trend is due to several factors, notably: the need for organizations to reduce the cost and time devoted to developing software systems; the growth of the software component market; and the reduction of the distance bet...

  18. Beginning software engineering

    CERN Document Server

    Stephens, Rod

    2015-01-01

    Beginning Software Engineering demystifies the software engineering methodologies and techniques that professional developers use to design and build robust, efficient, and consistently reliable software. Free of jargon and assuming no previous programming, development, or management experience, this accessible guide explains important concepts and techniques that can be applied to any programming language. Each chapter ends with exercises that let you test your understanding and help you elaborate on the chapter's main concepts. Everything you need to understand waterfall, Sashimi, agile, RAD, Scrum, Kanban, Extreme Programming, and many other development models is inside!

  19. Leveraging software architectures to guide and verify the development of sense/compute/control applications

    DEFF Research Database (Denmark)

    Cassou, Damien; Balland, Emilie; Consel, Charles;

    2011-01-01

    A software architecture describes the structure of a computing system by specifying software components and their interactions. Mapping a software architecture to an implementation is a well-known challenge. A key element of this mapping is the architecture’s description of the data and control-f...... verifications. We instantiate our approach in an architecture description language for Sense/Compute/Control applications, and describe associated compilation and verification strategies....

  20. Hail Size Distribution Mapping

    Science.gov (United States)

    2008-01-01

    A 3-D weather radar visualization software program was developed and implemented as part of an experimental Launch Pad 39 Hail Monitor System. 3DRadPlot, a radar plotting program, is one of several software modules that form building blocks of the hail data processing and analysis system (the complete software processing system under development). The spatial and temporal mapping algorithms were originally developed through research at the University of Central Florida, funded by NASA's Tropical Rainfall Measurement Mission (TRMM), where the goal was to merge National Weather Service (NWS) Next-Generation Weather Radar (NEXRAD) volume reflectivity data with drop size distribution data acquired from a cluster of raindrop disdrometers. In this current work, we adapted these algorithms to process data from a cluster of hail disdrometers positioned around Launch Pads 39A or 39B, along with the corresponding NWS radar data. Radar data from all NWS NEXRAD sites is archived at the National Climatic Data Center (NCDC), where it can be readily accessed. 3DRadPlot plots Level III reflectivity data at four scan elevations (the software is available from Open Channel Software). By using spatial and temporal interpolation/extrapolation based on hydrometeor fall dynamics, we can merge the hail disdrometer array data, coupled with local Weather Surveillance Radar-1988, Doppler (WSR-88D) radial velocity and reflectivity data, into a 4-D (3-D space and time) picture of hail size distributions. Hail flux maps can then be generated and used for damage prediction and assessment over specific surfaces corresponding to structures within the disdrometer array volume. Immediately following a hail storm, specific damage areas and degree of damage can be identified for inspection crews.
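
    The temporal part of the interpolation described above can be sketched as follows: hail observed aloft at each volume scan is assigned a ground arrival time from an assumed constant fall speed, and reflectivity is then interpolated to the disdrometer's sampling instants. This is a simplified stand-in for the system's full fall-dynamics model; all numbers are illustrative.

        import numpy as np

        def ground_arrival_time(scan_time_s, beam_height_m, fall_speed_ms=20.0):
            """Time at which hail observed aloft should reach the surface,
            assuming a constant terminal fall speed (a simplifying assumption)."""
            return scan_time_s + beam_height_m / fall_speed_ms

        # Reflectivity history above one disdrometer, one value per volume scan.
        scan_times = np.array([0.0, 300.0, 600.0, 900.0])   # s
        dbz_aloft  = np.array([35.0, 48.0, 55.0, 42.0])     # dBZ at 2 km altitude
        arrival    = ground_arrival_time(scan_times, beam_height_m=2000.0)

        # Interpolate to the disdrometer's sampling instants on the ground.
        ground_t = np.arange(0.0, 1000.0, 60.0)
        dbz_at_ground = np.interp(ground_t, arrival, dbz_aloft)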

  1. Planetary Geologic Mapping Handbook - 2009

    Science.gov (United States)

    Tanaka, K. L.; Skinner, J. A.; Hare, T. M.

    2009-01-01

    Geologic maps present, in an historical context, fundamental syntheses of interpretations of the materials, landforms, structures, and processes that characterize planetary surfaces and shallow subsurfaces (e.g., Varnes, 1974). Such maps also provide a contextual framework for summarizing and evaluating thematic research for a given region or body. In planetary exploration, for example, geologic maps are used for specialized investigations such as targeting regions of interest for data collection and for characterizing sites for landed missions. Whereas most modern terrestrial geologic maps are constructed from regional views provided by remote sensing data and supplemented in detail by field-based observations and measurements, planetary maps have been largely based on analyses of orbital photography. For planetary bodies in particular, geologic maps commonly represent a snapshot of a surface, because they are based on available information at a time when new data are still being acquired. Thus the field of planetary geologic mapping has been evolving rapidly to embrace the use of new data and modern technology and to accommodate the growing needs of planetary exploration. Planetary geologic maps have been published by the U.S. Geological Survey (USGS) since 1962 (Hackman, 1962). Over this time, numerous maps of several planetary bodies have been prepared at a variety of scales and projections using the best available image and topographic bases. Early geologic map bases commonly consisted of hand-mosaicked photographs or airbrushed shaded-relief views and geologic linework was manually drafted using mylar bases and ink drafting pens. Map publishing required a tedious process of scribing, color peel-coat preparation, typesetting, and photo-laboratory work. Beginning in the 1990s, inexpensive computing, display capability and user-friendly illustration software allowed maps to be drawn using digital tools rather than pen and ink, and mylar bases became obsolete

  2. Is there more valuable information in PWI datasets for a voxel-wise acute ischemic stroke tissue outcome prediction than what is represented by typical perfusion maps?

    Science.gov (United States)

    Forkert, Nils Daniel; Siemonsen, Susanne; Dalski, Michael; Verleger, Tobias; Kemmling, Andre; Fiehler, Jens

    2014-03-01

    Acute ischemic stroke is a leading cause of death and disability in industrialized nations. In the case of an acute ischemic stroke, the prediction of the future tissue outcome is of high interest for clinicians, as it can be used to support therapy decision making. Within this context, it has already been shown that voxel-wise multi-parametric tissue outcome prediction leads to more promising results than single-channel perfusion map thresholding. Most previously published multi-parametric predictions employ information from perfusion maps derived from perfusion-weighted MRI (PWI) together with other image sequences such as diffusion-weighted MRI. However, it remains unclear whether the typically calculated perfusion maps really include all the information in the PWI dataset that is valuable for an optimal tissue outcome prediction. To investigate this problem in more detail, two different methods to predict tissue outcome using a k-nearest-neighbor approach were developed in this work and evaluated on 18 datasets of acute stroke patients with known tissue outcome. The first method integrates apparent diffusion coefficient and perfusion parameter (Tmax, MTT, CBV, CBF) information for the voxel-wise prediction, while the second method also employs apparent diffusion coefficient information but uses the complete perfusion information, in terms of the voxel-wise residue functions, instead of the perfusion parameter maps. Overall, the comparison of the results of the two prediction methods for the 18 patients using a leave-one-out cross validation revealed no considerable differences. Quantitatively, the parameter-based prediction of tissue outcome led to a mean Dice coefficient of 0.474, while the prediction using the residue functions led to a mean Dice coefficient of 0.461. Thus, it may be concluded from the results of this study that the perfusion parameter maps typically derived from PWI datasets include all of the information relevant for voxel-wise tissue outcome prediction.
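
    A minimal sketch of the parameter-based variant: each voxel is a feature vector of ADC plus the four perfusion parameters, a k-nearest-neighbor classifier predicts infarction, and the result is scored with the Dice coefficient. Synthetic arrays stand in for the patient data; the features and classifier follow the record, everything else (including k) is illustrative.

        import numpy as np
        from sklearn.neighbors import KNeighborsClassifier

        def dice(a, b):
            """Dice coefficient between two binary masks."""
            a, b = a.astype(bool), b.astype(bool)
            return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

        # Rows are voxels; columns are ADC, Tmax, MTT, CBV, CBF (synthetic here).
        rng = np.random.default_rng(1)
        X_train, y_train = rng.normal(size=(5000, 5)), rng.integers(0, 2, 5000)
        X_test,  y_test  = rng.normal(size=(1000, 5)), rng.integers(0, 2, 1000)

        knn = KNeighborsClassifier(n_neighbors=15).fit(X_train, y_train)
        pred = knn.predict(X_test)            # voxel-wise infarct / no-infarct
        print("Dice:", dice(pred, y_test))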

  3. "IBSAR" Software 4.0

    Directory of Open Access Journals (Sweden)

    2004-06-01

    Full Text Available A review of the Arabic software "IBSAR", designed to help blind users operate the computer: the software pronounces the commands and the contents of screens and applications browsed by users. The review includes a general introduction to the software, its components and commands, system requirements, and its functions with the Windows operating system and Microsoft Word.

  4. A Novel Software Evolution Model Based on Software Networks

    Science.gov (United States)

    Pan, Weifeng; Li, Bing; Ma, Yutao; Liu, Jing

    Many published papers have analyzed the forming mechanisms and evolution laws of OO software systems from the perspectives of software reuse, software patterns, etc. There have, however, been few models so far built merely on software components such as methods and classes and their interactions. In this paper, a novel Software Evolution Model based on Software Networks (called SEM-SN) is proposed. It uses a software network at the class level to represent software systems, and uses the software network’s dynamical generating process to simulate activities in the real software development process, such as the dynamic creation of new classes and their dynamic interactions with existing classes. It also introduces the concept of node/edge ageing to describe the decay of classes with time. Empirical results on eight open-source Object-Oriented (OO) software systems demonstrate that SEM-SN roughly describes the evolution process of software systems and the emergence of their complex network characteristics.
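
    The generating process can be sketched as below: a class-level network grows by degree-proportional attachment damped by an exponential ageing factor, mimicking new classes dynamically interacting with existing classes plus node ageing. The exact SEM-SN rules are not given in the record, so this is an illustrative approximation.

        import math
        import random

        def grow_network(n_classes, m_edges=2, decay=0.01):
            """Grow a class-level software network: each new class attaches to
            existing classes with probability proportional to degree, damped by
            an exponential ageing factor (a node's id doubles as its creation
            time). Illustrative only -- not the exact SEM-SN rules."""
            degree, edges = {0: 0}, []
            for t in range(1, n_classes):
                weights = {v: (degree[v] + 1) * math.exp(-decay * (t - v))
                           for v in degree}
                total = sum(weights.values())
                targets = set()
                while len(targets) < min(m_edges, len(degree)):
                    r, acc = random.uniform(0, total), 0.0
                    for v, w in weights.items():
                        acc += w
                        if acc >= r:
                            targets.add(v)
                            break
                for v in targets:
                    edges.append((t, v))   # new class t uses existing class v
                    degree[v] += 1
                degree[t] = len(targets)
            return edges

        edges = grow_network(500)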

  5. Map Projection

    CERN Document Server

    Ghaderpour, Ebrahim

    2014-01-01

    In this paper, we introduce some known map projections from a model of the Earth to a flat sheet of paper or map and derive the plotting equations for these projections. The first fundamental form and the Gaussian fundamental quantities are defined and applied to obtain the plotting equations and distortions in length, shape and size for some of these map projections.
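
    As a concrete instance of the plotting equations the paper derives, the conformal cylindrical (Mercator) case for a sphere of radius R and central meridian lambda_0 is the standard textbook result, written here in LaTeX:

        % Mercator plotting equations (conformal cylindrical case); the scale
        % factor k follows from the first fundamental form and quantifies the
        % length distortion discussed in the paper.
        \begin{align*}
          x &= R\,(\lambda - \lambda_0), \\
          y &= R \ln \tan\!\left(\frac{\pi}{4} + \frac{\varphi}{2}\right), \\
          k &= \sec\varphi .
        \end{align*}

    Because k grows without bound toward the poles, shapes are preserved locally while areas are increasingly exaggerated at high latitudes.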

  7. Project Portfolio Management Software

    Directory of Open Access Journals (Sweden)

    Paul POCATILU

    2006-01-01

    Full Text Available In order to design a methodology for the development of project portfolio management (PPM) applications, the existing applications have to be studied. This paper describes the main characteristics of the leading project portfolio management software applications.

  8. Test af Software

    DEFF Research Database (Denmark)

    This document constitutes the final report for the network collaboration "Testnet", carried out in the period 1 April 2006 to 31 December 2008. The network deals primarily with topics in the testing of embedded and technical software, but a number of examples of problems and solutions related to the testing of administrative software are also included. The report is divided into the following 3 parts: Overview, where we summarize the network's purpose, activities, and results, outline the state of the art in software testing, and mention that CISS and the network are taking new initiatives. The Network: purpose, participants, and topics treated at...... a range of Danish software, electronics, and IT companies.

  9. Tier2 Submit Software

    Science.gov (United States)

    Download this tool for Windows or Mac, which helps facilities prepare a Tier II electronic chemical inventory report. The data can also be exported into the CAMEOfm (Computer-Aided Management of Emergency Operations) emergency planning software.

  10. ACS: ALMA Common Software

    Science.gov (United States)

    Chiozzi, Gianluca; Šekoranja, Matej

    2013-02-01

    ALMA Common Software (ACS) provides a software infrastructure common to all ALMA partners and consists of a documented collection of common patterns and components which implement those patterns. The heart of ACS is based on a distributed Component-Container model, with ACS Components implemented as CORBA objects in any of the supported programming languages. ACS provides common CORBA-based services such as logging, error and alarm management, configuration database and lifecycle management. Although designed for ALMA, ACS can be and is being used in other control systems and distributed software projects, since it implements proven design patterns using state-of-the-art, reliable technology. Through the use of well-known standard constructs and components, it also allows team members who are not authors of ACS to easily understand the architecture of software modules, making maintenance affordable even on a very large project.

  11. Spreadsheet Auditing Software

    CERN Document Server

    Nixon, David

    2010-01-01

    It is now widely accepted that errors in spreadsheets are both common and potentially dangerous. Further research has taken place to investigate how frequently these errors occur, what impact they have, how the risk of spreadsheet errors can be reduced by following spreadsheet design guidelines and methodologies, and how effective auditing of a spreadsheet is in the detection of these errors. However, little research exists to establish the usefulness of software tools in the auditing of spreadsheets. This paper documents and tests office software tools designed to assist in the audit of spreadsheets. The test was designed to identify the success of software tools in detecting different types of errors, to identify how the software tools assist the auditor and to determine the usefulness of the tools.

  13. Software For Genetic Algorithms

    Science.gov (United States)

    Wang, Lui; Bayer, Steve E.

    1992-01-01

    The SPLICER computer program is a genetic-algorithm software tool used to solve search and optimization problems. It provides the underlying framework and structure for building genetic-algorithm application programs. Written in Think C.
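
    SPLICER's own Think C sources are not shown in the record; the following Python sketch only illustrates the framework idea it describes: a generic genetic-algorithm engine into which an application plugs its fitness function.

        import random

        def genetic_search(fitness, n_bits=20, pop=50, gens=100, p_mut=0.01):
            """Generic GA engine: the caller supplies a fitness function,
            echoing SPLICER's framework-plus-application structure."""
            population = [[random.randint(0, 1) for _ in range(n_bits)]
                          for _ in range(pop)]
            for _ in range(gens):
                parents = sorted(population, key=fitness, reverse=True)[: pop // 2]
                children = []
                while len(children) < pop:
                    a, b = random.sample(parents, 2)
                    cut = random.randrange(1, n_bits)
                    child = a[:cut] + b[cut:]                                # crossover
                    child = [bit ^ (random.random() < p_mut) for bit in child]  # mutation
                    children.append(child)
                population = children
            return max(population, key=fitness)

        best = genetic_search(sum)   # toy objective: maximize the number of 1-bits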

  14. Computer Center: Software Review.

    Science.gov (United States)

    Duhrkopf, Richard, Ed.; Belshe, John F., Ed.

    1988-01-01

    Reviews a software package, "Mitosis-Meiosis," available for Apple II or IBM computers with colorgraphics capabilities. Describes the documentation, presentation and flexibility of the program. Rates the program based on graphics and usability in a biology classroom. (CW)

  15. Error-Free Software

    Science.gov (United States)

    1989-01-01

    001 is an integrated tool suite for automatically developing ultra-reliable models, simulations and software systems. Developed and marketed by Hamilton Technologies, Inc. (HTI), it has been applied in engineering, manufacturing, banking and software tools development. The software provides the ability to simplify the complex. A system developed with 001 can be a prototype or fully developed with production-quality code. It is free of interface errors, consistent, logically complete and has no data or control flow errors. Systems can be designed, developed and maintained with maximum productivity. Margaret Hamilton, President of Hamilton Technologies, also directed the research and development of USE.IT, an earlier product which was the first computer-aided software engineering product in the industry to concentrate on automatically supporting the development of an ultra-reliable system throughout its life cycle. Both products originated in NASA technology developed under a Johnson Space Center contract.

  16. Collaborative software development

    NARCIS (Netherlands)

    Jonge, M. de; Visser, E.; Visser, J.M.W.

    2001-01-01

    We present an approach to collaborative software development where obtaining components and contributing components across organizational boundaries are explicit phases in the development process. A lightweight generative infrastructure supports this approach with an online package base, and several

  17. Core Flight Software Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The mission of the CFS project is to provide reusable software in support of human space exploration programs.   The top-level technical approach to...

  18. TMT common software update

    Science.gov (United States)

    Gillies, Kim; Brighton, Allan; Buur, Hanne

    2016-08-01

    TMT Common Software (CSW) consists of software services and library code that are used by developers to create the subsystems and components that participate in the software system. CSW also defines the types of components that can be constructed and their functional roles in the software system. TMT CSW has recently passed its preliminary design review. The unique features of CSW include its use of multiple open-source products as the basis for services, and an approach that works to reduce the amount of CSW-provided infrastructure code. Considerable prototyping was completed during this phase to mitigate risk, with results that demonstrate the validity of this design approach and of the selected service implementation products. This paper describes the latest design of TMT CSW, its key features, and results from the prototyping effort.

  19. Advanced Software Protection Now

    CERN Document Server

    Bendersky, Diego; Notarfrancesco, Luciano; Sarraute, Carlos; Waissbein, Ariel

    2010-01-01

    Software digital rights management is a pressing need for the software development industry and remains unsolved, as no practical solutions have been acclaimed as successful by the industry. We introduce a novel software-protection method, fully implemented with today's technologies, that provides traitor tracing and license enforcement and requires neither additional hardware nor inter-connectivity. Our work benefits from the use of secure triggers, a cryptographic primitive that is secure assuming the existence of an IND-CPA secure block cipher. Using our framework, developers may insert license checks and fingerprints, and obfuscate the code using secure triggers. As a result, this raises the cost for software analysis tools to detect and modify the protection mechanisms, thus raising the complexity of cracking the system.
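
    A toy illustration of the secure-trigger idea: a payload stays encrypted under a key derived from the triggering input, and only a hash of that input ships with the binary, so static analysis reveals neither the trigger nor the payload. This stdlib-only sketch (SHA-256 in counter mode as the cipher) is illustrative and is not the paper's IND-CPA construction.

        import hashlib
        from itertools import count

        def _keystream(key: bytes, n: int) -> bytes:
            """SHA-256 in counter mode as a toy stream cipher (illustration only)."""
            out = b""
            for i in count():
                if len(out) >= n:
                    return out[:n]
                out += hashlib.sha256(key + i.to_bytes(8, "big")).digest()

        def make_trigger(trigger_input: bytes, payload: bytes):
            """Encrypt a payload under a key derived from the triggering input;
            only a hash of the input is stored with the program."""
            tag = hashlib.sha256(b"tag" + trigger_input).digest()
            key = hashlib.sha256(b"key" + trigger_input).digest()
            blob = bytes(a ^ b for a, b in zip(payload, _keystream(key, len(payload))))
            return tag, blob

        def fire(trigger_input: bytes, tag: bytes, blob: bytes):
            if hashlib.sha256(b"tag" + trigger_input).digest() != tag:
                return None                   # wrong input: payload stays opaque
            key = hashlib.sha256(b"key" + trigger_input).digest()
            return bytes(a ^ b for a, b in zip(blob, _keystream(key, len(blob))))

        tag, blob = make_trigger(b"correct horse", b"hidden feature unlocked")
        print(fire(b"wrong input", tag, blob), fire(b"correct horse", tag, blob))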

  20. Banking Software Applications Security

    Directory of Open Access Journals (Sweden)

    Ioan Alexandru Bubu

    2015-03-01

    Full Text Available Computer software products are among the most complex artifacts, if not the most complex artifacts, mankind has created. Securing those artifacts against intelligent attackers who try to exploit flaws in software design and construction is a great challenge too. The purpose of this paper is to introduce a secure alternative to banking software applications that are currently in use. This new application aims to cover most of the well-known vulnerabilities that plague the majority of current software. First we will take a quick look at current security methods that are in use, and a few known vulnerabilities. After this, we will discuss the security measures implemented in my application, and finally, we will discuss the results of implementing them.

  1. Software citation principles

    Directory of Open Access Journals (Sweden)

    Arfon M. Smith

    2016-09-01

    Full Text Available Software is a critical part of modern research and yet there is little support across the scholarly ecosystem for its acknowledgement and citation. Inspired by the activities of the FORCE11 working group focused on data citation, this document summarizes the recommendations of the FORCE11 Software Citation Working Group and its activities between June 2015 and April 2016. Based on a review of existing community practices, the goal of the working group was to produce a consolidated set of citation principles that may encourage broad adoption of a consistent policy for software citation across disciplines and venues. Our work is presented here as a set of software citation principles, a discussion of the motivations for developing the principles, reviews of existing community practice, and a discussion of the requirements these principles would place upon different stakeholders. Working examples and possible technical solutions for how these principles can be implemented will be discussed in a separate paper.
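
    One concrete way such principles surface in practice is a machine-readable citation entry for a software release. The biblatex @software entry below is an illustrative example with placeholder metadata; it is not taken from the principles paper itself.

        % Illustrative citable software entry using biblatex's @software type.
        @software{doe2016simtool,
          author  = {Doe, Jane},
          title   = {SimTool: A Simulation Toolkit},
          version = {1.2.0},
          year    = {2016},
          doi     = {10.5281/zenodo.0000000},
          url     = {https://github.com/example/simtool},
        }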

  2. eSoftwareList

    Data.gov (United States)

    US Agency for International Development — USAID Software Database reporting tool created in Oracle Application Express (APEX). This version provides read only access to a database view of the JIRA SAR...

  3. Astronomical Software Directory Service

    Science.gov (United States)

    Hanisch, Robert J.; Payne, Harry; Hayes, Jeffrey

    1997-01-01

    With the support of NASA's Astrophysics Data Program (NRA 92-OSSA-15), we have developed the Astronomical Software Directory Service (ASDS): a distributed, searchable, WWW-based database of software packages and their related documentation. ASDS provides integrated access to 56 astronomical software packages, with more than 16,000 URLs indexed for full-text searching. Users are performing about 400 searches per month. A new aspect of our service is the inclusion of telescope and instrumentation manuals, which prompted us to change the name to the Astronomical Software and Documentation Service. ASDS was originally conceived to serve two purposes: to provide a useful Internet service in an area of expertise of the investigators (astronomical software), and as a research project to investigate various architectures for searching through a set of documents distributed across the Internet. Two of the co-investigators were then installing and maintaining astronomical software as their primary job responsibility. We felt that a service which incorporated our experience in this area would be more useful than a straightforward listing of software packages. The original concept was for a service based on the client/server model, which would function as a directory/referral service rather than as an archive. For performing the searches, we began our investigation with a decision to evaluate the Isite software from the Center for Networked Information Discovery and Retrieval (CNIDR). This software was intended as a replacement for Wide-Area Information Service (WAIS), a client/server technology for performing full-text searches through a set of documents. Isite had some additional features that we considered attractive, and we enjoyed the cooperation of the Isite developers, who were happy to have ASDS as a demonstration project. We ended up staying with the software throughout the project, making modifications to take advantage of new features as they came along, as well as

  4. Global software development

    DEFF Research Database (Denmark)

    Matthiesen, Stina

    2016-01-01

    This overview presents the mid stages of my doctoral research-based on ethnographic work conducted in IT companies in India and in Denmark-on collaborative work within global software development (GSD). In the following I briefly introduce how this research seeks to spark a debate in CSCW...... by challenging contemporary ideals about software development outsourcing through the exploration of the multiplicities and asymmetric dynamics inherent in the collaborative work of GSD....

  5. Biological Imaging Software Tools

    Science.gov (United States)

    Eliceiri, Kevin W.; Berthold, Michael R.; Goldberg, Ilya G.; Ibáñez, Luis; Manjunath, B.S.; Martone, Maryann E.; Murphy, Robert F.; Peng, Hanchuan; Plant, Anne L.; Roysam, Badrinath; Stuurman, Nico; Swedlow, Jason R.; Tomancak, Pavel; Carpenter, Anne E.

    2013-01-01

    Few technologies are more widespread in modern biological laboratories than imaging. Recent advances in optical technologies and instrumentation are providing hitherto unimagined capabilities. Almost all these advances have required the development of software to enable the acquisition, management, analysis, and visualization of the imaging data. We review each computational step that biologists encounter when dealing with digital images, the challenges in that domain, and the overall status of available software for bioimage informatics, focusing on open source options. PMID:22743775

  6. Engineering and Software Engineering

    Science.gov (United States)

    Jackson, Michael

    The phrase ‘software engineering' has many meanings. One central meaning is the reliable development of dependable computer-based systems, especially those for critical applications. This is not a solved problem. Failures in software development have played a large part in many fatalities and in huge economic losses. While some of these failures may be attributable to programming errors in the narrowest sense—a program's failure to satisfy a given formal specification—there is good reason to think that most of them have other roots. These roots are located in the problem of software engineering rather than in the problem of program correctness. The famous 1968 conference was motivated by the belief that software development should be based on “the types of theoretical foundations and practical disciplines that are traditional in the established branches of engineering.” Yet after forty years of currency the phrase ‘software engineering' still denotes no more than a vague and largely unfulfilled aspiration. Two major causes of this disappointment are immediately clear. First, too many areas of software development are inadequately specialised, and consequently have not developed the repertoires of normal designs that are the indispensable basis of reliable engineering success. Second, the relationship between structural design and formal analytical techniques for software has rarely been one of fruitful synergy: too often it has defined a boundary between competing dogmas, at which mutual distrust and incomprehension deprive both sides of advantages that should be within their grasp. This paper discusses these causes and their effects. Whether the common practice of software development will eventually satisfy the broad aspiration of 1968 is hard to predict; but an understanding of past failure is surely a prerequisite of future success.

  7. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report describes software safety analysis techniques and engineering guidelines for developing safety-critical software, to identify the state of the art in this field and to give the software safety engineer a trail map between the codes-and-standards layer and the design-methodology-and-documents layer. We surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures (CMFs), high quality, defense-in-depth, and diversity are considered to be key elements in digital I&C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity (D-in-D&D) analysis guidelines.
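
    The arithmetic behind fault tree analysis can be shown in a few lines: with independent basic events, AND gates multiply probabilities and OR gates combine them as 1 - prod(1 - p). The event names and probabilities below are hypothetical, not from the report.

        from functools import reduce

        def p_and(*ps):
            """AND gate: all inputs must fail (independence assumed)."""
            return reduce(lambda a, b: a * b, ps, 1.0)

        def p_or(*ps):
            """OR gate: any input failing suffices (independence assumed)."""
            return 1.0 - reduce(lambda a, b: a * (1.0 - b), ps, 1.0)

        # Hypothetical basic events for a protection-system trip function.
        p_sensor_bug   = 1e-4
        p_logic_bug    = 5e-5
        p_watchdog_bug = 1e-3

        # Top event: trip fails if the logic is wrong, OR the sensor handling
        # is wrong AND the watchdog also fails to catch it.
        p_top = p_or(p_logic_bug, p_and(p_sensor_bug, p_watchdog_bug))
        print(f"top event probability: {p_top:.2e}")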

  8. Towards research on software cybernetics

    OpenAIRE

    2002-01-01

    Software cybernetics is a newly proposed area in software engineering. It makes better use of the interplay between control theory/engineering and software engineering. In this paper, we look into the research potentials of this emerging area.

  9. Topographic mapping

    Science.gov (United States)

    ,

    2008-01-01

    The U.S. Geological Survey (USGS) produced its first topographic map in 1879, the same year it was established. Today, more than 100 years and millions of map copies later, topographic mapping is still a central activity for the USGS. The topographic map remains an indispensable tool for government, science, industry, and leisure. Much has changed since early topographers traveled the unsettled West and carefully plotted the first USGS maps by hand. Advances in survey techniques, instrumentation, and design and printing technologies, as well as the use of aerial photography and satellite data, have dramatically improved mapping coverage, accuracy, and efficiency. Yet cartography, the art and science of mapping, may never before have undergone change more profound than today.

  10. Extending value stream mapping through waste definition beyond customer perspective

    OpenAIRE

    Khurum, Mahvish; Petersen, Kai; Gorschek, Tony

    2014-01-01

    Value Stream Mapping is one of several Lean practices which has recently attracted interest in the software engineering community. In other contexts (such as military, health, and production), Value Stream Mapping has achieved considerable improvements in processes and products. The goal is to leverage these benefits in the software-intensive product development context as well. The primary contribution is that we are extending the definition of waste to fit the software-intensive product...

  11. Software Engineering Reviews and Audits

    CERN Document Server

    Summers, Boyd L

    2011-01-01

    Accurate software engineering reviews and audits have become essential to the success of software companies and military and aerospace programs. These reviews and audits define the framework and specific requirements for verifying software development efforts. Authored by an industry professional with three decades of experience, Software Engineering Reviews and Audits offers authoritative guidance for conducting and performing software first article inspections, and functional and physical configuration software audits. It prepares readers to answer common questions for conducting and perform

  12. Computing and software

    Directory of Open Access Journals (Sweden)

    White, G. C.

    2004-06-01

    Full Text Available The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementations of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK requires that each new model’s likelihood be programmed specifically for that model. They wishfully think that future software might allow the user to combine

  13. Hyperspectral Soil Mapper (HYSOMA) software interface: Review and future plans

    Science.gov (United States)

    Chabrillat, Sabine; Guillaso, Stephane; Eisele, Andreas; Rogass, Christian

    2014-05-01

    With the upcoming launch of the next generation of hyperspectral satellites that will routinely deliver high spectral resolution images for the entire globe (e.g. EnMAP, HISUI, HyspIRI, HypXIM, PRISMA), an increasing demand for the availability and accessibility of hyperspectral soil products is coming from the geoscience community. Indeed, many robust methods for the prediction of soil properties based on imaging spectroscopy already exist and have been successfully used for a wide range of airborne soil mapping applications. Nevertheless, these methods require expert know-how and fine-tuning, which means they are used sparingly. More development is needed toward easy-to-access soil toolboxes as a major step toward the operational use of hyperspectral soil products for monitoring and modelling Earth-surface processes, to allow non-experienced users to obtain new information from inexpensive software packages in which repeatability of the results is an important prerequisite. In this context, based on the EU-FP7 EUFAR (European Facility for Airborne Research) project and the EnMAP satellite science program, higher-performing soil algorithms were developed at the GFZ German Research Center for Geosciences as demonstrators for end-to-end processing chains with harmonized quality measures. The algorithms were built into the HYSOMA (Hyperspectral SOil MApper) software interface, providing an experimental platform for soil mapping applications of hyperspectral imagery that offers a choice of multiple algorithms for each soil parameter. The software interface focuses on the fully automatic generation of semi-quantitative soil maps such as soil moisture, soil organic matter, iron oxide, clay content, and carbonate content. Additionally, a field calibration option calculates fully quantitative soil maps provided that ground truth soil data are available. The implemented soil algorithms have been tested and validated using extensive in-situ ground truth data sets. The source of the HYSOMA
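
    In the spirit of the semi-quantitative products described, the sketch below computes a per-pixel normalized-difference feature from two bands of a hyperspectral cube and then applies an optional linear field calibration. Band positions and calibration samples are placeholders, not HYSOMA's validated algorithms.

        import numpy as np

        def normalized_difference(cube, band_a, band_b):
            """Per-pixel normalized difference of two spectral bands, the kind
            of simple feature behind semi-quantitative soil maps."""
            a = cube[..., band_a].astype(float)
            b = cube[..., band_b].astype(float)
            return (a - b) / np.clip(a + b, 1e-9, None)

        # Synthetic hyperspectral cube: rows x cols x bands.
        cube = np.random.default_rng(2).uniform(0.0, 1.0, size=(100, 100, 120))
        clay_proxy = normalized_difference(cube, band_a=60, band_b=85)

        # Field calibration: linear fit from proxy values to lab-measured clay %.
        proxy_samples = np.array([0.10, 0.25, 0.40])
        lab_clay_pct  = np.array([12.0, 21.0, 33.0])
        slope, intercept = np.polyfit(proxy_samples, lab_clay_pct, 1)
        clay_map = slope * clay_proxy + intercept   # fully quantitative map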

  14. Ultrasonic Sensor Based 3D Mapping & Localization

    Directory of Open Access Journals (Sweden)

    Shadman Fahim Ahmad

    2016-04-01

    Full Text Available This article provides a basic-level introduction to 3D mapping using sonar sensors and to localization. It describes the methods used to construct a low-cost autonomous robot, along with the hardware and software used, as well as an insight into the background of autonomous robotic 3D mapping and localization. We have also given an overview of what the future may hold for the robot in 3D-based mapping.
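
    The core geometric step of sonar-based 3D mapping, turning a range reading taken at a known pan/tilt into a world-frame point, can be sketched as below. The robot is assumed unrotated with the sensor at its origin, which is a simplification of any real platform.

        import math

        def sonar_to_point(range_m, pan_rad, tilt_rad, robot_xyz=(0.0, 0.0, 0.0)):
            """Convert one ultrasonic range reading taken at a given pan/tilt
            into a 3D point in the world frame (robot assumed unrotated)."""
            x = range_m * math.cos(tilt_rad) * math.cos(pan_rad)
            y = range_m * math.cos(tilt_rad) * math.sin(pan_rad)
            z = range_m * math.sin(tilt_rad)
            rx, ry, rz = robot_xyz
            return (rx + x, ry + y, rz + z)

        # Sweep a full pan at two tilt angles to build a coarse point cloud.
        cloud = [sonar_to_point(1.5, math.radians(p), math.radians(t))
                 for p in range(0, 360, 15) for t in (0, 20)]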

  15. Technology Map-based Study on Key Industrial Cluster Prediction: Case Study of Fujian Software Industry

    Institute of Scientific and Technical Information of China (English)

    邹德林; 张文德

    2015-01-01

    The paper collects software industry patent information for 2001-2013, constructs a cluster prediction model, analyzes the relationship between technology and market in the industry cluster using linear regression, factor analysis, and principal component analysis (PCA), and forms a technology map. The results show that the various technologies in the Fujian software industry cluster influence one another, ultimately forming a development trend in which digital communications dominates, with visual and telephone communications as auxiliary technologies, and in which product functions give equal weight to information content processing and information transmission.
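
    The PCA step of such an analysis can be sketched in a few lines: a years-by-technology matrix of patent counts is decomposed, and the component loadings show which technology fields move together. The counts below are synthetic placeholders, not the Fujian data.

        import numpy as np
        from sklearn.decomposition import PCA

        # Rows: years 2001-2013; columns: patent counts per technology field
        # (stand-ins for digital, visual, and telephone communications, etc.).
        rng = np.random.default_rng(3)
        counts = rng.poisson(lam=[40, 15, 10, 8], size=(13, 4)).astype(float)

        pca = PCA(n_components=2)
        scores = pca.fit_transform(counts - counts.mean(axis=0))
        print("explained variance ratios:", pca.explained_variance_ratio_)
        # Loadings reveal co-moving fields -- raw material for a technology map.
        print("component loadings:", pca.components_)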

  16. Managing Distributed Software Projects

    DEFF Research Database (Denmark)

    Persson, John Stouby

    Increasingly, software projects are becoming geographically distributed, with limited face-to-face interaction between participants. These projects face particular challenges that need careful managerial attention. This PhD study reports on how we can understand and support the management...... of distributed software projects, based on a literature study and a case study. The main emphasis of the literature study was on how to support the management of distributed software projects, but it also contributed to an understanding of these projects. The main emphasis of the case study was on how to understand...... the management of distributed software projects, but it also contributed to supporting the management of these projects. The literature study integrates what we know about risks and risk-resolution techniques into a framework for managing risks in distributed contexts. This framework was developed iteratively...

  17. The Ettention software package

    Energy Technology Data Exchange (ETDEWEB)

    Dahmen, Tim, E-mail: Tim.Dahmen@dfki.de [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Saarland University, 66123 Saarbrücken (Germany); Marsalek, Lukas [Eyen SE, Na Nivách 1043/16, 141 00 Praha 4 (Czech Republic); Saarland University, 66123 Saarbrücken (Germany); Marniok, Nico [Saarland University, 66123 Saarbrücken (Germany); Turoňová, Beata [Saarland University, 66123 Saarbrücken (Germany); IMPRS-CS, Max-Planck Institute for Informatics, Campus E 1.4, 66123 Saarbrücken (Germany); Bogachev, Sviatoslav [Saarland University, 66123 Saarbrücken (Germany); Trampert, Patrick; Nickels, Stefan [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Slusallek, Philipp [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Saarland University, 66123 Saarbrücken (Germany)

    2016-02-15

    We present a novel software package for the problem “reconstruction from projections” in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. - Highlights: • Novel software package for “reconstruction from projections” in electron microscopy. • Support for high-resolution reconstructions on iterative reconstruction algorithms. • Support for CPU, GPU and Xeon Phi. • Integration in the IMOD software. • Platform for algorithm researchers: object oriented, modular design.
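
    The Kaczmarz update at the heart of the block-iterative method the record mentions is compact enough to sketch: the estimate is projected onto each measurement row's hyperplane in turn. This plain NumPy version illustrates the update only, not Ettention's GPU block scheme.

        import numpy as np

        def kaczmarz(A, b, sweeps=50, relax=1.0):
            """Row-action Kaczmarz iteration for A x = b: project the current
            estimate onto each row's hyperplane in turn (the core update that
            block-iterative reconstruction methods build on)."""
            x = np.zeros(A.shape[1])
            row_norms = (A * A).sum(axis=1)
            for _ in range(sweeps):
                for i in range(A.shape[0]):
                    if row_norms[i] == 0.0:
                        continue
                    residual = b[i] - A[i] @ x
                    x += relax * residual / row_norms[i] * A[i]
            return x

        # Tiny system standing in for a projection geometry.
        A = np.array([[1.0, 1.0], [1.0, -1.0], [2.0, 1.0]])
        x_true = np.array([0.7, -0.2])
        print(kaczmarz(A, A @ x_true))   # converges toward x_true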

  18. Software Activation Using Multithreading

    Directory of Open Access Journals (Sweden)

    Jianrui Zhang

    2012-11-01

    Full Text Available Software activation is an anti-piracy technology designed to verify that software products have been legitimately licensed. Activation should be quick and simple while simultaneously being secure and protecting customer privacy. The most common form of software activation is for the user to enter a legitimate product serial number. However, software activation based on serial numbers appears to be weak, since cracks for many programs are readily available on the Internet. Users can employ such cracks to bypass software activation. Serial number verification logic usually executes sequentially in a single thread. Such an approach is relatively easy to break, since attackers can trace the code to understand how the logic works. In this paper, we develop a practical multi-threaded verification design. Our results show that by proper use of multi-threading, the amount of code traceable in a debugger can be reduced to a low percentage of the total, and the traceable code in each run can differ as well. This makes it significantly more difficult for an attacker to reverse engineer the code as a means of bypassing a security check. Finally, we attempt to quantify the increased effort needed to break our verification logic.
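
    A toy version of the idea: the serial-number checks are split across threads so that no single sequential trace covers the whole verification, and the interleaving can differ between runs. The specific checks (length, digit sum, prefix) are invented for illustration and are not the paper's design.

        import threading

        def verify_serial(serial: str) -> bool:
            """Split the validity checks across threads so no single control-flow
            trace contains the whole verification (illustrative checks only)."""
            results = {}

            def check_format():
                results["format"] = len(serial) == 16 and serial.isalnum()

            def check_digit_sum():
                digits = [int(c) for c in serial if c.isdigit()]
                results["sum"] = sum(digits) % 7 == 0

            def check_prefix():
                results["prefix"] = serial[:2] == "AB"

            threads = [threading.Thread(target=f)
                       for f in (check_format, check_digit_sum, check_prefix)]
            for t in threads:
                t.start()
            for t in threads:
                t.join()
            return all(results.values())

        print(verify_serial("AB11111111111111"))  # digit sum 14 -> divisible by 7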

  19. Concept Maps powered by computer software: a strategy for enhancing reading comprehension in English for Specific Purposes Mapas conceituais mediados pelo computador: uma estratégia para aumentar o nível de compreensão escrita em inglês para fins específicos

    Directory of Open Access Journals (Sweden)

    Reinildes Dias

    2011-01-01

    Full Text Available This paper focuses on the procedures of an action-research study (STRINGER, 2007) conducted with undergraduates enrolled in an ESP course at Faculdade de Letras (UFMG). The impelling drive was the creation of a means to solve an educational problem, namely, the enhancement of students' reading comprehension of texts in English for academic purposes. The problem-solving process involved the use of concept maps (NOVAK; CAÑAS, 2008) powered by the CMap Tools software (CAÑAS et al., 2004) to meet the educational needs of a localized teaching situation. Data indicate that concept mapping, facilitated by computer software, can be a useful strategy to improve comprehension. Support for the investigation comes from the theories underlying the ESP approach, meaningful learning, learning as a social enterprise, and collaborative learning.

  20. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 15

    Science.gov (United States)

    1997-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  1. Impact of Agile Software Development Model on Software Maintainability

    Science.gov (United States)

    Gawali, Ajay R.

    2012-01-01

    Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burden tightly budgeted information technology (IT) organizations. The agile software development approach delivers business value early, but its implications for software maintainability are still unknown. The purpose of this quantitative study…

  2. Software cost/resource modeling: Software quality tradeoff measurement

    Science.gov (United States)

    Lawler, R. W.

    1980-01-01

    A conceptual framework for treating software quality from a total system perspective is developed. Examples are given to show how system quality objectives may be allocated to hardware and software; to illustrate trade-offs among quality factors, both hardware and software, to achieve system performance objectives; and to illustrate the impact of certain design choices on software functionality.

  3. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 13

    Science.gov (United States)

    1995-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  4. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 14

    Science.gov (United States)

    1996-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  6. Map projections and the Internet: Chapter 4

    Science.gov (United States)

    Kessler, Fritz; Battersby, Sarah E.; Finn, Michael P.; Clarke, Keith

    2017-01-01

    The field of map projections can be described as mathematical, static, and challenging. However, this description is evolving in concert with the development of the Internet. The Internet has enabled new outlets for software applications, learning, and interaction with and about map projections. This chapter examines specific ways in which the Internet has moved map projections from a relatively obscure paper-based setting to a more engaging and accessible online environment. After a brief overview of map projections, this chapter discusses four perspectives on how map projections have been integrated into the Internet. First, the role of map projections in web maps and mapping services is examined. Second, an overview of online atlases and the map projections chosen for their maps is presented. Third, new programming languages and code libraries that enable map projections to be included in mapping applications are reviewed. Fourth, how the Internet has facilitated map projection education and research, especially map readers' comprehension and understanding of complex topics like map projection distortion, is discussed.

  7. libdrdc: software standards library

    Science.gov (United States)

    Erickson, David; Peng, Tie

    2008-04-01

    This paper presents the libdrdc software standards library, including internal nomenclature, definitions, units of measure, coordinate reference frames, and representations for use in autonomous systems research. The library is a configurable, portable, C-function-wrapped C++/Object-Oriented C library developed to be independent of software middleware, system architecture, processor, and operating system. It is designed to use the Automatically Tuned Linear Algebra Software (ATLAS) and Basic Linear Algebra Subprograms (BLAS) and to port to both firmware and software. The library's goal is to unify data collection and representation across various microcontrollers and Central Processing Unit (CPU) cores and to provide a common Application Binary Interface (ABI) for research projects at all scales. The library supports multi-platform development and currently works on Windows, Unix, GNU/Linux, and the Real-Time Executive for Multiprocessor Systems (RTEMS). It is made available under the LGPL version 2.1 license.

  8. Lecture 2: Software Security

    CERN Document Server

    CERN. Geneva

    2013-01-01

    Computer security has been an increasing concern for IT professionals for a number of years, yet despite all the efforts, computer systems and networks remain highly vulnerable to attacks of different kinds. Design flaws and security bugs in the underlying software are among the main reasons for this. This lecture addresses the following question: how to create secure software? The lecture starts with a definition of computer security and an explanation of why it is so difficult to achieve. It then introduces the main security principles (like least-privilege, or defense-in-depth) and discusses security in different phases of the software development cycle. The emphasis is put on the implementation part: most common pitfalls and security bugs are listed, followed by advice on best practice for security development, testing and deployment. Sebastian Lopienski is CERN’s deputy Computer Security Officer. He works on security strategy and policies; offers internal consultancy and audit services; develops and ...

  9. Astronomers as Software Developers

    Science.gov (United States)

    Pildis, Rachel A.

    2016-01-01

    Astronomers know that their research requires writing, adapting, and documenting computer software. Furthermore, they often have to learn new computer languages and figure out how existing programs work without much documentation or guidance and with extreme time pressure. These are all skills that can lead to a software development job, but recruiters and employers probably won't know that. I will discuss all the highly useful experience that astronomers may not know that they already have, and how to explain that knowledge to others when looking for non-academic software positions. I will also talk about some of the pitfalls I have run into while interviewing for jobs and working as a developer, and encourage you to embrace the curiosity employers might have about your non-standard background.

  10. Secure software practices among Malaysian software practitioners: An exploratory study

    Science.gov (United States)

    Mohamed, Shafinah Farvin Packeer; Baharom, Fauziah; Deraman, Aziz; Yahya, Jamaiah; Mohd, Haslina

    2016-08-01

    Secure software practices are gaining importance among software practitioners and researchers due to the rise of computer crime in the software industry, and they have become one of the determining factors for producing high-quality software. Even though their importance is recognized, such practices are still scarce in the software industry, particularly in Malaysia. Thus, an exploratory study was conducted among software practitioners in Malaysia to study their experiences and practices in real-world projects. This paper discusses the findings from the study, which involved 93 software practitioners. A structured questionnaire was used for data collection, whilst statistical methods such as frequency, mean, and cross tabulation were used for data analysis. Outcomes from this study reveal that software practitioners are becoming increasingly aware of the importance of secure software practices but lack appropriate implementation, which could affect the quality of the software they produce.

  11. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan : ASC software quality engineering practices Version 3.0.

    Energy Technology Data Exchange (ETDEWEB)

    Turgeon, Jennifer L.; Minana, Molly A.; Hackney, Patricia; Pilch, Martin M.

    2009-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in the US Department of Energy/National Nuclear Security Administration (DOE/NNSA) Quality Criteria, Revision 10 (QC-1) as 'conformance to customer requirements and expectations'. This quality plan defines the SNL ASC Program's software quality engineering (SQE) practices and provides a mapping of these practices to SNL Corporate Process Requirement (CPR) 001.3.6, 'Corporate Software Engineering Excellence'. The plan also identifies ASC management's and the software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals. It establishes the signatories' commitment to improving software products by applying cost-effective SQE practices, enumerates the SQE practices that comprise the development of SNL ASC software products, and explains the project teams' opportunities for tailoring and implementing the practices.

  12. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software verification. Methods of software verification are designed to check software for compliance with stated requirements such as correctness, system security, adaptability to small changes in the environment, portability, compatibility, etc. These methods vary both in how they operate and in how they achieve their results. The article describes static and dynamic methods of software verification, with particular attention to symbolic execution. In its review of static analysis, the deductive method and model-checking methods are discussed and described, and the pros and cons of each method are emphasized. The article considers a classification of testing techniques for each method. It presents and analyzes the characteristics and mechanisms of static dependency analysis, including the kinds of dependencies that can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either on different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, elements of composite variables (structure fields, array elements), the sizes of heap areas, the lengths of strings, and the number of initialized array elements in code verified by static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of inference tools. Dynamic analysis methods such as testing, monitoring, and profiling are presented and analyzed, along with the kinds of tools that can be applied when using them. The paper concludes by describing the most relevant problems of these analysis techniques and methods for solving them.

  13. Mapping Deeply

    OpenAIRE

    Denis Wood

    2015-01-01

    This is a description of an avant la lettre deep mapping project carried out by a geographer and a number of landscape architecture students in the early 1980s. Although humanists seem to take the “mapping” in deep mapping more metaphorically than cartographically, in this neighborhood mapping project, the mapmaking was taken literally, with the goal of producing an atlas of the neighborhood. In this, the neighborhood was construed as a transformer, turning the stuff of the world (gas, wate...

  14. Processeringsoptimering med Canons software

    DEFF Research Database (Denmark)

    Precht, Helle

    2009-01-01

    …Possibilities in software optimization were studied in relation to optimal image quality and control exposures, to investigate whether it was possible to accept diagnostic image quality and thereby take ALARA as the starting point. Method and materials: a quantitative experimental study based on trials with a technical and a human phantom. The CD Rad phantom was used as the technical phantom; its images were analyzed with the CD Rad software, yielding an objective IQF value. The human phantom was a lamb pelvis with femur, which according to the NRPB is comparable in absorption to a five-year-old child. The human trial images were…

  15. Agile distributed software development

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars; Aaen, Ivan

    2012-01-01

    While face-to-face interaction is fundamental in agile software development, distributed environments must rely extensively on mediated interactions. Practicing agile principles in distributed environments therefore poses particular control challenges related to balancing fixed vs. evolving quality requirements and people vs. process-based collaboration. To investigate these challenges, we conducted an in-depth case study of a successful agile distributed software project with participants from a Russian firm and a Danish firm. Applying Kirsch's elements of control framework, we offer an analysis of how…

  16. Software Testing as Science

    Directory of Open Access Journals (Sweden)

    Ingrid Gallesdic

    2013-06-01

    Full Text Available The most widespread opinion among people who have some connection with software testing is that this activity is an art. In fact, books have been published whose titles refer to it as an art, role, or process. But because software complexity increases every year, this paper proposes a new approach: conceiving of testing as a science, on the grounds that the processes by which tests are applied follow the steps of the scientific method: inputs, processes, outputs. The paper examines the similarities between testing and science and the characteristics of testing as a science.

  17. Agile software development

    CERN Document Server

    Stober, Thomas

    2009-01-01

    Software development is moving towards a more agile and more flexible approach. It turns out that the traditional 'waterfall' model is not supportive in an environment where technical, financial and strategic constraints are changing almost every day. But what is agility? What are today's major approaches? And especially: what is the impact of agile development principles on the development teams, on project management and on software architects? How can large enterprises become more agile and improve their business processes, which have existed for many, many years? What are the limits…

  18. Software Safety and Security

    CERN Document Server

    Nipkow, T; Hauptmann, B

    2012-01-01

    Recent decades have seen major advances in methods and tools for checking the safety and security of software systems. Automatic tools can now detect security flaws not only in programs of the order of a million lines of code, but also in high-level protocol descriptions. There has also been something of a breakthrough in the area of operating system verification. This book presents the lectures from the NATO Advanced Study Institute on Tools for Analysis and Verification of Software Safety and Security; a summer school held at Bayrischzell, Germany, in 2011. This Advanced Study Institute was

  19. Machine Tool Software

    Science.gov (United States)

    1988-01-01

    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using the Automatically Programmed Tool (APT) software since 1969 in his Computer Aided Design and Manufacturing (CAD/CAM) curriculum. Professor Hack teaches the use of the APT programming language for control of metal cutting machines. Machine tool instructions are geometry definitions written in the APT language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  20. Software product quality control

    CERN Document Server

    Wagner, Stefan

    2013-01-01

    Quality is not a fixed or universal property of software; it depends on the context and goals of its stakeholders. Hence, when you want to develop a high-quality software system, the first step must be a clear and precise specification of quality. Yet even if you get it right and complete, you can be sure that it will become invalid over time. So the only solution is continuous quality control: the steady and explicit evaluation of a product's properties with respect to its updated quality goals.This book guides you in setting up and running continuous quality control in your environment. Star

  1. TIA Software User's Manual

    Science.gov (United States)

    Cramer, K. Elliott; Syed, Hazari I.

    1995-01-01

    This user's manual describes the installation and operation of TIA, the Thermal-Imaging acquisition and processing Application, developed by the Nondestructive Evaluation Sciences Branch at NASA Langley Research Center, Hampton, Virginia. TIA is a user-friendly graphical interface application for the Macintosh II and higher series computers. The software has been developed to interface with the Perceptics/Westinghouse Pixelpipe(TM) and PixelStore(TM) NuBus cards and the GW Instruments MacADIOS(TM) input-output (I/O) card for the Macintosh for imaging thermal data. The software is also capable of performing generic image-processing functions.

  2. Workstation software framework

    Science.gov (United States)

    Andolfato, L.; Karban, R.

    2008-08-01

    The Workstation Software Framework (WSF) is a state machine model driven development toolkit designed to generate event driven applications based on ESO VLT software. State machine models are used to generate executables. The toolkit provides versatile code generation options and it supports Mealy, Moore and hierarchical state machines. Generated code is readable and maintainable since it combines well known design patterns such as the State and the Template patterns. WSF promotes a development process that is based on model reusability through the creation of a catalog of state machine patterns.
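
    By way of illustration, the sketch below renders a two-state Moore machine with the State pattern, the same pattern the abstract says the generated code combines with the Template pattern. The states and events are invented for this example and are not WSF output; WSF itself generates C++ for VLT software.

        # Minimal Moore-style state machine using the State pattern
        # (illustrative only; not code generated by WSF).

        class State:
            """Each concrete state owns an entry action and handles events."""
            def on_entry(self): pass
            def handle(self, event): return self  # default: stay in this state

        class Idle(State):
            def on_entry(self): print("entering Idle")
            def handle(self, event):
                return Running() if event == "start" else self

        class Running(State):
            def on_entry(self): print("entering Running")
            def handle(self, event):
                return Idle() if event == "stop" else self

        class Machine:
            def __init__(self):
                self.state = Idle()
                self.state.on_entry()
            def dispatch(self, event):
                new_state = self.state.handle(event)
                if new_state is not self.state:   # Moore machine: act on entry
                    self.state = new_state
                    self.state.on_entry()

        m = Machine()
        for ev in ["start", "tick", "stop"]:
            m.dispatch(ev)

    Because each transition returns a state object, a code generator can emit one class per model state and keep the dispatch loop unchanged, which is what makes the generated code readable and maintainable.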

  3. Green in software engineering

    CERN Document Server

    Calero Munoz, Coral

    2015-01-01

    This is the first book that presents a comprehensive overview of sustainability aspects in software engineering. Its format follows the structure of the SWEBOK and covers the key areas involved in the incorporation of green aspects in software engineering, encompassing topics from requirement elicitation to quality assurance and maintenance, while also considering professional practices and economic aspects. The book consists of thirteen chapters, which are structured in five parts. First the "Introduction" gives an overview of the primary general concepts related to Green IT, discussing wha

  4. Six Sigma software development

    CERN Document Server

    Tayntor, Christine B

    2002-01-01

    Since Six Sigma has had marked success in improving quality in other settings, and since the quality of software remains poor, it seems a natural evolution to apply the concepts and tools of Six Sigma to system development and the IT department. Until now however, there were no books available that applied these concepts to the system development process. Six Sigma Software Development fills this void and illustrates how Six Sigma concepts can be applied to all aspects of the evolving system development process. It includes the traditional waterfall model and in the support of legacy systems,

  5. INTEGRATING CRM SOFTWARE APPLICATIONS

    OpenAIRE

    2008-01-01

    Scientists, end users of CRM applications, and producers of CRM software all come to an agreement when talking about the idea of CRM, the CRM strategy, or the term CRM. The main point is that CRM can be analyzed from two different points of view: CRM the marketing strategy and CRM the software. The first refers to establishing personalized relationships with customers that can afterwards be easily managed; this way, one can determine at any time the past client relations, the…

  6. Future Trends of Software Technology and Applications: Software Architecture

    Science.gov (United States)

    2006-01-01

    Future Trends of Software Technology and Applications: Software Architecture. Paul Clements, Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA 15213-3890. Sponsored by the U.S. Department of Defense; © 2006 by Carnegie Mellon University.

  7. Polynomial mappings

    CERN Document Server

    Narkiewicz, Wŀadysŀaw

    1995-01-01

    The book deals with certain algebraic and arithmetical questions concerning polynomial mappings in one or several variables. Algebraic properties of the ring Int(R) of polynomials mapping a given ring R into itself are presented in the first part, starting with classical results of Polya, Ostrowski and Skolem. The second part deals with fully invariant sets of polynomial mappings F in one or several variables, i.e. sets X satisfying F(X)=X. This includes in particular a study of cyclic points of such mappings in the case of rings of algebraic integers. The text contains several exercises and a list of open problems.

  8. Participatory Maps

    DEFF Research Database (Denmark)

    Salovaara-Moring, Inka

    2016-01-01

    …practice. In particular, mapping environmental damage, endangered species, and human-made disasters has become one focal point for environmental knowledge production. This type of digital map has been highlighted as a processual turn in critical cartography, whereas in related computational journalism… of a geo-visualization within information mapping that enhances embodiment in the experience of the information. InfoAmazonia is defined as a digitally created map-space within which journalistic practice can be seen as dynamic, performative interactions between journalists, ecosystems, space, and species…

  9. Photovoltaics software package. Simulation, design and calculation software for photovoltaics; Softwarepaket Photovoltaik. Simulations-, Auslegungs- und Berechnungsprogramme fuer die Photovoltaik

    Energy Technology Data Exchange (ETDEWEB)

    Haas, Rudolf; Weinreich, Bernhard

    2007-07-01

    The software package comprises simulation, design and calculation tools: professional configuration of photovoltaic systems; design and optimization of PV systems and components; 3D visualization of shading situations; economic efficiency and profit calculations; software status report; measuring technology for characteristics, insolation, infrared radiation, etc.; databases for modules, inverters and supports; insolation maps for Germany dating back to 1998; checklists for site, dimensioning, comparison of systems, etc.; useful addresses, bibliography, manufacturers; other renewable energy sources; and much more. (orig.)

  10. Mapping Information

    Data.gov (United States)

    Department of Homeland Security — ArcGIS is a system that provides an integrated collection of GIS software products that provides a standards-based platform for spatial analysis, data management,...

  11. Software testing concepts and operations

    CERN Document Server

    Mili, Ali

    2015-01-01

    Explores and identifies the main issues, concepts, principles and evolution of software testing, including software quality engineering and testing concepts, test data generation, test deployment analysis, and software test management. This book examines the principles, concepts, and processes that are fundamental to the software testing function. It is divided into five broad parts. Part I introduces software testing in the broader context of software engineering and explores the qualities that testing aims to achieve or ascertain, as well as the lifecycle of software testing. Part II c…

  12. The Art of Software Testing

    CERN Document Server

    Myers, Glenford J; Badgett, Tom

    2011-01-01

    The classic, landmark work on software testing The hardware and software of computing have changed markedly in the three decades since the first edition of The Art of Software Testing, but this book's powerful underlying analysis has stood the test of time. Whereas most books on software testing target particular development techniques, languages, or testing methods, The Art of Software Testing, Third Edition provides a brief but powerful and comprehensive presentation of time-proven software testing approaches. If your software development project is mission critical, this book is an investme

  13. Methodology and software for georeferencing vineyards

    Directory of Open Access Journals (Sweden)

    Fialho Flávio Bello

    2016-01-01

    Full Text Available An agricultural registry is a collection of information about the production area and yield of agricultural properties in a region or designated area. It allows agricultural production and its spatial distribution to be measured, characterizes rural structure, facilitates inspection and the development of agricultural policies, optimizes the distribution of agricultural credit, estimates crop yield, and generates research data. A key component of a quality registry is the accurate measurement of areas and their geographical position, through georeferencing, to allow integration with other spatial information. The Vineyard Registry of Rio Grande do Sul is one of the most complete agricultural registries in Brazil. It has been carried out in all grape-producing properties in the state since 1995, and its georeferencing began in 2005 with the objective of accurately mapping vineyards. Embrapa has developed a methodology to accelerate georeferencing by simplifying the field mapping process. One of the central points of this methodology was the development of a software tool called MapaGPS to organize and classify points measured in the field. Recently, this software has been improved with the incorporation of features such as transformation between coordinate systems, conversion between files of different formats, and more control over generated charts. The georeferencing experience of the Vineyard Registry of Rio Grande do Sul may be used throughout Brazil and in other countries. The software is available under a free license, and there are no restrictions on adopting the methodology. This document aims to disclose details of this methodology and how it may be used to facilitate zoning projects worldwide.
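
    The abstract does not describe MapaGPS internals, so the sketch below only illustrates the coordinate-system transformation step it mentions: re-projecting GPS field points with the pyproj library and computing a vineyard polygon area. The EPSG codes and coordinates are illustrative assumptions, not values taken from the registry.

        # Sketch: convert GPS field points (WGS84 lon/lat) to a projected CRS and
        # compute a vineyard polygon area. pyproj and the EPSG codes are our
        # choices for illustration; MapaGPS internals are not specified above.
        from pyproj import Transformer

        # SIRGAS 2000 / UTM zone 22S (EPSG:31982) covers Rio Grande do Sul.
        to_utm = Transformer.from_crs("EPSG:4326", "EPSG:31982", always_xy=True)

        field_points = [(-51.52, -29.17), (-51.51, -29.17),
                        (-51.51, -29.18), (-51.52, -29.18)]   # hypothetical plot
        xy = [to_utm.transform(lon, lat) for lon, lat in field_points]

        # Shoelace formula for the area (m^2) of the closed polygon.
        def polygon_area(pts):
            s = 0.0
            for (x1, y1), (x2, y2) in zip(pts, pts[1:] + pts[:1]):
                s += x1 * y2 - x2 * y1
            return abs(s) / 2.0

        print(f"vineyard area: {polygon_area(xy) / 10000:.2f} ha")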

  14. Evaluation & Optimization of Software Engineering

    Directory of Open Access Journals (Sweden)

    Asaduzzaman Noman

    2016-06-01

    Full Text Available The term is made of two words, software and engineering. Software is more than just program code. A program is executable code, which serves some computational purpose. Software is a collection of executable programming code, associated libraries, and documentation. Software made for a specific requirement is called a software product. Engineering, on the other hand, is all about developing products using well-defined scientific principles and methods. The outcome of software engineering is an efficient and reliable software product. IEEE defines software engineering as: the application of a systematic, disciplined, quantifiable approach to the development, operation and maintenance of software; that is, the application of engineering to software.

  15. Software for multistate analysis

    Directory of Open Access Journals (Sweden)

    Frans J. Willekens

    2014-08-01

    Full Text Available Background: The growing interest in pathways, the increased availability of life-history data, innovations in statistical and demographic techniques, and advances in software technology have stimulated the development of software packages for multistate modeling of life histories. Objective: In the paper we list and briefly discuss several software packages for multistate analysis of life-history data. The packages cover the estimation of multistate models (transition rates and transition probabilities), multistate life tables, multistate population projections, and microsimulation. Methods: Brief description of software packages in a historical and comparative perspective. Results: During the past 10 years the advances in multistate modeling software have been impressive. New computational tools accompany the development of new methods in statistics and demography. The statistical theory of counting processes is the preferred method for the estimation of multistate models, and R is the preferred programming platform. Conclusions: Innovations in method, data, and computer technology have removed the traditional barriers to multistate modeling of life histories and the computation of informative life-course indicators. The challenge ahead of us is to model and predict individual life histories.
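
    As a minimal sketch of what such packages estimate, the code below computes maximum-likelihood transition probabilities for a discrete-time multistate model from observed state sequences. Real packages estimate transition rates with counting-process methods; this is a simplified discrete analogue with invented data.

        # Maximum-likelihood transition probabilities P(b | a) = n_ab / n_a,
        # estimated from observed state sequences (toy data, illustration only).
        from collections import Counter

        states = ["healthy", "ill", "dead"]
        histories = [
            ["healthy", "healthy", "ill", "dead"],
            ["healthy", "ill", "healthy", "healthy"],
            ["healthy", "healthy", "healthy", "ill"],
        ]

        counts = Counter()                       # n_ab: observed a -> b moves
        for h in histories:
            for a, b in zip(h, h[1:]):
                counts[(a, b)] += 1

        totals = {a: sum(n for (x, _), n in counts.items() if x == a)
                  for a in states}               # n_a: moves leaving state a

        P = {(a, b): counts[(a, b)] / totals[a]
             for a in states if totals.get(a) for b in states}

        for (a, b), p in sorted(P.items()):
            if p: print(f"P({b} | {a}) = {p:.2f}")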

  16. Open Source Software Acquisition

    DEFF Research Database (Denmark)

    Holck, Jesper; Kühn Pedersen, Mogens; Holm Larsen, Michael

    2005-01-01

    Lately we have seen a growing interest from both public and private organisations in adopting Open Source Software (OSS), not only for a few specific applications but also on a more general level throughout the organisation. As a consequence, organisations' decisions on adoption of OSS are becoming…

  17. Software Process Improvement

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Diebold, Philipp; Münch, Jürgen

    2016-01-01

    Software process improvement (SPI) is around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out...

  18. Global Software Development

    DEFF Research Database (Denmark)

    Søderberg, Anne-Marie; Krishna, S.; Bjørn, Pernille

    2013-01-01

    …accounts of close collaboration processes in two large and complex projects, where off-shoring of software development is moved to a strategic level, we found that the vendor was able to establish a strategic partnership through long-term engagement with the field of banking and insurance as well…

  19. Improving Agile Software Practice

    DEFF Research Database (Denmark)

    Tjørnehøj, Gitte

    2006-01-01

    Software process improvement in small and agile organizations is often problematic, but achieving good SPI assessments can still be necessary to stay in the market or to meet the demands of multinational owners. The traditional norm-driven, centralized and control-centered improvement approaches have…

  20. AOFlagger: RFI Software

    Science.gov (United States)

    Offringa, A. R.

    2010-10-01

    The RFI software presented here can automatically flag data and can be used to analyze the data in a measurement. The purpose of flagging is to mark samples that are affected by interfering sources such as radio stations, airplanes, electrical fences or other transmitting interferers. The tools in the package are meant for offline use. The software package contains a graphical interface ("rfigui") that can be used to visualize a measurement set and analyze mitigation techniques. It also contains a console flagger ("rficonsole") that can execute a script of mitigation functions without the overhead of a graphical environment. All tools were written in C++. The software has been tested extensively on low radio frequencies (150 MHz or lower) produced by the WSRT and LOFAR telescopes. LOFAR is the Low Frequency Array that is built in and around the Netherlands. Higher frequencies should work as well. Some of the methods implemented are the SumThreshold, the VarThreshold and the singular value decomposition (SVD) method. Included also are several surface fitting algorithms. The software is published under the GNU General Public License version 3.
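
    A simplified one-dimensional sketch of the SumThreshold idea follows: runs of M consecutive samples are flagged when their sum exceeds an M-dependent threshold that shrinks as M grows, so weak but extended interference is caught along with strong spikes. The real implementation works on two-dimensional time-frequency data in C++; the constants and the 1-D restriction here are illustrative assumptions.

        import numpy as np

        def sum_threshold(power, chi1=6.0, rho=1.5, max_m=8):
            """Simplified 1-D SumThreshold: flag any run of M consecutive
            samples whose mean exceeds chi1 / rho**log2(M). Already-flagged
            samples are replaced by the threshold so one strong spike cannot
            flag its whole neighbourhood by itself."""
            flags = np.zeros(len(power), dtype=bool)
            m = 1
            while m <= max_m:
                chi_m = chi1 / rho ** np.log2(m)      # threshold shrinks with M
                work = np.where(flags, chi_m, power)
                new_flags = flags.copy()
                for i in range(len(power) - m + 1):
                    if work[i:i + m].sum() > chi_m * m:
                        new_flags[i:i + m] = True
                flags = new_flags
                m *= 2                                # window sizes 1, 2, 4, 8
            return flags

        rng = np.random.default_rng(0)
        data = rng.rayleigh(1.0, 512)
        data[100:104] += 8.0                          # injected interferer
        print(f"flagged {sum_threshold(data).sum()} samples")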

  1. Writing testable software requirements

    Energy Technology Data Exchange (ETDEWEB)

    Knirk, D. [Sandia National Labs., Albuquerque, NM (United States)

    1997-11-01

    This tutorial identifies common problems in analyzing requirements and constructing a written specification of what the software is to do. It deals with two main problem areas: identifying and describing problem requirements, and analyzing and describing behavior specifications.

  3. Software Process Improvement

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Konopka, Claudia; Nellemann, Peter

    2016-01-01

    Software process improvement (SPI) is around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out...

  4. Software Defined Networking

    DEFF Research Database (Denmark)

    Caba, Cosmin Marius

    …resources are limited. Hence, to counteract this trend, current QoS mechanisms must become simpler to deploy and operate, in order to motivate NSPs to employ QoS techniques instead of overprovisioning. Software Defined Networking (SDN) represents a paradigm shift in the way telecommunication and data…

  5. Complexity, Systems, and Software

    Science.gov (United States)

    2014-08-14

    Complexity, Systems, and Software. Sarah Sheard, August 14, 2014; © 2014 Carnegie Mellon University. Addressing complexity in systems of systems (SoSs); source: SEBoK Wiki.

  6. Green Software Products

    NARCIS (Netherlands)

    Jagroep, Erik Arijender

    2017-01-01

    The rising energy consumption of the ICT industry has triggered a quest for more green, energy efficient ICT solutions. The role of software as the true consumer of power and its potential contribution to reach sustainability goals has increasingly been acknowledged. At the same time, it is shown to

  7. The FARE Software

    Science.gov (United States)

    Pitarello, Adriana

    2015-01-01

    This article highlights the importance of immediate corrective feedback in tutorial software for language teaching in an academic learning environment. We aim to demonstrate that, rather than simply reporting on the performance of the foreign language learner, this feedback can act as a mediator of students' cognitive and metacognitive activity.…

  8. JSATS Decoder Software Manual

    Energy Technology Data Exchange (ETDEWEB)

    Flory, Adam E.; Lamarche, Brian L.; Weiland, Mark A.

    2013-05-01

    The Juvenile Salmon Acoustic Telemetry System (JSATS) Decoder is a software application that converts a digitized acoustic signal (a waveform stored in the .bwm file format) into a list of potential JSATS Acoustic MicroTransmitter (AMT) tagcodes along with other data about the signal including time of arrival and signal to noise ratios (SNR). This software is capable of decoding single files, directories, and viewing raw acoustic waveforms. When coupled with the JSATS Detector, the Decoder is capable of decoding in ‘real-time’ and can also provide statistical information about acoustic beacons placed within receive range of hydrophones within a JSATS array. This document details the features and functionality of the software. The document begins with software installation instructions (section 2), followed in order by instructions for decoder setup (section 3), decoding process initiation (section 4), then monitoring of beacons (section 5) using real-time decoding features. The last section in the manual describes the beacon, beacon statistics, and the results file formats. This document does not consider the raw binary waveform file format.
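
    The manual excerpt above does not give the decoder's SNR formula, so the sketch below shows one conventional definition on a digitized waveform: signal power inside the detection window versus power outside it. All names and numbers are invented for illustration; they are not the JSATS Decoder's actual computation.

        import numpy as np

        def snr_db(waveform, start, length):
            """Generic SNR of a candidate detection, in dB: power inside the
            detection window versus power everywhere else (illustrative
            definition, not the JSATS Decoder's documented formula)."""
            window = waveform[start:start + length]
            noise = np.concatenate([waveform[:start], waveform[start + length:]])
            p_sig = np.mean(window ** 2)
            p_noise = np.mean(noise ** 2)
            return 10.0 * np.log10(p_sig / p_noise)

        fs = 100_000                                  # sample rate (Hz), assumed
        t = np.arange(0, 0.01, 1 / fs)
        rng = np.random.default_rng(1)
        sig = 0.01 * rng.standard_normal(t.size)      # background noise
        sig[300:500] += 0.2 * np.sin(2 * np.pi * 4000 * t[300:500])  # a "ping"
        print(f"SNR = {snr_db(sig, 300, 200):.1f} dB")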

  9. Software Architecture Evolution

    Science.gov (United States)

    2013-12-01

    …same connector in the model. One way of understanding such examples is as instances of a very common linguistic phenomenon called synecdoche, in which a…

  10. High Assurance Software

    Science.gov (United States)

    2013-10-22

    Fingerprinting Malware Using Bioinformatics Tools: Building a Classifier for the Zeus Virus (Pedersen, J., Conference on Software Engineering, 2013). …the FASTA file is a nucleotide representation of the original artifact file; this process can be reversed to obtain the original file from the FASTA file.

  11. Limits of Software Reuse

    NARCIS (Netherlands)

    Holenderski, L.

    2006-01-01

    Software reuse is considered one of the main techniques to increase software productivity. We present two simple mathematical arguments that show some theoretical limits of reuse. It turns out that the increase of productivity due to internal reuse is at most linear, far from the needed exponential growth…

  12. Software engineering tools.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1994-01-01

    We have looked at general descriptions and illustrations of several software development tools, such as tools for prototyping, developing DFDs, testing, and maintenance. Many others are available, and new ones are being developed. However, you have at least seen some examples of powerful CASE tools for systems development.

  13. Generalized Software Security Framework

    Directory of Open Access Journals (Sweden)

    Smriti Jain

    2011-01-01

    Full Text Available Security of information has become a major concern in today's digitized world. As a result, effective techniques to secure information are required. The most effective way is to incorporate security in the development process itself, thereby resulting in a secured product. In this paper, we propose a framework that enables security to be included in the software development process. The framework consists of three layers, namely the control layer, the aspect layer, and the development layer. The control layer illustrates the managerial control of the entire software development process with the help of governance, whereas the aspect layer recognizes the security mechanisms that can be incorporated during software development to identify the various security features. The development layer helps to integrate the various security aspects, as well as the controls identified in the above layers, during the development process. The layers were further verified by a survey amongst IT professionals. The professionals concluded that the developed framework is easy to use due to its layered architecture and can be customized for various types of software.

  14. Iterative software kernels

    Energy Technology Data Exchange (ETDEWEB)

    Duff, I.

    1994-12-31

    This workshop focuses on kernels for iterative software packages. Specifically, the three speakers discuss various aspects of sparse BLAS kernels. Their topics are: 'Current status of user-level sparse BLAS'; 'Current status of the sparse BLAS toolkit'; and 'Adding matrix-matrix and matrix-matrix-matrix multiply to the sparse BLAS toolkit'.
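
    As a reminder of what such kernels compute, the sketch below is the classic compressed-sparse-row (CSR) matrix-vector multiply that a sparse BLAS toolkit standardizes. It is illustrative Python, not the toolkit's actual interface.

        # Sparse matrix-vector multiply y = A @ x with A stored in CSR form:
        # the kind of kernel a sparse BLAS toolkit provides (illustration only).
        def csr_matvec(indptr, indices, data, x):
            y = [0.0] * (len(indptr) - 1)
            for row in range(len(y)):
                for k in range(indptr[row], indptr[row + 1]):
                    y[row] += data[k] * x[indices[k]]
            return y

        # A = [[4, 0, 1],
        #      [0, 3, 0],
        #      [2, 0, 5]]
        indptr  = [0, 2, 3, 5]     # row r's entries sit in data[indptr[r]:indptr[r+1]]
        indices = [0, 2, 1, 0, 2]  # column index of each stored value
        data    = [4.0, 1.0, 3.0, 2.0, 5.0]
        print(csr_matvec(indptr, indices, data, [1.0, 1.0, 1.0]))  # [5.0, 3.0, 7.0]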

  15. Natural-color maps via coloring of bivariate grid data

    Science.gov (United States)

    Darbyshire, Jane E.; Jenny, Bernhard

    2017-09-01

    Natural ground color is useful for maps where a representation of the Earth's surface matters. Natural color schemes are less likely to be misinterpreted, as opposed to hypsometric color schemes, and are generally preferred by map readers. The creation of natural-color maps was once limited to manual cartographic techniques, but they can now be created digitally with the aid of raster graphics editing software. However, the creation of natural-color maps still requires many steps, a significant time investment, and fairly detailed digital land cover information, which makes this technique impossible to apply to global web maps at medium and large scales. A particular challenge for natural-color map creation is adjusting colors with location to create smoothly blending transitions. Adjustments with location are required to show land cover transitions between climate zones with a natural appearance. This study takes the first step in automating the process in order to facilitate the creation of medium- and large-scale natural-color maps covering large areas. A coloring method based on two grid inputs is presented. Here, we introduce an algorithmic method and prototype software for creating maps with this technique. The prototype software allows the map author to interactively assign colors to design the appearance of the map. This software can generate web map tiles at a global level for medium and large scales. Example natural-color web maps created with this coloring technique are provided.
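
    As a minimal sketch of the bivariate coloring idea, the code below blends four corner colors across two co-registered grids scaled to 0..1, producing the smooth color transitions the technique aims for. The corner colors and grid semantics are invented; this is not the authors' palette or algorithm.

        import numpy as np

        # Two co-registered grids (say, elevation and vegetation density, both
        # scaled to 0..1) index a 2-D color space built by bilinear blending of
        # four corner colors (RGB values invented for illustration).
        corners = np.array([
            [[234, 222, 180], [120, 150, 90]],   # low elev: dry tan -> green
            [[180, 170, 160], [240, 240, 240]],  # high elev: grey rock -> snow
        ], dtype=float)

        def natural_color(elev, veg):
            """Bilinear blend of the corner colors for grids elev, veg in [0, 1]."""
            e = elev[..., None]                   # broadcast over RGB channel
            v = veg[..., None]
            low  = (1 - v) * corners[0, 0] + v * corners[0, 1]
            high = (1 - v) * corners[1, 0] + v * corners[1, 1]
            return ((1 - e) * low + e * high).astype(np.uint8)

        elev = np.linspace(0, 1, 256)[:, None] * np.ones((256, 256))
        veg  = np.linspace(0, 1, 256)[None, :] * np.ones((256, 256))
        rgb = natural_color(elev, veg)            # 256x256x3, smooth transitions
        print(rgb.shape, rgb[0, 0], rgb[-1, -1])

    Because every cell is a continuous function of the two grid values, neighboring cells with similar elevation and land cover get similar colors, which is exactly the smoothly blending transition the paper identifies as the hard part of manual workflows.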

  16. Profiling a Mind Map User: A Descriptive Appraisal

    Science.gov (United States)

    Tucker, Joanne M.; Armstrong, Gary R.; Massad, Victor J.

    2010-01-01

    Whether manually or through the use of software, a non-linear information organization framework known as mind mapping offers an alternative method for capturing thoughts, ideas and information to linear thinking modes such as outlining. Mind mapping is brainstorming, organizing, and problem solving. This paper examines mind mapping techniques,…

  17. Analyzing the Scientific Evolution of Social Work Using Science Mapping

    Science.gov (United States)

    Martínez, Ma Angeles; Cobo, Manuel Jesús; Herrera, Manuel; Herrera-Viedma, Enrique

    2015-01-01

    Objectives: This article reports the first science mapping analysis of the social work field, which shows its conceptual structure and scientific evolution. Methods: Science Mapping Analysis Software Tool, a bibliometric science mapping tool based on co-word analysis and h-index, is applied using a sample of 18,794 research articles published from…

  18. Generic Kalman Filter Software

    Science.gov (United States)

    Lisano, Michael E., II; Crues, Edwin Z.

    2005-01-01

    The Generic Kalman Filter (GKF) software provides a standard basis for the development of application-specific Kalman-filter programs. Historically, Kalman filters have been implemented by customized programs that must be written, coded, and debugged anew for each unique application, then tested and tuned with simulated or actual measurement data. Total development times for typical Kalman-filter application programs have ranged from months to weeks. The GKF software can simplify the development process and reduce the development time by eliminating the need to re-create the fundamental implementation of the Kalman filter for each new application. The GKF software is written in the ANSI C programming language. It contains a generic Kalman-filter-development directory that, in turn, contains a code for a generic Kalman filter function; more specifically, it contains a generically designed and generically coded implementation of linear, linearized, and extended Kalman filtering algorithms, including algorithms for state- and covariance-update and -propagation functions. The mathematical theory that underlies the algorithms is well known and has been reported extensively in the open technical literature. Also contained in the directory are a header file that defines generic Kalman-filter data structures and prototype functions and template versions of application-specific subfunction and calling navigation/estimation routine code and headers. Once the user has provided a calling routine and the required application-specific subfunctions, the application-specific Kalman-filter software can be compiled and executed immediately. During execution, the generic Kalman-filter function is called from a higher-level navigation or estimation routine that preprocesses measurement data and post-processes output data. The generic Kalman-filter function uses the aforementioned data structures and five implementation- specific subfunctions, which have been developed by the user on
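
    The GKF itself is ANSI C with its own data structures and subfunctions; as a sketch of the state- and covariance-update and -propagation mathematics the abstract refers to, the following shows the standard linear Kalman predict/update cycle on an invented constant-velocity example. Names and values are illustrative, not the GKF's API.

        import numpy as np

        def kf_predict(x, P, F, Q):
            """Propagate state x and covariance P through dynamics F, noise Q."""
            return F @ x, F @ P @ F.T + Q

        def kf_update(x, P, z, H, R):
            """Fold measurement z into the estimate via model H and noise R."""
            S = H @ P @ H.T + R                   # innovation covariance
            K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
            x = x + K @ (z - H @ x)               # state update
            P = (np.eye(len(x)) - K @ H) @ P      # covariance update
            return x, P

        # 1-D constant-velocity example: state = [position, velocity].
        dt = 1.0
        F = np.array([[1, dt], [0, 1]]); Q = 0.01 * np.eye(2)
        H = np.array([[1.0, 0.0]]);      R = np.array([[0.5]])
        x, P = np.zeros(2), np.eye(2)
        for z in [1.1, 2.0, 2.9, 4.2]:            # noisy position measurements
            x, P = kf_predict(x, P, F, Q)
            x, P = kf_update(x, P, np.array([z]), H, R)
        print(f"estimated position {x[0]:.2f}, velocity {x[1]:.2f}")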

  19. Software redundancy: what, where, how

    OpenAIRE

    Mattavelli, Andrea; Pezzè, Mauro; Carzaniga, Antonio

    2017-01-01

    Software systems have become pervasive in everyday life and are the core component of many crucial activities. An inadequate level of reliability may determine the commercial failure of a software product. Still, despite the commitment and the rigorous verification processes employed by developers, software is deployed with faults. To increase the reliability of software systems, researchers have investigated the use of various form of redundancy. Informally, a software system is redunda...

  20. Numerical software: science or alchemy

    Energy Technology Data Exchange (ETDEWEB)

    Gear, C.W.

    1979-06-01

    This is a summary of the Forsythe lecture presented at the Computer Science Conference, Dayton, Ohio, in February 1979. It examines the activity called numerical software, first to see what distinguishes numerical software from any other form of software and why numerical software is so much more difficult, then to examine the scientific basis of such software and discuss what is lacking in that basis.