WorldWideScience

Sample records for postacquisition software-based scatter

  1. Post-Acquisition IT Integration

    DEFF Research Database (Denmark)

    Henningsson, Stefan; Yetton, Philip

    2013-01-01

    The extant research on post-acquisition IT integration analyzes how acquirers realize IT-based value in individual acquisitions. However, serial acquirers make 60% of acquisitions. These acquisitions are not isolated events, but are components in growth-by-acquisition programs. To explain how serial acquirers realize IT-based value, we develop three propositions on the sequential effects on post-acquisition IT integration in acquisition programs. Their combined explanation is that serial acquirers must have a growth-by-acquisition strategy that includes the capability to improve IT integration capabilities, to sustain high alignment across acquisitions and to maintain a scalable IT infrastructure with a flat or decreasing cost structure. We begin the process of validating the three propositions by investigating a longitudinal case study of a growth-by-acquisition program.

  2. Industry Relatedness and Post-Acquisition Innovative Performance

    DEFF Research Database (Denmark)

    Cefis, Elena; Marsili, Orietta; Rigamonti, Damiana

    2015-01-01

    This paper examines how characteristics of acquiring and acquired firms influence the curvilinear (inverted U-shaped) relationship between relatedness and post-acquisition innovative performance. Using a relatedness index based on industry co-occurrence in a sample of 1,736 Dutch acquisitions, we find that the acquirer's internal R&D and acquisition experience, and the small size of acquired firms, help to reach a balance between exploration of novelty and exploitation of synergies in unrelated acquisitions, and to achieve higher post-acquisition performance. However, while the acquirer's R&D increases flexibility in the acquisition process in the presence of deviations from the optimal level of relatedness, acquisition experience may enhance rigidities.

  3. Knowledge-sharing Behavior and Post-acquisition Integration Failure

    DEFF Research Database (Denmark)

    Gammelgaard, Jens; Husted, Kenneth; Michailova, Snejina

    2004-01-01

    Not achieving the anticipated synergy effects in the post-acquisition integration context is a serious cause of the high acquisition failure rate. While existing studies on failures of acquisitions come from economics, finance, strategy, organization theory, and human resources management, this paper applies insights from the knowledge-sharing literature. The paper establishes a conceptual link between obstacles in the post-acquisition integration processes and individual knowledge-sharing behavior as related to knowledge transmitters and knowledge receivers. We argue that such an angle offers important insights for explaining the high failure rate in acquisitions. Descriptors: post-acquisition integration, acquisition failure, individual knowledge-sharing behavior.

  4. Post-acquisition Integration as Sensemaking: Glimpses of Ambiguity, Confusion, Hypocrisy, and Politicization

    OpenAIRE

    Vaara, Eero

    2003-01-01

    Though many studies have examined post-acquisition integration challenges, they have mainly focused on rationalistic explanations for the difficulties encountered in post-acquisition integration. There remains little knowledge of how the ‘irrational’ features of post-acquisition decision-making may impede organizational integration. This study attempts to bridge that gap by examining post-acquisition decision-making from a sensemaking perspective. The paper presents an in-depth analysis of a ...

  5. Neutron Scattering Software

    Science.gov (United States)

    A new portal for neutron scattering software has been established, collecting links to packages such as KUPLOT (data plotting and fitting software) and ILL/TAS (Matlab programs for analyzing triple-axis data).

  6. A software-based x-ray scatter correction method for breast tomosynthesis

    OpenAIRE

    Jia Feng, Steve Si; Sechopoulos, Ioannis

    2011-01-01

    Purpose: To develop a software-based scatter correction method for digital breast tomosynthesis (DBT) imaging and investigate its impact on the image quality of tomosynthesis reconstructions of both phantoms and patients.

  7. The Impact of the Dimensions of Transformational Leadership on the Post-acquisition Performance of the Acquired Company

    Directory of Open Access Journals (Sweden)

    Sladjana Savovic

    2017-08-01

    Mergers and acquisitions (M&A) are important mechanisms through which companies can achieve growth, gain access to new markets and diversify their activities. Although companies engage in M&As with optimism, empirical evidence shows that many M&A transactions are not successful. Therefore, research is often focused on identifying ways to improve post-acquisition performance. One of the key success factors of M&A is to provide adequate transformational leadership during the process of change, especially in the critical phase of post-acquisition integration. A transformational leader should provide incentives and support to the employees in order for them to accept changes and focus on achieving challenging goals. This paper explores the impact of the different dimensions of transformational leadership on post-acquisition performance based on the example of a company operating in the Republic of Serbia’s retail sector, which was the subject of a cross-border acquisition. In order to ensure the adequate representativeness of the sample, a questionnaire was distributed in all parts of the company throughout the Republic of Serbia. The results of this study show that all the dimensions of transformational leadership positively impact post-acquisition performance. The “individual consideration” dimension of transformational leadership has the strongest impact on post-acquisition performance, whereas the “intellectual stimulation” dimension has the weakest.

  8. Technological Similarity, Post-acquisition R&D Reorganization, and Innovation Performance in Horizontal Acquisition

    DEFF Research Database (Denmark)

    Colombo, Massimo G.; Rabbiosi, Larissa

    2014-01-01

    This paper aims to disentangle the mechanisms through which technological similarity between acquiring and acquired firms influences innovation in horizontal acquisitions. We develop a theoretical model that links technological similarity to: (i) two key aspects of post-acquisition reorganization...... of acquired R&D operations – the rationalization of the R&D operations and the replacement of the R&D top manager, and (ii) two intermediate effects that are closely associated with the post-acquisition innovation performance of the combined firm – improvements in R&D productivity and disruptions in R......&D personnel. We rely on PLS techniques to test our theoretical model using detailed information on 31 horizontal acquisitions in high- and medium-tech industries. Our results indicate that in horizontal acquisitions, technological similarity negatively affects post-acquisition innovation performance...

  9. A software-based x-ray scatter correction method for breast tomosynthesis

    International Nuclear Information System (INIS)

    Jia Feng, Steve Si; Sechopoulos, Ioannis

    2011-01-01

    Purpose: To develop a software-based scatter correction method for digital breast tomosynthesis (DBT) imaging and investigate its impact on the image quality of tomosynthesis reconstructions of both phantoms and patients. Methods: A Monte Carlo (MC) simulation of x-ray scatter, with geometry matching that of the cranio-caudal (CC) view of a DBT clinical prototype, was developed using the Geant4 toolkit and used to generate maps of the scatter-to-primary ratio (SPR) of a number of homogeneous standard-shaped breasts of varying sizes. Dimension-matched SPR maps were then deformed and registered to DBT acquisition projections, allowing for the estimation of the primary x-ray signal acquired by the imaging system. Noise filtering of the estimated projections was then performed to reduce the impact of the quantum noise of the x-ray scatter. Three dimensional (3D) reconstruction was then performed using the maximum likelihood-expectation maximization (MLEM) method. This process was tested on acquisitions of a heterogeneous 50/50 adipose/glandular tomosynthesis phantom with embedded masses, fibers, and microcalcifications and on acquisitions of patients. The image quality of the reconstructions of the scatter-corrected and uncorrected projections was analyzed by studying the signal-difference-to-noise ratio (SDNR), the integral of the signal in each mass lesion (integrated mass signal, IMS), and the modulation transfer function (MTF). Results: The reconstructions of the scatter-corrected projections demonstrated superior image quality. The SDNR of masses embedded in a 5 cm thick tomosynthesis phantom improved 60%-66%, while the SDNR of the smallest mass in an 8 cm thick phantom improved by 59% (p < 0.01). The IMS of the masses in the 5 cm thick phantom also improved by 15%-29%, while the IMS of the masses in the 8 cm thick phantom improved by 26%-62% (p < 0.01). Some embedded microcalcifications in the tomosynthesis phantoms were visible only in the scatter
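
    The core arithmetic of such an SPR-based correction is simple; below is a minimal sketch with hypothetical array names, omitting the deformation/registration of the SPR maps and the noise filtering described above: with measured signal M = P(1 + SPR), the primary estimate is P = M / (1 + SPR).

```python
import numpy as np

def estimate_primary(projection: np.ndarray, spr_map: np.ndarray) -> np.ndarray:
    """Recover the primary x-ray signal from a measured DBT projection.

    Assumes the measured signal M decomposes as M = P + S with
    S = SPR * P, hence P = M / (1 + SPR). The SPR map must already be
    registered to the projection geometry (done by deformation and
    registration in the published method).
    """
    return projection / (1.0 + spr_map)
```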

  10. Comparison of four software packages applied to a scattering problem

    DEFF Research Database (Denmark)

    Albertsen, Niels Christian; Chesneaux, Jean-Marie; Christiansen, Søren

    1999-01-01

    We investigate characteristic features of four different software packages by applying them to the numerical solution of a non-trivial physical problem in computer simulation, viz., scattering of waves from a sinusoidal boundary. The numerical method used is based on boundary collocation.

  11. Software correction of scatter coincidence in positron CT

    International Nuclear Information System (INIS)

    Endo, M.; Iinuma, T.A.

    1984-01-01

    This paper describes a software correction of scatter coincidence in positron CT which is based on an estimation of scatter projections from true projections by an integral transform. Kernels for the integral transform are projected distributions of scatter coincidences for a line source at different positions in a water phantom and are calculated by Klein-Nishina's formula. True projections of any composite object can be determined from measured projections by iterative applications of the integral transform. The correction method was tested in computer simulations and phantom experiments with Positologica. The results showed that effects of scatter coincidence are not negligible in the quantitation of images, but the correction reduces them significantly. (orig.)
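
    A minimal sketch of the iterative scheme follows, assuming for illustration a single shift-invariant kernel rather than the position-dependent Klein-Nishina kernels used in the paper: scatter is estimated by convolving the current true-projection estimate with the kernel and subtracting it from the measurement.

```python
import numpy as np
from scipy.ndimage import convolve1d

def correct_scatter(measured, kernel, n_iter=5):
    """Iteratively estimate scatter-free projections.

    Model: measured = true + K(true), where K(true) is the scatter
    obtained by convolving the true-projection estimate with a
    precomputed scatter kernel. Iterating true <- measured - K(true)
    converges when the scatter fraction is modest.
    """
    measured = np.asarray(measured, dtype=float)
    true_est = measured.copy()
    for _ in range(n_iter):
        scatter = convolve1d(true_est, kernel, mode="nearest")
        true_est = measured - scatter
    return true_est
```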

  12. The importance of cultural leadership during post-acquisition integration

    OpenAIRE

    Mcconnon, Tom

    2013-01-01

    Mergers and acquisitions (M&A) are not only financial decisions but can also be understood as social processes. Due to the myriad of changes generated by an acquisition, the integration period is characterised by multiple adjustment difficulties. A substantive body of research blames post-acquisition ‘cultural clash’ caused by cultural differences between the two merging organisations as a major cause of disappointing integration outcomes. Yet research into the process of cultural leadership ...

  13. SIMSAS - a window based software package for simulation and analysis of multiple small-angle scattering data

    International Nuclear Information System (INIS)

    Jayaswal, B.; Mazumder, S.

    1998-09-01

    Small-angle scattering data from strongly scattering systems, e.g. porous materials, cannot be analysed by invoking the single-scattering approximation, as specimens needed to replicate the bulk matrix in essential properties are too thick for the approximation to hold. The presence of multiple scattering is indicated by the invalidity of the functional invariance property of the observed scattering profile under variation of sample thickness and/or wavelength of the probing radiation. This article delineates how failing to account for multiple scattering affects the results of analysis, and then how to correct the data for its effect. It presents an algorithm to extract the single-scattering profile from small-angle scattering data affected by multiple scattering. The algorithm can process the scattering data and deduce the single-scattering profile in absolute scale. A software package, SIMSAS, is introduced for executing this inversion step. This package is useful both for simulating and for analysing multiple small-angle scattering data. (author)

  14. Scatter radiation in digital tomosynthesis of the breast

    International Nuclear Information System (INIS)

    Sechopoulos, Ioannis; Suryanarayanan, Sankararaman; Vedantham, Srinivasan; D'Orsi, Carl J.; Karellas, Andrew

    2007-01-01

    Digital tomosynthesis of the breast is being investigated as one possible solution to the problem of tissue superposition present in planar mammography. This imaging technique presents various advantages that would make it a feasible replacement for planar mammography, among them similar, if not lower, glandular radiation dose to the breast; implementation on conventional digital mammography technology via relatively simple modifications; and fast acquisition time. One significant problem that tomosynthesis of the breast must overcome, however, is the reduction of x-ray scatter inclusion in the projection images. In tomosynthesis, due to the projection geometry and radiation dose considerations, the use of an antiscatter grid presents several challenges. Therefore, the use of postacquisition software-based scatter reduction algorithms seems well justified, requiring a comprehensive evaluation of x-ray scatter content in the tomosynthesis projections. This study aims to gain insight into the behavior of x-ray scatter in tomosynthesis by characterizing the scatter point spread functions (PSFs) and the scatter to primary ratio (SPR) maps found in tomosynthesis of the breast. This characterization was performed using Monte Carlo simulations, based on the Geant4 toolkit, that simulate the conditions present in a digital tomosynthesis system, including the simulation of the compressed breast in both the cranio-caudal (CC) and the medio-lateral oblique (MLO) views. The variation of the scatter PSF with varying tomosynthesis projection angle, as well as the effects of varying breast glandular fraction and x-ray spectrum, was analyzed. The behavior of the SPR for different projection angles, breast sizes, thicknesses, glandular fractions, and x-ray spectra was also analyzed, and computer fit equations for the magnitude of the SPR at the center of mass for both the CC and the MLO views were found. Within mammographic energies, the x-ray spectrum was found to have no appreciable

  15. Optimizing hippocampal segmentation in infants utilizing MRI post-acquisition processing.

    Science.gov (United States)

    Thompson, Deanne K; Ahmadzai, Zohra M; Wood, Stephen J; Inder, Terrie E; Warfield, Simon K; Doyle, Lex W; Egan, Gary F

    2012-04-01

    This study aims to determine the most reliable method for infant hippocampal segmentation by comparing magnetic resonance (MR) imaging post-acquisition processing techniques: contrast to noise ratio (CNR) enhancement, or reformatting to standard orientation. MR scans were performed with a 1.5 T GE scanner to obtain dual echo T2 and proton density (PD) images at term equivalent (38-42 weeks' gestational age). 15 hippocampi were manually traced four times on ten infant images by 2 independent raters on the original T2 image, as well as images processed by: a) combining T2 and PD images (T2-PD) to enhance CNR; then b) reformatting T2-PD images perpendicular to the long axis of the left hippocampus. CNRs and intraclass correlation coefficients (ICC) were calculated. T2-PD images had 17% higher CNR (15.2) than T2 images (12.6). Original T2 volumes' ICC was 0.87 for rater 1 and 0.84 for rater 2, whereas T2-PD images' ICC was 0.95 for rater 1 and 0.87 for rater 2. Reliability of hippocampal segmentation on T2-PD images was not improved by reformatting images (rater 1 ICC = 0.88, rater 2 ICC = 0.66). Post-acquisition processing can improve CNR and hence reliability of hippocampal segmentation in neonate MR scans when tissue contrast is poor. These findings may be applied to enhance boundary definition in infant segmentation for various brain structures or in any volumetric study where image contrast is sub-optimal, enabling hippocampal structure-function relationships to be explored.
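
    The CNR figure used for this comparison can be computed from two regions of interest; a minimal sketch with a generic definition and hypothetical masks (the exact ROI placement in the study is not reproduced here):

```python
import numpy as np

def cnr(image: np.ndarray, roi_signal: np.ndarray, roi_background: np.ndarray) -> float:
    """Contrast-to-noise ratio between two boolean ROI masks.

    Defined here as |mean(signal) - mean(background)| / std(background);
    other conventions exist.
    """
    fg = image[roi_signal]
    bg = image[roi_background]
    return float(abs(fg.mean() - bg.mean()) / bg.std())
```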

  16. Cognition and Knowledge Sharing in Post-acquisition Integration

    DEFF Research Database (Denmark)

    Jaura, Manya; Michailova, Snejina

    2014-01-01

    … conducted with ten respondents in four Indian IT companies that have acquired firms abroad. Findings: The authors find evidence supporting the negative effect of in- and out-group differentiation and the positive effect of interpersonal interaction on knowledge sharing among employees of the acquired … of organisational objectives in a post-acquisition context. Managers should understand that the knowledge their employees possess is a strategic asset, and therefore how they use it is influential in attaining organisational goals in general, and acquisition integration objectives in particular. The creation of task- and project-related communities or groups can help in establishing a shared organisational identity, especially after the turbulent event of one company acquiring another. The creation of communities or groups where socialisation is encouraged can lead to employees interacting with one...

  17. Software for simulation and design of neutron scattering instrumentation

    DEFF Research Database (Denmark)

    Bertelsen, Mads

    … designed using the software. The Union components use a new approach to the simulation of samples in McStas. The properties of a sample are split into geometrical and material, simplifying user input and allowing the construction of complicated geometries such as sample environments. Multiple scattering … from conventional choices. Simulation of neutron scattering instrumentation is used when designing instrumentation, but also to understand instrumental effects on the measured scattering data. The Monte Carlo ray-tracing package McStas is among the most popular, capable of simulating the path of each neutron through the instrument using an easy-to-learn language. The subject of the defended thesis is contributions to the McStas language in the form of the software package guide_bot and the Union components. The guide_bot package simplifies the process of optimizing neutron guides by writing the Mc...

  18. Impact on dose and image quality of a software-based scatter correction in mammography.

    Science.gov (United States)

    Monserrat, Teresa; Prieto, Elena; Barbés, Benigno; Pina, Luis; Elizalde, Arlette; Fernández, Belén

    2017-01-01

    Background In 2014, Siemens developed a new software-based scatter correction (Progressive Reconstruction Intelligently Minimizing Exposure [PRIME]), enabling grid-less digital mammography. Purpose To compare doses and image quality between PRIME (grid-less) and standard (with anti-scatter grid) modes. Material and Methods Contrast-to-noise ratio (CNR) was measured for various polymethylmethacrylate (PMMA) thicknesses, and the dose values provided by the mammography unit were recorded. CDMAM phantom images were acquired for various PMMA thicknesses and the inverse Image Quality Figure (IQFinv) was calculated. Values of incident entrance surface air kerma (ESAK) and average glandular dose (AGD) were obtained from the DICOM header for a total of 1088 pairs of clinical cases. Two experienced radiologists subjectively compared the image quality of a total of 149 pairs of clinical cases. Results CNR values were higher and doses were lower in PRIME mode for all thicknesses. IQFinv values in PRIME mode were lower for all thicknesses except for 40 mm of PMMA equivalent, for which IQFinv was slightly greater in PRIME mode. A mean reduction of 10% in ESAK and 12% in AGD was obtained in PRIME mode with respect to standard mode. The clinical image quality of PRIME and standard acquisitions was similar in most of the cases (84% for the first radiologist and 67% for the second). Conclusion The use of PRIME software reduces, on average, the radiation dose to the breast without affecting image quality. This reduction is greater for thinner and denser breasts.
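
    A commonly used form of the inverse image quality figure derived from a CDMAM reading is IQFinv = 100 / Σ C_i·D_i,min, where C_i is the gold-disc thickness of contrast level i and D_i,min the smallest diameter detected at that level; a sketch under that assumption (conventions for the constant and units vary between studies):

```python
def iqf_inv(contrasts, min_detectable_diameters):
    """Inverse image quality figure from a CDMAM phantom reading.

    contrasts: gold-disc thicknesses C_i per contrast level.
    min_detectable_diameters: smallest detected diameter D_i per level.
    Uses IQF_inv = 100 / sum(C_i * D_i); higher values mean better
    detectability.
    """
    return 100.0 / sum(c * d for c, d in zip(contrasts, min_detectable_diameters))
```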

  19. Post-Acquisition Release of Glutamate and Norepinephrine in the Amygdala Is Involved in Taste-Aversion Memory Consolidation

    Science.gov (United States)

    Guzman-Ramos, Kioko; Osorio-Gomez, Daniel; Moreno-Castilla, Perla; Bermudez-Rattoni, Federico

    2012-01-01

    Amygdala activity mediates the acquisition and consolidation of emotional experiences; we have recently shown that post-acquisition reactivation of this structure is necessary for the long-term storage of conditioned taste aversion (CTA). However, the specific neurotransmitters involved in such reactivation are not known. The aim of the present…

  20. DIII-D Thomson Scattering Diagnostic Data Acquisition, Processing and Analysis Software

    International Nuclear Information System (INIS)

    Middaugh, K.R.; Bray, B.D.; Hsieh, C.L.; McHarg, B.B.Jr.; Penaflor, B.G.

    1999-01-01

    One of the diagnostic systems critical to the success of the DIII-D tokamak experiment is the Thomson scattering diagnostic. This diagnostic is unique in that it measures local electron temperature and density: (1) at multiple locations within the tokamak plasma; and (2) at different times throughout the plasma duration. Thomson "raw" data are digitized signals of scattered light, measured at different times and locations, from the laser beam paths fired into the plasma. Real-time acquisition of this data is performed by specialized hardware. Once obtained, the raw data are processed into meaningful temperature and density values which can be analyzed for measurement quality. This paper will provide an overview of the entire Thomson scattering diagnostic software and will focus on the data acquisition, processing, and analysis software implementation. The software falls into three general categories: (1) Set-up and Control: initializes and controls all Thomson hardware and software, synchronizes with other DIII-D computers, and invokes other Thomson software as appropriate. (2) Data Acquisition and Processing: obtains raw measured data from memory and processes it into temperature and density values. (3) Analysis: provides a graphical user interface in which to perform analysis and sophisticated plotting of analysis parameters.

  1. Post-acquisition data mining techniques for LC-MS/MS-acquired data in drug metabolite identification.

    Science.gov (United States)

    Dhurjad, Pooja Sukhdev; Marothu, Vamsi Krishna; Rathod, Rajeshwari

    2017-08-01

    Metabolite identification is a crucial part of the drug discovery process. LC-MS/MS-based metabolite identification has gained widespread use, but the data acquired by the LC-MS/MS instrument is complex, and thus the interpretation of data becomes troublesome. Fortunately, advancements in data mining techniques have simplified the process of data interpretation with improved mass accuracy and provide a potentially selective, sensitive, accurate and comprehensive way for metabolite identification. In this review, we have discussed the targeted (extracted ion chromatogram, mass defect filter, product ion filter, neutral loss filter and isotope pattern filter) and untargeted (control sample comparison, background subtraction and metabolomic approaches) post-acquisition data mining techniques, which facilitate the drug metabolite identification. We have also discussed the importance of integrated data mining strategy.
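
    As an illustration of one of these targeted techniques, a minimal sketch of a mass defect filter (generic logic with hypothetical names; vendor implementations differ): ions are retained only if their mass defect falls within a tolerance window around the parent drug's.

```python
def mass_defect_filter(mz_values, parent_mz, window_mda=50.0):
    """Retain m/z values whose mass defect is close to the parent drug's.

    The mass defect is the difference between an exact mass and the
    nearest integer mass; common metabolites shift it by only a few
    tens of millidaltons relative to the parent. window_mda is the
    tolerance in mDa.
    """
    def defect(m):
        return m - round(m)

    tol = window_mda / 1000.0
    parent_defect = defect(parent_mz)
    return [m for m in mz_values if abs(defect(m) - parent_defect) <= tol]
```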

  2. Software design of control system of CCD side-scatter lidar

    Science.gov (United States)

    Kuang, Zhiqiang; Liu, Dong; Deng, Qian; Zhang, Zhanye; Wang, Zhenzhu; Yu, Siqi; Tao, Zongming; Xie, Chenbo; Wang, Yingjian

    2018-03-01

    Because of the blind zone and transition zone, the application of backscattering lidar near the ground is limited. A side-scatter lidar equipped with a Charge Coupled Device (CCD) can separate the transmitting and receiving devices, avoiding the impact of the geometric factors present in backscattering lidar and detecting near-ground aerosol signals continuously and more precisely. The theory of CCD side-scatter lidar and the design of its control system are introduced. Control of the laser and CCD and an automatic data-processing method for the side-scatter lidar are developed using Visual C#. The results, compared with calibrated atmospheric aerosol lidar data, show that signals from the CCD side-scatter lidar are credible.

  3. A customizable software for fast reduction and analysis of large X-ray scattering data sets: applications of the new DPDAK package to small-angle X-ray scattering and grazing-incidence small-angle X-ray scattering.

    Science.gov (United States)

    Benecke, Gunthard; Wagermaier, Wolfgang; Li, Chenghao; Schwartzkopf, Matthias; Flucke, Gero; Hoerth, Rebecca; Zizak, Ivo; Burghammer, Manfred; Metwalli, Ezzeldin; Müller-Buschbaum, Peter; Trebbin, Martin; Förster, Stephan; Paris, Oskar; Roth, Stephan V; Fratzl, Peter

    2014-10-01

    X-ray scattering experiments at synchrotron sources are characterized by large and constantly increasing amounts of data. The great number of files generated during a synchrotron experiment is often a limiting factor in the analysis of the data, since appropriate software is rarely available to perform fast and tailored data processing. Furthermore, it is often necessary to perform online data reduction and analysis during the experiment in order to interactively optimize experimental design. This article presents an open-source software package developed to process large amounts of data from synchrotron scattering experiments. These data reduction processes involve calibration and correction of raw data, one- or two-dimensional integration, as well as fitting and further analysis of the data, including the extraction of certain parameters. The software, DPDAK (directly programmable data analysis kit), is based on a plug-in structure and allows individual extension in accordance with the requirements of the user. The article demonstrates the use of DPDAK for on- and offline analysis of scanning small-angle X-ray scattering (SAXS) data on biological samples and microfluidic systems, as well as for a comprehensive analysis of grazing-incidence SAXS data. In addition to a comparison with existing software packages, the structure of DPDAK and the possibilities and limitations are discussed.
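
    The plug-in idea can be illustrated with a generic sketch (this is not the actual DPDAK API): named processing steps are registered and then applied in sequence to each incoming scattering frame.

```python
import numpy as np

class Pipeline:
    """Minimal plug-in pipeline: processing steps are registered by name
    and applied in order to each frame (generic illustration only)."""

    def __init__(self):
        self.steps = []  # list of (name, callable) pairs

    def register(self, name, func):
        self.steps.append((name, func))

    def process(self, frame: np.ndarray) -> np.ndarray:
        for _, func in self.steps:
            frame = func(frame)
        return frame

# Hypothetical usage:
#   pipe = Pipeline()
#   pipe.register("dark", lambda f: f - dark_frame)
#   pipe.register("normalize", lambda f: f / f.max())
#   reduced = pipe.process(raw_frame)
```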

  4. Physics Model-Based Scatter Correction in Multi-Source Interior Computed Tomography.

    Science.gov (United States)

    Gong, Hao; Li, Bin; Jia, Xun; Cao, Guohua

    2018-02-01

    Multi-source interior computed tomography (CT) has a great potential to provide ultra-fast and organ-oriented imaging at low radiation dose. However, X-ray cross scattering from multiple simultaneously activated X-ray imaging chains compromises imaging quality. Previously, we published two hardware-based scatter correction methods for multi-source interior CT. Here, we propose a software-based scatter correction method, with the benefit of requiring no hardware modifications. The new method is based on a physics model and an iterative framework. The physics model was derived analytically and used to calculate X-ray scattering signals in both the forward direction and cross directions in multi-source interior CT. The physics model was integrated into an iterative scatter correction framework to reduce scatter artifacts. The method was applied to phantom data from both Monte Carlo simulations and physical experiments that were designed to emulate the image acquisition in a multi-source interior CT architecture recently proposed by our team. The proposed scatter correction method reduced scatter artifacts significantly, even with only one iteration. Within a few iterations, the reconstructed images converged rapidly toward the "scatter-free" reference images. After applying the scatter correction method, the maximum CT number error at the regions-of-interest (ROIs) was reduced to 46 HU in the numerical phantom dataset and 48 HU in the physical phantom dataset, respectively, and the contrast-to-noise ratio at those ROIs increased by up to 44.3% and up to 19.7%, respectively. The proposed physics model-based iterative scatter correction method could be useful for scatter correction in dual-source or multi-source CT.
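
    The iterative framework described above can be sketched generically; the reconstruct and scatter_model callables below stand in for the reconstruction step and the analytically derived physics model, which are not reproduced here.

```python
import numpy as np

def iterative_scatter_correction(measured, reconstruct, scatter_model, n_iter=3):
    """Generic iterative scatter-correction loop.

    reconstruct: projections -> image.
    scatter_model: image -> scatter estimate in the projection domain.
    Each pass re-estimates scatter from the latest reconstruction and
    subtracts it from the raw measurement.
    """
    measured = np.asarray(measured, dtype=float)
    corrected = measured.copy()
    for _ in range(n_iter):
        image = reconstruct(corrected)
        scatter = scatter_model(image)
        corrected = measured - scatter
    return reconstruct(corrected)
```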

  5. Post-acquisition repetitive thought in fear conditioning: an experimental investigation of the effect of CS-US-rehearsal.

    Science.gov (United States)

    Joos, Els; Vansteenwegen, Debora; Hermans, Dirk

    2012-06-01

    Although repetitive thought (e.g., worry) is generally assumed to be a risk factor for psychopathological disorders such as anxiety disorders, the repetitive thought processes occurring after a conditioning event have not yet received much theoretical attention. However, as repetitive thought can be mimicked by (mental) rehearsal, which is well-known to enhance memory performance, it seems worthwhile to explore the role of rehearsal in conditioning. Therefore, the current study investigates the impact of rehearsing an acquired CS-US-contingency on subsequent conditioned fear responding. After acquiring two CS-US-contingencies with either a human scream or a white noise as US, participants were instructed to rehearse one of these CS-US-pairings during an experimental session as well as during the following week. Fear responding to the CS which was previously paired with the scream persisted in the participants who rehearsed the CS-US(scream)-contingency, but decreased in those participants who rehearsed the CS-US(noise)-contingency. The same pattern emerged in the US-expectancy ratings, but the effect failed to reach significance. For the CS which was paired with the noise-US, no rehearsal effect emerged. As acquisition to the noise-US was less pronounced and less robust as compared to the scream-US, claims regarding the rehearsal effect might be hampered for the CS-US(noise)-contingency. Repetitive post-acquisition activation of a CS-US-contingency impacts CR retention. As the USs were not rated as more intense, aversive or startling after rehearsal compared to post-acquisition, US-inflation is discarded as a possible explanation of this effect.

  6. Coupling an analytical description of anti-scatter grids with simulation software of radiographic systems using Monte Carlo code

    International Nuclear Information System (INIS)

    Rinkel, J.; Dinten, J.M.; Tabary, J.

    2004-01-01

    The use of focused anti-scatter grids on digital radiographic systems with two-dimensional detectors produces acquisitions with a decreased scatter-to-primary ratio and thus improved contrast and resolution. Simulation software is of great interest for optimizing grid configuration for a specific application. Classical simulators are based on complete, detailed geometric descriptions of the grid. They are accurate but very time consuming, since they use Monte Carlo code to simulate scatter within the high-frequency grids. We propose a new practical method which couples an analytical simulation of the grid interaction with a radiographic system simulation program. First, a two-dimensional probability matrix depending on the grid is created offline, in which the first dimension represents the angle of impact with respect to the normal to the grid lines and the other the energy of the photon. This probability matrix is then used by the Monte Carlo simulation software to provide the final scattered-flux image. To evaluate the gain in CPU time, we define the increasing factor as the factor by which the simulation CPU time grows when the grid is included. Increasing factors were calculated with the new model and with the classical method representing the grid by its CAD model as part of the object. With the new method, increasing factors are lower by one to two orders of magnitude compared with the classical one. These results were obtained with a difference in calculated scatter of less than five percent between the new and the classical method. (authors)
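
    A sketch of how such a precomputed probability matrix might be queried during photon transport (hypothetical class and bin layout; the paper's exact data structure is not specified here):

```python
import numpy as np

class GridTransmissionLUT:
    """Offline-built lookup table: prob[i, j] is the grid transmission
    probability for impact-angle bin i (w.r.t. the normal to the grid
    lines) and photon-energy bin j."""

    def __init__(self, angle_bins, energy_bins, prob):
        self.angle_bins = np.asarray(angle_bins)
        self.energy_bins = np.asarray(energy_bins)
        self.prob = np.asarray(prob)

    def transmission(self, angle: float, energy: float) -> float:
        # Locate the bin for each coordinate, clamped to the table edges.
        i = np.clip(np.searchsorted(self.angle_bins, angle) - 1, 0, self.prob.shape[0] - 1)
        j = np.clip(np.searchsorted(self.energy_bins, energy) - 1, 0, self.prob.shape[1] - 1)
        return float(self.prob[i, j])
```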

  7. The new 'BerSANS-PC' software for reduction and treatment of small angle neutron scattering data

    International Nuclear Information System (INIS)

    Keiderling, U.

    2002-01-01

    Measurements on small angle neutron scattering (SANS) instruments are typically characterized by a large number of samples, short measurement times for the individual samples, and a frequent change of visiting scientist groups. Besides this, recent advances in instrumentation have led to more frequent measurements of kinetic sequences and a growing interest in analyzing two-dimensional scattering data, which require special software tools that enable the users to extract physically relevant information from the scattering data with a minimum of effort. The new 'BerSANS-PC' data-processing software has been developed at the Hahn-Meitner-Institut (HMI) in Berlin, Germany, to meet these requirements and to support an efficiently working guest-user service. Comprising some basic functions of the 'BerSANS' program available at the HMI and other institutes in the past, BerSANS-PC is a completely new development for network-independent use on local PCs with a full-featured graphical interface. (orig.)

  8. Top Management Turnover, Legitimacy and Acquisition Performance in the Post-Acquisition Phase: An Institutional Approach

    Institute of Scientific and Technical Information of China (English)

    乐琦

    2012-01-01

    Top management turnover in the post-acquisition phase has an important effect on acquisition performance, but existing research findings are inconsistent. Based on institutional theory, this paper introduces the concept of legitimacy and uses a sample of 123 acquisitions to empirically analyze the relationships among post-acquisition top management turnover, legitimacy, and acquisition performance. The results show that post-acquisition top management turnover is significantly negatively related to both the external and the internal legitimacy of the acquisition, while external and internal legitimacy have significant positive effects on acquisition performance. These conclusions offer theoretical guidance for Chinese firms' post-acquisition top management turnover decisions and for improving acquisition performance. Top management turnover in the post-acquisition phase has a significant effect on corporate acquisition performance. However, findings of these studies are inconsistent. Most studies have been done from the perspective of enterprise resources theory, organizational learning theory, corporate culture theory and transaction cost theory to explore the relationship between post-acquisition TMT and firm performance. However, this area of research is rarely investigated from the institutional theory perspective. The institutional theory emphasizes that each organization is embedded in its institutional environment, in which it is subject to various institutional factors. Especially in China and other emerging markets, these institutional factors may have direct or indirect effects on corporate strategic behavior and performance. In addition, corporate strategic choices and behaviors will result in different degrees of legitimacy with respect to these institutional factors. The current study argues that the social recognition obtained from all the stakeholders represents the legitimacy of the corporate behavior or the firm itself. Different groups of stakeholders judge the legitimacy of a firm based on different perspectives and interests. Different strategic actions may result in different levels of legitimacy, which will further stimulate or constrain corporate strategic behavior and thus affect the

  9. Omics Informatics: From Scattered Individual Software Tools to Integrated Workflow Management Systems.

    Science.gov (United States)

    Ma, Tianle; Zhang, Aidong

    2017-01-01

    Omic data analyses pose great informatics challenges. As an emerging subfield of bioinformatics, omics informatics focuses on analyzing multi-omic data efficiently and effectively, and is gaining momentum. There are two underlying trends in the expansion of the omics informatics landscape: the explosion of scattered individual omics informatics tools, each of which focuses on a specific task in both single- and multi-omic settings, and the fast-evolving integrated software platforms such as workflow management systems that can assemble multiple tools into pipelines and streamline integrative analysis for complicated tasks. In this survey, we give a holistic view of omics informatics, from scattered individual informatics tools to integrated workflow management systems. We not only outline the landscape and challenges of omics informatics, but also sample a number of widely used and cutting-edge algorithms in omics data analysis to give readers a fine-grained view. We survey various workflow management systems (WMSs), classify them into three levels, from simple software toolkits to integrated multi-omic analytical platforms, and point out the emerging need for intelligent workflow management systems. We also discuss the challenges, strategies and some existing work in the systematic evaluation of omics informatics tools. We conclude by providing future perspectives on emerging fields and new frontiers in omics informatics.

  10. Package-based software development

    NARCIS (Netherlands)

    Jonge, de M.; Chroust, G.; Hofer, C.

    2003-01-01

    The main goal of component-based software engineering is to decrease development time and development costs of software systems, by reusing prefabricated building blocks. Here we focus on software reuse within the implementation of such component-based applications, and on the corresponding software

  11. Model-Based Software Testing for Object-Oriented Software

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Model-based testing is one of the best solutions for testing object-oriented software. It has a better test coverage than other testing styles. Model-based testing takes into consideration behavioural aspects of a class, which are usually unchecked in other testing methods. An increase in the complexity of software has forced the software industry…
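
    The behavioural aspect can be illustrated with a generic sketch of state-machine-based test generation (a simplified stand-in for the model-based techniques the article discusses): feasible event sequences of a fixed length are enumerated from a transition table, each one becoming a test case.

```python
from itertools import product

def generate_test_sequences(transitions, start, depth=3):
    """Enumerate feasible event sequences from a state-machine model.

    transitions: dict mapping (state, event) -> next_state.
    Returns every event sequence of length `depth` executable from
    `start`; each sequence is a candidate test case for the class.
    """
    events = sorted({event for (_, event) in transitions})
    cases = []
    for seq in product(events, repeat=depth):
        state, feasible = start, True
        for event in seq:
            state = transitions.get((state, event))
            if state is None:
                feasible = False
                break
        if feasible:
            cases.append(seq)
    return cases
```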

  12. GPIB based instrumentation and control system for ADITYA Thomson Scattering Diagnostic

    Energy Technology Data Exchange (ETDEWEB)

    Patel, Kiran, E-mail: kkpatel@ipr.res.in; Pillai, Vishal; Singh, Neha; Chaudhary, Vishnu; Thomas, Jinto; Kumar, Ajai

    2016-11-15

    The ADITYA Thomson Scattering Diagnostic is a single-point Ruby-laser-based system with a spectrometer for spectral dispersion and photomultiplier tubes for the detection of scattered light. The system uses a CAMAC (Computer Automated Measurement And Control) based control and data acquisition system, which synchronizes the Ruby laser, detectors and digitizer. The previously used serial CAMAC controller has been upgraded to a GPIB (General Purpose Interface Bus) based CAMAC controller for configuration and data transfer, and the communication protocols of the different instruments have been unified over GPIB for a simpler interface. The entire control and data acquisition program is developed on the LabVIEW platform for versatile operation of the diagnostic, with an improved, user-friendly GUI (Graphical User Interface) that allows the user to remotely update the laser firing time with respect to the plasma shot. The software handshakes with the Tokamak main control program over the network to minimize manual intervention in the operation of the diagnostic. The upgraded system improves on its predecessor in terms of data transmission rate, ease of maintenance, and upgradability.
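
    For illustration, a GPIB exchange of the kind such a controller upgrade enables might look as follows in Python with PyVISA (the address and SCPI-style commands are hypothetical; the actual system is implemented in LabVIEW and the CAMAC controller's command set differs):

```python
import pyvisa

# Open the GPIB resource (address is hypothetical).
rm = pyvisa.ResourceManager()
controller = rm.open_resource("GPIB0::5::INSTR")

# Identify the instrument and clear its status registers.
print(controller.query("*IDN?"))
controller.write("*CLS")
controller.close()
```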

  13. GPIB based instrumentation and control system for ADITYA Thomson Scattering Diagnostic

    International Nuclear Information System (INIS)

    Patel, Kiran; Pillai, Vishal; Singh, Neha; Chaudhary, Vishnu; Thomas, Jinto; Kumar, Ajai

    2016-01-01

    The ADITYA Thomson Scattering Diagnostic is a single-point Ruby-laser-based system with a spectrometer for spectral dispersion and photomultiplier tubes for the detection of scattered light. The system uses a CAMAC (Computer Automated Measurement And Control) based control and data acquisition system, which synchronizes the Ruby laser, detectors and digitizer. The previously used serial CAMAC controller has been upgraded to a GPIB (General Purpose Interface Bus) based CAMAC controller for configuration and data transfer, and the communication protocols of the different instruments have been unified over GPIB for a simpler interface. The entire control and data acquisition program is developed on the LabVIEW platform for versatile operation of the diagnostic, with an improved, user-friendly GUI (Graphical User Interface) that allows the user to remotely update the laser firing time with respect to the plasma shot. The software handshakes with the Tokamak main control program over the network to minimize manual intervention in the operation of the diagnostic. The upgraded system improves on its predecessor in terms of data transmission rate, ease of maintenance, and upgradability.

  14. Application of Metric-based Software Reliability Analysis to Example Software

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Smidts, Carol

    2008-07-01

    The software reliability of the TELLERFAST ATM software is analyzed using two metric-based software reliability analysis methods: a state-transition-diagram-based method and a test-coverage-based method. The procedures for software reliability analysis using the two methods and the analysis results are provided in this report. It is found that the two methods complement each other, and further research on combining them so that software reliability analysis benefits from this complementarity is therefore recommended.

  15. Simulation on scattering features of biological tissue based on generated refractive-index model

    International Nuclear Information System (INIS)

    Wang Baoyong; Ding Zhihua

    2011-01-01

    Important information on the morphology of biological tissue can be deduced from elastic scattering spectra, and their analysis is based on a known refractive-index model of the tissue. In this paper, a new numerical refractive-index model is put forward, and its scattering properties are intensively studied. Spectral decomposition [1] is a widely used method for generating random media in geology, but it has not been used in biology. Biological tissue differs from geological media in the sense of a random medium: an autocorrelation function describes almost all features in geology, but biological tissue is not as random; its structure is regular in the sense of fractal geometry [2], and a fractal dimension can be used to describe its regularity under randomness. Firstly, the scattering theories of such fractal media are reviewed. Secondly, the detailed generation process of the refractive index is presented. Finally, the scattering features are simulated in the FDTD (Finite-Difference Time-Domain) Solutions software. From the simulation results, we find that the autocorrelation length and fractal dimension control the scattering features of biological tissue.
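
    A minimal sketch of generating such a refractive-index map by spectral synthesis (one common way to realize a fractal random medium; parameter names and values are illustrative, not those of the paper): white noise is filtered in the Fourier domain with a power-law amplitude, whose exponent sets the fractal dimension.

```python
import numpy as np

def fractal_refractive_index(n=256, beta=3.0, n0=1.38, dn=0.02, seed=0):
    """2D refractive-index map via power-law spectral synthesis.

    White Gaussian noise is filtered with |k|**(-beta/2); beta fixes
    the spectral slope and hence the fractal dimension. n0 is the mean
    index, dn the standard deviation of the fluctuations.
    """
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal((n, n))
    kx = np.fft.fftfreq(n)
    ky = np.fft.fftfreq(n)
    k = np.hypot(kx[None, :], ky[:, None])
    k[0, 0] = np.inf  # suppress the DC singularity (filter -> 0 there)
    field = np.fft.ifft2(np.fft.fft2(noise) * k ** (-beta / 2.0)).real
    field = (field - field.mean()) / field.std()
    return n0 + dn * field
```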

  16. ORNL-SAS: Versatile software for calculation of small-angle x-ray and neutron scattering intensity profiles from arbitrary structures

    International Nuclear Information System (INIS)

    Heller, William T; Tjioe, Elina

    2007-01-01

    ORNL-SAS is software for calculating solution small-angle scattering intensity profiles from any structure provided in the Protein Data Bank format and can also compare the results with experimental data

  17. Interface-based software integration

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2016-07-01

    Enterprise architecture frameworks define the goals of enterprise architecture in order to make business processes and IT operations more effective, and to reduce the risk of future investments. These enterprise architecture frameworks offer different architecture development methods that help in building enterprise architecture. In practice, the larger organizations become, the larger their enterprise architecture and IT become. This leads to an increasingly complex system of enterprise architecture development and maintenance. Application software architecture is one type of architecture that, along with business architecture, data architecture and technology architecture, composes enterprise architecture. From the perspective of integration, enterprise architecture can be considered a system of interaction between multiple examples of application software. Therefore, effective software integration is a very important basis for the future success of the enterprise architecture in question. This article will provide interface-based integration practice in order to help simplify the process of building such a software integration system. The main goal of interface-based software integration is to solve problems that may arise with software integration requirements and developing software integration architecture.

  18. Library based x-ray scatter correction for dedicated cone beam breast CT

    International Nuclear Information System (INIS)

    Shi, Linxi; Zhu, Lei; Vedantham, Srinivasan; Karellas, Andrew

    2016-01-01

    Purpose: The image quality of dedicated cone beam breast CT (CBBCT) is limited by substantial scatter contamination, resulting in cupping artifacts and contrast-loss in reconstructed images. Such effects obscure the visibility of soft-tissue lesions and calcifications, which hinders breast cancer detection and diagnosis. In this work, we propose a library-based software approach to suppress scatter on CBBCT images with high efficiency, accuracy, and reliability. Methods: The authors precompute a scatter library on simplified breast models with different sizes using the GEANT4-based Monte Carlo (MC) toolkit. The breast is approximated as a semiellipsoid with homogeneous glandular/adipose tissue mixture. For scatter correction on real clinical data, the authors estimate the breast size from a first-pass breast CT reconstruction and then select the corresponding scatter distribution from the library. The selected scatter distribution from simplified breast models is spatially translated to match the projection data from the clinical scan and is subtracted from the measured projection for effective scatter correction. The method performance was evaluated using 15 sets of patient data, with a wide range of breast sizes representing about 95% of general population. Spatial nonuniformity (SNU) and contrast to signal deviation ratio (CDR) were used as metrics for evaluation. Results: Since the time-consuming MC simulation for library generation is precomputed, the authors’ method efficiently corrects for scatter with minimal processing time. Furthermore, the authors find that a scatter library on a simple breast model with only one input parameter, i.e., the breast diameter, sufficiently guarantees improvements in SNU and CDR. For the 15 clinical datasets, the authors’ method reduces the average SNU from 7.14% to 2.47% in coronal views and from 10.14% to 3.02% in sagittal views. On average, the CDR is improved by a factor of 1.49 in coronal views and 2.12 in sagittal
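
    The selection-and-subtraction step reduces to very little code; a minimal sketch with a hypothetical data layout, omitting the first-pass size estimation and the spatial translation of the scatter map:

```python
def correct_with_library(projection, breast_diameter_mm, scatter_library):
    """Library-based scatter subtraction.

    scatter_library: dict mapping precomputed breast diameters (mm) to
    Monte Carlo scatter distributions in the projection geometry. The
    closest library entry is selected and subtracted from the measured
    projection.
    """
    nearest = min(scatter_library, key=lambda d: abs(d - breast_diameter_mm))
    return projection - scatter_library[nearest]
```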

  19. Library based x-ray scatter correction for dedicated cone beam breast CT

    Energy Technology Data Exchange (ETDEWEB)

    Shi, Linxi; Zhu, Lei, E-mail: leizhu@gatech.edu [Nuclear and Radiological Engineering and Medical Physics Programs, The George W. Woodruff School of Mechanical Engineering, Georgia Institute of Technology, Atlanta, Georgia 30332 (United States); Vedantham, Srinivasan; Karellas, Andrew [Department of Radiology, University of Massachusetts Medical School, Worcester, Massachusetts 01655 (United States)

    2016-08-15

    Purpose: The image quality of dedicated cone beam breast CT (CBBCT) is limited by substantial scatter contamination, resulting in cupping artifacts and contrast-loss in reconstructed images. Such effects obscure the visibility of soft-tissue lesions and calcifications, which hinders breast cancer detection and diagnosis. In this work, we propose a library-based software approach to suppress scatter on CBBCT images with high efficiency, accuracy, and reliability. Methods: The authors precompute a scatter library on simplified breast models with different sizes using the GEANT4-based Monte Carlo (MC) toolkit. The breast is approximated as a semiellipsoid with homogeneous glandular/adipose tissue mixture. For scatter correction on real clinical data, the authors estimate the breast size from a first-pass breast CT reconstruction and then select the corresponding scatter distribution from the library. The selected scatter distribution from simplified breast models is spatially translated to match the projection data from the clinical scan and is subtracted from the measured projection for effective scatter correction. The method performance was evaluated using 15 sets of patient data, with a wide range of breast sizes representing about 95% of general population. Spatial nonuniformity (SNU) and contrast to signal deviation ratio (CDR) were used as metrics for evaluation. Results: Since the time-consuming MC simulation for library generation is precomputed, the authors’ method efficiently corrects for scatter with minimal processing time. Furthermore, the authors find that a scatter library on a simple breast model with only one input parameter, i.e., the breast diameter, sufficiently guarantees improvements in SNU and CDR. For the 15 clinical datasets, the authors’ method reduces the average SNU from 7.14% to 2.47% in coronal views and from 10.14% to 3.02% in sagittal views. On average, the CDR is improved by a factor of 1.49 in coronal views and 2.12 in sagittal

  20. Crowdsourcing cloud-based software development

    CERN Document Server

    Li, Wei; Tsai, Wei-Tek; Wu, Wenjun

    2015-01-01

    This book presents the latest research on the software crowdsourcing approach to develop large and complex software in a cloud-based platform. It develops the fundamental principles, management organization and processes, and a cloud-based infrastructure to support this new software development approach. The book examines a variety of issues in software crowdsourcing processes, including software quality, costs, diversity of solutions, and the competitive nature of crowdsourcing processes. Furthermore, the book outlines a research roadmap of this emerging field, including all the key technology and management issues for the foreseeable future. Crowdsourcing, as demonstrated by Wikipedia and Facebook for online web applications, has shown promising results for a variety of applications, including healthcare, business, gold mining exploration, education, and software development. Software crowdsourcing is emerging as a promising solution to designing, developing and maintaining software. Preliminary software cr...

  1. Model-based Software Engineering

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2010-01-01

    The vision of model-based software engineering is to make models the main focus of software development and to automatically generate software from these models. Part of that idea works already today. But, there are still difficulties when it comes to behaviour. Actually, there is no lack in models...

  2. Interface-based software testing

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2016-10-01

    Software quality is determined by assessing the characteristics that specify how it should work, which are verified through testing. If it were possible to touch, see, or measure software, it would be easier to analyze and prove its quality. Unfortunately, software is an intangible asset, which makes testing complex. This is especially true when software quality is not a question of particular functions that can be tested through a graphical user interface. The primary objective of software architecture is to design quality of software through modeling and visualization. There are many methods and standards that define how to control and manage quality. However, many IT software development projects still fail due to the difficulties involved in measuring, controlling, and managing software quality. Software quality failure factors are numerous. Examples include beginning to test software too late in the development process, or failing properly to understand, or design, the software architecture and the software component structure. The goal of this article is to provide an interface-based software testing technique that better measures software quality, automates software quality testing, encourages early testing, and increases the software’s overall testability

  3. Towards Archetypes-Based Software Development

    Science.gov (United States)

    Piho, Gunnar; Roost, Mart; Perkins, David; Tepandi, Jaak

    We present a framework for the archetypes based engineering of domains, requirements and software (Archetypes-Based Software Development, ABD). An archetype is defined as a primordial object that occurs consistently and universally in business domains and in business software systems. An archetype pattern is a collaboration of archetypes. Archetypes and archetype patterns are used to capture conceptual information into domain specific models that are utilized by ABD. The focus of ABD is on software factories - family-based development artefacts (domain specific languages, patterns, frameworks, tools, micro processes, and others) that can be used to build the family members. We demonstrate the usage of ABD for developing laboratory information management system (LIMS) software for the Clinical and Biomedical Proteomics Group, at the Leeds Institute of Molecular Medicine, University of Leeds.

  4. Iterative scatter correction for grid-less bedside chest radiography: performance for a chest phantom.

    Science.gov (United States)

    Mentrup, Detlef; Jockel, Sascha; Menser, Bernd; Neitzel, Ulrich

    2016-06-01

    The aim of this work was to experimentally compare the contrast improvement factors (CIFs) of a newly developed software-based scatter correction to the CIFs achieved by an antiscatter grid. To this end, three aluminium discs were placed in the lung, retrocardial and abdominal areas of a thorax phantom, and digital radiographs of the phantom were acquired both with and without a stationary grid. The contrast generated by the discs was measured in both images, and the CIFs achieved by grid usage were determined for each disc. Additionally, the non-grid images were processed with the scatter correction software. The contrasts generated by the discs were determined in the scatter-corrected images, and the corresponding CIFs were calculated. The CIFs obtained with the grid and with the software were in good agreement. In conclusion, the experiment demonstrates quantitatively that software-based scatter correction can restore the image contrast of a non-grid image in a manner comparable with an antiscatter grid.
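
    The CIF arithmetic behind this comparison is straightforward; a sketch assuming a simple relative-contrast definition and hypothetical ROI masks (the paper's exact contrast definition is not reproduced):

```python
import numpy as np

def local_contrast(image: np.ndarray, disc: np.ndarray, surround: np.ndarray) -> float:
    """Relative contrast of a disc against its surround (boolean masks)."""
    i_disc = image[disc].mean()
    i_bg = image[surround].mean()
    return (i_bg - i_disc) / i_bg

def contrast_improvement_factor(img_corrected, img_nogrid, disc, surround) -> float:
    """CIF: disc contrast after scatter removal (grid or software)
    divided by its contrast in the unprocessed non-grid image."""
    return local_contrast(img_corrected, disc, surround) / local_contrast(img_nogrid, disc, surround)
```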

  5. Reliability analysis of software based safety functions

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1993-05-01

    The methods applicable to the reliability analysis of software-based safety functions are described in the report. Although the safety functions also include other components, the main emphasis in the report is on the reliability analysis of software. Checklist-type qualitative reliability analysis methods, such as failure mode and effects analysis (FMEA), are described, as well as software fault tree analysis. Safety analysis based on Petri nets is discussed. The most essential concepts and models of quantitative software reliability analysis are described. The most common software metrics and their combined use with software reliability models are discussed. The application of software reliability models in PSA is evaluated; it is observed that recent software reliability models do not directly produce the estimates needed in PSA. Some recommendations and conclusions are drawn from the study, among them the need for formal methods in the analysis and development of software-based systems, the applicability of qualitative reliability engineering methods in connection with PSA, and the need to make the requirements for software-based systems and their analyses in the regulatory guides more precise. (orig.). (46 refs., 13 figs., 1 tab.)

  6. A method for determination mass absorption coefficient of gamma rays by Compton scattering

    International Nuclear Information System (INIS)

    El Abd, A.

    2014-01-01

    A method was proposed for determination of the mass absorption coefficient of gamma rays for compounds, alloys and mixtures. It is based on simulating the interaction processes of gamma rays with target elements having atomic numbers from Z=1 to Z=92 using the MCSHAPE software. Intensities of Compton-scattered gamma rays at saturation thicknesses and at a scattering angle of 90° were calculated for incident gamma rays of different energies. The obtained results showed that the intensity of Compton-scattered gamma rays at saturation and the mass absorption coefficients can be described by mathematical formulas. These were used to determine mass absorption coefficients for compounds, alloys and mixtures from their Compton-scattered intensities. The method was tested by calculating mass absorption coefficients for some compounds, alloys and mixtures. There is good agreement between the obtained results and those calculated using the WinXCom software. The advantages and limitations of the method were discussed. - Highlights: • Compton scattering of γ-rays was used for determining the mass absorption coefficient. • Scattered intensities were determined by the MCSHAPE software. • Mass absorption coefficients were determined for some compounds, mixtures and alloys. • Mass absorption coefficients were calculated by the WinXCom software. • Good agreement was found between determined and calculated results
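
    The 90° geometry fixes the scattered energy through the Compton relation, and for a thick (saturation) target the scattered intensity is commonly modelled as inversely proportional to the summed attenuation along the in- and out-going paths. As a hedged sketch of the kind of formula involved (the exact expressions fitted in the paper are not reproduced here):

        E' = \frac{E_0}{1 + E_0/(m_e c^2)},
        \qquad
        I_{\text{sat}}(E_0) \propto \frac{1}{\mu_m(E_0) + \mu_m(E')}

    where E_0 and E' are the incident and 90°-scattered energies, m_e c^2 = 511 keV, and μ_m denotes the mass attenuation coefficient of the target. Inverting such a relation yields μ_m from a measured saturation intensity.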

  7. Software-based acoustical measurements

    CERN Document Server

    Miyara, Federico

    2017-01-01

    This textbook provides a detailed introduction to the use of software in combination with simple and economical hardware (a sound level meter with calibrated AC output and a digital recording system) to obtain sophisticated measurements usually requiring expensive equipment. It emphasizes the use of free, open-source, and multiplatform software. Many commercial acoustical measurement systems use software algorithms as an integral component; however, the methods are not disclosed. This book enables the reader to develop useful algorithms and provides insight into the use of digital audio editing tools to document features in the signal. Topics covered include acoustical measurement principles, an in-depth critical study of uncertainty applied to acoustical measurements, digital signal processing from the basics, and metrologically oriented spectral and statistical analysis of signals. The student will gain a deep understanding of the use of software for measurement purposes; the ability to implement software-based...
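
    As a flavour of the software-based measurements the book builds toward, the sketch below computes the equivalent continuous sound pressure level from a calibrated pressure signal; it is a minimal illustration in Python (function names are ours, not the book's):

        import numpy as np

        def leq_db(pressure_pa, p_ref=20e-6):
            """Equivalent continuous sound pressure level in dB re 20 uPa."""
            return 10.0 * np.log10(np.mean(pressure_pa**2) / p_ref**2)

        # Example: a 1 kHz tone with 1 Pa RMS pressure is ~94 dB SPL.
        fs = 48000
        t = np.arange(fs) / fs
        p = np.sqrt(2.0) * np.sin(2.0 * np.pi * 1000.0 * t)  # RMS = 1 Pa
        print(round(leq_db(p), 1))  # ~94.0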

  8. Process-based software project management

    CERN Document Server

    Goodman, F Alan

    2006-01-01

    Not connecting software project management (SPM) to actual, real-world development processes can lead to a complete divorce of SPM from software engineering that can undermine any successful software project. By explaining how a layered process architectural model improves operational efficiency, Process-Based Software Project Management outlines a new method that is more effective than the traditional method when dealing with SPM. With a clear and easy-to-read approach, the book discusses the benefits of an integrated project management-process management connection. The described tight coup

  9. Statistical reliability assessment of software-based systems

    International Nuclear Information System (INIS)

    Korhonen, J.; Pulkkinen, U.; Haapanen, P.

    1997-01-01

    Plant vendors nowadays propose software-based systems even for the most critical safety functions. The reliability estimation of safety-critical software-based systems is difficult, since the conventional modeling techniques do not necessarily apply to the analysis of these systems and quantification seems to be impossible. Due to the lack of operational experience and the nature of software faults, the conventional reliability estimation methods cannot be applied. New methods are therefore needed for the safety assessment of software-based systems. In the research project Programmable automation systems in nuclear power plants (OHA), financed jointly by the Finnish Centre for Radiation and Nuclear Safety (STUK), the Ministry of Trade and Industry and the Technical Research Centre of Finland (VTT), various safety assessment methods and tools for software-based systems are developed and evaluated. This volume in the OHA report series deals with the statistical reliability assessment of software-based systems on the basis of dynamic test results and qualitative evidence from the system design process. Other reports to be published later in the OHA report series will handle the diversity requirements in safety-critical software-based systems, generation of test data from operational profiles and handling of programmable automation in plant PSA studies. (orig.) (25 refs.)

  10. A systematic approach for component-based software development

    NARCIS (Netherlands)

    Guareis de farias, Cléver; van Sinderen, Marten J.; Ferreira Pires, Luis

    2000-01-01

    Component-based software development enables the construction of software artefacts by assembling prefabricated, configurable and independently evolving building blocks, called software components. This paper presents an approach for the development of component-based software artefacts. This

  11. Atomistic modelling of scattering data in the Collaborative Computational Project for Small Angle Scattering (CCP-SAS).

    Science.gov (United States)

    Perkins, Stephen J; Wright, David W; Zhang, Hailiang; Brookes, Emre H; Chen, Jianhan; Irving, Thomas C; Krueger, Susan; Barlow, David J; Edler, Karen J; Scott, David J; Terrill, Nicholas J; King, Stephen M; Butler, Paul D; Curtis, Joseph E

    2016-12-01

    The capabilities of current computer simulations provide a unique opportunity to model small-angle scattering (SAS) data at the atomistic level, and to include other structural constraints ranging from molecular and atomistic energetics to crystallography, electron microscopy and NMR. This extends the capabilities of solution scattering and provides deeper insights into the physics and chemistry of the systems studied. Realizing this potential, however, requires integrating the experimental data with a new generation of modelling software. To achieve this, the CCP-SAS collaboration (http://www.ccpsas.org/) is developing open-source, high-throughput and user-friendly software for the atomistic and coarse-grained molecular modelling of scattering data. Robust state-of-the-art molecular simulation engines and molecular dynamics and Monte Carlo force fields provide constraints to the solution structure inferred from the small-angle scattering data, which incorporates the known physical chemistry of the system. The implementation of this software suite involves a tiered approach in which GenApp provides the deployment infrastructure for running applications on both standard and high-performance computing hardware, and SASSIE provides a workflow framework into which modules can be plugged to prepare structures, carry out simulations, calculate theoretical scattering data and compare results with experimental data. GenApp produces the accessible web-based front end termed SASSIE-web, and GenApp and SASSIE also make community SAS codes available. Applications are illustrated by case studies: (i) inter-domain flexibility in two- to six-domain proteins as exemplified by HIV-1 Gag, MASP and ubiquitin; (ii) the hinge conformation in human IgG2 and IgA1 antibodies; (iii) the complex formed between a hexameric protein Hfq and mRNA; and (iv) synthetic 'bottlebrush' polymers.
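
    The core computation in comparing an atomistic model with SAS data is evaluating theoretical scattering from coordinates. The sketch below, in Python, uses the classical Debye formula for an isotropic ensemble; it is an illustration only (not SASSIE code) and assumes, for brevity, identical scattering factors for all sites:

        import numpy as np

        def debye_intensity(coords, q_values):
            """I(q) = sum_ij sin(q r_ij) / (q r_ij) for unit scattering factors."""
            d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
            out = []
            for q in q_values:
                qr = np.maximum(q * d, 1e-12)   # avoid 0/0 on the diagonal
                out.append((np.sin(qr) / qr).sum())
            return np.array(out)

        # Toy model: a 50-site random coil, q in inverse length units
        rng = np.random.default_rng(0)
        coords = np.cumsum(rng.normal(size=(50, 3)), axis=0)
        print(debye_intensity(coords, np.linspace(0.01, 0.5, 5)))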

  12. Monte Carlo simulations of neutron scattering instruments

    International Nuclear Information System (INIS)

    Åstrand, Per-Olof; Copenhagen Univ.; Lefmann, K.; Nielsen, K.

    2001-01-01

    A Monte Carlo simulation is an important computational tool used in many areas of science and engineering. The use of Monte Carlo techniques for simulating neutron scattering instruments is discussed. The basic ideas, techniques and approximations are presented. Since the construction of a neutron scattering instrument is very expensive, Monte Carlo software used for the design of instruments has to be validated and tested extensively. The McStas software was designed with these aspects in mind, and some of the basic principles of the McStas software are discussed. Finally, some future prospects are discussed for using Monte Carlo simulations in optimizing neutron scattering experiments. (R.P.)
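
    The essence of such instrument simulations is ray sampling: draw neutron trajectories from a source distribution, propagate them through components, and tally what reaches the detector, with a statistical error that shrinks as 1/sqrt(n). A self-contained toy example in Python (not McStas; geometry and numbers are invented, with an exaggerated aperture for statistics):

        import numpy as np

        rng = np.random.default_rng(1)

        # Isotropic point source; square aperture of half-width 0.5 m at z = 10 m.
        L, half, n = 10.0, 0.5, 1_000_000
        cos_t = np.clip(rng.random(n), 1e-12, 1.0)   # isotropic forward hemisphere
        phi = rng.uniform(0.0, 2.0 * np.pi, n)
        sin_t = np.sqrt(1.0 - cos_t**2)
        x = L * sin_t * np.cos(phi) / cos_t          # ray position at z = L
        y = L * sin_t * np.sin(phi) / cos_t
        frac = ((np.abs(x) < half) & (np.abs(y) < half)).mean()
        err = np.sqrt(frac * (1.0 - frac) / n)       # binomial standard error
        print(f"transmitted fraction: {frac:.2e} +/- {err:.1e}")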

  13. Coupling an analytical description of anti-scatter grids with simulation software of radiographic systems using Monte Carlo code; Couplage d'une methode de description analytique de grilles anti diffusantes avec un logiciel de simulation de systemes radiographiques base sur un code Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Rinkel, J.; Dinten, J.M.; Tabary, J

    2004-07-01

    The use of focused anti-scatter grids on digital radiographic systems with two-dimensional detectors produces acquisitions with a decreased scatter-to-primary ratio and thus improved contrast and resolution. Simulation software is of great interest for optimizing the grid configuration for a specific application. Classical simulators are based on complete, detailed geometric descriptions of the grid. They are accurate but very time consuming, since they use Monte Carlo code to simulate scatter within the high-frequency grids. We propose a new practical method which couples an analytical simulation of the grid interaction with a radiographic system simulation program. First, a two-dimensional matrix of probability depending on the grid is created offline, in which the first dimension represents the angle of impact with respect to the normal to the grid lines and the other the energy of the photon. This matrix of probability is then used by the Monte Carlo simulation software in order to provide the final scattered flux image. To evaluate the gain in CPU time, we define the increasing factor as the increase in the CPU time of the simulation with, as opposed to without, the grid. Increasing factors were calculated with the new model and with classical methods representing the grid by its CAD model as part of the object. With the new method, increasing factors are lower by one to two orders of magnitude compared with the classical one. These results were obtained with a difference in calculated scatter of less than five percent between the new and the classical method. (authors)
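
    In code, the offline table then replaces per-photon ray tracing of the grid with a simple lookup and weighting step inside the Monte Carlo loop. A minimal Python sketch of that idea (table contents, axes and names are placeholders; a real table would be computed from the grid geometry):

        import numpy as np

        # Offline-built table: grid transmission probability versus impact
        # angle (w.r.t. the normal to the grid lines) and photon energy.
        angles = np.linspace(0.0, 30.0, 61)        # degrees
        energies = np.linspace(20.0, 120.0, 101)   # keV
        p_table = np.random.default_rng(2).uniform(0.1, 0.9, (61, 101))

        def grid_transmission(angle_deg, energy_kev):
            """Nearest-neighbour lookup replacing ray tracing of the grid."""
            i = np.abs(angles - angle_deg).argmin()
            j = np.abs(energies - energy_kev).argmin()
            return p_table[i, j]

        # Inside the Monte Carlo loop, each photon reaching the detector is
        # simply weighted by the precomputed transmission probability:
        weight = grid_transmission(12.3, 65.0)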

  14. NHPP-Based Software Reliability Models Using Equilibrium Distribution

    Science.gov (United States)

    Xiao, Xiao; Okamura, Hiroyuki; Dohi, Tadashi

    Non-homogeneous Poisson processes (NHPPs) have gained much popularity in actual software testing phases to estimate the software reliability, the number of remaining faults in software and the software release timing. In this paper, we propose a new modeling approach for the NHPP-based software reliability models (SRMs) to describe the stochastic behavior of software fault-detection processes. The fundamental idea is to apply the equilibrium distribution to the fault-detection time distribution in NHPP-based modeling. We also develop efficient parameter estimation procedures for the proposed NHPP-based SRMs. Through numerical experiments, it can be concluded that the proposed NHPP-based SRMs outperform the existing ones in many data sets from the perspective of goodness-of-fit and prediction performance.
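
    The construction can be stated compactly. For an NHPP-based SRM with expected total fault content ω and fault-detection time distribution F, the mean value function is Λ(t) = ωF(t); the proposal, as we read the abstract, is to use the equilibrium distribution of F in this role (notation ours):

        \Lambda_e(t) = \omega F_e(t),
        \qquad
        F_e(t) = \frac{1}{\mu} \int_0^t \bigl(1 - F(x)\bigr)\,dx,
        \qquad
        \mu = \int_0^\infty \bigl(1 - F(x)\bigr)\,dx

    Taking F to be exponential leaves the model unchanged, since the exponential is its own equilibrium distribution, while other choices of F yield new mean value functions.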

  15. A Combined Approach for Component-based Software Design

    NARCIS (Netherlands)

    Guareis de farias, Cléver; van Sinderen, Marten J.; Ferreira Pires, Luis; Quartel, Dick; Baldoni, R.

    2001-01-01

    Component-based software development enables the construction of software artefacts by assembling binary units of production, distribution and deployment, the so-called software components. Several approaches addressing component-based development have been proposed recently. Most of these

  16. The HEP Software and Computing Knowledge Base

    Science.gov (United States)

    Wenaus, T.

    2017-10-01

    HEP software today is a rich and diverse domain in itself and exists within the mushrooming world of open source software. As HEP software developers and users we can be more productive and effective if our work and our choices are informed by a good knowledge of what others in our community have created or found useful. The HEP Software and Computing Knowledge Base, hepsoftware.org, was created to facilitate this by serving as a collection point and information exchange on software projects and products, services, training, computing facilities, and relating them to the projects, experiments, organizations and science domains that offer them or use them. It was created as a contribution to the HEP Software Foundation, for which a HEP S&C knowledge base was a much requested early deliverable. This contribution will motivate and describe the system, what it offers, its content and contributions both existing and needed, and its implementation (node.js based web service and javascript client app) which has emphasized ease of use for both users and contributors.

  17. BioXTAS RAW, a software program for high-throughput automated small-angle X-ray scattering data reduction and preliminary analysis

    DEFF Research Database (Denmark)

    Nielsen, S.S.; Toft, K.N.; Snakenborg, Detlef

    2009-01-01

    A fully open source software program for automated two-dimensional and one-dimensional data reduction and preliminary analysis of isotropic small-angle X-ray scattering (SAXS) data is presented. The program is freely distributed, following the open-source philosophy, and does not rely on any...... commercial software packages. BioXTAS RAW is a fully automated program that, via an online feature, reads raw two-dimensional SAXS detector output files and processes and plots data as the data files are created during measurement sessions. The software handles all steps in the data reduction. This includes...... mask creation, radial averaging, error bar calculation, artifact removal, normalization and q calibration. Further data reduction such as background subtraction and absolute intensity scaling is fast and easy via the graphical user interface. BioXTAS RAW also provides preliminary analysis of one...
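
    Of the reduction steps listed, radial (azimuthal) averaging is the central one: pixels are binned by distance from the beam centre and averaged into a 1D curve, with per-bin standard errors. A compact illustration in Python (not the BioXTAS RAW implementation; conversion of radius to q is omitted):

        import numpy as np

        def radial_average(image, center, mask=None, n_bins=200):
            """Azimuthal average of a 2D image into a 1D profile.
            Returns bin radii, mean intensity and standard error per bin;
            bins with fewer than two unmasked pixels come out as NaN."""
            ny, nx = image.shape
            yy, xx = np.indices((ny, nx))
            r = np.hypot(xx - center[0], yy - center[1])
            if mask is None:
                mask = np.ones_like(image, dtype=bool)
            edges = np.linspace(0.0, r[mask].max(), n_bins + 1)
            idx = np.digitize(r[mask], edges) - 1
            vals = image[mask]
            mean = np.full(n_bins, np.nan)
            err = np.full(n_bins, np.nan)
            for k in range(n_bins):
                v = vals[idx == k]
                if v.size > 1:
                    mean[k] = v.mean()
                    err[k] = v.std(ddof=1) / np.sqrt(v.size)
            return 0.5 * (edges[:-1] + edges[1:]), mean, err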

  18. Essence: Team-Based Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2012-01-01

    Essence is a methodology supporting innovative software teams. It is designed with agile development in mind to allow for the problem situation to talk back to the team as they go along building solutions. Traditional software development teams – and for that matter probably also development teams... using technologies other than software – might also enjoy adapting Essence to suit their situation. Essence is not yet another method for generating ideas. There are plenty of good methods already, and for that reason I choose to focus less on idea generation and more on what comes thereafter. Most teams.... Essence is based on the idea that challenges are open to interpretation and choice. We may often choose how we understand a challenge and choose among several strategies for answering it. Software development and indeed software innovation are far from linear. Essence is built on structures rather than...

  19. Scattering angle-based filtering via extension in velocity

    KAUST Repository

    Kazei, Vladimir; Tessmer, Ekkehart; Alkhalifah, Tariq

    2016-01-01

    The scattering angle between the source and receiver wavefields can be utilized in full-waveform inversion (FWI) and in reverse-time migration (RTM) for regularization and quality control or to remove low-frequency artifacts. Access to the scattering angle information is costly, as the relation between local image features and scattering angles has a non-stationary nature. For the purpose of more efficient scattering angle information extraction, we develop techniques that utilize the simplicity of scattering angle based filters for constant-velocity background models. We split the background velocity model into several domains with different velocity ranges, generating an

  1. On Model Based Synthesis of Embedded Control Software

    OpenAIRE

    Alimguzhin, Vadim; Mari, Federico; Melatti, Igor; Salvo, Ivano; Tronci, Enrico

    2012-01-01

    Many Embedded Systems are indeed Software Based Control Systems (SBCSs), that is, control systems whose controller consists of control software running on a microcontroller device. This motivates investigation on Formal Model Based Design approaches for control software. Given the formal model of a plant as a Discrete Time Linear Hybrid System and the implementation specifications (that is, the number of bits in the Analog-to-Digital (AD) conversion), correct-by-construction control software can be...

  2. Repository-based software engineering program

    Science.gov (United States)

    Wilson, James

    1992-01-01

    The activities performed during September 1992 in support of Tasks 01 and 02 of the Repository-Based Software Engineering Program are outlined. The recommendations and implementation strategy defined at the September 9-10 meeting of the Reuse Acquisition Action Team (RAAT) are attached along with the viewgraphs and reference information presented at the Institute for Defense Analyses brief on legal and patent issues related to software reuse.

  3. Music Learning Based on Computer Software

    Directory of Open Access Journals (Sweden)

    Baihui Yan

    2017-12-01

    Full Text Available In order to better develop and improve students’ music learning, the authors propose a method of music learning based on computer software. Using computer music software to assist teaching is still a new field. We therefore conducted an in-depth analysis of computer-enabled music learning and of the current state of music learning in secondary schools, obtaining specific analytical data. The survey data show that students have many cognitive problems in the current music classroom, and that teachers have not yet found reasonable countermeasures to them. Against this background, the introduction of computer music software into music learning is a new approach that can not only cultivate students’ initiative in music learning, but also enhance their ability to learn music. It is therefore concluded that computer-software-based music learning is of great significance for improving current music learning modes and means.

  4. An ontology based trust verification of software license agreement

    Science.gov (United States)

    Lu, Wenhuan; Li, Xiaoqing; Gan, Zengqin; Wei, Jianguo

    2017-08-01

    When we install or download software, a large document stating rights and obligations is displayed, which many people lack the patience to read or understand. This may make users distrust the software. In this paper, we propose an ontology-based trust verification for Software License Agreements. First, this work proposes an ontology model for the domain of Software License Agreements. The domain ontology is constructed by the proposed methodology according to copyright laws and 30 software license agreements. The License Ontology can act as part of a generalized copyright-law knowledge model, and can also serve as a visualization of software licenses. Based on this proposed ontology, a software-license-oriented text summarization approach is proposed, and its performance shows that it can improve the accuracy of software license summarization. Based on the summarization, the underlying purpose of the software license can be explicitly explored for trust verification.

  5. Electromagnetic Drop Scale Scattering Modelling for Dynamic Statistical Rain Fields

    OpenAIRE

    Hipp, Susanne

    2015-01-01

    This work simulates the scattering of electromagnetic waves by a rain field. The calculations are performed for the individual drops and are accumulated into a time signal that depends on the dynamic properties of the rain field. The simulations are based on the analytical Mie scattering model for spherical rain drops, and the simulation software considers the rain characteristics of drop size (including its distribution in rain), drop motion, and frequency- and temperature-dependent permittivity. The performe...
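
    For orientation, in the small-drop limit (drop diameter d much smaller than the wavelength λ) the Mie solution reduces to the Rayleigh scattering cross-section, which makes the strong size and frequency dependence of each drop's contribution explicit; this limit is quoted here as general background, not as the formula used in the thesis:

        \sigma_s = \frac{2\pi^5}{3} \frac{d^6}{\lambda^4}
        \left( \frac{n^2 - 1}{n^2 + 2} \right)^{\!2}

    with n the (frequency- and temperature-dependent) complex refractive index of water. Larger drops and higher frequencies require the full Mie series.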

  6. Instrument hardware and software upgrades at IPNS

    International Nuclear Information System (INIS)

    Worlton, Thomas; Hammonds, John; Mikkelson, D.; Mikkelson, Ruth; Porter, Rodney; Tao, Julian; Chatterjee, Alok

    2006-01-01

    IPNS is in the process of upgrading its time-of-flight neutron scattering instruments with improved hardware and software. The hardware upgrades include replacing old VAX Qbus and Multibus-based data acquisition systems with new systems based on VXI and VME. Hardware upgrades also include expanded detector banks and new detector electronics. Old VAX Fortran-based data acquisition and analysis software is being replaced with new software as part of the ISAW project. ISAW is written in Java for ease of development and portability, and is now used routinely for data visualization, reduction, and analysis on all upgraded instruments. ISAW provides the ability to process and visualize the data from thousands of detector pixels, each having thousands of time channels. These operations can be done interactively through a familiar graphical user interface or automatically through simple scripts. Scripts and operators provided by end users are automatically included in the ISAW menu structure, along with those distributed with ISAW, when the application is started.

  7. Diversity requirements for safety critical software-based automation systems

    International Nuclear Information System (INIS)

    Korhonen, J.; Pulkkinen, U.; Haapanen, P.

    1998-03-01

    System vendors nowadays propose software-based systems even for the most critical safety functions in nuclear power plants. Due to the nature and mechanisms of influence of software faults, new methods are needed for the safety and reliability evaluation of these systems. In the research project 'Programmable automation systems in nuclear power plants (OHA)', various safety assessment methods and tools for software-based systems are developed and evaluated. This report first discusses the (common cause) failure mechanisms in software-based systems, then defines fault-tolerant system architectures to avoid common cause failures, then studies the various alternatives for applying diversity and their influence on system reliability. Finally, a method for the assessment of diversity is described. Other recently published reports in the OHA report series handle the statistical reliability assessment of software-based systems (STUK-YTO-TR 119), usage models in the reliability assessment of software-based systems (STUK-YTO-TR 128) and the handling of programmable automation in plant PSA studies (STUK-YTO-TR 129)

  8. V&V Within Reuse-Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1996-01-01

    Verification and Validation (V&V) is used to increase the level of assurance of critical software, particularly that of safety-critical and mission-critical software. V&V is a systems engineering discipline that evaluates the software in a systems context, and is currently applied during the development of a specific application system. In order to bring the effectiveness of V&V to bear within reuse-based software engineering, V&V must be incorporated within the domain engineering process.

  9. Incorporating Code-Based Software in an Introductory Statistics Course

    Science.gov (United States)

    Doehler, Kirsten; Taylor, Laura

    2015-01-01

    This article is based on the experiences of two statistics professors who have taught students to write and effectively utilize code-based software in a college-level introductory statistics course. Advantages of using software and code-based software in this context are discussed. Suggestions are made on how to ease students into using code with…

  10. A Software Phantom : Application in Digital Tomosynthesis

    Energy Technology Data Exchange (ETDEWEB)

    Lazos, D; Kolitsi, Z; Badea, C; Pallikarakis, N [Medical Physics Laboratory, School of Medicine, University of Patras (Greece)]

    1999-12-31

    A software phantom intended to be used in radiographic applications has been developed. The application was used for research in the field of Digital Tomosynthesis, and specifically for studying tomographic noise removal methods. The application consists of a phantom design and a phantom imaging module. The radiation-matter interaction is based on the exponential relation of attenuation. Projections are formed by simulated irradiation with selectable geometrical parameters, source spectrum and detector response. Phantoms are defined either as sets containing certain geometrical objects or as groups of voxels. Comparison with real projections taken from a physical phantom of identical geometry and composition to the simulated one showed good agreement, with improved contrast in the simulated projections due to the absence of scatter. The software phantom proved to be a very useful tool for DTS investigations. Further development to include scatter is expected to expand the use of the application to more areas in radiological imaging research. (author) 4 refs., 3 figs

  11. A Software Phantom : Application in Digital Tomosynthesis

    International Nuclear Information System (INIS)

    Lazos, D.; Kolitsi, Z.; Badea, C.; Pallikarakis, N.

    1998-01-01

    A software phantom intended to be used in radiographic applications has been developed. The application was used for research in the field of Digital Tomosynthesis, and specifically for studying tomographic noise removal methods. The application consists of a phantom design and a phantom imaging module. The radiation-matter interaction is based on the exponential relation of attenuation. Projections are formed by simulated irradiation with selectable geometrical parameters, source spectrum and detector response. Phantoms are defined either as sets containing certain geometrical objects or as groups of voxels. Comparison with real projections taken from a physical phantom of identical geometry and composition to the simulated one showed good agreement, with improved contrast in the simulated projections due to the absence of scatter. The software phantom proved to be a very useful tool for DTS investigations. Further development to include scatter is expected to expand the use of the application to more areas in radiological imaging research. (author)
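
    The projection model mentioned in both records is the Beer-Lambert relation I = I0·exp(-Σ μi Δx), i.e. a scatter-free line integral of attenuation through the voxelized phantom. A minimal parallel-beam illustration in Python (geometry and attenuation values are invented for the example):

        import numpy as np

        def project(mu_volume, voxel_size_cm, i0=1.0):
            """Parallel-beam projection along axis 0 of a voxel phantom:
            I = I0 * exp(-sum(mu_i * dx)), with no scatter modelled."""
            line_integrals = mu_volume.sum(axis=0) * voxel_size_cm
            return i0 * np.exp(-line_integrals)

        # Water-like cube (mu ~ 0.2 /cm) containing a denser insert
        mu = np.full((64, 64, 64), 0.2)
        mu[20:40, 28:36, 28:36] = 0.5
        image = project(mu, voxel_size_cm=0.1)
        print(image.min(), image.max())   # insert shadow vs. background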

  12. Safety management of software-based equipment

    CERN Document Server

    Boulanger, Jean-Louis

    2013-01-01

    A review of the principles of the safety of software-based equipment, this book begins by presenting the definition principles of safety objectives. It then moves on to show how it is possible to define a safety architecture (including redundancy, diversification, error-detection techniques) on the basis of safety objectives and how to identify objectives related to software programs. From software objectives, the authors present the different safety techniques (fault detection, redundancy and quality control). "Certifiable system" aspects are taken into account throughout the book. C

  13. The equivalent square concept for the head scatter factor based on scatter from flattening filter

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Siyong; Palta, Jatinder R.; Zhu, Timothy C. [Department of Radiation Oncology, University of Florida College of Medicine, Gainesville, Florida (United States)

    1998-06-01

    The equivalent field relationship between square and circular fields for the head scatter factor was evaluated at the source plane. The method was based on integrating the head scatter parameter for projected shaped fields in the source plane and finding a field that produced the same ratio of head scatter to primary dose on the central axis. A value of σ/R≈0.9 was obtained, where σ was one-half of the side length of the equivalent square and R was the radius of the circular field. The assumptions were that the equivalent field relationship for head scatter depends primarily on the characteristics of scatter from the flattening filter, and that the differential scatter-to-primary ratio of scatter from the flattening filter decreases linearly with the radius, within the physical radius of the flattening filter. Lam and co-workers showed empirically that the area-to-perimeter ratio formula, when applied to an equivalent square formula at the flattening filter plane, gave an accurate prediction of the head scatter factor. We have analytically investigated the validity of the area-to-perimeter ratio formula. Our results support the fact that the area-to-perimeter ratio formula can also be used as the equivalent field formula for head scatter at the source plane. The equivalent field relationships for wedge and tertiary collimator scatter were also evaluated. (author)

  14. The equivalent square concept for the head scatter factor based on scatter from flattening filter

    International Nuclear Information System (INIS)

    Kim, Siyong; Palta, Jatinder R.; Zhu, Timothy C.

    1998-01-01

    The equivalent field relationship between square and circular fields for the head scatter factor was evaluated at the source plane. The method was based on integrating the head scatter parameter for projected shaped fields in the source plane and finding a field that produced the same ratio of head scatter to primary dose on the central axis. A value of σ/R≅0.9 was obtained, where σ was one-half of the side length of the equivalent square and R was the radius of the circular field. The assumptions were that the equivalent field relationship for head scatter depends primarily on the characteristics of scatter from the flattening filter, and that the differential scatter-to-primary ratio of scatter from the flattening filter decreases linearly with the radius, within the physical radius of the flattening filter. Lam and co-workers showed empirically that the area-to-perimeter ratio formula, when applied to an equivalent square formula at the flattening filter plane, gave an accurate prediction of the head scatter factor. We have analytically investigated the validity of the area-to-perimeter ratio formula. Our results support the fact that the area-to-perimeter ratio formula can also be used as the equivalent field formula for head scatter at the source plane. The equivalent field relationships for wedge and tertiary collimator scatter were also evaluated. (author)
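
    As a quick consistency check (ours, not part of the paper's derivation): a square of side s = 2σ has the same area as a circle of radius R when

        s^2 = \pi R^2
        \;\Rightarrow\;
        \frac{\sigma}{R} = \frac{s}{2R} = \frac{\sqrt{\pi}}{2} \approx 0.886

    which is close to the empirically obtained σ/R ≈ 0.9, i.e. an equivalent square of side roughly 1.8R.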

  15. Retrieval method of aerosol extinction coefficient profile based on backscattering, side-scattering and Raman-scattering lidar

    Science.gov (United States)

    Shan, Huihui; Zhang, Hui; Liu, Junjian; Tao, Zongming; Wang, Shenhao; Ma, Xiaomin; Zhou, Pucheng; Yao, Ling; Liu, Dong; Xie, Chenbo; Wang, Yingjian

    2018-03-01

    The aerosol extinction coefficient profile is an essential parameter for atmospheric radiation models. It is difficult to obtain a high signal-to-noise ratio (SNR) with backscattering lidar from the ground to the tropopause, especially at near range. This problem can be solved by combining side-scattering and backscattering lidar. Using Raman-scattering lidar, the aerosol extinction-to-backscatter ratio (lidar ratio) can be obtained. Based on a side-scattering, backscattering and Raman-scattering lidar system, the aerosol extinction coefficient is retrieved precisely from the earth's surface to the tropopause. Case studies show that this method is reasonable and feasible.

  16. A method for determination mass absorption coefficient of gamma rays by Compton scattering.

    Science.gov (United States)

    El Abd, A

    2014-12-01

    A method was proposed for determination of the mass absorption coefficient of gamma rays for compounds, alloys and mixtures. It is based on simulating the interaction processes of gamma rays with target elements having atomic numbers from Z=1 to Z=92 using the MCSHAPE software. Intensities of Compton-scattered gamma rays at saturation thicknesses and at a scattering angle of 90° were calculated for incident gamma rays of different energies. The obtained results showed that the intensity of Compton-scattered gamma rays at saturation and the mass absorption coefficients can be described by mathematical formulas. These were used to determine mass absorption coefficients for compounds, alloys and mixtures from their Compton-scattered intensities. The method was tested by calculating mass absorption coefficients for some compounds, alloys and mixtures. There is good agreement between the obtained results and those calculated using the WinXCom software. The advantages and limitations of the method were discussed.

  17. COSINE software development based on code generation technology

    International Nuclear Information System (INIS)

    Ren Hao; Mo Wentao; Liu Shuo; Zhao Guang

    2013-01-01

    Code generation technology can significantly improve the quality and productivity of software development and reduce software development risk. At present, code generators are usually based on UML model-driven technology, which cannot satisfy the development demands of nuclear power calculation software. In this paper, the features of scientific computing programs were analyzed and a FORTRAN code generator (FCG) based on C# was developed. FCG can automatically generate FORTRAN code for module variable definitions according to input metadata. FCG can also generate memory allocation interfaces for dynamic variables as well as data access interfaces. FCG was applied to the development of the core and system integrated engine for design and analysis (COSINE) software. The results show that FCG can greatly improve the development efficiency of nuclear power calculation software and reduce the defect rate of software development. (authors)
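
    To make the idea concrete, the sketch below generates FORTRAN module variable declarations from a small metadata description. It is illustrative only: the real FCG is written in C#, and the metadata layout and names here are invented; Python is used for brevity:

        FTYPES = {"int": "integer", "real8": "real(8)", "logical": "logical"}

        def gen_module(name, variables):
            """Emit a FORTRAN module declaring variables from metadata."""
            lines = [f"module {name}", "  implicit none"]
            for v in variables:
                dims = f", dimension({','.join(v['dims'])}), allocatable" \
                    if v.get("dims") else ""
                lines.append(f"  {FTYPES[v['type']]}{dims} :: {v['name']}")
            lines.append(f"end module {name}")
            return "\n".join(lines)

        meta = [{"name": "power", "type": "real8", "dims": [":", ":"]},
                {"name": "n_nodes", "type": "int"}]
        print(gen_module("core_state", meta))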

  18. Fully iterative scatter corrected digital breast tomosynthesis using GPU-based fast Monte Carlo simulation and composition ratio update

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kyungsang; Ye, Jong Chul, E-mail: jong.ye@kaist.ac.kr [Bio Imaging and Signal Processing Laboratory, Department of Bio and Brain Engineering, KAIST 291, Daehak-ro, Yuseong-gu, Daejeon 34141 (Korea, Republic of); Lee, Taewon; Cho, Seungryong [Medical Imaging and Radiotherapeutics Laboratory, Department of Nuclear and Quantum Engineering, KAIST 291, Daehak-ro, Yuseong-gu, Daejeon 34141 (Korea, Republic of); Seong, Younghun; Lee, Jongha; Jang, Kwang Eun [Samsung Advanced Institute of Technology, Samsung Electronics, 130, Samsung-ro, Yeongtong-gu, Suwon-si, Gyeonggi-do, 443-803 (Korea, Republic of); Choi, Jaegu; Choi, Young Wook [Korea Electrotechnology Research Institute (KERI), 111, Hanggaul-ro, Sangnok-gu, Ansan-si, Gyeonggi-do, 426-170 (Korea, Republic of); Kim, Hak Hee; Shin, Hee Jung; Cha, Joo Hee [Department of Radiology and Research Institute of Radiology, Asan Medical Center, University of Ulsan College of Medicine, 88 Olympic-ro, 43-gil, Songpa-gu, Seoul, 138-736 (Korea, Republic of)

    2015-09-15

    Purpose: In digital breast tomosynthesis (DBT), scatter correction is highly desirable, as it improves image quality at low doses. Because the DBT detector panel is typically stationary during the source rotation, antiscatter grids are not generally compatible with DBT; thus, a software-based scatter correction is required. This work proposes a fully iterative scatter correction method that uses a novel fast Monte Carlo simulation (MCS) with a tissue-composition ratio estimation technique for DBT imaging. Methods: To apply MCS to scatter estimation, the material composition in each voxel should be known. To overcome the lack of prior accurate knowledge of tissue composition for DBT, a tissue-composition ratio is estimated based on the observation that the breast tissues are principally composed of adipose and glandular tissues. Using this approximation, the composition ratio can be estimated from the reconstructed attenuation coefficients, and the scatter distribution can then be estimated by MCS using the composition ratio. The scatter estimation and image reconstruction procedures can be performed iteratively until an acceptable accuracy is achieved. For practical use, (i) the authors have implemented a fast MCS using a graphics processing unit (GPU), (ii) the MCS is simplified to transport only x-rays in the energy range of 10–50 keV, modeling Rayleigh and Compton scattering and the photoelectric effect using the tissue-composition ratio of adipose and glandular tissues, and (iii) downsampling is used because the scatter distribution varies rather smoothly. Results: The authors have demonstrated that the proposed method can accurately estimate the scatter distribution, and that the contrast-to-noise ratio of the final reconstructed image is significantly improved. The authors validated the performance of the MCS by changing the tissue thickness, composition ratio, and x-ray energy. The authors confirmed that the tissue-composition ratio estimation was quite
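
    One plausible reading of the composition-ratio step, sketched in Python: if each voxel is modelled as a mixture of adipose and glandular tissue, the glandular fraction follows from the reconstructed attenuation coefficient by linear unmixing (the attenuation values below are placeholders, not the paper's):

        import numpy as np

        def glandular_fraction(mu, mu_adipose, mu_gland):
            """Solve mu = g*mu_gland + (1 - g)*mu_adipose for g per voxel."""
            g = (mu - mu_adipose) / (mu_gland - mu_adipose)
            return np.clip(g, 0.0, 1.0)   # keep the ratio physical

        mu_recon = np.array([0.50, 0.62, 0.45])   # example voxel values, 1/cm
        print(glandular_fraction(mu_recon, mu_adipose=0.45, mu_gland=0.80))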

  19. A Novel Rules Based Approach for Estimating Software Birthmark

    Science.gov (United States)

    Binti Alias, Norma; Anwar, Sajid

    2015-01-01

    A software birthmark is a unique characteristic of a program that can be used to detect software theft. Comparing the birthmarks of two programs can tell us whether one is a copy of the other. Software theft and piracy, the copying, stealing and misuse of software without the permission stated in the license agreement, are rapidly increasing problems. The estimation of a birthmark can play a key role in understanding its effectiveness. In this paper, a new technique is presented to evaluate and estimate a software birthmark based on the two most sought-after properties of birthmarks, that is, credibility and resilience. For this purpose, concepts from soft computing such as probabilistic and fuzzy computing have been taken into account, and fuzzy logic is used to estimate the properties of the birthmark. The proposed fuzzy rule based technique is validated through a case study, and the results show that the technique is successful in assessing the specified properties of the birthmark, its resilience and credibility. This, in turn, shows how much effort will be required to detect the originality of the software based on its birthmark. PMID:25945363

  20. Modernization of tank floor scanning system (TAFLOSS) software

    International Nuclear Information System (INIS)

    Mohd Fitri Abdul Rahman; Jaafar Abdullah; Susan Maria Sipaun

    2002-01-01

    The Tank Floor Scanning System (TAFLOSS) is a portable nucleonic device based on the scattering and moderation phenomena of neutrons. TAFLOSS, which was developed by MINT, can precisely and non-destructively measure the gap and hydrogen content in the foundation of a gigantic industrial tank in a practical and cost-effective manner. Three different computer programs were used to record and analyse the measured data. For analysing the initial data, a Disk Operating System (DOS) based program called MesTank 3.0 was developed. The system also used commercial software such as Table Curve 2D and SURFER for graphics purposes: Table Curve 2D was used to plot and evaluate curve fits, whereas SURFER was used to draw contours. Switching from one program to another for the different tasks of the system is neither user-friendly nor time-efficient. Therefore, the main objective of the project is to develop new user-friendly software that combines the old and commercial software into a single package. The programming language used to develop the software is Microsoft Visual C++ ver. 6.0. The development of this software involved complex mathematical calculation, curve fitting and contour plotting. This paper describes the initial development of a computer programme for analysing the initial data and fitting exponential curves. (Author)
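
    The exponential curve fitting mentioned above can be reproduced with standard tools; a minimal Python sketch using SciPy least-squares fitting (the model and data are invented for illustration and are not TAFLOSS data):

        import numpy as np
        from scipy.optimize import curve_fit

        def model(x, a, b, c):
            """Exponential decay with offset: a * exp(-b x) + c."""
            return a * np.exp(-b * x) + c

        x = np.linspace(0.0, 10.0, 50)
        rng = np.random.default_rng(5)
        y = model(x, 3.0, 0.7, 0.2) + 0.05 * rng.normal(size=x.size)
        popt, pcov = curve_fit(model, x, y, p0=(1.0, 1.0, 0.0))
        print(popt)   # approximately [3.0, 0.7, 0.2]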

  1. Automated support for experience-based software management

    Science.gov (United States)

    Valett, Jon D.

    1992-01-01

    To effectively manage a software development project, the software manager must have access to key information concerning a project's status. This information includes not only data relating to the project of interest, but also, the experience of past development efforts within the environment. This paper describes the concepts and functionality of a software management tool designed to provide this information. This tool, called the Software Management Environment (SME), enables the software manager to compare an ongoing development effort with previous efforts and with models of the 'typical' project within the environment, to predict future project status, to analyze a project's strengths and weaknesses, and to assess the project's quality. In order to provide these functions the tool utilizes a vast corporate memory that includes a data base of software metrics, a set of models and relationships that describe the software development environment, and a set of rules that capture other knowledge and experience of software managers within the environment. Integrating these major concepts into one software management tool, the SME is a model of the type of management tool needed for all software development organizations.

  2. Optimization-based scatter estimation using primary modulation for computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yi; Ma, Jingchen; Zhao, Jun, E-mail: junzhao@sjtu.edu.cn [School of Biomedical Engineering, Shanghai Jiao Tong University, Shanghai 200240 (China); Song, Ying [Department of Radiation Oncology, West China Hospital, Sichuan University, Chengdu 610041 (China)

    2016-08-15

    Purpose: Scatter reduces the image quality in computed tomography (CT), but scatter correction remains a challenge. A previously proposed primary modulation method simultaneously obtains the primary and scatter in a single scan. However, separating the scatter and primary in primary modulation is challenging because it is an underdetermined problem. In this study, an optimization-based scatter estimation (OSE) algorithm is proposed to estimate and correct scatter. Methods: In the concept of primary modulation, the primary is modulated, but the scatter remains smooth by inserting a modulator between the x-ray source and the object. In the proposed algorithm, an objective function is designed for separating the scatter and primary. Prior knowledge is incorporated in the optimization-based framework to improve the accuracy of the estimation: (1) the primary is always positive; (2) the primary is locally smooth and the scatter is smooth; (3) the location of penumbra can be determined; and (4) the scatter-contaminated data provide knowledge about which part is smooth. Results: The simulation study shows that the edge-preserving weighting in OSE improves the estimation accuracy near the object boundary. Simulation study also demonstrates that OSE outperforms the two existing primary modulation algorithms for most regions of interest in terms of the CT number accuracy and noise. The proposed method was tested on a clinical cone beam CT, demonstrating that OSE corrects the scatter even when the modulator is not accurately registered. Conclusions: The proposed OSE algorithm improves the robustness and accuracy in scatter estimation and correction. This method is promising for scatter correction of various kinds of x-ray imaging modalities, such as x-ray radiography, cone beam CT, and the fourth-generation CT.
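
    As a sketch of how the listed prior knowledge can enter an optimization (the notation and weighting are ours, not necessarily the paper's): with m the modulated measurement, s the scatter estimate and m - s the recovered primary, one may solve

        \hat{s} = \arg\min_{s}\;
        \lambda_1 \lVert \nabla s \rVert_2^2
        + \lambda_2 \lVert W \,\nabla (m - s) \rVert_2^2
        \quad \text{subject to} \quad m - s \ge 0

    where the first term enforces a smooth scatter field, the second enforces local smoothness of the primary with edge-preserving weights W (small near boundaries such as the penumbra), and the constraint encodes positivity of the primary.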

  3. Simulation-based Testing of Control Software

    Energy Technology Data Exchange (ETDEWEB)

    Ozmen, Ozgur [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Nutaro, James J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sanyal, Jibonananda [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Olama, Mohammed M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-02-10

    It is impossible to adequately test complex software by examining its operation in a physical prototype of the system monitored. Adequate test coverage can require millions of test cases, and the cost of equipment prototypes combined with the real-time constraints of testing with them makes it infeasible to sample more than a small number of these tests. Model based testing seeks to avoid this problem by allowing for large numbers of relatively inexpensive virtual prototypes that operate in simulation time at a speed limited only by the available computing resources. In this report, we describe how a computer system emulator can be used as part of a model based testing environment; specifically, we show that a complete software stack, including operating system and application software, can be deployed within a simulated environment, and that these simulations can proceed as fast as possible. To illustrate this approach to model based testing, we describe how it is being used to test several building control systems that act to coordinate air conditioning loads for the purpose of reducing peak demand. These tests involve the use of ADEVS (A Discrete Event System Simulator) and QEMU (Quick Emulator) to host the operational software within the simulation, and a building model developed with the Modelica programming language using the Buildings Library and packaged as an FMU (Functional Mock-up Unit) that serves as the virtual test environment.

  4. Software Quality Assessment Tool Based on Meta-Models

    OpenAIRE

    Doneva Rositsa; Gaftandzhieva Silvia; Doneva Zhelyana; Staevsky Nevena

    2015-01-01

    In the software industry it is indisputably essential to control the quality of produced software systems in terms of capabilities for easy maintenance, reuse, portability and others in order to ensure reliability in the software development. But it is also clear that it is very difficult to achieve such a control through a ‘manual’ management of quality. There are a number of approaches for software quality assurance based typically on software quality models (e.g. ISO 9126, McCall’s, Boehm’s...

  5. Performing Verification and Validation in Reuse-Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  6. Research on software behavior trust based on hierarchy evaluation

    Science.gov (United States)

    Long, Ke; Xu, Haishui

    2017-08-01

    In view of the correlated nature of software behavior, we evaluate software behavior credibility at two levels: control flow and data flow. At the control flow level, a method for software behavior tracing based on a support vector machine (SVM) is proposed. At the data flow level, behavioral evidence evaluation based on a fuzzy decision analysis method is put forward.
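
    A minimal illustration of the control-flow-level idea in Python: feature vectors extracted from behavior traces (feature extraction itself is out of scope here, and the data are synthetic) are classified as trusted or anomalous with an SVM:

        import numpy as np
        from sklearn.svm import SVC

        rng = np.random.default_rng(3)
        X_trusted = rng.normal(0.0, 1.0, (100, 8))    # e.g. call-frequency features
        X_anomalous = rng.normal(2.5, 1.0, (100, 8))
        X = np.vstack([X_trusted, X_anomalous])
        y = np.array([1] * 100 + [0] * 100)           # 1 = trusted behavior

        clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
        print(clf.predict(rng.normal(0.0, 1.0, (1, 8))))   # expected: [1]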

  7. A ROOT based event display software for JUNO

    Science.gov (United States)

    You, Z.; Li, K.; Zhang, Y.; Zhu, J.; Lin, T.; Li, W.

    2018-02-01

    An event display software SERENA has been designed for the Jiangmen Underground Neutrino Observatory (JUNO). The software has been developed in the JUNO offline software system and is based on the ROOT display package EVE. It provides an essential tool to display detector and event data for better understanding of the processes in the detectors. The software has been widely used in JUNO detector optimization, simulation, reconstruction and physics study.

  8. Teaching Agile Software Engineering Using Problem-Based Learning

    Science.gov (United States)

    El-Khalili, Nuha H.

    2013-01-01

    Many studies have reported the utilization of Problem-Based Learning (PBL) in teaching Software Engineering courses. However, these studies have different views of the effectiveness of PBL. This paper presents the design of an Advanced Software Engineering course for undergraduate Software Engineering students that uses PBL to teach them Agile…

  9. Dependability modeling and assessment in UML-based software development.

    Science.gov (United States)

    Bernardi, Simona; Merseguer, José; Petriu, Dorina C

    2012-01-01

    Assessment of software nonfunctional properties (NFP) is an important problem in software development. In the context of model-driven development, an emerging approach for the analysis of different NFPs consists of the following steps: (a) to extend the software models with annotations describing the NFP of interest; (b) to transform automatically the annotated software model to the formalism chosen for NFP analysis; (c) to analyze the formal model using existing solvers; (d) to assess the software based on the results and give feedback to designers. Such a modeling→analysis→assessment approach can be applied to any software modeling language, be it general purpose or domain specific. In this paper, we focus on UML-based development and on the dependability NFP, which encompasses reliability, availability, safety, integrity, and maintainability. The paper presents the profile used to extend UML with dependability information, the model transformation to generate a DSPN formal model, and the assessment of the system properties based on the DSPN results.

  10. Proposing an Evidence-Based Strategy for Software Requirements Engineering.

    Science.gov (United States)

    Lindoerfer, Doris; Mansmann, Ulrich

    2016-01-01

    This paper discusses an evidence-based approach to software requirements engineering. The approach is called evidence-based, since it uses publications on the specific problem as a surrogate for stakeholder interests, to formulate risks and testing experiences. This complements the idea that agile software development models are more relevant, in which requirements and solutions evolve through collaboration between self-organizing cross-functional teams. The strategy is exemplified and applied to the development of a Software Requirements list used to develop software systems for patient registries.

  11. Property-Based Software Engineering Measurement

    Science.gov (United States)

    Briand, Lionel C.; Morasca, Sandro; Basili, Victor R.

    1997-01-01

    Little theory exists in the field of software system measurement. Concepts such as complexity, coupling, cohesion or even size are very often subject to interpretation and appear to have inconsistent definitions in the literature. As a consequence, there is little guidance provided to the analyst attempting to define proper measures for specific problems. Many controversies in the literature are simply misunderstandings and stem from the fact that some people talk about different measurement concepts under the same label (complexity is the most common case). There is a need to define unambiguously the most important measurement concepts used in the measurement of software products. One way of doing so is to define precisely what mathematical properties characterize these concepts, regardless of the specific software artifacts to which these concepts are applied. Such a mathematical framework could generate a consensus in the software engineering community and provide a means for better communication among researchers, better guidelines for analysts, and better evaluation methods for commercial static analyzers for practitioners. In this paper, we propose a mathematical framework which is generic, because it is not specific to any particular software artifact and rigorous, because it is based on precise mathematical concepts. We use this framework to propose definitions of several important measurement concepts (size, length, complexity, cohesion, coupling). It does not intend to be complete or fully objective; other frameworks could have been proposed and different choices could have been made. However, we believe that the formalisms and properties we introduce are convenient and intuitive. This framework contributes constructively to a firmer theoretical ground of software measurement.
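
    As one example of the framework's flavour, size measures are characterized by a small set of axioms, here paraphrased in our notation: non-negativity, a null value for systems without elements, and additivity over disjoint modules:

        \text{Size}(S) \ge 0,
        \qquad
        E_S = \emptyset \Rightarrow \text{Size}(S) = 0,
        \qquad
        \text{Size}(S) = \text{Size}(m_1) + \text{Size}(m_2)

    for a system S composed of two modules m_1, m_2 with no elements in common. Candidate size measures (e.g. lines of code, number of statements) can then be checked against such properties.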

  12. Software for ASS-500 based early warning system

    International Nuclear Information System (INIS)

    Lipinski, P.; Isajenko, K.

    1998-01-01

    The article describes the software for the management of an early warning system based on the ASS-500 station. The software can communicate with the central computer using the TCP/IP protocol. This allows remote control of the station through a modem or a local area network connection. The article describes the Windows-based user interface of the program.

  13. Movable Thomson scattering system based on optical fiber (TS-probe)

    International Nuclear Information System (INIS)

    Narihara, K.; Hayashi, H.

    2009-01-01

    This paper proposes a movable compact Thomson scattering (TS) system based on optical fibers (TS-probe). A TS-probe consists of a probe head, optical fiber, a laser diode, polychromators and lock-in amplifiers. The laser beam optics and light collection optics are mounted rigidly on the probe head, with a fixed scattering position. Laser light and scattered light are transmitted by flexible optical fibers, enabling us to move the TS-probe head freely during a plasma discharge. The light signal scattered from an amplitude-modulated laser is detected against the plasma light based on the principle of the lock-in amplifier. With a modulated laser power of 300 W, the scattered signal from a sheet plasma of 15 mm depth and n_e ~ 10^19 m^-3 will be measured with 10% accuracy by setting the integrating time to 0.1 s. The TS-probe head is like a 1/20 model of the currently operating LHD-TS. (author)
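
    The lock-in principle invoked above recovers a weak modulated signal buried in broadband plasma light by correlating with the modulation reference and averaging over the integration time. A self-contained Python illustration (all numbers invented, not the TS-probe design values):

        import numpy as np

        fs, f_mod, t_int = 100_000.0, 1_000.0, 0.1    # sample rate, mod. freq., s
        t = np.arange(int(fs * t_int)) / fs
        signal = 0.05 * np.sin(2.0 * np.pi * f_mod * t)             # scattered light
        noise = 0.5 * np.random.default_rng(4).normal(size=t.size)  # plasma light
        x = signal + noise

        # Demodulate against in-phase and quadrature references, then average.
        ref_i = np.sin(2.0 * np.pi * f_mod * t)
        ref_q = np.cos(2.0 * np.pi * f_mod * t)
        amp = 2.0 * np.hypot(np.mean(x * ref_i), np.mean(x * ref_q))
        print(f"recovered amplitude ~ {amp:.3f} (true 0.050)")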

  14. Integration of Simulink Models with Component-based Software Models

    Directory of Open Access Journals (Sweden)

    MARIAN, N.

    2008-06-01

    Full Text Available Model based development aims to facilitate the development of embedded control systems by emphasizing the separation of the design level from the implementation level. Model based design involves the use of multiple models that represent different views of a system, having different semantics of abstract system descriptions. Usually, in mechatronics systems, design proceeds by iterating model construction, model analysis, and model transformation. Constructing a MATLAB/Simulink model, a plant and controller behavior is simulated using graphical blocks to represent mathematical and logical constructs and process flow, then software code is generated. A Simulink model is a representation of the design or implementation of a physical system that satisfies a set of requirements. A software component-based system aims to organize system architecture and behavior as a means of computation, communication and constraints, using computational blocks and aggregates for both discrete and continuous behavior, different interconnection and execution disciplines for event-based and time-based controllers, and so on, to encompass the demands for more functionality, at even lower prices, and with opposite constraints. COMDES (Component-based Design of Software for Distributed Embedded Systems) is such a component-based system framework developed by the software engineering group of the Mads Clausen Institute for Product Innovation (MCI), University of Southern Denmark. Once specified, the software model has to be analyzed. One way of doing that is to integrate the model back into Simulink S-functions via wrapper files, and use its extensive simulation features, thus allowing an early exploration of the possible design choices over multiple disciplines. The paper describes a safe translation of a restricted set of MATLAB/Simulink blocks to COMDES software components, both for continuous and discrete behavior, and the transformation of the software system into the S

  15. Advanced methods for scattering amplitudes in gauge theories

    International Nuclear Information System (INIS)

    Peraro, Tiziano

    2014-01-01

    We present new techniques for the evaluation of multi-loop scattering amplitudes and their application to gauge theories, with relevance to the Standard Model phenomenology. We define a mathematical framework for the multi-loop integrand reduction of arbitrary diagrams, and elaborate algebraic approaches, such as the Laurent expansion method, implemented in the software Ninja, and the multivariate polynomial division technique by means of Groebner bases.
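
    The integrand reduction named here can be stated schematically. The following is the standard universal decomposition used in this line of work, quoted as a sketch of the general form; the thesis' exact parametrization may differ:

      \[
        \mathcal{I}(q_1,\dots,q_\ell) \;=\; \frac{\mathcal{N}(q_1,\dots,q_\ell)}{D_1 D_2 \cdots D_n},
        \qquad
        \mathcal{N} \;=\; \sum_{T \,\subseteq\, \{1,\dots,n\}} \Delta_T(q_1,\dots,q_\ell) \prod_{i \,\notin\, T} D_i ,
      \]

    where the residues \Delta_T are polynomials in the irreducible scalar products. They can be determined either by multivariate polynomial division modulo a Groebner basis of the cut equations, or coefficient by coefficient from a Laurent expansion of the integrand on the cut solutions, which is the approach implemented in Ninja.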

  16. Advanced methods for scattering amplitudes in gauge theories

    Energy Technology Data Exchange (ETDEWEB)

    Peraro, Tiziano

    2014-09-24

    We present new techniques for the evaluation of multi-loop scattering amplitudes and their application to gauge theories, with relevance to the Standard Model phenomenology. We define a mathematical framework for the multi-loop integrand reduction of arbitrary diagrams, and elaborate algebraic approaches, such as the Laurent expansion method, implemented in the software Ninja, and the multivariate polynomial division technique by means of Groebner bases.

  17. Electronic Health Record for Intensive Care based on Usual Windows Based Software.

    Science.gov (United States)

    Reper, Arnaud; Reper, Pascal

    2015-08-01

    In Intensive Care Units, the amount of data to be processed for patient care, the turnover of the patients, and the necessity for reliability and for review processes indicate the use of Patient Data Management Systems (PDMS) and electronic health records (EHR). To respond to the needs of an Intensive Care Unit and not be locked into proprietary software, we developed an EHR based on usual software and components. The software was designed as a client-server architecture running on the Windows operating system and powered by the Access database system. The client software was developed using the Visual Basic interface library. The application offers the users the following functions: capture of medical notes, observations and treatments, nursing charts with administration of medications, scoring systems for classification, and the possibility to encode medical activities for billing processes. Since its deployment in September 2004, the EHR has been used to care for more than five thousand patients with the expected software reliability, and it has facilitated data management and review processes. Communication with other medical software was not developed from the start and is realized by means of a basic communication engine. Further upgrades of the system will include multi-platform support, use of a typed language with static analysis, and a configurable interface. The developed system, based on usual software components, was able to respond to the medical needs of the local ICU environment. The use of Windows for development allowed us to customize the software to the preexisting organization and contributed to the acceptability of the whole system.

  18. Instructional Uses of Web-Based Survey Software

    Directory of Open Access Journals (Sweden)

    Concetta A. DePaolo, Ph.D.

    2006-07-01

    Full Text Available Recent technological advances have led to changes in how instruction is delivered. Such technology can create opportunities to enhance instruction and make instructors more efficient in performing instructional tasks, especially if the technology is easy to use and requires no training. One such technology, web-based survey software, is extremely accessible for anyone with basic computer skills. Web-based survey software can be used for a variety of instructional purposes to streamline instructor tasks, as well as enhance instruction and communication with students. Following a brief overview of the technology, we discuss how Web Forms from nTreePoint can be used to conduct instructional surveys, collect course feedback, conduct peer evaluations of group work, collect completed assignments, schedule meeting times among multiple people, and aid in pedagogical research. We also discuss our experiences with these tasks within traditional on-campus courses and how they were enhanced or expedited by the use of web-based survey software.

  19. Software development for a switch-based data acquisition system

    Energy Technology Data Exchange (ETDEWEB)

    Booth, A. (Superconducting Super Collider Lab., Dallas, TX (United States)); Black, D.; Walsh, D. (Fermi National Accelerator Lab., Batavia, IL (United States))

    1991-12-01

    We report on the software aspects of the development of a switch-based data acquisition system at Fermilab. This paper describes how, with the goal of providing an "integrated systems engineering" environment, several powerful software tools were put in place to facilitate extensive exploration of all aspects of the design. These tools include a simulation package, a graphics package and an Expert System shell, which have been integrated to provide an environment that encourages the close interaction of hardware and software engineers. This paper includes a description of the simulation, user interface, embedded software, remote procedure calls, and diagnostic software which together have enabled us to provide real-time control and monitoring of a working prototype switch-based data acquisition (DAQ) system.

  20. Music Learning Based on Computer Software

    OpenAIRE

    Baihui Yan; Qiao Zhou

    2017-01-01

    In order to better develop and improve students' music learning, the authors propose a method of music learning based on computer software. Using computer music software to assist teaching is still a new field. We conducted an in-depth analysis of computer-enabled music learning and of the status of music learning in secondary schools, obtaining specific analytical data. The survey data show that students have many cognitive problems in the current music classroom, and yet teach...

  1. Interface-based software testing

    OpenAIRE

    Aziz Ahmad Rais

    2016-01-01

    Software quality is determined by assessing the characteristics that specify how it should work, which are verified through testing. If it were possible to touch, see, or measure software, it would be easier to analyze and prove its quality. Unfortunately, software is an intangible asset, which makes testing complex. This is especially true when software quality is not a question of particular functions that can be tested through a graphical user interface. The primary objective of softwar...

  2. User-friendly software for SANS data reduction and analysis

    International Nuclear Information System (INIS)

    Biemann, P.; Haese-Seiller, M.; Staron, P.

    1999-01-01

    At the Geesthacht Neutron Facility (GeNF), new software is being developed for the reduction of two-dimensional small-angle neutron scattering (SANS) data. The main motivation for this work was to create software for users of our SANS facilities that is easy to use. Another motivation was to provide users with software they can also use at their home institutes. Therefore, the software is implemented on a personal computer running Windows. The program reads raw data from an area detector in binary or ASCII format and produces ASCII files containing the scattering curve. The cross section can be averaged over the whole area of the detector or over user-defined sectors only. Scripts can be created for processing large numbers of files. (author)
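
    The core of such a reduction is the azimuthal average of the detector image into a one-dimensional curve I(Q). The sketch below shows this step under an assumed parallel-beam, small-angle geometry; the function name and instrument parameters (pixel size, detector distance, wavelength) are illustrative, not those of the GeNF software. A sector average would simply add an azimuthal-angle mask before binning.

      import numpy as np

      def radial_average(image, center, pixel_size, distance, wavelength, n_bins=100):
          """Average detector counts in rings of constant Q around the beam center."""
          ny, nx = image.shape
          y, x = np.indices((ny, nx))
          r = np.hypot((x - center[0]) * pixel_size, (y - center[1]) * pixel_size)
          theta = 0.5 * np.arctan(r / distance)               # half of the scattering angle 2-theta
          q = 4.0 * np.pi * np.sin(theta) / wavelength        # momentum transfer
          bins = np.linspace(q.min(), q.max(), n_bins + 1)
          idx = np.clip(np.digitize(q.ravel(), bins) - 1, 0, n_bins - 1)
          counts = np.bincount(idx, weights=image.ravel(), minlength=n_bins)
          npix = np.bincount(idx, minlength=n_bins)
          return 0.5 * (bins[:-1] + bins[1:]), counts / np.maximum(npix, 1)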

  3. Stochastic Differential Equation-Based Flexible Software Reliability Growth Model

    Directory of Open Access Journals (Sweden)

    P. K. Kapur

    2009-01-01

    Full Text Available Several software reliability growth models (SRGMs) have been developed by software developers for tracking and measuring the growth of reliability. As the size of a software system grows large and the number of faults detected during the testing phase becomes large, the change in the number of faults detected and removed through each debugging becomes small compared with the initial fault content at the beginning of the testing phase. In such a situation, we can model the software fault detection process as a stochastic process with a continuous state space. In this paper, we propose a new software reliability growth model based on an Itô-type stochastic differential equation. We consider an SDE-based generalized Erlang model with a logistic error detection function. The model is estimated and validated on real-life data sets cited in the literature to show its flexibility. The proposed model, integrating the concept of a stochastic differential equation, performs comparatively better than the existing NHPP-based models.
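
    As a concrete illustration of the modelling idea (not the paper's exact generalized Erlang model), the following Euler-Maruyama sketch simulates a generic Ito-type SRGM, dN = b(t)(a - N) dt + sigma (a - N) dW, with a logistic detection-rate function; all parameter values are invented.

      import numpy as np

      a, sigma, dt, T = 100.0, 0.05, 0.01, 30.0              # total faults, noise level, time step, horizon
      b = lambda t: 0.5 / (1.0 + 10.0 * np.exp(-0.5 * t))    # illustrative logistic detection rate

      rng = np.random.default_rng(0)
      steps = int(T / dt)
      N = np.zeros(steps + 1)                                # cumulative faults detected
      for k in range(steps):
          dW = rng.normal(0.0, np.sqrt(dt))                  # Brownian increment
          N[k + 1] = N[k] + b(k * dt) * (a - N[k]) * dt + sigma * (a - N[k]) * dW

      print(f"faults detected by t={T:.0f} in this sample path: {N[-1]:.1f} of {a:.0f}")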

  4. Larmor-precession based neutron scattering instrumentation

    International Nuclear Information System (INIS)

    Ioffe, Alexander

    2009-01-01

    The Larmor precession of the neutron spin in a magnetic field allows the attachment of a Larmor clock to every neutron. Such Larmor labelling opens the possibility of developing unusual neutron scattering techniques, in which the energy (momentum) resolution does not require the initial and final states to be well selected. In principle this allows very high energy (momentum) resolution that is not feasible at all with conventional neutron scattering techniques, because the required neutron beam monochromatization (collimation) would result in intolerable intensity losses. Such decoupling of resolution and collimation allows, for example, a significant increase in the luminosity of small-angle scattering or high-resolution diffractometers, a fact that opens new perspectives for their implementation at medium-flux neutron sources. Different kinds of Larmor clock-based instrumentation, in particular two alternative NSE techniques using rotating and time-gradient magnetic field arrangements, which can be considered inexpensive and affordable alternatives to present-day NSE techniques, will be discussed, and results of simulations and first experiments will be presented. (author)
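
    The Larmor labelling idea rests on one standard textbook relation, reproduced here for orientation (it is not specific to this paper): a neutron of velocity v traversing a field region of length L and strength B accumulates the precession phase

      \[
        \phi \;=\; \gamma_L \, \frac{B L}{v},
      \]

    where \gamma_L is the neutron gyromagnetic ratio. In a spin-echo arrangement the phases accumulated before and after the sample nearly cancel, so the net phase measures the small velocity (energy) change on scattering, independently of how broad the incident velocity distribution is.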

  5. Model-based engineering for medical-device software.

    Science.gov (United States)

    Ray, Arnab; Jetley, Raoul; Jones, Paul L; Zhang, Yi

    2010-01-01

    This paper demonstrates the benefits of adopting model-based design techniques for engineering medical device software. Using a patient-controlled analgesic (PCA) infusion pump as a candidate medical device, the authors show how using models to capture design information allows for (i) fast and efficient construction of executable device prototypes, (ii) creation of a standard, reusable baseline software architecture for a particular device family, (iii) formal verification of the design against safety requirements, and (iv) creation of a safety framework that reduces verification costs for future versions of the device software.

  6. Study of scattering from turbulence structure generated by propeller with FLUENT

    Science.gov (United States)

    Luo, Gen

    2017-07-01

    In this article, the turbulence structure generated by a propeller is simulated with the computational fluid dynamics (CFD) software FLUENT. With the method of moments, the backscattering radar cross sections (RCS) of the turbulence structure are calculated. The scattering results reflect the turbulent intensity of the wave profiles. For wake turbulence at low rotating speed, the scattering intensity of HH polarization is much smaller than that of VV polarization at large incident angles. When the turbulence becomes stronger at high rotating speed, the scattering intensity of HH polarization also becomes stronger at large incident angles, almost the same as VV polarization. The bistatic scattering of the turbulence structure shows similar behavior. These results indicate that the turbulence structure can also give rise to an anomaly compared with the traditional sea surface. The study of electromagnetic (EM) scattering from the turbulence structure generated by a propeller can help in better understanding the scattering from different kinds of waves and provide a better basis for explaining the anomalies of EM scattering from sea surfaces.

  7. Performance Test of a Software-Based OpenFlow Switch Based on OpenWrt

    OpenAIRE

    Kartadie, Rikie; Suryanto, Tommy

    2015-01-01

    The rapid development of Software-Defined Networking has been felt by major vendors. HP, Google and IBM have begun changing the routing-switching pattern of their networks from the traditional routing-switching model to a Software-Defined Network routing-switching infrastructure. To experiment with OpenFlow, researchers often have to use dedicated OpenFlow hardware switches released by several vendors at a high price. In reality, a software-based sw...

  8. Software-based Microarchitectural Attacks

    OpenAIRE

    Gruss, Daniel

    2017-01-01

    Modern processors are highly optimized systems where every single cycle of computation time matters. Many optimizations depend on the data that is being processed. Software-based microarchitectural attacks exploit effects of these optimizations. Microarchitectural side-channel attacks leak secrets from cryptographic computations, from general purpose computations, or from the kernel. This leakage even persists across all common isolation boundaries, such as processes, containers, and virtual ...

  9. Nd:YAG Laser-Based Dual-Line Detection Rayleigh Scattering and Current Efforts on UV, Filtered Rayleigh Scattering

    Science.gov (United States)

    Otugen, M. Volkan; Popovic, Svetozar

    1996-01-01

    Ongoing research in Rayleigh scattering diagnostics for variable-density low-speed flow applications and for supersonic flow measurements is described. During the past several years, the focus has been on the development and use of an Nd:YAG-based Rayleigh scattering system with improved signal-to-noise characteristics and with applicability to complex, confined flows. This activity serves other research projects in the Aerodynamics Laboratory which require the non-contact, accurate, time-frozen measurement of gas density, pressure, and temperature (each separately), over a fairly wide dynamic range of each parameter. Recently, with the acquisition of a new seed-injected Nd:YAG laser, effort has also been directed to the development of a high-speed velocity probe based on a spectrally resolved Rayleigh scattering technique.

  10. NASA Software Cost Estimation Model: An Analogy Based Estimation Model

    Science.gov (United States)

    Hihn, Jairus; Juster, Leora; Menzies, Tim; Mathew, George; Johnson, James

    2015-01-01

    The cost estimation of software development activities is increasingly critical for large-scale integrated projects such as those at DOD and NASA, especially as the software systems become larger and more complex. As an example, MSL (Mars Science Laboratory), developed at the Jet Propulsion Laboratory, launched with over 2 million lines of code, making it the largest robotic spacecraft ever flown (based on the size of its software). Software development activities are also notorious for their cost growth, with NASA flight software averaging over 50% cost growth. All across the agency, estimators and analysts are increasingly being tasked to develop reliable cost estimates in support of program planning and execution. While there has been extensive work on improving parametric methods, there is very little focus on the use of models based on analogy and clustering algorithms. In this paper we summarize our findings on effort/cost model estimation and model development based on ten years of software effort estimation research using data mining and machine learning methods to develop estimation models based on analogy and clustering. The NASA Software Cost Model's performance is evaluated by comparing it to COCOMO II, linear regression, and K-nearest-neighbor prediction model performance on the same data set.
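
    The analogy-based core of such a model can be sketched in a few lines: normalize the historical projects' features, find the k nearest neighbours of the new project, and average their efforts. The feature names, data and choice of k below are hypothetical; the actual NASA model adds clustering and careful feature selection.

      import numpy as np

      def knn_effort(history_X, history_y, project, k=3):
          """Predict effort as the mean of the k most similar historical projects."""
          X = np.asarray(history_X, dtype=float)
          lo, hi = X.min(axis=0), X.max(axis=0)
          span = np.where(hi > lo, hi - lo, 1.0)
          Xn = (X - lo) / span                                 # min-max normalize features
          pn = (np.asarray(project, dtype=float) - lo) / span
          d = np.linalg.norm(Xn - pn, axis=1)                  # Euclidean analogy distance
          nearest = np.argsort(d)[:k]
          return float(np.mean(np.asarray(history_y)[nearest]))

      # features: [KSLOC, team experience (1-5), requirements volatility (1-5)] -- made up
      X = [[120, 4, 2], [45, 3, 3], [200, 2, 4], [80, 5, 1], [150, 3, 3]]
      y = [900, 300, 2100, 450, 1200]                          # person-months (made up)
      print(knn_effort(X, y, [100, 4, 2]))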

  11. Component-based development of software language engineering tools

    NARCIS (Netherlands)

    Ssanyu, J.; Hemerik, C.

    2011-01-01

    In this paper we outline how Software Language Engineering (SLE) could benefit from Component-based Software Development (CBSD) techniques and present an architecture aimed at developing a coherent set of lightweight SLE components, fitting into a general-purpose component framework. In order to

  12. A Web-Based Learning System for Software Test Professionals

    Science.gov (United States)

    Wang, Minhong; Jia, Haiyang; Sugumaran, V.; Ran, Weijia; Liao, Jian

    2011-01-01

    Fierce competition, globalization, and technology innovation have forced software companies to search for new ways to improve competitive advantage. Web-based learning is increasingly being used by software companies as an emergent approach for enhancing the skills of knowledge workers. However, the current practice of Web-based learning is…

  13. Scatter-Reducing Sounding Filtration Using a Genetic Algorithm and Mean Monthly Standard Deviation

    Science.gov (United States)

    Mandrake, Lukas

    2013-01-01

    Retrieval algorithms like that used by the Orbiting Carbon Observatory (OCO)-2 mission generate massive quantities of data of varying quality and reliability. A computationally efficient, simple method of labeling problematic data points, or predicting soundings that will fail, is required for basic operation, given that only 6% of the retrieved data may be operationally processed. This method automatically obtains a filter designed to reduce scatter based on a small number of input features. Most machine-learning filter construction algorithms attempt to predict the error in the CO2 value. By using the surrogate goal of Mean Monthly Standard Deviation (MMS), the aim is to reduce the scatter of the retrieved CO2 rather than solving the harder problem of reducing CO2 error. This lends itself to improved interpretability and performance. The software reduces the scatter of retrieved CO2 values globally based on a minimum number of input features. It can be used as a prefilter to reduce the number of soundings requested, or as a post-filter to label data quality. The use of MMS provides a much cleaner, clearer filter than the standard ABS(CO2 - truth) metrics previously employed by competing methods. The software's main strength lies in a clearer (i.e., fewer features required) filter that more efficiently reduces scatter in retrieved CO2 rather than focusing on the more complex (and easily removed) bias issues.
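
    The surrogate metric is easy to state in code. The sketch below scores a candidate filter (a boolean mask over soundings) by the mean of per-month standard deviations of the retained CO2; the DataFrame layout and column names are assumptions, not the mission's actual data format.

      import pandas as pd

      def mean_monthly_stdev(soundings: pd.DataFrame, mask) -> float:
          """Mean over calendar months of the per-month scatter of retained XCO2 values."""
          kept = soundings.loc[mask, "xco2"]                      # hypothetical column name
          return kept.groupby(kept.index.to_period("M")).std().mean()

      # A candidate filter is just a boolean mask built from a few input features,
      # e.g. the thresholds a genetic algorithm would tune (feature names invented):
      # mask = (soundings["albedo"] > 0.1) & (soundings["aod"] < 0.3)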

  14. Product-based Safety Certification for Medical Devices Embedded Software.

    Science.gov (United States)

    Neto, José Augusto; Figueiredo Damásio, Jemerson; Monthaler, Paul; Morais, Misael

    2015-01-01

    Worldwide medical device embedded software certification practices are currently focused on manufacturing best practices. In Brazil, the national regulatory agency does not hold a local certification process for software-intensive medical devices and admits international certification (e.g. FDA and CE) from local and international industry to operate in the Brazilian health care market. We present here a product-based certification process as a candidate process to support the Brazilian regulatory agency ANVISA in medical device software regulation. Center of Strategic Technology for Healthcare (NUTES) medical device embedded software certification is based on a solid safety quality model and has been tested with reasonable success against the Class I risk device Generic Infusion Pump (GIP).

  15. Speckle-learning-based object recognition through scattering media.

    Science.gov (United States)

    Ando, Takamasa; Horisaki, Ryoichi; Tanida, Jun

    2015-12-28

    We experimentally demonstrated object recognition through scattering media based on direct machine learning of a number of speckle intensity images. In the experiments, speckle intensity images of amplitude or phase objects on a spatial light modulator between scattering plates were captured by a camera. We used the support vector machine for binary classification of the captured speckle intensity images of face and non-face data. The experimental results showed that speckles are sufficient for machine learning.
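
    The classification step maps onto a few lines of scikit-learn: flatten each speckle intensity image to a vector and train a support vector machine, as in the record. The data below is a synthetic stand-in for the experimental speckle images.

      import numpy as np
      from sklearn.svm import SVC
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(1)
      n, h, w = 200, 32, 32
      X = rng.random((n, h * w))        # placeholder speckle intensity images, flattened
      y = rng.integers(0, 2, n)         # face / non-face labels (placeholder)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
      clf = SVC(kernel="linear").fit(X_tr, y_tr)
      print("held-out accuracy:", clf.score(X_te, y_te))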

  16. Post-acquisition data processing for the screening of transformation products of different organic contaminants. Two-year monitoring of river water using LC-ESI-QTOF-MS and GCxGC-EI-TOF-MS.

    Science.gov (United States)

    López, S Herrera; Ulaszewska, M M; Hernando, M D; Martínez Bueno, M J; Gómez, M J; Fernández-Alba, A R

    2014-11-01

    This study describes a comprehensive strategy for detecting and elucidating the chemical structures of expected and unexpected transformation products (TPs) of chemicals found in river water and effluent wastewater samples, using liquid chromatography coupled to an electrospray ionization quadrupole time-of-flight mass spectrometer (LC-ESI-QTOF-MS), with post-acquisition data processing and an automated search using an in-house database. The efficacy of the mass defect filtering (MDF) approach for screening metabolites from common biotransformation pathways was tested, and it was shown to be sufficiently sensitive and applicable for detecting metabolites in environmental samples. Four omeprazole metabolites and two venlafaxine metabolites were identified in river water samples. This paper reports the analytical results obtained during two years of monitoring, carried out at eight sampling points along the Henares River (Spain). Multiresidue monitoring, for targeted analysis, covers a group of 122 chemicals, amongst which are pharmaceuticals, personal care products, pesticides and PAHs. For this purpose, two analytical methods were used, based on direct injection with an LC-ESI-QTOF-MS system and on stir bar sorptive extraction (SBSE) with two-dimensional gas chromatography coupled with a time-of-flight mass spectrometer (GCxGC-EI-TOF-MS).
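
    Mass defect filtering itself reduces to a small computation: a candidate ion passes if its mass defect lies within a window around the parent's defect shifted by a known biotransformation. The sketch below uses hydroxylation (+15.9949 Da) and an illustrative 40 mDa window; masses are neutral monoisotopic values and the second candidate is a deliberately unrelated mass.

      def mass_defect(m: float) -> float:
          """Fractional part of a monoisotopic mass, signed around the nearest integer."""
          return m - round(m)

      def mdf_pass(candidate: float, parent: float,
                   shift: float = 15.9949, window: float = 0.040) -> bool:
          expected = mass_defect(parent + shift)        # defect expected after the transformation
          return abs(mass_defect(candidate) - expected) <= window

      omeprazole = 345.1147                             # neutral monoisotopic mass
      for m in (361.1096, 330.3352):                    # hydroxy-omeprazole vs. an unrelated lipid-like mass
          print(m, mdf_pass(m, omeprazole))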

  17. Integration of Simulink Models with Component-based Software Models

    DEFF Research Database (Denmark)

    Marian, Nicolae

    2008-01-01

    Model based development aims to facilitate the development of embedded control systems by emphasizing the separation of the design level from the implementation level. Model based design involves the use of multiple models that represent different views of a system, having different semantics...... of abstract system descriptions. Usually, in mechatronics systems, design proceeds by iterating model construction, model analysis, and model transformation. Constructing a MATLAB/Simulink model, a plant and controller behavior is simulated using graphical blocks to represent mathematical and logical...... constraints. COMDES (Component-based Design of Software for Distributed Embedded Systems) is such a component-based system framework developed by the software engineering group of Mads Clausen Institute for Product Innovation (MCI), University of Southern Denmark. Once specified, the software model has...

  18. Scattering-angle based filtering of the waveform inversion gradients

    KAUST Repository

    Alkhalifah, Tariq Ali

    2014-01-01

    Full waveform inversion (FWI) requires a hierarchical approach to maneuver the complex non-linearity associated with the problem of velocity update. In anisotropic media, the non-linearity becomes far more complex with the potential trade-off between the multiparameter descriptions of the model. A gradient filter helps us access the parts of the gradient that are suitable to combat the potential non-linearity and parameter trade-off. The filter is based on representing the gradient in the time-lag normalized domain, in which the low-scattering-angle part of the gradient update is initially muted out in the FWI implementation, in what we may refer to as a scattering angle continuation process. The result is a long-wavelength update dominated by the transmission part of the update gradient. In this case, even 10 Hz data can produce vertically near-zero-wavenumber updates suitable for a background correction of the model. Relaxing the filtering at a later stage in the FWI implementation allows smaller scattering angles to contribute higher-resolution information to the model. The benefit of the extended-domain filtering of the gradient lies not only in its ability to provide low-wavenumber gradients guided by the scattering angle, but also in its potential to provide gradients free of unphysical energy that may correspond to unrealistic scattering angles.
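
    The link between scattering angle and update wavenumber that underlies this filter is commonly written as (a standard relation in the FWI literature, quoted here for orientation):

      \[
        |\mathbf{k}_m| \;=\; \frac{2\omega}{v}\,\cos\frac{\theta}{2},
      \]

    where k_m is the model wavenumber of the gradient update, \omega the angular frequency, v the local velocity and \theta the scattering (opening) angle. Transmission-like events with \theta near 180 degrees thus map to near-zero wavenumbers, which is why muting low scattering angles leaves the long-wavelength background update.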

  19. Scattering-angle based filtering of the waveform inversion gradients

    KAUST Repository

    Alkhalifah, Tariq Ali

    2014-11-22

    Full waveform inversion (FWI) requires a hierarchical approach to maneuver the complex non-linearity associated with the problem of velocity update. In anisotropic media, the non-linearity becomes far more complex with the potential trade-off between the multiparameter descriptions of the model. A gradient filter helps us access the parts of the gradient that are suitable to combat the potential non-linearity and parameter trade-off. The filter is based on representing the gradient in the time-lag normalized domain, in which the low-scattering-angle part of the gradient update is initially muted out in the FWI implementation, in what we may refer to as a scattering angle continuation process. The result is a long-wavelength update dominated by the transmission part of the update gradient. In this case, even 10 Hz data can produce vertically near-zero-wavenumber updates suitable for a background correction of the model. Relaxing the filtering at a later stage in the FWI implementation allows smaller scattering angles to contribute higher-resolution information to the model. The benefit of the extended-domain filtering of the gradient lies not only in its ability to provide low-wavenumber gradients guided by the scattering angle, but also in its potential to provide gradients free of unphysical energy that may correspond to unrealistic scattering angles.

  20. SCT: a suite of programs for comparing atomistic models with small-angle scattering data.

    Science.gov (United States)

    Wright, David W; Perkins, Stephen J

    2015-06-01

    Small-angle X-ray and neutron scattering techniques characterize proteins in solution and complement high-resolution structural studies. They are of particular utility when large proteins cannot be crystallized or when the structure is altered by solution conditions. Atomistic models of the averaged structure can be generated through constrained modelling, a technique in which known domain or subunit structures are combined with linker models to produce candidate global conformations. By randomizing the configuration adopted by the different elements of the model, thousands of candidate structures are produced. Next, theoretical scattering curves are generated for each model for trial-and-error fits to the experimental data. From these, a small family of best-fit models is identified. In order to facilitate both the computation of theoretical scattering curves from atomistic models and their comparison with experiment, the SCT suite of tools was developed. SCT also includes programs that provide sequence-based estimates of protein volume (either incorporating hydration or not) and add a hydration layer to models for X-ray scattering modelling. The original SCT software, written in Fortran, resulted in the first atomistic scattering structures to be deposited in the Protein Data Bank, and 77 structures for antibodies, complement proteins and anionic oligosaccharides were determined between 1998 and 2014. For the first time, this software is publicly available, alongside an easier-to-use reimplementation of the same algorithms in Python. Both versions of SCT have been released as open-source software under the Apache 2 license and are available for download from https://github.com/dww100/sct.
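
    For orientation, the simplest form of the forward calculation that SCT automates is the Debye equation, I(Q) = sum_ij f_i f_j sin(Q r_ij)/(Q r_ij). The sketch below evaluates it with uniform scattering lengths over random pseudo-atoms; SCT itself uses sphere models, hydration layers and proper form factors.

      import numpy as np

      def debye_curve(coords, q_values, f=1.0):
          """Theoretical scattering curve I(Q) from point scatterers via the Debye equation."""
          coords = np.asarray(coords, dtype=float)
          d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)  # pair distances
          # np.sinc(x/pi) = sin(x)/x with the x -> 0 limit handled correctly
          return np.array([(f * f * np.sinc(q * d / np.pi)).sum() for q in q_values])

      rng = np.random.default_rng(0)
      atoms = rng.random((50, 3)) * 30.0        # 50 pseudo-atoms in a 30 Angstrom box (made up)
      q = np.linspace(0.01, 0.5, 50)            # momentum transfer grid, 1/Angstrom
      print(debye_curve(atoms, q)[:5])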

  1. Testing digital safety system software with a testability measure based on a software fault tree

    International Nuclear Information System (INIS)

    Sohn, Se Do; Hyun Seong, Poong

    2006-01-01

    Using predeveloped software, a digital safety system is designed that meets the quality standards of a safety system. To demonstrate the quality, the design process and operating history of the product are reviewed, along with configuration management practices. The application software of the safety system is developed in accordance with the planned life cycle. Testing, which is a major phase that takes significant time in the overall life cycle, can be optimized if the testability of the software can be evaluated. The proposed testability measure of the software is based on the entropy of the importance of basic statements and the failure probability from a software fault tree. To calculate testability, a fault tree is used in the analysis of the source code. With a quantitative measure of testability, testing can be optimized. The proposed testability measure can also be used to demonstrate whether test cases based on uniform partitions, such as branch coverage criteria, result in homogeneous partitions, which are known to be more effective than random testing. In this paper, the testability measure is calculated for the modules of a nuclear power plant's safety software. Module testing with branch coverage criteria required fewer test cases when the module had higher testability. The result shows that the testability measure can be used to evaluate whether partitions have homogeneous characteristics.
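
    The entropy ingredient of the proposed measure can be illustrated directly. Given the fault-tree-derived importance weights of a module's basic statements as input (assumed here, since deriving them requires the fault tree itself), Shannon entropy is high when importance is spread evenly; the weights below are invented.

      import math

      def statement_entropy(importances):
          """Shannon entropy (bits) of normalized statement-importance weights."""
          total = sum(importances)
          probs = [w / total for w in importances if w > 0]
          return -sum(p * math.log2(p) for p in probs)

      print(statement_entropy([0.4, 0.3, 0.2, 0.1]))   # mixed importance: ~1.85 bits
      print(statement_entropy([1, 1, 1, 1]))           # uniform importance: maximal, 2 bits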

  2. Designing on ICT reconstruction software based on DSP techniques

    International Nuclear Information System (INIS)

    Liu Jinhui; Xiang Xincheng

    2006-01-01

    The convolution back-projection (CBP) algorithm is generally used for CT image reconstruction in ICT and is usually performed on a PC or workstation. To add multi-platform capability to CT reconstruction software, a CT reconstruction method based on modern digital signal processor (DSP) techniques is proposed and realized in this paper. A hardware system based on TI's C6701 DSP processor is selected to support the software. The CT reconstruction software is written entirely in assembly language specific to the DSP hardware. It runs on TI's C6701 EVM board; given CT data as input, it produces CT images that satisfy practical demands. (authors)
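
    The algorithm being ported is conventional; a NumPy rendering of convolution back-projection for parallel-beam data is sketched below (ramp filter in the Fourier domain, then back-projection). This is an illustrative reference implementation, not the paper's assembly code.

      import numpy as np

      def cbp_reconstruct(sinogram, angles_deg):
          """Reconstruct an n x n image from a (n_angles, n_det) parallel-beam sinogram."""
          n_angles, n_det = sinogram.shape
          # Ramp (Ram-Lak) filter applied to each projection in the Fourier domain
          ramp = np.abs(np.fft.fftfreq(n_det))
          filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))
          # Back-project each filtered projection across the image grid
          image = np.zeros((n_det, n_det))
          c = n_det // 2
          y, x = np.mgrid[:n_det, :n_det] - c
          for proj, ang in zip(filtered, np.deg2rad(angles_deg)):
              t = np.round(x * np.cos(ang) + y * np.sin(ang)).astype(int) + c
              valid = (t >= 0) & (t < n_det)
              image[valid] += proj[t[valid]]
          return image * np.pi / (2 * len(angles_deg))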

  3. IT-based Value Creation in Serial Acquisitions

    DEFF Research Database (Denmark)

    Henningsson, Stefan; Yetton, Philip

    2013-01-01

    serial acquirers realize IT-based value, we integrate and model the findings on individual acquisitions from the extant literature, and extend that model to explain the effects of sequential acquisitions in a growth-by-acquisition strategy. This extended model, drawing on the Resource-Based Theory......The extant research on post-acquisition IT integration analyzes how acquirers realize IT-based value in individual acquisitions. However, serial acquirers make 60% of acquisitions. These acquisitions are not isolated events, but are components in growth-by-acquisition programs. To explain how...

  4. Radiative transfer through terrestrial atmosphere and ocean: Software package SCIATRAN

    International Nuclear Information System (INIS)

    Rozanov, V.V.; Rozanov, A.V.; Kokhanovsky, A.A.; Burrows, J.P.

    2014-01-01

    SCIATRAN is a comprehensive software package for the modeling of radiative transfer processes in the terrestrial atmosphere and ocean in the spectral range from the ultraviolet to the thermal infrared (0.18–40 μm), including multiple scattering processes, polarization, thermal emission and ocean–atmosphere coupling. The software is capable of modeling spectral and angular distributions of the intensity or the Stokes vector of the transmitted, scattered, reflected, and emitted radiation assuming either a plane-parallel or a spherical atmosphere. Simulations are done either in the scalar or in the vector mode (i.e. accounting for the polarization) for observations by space-, air-, ship- and balloon-borne, ground-based, and underwater instruments in various viewing geometries (nadir, off-nadir, limb, occultation, zenith-sky, off-axis). All significant radiative transfer processes are accounted for: e.g. Rayleigh scattering, scattering by aerosol and cloud particles, absorption by gaseous components, and bidirectional reflection by an underlying surface, including Fresnel reflection from a flat or roughened ocean surface. The software package contains several radiative transfer solvers, including finite-difference and discrete-ordinate techniques, an extensive database, and a specific module for solving inverse problems. In contrast to many other radiative transfer codes, SCIATRAN incorporates an efficient approach to calculate the so-called Jacobians, i.e. derivatives of the intensity with respect to various atmospheric and surface parameters. In this paper we discuss the numerical methods used in SCIATRAN to solve the scalar and vector radiative transfer equation, describe the databases of atmospheric, oceanic, and surface parameters incorporated in SCIATRAN, and demonstrate how to solve selected radiative transfer problems using the SCIATRAN package. During the last decades, many studies have been published demonstrating that SCIATRAN is a valuable

  5. Ultrafast cone-beam CT scatter correction with GPU-based Monte Carlo simulation

    Directory of Open Access Journals (Sweden)

    Yuan Xu

    2014-03-01

    Full Text Available Purpose: Scatter artifacts severely degrade the image quality of cone-beam CT (CBCT). We present an ultrafast scatter correction framework using GPU-based Monte Carlo (MC) simulation and a prior patient CT image, aiming to automatically finish the whole process, including both scatter correction and reconstruction, within 30 seconds. Methods: The method consists of six steps: (1) FDK reconstruction using raw projection data; (2) rigid registration of the planning CT to the FDK result; (3) MC scatter calculation at sparse view angles using the planning CT; (4) interpolation of the calculated scatter signals to other angles; (5) removal of scatter from the raw projections; (6) FDK reconstruction using the scatter-corrected projections. In addition to using a GPU to accelerate the MC photon simulations, we also use a small number of photons and a down-sampled CT image in the simulation to further reduce computation time. A novel denoising algorithm is used to eliminate MC noise from the simulated scatter images caused by the low photon numbers. The method is validated on a simulated head-and-neck case with 364 projection angles. Results: We have examined the variation of the scatter signal among projection angles using Fourier analysis. It is found that scatter images at 31 angles are sufficient to restore those at all angles with < 0.1% error. For the simulated patient case with a resolution of 512 × 512 × 100, we simulated 5 × 10^6 photons per angle. The total computation time is 20.52 seconds on an Nvidia GTX Titan GPU, and the time at each step is 2.53, 0.64, 14.78, 0.13, 0.19, and 2.25 seconds, respectively. The scatter-induced shading/cupping artifacts are substantially reduced, and the average HU error of a region-of-interest is reduced from 75.9 to 19.0 HU. Conclusion: A practical ultrafast MC-based CBCT scatter correction scheme is developed; it accomplishes the whole procedure of scatter correction and reconstruction within 30 seconds.

  6. The ALMA Common Software as a Basis for a Distributed Software Development

    Science.gov (United States)

    Raffi, Gianni; Chiozzi, Gianluca; Glendenning, Brian

    The Atacama Large Millimeter Array (ALMA) is a joint project involving astronomical organizations in Europe, North America and Japan. ALMA will consist of 64 12-m antennas operating in the millimetre and sub-millimetre wavelength range, with baselines of more than 10 km. It will be located at an altitude above 5000 m in the Chilean Atacama desert. The ALMA Computing group is a joint group with staff scattered on 3 continents and is responsible for all the control and data flow software related to ALMA, including tools ranging from support of proposal preparation to archive access of automatically created images. Early in the project it was decided that an ALMA Common Software (ACS) would be developed as a way to provide to all partners involved in the development a common software platform. The original assumption was that some key middleware like communication via CORBA and the use of XML and Java would be part of the project. It was intended from the beginning to develop this software in an incremental way based on releases, so that it would then evolve into an essential embedded part of all ALMA software applications. In this way we would build a basic unity and coherence into a system that will have been developed in a distributed fashion. This paper evaluates our progress after 1.5 year of work, following a few tests and preliminary releases. It analyzes the advantages and difficulties of such an ambitious approach, which creates an interface across all the various control and data flow applications.

  7. COTS-based OO-component approach for software inter-operability and reuse (software systems engineering methodology)

    Science.gov (United States)

    Yin, J.; Oyaki, A.; Hwang, C.; Hung, C.

    2000-01-01

    The purpose of this research and study paper is to provide a summary description and results of rapid development accomplishments at NASA/JPL in the area of advanced distributed computing technology, using a Commercial-Off-The-Shelf (COTS)-based object-oriented component approach to open, inter-operable software development and software reuse.

  8. Network-based analysis of software change propagation.

    Science.gov (United States)

    Wang, Rongcun; Huang, Rubing; Qu, Binbin

    2014-01-01

    Object-oriented software systems evolve frequently to meet new change requirements. Understanding the characteristics of changes helps testers and system designers improve software quality. Identifying important modules becomes a key issue in the process of evolution. In this context, a novel network-based approach is proposed to comprehensively investigate change distributions and the correlation between centrality measures and the scope of change propagation. First, software dependency networks are constructed at the class level. Then, the number of co-changes among classes is mined from software repositories. According to the dependency relationships and the number of co-changes among classes, the scope of change propagation is calculated. Spearman rank correlation is used to analyze the correlation between centrality measures and the scope of change propagation. Three case studies on the Java open-source projects Findbugs, Hibernate, and Spring are conducted to investigate the characteristics of change propagation. Experimental results show that (i) the change distribution is very uneven; (ii) PageRank, Degree, and CIRank are significantly correlated with the scope of change propagation. In particular, CIRank shows a higher correlation coefficient, which suggests it can be a more useful indicator for measuring the scope of change propagation of classes in an object-oriented software system.
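
    The first two steps of the approach, building the class-level dependency network and computing a centrality such as PageRank, can be sketched with networkx; the miniature edge list below is hypothetical.

      import networkx as nx

      # Edge A -> B means class A depends on B (hypothetical miniature system)
      deps = [("OrderService", "OrderRepo"), ("OrderService", "Mailer"),
              ("Billing", "OrderRepo"), ("WebController", "OrderService"),
              ("ReportJob", "OrderRepo")]

      g = nx.DiGraph(deps)
      scores = nx.pagerank(g, alpha=0.85)
      for cls, score in sorted(scores.items(), key=lambda kv: -kv[1]):
          print(f"{cls:14s} {score:.3f}")   # highly ranked classes: wide change-propagation scope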

  9. Target scattering characteristics for OAM-based radar

    Directory of Open Access Journals (Sweden)

    Kang Liu

    2018-02-01

    Full Text Available The target scattering characteristics are crucial for radar systems. However, very little study has been conducted on the recently developed orbital angular momentum (OAM) based radar system. To illustrate the role of the OAM-based radar cross section (ORCS), the conventional radar equation is modified by taking the characteristics of OAM waves into account. Subsequently, the ORCS is defined in analogy to the classical radar cross section (RCS). The unique features of the incident OAM-carrying field are analyzed. The scattered field is derived, and the analytical expressions of the ORCS for metal plate and cylinder targets are obtained. Furthermore, the ORCS and RCS are compared to illustrate the influence of the OAM mode number, target size and signal frequency on the ORCS. Analytical studies demonstrate that the mirror-reflection phenomenon disappears and the peak values of the ORCS lie in non-specular directions. Finally, the ORCS features are summarized to show its advantages in radar target detection. This work can provide theoretical guidance for the design of OAM-based radar as well as for target detection and identification applications.

  10. Target scattering characteristics for OAM-based radar

    Science.gov (United States)

    Liu, Kang; Gao, Yue; Li, Xiang; Cheng, Yongqiang

    2018-02-01

    The target scattering characteristics are crucial for radar systems. However, very little study has been conducted on the recently developed orbital angular momentum (OAM) based radar system. To illustrate the role of the OAM-based radar cross section (ORCS), the conventional radar equation is modified by taking the characteristics of OAM waves into account. Subsequently, the ORCS is defined in analogy to the classical radar cross section (RCS). The unique features of the incident OAM-carrying field are analyzed. The scattered field is derived, and the analytical expressions of the ORCS for metal plate and cylinder targets are obtained. Furthermore, the ORCS and RCS are compared to illustrate the influence of the OAM mode number, target size and signal frequency on the ORCS. Analytical studies demonstrate that the mirror-reflection phenomenon disappears and the peak values of the ORCS lie in non-specular directions. Finally, the ORCS features are summarized to show its advantages in radar target detection. This work can provide theoretical guidance for the design of OAM-based radar as well as for target detection and identification applications.

  11. Field-based dynamic light scattering microscopy: theory and numerical analysis.

    Science.gov (United States)

    Joo, Chulmin; de Boer, Johannes F

    2013-11-01

    We present a theoretical framework for field-based dynamic light scattering microscopy based on a spectral-domain optical coherence phase microscopy (SD-OCPM) platform. SD-OCPM is an interferometric microscope capable of quantitative measurement of amplitude and phase of scattered light with high phase stability. Field-based dynamic light scattering (F-DLS) analysis allows for direct evaluation of complex-valued field autocorrelation function and measurement of localized diffusive and directional dynamic properties of biological and material samples with high spatial resolution. In order to gain insight into the information provided by F-DLS microscopy, theoretical and numerical analyses are performed to evaluate the effect of numerical aperture of the imaging optics. We demonstrate that sharp focusing of fields affects the measured diffusive and transport velocity, which leads to smaller values for the dynamic properties in the sample. An approach for accurately determining the dynamic properties of the samples is discussed.

  12. Failure mode and effects analysis of software-based automation systems

    International Nuclear Information System (INIS)

    Haapanen, P.; Helminen, A.

    2002-08-01

    Failure mode and effects analysis (FMEA) is one of the well-known analysis methods with an established position in traditional reliability analysis. The purpose of FMEA is to identify possible failure modes of the system components, evaluate their influence on system behaviour and propose proper countermeasures to suppress these effects. The generic nature of FMEA has enabled its wide use in various branches of industry, reaching from business management to the design of spaceships. The popularity and diverse use of the analysis method have led to multiple interpretations, practices and standards presenting the same analysis method. FMEA is well understood at the systems and hardware levels, where the potential failure modes usually are known and the task is to analyse their effects on system behaviour. Nowadays, more and more system functions are realised at the software level, which has aroused the urge to apply the FMEA methodology to software-based systems as well. Software failure modes generally are unknown - 'software modules do not fail, they only display incorrect behaviour' - and depend on the dynamic behaviour of the application. These facts set special requirements on the FMEA of software-based systems and make it difficult to realise. In this report the failure mode and effects analysis is studied for use in the reliability analysis of software-based systems. More precisely, the target system of FMEA is defined to be a safety-critical software-based automation application in a nuclear power plant, implemented on an industrial automation system platform. Through a literature study the report tries to clarify the intriguing questions related to the practical use of software failure mode and effects analysis. The study is a part of the research project 'Programmable Automation System Safety Integrity assessment (PASSI)', belonging to the Finnish Nuclear Safety Research Programme (FINNUS, 1999-2002). In the project, various safety assessment methods and tools for

  13. V & V Within Reuse-Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1996-01-01

    Verification and validation (V&V) is used to increase the level of assurance of critical software, particularly that of safety-critical and mission critical software. This paper describes the working group's success in identifying V&V tasks that could be performed in the domain engineering and transition levels of reuse-based software engineering. The primary motivation for V&V at the domain level is to provide assurance that the domain requirements are correct and that the domain artifacts correctly implement the domain requirements. A secondary motivation is the possible elimination of redundant V&V activities at the application level. The group also considered the criteria and motivation for performing V&V in domain engineering.

  14. Scattering angle base filtering of the inversion gradients

    KAUST Repository

    Alkhalifah, Tariq Ali

    2014-01-01

    Full waveform inversion (FWI) requires a hierarchical approach, based on the availability of low frequencies, to maneuver the complex nonlinearity associated with the problem of velocity inversion. I develop a model gradient filter to help us access the parts of the gradient more suitable to combat this potential nonlinearity. The filter is based on representing the gradient in the time-lag normalized domain, in which low scattering angles of the gradient update are initially muted. The results are long-wavelength updates controlled by the ray component of the wavefield. In this case, even 10 Hz data can produce near-zero-wavenumber updates suitable for a background correction of the model. Allowing smaller scattering angles to contribute provides higher-resolution information to the model.

  15. Workflow-Based Software Development Environment

    Science.gov (United States)

    Izygon, Michel E.

    2013-01-01

    The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process used to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tool integration), and the Web Portal Interface (collaborative web environment).

  16. Family-Based Benchmarking of Copy Number Variation Detection Software.

    Science.gov (United States)

    Nutsua, Marcel Elie; Fischer, Annegret; Nebel, Almut; Hofmann, Sylvia; Schreiber, Stefan; Krawczak, Michael; Nothnagel, Michael

    2015-01-01

    The analysis of structural variants, in particular of copy-number variations (CNVs), has proven valuable in unraveling the genetic basis of human diseases. Hence, a large number of algorithms have been developed for the detection of CNVs in SNP array signal intensity data. Using the European and African HapMap trio data, we undertook a comparative evaluation of six commonly used CNV detection software tools, namely Affymetrix Power Tools (APT), QuantiSNP, PennCNV, GLAD, R-gada and VEGA, and assessed their level of pair-wise prediction concordance. The tool-specific CNV prediction accuracy was assessed in silico by way of intra-familial validation. Software tools differed greatly in terms of the number and length of the CNVs predicted as well as the number of markers included in a CNV. All software tools predicted substantially more deletions than duplications. Intra-familial validation revealed consistently low levels of prediction accuracy as measured by the proportion of validated CNVs (34-60%). Moreover, up to 20% of apparent family-based validations were found to be due to chance alone. Software using Hidden Markov models (HMM) showed a trend to predict fewer CNVs than segmentation-based algorithms albeit with greater validity. PennCNV yielded the highest prediction accuracy (60.9%). Finally, the pairwise concordance of CNV prediction was found to vary widely with the software tools involved. We recommend HMM-based software, in particular PennCNV, rather than segmentation-based algorithms when validity is the primary concern of CNV detection. QuantiSNP may be used as an additional tool to detect sets of CNVs not detectable by the other tools. Our study also reemphasizes the need for laboratory-based validation, such as qPCR, of CNVs predicted in silico.

  17. Leveraging Existing Mission Tools in a Re-Usable, Component-Based Software Environment

    Science.gov (United States)

    Greene, Kevin; Grenander, Sven; Kurien, James; O'Reilly, Taifun

    2006-01-01

    Emerging methods in component-based software development offer significant advantages but may seem incompatible with existing mission operations applications. In this paper we relate our positive experiences integrating existing mission applications into component-based tools we are delivering to three missions. In most operations environments, a number of software applications have been integrated together to form the mission operations software. In contrast, with component-based software development, chunks of related functionality and data structures, referred to as components, can be individually delivered, integrated and re-used. With the advent of powerful tools for managing component-based development, complex software systems can potentially see significant benefits in ease of integration, testability and reusability from these techniques. These benefits motivate us to ask how component-based development techniques can be relevant in a mission operations environment, where there is significant investment in software tools that are not component-based and may not be written in languages for which component-based tools even exist. Trusted and complex software tools for sequencing, validation, navigation, and other vital functions cannot simply be re-written or abandoned in order to gain the advantages offered by emerging component-based software techniques. Thus some middle ground must be found. We have faced exactly this issue, and have found several solutions. Ensemble is an open platform for development, integration, and deployment of mission operations software that we are developing. Ensemble itself is an extension of an open-source, component-based software development platform called Eclipse. Due to the advantages of component-based development, we have been able to very rapidly develop mission operations tools for three surface missions by mixing and matching from a common set of mission operations components. We have also had to determine how to

  18. Neural network-based retrieval from software reuse repositories

    Science.gov (United States)

    Eichmann, David A.; Srinivas, Kankanahalli

    1992-01-01

    A significant hurdle confronts the software reuser attempting to select candidate components from a software repository - discriminating between those components without resorting to inspection of the implementation(s). We outline an approach to this problem based upon neural networks which avoids requiring the repository administrators to define a conceptual closeness graph for the classification vocabulary.

  19. DiSCaMB: a software library for aspherical atom model X-ray scattering factor calculations with CPUs and GPUs.

    Science.gov (United States)

    Chodkiewicz, Michał L; Migacz, Szymon; Rudnicki, Witold; Makal, Anna; Kalinowski, Jarosław A; Moriarty, Nigel W; Grosse-Kunstleve, Ralf W; Afonine, Pavel V; Adams, Paul D; Dominiak, Paulina Maria

    2018-02-01

    It has been recently established that the accuracy of structural parameters from X-ray refinement of crystal structures can be improved by using a bank of aspherical pseudoatoms instead of the classical spherical model of atomic form factors. This comes, however, at the cost of increased complexity of the underlying calculations. In order to facilitate the adoption of this more advanced electron density model by the broader community of crystallographers, a new software implementation called DiSCaMB, 'densities in structural chemistry and molecular biology', has been developed. It addresses the challenge of providing for high performance on modern computing architectures. With parallelization options for both multi-core processors and graphics processing units (using CUDA), the library features calculation of X-ray scattering factors and their derivatives with respect to structural parameters, gives access to intermediate steps of the scattering factor calculations (thus allowing for experimentation with modifications of the underlying electron density model), and provides tools for basic structural crystallographic operations. Permissively (MIT) licensed, DiSCaMB is an open-source C++ library that can be embedded in both academic and commercial tools for X-ray structure refinement.

  20. Automated Search-Based Robustness Testing for Autonomous Vehicle Software

    Directory of Open Access Journals (Sweden)

    Kevin M. Betts

    2016-01-01

    Full Text Available Autonomous systems must successfully operate in complex, time-varying spatial environments even when dealing with system faults that may occur during a mission. Consequently, evaluating the robustness, or ability to operate correctly under unexpected conditions, of autonomous vehicle control software is an increasingly important issue in software testing. New methods to automatically generate test cases for robustness testing of autonomous vehicle control software in closed-loop simulation are needed. Search-based testing techniques were used to automatically generate test cases, consisting of initial conditions and fault sequences, intended to challenge the control software more than test cases generated using current methods. Two different search-based testing methods, genetic algorithms and surrogate-based optimization, were used to generate test cases for a simulated unmanned aerial vehicle attempting to fly through an entryway. The effectiveness of the search-based methods in generating challenging test cases was compared to both a truth reference (full combinatorial testing) and the method most commonly used today (Monte Carlo testing). The search-based testing techniques demonstrated better performance than Monte Carlo testing for both of the test case generation performance metrics: (1) finding the single most challenging test case and (2) finding the set of fifty test cases with the highest mean degree of challenge.
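
    A minimal genetic-algorithm test generator in the spirit of the paper is sketched below. The three-parameter test case, the "degree of challenge" scoring stub and all GA settings are invented; in the paper the score comes from the closed-loop vehicle simulation.

      import random

      def simulate(test_case):                       # stub standing in for the closed-loop simulation
          x, y, wind = test_case                     # entryway offsets and wind gust (made up)
          return abs(x) + abs(y) + 2.0 * wind        # higher score = more challenging

      def evolve(pop_size=30, generations=40, mutation=0.2):
          rng = random.Random(42)
          new = lambda: [rng.uniform(-5, 5), rng.uniform(-5, 5), rng.uniform(0, 3)]
          pop = [new() for _ in range(pop_size)]
          for _ in range(generations):
              pop.sort(key=simulate, reverse=True)             # fittest = most challenging
              parents = pop[: pop_size // 2]
              children = []
              while len(children) < pop_size - len(parents):
                  a, b = rng.sample(parents, 2)
                  child = [(u + v) / 2 for u, v in zip(a, b)]      # averaging crossover
                  if rng.random() < mutation:
                      child[rng.randrange(3)] += rng.gauss(0, 1)   # Gaussian mutation
                  children.append(child)
              pop = parents + children
          return max(pop, key=simulate)

      print(evolve())   # most challenging test case found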

  1. Knowledge-Based Software Management

    International Nuclear Information System (INIS)

    Sally Schaffner; Matthew Bickley; Brian Bevins; Leon Clancy; Karen White

    2003-01-01

    Management of software in a dynamic environment such as is found at Jefferson Lab can be a daunting task. Software development tasks are distributed over a wide range of people with varying skill levels. The machine configuration is constantly changing, requiring upgrades to software at both the hardware control level and the operator control level. In order to obtain high-quality support from vendor service agreements, which is vital to maintaining 24/7 operations, hardware and software must be kept at industry's current levels. This means that periodic upgrades independent of machine configuration changes must take place. It is often difficult to identify and organize the information needed to guide the process of development, upgrades and enhancements. Dependencies between support software and applications need to be consistently identified to prevent introducing errors during upgrades and to allow adequate testing to be planned and performed. Developers also need access to information regarding compilers, make files and organized distribution directories. This paper describes a system under development at Jefferson Lab which will provide software developers and managers this type of information in a timely, user-friendly fashion. The current status and future plans for the system will be detailed.

  2. Advances in model-based software for simulating ultrasonic immersion inspections of metal components

    Science.gov (United States)

    Chiou, Chien-Ping; Margetan, Frank J.; Taylor, Jared L.; Engle, Brady J.; Roberts, Ronald A.

    2018-04-01

    Under the sponsorship of the National Science Foundation's Industry/University Cooperative Research Center at ISU, an effort was initiated in 2015 to repackage existing research-grade software into user-friendly tools for the rapid estimation of signal-to-noise ratio (SNR) for ultrasonic inspections of metals. The software combines: (1) a Python-based graphical user interface for specifying an inspection scenario and displaying results; and (2) a Fortran-based engine for computing defect signals and backscattered grain noise characteristics. The latter makes use of the Thompson-Gray measurement model for the response from an internal defect, and the Thompson-Margetan independent scatterer model for backscattered grain noise. This paper, the third in the series [1-2], provides an overview of the ongoing modeling effort with emphasis on recent developments. These include the ability to: (1) treat microstructures where grain size, shape and tilt relative to the incident sound direction can all vary with depth; and (2) simulate C-scans of defect signals in the presence of backscattered grain noise. The simulation software can now treat both normal and oblique-incidence immersion inspections of curved metal components. Both longitudinal and shear-wave inspections are treated. The model transducer can be planar, spherically focused, or bi-cylindrically focused. A calibration (or reference) signal is required and is used to deduce the measurement system efficiency function. This can be "invented" by the software using center frequency and bandwidth information specified by the user, or, alternatively, a measured calibration signal can be used. Defect types include flat-bottomed-hole reference reflectors, and spherical pores and inclusions. Simulation outputs include estimated defect signal amplitudes, root-mean-square values of grain noise amplitudes, and SNR as functions of the depth of the defect within the metal component. At any particular depth, the user can view

  3. Industry Software Trustworthiness Criterion Research Based on Business Trustworthiness

    Science.gov (United States)

    Zhang, Jin; Liu, Jun-fei; Jiao, Hai-xing; Shen, Yi; Liu, Shu-yuan

    To address the trustworthiness problem of industry software, the idea of constructing an industry software trustworthiness criterion around the business is proposed. Based on the triangle model of "trustworthy grade definition - trustworthy evidence model - trustworthy evaluating", business trustworthiness is embodied in each aspect of the trustworthy triangle model for a specific piece of industry software, a power producing management system (PPMS). Business trustworthiness is the center of the constructed industry trustworthy software criterion. By fusing international standards and industry rules, the constructed trustworthy criterion strengthens operability and reliability, and a quantitative evaluating method makes the evaluation results intuitive and comparable.

  4. FY1995 study of very flexible software structures based on soft-software components; 1995 nendo yawarankana software buhin ni motozuku software no choju kozo ni kansuru kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    The purpose of this study is to develop methods and tools for changing the software structure flexibly along with the continuous change of its environment and conditions of use. The goal is software of very high adaptability built from soft-software components and flexible assembly. The CASE tool platform Sapid, based on a fine-grained repository, was developed and applied to raise the abstraction level of program code and to mine potential flexible components. To reconstruct the software so that it adapts to a required environment, the SQM (Software Quark Model) was used to manage interconnectivity and other semantic relationships among components. On these two basic systems, we developed various methods and tools, such as those for static and dynamic analysis of very flexible software structures, program transformation description, program pattern extraction and composition, component optimization by partial evaluation, component extraction by function slicing, code encapsulation, and component navigation and application. (NEDO)

  5. Enhancing requirements engineering for patient registry software systems with evidence-based components.

    Science.gov (United States)

    Lindoerfer, Doris; Mansmann, Ulrich

    2017-07-01

    Patient registries are instrumental for medical research. Often their structures are complex and their implementations use composite software systems to meet the wide spectrum of challenges. Commercial and open-source systems are available for registry implementation, but many research groups develop their own systems. Methodological approaches to the selection of software as well as the construction of proprietary systems are needed. We propose an evidence-based checklist, summarizing essential items for patient registry software systems (CIPROS), to accelerate the requirements engineering process. Requirements engineering activities for software systems follow traditional software requirements elicitation methods, general software requirements specification (SRS) templates, and standards. We performed a multistep procedure to develop a specific evidence-based CIPROS checklist: (1) a systematic literature review to build a comprehensive collection of technical concepts, (2) a qualitative content analysis to define a catalogue of relevant criteria, and (3) a checklist to construct a minimal appraisal standard. CIPROS is based on 64 publications and covers twelve sections with a total of 72 items. CIPROS also defines software requirements. Comparing CIPROS with traditional software requirements elicitation methods, SRS templates and standards shows a broad consensus, but with differences regarding registry-specific aspects. Using an evidence-based approach to requirements engineering for registry software adds aspects to the traditional methods and accelerates the software engineering process for registry software. The method we used to construct CIPROS serves as a potential template for creating evidence-based checklists in other fields. The CIPROS list supports developers in assessing requirements for existing systems and formulating requirements for their own systems, while strengthening the reporting of patient registry software system descriptions. It may be

  6. MRI/TRUS fusion software-based targeted biopsy: the new standard of care?

    Science.gov (United States)

    Manfredi, M; Costa Moretti, T B; Emberton, M; Villers, A; Valerio, M

    2015-09-01

    The advent of multiparametric MRI has made it possible to change the way in which prostate biopsy is done, allowing biopsies to be directed to suspicious lesions rather than taken randomly. The subject of this review is a computer-assisted strategy, MRI/US fusion software-based targeted biopsy, and its performance compared to the other sampling methods. Different devices with different methods to register MR images to live TRUS are currently in use to allow software-based targeted biopsy. The main clinical indications of MRI/US fusion software-based targeted biopsy are re-biopsy in men with persistent suspicion of prostate cancer after a first negative standard biopsy, and the follow-up of patients under active surveillance. Some studies have compared MRI/US fusion software-based targeted versus standard biopsy. In men at risk with an MRI-suspicious lesion, targeted biopsy consistently detects more men with clinically significant disease than standard biopsy; some studies have also shown decreased detection of insignificant disease. Only two studies directly compared MRI/US fusion software-based targeted biopsy with MRI/US fusion visual targeted biopsy, and the diagnostic ability seems to be in favor of the software approach. To date, no study comparing software-based targeted biopsy against in-bore MRI biopsy is available. The new software-based targeted approach seems to have the characteristics to be added to the standard pathway for achieving accurate risk stratification. Once reproducibility and cost-effectiveness have been verified, the actual issue will be to determine whether MRI/TRUS fusion software-based targeted biopsy represents an add-on test or a replacement for standard TRUS biopsy.

  7. [Development and practice evaluation of blood acid-base imbalance analysis software].

    Science.gov (United States)

    Chen, Bo; Huang, Haiying; Zhou, Qiang; Peng, Shan; Jia, Hongyu; Ji, Tianxing

    2014-11-01

    To develop computer software for blood gas and acid-base imbalance analysis that systematically, rapidly, accurately and automatically determines the type of acid-base imbalance, and to evaluate its clinical application. Using the VBA programming language, computer-aided diagnostic software for the judgment of acid-base balance was developed. The clinical data of 220 patients admitted to the Second Affiliated Hospital of Guangzhou Medical University were retrospectively analyzed. Arterial blood gas values [pH, HCO₃⁻, arterial partial pressure of carbon dioxide (PaCO₂)] and electrolyte data (Na⁺ and Cl⁻) were collected. The data were entered into the software for acid-base imbalance judgment; at the same time, the type of acid-base imbalance was determined manually using the Henderson-Hasselbalch compensation formulas. The consistency of the judgments from software and manual calculation was evaluated, and the judgment times of the two methods were compared. The clinical diagnoses of the types of acid-base imbalance for the 220 patients were: 65 normal cases, 90 cases of a simple type, 41 cases of a mixed type, and 24 cases of a triple type. The accuracy of the software judgment compared with manual calculation was 100% for the normal and triple types, 98.9% for the simple type and 78.0% for the mixed type, giving a total accuracy of 95.5%. The Kappa value between software and manual judgment was 0.935, P=0.000, demonstrating very good consistency. The time for the software to determine acid-base imbalances was significantly shorter than for manual judgment (seconds: 18.14 ± 3.80 vs. 43.79 ± 23.86, t=7.466, P=0.000), so the software method was much faster than the manual method. Software judgment can replace manual judgment with the characteristics of being rapid, accurate and convenient, can improve the work efficiency and quality of clinical doctors and has great
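
    The compensation check that such software automates can be illustrated with one textbook rule. The sketch below is a minimal Python illustration using standard reference ranges and Winter's formula for expected respiratory compensation in metabolic acidosis; it is not the authors' VBA implementation, and the thresholds are the usual textbook values, not taken from the paper.

        def classify_primary_disorder(ph, hco3, paco2):
            """First-pass classification from arterial blood gas values.
            Reference ranges: pH 7.35-7.45, HCO3- 22-26 mmol/L, PaCO2 35-45 mmHg."""
            if ph < 7.35:
                return "respiratory acidosis" if paco2 > 45 else "metabolic acidosis"
            if ph > 7.45:
                return "respiratory alkalosis" if paco2 < 35 else "metabolic alkalosis"
            return "normal or fully compensated"

        def winters_expected_paco2(hco3):
            """Winter's formula: in metabolic acidosis the expected respiratory
            compensation is PaCO2 = 1.5 * HCO3- + 8 (+/- 2 mmHg); a measured
            PaCO2 outside this band points to a coexisting (mixed) respiratory
            disorder."""
            return 1.5 * hco3 + 8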

  8. Software Engineering Laboratory (SEL) data base reporting software user's guide and system description. Volume 1: Introduction and user's guide

    Science.gov (United States)

    1983-01-01

    Reporting software programs provide formatted listings and summary reports of the Software Engineering Laboratory (SEL) data base contents. The operating procedures and system information for 18 different reporting software programs are described. Sample output reports from each program are provided.

  9. Polarimetric SAR interferometry-based decomposition modelling for reliable scattering retrieval

    Science.gov (United States)

    Agrawal, Neeraj; Kumar, Shashi; Tolpekin, Valentyn

    2016-05-01

    Fully polarimetric SAR (PolSAR) data are used to retrieve scattering information from a single SAR resolution cell. A single SAR resolution cell may contain contributions from more than one scattering object. Hence, single- or dual-polarized data do not provide all the possible scattering information, and fully polarimetric data are used to overcome this problem. It was observed in a previous study that fully polarimetric data of different dates provide different scattering values for the same object, and the coefficient of determination obtained from linear regression between volume scattering and aboveground biomass (AGB) differs between SAR datasets of different dates. Scattering values are important input elements for modelling forest aboveground biomass. In this research work an approach is proposed to obtain reliable scattering from an interferometric pair of fully polarimetric RADARSAT-2 data. The field survey for data collection was carried out for Barkot forest from November 10th to December 5th, 2014. Stratified random sampling was used to collect field data for circumference at breast height (CBH) and tree height measurement. Field-measured AGB was compared with the volume scattering elements obtained from decomposition modelling of individual PolSAR images and of the PolInSAR coherency matrix. Yamaguchi 4-component decomposition was implemented to retrieve scattering elements from the SAR data. PolInSAR-based decomposition was the great challenge in this work, and it was implemented with certain assumptions to create a Hermitian coherency matrix from the co-registered polarimetric interferometric pair of SAR data. Regression analysis between field-measured AGB and the volume scattering element obtained from PolInSAR data showed the highest coefficient of determination (0.589). The same regression with volume scattering elements of the individual SAR images showed coefficients of determination of 0.49 and 0.50 for the master and slave images respectively. This study recommends use of
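
    The coefficient of determination quoted above is the standard R² of a simple linear regression. A minimal sketch of how such a value is computed from paired field and SAR observations (the array names are illustrative; inputs are NumPy arrays):

        import numpy as np

        def r_squared(x, y):
            """R^2 of the linear fit y ~ a*x + b, e.g. x = volume scattering
            power per plot, y = field-measured AGB per plot."""
            a, b = np.polyfit(x, y, 1)
            residuals = y - (a * x + b)
            ss_res = np.sum(residuals ** 2)
            ss_tot = np.sum((y - np.mean(y)) ** 2)
            return 1.0 - ss_res / ss_tot

        # r_squared(polinsar_volume, field_agb) would reproduce a value such as
        # the 0.589 reported above, given the corresponding plot data.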

  10. Numerical studies of time-independent and time-dependent scattering by several elliptical cylinders

    Science.gov (United States)

    Nigsch, Martin

    2007-07-01

    A numerical solution to the problem of time-dependent scattering by an array of elliptical cylinders with parallel axes is presented. The solution is an exact one, based on the separation-of-variables technique in the elliptical coordinate system, the addition theorem for Mathieu functions, and numerical integration. Time-independent solutions are described by a system of linear equations of infinite order, which is truncated for numerical computations. Time-dependent solutions are obtained by numerical integration involving a large number of these solutions. First results of a software package generating these solutions are presented: wave propagation around three impenetrable elliptical scatterers. As far as we know, the method described here has never been used for time-dependent multiple scattering.

  11. Light transport in turbid media with non-scattering, low-scattering and high absorption heterogeneities based on hybrid simplified spherical harmonics with radiosity model.

    Science.gov (United States)

    Yang, Defu; Chen, Xueli; Peng, Zhen; Wang, Xiaorui; Ripoll, Jorge; Wang, Jing; Liang, Jimin

    2013-01-01

    Modeling light propagation in the whole body is essential and necessary for optical imaging. However, non-scattering, low-scattering and high absorption regions commonly exist in biological tissues, which leads to inaccuracy in the existing light transport models. In this paper, a novel hybrid light transport model that couples the simplified spherical harmonics approximation (SPN) with the radiosity theory (HSRM) is presented, to accurately describe light transport in turbid media with non-scattering, low-scattering and high absorption heterogeneities. In the model, the radiosity theory is used to characterize the light transport in non-scattering regions and the SPN is employed to handle the scattering problems, including subsets of low scattering and high absorption. A Neumann source, constructed from the light transport in the non-scattering region and formed at the interface between the non-scattering and scattering regions, is superposed onto the original light source to couple the SPN with the radiosity theory. The accuracy and effectiveness of the HSRM were first verified with simulations based on both regular and digital mouse models and a physical phantom based experiment. The feasibility and applicability of the HSRM were then investigated over a broad range of optical properties. Lastly, the influence of the depth of the light source on the model is also discussed. Primary results showed that the proposed model provides high performance for light transport in turbid media with non-scattering, low-scattering and high absorption heterogeneities.

  12. Sources of the X-rays Based on Compton Scattering

    International Nuclear Information System (INIS)

    Androsov, V.; Bulyak, E.; Gladkikh, P.; Karnaukhov, I.; Mytsykov, A.; Telegin, Yu.; Shcherbakov, A.; Zelinsky, A.

    2007-01-01

    The principles of intense X-ray generation by laser beam scattering on a relativistic electron beam are described, and facilities designed to produce X-rays based on Compton scattering are presented. The possibilities of various types of such facilities are estimated and discussed. The X-ray source based on a storage ring with low beam energy is described in detail, and the advantages of sources of this type are discussed. The results of calculations and numerical simulations carried out for the laser-electron storage ring NESTOR, under development at NSC KIPT, show the wide prospects of accelerator facilities of this type.
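
    The energy reach of such sources follows from the kinematics of Compton backscattering: for a head-on collision the scattered photon energy peaks near 4γ²E_laser, as long as 4γE_laser remains well below the electron rest energy. A small sketch of this estimate (the numbers are generic illustrations, not NESTOR design values):

        ELECTRON_REST_ENERGY_MEV = 0.511

        def backscattered_peak_energy_kev(e_beam_mev, laser_photon_ev):
            """Peak energy of head-on Compton backscattering,
            E_x ~ 4 * gamma^2 * E_laser (Thomson regime)."""
            gamma = e_beam_mev / ELECTRON_REST_ENERGY_MEV
            return 4.0 * gamma ** 2 * laser_photon_ev / 1000.0  # eV -> keV

        # e.g. a 100 MeV beam scattering 1.17 eV Nd:YAG photons yields X-rays
        # peaking near backscattered_peak_energy_kev(100, 1.17) ~ 179 keV.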

  13. Software requirements management based on use cases

    International Nuclear Information System (INIS)

    Xiao Jin

    2009-01-01

    In this paper, requirements management based on use cases is theoretically explored, and a multi-layer use-case model is introduced, which combines three levels of use cases with a single use-case refinement model. Through practice in a software project, the multi-layer use-case model provided a good solution for controlling requirements scope and change, and balanced the work assignment between customer departments, information management departments and the software development outsourcing team. (authors)

  14. Subsurface Scattering-Based Object Rendering Techniques for Real-Time Smartphone Games

    Directory of Open Access Journals (Sweden)

    Won-Sun Lee

    2014-01-01

    Subsurface scattering, which simulates the path of light through a material in a scene, is one of the advanced rendering techniques in computer graphics. Since it requires a number of long-running operations, it cannot easily be implemented in real-time smartphone games. In this paper, we propose a subsurface scattering-based object rendering technique that is optimized for smartphone games. We employ our subsurface scattering method in a real-time smartphone game, and an example game is designed to validate that the proposed method operates seamlessly in real time. Finally, we show comparison results between a bidirectional reflectance distribution function, a bidirectional scattering distribution function, and our proposed subsurface scattering method in a smartphone game.
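
    Real-time approximations of subsurface scattering on mobile GPUs often replace the full BSSRDF with cheap local terms. The snippet below shows one widely used trick of that family, wrap lighting, written in Python for illustration; it is a generic technique, not the specific method proposed in this paper.

        def wrap_diffuse(n_dot_l, wrap=0.5):
            """Wrap lighting: a cheap stand-in for subsurface scattering in which
            diffuse light 'wraps' past the terminator instead of cutting off at
            N.L = 0, softening the shading as if light bled through the surface.
            n_dot_l -- cosine between surface normal and light direction
            wrap    -- 0 gives plain Lambertian shading, 1 full wrap-around
            """
            return max(0.0, (n_dot_l + wrap) / (1.0 + wrap))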

  15. An expert system based software sizing tool, phase 2

    Science.gov (United States)

    Friedlander, David

    1990-01-01

    A software tool was developed for predicting the size of a future computer program at an early stage in its development. The system is intended to enable a user who is not expert in Software Engineering to estimate software size in lines of source code with an accuracy similar to that of an expert, based on the program's functional specifications. The project was planned as a knowledge based system with a field prototype as the goal of Phase 2 and a commercial system planned for Phase 3. The researchers used techniques from Artificial Intelligence and knowledge from human experts and existing software from NASA's COSMIC database. They devised a classification scheme for the software specifications, and a small set of generic software components that represent complexity and apply to large classes of programs. The specifications are converted to generic components by a set of rules and the generic components are input to a nonlinear sizing function which makes the final prediction. The system developed for this project predicted code sizes from the database with a bias factor of 1.06 and a fluctuation factor of 1.77, an accuracy similar to that of human experts but without their significant optimistic bias.

  16. Performance Test of Openflow Agent on Openflow Software-Based Mikrotik RB750 Switch

    Directory of Open Access Journals (Sweden)

    Rikie Kartadie

    2016-11-01

    A network is usually built from several devices such as routers and switches. Each device forwards data packets using complicated protocols embedded in its hardware. An operator is responsible for running the configuration, either to manage the rules or the applications applied in the network. Human error may occur when device configuration is performed manually by the operator. Some well-known vendors, MikroTik among them, have also implemented OpenFlow in their products, providing an implementation of the SDN/OpenFlow architecture at affordable cost. The second-phase research results showed that the MikroTik software-based OF switch yielded higher latency than both mininet and the OpenWRT software-based OF switch. The average gap value for the MikroTik software-based OF switch is 2012 kbps lower than the value for the OpenWRT software-based OF switch. The average gap in UDP throughput bandwidth for the MikroTik software-based OF switch is 3.6176 kBps lower than the OpenWRT software-based OF switch and 8.68 kBps lower than mininet. The average gap in UDP throughput jitter for the MikroTik software-based OF switch is 0.0103 ms lower than the OpenWRT software-based OF switch and 0.0093 ms lower than mininet.

  17. Fast scattering simulation tool for multi-energy x-ray imaging

    Energy Technology Data Exchange (ETDEWEB)

    Sossin, A., E-mail: artur.sossin@cea.fr [CEA-LETI MINATEC Grenoble, F-38054 Grenoble (France); Tabary, J.; Rebuffel, V. [CEA-LETI MINATEC Grenoble, F-38054 Grenoble (France); Létang, J.M.; Freud, N. [Université de Lyon, CREATIS, CNRS UMR5220, Inserm U1044, INSA-Lyon, Université Claude Bernard Lyon 1, Centre Léon Bérard (France); Verger, L. [CEA-LETI MINATEC Grenoble, F-38054 Grenoble (France)

    2015-12-01

    A combination of Monte Carlo (MC) and deterministic approaches was employed as a means of creating a simulation tool capable of providing energy resolved x-ray primary and scatter images within a reasonable time interval. Libraries of Sindbad, a previously developed x-ray simulation software, were used in the development. The scatter simulation capabilities of the tool were validated through simulation with the aid of GATE and through experimentation by using a spectrometric CdTe detector. A simple cylindrical phantom with cavities and an aluminum insert was used. Cross-validation with GATE showed good agreement with a global spatial error of 1.5% and a maximum scatter spectrum error of around 6%. Experimental validation also supported the accuracy of the simulations obtained from the developed software with a global spatial error of 1.8% and a maximum error of around 8.5% in the scatter spectra.

  18. X-ray generator based on Compton scattering

    NARCIS (Netherlands)

    Androsov, V.P.; Agafonov, A.V.; Botman, J.I.M.; Bulyak, E.V.; Drebot, I.; Gladkikh, P.I.; Grevtsev, V.; Ivashchenko, V.; Karnaukhov, I.M.; Lapshin, V.I.

    2005-01-01

    Nowadays, X-ray sources based on a storage ring with low beam energy and Compton scattering of an intense laser beam are under development in several laboratories. In this paper the state of the art in the development and construction of the cooperative project of a Kharkov advanced X-ray source NESTOR

  19. Scatter measurement and correction method for cone-beam CT based on single grating scan

    Science.gov (United States)

    Huang, Kuidong; Shi, Wenlong; Wang, Xinyu; Dong, Yin; Chang, Taoqi; Zhang, Hua; Zhang, Dinghua

    2017-06-01

    In cone-beam computed tomography (CBCT) systems based on flat-panel detector imaging, the presence of scatter significantly reduces the quality of slices. Based on the concept of collimation, this paper presents a scatter measurement and correction method based on a single grating scan. First, according to the characteristics of CBCT imaging, the scan method using a single grating and the design requirements of the grating are analyzed and worked out. Second, by analyzing the composition of object projection images and object-and-grating projection images, a processing method for the scatter image at a single projection angle is proposed. In addition, to avoid additional scans, this paper proposes an angle interpolation method for scatter images to reduce scan cost. Finally, the experimental results show that the scatter images obtained by this method are accurate and reliable, and the effect of scatter correction is obvious. When the additional object-and-grating projection images are collected and interpolated at intervals of 30 deg, the scatter correction error of slices can still be controlled within 3%.
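
    The angle interpolation step can be pictured as follows: grating-based scatter images are measured on a sparse set of projection angles (every 30 deg above) and linearly interpolated between the two nearest measured angles before subtraction. A rough Python sketch under those assumptions (the data layout is ours):

        import numpy as np

        def scatter_at_angle(theta, scatter_by_angle):
            """Linearly interpolate sparse per-angle scatter images onto an
            arbitrary projection angle theta (degrees).
            scatter_by_angle -- dict {angle_deg: 2-D scatter image}"""
            known = sorted(scatter_by_angle)
            if theta <= known[0]:
                return scatter_by_angle[known[0]]
            if theta >= known[-1]:
                return scatter_by_angle[known[-1]]
            i = int(np.searchsorted(known, theta))
            lo, hi = known[i - 1], known[i]
            w = (theta - lo) / (hi - lo)
            return (1 - w) * scatter_by_angle[lo] + w * scatter_by_angle[hi]

        # scatter-corrected projection:
        # primary = raw_projection - scatter_at_angle(theta, scatter_library)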

  20. An Intuitionistic Fuzzy Methodology for Component-Based Software Reliability Optimization

    DEFF Research Database (Denmark)

    Madsen, Henrik; Grigore, Albeanu; Popenţiuvlǎdicescu, Florin

    2012-01-01

    Component-based software development is the current methodology facilitating agility in project management, software reuse in design and implementation, promoting quality and productivity, and increasing the reliability and performability. This paper illustrates the usage of intuitionistic fuzzy...... degree approach in modelling the quality of entities in imprecise software reliability computing in order to optimize management results. Intuitionistic fuzzy optimization algorithms are proposed to be used for complex software systems reliability optimization under various constraints....

  1. Workflow Based Software Development Environment, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this proposed research is to investigate and develop a workflow based tool, the Software Developers Assistant, to facilitate the collaboration between...

  2. Workflow Based Software Development Environment, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this proposed research is to investigate and develop a workflow based tool, the Software Developers Assistant, to facilitate the collaboration between...

  3. Light focusing through a multiple scattering medium: ab initio computer simulation

    Science.gov (United States)

    Danko, Oleksandr; Danko, Volodymyr; Kovalenko, Andrey

    2018-01-01

    The present study considers ab initio computer simulation of light focusing through a complex scattering medium. The focusing is performed by shaping the incident light beam in order to obtain a small focused spot on the opposite side of the scattering layer. MSTM software (Auburn University) is used to simulate the propagation of an arbitrary monochromatic Gaussian beam and obtain the 2D distribution of the optical field in the selected plane of the investigated volume. Based on the set of incident and scattered fields, the pair of right and left eigenbases and the corresponding singular values were calculated. A pair of right and left eigenmodes together with the corresponding singular value constitutes a transmittance eigenchannel of the disordered medium. Thus, the scattering process is described in three steps: 1) decomposition of the initial field in the right eigenbasis; 2) scaling of the decomposition coefficients by the corresponding singular values; 3) assembly of the scattered field as a composition of the weighted left eigenmodes. Basis fields are represented as linear combinations of the original Gaussian beams and scattered fields. It was demonstrated that 60 independent control channels provide focusing of the light into a spot with a minimal radius of approximately 0.4 μm at half maximum. The intensity enhancement in the focal plane was equal to 68, which coincided with the theoretical prediction.
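
    The three-step description above is precisely a singular value decomposition of the transmission matrix, T = U Σ V^H. A compact NumPy illustration (the matrix here is a random stand-in for one assembled column by column from the MSTM-simulated fields):

        import numpy as np

        rng = np.random.default_rng(0)
        # stand-in transmission matrix: incident-field coefficients -> output field
        T = rng.normal(size=(120, 60)) + 1j * rng.normal(size=(120, 60))

        U, s, Vh = np.linalg.svd(T, full_matrices=False)

        def transmit(incident):
            coeffs = Vh @ incident   # 1) decompose input in the right eigenbasis
            coeffs = s * coeffs      # 2) scale by the singular values
            return U @ coeffs        # 3) assemble output from the left eigenmodes

        # Focusing amounts to exciting a single channel: an input proportional to
        # Vh[k].conj() emerges as the k-th left eigenmode, weighted by s[k].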

  4. Software tools for microprocessor based systems

    International Nuclear Information System (INIS)

    Halatsis, C.

    1981-01-01

    After a short review of the hardware and/or software tools for the development of single-chip, fixed instruction set microprocessor-based systems, we focus on the software tools for designing systems based on microprogrammed bit-sliced microprocessors. Emphasis is placed on meta-microassemblers and simulation facilities at the register-transfer level and architecture level. We review available meta-microassemblers, giving their most important features, advantages and disadvantages. We also consider extensions to higher-level microprogramming languages and associated systems specifically developed for bit-slices. In the area of simulation facilities we first discuss the simulation objectives and the criteria for choosing the right simulation language. We concentrate on simulation facilities already used in bit-slice projects and discuss the experience gained. We conclude by describing the way the Signetics meta-microassembler and the ISPS simulation tool have been employed in the design of a fast microprogrammed machine, called MICE, made out of ECL bit-slices. (orig.)

  5. AWARE-P: a collaborative, system-based IAM planning software

    OpenAIRE

    Coelho, S. T.; Vitorino, D.

    2011-01-01

    The AWARE-P project aims to promote the application of integrated and risk-based approaches to the rehabilitation of urban water supply and wastewater drainage systems. Central to the project is the development of a software platform based on a set of computational components, which assist in the analyses and decision support involved in the planning process for sustainable infrastructural asset management. The AWARE-P software system brings together onto a common platform the inf...

  6. A model-based radiography restoration method based on simple scatter-degradation scheme for improving image visibility

    Science.gov (United States)

    Kim, K.; Kang, S.; Cho, H.; Kang, W.; Seo, C.; Park, C.; Lee, D.; Lim, H.; Lee, H.; Kim, G.; Park, S.; Park, J.; Kim, W.; Jeon, D.; Woo, T.; Oh, J.

    2018-02-01

    In conventional planar radiography, image visibility is often limited, mainly due to the superimposition of the object structure under investigation and the artifacts caused by scattered x-rays and noise. Several methods, including computed tomography (CT) as a multiplanar imaging modality, air-gap and grid techniques for the reduction of scatter, phase-contrast imaging as another image-contrast modality, etc., have been investigated extensively in attempts to overcome these difficulties. However, those methods typically require higher x-ray doses or special equipment. In this work, as another approach, we propose a new model-based radiography restoration method based on a simple scatter-degradation scheme in which the intensity of scattered x-rays and the transmission function of a given object are estimated from a single x-ray image to restore the original degraded image. We implemented the proposed algorithm and performed an experiment to demonstrate its viability. Our results indicate that the degradation of image characteristics by scattered x-rays and noise was effectively recovered by the proposed method, which improves the image visibility in radiography considerably.
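
    In the spirit of the scatter-degradation scheme described above (measured intensity = primary transmission plus a smooth scatter term), a toy single-image restoration can be sketched as below. The low-pass scatter estimate and its scaling are our simplifying assumptions for illustration, not the paper's exact estimator.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def restore(image, i0, scatter_fraction=0.3, blur_sigma=50):
            """Toy restoration under I = I0 * T + S: take S as a heavily blurred,
            scaled copy of the image (scatter is smooth and low-frequency), then
            recover the transmission function T from what remains."""
            scatter = scatter_fraction * gaussian_filter(image, blur_sigma)
            primary = np.clip(image - scatter, 1e-6, None)  # keep intensities positive
            return primary / i0  # estimated transmission T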

  7. Computer control in a Compton scattering spectrometer

    International Nuclear Information System (INIS)

    Cui Ningzhuo; Chen Tao; Gong Zhufang; Yang Baozhong; Mo Haiding; Hua Wei; Bian Zuhe

    1995-01-01

    The authors introduce the hardware and software for automatic computer control of calibration and data acquisition in a Compton scattering spectrometer consisting of an HPGe detector, amplifiers and an MCA

  8. Software design specification and analysis (NuFDS) approach for safety-critical software based on a programmable logic controller (PLC)

    International Nuclear Information System (INIS)

    Koo, Seo Ryong; Seong, Poong Hyun; Jung, Jin Yong; Choi, Seong Soo

    2004-01-01

    This paper introduces a software design specification and analysis technique for safety-critical systems based on a Programmable Logic Controller (PLC). Among the software development phases, the design phase plays an important role in connecting the requirements phase and the implementation phase, as a process of translating problem requirements into software structures. In this work, the Nuclear FBD-style Design Specification and analysis (NuFDS) approach is proposed. The NuFDS approach for nuclear Instrumentation and Control (I and C) software is suggested in a straightforward manner. It consists of four major specifications: Database, Software Architecture, System Behavior, and PLC Hardware Configuration. Additionally, correctness, completeness, consistency, and traceability check techniques are suggested for the formal design analysis in the NuFDS approach. For tool support, we are also developing the NuSDS tool, based on the NuFDS approach, intended especially for software design specification in the nuclear field

  9. Porting a Java-based Brain Simulation Software to C++

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    A currently available software solution to simulate neural development is Cx3D. However, this software is Java-based and not ideal for high-performance computing. This talk presents our step-by-step porting approach, which uses SWIG as a tool to interface C++ code from Java.

  10. WE-DE-207B-10: Library-Based X-Ray Scatter Correction for Dedicated Cone-Beam Breast CT: Clinical Validation

    Energy Technology Data Exchange (ETDEWEB)

    Shi, L; Zhu, L [Georgia Institute of Technology, Atlanta, GA (Georgia); Vedantham, S; Karellas, A [University of Massachusetts Medical School, Worcester, MA (United States)

    2016-06-15

    Purpose: Scatter contamination is detrimental to image quality in dedicated cone-beam breast CT (CBBCT), resulting in cupping artifacts and loss of contrast in reconstructed images. Such effects impede visualization of breast lesions and quantitative accuracy. Previously, we proposed a library-based software approach to suppress scatter on CBBCT images. In this work, we quantify the efficacy and stability of this approach using datasets from 15 human subjects. Methods: A pre-computed scatter library is generated using Monte Carlo simulations for semi-ellipsoid breast models and a homogeneous fibroglandular/adipose tissue mixture encompassing the range reported in the literature. Projection datasets from 15 human subjects that cover the 95th percentile of breast dimensions and fibroglandular volume fraction were included in the analysis. Our investigations indicate that it is sufficient to consider the breast dimensions alone, and variation in fibroglandular fraction does not significantly affect the scatter-to-primary ratio. The breast diameter is measured from a first-pass reconstruction; the appropriate scatter distribution is selected from the library; and it is deformed by considering the discrepancy in total projection intensity between the clinical dataset and the simulated semi-ellipsoidal breast. The deformed scatter distribution is subtracted from the measured projections for scatter correction. Spatial non-uniformity (SNU) and contrast-to-noise ratio (CNR) were used as quantitative metrics to evaluate the results. Results: On the 15 patient cases, our method reduced the overall image spatial non-uniformity (SNU) from 7.14%±2.94% (mean ± standard deviation) to 2.47%±0.68% in the coronal view and from 10.14%±4.1% to 3.02%±1.26% in the sagittal view. The average contrast-to-noise ratio (CNR) improved by a factor of 1.49±0.40 in the coronal view and by 2.12±1.54 in the sagittal view. Conclusion: We demonstrate the robustness and effectiveness of a library-based scatter correction
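
    Schematically, the correction pipeline described above reduces to a lookup and a rescaling. The sketch below compresses the 'deformation' step to a single global intensity factor, which is a simplification of ours for illustration, not the exact estimator used in the study:

        import numpy as np

        def correct_projection(raw, library, breast_diameter_cm):
            """Library-based scatter correction sketch.
            library -- dict {diameter_cm: (mc_scatter_map, mc_reference_projection)}
            """
            # pick the Monte Carlo breast model closest to the measured diameter
            key = min(library, key=lambda d: abs(d - breast_diameter_cm))
            scatter, reference = library[key]
            # deform (here: globally rescale) by the total-intensity discrepancy
            scale = raw.sum() / reference.sum()
            return raw - scale * scatter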

  11. Elements of strategic capability for software outsourcing enterprises based on the resource

    Science.gov (United States)

    Shi, Wengeng

    2011-10-01

    Software outsourcing enterprises are emerging high-tech enterprises, and the speed of their rise and their numbers have been remarkable. Beyond the preferential policies China grants to software outsourcing, the software outsourcing business has its own capability to upgrade, one that software companies in general have not had. Viewed from resource-based theory, we analyze whether software outsourcing companies hold capabilities and resources that are rare, valuable and hard to imitate, and we try to give an initial framework for theoretical analysis on this basis.

  12. Re-evaluation of model-based light-scattering spectroscopy for tissue spectroscopy

    Science.gov (United States)

    Lau, Condon; Šćepanović, Obrad; Mirkovic, Jelena; McGee, Sasha; Yu, Chung-Chieh; Fulghum, Stephen; Wallace, Michael; Tunnell, James; Bechtel, Kate; Feld, Michael

    2009-01-01

    Model-based light scattering spectroscopy (LSS) seemed a promising technique for in-vivo diagnosis of dysplasia in multiple organs. In the studies, the residual spectrum, the difference between the observed and modeled diffuse reflectance spectra, was attributed to single elastic light scattering from epithelial nuclei, and diagnostic information due to nuclear changes was extracted from it. We show that this picture is incorrect. The actual single scattering signal arising from epithelial nuclei is much smaller than the previously computed residual spectrum, and does not have the wavelength dependence characteristic of Mie scattering. Rather, the residual spectrum largely arises from assuming a uniform hemoglobin distribution. In fact, hemoglobin is packaged in blood vessels, which alters the reflectance. When we include vessel packaging, which accounts for an inhomogeneous hemoglobin distribution, in the diffuse reflectance model, the reflectance is modeled more accurately, greatly reducing the amplitude of the residual spectrum. These findings are verified via numerical estimates based on light propagation and Mie theory, tissue phantom experiments, and analysis of published data measured from Barrett’s esophagus. In future studies, vessel packaging should be included in the model of diffuse reflectance and use of model-based LSS should be discontinued. PMID:19405760

  13. RISK MANAGEMENT AUTOMATION OF SOFTWARE PROJECTS BASED ON FUZZY INFERENCE

    Directory of Open Access Journals (Sweden)

    T. M. Zubkova

    2015-09-01

    The suitability of one of the intelligent methods - fuzzy inference - for risk management of software projects is shown, based on a review of existing fuzzy inference algorithms in the field of applied problems. Information sources in the management of software projects are analyzed; major and minor risks are highlighted. The most critical parameters have been singled out, giving the possibility to estimate the occurrence of adverse situations (project duration, the frequency of changes in customer requirements, work deadlines, the developers' experience of participation in such projects, and others). A method of qualitative fuzzy description based on fuzzy logic has been developed for the analysis of these parameters. Evaluation of possible situations and formation of the knowledge base rely on a survey of experts. The main limitations of existing automated systems have been identified in relation to their applicability to risk management in software design. This theoretical research set the stage for a software system that makes it possible to automate the risk management process for software projects. The developed software system automates the process of fuzzy inference in the following stages: formation of the rule base of the fuzzy inference systems, fuzzification of input variables, aggregation of sub-conditions, activation and accumulation of conclusions for fuzzy production rules, and defuzzification of variables. The result of automating the risk management process in software design is a quantitative and qualitative assessment of risks and expert advice for their minimization. The practical significance of the work lies in the fact that implementation of the developed automated system makes performance improvement of software projects possible.
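
    The listed stages map directly onto a Mamdani-style inference loop. A deliberately tiny Python sketch with two invented input variables and one risk output (all membership functions, rules and numbers are ours, for illustration only):

        def triangle(x, a, b, c):
            """Triangular membership function peaking at b."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        def project_risk(duration_months, req_changes_per_month):
            # fuzzification of the input variables
            long_project = triangle(duration_months, 6, 18, 30)
            volatile_reqs = triangle(req_changes_per_month, 1, 5, 10)
            # rule activation (min as AND) and accumulation (max as OR)
            high = min(long_project, volatile_reqs)         # long AND volatile -> high risk
            low = max(1 - long_project, 1 - volatile_reqs)  # short OR stable  -> low risk
            # defuzzification: weighted average of the consequent centroids
            return (high * 0.9 + low * 0.2) / (high + low + 1e-9)

        # project_risk(20, 6) -> about 0.76, flagging an elevated-risk project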

  14. Reliability estimation of safety-critical software-based systems using Bayesian networks

    International Nuclear Information System (INIS)

    Helminen, A.

    2001-06-01

    Due to the nature of software faults and the way they cause system failures, new methods are needed for the safety and reliability evaluation of software-based safety-critical automation systems in nuclear power plants. In the research project 'Programmable automation system safety integrity assessment (PASSI)', belonging to the Finnish Nuclear Safety Research Programme (FINNUS, 1999-2002), various safety assessment methods and tools for software-based systems are developed and evaluated. The project is financed jointly by the Radiation and Nuclear Safety Authority (STUK), the Ministry of Trade and Industry (KTM) and the Technical Research Centre of Finland (VTT). In this report the applicability of Bayesian networks to the reliability estimation of software-based systems is studied. The applicability is evaluated by building Bayesian network models for the systems of interest and performing simulations with these models. In the simulations, hypothetical evidence is used for defining the parameter relations and for determining the ability to compensate for disparate evidence in the models. Based on the experience from modelling and simulations, we are able to conclude that Bayesian networks provide a good method for the reliability estimation of software-based systems. (orig.)
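
    The flavor of such an estimate can be conveyed by the smallest possible network: one 'quality' node with a prior, one test-evidence node, and Bayes' rule to combine them. All numbers below are invented for illustration; the report's models are of course richer.

        # Tiny two-node Bayesian network: Quality -> TestResult.
        priors = {"high": 0.7, "low": 0.3}      # prior belief about software quality
        p_pass = {"high": 0.999, "low": 0.97}   # P(single test passes | quality)

        def posterior_after_passes(n_passed):
            """P(quality | n passed tests) by Bayes' rule:
            posterior proportional to prior * likelihood^n."""
            unnorm = {q: priors[q] * p_pass[q] ** n_passed for q in priors}
            z = sum(unnorm.values())
            return {q: v / z for q, v in unnorm.items()}

        # posterior_after_passes(100) -> {'high': ~0.98, 'low': ~0.02}: the test
        # record gradually outweighs the prior, illustrating how disparate
        # evidence is compensated in the network.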

  15. Using scenario based programming to develop embedded control software

    NARCIS (Netherlands)

    Bettiol, F.

    2015-01-01

    A new paradigm for developing embedded software is awakening the interest of companies. Its name is Scenario-Based Programming, and it claims to be a good approach to developing embedded software. Live Sequence Charts (LSC), a visual language supporting the paradigm, enables developers to specify a

  16. Ultraviolet refractometry using field-based light scattering spectroscopy

    Science.gov (United States)

    Fu, Dan; Choi, Wonshik; Sung, Yongjin; Oh, Seungeun; Yaqoob, Zahid; Park, YongKeun; Dasari, Ramachandra R.; Feld, Michael S.

    2010-01-01

    Accurate refractive index measurement in the deep ultraviolet (UV) range is important for the separate quantification of biomolecules such as proteins and DNA in biology. This task is demanding and has not been fully explored so far. Here we report a new method of measuring refractive index using field-based light scattering spectroscopy, which is applicable to any wavelength range and suitable for both solutions and homogeneous objects with a well-defined shape, such as microspheres. The angular scattering distribution of single microspheres immersed in homogeneous media is measured over the wavelength range 260 to 315 nm using quantitative phase microscopy. By least-squares fitting the observed scattering distribution with Mie scattering theory, the refractive index of either the sphere or the immersion medium can be determined, provided that one is known a priori. Using this method, we have measured the refractive index dispersion of SiO2 spheres and bovine serum albumin (BSA) solutions in the deep UV region. Specific refractive index increments of BSA are also extracted. The typical accuracy of the present refractive index technique is ≤0.003. The precision of refractive index measurements is ≤0.002 and that of specific refractive index increment determination is ≤0.01 mL/g. PMID:20372622
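
    The fitting step can be sketched as an ordinary least-squares problem over the medium index and an intensity scale. mie_intensity below stands in for any Mie solver returning relative scattered intensity versus angle; the choice of solver, the starting values and the parameterization are our assumptions.

        import numpy as np
        from scipy.optimize import least_squares

        def fit_medium_index(angles, measured, mie_intensity, n_sphere,
                             radius_um, wavelength_um):
            """Recover the immersion-medium index by fitting the measured angular
            scattering of a single microsphere against Mie theory.
            mie_intensity(m, x, angles) -- placeholder Mie solver for relative
            index m = n_sphere / n_med and size parameter x."""
            def residuals(params):
                n_med, amplitude = params
                x = 2 * np.pi * radius_um * n_med / wavelength_um  # size parameter
                return amplitude * mie_intensity(n_sphere / n_med, x, angles) - measured
            fit = least_squares(residuals, x0=[1.35, 1.0])
            return fit.x[0]   # refractive index of the immersion medium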

  17. Development of Software for Dose Records Data Base Access

    International Nuclear Information System (INIS)

    Amaro, M.

    1990-01-01

    The CIEMAT personal dose records are computerized in a Dosimetric Data Base whose primary purpose is individual dose follow-up control and data handling for epidemiological studies. Within the Data Base management scheme, software was developed to allow searching of individual dose records by external authorised users. The report describes the software developed to allow authorised persons to visualize on screen a summary of the individual dose records of workers included in the Data Base. The report includes the User Guide for the authorised list of users and listings of the codes and subroutines developed. (Author) 2 refs

  18. Entropy based software processes improvement

    NARCIS (Netherlands)

    Trienekens, J.J.M.; Kusters, R.J.; Kriek, D.; Siemons, P.

    2009-01-01

    Actual results of software process improvement projects show different levels of success. Although many software development organisations have adopted improvement models such as CMMI, it appears to be difficult to improve software development processes in the right way, e.g. tuned to the actual

  19. A combined Component-Based Approach for the Design of Distributed Software Systems

    NARCIS (Netherlands)

    Guareis de farias, Cléver; Ferreira Pires, Luis; van Sinderen, Marten J.; Quartel, Dick; Yang, H.; Gupta, S.

    2001-01-01

    Component-based software development enables the construction of software artefacts by assembling binary units of production, distribution and deployment, the so-called components. Several approaches to component-based development have been proposed recently. Most of these approaches are based on

  20. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed software to help manage large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; analysis of core photos and images; waveforms and NMR; and external files and documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board. The product includes emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software that features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In

  1. Elastic scattering at the LHC

    CERN Document Server

    Kaspar, Jan; Deile, M

    The seemingly simple elastic scattering of protons still presents a challenge for the theory. In this thesis we discuss the elastic scattering from theoretical as well as experimental point of view. In the theory part, we present several models and their predictions for the LHC. We also discuss the Coulomb-hadronic interference, where we present a new eikonal calculation to all orders of alpha, the fine-structure constant. In the experimental part we introduce the TOTEM experiment which is dedicated, among other subjects, to the measurement of the elastic scattering at the LHC. This measurement is performed primarily with the Roman Pot (RP) detectors - movable beam-pipe insertions hundreds of meters from the interaction point, that can detect protons scattered to very small angles. We discuss some aspects of the RP simulation and reconstruction software. A central point is devoted to the techniques of RP alignment - determining the RP sensor positions relative to each other and to the beam. At the end we pres...

  2. Radiative transfer modeling through terrestrial atmosphere and ocean accounting for inelastic processes: Software package SCIATRAN

    International Nuclear Information System (INIS)

    Rozanov, V.V.; Dinter, T.; Rozanov, A.V.; Wolanin, A.; Bracher, A.; Burrows, J.P.

    2017-01-01

    SCIATRAN is a comprehensive software package which is designed to model radiative transfer processes in the terrestrial atmosphere and ocean in the spectral range from the ultraviolet to the thermal infrared (0.18–40 μm). It accounts for multiple scattering processes, polarization, thermal emission and ocean–atmosphere coupling. The main goal of this paper is to present a recently developed version of SCIATRAN which accurately takes into account inelastic radiative processes in both the atmosphere and the ocean. In the scalar version of the coupled ocean–atmosphere radiative transfer solver presented by Rozanov et al. we have implemented the simulation of rotational Raman scattering, vibrational Raman scattering, chlorophyll and colored dissolved organic matter fluorescence. In this paper we discuss and explain the numerical methods used in SCIATRAN to solve the scalar radiative transfer equation including trans-spectral processes, and demonstrate how selected radiative transfer problems are solved using the SCIATRAN package. In addition we present selected comparisons of SCIATRAN simulations with published benchmark results, independent radiative transfer models, and various measurements from satellite, ground-based, and ship-borne instruments. The extended SCIATRAN software package, along with a detailed User's Guide, is made available for scientists and students who are undertaking their own research, typically at universities, via the web page of the Institute of Environmental Physics (IUP), University of Bremen (http://www.iup.physik.uni-bremen.de). - Highlights: • A new version of the software package SCIATRAN is presented. • Inelastic scattering in water and atmosphere is implemented in SCIATRAN. • Raman scattering and fluorescence can be included in radiative transfer calculations. • Comparisons to other radiative transfer models show excellent agreement. • Comparisons to observations show consistent results.

  3. A Bayesian belief nets based quantitative software reliability assessment for PSA: COTS case study

    International Nuclear Information System (INIS)

    Eom, H. S.; Sung, T. Y.; Jeong, H. S.; Park, J. H.; Kang, H. G.; Lee, K. Y.; Park, J. K

    2002-03-01

    Current reliability assessments of safety-critical software embedded in the digital systems of nuclear power plants are based on rule-based qualitative assessment methods. Recently, practical needs have called for quantitative features of software reliability for Probabilistic Safety Assessment (PSA), one of the important methods used in assessing the overall safety of nuclear power plants. But conventional quantitative software reliability assessment methods are not sufficient to obtain the necessary results when assessing the safety-critical software used in nuclear power plants. Thus, current reliability assessment methods for these digital systems exclude the software part or use arbitrary values for the software reliability in the assessment. This report discusses a Bayesian Belief Nets (BBN) based quantification method that models the current qualitative software assessment in a formal way and produces the quantitative results required for PSA. The Commercial Off-The-Shelf (COTS) software dedication process that KAERI developed was applied to the discussed BBN-based method to evaluate the plausibility of the proposed method in PSA

  4. An Automated Weather Research and Forecasting (WRF)-Based Nowcasting System: Software Description

    Science.gov (United States)

    2013-10-01

    A Web service/Web interface software package has been engineered to address the need for an automated means to run the Weather Research and Forecasting (WRF) model...

  5. Empowering global software development with business intelligence

    OpenAIRE

    Maté Morga, Alejandro; Trujillo Mondéjar, Juan Carlos; García, Félix; Serrano Martín, Manuel; Piattini, Mario

    2016-01-01

    Context: Global Software Development (GSD) allows companies to take advantage of talent spread across the world. Most research has been focused on the development aspect. However, little if any attention has been paid to the management of GSD projects. Studies report a lack of adequate support for management’s decisions made during software development, further accentuated in GSD since information is scattered throughout multiple factories, stored in different formats and standards. Objective...

  6. A system design of data acquisition and processing for side-scatter lidar

    Science.gov (United States)

    Zhang, ZhanYe; Xie, ChenBo; Wang, ZhenZhu; Kuang, ZhiQiang; Deng, Qian; Tao, ZongMing; Liu, Dong; Wang, Yingjian

    2018-03-01

    A system for collecting data from a side-scatter lidar based on a charge-coupled device (CCD) is designed and implemented. The data acquisition system is based on the Microsoft .NET framework, and the C# language is used to call the dynamic link library (DLL) of the CCD for real-time data acquisition and processing. The software stores data as txt files for post-acquisition processing and analysis. The system is able to operate the CCD device in all-day, automatic, continuous and high-frequency data acquisition and processing conditions, capturing 24-hour information on the scattered light intensity of the atmosphere and retrieving the spatial and temporal properties of aerosol particles. The experimental results show that the system is convenient for observing the aerosol optical characteristics near the surface.

  7. Monitoring extensions for component-based distributed software

    NARCIS (Netherlands)

    Diakov, N.K.; Papir, Z.; van Sinderen, Marten J.; Quartel, Dick

    2000-01-01

    This paper defines a generic class of monitoring extensions to component-based distributed enterprise software. Introducing a monitoring extension to a legacy application system can be very costly. In this paper, we identify the minimum support for application monitoring within the generic

  8. Thermal invisibility based on scattering cancellation and mantle cloaking

    KAUST Repository

    Farhat, Mohamed; Chen, P.-Y.; Bagci, Hakan; Amra, C.; Guenneau, S.; Alù, A.

    2015-01-01

    We theoretically and numerically analyze thermal invisibility based on the concept of scattering cancellation and mantle cloaking. We show that a small object can be made completely invisible to heat diffusion waves, by tailoring the heat conductivity of the spherical shell enclosing the object. This means that the thermal scattering from the object is suppressed, and the heat flow outside the object and the cloak made of these spherical shells behaves as if the object is not present. Thermal invisibility may open new vistas in hiding hot spots in infrared thermography, military furtivity, and electronics heating reduction.
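
    The cancellation condition can be sketched with the standard quasi-static core-shell (neutral inclusion) relation; the symbols below are assumptions for illustration (core conductivity \kappa_1 and radius r_1, shell conductivity \kappa_2 and outer radius r_2, background conductivity \kappa_0), not the paper's exact derivation:

        \kappa_{\mathrm{eff}}
          = \kappa_2 \, \frac{(\kappa_1 + 2\kappa_2) + 2\gamma\,(\kappa_1 - \kappa_2)}
                             {(\kappa_1 + 2\kappa_2) - \gamma\,(\kappa_1 - \kappa_2)},
        \qquad \gamma = \left(\frac{r_1}{r_2}\right)^{3}

    Choosing \kappa_2 so that \kappa_{\mathrm{eff}} = \kappa_0 suppresses the leading (dipole) term of the thermal scattering, rendering the coated object neutral to the background heat flux.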

  9. A compact X-ray source based on Compton scattering

    Energy Technology Data Exchange (ETDEWEB)

    Bulyak, E.; Gladkikh, P.; Grigor'ev, Yu.; Guk, I.; Karnaukhov, I.; Khodyachikh, A.; Kononenko, S.; Mocheshnikov, N.; Mytsykov, A.; Shcherbakov, A. E-mail: shcherbakov@kipt.kharkov.ua; Tarasenko, A.; Telegin, Yu.; Zelinsky, A.

    2001-07-21

    The main parameters of the Kharkov electron storage ring N-100, with a beam energy range from 70 to 150 MeV, are presented. The main results obtained in experimental studies are briefly described. Plans for upgrading N-100 into an X-ray generator based on Compton back-scattering are presented. The electron beam energy range will be extended up to 250 MeV and the circumference of the storage ring will be 13.72 m. The lattice, the parameters of the electron beam, and the Compton back-scattered photon flux are described.

  10. A compact X-ray source based on Compton scattering

    International Nuclear Information System (INIS)

    Bulyak, E.; Gladkikh, P.; Grigor'ev, Yu.; Guk, I.; Karnaukhov, I.; Khodyachikh, A.; Kononenko, S.; Mocheshnikov, N.; Mytsykov, A.; Shcherbakov, A.; Tarasenko, A.; Telegin, Yu.; Zelinsky, A.

    2001-01-01

    The main parameters of the Kharkov electron storage ring N-100, with a beam energy range from 70 to 150 MeV, are presented. The main results obtained in experimental studies are briefly described. Plans for upgrading N-100 into an X-ray generator based on Compton back-scattering are presented. The electron beam energy range will be extended up to 250 MeV and the circumference of the storage ring will be 13.72 m. The lattice, the parameters of the electron beam, and the Compton back-scattered photon flux are described.

  11. Thermal invisibility based on scattering cancellation and mantle cloaking

    KAUST Repository

    Farhat, Mohamed

    2015-04-30

    We theoretically and numerically analyze thermal invisibility based on the concept of scattering cancellation and mantle cloaking. We show that a small object can be made completely invisible to heat diffusion waves, by tailoring the heat conductivity of the spherical shell enclosing the object. This means that the thermal scattering from the object is suppressed, and the heat flow outside the object and the cloak made of these spherical shells behaves as if the object is not present. Thermal invisibility may open new vistas in hiding hot spots in infrared thermography, military furtivity, and electronics heating reduction.

  12. Entrepreneurial model based technology creative industries sector software through the use of free open source software for Universitas Pendidikan Indonesia students

    Science.gov (United States)

    Hasan, B.; Hasbullah; Purnama, W.; Hery, A.

    2016-04-01

    Development of the software creative industry using Free Open Source Software (FOSS) is expected to be one of the solutions for fostering new student entrepreneurs who can create job opportunities and contribute to economic development in Indonesia. This study aims to create an entrepreneurial coaching model for the software-based creative industries using FOSS, and to provide understanding and mentoring in software-based creative-industry entrepreneurship for students of Universitas Pendidikan Indonesia. The activity begins with identifying the software technology businesses to be developed, followed by training and mentoring, an apprenticeship with industrial partners, creation of business plans, and monitoring and evaluation. The activity involves 30 UPI students who are motivated toward self-employment and competent in information technology. The expected results and outcomes are the emergence of new student entrepreneurs in the software industry, covering both commerce (e-commerce) and education/learning (e-learning/LMS) as well as games.

  13. Software-Based Visual Loan Calculator For Banking Industry

    Science.gov (United States)

    Isizoh, A. N.; Anazia, A. E.; Okide, S. O.; Onyeyili, T. I.; Okwaraoka, C. A. P.

    2012-03-01

    A visual loan calculator for the banking industry is a necessity in modern banking systems, which employ many design techniques for security reasons. This paper presents the software-based design and implementation of a visual loan calculator for the banking industry using Visual Basic .NET (VB.NET). The fundamental approach is to develop a graphical user interface (GUI) using VB.NET tools, and then to develop a working program that calculates the interest on any loan obtained. The VB.NET program was implemented and the software proved satisfactory.
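
    As a sketch of the kind of computation such a calculator performs, the snippet below applies the standard fixed-rate amortization formula in Python; the amounts and rates are illustrative, and the paper's exact interest model is not specified here:

        def monthly_payment(principal: float, annual_rate: float, months: int) -> float:
            """Fixed monthly payment for a loan repaid over `months` months."""
            r = annual_rate / 12.0              # monthly interest rate
            if r == 0:
                return principal / months       # zero-interest edge case
            return principal * r * (1 + r) ** months / ((1 + r) ** months - 1)

        payment = monthly_payment(10_000.0, 0.12, 24)      # illustrative loan
        total_interest = payment * 24 - 10_000.0
        print(f"monthly payment: {payment:.2f}, total interest: {total_interest:.2f}")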

  14. A SOFTWARE RELIABILITY ESTIMATION METHOD TO NUCLEAR SAFETY SOFTWARE

    Directory of Open Access Journals (Sweden)

    GEE-YONG PARK

    2014-02-01

    Full Text Available A method for estimating software reliability for nuclear safety software is proposed in this paper. This method is based on the software reliability growth model (SRGM), where the behavior of software failures is assumed to follow a non-homogeneous Poisson process. Two types of modeling schemes based on a particular underlying method are proposed in order to more precisely estimate and predict the number of software defects based on very rare software failure data. Bayesian statistical inference is employed to estimate the model parameters by incorporating software test cases as a covariate into the model. It was identified that these models are capable of reasonably estimating the remaining number of software defects, which directly affects the reactor trip functions. The software reliability might be estimated from these modeling equations, and one approach for obtaining a software reliability value is proposed in this paper.
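
    A minimal sketch of an SRGM fit, assuming the Goel-Okumoto NHPP mean value function m(t) = a(1 - exp(-b t)) and synthetic failure data; the paper's Bayesian inference with test-case covariates is not reproduced here:

        import numpy as np
        from scipy.optimize import curve_fit

        def mean_value(t, a, b):
            """Expected cumulative number of failures observed by time t."""
            return a * (1.0 - np.exp(-b * t))

        # Synthetic cumulative failure counts at test times t (illustrative only).
        t = np.array([10, 20, 30, 40, 50, 60], dtype=float)
        failures = np.array([4, 7, 9, 10, 11, 11], dtype=float)

        (a, b), _ = curve_fit(mean_value, t, failures, p0=(12.0, 0.05))
        remaining = a - failures[-1]            # estimated residual defects
        print(f"a = {a:.1f}, b = {b:.3f}, remaining defects = {remaining:.1f}")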

  15. Model-based software process improvement

    Science.gov (United States)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  16. Software Component Clustering and Retrieval: An Entropy-based Fuzzy k-Modes Methodology

    OpenAIRE

    Stylianou, Constantinos; Andreou, Andreas S.

    2008-01-01

    The number of software houses attempting to adopt a component-based development approach is rapidly increasing. However, many organisations still find it difficult to complete the shift, as it requires them to alter their entire software development process and philosophy. Furthermore, to promote component-based software engineering, organisations must be ready to promote reusability, and this can only be attained if the proper framework exists from which a developer can access, search and retri...

  17. Use of Data Base Microcomputer Software in Descriptive Nursing Research

    OpenAIRE

    Chapman, Judy Jean

    1985-01-01

    Data base microcomputer software was used to design a file for data storage and retrieval in a qualitative nursing research project. The needs of 50 breast feeding mothers from birth to four months were studied. One thousand records with descriptive nursing data were entered into the file. The search and retrieval capability of data base software facilitated this qualitative research. The findings will be discussed in three areas: (1) infant concerns, (2) postpartum concerns, and (3) breast c...

  18. Utility of ck metrics in predicting size of board-based software games

    International Nuclear Information System (INIS)

    Sabhat, N.; Azam, F.; Malik, A.A.

    2017-01-01

    Software size is one of the most important inputs of many software cost and effort estimation models. Early estimation of software size plays an important role at the time of project inception. An accurate estimate of software size is, therefore, crucial for planning, managing, and controlling software development projects dealing with the development of software games. However, software size is unavailable during the early phases of software development. This research determines the utility of CK (Chidamber and Kemerer) metrics, a well-known suite of object-oriented metrics, in estimating the size of software applications using the information from their UML (Unified Modeling Language) class diagrams. This work focuses on a small subset dealing with board-based software games. Almost sixty games written in an object-oriented programming language were downloaded from open source repositories, analyzed, and used to calibrate a regression-based size estimation model. Forward stepwise MLR (Multiple Linear Regression) is used for model fitting. The model thus obtained is assessed using a variety of accuracy measures such as MMRE (Mean Magnitude of Relative Error), prediction at level x (PRED(x)), and MdMRE (Median Magnitude of Relative Error), and validated using K-fold cross validation. The accuracy of this model is also compared with an existing model tailored for size estimation of board games. Based on a small subset of desktop games developed in various object-oriented languages, we obtained a model using CK metrics and forward stepwise multiple linear regression with reasonable estimation accuracy, as indicated by the value of the coefficient of determination (R2 = 0.756). Comparison results indicate that the existing size estimation model outperforms the model derived using CK metrics in terms of accuracy of prediction. (author)
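
    A minimal sketch of regression-based size estimation from CK-style metrics, in Python with synthetic data; the paper's stepwise selection, PRED(x) and K-fold validation are not reproduced:

        import numpy as np

        # Rows: games; columns: illustrative CK metrics (e.g. WMC, DIT, NOC).
        X = np.array([[25, 2, 3], [40, 3, 5], [15, 1, 2], [55, 4, 6], [30, 2, 4]], dtype=float)
        loc = np.array([1200, 2100, 800, 2900, 1500], dtype=float)   # size in LOC

        # Ordinary least squares with an intercept column.
        A = np.hstack([np.ones((X.shape[0], 1)), X])
        coef, *_ = np.linalg.lstsq(A, loc, rcond=None)

        predicted = A @ coef
        mmre = np.mean(np.abs(predicted - loc) / loc)   # Mean Magnitude of Relative Error
        print(f"coefficients: {coef.round(1)}, MMRE = {mmre:.3f}")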

  19. Flexible software process lines in practice: A metamodel-based approach to effectively construct and manage families of software process models

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Ternité, Thomas; Friedrich, Jan

    2017-01-01

    Process flexibility and adaptability is frequently discussed, and several proposals aim to improve software processes for a given organization/project context. A software process line (SPrL) is an instrument to systematically construct and manage variable software processes by combining pre-defined process assets. We show how to construct flexible SPrLs and demonstrate their practical application in the German V-Modell XT. We contribute a proven approach that is presented as a metamodel fragment for reuse and implementation in further process modeling approaches. This summary refers to the paper: Flexible software process lines in practice: A metamodel-based approach to effectively construct and manage families of software process models [Ku16], published as an original research article in the Journal of Systems and Software.

  20. Component-Based Software Engineering and Runtime Type Definition

    OpenAIRE

    A. R. Shakurov

    2011-01-01

    The component-based approach to software engineering, its current implementations and their limitations are discussed. A new extended architecture for such systems is presented. Its main architectural concepts and principles are considered.

  1. Biomolecular structure refinement using the GROMOS simulation software

    International Nuclear Information System (INIS)

    Schmid, Nathan; Allison, Jane R.; Dolenc, Jožica; Eichenberger, Andreas P.; Kunz, Anna-Pitschna E.; Gunsteren, Wilfred F. van

    2011-01-01

    For the understanding of cellular processes, the molecular structure of biomolecules has to be accurately determined. Initial models can be significantly improved by structure refinement techniques. Here, we present the refinement methods and analysis techniques implemented in the GROMOS software for biomolecular simulation. The methodology and some implementation details of the computation of NMR NOE data, 3J-couplings and residual dipolar couplings, X-ray scattering intensities from crystals and solutions, and neutron scattering intensities used in GROMOS are described, and refinement strategies and concepts are discussed using example applications. The GROMOS software allows structure refinement combining different types of experimental data with different types of restraining functions, while using a variety of methods to enhance conformational searching and sampling and the thermodynamically calibrated GROMOS force field for biomolecular simulation.

  2. Biomolecular structure refinement using the GROMOS simulation software

    Energy Technology Data Exchange (ETDEWEB)

    Schmid, Nathan; Allison, Jane R.; Dolenc, Jozica; Eichenberger, Andreas P.; Kunz, Anna-Pitschna E.; Gunsteren, Wilfred F. van, E-mail: wfvgn@igc.phys.chem.ethz.ch [Swiss Federal Institute of Technology ETH, Laboratory of Physical Chemistry (Switzerland)

    2011-11-15

    For the understanding of cellular processes, the molecular structure of biomolecules has to be accurately determined. Initial models can be significantly improved by structure refinement techniques. Here, we present the refinement methods and analysis techniques implemented in the GROMOS software for biomolecular simulation. The methodology and some implementation details of the computation of NMR NOE data, 3J-couplings and residual dipolar couplings, X-ray scattering intensities from crystals and solutions, and neutron scattering intensities used in GROMOS are described, and refinement strategies and concepts are discussed using example applications. The GROMOS software allows structure refinement combining different types of experimental data with different types of restraining functions, while using a variety of methods to enhance conformational searching and sampling and the thermodynamically calibrated GROMOS force field for biomolecular simulation.

  3. Component-based software for high-performance scientific computing

    Energy Technology Data Exchange (ETDEWEB)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly.

  4. Component-based software for high-performance scientific computing

    International Nuclear Information System (INIS)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly

  5. Input-profile-based software failure probability quantification for safety signal generation systems

    International Nuclear Information System (INIS)

    Kang, Hyun Gook; Lim, Ho Gon; Lee, Ho Jung; Kim, Man Cheol; Jang, Seung Cheol

    2009-01-01

    The approaches for software failure probability estimation are mainly based on the results of testing. Test cases represent the inputs encountered in actual use. The test inputs for a safety-critical application such as a reactor protection system (RPS) of a nuclear power plant are the inputs that cause the activation of a protective action, such as a reactor trip. A digital system treats inputs from instrumentation sensors as discrete digital values by using an analog-to-digital converter. The input profile must be determined in consideration of these characteristics for effective software failure probability quantification. Another important characteristic of software testing is that we do not have to repeat the test for the same input value, since the software response is deterministic for each specific digital input. With these considerations, we propose an effective software testing method for quantifying the failure probability. As an example application, the input profile of the digital RPS is developed based on typical plant data. The proposed method is expected to provide a simple but realistic means to quantify the software failure probability based on the input profile and system dynamics.
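
    A minimal sketch of input-profile-based testing in Python, assuming a hypothetical trip-logic function and input profile; all names and values are illustrative, not the paper's actual RPS model:

        import random

        def trip_logic(pressure: float) -> bool:
            """Hypothetical software under test: trip above the setpoint."""
            return pressure > 162.0

        def specification(pressure: float) -> bool:
            """Expected behavior from the specification (assumed identical here)."""
            return pressure > 162.0

        random.seed(0)
        # The software response is deterministic for each discrete digital input,
        # so every distinct digitized value needs to be tested only once.
        results, counts = {}, {}
        for _ in range(100_000):
            p = round(random.gauss(160.0, 5.0), 1)   # ADC-like quantized input from the profile
            counts[p] = counts.get(p, 0) + 1
            if p not in results:
                results[p] = (trip_logic(p) != specification(p))

        total = sum(counts.values())
        p_fail = sum(counts[p] for p in results if results[p]) / total
        print(f"distinct inputs tested: {len(results)}, estimated failure probability: {p_fail:.2e}")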

  6. ASKI: A modular toolbox for scattering-integral-based seismic full waveform inversion and sensitivity analysis utilizing external forward codes

    Directory of Open Access Journals (Sweden)

    Florian Schumacher

    2016-01-01

    Full Text Available Due to increasing computational resources, the development of new numerically demanding methods and software for imaging Earth’s interior remains of high interest in Earth sciences. Here, we give a description, from a user’s and programmer’s perspective, of the highly modular, flexible and extendable software package ASKI–Analysis of Sensitivity and Kernel Inversion–recently developed for iterative scattering-integral-based seismic full waveform inversion. In ASKI, the three fundamental steps of solving the seismic forward problem, computing waveform sensitivity kernels and deriving a model update are solved by independent software programs that interact via file output/input only. Furthermore, the spatial discretizations of the model space used for solving the seismic forward problem and for deriving model updates, respectively, are kept completely independent. For this reason, ASKI does not contain a specific forward solver but instead provides a general interface to established community wave propagation codes. Moreover, the third fundamental step of deriving a model update can be repeated at relatively low cost, applying different kinds of model regularization or re-selecting/weighting the inverted dataset without the need to re-solve the forward problem or re-compute the kernels. Additionally, ASKI offers the user sensitivity and resolution analysis tools based on the full sensitivity matrix, and allows the composition of customized workflows in a consistent computational environment. ASKI is written in modern Fortran and Python, it is well documented and freely available under the terms of the GNU General Public License (http://www.rub.de/aski).

  7. Acoustic inverse scattering using topological derivative of far-field measurements-based L2 cost functionals

    International Nuclear Information System (INIS)

    Bellis, Cédric; Bonnet, Marc; Cakoni, Fioralba

    2013-01-01

    Originally formulated in the context of topology optimization, the concept of topological derivative has also proved effective as a qualitative inversion tool for a wave-based identification of finite-sized objects. This approach remains, however, largely based on a heuristic interpretation of the topological derivative, whereas most other qualitative approaches to inverse scattering are backed by a mathematical justification. As an effort toward bridging this gap, this study focuses on a topological derivative approach applied to the L2-norm of the misfit between far-field measurements. Either an inhomogeneous medium or a finite number of point-like scatterers are considered, using either the Born approximation or a full-scattering model. Topological derivative-based imaging functionals are analyzed using a suitable factorization of the far-field operator, for each of the considered cases, in order to characterize their behavior and assess their ability to reconstruct the unknown scatterer(s). Results include the justification of the usual sign heuristic underpinning the method for (i) the Born approximation and (ii) full-scattering models limited to moderately strong scatterers. Semi-analytical and numerical examples are presented. Within the chosen framework, the topological derivative approach is finally discussed and compared to other well-known qualitative methods. (paper)

  8. The user's manual of 'Manyo Library' data reduction software framework at MLF, J-PARC

    International Nuclear Information System (INIS)

    Inamura, Yasuhiro; Nakatani, Takeshi; Ito, Takayoshi; Suzuki, Jiro

    2016-06-01

    Manyo Library is a software framework for developing analysis software for neutron scattering data produced at MLF, J-PARC. This software framework is required to work on many instruments in MLF and to include base functions applied to various scientific purposes at beam lines. The framework mainly consists of data containers, which can store 1-, 2- and 3-dimensional axis data for neutron scattering. Data containers have many functions to perform the four arithmetic operations with error propagation between containers, to store the meta-data about measurements, and to read or write text files. Analysis codes are constructed using the various analysis operators defined in Manyo Library, which execute functions on given data containers and output the results. On the other hand, the main interface for instrument scientists and users must be easy and interactive for treating data containers and functions and for developing new analysis codes; therefore we chose Python as the user interface. Since Manyo Library is written in C++, technology for calling C++ functions from a Python environment has been introduced into the framework. As a result, we have already developed a lot of software for data reduction, analysis and visualization, which is utilized widely at beam lines in MLF. This document is the manual for beginners using this framework. (author)
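
    A minimal sketch of a Manyo-style data container with error propagation, in Python and assuming independent Gaussian errors; the class and method names are invented for illustration and are not Manyo Library's actual API:

        import math

        class Histogram:
            def __init__(self, values, errors):
                self.values = list(values)
                self.errors = list(errors)

            def __add__(self, other):
                # Errors of independent quantities add in quadrature.
                return Histogram(
                    [a + b for a, b in zip(self.values, other.values)],
                    [math.hypot(ea, eb) for ea, eb in zip(self.errors, other.errors)],
                )

        run1 = Histogram([10.0, 12.0], [1.0, 1.1])
        run2 = Histogram([9.0, 14.0], [0.9, 1.2])
        total = run1 + run2
        print(total.values, [round(e, 2) for e in total.errors])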

  9. Normalization to specific gravity prior to analysis improves information recovery from high resolution mass spectrometry metabolomic profiles of human urine.

    Science.gov (United States)

    Edmands, William M B; Ferrari, Pietro; Scalbert, Augustin

    2014-11-04

    Extraction of meaningful biological information from urinary metabolomic profiles obtained by liquid chromatography coupled to mass spectrometry (MS) necessitates the control of unwanted sources of variability associated with large differences in urine sample concentrations. Different methods of normalization either before analysis (preacquisition normalization) through dilution of urine samples to the lowest specific gravity measured by refractometry, or after analysis (postacquisition normalization) to urine volume, specific gravity and median fold change are compared for their capacity to recover lead metabolites for a potential future use as dietary biomarkers. Twenty-four urine samples of 19 subjects from the European Prospective Investigation into Cancer and nutrition (EPIC) cohort were selected based on their high and low/nonconsumption of six polyphenol-rich foods as assessed with a 24 h dietary recall. MS features selected on the basis of minimum discriminant selection criteria were related to each dietary item by means of orthogonal partial least-squares discriminant analysis models. Normalization methods ranked in the following decreasing order when comparing the number of total discriminant MS features recovered to that obtained in the absence of normalization: preacquisition normalization to specific gravity (4.2-fold), postacquisition normalization to specific gravity (2.3-fold), postacquisition median fold change normalization (1.8-fold), postacquisition normalization to urinary volume (0.79-fold). A preventative preacquisition normalization based on urine specific gravity was found to be superior to all curative postacquisition normalization methods tested for discovery of MS features discriminant of dietary intake in these urinary metabolomic datasets.
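
    A minimal sketch of post-acquisition specific-gravity normalization in Python, assuming the common convention I_norm = I_raw * (SG_ref - 1)/(SG_sample - 1); the values are illustrative, not from the study:

        def normalize_to_specific_gravity(intensity, sg_sample, sg_ref=1.020):
            """Scale a feature intensity to a reference specific gravity."""
            return intensity * (sg_ref - 1.0) / (sg_sample - 1.0)

        raw_intensity = 5.4e5
        print(f"{normalize_to_specific_gravity(raw_intensity, sg_sample=1.030):.3e}")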

  10. [Heart rate variability study based on a novel RdR RR Intervals Scatter Plot].

    Science.gov (United States)

    Lu, Hongwei; Lu, Xiuyun; Wang, Chunfang; Hua, Youyuan; Tian, Jiajia; Liu, Shihai

    2014-08-01

    On the basis of the Poincaré scatter plot and the first-order difference scatter plot, a novel heart rate variability (HRV) analysis method based on scatter plots of RR intervals and first-order differences of RR intervals (namely, RdR) was proposed. The abscissa of the RdR scatter plot (x-axis) is the RR interval, and the ordinate (y-axis) is the difference between successive RR intervals. The RdR scatter plot therefore combines information on RR intervals and on the differences between successive RR intervals, capturing more HRV information. By RdR scatter plot analysis of some records of the MIT-BIH arrhythmia database, we found that the scatter plots of uncoupled premature ventricular contraction (PVC), coupled ventricular bigeminy, and ventricular trigeminy PVC have specific graphic characteristics. The RdR scatter plot method has higher detection performance than the Poincaré scatter plot method, and is simpler and more intuitive than the first-order difference method.
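
    A minimal sketch of an RdR scatter plot in Python: x = RR(n), y = RR(n+1) - RR(n); the RR series below is synthetic, for illustration only:

        import numpy as np
        import matplotlib.pyplot as plt

        rng = np.random.default_rng(0)
        rr = 0.8 + 0.05 * rng.standard_normal(500)   # synthetic RR intervals (s)

        x = rr[:-1]            # RR interval
        y = np.diff(rr)        # first-order difference of successive RR intervals

        plt.scatter(x, y, s=4)
        plt.xlabel("RR interval (s)")
        plt.ylabel("Successive RR difference (s)")
        plt.title("RdR scatter plot")
        plt.show()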

  11. Approach to the nonrelativistic scattering theory based on the causality, superposition and unitarity principles

    International Nuclear Information System (INIS)

    Gajnutdinov, R.Kh.

    1983-01-01

    The possibility of constructing a nonrelativistic scattering theory on the basis of the general physical principles of causality, superposition, and unitarity, making no use of the Schroedinger formalism, is studied. The suggested approach is shown to be more general than the nonrelativistic scattering theory based on the Schroedinger equation. The approach is applied to build a model scattering theory for a system which consists of heavy nonrelativistic particles and a light relativistic particle.

  12. Software FMEA analysis for safety-related application software

    International Nuclear Information System (INIS)

    Park, Gee-Yong; Kim, Dong Hoon; Lee, Dong Young

    2014-01-01

    Highlights: • We develop a modified FMEA analysis suited for application to software architecture. • A template for failure modes in a specific software language is established. • A detailed-level software FMEA analysis of nuclear safety software is presented. - Abstract: A method of software safety analysis for safety-related application software is described in this paper. The target software system is the software code installed in an Automatic Test and Interface Processor (ATIP) in a digital reactor protection system (DRPS). For the ATIP software safety analysis, first an overall safety or hazard analysis is performed over the software architecture and modules, and then a detailed safety analysis based on the software FMEA (Failure Modes and Effects Analysis) method is applied to the ATIP program. For an efficient analysis, the software FMEA analysis is carried out based on a so-called failure-mode template extracted from the function blocks used in the function block diagram (FBD) of the ATIP software. The software FMEA analysis, applied to the ATIP software code that has been integrated and passed through a very rigorous system test procedure, is proven to be able to provide very valuable results (i.e., software defects) that could not be identified during the various system tests.

  13. Software-based annunciator replacement: a tale of two projects

    International Nuclear Information System (INIS)

    Simmons, G.T.

    2015-01-01

    Annunciator upgrade projects are often included as parts of operating plant life extension projects as the systems are old and replacement parts are difficult to source. This paper contains case studies of the software-based annunciator replacement projects at the Westinghouse SNUPPS training simulator in Pennsylvania and the Axpo Beznau nuclear power plant in Switzerland. Software-based annunciator systems can offer a number of feature enhancements including improved readability and operator awareness, easy configuration, alarm suppression features, and alarm management at operator workstations. This paper provides an overview of each project and discusses advantages, challenges, and lessons learned from both annunciator-replacement projects. (author)

  14. Software-based annunciator replacement: a tale of two projects

    Energy Technology Data Exchange (ETDEWEB)

    Simmons, G.T., E-mail: simmongt@westinghouse.com [Westinghouse Electric Company LLC, Cranberry Township, PA (United States)

    2015-07-01

    Annunciator upgrade projects are often included as parts of operating plant life extension projects as the systems are old and replacement parts are difficult to source. This paper contains case studies of the software-based annunciator replacement projects at the Westinghouse SNUPPS training simulator in Pennsylvania and the Axpo Beznau nuclear power plant in Switzerland. Software-based annunciator systems can offer a number of feature enhancements including improved readability and operator awareness, easy configuration, alarm suppression features, and alarm management at operator workstations. This paper provides an overview of each project and discusses advantages, challenges, and lessons learned from both annunciator-replacement projects. (author)

  15. A Technology-Neutral Role-Based Collaboration Model for Software Ecosystems

    DEFF Research Database (Denmark)

    Stanciulescu, Stefan; Rabiser, Daniela; Seidl, Christoph

    2016-01-01

    In large-scale software ecosystems, many developers contribute extensions to a common software platform. Due to the independent development efforts and the lack of a central steering mechanism, similar functionality may be developed multiple times by different developers. We tackle this problem by contributing a role-based collaboration model for software ecosystems that makes such implicit similarities explicit and raises awareness among developers during their ongoing efforts. We extract this model based on realization artifacts in a specific programming language located in a particular source code repository. Finally, using the collaborations defined in the formalism, we model real artifacts from Marlin, a firmware for 3D printers, and we show that for the selected scenarios the five collaborations were sufficient to raise awareness and make implicit similarities explicit.

  16. Software for medical image based phantom modelling

    International Nuclear Information System (INIS)

    Possani, R.G.; Massicano, F.; Coelho, T.S.; Yoriyaz, H.

    2011-01-01

    The latest treatment planning systems depend strongly on CT images, so the tendency is for dosimetry procedures in nuclear medicine therapy to also be based on images, such as magnetic resonance imaging (MRI) or computed tomography (CT), to extract anatomical and histological information, as well as on functional imaging or activity maps such as PET or SPECT. This information, associated with radiation transport simulation software, is used to estimate the internal dose in patients undergoing treatment in nuclear medicine. This work aims to re-engineer the software SCMS, an interface between the Monte Carlo code MCNP and the medical images that carry information from the patient under treatment. In other words, the necessary information contained in the images is interpreted and presented in a specific format to the Monte Carlo code MCNP to perform the simulation of radiation transport. Therefore, the user does not need to understand the complex process of inputting data to MCNP, as the SCMS is responsible for automatically constructing the anatomical data of the patient, as well as the radioactive source data. The SCMS was originally developed in Fortran-77. In this work it was rewritten in an object-oriented language (Java). New features and data options have also been incorporated into the software. Thus, the new software has a number of improvements, such as an intuitive GUI and a menu for the selection of the energy spectra corresponding to a specific radioisotope stored in an XML data bank. The new version also supports new materials, and the user can specify an image region of interest for the calculation of absorbed dose. (author)

  17. V and V based Fault Estimation Method for Safety-Critical Software using BNs

    International Nuclear Information System (INIS)

    Eom, Heung Seop; Park, Gee Yong; Jang, Seung Cheol; Kang, Hyun Gook

    2011-01-01

    Quantitative software reliability measurement approaches have severe limitations in demonstrating the proper level of reliability of safety-critical software. These limitations can be overcome by using some other means of assessment. One of the promising candidates is assessment based on the quality of the software development. Particularly in the nuclear industry, regulatory bodies in most countries do not accept the concept of quantitative goals as a sole means of meeting their regulations for the reliability of digital computers in NPPs, and use deterministic criteria for both hardware and software. The point of deterministic criteria is to assess the whole development process and its related activities during the software development life cycle for the acceptance of safety-critical software, and software V and V plays an important role in this process. In this light, we studied a V and V based fault estimation method using Bayesian Nets (BNs) to assess the reliability of safety-critical software, especially reactor protection system software in an NPP. The BNs in the study were constructed for the estimation of software faults and were based on the V and V framework, which governs the development of safety-critical software in the nuclear field. A case study was carried out for a reactor protection system that was developed as a part of the Korea Nuclear Instrumentation and Control System. The insight from the case study is that the important factors affecting the number of faults in the target software include the residual faults in the system specification, the maximum number of faults introduced in the development phase, the process/function characteristic ratio, uncertainty sizing, and the fault elimination rate of inspection activities.

  18. Repository-Based Software Engineering Program: Working Program Management Plan

    Science.gov (United States)

    1993-01-01

    Repository-Based Software Engineering Program (RBSE) is a National Aeronautics and Space Administration (NASA) sponsored program dedicated to introducing and supporting common, effective approaches to software engineering practices. The process of conceiving, designing, building, and maintaining software systems by using existing software assets that are stored in a specialized operational reuse library or repository, accessible to system designers, is the foundation of the program. In addition to operating a software repository, RBSE promotes (1) software engineering technology transfer, (2) academic and instructional support of reuse programs, (3) the use of common software engineering standards and practices, (4) software reuse technology research, and (5) interoperability between reuse libraries. This Program Management Plan (PMP) is intended to communicate program goals and objectives, describe major work areas, and define a management report and control process. This process will assist the Program Manager, University of Houston at Clear Lake (UHCL) in tracking work progress and describing major program activities to NASA management. The goal of this PMP is to make managing the RBSE program a relatively easy process that improves the work of all team members. The PMP describes work areas addressed and work efforts being accomplished by the program; however, it is not intended as a complete description of the program. Its focus is on providing management tools and management processes for monitoring, evaluating, and administering the program; and it includes schedules for charting milestones and deliveries of program products. The PMP was developed by soliciting and obtaining guidance from appropriate program participants, analyzing program management guidance, and reviewing related program management documents.

  19. Software Engineering and Swarm-Based Systems

    Science.gov (United States)

    Hinchey, Michael G.; Sterritt, Roy; Pena, Joaquin; Rouff, Christopher A.

    2006-01-01

    We discuss two software engineering aspects in the development of complex swarm-based systems. NASA researchers have been investigating various possible concept missions that would greatly advance future space exploration capabilities. The concept mission that we have focused on exploits the principles of autonomic computing as well as being based on the use of intelligent swarms, whereby a (potentially large) number of similar spacecraft collaborate to achieve mission goals. The intent is that such systems not only can be sent to explore remote and harsh environments but also are endowed with greater degrees of protection and longevity to achieve mission goals.

  20. A Systematic Mapping Study of Software Architectures for Cloud Based Systems

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali

    2014-01-01

    Context: Cloud computing has gained significant attention from researchers and practitioners. This emerging paradigm is being used to provide solutions in multiple domains without huge upfront investment because of its on-demand resource-provisioning model. However, the information about how software... Objective: The aim of this study is to systematically identify and analyze the currently published research on the topics related to software architectures for cloud-based systems in order to identify architecture solutions for achieving quality requirements. Method: We decided to carry out a systematic mapping study to find as much peer-reviewed literature on the topics related to software architectures for cloud-based systems as possible. This study has been carried out by following the guidelines for conducting systematic literature reviews and systematic mapping studies as reported in the literature. Based on our paper...

  1. SIGKit: a New Data-based Software for Learning Introductory Geophysics

    Science.gov (United States)

    Zhang, Y.; Kruse, S.; George, O.; Esmaeili, S.; Papadimitrios, K. S.; Bank, C. G.; Cadmus, A.; Kenneally, N.; Patton, K.; Brusher, J.

    2016-12-01

    Students of diverse academic backgrounds take introductory geophysics courses to learn the theory of a variety of measurement and analysis methods, with the expectation of being able to apply their basic knowledge to real data. Ideally, such data are collected in field courses and also used in lecture-based courses, because they provide a critical context for better learning and understanding of geophysical methods. Each method requires a separate software package for the data processing steps, and the complexity and variety of professional software make the path through data processing to data interpretation a strenuous learning process for students and a challenging teaching task for instructors. SIGKit (Student Investigation of Geophysics Toolkit), being developed as a collaboration between the University of South Florida, the University of Toronto, and MathWorks, intends to address these shortcomings by showing the most essential processing steps and allowing students to visualize the underlying physics of the various methods. It is based on MATLAB software, offered as an easy-to-use graphical user interface, and packaged so it can run as an executable in the classroom and the field, even on computers without MATLAB licenses. An evaluation of the software based on student feedback from focus-group interviews and think-aloud observations helps drive its development and refinement. The toolkit provides a logical gateway into the more sophisticated and costly software students will encounter later in their training and careers by combining essential visualization, modeling, processing, and analysis steps for seismic, GPR, magnetics, gravity, resistivity, and electromagnetic data.

  2. SU-E-I-07: An Improved Technique for Scatter Correction in PET

    International Nuclear Information System (INIS)

    Lin, S; Wang, Y; Lue, K; Lin, H; Chuang, K

    2014-01-01

    Purpose: In positron emission tomography (PET), the single scatter simulation (SSS) algorithm is widely used for scatter estimation in clinical scans. However, bias usually occurs at the essential step of scaling the computed SSS distribution to real scatter amounts by employing the scatter-only projection tail. The bias can be amplified when the scatter-only projection tail is too small, resulting in incorrect scatter correction. To this end, we propose a novel scatter calibration technique to accurately estimate the amount of scatter using a pre-determined scatter fraction (SF) function instead of the scatter-only tail information. Methods: As the SF depends on the radioactivity distribution and the attenuating material of the patient, an accurate theoretical relation cannot be devised. Instead, we constructed an empirical transformation function between SFs and average attenuation coefficients based on a series of phantom studies with different sizes and materials. From the average attenuation coefficient, the predicted SFs were calculated using the empirical transformation function. Hence, the real scatter amount can be obtained by scaling the SSS distribution with the predicted SFs. The simulation was conducted using SimSET. The Siemens Biograph™ 6 PET scanner was modeled in this study. The Software for Tomographic Image Reconstruction (STIR) was employed to estimate the scatter and reconstruct images. The EEC phantom was adopted to evaluate the performance of the proposed technique. Results: The scatter-corrected image of our method demonstrated improved image contrast over that of SSS. For the reconstructed images of our technique and SSS, the normalized standard deviations were 0.053 and 0.182, respectively; the root mean squared errors were 11.852 and 13.767, respectively. Conclusion: We have proposed an alternative method to calibrate SSS (C-SSS) to the absolute scatter amounts using SF. This method can avoid the bias caused by the insufficient
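
    A minimal sketch of the scatter-fraction-based scaling idea (C-SSS) in Python, assuming a pre-fitted empirical mapping from average attenuation coefficient to scatter fraction; the coefficients and data are invented for illustration, not the paper's fit:

        import numpy as np

        def predicted_scatter_fraction(mu_avg: float) -> float:
            """Hypothetical empirical transformation SF = f(mu_avg) from phantom fits."""
            return 0.1 + 2.5 * mu_avg          # illustrative linear fit, not from the paper

        def scale_sss(sss_profile: np.ndarray, prompts: np.ndarray, mu_avg: float) -> np.ndarray:
            """Scale the SSS shape so scatter amounts to SF of total measured counts."""
            sf = predicted_scatter_fraction(mu_avg)
            target_scatter = sf * prompts.sum()
            return sss_profile * (target_scatter / sss_profile.sum())

        prompts = np.array([120.0, 340.0, 560.0, 330.0, 110.0])   # measured projection bins
        sss_shape = np.array([1.0, 2.0, 3.0, 2.0, 1.0])           # unscaled SSS estimate
        print(scale_sss(sss_shape, prompts, mu_avg=0.096).round(1))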

  3. SU-E-I-07: An Improved Technique for Scatter Correction in PET

    Energy Technology Data Exchange (ETDEWEB)

    Lin, S; Wang, Y; Lue, K; Lin, H; Chuang, K [National Tsing Hua University, Hsinchu, Taiwan (China)

    2014-06-01

    Purpose: In positron emission tomography (PET), the single scatter simulation (SSS) algorithm is widely used for scatter estimation in clinical scans. However, bias usually occurs at the essential step of scaling the computed SSS distribution to real scatter amounts by employing the scatter-only projection tail. The bias can be amplified when the scatter-only projection tail is too small, resulting in incorrect scatter correction. To this end, we propose a novel scatter calibration technique to accurately estimate the amount of scatter using a pre-determined scatter fraction (SF) function instead of the scatter-only tail information. Methods: As the SF depends on the radioactivity distribution and the attenuating material of the patient, an accurate theoretical relation cannot be devised. Instead, we constructed an empirical transformation function between SFs and average attenuation coefficients based on a series of phantom studies with different sizes and materials. From the average attenuation coefficient, the predicted SFs were calculated using the empirical transformation function. Hence, the real scatter amount can be obtained by scaling the SSS distribution with the predicted SFs. The simulation was conducted using SimSET. The Siemens Biograph™ 6 PET scanner was modeled in this study. The Software for Tomographic Image Reconstruction (STIR) was employed to estimate the scatter and reconstruct images. The EEC phantom was adopted to evaluate the performance of the proposed technique. Results: The scatter-corrected image of our method demonstrated improved image contrast over that of SSS. For the reconstructed images of our technique and SSS, the normalized standard deviations were 0.053 and 0.182, respectively; the root mean squared errors were 11.852 and 13.767, respectively. Conclusion: We have proposed an alternative method to calibrate SSS (C-SSS) to the absolute scatter amounts using SF. This method can avoid the bias caused by the insufficient

  4. Qt based control system software for Low Energy Accelerator Facility

    International Nuclear Information System (INIS)

    Basu, A.; Singh, S.; Nagraju, S.B.V.; Gupta, S.; Singh, P.

    2012-01-01

    Qt based control system software for the Low Energy Accelerator Facility (LEAF) is operational at Bhabha Atomic Research Centre (BARC), Trombay, Mumbai. LEAF is a 50 keV negative-ion electrostatic accelerator based on a SNICS ion source. The control system software uses Nokia Trolltech's Qt 4.x API. NI 6008 USB-based multifunction cards have been used for control and readback of field equipment such as power supplies, pumps, and valves. The control system follows a client-server architecture; Qt was chosen for its excellent GUI capability and platform-independent nature. The paper will describe the control system. (author)

  5. Mobile Agent-Based Software Systems Modeling Approaches: A Comparative Study

    Directory of Open Access Journals (Sweden)

    Aissam Belghiat

    2016-06-01

    Full Text Available Mobile agent-based applications are a special type of software system that takes advantage of mobile agents to provide a new, beneficial paradigm for solving multiple complex problems in several fields and areas, such as network management, e-commerce, and e-learning. At the same time, we note a lack of real applications based on this paradigm and a lack of serious evaluations of their modeling approaches. Hence, this paper provides a comparative study of modeling approaches for mobile agent-based software systems. The objective is to give the reader an overview and a thorough understanding of the work that has been done and where the gaps in the research are.

  6. A model-based software development methodology for high-end automotive components

    NARCIS (Netherlands)

    Ravanan, Mahmoud

    2014-01-01

    This report provides a model-based software development methodology for high-end automotive components. The V-model is used as a process model throughout the development of the software platform. It offers a framework that simplifies the relation between requirements, design, implementation,

  7. Software Atom: An approach towards software components structuring to improve reusability

    Directory of Open Access Journals (Sweden)

    Muhammad Hussain Mughal

    2017-12-01

    Full Text Available The diversity of application domains compels the design of a sustainable classification scheme for a significantly growing software repository. Atomic reusable software components are articulated to improve software component reusability in a volatile industry. Numerous approaches to software classification have been proposed over past decades, each with limitations related to coupling and cohesion. In this paper, we propose a novel approach that constitutes software from radical functionalities to improve reusability. We analyze the semantics of elements in the Periodic Table used in chemistry to design our classification approach, present it using tree-based classification to curtail the search-space complexity of the software repository, and refine it further with semantic search techniques. We developed a Globally Unique Identifier (GUID) for indexing functions and related components, and we exploit the correlation between chemical elements and software elements to simulate a one-to-one mapping between them. Inspired by the periodic table, we propose a Software Periodic Table (SPT) representing atomic software components extracted from real application software. Based on the classified SPT repository, tree parsing and extraction enable users to program their software by customizing the ingredients of their requirements. The classified repository of software ingredients helps users convey their requirements to software engineers, and enables requirement engineers to develop rapid large-scale prototypes. Furthermore, we predict the usability of the categorized repository based on user feedback. The continuous evolution of the proposed repository will be fine-tuned based on utilization, and the SPT will be gradually optimized by ant colony optimization techniques, ultimately helping to automate the software development process.

  8. Graph-based software specification and verification

    NARCIS (Netherlands)

    Kastenberg, H.

    2008-01-01

    The (in)correct functioning of many software systems heavily influences the way we qualify our daily lives. Software companies as well as academic computer science research groups spend much effort on applying and developing techniques for improving the correctness of software systems. In this

  9. Analyzing asteroid reflectance spectra with numerical tools based on scattering simulations

    Science.gov (United States)

    Penttilä, Antti; Väisänen, Timo; Markkanen, Johannes; Martikainen, Julia; Gritsevich, Maria; Muinonen, Karri

    2017-04-01

    We are developing a set of numerical tools that can be used in analyzing the reflectance spectra of granular materials such as the regolith surface of atmosphereless Solar system objects. Our goal is to be able to explain, with realistic numerical scattering models, the spectral features arising when materials are intimately mixed together. We include the space-weathering -type effects in our simulations, i.e., mixing host mineral locally with small inclusions of another material in small proportions. Our motivation for this study comes from the present lack of such tools. The current common practice is to apply a semi-physical approximate model such as some variation of Hapke models [e.g., 1] or the Shkuratov model [2]. These models are expressed in a closed form so that they are relatively fast to apply. They are based on simplifications on the radiative transfer theory. The problem is that the validity of the model is not always guaranteed, and the derived physical properties related to particle scattering properties can be unrealistic [3]. We base our numerical tool into a chain of scattering simulations. Scattering properties of small inclusions inside an absorbing host matrix can be derived using exact methods solving the Maxwell equations of the system. The next step, scattering by a single regolith grain, is solved using a geometrical optics method accounting for surface reflections, internal absorption, and possibly the internal diffuse scattering. The third step involves the radiative transfer simulations of these regolith grains in a macroscopic planar element. The chain can be continued next with shadowing simulation over the target surface elements, and finally by integrating the bidirectional reflectance distribution function over the object's shape. Most of the tools in the proposed chain already exist, and one practical task for us is to tie these together into an easy-to-use toolchain that can be publicly distributed. We plan to open the

  10. A software radio platform based on ARM and FPGA

    Directory of Open Access Journals (Sweden)

    Yang Xin.

    2016-01-01

    Full Text Available The rapid rise in computational performance offered by computer systems has greatly increased the number of practical software radio applications. This paper presents a software radio platform based on ARM and FPGA. The FPGA works as a coprocessor together with the ARM, which serves as the core processor. The ARM is used for digital signal processing and real-time data transmission, and the FPGA is used for synchronous timing control and serial-parallel conversion. An SPI driver for real-time data transmission between the ARM and FPGA under an ARM-Linux system is provided. By adopting a modular design, the software radio platform is capable of implementing wireless communication functions and satisfies the requirements of a real-time signal processing platform for high security and broad applicability.
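
    A minimal sketch of SPI data exchange from user space on ARM-Linux, using the py-spidev binding in Python; the bus/device numbers, clock rate, and frame format are assumptions for illustration, not the paper's actual driver:

        import spidev

        spi = spidev.SpiDev()
        spi.open(0, 0)                 # bus 0, chip-select 0 (assumed wiring to the FPGA)
        spi.max_speed_hz = 1_000_000   # 1 MHz clock, illustrative
        spi.mode = 0                   # SPI mode 0 (CPOL=0, CPHA=0), assumed

        tx = [0xA5, 0x01, 0x02, 0x03]  # example command/data frame
        rx = spi.xfer2(tx)             # full-duplex transfer with the FPGA
        print(rx)
        spi.close()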

  11. A new entropy based method for computing software structural complexity

    CERN Document Server

    Roca, J L

    2002-01-01

    In this paper, a new methodology for the evaluation of software structural complexity is described. It is based on the entropy evaluation of the random uniform response function associated with the so-called software characteristic function (SCF). The behavior of the SCF for different software structures and its relationship with the number of inherent errors is investigated. It is also investigated how the entropy concept can be used to evaluate the complexity of a software structure, considering the SCF as a canonical representation of the graph associated with the control flow diagram. The functions, parameters, and algorithms that allow carrying out this evaluation are also introduced. After this analytic phase follows the experimental phase, verifying the consistency of the proposed metric and its boundary conditions. The conclusion is that the degree of software structural complexity can be measured as the entropy of the random uniform response function of the SCF. That entropy is in direct relation...
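
    A minimal sketch of a Shannon-entropy computation over a distribution derived from a control-flow graph, in Python; the out-degree distribution used here is an illustrative stand-in for the paper's SCF response function, not its actual definition:

        import math
        from collections import Counter

        # Hypothetical control-flow graph as an adjacency list.
        cfg = {"A": ["B", "C"], "B": ["D"], "C": ["D", "E"], "D": ["E"], "E": []}

        # Distribution over out-degrees as a crude structural signature.
        degrees = Counter(len(v) for v in cfg.values())
        total = sum(degrees.values())
        probs = [count / total for count in degrees.values()]

        entropy = -sum(p * math.log2(p) for p in probs)
        print(f"structural entropy: {entropy:.3f} bits")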

  12. Software architecture analysis tool : software architecture metrics collection

    NARCIS (Netherlands)

    Muskens, J.; Chaudron, M.R.V.; Westgeest, R.

    2002-01-01

    The Software Engineering discipline lacks the ability to evaluate software architectures. Here we describe a tool for software architecture analysis that is based on metrics. Metrics can be used to detect possible problems and bottlenecks in software architectures. Even though metrics do not give a

  13. Certain theories of multiple scattering in random media of discrete scatterers

    International Nuclear Information System (INIS)

    Olsen, R.L.; Kharadly, M.M.Z.; Corr, D.G.

    1976-01-01

    New information is presented on the accuracy of the heuristic approximations in two important theories of multiple scattering in random media of discrete scatterers: Twersky's ''free-space'' and ''two-space scatterer'' formalisms. Two complementary approaches, based primarily on a one-dimensional model and the one-dimensional forms of the theories, are used. For scatterer distributions of low average density, the ''heuristic'' asymptotic forms for the coherent field and the incoherent intensity are compared with asymptotic forms derived from a systematic analysis of the multiple scattering processes. For distributions of higher density, both in the average number of scatterers per wavelength and in the degree of packing of finite-size scatterers, the analysis is carried out ''experimentally'' by means of a Monte Carlo computer simulation. Approximate series expressions based on the systematic approach are numerically evaluated along with the heuristic expressions. The comparison (for both forward- and back-scattered field moments) is made for the worst-case conditions of strong multiple scattering for which the theories have not previously been evaluated. Several significant conclusions are drawn which have certain practical implications: in application of the theories to describe some of the scattering phenomena which occur in the troposphere, and in the further evaluation of the theories using experiments on physical models
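
    A Monte Carlo check of a one-dimensional multiple-scattering model, of the kind used in the record above, can be set up in a few lines. The sketch below is a generic toy version (exponential free paths in optical depth, a fixed single-scattering albedo, a crude forward/backward phase function), not the authors' simulation.

        import numpy as np

        rng = np.random.default_rng(0)

        def slab_monte_carlo(n_photons=20_000, optical_depth=2.0,
                             albedo=0.9, p_forward=0.7):
            # Track photons through a slab; tally transmission and reflection.
            transmitted = reflected = 0
            for _ in range(n_photons):
                tau, direction = 0.0, +1
                while True:
                    tau += direction * rng.exponential(1.0)  # free path
                    if tau >= optical_depth:
                        transmitted += 1
                        break
                    if tau <= 0.0:
                        reflected += 1
                        break
                    if rng.random() > albedo:                # absorbed
                        break
                    if rng.random() > p_forward:             # back-scattered
                        direction = -direction
            return transmitted / n_photons, reflected / n_photons

        print(slab_monte_carlo())   # (transmittance, reflectance) estimates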

  14. Tropospheric nitrogen dioxide inversions based on spectral measurements of scattered sunlight

    NARCIS (Netherlands)

    Vlemmix, T.

    2011-01-01

    This thesis describes the development of inversion methods for tropospheric nitrogen dioxide (NO2), based on ground based observations of scattered sunlight with themulti-axis differential optical absorption spectroscopy (MAX-DOAS) technique. NO2 is an atmospheric trace gas which, when present near

  15. Electron-longitudinal-acoustic-phonon scattering in double-quantum-dot based quantum gates

    International Nuclear Information System (INIS)

    Zhao Peiji; Woolard, Dwight L.

    2008-01-01

    We propose a nanostructure design which can significantly suppress longitudinal-acoustic-phonon-electron scattering in double-quantum-dot based quantum gates for quantum computing. The calculated relaxation rates vs. bias voltage exhibit a double-peak feature with a minimum approaching 10^5 s^-1. In this design, the energy conservation law prohibits scattering contributions from phonons with large momenta; furthermore, increasing the barrier height between the double quantum dots reduces the coupling strength between the dots. Hence, the joint action of the energy conservation law and the decoupling greatly reduces the scattering rates. The degrading effects of temperature can be reduced simply by increasing the height of the barrier between the dots

  16. A Component-based Software Development and Execution Framework for CAx Applications

    Directory of Open Access Journals (Sweden)

    N. Matsuki

    2004-01-01

    Full Text Available Digitalization of the manufacturing process and technologies is regarded as the key to increased competitive ability. The MZ-Platform infrastructure is a component-based software development framework, designed for supporting enterprises to enhance digitalized technologies using software tools and CAx components in a self-innovative way. In the paper we show the algorithm, system architecture, and a CAx application example on MZ-Platform. We also propose a new parametric data structure based on MZ-Platform.

  17. Development of a practical image-based scatter correction method for brain perfusion SPECT: comparison with the TEW method

    International Nuclear Information System (INIS)

    Shidahara, Miho; Kato, Takashi; Kawatsu, Shoji; Yoshimura, Kumiko; Ito, Kengo; Watabe, Hiroshi; Kim, Kyeong Min; Iida, Hidehiro; Kato, Rikio

    2005-01-01

    An image-based scatter correction (IBSC) method was developed to convert scatter-uncorrected into scatter-corrected SPECT images. The purpose of this study was to validate this method by means of phantom simulations and human studies with 99mTc-labeled tracers, based on comparison with the conventional triple energy window (TEW) method. The IBSC method corrects scatter on the reconstructed image I_AC^μb with Chang's attenuation correction factor. The scatter component image is estimated by convolving I_AC^μb with a scatter function followed by multiplication with an image-based scatter fraction function. The IBSC method was evaluated with Monte Carlo simulations and 99mTc-ethyl cysteinate dimer SPECT human brain perfusion studies obtained from five volunteers. The image counts and contrast of the scatter-corrected images obtained by the IBSC and TEW methods were compared. Using data obtained from the simulations, the image counts and contrast of the scatter-corrected images obtained by the IBSC and TEW methods were found to be nearly identical for both gray and white matter. In human brain images, no significant differences in image contrast were observed between the IBSC and TEW methods. The IBSC method is a simple scatter correction technique feasible for use in clinical routine. (orig.)
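
    The IBSC recipe described here reduces to a convolution and a subtraction on the reconstructed image. A minimal numpy/scipy sketch is given below; the Gaussian kernel width and the constant scatter fraction are stand-ins, since the paper's actual scatter function and fraction map are not given in the abstract.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def ibsc_correct(img_ac, kernel_mm, pixel_mm, scatter_fraction):
            # Scatter estimate = (I_AC^mu_b convolved with a scatter function),
            # scaled by a scatter-fraction term; corrected = original - estimate.
            scatter_est = gaussian_filter(img_ac, sigma=kernel_mm / pixel_mm)
            return img_ac - scatter_fraction * scatter_est

        # Hypothetical 128x128 brain slice with Poisson counting noise.
        img = np.random.poisson(100.0, size=(128, 128)).astype(float)
        corrected = ibsc_correct(img, kernel_mm=20.0, pixel_mm=3.9,
                                 scatter_fraction=0.35)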

  18. Development of a practical image-based scatter correction method for brain perfusion SPECT: comparison with the TEW method.

    Science.gov (United States)

    Shidahara, Miho; Watabe, Hiroshi; Kim, Kyeong Min; Kato, Takashi; Kawatsu, Shoji; Kato, Rikio; Yoshimura, Kumiko; Iida, Hidehiro; Ito, Kengo

    2005-10-01

    An image-based scatter correction (IBSC) method was developed to convert scatter-uncorrected into scatter-corrected SPECT images. The purpose of this study was to validate this method by means of phantom simulations and human studies with 99mTc-labeled tracers, based on comparison with the conventional triple energy window (TEW) method. The IBSC method corrects scatter on the reconstructed image I_AC^μb with Chang's attenuation correction factor. The scatter component image is estimated by convolving I_AC^μb with a scatter function followed by multiplication with an image-based scatter fraction function. The IBSC method was evaluated with Monte Carlo simulations and 99mTc-ethyl cysteinate dimer SPECT human brain perfusion studies obtained from five volunteers. The image counts and contrast of the scatter-corrected images obtained by the IBSC and TEW methods were compared. Using data obtained from the simulations, the image counts and contrast of the scatter-corrected images obtained by the IBSC and TEW methods were found to be nearly identical for both gray and white matter. In human brain images, no significant differences in image contrast were observed between the IBSC and TEW methods. The IBSC method is a simple scatter correction technique feasible for use in clinical routine.

  19. Integrating Design Decision Management with Model-based Software Development

    DEFF Research Database (Denmark)

    Könemann, Patrick

    Design decisions are continuously made during the development of software systems and are important artifacts for design documentation. Dedicated decision management systems are often used to capture such design knowledge. Most such systems are, however, separated from the design artifacts...... of the system. In model-based software development, where design models are used to develop a software system, outcomes of many design decisions have big impact on design models. The realization of design decisions is often manual and tedious work on design models. Moreover, keeping design models consistent......, or by ignoring the causes. This substitutes manual reviews to some extent. The concepts, implemented in a tool, have been validated with design patterns, refactorings, and domain level tests that comprise a replay of a real project. This proves the applicability of the solution to realistic examples...

  20. Software Engineering Environment for Component-based Design of Embedded Software

    DEFF Research Database (Denmark)

    Guo, Yu

    2010-01-01

    as well as application models in a computer-aided software engineering environment. Furthermore, component models have been realized following carefully developed design patterns, which provide for an efficient and reusable implementation. The components have been ultimately implemented as prefabricated...... executable objects that can be linked together into an executable application. The development of embedded software using the COMDES framework is supported by the associated integrated engineering environment consisting of a number of tools, which support basic functionalities, such as system modelling......, validation, and executable code generation for specific hardware platforms. Developing such an environment and the associated tools is a highly complex engineering task. Therefore, this thesis has investigated key design issues and analysed existing platforms supporting model-driven software development...

  1. Use of Cloud-Based Graphic Narrative Software in Medical Ethics Teaching

    Science.gov (United States)

    Weber, Alan S.

    2015-01-01

    Although used as a common pedagogical tool in K-12 education, online graphic narrative ("comics") software has not generally been incorporated into advanced professional or technical education. This contribution reports preliminary data from a study on the use of cloud-based graphics software Pixton.com to teach basic medical ethics…

  2. Analysis of an atom laser based on the spatial control of the scattering length

    International Nuclear Information System (INIS)

    Carpentier, Alicia V.; Michinel, Humberto; Rodas-Verde, Maria I.; Perez-Garcia, Victor M.

    2006-01-01

    In this paper we analyze atom lasers based on the spatial modulation of the scattering length of a Bose-Einstein condensate. We demonstrate, through numerical simulations and approximate analytical methods, the controllable emission of matter-wave bursts and study the dependence of the process on the spatial shape of the scattering length along the axis of emission. We also study the role of an additional modulation of the scattering length in time
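
    The setup in this record (a condensate spilling through a region where the scattering length is modulated in space) can be explored with a split-step integration of the one-dimensional Gross-Pitaevskii equation. The sketch below uses dimensionless units and made-up trap and nonlinearity profiles; it reproduces the general idea, not the paper's parameters.

        import numpy as np

        N, L, dt, steps = 1024, 40.0, 1e-3, 2000
        x = np.linspace(-L / 2, L / 2, N, endpoint=False)
        k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)

        V = 0.5 * x**2 * (x < 0)            # trap only on the left half
        g = np.where(x < 0, 1.0, -0.5)      # scattering length changes sign at x = 0
        psi = np.exp(-(x + 5.0) ** 2)       # condensate starts inside the trap
        psi = psi / np.sqrt(np.trapz(np.abs(psi) ** 2, x))

        half_kin = np.exp(-0.25j * dt * k**2)   # half-step kinetic propagator
        for _ in range(steps):                  # Strang splitting
            psi = np.fft.ifft(half_kin * np.fft.fft(psi))
            psi *= np.exp(-1j * dt * (V + g * np.abs(psi) ** 2))
            psi = np.fft.ifft(half_kin * np.fft.fft(psi))

        # |psi|^2 on the right half then shows the emitted matter-wave bursts.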

  3. Perchlorate Detection at Nanomolar Concentrations by Surface-Enhanced Raman Scattering

    Science.gov (United States)

    2009-01-01

    ...grooves/mm grating light path controlled by Renishaw WiRE software and analyzed by Galactic GRAMS software... Perchlorate (ClO4-) has emerged as a widespread environmental contaminant and has been detected in various food... by means of dynamic light scattering using a ZetaPlus particle size analyzer (Brookhaven Instruments, Holtsville, NY). Data were collected for every

  4. FUZZY LOGIC BASED SOFTWARE PROCESS IMPROVIZATION FRAMEWORK FOR INDIAN SMALL SCALE SOFTWARE ORGANIZATIONS

    OpenAIRE

    A.M.Kalpana; Dr.A.Ebenezer Jeyakumar

    2010-01-01

    In this paper, the authors elaborate the results obtained after analyzing and assessing the software process activities in five small to medium sized Indian software companies. This work demonstrates a cost effective framework for software process appraisal, specifically targeted at Indian software Small-to-Medium-sized Enterprises (SMEs). Improvisation deals with the unforeseen. It involves continual experimentation with new possibilities to create innovative and improved solutions outside cu...

  5. Global Software Engineering: A Software Process Approach

    Science.gov (United States)

    Richardson, Ita; Casey, Valentine; Burton, John; McCaffery, Fergal

    Our research has shown that many companies are struggling with the successful implementation of global software engineering, due to temporal, cultural and geographical distance, which causes a range of factors to come into play. For example, cultural, project management and communication difficulties continually cause problems for software engineers and project managers. While the implementation of efficient software processes can be used to improve the quality of the software product, published software process models do not cater explicitly for the recent growth in global software engineering. Our thesis is that global software engineering factors should be included in software process models to ensure their continued usefulness in global organisations. Based on extensive global software engineering research, we have developed a software process, Global Teaming, which includes specific practices and sub-practices. The purpose is to ensure that requirements for successful global software engineering are stipulated so that organisations can ensure successful implementation of global software engineering.

  6. Scattering Fields Control by Metamaterial Device Based on Ultra-Broadband Polarization Converters

    Directory of Open Access Journals (Sweden)

    Si-Jia Li

    2016-12-01

    Full Text Available We propose a novel ultra-broadband metamaterial screen for controlling electromagnetic scattering fields, based on a three-layer wideband polarization converter (TLW-PC). The unit cell of the TLW-PC is composed of a three-layer substrate loaded with a double metallic split-ring structure and a metal ground plane. We observed that the polarization converter performs ultra-broadband cross-polarization conversion from 5.71 GHz to 14.91 GHz. Furthermore, a metamaterial screen with low scattering characteristics was realized using an orthogonal array based on the TLW-PC. The near-field scattering is controlled through the change of phase and amplitude of the incident wave. The metamaterial screen exhibits significantly low scattering characteristics from 5.81 GHz to 15.06 GHz. To demonstrate the design, a metamaterial device, easily implemented by the common printed circuit board method, was fabricated and measured. Experimental results agreed well with the simulated results.

  7. New software for neutron data reduction and visualization

    International Nuclear Information System (INIS)

    Worlton, T.; Chatterjee, A.; Hammonds, J.; Chen, D.; Loong, C.K.; Mikkelson, D.; Mikkelson, R.

    2001-01-01

    Development of advanced neutron sources and instruments has necessitated corresponding advances in software for neutron scattering data reduction and visualization. New sources produce datasets more rapidly, and new instruments produce large numbers of spectra. Because of the shorter collection times, users are able to make more measurements on a given sample. This rapid production of datasets requires that users be able to reduce and analyze data quickly to prevent a data bottleneck. In addition, the new sources and instruments are accommodating more users with less neutron-scattering specific expertise, which requires software that is easy to use and freely available. We have developed an Integrated Spectral Analysis Workbench (ISAW) software package to permit the rapid reduction and visualization of neutron data. It can handle large numbers of spectra and merge data from separate measurements. The data can be sorted according to any attribute and transformed in numerous ways. ISAW provides several views of the data that enable users to compare spectra and observe trends in the data. A command interpreter, which is now part of ISAW, allows scientists to easily set up a series of instrument-specific operations to reduce and visualize data automatically. ISAW is written entirely in Java to permit portability to different computer platforms and easy distribution of the software. The software was constructed using modern computer design methods to allow easy customization and improvement. ISAW currently only reads data from IPNS 'run' files, but work is underway to provide input of NeXus files. (author)

  8. New software for neutron data reduction and visualization

    Energy Technology Data Exchange (ETDEWEB)

    Worlton, T.; Chatterjee, A.; Hammonds, J.; Chen, D.; Loong, C.K. [Argonne National Laboratory, Argonne, IL (United States); Mikkelson, D.; Mikkelson, R. [Univ. of Wisconsin-Stout, Menomonie, WI (United States)

    2001-03-01

    Development of advanced neutron sources and instruments has necessitated corresponding advances in software for neutron scattering data reduction and visualization. New sources produce datasets more rapidly, and new instruments produce large numbers of spectra. Because of the shorter collection times, users are able to make more measurements on a given sample. This rapid production of datasets requires that users be able to reduce and analyze data quickly to prevent a data bottleneck. In addition, the new sources and instruments are accommodating more users with less neutron-scattering specific expertise, which requires software that is easy to use and freely available. We have developed an Integrated Spectral Analysis Workbench (ISAW) software package to permit the rapid reduction and visualization of neutron data. It can handle large numbers of spectra and merge data from separate measurements. The data can be sorted according to any attribute and transformed in numerous ways. ISAW provides several views of the data that enable users to compare spectra and observe trends in the data. A command interpreter, which is now part of ISAW, allows scientists to easily set up a series of instrument-specific operations to reduce and visualize data automatically. ISAW is written entirely in Java to permit portability to different computer platforms and easy distribution of the software. The software was constructed using modern computer design methods to allow easy customization and improvement. ISAW currently only reads data from IPNS 'run' files, but work is underway to provide input of NeXus files. (author)

  9. Software-Based Student Response Systems: An Interdisciplinary Initiative

    Science.gov (United States)

    Fischer, Carol M.; Hoffman, Michael S.; Casey, Nancy C.; Cox, Maureen P.

    2015-01-01

    Colleagues from information technology and three academic departments collaborated on an instructional technology initiative to employ student response systems in classes in mathematics, accounting and education. The instructors assessed the viability of using software-based systems to enable students to use their own devices (cell phones,…

  10. SU-F-I-43: A Software-Based Statistical Method to Compute Low Contrast Detectability in Computed Tomography Images

    Energy Technology Data Exchange (ETDEWEB)

    Chacko, M; Aldoohan, S [University of Oklahoma Health Sciences Center, Oklahoma City, OK (United States)

    2016-06-15

    Purpose: The low contrast detectability (LCD) of a CT scanner is its ability to detect and display faint lesions. The current approach to quantifying LCD uses vendor-specific methods and phantoms, typically by subjectively observing the smallest object at a contrast level above the phantom background. However, this approach does not yield clinically applicable values for LCD. The current study proposes a statistical LCD metric using software tools not only to assess scanner performance, but also to quantify the key factors affecting LCD. This approach was developed using uniform QC phantoms, and its applicability was then extended under simulated clinical conditions. Methods: MATLAB software was developed to compute LCD using a uniform image of a QC phantom. For a given virtual object size, the software randomly samples the image within a selected area, and uses statistical analysis based on Student's t-distribution to compute the LCD as the minimal Hounsfield Units that can be distinguished from the background at the 95% confidence level. Its validity was assessed by comparison with the behavior of a known QC phantom under various scan protocols and a tissue-mimicking phantom. The contributions of beam quality and scattered radiation to the computed LCD were quantified by using various external beam-hardening filters and phantom lengths. Results: As expected, the LCD was inversely related to object size under all scan conditions. The type of image reconstruction kernel filter and tissue/organ type strongly influenced the background noise characteristics and therefore the computed LCD for the associated image. Conclusion: The proposed metric and its associated software tools are vendor-independent and can be used to analyze any scanner's LCD performance. Furthermore, the method employed can be used in conjunction with the relationships established in this study between LCD and tissue type to extend these concepts to patients' clinical CT
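
    The statistical core of the proposed metric (random sampling plus a Student's-t threshold) can be sketched independently of the authors' MATLAB tool. The sampling scheme and parameter values below are assumptions; only the general idea follows the abstract.

        import numpy as np
        from scipy import stats

        def lcd_statistical(image_hu, roi_pixels, n_samples=1000, confidence=0.95):
            # Sample many square ROIs of the given size from a uniform phantom
            # image; the LCD is the smallest mean-HU offset separable from the
            # background at the requested confidence level.
            rng = np.random.default_rng(0)
            h, w = image_hu.shape
            side = int(np.sqrt(roi_pixels))
            means = [image_hu[r:r + side, c:c + side].mean()
                     for r, c in zip(rng.integers(0, h - side, n_samples),
                                     rng.integers(0, w - side, n_samples))]
            t_crit = stats.t.ppf(confidence, df=n_samples - 1)
            return t_crit * np.std(means, ddof=1)

        phantom = np.random.normal(0.0, 8.0, (512, 512))  # uniform, ~8 HU noise
        print(lcd_statistical(phantom, roi_pixels=25))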

  11. A new entropy based method for computing software structural complexity

    International Nuclear Information System (INIS)

    Roca, Jose L.

    2002-01-01

    In this paper a new methodology for the evaluation of software structural complexity is described. It is based on the entropy evaluation of the random uniform response function associated with the so-called software characteristic function (SCF). The behavior of the SCF with different software structures and its relationship with the number of inherent errors is investigated. It is also investigated how the entropy concept can be used to evaluate the complexity of a software structure, considering the SCF as a canonical representation of the graph associated with the control flow diagram. The functions, parameters and algorithms that allow this evaluation to be carried out are also introduced. This analytic phase is followed by an experimental phase, verifying the consistency of the proposed metric and its boundary conditions. The conclusion is that the degree of software structural complexity can be measured as the entropy of the random uniform response function of the SCF. That entropy is directly related to the number of inherent software errors and implies a basic hazard failure rate, so that a minimal structure assures a certain stability and maturity of the program. This metric can be used either to evaluate the product or the process of software development, as a development tool or for monitoring the stability and the quality of the final product. (author)
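
    The entropy step of the metric can be illustrated without the full SCF machinery. Below, each distinct control-flow path a test exercises is treated as one outcome of the response function, and the Shannon entropy of the empirical path distribution is computed; the path encoding is invented for the example.

        import math
        from collections import Counter

        def structural_entropy(observed_paths):
            # Shannon entropy of the empirical distribution of control-flow
            # paths -- the quantity the metric associates with complexity.
            counts = Counter(observed_paths)
            total = sum(counts.values())
            return -sum((c / total) * math.log2(c / total)
                        for c in counts.values())

        # Hypothetical traces identified by the branch outcomes they took.
        traces = ["TT", "TF", "TT", "FT", "TT", "FF", "TF", "TT"]
        print(f"H = {structural_entropy(traces):.3f} bits")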

  12. [The Development and Application of the Orthopaedics Implants Failure Database Software Based on WEB].

    Science.gov (United States)

    Huang, Jiahua; Zhou, Hai; Zhang, Binbin; Ding, Biao

    2015-09-01

    This article describes a new WEB-based failure database software package for orthopaedic implants. The software is based on the B/S mode; ASP dynamic web technology is used as the main development language to achieve data interactivity, and Microsoft Access is used to create the database. These mature technologies make the software easy to extend and upgrade. The design and development ideas behind the software, its working process and functions, and its relevant technical features are presented. With this software, many different types of failure events of orthopaedic implants can be stored, the failure data can be statistically analyzed, and, at the macroscopic level, it can be used to evaluate the reliability of orthopaedic implants and operations and ultimately guide doctors in improving the level of clinical treatment.

  13. Sensitivity Analysis of the Scattering-Based SARBM3D Despeckling Algorithm.

    Science.gov (United States)

    Di Simone, Alessio

    2016-06-25

    Synthetic Aperture Radar (SAR) imagery greatly suffers from multiplicative speckle noise, typical of coherent image acquisition sensors, such as SAR systems. Therefore, a proper and accurate despeckling preprocessing step is almost mandatory to aid the interpretation and processing of SAR data by human users and computer algorithms, respectively. Very recently, a scattering-oriented version of the popular SAR Block-Matching 3D (SARBM3D) despeckling filter, named Scattering-Based (SB)-SARBM3D, was proposed. The new filter is based on the a priori knowledge of the local topography of the scene. In this paper, an experimental sensitivity analysis of the above-mentioned despeckling algorithm is carried out, and the main results are shown and discussed. In particular, the role of both electromagnetic and geometrical parameters of the surface and the impact of its scattering behavior are investigated. Furthermore, a comprehensive sensitivity analysis of the SB-SARBM3D filter against the Digital Elevation Model (DEM) resolution and the SAR image-DEM coregistration step is also provided. The sensitivity analysis shows a significant robustness of the algorithm against most of the surface parameters, while the DEM resolution plays a key role in the despeckling process. Furthermore, the SB-SARBM3D algorithm outperforms the original SARBM3D in the presence of the most realistic scattering behaviors of the surface. An actual scenario is also presented to assess the DEM role in real-life conditions.

  14. IMMAN: free software for information theory-based chemometric analysis.

    Science.gov (United States)

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis, denominated IMMAN (an acronym for Information theory-based CheMoMetrics ANalysis), are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface, for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches each. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon entropy is discussed, and Jeffreys' information measure is introduced for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty, are incorporated into the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing value processing, dataset partitioning, and browsing. Moreover, single-parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, and comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
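
    Of the supervised parameters the abstract lists, information gain is the simplest to reproduce. The sketch below applies the equal-interval discretization the record mentions and then computes IG = H(class) - H(class | binned feature); the data are synthetic and the bin count is an assumption.

        import numpy as np

        def entropy(labels):
            _, counts = np.unique(labels, return_counts=True)
            p = counts / counts.sum()
            return -np.sum(p * np.log2(p))

        def information_gain(feature, labels, bins=5):
            # Equal-interval discretization, then IG = H(y) - H(y | bin).
            edges = np.histogram_bin_edges(feature, bins=bins)
            binned = np.digitize(feature, edges[1:-1])
            h_cond = sum((binned == b).mean() * entropy(labels[binned == b])
                         for b in np.unique(binned))
            return entropy(labels) - h_cond

        rng = np.random.default_rng(1)
        y = rng.integers(0, 2, 200)
        x_good = y + rng.normal(0, 0.3, 200)    # descriptor correlated with class
        x_noise = rng.normal(0, 1.0, 200)       # uninformative descriptor
        print(information_gain(x_good, y), information_gain(x_noise, y))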

  15. Guideline for Bayesian Net based Software Fault Estimation Method for Reactor Protection System

    International Nuclear Information System (INIS)

    Eom, Heung Seop; Park, Gee Yong; Jang, Seung Cheol

    2011-01-01

    The purpose of this paper is to provide a preliminary guideline for the estimation of software faults in safety-critical software, for example, the software of a reactor protection system. As the fault estimation method is based on a Bayesian net, which intensively uses subjective probability and informal data, it is necessary to define a formal procedure for the method to minimize the variability of the results. The guideline describes the assumptions, limitations and uncertainties, and the products of the fault estimation method. The procedure for conducting a software fault estimation is then outlined, highlighting the major tasks involved. The contents of the guideline are based on our own experience and a review of research guidelines developed for a PSA

  16. A rule-based software test data generator

    Science.gov (United States)

    Deason, William H.; Brown, David B.; Chang, Kai-Hsiung; Cross, James H., II

    1991-01-01

    Rule-based software test data generation is proposed as an alternative to either path/predicate analysis or random data generation. A prototype rule-based test data generator for Ada programs is constructed and compared to a random test data generator. Four Ada procedures are used in the comparison. Approximately 2000 rule-based test cases and 100,000 randomly generated test cases are automatically generated and executed. The success of the two methods is compared using standard coverage metrics. Simple statistical tests show that even the primitive rule-based test data generation prototype is significantly better than random data generation. This result demonstrates that rule-based test data generation is feasible and shows great promise in assisting test engineers, especially when the rule base is developed further.

  17. Intellectual Property Protection of Software – At the Crossroads of Software Patents and Open Source Software

    OpenAIRE

    Tantarimäki, Maria

    2018-01-01

    The thesis considers the intellectual property protection of software in Europe and in the US, an increasingly important subject as the world globalizes and digitalizes. The special nature of software challenges intellectual property rights. The current protection of software is based on copyright, but in this thesis two other options are considered: software patents and open source software. Software patents provide strong protection for software, whereas the pur...

  18. Software development with two port calibration techniques for RHIC impedance measurements

    International Nuclear Information System (INIS)

    Mane, V.; Shea, T.

    1993-01-01

    The coupling impedance of accelerator devices is measured by simulating the beam with a central wire and measuring the scattering parameters of the system. The wire-pipe system forms a mismatch with the 50-ohm transmission line. An integrated software environment has been developed in LabVIEW for the Macintosh. The program measures the scattering parameters of known standards, determines the corrected scattering parameters of a device using the TRL calibration technique, and gives the impedance of the device. Its performance has been tested on several known microwave devices
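
    Once the calibrated S-parameters are in hand, the longitudinal coupling impedance follows from a standard wire-method formula. The sketch below uses the common lumped-element form Z = 2*Zc*(1 - S21)/S21 with the device transmission normalized to a reference pipe; the formula choice, line impedance and sample values are assumptions, not details from this report.

        import numpy as np

        def wire_impedance(s21_dut, s21_ref, z_line):
            # Lumped-element wire-method estimate of the coupling impedance.
            # z_line is the characteristic impedance of the wire-in-pipe line,
            # not the 50-ohm cable impedance.
            s21 = s21_dut / s21_ref        # normalize out the smooth pipe
            return 2.0 * z_line * (1.0 - s21) / s21

        # Hypothetical single-frequency S21 readings.
        z = wire_impedance(s21_dut=0.97 * np.exp(-0.02j),
                           s21_ref=0.99 * np.exp(-0.01j),
                           z_line=290.0)
        print(z)   # complex impedance in ohms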

  19. Software to model AXAF-I image quality

    Science.gov (United States)

    Ahmad, Anees; Feng, Chen

    1995-01-01

    A modular user-friendly computer program for the modeling of grazing-incidence type x-ray optical systems has been developed. This comprehensive computer software GRAZTRACE covers the manipulation of input data, ray tracing with reflectivity and surface deformation effects, convolution with x-ray source shape, and x-ray scattering. The program also includes the capabilities for image analysis, detector scan modeling, and graphical presentation of the results. A number of utilities have been developed to interface the predicted Advanced X-ray Astrophysics Facility-Imaging (AXAF-I) mirror structural and thermal distortions with the ray-trace. This software is written in FORTRAN 77 and runs on a SUN/SPARC station. An interactive command mode version and a batch mode version of the software have been developed.

  20. The Application of V&V within Reuse-Based Software Engineering

    Science.gov (United States)

    Addy, Edward

    1996-01-01

    Verification and Validation (V&V) is performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors as early as possible during the development process. Early discovery is important in order to minimize the cost and other impacts of correcting these errors. In reuse-based software engineering, decisions on the requirements, design and even implementation of domain assets can be made prior to beginning development of a specific system. In order to bring the effectiveness of V&V to bear within reuse-based software engineering, V&V must be incorporated within the domain engineering process.

  1. Development of a practical image-based scatter correction method for brain perfusion SPECT: comparison with the TEW method

    Energy Technology Data Exchange (ETDEWEB)

    Shidahara, Miho; Kato, Takashi; Kawatsu, Shoji; Yoshimura, Kumiko; Ito, Kengo [National Center for Geriatrics and Gerontology Research Institute, Department of Brain Science and Molecular Imaging, Obu, Aichi (Japan); Watabe, Hiroshi; Kim, Kyeong Min; Iida, Hidehiro [National Cardiovascular Center Research Institute, Department of Investigative Radiology, Suita (Japan); Kato, Rikio [National Center for Geriatrics and Gerontology, Department of Radiology, Obu (Japan)

    2005-10-01

    An image-based scatter correction (IBSC) method was developed to convert scatter-uncorrected into scatter-corrected SPECT images. The purpose of this study was to validate this method by means of phantom simulations and human studies with 99mTc-labeled tracers, based on comparison with the conventional triple energy window (TEW) method. The IBSC method corrects scatter on the reconstructed image I_AC^μb with Chang's attenuation correction factor. The scatter component image is estimated by convolving I_AC^μb with a scatter function followed by multiplication with an image-based scatter fraction function. The IBSC method was evaluated with Monte Carlo simulations and 99mTc-ethyl cysteinate dimer SPECT human brain perfusion studies obtained from five volunteers. The image counts and contrast of the scatter-corrected images obtained by the IBSC and TEW methods were compared. Using data obtained from the simulations, the image counts and contrast of the scatter-corrected images obtained by the IBSC and TEW methods were found to be nearly identical for both gray and white matter. In human brain images, no significant differences in image contrast were observed between the IBSC and TEW methods. The IBSC method is a simple scatter correction technique feasible for use in clinical routine. (orig.)

  2. Mass market development strategies of software industries: Case study based research

    Directory of Open Access Journals (Sweden)

    Varun Gupta

    2016-09-01

    Full Text Available Success in competitive mass market software development depends on the quality of the software developed and the market segments targeted. Market segments are characterized by uncertainties arising from "newness" and "turbulence", making software success stochastic in nature. Selecting good market segments and delivering high-quality software versions in less time than competitors result in increasing demand in the markets and, ultimately, revenues. An enhanced customer base benefits the current product as well as the industry's future products, in the form of increased reputation and increased involvement of customers in future development. The case study was conducted with 13 representatives, drawing on the experiences of 14 mass market projects. Results indicate that software solutions are delivered to few investors or in highly competitive markets, according to the survey findings from the marketing departments. Software organizations are reluctant to deliver relatively complex solutions in new markets unless and until they are strongly convinced of probable success. A method for selecting market segments, belonging to new and existing markets, in which to undertake software delivery is also proposed in this paper. The model will help the software industry decide on the market segments and high-level abstract features that could increase the probability of software success. Poor selection of markets, or targeting markets of improper size, affects the market share of the industry to a great extent.

  3. Evolution of the transfer function characterization of surface scatter phenomena

    Science.gov (United States)

    Harvey, James E.; Pfisterer, Richard N.

    2016-09-01

    Based upon the empirical observation that BRDF measurements of smooth optical surfaces exhibited shift-invariant behavior when plotted versus β − β0, the original Harvey-Shack (OHS) surface scatter theory was developed as a scalar linear systems formulation in which scattered light behavior was characterized by a surface transfer function (STF) reminiscent of the optical transfer function (OTF) of modern image formation theory (1976). This shift-invariant behavior, combined with the inverse power law behavior observed when plotting log BRDF versus log(β − β0), was quickly incorporated into several optical analysis software packages. Although there was no explicit smooth-surface approximation in the OHS theory, there was a limitation on both the incident and scattering angles. In 1988 the modified Harvey-Shack (MHS) theory removed the limitation on the angle of incidence; however, a moderate-angle scattering limitation remained. Clearly, for large incident angles the BRDF was no longer shift-invariant, as a different STF was now required for each incident angle. In 2011 the generalized Harvey-Shack (GHS) surface scatter theory, characterized by a two-parameter family of STFs, evolved into a practical modeling tool to calculate BRDFs from optical surface metrology data for situations that violate the smooth-surface approximation inherent in the Rayleigh-Rice theory and/or the moderate-angle limitation of the Beckmann-Kirchhoff theory. And finally, the STF can be multiplied by the classical OTF to provide a complete linear systems formulation of image quality as degraded by diffraction, geometrical aberrations and surface scatter effects from residual optical fabrication errors.
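
    The shift invariance that motivated the OHS theory is easy to state in code: the BRDF depends on geometry only through β − β0, the difference of direction cosines (β = sin θs, β0 = sin θi). The sketch below evaluates an inverse-power-law BRDF of that form; the roll-off constant and exponent are illustrative, since fitted values would come from measurements.

        import numpy as np

        def harvey_brdf(theta_s_deg, theta_i_deg, b0=1e3, s=2.5, l=0.01):
            # Shift-invariant inverse-power-law BRDF: a function of
            # |beta - beta0| = |sin(theta_s) - sin(theta_i)| only.
            beta = np.sin(np.radians(theta_s_deg)) - np.sin(np.radians(theta_i_deg))
            return b0 * (1.0 + (np.abs(beta) / l) ** 2) ** (-s / 2)

        # Curves for different incidence angles overlay when plotted against
        # beta - beta0 -- the empirical behavior the transfer function encodes.
        theta_s = np.linspace(-80.0, 80.0, 9)
        for theta_i in (0.0, 30.0, 60.0):
            print(theta_i, harvey_brdf(theta_s, theta_i))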

  4. Equipment and software for the experiment on polarized proton scattering on hydrogen and nuclei

    International Nuclear Information System (INIS)

    Buklej, A.E.; Govorun, N.N.; Zhurkin, V.V.

    1980-01-01

    An installation for polarization measurements on a beam of polarized protons with a momentum of 2.1 GeV/c at the ITEP synchrotron is described. The installation is designed to measure the polarization in elastic pp scattering and the asymmetry in summed (elastic plus inelastic without meson production) scattering of polarized protons on nuclei at angles up to 180 mrad, as well as the polarization in elastic pn scattering. The installation consists of 18 two-coordinate magnetostriction wire spark chambers, emitting counters, a system of veto counters surrounding the target, a liquid hydrogen (or deuterium) target, and a magnet for the momentum analysis of scattered particles in background measurements. Primary processing of the data is carried out with modernized programs on the M-220 and BESM-6 computers. With the experimental installation described, asymmetry measurements on hydrogen, Li, C, Al and Ca have been carried out. The prospect of using the described method to separate elastic reactions in the range of very small momentum transfer, where the background of inelastic interactions can be reduced to a negligibly low level, for precise measurement of elastic cross sections and for the study of polarization phenomena in the region of Coulomb interference, is emphasized [ru

  5. Simple smoothing technique to reduce data scattering in physics experiments

    International Nuclear Information System (INIS)

    Levesque, L

    2008-01-01

    This paper describes an experiment involving motorized motion and a method to reduce data scattering from data acquisition. Jitter or minute instrumental vibrations add noise to a detected signal, which often renders small modulations of a graph very difficult to interpret. Here we describe a method to reduce scattering amongst data points from the signal measured by a photodetector that is motorized and scanned in a direction parallel to the plane of a rectangular slit during a computer-controlled diffraction experiment. The smoothing technique is investigated using subsets of many data points from the data acquisition. A limit for the number of data points in a subset is determined from the results based on the trend of the small measured signal to avoid severe changes in the shape of the signal from the averaging procedure. This simple smoothing method can be achieved using any type of spreadsheet software
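
    The smoothing described here is, at heart, a moving average over small subsets of adjacent points. A minimal version is below; the subset-size cap the paper derives from the signal's trend is represented only by a comment, and the test signal is a made-up slit-diffraction-like curve.

        import numpy as np

        def smooth(signal, subset_size):
            # Moving average over a sliding subset. Keep subset_size small
            # relative to the signal's features, or the averaging distorts
            # the very modulations being measured.
            if subset_size % 2 == 0:
                raise ValueError("use an odd subset size for a centred window")
            kernel = np.ones(subset_size) / subset_size
            return np.convolve(signal, kernel, mode="same")

        x = np.linspace(0.0, 1.0, 500)
        clean = np.sinc(8.0 * (x - 0.5)) ** 2   # slit-diffraction-like signal
        noisy = clean + np.random.default_rng(2).normal(0.0, 0.02, x.size)
        smoothed = smooth(noisy, subset_size=9)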

  6. Licensing process for safety-critical software-based systems

    Energy Technology Data Exchange (ETDEWEB)

    Haapanen, P. [VTT Automation, Espoo (Finland); Korhonen, J. [VTT Electronics, Espoo (Finland); Pulkkinen, U. [VTT Automation, Espoo (Finland)

    2000-12-01

    System vendors nowadays propose software-based technology even for the most critical safety functions in nuclear power plants. Due to the nature of software faults and the way they cause system failures, new methods are needed for the safety and reliability evaluation of these systems. In the research project 'Programmable automation systems in nuclear power plants (OHA)', financed jointly by the Radiation and Nuclear Safety Authority (STUK), the Ministry of Trade and Industry (KTM) and the Technical Research Centre of Finland (VTT), various safety assessment methods and tools for software-based systems are developed and evaluated. As part of the OHA work, a reference model for the licensing process for software-based safety automation systems is defined. The licensing process is defined as the set of interrelated activities whose purpose is to produce and assess evidence concerning the safety and reliability of the system/application to be licensed, and to make the decision about granting the construction and operation permits based on this evidence. The parties to the licensing process are the authority, the licensee (the utility company), system vendors and their subcontractors, and possible external independent assessors. The responsibility for producing the evidence lies in the first place with the licensee, who in most cases relies heavily on vendor expertise. The evaluation and gauging of the evidence is carried out by the authority (possibly using external experts), who can also acquire additional evidence using their own (independent) methods and tools. A central issue in the licensing process is to combine the quality evidence about the system development process with the information acquired through tests, analyses and operational experience. The purpose of the licensing process described in this report is to act as a reference model both for the authority and for the licensee when planning the licensing of individual applications

  7. Licensing process for safety-critical software-based systems

    International Nuclear Information System (INIS)

    Haapanen, P.; Korhonen, J.; Pulkkinen, U.

    2000-12-01

    System vendors nowadays propose software-based technology even for the most critical safety functions in nuclear power plants. Due to the nature of software faults and the way they cause system failures, new methods are needed for the safety and reliability evaluation of these systems. In the research project 'Programmable automation systems in nuclear power plants (OHA)', financed jointly by the Radiation and Nuclear Safety Authority (STUK), the Ministry of Trade and Industry (KTM) and the Technical Research Centre of Finland (VTT), various safety assessment methods and tools for software-based systems are developed and evaluated. As part of the OHA work, a reference model for the licensing process for software-based safety automation systems is defined. The licensing process is defined as the set of interrelated activities whose purpose is to produce and assess evidence concerning the safety and reliability of the system/application to be licensed, and to make the decision about granting the construction and operation permits based on this evidence. The parties to the licensing process are the authority, the licensee (the utility company), system vendors and their subcontractors, and possible external independent assessors. The responsibility for producing the evidence lies in the first place with the licensee, who in most cases relies heavily on vendor expertise. The evaluation and gauging of the evidence is carried out by the authority (possibly using external experts), who can also acquire additional evidence using their own (independent) methods and tools. A central issue in the licensing process is to combine the quality evidence about the system development process with the information acquired through tests, analyses and operational experience. The purpose of the licensing process described in this report is to act as a reference model both for the authority and for the licensee when planning the licensing of individual applications. Many of the

  8. Brillouin Scattering Spectrum Analysis Based on Auto-Regressive Spectral Estimation

    Science.gov (United States)

    Huang, Mengyun; Li, Wei; Liu, Zhangyun; Cheng, Linghao; Guan, Bai-Ou

    2018-06-01

    Auto-regressive (AR) spectral estimation is proposed for analyzing the Brillouin scattering spectrum in Brillouin optical time-domain reflectometry. It is shown that the AR-based method can reliably estimate the Brillouin frequency shift with an accuracy much better than that of fast Fourier transform (FFT) based methods, provided the data length is not too short. It enables about a 3-fold improvement over FFT at a moderate spatial resolution.
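
    The AR idea can be prototyped with a Yule-Walker fit: estimate AR coefficients from the sample autocorrelation, then read the Brillouin frequency shift off the peak of the model spectrum. The sketch below is generic; the authors' exact AR estimator, model order and sampling parameters are not given in the abstract, and the 10 GHz sampling rate and 1.08 GHz beat frequency are invented for the demo.

        import numpy as np
        from scipy.linalg import solve_toeplitz

        def ar_psd(signal, order, n_freq=4096):
            # Yule-Walker AR fit from the sample autocorrelation, then evaluate
            # the AR model's power spectral density on a fine frequency grid.
            x = signal - signal.mean()
            r = np.correlate(x, x, mode="full")[x.size - 1:] / x.size
            a = solve_toeplitz(r[:order], r[1:order + 1])    # AR coefficients
            freqs = np.linspace(0.0, 0.5, n_freq)            # cycles/sample
            z = np.exp(-2j * np.pi * np.outer(freqs, np.arange(1, order + 1)))
            psd = 1.0 / np.abs(1.0 - z @ a) ** 2
            return freqs, psd

        fs = 10.0                                            # samples/ns (hypothetical)
        t = np.arange(512) / fs
        sig = (np.cos(2 * np.pi * 1.08 * t)
               + 0.5 * np.random.default_rng(3).normal(size=t.size))
        f, p = ar_psd(sig, order=8)
        print("peak near", f[np.argmax(p)] * fs, "GHz")      # ~1.08 expected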

  9. Brillouin Scattering Spectrum Analysis Based on Auto-Regressive Spectral Estimation

    Science.gov (United States)

    Huang, Mengyun; Li, Wei; Liu, Zhangyun; Cheng, Linghao; Guan, Bai-Ou

    2018-03-01

    Auto-regressive (AR) spectral estimation is proposed for analyzing the Brillouin scattering spectrum in Brillouin optical time-domain reflectometry. It is shown that the AR-based method can reliably estimate the Brillouin frequency shift with an accuracy much better than that of fast Fourier transform (FFT) based methods, provided the data length is not too short. It enables about a 3-fold improvement over FFT at a moderate spatial resolution.

  10. A Value-Based Business Approach to Product Line Software Engineering

    Directory of Open Access Journals (Sweden)

    Raman K. Agrawalla

    2009-08-01

    Full Text Available The present conceptual paper attempts to provide a Value-Based Business Approach (VBBA) to product line software engineering. It argues that product line software engineering should be seen as a system and considered as a means towards the end of appropriating more and more value for the business firm, contingent on its providing value to customers and the customers' customers, operating its value-creating system with agility, speed, economy and innovation, governed by a positive-sum value-creation outlook and guided by value-based management. With our value-based business triad, the product line engineering process can hope to achieve value, variety and volume simultaneously, along with product differentiation and cost leadership, setting the business firm on a virtuous value spiral.

  11. Natural language processing-based COTS software and related technologies survey.

    Energy Technology Data Exchange (ETDEWEB)

    Stickland, Michael G.; Conrad, Gregory N.; Eaton, Shelley M.

    2003-09-01

    Natural language processing-based knowledge management software, traditionally developed for security organizations, is now becoming commercially available. An informal survey was conducted to discover and examine current NLP and related technologies and potential applications for information retrieval, information extraction, summarization, categorization, terminology management, link analysis, and visualization for possible implementation at Sandia National Laboratories. This report documents our current understanding of the technologies, lists software vendors and their products, and identifies potential applications of these technologies.

  12. A CORBA BASED ARCHITECTURE FOR ACCESSING REUSABLE SOFTWARE COMPONENTS ON THE WEB.

    Directory of Open Access Journals (Sweden)

    R. Cenk ERDUR

    2003-01-01

    Full Text Available In the very near future, as a result of the continuous growth of the Internet and advances in networking technologies, the Internet will become the common software repository for people and organizations who employ a component-based reuse approach in their software development life cycles. In order to use reusable components such as source code, analyses, designs and design patterns during new software development processes, environments that support the identification of components over the Internet are needed. The basic elements of such an environment are the coordinator programs which deliver user requests to appropriate component libraries, user interfaces for querying, and programs that wrap the component libraries. First, a CORBA-based architecture is proposed for such an environment. Then, an alternative architecture based on the Java 2 platform technologies is given for the same environment. Finally, the two architectures are compared.

  13. Cell light scattering characteristic numerical simulation research based on FDTD algorithm

    Science.gov (United States)

    Lin, Xiaogang; Wan, Nan; Zhu, Hao; Weng, Lingdong

    2017-01-01

    In this study, the finite-difference time-domain (FDTD) algorithm has been used to work out the cell light scattering problem. Before carrying out the simulation comparison, it is necessary to identify the changes or differences between normal cells and abnormal cells, which may be cancerous or maldeveloped. The preparation for the simulation consists of building a simple cell model, comprising organelles, nucleus and cytoplasm, and setting a suitable mesh precision. Meanwhile, setting up a total-field/scattered-field source as the excitation and a far-field projection analysis group is also important. Every step needs to be justified by mathematical principles such as numerical dispersion, the perfectly matched layer boundary condition and near-to-far-field extrapolation. The simulation results indicate that a change in the position of the nucleus increases the backscattering intensity, and that significant differences in the peak scattering intensity may result from changes in the size of the cytoplasm. The study may help us derive regularities from the simulation results, which can be meaningful for the early diagnosis of cancers.
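
    The core update behind any FDTD run like the one above is the leapfrog Yee scheme. The minimal one-dimensional version below (normalized units, unit Courant number, a made-up higher-index region standing in for cytoplasm) shows only those update equations; a real cell model needs a 3D grid, a permittivity map, a total-field/scattered-field source and a near-to-far-field transform, as the abstract notes.

        import numpy as np

        n_cells, n_steps = 400, 1000
        ez = np.zeros(n_cells)          # electric field
        hy = np.zeros(n_cells - 1)      # magnetic field, staggered half a cell
        eps_r = np.ones(n_cells)
        eps_r[200:260] = 1.36 ** 2      # hypothetical cytoplasm-like region

        for step in range(n_steps):
            hy += np.diff(ez)                       # dH/dt ~ curl E
            ez[1:-1] += np.diff(hy) / eps_r[1:-1]   # dE/dt ~ curl H / eps
            ez[50] += np.exp(-((step - 60.0) / 20.0) ** 2)  # soft Gaussian source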

  14. Integration of Simulink Models with Component-based Software Models

    DEFF Research Database (Denmark)

    Marian, Nicolae; Top, Søren

    2008-01-01

    , communication and constraints, using computational blocks and aggregates for both discrete and continuous behaviour, different interconnection and execution disciplines for event-based and time-based controllers, and so on, to encompass the demands to more functionality, at even lower prices, and with opposite...... to be analyzed. One way of doing that is to integrate in wrapper files the model back into Simulink S-functions, and use its extensive simulation features, thus allowing an early exploration of the possible design choices over multiple disciplines. The paper describes a safe translation of a restricted set...... of MATLAB/Simulink blocks to COMDES software components, both for continuous and discrete behaviour, and the transformation of the software system into the S-functions. The general aim of this work is the improvement of multi-disciplinary development of embedded systems with the focus on the relation...

  15. Gravitational Bhabha scattering

    International Nuclear Information System (INIS)

    Santos, A F; Khanna, Faqir C

    2017-01-01

    Gravitoelectromagnetism (GEM) is a theory of gravity developed in analogy with electromagnetic field theory; a weak-field approximation of Einstein's theory of relativity is similar to GEM, and the theory has been quantized. Traditional Bhabha scattering, electron-positron scattering, is based on quantized electrodynamics, and its amplitude is usually written in terms of a one-photon exchange process. With the development of the quantized GEM theory, the scattering amplitude acquires an additional component based on the exchange of one graviton at the lowest order of perturbation theory. An analysis provides the relative importance of the two amplitudes for Bhabha scattering, and shows how their relative importance changes as the energy of the exchanged particles increases. (paper)

  16. Laser light scattering instrument advanced technology development

    Science.gov (United States)

    Wallace, J. F.

    1993-01-01

    The objective of this advanced technology development (ATD) project has been to provide sturdy, miniaturized laser light scattering (LLS) instrumentation for use in microgravity experiments. To do this, we assessed user requirements, explored the capabilities of existing and prospective laser light scattering hardware, and both coordinated and participated in the hardware and software advances needed for a flight hardware instrument. We have successfully breadboarded and evaluated an engineering version of a single-angle glove-box instrument which uses solid state detectors and lasers, along with fiber optics, for beam delivery and detection. Additionally, we have provided the specifications and written verification procedures necessary for procuring a miniature multi-angle LLS instrument which will be used by the flight hardware project which resulted from this work and from this project's interaction with the laser light scattering community.

  17. Fault tree synthesis for software design analysis of PLC based safety-critical systems

    International Nuclear Information System (INIS)

    Koo, S. R.; Cho, C. H.; Seong, P. H.

    2006-01-01

    As software verification and validation should be performed for the development of PLC-based safety-critical systems, software safety analysis is also considered throughout the entire software life cycle. In this paper, we propose a technique for software safety analysis in the design phase. Among the various software hazard analysis techniques, fault tree analysis is the most widely used for the safety analysis of nuclear power plant systems. Fault tree analysis also has the most intuitive notation and makes both qualitative and quantitative analyses possible. To analyze the design phase more effectively, we propose a technique of fault tree synthesis, along with a universal fault tree template for the architecture modules of nuclear software. Consequently, we can analyze the safety of software on the basis of fault tree synthesis. (authors)

  18. Software system safety

    Science.gov (United States)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  19. Software development methodology for computer based I&C systems of prototype fast breeder reactor

    International Nuclear Information System (INIS)

    Manimaran, M.; Shanmugam, A.; Parimalam, P.; Murali, N.; Satya Murty, S.A.V.

    2015-01-01

    Highlights: • Software development methodology adopted for computer based I&C systems of PFBR is detailed. • Constraints imposed as part of software requirements and coding phase are elaborated. • Compliance to safety and security requirements are described. • Usage of CASE (Computer Aided Software Engineering) tools during software design, analysis and testing phase are explained. - Abstract: Prototype Fast Breeder Reactor (PFBR) is sodium cooled reactor which is in the advanced stage of construction in Kalpakkam, India. Versa Module Europa bus based Real Time Computer (RTC) systems are deployed for Instrumentation & Control of PFBR. RTC systems have to perform safety functions within the stipulated time which calls for highly dependable software. Hence, well defined software development methodology is adopted for RTC systems starting from the requirement capture phase till the final validation of the software product. V-model is used for software development. IEC 60880 standard and AERB SG D-25 guideline are followed at each phase of software development. Requirements documents and design documents are prepared as per IEEE standards. Defensive programming strategies are followed for software development using C language. Verification and validation (V&V) of documents and software are carried out at each phase by independent V&V committee. Computer aided software engineering tools are used for software modelling, checking for MISRA C compliance and to carry out static and dynamic analysis. Various software metrics such as cyclomatic complexity, nesting depth and comment to code are checked. Test cases are generated using equivalence class partitioning, boundary value analysis and cause and effect graphing techniques. System integration testing is carried out wherein functional and performance requirements of the system are monitored

  20. Software development methodology for computer based I&C systems of prototype fast breeder reactor

    Energy Technology Data Exchange (ETDEWEB)

    Manimaran, M., E-mail: maran@igcar.gov.in; Shanmugam, A.; Parimalam, P.; Murali, N.; Satya Murty, S.A.V.

    2015-10-15

    Highlights: • Software development methodology adopted for computer based I&C systems of PFBR is detailed. • Constraints imposed as part of software requirements and coding phase are elaborated. • Compliance to safety and security requirements are described. • Usage of CASE (Computer Aided Software Engineering) tools during software design, analysis and testing phase are explained. - Abstract: Prototype Fast Breeder Reactor (PFBR) is sodium cooled reactor which is in the advanced stage of construction in Kalpakkam, India. Versa Module Europa bus based Real Time Computer (RTC) systems are deployed for Instrumentation & Control of PFBR. RTC systems have to perform safety functions within the stipulated time which calls for highly dependable software. Hence, well defined software development methodology is adopted for RTC systems starting from the requirement capture phase till the final validation of the software product. V-model is used for software development. IEC 60880 standard and AERB SG D-25 guideline are followed at each phase of software development. Requirements documents and design documents are prepared as per IEEE standards. Defensive programming strategies are followed for software development using C language. Verification and validation (V&V) of documents and software are carried out at each phase by independent V&V committee. Computer aided software engineering tools are used for software modelling, checking for MISRA C compliance and to carry out static and dynamic analysis. Various software metrics such as cyclomatic complexity, nesting depth and comment to code are checked. Test cases are generated using equivalence class partitioning, boundary value analysis and cause and effect graphing techniques. System integration testing is carried out wherein functional and performance requirements of the system are monitored.
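    As an illustration of one of the test-generation techniques named above, the following Python sketch enumerates the classic boundary-value-analysis points around a numeric input range; the range and the accept/reject rule are hypothetical, not taken from the PFBR software.

      # Boundary value analysis sketch: test points just outside, on and
      # just inside each boundary of a valid input range (hypothetical).

      def boundary_values(lo, hi):
          return [lo - 1, lo, lo + 1, (lo + hi) // 2, hi - 1, hi, hi + 1]

      # Hypothetical valid range for a 12-bit sensor input.
      for v in boundary_values(0, 4095):
          expected = "accept" if 0 <= v <= 4095 else "reject"
          print(f"input={v:5d} -> expected {expected}")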

  1. Perceptions of Peer Review Using Cloud-Based Software

    Science.gov (United States)

    Andrichuk, Gjoa

    2016-01-01

    This study looks at the change in perception regarding the effect of peer feedback on writing skills using cloud-based software. Pre- and post-surveys were given. The students peer reviewed drafts of five sections of scientific reports using Google Docs. While students reported that they did not perceive their writing ability improved by being…

  2. Spectrum-based Fault Localization in Embedded Software

    NARCIS (Netherlands)

    Abreu, R.

    2009-01-01

    Locating software components that are responsible for observed failures is a time-intensive and expensive phase in the software development cycle. Automatic fault localization techniques aid developers/testers in pinpointing the root cause of software failures, as such reducing the debugging effort.

  3. Trustworthiness Measurement Algorithm for TWfMS Based on Software Behaviour Entropy

    Directory of Open Access Journals (Sweden)

    Qiang Han

    2018-03-01

    As the virtual mirror of the complex real-time business processes of organisations' underlying information systems, the workflow management system (WfMS) has emerged in recent decades as a new self-autonomous paradigm in the open, dynamic, distributed computing environment. In order to construct a trustworthy workflow management system (TWfMS), the design of a software behaviour trustworthiness measurement algorithm is an urgent task for researchers. Alongside the trustworthiness mechanism, a measurement algorithm that copes with the uncertain software behaviour trustworthiness information of the WfMS must be provided as an infrastructure. Based on the framework presented in our research prior to this paper, we firstly introduce a formal model for WfMS trustworthiness measurement, with the main properties reasoned about through calculus operators. Secondly, this paper proposes a novel measurement algorithm that derives the software behaviour entropy of the calculus operators through the principle of maximum entropy (POME) and data mining methods. Thirdly, the trustworthiness measurement algorithm for incomplete software behaviour tests and runtime information is discussed and compared by means of a detailed explanation. Finally, we provide conclusions and discuss certain future research areas of the TWfMS.
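    The entropy quantity at the heart of such a measure can be illustrated with a short Python sketch that computes the Shannon entropy of an observed distribution of software behaviour states; the state labels and counts below are invented for illustration.

      # Shannon entropy of observed software behaviour states: lower
      # entropy means more predictable (arguably more trustworthy) behaviour.
      import math
      from collections import Counter

      observed = ["ok", "ok", "retry", "ok", "fail", "ok", "retry", "ok"]
      counts = Counter(observed)
      total = sum(counts.values())

      entropy = -sum((n / total) * math.log2(n / total) for n in counts.values())
      print(f"behaviour entropy: {entropy:.3f} bits")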

  4. ATSAS 2.8: a comprehensive data analysis suite for small-angle scattering from macromolecular solutions.

    Science.gov (United States)

    Franke, D; Petoukhov, M V; Konarev, P V; Panjkovich, A; Tuukkanen, A; Mertens, H D T; Kikhney, A G; Hajizadeh, N R; Franklin, J M; Jeffries, C M; Svergun, D I

    2017-08-01

    ATSAS is a comprehensive software suite for the analysis of small-angle scattering data from dilute solutions of biological macromolecules or nanoparticles. It contains applications for primary data processing and assessment, ab initio bead modelling, and model validation, as well as methods for the analysis of flexibility and mixtures. In addition, approaches are supported that utilize information from X-ray crystallography, nuclear magnetic resonance spectroscopy or atomistic homology modelling to construct hybrid models based on the scattering data. This article summarizes the progress made during the 2.5-2.8 ATSAS release series and highlights the latest developments. These include AMBIMETER, an assessment of the reconstruction ambiguity of experimental data; DATCLASS, a multiclass shape classification based on experimental data; SASRES, for estimating the resolution of ab initio model reconstructions; CHROMIXS, a convenient interface to analyse in-line size exclusion chromatography data; SHANUM, to evaluate the useful angular range in measured data; SREFLEX, to refine available high-resolution models using normal mode analysis; SUPALM, for a rapid superposition of low- and high-resolution models; and SASPy, the ATSAS plugin for interactive modelling in PyMOL. All these features and other improvements are included in the ATSAS release 2.8, freely available for academic users from https://www.embl-hamburg.de/biosaxs/software.html.
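    For readers unfamiliar with the primary data analysis that such suites automate, the following generic Python sketch (not ATSAS code) estimates a radius of gyration from the Guinier approximation, ln I(q) = ln I0 - q²Rg²/3, on synthetic low-q data.

      # Generic Guinier sketch on synthetic data (not an ATSAS API).
      import numpy as np

      rg_true, i0 = 30.0, 1000.0                # hypothetical Rg (Angstrom) and I0
      q = np.linspace(0.005, 0.04, 50)          # low-q range, 1/Angstrom
      intensity = i0 * np.exp(-(q * rg_true) ** 2 / 3.0)

      # Linear fit of ln I against q^2; slope = -Rg^2/3.
      slope, intercept = np.polyfit(q ** 2, np.log(intensity), 1)
      rg_est = np.sqrt(-3.0 * slope)
      print(f"I0 ~ {np.exp(intercept):.1f}, Rg ~ {rg_est:.1f} Angstrom")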

  5. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    Science.gov (United States)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.
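    The token-game semantics underlying such Petri net models can be sketched in a few lines of Python: a transition fires when all of its input places hold enough tokens. The places and transitions below form a hypothetical review workflow, not the XML net formalism itself.

      # Minimal place/transition Petri net sketch (hypothetical workflow).
      marking = {"design": 1, "review": 0, "approved": 0}

      transitions = {
          "submit_review": ({"design": 1}, {"review": 1}),    # (pre, post)
          "approve":       ({"review": 1}, {"approved": 1}),
      }

      def fire(name):
          pre, post = transitions[name]
          if all(marking[p] >= n for p, n in pre.items()):    # enabled?
              for p, n in pre.items():
                  marking[p] -= n                             # consume tokens
              for p, n in post.items():
                  marking[p] += n                             # produce tokens
              return True
          return False

      fire("submit_review")
      fire("approve")
      print(marking)   # {'design': 0, 'review': 0, 'approved': 1}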

  6. VPD residue search by monitoring scattered x-rays

    International Nuclear Information System (INIS)

    Mori, Y.; Yamagami, M.; Yamada, T.

    2000-01-01

    Recently, VPD-TXRF has come into wide use for semiconductor analysis. In the VPD-TXRF technique, adjusting the mechanical measuring point to the center of the dried residue is important for accurate determination. Until now, the following search methods have been used: monitoring light scattering under bright illumination, using a laser-scattering particle mapper, and applying an internal standard as a marker. However, each method has its own disadvantages; for example, interference between Kβ lines (e.g. Sc-Kβ with Ti-Kα) occurs in the internal standard method. We propose a new search method, 'scattered x-ray search', which uses x-ray scattering from the dried residue as a marker. Since the line profile of the x-ray scattering agrees with that of the fluorescent x-rays, scattered x-rays can be used as an alternative marker instead of an internal standard. According to our experimental results, this search method shows the same accuracy as the internal standard method. Its merits are as follows: 1) no internal standard needs to be added, 2) the search is rapid because of the high intensity of the scattered x-rays, and 3) the search software for the internal standard method can be applied without any modification. In this method, diffraction of the incident x-rays by the substrate causes irregular changes in the detected scattered x-rays; therefore, the method works better with an x-y controlled stage than with an r-Θ one. (author)
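    The search principle, taking the residue center to be the point of maximum scattered intensity in an x-y scan, can be sketched as follows; the scan grid and the simulated intensity peak are invented.

      # Locate a dried residue as the maximum of scattered x-ray intensity
      # over an x-y scan (synthetic data; a real search might fit the peak).
      import numpy as np

      x = np.linspace(-2.0, 2.0, 41)            # scan positions, mm
      y = np.linspace(-2.0, 2.0, 41)
      xx, yy = np.meshgrid(x, y)

      # Simulated scatter peak at the residue center (0.4, -0.3) mm.
      intensity = np.exp(-((xx - 0.4) ** 2 + (yy + 0.3) ** 2) / 0.1)

      iy, ix = np.unravel_index(np.argmax(intensity), intensity.shape)
      print(f"residue center estimate: x = {x[ix]:.2f} mm, y = {y[iy]:.2f} mm")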

  7. Software upgradation of PXI based data acquisition for Aditya experiments

    International Nuclear Information System (INIS)

    Panchal, Vipul K.; Chavda, Chhaya; Patel, Vijay; Patel, Narendra; Ghosh, Joydeep

    2015-01-01

    The Aditya Data Acquisition and Control System is designed to acquire data from diagnostics such as loop voltage, Rogowski and magnetic probes, and x-rays, and to control gas feed, gate valves, trigger pulse generation, etc. The CAMAC-based data acquisition system was updated with PXI-based multifunction modules, interfaced to a PC over an optical link using a PCI-based controller module. Data are acquired through a LabVIEW graphical user interface (GUI) and stored on a server. The present GUI-based application lacks features such as module parameter configuration, analysis and webcasting, so new application software is being developed in LabVIEW with support for individual modules, including programmable channel configuration - sampling rate, number of pre- and post-trigger samples, selection of active channels, etc. It will also provide the timer/counter multi-functionality. The software is designed to be scalable to more modules, channels and crates, with security through different levels of user privileges. (author)

  8. Software failure events derivation and analysis by frame-based technique

    International Nuclear Information System (INIS)

    Huang, H.-W.; Shih, C.; Yih, Swu; Chen, M.-H.

    2007-01-01

    A frame-based technique, including physical frame, logical frame, and cognitive frame, was adopted to perform digital I and C failure events derivation and analysis for generic ABWR. The physical frame was structured with a modified PCTran-ABWR plant simulation code, which was extended and enhanced on the feedwater system, recirculation system, and steam line system. The logical model is structured with MATLAB, which was incorporated into PCTran-ABWR to improve the pressure control system, feedwater control system, recirculation control system, and automated power regulation control system. As a result, the software failure of these digital control systems can be properly simulated and analyzed. The cognitive frame was simulated by the operator awareness status in the scenarios. Moreover, via an internal characteristics tuning technique, the modified PCTran-ABWR can precisely reflect the characteristics of the power-core flow. Hence, in addition to the transient plots, the analysis results can then be demonstrated on the power-core flow map. A number of postulated I and C system software failure events were derived to achieve the dynamic analyses. The basis for event derivation includes the published classification for software anomalies, the digital I and C design data for ABWR, chapter 15 accident analysis of generic SAR, and the reported NPP I and C software failure events. The case study of this research includes: (1) the software CMF analysis for the major digital control systems; and (2) postulated ABWR digital I and C software failure events derivation from the actual happening of non-ABWR digital I and C software failure events, which were reported to LER of USNRC or IRS of IAEA. These events were analyzed by PCTran-ABWR. Conflicts among plant status, computer status, and human cognitive status are successfully identified. The operator might not easily recognize the abnormal condition, because the computer status seems to progress normally. However, a well

  9. GIS-Based Noise Simulation Open Source Software: N-GNOIS

    Science.gov (United States)

    Vijay, Ritesh; Sharma, A.; Kumar, M.; Shende, V.; Chakrabarti, T.; Gupta, Rajesh

    2015-12-01

    Geographical information system (GIS)-based noise simulation software (N-GNOIS) has been developed to simulate noise scenarios due to point and mobile sources, considering the impact of geographical features and meteorological parameters. These have been addressed in the software through attenuation modules for atmosphere, vegetation and barriers. N-GNOIS is a user friendly, platform-independent and open geospatial consortia (OGC) compliant software. It has been developed using open source technology (QGIS) and an open source language (Python). N-GNOIS has unique features such as the cumulative impact of point and mobile sources, building structures, and honking due to traffic. Honking is a common phenomenon in developing countries and is frequently observed on all types of roads. N-GNOIS also helps in designing physical barriers and vegetation cover to check the propagation of noise, and acts as a decision-making tool for planning and management of the noise component in environmental impact assessment (EIA) studies.
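    A minimal Python sketch of the kind of point-source calculation such attenuation modules compose is shown below, using free-field spherical spreading (Lp = Lw - 20·log10(r) - 11) with additive barrier and vegetation terms; the values are placeholders, not N-GNOIS output.

      # Point-source receiver level with additive attenuation terms.
      import math

      def receiver_level(lw_db, r_m, a_barrier_db=0.0, a_vegetation_db=0.0):
          """Sound pressure level at distance r_m from a source of power lw_db."""
          spreading = 20.0 * math.log10(r_m) + 11.0   # free-field spherical spreading
          return lw_db - spreading - a_barrier_db - a_vegetation_db

      # Hypothetical case: 100 dB source, receiver at 50 m, 5 dB barrier.
      print(f"{receiver_level(100.0, 50.0, a_barrier_db=5.0):.1f} dB")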

  10. Software for computer based systems important to safety in nuclear power plants. Safety guide

    International Nuclear Information System (INIS)

    2004-01-01

    Computer based systems are of increasing importance to safety in nuclear power plants as their use in both new and older plants is rapidly increasing. They are used both in safety related applications, such as some functions of the process control and monitoring systems, as well as in safety critical applications, such as reactor protection or actuation of safety features. The dependability of computer based systems important to safety is therefore of prime interest and should be ensured. With current technology, it is possible in principle to develop computer based instrumentation and control systems for systems important to safety that have the potential for improving the level of safety and reliability with sufficient dependability. However, their dependability can be predicted and demonstrated only if a systematic, fully documented and reviewable engineering process is followed. Although a number of national and international standards dealing with quality assurance for computer based systems important to safety have been or are being prepared, internationally agreed criteria for demonstrating the safety of such systems are not generally available. It is recognized that there may be other ways of providing the necessary safety demonstration than those recommended here. The basic requirements for the design of safety systems for nuclear power plants are provided in the Requirements for Design issued in the IAEA Safety Standards Series. The IAEA has issued a Technical Report to assist Member States in ensuring that computer based systems important to safety in nuclear power plants are safe and properly licensed. The report provides information on current software engineering practices and, together with relevant standards, forms a technical basis for this Safety Guide. The objective of this Safety Guide is to provide guidance on the collection of evidence and preparation of documentation to be used in the safety demonstration for the software for computer based

  11. Software for computer based systems important to safety in nuclear power plants. Safety guide

    International Nuclear Information System (INIS)

    2005-01-01

    Computer based systems are of increasing importance to safety in nuclear power plants as their use in both new and older plants is rapidly increasing. They are used both in safety related applications, such as some functions of the process control and monitoring systems, as well as in safety critical applications, such as reactor protection or actuation of safety features. The dependability of computer based systems important to safety is therefore of prime interest and should be ensured. With current technology, it is possible in principle to develop computer based instrumentation and control systems for systems important to safety that have the potential for improving the level of safety and reliability with sufficient dependability. However, their dependability can be predicted and demonstrated only if a systematic, fully documented and reviewable engineering process is followed. Although a number of national and international standards dealing with quality assurance for computer based systems important to safety have been or are being prepared, internationally agreed criteria for demonstrating the safety of such systems are not generally available. It is recognized that there may be other ways of providing the necessary safety demonstration than those recommended here. The basic requirements for the design of safety systems for nuclear power plants are provided in the Requirements for Design issued in the IAEA Safety Standards Series. The IAEA has issued a Technical Report to assist Member States in ensuring that computer based systems important to safety in nuclear power plants are safe and properly licensed. The report provides information on current software engineering practices and, together with relevant standards, forms a technical basis for this Safety Guide. The objective of this Safety Guide is to provide guidance on the collection of evidence and preparation of documentation to be used in the safety demonstration for the software for computer based

  12. Software for computer based systems important to safety in nuclear power plants. Safety guide

    International Nuclear Information System (INIS)

    2000-01-01

    Computer based systems are of increasing importance to safety in nuclear power plants as their use in both new and older plants is rapidly increasing. They are used both in safety related applications, such as some functions of the process control and monitoring systems, as well as in safety critical applications, such as reactor protection or actuation of safety features. The dependability of computer based systems important to safety is therefore of prime interest and should be ensured. With current technology, it is possible in principle to develop computer based instrumentation and control systems for systems important to safety that have the potential for improving the level of safety and reliability with sufficient dependability. However, their dependability can be predicted and demonstrated only if a systematic, fully documented and reviewable engineering process is followed. Although a number of national and international standards dealing with quality assurance for computer based systems important to safety have been or are being prepared, internationally agreed criteria for demonstrating the safety of such systems are not generally available. It is recognized that there may be other ways of providing the necessary safety demonstration than those recommended here. The basic requirements for the design of safety systems for nuclear power plants are provided in the Requirements for Design issued in the IAEA Safety Standards Series. The IAEA has issued a Technical Report to assist Member States in ensuring that computer based systems important to safety in nuclear power plants are safe and properly licensed. The report provides information on current software engineering practices and, together with relevant standards, forms a technical basis for this Safety Guide. The objective of this Safety Guide is to provide guidance on the collection of evidence and preparation of documentation to be used in the safety demonstration for the software for computer based

  13. Evaluation and selection of open-source EMR software packages based on integrated AHP and TOPSIS.

    Science.gov (United States)

    Zaidan, A A; Zaidan, B B; Al-Haiqi, Ahmed; Kiah, M L M; Hussain, Muzammil; Abdulnabi, Mohamed

    2015-02-01

    Evaluating and selecting software packages that meet the requirements of an organization are difficult aspects of the software engineering process. Selecting the wrong open-source EMR software package can be costly and may adversely affect business processes and the functioning of the organization. This study aims to evaluate and select open-source EMR software packages based on multi-criteria decision-making. A hands-on study was performed and a set of open-source EMR software packages were implemented locally on separate virtual machines to examine the systems more closely. Several measures were specified as the evaluation basis, and the systems were selected based on a set of metric outcomes using the integrated Analytic Hierarchy Process (AHP) and TOPSIS. The experimental results showed that GNUmed and OpenEMR achieved better ranking scores than the other open-source EMR software packages. Copyright © 2014 Elsevier Inc. All rights reserved.
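    The TOPSIS step can be illustrated with a compact Python sketch that ranks alternatives by closeness to the ideal solution; the decision matrix, criteria and weights (for example, weights produced by an AHP pairwise comparison) are invented.

      # Minimal TOPSIS sketch, benefit criteria only (hypothetical data).
      import numpy as np

      alts = ["EMR-A", "EMR-B", "EMR-C"]
      scores = np.array([[7.0, 9.0, 6.0],     # rows: alternatives
                         [8.0, 7.0, 8.0],     # cols: usability, security, support
                         [9.0, 6.0, 7.0]])
      weights = np.array([0.5, 0.3, 0.2])     # e.g. from an AHP step

      v = scores / np.linalg.norm(scores, axis=0) * weights   # weighted, normalised
      ideal, anti = v.max(axis=0), v.min(axis=0)              # all criteria benefit

      d_plus = np.linalg.norm(v - ideal, axis=1)
      d_minus = np.linalg.norm(v - anti, axis=1)
      closeness = d_minus / (d_plus + d_minus)

      for name, c in sorted(zip(alts, closeness), key=lambda t: -t[1]):
          print(f"{name}: {c:.3f}")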

  14. Folding model analysis of the nucleus–nucleus scattering based on ...

    Indian Academy of Sciences (India)

    Folding model analysis of the nucleus–nucleus scattering based on Jacobi coordinates. F Pakdel, A A Rajabi and L Nickhah. Pramana – Journal of Physics, Volume 87, Issue 6, December 2016, Article ID 90 ...

  15. Realization of low-scattering metamaterial shell based on cylindrical wave expanding theory.

    Science.gov (United States)

    Wu, Xiaoyu; Hu, Chenggang; Wang, Min; Pu, Mingbo; Luo, Xiangang

    2015-04-20

    In this paper, we demonstrate the design of a low-scattering metamaterial shell with strong backward scattering reduction and a wide bandwidth at microwave frequencies. Low echo is achieved through cylindrical wave expanding theory, and the shell contains only one metamaterial layer with simultaneously low permittivity and permeability. A cut-wire structure is selected to realize the low electromagnetic (EM) parameters and low loss at the edge of the resonance region. Full-model simulations show good agreement with the theoretical calculations, and illustrate that a reduction of nearly -20 dB is achieved and that the -10 dB bandwidth can reach up to 0.6 GHz. Compared with cloaks based on transformation electromagnetics, the design has the advantage of simpler EM-parameter requirements and is much easier to implement when only the backward scattered field is of interest.

  16. Enhancement of graphic user interface data acquisition of small angle neutron scattering

    International Nuclear Information System (INIS)

    Abd Aziz Muhammad; Abd Jalil Abd Hamid

    2004-01-01

    This paper describes the development of PC data acquisition software capable of controlling instruments via IEEE-488 and of graphic visualization for small angle neutron scattering (SANS) runs in DOS mode. With the help of an outstanding freeware graphics library for DOS, the software has enhanced the efficiency of graphic visualization for SANSLab data acquisition. Featuring an easy-to-use graphical user interface (GUI) and several other built-in tools for convenience, the software can be operated with the mouse or the keyboard, and can be converted into an inexpensive data acquisition system for SANS. (Author)

  17. A reconstruction algorithm for coherent scatter computed tomography based on filtered back-projection

    International Nuclear Information System (INIS)

    Stevendaal, U. van; Schlomka, J.-P.; Harding, A.; Grass, M.

    2003-01-01

    Coherent scatter computed tomography (CSCT) is a reconstructive x-ray imaging technique that yields the spatially resolved coherent-scatter form factor of the investigated object. Reconstruction from coherently scattered x-rays is commonly done using algebraic reconstruction techniques (ART). In this paper, we propose an alternative approach based on filtered back-projection. For the first time, a three-dimensional (3D) filtered back-projection technique using curved 3D back-projection lines is applied to two-dimensional coherent scatter projection data. The proposed algorithm is tested with simulated projection data as well as with projection data acquired with a demonstrator setup similar to a multi-line CT scanner geometry. While yielding comparable image quality as ART reconstruction, the modified 3D filtered back-projection algorithm is about two orders of magnitude faster. In contrast to iterative reconstruction schemes, it has the advantage that subfield-of-view reconstruction becomes feasible. This allows a selective reconstruction of the coherent-scatter form factor for a region of interest. The proposed modified 3D filtered back-projection algorithm is a powerful reconstruction technique to be implemented in a CSCT scanning system. This method gives coherent scatter CT the potential of becoming a competitive modality for medical imaging or nondestructive testing
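    For orientation, here is a standard 2D parallel-beam filtered back-projection sketch in Python (NumPy only). The paper's method extends this filter-then-back-project principle to curved 3D back-projection lines for coherent-scatter data, which this sketch does not attempt to reproduce.

      # Standard 2D parallel-beam FBP sketch with a ramp filter.
      import numpy as np

      def fbp(sinogram, thetas_deg):
          """sinogram: (n_angles, n_det); returns an (n_det, n_det) image."""
          n_ang, n_det = sinogram.shape
          ramp = np.abs(np.fft.fftfreq(n_det))                 # ramp filter
          filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))

          xs = np.arange(n_det) - n_det / 2.0
          xx, yy = np.meshgrid(xs, xs)
          image = np.zeros((n_det, n_det))
          for p, th in zip(filtered, np.deg2rad(thetas_deg)):  # back-project
              t = xx * np.cos(th) + yy * np.sin(th) + n_det / 2.0
              image += p[np.clip(t.astype(int), 0, n_det - 1)]
          return image * np.pi / n_ang

      # Self-test: a centred point object reconstructs to a central peak.
      thetas = np.linspace(0.0, 180.0, 90, endpoint=False)
      n = 64
      sino = np.zeros((len(thetas), n))
      sino[:, n // 2] = 1.0
      img = fbp(sino, thetas)
      print("peak at:", np.unravel_index(np.argmax(img), img.shape))  # ~(32, 32)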

  18. Metric-based Evaluation of Implemented Software Architectures

    NARCIS (Netherlands)

    Bouwers, E.M.

    2013-01-01

    Software systems make up an important part of our daily lives. Just like all man-made objects, the possibilities of a software system are constrained by the choices made during its creation. The complete set of these choices can be referred to as the software architecture of a system. Since the

  19. A computer graphics based model for scattering from objects of arbitrary shapes in the optical region

    Science.gov (United States)

    Goel, Narendra S.; Rozehnal, Ivan; Thompson, Richard L.

    1991-01-01

    A computer-graphics-based model, named DIANA, is presented for generation of objects of arbitrary shape and for calculating bidirectional reflectances and scattering from them, in the visible and infrared region. The computer generation is based on a modified Lindenmayer system approach which makes it possible to generate objects of arbitrary shapes and to simulate their growth, dynamics, and movement. Rendering techniques are used to display an object on a computer screen with appropriate shading and shadowing and to calculate the scattering and reflectance from the object. The technique is illustrated with scattering from canopies of simulated corn plants.
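    The string-rewriting core of a Lindenmayer system can be sketched in a few lines of Python; the grammar below is Lindenmayer's textbook algae example, not DIANA's actual rules.

      # Parallel string rewriting: every symbol is replaced each generation.
      rules = {"A": "AB", "B": "A"}

      def grow(axiom, generations):
          s = axiom
          for _ in range(generations):
              s = "".join(rules.get(ch, ch) for ch in s)
          return s

      for g in range(6):
          print(g, grow("A", g))   # string lengths follow the Fibonacci sequence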

  20. Calculation Software versus Illustration Software for Teaching Statistics

    DEFF Research Database (Denmark)

    Mortensen, Peter Stendahl; Boyle, Robin G.

    1999-01-01

    As personal computers have become more and more powerful, so have the software packages available to us for teaching statistics. This paper investigates what software packages are currently being used by progressive statistics instructors at university level, examines some of the deficiencies...... of such software, and indicates features that statistics instructors wish to have incorporated in software in the future. The basis of the paper is a survey of participants at ICOTS-5 (the Fifth International Conference on Teaching Statistics). These survey results, combined with the software based papers...

  1. The intersection of software and strengths: Using internet technology and case management software to assist Strength-Based Practice.

    Science.gov (United States)

    Clark, Michael D; Brien, Dale W

    2016-01-01

    The focus of this investigation is the helping professionals working within American Indian and Alaska Native (AI/AN) communities. This article looks at how innovative technology, in the form of automated case management software and Internet connectivity, can assist effective implementation of Strength-based Practice and agency services within tribal courts and the many other helping agencies that serve AI/AN populations. We seek to expand practice knowledge by reviewing the benefits that this software and Internet connectivity can offer to agency operations and exploring how they can assist case management services.

  2. DAE Tools: equation-based object-oriented modelling, simulation and optimisation software

    Directory of Open Access Journals (Sweden)

    Dragan D. Nikolić

    2016-04-01

    In this work, the DAE Tools modelling, simulation and optimisation software, its programming paradigms and its main features are presented. The current approaches to mathematical modelling, such as the use of modelling languages and general-purpose programming languages, are analysed. The common set of capabilities required by typical simulation software is discussed, and the shortcomings of the current approaches are recognised. A new hybrid approach is introduced, and the modelling languages and the hybrid approach are compared in terms of the grammar, compiler, parser and interpreter requirements, maintainability and portability. The most important characteristics of the new approach are discussed, such as: (1) support for runtime model generation; (2) support for runtime simulation set-up; (3) support for complex runtime operating procedures; (4) interoperability with third-party software packages (i.e. NumPy/SciPy); (5) suitability for embedding and use as a web application or software as a service; and (6) code-generation, model exchange and co-simulation capabilities. The benefits of an equation-based approach to modelling, implemented in a fourth-generation object-oriented general-purpose programming language such as Python, are discussed. The architecture and the software implementation details, as well as the type of problems that can be solved using DAE Tools software, are described. Finally, some applications of the software at different levels of abstraction are presented, and its embedding capabilities and suitability for use as software as a service are demonstrated.
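    The equation-based style, stating a model as residual equations and handing it to a numerical solver, can be illustrated with a short generic Python sketch (not the DAE Tools API); the two equations are hypothetical balances.

      # Equation-based modelling in miniature: solve F(y) = 0.
      from scipy.optimize import fsolve

      def residuals(y):
          x1, x2 = y
          return [x1 + x2 - 10.0,        # hypothetical material balance
                  2.0 * x1 - x2 + 1.0]   # hypothetical equilibrium relation

      solution = fsolve(residuals, x0=[1.0, 1.0])
      print(solution)   # -> [3. 7.]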

  3. Software and package applicating for network meta-analysis: A usage-based comparative study.

    Science.gov (United States)

    Xu, Chang; Niu, Yuming; Wu, Junyi; Gu, Huiyun; Zhang, Chao

    2017-12-21

    To compare and analyze the characteristics and functions of software applications for network meta-analysis (NMA). PubMed, EMbase, The Cochrane Library, the official websites of Bayesian inference Using Gibbs Sampling (BUGS), Stata and R, and Google were searched to collect the software and packages for performing NMA; software and packages published up to March 2016 were included. After collecting the software, packages, and their user guides, we used the software and packages to calculate a typical example. All characteristics, functions, and computed results were compared and analyzed. Ten types of software were included, including programming and non-programming software. They were developed mainly based on Bayesian or frequentist theory. Most types of software have the characteristics of easy operation, easy mastery, exact calculation, or excellent graphing. However, there was no single software that performed accurate calculations with superior graphing; this could only be achieved through the combination of two or more types of software. This study suggests that the user should choose the appropriate software according to personal programming basis, operational habits, and financial ability. Then, the choice of the combination of BUGS and R (or Stata) software to perform the NMA is considered. © 2017 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.

  4. The IPNS rietveld analysis software package for TOF [time-of-flight] powder diffraction data: Recent developments

    International Nuclear Information System (INIS)

    Rotella, F.J.; Richardson, J.W. Jr.

    1987-01-01

    A system of FORTRAN programs for the analysis of time-of-flight (TOF) neutron powder diffraction data via the Rietveld method at IPNS has been modified recently, making it possible to analyze data that exhibit diffraction maxima broadened due to anisotropic strain and that can be modeled by individual atomic anharmonic thermal vibrations. The observation of noncrystalline scattering in data from some powder samples has led to the development of software to fit such scattering by a function related to a radial distribution function through Fourier-filtering techniques. The "user friendliness" of the IPNS Rietveld package has been enhanced by the development of "RIETVELD," a menu-based VAX/VMS command language routine for interactive file manipulation and program execution

  5. Fast implementations of reconstruction-based scatter compensation in fully 3D SPECT image reconstruction

    International Nuclear Information System (INIS)

    Kadrmas, Dan J.; Karimi, Seemeen S.; Frey, Eric C.; Tsui, Benjamin M.W.

    1998-01-01

    Accurate scatter compensation in SPECT can be performed by modelling the scatter response function during the reconstruction process. This method is called reconstruction-based scatter compensation (RBSC). It has been shown that RBSC has a number of advantages over other methods of compensating for scatter, but using RBSC for fully 3D compensation has resulted in prohibitively long reconstruction times. In this work we propose two new methods that can be used in conjunction with existing methods to achieve marked reductions in RBSC reconstruction times. The first method, coarse-grid scatter modelling, significantly accelerates the scatter model by exploiting the fact that scatter is dominated by low-frequency information. The second method, intermittent RBSC, further accelerates the reconstruction process by limiting the number of iterations during which scatter is modelled. The fast implementations were evaluated using a Monte Carlo simulated experiment of the 3D MCAT phantom with 99mTc tracer, and also using experimentally acquired data with 201Tl tracer. Results indicated that these fast methods can reconstruct, with fully 3D compensation, images very similar to those obtained using standard RBSC methods, and in reconstruction times that are an order of magnitude shorter. Using these methods, fully 3D iterative reconstruction with RBSC can be performed well within the realm of clinically realistic times (under 10 minutes for 64x64x24 image reconstruction). (author)
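    The scheduling idea behind intermittent RBSC can be shown structurally: the expensive scatter model is evaluated only during the first few iterations and then frozen. In this Python sketch the projector and update rule are trivial stand-ins, not a real reconstruction.

      N_ITER, SCATTER_ITERS = 20, 5              # hypothetical schedule

      def project(image):
          return image                           # stand-in forward projector

      def em_update(image, measured, expected):
          return image * (measured / expected)   # stand-in ML-EM style update

      image, measured, scatter_estimate = 1.0, 120.0, 0.0
      for it in range(N_ITER):
          if it < SCATTER_ITERS:
              # Expensive step: re-model scatter; frozen afterwards.
              scatter_estimate = 0.1 * project(image)
          expected = project(image) + scatter_estimate   # scatter in the model
          image = em_update(image, measured, expected)
      print(f"final image value: {image:.2f}")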

  6. AWARE-P: a system-based software for urban water IAM planning

    OpenAIRE

    Coelho, S.T.; Vitorino, D.; Alegre, H.

    2013-01-01

    The AWARE-P IAM planning software offers a non-intrusive, web-based, collaborative integration environment for a wide variety of data and processes that may be relevant to the IAM decision-making process, including maps, GIS shapefiles and geodatabases; inventory records; work orders, maintenance, inspections/CCTV records; network models, performance indicators, asset valuation records, among others. The software provides an organized framework for evaluating and comparing planning alternativ...

  7. Management of Globally Distributed Component-Based Software Development Projects

    NARCIS (Netherlands)

    J. Kotlarsky (Julia)

    2005-01-01

    Globally Distributed Component-Based Development (GD CBD) is expected to become a promising area, as increasing numbers of companies are setting up software development in a globally distributed environment and at the same time are adopting CBD methodologies. Being an emerging area, the

  8. Software - Naval Oceanography Portal

    Science.gov (United States)

    Portal page for Earth orientation software from the U.S. Naval Observatory (USNO), listing GPS-based and VLBI-based products, auxiliary and supporting software, and an Earth Orientation Matrix Calculator.

  9. Biogem: an effective tool based approach for scaling up open source software development in bioinformatics

    NARCIS (Netherlands)

    Bonnal, R.J.P.; Smant, G.; Prins, J.C.P.

    2012-01-01

    Biogem provides a software development environment for the Ruby programming language, which encourages community-based software development for bioinformatics while lowering the barrier to entry and encouraging best practices. Biogem, with its targeted modular and decentralized approach, software

  10. State of the Art : Integrated Management of Requirements in Model-Based Software Engineering

    OpenAIRE

    Thörn, Christer

    2006-01-01

    This report describes the background and future of research concerning integrated management of requirements in model-based software engineering. The focus is on describing the relevant topics and existing theoretical backgrounds that form the basis for the research. The report describes the fundamental difficulties of requirements engineering for software projects, and proposes that the results and methods of models in software engineering can help leverage those problems. Taking inspiration...

  11. A scatter-corrected list-mode reconstruction and a practical scatter/random approximation technique for dynamic PET imaging

    International Nuclear Information System (INIS)

    Cheng, J-C; Rahmim, Arman; Blinder, Stephan; Camborde, Marie-Laure; Raywood, Kelvin; Sossi, Vesna

    2007-01-01

    We describe an ordinary Poisson list-mode expectation maximization (OP-LMEM) algorithm with a sinogram-based scatter correction method based on the single scatter simulation (SSS) technique and a random correction method based on the variance-reduced delayed-coincidence technique. We also describe a practical approximate scatter and random-estimation approach for dynamic PET studies based on a time-averaged scatter and random estimate followed by scaling according to the global numbers of true coincidences and randoms for each temporal frame. The quantitative accuracy achieved using OP-LMEM was compared to that obtained using the histogram-mode 3D ordinary Poisson ordered subset expectation maximization (3D-OP) algorithm with similar scatter and random correction methods, and they showed excellent agreement. The accuracy of the approximated scatter and random estimates was tested by comparing time activity curves (TACs) as well as the spatial scatter distribution from dynamic non-human primate studies obtained from the conventional (frame-based) approach and those obtained from the approximate approach. An excellent agreement was found, and the time required for the calculation of scatter and random estimates in the dynamic studies became much less dependent on the number of frames (we achieved a nearly four times faster performance on the scatter and random estimates by applying the proposed method). The precision of the scatter fraction was also demonstrated for the conventional and the approximate approach using phantom studies
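    The frame-scaling approximation reduces to a few lines: one time-averaged scatter (or random) estimate is scaled by each frame's share of the global true coincidences. All counts below are invented.

      avg_scatter_sino = [4.0, 6.0, 5.0]           # time-averaged scatter estimate
      global_trues = 1000000                       # trues over the whole study
      frame_trues = [50000, 200000, 750000]        # trues per temporal frame

      for i, t in enumerate(frame_trues):
          scale = t / global_trues
          frame_scatter = [s * scale for s in avg_scatter_sino]
          print(f"frame {i}: scale={scale:.3f}, scatter={frame_scatter}")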

  12. A small angle neutron scattering study of mica based glass-ceramics with applications in dentistry

    International Nuclear Information System (INIS)

    Kilcoyne, S.H.; Bentley, P.M.; Al-Jawad, M.; Bubb, N.L.; Al-Shammary, H.A.O.; Wood, D.J.

    2004-01-01

    We are currently developing machinable and load-bearing mica-based glass-ceramics for use in restorative dental surgery. In this paper we present the results of an ambient temperature small angle neutron scattering (SANS) study of several such ceramics with chemical compositions chosen to optimise machinability and strength. The SANS spectra are all dominated by scattering from the crystalline-amorphous phase interface and exhibit Q⁻⁴ dependence (Porod scattering), indicating that, on a 100 Å scale, the surface of the crystals is smooth
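    A Porod-law fit of the reported form, I(Q) = K·Q⁻⁴ + b, can be sketched on synthetic data as follows.

      # Linear least squares in the basis {Q^-4, 1} (synthetic data).
      import numpy as np

      q = np.linspace(0.05, 0.30, 60)            # 1/Angstrom, hypothetical range
      intensity = 2.5e-5 * q ** -4 + 0.8         # Porod signal plus flat background

      A = np.column_stack([q ** -4, np.ones_like(q)])
      k, b = np.linalg.lstsq(A, intensity, rcond=None)[0]
      print(f"Porod constant K = {k:.2e}, background = {b:.2f}")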

  13. A Java-based electronic healthcare record software for beta-thalassaemia.

    Science.gov (United States)

    Deftereos, S; Lambrinoudakis, C; Andriopoulos, P; Farmakis, D; Aessopos, A

    2001-01-01

    Beta-thalassaemia is a hereditary disease, the prevalence of which is high in persons of Mediterranean, African, and Southeast Asian ancestry. In Greece it constitutes an important public health problem. Beta-thalassaemia necessitates continuous and complicated health care procedures such as daily chelation; biweekly transfusions; and periodic cardiology, endocrinology, and hepatology evaluations. Typically, different care items are offered in different, often-distant, health care units, which leads to increased patient mobility. This is especially true in rural areas. Medical records of patients suffering from beta-thalassaemia are inevitably complex and grow in size very fast. They are currently paper-based, scattered over all units involved in the care process. This hinders communication of information between health care professionals and makes processing of the medical records difficult, thus impeding medical research. Our objective is to provide an electronic means for recording, communicating, and processing all data produced in the context of the care process of patients suffering from beta-thalassaemia. We have developed - and we present in this paper - Java-based Electronic Healthcare Record (EHCR) software, called JAnaemia. JAnaemia is a general-purpose EHCR application, which can be customized for use in all medical specialties. Customization for beta-thalassaemia has been performed in collaboration with 4 Greek hospitals. To be capable of coping with patient record diversity, JAnaemia has been based on the EHCR architecture proposed in the ENV 13606:1999 standard, published by the CEN/TC251 committee. Compliance with the CEN architecture also ensures that several additional requirements are fulfilled in relation to clinical comprehensiveness; to record sharing and communication; and to ethical, medico-legal, and computational issues. Special care has been taken to provide a user-friendly, form-based interface for data entry and processing. The

  14. Coherent anti-Stokes Raman scattering microscopy with a photonic crystal fiber based light source

    DEFF Research Database (Denmark)

    Paulsen, H.N.; Hilligsøe, Karen Marie; Thøgersen, J.

    2003-01-01

    A coherent anti-Stokes Raman scattering microscope based on a Ti:sapphire femtosecond oscillator and a photonic crystal fiber is demonstrated. The nonlinear response of the fiber is used to generate the additional wavelength needed in the Raman process. The applicability of the setup is demonstra…

  15. Nontargeted diagnostic ion network analysis (NINA): A software to streamline the analytical workflow for untargeted characterization of natural medicines.

    Science.gov (United States)

    Ye, Hui; Zhu, Lin; Sun, Di; Luo, Xiaozhuo; Lu, Gaoyuan; Wang, Hong; Wang, Jing; Cao, Guoxiu; Xiao, Wei; Wang, Zhenzhong; Wang, Guangji; Hao, Haiping

    2016-11-30

    The characterization of herbal prescriptions serves as a foundation for quality control and regulation of herbal medicines. Previously, the characterization of herbal chemicals from natural medicines often relied on the analysis of signature fragment ions from the acquired tandem mass spectrometry (MS/MS) spectra with prior knowledge of the herbal species present in the herbal prescriptions of interest. Nevertheless, such an approach is often limited to target components, and it risks missing critical components that we have no prior knowledge of. We previously reported a "diagnostic ion-guided network bridging" strategy, a generally applicable and robust approach for analyzing unknown substances from complex mixtures in an untargeted manner. In this study, we have developed a standalone software application named "Nontargeted Diagnostic Ion Network Analysis (NINA)" with a graphical user interface, based on a strategy for post-acquisition data analysis. NINA allows one to rapidly determine the nontargeted diagnostic ions (NIs) by summarizing all of the fragment ions shared by the precursors from the acquired MS/MS spectra. A NI-guided network using bridging components that possess two or more NIs can then be established via NINA. With such a network, we can sequentially identify the structures of all the NIs once a single compound has been identified de novo. The structures of the NIs can then be used as a priori knowledge to narrow down the database hits to candidates containing the sub-structure of the corresponding NI. Subsequently, we applied the NINA software to the characterization of a model herbal prescription, Re-Du-Ning injection, and rapidly identified 56 herbal chemicals from the prescription using an ultra-performance liquid chromatography quadrupole time-of-flight system in the negative mode, with no knowledge of the herbal species or herbal chemicals in the mixture. Therefore, we believe the applications of NINA will greatly facilitate the characterization
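    The core step, finding fragment m/z values shared (within a tolerance) by several precursors' MS/MS spectra, can be sketched in Python as follows; the spectra, tolerance and threshold are invented and are not taken from the NINA software.

      # Candidate nontargeted diagnostic ions: fragments shared by many precursors.
      spectra = {              # precursor m/z -> fragment m/z list (invented)
          463.1: [191.05, 285.04, 300.03],
          447.1: [191.05, 284.03, 255.03],
          609.2: [191.06, 300.03, 271.02],
      }
      TOL = 0.02               # m/z tolerance
      MIN_SHARED = 3           # minimum precursors sharing a fragment

      candidates = sorted({f for frags in spectra.values() for f in frags})
      diagnostic = []
      for f in candidates:
          hits = sum(any(abs(f - x) <= TOL for x in frags)
                     for frags in spectra.values())
          if hits >= MIN_SHARED and not any(abs(f - d) <= TOL for d in diagnostic):
              diagnostic.append(f)
      print("nontargeted diagnostic ions:", diagnostic)   # -> [191.05]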

  16. Biological impact of music and software-based auditory training

    OpenAIRE

    Kraus, Nina

    2012-01-01

    Auditory-based communication skills are developed at a young age and are maintained throughout our lives. However, some individuals – both young and old – encounter difficulties in achieving or maintaining communication proficiency. Biological signals arising from hearing sounds relate to real-life communication skills such as listening to speech in noisy environments and reading, pointing to an intersection between hearing and cognition. Musical experience, amplification, and software-based ...

  17. Usage models in reliability assessment of software-based systems

    Energy Technology Data Exchange (ETDEWEB)

    Haapanen, P.; Pulkkinen, U. [VTT Automation, Espoo (Finland); Korhonen, J. [VTT Electronics, Espoo (Finland)

    1997-04-01

    This volume in the OHA-project report series deals with the statistical reliability assessment of software based systems on the basis of dynamic test results and qualitative evidence from the system design process. Other reports to be published later on in the OHA-project report series will handle the diversity requirements in safety critical software-based systems, generation of test data from operational profiles and handling of programmable automation in plant PSA-studies. In this report the issues related to the statistical testing and especially automated test case generation are considered. The goal is to find an efficient method for building usage models for the generation of statistically significant set of test cases and to gather practical experiences from this method by applying it in a case study. The scope of the study also includes the tool support for the method, as the models may grow quite large and complex. (32 refs., 30 figs.).

  18. Usage models in reliability assessment of software-based systems

    International Nuclear Information System (INIS)

    Haapanen, P.; Pulkkinen, U.; Korhonen, J.

    1997-04-01

    This volume in the OHA-project report series deals with the statistical reliability assessment of software based systems on the basis of dynamic test results and qualitative evidence from the system design process. Other reports to be published later on in the OHA-project report series will handle the diversity requirements in safety critical software-based systems, generation of test data from operational profiles and handling of programmable automation in plant PSA-studies. In this report the issues related to the statistical testing and especially automated test case generation are considered. The goal is to find an efficient method for building usage models for the generation of statistically significant set of test cases and to gather practical experiences from this method by applying it in a case study. The scope of the study also includes the tool support for the method, as the models may grow quite large and complex. (32 refs., 30 figs.)
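    Statistical test generation from a usage model can be sketched as sampling a Markov chain over user operations; the states and transition probabilities below are hypothetical, not from the OHA project.

      # Sample test cases (operation sequences) from a Markov usage model.
      import random

      usage_model = {
          "start":  [("login", 1.0)],
          "login":  [("browse", 0.7), ("logout", 0.3)],
          "browse": [("browse", 0.5), ("update", 0.3), ("logout", 0.2)],
          "update": [("browse", 0.6), ("logout", 0.4)],
          "logout": [],                      # terminal state
      }

      def generate_test_case(seed=None):
          rng = random.Random(seed)
          state, path = "start", []
          while usage_model[state]:
              state = rng.choices(*zip(*usage_model[state]))[0]
              path.append(state)
          return path

      for i in range(3):
          print(generate_test_case(seed=i))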

  19. A preliminary study of breast cancer diagnosis using laboratory based small angle x-ray scattering

    Science.gov (United States)

    Round, A. R.; Wilkinson, S. J.; Hall, C. J.; Rogers, K. D.; Glatter, O.; Wess, T.; Ellis, I. O.

    2005-09-01

    Breast tissue collected from tumour samples and normal tissue from bi-lateral mastectomy procedures were examined using small angle x-ray scattering. Previous work has indicated that breast tissue disease diagnosis could be performed using small angle x-ray scattering (SAXS) from a synchrotron radiation source. The technique would be more useful to health services if it could be made to work using a conventional x-ray source. Consistent and reliable differences in x-ray scatter distributions were observed between samples from normal and tumour tissue samples using the laboratory based 'SAXSess' system. Albeit from a small number of samples, a sensitivity of 100% was obtained. This result encourages us to pursue the implementation of SAXS as a laboratory based diagnosis technique.

  20. A preliminary study of breast cancer diagnosis using laboratory based small angle x-ray scattering

    Energy Technology Data Exchange (ETDEWEB)

    Round, A R [Daresbury Laboratories, Warrington, WA4 4AD (United Kingdom); Wilkinson, S J [Daresbury Laboratories, Warrington, WA4 4AD (United Kingdom); Hall, C J [Daresbury Laboratories, Warrington, WA4 4AD (United Kingdom); Rogers, K D [Department of Materials and Medical Sciences, Cranfield University, Swindon, SN6 8LA (United Kingdom); Glatter, O [Department of Chemistry, University of Graz (Austria); Wess, T [School of Optometry and Vision Sciences, Cardiff University, Cardiff CF10 3NB, Wales (United Kingdom); Ellis, I O [Nottingham City Hospital, Nottingham (United Kingdom)

    2005-09-07

    Breast tissue collected from tumour samples and normal tissue from bi-lateral mastectomy procedures were examined using small angle x-ray scattering. Previous work has indicated that breast tissue disease diagnosis could be performed using small angle x-ray scattering (SAXS) from a synchrotron radiation source. The technique would be more useful to health services if it could be made to work using a conventional x-ray source. Consistent and reliable differences in x-ray scatter distributions were observed between samples from normal and tumour tissue samples using the laboratory based 'SAXSess' system. Albeit from a small number of samples, a sensitivity of 100% was obtained. This result encourages us to pursue the implementation of SAXS as a laboratory based diagnosis technique.

  1. A preliminary study of breast cancer diagnosis using laboratory based small angle x-ray scattering

    International Nuclear Information System (INIS)

    Round, A R; Wilkinson, S J; Hall, C J; Rogers, K D; Glatter, O; Wess, T; Ellis, I O

    2005-01-01

    Breast tissue collected from tumour samples and normal tissue from bi-lateral mastectomy procedures were examined using small angle x-ray scattering. Previous work has indicated that breast tissue disease diagnosis could be performed using small angle x-ray scattering (SAXS) from a synchrotron radiation source. The technique would be more useful to health services if it could be made to work using a conventional x-ray source. Consistent and reliable differences in x-ray scatter distributions were observed between samples from normal and tumour tissue samples using the laboratory based 'SAXSess' system. Albeit from a small number of samples, a sensitivity of 100% was obtained. This result encourages us to pursue the implementation of SAXS as a laboratory based diagnosis technique

  2. Graphic Design Of “Green Mission” Education Game Using Software Based On Vector

    Directory of Open Access Journals (Sweden)

    Nur Yanti

    2018-01-01

    An educational game is a digital game designed around educational elements to support teaching and learning through interactive media technology. An educational game generally has an engaging appearance, easy-to-use menus, and GUI-based (Graphic User Interface) colour combinations that create appeal for users, since it is undeniable that the human brain tends to grasp learning more quickly through visual images than through text. The graphic design of an educational game is therefore one of its important points. Software applications are one solution for producing game designs, among them vector-based software applications. Various software packages can be used according to their individual functions and uses, but in general they work in much the same way.

  3. E language based on MCNP modeling software for autonomous

    International Nuclear Information System (INIS)

    Li Fei; Ge Liangquan; Zhang Qingxian

    2010-01-01

    MCNP (Monte Carlo N-Particle Code) is a computer program based on the Monte Carlo method for simulating the movement of neutrons, photons and other particles. Because of its powerful computational simulation, it has been widely used in many fields, with flexible and universal features; however, operating the software requires considerable professional expertise, which has greatly restricted its use and hindered its later development. The E language was therefore used to develop autonomous MCNP modelling software, intended for users who are not familiar with MCNP and cannot create object models, to get rid of the dull, red-tape 'notebook'-style program input and to build a new MCNP modelling system. (authors)

  4. SDMdata: A Web-Based Software Tool for Collecting Species Occurrence Records.

    Directory of Open Access Journals (Sweden)

    Xiaoquan Kong

    It is important to easily and efficiently obtain high quality species distribution data for predicting the potential distribution of species using species distribution models (SDMs). There is a need for a powerful software tool to automatically or semi-automatically assist in identifying and correcting errors. Here, we use Python to develop a web-based software tool (SDMdata) to easily collect occurrence data from the Global Biodiversity Information Facility (GBIF) and to check species names and the accuracy of coordinates (latitude and longitude). It is open source software (GNU Affero General Public License/AGPL licensed), allowing anyone to access and manipulate the source code. SDMdata is available online free of charge from .
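    The kind of record check SDMdata automates can be sketched in a few lines of Python; the records and field names below are invented and do not reflect SDMdata's actual data model.

      # Keep only occurrence records with numeric, in-range coordinates.
      records = [
          {"species": "Panthera uncia", "lat": 35.2, "lon": 76.9},
          {"species": "Panthera uncia", "lat": 135.2, "lon": 76.9},  # bad latitude
          {"species": "Panthera uncia", "lat": None, "lon": 76.9},   # missing value
      ]

      def coords_ok(rec):
          lat, lon = rec.get("lat"), rec.get("lon")
          return (isinstance(lat, (int, float)) and isinstance(lon, (int, float))
                  and -90.0 <= lat <= 90.0 and -180.0 <= lon <= 180.0)

      clean = [r for r in records if coords_ok(r)]
      print(f"kept {len(clean)} of {len(records)} records")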

  5. Towards an Early Software Effort Estimation Based on Functional and Non-Functional Requirements

    NARCIS (Netherlands)

    Kassab, M.; Daneva, Maia; Ormanjieva, Olga; Abran, A.; Braungarten, R.; Dumke, R.; Cuadrado-Gallego, J.; Brunekreef, J.

    2009-01-01

    The increased awareness of the non-functional requirements as a key to software project and product success makes explicit the need to include them in any software project effort estimation activity. However, the existing approaches to defining size-based effort relationships still pay insufficient

  6. gemcWeb: A Cloud Based Nuclear Physics Simulation Software

    Science.gov (United States)

    Markelon, Sam

    2017-09-01

    gemcWeb allows users to run nuclear physics simulations from the web. Being completely device agnostic, it lets scientists run simulations from anywhere with an Internet connection. With a full user system, gemcWeb allows users to revisit and revise their projects, and to share configurations and results with collaborators. gemcWeb is based on the simulation software gemc, which is in turn based on standard Geant4; it requires no C++, gemc, or Geant4 knowledge. A simple but powerful GUI allows users to configure their project from geometries and configurations stored on the deployment server. Simulations are then run on the server, with results posted to the user and then securely stored. Python-based and open-source, the main instance of gemcWeb is hosted internally at Jefferson National Laboratory and used by the CLAS12 and Electron-Ion Collider Project groups. However, as the software is open-source and hosted as a GitHub repository, an instance can be deployed on the open web or on any institution's intranet. An instance can be configured to host experiments specific to an institution, and the code base can be modified by any individual or group. Special thanks to: Maurizio Ungaro, PhD, creator of gemc; Markus Diefenthaler, PhD, advisor; and Kyungseon Joo, PhD, advisor.

  7. Hybrid radiosity-SP3 equation based bioluminescence tomography reconstruction for turbid medium with low- and non-scattering regions

    Science.gov (United States)

    Chen, Xueli; Zhang, Qitan; Yang, Defu; Liang, Jimin

    2014-01-01

    To provide an ideal solution for a specific problem of gastric cancer detection in which low-scattering regions simultaneously existed with both the non- and high-scattering regions, a novel hybrid radiosity-SP3 equation based reconstruction algorithm for bioluminescence tomography was proposed in this paper. In the algorithm, the third-order simplified spherical harmonics approximation (SP3) was combined with the radiosity equation to describe the bioluminescent light propagation in tissues, which provided acceptable accuracy for the turbid medium with both low- and non-scattering regions. The performance of the algorithm was evaluated with digital mouse based simulations and a gastric cancer-bearing mouse based in situ experiment. Primary results demonstrated the feasibility and superiority of the proposed algorithm for the turbid medium with low- and non-scattering regions.
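    For reference, the SP3 part of such a hybrid model is commonly written as two coupled diffusion-like equations for composite moments (one standard form, after Klose and Larsen; the paper's exact notation may differ), with the moments defined through μ_ai = μ_a + μ_s(1 - g^i) and the fluence Φ recovered from them:

```latex
% One standard SP3 form (after Klose & Larsen); \mu_{ai} = \mu_a + \mu_s (1 - g^i)
\begin{aligned}
-\nabla\cdot\tfrac{1}{3\mu_{a1}}\nabla\varphi_1 + \mu_a\varphi_1
   &= Q + \tfrac{2}{3}\mu_a\varphi_2 ,\\
-\nabla\cdot\tfrac{1}{7\mu_{a3}}\nabla\varphi_2
   + \bigl(\tfrac{4}{9}\mu_a + \tfrac{5}{9}\mu_{a2}\bigr)\varphi_2
   &= -\tfrac{2}{3}Q + \tfrac{2}{3}\mu_a\varphi_1 ,
\end{aligned}
\qquad
\Phi = \varphi_1 - \tfrac{2}{3}\varphi_2 .
```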

  8. Hybrid radiosity-SP3 equation based bioluminescence tomography reconstruction for turbid medium with low- and non-scattering regions

    International Nuclear Information System (INIS)

    Chen, Xueli; Zhang, Qitan; Yang, Defu; Liang, Jimin

    2014-01-01

    To provide an ideal solution for a specific problem of gastric cancer detection in which low-scattering regions simultaneously existed with both the non- and high-scattering regions, a novel hybrid radiosity-SP3 equation based reconstruction algorithm for bioluminescence tomography was proposed in this paper. In the algorithm, the third-order simplified spherical harmonics approximation (SP3) was combined with the radiosity equation to describe the bioluminescent light propagation in tissues, which provided acceptable accuracy for the turbid medium with both low- and non-scattering regions. The performance of the algorithm was evaluated with digital mouse based simulations and a gastric cancer-bearing mouse based in situ experiment. Primary results demonstrated the feasibility and superiority of the proposed algorithm for the turbid medium with low- and non-scattering regions.

  9. Software development for statistical handling of dosimetric and epidemiological data base

    International Nuclear Information System (INIS)

    Amaro, M.

    1990-01-01

    The dose records from different groups of occupationally exposed workers are available in a computerized data base whose main purpose is individual dose follow-up. Apart from this objective, such a dosimetric data base can also be used for statistical analysis. The statistical information that can be extracted from the data base serves mainly two kinds of objectives: individual and collective dose distributions and statistics, and epidemiological statistics. The report describes the software developed to obtain the statistical reports required by the Regulatory Body, as well as any other type of dose distributions or statistics to be included in epidemiological studies. A User's Guide for the operators who handle this software package, and the code listings, are also included in the report. (Author) 2 refs
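    A minimal sketch of the kind of statistics such a package extracts from a dose-record data base: the collective dose, the individual dose distribution, and the fraction of workers above a recording threshold (all data invented):

```python
# Toy dose statistics over an invented set of annual individual doses.
import numpy as np

annual_dose_mSv = np.array([0.1, 0.3, 0.0, 1.2, 4.8, 0.7, 2.1, 0.0, 6.3, 0.9])

collective_dose = annual_dose_mSv.sum()           # person-mSv
mean_dose = annual_dose_mSv.mean()
p95 = np.percentile(annual_dose_mSv, 95)
above_threshold = (annual_dose_mSv > 1.0).mean()  # fraction above 1 mSv

print(f"collective dose: {collective_dose:.1f} person-mSv")
print(f"mean individual dose: {mean_dose:.2f} mSv, 95th percentile: {p95:.2f} mSv")
print(f"fraction of workers above 1 mSv: {above_threshold:.0%}")
```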

  10. BBN based Quantitative Assessment of Software Design Specification

    International Nuclear Information System (INIS)

    Eom, Heung-Seop; Park, Gee-Yong; Kang, Hyun-Gook; Kwon, Kee-Choon; Chang, Seung-Cheol

    2007-01-01

    Probabilistic Safety Assessment (PSA), which is one of the important methods for assessing the overall safety of a nuclear power plant (NPP), requires quantitative reliability information for safety-critical software, but conventional reliability assessment methods cannot provide enough information for the PSA of an NPP. Therefore, current PSA that includes safety-critical software usually either does not consider the reliability of the software or uses arbitrary values for it. In order to resolve this situation, this paper proposes a method that can produce quantitative reliability information on safety-critical software for PSA by making use of Bayesian Belief Networks (BBN). BBN has generally been used to model uncertain systems in many research fields, including the safety assessment of software. The proposed method was constructed by utilizing BBN, which can combine the qualitative and the quantitative evidence relevant to the reliability of safety-critical software. The constructed BBN model can infer a conclusion in a formal and quantitative way. A case study was carried out with the proposed method to assess the quality of the software design specification (SDS) of safety-critical software that will be embedded in a reactor protection system. The intermediate V and V results of the software design specification were used as inputs to the BBN model.
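    To make the inference step concrete, the toy below evaluates a two-node belief network by direct enumeration: a prior on design-specification quality is updated by a V and V outcome via Bayes' rule. The structure and all probabilities are invented for illustration; the paper's BBN is far richer:

```python
# Two-node toy BBN: SDS quality -> V&V outcome, inference by enumeration.
P_quality = {"good": 0.7, "poor": 0.3}   # invented prior on SDS quality
P_vv_pass = {"good": 0.9, "poor": 0.2}   # invented P(V&V passes | quality)

def posterior_quality(vv_passed: bool) -> dict:
    """P(quality | V&V result) via Bayes' rule."""
    joint = {
        q: P_quality[q] * (P_vv_pass[q] if vv_passed else 1 - P_vv_pass[q])
        for q in P_quality
    }
    z = sum(joint.values())
    return {q: p / z for q, p in joint.items()}

print(posterior_quality(vv_passed=True))   # belief in 'good' rises
print(posterior_quality(vv_passed=False))  # belief in 'good' drops
```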

  11. Safety prediction for basic components of safety-critical software based on static testing

    International Nuclear Information System (INIS)

    Son, H.S.; Seong, P.H.

    2000-01-01

    The purpose of this work is to develop a safety prediction method, with which we can predict the risk of software components based on static testing results at the early development stage. The predictive model combines the major factor with the quality factor for the components, which are calculated based on the measures proposed in this work. The application to a safety-critical software system demonstrates the feasibility of the safety prediction method. (authors)

  12. Ultrasound scatter in heterogeneous 3D microstructures: Parameters affecting multiple scattering

    Science.gov (United States)

    Engle, B. J.; Roberts, R. A.; Grandin, R. J.

    2018-04-01

    This paper reports on a computational study of ultrasound propagation in heterogeneous metal microstructures. Random spatial fluctuations in elastic properties over a range of length scales relative to ultrasound wavelength can give rise to scatter-induced attenuation, backscatter noise, and phase front aberration. It is of interest to quantify the dependence of these phenomena on the microstructure parameters, for the purpose of quantifying deleterious consequences on flaw detectability, and for the purpose of material characterization. Valuable tools for estimation of microstructure parameters (e.g. grain size) through analysis of ultrasound backscatter have been developed based on approximate weak-scattering models. While useful, it is understood that these tools display inherent inaccuracy when multiple scattering phenomena significantly contribute to the measurement. It is the goal of this work to supplement weak scattering model predictions with corrections derived through application of an exact computational scattering model to explicitly prescribed microstructures. The scattering problem is formulated as a volume integral equation (VIE) displaying a convolutional Green-function-derived kernel. The VIE is solved iteratively employing FFT-based convolution. Realizations of random microstructures are specified on the micron scale using statistical property descriptions (e.g. grain size and orientation distributions), which are then spatially filtered to provide rigorously equivalent scattering media on a length scale relevant to ultrasound propagation. Scattering responses from ensembles of media representations are averaged to obtain mean and variance of quantities such as attenuation and backscatter noise levels, as a function of microstructure descriptors. The computational approach will be summarized, and examples of application will be presented.
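    The computational pattern described, a fixed-point iteration on a volume integral equation with the convolutional kernel applied via FFT, can be sketched in a one-dimensional scalar toy model (kernel, contrast and incident field are invented and kept weak so the iteration converges):

```python
# 1D scalar toy of the pattern: solve u = u_inc + G*(V u) by fixed-point
# iteration, applying the convolutional Green-function kernel via FFT.
import numpy as np

n = 256
x = np.linspace(-10, 10, n)
u_inc = np.exp(1j * 2.0 * x)            # incident field
v = 0.1 * (np.abs(x) < 2)               # scattering contrast ("V"), weak
g = 0.05 * np.exp(-np.abs(x))           # toy convolution kernel ("G")

G = np.fft.fft(np.fft.ifftshift(g))     # kernel spectrum, computed once
u = u_inc.copy()
for it in range(50):
    u_next = u_inc + np.fft.ifft(G * np.fft.fft(v * u))  # FFT-based convolution
    if np.linalg.norm(u_next - u) < 1e-10 * np.linalg.norm(u):
        u = u_next
        break
    u = u_next
print(f"fixed-point iteration stopped after {it + 1} steps")
```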

  13. Development of geophysical and geochemical data processing software based on component GIS

    International Nuclear Information System (INIS)

    Ke Dan; Yu Xiang; Wu Qubo; Han Shaoyang; Li Xi

    2013-01-01

    Based on component GIS and mixed programming techniques, a software package was designed and developed which combines basic GIS functions with conventional and unconventional processing methods for regional geophysical and geochemical data. The software has many advantages, such as a friendly interface, ease of use and practical functions, and it provides a useful platform for regional geophysical and geochemical data processing. (authors)

  14. Kharkov X-ray Generator Based On Compton Scattering

    International Nuclear Information System (INIS)

    Shcherbakov, A.; Zelinsky, A.; Mytsykov, A.; Gladkikh, P.; Karnaukhov, I.; Lapshin, V.; Telegin, Y.; Androsov, V.; Bulyak, E.; Botman, J.I.M.; Tatchyn, R.; Lebedev, A.

    2004-01-01

    Nowadays X-ray sources based on storage rings with low beam energy and Compton scattering of intense laser beams are under development in several laboratories. An international cooperative project for an advanced X-ray source of this type at the Kharkov Institute of Physics and Technology (KIPT) is described, and the status of the project is reviewed. The design lattice of the storage ring and the calculated X-ray beam parameters are presented. The results of numerical simulations carried out for the proposed facility show that a peak spectral X-ray intensity of about 10^14 can be produced.
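    For orientation, the photon energy of such a Compton (inverse-Compton) source follows the standard head-on scattering formula below, where E_L is the laser photon energy, γ the electron Lorentz factor and θ the observation angle; on axis the energy boost is approximately 4γ²:

```latex
% Scattered photon energy for head-on laser Compton scattering
% (electron Lorentz factor \gamma, laser photon energy E_L,
%  observation angle \theta):
E_x \approx \frac{4\gamma^{2} E_L}
                 {1 + (\gamma\theta)^{2} + 4\gamma E_L/(m_e c^{2})}
```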

  15. TH-A-18C-04: Ultrafast Cone-Beam CT Scatter Correction with GPU-Based Monte Carlo Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Y [UT Southwestern Medical Center, Dallas, TX (United States); Southern Medical University, Guangzhou (China); Bai, T [UT Southwestern Medical Center, Dallas, TX (United States); Xi'an Jiaotong University, Xi'an (China); Yan, H; Ouyang, L; Wang, J; Pompos, A; Jiang, S; Jia, X [UT Southwestern Medical Center, Dallas, TX (United States); Zhou, L [Southern Medical University, Guangzhou (China)

    2014-06-15

    Purpose: Scatter artifacts severely degrade the image quality of cone-beam CT (CBCT). We present an ultrafast scatter correction framework that uses GPU-based Monte Carlo (MC) simulation and a prior patient CT image, aiming to automatically finish the whole process, including both scatter correction and reconstruction, within 30 seconds. Methods: The method consists of six steps: 1) FDK reconstruction using raw projection data; 2) rigid registration of the planning CT to the FDK results; 3) MC scatter calculation at sparse view angles using the planning CT; 4) interpolation of the calculated scatter signals to other angles; 5) removal of scatter from the raw projections; 6) FDK reconstruction using the scatter-corrected projections. In addition to using a GPU to accelerate the MC photon simulations, we also use a small number of photons and a down-sampled CT image in the simulation to further reduce the computation time. A novel denoising algorithm is used to eliminate the MC scatter noise caused by the low photon numbers. The method is validated on head-and-neck cases with simulated and clinical data. Results: We have studied the impacts of photon histories and volume down-sampling factors on the accuracy of the scatter estimation. A Fourier analysis was conducted to show that scatter images calculated at 31 angles are sufficient to restore those at all angles with <0.1% error. For the simulated case with a resolution of 512×512×100, we simulated 10M photons per angle. The total computation time is 23.77 seconds on a Nvidia GTX Titan GPU. The scatter-induced shading/cupping artifacts are substantially reduced, and the average HU error of a region-of-interest is reduced from 75.9 to 19.0 HU. Similar results were found for a real patient case. Conclusion: A practical ultrafast MC-based CBCT scatter correction scheme is developed. The whole process of scatter correction and reconstruction is accomplished within 30 seconds. This study is supported in part by NIH (1R01CA154747-01), The Core Technology Research
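    Steps 3-5 of the workflow can be sketched as follows; the Monte Carlo engine is stubbed out, and the array sizes and scatter shape are invented (scipy's interp1d stands in for whatever interpolation the authors use):

```python
# Sketch of steps 3-5: MC scatter at sparse gantry angles, interpolation
# across the angle axis, subtraction from the raw projections.
import numpy as np
from scipy.interpolate import interp1d

n_angles, nu, nv = 360, 128, 128
raw = np.random.rand(n_angles, nu, nv)       # stand-in raw projections

def mc_scatter(angle_index: int) -> np.ndarray:
    """Placeholder for the GPU Monte Carlo scatter estimate at one angle."""
    level = 0.2 + 0.05 * np.sin(2 * np.pi * angle_index / n_angles)
    return np.full((nu, nv), level)

sparse_idx = np.arange(0, n_angles, 12)      # ~30 simulated angles (step 3)
sparse_scatter = np.stack([mc_scatter(i) for i in sparse_idx])

interp = interp1d(sparse_idx, sparse_scatter, axis=0, kind="linear",
                  fill_value="extrapolate")  # step 4: fill in all angles
scatter_all = interp(np.arange(n_angles))

corrected = np.clip(raw - scatter_all, 0.0, None)   # step 5
# 'corrected' would now feed the FDK reconstruction (step 6).
```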

  16. 31 CFR 560.540 - Exportation of certain services and software incident to Internet-based communications.

    Science.gov (United States)

    2010-07-01

    ....540 Exportation of certain services and software incident to Internet-based communications. (a) To the....S. persons, wherever located, to persons in Iran of software necessary to enable the services... indirect exportation of services or software with knowledge or reason to know that such services or...

  17. 31 CFR 538.533 - Exportation of certain services and software incident to Internet-based communications.

    Science.gov (United States)

    2010-07-01

    ....533 Exportation of certain services and software incident to Internet-based communications. (a) To the....S. persons, wherever located, to persons in Sudan of software necessary to enable the services... indirect exportation of services or software with knowledge or reason to know that such services or...

  18. Collaboration in Global Software Engineering Based on Process Description Integration

    Science.gov (United States)

    Klein, Harald; Rausch, Andreas; Fischer, Edward

    Globalization is one of the big trends in software development. Development projects need a variety of different resources with appropriate expert knowledge to be successful. More and more of these resources are nowadays obtained from specialized organizations and countries all over the world, varying in development approaches, processes, and culture. As seen with early outsourcing attempts, collaboration may fail due to these differences. Hence, the major challenge in global software engineering is to streamline collaborating organizations towards a successful conjoint development. Based on typical collaboration scenarios, this paper presents a structured approach to integrate processes in a comprehensible way.

  19. Analysis and recommendations for a reliable programming of software based safety systems

    International Nuclear Information System (INIS)

    Nunez McLeod, J.; Nunez McLeod, J.E.; Rivera, S.S.

    1997-01-01

    The present paper summarizes the results of several studies performed during the development of high-reliability software on i486 microprocessors, towards its utilization in control and safety systems for nuclear power plants. The work is based on software programmed in the C language. Several recommendations oriented to high-reliability software are analyzed, relating the requirements on the high-level language to their influence at the assembler level. Several metrics are implemented that allow for the quantification of the results achieved. New metrics were developed and others were adapted, in order to obtain more efficient indices for the software description. Such metrics are helpful for visualizing how well the software under development conforms to the quality rules in use. A specific program developed to assist the reliability analyst with this quantification is also presented in the paper. It analyzes an executable program written in the C language, disassembling it and evaluating its internal structures. (author)
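    As a crude illustration of one such metric (not the authors' tool), the snippet below estimates McCabe cyclomatic complexity of a C function by counting decision points in the raw source; real metric tools work on the parsed control-flow graph:

```python
# Rough cyclomatic-complexity estimate for C source by decision-point count.
import re

DECISIONS = re.compile(r"\b(if|for|while|case)\b|&&|\|\||\?")

def cyclomatic_estimate(c_source: str) -> int:
    """McCabe complexity ~ number of decision points + 1."""
    return len(DECISIONS.findall(c_source)) + 1

src = """
int clamp(int x, int lo, int hi) {
    if (x < lo) return lo;
    return (x > hi) ? hi : x;
}
"""
print(cyclomatic_estimate(src))  # -> 3 (one 'if', one '?:', plus 1)
```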

  20. Development of an automated asbestos counting software based on fluorescence microscopy.

    Science.gov (United States)

    Alexandrov, Maxym; Ichida, Etsuko; Nishimura, Tomoki; Aoki, Kousuke; Ishida, Takenori; Hirota, Ryuichi; Ikeda, Takeshi; Kawasaki, Tetsuo; Kuroda, Akio

    2015-01-01

    An emerging alternative to the commonly used analytical methods for asbestos analysis is fluorescence microscopy (FM), which relies on highly specific asbestos-binding probes to distinguish asbestos from interfering non-asbestos fibers. However, all types of microscopic asbestos analysis require laborious examination of a large number of fields of view and are prone to subjective errors and large variability between asbestos counts by different analysts and laboratories. A possible solution to these problems is automated counting of asbestos fibers by image analysis software, which would lower the cost and increase the reliability of asbestos testing. This study seeks to develop fiber recognition and counting software for FM-based asbestos analysis. We discuss the main features of the developed software and the results of its testing. Software testing showed good correlation between automated and manual counts for samples with medium and high fiber concentrations. At low fiber concentrations, the automated counts were less accurate, leading us to implement a correction mode for automated counts. While the full automation of asbestos analysis would require further improvements in the accuracy of fiber identification, the developed software can already assist professional asbestos analysts and record detailed fiber dimensions for use in epidemiological research.
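    The counting step can be sketched as threshold-label-filter, assuming a fluorescence image in which the probe signal is bright and fibers are elongated; the threshold and the bounding-box elongation test below are invented simplifications:

```python
# Threshold, label connected components, keep elongated objects as fibers.
import numpy as np
from scipy import ndimage

def count_fibers(image: np.ndarray, intensity_thresh: float,
                 min_elongation: float = 3.0) -> int:
    mask = image > intensity_thresh
    labels, n = ndimage.label(mask)
    count = 0
    for sl in ndimage.find_objects(labels):
        h = sl[0].stop - sl[0].start
        w = sl[1].stop - sl[1].start
        # Crude elongation test on the bounding box: fibers are long and thin.
        if max(h, w) >= min_elongation * max(1, min(h, w)):
            count += 1
    return count

img = np.zeros((64, 64)); img[10, 5:40] = 1.0   # one synthetic "fiber"
print(count_fibers(img, intensity_thresh=0.5))   # -> 1
```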

  1. Binary moving-blocker-based scatter correction in cone-beam computed tomography with width-truncated projections: proof of concept

    Science.gov (United States)

    Lee, Ho; Fahimian, Benjamin P.; Xing, Lei

    2017-03-01

    This paper proposes a binary moving-blocker (BMB)-based technique for scatter correction in cone-beam computed tomography (CBCT). In concept, a beam blocker consisting of lead strips, mounted in front of the x-ray tube, moves rapidly in and out of the beam during a single gantry rotation. The projections are acquired in alternating phases of blocked and unblocked cone beams, where the blocked phase results in a stripe pattern in the width direction. To derive the scatter map from the blocked projections, 1D B-Spline interpolation/extrapolation is applied by using the detected information in the shaded regions. The scatter map of the unblocked projections is corrected by averaging two scatter maps that correspond to their adjacent blocked projections. The scatter-corrected projections are obtained by subtracting the corresponding scatter maps from the projection data and are utilized to generate the CBCT image by a compressed-sensing (CS)-based iterative reconstruction algorithm. Catphan504 and pelvis phantoms were used to evaluate the method’s performance. The proposed BMB-based technique provided an effective method to enhance the image quality by suppressing scatter-induced artifacts, such as ring artifacts around the bowtie area. Compared to CBCT without a blocker, the spatial nonuniformity was reduced from 9.1% to 3.1%. The root-mean-square error of the CT numbers in the regions of interest (ROIs) was reduced from 30.2 HU to 3.8 HU. In addition to high resolution, comparable to that of the benchmark image, the CS-based reconstruction also led to a better contrast-to-noise ratio in seven ROIs. The proposed technique enables complete scatter-corrected CBCT imaging with width-truncated projections and allows reducing the acquisition time to approximately half. This work may have significant implications for image-guided or adaptive radiation therapy, where CBCT is often used.
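    A hedged sketch of the scatter-map construction for one blocked projection row: the signal sampled in the lead-strip shadows (where, to a first approximation, only scatter reaches the detector) is B-spline interpolated and extrapolated across the width direction. Geometry, stripe positions and the shadow readings are invented:

```python
# 1D B-spline scatter map from the shaded stripes of a blocked projection.
import numpy as np
from scipy.interpolate import make_interp_spline

width = 256
proj = np.random.rand(width) + 1.0           # one detector row (blocked phase)
stripe_centers = np.arange(8, width, 32)     # centers of the lead-strip shadows

# In the shadows the detected signal is (approximately) pure scatter.
scatter_samples = proj[stripe_centers] * 0.3  # stand-in shadow readings

spline = make_interp_spline(stripe_centers, scatter_samples, k=3)
scatter_map = spline(np.arange(width))        # interpolate/extrapolate over width

# For an unblocked projection, average the maps of its two adjacent blocked
# neighbours, then subtract: primary = proj_unblocked - averaged scatter map.
```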

  2. The visual and remote analyzing software for a Linux-based radiation information acquisition system

    International Nuclear Information System (INIS)

    Fan Zhaoyang; Zhang Li; Chen Zhiqiang

    2003-01-01

    A visual and remote analyzing software package for radiation information, which has the merits of universality and credibility, was developed based on the Linux operating system and the TCP/IP network protocol. The software is used for visual debugging and real-time monitoring of a high-speed radiation information acquisition system, so that safe, direct and timely control can be assured. The paper describes the design philosophy of the software, which provides a reference for other software with the same purpose in similar systems.

  3. Development of a Software Based Firewall System for Computer Network Traffic Control

    Directory of Open Access Journals (Sweden)

    Ikhajamgbe OYAKHILOME

    2009-12-01

    Full Text Available The connection of an internal network to an external network such as the Internet has made it vulnerable to attacks. One class of network attack is unauthorized penetration into the network due to the openness of networks. It is possible for hackers to gain access to an internal network, and this poses great danger to the network and network resources. Our objective and major concern in the network design was to build a secured network, based on a software firewall, that ensured the integrity and confidentiality of information on the network. We studied several mechanisms to achieve this; one such mechanism is the implementation of a firewall system as a network defence. Our developed firewall has the ability to determine which network traffic should be allowed in or out of the network. Part of our work was also channelled towards a comprehensive study of hardware firewall security systems, with the aim of developing this software-based firewall system. Our software firewall goes a long way in protecting an internal network from external unauthorized traffic penetration. We also included antivirus software, which is lacking in most firewalls.

  4. Safety prediction for basic components of safety critical software based on static testing

    International Nuclear Information System (INIS)

    Son, H.S.; Seong, P.H.

    2001-01-01

    The purpose of this work is to develop a safety prediction method, with which we can predict the risk of software components based on static testing results at the early development stage. The predictive model combines the major factor with the quality factor for the components, both of which are calculated based on the measures proposed in this work. The application to a safety-critical software system demonstrates the feasibility of the safety prediction method. (authors)

  5. Nebula: reconstruction and visualization of scattering data in reciprocal space.

    Science.gov (United States)

    Reiten, Andreas; Chernyshov, Dmitry; Mathiesen, Ragnvald H

    2015-04-01

    Two-dimensional solid-state X-ray detectors can now operate at considerable data throughput rates that allow full three-dimensional sampling of scattering data from extended volumes of reciprocal space within second to minute time-scales. For such experiments, simultaneous analysis and visualization allows for remeasurements and a more dynamic measurement strategy. New software, Nebula, is presented. It efficiently reconstructs X-ray scattering data, generates three-dimensional reciprocal space data sets that can be visualized interactively, and aims to enable real-time processing in high-throughput measurements by employing parallel computing on commodity hardware.
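    The core reconstruction step of such software is the mapping of each detector pixel to a reciprocal-space vector q = k_out - k_in for elastic scattering; a minimal sketch with an invented flat-detector geometry:

```python
# Map a detector pixel to reciprocal space; |k_in| = |k_out| = 2*pi/lambda.
import numpy as np

wavelength = 1.0   # angstrom
det_dist = 100.0   # mm, sample-to-detector distance (invented)
pix = 0.172        # mm, pixel pitch (invented)

k = 2 * np.pi / wavelength
k_in = np.array([0.0, 0.0, k])   # incident beam along +z

def pixel_to_q(ix: float, iy: float, cx: float = 0.0, cy: float = 0.0) -> np.ndarray:
    """Reciprocal-space coordinate of detector pixel (ix, iy)."""
    r = np.array([(ix - cx) * pix, (iy - cy) * pix, det_dist])
    k_out = k * r / np.linalg.norm(r)   # elastic scattering: |k_out| = |k_in|
    return k_out - k_in

print(pixel_to_q(50, -30))
```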

  6. NuSEE: an integrated environment of software specification and V and V for PLC based safety-critical systems

    International Nuclear Information System (INIS)

    Koo, Seo Ryong; Seong, Poong Hyun; Yoo, Jun Beom; Cha, Sung Deok; Youn, Cheong; Han, Hyun Chul

    2006-01-01

    As the use of digital systems becomes more prevalent, adequate techniques for software specification and analysis have become increasingly important in Nuclear Power Plant (NPP) safety-critical systems. Additionally, the importance of software Verification and Validation (V and V) based on adequate specification has received greater emphasis in view of improving software quality. For thorough V and V of safety-critical systems, V and V should be performed throughout the software lifecycle. However, systematic V and V is difficult as it involves many manual-oriented tasks. Tool support is needed in order to more conveniently perform software V and V. In response, we developed four kinds of Computer Aided Software Engineering (CASE) tools to support system specification for a formal-based analysis according to the software lifecycle. In this work, we achieved optimized integration of each tool. The toolset, NuSEE, is an integrated environment for software specification and V and V for PLC based safety-critical systems. In accordance with the software lifecycle, NuSEE consists of NuSISRT for the concept phase, NuSRS for the requirements phase, NuSDS for the design phase and NuSCM for configuration management. It is believed that after further development our integrated environment will be a unique and promising software specification and analysis toolset that will support the entire software lifecycle for the development of PLC based NPP safety-critical systems

  7. Improving scattering layer through mixture of nanoporous spheres and nanoparticles in ZnO-based dye-sensitized solar cells.

    Science.gov (United States)

    Kim, Chohui; Choi, Hongsik; Kim, Jae Ik; Lee, Sangheon; Kim, Jinhyun; Lee, Woojin; Hwang, Taehyun; Kang, Suji; Moon, Taeho; Park, Byungwoo

    2014-01-01

    A scattering layer made by mixing nanoporous spheres and nanoparticles is utilized in ZnO-based dye-sensitized solar cells. Hundred-nanometer-sized ZnO spheres consisting of approximately 35-nm-sized nanoparticles provide not only effective light scattering but also a large surface area. Furthermore, ZnO nanoparticles are added to the scattering layer to facilitate charge transport and increase the surface area by filling up the large voids. The mixed scattering layer of nanoparticles and nanoporous spheres on top of the nanoparticle-based electrode (bilayer geometry) improves the solar cell efficiency by enhancing both the short-circuit current (Jsc) and the fill factor (FF), compared to a layer consisting of only nanoparticles or only nanoporous spheres.

  8. Radiative transfer modeling through terrestrial atmosphere and ocean accounting for inelastic processes: Software package SCIATRAN

    Science.gov (United States)

    Rozanov, V. V.; Dinter, T.; Rozanov, A. V.; Wolanin, A.; Bracher, A.; Burrows, J. P.

    2017-06-01

    SCIATRAN is a comprehensive software package designed to model radiative transfer processes in the terrestrial atmosphere and ocean in the spectral range from the ultraviolet to the thermal infrared (0.18-40 μm). It accounts for multiple scattering processes, polarization, thermal emission and ocean-atmosphere coupling. The main goal of this paper is to present a recently developed version of SCIATRAN which accurately takes into account inelastic radiative processes in both the atmosphere and the ocean. In the scalar version of the coupled ocean-atmosphere radiative transfer solver presented by Rozanov et al. [61] we have implemented the simulation of rotational Raman scattering, vibrational Raman scattering, and chlorophyll and colored dissolved organic matter fluorescence. In this paper we discuss and explain the numerical methods used in SCIATRAN to solve the scalar radiative transfer equation including trans-spectral processes, and demonstrate how some selected radiative transfer problems are solved using the SCIATRAN package. In addition we present selected comparisons of SCIATRAN simulations with published benchmark results, independent radiative transfer models, and various measurements from satellite, ground-based, and ship-borne instruments. The extended SCIATRAN software package, along with a detailed User's Guide, is made available for scientists and students, who are undertaking their own research typically at universities, via the web page of the Institute of Environmental Physics (IUP), University of Bremen: http://www.iup.physik.uni-bremen.de.

  9. Ragnarok: An Architecture Based Software Development Environment

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    of the development process. The main contributions presented in the thesis have evolved from work with two of the hypotheses: These address the problems of management of evolution, and overview, comprehension and navigation respectively. The first main contribution is the Architectural Software Configuration...... Management Model: A software configuration management model where the abstractions and hierarchy of the logical aspect of software architecture forms the basis for version control and configuration management. The second main contribution is the Geographic Space Architecture Visualisation Model......: A visualisation model where entities in a software architecture are organised geographically in a two-dimensional plane, their visual appearance determined by processing a subset of the data in the entities, and interaction with the project's underlying data performed by direct manipulation of the landscape...

  10. State-of-the-art Hydrology Education: Development of Windows-based and Web-based Interactive Teaching-Learning Software

    Science.gov (United States)

    Chu, X.

    2011-12-01

    This study, funded by the NSF CAREER program, focuses on developing new methods to quantify microtopography-controlled overland flow processes and integrating the cutting-edge hydrologic research with all-level education and outreach activities. To achieve the educational goal, an interactive teaching-learning software package has been developed. This software, with enhanced visualization capabilities, integrates the new modeling techniques, computer-guided learning processes, and education-oriented tools in a user-friendly interface. Both Windows-based and web-based versions have been developed. The software is specially designed for three major user levels: elementary level (Level 1: K-12 and outreach education), medium level (Level 2: undergraduate education), and advanced level (Level 3: graduate education). Depending on the levels, users are guided to different educational systems. Each system consists of a series of mini "libraries" featured with movies, pictures, and documentation that cover fundamental theories, varying scale experiments, and computer modeling of overland flow generation, surface runoff, and infiltration processes. Testing and practical use of this educational software in undergraduate and graduate teaching demonstrate its effectiveness to promote students' learning and interest in hydrologic sciences. This educational software also has been used as a hydrologic demonstration tool for K-12 students and Native American students through the Nurturing American Tribal Undergraduate Research Education (NATURE) program and Science, Technology, Engineering and Mathematics (STEM) outreach activities.

  11. A general framework and review of scatter correction methods in cone beam CT. Part 2: Scatter estimation approaches

    International Nuclear Information System (INIS)

    Ruehrnschopf, Ernst-Peter; Klingenbeck, Klaus

    2011-01-01

    The main components of scatter correction procedures are scatter estimation and a scatter compensation algorithm. This paper completes a previous paper where a general framework for scatter compensation was presented under the prerequisite that a scatter estimation method is already available. In the current paper, the authors give a systematic review of the variety of scatter estimation approaches. Scatter estimation methods are based on measurements, mathematical-physical models, or combinations of both. For completeness they present an overview of measurement-based methods, but the main topic is the theoretically more demanding models, such as analytical, Monte-Carlo, and hybrid models. Further classifications are 3D image-based and 2D projection-based approaches. The authors present a system-theoretic framework, which allows one to proceed top-down from a general 3D formulation, by successive approximations, to efficient 2D approaches. A widely useful method is the beam-scatter-kernel superposition approach. Together with the review of standard methods, the authors discuss their limitations and how to take into account the issues of object dependency, spatial variance, deformation of scatter kernels, and external and internal absorbers. Open questions for further investigations are indicated. Finally, the authors comment on some special issues and applications, such as the bow-tie filter, offset detectors, truncated data, and dual-source CT.
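    One common way to write the beam-scatter-kernel superposition approach the authors highlight is as a superposition integral in which each primary pixel seeds a kernel whose shape and amplitude depend on the local object thickness t (notation schematic, not the paper's):

```latex
% Beam-scatter-kernel superposition (schematic): the primary image P seeds
% kernels k that depend on the local object thickness t
S(x, y) \;=\; \iint P(x', y')\,
   k\bigl(x - x',\, y - y';\ t(x', y')\bigr)\, \mathrm{d}x'\, \mathrm{d}y'
```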

  12. Studies of oxide-based thin-layered heterostructures by X-ray scattering methods

    Energy Technology Data Exchange (ETDEWEB)

    Durand, O. [Thales Research and Technology France, Route Departementale 128, F-91767 Palaiseau Cedex (France)]. E-mail: olivier.durand@thalesgroup.com; Rogers, D. [Nanovation SARL, 103 bis rue de Versailles 91400 Orsay (France); Universite de Technologie de Troyes, 10-12 rue Marie Curie, 10010 (France); Teherani, F. Hosseini [Nanovation SARL, 103 bis rue de Versailles 91400 Orsay (France); Andrieux, M. [LEMHE, ICMMOCNRS-UMR 8182, Universite d' Orsay, Batiment 410, 91410 Orsay (France); Modreanu, M. [Tyndall National Institute, Lee Maltings, Prospect Row, Cork (Ireland)

    2007-06-04

    Some X-ray scattering methods (X-ray reflectometry and diffractometry) dedicated to the study of thin-layered heterostructures are presented, with a particular focus, for practical purposes, on the description of fast, accurate and robust techniques. The use of X-ray scattering metrology as a routine non-destructive testing method, particularly through procedures that simplify the data evaluation, is emphasized. The model-independent Fourier-inversion method applied to a reflectivity curve allows a fast determination of the individual layer thicknesses. We demonstrate the capability of this method by reporting an X-ray reflectometry study of multilayered oxide structures, even when the number of layers constituting the stack is not known a priori. A fast Fourier transform-based procedure has also been employed successfully on high-resolution X-ray diffraction profiles. A study of the reliability of the integral-breadth methods in diffraction line-broadening analysis applied to thin layers, in order to determine coherent domain sizes, is also reported. Examples from studies of oxide-based thin-layer heterostructures illustrate these methods. In particular, X-ray scattering studies performed on high-k HfO2 and SrZrO3 thin layers, a (GaAs/AlOx) waveguide, and a ZnO thin layer are reported.

  13. Software Engineering Program: Software Process Improvement Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  14. SU-E-P-43: A Knowledge Based Approach to Guidelines for Software Safety

    International Nuclear Information System (INIS)

    Salomons, G; Kelly, D

    2015-01-01

    Purpose: In the fall of 2012, a survey was distributed to medical physicists across Canada. The survey asked the respondents to comment on various aspects of software development and use in their clinic. The survey revealed that most centers employ locally produced (in-house) software of some kind. The respondents also indicated an interest in having software guidelines, but cautioned that the realities of cancer clinics include variations that preclude a simple solution. Traditional guidelines typically involve periodically repeating a set of prescribed tests with defined tolerance limits. However, applying a similar formula to software is problematic, since it assumes that the users have a perfect knowledge of how and when to apply the software, and that if the software operates correctly under one set of conditions it will operate correctly under all conditions. Methods: In the approach presented here, the personnel involved with the software are included as an integral part of the system. Activities performed to improve the safety of the software are done with both software and people in mind. A learning-oriented approach is taken, following the premise that the best approach to safety is increasing the understanding of those associated with the use or development of the software. Results: The software guidance document is organized by areas of knowledge related to the use and development of software. The categories include: knowledge of the underlying algorithm and its limitations; knowledge of the operation of the software, such as input values, parameters, error messages, and interpretation of output; and knowledge of the environment for the software, including both data and users. Conclusion: We propose a new approach to developing guidelines which is based on acquiring knowledge rather than performing tests. The ultimate goal is to provide robust software guidelines which will be practical and effective.

  15. SU-E-P-43: A Knowledge Based Approach to Guidelines for Software Safety

    Energy Technology Data Exchange (ETDEWEB)

    Salomons, G [Cancer Center of Southeastern Ontario & Queen’s University, Kingston, ON (Canada); Kelly, D [Royal Military College of Canada, Kingston, ON, CA (Canada)

    2015-06-15

    Purpose: In the fall of 2012, a survey was distributed to medical physicists across Canada. The survey asked the respondents to comment on various aspects of software development and use in their clinic. The survey revealed that most centers employ locally produced (in-house) software of some kind. The respondents also indicated an interest in having software guidelines, but cautioned that the realities of cancer clinics include variations that preclude a simple solution. Traditional guidelines typically involve periodically repeating a set of prescribed tests with defined tolerance limits. However, applying a similar formula to software is problematic, since it assumes that the users have a perfect knowledge of how and when to apply the software, and that if the software operates correctly under one set of conditions it will operate correctly under all conditions. Methods: In the approach presented here, the personnel involved with the software are included as an integral part of the system. Activities performed to improve the safety of the software are done with both software and people in mind. A learning-oriented approach is taken, following the premise that the best approach to safety is increasing the understanding of those associated with the use or development of the software. Results: The software guidance document is organized by areas of knowledge related to the use and development of software. The categories include: knowledge of the underlying algorithm and its limitations; knowledge of the operation of the software, such as input values, parameters, error messages, and interpretation of output; and knowledge of the environment for the software, including both data and users. Conclusion: We propose a new approach to developing guidelines which is based on acquiring knowledge rather than performing tests. The ultimate goal is to provide robust software guidelines which will be practical and effective.

  16. Development of E-learning Software Based Multiplatform Components

    OpenAIRE

    Salamah, Irma; Ganiardi, M. Aris

    2017-01-01

    E-learning software is a product of information and communication technology used to support a dynamic and flexible learning process between teacher and student. The software technology was first used to develop e-learning software in the form of web applications. The advantage of this technology lies in the ease of development, installation, and distribution of data. Along with advances in mobile/wireless electronics technology, e-learning software is adapted to this technology...

  17. Knowledge-based software design for Defense-in-Depth risk monitor system and application for AP1000

    International Nuclear Information System (INIS)

    Ma Zhanguo; Yoshikawa, Hidekazu; Yang Ming; Nakagawa, Takashi

    2017-01-01

    As part of the new risk monitor system, the software for the plant Defense-in-Depth (DiD) risk monitor system was designed based on state transitions and finite-state machines, and the knowledge-based software was then developed with object-oriented methods utilizing the Unified Modeling Language (UML). Currently, there are two main functions in the developed plant DiD risk monitor software: a knowledge-base editor, which is used to model the system in a hierarchical manner, and an interaction simulator, which simulates the interactions between the different actors in the model. In this paper, a model that plays out its behavior is called an Actor and is modeled at the top level. The passive-safety AP1000 power plant was studied, and the small-break loss-of-coolant accident (SBLOCA) design basis accident transient was modeled using the plant DiD risk monitor software. Furthermore, the simulation result is shown for the interactions between the actors which are defined in the plant DiD risk monitor system as the PLANT actor, OPERATOR actor, and SUPERVISOR actor. This paper shows that it is feasible to model the nuclear power plant knowledge base using this software modeling technique. The software can build a large knowledge base for a nuclear power plant with small effort. (author)
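    A minimal sketch of the state-transition idea: the PLANT actor as a finite-state machine whose transitions the OPERATOR and SUPERVISOR actors observe. States and events are invented simplifications loosely following the SBLOCA example:

```python
# Toy PLANT finite-state machine with observing actors.
PLANT_FSM = {
    ("normal", "small_break_LOCA"): "depressurizing",
    ("depressurizing", "passive_injection_on"): "core_cooling",
    ("core_cooling", "long_term_cooling_on"): "stable_shutdown",
}

class Actor:
    def __init__(self, name: str):
        self.name = name
    def notify(self, state: str) -> None:
        print(f"{self.name}: plant entered '{state}'")

def run(events: list[str]) -> None:
    state = "normal"
    observers = [Actor("OPERATOR"), Actor("SUPERVISOR")]
    for ev in events:
        state = PLANT_FSM.get((state, ev), state)   # ignore undefined transitions
        for obs in observers:
            obs.notify(state)

run(["small_break_LOCA", "passive_injection_on", "long_term_cooling_on"])
```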

  18. How to simplify transmission-based scatter correction for clinical application

    International Nuclear Information System (INIS)

    Baccarne, V.; Hutton, B.F.

    1998-01-01

    Full text: The performance of ordered subsets (OS) EM reconstruction including attenuation, scatter and spatial resolution correction is evaluated using cardiac Monte Carlo data. We demonstrate how simplifications in the scatter model allow one to correct SPECT data for scatter, in terms of quantitation and quality, in a reasonable time. Initial reconstruction of the 20% window is performed including attenuation correction (broad-beam μ values) to estimate the activity quantitatively (accuracy 3%), but not spatially. A rough reconstruction with 2 iterations (subset size: 8) is sufficient for the subsequent scatter correction. An estimate of the primary photons is obtained by projecting the previous distribution including attenuation (narrow-beam μ values). An estimate of the scatter is obtained by convolving the primary estimates with a depth-dependent scatter kernel and scaling the result by a factor calculated from the attenuation map. The correction can be accelerated by convolving several adjacent planes with the same kernel and using an average scaling factor. Simulation of the effects of the collimator during the scatter correction was demonstrated to be unnecessary. Final reconstruction is performed using 6 iterations of OSEM, including attenuation (narrow-beam μ values) and spatial resolution correction. Scatter correction is implemented by incorporating the estimated scatter as a constant offset in the forward projection step. The total correction + reconstruction (64 proj., 40x128 pixels) takes 38 minutes on a Sun Sparc 20. Quantitatively, the accuracy is 7% in a reconstructed slice. The SNR inside the whole myocardium (defined from the original object) is equal to 2.1 and 2.3 in the corrected and the primary slices, respectively. The scatter correction preserves the myocardium-to-ventricle contrast (primary: 0.79, corrected: 0.82). These simplifications allow acceleration of the correction without influencing the quality of the result.
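    The simplified scatter estimate described above reduces to a convolution plus a scaling, sketched below with an invented kernel and scale factor (the paper's kernels are depth dependent; here a single kernel stands in):

```python
# Convolution-plus-scale sketch of the simplified scatter estimate.
import numpy as np
from scipy.signal import fftconvolve

primary_est = np.random.rand(64, 64)        # projected primary estimate

yy, xx = np.mgrid[-15:16, -15:16]
kernel = np.exp(-np.hypot(yy, xx) / 4.0)    # invented scatter kernel
kernel /= kernel.sum()

scale = 0.35                                # factor from the attenuation map
scatter_est = scale * fftconvolve(primary_est, kernel, mode="same")

def forward_model(fp: np.ndarray) -> np.ndarray:
    """Forward projection plus the scatter estimate as a constant offset,
    as in the OSEM scheme described above."""
    return fp + scatter_est
```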

  19. Nuclear resonance scattering study of iridates, iridium and antimony based pyrochlores

    International Nuclear Information System (INIS)

    Alexeev, P.

    2017-04-01

    This thesis shows the first synchrotron-based Moessbauer spectroscopy studies on iridium containing compounds and the first vibrational spectroscopy on Sb containing compounds carried out at the P01 beamline of PETRA III. In this context, two types of X-ray monochromators have been developed: a monochromator for 73 keV photons with medium energy resolution, and a high-resolution backscattering monochromator based on a sapphire crystal. The monochromator for 73 keV X-rays is the key instrument for hyperfine spectroscopy on iridium compounds, while the sapphire backscattering monochromator is purposed for vibrational spectroscopy on any Moessbauer resonances with transition energies in the 20-50 keV range. Additionally, the signal detection for nuclear resonance scattering experiments at the beamline was significantly improved during this work, motivated by the high energies and short lifetimes of the employed resonances. The first synchrotron-based hyperfine spectroscopy on iridium-containing compounds was demonstrated by NRS on the 73 keV resonance in 193Ir. The results can be interpreted by the dynamical theory of nuclear resonance scattering. In this work, special emphasis is set on the electronic and magnetic properties of Ir nuclei in IrO2 and in Ruddlesden-Popper (RP) phases of strontium iridates Srn+1IrnO3n+1 (n=0,1). These systems are well-suited for studies with X-ray scattering techniques, since the scattered signal contains vast information about the widely tunable crystallographic and electronic structure of these systems; furthermore, studies with X-rays are less limited by absorption from iridium than is the case for neutron scattering experiments. The hyperfine parameters in IrO2, SrIrO3 and Sr2IrO4 have been measured via Nuclear Forward Scattering for the first time. Using the dynamical theory of NRS, the temperature and magnetic field dependence of the electric field gradient and magnetic hyperfine field on the Ir nucleus have been determined for

  20. Nuclear resonance scattering study of iridates, iridium and antimony based pyrochlores

    Energy Technology Data Exchange (ETDEWEB)

    Alexeev, P.

    2017-04-15

    This thesis shows the first synchrotron-based Moessbauer spectroscopy studies on iridium containing compounds and the first vibrational spectroscopy on Sb containing compounds carried out at the P01 beamline of PETRA III. In this context, two types of X-ray monochromators have been developed: a monochromator for 73 keV photons with medium energy resolution, and a high-resolution backscattering monochromator based on a sapphire crystal. The monochromator for 73 keV X-rays is the key instrument for hyperfine spectroscopy on iridium compounds, while the sapphire backscattering monochromator is purposed for vibrational spectroscopy on any Moessbauer resonances with transition energies in the 20-50 keV range. Additionally, the signal detection for nuclear resonance scattering experiments at the beamline was significantly improved during this work, motivated by the high energies and short lifetimes of the employed resonances. The first synchrotron-based hyperfine spectroscopy on iridium-containing compounds was demonstrated by NRS on the 73 keV resonance in 193Ir. The results can be interpreted by the dynamical theory of nuclear resonance scattering. In this work, special emphasis is set on the electronic and magnetic properties of Ir nuclei in IrO2 and in Ruddlesden-Popper (RP) phases of strontium iridates Srn+1IrnO3n+1 (n=0,1). These systems are well-suited for studies with X-ray scattering techniques, since the scattered signal contains vast information about the widely tunable crystallographic and electronic structure of these systems; furthermore, studies with X-rays are less limited by absorption from iridium than is the case for neutron scattering experiments. The hyperfine parameters in IrO2, SrIrO3 and Sr2IrO4 have been measured via Nuclear Forward Scattering for the first time. Using the dynamical theory of NRS, the temperature and magnetic field dependence of the electric field gradient and magnetic hyperfine field

  1. Status of Kharkov X-ray Generator based on Compton Scattering NESTOR

    NARCIS (Netherlands)

    Zelinsky, A.; Androsov, V.P.; Bulyak, E.V.; Drebot, I.; Gladkikh, P.I.; Grevtsev, V.; Botman, J.I.M.; Ivashchenko, V.; Karnaukhov, I.M.; Lapshin, V.I.; Markov, V.; Mocheshnikov, N.; Mytsykov, A.; Peev, F.A.; Rezaev, A.; Shcherbakov, A.; Skomorkohov, V.; Skyrda, V.; Telegin, Y.; Trotsenko, V.; Tatchyn, R.; Lebedev, B.; Agafonov, A.V.

    2004-01-01

    Nowadays, X-ray sources based on storage rings with low beam energy and Compton scattering of intense laser beams are under development in several laboratories. The paper describes the state of the art in the development and construction of the cooperative project of the Kharkov advanced X-ray source NESTOR

  2. SU-D-206-07: CBCT Scatter Correction Based On Rotating Collimator

    International Nuclear Information System (INIS)

    Yu, G; Feng, Z; Yin, Y; Qiang, L; Li, B; Huang, P; Li, D

    2016-01-01

    Purpose: Scatter correction in cone-beam computed tomography (CBCT) has an obvious effect on the removal of image noise and the cupping artifact and on the increase of image contrast. Several methods using a beam blocker for the estimation and subtraction of scatter have been proposed. However, mechanical inconvenience and a propensity to residual artifacts have limited the further evolution of basic and clinical research. Here, we propose a rotating-collimator-based approach, in conjunction with reconstruction based on a discrete Radon transform and Tchebichef moments algorithm, to correct scatter-induced artifacts. Methods: A rotating collimator, comprising round tungsten alloy strips, was mounted on a linear actuator. The rotating collimator is divided equally into six portions. The round strips are evenly spaced within each portion but staggered between portions. A step motor connected to the rotating collimator drove the blocker around the x-ray source during the CBCT acquisition. CBCT reconstruction based on a discrete Radon transform and Tchebichef moments algorithm is then performed. Experimental studies using a water phantom and the Catphan504 phantom were carried out to evaluate the performance of the proposed scheme. Results: The proposed algorithm was tested on both Monte Carlo simulations and actual experiments with the Catphan504 phantom. In the simulation, the mean square error of the reconstruction decreases from 16% to 1.18%, the cupping (τcup) from 14.005% to 0.66%, and the peak signal-to-noise ratio increases from 16.9594 to 31.45. In the actual experiments, the induced visual artifacts are significantly reduced. Conclusion: We conducted an experiment on a CBCT imaging system with a rotating collimator to develop and optimize an x-ray scatter control and reduction technique. The proposed method is attractive in applications where high CBCT image quality is critical, for example, dose calculation in adaptive radiation therapy. We want to thank Dr. Lei

  3. SU-D-206-07: CBCT Scatter Correction Based On Rotating Collimator

    Energy Technology Data Exchange (ETDEWEB)

    Yu, G; Feng, Z [Shandong Normal University, Jinan, Shandong (China); Yin, Y [Shandong Cancer Hospital and Institute, China, Jinan, Shandong (China); Qiang, L [Zhang Jiagang STFK Medical Device Co, Zhangjiangkang, Suzhou (China); Li, B [Shandong Academy of Medical Sciences, Jinan, Shandong provice (China); Huang, P [Shandong Province Key Laboratory of Medical Physics and Image Processing Te, Ji’nan, Shandong province (China); Li, D [School of Physics and Electronics, Shandong Normal University, Jinan, Shandong (China)

    2016-06-15

    Purpose: Scatter correction in cone-beam computed tomography (CBCT) has an obvious effect on the removal of image noise and the cupping artifact and on the increase of image contrast. Several methods using a beam blocker for the estimation and subtraction of scatter have been proposed. However, mechanical inconvenience and a propensity to residual artifacts have limited the further evolution of basic and clinical research. Here, we propose a rotating-collimator-based approach, in conjunction with reconstruction based on a discrete Radon transform and Tchebichef moments algorithm, to correct scatter-induced artifacts. Methods: A rotating collimator, comprising round tungsten alloy strips, was mounted on a linear actuator. The rotating collimator is divided equally into six portions. The round strips are evenly spaced within each portion but staggered between portions. A step motor connected to the rotating collimator drove the blocker around the x-ray source during the CBCT acquisition. CBCT reconstruction based on a discrete Radon transform and Tchebichef moments algorithm is then performed. Experimental studies using a water phantom and the Catphan504 phantom were carried out to evaluate the performance of the proposed scheme. Results: The proposed algorithm was tested on both Monte Carlo simulations and actual experiments with the Catphan504 phantom. In the simulation, the mean square error of the reconstruction decreases from 16% to 1.18%, the cupping (τcup) from 14.005% to 0.66%, and the peak signal-to-noise ratio increases from 16.9594 to 31.45. In the actual experiments, the induced visual artifacts are significantly reduced. Conclusion: We conducted an experiment on a CBCT imaging system with a rotating collimator to develop and optimize an x-ray scatter control and reduction technique. The proposed method is attractive in applications where high CBCT image quality is critical, for example, dose calculation in adaptive radiation therapy. We want to thank Dr. Lei

  4. Study on the influences of X Ray Scattering on radioscopic inspection

    Energy Technology Data Exchange (ETDEWEB)

    Wozniak, M.; Torrent, J.; Bancelin, A. [SNECMA NDE Dept. Laboratory, France, Evry Corbeil, 91 - Evry (France)

    2007-07-01

    This study, carried out as part of the European project 'Verdict' (Virtual Evaluation and Robust Detection for engine Components non-destructive Testing), aimed at developing and evaluating the simulation of X-ray non-destructive testing methods. A qualitative appreciation and quantification of X-ray scattering for modelling (with the SINDBAD software) was carried out. The effect of such radiation on the radiogram is a disturbing blur that hampers the interpretation of indications. The method and results described are innovative in the analysis of X-ray scattering because, for the aeronautic field, the configurations used in this energy range represent a breakthrough. The approach followed consists of an experimental and practical method for evaluating the scattered radiation on the final image produced by the inspection. Experimental test results confirmed that the influence of scattered radiation is linked to density variation, the geometry of parts in the axis of the direct radiation, and the spatial area. This study, performed in industrial configurations, contributed to improving the understanding of X-ray scattering. (authors)

  5. SYNRAD3D photon propagation and scattering simulations

    International Nuclear Information System (INIS)

    Dugan, G; Sagan, D

    2013-01-01

    The Bmad software library has been used very successfully at Cornell for modeling relativistic charged particles in storage rings and linacs. Associated with this library are a number of programs used for lattice design and analysis. Recently, as part of the CESRTA program, a new program that uses the Bmad library, called Synrad3D, has been developed to track synchrotron radiation photons generated in storage rings and linacs. The motivation for developing Synrad3D was to estimate the energy and position distribution of photon absorption sites, which are critical inputs to codes that model the growth of electron clouds. Synrad3D includes scattering from the vacuum chamber walls, based on X-ray data from an LBNL database for the smooth-surface reflectivity and on an analytical model for diffuse scattering from a surface with finite roughness. Synrad3D can handle any planar lattice and a wide variety of vacuum chamber profiles. In the following sections, the general approach used in Synrad3D will be described. The models used for the vacuum chamber, for specular reflection, and for diffuse reflection will be described. Examples of the application of the program to predict the radiation environment in the CESRTA ring will be presented. A comparison of the scattering model with X-ray data from DAΦNE will be given. Finally, an application of the program to predict the radiation environment in the ILC damping ring will be shown.
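    The per-bounce scattering decision in a photon-tracking code of this kind can be sketched as follows, using the standard Debye-Waller-like specular-survival factor exp(-(4πσ sin θ_g/λ)²) for a surface of rms roughness σ at grazing angle θ_g; the parameters below are invented, and Synrad3D's actual diffuse model is more detailed:

```python
# Choose specular vs diffuse reflection from a roughness-dependent probability.
import math, random

def reflect_kind(grazing_angle_rad: float, wavelength_nm: float,
                 rms_roughness_nm: float) -> str:
    g = 4 * math.pi * rms_roughness_nm * math.sin(grazing_angle_rad) / wavelength_nm
    p_specular = math.exp(-g * g)   # Debye-Waller-like specular survival
    return "specular" if random.random() < p_specular else "diffuse"

random.seed(0)
hits = [reflect_kind(math.radians(0.5), wavelength_nm=1.0, rms_roughness_nm=5.0)
        for _ in range(10000)]
print(hits.count("specular") / len(hits))   # specular fraction for this surface
```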

  6. From conventional software based systems to knowledge based systems

    International Nuclear Information System (INIS)

    Bologna, S.

    1995-01-01

    Even if today's nuclear power plants have a very good safety record, there is a continuous search for still improving safety. One direction of this effort addresses operational safety, trying to improve the handling of disturbances and accidents partly by further automation, partly by creating a better control room environment, providing the operator with intelligent support systems to help in the decision making process. Introduction of intelligent computerised operator support systems has proved to be an efficient way of improving the operator's performance. A number of systems have been developed worldwide, assisting in tasks like process fault detection and diagnosis, selection and implementation of proper remedial actions. Unfortunately, the use of Knowledge Based Systems (KBSs) introduces a new dimension to the problem of the licensing process. KBSs, despite the different technology employed, are still nothing more than computer programs. Unfortunately, quite a few people building knowledge based systems seem to ignore the many good programming practices that have evolved over the years for producing traditional computer programs. In this paper the author points out similarities and differences between conventional software based systems and knowledge based systems, introducing also the concept of model based reasoning. (orig.) (25 refs., 2 figs.)

  7. Distributed Arithmetic for Efficient Base-Band Processing in Real-Time GNSS Software Receivers

    Directory of Open Access Journals (Sweden)

    Grégoire Waelchli

    2010-01-01

    Full Text Available The growing market of GNSS-capable mobile devices is driving interest in GNSS software solutions, as they can share many system resources (processor, memory), reducing both the size and the cost of their integration. Indeed, with the increasing performance of modern processors, it is now feasible to implement in software a multichannel GNSS receiver operating in real time. However, a major issue with this approach is the large computing resources required for the base-band processing, in particular for the correlation operations. Therefore, new algorithms need to be developed in order to reduce the overall complexity of the receiver architecture. Towards that aim, this paper first introduces the challenges of the software implementation of a GPS receiver, with a main focus on the base-band processing and correlation operations. It then describes the already existing solutions and, from this, introduces a new algorithm based on distributed arithmetic.
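    For hard-limited (1-bit) samples, one common flavor of distributed arithmetic replaces the multiply-accumulate loop of the correlator with precomputed lookup tables, trading memory for multiplications. The sketch below illustrates that principle; the word size and the 1-bit quantization are simplifying assumptions rather than the receiver design of the paper.

```python
import numpy as np
from itertools import product

def build_da_tables(code, word=4):
    """For each group of `word` chips of the fixed local code, precompute the
    partial correlation for every possible pattern of input sign bits.
    Memory grows as 2**word per group; the inner loop needs no multiplies."""
    tables = []
    for start in range(0, len(code), word):
        chunk = np.asarray(code[start:start + word])
        table = {bits: int(np.where(np.array(bits) == 1, 1, -1) @ chunk)
                 for bits in product((0, 1), repeat=len(chunk))}
        tables.append(table)
    return tables

def da_correlate(sign_bits, tables, word=4):
    """Correlate 1-bit quantized samples against the local code: one table
    lookup and one addition per group of `word` samples."""
    return sum(table[tuple(sign_bits[i * word:(i + 1) * word])]
               for i, table in enumerate(tables))

# Example: 1023-chip code in {-1, +1}, received sign bits in {0, 1}
code = np.sign(np.random.default_rng(2).normal(size=1023)).astype(int)
bits = list((code > 0).astype(int))          # noiseless, perfectly aligned case
tables = build_da_tables(code)
assert da_correlate(bits, tables) == len(code)
```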

  8. Licensing safety critical software

    International Nuclear Information System (INIS)

    Archinoff, G.H.; Brown, R.A.

    1990-01-01

    Licensing difficulties with the shutdown system software at the Darlington Nuclear Generating Station contributed to delays in starting up the station. Even though the station has now been given approval by the Atomic Energy Control Board (AECB) to operate, the software issue has not disappeared - Ontario Hydro has been instructed by the AECB to redesign the software. This article attempts to explain why software based shutdown systems were chosen for Darlington, why there was so much difficulty licensing them, and what the implications are for other safety related software based applications

  9. Estimation of biological parameters of marine organisms using linear and nonlinear acoustic scattering model-based inversion methods.

    Science.gov (United States)

    Chu, Dezhang; Lawson, Gareth L; Wiebe, Peter H

    2016-05-01

    The linear inversion commonly used in fisheries and zooplankton acoustics assumes a constant inversion kernel and ignores the uncertainties associated with the shape and behavior of the scattering targets, as well as other relevant animal parameters. Here, errors of the linear inversion due to uncertainty associated with the inversion kernel are quantified. A scattering model-based nonlinear inversion method is presented that takes into account the nonlinearity of the inverse problem and is able to estimate simultaneously the animal abundance and the parameters associated with the scattering model inherent to the kernel. It uses sophisticated scattering models to estimate, first, the abundance and, second, the relevant shape and behavioral parameters of the target organisms. Numerical simulations demonstrate that the abundance, size, and behavior (tilt angle) parameters of marine animals (fish or zooplankton) can be accurately inferred from the inversion by using multi-frequency acoustic data. The influence of the singularity and uncertainty in the inversion kernel on the inversion results can be mitigated by examining the singular values for linear inverse problems and employing a non-linear inversion involving a scattering model-based kernel.
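    The contrast between the linear and the nonlinear inversion can be made concrete. In the linear case the kernel of modeled backscattering cross-sections is held fixed and only nonnegative abundances are solved for, as in the sketch below (the kernel values are synthetic stand-ins for a real scattering model); the nonlinear method would additionally let kernel parameters such as tilt angle vary inside the fit.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical forward kernel: modeled backscattering cross-sections
# (rows: acoustic frequencies, columns: candidate size/behavior classes).
rng = np.random.default_rng(1)
sigma_bs = np.abs(rng.normal(size=(4, 6))) * 1e-8       # m^2, stand-in values

true_abundance = np.array([0.0, 120.0, 40.0, 0.0, 5.0, 0.0])   # animals per m^3
sv_measured = sigma_bs @ true_abundance                         # synthetic data

# Linear inversion: the kernel is held constant and only nonnegative
# abundances are solved for; this fixed-kernel assumption is what the
# paper questions. A nonlinear inversion would re-evaluate sigma_bs from
# a scattering model (size, tilt angle, ...) inside the fit.
abundance_est, residual = nnls(sigma_bs, sv_measured)
```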

  10. Spontaneous Rayleigh-Brillouin scattering spectral analysis based on the Wiener filter

    Directory of Open Access Journals (Sweden)

    Tao Wu

    2018-01-01

    Full Text Available In this paper, a spontaneous Rayleigh-Brillouin scattering spectrometer is developed to measure gaseous spontaneous Rayleigh-Brillouin scattering profiles over the pressure range from 1 to 5 atm for a wavelength of 532 nm, at a constant room temperature of 296 K and a 90° scattering angle. In order to make a direct comparison between the experimentally obtained spectrum and the theoretical spectrum calculated from the Tenti S6 model, the measured spontaneous Rayleigh-Brillouin scattering signal is deconvolved by Wiener filtering. The purpose is to remove the effect on the spectrum of the transmission function of the Fabry-Pérot scanning interferometer. The results of the comparison show that the deconvolved spectra are consistent with the theoretical spectra calculated from the Tenti S6 model, and thus confirm that deconvolution based on the Wiener filter is able to process the measured spectra and improve the spectral resolution. Some factors that influence the accuracy of the deconvolution are analyzed and discussed. At the same time, another comparison, between the raw experimentally obtained spectra and the theoretical spectra calculated by convolving the Tenti S6 model with the instrument function of the measurement system, is performed under the same experimental conditions. The results of the two comparisons show that, compared with the raw experimentally obtained spectrum, the deconvolved spectrum matches the theoretically calculated spectrum more accurately under lower pressure (≤ 2 atm) than under relatively higher pressure (> 2 atm).
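    Wiener deconvolution of the instrument function is a one-liner in the Fourier domain once a noise-to-signal ratio is assumed. A minimal sketch, with an illustrative regularization constant rather than the value used in the paper:

```python
import numpy as np

def wiener_deconvolve(measured, instrument, nsr=1e-2):
    """Deconvolve the interferometer transmission (instrument) function from
    a measured Rayleigh-Brillouin spectrum by Wiener filtering.

    measured   -- recorded spectrum, 1-D array
    instrument -- instrument function sampled on the same frequency grid
    nsr        -- assumed noise-to-signal power ratio (the regularizer)"""
    H = np.fft.fft(instrument)
    G = np.fft.fft(measured)
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)   # Wiener filter in Fourier space
    return np.real(np.fft.ifft(W * G))
```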

  11. Scattering of photons from atomic electrons

    International Nuclear Information System (INIS)

    Pratt, R.H.; Zhou, B.; Bergstrom, P.M. Jr.; Pisk, K.; Suric, T.

    1990-01-01

    Validity of simpler approaches for elastic and inelastic photon scattering by atoms and ions is assessed by comparison with second-order S-matrix predictions. A simple scheme for elastic scattering based on angle-independent anomalous scattering factors has been found to give useful predictions near and below photoeffect thresholds. In inelastic scattering, major deviations are found from A²-based calculations. Extension of free-atom and free-ion cross sections to the dense plasma regime is discussed. 20 refs., 6 figs

  12. Hybrid radiosity-SP3 equation based bioluminescence tomography reconstruction for turbid medium with low- and non-scattering regions

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Xueli, E-mail: xlchen@xidian.edu.cn, E-mail: jimleung@mail.xidian.edu.cn; Zhang, Qitan; Yang, Defu; Liang, Jimin, E-mail: xlchen@xidian.edu.cn, E-mail: jimleung@mail.xidian.edu.cn [School of Life Science and Technology, Xidian University, Xi'an, Shaanxi 710071 (China)

    2014-01-14

    To provide an ideal solution for the specific problem of gastric cancer detection, in which low-scattering regions simultaneously exist with both non- and high-scattering regions, a novel hybrid radiosity-SP3 equation based reconstruction algorithm for bioluminescence tomography was proposed in this paper. In the algorithm, the third-order simplified spherical harmonics approximation (SP3) was combined with the radiosity equation to describe the bioluminescent light propagation in tissues, which provided acceptable accuracy for a turbid medium with both low- and non-scattering regions. The performance of the algorithm was evaluated with digital-mouse-based simulations and an in situ experiment on a gastric cancer-bearing mouse. Primary results demonstrated the feasibility and superiority of the proposed algorithm for turbid media with low- and non-scattering regions.

  13. Cross plane scattering correction

    International Nuclear Information System (INIS)

    Shao, L.; Karp, J.S.

    1990-01-01

    Most previous scattering correction techniques for PET are based on assumptions made for a single transaxial plane and are independent of axial variations. These techniques will incorrectly estimate the scattering fraction for volumetric PET imaging systems, since they do not take cross-plane scattering into account. In this paper, the authors propose a new point source scattering deconvolution method (2-D). The cross-plane scattering is incorporated into the algorithm by modeling a scattering point source function. In the model, the dependence of scattering on both the axial and transaxial directions is reflected in the exponential fitting parameters, and these parameters are directly estimated from a limited number of measured point response functions. The authors' results comparing the standard in-plane point source deconvolution to their cross-plane source deconvolution show that, for a small source, the former technique overestimates the scatter fraction in the plane of the source and underestimates the scatter fraction in adjacent planes. In addition, the authors propose a simple approximation technique for deconvolution.

  14. Scattering transform and LSPTSVM based fault diagnosis of rotating machinery

    Science.gov (United States)

    Ma, Shangjun; Cheng, Bo; Shang, Zhaowei; Liu, Geng

    2018-05-01

    This paper proposes an algorithm for fault diagnosis of rotating machinery to overcome the shortcomings of classical techniques, which are noise-sensitive in feature extraction and time-consuming in training. Based on the scattering transform and the least squares recursive projection twin support vector machine (LSPTSVM), the method has the advantages of high efficiency and insensitivity to noise. Using the energy of the scattering coefficients in each sub-band, the features of the vibration signals are obtained. Then, an LSPTSVM classifier is used for fault diagnosis. The new method is compared with other common methods, including the proximal support vector machine, the standard support vector machine and multi-scale theory, by using fault data from two systems, a motor bearing and a gear box. The results show that the new method proposed in this study is more effective for fault diagnosis of rotating machinery.
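    As a rough illustration of the feature-extraction step, per-sub-band coefficient energies can be computed from any multiresolution decomposition. The sketch below substitutes a discrete wavelet decomposition (PyWavelets) for the scattering transform, and a generic classifier would then be trained on the resulting vectors; libraries such as Kymatio provide the scattering transform itself.

```python
import numpy as np
import pywt  # PyWavelets

def subband_energy_features(signal, wavelet="db4", level=5):
    """Normalized energy of the coefficients in each sub-band of a vibration
    signal. A discrete wavelet decomposition stands in for the scattering
    transform; the feature idea (energy per sub-band) is the same."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([float(np.sum(c ** 2)) for c in coeffs])
    return energies / energies.sum()
```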

  15. Software for marine ecological environment comprehensive monitoring system based on MCGS

    Science.gov (United States)

    Wang, X. H.; Ma, R.; Cao, X.; Cao, L.; Chu, D. Z.; Zhang, L.; Zhang, T. P.

    2017-08-01

    The automatic integrated monitoring software for the marine ecological environment, based on the MCGS configuration software, is designed and developed to realize real-time automatic monitoring of many marine ecological parameters. A DTU data transmission terminal performs network communication and transmits the data to the user data center in a timely manner. The software adopts a modular design and has the advantages of a stable and flexible data structure, strong portability and scalability, a clear interface, simple user operation and convenient maintenance. A continuous six-month on-site comparison test showed that the relative error of the parameters monitored by the system, such as temperature, salinity, turbidity, pH and dissolved oxygen, was within 5% of the standard method, and the relative error of the nutrient parameters was within 15%. Meanwhile, the system required little maintenance, had a low failure rate, and provided stable and efficient continuous monitoring. The field application shows that the software is stable and the data communication is reliable, and it has a good application prospect in the field of comprehensive marine ecological environment monitoring.

  16. Measuring Software Test Verification for Complex Workpieces based on Virtual Gear Measuring Instrument

    Directory of Open Access Journals (Sweden)

    Yin Peili

    2017-08-01

    Full Text Available Validity and correctness test verification of measuring software has been a thorny issue hindering the development of the Gear Measuring Instrument (GMI). The main reason is that the software itself is difficult to separate from the rest of the measurement system for independent evaluation. This paper presents a Virtual Gear Measuring Instrument (VGMI) to independently validate the measuring software. A triangular patch model with accurately controlled precision was taken as the virtual workpiece, and a universal collision detection model was established. The whole-process simulation of workpiece measurement is implemented by the VGMI replacing the GMI, and the measuring software is tested in the proposed virtual environment. Taking an involute profile measurement procedure as an example, the validity of the software is evaluated based on the simulation results; meanwhile, experiments using the same measuring software are carried out on an involute master in a GMI. The experiment results indicate a consistency of tooth profile deviation and calibration results, thus verifying the accuracy of the gear measuring system, which includes the measurement procedures. It is shown that the VGMI presented can be applied in the validation of measuring software, providing a new, ideal platform for testing complex workpiece-measuring software without calibrated artifacts.

  17. A methodology for software documentation

    OpenAIRE

    Torres Júnior, Roberto Dias; Ahlert, Hubert

    2000-01-01

    With the growing complexity of window-based software and the use of object-oriented techniques, the development of software is getting more complex than ever. Based on that, this article presents a methodology for software documentation and analyzes our experience and how this methodology can aid software maintenance.

  18. Design of software platform based on linux operating system for γ-spectrometry instrument

    International Nuclear Information System (INIS)

    Hong Tianqi; Zhou Chen; Zhang Yongjin

    2008-01-01

    This paper describes the design of a γ-spectrometry instrument software platform based on the s3c2410a processor with an arm920t core. Emphasis is placed on analyzing the integrated application of the embedded Linux operating system, the yaffs file system and the qt/embedded GUI development library. It presents a new software platform for portable γ-measurement instruments. (authors)

  19. An open software system based on X Windows for process control and equipment monitoring

    International Nuclear Information System (INIS)

    Aimar, A.; Carlier, E.; Mertens, V.

    1992-01-01

    The construction and application of a configurable open software system for process control and equipment monitoring can speed up and simplify the development and maintenance of equipment specific software as compared to individual solutions. The present paper reports the status of such an approach for the distributed control systems of SPS and LEP beam transfer components, based on X Windows and the OSF/Motif tool kit and applying data modeling and software engineering methods. (author)

  20. Trajectory-based understanding of the quantum-classical transition for barrier scattering

    Science.gov (United States)

    Chou, Chia-Chun

    2018-06-01

    The quantum-classical transition of wave packet barrier scattering is investigated using a hydrodynamic description in the framework of a nonlinear Schrödinger equation. The nonlinear equation provides a continuous description for the quantum-classical transition of physical systems by introducing a degree of quantumness. Based on the transition equation, the transition trajectory formalism is developed to establish the connection between classical and quantum trajectories. The quantum-classical transition is then analyzed for the scattering of a Gaussian wave packet from an Eckart barrier and the decay of a metastable state. Computational results for the evolution of the wave packet and the transmission probabilities indicate that classical results are recovered when the degree of quantumness tends to zero. Classical trajectories are in excellent agreement with the transition trajectories in the classical limit, except in some regions where transition trajectories cannot cross because of the single-valuedness of the transition wave function. As the computational results demonstrate, the process that the Planck constant tends to zero is equivalent to the gradual removal of quantum effects originating from the quantum potential. This study provides an insightful trajectory interpretation for the quantum-classical transition of wave packet barrier scattering.

  1. Noise data management using commercially available data-base software

    International Nuclear Information System (INIS)

    Damiano, B.; Thie, J.A.

    1988-01-01

    A data base has been created using commercially available software to manage the data collected by an automated noise data acquisition system operated by Oak Ridge National Laboratory at the Fast Flux Test Facility (FFTF). The data base was created to store, organize, and retrieve selected features of the nuclear and process signal noise data, because the large volume of data collected by the automated system makes manual data handling and interpretation based on visual examination of noise signatures impractical. Compared with manual data handling, use of the data base allows the automatically collected data to be utilized more fully and effectively. The FFTF noise data base uses the Oracle Relational Data Base Management System implemented on a desktop personal computer

  2. Towards an Early Software Effort Estimation Based on Functional and Non-Functional Requirements

    Science.gov (United States)

    Kassab, Mohamed; Daneva, Maya; Ormandjieva, Olga

    The increased awareness of the non-functional requirements as a key to software project and product success makes explicit the need to include them in any software project effort estimation activity. However, the existing approaches to defining size-based effort relationships still pay insufficient attention to this need. This paper presents a flexible, yet systematic approach to the early requirements-based effort estimation, based on Non-Functional Requirements ontology. It complementarily uses one standard functional size measurement model and a linear regression technique. We report on a case study which illustrates the application of our solution approach in context and also helps evaluate our experiences in using it.
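    The quantitative core of such an approach, regressing historical effort on a functional size measure plus NFR-derived predictors, is simple to sketch. All numbers below are invented for illustration; the paper's actual measurement model and regressors differ.

```python
import numpy as np

# Invented history of completed projects: functional size (function points)
# and a crude count of non-functional requirements, versus actual effort.
size_fp = np.array([120.0, 300.0, 80.0, 450.0, 210.0])
nfr_count = np.array([4.0, 11.0, 2.0, 15.0, 7.0])
effort_ph = np.array([900.0, 2600.0, 520.0, 4100.0, 1700.0])  # person-hours

# Ordinary least squares regression of effort on size and the NFR predictor.
X = np.column_stack([np.ones_like(size_fp), size_fp, nfr_count])
coef, *_ = np.linalg.lstsq(X, effort_ph, rcond=None)

# Early estimate for a new project (intercept, size, NFR count).
estimate = np.array([1.0, 260.0, 9.0]) @ coef
```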

  3. A File Based Visualization of Software Evolution

    NARCIS (Netherlands)

    Voinea, S.L.; Telea, A.

    2006-01-01

    Software Configuration Management systems are important instruments for supporting development of large software projects. They accumulate large amounts of evolution data that can be used for process accounting and auditing. We study how visualization can help developers and managers to get insight

  4. The LANSCE (Los Alamos Neutron Scattering Center) target data collection system

    International Nuclear Information System (INIS)

    Kernodle, A.K.

    1989-01-01

    The Los Alamos Neutron Scattering Center (LANSCE) Target Data Collection System is the result of an effort to provide a base of information from which to draw conclusions on the performance and operational condition of the overall LANSCE target system. During the conceptualization of the system, several goals were defined. A survey was made of both custom-made and off-the-shelf hardware and software that were capable of meeting these goals. The first stage of the system was successfully implemented for the LANSCE run cycle 52. From the operational experience gained thus far, it appears that the LANSCE Target Data Collection System will meet all of the previously defined requirements

  5. Knowledge Base for an Intelligent System in order to Identify Security Requirements for Government Agencies Software Projects

    Directory of Open Access Journals (Sweden)

    Adán Beltrán G.

    2016-01-01

    Full Text Available It has been evidenced that one of the most common causes of software security failure is the lack of identification and specification of information security requirements, an activity given insufficient importance in software development and software acquisition. We propose the knowledge base of CIBERREQ. CIBERREQ is an intelligent knowledge-based system used for the identification and specification of security requirements in the software development cycle or in software acquisition. CIBERREQ receives functional software requirements written in natural language and produces non-functional security requirements through a semi-automatic process of risk management. The knowledge base is formed by an ontology developed collaboratively by experts in information security. In this process, six types of assets have been identified: electronic data, physical data, hardware, software, person and service, as well as six types of risk: competitive disadvantage, loss of credibility, economic risks, strategic risks, operational risks and legal sanctions. In addition, 95 vulnerabilities, 24 threats, 230 controls, and 515 associations between concepts are defined. Additionally, automatic expansion with Wikipedia was used for the asset types Software and Hardware, obtaining 7125 and 5894 software and hardware subtypes respectively, thereby achieving an improvement of 10% in the identification of candidate information assets, one of the most important phases of the proposed system.

  6. Robust Automatic Target Recognition via HRRP Sequence Based on Scatterer Matching

    Directory of Open Access Journals (Sweden)

    Yuan Jiang

    2018-02-01

    Full Text Available High resolution range profile (HRRP) plays an important role in wideband radar automatic target recognition (ATR). In order to alleviate the sensitivity to clutter and target aspect, employing a sequence of HRRPs is a promising approach to enhance ATR performance. In this paper, a novel HRRP sequence-matching method based on singular value decomposition (SVD) is proposed. First, the HRRP sequence is decoupled into the angle space and the range space via SVD, which correspond to the spans of the left and the right singular vectors, respectively. Second, atomic norm minimization (ANM) is utilized to estimate dominant scatterers in the range space, and the Hausdorff distance is employed to measure the scatterer similarity between the test and training data. Next, the angle space similarity between the test and training data is evaluated based on the left singular vector correlations. Finally, the range space matching result and the angle space correlation are fused, with the singular values as weights. Simulation and outfield experimental results demonstrate that the proposed matching metric is a robust similarity measure for HRRP sequence recognition.
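    The SVD decoupling and the two similarity terms can be sketched compactly. In the sketch below the dominant scatterers are taken naively as peaks of the leading right singular vector (the paper estimates them with atomic norm minimization), and the fusion rule is one simple choice, not the authors' exact metric:

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def hrrp_match(test_seq, train_seq, n_scatterers=5):
    """SVD decoupling of HRRP sequences (rows: pulses, columns: range cells)
    into an angle space (left singular vectors) and a range space (right
    singular vectors), followed by a simple two-term similarity."""
    Ut, st, Vht = np.linalg.svd(test_seq, full_matrices=False)
    Ur, sr, Vhr = np.linalg.svd(train_seq, full_matrices=False)

    # Range space: dominant-scatterer positions, here naively the largest
    # peaks of the leading right singular vectors, compared by the
    # symmetric Hausdorff distance.
    def scatterers(v):
        idx = np.sort(np.argsort(np.abs(v))[-n_scatterers:])
        return idx.reshape(-1, 1).astype(float)
    p, q = scatterers(Vht[0]), scatterers(Vhr[0])
    d_range = max(directed_hausdorff(p, q)[0], directed_hausdorff(q, p)[0])

    # Angle space: correlation of the leading left singular vectors
    # (assumes both sequences contain the same number of pulses).
    c_angle = abs(float(Ut[:, 0] @ Ur[:, 0]))

    # Fuse the two terms with singular values as weights (one simple choice).
    w = st[0] / (st[0] + sr[0])
    return w * c_angle - (1.0 - w) * d_range
```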

  7. Between Innovation and Governance: The Case of Research-based Software Development in a Large Petroleum Company

    OpenAIRE

    Seifvand, Atiyeh

    2012-01-01

    Software innovations can offer organizations competitive advantages. Research and development entities within the petroleum industry therefore seek to utilize IT capabilities to produce innovative software. Many factors may influence the success or failure of developing and implementing research-based software innovations in organizations. Among these issues, the relation between software innovation and IT governance remains largely unexplored in the research literature. This study explores t...

  8. Research on cross - Project software defect prediction based on transfer learning

    Science.gov (United States)

    Chen, Ya; Ding, Xiaoming

    2018-04-01

    To address the two challenges in cross-project software defect prediction, namely the distribution differences between the source project and the target project datasets and the class imbalance in the data, we propose a cross-project software defect prediction method based on transfer learning, named NTrA. Firstly, the class imbalance of the source project data is resolved with the Augmented Neighborhood Cleaning Algorithm. Secondly, the data gravity method is used to assign different weights on the basis of the attribute similarity of the source project and target project data. Finally, a defect prediction model is constructed using the TrAdaBoost algorithm. Experiments were conducted using data from NASA and SOFTLAB, respectively, taken from the published PROMISE dataset. The results show that the method achieves good values of recall and F-measure and good prediction results.

  9. Stationary theory of scattering

    International Nuclear Information System (INIS)

    Kato, T.

    1977-01-01

    A variant of the stationary methods is described, and it is shown that it is useful in a wide range of problems, including scattering by long-range potentials, two-space scattering, and multichannel scattering. The method is based on the notion of spectral forms. The paper is restricted to the simplest case of continuous spectral forms defined on a Banach space embedded in the basic Hilbert space. (P.D.)

  10. Detection of neurotransmitters by a light scattering technique based on seed-mediated growth of gold nanoparticles

    International Nuclear Information System (INIS)

    Shang Li; Dong Shaojun

    2008-01-01

    A simple light scattering detection method for neurotransmitters has been developed, based on the growth of gold nanoparticles. Neurotransmitters (dopamine, L-dopa, noradrenaline and adrenaline) can effectively function as active reducing agents for generating gold nanoparticles, which results in enhanced light scattering signals. The strong light scattering of gold nanoparticles then allows the quantitative detection of the neurotransmitters simply by using a common spectrofluorometer. In particular, Au-nanoparticle seeds were added to facilitate the growth of nanoparticles, which was found to enhance the sensing performance greatly. Using this light scattering technique based on the seed-mediated growth of gold nanoparticles, detection limits of 4.4 × 10⁻⁷ M, 3.5 × 10⁻⁷ M, 4.1 × 10⁻⁷ M, and 7.7 × 10⁻⁷ M were achieved for dopamine, L-dopa, noradrenaline and adrenaline, respectively. The present strategy can be extended to detect other biologically important molecules in a very fast, simple and sensitive way, and may have potential applications in a wide range of fields.

  11. Detection of neurotransmitters by a light scattering technique based on seed-mediated growth of gold nanoparticles

    Energy Technology Data Exchange (ETDEWEB)

    Shang Li; Dong Shaojun [State Key Laboratory of Electroanalytical Chemistry, Changchun Institute of Applied Chemistry, Graduate School of the Chinese Academy of Sciences, Chinese Academy of Sciences, Changchun 130022 (China)], E-mail: dongsj@ciac.jl.cn

    2008-03-05

    A simple light scattering detection method for neurotransmitters has been developed, based on the growth of gold nanoparticles. Neurotransmitters (dopamine, L-dopa, noradrenaline and adrenaline) can effectively function as active reducing agents for generating gold nanoparticles, which results in enhanced light scattering signals. The strong light scattering of gold nanoparticles then allows the quantitative detection of the neurotransmitters simply by using a common spectrofluorometer. In particular, Au-nanoparticle seeds were added to facilitate the growth of nanoparticles, which was found to enhance the sensing performance greatly. Using this light scattering technique based on the seed-mediated growth of gold nanoparticles, detection limits of 4.4 × 10⁻⁷ M, 3.5 × 10⁻⁷ M, 4.1 × 10⁻⁷ M, and 7.7 × 10⁻⁷ M were achieved for dopamine, L-dopa, noradrenaline and adrenaline, respectively. The present strategy can be extended to detect other biologically important molecules in a very fast, simple and sensitive way, and may have potential applications in a wide range of fields.

  12. Software-Based Wireless Power Transfer Platform for Various Power Control Experiments

    Directory of Open Access Journals (Sweden)

    Sun-Han Hwang

    2015-07-01

    Full Text Available In this paper, we present the design and evaluation of a software-based wireless power transfer platform that enables the development of a prototype involving various open- and closed-loop power control functions. Our platform is based on a loosely coupled planar wireless power transfer circuit that uses a class-E power amplifier. In conjunction with this circuit, we implement flexible control functions using a National Instruments Data Acquisition (NI DAQ) board and algorithms in MATLAB/Simulink. To verify the effectiveness of our platform, we conduct two types of power-control experiments: no-load and metal detection using open-loop power control, and output voltage regulation for different receiver positions using closed-loop power control. The use of the MATLAB/Simulink software as a part of the planar wireless power transfer platform for power control experiments is shown to serve as a useful and inexpensive alternative to conventional hardware-based platforms.

  13. SOFTWARE ARCHITECTURE DESIGN OF GIS WEB SERVICE AGGREGATION BASED ON SERVICE GROUP

    Directory of Open Access Journals (Sweden)

    J.-C. Liu

    2012-08-01

    Full Text Available Based on an analysis of the state of research on domestic and international GIS web service aggregation and the development tendency of public GIS web service platforms, this paper designs a software architecture for GIS web service aggregation based on GIS web service groups. Firstly, using a heterogeneous GIS services model, the software architecture converts a variety of heterogeneous services to a unified GIS service interface and divides different types of GIS services into different service groups according to the description of the GIS services. Secondly, a service aggregation process model was designed. This model completes the task of a specific service aggregation instance by automatically selecting member GIS web services in the same service group. Dynamic capabilities and automatic adaptation of the GIS web service aggregation process were achieved. Thirdly, this paper designs a service evaluation model of GIS web service aggregation based on service groups from three aspects, i.e. the GIS web service itself, networking conditions and the service consumer. This model implements effective quality evaluation and performance monitoring of GIS web service aggregation. It can be used to guide the execution, monitoring and service selection of the aggregation process. Therefore, the robustness of the aggregated GIS web service is improved. Finally, the software architecture has been widely used in public GIS web service platforms and in a number of geo-spatial framework constructions for digital cities in Sichuan Province, and it aggregates various GIS web services such as World Map (National Public Platform of Geo-spatial Service), ArcGIS, SuperMap, MapGIS, NewMap etc. Applications show that this software architecture is practical.

  14. Web-based spatial analysis with the ILWIS open source GIS software and satellite images from GEONETCast

    Science.gov (United States)

    Lemmens, R.; Maathuis, B.; Mannaerts, C.; Foerster, T.; Schaeffer, B.; Wytzisk, A.

    2009-12-01

    This paper involves easily accessible, integrated web-based analysis of satellite images with plug-in based open source software. The paper is targeted at both users and developers of geospatial software. Guided by a use case scenario, we describe the ILWIS software and its toolbox to access satellite images through the GEONETCast broadcasting system. The last two decades have shown a major shift from stand-alone software systems to networked ones, often client/server applications using distributed geo-(web-)services. This allows organisations to combine their own data, without much effort, with remotely available data and processing functionality. Key to this integrated spatial data analysis is low-cost access to data from within a user-friendly and flexible software. Web-based open source software solutions are more often a powerful option for developing countries. The Integrated Land and Water Information System (ILWIS) is a PC-based GIS & Remote Sensing software, comprising a complete package of image processing, spatial analysis and digital mapping, and was developed as commercial software from the early nineties onwards. Recent project efforts have migrated ILWIS into a modular, plug-in-based open source software, and provide web-service support for OGC-based web mapping and processing. The core objective of the ILWIS Open source project is to provide a maintainable framework for researchers and software developers to implement training components, scientific toolboxes and (web-) services. The latest plug-ins have been developed for multi-criteria decision making, water resources analysis and spatial statistics analysis. The development of this framework has been done since 2007 in the context of 52°North, which is an open initiative that advances the development of cutting edge open source geospatial software, using the GPL license. GEONETCast, as part of the emerging Global Earth Observation System of Systems (GEOSS), puts essential environmental data at the

  15. Graph Based Verification of Software Evolution Requirements

    NARCIS (Netherlands)

    Ciraci, S.

    2009-01-01

    Due to market demands and changes in the environment, software systems have to evolve. However, the size and complexity of the current software systems make it time consuming to incorporate changes. During our collaboration with the industry, we observed that the developers spend much time on the

  16. Evaluating the scattered radiation intensity in CBCT

    Science.gov (United States)

    Gonçalves, O. D.; Boldt, S.; Nadaes, M.; Devito, K. L.

    2018-03-01

    In this work we calculate the ratio between scattered and transmitted photons (STRR) from a water cylinder reaching a detector matrix element (DME) in a flat array of detectors, similar to those used in cone-beam tomography (CBCT), as a function of the field of view (FOV) and the irradiated volume of the scanned object. We perform the calculation by deriving an equation for the scattered and transmitted radiation and building a computer code to sum the contributions of all voxels of the sample. We compare the calculated results with the shades of gray in a central slice of a tomography obtained from a cylindrical glass container filled with distilled water. The tomography was performed with an I-CAT tomograph (Imaging Science International), from the Department of Dental Clinic - Oral Radiology, Universidade Federal de Juiz de Fora. The shade of gray (voxel gray value - VGV) was obtained using the software provided with the I-CAT. The experimental results show a general behavior compatible with theoretical predictions, attesting to the validity of the method used to calculate the scattering contributions from simple scattering theories in cone-beam tomography. The results also attest to the impossibility of obtaining Hounsfield values from a CBCT.
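    The voxel-summation idea behind the STRR calculation can be written down directly for the single-scatter case. The sketch below assumes precomputed chord lengths through the object and illustrative water attenuation coefficients; a fuller treatment would replace the isotropic 1/(4π) factor with a Klein-Nishina angular distribution.

```python
import numpy as np

def strr(direct_chord, chord_in, chord_out, voxel_path, solid_angle,
         mu_total=0.2, mu_scatter=0.18):
    """Single-scatter estimate of the scattered-to-transmitted ratio (STRR)
    at one detector matrix element (DME). Coefficients are illustrative
    water values in 1/cm.

    direct_chord -- length of the source-DME ray inside the object (cm)
    chord_in     -- per-voxel in-object path length from source to voxel
    chord_out    -- per-voxel in-object path length from voxel to the DME
    voxel_path   -- per-voxel path length of the primary ray through the voxel
    solid_angle  -- per-voxel solid angle subtended by the DME (sr)"""
    transmitted = np.exp(-mu_total * direct_chord)
    scattered = np.sum(
        np.exp(-mu_total * chord_in)         # attenuation on the way in
        * mu_scatter * voxel_path            # scatter probability in the voxel
        * solid_angle / (4.0 * np.pi)        # isotropic fraction toward the DME
        * np.exp(-mu_total * chord_out))     # attenuation on the way out
    return scattered / transmitted
```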

  17. Windows pollution problems of the dust concentration measurement based on scattering method

    International Nuclear Information System (INIS)

    Zhao Yanjun; Zhang Yongtao; Shi Xinyue; Xu Chuanlong; Wang Shimin

    2009-01-01

    The windows separate the measurement system from the dusty space in a light-scattering dust concentration measurement system. They are unavoidably polluted by the dust, which produces measurement error. Based on Mie scattering theory, this measurement error is investigated in this paper. The numerical simulation results show that the measurement error is related to the particle diameter distribution and the refractive index, but is independent of the average particle diameter. A novel photoelectric sensor is developed in this paper to address the measurement error caused by window pollution. A calculation method is presented that can correct the measurement errors caused by window pollution and improve measurement accuracy.

  18. A Software Engineering Approach based on WebML and BPMN to the Mediation Scenario of the SWS Challenge

    Science.gov (United States)

    Brambilla, Marco; Ceri, Stefano; Valle, Emanuele Della; Facca, Federico M.; Tziviskou, Christina

    Although Semantic Web Services are expected to produce a revolution in the development of Web-based systems, very few enterprise-wide design experiences are available; one of the main reasons is the lack of sound Software Engineering methods and tools for the deployment of Semantic Web applications. In this chapter, we present an approach to software development for the Semantic Web based on classical Software Engineering methods (i.e., formal business process development, computer-aided and component-based software design, and automatic code generation) and on semantic methods and tools (i.e., ontology engineering, semantic service annotation and discovery).

  19. Software evolution with XVCL

    DEFF Research Database (Denmark)

    Zhang, Weishan; Jarzabek, Stan; Zhang, Hongyu

    2004-01-01

    This chapter introduces software evolution with XVCL (XML-based Variant Configuration Language), which is an XML-based metaprogramming technique. As the software evolves, a large number of variants may arise, especially when such evolutions are related to multiple platforms, as shown in our case study. Handling variants and tracing the impact of variants across the development lifecycle is a challenge. This chapter shows how we can maintain different versions of software in a reuse-based way.

  20. Development of data acquisition and processing software based on MS-Windows 3.X for safeguards

    International Nuclear Information System (INIS)

    Tan Yajun

    1996-01-01

    The development method of data acquisition and processing software based on MS-Windows 3.X for safeguards is presented. The paper describes the design methods for the graphical user interface (GUI), the multiwindow and multitask-based spectrum graph display, data acquisition and processing, and the application of object-oriented programming (OOP). Using the package, an effective prototype design path can be found for MS-Windows-based software. The methods and programs have been applied in safeguards non-destructive assay systems

  1. Model Based Analysis and Test Generation for Flight Software

    Science.gov (United States)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  2. Operating Experience of Digital, Software-based Components Used in I and C and Electrical Systems in German NPPs

    International Nuclear Information System (INIS)

    Blum, Stefanie; Lochthofen, Andre; Quester, Claudia; Arians, Robert

    2015-01-01

    In recent years, many components in instrumentation and control (I and C) and electrical systems of nuclear power plants (NPPs) were replaced by digital, software-based components. Due to the more complex structure, software-based I and C and electrical components show the potential for new failure mechanisms and an increasing number of failure possibilities, including the potential for common cause failures. An evaluation of the operating experience of digital, software-based components may help to determine new failure modes of these components. In this paper, we give an overview over the results of the evaluation of the operating experience of digital, software-based components used in I and C and electrical systems in NPPs in Germany. (authors)

  3. Toward a new polyethylene scattering law determined using inelastic neutron scattering

    International Nuclear Information System (INIS)

    Lavelle, C.M.; Liu, C.-Y.; Stone, M.B.

    2013-01-01

    Monte Carlo neutron transport codes such as MCNP rely on accurate data for nuclear physics cross-sections to produce accurate results. At low energy, this takes the form of scattering laws based on the dynamic structure factor, S(Q,E). High density polyethylene (HDPE) is frequently employed as a neutron moderator at both high and low temperatures, however the only cross-sections available are for ambient temperatures (∼300 K), and the evaluation has not been updated in quite some time. In this paper we describe inelastic neutron scattering measurements on HDPE at 5 and 294 K which are used to improve the scattering law for HDPE. We review some of the past HDPE scattering laws, describe the experimental methods, and compare computations using these models to the measured S(Q,E). The total cross-section is compared to available data, and the treatment of the carbon secondary scatterer as a free gas is assessed. We also discuss the use of the measurement itself as a scattering law via the one phonon approximation. We show that a scattering law computed using a more detailed model for the Generalized Density of States (GDOS) compares more favorably to this experiment, suggesting that inelastic neutron scattering can play an important role in both the development and validation of new scattering laws for Monte Carlo work. -- Highlights: ► Polyethylene at 5 K and 300 K is measured using inelastic neutron scattering (INS). ► Measurements conducted at the Wide Angular-Range Chopper Spectrometer at SNS. ► Several models for Polyethylene are compared to measurements. ► Improvements to existing models for the polyethylene scattering law are suggested. ► INS is shown to be a highly valuable tool for scattering law development

  4. [Physical Activity in the Context of Workplace Health Promotion: A Systematic Review on the Effectiveness of Software-Based in Contrast to Personal-Based Interventions].

    Science.gov (United States)

    Rudolph, Sabrina; Göring, Arne; Padrok, Dennis

    2018-01-03

    Sports and physical activity interventions are attracting considerable attention in the context of workplace health promotion. Due to increasing digitalization, software-based interventions that promote physical activity in particular are gaining acceptance in practice. Empirical evidence concerning the efficiency of software-based interventions in the context of workplace health promotion has so far been rather limited. This paper examines in what way software-based interventions are more efficient than personal-based interventions in terms of increasing the level of physical activity. A systematic review according to the specifications of the Cochrane Collaboration was conducted. Inclusion criteria and should-have criteria were defined, and by means of the should-have criteria the quality score of the studies was calculated. The software-based and personal-based interventions are presented in 2 tables with the categories author, year, country, sample group, aim of the intervention, methods, outcome and study quality. A total of 25 studies are included in the evaluation (12 personal- and 13 software-based interventions). The quality scores of the studies are heterogeneous and range from 3 to 9 points. 5 personal-based and 5 software-based studies achieved an increase of physical activity. Other positive effects on health were presented in the studies, for example, a reduction in blood pressure or body-mass index. A few studies did not show any improvement in health-related parameters. This paper demonstrates that positive effects can be achieved with both intervention types. Software-based interventions show advantages due to the use of new technologies. The use of desktop or mobile applications facilitates organization, communication and data acquisition with fewer resources needed. A schooled trainer, on the other hand, is able to react to specific and varying needs of the employees. This aspect should be considered very significant.

  5. A hybrid Scatter/Transform cloaking model

    Directory of Open Access Journals (Sweden)

    Gad Licht

    2015-01-01

    Full Text Available A new Scatter/Transform cloak is developed that combines the light bending of refraction characteristic of a Transform cloak with the scatter cancellation characteristic of a Scatter cloak. The hybrid cloak incorporates both Transform’s variable index of refraction with modified linear intrusions to maximize the Scatter cloak effect. Scatter/Transform improved the scattering cross-section of cloaking in a 2-dimensional space to 51.7% compared to only 39.6% or 45.1% respectively with either Scatter or Transform alone. Metamaterials developed with characteristics based on the new ST hybrid cloak will exhibit superior cloaking capabilities.

  6. The Computer-based Health Evaluation Software (CHES: a software for electronic patient-reported outcome monitoring

    Directory of Open Access Journals (Sweden)

    Holzner Bernhard

    2012-11-01

    Full Text Available Abstract Background Patient-reported Outcomes (PROs), capturing e.g. quality of life, fatigue, depression, medication side-effects or disease symptoms, have become important outcome parameters in medical research and daily clinical practice. Electronic PRO data capture (ePRO), with software packages to administer questionnaires, store data and present results, has facilitated PRO assessment in hospital settings. Compared to conventional paper-pencil versions of PRO instruments, ePRO is more economical with regard to staff resources and time, and allows immediate presentation of results to the medical staff. The objective of our project was to develop software (CHES – Computer-based Health Evaluation System) for ePRO in hospital settings and at home, with a special focus on the presentation of individual patients' results. Methods Following the Extreme Programming development approach, the architecture was not fixed up-front but was developed in close, continuous collaboration with software end users (medical staff, researchers and patients) to meet their specific demands. Developed features include sophisticated longitudinal charts linking patients' PRO data to clinical characteristics and to PRO scores from reference populations, a web interface for questionnaire administration, and a tool for conveniently creating and editing questionnaires. Results By 2012 CHES had been implemented at various institutions in Austria, Germany, Switzerland, and the UK, and about 5000 patients had participated in ePRO (with around 15000 assessments in total). Data entry is done by the patients themselves via tablet PCs, with a study nurse or an intern approaching patients and supervising questionnaire completion. Discussion During the last decade several software packages for ePRO have emerged for different purposes. Whereas commercial products are available primarily for ePRO in clinical trials, academic projects have focused on data collection and presentation in daily

  7. Optical diagnostics based on elastic scattering: An update of clinical demonstrations with the Optical Biopsy System

    Energy Technology Data Exchange (ETDEWEB)

    Bigio, I.J.; Boyer, J.; Johnson, T.M.; Lacey, J.; Mourant, J.R. [Los Alamos National Lab., NM (United States); Conn, R. [Lovelace Medical Center, Albuquerque, NM (United States); Bohorfoush, A. [Wisconsin Medical School, Milwaukee, WI (United States)

    1994-10-01

    The Los Alamos National Laboratory has continued the development of the Optical Biopsy System (OBS) for noninvasive, real-time in situ diagnosis of tissue pathologies. Our clinical studies have expanded since the last Biomedical Optics Europe conference (Budapest, September 1993), and we report here on the latest results of clinical tests in the gastrointestinal tract. The OBS invokes a unique approach to optical diagnosis of tissue pathologies based on the elastic scattering properties, over a wide range of wavelengths, of the tissue. The use of elastic scattering as the key to optical tissue diagnostics in the OBS is based on the fact that many tissue pathologies, including a majority of cancer forms, manifest significant architectural changes at the cellular and sub-cellular level. Since the cellular components that cause elastic scattering have dimensions typically on the order of visible to near-IR wavelengths, the elastic (Mie) scattering properties will be wavelength dependent. Thus, morphology and size changes can be expected to cause significant changes in an optical signature that is derived from the wavelength dependence of elastic scattering. The OBS employs a small fiberoptic probe that is amenable to use with any endoscope or catheter, or to direct surface examination. The probe is designed to be used in optical contact with the tissue under examination and has separate illuminating and collecting fibers. Thus, the light that is collected and transmitted to the analyzing spectrometer must first scatter through a small volume of the tissue before entering the collection fiber(s). Consequently, the system is also sensitive to the optical absorption spectrum of the tissue, over an effective operating range of <300 to 950 nm, and such absorption adds valuable complexity to the scattering spectral signature.

  8. Reactor protection system software test-case selection based on input-profile considering concurrent events and uncertainties

    International Nuclear Information System (INIS)

    Khalaquzzaman, M.; Lee, Seung Jun; Cho, Jaehyun; Jung, Wondea

    2016-01-01

    Recently, input-profile-based testing for safety-critical software has been proposed for determining the number of test cases and quantifying the failure probability of the software. The input profile of reactor protection system (RPS) software is the input that causes activation of the system for emergency shutdown of a reactor. This paper presents a method to determine the input profile of RPS software that considers concurrent events/transients. A deviation of a process parameter value begins with an event and increases owing to concurrent multiple events, depending on the correlation of process parameters and the severity of incidents. A case of reactor trip caused by feedwater loss and a main steam line break is simulated and analyzed to determine the RPS software input profile and estimate the number of test cases. Different sizes of main steam line breaks (e.g., small, medium, large break) with total loss of feedwater supply are considered in constructing the input profile. The uncertainties of the simulation related to input-profile-based software testing are also included. Our study is expected to provide an option for determining test cases and quantifying RPS software failure probability. (author)

  9. Blind source separation based on time-frequency morphological characteristics for rigid acoustic scattering by underwater objects

    Science.gov (United States)

    Yang, Yang; Li, Xiukun

    2016-06-01

    Separation of the components of rigid acoustic scattering by underwater objects is essential in obtaining the structural characteristics of such objects. To overcome the problem of rigid structures appearing to have the same spectral structure in the time domain, time-frequency Blind Source Separation (BSS) can be used in combination with image morphology to separate the rigid scattering components of different objects. Based on a highlight model, the separation of the rigid scattering structure of objects with a time-frequency distribution is deduced. Using a morphological filter, the different characteristics observed in a Wigner-Ville Distribution (WVD) for the single auto term and the cross terms can be used to remove cross-term interference. By selecting the time and frequency points of the auto-term signal, the accuracy of BSS can be improved. An experimental simulation has been used, with changes in the pulse width of the transmitted signal, the relative amplitude and the time delay parameter, in order to analyze the feasibility of this new method. Simulation results show that the new method is not only able to separate rigid scattering components, but can also separate the components when elastic scattering and rigid scattering exist at the same time. Experimental results confirm that the new method can be used in separating the rigid scattering structure of underwater objects.
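    A discrete Wigner-Ville distribution is straightforward to compute, and it makes the auto-term/cross-term contrast that the morphological filter exploits easy to visualize. A minimal sketch (the morphological filtering and BSS steps themselves are omitted):

```python
import numpy as np
from scipy.signal import hilbert

def wigner_ville(x_real):
    """Discrete Wigner-Ville distribution (rows: time, columns: frequency).
    Auto terms concentrate along the signal components while cross terms
    oscillate, which is the contrast the morphological filter exploits."""
    x = hilbert(x_real)                      # analytic signal reduces aliasing
    n_len = len(x)
    wvd = np.zeros((n_len, n_len))
    for n in range(n_len):
        lag = min(n, n_len - 1 - n)
        kernel = np.zeros(n_len, dtype=complex)
        for m in range(-lag, lag + 1):
            kernel[m % n_len] = x[n + m] * np.conj(x[n - m])
        wvd[n] = np.fft.fft(kernel).real
    return wvd
```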

  10. Quantitative Evaluation of 2 Scatter-Correction Techniques for 18F-FDG Brain PET/MRI in Regard to MR-Based Attenuation Correction.

    Science.gov (United States)

    Teuho, Jarmo; Saunavaara, Virva; Tolvanen, Tuula; Tuokkola, Terhi; Karlsson, Antti; Tuisku, Jouni; Teräs, Mika

    2017-10-01

    In PET, corrections for photon scatter and attenuation are essential for visual and quantitative consistency. MR attenuation correction (MRAC) is generally conducted by image segmentation and assignment of discrete attenuation coefficients, which offer limited accuracy compared with CT attenuation correction. Potential inaccuracies in MRAC may affect scatter correction, because the attenuation image (μ-map) is used in single scatter simulation (SSS) to calculate the scatter estimate. We assessed the impact of MRAC on scatter correction using 2 scatter-correction techniques and 3 μ-maps for MRAC. Methods: The tail-fitted SSS (TF-SSS) and a Monte Carlo-based single scatter simulation (MC-SSS) algorithm implementations on the Philips Ingenuity TF PET/MR were used with 1 CT-based and 2 MR-based μ-maps. Data from 7 subjects were used in the clinical evaluation, and a phantom study using an anatomic brain phantom was conducted. Scatter-correction sinograms were evaluated for each scatter correction method and μ-map. Absolute image quantification was investigated with the phantom data. Quantitative assessment of PET images was performed by volume-of-interest and ratio image analysis. Results: MRAC did not result in large differences in scatter algorithm performance, especially with TF-SSS. Scatter sinograms and scatter fractions did not reveal large differences regardless of the μ-map used. TF-SSS showed slightly higher absolute quantification. The differences in volume-of-interest analysis between TF-SSS and MC-SSS were 3% at maximum in the phantom and 4% in the patient study. Both algorithms showed excellent correlation with each other with no visual differences between PET images. MC-SSS showed a slight dependency on the μ-map used, with a difference of 2% on average and 4% at maximum when a μ-map without bone was used. Conclusion: The effect of different MR-based μ-maps on the performance of scatter correction was minimal in non-time-of-flight 18F-FDG PET.

  11. Considerations of the Software Metric-based Methodology for Software Reliability Assessment in Digital I and C Systems

    International Nuclear Information System (INIS)

    Ha, J. H.; Kim, M. K.; Chung, B. S.; Oh, H. C.; Seo, M. R.

    2007-01-01

    Analog I and C systems have been replaced by digital I and C systems because the digital systems have many potential benefits to nuclear power plants in terms of operational and safety performance. For example, digital systems are essentially free of drifts, have higher data handling and storage capabilities, and provide improved performance by accuracy and computational capabilities. In addition, analog replacement parts become more difficult to obtain since they are obsolete and discontinued. There are, however, challenges to the introduction of digital technology into the nuclear power plants because digital systems are more complex than analog systems and their operation and failure modes are different. Especially, software, which can be the core of functionality in the digital systems, does not wear out physically like hardware and its failure modes are not yet defined clearly. Thus, some researches to develop the methodology for software reliability assessment are still proceeding in the safety-critical areas such as nuclear system, aerospace and medical devices. Among them, software metric-based methodology has been considered for the digital I and C systems of Korean nuclear power plants. Advantages and limitations of that methodology are identified and requirements for its application to the digital I and C systems are considered in this study

  12. UAV remote sensing atmospheric degradation image restoration based on multiple scattering APSF estimation

    Science.gov (United States)

    Qiu, Xiang; Dai, Ming; Yin, Chuan-li

    2017-09-01

    Unmanned aerial vehicle (UAV) remote imaging is affected by bad weather, and the obtained images suffer from low contrast, complex texture and blurring. In this paper, we propose a blind deconvolution model based on multiple-scattering atmosphere point spread function (APSF) estimation to recover the remote sensing image. Following Narasimhan's analytical theory, a new multiple-scattering restoration model is established based on the improved dichromatic model. The APSF blur kernel is then estimated using L0-norm sparse priors on the gradient and dark channel, and the fast Fourier transform is used to recover the original clear image by Wiener filtering. Compared with other state-of-the-art methods, the proposed method can correctly estimate the blur kernel, effectively remove atmospheric degradation, preserve image detail and improve the quality evaluation indexes.
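
    The final restoration step, Wiener filtering in the Fourier domain given the estimated APSF kernel, can be sketched as follows; the noise-to-signal constant K and the demo image are illustrative assumptions, not the paper's parameters:

    ```python
    # A minimal sketch of frequency-domain Wiener deconvolution with a known PSF.
    import numpy as np

    def wiener_deconvolve(blurred, psf, K=0.01):
        """Restore an image given a PSF via Wiener filtering in the Fourier domain."""
        H = np.fft.fft2(psf, s=blurred.shape)   # transfer function of the blur kernel
        G = np.fft.fft2(blurred)
        W = np.conj(H) / (np.abs(H) ** 2 + K)   # Wiener filter with NSR constant K
        return np.real(np.fft.ifft2(W * G))

    # tiny demo: blur a square with a smooth kernel, then restore it
    img = np.zeros((64, 64)); img[30:34, 30:34] = 1.0
    psf = np.outer(np.hanning(9), np.hanning(9)); psf /= psf.sum()
    blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf, s=img.shape)))
    restored = wiener_deconvolve(blurred, psf)
    print(f"peak before {blurred.max():.2f}, after {restored.max():.2f}")
    ```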

  13. LEGOS: Object-based software components for mission-critical systems. Final report, June 1, 1995--December 31, 1997

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-08-01

    An estimated 85% of the installed base of software is custom applications with a production quantity of one. In practice, almost 100% of military software systems are custom software. Paradoxically, the marginal costs of producing additional units are near zero. So why hasn't the software market, a market with high design costs and low production costs, evolved like other similar custom widget industries, such as automobiles and hardware chips? The military software industry seems immune to the market pressures that have motivated a multilevel supply chain structure in other widget industries: recovering design costs, improving quality through specialization, and enabling rapid assembly from purchased components. The primary goal of the ComponentWare Consortium (CWC) technology plan was to overcome barriers to building and deploying mission-critical information systems by using verified, reusable software components (ComponentWare). The adoption of the ComponentWare infrastructure is predicated upon a critical mass of the leading platform vendors adopting emerging, object-based, distributed computing frameworks--initially CORBA and COM/OLE. The long-range goal of this work is to build and deploy military systems from verified reusable architectures. The promise of component-based applications is to enable developers to snap together new applications by mixing and matching prefabricated software components. A key result of this effort is the concept of reusable software architectures. A second important contribution is the notion that a software architecture is something that can be captured in a formal language and reused across multiple applications. The formalization and reuse of software architectures provide major cost and schedule improvements. The Unified Modeling Language (UML) is fast becoming the industry standard for object-oriented analysis and design notation for object-based systems. However, the lack of a standard real-time distributed

  14. Note: Tormenta: An open source Python-powered control software for camera based optical microscopy.

    Science.gov (United States)

    Barabas, Federico M; Masullo, Luciano A; Stefani, Fernando D

    2016-12-01

    Until recently, PC control and synchronization of scientific instruments was only possible through closed-source expensive frameworks like National Instruments' LabVIEW. Nowadays, efficient cost-free alternatives are available in the context of a continuously growing community of open-source software developers. Here, we report on Tormenta, a modular open-source software for the control of camera-based optical microscopes. Tormenta is built on Python, works on multiple operating systems, and includes some key features for fluorescence nanoscopy based on single molecule localization.

  15. Commercial off-the-shelf software dedication process based on the commercial grade survey of supplier

    International Nuclear Information System (INIS)

    Kim, J. Y.; Lee, J. S.; Chon, S. W.; Lee, G. Y.; Park, J. K.

    2000-01-01

    Commercial Off-The-Shelf (COTS) software dedication can apply a combination of methods, as in the hardware commercial-grade item dedication process. In general, these methods are: method 1 (special test and inspection), method 2 (commercial grade survey of supplier), method 3 (source verification), and method 4 (acceptance of supplier/item performance record). In this paper, the suggested procedure-oriented dedication process for COTS software, based on method 2, is consistent with EPRI/TR-106439 and NUREG/CR-6421 requirements. An additional tailoring policy based on codes and standards related to COTS software may also be found in the suggested commercial software dedication process. The suggested commercial software dedication process has been developed for dedicators of commercial I and C software who perform COTS qualification according to the dedication procedure

  16. Portable bacterial identification system based on elastic light scatter patterns

    Directory of Open Access Journals (Sweden)

    Bae Euiwon

    2012-08-01

    Full Text Available Background: Conventional diagnosis and identification of bacteria requires shipment of samples to a laboratory for genetic and biochemical analysis. This process can take days and imposes significant delay to action in situations where timely intervention can save lives and reduce associated costs. To enable faster response to an outbreak, a low-cost, small-footprint, portable microbial-identification instrument using forward scatterometry has been developed. Results: This device, weighing 9 lb and measuring 12 × 6 × 10.5 in., utilizes elastic light scatter (ELS) patterns to accurately capture bacterial colony characteristics and delivers the classification results via wireless access. The overall system consists of two CCD cameras, one rotational and one translational stage, and a 635-nm laser diode. Various software algorithms such as the Hough transform, 2-D geometric moments, and the traveling salesman problem (TSP) have been implemented to provide colony count and circularity, the centering process, and minimized travel time among colonies. Conclusions: Experiments were conducted with four bacteria genera using pure and mixed plates and, as proof of principle, a field test was conducted in four different locations where the average classification rate ranged between 95 and 100%.
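
    Of the algorithms listed, the TSP step that minimizes stage travel among colonies can be sketched with a simple nearest-neighbour heuristic; the coordinates and the greedy heuristic itself are illustrative assumptions, not necessarily the instrument's exact solver:

    ```python
    # A minimal sketch of a nearest-neighbour TSP tour over colony centroids.
    import math

    def nearest_neighbour_tour(points, start=0):
        """Greedy TSP tour: repeatedly visit the closest unvisited colony."""
        unvisited = set(range(len(points)))
        tour = [start]
        unvisited.remove(start)
        while unvisited:
            last = points[tour[-1]]
            nxt = min(unvisited, key=lambda i: math.dist(last, points[i]))
            tour.append(nxt)
            unvisited.remove(nxt)
        return tour

    print(nearest_neighbour_tour([(0, 0), (5, 1), (1, 1), (4, 4)]))  # e.g. [0, 2, 1, 3]
    ```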

  17. Polaron scattering by an external field

    International Nuclear Information System (INIS)

    Kochetov, E.A.

    1980-01-01

    The problem of polaron scattering by an external field is studied. The problem is solved using the stationary scattering theory formalism based on two operators: the Green function operator G and the scattering operator T. The dependence of the scattering amplitude on the quasiparticle structure is studied. A variational approach is used to estimate the ground-state energy level
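
    For context, the two operators mentioned are conventionally related by the Lippmann-Schwinger equations of stationary scattering theory (a standard sketch; the paper's notation may differ; here G_0 is the free Green function and V the interaction):

    ```latex
    G(E) = G_{0}(E) + G_{0}(E)\, V\, G(E),
    \qquad
    T(E) = V + V\, G_{0}(E)\, T(E),
    ```

    with the scattering amplitude obtained from on-shell matrix elements of T.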

  18. Studies and analyses of the space shuttle main engine. Failure information propagation model data base and software

    Science.gov (United States)

    Tischer, A. E.

    1987-01-01

    The failure information propagation model (FIPM) data base was developed to store and manipulate the large amount of information anticipated for the various Space Shuttle Main Engine (SSME) FIPMs. The organization and structure of the FIPM data base is described, including a summary of the data fields and key attributes associated with each FIPM data file. The menu-driven software developed to facilitate and control the entry, modification, and listing of data base records is also discussed. The transfer of the FIPM data base and software to the NASA Marshall Space Flight Center is described. Complete listings of all of the data base definition commands and software procedures are included in the appendixes.

  19. Attenuation correction for the HRRT PET-scanner using transmission scatter correction and total variation regularization

    DEFF Research Database (Denmark)

    Keller, Sune H; Svarer, Claus; Sibomana, Merence

    2013-01-01

    In the standard software for the Siemens high-resolution research tomograph (HRRT) positron emission tomography (PET) scanner, the most commonly used segmentation in the μ-map reconstruction for human brain scans is maximum a posteriori for transmission (MAP-TR). Bias in the lower cerebellum... The TXTV method evaluated here adds scatter correction in the μ-map reconstruction and total variation filtering to the transmission processing. Results: Comparing MAP-TR and the new TXTV with gold standard CT-based attenuation correction, we found that TXTV has less bias as compared to MAP-TR. We also compared images acquired at the HRRT...

  20. Study of a new approach to diagnose breast cancer based on synchrotron radiation scattering properties

    International Nuclear Information System (INIS)

    Conceicao, A.L.C.; Poletti, M.E.

    2012-01-01

    Full text: Breast cancer is the most frequently occurring cancer in women, accounting for about 20% of all cancer deaths. This scenario is due, among other factors, to inherent limitations of the current clinical methods of diagnosis based on x-ray absorption. Meanwhile, recent research has shown that scattered radiation can provide information about the structures that compose a biological tissue, like breast tissue. The information provided by x-ray scattering techniques can therefore be used to identify breast cancer. In this work, we developed a classification model based on discriminant analysis of the scattering profiles of 106 human breast samples histopathologically classified as normal tissue, benign lesion and malignant lesion, in the wide-angle (WAXS) and small-angle (SAXS) x-ray scattering regions. The WAXS and SAXS experiments were carried out at the D12A-XRD1 and D02-SAXS2 beam lines of the National Synchrotron Light Laboratory (LNLS) in Campinas. For the WAXS experiment, an x-ray beam energy of 11 keV was used, allowing the momentum transfer interval 0.7 nm⁻¹ ≤ q ≤ 70.5 nm⁻¹, where q = 4π·sin(θ/2)/λ, to be recorded on the NaI(Tl) detector. For the SAXS experiment, an x-ray wavelength of 1.488 Å, a two-dimensional detector and several sample-detector distances were used, covering the range 0.07 nm⁻¹ ≤ q ≤ 4.20 nm⁻¹. The scattering profiles in both regions for each sample were used to build the diagnosis model based on discriminant analysis. From the WAXS data, differences related to the position and intensity of the peaks of the molecular structures were found when normal and pathological breast tissues were compared, while for SAXS these differences were observed in supramolecular structures. The diagnostic model combining the information at WAXS and SAXS yielded two linear functions, which allow changes at the molecular scale to be correlated with those at the supramolecular level, as well as correctly classifying all samples analyzed in this study [1]. Finally, the results achieved in this
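
    A sketch of the kind of discriminant analysis described, using scikit-learn on stand-in data (the study used measured WAXS/SAXS profiles and its own discriminant functions, so everything below is illustrative):

    ```python
    # A minimal sketch of LDA on concatenated scattering profiles.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(1)
    profiles = rng.random((106, 300))        # 106 samples x 300 q-bins (stand-in data)
    labels = rng.integers(0, 3, size=106)    # 0=normal, 1=benign, 2=malignant

    lda = LinearDiscriminantAnalysis(n_components=2)  # two discriminant functions
    scores = lda.fit_transform(profiles, labels)
    print(scores.shape)                      # (106, 2): sample positions in LDA space
    ```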

  1. How Well Can Existing Software Support Processes Accomplish Sustainment of a Non-Developmental Item-Based Acquisition Strategy

    Science.gov (United States)

    2017-04-06

    guidance to the PM regarding development and sustainment of software. The need for a strong application of software engineering principles is... on the battlefield by a government-developed network manager application. The configuration of this confluence of software will be jointly managed... How Well Can Existing Software-Support Processes Accomplish Sustainment of a Non-Developmental Item-Based Acquisition Strategy? Graciano

  2. A software engineering process for safety-critical software application

    International Nuclear Information System (INIS)

    Kang, Byung Heon; Kim, Hang Bae; Chang, Hoon Seon; Jeon, Jong Sun

    1995-01-01

    Application of computer software to safety-critical systems is on the increase. To be successful, the software must be designed and constructed to meet the functional and performance requirements of the system. For safety reasons, the software must be demonstrated not only to meet these requirements, but also to operate safely as a component within the system. For longer-term cost considerations, the software must be designed and structured to ease future maintenance and modifications. This paper presents a software engineering process for the production of safety-critical software for a nuclear power plant. The presentation is expository in nature, describing a viable process for high-quality safety-critical software development. It is based on the ideas of a rational design process and on the experience of adapting such a process in the production of the safety-critical software for shutdown system number two of the Wolsung 2, 3 and 4 nuclear power generation plants. This process is significantly different from a conventional process in terms of rigorous software development phases and software design techniques. The process covers documentation, design, verification and testing, using mathematically precise notations and a highly reviewable tabular format to specify software requirements, and verifying the software design and code against those requirements using static analysis. The software engineering process described in this paper applies the principle of information-hiding decomposition in software design, using a modular design technique, so that when a change is required or an error is detected, the affected scope can be readily and confidently located. It also facilitates a high degree of confidence in the 'correctness' of the software production, and provides a relatively simple and straightforward code implementation effort. 1 figs., 10 refs. (Author)

  3. Methodological approaches based on business rules

    Directory of Open Access Journals (Sweden)

    Anca Ioana ANDREESCU

    2008-01-01

    Full Text Available Business rules and business processes are essential artifacts in defining the requirements of a software system. Business processes capture business behavior, while rules connect processes and thus control processes and business behavior. Traditionally, rules are scattered inside application code. This approach makes it very difficult to change rules and shortens the life cycle of the software system. Because rules change more quickly than the application itself, it is desirable to externalize the rules and move them outside the application. This paper analyzes and evaluates three well-known business rules approaches. It also outlines some critical factors that have to be taken into account in the decision to introduce business rules facilities in a software system. Based on the concept of explicit manipulation of business rules in a software system, the need for a general approach based on business rules is discussed.
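
    As an illustration of the externalization idea discussed above, the toy sketch below moves rule thresholds into a JSON document that can change without touching application code; the rule vocabulary is invented for this example:

    ```python
    # A minimal sketch of externalized business rules loaded at runtime.
    import json

    RULES_JSON = '{"min_order_eur": 50, "free_shipping_eur": 200}'

    def shipping_decision(order_total, rules):
        """Apply externalized business rules to an order."""
        if order_total < rules["min_order_eur"]:
            return "reject"
        return "free shipping" if order_total >= rules["free_shipping_eur"] else "standard"

    rules = json.loads(RULES_JSON)       # in practice, loaded from a file or rule engine
    print(shipping_decision(75, rules))  # -> standard
    ```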

  4. Improving quantitative dosimetry in (177)Lu-DOTATATE SPECT by energy window-based scatter corrections

    DEFF Research Database (Denmark)

    de Nijs, Robin; Lagerburg, Vera; Klausen, Thomas L

    2014-01-01

    and the activity, which depends on the collimator type, the utilized energy windows and the applied scatter correction techniques. In this study, energy window subtraction-based scatter correction methods are compared experimentally and quantitatively. MATERIALS AND METHODS: (177)Lu SPECT images of a phantom...... technique, the measured ratio was close to the real ratio, and the differences between spheres were small. CONCLUSION: For quantitative (177)Lu imaging MEGP collimators are advised. Both energy peaks can be utilized when the ESSE correction technique is applied. The difference between the calculated...

  5. Synthesis-Based Software Architecture Design

    NARCIS (Netherlands)

    Tekinerdogan, B.; Aksit, Mehmet; Aksit, Mehmet

    2001-01-01

    During the last decade several architecture design approaches have been introduced. These approaches however have to cope with several obstacles and software architecture design remains a difficult problem. To cope with these obstacles this chapter introduces a novel architecture design approach.

  6. An IF Signal Processing System Design Based on a Software Radio Platform

    Directory of Open Access Journals (Sweden)

    Zhao Jing

    2018-01-01

    Full Text Available Software radio is a design concept for implementing flexible functions on a fixed hardware platform. Any platform based on it is characterized as universal, standardized, modular, open and highly flexible. For practical reasons, an ideal software radio platform is hard to realize, so most signal processing is performed after mixing. According to software radio requirements, an “FPGA+ADC+DAC” structure is designed. Compared with former processors, this module has broad application prospects thanks to its small size, low power, and configurable, programmable features. It performs multiple functions, such as generating IF signals, digital down-conversion, and synchronous demodulation. The module also provides an extended host interface to communicate with host computers. In practical tests, taking an MSK signal as an example, at a bit rate of 1 Mb/s the bit error rate is lower than 10⁻⁶.
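
    A sketch of the digital down-conversion stage such a module performs, written in Python with NumPy/SciPy rather than FPGA logic; the sample rate, IF and filter parameters are illustrative assumptions:

    ```python
    # A minimal sketch of digital down-conversion: NCO mix, then low-pass to baseband.
    import numpy as np
    from scipy.signal import firwin, lfilter

    fs, f_if = 40e6, 10e6                    # assumed sample rate and IF
    t = np.arange(4096) / fs
    if_signal = np.cos(2 * np.pi * f_if * t) # stand-in for ADC samples

    nco = np.exp(-2j * np.pi * f_if * t)     # numerically controlled oscillator
    baseband = if_signal * nco               # complex mix down to 0 Hz

    lpf = firwin(101, 1e6, fs=fs)            # 1 MHz low-pass FIR filter
    iq = lfilter(lpf, 1.0, baseband)         # filtered I/Q samples
    print(np.abs(iq[200:210]).round(3))
    ```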

  7. On teaching software engineering based on formal techniques

    DEFF Research Database (Denmark)

    Bjørner, Dines

    2001-01-01

    Thoughts about and plans for a different software engineering textbook. Peter Lucas Farewell Symposium...

  8. Design, implementation and verification of software code for radiation dose assessment based on simple generic environmental model

    International Nuclear Information System (INIS)

    I Putu Susila; Arif Yuniarto

    2017-01-01

    Radiation dose assessment to determine the potential radiological impacts of the various installations within a nuclear facility complex is necessary to ensure environmental and public safety. A simple generic-model-based method for calculating radiation doses caused by the release of radioactive substances into the environment has been published by the International Atomic Energy Agency (IAEA) as Safety Reports Series No. 19 (SRS-19). To assist the application of the assessment method, and as a basis for the development of more complex assessment methods, an open-source software code has been designed and implemented. The software comes with maps and is very easy to use because assessment scenarios can be composed through diagrams. Software verification was performed by comparing its results to SRS-19 and CROM software calculation results. Doses estimated by SRS-19 are higher compared to the results of the developed software. However, these are still acceptable since dose estimation in SRS-19 is based on a conservative approach. On the other hand, compared to the CROM software, the same results were obtained for three scenarios, with a non-significant difference of 2.25% in another scenario. These results indicate the correctness of our implementation and imply that the developed software is ready for use in real scenarios. In the future, various features and new models need to be added to improve the capability of the software that has been developed. (author)
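
    A screening-level sketch of the generic dose arithmetic underlying such models (dose = concentration x intake x dose coefficient); all parameter values are illustrative assumptions, and this is neither the SRS-19 formulation in full nor the developed code:

    ```python
    # A minimal sketch of a screening-level inhalation dose estimate.
    def annual_inhalation_dose(release_bq_per_s, dilution_s_per_m3,
                               breathing_m3_per_h=1.2, hours=8760,
                               dose_coeff_sv_per_bq=1e-9):
        """Screening dose (Sv/y) from a continuous atmospheric release."""
        conc = release_bq_per_s * dilution_s_per_m3   # Bq/m3 at the receptor
        intake = conc * breathing_m3_per_h * hours    # Bq inhaled per year
        return intake * dose_coeff_sv_per_bq

    print(f"{annual_inhalation_dose(1e6, 1e-6):.2e} Sv/y")
    ```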

  9. The Implication of Using NVivo Software in Qualitative Data Analysis: Evidence-Based Reflections.

    Science.gov (United States)

    Zamawe, F C

    2015-03-01

    For a long time, electronic data analysis has been associated with quantitative methods. However, Computer Assisted Qualitative Data Analysis Software (CAQDAS) is increasingly being developed. Although CAQDAS has been around for decades, very few qualitative health researchers report using it. This may be due to the difficulties that one has to go through to master the software and the misconceptions associated with using CAQDAS. While the issue of mastering CAQDAS has received ample attention, little has been done to address the misconceptions associated with it. In this paper, the author reflects on his experience of interacting with one of the popular CAQDAS packages (NVivo) in order to provide evidence-based implications of using the software. The key message is that, unlike statistical software, the main function of CAQDAS is not to analyse data but rather to aid the analysis process, which the researcher must always remain in control of. In other words, researchers must know that no software can analyse qualitative data. CAQDAS packages are basically data management tools which support the researcher during analysis.

  10. Design of microcomputer-based data acquisition system for the time-of-flight ion scattering spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Lo, H; Su, C [National Tsing Hua Univ., Hsinchu (Taiwan). Inst. of Nuclear Engineering

    1981-07-15

    A microcomputer-based data acquisition system used on a time-of-flight ion scattering spectrometer is described. The flight time of 90°-scattered ions from the target atom is determined directly with a 30 MHz crystal-controlled oscillator and its associated circuit. The ion intensity is detected by a channel multiplier, and its output signal pulse is converted from analog to digital form by an ADC. Both flight time and ion intensity are stored in the microcomputer.

  11. Design of microcomputer-based data acquisition system for the time-of-flight ion scattering spectrometer

    International Nuclear Information System (INIS)

    Lo, H.; Su, C.

    1981-01-01

    A microcomputer-based data acquisition system used on a time-of-flight ion scattering spectrometer is described. The flight time of 90°-scattered ions from the target atom is determined directly with a 30 MHz crystal-controlled oscillator and its associated circuit. The ion intensity is detected by a channel multiplier, and its output signal pulse is converted from analog to digital form by an ADC. Both flight time and ion intensity are stored in the microcomputer. (orig.)
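
    A sketch of the underlying time-of-flight arithmetic: with a 30 MHz oscillator each count corresponds to about 33.3 ns, and the ion energy follows from E = ½m(L/t)². The flight path and ion mass below are assumed values, not the instrument's:

    ```python
    # A minimal sketch of converting oscillator counts to ion kinetic energy.
    CLOCK_HZ = 30e6                  # 30 MHz crystal-controlled oscillator
    AMU_KG = 1.66053906660e-27       # atomic mass unit in kg
    EV_J = 1.602176634e-19           # electron volt in joules

    def ion_energy_ev(counts, flight_path_m=0.5, mass_amu=4.0):
        """Kinetic energy (eV) of an ion from its oscillator-count flight time."""
        t = counts / CLOCK_HZ                        # flight time in seconds
        v = flight_path_m / t                        # mean velocity
        return 0.5 * mass_amu * AMU_KG * v ** 2 / EV_J

    print(f"{ion_energy_ev(100):.1f} eV")  # 100 clock counts = 3.33 microseconds
    ```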

  12. Software architecture 2

    CERN Document Server

    Oussalah, Mourad Chabane

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural templates

  13. Software architecture 1

    CERN Document Server

    Oussalah, Mourad Chabane

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural templates

  14. Measurements of Nascent Soot Using a Cavity Attenuated Phase Shift (CAPS)-based Single Scattering Albedo Monitor

    Science.gov (United States)

    Freedman, A.; Onasch, T. B.; Renbaum-Wolff, L.; Lambe, A. T.; Davidovits, P.; Kebabian, P. L.

    2015-12-01

    Accurate, as compared to precise, measurement of aerosol absorption has always posed a significant problem for the particle radiative properties community. Filter-based instruments do not actually measure absorption but rather light transmission through the filter; absorption must be derived from this data using multiple corrections. The potential for matrix-induced effects is also great for organic-laden aerosols. The introduction of true in situ measurement instruments using photoacoustic or photothermal interferometric techniques represents a significant advance in the state-of-the-art. However, measurement artifacts caused by changes in humidity still represent a significant hurdle as does the lack of a good calibration standard at most measurement wavelengths. And, in the absence of any particle-based absorption standard, there is no way to demonstrate any real level of accuracy. We, along with others, have proposed that under the circumstance of low single scattering albedo (SSA), absorption is best determined by difference using measurement of total extinction and scattering. We discuss a robust, compact, field deployable instrument (the CAPS PMssa) that simultaneously measures airborne particle light extinction and scattering coefficients and thus the single scattering albedo (SSA) on the same sample volume. The extinction measurement is based on cavity attenuated phase shift (CAPS) techniques as employed in the CAPS PMex particle extinction monitor; scattering is measured using integrating nephelometry by incorporating a Lambertian integrating sphere within the sample cell. The scattering measurement is calibrated using the extinction measurement of non-absorbing particles. For small particles and low SSA, absorption can be measured with an accuracy of 6-8% at absorption levels as low as a few Mm-1. We present new results of the measurement of the mass absorption coefficient (MAC) of soot generated by an inverted methane diffusion flame at 630 nm. A value
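
    The extinction-minus-scattering arithmetic at the heart of this difference method is simple enough to sketch; the coefficients below are illustrative, not measurements from the instrument:

    ```python
    # A minimal sketch of absorption-by-difference and single scattering albedo.
    def absorption_and_ssa(extinction_mm1, scattering_mm1):
        """Return (absorption, single scattering albedo) from an ext/scat pair."""
        absorption = extinction_mm1 - scattering_mm1
        ssa = scattering_mm1 / extinction_mm1
        return absorption, ssa

    b_abs, ssa = absorption_and_ssa(120.0, 40.0)   # soot-like, low-SSA aerosol (Mm^-1)
    print(f"absorption = {b_abs:.1f} Mm^-1, SSA = {ssa:.2f}")
    ```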

  15. High frequency and pulse scattering physical acoustics

    CERN Document Server

    Pierce, Allan D

    1992-01-01

    High Frequency and Pulse Scattering investigates high frequency and pulse scattering, with emphasis on the phenomenon of echoes from objects. Geometrical and catastrophe optics methods in scattering are discussed, along with the scattering of sound pulses and the ringing of target resonances. Caustics and associated diffraction catastrophes are also examined.Comprised of two chapters, this volume begins with a detailed account of geometrically based approximation methods in scattering theory, focusing on waves transmitted through fluid and elastic scatterers and glory scattering; surface ray r

  16. 2016 American Conference on Neutron Scattering (ACNS)

    International Nuclear Information System (INIS)

    Woodward, Patrick

    2017-01-01

    The 8th American Conference on Neutron Scattering (ACNS) was held July 10-14, 2016 in Long Beach California, marking the first time the meeting has been held on the west coast. The meeting was coordinated by the Neutron Scattering Society of America (NSSA), and attracted 285 attendees. The meeting was chaired by NSSA vice president Patrick Woodward (the Ohio State University) assisted by NSSA president Stephan Rosenkranz (Argonne National Laboratory) together with the local organizing chair, Brent Fultz (California Institute of Technology). As in past years the Materials Research Society assisted with planning, logistics and operation of the conference. The science program was divided into the following research areas: (a) Sources, Instrumentation, and Software; (b) Hard Condensed Matter; (c) Soft Matter; (d) Biology; (e) Materials Chemistry and Materials for Energy; (f) Engineering and Industrial Applications; and (g) Neutron Physics.

  17. 2016 American Conference on Neutron Scattering (ACNS)

    Energy Technology Data Exchange (ETDEWEB)

    Woodward, Patrick [Materials Research Society, Warrendale, PA (United States)

    2017-02-09

    The 8th American Conference on Neutron Scattering (ACNS) was held July 10-14, 2016 in Long Beach California, marking the first time the meeting has been held on the west coast. The meeting was coordinated by the Neutron Scattering Society of America (NSSA), and attracted 285 attendees. The meeting was chaired by NSSA vice president Patrick Woodward (the Ohio State University) assisted by NSSA president Stephan Rosenkranz (Argonne National Laboratory) together with the local organizing chair, Brent Fultz (California Institute of Technology). As in past years the Materials Research Society assisted with planning, logistics and operation of the conference. The science program was divided into the following research areas: (a) Sources, Instrumentation, and Software; (b) Hard Condensed Matter; (c) Soft Matter; (d) Biology; (e) Materials Chemistry and Materials for Energy; (f) Engineering and Industrial Applications; and (g) Neutron Physics.

  18. Structure of unilamellar vesicles: Numerical analysis based on small-angle neutron scattering data

    International Nuclear Information System (INIS)

    Zemlyanaya, E. V.; Kiselev, M. A.; Zbytovska, J.; Almasy, L.; Aswal, V. K.; Strunz, P.; Wartewig, S.; Neubert, R.

    2006-01-01

    The structure of polydispersed populations of unilamellar vesicles is studied by small-angle neutron scattering for three types of lipid systems, namely, single-, two- and four-component vesicular systems. Results of the numerical analysis based on the separated-form-factor model are reported

  19. The APOSTEL recommendations for reporting quantitative optical coherence tomography studies

    DEFF Research Database (Denmark)

    Cruz-Herranz, Andrés; Balk, Lisanne J; Oberwahrenbrock, Timm

    2016-01-01

    OBJECTIVE: To develop consensus recommendations for reporting of quantitative optical coherence tomography (OCT) study results. METHODS: A panel of experienced OCT researchers (including 11 neurologists, 2 ophthalmologists, and 2 neuroscientists) discussed requirements for performing and reporting...... quantitative analyses of retinal morphology and developed a list of initial recommendations based on experience and previous studies. The list of recommendations was subsequently revised during several meetings of the coordinating group. RESULTS: We provide a 9-point checklist encompassing aspects deemed...... relevant when reporting quantitative OCT studies. The areas covered are study protocol, acquisition device, acquisition settings, scanning protocol, funduscopic imaging, postacquisition data selection, postacquisition data analysis, recommended nomenclature, and statistical analysis. CONCLUSIONS...

  20. Web based parallel/distributed medical data mining using software agents

    Energy Technology Data Exchange (ETDEWEB)

    Kargupta, H.; Stafford, B.; Hamzaoglu, I.

    1997-12-31

    This paper describes an experimental parallel/distributed data mining system PADMA (PArallel Data Mining Agents) that uses software agents for local data accessing and analysis and a web based interface for interactive data visualization. It also presents the results of applying PADMA for detecting patterns in unstructured texts of postmortem reports and laboratory test data for Hepatitis C patients.

  1. An ion beam analysis software based on ImageJ

    International Nuclear Information System (INIS)

    Udalagama, C.; Chen, X.; Bettiol, A.A.; Watt, F.

    2013-01-01

    The suite of techniques (RBS, STIM, ERDS, PIXE, IL, IF, …) available in ion beam analysis yields a variety of rich information. Typically, after the initial challenge of acquiring data, we are then faced with the task of having to extract relevant information or to present the data in a format with the greatest impact. This process sometimes requires developing new software tools. When faced with such situations, the usual practice at the Centre for Ion Beam Applications (CIBA) in Singapore has been to use our computational expertise to develop ad hoc software tools as and when we need them. It then became apparent that the whole ion beam community can benefit from such tools; specifically, from a common software toolset that can be developed and maintained by everyone, with freedom to use and allowance to modify. In addition to the benefits of ready-made tools and sharing the onus of development, this also opens up the possibility for collaborators to access and analyse ion beam data without having to depend on an ion beam lab. This has the virtue of making ion beam techniques more accessible to a broader scientific community. We have identified ImageJ as an appropriate software base on which to develop such a common toolset. In addition to being in the public domain and set up for collaborative tool development, ImageJ is accompanied by hundreds of modules (plugins) that allow great breadth in analysis. The present work is the first step towards integrating ion beam analysis into ImageJ. Some of the features of the current version of the ImageJ 'ion beam' plugin are: (1) reading list-mode or event-by-event files, (2) energy gates/sorts, (3) sort stacks, (4) colour function, (5) real-time map updating, (6) real-time colour updating and (7) median and average map creation.

  2. An ion beam analysis software based on ImageJ

    Energy Technology Data Exchange (ETDEWEB)

    Udalagama, C., E-mail: chammika@nus.edu.sg [Centre for Ion Beam Applications (CIBA), Department of Physics, National University of Singapore, 2 Science Drive 3, Singapore 117 542 (Singapore); Chen, X.; Bettiol, A.A.; Watt, F. [Centre for Ion Beam Applications (CIBA), Department of Physics, National University of Singapore, 2 Science Drive 3, Singapore 117 542 (Singapore)

    2013-07-01

    The suite of techniques (RBS, STIM, ERDS, PIXE, IL, IF, …) available in ion beam analysis yields a variety of rich information. Typically, after the initial challenge of acquiring data, we are then faced with the task of having to extract relevant information or to present the data in a format with the greatest impact. This process sometimes requires developing new software tools. When faced with such situations, the usual practice at the Centre for Ion Beam Applications (CIBA) in Singapore has been to use our computational expertise to develop ad hoc software tools as and when we need them. It then became apparent that the whole ion beam community can benefit from such tools; specifically, from a common software toolset that can be developed and maintained by everyone, with freedom to use and allowance to modify. In addition to the benefits of ready-made tools and sharing the onus of development, this also opens up the possibility for collaborators to access and analyse ion beam data without having to depend on an ion beam lab. This has the virtue of making ion beam techniques more accessible to a broader scientific community. We have identified ImageJ as an appropriate software base on which to develop such a common toolset. In addition to being in the public domain and set up for collaborative tool development, ImageJ is accompanied by hundreds of modules (plugins) that allow great breadth in analysis. The present work is the first step towards integrating ion beam analysis into ImageJ. Some of the features of the current version of the ImageJ 'ion beam' plugin are: (1) reading list-mode or event-by-event files, (2) energy gates/sorts, (3) sort stacks, (4) colour function, (5) real-time map updating, (6) real-time colour updating and (7) median and average map creation.
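
    Feature (2) above, energy gating of event-by-event (list-mode) data, is easy to sketch outside ImageJ; the Python below is an illustration with a synthetic event list, not the plugin's own code:

    ```python
    # A minimal sketch of energy-gated map building from list-mode events.
    import numpy as np

    rng = np.random.default_rng(2)
    # list-mode events: (x, y, energy) triples, here synthetic
    events = np.column_stack([rng.integers(0, 128, 10_000),
                              rng.integers(0, 128, 10_000),
                              rng.normal(1500.0, 200.0, 10_000)])

    def gated_map(events, e_lo, e_hi, shape=(128, 128)):
        """Accumulate only events whose energy falls inside the gate."""
        x, y, e = events[:, 0].astype(int), events[:, 1].astype(int), events[:, 2]
        keep = (e >= e_lo) & (e <= e_hi)
        counts = np.zeros(shape)
        np.add.at(counts, (y[keep], x[keep]), 1)   # histogram gated hits per pixel
        return counts

    print(gated_map(events, 1400, 1600).sum(), "events inside the gate")
    ```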

  3. Impact of Base Functional Component Types on Software Functional Size based Effort Estimation

    OpenAIRE

    Gencel, Cigdem; Buglione, Luigi

    2008-01-01

    Software effort estimation is still a significant challenge for software management. Although Functional Size Measurement (FSM) methods have been standardized and have become widely used by the software organizations, the relationship between functional size and development effort still needs further investigation. Most of the studies focus on the project cost drivers and consider total software functional size as the primary input to estimation models. In this study, we investigate whether u...

  4. Model-Based GN and C Simulation and Flight Software Development for Orion Missions beyond LEO

    Science.gov (United States)

    Odegard, Ryan; Milenkovic, Zoran; Henry, Joel; Buttacoli, Michael

    2014-01-01

    For Orion missions beyond low Earth orbit (LEO), the Guidance, Navigation, and Control (GN&C) system is being developed using a model-based approach for simulation and flight software. Lessons learned from the development of GN&C algorithms and flight software for the Orion Exploration Flight Test One (EFT-1) vehicle have been applied to the development of further capabilities for Orion GN&C beyond EFT-1. Continuing the use of a Model-Based Development (MBD) approach with the Matlab®/Simulink® tool suite, the process for GN&C development and analysis has been largely improved. Furthermore, a model-based simulation environment in Simulink, rather than an external C-based simulation, greatly eases the process for development of flight algorithms. The benefits seen by employing lessons learned from EFT-1 are described, as well as the approach for implementing additional MBD techniques. Also detailed are the key enablers for improvements to the MBD process, including enhanced configuration management techniques for model-based software systems, automated code and artifact generation, and automated testing and integration.

  5. Dependencies among Architectural Views Got from Software Requirements Based on a Formal Model

    Directory of Open Access Journals (Sweden)

    Osis Janis

    2014-12-01

    Full Text Available A system architect has software requirements and some unspecified knowledge about a problem domain (e.g., an enterprise) as source information for the assessment and evaluation of possible solutions and for reaching the target point, a preliminary software design. The deciding factor is the architect's experience and expertise in the problem domain (“AS-IS”). The proposed approach is intended to assist a system architect in making an appropriate decision on the solution (“TO-BE”). It is based on a formal mathematical model, the Topological Functioning Model (TFM). Compliant TFMs can be transformed into software architectural views. The paper demonstrates and discusses the tracing of dependency links from the requirements to and between the architectural views.

  6. Design and Applications of Rapid Image Tile Producing Software Based on Mosaic Dataset

    Science.gov (United States)

    Zha, Z.; Huang, W.; Wang, C.; Tang, D.; Zhu, L.

    2018-04-01

    Map tile technology is widely used in web geographic information services. How to produce map tiles efficiently is a key technology for the rapid serving of images on the web. In this paper, rapid tile-production software for image data based on a mosaic dataset is designed, and the tile-production workflow is given. Key technologies such as cluster processing, map representation, tile checking, tile conversion and in-memory compression are discussed. Implemented in software and tested with actual image data, the results show that this software has a high degree of automation, effectively reduces the number of I/O operations and improves tile-production efficiency. Moreover, manual operations are reduced significantly.

  7. DESIGN AND APPLICATIONS OF RAPID IMAGE TILE PRODUCING SOFTWARE BASED ON MOSAIC DATASET

    Directory of Open Access Journals (Sweden)

    Z. Zha

    2018-04-01

    Full Text Available Map tile technology is widely used in web geographic information services. How to produce map tiles efficiently is a key technology for the rapid serving of images on the web. In this paper, rapid tile-production software for image data based on a mosaic dataset is designed, and the tile-production workflow is given. Key technologies such as cluster processing, map representation, tile checking, tile conversion and in-memory compression are discussed. Implemented in software and tested with actual image data, the results show that this software has a high degree of automation, effectively reduces the number of I/O operations and improves tile-production efficiency. Moreover, manual operations are reduced significantly.

  8. 7. annual software survey 2009

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2009-07-15

    This article presented a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In addition to a description of the software application, this article listed the name of software providers and the new features available in each product. The featured software developed by Calgary-based providers included: OpenInvoice software developed by DO2 Technologies Inc; oil and gas solutions by Energy Navigator; WellSpring planning system by Enersight; Entero MOSAIC and Entero ONE software packages by Entero Corporation; Emission Manager by Envirosoft Corporation; ResSurveil, ResBalance and ResAssist by Epic Consulting Services Ltd.; OMNI 3D, VISTA 2D/3D seismic software by Gedco; geoSCOUT, petroCUBE and gDC by GeoLOGIC Systems Ltd.; IHS AccuMap and PETRA by IHS; WELLFLO, PIPEFLO and FORGAS wellbore solutions by Neotec; AFENexus, FANexus, GeoNexus, JVNexus, PANexus software by Pandell Technology Corporation; Oil and gas solutions by the Risk Advisory division of SAS; Petrel, ECLIPSE, Avocet, Osprey and Merak by Schlumberger Information Solutions; esi.manage and esi.executive by 3esi; and STABView, ROCKSBank by Weatherford Advanced Geotechnology. The featured software developed by Texas-based providers included the HTRI Xchanger Suite by Heat Transfer Research Inc.; the RFID-based asset tracking system by Merrick Systems; oil and gas solutions by Neuralog Inc.; geoscience data programs by OpenSpirit; and oil and gas solutions by Seismic Micro-Technology Inc. The featured software developed by Vancouver-based providers included the oil and gas solutions by Sustainet Software Solutions Inc.

  9. A systematic approach to robust preconditioning for gradient-based inverse scattering algorithms

    International Nuclear Information System (INIS)

    Nordebo, Sven; Fhager, Andreas; Persson, Mikael; Gustafsson, Mats

    2008-01-01

    This paper presents a systematic approach to robust preconditioning for gradient-based nonlinear inverse scattering algorithms. In particular, one- and two-dimensional inverse problems are considered where the permittivity and conductivity profiles are unknown and the input data consist of the scattered field over a certain bandwidth. A time-domain least-squares formulation is employed and the inversion algorithm is based on a conjugate gradient or quasi-Newton algorithm together with an FDTD electromagnetic solver. A Fisher information analysis is used to estimate the Hessian of the error functional. A robust preconditioner is then obtained by incorporating a parameter scaling such that the scaled Fisher information has a unit diagonal. By improving the conditioning of the Hessian, the convergence rate of the conjugate gradient or quasi-Newton method is improved. The preconditioner is robust in the sense that the scaling, i.e. the diagonal Fisher information, is virtually invariant to the numerical resolution and the discretization model that is employed. Numerical examples of image reconstruction are included to illustrate the efficiency of the proposed technique
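
    A sketch of the diagonal-Fisher parameter scaling described above, on stand-in matrices: the unknowns are scaled so that the scaled Fisher information has a unit diagonal, and the same scaling preconditions a gradient step. Names and sizes are illustrative assumptions:

    ```python
    # A minimal sketch of unit-diagonal Fisher scaling as a preconditioner.
    import numpy as np

    rng = np.random.default_rng(3)
    J = rng.normal(size=(50, 8))          # stand-in sensitivity (Jacobian) matrix
    F = J.T @ J                           # Fisher information, a Hessian estimate
    grad = rng.normal(size=8)             # gradient of the least-squares functional

    s = 1.0 / np.sqrt(np.diag(F))         # scaling so that diag(S F S) = 1
    F_scaled = (s[:, None] * F) * s[None, :]
    print(np.diag(F_scaled).round(12))    # all ones: unit-diagonal scaled Fisher

    step = s * (s * grad)                 # preconditioned gradient step direction
    print(step.round(3))
    ```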

  10. Object-Oriented Technology-Based Software Library for Operations of Water Reclamation Centers

    Science.gov (United States)

    Otani, Tetsuo; Shimada, Takehiro; Yoshida, Norio; Abe, Wataru

    SCADA systems in water reclamation centers have been constructed based on hardware and software that each manufacturer produced according to their own design. Even though this approach used to be effective in realizing real-time, reliable execution, it is an obstacle to reducing the cost of system construction and maintenance. A promising solution to this problem is to define specifications that can be used in common. In terms of software, the information model approach has been adopted in SCADA systems in other fields, such as telecommunications and power systems. An information model is a piece of software specification that describes a physical or logical object to be monitored. In this paper, we propose information models for the operations of water reclamation centers, which have not existed before. In addition, we show the feasibility of the information models in terms of common use and processing performance.

  11. Development of a Brillouin scattering based distributed fibre optic strain sensor

    Science.gov (United States)

    Brown, Anthony Wayne

    2001-07-01

    The parameters of the Brillouin spectrum of an optical fibre depend upon the strain and temperature conditions of the fibre. As a result, fibre optic distributed sensors based on Brillouin scattering can measure strain and temperature in arbitrary regions of a sensing fibre. In the past, such sensors have often been demonstrated under laboratory conditions, proving the principle of operation. Although some field tests of temperature sensing have been reported, the actual deployment of such sensors in the field for strain measurements has been limited by poor spatial resolution (typically 1 m or more) and poor strain accuracy (±100 με). Also, cross-sensitivity of the Brillouin spectrum to temperature further reduces the accuracy of strain measurement, while long acquisition times hinder field use. The high level of user knowledge and lack of automation required to operate the equipment is another limiting factor of the only commercially available unit. The potential benefits of distributed measurements are great for the instrumentation of civil structures, provided that the above limitations are overcome. However, before this system is used with confidence by practitioners, it is essential that it can be operated effectively in field conditions. In light of this, the fibre optics group at the University of New Brunswick has been developing an automated system for field measurement of strain in civil structures, particularly in reinforced concrete. The development of the sensing system hardware and software was the main focus of this thesis. This has been made possible, in part, by observation of the Brillouin spectrum for the case of very short light pulses. Results demonstrating system performance to measure strain to an accuracy of 10 με, and to allow the simultaneous measurement of strain and temperature to accuracies of 204 με and 3°C, are presented. Finally, the results of field measurement of strain on a concrete structure are presented.
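
    A sketch of the simultaneous strain/temperature recovery that such dual-parameter measurements permit: two observables that depend linearly on strain and temperature give a 2x2 linear system. The coefficients below are illustrative assumptions, not the thesis's calibration values:

    ```python
    # A minimal sketch of solving for strain and temperature from two observables.
    import numpy as np

    # rows: d(observable)/d(strain), d(observable)/d(temperature) -- assumed values
    C = np.array([[0.05e-3, 1.0e-3],    # Brillouin frequency shift sensitivities
                  [-1.0e-4, 3.0e-4]])   # second observable (e.g., power) sensitivities

    delta = np.array([0.025, 0.0005])   # measured changes relative to a reference fibre
    strain_ue, temp_c = np.linalg.solve(C, delta)
    print(f"strain = {strain_ue:.1f} ue, temperature change = {temp_c:.1f} degC")
    ```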

  12. Hospital Management Software Development

    OpenAIRE

    sobogunGod, olawale

    2012-01-01

    The purpose of this thesis was to implement hospital management software suitable for small private hospitals in Nigeria, especially those that use a file-based system for storing information rather than having it stored in a more efficient and safer environment like databases or spreadsheet software. The software developed within this thesis project was specifically designed for the Rainbow specialist hospital, which is based in Lagos, the commercial neurological cente...

  13. Comparison of SOAP and REST Based Web Services Using Software Evaluation Metrics

    Directory of Open Access Journals (Sweden)

    Tihomirovs Juris

    2016-12-01

    Full Text Available The usage of Web services has recently increased. Therefore, it is important to select right type of Web services at the project design stage. The most common implementations are based on SOAP (Simple Object Access Protocol and REST (Representational State Transfer Protocol styles. Maintainability of REST and SOAP Web services has become an important issue as popularity of Web services is increasing. Choice of the right approach is not an easy decision since it is influenced by development requirements and maintenance considerations. In the present research, we present the comparison of SOAP and REST based Web services using software evaluation metrics. To achieve this aim, a systematic literature review will be made to compare REST and SOAP Web services in terms of the software evaluation metrics.

  14. V and V-based remaining fault estimation model for safety-critical software of a nuclear power plant

    International Nuclear Information System (INIS)

    Eom, Heung-seop; Park, Gee-yong; Jang, Seung-cheol; Son, Han Seong; Kang, Hyun Gook

    2013-01-01

    Highlights: ► A software fault estimation model based on Bayesian Nets and V and V. ► Use of quantified data derived from qualitative V and V results. ► The fault insertion and elimination process was modeled in the context of probability. ► Systematically estimates the expected number of remaining faults. -- Abstract: Quantitative software reliability measurement approaches have some limitations in demonstrating the proper level of reliability in the case of safety-critical software. One of the more promising alternatives is the use of software development quality information. Particularly in the nuclear industry, regulatory bodies in most countries use both probabilistic and deterministic measures for ensuring the reliability of safety-grade digital computers in NPPs. The point of deterministic criteria is to assess the whole development process and its related activities during the software development life cycle for the acceptance of safety-critical software. Software verification and validation (V and V) also plays an important role in this process. In this light, we propose a V and V-based fault estimation method using Bayesian Nets to estimate the remaining faults for safety-critical software after the software development life cycle is completed. By modeling the fault insertion and elimination processes across the development phases, the proposed method systematically estimates the expected number of remaining faults.
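
    A toy sketch of the insertion/elimination bookkeeping (a plain expectation calculation, not the paper's Bayesian-net model): faults inserted in each phase survive each subsequent V and V activity with probability one minus its detection rate. All numbers are invented:

    ```python
    # A minimal sketch of an expected-remaining-faults calculation.
    PHASES = ["requirements", "design", "implementation", "testing"]
    INSERTED = {"requirements": 12, "design": 20, "implementation": 35, "testing": 5}
    DETECT_P = {"requirements": 0.6, "design": 0.6, "implementation": 0.7, "testing": 0.8}

    def expected_remaining(phases, inserted, detect_p):
        """Expected faults left after all downstream V&V activities have run."""
        remaining = 0.0
        for i, phase in enumerate(phases):
            survive = 1.0
            for later in phases[i:]:      # V&V of this phase and all later phases
                survive *= (1.0 - detect_p[later])
            remaining += inserted[phase] * survive
        return remaining

    print(f"{expected_remaining(PHASES, INSERTED, DETECT_P):.2f} faults expected to remain")
    ```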

  15. Integration of LCoS-SLM and LabVIEW based software to simulate fundamental optics, wave optics, and Fourier optics

    Science.gov (United States)

    Lyu, Bo-Han; Wang, Chen; Tsai, Chun-Wei

    2017-08-01

    Jasper Display Corp. (JDC) offers a high-reflectivity, high-resolution Liquid Crystal on Silicon - Spatial Light Modulator (LCoS-SLM) which includes an associated controller ASIC and LabVIEW-based modulation software. Based on this LCoS-SLM, also called the Education Kit (EDK), we provide a training platform which includes a series of optical theory and experiments for university students. This EDK not only provides LabVIEW-based operation software to produce Computer Generated Holograms (CGHs) that generate basic diffraction or holographic images, but also provides simulation software to verify the experimental results simultaneously. We believe that a robust LCoS-SLM, operation software, simulation software, training system, and training course can help students to study fundamental optics, wave optics, and Fourier optics more easily. Based on this fundamental knowledge, they can develop their unique skills and create new innovations in optoelectronic applications in the future.
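
    One standard way to compute a phase-only CGH for such an SLM is the Gerchberg-Saxton algorithm; the sketch below is a generic illustration, not JDC's software, and the target pattern and iteration count are assumptions:

    ```python
    # A minimal sketch of Gerchberg-Saxton phase retrieval for a phase-only CGH.
    import numpy as np

    def gerchberg_saxton(target_amplitude, iterations=50):
        """Return a phase-only hologram whose far field approximates the target."""
        field = np.exp(1j * 2 * np.pi * np.random.rand(*target_amplitude.shape))
        for _ in range(iterations):
            far = np.fft.fft2(field)
            far = target_amplitude * np.exp(1j * np.angle(far))  # impose target amplitude
            field = np.fft.ifft2(far)
            field = np.exp(1j * np.angle(field))                 # phase-only constraint
        return np.angle(field)

    target = np.zeros((128, 128)); target[60:68, 60:68] = 1.0    # a square spot
    hologram_phase = gerchberg_saxton(target)
    print(hologram_phase.shape)
    ```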

  16. Ragnarok: An Architecture Based Software Development Environment

    OpenAIRE

    Christensen, Henrik Bærbak

    1999-01-01

    The Ragnarok project is an experimental computer science project within the field of software development environments. Taking current problems in software engineering as starting point, a small set of hypotheses are proposed, outlining plausible solutions for problems concerning the management of the development process and its associated data, and outlining how these solutions can be supported directly in a development environment. These hypotheses are all deeply rooted in the viewpoint tha...

  17. Arbitrary scattering of an acoustical Bessel beam by a rigid spheroid with large aspect-ratio

    Science.gov (United States)

    Gong, Zhixiong; Li, Wei; Mitri, Farid G.; Chai, Yingbin; Zhao, Yao

    2016-11-01

    In this paper, the T-matrix (null-field) method is applied to investigate the acoustic scattering by a large-aspect-ratio rigid spheroid immersed in a non-viscous fluid under the illumination of an unbounded zeroth-order Bessel beam with arbitrary orientation. Based on the proposed method, a MATLAB software package is constructed accordingly, and then verified and validated to compute the acoustic scattering by a rigid oblate or prolate spheroid in the Bessel beam. Several numerical examples are carried out to investigate the novel phenomenon of acoustic scattering by spheroids in Bessel beams with arbitrary incidence, with particular emphasis on the aspect ratio (i.e. the ratio of the polar radius over the equatorial radius of the spheroid), the half-cone angle of Bessel beam, the dimensionless frequency, as well as the angle of incidence. The quasi-periodic oscillations are observed in the plots of the far-field backscattering form function modulus versus the dimensionless frequency, owing to the interference between the specular reflection and the Franz wave circumnavigating the spheroid in the surrounding fluid. Furthermore, the 3D far-field scattering directivity patterns at end-on incidence and 2D polar plots at arbitrary angles of incidence are exhibited, which could provide new insights into the physical mechanisms of Bessel beam scattering by flat or elongated spheroid. This research work may provide an impetus for the application of acoustic Bessel beam in engineering practices.

  18. Objective measurement of intraocular forward light scatter using Hartmann-Shack spot patterns from clinical aberrometers. Model-eye and human-eye study.

    Science.gov (United States)

    Cerviño, Alejandro; Bansal, Dheeraj; Hosking, Sarah L; Montés-Micó, Robert

    2008-07-01

    To apply software-based image-analysis tools to objectively determine intraocular scatter from clinically derived Hartmann-Shack patterns. Aston Academy of Life Sciences, Aston University, Birmingham, United Kingdom, and Department of Optics, University of Valencia, Valencia, Spain. Purpose-designed image-analysis software was used to quantify scatter from centroid patterns obtained using a clinical Hartmann-Shack analyzer (WASCA, Zeiss/Meditec). Three scatter values, defined as the maximum standard deviation within a lenslet across all lenslets in the pattern, were obtained in 6 model eyes and 10 human eyes. In the model-eye sample, patterns were obtained in 4 sessions: 2 without realigning between measurements, 1 with realignment, and 1 with an angular shift of 6 degrees from the instrument axis. Three measurements were made in the human eyes with the C-Quant straylight meter (Oculus) to obtain psychometric and objective measures of retinal straylight. Analysis of variance, intraclass correlation coefficients, coefficients of repeatability (CoR), and correlations were used to determine intrasession and intersession repeatability and the relationship between measures. No significant differences were found between the sessions in the model eye (P=.234). The mean CoR was less than 10% in all model- and human-eye sessions. After incomplete patterns were removed, good correlation was achieved between psychometric and objective scatter measurements despite the small sample size (n=6; r=-0.831; P=.040). The methodology was repeatable in model and human eyes, robust against realignment and misalignment, and sensitive. Clinical application would benefit from effective use of the sensor's dynamic range.
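
    The scatter metric described (maximum per-lenslet standard deviation) can be sketched directly; the lenslet size and the stand-in image are assumptions, not WASCA specifics:

    ```python
    # A minimal sketch of the maximum per-lenslet standard deviation metric.
    import numpy as np

    def max_lenslet_sd(image, lenslet_px=16):
        """Maximum per-lenslet standard deviation over a Hartmann-Shack image."""
        h, w = image.shape
        sds = [image[r:r + lenslet_px, c:c + lenslet_px].std()
               for r in range(0, h, lenslet_px)
               for c in range(0, w, lenslet_px)]
        return max(sds)

    rng = np.random.default_rng(4)
    spots = rng.random((256, 256))        # stand-in for a centroid/spot image
    print(f"scatter value = {max_lenslet_sd(spots):.4f}")
    ```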

  19. Bega - Android-Based Beergame Simulation Software for Interactive Training and Innovation

    Science.gov (United States)

    Lestyánszka Škůrková, Katarína; Szander, Norina

    2013-12-01

    The supply chain management challenges and inventory-holding problems can easily be demonstrated by the widely known BeerGame simulation. In the Szabó-Szoba R&D Laboratory, we developed an Android-based software application for tablets and smartphones with the purpose of having an adaptable, entertaining and effective program which can give participants real-life experience of the nature of the bullwhip effect. Having an appropriate and comprehensive performance measurement system with the critical parameters and KPIs is essential for finding the right solutions; we used the four perspectives of the Balanced Scorecard method. The innovative force of our research lies in the trainings: the discussion of outcomes and team learning. The purpose of the current development is to build a new feature into the software: an artificial client that can substitute for one or more players in the supply chain, making decisions by using genetic algorithms.

  20. A communication-channel-based representation system for software

    NARCIS (Netherlands)

    Demirezen, Zekai; Tanik, Murat M.; Aksit, Mehmet; Skjellum, Anthony

    We observed that, before software development is initiated, the objectives are minimally organized, and that developers introduce comparatively greater organization throughout the design process. To formally capture this observation, a new communication-channel-based representation system for software is proposed.

  1. FluxPyt: a Python-based free and open-source software for 13C-metabolic flux analyses.

    Science.gov (United States)

    Desai, Trunil S; Srivastava, Shireesh

    2018-01-01

    13C-Metabolic flux analysis (MFA) is a powerful approach to estimate intracellular reaction rates, which can be used in strain analysis and design. Processing and analysis of labeling data for the calculation of fluxes and associated statistics is an essential part of MFA. However, various software packages currently available for data analysis employ proprietary platforms and thus limit accessibility. We developed FluxPyt, a Python-based, truly open-source software package for conducting stationary 13C-MFA data analysis. The software is based on the efficient elementary metabolite unit framework. The standard deviations in the calculated fluxes are estimated using Monte-Carlo analysis. FluxPyt also automatically creates flux maps based on a template for visualization of the MFA results. The flux distributions calculated by FluxPyt for two separate models - a small tricarboxylic acid cycle model and a larger Corynebacterium glutamicum model - were found to be in good agreement with those calculated by a previously published software. FluxPyt was tested in Microsoft™ Windows 7 and 10, as well as in Linux Mint 18.2. The availability of a free and open 13C-MFA software package that works in various operating systems will enable more researchers to perform 13C-MFA and to further modify and develop the package.
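
    The Monte-Carlo error estimate mentioned above follows a generic recipe: perturb the labeling measurements according to their noise model, re-fit the fluxes, and report the spread. A minimal sketch of that recipe, assuming a user-supplied fitting routine; this is not FluxPyt's actual API.

    ```python
    import numpy as np

    def monte_carlo_flux_sd(fit_fluxes, measurements, meas_sd, n_trials=500, seed=0):
        """Standard deviation of each fitted flux under measurement noise.
        fit_fluxes(measurements) -> flux vector is an assumed fitting routine."""
        rng = np.random.default_rng(seed)
        samples = [fit_fluxes(rng.normal(measurements, meas_sd))
                   for _ in range(n_trials)]
        return np.std(samples, axis=0)
    ```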

  2. High-Frequency Guided Wave Scattering by a Partly Through-Thickness Hole Based on 3D Theory

    International Nuclear Information System (INIS)

    Zhang Hai-Yan; Xu Jian; Ma Shi-Wei

    2015-01-01

    We present a theoretical investigation of the scattering of the high-frequency S0 Lamb mode from a circular blind hole defect in a plate, based on 3D theory. The S0 wave is incident at a frequency above the A1 mode cut-off frequency, where the popular approximate plate theories are inapplicable. Due to the non-symmetric blind hole defect, the scattered fields contain higher-order converted modes in addition to the fundamental S0 and A0 modes. The far-field scattering amplitudes of various propagating Lamb modes for different hole sizes are inspected. The results are compared with those at lower frequencies, and some different phenomena are found. Two-dimensional Fourier transform (2DFT) results of transient scattered Lamb and SH wave signals agree well with the analytical dispersion curves, which confirms the validity of the solutions from another point of view. (paper)
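
    The 2DFT check mentioned above maps transient signals u(x, t), recorded along a line of receiver positions, into wavenumber-frequency space, where the ridges of the magnitude spectrum should fall on the dispersion curves. A minimal sketch, with the signal array and sampling steps assumed to come from the simulation:

    ```python
    import numpy as np

    def dispersion_map(u_xt, dx, dt):
        """2D FFT of u(x, t): returns |U(k, f)| with its wavenumber (rad/m)
        and frequency (Hz) axes; spectral ridges trace the dispersion curves."""
        U = np.fft.fftshift(np.fft.fft2(u_xt))
        k = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(u_xt.shape[0], d=dx))
        f = np.fft.fftshift(np.fft.fftfreq(u_xt.shape[1], d=dt))
        return np.abs(U), k, f
    ```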

  3. Deep inelastic neutron scattering

    International Nuclear Information System (INIS)

    Mayers, J.

    1989-03-01

    The report is based on an invited talk given at the conference 'Neutron Scattering at ISIS: Recent Highlights in Condensed Matter Research', held in Rome in 1988, and is intended as an introduction to the techniques of deep inelastic neutron scattering. The subject is discussed under the following topic headings: the impulse approximation (IA), scaling behaviour, kinematical consequences of energy and momentum conservation, examples of measurements, derivation of the IA, the IA in a harmonic system, and the validity of the IA in neutron scattering. (U.K.)
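
    For readers new to the technique, the impulse approximation referred to above has the standard textbook form (not quoted from the report): at large momentum transfer q, scattering is effectively from single atoms, and the dynamic structure factor collapses onto the atomic momentum distribution n(p),

    ```latex
    \[
      S_{\mathrm{IA}}(q,\omega) \;=\; \int n(\mathbf{p})\,
        \delta\!\left(\omega - \frac{\hbar q^{2}}{2M}
                      - \frac{\mathbf{q}\cdot\mathbf{p}}{M}\right) \mathrm{d}^{3}p ,
    \]
    % which exhibits y-scaling: S depends on q and omega only through
    % y = (M / \hbar q) (\omega - \hbar q^2 / 2M).
    ```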

  4. An integrated environment of software development and V and V for PLC based safety-critical systems

    International Nuclear Information System (INIS)

    Koo, Seo Ryong

    2005-02-01

    To develop and implement a safety-critical system, the requirements of the system must be analyzed thoroughly during the phases of the software development life cycle, because a single error in the requirements can generate serious software faults. We therefore propose an Integrated Environment (IE) approach for the requirements phase, which enables easy inspection by combining requirement traceability with the effective use of a formal method. For the V and V tasks of the requirements phase, our approach uses software inspection, requirement traceability, and formal specification with structural decomposition. Software inspection and the analysis of requirements traceability are the most effective methods of software V and V. Although formal methods are also considered an effective V and V activity, they are difficult to use properly in the nuclear field, as in other fields, because of their mathematical nature. We also propose another Integrated Environment (IE) for the design and implementation of safety-critical systems. In this study, a nuclear FED-style design specification and analysis (NuFDS) approach was proposed for PLC-based safety-critical systems. The NuFDS approach is suggested in a straightforward manner for the effective and formal specification and analysis of software designs. Accordingly, the proposed NuFDS approach comprises one technique for specifying the software design and another for analyzing it. In addition, with the NuFDS approach, we can analyze the safety of software on the basis of fault tree synthesis. To analyze the design phase more effectively, we propose a technique of fault tree synthesis, along with a universal fault tree template for the architecture modules of nuclear software. Various tools are needed to make software V and V more convenient. We therefore developed four kinds of computer-aided software engineering tools that can be used, in accordance with the software life cycle, to support the V and V activities described above.

  5. Software engineering

    CERN Document Server

    Sommerville, Ian

    2016-01-01

    For courses in computer science and software engineering The Fundamental Practice of Software Engineering Software Engineering introduces readers to the overwhelmingly important subject of software programming and development. In the past few years, computer systems have come to dominate not just our technological growth, but the foundations of our world's major industries. This text seeks to lay out the fundamental concepts of this huge and continually growing subject area in a clear and comprehensive manner. The Tenth Edition contains new information that highlights various technological updates of recent years, providing readers with highly relevant and current information. Sommerville's experience in system dependability and systems engineering guides the text through a traditional plan-based approach that incorporates some novel agile methods. The text strives to teach the innovators of tomorrow how to create software that will make our world a better, safer, and more advanced place to live.

  6. Craniux: a LabVIEW-based modular software framework for brain-machine interface research.

    Science.gov (United States)

    Degenhart, Alan D; Kelly, John W; Ashmore, Robin C; Collinger, Jennifer L; Tyler-Kabara, Elizabeth C; Weber, Douglas J; Wang, Wei

    2011-01-01

    This paper presents "Craniux," an open-access, open-source software framework for brain-machine interface (BMI) research. Developed in LabVIEW, a high-level graphical programming environment, Craniux offers both out-of-the-box functionality and a modular BMI software framework that is easily extendable. Specifically, it allows researchers to take advantage of multiple features inherent to the LabVIEW environment for on-the-fly data visualization, parallel processing, multithreading, and data saving. This paper introduces the basic features and system architecture of Craniux and describes the validation of the system under real-time BMI operation using simulated and real electrocorticographic (ECoG) signals. Our results indicate that Craniux is able to operate consistently in real time, enabling a seamless work flow to achieve brain control of cursor movement. The Craniux software framework is made available to the scientific research community to provide a LabVIEW-based BMI software platform for future BMI research and development.

  8. The effect of high-scatter shielding geometries in validating the dose inferred by N-Visage™ - 16130

    International Nuclear Information System (INIS)

    Adams, Jamie C.; Joyce, Malcolm J.; Mellor, Matthew

    2009-01-01

    The aim of this paper is to further validate the physical capability of N-Visage™ under more challenging shielding geometries, where the number of mean free paths is greater than one. N-Visage™ is a recently established technique developed at REACT Engineering Ltd. The software locates radionuclide sources and contours radiation magnitude. It uses a geometric computer model combined with measured spectra, and is able to estimate source locations through shielding materials by using mass attenuation coefficients to calculate the number of unscattered gamma photons arriving at the detector, and build-up factors to estimate the scatter contribution to dose rate. The experiments described in this paper were carried out in a high-scatter environment using cobalt-60 and cesium-137 sources, the two primary sources of radiological contamination found in the nuclear industry. It is hoped that this will further assist in the identification, characterisation and removal of buried radiologically contaminated waste. (authors)
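
    The unscattered-photon-plus-build-up calculation described above is the classic point-kernel recipe. A minimal sketch, assuming a point isotropic source and a build-up factor tabulated elsewhere; this is not N-Visage's implementation.

    ```python
    import math

    def unscattered_flux(S, mu, t, r):
        """Uncollided flux (photons/cm^2/s) from a point source of strength S
        (photons/s) behind a shield of attenuation coefficient mu (1/cm) and
        thickness t (cm), at detector distance r (cm)."""
        return S * math.exp(-mu * t) / (4.0 * math.pi * r**2)

    def total_flux(S, mu, t, r, buildup):
        """Scatter folded in through a build-up factor B(mu*t) >= 1, as in
        point-kernel shielding methods."""
        return buildup * unscattered_flux(S, mu, t, r)
    ```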

  9. Statistical Validation for Clinical Measures: Repeatability and Agreement of Kinect™-Based Software.

    Science.gov (United States)

    Lopez, Natalia; Perez, Elisa; Tello, Emanuel; Rodrigo, Alejandro; Valentinuzzi, Max E

    2018-01-01

    The rehabilitation process is a fundamental stage in the recovery of people's capabilities. However, evaluation of the process is performed by physiatrists and medical doctors largely on the basis of their observations, that is, a subjective appreciation of the patient's evolution. This paper proposes a platform for tracking the movement of an individual's upper limb using Kinect sensor(s), to be applied to patients during the rehabilitation process. The main contribution is the development of quantifying software and the statistical validation of its performance, repeatability, and clinical use in the rehabilitation process. The software determines joint angles and upper limb trajectories for the construction of a specific rehabilitation protocol and quantifies the evolution of treatment. In turn, the information is presented via a graphical interface that allows the recording, storage, and reporting of the patient's data. For clinical purposes, the software output is statistically validated with three different methodologies, comparing the measures with a goniometer in terms of agreement and repeatability. The agreement of joint angles measured with the proposed software and the goniometer is evaluated with Bland-Altman plots; all measurements fell well within the limits of agreement, indicating interchangeability of the two techniques. Additionally, the results of Bland-Altman analysis of repeatability show 95% confidence. Finally, the physiotherapists' qualitative assessment shows encouraging results for clinical use. The main conclusion is that the software is capable of offering a clinical history of the patient and is useful for quantifying the success of rehabilitation. The simplicity, low cost, and visualization possibilities enhance the use of the Kinect-based software for rehabilitation and other applications, and the experts' opinion endorses the choice of our approach for clinical practice. Comparison of the new measurement technique with established methods thus supports its adoption in clinical settings.
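
    The Bland-Altman analysis used above reduces to the bias and 95% limits of agreement of the paired differences. A minimal, generic sketch (not the authors' code):

    ```python
    import numpy as np

    def bland_altman_limits(a, b):
        """Agreement between two methods measuring the same quantity, e.g.
        software vs. goniometer angles: bias and 95% limits of agreement."""
        diff = np.asarray(a, float) - np.asarray(b, float)
        bias = diff.mean()
        half_width = 1.96 * diff.std(ddof=1)
        return bias, bias - half_width, bias + half_width
    ```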

  10. Advanced control system for the scattering chamber of K-130 cyclotron

    International Nuclear Information System (INIS)

    Bhaumik, Tapas Kumar; Saha, Amiya Kumar; Mandal, Bidhan Chandra; Adak, Debabrata; Purkait, Monirul; Bhattacharya, Sailajananda

    2013-01-01

    The old relay-based position control system of the scattering chamber of the K-130 Cyclotron was more than 30 years old and had developed multiple problems over the years; components of the control system such as relays, potentiometers, limit switches and radiation-resistant cables were no longer functioning to specification. It has recently been replaced by a new PLC-based embedded control system, developed using an Advantech ADAM-5510EKW 16-bit CPU together with the required I/O modules. MULTIPROG Basic V.4.6 has been used as the programming software, and Ladder Diagram has been used to realize the control scheme of the PLC. A resistive touch-screen operator panel, WebOP-2104VN4AE, has been mounted on the control rack for local control. iFIX SCADA (75-tag runtime, Version 5.1) is used for operation from a remote PC connected to the PLC by a 70 m CAT6 cable. The PLC communicates with the touch panel and the remote PC through a 100 Mbps Ethernet switch. The graphical user interfaces of the local control panel and the remote panel have been developed according to user requirements. All the old sensors, limit switches, relays and cables have been replaced by new items. This paper reports on the development and commissioning of the basic hardware, software and operation of the control system over the last six months. (author)

  11. Λ scattering equations

    Science.gov (United States)

    Gomez, Humberto

    2016-06-01

    The CHY representation of scattering amplitudes is based on integrals over the moduli space of a punctured sphere. We replace the punctured sphere by a double-cover version. The resulting scattering equations depend on a parameter Λ controlling the opening of a branch cut. The new representation of scattering amplitudes possesses an enhanced redundancy which can be used to fix, modulo branches, the location of four punctures while promoting Λ to a variable. Via residue theorems we show how CHY formulas break up into sums of products of smaller (off-shell) ones times a propagator. This leads to a powerful way of evaluating CHY integrals of generic rational functions, which we call the Λ algorithm.
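
    For reference, the undeformed scattering equations that the double-cover construction generalizes take the standard form of the CHY literature, with s_ab = (k_a + k_b)^2 the Mandelstam invariants and sigma_a the puncture locations:

    ```latex
    \[
      E_a \;=\; \sum_{b \neq a} \frac{s_{ab}}{\sigma_a - \sigma_b} \;=\; 0,
      \qquad a = 1, \dots, n .
    \]
    ```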

  12. Patient-specific IMRT verification using independent fluence-based dose calculation software: experimental benchmarking and initial clinical experience

    International Nuclear Information System (INIS)

    Georg, Dietmar; Stock, Markus; Kroupa, Bernhard; Olofsson, Joergen; Nyholm, Tufve; Ahnesjoe, Anders; Karlsson, Mikael

    2007-01-01

    Experimental methods are commonly used for patient-specific intensity-modulated radiotherapy (IMRT) verification. The purpose of this study was to investigate the accuracy and performance of independent dose calculation software (denoted 'MUV', monitor unit verification) for patient-specific quality assurance (QA). 52 patients receiving step-and-shoot IMRT were considered. IMRT plans were recalculated by the treatment planning system (TPS) in a dedicated QA phantom, in which experimental 1D and 2D verification (0.3 cm³ ionization chamber; films) was performed. Additionally, an independent dose calculation was performed. The fluence-based algorithm of MUV accounts for collimator transmission, rounded leaf ends, the tongue-and-groove effect, backscatter to the monitor chamber and scatter from the flattening filter. The dose calculation utilizes a pencil beam model based on a beam quality index. DICOM RT files from patient plans, exported from the TPS, were used directly as patient-specific input data in MUV. For composite IMRT plans, average deviations in the high-dose region between ionization chamber measurements and point dose calculations performed with the TPS and MUV were 1.6 ± 1.2% and 0.5 ± 1.1% (1 S.D.), respectively. The dose deviations between MUV and TPS depended slightly on the distance from the isocentre position. For individual intensity-modulated beams (367 in total), an average deviation of 1.1 ± 2.9% was determined between calculations performed with the TPS and with MUV, with maximum deviations up to 14%; however, absolute dose deviations were mostly less than 3 cGy. Based on the current results, we aim to apply a confidence limit of 3% (with respect to the prescribed dose) or 6 cGy for routine IMRT verification. For off-axis points at distances larger than 5 cm and for low-dose regions, we consider 5% dose deviation or 10 cGy acceptable. The time needed for an independent calculation compares very favourably with the net time for an experimental approach.
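
    The proposed confidence limits translate directly into a pass/fail rule for routine point-dose QA. A naive sketch applying the paper's high-dose-region criterion (3% of the prescribed dose or 6 cGy); the function and argument names are invented:

    ```python
    def imrt_point_dose_passes(d_tps, d_muv, d_prescribed,
                               rel_tol=0.03, abs_tol_cgy=6.0):
        """True if the TPS and independent (MUV-style) point doses agree
        within rel_tol of the prescribed dose or within abs_tol_cgy
        (all doses in cGy)."""
        deviation = abs(d_tps - d_muv)
        return deviation <= rel_tol * d_prescribed or deviation <= abs_tol_cgy
    ```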

  13. A Visual Basic simulation software tool for performance analysis of a membrane-based advanced water treatment plant.

    Science.gov (United States)

    Pal, P; Kumar, R; Srivastava, N; Chaudhuri, J

    2014-02-01

    Visual Basic simulation software (WATTPPA) has been developed to analyse the performance of an advanced wastewater treatment plant. This user-friendly, menu-driven software is based on a dynamic mathematical model for an industrial wastewater treatment scheme that integrates chemical, biological and membrane-based unit operations. The software-predicted results corroborate the experimental findings very well, as indicated by an overall correlation coefficient of the order of 0.99. The software permits pre-analysis and manipulation of input data, helps in optimization, and exhibits the performance of an integrated plant visually on a graphical platform. It allows quick performance analysis of the whole system as well as of the individual units. The software, the first of its kind in its domain and built in the well-known Microsoft Excel environment, is likely to be very useful in the successful design, optimization and operation of an advanced hybrid treatment plant for hazardous wastewater.

  14. Screen Miniatures as Icons for Backward Navigation in Content-Based Software.

    Science.gov (United States)

    Boling, Elizabeth; Ma, Guoping; Tao, Chia-Wen; Askun, Cengiz; Green, Tim; Frick, Theodore; Schaumburg, Heike

    Users of content-based software programs, including hypertexts and instructional multimedia, rely on the navigation functions provided by the designers of those programs. Typical navigation schemes use abstract symbols (arrows) to label basic navigational functions like moving forward or backward through screen displays. In a previous study, the…

  15. Software ecosystems analyzing and managing business networks in the software industry

    CERN Document Server

    Jansen, S; Cusumano, MA

    2013-01-01

    This book describes the state of the art of software ecosystems. It constitutes a fundamental step towards an empirically based, nuanced understanding of the implications of software ecosystems for management, governance, and control. This is the first book of its kind dedicated to this emerging field and offers guidelines on how to analyze software ecosystems; methods for managing and growing them; methods for transitioning from a closed software organization to an open one; and instruments for dealing with open source, licensing issues, product management and app stores. It is unique in bringing together these perspectives in a single volume.

  16. A Knowledge-based Environment for Software Process Performance Analysis

    Directory of Open Access Journals (Sweden)

    Natália Chaves Lessa Schots

    2015-08-01

    Background: Process performance analysis is a key step for implementing continuous improvement in software organizations. However, the knowledge required to execute such analysis is not trivial, and the person responsible for executing it must be provided with appropriate support. Aim: This paper presents a knowledge-based environment, named SPEAKER, designed to support software organizations during the execution of process performance analysis. SPEAKER comprises a body of knowledge and a set of activities and tasks for software process performance analysis, along with supporting tools for executing these activities and tasks. Method: We conducted an informal literature review and a systematic mapping study, which provided basic requirements for the proposed environment. We implemented the SPEAKER environment by integrating supporting tools for the execution of performance analysis activities and tasks with the knowledge necessary to execute them, in order to meet the variability presented by the characteristics of these activities. Results: In this paper, we describe each SPEAKER module and the individual evaluations of these modules, and also present an example of use showing how the environment can guide the user through a specific performance analysis activity. Conclusion: Although we only conducted individual evaluations of SPEAKER's modules, the example of use indicates the feasibility of the proposed environment. Therefore, the environment as a whole will be further evaluated to verify whether it attains its goal of assisting non-specialists in the execution of process performance analysis.

  17. The Need for V&V in Reuse-Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1997-01-01

    V&V is currently performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for these activities.

  18. Uniframe: A Unified Framework for Developing Service-Oriented, Component-Based Distributed Software Systems

    National Research Council Canada - National Science Library

    Raje, Rajeev R; Olson, Andrew M; Bryant, Barrett R; Burt, Carol C; Auguston, Mikhail

    2005-01-01

    It describes how this approach employs a unifying framework for specifying such systems, uniting the concepts of service-oriented architectures with a component-based software engineering methodology...

  19. PScan 1.0: flexible software framework for polygon based multiphoton microscopy

    Science.gov (United States)

    Li, Yongxiao; Lee, Woei Ming

    2016-12-01

    Multiphoton laser scanning microscopes exhibit highly localized nonlinear optical excitation and are powerful instruments for in-vivo deep-tissue imaging. Customized multiphoton microscopy offers significantly superior performance for in-vivo imaging because of the precise control over the scanning and detection system. To date, several flexible software platforms have catered to custom-built microscopy systems (e.g. ScanImage, HelioScan, MicroManager), performing at imaging speeds of 30-100 fps. In this paper, we describe a flexible software framework for high-speed imaging systems capable of operating from 5 fps to 1600 fps. The software is based on the MATLAB image processing toolbox. It can communicate directly with a high-performance imaging card (Matrox Solios eA/XA), thus retaining high-speed acquisition. The program is also designed to communicate with LabVIEW and Fiji for instrument control and image processing. PScan 1.0 can handle high imaging rates and contains sufficient flexibility for users to adapt it to their own high-speed imaging systems.

  20. Implementation Of Carlson Survey Software 2009 In Survey Works And Comparison With CDS Software

    Directory of Open Access Journals (Sweden)

    Mohamed Faraj EL Megrahi

    2017-02-01

    Surveying automation is one of the most influential changes the surveying concept and profession has had to go through. It has taken effect in two major courses: hardware (the instrumentation used in data collection and presentation) and software (the applications used in data processing and manipulation). Automation is largely computer-based and, like all such systems, is subject to frequent improvement; this is manifested in new instrumentation models every few years, such as total stations, and in newer versions of software. Software with the potential to substantially advance survey automation is Carlson Surveying Software. When coupled with a total station for data collection and processing, it is capable of greatly improving productivity while reducing the time and cost required in the long run. However, it is only natural for users to desire competent software and to be able to choose from what is available on the market based on guided research and credible information from previous studies. Such studies not only help in the choice of software but are also handy when it comes to testing approaches and recommending improvements to the manufacturers, helping to advance the software industry for better and more comfortable use. The outcome of this research is a successful implementation of Carlson Survey 2009 software in survey works and a comparison with other existing software, such as Civil Design Software (CDS), highlighting its advantages and disadvantages.

  1. Thermal-neutron multiple scattering: critical double scattering

    International Nuclear Information System (INIS)

    Holm, W.A.

    1976-01-01

    A quantum mechanical formulation for multiple scattering of thermal neutrons from macroscopic targets is presented and applied to single and double scattering. Critical nuclear scattering from liquids and critical magnetic scattering from ferromagnets are treated in detail in the quasielastic approximation, for target systems slightly above their critical points. Numerical estimates are made of the double-scattering contribution to the critical magnetic cross section using relevant parameters from actual experiments performed on various ferromagnets. The effect is to alter the usual Lorentzian line-shape dependence on the neutron wave-vector transfer. Comparison with the corresponding deviations in line shape resulting from the use of Fisher's modified form of the Ornstein-Zernike spin correlations within the framework of single-scattering theory leads to values of the critical exponent η of the modified correlations which reproduce the effect of double scattering. In addition, it is shown that by restricting the range of applicability of the multiple-scattering theory from the outset to critical scattering, Glauber's high-energy approximation can be used to provide a much simpler and more powerful description of multiple-scattering effects. When sufficiently close to the critical point, it provides a closed-form expression for the differential cross section which includes all orders of scattering and has the same form as the single-scattering cross section with a modified exponent for the wave-vector transfer.
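
    As background for the line-shape comparison above, the Ornstein-Zernike and Fisher-modified quasielastic forms are, in their standard conventions (not quoted from the report), with q the wave-vector transfer and kappa the inverse correlation length:

    ```latex
    \[
      \left(\frac{d\sigma}{d\Omega}\right)_{\mathrm{OZ}}
        \;\propto\; \frac{1}{\kappa^{2} + q^{2}},
      \qquad
      \left(\frac{d\sigma}{d\Omega}\right)_{\mathrm{Fisher}}
        \;\propto\; \frac{1}{\left(\kappa^{2} + q^{2}\right)^{1-\eta/2}} ,
    \]
    % so double scattering mimics a small effective eta in a single-scattering fit.
    ```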

  2. Software safety hazard analysis

    International Nuclear Information System (INIS)

    Lawrence, J.D.

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper

  3. Aquarius' Object-Oriented, Plug and Play Component-Based Flight Software

    Science.gov (United States)

    Murray, Alexander; Shahabuddin, Mohammad

    2013-01-01

    The Aquarius mission involves a combined radiometer and radar instrument in low-Earth orbit, providing monthly global maps of Sea Surface Salinity. Operating successfully in orbit since June 2011, the spacecraft bus was furnished by the Argentine space agency, Comision Nacional de Actividades Espaciales (CONAE). The instrument, built jointly by NASA's Caltech/JPL and Goddard Space Flight Center, has been successfully producing expectation-exceeding data since it was powered on in August of 2011. In addition to the radiometer and scatterometer, the instrument contains a command and data-handling subsystem with a computer and flight software (FSW) that is responsible for managing the instrument, its operation, and its data. Aquarius' FSW is conceived and architected as a Component-based system, in which the running software consists of a set of Components, each playing a distinctive role in the subsystem, instantiated and connected together at runtime. Component architectures feature a well-defined set of interfaces between the Components, visible and analyzable at the architectural level (see [1]). As we will describe, this kind of architecture offers significant advantages over more traditional FSW architectures, which often feature a monolithic runtime structure. Component-based software is enabled by Object-Oriented (OO) techniques and languages, the use of which again is not typical in space mission FSW. We will argue in this paper that the use of OO design methods and tools (especially the Unified Modeling Language), as well as the judicious usage of C++, are very well suited to FSW applications, and we will present Aquarius FSW, describing our methods, processes, and design, as a successful case in point.
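
    The runtime-connected Component pattern described above can be caricatured in a few lines. The toy below is in Python rather than the mission's C++, and every class and port name is invented; it shows only the idea of components that expose ports and are wired together at runtime rather than hard-coded to one another.

    ```python
    class Port:
        """Connection point; components talk only through their ports."""
        def __init__(self):
            self._handler = None
        def connect(self, handler):
            self._handler = handler
        def send(self, msg):
            return self._handler(msg)

    class TelemetryStore:
        def handle(self, msg):
            print("logged:", msg)

    class Radiometer:
        def __init__(self):
            self.tlm_out = Port()
        def sample(self):
            self.tlm_out.send({"raw_count": 42})   # invented payload

    # The topology is established at runtime, visible in one place:
    radiometer, store = Radiometer(), TelemetryStore()
    radiometer.tlm_out.connect(store.handle)
    radiometer.sample()
    ```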

  4. A study of software safety analysis system for safety-critical software

    International Nuclear Information System (INIS)

    Chang, H. S.; Shin, H. K.; Chang, Y. W.; Jung, J. C.; Kim, J. H.; Han, H. H.; Son, H. S.

    2004-01-01

    The core factors and requirements for safety-critical software are traced, and the methodology adopted in each stage of the software life cycle is presented. In the concept phase, a Failure Modes and Effects Analysis (FMEA) of the system was performed. A feasibility evaluation of the selected safety parameter was carried out, and a Preliminary Hazard Analysis list was prepared using the HAZOP (Hazard and Operability) technique. A checklist for management control was produced via a walk-through technique. Based on the evaluation of the checklist, the activities to be performed in the requirements phase were determined. In the design phase, hazard analysis was performed to check the safety capability of the system with regard to the safety software algorithm, using Fault Tree Analysis (FTA). In the test phase, the test items based on FMEA were checked for fitness, guided by an accident scenario. The pressurizer low-pressure trip algorithm was selected as a sample to which the FTA method was applied for software safety analysis. By applying a CASE tool, the requirements traceability of the safety-critical system was enhanced throughout all phases of the software life cycle.

  5. A user-friendly LabVIEW software platform for grating based X-ray phase-contrast imaging.

    Science.gov (United States)

    Wang, Shenghao; Han, Huajie; Gao, Kun; Wang, Zhili; Zhang, Can; Yang, Meng; Wu, Zhao; Wu, Ziyu

    2015-01-01

    X-ray phase-contrast imaging can provide greatly improved contrast over conventional absorption-based imaging for weakly absorbing samples, such as biological soft tissues and fibre composites. In this study, we introduce an easy and fast way to develop a user-friendly software platform dedicated to the new grating-based X-ray phase-contrast imaging setup at the National Synchrotron Radiation Laboratory of the University of Science and Technology of China. The control of 21 motorized stages, a piezoelectric stage and an X-ray tube is achieved with this software, which also covers image acquisition with a flat-panel detector for automatic phase-stepping scans. Moreover, a data post-processing module for signal retrieval and other custom features are in principle available. With a seamless integration of all the necessary functions in one software package, this platform greatly facilitates users' activities during experimental runs with this grating-based X-ray phase-contrast imaging setup.
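
    The signal-retrieval step of a phase-stepping scan is conventionally a per-pixel Fourier analysis over the grating positions. A minimal sketch of that standard retrieval, assumed here for illustration (the platform's own post-processing module may differ):

    ```python
    import numpy as np

    def phase_stepping_retrieval(stack):
        """stack: (n_steps, ny, nx) frames over one grating period.
        Returns mean-intensity, visibility and differential-phase images."""
        n = stack.shape[0]
        F = np.fft.fft(stack, axis=0)
        a0 = np.abs(F[0]) / n          # mean intensity  -> absorption signal
        a1 = 2.0 * np.abs(F[1]) / n    # fringe amplitude
        phi = np.angle(F[1])           # fringe phase    -> differential phase
        return a0, a1 / a0, phi        # visibility = a1 / a0
    ```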

  6. FPGA-Based Efficient Hardware/Software Co-Design for Industrial Systems with Consideration of Output Selection

    Science.gov (United States)

    Deliparaschos, Kyriakos M.; Michail, Konstantinos; Zolotas, Argyrios C.; Tzafestas, Spyros G.

    2016-05-01

    This work presents a field programmable gate array (FPGA)-based embedded software platform coupled with a software-based plant, forming a hardware-in-the-loop (HIL) system that is used to validate a systematic sensor selection framework. The framework combines multi-objective optimization, linear-quadratic-Gaussian (LQG)-type control, and the nonlinear model of a maglev suspension. A robustness analysis of the closed loop follows (prior to implementation), supporting the appropriateness of the solution under parametric variation. The analysis also shows that quantization is robust under different controller gains. While the LQG controller is implemented on an FPGA, the physical process is realized in a high-level system modeling environment. FPGA technology enables rapid evaluation of the algorithms and test designs under realistic scenarios, avoiding the heavy time penalty associated with hardware description language (HDL) simulators. The HIL technique facilitates a significant speed-up in the required execution time when compared to its software-based counterpart model.
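
    An LQG design of the kind referenced above pairs an LQR state-feedback gain with a steady-state Kalman estimator, each obtained from a discrete algebraic Riccati equation. A generic sketch with illustrative weight matrices, not the paper's maglev model:

    ```python
    import numpy as np
    from scipy.linalg import solve_discrete_are

    def lqg_gains(A, B, C, Q, R, W, V):
        """Discrete LQG: returns the state-feedback gain K (u = -K x_hat)
        and the steady-state Kalman gain L for correcting x_hat.
        Q, R weight the regulator; W, V are process/measurement noise covariances."""
        P = solve_discrete_are(A, B, Q, R)                 # control Riccati
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        S = solve_discrete_are(A.T, C.T, W, V)             # estimation Riccati
        L = S @ C.T @ np.linalg.inv(C @ S @ C.T + V)
        return K, L
    ```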

  7. An Agent Based Software Approach towards Building Complex Systems

    Directory of Open Access Journals (Sweden)

    Latika Kharb

    2015-08-01

    Agent-oriented techniques represent an exciting new means of analyzing, designing and building complex software systems. They have the potential to significantly improve current practice in software engineering and to extend the range of applications that can feasibly be tackled. Yet, to date, there have been few serious attempts to cast agent systems as a software engineering paradigm. This paper seeks to rectify this omission. Specifically, the points argued include: firstly, that the conceptual apparatus of agent-oriented systems is well suited to building software solutions for complex systems; and secondly, that agent-oriented approaches represent a genuine advance over the current state of the art for engineering complex systems. Following on from this view, the major issues raised by adopting an agent-oriented approach to software engineering are highlighted and discussed in this paper.

  8. Measuring module of spectrometer of neutron small angle scattering on the IBR pulse reactor

    International Nuclear Information System (INIS)

    Vagov, V.A.; Zhukov, G.P.; Kozlova, E.P.; Korobchenko, M.L.; Namsraj, Yu.; Ostanevich, Yu.M.; Savvateev, A.S.; Salamatin, I.M.; Sirotin, A.P.

    1980-01-01

    Equipment and software for experiments with small-angle neutron scattering are described. They are intended for data acquisition, equipment control, storage of collected data, and data output to the network of the Laboratory's measuring centre. The set-up includes 9 neutron detectors with corresponding electronic apparatus, a sample-exchanging device, a communication link, an SM-3 type minicomputer in an extended configuration, and some units of CAMAC electronic equipment. The software (the MUR applied operating system) is intended for the automatic performance of a given number of cycles of successive uniform runs of a given duration, over the sample list, at two possible filter positions. Besides this, the MUR system contains test, debugging and service software. The software has been designed using the facilities of the SANPO system. [ru]

  9. Comparison of Automated Atlas Based Segmentation Software for postoperative prostate cancer radiotherapy

    Directory of Open Access Journals (Sweden)

    Grégory Delpon

    2016-08-01

    Automated atlas-based segmentation algorithms have the potential to reduce the variability in volume delineation. Several vendors offer software that is mainly used for cranial, head-and-neck and prostate cases. The present study compares the contours produced by a radiation oncologist with the contours computed by different automated atlas-based segmentation algorithms for prostate bed cases, including femoral heads, bladder and rectum. Contour comparison was evaluated with different metrics such as volume ratio, Dice coefficient and Hausdorff distance. Results depended on the volume of interest and showed some discrepancies between the different software packages. Automatic contours could be a good starting point for the delineation of organs, since efficient editing tools are provided by the different vendors. They should become an important aid for organ-at-risk delineation in the next few years.
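
    The two agreement metrics named above are straightforward to compute from binary masks and contour point sets. A minimal sketch, independent of any vendor's implementation:

    ```python
    import numpy as np
    from scipy.spatial.distance import directed_hausdorff

    def dice_coefficient(mask_a, mask_b):
        """Overlap of two binary segmentation masks (1.0 = perfect agreement)."""
        inter = np.logical_and(mask_a, mask_b).sum()
        return 2.0 * inter / (mask_a.sum() + mask_b.sum())

    def hausdorff_distance(points_a, points_b):
        """Symmetric Hausdorff distance between two contours as point sets."""
        return max(directed_hausdorff(points_a, points_b)[0],
                   directed_hausdorff(points_b, points_a)[0])
    ```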

  10. Development of an irrigation scheduling software based on model predicted crop water stress

    Science.gov (United States)

    Modern irrigation scheduling methods are generally based on sensor-monitored soil moisture regimes rather than crop water stress, which is difficult to measure in real time but can be computed using agricultural system models. In this study, irrigation scheduling software based on RZWQM2 model predictions of crop water stress was developed.

  11. A diode laser-based velocimeter providing point measurements in unseeded flows using modulated filtered Rayleigh scattering (MFRS)

    Science.gov (United States)

    Jagodzinski, Jeremy James

    2007-12-01

    The development to date of a diode-laser-based velocimeter providing point velocity measurements in unseeded flows using molecular Rayleigh scattering is discussed. The velocimeter is based on modulated filtered Rayleigh scattering (MFRS), a novel variation of filtered Rayleigh scattering (FRS) that utilizes modulated absorption spectroscopy techniques to detect a strong absorption of a relatively weak Rayleigh scattered signal. A rubidium (Rb) vapor filter is used to provide the relatively strong absorption; alkali metal vapors have a high optical depth at modest vapor pressures, and their narrow linewidth is ideally suited for high-resolution velocimetry. Semiconductor diode lasers are used to generate the relatively weak Rayleigh scattered signal; due to their compact, rugged construction, diode lasers are ideally suited for the environmental extremes encountered in many experiments. The MFRS technique utilizes the frequency-tuning capability of diode lasers to implement a homodyne detection scheme using lock-in amplifiers. The optical frequency of the diode-based laser system used to interrogate the flow is rapidly modulated about a reference frequency in the D2 line of Rb. The frequency modulation is imposed on the Rayleigh scattered light that is collected from the probe volume in the flow under investigation. The collected frequency-modulated Rayleigh scattered light is transmitted through a Rb vapor filter before being detected. The detected modulated absorption signal is fed to two lock-in amplifiers synchronized with the modulation frequency of the source laser. High levels of background rejection are attained since the lock-ins are both frequency and phase selective. The two lock-in amplifiers extract different Fourier components of the detected modulated absorption signal, which are ratioed to provide an intensity-normalized, frequency-dependent signal from a single detector. A Doppler frequency shift in the collected Rayleigh scattered light due to a change in flow velocity alters this normalized signal, from which the flow velocity at the probe volume is inferred.
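
    The homodyne scheme described above amounts to extracting Fourier components of the detected signal at harmonics of the modulation frequency and ratioing them. A software analogue of that lock-in processing is sketched below (the instrument itself uses hardware lock-in amplifiers; the choice of harmonics and the sampling parameters are assumptions):

    ```python
    import numpy as np

    def harmonic_ratio(signal, f_mod, fs):
        """Ratio of the first to the second harmonic of the modulation
        frequency f_mod (Hz) in a signal sampled at fs (Hz) -- an
        intensity-normalized, frequency-dependent quantity from one detector."""
        t = np.arange(signal.size) / fs
        def component(n):
            ref = np.exp(-2j * np.pi * n * f_mod * t)
            return np.abs(np.mean(signal * ref))
        return component(1) / component(2)
    ```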

  12. Method and software to solution of inverse and inverse design fluid flow and heat transfer problems is compatible with CFD-software

    Energy Technology Data Exchange (ETDEWEB)

    Krukovsky, P G [Institute of Engineering Thermophysics, National Academy of Sciences of Ukraine, Kiev (Ukraine)

    1998-12-31

    A description is given of the method and software FRIEND, which provide the possibility of solving inverse and inverse-design problems on the basis of existing (base) CFD software for the solution of direct problems (in particular, heat-transfer and fluid-flow problems using the software PHOENICS). FRIEND is an independent additional module that widens the operational capabilities of the base software with which it is unified; this unification does not require any change or addition to the base software. Interfacing of FRIEND and the base software takes place through the input and output files of the base software. A brief description of the computational technique applied for the inverse problem solution, some detailed information on the interfacing of FRIEND and the CFD software, and solution results for test inverse and inverse-design problems, obtained using the tandem of the CFD software PHOENICS and FRIEND, are presented. (author) 9 refs.

  14. A domain derivative-based method for solving elastodynamic inverse obstacle scattering problems

    International Nuclear Information System (INIS)

    Le Louër, Frédérique

    2015-01-01

    The present work is concerned with the shape reconstruction problem of isotropic elastic inclusions from far-field data obtained by the scattering of a finite number of time-harmonic incident plane waves. This paper aims at completing the theoretical framework which is necessary for the application of geometric optimization tools to the inverse transmission problem in elastodynamics. The forward problem is reduced to systems of boundary integral equations following the direct and indirect methods initially developed for solving acoustic transmission problems. We establish the Fréchet differentiability of the boundary to far-field operator and give a characterization of the first Fréchet derivative and its adjoint operator. Using these results we propose an inverse scattering algorithm based on the iteratively regularized Gauß–Newton method and show numerical experiments in the special case of star-shaped obstacles. (paper)
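
    For orientation, the iteratively regularized Gauss-Newton step on which the proposed algorithm is based has the standard form (with F the boundary-to-far-field operator, x_k the current shape parametrization, x_0 the initial guess, y the far-field data, and alpha_k a decreasing regularization sequence):

    ```latex
    \[
      x_{k+1} \;=\; x_k \;+\;
      \left( F'(x_k)^{*} F'(x_k) + \alpha_k I \right)^{-1}
      \left[ F'(x_k)^{*} \big( y - F(x_k) \big)
             + \alpha_k \left( x_0 - x_k \right) \right].
    \]
    ```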

  15. Development of an ejecta particle size measurement diagnostic based on Mie scattering

    Energy Technology Data Exchange (ETDEWEB)

    Schauer, Martin Michael [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Buttler, William Tillman [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Frayer, Daniel K. [National Security Tech, Inc., Los Alamos, NM (United States); Grover, Michael [National Security Technologies, Santa Barbara, CA (United States). Special Technologies Lab.; Monfared, Shabnam Kalighi [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Stevens, Gerald D. [National Security Technologies, Santa Barbara, CA (United States). Special Technologies Lab.; Stone, Benjamin J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Turley, William Dale [National Security Technologies, Santa Barbara, CA (United States). Special Technologies Lab.

    2017-09-27

    The goal of this work is to determine the feasibility of extracting the size of particles ejected from shocked metal surfaces (ejecta) from the angular distribution of light scattered by a cloud of such particles. The basis of the technique is the Mie theory of scattering, and implicit in this approach are the assumptions that the scattering particles are spherical and that single-scattering conditions prevail. The meaning of this latter assumption, as far as experimental conditions are concerned, will become clear later. The solution to Maxwell's equations for spherical particles illuminated by a plane electromagnetic wave was derived by Gustav Mie more than 100 years ago, but several modern treatises discuss this solution in great detail. The solution is a complicated series expansion of the scattered electric field, as well as the field within the particle, from which the total scattering and absorption cross sections as well as the angular distribution of scattered intensity can be calculated numerically. The detailed nature of the scattering is determined by the complex index of refraction of the particle material as well as the particle size parameter x, which is the product of the wavenumber of the incident light and the particle radius, i.e. x = 2πr/λ. Figure 1 shows the angular distribution of scattered light for different particle size parameters and two orthogonal incident light polarizations, as calculated using the Mie solution. It is obvious that the scattering pattern is strongly dependent on the particle size parameter, becoming more forward-directed and less polarization-dependent as the particle size parameter increases. This trend forms the basis for the diagnostic design.
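
    The size parameter controlling the trend just described is quick to evaluate. A trivial sketch with invented ejecta radii and a 532 nm probe wavelength (the experiment's actual wavelength is not stated in this excerpt):

    ```python
    import math

    def size_parameter(radius, wavelength):
        """Mie size parameter x = 2*pi*r / lambda (same length units)."""
        return 2.0 * math.pi * radius / wavelength

    # 0.5 um and 5 um particles at 532 nm give x of about 5.9 and 59,
    # spanning the transition to strongly forward-directed scattering:
    print([round(size_parameter(r, 532e-9), 1) for r in (0.5e-6, 5e-6)])
    ```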

  16. Automated Computer-Based Facility for Measurement of Near-Field Structure of Microwave Radiators and Scatterers

    DEFF Research Database (Denmark)

    Mishra, Shantnu R.; Pavlasek, Tomas J. F.; Muresan, Letitia V.

    1980-01-01

    An automatic facility for measuring the three-dimensional structure of the near fields of microwave radiators and scatterers is described. The amplitude and phase of different polarization components can be recorded in analog and digital form using a microprocessor-based system. The stored data are transferred to a large high-speed computer for bulk processing and for the production of isophote and equiphase contour maps or profiles. The performance of the system is demonstrated through results for a single conical horn, for interacting rectangular horns, and for multiple cylindrical scatterers...

  17. Orthographic Software Modelling: A Novel Approach to View-Based Software Engineering

    Science.gov (United States)

    Atkinson, Colin

    The need to support multiple views of complex software architectures, each capturing a different aspect of the system under development, has been recognized for a long time. Even the very first object-oriented analysis/design methods, such as the Booch method and OMT, supported a number of different diagram types (e.g. structural, behavioral, operational), and subsequent methods such as Fusion, Kruchten's 4+1 views and the Rational Unified Process (RUP) have added many more views over time. Today's leading modeling languages, such as the UML and SysML, are also oriented towards supporting different views (i.e. diagram types), each able to portray a different facet of a system's architecture. More recently, so-called enterprise architecture frameworks such as the Zachman Framework, TOGAF and RM-ODP have become popular. These add a whole set of new non-functional views to the views typically emphasized in traditional software engineering environments.

  18. Software Engineering Improvement Plan

    Science.gov (United States)

    2006-01-01

    In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.

  19. Methods and Software for Building Bibliographic Data Bases.

    Science.gov (United States)

    Daehn, Ralph M.

    1985-01-01

    This in-depth look at database management systems (DBMS) for microcomputers covers data entry, information retrieval, security, DBMS software and design, and downloading of literature search results. The advantages of in-house systems versus online search vendors are discussed, and specifications of three software packages and 14 sources are…

  20. Software for project-based learning of robot motion planning

    Science.gov (United States)

    Moll, Mark; Bordeaux, Janice; Kavraki, Lydia E.

    2013-12-01

    Motion planning is a core problem in robotics concerned with finding feasible paths for a given robot. Motion planning algorithms perform a search in the high-dimensional continuous space of robot configurations and exemplify many of the core concepts of search algorithms and their associated data structures. Motion planning algorithms can be explained in a simplified two-dimensional setting, but this masks many of the subtleties and complexities of the underlying problem. We have developed software for project-based learning of motion planning that enables deep learning. The projects that we have developed allow advanced undergraduate and graduate students to reflect on the performance of existing textbook algorithms and on their own variations of such algorithms. Formative assessment has been conducted at three institutions. The core of the software used for this teaching module is also used within the Robot Operating System, a platform widely adopted by the robotics research community. This allows for the transfer of knowledge and skills to robotics research projects involving a large variety of robot hardware platforms.
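
    A textbook planner of the kind these projects build on is the rapidly-exploring random tree (RRT). The sketch below is a minimal 2D version in the unit square, purely illustrative; library-grade planners add general state spaces, smarter sampling, and path simplification.

    ```python
    import math
    import random

    def rrt_2d(start, goal, is_free, step=0.05, iters=5000, goal_tol=0.05):
        """Grow a tree from start by stepping toward random samples (with 10%
        goal bias); is_free((x, y)) -> bool is the collision checker."""
        nodes, parent = [start], {0: None}
        for _ in range(iters):
            s = goal if random.random() < 0.1 else (random.random(), random.random())
            i = min(range(len(nodes)), key=lambda j: math.dist(nodes[j], s))
            x, y = nodes[i]
            d = math.dist((x, y), s) or 1e-9
            new = (x + step * (s[0] - x) / d, y + step * (s[1] - y) / d)
            if not is_free(new):
                continue
            nodes.append(new)
            parent[len(nodes) - 1] = i
            if math.dist(new, goal) < goal_tol:      # goal reached: walk back up
                path, k = [], len(nodes) - 1
                while k is not None:
                    path.append(nodes[k])
                    k = parent[k]
                return path[::-1]
        return None
    ```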