WorldWideScience

Sample records for vessel analysis software

  1. Accuracy and initial clinical experience with measurement software (advanced vessel analysis) in three-dimensional imaging

    Energy Technology Data Exchange (ETDEWEB)

    Abe, Toshi; Hirohata, Masaru [Kurume Univ., Fukuoka (Japan). School of Medicine]; Tanigawa, Hitoshi [Kurume Univ., Fukuoka (Japan). Hospital] [and others]

    2002-12-01

    Recently, the clinical benefits of three-dimensional (3D) imaging, such as 3D-CTA and 3D-DSA, in cerebro-vascular disease have been widely recognized. Software for quantitative analysis of vascular structures in 3D imaging (advanced vessel analysis: AVA) has been developed. We evaluated AVA in both phantom studies and a few clinical cases. In studies of spiral and curved aluminum tube phantoms, the accuracy of diameter measurements was good in 3D images produced from data sets generated by multi-detector row CT or rotational angiography. The measurement error was less than 0.03 mm for aluminum tube phantoms 3 mm and 5 mm in diameter. In the clinical studies, the difference in carotid artery diameter measurements between 2D-DSA and 3D-DSA was less than 0.3 mm. The measurement of length, diameter and angle by AVA should provide useful information for planning surgical and endovascular treatments of cerebro-vascular disease. (author)

  2. Analysis by NASA's VESGEN Software of Retinal Blood Vessels Before and After 70-Day Bed Rest: A Retrospective Study

    Science.gov (United States)

    Raghunandan, Sneha; Vyas, Ruchi J.; Vizzeri, Gianmarco; Taibbi, Giovanni; Zanello, Susana B.; Ploutz-Snyder, Robert; Parsons-Wingerter, Patricia A.

    2016-01-01

    Significant risks for visual impairment associated with increased intracranial pressure (VIIP) are incurred by microgravity spaceflight, especially long-duration missions. Impairments include decreased near visual acuity, posterior globe flattening, choroidal folds, optic disc edema and cotton wool spots. We hypothesize that microgravity-induced fluid shifts result in pathological changes within the retinal blood vessels that precede development of visual and other ocular impairments. Potential contributions of retinal vascular remodeling to VIIP etiology are therefore being investigated by NASA's innovative VESsel GENeration Analysis (VESGEN) software for two studies: (1) head-down tilt in human subjects before and after 70 days of bed rest, and (2) U.S. crew members before and after ISS missions. VESGEN analysis in previous research supported by the US National Institutes of Health identified surprising new opportunities to regenerate retinal vessels during early-stage, potentially reversible progression of the visually impairing and blinding disease, diabetic retinopathy.

  3. Analysis by NASA's VESGEN Software of Retinal Blood Vessels in Human Subjects Undergoing Head-Down Tilt During 70-Day Bed Rest

    Science.gov (United States)

    Vyas, Ruchi J.; Murray, Matthew C.; Predovic, Marina; Lim, Shiyin; Askin, Kayleigh N.; Vizzeri, Gianmarco; Taibbi, Giovanni; Mason, Sara Stroble; Zanello, Susana B.; Young, Millenia

    2017-01-01

    Significant risks for visual impairment associated with increased intracranial pressure (VIIP) are incurred by microgravity spaceflight, especially long-duration missions [1]. We hypothesize that microgravity-induced fluid shifts result in pathological changes within blood vessels of the retina that precede development of visual and other ocular impairments. Potential contributions of retinal vascular remodeling to VIIP etiology are therefore being investigated for two studies in 30° infrared (IR) Heidelberg Spectralis® images with NASA's innovative VESsel GENeration Analysis (VESGEN) software [2,3]. The retrospective studies include: (1) before, during and after (pre, mid and post) 6° head-down tilt (HDT) in human subjects during 70 days of bed rest, and (2) before and after missions to the International Space Station (ISS) by U.S. crew members. Results for both studies are almost complete. A preliminary example for HDT is described below.

  4. BIOASSAY VESSEL FAILURE ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Vormelker, P

    2008-09-22

    Two high-pressure bioassay vessels failed at the Savannah River Site during a microwave heating process for biosample testing. Improper installation of the thermal shield in the first failure caused the vessel to burst during microwave heating. The second vessel failure is attributed to overpressurization during a test run. Vessel failure appeared to initiate in the mold parting line, the thinnest cross-section of the octagonal vessel. No material flaws were found in the vessel that would impair its structural performance. Content weight should be minimized to reduce operating temperature and pressure. Outer vessel life is dependent on actual temperature exposure. Since thermal aging of the vessels can be detrimental to their performance, it was recommended that the vessels be used for a limited number of cycles to be determined by additional testing.

  5. A TECHNIQUE FOR ANALYZING AIRCRAFT ONBOARD COMPUTER SOFTWARE FOR THE ABSENCE OF UNDECLARED CAPABILITIES BY A SIGNATURE-HEURISTIC METHOD

    Directory of Open Access Journals (Sweden)

    Viktor Ivanovich Petrov

    2017-01-01

    Full Text Available The article considers data safety issues in civil aviation aircraft onboard computers. In information security, undeclared capabilities are hardware or software features that are not mentioned in the documentation. Requirements on documentation and test content are imposed during software certification. Documentation requirements cover the composition and content of the controlled documents (specification, description and program text, including the source code). Test requirements include static analysis of the program code (including checking that the sources correspond to their load modules) and dynamic analysis of the source code (including monitoring of execution paths). Currently, there are no comprehensive measures for checking onboard computer software: there are no rules and regulations that allow the software of foreign-built aircraft to be controlled, and actually obtaining that software is difficult. Consequently, the author suggests developing the basis of aviation rules and regulations that allow the programs of civil aviation aircraft onboard computers to be analyzed. When no source code is available, two code analysis approaches are used: structural static and dynamic analysis of the code, and signature-heuristic analysis of potentially dangerous operations. Static analysis determines the behavior of the program by reading the program code, without running it, in its assembly-language representation (the disassembly listing); program tracing is performed by dynamic analysis. The article considers the analysis of aircraft software for undeclared capabilities using an interactive disassembler.
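
    The signature-heuristic analysis of potentially dangerous operations mentioned above can be illustrated with a short sketch that scans a disassembly listing for suspicious instruction patterns. The Python example below is only an illustration: the signature set and the listing format are invented for the example and are not taken from the article or from any aviation regulation.

      import re

      # Hypothetical signatures of potentially dangerous operations in a
      # disassembly listing (illustrative only).
      SIGNATURES = {
          "software interrupt": re.compile(r"\bint\s+0x", re.I),
          "direct port I/O":    re.compile(r"\b(?:in|out)\s+", re.I),
          "indirect transfer":  re.compile(r"\b(?:jmp|call)\s+\[", re.I),
      }

      def scan_listing(lines):
          """Return (line_no, rule, text) for every listing line matching a signature."""
          findings = []
          for no, line in enumerate(lines, start=1):
              for rule, pattern in SIGNATURES.items():
                  if pattern.search(line):
                      findings.append((no, rule, line.strip()))
          return findings

      if __name__ == "__main__":
          listing = [
              "0040101A  mov eax, [ebp+8]",
              "0040101D  int 0x80",
              "00401020  call [eax+4]",
          ]
          for no, rule, text in scan_listing(listing):
              print(f"line {no}: {rule}: {text}")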

  6. Semi-automated retinal vessel analysis in nonmydriatic fundus photography.

    Science.gov (United States)

    Schuster, Alexander Karl-Georg; Fischer, Joachim Ernst; Vossmerbaeumer, Urs

    2014-02-01

    Funduscopic assessment of the retinal vessels may be used to assess the health status of microcirculation and as a component in the evaluation of cardiovascular risk factors. Typically, the evaluation is restricted to morphological appreciation without strict quantification. Our purpose was to develop and validate a software tool for semi-automated quantitative analysis of retinal vasculature in nonmydriatic fundus photography. MATLAB software was used to develop a semi-automated image recognition and analysis tool for the determination of the arterial-venous (A/V) ratio in the central vessel equivalent on 45° digital fundus photographs. Validity and reproducibility of the results were ascertained using nonmydriatic photographs of 50 eyes from 25 subjects recorded from a 3DOCT device (Topcon Corp.). Two hundred and thirty-three eyes of 121 healthy subjects were evaluated to define normative values. A software tool was developed using image thresholds for vessel recognition and vessel width calculation in a semi-automated three-step procedure: vessel recognition on the photograph and artery/vein designation, width measurement, and calculation of central retinal vessel equivalents. Mean vessel recognition rate was 78%, vessel class designation rate 75% and reproducibility between 0.78 and 0.91. Mean A/V ratio was 0.84. Application to a healthy norm cohort showed high congruence with previously published manual methods. Processing time per image was one minute. Quantitative geometrical assessment of the retinal vasculature may be performed in a semi-automated manner using dedicated software tools. Yielding reproducible numerical data within a short time, this may add value to mere morphological estimates in the clinical evaluation of fundus photographs. © 2013 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
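
    As a rough illustration of the three-step procedure (threshold-based vessel recognition, width measurement, vessel equivalent calculation), a minimal Python sketch follows. It uses Otsu thresholding, skeletonisation and a distance transform to estimate vessel widths; the artery/vein designation is assumed to be supplied by the user, and a simple mean-width ratio stands in for the central retinal vessel equivalents computed by the actual tool.

      import numpy as np
      from scipy.ndimage import distance_transform_edt
      from skimage.filters import threshold_otsu
      from skimage.morphology import skeletonize

      def vessel_widths(green_channel):
          """Estimate vessel widths (in pixels) from one channel of a fundus image."""
          # Step 1: vessel recognition by global thresholding (vessels are dark).
          mask = green_channel < threshold_otsu(green_channel)
          # Step 2: width measurement - distance to background, sampled on the skeleton.
          skeleton = skeletonize(mask)
          widths = 2.0 * distance_transform_edt(mask)[skeleton]
          return mask, skeleton, widths

      def av_ratio(artery_widths, vein_widths):
          """Step 3 (simplified): ratio of mean arterial to mean venous width."""
          return float(np.mean(artery_widths) / np.mean(vein_widths))

      if __name__ == "__main__":
          img = np.full((64, 64), 200, dtype=np.uint8)
          img[30:34, :] = 40                      # a dark synthetic "vessel"
          _, _, widths = vessel_widths(img)
          print("median width (px):", np.median(widths))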

  7. GRACAT, Software for grounding and collision analysis

    DEFF Research Database (Denmark)

    Friis-Hansen, Peter; Simonsen, Bo Cerup

    2002-01-01

    From 1998 to 2001 an integrated software package for grounding and collision analysis was developed at the Technical University of Denmark within the ISESO project at the cost of six man years (0.75M US$). The software provides a toolbox for a multitude of analyses related to collision and grounding accidents. The software consists of three basic analysis modules and one risk mitigation module: 1) frequency, 2) damage, and 3) consequence. These modules can be used individually or in series and the analyses can be performed in deterministic or probabilistic mode. Finally, in the mitigation ... route where the result is the probability density functions for the cost of oil outflow in a given area per year for the two vessels. In this paper we describe the basic modelling principles and the capabilities of the software package. The software package can be downloaded for research purposes from ...
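
    The modular structure described above (frequency, damage and consequence modules applied in series, in deterministic or probabilistic mode) can be pictured as a simple Monte Carlo chain. The distributions and cost figures below are placeholders chosen purely for illustration; they are not values or models from GRACAT.

      import numpy as np

      rng = np.random.default_rng(0)

      def sample_groundings_per_year():
          # Frequency module (placeholder): Poisson-distributed number of groundings.
          return rng.poisson(lam=0.05)

      def sample_oil_outflow():
          # Damage module (placeholder): oil outflow in tonnes for one grounding.
          return rng.lognormal(mean=5.0, sigma=1.0)

      def outflow_cost(tonnes):
          # Consequence module (placeholder): clean-up cost per tonne of oil.
          return 30_000.0 * tonnes

      def yearly_cost_samples(n_years=10_000):
          """Monte Carlo estimate of the yearly oil-outflow cost distribution."""
          return [sum(outflow_cost(sample_oil_outflow())
                      for _ in range(sample_groundings_per_year()))
                  for _ in range(n_years)]

      if __name__ == "__main__":
          costs = yearly_cost_samples()
          print("mean yearly cost:", np.mean(costs))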

  8. Software for multistate analysis

    NARCIS (Netherlands)

    Willekens, Frans; Putter, H.

    2014-01-01

    Background: The growing interest in pathways, the increased availability of life-history data, innovations in statistical and demographic techniques, and advances in software technology have stimulated the development of software packages for multistate modeling of life histories. Objective: In the

  10. Usefulness of Cone-Beam Computed Tomography and Automatic Vessel Detection Software in Emergency Transarterial Embolization

    Energy Technology Data Exchange (ETDEWEB)

    Carrafiello, Gianpaolo, E-mail: gcarraf@gmail.com; Ierardi, Anna Maria, E-mail: amierardi@yahoo.it; Duka, Ejona, E-mail: ejonaduka@hotmail.com [Insubria University, Department of Radiology, Interventional Radiology (Italy); Radaelli, Alessandro, E-mail: alessandro.radaelli@philips.com [Philips Healthcare (Netherlands); Floridi, Chiara, E-mail: chiara.floridi@gmail.com [Insubria University, Department of Radiology, Interventional Radiology (Italy); Bacuzzi, Alessandro, E-mail: alessandro.bacuzzi@ospedale.varese.it [University of Insubria, Anaesthesia and Palliative Care (Italy); Bucourt, Maximilian de, E-mail: maximilian.de-bucourt@charite.de [Charité - University Medicine Berlin, Department of Radiology (Germany); Marchi, Giuseppe De, E-mail: giuseppedemarchi@email.it [Insubria University, Department of Radiology, Interventional Radiology (Italy)

    2016-04-15

    Background: This study was designed to evaluate the utility of dual phase cone beam computed tomography (DP-CBCT) and automatic vessel detection (AVD) software to guide transarterial embolization (TAE) of angiographically challenging arterial bleedings in emergency settings. Methods: Twenty patients with an arterial bleeding at computed tomography angiography and an inconclusive identification of the bleeding vessel at the initial 2D angiographic series were included. Accuracy of DP-CBCT and AVD software was defined as the ability to detect the bleeding site and the culprit arterial bleeder, respectively. Technical success was defined as the correct positioning of the microcatheter using AVD software. Clinical success was defined as successful embolization. Total volume of iodinated contrast medium and overall procedure time were registered. Results: The bleeding site was not detected by the initial angiogram in 20 % of cases, while impossibility to identify the bleeding vessel was the reason for inclusion in the remaining cases. The bleeding site was detected by DP-CBCT in 19 of 20 (95 %) patients; in one case CBCT-CT fusion was required. AVD software identified the culprit arterial branch in 18 of 20 (90 %) cases. In two cases, vessel tracking required manual marking of the candidate arterial bleeder. Technical success was 95 %. Successful embolization was achieved in all patients. Mean contrast volume injected for each patient was 77.5 ml, and mean overall procedural time was 50 min. Conclusions: The use of C-arm CBCT and AVD software during TAE of angiographically challenging arterial bleedings is feasible and may facilitate successful embolization. Staff training in CBCT imaging and software manipulation is necessary.

  11. Software development for teleroentgenogram analysis

    Science.gov (United States)

    Goshkoderov, A. A.; Khlebnikov, N. A.; Obabkov, I. N.; Serkov, K. V.; Gajniyarov, I. M.; Aliev, A. A.

    2017-09-01

    A framework for the analysis and calculation of teleroentgenograms was developed. Software development was carried out in the Department of Children's Dentistry and Orthodontics at Ural State Medical University. The software calculates the teleroentgenogram by an original method developed in this department, and it also allows users to design their own methods for calculating teleroentgenograms. It is planned to use machine learning (neural networks) in the software; this will make the calculation of teleroentgenograms easier because the methodological points will be placed automatically.

  12. Development of automatic reactor vessel inspection systems: development of data acquisition and analysis system for the nuclear vessel weld

    Energy Technology Data Exchange (ETDEWEB)

    Park, C. H.; Lim, H. T.; Um, B. G. [Korea Advanced Institute of Science and Technology, Taejeon (Korea)]

    2001-03-01

    The objective of this project is to develop an automated ultrasonic data acquisition and data analysis system to examine reactor vessel welds. In order to examine nuclear vessel welds, including the reactor pressure vessel (RPV), a huge amount of ultrasonic data from 6 channels should be processed on-line. In addition, the ultrasonic transducer scanning device should be remotely controlled, because the working place is a high radiation area. This kind of automated ultrasonic testing equipment has not been developed domestically yet. In order to develop an automated ultrasonic testing system, RPV ultrasonic testing equipment developed in foreign countries was investigated and the capability of high-speed ultrasonic signal processing hardware was analyzed. In this study, an ultrasonic signal processing system was designed, and ultrasonic data acquisition and analysis software was developed. 11 refs., 6 figs., 9 tabs. (Author)

  13. Kendall Analysis of Cannon Pressure Vessels

    Science.gov (United States)

    2012-04-11


  14. Software design of the hybrid robot machine for ITER vacuum vessel assembly and maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Li, Ming, E-mail: Ming.Li@lut.fi [Laboratory of Intelligent Machines, Lappeenranta University of Technology (Finland); Wu, Huapeng; Handroos, Heikki [Laboratory of Intelligent Machines, Lappeenranta University of Technology (Finland); Yang, Guangyou [School of Mechanical Engineering, Hubei University of Technology, Wuhan (China)

    2013-10-15

    A specific software design is elaborated in this paper for the hybrid robot machine used for ITER vacuum vessel (VV) assembly and maintenance. In order to provide multiple machining functions as well as the complicated, flexible and customizable GUI design required by the non-standardized VV assembly process on the one hand, and to guarantee stringent machining precision in the real-time motion control of the robot machine on the other, a client-server-control software architecture is proposed, which separates user interaction, data communication and robot control implementation into different software layers. Correspondingly, three particular application protocols on top of TCP/IP are designed to transmit the data, commands and status between the client and the server, so as to deal with the abundant data streaming in the software. So as not to be affected by future modifications of the graphical user interface (GUI) during experiments in the VV assembly working field, the real-time control system is realized as a stand-alone module in the architecture to guarantee the control performance of the robot machine. After completing the software development, a milling operation was tested on the robot machine, and the result demonstrates that both the specific GUI operability and the real-time motion control performance can be guaranteed adequately by the software design.
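
    The client-server split described above, with separate application protocols for data, commands and status on top of TCP/IP, can be pictured with a minimal Python sketch. The message framing used here (a one-byte type tag, a length field and a JSON body) and the server address are assumptions made for the example, not the protocol actually implemented for the ITER robot machine.

      import json
      import socket
      import struct

      # Hypothetical one-byte tags for the three application protocols.
      MSG_DATA, MSG_COMMAND, MSG_STATUS = 0, 1, 2

      def send_message(sock, msg_type, payload):
          """Frame a message as: 1-byte type, 4-byte length, JSON body."""
          body = json.dumps(payload).encode()
          sock.sendall(struct.pack("!BI", msg_type, len(body)) + body)

      def recv_exact(sock, n):
          """Read exactly n bytes from the socket."""
          buf = b""
          while len(buf) < n:
              chunk = sock.recv(n - len(buf))
              if not chunk:
                  raise ConnectionError("connection closed")
              buf += chunk
          return buf

      def recv_message(sock):
          """Read one framed message and return (type, payload)."""
          msg_type, length = struct.unpack("!BI", recv_exact(sock, 5))
          return msg_type, json.loads(recv_exact(sock, length))

      if __name__ == "__main__":
          # Placeholder server address; a command is sent and the status reply printed.
          with socket.create_connection(("robot-server.local", 5000)) as sock:
              send_message(sock, MSG_COMMAND, {"op": "move_to", "pose": [0, 0, 0, 0, 0, 0]})
              print(recv_message(sock))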

  15. MAUS: MICE Analysis User Software

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The Muon Ionization Cooling Experiment (MICE) has developed the MICE Analysis User Software (MAUS) to simulate and analyse experimental data. It serves as the primary codebase for the experiment, providing for online data quality checks and offline batch simulation and reconstruction. The code is structured in a Map-Reduce framework to allow parallelization whether on a personal machine or in the control room. Various software engineering practices from industry are also used to ensure correct and maintainable physics code, which include unit, functional and integration tests, continuous integration and load testing, code reviews, and distributed version control systems. Lastly, there are various small design decisions like using JSON as the data structure, using SWIG to allow developers to write components in either Python or C++, or using the SCons python-based build system that may be of interest to other experiments.
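
    A minimal sketch of the Map-Reduce structure mentioned above: each event is a JSON document, a mapper transforms it independently (and can therefore be parallelized), and a reducer accumulates per-run summaries. The document layout and the quantity accumulated are illustrative assumptions, not the actual MAUS data model.

      import json
      from functools import reduce

      def mapper(event_json):
          """Map step: process one event document independently."""
          event = json.loads(event_json)
          event["n_digits"] = len(event.get("tracker_digits", []))
          return event

      def reducer(summary, event):
          """Reduce step: accumulate run-level totals across events."""
          summary["total_digits"] += event["n_digits"]
          summary["n_events"] += 1
          return summary

      if __name__ == "__main__":
          raw = [json.dumps({"tracker_digits": [1, 2, 3]}),
                 json.dumps({"tracker_digits": [4]})]
          print(reduce(reducer, map(mapper, raw),
                       {"total_digits": 0, "n_events": 0}))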

  16. Spotlight-8 Image Analysis Software

    Science.gov (United States)

    Klimek, Robert; Wright, Ted

    2006-01-01

    Spotlight is a cross-platform GUI-based software package designed to perform image analysis on sequences of images generated by combustion and fluid physics experiments run in a microgravity environment. Spotlight can perform analysis on a single image in an interactive mode or perform analysis on a sequence of images in an automated fashion. Image processing operations can be employed to enhance the image before various statistics and measurement operations are performed. An arbitrarily large number of objects can be analyzed simultaneously with independent areas of interest. Spotlight saves results in a text file that can be imported into other programs for graphing or further analysis. Spotlight can be run on Microsoft Windows, Linux, and Apple OS X platforms.

  17. Fractal analysis reveals reduced complexity of retinal vessels in CADASIL.

    Directory of Open Access Journals (Sweden)

    Michele Cavallari

    Full Text Available The Cerebral Autosomal Dominant Arteriopathy with Subcortical Infarcts and Leukoencephalopathy (CADASIL) affects mainly small cerebral arteries and leads to disability and dementia. The relationship between clinical expression of the disease and progression of the microvessel pathology is, however, uncertain, as we lack tools for imaging brain vessels in vivo. Ophthalmoscopy is regarded as a window into the cerebral microcirculation. In this study we carried out an ophthalmoscopic examination in subjects with CADASIL. Specifically, we performed fractal analysis of digital retinal photographs. Data are expressed as mean fractal dimension (mean-D), a parameter that reflects the complexity of the retinal vessel branching. Ten subjects with a genetically confirmed diagnosis of CADASIL and 10 sex- and age-matched control subjects were enrolled. Fractal analysis of retinal digital images was performed by means of a computer-based program, and the data expressed as mean-D. Brain MRI lesion volume in FLAIR and T1-weighted images was assessed using MIPAV software. A paired t-test was used to disclose differences in mean-D between CADASIL and control groups. Spearman rank analysis was performed to evaluate potential associations between mean-D values and both disease duration and disease severity, the latter expressed as brain MRI lesion volumes, in the subjects with CADASIL. The results showed that the mean-D value of patients (1.42±0.05; mean±SD) was lower than that of controls (1.50±0.04; p = 0.002). Mean-D did not correlate with disease duration or with MRI lesion volumes of the subjects with CADASIL. The findings suggest that fractal analysis is a sensitive tool to assess changes of retinal vessel branching, likely reflecting early brain microvessel alterations, in CADASIL patients.
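
    The abstract does not state which estimator the computer-based program uses, but a common way to obtain a fractal dimension for a segmented vessel tree is box counting, where D is the slope of log(box count) against log(1/box size). A minimal Python sketch of that assumed estimator:

      import numpy as np

      def box_counting_dimension(binary_image, box_sizes=(2, 4, 8, 16, 32, 64)):
          """Estimate the box-counting (fractal) dimension of a binary vessel mask."""
          img = np.asarray(binary_image, dtype=bool)
          counts = []
          for size in box_sizes:
              # Trim so the image tiles exactly into size x size boxes.
              h = (img.shape[0] // size) * size
              w = (img.shape[1] // size) * size
              boxes = img[:h, :w].reshape(h // size, size, w // size, size)
              # Count boxes containing at least one vessel pixel.
              counts.append(boxes.any(axis=(1, 3)).sum())
          # D is the slope of log(count) versus log(1/size).
          slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
          return slope

      if __name__ == "__main__":
          demo = np.zeros((128, 128), dtype=bool)
          demo[64, :] = True        # a straight line should give D close to 1
          print(round(box_counting_dimension(demo), 2))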

  18. Pandora Operation and Analysis Software

    Science.gov (United States)

    Herman, Jay; Cede, Alexander; Abuhassan, Nader

    2012-01-01

    Pandora Operation and Analysis Software controls the Pandora Sun- and sky-pointing optical head and built-in filter wheels (neutral density, UV bandpass, polarization filters, and opaque). The software also controls the attached spectrometer exposure time and thermoelectric cooler to maintain the spectrometer temperature to within 1 °C. All functions are available through a GUI so as to be easily accessible by the user. The data are automatically stored on a miniature computer (netbook) for automatic download to a designated server at user-defined intervals (once per day, once per week, etc.), or to a USB external device. An additional software component reduces the raw data (spectrometer counts) to preliminary scientific products for quick-view purposes. The Pandora systems are built from off-the-shelf commercial parts and from mechanical parts machined using electronic machine shop drawings. The Pandora spectrometer system is designed to look at the Sun (tracking to within 0.1°), or to look at the sky at any zenith or azimuth angle, to gather information about the amount of trace gases or aerosols that are present.

  19. Software implementation and hardware acceleration of retinal vessel segmentation for diabetic retinopathy screening tests.

    Science.gov (United States)

    Cavinato, L; Fidone, I; Bacis, M; Del Sozzo, E; Durelli, G C; Santambrogio, M D

    2017-07-01

    Screening tests are an effective tool for the diagnosis and prevention of several diseases. Unfortunately, in order to produce an early diagnosis, the huge number of collected samples has to be processed faster than before. This issue particularly concerns image processing procedures, as they have a high computational complexity that is not satisfied by modern software architectures. To this end, Field Programmable Gate Arrays (FPGAs) can be used to accelerate the computation partially or entirely. In this work, we demonstrate that the use of FPGAs is suitable for biomedical applications by proposing a case study concerning the implementation of a vessel segmentation algorithm. The experimental results, computed on the DRIVE and STARE databases, show remarkable improvements in terms of both execution time and power efficiency (6X and 5.7X respectively) compared to the software implementation. Moreover, the proposed hardware approach outperforms literature works (3X speedup) without affecting the overall accuracy and sensitivity measures.

  20. ANALYSIS OF THE STIFFENER RING AND CONSTRUCTION OF THE HP FLARE KO DRUM VESSEL IN THE PUPUK KALTIM-5 PROJECT USING COMPRESS 6258 SOFTWARE

    Directory of Open Access Journals (Sweden)

    Fadhlika Ridha

    2015-02-01

    Full Text Available In the fertilizer production process at PKT-5, various hazardous waste gases are destroyed by burning them in a flare. Before being burned at the flare, these gases are routed to and collected in a pressure vessel, commonly called a High Pressure Flare Knock Out Drum. When designing its construction, an analysis is required so that the design of the vessel meets expectations and is safe to operate. This study was carried out by simulating the design of the KO Drum vessel using manual calculations according to the 2007 ASME BPVC Section VIII Division 1 and the Compress 6258 software. Calculations were performed for the head, shell, saddle, nozzle and stiffener ring designs, both manually and with the software, in order to determine the stresses that occur. The results of the manual and software calculations from the two methods were then compared.
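
    For orientation, the kind of manual shell check made against ASME BPVC Section VIII Division 1 can be sketched as below, using the UG-27 thin-cylinder formula for circumferential stress under internal pressure. The design pressure, radius, allowable stress and joint efficiency are placeholder values, not figures from the PKT-5 drum.

      def required_shell_thickness(p, r, s, e):
          """ASME VIII Div. 1, UG-27(c)(1): minimum shell thickness for internal
          pressure with circumferential stress governing (consistent units)."""
          return p * r / (s * e - 0.6 * p)

      def max_allowable_working_pressure(t, r, s, e):
          """Inverse of the same formula: allowable pressure for a given thickness."""
          return s * e * t / (r + 0.6 * t)

      if __name__ == "__main__":
          # Placeholder design data: 0.8 MPa, 900 mm inside radius,
          # 118 MPa allowable stress, joint efficiency 0.85.
          t_req = required_shell_thickness(p=0.8, r=900.0, s=118.0, e=0.85)
          print(f"required thickness (excluding corrosion allowance): {t_req:.1f} mm")
          print(f"MAWP at 10 mm: {max_allowable_working_pressure(10.0, 900.0, 118.0, 0.85):.2f} MPa")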

  1. Software Architecture Reliability Analysis using Failure Scenarios

    NARCIS (Netherlands)

    Tekinerdogan, B.; Sözer, Hasan; Aksit, Mehmet

    2005-01-01

    We propose a Software Architecture Reliability Analysis (SARA) approach that benefits from both reliability engineering and scenario-based software architecture analysis to provide an early reliability analysis of the software architecture. SARA makes use of failure scenarios that are prioritized

  2. Hygro-Thermo-Mechanical Analysis of a Reactor Vessel

    Directory of Open Access Journals (Sweden)

    Jaroslav Kruis

    2012-01-01

    Full Text Available Determining the durability of a reactor vessel requires a hygro-thermo-mechanical analysis of the vessel throughout its service life. Damage, prestress losses, distribution of heat and moisture and some other quantities are needed for a durability assessment. A coupled analysis was performed on a two-level model because of the huge demands on computer hardware. This paper deals with a hygro-thermo-mechanical analysis of a reactor vessel made of prestressed concrete with a steel inner liner. The reactor vessel is located in Temelín, Czech Republic.

  3. Interactive 3D Analysis of Blood Vessel Trees and Collateral Vessel Volumes in Magnetic Resonance Angiograms in the Mouse Ischemic Hindlimb Model.

    Science.gov (United States)

    Marks, Peter C; Preda, Marilena; Henderson, Terry; Liaw, Lucy; Lindner, Volkhard; Friesel, Robert E; Pinz, Ilka M

    2013-10-31

    The quantitative analysis of blood vessel volumes from magnetic resonance angiograms (MRA) or μCT images is difficult and time-consuming. This fact, when combined with a study that involves multiple scans of multiple subjects, can represent a significant portion of research time. In order to enhance analysis options and to provide an automated and fast analysis method, we developed a software plugin for the ImageJ and Fiji image processing frameworks that enables the quick and reproducible volume quantification of blood vessel segments. The novel plugin named Volume Calculator (VolCal), accepts any binary (thresholded) image and produces a three-dimensional schematic representation of the vasculature that can be directly manipulated by the investigator. Using MRAs of the mouse hindlimb ischemia model, we demonstrate quick and reproducible blood vessel volume calculations with 95 - 98% accuracy. In clinical settings this software may enhance image interpretation and the speed of data analysis and thus enhance intervention decisions for example in peripheral vascular disease or aneurysms. In summary, we provide a novel, fast and interactive quantification of blood vessel volumes for single blood vessels or sets of vessel segments with particular focus on collateral formation after an ischemic insult.
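
    The core calculation performed by such a plugin can be sketched simply: in a thresholded (binary) image stack, the volume of a vessel segment is the number of foreground voxels in that segment multiplied by the physical voxel volume. In the Python sketch below, connected-component labelling stands in for the interactive segment selection offered by VolCal, and the voxel size is a placeholder.

      import numpy as np
      from scipy.ndimage import label

      def segment_volumes(binary_stack, voxel_size_mm=(0.1, 0.05, 0.05)):
          """Volume (mm^3) of each connected vessel segment in a binary 3D stack."""
          labels, n_segments = label(np.asarray(binary_stack, dtype=bool))
          voxel_volume = float(np.prod(voxel_size_mm))
          return {seg: int((labels == seg).sum()) * voxel_volume
                  for seg in range(1, n_segments + 1)}

      if __name__ == "__main__":
          stack = np.zeros((4, 8, 8), dtype=bool)
          stack[:, 2, 2:6] = True            # a small synthetic "vessel"
          print(segment_volumes(stack))      # {1: volume in mm^3}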

  4. [Finite Element Analysis of Intravascular Stent Based on ANSYS Software].

    Science.gov (United States)

    Shi, Gengqiang; Song, Xiaobing

    2015-10-01

    This paper adopted UG8.0 to build the stent and blood vessel models. The models were then imported into the finite element analysis software ANSYS. The simulation results of the ANSYS software showed that after endothelial stent implantation, the velocity of the blood was low and the fluctuation of velocity was small, which meant the flow was relatively stable. When blood flowed through the endothelial stent, the pressure gradually became smaller, and the range of the pressure was not wide. The endothelial shear stress remained basically unchanged. In general, it can be concluded that endothelial stents have little impact on the flow of blood and can fully realize their function.

  5. Software architecture analysis of usability

    NARCIS (Netherlands)

    Folmer, Eelke

    2005-01-01

    One of the qualities that has received increased attention in recent decades is usability. A software product with poor usability is likely to fail in a highly competitive market; therefore software developing organizations are paying more and more attention to ensuring the usability of their

  6. Development of automatic reactor vessel inspection systems; development of data acquisition and analysis system for the nuclear vessel weld

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jong Po; Park, C. H.; Kim, H. T.; Noh, H. C.; Lee, J. M.; Kim, C. K.; Um, B. G. [Research Institute of KAITEC, Seoul (Korea)]

    2002-03-01

    The objective of this project is to develop an automated ultrasonic data acquisition and data analysis system to examine heavy vessel welds. In order to examine nuclear vessel welds, including the reactor pressure vessel (RPV), a huge amount of ultrasonic data from 6 channels should be processed on-line. In addition, the ultrasonic transducer scanning device should be remotely controlled, because the working place is a high radiation area. This kind of automated ultrasonic testing equipment has not been developed domestically yet. In order to develop an automated ultrasonic testing system, RPV ultrasonic testing equipment developed in foreign countries was investigated and the capability of high-speed ultrasonic signal processing hardware was analyzed. In this study, an ultrasonic signal processing system was designed, and ultrasonic data acquisition software was developed. The new systems were tested on the RPV welds of Ulchin Unit 6 to confirm their functions and capabilities. They worked very well as designed and the tests were successfully completed. 13 refs., 34 figs., 11 tabs. (Author)

  7. Next-Generation Bioacoustic Analysis Software

    Science.gov (United States)

    2015-09-30

    ... estimates are in one dimension (bearing), two (X-Y position), or three (X-Y-Z position), analysis software is necessary. Marine mammal acoustic data is ...

  8. Processing and analysis techniques involving in-vessel material generation

    Science.gov (United States)

    Schabron, John F [Laramie, WY]; Rovani, Jr., Joseph F.

    2011-01-25

    In at least one embodiment, the inventive technology relates to in-vessel generation of a material from a solution of interest as part of a processing and/or analysis operation. Preferred embodiments of the in-vessel material generation (e.g., in-vessel solid material generation) include precipitation; in certain embodiments, analysis and/or processing of the solution of interest may include dissolution of the material, perhaps as part of a successive dissolution protocol using solvents of increasing dissolving ability. Applications include, but are by no means limited to, estimation of a coking onset and solution (e.g., oil) fractionation.

  9. Design and analysis of multicavity prestressed concrete reactor vessels. [HTGR

    Energy Technology Data Exchange (ETDEWEB)

    Goodpasture, D.W.; Burdette, E.G.; Callahan, J.P.

    1977-01-01

    During the past 25 years, a rather rapid evolution has taken place in the design and use of prestressed concrete reactor vessels (PCRVs). Initially the concrete vessel served as a one-to-one replacement for its steel counterpart. This was followed by the development of the integral design which led eventually to the more recent multicavity vessel concept. Although this evolution has seen problems in construction and operation, a state-of-the-art review which was recently conducted by the Oak Ridge National Laboratory indicated that the PCRV has proven to be a satisfactory and inherently safe type of vessel for containment of gas-cooled reactors from a purely functional standpoint. However, functionalism is not the only consideration in a demanding and highly competitive industry. A summary is presented of the important considerations in the design and analysis of multicavity PCRVs together with overall conclusions concerning the state of the art of these vessels.

  10. Software architecture analysis of usability

    NARCIS (Netherlands)

    Folmer, E; van Gurp, J; Bosch, J; Bastide, R; Palanque, P; Roth, J

    2005-01-01

    Studies of software engineering projects show that a large number of usability related change requests are made after its deployment. Fixing usability problems during the later stages of development often proves to be costly, since many of the necessary changes require changes to the system that

  11. Numerical methods in software and analysis

    CERN Document Server

    Rice, John R

    1992-01-01

    Numerical Methods, Software, and Analysis, Second Edition introduces science and engineering students to the methods, tools, and ideas of numerical computation. Introductory courses in numerical methods face a fundamental problem-there is too little time to learn too much. This text solves that problem by using high-quality mathematical software. In fact, the objective of the text is to present scientific problem solving using standard mathematical software. This book discusses numerous programs and software packages focusing on the IMSL library (including the PROTRAN system) and ACM Algorithm

  12. The advent of failure analysis software technology

    Energy Technology Data Exchange (ETDEWEB)

    Henderson, C.L. [Sandia National Labs., Albuquerque, NM (United States)]; Barnard, R.D. [Schlumberger Technologies, San Jose, CA (United States)]

    1994-02-01

    The increasing complexity of integrated circuits demands that software tools, in addition to hardware tools, be used for successful diagnosis of failure. A series of customizable software tools have been developed that organize failure analysis information and provide expert level help to failure analysts to increase their productivity and success.

  13. Fractal Branching in Vascular Trees and Networks by VESsel GENeration Analysis (VESGEN)

    Science.gov (United States)

    Parsons-Wingerter, Patricia A.

    2016-01-01

    Vascular patterning offers an informative multi-scale, fractal readout of regulatory signaling by complex molecular pathways. Understanding such molecular crosstalk is important for physiological, pathological and therapeutic research in Space Biology and Astronaut countermeasures. When mapped out and quantified by NASA's innovative VESsel GENeration Analysis (VESGEN) software, remodeling vascular patterns become useful biomarkers that advance our understanding of the response of biology and human health to challenges such as microgravity and radiation in space environments.

  14. Analysis and Visualization of Nerve Vessel Contacts for Neurovascular Decompression

    Science.gov (United States)

    Süßmuth, Jochen; Piazza, Alexander; Enders, Frank; Naraghi, Ramin; Greiner, Günther; Hastreiter, Peter

    Neurovascular compression syndromes are caused by a pathological contact between cranial nerves and vascular structures at the surface of the brainstem. Aiming at improved pre-operative analysis of the target structures, we propose calculating distance fields to provide quantitative information of the important nerve-vessel contacts. Furthermore, we suggest reconstructing polygonal models for the nerves and vessels. Color-coding with the respective distance information is used for enhanced visualization. Overall, our new strategy contributes to a significantly improved clinical understanding.
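
    A minimal sketch of the distance computation underlying such a visualization: for each vertex of the nerve surface, find the distance to the nearest vessel vertex with a k-d tree, then map the distances to colours. The mesh data, contact threshold and colour scale below are illustrative assumptions, not the authors' implementation.

      import numpy as np
      from scipy.spatial import cKDTree

      def nerve_to_vessel_distances(nerve_vertices, vessel_vertices):
          """Distance from every nerve-surface vertex to the nearest vessel vertex."""
          tree = cKDTree(vessel_vertices)
          distances, _ = tree.query(nerve_vertices)
          return distances

      def distance_colors(distances, contact_threshold_mm=1.0):
          """Colour-code distances: red at a contact, fading to blue far away."""
          t = np.clip(distances / contact_threshold_mm, 0.0, 1.0)
          return np.stack([1.0 - t, np.zeros_like(t), t], axis=1)   # RGB in [0, 1]

      if __name__ == "__main__":
          nerve = np.random.rand(100, 3) * 10.0    # placeholder vertex coordinates (mm)
          vessel = np.random.rand(50, 3) * 10.0
          d = nerve_to_vessel_distances(nerve, vessel)
          print("closest nerve-vessel distance (mm):", round(float(d.min()), 2))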

  15. CADDIS Volume 4. Data Analysis: Download Software

    Science.gov (United States)

    Overview of the data analysis tools available for download on CADDIS. Provides instructions for downloading and installing CADStat, access to a Microsoft Excel macro for computing SSDs, and a brief overview of command-line use of R, a statistical software package.

  16. Software safety analysis practice in installation phase

    Energy Technology Data Exchange (ETDEWEB)

    Huang, H. W.; Chen, M. H.; Shyu, S. S., E-mail: hwhwang@iner.gov.t [Institute of Nuclear Energy Research, No. 1000 Wenhua Road, Chiaan Village, Longtan Township, 32546 Taoyuan County, Taiwan (China)

    2010-10-15

    This work performed a software safety analysis in the installation phase of the Lungmen nuclear power plant in Taiwan, in cooperation between the Institute of Nuclear Energy Research and TPC. The US Nuclear Regulatory Commission requests licensees to perform software safety analysis and software verification and validation in each phase of the software development life cycle, per Branch Technical Position 7-14. In this work, 37 safety-grade digital instrumentation and control systems were analyzed by failure mode and effects analysis, as suggested by IEEE standard 7-4.3.2-2003. During the installation phase, skew tests for the safety-grade network and point-to-point tests were performed. The failure mode and effects analysis showed that all single failure modes can be resolved by redundant means. Most of the common mode failures can be resolved by operator manual actions. (Author)

  17. Computer-assisted qualitative data analysis software.

    Science.gov (United States)

    Cope, Diane G

    2014-05-01

    Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.

  18. Aspects regarding analysis of the work deck from a support vessel

    Science.gov (United States)

    Axinte, T.; Nutu, C.; Stanca, C.; Cupsa, O.; Carp, A.

    2016-08-01

    The authors present an analysis of the work deck of a support vessel, a ship whose structure includes, among other things, deck cranes and a helicopter deck. The work deck is one of the most important parts of the support vessel's hull. The paper begins by presenting the role and importance of this type of support vessel, using an original execution drawing produced with the Unigraphics NX 8.0 software from Siemens. The shear, normal and von Mises stresses in the work deck are then determined using the finite element method. From these stresses, the fatigue life, the strength safety factor and the fatigue safety factor are assessed. Fatigue is evaluated using a loading pattern with a full unit cycle only, while the strength safety factor is based on the ultimate strength criterion applied to the von Mises stress from the failure theories.
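
    The quantities named above can be recovered from finite-element stress output with a few lines. The sketch below computes the von Mises equivalent stress from the principal stresses and expresses the two safety factors as simple ratios; the stress values, ultimate strength and endurance limit are placeholders, not data for the support vessel's work deck.

      import math

      def von_mises(s1, s2, s3):
          """Von Mises equivalent stress from the three principal stresses (MPa)."""
          return math.sqrt(0.5 * ((s1 - s2) ** 2 + (s2 - s3) ** 2 + (s3 - s1) ** 2))

      def strength_safety_factor(sigma_vm, ultimate_strength):
          """Static safety factor against the ultimate strength criterion."""
          return ultimate_strength / sigma_vm

      def fatigue_safety_factor(stress_amplitude, endurance_limit):
          """Fatigue safety factor for a fully reversed (full unit) cycle."""
          return endurance_limit / stress_amplitude

      if __name__ == "__main__":
          sigma_vm = von_mises(120.0, 40.0, -10.0)      # placeholder principal stresses
          print(f"von Mises stress: {sigma_vm:.1f} MPa")
          print(f"strength safety factor: {strength_safety_factor(sigma_vm, 400.0):.2f}")
          print(f"fatigue safety factor:  {fatigue_safety_factor(sigma_vm, 180.0):.2f}")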

  19. On-line monitoring and analysis of reactor vessel integrity

    Energy Technology Data Exchange (ETDEWEB)

    Ackerson, D.S.; Impink, A.J. Jr.; Balkey, K.R.; Andreychek, T.S.

    1989-01-31

    A method is described for on-line monitoring and analysis of nuclear reactor pressure vessel integrity in a unit in which reactor coolant is circulated along the inner wall of the pressure vessel, the method comprising the steps of: generating, on an on-line basis, temperature signals representative of the temperature of the reactor coolant circulating along the inner wall of the pressure vessel; generating, on an on-line basis, a pressure signal representative of the reactor coolant pressure; generating a signal representative of the fast neutron fluence to which the reactor pressure vessel has been subjected; generating, as a function of the fluence signal, a visual representation of the actual real-time reference nil-ductility transition temperature (RTNDT) across the entire pressure vessel wall thickness at a preselected critical location in the wall; generating, as a function of transients in the reactor coolant temperature and pressure signals, a visual representation of the real-time required RTNDT across the entire pressure vessel wall thickness at the selected critical location, the required RTNDT being the RTNDT that would be required in the pressure vessel wall for flaw initiation to occur as a result of stresses set up by the transients; and superimposing the visual representations of the real-time actual and required RTNDT for flaw initiation across the entire pressure vessel wall thickness for the selected critical location to generate a visual representation of the difference in value between the actual and required RTNDT, presented as an RTNDT margin.

  20. Introducing a New Software for Geodetic Analysis

    Science.gov (United States)

    Hjelle, G. A.; Dähnn, M.; Fausk, I.; Kirkvik, A. S.; Mysen, E.

    2016-12-01

    At the Norwegian Mapping Authority, we are currently developing Where, a new software for geodetic analysis. Where is built on our experiences with the Geosat software, and will be able to analyse and combine data from VLBI, SLR, GNSS and DORIS. The software is mainly written in Python which has proved very fruitful. The code is quick to write and the architecture is easily extendable and maintainable. The Python community provides a rich eco-system of tools for doing data-analysis, including effective data storage and powerful visualization. Python interfaces well with other languages so that we can easily reuse existing, well-tested code like the SOFA and IERS libraries. This presentation will show some of the current capabilities of Where, including benchmarks against other software packages. In addition we will report on some simple investigations we have done using the software, and outline our plans for further progress.

  1. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed a software to help manage the large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; analysis core photos and images; waveforms and NMR; and external files documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed a software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board. The product includes an emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software that features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In

  2. Introducing a New Software for Geodetic Analysis

    Science.gov (United States)

    Hjelle, Geir Arne; Dähnn, Michael; Fausk, Ingrid; Kirkvik, Ann-Silje; Mysen, Eirik

    2017-04-01

    At the Norwegian Mapping Authority, we are currently developing Where, a new software for geodetic analysis. Where is built on our experiences with the Geosat software, and will be able to analyse and combine data from VLBI, SLR, GNSS and DORIS. The software is mainly written in Python which has proved very fruitful. The code is quick to write and the architecture is easily extendable and maintainable, while at the same time taking advantage of well-tested code like the SOFA and IERS libraries. This presentation will show some of the current capabilities of Where, including benchmarks against other software packages, and outline our plans for further progress. In addition we will report on some investigations we have done experimenting with alternative weighting strategies for VLBI.

  3. Software for Graph Analysis and Visualization

    Directory of Open Access Journals (Sweden)

    M. I. Kolomeychenko

    2014-01-01

    Full Text Available This paper describes software for graph storage, analysis and visualization. The article presents a comparative analysis of existing software for analysis and visualization of graphs, and describes the overall architecture of the application and the basic principles of construction and operation of its main modules. Furthermore, a description of the developed graph store, oriented to the storage and processing of large-scale graphs, is presented. The developed community-detection algorithm and the implemented automatic graph layout algorithms are the main functionality of the product. The main advantage of the developed software is high-speed processing of large networks (up to millions of nodes and links). Moreover, the proposed graph storage architecture is unique and has no analogues. The developed approaches and algorithms are optimized for operating with big graphs and have high productivity.
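
    The article does not specify which community-finding algorithm the product implements; as a stand-in illustration, the sketch below partitions a small graph by greedy modularity maximisation using NetworkX.

      import networkx as nx
      from networkx.algorithms.community import greedy_modularity_communities

      def find_communities(edges):
          """Partition the nodes of an undirected graph into communities."""
          graph = nx.Graph(edges)
          return [sorted(c) for c in greedy_modularity_communities(graph)]

      if __name__ == "__main__":
          # Two triangles joined by a single bridge edge.
          edges = [(1, 2), (2, 3), (1, 3), (4, 5), (5, 6), (4, 6), (3, 4)]
          print(find_communities(edges))   # expected roughly [[1, 2, 3], [4, 5, 6]]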

  4. Power and performance software analysis and optimization

    CERN Document Server

    Kukunas, Jim

    2015-01-01

    Power and Performance: Software Analysis and Optimization is a guide to solving performance problems in modern Linux systems. Power-efficient chips are no help if the software those chips run on is inefficient. Starting with the necessary architectural background as a foundation, the book demonstrates the proper usage of performance analysis tools in order to pinpoint the cause of performance problems, and includes best practices for handling common performance issues those tools identify. Provides expert perspective from a key member of Intel's optimization team on how processors and memory

  5. Modelling and Evaluating Software Project Risks with Quantitative Analysis Techniques in Planning Software Development

    OpenAIRE

    Elzamly, Abdelrafe; Hussin, Burairah

    2015-01-01

    Risk is not always avoidable, but it is controllable. The aim of this paper is to present new techniques which use stepwise regression analysis to model and evaluate the risks in planning software development and reducing risk with software process improvement. The top ten software risk factors in the planning software development phase and thirty control factors were presented to respondents. This study incorporates a risk management approach and planning software development to mitigate software p...

  6. Software design and technical considerations for the calibration of gas process vessels

    Energy Technology Data Exchange (ETDEWEB)

    Holt, S.H.

    1991-12-31

    In a new facility at the Savannah River Site (SRS), the volumes of 35 vessels were determined. A literature search was made to determine an appropriate calibration method. When no practical non-laboratory methods providing the targeted uncertainty were found, an innovative approach was developed using the Gas Law principles and a portable, computerized data acquisition system. These vessels were calibrated in their final installed configuration. Each tank was calibrated in less than 24 hours. The system comprised a portable computer, a pressure sensor, a resistance temperature device temperature sensor, bottled nitrogen, and a mass comparator. This paper will describe the development of the calibration program.
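
    The Gas Law approach outlined above can be illustrated with the ideal gas equation: a weighed charge of nitrogen is admitted to the evacuated vessel, the equilibrium pressure and temperature are recorded, and the volume follows from V = mRT/(MP). The sketch below is a simplified illustration with placeholder readings; the actual SRS procedure included data acquisition, corrections and uncertainty treatment not shown here.

      R = 8.314462618          # J/(mol*K), universal gas constant
      M_N2 = 0.0280134         # kg/mol, molar mass of nitrogen

      def vessel_volume(mass_kg, pressure_pa, temperature_k):
          """Ideal-gas estimate of vessel volume from an admitted nitrogen charge."""
          moles = mass_kg / M_N2
          return moles * R * temperature_k / pressure_pa     # cubic metres

      if __name__ == "__main__":
          # Placeholder readings: 1.250 kg of N2, 180 kPa absolute, 296.15 K.
          v = vessel_volume(mass_kg=1.250, pressure_pa=180_000.0, temperature_k=296.15)
          print(f"estimated vessel volume: {v:.3f} m^3")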

  7. Software for computerised analysis of cardiotocographic traces.

    Science.gov (United States)

    Romano, M; Bifulco, P; Ruffo, M; Improta, G; Clemente, F; Cesarelli, M

    2016-02-01

    Despite the widespread use of cardiotocography in foetal monitoring, the evaluation of foetal status suffers from considerable inter- and intra-observer variability. In order to overcome the main limitations of visual cardiotocographic assessment, computerised methods to analyse cardiotocographic recordings have recently been developed. In this study, a new software for automated analysis of foetal heart rate is presented. It allows an automatic procedure for measuring the most relevant parameters derivable from cardiotocographic traces. Simulated and real cardiotocographic traces were analysed to test software reliability. In artificial traces, we simulated a set number of events (accelerations, decelerations and contractions) to be recognised. In the case of real signals, instead, the results of the computerised analysis were compared with the visual assessment performed by 18 expert clinicians, and three performance indexes were computed to gain information about the performance of the proposed software. The software showed preliminary performance we judged satisfactory, in that the results completely matched the requirements, as proved by tests on artificial signals in which all simulated events were detected by the software. Performance indexes computed in comparison with the obstetricians' evaluations are, on the contrary, not so satisfactory; in fact, they yielded the following values of the statistical parameters: sensitivity equal to 93%, positive predictive value equal to 82% and accuracy equal to 77%. Very probably this arises from the high variability of trace annotation carried out by clinicians. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
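
    The three performance indexes reported above are simple functions of the counts of matched events. A minimal sketch, assuming the detected events have already been matched against the clinicians' annotations (the counts below are placeholders, not the study's data):

      def performance_indexes(tp, fp, fn, tn):
          """Sensitivity, positive predictive value and accuracy from event counts."""
          sensitivity = tp / (tp + fn)
          ppv = tp / (tp + fp)
          accuracy = (tp + tn) / (tp + fp + fn + tn)
          return sensitivity, ppv, accuracy

      if __name__ == "__main__":
          se, ppv, acc = performance_indexes(tp=93, fp=20, fn=7, tn=50)
          print(f"sensitivity {se:.2f}, PPV {ppv:.2f}, accuracy {acc:.2f}")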

  8. Objective facial photograph analysis using imaging software.

    Science.gov (United States)

    Pham, Annette M; Tollefson, Travis T

    2010-05-01

    Facial analysis is an integral part of the surgical planning process. Clinical photography has long been an invaluable tool in the surgeon's practice not only for accurate facial analysis but also for enhancing communication between the patient and surgeon, for evaluating postoperative results, for medicolegal documentation, and for educational and teaching opportunities. From 35-mm slide film to the digital technology of today, clinical photography has benefited greatly from technological advances. With the development of computer imaging software, objective facial analysis becomes easier to perform and less time consuming. Thus, while the original purpose of facial analysis remains the same, the process becomes much more efficient and allows for some objectivity. Although clinical judgment and artistry of technique is never compromised, the ability to perform objective facial photograph analysis using imaging software may become the standard in facial plastic surgery practices in the future. Copyright 2010 Elsevier Inc. All rights reserved.

  9. Intraprocedural Dataflow Analysis for Software Product Lines

    DEFF Research Database (Denmark)

    Brabrand, Claus; Ribeiro, Márcio; Tolêdo, Társis

    2013-01-01

    Software product lines (SPLs) developed using annotative approaches such as conditional compilation come with an inherent risk of constructing erroneous products. For this reason, it is essential to be able to analyze such SPLs. However, as dataflow analysis techniques are not able to deal with SPLs...

  10. Software for analysis of visual meteor data

    Science.gov (United States)

    Veljković, Kristina; Ivanović, Ilija

    2014-02-01

    In this paper, we will present new software for analysis of IMO data collected from visual observations. The software consists of a package of functions written in the statistical programming language R, as well as a Java application which uses these functions in a user friendly environment. R code contains various filters for selection of data, methods for calculation of Zenithal Hourly Rate (ZHR), solar longitude, population index and graphical representation of ZHR and distribution of observed magnitudes. The Java application allows everyone to use these functions without any knowledge of R. Both R code and the Java application are open source and free with user manuals and examples provided.
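
    For reference, the Zenithal Hourly Rate mentioned above is commonly computed as ZHR = N * F * r^(6.5 - lm) / (Teff * sin(hR)), where N is the number of meteors seen, F corrects for field obstruction, r is the population index, lm the limiting magnitude, Teff the effective observing time in hours and hR the radiant elevation. The Python sketch below illustrates that formula; it is not the authors' R implementation.

      import math

      def zhr(meteor_count, t_eff_hours, limiting_magnitude, radiant_elevation_deg,
              population_index=2.5, obstruction_fraction=0.0):
          """Zenithal Hourly Rate for a single visual observing interval."""
          field_correction = 1.0 / (1.0 - obstruction_fraction)
          magnitude_correction = population_index ** (6.5 - limiting_magnitude)
          elevation_correction = 1.0 / math.sin(math.radians(radiant_elevation_deg))
          return (meteor_count * field_correction * magnitude_correction
                  * elevation_correction / t_eff_hours)

      if __name__ == "__main__":
          # Placeholder interval: 12 meteors in 1.5 h, lm = 6.2, radiant at 45 degrees.
          print(round(zhr(12, 1.5, 6.2, 45.0), 1))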

  11. Improving Software Systems By Flow Control Analysis

    Directory of Open Access Journals (Sweden)

    Piotr Poznanski

    2012-01-01

    Full Text Available Using agile methods during the implementation of a system that meets mission-critical requirements can be a real challenge. A change in a system built of dozens or even hundreds of specialized devices with embedded software requires the cooperation of a large group of engineers. This article presents a solution that supports the parallel work of groups of system analysts and software developers. Applying formal rules to requirements written in natural language enables formal analysis of artifacts that form a bridge between software and system requirements. The formalism and textual form of the requirements allowed the automatic generation of a message flow graph for the (sub)system, called the "big-picture-model". Flow diagram analysis helped to avoid a large number of defects whose repair cost in extreme cases could undermine the legitimacy of agile methods in projects of this scale. Retrospectively, a reduction of technical debt was observed. Continuous analysis of the "big picture model" improves the control of the quality parameters of the software architecture. The article also tries to explain why a commercial platform based on the UML modeling language may not be sufficient in projects of this complexity.

  12. Integral experiments on in-vessel coolability and vessel creep: results and analysis of the FOREVER-C1 test

    Energy Technology Data Exchange (ETDEWEB)

    Sehgal, B.R.; Nourgaliev, R.R.; Dinh, T.N.; Karbojian, A. [Division of Nuclear Power Safety, Royal Institute of Technology, Drottning Kristinas Vaeg., Stockholm (Sweden)

    1999-07-01

    This paper describes the FOREVER (Failure Of REactor VEssel Retention) experimental program, which is currently underway at the Division of Nuclear Power Safety, Royal Institute of Technology (RIT/NPS). The objectives of the FOREVER experiments are to obtain data and develop validated models (i) on the melt coolability process inside the vessel, in the presence of water (in particular, on the efficacy of the postulated gap cooling to preclude vessel failure); and (ii) on the lower head failure due to the creep process in the absence of water inside and/or outside the lower head. The paper presents the experimental results and analysis of the first FOREVER-C1 test. During this experiment, the 1/10th scale pressure vessel, heated to about 900°C and pressurized to 26 bars, was subjected to creep deformation in a non-stop 24-hour test. The vessel wall displacement data clearly shows different stages of the vessel deformation due to thermal expansion, elastic, plastic and creep processes. The maximum displacement was observed at the lowermost region of the vessel lower plenum. Information on the FOREVER-C1 measured thermal characteristics and analysis of the observed thermal and structural behavior is presented. The coupled nature of thermal and mechanical processes, as well as the effect of other system conditions (such as depressurization) on the melt pool and vessel temperature responses are analyzed. (author)

  13. Automating risk analysis of software design models.

    Science.gov (United States)

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
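
    The identification and mitigation trees are only described at a high level in the abstract; the following Python sketch shows how a tree of threat-identification rules might be walked over an element of a data flow diagram. The class, rule and field names are hypothetical and are not taken from AutSEC.

```python
from dataclasses import dataclass, field

@dataclass
class ThreatNode:
    """One node of an identification tree: a predicate over a DFD element plus sub-checks."""
    name: str
    predicate: callable          # element (dict) -> bool
    children: list = field(default_factory=list)

def identify(node, element, path=()):
    """Walk the tree; every satisfied leaf is reported as a candidate threat for this element."""
    if not node.predicate(element):
        return []
    path = path + (node.name,)
    if not node.children:
        return [path]
    threats = []
    for child in node.children:
        threats.extend(identify(child, element, path))
    return threats

# Hypothetical rule: an unencrypted data flow crossing a trust boundary risks eavesdropping.
tree = ThreatNode("data flow", lambda e: e["kind"] == "flow",
                  [ThreatNode("crosses trust boundary", lambda e: e["crosses_boundary"],
                              [ThreatNode("unencrypted -> eavesdropping risk",
                                          lambda e: not e["encrypted"])])])

flow = {"kind": "flow", "crosses_boundary": True, "encrypted": False}
print(identify(tree, flow))
```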

  14. Automating Risk Analysis of Software Design Models

    Directory of Open Access Journals (Sweden)

    Maxime Frydman

    2014-01-01

    Full Text Available The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  15. Gene expression analysis in human breast cancer associated blood vessels.

    Directory of Open Access Journals (Sweden)

    Dylan T Jones

    Full Text Available Angiogenesis is essential for solid tumour growth, whilst the molecular profiles of tumour blood vessels have been reported to differ between cancer types. Although presently available anti-angiogenic strategies are providing some promise for the treatment of some cancers, it is perhaps not surprising that none of the available anti-angiogenic agents work on all tumours. Thus, the discovery of novel anti-angiogenic targets, relevant to individual cancer types, is required. Using Affymetrix microarray analysis of laser-captured, CD31-positive blood vessels we have identified 63 genes that are upregulated significantly (5-72 fold) in angiogenic blood vessels associated with human invasive ductal carcinoma (IDC) of the breast as compared with blood vessels in normal human breast. We tested the angiogenic capacity of a subset of these genes. Genes were selected based on either their known cellular functions, their enriched expression in endothelial cells and/or their sensitivity to anti-VEGF treatment, all features implicating their involvement in angiogenesis. For example, RRM2, a ribonucleotide reductase involved in DNA synthesis, was upregulated 32-fold in IDC-associated blood vessels; ATF1, a nuclear activating transcription factor involved in cellular growth and survival, was upregulated 23-fold in IDC-associated blood vessels; and HEX-B, a hexosaminidase involved in the breakdown of GM2 gangliosides, was upregulated 8-fold in IDC-associated blood vessels. Furthermore, in silico analysis confirmed that ATF1 and HEX-B also were enriched in endothelial cells when compared with non-endothelial cells. None of these genes have been reported previously to be involved in neovascularisation. However, our data establish that siRNA depletion of Rrm2, Atf1 or Hex-B had significant anti-angiogenic effects in VEGF-stimulated ex vivo mouse aortic ring assays. Overall, our results provide proof-of-principle that our approach can identify a cohort of

  16. Software protocol design: Communication and control in a multi-task robot machine for ITER vacuum vessel assembly and maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Li, Ming, E-mail: ming.li@lut.fi [Laboratory of Intelligent Machines, Lappeenranta University of Technology (Finland); Wu, Huapeng; Handroos, Heikki [Laboratory of Intelligent Machines, Lappeenranta University of Technology (Finland); Yang, Guangyou [School of Mechanical Engineering, Hubei University of Technology, Wuhan (China); Wang, Yongbo [Laboratory of Intelligent Machines, Lappeenranta University of Technology (Finland)

    2015-10-15

    applying these protocols, the software for a multi-task robot machine that is used for ITER vacuum vessel assembly and maintenance has been developed and it is demonstrated that machining tasks of the robot machine, such as milling, drilling, welding etc., can be implemented in both an individual and composite way.

  17. Circular cylinders and pressure vessels stress analysis and design

    CERN Document Server

    Vullo, Vincenzo

    2014-01-01

    This book provides comprehensive coverage of stress and strain analysis of circular cylinders and pressure vessels, one of the classic topics of machine design theory and methodology. Whereas other books offer only a partial treatment of the subject and frequently consider stress analysis solely in the elastic field, Circular Cylinders and Pressure Vessels broadens the design horizons, analyzing theoretically what happens at pressures that stress the material beyond its yield point and at thermal loads that give rise to creep. The consideration of both traditional and advanced topics ensures that the book will be of value for a broad spectrum of readers, including students in postgraduate and doctoral programs as well as established researchers and design engineers. The relations provided will serve as a sound basis for the design of products that are safe, technologically sophisticated, and compliant with standards and codes and for the development of innovative applications.

  18. Finite element analysis of filament-wound composite pressure vessel under internal pressure

    Science.gov (United States)

    Sulaiman, S.; Borazjani, S.; Tang, S. H.

    2013-12-01

    In this study, finite element analysis (FEA) of a composite overwrapped pressure vessel (COPV), using the commercial software ABAQUS 6.12, was performed. The study deals with the simulation of an aluminum pressure vessel overwrapped with carbon/epoxy fiber-reinforced polymer (CFRP). The finite element method (FEM) was utilized to investigate the effects of winding angle on the filament-wound pressure vessel. Burst pressure, maximum shell displacement and the optimum winding angle of the composite vessel under pure internal pressure were determined. The laminae were oriented asymmetrically in [0°,0°]s, [15°,-15°]s, [30°,-30°]s, [45°,-45°]s, [55°,-55°]s, [60°,-60°]s, [75°,-75°]s, and [90°,-90°]s configurations. An exact elastic solution along with the Tsai-Wu, Tsai-Hill and maximum stress failure criteria were employed for analyzing the data. The investigation showed that the optimum occurs at a 55° winding angle. Results were compared with experimental ones and there was good agreement between them.
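
    As a minimal sketch of one of the failure criteria named above, the snippet below evaluates the Tsai-Wu failure index for a single ply from its in-plane stresses. The strength values are placeholders rather than properties of the CFRP studied in the paper.

```python
import math

def tsai_wu_index(s1, s2, t12, Xt, Xc, Yt, Yc, S):
    """Tsai-Wu failure index for a ply (failure is predicted when the index >= 1).

    s1, s2, t12 : longitudinal, transverse and in-plane shear ply stresses (MPa)
    Xt, Xc, Yt, Yc, S : tensile/compressive strength magnitudes and shear strength (MPa)
    """
    F1, F2 = 1/Xt - 1/Xc, 1/Yt - 1/Yc
    F11, F22, F66 = 1/(Xt*Xc), 1/(Yt*Yc), 1/S**2
    F12 = -0.5 * math.sqrt(F11 * F22)          # a common default for the interaction term
    return F1*s1 + F2*s2 + F11*s1**2 + F22*s2**2 + F66*t12**2 + 2*F12*s1*s2

# Placeholder strengths (MPa) and a trial ply stress state
print(round(tsai_wu_index(s1=900, s2=20, t12=40, Xt=1500, Xc=1200, Yt=50, Yc=200, S=70), 3))
```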

  19. Thermal and structural analysis of a filter vessel ceramic tubesheet

    Energy Technology Data Exchange (ETDEWEB)

    Mallett, R.H. [Mallett Technology, Inc., Research Triangle Park, NC (United States); Swindeman, R.W. [Oak Ridge National Lab., TN (United States); Zievers, J.F. [Industrial Filter & Pump Mfg. Co., Cicero, IL (United States)

    1995-08-01

    A ceramic tubesheet assembly for a hot gas filter vessel is analyzed using the finite element method to determine stresses under differential pressure loading. The stresses include local concentration effects. Selection of the stress measures for evaluation of structural integrity is discussed. Specification of stress limits based upon limited data is considered. Stress results from this ongoing design analysis technology project are shown for one design concept.

  20. IUE Data Analysis Software for Personal Computers

    Science.gov (United States)

    Thompson, R.; Caplinger, J.; Taylor, L.; Lawton , P.

    1996-01-01

    This report summarizes the work performed for the program titled, "IUE Data Analysis Software for Personal Computers" awarded under Astrophysics Data Program NRA 92-OSSA-15. The work performed was completed over a 2-year period starting in April 1994. As a result of the project, 450 IDL routines and eight database tables are now available for distribution for Power Macintosh computers and Personal Computers running Windows 3.1.

  1. Trabeculectomy Improves Vessel Response Measured by Dynamic Vessel Analysis (DVA) in Glaucoma Patients.

    Science.gov (United States)

    Selbach, J. Michael; Schallenberg, Maurice; Kramer, Sebastian; Anastassiou, Gerasimos; Steuhl, Klaus-Peter; Vilser, Walthard; Kremmer, Stephan

    2014-01-01

    To determine the effects of surgical IOP reduction (trabeculectomy) on retinal blood flow parameters in glaucoma patients using Dynamic Vessel Analysis (DVA). 26 eyes of 26 patients with progressive primary open-angle glaucoma (POAG) despite maximal topical therapy were examined before and after trabeculectomy. The responses of the retinal vessels to flickering light provocation were measured with DVA the day before surgery and 4 to 6 weeks after trabeculectomy. Between 3 and 4 weeks before surgery, all local therapies were stopped and systemic therapy with acetazolamide together with preservative-free topical steroid eye drops was started. In 19 patients (73%), an inadequate response to the flicker stimulation was measured preoperatively. In these patients, the maximum dilation of arteries and veins was reduced significantly as compared to healthy eyes. In this group, the maximum dilation of the arteries following the flicker provocation improved from 1.4% before to 3.8% following trabeculectomy (p<0.01). In retinal veins, this parameter increased from 3.1% to 4.6% (p<0.05). In the 7 patients whose arterial and venous reactions to flickering light provocation did not differ preoperatively from healthy eyes, there was no significant change after surgery. The initial baseline values of arteries and veins (MU) did not deviate significantly in either group. POAG patients with progressive disease and impaired vascular regulation benefit from IOP-lowering trabeculectomy in terms of vascular reactivity and dilative reserve, indicating a possible improvement of retinal perfusion following effective IOP control. Future studies with long-term follow-up must determine the clinical importance of these findings for the treatment of glaucoma patients.

  2. Software For Multivariable Frequency-Domain Analysis

    Science.gov (United States)

    Armstrong, Ernest S.; Giesy, Daniel P.

    1991-01-01

    FREQ (Multivariable Frequency Domain Singular Value Analysis Package) software package of subroutines performing frequency-domain analysis of: continuous- or discrete-multivariable linear systems; any continuous system for which one calculates transfer matrix at points on imaginary axis; or any discrete system for which one calculates transfer matrix at points on unit circle. Four different versions available. Single-precision brief version LAR-14119, single-precision complete version LAR-14120, double-precision brief version LAR-14121, and double-precision complete version LAR-14122. Written in ANSI standard FORTRAN 77.

  3. Generalized Support Software: Domain Analysis and Implementation

    Science.gov (United States)

    Stark, Mike; Seidewitz, Ed

    1995-01-01

    For the past five years, the Flight Dynamics Division (FDD) at NASA's Goddard Space Flight Center has been carrying out a detailed domain analysis effort and is now beginning to implement Generalized Support Software (GSS) based on this analysis. GSS is part of the larger Flight Dynamics Distributed System (FDDS), and is designed to run under the FDDS User Interface / Executive (UIX). The FDD is transitioning from a mainframe based environment to systems running on engineering workstations. The GSS will be a library of highly reusable components that may be configured within the standard FDDS architecture to quickly produce low-cost satellite ground support systems. The estimate for the first release is that this library will contain approximately 200,000 lines of code. The main driver for developing generalized software is development cost and schedule improvement. The goal is to ultimately have at least 80 percent of all software required for a spacecraft mission (within the domain supported by the GSS) configured from the generalized components.

  4. Computer imaging software for profile photograph analysis.

    Science.gov (United States)

    Tollefson, Travis T; Sykes, Jonathan M

    2007-01-01

    To describe a novel calibration technique for photographs of different sizes and to test a new method of chin evaluation in relation to established analysis measurements. A photograph analysis and medical record review of 14 patients who underwent combined rhinoplasty and chin correction at an academic center. Patients undergoing concurrent orthognathic surgery, rhytidectomy, or submental liposuction were excluded. Preoperative and postoperative digital photographs were analyzed using computer imaging software with a new method, the soft tissue porion to pogonion distance, and with established measurements, including the cervicomental angle, the mentocervical angle, and the facial convexity angle. The porion to pogonion distance consistently increased after the chin correction procedure (more in the osseous group). All photograph angle measurements changed toward the established normal range postoperatively. Surgery for facial disharmony requires artistic judgment and objective evaluation. Although 3-dimensional video analysis of the face seems promising, its clinical use is limited by cost. For surgeons who use computer imaging software, analysis of profile photographs is the most valuable tool. Even when preoperative and postoperative photographs are of different sizes, relative distance comparisons are possible with a new calibration technique using the constant facial landmarks, the porion and the pupil. The porion-pogonion distance is a simple reproducible measurement that can be used along with established soft tissue measurements as a guide for profile facial analysis.
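
    The calibration idea, expressing measurements in units of a fixed landmark pair so that photographs of different sizes can be compared, can be sketched in a few lines of Python; the landmark coordinates below are illustrative only, and the use of the porion-pupil pair as the reference follows the description above.

```python
import math

def dist(a, b):
    """Euclidean distance between two (x, y) points in pixels."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def calibrated_distance(p1, p2, ref_a, ref_b):
    """Distance p1-p2 expressed in units of a reference landmark distance (e.g. porion-pupil),
    so that pre- and postoperative photographs of different scales can be compared."""
    return dist(p1, p2) / dist(ref_a, ref_b)

# Hypothetical pixel coordinates on a profile photograph
porion, pupil, pogonion = (120, 310), (260, 300), (430, 620)
print(round(calibrated_distance(porion, pogonion, porion, pupil), 3))
```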

  5. VESsel GENeration Analysis (VESGEN): Innovative Vascular Mappings for Astronaut Exploration Health Risks and Human Terrestrial Medicine

    Science.gov (United States)

    Parsons-Wingerter, Patricia; Kao, David; Valizadegan, Hamed; Martin, Rodney; Murray, Matthew C.; Ramesh, Sneha; Sekaran, Srinivaas

    2017-01-01

    Currently, astronauts face significant health risks in future long-duration exploration missions such as colonizing the Moon and traveling to Mars. Numerous risks include greatly increased radiation exposures beyond the low earth orbit (LEO) of the ISS, and visual and ocular impairments in response to microgravity environments. The cardiovascular system is a key mediator in human physiological responses to radiation and microgravity. Moreover, blood vessels are necessarily involved in the progression and treatment of vascular-dependent terrestrial diseases such as cancer, coronary vessel disease, wound-healing, reproductive disorders, and diabetes. NASA developed an innovative, globally requested beta-level software, VESsel GENeration Analysis (VESGEN) to map and quantify vascular remodeling for application to astronaut and terrestrial health challenges. VESGEN mappings of branching vascular trees and networks are based on a weighted multi-parametric analysis derived from vascular physiological branching rules. Complex vascular branching patterns are determined by biological signaling mechanisms together with the fluid mechanics of multi-phase laminar blood flow.

  6. Specdata: Automated Analysis Software for Broadband Spectra

    Science.gov (United States)

    Oliveira, Jasmine N.; Martin-Drumel, Marie-Aline; McCarthy, Michael C.

    2017-06-01

    With the advancement of chirped-pulse techniques, broadband rotational spectra with a few tens to several hundred GHz of spectral coverage are now routinely recorded. When studying multi-component mixtures that might result, for example, with the use of an electrical discharge, lines of new chemical species are often obscured by those of known compounds, and analysis can be laborious. To address this issue, we have developed SPECdata, an open source, interactive tool which is designed to simplify and greatly accelerate the spectral analysis and discovery. Our software tool combines both automated and manual components that free the user from computation, while giving him/her considerable flexibility to assign, manipulate, interpret and export their analysis. The automated - and key - component of the new software is a database query system that rapidly assigns transitions of known species in an experimental spectrum. For each experiment, the software identifies spectral features, and subsequently assigns them to known molecules within an in-house database (Pickett .cat files, list of frequencies...), or those catalogued in Splatalogue (using automatic on-line queries). With suggested assignments, the control is then handed over to the user who can choose to accept, decline or add additional species. Data visualization, statistical information, and interactive widgets assist the user in making decisions about their data. SPECdata has several other useful features intended to improve the user experience. Exporting a full report of the analysis, or a peak file in which assigned lines are removed are among several options. A user may also save their progress to continue at another time. Additional features of SPECdata help the user to maintain and expand their database for future use. A user-friendly interface allows one to search, upload, edit or update catalog or experiment entries.
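
    The automated assignment step described above, matching observed peak frequencies against catalogued transition frequencies within a tolerance, is easy to sketch; the snippet below is not SPECdata's actual code, and the tolerance, frequencies and species labels are invented.

```python
def assign_peaks(peaks_mhz, catalog, tol_mhz=0.05):
    """Assign each observed peak to catalog entries lying within tol_mhz.

    peaks_mhz : list of observed line frequencies (MHz)
    catalog   : list of (frequency_mhz, species) tuples, e.g. parsed from .cat files
    Returns a dict mapping each peak to its list of matching (frequency, species) entries.
    """
    catalog = sorted(catalog)
    assignments = {}
    for peak in peaks_mhz:
        assignments[peak] = [(f, sp) for f, sp in catalog if abs(f - peak) <= tol_mhz]
    return assignments

# Made-up catalog and peak list; unmatched peaks come back with an empty list
catalog = [(9098.33, "species A"), (14488.49, "species B"), (18196.31, "species A")]
print(assign_peaks([9098.35, 14488.48, 21300.10], catalog))
```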

  7. CMS Computing Software and Analysis Challenge 2006

    Science.gov (United States)

    De Filippis, N.; CMS Collaboration

    2007-10-01

    The CMS (Compact Muon Solenoid) collaboration is making a major effort to test the workflow and the dataflow associated with its data handling model. For this purpose the Computing, Software and Analysis Challenge 2006 (CSA06) started on 15 September. It was a 50-million-event exercise that included all the steps of the analysis chain: prompt reconstruction, data streaming, iterative calibration and alignment executions, data distribution to regional sites, and end-user analysis. Grid tools provided by the LCG project were also exercised to provide access to the data and the resources through a user-friendly interface for the physicists submitting production and analysis jobs. An overview of the status and results of CSA06 is presented in this work.

  8. Software safety analysis activities during software development phases of the Microwave Limb Sounder (MLS)

    Science.gov (United States)

    Shaw, Hui-Yin; Sherif, Joseph S.

    2004-01-01

    This paper describes the MLS software safety analysis activities and documents the SSA results. The scope of this software safety effort is consistent with the MLS system safety definition and is concentrated on the software faults and hazards that may have impact on the personnel safety and the environment safety.

  9. Static analysis of software: the abstract interpretation

    CERN Document Server

    Boulanger, Jean-Louis

    2013-01-01

    The existing literature currently available to students and researchers is very general, covering only the formal techniques of static analysis. This book presents real examples of the formal techniques called "abstract interpretation" currently being used in various industrial fields: railway, aeronautics, space, automotive, etc. The purpose of this book is to present students and researchers, in a single book, with the wealth of experience of people who are intrinsically involved in the realization and evaluation of software-based safety critical systems. As the authors are people curr

  10. Software-assisted live visualization system for subjacent blood vessels in endonasal endoscopic approaches

    Science.gov (United States)

    Lempe, B.; Taudt, Ch.; Maschke, R.; Gruening, J.; Ernstberger, M.; Basan, F.; Baselt, T.; Grunert, R.; Hartmann, P.

    2013-02-01

    Minimally invasive surgery methods have received growing attention in recent years. In vitally important areas, it is crucial for the surgeon to have precise knowledge of the tissue structure. The visualization of arteries is especially desirable, as their destruction can be lethal to the patient. In order to meet this requirement, the study presents a novel assistance system for endoscopic surgery. While state-of-the-art systems rely on pre-operative data such as computed tomography maps and require the use of radiation, the goal of the presented approach is to reveal subjacent blood vessels on live images of the endoscope camera system. Based on the transmission and reflection spectra of various human tissues, a prototype system with a NIR illumination unit working at 808 nm was established. Several image filtering, processing and enhancement techniques were investigated and evaluated on the raw pictures in order to obtain high-quality results; the most effective were contrast enhancement and thresholding by the difference-of-Gaussians method. Based on that, it is possible to rectify a fragmented artery pattern and extract geometrical information about the structure in terms of position and orientation. By superimposing the original image and the extracted segment, the surgeon is assisted with valuable live pictures of the region of interest. The whole system has been tested on a laboratory scale. An outlook on the integration of such a system in a clinical environment and its obvious benefits are discussed.
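
    The difference-of-Gaussians step mentioned above can be illustrated with standard scientific-Python tools; the sigmas and threshold below are arbitrary and the snippet only demonstrates the filtering idea, not the authors' pipeline.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def dog_vessel_mask(image, sigma_small=1.5, sigma_large=4.0, threshold=0.02):
    """Band-pass a NIR frame with a difference of Gaussians and threshold the result.

    Dark, elongated structures (candidate subjacent vessels) respond strongly in the
    band-pass image, so thresholding its magnitude yields a rough vessel mask.
    """
    img = image.astype(float)
    img = (img - img.min()) / (np.ptp(img) + 1e-9)        # normalize to [0, 1] for a stable threshold
    dog = gaussian_filter(img, sigma_small) - gaussian_filter(img, sigma_large)
    return np.abs(dog) > threshold                        # binary vessel-candidate mask

# Synthetic example: a dark line on a bright background
frame = np.full((64, 64), 200.0)
frame[30:33, :] = 120.0
mask = dog_vessel_mask(frame)
print(mask.sum(), "candidate vessel pixels")
```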

  11. Factors Associated with Lymphosclerosis: An Analysis on 962 Lymphatic Vessels.

    Science.gov (United States)

    Yamamoto, Takumi; Yamamoto, Nana; Yoshimatsu, Hidehiko; Narushima, Mitsunaga; Koshima, Isao

    2017-10-01

    Lymphaticovenular anastomosis is a useful treatment option for compression-refractory lower extremity lymphedema, but its efficacy depends largely on the severity of lymphosclerosis. To maximize lymphaticovenular anastomosis efficacy, it is important to elucidate factors associated with severe lymphosclerosis. Medical charts of 134 lower extremity lymphedema patients who underwent preoperative indocyanine green lymphography and lymphaticovenular anastomosis were reviewed to obtain data of clinical demographics, indocyanine green lymphography findings, and intraoperative findings. Based on intraoperative findings of lymphatic vessels, severity of lymphosclerosis was classified into s0, s1, s2, and s3. Severe lymphosclerosis was defined as lymphatic vessels with s3 sclerosis. Logistic regression analysis was used to identify independent factors associated with severe lymphosclerosis. In total, 962 lymphatic vessels were analyzed, among which severe lymphosclerosis was observed in 97 (10.1 percent). Multivariate analysis revealed that independent factors associated with severe lymphosclerosis were higher body mass index (OR, 1.803; 95 percent CI, 1.041 to 3.123; p = 0.035), incision site in the thigh/foot compared with in the groin (OR, 2.355/4.471; 95 percent CI, 1.201 to 4.617/2.135 to 9.362; p = 0.013/p < 0.001), and S-region/D-region on indocyanine green lymphography compared with L-region (OR, 83.134/1441.126; 95 percent CI, 11.296 to 611.843/146.782 to 14149.195; p < 0.001/p < 0.001). Inverse associations were observed in positive history of radiation therapy (OR, 0.461; 95 percent CI, 0.269 to 0.788; p = 0.005). Independent factors associated with severe lymphosclerosis were clarified. Indocyanine green lymphography pattern had the strongest association with severe lymphosclerosis. D-region on indocyanine green lymphography should be avoided for lymphaticovenular anastomosis. Risk, III.

  12. The ESA's Space Trajectory Analysis software suite

    Science.gov (United States)

    Ortega, Guillermo

    The European Space Agency (ESA) initiated in 2005 an internal activity to develop an open source software suite involving university science departments and research institutions all over the world. This project is called the "Space Trajectory Analysis", or STA. This article describes the birth of STA and its present configuration. One of the STA aims is to promote the exchange of technical ideas, and to raise knowledge and competence in the areas of applied mathematics, space engineering, and informatics at university level. Conceived as a research and education tool to support the analysis phase of a space mission, STA is able to visualize a wide range of space trajectories. These include among others ascent, re-entry, descent and landing trajectories, orbits around planets and moons, interplanetary trajectories, rendezvous trajectories, etc. The article explains that the STA project is an original idea of the Technical Directorate of ESA. It was born in August 2005 to provide a framework for astrodynamics research at university level. As research and education software applicable to academia, a number of universities support this development by joining ESA in leading it. The partnership between ESA and the universities is expressed in the STA Steering Board: together with ESA, each university has a chair on the board whose tasks are to develop, control, promote, maintain, and expand the software suite. The article describes that STA provides calculations in the fields of spacecraft tracking, attitude analysis, coverage and visibility analysis, orbit determination, position and velocity of solar system bodies, etc. STA implements the concept of a "space scenario" composed of Solar system bodies, spacecraft, ground stations, pads, etc. It is able to propagate the orbit of a spacecraft where orbital propagators are included. STA is able to compute communication links between objects of a scenario (coverage, line of sight), and to represent the trajectory computations and

  13. Software Architecture Reliability Analysis using Failure Scenarios

    NARCIS (Netherlands)

    Tekinerdogan, B.; Sözer, Hasan; Aksit, Mehmet

    With the increasing size and complexity of software in embedded systems, software has now become a primary threat for the reliability. Several mature conventional reliability engineering techniques exist in literature but traditionally these have primarily addressed failures in hardware components

  14. Mapping Pedagogical Opportunities Provided by Mathematics Analysis Software

    Science.gov (United States)

    Pierce, Robyn; Stacey, Kaye

    2010-01-01

    This paper proposes a taxonomy of the pedagogical opportunities that are offered by mathematics analysis software such as computer algebra systems, graphics calculators, dynamic geometry or statistical packages. Mathematics analysis software is software for purposes such as calculating, drawing graphs and making accurate diagrams. However, its…

  15. Software Process Improvement Using Force Field Analysis ...

    African Journals Online (AJOL)

    Software process improvement is a necessity especially since the dynamic nature of today's hardware demands reciprocal improvements in the underlying software systems. Several process improvement models exist where organizations perform an introspective study of the current software development process and ...

  16. SUBSONIC WIND TUNNEL PERFORMANCE ANALYSIS SOFTWARE

    Science.gov (United States)

    Eckert, W. T.

    1994-01-01

    This program was developed as an aid in the design and analysis of subsonic wind tunnels. It brings together and refines previously scattered and over-simplified techniques used for the design and loss prediction of the components of subsonic wind tunnels. It implements a system of equations for determining the total pressure losses and provides general guidelines for the design of diffusers, contractions, corners and the inlets and exits of non-return tunnels. The algorithms used in the program are applicable to compressible flow through most closed- or open-throated, single-, double- or non-return wind tunnels or ducts. A comparison between calculated performance and that actually achieved by several existing facilities produced generally good agreement. Any system through which air is flowing which involves turns, fans, contractions etc. (e.g., an HVAC system) may benefit from analysis using this software. This program is an update of ARC-11138 which includes PC compatibility and an improved user interface. The method of loss analysis used by the program is a synthesis of theoretical and empirical techniques. Generally, the algorithms used are those which have been substantiated by experimental test. The basic flow-state parameters used by the program are determined from input information about the reference control section and the test section. These parameters were derived from standard relationships for compressible flow. The local flow conditions, including Mach number, Reynolds number and friction coefficient are determined for each end of each component or section. The loss in total pressure caused by each section is calculated in a form non-dimensionalized by local dynamic pressure. The individual losses are based on the nature of the section, local flow conditions and input geometry and parameter information. The loss forms for typical wind tunnel sections considered by the program include: constant area ducts, open throat ducts, contractions, constant
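
    As a rough illustration of the loss bookkeeping described above (and not the program's actual algorithm), the total-pressure loss of a constant-area duct can be written as a coefficient referenced to dynamic pressure and summed with the coefficients of other sections; the friction factor, geometry and corner loss used below are assumed values.

```python
def duct_loss_coefficient(friction_factor, length, hydraulic_diameter):
    """Delta-p0 / q for a constant-area duct: K = f * L / D_h."""
    return friction_factor * length / hydraulic_diameter

def total_pressure_loss(section_coefficients, dynamic_pressure_pa):
    """Sum section loss coefficients (here all referenced to the same q) into a loss in Pa."""
    return dynamic_pressure_pa * sum(section_coefficients)

# Two hypothetical constant-area legs plus a corner with an assumed loss coefficient of 0.15
sections = [duct_loss_coefficient(0.02, 10.0, 2.0),
            duct_loss_coefficient(0.02, 6.0, 1.5),
            0.15]
print(round(total_pressure_loss(sections, dynamic_pressure_pa=600.0), 1), "Pa")
```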

  17. Systems Theoretic Process Analysis Applied to an Offshore Supply Vessel Dynamic Positioning System

    Science.gov (United States)

    2016-06-01

    Thesis by Blake Ryan Abrecht (B.S. Systems Engineering with a Focus on Human Factors, United States Air Force Academy, 2014), submitted to the Institute for Data, Systems, and Society, applying Systems Theoretic Process Analysis (STPA) to an offshore supply vessel dynamic positioning (DP) system. Such DP systems are used on "support vessels, cable layers, pipe-laying vessels, shuttle tankers, trenching and dredging vessels, [and] supply vessels" [3].

  18. Visual querying and analysis of large software repositories

    NARCIS (Netherlands)

    Voinea, Lucian; Telea, Alexandru

    We present a software framework for mining software repositories. Our extensible framework enables the integration of data extraction from repositories with data analysis and interactive visualization. We demonstrate the applicability of the framework by presenting several case studies performed on

  19. Quantitative CT analysis of small pulmonary vessels in lymphangioleiomyomatosis

    Energy Technology Data Exchange (ETDEWEB)

    Ando, Katsutoshi, E-mail: kando@juntendo.ac.jp [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Tobino, Kazunori [Department of Respiratory Medicine, Iizuka Hospital, 3-83 Yoshio-Machi, Iizuka-City, Fukuoka 820-8505 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Kurihara, Masatoshi; Kataoka, Hideyuki [Pneumothorax Center, Nissan Tamagawa Hospital, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Doi, Tokuhide [Fukuoka Clinic, 7-18-11 Umeda, Adachi-Ku, Tokyo 123-0851 (Japan); Hoshika, Yoshito [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Takahashi, Kazuhisa [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); Seyama, Kuniaki [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan)

    2012-12-15

    Background: Lymphangioleiomyomatosis (LAM) is a destructive lung disease that shares clinical, physiologic, and radiologic features with chronic obstructive pulmonary disease (COPD). This study aims to identify those features that are unique to LAM by using quantitative CT analysis. Methods: We measured total cross-sectional areas of small pulmonary vessels (CSA) less than 5 mm² and 5–10 mm² and calculated percentages of those lung areas (%CSA), respectively, in 50 LAM and 42 COPD patients. The extent of cystic destruction (LAA%) and mean parenchymal CT value were also calculated and correlated with pulmonary function. Results: The diffusing capacity for carbon monoxide/alveolar volume (DLCO/VA %predicted) was similar for both groups (LAM, 44.4 ± 19.8% vs. COPD, 45.7 ± 16.0%, p = 0.763), but less tissue damage occurred in LAM than COPD (LAA% 21.7 ± 16.3% vs. 29.3 ± 17.0%; p < 0.05). Pulmonary function correlated negatively with LAA% (p < 0.001) in both groups, yet the correlation with %CSA was significant only in COPD (p < 0.001). When the same analysis was conducted in two groups with equal levels of LAA% and DLCO/VA %predicted, %CSA and mean parenchymal CT value were still greater for LAM than COPD (p < 0.05). Conclusions: Quantitative CT analysis revealing a correlation between cystic destruction and CSA in COPD but not LAM indicates that this approach successfully reflects different mechanisms governing the two pathologic courses. Such determinations of small pulmonary vessel density may serve to differentiate LAM from COPD even in patients with severe lung destruction.

  20. Oscillation of Angiogenesis with Vascular Dropout in Diabetic Retinopathy by VESsel GENeration Analysis (VESGEN)

    Science.gov (United States)

    Parsons-Wingerter, Patricia; Radhakrishnan, Krishnan; Vickerman, Mary B.; Kaiser, Peter K.

    2010-01-01

    PURPOSE. Vascular dropout and angiogenesis are hallmarks of the progression of diabetic retinopathy (DR). However, current evaluation of DR relies on grading of secondary vascular effects, such as microaneurysms and hemorrhages, by clinical examination instead of by evaluation of actual vascular changes. The purpose of this study was to map and quantify vascular changes during progression of DR by VESsel GENeration Analysis (VESGEN). METHODS. In this prospective cross-sectional study, 15 eyes with DR were evaluated with fluorescein angiography (FA) and color fundus photography, and were graded using modified Early Treatment Diabetic Retinopathy Study criteria. FA images were separated by semiautomatic image processing into arterial and venous trees. Vessel length density (Lv), number density (Nv), and diameter (Dv) were analyzed in a masked fashion with VESGEN software. Each vascular tree was automatically segmented into branching generations (G1...G8 or G9) by vessel diameter and branching. Vascular remodeling status (VRS) for Nv and Lv was graded 1 to 4 for increasing severity of vascular change. RESULTS. By Nv and Lv, VRS correlated significantly with the independent clinical diagnosis of mild to proliferative DR (13/15 eyes). Nv and Lv of smaller vessels (G≥6) increased from VRS1 to VRS2 by 2.4× and 1.6×, decreased from VRS2 to VRS3 by 0.4× and 0.6×, and increased from VRS3 to VRS4 by 1.7× and 1.5× (P < 0.01). Throughout DR progression, the density of larger vessels (G1-5) remained essentially unchanged, and their Dv increased slightly. CONCLUSIONS. Vessel density oscillated with the progression of DR. Alternating phases of angiogenesis/neovascularization and vascular dropout were dominated first by remodeling of arteries and subsequently by veins.
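
    Vessel length density and number density as used above are, in essence, centerline length and vessel count per unit image area; the toy Python version below is not VESGEN's implementation (which additionally segments trees into branching generations) and its inputs are synthetic.

```python
import numpy as np

def vessel_densities(vessel_mask, skeleton, n_vessels, pixel_area_mm2):
    """Crude length density (Lv) and number density (Nv) from a binary vessel segmentation.

    vessel_mask   : 2-D boolean array covering the analyzed field (used here only for its size)
    skeleton      : 2-D boolean array of one-pixel-wide vessel centerlines
    n_vessels     : number of distinct vessel segments (e.g. from a labeling step)
    pixel_area_mm2: area represented by one pixel
    """
    image_area = vessel_mask.size * pixel_area_mm2
    length_density = skeleton.sum() * np.sqrt(pixel_area_mm2) / image_area   # approx. mm / mm^2
    number_density = n_vessels / image_area                                  # vessels / mm^2
    return length_density, number_density

mask = np.zeros((100, 100), bool); mask[50:53, :] = True
skel = np.zeros((100, 100), bool); skel[51, :] = True
print(vessel_densities(mask, skel, n_vessels=1, pixel_area_mm2=0.0001))
```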

  1. Software patterns, knowledge maps, and domain analysis

    CERN Document Server

    Fayad, Mohamed E; Hegde, Srikanth GK; Basia, Anshu; Vakil, Ashka

    2014-01-01

    Preface. Acknowledgments. Authors. INTRODUCTION: An Overview of Knowledge Maps. Introduction: Key Concepts - Software Stable Models, Knowledge Maps, Pattern Language, Goals, Capabilities (Enduring Business Themes + Business Objects). The Motivation. The Problem. The Objectives. Overview of Software Stability Concepts. Overview of Knowledge Maps. Pattern Languages versus Knowledge Maps: A Brief Comparison. The Solution: Knowledge Maps Methodology or Concurrent Software Development Model. Why Knowledge Maps? Research Methodology Undertaken. Research Verification and Validation. The Stratification of This Book. Summary

  2. Software Reliability Analysis of NASA Space Flight Software: A Practical Experience.

    Science.gov (United States)

    Sukhwani, Harish; Alonso, Javier; Trivedi, Kishor S; Mcginnis, Issac

    2016-01-01

    In this paper, we present the software reliability analysis of the flight software of a recently launched space mission. For our analysis, we use the defect reports collected during the flight software development. We find that this software was developed in multiple releases, each release spanning across all software life-cycle phases. We also find that the software releases were developed and tested for four different hardware platforms, spanning from off-the-shelf or emulation hardware to actual flight hardware. For releases that exhibit reliability growth or decay, we fit Software Reliability Growth Models (SRGM); otherwise we fit a distribution function. We find that most releases exhibit reliability growth, with Log-Logistic (NHPP) and S-Shaped (NHPP) as the best-fit SRGMs. For the releases that experience reliability decay, we investigate the causes for the same. We find that such releases were the first software releases to be tested on a new hardware platform, and hence they encountered major hardware integration issues. Also such releases seem to have been developed under time pressure in order to start testing on the new hardware platform sooner. Such releases exhibit poor reliability growth, and hence exhibit high predicted failure rate. Other problems include hardware specification changes and delivery delays from vendors. Thus, our analysis provides critical insights and inputs to the management to improve the software development process. As NASA has moved towards a product line engineering for its flight software development, software for future space missions will be developed in a similar manner and hence the analysis results for this mission can be considered as a baseline for future flight software missions.
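
    For readers unfamiliar with SRGM fitting, the sketch below fits one simple NHPP model, the delayed S-shaped mean value function m(t) = a(1 - (1 + bt)e^(-bt)), to cumulative defect counts with SciPy. The defect data are invented, and the paper's own analysis used several models (e.g. Log-Logistic) with goodness-of-fit selection rather than this single curve.

```python
import numpy as np
from scipy.optimize import curve_fit

def delayed_s_shaped(t, a, b):
    """NHPP delayed S-shaped SRGM: expected cumulative defects found by time t."""
    return a * (1.0 - (1.0 + b * t) * np.exp(-b * t))

# Invented cumulative defect counts per week of testing
weeks = np.arange(1, 13, dtype=float)
defects = np.array([2, 6, 13, 22, 31, 39, 45, 50, 53, 55, 56, 57], dtype=float)

(a_hat, b_hat), _ = curve_fit(delayed_s_shaped, weeks, defects, p0=[60.0, 0.3])
print(f"estimated total defects a = {a_hat:.1f}, rate b = {b_hat:.2f}")
print("predicted defects by week 16:", round(delayed_s_shaped(16.0, a_hat, b_hat), 1))
```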

  3. Wave intensity analysis in air-filled flexible vessels.

    Science.gov (United States)

    Clavica, Francesco; Parker, Kim H; Khir, Ashraf W

    2015-02-26

    Wave intensity analysis (WIA) is an analytical technique generally used to investigate the propagation of waves in the cardiovascular system. Despite its increasing usage in the cardiovascular system, to our knowledge WIA has never been applied to the respiratory system. Given the analogies between arteries and airways (i.e. fluid flow in flexible vessels), the aim of this work is to test the applicability of WIA with gas flow instead of liquid flow. The models employed in this study are similar to those of earlier studies used for arterial investigations. Simultaneous pressure (P) and velocity (U) measurements were initially made in a single tube and then in several flexible tubes connected in series. Wave speed (cf) was calculated using the foot-to-foot method and was used to separate analytically the measured P and U waveforms into their forward and backward components. Further, the data were used to calculate wave intensity, which was also separated into its forward and backward components. Although the measured wave speed was relatively high, the results showed that the onsets and the nature of reflections (compression/expansion) derived with WIA corresponded well to those anticipated using the theory of waves in liquid-filled elastic tubes. On average the difference between the experimental and theoretical arrival time of reflection was 6.1% and 3.6% for the single vessel and multivessel experiment, respectively. The results suggest that WIA can provide relatively accurate information on reflections in air-filled flexible tubes, warranting further studies to explore the full potential of this technique in the respiratory system. Copyright © 2015. Published by Elsevier Ltd.
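
    The separation step of wave intensity analysis referred to above follows the standard water-hammer relations; the NumPy sketch below uses synthetic pressure and velocity samples and assumed air-like properties, not the authors' bench data.

```python
import numpy as np

def separate_wave_intensity(pressure, velocity, rho, c):
    """Split P and U increments into forward (+) and backward (-) components and wave intensities.

    Standard relations: dP+- = (dP +- rho*c*dU)/2, with dU+ = dP+/(rho*c) and dU- = -dP-/(rho*c),
    giving dI+ = dP+^2/(rho*c) >= 0 and dI- = -dP-^2/(rho*c) <= 0.
    """
    dP, dU = np.diff(pressure), np.diff(velocity)
    dP_plus, dP_minus = (dP + rho * c * dU) / 2, (dP - rho * c * dU) / 2
    dI_plus = dP_plus ** 2 / (rho * c)        # forward wave intensity
    dI_minus = -dP_minus ** 2 / (rho * c)     # backward wave intensity
    return dP_plus, dP_minus, dI_plus, dI_minus

# Synthetic in-phase P and U waves with an assumed high wave speed, as reported for stiff tubes
t = np.linspace(0, 0.1, 200)
P = 101325 + 200 * np.sin(2 * np.pi * 10 * t)     # Pa
U = 0.002 * np.sin(2 * np.pi * 10 * t)            # m/s
print(separate_wave_intensity(P, U, rho=1.2, c=350.0)[2].max())
```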

  4. New angiographic measurement tool for analysis of small cerebral vessels: application to a subarachnoid haemorrhage model in the rat

    Energy Technology Data Exchange (ETDEWEB)

    Turowski, B.; Moedder, U. [Heinrich-Heine University, Institute of Diagnostic Radiology, Neuroradiology, Duesselorf (Germany); Haenggi, D.; Steiger, H.J. [Heinrich-Heine University, Department of Neurosurgery, Duesseldorf (Germany); Beck, A.; Aurich, V. [Heinrich-Heine University, Institute of Informatics, Duesseldorf (Germany)

    2007-02-15

    Exact quantification of vasospasm by angiography is known to be difficult especially in small vessels. The purpose of the study was to develop a new method for computerized analysis of small arteries and to demonstrate feasibility on cerebral angiographies of rats acquired on a clinical angiography unit. A new software tool analysing grey values and subtracting background noise was validated on a vessel model. It was tested in practice in animals with subarachnoid haemorrhage (SAH). A total of 28 rats were divided into four groups: SAH untreated, SAH treated with local calcium antagonist, SAH treated with placebo, and sham-operated. The diameters of segments of the internal carotid, caudal cerebral, middle cerebral, rostral cerebral and the stapedial arteries were measured and compared to direct measurements of the diameters on magnified images. There was a direct correlation between the cross-sectional area of vessels measured in a phantom and the measurements acquired using the new image analysis method. The spread of repeated measurements with the new software was small compared to the spread of direct measurements of vessel diameters on magnified images. Application of the measurement tool to experimental SAH in rats showed a statistically significant reduction of vasospasm in the SAH groups treated with nimodipine-releasing pellets in comparison to all the other groups combined. The presented computerized method for analysis of small intracranial vessels is a new method allowing precise relative measurements. Nimodipine-releasing subarachnoidal pellets reduce vasospasm, but further testing with larger numbers is necessary. The tool can be applied to human angiography without modification and offers the promise of substantial progress in the diagnosis of vasospasm after SAH. (orig.)

  5. An Analysis of Software Design Methodologies

    Science.gov (United States)

    1979-08-01

    the second can be characterized as "mechanical" or "algorithmic". Duncker (1945) extended this aspect of problem solving by demonstrating that a... 1972. Davis, C. G., & Vick, C. R. The software development system. IEEE Transactions on Software Engineering, 1977, SE-3, 69-84. Duncker, K. On

  6. VMStools: Open-source software for the processing, analysis and visualization of fisheries logbook and VMS data

    DEFF Research Database (Denmark)

    Hintzen, Niels T.; Bastardie, Francois; Beare, Doug

    2012-01-01

    VMStools is a package of open-source software, built using the freeware environment R, specifically developed for the processing, analysis and visualisation of landings (logbooks) and vessel location data (VMS) from commercial fisheries. Analyses start with standardized data formats for logbook (...

  7. VMStools: Open-source software for the processing, analysis and visualisation of fisheries logbook and VMS data

    NARCIS (Netherlands)

    Hintzen, N.T.; Bastardie, F.; Beare, D.J.; Piet, G.J.; Ulrich, C.; Deporte, N.; Egekvist, J.; Degel, H.

    2012-01-01

    VMStools is a package of open-source software, built using the freeware environment R, specifically developed for the processing, analysis and visualisation of landings (logbooks) and vessel location data (VMS) from commercial fisheries. Analyses start with standardized data formats for logbook

  8. Vessel Monitoring Systems Study. Volume I - Technical Analysis.

    Science.gov (United States)

    1980-09-01

    In the Port and Tanker Safety Act of 1978 the U.S. Congress directed the Department of Transportation to perform a study on the desirability and feasibility of a shore-station system for monitoring vessels (including fishing vessels) offshore within t...

  9. Development of a New VLBI Data Analysis Software

    Science.gov (United States)

    Bolotin, Sergei; Gipson, John M.; MacMillan, Daniel S.

    2010-01-01

    We present an overview of a new VLBI analysis software under development at NASA GSFC. The new software will replace CALC/SOLVE and many related utility programs. It will have the capabilities of the current system as well as incorporate new models and data analysis techniques. In this paper we give a conceptual overview of the new software. We formulate the main goals of the software. The software should be flexible and modular to implement models and estimation techniques that currently exist or will appear in future. On the other hand it should be reliable and possess production quality for processing standard VLBI sessions. Also, it needs to be capable of processing observations from a fully deployed network of VLBI2010 stations in a reasonable time. We describe the software development process and outline the software architecture.

  10. CAX a software for automated spectrum analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zahn, Guilherme S.; Genezini, Frederico A., E-mail: gzahn@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (CRPq/IPEN/CNEN-SP), Sao Paulo, SP (Brazil). Centro do Reator de Pesquisas

    2017-11-01

    In this work, the scripting capabilities of Genie-2000 were used to develop software that automatically analyses all spectrum files in either Ortec's CHN or Canberra's MCA or CNF formats in a folder, generating two output files: a print-ready text file (.DAT) and a Comma-Separated Values (.CSV) file which can be easily imported in any major spreadsheet software. This software, named CAX ('Convert and Analyse for eXcel'), uses Genie-2000's functions to import spectrum files into Genie's native CNF format and analyze the converted spectra. The software can also, if requested, import energy and FWHM calibrations from a stored calibrated spectrum. The print-ready output file (.DAT) is generated by Genie-2000 using a customized script, and the CSV file is generated by a custom-built DAT2CSV software which generates a CSV file that complies with Brazilian standards, with commas as a decimal indicator and semicolons as field separators. This software is already used in the daily routines in IPEN's Neutron Activation Laboratory, greatly reducing the time required for sample analyses, as well as reducing the possibility of transcription errors. (author)
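
    The locale detail mentioned at the end, commas as decimal marks and semicolons as field separators, is easy to get wrong; the small Python sketch below shows one way to write such a file. It is not the DAT2CSV tool itself, and the column layout is invented.

```python
import csv

def write_brazilian_csv(path, header, rows):
    """Write rows with ';' as the field separator and ',' as the decimal mark (Brazilian convention)."""
    with open(path, "w", newline="", encoding="utf-8") as fh:
        writer = csv.writer(fh, delimiter=";")
        writer.writerow(header)
        for row in rows:
            # Only float values get their decimal point swapped for a comma.
            writer.writerow([str(v).replace(".", ",") if isinstance(v, float) else v for v in row])

# Hypothetical peak-analysis results
write_brazilian_csv("results.csv",
                    ["sample", "energy_keV", "area", "uncertainty_pct"],
                    [["S1", 1332.5, 10452.0, 1.2], ["S1", 1173.2, 11230.0, 1.1]])
```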

  11. Next Generation Static Software Analysis Tools (Dagstuhl Seminar 14352)

    OpenAIRE

    Cousot, Patrick; Kroening, Daniel; Sinz, Carsten

    2014-01-01

    There has been tremendous progress in static software analysis over the last years with, for example, refined abstract interpretation methods, the advent of fast decision procedures like SAT and SMT solvers, new approaches like software (bounded) model checking or CEGAR, or new problem encodings. We are now close to integrating these techniques into every programmer's toolbox. The aim of the seminar was to bring together developers of software analysis tools and algorithms, including ...

  12. Continuous software quality analysis for the ATLAS experiment

    CERN Document Server

    Washbrook, Andrew; The ATLAS collaboration

    2017-01-01

    The software for the ATLAS experiment on the Large Hadron Collider at CERN has evolved over many years to meet the demands of Monte Carlo simulation, particle detector reconstruction and data analysis. At present over 3.8 million lines of C++ code (and close to 6 million total lines of code) are maintained by an active worldwide developer community. In order to run the experiment software efficiently at hundreds of computing centres it is essential to maintain a high level of software quality standards. Methods proposed to improve software quality practices by incorporating checks into the new ATLAS software build infrastructure are described.

  13. Computer Analysis of Eye Blood-Vessel Images

    Science.gov (United States)

    Wall, R. J.; White, B. S.

    1984-01-01

    Technique rapidly diagnoses diabetes mellitus. Photographs of "whites" of patients' eyes scanned by computerized image analyzer programmed to quantify density of small blood vessels in conjunctiva. Comparison with data base of known normal and diabetic patients facilitates rapid diagnosis.
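
    The density measure described, i.e. how much of the imaged conjunctiva is occupied by small vessels, reduces to a simple ratio once the vessels have been segmented; the one-function sketch below assumes a binary segmentation is already available and uses synthetic masks.

```python
import numpy as np

def vessel_area_fraction(vessel_mask, roi_mask):
    """Fraction of the region of interest covered by segmented vessel pixels."""
    roi_pixels = roi_mask.sum()
    return float((vessel_mask & roi_mask).sum()) / roi_pixels if roi_pixels else 0.0

roi = np.ones((50, 50), bool)                                # region of interest: whole frame
vessels = np.zeros((50, 50), bool); vessels[::7, :] = True   # synthetic vessel pattern
print(round(vessel_area_fraction(vessels, roi), 3))
```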

  14. Managing Software Project Risks (Analysis Phase) with Proposed Fuzzy Regression Analysis Modelling Techniques with Fuzzy Concepts

    OpenAIRE

    Elzamly, Abdelrafe; Hussin, Burairah

    2014-01-01

    The aim of this paper is to propose new mining techniques with which we can study the impact of different risk management techniques and different software risk factors on software analysis development projects. The new mining technique uses fuzzy multiple regression analysis with fuzzy concepts to manage the risks in a software project and mitigate them through software process improvement. Top ten software risk factors in the analysis phase and thirty risk management techni...

  15. Software Piracy in Research: A Moral Analysis.

    Science.gov (United States)

    Santillanes, Gary; Felder, Ryan Marshall

    2015-08-01

    Researchers in virtually every discipline rely on sophisticated proprietary software for their work. However, some researchers are unable to afford the licenses and instead procure the software illegally. We discuss the prohibition of software piracy by intellectual property laws, and argue that the moral basis for the copyright law offers the possibility of cases where software piracy may be morally justified. The ethics codes that scientific institutions abide by are informed by a rule-consequentialist logic: by preserving personal rights to authored works, people able to do so will be incentivized to create. By showing that the law has this rule-consequentialist grounding, we suggest that scientists who blindly adopt their institutional ethics codes will commit themselves to accepting that software piracy could be morally justified, in some cases. We hope that this conclusion will spark debate over important tensions between ethics codes, copyright law, and the underlying moral basis for these regulations. We conclude by offering practical solutions (other than piracy) for researchers.

  16. Computer software for process hazards analysis.

    Science.gov (United States)

    Hyatt, N

    2000-10-01

    Computerized software tools are assuming major significance in conducting HAZOPs. This is because they have the potential to offer better online presentations and performance to HAZOP teams, as well as better documentation and downstream tracking. The chances of something being "missed" are greatly reduced. We know, only too well, that HAZOP sessions can be like the industrial equivalent of a trip to the dentist. Sessions can (and usually do) become arduous and painstaking. To make the process easier for all those involved, we need all the help computerized software can provide. In this paper I have outlined the challenges addressed in the production of Windows software for performing HAZOP and other forms of PHA. The object is to produce more "intelligent", more user-friendly software for performing HAZOP where technical interaction between team members is of key significance. HAZOP techniques, having already proven themselves, are extending into the field of computer control and human error. This makes further demands on HAZOP software and emphasizes its importance.

  17. Comprehensive Analysis of Chicken Vessels as Microvascular Anastomosis Training Model

    Directory of Open Access Journals (Sweden)

    Bo Young Kang

    2017-01-01

    Full Text Available Background: Nonliving chickens are commonly used as a microvascular anastomosis training model. However, previous studies have investigated only a few types of vessel, and no study has compared the characteristics of the various vessels. The present study evaluated the anatomic characteristics of various chicken vessels as a training model. Methods: Eight vessels (the brachial artery, basilic vein, radial artery, ulnar artery, ischiatic artery and vein, cranial tibial artery, and common dorsal metatarsal artery) were evaluated in 26 fresh chickens and 30 chicken feet for external diameter (ED) and thicknesses of the tunica adventitia and media. The dissection time from skin incision to application of vessel clamps was also measured. Results: The EDs of the vessels varied. The ischiatic vein had the largest ED of 2.69±0.33 mm, followed by the basilic vein (1.88±0.36 mm), ischiatic artery (1.68±0.24 mm), common dorsal metatarsal artery (1.23±0.23 mm), cranial tibial artery (1.18±0.19 mm), brachial artery (1.08±0.15 mm), ulnar artery (0.82±0.13 mm), and radial artery (0.56±0.12 mm), and the order of size was consistent across all subjects. Thicknesses of the tunica adventitia and media were also diverse, ranging from 74.09±19.91 µm to 158.66±40.25 µm (adventitia) and from 31.2±7.13 µm to 154.15±46.48 µm (media), respectively. Mean dissection time was <3 minutes for all vessels. Conclusions: Our results suggest that nonliving chickens can provide various vessels with different anatomic characteristics, allowing trainees to choose an appropriate microvascular anastomosis training model depending on their purpose and skill level.

  18. Combining Static Analysis and Model Checking for Software Analysis

    Science.gov (United States)

    Brat, Guillaume; Visser, Willem; Clancy, Daniel (Technical Monitor)

    2003-01-01

    We present an iterative technique in which model checking and static analysis are combined to verify large software systems. The role of the static analysis is to compute partial order information which the model checker uses to reduce the state space. During exploration, the model checker also computes aliasing information that it gives to the static analyzer which can then refine its analysis. The result of this refined analysis is then fed back to the model checker which updates its partial order reduction. At each step of this iterative process, the static analysis computes optimistic information which results in an unsafe reduction of the state space. However we show that the process converges to a fixed point at which time the partial order information is safe and the whole state space is explored.
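
    A minimal sketch of the iteration described above, assuming hypothetical analyzer and model-checker interfaces (compute_partial_order and explore are illustrative names, not from the paper); the loop alternates the two analyses until the partial-order information stops changing.

        # Hedged sketch of the iterative combination of static analysis and model
        # checking described above. The interfaces (static_analyzer, model_checker)
        # are hypothetical placeholders, not part of any published tool.

        def combined_analysis(program, static_analyzer, model_checker, max_iter=20):
            """Iterate static analysis and model checking until the partial-order
            information stops changing (a fixed point)."""
            aliases = set()                     # initially no aliasing information
            partial_order = static_analyzer.compute_partial_order(program, aliases)
            for _ in range(max_iter):
                # Explore the state space using the (possibly optimistic) reduction.
                result = model_checker.explore(program, partial_order)
                refined = static_analyzer.compute_partial_order(program, result.aliases)
                if refined == partial_order:    # fixed point: reduction is now safe
                    return result
                partial_order = refined
            raise RuntimeError("no fixed point reached within max_iter iterations")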

  19. Insulated Pressure Vessels for Vehicular Hydrogen Storage: Analysis and Performance Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Aceves, S M; Martinez-Frias, J; Garcia-Villazana, O; Espinosa-Loza, F

    2001-06-26

    Insulated pressure vessels are cryogenic-capable pressure vessels that can be fueled with liquid hydrogen (LH{sub 2}) or ambient-temperature compressed hydrogen (CH{sub 2}). Insulated pressure vessels offer the advantages of liquid hydrogen tanks (low weight and volume), with reduced disadvantages (fuel flexibility, lower energy requirement for hydrogen liquefaction and reduced evaporative losses). The work described here is directed at verifying that commercially available pressure vessels can be safely used to store liquid hydrogen. The use of commercially available pressure vessels significantly reduces the cost and complexity of the insulated pressure vessel development effort. This paper describes a series of tests that have been done with aluminum-lined, fiber-wrapped vessels to evaluate the damage caused by low temperature operation. All analysis and experiments to date indicate that no significant damage has resulted. Required future tests are described that will prove that no technical barriers exist to the safe use of aluminum-fiber vessels at cryogenic temperatures. Future activities also include a demonstration project in which the insulated pressure vessels will be installed and tested on two vehicles. A draft standard will also be generated for obtaining certification for insulated pressure vessels.

  20. Analysis and Measurement of NOx Emissions in Port Auxiliary Vessels

    Directory of Open Access Journals (Sweden)

    German de Melo Rodriguez

    2013-09-01

    This paper examines the NOx pollution emitted by port auxiliary vessels, specifically harbour tugs, which, owing to their unique operating profile, require large and discontinuous changes in propulsion power. They also possess peculiar technical characteristics, such as large tonnage and high propulsive power, that differentiate them from other auxiliary vessels in the port. Despite these features, there are no studies of NOx emissions across the different engine power regimes, because engine manufacturers do not measure emissions over the whole operating power range and usually report only the pollution produced by their engines at maximum continuous power.

  1. Cross-instrument Analysis Correlation Software

    Energy Technology Data Exchange (ETDEWEB)

    2017-06-28

    This program has been designed to assist with the tracking of a sample from one analytical instrument to another such as SEM, microscopes, micro x-ray diffraction and other instruments where particular positions/locations on the sample are examined, photographed, etc. The software is designed to easily enter the position of fiducials and locations of interest such that in a future session in the same or a different instrument the positions of interest can be re-found by using the known fiducial locations in the current and reference sessions to transform the point into the current session's coordinate system. The software is dialog box driven, guiding the user through the necessary data entry and program choices. Information is stored in a series of text based extensible markup language (XML) files.
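
    The re-finding step described above amounts to fitting a coordinate transform from fiducials measured in a reference session to the same fiducials re-measured in the current session. The sketch below, with invented function names and coordinates, estimates a 2-D affine transform by least squares and applies it to a stored point of interest; it is not the tool's actual code.

        import numpy as np

        # Sketch (not the tool's actual code): estimate a 2-D affine transform from
        # fiducial positions recorded in a reference session to the same fiducials
        # re-measured in the current session, then map a stored point of interest.

        def fit_affine(ref_pts, cur_pts):
            """Least-squares affine transform mapping ref_pts -> cur_pts.
            Both arrays are (N, 2); at least 3 non-collinear fiducials are needed."""
            ref = np.asarray(ref_pts, float)
            cur = np.asarray(cur_pts, float)
            A = np.hstack([ref, np.ones((len(ref), 1))])     # (N, 3) homogeneous
            M, *_ = np.linalg.lstsq(A, cur, rcond=None)      # (3, 2) transform
            return M

        def to_current(point, M):
            """Map one (x, y) point from the reference into the current session."""
            x, y = point
            return np.array([x, y, 1.0]) @ M

        # Example with made-up coordinates: three fiducials and one point of interest.
        ref_fiducials = [(0, 0), (10, 0), (0, 10)]
        cur_fiducials = [(2.0, 1.0), (11.8, 1.6), (1.4, 10.9)]   # re-measured
        M = fit_affine(ref_fiducials, cur_fiducials)
        print(to_current((5, 5), M))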

  2. Shakedown Analysis for the Hydraulic Nut Device of Reactor Vessel

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kyu Wan; Kim, Jong Min; Park, Sung Ho; Choi, Taek Sang [KEPCO Engineering and Construction, Daejeon (Korea, Republic of)

    2012-05-15

    The hydraulic nut system is a tensioning system that is a direct retrofit for any existing tensioning system currently used on the reactor vessel. The system itself consists of a modular-design tensioner with an integrated mechanical lock ring to retain the load generated through the use of hydraulics. The hydraulic nuts allow for 100% tensioning of all studs simultaneously, so a reduction in critical path time and in radiation exposure for both installation and removal laborers can be achieved by adopting this device. Structural analyses for the hydraulic nut device, which will be applied to the Nuclear Power Plant reactor vessel, have been performed to evaluate the effect of the replacement on the structural integrity of both the reactor vessel closure head area and the hydraulic nuts. Shakedown analyses have been performed because the primary plus secondary (P+Q) stress intensity limit of the hydraulic nut is exceeded in several locations. It is concluded that shakedown will occur and the structural integrity of the reactor vessel closure head area will be maintained with the application of the hydraulic nut system.

  3. Change Impact Analysis of Crosscutting in Software Architectural Design

    NARCIS (Netherlands)

    van den Berg, Klaas

    2006-01-01

    Software architectures should be amenable to changes in user requirements and implementation technology. The analysis of the impact of these changes can be based on traceability of architectural design elements. Design elements have dependencies with other software artifacts but also evolve in time.

  4. Integrated analysis software for bulk power system stability

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, T.; Nagao, T.; Takahashi, K. [Central Research Inst. of Electric Power Industry, Tokyo (Japan)

    1994-12-31

    This paper presents three software packages developed in-house by the Central Research Institute of Electric Power Industry (CRIEPI) for bulk power network analysis, together with the user support system that organizes the tremendous amount of data required by these packages easily and with high reliability. (author) 3 refs., 7 figs., 2 tabs.

  5. Software Tool for Real-Time Power Quality Analysis

    OpenAIRE

    CZIKER, A. C.; CHINDRIS, M. D.; Miron, A

    2013-01-01

    A software tool dedicated for the analysis of power signals containing harmonic and interharmonic components, unbalance, voltage dips and voltage swells is presented. The software tool is a virtual instrument, which uses innovative algorithms based on time- and frequency-domain analysis to process power signals. In order to detect the temporary disturbances, edge detection is proposed, whereas for the harmonic analysis Gaussian filter banks are implemented. Considering that a signal recovery algorithm is applied, the harmonic analysis can be made even if voltage dips or swells appear...

  6. Continuous Software Quality analysis for the ATLAS experiment

    CERN Document Server

    Washbrook, Andrew; The ATLAS collaboration

    2017-01-01

    The regular application of software quality tools in large collaborative projects is required to reduce code defects to an acceptable level. If left unchecked, the accumulation of defects invariably results in performance degradation at scale and problems with the long-term maintainability of the code. Although software quality tools are effective for identification, there remains a non-trivial sociological challenge to resolve defects in a timely manner. This is an ongoing concern for the ATLAS software which has evolved over many years to meet the demands of Monte Carlo simulation, detector reconstruction and data analysis. At present over 3.8 million lines of C++ code (and close to 6 million total lines of code) are maintained by a community of hundreds of developers worldwide. It is therefore preferable to address code defects before they are introduced into a widely used software release. Recent wholesale changes to the ATLAS software infrastructure have provided an ideal opportunity to apply software quali...

  7. Probabilistic Structural Integrity Analysis of Boiling Water Reactor Pressure Vessel under Low Temperature Overpressure Event

    Directory of Open Access Journals (Sweden)

    Hsoung-Wei Chou

    2015-01-01

    The probabilistic structural integrity of a Taiwan domestic boiling water reactor pressure vessel has been evaluated by the probabilistic fracture mechanics analysis. First, the analysis model was built for the beltline region of the reactor pressure vessel considering the plant specific data. Meanwhile, the flaw models which comprehensively simulate all kinds of preexisting flaws along the vessel wall were employed here. The low temperature overpressure transient which has been concluded to be the severest accident for a boiling water reactor pressure vessel was considered as the loading condition. It is indicated that the fracture mostly happens near the fusion-line area of axial welds but with negligible failure risk. The calculated results indicate that the domestic reactor pressure vessel has sufficient structural integrity until doubling of the present end-of-license operation.

  8. GWAMA: software for genome-wide association meta-analysis

    Directory of Open Access Journals (Sweden)

    Mägi Reedik

    2010-05-01

    Background: Despite the recent success of genome-wide association studies in identifying novel loci contributing effects to complex human traits, such as type 2 diabetes and obesity, much of the genetic component of variation in these phenotypes remains unexplained. One way to improve power to detect further novel loci is through meta-analysis of studies from the same population, increasing the sample size over any individual study. Although statistical software analysis packages incorporate routines for meta-analysis, they are ill equipped to meet the challenges of the scale and complexity of data generated in genome-wide association studies. Results: We have developed flexible, open-source software for the meta-analysis of genome-wide association studies. The software incorporates a variety of error trapping facilities, and provides a range of meta-analysis summary statistics. The software is distributed with scripts that allow simple formatting of files containing the results of each association study and generate graphical summaries of genome-wide meta-analysis results. Conclusions: The GWAMA (Genome-Wide Association Meta-Analysis) software has been developed to perform meta-analysis of summary statistics generated from genome-wide association studies of dichotomous phenotypes or quantitative traits. Software with source files, documentation and example data files are freely available online at http://www.well.ox.ac.uk/GWAMA.
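
    The core calculation behind meta-analysis of GWAS summary statistics is the textbook fixed-effect inverse-variance weighting of per-study effect sizes; the sketch below illustrates it for a single variant and is not taken from the GWAMA source.

        import numpy as np
        from scipy import stats

        # Generic fixed-effect inverse-variance meta-analysis for one variant,
        # the core calculation behind GWAS meta-analysis tools such as GWAMA.
        # (Illustrative only; not taken from the GWAMA source.)

        def fixed_effect_meta(betas, ses):
            betas = np.asarray(betas, float)
            w = 1.0 / np.asarray(ses, float) ** 2          # inverse-variance weights
            beta = np.sum(w * betas) / np.sum(w)           # pooled effect
            se = np.sqrt(1.0 / np.sum(w))                  # pooled standard error
            z = beta / se
            p = 2.0 * stats.norm.sf(abs(z))                # two-sided p-value
            return beta, se, p

        # Three hypothetical studies reporting the same SNP:
        print(fixed_effect_meta(betas=[0.12, 0.08, 0.15], ses=[0.05, 0.04, 0.07]))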

  9. Project Report: Automatic Sequence Processor Software Analysis

    Science.gov (United States)

    Benjamin, Brandon

    2011-01-01

    The Mission Planning and Sequencing (MPS) element of Multi-Mission Ground System and Services (MGSS) provides space missions with multi-purpose software to plan spacecraft activities, sequence spacecraft commands, and then integrate these products and execute them on spacecraft. Jet Propulsion Laboratory (JPL) is currently flying many missions. The processes for building, integrating, and testing the multi-mission uplink software need to be improved to meet the needs of the missions and the operations teams that command the spacecraft. The Multi-Mission Sequencing Team is responsible for collecting and processing the observations, experiments and engineering activities that are to be performed on a selected spacecraft. The collection of these activities is called a sequence and ultimately a sequence becomes a sequence of spacecraft commands. The operations teams check the sequence to make sure that no constraints are violated. The workflow process involves sending a program start command, which activates the Automatic Sequence Processor (ASP). The ASP is currently a file-based system that comprises scripts written in perl, c-shell and awk. Once this start process is complete, the system checks for errors and aborts if there are any; otherwise the system converts the commands to binary, and then sends the resultant information to be radiated to the spacecraft.

  10. Exact Thermal Analysis of Functionally Graded Cylindrical and Spherical Vessels

    Directory of Open Access Journals (Sweden)

    Vebil Yıldırım

    2017-07-01

    Thermal analyses of a radially functionally graded (FG) thick-walled spherical vessel and of an infinite cylindrical vessel or circular annulus are conducted analytically using steady-state 1-D Fourier heat conduction theory under Dirichlet boundary conditions. By employing a simple-power material grading pattern, the differential equations are obtained in the form of Euler-Cauchy types. Analytical solution of the differential equations gives the temperature field and the heat flux distribution in the radial direction in closed form. Three different physical metal-ceramic pairs are first considered to study the effect of the aspect ratio, defined as the ratio of the inner radius to the outer radius of the structure, on the temperature and heat flux variation along the radial coordinate. Then a parametric study is performed with hypothetical inhomogeneity indexes for varying aspect ratios.
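
    A hedged worked example for the cylindrical case, assuming the simple-power conductivity grading k(r) = k_b*(r/b)^beta (the paper's exact grading law is not reproduced here); under that assumption the steady 1-D conduction equation becomes an Euler-Cauchy equation with closed-form temperature and heat-flux fields.

        import numpy as np

        # Hedged worked example for the cylindrical case, assuming the "simple-power"
        # conductivity grading k(r) = k_b * (r/b)**beta (an assumption; the paper's
        # exact grading law is not reproduced here). The steady 1-D conduction
        # equation then reduces to an Euler-Cauchy ODE with solution
        #   T(r) = C1 * r**(-beta) + C2   (beta != 0),
        # where C1, C2 follow from the Dirichlet conditions T(a)=Ta, T(b)=Tb.

        def fg_cylinder_temperature(r, a, b, Ta, Tb, beta, k_b=1.0):
            C1 = (Ta - Tb) / (a**(-beta) - b**(-beta))
            C2 = Tb - C1 * b**(-beta)
            T = C1 * r**(-beta) + C2
            k = k_b * (r / b)**beta
            flux = -k * (-beta * C1 * r**(-beta - 1.0))   # q_r = -k(r) dT/dr
            return T, flux

        r = np.linspace(0.5, 1.0, 6)                      # aspect ratio a/b = 0.5
        T, q = fg_cylinder_temperature(r, a=0.5, b=1.0, Ta=600.0, Tb=300.0, beta=1.5)
        print(np.round(T, 1), np.round(q, 1))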

  11. Open architecture software platform for biomedical signal analysis.

    Science.gov (United States)

    Duque, Juliano J; Silva, Luiz E V; Murta, Luiz O

    2013-01-01

    Biomedical signals are very important reporters of the physiological status of the human body. Therefore, great attention is devoted to the study of analysis methods that help extract the greatest amount of relevant information from these signals. There are several free-of-charge software packages that can process biomedical data, but they usually have a closed architecture, not allowing users to add new functionalities. This paper presents a proposal for a free open-architecture software platform for biomedical signal analysis, named JBioS. Implemented in Java, the platform offers some basic functionalities to load and display signals, and allows the integration of new software components through plugins. JBioS facilitates validation of new analysis methods and provides an environment for multi-method analysis. Plugins can be developed for preprocessing, analyzing and simulating signals. Some applications have been done using this platform, suggesting that, with these features, JBioS presents itself as a software platform with potential applications in both research and clinical areas.

  12. Computer-assisted qualitative data analysis software: a review.

    Science.gov (United States)

    Banner, Davina J; Albarrran, John W

    2009-01-01

    Over recent decades, qualitative research has become accepted as a uniquely valuable methodological approach for generating knowledge, particularly in relation to promoting understanding of patients' experiences and responses to illness. Within cardiovascular nursing such qualitative approaches have been widely adopted to systematically investigate a number of phenomena. Contemporary qualitative research practice comprises a diverse range of disciplines and approaches. Computer-aided qualitative data analysis software represents an important facet of this increasingly sophisticated movement. Such software offers an efficient means through which to manage and organize data while supporting rigorous data analysis. The increasing use of qualitative data analysis software has stimulated wide discussion. This research column includes a review of some of the advantages and debates related to the use and integration of qualitative data analysis software.

  13. ITER vacuum vessel structural analysis completion during manufacturing phase

    Energy Technology Data Exchange (ETDEWEB)

    Martinez, J.-M., E-mail: jean-marc.martinez@live.fr [ITER Organization, Route Vinon sur Verdon, CS 90046, 13067, St. Paul lez Durance, Cedex (France); Alekseev, A.; Sborchia, C.; Choi, C.H.; Utin, Y.; Jun, C.H.; Terasawa, A.; Popova, E.; Xiang, B.; Sannazaro, G.; Lee, A.; Martin, A.; Teissier, P.; Sabourin, F. [ITER Organization, Route Vinon sur Verdon, CS 90046, 13067, St. Paul lez Durance, Cedex (France); Caixas, J.; Fernandez, E.; Zarzalejos, J.M. [F4E, c/Josep Pla, n.2, Torres Diagonal Litoral, Edificio B3, E-08019, Barcelona (Spain); Kim, H.-S.; Kim, Y.G. [ITER Korea, National Fusion Research Institute, Daejeon (Korea, Republic of); Privalova, E. [NTC “Sintez”, Efremov Inst., 189631 Metallostroy, St. Petersburg (Russian Federation); and others

    2016-11-01

    Highlights: • The ITER Vacuum Vessel (VV) is part of the first barrier confining the plasma. • As Nuclear Pressure Equipment, it requires an Agreed Notified Body to assure design, fabrication, conformance testing and quality assurance. • Some supplementary RCC-MR margin targets have been considered to guarantee considerable structural margins in areas not inspected in operation. • Many manufacturing deviation requests (MDR) and project change requests (PCR) require the structural margins to be re-evaluated. • Several structural analyses were performed with global and local models to guarantee the structural integrity of the whole ITER Vacuum Vessel. - Abstract: Some years ago, analyses were performed by the ITER Organization Central Team (IO-CT) to verify the structural integrity of the ITER vacuum vessel baseline design fixed in 2010 and classified as a Protection Important Component (PIC). The manufacturing phase leads the ITER Organization domestic agencies (IO-DA) and their contracted manufacturers to propose detailed design improvements to optimize the manufacturing or inspection process. These design and quality inspection changes can affect the structural margins with regard to the Codes & Standards and thus require the modified areas to be evaluated once more. This paper gives an overview of the additional analyses already performed to guarantee the structural integrity of the manufacturing designs. In this way, the CT and DAs have been strongly involved in preserving the considerable margins obtained previously, which were used to define reasonable compensatory measures for the lack of In-Service Inspections of Nuclear Pressure Equipment (NPE).

  14. Fracture Analysis of Rubber Sealing Material for High Pressure Hydrogen Vessel

    National Research Council Canada - National Science Library

    YAMABE, Junichiro; FUJIWARA, Hirotada; NISHIMURA, Shin

    2011-01-01

    In order to clarify the influence of high-pressure hydrogen gas on mechanical damage in a rubber O-ring, a fracture analysis of the O-ring used as a sealing material of a high-pressure hydrogen vessel was conducted...

  15. Analysis of Blood Flow in a Partially Blocked Bifurcated Blood Vessel

    Science.gov (United States)

    Abdul-Razzak, Hayder; Elkassabgi, Yousri; Punati, Pavan K.; Nasser, Naseer

    2009-09-01

    Coronary artery disease is a major cause of death in the United States. It is the narrowing of the lumens of the coronary blood vessel by a gradual build-up of fatty material, atheroma, which leads to the heart muscle not receiving enough blood. This myocardial ischemia can cause angina, a heart attack, heart failure as well as sudden cardiac death [9]. In this project a solid model of a bifurcated blood vessel with an asymmetric stenosis is developed using GAMBIT and imported into FLUENT for analysis. In FLUENT, pressure and velocity distributions in the blood vessel are studied under different conditions, where the size and position of the blockage in the blood vessel are varied. The location and size of the blockage in the blood vessel are correlated with the pressure and velocity distributions. Results show that such a correlation may be used to predict the size and location of the blockage.

  16. In vitro validation and comparison of different software packages or algorithms for coronary bifurcation analysis using calibrated phantoms: implications for clinical practice and research of bifurcation stenting.

    Science.gov (United States)

    Ishibashi, Yuki; Grundeken, Maik J; Nakatani, Shimpei; Iqbal, Javaid; Morel, Marie-Angele; Généreux, Philippe; Girasis, Chrysafios; Wentzel, Jolanda J; Garcia-Garcia, Hector M; Onuma, Yoshinobu; Serruys, Patrick W

    2015-03-01

    The accuracy and precision of quantitative coronary angiography (QCA) software dedicated for bifurcation lesions compared with conventional single-vessel analysis remains unknown. Furthermore, comparison of different bifurcation analysis algorithms has not been performed. Six plexiglas phantoms with 18 bifurcations were manufactured to a specified tolerance and analyzed with the Cardiovascular Angiography Analysis System (CAAS; Version 5.10, Pie Medical Imaging, Maastricht, The Netherlands) and QAngio XA (Version 7.3, Medis Medical Imaging System BV, Leiden, The Netherlands) software packages. Conventional single-vessel analysis underestimated the reference vessel diameter and percent diameter stenosis in the proximal main vessel while it overestimated these parameters in the distal main vessel and side branch. CAAS software showed better overall accuracy and precision than QAngio XA (with automatic Y- or T-shape bifurcation algorithm selection) for various phantom diameters including minimum lumen diameter (0.012 ± 0.103 mm vs. 0.041 ± 0.322 mm, P = 0.003), reference vessel diameter (-0.050 ± 0.043 mm vs. 0.116 ± 0.610 mm, P = 0.026), and % diameter stenosis (-0.94 ± 4.07 % vs. 1.74 ± 7.49 %, P = 0.041). QAngio XA demonstrated higher minimal lumen diameter, reference vessel diameter, and % diameter stenosis when compared to the actual phantom diameters; however, the accuracy of these parameters improved to a similar level as CAAS when the sole T-shape algorithm in the QAngio XA was used. The use of the single-vessel QCA method is inaccurate in bifurcation lesions. Both CAAS and QAngio XA (when the T shape is systematically used) bifurcation software packages are suitable for quantitative assessment of bifurcations. © 2014 Wiley Periodicals, Inc.
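
    Accuracy and precision figures of the form "0.012 ± 0.103 mm" are conventionally the mean and standard deviation of the signed differences between measured and true phantom diameters; the sketch below shows that calculation with made-up numbers, not the study's data.

        import numpy as np

        # Sketch of how accuracy and precision figures like "0.012 +/- 0.103 mm" are
        # conventionally obtained: the mean (accuracy/bias) and standard deviation
        # (precision) of signed differences between measured and true phantom
        # diameters. The values below are made up, not taken from the study.

        true_mld     = np.array([0.70, 1.00, 1.40, 1.90, 2.50, 3.00])   # phantom (mm)
        measured_mld = np.array([0.73, 0.98, 1.45, 1.88, 2.61, 2.97])   # QCA (mm)

        diff = measured_mld - true_mld
        accuracy  = diff.mean()                 # systematic bias
        precision = diff.std(ddof=1)            # random error (sample SD)
        print(f"accuracy = {accuracy:+.3f} mm, precision = {precision:.3f} mm")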

  17. Change impact analysis for software product lines

    Directory of Open Access Journals (Sweden)

    Jihen Maâzoun

    2016-10-01

    A software product line (SPL) represents a family of products in a given application domain. Each SPL is constructed to provide for the derivation of new products by covering a wide range of features in its domain. Nevertheless, over time, some domain features may become obsolete with the appearance of new features while others may become refined. Accordingly, the SPL must be maintained to account for the domain evolution. Such evolution requires a means for managing the impact of changes on the SPL models, including the feature model and design. This paper presents an automated method that analyzes feature model evolution, traces its impact on the SPL design, and offers a set of recommendations to ensure the consistency of both models. The proposed method defines a set of new metrics adapted to SPL evolution to identify the effort needed to maintain the SPL models consistently and with a quality as good as the original models. The method and its tool are illustrated through an example of an SPL in the Text Editing domain. In addition, they are experimentally evaluated in terms of both the quality of the maintained SPL models and the precision of the change impact management.

  18. JEM-X science analysis software

    DEFF Research Database (Denmark)

    Westergaard, Niels Jørgen Stenfeldt; Kretschmar, P.; Oxborrow, Carol Anne

    2003-01-01

    The science analysis of the data from JEM-X on INTEGRAL is performed through a number of levels including corrections, good time selection, imaging and source finding, spectrum and light-curve extraction. These levels consist of individual executables and the running of the complete analysis is c...

  19. Power Analysis Software for Educational Researchers

    Science.gov (United States)

    Peng, Chao-Ying Joanne; Long, Haiying; Abaci, Serdar

    2012-01-01

    Given the importance of statistical power analysis in quantitative research and the repeated emphasis on it by American Educational Research Association/American Psychological Association journals, the authors examined the reporting practice of power analysis by the quantitative studies published in 12 education/psychology journals between 2005…

  20. Development of Cell Analysis Software for Cultivated Corneal Endothelial Cells.

    Science.gov (United States)

    Okumura, Naoki; Ishida, Naoya; Kakutani, Kazuya; Hongo, Akane; Hiwa, Satoru; Hiroyasu, Tomoyuki; Koizumi, Noriko

    2017-11-01

    To develop analysis software for cultured human corneal endothelial cells (HCECs). Software was designed to recognize cell borders and to provide parameters such as cell density, coefficient of variation, and polygonality of cultured HCECs based on phase contrast images. Cultured HCECs with high or low cell density were incubated with Ca-free and Mg-free phosphate-buffered saline for 10 minutes to reveal the cell borders and were then analyzed with software (n = 50). Phase contrast images showed that cell borders were not distinctly outlined, but these borders became more distinctly outlined after phosphate-buffered saline treatment and were recognized by cell analysis software. The cell density value provided by software was similar to that obtained using manual cell counting by an experienced researcher. Morphometric parameters, such as the coefficient of variation and polygonality, were also produced by software, and these values were significantly correlated with cell density (Pearson correlation coefficients -0.62 and 0.63, respectively). The software described here provides morphometric information from phase contrast images, and it enables subjective and noninvasive quality assessment for tissue engineering therapy of the corneal endothelium.
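
    The morphometric parameters named above can be computed from already-segmented cells; the sketch below derives cell density, the coefficient of variation of cell area, and the fraction of hexagonal cells (one common definition of polygonality) from illustrative inputs, and is not the published software.

        import numpy as np

        # Generic sketch of the morphometric parameters named above, computed from
        # already-segmented cells (areas in um^2 and number of neighbours per cell).
        # This is not the published software; inputs are illustrative.

        def endothelial_morphometry(cell_areas_um2, n_sides):
            areas = np.asarray(cell_areas_um2, float)
            density = 1e6 / areas.mean()                      # cells per mm^2
            cv = areas.std(ddof=1) / areas.mean()             # coefficient of variation
            hexagonality = np.mean(np.asarray(n_sides) == 6)  # fraction of 6-sided cells
            return density, cv, hexagonality

        areas = np.random.default_rng(0).normal(350.0, 60.0, size=200)               # um^2
        sides = np.random.default_rng(1).choice([5, 6, 7], size=200, p=[0.2, 0.6, 0.2])
        print(endothelial_morphometry(areas, sides))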

  1. DEVELOPMENT OF EMITTANCE ANALYSIS SOFTWARE FOR ION BEAM CHARACTERIZATION

    Energy Technology Data Exchange (ETDEWEB)

    Padilla, M. J.; Liu, Y.

    2007-01-01

    Transverse beam emittance is a crucial property of charged particle beams that describes their angular and spatial spread. It is a figure of merit frequently used to determine the quality of ion beams, the compatibility of an ion beam with a given beam transport system, and the ability to suppress neighboring isotopes at on-line mass separator facilities. Generally a high quality beam is characterized by a small emittance. In order to determine and improve the quality of ion beams used at the Holifield Radioactive Ion Beam Facility (HRIBF) for nuclear physics and nuclear astrophysics research, the emittances of the ion beams are measured at the off-line Ion Source Test Facilities. In this project, emittance analysis software was developed to perform various data processing tasks for noise reduction, to evaluate root-mean-square emittance, Twiss parameters, and area emittance of different beam fractions. The software also provides 2D and 3D graphical views of the emittance data, beam profiles, emittance contours, and RMS. Noise exclusion is essential for accurate determination of beam emittance values. A Self-Consistent, Unbiased Elliptical Exclusion (SCUBEEx) method is employed. Numerical data analysis techniques such as interpolation and nonlinear fitting are also incorporated into the software. The software will provide a simplified, fast tool for comprehensive emittance analysis. The main functions of the software package have been completed. In preliminary tests with experimental emittance data, the analysis results using the software were shown to be accurate.
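
    The root-mean-square emittance and Twiss parameters mentioned above follow from the second moments of the transverse (x, x') distribution; the sketch below uses that standard statistical definition and is not the HRIBF analysis code.

        import numpy as np

        # Standard statistical (RMS) emittance and Twiss parameters from second
        # moments of the transverse (x, x') distribution, as mentioned above.
        # Intensity-weighted moments; this is a generic sketch, not the HRIBF code.

        def rms_emittance(x, xp, w=None):
            x, xp = np.asarray(x, float), np.asarray(xp, float)
            w = np.ones_like(x) if w is None else np.asarray(w, float)
            x  = x  - np.average(x,  weights=w)
            xp = xp - np.average(xp, weights=w)
            xx   = np.average(x * x,   weights=w)
            xpxp = np.average(xp * xp, weights=w)
            xxp  = np.average(x * xp,  weights=w)
            eps = np.sqrt(xx * xpxp - xxp**2)                       # RMS emittance
            beta, gamma, alpha = xx / eps, xpxp / eps, -xxp / eps   # Twiss parameters
            return eps, alpha, beta, gamma

        rng = np.random.default_rng(42)
        x  = rng.normal(0.0, 2.0, 10_000)              # mm
        xp = 0.3 * x + rng.normal(0.0, 1.0, 10_000)    # mrad, correlated with x
        print(rms_emittance(x, xp))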

  2. Adapted wavelet analysis from theory to software

    CERN Document Server

    Wickerhauser, Mladen Victor

    1994-01-01

    This detail-oriented text is intended for engineers and applied mathematicians who must write computer programs to perform wavelet and related analysis on real data. It contains an overview of mathematical prerequisites and proceeds to describe hands-on programming techniques to implement special programs for signal analysis and other applications. From the table of contents: - Mathematical Preliminaries - Programming Techniques - The Discrete Fourier Transform - Local Trigonometric Transforms - Quadrature Filters - The Discrete Wavelet Transform - Wavelet Packets - The Best Basis Algorithm - Multidimensional Library Trees - Time-Frequency Analysis - Some Applications - Solutions to Some of the Exercises - List of Symbols - Quadrature Filter Coefficients

  3. Software Process Models and Analysis on Failure of Software Development Projects

    OpenAIRE

    Kaur, Rupinder; Sengupta, Jyotsna

    2013-01-01

    The software process model consists of a set of activities undertaken to design, develop and maintain software systems. A variety of software process models have been designed to structure, describe and prescribe the software development process. Software process models play a very important role in software development, as they form the core of the software product. Software project failure is often devastating to an organization. Schedule slips, buggy releases and missing features can me...

  4. Inter-method agreement in retinal blood vessels diameter analysis between Dynamic Vessel Analyzer and optical coherence tomography.

    Science.gov (United States)

    Benatti, Lucia; Corvi, Federico; Tomasso, Livia; Mercuri, Stefano; Querques, Lea; Ricceri, Fulvio; Bandello, Francesco; Querques, Giuseppe

    2017-06-01

    To analyze the inter-method agreement in arteriovenous ratio (AVR) evaluation between spectral-domain optical coherence tomography (SD-OCT) and Dynamic Vessel Analyzer (DVA). Healthy volunteers underwent DVA and SD-OCT examination. AVR was measured by SD-OCT using the four external lines of the optic nerve head-centered 7-line cube and by DVA using an automated AVR estimation. The mean AVR was calculated, twice, separately by two independent readers for each tool. Twenty-two eyes of 11 healthy subjects (five women and six men, mean age 35) were included. AVR analysis by DVA showed high inter-observer agreement between readers 1 and 2, and high intra-observer agreement for both reader 1 and reader 2. With regard to AVR analysis on SD-OCT, we found high inter-observer agreement between readers 1 and 2, and low intra-observer agreement for reader 2 but high intra-observer agreement for reader 1. Overall, the mean AVR measured on SD-OCT turned out to be significantly higher than the mean AVR measured through DVA (reader 1, 0.9023 ± 0.06 vs 0.8036 ± 0.08). We found a significant difference between the two noninvasive methods for AVR measurement, with a tendency for SD-OCT to overestimate retinal vascular caliber in comparison to DVA. This may be useful for achieving greater accuracy in the evaluation of retinal vessels in ocular as well as systemic diseases.
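
    A sketch of a paired inter-method comparison of AVR readings (AVR being conventionally the ratio of arteriolar to venular calibre, CRAE/CRVE); the values and the choice of a paired t-test plus Pearson correlation are illustrative, not the study's actual data or statistics.

        import numpy as np
        from scipy import stats

        # Sketch of an inter-method comparison of arteriovenous ratio (AVR) readings:
        # AVR is conventionally the ratio of arteriolar to venular calibre (CRAE/CRVE).
        # Paired values below are illustrative, not the study's data.

        avr_oct = np.array([0.91, 0.88, 0.93, 0.89, 0.92, 0.90, 0.95, 0.87])
        avr_dva = np.array([0.82, 0.79, 0.84, 0.80, 0.81, 0.83, 0.85, 0.78])

        t, p = stats.ttest_rel(avr_oct, avr_dva)        # paired comparison
        r, _ = stats.pearsonr(avr_oct, avr_dva)         # between-method correlation
        bias = np.mean(avr_oct - avr_dva)               # Bland-Altman style bias
        print(f"mean difference = {bias:.3f}, t = {t:.2f}, p = {p:.4f}, r = {r:.2f}")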

  5. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report has described the software safety analysis techniques and the engineering guidelines for developing safety-critical software, to identify the state of the art in this field and to give the software safety engineer a trail map between the code and standards layer and the design methodology and documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures, high quality, defense-in-depth, and diversity are considered to be key elements in digital I&C system design. To minimize the possibility of CMFs and thus increase the plant reliability, we have provided defense-in-depth and diversity (D-in-D&D) analysis guidelines.

  6. Propensity Score Analysis in R: A Software Review

    Science.gov (United States)

    Keller, Bryan; Tipton, Elizabeth

    2016-01-01

    In this article, we review four software packages for implementing propensity score analysis in R: "Matching, MatchIt, PSAgraphics," and "twang." After briefly discussing essential elements for propensity score analysis, we apply each package to a data set from the Early Childhood Longitudinal Study in order to estimate the…

  7. Analysis and Design of Cryogenic Pressure Vessels for Automotive Hydrogen Storage

    Science.gov (United States)

    Espinosa-Loza, Francisco Javier

    Cryogenic pressure vessels maximize hydrogen storage density by combining the high pressure (350-700 bar) typical of today's composite pressure vessels with the cryogenic temperature (as low as 25 K) typical of low pressure liquid hydrogen vessels. Cryogenic pressure vessels comprise a high-pressure inner vessel made of carbon fiber-coated metal (similar to those used for storage of compressed gas), a vacuum space filled with numerous sheets of highly reflective metalized plastic (for high performance thermal insulation), and a metallic outer jacket. High density of hydrogen storage is key to practical hydrogen-fueled transportation by enabling (1) long-range (500+ km) transportation with high capacity vessels that fit within available spaces in the vehicle, and (2) reduced cost per kilogram of hydrogen stored through reduced need for expensive structural material (carbon fiber composite) necessary to make the vessel. Low temperature of storage also leads to reduced expansion energy (by an order of magnitude or more vs. ambient temperature compressed gas storage), potentially providing important safety advantages. All this is accomplished while simultaneously avoiding fuel venting typical of cryogenic vessels for all practical use scenarios. This dissertation describes the work necessary for developing and demonstrating successive generations of cryogenic pressure vessels demonstrated at Lawrence Livermore National Laboratory. The work included (1) conceptual design, (2) detailed system design (3) structural analysis of cryogenic pressure vessels, (4) thermal analysis of heat transfer through cryogenic supports and vacuum multilayer insulation, and (5) experimental demonstration. Aside from succeeding in demonstrating a hydrogen storage approach that has established all the world records for hydrogen storage on vehicles (longest driving range, maximum hydrogen storage density, and maximum containment of cryogenic hydrogen without venting), the work also

  8. Software for Data Analysis Programming with R

    CERN Document Server

    Chambers, John

    2008-01-01

    Although statistical design is one of the oldest branches of statistics, its importance is ever increasing, especially in the face of the data flood that often faces statisticians. It is important to recognize the appropriate design, and to understand how to effectively implement it, being aware that the default settings from a computer package can easily provide an incorrect analysis. The goal of this book is to describe the principles that drive good design, paying attention to both the theoretical background and the problems arising from real experimental situations. Designs are motivated t

  9. Confirmatory Factor Analysis Alternative : Free, Accessible CBID Software.

    Science.gov (United States)

    Bott, Marjorie; Karanevich, Alex G; Garrard, Lili; Price, Larry R; Mudaranthakam, Dinesh Pal; Gajewski, Byron

    2016-12-01

    New software that performs Classical and Bayesian Instrument Development (CBID) is reported that seamlessly integrates expert (content validity) and participant data (construct validity) to produce entire reliability estimates with smaller sample requirements. The free CBID software can be accessed through a website and used by clinical investigators in new instrument development. Demonstrations are presented of the three approaches using the CBID software: (a) traditional confirmatory factor analysis (CFA), (b) Bayesian CFA using flat uninformative prior, and (c) Bayesian CFA using content expert data (informative prior). Outcomes of usability testing demonstrate the need to make the user-friendly, free CBID software available to interdisciplinary researchers. CBID has the potential to be a new and expeditious method for instrument development, adding to our current measurement toolbox. This allows for the development of new instruments for measuring determinants of health in smaller diverse populations or populations of rare diseases.

  10. Software Construction and Analysis Tools for Future Space Missions

    Science.gov (United States)

    Lowry, Michael R.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    NASA and its international partners will increasingly depend on software-based systems to implement advanced functions for future space missions, such as Martian rovers that autonomously navigate long distances exploring geographic features formed by surface water early in the planet's history. The software-based functions for these missions will need to be robust and highly reliable, raising significant challenges in the context of recent Mars mission failures attributed to software faults. After reviewing these challenges, this paper describes tools that have been developed at NASA Ames that could contribute to meeting these challenges: 1) Program synthesis tools based on automated inference that generate documentation for manual review and annotations for automated certification. 2) Model-checking tools for concurrent object-oriented software that achieve scalability through synergy with program abstraction and static analysis tools.

  11. Residual stress analysis in BWR pressure vessel attachments

    Energy Technology Data Exchange (ETDEWEB)

    Dexter, R.J.; Leung, C.P. (Southwest Research Inst., San Antonio, TX (United States)); Pont, D. (FRAMASOFT+CSI, 69 - Lyon (France). Div. of Framatome)

    1992-06-01

    Residual stresses from welding processes can be the primary driving force for stress corrosion cracking (SCC) in BWR components. Thus, a better understanding of the causes and nature of these residual stresses can help assess and remedy SCC. Numerical welding simulation software, such as SYSWELD, and material property data have been used to quantify residual stresses for application to SCC assessments in BWR components. Furthermore, parametric studies using SYSWELD have revealed which variables significantly affect predicted residual stress. Overall, numerical modeling techniques can be used to evaluate residual stress for SCC assessments of BWR components and to identify and plan future SCC research.

  12. Equipment Obsolescence Analysis and Management Software

    Energy Technology Data Exchange (ETDEWEB)

    Redmond, J.; Carret, L.; Shaon, S.; Schultz, C.

    2015-07-01

    The procurement engineering resources at Nuclear Power Plants (NPPs) are experiencing an increasing backlog of procurement items, primarily due to the inability to order the original replacement parts. The level of effort and time required to prepare procurement packages is increasing since the number of obsolete parts is increasing exponentially. Procurement packages for obsolete components and parts are much more complex and take more time to prepare because of the need to perform equivalency evaluations, testing requirements and test acceptance criteria development, commercial grade dedication or equipment qualification, and increasing efforts to verify that no fraudulent or counterfeit parts are procured. This problem will be further compounded when NPPs pursue license renewal and approval for plant-life extension. Advanced planning and advanced knowledge of equipment obsolescence are required to allow for sufficient time to properly procure replacement parts for obsolete items. The uncertain supply chain capability due to obsolescence is a real problem and can pose a risk to reliable plant operations because of the potential lack of available spare parts and replacement components to support outages and unplanned component failures. Advance notification of obsolescence is increasingly important to ensure that adequate time and planning are scheduled to procure the proper replacement parts. A thorough analysis of Original Equipment Manufacturer (OEM) availability and inventory, as well as an analysis of failure rates and usage rates, is required to predict critical part needs and allow for early identification of obsolescence issues so that a planned and controlled strategy to qualify replacement equipment can be implemented. (Author)

  13. The software analysis project for the Office of Human Resources

    Science.gov (United States)

    Tureman, Robert L., Jr.

    1994-01-01

    There were two major sections of the project for the Office of Human Resources (OHR). The first section was to conduct a planning study to analyze software use with the goal of recommending software purchases and determining whether the need exists for a file server. The second section was analysis and distribution planning for a retirement planning computer program entitled VISION provided by NASA Headquarters. The software planning study was developed to help OHR analyze the current administrative desktop computing environment and make decisions regarding software acquisition and implementation. There were three major areas addressed by the study: the current environment, new software requirements, and strategies regarding the implementation of a server in the Office. To gather data on the current environment, employees were surveyed and an inventory of computers was produced. The surveys were compiled and analyzed by the ASEE fellow with interpretation help from OHR staff. New software requirements represented a compilation and analysis of the surveyed requests of OHR personnel. Finally, the information on the use of a server represents research done by the ASEE fellow and analysis of survey data to determine software requirements for a server. This included selection of a methodology to estimate the number of copies of each software program required given current use and estimated growth. The report presents the results of the computing survey, a description of the current computing environment, recommendations for changes in the computing environment, current software needs, management advantages of using a server, and management considerations in the implementation of a server. In addition, detailed specifications were presented for the hardware and software recommendations to offer a complete picture to OHR management. The retirement planning computer program available to NASA employees will aid in long-range retirement planning. The intended audience is the NASA civil

  14. Applications of the BEam Cross section Analysis Software (BECAS)

    DEFF Research Database (Denmark)

    Blasques, José Pedro Albergaria Amaral; Bitsche, Robert; Fedorov, Vladimir

    2013-01-01

    A newly developed framework is presented for structural design and analysis of long slender beam-like structures, e.g., wind turbine blades. The framework is based on the BEam Cross section Analysis Software – BECAS – a finite element based cross section analysis tool. BECAS is used for the generation of beam finite element models which correctly account for effects stemming from material anisotropy and inhomogeneity in cross sections of arbitrary geometry. This type of modelling approach allows for an accurate yet computationally inexpensive representation of a general class of three...

  15. HANSIS software tool for the automated analysis of HOLZ lines

    Energy Technology Data Exchange (ETDEWEB)

    Holec, D., E-mail: david.holec@unileoben.ac.at [Department of Materials Science and Metallurgy, University of Cambridge, Pembroke Street, Cambridge CB2 3QZ (United Kingdom); Sridhara Rao, D.V.; Humphreys, C.J. [Department of Materials Science and Metallurgy, University of Cambridge, Pembroke Street, Cambridge CB2 3QZ (United Kingdom)

    2009-06-15

    A software tool, named as HANSIS (HOLZ analysis), has been developed for the automated analysis of higher-order Laue zone (HOLZ) lines in convergent beam electron diffraction (CBED) patterns. With this tool, the angles and distances between the HOLZ intersections can be measured and the data can be presented graphically with a user-friendly interface. It is capable of simultaneous analysis of several HOLZ patterns and thus provides a tool for systematic studies of CBED patterns.
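
    Measuring angles and distances between HOLZ-line intersections reduces to simple plane geometry once each line is parameterized; the sketch below (not the HANSIS code) intersects lines given as a·x + b·y = c and reports the angle between them and the distance between intersection points.

        import numpy as np

        # Small geometry sketch (not the HANSIS code): intersect two straight lines
        # given as a*x + b*y = c, and measure the acute angle between them and the
        # distance between two such intersection points.

        def intersect(l1, l2):
            A = np.array([l1[:2], l2[:2]], float)
            c = np.array([l1[2], l2[2]], float)
            return np.linalg.solve(A, c)                 # intersection point (x, y)

        def angle_deg(l1, l2):
            n1, n2 = np.array(l1[:2], float), np.array(l2[:2], float)
            cosang = abs(n1 @ n2) / (np.linalg.norm(n1) * np.linalg.norm(n2))
            return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

        # Three hypothetical HOLZ lines (a, b, c) digitised from a CBED pattern:
        lines = [(1.0, 0.2, 5.0), (0.1, 1.0, 3.0), (1.0, -1.0, 0.5)]
        p12, p13 = intersect(lines[0], lines[1]), intersect(lines[0], lines[2])
        print("angle 1-2:", round(angle_deg(lines[0], lines[1]), 2), "deg")
        print("distance between intersections:", round(np.linalg.norm(p12 - p13), 3))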

  16. Analysis of The Propulsion System Towards The Speed Reduction of Vessels Type PC-43

    Directory of Open Access Journals (Sweden)

    Arica Dwi Susanto

    2017-05-01

    The PC-43 is an Indonesian navy vessel of the limited patrol craft type, built in Indonesia. The vessel was designed in steel for a maximum speed of 27 knots with an installed engine power of 3 x 1800 HP, at a draft of T = 1.40 m when empty and T = 1.70 m at full load. Within one year of launching, the speed in current conditions had decreased to 22 knots at a 1.50 m draft. This paper therefore analyzes the effect of changes in the vessel's draft on resistance and examines the currently installed engine power. Two methods of calculation were used: numerical resistance and power calculation, and resistance and power calculation using the Maxsurf software. The manual power calculation at T = 1.65 m and 27 knots gives a required power of BHPscr = 4245.04 HP. Since the installed power is 3 x 1800 = 5400 HP, a theoretical speed of 27 knots can still be achieved. Thus, resistance and power are not among the causes of the speed reduction in the Vessel Type PC-43.
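
    The resistance-to-power chain behind a figure such as BHPscr is typically effective power P_E = R_T·V divided by hull, open-water, relative-rotative and shaft efficiencies, plus a sea margin; the sketch below uses assumed efficiency and resistance values for illustration and does not reproduce the paper's calculation.

        # Sketch of the usual resistance-to-power chain behind a figure such as
        # "BHPscr": effective power P_E = R_T * V, divided by hull, open-water,
        # relative-rotative and shaft efficiencies, plus a sea margin. Efficiency
        # values and the resistance figure below are assumptions for illustration.

        def brake_power_kw(resistance_kn, speed_knots,
                           eta_h=1.05, eta_o=0.55, eta_r=1.0, eta_s=0.97,
                           sea_margin=0.15):
            v = speed_knots * 0.5144                      # knots -> m/s
            p_e = resistance_kn * v                       # effective power, kW
            p_b = p_e / (eta_h * eta_o * eta_r * eta_s)   # brake power, kW
            return p_b * (1.0 + sea_margin)               # add sea margin

        p_b = brake_power_kw(resistance_kn=140.0, speed_knots=27.0)
        print(f"required brake power ~ {p_b:.0f} kW ({p_b / 0.7457:.0f} HP)")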

  17. First statistical analysis of Geant4 quality software metrics

    Science.gov (United States)

    Ronchieri, Elisabetta; Grazia Pia, Maria; Giacomini, Francesco

    2015-12-01

    Geant4 is a simulation system of particle transport through matter, widely used in several experimental areas from high energy physics and nuclear experiments to medical studies. Some of its applications may involve critical use cases; therefore they would benefit from an objective assessment of the software quality of Geant4. In this paper, we provide a first statistical evaluation of software metrics data related to a set of Geant4 physics packages. The analysis aims at identifying risks for Geant4 maintainability, which would benefit from being addressed at an early stage. The findings of this pilot study set the grounds for further extensions of the analysis to the whole of Geant4 and to other high energy physics software systems.

  18. Texture analysis software: integration with a radiological workstation.

    Science.gov (United States)

    Duvauferrier, Régis; Bezy, Joan; Bertaud, Valérie; Toussaint, Grégoire; Morelli, John; Lasbleiz, Jeremy

    2012-01-01

    Image analysis is the daily task of radiologists. The texture of a structure or imaging finding can be more difficult to describe than other parameters. Image processing can help the radiologist in completing this difficult task. The aim of this article is to explain how we have developed texture analysis software and integrated it into a standard radiological workstation. The texture analysis method has been divided into three steps: definition of primitive elements, counting, and statistical analysis. The software was developed in C++ and integrated into a Siemens workstation with a graphical user interface. The results of analyses may be exported in Excel format. The software allows users to perform texture analyses on any type of radiological image without the need for image transfer by simply placing a region of interest. This tool has already been used to assess the trabecular network of vertebra. The integration of such software into PACS extends the applicability of texture analysis beyond that of a mere research tool and facilitates its use in routine clinical practice.
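
    The three steps described (primitive elements, counting, statistical analysis) map naturally onto a grey-level co-occurrence approach; the sketch below is a generic illustration of that idea, not the authors' implementation.

        import numpy as np

        # Sketch of the three steps described above (primitive elements, counting,
        # statistical analysis) using a grey-level co-occurrence matrix over a region
        # of interest. This is a generic illustration, not the authors' software.

        def cooccurrence(roi, levels=8, dx=1, dy=0):
            q = (roi.astype(float) / roi.max() * (levels - 1)).astype(int)  # quantise
            glcm = np.zeros((levels, levels), float)
            a = q[:q.shape[0] - dy, :q.shape[1] - dx]        # pixel
            b = q[dy:, dx:]                                  # neighbour at (dx, dy)
            np.add.at(glcm, (a.ravel(), b.ravel()), 1.0)     # counting step
            return glcm / glcm.sum()

        roi = np.random.default_rng(0).integers(0, 256, size=(64, 64))
        P = cooccurrence(roi)
        i, j = np.indices(P.shape)
        contrast    = np.sum(P * (i - j) ** 2)               # statistical analysis
        homogeneity = np.sum(P / (1.0 + np.abs(i - j)))
        print(round(contrast, 3), round(homogeneity, 3))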

  19. Software Tool for Real-Time Power Quality Analysis

    Directory of Open Access Journals (Sweden)

    CZIKER, A. C.

    2013-11-01

    A software tool dedicated for the analysis of power signals containing harmonic and interharmonic components, unbalance, voltage dips and voltage swells is presented. The software tool is a virtual instrument, which uses innovative algorithms based on time- and frequency-domain analysis to process power signals. In order to detect the temporary disturbances, edge detection is proposed, whereas for the harmonic analysis Gaussian filter banks are implemented. Considering that a signal recovery algorithm is applied, the harmonic analysis can be made even if voltage dips or swells appear. The virtual instrument input data can be recorded or online signals, the latter acquired through a data acquisition board. The virtual instrument was tested using both virtually created and real signals from measurements performed in distribution networks. The paper contains a numeric example made on a synthetic digital signal and an analysis made in real-time.
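
    A minimal sketch of the two ingredients described above: harmonic magnitudes from a windowed FFT and dip detection from a per-cycle RMS threshold. The signal, window choice and thresholds are assumptions for illustration, not the paper's algorithms.

        import numpy as np

        # Minimal sketch of the two ingredients described above: harmonic magnitudes
        # from a windowed FFT and dip/swell detection from a per-cycle RMS threshold.
        # Thresholds and signal parameters are assumptions for illustration.

        fs, f0, cycles = 6400, 50, 50                     # sampling rate, fundamental
        t = np.arange(cycles * fs // f0) / fs
        v = 230*np.sqrt(2)*np.sin(2*np.pi*f0*t) + 15*np.sin(2*np.pi*5*f0*t)  # 5th harm.
        v[int(0.4*len(v)):int(0.6*len(v))] *= 0.6         # synthetic voltage dip

        # Harmonic analysis on an integer number of cycles (here the whole record):
        win = np.hanning(len(v))
        spec = np.abs(np.fft.rfft(v * win)) * 2 / np.sum(win)
        freqs = np.fft.rfftfreq(len(v), 1 / fs)
        h5 = spec[np.argmin(np.abs(freqs - 5 * f0))]      # 5th-harmonic peak amplitude

        # Dip detection: RMS per fundamental cycle against a 90 % threshold:
        n = fs // f0
        rms = np.sqrt((v[: len(v)//n*n].reshape(-1, n) ** 2).mean(axis=1))
        dip_cycles = np.where(rms < 0.9 * 230)[0]
        print(f"5th harmonic ~ {h5:.1f} V peak, dip detected in cycles {dip_cycles}")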

  20. New Results in Software Model Checking and Analysis

    Science.gov (United States)

    Pasareanu, Corina S.

    2010-01-01

    This introductory article surveys new techniques, supported by automated tools, for the analysis of software to ensure reliability and safety. Special focus is on model checking techniques. The article also introduces the five papers that are enclosed in this special journal volume.

  1. The Implication of Using NVivo Software in Qualitative Data Analysis ...

    African Journals Online (AJOL)

    2015-03-15

    However, Computer Assisted Qualitative Data Analysis Software (CAQDAS) packages are increasingly being developed. ... Approval was granted by Malawi's National Health Science Research Committee. ... The assumption of the tutorial is that the researcher has a basic understanding of the computer, which ...

  2. A Pedagogical Software for the Analysis of Loudspeaker Systems

    Science.gov (United States)

    Pueo, B.; Roma, M.; Escolano, J.; Lopez, J. J.

    2009-01-01

    In this paper, a pedagogical software tool for the design and analysis of loudspeaker systems is presented, with emphasis on training students in the interaction between system parameters. Loudspeakers are complex electromechanical systems, whose behavior is neither intuitive nor easy for inexperienced students to understand. Although commercial…

  3. UTOOLS: microcomputer software for spatial analysis and landscape visualization.

    Science.gov (United States)

    Alan A. Ager; Robert J. McGaughey

    1997-01-01

    UTOOLS is a collection of programs designed to integrate various spatial data in a way that allows versatile spatial analysis and visualization. The programs were designed for watershed-scale assessments in which a wide array of resource data must be integrated, analyzed, and interpreted. UTOOLS software combines raster, attribute, and vector data into "spatial...

  4. Software Product "Equilibrium" for Preparation and Analysis of Aquatic Solutions

    CERN Document Server

    Bontchev, G D; Ivanov, P I; Maslov, O D; Milanov, M V; Dmitriev, S N

    2003-01-01

    Software product "Equilibrium" for preparation and analysis of aquatic solutions is developed. The program allows determining analytical parameters of a solution, such as ionic force and pH. "Equilibrium" is able to calculate the ratio of existing ion forms in the solution, with respect to the hydrolysis and complexation in the presence of one or more ligands.

  5. Using Business Analysis Software in a Business Intelligence Course

    Science.gov (United States)

    Elizondo, Juan; Parzinger, Monica J.; Welch, Orion J.

    2011-01-01

    This paper presents an example of a project used in an undergraduate business intelligence class which integrates concepts from statistics, marketing, and information systems disciplines. SAS Enterprise Miner software is used as the foundation for predictive analysis and data mining. The course culminates with a competition and the project is used…

  6. Orbiter subsystem hardware/software interaction analysis. Volume 8: AFT reaction control system, part 2

    Science.gov (United States)

    Becker, D. D.

    1980-01-01

    The orbiter subsystems and interfacing program elements which interact with the orbiter computer flight software are analyzed. The failure modes identified in the subsystem/element failure mode and effects analysis are examined. Potential interaction with the software is examined through an evaluation of the software requirements. The analysis is restricted to flight software requirements and excludes utility/checkout software. The results of the hardware/software interaction analysis for the forward reaction control system are presented.

  7. One-Click Data Analysis Software for Science Operations

    Science.gov (United States)

    Navarro, Vicente

    2015-12-01

    One of the important activities of ESA Science Operations Centre is to provide Data Analysis Software (DAS) to enable users and scientists to process data further to higher levels. During operations and post-operations, Data Analysis Software (DAS) is fully maintained and updated for new OS and library releases. Nonetheless, once a Mission goes into the "legacy" phase, there are very limited funds and long-term preservation becomes more and more difficult. Building on Virtual Machine (VM), Cloud computing and Software as a Service (SaaS) technologies, this project has aimed at providing long-term preservation of Data Analysis Software for the following missions: - PIA for ISO (1995) - SAS for XMM-Newton (1999) - Hipe for Herschel (2009) - EXIA for EXOSAT (1983) Following goals have guided the architecture: - Support for all operations, post-operations and archive/legacy phases. - Support for local (user's computer) and cloud environments (ESAC-Cloud, Amazon - AWS). - Support for expert users, requiring full capabilities. - Provision of a simple web-based interface. This talk describes the architecture, challenges, results and lessons learnt gathered in this project.

  8. SIMA: Python software for analysis of dynamic fluorescence imaging data

    OpenAIRE

    Patrick eKaifosh; Jeffrey eZaremba; Nathan eDanielson; Attila eLosonczy

    2014-01-01

    Fluorescence imaging is a powerful method for monitoring dynamic signals in the nervous system. However, analysis of dynamic fluorescence imaging data remains burdensome, in part due to the shortage of available software tools. To address this need, we have developed SIMA, an open source Python package that facilitates common analysis tasks related to fluorescence imaging. Functionality of this package includes correction of motion artifacts occurring during in vivo imaging with laser-scannin...

  9. Application of econometric and ecology analysis methods in physics software

    Science.gov (United States)

    Han, Min Cheol; Hoff, Gabriela; Kim, Chan Hyeong; Kim, Sung Hun; Grazia Pia, Maria; Ronchieri, Elisabetta; Saracco, Paolo

    2017-10-01

    Some data analysis methods typically used in econometric studies and in ecology have been evaluated and applied in physics software environments. They concern the evolution of observables through objective identification of change points and trends, and measurements of inequality, diversity and evenness across a data set. Within each analysis area, various statistical tests and measures have been examined. This conference paper gives a brief overview of some of these methods.
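    As an illustration of the kind of ecology-style measures the abstract mentions, the following hedged Python sketch computes Shannon diversity/evenness and the Gini inequality index over a set of counts; the paper's actual measures and implementations may differ:

    ```python
    import numpy as np

    # Illustrative measures over counts (e.g. of software entities); function
    # names are ours, not the paper's.
    def shannon_diversity(counts):
        p = np.asarray(counts, dtype=float)
        p = p[p > 0] / p.sum()
        return -np.sum(p * np.log(p))

    def evenness(counts):
        n = np.count_nonzero(counts)
        return shannon_diversity(counts) / np.log(n) if n > 1 else 1.0

    def gini(values):
        x = np.sort(np.asarray(values, dtype=float))   # ascending order
        n = x.size
        cum = np.cumsum(x)
        return (n + 1 - 2 * np.sum(cum) / cum[-1]) / n

    counts = [120, 30, 30, 10, 5]
    print(shannon_diversity(counts), evenness(counts), gini(counts))
    ```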

  10. Morphological analysis of vessel elements for systematic study of three Zingiberaceae tribes.

    Science.gov (United States)

    Gevú, Kathlyn Vasconcelos; Lima, Helena Regina Pinto; Kress, John; Da Cunha, Maura

    2017-05-01

    Zingiberaceae contains over 1,000 species, divided into four subfamilies and six tribes. In recent decades, there has been an increase in the number of studies on vessel elements in monocotyledon families. However, there are still few studies of Zingiberaceae tribes. This study aims to establish the systematic significance of vessel elements in two subfamilies and three tribes of Zingiberaceae. The vegetative organs of 33 species were processed and analysed by light and scanning electron microscopy, and Principal Component Analysis was used to elucidate generic boundaries. Characteristics of the vessel elements, such as the type of perforation plate, the number of bars and the type of parietal thickening, proved to be important for establishing the relationships among taxa. Scalariform perforation plates and scalariform parietal thickening are frequent in Zingiberaceae and may be a plesiomorphic condition for this taxon. In the Principal Component Analysis, the most significant characters of the vessel elements were: simple perforation plates and partially pitted parietal thickening, found only in the Alpinieae tribe, and 40 or more bars composing the plate in Elettariopsis curtisii, Renealmia chrysotricha, Zingiber spectabile, Z. officinale, Curcuma and Globba species. Vessel element characters of 18 species of Alpinieae, Zingibereae and Globbeae are described here for the first time.

  11. Progressive Failure Analysis of Adhesive Joints of Filament-Wound Composite Pressure Vessel

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Junhwan; Shin, Kwangbok [Hanbat National University, Daejeon (Korea, Republic of); Hwang, Taekyung [Agency for Defence Development, Daejeon (Korea, Republic of)

    2014-11-15

    This study performed a progressive failure analysis of the adhesive joints of a composite pressure vessel with a separated dome by using a cohesive zone model. In order to determine the input parameters of a cohesive element for numerical analysis, the interlaminar fracture toughness values in modes I and II and in the mixed mode for the adhesive joints of the composite pressure vessel were obtained by material tests. All specimens were manufactured by the filament winding method. A mechanical test was performed on adhesively bonded double-lap joints to determine the shear strength of the adhesive joints and verify the reliability of the cohesive zone model for progressive failure analysis. The test results showed that the shear strength of the adhesive joints was 32 MPa; the experiment and analysis results had an error of about 4.4%, indicating relatively good agreement. The progressive failure analysis of a composite pressure vessel with an adhesively bonded dome performed using the cohesive zone model showed that only 5.8% of the total adhesive length was debonded and that this debonded length did not affect the structural integrity of the vessel.

  12. Novel software package for cross-platform transcriptome analysis (CPTRA).

    Science.gov (United States)

    Zhou, Xin; Su, Zhen; Sammons, R Douglas; Peng, Yanhui; Tranel, Patrick J; Stewart, C Neal; Yuan, Joshua S

    2009-10-08

    Next-generation sequencing techniques enable several novel transcriptome profiling approaches. Recent studies indicated that digital gene expression profiling based on short sequence tags has superior performance as compared to other transcriptome analysis platforms including microarrays. However, transcriptomic analysis with tag-based methods often depends on an available genome sequence. The use of tag-based methods in species without a genome sequence should be complemented by other methods such as cDNA library sequencing. The combination of different next-generation sequencing techniques like 454 pyrosequencing and Illumina Genome Analyzer (Solexa) will enable high-throughput and accurate global gene expression profiling in species with limited genome information. The combination of transcriptome data acquisition methods requires cross-platform transcriptome data analysis platforms, including a new software package for data processing. Here we present a software package, CPTRA: Cross-Platform TRanscriptome Analysis, to analyze transcriptome profiling data from separate methods. The software package is available at http://people.tamu.edu/~syuan/cptra/cptra.html. It was applied to the case study of non-target site glyphosate resistance in horseweed, and the data were mined to discover resistance target gene(s). For the software, the input data included a long-read sequence dataset with proper annotation, and a short-read sequence tag dataset for the quantification of transcripts. By combining the two datasets, the software carries out unique sequence tag identification, tag counting for transcript quantification, and cross-platform sequence matching functions, whereby the short sequence tags can be annotated with a function, level of expression, and Gene Ontology (GO) classification. Multiple sequence search algorithms were implemented and compared. The analysis highlighted the importance of transport genes in glyphosate resistance and identified
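    The tag counting and cross-platform matching described above can be illustrated with a toy Python sketch (sequences, names and the exact-match strategy are ours; CPTRA's own algorithms are more elaborate):

    ```python
    from collections import Counter

    # Hedged sketch: quantify annotated long reads (e.g. 454 contigs) by counting
    # short sequence tags (e.g. Illumina tags) that match them exactly, on either
    # strand. This is an illustration only, not CPTRA's code.
    def revcomp(seq):
        return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

    def count_tags(contigs, tags):
        """contigs: dict {contig_id: sequence}; tags: iterable of short tag strings."""
        expression = Counter()
        for tag in tags:
            for contig_id, seq in contigs.items():
                if tag in seq or revcomp(tag) in seq:
                    expression[contig_id] += 1
        return expression

    contigs = {"contig_1": "ATGGCGTACGTTAGC", "contig_2": "TTGACCGGTAAC"}
    tags = ["CGTACG", "CCGGTA", "GCTAAC"]   # the last matches contig_1 on the reverse strand
    print(count_tags(contigs, tags))
    ```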

  13. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for the morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify relief breakpoints (knickpoints) along drainage profiles. The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. To assess its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), an area of constant intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of identification of deformed areas. Therefore, this software tool may be considered useful for neotectonic analyses of large areas and may be applied to any area with DEM coverage.
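    A hedged sketch of the Hack (1973) stream length-gradient (SL) index on which such knickpoint detection rests is shown below; the threshold, data and flagging rule are illustrative and not Knickpoint Finder's actual implementation:

    ```python
    import numpy as np

    # SL = (dH/dL) * L for each reach of a stream profile, with reaches whose SL
    # is anomalously high flagged as knickpoint candidates.
    def sl_index(distance, elevation):
        """distance: downstream distance from the headwater (m); elevation: (m)."""
        distance = np.asarray(distance, float)
        elevation = np.asarray(elevation, float)
        dH = -np.diff(elevation)                  # elevation drop per reach
        dL = np.diff(distance)                    # reach length
        L = 0.5 * (distance[:-1] + distance[1:])  # distance to reach midpoint
        return (dH / dL) * L

    def knickpoints(distance, elevation, factor=2.0):
        sl = sl_index(distance, elevation)
        return np.where(sl > factor * np.median(sl))[0]   # indices of anomalous reaches

    dist = [100, 400, 800, 1200, 1600, 2000]
    elev = [950, 930, 910, 860, 850, 845]   # sharp drop between 800 m and 1200 m
    print(knickpoints(dist, elev))          # -> [2]
    ```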

  14. Quantitative software analysis of ultrasonographic textures in experimental testicular torsion.

    Science.gov (United States)

    Aslan, Mustafa; Kucukaslan, Ibrahim; Mulazimoglu, Serkan; Soyer, Tutku; Şenyücel, Mine; Çakmak, Murat; Scholbach, Jakob; Aslan, Selim

    2013-04-01

    Ultrasonography (US) has high diagnostic value in testicular torsion but is vulnerable to several potential errors, especially in the early period. Echotexture (ETX) analysis software provides a numerical expression of B-mode images and allows quantitative evaluation of blood flow due to ischemic damage using power Doppler US (PDUS) analysis. Our aim in this study was to determine the diagnostic value and effective parameters of ETX analysis software in the early period of torsion using B-mode and PDUS images. In this study, eight rats were used. Following anesthesia, the right testis was rotated to a 1080-degree counterclockwise position whereas the left testis was left in place as a control. B-mode and PDUS images of both sides were recorded with a portable US device immediately (0 hour) and 1 and 2 hours after torsion. The B-mode images were analyzed in terms of gradient, homogeneity, and contrast using the BS200pro software (BAB Digital Imaging System 2007, Ankara, Turkey). Intensity (I)-red and area (A)-red values were measured on PDUS images with the Pixelflux (Version 1.0, Chameleon-Software, Leipzig, Germany). The data were evaluated by the Mann-Whitney U and Wilcoxon tests. Data from B-mode US image ETX analysis showed no significant difference between the right and left testicles at 0 to 2 hours (p > 0.05). The values obtained from PDUS analysis (I-red and A-red) decreased significantly on the torsion side at the end of the second hour: flow at 1 hour did not differ significantly (p > 0.05), whereas it was significantly lower at 2 hours (p < 0.05) on PDUS analysis.
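    As a rough illustration of echotexture measures such as contrast and homogeneity, the following Python sketch derives them from a gray-level co-occurrence matrix of an image patch; it is not the BS200pro or Pixelflux algorithm, whose texture definitions may differ:

    ```python
    import numpy as np

    def glcm(image, levels=8):
        """Horizontal-neighbour co-occurrence matrix of a 2-D grayscale image."""
        img = (np.asarray(image) * levels // 256).astype(int)       # quantize gray levels
        m = np.zeros((levels, levels))
        for a, b in zip(img[:, :-1].ravel(), img[:, 1:].ravel()):   # pixel and right neighbour
            m[a, b] += 1
        return m / m.sum()

    def contrast_and_homogeneity(p):
        i, j = np.indices(p.shape)
        contrast = np.sum(p * (i - j) ** 2)
        homogeneity = np.sum(p / (1.0 + np.abs(i - j)))
        return contrast, homogeneity

    patch = np.random.randint(0, 256, (64, 64))   # stand-in for a B-mode ROI
    print(contrast_and_homogeneity(glcm(patch)))
    ```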

  15. An analysis of alignment and integral based kernels for machine learning from vessel trajectories

    NARCIS (Netherlands)

    de Vries, G.K.D.; van Someren, M.

    2014-01-01

    In this paper we present an analysis of the application of the two most important types of similarity measures for moving object trajectories in machine learning from vessel movement data. These similarities are applied in the tasks of clustering, classification and outlier detection. The first

  16. Rapid analysis of vessel elements (RAVE): a tool for studying physiologic, pathologic and tumor angiogenesis.

    Science.gov (United States)

    Seaman, Marc E; Peirce, Shayn M; Kelly, Kimberly

    2011-01-01

    Quantification of microvascular network structure is important in a myriad of emerging research fields including microvessel remodeling in response to ischemia and drug therapy, tumor angiogenesis, and retinopathy. To mitigate analyst-specific variation in measurements and to ensure that measurements represent actual changes in vessel network structure and morphology, a reliable and automatic tool for quantifying microvascular network architecture is needed. Moreover, an analysis tool capable of acquiring and processing large data sets will facilitate advanced computational analysis and simulation of microvascular growth and remodeling processes and enable more high throughput discovery. To this end, we have produced an automatic and rapid vessel detection and quantification system using a MATLAB graphical user interface (GUI) that vastly reduces time spent on analysis and greatly increases repeatability. Analysis yields numerical measures of vessel volume fraction, vessel length density, fractal dimension (a measure of tortuosity), and radii of murine vascular networks. Because our GUI is open sourced to all, it can be easily modified to measure parameters such as percent coverage of non-endothelial cells, number of loops in a vascular bed, amount of perfusion and two-dimensional branch angle. Importantly, the GUI is compatible with standard fluorescent staining and imaging protocols, but also has utility analyzing brightfield vascular images, obtained, for example, in dorsal skinfold chambers. A manually measured image can be typically completed in 20 minutes to 1 hour. In stark comparison, using our GUI, image analysis time is reduced to around 1 minute. This drastic reduction in analysis time coupled with increased repeatability makes this tool valuable for all vessel research especially those requiring rapid and reproducible results, such as anti-angiogenic drug screening.
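    Two of the measures listed above, vessel area fraction and vessel length density, can be sketched from a binary vessel mask as follows (Python with scikit-image; RAVE itself is a MATLAB GUI and its algorithms may differ):

    ```python
    import numpy as np
    from skimage.morphology import skeletonize

    def vessel_metrics(mask, pixel_size_um=1.0):
        """mask: 2-D boolean array, True where a vessel was segmented."""
        mask = np.asarray(mask, bool)
        area_fraction = mask.mean()                   # vessel area / image area
        skeleton = skeletonize(mask)                  # 1-pixel-wide centrelines
        length_um = skeleton.sum() * pixel_size_um    # approximate total vessel length
        field_area_um2 = mask.size * pixel_size_um ** 2
        length_density = length_um / field_area_um2   # length per unit area
        return area_fraction, length_density

    mask = np.zeros((100, 100), bool)
    mask[48:52, :] = True                             # a single horizontal "vessel"
    print(vessel_metrics(mask, pixel_size_um=2.0))
    ```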

  17. Rapid analysis of vessel elements (RAVE): a tool for studying physiologic, pathologic and tumor angiogenesis.

    Directory of Open Access Journals (Sweden)

    Marc E Seaman

    Full Text Available Quantification of microvascular network structure is important in a myriad of emerging research fields including microvessel remodeling in response to ischemia and drug therapy, tumor angiogenesis, and retinopathy. To mitigate analyst-specific variation in measurements and to ensure that measurements represent actual changes in vessel network structure and morphology, a reliable and automatic tool for quantifying microvascular network architecture is needed. Moreover, an analysis tool capable of acquiring and processing large data sets will facilitate advanced computational analysis and simulation of microvascular growth and remodeling processes and enable more high throughput discovery. To this end, we have produced an automatic and rapid vessel detection and quantification system using a MATLAB graphical user interface (GUI) that vastly reduces time spent on analysis and greatly increases repeatability. Analysis yields numerical measures of vessel volume fraction, vessel length density, fractal dimension (a measure of tortuosity), and radii of murine vascular networks. Because our GUI is open sourced to all, it can be easily modified to measure parameters such as percent coverage of non-endothelial cells, number of loops in a vascular bed, amount of perfusion and two-dimensional branch angle. Importantly, the GUI is compatible with standard fluorescent staining and imaging protocols, but also has utility analyzing brightfield vascular images, obtained, for example, in dorsal skinfold chambers. A manually measured image can be typically completed in 20 minutes to 1 hour. In stark comparison, using our GUI, image analysis time is reduced to around 1 minute. This drastic reduction in analysis time coupled with increased repeatability makes this tool valuable for all vessel research especially those requiring rapid and reproducible results, such as anti-angiogenic drug screening.

  18. Safety analysis of nuclear containment vessels subjected to strong earthquakes and subsequent tsunamis

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Feng; Li, Hong Zhi [Dept. Structural Engineering, Tongji University, Shanghai (China)

    2017-08-15

    Nuclear power plants under expansion and under construction in China are mostly located in coastal areas, which means they are at risk of suffering strong earthquakes and subsequent tsunamis. This paper presents a safety analysis for a new reinforced concrete containment vessel in such events. A finite element method-based model was built, verified, and first used to understand the seismic performance of the containment vessel under earthquakes with increased intensities. Then, the model was used to assess the safety performance of the containment vessel subject to an earthquake with peak ground acceleration (PGA) of 0.56g and subsequent tsunamis with increased inundation depths, similar to the 2011 Great East earthquake and tsunami in Japan. Results indicated that the containment vessel reached Limit State I (concrete cracking) and Limit State II (concrete crushing) when the PGAs were in a range of 0.8–1.1g and 1.2–1.7g, respectively. The containment vessel reached Limit State I with a tsunami inundation depth of 10 m after suffering an earthquake with a PGA of 0.56g. A site-specific hazard assessment was conducted to consider the likelihood of tsunami sources.

  19. Structure analysis of a reactor pressure vessel by two and three-dimensional models

    Energy Technology Data Exchange (ETDEWEB)

    Sacher, H.; Mayr, M. (Technischer Ueberwachungs-Verein Bayern e.V., Muenchen (Germany, F.R.))

    1982-03-01

    This paper investigates the reactor pressure vessel of a 1300 MW pressurised water reactor. In order to determine the stresses and deformations of the vessel, two- and three-dimensional finite element models are used which represent the real structure with different degrees of accuracy. The results achieved by these different models are compared for the case of the transient called 'Start up of the nuclear power plant'. It was found that axisymmetric models, which consider non-axisymmetric components by correction factors, together with special attention to holes and other stress concentrations, allow a sufficient computation of stresses and deformations in the vessel, with the exception of the coolant nozzle region. In this latter case a fully three-dimensional analysis may be necessary.

  20. Effectiveness of an Automatic Tracking Software in Underwater Motion Analysis

    Directory of Open Access Journals (Sweden)

    Fabrício A. Magalhaes

    2013-12-01

    Full Text Available Tracking of markers placed on anatomical landmarks is a common practice in sports science to perform the kinematic analysis that interests both athletes and coaches. Although different software programs have been developed to automatically track markers and/or features, none of them was specifically designed to analyze underwater motion. Hence, this study aimed to evaluate the effectiveness of a software program developed for automatic tracking of underwater movements (DVP), based on the Kanade-Lucas-Tomasi feature tracker. Twenty-one video recordings of different aquatic exercises (n = 2940 markers’ positions) were manually tracked to determine the markers’ center coordinates. Then, the videos were automatically tracked using DVP and a commercially available software (COM). Since tracking techniques may produce false targets, an operator was instructed to stop the automatic procedure and to correct the position of the cursor when the distance between the calculated marker’s coordinate and the reference one was higher than 4 pixels. The proportion of manual interventions required by the software was used as a measure of the degree of automation. Overall, manual interventions were 10.4% lower for DVP (7.4%) than for COM (17.8%). Moreover, when examining the different exercise modes separately, the percentage of manual interventions was 5.6% to 29.3% lower for DVP than for COM. Similar results were observed when analyzing the type of marker rather than the type of exercise, with 9.9% fewer manual interventions for DVP than for COM. In conclusion, based on these results, the automatic tracking software presented here can be used as a valid and useful tool for underwater motion analysis.
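    The Kanade-Lucas-Tomasi tracking that DVP builds on is available in OpenCV; a minimal, hedged Python sketch of such a tracking loop (file name and parameters are illustrative, not DVP's) is:

    ```python
    import cv2

    # Pyramidal Lucas-Kanade tracking of corner features across video frames.
    cap = cv2.VideoCapture("underwater_exercise.avi")     # illustrative file name
    ok, prev = cap.read()
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=30, qualityLevel=0.01, minDistance=10)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        new_pts, status, err = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None,
                                                        winSize=(21, 21), maxLevel=3)
        good = new_pts[status.ravel() == 1]     # keep only successfully tracked markers
        # Here the operator check described above would compare 'good' with a
        # reference position and allow a manual correction if the error exceeds 4 px.
        prev_gray, pts = gray, good.reshape(-1, 1, 2)
    cap.release()
    ```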

  1. Calibration Analysis Software for the ATLAS Pixel Detector

    CERN Document Server

    Stramaglia, Maria Elena; The ATLAS collaboration

    2015-01-01

    The calibration of the Pixel detector fulfills two main purposes: to tune front-end registers for establishing the best operational settings and to measure the tuning performance through a subset of scans. An analysis framework has been set up in order to take actions on the detector given the outcome of a calibration scan (e.g. to create a mask for disabling noisy pixels). The software framework to control all aspects of the Pixel detector scans and analyses is called the Calibration Console. The introduction of a new layer, equipped with new Front End-I4 chips, required an update of the Console architecture. It now handles scans and scan analyses applied together to chips with different characteristics. An overview of the newly developed Calibration Analysis Software will be presented, together with some preliminary results.

  2. Spectrum analysis on quality requirements consideration in software design documents.

    Science.gov (United States)

    Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji

    2013-12-01

    Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as source codes and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called "spectrum analysis for quality requirements" is applied not only to requirements specifications but also to design documents. The technique enables us to derive the spectrum of a document, and quality requirements considerations in the document are numerically represented in the spectrum. We can thus objectively identify whether the considerations of quality requirements in a requirements document are adapted to its design document. To validate the method, we applied it to commercial software systems with the help of a supporting tool, and we confirmed that the method worked well.

  3. Stromatoporoid biometrics using image analysis software: A first order approach

    Science.gov (United States)

    Wolniewicz, Pawel

    2010-04-01

    Strommetric is a new image analysis computer program that performs morphometric measurements of stromatoporoid sponges. The program measures 15 features of skeletal elements (pillars and laminae) visible in both longitudinal and transverse thin sections. The software is implemented in C++, using the Open Computer Vision (OpenCV) library. The image analysis system distinguishes skeletal elements from sparry calcite using Otsu's method for image thresholding. More than 150 photos of thin sections were used as a test set, from which 36,159 measurements were obtained. The software provided about one hundred times more data than the methods applied until now. The data obtained are reproducible, even if the work is repeated by different workers. Thus the method makes biometric studies of stromatoporoids objective.
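    Otsu's thresholding, used above to separate skeletal elements from sparry calcite, can be sketched in Python with scikit-image on a synthetic bimodal image (Strommetric itself is a C++/OpenCV program):

    ```python
    import numpy as np
    from skimage.filters import threshold_otsu

    # Synthetic two-population "thin section": darker skeletal pixels and brighter
    # calcite cement; real input would be a grayscale photo of a thin section.
    image = np.concatenate([np.random.normal(70, 10, 5000),
                            np.random.normal(180, 12, 5000)])
    image = image.clip(0, 255).reshape(100, 100)

    t = threshold_otsu(image)        # threshold maximizing between-class variance
    skeleton_mask = image < t        # True where skeletal elements are expected
    print(t, skeleton_mask.mean())
    ```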

  4. Using Statistical Analysis Software to Advance Nitro Plasticizer Wettability

    Energy Technology Data Exchange (ETDEWEB)

    Shear, Trevor Allan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-29

    Statistical analysis in science is an extremely powerful tool that is often underutilized. Additionally, it is frequently the case that data is misinterpreted or not used to its fullest extent. Utilizing the advanced software JMP®, many aspects of experimental design and data analysis can be evaluated and improved. This overview will detail the features of JMP® and how they were used to advance a project, resulting in time and cost savings, as well as the collection of scientifically sound data. The project analyzed in this report addresses the inability of a nitro plasticizer to coat a gold coated quartz crystal sensor used in a quartz crystal microbalance. Through the use of the JMP® software, the wettability of the nitro plasticizer was increased by over 200% using an atmospheric plasma pen, ensuring good sample preparation and reliable results.

  5. Evaluating the Quantitative Capabilities of Metagenomic Analysis Software.

    Science.gov (United States)

    Kerepesi, Csaba; Grolmusz, Vince

    2016-05-01

    DNA sequencing technologies are applied widely and frequently today to describe metagenomes, i.e., microbial communities in environmental or clinical samples, without the need for culturing them. These technologies usually return short (100-300 base-pairs long) DNA reads, and these reads are processed by metagenomic analysis software that assigns phylogenetic composition information to the dataset. Here we evaluate three metagenomic analysis software packages (AmphoraNet--a webserver implementation of AMPHORA2--, MG-RAST, and MEGAN5) for their capability of assigning quantitative phylogenetic information to the data, describing the frequency of appearance of microorganisms of the same taxa in the sample. The difficulty of the task arises from the fact that longer genomes produce more reads from the same organism than shorter genomes, and some programs assign higher frequencies to species with longer genomes than to those with shorter ones. This phenomenon is called the "genome length bias." Dozens of complex artificial metagenome benchmarks can be found in the literature. Because of the complexity of those benchmarks, it is usually difficult to judge the resistance of a metagenomic software tool to this genome length bias. Therefore, we have made a simple benchmark for the evaluation of taxon counting in a metagenomic sample: we took the same number of copies of three full bacterial genomes of different lengths, broke them up randomly into short reads with an average length of 150 bp, and mixed the reads, creating our simple benchmark. Because of its simplicity, the benchmark is not supposed to serve as a mock metagenome, but if a software tool fails on this simple task, it will surely fail on most real metagenomes. We applied the three software packages to the benchmark. The ideal quantitative solution would assign the same proportion to the three bacterial taxa. We found that AMPHORA2/AmphoraNet gave the most accurate results and the other two software were under
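    The simple benchmark construction described above can be sketched as follows; the genome sequences here are random stand-ins and the shredding is simplified, so this only illustrates the idea of equal copy numbers with unequal genome lengths:

    ```python
    import random
    from collections import Counter

    random.seed(0)

    def random_genome(length):
        return "".join(random.choice("ACGT") for _ in range(length))

    def shred(genome, read_len=150):
        """Chop a genome into consecutive ~150 bp reads (simplified shredding)."""
        return [genome[i:i + read_len]
                for i in range(0, len(genome) - read_len + 1, read_len)]

    genomes = {"short": random_genome(5_000), "medium": random_genome(20_000),
               "long": random_genome(50_000)}
    copies = 10
    reads = [(name, r) for name, g in genomes.items()
             for _ in range(copies) for r in shred(g)]
    random.shuffle(reads)

    # An unbiased taxon counter should report ~33% per taxon *by copy number*,
    # even though the longer genome contributes far more reads:
    print(Counter(name for name, _ in reads))
    ```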

  6. Nonlinear data reconciliation in material flow analysis with software STAN

    Directory of Open Access Journals (Sweden)

    Oliver Cencic

    2016-11-01

    Full Text Available STAN is a freely available software that supports Material/Substance Flow Analysis (MFA/SFA) under the consideration of data uncertainties. It is capable of performing nonlinear data reconciliation based on the conventional weighted least-squares minimization approach, and error propagation. This paper summarizes the mathematical foundation of the calculation algorithm implemented in STAN and demonstrates its use on a hypothetical example from MFA.
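    A minimal sketch of weighted least-squares data reconciliation for a single mass balance, in the spirit of the approach summarized above (values and the SciPy-based solution are illustrative, not STAN's implementation):

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Adjust measured flows (with standard uncertainties) so that they satisfy
    # the mass balance of one process: input = output1 + output2.
    measured = np.array([100.0, 65.0, 42.0])   # input, output1, output2 (made-up)
    sigma = np.array([5.0, 3.0, 2.0])          # standard uncertainties

    objective = lambda x: np.sum(((x - measured) / sigma) ** 2)
    balance = {"type": "eq", "fun": lambda x: x[0] - x[1] - x[2]}

    result = minimize(objective, measured, constraints=[balance])
    print(result.x)   # reconciled flows, roughly [95.4, 66.7, 42.7]
    ```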

  7. PLAGIARISM DETECTION PROBLEMS AND ANALYSIS SOFTWARE TOOLS FOR ITS SOLVE

    Directory of Open Access Journals (Sweden)

    V. I. Shynkarenko

    2017-02-01

    Full Text Available Purpose. This study is aimed at: (1) the definition of plagiarism in texts in formal and natural languages, and the building of a taxonomy of plagiarism; (2) the identification of the major problems of plagiarism detection when automated tools are used to solve them; (3) the analysis and systematization of the information obtained during the review, testing and analysis of existing detection systems. Methodology. To identify the requirements for plagiarism detection software, methods of analysis of normative documentation (the legislative base) and of competitive tools were applied. To check the requirements, testing methods and a review of GUI interfaces were used. Findings. The paper considers the concept of plagiarism and the issues of its proliferation and classification. A review of existing plagiarism detection systems is given, covering desktop applications and online resources. Their functional characteristics are highlighted, and the formats of the input and output data and the constraints on them, customization features and access options are determined. A drill-down of system requirements is made. Originality. The schemes proposed by the authors complement the existing hierarchical taxonomy of plagiarism. The analysis of existing systems is done in terms of functionality and the possibilities for use with large amounts of data. Practical value. The practical significance is determined by the breadth of the problem of plagiarism in various fields. In Ukraine, the legal framework for the fight against plagiarism is being developed, which requires the active solution of development tasks and the improvement and delivery of relevant software. This work contributes to the solution of these problems. The review of existing anti-plagiarism programs, together with the study of research experience in the field and an updated concept of plagiarism, allows the functional and performance requirements and the input and output of the software being developed to be articulated more fully, as well as the features of such software to be identified. The article focuses on the features of solving the

  8. Correlation between US-PSV and 64-Row MDCTA with Advanced Vessel Analysis in the Quantification of 50–70% Carotid Artery Stenosis

    Directory of Open Access Journals (Sweden)

    Matteo Stefanini

    2012-01-01

    Full Text Available Purpose. To correlate ultrasonographic peak systolic velocity (US-PSV) and 64-row multidetector computed tomography angiography (MDCTA) with advanced vessel analysis (AVA) software in the quantification of 50–70% carotid artery stenosis. Materials and methods. 199 consecutive patients (247 arteries) with internal carotid artery (ICA) or third proximal bifurcation stenosis. Each patient was studied by duplex US (DUS) and 64-row MDCTA with AVA software. Results. DUS showed PSV measurements less than 125 cm/s in 51 carotid stenoses and a value greater than this in 196 arteries. 64-row MDCTA AVA software showed a grade of stenosis less than 50% in 42 carotid arteries while a grade greater than 70% was found in 4 carotid arteries; thus, 201 carotid arteries had a stenosis percentage between 50% and 70%. Linear regression analysis showed a good linear correlation (r = 0.88) between MDCTA-AVA software percentage stenosis and PSV: a 50% grade of stenosis corresponded to a PSV value of 133.6 cm/sec and a 70% stenosis to a PSV value of 268 cm/sec. The sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) of this analysis were 93%, 82%, 97% and 75%, respectively. Conclusion. A linear correlation was found between PSV data and the grade of stenosis from 50% to 70% obtained with 64-row MDCTA AVA software, with mean PSV values corresponding to the 50% and 70% grades of stenosis at AVA analysis.

  9. Correlation between US-PSV and 64-Row MDCTA with Advanced Vessel Analysis in the Quantification of 50-70% Carotid Artery Stenosis.

    Science.gov (United States)

    Stefanini, Matteo; Gaspari, Eleonora; Boi, Luca; Del Giudice, Costantino; Mastrangeli, Roberta; Nucera, Francesca; Simonetti, Giovanni

    2012-01-01

    Purpose. To correlate ultrasonographic peak systolic velocity (US-PSV) and 64-row multidetector computed tomography angiography (MDCTA) with advanced vessel analysis (AVA) software in the quantification of 50-70% carotid artery stenosis. Materials and methods. 199 consecutive patients (247 arteries) with internal carotid artery (ICA) or third proximal bifurcation stenosis. Each patient was studied by duplex US (DUS) and 64-row MDCTA with AVA software. Results. DUS showed PSV measurements less than 125 cm/s in 51 carotid stenoses and a value greater than this in 196 arteries. 64-row MDCTA AVA software showed a grade of stenosis less than 50% in 42 carotid arteries while a grade greater than 70% was found in 4 carotid arteries; thus, 201 carotid arteries had a stenosis percentage between 50% and 70%. Linear regression analysis showed a good linear correlation (r = 0.88) between MDCTA-AVA software percentage stenosis and PSV: a 50% grade of stenosis corresponded to a PSV value of 133.6 cm/sec and a 70% stenosis to a PSV value of 268 cm/sec. The sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) of this analysis were 93%, 82%, 97% and 75%, respectively. Conclusion. A linear correlation was found between PSV data and the grade of stenosis from 50% to 70% obtained with 64-row MDCTA AVA software, with mean PSV values corresponding to the 50% and 70% grades of stenosis at AVA analysis.
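    The statistics reported above (a linear PSV-stenosis fit and sensitivity/specificity/PPV/NPV) can be illustrated with the following Python sketch on synthetic numbers; it does not reproduce the study data:

    ```python
    import numpy as np
    from scipy.stats import linregress

    # Synthetic example of regressing PSV on MDCTA-AVA percentage stenosis.
    stenosis_pct = np.array([45, 50, 55, 60, 62, 65, 68, 70, 72])
    psv_cm_s = np.array([110, 134, 160, 190, 205, 228, 250, 268, 285])

    fit = linregress(stenosis_pct, psv_cm_s)
    print(f"r = {fit.rvalue:.2f}, PSV at 50% = {fit.intercept + fit.slope * 50:.0f} cm/s, "
          f"PSV at 70% = {fit.intercept + fit.slope * 70:.0f} cm/s")

    def diagnostic_metrics(tp, fp, fn, tn):
        """Return sensitivity, specificity, PPV, NPV from a 2x2 table."""
        return tp / (tp + fn), tn / (tn + fp), tp / (tp + fp), tn / (tn + fn)

    # Hypothetical counts, for illustration only:
    print(diagnostic_metrics(tp=183, fp=6, fn=13, tn=28))
    ```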

  10. SIMA: Python software for analysis of dynamic fluorescence imaging data

    Directory of Open Access Journals (Sweden)

    Patrick eKaifosh

    2014-09-01

    Full Text Available Fluorescence imaging is a powerful method for monitoring dynamic signals in the nervous system. However, analysis of dynamic fluorescence imaging data remains burdensome, in part due to the shortage of available software tools. To address this need, we have developed SIMA, an open source Python package that facilitates common analysis tasks related to fluorescence imaging. Functionality of this package includes correction of motion artifacts occurring during in vivo imaging with laser-scanning microscopy, segmentation of imaged fields into regions of interest (ROIs, and extraction of signals from the segmented ROIs. We have also developed a graphical user interface (GUI for manual editing of the automatically segmented ROIs and automated registration of ROIs across multiple imaging datasets. This software has been designed with flexibility in mind to allow for future extension with different analysis methods and potential integration with other packages. Software, documentation, and source code for the SIMA package and ROI Buddy GUI are freely available at http://www.losonczylab.org/sima/.

  11. SIMA: Python software for analysis of dynamic fluorescence imaging data

    Science.gov (United States)

    Kaifosh, Patrick; Zaremba, Jeffrey D.; Danielson, Nathan B.; Losonczy, Attila

    2014-01-01

    Fluorescence imaging is a powerful method for monitoring dynamic signals in the nervous system. However, analysis of dynamic fluorescence imaging data remains burdensome, in part due to the shortage of available software tools. To address this need, we have developed SIMA, an open source Python package that facilitates common analysis tasks related to fluorescence imaging. Functionality of this package includes correction of motion artifacts occurring during in vivo imaging with laser-scanning microscopy, segmentation of imaged fields into regions of interest (ROIs), and extraction of signals from the segmented ROIs. We have also developed a graphical user interface (GUI) for manual editing of the automatically segmented ROIs and automated registration of ROIs across multiple imaging datasets. This software has been designed with flexibility in mind to allow for future extension with different analysis methods and potential integration with other packages. Software, documentation, and source code for the SIMA package and ROI Buddy GUI are freely available at http://www.losonczylab.org/sima/. PMID:25295002

  12. SIMA: Python software for analysis of dynamic fluorescence imaging data.

    Science.gov (United States)

    Kaifosh, Patrick; Zaremba, Jeffrey D; Danielson, Nathan B; Losonczy, Attila

    2014-01-01

    Fluorescence imaging is a powerful method for monitoring dynamic signals in the nervous system. However, analysis of dynamic fluorescence imaging data remains burdensome, in part due to the shortage of available software tools. To address this need, we have developed SIMA, an open source Python package that facilitates common analysis tasks related to fluorescence imaging. Functionality of this package includes correction of motion artifacts occurring during in vivo imaging with laser-scanning microscopy, segmentation of imaged fields into regions of interest (ROIs), and extraction of signals from the segmented ROIs. We have also developed a graphical user interface (GUI) for manual editing of the automatically segmented ROIs and automated registration of ROIs across multiple imaging datasets. This software has been designed with flexibility in mind to allow for future extension with different analysis methods and potential integration with other packages. Software, documentation, and source code for the SIMA package and ROI Buddy GUI are freely available at http://www.losonczylab.org/sima/.

  13. Performance Analysis of Multipurpose Refrigeration System (MRS on Fishing Vessel

    Directory of Open Access Journals (Sweden)

    Ust Y.

    2016-04-01

    Full Text Available The use of efficient refrigerator/freezers helps considerably to reduce the amount of emitted greenhouse gas. A two-circuit refrigerator-freezer cycle (RF) reveals a higher energy saving potential than a conventional cycle with a single loop of serial evaporators, owing to the pressure drop in each evaporator during refrigeration operation and the low compression ratio. Therefore, multipurpose refrigeration cycles have been utilized in several industrial applications and fish storage systems. For this reason, a theoretical performance analysis based on the exergetic performance coefficient, coefficient of performance (COP), exergy efficiency and exergy destruction ratio criteria has been carried out for a multipurpose refrigeration system using different refrigerants in serial and parallel operation conditions. The exergetic performance coefficient criterion is defined as the ratio of exergy output to the total exergy destruction rate (or loss rate of availability). According to the results of the study, the refrigerant R32 shows the best performance in terms of exergetic performance coefficient, COP, exergy efficiency, and exergy destruction ratio among the other refrigerants considered (R1234yf, R1234ze, R404A, R407C, R410A, R143A and R502). The effects of the condenser, freezer-evaporator and refrigerator-evaporator temperatures on the exergetic performance coefficient, COP, exergy efficiency and exergy destruction ratios have been fully analyzed for the refrigerant R32.
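    The COP and exergy-efficiency definitions underlying such an analysis can be illustrated with a short worked example in Python (temperatures and loads are invented, and this is only the textbook form of the criteria, not the paper's full model):

    ```python
    # Reversed-Carnot bounds for a single refrigeration loop.
    T0 = 298.15       # dead-state / ambient temperature, K
    T_evap = 255.15   # freezer-evaporator temperature, K (-18 degC)
    Q_evap = 10.0     # cooling load, kW
    W_comp = 4.2      # compressor work, kW

    cop = Q_evap / W_comp                                # coefficient of performance
    cop_carnot = T_evap / (T0 - T_evap)                  # reversible upper bound
    exergy_of_cooling = Q_evap * (T0 / T_evap - 1.0)     # kW of "cold" exergy delivered
    exergy_efficiency = exergy_of_cooling / W_comp       # equals cop / cop_carnot

    print(f"COP = {cop:.2f}, Carnot COP = {cop_carnot:.2f}, "
          f"exergy efficiency = {exergy_efficiency:.2%}")
    ```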

  14. Free software, business capital, and institutional change: a veblenian analysis of the software industry

    OpenAIRE

    Koloğlugil, Serhat

    2012-01-01

    Free software, unlike proprietary software under exclusive copyright control, exemplifies a form of productive and innovative activity that is based upon mutual sharing of technological knowledge. Free software engineers, who get connected through various software-development projects, voluntarily contribute their time and skills to produce computer programs which, they insist, should be free for anyone to use, modify, and distribute. This paper argues that Thorstein Veblen's socio-economic t...

  15. Freud: a software suite for high-throughput simulation analysis

    Science.gov (United States)

    Harper, Eric; Spellings, Matthew; Anderson, Joshua; Glotzer, Sharon

    Computer simulation is an indispensable tool for the study of a wide variety of systems. As simulations scale to fill petascale and exascale supercomputing clusters, so too does the size of the data produced, as well as the difficulty in analyzing these data. We present Freud, an analysis software suite for efficient analysis of simulation data. Freud makes no assumptions about the system being analyzed, allowing for general analysis methods to be applied to nearly any type of simulation. Freud includes standard analysis methods such as the radial distribution function, as well as new methods including the potential of mean force and torque and local crystal environment analysis. Freud combines a Python interface with fast, parallel C++ analysis routines to run efficiently on laptops, workstations, and supercomputing clusters. Data analysis on clusters reduces data transfer requirements, a prohibitive cost for petascale computing. Used in conjunction with simulation software, Freud allows for smart simulations that adapt to the current state of the system, enabling the study of phenomena such as nucleation and growth, intelligent investigation of phases and phase transitions, and determination of effective pair potentials.

  16. IMMAN: free software for information theory-based chemometric analysis.

    Science.gov (United States)

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis denominated IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty are incorporated to the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
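    One of the supervised measures mentioned above, information gain over an equal-interval discretized descriptor, can be sketched as follows (IMMAN itself is written in Java; this Python snippet only illustrates the formula IG(Y; X) = H(Y) - H(Y | X)):

    ```python
    import numpy as np

    def entropy(labels):
        _, counts = np.unique(labels, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    def information_gain(feature_bins, labels):
        feature_bins, labels = np.asarray(feature_bins), np.asarray(labels)
        h_y = entropy(labels)
        h_y_given_x = 0.0
        for value in np.unique(feature_bins):
            subset = labels[feature_bins == value]
            h_y_given_x += len(subset) / len(labels) * entropy(subset)
        return h_y - h_y_given_x

    x_bins = [0, 0, 0, 1, 1, 1, 2, 2]   # equal-interval discretized descriptor
    y      = [0, 0, 0, 1, 1, 0, 1, 1]   # class labels
    print(information_gain(x_bins, y))  # about 0.66 bits
    ```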

  17. Analysis of signal acquisition in GPS receiver software

    Directory of Open Access Journals (Sweden)

    Vlada S. Sokolović

    2011-01-01

    Full Text Available This paper presents a critical analysis of the signal processing flow carried out in GPS receiver software, which served as a basis for a critical comparison of different signal processing architectures within the GPS receiver. It is possible to achieve increased flexibility and a reduction in the commercial costs of GPS devices, including mobile devices, by using software defined radio (SDR) technology. The SDR application can be realized when certain hardware components in a GPS receiver are replaced. Signal processing in the SDR is implemented using a programmable DSP (Digital Signal Processing) or FPGA (Field Programmable Gate Array) circuit, which allows a simple change of digital signal processing algorithms and a simple change of the receiver parameters. The starting point of the research is the signal generated on the satellite, the structure of which is shown in the paper. Based on the GPS signal structure, a receiver is realized with the task of extracting the appropriate signal from the spectrum and detecting it. Based on the collected navigation data, the receiver calculates the position of the end user. The signal coming from the satellite may be at the carrier frequencies L1 and L2. Since the SPS is used in the civil service, all the tests shown in this work were performed on the L1 signal. The signal arriving at the receiver is generated with spread spectrum technology and lies below the noise level. Such signals often interfere with signals from the environment, which makes it difficult for a receiver to perform proper detection and signal processing. Therefore, signal processing technology is continually being improved, aiming at more accurate and faster signal processing. All tests were carried out on a signal acquired from the satellite using the SE4110 input circuit used for filtering, amplification and signal selection. The samples of the received signal were forwarded to a computer for data post-processing, i. e
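    The acquisition stage of such a software receiver is commonly implemented as a parallel code-phase search; a hedged Python sketch with a random ±1 stand-in for a real C/A code is shown below (sampling parameters, Doppler grid and signal model are illustrative only):

    ```python
    import numpy as np

    fs = 2.048e6                 # sampling rate, Hz (1 ms = 2048 samples)
    n = 2048
    t = np.arange(n) / fs
    rng = np.random.default_rng(1)
    code = rng.choice([-1.0, 1.0], n)    # local code replica (random stand-in, 1 ms)

    # Simulated received signal: delayed code on a 2 kHz Doppler carrier plus noise.
    received = np.roll(code, 417) * np.cos(2 * np.pi * 2000 * t) + rng.normal(0, 1, n)

    best = (0.0, None, None)
    for doppler in np.arange(-5000, 5001, 500):
        wiped = received * np.exp(-2j * np.pi * doppler * t)       # carrier wipe-off
        corr = np.abs(np.fft.ifft(np.fft.fft(wiped) * np.conj(np.fft.fft(code))))
        peak = corr.max()
        if peak > best[0]:
            best = (peak, doppler, int(corr.argmax()))

    print(f"Doppler ~ {best[1]} Hz, code phase ~ {best[2]} samples")  # expect 2000 Hz, 417
    ```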

  18. Aseismic safety analysis of a prestressed concrete containment vessel for CPR1000 nuclear power plant

    Science.gov (United States)

    Yi, Ping; Wang, Qingkang; Kong, Xianjing

    2017-01-01

    The containment vessel of a nuclear power plant is the last barrier to prevent nuclear reactor radiation. Aseismic safety analysis is the key to appropriate containment vessel design. A prestressed concrete containment vessel (PCCV) model with a semi-infinite elastic foundation and a practical arrangement of tendons has been established to analyze the aseismic ability of the CPR1000 PCCV structure under seismic loads and internal pressure. A method to model the prestressing tendons and their interaction with concrete was proposed, and the axial force of the prestressing tendons showed that the simulation was reasonable and accurate. The numerical results show that for the concrete structure, the bottom of the cylinder wall, the region around the equipment hatch and the region near the ring beam are critical locations with large principal stress. Concrete cracks occurred at the bottom of the PCCV cylinder wall under the peak earthquake motion of 0.50 g; however, the PCCV was still basically in an elastic state. Furthermore, concrete cracks occurred around the equipment hatch under the design internal pressure of 0.4 MPa, but the steel liner was still in the elastic stage and the soundness of its leak-proof function was verified. The results provide a basis for the analysis and design of containment vessels.

  19. Analysis of Long-term Ex-vessel Debris Coolability with a Simple Model

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Byoung Cheol; Moriyama, Kiyofumi; Park, Hyun Sun [POSTECH, Pohang (Korea, Republic of)

    2016-05-15

    In the late phase of a severe accident in light water reactors (LWRs), assuring the coolability of ex-vessel core debris is important because it is the last barrier to prevent the accident progression before radioactive release to the environment. If the debris cooling is unsuccessful, the heated and possibly re-melted core debris may induce molten core-concrete interaction (MCCI), which produces steam and non-condensable gases and causes over-pressurization of the containment vessel. Analysis of the long-term cooling of an ex-vessel debris bed was performed using a simple model developed by Hwang et al., originally developed to explain the features of the debris bed in the FARO experiments. It is assumed that the debris bed consists of fluidized particles on top and a porous lump of sintered particles below. The bed formation process is also considered to determine the geometry and initial condition of the debris bed. Therefore, the simple model includes a wide range of cooling processes with a simplified zero- or one-dimensional analysis. Sensitivity tests of the ex-vessel debris bed cooling behavior with respect to selected input variables were performed using a modified version of the simple model of Hwang et al. The model covers the series of phenomena from the breakup of the melt jet to the long-term cooling phase, and the results showed that the accumulation area of the debris bed has a significant impact on the overall cooling performance.

  20. JPL multipolarization workstation - Hardware, software and examples of data analysis

    Science.gov (United States)

    Burnette, Fred; Norikane, Lynne

    1987-01-01

    A low-cost stand-alone interactive image processing workstation has been developed for operations on multipolarization JPL aircraft SAR data, as well as data from future spaceborne imaging radars. A recently developed data compression technique is used to reduce the data volume to 10 Mbytes, for a typical data set, so that interactive analysis may be accomplished in a timely and efficient manner on a supermicrocomputer. In addition to presenting a hardware description of the work station, attention is given to the software that has been developed. Three illustrative examples of data analysis are presented.

  1. A software platform for the analysis of dermatology images

    Science.gov (United States)

    Vlassi, Maria; Mavraganis, Vlasios; Asvestas, Panteleimon

    2017-11-01

    The purpose of this paper is to present a software platform developed in the Python programming environment that can be used for the processing and analysis of dermatology images. The platform provides the capability of reading a file that contains a dermatology image. The platform supports image formats such as Windows bitmaps, JPEG, JPEG2000, portable network graphics and TIFF. Furthermore, it provides suitable tools for selecting, either manually or automatically, a region of interest (ROI) on the image. The automated selection of a ROI includes filtering for smoothing the image and thresholding. The proposed software platform has a friendly and clear graphical user interface and could be a useful second-opinion tool for a dermatologist. Furthermore, it could be used to classify images from other anatomical parts, such as the breast or lung, after proper re-training of the classification algorithms.

  2. OVERVIEW OF THE SAPHIRE PROBABILISTIC RISK ANALYSIS SOFTWARE

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L.; Wood, Ted; Knudsen, James; Ma, Zhegang

    2016-10-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer (PC) running the Microsoft Windows operating system. SAPHIRE Version 8 is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). INL's primary role in this project is that of software developer and tester. However, INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users, who constitute a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. In this paper, we provide an overview of the current technical capabilities found in SAPHIRE Version 8, including the user interface and enhanced solving algorithms.

  3. Navigating freely-available software tools for metabolomics analysis.

    Science.gov (United States)

    Spicer, Rachel; Salek, Reza M; Moreno, Pablo; Cañueto, Daniel; Steinbeck, Christoph

    2017-01-01

    The field of metabolomics has expanded greatly over the past two decades, both as an experimental science with applications in many areas, as well as in regards to data standards and bioinformatics software tools. The diversity of experimental designs and instrumental technologies used for metabolomics has led to the need for distinct data analysis methods and the development of many software tools. To compile a comprehensive list of the most widely used freely available software and tools that are used primarily in metabolomics. The most widely used tools were selected for inclusion in the review by either ≥ 50 citations on Web of Science (as of 08/09/16) or the use of the tool being reported in the recent Metabolomics Society survey. Tools were then categorised by the type of instrumental data (i.e. LC-MS, GC-MS or NMR) and the functionality (i.e. pre- and post-processing, statistical analysis, workflow and other functions) they are designed for. A comprehensive list of the most used tools was compiled. Each tool is discussed within the context of its application domain and in relation to comparable tools of the same domain. An extended list including additional tools is available at https://github.com/RASpicer/MetabolomicsTools which is classified and searchable via a simple controlled vocabulary. This review presents the most widely used tools for metabolomics analysis, categorised based on their main functionality. As future work, we suggest a direct comparison of tools' abilities to perform specific data analysis tasks e.g. peak picking.

  4. UPVapor: Cofrentes nuclear power plant production results analysis software

    Energy Technology Data Exchange (ETDEWEB)

    Curiel, M. [Logistica y Acondicionamientos Industriales SAU, Sorolla Center, local 10, Av. de las Cortes Valencianas No. 58, 46015 Valencia (Spain); Palomo, M. J. [ISIRYM, Universidad Politecnica de Valencia, Camino de Vera s/n, Valencia (Spain); Baraza, A. [Iberdrola Generacion S. A., Central Nuclear Cofrentes, Carretera Almansa Requena s/n, 04662 Cofrentes, Valencia (Spain); Vaquer, J., E-mail: m.curiel@lainsa.co [TITANIA Servicios Tecnologicos SL, Sorolla Center, local 10, Av. de las Cortes Valencianas No. 58, 46015 Valencia (Spain)

    2010-10-15

    UPVapor software version 02 has been developed for the Data Analysis Department of the Cofrentes nuclear power plant (Spain). It is a graphical analysis environment in which users have available all the plant variables registered in the process computer system (SIEC). To this end, UPVapor software has many advanced graphic tools for ease of work, as well as a friendly environment that is easy to use and offers many configuration possibilities. Plant variables are classified in the same way as they are in the SIEC computer, and their values are taken from it through the Iberdrola network. UPVapor can generate two different types of graphics: evolution graphs and X-Y graphs. The first ones analyse the evolution of up to twenty plant variables over a user-defined time period and according to historic plant files. Many tools are available: cursors, graphic configuration, moving averages, invalid data visualization ... Moreover, a particular analysis configuration can be saved as a preselection, giving the possibility of loading a preselection directly and developing quick monitoring of a group of preselected plant variables. In X-Y graphs, it is possible to analyse a variable value against another variable over a defined time. As an option, users can filter previous data depending on a certain range of a variable, with the possibility of programming up to five filters. As with the other graph type, the X-Y graph has many configuration, saving and printing options. With UPVapor software, data analysts can save valuable time during daily work and, as it is easy to use, it permits other users to perform their own analyses without asking the analysts to develop them. Besides, it can be used from any work centre with access to the network framework. (Author)

  5. Measuring Cerebral Blood Flow in Moyamoya Angiopathy by Quantitative Magnetic Resonance Angiography Noninvasive Optimal Vessel Analysis.

    Science.gov (United States)

    Khan, Nadia; Lober, Robert M; Ostergren, Lauren; Petralia, Jacob; Bell-Stephens, Teresa; Navarro, Ramon; Feroze, Abdullah; Steinberg, Gary K

    2017-12-01

    Moyamoya disease causes progressive occlusion of the supraclinoidal internal carotid artery, and middle, anterior, and less frequently the posterior cerebral arteries, carrying the risk of stroke. Blood flow is often partially reconstituted by compensatory moyamoya collaterals and sometimes the posterior circulation. Cerebral revascularization can further augment blood flow. These changes to blood flow within the cerebral vessels, however, are not well characterized. To evaluate blood flow changes resulting from the disease process and revascularization surgery using quantitative magnetic resonance angiography with noninvasive optimal vessel analysis (NOVA). We retrospectively analyzed 190 preoperative and postoperative imaging scans in 66 moyamoya patients after revascularization surgery. Images were analyzed for blood flow using NOVA and compared with preoperative angiographic staging and postoperative blood flow. Blood flow rates within superficial temporal artery grafts were compared based on angiographic evidence of patency. Diseased vessels had lower blood flow, correlating with angiographic staging. Flow in posterior cererbal and basilar arteries increased with disease severity, particularly when both the anterior and middle cerebral arteries were occluded. Basilar artery flow and ipsilateral internal carotid artery flow decreased after surgery. Flow rates were different between angiographically robust and poor direct bypass grafts, as well as between robust and patent grafts. Preoperative changes in cerebral vessel flow as measured by NOVA correlated with angiographic disease progression. NOVA demonstrated that preoperative augmentation of the posterior circulation decreased after surgery. This report is the first to quantify the shift in collateral supply from the posterior circulation to the bypass graft.

  6. Cost-Benefit Analysis Methodology: Install Commercially Compliant Engines on National Security Exempted Vessels?

    Science.gov (United States)

    2015-11-05

    The designation of Emission Control Area (ECA) now applies to all U.S. coasts. Within ECAs, all vessels, regardless of flag, are required to meet the

  7. Development of RCM analysis software for Korean nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Ho; Choi, Kwang Hee; Jeong, Hyeong Jong [Korea Electric Power Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    A software called KEPCO RCM workstation (KRCM) has been developed to optimize the maintenance strategies of Korean nuclear power plants. The program modules of the KRCM were designed in a manner that combines EPRI methodologies and KEPRI analysis techniques. The KRCM is being applied to three pilot systems, the chemical and volume control system, the main steam system, and the compressed air system of Yonggwang Units 1 and 2. In addition, the KRCM can be utilized as a tool to meet part of the requirements of the maintenance rule (MR) imposed by the U.S. NRC. 3 refs., 4 figs. (Author)

  8. Discriminant Analysis of the Effects of Software Cost Drivers on ...

    African Journals Online (AJOL)

    The paper investigates the effect of software cost drivers on project schedule estimation of software development projects in Nigeria. Specifically, the paper determines the extent to which software cost variables affect the software project time schedule in our environment. Such studies are lacking in the recent ...

  9. Tool Support for Parametric Analysis of Large Software Simulation Systems

    Science.gov (United States)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
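
    A minimal sketch of the idea of combining Monte Carlo draws with n-factor (here pairwise) combinatorial variation; the parameter names, levels and the simple enumeration scheme are assumptions for illustration, not the tool's actual test-case generator.

        import itertools
        import random

        levels = {
            "mass":        [400.0, 500.0, 600.0],   # kg, illustrative
            "thrust_bias": [-0.02, 0.0, 0.02],
            "sensor_lag":  [0.01, 0.05, 0.10],      # s
        }

        def pairwise_cases(levels):
            """Enumerate all level combinations for every pair of factors and
            fill the remaining factors with random (Monte Carlo) draws."""
            names = list(levels)
            for a, b in itertools.combinations(names, 2):
                for va, vb in itertools.product(levels[a], levels[b]):
                    case = {n: random.choice(levels[n]) for n in names}
                    case[a], case[b] = va, vb
                    yield case

        for case in pairwise_cases(levels):
            print(case)   # each case would be handed to a distributed simulation run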

  10. Integrating software architectures for distributed simulations and simulation analysis communities.

    Energy Technology Data Exchange (ETDEWEB)

    Goldsby, Michael E.; Fellig, Daniel; Linebarger, John Michael; Moore, Patrick Curtis; Sa, Timothy J.; Hawley, Marilyn F.

    2005-10-01

    The one-year Software Architecture LDRD (No.79819) was a cross-site effort between Sandia California and Sandia New Mexico. The purpose of this research was to further develop and demonstrate integrating software architecture frameworks for distributed simulation and distributed collaboration in the homeland security domain. The integrated frameworks were initially developed through the Weapons of Mass Destruction Decision Analysis Center (WMD-DAC), sited at SNL/CA, and the National Infrastructure Simulation & Analysis Center (NISAC), sited at SNL/NM. The primary deliverable was a demonstration of both a federation of distributed simulations and a federation of distributed collaborative simulation analysis communities in the context of the same integrated scenario, which was the release of smallpox in San Diego, California. To our knowledge this was the first time such a combination of federations under a single scenario has ever been demonstrated. A secondary deliverable was the creation of the standalone GroupMeld{trademark} collaboration client, which uses the GroupMeld{trademark} synchronous collaboration framework. In addition, a small pilot experiment that used both integrating frameworks allowed a greater range of crisis management options to be performed and evaluated than would have been possible without the use of the frameworks.

  11. Development of FB-MultiPier dynamic vessel-collision analysis models, phase 2 : [summary].

    Science.gov (United States)

    2014-07-01

    When collisions between large vessels and bridge supports occur, they can result in significant damage to bridge and vessel. These collisions are extremely hazardous, often taking lives on the vessel and the bridge. Direct costs of repair a...

  12. Development of FB-MultiPier dynamic vessel-collision analysis models, phase 2.

    Science.gov (United States)

    2014-07-01

    Massive waterway vessels such as barges regularly transit navigable waterways in the U.S. During passages that fall within the vicinity of bridge structures, vessels may (under extreme circumstances) deviate from the intended vessel transit path. A...

  13. PROTEINCHALLENGE: Crowd sourcing in proteomics analysis and software development

    DEFF Research Database (Denmark)

    Martin, Sarah F.; Falkenberg, Heiner; Dyrlund, Thomas Franck

    2013-01-01

    In large-scale proteomics studies there is a temptation, after months of experimental work, to plug resulting data into a convenient, if poorly implemented, set of tools, which may neither do the data justice nor help answer the scientific question. In this paper we have captured key concerns... However, there is clearly a real need for robust tools, standard operating procedures and general acceptance of best practices. Thus we submit to the proteomics community a call for a community-wide open set of proteomics analysis challenges, PROTEINCHALLENGE, that directly target and compare data analysis workflows, including arguments for community-wide open source software development and "big data" compatible solutions for the future. For the meantime, we have laid out ten top tips for data processing. With these at hand, a first large-scale proteomics analysis hopefully becomes less daunting to navigate.

  14. Analysis of Test Efficiency during Software Development Process

    OpenAIRE

    nair, T. R. GopalaKrishnan; Suma, V.; Tiwari, Pranesh Kumar

    2012-01-01

    One of the prerequisites of any organization is unvarying sustainability in the dynamic and competitive industrial environment. Development of high-quality software is therefore an inevitable constraint of any software industry. Defect management being one of the most influential factors in the production of high-quality software, it is obligatory for software organizations to orient themselves towards effective defect management. Since the time of software evolution, testing is deemed a...

  15. Hazard Analysis of Software Requirements Specification for Process Module of FPGA-based Controllers in NPP

    Energy Technology Data Exchange (ETDEWEB)

    Jung; Sejin; Kim, Eui-Sub; Yoo, Junbeom [Konkuk University, Seoul (Korea, Republic of); Keum, Jong Yong; Lee, Jang-Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    Software in the PLCs and FPGAs used to develop I&C systems should also be analyzed for hazards and risks before use. NUREG/CR-6430 proposes a method for performing software hazard analysis. It suggests analysis techniques for software-affected hazards and states that software hazard analysis should be performed across the phases of the software life cycle, such as requirements analysis, design, detailed design and implementation. It also provides guide phrases for applying software hazard analysis. HAZOP (hazard and operability analysis) is one of the analysis techniques introduced in NUREG/CR-6430 and is a useful technique for applying guide phrases; it is sometimes used to analyze the safety of software. The analysis method of NUREG/CR-6430 has been used for PLC software development in Korean nuclear power plants; in those studies, appropriate guide phrases and analysis processes were selected for efficient application, and NUREG/CR-6430 was identified as providing applicable methods for software hazard analysis. We perform software hazard analysis of an FPGA software requirements specification with two approaches, NUREG/CR-6430 and HAZOP with general guide words, and compare them. The NUREG/CR-6430 approach has several pros and cons compared with HAZOP using general guide words, and it is sufficiently applicable for analyzing the software requirements specification of an FPGA.
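
    A minimal sketch of how guide phrases or guide words drive such an analysis: each guide word is applied to each attribute of a requirement to prompt a deviation to be reviewed. The guide words below follow common HAZOP usage and the attributes are invented for illustration; they are not the phrases selected in the paper.

        # Apply HAZOP-style guide words to requirement attributes to generate
        # deviation prompts for a review session.
        GUIDE_WORDS = ["no", "more", "less", "as well as", "part of", "reverse", "other than"]
        ATTRIBUTES = ["trip signal value", "trip signal timing", "input range check"]  # illustrative

        for attribute in ATTRIBUTES:
            for word in GUIDE_WORDS:
                print(f"Deviation to analyze: '{word}' applied to '{attribute}'")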

  16. FRACTURE MECHANICS UNCERTAINTY ANALYSIS IN THE RELIABILITY ASSESSMENT OF THE REACTOR PRESSURE VESSEL: (2D SUBJECTED TO INTERNAL PRESSURE

    Directory of Open Access Journals (Sweden)

    Entin Hartini

    2016-06-01

    Full Text Available The reactor pressure vessel (RPV) is a pressure boundary in the PWR-type reactor which serves to confine radioactive material during the chain reaction process. The integrity of the RPV must be guaranteed both in normal operation and in accident conditions. In analyzing the integrity of the RPV, especially with regard to crack behavior that could lead to a break of the reactor pressure vessel, a fracture mechanics approach should be taken for the assessment. The uncertainty of the inputs used in the assessment, such as mechanical properties and the physical environment, is the reason why the assessment is not sufficient if performed only with a deterministic approach; therefore, an uncertainty approach should be applied. The aim of this study is to analyze the uncertainty of fracture mechanics calculations in evaluating the reliability of the PWR's reactor pressure vessel. The random character of the input quantities was generated using probabilistic principles and theories. The fracture mechanics analysis is solved by the Finite Element Method (FEM) with the MSC MARC software, while the uncertainty of the inputs is treated based on probability density functions with Latin Hypercube Sampling (LHS) using a Python script. The output of MSC MARC is a J-integral value, which is converted into a stress intensity factor for evaluating the reliability of the RPV in 2D. From the results of the calculation, it can be concluded that the SIF from the probabilistic method reaches the limit value of fracture toughness earlier than the SIF from the deterministic method. The SIF generated by the probabilistic method is 105.240 MPa m^0.5, while the SIF generated by the deterministic method is 100.876 MPa m^0.5. Keywords: Uncertainty analysis, fracture mechanics, LHS, FEM, reactor pressure vessels
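
    A minimal sketch of the Latin Hypercube Sampling step, assuming two uncertain inputs; the parameter names, ranges and sample size are illustrative and not the values used in the paper. Each sampled pair would drive one FEM run whose J-integral is converted to a SIF and compared against fracture toughness.

        import numpy as np
        from scipy.stats import qmc

        sampler = qmc.LatinHypercube(d=2, seed=42)
        unit_samples = sampler.random(n=100)

        # Scale the unit hypercube to assumed physical ranges:
        # yield strength [MPa] and fracture toughness K_Ic [MPa*m^0.5].
        l_bounds = [180.0, 90.0]
        u_bounds = [220.0, 110.0]
        samples = qmc.scale(unit_samples, l_bounds, u_bounds)

        for sigma_y, k_ic in samples[:5]:
            print(f"run FEM with sigma_y = {sigma_y:.1f} MPa, compare resulting SIF with K_Ic = {k_ic:.1f}")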

  17. Don't Blame the Software: Using Qualitative Data Analysis Software Successfully in Doctoral Research

    Directory of Open Access Journals (Sweden)

    Michelle Salmona

    2016-07-01

    Full Text Available In this article, we explore the learning experiences of doctoral candidates as they use qualitative data analysis software (QDAS. Of particular interest is the process of adopting technology during the development of research methodology. Using an action research approach, data was gathered over five years from advanced doctoral research candidates and supervisors. The technology acceptance model (TAM was then applied as a theoretical analytic lens for better understanding how students interact with new technology. Findings relate to two significant barriers which doctoral students confront: 1. aligning perceptions of ease of use and usefulness is essential in overcoming resistance to technological change; 2. transparency into the research process through technology promotes insights into methodological challenges. Transitioning through both barriers requires a competent foundation in qualitative research. The study acknowledges the importance of higher degree research, curriculum reform and doctoral supervision in post-graduate research training together with their interconnected relationships in support of high-quality inquiry. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1603117

  18. HistFitter software framework for statistical data analysis

    CERN Document Server

    Baak, M.; Côte, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fitted to data and interpreted with statistical tests. A key innovation of HistFitter is its design, which is rooted in core analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its very fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with mu...

  19. Visualization techniques for the analysis of software behavior and related structures

    OpenAIRE

    Trümper, Jonas

    2014-01-01

    Software maintenance encompasses any changes made to a software system after its initial deployment and is thereby one of the key phases in the typical software-engineering lifecycle. In software maintenance, we primarily need to understand structural and behavioral aspects, which are difficult to obtain, e.g., by code reading. Software analysis is therefore a vital tool for maintaining these systems: It provides - the preferably automated - means to extract and evaluate information from thei...

  20. Software for the analysis and simulations of measurements; Software para analise e simulacao de medicoes

    Energy Technology Data Exchange (ETDEWEB)

    Araujo, Augusto Cesar Assis; Sarmento, Christiana Lauar; Mota, Geraldo Cesar; Domingos, Marileide Mourao; Belo, Noema Sant`Anna; Alves, Tulio Marcus Machado [Companhia Energetica de Minas Gerais (CEMIG), Belo Horizonte, MG (Brazil)

    1992-12-31

    This paper presents the development of a graphic software package which acts as a system to analyze the behaviour of electric power measurements and permits the calculation of 'percent errors' derived from measurement inaccuracy. The software shows, for each situation, the correct connection diagram, the measurement diagram, the 'percent error' and the graphic behaviour of this error as a function of the power load factor. 14 figs., 4 refs.

  1. Vessel Operating Units (Vessels)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This data set contains data for vessels that are greater than five net tons and have a current US Coast Guard documentation number. Beginning in 1979, the NMFS...

  2. Performance analysis of software for identification of intestinal parasites

    Directory of Open Access Journals (Sweden)

    Andressa P. Gomes

    2015-08-01

    Full Text Available Introduction: Intestinal parasites are among the most frequent diagnoses worldwide. An accurate clinical diagnosis of human parasitic infections depends on laboratory confirmation for specific differentiation of the infectious agent. Objectives: To create technological solutions to help parasitological diagnosis, through the construction and use of specific software. Material and method: From the images obtained from the sediment, the software compares the morphometry, area, perimeter and circularity, and uses information on specific morphological and staining characteristics of parasites, allowing their potential identification. Results: Our results demonstrate satisfactory performance: from a total of 204 images analyzed, 81.86% had the parasite correctly identified by the computer system, and 18.13% could not be identified due to the large amount of fecal debris in the sample evaluated. Discussion: Currently the techniques used in the parasitology area are predominantly manual, and are probably affected by variables such as the attention and experience of the professional. Therefore, the use of computerization in this sector can improve the performance of parasitological analysis. Conclusions: This work contributes to the computerization of the healthcare area, and benefits both health professionals and their patients, in addition to providing a more efficient, accurate and secure diagnosis.
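
    A minimal sketch of the kind of morphometry mentioned above (area, perimeter, circularity) computed on a thresholded sediment image; the file name, threshold choice and debris-size cutoff are placeholders, not the system's actual parameters.

        import numpy as np
        from skimage import io, measure
        from skimage.filters import threshold_otsu

        image = io.imread("sediment.png", as_gray=True)
        binary = image > threshold_otsu(image)
        labels = measure.label(binary)

        for region in measure.regionprops(labels):
            if region.area < 50:        # skip very small objects (likely fecal debris)
                continue
            circularity = 4 * np.pi * region.area / (region.perimeter ** 2)
            print(f"object {region.label}: area={region.area}, "
                  f"perimeter={region.perimeter:.1f}, circularity={circularity:.2f}")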

  3. Phenomenology and Qualitative Data Analysis Software (QDAS: A Careful Reconciliation

    Directory of Open Access Journals (Sweden)

    Brian Kelleher Sohn

    2017-01-01

    Full Text Available An oft-cited phenomenological methodologist, Max VAN MANEN (2014), claims that qualitative data analysis software (QDAS) is not an appropriate tool for phenomenological research. Yet phenomenologists rarely describe how phenomenology is to be done: pencil, paper, computer? DAVIDSON and DI GREGORIO (2011) urge QDAS contrarians such as VAN MANEN to get over their methodological loyalties and join the digital world, claiming that all qualitative researchers, whatever their methodology, perform processes aided by QDAS: disaggregation and recontextualization of texts. Other phenomenologists exemplify DAVIDSON and DI GREGORIO's observation that arguments against QDAS often identify problems more closely related to the researchers than QDAS. But the concerns about technology of McLUHAN (2003 [1964]), HEIDEGGER (2008 [1977]), and FLUSSER (2013) cannot be ignored. In this conceptual article I answer the questions of phenomenologists and the call of QDAS methodologists to describe how I used QDAS to carry out a phenomenological study in order to guide others who choose to reconcile the use of software to assist their research. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1701142

  4. Topographic analysis of eyelid position using digital image processing software.

    Science.gov (United States)

    Chun, Yeoun Sook; Park, Hong Hyun; Park, In Ki; Moon, Nam Ju; Park, Sang Joon; Lee, Jeong Kyu

    2017-11-01

    To propose a novel analysis technique for objective quantification of topographic eyelid position with an algorithmically calculated scheme and to determine its feasibility. One hundred normal eyelids from 100 patients were segmented using a graph cut algorithm, and 11 shape features of eyelids were semi-automatically quantified using in-house software. To evaluate the intra- and inter-examiner reliability of this software, intra-class correlation coefficients (ICCs) were used. To evaluate the diagnostic value of this scheme, the correlations between semi-automatic and manual measurements of margin reflex distance 1 (MRD1) and margin reflex distance 2 (MRD2) were analysed using a Bland-Altman analysis. To determine the degree of agreement according to manual MRD length, the relationship between the variance of semi-automatic measurements and the manual measurements was evaluated using linear regression. Intra- and inter-examiner reliability were excellent, with ICCs ranging from 0.913 to 0.980 in 11 shape features including MRD1, MRD2, palpebral fissure, lid perimeter, upper and lower lid lengths, roundness, total area, and medial, central, and lateral areas. The correlations between semi-automatic and manual MRDs were also excellent, with better correlation in MRD1 than in MRD2 (R = 0.893 and 0.823, respectively). In addition, significant positive relationships were observed between the variance and the length of MRD1 and 2; the longer the MRD length, the greater the variance. The proposed novel optimized integrative scheme, which is shown to have high repeatability and reproducibility, is useful for topographic analysis of eyelid position. © 2017 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
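
    A minimal sketch of a Bland-Altman comparison between semi-automatic and manual MRD1 values, of the kind used above; the two arrays are invented illustrative numbers, not the study's measurements.

        import numpy as np
        import matplotlib.pyplot as plt

        manual = np.array([3.1, 2.8, 4.0, 3.5, 2.2, 3.9])      # mm, illustrative
        semi_auto = np.array([3.0, 2.9, 4.2, 3.4, 2.1, 4.0])   # mm, illustrative

        mean = (manual + semi_auto) / 2
        diff = semi_auto - manual
        bias = diff.mean()
        loa = 1.96 * diff.std(ddof=1)    # limits of agreement

        plt.scatter(mean, diff)
        plt.axhline(bias, linestyle="--")
        plt.axhline(bias + loa, linestyle=":")
        plt.axhline(bias - loa, linestyle=":")
        plt.xlabel("mean of methods (mm)")
        plt.ylabel("semi-automatic minus manual (mm)")
        plt.savefig("bland_altman_mrd1.png")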

  5. Visual data mining and analysis of software repositories

    NARCIS (Netherlands)

    Voinea, Lucian; Telea, Alexandru

    2007-01-01

    In this article we describe an ongoing effort to integrate information visualization techniques into the process of configuration management for software systems. Our focus is to help software engineers manage the evolution of large and complex software systems by offering them effective and

  6. The declining impact of piracy on maritime transport in the Indian Ocean: Statistical analysis of 5-year vessel tracking data

    OpenAIRE

    Vespe, Michele; Greidanus, Harm; Alvarez, Marlene Alvarez

    2015-01-01

    The analysis of the declining impact of piracy on maritime routes and vessel behaviours in the Indian Ocean is here performed using Long Range Identification and Tracking (LRIT) reports. A 5-year archive of vessel position data covering the period characterized by the highest number of attacks and the subsequent decline provides a unique source for data-driven statistical analysis that highlights changes in routing and sailing speeds. The work, besides demonstrating the value of LRIT data for...

  7. Integrated Software Environment for Pressurized Thermal Shock Analysis

    Directory of Open Access Journals (Sweden)

    Dino Araneo

    2011-01-01

    Full Text Available The present paper describes the main features and an application to a real Nuclear Power Plant (NPP) of an Integrated Software Environment (in the following referred to as “platform”) developed at the University of Pisa (UNIPI) to perform Pressurized Thermal Shock (PTS) analysis. The platform is written in Java for portability and implements all the steps foreseen in the methodology developed at UNIPI for the deterministic analysis of PTS scenarios. The methodology starts with the thermal-hydraulic analysis of the NPP with a system code (such as Relap5-3D or Cathare2) during a selected transient scenario. The results so obtained are then processed to provide boundary conditions for the next step, that is, a CFD calculation. Once the system pressure and the RPV wall temperature are known, the stresses inside the RPV wall can be calculated by means of a Finite Element (FE) code. The last step of the methodology is the Fracture Mechanics (FM) analysis, using weight functions, aimed at evaluating the stress intensity factor (KI) at the crack tip, to be compared with the critical stress intensity factor KIc. The platform automates all the steps foreseen in the methodology once the user specifies a number of boundary conditions at the beginning of the simulation.
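
    A minimal sketch of the final weight-function step: the stress intensity factor is obtained by integrating a through-wall stress profile against a weight function and compared with KIc. The textbook weight function for one tip of a through crack in an infinite plate is used purely for illustration; the platform applies weight functions appropriate to the RPV geometry, and all numbers below are assumed.

        import numpy as np
        from scipy.integrate import quad

        a = 0.01                        # crack half-length [m], illustrative
        sigma = lambda x: 100e6         # uniform stress profile [Pa], illustrative

        def weight(x, a):
            # weight function for one tip of a through crack in an infinite plate
            return np.sqrt((a + x) / (a - x)) / np.sqrt(np.pi * a)

        K_I, _ = quad(lambda x: sigma(x) * weight(x, a), -a, a)
        K_I_MPa = K_I / 1e6             # MPa*m^0.5
        K_Ic = 100.0                    # assumed fracture toughness [MPa*m^0.5]
        print(f"K_I = {K_I_MPa:.1f} MPa*m^0.5, margin to K_Ic: {K_Ic - K_I_MPa:.1f}")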

  8. Image analysis software for following progression of peripheral neuropathy

    Science.gov (United States)

    Epplin-Zapf, Thomas; Miller, Clayton; Larkin, Sean; Hermesmeyer, Eduardo; Macy, Jenny; Pellegrini, Marco; Luccarelli, Saverio; Staurenghi, Giovanni; Holmes, Timothy

    2009-02-01

    A relationship has been reported by several research groups [1 - 4] between the density and shapes of nerve fibers in the cornea and the existence and severity of peripheral neuropathy. Peripheral neuropathy is a complication of several prevalent diseases or conditions, which include diabetes, HIV, prolonged alcohol overconsumption and aging. A common clinical technique for confirming the condition is intramuscular electromyography (EMG), which is invasive, so a noninvasive technique like the one proposed here carries important potential advantages for the physician and patient. A software program that automatically detects the nerve fibers, counts them and measures their shapes is being developed and tested. Tests were carried out with a database of subjects with levels of severity of diabetic neuropathy as determined by EMG testing. Results from this testing, which include a linear regression analysis, are shown.

  9. Modular Open-Source Software for Item Factor Analysis.

    Science.gov (United States)

    Pritikin, Joshua N; Hunter, Micheal D; Boker, Steven

    2015-06-01

    This paper introduces an Item Factor Analysis (IFA) module for OpenMx, a free, open-source, and modular statistical modeling package that runs within the R programming environment on GNU/Linux, Mac OS X, and Microsoft Windows. The IFA module offers a novel model specification language that is well suited to programmatic generation and manipulation of models. Modular organization of the source code facilitates the easy addition of item models, item parameter estimation algorithms, optimizers, test scoring algorithms, and fit diagnostics all within an integrated framework. Three short example scripts are presented for fitting item parameters, latent distribution parameters, and a multiple group model. The availability of both IFA and structural equation modeling in the same software is a step toward the unification of these two methodologies.

  10. Cost Analysis of Poor Quality Using a Software Simulation

    Directory of Open Access Journals (Sweden)

    Jana Fabianová

    2017-02-01

    Full Text Available The issues of quality, the cost of poor quality and the factors affecting quality are crucial to maintaining competitiveness in business activities. The use of software applications and computer simulation enables more effective quality management. Simulation tools make it possible to incorporate the variability of several variables in experiments and to evaluate their combined impact on the final output. The article presents a case study focused on the possibility of using Monte Carlo computer simulation in the field of quality management. Two approaches for determining the cost of poor quality are introduced here. One takes a retrospective point of view, where the cost of poor quality and the production process are calculated based on historical data. The second approach uses the probabilistic characteristics of the input variables by means of simulation, and offers a prospective view of the costs of poor quality. Simulation outputs in the form of tornado and sensitivity charts complement the risk analysis.
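
    A minimal sketch of the prospective (simulation) approach: the cost of poor quality is sampled from assumed input distributions and its spread and main drivers are examined. The distributions, rates and costs below are placeholders, not the case study's data.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000

        defect_rate = rng.triangular(0.01, 0.02, 0.05, n)   # fraction of defective units
        units       = rng.normal(50_000, 5_000, n)          # produced units
        rework_cost = rng.uniform(8.0, 12.0, n)             # cost per reworked unit
        scrap_share = rng.uniform(0.1, 0.3, n)              # fraction of defects scrapped
        scrap_cost  = rng.uniform(25.0, 40.0, n)            # cost per scrapped unit

        defects = defect_rate * units
        cost = defects * ((1 - scrap_share) * rework_cost + scrap_share * scrap_cost)

        print(f"mean cost of poor quality: {cost.mean():,.0f}")
        print(f"5th-95th percentile: {np.percentile(cost, 5):,.0f} - {np.percentile(cost, 95):,.0f}")

        # Crude tornado-style sensitivity: rank inputs by correlation with the output.
        for name, x in [("defect_rate", defect_rate), ("units", units),
                        ("rework_cost", rework_cost), ("scrap_share", scrap_share),
                        ("scrap_cost", scrap_cost)]:
            print(name, round(np.corrcoef(x, cost)[0, 1], 2))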

  11. Knowledge-based requirements analysis for automating software development

    Science.gov (United States)

    Markosian, Lawrence Z.

    1988-01-01

    We present a new software development paradigm that automates the derivation of implementations from requirements. In this paradigm, informally stated requirements are expressed in a domain-specific requirements specification language. This language is machine-understandable, and requirements expressed in it are captured in a knowledge base. Once the requirements are captured, more detailed specifications and eventually implementations are derived by the system using transformational synthesis. A key characteristic of the process is that the required human intervention is in the form of providing problem- and domain-specific engineering knowledge, not in writing detailed implementations. We describe a prototype system that applies the paradigm in the realm of communication engineering: the prototype automatically generates implementations of buffers following analysis of the requirements on each buffer.

  12. Analysis of Software Development Methodologies to Build Safety Software Applications for the SATEX-II: A Mexican Experimental Satellite

    Science.gov (United States)

    Aguilar Cisneros, Jorge; Vargas Martinez, Hector; Pedroza Melendez, Alejandro; Alonso Arevalo, Miguel

    2013-09-01

    Mexico is a country where experience in building software for satellite applications is just beginning. This is a delicate situation because in the near future we will need to develop software for the SATEX-II (Mexican Experimental Satellite). SATEX-II is a project of SOMECyTA (the Mexican Society of Aerospace Science and Technology). We have experience applying software development methodologies, such as TSP (Team Software Process) and SCRUM, in other areas. We analyzed these methodologies and concluded that they can be applied to develop software for the SATEX-II; we also supported these methodologies with the ESA PSS-05-0 Standard, in particular with ESA PSS-05-11. Our analysis focused on the main characteristics of each methodology and on how these methodologies could be used together with the ESA PSS-05-0 Standards. Our outcomes, in general, may be used by teams who need to build small satellites, but, in particular, they will be used when we build the on-board software applications for the SATEX-II.

  13. Economic Consequence Analysis of Disasters: The ECAT Software Tool

    Energy Technology Data Exchange (ETDEWEB)

    Rose, Adam; Prager, Fynn; Chen, Zhenhua; Chatterjee, Samrat; Wei, Dan; Heatwole, Nathaniel; Warren, Eric

    2017-04-15

    This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences from numerous natural, man-made and technological threats. This software tool is intended for use by various decision makers and analysts to obtain estimates rapidly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. This tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
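
    A minimal sketch of the reduced-form idea described above: many runs of a complex model are summarized by a single regression on key explanatory variables. The "complex model" below is a stand-in function with invented variables and coefficients; in E-CAT it is a computable general equilibrium model.

        import numpy as np

        rng = np.random.default_rng(1)
        n_runs = 500

        duration   = rng.uniform(1, 30, n_runs)     # threat duration [days], illustrative
        severity   = rng.uniform(0.1, 1.0, n_runs)  # normalized severity
        resilience = rng.uniform(0.0, 0.8, n_runs)  # resilience / behavioral factor

        def complex_model(d, s, r):
            # stand-in for one CGE simulation returning an economic loss, with noise
            return 2.0 * d * s * (1.0 - r) + rng.normal(0.0, 0.5)

        loss = np.array([complex_model(d, s, r)
                         for d, s, r in zip(duration, severity, resilience)])

        # Fit the reduced-form equation: loss ~ b0 + b1*d + b2*s + b3*r + b4*d*s
        X = np.column_stack([np.ones(n_runs), duration, severity, resilience,
                             duration * severity])
        coeffs, *_ = np.linalg.lstsq(X, loss, rcond=None)
        print("reduced-form coefficients:", np.round(coeffs, 3))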

  14. Software use cases to elicit the software requirements analysis within the ASTRI project

    Science.gov (United States)

    Conforti, Vito; Antolini, Elisa; Bonnoli, Giacomo; Bruno, Pietro; Bulgarelli, Andrea; Capalbi, Milvia; Fioretti, Valentina; Fugazza, Dino; Gardiol, Daniele; Grillo, Alessandro; Leto, Giuseppe; Lombardi, Saverio; Lucarelli, Fabrizio; Maccarone, Maria Concetta; Malaguti, Giuseppe; Pareschi, Giovanni; Russo, Federico; Sangiorgi, Pierluca; Schwarz, Joseph; Scuderi, Salvatore; Tanci, Claudio; Tosti, Gino; Trifoglio, Massimo; Vercellone, Stefano; Zanmar Sanchez, Ricardo

    2016-07-01

    The Italian National Institute for Astrophysics (INAF) is leading the Astrofisica con Specchi a Tecnologia Replicante Italiana (ASTRI) project whose main purpose is the realization of small size telescopes (SST) for the Cherenkov Telescope Array (CTA). The first goal of the ASTRI project has been the development and operation of an innovative end-to-end telescope prototype using a dual-mirror optical configuration (SST-2M) equipped with a camera based on silicon photo-multipliers and very fast read-out electronics. The ASTRI SST-2M prototype has been installed in Italy at the INAF "M.G. Fracastoro" Astronomical Station located at Serra La Nave, on Mount Etna, Sicily. This prototype will be used to test several mechanical, optical, control hardware and software solutions which will be used in the ASTRI mini-array, comprising nine telescopes proposed to be placed at the CTA southern site. The ASTRI mini-array is a collaborative and international effort led by INAF and carried out by Italy, Brazil and South-Africa. We present here the use cases, through UML (Unified Modeling Language) diagrams and text details, that describe the functional requirements of the software that will manage the ASTRI SST-2M prototype, and the lessons learned thanks to these activities. We intend to adopt the same approach for the Mini Array Software System that will manage the ASTRI miniarray operations. Use cases are of importance for the whole software life cycle; in particular they provide valuable support to the validation and verification activities. Following the iterative development approach, which breaks down the software development into smaller chunks, we have analysed the requirements, developed, and then tested the code in repeated cycles. The use case technique allowed us to formalize the problem through user stories that describe how the user procedurally interacts with the software system. Through the use cases we improved the communication among team members, fostered

  15. Slicing techniques applied to architectural analysis of legacy software

    OpenAIRE

    Rodrigues, Nuno F.

    2009-01-01

    Doctoral thesis in Informatics (field of knowledge: Foundations of Computing). Program understanding is emerging as a key concern in software engineering. In a situation in which the only quality certificate of the running software artifact is still life-cycle endurance, customers and software producers are little prepared to modify or improve running code. However, faced with so risky a dependence on legacy software, managers are more and more prepared to spend resources to in...

  16. The Software Therapist: Usability Problem Diagnosis Through Latent Semantic Analysis

    National Research Council Canada - National Science Library

    Sparks, Randall; Hartson, Rex

    2006-01-01

    The work we report on here addresses the problem of low return on investment in software usability engineering and offers support for usability practitioners in identifying, understanding, documenting...

  17. [Statistical analysis using freely-available "EZR (Easy R)" software].

    Science.gov (United States)

    Kanda, Yoshinobu

    2015-10-01

    Clinicians must often perform statistical analyses for purposes such as evaluating preexisting evidence and designing or executing clinical studies. R is a free software environment for statistical computing. R supports many statistical analysis functions, but does not incorporate a statistical graphical user interface (GUI). The R commander provides an easy-to-use basic-statistics GUI for R. However, the statistical functionality of the R commander is limited, especially in the field of biostatistics. Therefore, the author added several important statistical functions to the R commander and named it "EZR (Easy R)", which is now being distributed on the following website: http://www.jichi.ac.jp/saitama-sct/. EZR allows the application of statistical functions that are frequently used in clinical studies, such as survival analyses, including competing risk analyses and the use of time-dependent covariates, and so on, by point-and-click access. In addition, by saving the script automatically created by EZR, users can learn R script writing, maintain the traceability of the analysis, and assure that the statistical process is overseen by a supervisor.

  18. Pattern recognition software and techniques for biological image analysis.

    Directory of Open Access Journals (Sweden)

    Lior Shamir

    2010-11-01

    Full Text Available The increasing prevalence of automated image acquisition systems is enabling new types of microscopy experiments that generate large image datasets. However, there is a perceived lack of robust image analysis systems required to process these diverse datasets. Most automated image analysis systems are tailored for specific types of microscopy, contrast methods, probes, and even cell types. This imposes significant constraints on experimental design, limiting their application to the narrow set of imaging methods for which they were designed. One of the approaches to address these limitations is pattern recognition, which was originally developed for remote sensing, and is increasingly being applied to the biology domain. This approach relies on training a computer to recognize patterns in images rather than developing algorithms or tuning parameters for specific image processing tasks. The generality of this approach promises to enable data mining in extensive image repositories, and provide objective and quantitative imaging assays for routine use. Here, we provide a brief overview of the technologies behind pattern recognition and its use in computer vision for biological and biomedical imaging. We list available software tools that can be used by biologists and suggest practical experimental considerations to make the best use of pattern recognition techniques for imaging assays.

  19. Semiautomatic Software For Quantitative Analysis Of Cardiac Positron Tomography Studies

    Science.gov (United States)

    Ratib, Osman; Bidaut, Luc; Nienaber, Christoph; Krivokapich, Janine; Schelbert, Heinrich R.; Phelps, Michael E.

    1988-06-01

    In order to derive accurate values for true tissue radiotracer concentrations from gated positron emission tomography (PET) images of the heart, which are critical for noninvasively quantifying regional myocardial blood flow and metabolism, appropriate corrections for partial volume effect (PVE) and contamination from adjacent anatomical structures are required. We therefore developed an integrated software package for quantitative analysis of tomographic images which provides such corrections. A semiautomatic edge detection technique outlines and partitions the myocardium into sectors. Myocardial wall thickness is measured on the images perpendicularly to the detected edges and used to correct for PVE. The programs automatically correct for radioactive decay, activity calibration and cross contamination for both static and dynamic studies. Parameters derived with these programs include tracer concentrations and their changes over time. They are used for calculating regional metabolic rates and can be further displayed as color-coded parametric images. The approach was validated for PET imaging in 11 dog experiments. 2D echocardiograms (Echo) were recorded simultaneously to validate the edge detection and wall thickness measurement techniques. After correction for PVE using automatic WT measurement, regional tissue tracer concentrations derived from PET images correlated well with true tissue concentrations as determined by well counting (r=0.98). These preliminary studies indicate that the developed automatic image analysis technique allows accurate and convenient evaluation of cardiac PET images for the measurement of both regional tracer tissue concentrations and regional myocardial function.

  20. Mutation Analysis Approach to Develop Reliable Object-Oriented Software

    Directory of Open Access Journals (Sweden)

    Monalisa Sarma

    2014-01-01

    Full Text Available In general, modern programs are large and complex, and it is essential that they be highly reliable in applications. In order to develop highly reliable software, the Java programming language provides a rich set of exceptions and exception handling mechanisms. Exception handling mechanisms are intended to help developers build robust programs. Given a program with exception handling constructs, for effective testing we need to detect whether all possible exceptions are raised and caught or not. However, complex exception handling constructs make it tedious to trace which exceptions are handled and where, and which exceptions are passed on. In this paper, we address this problem and propose a mutation analysis approach to develop reliable object-oriented programs. We have applied a number of mutation operators to create a large set of mutant programs with different types of faults. We then generate test cases and test data to uncover exception-related faults. The test suite so obtained is applied to the mutant programs, measuring the mutation score and hence verifying whether the test suite is effective or not. We have tested our approach with a number of case studies to substantiate the efficacy of the proposed mutation analysis technique.
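
    A minimal sketch of the scoring step: each mutant is marked as killed (some test failed on it) or survived, and the mutation score is the fraction killed. The mutant names and outcomes are invented for illustration, not results from the paper.

        # Per-mutant test outcomes collected after running the generated test suite.
        mutant_results = {
            "M01_swallow_exception": "killed",    # a test failed -> mutant detected
            "M02_drop_throw":        "killed",
            "M03_wrong_catch_type":  "survived",  # no test detected the change
            "M04_remove_finally":    "killed",
        }

        killed = sum(1 for outcome in mutant_results.values() if outcome == "killed")
        score = killed / len(mutant_results)
        print(f"mutation score = {score:.2f}")    # low scores point at weak exception tests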

  1. How qualitative data analysis software may support the qualitative analysis process

    NARCIS (Netherlands)

    Peters, V.A.M.; Wester, F.P.J.

    2007-01-01

    The last decades have shown large progress in the elaboration of procedures for qualitative data analysis and in the development of computer programs to support this kind of analysis. We believe, however, that the link between methodology and computer software tools is too loose, especially for a

  2. Software selection based on analysis and forecasting methods, practised in 1C

    Science.gov (United States)

    Vazhdaev, A. N.; Chernysheva, T. Y.; Lisacheva, E. I.

    2015-09-01

    The research focuses on the built-in mechanisms for data analysis and forecasting of the "1C: Enterprise 8" platform. It is important to evaluate and select proper software to develop effective strategies for customer relationship management in terms of sales, as well as for the implementation and further maintenance of software. The research data allow creating new forecast models to schedule further software distribution.

  3. Ex vivo blood vessel bioreactor for analysis of the biodegradation of magnesium stent models with and without vessel wall integration.

    Science.gov (United States)

    Wang, Juan; Liu, Lumei; Wu, Yifan; Maitz, Manfred F; Wang, Zhihong; Koo, Youngmi; Zhao, Ansha; Sankar, Jagannathan; Kong, Deling; Huang, Nan; Yun, Yeoheung

    2017-03-01

    Current in vitro models fail in predicting the degradation rate and mode of magnesium (Mg) stents in vivo. To overcome this, the microenvironment of the stent is simulated here in an ex vivo bioreactor with porcine aorta and circulating medium, and compared with standard static in vitro immersion and with in vivo rat aorta models. In ex vivo and in vivo conditions, pure Mg wires were exposed to the aortic lumen and inserted into the aortic wall to mimic early- and long-term implantation, respectively. Results showed that: 1) Degradation rates of Mg were similar for all the fluid diffusion conditions (in vitro static, aortic wall ex vivo and in vivo); however, Mg degradation under flow condition (i.e. in the lumen) in vivo was slower than ex vivo; 2) The corrosion mode in the samples can be mainly described as localized (in vitro), mixed localized and uniform (ex vivo), and uniform (in vivo); 3) Abundant degradation products (MgO/Mg(OH)2 and Ca/P) with gas bubbles accumulated around the localized degradation regions ex vivo, but a uniform and thin degradation product layer was found in vivo. It is concluded that the ex vivo vascular bioreactor provides an improved test setting for magnesium degradation between static immersion and animal experiments and highlights its promising role in bridging degradation behavior and biological response for vascular stent research. Magnesium and its alloys are candidates for a new generation of biodegradable stent materials. However, the in vitro degradation of magnesium stents does not match the clinical degradation rates, corrupting the validity of conventional degradation tests. Here we report an ex vivo vascular bioreactor, which allows simulation of the microenvironment with and without blood vessel integration to study the biodegradation of magnesium implants in comparison with standard in vitro test conditions and with in vivo implantations. The bioreactor did simulate the corrosion of an intramural implant very well, but

  4. Efficiency Analysis of Additions of Ice Flake in Cargo Hold Cooling System of Fishing Vessel

    Directory of Open Access Journals (Sweden)

    Amiadji Amiadji

    2017-06-01

    Full Text Available As a maritime nation, the majority of people living on the Indonesian coast make their livelihood as fishermen. The process of preserving fish after they are caught determines the quality of the product. One preservation process that can be applied is cooling, using a refrigeration machine on board. A refrigeration system requires high electrical power consumption. This power usage can be reduced, for example by adding chopped ice (ice flakes) to the cargo space of the fishing boat, so that the cooling load is reduced. The purpose of this work is to find out how the addition of ice flakes influences the cooling load in the cargo hold of fishing vessels, and how much power is used when the refrigeration machine is combined with the addition of ice flakes. In this analysis, the cooling load calculation refers to the ISO 7547 standard. From the results of the analysis, it was found that the addition of ice flakes in the cargo space can reduce the cooling load and the daily electricity consumption of the vessel, for an ice-flake-to-fish weight ratio of 1:1.

  5. TweezPal - Optical tweezers analysis and calibration software

    Science.gov (United States)

    Osterman, Natan

    2010-11-01

    Optical tweezers, a powerful tool for optical trapping, micromanipulation and force transduction, have in recent years become a standard technique commonly used in many research laboratories and university courses. Knowledge about the optical force acting on a trapped object can be gained only after a calibration procedure which has to be performed (by an expert) for each type of trapped objects. In this paper we present TweezPal, a user-friendly, standalone Windows software tool for optical tweezers analysis and calibration. Using TweezPal, the procedure can be performed in a matter of minutes even by non-expert users. The calibration is based on the Brownian motion of a particle trapped in a stationary optical trap, which is being monitored using video or photodiode detection. The particle trajectory is imported into the software which instantly calculates position histogram, trapping potential, stiffness and anisotropy. Program summaryProgram title: TweezPal Catalogue identifier: AEGR_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEGR_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 44 891 No. of bytes in distributed program, including test data, etc.: 792 653 Distribution format: tar.gz Programming language: Borland Delphi Computer: Any PC running Microsoft Windows Operating system: Windows 95, 98, 2000, XP, Vista, 7 RAM: 12 Mbytes Classification: 3, 4.14, 18, 23 Nature of problem: Quick, robust and user-friendly calibration and analysis of optical tweezers. The optical trap is calibrated from the trajectory of a trapped particle undergoing Brownian motion in a stationary optical trap (input data) using two methods. Solution method: Elimination of the experimental drift in position data. Direct calculation of the trap stiffness from the positional
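
    A minimal sketch of an equipartition-style calibration from a trapped-particle trajectory, consistent with the "direct calculation of the trap stiffness from the positional" data mentioned above, though the exact procedure in TweezPal may differ; the trajectory file name and temperature are placeholders, and the second calibration method referred to in the summary is not shown.

        import numpy as np

        kB = 1.380649e-23              # Boltzmann constant [J/K]
        T = 298.0                      # bath temperature [K], assumed

        # Bead positions along one axis [m]; placeholder file with an exported trajectory.
        x = np.loadtxt("trajectory_x.txt")

        # Remove a linear drift before computing the variance.
        t = np.arange(x.size)
        x = x - np.polyval(np.polyfit(t, x, 1), t)

        kappa_x = kB * T / np.var(x)   # trap stiffness [N/m] by equipartition
        print(f"kappa_x = {kappa_x * 1e6:.3f} pN/um")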

  6. Multi-criteria decision analysis methods and software

    CERN Document Server

    Ishizaka, Alessio

    2013-01-01

    This book presents an introduction to MCDA followed by more detailed chapters about each of the leading methods used in this field. Comparison of methods and software is also featured to enable readers to choose the most appropriate method needed in their research. Worked examples as well as the software featured in the book are available on an accompanying website.

  7. A pattern framework for software quality assessment and tradeoff analysis

    NARCIS (Netherlands)

    Folmer, Eelke; Boscht, Jan

    The earliest design decisions often have a significant impact on software quality and are the most costly to revoke. One of the challenges in architecture design is to reduce the frequency of retrofit problems in software designs; not being able to improve the quality of a system cost effectively, a

  8. An Analysis of Open Source Security Software Products Downloads

    Science.gov (United States)

    Barta, Brian J.

    2014-01-01

    Despite the continued demand for open source security software, a gap in the identification of success factors related to the success of open source security software persists. There are no studies that accurately assess the extent of this persistent gap, particularly with respect to the strength of the relationships of open source software…

  9. An Initial Quality Analysis of the Ohloh Software Evolution Data

    NARCIS (Netherlands)

    Bruntink, M.

    2014-01-01

    Large public data sets on software evolution promise great value to both researchers and practitioners, in particular for software (development) analytics. To realise this value, the data quality of such data sets needs to be studied and improved. Despite these data sets being of a secondary nature,

  10. Preliminary Analysis of Ex-Vessel Steam Explosion using TEXAS-V code for APR1400

    Energy Technology Data Exchange (ETDEWEB)

    Song, Sung Chu; Lee, Jung Jae; Cho, Yong Jin; Hwang, Taesuk [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2013-10-15

    The purpose of this study is to explore input development and audit calculations using the TEXAS-V code for an ex-vessel steam explosion in a flooded reactor cavity of APR1400. The TEXAS computational models are among the simplified tools for simulating fuel-coolant interaction during the mixing, triggering and explosion phases. The TEXAS code models were validated against the fundamental experimental investigations performed in the KROTOS facility at JRC, Ispra. Experiments such as KROTOS and FARO are aimed at providing benchmark data to examine the effect of fuel-coolant initial conditions and mixing on explosion energetics with alumina and prototypical core material. The TEXAS-V code was used in this study to analyze and predict the ex-vessel steam explosion at reactor scale. An input deck to simulate the flooded reactor cavity of APR1400 was developed and a base case calculation was performed. This study will provide a basis for further work. The code will be of use for the evaluation and sensitivity studies of ex-vessel steam explosions for the ERVC strategy in future studies. The analysis results of this study are similar to the results of other studies. Through this study, it was found that TEXAS-V can be used as a fast-running computer code for predicting the steam explosion load at reactor scale. In addition, the TEXAS-V code could be used to evaluate the impact of various uncertainties, which are not yet clearly understood, to provide a conservative envelope for the steam explosion.

  11. Software para análise quantitativa da deglutição Swallowing quantitative analysis software

    Directory of Open Access Journals (Sweden)

    André Augusto Spadotto

    2008-02-01

    Full Text Available OBJECTIVE: The present paper is aimed at introducing a software program that allows a detailed analysis of swallowing dynamics. MATERIALS AND METHODS: The sample included ten (six male and four female) stroke patients, with a mean age of 57.6 years. Swallowing videofluoroscopy was performed and the images were digitized for posterior analysis of the pharyngeal transit time with the aid of a chronometer and of the software. RESULTS: Differences were observed in the average pharyngeal swallowing transit time as a result of measurements with the chronometer and with the software. CONCLUSION: This software is a useful tool for the analysis of parameters such as swallowing time and speed, allowing a better understanding of swallowing dynamics, both in the clinical approach to patients with oropharyngeal dysphagia and for scientific research purposes.

  12. 76 FR 60939 - Metal Fatigue Analysis Performed by Computer Software

    Science.gov (United States)

    2011-09-30

    ... Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code (ASME Code) requirements in... compliance with Section III, ``Rules for Construction of Nuclear Facility Components,'' of the ASME Code... regulatory issue summary (RIS) to remind addressees of the American Society of Mechanical Engineers (ASME...

  13. Analysis of Software Binaries for Reengineering-Driven Product Line Architecture—An Industrial Case Study

    Directory of Open Access Journals (Sweden)

    Ian D. Peake

    2015-04-01

    Full Text Available This paper describes a method for recovering software architectures from a set of similar (but unrelated) software products in binary form. One intention is to drive refactoring into software product lines and to combine architecture recovery with runtime binary analysis and existing clustering methods. Using our runtime binary analysis, we create graphs that capture the dependencies between different software parts. These are clustered into smaller component graphs that group software parts with high interaction into larger entities. The component graphs serve as a basis for further software product line work. In this paper, we concentrate on the analysis part of the method and on the graph clustering. We apply the graph clustering method to a real application in the context of automation / robot configuration software tools.
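
    A minimal sketch of the clustering step: software parts and their observed runtime dependencies form a weighted graph, which is grouped into component candidates. Modularity-based community detection is used here as a stand-in for the paper's clustering method, and all module names and call counts are invented.

        import networkx as nx
        from networkx.algorithms.community import greedy_modularity_communities

        G = nx.Graph()
        edges = [
            ("motion.dll", "kinematics.dll", 42),
            ("motion.dll", "io_bus.dll", 3),
            ("kinematics.dll", "math_core.dll", 57),
            ("ui_panel.dll", "io_bus.dll", 21),
            ("ui_panel.dll", "config.dll", 18),
        ]
        for a, b, calls in edges:
            G.add_edge(a, b, weight=calls)   # weight = observed call count

        for i, community in enumerate(greedy_modularity_communities(G, weight="weight")):
            print(f"component candidate {i}: {sorted(community)}")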

  14. Quantitative analysis of histopathological findings using image processing software.

    Science.gov (United States)

    Horai, Yasushi; Kakimoto, Tetsuhiro; Takemoto, Kana; Tanaka, Masaharu

    2017-10-01

    In evaluating pathological changes in drug efficacy and toxicity studies, morphometric analysis can be quite robust. In this experiment, we examined whether morphometric changes of major pathological findings in various tissue specimens stained with hematoxylin and eosin could be recognized and quantified using image processing software. Using Tissue Studio, hypertrophy of hepatocytes and adrenocortical cells could be quantified based on the method of a previous report, but the regions of red pulp, white pulp, and marginal zones in the spleen could not be recognized when using a single setting condition. Using Image-Pro Plus, lipid-derived vacuoles in the liver and mucin-derived vacuoles in the intestinal mucosa could be quantified using two criteria (area and/or roundness). Vacuoles derived from phospholipid could not be quantified when small lipid deposits coexisted in the liver and adrenal cortex. Mononuclear inflammatory cell infiltration in the liver could be quantified to some extent, except for specimens with many clustered infiltrating cells. Adipocyte size and the mean linear intercept could be quantified easily and efficiently using morphological processing and the macro tool provided in Image-Pro Plus. These methodologies are expected to form a base system that can recognize morphometric features and quantitatively analyze pathological findings through the use of information technology.

  15. RNAstructure: software for RNA secondary structure prediction and analysis

    Directory of Open Access Journals (Sweden)

    Mathews David H

    2010-03-01

    Full Text Available Abstract Background To understand an RNA sequence's mechanism of action, the structure must be known. Furthermore, target RNA structure is an important consideration in the design of small interfering RNAs and antisense DNA oligonucleotides. RNA secondary structure prediction, using thermodynamics, can be used to develop hypotheses about the structure of an RNA sequence. Results RNAstructure is a software package for RNA secondary structure prediction and analysis. It uses thermodynamics and utilizes the most recent set of nearest neighbor parameters from the Turner group. It includes methods for secondary structure prediction (using several algorithms), prediction of base pair probabilities, bimolecular structure prediction, and prediction of a structure common to two sequences. This contribution describes new extensions to the package, including a library of C++ classes for incorporation into other programs, a user-friendly graphical user interface written in JAVA, and new Unix-style text interfaces. The original graphical user interface for Microsoft Windows is still maintained. Conclusion The extensions to RNAstructure serve to make RNA secondary structure prediction user-friendly. The package is available for download from the Mathews lab homepage at http://rna.urmc.rochester.edu/RNAstructure.html.

  16. Integrating R and Java for Enhancing Interactivity of Algorithmic Data Analysis Software Solutions

    National Research Council Canada - National Science Library

    Titus Felix FURTUNĂ; Claudiu VINȚE

    2016-01-01

    Conceiving software solutions for statistical processing and algorithmic data analysis involves handling diverse data, fetched from various sources and in different formats, and presenting the results...

  17. Spatiotemporal image correlation analysis of blood flow in branched vessel networks of zebrafish embryos

    Science.gov (United States)

    Ceffa, Nicolo G.; Cesana, Ilaria; Collini, Maddalena; D'Alfonso, Laura; Carra, Silvia; Cotelli, Franco; Sironi, Laura; Chirico, Giuseppe

    2017-10-01

    Ramification of blood circulation is relevant in a number of physiological and pathological conditions. Oxygen exchange occurs largely in the capillary bed, and cancer progression is closely linked to angiogenesis around the tumor mass. Optical microscopy has made impressive improvements in in vivo imaging and dynamic studies based on correlation analysis of time stacks of images. Here, we develop and test advanced methods that allow mapping the flow fields in branched vessel networks at a resolution of 10 to 20 μm. The methods, based on the application of spatiotemporal image correlation spectroscopy and its extension to cross-correlation analysis, are applied here to the case of early stage embryos of zebrafish.
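
    A minimal sketch of the underlying cross-correlation idea (not the authors' STICS implementation): estimate the displacement of a moving structure between two frames from the peak of an FFT-based cross-correlation; the synthetic frames and scaling are invented.

      # Sketch: estimate frame-to-frame displacement via FFT cross-correlation.
      # Dividing the displacement by the frame interval would give a flow velocity.
      import numpy as np

      def displacement(frame_a, frame_b):
          """Return (dy, dx) such that frame_b is approximately frame_a shifted by (dy, dx)."""
          fa = np.fft.fft2(frame_a - frame_a.mean())
          fb = np.fft.fft2(frame_b - frame_b.mean())
          xcorr = np.fft.ifft2(np.conj(fa) * fb).real
          peak = np.array(np.unravel_index(np.argmax(xcorr), xcorr.shape), dtype=float)
          # Wrap shifts larger than half the image size to negative displacements.
          wrap = peak > np.array(xcorr.shape) / 2
          peak[wrap] -= np.array(xcorr.shape, dtype=float)[wrap]
          return peak

      # Synthetic example: a bright "cell" moving 3 px down and 5 px right between frames.
      frame1 = np.zeros((64, 64)); frame1[20:24, 10:14] = 1.0
      frame2 = np.zeros((64, 64)); frame2[23:27, 15:19] = 1.0
      dy, dx = displacement(frame1, frame2)
      print("estimated displacement (pixels):", dy, dx)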

  18. Fluid Vessel Quantity using Non-Invasive PZT Technology Flight Volume Measurements Under Zero G Analysis

    Science.gov (United States)

    Garofalo, Anthony A.

    2013-01-01

    The purpose of the project is to perform analysis of data using the Systems Engineering Educational Discovery (SEED) program data from the 2011 and 2012 Fluid Vessel Quantity using Non-Invasive PZT Technology flight volume measurements under zero-g conditions (parabolic plane flight data), along with experimental planning and lab work for future sub-orbital experiments that will use the NASA PZT technology for fluid volume measurement. In addition to conducting data analysis of the flight data, I also performed a variety of other tasks. I provided the lab with detailed technical drawings, experimented with 3D printers, made changes to the liquid nitrogen skid schematics, and learned how to weld. I also programmed microcontrollers to interact with various sensors and helped with other work going on around the lab.

  19. Multi-channel software defined radio experimental evaluation and analysis

    CSIR Research Space (South Africa)

    Van der Merwe, JR

    2014-09-01

    Full Text Available Multi-channel software-defined radios (SDRs) can be utilised as inexpensive prototyping platforms for transceiver arrays. The application for multi-channel prototyping is discussed and measured results of coherent channels for both receiver...

  20. Availability Analysis and Improvement of Software Rejuvenation Using Virtualization

    National Research Council Canada - National Science Library

    PARK, Jong Sou; Thandar, THEIN; Sung-Do, CHI

    2007-01-01

    .... To improve the availability of application servers, we have conducted a study of virtualization technology and software rejuvenation that follows a proactive fault-tolerant approach to counteract...

  1. Prototype Software for Automated Structural Analysis of Systems

    DEFF Research Database (Denmark)

    Jørgensen, A.; Izadi-Zamanabadi, Roozbeh; Kristensen, M.

    2004-01-01

    In this paper we present a prototype software tool that is developed to analyse the structural model of automated systems in order to identify redundant information that is hence utilized for Fault Detection and Isolation (FDI) purposes. The dedicated algorithms in this software tool use a tripartite graph that represents the structural model of the system. A component-based approach has been used to address issues such as system complexity and reconfigurability possibilities.

  3. Economic Analysis, Public Policy and the Software Industry

    OpenAIRE

    B. Richardson, George

    2010-01-01

    This paper focuses on three related matters. It analyses the process of competition in the software industry, this being important both in itself and for the light it throws on competition within all industries characterised by low or zero marginal costs and a high rate of technical development. The software industry, operating under private enterprise, is dependent on copyright, and the issues raised by intellectual property protection are therefore also considered. Given the need for inter-...

  4. Investigation, Analysis and Implementation of Open Source Mobile Communication Software

    OpenAIRE

    Paudel, Suresh

    2016-01-01

    Over the past few years, open source software has transformed mobile communication networks. The development of VoIP technologies has enabled the migration of telco protocols and services to the IP network with the help of open source software. This allows for the deployment of mobile networks in rural areas at lower cost. The use of open source GSM is very useful for developing countries which do not yet have full mobile coverage. Open source GSM allows very rapid and economical deployment o...

  5. SURROGATE MODEL DEVELOPMENT AND VALIDATION FOR RELIABILITY ANALYSIS OF REACTOR PRESSURE VESSELS

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, William M.; Riley, Matthew E.; Spencer, Benjamin W.

    2016-07-01

    In nuclear light water reactors (LWRs), the reactor coolant, core and shroud are contained within a massive, thick walled steel vessel known as a reactor pressure vessel (RPV). Given the tremendous size of these structures, RPVs typically contain a large population of pre-existing flaws introduced in the manufacturing process. After many years of operation, irradiation-induced embrittlement makes these vessels increasingly susceptible to fracture initiation at the locations of the pre-existing flaws. Because of the uncertainty in the loading conditions, flaw characteristics and material properties, probabilistic methods are widely accepted and used in assessing RPV integrity. The Fracture Analysis of Vessels – Oak Ridge (FAVOR) computer program developed by researchers at Oak Ridge National Laboratory is widely used for this purpose. This program can be used in order to perform deterministic and probabilistic risk-informed analyses of the structural integrity of an RPV subjected to a range of thermal-hydraulic events. FAVOR uses a one-dimensional representation of the global response of the RPV, which is appropriate for the beltline region, which experiences the most embrittlement, and employs an influence coefficient technique to rapidly compute stress intensity factors for axis-aligned surface-breaking flaws. The Grizzly code is currently under development at Idaho National Laboratory (INL) to be used as a general multiphysics simulation tool to study a variety of degradation mechanisms in nuclear power plant components. The first application of Grizzly has been to study fracture in embrittled RPVs. Grizzly can be used to model the thermo-mechanical response of an RPV under transient conditions observed in a pressurized thermal shock (PTS) scenario. The global response of the vessel provides boundary conditions for local 3D models of the material in the vicinity of a flaw. Fracture domain integrals are computed to obtain stress intensity factors, which can in

  6. Network-based analysis of software change propagation.

    Science.gov (United States)

    Wang, Rongcun; Huang, Rubing; Qu, Binbin

    2014-01-01

    Object-oriented software systems frequently evolve to meet new change requirements. Understanding the characteristics of changes helps testers and system designers improve software quality. Identifying important modules becomes a key issue in the process of evolution. In this context, a novel network-based approach is proposed to comprehensively investigate change distributions and the correlation between centrality measures and the scope of change propagation. First, software dependency networks are constructed at the class level. Then, the number of co-changes among classes is mined from software repositories. From the dependency relationships and the number of co-changes among classes, the scope of change propagation is calculated. Spearman rank correlation is used to analyze the correlation between centrality measures and the scope of change propagation. Three case studies on the Java open source projects FindBugs, Hibernate, and Spring are conducted to investigate the characteristics of change propagation. Experimental results show that (i) the change distribution is very uneven; and (ii) PageRank, Degree, and CIRank are significantly correlated with the scope of change propagation. In particular, CIRank shows a higher correlation coefficient, which suggests that it can be a more useful indicator for measuring the scope of change propagation of classes in an object-oriented software system.
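
    A toy sketch of the correlation analysis described above: compute PageRank and degree centrality on a small class dependency network and correlate them with a hypothetical change-propagation scope using Spearman rank correlation; the edge list and scope values are invented.

      # Sketch: centrality measures vs. change-propagation scope (toy data).
      import networkx as nx
      from scipy.stats import spearmanr

      edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "E"), ("B", "E")]
      g = nx.DiGraph(edges)                       # class-level dependency network

      pagerank = nx.pagerank(g)
      degree = dict(g.degree())

      # Hypothetical scope of change propagation mined from co-change history.
      scope = {"A": 4, "B": 6, "C": 7, "D": 2, "E": 3}

      classes = sorted(scope)
      rho_pr, p_pr = spearmanr([pagerank[c] for c in classes], [scope[c] for c in classes])
      rho_dg, p_dg = spearmanr([degree[c] for c in classes], [scope[c] for c in classes])
      print(f"PageRank vs scope: rho={rho_pr:.2f} (p={p_pr:.2f})")
      print(f"Degree   vs scope: rho={rho_dg:.2f} (p={p_dg:.2f})")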

  7. Accident Damage Analysis Module (ADAM) – Technical Guidance, Software tool for Consequence Analysis calculations

    OpenAIRE

    FABBRI LUCIANO; BINDA MASSIMO; BRUINEN DE BRUIN YURI

    2017-01-01

    This report provides a technical description of the modelling and assumptions of the Accident Damage Analysis Module (ADAM) software application, which has recently been developed by the Joint Research Centre (JRC) of the European Commission (EC) to assess the physical effects of an industrial accident resulting from an unintended release of a dangerous substance.

  8. Orbiter subsystem hardware/software interaction analysis. Volume 8: Forward reaction control system

    Science.gov (United States)

    Becker, D. D.

    1980-01-01

    The results of the orbiter hardware/software interaction analysis for the AFT reaction control system are presented. The interaction between hardware failure modes and software is examined in order to identify associated issues and risks. All orbiter subsystems and interfacing program elements which interact with the orbiter computer flight software are analyzed. The failure modes identified in the subsystem/element failure mode and effects analysis are discussed.

  9. Inequalities in Open Source Software Development: Analysis of Contributor's Commits in Apache Software Foundation Projects.

    Science.gov (United States)

    Chełkowski, Tadeusz; Gloor, Peter; Jemielniak, Dariusz

    2016-01-01

    While researchers are becoming increasingly interested in studying the OSS phenomenon, there is still a small number of studies analyzing larger samples of projects to investigate the structure of activities among OSS developers. The significant amount of information that has been gathered in publicly available open-source software repositories and mailing-list archives offers an opportunity to analyze project structures and participant involvement. In this article, using commit data from 263 Apache project repositories (nearly all of them), we show that although OSS development is often described as collaborative, it in fact predominantly relies on radically solitary input and individual, non-collaborative contributions. We also show, in the first published study of this magnitude, that the engagement of contributors follows a power-law distribution.

  10. Statistical analysis using the Bayesian nonparametric method for irradiation embrittlement of reactor pressure vessels

    Science.gov (United States)

    Takamizawa, Hisashi; Itoh, Hiroto; Nishiyama, Yutaka

    2016-10-01

    In order to understand neutron irradiation embrittlement in high fluence regions, statistical analysis using the Bayesian nonparametric (BNP) method was performed for the Japanese surveillance and material test reactor irradiation database. The BNP method is essentially expressed as an infinite summation of normal distributions, with input data being subdivided into clusters with identical statistical parameters, such as mean and standard deviation, for each cluster to estimate shifts in ductile-to-brittle transition temperature (DBTT). The clusters typically depend on chemical compositions, irradiation conditions, and the irradiation embrittlement. Specific variables contributing to the irradiation embrittlement include the content of Cu, Ni, P, Si, and Mn in the pressure vessel steels, neutron flux, neutron fluence, and irradiation temperatures. It was found that the measured shifts of DBTT correlated well with the calculated ones. Data associated with the same materials were subdivided into the same clusters even if neutron fluences were increased.
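
    A hedged sketch of the BNP clustering idea using scikit-learn's Dirichlet-process Gaussian mixture as a stand-in for the authors' method; the feature columns and the synthetic surveillance-like records are invented.

      # Sketch: Dirichlet-process mixture clustering of irradiation records (synthetic data).
      import numpy as np
      from sklearn.mixture import BayesianGaussianMixture

      rng = np.random.default_rng(0)
      # Columns: [Cu wt%, Ni wt%, log10 fluence, irradiation temperature degC] (invented).
      low_cu  = rng.normal([0.05, 0.6, 19.0, 290.0], [0.01, 0.05, 0.2, 3.0], size=(40, 4))
      high_cu = rng.normal([0.20, 1.0, 19.5, 288.0], [0.02, 0.05, 0.2, 3.0], size=(40, 4))
      X = np.vstack([low_cu, high_cu])

      dpgmm = BayesianGaussianMixture(
          n_components=10,                              # truncation level of the infinite mixture
          weight_concentration_prior_type="dirichlet_process",
          random_state=0,
      ).fit(X)

      labels = dpgmm.predict(X)
      print("clusters actually used:", np.unique(labels))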

  11. Characterization of human retinal vessel arborisation in normal and amblyopic eyes using multifractal analysis

    Directory of Open Access Journals (Sweden)

    Stefan Tălu

    2015-10-01

    Full Text Available AIM: To characterize the human retinal vessel arborisation in normal and amblyopic eyes using multifractal geometry and lacunarity parameters. METHODS: Multifractal analysis using a box counting algorithm was carried out for a set of 12 segmented and skeletonized human retinal images, corresponding to both normal (6 images) and amblyopic (6 images) states of the retina. RESULTS: It was found that the microvascular geometry of the human retinal network represents a geometrical multifractal, characterized through subsets of regions having different scaling properties that are not evident in a simple fractal analysis. Multifractal analysis of the amblyopia images (segmented and skeletonized versions) shows a higher average of the generalized dimensions (Dq for q = 0, 1, 2), indicating a higher degree of complexity of the branching structure of the human retinal microvasculature network, whereas images of healthy subjects show lower values of the generalized dimensions, indicating normal complexity of the biostructure. On the other hand, the lacunarity analysis of the amblyopia images (segmented and skeletonized versions) shows a lower average of the lacunarity parameter Λ than the corresponding values for normal images (segmented and skeletonized versions). CONCLUSION: The multifractal and lacunarity analysis may be used as a non-invasive predictive complementary tool to distinguish amblyopic subjects from healthy subjects, and hence this technique could be used for an early diagnosis of patients with amblyopia.
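
    A minimal box-counting sketch of the generalized dimensions Dq and gliding-box lacunarity mentioned above, applied to a synthetic binary image rather than the retinal data; box sizes and the random stand-in image are illustrative only.

      # Sketch: box-counting estimate of D_q and a single-box-size lacunarity.
      import numpy as np

      def generalized_dimension(img, q, sizes=(2, 4, 8, 16, 32)):
          """Estimate D_q of a binary image from a box-counting partition sum."""
          logs_eps, logs_sum = [], []
          total = img.sum()
          for s in sizes:
              h, w = (img.shape[0] // s) * s, (img.shape[1] // s) * s
              boxes = img[:h, :w].reshape(h // s, s, w // s, s).sum(axis=(1, 3))
              p = boxes[boxes > 0] / total                  # box occupation probabilities
              if q == 1:
                  logs_sum.append(np.sum(p * np.log(p)))    # information dimension limit
              else:
                  logs_sum.append(np.log(np.sum(p ** q)) / (q - 1))
              logs_eps.append(np.log(s / max(img.shape)))
          return np.polyfit(logs_eps, logs_sum, 1)[0]       # slope = D_q

      def lacunarity(img, box=8):
          """Gliding-box lacunarity: var(mass)/mean(mass)^2 + 1 for one box size."""
          h, w = img.shape
          masses = np.asarray([img[i:i + box, j:j + box].sum()
                               for i in range(h - box + 1) for j in range(w - box + 1)], dtype=float)
          return masses.var() / masses.mean() ** 2 + 1.0

      rng = np.random.default_rng(1)
      vessels = (rng.random((128, 128)) < 0.08).astype(float)   # sparse random stand-in image
      for q in (0, 1, 2):
          print(f"D_{q} = {generalized_dimension(vessels, q):.2f}")
      print("lacunarity (box=8):", round(lacunarity(vessels, 8), 3))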

  12. Manufacturing Cost Analysis of Novel Steel/Concrete Composite Vessel for Stationary Storage of High-Pressure Hydrogen

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Zhili [ORNL; Zhang, Wei [ORNL; Wang, Jy-An John [ORNL; Ren, Fei [ORNL

    2012-09-01

    A novel, low-cost, high-pressure, steel/concrete composite vessel (SCCV) technology for stationary storage of compressed gaseous hydrogen (CGH2) is currently under development at Oak Ridge National Laboratory (ORNL), sponsored by DOE's Fuel Cell Technologies (FCT) Program. The SCCV technology uses commodity materials, including structural steels and concretes, to achieve cost, durability and safety requirements. In particular, the hydrogen embrittlement of high-strength low-alloy steels, a major safety and durability issue for the current industry-standard pressure vessel technology, is mitigated through the use of a unique layered steel shell structure. This report presents the cost analysis results for the novel SCCV technology. A high-fidelity cost analysis tool is developed, based on a detailed, bottom-up approach that takes into account the material and labor costs involved in each of the vessel manufacturing steps. A thorough cost study is performed to understand the SCCV cost as a function of the key vessel design parameters, including hydrogen pressure, vessel dimensions, and load-carrying ratio. The major conclusions include: The SCCV technology can meet the technical/cost targets set forth by DOE's FCT Program for FY2015 and FY2020 for all three pressure levels (i.e., 160, 430 and 860 bar) relevant to the hydrogen production and delivery infrastructure. Further vessel cost reduction can benefit from the development of advanced vessel fabrication technologies such as highly automated friction stir welding (FSW). The ORNL-patented multi-layer, multi-pass FSW can not only reduce the amount of labor needed for assembling and welding the layered steel vessel, but also make it possible to use even higher strength steels for further cost reductions and improvement of vessel structural integrity. It is noted that the cost analysis results demonstrate the significant cost advantage attainable by the SCCV technology for different pressure levels when compared to the

  13. Software and package applicating for network meta-analysis: A usage-based comparative study.

    Science.gov (United States)

    Xu, Chang; Niu, Yuming; Wu, Junyi; Gu, Huiyun; Zhang, Chao

    2017-12-21

    To compare and analyze the characteristics and functions of software applications for network meta-analysis (NMA). PubMed, EMbase, The Cochrane Library, the official websites of Bayesian inference Using Gibbs Sampling (BUGS), Stata and R, and Google were searched to collect the software and packages for performing NMA; software and packages published up to March 2016 were included. After collecting the software, packages, and their user guides, we used the software and packages to calculate a typical example. All characteristics, functions, and computed results were compared and analyzed. Ten types of software were included, including programming and non-programming software. They were developed mainly based on Bayesian or frequentist theory. Most types of software have the characteristics of easy operation, easy mastery, exact calculation, or excellent graphing. However, there was no single software that performed accurate calculations with superior graphing; this could only be achieved through the combination of two or more types of software. This study suggests that the user should choose the appropriate software according to personal programming basis, operational habits, and financial ability. Then, the choice of the combination of BUGS and R (or Stata) software to perform the NMA is considered. © 2017 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.

  14. A Method for Software Requirement Volatility Analysis Using QFD

    Directory of Open Access Journals (Sweden)

    Yunarso Anang

    2016-10-01

    Full Text Available Changes of software requirements are inevitable during the development life cycle. Rather than avoiding the circumstance, it is easier to accept it and find a way to anticipate those changes. This paper proposes a method to analyze the volatility of requirements by using the Quality Function Deployment (QFD) method and an introduced degree of volatility. Customer requirements are deployed to software functions and subsequently to architectural design elements. Then, after determining the potential for change of the design elements, the degree of volatility of the software requirements is calculated. In this paper the method is described using a flow diagram, illustrated using a simple example, and evaluated using a case study.
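
    A hypothetical numerical sketch of the deployment idea: customer requirements are propagated through two QFD-style relationship matrices and weighted by the change potential of design elements; the names, weights, and the exact volatility formula are illustrative and not taken from the paper.

      # Sketch: requirements -> functions -> design elements, then a volatility score.
      import numpy as np

      requirements = ["easy data entry", "fast reports", "secure login"]

      # Relationship strengths (rows: requirements -> columns: functions), QFD-style 0/1/3/9.
      R_to_F = np.array([[9, 1, 0],
                         [0, 9, 0],
                         [1, 0, 9]], dtype=float)
      # Functions -> design elements.
      F_to_D = np.array([[9, 1, 0],
                         [1, 9, 0],
                         [3, 0, 9]], dtype=float)

      change_potential = np.array([0.7, 0.4, 0.1])       # per design element, in [0, 1]

      R_to_D = R_to_F @ F_to_D                            # requirements deployed to design elements
      volatility = (R_to_D * change_potential).sum(axis=1) / R_to_D.sum(axis=1)

      for req, v in zip(requirements, volatility):
          print(f"{req:>16s}: degree of volatility = {v:.2f}")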

  15. RFcap: a software analysis tool for multichannel cochlear implant signals.

    Science.gov (United States)

    Lai, Wai Kong; Dillier, Norbert

    2013-03-01

    Being able to display and analyse the output of a speech processor that encodes the parameters of complex stimuli to be presented by a cochlear implant (CI) is useful for software and hardware development as well as for diagnostic purposes. This firstly requires appropriate hardware that is able to receive and decode the radio frequency (RF)-coded signals, and then processing the decoded data using suitable software. The PCI-IF6 clinical hardware for the Nucleus CI system, together with the Nucleus Implant Communicator and Nucleus Matlab Toolbox research software libraries, provide the necessary functionality. RFcap is a standalone Matlab application that encapsulates the relevant functions to capture, display, and analyse the RF-coded signals intended for the Nucleus CI24M/R, CI24RE, and CI500 multichannel CIs.

  16. Analysis by NASA's VESGEN Software of Vascular Branching in the Human Retina with a Ground-Based Microgravity Analog

    Science.gov (United States)

    Parsons-Wingerter, Patricia; Vyas, Ruchi J.; Raghunandan, Sneha; Vu, Amanda C.; Zanello, Susana B.; Ploutz-Snyder, Robert; Taibbi, Giovanni; Vizzeri, Gianmarco

    2016-01-01

    Significant risks for visual impairment were discovered recently in astronauts following spaceflight, especially after long-duration missions.1 We hypothesize that microgravity-induced fluid shifts result in pathological changes within the retinal vasculature that precede visual and other ocular impairments. We therefore are analyzing retinal vessels in healthy subjects with NASA's VESsel GENeration Analysis (VESGEN) software2 before and after head-down tilt (HDT), a ground-based microgravity analog. For our preliminary study of masked images, two groups of venous trees with and without small veins (G=7) were clearly identified by VESGEN analysis. Upon completing all images and unmasking the subject status of pre- and post-HDT, we will determine whether differences in the presence or absence of small veins are important correlates, and perhaps reliable predictors, of other ocular and physiological adaptations to prolonged HDT and microgravity. Greater peripapillary retinal thickening was measured following 70-day HDT bed rest than 14-day HDT bed rest, suggesting that the duration of HDT may increase the amount of optic disc swelling.3 Spectralis OCT detected retinal nerve fiber layer thickening post HDT, without clinical signs of optic disc edema. Such changes may have resulted from HDT-induced cephalad fluid shifts. Clinical methods for examining adaptive microvascular remodeling in the retina in response to microgravity space flight are currently not established.

  17. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Directory of Open Access Journals (Sweden)

    Jyri Pakarinen

    2010-01-01

    Full Text Available Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
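
    As one example of a distortion measurement such a tool can report, the sketch below estimates the total harmonic distortion (THD) of a generic memoryless soft-clipper driven by a sine, using an FFT; the nonlinearity is a stand-in, not one of the devices analysed in the paper.

      # Sketch: THD of a soft-clipping nonlinearity estimated from the spectrum.
      import numpy as np

      fs, f0, n = 48000, 1000.0, 48000
      t = np.arange(n) / fs
      x = np.sin(2 * np.pi * f0 * t)
      y = np.tanh(2.0 * x)                    # hypothetical distorting device

      spectrum = np.abs(np.fft.rfft(y * np.hanning(n)))
      freqs = np.fft.rfftfreq(n, 1.0 / fs)

      def level(freq):
          return spectrum[np.argmin(np.abs(freqs - freq))]

      fundamental = level(f0)
      harmonics = [level(k * f0) for k in range(2, 10)]
      thd = np.sqrt(np.sum(np.square(harmonics))) / fundamental
      print(f"THD = {100 * thd:.1f} %")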

  18. Availability Analysis and Improvement of Software Rejuvenation Using Virtualization

    Directory of Open Access Journals (Sweden)

    Jong Sou PARK

    2007-01-01

    Full Text Available Availability of business-critical application servers is an issue of paramount importance that has received special attention from industry and academia. To improve the availability of application servers, we have conducted a study of virtualization technology and software rejuvenation that follows a proactive fault-tolerant approach to counteract the software aging problem. We present Markov models for analyzing availability in such continuously running applications and express availability, downtime, and downtime costs during rejuvenation in terms of the parameters in the models. Our results show that our approach is a practical way to ensure uninterrupted availability and to optimize performance even for strongly aging applications.
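
    A toy continuous-time Markov model in the spirit described above, with states healthy/aged/rejuvenating/failed; steady-state availability is the probability of being in a working state. All transition rates are invented for illustration and are not the paper's parameters.

      # Sketch: steady-state availability of a rejuvenation CTMC (toy rates, per hour).
      import numpy as np

      states = ["healthy", "aged", "rejuvenating", "failed"]
      lam_age, lam_fail = 1 / 240.0, 1 / 72.0       # aging and failure rates
      rejuv_rate, repair_rate = 1 / 0.2, 1 / 2.0    # 12 min rejuvenation, 2 h repair
      trigger = 1 / 24.0                            # rejuvenation triggered daily on aged servers

      Q = np.array([
          [-lam_age,      lam_age,                0.0,         0.0         ],
          [0.0,          -(trigger + lam_fail),   trigger,     lam_fail    ],
          [rejuv_rate,    0.0,                   -rejuv_rate,  0.0         ],
          [repair_rate,   0.0,                    0.0,        -repair_rate ],
      ])

      # Solve pi Q = 0 with sum(pi) = 1.
      A = np.vstack([Q.T, np.ones(len(states))])
      b = np.zeros(len(states) + 1); b[-1] = 1.0
      pi, *_ = np.linalg.lstsq(A, b, rcond=None)

      availability = pi[0] + pi[1]                  # healthy or aged = service is up
      print(dict(zip(states, np.round(pi, 5))))
      print(f"steady-state availability = {availability:.5f}")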

  19. STATISTICAL ANALYSIS FOR OBJECT ORIENTED DESIGN SOFTWARE SECURITY METRICS

    OpenAIRE

    Amjan.Shaik; Dr.C.R.K.Reddy; Dr.A.Damodaran

    2010-01-01

    In the last decade, empirical studies on object-oriented design metrics have shown some of them to be useful for predicting the fault-proneness of classes in object-oriented software systems. In the era of computerization, the object-oriented paradigm is becoming more and more pronounced. This has provoked the need for high-quality object-oriented software, as traditional metrics cannot be applied to object-oriented systems. This paper gives an evaluation of the CK suite of metrics. There are q...

  20. BEANS - a software package for distributed Big Data analysis

    OpenAIRE

    Hypki, Arkadiusz

    2016-01-01

    BEANS software is a web-based, easy to install and maintain tool to store and analyse massive amounts of data in a distributed way. It provides a clear interface for querying, filtering, aggregating, and plotting data from an arbitrary number of datasets. Its main purpose is to simplify the process of storing, examining and finding new relations in so-called Big Data. Creation of the BEANS software is an answer to the growing needs of the astronomical community to have a versat...

  1. Using recurrence plot analysis for software execution interpretation and fault detection

    Science.gov (United States)

    Mosdorf, M.

    2015-09-01

    This paper shows a method targeted at software execution interpretation and fault detection using recurrence plot analysis. In the proposed approach, recurrence plot analysis is applied to a software execution trace that contains the executed assembly instructions. Results of this analysis are subject to further processing with the PCA (Principal Component Analysis) method, which reduces the number of coefficients used for software execution classification. This method was used for the analysis of five algorithms: Bubble Sort, Quick Sort, Median Filter, FIR, and SHA-1. Results show that some of the collected traces could easily be assigned to particular algorithms (logs from the Bubble Sort and FIR algorithms) while others are more difficult to distinguish.
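
    A minimal recurrence-plot sketch for a numeric trace (the paper works on traces of executed assembly instructions; a simple 1-D signal stands in here), together with two coarse features that could feed the PCA step; the threshold and embedding choices are illustrative only.

      # Sketch: recurrence matrix and two coarse recurrence-quantification features.
      import numpy as np

      def recurrence_matrix(x, eps):
          """R[i, j] = 1 when |x[i] - x[j]| < eps."""
          d = np.abs(x[:, None] - x[None, :])
          return (d < eps).astype(int)

      t = np.arange(200)
      trace = np.sin(0.3 * t) + 0.1 * np.random.default_rng(0).standard_normal(200)
      R = recurrence_matrix(trace, eps=0.2)

      recurrence_rate = R.mean()
      diag = np.diagonal(R, offset=1)
      determinism_proxy = diag.mean()               # fraction of points continuing a diagonal line
      print(f"recurrence rate = {recurrence_rate:.3f}, determinism proxy = {determinism_proxy:.3f}")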

  2. Description of the GMAO OSSE for Weather Analysis Software Package: Version 3

    Science.gov (United States)

    Koster, Randal D. (Editor); Errico, Ronald M.; Prive, Nikki C.; Carvalho, David; Sienkiewicz, Meta; El Akkraoui, Amal; Guo, Jing; Todling, Ricardo; McCarty, Will; Putman, William M.; hide

    2017-01-01

    The Global Modeling and Assimilation Office (GMAO) at the NASA Goddard Space Flight Center has developed software and products for conducting observing system simulation experiments (OSSEs) for weather analysis applications. Such applications include estimations of potential effects of new observing instruments or data assimilation techniques on improving weather analysis and forecasts. The GMAO software creates simulated observations from nature run (NR) data sets and adds simulated errors to those observations. The algorithms employed are much more sophisticated, adding a much greater degree of realism, compared with OSSE systems currently available elsewhere. The algorithms employed, software designs, and validation procedures are described in this document. Instructions for using the software are also provided.

  3. Control and analysis software for a laser scanning microdensitometer

    Indian Academy of Sciences (India)

    It provides a user-friendly Graphical User Interface (GUI) to analyse the scanned data and also to store the analysed data/images in popular formats, such as data in Excel and images in JPEG. It also has an on-line calibration facility with standard optical density tablets. The control software and data acquisition system is simple, ...

  4. Graph based communication analysis for hardware/software codesign

    DEFF Research Database (Denmark)

    Knudsen, Peter Voigt; Madsen, Jan

    1999-01-01

    In this paper we present a coarse grain CDFG (Control/Data Flow Graph) model suitable for hardware/software partitioning of single processes and demonstrate how it is necessary to perform various transformations on the graph structure before partitioning in order to achieve a structure that allows...

  5. Analysis of crosscutting across software development phases based on traceability

    NARCIS (Netherlands)

    van den Berg, Klaas; Conejero, J.M.; Hernández, J.

    2006-01-01

    Traceability of requirements and concerns enhances the quality of software development. We use trace relations to define crosscutting. As starting point, we set up a dependency matrix to capture the relationship between elements at two levels, e.g. concerns and representations of concerns. The

  6. Software package for analysis of completely randomized block design

    African Journals Online (AJOL)

    This study is to design and develop statistical software (package), OYSP1.0, which conveniently accommodates and analyzes a large mass of data emanating from experimental designs, in particular the Completely Randomized Block design. Visual Basic programming is used in the design. The statistical package OYSP 1.0 ...

  7. Software engineering article types: An analysis of the literature

    NARCIS (Netherlands)

    Montesi, M.; Lago, P.

    2008-01-01

    The software engineering (SE) community has recently recognized that the field lacks well-established research paradigms and clear guidance on how to write good research reports. With no comprehensive guide to the different article types in the field, article writing and reviewing heavily depends on

  8. Accuracy of 3D Imaging Software in Cephalometric Analysis

    Science.gov (United States)

    2013-06-21

    measurements and is superior in observing complex periodontal defects (Misch, Yi & Sarment, 2006). With endodontics, CBCT provides more adequate diagnosis...applications of CBCT in dentistry include all of the following fields: oral and maxillofacial surgery, endodontics, implant dentistry, orthodontics...without magnification or distortion errors (Alamri, Sadrameli, Alshalhoob, Sadrameli & Alshehri, 2012). Orthodontic assessment software programs allow

  9. Analysis of Code Refactoring Impact on Software Quality

    Directory of Open Access Journals (Sweden)

    Kaur Amandeep

    2016-01-01

    Full Text Available Code refactoring is a technique used for restructuring an existing source code, improving its internal structure without changing its external behaviour. It is the process of changing source code in such a way that it does not alter the external behaviour of the code yet improves its internal structure. It is a way to clean up code that minimizes the chances of introducing bugs. Refactoring is a change made to the internal structure of a software component to make it easier to understand and cheaper to modify, without changing the observable behaviour of that software component. Bad smells indicate that there is something wrong in the code that has to be refactored. Different tools are available to identify and remove these bad smells. Refactoring changes the source code into a more readable and maintainable form by removing the bad smells, and it is used to improve the quality of software by reducing its complexity. In this paper, bad smells are found, refactoring is performed based on these bad smells, and the complexity of the program is then computed and compared with the initial complexity. The paper shows that when refactoring is performed, the complexity of the software decreases and the code becomes easier to understand.
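
    An illustrative before/after complexity check (not the tooling used in the paper): cyclomatic complexity approximated with Python's ast module as one plus the number of decision points, applied to a small hypothetical refactoring that preserves behaviour.

      # Sketch: compare an approximate cyclomatic complexity before and after refactoring.
      import ast

      def cyclomatic_complexity(source: str) -> int:
          decision_nodes = (ast.If, ast.For, ast.While, ast.BoolOp)
          tree = ast.parse(source)
          return 1 + sum(isinstance(node, decision_nodes) for node in ast.walk(tree))

      before = """
      def price(kind, qty):
          if kind == 'A':
              if qty > 10:
                  return qty * 0.9
              else:
                  return qty
          elif kind == 'B':
              return qty * 1.2
          return 0
      """

      after = """
      RATES = {'A': 1.0, 'B': 1.2}
      def price(kind, qty):
          rate = RATES.get(kind, 0.0)
          discount = 0.9 if kind == 'A' and qty > 10 else 1.0
          return qty * rate * discount
      """

      print("complexity before:", cyclomatic_complexity(before))
      print("complexity after :", cyclomatic_complexity(after))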

  10. Risk Analysis and Decision-Making Software Package (1997 Version) User Manual

    Energy Technology Data Exchange (ETDEWEB)

    Chung, F.T.H.

    1999-02-11

    This manual provides instructions for using the U.S. Department of Energy's (DOE) risk analysis and decision making software (1997 version) developed at BDM Petroleum Technologies by BDM-Oklahoma, Inc. for DOE, under contract No. DE-AC22-94PC91OO8. This software provides petroleum producers with a simple, handy tool for exploration and production risk analysis and decision-making. It collects useful risk analysis tools in one package so that users do not have to use several programs separately. The software is simple to use, but still provides many functions. The 1997 version of the software package includes the following tools: (1) Investment risk (Gambler's ruin) analysis; (2) Monte Carlo simulation; (3) Best fit for distribution functions; (4) Sample and rank correlation; (5) Enhanced oil recovery method screening; and (6) artificial neural network. This software package is subject to change. Suggestions and comments from users are welcome and will be considered for future modifications and enhancements of the software. Please check the opening screen of the software for the current contact information. In the future, more tools will be added to this software package. This manual includes instructions on how to use the software but does not attempt to fully explain the theory and algorithms used to create it.

  11. Practicality for Software Hazard Analysis for Nuclear Safety I and C System

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yong-Ho; Moon, Kwon-Ki; Chang, Young-Woo; Jeong, Soo-Hyun [KEPCO Engineering and Construction Co., Deajeon (Korea, Republic of)

    2016-10-15

    We are using the concept of system safety in engineering. It is difficult to make any system perfectly safe, and a complete system may not easily be achieved. The standard definition of a system from MIL-STD-882E is: “The organization of hardware, software, material, facilities, personnel, data, and services needed to perform a designated function within a stated environment with specified results.” From the perspective of the system safety engineer and the hazard analysis process, software is considered a subsystem. Regarding hazard analysis, methods for identifying software failures and determining their effects are, to date, still a research problem. Since the success of software development is based on rigorous testing of hardware and software, it is necessary to check the balance between software testing and hardware testing, also in terms of efficiency. Lessons learned and experience from similar systems are important for the work of hazard analysis. No major hazard has been issued for the software developed and verified in Korean NPPs. In addition to hazard analysis, software development and verification and validation were thoroughly performed. It is reasonable that test implementation, including the development of test cases, stress and abnormal conditions, error recovery situations, and high-risk hazardous situations, plays a key role in detecting and preventing software faults.

  12. A tool to include gamma analysis software into a quality assurance program.

    Science.gov (United States)

    Agnew, Christina E; McGarry, Conor K

    2016-03-01

    To provide a tool to enable gamma analysis software algorithms to be included in a quality assurance (QA) program. Four image sets were created, comprising two geometric images to independently test the distance-to-agreement (DTA) and dose-difference (DD) elements of the gamma algorithm, a clinical step-and-shoot IMRT field, and a clinical VMAT arc. The images were analysed using global and local gamma analysis with 2 in-house and 8 commercially available software packages encompassing 15 software versions. The effect of image resolution on gamma pass rates was also investigated. All but one software package accurately calculated the gamma passing rate for the geometric images. Variation in global gamma passing rates of 1% at 3%/3 mm and over 2% at 1%/1 mm was measured between software packages and software versions with analysis of appropriately sampled images. This study provides a suite of test images and the gamma pass rates achieved for a selection of commercially available software. This image suite will enable validation of gamma analysis software within a QA program and provide a frame of reference by which to compare results reported in the literature from various manufacturers and software versions. Copyright © 2015. Published by Elsevier Ireland Ltd.
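
    A sketch of the 1-D global gamma calculation (3%/3 mm) that such geometric test images are designed to exercise; the dose profiles and grid spacing below are synthetic, not the study's image sets.

      # Sketch: global gamma index on synthetic 1-D dose profiles.
      import numpy as np

      def gamma_1d(ref, eval_, dx_mm, dd_pct=3.0, dta_mm=3.0):
          """Return the gamma value at each reference point (global normalisation)."""
          x = np.arange(len(ref)) * dx_mm
          dd_norm = dd_pct / 100.0 * ref.max()
          gammas = np.empty(len(ref))
          for i, (xi, di) in enumerate(zip(x, ref)):
              dist2 = ((x - xi) / dta_mm) ** 2
              dose2 = ((eval_ - di) / dd_norm) ** 2
              gammas[i] = np.sqrt(np.min(dist2 + dose2))
          return gammas

      x = np.linspace(0, 60, 121)                           # 0.5 mm spacing
      reference = np.exp(-((x - 30) / 12.0) ** 2) * 100
      evaluated = np.exp(-((x - 30.8) / 12.0) ** 2) * 101   # small shift plus output difference

      g = gamma_1d(reference, evaluated, dx_mm=0.5)
      print(f"gamma passing rate: {100 * np.mean(g <= 1.0):.1f} %")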

  13. Thematic Review and Analysis of Grounded Theory Application in Software Engineering

    Directory of Open Access Journals (Sweden)

    Omar Badreddin

    2013-01-01

    Full Text Available We present metacodes, a new concept to guide grounded theory (GT research in software engineering. Metacodes are high level codes that can help software engineering researchers guide the data coding process. Metacodes are constructed in the course of analyzing software engineering papers that use grounded theory as a research methodology. We performed a high level analysis to discover common themes in such papers and discovered that GT had been applied primarily in three software engineering disciplines: agile development processes, geographically distributed software development, and requirements engineering. For each category, we collected and analyzed all grounded theory codes and created, following a GT analysis process, what we call metacodes that can be used to drive further theory building. This paper surveys the use of grounded theory in software engineering and presents an overview of successes and challenges of applying this research methodology.

  14. Meta-analysis: retinal vessel caliber and risk for coronary heart disease

    NARCIS (Netherlands)

    McGeechan, Kevin; Liew, Gerald; Macaskill, Petra; Irwig, Les; Klein, Ronald; Klein, Barbara E. K.; Wang, Jie Jin; Mitchell, Paul; Vingerling, Johannes R.; deJong, Paulus T. V. M.; Witteman, Jacqueline C. M.; Breteler, Monique M. B.; Shaw, Jonathan; Zimmet, Paul; Wong, Tien Y.

    2009-01-01

    Retinal vessel caliber may be a novel marker of coronary heart disease (CHD) risk. However, the sex-specific effect, magnitude of association, and effect independent of traditional CHD disease risk factors remain unclear. To determine the association between retinal vessel caliber and risk for CHD.

  15. ACQ4: an open-source software platform for data acquisition and analysis in neurophysiology research

    OpenAIRE

    Campagnola, Luke; Kratz, Megan B.; Manis, Paul B.

    2014-01-01

    The complexity of modern neurophysiology experiments requires specialized software to coordinate multiple acquisition devices and analyze the collected data. We have developed ACQ4, an open-source software platform for performing data acquisition and analysis in experimental neurophysiology. This software integrates the tasks of acquiring, managing, and analyzing experimental data. ACQ4 has been used primarily for standard patch-clamp electrophysiology, laser scanning photostimulation, multi...

  16. Vulnerability analysis of a pressurized aluminum composite vessel against hypervelocity impacts

    Directory of Open Access Journals (Sweden)

    Hereil Pierre-Louis

    2015-01-01

    Full Text Available The vulnerability of high pressure vessels subjected to high velocity impacts of space debris is analyzed through the response of pressurized vessels to the hypervelocity impact of an aluminum sphere. The investigated tanks are CFRP (carbon fiber reinforced plastic) overwrapped Al vessels. The explored internal pressure of nitrogen ranges from 1 bar to 300 bar, and impact velocities are around 4400 m/s. Data obtained from X-ray radiographs and particle velocity measurements show the evolution of the debris cloud and shock wave propagation in the pressurized nitrogen. Observation of the recovered vessels leads to the damage pattern and to its evolution as a function of the internal pressure. It is shown that the rupture mode is not a bursting mode but rather catastrophic damage of the external carbon composite part of the vessel.

  17. Usability study of clinical exome analysis software: top lessons learned and recommendations.

    Science.gov (United States)

    Shyr, Casper; Kushniruk, Andre; Wasserman, Wyeth W

    2014-10-01

    New DNA sequencing technologies have revolutionized the search for genetic disruptions. Targeted sequencing of all protein coding regions of the genome, called exome analysis, is actively used in research-oriented genetics clinics, with the transition to exomes as a standard procedure underway. This transition is challenging; identification of potentially causal mutation(s) amongst ∼10(6) variants requires specialized computation in combination with expert assessment. This study analyzes the usability of user interfaces for clinical exome analysis software. There are two study objectives: (1) To ascertain the key features of successful user interfaces for clinical exome analysis software based on the perspective of expert clinical geneticists, (2) To assess user-system interactions in order to reveal strengths and weaknesses of existing software, inform future design, and accelerate the clinical uptake of exome analysis. Surveys, interviews, and cognitive task analysis were performed for the assessment of two next-generation exome sequence analysis software packages. The subjects included ten clinical geneticists who interacted with the software packages using the "think aloud" method. Subjects' interactions with the software were recorded in their clinical office within an urban research and teaching hospital. All major user interface events (from the user interactions with the packages) were time-stamped and annotated with coding categories to identify usability issues in order to characterize desired features and deficiencies in the user experience. We detected 193 usability issues, the majority of which concern interface layout and navigation, and the resolution of reports. Our study highlights gaps in specific software features typical within exome analysis. The clinicians perform best when the flow of the system is structured into well-defined yet customizable layers for incorporation within the clinical workflow. The results highlight opportunities to

  18. AMIDE: A Free Software Tool for Multimodality Medical Image Analysis

    Directory of Open Access Journals (Sweden)

    Andreas Markus Loening

    2003-07-01

    Full Text Available Amide's a Medical Image Data Examiner (AMIDE) has been developed as a user-friendly, open-source software tool for displaying and analyzing multimodality volumetric medical images. Central to the package's abilities to simultaneously display multiple data sets (e.g., PET, CT, MRI) and regions of interest is the on-demand data reslicing implemented within the program. Data sets can be freely shifted, rotated, viewed, and analyzed with the program automatically handling interpolation as needed from the original data. Validation has been performed by comparing the output of AMIDE with that of several existing software packages. AMIDE runs on UNIX, Macintosh OS X, and Microsoft Windows platforms, and it is freely available with source code under the terms of the GNU General Public License.

  19. AMIDE: a free software tool for multimodality medical image analysis.

    Science.gov (United States)

    Loening, Andreas Markus; Gambhir, Sanjiv Sam

    2003-07-01

    Amide's a Medical Image Data Examiner (AMIDE) has been developed as a user-friendly, open-source software tool for displaying and analyzing multimodality volumetric medical images. Central to the package's abilities to simultaneously display multiple data sets (e.g., PET, CT, MRI) and regions of interest is the on-demand data reslicing implemented within the program. Data sets can be freely shifted, rotated, viewed, and analyzed with the program automatically handling interpolation as needed from the original data. Validation has been performed by comparing the output of AMIDE with that of several existing software packages. AMIDE runs on UNIX, Macintosh OS X, and Microsoft Windows platforms, and it is freely available with source code under the terms of the GNU General Public License.

  20. Domain analysis for the reuse of software development experiences

    Science.gov (United States)

    Basili, V. R.; Briand, L. C.; Thomas, W. M.

    1994-01-01

    We need to be able to learn from past experiences so we can improve our software processes and products. The Experience Factory is an organizational structure designed to support and encourage the effective reuse of software experiences. This structure consists of two organizations, separating project development concerns from the organizational concerns of experience packaging and learning. The experience factory provides the processes and support for analyzing, packaging, and improving the organization's stored experience. The project organization is structured to reuse this stored experience in its development efforts. However, a number of questions arise: What past experiences are relevant? Can they all be used (reused) on our current project? How do we take advantage of what has been learned in other parts of the organization? How do we take advantage of experience in the world-at-large? Can someone else's best practices be used in our organization with confidence? This paper describes approaches to help answer these questions. We propose both quantitative and qualitative approaches for effectively reusing software development experiences.

  1. Space Telecommunications Radio System Software Architecture Concepts and Analysis

    Science.gov (United States)

    Handler, Louis M.; Hall, Charles S.; Briones, Janette C.; Blaser, Tammy M.

    2008-01-01

    The Space Telecommunications Radio System (STRS) project investigated various Software Defined Radio (SDR) architectures for Space. An STRS architecture has been selected that separates the STRS operating environment from its various waveforms and also abstracts any specialized hardware to limit its effect on the operating environment. The design supports software evolution where new functionality is incorporated into the radio. Radio hardware functionality has been moving from hardware based ASICs into firmware and software based processors such as FPGAs, DSPs and General Purpose Processors (GPPs). Use cases capture the requirements of a system by describing how the system should interact with the users or other systems (the actors) to achieve a specific goal. The Unified Modeling Language (UML) is used to illustrate the Use Cases in a variety of ways. The Top Level Use Case diagram shows groupings of the use cases and how the actors are involved. The state diagrams depict the various states that a system or object may be in and the transitions between those states. The sequence diagrams show the main flow of activity as described in the use cases.

  2. Potku - New analysis software for heavy ion elastic recoil detection analysis

    Science.gov (United States)

    Arstila, K.; Julin, J.; Laitinen, M. I.; Aalto, J.; Konu, T.; Kärkkäinen, S.; Rahkonen, S.; Raunio, M.; Itkonen, J.; Santanen, J.-P.; Tuovinen, T.; Sajavaara, T.

    2014-07-01

    Time-of-flight elastic recoil detection (ToF-ERD) analysis software has been developed. The software combines a Python-language graphical front-end with a C code computing back-end in a user-friendly way. The software uses a list of coincident time-of-flight-energy (ToF-E) events as an input. The ToF calibration can be determined with a simple graphical procedure. The graphical interface allows the user to select different elements and isotopes from a ToF-E histogram and to convert the selections to individual elemental energy and depth profiles. The resulting sample composition can be presented as relative or absolute concentrations by integrating the depth profiles over user-defined ranges. Beam induced composition changes can be studied by displaying the event-based data in fractions relative to the substrate reference data. Optional angular input data allows for kinematic correction of the depth profiles. This open source software is distributed under the GPL license for Linux, Mac, and Windows environments.

  3. ANALYSIS OF SOFTWARE THREATS TO THE AUTOMATIC IDENTIFICATION SYSTEM

    Directory of Open Access Journals (Sweden)

    Marijan Gržan

    2017-01-01

    Full Text Available The Automatic Identification System (AIS) represents an important improvement in the fields of maritime security and vessel tracking. It is used by the signatory countries to the SOLAS Convention and by private and public providers. Its main advantage is that it can be used as an additional navigation aid, especially in avoiding collisions at sea and in search and rescue operations. The present work analyses the functioning of the AIS system and the ways of exchanging data among the users. We also study one of the vulnerabilities of the system that can be abused by malicious users. The threat itself is analysed in detail in order to provide insight into the whole process, from the creation of a program to its implementation.

  4. On The Human, Organizational, and Technical Aspects of Software Development and Analysis

    Science.gov (United States)

    Damaševičius, Robertas

    Information systems are designed, constructed, and used by people. Therefore, a software design process is not purely a technical task, but a complex psycho-socio-technical process embedded within organizational, cultural, and social structures. These structures influence the behavior and products of the programmer's work such as source code and documentation. This chapter (1) discusses the non-technical (organizational, social, cultural, and psychological) aspects of software development reflected in program source code; (2) presents a taxonomy of the social disciplines of computer science; and (3) discusses the socio-technical software analysis methods for discovering the human, organizational, and technical aspects embedded within software development artifacts.

  5. FieldTrip: Open source software for advanced analysis of MEG, EEG, and invasive electrophysiological data

    NARCIS (Netherlands)

    Oostenveld, R.; Fries, P.; Maris, E.G.G.; Schoffelen, J.M.

    2011-01-01

    This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow

  6. Using software security analysis to verify the secure socket layer (SSL) protocol

    Science.gov (United States)

    Powell, John D.

    2004-01-01

    The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative "Reducing Software Security Risk (RSSR) Through an Integrated Approach" offers, among its capabilities, formal verification of software security properties through the use of model based verification (MBV) to address software security risks. [1,2,3,4,5,6] MBV is a formal approach to software assurance that combines analysis of software, via abstract models, with technology, such as model checkers, that provides automation of the mechanical portions of the analysis process. This paper will discuss: the need for formal analysis to assure software systems with respect to security, and why testing alone cannot provide it; the means by which MBV with a Flexible Modeling Framework (FMF) accomplishes the necessary analysis task; and an example of FMF-style MBV in the verification of properties over the Secure Socket Layer (SSL) communication protocol as a demonstration.

  7. Analysis of HRCT-derived xylem network reveals reverse flow in some vessels.

    Science.gov (United States)

    Lee, Eric F; Matthews, Mark A; McElrone, Andrew J; Phillips, Ronald J; Shackel, Kenneth A; Brodersen, Craig R

    2013-09-21

    Long distance water and nutrient transport in plants is dependent on the proper functioning of xylem networks, a series of interconnected pipe-like cells that are vulnerable to hydraulic dysfunction as a result of drought-induced embolism and/or xylem-dwelling pathogens. Here, flow in xylem vessels was modeled to determine the role of vessel connectivity by using three dimensional xylem networks derived from High Resolution Computed Tomography (HRCT) images of grapevine (Vitis vinifera cv. 'Chardonnay') stems. Flow in 4-27% of the vessel segments (i.e. any section of vessel elements between connection points associated with intervessel pits) was found to be oriented in the direction opposite to the bulk flow under normal transpiration conditions. In order for the flow in a segment to be in the reverse direction, specific requirements were determined for the location of connections, distribution of vessel endings, diameters of the connected vessels, and the conductivity of the connections. Increasing connectivity and decreasing vessel length yielded increasing numbers of reverse flow segments until a maximum value was reached, after which more interconnected networks and smaller average vessel lengths yielded a decrease in the number of reverse flow segments. Xylem vessel relays also encouraged the formation of reverse flow segments. Based on the calculated flow rates in the xylem network, the downward spread of Xylella fastidiosa bacteria in grape stems was modeled, and reverse flow was shown to be an additional mechanism for the movement of bacteria to the trunk of grapevine. Copyright © 2013 Elsevier Ltd. All rights reserved.
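
    A very small sketch of the network-flow idea: treating vessel segments as hydraulic conductances, solving for node pressures, and flagging segments whose computed flow opposes the nominal bulk (upward) direction; the topology and conductances are invented, not the HRCT-derived network.

      # Sketch: solve a tiny resistive (hydraulic) network and flag reverse-flow segments.
      import numpy as np

      nodes = ["base", "A", "B", "top"]
      # (from, to, hydraulic conductance); "to" is the nominal direction of bulk (upward) flow.
      segments = [("base", "A", 10.0), ("base", "B", 1.0), ("B", "A", 2.0),
                  ("A", "top", 1.0), ("B", "top", 10.0)]

      idx = {n: i for i, n in enumerate(nodes)}
      G = np.zeros((len(nodes), len(nodes)))
      for a, b, g in segments:                     # assemble the conductance (Laplacian) matrix
          i, j = idx[a], idx[b]
          G[i, i] += g; G[j, j] += g
          G[i, j] -= g; G[j, i] -= g

      p = np.zeros(len(nodes))
      p[idx["base"]], p[idx["top"]] = 1.0, 0.0     # fixed pressures drive bulk flow upward
      free = [idx[n] for n in nodes if n not in ("base", "top")]
      fixed = [idx["base"], idx["top"]]
      p[free] = np.linalg.solve(G[np.ix_(free, free)], -G[np.ix_(free, fixed)] @ p[fixed])

      for a, b, g in segments:
          q = g * (p[idx[a]] - p[idx[b]])          # positive = nominal upward direction
          tag = "REVERSE" if q < 0 else "forward"
          print(f"{a:>4s} -> {b:<4s}  flow = {q:+.3f}  ({tag})")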

  8. Spectrum Monitoring Using SpectrumAnalysis LabVIEW Software, Nanoceptors, and Various Digitizing Solutions

    Science.gov (United States)

    2015-02-01

    ARL-TR-7217: report by Joshua Smith, U.S. Army Research Laboratory, February 2015; work performed June–July 2014.

  9. Analysis and design of software ecosystem architectures – Towards the 4S telemedicine ecosystem

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Hansen, Klaus Marius; Kyng, Morten

    2014-01-01

    …and application stove-pipes that inhibit the adoption of telemedical solutions. To what extent can a software ecosystem approach to telemedicine alleviate this? Objective In this article, we define the concept of software ecosystem architecture as the structure(s) of a software ecosystem comprising elements, relations among them, and properties of both. Our objective is to show how this concept can be used i) in the analysis of existing software ecosystems and ii) in the design of new software ecosystems. Method We performed a mixed-method study that consisted of a case study and an experiment. For i), we performed a descriptive, revelatory case study of the Danish telemedicine ecosystem, and for ii), we experimentally designed, implemented, and evaluated the architecture of 4S. Results We contribute in three areas. First, we define the software ecosystem architecture concept that captures organization...

  10. Software analysis for modeling the parameters of shunting locomotives chassis

    Directory of Open Access Journals (Sweden)

    Falendysh Anatoliy

    2017-01-01

    Full Text Available The article provides an overview of software designed to perform simulation of structures, calculate their states, and evaluate the response to loads applied at any point of the model. In this case, we are interested in modeling locomotive chassis frames, with the possibility of determining the weakest points of their construction and estimating the remaining life of the structure. For this purpose, the article presents a model developed for calculating the frame of a diesel locomotive chassis, taking into account technical, economic and other parameters.

  11. Eval: A software package for analysis of genome annotations

    Directory of Open Access Journals (Sweden)

    Brent Michael R

    2003-10-01

    Full Text Available Abstract Summary Eval is a flexible tool for analyzing the performance of gene annotation systems. It provides summaries and graphical distributions for many descriptive statistics about any set of annotations, regardless of their source. It also compares sets of predictions to standard annotations and to one another. Input is in the standard Gene Transfer Format (GTF). Eval can be run interactively or via the command line, in which case output options include easily parsable tab-delimited files. Availability To obtain the module package with documentation, go to http://genes.cse.wustl.edu/ and follow links for Resources, then Software. Please contact brent@cse.wustl.edu
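    For orientation only (the record does not reproduce an input file), GTF is a nine-column, tab-separated format with columns seqname, source, feature, start, end, score, strand, frame and attributes; the line below is a made-up example, not taken from the Eval distribution:

      chr1    TwinScan    CDS    1201    1500    .    +    0    gene_id "g1"; transcript_id "g1.t1";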

  12. Analysis of Code Refactoring Impact on Software Quality

    OpenAIRE

    Kaur Amandeep; Kaur Manpreet

    2016-01-01

    Code refactoring is a “Technique used for restructuring an existing source code, improving its internal structure without changing its external behaviour”. It is the process of changing a source code in such a way that it does not alter the external behaviour of the code yet improves its internal structure. It is a way to clean up code that minimizes the chances of introducing bugs. Refactoring is a change made to the internal structure of a software component to make it easier to understand ...

  13. Unimak Pass vessel analysis. Social and economic studies program technical report number 108. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Gibson, K.; Cook, P.; Pederson, J.; Hennigh, G.

    1984-09-01

    The report identifies present and future marine traffic and related characteristics of vessels using Unimak Pass. Present and future vessel traffic estimates through the year 2000 are developed for four categories: fishing; natural resources; commercial shipping; and Outer Continental Shelf (OCS) activities. Total annual vessel traffic is estimated to increase by approximately 100%, from approximately 2,290 trips in the base year to 4,600 trips in 2000. The report assesses the impact, i.e., the increase in collisions, of additional vessel traffic using Unimak Pass as a result of future OCS activities in northern and western Alaska. The OCS traffic almost doubles the likelihood of a collision in the Pass by the year 2000. The collision rate in the Pass without OCS activity is estimated at one collision every 57 years, while the collision rate with OCS activity is estimated at one collision every 33 years in the year 2000.
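    As a back-of-the-envelope illustration only (the report's underlying risk model is not reproduced here), treating collisions as a Poisson process with the quoted return periods gives, over a ten-year horizon,

      P(\geq 1\ \text{collision, no OCS}) = 1 - e^{-10/57} \approx 0.16, \qquad
      P(\geq 1\ \text{collision, with OCS}) = 1 - e^{-10/33} \approx 0.26 .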

  14. Analysis of Air Temperatures around Reactor Vessel in EU-APR

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Won Seok; Lee, Keun Sung; Hwang, Do Hyun [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    EU-APR, modified and improved from its original design of APR1400, has been developed to comply with European Utility Requirements (EUR) and the nuclear design requirements of the European countries. In EU-APR, removable concrete shielding blocks are newly adopted to reduce the radioactive dose rate in order to allow personnel access to the containment building during power operation. During plant startup, hot shutdown and power operation, the reactor cavity HVAC system maintains temperature and humidity for the in-core instrument (ICI) chase, ex-core detector spaces, reactor cavity and the hot and cold leg penetration openings. CFD analysis has been performed in order to check whether the air temperature in the reactor cavity and the concrete in EU-APR exceed the temperature limits. Six calculations for EU-APR have been carried out with different opening areas of the relief damper plate and air flow rates sucked out through six ventilation pipes. For the temperature distribution in the reactor cavity, it is found that there are regions above a temperature of 120°F (48.9°C). However, these regions are relatively small and are observed very near the reactor vessel insulation. Therefore, these high temperature regions do not directly affect the concrete. Furthermore, the heat emission used in the current calculations already includes a 20% margin, including the conservative margins used for the metal reflective heat losses and the margins used to estimate gamma and neutron heating of the primary wall. In total, a 10% margin is included.

  15. Pretest Round Robin Analysis of 1:4-Scale Prestressed Concrete Containment Vessel Model

    Energy Technology Data Exchange (ETDEWEB)

    HESSHEIMER,MICHAEL F.; LUK,VINCENT K.; KLAMERUS,ERIC W.; SHIBATA,S.; MITSUGI,S.; COSTELLO,J.F.

    2000-12-18

    The purpose of the program is to investigate the response of representative scale models of nuclear containment to pressure loading beyond the design basis accident and to compare analytical predictions to measured behavior. This objective is accomplished by conducting static, pneumatic overpressurization tests of scale models at ambient temperature. This research program consists of testing two scale models: a steel containment vessel (SCV) model (tested in 1996) and a prestressed concrete containment vessel (PCCV) model, which is the subject of this paper.

  16. Estimation Of Blood Vessels Functional State By Means Of Analysis Of Temperature Reaction On Occlusive Test

    Directory of Open Access Journals (Sweden)

    A.P. Rytik

    2009-12-01

    Full Text Available The temperature reaction of the distal phalanges to an occlusive test has been recorded. It was revealed that the temperature reaction to the occlusive test in the group of patients with disturbances of vessel tone regulation differs from the reaction of the normal group. The possible influence of the state of vessel regulation and of volumetric blood supply on skin temperature dynamics has been estimated. The diagnostic ability of the temperature occlusive test has been investigated.

  17. Structure analysis of a reactor pressure vessel by two- and three-dimensional models. [PWR

    Energy Technology Data Exchange (ETDEWEB)

    Sacher, H.; Mayr, M.

    1982-03-01

    This paper investigates the reactor pressure vessel of a 1300 MW pressurised water reactor. In order to determine the stresses and deformations of the vessel, two- and three-dimensional finite element models are used which represent the real structure with different degrees of accuracy. The results achieved by these different models are compared for the case of the transient called "Start-up of the nuclear power plant". 5 refs.

  18. SEDA: A software package for the Statistical Earthquake Data Analysis

    Science.gov (United States)

    Lombardi, A. M.

    2017-03-01

    In this paper, the first version of the software SEDA (SEDAv1.0), designed to help seismologists statistically analyze earthquake data, is presented. The package consists of a user-friendly Matlab-based interface, which allows the user to interact easily with the application, and a computational core of Fortran codes, to guarantee maximum speed. The primary factor driving the development of SEDA is to guarantee research reproducibility, which is a growing movement among scientists and is highly recommended by the most important scientific journals. SEDAv1.0 is mainly devoted to producing accurate and fast outputs. Less care has been taken over the graphic appeal, which will be improved in the future. The main part of SEDAv1.0 is devoted to ETAS modeling. SEDAv1.0 contains a set of consistent tools on ETAS, allowing the estimation of parameters, the testing of the model on data, the simulation of catalogs, the identification of sequences and the calculation of forecasts. The peculiarities of the routines inside SEDAv1.0 are discussed in this paper. More specific details on the software are presented in the manual accompanying the program package.

  19. Comparative analysis of methods for extracting vessel network on breast MRI images

    Science.gov (United States)

    Gaizer, Bence T.; Vassiou, Katerina G.; Lavdas, Eleftherios; Arvanitis, Dimitrios L.; Fezoulidis, Ioannis V.; Glotsos, Dimitris T.

    2017-11-01

    Digital processing of MRI images aims to provide an automated diagnostic evaluation of regular health screenings. Cancerous lesions are proven to cause an alteration in the vessel structure of the diseased organ. Currently there are several methods used for extraction of the vessel network in order to quantify its properties. In this work, MRI images (Signa HDx 3.0T, GE Healthcare, courtesy of University Hospital of Larissa) of 30 female breasts were subjected to three different vessel extraction algorithms to determine the location of their vascular network. The first method is an experiment to build a graph over known points of the vessel network; the second algorithm aims to determine the direction and diameter of vessels at these points; the third approach is a seed-growing algorithm, spreading selection to neighbors of the known vessel pixels. The possibilities shown by the different methods were analyzed, and quantitative measurements were performed. The data provided by these measurements showed no clear correlation with the presence or malignancy of tumors, based on the radiological diagnosis of skilled physicians.

  20. SEM/EDS analysis of soil and roasting vessels fragments from ancient mercury ore roasting sites at Idrija area

    Directory of Open Access Journals (Sweden)

    Tamara Teršič

    2011-06-01

    Full Text Available Numerous roasting vessel fragments can be found at ancient roasting site areas in the surroundings of the town of Idrija, which were used for ore roasting in the first 150 years of Hg production in Idrija. The earthen vessel fragments lie just below the surface humus layer and in some parts they stretch more than 1 meter deep; they are covered with red (cinnabar) or black (metacinnabar) coatings. SEM/EDS analysis of roasting vessel fragments and soil samples from the roasting site areas Pšenk and Frbejžene trate was performed in order to characterize the solid forms of Hg in the sampled material. Mercuric sulphide (HgS) was found to be the main mercury compound present in the samples. Analysis of the earthen vessel fragments showed abundant HgS coatings on the surface of the ceramics, forming either crust-like aggregates on the matrix or isolated grains. Some well-shaped grains with indicated structure and sizes of up to 200 μm could also be observed. In soil, HgS was present as powder-like concentrations scattered in the soil samples, frequently coating silicate and quartz crystals and clay minerals. Polycrystalline, mercury- and sulphur-rich particles comprising silica, clay minerals and Al-, Fe- and Mg-oxides that were also observed in the samples were interpreted as soil aggregates infiltrated by mercuric and sulphur vapours and by liquid mercury spilled during roasting. These particles suggest a possible presence of mercury–sulphur associations other than HgS.

  1. PerfAndPubTools – Tools for Software Performance Analysis and Publishing of Results

    Directory of Open Access Journals (Sweden)

    Nuno Fachada

    2016-05-01

    Full Text Available PerfAndPubTools consists of a set of MATLAB/Octave functions for the post-processing and analysis of software performance benchmark data and for producing associated publication-quality materials.

  2. Integrating Multi-Vendor Software Analysis into the Lifecycle for Reliability, Productivity, and Performance Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of the proposed work is to create new ways to manage, visualize, and share data produced by multiple software analysis tools, and to create a framework for...

  3. Development of the Free-space Optical Communications Analysis Software (FOCAS)

    Science.gov (United States)

    Jeganathan, M.; Mecherle, G.; Lesh, J.

    1998-01-01

    The Free-space Optical Communications Analysis Software (FOCAS) was developed at the Jet Propulsion Laboratory (JPL) to provide mission planners, systems engineers and communications engineers with an easy-to-use tool to analyze optical communications links.

  4. Software project profitability analysis using temporal probabilistic reasoning; an empirical study with the CASSE framework

    CSIR Research Space (South Africa)

    Balikuddembe, JK

    2009-04-01

    Full Text Available Undertaking adequate risk management by understanding project requirements and ensuring that viable estimates are made on software projects requires extensive application of sophisticated techniques of analysis and interpretation. Informative...

  5. A COMPARISON OF STEPWISE AND FUZZY MULTIPLE REGRESSION ANALYSIS TECHNIQUES FOR MANAGING SOFTWARE PROJECT RISKS: ANALYSIS PHASE

    OpenAIRE

    Abdelrafe Elzamly; Burairah Hussin

    2014-01-01

    Risk is not always avoidable, but it is controllable. The aim of this study is to identify whether these techniques are effective in reducing software failure. This motivates the authors to continue the effort to enrich the management of software project risks by combining mining and quantitative approaches with large data sets. In this study, two new techniques are introduced, namely stepwise multiple regression analysis and fuzzy multiple regression, to manage software risks. Two evaluation proc...

  6. Application of Texture Analysis to Study Small Vessel Disease and Blood–Brain Barrier Integrity

    Directory of Open Access Journals (Sweden)

    Maria del C. Valdés Hernández

    2017-07-01

    Full Text Available Objectives: We evaluate the alternative use of texture analysis for evaluating the role of the blood–brain barrier (BBB) in small vessel disease (SVD). Methods: We used brain magnetic resonance imaging from 204 stroke patients, acquired before and 20 min after intravenous gadolinium administration. We segmented tissues, white matter hyperintensities (WMH) and applied validated visual scores. We measured textural features in all tissues pre- and post-contrast and used ANCOVA to evaluate the effect of SVD indicators on the pre-/post-contrast change, Kruskal–Wallis for significance between patient groups and linear mixed models for pre-/post-contrast variations in cerebrospinal fluid (CSF) with Fazekas scores. Results: Textural "homogeneity" increase in normal tissues with higher presence of SVD indicators was consistently more overt than in abnormal tissues. Textural "homogeneity" increased with age, basal ganglia perivascular spaces scores (p < 0.01) and SVD scores (p < 0.05) and was significantly higher in hypertensive patients (p < 0.002) and lacunar stroke (p = 0.04). Hypertension (74% of patients), WMH load (median = 1.5 ± 1.6% of intracranial volume), and age (mean = 65.6 years, SD = 11.3) predicted the pre-/post-contrast change in normal white matter, WMH, and the index stroke lesion. The CSF signal increased with increasing SVD post-contrast. Conclusion: A consistent general pattern of increasing textural "homogeneity" with increasing SVD and post-contrast change in CSF with increasing WMH suggest that texture analysis may be useful for the study of BBB integrity.
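    The abstract does not state how "homogeneity" was computed; one common choice in texture analysis is the grey-level co-occurrence matrix (GLCM) homogeneity feature. The following sketch (Python with scikit-image, synthetic patches standing in for the pre-/post-contrast MRI regions) shows that kind of calculation under those assumptions:

      import numpy as np
      from skimage.feature import graycomatrix, graycoprops   # spelled greycomatrix in older scikit-image

      def glcm_homogeneity(patch, levels=32):
          """GLCM 'homogeneity' of a 2-D uint8 image patch."""
          # Quantise to a small number of grey levels to keep the co-occurrence matrix compact.
          q = (patch.astype(np.float64) / 256.0 * levels).astype(np.uint8)
          glcm = graycomatrix(q, distances=[1], angles=[0, np.pi / 2],
                              levels=levels, symmetric=True, normed=True)
          return float(graycoprops(glcm, "homogeneity").mean())

      # Synthetic stand-ins for a pre- and a post-contrast patch of normal-appearing white matter.
      rng = np.random.default_rng(0)
      pre = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
      post = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
      print("pre/post-contrast homogeneity change:",
            glcm_homogeneity(post) - glcm_homogeneity(pre))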

  7. JULIDE: a software tool for 3D reconstruction and statistical analysis of autoradiographic mouse brain sections.

    Directory of Open Access Journals (Sweden)

    Delphine Ribes

    Full Text Available In this article we introduce JULIDE, a software toolkit developed to perform the 3D reconstruction, intensity normalization, volume standardization by 3D image registration and voxel-wise statistical analysis of autoradiographs of mouse brain sections. This software tool has been developed in the open-source ITK software framework and is freely available under a GPL license. The article presents the complete image processing chain from raw data acquisition to 3D statistical group analysis. Results of the group comparison in the context of a study on spatial learning are shown as an illustration of the data that can be obtained with this tool.

  8. Analysis of a hardware and software fault tolerant processor for critical applications

    Science.gov (United States)

    Dugan, Joanne B.

    1993-01-01

    Computer systems for critical applications must be designed to tolerate software faults as well as hardware faults. A unified approach to tolerating hardware and software faults is characterized by classifying faults in terms of duration (transient or permanent) rather than source (hardware or software). Errors arising from transient faults can be handled through masking or voting, but errors arising from permanent faults require system reconfiguration to bypass the failed component. Most errors which are caused by software faults can be considered transient, in that they are input-dependent. Software faults are triggered by a particular set of inputs. Quantitative dependability analysis of systems which exhibit a unified approach to fault tolerance can be performed by a hierarchical combination of fault tree and Markov models. A methodology for analyzing hardware and software fault tolerant systems is applied to the analysis of a hypothetical system, loosely based on the Fault Tolerant Parallel Processor. The models consider both transient and permanent faults, hardware and software faults, independent and related software faults, automatic recovery, and reconfiguration.
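    As a minimal sketch of the Markov layer of such a hierarchical model (not the FTPP-like model analyzed in the paper; the fault rate and coverage below are made-up), the unreliability of a triplex processor that reconfigures to a duplex after the first permanent fault can be computed from the generator matrix of a three-state chain:

      import numpy as np
      from scipy.linalg import expm

      lam, c = 1e-4, 0.99    # hypothetical permanent fault rate per processor (1/h) and reconfiguration coverage
      # States: 0 = three good processors, 1 = reconfigured to two good, 2 = system failed (absorbing).
      Q = np.array([[-3 * lam, 3 * lam * c, 3 * lam * (1 - c)],
                    [0.0,      -2 * lam,    2 * lam],
                    [0.0,       0.0,        0.0]])
      p0 = np.array([1.0, 0.0, 0.0])
      for t in (10.0, 100.0, 1000.0):          # mission times in hours
          p = p0 @ expm(Q * t)                 # state probabilities at time t
          print(f"t = {t:6.0f} h   unreliability = {p[2]:.3e}")

    Transient (including software-triggered) faults would enter the same framework as additional transitions handled by masking or voting rather than by reconfiguration.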

  9. Software for micromorphometric characterization of soil pores obtained from 2-D image analysis

    Directory of Open Access Journals (Sweden)

    Miguel Cooper

    2016-08-01

    Full Text Available ABSTRACT Studies of soil porosity through image analysis are important to an understanding of how the soil functions. However, the lack of a simplified methodology for the quantification of the shape, number, and size of soil pores has limited the use of information extracted from images. The present work proposes a software program for the quantification and characterization of soil porosity from data derived from 2-D images. The user-friendly software was developed in C++ and allows for the classification of pores in terms of size, shape, and combinations of size and shape. Using raw data generated by image analysis systems, the software calculates the following parameters for the characterization of soil porosity: total area of pores (Tap), number of pores, pore shape, pore shape and pore area, and pore shape and equivalent pore diameter (EqDiam). In this paper, the input file with the raw soil porosity data was generated using the Noesis Visilog 5.4 image analysis system; however, other image analysis programs can be used, in which case the input file requires a standard format to permit processing by this software. The software also shows the descriptive statistics (mean, standard deviation, variance, and coefficient of variation) of the parameters, considering the total number of images evaluated. The results show that the software is a complementary tool for any analysis of soil porosity, allowing for a precise and quick analysis.
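    The abstract does not spell out the definition of EqDiam; assuming the conventional one, it is the diameter of a circle with the same area A as the pore:

      \mathrm{EqDiam} = \sqrt{\dfrac{4\,A_{\mathrm{pore}}}{\pi}}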

  10. A Review of CEFA Software: Comprehensive Exploratory Factor Analysis Program

    Science.gov (United States)

    Lee, Soon-Mook

    2010-01-01

    CEFA 3.02 (Browne, Cudeck, Tateneni, & Mels, 2008) is a factor analysis computer program designed to perform exploratory factor analysis. It provides the main properties that are needed for exploratory factor analysis, namely a variety of factoring methods employing eight different discrepancy functions to be minimized to yield initial…

  11. Structural analysis of the ITER Vacuum Vessel regarding 2012 ITER Project-Level Loads

    Energy Technology Data Exchange (ETDEWEB)

    Martinez, J.-M., E-mail: jean-marc.martinez@live.fr [ITER Organization, Route de Vinon sur Verdon, 13115 St Paul lez Durance (France); Jun, C.H.; Portafaix, C.; Choi, C.-H.; Ioki, K.; Sannazzaro, G.; Sborchia, C. [ITER Organization, Route de Vinon sur Verdon, 13115 St Paul lez Durance (France); Cambazar, M.; Corti, Ph.; Pinori, K.; Sfarni, S.; Tailhardat, O. [Assystem EOS, 117 rue Jacquard, L' Atrium, 84120 Pertuis (France); Borrelly, S. [Sogeti High Tech, RE2, 180 rue René Descartes, Le Millenium – Bat C, 13857 Aix en Provence (France); Albin, V.; Pelletier, N. [SOM Calcul – Groupe ORTEC, 121 ancien Chemin de Cassis – Immeuble Grand Pré, 13009 Marseille (France)

    2014-10-15

    Highlights: • ITER Vacuum Vessel is a part of the first barrier to confine the plasma. • ITER Vacuum Vessel as Nuclear Pressure Equipment (NPE) necessitates a third party organization authorized by the French nuclear regulator to assure design, fabrication, conformance testing and quality assurance, i.e. Agreed Notified Body (ANB). • A revision of the ITER Project-Level Load Specification was implemented in April 2012. • ITER Vacuum Vessel Loads (seismic, pressure, thermal and electromagnetic loads) were summarized. • ITER Vacuum Vessel Structural Margins with regards to RCC-MR code were summarized. - Abstract: A revision of the ITER Project-Level Load Specification (to be used for all systems of the ITER machine) was implemented in April 2012. This revision supports ITER's licensing by accommodating requests from the French regulator to maintain consistency with the plasma physics database and our present understanding of plasma transients and electro-magnetic (EM) loads, to investigate the possibility of removing unnecessary conservatism in the load requirements and to review the list and definition of incidental cases. The purpose of this paper is to present the impact of this 2012 revision of the ITER Project-Level Load Specification (LS) on the ITER Vacuum Vessel (VV) loads and the main structural margins required by the applicable French code, RCC-MR.

  12. Learning from examples - Generation and evaluation of decision trees for software resource analysis

    Science.gov (United States)

    Selby, Richard W.; Porter, Adam A.

    1988-01-01

    A general solution method for the automatic generation of decision (or classification) trees is investigated. The approach is to provide insights through in-depth empirical characterization and evaluation of decision trees for software resource data analysis. The trees identify classes of objects (software modules) that had high development effort. Sixteen software systems ranging from 3,000 to 112,000 source lines were selected for analysis from a NASA production environment. The collection and analysis of 74 attributes (or metrics), for over 4,700 objects, captured information about the development effort, faults, changes, design style, and implementation style. A total of 9,600 decision trees were automatically generated and evaluated. The trees correctly identified 79.3 percent of the software modules that had high development effort or faults, and the trees generated from the best parameter combinations correctly identified 88.4 percent of the modules on the average.
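    The paper's own tree-generation method and the NASA metric set are not reproduced here; as a hedged, present-day sketch of the same idea (classifying which modules have high development effort from module metrics), the following uses scikit-learn on synthetic data with made-up metric names:

      import numpy as np
      from sklearn.tree import DecisionTreeClassifier, export_text

      # Synthetic stand-ins for module metrics: [source lines, cyclomatic complexity, change count].
      rng = np.random.default_rng(42)
      X = np.column_stack([rng.integers(50, 2000, 500),
                           rng.integers(1, 60, 500),
                           rng.integers(0, 40, 500)])
      # Label a module "high effort" with a probability that grows with size and complexity.
      p = 1.0 / (1.0 + np.exp(-(X[:, 0] / 800 + X[:, 1] / 20 - 2)))
      y = rng.random(500) < p

      tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
      print(export_text(tree, feature_names=["sloc", "complexity", "changes"]))
      print("training accuracy:", tree.score(X, y))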

  13. [Develop a statistics analysis software in population genetics using VBA language].

    Science.gov (United States)

    Cai, Ying; Zhou, Ni; Xu, Ye-li; Xiang, Da-peng; Su, Jiang-hui; Zhang, Lin-tian

    2006-12-01

    To develop statistical analysis software that can be used in STR population genetics, for the purpose of promoting and accelerating basic research in STR population genetics. Microsoft VBA for Excel, which is simple and easy to use, was selected as the programming language, and its macro functions were used to develop statistical analysis software for STR population genetics. The software "Easy STR Genetics", based on the VBA language, with which population genetic analysis of STR data can be performed, was developed. The developed VBA-based software "Easy STR Genetics" can be disseminated in the domain of STR population genetics research domestically and internationally, owing to its full functionality, good compatibility with different input data formats, and clear, easy-to-understand outputs for statistics and calculation results.

  14. Integrating Multiple Autonomous Underwater Vessels, Surface Vessels and Aircraft into Oceanographic Research Vessel Operations

    Science.gov (United States)

    McGillivary, P. A.; Borges de Sousa, J.; Martins, R.; Rajan, K.

    2012-12-01

    Autonomous platforms are increasingly used as components of Integrated Ocean Observing Systems and oceanographic research cruises. Systems deployed can include gliders or propeller-driven autonomous underwater vessels (AUVs), autonomous surface vessels (ASVs), and unmanned aircraft systems (UAS). Prior field campaigns have demonstrated successful communication, sensor data fusion and visualization for studies using gliders and AUVs. However, additional requirements exist for incorporating ASVs and UASs into ship operations. Optimally integrating these systems into research vessel data management and operational planning systems involves addressing three key issues: real-time field data availability, platform coordination, and data archiving for later analysis. A fleet of AUVs, ASVs and UAS deployed from a research vessel is best operated as a system integrated with the ship, provided communications among them can be sustained. For this purpose, Disruption-Tolerant Networking (DTN) software protocols for operation in communication-challenged environments help ensure reliable high-bandwidth communications. Additionally, system components need to have considerable onboard autonomy, namely adaptive sampling capabilities using their own onboard sensor data stream analysis. We discuss the Oceanographic Decision Support System (ODSS) software currently used for situational awareness and planning onshore; in the near future, event detection and response will be coordinated among multiple vehicles. Results from recent field studies from oceanographic research vessels using AUVs, ASVs and UAS, including the Rapid Environmental Picture (REP-12) cruise, are presented, describing methods and results for the use of multi-vehicle communication and deliberative control networks, adaptive sampling with single and multiple platforms, issues relating to data management and archiving, and finally challenges that remain in addressing these technological issues. Significantly, the

  15. Use of computed tomography and automated software for quantitative analysis of the vasculature of patients with pulmonary hypertension

    Energy Technology Data Exchange (ETDEWEB)

    Wada, Danilo Tadao; Pádua, Adriana Ignácio de; Lima Filho, Moyses Oliveira; Marin Neto, José Antonio; Elias Júnior, Jorge; Baddini-Martinez, José; Santos, Marcel Koenigkam, E-mail: danilowada@yahoo.com.br [Universidade de São Paulo (HCFMRP/USP), Ribeirão Preto, SP (Brazil). Faculdade de Medicina. Hospital das Clínicas

    2017-11-15

    Objective: To perform a quantitative analysis of the lung parenchyma and pulmonary vasculature of patients with pulmonary hypertension (PH) on computed tomography angiography (CTA) images, using automated software. Materials and Methods: We retrospectively analyzed the CTA findings and clinical records of 45 patients with PH (17 males and 28 females), in comparison with a control group of 20 healthy individuals (7 males and 13 females); the mean age differed significantly between the two groups (53 ± 14.7 vs. 35 ± 9.6 years; p = 0.0001). Results: The automated analysis showed that, in comparison with the controls, the patients with PH showed lower 10th percentile values for lung density, higher vascular volumes in the right upper lung lobe, and higher vascular volume ratios between the upper and lower lobes. In our quantitative analysis, we found no differences among the various PH subgroups. We inferred that a difference in the 10th percentile values indicates areas of hypovolaemia in patients with PH and that a difference in pulmonary vascular volumes indicates redistribution of the pulmonary vasculature and an increase in pulmonary vascular resistance. Conclusion: Automated analysis of pulmonary vessels on CTA images revealed alterations and could represent an objective diagnostic tool for the evaluation of patients with PH. (author)
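    The software used by the authors is not described in detail; as a hypothetical sketch of the two quantities discussed (the 10th percentile of lung density and the upper/lower vascular volume ratio), with made-up voxel values and per-lobe volumes:

      import numpy as np

      rng = np.random.default_rng(1)
      lung_hu = rng.normal(-820, 60, size=200_000)       # stand-in for HU values inside the lung mask
      vascular_volume_ml = {"RUL": 28.0, "RML": 9.0, "RLL": 31.0,   # hypothetical per-lobe vessel volumes
                            "LUL": 24.0, "LLL": 27.0}

      p10 = np.percentile(lung_hu, 10)                   # lower values suggest hypoattenuating (hypovolaemic) lung
      upper = vascular_volume_ml["RUL"] + vascular_volume_ml["LUL"]
      lower = vascular_volume_ml["RLL"] + vascular_volume_ml["LLL"]
      print(f"10th percentile lung density: {p10:.1f} HU")
      print(f"upper/lower vascular volume ratio: {upper / lower:.2f}")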

  16. Growth and Remodeling in Blood Vessels Studied In Vivo With Fractal Analysis

    Science.gov (United States)

    Parsons-Wingerter, Patricia A.

    2003-01-01

    Every cell in the human body must reside in close proximity to a blood vessel (within approximately 200 μm) because blood vessels provide the oxygen, metabolite, and fluid exchanges required for cellular existence. The growth and remodeling of blood vessels are required to support the normal physiology of embryonic development, reproductive biology, wound healing and adaptive remodeling to exercise, as well as abnormal tissue change in diseases such as cancer, diabetes, and coronary heart disease. Cardiovascular and hemodynamic (blood flow dynamics) alterations experienced by astronauts during long-term spaceflight, including orthostatic intolerance, fluid shifts in the body, and reduced numbers of red (erythrocyte) and white (immune) blood cells, are identified as risk factors of very high priority in the NASA task force report on risk reduction for human spaceflight, the "Critical Path Roadmap."

  17. Analysis on flexible manufacturing system layout using arena simulation software

    Science.gov (United States)

    Fadzly, M. K.; Saad, Mohd Sazli; Shayfull, Z.

    2017-09-01

    A flexible manufacturing system (FMS) is defined as a highly automated group-technology machine cell, consisting of a group of processing stations interconnected by an automated material handling and storage system and controlled by an integrated computer system. An FMS can produce parts or products in the mid-volume, mid-variety production range. The layout is an important criterion in designing an FMS to produce a part or product. Facility layout of an FMS involves the positioning of cells within given boundaries, so as to minimize the total projected travel time between cells. Defining the layout includes specifying the spatial coordinates of each cell, its orientation in either a horizontal or vertical position, and the location of its load or unload point. There are many types of FMS layout, such as in-line, loop, ladder and robot-centered cell layouts. The research concentrates on the design and optimization of the FMS layout. It can be concluded that the objective of designing and optimizing the FMS layout for this study was met, because the in-line FMS layout was found to be the best layout in terms of time and cost using the ARENA simulation software.

  18. The R software fundamentals of programming and statistical analysis

    CERN Document Server

    Lafaye de Micheaux, Pierre; Liquet, Benoit

    2013-01-01

    The contents of The R Software are presented so as to be both comprehensive and easy for the reader to use. Besides its application as a self-learning text, this book can support lectures on R at any level from beginner to advanced. This book can serve as a textbook on R for beginners as well as more advanced users, working on Windows, macOS or Linux. The first part of the book deals with the heart of the R language and its fundamental concepts, including data organization, import and export, various manipulations, documentation, plots, programming and maintenance. The last chapter in this part deals with object-oriented programming as well as interfacing R with C/C++ or Fortran, and contains a section on debugging techniques. This is followed by the second part of the book, which provides detailed explanations on how to perform many standard statistical analyses, mainly in the field of biostatistics. Topics from mathematical and statistical settings that are included are matrix operations, integration, o...

  19. [Vascular access for haemodyalisis. Comparative analysis of the mechanical behaviour of native vessels and prosthesis].

    Science.gov (United States)

    Bia, D; Zócalo, Y; Armentano, R; Pérez, H; Cabrera, E; Saldías, M; Galli, C; Alvarez, I

    2006-01-01

    The prostheses nowadays used in vascular access for haemodialysis have low patency rates, mainly due to luminal obstruction caused by intimal hyperplasia. Several factors have been related to the development of intimal hyperplasia and graft failure. Among them are the differences in biomechanical properties between the prosthesis and the native vessels. In the search for vascular prostheses that overcome the limitations of those currently used, cryopreserved vessels (cryografts) appear as an alternative of growing interest. However, it is unknown whether the mechanical differences, or mismatch, between prosthesis and native vessels are smaller when using cryografts. To characterize and compare the biomechanical behaviour of native vessels used in vascular access and cryografts. Additionally, segments of expanded polytetrafluoroethylene (ePTFE) were also evaluated, so as to assess the potential biomechanical advantages of cryografts with respect to the synthetic prostheses used in vascular access. Segments from human humeral (n = 12), carotid (n = 12) and femoral (n = 12) arteries, and saphenous vein (n = 12), were obtained from 6 multiorgan donors. The humeral arteries were studied in the fresh state. The other segments were divided into two groups: 6 segments from each vessel were studied in the fresh state, while the remaining 6 segments were evaluated after 30 days of cryopreservation. For the mechanical evaluation, the vascular segments and 6 segments of ePTFE were mounted in a mock circulation loop and submitted to haemodynamic conditions similar to those in vivo. Instantaneous pressure (Konigsberg) and diameter (sonomicrometry) were measured and used to calculate the viscous and elastic indexes, the compliance, distensibility and characteristic impedance. For each mechanical parameter studied, the mismatch between the prosthesis and the native vessel was evaluated. The ePTFE was the prosthesis with the highest mechanical mismatch (p vascular

  20. Computer Software for Design, Analysis and Control of Fluid Power Systems

    DEFF Research Database (Denmark)

    Conrad, Finn; Sørensen, Torben; Grahl-Madsen, Mads

    1999-01-01

    This Deliverable presents contributions from SWING's Task 2.3, Analysis of available software solutions. The Deliverable focuses on the results from this analysis, having in mind the task objective to carry out a thorough analysis of state-of-the-art solutions for fluid power systems modelling...

  1. Comparing Results from Constant Comparative and Computer Software Methods: A Reflection about Qualitative Data Analysis

    Science.gov (United States)

    Putten, Jim Vander; Nolen, Amanda L.

    2010-01-01

    This study compared qualitative research results obtained by manual constant comparative analysis with results obtained by computer software analysis of the same data. An investigation of issues of trustworthiness and accuracy ensued. Results indicated that the inductive constant comparative data analysis generated 51 codes and two coding levels…

  2. A continuum damage analysis of hydrogen attack in 2.25 Cr-1Mo vessel

    DEFF Research Database (Denmark)

    van der Burg, M.W.D.; van der Giessen, E.; Tvergaard, Viggo

    1998-01-01

    … reaction of carbides with hydrogen, thus forming cavities with high-pressure methane gas. Driven by the methane gas pressure, the cavities grow, while remote tensile stresses can significantly enhance the cavitation rate. The damage model gives the strain rate and damage rate as a function … and later decelerate the cavitation rate significantly. Numerical studies for different material parameters and different stress conditions demonstrate the HA process inside a vessel in time. Also, the lifetime of the pressure vessel is determined. The analyses underline that the general applicability…

  3. Probabilistic Analysis of Collision Damages with Application to ro-Ro Passenger Vessels

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup; Hansen, Peter Friis; Nielsen, Lars Peter

    1997-01-01

    To quantify the risks involved in Ro-Ro passenger vessel traffic, rational criteria for prediction and evaluation of collision accidents have to be developed. This implies that probabilities as well as the inherent consequences have to be analyzed and assessed. The present report outlines a method … for evaluation of the probability of a Ro-Ro passenger vessel on a given route being struck by another ship. Given that a collision has taken place, the spatial distribution of the collision damages is calculated. Results are presented in terms of probability distributions for indentation depth, length and height...

  4. Detection of Atrial Fibrillation Among Patients With Stroke Due to Large or Small Vessel Disease: A Meta-Analysis

    OpenAIRE

    Demeestere, Jelle; Fieuws, Steffen; Lansberg, Maarten G.; Lemmens, Robin

    2016-01-01

    Background-Recent trials have demonstrated that extended cardiac monitoring increases the yield of paroxysmal atrial fibrillation (AF) detection in patients with cryptogenic stroke. The utility of extended cardiac monitoring is uncertain among patients with stroke caused by small and large vessel disease. We conducted a meta-analysis to estimate the yield of AF detection in this population. Methods and Results-We searched PubMed, Cochrane, and SCOPUS databases for studies on AF detection in s...

  5. Planning and Analysis of the Company’s Financial Performances by Using a Software Simulation

    Directory of Open Access Journals (Sweden)

    Meri BOSHKOSKA

    2017-06-01

    Full Text Available Information technology includes a wide range of software solutions that help managers in decision-making processes in order to increase the company's business performance. Using software solutions in financial analysis is a valuable tool for managers in the financial decision-making process. The objective of the study was accomplished by developing software that easily determines the financial performance of the company through integration of the analysis of financial indicators and the DuPont profitability analysis model. Through this software, managers will be able to calculate the current financial state and visually analyze how their actions will affect the financial performance of the company. This will enable them to identify the best ways to improve the financial performance of the company. The software can perform a financial analysis and give a clear, useful overview of the current business performance and can also help in planning the growth of the company. The software can also be used for educational purposes by students and managers in the field of financial management.

  6. HeteroGenius: A Framework for Hybrid Analysis of Heterogeneous Software Specifications

    Directory of Open Access Journals (Sweden)

    Manuel Giménez

    2014-01-01

    Full Text Available Nowadays, software artifacts are ubiquitous in our lives, being an essential part of home appliances, cars, cell phones, and even more critical activities like aeronautics and health sciences. In this context, software failures may produce enormous losses, either economic or, in the worst case, in human lives. Software analysis is an area of software engineering concerned with the application of diverse techniques in order to prove the absence of errors in software pieces. In many cases, different analysis techniques are applied by following specific methodological combinations that ensure better results. These interactions between tools are usually carried out at the user level and are not supported by the tools themselves. In this work we present HeteroGenius, a framework conceived to develop tools that allow users to perform hybrid analysis of heterogeneous software specifications. HeteroGenius was designed prioritising the possibility of adding new specification languages and analysis tools and enabling a synergic relation of the techniques under a graphical interface satisfying several well-known usability-enhancement criteria. As a case study we implemented the functionality of Dynamite on top of HeteroGenius.

  7. Surveillance Analysis Computer System (SACS): Software requirements specification (SRS). Revision 2

    Energy Technology Data Exchange (ETDEWEB)

    Glasscock, J.A.

    1995-03-08

    This document is the primary document establishing requirements for the Surveillance Analysis Computer System (SACS) database, an Impact Level 3Q system. SACS stores information on tank temperatures, surface levels, and interstitial liquid levels. This information is retrieved by the customer through a PC-based interface and is then available to a number of other software tools. The software requirements specification (SRS) describes the system requirements for the SACS Project, and follows the Standard Engineering Practices (WHC-CM-6-1), Software Practices (WHC-CM-3-10) and Quality Assurance (WHC-CM-4-2, QR 19.0) policies.

  8. Business Analysis Skills and Techniques Among Software Developers from Various BPO Industries In Iloilo City, Philippines

    Directory of Open Access Journals (Sweden)

    Alex Ledonio

    2016-11-01

    Full Text Available In Iloilo City, Philippines, the BPO industry is booming, and the upcoming Megaworld Business District hosts a multitude of BPO companies. In this study, the software developers of various BPO companies in Iloilo City were evaluated according to their competency in business analysis skills and techniques. A common misconception is that IT programmers should be detached from the business analysis process and simply wait for the requirements solution to implement through software development. This study gauges how much skill and knowledge they possess on the business analysis side. The results of the study reveal that the software developers evaluated have an average rating on business analysis tasks and techniques. Respondents generally lack skills in business planning, business requirements analysis, and elicitation processes. These results can be used as baseline data to recommend necessary adjustments in school curricula.

  9. Software Architecture of Code Analysis Frameworks Matters: The Frama-C Example

    Directory of Open Access Journals (Sweden)

    Julien Signoles

    2015-08-01

    Full Text Available Implementing large software, such as software analyzers intended for use in industrial settings, requires a well-engineered software architecture in order to ease its daily development and its maintenance process during its lifecycle. If the analyzer is not only a single tool, but an open, extensible, collaborative framework in which external developers may develop plug-ins collaborating with each other, such a well-designed architecture becomes even more important. In this experience report, we explain the difficulties of developing and maintaining open, extensible, collaborative analysis frameworks, through the example of Frama-C, a platform dedicated to the analysis of code written in C. We also present the new upcoming software architecture of Frama-C and how it aims to solve some of these issues.

  10. Parallel line analysis: multifunctional software for the biomedical sciences

    Science.gov (United States)

    Swank, P. R.; Lewis, M. L.; Damron, K. L.; Morrison, D. R.

    1990-01-01

    An easy to use, interactive FORTRAN program for analyzing the results of parallel line assays is described. The program is menu driven and consists of five major components: data entry, data editing, manual analysis, manual plotting, and automatic analysis and plotting. Data can be entered from the terminal or from previously created data files. The data editing portion of the program is used to inspect and modify data and to statistically identify outliers. The manual analysis component is used to test the assumptions necessary for parallel line assays using analysis of covariance techniques and to determine potency ratios with confidence limits. The manual plotting component provides a graphic display of the data on the terminal screen or on a standard line printer. The automatic portion runs through multiple analyses without operator input. Data may be saved in a special file to expedite input at a future time.
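    Assuming the usual parallel-line formulation (the abstract does not give the program's exact equations), the log relative potency M and the potency ratio follow from the fitted parallel log-dose/response lines of the test (T) and standard (S) preparations:

      M = \frac{a_T - a_S}{b}, \qquad \text{potency ratio} = \operatorname{antilog}(M),

    where a_T and a_S are the intercepts (equivalently, the adjusted means) of the two lines and b is their common slope, estimated under the parallelism assumption that the analysis-of-covariance step is meant to verify.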

  11. Cerebral blood flow in small vessel disease : A systematic review and meta-analysis

    NARCIS (Netherlands)

    Shi, Yulu; Thrippleton, Michael J; Makin, Stephen D; Marshall, Ian; Geerlings, Mirjam I; de Craen, Anton Jm; van Buchem, Mark A; Wardlaw, Joanna M

    2016-01-01

    White matter hyperintensities are frequent on neuroimaging of older people and are a key feature of cerebral small vessel disease. They are commonly attributed to chronic hypoperfusion, although whether low cerebral blood flow is cause or effect is unclear. We systematically reviewed studies that

  12. Computational Fluid Dynamics Analysis of Pulsatile Blood Flow Behavior in Modelled Stenosed Vessels with Different Severities

    Directory of Open Access Journals (Sweden)

    Mohsen Mehrabi

    2012-01-01

    Full Text Available This study focuses on the behavior of blood flow in stenosed vessels. Blood is modelled as an incompressible non-Newtonian fluid based on the power-law viscosity model. A numerical technique based on the finite difference method is developed to simulate the blood flow, taking into account the transient periodic behaviour of the blood flow in cardiac cycles. Pulsatile blood flow in the stenosed vessel is based on the Womersley model, and fluid flow in the lumen region is governed by the continuity equation and the Navier-Stokes equations. In this study, the stenosis shape is a cosine profile, following the Tu and Deville model. Comparing the results obtained from three stenosed vessels with 30%, 50%, and 75% area severity, we find that higher percent-area severity of stenosis leads to higher extra pressure jumps and higher blood speeds around the stenosis site. We also observe that the size of the stenosis does influence the blood flow: a small change in the cross-sectional value makes a vast change in the blood flow rate. This simulation helps people working in the field of physiological fluid dynamics as well as medical practitioners.
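    For reference, the power-law model referred to above lets the apparent viscosity vary with the shear rate, and a cosine stenosis is commonly written as a smooth constriction of the radius; the exact constants used in the study are not given in the abstract, so the forms below are only the generic ones:

      \mu(\dot{\gamma}) = m\,\dot{\gamma}^{\,n-1} \quad (n < 1 \text{ for shear-thinning blood}),
      \qquad
      R(z) = R_0 - \frac{\delta}{2}\left[1 + \cos\!\left(\frac{\pi z}{z_0}\right)\right], \quad |z| \le z_0,

    with R_0 the unobstructed radius, δ the maximum lumen reduction and 2 z_0 the stenosis length; the 30%, 50% and 75% cases correspond to different area reductions at z = 0.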

  13. Software package for the design and analysis of DNA origami structures

    DEFF Research Database (Denmark)

    Andersen, Ebbe Sloth; Nielsen, Morten Muhlig; Dong, Mingdong

    A software package was developed for the semi-automated design of DNA origamis and further data analysis of Atomic Force Microscopy (AFM) images. As an example, we design the shape of a bottlenose dolphin and analyze it by means of high resolution AFM imaging. A high yield of DNA dolphins … to contribute new tools and functionalities. Documentation, tutorials and software will be made available online…

  14. Review Essay: Guidance in the World of Computer-Assisted Qualitative Data Analysis Software (CAQDAS) Programs

    OpenAIRE

    Áine Humble

    2015-01-01

    This review discusses Christina SILVER and Ann Lewins' book, "Using Software in Qualitative Research: A Step-by-Step Guide" (2nd ed.). This book is an impressive undertaking, with online supplemental material in the form of three data sets consisting of many different types of data, detailed instructions for seven CAQDAS (Computer-Assisted Qualitative Data Analysis Software) programs, and full-color reproductions of illustrations from the book. The 14 chapters in the book cover a wide range o...

  15. Development of high performance casting analysis software by coupled parallel computation

    Directory of Open Access Journals (Sweden)

    Sang Hyun CHO

    2007-08-01

    Full Text Available Up to now, much casting analysis software has continued to develop new ways of approaching real casting processes. These include melt flow analysis, heat transfer analysis for solidification calculation, mechanical property prediction and microstructure prediction. These efforts have been successful in obtaining results comparable with real situations, so that CAE technologies have become indispensable for designing or developing new casting processes. But in manufacturing fields, CAE technologies are not used so frequently because of difficulties in using the software or insufficient computing performance. To introduce CAE technologies to the manufacturing field, high-performance analysis is essential to shorten the gap between product design time and prototyping time. Software code optimization can be helpful, but it is not enough, because the codes developed by software experts are already optimized enough. As an alternative proposal for high-performance computation, parallel computation technologies are eagerly being applied to CAE technologies to make the analysis time shorter. In this research, SMP (Shared Memory Processing) and MPI (Message Passing Interface) methods for parallelization were applied to the commercial software "Z-Cast" to calculate the casting processes. In the code parallelization process, network stabilization and core optimization were also carried out under the Microsoft Windows platform, and their performance and results were compared with those of normal linear analysis codes.

  16. Scilab and Maxima Environment: Towards Free Software in Numerical Analysis

    Science.gov (United States)

    Mora, Angel; Galan, Jose Luis; Aguilera, Gabriel; Fernandez, Alvaro; Merida, Enrique; Rodriguez, Pedro

    2010-01-01

    In this work we will present the ScilabUMA environment we have developed as an alternative to Matlab. This environment connects Scilab (for numerical analysis) and Maxima (for symbolic computations). Furthermore, the developed interface is, in our opinion at least, as powerful as the interface of Matlab. (Contains 3 figures.)

  17. Modular Open-Source Software for Item Factor Analysis

    Science.gov (United States)

    Pritikin, Joshua N.; Hunter, Micheal D.; Boker, Steven M.

    2015-01-01

    This article introduces an item factor analysis (IFA) module for "OpenMx," a free, open-source, and modular statistical modeling package that runs within the R programming environment on GNU/Linux, Mac OS X, and Microsoft Windows. The IFA module offers a novel model specification language that is well suited to programmatic generation…

  18. FieldTrip: Open Source Software for Advanced Analysis of MEG, EEG, and Invasive Electrophysiological Data

    Directory of Open Access Journals (Sweden)

    Robert Oostenveld

    2011-01-01

    Full Text Available This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow experimental neuroscientists to analyze experimental data. It includes algorithms for simple and advanced analysis, such as time-frequency analysis using multitapers, source reconstruction using dipoles, distributed sources and beamformers, connectivity analysis, and nonparametric statistical permutation tests at the channel and source level. The implementation as toolbox allows the user to perform elaborate and structured analyses of large data sets using the MATLAB command line and batch scripting. Furthermore, users and developers can easily extend the functionality and implement new algorithms. The modular design facilitates the reuse in other software packages.

  19. A software tool for design of process monitoring and analysis systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

    A well-designed process monitoring and analysis system is necessary to consistently achieve any predefined end-product quality. Systematic computer-aided methods and tools provide the means to design the necessary process monitoring and analysis systems and/or to validate any existing monitoring and analysis system. Software to achieve this has been developed. Two supporting tools for the design, a knowledge base (consisting of the process knowledge as well as the knowledge on measurement methods and tools) and a model library (consisting of the process operational models), have been extended … rigorously and integrated with the user interface, which makes the software more generic and applicable to a wide range of problems. The software for the design of a process monitoring and analysis system is presented and illustrated with a tablet manufacturing process example.

  20. A novel community driven software for functional enrichment analysis of extracellular vesicles data

    Science.gov (United States)

    Pathan, Mohashin; Keerthikumar, Shivakumar; Chisanga, David; Alessandro, Riccardo; Ang, Ching-Seng; Askenase, Philip; Batagov, Arsen O.; Benito-Martin, Alberto; Camussi, Giovanni; Clayton, Aled; Collino, Federica; Di Vizio, Dolores; Falcon-Perez, Juan Manuel; Fonseca, Pedro; Fonseka, Pamali; Fontana, Simona; Gho, Yong Song; Hendrix, An; Hoen, Esther Nolte-’t; Iraci, Nunzio; Kastaniegaard, Kenneth; Kislinger, Thomas; Kowal, Joanna; Kurochkin, Igor V.; Leonardi, Tommaso; Liang, Yaxuan; Llorente, Alicia; Lunavat, Taral R.; Maji, Sayantan; Monteleone, Francesca; Øverbye, Anders; Panaretakis, Theocharis; Patel, Tushar; Peinado, Héctor; Pluchino, Stefano; Principe, Simona; Ronquist, Goran; Royo, Felix; Sahoo, Susmita; Spinelli, Cristiana; Stensballe, Allan; Théry, Clotilde; van Herwijnen, Martijn J.C.; Wauben, Marca; Welton, Joanne L.; Zhao, Kening; Mathivanan, Suresh

    2017-01-01

    ABSTRACT Bioinformatics tools are imperative for the in-depth analysis of heterogeneous high-throughput data. Most software tools are developed by specific laboratories, groups or companies, wherein they are designed to perform the analysis required by that group. However, such software tools may fail to capture "what the community needs in a tool". Here, we describe a novel community-driven approach to building a comprehensive functional enrichment analysis tool. Using the existing FunRich tool as a template, we invited researchers to request additional features and/or changes. Remarkably, with the enthusiastic participation of the community, we were able to implement 90% of the requested features. FunRich enables a plugin for extracellular vesicles wherein users can download and analyse data from the Vesiclepedia database. By involving researchers early through community-needs-driven software development, we believe that comprehensive analysis tools can be developed in various scientific disciplines. PMID:28717418

  1. A unified approach to feature-centric analysis of object-oriented software

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2010-01-01

    Feature-centric comprehension of software is a prerequisite to incorporating modifications requested by users during software evolution and maintenance. However, feature-centric understanding of large object-oriented programs is difficult to achieve due to the size, complexity and implicit character of the mappings between features and source code. In this paper, we address these issues through our unified approach to feature-centric analysis of object-oriented software. Our approach supports discovery of feature-code traceability links and their analysis from three perspectives and at three levels of abstraction. We further improve the scalability of analysis by partitioning features into canonical groups. To demonstrate the feasibility of our approach, we use our NetBeans-integrated tool Featureous to conduct a case study of feature-centric analysis of the JHotDraw project. Lastly, we discuss how Featureous...

  2. Multi-dimensional project evaluation: Combining cost-benefit analysis and multi-criteria analysis with the COSIMA software system

    DEFF Research Database (Denmark)

    This paper proposes a methodology that integrates quantitative and qualitative assessment. The methodology proposed combines conventional cost-benefit analysis (CBA) with multi-criteria analysis (MCA). The CBA methodology, based on welfare theory, assures that the project with the highest welfare for society is ranked uppermost. To compare the different impacts, it is necessary to have a common monetary unit. Theoretically, all benefits and all costs should be accounted for in socio-economic cost-benefit analysis. However, this is far from the general case in practice, due to difficulties… Different methods for combining cost-benefit analysis and multi-criteria analysis are examined and compared, and a software system is presented. The software system gives the decision makers some possibilities regarding preference analysis, sensitivity and risk analysis. The aim of the software…

  3. Analysis of ionospheric parameters by the software system "Aurora"

    Science.gov (United States)

    Polozov, Yury; Fetisova, Nadezhda

    2017-10-01

    The paper presents methods for modeling and analysis of ionospheric parameters, implemented in the software system for complex analysis of geophysical parameters, "Aurora". The methods allow analysis of characteristic changes in ionospheric parameters and identification of anomalous features during periods of ionospheric disturbance. The algorithm parameters are adapted for analyzing the ionospheric data of the Paratunka station (Kamchatka), and data from other stations (Yakutsk, Gakona, etc.) were also analyzed to assess the estimates. The methods can be applied to the mid-latitude region. The system is available in the public domain (http://aurorasa.ikir.ru:8580). The research was supported by RSF Grant, project No 14-11-00194.
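
    The abstract does not spell out the detection algorithm, so the following is only a minimal sketch of how anomalous intervals in an ionospheric parameter time series might be flagged, using a rolling median and median absolute deviation; the window width, threshold and the synthetic diurnal signal are all assumptions made for illustration.

```python
import numpy as np

def flag_anomalies(series, window=48, k=4.0):
    """Flag samples that deviate strongly from a rolling median baseline.

    series : 1-D array of an ionospheric parameter (e.g. hourly foF2 values)
    window : width of the sliding window used as the local baseline
    k      : threshold in units of the (scaled) median absolute deviation
    """
    x = np.asarray(series, dtype=float)
    flags = np.zeros_like(x, dtype=bool)
    half = window // 2
    for i in range(len(x)):
        lo, hi = max(0, i - half), min(len(x), i + half)
        win = x[lo:hi]
        med = np.median(win)
        mad = np.median(np.abs(win - med)) + 1e-12  # avoid division by zero
        flags[i] = abs(x[i] - med) > k * 1.4826 * mad
    return flags

# Example: a smooth diurnal signal with one injected disturbance.
t = np.arange(0, 240)
signal = 5 + np.sin(2 * np.pi * t / 24)
signal[100] += 4.0
print(np.where(flag_anomalies(signal))[0])  # expected to flag index 100
```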

  4. Reliability and accuracy of three different computerized cephalometric analysis software.

    Science.gov (United States)

    Rusu, Oana; Petcu, Ana Elena; Drăgan, Eliza; Haba, Danisia; Moscalu, Mihaela; Zetu, Irina Nicoleta

    2015-01-01

    The aim of this investigation was to determine, compare and evaluate three different computerized tracing programs in which the lateral cephalograms were digitized on screen. Thirty-nine randomly selected cephalometric radiographs were used in the present study. Three programs were evaluated: Planmeca Romexis® (Romexis 3.2.0, Helsinki, Finland), Orthalis (France) and AxCeph (A.C 2.3.0.74, Ljubljana, Slovenia). Twelve skeletal, 9 dental and 3 soft tissue parameters were measured, consisting of 11 linear and 13 angular measurements. Statistical analysis was carried out using multivariate analysis of variance (MANOVA), Levene's test, the Tukey Honestly Significant Difference (HSD) test and the Kruskal-Wallis test. The measurements obtained with the cephalometric analysis programs used in the study were reliable.

  5. An instructional guide for leaf color analysis using digital imaging software

    Science.gov (United States)

    Paula F. Murakami; Michelle R. Turner; Abby K. van den Berg; Paul G. Schaberg

    2005-01-01

    Digital color analysis has become an increasingly popular and cost-effective method utilized by resource managers and scientists for evaluating foliar nutrition and health in response to environmental stresses. We developed and tested a new method of digital image analysis that uses Scion Image or NIH image public domain software to quantify leaf color. This...

  6. Software system safety

    Science.gov (United States)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  7. Millstone: software for multiplex microbial genome analysis and engineering.

    Science.gov (United States)

    Goodman, Daniel B; Kuznetsov, Gleb; Lajoie, Marc J; Ahern, Brian W; Napolitano, Michael G; Chen, Kevin Y; Chen, Changping; Church, George M

    2017-05-25

    Inexpensive DNA sequencing and advances in genome editing have made computational analysis a major rate-limiting step in adaptive laboratory evolution and microbial genome engineering. We describe Millstone, a web-based platform that automates genotype comparison and visualization for projects with up to hundreds of genomic samples. To enable iterative genome engineering, Millstone allows users to design oligonucleotide libraries and create successive versions of reference genomes. Millstone is open source and easily deployable to a cloud platform, local cluster, or desktop, making it a scalable solution for any lab.

  8. Statistical Analysis Software for the TRS-80 Microcomputer.

    Science.gov (United States)

    1981-09-01


  9. Analysis and Design of Software-Based Optimal PID Controllers

    OpenAIRE

    Garpinger, Olof

    2015-01-01

    A large process industry can have somewhere between five hundred and five thousand control loops, and PID controllers are used in 90–97% of the cases. It is well-known that only 20–30% of the controllers in the process industry are tuned satisfactorily, but with the methods available today it is considered too time-consuming to optimize each single controller. This thesis presents tools for analysis and design of optimal PID controllers, and suggests when and how to use them efficiently. High...

  10. Towards a software approach to mitigate correlation power analysis

    CSIR Research Space (South Africa)

    Frieslaar, I

    2016-07-01

    Full Text Available. The approach achieves the same results as sampling at 2 GS/s asynchronously (O’Flynn and Chen, 2012). Once the system clock and the device clock are synchronized, it is possible to multiply the digital signal. The attack is carried out by using the provided board known...

  11. Engineering Evaluation/Cost Analysis for Power Burst Facility (PER-620) Final End State and PBF Vessel Disposal

    Energy Technology Data Exchange (ETDEWEB)

    B. C. Culp

    2007-05-01

    Preparation of this engineering evaluation/cost analysis is consistent with the joint U.S. Department of Energy and U.S. Environmental Protection Agency Policy on Decommissioning of Department of Energy Facilities Under the Comprehensive Environmental Response, Compensation, and Liability Act (DOE and EPA 1995), which establishes the Comprehensive Environmental Response, Compensation, and Liability Act non-time-critical removal action process as an approach for decommissioning. The scope of this engineering evaluation/cost analysis is to evaluate alternatives and recommend a preferred alternative for the final end state of the PBF and the final disposal location for the PBF vessel.

  12. The Implication of Using NVivo Software in Qualitative Data Analysis: Evidence-Based Reflections.

    Science.gov (United States)

    Zamawe, F C

    2015-03-01

    For a long time, electronic data analysis has been associated with quantitative methods. However, Computer Assisted Qualitative Data Analysis Software (CAQDAS) is increasingly being developed. Although CAQDAS has been available for decades, very few qualitative health researchers report using it. This may be due to the effort required to master the software and the misconceptions associated with using CAQDAS. While the issue of mastering CAQDAS has received ample attention, little has been done to address the misconceptions associated with it. In this paper, the author reflects on his experience of working with one of the popular CAQDAS packages (NVivo) in order to provide evidence-based implications of using the software. The key message is that, unlike statistical software, the main function of CAQDAS is not to analyse data but rather to aid the analysis process, which the researcher must always remain in control of. In other words, researchers must know that no software can analyse qualitative data. CAQDAS packages are basically data management tools that support the researcher during analysis.

  13. Research vessels

    Digital Repository Service at National Institute of Oceanography (India)

    Rao, P.S.

    by the research vessels RV Gaveshani and ORV Sagar Kanya are reported. The work carried out by the three chartered ships is also recorded. A short note on cruise plans for the study of ferromanganese nodules is added...

  14. Weight optimization of offshore supply vessel based on structural analysis using finite element method

    Directory of Open Access Journals (Sweden)

    Ahmed M.H. Elhewy

    2016-06-01

    Full Text Available The ship design process usually relies on statistics and comparisons with existing ships rather than on analytical approaches and optimization techniques. Designers have found this to be the most reliable way to fulfil the owner's requirements, but better solutions may exist for both the shipyard and the owner. Assessing ship life-cycle cost is one of the most attractive tasks for a shipyard during the early design stage, and structural optimization can be used to support it. In this paper, a comprehensive study on the structural optimization of an offshore supply vessel (OSV) is presented as a case study. A detailed structural model of the vessel is created. Various environmental loads acting on the ship hull, such as still-water loads and wave-induced loads, are briefly explained. Different loading conditions and the corresponding structural responses have been investigated to identify the most severe condition for the vessel. The basic concepts and characteristics of structural optimization are highlighted. A blind-search optimization technique is applied, and approximately forty-two percent weight and cost savings are found by comparing the weights of the various design scenarios, without any structural inadequacy.
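
    The blind-search strategy described above is essentially a random sampling of design scenarios followed by an adequacy check. The sketch below illustrates the idea on a deliberately simplified plating model; the panel areas, thickness bounds, allowable stress and the closed-form stress surrogate are all assumptions standing in for the paper's finite element model.

```python
import random

# Hypothetical design variables: plate thicknesses (mm) of three structural groups.
BOUNDS = {"deck": (8, 20), "side_shell": (8, 20), "bottom": (10, 24)}
STEEL_DENSITY = 7.85e-6   # kg/mm^3
PANEL_AREA = {"deck": 4.0e8, "side_shell": 6.0e8, "bottom": 5.0e8}  # mm^2 (assumed)

def weight(design):
    """Structural weight (kg) of the simplified plating model."""
    return sum(STEEL_DENSITY * PANEL_AREA[k] * t for k, t in design.items())

def max_stress(design):
    """Surrogate for the FE stress response (MPa); purely illustrative."""
    return 4000.0 / design["deck"] + 3000.0 / design["side_shell"] + 3500.0 / design["bottom"]

def blind_search(n_trials=5000, allowable=550.0, seed=1):
    random.seed(seed)
    best = None
    for _ in range(n_trials):
        cand = {k: random.uniform(*b) for k, b in BOUNDS.items()}
        if max_stress(cand) <= allowable:            # structural adequacy check
            if best is None or weight(cand) < weight(best):
                best = cand
    return best

best = blind_search()
if best is not None:
    print({k: round(v, 1) for k, v in best.items()}, round(weight(best) / 1000, 1), "t")
```

    In the real workflow each candidate scenario would be evaluated with the finite element model rather than with a closed-form surrogate.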

  15. Nuclear analysis and shielding optimisation in support of the ITER In-Vessel Viewing System design

    Energy Technology Data Exchange (ETDEWEB)

    Turner, Andrew, E-mail: andrew.turner@ccfe.ac.uk [EURATOM/CCFE Fusion Association, Culham Science Centre, Abingdon, Oxon OX14 3DB (United Kingdom); Pampin, Raul [F4E Fusion for Energy, Josep Pla 2, Torres Diagonal Litoral B3, 08019 Barcelona (Spain); Loughlin, M.J. [ITER Organisation, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France); Ghani, Zamir; Hurst, Gemma [EURATOM/CCFE Fusion Association, Culham Science Centre, Abingdon, Oxon OX14 3DB (United Kingdom); Lo Bue, Alessandro [F4E Fusion for Energy, Josep Pla 2, Torres Diagonal Litoral B3, 08019 Barcelona (Spain); Mangham, Samuel [EURATOM/CCFE Fusion Association, Culham Science Centre, Abingdon, Oxon OX14 3DB (United Kingdom); Puiu, Adrian [F4E Fusion for Energy, Josep Pla 2, Torres Diagonal Litoral B3, 08019 Barcelona (Spain); Zheng, Shanliang [EURATOM/CCFE Fusion Association, Culham Science Centre, Abingdon, Oxon OX14 3DB (United Kingdom)

    2014-10-15

    The In-Vessel Viewing System (IVVS) units proposed for ITER are deployed to perform in-vessel examination. During plasma operations, the IVVS is located beyond the vacuum vessel, with shielding blocks envisaged to protect components from neutron damage and to reduce shutdown dose rate (SDR) levels. Analyses were conducted to determine the effectiveness of several shielding configurations. The neutron response of the system was assessed using global variance reduction techniques and a surface source, and shutdown dose rate calculations were undertaken using MCR2S. Unshielded, the absorbed dose to the piezoelectric motors (PZT) was found to be below stable limits; however, activation of the primary closure plate (PCP) was prohibitively high. A scenario with shielding blocks at probe level showed a significantly reduced PCP contact dose rate, which nevertheless still marginally exceeded port cell requirements. The addition of shielding blocks at the bioshield plug brought PCP contact dose rates below project requirements. SDR levels in contact with the isolated IVVS cartridge were found to marginally exceed the hands-on maintenance limit. For engineering feasibility, shielding blocks at bioshield level are to be avoided; the port cell SDR field therefore requires further consideration. In addition, alternative low-activation steels are being considered for the IVVS cartridge.

  16. Analysis and optimization on in-vessel inspection robotic system for EAST

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Weijun, E-mail: zhangweijun@sjtu.edu.cn; Zhou, Zeyu; Yuan, Jianjun; Du, Liang; Mao, Ziming

    2015-12-15

    Since China successfully built its first Experimental Advanced Superconducting Tokamak (EAST) several years ago, interest in and demand for robotic in-vessel inspection/operation systems have been increasing; such systems would make possible observation of in-vessel physical phenomena, collection of visual information, 3D mapping and localization, and even maintenance. However, implementing a practical and robust robotic system raises many challenges, due to complex constraints and expectations, e.g., the high residual working temperature (100 °C) and vacuum (10^-3 Pa) environment even in the rest interval between plasma discharge experiments, close-up and precise inspection, and operation efficiency, in addition to the general kinematic requirements of the irregular D-shaped vessel. In this paper we propose an upgraded robotic system consisting of a redundant degrees-of-freedom (DOF) manipulator combined with a binocular vision system at the tip and a virtual reality system. A comprehensive comparison and discussion are given on the necessity and main functions of the binocular vision system, path planning for inspection, fast localization, inspection efficiency and success rate in time, optimization of the kinematic configuration, and the possibility of an underactuated mechanism. A detailed design, implementation and experiments of the binocular vision system, together with the recent development progress of the whole robotic system, are reported in the later part of the paper, while future work and expectations are described at the end.

  17. Information Extraction for System-Software Safety Analysis: Calendar Year 2007 Year-End Report

    Science.gov (United States)

    Malin, Jane T.

    2008-01-01

    This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis on the models to identify possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations; 4) perform discrete-time-based simulation on the models to investigate scenarios where these paths may play a role in failures and mishaps; and 5) identify resulting candidate scenarios for software integration testing. This paper describes new challenges in a NASA abort system case, and enhancements made to develop the integrated tool set.

  18. Software Tools for Robust Analysis of High-Dimensional Data

    Directory of Open Access Journals (Sweden)

    Valentin Todorov

    2014-06-01

    Full Text Available The present work discusses robust multivariate methods specifically designed for high dimensions. Their implementation in R is presented and their application is illustrated on examples. The first group are algorithms for outlier detection, already introduced elsewhere and implemented in other packages. The value added of the new package is that all methods follow the same design pattern and thus can use the same graphical and diagnostic tools. The next topic covered is sparse principal components, including an object-oriented interface to the standard method proposed by Zou, Hastie, and Tibshirani (2006) and the robust one proposed by Croux, Filzmoser, and Fritz (2013). Robust partial least squares (see Hubert and Vanden Branden 2003) as well as partial least squares for discriminant analysis conclude the scope of the new package.

  19. Analysis of dpa Rates in the HFIR Reactor Vessel using a Hybrid Monte Carlo/Deterministic Method*

    Directory of Open Access Journals (Sweden)

    Risner J.M.

    2016-01-01

    Full Text Available The Oak Ridge High Flux Isotope Reactor (HFIR), which began full-power operation in 1966, provides one of the highest steady-state neutron flux levels of any research reactor in the world. An ongoing vessel integrity analysis program to assess radiation-induced embrittlement of the HFIR reactor vessel requires the calculation of neutron and gamma displacements per atom (dpa), particularly at locations near the beam tube nozzles, where radiation streaming effects are most pronounced. In this study we apply the Forward-Weighted Consistent Adjoint Driven Importance Sampling (FW-CADIS) technique in the ADVANTG code to develop variance reduction parameters for use in the MCNP radiation transport code. We initially evaluated dpa rates for dosimetry capsule locations, regions in the vicinity of the HB-2 beamline, and the vessel beltline region. We then extended the study to provide dpa rate maps using three-dimensional cylindrical mesh tallies that extend from approximately 12 in. below to approximately 12 in. above the height of the core. The mesh tally structures contain over 15,000 mesh cells, providing a detailed spatial map of neutron and photon dpa rates at all locations of interest. Relative errors in the mesh tally cells are typically less than 1%.

  20. Analysis of dpa rates in the HFIR reactor vessel using a hybrid Monte Carlo/deterministic method

    Energy Technology Data Exchange (ETDEWEB)

    Blakeman, Edward [Retired

    2016-01-01

    The Oak Ridge High Flux Isotope Reactor (HFIR), which began full-power operation in 1966, provides one of the highest steady-state neutron flux levels of any research reactor in the world. An ongoing vessel integrity analysis program to assess radiation-induced embrittlement of the HFIR reactor vessel requires the calculation of neutron and gamma displacements per atom (dpa), particularly at locations near the beam tube nozzles, where radiation streaming effects are most pronounced. In this study we apply the Forward-Weighted Consistent Adjoint Driven Importance Sampling (FW-CADIS) technique in the ADVANTG code to develop variance reduction parameters for use in the MCNP radiation transport code. We initially evaluated dpa rates for dosimetry capsule locations, regions in the vicinity of the HB-2 beamline, and the vessel beltline region. We then extended the study to provide dpa rate maps using three-dimensional cylindrical mesh tallies that extend from approximately 12 in. below to approximately 12 in. above the axial extent of the core. The mesh tally structures contain over 15,000 mesh cells, providing a detailed spatial map of neutron and photon dpa rates at all locations of interest. Relative errors in the mesh tally cells are typically less than 1%.

  1. ANALYSIS OF EROSION AND SEDIMENTATION PATTERNS USING SOFTWARE OF MIKE 21 HDFM-MT IN THE KAPUAS MURUNG RIVER MOUTH CENTRAL KALIMANTAN PROVINCE

    Directory of Open Access Journals (Sweden)

    Franto Novico

    2017-07-01

    Full Text Available Public transportation along the Kapuas River in Central Kalimantan depends heavily on water transport. Natural conditions strongly affect the smoothness of vessel traffic along the Kapuas Murung River. The local government has planned to build a dedicated stockpile port at Batanjung, which will face the natural phenomena of sedimentation and erosion at the river mouth. Erosion and sedimentation can be predicted not only by field observation but also by modelling with analysis software. Hydrodynamic and sediment transport models built with the Mike 21 HDFM-MT software are applied to describe the locations of sedimentation and erosion at the river mouth. The model assumes two different river conditions, corresponding to the wet and dry seasons, and for each condition it also describes river flow and sediment transport during spring and neap tidal periods. Tidal fluctuations and river currents from field observations are used to verify the model simulations. Verification against field observations shows a correlation of 89.74% for the tide and 43.6% for the river current. Moreover, the simulated sedimentation patterns cover a larger area during the flood period than during the ebb period, while erosion occurs predominantly during the ebb period in both the wet and dry seasons. Water depths and sedimentation patterns should be considered by vessels that will use the navigation channel at the river mouth.

  2. Public-domain software for root image analysis

    Directory of Open Access Journals (Sweden)

    Mirian Cristina Gomes Costa

    2014-10-01

    Full Text Available In the search for high efficiency in root studies, computational systems have been developed to analyze digital images. ImageJ and Safira are public-domain systems that may be used for image analysis of washed roots. However, differences between root properties measured using ImageJ and Safira are suspected. This study compared values of root length and surface area obtained with the public-domain systems with values obtained by a reference method. Root samples were collected in a banana plantation, in an area of a shallower Typic Carbonatic Haplic Cambisol (CXk) and an area of a deeper Typic Haplic Ta Eutrophic Cambisol (CXve), at six depths in five replications. Root images were digitized and the systems ImageJ and Safira used to determine root length and surface area. The line-intersect method modified by Tennant was used as reference; values of root length and surface area measured with the different systems were analyzed by Pearson's correlation coefficient and compared by confidence interval and t-test. Both ImageJ and Safira had positive correlation coefficients with the reference method for root length and surface area data in CXk and CXve. The correlation coefficient ranged from 0.54 to 0.80, with the lowest value observed for ImageJ in the measurement of surface area of roots sampled in CXve. The 95% confidence interval revealed that root length measurements with Safira did not differ from those with the reference method in CXk (-77.3 to 244.0 mm). Regarding surface area measurements, Safira did not differ from the reference method for samples collected in CXk (-530.6 to 565.8 mm²) as well as in CXve (-4231 to 612.1 mm²). However, measurements with ImageJ were different from those obtained by the reference method, underestimating length and surface area in samples collected in CXk and CXve. Both ImageJ and Safira allow an identification of increases or decreases in root length and surface area. However, Safira results for root length and surface area are
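
    The statistical comparison described above (Pearson's correlation, a paired test and a confidence interval for the bias between a candidate system and the reference line-intersect method) is straightforward to reproduce. The sketch below is a minimal illustration with invented root-length values; it is not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical root-length measurements (mm) of the same samples by two methods.
reference = np.array([820.0, 1190.0, 640.0, 1510.0, 980.0, 730.0, 1105.0, 1320.0])
candidate = np.array([805.0, 1230.0, 610.0, 1480.0, 1010.0, 700.0, 1150.0, 1290.0])

r, p_r = stats.pearsonr(reference, candidate)       # association between methods
t, p_t = stats.ttest_rel(reference, candidate)      # paired test of mean difference

diff = candidate - reference
ci = stats.t.interval(0.95, len(diff) - 1,
                      loc=diff.mean(), scale=stats.sem(diff))  # 95% CI of the bias

print(f"Pearson r = {r:.2f} (p = {p_r:.3f})")
print(f"paired t-test p = {p_t:.3f}; mean bias = {diff.mean():.1f} mm, 95% CI = {ci}")
```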

  3. Software Ion Scan Functions in Analysis of Glycomic and Lipidomic MS/MS Datasets.

    Science.gov (United States)

    Haramija, Marko

    2017-12-29

    Hardware ion scan functions unique to the MS/MS mode of data acquisition, such as precursor ion scan (PIS) and neutral loss scan (NLS), are important for selective extraction of key structural data from complex MS/MS spectra. However, their software counterparts, software ion scan (SIS) functions, are still not regularly available. SIS functions can easily be coded to provide additional functionality, such as software multiple precursor ion scan (sMPIS), software no ion scan (sNIS) and software variable ion scan (sVIS) functions. These are often necessary, since they allow more efficient analysis of the complex MS/MS datasets frequently encountered in glycomics and lipidomics. SIS functions can be coded using modern scripting languages and are independent of the instrument manufacturer. Here we show one example of the utility of SIS functions on a medium-sized glycomic MS/MS dataset. Knowledge of sample properties, as well as of the diagnostic and conditional diagnostic ions crucial for data analysis, was needed. Based on tables constructed with the output of the SIS functions, a detailed analysis of a complex glycomic MS/MS dataset could be carried out in a quick, accurate and efficient manner. Glycomic research is progressing slowly and, with respect to MS experiments, one of the key obstacles to moving forward is the lack of appropriate bioinformatic tools for fast analysis of glycomic MS/MS datasets. Adding novel software ion scan functionalities to the glycomic MS/MS toolbox has the potential to significantly speed up the glycomic data analysis process. Similar tools are useful for analyses of lipidomic MS/MS datasets as well, as will be discussed briefly.
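
    As a rough illustration of the idea (not the authors' implementation), a software precursor ion scan and a software neutral loss scan reduce to simple filters over already-acquired spectra. The spectrum layout, tolerances and example m/z values below are assumptions; 204.087 is used as the familiar HexNAc oxonium marker ion.

```python
# Minimal sketch of "software ion scan" filters over MS/MS spectra.
# Each spectrum: precursor m/z plus a list of fragment m/z values.

def software_precursor_ion_scan(spectra, diagnostic_mz, tol=0.02):
    """Keep spectra containing a diagnostic fragment ion (e.g. an oxonium ion)."""
    return [s for s in spectra
            if any(abs(frag - diagnostic_mz) <= tol for frag in s["fragments"])]

def software_neutral_loss_scan(spectra, loss, tol=0.02):
    """Keep spectra showing a constant neutral loss from the precursor."""
    return [s for s in spectra
            if any(abs((s["precursor"] - frag) - loss) <= tol for frag in s["fragments"])]

spectra = [
    {"id": 1, "precursor": 1001.45, "fragments": [204.087, 366.14, 528.19]},
    {"id": 2, "precursor": 845.32,  "fragments": [243.10, 481.25]},
]

# 204.087 is the HexNAc oxonium ion commonly used as a glycopeptide marker.
hexnac_positive = software_precursor_ion_scan(spectra, 204.087)
print([s["id"] for s in hexnac_positive])   # -> [1]
```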

  4. Software-assisted small bowel motility analysis using free-breathing MRI: feasibility study.

    Science.gov (United States)

    Bickelhaupt, Sebastian; Froehlich, Johannes M; Cattin, Roger; Raible, Stephan; Bouquet, Hanspeter; Bill, Urs; Patak, Michael A

    2014-01-01

    To validate a software prototype allowing for small bowel motility analysis in free breathing by comparing it to manual measurements. In all, 25 patients (15 male, 10 female; mean age 39 years) were included in this Institutional Review Board-approved, retrospective study. Magnetic resonance imaging (MRI) was performed on a 1.5T system after standardized preparation, acquiring motility sequences in free breathing over 69-84 seconds. Small bowel motility was analyzed manually and with the software. Functional parameters, measurement time, and reproducibility were compared using the coefficient of variance and paired Student's t-test. Correlation was analyzed using Pearson's correlation coefficient and linear regression. The 25 segments were analyzed twice, both by hand and using the software with automatic breathing correction. All assessed parameters correlated significantly between the methods. The coefficient of variance was lower with the software (3.90%, standard deviation [SD] ± 5.69) than with manual examination (9.77%, SD ± 11.08). The time needed was significantly less with the software (4.52 minutes, SD ± 1.58) than with manual measurement (17.48 minutes, SD ± 1.75). The software provides reliable and faster small bowel motility measurements in free-breathing MRI compared to manual analysis. The new technique allows for analyses of prolonged sequences acquired in free breathing, improving the informative value of the examinations by amplifying the evaluable data. Copyright © 2013 Wiley Periodicals, Inc.

  5. A Systematic Analysis of Functional Safety Certification Practices in Industrial Robot Software Development

    Directory of Open Access Journals (Sweden)

    Tong Xie

    2017-01-01

    Full Text Available For decades, industrial robots have delivered on the promise of speed, efficiency and productivity. The last several years have seen a sharp resurgence in orders of industrial robots in China, and the areas addressed within industrial robotics have extended into safety-critical domains. However, safety standards have not yet been widely implemented in academia and engineering applications, particularly in robot software development. This paper presents a systematic analysis of functional safety certification practices in the development of safety-critical software for industrial robots, to identify the safety certification practices used for the development of industrial robots in China and how these practices comply with the safety standard requirements. Reviewing Chinese academic papers, our research shows that safety standards are barely used in the software development of industrial robots. The majority of the papers propose various solutions to achieve safety, but only about two thirds of them refer to non-standardized approaches, which mainly address the system level rather than the software development level. In addition, our research shows that with the development of artificial intelligence, this emerging field is still in search of standardized and suitable approaches to develop safety-critical software.

  6. TChem - A Software Toolkit for the Analysis of Complex Kinetic Models

    Energy Technology Data Exchange (ETDEWEB)

    Safta, Cosmin [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Najm, Habib N. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Knio, Omar [Johns Hopkins Univ., Baltimore, MD (United States)

    2011-05-01

    The TChem toolkit is a software library that enables numerical simulations using complex chemistry and facilitates the analysis of detailed kinetic models. The toolkit provides capabilities for thermodynamic properties based on NASA polynomials and for species production/consumption rates. It incorporates methods that can selectively modify reaction parameters for sensitivity analysis. The library contains several functions that provide analytically computed Jacobian matrices necessary for the efficient time advancement and analysis of detailed kinetic models.
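
    For reference, the NASA 7-coefficient polynomial form that such toolkits evaluate for the dimensionless heat capacity is cp/R = a1 + a2*T + a3*T^2 + a4*T^3 + a5*T^4. The sketch below evaluates it with purely illustrative coefficients (not taken from any real thermodynamic database) and is only loosely modelled on how a kinetics toolkit would use it.

```python
R = 8.314462618  # J/(mol K)

def cp_nasa7(T, a):
    """Heat capacity cp (J/mol/K) from the NASA 7-coefficient polynomial:
       cp/R = a1 + a2*T + a3*T^2 + a4*T^3 + a5*T^4
    (a6 and a7 enter only the enthalpy and entropy expressions)."""
    a1, a2, a3, a4, a5 = a[:5]
    return R * (a1 + a2 * T + a3 * T**2 + a4 * T**3 + a5 * T**4)

# Illustrative coefficients only (not from a real database).
coeffs = [3.5, 1.0e-4, 5.0e-7, -2.0e-10, 2.0e-14]
for T in (300.0, 1000.0, 1500.0):
    print(T, round(cp_nasa7(T, coeffs), 2))
```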

  7. Arlequin (version 3.0: An integrated software package for population genetics data analysis

    Directory of Open Access Journals (Sweden)

    Stefan Schneider

    2005-01-01

    Full Text Available Arlequin ver 3.0 is a software package integrating several basic and advanced methods for population genetics data analysis, such as the computation of standard genetic diversity indices, the estimation of allele and haplotype frequencies, tests of departure from linkage equilibrium, departure from selective neutrality and demographic equilibrium, estimation of parameters of past population expansions, and thorough analyses of population subdivision under the AMOVA framework. Arlequin 3 introduces a completely new graphical interface written in C++, a more robust semantic analysis of input files, and two new methods: a Bayesian estimation of gametic phase from multi-locus genotypes, and an estimation of the parameters of an instantaneous spatial expansion from DNA sequence polymorphism. Arlequin can handle several data types such as DNA sequences, microsatellite data, or standard multilocus genotypes. A Windows version of the software is freely available at http://cmpg.unibe.ch/software/arlequin3.
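
    One of the "standard genetic diversity indices" such packages compute is Nei's unbiased gene diversity. A minimal sketch is shown below; the microsatellite allele sizes are invented for illustration and the function is not Arlequin's implementation.

```python
from collections import Counter

def gene_diversity(alleles):
    """Nei's unbiased gene diversity: H = n/(n-1) * (1 - sum(p_i^2)),
    where n is the number of gene copies sampled at the locus."""
    n = len(alleles)
    counts = Counter(alleles)
    sum_p2 = sum((c / n) ** 2 for c in counts.values())
    return n / (n - 1) * (1.0 - sum_p2)

# Hypothetical microsatellite locus: allele sizes from 10 diploid individuals.
locus = [150, 150, 152, 154, 150, 152, 152, 156, 150, 154,
         152, 150, 154, 150, 152, 156, 150, 152, 154, 150]
print(round(gene_diversity(locus), 3))
```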

  8. Featureous: infrastructure for feature-centric analysis of object-oriented software

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2010-01-01

    The decentralized nature of collaborations between objects in object-oriented software makes it difficult to understand how user-observable program features are implemented and how their implementations relate to each other. It is worthwhile to improve this situation, since feature-centric program understanding and modification are essential during software evolution and maintenance. In this paper, we present an infrastructure built on top of the NetBeans IDE called Featureous that allows for rapid construction of tools for feature-centric analysis of object-oriented software. Our infrastructure encompasses a lightweight feature location mechanism, a number of analytical views and an API allowing for addition of third-party extensions. To form a common conceptual framework for future feature-centric extensions, we propose to structure feature-centric analysis along three dimensions: perspective...

  9. Review Essay: Guidance in the World of Computer-Assisted Qualitative Data Analysis Software (CAQDAS Programs

    Directory of Open Access Journals (Sweden)

    Áine Humble

    2015-03-01

    Full Text Available This review discusses Christina SILVER and Ann LEWINS' book, "Using Software in Qualitative Research: A Step-by-Step Guide" (2nd ed.). This book is an impressive undertaking, with online supplemental material in the form of three data sets consisting of many different types of data, detailed instructions for seven CAQDAS (Computer-Assisted Qualitative Data Analysis Software) programs, and full-color reproductions of illustrations from the book. The 14 chapters in the book cover a wide range of analysis issues when working with software programs, and the authors encourage critical use of such tools. Readers will benefit from engaging with the online supplemental tools. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1502223

  10. ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra

    Science.gov (United States)

    Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to the traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single or few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs such as Matlab. The software is written in Java, and thus it should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, currently it has only been tested with Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use and is provided with source code upon request.
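
    The batch-integration idea (the same named ppm regions integrated across many 1D spectra, with results written to CSV for downstream analysis) can be sketched as follows. The region names, limits and the synthetic Lorentzian test spectra are assumptions for illustration, not ImatraNMR's defaults, and a uniform ppm axis is assumed.

```python
import csv
import numpy as np

# Integration regions (ppm ranges, high to low) applied identically to every spectrum.
REGIONS = {"anomeric": (5.4, 5.1), "acetate": (2.1, 1.9), "aliphatic": (1.4, 0.8)}

def integrate_regions(ppm, intensity, regions):
    """Integrate each ppm region of one 1-D spectrum (uniform ppm axis assumed)."""
    dppm = abs(ppm[1] - ppm[0])
    results = {}
    for name, (hi, lo) in regions.items():
        mask = (ppm >= lo) & (ppm <= hi)
        results[name] = intensity[mask].sum() * dppm
    return results

def batch_integrate(spectra, out_csv="integrals.csv"):
    """spectra: iterable of (sample_id, ppm_array, intensity_array)."""
    with open(out_csv, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["sample"] + list(REGIONS))
        for sample_id, ppm, intensity in spectra:
            res = integrate_regions(ppm, intensity, REGIONS)
            writer.writerow([sample_id] + [f"{res[k]:.4g}" for k in REGIONS])

# Synthetic demonstration: two spectra with Lorentzian-like peaks.
ppm = np.linspace(10, 0, 4096)
def peak(center, width=0.02, height=1.0):
    return height / (1 + ((ppm - center) / width) ** 2)
spectra = [("s1", ppm, peak(5.25) + 2 * peak(2.0)),
           ("s2", ppm, 0.5 * peak(5.25) + peak(1.1))]
batch_integrate(spectra)
```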

  11. Morphometric analysis of rat femoral vessels under a video magnification system

    Directory of Open Access Journals (Sweden)

    Rui Sergio Monteiro de Barros

    Full Text Available Abstract The right femoral vessels of 80 rats were identified and dissected. External lengths and diameters of the femoral arteries and femoral veins were measured using either a microscope or a video magnification system, and the findings were correlated with the animals' weights. Mean length was 14.33 mm for both femoral arteries and femoral veins; mean diameter was 0.65 mm for arteries and 0.81 mm for veins. In our sample, the rats' body weights were correlated only with the diameter of their femoral veins.

  12. Thromboangiitis obliterans with multiple large vessel involvement: case report and analysis of immunophenotypes.

    Science.gov (United States)

    Edo, Naoki; Miyai, Kosuke; Ogata, Sho; Nakanishi, Kuniaki; Hiroi, Sadayuki; Tominaga, Susumu; Aiko, Satoshi; Kawai, Toshiaki

    2010-01-01

    Thromboangiitis obliterans (TAO, Buerger's disease) is an idiopathic, recurrent, segmental, nonatherosclerotic, inflammatory, occlusive vascular disease with a poorly understood pathogenesis. Intestinal or multi-organ involvement is rare. Recent immunohistochemical analyses of ordinary TAO have indicated an inflammatory and immunologic pathogenesis. We report a case of TAO involving multiple large vessels. By immunohistochemistry, CD3+ T cells were revealed around the recanalization sites within the abdominal aorta. CD4+ T cells were almost equal in number to CD8+ T cells. These findings indicate the participation of inflammatory and immunologic processes in TAO with multi-organ involvement (as in ordinary TAO).

  13. Fully coupled, hygro-thermo-mechanical sensitivity analysis of a pre-stressed concrete pressure vessel

    OpenAIRE

    Davie, C.T.; Pearce, C.J.; Bićanić, N.

    2014-01-01

    Following a recent world wide resurgence in the desire to build and operate nuclear power stations as a response to rising energy demands and global plans to reduce carbon emissions, and in the light of recent events such as those at the Fukushima Dai-ichi nuclear power plant in Japan, which have raised questions of safety, this work has investigated the long term behaviour of concrete nuclear power plant structures. A case example of a typical pre-stressed concrete pressure vessel (PCPV),...

  14. Methodology for object-oriented real-time systems analysis and design: Software engineering

    Science.gov (United States)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly, 'seamlessly', from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structuring of the system, so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation and the original specification, and perhaps the high-level design, is not object oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects where the operations or methods of the objects correspond to processes in the data flow diagrams, and then design in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time-behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly connected models which progress from the object-oriented real-time systems-analysis logical models through the physical architectural models and the high-level design stages. The methodology is appropriate to the overall specification including hardware and software modules. In software modules, the systems analysis objects are transformed into software objects.

  15. Towards holistic goal centered performance management in software development: lessons from a best practice analysis

    Directory of Open Access Journals (Sweden)

    Thomas Murphy

    2015-01-01

    Full Text Available There are strong motivating factors for more effective performance measurement practices in software development; astute practices in this domain are lauded for improving efficiency and effectiveness. However, previous studies have shown that measurement in software is intricate, complex and fraught with challenges, and consequently it is poorly managed in practice. Our research seeks to better understand performance management in a real-world software development setting in order to identify the challenges and generate a roadmap for improvement. This paper presents findings from an inductive analysis of a radical measurement program in a global software organization. Our study investigates the extent to which non-compliance with best practice can explain the company's disappointing results. We found that a narrow focus on projects, rather than on organizational goals, has seriously hindered its success. We also found that the rate of change in the organization as a whole was impinging on the effective implementation of its measurement program. An analysis of the results demonstrates just how challenging software measurement is. The findings provide an evaluation of best practice relative to the literature that is informed by real industry experience.

  16. Design and Analysis of Boiler Pressure Vessels based on IBR codes

    Science.gov (United States)

    Balakrishnan, B.; Kanimozhi, B.

    2017-05-01

    Pressure vessel components are widely used in thermal and nuclear power plants for generating steam using the principles of heat transfer. In a thermal power plant, coal is burnt inside the boiler furnace to generate heat. The heat produced by the combustion of pulverized coal is used to change the phase of the working fluid (i.e., water into superheated steam) in the pressure-part components. Pressure vessels are designed according to the standards and codes of the country where the boiler is to be installed; one of the standards followed in designing pressure parts is ASME (American Society of Mechanical Engineers), whose mandatory requirements must be satisfied by the manufacturer. In our case, a shell/pipe manufactured to the ASME code had an issue during the drilling of a hole: the actual size of the drilled hole must match the drawing, but due to an error the size deviated from the approved design calculation (i.e., the diameter was exceeded). To rectify this error, we added a reinforcement pad at the drilled location and modified the design of the header in accordance with the code requirements.

  17. A virtualized software based on the NVIDIA cuFFT library for image denoising: performance analysis

    DEFF Research Database (Denmark)

    Galletti, Ardelio; Marcellino, Livia; Montella, Raffaele

    2017-01-01

    Abstract Generic Virtualization Service (GVirtuS) is a new solution for enabling GPGPU on virtual machines or low-powered devices. This paper focuses on the performance analysis that can be obtained using GPGPU virtualization software. Recently, GVirtuS has been extended in order to support CUDA...

  18. An open source cryostage and software analysis method for detection of antifreeze activity

    DEFF Research Database (Denmark)

    Lørup Buch, Johannes; Ramløv, H

    2016-01-01

    The aim of this study is to provide the reader with a simple setup that can detect antifreeze proteins (AFP) by inhibition of ice recrystallisation in very small sample sizes. This includes an open source cryostage, a method for preparing and loading samples as well as a software analysis method...

  19. MyoVision: software for automated high-content analysis of skeletal muscle immunohistochemistry.

    Science.gov (United States)

    Wen, Yuan; Murach, Kevin A; Vechetti, Ivan J; Fry, Christopher S; Vickery, Chase; Peterson, Charlotte A; McCarthy, John J; Campbell, Kenneth S

    2018-01-01

    Analysis of skeletal muscle cross sections is an important experimental technique in muscle biology. Many aspects of immunohistochemistry and fluorescence microscopy can now be automated, but most image quantification techniques still require extensive human input, slowing progress and introducing the possibility of user bias. MyoVision is a new software package that was developed to overcome these limitations. The software improves upon previously reported automatic techniques and analyzes images without requiring significant human input and correction. When compared with data derived by manual quantification, MyoVision achieves an accuracy of ≥94% for basic measurements such as fiber number, fiber type distribution, fiber cross-sectional area, and myonuclear number. Scientists can download the software free from www.MyoVision.org and use it to automate the analysis of their own experimental data. This will improve the efficiency and consistency of the analysis of muscle cross sections and help to reduce the burden of routine image quantification in muscle biology. NEW & NOTEWORTHY Scientists currently analyze images of immunofluorescently labeled skeletal muscle using time-consuming techniques that require sustained human supervision. As well as being inefficient, these techniques can increase variability in studies that quantify morphological adaptations of skeletal muscle at the cellular level. MyoVision is new software that overcomes these limitations by performing high-content analysis of muscle cross sections with minimal manual input. It is open source and freely available.
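
    As an illustration of the kind of automated fiber segmentation the abstract describes (not MyoVision's actual pipeline), the minimal scikit-image sketch below thresholds a membrane-stained image, labels fiber interiors and reports their number and cross-sectional areas; the threshold choice, size filter and synthetic test image are assumptions.

```python
import numpy as np
from skimage import filters, measure, morphology, segmentation

def fiber_metrics(membrane_channel, min_area=200):
    """Count fibers and measure cross-sectional areas (in pixels) from a
    laminin/membrane image where fiber interiors are dark and borders bright."""
    thresh = filters.threshold_otsu(membrane_channel)
    interiors = membrane_channel < thresh                 # fiber interiors
    interiors = morphology.remove_small_objects(interiors, min_size=min_area)
    interiors = segmentation.clear_border(interiors)      # drop cut-off fibers
    labels = measure.label(interiors)
    areas = [r.area for r in measure.regionprops(labels)]
    return len(areas), areas

# Synthetic example: two "fibers" as dark squares on a bright background.
img = np.full((200, 200), 200, dtype=np.uint8)
img[20:80, 20:80] = 30
img[110:190, 100:170] = 30
n, areas = fiber_metrics(img)
print(n, areas)   # expected: 2 fibers with areas 3600 and 5600 pixels
```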

  20. A Comparative Analysis of Software Engineering with Mature Engineering Disciplines Using a Problem-Solving Perspective

    NARCIS (Netherlands)

    Tekinerdogan, B.; Aksit, Mehmet; Dogru, Ali H.; Bicer, Veli

    2011-01-01

    Software engineering is compared with traditional engineering disciplines using a domain specific problem-solving model called Problem-Solving for Engineering Model (PSEM). The comparative analysis is performed both from a historical and contemporary view. The historical view provides lessons on the

  1. The Design of Lessons Using Mathematics Analysis Software to Support Multiple Representations in Secondary School Mathematics

    Science.gov (United States)

    Pierce, Robyn; Stacey, Kaye; Wander, Roger; Ball, Lynda

    2011-01-01

    Current technologies incorporating sophisticated mathematical analysis software (calculation, graphing, dynamic geometry, tables, and more) provide easy access to multiple representations of mathematical problems. Realising the affordances of such technology for students' learning requires carefully designed lessons. This paper reports on design…

  2. An Assessment of a Beowulf System for a Wide Class of Analysis and Design Software

    Science.gov (United States)

    Katz, D. S.; Cwik, T.; Kwan, B. H.; Lou, J. Z.; Springer, P. L.; Sterling, T. L.; Wang, P.

    1997-01-01

    This paper discusses Beowulf systems, focusing on Hyglac, the Beowulf system installed at the Jet Propulsion Laboratory. The purpose of the paper is to assess how a system of this type will perform while running a variety of scientific and engineering analysis and design software.

  3. Statistical analysis of probabilistic models of software product lines with quantitative constraints

    DEFF Research Database (Denmark)

    Beek, M.H. ter; Legay, A.; Lluch Lafuente, Alberto

    2015-01-01

    We investigate the suitability of statistical model checking for the analysis of probabilistic models of software product lines with complex quantitative constraints and advanced feature installation options. Such models are specified in the feature-oriented language QFLan, a rich process algebra...

  4. Long Term Preservation of Data Analysis Software at the NASA/IPAC Infrared Science Archive

    NARCIS (Netherlands)

    Teplitz, H.I.; Groom, S.; Brooke, T.; Desai, V.; Engler, D.; Fowler, J.; Good, J.; Khan, I.; Levine, D.; Alexov, A.

    2012-01-01

    The NASA/IPAC Infrared Science Archive (IRSA) curates both data and analysis tools from NASA's infrared missions. As part of our primary goal, we provide long term access to mission-specific software from projects such as IRAS and Spitzer. We will review the efforts by IRSA (and within the greater

  5. AN AUTOMATIZED IN-PLACE ANALYSIS OF A HEAVY LIFT JACK-UP VESSEL UNDER SURVIVAL CONDITIONS

    Directory of Open Access Journals (Sweden)

    Gil Rama

    2014-08-01

    Full Text Available Heavy lift jack-up vessels (HLJV) are used for the installation of components of large offshore wind farms. A systematic FE analysis is presented for the HLJV THOR (owned by Hochtief Infrastructure GmbH) under extreme weather conditions. A parametric finite element (FE) model and analysis are developed using the ANSYS-APDL programming environment. The analysis contains static and dynamic nonlinear FE calculations, which are carried out according to the relevant standard (ISO 19905) for in-place analyses of jack-up vessels. Besides strategies for model abstraction, a guide for the determination of the relevant loads is given. In order to calculate the dynamic loads, a single-degree-of-freedom (SDOF) analogy and dynamic nonlinear FE calculations are used. As a result of the detailed determination of dynamic loads and the consideration of soil properties by spring elements, the utilized capacities are reduced by 28%. This provides a significant improvement of the environmental restrictions of the HLJV THOR for the considered load scenario.
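
    The SDOF analogy mentioned above reduces the dynamic load estimate to a dynamic amplification factor for a damped single-degree-of-freedom oscillator under harmonic excitation. The sketch below evaluates the standard expression with purely illustrative periods and damping, not the THOR values.

```python
import math

def daf(t_natural, t_wave, damping_ratio=0.07):
    """Dynamic amplification factor of a damped SDOF oscillator under harmonic
    excitation, written in terms of the period ratio beta = T_n / T_wave."""
    beta = t_natural / t_wave
    return 1.0 / math.sqrt((1.0 - beta**2) ** 2 + (2.0 * damping_ratio * beta) ** 2)

# Illustrative values only: a 6 s natural period in a 10 s design wave.
print(round(daf(6.0, 10.0), 2))   # approx. 1.55
```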

  6. Global review of open access risk assessment software packages valid for global or continental scale analysis

    Science.gov (United States)

    Daniell, James; Simpson, Alanna; Gunasekara, Rashmin; Baca, Abigail; Schaefer, Andreas; Ishizawa, Oscar; Murnane, Rick; Tijssen, Annegien; Deparday, Vivien; Forni, Marc; Himmelfarb, Anne; Leder, Jan

    2015-04-01

    Over the past few decades, a plethora of open access software packages for the calculation of earthquake, volcanic, tsunami, storm surge, wind and flood risk have been produced globally. As part of the World Bank GFDRR Review released at the Understanding Risk 2014 Conference, over 80 such open access risk assessment software packages were examined. Commercial software was not considered in the evaluation. A preliminary analysis was used to determine whether the 80 models were currently supported and open access. This process was used to select a subset of 31 models, comprising 8 earthquake models, 4 cyclone models, 11 flood models, and 8 storm surge/tsunami models, for more detailed analysis. By using multi-criteria decision analysis (MCDA) and simple descriptions of the software uses, the review allows users to select a few relevant software packages for their own testing and development. The detailed analysis evaluated the models on the basis of over 100 criteria and provides a synopsis of available open access natural hazard risk modelling tools. In addition, volcano software packages have since been added, bringing the compendium of risk software tools to more than 100. There has been a huge increase in the quality and availability of open access/source software over the past few years. For example, private entities such as Deltares now have an open source policy regarding some flood models (NGHS). In addition, leaders in developing risk models in the public sector, such as Geoscience Australia (EQRM, TCRM, TsuDAT, AnuGA) or CAPRA (ERN-Flood, Hurricane, CRISIS2007 etc.), are launching and/or helping many other initiatives. As we achieve greater interoperability between modelling tools, we will also achieve a future wherein different open source and open access modelling tools will be increasingly connected and adapted towards unified multi-risk model platforms and highly customised solutions. It was seen that many software tools could be improved by enabling user

  7. POSTMan (POST-translational modification analysis), a software application for PTM discovery.

    Science.gov (United States)

    Arntzen, Magnus Ø; Osland, Christoffer Leif; Raa, Christopher Rasch-Olsen; Kopperud, Reidun; Døskeland, Stein-Ove; Lewis, Aurélia E; D'Santos, Clive S

    2009-03-01

    Post-translationally modified peptides present in low concentrations are often not selected for CID, resulting in no sequence information for these peptides. We have developed a software POSTMan (POST-translational Modification analysis) allowing post-translationally modified peptides to be targeted for fragmentation. The software aligns LC-MS runs (MS(1) data) between individual runs or within a single run and isolates pairs of peptides which differ by a user defined mass difference (post-translationally modified peptides). The method was validated for acetylated peptides and allowed an assessment of even the basal protein phosphorylation of phenylalanine hydroxylase (PHA) in intact cells.
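
    The core pairing step described above (isolating peptide pairs that differ by a user-defined mass shift within retention-time and mass tolerances) can be sketched in a few lines. The feature layout, tolerances and example masses below are assumptions for illustration, not POSTMan's implementation; +42.0106 Da is the usual acetylation mass shift.

```python
# Minimal sketch: pair unmodified/modified peptide features that differ by a
# user-defined mass shift (e.g. acetylation, +42.0106 Da).

ACETYL = 42.0106  # Da

def find_ptm_pairs(features, delta_mass=ACETYL, mass_tol=0.01, rt_tol=5.0):
    """features: list of dicts with neutral 'mass' (Da) and retention time 'rt' (min).
    Returns (light, heavy) pairs consistent with the requested mass shift."""
    pairs = []
    ordered = sorted(features, key=lambda f: f["mass"])
    for i, light in enumerate(ordered):
        for heavy in ordered[i + 1:]:
            diff = heavy["mass"] - light["mass"]
            if diff > delta_mass + mass_tol:
                break                       # sorted by mass: no further matches
            if abs(diff - delta_mass) <= mass_tol and abs(heavy["rt"] - light["rt"]) <= rt_tol:
                pairs.append((light, heavy))
    return pairs

features = [
    {"id": "pep1",    "mass": 1479.752, "rt": 32.1},
    {"id": "pep1+ac", "mass": 1521.763, "rt": 34.0},
    {"id": "pep2",    "mass": 1888.921, "rt": 45.5},
]
for a, b in find_ptm_pairs(features):
    print(a["id"], "->", b["id"])   # pep1 -> pep1+ac
```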

  8. Application of Artificial Intelligence technology to the analysis and synthesis of reliable software systems

    Science.gov (United States)

    Wild, Christian; Eckhardt, Dave

    1987-01-01

    The development of a methodology for the production of highly reliable software is one of the greatest challenges facing the computer industry. Meeting this challenge will undoubtedly involve the integration of many technologies. This paper describes the use of Artificial Intelligence technologies in the automated analysis of the formal algebraic specifications of abstract data types. These technologies include symbolic execution of specifications using techniques of automated deduction and machine learning through the use of examples. On-going research into the role of knowledge representation and problem solving in the process of developing software is also discussed.

  9. Validation of a Video Analysis Software Package for Quantifying Movement Velocity in Resistance Exercises.

    Science.gov (United States)

    Sañudo, Borja; Rueda, David; Pozo-Cruz, Borja Del; de Hoyo, Moisés; Carrasco, Luis

    2016-10-01

    Sañudo, B, Rueda, D, del Pozo-Cruz, B, de Hoyo, M, and Carrasco, L. Validation of a video analysis software package for quantifying movement velocity in resistance exercises. J Strength Cond Res 30(10): 2934-2941, 2016-The aim of this study was to establish the validity of a video analysis software package in measuring mean propulsive velocity (MPV) and the maximal velocity during bench press. Twenty-one healthy males (21 ± 1 year) with weight training experience were recruited, and the MPV and the maximal velocity of the concentric phase (Vmax) were compared with a linear position transducer system during a standard bench press exercise. Participants performed a 1 repetition maximum test using the supine bench press exercise. The testing procedures involved the simultaneous assessment of bench press propulsive velocity using 2 kinematic (linear position transducer and semi-automated tracking software) systems. High Pearson's correlation coefficients for MPV and Vmax between both devices (r = 0.473 to 0.993) were observed. The intraclass correlation coefficients for barbell velocity data and the kinematic data obtained from video analysis were high (>0.79). In addition, the low coefficients of variation indicate that measurements had low variability. Finally, Bland-Altman plots with the limits of agreement of the MPV and Vmax with different loads showed a negative trend, which indicated that the video analysis had higher values than the linear transducer. In conclusion, this study has demonstrated that the software used for the video analysis was an easy to use and cost-effective tool with a very high degree of concurrent validity. This software can be used to evaluate changes in velocity of training load in resistance training, which may be important for the prescription and monitoring of training programmes.
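
    A minimal sketch of the agreement analysis reported above (Bland-Altman bias and 95% limits of agreement between mean propulsive velocity values from two measurement systems) is given below; the velocity values are invented placeholders, not the study's data.

```python
import numpy as np

def bland_altman(method_a, method_b):
    """Bias and 95% limits of agreement between two measurement systems."""
    a, b = np.asarray(method_a, dtype=float), np.asarray(method_b, dtype=float)
    diff = a - b
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)
    return bias, (bias - loa, bias + loa)

# Hypothetical MPV values (m/s) from a linear transducer vs. video analysis.
transducer = [0.78, 0.65, 0.52, 0.43, 0.36, 0.30]
video      = [0.80, 0.68, 0.54, 0.44, 0.38, 0.33]
bias, limits = bland_altman(transducer, video)
print(f"bias = {bias:.3f} m/s, limits of agreement = ({limits[0]:.3f}, {limits[1]:.3f})")
```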

  10. Finite element analysis of container ship's cargo hold using ANSYS and POSEIDON software

    Science.gov (United States)

    Tanny, Tania Tamiz; Akter, Naznin; Amin, Osman Md.

    2017-12-01

    Nowadays, ship structural analysis has become an integral part of preliminary ship design, providing further support for the development and detailed design of ship structures. Structural analyses of container ships' cargo holds are carried out to balance safety and capacity, as these ships are exposed to a high risk of structural damage during a voyage. Two different design methodologies have been considered for the structural analysis of a container ship's cargo hold. One is a rule-based methodology and the other is a more conventional software-based analysis. The rule-based analysis is performed with DNV GL's POSEIDON software and the conventional package-based analysis with the ANSYS structural module. Both methods have been applied to analyze mechanical responses of the model such as total deformation, stress-strain distribution, von Mises stress and fatigue, following different design bases and approaches, to provide guidance for further improvements in ship structural design.

  11. PROMETHEE Method and Sensitivity Analysis in the Software Application for the Support of Decision-Making

    Directory of Open Access Journals (Sweden)

    Petr Moldrik

    2008-01-01

    Full Text Available PROMETHEE is one of the methods of multi-criteria analysis (MCA). MCA, as the name itself indicates, deals with the evaluation of particular variants according to several criteria. The software application MCA8, developed to support multi-criteria decision-making, was upgraded with the PROMETHEE method and a graphic tool that enables sensitivity analysis. This analysis is used to ascertain how a given model output depends upon the input parameters. The MCA8 software application with the mentioned graphic upgrade was developed for solving multi-criteria decision tasks. In MCA8 it is possible to perform sensitivity analysis in a simple way, through column graphs: criteria significances (weights) can be changed directly in these column graphs and the resulting changes in the order of variants observed immediately.
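
    For readers unfamiliar with the method, a compact sketch of PROMETHEE II (net outranking flows) and of the weight-sweep style of sensitivity analysis described above is given below. The decision table, criteria and weights are invented for illustration, and only the 'usual' criterion is implemented here, which is one of the six preference functions the PROMETHEE family offers.

```python
import numpy as np

def promethee_ii(matrix, weights, maximize):
    """Net outranking flows (PROMETHEE II) with the 'usual' preference function.

    matrix   : alternatives x criteria performance table
    weights  : criterion weights (normalised internally)
    maximize : True for benefit criteria, False for cost criteria
    """
    m = np.asarray(matrix, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    n_alt = m.shape[0]
    phi = np.zeros(n_alt)
    for a in range(n_alt):
        for b in range(n_alt):
            if a == b:
                continue
            d = m[a] - m[b]
            d = np.where(maximize, d, -d)   # flip sign for cost criteria
            pref = w[d > 0].sum()           # usual criterion: P = 1 if strictly better
            phi[a] += pref
            phi[b] -= pref
    return phi / (n_alt - 1)

# Hypothetical decision table: 3 variants, 3 criteria (cost, quality, delivery time).
table = [[120, 8, 14],
         [100, 6, 10],
         [140, 9,  9]]
maximize = [False, True, False]
weights = [0.5, 0.3, 0.2]

print("ranking:", np.argsort(-promethee_ii(table, weights, maximize)))

# Simple sensitivity analysis: sweep the weight of the first criterion.
for w1 in (0.2, 0.5, 0.8):
    w = [w1, (1 - w1) * 0.6, (1 - w1) * 0.4]
    print(w1, np.argsort(-promethee_ii(table, w, maximize)))
```

    Sweeping the first weight in this toy example changes the ranking of the variants, which is exactly the behaviour the column-graph sensitivity tool is meant to expose interactively.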

  12. Bolted Ribs Analysis for the ITER Vacuum Vessel using Finite Element Submodelling Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Zarzalejos, José María, E-mail: jose.zarzalejos@ext.f4e.europa.eu [External at F4E, c/Josep Pla, n.2, Torres Diagonal Litoral, Edificio B3, E-08019, Barcelona (Spain); Fernández, Elena; Caixas, Joan; Bayón, Angel [F4E, c/Josep Pla, n.2, Torres Diagonal Litoral, Edificio B3, E-08019, Barcelona (Spain); Polo, Joaquín [Iberdrola Ingeniería y Construcción, Avenida de Manoteras 20, 28050 Madrid (Spain); Guirao, Julio [Numerical Analysis Technologies, S L., Marqués de San Esteban 52, Entlo, 33209 Gijon (Spain); García Cid, Javier [Iberdrola Ingeniería y Construcción, Avenida de Manoteras 20, 28050 Madrid (Spain); Rodríguez, Eduardo [Mechanical Engineering Department EPSIG, University of Oviedo, Gijon (Spain)

    2014-10-15

    Highlights: • The ITER Vacuum Vessel Bolted Ribs assemblies are modelled using Finite Elements. • Finite Element submodelling techniques are used. • Stress results are obtained for all the assemblies and a post-processing is performed. • All the elements of the assemblies are compliant with the regulatory provisions. • Submodelling is a time-efficient solution to verify the structural integrity of this type of structures. - Abstract: The ITER Vacuum Vessel (VV) primary function is to enclose the plasmas produced by the ITER Tokamak. Since it acts as the first radiological barrier of the plasma, it is classified as a class 2 welded box structure, according to RCC-MR 2007. The VV is made of an inner and an outer D-shape, 60 mm-thick double shell connected through thick massive bars (housings) and toroidal and poloidal structural stiffening ribs. In order to provide neutronic shielding to the ex-vessel components, the space between shells is filled with borated steel plates, called In-Wall Shielding (IWS) blocks, and water. In general, these blocks are connected to the IWS ribs which are connected to adjacent housings. The development of a Finite Element model of the ITER VV including all its components in detail is unaffordable from the computational point of view due to the large number of degrees of freedom it would require. This limitation can be overcome by using submodelling techniques to simulate the behaviour of the bolted ribs assemblies. Submodelling is a Finite Element technique which allows getting more accurate results in a given region of a coarse model by generating an independent, finer model of the region under study. In this paper, the methodology and several simulations of the VV bolted ribs assemblies using submodelling techniques are presented. A stress assessment has been performed for the elements involved in the assembly considering possible types of failure and including stress classification and categorization techniques to analyse

  13. Fluorescence Image Analyzer - FLIMA: software for quantitative analysis of fluorescence in situ hybridization.

    Science.gov (United States)

    Silva, H C M; Martins-Júnior, M M C; Ribeiro, L B; Matoso, D A

    2017-03-30

    The Fluorescence Image Analyzer (FLIMA) software was developed for the quantitative analysis of images generated by fluorescence in situ hybridization (FISH). Currently, FISH images are examined without a coefficient that enables a comparison between them. Through the GD Graphics Library, the FLIMA software calculates the number of pixels in the image and recognizes each color present. The coefficient generated by the algorithm shows the percentage of marks (probes) hybridized on the chromosomes. This software can be used for any type of image generated by a fluorescence microscope and is able to quantify digoxigenin probes exhibiting a red color, biotin probes exhibiting a green color, and double-FISH probes (digoxigenin and biotin used together), where the white color is displayed.
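
    The kind of per-colour pixel accounting described above can be illustrated with the short sketch below. It is not FLIMA's algorithm (FLIMA uses the GD Graphics Library); the intensity threshold and the channel rules are assumptions made only for the example.

      import numpy as np
      from PIL import Image   # Pillow

      def fish_signal_percentages(path, thr=100):
          """Rough per-colour pixel percentages for a FISH image.
          thr and the channel rules below are illustrative, not FLIMA's algorithm."""
          rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.int32)
          r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
          red   = (r > thr) & (g <= thr) & (b <= thr)   # digoxigenin-type probes
          green = (g > thr) & (r <= thr) & (b <= thr)   # biotin-type probes
          white = (r > thr) & (g > thr) & (b > thr)     # double-FISH (both probes together)
          total = rgb.shape[0] * rgb.shape[1]
          return {name: 100.0 * mask.sum() / total
                  for name, mask in (("red", red), ("green", green), ("white", white))}

      # Example (hypothetical file name): print(fish_signal_percentages("metaphase_spread.png"))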

  14. Fuzzy system for risk analysis in software projects through the attributes of the quality standard ISO 25000

    Directory of Open Access Journals (Sweden)

    Chau Sen Shia

    2014-02-01

    Full Text Available With the growth in demand for products and services in the IT area, companies encounter difficulties in establishing a metric or measure of service quality that addresses qualitative values measurably in their planning. In this work, fuzzy logic, the SQuaRE standard (measurement of the quality of software products), the Likert scale, the GQM method (Goal-Question-Metric, an indicator of software quality) and Boehm's project risk analysis model were used to assess the quality of services and support decision-making, according to the demands and requests for software development. With the aim of improving the quality of the services provided, the application is used to integrate the team and follow the life cycle of a project from its initial phase, and to assist in the comparison with the proposed schedule during requirements elicitation.

  15. DA+ data acquisition and analysis software at the Swiss Light Source macromolecular crystallography beamlines.

    Science.gov (United States)

    Wojdyla, Justyna Aleksandra; Kaminski, Jakub W; Panepucci, Ezequiel; Ebner, Simon; Wang, Xiaoqiang; Gabadinho, Jose; Wang, Meitian

    2018-01-01

    Data acquisition software is an essential component of modern macromolecular crystallography (MX) beamlines, enabling efficient use of beam time at synchrotron facilities. Developed at the Paul Scherrer Institute, the DA+ data acquisition software is implemented at all three Swiss Light Source (SLS) MX beamlines. DA+ consists of distributed services and components written in Python and Java, which communicate via messaging and streaming technologies. The major components of DA+ are the user interface, acquisition engine, online processing and database. Immediate data quality feedback is achieved with distributed automatic data analysis routines. The software architecture enables exploration of the full potential of the latest instrumentation at the SLS MX beamlines, such as the SmarGon goniometer and the EIGER X 16M detector, and development of new data collection methods.

  16. FEATUREOUS: AN INTEGRATED ENVIRONMENT FOR FEATURE-CENTRIC ANALYSIS AND MODIFICATION OF OBJECT-ORIENTED SOFTWARE

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2011-01-01

    The decentralized nature of collaborations between objects in object-oriented software makes it difficult to understand the implementations of user-observable program features and their respective interdependencies. As feature-centric program understanding and modification are essential during software maintenance and evolution, this situation needs to change. In this paper, we present Featureous, an integrated development environment built on top of the NetBeans IDE that facilitates feature-centric analysis of object-oriented software. Our integrated development environment encompasses a lightweight feature location mechanism, a number of reusable analytical views, and necessary APIs for supporting future extensions. The base of the integrated development environment is a conceptual framework comprising three complementary dimensions of comprehension: perspective, abstraction...

  17. An analysis of type F2 software measurement standards for profile surface texture parameters

    Science.gov (United States)

    Todhunter, L. D.; Leach, R. K.; Lawes, S. D. A.; Blateyron, F.

    2017-06-01

    This paper reports on an in-depth analysis of ISO 5436 part 2 type F2 reference software for the calculation of profile surface texture parameters that has been performed on the input, implementation and output results of the reference software developed by the National Physical Laboratory (NPL), the National Institute of Standards and Technology (NIST) and Physikalisch-Technische Bundesanstalt (PTB). Surface texture parameters have been calculated for a selection of 17 test data files obtained from the type F1 reference data sets on offer from NPL and NIST. The surface texture parameter calculation results show some disagreements between the software methods of the National Metrology Institutes. These disagreements have been investigated further, and some potential explanations are given.
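
    To make the kind of calculation such type F2 reference software performs concrete, the sketch below evaluates two common profile parameters, Ra and Rq, directly from height data. It is a deliberate simplification: it assumes an already-filtered roughness profile and omits the ISO filtration chain that real reference implementations must apply.

      import numpy as np

      def ra_rq(profile_heights):
          """Arithmetic-mean (Ra) and root-mean-square (Rq) roughness of a profile.
          Assumes z already contains only the roughness component (profile filtered)."""
          z = np.asarray(profile_heights, dtype=float)
          z = z - z.mean()                    # deviations from the mean line
          ra = np.mean(np.abs(z))             # Ra: mean absolute deviation
          rq = np.sqrt(np.mean(z ** 2))       # Rq: RMS deviation
          return ra, rq

      z = 0.8 * np.sin(np.linspace(0, 20 * np.pi, 4000))   # synthetic test profile, micrometres
      print(ra_rq(z))   # for a pure sinusoid of amplitude A, Ra is about 2A/pi and Rq about A/sqrt(2)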

  18. Artifacts in Quantitative analysis of myocardial perfusion SPECT, using Cedars-Sinai QPS Software.

    Science.gov (United States)

    Malek, Hadi; Yaghoobi, Nahid; Hedayati, Raheleh

    2017-04-01

    Quantitative analysis of myocardial perfusion single-photon emission computed tomography (SPECT) images is increasingly applied in modern nuclear cardiology practice, assisting in the interpretation of myocardial perfusion imaging (MPI). There are several extensively validated state-of-the-art software packages, including QPS (Cedars-Sinai), Corridor 4DM (University of Michigan) and the Emory Cardiac Toolbox (Emory University), providing highly accurate and reproducible data. However, these software packages may suffer from potential artifacts related to patient or technical factors. By recognizing the source of such artifacts, the interpreting physician can avoid misinterpretation of the MPI study. In this review, we discuss some of the technical pitfalls that may occur in the Quantitative Perfusion SPECT software (QPS, Cedars-Sinai Medical Center).

  19. MultiPSQ: a software solution for the analysis of diagnostic n-plexed pyrosequencing reactions.

    Directory of Open Access Journals (Sweden)

    Piotr Wojtek Dabrowski

    Full Text Available BACKGROUND: Pyrosequencing can be applied for Single-Nucleotide-Polymorphism (SNP)-based pathogen typing or for providing sequence information of short DNA stretches. However, for some pathogens, molecular typing cannot be performed relying on a single SNP or short sequence stretch, necessitating the consideration of several genomic regions. A promising rapid approach is the simultaneous application of multiple sequencing primers, called multiplex pyrosequencing. These primers generate a fingerprint-pyrogram which is constituted by the sum of all individual pyrograms originating from each primer used. METHODS: To improve pyrosequencing-based pathogen typing, we have developed the software tool MultiPSQ that expedites the analysis and evaluation of multiplex-pyrograms. As a proof of concept, a multiplex pyrosequencing assay for the typing of orthopoxviruses was developed to analyse clinical samples diagnosed in the German Consultant Laboratory for Poxviruses. RESULTS: The software tool MultiPSQ enabled the analysis of multiplex-pyrograms originating from various pyrosequencing primers. Thus several target regions can be used for pathogen typing based on pyrosequencing. As shown with a proof of concept assay, SNPs present in different orthopoxvirus strains could be identified correctly with two primers by MultiPSQ. CONCLUSIONS: Software currently available is restricted to a fixed number of SNPs and sequencing primers, severely limiting the usefulness of this technique. In contrast, our novel software MultiPSQ allows analysis of data from multiplex pyrosequencing assays that contain any number of sequencing primers covering any number of polymorphisms.

  20. Comparative analysis of statistical software products for the qualifying examination of plant varieties suitable for dissemination

    Directory of Open Access Journals (Sweden)

    Н. В. Лещук

    2017-12-01

    Full Text Available Purpose. To define statistical methods and tools (application packages) for creating a decision support system (DSS) for the qualifying examination of plant varieties suitable for dissemination (VSD) in the context of data processing tasks, and to substantiate the selection of software for processing statistical data from the field and laboratory investigations included in the qualifying examination for VSD. Methods. An analytical method based on the comparison of descriptive and multivariate statistics and of data-mining tools applied to the data obtained during the qualifying examination for VSD; a comparative analysis of software tools for processing statistical data in order to prepare proposals for the final decision on a plant variety application. The tasks included in the decision support system for the qualifying examination of candidate varieties for VSD were decomposed. Results. The statistical package SPSS, the analysis package included in MS Excel and the programming language R were compared against the following criteria: interface usability, functionality, quality of presentation of calculation results, clarity of graphical information and software cost. These packages are widely used worldwide for statistical data processing and have similar functions for calculating statistics. Conclusion. The VSD tasks were separated out, with recommendations on which of the investigated tools to use for each. The programming language R is the recommended tool. The main advantage of R compared with the package IBM SPSS Statistics is that R is open-source software.

  1. MIDAS: software for analysis and visualisation of interallelic disequilibrium between multiallelic markers

    Directory of Open Access Journals (Sweden)

    Day Ian NM

    2006-04-01

    Full Text Available Abstract Background Various software tools are available for the display of pairwise linkage disequilibrium across multiple single nucleotide polymorphisms. The HapMap project also presents these graphics within their website. However, these approaches are limited in their use of data from multiallelic markers and provide limited information in a graphical form. Results We have developed a software package (MIDAS – Multiallelic Interallelic Disequilibrium Analysis Software) for the estimation and graphical display of interallelic linkage disequilibrium. Linkage disequilibrium is analysed for each allelic combination (of one allele from each of two loci), between all pairwise combinations of any type of multiallelic loci in a contig (or any set of many loci), including single nucleotide polymorphisms, microsatellites, minisatellites and haplotypes. Data are presented graphically in a novel and informative way, and can also be exported in tabular form for other analyses. This approach facilitates visualisation of patterns of linkage disequilibrium across genomic regions, analysis of the relationships between different alleles of multiallelic markers and inferences about patterns of evolution and selection. Conclusion MIDAS is a linkage disequilibrium analysis program with a comprehensive graphical user interface providing novel views of patterns of linkage disequilibrium between all types of multiallelic and biallelic markers. Availability Available from http://www.genes.org.uk/software/midas and http://www.sgel.humgen.soton.ac.uk/midas
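
    For readers unfamiliar with the quantities such software displays, the sketch below computes the interallelic statistics D, D' and r-squared for one allele from each of two loci. It is not MIDAS code: only the case of known haplotype and allele frequencies is shown, and the example frequencies are hypothetical.

      def interallelic_ld(p_ab, p_a, p_b):
          """D, D' and r^2 for one allele from each of two loci,
          given the haplotype frequency p_ab and allele frequencies p_a, p_b."""
          d = p_ab - p_a * p_b                               # raw disequilibrium coefficient
          if d >= 0:
              d_max = min(p_a * (1 - p_b), (1 - p_a) * p_b)  # largest D possible given allele frequencies
          else:
              d_max = min(p_a * p_b, (1 - p_a) * (1 - p_b))
          d_prime = d / d_max if d_max > 0 else 0.0          # normalised D'
          r2 = d * d / (p_a * (1 - p_a) * p_b * (1 - p_b))   # squared correlation between the alleles
          return d, d_prime, r2

      print(interallelic_ld(p_ab=0.40, p_a=0.50, p_b=0.60))  # hypothetical frequencies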

  2. Meta-Analyst: software for meta-analysis of binary, continuous and diagnostic data

    Directory of Open Access Journals (Sweden)

    Schmid Christopher H

    2009-12-01

    Full Text Available Abstract Background Meta-analysis is increasingly used as a key source of evidence synthesis to inform clinical practice. The theory and statistical foundations of meta-analysis continually evolve, providing solutions to many new and challenging problems. In practice, most meta-analyses are performed in general statistical packages or dedicated meta-analysis programs. Results Herein, we introduce Meta-Analyst, a novel, powerful, intuitive, and free meta-analysis program for the meta-analysis of a variety of problems. Meta-Analyst is implemented in C# atop of the Microsoft .NET framework, and features a graphical user interface. The software performs several meta-analysis and meta-regression models for binary and continuous outcomes, as well as analyses for diagnostic and prognostic test studies in the frequentist and Bayesian frameworks. Moreover, Meta-Analyst includes a flexible tool to edit and customize generated meta-analysis graphs (e.g., forest plots and provides output in many formats (images, Adobe PDF, Microsoft Word-ready RTF. The software architecture employed allows for rapid changes to be made to either the Graphical User Interface (GUI or to the analytic modules. We verified the numerical precision of Meta-Analyst by comparing its output with that from standard meta-analysis routines in Stata over a large database of 11,803 meta-analyses of binary outcome data, and 6,881 meta-analyses of continuous outcome data from the Cochrane Library of Systematic Reviews. Results from analyses of diagnostic and prognostic test studies have been verified in a limited number of meta-analyses versus MetaDisc and MetaTest. Bayesian statistical analyses use the OpenBUGS calculation engine (and are thus as accurate as the standalone OpenBUGS software. Conclusion We have developed and validated a new program for conducting meta-analyses that combines the advantages of existing software for this task.
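
    As a reminder of the arithmetic underlying the simplest model that such programs offer, the following sketch performs a fixed-effect inverse-variance meta-analysis with Cochran's Q. It is not Meta-Analyst code, and the study effects and standard errors are hypothetical values chosen only for the example.

      import numpy as np

      def fixed_effect_meta(effects, std_errs):
          """Inverse-variance fixed-effect pooled estimate with 95% CI and Cochran's Q."""
          y = np.asarray(effects, float)
          w = 1.0 / np.asarray(std_errs, float) ** 2      # inverse-variance weights
          pooled = np.sum(w * y) / np.sum(w)              # weighted mean effect
          se = np.sqrt(1.0 / np.sum(w))                   # standard error of the pooled effect
          q = np.sum(w * (y - pooled) ** 2)               # heterogeneity statistic
          return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), q

      # Hypothetical log odds ratios and their standard errors from three studies
      print(fixed_effect_meta([-0.30, -0.10, -0.45], [0.12, 0.20, 0.15]))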

  3. Potential field continuation: a comparative analysis of three different types of software

    OpenAIRE

    M. Loddo; Ciminale, M.

    1998-01-01

    DARING.F is a new Fortran77 computer program which has been developed to perform the continuation of potential field data between arbitrary surfaces. The implemented equivalent source algorithm inverts a system of linear equations using a sparse matrix. A comparative analysis between the performance of this software and that of two computer programs (named UPWARD.F and UPNEW.F) previously written by the same authors is carried out. As a result of this analysis, some useful and important sugg...

  4. Progress on the development of automated data analysis algorithms and software for ultrasonic inspection of composites

    Science.gov (United States)

    Aldrin, John C.; Coughlin, Chris; Forsyth, David S.; Welter, John T.

    2014-02-01

    Progress is presented on the development and implementation of automated data analysis (ADA) software to address the burden in interpreting ultrasonic inspection data for large composite structures. The automated data analysis algorithm is presented in detail, which follows standard procedures for analyzing signals for time-of-flight indications and backwall amplitude dropout. ADA processing results are presented for test specimens that include inserted materials and discontinuities produced under poor manufacturing conditions.

  5. Software Analysis of New Space Gravity Data for Geophysics and Climate Research

    Science.gov (United States)

    Deese, Rupert; Ivins, Erik R.; Fielding, Eric J.

    2012-01-01

    Both the Gravity Recovery and Climate Experiment (GRACE) and Gravity field and steady-state Ocean Circulation Explorer (GOCE) satellites are returning rich data for the study of the solid earth, the oceans, and the climate. Current software analysis tools do not provide researchers with the ease and flexibility required to make full use of these data. We evaluate the capabilities and shortcomings of existing software tools including Mathematica, the GOCE User Toolbox, the ICGEM's (International Center for Global Earth Models) web server, and Tesseroids. Using existing tools as necessary, we design and implement software with the capability to produce gridded data and publication-quality renderings from raw gravity data. The straightforward software interface marks an improvement over previously existing tools and makes new space gravity data more useful to researchers. Using the software we calculate Bouguer anomalies of the gravity tensor's vertical component in the Gulf of Mexico, Antarctica, and the 2010 Maule earthquake region. These maps identify promising areas for future research.

  6. Efficiency of Software Testing Techniques: A Controlled Experiment Replication and Network Meta-analysis

    Directory of Open Access Journals (Sweden)

    Omar S. Gómez

    2017-07-01

    Full Text Available Background: Common approaches to software verification include static testing techniques, such as code reading, and dynamic testing techniques, such as black-box and white-box testing. Objective: With the aim of gaining a better understanding of software testing techniques, a controlled experiment replication and the synthesis of previous experiments which examine the efficiency of code reading, black-box and white-box testing techniques were conducted. Method: The replication reported here is composed of four experiments in which instrumented programs were used. Participants randomly applied one of the techniques to one of the instrumented programs. The outcomes were synthesized with seven experiments using the method of network meta-analysis (NMA). Results: No significant differences in the efficiency of the techniques were observed. However, it was discovered that the instrumented programs had a significant effect on the efficiency. The NMA results suggest that the black-box and white-box techniques behave alike; and the efficiency of code reading seems to be sensitive to other factors. Conclusion: Taking into account these findings, the Authors suggest that prior to carrying out software verification activities, software engineers should have a clear understanding of the software product to be verified; they can apply either black-box or white-box testing techniques as they yield similar defect detection rates.

  7. STATIC CODE ANALYSIS FOR SOFTWARE QUALITY IMPROVEMENT: A CASE STUDY IN BCI FRAMEWORK DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Indar Sugiarto

    2008-01-01

    Full Text Available This paper shows how a systematic approach to software testing using the static code analysis method can be used to improve the software quality of a BCI framework. The method is best performed during the development phase of framework programs. In the proposed approach, we evaluate several software metrics which are based on the principles of object-oriented design. Since such a method depends on the underlying programming language, we describe it in terms of C++ programming, with the Qt platform also currently being used. One of the most important metrics is the so-called software complexity. Applying the software complexity calculation using both the McCabe and Halstead methods to the BCI framework, which consists of two important types of BCI (SSVEP and P300), we found two classes in the framework which are very complex and prone to violating the cohesion principle of OOP. The other metrics fit the criteria of the proposed framework aspects: MPC is less than 20, average complexity is around 5, and the maximum depth is below 10 blocks. Such variables are considered very important when further developing the BCI framework in the future.
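
    The McCabe metric mentioned above is the cyclomatic complexity V(G) = E - N + 2P of a control-flow graph with E edges, N nodes and P connected components. The sketch below shows only this final arithmetic on a hand-built graph; extracting a control-flow graph from real C++ sources (the hard part that static-analysis tools automate) is not shown, and the example graph is hypothetical.

      def cyclomatic_complexity(edges, num_components=1):
          """McCabe's V(G) = E - N + 2P for a control-flow graph given as an edge list."""
          nodes = {n for edge in edges for n in edge}
          return len(edges) - len(nodes) + 2 * num_components

      # Control-flow graph of a function with one if/else and one loop
      cfg = [("entry", "if"), ("if", "then"), ("if", "else"),
             ("then", "loop"), ("else", "loop"),
             ("loop", "loop_body"), ("loop_body", "loop"), ("loop", "exit")]
      print(cyclomatic_complexity(cfg))   # -> 3 (two decision points plus one)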

  8. A Detailed Analysis over Iranian EFL Teachers’ Beliefs towards Using Pronunciation Software in Teaching English Pronunciation

    Directory of Open Access Journals (Sweden)

    Abbas Pourhosein Gilakjani

    2017-10-01

    Full Text Available One of the useful ways of teaching English pronunciation is the application of pronunciation software. Pronunciation software supplies a personal and stress-free setting for both teachers and learners through which they can have infinite input, exercise at their own pace, and get feedback through the automatic speech recognition. This study investigated the Iranian teachers’ beliefs towards utilizing pronunciation software in English pronunciation instruction. The researchers applied a qualitative method to investigate the impact of pronunciation software on teachers’ pronunciation instruction. The researchers used a belief questionnaire to choose teachers for the semi-structured interview and distributed it to 28 teachers at the two Islamic Azad Universities of Iran. The researchers chose 14 of them based on their answers to the belief questionnaire. Therefore, these 14 teachers participated in the qualitative aspect of this study. The researchers collected data and analyzed them. Qualitative data analysis was done through reducing and displaying the collected data and drawing conclusions from the collected data. The findings obtained from the qualitative research demonstrated that Iranian university teachers held positive beliefs towards the application of pronunciation software in pronunciation instruction. These positive beliefs provided teaching and learning opportunities and appropriate resources for teachers, met their teaching needs, and solved some of their pronunciation difficulties.

  9. Leveraging intellectual capital through Lewin's Force Field Analysis: The case of software development companies

    Directory of Open Access Journals (Sweden)

    Alexandru Capatina

    2017-09-01

    Full Text Available This article presents an original conceptual framework for the strategic management of intellectual capital assets in software development companies. The framework is based on Lewin's Force Field Analysis. The framework makes it possible to assess software company managers’ opinions regarding the way driving and restraining forces affect the pillars of intellectual capital. The capacity to adapt to change is vital for companies in knowledge-intensive industries. Accordingly, this study examined a sample of 74 Romanian software development companies. The aim was to help companies benefit from managing the driving and restraining forces acting upon the pillars of intellectual capital (human, structural, and relational). The effects of the driving forces, quantified by PathMaker software's Force Field Tool, were observed to be greater than the restraining forces for each pillar of intellectual capital. This paper contributes by showing the explanatory power of this framework. The framework thus offers a tool that helps managers drive change in their organizations through effective intellectual capital management. Furthermore, this article describes how to encourage the implementation of changes that create value for software development companies.

  10. Ultrastructural analysis of small blood vessels in skin biopsies in CADASIL

    Directory of Open Access Journals (Sweden)

    Lačković Vesna

    2008-01-01

    Full Text Available Cerebral autosomal dominant arteriopathy with subcortical infarcts and leukoencephalopathy (CADASIL) is an inherited small- and medium-artery disease of the brain caused by mutation of the Notch3 gene. Very often, this disease is misdiagnosed. We examined skin biopsies in two members of the first discovered Serbian family affected by CADASIL. Electron microscopy showed that skin blood vessels of both patients contain numerous deposits of granular osmiophilic material (GOM) around vascular smooth muscle cells (VSMCs). We observed degeneration of VSMCs, reorganization of their cytoskeleton and dense bodies, disruption of myoendothelial contacts, and apoptosis. Our results suggest that the presence of GOM in small skin arteries represents a specific marker in diagnosis of CADASIL.

  11. Ex-vessel break in ITER divertor cooling loop analysis with the ECART code

    CERN Document Server

    Cambi, G; Parozzi, F; Porfiri, MT

    2003-01-01

    A hypothetical double-ended pipe rupture in the ex-vessel section of the International Thermonuclear Experimental Reactor (ITER) divertor primary heat transfer system during pulse operation has been assessed using the nuclear source term ECART code. That code was originally designed and validated for traditional nuclear power plant safety analyses, and has been internationally recognized as a relevant nuclear source term code for nuclear fission plants. It permits the simulation of chemical reactions and transport of radioactive gases and aerosols under two-phase flow transients in generic flow systems, using a built-in thermal-hydraulic model. A comparison with the results given in the ITER Generic Site Safety Report, obtained using a thermal-hydraulic system code (ATHENA), a containment code (INTRA) and an aerosol transportation code (NAUA), in a sequential way, is also presented and discussed.

  12. Development of a surrogate model for analysis of ex-vessel steam explosion in Nordic type BWRs

    Energy Technology Data Exchange (ETDEWEB)

    Grishchenko, Dmitry, E-mail: dmitry@safety.sci.kth.se; Basso, Simone, E-mail: simoneb@kth.se; Kudinov, Pavel, E-mail: pavel@safety.sci.kth.se

    2016-12-15

    Highlights: • Severe accident. • Steam explosion. • Surrogate model. • Sensitivity study. • Artificial neural networks. - Abstract: The severe accident mitigation strategy adopted in Nordic type Boiling Water Reactors (BWRs) employs ex-vessel core melt cooling in a deep pool of water below the reactor vessel. Energetic fuel–coolant interaction (steam explosion) can occur during molten core release into water. Dynamic loads can threaten containment integrity, increasing the risk of fission product release to the environment. Comprehensive uncertainty analysis is necessary in order to assess the risks. Computational costs of the existing fuel–coolant interaction (FCI) codes are often prohibitive for addressing the uncertainties, including the effect of stochastic triggering time. This paper discusses the development of a computationally efficient surrogate model (SM) for prediction of statistical characteristics of steam explosion impulses in Nordic BWRs. The TEXAS-V code was used as the Full Model (FM) for the calculation of explosion impulses. The surrogate model was developed using artificial neural networks (ANNs) and the database of FM solutions. Statistical analysis was employed in order to treat the chaotic response of the steam explosion impulse to variations in the triggering time. Details of the FM and SM implementation and their verification are discussed in the paper.
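
    The general pattern of mapping a database of full-model runs to a neural-network surrogate can be sketched as below. This is not the authors' implementation: scikit-learn's MLPRegressor is used as a generic stand-in for the ANN, and the input columns, synthetic data and hyperparameters are all assumptions made for illustration only.

      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.preprocessing import StandardScaler
      from sklearn.pipeline import make_pipeline
      from sklearn.model_selection import train_test_split

      # Placeholder "full model" database: each row is one scenario (e.g. melt superheat,
      # release diameter, water subcooling, pool depth, triggering time); target = impulse.
      rng = np.random.default_rng(0)
      X = rng.uniform(size=(2000, 5))                                   # synthetic input matrix
      y = np.sin(3 * X[:, 0]) + X[:, 1] * X[:, 4] + 0.05 * rng.normal(size=2000)  # synthetic impulses

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
      surrogate = make_pipeline(StandardScaler(),
                                MLPRegressor(hidden_layer_sizes=(64, 64),
                                             max_iter=5000, random_state=0))
      surrogate.fit(X_tr, y_tr)                                         # train the cheap surrogate
      print("hold-out R^2:", surrogate.score(X_te, y_te))               # quick fidelity check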

  13. Hepatic resection using a bipolar vessel sealing device: technical and histological analysis.

    Science.gov (United States)

    Romano, Fabrizio; Garancini, Mattia; Caprotti, Roberto; Bovo, Giorgio; Conti, Matteo; Perego, Elisa; Uggeri, Franco

    2007-01-01

    Blood loss and bile leakage are well-known risk factors for morbidity and mortality during liver resection. Bleeding usually occurs during parenchymal transection, and surgical technique should be considered an important factor in preventing intraoperative and postoperative complications. Many approaches and devices have been developed to limit bleeding and bile leakage. The aim of the present study was to determine whether a bipolar vessel sealing device allows a safe and careful liver transection without routine inflow occlusion, achieving a satisfactory hemostasis and bile stasis, thus reducing blood loss and bile leak and related complications. A total of 50 consecutive patients (24 males, 26 females, with a mean age of 57 years) underwent major and minor hepatic resections using a bipolar vessel sealing device. A clamp crushing technique followed by energy application was used to perform the parenchymal transection. Inflow occlusion was used when necessary to control blood loss but not as a routine. No other devices were applied to achieve hemostasis. The instrument was effective in 45 patients and failed to achieve hemostasis in 5 cases, all of whom had a cirrhotic liver. Median blood loss was 490 ml (range 100-2500 ml) and intraoperative blood transfusions were required in eight cases (16%). Mean operative time was 178 min (range 50-315 min). Inflow occlusion was necessary in 16 (32%) patients. The postoperative complication rate was 24%, with a postoperative hemorrhage in a cirrhotic patient. There was no clinical evidence of bile leak or procedure-related abdominal abscess. We conclude that the device is a useful tool in standard liver resection, achieving good hemostasis and bile stasis in patients with normal liver parenchyma, but its use should be avoided in cirrhotic patients.

  14. The review of the modeling methods and numerical analysis software for nanotechnology in material science

    Directory of Open Access Journals (Sweden)

    SMIRNOV Vladimir Alexeevich

    2014-10-01

    Full Text Available Due to the high demand for building materials with a universal set of properties that extends their application area, research efforts are focusing on nanotechnology in materials science. A rational combination of theoretical studies, mathematical modeling and simulation can reduce resource and time consumption when nanomodified materials are being developed. The development of a composite material is based on the principles of system analysis, which requires the determination of criteria and a further classification of modeling methods. In this work the criteria of spatial scale, dominant type of interaction and heterogeneity are used for such a classification. The presented classification became a framework for the analysis of methods and software which can be applied to the development of building materials. For each of the selected spatial levels - from the atomistic level to the macrostructural level of a constructional coarse-grained composite - existing theories, modeling algorithms and tools have been considered. At the level of the macrostructure, which is formed under the influence of gravity and exterior forces, probabilistic and geometrical methods can be applied to study the obtained structure. The existing models are suitable for packing density analysis and the solution of percolation problems at the macroscopic level, but there are still no software tools which could be applied in nanotechnology to carry out systematic investigations. At the microstructure level it is possible to use the particle method along with probabilistic and statistical methods to explore structure formation, but available software tools are only partially suitable for numerical analysis of microstructure models. Therefore, modeling of the microstructure is rather complicated; the model has to include a pairwise interaction potential. After the model has been constructed and the parameters of the pairwise potential have been determined, many software packages for solution of ordinary

  15. Software Framework for Development of Web-GIS Systems for Analysis of Georeferenced Geophysical Data

    Science.gov (United States)

    Okladnikov, I.; Gordov, E. P.; Titov, A. G.

    2011-12-01

    Georeferenced datasets (meteorological databases, modeling and reanalysis results, remote sensing products, etc.) are currently actively used in numerous applications, including modeling, interpretation and forecasting of climatic and ecosystem changes on various spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their size, which may reach tens of terabytes for a single dataset, present-day studies in the area of climate and environmental change require special software support. A dedicated software framework for the rapid development of information-computational systems providing such support, based on Web-GIS technologies, has been created. The software framework consists of three basic parts: a computational kernel developed using ITTVIS Interactive Data Language (IDL), a set of PHP controllers run within a specialized web portal, and a JavaScript class library for the development of typical components of a web mapping application graphical user interface (GUI) based on AJAX technology. The computational kernel comprises a number of modules for dataset access, mathematical and statistical data analysis, and visualization of results. The specialized web portal consists of the Apache web server, the OGC-compliant GeoServer software, which is used as a base for presenting cartographical information over the Web, and a set of PHP controllers implementing the web mapping application logic and governing the computational kernel. The JavaScript library aimed at graphical user interface development is based on the GeoExt library, combining the ExtJS framework and OpenLayers software. Based on the software framework, an information-computational system for complex analysis of large georeferenced data archives was developed. Structured environmental datasets available for processing now include two editions of the NCEP/NCAR Reanalysis, the JMA/CRIEPI JRA-25 Reanalysis, the ECMWF ERA-40 Reanalysis, the ECMWF ERA Interim Reanalysis, and the MRI/JMA APHRODITE's Water Resources Project Reanalysis

  16. A new method and software for quantitative analysis of continuous intracranial pressure recordings.

    Science.gov (United States)

    Eide, P K; Fremming, A D

    2001-12-01

    Computer software utilising a new method for quantitative analysis of intracranial pressure (ICP) was developed to provide a more accurate analysis of continuously recorded ICP. Intracranial pressure curves were analysed by the software to explore the relationship between mean ICP and the presence of ICP elevations. The Sensometrics Pressure Analyser (version 1.2) software provides a quantitative analysis of the ICP curve, presenting the ICP recordings as a matrix of numbers of ICP elevations of different levels (e.g. 20 or 30 or 40 mmHg) and durations (e.g. 0.5, 5 or 10 minutes). The number of ICP elevations may be standardised by calculating the number of elevations during, for instance, a 10-hour period. The computer software was used to retrospectively analyse the ICP curves in our first consecutive 127 patients undergoing continuous 24-hour ICP monitoring during the two-year period from February 1997 to December 1998. The indications for ICP monitoring were suspected hydrocephalus, craniosynostosis or shunt failure. Analysis of the ICP curves revealed a rather weak relationship between mean ICP and the number of apparently abnormal ICP elevations (that is, elevations of 20 mmHg or above). Abnormal ICP elevations were present in a relatively high proportion of cases with a normal mean ICP below 10 mmHg, or a borderline mean ICP between 10 and 15 mmHg. In addition, the ICP data of two cases are presented suggesting that mean ICP may be an inaccurate measure of ICP. The results of analysing ICP curves by means of this method and software reveal that calculation of ICP elevations of different levels and durations may represent a more accurate description of the ICP curve than calculation of mean ICP. The method may enhance the clinical application of ICP monitoring.
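
    Counting elevations of a given level and duration from a uniformly sampled recording can be sketched as follows. The Sensometrics algorithm itself is not described in this abstract, so the episode definition, sampling rate and synthetic data below are assumptions made purely for illustration.

      import numpy as np

      def count_icp_elevations(icp_mmhg, fs_hz, level=20.0, min_duration_s=300.0):
          """Count episodes where ICP stays at or above `level` for at least `min_duration_s`."""
          above = np.asarray(icp_mmhg) >= level
          count, run = 0, 0
          for flag in above:
              run = run + 1 if flag else 0
              if run == int(min_duration_s * fs_hz):   # episode has just reached the required duration
                  count += 1
          return count

      fs = 1.0                                          # 1 Hz sampling, hypothetical
      icp = np.full(36000, 12.0)                        # 10 hours of synthetic data at 12 mmHg
      icp[5000:5400] = 28.0                             # one 400-second elevation above 20 mmHg
      print(count_icp_elevations(icp, fs))              # -> 1; results can then be normalised per 10 h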

  17. Automated ultrasound edge-tracking software comparable to established semi-automated reference software for carotid intima-media thickness analysis.

    Science.gov (United States)

    Shenouda, Ninette; Proudfoot, Nicole A; Currie, Katharine D; Timmons, Brian W; MacDonald, Maureen J

    2017-04-26

    Many commercial ultrasound systems are now including automated analysis packages for the determination of carotid intima-media thickness (cIMT); however, details regarding their algorithms and methodology are not published. Few studies have compared their accuracy and reliability with previously established automated software, and those that have were in asymptomatic adults. Therefore, this study compared cIMT measures from a fully automated ultrasound edge-tracking software (EchoPAC PC, Version 110.0.2; GE Medical Systems, Horten, Norway) to an established semi-automated reference software (Artery Measurement System (AMS) II, Version 1.141; Gothenburg, Sweden) in 30 healthy preschool children (ages 3-5 years) and 27 adults with coronary artery disease (CAD; ages 48-81 years). For both groups, Bland-Altman plots revealed good agreement with a negligible mean cIMT difference of -0·03 mm. Software differences were statistically, but not clinically, significant for preschool images (P = 0·001) and were not significant for CAD images (P = 0·09). Intra- and interoperator repeatability was high and comparable between software for preschool images (ICC, 0·90-0·96; CV, 1·3-2·5%), but slightly higher with the automated ultrasound than the semi-automated reference software for CAD images (ICC, 0·98-0·99; CV, 1·4-2·0% versus ICC, 0·84-0·89; CV, 5·6-6·8%). These findings suggest that the automated ultrasound software produces valid cIMT values in healthy preschool children and adults with CAD. Automated ultrasound software may be useful for ensuring consistency among multisite research initiatives or large cohort studies involving repeated cIMT measures, particularly in adults with documented CAD. © 2017 Scandinavian Society of Clinical Physiology and Nuclear Medicine. Published by John Wiley & Sons Ltd.
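
    The bias and limits-of-agreement arithmetic behind the Bland-Altman comparison reported above is short enough to show directly; the paired values below are hypothetical and the function is a generic sketch, not the analysis script used in the study.

      import numpy as np

      def bland_altman(a, b):
          """Mean difference (bias) and 95% limits of agreement between paired measurements."""
          diff = np.asarray(a, float) - np.asarray(b, float)
          bias = diff.mean()
          sd = diff.std(ddof=1)
          return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

      # Hypothetical paired cIMT values (mm): automated vs. semi-automated software
      auto = [0.42, 0.45, 0.39, 0.50, 0.47]
      semi = [0.44, 0.47, 0.43, 0.52, 0.50]
      print(bland_altman(auto, semi))   # a negative bias means the first method reads lower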

  18. [Simple analysis on professor ZHANG Dao-Zong's academic thought of dredging the Governor Vessel and regulating mentality].

    Science.gov (United States)

    Cao, Yi; Li, Pei-Fang; Chen, Xing-Sheng

    2006-10-01

    This paper introduces Professor ZHANG Dao-zong's studies on the Governor Vessel, his academic thought of dredging the Governor Vessel and regulating mentality, and his clinical experience with the therapy of dredging the Governor Vessel and regulating mentality for apoplexy, epilepsy, vertigo, ankylosing spondylitis, traumatic paraplegia, child Tourette's disease, etc.

  19. [Development and practice evaluation of blood acid-base imbalance analysis software].

    Science.gov (United States)

    Chen, Bo; Huang, Haiying; Zhou, Qiang; Peng, Shan; Jia, Hongyu; Ji, Tianxing

    2014-11-01

    To develop computer software for blood gas and acid-base imbalance analysis that systematically, rapidly, accurately and automatically determines the type of acid-base imbalance, and to evaluate its clinical application. Using the VBA programming language, computer-aided diagnostic software for the judgment of acid-base balance was developed. The clinical data of 220 patients admitted to the Second Affiliated Hospital of Guangzhou Medical University were retrospectively analyzed. The arterial blood gas values [pH, HCO(3)(-), arterial partial pressure of carbon dioxide (PaCO₂)] and electrolyte data (Na⁺ and Cl⁻) were collected. The data were entered into the software for acid-base imbalance judgment. At the same time, the type of acid-base imbalance was determined manually using the H-H compensation formula. The consistency of the judgment results from the software and the manual calculation was evaluated, and the judgment times of the two methods were compared. The clinical diagnoses of the types of acid-base imbalance for the 220 patients were: 65 normal cases, 90 cases of the simple type, 41 cases of the mixed type, and 24 cases of the triplex type. Compared with the manual calculation, the accuracy of the software judgment was 100% for the normal and triplex types, 98.9% for the simple type and 78.0% for the mixed type, and the total accuracy was 95.5%. The Kappa value between the software and manual judgments was 0.935 (P=0.000), demonstrating very good consistency. The time for the software to determine acid-base imbalances was significantly shorter than that of the manual judgment (18.14 ± 3.80 s vs. 43.79 ± 23.86 s, t=7.466, P=0.000), so the software method was much faster than the manual method. Software judgment can replace manual judgment, being rapid, accurate and convenient; it can improve the work efficiency and quality of clinical doctors and has great
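
    Assuming the "H-H compensation formula" refers to the Henderson-Hasselbalch relation, the core arithmetic such software builds on can be sketched as below. The classifier shown is deliberately reduced: it only flags a primary disorder from pH, HCO3- and PaCO2 and does not reproduce the published software's full mixed- and triple-disorder logic or its compensation rules.

      import math

      def expected_ph(hco3_mmol_l, paco2_mmhg):
          """Henderson-Hasselbalch: pH = 6.1 + log10(HCO3- / (0.03 * PaCO2))."""
          return 6.1 + math.log10(hco3_mmol_l / (0.03 * paco2_mmhg))

      def primary_disorder(ph, hco3, paco2):
          """Very reduced classification of the primary acid-base disturbance (illustrative only)."""
          if ph < 7.35:
              return "metabolic acidosis" if hco3 < 22 else "respiratory acidosis"
          if ph > 7.45:
              return "metabolic alkalosis" if hco3 > 26 else "respiratory alkalosis"
          return "normal or fully compensated"

      print(round(expected_ph(24, 40), 2))    # -> about 7.40 for normal values
      print(primary_disorder(7.28, 15, 32))   # -> metabolic acidosis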

  20. Usability analysis of the tele-nursing call management software at HealthLink BC.

    Science.gov (United States)

    Hall, Simon A S; Kushniruk, Andre W; Borycki, Elizabeth M

    2011-01-01

    Usability engineering methods have been shown to be effective in identifying software problems that may lead to user operating inefficiencies, user errors, data encoding errors or far more serious health threatening consequences. This research project applied two complementary usability engineering analysis methods to a mature tele-nursing clinical call management software platform (a knowledgebase and an EMR product). Findings from the study revealed 100 discrete usability errors or problems. This research also introduced an adaptation of cognitive task analysis, with the development of a 'cognitive task screen-turn' analysis, which provided useful information about operating differences among users performing identical tasks that was particularly useful in revealing four unnecessary steps within the system.

  1. A Method for Digital Color Analysis of Spalted Wood Using Scion Image Software

    Directory of Open Access Journals (Sweden)

    Sara C. Robinson

    2009-02-01

    Full Text Available Color analysis of spalted wood surfaces requires a non-subjective, repeatable method for determining the percentage of pigmentation on the wood surface. Previously published methods used human visual perception with a square grid overlay to determine the percentage of surface pigmentation. Our new method uses Scion Image©, a graphical software program used for grayscale and color analysis, to separate fungal pigments from the wood background. These human interface processes render the wood block into HSV (hue, saturation, value) within the RGB color space, allowing subtle and drastic color changes to be visualized, selected and analyzed by the software. Analysis with Scion Image© allows for a faster, less subjective, and easily repeatable procedure that is superior to simple human visual perception.
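
    An HSV-threshold segmentation in the spirit of this method can be sketched in a few lines. This is not the Scion Image© workflow: the saturation and value cut-offs, the file name and the simple "coloured or dark equals pigment" rule are assumptions made only to illustrate the idea of computing a percent-pigmentation coefficient.

      import numpy as np
      from matplotlib.colors import rgb_to_hsv
      from PIL import Image

      def pigment_percentage(path, sat_min=0.35, val_max=0.55):
          """Percent of wood-surface pixels classified as fungal pigment.
          Pixels are called 'pigmented' when they are strongly coloured (high saturation)
          or dark (low value); both cut-offs are illustrative only."""
          rgb = np.asarray(Image.open(path).convert("RGB"), dtype=float) / 255.0
          hsv = rgb_to_hsv(rgb)                       # hue, saturation, value, each in [0, 1]
          pigmented = (hsv[..., 1] >= sat_min) | (hsv[..., 2] <= val_max)
          return 100.0 * pigmented.mean()

      # Example (hypothetical file name): print(pigment_percentage("spalted_block_face.png"))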

  2. Nonstationary Extreme Value Analysis in a Changing Climate: A Software Package

    Science.gov (United States)

    Cheng, L.; AghaKouchak, A.; Gilleland, E.

    2013-12-01

    Numerous studies show that climatic extremes have increased substantially in the second half of the 20th century. For this reason, analysis of extremes under a nonstationary assumption has received a great deal of attention. This paper presents a software package developed for estimation of return levels, return periods, and risks of climatic extremes in a changing climate. This MATLAB software package offers tools for analysis of climate extremes under both stationary and non-stationary assumptions. The Nonstationary Extreme Value Analysis (hereafter, NEVA) provides an efficient and generalized framework for analyzing extremes using Bayesian inference. NEVA estimates the extreme value parameters using a Differential Evolution Markov Chain (DE-MC), which utilizes the genetic algorithm Differential Evolution (DE) for global optimization over the real parameter space with the Markov Chain Monte Carlo (MCMC) approach and has the advantage of simplicity, speed of calculation and convergence over conventional MCMC. NEVA also offers the confidence interval and uncertainty bounds of estimated return levels based on the sampled parameters. NEVA integrates extreme value design concepts, data analysis tools, optimization and visualization, explicitly designed to facilitate the analysis of extremes in geosciences. The generalized input and output files of this software package make it attractive for users from across different fields. Both stationary and nonstationary components of the package are validated for a number of case studies using empirical return levels. The results show that NEVA reliably describes extremes and their return levels.
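
    As a point of reference for what NEVA generalises, a stationary GEV fit and return-level estimate can be obtained with SciPy as sketched below. NEVA's Bayesian DE-MC sampling, nonstationary covariates and uncertainty bounds are not reproduced here, and the block maxima are synthetic values generated only for the example.

      import numpy as np
      from scipy.stats import genextreme

      rng = np.random.default_rng(42)
      # Synthetic block maxima standing in for, e.g., 60 years of annual peak rainfall (mm)
      annual_maxima = genextreme.rvs(c=-0.1, loc=50.0, scale=10.0, size=60, random_state=rng)

      c, loc, scale = genextreme.fit(annual_maxima)           # stationary maximum-likelihood fit
      for T in (10, 50, 100):
          level = genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)   # T-year return level
          print(f"{T}-year return level: {level:.1f}")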

  3. PyElph - a software tool for gel images analysis and phylogenetics

    Directory of Open Access Journals (Sweden)

    Pavel Ana Brânduşa

    2012-01-01

    Full Text Available Abstract Background This paper presents PyElph, a software tool which automatically extracts data from gel images, computes the molecular weights of the analyzed molecules or fragments, compares DNA patterns which result from experiments with molecular genetic markers and, also, generates phylogenetic trees computed by five clustering methods, using the information extracted from the analyzed gel image. The software can be successfully used for population genetics, phylogenetics, taxonomic studies and other applications which require gel image analysis. Researchers and students working in molecular biology and genetics would benefit greatly from the proposed software because it is free, open source, easy to use, has a friendly Graphical User Interface and does not depend on specific image acquisition devices like other commercial programs with similar functionalities do. Results PyElph software tool is entirely implemented in Python which is a very popular programming language among the bioinformatics community. It provides a very friendly Graphical User Interface which was designed in six steps that gradually lead to the results. The user is guided through the following steps: image loading and preparation, lane detection, band detection, molecular weights computation based on a molecular weight marker, band matching and finally, the computation and visualization of phylogenetic trees. A strong point of the software is the visualization component for the processed data. The Graphical User Interface provides operations for image manipulation and highlights lanes, bands and band matching in the analyzed gel image. All the data and images generated in each step can be saved. The software has been tested on several DNA patterns obtained from experiments with different genetic markers. Examples of genetic markers which can be analyzed using PyElph are RFLP (Restriction Fragment Length Polymorphism), AFLP (Amplified Fragment Length Polymorphism), RAPD
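
    The molecular-weight computation step mentioned above is conventionally a log-linear calibration against the ladder: log10(fragment size) is fitted against migration distance and the fit is inverted for unknown bands. The sketch below shows only that calibration; it is not PyElph code, and the ladder sizes and distances are hypothetical.

      import numpy as np

      # Molecular weight ladder: known fragment sizes (bp) and their migration distances (px)
      ladder_bp   = np.array([1000, 750, 500, 250, 100])
      ladder_dist = np.array([120., 150., 195., 260., 330.])

      # log10(size) is approximately linear in migration distance
      slope, intercept = np.polyfit(ladder_dist, np.log10(ladder_bp), deg=1)

      def band_size(distance_px):
          """Estimate fragment size (bp) of an unknown band from its migration distance."""
          return 10 ** (slope * distance_px + intercept)

      print(round(band_size(230.)))   # an unknown band migrating between the 500 and 250 bp markers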

  4. Gardony Map Drawing Analyzer: Software for quantitative analysis of sketch maps.

    Science.gov (United States)

    Gardony, Aaron L; Taylor, Holly A; Brunyé, Tad T

    2016-03-01

    Sketch maps are effective tools for assessing spatial memory. However, despite their widespread use in cognitive science research, sketch map analysis techniques remain unstandardized and carry limitations. In the present article, we present the Gardony Map Drawing Analyzer (GMDA), an open-source software package for sketch map analysis. GMDA combines novel and established analysis techniques into a graphical user interface that permits rapid computational sketch map analysis. GMDA calculates GMDA-unique measures based on pairwise comparisons between landmarks, as well as bidimensional regression parameters (Friedman & Kohler, 2003), which together reflect sketch map quality at two levels: configural and individual landmark. The configural measures assess the overall landmark configuration and provide a whole-map analysis. Individual landmark measures, introduced in GMDA, assess individual landmark placement and indicate how individual landmarks contribute to the configural scores. Together, these measures provide a more complete psychometric picture of sketch map analysis, allowing for comparisons between sketch maps and between landmarks. The calculated measures reflect specific and cognitively relevant aspects of interlandmark spatial relationships, including distance and angular representation. GMDA supports complex environments (up to 48 landmarks) and two software modes that capture aspects of maps not addressed by existing techniques, such as landmark size and shape variation and interlandmark containment relationships. We describe the software and its operation and present a formal specification of calculation procedures for its unique measures. We then validate the software by demonstrating the capabilities and reliability of its measures using simulation and experimental data. The most recent version of GMDA is available at www.aarongardony.com/tools/map-drawing-analyzer.

  5. Probabilistic Fracture Mechanics Analysis of Boiling Water Reactor Vessel for Cool-Down and Low Temperature Over-Pressurization Transients

    Directory of Open Access Journals (Sweden)

    Jeong Soon Park

    2016-04-01

    Full Text Available The failure probabilities of the reactor pressure vessel (RPV) for low temperature over-pressurization (LTOP) and cool-down transients are calculated in this study. For the cool-down transient, a pressure–temperature limit curve is generated in accordance with Section XI, Appendix G of the American Society of Mechanical Engineers (ASME) code, from which safety margin factors are deliberately removed for the probabilistic fracture mechanics analysis. Then, sensitivity analyses are conducted to understand the effects of some input parameters. For the LTOP transient, the failure of the RPV mostly occurs during the period of the abrupt pressure rise. For the cool-down transient, the decrease of the fracture toughness with temperature and time plays a main role in RPV failure at the end of the cool-down process. As expected, the failure probability increases with increasing fluence, Cu and Ni contents, and initial reference temperature-nil ductility transition (RTNDT). The effect of warm prestressing on the vessel failure probability for LTOP is not significant because most of the failures happen before the stress intensity factor reaches the peak value while its effect reduces the failure probability by more than one order of magnitude for the cool-down transient.
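
    The essence of a probabilistic fracture mechanics calculation is a Monte Carlo comparison of sampled applied stress intensity against sampled fracture toughness. The sketch below shows only that comparison; the distributions and numbers are placeholders and do not represent the embrittlement models, flaw distributions or transient loads actually used in the study.

      import numpy as np

      rng = np.random.default_rng(7)
      n = 500_000

      # Placeholder distributions (illustrative only):
      k_applied = rng.normal(loc=45.0, scale=5.0, size=n)        # applied K_I at the limiting flaw, MPa*sqrt(m)
      k_ic      = 35.0 + 30.0 * rng.weibull(4.0, size=n)         # fracture toughness K_Ic, MPa*sqrt(m)

      p_fail = np.mean(k_applied > k_ic)                         # conditional crack-initiation probability
      print(f"conditional failure probability ~ {p_fail:.2e}")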

  6. Development of a User Interface for a Regression Analysis Software Tool

    Science.gov (United States)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  7. Evaluation of static analysis tools used to assess software important to nuclear power plant safety

    Directory of Open Access Journals (Sweden)

    Alain Ourghanlian

    2015-03-01

    Full Text Available We describe a comparative analysis of different tools used to assess safety-critical software used in nuclear power plants. To enhance the credibility of safety assessments and to optimize safety justification costs, Electricité de France (EDF) investigates the use of methods and tools for source code semantic analysis, to obtain indisputable evidence and help assessors focus on the most critical issues. EDF has been using the PolySpace tool for more than 10 years. Currently, new industrial tools based on the same formal approach, Abstract Interpretation, are available. Practical experimentation with these new tools shows that the precision obtained on one of our shutdown systems software packages is substantially improved. In the first part of this article, we present the analysis principles of the tools used in our experimentation. In the second part, we present the main characteristics of protection-system software, and why these characteristics are well adapted for the new analysis tools. In the last part, we present an overview of the results and the limitations of the tools.

  8. Development and evaluation of a web-based software for crash data collection, processing and analysis.

    Science.gov (United States)

    Montella, Alfonso; Chiaradonna, Salvatore; Criscuolo, Giorgio; De Martino, Salvatore

    2017-02-05

    The first step in the development of an effective safety management system is to create reliable crash databases, since the quality of decision making in road safety depends on the quality of the data on which decisions are based. Improving crash data is a worldwide priority, as highlighted in the Global Plan for the Decade of Action for Road Safety adopted by the United Nations, which recognizes that the overall goal of the plan will be attained by improving the quality of data collection at the national, regional and global levels. Crash databases provide the basic information for effective highway safety efforts at any level of government, but a lack of uniformity among countries and among the different jurisdictions in the same country is observed. Several existing databases show significant drawbacks which hinder their effective use for safety analysis and improvement. Furthermore, modern technologies offer great potential for significant improvements of existing methods and procedures for crash data collection, processing and analysis. To address these issues, in this paper we present the development and evaluation of web-based, platform-independent software for crash data collection, processing and analysis. The software is designed for mobile and desktop electronic devices and enables a guided and automated drafting of the crash report, assisting police officers both on-site and in the office. The software development was based both on a detailed critical review of existing Australasian, EU, and U.S. crash databases and software as well as on continuous consultation with the stakeholders. The evaluation was carried out comparing the completeness, timeliness, and accuracy of crash data before and after the use of the software in the city of Vico Equense, in the south of Italy, showing significant advantages. The amount of collected information increased from 82 variables to 268 variables, i.e., a 227% increase. The time saving was more than one hour per crash, i

  9. Processing of MRI images weighted in TOF for blood vessels analysis: 3-D reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez D, J.; Cordova F, T. [Universidad de Guanajuato, Campus Leon, Departamento de Ingenieria Fisica, Loma del Bosque No. 103, Lomas del Campestre, 37150 Leon, Guanajuato (Mexico); Cruz A, I., E-mail: hernandezdj.gto@gmail.com [CONACYT, Centro de Investigacion en Matematicas, A. C., Jalisco s/n, Col. Valenciana, 36000 Guanajuato, Gto. (Mexico)

    2015-10-15

    This paper presents an approach based on intensity differences for the identification of vascular structures in medical images from time-of-flight (TOF) MRI studies. The working hypothesis is that the high intensities belonging to the vascular system in TOF images can be segmented by thresholding the histogram. Vascular structures are enhanced using a vesselness filter, and a final decision based on fuzzy thresholding minimizes the error in the selection of vascular structures. A brief introduction is given to vascular system pathologies and to the role of imaging in their diagnosis, summarizing the physical history of the different imaging modalities and the evolution of digital images with computers. Segmentation and 3-D reconstruction were carried out on time-of-flight images; these images are typically used in the medical diagnosis of cerebrovascular diseases. The proposed method shows less error in the segmentation and reconstruction of volumes related to the vascular system, clearer images and less noise compared with edge-detection methods. (Author)
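
    For readers who want to see the shape of such a pipeline, here is a minimal sketch, not the authors' method: bright tubular structures in a TOF-MRA maximum-intensity projection are enhanced with scikit-image's Frangi vesselness filter and kept by a global Otsu threshold, a simpler stand-in for the fuzzy-thresholding decision described above. The file name and the use of nibabel for loading are assumptions.

```python
# Minimal sketch (not the authors' pipeline): Frangi vesselness enhancement of a
# TOF-MRA maximum-intensity projection followed by a global Otsu threshold.
import numpy as np
import nibabel as nib                                  # assumed loader for the TOF volume
from skimage.filters import frangi, threshold_otsu

vol = nib.load("tof_mra.nii.gz").get_fdata()           # hypothetical file name
mip = vol.max(axis=2)                                  # maximum-intensity projection
mip = (mip - mip.min()) / (np.ptp(mip) + 1e-12)        # normalise to [0, 1]

vesselness = frangi(mip, sigmas=np.arange(1, 6, 1), black_ridges=False)
mask = vesselness > threshold_otsu(vesselness)         # binary vascular map
print("vessel pixels:", int(mask.sum()))
```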

  10. Free software for performing physical analysis of systems for digital radiography and mammography.

    Science.gov (United States)

    Donini, Bruno; Rivetti, Stefano; Lanconelli, Nico; Bertolini, Marco

    2014-05-01

    In this paper, the authors present free software for assisting users in the physical characterization of x-ray digital systems and in image quality checks. The program was developed as a plugin of the well-known public-domain suite ImageJ. The software can assist users in calculating various physical parameters such as the response curve (also termed signal transfer property), modulation transfer function (MTF), noise power spectra (NPS), and detective quantum efficiency (DQE). It also includes the computation of some image quality checks: defective pixel analysis, uniformity, dark analysis, and lag. The software was made available in 2009 and has been used during the last couple of years by many users, who gave us valuable feedback for improving its usability. It was tested for the physical characterization of several clinical systems for digital radiography and mammography. Various published papers made use of the outcomes of the plugin. This software is potentially beneficial to a variety of users: physicists working in hospitals and staff working in radiological departments, such as medical physicists, physicians, and engineers. The plugin, together with a brief user manual, is freely available and can be found online (www.medphys.it/downloads.htm). With our plugin users can estimate the three most important parameters used for physical characterization (MTF, NPS, and DQE). The plugin can run on any operating system equipped with the ImageJ suite. The authors validated the software by comparing MTF and NPS curves on a common set of images with those obtained with other dedicated programs, achieving very good agreement.
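
    As an illustration of one of the listed quantities (not the plugin's own code), the sketch below computes a 2-D noise power spectrum from a flat-field image with the usual FFT-based estimator; the ROI size and pixel pitch are assumed values.

```python
# Minimal sketch: flat-field estimate of the 2-D noise power spectrum,
# NPS(u, v) = (dx*dy)/(Nx*Ny) * <|FFT2(ROI - mean)|^2>, averaged over many ROIs.
import numpy as np

def nps_2d(flat_image, roi=128, pixel_mm=0.1):
    rois = []
    ny, nx = flat_image.shape
    for y in range(0, ny - roi + 1, roi):
        for x in range(0, nx - roi + 1, roi):
            patch = flat_image[y:y + roi, x:x + roi].astype(float)
            rois.append(patch - patch.mean())          # detrend each ROI
    spectra = [np.abs(np.fft.fft2(p)) ** 2 for p in rois]
    return (pixel_mm ** 2 / roi ** 2) * np.mean(spectra, axis=0)

rng = np.random.default_rng(0)
flat = 1000 + 10 * rng.standard_normal((1024, 1024))   # synthetic flat field
nps = nps_2d(flat)
print(nps.shape, nps.mean())   # ~ variance * pixel area for white noise (here ~1 mm^2)
```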

  11. An Analysis of Security and Privacy Issues in Smart Grid Software Architectures on Clouds

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Kumbhare, Alok; Cao, Baohua; Prasanna, Viktor K.

    2011-07-09

    Power utilities globally are increasingly upgrading to Smart Grids that use bi-directional communication with the consumer to enable an information-driven approach to distributed energy management. Clouds offer features well suited for Smart Grid software platforms and applications, such as elastic resources and shared services. However, the security and privacy concerns inherent in an information rich Smart Grid environment are further exacerbated by their deployment on Clouds. Here, we present an analysis of security and privacy issues in a Smart Grids software architecture operating on different Cloud environments, in the form of a taxonomy. We use the Los Angeles Smart Grid Project that is underway in the largest U.S. municipal utility to drive this analysis that will benefit both Cloud practitioners targeting Smart Grid applications, and Cloud researchers investigating security and privacy.

  12. ICAS-PAT: A Software for Design, Analysis and Validation of PAT Systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2010-01-01

    … end product qualities. In an earlier article, Singh et al. [Singh, R., Gernaey, K. V., Gani, R. (2009). Model-based computer-aided framework for design of process monitoring and analysis systems. Computers & Chemical Engineering, 33, 22–42] proposed the use of a systematic model- and data-based methodology to design appropriate PAT systems. This methodology has now been implemented into a systematic computer-aided framework to develop a software (ICAS-PAT) for design, validation and analysis of PAT systems. Two supporting tools needed by ICAS-PAT have also been developed: a knowledge base (consisting of process knowledge as well as knowledge on measurement methods and tools) and a generic model library (consisting of process operational models). Through a tablet manufacturing process example, the application of ICAS-PAT is illustrated, highlighting as well the main features of the software.

  13. Development of soil compaction analysis software (SCAN) integrating a low cost GPS receiver and compactometer.

    Science.gov (United States)

    Hwang, Jinsang; Yun, Hongsik; Kim, Juhyong; Suh, Yongcheol; Hong, Sungnam; Lee, Dongha

    2012-01-01

    Software for soil compaction analysis (SCAN) has been developed for evaluating compaction states using data from a GPS receiver as well as from a compactometer attached to the roller. SCAN is distinguished from previous software for intelligent compaction (IC) in that it can use the results from various types of GPS positioning methods, and it also has an optimal structure for remotely managing the large amounts of data gathered from numerous rollers. For this, several methods were developed: (1) improving the accuracy of a low-cost GPS receiver's positioning results; (2) modeling the trajectory of a moving roller using the GPS receiver's results and linking it with the data from the compactometer; and (3) extracting information on the compaction state of the ground from the modeled trajectory, using spatial analysis methods. SCAN was verified through various field compaction tests, and it has been confirmed that it can be a very effective tool for evaluating field compaction states.
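
    To make step (2) concrete, here is a minimal sketch, not the SCAN implementation, of attaching compactometer readings to the GPS trajectory by interpolating roller positions to the compaction-meter timestamps; all field names and numbers are hypothetical.

```python
# Minimal sketch of trajectory linking: interpolate the roller's GPS track to the
# compactometer timestamps, then attach each compaction reading (e.g. a CMV value)
# to a position. All data below are hypothetical.
import numpy as np

# GPS fixes: time [s], easting [m], northing [m]
gps_t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
gps_e = np.array([100.0, 101.2, 102.5, 103.9, 105.1])
gps_n = np.array([200.0, 200.1, 200.3, 200.4, 200.6])

# Compactometer samples: time [s], compaction meter value (CMV)
cmv_t = np.array([0.5, 1.5, 2.5, 3.5])
cmv   = np.array([22.0, 25.5, 27.1, 28.0])

# Position of the roller at each CMV sample, by linear interpolation in time
cmv_e = np.interp(cmv_t, gps_t, gps_e)
cmv_n = np.interp(cmv_t, gps_t, gps_n)

for e, n, v in zip(cmv_e, cmv_n, cmv):
    print(f"E={e:7.2f}  N={n:6.2f}  CMV={v:5.1f}")
```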

  14. A comparison of conventional and computer-assisted semen analysis (CRISMAS software) using samples from 166 young Danish men

    DEFF Research Database (Denmark)

    Vested, Anne; Ramlau-Hansen, Cecilia H; Bonde, Jens P

    2011-01-01

    The aim of the present study was to compare assessments of sperm concentration and sperm motility analysed by conventional semen analysis with those obtained by computer-assisted semen analysis (CASA) (Copenhagen Rigshospitalet Image House Sperm Motility Analysis System (CRISMAS) 4.6 software), using semen samples from 166 young Danish men. The CRISMAS software identifies sperm concentration and classifies spermatozoa into three motility categories. To enable comparison of the two methods, the four motility stages obtained by conventional semen analysis were, based on their velocity, … and motility analysis. This needs to be accounted for in clinics using this software and in studies of determinants of these semen characteristics.

  15. Cardiomyocyte MEA data analysis (CardioMDA)--a novel field potential data analysis software for pluripotent stem cell derived cardiomyocytes.

    Science.gov (United States)

    Pradhapan, Paruthi; Kuusela, Jukka; Viik, Jari; Aalto-Setälä, Katriina; Hyttinen, Jari

    2013-01-01

    Cardiac safety pharmacology requires in-vitro testing of all drug candidates before clinical trials in order to ensure they are screened for cardio-toxic effects which may result in severe arrhythmias. Micro-electrode arrays (MEA) serve as a complement to current in-vitro methods for drug safety testing. However, MEA recordings produce huge volumes of data, and manual analysis forms a bottleneck for high-throughput screening. To overcome this issue, we have developed offline, semi-automatic data analysis software, 'Cardiomyocyte MEA Data Analysis (CardioMDA)', equipped with correlation analysis and ensemble averaging techniques to improve the accuracy, reliability and throughput rate of analysing human pluripotent stem cell derived cardiomyocyte (CM) field potentials. With the program, true field potential and arrhythmogenic complexes can be distinguished from one another. The averaged field potential complexes, analysed using our software to determine the field potential duration, were compared with the analogous values obtained from manual analysis. The reliability of the correlation analysis algorithm, evaluated using various arrhythmogenic and morphology-changing signals, revealed a mean sensitivity and specificity of 99.27% and 94.49%, respectively, in determining true field potential complexes. The field potential duration of the averaged waveforms corresponded well to the manually analysed data, thus demonstrating the reliability of the software. The software also has the capability to create overlay plots for signals recorded under different drug concentrations in order to visualize and compare the magnitude of the response of different ion channels as a result of drug treatment. Our novel field potential analysis platform will facilitate the analysis of CM MEA signals in a semi-automated way and provide a reliable means of efficient and swift analysis for cardiomyocyte drug or disease model studies.
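
    The following sketch illustrates the general idea of correlation-based complex detection and ensemble averaging on a synthetic trace; it is not CardioMDA's algorithm, and the correlation threshold and refractory window are assumptions.

```python
# Minimal sketch: locate field potential complexes by normalised cross-correlation
# with a user-picked template, then ensemble-average the aligned segments.
import numpy as np
from scipy.signal import find_peaks

def matched_complex_average(trace, template, fs, threshold=0.8):
    t = (template - template.mean()) / (template.std() + 1e-12)
    win = len(t)
    # sliding Pearson correlation between the template and each window of the trace
    corr = np.array([
        np.dot(t, (trace[i:i + win] - trace[i:i + win].mean())
               / (trace[i:i + win].std() + 1e-12)) / win
        for i in range(len(trace) - win)
    ])
    locs, _ = find_peaks(corr, height=threshold, distance=int(0.2 * fs))
    segments = np.stack([trace[i:i + win] for i in locs])
    return segments.mean(axis=0), locs

# synthetic demo: a 1 Hz train of biphasic spikes in noise at fs = 1 kHz
rng = np.random.default_rng(0)
fs = 1000
t_axis = np.arange(0, 5, 1 / fs)
spike = np.concatenate([np.hanning(20), -0.5 * np.hanning(20)])
sig = 0.05 * rng.standard_normal(t_axis.size)
for start in range(100, t_axis.size - spike.size, fs):
    sig[start:start + spike.size] += spike

avg, locs = matched_complex_average(sig, spike, fs)
print("complexes found:", len(locs), "averaged complex length:", avg.size)
```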

  16. CSA06 Computing, Software and Analysis challenge at the Spanish Tier-1 and Tier-2 sites

    CERN Document Server

    Alcaraz, J; Cabrillo, Iban Jose; Colino, Nicanor; Cuevas-Maestro, J; Delgado Peris, Antonio; Fernandez Menendez, Javier; Flix, Jose; García-Abia, Pablo; González-Caballero, I; Hernández, Jose M; Marco, Rafael; Martinez Ruiz del Arbol, Pablo; Matorras, Francisco; Merino, Gonzalo; Rodríguez-Calonge, F J; Vizan Garcia, Jesus Manuel

    2007-01-01

    This note describes the participation of the Spanish centres PIC, CIEMAT and IFCA as Tier-1 and Tier-2 sites in the CMS CSA06 Computing, Software and Analysis challenge. A number of the facilities, services and workflows have been demonstrated at 25% of the scale required in 2008. Very valuable experience has been gained running the complex computing system under realistic conditions at a significant scale. The focus of this note is on presenting achieved results, operational experience and lessons learnt during the challenge.

  17. statnet: Software Tools for the Representation, Visualization, Analysis and Simulation of Network Data

    Directory of Open Access Journals (Sweden)

    Mark S. Handcock

    2007-12-01

    Full Text Available statnet is a suite of software packages for statistical network analysis. The packages implement recent advances in network modeling based on exponential-family random graph models (ERGM). The components of the package provide a comprehensive framework for ERGM-based network modeling, including tools for model estimation, model evaluation, model-based network simulation, and network visualization. This broad functionality is powered by a central Markov chain Monte Carlo (MCMC) algorithm. The coding is optimized for speed and robustness.

  18. Periodic precipitation a microcomputer analysis of transport and reaction processes in diffusion media, with software development

    CERN Document Server

    Henisch, H K

    1991-01-01

    Containing illustrations, worked examples, graphs and tables, this book deals with periodic precipitation (also known as Liesegang Ring formation) in terms of mathematical models and their logical consequences, and is entirely concerned with microcomputer analysis and software development. Three distinctive periodic precipitation mechanisms are included: binary diffusion-reaction, solubility modulation, and competitive particle growth. The book provides didactic illustrations of a valuable investigational procedure, in the form of hypothetical experimentation by microcomputer.

  19. [Hardware and software for EMG recording and analysis of respiratory muscles of human].

    Science.gov (United States)

    Solnushkin, S D; Chakhman, V N; Segizbaeva, M O; Pogodin, M A; Aleksandrov, V G

    2014-01-01

    This paper presents a new hardware and software system that not only records the EMG of different groups of respiratory muscles but also performs amplitude-frequency analysis, which makes it possible to determine changes in the contribution of the respiratory muscles to the work of breathing and to detect early signs of respiratory muscle fatigue. The presented complex can be used for the functional diagnostics of breathing in patients, healthy people and sportsmen.
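
    One classic amplitude-frequency fatigue index, shown here purely as an illustration and not as the authors' implementation, is the median frequency of the EMG power spectrum, which shifts downward as a muscle fatigues; the sketch below estimates it with a Welch periodogram on synthetic data, with the sampling rate and window length as assumed values.

```python
# Minimal sketch: median frequency of the EMG power spectrum as a fatigue index.
import numpy as np
from scipy.signal import welch

def median_frequency(emg, fs):
    f, pxx = welch(emg, fs=fs, nperseg=min(1024, len(emg)))
    cum = np.cumsum(pxx)
    return f[np.searchsorted(cum, cum[-1] / 2)]   # frequency splitting power in half

# synthetic check: a dominant 80 Hz component plus white noise, sampled at 1 kHz
rng = np.random.default_rng(1)
fs = 1000
t = np.arange(0, 10, 1 / fs)
emg = np.sin(2 * np.pi * 80 * t) + 0.5 * rng.standard_normal(t.size)
print(f"median frequency: {median_frequency(emg, fs):.1f} Hz")
```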

  20. CellProfiler Analyst: data exploration and analysis software for complex image-based screens.

    Science.gov (United States)

    Jones, Thouis R; Kang, In Han; Wheeler, Douglas B; Lindquist, Robert A; Papallo, Adam; Sabatini, David M; Golland, Polina; Carpenter, Anne E

    2008-11-15

    Image-based screens can produce hundreds of measured features for each of hundreds of millions of individual cells in a single experiment. Here, we describe CellProfiler Analyst, open-source software for the interactive exploration and analysis of multidimensional data, particularly data from high-throughput, image-based experiments. The system enables interactive data exploration for image-based screens and automated scoring of complex phenotypes that require combinations of multiple measured features per cell.

  1. Buying in to bioinformatics: an introduction to commercial sequence analysis software.

    Science.gov (United States)

    Smith, David Roy

    2015-07-01

    Advancements in high-throughput nucleotide sequencing techniques have brought with them state-of-the-art bioinformatics programs and software packages. Given the importance of molecular sequence data in contemporary life science research, these software suites are becoming an essential component of many labs and classrooms, and as such are frequently designed for non-computer specialists and marketed as one-stop bioinformatics toolkits. Although beautifully designed and powerful, user-friendly bioinformatics packages can be expensive and, as more arrive on the market each year, it can be difficult for researchers, teachers and students to choose the right software for their needs, especially if they do not have a bioinformatics background. This review highlights some of the currently available and most popular commercial bioinformatics packages, discussing their prices, usability, features and suitability for teaching. Although several commercial bioinformatics programs are arguably overpriced and overhyped, many are well designed, sophisticated and, in my opinion, worth the investment. Whether you are just beginning your foray into molecular sequence analysis or are an experienced genomicist, I encourage you to explore proprietary software bundles. They have the potential to streamline your research, increase your productivity, energize your classroom and, if anything, add a bit of zest to the often dry, detached world of bioinformatics. © The Author 2014. Published by Oxford University Press.

  2. Review of software tools for design and analysis of large scale MRM proteomic datasets.

    Science.gov (United States)

    Colangelo, Christopher M; Chung, Lisa; Bruce, Can; Cheung, Kei-Hoi

    2013-06-15

    Selective or multiple reaction monitoring (SRM/MRM) is a liquid-chromatography (LC)/tandem-mass spectrometry (MS/MS) method that enables the quantitation of specific proteins in a sample by analyzing precursor ions and the fragment ions of their selected tryptic peptides. Instrumentation software has advanced to the point that thousands of transitions (pairs of primary and secondary m/z values) can be measured in a triple quadrupole instrument coupled to an LC, by a well-designed scheduling and selection of m/z windows. The design of a good MRM assay relies on the availability of peptide spectra from previous discovery-phase LC-MS/MS studies. The tedious aspect of manually developing and processing MRM assays involving thousands of transitions has spurred the development of software tools to automate this process. Software packages have been developed for project management, assay development, assay validation, data export, peak integration, quality assessment, and biostatistical analysis. No single tool provides a complete end-to-end solution, so this article reviews the current state and discusses future directions of these software tools in order to enable researchers to combine them for a comprehensive targeted proteomics workflow. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.

  3. Comparative Analysis of Software Development Methods between Parallel, V-Shaped and Iterative

    OpenAIRE

    Nugroho, Suryanto; Waluyo, Sigit Hadi; Hakim, Luqman

    2017-01-01

    Any organization that develops software faces the difficult choice of selecting the right software development method, since the method used plays a significant role in the overall software development process. Software development methods are needed so that the development process can be systematic, ensuring that the software is not only completed within the right time frame but also has good quality. There are various methods of software development in System ...

  4. FLIM-FRET analyzer: open source software for automation of lifetime-based FRET analysis.

    Science.gov (United States)

    Kim, Jiho; Tsoy, Yury; Persson, Jan; Grailhe, Regis

    2017-01-01

    Despite the broad use of FRET techniques, available methods for analyzing protein-protein interactions are labor-intensive and lack systematic analysis. We propose open source software allowing the quantitative analysis of fluorescence lifetime imaging (FLIM) while integrating steady-state fluorescence intensity information for protein-protein interaction studies. Our open source software is dedicated to fluorescence lifetime imaging microscopy (FLIM) data obtained from a Becker & Hickl SPC-830. FLIM-FRET analyzer includes: a user-friendly interface enabling automated intensity-based segmentation into single cells, fitting of time-resolved fluorescence data to a lifetime value for each segmented object, batch capability, and data representation with donor lifetime versus acceptor/donor intensity quantification as a measure of protein-protein interactions. The FLIM-FRET analyzer software is a flexible application for lifetime-based FRET analysis. The application, the C#.NET source code, and detailed documentation are freely available at the following URL: http://FLIM-analyzer.ip-korea.org.
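
    To show the quantity at the heart of lifetime-based FRET (without claiming this is the FLIM-FRET analyzer's fitting engine), the sketch below fits a mono-exponential decay to a synthetic TCSPC histogram and reports an apparent FRET efficiency E = 1 - tau_DA/tau_D; the IRF handling, binning and donor-only lifetime are assumptions.

```python
# Minimal sketch: mono-exponential lifetime fit and apparent FRET efficiency.
import numpy as np
from scipy.optimize import curve_fit

def decay(t, a, tau, offset):
    return a * np.exp(-t / tau) + offset

# synthetic donor-in-presence-of-acceptor histogram (true tau = 1.6 ns)
rng = np.random.default_rng(2)
t_ns = np.linspace(0, 12.5, 256)
counts = decay(t_ns, 1000, 1.6, 5) + rng.poisson(5, t_ns.size)

(a, tau_da, off), _ = curve_fit(decay, t_ns, counts, p0=(counts.max(), 2.0, 0.0))
tau_d = 2.4                                   # assumed donor-only lifetime [ns]
print(f"tau_DA = {tau_da:.2f} ns,  E = {1 - tau_da / tau_d:.2f}")
```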

  5. SPSens: a software package for stochastic parameter sensitivity analysis of biochemical reaction networks.

    Science.gov (United States)

    Sheppard, Patrick W; Rathinam, Muruhan; Khammash, Mustafa

    2013-01-01

    SPSens is a software package for the efficient computation of stochastic parameter sensitivities of biochemical reaction networks. Parameter sensitivity analysis is a valuable tool that can be used to study robustness properties, for drug targeting, and many other purposes. However its application to stochastic models has been limited when Monte Carlo methods are required due to extremely high computational costs. SPSens provides efficient, state of the art sensitivity analysis algorithms in a single software package so that sensitivity analysis can be easily performed on stochastic models of biochemical reaction networks. SPSens implements the algorithms in C and estimates sensitivities with respect to both infinitesimal and finite perturbations to system parameters, in many cases reducing variance by orders of magnitude compared to basic methods. Included among the features of SPSens are serial and parallel command line versions, an interface with Matlab, and several example problems. SPSens is distributed freely under GPL version 3 and can be downloaded from http://sourceforge.net/projects/spsens/. The software can be run on Linux, Mac OS X and Windows platforms.

  6. How Can Single-Case Data Be Analyzed? Software Resources, Tutorial, and Reflections on Analysis.

    Science.gov (United States)

    Manolov, Rumen; Moeyaert, Mariola

    2017-03-01

    The present article aims to present a series of software developments in the quantitative analysis of data obtained via single-case experimental designs (SCEDs), as well as the tutorial describing these developments. The tutorial focuses on software implementations based on freely available platforms such as R and aims to bring statistical advances closer to applied researchers and help them become autonomous agents in the data analysis stage of a study. The range of analyses dealt with in the tutorial is illustrated on a typical single-case dataset, relying heavily on graphical data representations. We illustrate how visual and quantitative analyses can be used jointly, giving complementary information and helping the researcher decide whether there is an intervention effect, how large it is, and whether it is practically significant. To help applied researchers in the use of the analyses, we have organized the data in the different ways required by the different analytical procedures and made these data available online. We also provide Internet links to all free software available, as well as all the main references to the analytical techniques. Finally, we suggest that appropriate and informative data analysis is likely to be a step forward in documenting and communicating results and also for increasing the scientific credibility of SCEDs.
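
    Although the tutorial above targets R, one of the simple quantitative indices often paired with visual analysis of single-case data, the nonoverlap of all pairs (NAP), is easy to sketch; the phase data below are hypothetical.

```python
# Minimal sketch: nonoverlap of all pairs (NAP) between a baseline and an
# intervention phase of a single-case dataset. Data are hypothetical.
import numpy as np

def nap(baseline, intervention):
    """Share of (baseline, intervention) pairs where intervention > baseline,
    counting ties as half."""
    b = np.asarray(baseline)[:, None]
    i = np.asarray(intervention)[None, :]
    return ((i > b).sum() + 0.5 * (i == b).sum()) / (b.size * i.size)

A = [3, 4, 2, 3, 5]        # baseline phase
B = [6, 7, 5, 8, 7, 9]     # intervention phase
print(f"NAP = {nap(A, B):.2f}")
```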

  7. Information Extraction for System-Software Safety Analysis: Calendar Year 2008 Year-End Report

    Science.gov (United States)

    Malin, Jane T.

    2009-01-01

    This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.

  8. Integrating R and Java for Enhancing Interactivity of Algorithmic Data Analysis Software Solutions

    Directory of Open Access Journals (Sweden)

    Titus Felix FURTUNĂ

    2016-06-01

    Full Text Available Conceiving software solutions for statistical processing and algorithmic data analysis involves handling diverse data, fetched from various sources and in different formats, and presenting the results in a suggestive, tailorable manner. Our ongoing research aims to design programming techniques for integrating the R development environment with the Java programming language for interoperability at the source code level. The goal is to combine the intensive data processing capabilities of the R programming language, along with its multitude of statistical function libraries, with the flexibility offered by the Java programming language and platform in terms of graphical user interface and mathematical function libraries. Both development environments are multiplatform oriented and can complement each other through interoperability. R is a comprehensive and concise programming language, benefiting from a continuously expanding and evolving set of packages for statistical analysis, developed by the open source community. While it is a very efficient environment for statistical data processing, the R platform lacks support for developing user-friendly, interactive graphical user interfaces (GUIs). Java, on the other hand, is a high-level object-oriented programming language which supports designing and developing performant and interactive frameworks for general-purpose software solutions, through the Java Foundation Classes, JavaFX and various graphical libraries. In this paper we treat both aspects of integration and interoperability: integrating Java code into R applications, and bringing R processing sequences into Java-driven software solutions. Our research has been conducted focusing on case studies concerning pattern recognition and cluster analysis.

  9. Application of software solutions for modeling and analysis of parameters of belt drive in engineering

    Science.gov (United States)

    Timerbaev, N. F.; Sadrtdinov, A. R.; Prosvirnikov, D. B.; Fomin, A. A.; Stepanov, V. V.

    2017-10-01

    The application of software systems in engineering for developing belt drive designs and evaluating their characteristics is considered. A technique for calculating and analyzing belt drives is described using the example of V-belt and flat-belt drives calculated with a software solution. As a result of the belt drive analysis, belt profiles, belt cross-sectional dimensions, driving and driven sheave diameters and power parameters are determined, and graphs of the dependence of the belt's pretension force and of the force acting on the shaft on the diameter of the driving sheave are obtained. By approximating the results of the calculations, theoretical equations for calculating the power parameters of the belt drives were derived. Carrying out the analysis of belt drives with software solutions allows one to avoid computational errors and to optimize the design and performance. At the same time, a convenient and intuitive interface, as well as an integrated graphical editor, provides visibility of the output data and allows accelerated engineering analysis of the development object.
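
    As a minimal illustration of the kind of power-parameter calculation mentioned above (not the software's own routine), the sketch below computes belt speed, wrap angle and the Euler (capstan) tension ratio for an open belt drive; every numerical input is an assumed example value.

```python
# Minimal sketch: basic power parameters of an open belt drive.
import math

P  = 3000.0             # transmitted power [W]
n1 = 1450.0             # driving sheave speed [rpm]
d1, d2 = 0.125, 0.315   # sheave pitch diameters [m]
a  = 0.5                # centre distance [m]
mu = 0.35               # effective coefficient of friction

v     = math.pi * d1 * n1 / 60.0                       # belt speed [m/s]
theta = math.pi - 2 * math.asin((d2 - d1) / (2 * a))   # wrap angle, small sheave [rad]
Ft    = P / v                                          # effective circumferential force [N]
F2    = Ft / (math.exp(mu * theta) - 1.0)              # slack-side tension (Euler) [N]
F1    = F2 * math.exp(mu * theta)                      # tight-side tension [N]

print(f"v = {v:.2f} m/s, wrap = {math.degrees(theta):.1f} deg, "
      f"F1 = {F1:.0f} N, F2 = {F2:.0f} N, shaft load ~ {F1 + F2:.0f} N")
```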

  10. [Static Retinal Vessel Analysis in Population-based Study SHIP-Trend].

    Science.gov (United States)

    Theophil, Constanze; Jürgens, Clemens; Großjohann, Rico; Kempin, Robert; Ittermann, Till; Nauck, Matthias; Völzke, Henry; Tost, Frank H W

    2017-08-24

    Background Interdisciplinary investigations of possible connections between general diseases and ophthalmological changes are difficult to perform in the clinical environment, but they are gaining in importance as a result of the age-related increase in chronic diseases. The collection of health-related parameters in the Study of Health in Pomerania (SHIP) project allows conclusions to be derived for the general population. Methods The population-based SHIP-Trend study was conducted between 2008 and 2012 in Greifswald. The baseline cohort included 4420 subjects (response 50.1%) aged 20 to 84 years. Pre-existing arterial hypertension, diabetes mellitus and smoking status were recorded with a standardized questionnaire, and blood pressure and HbA1c were determined by the laboratory. The vascular diameters of retinal arterioles and venules were determined from non-mydriatic fundus images, and the central retinal arterial (CRAE) and venous equivalents (CRVE) were calculated from them. The association of diabetes mellitus, HbA1c, smoking status and blood pressure with the retinal vascular parameters was tested with linear regression models adjusted for age and sex. Results In 3218 subjects with evaluable standardized fundus photographs, significant associations of elevated HbA1c (> 6.5%), smoking status and systolic and diastolic blood pressure were found with the retinal vessel widths CRAE and CRVE. Anamnestic diabetes mellitus, on the other hand, was not associated with any of the vascular parameters. Conclusion This study reveals a relevant correlation between general diseases and retinal blood flow in the eye. General diseases can therefore induce ophthalmological changes, and eye examination can provide information for the assessment of general diseases. Georg Thieme Verlag KG Stuttgart · New York.
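
    CRAE and CRVE are conventionally obtained by combining the six widest arterioles or venules with the revised Knudtson pairing formulas; the sketch below shows that procedure as an illustration only (it is not the study's measurement software), with hypothetical vessel widths and the usual branching coefficients 0.88 and 0.95 from Knudtson et al. (2003).

```python
# Minimal sketch: "big six" summary for CRAE/CRVE. The widest vessel is paired with
# the narrowest, each pair is combined, and the procedure repeats until one
# equivalent calibre remains. Example widths are hypothetical.
import math

def central_equivalent(widths_um, coefficient):
    w = sorted(widths_um, reverse=True)
    while len(w) > 1:
        nxt = []
        while len(w) > 1:
            a, b = w.pop(0), w.pop(-1)         # pair widest with narrowest
            nxt.append(coefficient * math.hypot(a, b))
        if w:                                  # odd count: carry the middle vessel over
            nxt.append(w.pop())
        w = sorted(nxt, reverse=True)
    return w[0]

arterioles = [112, 108, 104, 99, 95, 90]       # um, hypothetical
venules    = [165, 158, 150, 147, 140, 133]
print(f"CRAE = {central_equivalent(arterioles, 0.88):.1f} um, "
      f"CRVE = {central_equivalent(venules, 0.95):.1f} um")
```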

  11. A Human Factors Analysis and Classification System (HFACS) Examination of Commercial Vessel Accidents

    Science.gov (United States)

    2012-09-01

    HFACS analysis identified relationships among the HFACS levels and collision, allision, and grounding accidents. Logistic regression analysis identified six ... determined for HFACS Level I.

  12. Netlang: A software for the linguistic analysis of corpora by means of complex networks.

    Science.gov (United States)

    Barceló-Coblijn, Lluís; Serna Salazar, Diego; Isaza, Gustavo; Castillo Ossa, Luis F; Bedia, Manuel G

    2017-01-01

    To date there is no software that directly connects the linguistic analysis of a conversation to a network program. Network programs are able to extract statistical information from databases holding information about systems of interacting elements. Language has also been conceived and studied as a complex system. However, most proposals do not analyze language according to linguistic theory, but use instead computational systems that should save time at the price of leaving aside many aspects that are crucial for linguistic theory. Some approaches to network studies on language do apply precise linguistic analyses, made by a linguist. The problem until now has been the lack of an interface between the analysis of a sentence and its integration into the network that could be managed by a linguist and that could save the analysis of any language. Previous works have used old software that was not created for these purposes and that often produced problems with some idiosyncrasies of the target language. The desired interface should be able to deal with the syntactic peculiarities of a particular language, the options of linguistic theory preferred by the user, and the preservation of morpho-syntactic information (lexical categories and syntactic relations between items). Netlang is the first program able to do that. Recently, a new kind of linguistic analysis has been developed which is able to extract a complexity pattern from the speaker's linguistic production, depicted as a network where words are inside nodes and these nodes connect to each other by means of edges or links (the information inside the edge can be syntactic, semantic, etc.). The Netlang software has become the bridge between rough linguistic data and the network program. Netlang has integrated and improved the functions of programs used in the past, namely the DGA annotator and two scripts (ToXML.pl and Xml2Pairs.py) used for transforming and pruning data. Netlang allows the researcher to make accurate …

  13. Stress analysis in a non axisymmetric loaded reactor pressure vessel; Verificacao de tensoes em um vaso de pressao nuclear com carregamentos nao-axissimetricos

    Energy Technology Data Exchange (ETDEWEB)

    Albuquerque, Levi Barcelos; Assis, Gracia Menezes V. de [Coordenadoria para Projetos Especiais (COPESP), Sao Paulo, SP (Brazil); Miranda, Carlos Alexandre J.; Cruz, Julio Ricardo B.; Mattar Neto, Miguel [Instituto de Pesquisas Energeticas e Nucleares (IPEN), Sao Paulo, SP (Brazil)

    1995-12-31

    In this work we present the stress analysis of a PWR vessel under postulated concentrated loads. The vessel was modeled with axisymmetric solid 4-node harmonic finite elements using the ANSYS program, version 5.0. The bolts connecting the vessel flanges were modeled with beam elements. Some considerations were made to model the contact between the flanges. The perforated part of the vessel's torispherical head was modeled (with reduced properties due to its holes) to introduce its stiffness and loads, but was not within the scope of this work. The loading consists of the usual loads, such as pressure, dead weight, bolt preload and seismic load, and of postulated concentrated loads over the vessel, modeled by Fourier series. The results in the axisymmetric model are taken in terms of linearized stresses, obtained at several circumferential positions and, for each position, in several sections along the vessel. Using the ASME Code (Section III, Division 1, Subsection NB), the stresses are within the allowable limits. In order to draw some conclusions about stress linearization, the membrane plus bending stresses (Pl + Pb) are obtained and compared in some sections, using three different methods. (author) 4 refs., 15 figs., 7 tabs.
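
    For readers unfamiliar with stress linearization, the sketch below extracts the membrane and bending components from a through-thickness stress profile, the ingredients of the Pl + Pb check cited above; it is not the ANSYS post-processing used in the study, and the stress profile is synthetic.

```python
# Minimal sketch: linearize a through-thickness stress profile from a stress
# classification line into membrane and bending components. Profile is synthetic.
import numpy as np

def trap(y, x):
    """Simple trapezoidal integration."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

t = 0.060                                   # wall thickness [m]
x = np.linspace(0.0, t, 41)                 # positions across the wall
sigma = 180e6 - 2.5e9 * x + 4.0e10 * x**2   # synthetic through-thickness stress [Pa]

sigma_m = trap(sigma, x) / t                            # membrane component
sigma_b = 6.0 / t**2 * trap(sigma * (t / 2 - x), x)     # bending component at the surface
print(f"Pm ~ {sigma_m/1e6:.1f} MPa, Pb ~ {sigma_b/1e6:.1f} MPa, "
      f"Pm+Pb ~ {(sigma_m + abs(sigma_b))/1e6:.1f} MPa")
```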

  14. SmartGrain: high-throughput phenotyping software for measuring seed shape through image analysis.

    Science.gov (United States)

    Tanabata, Takanari; Shibaya, Taeko; Hori, Kiyosumi; Ebana, Kaworu; Yano, Masahiro

    2012-12-01

    Seed shape and size are among the most important agronomic traits because they affect yield and market price. To obtain accurate seed size data, a large number of measurements are needed because there is little difference in size among seeds from one plant. To promote genetic analysis and selection for seed shape in plant breeding, efficient, reliable, high-throughput seed phenotyping methods are required. We developed SmartGrain software for high-throughput measurement of seed shape. This software uses a new image analysis method to reduce the time taken in the preparation of seeds and in image capture. Outlines of seeds are automatically recognized from digital images, and several shape parameters, such as seed length, width, area, and perimeter length, are calculated. To validate the software, we performed a quantitative trait locus (QTL) analysis for rice (Oryza sativa) seed shape using backcrossed inbred lines derived from a cross between japonica cultivars Koshihikari and Nipponbare, which showed small differences in seed shape. SmartGrain removed areas of awns and pedicels automatically, and several QTLs were detected for six shape parameters. The allelic effect of a QTL for seed length detected on chromosome 11 was confirmed in advanced backcross progeny; the cv Nipponbare allele increased seed length and, thus, seed weight. High-throughput measurement with SmartGrain reduced sampling error and made it possible to distinguish between lines with small differences in seed shape. SmartGrain could accurately recognize seed not only of rice but also of several other species, including Arabidopsis (Arabidopsis thaliana). The software is free to researchers.
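
    A minimal sketch of the measurement step, not SmartGrain itself: seeds are segmented by global thresholding and their length, width, area and perimeter are read from labelled regions with scikit-image. The file name and the dark-seeds-on-light-background assumption are illustrative, and SmartGrain additionally removes awns and pedicels automatically.

```python
# Minimal sketch: per-seed shape parameters from a scanned image.
import numpy as np
from skimage import io, color, measure
from skimage.filters import threshold_otsu

gray = color.rgb2gray(io.imread("seeds.png"))          # hypothetical scan
mask = gray < threshold_otsu(gray)                     # seeds assumed darker than background
labels = measure.label(mask)

for region in measure.regionprops(labels):
    if region.area < 100:                              # ignore small specks
        continue
    print(f"seed {region.label}: length={region.major_axis_length:.1f}px "
          f"width={region.minor_axis_length:.1f}px "
          f"area={region.area}px^2 perimeter={region.perimeter:.1f}px")
```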

  15. Facilitating the analysis of the multifocal electroretinogram using the free software environment R.

    Science.gov (United States)

    Bergholz, Richard; Rossel, Mirjam; Dutescu, Ralf M; Vöge, Klaas P; Salchow, Daniel J

    2017-08-07

    The large amount of data rendered by the multifocal electroretinogram (mfERG) can be analyzed and visualized in various ways. The evaluation and comparison of more than one examination is time-consuming and prone to create errors. Using the free software environment R we developed a solution to average the data of multiple examinations and to allow a comparison of different patient groups. Data of single mfERG recordings as exported in .csv format from a RETIport 21 system (version 7/03, Roland Consult) or manually compiled .csv files are the basis for the calculations. The R software extracts response densities and implicit times of N1 and P1 for the sum response, each ring eccentricity, and each single hexagon. Averages can be calculated for as many subjects as needed. The mentioned parameters can then be compared to another group of patients or healthy subjects. Application of the software is illustrated by comparing 11 patients with chloroquine maculopathy to a control group of 7 healthy subjects. The software scripts display response density and implicit time 3D plots of each examination as well as of the group averages. Differences of the group averages are presented as 3D and grayscale 2D plots. Both groups are compared using the t-test with Bonferroni correction. The group comparison is furthermore illustrated by the average waveforms and by boxplots of each eccentricity. This software solution on the basis of the programming language R facilitates the clinical and scientific use of the mfERG and aids in interpretation and analysis.

  16. T-REX: software for the processing and analysis of T-RFLP data

    Directory of Open Access Journals (Sweden)

    Culman Steven W

    2009-06-01

    Full Text Available Abstract Background Despite increasing popularity and improvements in terminal restriction fragment length polymorphism (T-RFLP) and other microbial community fingerprinting techniques, there are still numerous obstacles that hamper the analysis of these datasets. Many steps are required to process raw data into a format ready for analysis and interpretation. These steps can be time-intensive, error-prone, and can introduce unwanted variability into the analysis. Accordingly, we developed T-REX, free, online software for the processing and analysis of T-RFLP data. Results Analysis of T-RFLP data generated from a multiple-factorial study was performed with T-REX. With this software, we were able to (i) label raw data with attributes related to the experimental design of the samples, (ii) determine a baseline threshold for identification of true peaks over noise, (iii) align terminal restriction fragments (T-RFs) in all samples (i.e., bin T-RFs), (iv) construct a two-way data matrix from labeled data and process the matrix in a variety of ways, (v) produce several measures of data matrix complexity, including the distribution of variance between main and interaction effects and sample heterogeneity, and (vi) analyze a data matrix with the additive main effects and multiplicative interaction (AMMI) model. Conclusion T-REX provides a free, platform-independent tool to the research community that allows for an integrated, rapid, and more robust analysis of T-RFLP data.

  17. OpenFLUX: efficient modelling software for 13C-based metabolic flux analysis

    Directory of Open Access Journals (Sweden)

    Nielsen Lars K

    2009-05-01

    Full Text Available Abstract Background The quantitative analysis of metabolic fluxes, i.e., in vivo activities of intracellular enzymes and pathways, provides key information on biological systems in systems biology and metabolic engineering. It is based on a comprehensive approach combining (i) tracer cultivation on 13C substrates, (ii) 13C labelling analysis by mass spectrometry and (iii) mathematical modelling for experimental design, data processing, flux calculation and statistics. Whereas the cultivation and the analytical part are fairly advanced, a lack of appropriate modelling software solutions for all modelling aspects in flux studies is limiting the application of metabolic flux analysis. Results We have developed OpenFLUX as a user-friendly, yet flexible software application for small and large scale 13C metabolic flux analysis. The application is based on the new Elementary Metabolite Unit (EMU) framework, significantly enhancing computation speed for flux calculation. From simple notation of metabolic reaction networks defined in a spreadsheet, the OpenFLUX parser automatically generates MATLAB-readable metabolite and isotopomer balances, thus strongly facilitating model creation. The model can be used to perform experimental design, parameter estimation and sensitivity analysis either using the built-in gradient-based search or Monte Carlo algorithms or in user-defined algorithms. Exemplified for a microbial flux study with 71 reactions, 8 free flux parameters and mass isotopomer distributions of 10 metabolites, OpenFLUX allowed the EMU-based model to be compiled automatically from an Excel file containing metabolic reactions and carbon transfer mechanisms, showing its user-friendliness. It reliably reproduced the published data, and optimum flux distributions for the network under study were found quickly. Conclusion We have developed a fast, accurate application to perform steady-state 13C metabolic flux analysis. OpenFLUX will strongly facilitate and …

  18. ViPAR: a software platform for the Virtual Pooling and Analysis of Research Data.

    Science.gov (United States)

    Carter, Kim W; Francis, Richard W; Carter, K W; Francis, R W; Bresnahan, M; Gissler, M; Grønborg, T K; Gross, R; Gunnes, N; Hammond, G; Hornig, M; Hultman, C M; Huttunen, J; Langridge, A; Leonard, H; Newman, S; Parner, E T; Petersson, G; Reichenberg, A; Sandin, S; Schendel, D E; Schalkwyk, L; Sourander, A; Steadman, C; Stoltenberg, C; Suominen, A; Surén, P; Susser, E; Sylvester Vethanayagam, A; Yusof, Z

    2015-10-08

    Research studies exploring the determinants of disease require sufficient statistical power to detect meaningful effects. Sample size is often increased through centralized pooling of disparately located datasets, though ethical, privacy and data ownership issues can often hamper this process. Methods that facilitate the sharing of research data that are sympathetic with these issues and which allow flexible and detailed statistical analyses are therefore in critical need. We have created a software platform for the Virtual Pooling and Analysis of Research data (ViPAR), which employs free and open source methods to provide researchers with a web-based platform to analyse datasets housed in disparate locations. Database federation permits controlled access to remotely located datasets from a central location. The Secure Shell protocol allows data to be securely exchanged between devices over an insecure network. ViPAR combines these free technologies into a solution that facilitates 'virtual pooling' where data can be temporarily pooled into computer memory and made available for analysis without the need for permanent central storage. Within the ViPAR infrastructure, remote sites manage their own harmonized research dataset in a database hosted at their site, while a central server hosts the data federation component and a secure analysis portal. When an analysis is initiated, requested data are retrieved from each remote site and virtually pooled at the central site. The data are then analysed by statistical software and, on completion, results of the analysis are returned to the user and the virtually pooled data are removed from memory. ViPAR is a secure, flexible and powerful analysis platform built on open source technology that is currently in use by large international consortia, and is made publicly available at [http://bioinformatics.childhealthresearch.org.au/software/vipar/]. © The Author 2015. Published by Oxford University Press on behalf of the

  19. Distance software: design and analysis of distance sampling surveys for estimating population size.

    Science.gov (United States)

    Thomas, Len; Buckland, Stephen T; Rexstad, Eric A; Laake, Jeff L; Strindberg, Samantha; Hedley, Sharon L; Bishop, Jon Rb; Marques, Tiago A; Burnham, Kenneth P

    2010-02-01

    1. Distance sampling is a widely used technique for estimating the size or density of biological populations. Many distance sampling designs and most analyses use the software Distance. 2. We briefly review distance sampling and its assumptions, outline the history, structure and capabilities of Distance, and provide hints on its use. 3. Good survey design is a crucial prerequisite for obtaining reliable results. Distance has a survey design engine, with a built-in geographic information system, that allows properties of different proposed designs to be examined via simulation, and survey plans to be generated. 4. A first step in analysis of distance sampling data is modelling the probability of detection. Distance contains three increasingly sophisticated analysis engines for this: conventional distance sampling, which models detection probability as a function of distance from the transect and assumes all objects at zero distance are detected; multiple-covariate distance sampling, which allows covariates in addition to distance; and mark-recapture distance sampling, which relaxes the assumption of certain detection at zero distance. 5. All three engines allow estimation of density or abundance, stratified if required, with associated measures of precision calculated either analytically or via the bootstrap. 6. Advanced analysis topics covered include the use of multipliers to allow analysis of indirect surveys (such as dung or nest surveys), the density surface modelling analysis engine for spatial and habitat modelling, and information about accessing the analysis engines directly from other software. 7. Synthesis and applications. Distance sampling is a key method for producing abundance and density estimates in challenging field conditions. The theory underlying the methods continues to expand to cope with realistic estimation situations. In step with theoretical developments, state-of-the-art software that implements these methods is described.
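
    To give a feel for the conventional-distance-sampling engine described in point 4 (this is not the Distance software), the sketch below fits an untruncated half-normal detection function to perpendicular distances and turns it into a density estimate; the distances and transect length are hypothetical.

```python
# Minimal sketch: line-transect density estimate with a half-normal detection
# function g(x) = exp(-x^2 / (2*s^2)). The maximum-likelihood scale is
# s^2 = mean(x^2), the effective strip half-width is ESW = s*sqrt(pi/2),
# and the density is D = n / (2 * L * ESW). Data are hypothetical.
import numpy as np

x = np.array([2.1, 5.4, 0.8, 3.3, 7.9, 1.2, 4.6, 6.0, 2.8, 3.9])  # distances [m]
L = 4000.0                                                          # total transect length [m]

n = x.size
sigma = np.sqrt(np.mean(x ** 2))            # ML estimate of the half-normal scale
esw = sigma * np.sqrt(np.pi / 2)            # effective strip half-width [m]
density = n / (2 * L * esw)                 # animals per square metre
print(f"ESW = {esw:.1f} m, D = {density * 1e6:.1f} animals per km^2")
```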

  20. Analysis of Simplified Hydrogen and Dust Explosion in the Vacuum Vessel Accident using MELCOR Code

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Soo Min; Moon, Sung Bo; Bang, In Cheol [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    The failure of the confinement barriers inside a penetration line between the VV and a port cell initiates the accident. Air ingress into the VV results in the formation of a hydrogen/air explosive mixture and a subsequent explosion. The hydrogen explosion is assumed to trigger a dust explosion, which leads to a large pressure peak creating a connection between the VV and the NBI cell. Components other than the VV, suppression tank (ST), port cell, NBI cell and gallery are ignored in the analysis. An additional free volume is assumed to be connected to the VV to simulate the hydrogen/dust explosions. Figure 1 presents a schematic of the thermal-hydraulic model used in the accident analysis. For the ITER design, the three volumes of the accident analysis report (AAR) present analyses of selected postulated events important in ITER safety studies, including hypothetical events. The MELCOR 1.8.2 code was chosen as one of several codes to perform the ITER safety analysis because it models a wide range of physical phenomena such as thermal-hydraulics, heat transfer, and other phenomena including aerosol physics. MELCOR can also predict structural temperatures using the energy produced by radioactive heat or chemical reactions. The analysis has shown that the hydrogen/dust explosion damaged the VV confinement barriers, transporting dust from the VV to the port cell, the NBI cell and other penetration lines. Unlike in the accident analysis performed in the accident analysis report (AAR), the radioactive material was released into the environment shortly after the event. The simplified accident analysis was performed in an attempt to obtain a fast safety analysis; however, multiple components and initial conditions not under consideration caused significant differences from the AAR analysis results.

  1. ANATI QUANTI: software de análises quantitativas para estudos em anatomia vegetal ANATI QUANTI: quantitative analysis software for plant anatomy studies

    Directory of Open Access Journals (Sweden)

    T.V. Aguiar

    2007-12-01

    Full Text Available In several interdisciplinary studies in which Plant Anatomy is used, complementary quantitative analyses are necessary. Generally, micromorphometric evaluation is performed manually and/or using non-specific image analysis software. This work aimed to develop a program specific to quantitative Plant Anatomy and to test its efficiency and acceptance by users. The solution was developed in the Java language for greater portability across operating systems. The software was named ANATI QUANTI and was tested by students, researchers and professors of the Plant Anatomy Laboratory of the Universidade Federal de Viçosa (UFV). All respondents received photographs on which to perform measurements in ANATI QUANTI and to compare the results with those obtained using the available software. Through previously formulated questionnaires, the volunteers highlighted the main advantages and disadvantages of the developed program in relation to the available software. Besides being more specific, simpler and faster than the available software, ANATI QUANTI is reliable and met the respondents' expectations. However, additional features need to be added, such as the insertion of new scales, which would broaden the range of users. ANATI QUANTI is already in use in research carried out by users at UFV. As free and open-source software, it will be made available on the internet at no cost.

  2. Featureous: A Tool for Feature-Centric Analysis of Java Software

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2010-01-01

    Feature-centric comprehension of source code is necessary for incorporating user-requested modifications during software evolution and maintenance. However, such comprehension is difficult to achieve in the case of large object-oriented programs due to the size, complexity, and implicit character of mappings between features and source code. To support programmers in overcoming these difficulties, we present a feature-centric analysis tool, Featureous. Our tool extends the NetBeans IDE with mechanisms for efficient location of feature implementations in legacy source code, and an extensive analysis of the discovered feature-code relations through a number of analytical views.

  3. ACQ4: an open-source software platform for data acquisition and analysis in neurophysiology research.

    Science.gov (United States)

    Campagnola, Luke; Kratz, Megan B; Manis, Paul B

    2014-01-01

    The complexity of modern neurophysiology experiments requires specialized software to coordinate multiple acquisition devices and analyze the collected data. We have developed ACQ4, an open-source software platform for performing data acquisition and analysis in experimental neurophysiology. This software integrates the tasks of acquiring, managing, and analyzing experimental data. ACQ4 has been used primarily for standard patch-clamp electrophysiology, laser scanning photostimulation, multiphoton microscopy, intrinsic imaging, and calcium imaging. The system is highly modular, which facilitates the addition of new devices and functionality. The modules included with ACQ4 provide for rapid construction of acquisition protocols, live video display, and customizable analysis tools. Position-aware data collection allows automated construction of image mosaics and registration of images with 3-dimensional anatomical atlases. ACQ4 uses free and open-source tools including Python, NumPy/SciPy for numerical computation, PyQt for the user interface, and PyQtGraph for scientific graphics. Supported hardware includes cameras, patch clamp amplifiers, scanning mirrors, lasers, shutters, Pockels cells, motorized stages, and more. ACQ4 is available for download at http://www.acq4.org.

  4. ACQ4: an open-source software platform for data acquisition and analysis in neurophysiology research

    Directory of Open Access Journals (Sweden)

    Luke eCampagnola

    2014-01-01

    Full Text Available The complexity of modern neurophysiology experiments requires specialized software to coordinate multiple acquisition devices and analyze the collected data. We have developed ACQ4, an open-source software platform for performing data acquisition and analysis in experimental neurophysiology. This software integrates the tasks of acquiring, managing, and analyzing experimental data. ACQ4 has been used primarily for standard patch-clamp electrophysiology, laser scanning photostimulation, multiphoton microscopy, intrinsic imaging, and calcium imaging. The system is highly modular, which facilitates the addition of new devices and functionality. The modules included with ACQ4 provide for rapid construction of acquisition protocols, live video display, and customizable analysis tools. Position-aware data collection allows automated construction of image mosaics and registration of images with 3-dimensional anatomical atlases. ACQ4 uses free and open-source tools including Python, NumPy/SciPy for numerical computation, PyQt for the user interface, and PyQtGraph for scientific graphics. Supported hardware includes cameras, patch clamp amplifiers, scanning mirrors, lasers, shutters, Pockels cells, motorized stages, and more. ACQ4 is available for download at http://www.acq4.org.

  5. ICC-CLASS: isotopically-coded cleavable crosslinking analysis software suite

    Directory of Open Access Journals (Sweden)

    Borchers Christoph H

    2010-01-01

    Full Text Available Abstract Background Successful application of crosslinking combined with mass spectrometry for studying proteins and protein complexes requires specifically-designed crosslinking reagents, experimental techniques, and data analysis software. Using isotopically-coded ("heavy" and "light") versions of the crosslinker and cleavable crosslinking reagents is analytically advantageous for mass spectrometric applications and provides a "handle" that can be used to distinguish crosslinked peptides of different types, and to increase the confidence of the identification of the crosslinks. Results Here, we describe a program suite designed for the analysis of mass spectrometric data obtained with isotopically-coded cleavable crosslinkers. The suite contains three programs: DX, DXDX, and DXMSMS. DX searches the mass spectra for the presence of ion signal doublets resulting from the light and heavy isotopic forms of the isotopically-coded crosslinking reagent used. DXDX searches for possible mass matches between cleaved and uncleaved isotopically-coded crosslinks based on the established chemistry of the cleavage reaction for a given crosslinking reagent. DXMSMS assigns the crosslinks to the known protein sequences, based on the isotopically-coded and un-coded MS/MS fragmentation data of uncleaved and cleaved peptide crosslinks. Conclusion The combination of these three programs, which are tailored to the analytical features of the specific isotopically-coded cleavable crosslinking reagents used, represents a powerful software tool for automated high-accuracy peptide crosslink identification. See: http://www.creativemolecules.com/CM_Software.htm

  6. A METHOD FOR SELECTING SOFTWARE FOR DYNAMIC EVENT ANALYSIS I: PROBLEM SELECTION

    Energy Technology Data Exchange (ETDEWEB)

    J. M. Lacy; S. R. Novascone; W. D. Richins; T. K. Larson

    2007-08-01

    New nuclear power reactor designs will require resistance to a variety of possible malevolent attacks, as well as traditional dynamic accident scenarios. The design/analysis team may be faced with a broad range of phenomena including air and ground blasts, high-velocity penetrators or shaped charges, and vehicle or aircraft impacts. With a host of software tools available to address these high-energy events, the analysis team must evaluate and select the software most appropriate for their particular set of problems. The accuracy of the selected software should then be validated with respect to the phenomena governing the interaction of the threat and structure. In this paper, we present a method for systematically comparing current high-energy physics codes for specific applications in new reactor design. Several codes are available for the study of blast, impact, and other shock phenomena. Historically, these packages were developed to study specific phenomena such as explosives performance, penetrator/target interaction, or accidental impacts. As developers generalize the capabilities of their software, legacy biases and assumptions can remain that could affect the applicability of the code to other processes and phenomena. R&D institutions generally adopt one or two software packages and use them almost exclusively, performing benchmarks on a single-problem basis. At the Idaho National Laboratory (INL), new comparative information was desired to permit researchers to select the best code for a particular application by matching its characteristics to the physics, materials, and rate scale (or scales) representing the problem at hand. A study was undertaken to investigate the comparative characteristics of a group of shock and high-strain rate physics codes including ABAQUS, LS-DYNA, CTH, ALEGRA, ALE-3D, and RADIOSS. A series of benchmark problems were identified to exercise the features and capabilities of the subject software. To be useful, benchmark problems

  7. Representation of the Physiological Factors Contributing to Postflight Changes in Functional Performance Using Motion Analysis Software

    Science.gov (United States)

    Parks, Kelsey

    2010-01-01

    Astronauts experience changes in multiple physiological systems due to exposure to the microgravity conditions of space flight. To understand how changes in physiological function influence functional performance, a testing procedure has been developed that evaluates both astronaut postflight functional performance and related physiological changes. Astronauts complete seven functional and physiological tests. The objective of this project is to use motion tracking and digitizing software to visually display the postflight decrement in the functional performance of the astronauts. The motion analysis software will be used to digitize astronaut data videos into stick figure videos to represent the astronauts as they perform the Functional Tasks Tests. This project will benefit NASA by allowing NASA scientists to present data of their neurological studies without revealing the identities of the astronauts.

  8. Madagascar: open-source software project for multidimensional data analysis and reproducible computational experiments

    Directory of Open Access Journals (Sweden)

    Sergey Fomel

    2013-11-01

    Full Text Available The Madagascar software package is designed for analysis of large-scale multidimensional data, such as those occurring in exploration geophysics. Madagascar provides a framework for reproducible research. By “reproducible research” we refer to the discipline of attaching software codes and data to computational results reported in publications. The package contains a collection of (a) computational modules, (b) data-processing scripts, and (c) research papers. Madagascar is distributed on SourceForge under a GPL v2 license https://sourceforge.net/projects/rsf/. By October 2013, more than 70 people from different organizations around the world have contributed to the project, with increasing year-to-year activity. The Madagascar website is http://www.ahay.org/.

  9. Value Benefit Analysis Software and Its Application in Bolu-Lake Abant Natural Park

    Directory of Open Access Journals (Sweden)

    Omer Lutfu Corbaci

    2008-09-01

    Full Text Available Value benefit analysis (VBA) is a psychometric instrument for finding the best compromise in forestry multiple-use planning when the multiple objectives cannot be expressed in the same physical or monetary unit. It ensures a systematic assessment of the consequences of proposed alternatives and thoroughly documents the decision process. The method leads to a ranking of alternatives based upon weighting of the objectives and evaluation of the contribution of each alternative to these objectives. The use of the method is illustrated with hypothetical data about Bolu-Lake Abant Natural Park (BLANP). In addition, computer software for checking the confidence of the evaluations was created in this study. This software puts into practice the method proposed by Churchman and Ackoff, and determines the significance of the alternatives quickly and accurately.
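    The core calculation behind such a ranking can be sketched in a few lines. The example below is a generic weighted-scoring illustration with made-up weights and scores; it is not the BLANP dataset and it omits the Churchman-Ackoff consistency-checking step that the software implements.

```python
# Generic value benefit scoring: rank alternatives by the weighted sum of their
# contributions to each objective. Weights and scores below are made up.
import numpy as np

weights = np.array([0.40, 0.35, 0.25])   # e.g. recreation, conservation, revenue
scores = np.array([[7, 9, 4],            # alternative A scored against each objective
                   [5, 6, 8],            # alternative B
                   [8, 4, 7]])           # alternative C

value = scores @ weights                 # overall value benefit of each alternative
for rank, idx in enumerate(np.argsort(value)[::-1], start=1):
    print(f"rank {rank}: alternative {'ABC'[idx]} (value benefit = {value[idx]:.2f})")
```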

  10. P-MartCancer–Interactive Online Software to Enable Analysis of Shotgun Cancer Proteomic Datasets

    Energy Technology Data Exchange (ETDEWEB)

    Webb-Robertson, Bobbie-Jo M.; Bramer, Lisa M.; Jensen, Jeffrey L.; Kobold, Markus A.; Stratton, Kelly G.; White, Amanda M.; Rodland, Karin D.

    2017-10-31

    P-MartCancer is a new interactive web-based software environment that enables biomedical and biological scientists to perform in-depth analyses of global proteomics data without requiring direct interaction with the data or with statistical software. P-MartCancer offers a series of statistical modules associated with quality assessment, peptide and protein statistics, protein quantification and exploratory data analyses driven by the user via customized workflows and interactive visualization. Currently, P-MartCancer offers access to multiple cancer proteomic datasets generated through the Clinical Proteomics Tumor Analysis Consortium (CPTAC) at the peptide, gene and protein levels. P-MartCancer is deployed using Azure technologies (http://pmart.labworks.org/cptac.html); the web service is alternatively available via Docker Hub (https://hub.docker.com/r/pnnl/pmart-web/) and many statistical functions can be utilized directly from an R package available on GitHub (https://github.com/pmartR).

  11. Introduction to the KWALON Experiment: Discussions on Qualitative Data Analysis Software by Developers and Users

    Directory of Open Access Journals (Sweden)

    Jeanine C. Evers

    2010-11-01

    Full Text Available In this introduction to the KWALON Experiment and related conference, we describe the motivations of the collaborating European networks in organising this joint endeavour. The KWALON Experiment consisted of five developers of Qualitative Data Analysis (QDA) software analysing a dataset regarding the financial crisis in the time period 2008-2009, provided by the conference organisers. Besides this experiment, researchers were invited to present their reflective papers on the use of QDA software. This introduction gives a description of the experiment, the "rules", research questions and reflective points, as well as a full description of the dataset and search rules used, and our reflection on the lessons learned. The related conference is described, as are the papers which are included in this FQS issue. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1101405

  12. OpenStereo: Open Source, Cross-Platform Software for Structural Geology Analysis

    Science.gov (United States)

    Grohmann, C. H.; Campanha, G. A.

    2010-12-01

    Free and open source software (FOSS) are increasingly seen as synonyms of innovation and progress. Freedom to run, copy, distribute, study, change and improve the software (through access to the source code) assures a high level of positive feedback between users and developers, which results in stable, secure and constantly updated systems. Several software packages for structural geology analysis are available to the user, with commercial licenses or that can be downloaded at no cost from the Internet. Some provide basic tools of stereographic projections such as plotting poles, great circles, density contouring, eigenvector analysis, data rotation etc, while others perform more specific tasks, such as paleostress or geotechnical/rock stability analysis. This variety also means a wide range of data formatting for input, Graphical User Interface (GUI) design and graphic export formats. The majority of packages are built for MS-Windows and even though there are packages for the UNIX-based MacOS, there aren't native packages for *nix (UNIX, Linux, BSD etc) Operating Systems (OS), forcing the users to run these programs with emulators or virtual machines. Those limitations led us to develop OpenStereo, an open source, cross-platform software for stereographic projections and structural geology. The software is written in Python, a high-level, cross-platform programming language, and the GUI is designed with wxPython, which provides a consistent look regardless of the OS. Numeric operations (like matrix and linear algebra) are performed with the NumPy module and all graphic capabilities are provided by the Matplotlib library, including on-screen plotting and graphic exporting to common desktop formats (emf, eps, ps, pdf, png, svg). Data input is done with simple ASCII text files, with values of dip direction and dip/plunge separated by spaces, tabs or commas. The user can open multiple files at the same time (or the same file more than once), and overlay different elements of
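    As a worked illustration of the kind of input and computation involved (not OpenStereo's actual code), the sketch below converts hypothetical dip-direction/dip pairs to pole vectors and performs a simple eigenvector analysis of the orientation matrix with NumPy.

```python
# Illustration only (not OpenStereo's code): poles to planes from dip-direction/dip
# pairs, followed by eigenvector analysis of the orientation matrix.
import numpy as np

def pole_vector(dip_direction, dip):
    """Unit vector (north, east, down) of the pole to a plane given its
    dip direction and dip in degrees (lower-hemisphere convention)."""
    trend = np.radians((dip_direction + 180.0) % 360.0)
    plunge = np.radians(90.0 - dip)
    return np.array([np.cos(plunge) * np.cos(trend),
                     np.cos(plunge) * np.sin(trend),
                     np.sin(plunge)])

# Hypothetical measurements in the "dip-direction dip" format described above.
planes = [(120, 40), (130, 35), (125, 45), (300, 60)]
poles = np.array([pole_vector(dd, d) for dd, d in planes])

# Orientation (scatter) matrix; its eigenvectors/eigenvalues summarize the fabric.
T = poles.T @ poles / len(poles)
eigvals, eigvecs = np.linalg.eigh(T)
print("normalized eigenvalues:", np.round(eigvals[::-1], 3))
print("principal pole direction (N, E, down):", np.round(eigvecs[:, -1], 3))
```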

  13. Enhancing supply vessel safety

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-07-01

    A supply-vessel bridge installation consists of a navigating bridge and a control position aft, from which operators control the ship when close to rigs or platforms, and operate winches and other loading equipment. The International Convention for the Safety of Life at Sea (SOLAS) does not regulate the layout, so design varies to a large degree, often causing an imperfect working environment. As for other types of ships, more than half the offshore service vessel accidents at sea are caused by bridge system failures. A majority can be traced back to technical design and operational errors. The research and development project NAUT-OSV is a response to the offshore industry's safety concerns. Analysis of 24 incidents involving contact or collision between supply vessels and offshore installations owned or operated by Norwegian companies indicated that failures in the bridge system were often the cause.

  14. Hospital-based financial analysis of endovascular therapy and intravenous thrombolysis for large vessel acute ischemic strokes: the 'bottom line'.

    Science.gov (United States)

    Rai, Ansaar T; Evans, Kim

    2015-02-01

    Economic viability is important to any hospital striving to be a comprehensive stroke center. An inability to recover cost can strain sustained delivery of advanced stroke care. To carry out a comparative financial analysis of intravenous (IV) recombinant tissue plasminogen activator and endovascular (EV) therapy in treating large vessel strokes from a hospital's perspective. Actual hospital's charges, costs, and payments were analyzed for 265 patients who received treatment for large vessel strokes. The patients were divided into an EV (n=141) and an IV group (n=124). The net gain/loss was calculated as the difference between payments received and the total cost. The charges, costs, and payments were significantly higher for the EV than the IV group (p<0.0001 for all). Medicare A was the main payer. Length of stay was inversely related to net gain/loss (p<0.0001). Favorable outcome was associated with a net gain of $3853 (±$21,155) and poor outcome with a net deficit of $2906 (±$15,088) (p=0.003). The hospital showed a net gain for the EV group versus a net deficit for the IV group in patients who survived the admission (p=0.04), had a favorable outcome (p=0.1), or were discharged to home (p=0.03). There was no difference in the time in hospital based on in-hospital mortality for the EV group but patients who died in the IV group had a significantly shorter length of stay than those who survived (p=0.04). The favorable outcome of 42.3% in the EV group was significantly higher than the 29.4% in the IV group (p=0.03). Endovascular therapy was associated with better outcomes and higher cost-recovery than IV thrombolysis in patients with large vessel strokes.

  15. OpenFLUX: efficient modelling software for 13C-based metabolic flux analysis.

    Science.gov (United States)

    Quek, Lake-Ee; Wittmann, Christoph; Nielsen, Lars K; Krömer, Jens O

    2009-05-01

    The quantitative analysis of metabolic fluxes, i.e., in vivo activities of intracellular enzymes and pathways, provides key information on biological systems in systems biology and metabolic engineering. It is based on a comprehensive approach combining (i) tracer cultivation on 13C substrates, (ii) 13C labelling analysis by mass spectrometry and (iii) mathematical modelling for experimental design, data processing, flux calculation and statistics. Whereas the cultivation and the analytical part is fairly advanced, a lack of appropriate modelling software solutions for all modelling aspects in flux studies is limiting the application of metabolic flux analysis. We have developed OpenFLUX as a user friendly, yet flexible software application for small and large scale 13C metabolic flux analysis. The application is based on the new Elementary Metabolite Unit (EMU) framework, significantly enhancing computation speed for flux calculation. From simple notation of metabolic reaction networks defined in a spreadsheet, the OpenFLUX parser automatically generates MATLAB-readable metabolite and isotopomer balances, thus strongly facilitating model creation. The model can be used to perform experimental design, parameter estimation and sensitivity analysis either using the built-in gradient-based search or Monte Carlo algorithms or in user-defined algorithms. Exemplified for a microbial flux study with 71 reactions, 8 free flux parameters and mass isotopomer distribution of 10 metabolites, OpenFLUX allowed us to automatically compile the EMU-based model from an Excel file containing metabolic reactions and carbon transfer mechanisms, demonstrating its user-friendliness. It reliably reproduced the published data, and optimum flux distributions for the network under study were found quickly. By providing the software open source, we hope it will evolve with the rapidly growing field of fluxomics.

  16. Evaluation of a Game to Teach Requirements Collection and Analysis in Software Engineering at Tertiary Education Level

    Science.gov (United States)

    Hainey, Thomas; Connolly, Thomas M.; Stansfield, Mark; Boyle, Elizabeth A.

    2011-01-01

    A highly important part of software engineering education is requirements collection and analysis which is one of the initial stages of the Database Application Lifecycle and arguably the most important stage of the Software Development Lifecycle. No other conceptual work is as difficult to rectify at a later stage or as damaging to the overall…

  17. Analysis of the course and the possibility of accounting data in selected accounting software

    OpenAIRE

    KARASOVÁ, Iveta

    2013-01-01

    The thesis compares two accounting software packages used in a selected company. The company had used the accounting software Abra from its beginnings, but in 2015 it purchased the accounting software Pohoda. The company had a mostly negative experience with Abra. As for Pohoda, no disadvantages have been found, though not all of its functions have been used yet. The aim of the thesis is to assess whether the enterprise should keep using Pohoda or whether it should come back ...

  18. Inequalities in Open Source Software Development: Analysis of Contributor’s Commits in Apache Software Foundation Projects

    Science.gov (United States)

    2016-01-01

    While researchers are becoming increasingly interested in studying the OSS phenomenon, there is still a small number of studies analyzing larger samples of projects and investigating the structure of activities among OSS developers. The significant amount of information that has been gathered in the publicly available open-source software repositories and mailing-list archives offers an opportunity to analyze project structures and participant involvement. In this article, using commit data from 263 Apache project repositories (nearly all), we show that although OSS development is often described as collaborative, it in fact predominantly relies on radically solitary input and individual, non-collaborative contributions. We also show, in the first published study of this magnitude, that the engagement of contributors follows a power-law distribution. PMID:27096157
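    To make the power-law claim concrete, the sketch below fits a power-law exponent to per-contributor commit counts using the continuous maximum-likelihood estimator of Clauset, Shalizi and Newman; this is an illustration on synthetic data, not the estimation procedure used in the article.

```python
# Synthetic illustration of a power-law fit to commit counts (not the article's method).
# Uses the continuous maximum-likelihood estimator of Clauset, Shalizi & Newman (2009).
import numpy as np

def powerlaw_alpha_mle(x, xmin):
    """MLE of the exponent alpha in p(x) ~ x**(-alpha) for values x >= xmin."""
    tail = np.asarray(x, dtype=float)
    tail = tail[tail >= xmin]
    return 1.0 + len(tail) / np.sum(np.log(tail / xmin))

rng = np.random.default_rng(0)
# Heavy-tailed "commits per contributor": Pareto with exponent 2.5 above xmin = 5.
commits = (rng.pareto(a=1.5, size=2000) + 1.0) * 5.0
print(f"estimated alpha = {powerlaw_alpha_mle(commits, xmin=5.0):.2f}")  # expect ~2.5
```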

  19. BROCCOLI: Software for Fast fMRI Analysis on Many-Core CPUs and GPUs

    Directory of Open Access Journals (Sweden)

    Anders eEklund

    2014-03-01

    Full Text Available Analysis of functional magnetic resonance imaging (fMRI) data is becoming ever more computationally demanding as temporal and spatial resolutions improve, and large, publicly available data sets proliferate. Moreover, methodological improvements in the neuroimaging pipeline, such as non-linear spatial normalization, non-parametric permutation tests and Bayesian Markov Chain Monte Carlo approaches, can dramatically increase the computational burden. Despite these challenges, there do not yet exist any fMRI software packages which leverage inexpensive and powerful graphics processing units (GPUs) to perform these analyses. Here, we therefore present BROCCOLI, a free software package written in OpenCL (Open Computing Language) that can be used for parallel analysis of fMRI data on a large variety of hardware configurations. BROCCOLI has, for example, been tested with an Intel CPU, an Nvidia GPU and an AMD GPU. These tests show that parallel processing of fMRI data can lead to significantly faster analysis pipelines. This speedup can be achieved on relatively standard hardware, but further, dramatic speed improvements require only a modest investment in GPU hardware. BROCCOLI (running on a GPU) can perform non-linear spatial normalization to a 1 mm3 brain template in 4-6 seconds, and run a second level permutation test with 10,000 permutations in about a minute. These non-parametric tests are generally more robust than their parametric counterparts, and can also enable more sophisticated analyses by estimating complicated null distributions. Additionally, BROCCOLI includes support for Bayesian first-level fMRI analysis using a Gibbs sampler. The new software is freely available under GNU GPL3 and can be downloaded from github (https://github.com/wanderine/BROCCOLI/).
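    The permutation testing mentioned above can be illustrated with a plain NumPy sketch. BROCCOLI runs this kind of test in parallel on a GPU in OpenCL; the example below is only a serial illustration of the principle, applied to synthetic group data.

```python
# Serial NumPy illustration of a two-sample permutation test on the mean difference.
# BROCCOLI performs this kind of test in parallel on a GPU; the data here are synthetic.
import numpy as np

def permutation_test(group_a, group_b, n_perm=10_000, seed=0):
    """Return the observed mean difference and its two-sided permutation p-value."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([group_a, group_b])
    n_a = len(group_a)
    observed = group_a.mean() - group_b.mean()
    null = np.empty(n_perm)
    for i in range(n_perm):
        rng.shuffle(pooled)                       # relabel subjects at random
        null[i] = pooled[:n_a].mean() - pooled[n_a:].mean()
    p_value = (np.sum(np.abs(null) >= abs(observed)) + 1) / (n_perm + 1)
    return observed, p_value

a = np.random.default_rng(1).normal(0.4, 1.0, size=20)   # e.g. contrast values, group A
b = np.random.default_rng(2).normal(0.0, 1.0, size=20)   # group B
print(permutation_test(a, b))
```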

  20. XplorSeq: A software environment for integrated management and phylogenetic analysis of metagenomic sequence data

    Directory of Open Access Journals (Sweden)

    Frank Daniel N

    2008-10-01

    Full Text Available Abstract Background Advances in automated DNA sequencing technology have accelerated the generation of metagenomic DNA sequences, especially environmental ribosomal RNA gene (rDNA) sequences. As the scale of rDNA-based studies of microbial ecology has expanded, need has arisen for software that is capable of managing, annotating, and analyzing the plethora of diverse data accumulated in these projects. Results XplorSeq is a software package that facilitates the compilation, management and phylogenetic analysis of DNA sequences. XplorSeq was developed for, but is not limited to, high-throughput analysis of environmental rRNA gene sequences. XplorSeq integrates and extends several commonly used UNIX-based analysis tools by use of a Macintosh OS X-based graphical user interface (GUI). Through this GUI, users may perform basic sequence import and assembly steps (base-calling, vector/primer trimming, contig assembly), perform BLAST (Basic Local Alignment Search Tool) searches of NCBI and local databases, create multiple sequence alignments, build phylogenetic trees, assemble Operational Taxonomic Units, estimate biodiversity indices, and summarize data in a variety of formats. Furthermore, sequences may be annotated with user-specified meta-data, which then can be used to sort data and organize analyses and reports. A document-based architecture permits parallel analysis of sequence data from multiple clones or amplicons, with sequences and other data stored in a single file. Conclusion XplorSeq should benefit researchers who are engaged in analyses of environmental sequence data, especially those with little experience using bioinformatics software. Although XplorSeq was developed for management of rDNA sequence data, it can be applied to most any sequencing project. The application is available free of charge for non-commercial use at http://vent.colorado.edu/phyloware.

  1. Analysis of the common genetic component of large-vessel vasculitides through a meta-Immunochip strategy.

    Science.gov (United States)

    Carmona, F David; Coit, Patrick; Saruhan-Direskeneli, Güher; Hernández-Rodríguez, José; Cid, María C; Solans, Roser; Castañeda, Santos; Vaglio, Augusto; Direskeneli, Haner; Merkel, Peter A; Boiardi, Luigi; Salvarani, Carlo; González-Gay, Miguel A; Martín, Javier; Sawalha, Amr H

    2017-03-09

    Giant cell arteritis (GCA) and Takayasu's arteritis (TAK) are major forms of large-vessel vasculitis (LVV) that share clinical features. To evaluate their genetic similarities, we analysed Immunochip genotyping data from 1,434 LVV patients and 3,814 unaffected controls. Genetic pleiotropy was also estimated. The HLA region harboured the main disease-specific associations. GCA was mostly associated with class II genes (HLA-DRB1/HLA-DQA1) whereas TAK was mostly associated with class I genes (HLA-B/MICA). Both the statistical significance and effect size of the HLA signals were considerably reduced in the cross-disease meta-analysis in comparison with the analysis of GCA and TAK separately. Consequently, no significant genetic correlation between these two diseases was observed when HLA variants were tested. Outside the HLA region, only one polymorphism located near the IL12B gene surpassed the study-wide significance threshold in the meta-analysis of the discovery datasets (rs755374, P = 7.54E-07; OR_GCA = 1.19, OR_TAK = 1.50). This marker was confirmed as a novel GCA risk factor using four additional cohorts (P_GCA = 5.52E-04, OR_GCA = 1.16). Taken together, our results provide evidence of strong genetic differences between GCA and TAK in the HLA. Outside this region, common susceptibility factors were suggested, especially within the IL12B locus.

  2. Development and validation of MIX: comprehensive free software for meta-analysis of causal research data

    Directory of Open Access Journals (Sweden)

    Ikeda Noriaki

    2006-10-01

    Full Text Available Abstract Background Meta-analysis has become a well-known method for synthesis of quantitative data from previously conducted research in applied health sciences. So far, meta-analysis has been particularly useful in evaluating and comparing therapies and in assessing causes of disease. Consequently, the number of software packages that can perform meta-analysis has increased over the years. Unfortunately, it can take a substantial amount of time to get acquainted with some of these programs and most contain little or no interactive educational material. We set out to create and validate an easy-to-use and comprehensive meta-analysis package that would be simple enough programming-wise to remain available as a free download. We specifically aimed at students and researchers who are new to meta-analysis, with important parts of the development oriented towards creating internal interactive tutoring tools and designing features that would facilitate usage of the software as a companion to existing books on meta-analysis. Results We took an unconventional approach and created a program that uses Excel as a calculation and programming platform. The main programming language was Visual Basic, as implemented in Visual Basic 6 and Visual Basic for Applications in Excel 2000 and higher. The development took approximately two years and resulted in the 'MIX' program, which can be downloaded from the program's website free of charge. Next, we set out to validate the MIX output with two major software packages as reference standards, namely STATA (metan, metabias, and metatrim) and Comprehensive Meta-Analysis Version 2 (CMA). Eight meta-analyses that had been published in major journals were used as data sources. All numerical and graphical results from analyses with MIX were identical to their counterparts in STATA and CMA. The MIX program distinguishes itself from most other programs by the extensive graphical output, the click-and-go (Excel) interface, and the
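    The central computation such packages automate can be sketched briefly. The code below is a generic inverse-variance fixed-effect pooling example with hypothetical study effects; it is not taken from MIX, STATA or CMA.

```python
# Generic inverse-variance fixed-effect pooling (not code from MIX, STATA or CMA).
# The log odds ratios and standard errors below are hypothetical.
import numpy as np

def fixed_effect_pool(effects, std_errors):
    """Return the pooled effect, its standard error, and a 95% confidence interval."""
    effects = np.asarray(effects, dtype=float)
    w = 1.0 / np.asarray(std_errors, dtype=float) ** 2   # inverse-variance weights
    pooled = np.sum(w * effects) / np.sum(w)
    se = 1.0 / np.sqrt(np.sum(w))
    return pooled, se, (pooled - 1.96 * se, pooled + 1.96 * se)

log_or = [0.30, 0.10, 0.45, 0.22]     # per-study log odds ratios
se = [0.15, 0.20, 0.25, 0.12]         # per-study standard errors
pooled, pooled_se, ci = fixed_effect_pool(log_or, se)
print(f"pooled log OR = {pooled:.3f} (SE {pooled_se:.3f}), 95% CI {ci[0]:.3f} to {ci[1]:.3f}")
```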

  3. Prediction of incident stroke events based on retinal vessel caliber: a systematic review and individual-participant meta-analysis

    NARCIS (Netherlands)

    McGeechan, Kevin; Liew, Gerald; Macaskill, Petra; Irwig, Les; Klein, Ronald; Klein, Barbara E. K.; Wang, Jie Jin; Mitchell, Paul; Vingerling, Johannes R.; de Jong, Paulus T. V. M.; Witteman, Jacqueline C. M.; Breteler, Monique M. B.; Shaw, Jonathan; Zimmet, Paul; Wong, Tien Y.

    2009-01-01

    The caliber of the retinal vessels has been shown to be associated with stroke events. However, the consistency and magnitude of association, and the changes in predicted risk independent of traditional risk factors, are unclear. To determine the association between retinal vessel caliber and the

  4. Multi-platform compatible software for analysis of polymer bending mechanics.

    Directory of Open Access Journals (Sweden)

    John S Graham

    Full Text Available Cytoskeletal polymers play a fundamental role in the responses of cells to both external and internal stresses. Quantitative knowledge of the mechanical properties of those polymers is essential for developing predictive models of cell mechanics and mechano-sensing. Linear cytoskeletal polymers, such as actin filaments and microtubules, can grow to cellular length scales at which they behave as semiflexible polymers that undergo thermally-driven shape deformations. Bending deformations are often modeled using the wormlike chain model. A quantitative metric of a polymer's resistance to bending is the persistence length, the fundamental parameter of that model. A polymer's bending persistence length is extracted from its shape as visualized using various imaging techniques. However, the analysis methodologies required for determining the persistence length are often not readily within reach of most biological researchers or educators. Motivated by that limitation, we developed user-friendly, multi-platform compatible software to determine the bending persistence length from images of surface-adsorbed or freely fluctuating polymers. Three different types of analysis are available (cosine correlation, end-to-end and bending-mode analyses), allowing for rigorous cross-checking of analysis results. The software is freely available and we provide sample data of adsorbed and fluctuating filaments and expected analysis results for educational and tutorial purposes.
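    For orientation, the cosine-correlation analysis mentioned above can be sketched as follows. For a filament confined to two dimensions (e.g. surface-adsorbed), the tangent-angle correlation is expected to decay as <cos theta(s)> = exp(-s / (2*Lp)); the code below fits that decay to a synthetic worm-like-chain contour. It is only an illustrative sketch, not the published software's implementation.

```python
# Sketch of the cosine-correlation analysis for a 2D (surface-adsorbed) filament,
# where <cos(theta(s))> = exp(-s / (2*Lp)). Not the published software's code.
import numpy as np

def persistence_length_2d(points, ds):
    """Estimate Lp from one 2D contour sampled at equal arc-length spacing ds."""
    tangents = np.diff(points, axis=0)
    tangents /= np.linalg.norm(tangents, axis=1, keepdims=True)
    seps, corr = [], []
    for lag in range(1, len(tangents) // 2):
        seps.append(lag * ds)
        corr.append(np.sum(tangents[:-lag] * tangents[lag:], axis=1).mean())
    seps, corr = np.array(seps), np.array(corr)
    keep = corr > 0                                   # avoid log of non-positive values
    slope = np.polyfit(seps[keep], np.log(corr[keep]), 1)[0]
    return -1.0 / (2.0 * slope)

# Synthetic worm-like chain with Lp = 10 (arbitrary units), sampled every ds = 0.1.
rng = np.random.default_rng(0)
ds, lp_true = 0.1, 10.0
angles = np.cumsum(rng.normal(0.0, np.sqrt(ds / lp_true), size=2000))
contour = np.cumsum(np.column_stack([np.cos(angles), np.sin(angles)]) * ds, axis=0)
print(f"estimated Lp ~ {persistence_length_2d(contour, ds):.1f}")
```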

  5. A software framework for the analysis of complex microscopy image data.

    Science.gov (United States)

    Chao, Jerry; Ward, E Sally; Ober, Raimund J

    2010-07-01

    Technological advances in both hardware and software have made possible the realization of sophisticated biological imaging experiments using the optical microscope. As a result, modern microscopy experiments are capable of producing complex image datasets. For a given data analysis task, the images in a set are arranged, based on the requirements of the task, by attributes such as the time and focus levels at which they were acquired. Importantly, different tasks performed over the course of an analysis are often facilitated by the use of different arrangements of the images. We present a software framework that supports the use of different logical image arrangements to analyze a physical set of images. This framework, called the Microscopy Image Analysis Tool (MIATool), realizes the logical arrangements using arrays of pointers to the images, thereby removing the need to replicate and manipulate the actual images in their storage medium. In order that they may be tailored to the specific requirements of disparate analysis tasks, these logical arrangements may differ in size and dimensionality, with no restrictions placed on the number of dimensions and the meaning of each dimension. MIATool additionally supports processing flexibility, extensible image processing capabilities, and data storage management.
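    The "arrays of pointers" idea described above can be illustrated with a small NumPy example. This is a toy sketch, not MIATool's code: the images are stored once, and each logical arrangement is simply an integer index array into that storage.

```python
# Toy illustration of logical image arrangements as index ("pointer") arrays.
# This is not MIATool code; the random arrays stand in for stored micrographs.
import numpy as np

rng = np.random.default_rng(0)
images = [rng.random((64, 64)) for _ in range(9)]     # the physical image set, stored once

# Two logical arrangements of the same images, with no pixel data copied:
by_time_and_focus = np.arange(9).reshape(3, 3)        # rows: time points, columns: focus levels
focal_series_t1 = by_time_and_focus[1, :]             # the focal series at the second time point

stack = np.stack([images[i] for i in focal_series_t1])
print(by_time_and_focus)
print("focal series shape:", stack.shape)             # (3, 64, 64)
```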

  6. Predictive Model and Software for Inbreeding-Purging Analysis of Pedigreed Populations

    Directory of Open Access Journals (Sweden)

    Aurora García-Dorado

    2016-11-01

    Full Text Available The inbreeding depression of fitness traits can be a major threat to the survival of populations experiencing inbreeding. However, its accurate prediction requires taking into account the genetic purging induced by inbreeding, which can be achieved using a “purged inbreeding coefficient”. We have developed a method to compute purged inbreeding at the individual level in pedigreed populations with overlapping generations. Furthermore, we derive the inbreeding depression slope for individual logarithmic fitness, which is larger than that for the logarithm of the population fitness average. In addition, we provide a new software package, PURGd, based on these theoretical results, that allows analyzing pedigree data to detect purging and to estimate the purging coefficient, which is the parameter necessary to predict the joint consequences of inbreeding and purging. The software also calculates the purged inbreeding coefficient for each individual, as well as standard and ancestral inbreeding. Analysis of simulation data shows that this software produces reasonably accurate estimates for the inbreeding depression rate and for the purging coefficient that are useful for predictive purposes.

  7. GENES - a software package for analysis in experimental statistics and quantitative genetics

    Directory of Open Access Journals (Sweden)

    Cosme Damião Cruz

    2013-06-01

    Full Text Available GENES is a software package used for data analysis and processing with different biometric models and is essential in genetic studies applied to plant and animal breeding. It allows parameter estimation to analyze biological phenomena and is fundamental for the decision-making process and predictions of success and viability of selection strategies. The program can be downloaded from the Internet (http://www.ufv.br/dbg/genes/genes.htm or http://www.ufv.br/dbg/biodata.htm) and is available in Portuguese, English and Spanish. Specific literature (http://www.livraria.ufv.br/) and a set of sample files are also provided, making GENES easy to use. The software is integrated into the programs MS Word, MS Excel and Paint, ensuring simplicity and effectiveness in data import and export of results, figures and data. It is also compatible with the free software R and Matlab, through the supply of useful scripts available for complementary analyses in different areas, including genome wide selection, prediction of breeding values and use of neural networks in genetic improvement.

  8. Analysis of quality raw data of second generation sequencers with Quality Assessment Software

    Directory of Open Access Journals (Sweden)

    Schneider Maria PC

    2011-04-01

    Full Text Available Abstract Background Second generation technologies have advantages over Sanger; however, they have resulted in new challenges for the genome construction process, especially because of the small size of the reads, despite the high degree of coverage. Independent of the program chosen for the construction process, DNA sequences are superimposed, based on identity, to extend the reads, generating contigs; mismatches indicate a lack of homology and are not included. This process improves our confidence in the sequences that are generated. Findings We developed Quality Assessment Software, with which one can review graphs showing the distribution of quality values from the sequencing reads. This software allows us to adopt more stringent quality standards for sequence data, based on quality-graph analysis and estimated coverage after applying the quality filter, providing acceptable sequence coverage for genome construction from short reads. Conclusions Quality filtering is a fundamental step in the process of constructing genomes, as it reduces the frequency of incorrect alignments that are caused by measuring errors, which can occur during the construction process due to the size of the reads, provoking misassemblies. Application of quality filters to sequence data, using the software Quality Assessment, along with graphing analyses, provided greater precision in the definition of cutoff parameters, which increased the accuracy of genome construction.
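    The kind of quality filtering discussed here can be sketched in a few lines. The example below is a minimal mean-Phred-quality filter on toy reads; it is not the published program, and the cutoff value is arbitrary.

```python
# Minimal sketch of mean-quality read filtering (not the published program).
# The cutoff and the two toy reads are made up.
import numpy as np

def mean_phred(quality_string, offset=33):
    """Mean Phred quality of a read from its ASCII-encoded quality string."""
    return float(np.mean([ord(c) - offset for c in quality_string]))

def filter_reads(reads, min_mean_q=25.0):
    """Keep (sequence, quality) pairs whose mean Phred quality reaches the cutoff."""
    return [(seq, qual) for seq, qual in reads if mean_phred(qual) >= min_mean_q]

reads = [("ACGTACGT", "IIIIIIII"),    # Phred 40 at every base: kept
         ("ACGTACGT", "##IIII##")]    # four Phred-2 bases drag the mean to ~21: dropped
kept = filter_reads(reads, min_mean_q=25.0)
print(f"{len(kept)} of {len(reads)} reads pass the quality filter")
```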

  9. FIMTrack: An open source tracking and locomotion analysis software for small animals.

    Directory of Open Access Journals (Sweden)

    Benjamin Risse

    2017-05-01

    Full Text Available Imaging and analyzing the locomotion behavior of small animals such as Drosophila larvae or C. elegans worms has become an integral subject of biological research. In the past we have introduced FIM, a novel imaging system capable of acquiring high-contrast images. This system in combination with the associated tracking software FIMTrack is already used by many groups all over the world. However, so far there has not been an in-depth discussion of the technical aspects. Here we elaborate on the implementation details of FIMTrack and give an in-depth explanation of the used algorithms. Among others, the software offers several tracking strategies to cover a wide range of different model organisms, locomotion types, and camera properties. Furthermore, the software facilitates stimuli-based analysis in combination with built-in manual tracking and correction functionalities. All features are integrated in an easy-to-use graphical user interface. To demonstrate the potential of FIMTrack we provide an evaluation of its accuracy using manually labeled data. The source code is available under the GNU GPLv3 at https://github.com/i-git/FIMTrack and pre-compiled binaries for Windows and Mac are available at http://fim.uni-muenster.de.

  10. SAMPA: A free software tool for skin and membrane permeation data analysis.

    Science.gov (United States)

    Bezrouk, Aleš; Fiala, Zdeněk; Kotingová, Lenka; Krulichová, Iva Selke; Kopečná, Monika; Vávrová, Kateřina

    2017-10-01

    Skin and membrane permeation experiments comprise an important step in the development of a transdermal or topical formulation or toxicological risk assessment. The standard method for analyzing these data relies on the linear part of a permeation profile. However, it is difficult to objectively determine when the profile becomes linear, or the experiment duration may be insufficient to reach a maximum or steady state. Here, we present a software tool for Skin And Membrane Permeation data Analysis, SAMPA, that is easy to use and overcomes several of these difficulties. The SAMPA method and software have been validated on in vitro and in vivo permeation data on human, pig and rat skin and model stratum corneum lipid membranes using compounds that range from highly lipophilic polycyclic aromatic hydrocarbons to highly hydrophilic antiviral drug, with and without two permeation enhancers. The SAMPA performance was compared with the standard method using a linear part of the permeation profile and a complex mathematical model. SAMPA is a user-friendly, open-source software tool for analyzing the data obtained from skin and membrane permeation experiments. It runs on a Microsoft Windows platform and is freely available as a Supporting file to this article.
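    The "standard method" referred to above, fitting the linear part of a cumulative permeation profile, can be sketched as follows. This is not SAMPA's algorithm; the window choice (simply the last four points) and the data are arbitrary placeholders.

```python
# Sketch of the "standard method": fit a line to the terminal, quasi-linear part of a
# cumulative permeation profile to get steady-state flux and lag time. Not SAMPA code.
import numpy as np

def steady_state_flux(t_hours, cumulative_amount, area_cm2, n_last=4):
    """Return (flux, lag_time) from a straight-line fit to the last n_last points."""
    slope, intercept = np.polyfit(t_hours[-n_last:], cumulative_amount[-n_last:], 1)
    flux = slope / area_cm2            # amount per cm^2 per hour
    lag_time = -intercept / slope      # x-intercept of the extrapolated linear part
    return flux, lag_time

t = np.array([1, 2, 4, 6, 8, 12, 24], dtype=float)        # sampling times (h)
q = np.array([0.1, 0.4, 1.5, 3.0, 4.6, 7.8, 17.4])        # cumulative amount permeated (ug)
flux, lag = steady_state_flux(t, q, area_cm2=1.0)
print(f"flux ~ {flux:.2f} ug/cm2/h, lag time ~ {lag:.1f} h")
```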

  11. Security Vulnerability Profiles of Mission Critical Software: Empirical Analysis of Security Related Bug Reports

    Science.gov (United States)

    Goseva-Popstojanova, Katerina; Tyo, Jacob

    2017-01-01

    While some prior research work exists on characteristics of software faults (i.e., bugs) and failures, very little work has been published on analysis of software application vulnerabilities. This paper aims to contribute towards filling that gap by presenting an empirical investigation of application vulnerabilities. The results are based on data extracted from issue tracking systems of two NASA missions. These data were organized in three datasets: Ground mission IVV issues, Flight mission IVV issues, and Flight mission Developers issues. In each dataset, we identified security related software bugs and classified them in specific vulnerability classes. Then, we created the security vulnerability profiles, i.e., determined where and when the security vulnerabilities were introduced and what were the dominating vulnerability classes. Our main findings include: (1) In the IVV issues datasets the majority of vulnerabilities were code related and were introduced in the Implementation phase. (2) For all datasets, around 90% of the vulnerabilities were located in two to four subsystems. (3) Out of 21 primary classes, five dominated: Exception Management, Memory Access, Other, Risky Values, and Unused Entities. Together, they contributed from 80% to 90% of the vulnerabilities in each dataset.

  12. FIMTrack: An open source tracking and locomotion analysis software for small animals.

    Science.gov (United States)

    Risse, Benjamin; Berh, Dimitri; Otto, Nils; Klämbt, Christian; Jiang, Xiaoyi

    2017-05-01

    Imaging and analyzing the locomotion behavior of small animals such as Drosophila larvae or C. elegans worms has become an integral subject of biological research. In the past we have introduced FIM, a novel imaging system capable of acquiring high-contrast images. This system in combination with the associated tracking software FIMTrack is already used by many groups all over the world. However, so far there has not been an in-depth discussion of the technical aspects. Here we elaborate on the implementation details of FIMTrack and give an in-depth explanation of the used algorithms. Among others, the software offers several tracking strategies to cover a wide range of different model organisms, locomotion types, and camera properties. Furthermore, the software facilitates stimuli-based analysis in combination with built-in manual tracking and correction functionalities. All features are integrated in an easy-to-use graphical user interface. To demonstrate the potential of FIMTrack we provide an evaluation of its accuracy using manually labeled data. The source code is available under the GNU GPLv3 at https://github.com/i-git/FIMTrack and pre-compiled binaries for Windows and Mac are available at http://fim.uni-muenster.de.

  13. CLMSVault: A Software Suite for Protein Cross-Linking Mass-Spectrometry Data Analysis and Visualization.

    Science.gov (United States)

    Courcelles, Mathieu; Coulombe-Huntington, Jasmin; Cossette, Émilie; Gingras, Anne-Claude; Thibault, Pierre; Tyers, Mike

    2017-07-07

    Protein cross-linking mass spectrometry (CL-MS) enables the sensitive detection of protein interactions and the inference of protein complex topology. The detection of chemical cross-links between protein residues can identify intra- and interprotein contact sites or provide physical constraints for molecular modeling of protein structure. Recent innovations in cross-linker design, sample preparation, mass spectrometry, and software tools have significantly improved CL-MS approaches. Although a number of algorithms now exist for the identification of cross-linked peptides from mass spectral data, a dearth of user-friendly analysis tools represent a practical bottleneck to the broad adoption of the approach. To facilitate the analysis of CL-MS data, we developed CLMSVault, a software suite designed to leverage existing CL-MS algorithms and provide intuitive and flexible tools for cross-platform data interpretation. CLMSVault stores and combines complementary information obtained from different cross-linkers and search algorithms. CLMSVault provides filtering, comparison, and visualization tools to support CL-MS analyses and includes a workflow for label-free quantification of cross-linked peptides. An embedded 3D viewer enables the visualization of quantitative data and the mapping of cross-linked sites onto PDB structural models. We demonstrate the application of CLMSVault for the analysis of a noncovalent Cdc34-ubiquitin protein complex cross-linked under different conditions. CLMSVault is open-source software (available at https://gitlab.com/courcelm/clmsvault.git ), and a live demo is available at http://democlmsvault.tyerslab.com/ .

  14. Analysis of On-Board Oxygen and Nitrogen Generation Systems for Surface Vessels.

    Science.gov (United States)

    1983-06-01

    electrical supply: all utilities (filter, heater, cooler, compressor, valves, solid state electronics, monitor/command, converters, firefighting) are... explosion), liquid nitrogen (suffocation, cold temperatures, etc.). A comprehensive analysis of expected hazards caused by the presence of LOX, LN, GOX... the designer and the user should review the application environment (e.g. open or in enclosure) to avoid a potential suffocation hazard. (Appendix A

  15. Using MATLAB software with Tomcat server and Java platform for remote image analysis in pathology.

    Science.gov (United States)

    Markiewicz, Tomasz

    2011-03-30

    Matlab is one of the most advanced development tools for applications in engineering practice. From our point of view the most important part is the image processing toolbox, offering many built-in functions, including mathematical morphology, and implementations of many artificial neural networks. It is a very popular platform for creating specialized programs for image analysis, including in pathology. Based on the latest version of the Matlab Builder Java toolbox, it is possible to create software serving as a remote system for image analysis in pathology via internet communication. The internet platform can be realized based on JavaServer Pages (JSP) with the Tomcat server as the servlet container. In the presented software implementation we propose remote image analysis realized by Matlab algorithms. These algorithms can be compiled to an executable jar file with the help of the Matlab Builder Java toolbox. The Matlab function must be declared with the set of input data, an output structure with numerical results, and a Matlab web figure. Any function prepared in that manner can be used as a Java function in JSP. The graphical user interface providing the input data and displaying the results (also in graphical form) must be implemented in JSP. Additionally, data storage to a database can be implemented within the algorithm written in Matlab with the help of the Matlab Database Toolbox, directly alongside the image processing. The complete JSP page can be run by the Tomcat server. The proposed tool for remote image analysis was tested on the Computerized Analysis of Medical Images (CAMI) software developed by the author. The user provides the image and case information (diagnosis, staining, image parameters etc.). When analysis is initialized, the input data and image are sent to the servlet on Tomcat. When analysis is done, the client obtains the graphical results as an image with the recognized cells marked, as well as the quantitative output. Additionally, the results are stored in a server

  16. Products eco-sustainability analysis using CAD SolidWorks software

    Directory of Open Access Journals (Sweden)

    Popa Luminița I.

    2017-01-01

    Full Text Available This article focuses on the analysis of the environmental impact and eco-sustainability of models designed using the CAD SolidWorks software. We evaluated the material from which the whole assembly is made, in terms of strength, durability and environmental pollution, considering the carbon footprint, energy consumption, air acidification and eutrophication. We considered the whole product life cycle, from raw material extraction and processing, through part production, assembly and use, until the end of its life, taking into account the mode of transport and the distance between these stages. The case study presents the virtual model of the product and its Sustainability Report.

  17. Towards a multi-site international public dataset for the validation of retinal image analysis software.

    Science.gov (United States)

    Trucco, Emanuele; Ruggeri, Alfredo

    2013-01-01

    This paper discusses concisely the main issues and challenges posed by the validation of retinal image analysis algorithms. It is designed to set the discussion for the IEEE EMBC 2013 invited session "From laboratory to clinic: the validation of retinal image processing tools". The session carries forward an international initiative started at EMBC 2011, Boston, which resulted in the first large-consensus paper (14 international sites) on the validation of retinal image processing software, appearing in IOVS. This paper is meant as a focus for the session discussion, but the ubiquity and importance of validation makes its contents, arguably, of interest for the wider medical image processing community.

  18. Advanced functionality for radio analysis in the Offline software framework of the Pierre Auger Observatory

    Czech Academy of Sciences Publication Activity Database

    Abreu, P.; Aglietta, M.; Ahn, E.J.; Boháčová, Martina; Chudoba, Jiří; Ebr, Jan; Kárová, Tatiana; Mandát, Dušan; Nečesal, Petr; Nožka, Libor; Nyklíček, Michal; Palatka, Miroslav; Pech, Miroslav; Prouza, Michael; Řídký, Jan; Schovancová, Jaroslava; Schovánek, Petr; Šmída, Radomír; Trávníček, Petr

    2011-01-01

    Roč. 635, č. 1 (2011), s. 92-102 ISSN 0168-9002 R&D Projects: GA MŠk LC527; GA MŠk(CZ) 1M06002; GA MŠk(CZ) LA08016; GA AV ČR KJB100100904; GA AV ČR KJB300100801 Institutional research plan: CEZ:AV0Z10100502; CEZ:AV0Z10100522 Keywords : cosmic rays * radio detection * analysis software * detector simulation Subject RIV: BF - Elementary Particles and High Energy Physics Impact factor: 1.207, year: 2011

  19. Interuniversity Upper Atmosphere Global Observation Network (IUGONET) Meta-Database and Analysis Software

    Directory of Open Access Journals (Sweden)

    A Yatagai

    2014-09-01

    Full Text Available An overview of the Interuniversity Upper atmosphere Global Observation NETwork (IUGONET) project is presented. This Japanese program is building a meta-database for ground-based observations of the Earth’s upper atmosphere, in which metadata connected with various atmospheric radars and photometers, including those located in both polar regions, are archived. By querying the metadata database, researchers are able to access data files/information held by data facilities. Moreover, by utilizing our analysis software, users can download, visualize, and analyze upper-atmospheric data archived in or linked with the system. As a future development, we are looking to make our database interoperable with others.

  20. PREDICTIVE ANALYSIS SOFTWARE FOR MODELING THE ALTMAN Z-SCORE FINANCIAL DISTRESS STATUS OF COMPANIES

    Directory of Open Access Journals (Sweden)

    ILIE RĂSCOLEAN

    2012-10-01

    Full Text Available The literature describes several bankruptcy methods for determining the financial distress status of companies; from these we chose Altman's statistical model, because it has been used extensively in the past and has therefore become a benchmark for other methods. Based on this financial analysis flowchart, programming software was developed that allows the calculation of the Z-score for a company and the determination of its bankruptcy probability, taken as the ratio of the number of bankrupt companies to the total number of companies (bankrupt and healthy) in the Z-score interval into which the company falls.
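    For reference, the original (1968) Altman Z-score for publicly traded manufacturing firms combines five financial ratios as Z = 1.2*X1 + 1.4*X2 + 3.3*X3 + 0.6*X4 + 1.0*X5. Whether the paper uses this exact variant is not stated, so the sketch below, with made-up balance-sheet figures, is only illustrative.

```python
# Original (1968) Altman Z-score for publicly traded manufacturing firms, with
# hypothetical balance-sheet figures. The paper may use a different Z-score variant.
def altman_z(working_capital, retained_earnings, ebit, market_equity,
             total_liabilities, sales, total_assets):
    x1 = working_capital / total_assets
    x2 = retained_earnings / total_assets
    x3 = ebit / total_assets
    x4 = market_equity / total_liabilities
    x5 = sales / total_assets
    return 1.2 * x1 + 1.4 * x2 + 3.3 * x3 + 0.6 * x4 + 1.0 * x5

def zone(z):
    """Conventional interpretation bands for the original model."""
    return "safe" if z > 2.99 else "grey" if z >= 1.81 else "distress"

z = altman_z(working_capital=150, retained_earnings=300, ebit=120,
             market_equity=800, total_liabilities=500, sales=1200, total_assets=1000)
print(f"Z = {z:.2f} -> {zone(z)} zone")
```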