WorldWideScience

Sample records for semi-automatic computer system

  1. Semi-automatic drawings surveying system

    International Nuclear Information System (INIS)

    Andriamampianina, Lala

    1983-01-01

    A system for the semi-automatic survey of drawings is presented. Its design is oriented toward reducing the stored information required to reproduce a drawing. The equipment consists mainly of a plotter driven by a microcomputer, with the plotter pen replaced by a circular photodiode array. Line drawings are first viewed as concatenations of vectors with a constant angle between consecutive vectors, and are then divided into circular arcs and line segments. A dynamic analysis of line intersections with the circular sensor makes it possible to identify starting and end points of a line, so that connected lines in a drawing can be followed automatically. The advantage of the described method is that precision depends practically only on the plotter performance; the sensor resolution matters only for the thickness of strokes and the distance between two strokes. (author)
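
    As a rough illustration of the vector-chain idea (not the authors' implementation), the following Python sketch classifies a digitised stroke as a straight segment or a circular arc by testing whether the turning angle between consecutive vectors is constant; the function name and tolerance are illustrative.

        import numpy as np

        def classify_stroke(points, angle_tol=1e-3):
            """Classify a traced stroke as a line segment or a circular arc.

            The record describes strokes as vector chains with a constant
            angle between consecutive vectors: a constant zero turning angle
            means a straight segment, a constant non-zero angle an arc.
            `points` is an (N, 2) array of digitised stroke coordinates.
            """
            v = np.diff(points, axis=0)              # consecutive vectors
            headings = np.arctan2(v[:, 1], v[:, 0])  # direction of each vector
            turns = np.diff(np.unwrap(headings))     # angle between vectors
            if np.allclose(turns, 0.0, atol=angle_tol):
                return "segment"
            if np.allclose(turns, turns.mean(), atol=angle_tol):
                return "arc"                         # constant non-zero turning
            return "mixed"                           # split further upstream

        theta = np.radians(np.arange(0, 91))         # quarter circle, 1-degree steps
        print(classify_stroke(np.column_stack([np.cos(theta), np.sin(theta)])))       # arc
        print(classify_stroke(np.column_stack([np.arange(10.0), 2 * np.arange(10.0)])))  # segment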

  2. Requirements Report Computer Software System for a Semi-Automatic Pipe Handling System and Fabrication Facility

    National Research Council Canada - National Science Library

    1980-01-01

    This report presents the requirements of the computer software that must be developed to create Pipe Detail Drawings and to support the processing of the Pipe Detail Drawings through the Pipe Shop...

  3. Semi-automatic aircraft control system

    Science.gov (United States)

    Gilson, Richard D. (Inventor)

    1978-01-01

    A flight-control system that provides a tactile readout to the pilot's hand for directing elevator control during both approach-to-flare-out and departure maneuvers. For altitudes above flare-out, the system sums the instantaneous coefficient-of-lift signal from a lift transducer with a generated signal representing the ideal coefficient of lift for approach to flare-out, i.e., a value about 30% below stall. Error signals resulting from the summation are read out by the noted tactile device. Below flare altitude, an altitude-responsive variation is summed with the signal representing the ideal coefficient of lift to provide the error-signal readout.
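
    The record gives the summation logic but not the control law; a minimal Python sketch, with the form of the below-flare variation assumed:

        def tactile_error(cl_measured, cl_stall, altitude, flare_altitude):
            """Error signal for the tactile readout (hedged reconstruction).

            Above flare altitude: compare the measured coefficient of lift
            with an ideal value about 30% below stall. Below flare altitude:
            an altitude-responsive variation is summed with the ideal value
            (its exact form is not given in the record; linear is assumed).
            """
            cl_ideal = 0.7 * cl_stall                # ~30% below stall
            if altitude >= flare_altitude:
                return cl_measured - cl_ideal
            variation = cl_ideal * (1.0 - altitude / flare_altitude)
            return cl_measured - (cl_ideal + variation)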

  4. Semi-automatic dimension and density measuring system for UO₂ pellets

    Energy Technology Data Exchange (ETDEWEB)

    Subramanian, K S; Shyam, P G; Muralidhara Rao, J V; Laxminarayana, B; Suryaprakash, M [Nuclear Fuel Complex, Hyderabad (India)]

    1994-12-31

    Parameters such as diameter, length, L/D ratio and sintered density of cylindrical UO₂ pellets are critical for both PHWR and BWR fuels. A semi-automatic system was developed by interfacing a laser micrometer and a digital electronic balance with a PC-XT, incorporating menu-driven, user-friendly software developed in-house. The advantages are data acquisition and storage, statistical analysis with histograms, and printout of acquired and computed values with the respective set-up limits, along with production details such as lot number, press number and furnace number. This paper describes the details of the above system and the software. 3 figs., 2 ills.

  5. A Semi-Automatic, Remote-Controlled Video Observation System for Transient Luminous Events

    DEFF Research Database (Denmark)

    Allin, Thomas Højgaard; Neubert, Torsten; Laursen, Steen

    2003-01-01

    In support of global ELF/VLF observations, HF measurements in France, and conjugate photometry/VLF observations in South Africa, we developed and operated a semi-automatic, remotely controlled video system for the observation of middle-atmospheric transient luminous events (TLEs). Installed...

  6. Cuypers : a semi-automatic hypermedia generation system

    NARCIS (Netherlands)

    J.R. van Ossenbruggen (Jacco); F.J. Cornelissen; J.P.T.M. Geurts (Joost); L. Rutledge (Lloyd); L. Hardman (Lynda)

    2000-01-01

    The report describes the architecture of Cuypers, a system supporting second- and third-generation Web-based multimedia. First-generation Web content encodes information in handwritten (HTML) Web pages. Second-generation Web content generates HTML pages on demand, e.g. by filling in...

  7. Cuypers : a semi-automatic hypermedia generation system

    OpenAIRE

    Ossenbruggen, Jacco; Cornelissen, F.J.; Geurts, Joost; Rutledge, Lloyd; Hardman, Lynda

    2000-01-01

    The report describes the architecture of Cuypers, a system supporting second- and third-generation Web-based multimedia. First-generation Web content encodes information in handwritten (HTML) Web pages. Second-generation Web content generates HTML pages on demand, e.g. by filling in templates with content retrieved dynamically from a database or by transforming structured documents using style sheets (e.g. XSLT). Third-generation Web pages will make use of rich markup (e.g. ...

  8. Development of Semi-Automatic Lathe by using Intelligent Soft Computing Technique

    Science.gov (United States)

    Sakthi, S.; Niresh, J.; Vignesh, K.; Anand Raj, G.

    2018-03-01

    This paper discusses the enhancement of a conventional lathe machine into a semi-automated lathe machine by implementing a soft computing method. In the present scenario, the lathe machine plays a vital role in the engineering division of the manufacturing industry. While manual lathe machines are economical, their accuracy and efficiency are not up to the mark. On the other hand, CNC machines provide the desired accuracy and efficiency, but require a huge capital investment. To overcome this situation, a semi-automated approach to the conventional lathe machine is developed by fitting stepper motors to the horizontal and vertical drives, controlled by an Arduino UNO microcontroller. Based on the input parameters of the lathe operation, the Arduino code is generated and transferred to the UNO board. Upgrading from manual to semi-automatic lathe machines can thus significantly increase accuracy and efficiency while keeping investment cost in check, and consequently provide a much-needed boost to the manufacturing industry.
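
    The record does not include the Arduino code itself; as a language-neutral sketch of the kind of computation such firmware performs, this Python snippet converts a requested carriage travel and feed rate into a step count and per-step delay for a stepper-driven axis (the lead-screw pitch and steps-per-revolution are assumed values):

        STEPS_PER_REV = 200        # assumed: 1.8-degree stepper motor
        LEADSCREW_PITCH_MM = 2.0   # assumed: carriage travel per screw revolution

        def plan_axis_move(travel_mm, feed_mm_per_min):
            """Convert a requested travel and feed into stepper commands.

            Returns the number of steps and the delay between step pulses,
            which is what the microcontroller loop would execute.
            """
            steps_per_mm = STEPS_PER_REV / LEADSCREW_PITCH_MM
            n_steps = round(abs(travel_mm) * steps_per_mm)
            feed_mm_per_s = feed_mm_per_min / 60.0
            step_delay_s = 1.0 / (feed_mm_per_s * steps_per_mm)
            return n_steps, step_delay_s

        # A 50 mm pass at 120 mm/min -> 5000 steps, 5 ms between pulses.
        print(plan_axis_move(50, 120))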

  9. Semi-automatic parking slot marking recognition for intelligent parking assist systems

    Directory of Open Access Journals (Sweden)

    Ho Gi Jung

    2014-01-01

    This paper proposes a semi-automatic, parking-slot-marking-based target position designation method for parking assist systems in cases where the parking slot markings are of a rectangular type, and an efficient implementation of it for real-time operation. After the driver observes a rearview image, captured by a camera installed at the rear of the vehicle, on a touchscreen-based human-machine interface, a target parking position is designated by touching the inside of a parking slot. To ensure the proposed method operates in real time in an embedded environment, access to the bird's-eye view image is made efficient: image-wise batch transformation is replaced with pixel-wise instantaneous transformation. The proposed method showed a 95.5% recognition rate in 378 test cases with 63 test images. Additionally, experiments confirmed that the pixel-wise instantaneous transformation reduced execution time by 92%.
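
    To illustrate the pixel-wise instantaneous transformation described above (as opposed to warping the whole image up front), this hedged numpy sketch fetches a single bird's-eye-view pixel by mapping it through an assumed homography back into the rearview image, so only pixels actually accessed are transformed:

        import numpy as np

        # Assumed 3x3 homography from bird's-eye-view coordinates to
        # rearview-image coordinates (in practice it comes from calibration).
        H_BEV_TO_IMG = np.array([[1.2, 0.1, 30.0],
                                 [0.0, 1.5, 10.0],
                                 [0.0, 0.002, 1.0]])

        def bev_pixel(img, u, v):
            """Fetch one bird's-eye-view pixel on demand (pixel-wise transform).

            Instead of warping the whole image beforehand (image-wise batch
            transformation), map the requested (u, v) through the homography
            and sample the source image with nearest-neighbour lookup.
            """
            x, y, w = H_BEV_TO_IMG @ np.array([u, v, 1.0])
            px, py = int(round(x / w)), int(round(y / w))
            if 0 <= py < img.shape[0] and 0 <= px < img.shape[1]:
                return img[py, px]
            return 0  # outside the camera's field of view

        rear_img = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
        print(bev_pixel(rear_img, 100, 200))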

  10. Integrating different tracking systems in football: multiple camera semi-automatic system, local position measurement and GPS technologies.

    Science.gov (United States)

    Buchheit, Martin; Allen, Adam; Poon, Tsz Kit; Modonutti, Mattia; Gregson, Warren; Di Salvo, Valter

    2014-12-01

    During the past decade, substantial development of computer-aided tracking technology has occurred. We therefore aimed to provide calibration equations to allow the interchangeability of the different tracking technologies used in soccer. Eighty-two highly trained soccer players (U14-U17) were monitored during training and one match. Player activity was collected simultaneously with a semi-automatic multiple-camera system (Prozone), local position measurement (LPM) technology (Inmotio) and two global positioning systems (GPSports and VX). Data were analysed with respect to three different field dimensions (small, medium and large). Systems were compared, and calibration equations (linear regression models) between each pair of systems were calculated for each field dimension. Most metrics differed between the four systems, with the magnitude of the differences dependent on both pitch size and the variable of interest. Trivial-to-small between-system differences in total distance were noted. However, high-intensity running distance (>14.4 km·h⁻¹) was slightly-to-moderately greater when tracked with Prozone, and accelerations were small-to-very-largely greater with LPM. For most of the equations, the typical error of the estimate was of a moderate magnitude. Interchangeability of the different tracking systems is possible with the provided equations, but care is required given their moderate typical error of the estimate.
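
    The calibration equations are linear regressions between paired measurements from two systems; a hedged sklearn sketch with synthetic numbers (not the published coefficients):

        import numpy as np
        from sklearn.linear_model import LinearRegression

        # Paired high-intensity running distances (m) for the same players;
        # synthetic values standing in for GPS vs. Prozone measurements.
        gps = np.array([420, 510, 380, 600, 450, 530]).reshape(-1, 1)
        prozone = np.array([455, 540, 410, 650, 480, 570])

        model = LinearRegression().fit(gps, prozone)
        print(f"prozone ~= {model.coef_[0]:.2f} * gps + {model.intercept_:.1f}")

        # Interchange: convert a new GPS value into its Prozone equivalent.
        print(model.predict(np.array([[500.0]])))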

  11. Semi-automatic system for UV images analysis of historical musical instruments

    Science.gov (United States)

    Dondi, Piercarlo; Invernizzi, Claudia; Licchelli, Maurizio; Lombardi, Luca; Malagodi, Marco; Rovetta, Tommaso

    2015-06-01

    The selection of representative areas for analysis is a common problem in the study of Cultural Heritage items. UV fluorescence photography is an extensively used technique to highlight specific surface features that cannot be observed in visible light (e.g. restored parts or areas treated with different materials), and it proves to be very effective in the study of historical musical instruments. In this work we propose a new semi-automatic solution for selecting areas with the same perceived color (a simple clue of similar materials) in UV photos, using a specifically designed interactive tool. The proposed method works in two steps: (i) the user selects a small rectangular area of the image; (ii) the program automatically highlights all the areas that have the same color as the selected input. The identification is based on analysis of the image in the HSV color model, the one closest to human perception. The achievable result is more accurate than a manual selection, because it can also detect points that users do not recognize as similar owing to perceptual illusions. The application has been developed following usability principles, and the human-computer interface has been improved after a series of tests performed by expert and non-expert users. All the experiments were performed on UV imagery of the Stradivari violin collection held by the "Museo del Violino" in Cremona.
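
    A hedged sketch of the two-step selection on an HSV image; the tolerances and the circular hue distance are illustrative, as the tool's actual thresholds are not given:

        import numpy as np

        def same_color_mask(hsv, rect, tol=(10, 40, 40)):
            """Highlight all pixels whose HSV color matches a user-selected patch.

            hsv  : (H, W, 3) array, hue in [0, 180) and sat/val in [0, 255]
                   (OpenCV-style convention, assumed here).
            rect : (x0, y0, x1, y1) rectangle chosen by the user (step i).
            Step (ii): threshold the whole image around the patch's mean color.
            """
            x0, y0, x1, y1 = rect
            ref = hsv[y0:y1, x0:x1].reshape(-1, 3).mean(axis=0)
            dh = np.abs(hsv[..., 0].astype(float) - ref[0])
            dh = np.minimum(dh, 180 - dh)               # hue is circular
            ds = np.abs(hsv[..., 1].astype(float) - ref[1])
            dv = np.abs(hsv[..., 2].astype(float) - ref[2])
            return (dh <= tol[0]) & (ds <= tol[1]) & (dv <= tol[2])

        hsv = np.random.randint(0, 180, (100, 100, 3)).astype(np.uint8)
        mask = same_color_mask(hsv, (10, 10, 20, 20))
        print(mask.sum(), "pixels matched")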

  12. Semi-automatic classification of skeletal morphology in genetically altered mice using flat-panel volume computed tomography.

    Directory of Open Access Journals (Sweden)

    Christian Dullin

    2007-07-01

    Rapid progress in exploring the human and mouse genomes has resulted in the generation of a multitude of mouse models to study gene functions in their biological context. However, effective screening methods that allow rapid noninvasive phenotyping of transgenic and knockout mice are still lacking. To identify murine models with bone alterations in vivo, we used flat-panel volume computed tomography (fpVCT) for high-resolution 3-D imaging and developed an algorithm with a computational intelligence system. First, we tested the accuracy and reliability of this approach by imaging discoidin domain receptor 2 (DDR2)-deficient mice, which display distinct skull abnormalities as shown by comparative landmark-based analysis. High-contrast fpVCT data of the skull with 200 μm isotropic resolution and 8-s scan time allowed segmentation and computation of significant shape features as well as visualization of morphological differences. The application of a trained artificial neural network to these datasets permitted a semi-automatic and highly accurate phenotype classification of DDR2-deficient compared to C57BL/6 wild-type mice. Even heterozygous DDR2 mice with only subtle phenotypic alterations were correctly determined by fpVCT imaging and identified as a new class. In addition, we successfully applied the algorithm to classify knockout mice lacking the DDR1 gene with no apparent skull deformities. Thus, this new method seems to be a potential tool to identify novel mouse phenotypes with skull changes from transgenic and knockout mice generated by random mutagenesis as well as from genetic models. For this purpose, however, new neural networks have to be created and trained. In summary, the combination of fpVCT images with artificial neural networks provides a reliable, rapid, cost-effective, and noninvasive primary screening tool to detect skeletal phenotypes in mice.

  13. A semi-automatic traffic sign detection, classification and positioning system

    NARCIS (Netherlands)

    Creusen, I.M.; Hazelhoff, L.; With, de P.H.N.; Said, A.; Guleryuz, O.G.; Stevenson, R.L.

    2012-01-01

    The availability of large-scale databases containing street-level panoramic images offers the possibility to perform semi-automatic surveying of real-world objects such as traffic signs. These inventories can be performed significantly more efficiently than using conventional methods. Governmental

  14. Robust semi-automatic segmentation of pulmonary subsolid nodules in chest computed tomography scans

    International Nuclear Information System (INIS)

    Lassen, B C; Kuhnigk, J-M; Van Ginneken, B; Van Rikxoort, E M; Jacobs, C

    2015-01-01

    The malignancy of lung nodules is most often assessed by analyzing changes of the nodule diameter in follow-up scans. A recent study showed that comparing the volume or the mass of a nodule over time is much more significant than comparing the diameter. Since the survival rate is higher when the disease is still in an early stage, it is important to detect growth as soon as possible. However, manual segmentation of a volume is time-consuming. Whereas there are several well-evaluated methods for the segmentation of solid nodules, less work has been done on subsolid nodules, which actually show a higher malignancy rate than solid nodules. In this work we present a fast, semi-automatic method for segmentation of subsolid nodules. As minimal user interaction, the method expects a user-drawn stroke along the largest diameter of the nodule. First, a threshold-based region growing is performed based on intensity analysis of the nodule region and surrounding parenchyma. In the next step the chest wall is removed by a combination of connected component analysis and convex hull calculation. Finally, attached vessels are detached by morphological operations. The method was evaluated on all nodules of the publicly available LIDC/IDRI database that were manually segmented and rated as non-solid or part-solid by four radiologists (Dataset 1) and three radiologists (Dataset 2). For these 59 nodules the Jaccard index for the agreement of the proposed method with the manual reference segmentations was 0.52/0.50 (Dataset 1/Dataset 2), compared to an inter-observer agreement of the manual segmentations of 0.54/0.58 (Dataset 1/Dataset 2). Furthermore, the inter-observer agreement using the proposed method (i.e. different input strokes) was analyzed and gave a Jaccard index of 0.74/0.74 (Dataset 1/Dataset 2). The presented method provides satisfactory segmentation results with minimal observer effort in minimal time and can reduce the inter-observer variability for segmentation of...
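
    As a hedged sketch of the first stage (threshold-based region growing seeded by the stroke), using scipy with illustrative thresholds rather than the paper's values:

        import numpy as np
        from scipy import ndimage

        def grow_from_stroke(volume, stroke_mask, low, high):
            """Threshold-based region growing seeded by a user-drawn stroke.

            Keep only those voxels inside [low, high] that are connected to
            the stroke; connectivity is evaluated on the thresholded volume.
            """
            in_range = (volume >= low) & (volume <= high)
            labels, _ = ndimage.label(in_range)
            seed_labels = np.unique(labels[stroke_mask & in_range])
            seed_labels = seed_labels[seed_labels != 0]
            return np.isin(labels, seed_labels)

        # Toy example: a subsolid-like blob in an air-like background (HU).
        vol = np.zeros((40, 40, 40)) - 900
        vol[15:25, 15:25, 15:25] = -400
        stroke = np.zeros_like(vol, dtype=bool)
        stroke[20, 16:24, 20] = True                 # stroke across the diameter
        seg = grow_from_stroke(vol, stroke, low=-600, high=-100)
        print(seg.sum(), "voxels segmented")         # 1000 for the toy blob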

  15. Robust semi-automatic segmentation of pulmonary subsolid nodules in chest computed tomography scans

    Science.gov (United States)

    Lassen, B. C.; Jacobs, C.; Kuhnigk, J.-M.; van Ginneken, B.; van Rikxoort, E. M.

    2015-02-01

    The malignancy of lung nodules is most often assessed by analyzing changes of the nodule diameter in follow-up scans. A recent study showed that comparing the volume or the mass of a nodule over time is much more significant than comparing the diameter. Since the survival rate is higher when the disease is still in an early stage, it is important to detect growth as soon as possible. However, manual segmentation of a volume is time-consuming. Whereas there are several well-evaluated methods for the segmentation of solid nodules, less work has been done on subsolid nodules, which actually show a higher malignancy rate than solid nodules. In this work we present a fast, semi-automatic method for segmentation of subsolid nodules. As minimal user interaction, the method expects a user-drawn stroke along the largest diameter of the nodule. First, a threshold-based region growing is performed based on intensity analysis of the nodule region and surrounding parenchyma. In the next step the chest wall is removed by a combination of connected component analysis and convex hull calculation. Finally, attached vessels are detached by morphological operations. The method was evaluated on all nodules of the publicly available LIDC/IDRI database that were manually segmented and rated as non-solid or part-solid by four radiologists (Dataset 1) and three radiologists (Dataset 2). For these 59 nodules the Jaccard index for the agreement of the proposed method with the manual reference segmentations was 0.52/0.50 (Dataset 1/Dataset 2), compared to an inter-observer agreement of the manual segmentations of 0.54/0.58 (Dataset 1/Dataset 2). Furthermore, the inter-observer agreement using the proposed method (i.e. different input strokes) was analyzed and gave a Jaccard index of 0.74/0.74 (Dataset 1/Dataset 2). The presented method provides satisfactory segmentation results with minimal observer effort in minimal time and can reduce the inter-observer variability for segmentation of...
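
    For the overlap figures quoted above, the Jaccard index between two binary segmentations is intersection over union; a minimal numpy sketch:

        import numpy as np

        def jaccard(a, b):
            """Jaccard index |A intersect B| / |A union B| of two binary masks."""
            a, b = a.astype(bool), b.astype(bool)
            union = np.logical_or(a, b).sum()
            return np.logical_and(a, b).sum() / union if union else 1.0

        m1 = np.zeros((20, 20), bool); m1[5:15, 5:15] = True
        m2 = np.zeros((20, 20), bool); m2[7:17, 5:15] = True
        print(round(jaccard(m1, m2), 2))  # 0.67: 80 shared of 120 total pixels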

  16. Building a semi-automatic ontology learning and construction system for geosciences

    Science.gov (United States)

    Babaie, H. A.; Sunderraman, R.; Zhu, Y.

    2013-12-01

    We are developing an ontology learning and construction framework that allows continuous, semi-automatic knowledge extraction, verification, validation, and maintenance by a potentially very large group of collaborating domain experts in any geosciences field. The system brings geoscientists from the sidelines to the center stage of ontology building, allowing them to collaboratively construct and enrich new ontologies, and to merge, align, and integrate existing ontologies and tools. These constantly evolving ontologies can more effectively address the community's interests, purposes, tools, and change. The goal is to minimize the cost and time of building ontologies, and to maximize the quality, usability, and adoption of ontologies by the community. Our system will be a domain-independent ontology learning framework that applies natural language processing, allowing users to enter their ontology in a semi-structured form, and a combined Semantic Web and Social Web approach that enables direct participation by geoscientists who have no skill in the design and development of their domain ontologies. A controlled natural language (CNL) interface and an integrated authoring and editing tool automatically convert syntactically correct CNL text into formal OWL constructs. The WebProtege-based system will allow a potentially large group of geoscientists, from multiple domains, to crowd-source and participate in the structuring of their knowledge model by sharing their knowledge through critiquing, testing, verifying, adopting, and updating of the concept models (ontologies). We will use cloud storage for all data and knowledge base components of the system, such as users, domain ontologies, discussion forums, and semantic wikis that can be accessed and queried by geoscientists in each domain. We will use NoSQL databases such as MongoDB as a service in the cloud environment. MongoDB uses the lightweight JSON format, which makes it convenient and easy to build Web applications using...

  17. Microprocessor controlled system for automatic and semi-automatic syntheses of radiopharmaceuticals

    International Nuclear Information System (INIS)

    Ruth, T.J.; Adam, M.J.; Morris, D.; Jivan, S.

    1986-01-01

    A computer-based system has been constructed to control the automatic synthesis of 2-deoxy-2-(¹⁸F)fluoro-D-glucose and is also being used in the development of an automatic synthesis of L-6-(¹⁸F)fluorodopa. (author)

  18. Application of a semi-automatic ROI setting system for brain PET images to animal PET studies

    International Nuclear Information System (INIS)

    Kuge, Yuji; Akai, Nobuo; Tamura, Koji

    1998-01-01

    ProASSIST, a semi-automatic ROI (region of interest) setting system for human brain PET images, has been modified for use with the canine brain, and the performance of the resulting system was evaluated by comparing the operational simplicity of ROI setting and the consistency of the ROI values obtained against a conventional manual procedure. Specifically, we created segment maps for the canine brain with reference to the coronal section atlas of the canine brain by Lim et al., and incorporated them into the ProASSIST system. For the performance test, CBF (cerebral blood flow) and CMRglc (cerebral metabolic rate of glucose) images of dogs with or without focal cerebral ischemia were used. In ProASSIST, brain contours were defined semi-automatically. In the ROI analysis of the test images, manual modification of the contour was necessary in half of the cases examined (8/16). However, the operation was simple enough that the operation time per brain section was significantly shorter than with the manual procedure. The ROI values determined by the system were comparable to those from the manual procedure, confirming the applicability of the system to these animal studies. The use of a system like the present one would also allow more objective data acquisition for quantitative ROI analysis, because no manual procedure except for the specification of some anatomical features is required for ROI setting. (author)

  19. Chromosome painting in biological dosimetry: Semi-automatic system to score stable chromosome aberrations

    International Nuclear Information System (INIS)

    Garcia-Sagredo, J.M.; Vallcorba, I.; Sanchez-Hombre, M.C.; Ferro, M.T.; San Roman Cos-Gayon, C.; Santos, A.; Malpica, N.; Ortiz, C.

    1997-01-01

    Since the procedure of chromosome painting by fluorescence in situ hybridization (FISH) was first described, its application to scoring induced chromosomal aberrations after radiation exposure has been considered. With chromosome painting it is possible to detect changes between chromosomes, which has been validated for radiation exposure. Translocation scoring by FISH, in contrast to the scoring of unstable dicentrics, mainly detects stable chromosome aberrations that do not disappear, which makes it possible to quantify delayed acute exposures or chronic cumulative exposures. The large number of cells that has to be analyzed for high accuracy, especially when dealing with low radiation doses, makes it almost imperative to use an automatic analysis system. After validating translocation scoring by FISH in our laboratory, we evaluated the ability and sensitivity of detecting chromosomal aberrations with the different paint probes used, showing that any combination of paint probes can be used to score induced chromosomal aberrations. Our group has developed a FISH analysis system that is currently being adapted for translocation scoring analysis. It includes systematic error correction and internal control probes. The performance tests carried out show that 9,000 cells can be analyzed in 10 hr using a Sparc 4/370. Although a higher throughput is expected with a faster computer, this performance still has to be improved for large population screenings or very low radiation doses. (author)

  20. Graphical user interface (GUIDE) and semi-automatic system for the acquisition of anaglyphs

    Science.gov (United States)

    Canchola, Marco A.; Arízaga, Juan A.; Cortés, Obed; Tecpanecatl, Eduardo; Cantero, Jose M.

    2013-09-01

    Diverse educational experiences have shown greater acceptance by children of ideas related to science, compared with adults. That fact, together with children's great curiosity, makes scientific outreach efforts aimed at children promising prospects for success. Moreover, 3D digital images have become a topic of growing importance in various areas, mainly entertainment, film and video games, but also in fields such as medical practice, where they are crucial for disease detection. This article presents a system model for 3D images for educational purposes that allows students of various grade levels, school and college, to get an introduction to image processing, explaining the use of filters for stereoscopic images that give the brain an impression of depth. The system is based on two elements: hardware centered on an Arduino board, and software based on Matlab. The paper presents the design and construction of each of the elements, information on the images obtained, and finally how users can interact with the device.

  1. Semi-automatic surface sediment sampling system - A prototype to be implemented in bivalve fishing surveys

    Science.gov (United States)

    Rufino, Marta M.; Baptista, Paulo; Pereira, Fábio; Gaspar, Miguel B.

    2018-01-01

    In the current work we propose a new method to sample surface sediment during bivalve fishing surveys. Fishing institutes all around the world carry out regular surveys with the aim of monitoring the stocks of commercial species. These surveys often comprise more than one hundred sampling stations and cover large geographical areas. Although superficial sediment grain sizes are among the main drivers of benthic communities and provide crucial information for studies on coastal dynamics, overall there is a strong lack of this type of data, possibly because traditional surface sediment sampling methods use grabs, which require considerable time and effort to deploy on a regular basis or over large areas. In view of these aspects, we developed an easy and inexpensive method to sample superficial sediments during bivalve fisheries monitoring surveys without increasing survey time or human resources. The method was successfully evaluated and validated during a typical bivalve survey carried out on the northwest coast of Portugal, confirming that it did not interfere with the survey objectives. Furthermore, the method was validated by collecting samples with a traditional Van Veen grab (traditional method), which showed a grain-size composition similar to that of the samples collected by the new method at the same localities. We recommend that the procedure be implemented in regular bivalve fishing surveys, together with an image analysis system to analyse the collected samples. The new method will provide a substantial quantity of data on surface sediment in coastal areas in an inexpensive and efficient manner, with high potential for application in different fields of research.

  2. A novel region-growing based semi-automatic segmentation protocol for three-dimensional condylar reconstruction using cone beam computed tomography (CBCT).

    Directory of Open Access Journals (Sweden)

    Tong Xi

    OBJECTIVE: To present and validate a semi-automatic segmentation protocol to enable an accurate 3D reconstruction of the mandibular condyles using cone beam computed tomography (CBCT). MATERIALS AND METHODS: Approval from the regional medical ethics review board was obtained for this study. Bilateral mandibular condyles in ten CBCT datasets of patients were segmented using the proposed semi-automatic segmentation protocol, which combines 3D region-growing and local thresholding algorithms. The segmentation of a total of twenty condyles was performed by two observers. Dice coefficients and distance map calculations were used to evaluate the accuracy and reproducibility of the segmented and 3D-rendered condyles. RESULTS: The mean inter-observer Dice coefficient was 0.98 (range 0.95-0.99). An average 90th-percentile distance of 0.32 mm was found, indicating an excellent inter-observer similarity of the segmented and 3D-rendered condyles. No systematic errors were observed with the proposed segmentation protocol. CONCLUSION: The novel semi-automatic segmentation protocol is an accurate and reproducible tool to segment and render condyles in 3D. The implementation of this protocol in clinical practice allows CBCT to be used as an imaging modality for the quantitative analysis of condylar morphology.
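
    A hedged sketch of the two reported measures, the Dice coefficient and a distance-map-based 90th-percentile surface distance (voxel spacing and surface extraction simplified):

        import numpy as np
        from scipy import ndimage

        def dice(a, b):
            """Dice coefficient 2|A intersect B| / (|A| + |B|)."""
            a, b = a.astype(bool), b.astype(bool)
            return 2 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

        def percentile_surface_distance(a, b, spacing=1.0, q=90):
            """q-th percentile distance from A's surface voxels to B.

            The Euclidean distance transform of B's complement gives, at
            every voxel, the distance to B; read it at A's surface voxels.
            """
            surface_a = a & ~ndimage.binary_erosion(a)
            dist_to_b = ndimage.distance_transform_edt(~b, sampling=spacing)
            return np.percentile(dist_to_b[surface_a], q)

        a = np.zeros((30, 30, 30), bool); a[10:20, 10:20, 10:20] = True
        b = np.zeros((30, 30, 30), bool); b[11:21, 10:20, 10:20] = True
        print(round(dice(a, b), 3), round(percentile_surface_distance(a, b, 0.3), 2))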

  3. Semi-automatic fluoroscope

    International Nuclear Information System (INIS)

    Tarpley, M.W.

    1976-10-01

    Extruded aluminum-clad uranium-aluminum alloy fuel tubes must pass many quality-control tests before irradiation in Savannah River Plant nuclear reactors. Nondestructive test equipment has been built to automatically detect high- and low-density areas in the fuel tubes using X-ray absorption techniques with a video analysis system. The equipment detects areas as small as 0.060 in. in diameter with 2 percent penetrameter sensitivity. These areas are graded by size and density by an operator using electronic gauges. Video image-enhancement techniques permit inspection of ribbed cylindrical tubes and make possible the testing of areas under the ribs. Operation of the testing machine, the special low-light-level television camera, and the analysis and enhancement techniques are discussed.

  4. Improved regional-scale Brazilian cropping systems' mapping based on a semi-automatic object-based clustering approach

    Science.gov (United States)

    Bellón, Beatriz; Bégué, Agnès; Lo Seen, Danny; Lebourgeois, Valentine; Evangelista, Balbino Antônio; Simões, Margareth; Demonte Ferraz, Rodrigo Peçanha

    2018-06-01

    Cropping systems' maps at a fine scale over large areas provide key information for further agricultural production and environmental impact assessments, and thus represent a valuable tool for effective land-use planning. There is, therefore, a growing interest in mapping cropping systems in an operational manner over large areas, and remote sensing approaches based on vegetation index time series analysis have proven to be an efficient tool. However, supervised pixel-based approaches are commonly adopted, requiring resource-consuming field campaigns to gather training data. In this paper, we present a new object-based unsupervised classification approach tested on an annual MODIS 16-day composite Normalized Difference Vegetation Index time series and a Landsat 8 mosaic of the State of Tocantins, Brazil, for the 2014-2015 growing season. Two variants of the approach are compared: a hyperclustering approach, and a landscape-clustering approach involving a prior stratification of the study area into landscape units on which the clustering is then performed. The main cropping systems of Tocantins, characterized by the crop types and cropping patterns, were efficiently mapped with the landscape-clustering approach. Results show that stratification prior to clustering significantly improves classification accuracy for underrepresented and sparsely distributed cropping systems. This study illustrates the potential of unsupervised classification for large-area cropping systems' mapping and contributes to the development of generic tools for supporting large-scale agricultural monitoring across regions.
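
    The landscape-clustering variant clusters the NDVI time series separately within each landscape unit; a hedged sklearn sketch on synthetic data (array shapes and cluster counts are illustrative):

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)
        ndvi = rng.random((10000, 23))          # pixels x 23 NDVI dates (16-day composites)
        landscape = rng.integers(0, 3, 10000)   # stratum label per pixel (from the mosaic)

        labels = np.empty(10000, dtype=int)
        offset = 0
        for unit in np.unique(landscape):
            idx = np.where(landscape == unit)[0]
            km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(ndvi[idx])
            labels[idx] = km.labels_ + offset   # keep cluster ids unique across units
            offset += km.n_clusters
        print(len(np.unique(labels)), "clusters over",
              len(np.unique(landscape)), "landscape units")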

  5. N₂ gas station and gas distribution system for TLD personnel monitoring gas based semi-automatic badge readers

    International Nuclear Information System (INIS)

    Chourasiya, G.; Pradhan, S.M.; Kher, R.K.; Bhatt, B.C

    2003-01-01

    Full text: The new improved hot-gas-based automatic TLD badge reader has several advantages over the earlier contact-heating-based manual badge reader. It requires a constant supply of N₂ gas for its operation; gas supplied from replaceable individual cylinders may pose safety hazards in handling. It was therefore considered worthwhile to set up an N₂ gas assembly/station outside the laboratory area and to bring a regulated gas supply to the individual readers through a network of tubes. The paper presents a detailed description of the gas station and distribution system. The system is quite useful and offers several practical advantages for the readout of TLD badges on semi-automatic badge readers based on gas heating. An important advantage from the dosimetric point of view is the avoidance of gas flow-rate fluctuations and the corresponding variations in TL readouts.

  6. Adaptive neuro-fuzzy inference systems for semi-automatic discrimination between seismic events: a study in Tehran region

    Science.gov (United States)

    Vasheghani Farahani, Jamileh; Zare, Mehdi; Lucas, Caro

    2012-04-01

    This article presents an adaptive neuro-fuzzy inference system (ANFIS) for the classification of low-magnitude seismic events reported in Iran by the network of the Tehran Disaster Mitigation and Management Organization (TDMMO). ANFIS classifiers were used to detect seismic events using six inputs that defined the events. Neuro-fuzzy coding was applied using the six extracted features as ANFIS inputs. Two types of events were defined: weak earthquakes and mining blasts. The data comprised 748 events (6289 signals) ranging from magnitude 1.1 to 4.6, recorded at 13 seismic stations between 2004 and 2009; some 223 earthquakes with M ≤ 2.2 are included in this database. Data sets from the south, east, and southeast of the city of Tehran were used to evaluate the best short-period seismic discriminants, and features such as origin time of the event, source-to-station distance, latitude and longitude of the epicenter, magnitude, and spectral analysis (corner frequency fc of the Pg wave) were used as inputs, increasing the rate of correct classification and decreasing the confusion rate between weak earthquakes and quarry blasts. The performance of the ANFIS model was evaluated for training and classification accuracy. The results confirmed that the proposed ANFIS model has good potential for discriminating seismic events.
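
    ANFIS implementations vary; purely to illustrate the inference structure (Gaussian memberships, product rule firing, normalized first-order Sugeno consequents), here is a hedged forward-pass sketch for two of the six inputs, with made-up parameters:

        import numpy as np

        def gauss(x, c, s):
            """Gaussian membership function."""
            return np.exp(-0.5 * ((x - c) / s) ** 2)

        def anfis_forward(x1, x2):
            """First-order Sugeno forward pass with two rules (toy parameters).

            Rule i: IF x1 is A_i AND x2 is B_i THEN f_i = p_i*x1 + q_i*x2 + r_i.
            Output near 0 -> earthquake, near 1 -> quarry blast (assumed coding).
            """
            # Layers 1-2: membership degrees and rule firing strengths (product AND).
            w1 = gauss(x1, c=2.0, s=1.0) * gauss(x2, c=10.0, s=5.0)
            w2 = gauss(x1, c=3.5, s=1.0) * gauss(x2, c=40.0, s=15.0)
            # Layer 3: normalization of firing strengths.
            wn1, wn2 = w1 / (w1 + w2), w2 / (w1 + w2)
            # Layers 4-5: linear consequents, weighted sum.
            f1 = 0.10 * x1 + 0.010 * x2 + 0.0
            f2 = 0.05 * x1 + 0.005 * x2 + 0.5
            return wn1 * f1 + wn2 * f2

        # e.g. a magnitude-like and a corner-frequency-like input:
        print(round(anfis_forward(2.0, 8.0), 3))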

  7. Semi-automatic Data Integration using Karma

    Science.gov (United States)

    Garijo, D.; Kejriwal, M.; Pierce, S. A.; Houser, P. I. Q.; Peckham, S. D.; Stanko, Z.; Hardesty Lewis, D.; Gil, Y.; Pennington, D. D.; Knoblock, C.

    2017-12-01

    Data integration applications are ubiquitous in scientific disciplines. A state-of-the-art data integration system accepts both a set of data sources and a target ontology as input, and semi-automatically maps the data sources in terms of concepts and relationships in the target ontology. Mappings can be both complex and highly domain-specific. Once such a semantic model, expressing the mapping using community-wide standard, is acquired, the source data can be stored in a single repository or database using the semantics of the target ontology. However, acquiring the mapping is a labor-prone process, and state-of-the-art artificial intelligence systems are unable to fully automate the process using heuristics and algorithms alone. Instead, a more realistic goal is to develop adaptive tools that minimize user feedback (e.g., by offering good mapping recommendations), while at the same time making it intuitive and easy for the user to both correct errors and to define complex mappings. We present Karma, a data integration system that has been developed over multiple years in the information integration group at the Information Sciences Institute, a research institute at the University of Southern California's Viterbi School of Engineering. Karma is a state-of-the-art data integration tool that supports an interactive graphical user interface, and has been featured in multiple domains over the last five years, including geospatial, biological, humanities and bibliographic applications. Karma allows a user to import their own ontology and datasets using widely used formats such as RDF, XML, CSV and JSON, can be set up either locally or on a server, supports a native backend database for prototyping queries, and can even be seamlessly integrated into external computational pipelines, including those ingesting data via streaming data sources, Web APIs and SQL databases. We illustrate a Karma workflow at a conceptual level, along with a live demo, and show use cases of

  8. Body composition estimation from selected slices: equations computed from a new semi-automatic thresholding method developed on whole-body CT scans

    Directory of Open Access Journals (Sweden)

    Alizé Lacoste Jeanson

    2017-05-01

    Background: Estimating volumes and masses of total body components is important for the study and treatment monitoring of nutrition and nutrition-related disorders, cancer, joint replacement, energy expenditure and exercise physiology. While several equations have been offered for estimating total body components from MRI slices, no reliable and tested method exists for CT scans. For the first time, body composition data were derived from 41 high-resolution whole-body CT scans. From these data, we defined equations for estimating volumes and masses of total body AT and LT from corresponding tissue areas measured in selected CT scan slices. Methods: We present a new semi-automatic approach to defining the density cutoff between adipose tissue (AT) and lean tissue (LT) in such material. An intra-class correlation coefficient (ICC) was used to validate the method. The equations for estimating the whole-body composition volume and mass from areas measured in selected slices were modeled with ordinary least squares (OLS) linear regressions and support vector machine regression (SVMR). Results and Discussion: The best predictive equation for total body AT volume was based on the AT area of a single slice located between the 4th and 5th lumbar vertebrae (L4-L5) and produced lower prediction errors (|PE| = 1.86 liters, %PE = 8.77) than previous equations also based on CT scans. The LT area of the mid-thigh provided the lowest prediction errors (|PE| = 2.52 liters, %PE = 7.08) for estimating whole-body LT volume. We also present equations to predict total body AT and LT masses from a slice located at L4-L5 that resulted in reduced error compared with the previously published equations based on CT scans. The multislice SVMR predictor gave the theoretical upper limit for prediction precision of volumes and cross-validated the results.

  9. Body composition estimation from selected slices: equations computed from a new semi-automatic thresholding method developed on whole-body CT scans.

    Science.gov (United States)

    Lacoste Jeanson, Alizé; Dupej, Ján; Villa, Chiara; Brůžek, Jaroslav

    2017-01-01

    Estimating volumes and masses of total body components is important for the study and treatment monitoring of nutrition and nutrition-related disorders, cancer, joint replacement, energy-expenditure and exercise physiology. While several equations have been offered for estimating total body components from MRI slices, no reliable and tested method exists for CT scans. For the first time, body composition data was derived from 41 high-resolution whole-body CT scans. From these data, we defined equations for estimating volumes and masses of total body AT and LT from corresponding tissue areas measured in selected CT scan slices. We present a new semi-automatic approach to defining the density cutoff between adipose tissue (AT) and lean tissue (LT) in such material. An intra-class correlation coefficient (ICC) was used to validate the method. The equations for estimating the whole-body composition volume and mass from areas measured in selected slices were modeled with ordinary least squares (OLS) linear regressions and support vector machine regression (SVMR). The best predictive equation for total body AT volume was based on the AT area of a single slice located between the 4th and 5th lumbar vertebrae (L4-L5) and produced lower prediction errors (|PE| = 1.86 liters, %PE = 8.77) than previous equations also based on CT scans. The LT area of the mid-thigh provided the lowest prediction errors (|PE| = 2.52 liters, %PE = 7.08) for estimating whole-body LT volume. We also present equations to predict total body AT and LT masses from a slice located at L4-L5 that resulted in reduced error compared with the previously published equations based on CT scans. The multislice SVMR predictor gave the theoretical upper limit for prediction precision of volumes and cross-validated the results.
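
    A hedged sketch of fitting the two predictor types named above, ordinary least squares and support vector regression (sklearn's SVR standing in for the paper's SVMR), with synthetic numbers in place of the CT measurements:

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.svm import SVR

        rng = np.random.default_rng(1)
        l4l5_at_area = rng.uniform(100, 500, 41)                     # cm^2, one slice per scan
        total_at_vol = 0.05 * l4l5_at_area + rng.normal(0, 1.5, 41)  # liters (synthetic)

        X = l4l5_at_area.reshape(-1, 1)
        ols = LinearRegression().fit(X, total_at_vol)
        svr = SVR(kernel="linear", C=10.0).fit(X, total_at_vol)

        # Predict whole-body AT volume from a new single-slice area.
        area = np.array([[300.0]])
        print("OLS:", ols.predict(area), "SVR:", svr.predict(area))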

  10. Semi-automatic film processing unit

    International Nuclear Information System (INIS)

    Mohamad Annuar Assadat Husain; Abdul Aziz Bin Ramli; Mohd Khalid Matori

    2005-01-01

    The design concept applied in the development of a semi-automatic film processing unit demands creativity and user input in channelling the information required to select materials and an operating system that suit the resulting design. Low cost and efficient operation are challenges that must be met while keeping abreast of fast technological advancement. In producing this processing unit, a few elements need to be considered in order to produce a high-quality image. Consistent movement and correct timing for developing and drying are among the elements that need to be controlled. Other elements that need serious attention are temperature, liquid density, and the reaction time of the chemical liquids. The chemical reactions that take place cause the liquid chemicals to age, and this adversely affects the quality of the image produced. The unit is also equipped with a liquid-chemical drainage system and a disposal chemical tank. The unit would be useful in GP clinics, especially in rural areas, which develop film manually and require low operational cost. (Author)

  11. Semi-automatic logarithmic converter of logs

    International Nuclear Information System (INIS)

    Gol'dman, Z.A.; Bondar's, V.V.

    1974-01-01

    An original semi-automatic converter was developed for converting BK resistance logging charts and the time interval, ΔT, of acoustic logs from a linear to a logarithmic scale with a specific ratio, for subsequent combination with neutron-gamma logging charts in the operative interpretation of logging materials by a normalization method. The converter can be used to increase productivity by producing curves different from those obtained by manual, pointwise processing. The equipment operates reliably and is simple to use. (author)

  12. Accessories for Enhancement of the Semi-Automatic Welding Processes

    National Research Council Canada - National Science Library

    Wheeler, Douglas M; Sawhill, James M

    2000-01-01

    The project's objective is to identify specific areas of the semi-automatic welding operation, as performed with the major semi-automatic processes, that would be more productive if a suitable...

  13. A web based semi automatic frame work for astrobiological researches

    Directory of Open Access Journals (Sweden)

    P.V. Arun

    2013-12-01

    Astrobiology addresses the possibility of extraterrestrial life and explores measures towards its recognition. Research in this context is founded upon the premise that indicators of life encountered in space will be recognizable. However, effective recognition can be accomplished through a universal adaptation of life signatures, without restricting attention solely to those attributes that represent local solutions to the challenges of survival. Life indicators should be modelled with reference to the temporal and environmental variations specific to each planet and time. In this paper, we investigate a semi-automatic open-source framework for the accurate detection and interpretation of life signatures that facilitates public participation, in a way similar to that adopted by the SETI@home project. The involvement of the public in identifying patterns can give the mission a thrust, and is implemented using a semi-automatic framework. Various advanced intelligent methodologies may augment the integration of this human-machine analysis. Automatic and manual evaluations, along with a dynamic learning strategy, have been adopted to provide accurate results. The system also helps provide a deeper public understanding of space agencies' work and facilitates mass involvement in astrobiological studies. It should also help motivate young, eager minds to pursue a career in this field.

  14. Neuromantic - from semi-manual to semi-automatic reconstruction of neuron morphology

    Directory of Open Access Journals (Sweden)

    Darren Myatt

    2012-03-01

    The ability to create accurate geometric models of neuronal morphology is important for understanding the role of shape in information processing. Despite a significant amount of research on automating neuron reconstruction from image stacks obtained via microscopy, in practice most data are still collected manually. This paper describes Neuromantic, an open-source system for three-dimensional digital tracing of neurites. Neuromantic reconstructions are comparable in quality to those of existing commercial and freeware systems while balancing the speed and accuracy of manual reconstruction. The combination of semi-automatic tracing, intuitive editing, and the ability to visualise large image stacks on standard computing platforms provides a versatile tool that can help address the reconstruction-availability bottleneck. Practical considerations for reducing the computational time and space requirements of the extended algorithm are also discussed.

  15. Semi-automatic ROI placement system for analysis of brain PET images based on elastic model. Application to diagnosis of Alzheimer's disease

    International Nuclear Information System (INIS)

    Ohyama, Masashi; Mishina, Masahiro; Kitamura, Shin; Katayama, Yasuo; Senda, Michio; Tanizaki, Naoki; Ishii, Kenji

    2000-01-01

    PET with ¹⁸F-fluorodeoxyglucose (FDG) is a useful technique for imaging cerebral glucose metabolism and detecting patients with Alzheimer's disease in the early stage, in which characteristic temporoparietal hypometabolism is visualized. We have developed a new system in which a standard brain ROI atlas made of networks of segments is elastically transformed to match the subject's brain images, so that standard ROIs defined on the segments are placed on the individual brain images and used to measure radioactivity over each brain region. We applied this method to Alzheimer's disease, using the images of 10 normal subjects (age 55 ± 12) and 21 patients clinically diagnosed with Alzheimer's disease (age 61 ± 10). FDG uptake, reflecting glucose metabolism, was evaluated with SUV, i.e. decay-corrected radioactivity divided by injected dose per body weight, in (Bq/ml)/(Bq/g). The system worked correctly in every subject, including those with extensive hypometabolism. Alzheimer patients showed markedly lower uptake in the parietal cortex (4.0-4.1). When the threshold value of FDG uptake in the parietal lobe was set at 5 (Bq/ml)/(Bq/g), we could discriminate the patients with Alzheimer's disease from the normal subjects, with a sensitivity of 86% and a specificity of 90%. This system can assist the diagnosis of FDG images and may be useful for handling data from a large number of subjects, e.g. when PET is applied to health screening. (author)
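
    The sensitivity and specificity follow from applying the SUV threshold to both groups; a minimal numpy sketch with synthetic SUVs (not the study data):

        import numpy as np

        def sens_spec(suv_patients, suv_normals, threshold=5.0):
            """Sensitivity/specificity of 'parietal SUV below threshold = disease'."""
            tp = np.sum(suv_patients < threshold)    # patients correctly flagged
            tn = np.sum(suv_normals >= threshold)    # normals correctly passed
            return tp / len(suv_patients), tn / len(suv_normals)

        patients = np.array([4.0, 4.1, 4.5, 3.9, 5.2])   # synthetic parietal SUVs
        normals = np.array([5.5, 6.0, 4.8, 5.9])
        print(sens_spec(patients, normals))              # -> (0.8, 0.75)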

  16. Performance testing of a semi-automatic card punch system, using direct STR profiling of DNA from blood samples on FTA™ cards.

    Science.gov (United States)

    Ogden, Samantha J; Horton, Jeffrey K; Stubbs, Simon L; Tatnell, Peter J

    2015-01-01

    The 1.2 mm Electric Coring Tool (e-Core™) was developed to increase the throughput of FTA™ sample collection cards used in forensic workflows, and is similar to a 1.2 mm Harris manual micro-punch for sampling dried blood spots. Direct short tandem repeat (STR) DNA profiling was used to compare samples taken by the e-Core tool with those taken by the manual micro-punch. The performance of the e-Core device was evaluated using the commercially available PowerPlex™ 18D STR System. In addition, an analysis was performed to investigate the potential carryover of DNA via the e-Core punch from one FTA disc to another. This contamination study was carried out using Applied Biosystems AmpFLSTR™ Identifiler™ Direct PCR Amplification kits. The e-Core instrument does not contaminate FTA discs when a cleaning punch is used following excision of discs containing samples, and it generates STR profiles that are comparable to those generated by the manual micro-punch.

  17. Research on Semi-automatic Bomb Fetching for an EOD Robot

    Directory of Open Access Journals (Sweden)

    Qian Jun

    2008-11-01

    An EOD robot system, SUPER-PLUS, which has a novel semi-automatic bomb-fetching function, is presented in this paper. With limited human support, SUPER-PLUS scans the cluttered environment with a wrist-mounted laser distance sensor and plans a collision-free path for the manipulator to fetch the bomb. The modeling of the manipulator, bomb and environment, the C-space map, path planning and the operation procedure are introduced in detail. The semi-automatic bomb-fetching function has greatly improved the operational performance of the EOD robot.
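
    The record does not detail the planner; as a hedged stand-in for collision-free path planning over a C-space map, here is a breadth-first search on a 2D occupancy grid (a real manipulator C-space would be higher-dimensional):

        from collections import deque

        def bfs_path(grid, start, goal):
            """Shortest collision-free path on a 2D occupancy grid (1 = obstacle)."""
            rows, cols = len(grid), len(grid[0])
            prev = {start: None}
            queue = deque([start])
            while queue:
                cur = queue.popleft()
                if cur == goal:                      # reconstruct the path
                    path = []
                    while cur is not None:
                        path.append(cur)
                        cur = prev[cur]
                    return path[::-1]
                r, c = cur
                for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                    if 0 <= nr < rows and 0 <= nc < cols \
                            and grid[nr][nc] == 0 and (nr, nc) not in prev:
                        prev[(nr, nc)] = cur
                        queue.append((nr, nc))
            return None  # no collision-free path exists

        grid = [[0, 0, 0, 0],
                [1, 1, 0, 1],
                [0, 0, 0, 0],
                [0, 1, 1, 0]]
        print(bfs_path(grid, (0, 0), (3, 3)))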

  18. Research on Semi-Automatic Bomb Fetching for an EOD Robot

    Directory of Open Access Journals (Sweden)

    Zeng Jian-Jun

    2007-06-01

    An EOD robot system, SUPER-PLUS, which has a novel semi-automatic bomb-fetching function, is presented in this paper. With limited human support, SUPER-PLUS scans the cluttered environment with a wrist-mounted laser distance sensor and plans a collision-free path for the manipulator to fetch the bomb. The modeling of the manipulator, bomb and environment, the C-space map, path planning and the operation procedure are introduced in detail. The semi-automatic bomb-fetching function has greatly improved the operational performance of the EOD robot.

  19. Semi-automatic ultrasonic inspection of PWR upper internal immersed components

    International Nuclear Information System (INIS)

    Dombret, P.; Coquette, A.; Cermak, J.; Verspeelt, D.

    1985-01-01

    The present paper describes the characteristics of a semi-automatic ultrasonic inspection system. The components inspected are the so-called flexures, small pins located at the upper part of the control rod tube-guides, some of which happened to break in a few Westinghouse-type PWRs. Inspection results and other system capabilities are also presented.

  20. Application of a semi-automatic cartilage segmentation method for biomechanical modeling of the knee joint.

    Science.gov (United States)

    Liukkonen, Mimmi K; Mononen, Mika E; Tanska, Petri; Saarakkala, Simo; Nieminen, Miika T; Korhonen, Rami K

    2017-10-01

    Manual segmentation of articular cartilage from knee joint 3D magnetic resonance images (MRI) is a time-consuming and laborious task. Thus, automatic methods are needed for faster and more reproducible segmentations. In the present study, we developed a semi-automatic segmentation method based on radial intensity profiles to generate 3D geometries of knee joint cartilage, which were then used in computational biomechanical models of the knee joint. Six healthy volunteers were imaged with a 3T MRI device and their knee cartilages were segmented both manually and semi-automatically. The values of cartilage thicknesses and volumes produced by the two methods were compared. Furthermore, the influence of possible geometrical differences on cartilage stresses and strains in the knee was evaluated with finite element modeling. The semi-automatic segmentation and 3D geometry construction of one knee joint (menisci, femoral and tibial cartilages) was approximately two times faster than manual segmentation. Differences in cartilage thicknesses, volumes, contact pressures, stresses, and strains between the segmentation methods in femoral and tibial cartilage were mostly insignificant (p > 0.05) and random, i.e. there were no systematic differences between the methods. In conclusion, the devised semi-automatic segmentation method is a quick and accurate way to determine cartilage geometries; it may become a valuable tool for biomechanical modeling applications with large patient groups.
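
    A hedged sketch of the radial-intensity-profile idea: sample the image along a ray from a point inside the cartilage and place the boundary at the steepest intensity drop (the sampling and edge rule are illustrative, not the paper's algorithm):

        import numpy as np

        def radial_boundary(img, center, angle, max_r=50):
            """Find a tissue boundary along one radial intensity profile.

            Samples the image along a ray from `center` at `angle` (nearest-
            neighbour) and returns the radius of the steepest intensity drop.
            """
            radii = np.arange(max_r)
            ys = np.clip(np.round(center[0] + radii * np.sin(angle)).astype(int),
                         0, img.shape[0] - 1)
            xs = np.clip(np.round(center[1] + radii * np.cos(angle)).astype(int),
                         0, img.shape[1] - 1)
            profile = img[ys, xs].astype(float)
            return int(np.argmin(np.diff(profile)))  # steepest drop = boundary

        # Toy image: bright disc (cartilage-like) of radius 20 on a dark background.
        yy, xx = np.mgrid[:100, :100]
        img = ((yy - 50) ** 2 + (xx - 50) ** 2 < 20 ** 2) * 200.0
        print(radial_boundary(img, (50, 50), angle=0.0))  # -> 19 (edge of the disc)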

  1. Semi-Automatic Construction of Skeleton Concept Maps from Case Judgments

    NARCIS (Netherlands)

    Boer, A.; Sijtsma, B.; Winkels, R.; Lettieri, N.

    2014-01-01

    This paper proposes an approach to generating Skeleton Conceptual Maps (SCM) semi-automatically from legal case documents provided by the United Kingdom's Supreme Court. SCM are incomplete knowledge representations for the purpose of scaffolding learning. The proposed system intends to provide...

  2. A semi-automatic annotation tool for cooking video

    Science.gov (United States)

    Bianco, Simone; Ciocca, Gianluigi; Napoletano, Paolo; Schettini, Raimondo; Margherita, Roberto; Marini, Gianluca; Gianforme, Giorgio; Pantaleo, Giuseppe

    2013-03-01

    In order to create a cooking assistant application to guide users in the preparation of dishes relevant to their diet profiles and food preferences, it is necessary to accurately annotate the video recipes, identifying and tracking the foods handled by the cook. These videos present particular annotation challenges such as frequent occlusions, changes in food appearance, etc. Manually annotating the videos is a time-consuming, tedious and error-prone task. Fully automatic tools that integrate computer vision algorithms to extract and identify the elements of interest are not error-free, and false-positive and false-negative detections need to be corrected in a post-processing stage. We present an interactive, semi-automatic tool for the annotation of cooking videos that integrates computer vision techniques under the supervision of the user. Annotation accuracy is increased with respect to completely automatic tools, and human effort is reduced with respect to completely manual ones. The performance and usability of the proposed tool are evaluated on the basis of the time and effort required to annotate the same video sequences.

  3. Semi-automatic tool to ease the creation and optimization of GPU programs

    DEFF Research Database (Denmark)

    Jepsen, Jacob

    2014-01-01

    We present a tool that reduces the development time of GPU-executable code. We implement a catalogue of common optimizations specific to the GPU architecture. Through the tool, the programmer can semi-automatically transform a computationally-intensive code section into GPU-executable form ... of the transformations can be performed automatically, which makes the tool usable for both novices and experts in GPU programming.

  4. Design and development of semi-automatic radiation test and calibration facility

    International Nuclear Information System (INIS)

    Yadav, Ashok Kumar; Chouhan, V.K.; Narayan, Pradeep

    2008-01-01

    A semi-automatic gamma radiation test and calibration facility has been designed, developed and commissioned at the Defence Laboratory Jodhpur (DLJ). The facility comprises a medium- and high-dose-rate-range setup using a 30 Ci cobalt-60 source in a portable, remotely operated Techops camera, and a 15,000 Ci ⁶⁰Co source in a teletherapy machine. The radiation instruments can be positioned at any desired position using a computer-controlled positioner with three translational motions and one rotational motion. User-friendly software helps position the Device Under Test (DUT) at any desired dose rate or distance and acquire the data automatically. The servo- and stepper-motor-controlled positioner helps achieve the precision and accuracy required for the radiation calibration of the instruments. This paper describes the semi-automatic radiation test and calibration facility commissioned at DLJ. (author)

  5. Semi-Automatic Construction of Skeleton Concept Maps from Case Judgments

    OpenAIRE

    Boer, A.; Sijtsma, B.; Winkels, R.; Lettieri, N.

    2014-01-01

    This paper proposes an approach to generating Skeleton Conceptual Maps (SCM) semi-automatically from legal case documents provided by the United Kingdom's Supreme Court. SCM are incomplete knowledge representations for the purpose of scaffolding learning. The proposed system intends to provide students with a tool to pre-process text and to extract knowledge from documents in a time-saving manner. A combination of natural language processing methods and proposition extraction algorithms are u...

  6. A dorsolateral prefrontal cortex semi-automatic segmenter

    Science.gov (United States)

    Al-Hakim, Ramsey; Fallon, James; Nain, Delphine; Melonakos, John; Tannenbaum, Allen

    2006-03-01

    Structural, functional, and clinical studies in schizophrenia have, for several decades, consistently implicated dysfunction of the prefrontal cortex in the etiology of the disease. Functional and structural imaging studies, combined with clinical, psychometric, and genetic analyses in schizophrenia, have confirmed the key roles played by the prefrontal cortex and closely linked "prefrontal system" structures such as the striatum, amygdala, mediodorsal thalamus, substantia nigra-ventral tegmental area, and anterior cingulate cortices. The nodal structure of the prefrontal system circuit is the dorsal lateral prefrontal cortex (DLPFC), or Brodmann area 46, which also appears to be the most commonly studied and cited brain area with respect to schizophrenia. 1, 2, 3, 4 In 1986, Weinberger et al. tied cerebral blood flow in the DLPFC to schizophrenia. 1 In 2001, Perlstein et al. demonstrated that DLPFC activation is essential for working memory tasks commonly deficient in schizophrenia. 2 More recently, groups have linked morphological changes due to gene deletion and increased DLPFC glutamate concentration to schizophrenia. 3, 4 Despite the experimental and clinical focus on the DLPFC in structural and functional imaging, the variability of the location of this area, differences in opinion on exactly what constitutes DLPFC, and inherent difficulties in segmenting this highly convoluted cortical region have contributed to a lack of widely used standards for manual or semi-automated segmentation programs. Given these implications, we developed a semi-automatic tool to segment the DLPFC from brain MRI scans in a reproducible way to conduct further morphological and statistical studies. The segmenter is based on expert neuroanatomist rules (Fallon-Kindermann rules), inspired by cytoarchitectonic data and reconstructions presented by Rajkowska and Goldman-Rakic. 5 It is semi-automated to provide essential user interactivity. We present our results and provide details on

  7. Semi-Automatic Rename Refactoring for JavaScript

    DEFF Research Database (Denmark)

    Feldthaus, Asger; Møller, Anders

    2013-01-01

    Modern IDEs support automated refactoring for many programming languages, but support for JavaScript is still primitive. To perform renaming, which is one of the fundamental refactorings, there is often no practical alternative to simple syntactic search-and-replace. Although more sophisticated alternatives have been developed, they are limited by whole-program assumptions and poor scalability. We propose a technique for semi-automatic refactoring for JavaScript, with a focus on renaming. Unlike traditional refactoring algorithms, semi-automatic refactoring works by a combination of static analysis and interaction with the programmer. With this pragmatic approach, we can provide scalable and effective refactoring support for real-world code, including libraries and incomplete applications. Through a series of experiments that estimate how much manual effort our technique demands from the programmer, we show...

  8. Evaluation of semi-automatic arterial stenosis quantification

    International Nuclear Information System (INIS)

    Hernandez Hoyos, M.; Universite Claude Bernard Lyon 1, 69 - Villeurbanne; Univ. de los Andes, Bogota; Serfaty, J.M.; Douek, P.C.; Universite Claude Bernard Lyon 1, 69 - Villeurbanne; Hopital Cardiovasculaire et Pneumologique L. Pradel, Bron; Maghiar, A.; Mansard, C.; Orkisz, M.; Magnin, I.; Universite Claude Bernard Lyon 1, 69 - Villeurbanne

    2006-01-01

    Object: To assess the accuracy and reproducibility of semi-automatic vessel axis extraction and stenosis quantification in 3D contrast-enhanced Magnetic Resonance Angiography (CE-MRA) of the carotid arteries (CA). Materials and methods: A total of 25 MRA datasets was used: 5 phantoms with known stenoses, and 20 patients (40 CAs) drawn from a multicenter trial database. Maracas software extracted vessel centerlines and quantified the stenoses, based on boundary detection in planes perpendicular to the centerline. Centerline accuracy was visually scored. Semi-automatic measurements were compared with: (1) theoretical phantom morphometric values, and (2) stenosis degrees evaluated by two independent radiologists. Results: Exploitable centerlines were obtained in 97% of CAs and in all phantoms. In phantoms, the software achieved better agreement with theoretical stenosis degrees (weighted kappa κW = 0.91) than the radiologists (κW = 0.69). In patients, agreement between software and radiologists varied from κW = 0.67 to 0.90. In both, Maracas was substantially more reproducible than the readers. Mean operating time was within 1 min/CA. Conclusion: Maracas software generates accurate 3D centerlines of vascular segments with minimum user intervention. Semi-automatic quantification of CA stenosis is also accurate, except in very severe stenoses that cannot be segmented. It substantially reduces the inter-observer variability. (orig.)

  9. User Interaction in Semi-Automatic Segmentation of Organs at Risk: a Case Study in Radiotherapy.

    Science.gov (United States)

    Ramkumar, Anjana; Dolz, Jose; Kirisli, Hortense A; Adebahr, Sonja; Schimek-Jasch, Tanja; Nestle, Ursula; Massoptier, Laurent; Varga, Edit; Stappers, Pieter Jan; Niessen, Wiro J; Song, Yu

    2016-04-01

    Accurate segmentation of organs at risk is an important step in radiotherapy planning. Manual segmentation being a tedious procedure and prone to inter- and intra-observer variability, there is a growing interest in automated segmentation methods. However, automatic methods frequently fail to provide satisfactory results, and post-processing corrections are often needed. Semi-automatic segmentation methods are designed to overcome these problems by combining physicians' expertise and computers' potential. This study evaluates two semi-automatic segmentation methods with different types of user interactions, named the "strokes" and the "contour" methods, to provide insights into the role and impact of human-computer interaction. Two physicians participated in the experiment. In total, 42 case studies were carried out on five different types of organs at risk. For each case study, both the human-computer interaction process and the quality of the segmentation results were measured subjectively and objectively. Furthermore, different measures of the process and the results were correlated. A total of 36 quantifiable and ten non-quantifiable correlations were identified for each type of interaction. Among those pairs of measures, 20 of the contour method and 22 of the strokes method were strongly or moderately correlated, either directly or inversely. Based on those correlated measures, it is concluded that: (1) in the design of semi-automatic segmentation methods, user interactions need to be less cognitively challenging; (2) based on the observed workflows and preferences of physicians, there is a need for flexibility in the interface design; (3) the correlated measures provide insights that can be used in improving user interaction design.

  10. Bouncy knee in a semi-automatic knee lock prosthesis.

    Science.gov (United States)

    Fisher, L D; Lord, M

    1986-04-01

    The Bouncy Knee concept has previously proved of value when fitted to stabilised knee units of active amputees. The stance phase flex-extend action afforded by a Bouncy Knee increased the symmetry of gait and also gave better tolerance to slopes and uneven ground. A bouncy function has now been incorporated into a knee of the semi-automatic knee lock design in a pilot laboratory trial involving six patients. These less active patients did not show consistent changes in symmetry of gait, but demonstrated an improved ability to walk on slopes and increased their walking range. Subjective response was positive, as noted in the previous trials.

  11. Implementation of a microcontroller-based semi-automatic coagulator.

    Science.gov (United States)

    Chan, K; Kirumira, A; Elkateeb, A

    2001-01-01

    The coagulator is an instrument used in hospitals to detect clot formation as a function of time. Generally, these coagulators are very expensive and therefore not affordable for a doctor's office or small clinic. The objective of this project is to design and implement a low-cost semi-automatic coagulator (SAC) prototype. The SAC is capable of assaying up to 12 samples and can perform the following tests: prothrombin time (PT), activated partial thromboplastin time (APTT), and PT/APTT combination. The prototype has been tested successfully.

  12. Gray-Matter Volume Estimate Score: A Novel Semi-Automatic Method Measuring Early Ischemic Change on CT

    OpenAIRE

    Song, Dongbeom; Lee, Kijeong; Kim, Eun Hye; Kim, Young Dae; Lee, Hye Sun; Kim, Jinkwon; Song, Tae-Jin; Ahn, Sung Soo; Nam, Hyo Suk; Heo, Ji Hoe

    2015-01-01

    Background and Purpose We developed a novel method named Gray-matter Volume Estimate Score (GRAVES), measuring early ischemic changes on Computed Tomography (CT) semi-automatically by computer software. This study aimed to compare GRAVES and Alberta Stroke Program Early CT Score (ASPECTS) with regards to outcome prediction and inter-rater agreement. Methods This was a retrospective cohort study. Among consecutive patients with ischemic stroke in the anterior circulation who received intra-art...

  13. Semi Automatic Ontology Instantiation in the domain of Risk Management

    Science.gov (United States)

    Makki, Jawad; Alquier, Anne-Marie; Prince, Violaine

    One of the challenging tasks in the context of Ontological Engineering is to automatically or semi-automatically support the process of Ontology Learning and Ontology Population from semi-structured documents (texts). In this paper we describe a Semi-Automatic Ontology Instantiation method from natural language text, in the domain of Risk Management. This method is composed of three steps: 1) annotation with part-of-speech tags, 2) semantic relation instance extraction, 3) ontology instantiation. It is based on combined NLP techniques, with human intervention between steps 2 and 3 for control and validation. Since it relies heavily on linguistic knowledge, it is not domain-dependent, which is a good feature for portability between the different fields of risk management application. The proposed methodology uses the ontology of the PRIMA project (supported by the European Community) as a generic domain ontology and populates it via an available corpus. A first validation of the approach is done through an experiment with Chemical Fact Sheets from the Environmental Protection Agency.

  14. PEP computer control system

    International Nuclear Information System (INIS)

    1979-03-01

    This paper describes the design and performance of the computer system that will be used to control and monitor the PEP storage ring. Since the design is essentially complete and much of the system is operational, the system is described as it is expected to be in 1979. Section 1 of the paper describes the system hardware, which includes the computer network, the CAMAC data I/O system, and the operator control consoles. Section 2 describes a collection of routines that provide general services to applications programs. These services include a graphics package, data base and data I/O programs, and a director program for use in operator communication. Section 3 describes a collection of automatic and semi-automatic control programs, known as SCORE, that contain mathematical models of the ring lattice and are used to determine, in real time, stable paths for changing beam configuration and energy and for orbit correction. Section 4 describes a collection of programs, known as CALI, that are used for calibration of ring elements.

  15. Quantitative analysis of the patellofemoral motion pattern using semi-automatic processing of 4D CT data.

    Science.gov (United States)

    Forsberg, Daniel; Lindblom, Maria; Quick, Petter; Gauffin, Håkan

    2016-09-01

    To present a semi-automatic method with minimal user interaction for quantitative analysis of the patellofemoral motion pattern. 4D CT data capturing the patellofemoral motion pattern of a continuous flexion and extension were collected for five patients prone to patellar luxation both pre- and post-surgically. For the proposed method, an observer would place landmarks in a single 3D volume, which are then automatically propagated to the other volumes in a time sequence. From the landmarks in each volume, the measures patellar displacement, patellar tilt and angle between femur and tibia were computed. Evaluation of the observer variability showed the proposed semi-automatic method to be favorable over a fully manual counterpart, with an observer variability of approximately 1.5° for the angle between femur and tibia, 1.5 mm for the patellar displacement, and 4.0°-5.0° for the patellar tilt. The proposed method showed that surgery reduced the patellar displacement and tilt at maximum extension by approximately 10-15 mm and 15°-20° for three patients, but with less evident differences for the two other patients. A semi-automatic method suitable for quantification of the patellofemoral motion pattern as captured by 4D CT data has been presented. Its observer variability is on par with that of other methods, but with the distinct advantage of supporting continuous motions during the image acquisition.
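
    As an illustration of one of the measures above, the angle between femur and tibia can be computed from landmark-defined axis vectors; the landmark coordinates below are invented, since the paper does not publish its formulas.

        import numpy as np

        def angle_deg(u, v):
            """Angle in degrees between two 3D direction vectors."""
            c = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
            return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

        # axis = distal landmark minus proximal landmark, per bone (made-up values)
        femur_axis = np.array([12.0, 3.1, 410.0]) - np.array([14.5, 2.0, 120.0])
        tibia_axis = np.array([13.0, 2.5, 118.0]) - np.array([11.0, 4.0, -180.0])
        print(angle_deg(femur_axis, tibia_axis))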

  16. A semi-automatic method for peak and valley detection in free-breathing respiratory waveforms

    International Nuclear Information System (INIS)

    Lu Wei; Nystrom, Michelle M.; Parikh, Parag J.; Fooshee, David R.; Hubenschmidt, James P.; Bradley, Jeffrey D.; Low, Daniel A.

    2006-01-01

    The existing commercial software often inadequately determines respiratory peaks for patients in respiration-correlated computed tomography. A semi-automatic method was developed for peak and valley detection in free-breathing respiratory waveforms. First, the waveform is separated into breath cycles by identifying intercepts of a moving average curve with the inspiration and expiration branches of the waveform. Peaks and valleys were then defined, respectively, as the maximum and minimum between pairs of alternating inspiration and expiration intercepts. Finally, automatic corrections and manual user interventions were employed. On average, for each of the 20 patients, 99% of 307 peaks and valleys were automatically detected in 2.8 s. This method was robust for bellows waveforms with large variations.
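
    A compact sketch of the intercept rule described above, assuming a uniformly sampled waveform and an illustrative moving-average window; the paper's automatic corrections and manual interventions are omitted.

        import numpy as np

        def peaks_and_valleys(signal, window=50):
            """Split the waveform into breath cycles at crossings of its moving
            average, then take the extremum between alternating crossings."""
            avg = np.convolve(signal, np.ones(window) / window, mode="same")
            above = signal > avg
            crossings = np.nonzero(np.diff(above.astype(int)))[0]   # intercepts
            peaks, valleys = [], []
            for a, b in zip(crossings[:-1], crossings[1:]):
                segment = signal[a:b + 1]
                if above[a + 1]:                     # inspiration branch: peak
                    peaks.append(a + int(np.argmax(segment)))
                else:                                # expiration branch: valley
                    valleys.append(a + int(np.argmin(segment)))
            return peaks, valleys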

  17. A semi-automatic method for positioning a femoral bone reconstruction for strict view generation.

    Science.gov (United States)

    Milano, Federico; Ritacco, Lucas; Gomez, Adrian; Gonzalez Bernaldo de Quiros, Fernan; Risk, Marcelo

    2010-01-01

    In this paper we present a semi-automatic method for femoral bone positioning after 3D image reconstruction from Computed Tomography images. This serves as grounding for the definition of strict axial, longitudinal and anterior-posterior views, overcoming the problem of patient positioning biases in 2D femoral bone measuring methods. After the bone reconstruction is aligned to a standard reference frame, new tomographic slices can be generated, on which unbiased measures may be taken. This could allow not only accurate inter-patient comparisons but also intra-patient comparisons, i.e., comparisons of images of the same patient taken at different times. This method could enable medical doctors to diagnose and follow up several bone deformities more easily.
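
    The paper does not publish its positioning algorithm; one plausible reading is a landmark-based rigid alignment to the standard reference frame, for example the Kabsch procedure sketched below (landmark correspondences assumed given).

        import numpy as np

        def kabsch_rotation(P, Q):
            """Best-fit rotation taking landmark set P onto reference set Q
            (both N x 3 with corresponding rows), via SVD of the covariance."""
            Pc = P - P.mean(axis=0)
            Qc = Q - Q.mean(axis=0)
            U, _, Vt = np.linalg.svd(Pc.T @ Qc)
            d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
            return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T

        # New, unbiased tomographic slices can then be resampled along the
        # axes of the rotated reconstruction.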

  18. Semi-Automatic Rating Method for Neutrophil Alkaline Phosphatase Activity.

    Science.gov (United States)

    Sugano, Kanae; Hashi, Kotomi; Goto, Misaki; Nishi, Kiyotaka; Maeda, Rie; Kono, Keigo; Yamamoto, Mai; Okada, Kazunori; Kaga, Sanae; Miwa, Keiko; Mikami, Taisei; Masauzi, Nobuo

    2017-01-01

    The neutrophil alkaline phosphatase (NAP) score is a valuable test for the diagnosis of myeloproliferative neoplasms, but it is still rated manually. We therefore developed a semi-automatic rating method using Photoshop® and Image-J, called NAP-PS-IJ. Neutrophil alkaline phosphatase staining was performed with Tomonaga's method on films of peripheral blood taken from three healthy volunteers. At least 30 neutrophils with NAP scores from 0 to 5+ were observed and imaged, and the area outside each neutrophil was removed with Image-J. The images were then binarized with two different procedures (P1 and P2) using Photoshop®. The NAP-positive area (NAP-PA) was measured and the NAP-positive granules (NAP-PGC) were counted with Image-J. The NAP-PA in images binarized with P1 differed significantly (P < 0.05) between images with NAP scores from 0 to 3+ (group 1) and those from 4+ to 5+ (group 2). The original images in group 1 were binarized with P2, and their NAP-PGC differed significantly (P < 0.05) among all four NAP score groups. The mean NAP-PGC with NAP-PS-IJ showed a good correlation (r = 0.92, P < 0.001) with the results of human examiners. The sensitivity and specificity of NAP-PS-IJ were 60% and 92%, so it may be considered a prototype for a fully automatic NAP score rating method. © 2016 Wiley Periodicals, Inc.

  19. Automatic, semi-automatic and manual validation of urban drainage data.

    Science.gov (United States)

    Branisavljević, N; Prodanović, D; Pavlović, D

    2010-01-01

    Advances in sensor technology and the possibility of automated long-distance data transmission have made continuous measurements the preferable way of monitoring urban drainage processes. Usually, the collected data have to be processed by an expert in order to detect and mark erroneous values, remove them and replace them with interpolated data. In general, the first step, detecting the anomalous data, is called data quality assessment or data validation. Data validation consists of three parts: data preparation, validation score generation and score interpretation. This paper presents the overall framework for a data quality improvement system, suitable for automatic, semi-automatic or manual operation. The first two steps of the validation process are explained in more detail, using several validation methods on the same set of real-case data from the Belgrade sewer system. The final part of the validation process, the score interpretation, needs to be investigated further on the developed system.
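
    As a toy illustration of the score-generation step, three of the check families named above can be evaluated per sample as follows; the thresholds are invented for the example, not taken from the Belgrade dataset.

        import numpy as np

        def validation_flags(x, lo=0.0, hi=50.0, max_step=5.0, flat_run=10):
            """Flag samples failing a range (logical), step (temporal) or
            persistence (stuck-sensor) check; thresholds are illustrative."""
            x = np.asarray(x, dtype=float)
            bad_range = (x < lo) | (x > hi)
            bad_step = np.r_[False, np.abs(np.diff(x)) > max_step]
            bad_flat = np.r_[np.zeros(flat_run, dtype=bool),
                             [np.ptp(x[i - flat_run:i]) == 0.0
                              for i in range(flat_run, x.size)]]
            return bad_range | bad_step | bad_flat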

  20. Sherlock: A Semi-automatic Framework for Quiz Generation Using a Hybrid Semantic Similarity Measure.

    Science.gov (United States)

    Lin, Chenghua; Liu, Dong; Pang, Wei; Wang, Zhe

    In this paper, we present a semi-automatic system (Sherlock) for quiz generation using linked data and textual descriptions of RDF resources. Sherlock is distinguished from existing quiz generation systems in its generic framework for domain-independent quiz generation as well as in the ability of controlling the difficulty level of the generated quizzes. Difficulty scaling is non-trivial, and it is fundamentally related to cognitive science. We approach the problem with a new angle by perceiving the level of knowledge difficulty as a similarity measure problem and propose a novel hybrid semantic similarity measure using linked data. Extensive experiments show that the proposed semantic similarity measure outperforms four strong baselines with more than 47 % gain in clustering accuracy. In addition, we discovered in the human quiz test that the model accuracy indeed shows a strong correlation with the pairwise quiz similarity.

  1. a New Approach for the Semi-Automatic Texture Generation of the Buildings Facades, from Terrestrial Laser Scanner Data

    Science.gov (United States)

    Oniga, E.

    2012-07-01

    The result of terrestrial laser scanning is an impressive number of spatial points, each characterized by its X, Y and Z co-ordinates, by the value of the laser reflectance and by its real color, expressed as RGB (Red, Green, Blue) values. The color code for each LIDAR point is taken from the georeferenced digital images acquired with a high-resolution panoramic camera incorporated in the scanner system. In this article I propose a new algorithm for semi-automatic texture generation, using the color information, i.e. the RGB values of every point acquired by terrestrial laser scanning, and the 3D surfaces defining the building facades, generated with the Leica Cyclone software. In the first step, the operator defines the limiting value, i.e. the minimum distance between a point and the closest surface. The second step consists in calculating the distances, i.e. the perpendiculars drawn from each point to the closest surface. In the third step, the points whose 3D coordinates are known are associated with each surface, depending on the limiting value. The fourth step consists in computing the Voronoi diagram of the points that belong to a surface. The final step automatically associates the RGB value of the color code with the corresponding polygon of the Voronoi diagram. The advantage of this algorithm is that a photorealistic 3D model of the building can be obtained in a semi-automatic manner.
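
    A sketch of steps one to three, under the assumption that each facade is stored as a plane (unit normal n, offset d) so that the perpendicular distance of a point x is |n·x + d|; the 5 cm limiting value is an invented example.

        import numpy as np

        def assign_points(points_xyzrgb, planes, limit=0.05):
            """Attach each scanned point to its closest facade plane when the
            perpendicular distance is below the operator-chosen limit."""
            normals = np.array([n for n, d in planes])        # (m, 3) unit normals
            offsets = np.array([d for n, d in planes])        # (m,)
            dist = np.abs(points_xyzrgb[:, :3] @ normals.T + offsets)  # (n, m)
            nearest = dist.argmin(axis=1)
            accepted = dist[np.arange(len(dist)), nearest] <= limit
            return nearest, accepted

        # Steps four and five would build scipy.spatial.Voronoi over each
        # facade's accepted points and fill every cell with its point's RGB.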

  2. Semi-automatic creation and exploitation of competence ontologies for trend aware profiling, matching and planning

    Directory of Open Access Journals (Sweden)

    H. Ulrich Hoppe

    2013-03-01

    Human resource managers are confronted with the problem that they have to fulfil the enterprise’s competence needs either by developing their current staff or by recruiting new employees. In both cases, decisions have to be made about whom to select for the new position and, more often, about which competences are crucial for future success. This is especially true for highly dynamic industries like the IT industry. This article presents our work from the KoPIWA project in the Digital Economy. Our approach is based on a conceptual model that encompasses the market level, the social context and the relations between competences. This model is the foundation for the ontology-based decision support system for human resource managers presented in this article. To semi-automatically create and update the competence ontology, methods from data mining, social network analysis and information retrieval are employed. The results of these methods with regard to recruiting and learning processes are presented.

  3. Diagnostic accuracy of semi-automatic quantitative metrics as an alternative to expert reading of CT myocardial perfusion in the CORE320 study.

    Science.gov (United States)

    Ostovaneh, Mohammad R; Vavere, Andrea L; Mehra, Vishal C; Kofoed, Klaus F; Matheson, Matthew B; Arbab-Zadeh, Armin; Fujisawa, Yasuko; Schuijf, Joanne D; Rochitte, Carlos E; Scholte, Arthur J; Kitagawa, Kakuya; Dewey, Marc; Cox, Christopher; DiCarli, Marcelo F; George, Richard T; Lima, Joao A C

    2018-04-03

    To determine the diagnostic accuracy of semi-automatic quantitative metrics compared to expert reading for interpretation of computed tomography perfusion (CTP) imaging. The CORE320 multicenter diagnostic accuracy clinical study enrolled patients between 45 and 85 years of age who were clinically referred for invasive coronary angiography (ICA). Computed tomography angiography (CTA), CTP, single photon emission computed tomography (SPECT), and ICA images were interpreted manually in blinded core laboratories by two experienced readers. Additionally, eight quantitative CTP metrics as continuous values were computed semi-automatically from myocardial and blood attenuation and were combined using logistic regression to derive a final quantitative CTP metric score. For the reference standard, hemodynamically significant coronary artery disease (CAD) was defined as a quantitative ICA stenosis of 50% or greater and a corresponding perfusion defect by SPECT. Diagnostic accuracy was determined by the area under the receiver operating characteristic curve (AUC). Of the 377 included patients, 66% were male, the median age was 62 (IQR: 56, 68) years, and 27% had prior myocardial infarction. In the patient-based analysis, the AUC (95% CI) for combined CTA-CTP expert reading and combined CTA-CTP semi-automatic quantitative metrics was 0.87 (0.84-0.91) and 0.86 (0.83-0.9), respectively. In the vessel-based analyses the AUCs were 0.85 (0.82-0.88) and 0.84 (0.81-0.87), respectively. No significant difference in AUC was found between combined CTA-CTP expert reading and CTA-CTP semi-automatic quantitative metrics in patient-based or vessel-based analyses (p > 0.05 for all). Combined CTA-CTP semi-automatic quantitative metrics are as accurate as CTA-CTP expert reading in detecting hemodynamically significant CAD. Copyright © 2018 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.

  4. System for connection of a scanning measuring equipment with a computer

    International Nuclear Information System (INIS)

    Avdeev, N.F.; Bogomolov, M.N.; Volkov, G.A.

    1976-01-01

    A system for connecting viewing and measuring devices to a BESM-4 computer for on-line operation is considered. A program for film data processing is presented, and the interface is described in detail. Modifications to some units of the semi-automatic devices, made to ensure communication with the computer, are mentioned. The communication system allows for a man-machine dialogue.

  5. 10 CFR Appendix J1 to Subpart B of... - Uniform Test Method for Measuring the Energy Consumption of Automatic and Semi-Automatic Clothes...

    Science.gov (United States)

    2010-01-01

    Title 10 (Energy), revised as of 2010-01-01: Uniform Test Method for Measuring the Energy Consumption of Automatic and Semi-Automatic Clothes Washers. The provisions of this appendix J1... means for determining the energy consumption of a clothes washer with an adaptive control system...

  6. A semi-automatic system for labelling seafood products and ...

    African Journals Online (AJOL)

    2010-05-10

    ... resource regulation, thereby contributing to an increasing depletion of ... ponents and software development have been carefully designed with solutions geared ... according to allocation in macro-areas and geographical sub-area ... In the LS testing phase, this server was installed on-site at the Consiglio ...

  8. Semi-automatic bubble counting system for superheated droplet detectors

    International Nuclear Information System (INIS)

    Reina, Luiz C.; Bellido, Luis F.; Ramos, Paulo R.; Silva, Ademir X. da; Facure, Alessandro; Dantas, Jose E.R.

    2009-01-01

    Neutron dose rate measurements are normally performed by means of PADC, CR-39 and TLD detectors. However, none of these devices gives an instant reading of the neutron dose. Recently, a new kind of detector has been developed, based on the formation of tiny drops in a superheated liquid suspended in a polymer or gel solution, called the superheated droplet detector (SDD) or bubble detector (BD), which has no response to gamma radiation. This work describes the experimental setup and the procedures developed for acquiring and processing digital images obtained with the bubble detector spectrometer (BDS), developed by Bubble Technology Industries, for personal neutron dosimetry and/or neutron energy fluence measurements in nuclear facilities. The results of the neutron measurements obtained during F-18 production at the RDS-111 cyclotron are presented. These neutron measurements were the first with this type of BDS detector in a particle accelerator facility in Brazil, and they were very important for estimating the neutron dose rate received by occupationally exposed individuals. (author)

  9. Complications in CT-guided, semi-automatic coaxial core biopsy of potentially malignant pulmonary lesions; Komplikationen bei CT-gesteuerter, koaxialer Stanzbiopsie malignomverdaechtiger Lungenherde in halbautomatischer Technik

    Energy Technology Data Exchange (ETDEWEB)

    Schulze, R. [Klinik Loewenstein (Germany). Dept. of Radiology; Seebacher, G.; Enderes, B.; Kugler, G.; Graeter, T.P. [Klinik Loewenstein (Germany). Dept. of Thoracic and Vascular Surgery; Fischer, J.R. [Klinik Loewenstein (Germany). Dept. of Oncology

    2015-08-15

    Histological verification of pulmonary lesions is important to ensure correct treatment. Computed tomographic (CT) transthoracic core biopsy is a well-established procedure for this. Comparison of available studies is difficult though, as technical and patient characteristics vary. Using a standardized biopsy technique, we evaluated our results for CT-guided coaxial core biopsy in a semi-automatic technique. Within 2 years, 664 consecutive transpulmonary biopsies were analyzed retrospectively. All interventions were performed using a 17/18G semi-automatic core biopsy system (4 to 8 specimens). The incidence of complications and technical and patient-dependent risk factors were evaluated. Comparing the histology with the final diagnosis, the sensitivity was 96.3 %, and the specificity was 100 %. 24 procedures were not diagnostic. In all others immunohistological staining was possible. The main complication was pneumothorax (PT, 21.7 %), with chest tube insertion in 6 % of the procedures (n = 40). Bleeding without therapeutic consequences was seen in 43 patients. There was no patient mortality. The rate of PT with chest tube insertion was 9.6 % in emphysema patients and 2.8 % without emphysema (p = 0.001). Smokers with emphysema had a 5 times higher risk of developing PT (p = 0.001). Correlation of tumor size or biopsy angle and the risk of PT was not significant. The risk of developing a PT was associated with an increasing intrapulmonary depth of the lesion (p = 0.001). CT-guided, semiautomatic coaxial core biopsy of the lung is a safe diagnostic procedure. The rate of major complications is low, and the sensitivity and specificity of the procedure are high. Smokers with emphysema are at a significantly higher risk of developing pneumothorax and should be monitored accordingly.

  10. Semi-automatic detection and correction of body organ motion, particularly cardiac motion in SPECT studies

    International Nuclear Information System (INIS)

    Quintana, J.C.; Caceres, F.; Vargas, P.

    2002-01-01

    ... patient and artificially imposed). The method is fast (<20 s) and robust compared with manual or other semi-automatic detection of body organ motions in nuclear medicine studies. Conclusion: A fast and robust semi-automatic patient motion detection and correction method for SPECT studies has been developed.

  11. Accuracy and reproducibility of a novel semi-automatic segmentation technique for MR volumetry of the pituitary gland

    International Nuclear Information System (INIS)

    Renz, Diane M.; Hahn, Horst K.; Rexilius, Jan; Schmidt, Peter; Lentschig, Markus; Pfeil, Alexander; Sauner, Dieter; Fitzek, Clemens; Mentzel, Hans-Joachim; Kaiser, Werner A.; Reichenbach, Juergen R.; Boettcher, Joachim

    2011-01-01

    Although several reports about volumetric determination of the pituitary gland exist, volumetries have been solely performed by indirect measurements or manual tracing on the gland's boundaries. The purpose of this study was to evaluate the accuracy and reproducibility of a novel semi-automatic MR-based segmentation technique. In an initial technical investigation, T1-weighted 3D native magnetised prepared rapid gradient echo sequences (1.5 T) with 1 mm isotropic voxel size achieved high reliability and were utilised in different in vitro and in vivo studies. The computer-assisted segmentation technique was based on an interactive watershed transform after resampling and gradient computation. Volumetry was performed by three observers with different software and neuroradiologic experiences, evaluating phantoms of known volume (0.3, 0.9 and 1.62 ml) and healthy subjects (26 to 38 years; overall 135 volumetries). High accuracy of the volumetry was shown by phantom analysis; measurement errors were <4% with a mean error of 2.2%. In vitro, reproducibility was also promising, with intra-observer variability of 0.7% for observer 1 and 0.3% for observers 2 and 3; mean inter-observer variability was 1.2% in vitro. In vivo, scan-rescan, intra-observer and inter-observer variability showed mean values of 3.2%, 1.8% and 3.3%, respectively. Unifactorial analysis of variance demonstrated no significant differences between pituitary volumes for various MR scans or software calculations in the healthy study groups (p > 0.05). The analysed semi-automatic MR volumetry of the pituitary gland is a valid, reliable and fast technique. Possible clinical applications are hyperplasia or atrophy of the gland in pathological circumstances, either by a single assessment or by monitoring in follow-up studies. (orig.)

  12. Automatic ultrasound image enhancement for 2D semi-automatic breast-lesion segmentation

    Science.gov (United States)

    Lu, Kongkuo; Hall, Christopher S.

    2014-03-01

    Breast cancer is the fastest growing cancer, accounting for 29% of new cases in 2012, and the second leading cause of cancer death among women in the United States and worldwide. Ultrasound (US) has been used as an indispensable tool for breast cancer detection/diagnosis and treatment. In computer-aided assistance, lesion segmentation is a preliminary but vital step, but the task is quite challenging in US images, due to imaging artifacts that complicate detection and measurement of the suspect lesions. The lesions usually present with poor boundary features and vary significantly in size, shape, and intensity distribution between cases. Automatic methods are highly application-dependent, while manual tracing methods are extremely time consuming and have a great deal of intra- and inter-observer variability. Semi-automatic approaches are designed to balance the advantages and drawbacks of the automatic and manual methods. However, considerable user interaction might be necessary to ensure reasonable segmentation for a wide range of lesions. This work proposes an automatic enhancement approach to improve the boundary-searching ability of the live-wire method and reduce the necessary user interaction while keeping the segmentation performance. Based on the results of segmentation of 50 2D breast lesions in US images, less user interaction is required to achieve the desired accuracy, i.e. < 80%, when auto-enhancement is applied for live-wire segmentation.

  13. Semi-automatic delineation using weighted CT-MRI registered images for radiotherapy of nasopharyngeal cancer

    Energy Technology Data Exchange (ETDEWEB)

    Fitton, I. [European Georges Pompidou Hospital, Department of Radiology, 20 rue Leblanc, 75015, Paris (France); Cornelissen, S. A. P. [Image Sciences Institute, UMC, Department of Radiology, P.O. Box 85500, 3508 GA Utrecht (Netherlands); Duppen, J. C.; Rasch, C. R. N.; Herk, M. van [The Netherlands Cancer Institute-Antoni van Leeuwenhoek Hospital, Department of Radiotherapy, Plesmanlaan 121, 1066 CX Amsterdam (Netherlands); Steenbakkers, R. J. H. M. [University Medical Center Groningen, Department of Radiation Oncology, Hanzeplein 1, 9713 GZ Groningen (Netherlands); Peeters, S. T. H. [UZ Gasthuisberg, Herestraat 49, 3000 Leuven, Belgique (Belgium); Hoebers, F. J. P. [Maastricht University Medical Center, Department of Radiation Oncology (MAASTRO clinic), GROW School for Oncology and Development Biology Maastricht, 6229 ET Maastricht (Netherlands); Kaanders, J. H. A. M. [UMC St-Radboud, Department of Radiotherapy, Geert Grooteplein 32, 6525 GA Nijmegen (Netherlands); Nowak, P. J. C. M. [ERASMUS University Medical Center, Department of Radiation Oncology,Groene Hilledijk 301, 3075 EA Rotterdam (Netherlands)

    2011-08-15

    Purpose: To develop a delineation tool that refines physician-drawn contours of the gross tumor volume (GTV) in nasopharynx cancer, using combined pixel value information from x-ray computed tomography (CT) and magnetic resonance imaging (MRI) during delineation. Methods: Operator-guided delineation assisted by a so-called "snake" algorithm was applied on weighted CT-MRI registered images. The physician delineates a rough tumor contour that is continuously adjusted by the snake algorithm using the underlying image characteristics. The algorithm was evaluated on five nasopharyngeal cancer patients. Different linear weightings of CT and MRI were tested as input for the snake algorithm and compared according to contrast and tumor-to-noise ratio (TNR). The semi-automatic delineation was compared with manual contouring by seven experienced radiation oncologists. Results: A good compromise for TNR and contrast was obtained by weighting CT twice as strongly as MRI. The new algorithm did not notably reduce interobserver variability; it did, however, reduce the average delineation time by 6 min per case. Conclusions: The authors developed a user-driven tool for delineation and correction based on a snake algorithm and registered, weighted CT and MR images. The algorithm adds morphological information from CT during the delineation on MRI and accelerates the delineation task.
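
    The reported best compromise, CT weighted twice as strongly as MRI, amounts to a linear blend of the co-registered volumes; the rescaling of both volumes to [0, 1] is an assumed preprocessing step, not stated in the abstract.

        import numpy as np

        def snake_input(ct, mr, w_ct=2.0, w_mr=1.0):
            """Weighted CT-MRI image fed to the snake algorithm; `ct` and `mr`
            are co-registered arrays rescaled to [0, 1] (assumption)."""
            return (w_ct * np.asarray(ct) + w_mr * np.asarray(mr)) / (w_ct + w_mr)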

  14. A semi-automatic calibration method for seismic arrays applied to an Alaskan array

    Science.gov (United States)

    Lindquist, K. G.; Tibuleac, I. M.; Hansen, R. A.

    2001-12-01

    Well-calibrated, small (less than 22 km) aperture seismic arrays are of great importance for event location and characterization. We have implemented the cross-correlation method of Tibuleac and Herrin (Seis. Res. Lett. 1997) as a semi-automatic procedure, applicable to any seismic array. With this we are able to process thousands of phases in several days of computer time on a Sun Blade 1000 workstation. Complicated geology beneath the array elements and elevation differences among the stations made station corrections necessary. 328 core phases (including PcP, PKiKP, PKP, PKKP) were used in order to determine the static corrections. To demonstrate this application and method, we have analyzed P and PcP arrivals at the ILAR array (Eielson, Alaska) between the years 1995-2000. The arrivals were picked by the PIDC, for events (mb > 4.0) well located by the USGS. We calculated backazimuth and horizontal velocity residuals for all events. We observed large backazimuth residuals for regional and near-regional phases. We discuss the possibility of a dipping Moho (strike E-W, dip N) beneath the array versus other local structure that would produce the residuals.
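
    The core of such a calibration is a relative arrival-time measurement per station pair; a minimal version of the cross-correlation lag estimate might look as follows (uniform sampling assumed; the plane-wave fit and static-correction bookkeeping are omitted).

        import numpy as np

        def lag_samples(reference, trace):
            """Delay of `trace` relative to `reference`, in samples, taken as
            the argmax of their full cross-correlation."""
            xc = np.correlate(trace - trace.mean(),
                              reference - reference.mean(), mode="full")
            return int(xc.argmax()) - (len(reference) - 1)

        # With per-station lags across the array, a plane-wave fit yields the
        # observed backazimuth/slowness, and residuals against catalog
        # locations give the static station corrections.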

  15. Evaluation of a semi-automatic radioimmunoassay for hepatitis B surface antigen (HBsAg)

    International Nuclear Information System (INIS)

    Vries, J. de; Kruining, J.; Heijtink, R.A.

    1983-01-01

    The recently developed semi-automatic Hepatube system was evaluated in comparison with another radioimmunoassay for the detection of hepatitis B surface antigen (HBsAg), the manual Ausria II-125 test. After incubation of serum in anti-HBs-coated tubes, the Hepatube system uses a machine to wash the tubes and to add tracer. After a second incubation, the tubes are washed again in the machine and are manually transferred to the β counter. Two machines were used. Machine 1 had an undefined defect. Of 1490 samples tested, 69 (4.6%) gave false-positive results versus 11 (0.7%) in the Ausria II-125 test. Machine 2 had one false-positive result among 920 samples versus 5 in the Ausria II-125 test. The sensitivity was measured with reference panels from Wellcome and Abbott as well as in titration series. The Hepatube system was found to be a factor of three less sensitive than the Ausria II-125 test. The Hepatube processor is easy to handle; radioactive material can be held at a distance during the whole procedure; waste material is limited and less voluminous than in the Ausria II-125 test. (Auth.)

  16. Construction and use of an optical semi-automatic titrator employing the technique of reflectance photometry

    International Nuclear Information System (INIS)

    Hwang, Hoon

    2001-01-01

    An optical semi-automatic titrator was constructed employing the technique of reflectance spectrometry and was tested for the determination of the end points of acid-base, precipitation and EDTA titrations. Since the titrator could successfully determine the end point even in precipitation titrations, where solid particles are formed during the titration process, it was found feasible to design and build a completely automated optical titrator based on the current findings.

  17. Semi-Automatic Operational Service for Drought Monitoring and Forecasting in the Tuscany Region

    Directory of Open Access Journals (Sweden)

    Ramona Magno

    2018-02-01

    A drought-monitoring and forecasting system developed for the Tuscany region was improved in order to provide a semi-automatic, more detailed, timely and comprehensive operational service for decision making, water authorities, researchers and general stakeholders. Ground-based and satellite data from different sources (the regional meteorological station network, the MODIS Terra satellite and the CHIRPS/CRU precipitation datasets) are integrated through an open-source, interoperable SDI (spatial data infrastructure) based on PostgreSQL/PostGIS to produce vegetation and precipitation indices that allow the occurrence and evolution of a drought event to be followed. The SDI allows the dissemination of comprehensive, up-to-date and customizable information suitable for different end-users through different channels, from a web page and monthly bulletins to interoperable web services and a comprehensive climate service. The web services allow geospatial processing on the fly, and the geo-database can be extended with new input/output data to respond to specific requests or to increase the spatial resolution.

  18. Semi-automatic retrieval of definitional information: a northern Sotho ...

    African Journals Online (AJOL)

  19. Method and apparatus for mounting or dismounting a semi-automatic twist-lock

    NARCIS (Netherlands)

    Klein Breteler, A.J.; Tekeli, G.

    2001-01-01

    The invention relates to a method for mounting or dismounting a semi-automatic twistlock at a corner of a deck container, wherein the twistlock is mounted or dismounted on a quayside where a ship may be docked for loading or unloading, in a loading or unloading terminal installed on the quayside,

  20. Semi-automatic construction of reference standards for evaluation of image registration

    NARCIS (Netherlands)

    Murphy, K.; Ginneken, van B.; Klein, S.; Staring, M.; Hoop, de B.J.; Viergever, M.A.; Pluim, J.P.W.

    2011-01-01

    Quantitative evaluation of image registration algorithms is a difficult and under-addressed issue due to the lack of a reference standard in most registration problems. In this work a method is presented whereby detailed reference standard data may be constructed in an efficient semi-automatic

  1. Semi-automatic Citation Correction with Lemon8-XML

    Directory of Open Access Journals (Sweden)

    MJ Suhonos

    2009-03-01

    The Lemon8-XML software application, developed by the Public Knowledge Project (PKP), provides an open-source, computer-assisted interface for reliable citation structuring and validation. Lemon8-XML combines citation parsing algorithms with freely available online indexes such as PubMed, WorldCat, and OAIster. Fully automated markup of entire bibliographies may be a genuine possibility using this approach. Automated markup of citations would increase bibliographic accuracy while reducing copyediting demands.

  2. WiseScaffolder: an algorithm for the semi-automatic scaffolding of Next Generation Sequencing data.

    Science.gov (United States)

    Farrant, Gregory K; Hoebeke, Mark; Partensky, Frédéric; Andres, Gwendoline; Corre, Erwan; Garczarek, Laurence

    2015-09-03

    The sequencing depth provided by high-throughput sequencing technologies has allowed a rise in the number of de novo sequenced genomes that could potentially be closed without further sequencing. However, genome scaffolding and closure require costly human supervision that often results in genomes being published as drafts. A number of automatic scaffolders were recently released, which improved the global quality of genomes published in the last few years. Yet, none of them reach the efficiency of manual scaffolding. Here, we present an innovative semi-automatic scaffolder that additionally helps with chimerae resolution and generates valuable contig maps and outputs for manual improvement of the automatic scaffolding. This software was tested on the newly sequenced marine cyanobacterium Synechococcus sp. WH8103 as well as two reference datasets used in previous studies, Rhodobacter sphaeroides and Homo sapiens chromosome 14 (http://gage.cbcb.umd.edu/). The quality of resulting scaffolds was compared to that of three other stand-alone scaffolders: SSPACE, SOPRA and SCARPA. For all three model organisms, WiseScaffolder produced better results than other scaffolders in terms of contiguity statistics (number of genome fragments, N50, LG50, etc.) and, in the case of WH8103, the reliability of the scaffolds was confirmed by whole genome alignment against a closely related reference genome. We also propose an efficient computer-assisted strategy for manual improvement of the scaffolding, using outputs generated by WiseScaffolder, as well as for genome finishing that in our hands led to the circularization of the WH8103 genome. Altogether, WiseScaffolder proved more efficient than three other scaffolders for both prokaryotic and eukaryotic genomes and is thus likely applicable to most genome projects. The scaffolding pipeline described here should be of particular interest to biologists wishing to take advantage of the high added value of complete genomes.

  3. Semi-automatic breast ultrasound image segmentation based on mean shift and graph cuts.

    Science.gov (United States)

    Zhou, Zhuhuang; Wu, Weiwei; Wu, Shuicai; Tsui, Po-Hsiang; Lin, Chung-Chih; Zhang, Ling; Wang, Tianfu

    2014-10-01

    Computerized tumor segmentation on breast ultrasound (BUS) images remains a challenging task. In this paper, we proposed a new method for semi-automatic tumor segmentation on BUS images using Gaussian filtering, histogram equalization, mean shift, and graph cuts. The only interaction required was to select two diagonal points to determine a region of interest (ROI) on an input image. The ROI image was shrunken by a factor of 2 using bicubic interpolation to reduce computation time. The shrunken image was smoothed by a Gaussian filter and then contrast-enhanced by histogram equalization. Next, the enhanced image was filtered by pyramid mean shift to improve homogeneity. The object and background seeds for graph cuts were automatically generated on the filtered image. Using these seeds, the filtered image was then segmented by graph cuts into a binary image containing the object and background. Finally, the binary image was expanded by a factor of 2 using bicubic interpolation, and the expanded image was processed by morphological opening and closing to refine the tumor contour. The method was implemented with OpenCV 2.4.3 and Visual Studio 2010 and tested for 38 BUS images with benign tumors and 31 BUS images with malignant tumors from different ultrasound scanners. Experimental results showed that our method had a true positive rate (TP) of 91.7%, a false positive (FP) rate of 11.9%, and a similarity (SI) rate of 85.6%. The mean run time on Intel Core 2.66 GHz CPU and 4 GB RAM was 0.49 ± 0.36 s. The experimental results indicate that the proposed method may be useful in BUS image segmentation. © The Author(s) 2014.
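
    The pipeline above reads almost directly as an OpenCV script; the sketch below follows the steps named in the abstract, but uses grabCut with a crude rectangular seed in place of the paper's seeded graph cut and automatic seed generation, both of which are assumptions.

        import cv2
        import numpy as np

        def segment_roi(roi_bgr):
            small = cv2.resize(roi_bgr, None, fx=0.5, fy=0.5,
                               interpolation=cv2.INTER_CUBIC)       # shrink by 2
            small = cv2.GaussianBlur(small, (5, 5), 0)               # denoise
            gray = cv2.equalizeHist(cv2.cvtColor(small, cv2.COLOR_BGR2GRAY))
            small = cv2.cvtColor(gray, cv2.COLOR_GRAY2BGR)
            small = cv2.pyrMeanShiftFiltering(small, sp=10, sr=20)   # mean shift
            mask = np.zeros(small.shape[:2], np.uint8)
            rect = (5, 5, small.shape[1] - 10, small.shape[0] - 10)  # crude seeds
            bgd = np.zeros((1, 65), np.float64)
            fgd = np.zeros((1, 65), np.float64)
            cv2.grabCut(small, mask, rect, bgd, fgd, 5, cv2.GC_INIT_WITH_RECT)
            binary = np.where((mask == 1) | (mask == 3), 255, 0).astype(np.uint8)
            binary = cv2.resize(binary, roi_bgr.shape[1::-1],
                                interpolation=cv2.INTER_CUBIC)       # expand by 2
            kernel = np.ones((5, 5), np.uint8)
            binary = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)
            return cv2.morphologyEx(binary, cv2.MORPH_CLOSE, kernel)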

  4. Accuracy and reproducibility of a novel semi-automatic segmentation technique for MR volumetry of the pituitary gland

    Energy Technology Data Exchange (ETDEWEB)

    Renz, Diane M. [Charite University Medicine Berlin, Campus Virchow Clinic, Department of Radiology, Berlin (Germany); Hahn, Horst K.; Rexilius, Jan [Institute for Medical Image Computing, Fraunhofer MEVIS, Bremen (Germany); Schmidt, Peter [Friedrich-Schiller-University, Jena University Hospital, Institute of Diagnostic and Interventional Radiology, Department of Neuroradiology, Jena (Germany); Lentschig, Markus [MR- and PET/CT Centre Bremen, Bremen (Germany); Pfeil, Alexander [Friedrich-Schiller-University, Jena University Hospital, Department of Internal Medicine III, Jena (Germany); Sauner, Dieter [St. Georg Clinic Leipzig, Hospital Hubertusburg, Department of Radiology, Wermsdorf (Germany); Fitzek, Clemens [Asklepios Clinic Brandenburg, Department of Radiology and Neuroradiology, Brandenburg an der Havel (Germany); Mentzel, Hans-Joachim [Friedrich-Schiller-University, Jena University Hospital, Institute of Diagnostic and Interventional Radiology, Department of Pediatric Radiology, Jena (Germany); Kaiser, Werner A. [Friedrich-Schiller-University, Jena University Hospital, Institute of Diagnostic and Interventional Radiology, Jena (Germany); Reichenbach, Juergen R. [Friedrich-Schiller-University, Jena University Hospital, Medical Physics Group, Institute of Diagnostic and Interventional Radiology, Jena (Germany); Boettcher, Joachim [SRH Clinic Gera, Institute of Diagnostic and Interventional Radiology, Gera (Germany)

    2011-04-15

    Although several reports about volumetric determination of the pituitary gland exist, volumetries have been solely performed by indirect measurements or manual tracing on the gland's boundaries. The purpose of this study was to evaluate the accuracy and reproducibility of a novel semi-automatic MR-based segmentation technique. In an initial technical investigation, T1-weighted 3D native magnetised prepared rapid gradient echo sequences (1.5 T) with 1 mm isotropic voxel size achieved high reliability and were utilised in different in vitro and in vivo studies. The computer-assisted segmentation technique was based on an interactive watershed transform after resampling and gradient computation. Volumetry was performed by three observers with different software and neuroradiologic experiences, evaluating phantoms of known volume (0.3, 0.9 and 1.62 ml) and healthy subjects (26 to 38 years; overall 135 volumetries). High accuracy of the volumetry was shown by phantom analysis; measurement errors were <4% with a mean error of 2.2%. In vitro, reproducibility was also promising with intra-observer variability of 0.7% for observer 1 and 0.3% for observers 2 and 3; mean inter-observer variability was in vitro 1.2%. In vivo, scan-rescan, intra-observer and inter-observer variability showed mean values of 3.2%, 1.8% and 3.3%, respectively. Unifactorial analysis of variance demonstrated no significant differences between pituitary volumes for various MR scans or software calculations in the healthy study groups (p > 0.05). The analysed semi-automatic MR volumetry of the pituitary gland is a valid, reliable and fast technique. Possible clinical applications are hyperplasia or atrophy of the gland in pathological circumstances either by a single assessment or by monitoring in follow-up studies. (orig.)

  5. A semi-automatic technique for measurement of arterial wall from black blood MRI

    International Nuclear Information System (INIS)

    Ladak, Hanif M.; Thomas, Jonathan B.; Mitchell, J. Ross; Rutt, Brian K.; Steinman, David A.

    2001-01-01

    Black blood magnetic resonance imaging (MRI) has become a popular technique for imaging the artery wall in vivo. Its noninvasiveness and high resolution make it ideal for studying the progression of early atherosclerosis in normal volunteers or asymptomatic patients with mild disease. However, the operator variability inherent in the manual measurement of vessel wall area from MR images hinders the reliable detection of relatively small changes in the artery wall over time. In this paper we present a semi-automatic method for segmenting the inner and outer boundary of the artery wall, and evaluate its operator variability using analysis of variance (ANOVA). In our approach, a discrete dynamic contour is approximately initialized by an operator, deformed to the inner boundary, dilated, and then deformed to the outer boundary. A group of four operators performed repeated measurements on 12 images from normal human subjects using both our semi-automatic technique and a manual approach. Results from the ANOVA indicate that the inter-operator standard error of measurement (SEM) of total wall area decreased from 3.254 mm² (manual) to 1.293 mm² (semi-automatic), and the intra-operator SEM decreased from 3.005 mm² to 0.958 mm². Operator reliability coefficients increased from less than 69% to more than 91% (inter-operator) and 95% (intra-operator). The minimum detectable change in wall area improved from more than 8.32 mm² (intra-operator, manual) to less than 3.59 mm² (inter-operator, semi-automatic), suggesting that it is better to have multiple operators measure wall area with our semi-automatic technique than to have a single operator make repeated measurements manually. Similar improvements in wall thickness and lumen radius measurements were also recorded. Since the semi-automatic technique has effectively ruled out the effect of the operator on these measurements, it may be possible to use such techniques to expand prospective studies of atherogenesis to multiple
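
    A rough sketch of the deform-dilate-deform sequence, substituting scikit-image's active_contour for the authors' discrete dynamic contour; the smoothing, snake parameters and 15% dilation factor are invented for the example.

        import numpy as np
        from skimage.filters import gaussian
        from skimage.segmentation import active_contour

        def wall_boundaries(image, init_contour, dilation=1.15):
            smooth = gaussian(image, sigma=2)              # suppress noise first
            inner = active_contour(smooth, init_contour,
                                   alpha=0.015, beta=10.0, gamma=0.001)
            centre = inner.mean(axis=0)
            pushed = centre + dilation * (inner - centre)  # dilate past the wall
            outer = active_contour(smooth, pushed,
                                   alpha=0.015, beta=10.0, gamma=0.001)
            return inner, outer                            # lumen and outer wall

    Total wall area then follows as the difference of the two enclosed polygon areas (e.g. via the shoelace formula), matching the wall-area measurements analysed in the study.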

  6. Semi-automatic version of the potentiometric titration method for characterization of uranium compounds

    International Nuclear Information System (INIS)

    Cristiano, Bárbara F.G.; Delgado, José Ubiratan; Wanderley S da Silva, José; Barros, Pedro D. de; Araújo, Radier M.S. de; Dias, Fábio C.; Lopes, Ricardo T.

    2012-01-01

    The potentiometric titration method was used for characterization of uranium compounds to be applied in intercomparison programs. The method is applied with traceability assured using a potassium dichromate primary standard. A semi-automatic version was developed to reduce the analysis time and the operator variation. The standard uncertainty in determining the total concentration of uranium was around 0.01%, which is suitable for uranium characterization and compatible with those obtained by manual techniques. - Highlights: ► A semi-automatic potentiometric titration method was developed for U characterization. ► K₂Cr₂O₇ was the only certified reference material used. ► Values obtained for U₃O₈ samples were consistent with certified values. ► Uncertainty of 0.01% was useful for characterization and intercomparison programs.

  7. Semi-automatic watershed medical image segmentation methods for customized cancer radiation treatment planning simulation

    International Nuclear Information System (INIS)

    Kum Oyeon; Kim Hye Kyung; Max, N.

    2007-01-01

    A cancer radiation treatment planning simulation requires image segmentation to define the gross tumor volume, clinical target volume, and planning target volume. Manual segmentation, which is usual in clinical settings, depends on the operator's experience and may, in addition, change for every trial by the same operator. To overcome this difficulty, we developed semi-automatic watershed medical image segmentation tools using both the top-down watershed algorithm in the Insight Segmentation and Registration Toolkit (ITK) and Vincent-Soille's bottom-up watershed algorithm with region merging. We applied our algorithms to segment two- and three-dimensional head phantom CT data and to find pixel (or voxel) counts for each segmented area, which are needed for radiation treatment optimization. A semi-automatic method is useful to avoid errors incurred by both human and machine sources, and provides clear and visible information for pedagogical purposes. (orig.)
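
    A marker-controlled variant in the same spirit, sketched with scikit-image rather than the ITK implementation the paper uses; the operator-supplied seed labels are the semi-automatic step, and the label conventions are illustrative.

        import numpy as np
        from skimage.filters import sobel
        from skimage.segmentation import watershed

        def segment(volume, seeds):
            """`seeds`: integer label image (1 = target volume, 2 = background,
            0 = unlabeled), e.g. painted by the operator."""
            gradient = sobel(volume)                # flood the gradient magnitude
            labels = watershed(gradient, markers=seeds)
            voxels = np.bincount(labels.ravel())    # voxel counts per region,
            return labels, voxels                   # as needed for planning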

  8. Semi-automatic handling of meteorological ground measurements using WeatherProg: prospects and practical implications

    Science.gov (United States)

    Langella, Giuliano; Basile, Angelo; Bonfante, Antonello; De Mascellis, Roberto; Manna, Piero; Terribile, Fabio

    2016-04-01

    WeatherProg is a computer program for the semi-automatic handling of data measured at ground stations within a climatic network. The program performs a set of tasks ranging from gathering raw point-based sensor measurements to the production of digital climatic maps. Originally the program was developed as the baseline asynchronous engine for weather records management within the SOILCONSWEB Project (LIFE08 ENV/IT/000408), in which daily and hourly data were used to run water balances in the soil-plant-atmosphere continuum or pest simulation models. WeatherProg can be configured to automatically perform the following main operations: 1) data retrieval; 2) data decoding and ingestion into a database (e.g. SQL based); 3) data checking to recognize missing and anomalous values (using a set of differently combined checks including logical, climatological, spatial, temporal and persistence checks); 4) infilling of data flagged as missing or anomalous (deterministic or statistical methods); 5) spatial interpolation based on alternative/comparative methods such as inverse distance weighting, iterative regression kriging, and weighted least squares regression (based on physiography), using an approach similar to PRISM; 6) data ingestion into a geodatabase (e.g. PostgreSQL+PostGIS or rasdaman). There is an increasing demand for digital climatic maps both for research and development (there is a gap between the majority of scientific modelling approaches, which require digital climate maps, and the available gauged measurements) and for practical applications (e.g. the need to improve the management of weather records, which in turn improves the support provided to farmers). The demand is particularly burdensome considering the requirement to handle climatic data at the daily (e.g. in soil hydrological modelling) or even at the hourly time step (e.g. risk modelling in phytopathology). The key advantage of WeatherProg is the ability to perform all the required operations and…
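
    Steps 3) and 5) of the list above are the most algorithmic. A compact sketch of a range check, a persistence check and inverse distance weighting, with invented thresholds and station data (WeatherProg's actual rules and parameters are not given in this abstract):

```python
import numpy as np

def range_check(values, lo=-40.0, hi=50.0):
    """Flag readings outside a plausible climatological range (step 3)."""
    v = np.asarray(values, dtype=float)
    return (v < lo) | (v > hi)

def persistence_check(values, window=6, tol=1e-6):
    """Flag runs of near-identical readings, a typical stuck-sensor symptom."""
    v = np.asarray(values, dtype=float)
    flags = np.zeros(v.size, dtype=bool)
    for i in range(v.size - window + 1):
        if np.ptp(v[i:i + window]) < tol:
            flags[i:i + window] = True
    return flags

def idw(station_xy, station_vals, grid_xy, power=2.0):
    """Inverse distance weighting of station values onto grid points (step 5)."""
    d = np.linalg.norm(grid_xy[:, None, :] - station_xy[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-9) ** power
    return (w * station_vals).sum(axis=1) / w.sum(axis=1)

temps = np.array([12.0, 12.0, 12.0, 12.0, 12.0, 12.0, 13.5, 99.0])
print(range_check(temps))        # the 99.0 reading is flagged
print(persistence_check(temps))  # the stuck run of 12.0 readings is flagged

stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
grid = np.array([[5.0, 5.0], [1.0, 1.0]])
print(idw(stations, np.array([12.0, 14.0, 11.0]), grid))
```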

  9. Hera-FFX: a Firefox add-on for Semi-automatic Web Accessibility Evaluation

    OpenAIRE

    Fuertes Castro, José Luis; González, Ricardo; Gutiérrez, Emmanuelle; Martínez Normand, Loïc

    2009-01-01

    Website accessibility evaluation is a complex task requiring a combination of human expertise and software support. There are several online and offline tools to support the manual web accessibility evaluation process. However, they all have some weaknesses because none of them includes all the desired features. In this paper we present Hera-FFX, an add-on for the Firefox web browser that supports semi-automatic web accessibility evaluation.

  10. Semi-automatic version of the potentiometric titration method for characterization of uranium compounds.

    Science.gov (United States)

    Cristiano, Bárbara F G; Delgado, José Ubiratan; da Silva, José Wanderley S; de Barros, Pedro D; de Araújo, Radier M S; Dias, Fábio C; Lopes, Ricardo T

    2012-09-01

    The potentiometric titration method was used for characterization of uranium compounds to be applied in intercomparison programs. The method is applied with traceability assured using a potassium dichromate primary standard. A semi-automatic version was developed to reduce the analysis time and the operator variation. The standard uncertainty in determining the total concentration of uranium was around 0.01%, which is suitable for uranium characterization and compatible with those obtained by manual techniques. Copyright © 2012 Elsevier Ltd. All rights reserved.

  11. Comparison of manual and semi-automatic measuring techniques in MSCT scans of patients with lymphoma: a multicentre study

    Energy Technology Data Exchange (ETDEWEB)

    Hoeink, A.J.; Wessling, J.; Schuelke, C.; Kohlhase, N.; Wassenaar, L.; Heindel, W.; Buerke, B. [University Hospital Muenster, Department of Clinical Radiology, Muenster (Germany); Koch, R. [University of Muenster, Institute of Biostatistics and Clinical Research (IBKF), Muenster (Germany); Mesters, R.M. [University Hospital Muenster, Department of Haematology and Oncology, Muenster (Germany); D' Anastasi, M.; Graser, A.; Karpitschka, M. [University Hospital Muenchen (LMU), Institute of Clinical Radiology, Muenchen (Germany); Fabel, M.; Wulff, A. [University Hospital Kiel, Department of Clinical Radiology, Kiel (Germany); Pinto dos Santos, D. [University Hospital Mainz, Department of Diagnostic and Interventional Radiology, Mainz (Germany); Kiessling, A. [University Hospital Marburg, Department of Diagnostic and Interventional Radiology, Marburg (Germany); Dicken, V.; Bornemann, L. [Institute of Medical Imaging Computing, Fraunhofer MeVis, Bremen (Germany)

    2014-11-15

    Multicentre evaluation of the precision of semi-automatic 2D/3D measurements in comparison to manual, linear measurements of lymph nodes regarding their inter-observer variability in multi-slice CT (MSCT) of patients with lymphoma. MSCT data of 63 patients were interpreted before and after chemotherapy by one/two radiologists in five university hospitals. In 307 lymph nodes, short (SAD)/long (LAD) axis diameter and WHO area were determined manually and semi-automatically. Volume was calculated solely semi-automatically. To determine the precision of the individual parameters, a mean was calculated for every lymph node/parameter. Deviation of the measured parameters from this mean was evaluated separately. Statistical analysis entailed intraclass correlation coefficients (ICC) and Kruskal-Wallis tests. Median relative deviations of semi-automatic parameters were smaller than deviations of manually assessed parameters, e.g. semi-automatic SAD 5.3 vs. manual 6.5 %. Median variations among different study sites were smaller if the measurement was conducted semi-automatically, e.g. manual LAD 5.7/4.2 % vs. semi-automatic 3.4/3.4 %. Semi-automatic volumetry was superior to the other parameters (2.8 %). Semi-automatic determination of different lymph node parameters is (compared to manually assessed parameters) associated with a slightly greater precision and a marginally lower inter-observer variability. These results are of importance with regard to the increasing mobility of patients among different medical centres and to the quality management of multicentre trials. (orig.)

  12. Semi-Automatic Electronic Stent Register: a novel approach to preventing ureteric stents lost to follow up.

    Science.gov (United States)

    Macneil, James W H; Michail, Peter; Patel, Manish I; Ashbourne, Julie; Bariol, Simon V; Ende, David A; Hossack, Tania A; Lau, Howard; Wang, Audrey C; Brooks, Andrew J

    2017-10-01

    Ureteric stents are indispensable tools in modern urology; however, the risk of them not being followed up once inserted poses medical and medico-legal risks. Stent registers are a common solution to mitigate this risk; however, manual registers are logistically challenging, especially for busy units. Western Sydney Local Health District developed a novel Semi-Automatic Electronic Stent Register (SAESR) utilizing billing information to track stent insertions. To determine the utility of this system, an audit was conducted comparing the 6 months before the introduction of the register to the first 6 months of the register. In the first 6 months of the register, 457 stents were inserted. At the time of writing, two of these are severely delayed for removal, representing a rate of 0.4%. In the 6 months immediately preceding the introduction of the register, 497 stents were inserted, and six were either missed completely or severely delayed in their removal, representing a rate of 1.2%. A non-inferiority analysis found this to be no worse than the results achieved before the introduction of the register. The SAESR allowed us to improve upon our better-than-expected rate of stents lost to follow up or severely delayed. We demonstrated non-inferiority in the rate of lost or severely delayed stents, and a number of other advantages including savings in personnel costs. The semi-automatic register represents an effective way of reducing the risk associated with a common urological procedure. We believe that this methodology could be implemented elsewhere. © 2017 Royal Australasian College of Surgeons.
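
    The register's core mechanism, pairing insertion and removal billing events and flagging stents with no removal within a deadline, can be sketched in a few lines of pandas. The item codes, the 6-month threshold and the table layout are hypothetical; the real billing schema is not described in the abstract:

```python
import pandas as pd

# Hypothetical extract of billing records (one stent per patient assumed,
# for brevity); the real register is built from the district's billing feed.
billing = pd.DataFrame({
    "patient_id": [101, 102, 101, 103],
    "item_code": ["STENT_INS", "STENT_INS", "STENT_REM", "STENT_INS"],
    "date": pd.to_datetime(["2017-01-10", "2017-02-01",
                            "2017-04-02", "2017-03-15"]),
})

inserted = billing[billing["item_code"] == "STENT_INS"]
removed = billing[billing["item_code"] == "STENT_REM"]

# A stent is overdue if no removal is billed within ~6 months of insertion.
report = inserted.merge(removed, on="patient_id", how="left",
                        suffixes=("_in", "_out"))
overdue = report[report["date_out"].isna() |
                 (report["date_out"] - report["date_in"] > pd.Timedelta(days=182))]
print(overdue[["patient_id", "date_in", "date_out"]])
```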

  13. Efficient Semi-Automatic 3D Segmentation for Neuron Tracing in Electron Microscopy Images

    Science.gov (United States)

    Jones, Cory; Liu, Ting; Cohan, Nathaniel Wood; Ellisman, Mark; Tasdizen, Tolga

    2015-01-01

    Background: In the area of connectomics, there is a significant gap between the time required for data acquisition and dense reconstruction of the neural processes contained in the same dataset. Automatic methods are able to eliminate this timing gap, but the state-of-the-art accuracy so far is insufficient for use without user corrections. If completed naively, this process of correction can be tedious and time consuming. New method: We present a new semi-automatic method that can be used to perform 3D segmentation of neurites in EM image stacks. It utilizes an automatic method that creates a hierarchical structure for recommended merges of superpixels. The user is then guided through each predicted region to quickly identify errors and establish correct links. Results: We tested our method on three datasets with both novice and expert users. Accuracy and timing were compared with published automatic, semi-automatic, and manual results. Comparison with existing methods: Post-automatic correction methods have also been used in [1] and [2]. These methods do not provide navigation or suggestions in the manner we present. Other semi-automatic methods, such as [3] and [4], require user input prior to the automatic segmentation and are inherently different from our method. Conclusion: Using this method on the three datasets, novice users achieved accuracy exceeding state-of-the-art automatic results, and expert users achieved accuracy on par with full manual labeling but with a 70% time improvement when compared with other published examples. PMID:25769273

  14. Development and evaluation of new semi-automatic TLD reader software

    International Nuclear Information System (INIS)

    Pathan, M.S.; Pradhan, S.M.; Palani Selvam, T.; Datta, D.

    2018-01-01

    Nowadays, technological advancement is primarily focused on creating a user-friendly environment for operating any machine and on minimizing human error through the automation of procedures. The present study describes the development and evaluation of new software for the semi-automatic TLD badge reader (TLDBR-7B). The software provides an interactive interface and is compatible with the latest Windows OS as well as the USB mode of data communication. Important new features of the software are automatic glow-curve analysis for identifying any abnormality, an event log register, user-defined limits on TL count and on the time of temperature stabilization for readout interruption, and automatic reading resumption options.

  15. Method of semi-automatic high precision potentiometric titration for characterization of uranium compounds

    International Nuclear Information System (INIS)

    Cristiano, Barbara Fernandes G.; Dias, Fabio C.; Barros, Pedro D. de; Araujo, Radier Mario S. de; Delgado, Jose Ubiratan; Silva, Jose Wanderley S. da; Lopes, Ricardo T.

    2011-01-01

    The method of high-precision potentiometric titration is widely used in the certification and characterization of uranium compounds. In order to reduce the analysis time and diminish the influence of the analyst, a semi-automatic version of the method was developed at the safeguards laboratory of CNEN-RJ, Brazil. The method was applied with traceability guaranteed by the use of a potassium dichromate primary standard. The combined standard uncertainty in the determination of the total uranium concentration was of the order of 0.01%, which compares favourably with the methods traditionally used by nuclear installations, which are of the order of 0.1%.

  16. A semi-automatic method for developing an anthropomorphic numerical model of dielectric anatomy by MRI

    International Nuclear Information System (INIS)

    Mazzurana, M; Sandrini, L; Vaccari, A; Malacarne, C; Cristoforetti, L; Pontalti, R

    2003-01-01

    Complex permittivity values have a dominant role in the overall consideration of the interaction between radiofrequency electromagnetic fields and living matter, and in related applications such as electromagnetic dosimetry. There are still some concerns about the accuracy of published data and about their variability due to the heterogeneous nature of biological tissues. The aim of this study is to provide an alternative semi-automatic method by which numerical dielectric human models for dosimetric studies can be obtained. Magnetic resonance imaging (MRI) tomography was used to acquire images. A new technique was employed to correct nonuniformities in the images, and frequency-dependent transfer functions correlating image intensity with complex permittivity were used. The proposed method provides frequency-dependent models in which permittivity and conductivity vary with continuity, even in the same tissue, reflecting the intrinsic realistic spatial dispersion of such parameters. The human model is tested with an FDTD (finite difference time domain) algorithm at different frequencies; the results of layer-averaged and whole-body-averaged SAR (specific absorption rate) are compared with published work, and reasonable agreement has been found. Due to the short time needed to obtain a whole-body model, this semi-automatic method may be suitable for efficient study of various conditions that can determine large differences in the SAR distribution, such as body shape, posture, fat-to-muscle ratio, height and weight.

  17. Semi-automatic quantitative measurements of intracranial internal carotid artery stenosis and calcification using CT angiography

    International Nuclear Information System (INIS)

    Bleeker, Leslie; Berg, Rene van den; Majoie, Charles B.; Marquering, Henk A.; Nederkoorn, Paul J.

    2012-01-01

    Intracranial carotid artery atherosclerotic disease is an independent predictor of recurrent stroke. However, its quantitative assessment is not routinely performed in clinical practice. In this diagnostic study, we present and evaluate a novel semi-automatic application to quantitatively measure intracranial internal carotid artery (ICA) degree of stenosis and calcium volume in CT angiography (CTA) images. In this retrospective study involving CTA images of 88 consecutive patients, intracranial ICA stenosis was quantitatively measured by two independent observers. Stenoses were categorized with cutoff values of 30% and 50%. The calcification in the intracranial ICA was qualitatively categorized as absent, mild, moderate, or severe and quantitatively measured using the semi-automatic application. Linear weighted kappa values were calculated to assess the interobserver agreement of the stenosis and calcium categorization. The average and the standard deviation of the quantitative calcium volume were calculated for the calcium categories. For the stenosis measurements, the CTA images of 162 arteries yielded an interobserver correlation of 0.78 (P < 0.001). Kappa values of the categorized stenosis measurements were moderate: 0.45 and 0.58 for cutoff values of 30% and 50%, respectively. The kappa value for the calcium categorization was 0.62, with a good agreement between the qualitative and quantitative calcium assessment. Quantitative measurement of the degree of stenosis of the intracranial ICA on CTA is feasible with good interobserver agreement. Qualitative calcium categorization agrees well with quantitative measurements. (orig.)
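
    For reference, a linear weighted kappa like the one used for the stenosis categories can be reproduced with scikit-learn; the two observer ratings below are fabricated for illustration:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical stenosis categories from two observers
# (0: <30%, 1: 30-50%, 2: >50%).
obs1 = [0, 1, 2, 1, 0, 2, 1, 0, 2, 1]
obs2 = [0, 1, 1, 1, 0, 2, 2, 0, 2, 0]

# Linear weights penalize disagreements by how many categories apart they are.
kappa = cohen_kappa_score(obs1, obs2, weights="linear")
print(f"linear weighted kappa = {kappa:.2f}")
```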

  18. Evaluation of Semi-Automatic Metadata Generation Tools: A Survey of the Current State of the Art

    Directory of Open Access Journals (Sweden)

    Jung-ran Park

    2015-09-01

    Assessment of the current landscape of semi-automatic metadata generation tools is particularly important considering the rapid development of digital repositories and the recent explosion of big data. Utilization of (semi-)automatic metadata generation is critical in addressing these environmental changes and may be unavoidable in the future considering the costly and complex operation of manual metadata creation. To address such needs, this study examines the range of semi-automatic metadata generation tools (n=39) while providing an analysis of their techniques, features, and functions. The study focuses on open-source tools that can be readily utilized in libraries and other memory institutions. The challenges and current barriers to implementation of these tools were identified. The greatest area of difficulty lies in the fact that the piecemeal development of most semi-automatic generation tools only addresses part of the issue of semi-automatic metadata generation, providing solutions to one or a few metadata elements but not the full range of elements. This indicates that significant local efforts will be required to integrate the various tools into a coherent working whole. Suggestions toward such efforts are presented for future developments that may assist information professionals with the incorporation of semi-automatic tools within their daily workflows.

  19. Semi-automatic image analysis methodology for the segmentation of bubbles and drops in complex dispersions occurring in bioreactors

    Science.gov (United States)

    Taboada, B.; Vega-Alvarado, L.; Córdova-Aguilar, M. S.; Galindo, E.; Corkidi, G.

    2006-09-01

    Characterization of multiphase systems occurring in fermentation processes is a time-consuming and tedious process when manual methods are used. This work describes a new semi-automatic methodology for the on-line assessment of the diameters of oil drops and air bubbles occurring in a complex simulated fermentation broth. High-quality digital images were obtained from the interior of a mechanically stirred tank. These images were pre-processed to find segments of edges belonging to the objects of interest. The contours of air bubbles and oil drops were then reconstructed using an improved Hough transform algorithm, which was tested in two-, three- and four-phase simulated fermentation model systems. The results were compared against those obtained manually by a trained observer, showing no significant statistical differences. The method was able to reduce the total processing time for the measurements of bubbles and drops in different systems by 21-50% and the manual intervention time for the segmentation procedure by 80-100%.
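
    The paper's detector is an improved Hough transform; the standard circular Hough transform in scikit-image shows the underlying principle on a synthetic two-bubble image (all geometry invented for the sketch):

```python
import numpy as np
from skimage.draw import circle_perimeter
from skimage.feature import canny
from skimage.transform import hough_circle, hough_circle_peaks

# Synthetic image with two "bubbles" (bright rings) standing in for
# photographs of the stirred-tank broth.
img = np.zeros((120, 120))
for r0, c0, rad in [(40, 40, 15), (80, 75, 22)]:
    rr, cc = circle_perimeter(r0, c0, rad)
    img[rr, cc] = 1.0

edges = canny(img, sigma=1.0)
radii = np.arange(10, 30, 1)
accumulator = hough_circle(edges, radii)
_, cx, cy, found_radii = hough_circle_peaks(accumulator, radii, total_num_peaks=2)
for x, y, r in zip(cx, cy, found_radii):
    print(f"circle at ({x}, {y}), radius {r} px -> diameter {2 * r} px")
```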

  20. The Semi-automatic Synthesis of 18F-fluoroethyl-choline by Domestic FDG Synthesizer

    Directory of Open Access Journals (Sweden)

    ZHOU Ming

    2016-02-01

    As an important complementary imaging agent to 18F-FDG, 18F-fluoroethyl-choline (18F-FECH) has been demonstrated to be promising in brain and prostate cancer imaging. Using the domestic PET-FDG-IT-I CPCU synthesizer, 18F-FECH was synthesized with different reagents and consumable supplies. A C18 column was added before the product collection bottle to remove K2.2.2. 18F-FECH was synthesized efficiently by the PET-FDG-IT-I synthesizer in about 30 minutes, with a radiochemical yield of 42.0% (no decay correction, n=5), and the radiochemical purity was still more than 99.0% after 6 hours. The results showed that the domestic PET-FDG-IT-I synthesizer can semi-automatically synthesize injectable 18F-FECH with high efficiency and radiochemical purity.

  1. Pulmonary lobar volumetry using novel volumetric computer-aided diagnosis and computed tomography

    Science.gov (United States)

    Iwano, Shingo; Kitano, Mariko; Matsuo, Keiji; Kawakami, Kenichi; Koike, Wataru; Kishimoto, Mariko; Inoue, Tsutomu; Li, Yuanzhong; Naganawa, Shinji

    2013-01-01

    OBJECTIVES: To compare the accuracy of pulmonary lobar volumetry using the conventional number-of-segments method and novel volumetric computer-aided diagnosis using 3D computed tomography images. METHODS: We acquired 50 consecutive preoperative 3D computed tomography examinations for lung tumours reconstructed at 1-mm slice thickness. We calculated the lobar volume and the emphysematous lobar volume […] The volumetric computer-aided diagnosis system could more precisely measure lobar volumes than the conventional number-of-segments method. Because semi-automatic computer-aided diagnosis and automatic computer-aided diagnosis were complementary, in clinical use it would be more practical to first measure volumes by automatic computer-aided diagnosis, and then use semi-automatic measurements if automatic computer-aided diagnosis failed. PMID:23526418

  2. Response evaluation of malignant liver lesions after TACE/SIRT. Comparison of manual and semi-automatic measurement of different response criteria in multislice CT

    International Nuclear Information System (INIS)

    Hoeink, Anna Janina

    2017-01-01

    To compare measurement precision and interobserver variability in the evaluation of hepatocellular carcinoma (HCC) and liver metastases in MSCT before and after transarterial local ablative therapies. Retrospective study of 72 patients with malignant liver lesions (42 metastases; 30 HCCs) before and after therapy (43 SIRT procedures; 29 TACE procedures). Established (LAD; SAD; WHO) and vitality-based parameters (mRECIST; mLAD; mSAD; EASL) were assessed manually and semi-automatically by two readers. The relative interobserver difference (RID) and intraclass correlation coefficient (ICC) were calculated. The median RID for vitality-based parameters was lower for semi-automatic than for manual measurement of mLAD (manual 12.5 %; semi-automatic 3.4 %), mSAD (manual 12.7 %; semi-automatic 5.7 %) and EASL (manual 10.4 %; semi-automatic 1.8 %). The difference in established parameters was not statistically noticeable (p > 0.05). The ICCs of LAD (manual 0.984; semi-automatic 0.982), SAD (manual 0.975; semi-automatic 0.958) and WHO (manual 0.984; semi-automatic 0.978) are high, both in manual and semi-automatic measurements. The ICCs of manual measurements of mLAD (0.897), mSAD (0.844) and EASL (0.875) are lower. This decrease is not found in semi-automatic measurements of mLAD (0.997), mSAD (0.992) and EASL (0.998). In conclusion, vitality-based tumor measurements of HCC and metastases after transarterial local therapies should be performed semi-automatically due to the greater measurement precision, thus increasing the reproducibility and in turn the reliability of therapeutic decisions.
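
    ICC figures like those above can be reproduced from first principles. A sketch of ICC(2,1) (two-way random effects, absolute agreement, single rater), following the Shrout-Fleiss formulation, on invented reader measurements:

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    ratings: 2D array, shape (targets, raters)."""
    y = np.asarray(ratings, dtype=float)
    n, k = y.shape
    grand = y.mean()
    ms_rows = k * ((y.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    ms_cols = n * ((y.mean(axis=0) - grand) ** 2).sum() / (k - 1)
    sse = ((y - y.mean(axis=1, keepdims=True)
              - y.mean(axis=0, keepdims=True) + grand) ** 2).sum()
    ms_err = sse / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# Hypothetical lesion diameters (mm), 6 lesions x 2 readers.
readers = np.array([[31.0, 32.5], [18.2, 18.0], [25.4, 26.1],
                    [40.3, 39.8], [22.0, 21.5], [35.7, 36.2]])
print(f"ICC(2,1) = {icc_2_1(readers):.3f}")
```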

  3. Response evaluation of malignant liver lesions after TACE/SIRT. Comparison of manual and semi-automatic measurement of different response criteria in multislice CT

    Energy Technology Data Exchange (ETDEWEB)

    Hoeink, Anna Janina [Univ. Hospital Cologne (Germany). Diagnostic and Interventional Radiology; Schuelke, Christoph; Loehnert, Annika; Kammerer, Sara; Fortkamp, Rasmus; Heindel, Walter; Buerke, Boris [Univ. Hospital Muenster (UKM), Muenster (Germany). Dept. of Clinical Radiology; Koch, Raphael [Univ. Hospital Muenster (UKM), Muenster (Germany). Inst. of Biostatistics and Clinical Research (IBKF)

    2017-11-15

    To compare measurement precision and interobserver variability in the evaluation of hepatocellular carcinoma (HCC) and liver metastases in MSCT before and after transarterial local ablative therapies. Retrospective study of 72 patients with malignant liver lesions (42 metastases; 30 HCCs) before and after therapy (43 SIRT procedures; 29 TACE procedures). Established (LAD; SAD; WHO) and vitality-based parameters (mRECIST; mLAD; mSAD; EASL) were assessed manually and semi-automatically by two readers. The relative interobserver difference (RID) and intraclass correlation coefficient (ICC) were calculated. The median RID for vitality-based parameters was lower for semi-automatic than for manual measurement of mLAD (manual 12.5 %; semi-automatic 3.4 %), mSAD (manual 12.7 %; semi-automatic 5.7 %) and EASL (manual 10.4 %; semi-automatic 1.8 %). The difference in established parameters was not statistically noticeable (p > 0.05). The ICCs of LAD (manual 0.984; semi-automatic 0.982), SAD (manual 0.975; semi-automatic 0.958) and WHO (manual 0.984; semi-automatic 0.978) are high, both in manual and semi-automatic measurements. The ICCs of manual measurements of mLAD (0.897), mSAD (0.844) and EASL (0.875) are lower. This decrease is not found in semi-automatic measurements of mLAD (0.997), mSAD (0.992) and EASL (0.998). In conclusion, vitality-based tumor measurements of HCC and metastases after transarterial local therapies should be performed semi-automatically due to the greater measurement precision, thus increasing the reproducibility and in turn the reliability of therapeutic decisions.

  4. Evaluation of ventricular dysfunction using semi-automatic longitudinal strain analysis of four-chamber cine MR imaging.

    Science.gov (United States)

    Kawakubo, Masateru; Nagao, Michinobu; Kumazawa, Seiji; Yamasaki, Yuzo; Chishaki, Akiko S; Nakamura, Yasuhiko; Honda, Hiroshi; Morishita, Junji

    2016-02-01

    The aim of this study was to evaluate ventricular dysfunction using longitudinal strain analysis in 4-chamber (4CH) cine MR imaging, and to investigate the agreement between the semi-automatic and manual measurements in the analysis. Fifty-two consecutive patients with ischemic or non-ischemic cardiomyopathy and repaired tetralogy of Fallot who underwent cardiac MR examination incorporating cine MR imaging were retrospectively enrolled. The LV and RV longitudinal strain values were obtained both semi-automatically and manually. Receiver operating characteristic (ROC) analysis was performed to determine the optimal cutoff of the minimum longitudinal strain value for the detection of patients with cardiac dysfunction. The correlations between manual and semi-automatic measurements for the LV and RV walls were analyzed by Pearson coefficient analysis. ROC analysis demonstrated the optimal cutoffs of the minimum longitudinal strain values (εL_min) for diagnosing LV and RV dysfunction with high accuracy (LV εL_min = -7.8 %: area under the curve, 0.89; sensitivity, 83 %; specificity, 91 %; RV εL_min = -15.7 %: area under the curve, 0.82; sensitivity, 92 %; specificity, 68 %). Excellent correlations between manual and semi-automatic measurements for the LV and RV free wall were observed (LV, r = 0.97, p < 0.001). Semi-automatic longitudinal strain analysis of 4CH cine MR imaging can evaluate LV and RV dysfunction with simple and easy measurements. The strain analysis could have extensive application in cardiac imaging for various clinical cases.
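
    A cutoff such as εL_min = -7.8 % is typically chosen by maximizing Youden's J along the ROC curve. A sketch with scikit-learn on fabricated strain values, taking less-negative strain as indicating dysfunction:

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Hypothetical minimum longitudinal strain values (%) and dysfunction labels.
strain = np.array([-20.1, -18.4, -15.2, -9.8, -7.5, -6.1, -12.3, -5.0, -16.8, -8.9])
dysfunction = np.array([0, 0, 0, 1, 1, 1, 0, 1, 0, 1])

# Less-negative strain indicates dysfunction, so the raw value is the score.
fpr, tpr, thresholds = roc_curve(dysfunction, strain)
youden_j = tpr - fpr
best = np.argmax(youden_j)
print(f"AUC = {roc_auc_score(dysfunction, strain):.2f}, "
      f"optimal cutoff = {thresholds[best]:.1f} % "
      f"(sens {tpr[best]:.2f}, spec {1 - fpr[best]:.2f})")
```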

  5. Treatment choice for osteoarthritis of the knee joint according to semi-automatic MRI based assessment of disease severity

    International Nuclear Information System (INIS)

    Sasho, Takahisa; Suzuki, Masahiko; Nakagawa, Koichi; Ochiai, Nobuyasu; Matsuki, Megumi; Takahashi, Kazuhisa; Moriya, Hideshige

    2008-01-01

    Objective assessment of the disease severity of osteoarthritis of the knee joint (OA knee) is fundamental to establishing an adequate treatment system. Regrettably, no such reliable system exists. Grading systems based upon X-ray findings or measurement of joint space narrowing are widely used for this purpose, but they are still far from satisfactory. Our previous study elucidated that measuring the irregularity of the contour of the femoral condyle on MRI (irregularity index) using newly developed software enabled us to assess the disease severity of OA objectively. The advantages of this system are that it expresses severity as a metric variable and that it is semi-automatic. In the present study, we examined the relationship between treatment selection and the irregularity index. Sixty-one medial-type OA knees that received total knee arthroplasty (TKA), arthroscopic surgery (AS), or conservative treatment (CT) were involved. Their X-ray grading and irregularity index were recorded at the time of the corresponding treatment. The irregularity indices of the groups were compared. For the AS group, pre- and post-operative knee scores employing the JOA score were also examined to study the relationship between the irregularity index and the improvement of the knee score. All four parameters that represent the irregularity of the femoral condyle were significantly higher in the TKA group than in the AS group, whereas no significant difference was observed between the AS group and the CT group. A negative correlation was observed between the irregularity index and the improvement of the knee score after arthroscopic surgery. Although treatment selection was determined by a skillful knee surgeon in this series, the irregularity index could indicate the adequate timing of TKA. It also served as an indicator to predict the outcome of arthroscopic surgery, and could be used to show the limitations of arthroscopic surgery. Our new system to assess the disease severity of OA knee can serve as an index to determine treatment options. (author)

  6. LTRsift: a graphical user interface for semi-automatic classification and postprocessing of de novo detected LTR retrotransposons

    Directory of Open Access Journals (Sweden)

    Steinbiss Sascha

    2012-11-01

    Background: Long terminal repeat (LTR) retrotransposons are a class of eukaryotic mobile elements characterized by a distinctive sequence similarity-based structure. Hence they are well suited for computational identification. Current software allows for a comprehensive genome-wide de novo detection of such elements. The obvious next step is the classification of newly detected candidates resulting in (super-)families. Such a de novo classification approach based on sequence-based clustering of transposon features has been proposed before, resulting in a preliminary assignment of candidates to families as a basis for subsequent manual refinement. However, such a classification workflow is typically split across a heterogeneous set of glue scripts and generic software (for example, spreadsheets), making it tedious for a human expert to inspect, curate and export the putative families produced by the workflow. Results: We have developed LTRsift, an interactive graphical software tool for semi-automatic postprocessing of de novo predicted LTR retrotransposon annotations. Its user-friendly interface offers customizable filtering and classification functionality, displaying the putative candidate groups, their members and their internal structure in a hierarchical fashion. To ease manual work, it also supports graphical user interface-driven reassignment, splitting and further annotation of candidates. Export of grouped candidate sets in standard formats is possible. In two case studies, we demonstrate how LTRsift can be employed in the context of a genome-wide LTR retrotransposon survey effort. Conclusions: LTRsift is a useful and convenient tool for semi-automated classification of newly detected LTR retrotransposons based on their internal features. Its efficient implementation allows for convenient and seamless filtering and classification in an integrated environment. Developed for life scientists, it is helpful in postprocessing and refining the output of software for predicting LTR retrotransposons.

  7. LTRsift: a graphical user interface for semi-automatic classification and postprocessing of de novo detected LTR retrotransposons.

    Science.gov (United States)

    Steinbiss, Sascha; Kastens, Sascha; Kurtz, Stefan

    2012-11-07

    Long terminal repeat (LTR) retrotransposons are a class of eukaryotic mobile elements characterized by a distinctive sequence similarity-based structure. Hence they are well suited for computational identification. Current software allows for a comprehensive genome-wide de novo detection of such elements. The obvious next step is the classification of newly detected candidates resulting in (super-)families. Such a de novo classification approach based on sequence-based clustering of transposon features has been proposed before, resulting in a preliminary assignment of candidates to families as a basis for subsequent manual refinement. However, such a classification workflow is typically split across a heterogeneous set of glue scripts and generic software (for example, spreadsheets), making it tedious for a human expert to inspect, curate and export the putative families produced by the workflow. We have developed LTRsift, an interactive graphical software tool for semi-automatic postprocessing of de novo predicted LTR retrotransposon annotations. Its user-friendly interface offers customizable filtering and classification functionality, displaying the putative candidate groups, their members and their internal structure in a hierarchical fashion. To ease manual work, it also supports graphical user interface-driven reassignment, splitting and further annotation of candidates. Export of grouped candidate sets in standard formats is possible. In two case studies, we demonstrate how LTRsift can be employed in the context of a genome-wide LTR retrotransposon survey effort. LTRsift is a useful and convenient tool for semi-automated classification of newly detected LTR retrotransposons based on their internal features. Its efficient implementation allows for convenient and seamless filtering and classification in an integrated environment. Developed for life scientists, it is helpful in postprocessing and refining the output of software for predicting LTR retrotransposons.

  8. A Semi-automatic Algorithm for Preliminary Assessment of Labial Gingiva and Alveolar Bone Thickness of Maxillary Anterior Teeth.

    Science.gov (United States)

    Chen, Sheng-Hong; Chan, Hsun-Liang; Lu, Yongning; Ong, Sim-Heng; Wang, Hom-Lay; Ko, Eng Hong; Chang, Po-Chun

    Soft and hard tissue volumes are critical for implant placement and long-term stability. Although the literature has adequately addressed tissue biotypes of Western populations, pertinent information about Asian populations is limited. This study aimed to evaluate the soft and hard tissue profiles of the maxillary anterior teeth of the Taiwanese population using a semi-automatic algorithm. Cone beam computed tomography images of 11 adults with well-aligned maxillary anterior teeth were overlaid with those of cast models, based on the tooth crowns manually outlined by two independent observers. Each tooth was digitally trisected mesiodistally and apicocoronally. The thicknesses of the labial gingiva and alveolar bone were measured using a customized software program. No obvious difference between the observers was noted regarding the dimension of tooth crowns. The average thicknesses of the labial gingiva, the labial alveolar bone, and the palatal alveolar bone were 1.76 ± 0.11 mm, 1.02 ± 0.12 mm, and 1.80 ± 0.31 mm, respectively, with no significant differences between teeth. All parameters were thicker in the apical region than in the cervical region, and the alveolar bone was thinner in the midlabial region of incisors than in the interproximal regions. The thinnest areas were the midcervical compartment of the right central incisor (0.53 ± 0.33 mm) for the labial gingiva, the midcervical compartment of the right lateral incisor (0.23 ± 0.10 mm) for the labial alveolar bone, and the mesiocervical compartment of the left central incisor (0.33 ± 0.09 mm) for the palatal alveolar bone. This study presents an objective and comprehensive methodology for evaluating the soft and hard tissue profiles of maxillary anterior teeth and may be of value for presurgical planning for immediate implant placement. The results suggest that profiles of the Taiwanese subjects are similar to profiles of Western populations.

  9. Semi-automatic assessment of skin capillary density: proof of principle and validation.

    Science.gov (United States)

    Gronenschild, E H B M; Muris, D M J; Schram, M T; Karaca, U; Stehouwer, C D A; Houben, A J H M

    2013-11-01

    Skin capillary density and recruitment have been proven to be relevant measures of microvascular function. Unfortunately, the assessment of skin capillary density from movie files is very time-consuming, since this is done manually. This impedes the use of this technique in large-scale studies. We aimed to develop a (semi-)automated assessment of skin capillary density. CapiAna (Capillary Analysis) is a newly developed semi-automatic image analysis application. The technique involves four steps: 1) movement correction, 2) selection of the frame range and positioning of the region of interest (ROI), 3) automatic detection of capillaries, and 4) manual correction of detected capillaries. To gain insight into the performance of the technique, skin capillary density was measured in twenty participants (ten women; mean age 56.2 [42-72] years). To investigate the agreement between CapiAna and the classic manual counting procedure, we used weighted Deming regression and Bland-Altman analyses. In addition, intra- and inter-observer coefficients of variation (CVs) and differences in analysis time were assessed. We found a good agreement between CapiAna and the classic manual method, with a Pearson's correlation coefficient (r) of 0.95 (P < 0.001) and no significant differences between the two methods, with an intercept of the Deming regression of 1.75 (-6.04; 9.54), while the Bland-Altman analysis showed a mean difference (bias) of 2.0 (-13.5; 18.4) capillaries/mm². The intra- and inter-observer CVs of CapiAna were 2.5% and 5.6% respectively, while for the classic manual counting procedure these were 3.2% and 7.2%, respectively. Finally, the analysis time for CapiAna ranged between 25 and 35 min versus 80 and 95 min for the manual counting procedure. We have developed a semi-automatic image analysis application (CapiAna) for the assessment of skin capillary density, which agrees well with the classic manual counting procedure, is time-saving, and has a better reproducibility as compared to the classic manual counting procedure.
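
    The Bland-Altman bias and limits of agreement reported above follow from the paired differences alone. A short sketch on invented capillary-density data:

```python
import numpy as np

def bland_altman(method_a, method_b):
    """Bias and 95% limits of agreement between two measurement methods."""
    a, b = np.asarray(method_a, float), np.asarray(method_b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical capillary densities (capillaries/mm^2), 8 participants.
capiana = np.array([62.0, 55.3, 70.1, 48.9, 66.4, 59.2, 73.5, 51.8])
manual = np.array([60.5, 54.0, 68.2, 50.1, 63.9, 58.8, 70.2, 52.6])
bias, loa = bland_altman(capiana, manual)
print(f"bias = {bias:.1f}, limits of agreement = {loa[0]:.1f} to {loa[1]:.1f}")
```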

  10. Semi-automatic mapping of linear-trending bedforms using 'Self-Organizing Maps' algorithm

    Science.gov (United States)

    Foroutan, M.; Zimbelman, J. R.

    2017-09-01

    The increased use of high-resolution spatial data, such as high-resolution satellite or Unmanned Aerial Vehicle (UAV) images of Earth, as well as High Resolution Imaging Science Experiment (HiRISE) images of Mars, makes it necessary to develop automation techniques capable of extracting detailed geomorphologic elements from such large data sets. Model validation by repeated images in environmental management studies, such as studies of climate-related change, along with increasing access to high-resolution satellite images, underlines the demand for detailed automatic image-processing techniques in remote sensing. This study presents a methodology based on an unsupervised Artificial Neural Network (ANN) algorithm, known as Self-Organizing Maps (SOM), to achieve the semi-automatic extraction of linear features with small footprints in satellite images. SOM is based on competitive learning and is efficient for handling huge data sets. We applied the SOM algorithm to high-resolution satellite images of Earth and Mars (Quickbird, Worldview and HiRISE) in order to facilitate and speed up image analysis and to improve the accuracy of the results. About 98% overall accuracy and a quantization error of 0.001 in the recognition of small linear-trending bedforms demonstrate a promising framework.
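
    As an illustration of the approach, the third-party MiniSom package provides a compact SOM implementation (it is not the implementation used by the authors); the feature vectors here are synthetic stand-ins for per-pixel image features:

```python
import numpy as np
from minisom import MiniSom  # pip install minisom

# Hypothetical per-pixel feature vectors (e.g., band reflectances + texture).
rng = np.random.default_rng(42)
bedform_pixels = rng.normal(loc=[0.8, 0.2, 0.6], scale=0.05, size=(200, 3))
background = rng.normal(loc=[0.3, 0.5, 0.4], scale=0.05, size=(800, 3))
features = np.vstack([bedform_pixels, background])

som = MiniSom(6, 6, input_len=3, sigma=1.0, learning_rate=0.5, random_seed=0)
som.train_random(features, num_iteration=2000)

# Pixels mapping to the same best-matching unit form candidate classes;
# the quantization error measures how well the map fits the data.
bmu_of_first_pixel = som.winner(features[0])
print("BMU:", bmu_of_first_pixel, "QE:", round(som.quantization_error(features), 4))
```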

  11. SEMI-AUTOMATIC BUILDING MODELS AND FAÇADE TEXTURE MAPPING FROM MOBILE PHONE IMAGES

    Directory of Open Access Journals (Sweden)

    J. Jeong

    2016-06-01

    Research on 3D urban modelling has been actively carried out for a long time. Recently, the need for 3D urban modelling research has increased rapidly due to improved geo-web services and popularized smart devices. Nowadays 3D urban models provided by, for example, Google Earth use aerial photos for 3D urban modelling, but there are some limitations: immediate updates for changed building models are difficult, many buildings are without 3D models and textures, and large resources for maintenance and updating are inevitable. To resolve the limitations mentioned above, we propose a method for semi-automatic building modelling and façade texture mapping from mobile phone images, and analyze the result of the modelling against actual measurements. Our method consists of a camera geometry estimation step, an image matching step, and a façade mapping step. Models generated with this method were compared with actual measured values of real buildings: the ratios of the edge lengths of the models and the measurements were compared, showing an average length-ratio error of 5.8%. Through this method, we could generate a simple building model with fine façade textures without expensive dedicated tools and datasets.

  12. Semi-automatic characterization and simulation of VCSEL devices for high speed VSR communications

    Science.gov (United States)

    Pellevrault, S.; Toffano, Z.; Destrez, A.; Pez, M.; Quentel, F.

    2006-04-01

    Very short range (VSR) high bit rate optical fiber communications are an emerging market dedicated to local area networks, digital displays or board-to-board interconnects within real-time calculators. In this technology, a very fast way to exchange data with high noise immunity and low cost is needed. Optical multimode graded-index fibers are used here because they have electrical noise immunity and are easier to handle than monomode fibers. 850 nm VCSELs are used in VSR communications because of their low cost, direct on-wafer tests, and the possibility of manufacturing VCSEL arrays very easily compared to classical optical transceivers using edge-emitting laser diodes. Although much research has been carried out on temperature modeling of VCSEL emitters, few studies have been devoted to characterizations over a very broad range of temperatures. Nowadays, VCSEL VSR communications tend to be used in severe environments such as space, avionics and military equipment. Therefore, a simple way to characterize VCSEL emitters over a broad range of temperatures is required. In this paper, we propose a complete characterization of the emitter part of 2.5 Gb/s opto-electrical transceiver modules operating from -40°C to +120°C using 850 nm VCSELs. Our method uses simple and semi-automatic measurements of a given set of chosen device parameters in order to make fast and efficient simulations.

  13. Colon wall motility: comparison of novel quantitative semi-automatic measurements using cine MRI.

    Science.gov (United States)

    Hoad, C L; Menys, A; Garsed, K; Marciani, L; Hamy, V; Murray, K; Costigan, C; Atkinson, D; Major, G; Spiller, R C; Taylor, S A; Gowland, P A

    2016-03-01

    Recently, cine magnetic resonance imaging (MRI) has shown promise for visualizing movement of the colonic wall, although assessment of the data has been subjective and observer dependent. This study aimed to develop an objective and semi-automatic imaging metric of ascending colonic wall movement, using image registration techniques. Cine balanced turbo field echo MRI images of ascending colonic motility were acquired over 2 min from 23 healthy volunteers (HVs) at baseline and following two different macrogol stimulus drinks (11 HVs drank 1 L and 12 HVs drank 2 L). Motility metrics derived from large-scale geometric and small-scale pixel movement parameters following image registration were developed using the post-ingestion data and compared to observer grading of wall motion. Inter- and intra-observer variability in the highest correlating metric was assessed using Bland-Altman analysis calculated from two separate observations on a subset of data. All the metrics tested showed significant correlation with the observer rating scores. Line analysis (LA) produced the highest correlation coefficient of 0.74 (95% CI: 0.55-0.86, p < 0.001). Analysis of registered cine MRI data provides a quick, accurate and non-invasive method to detect wall motion within the ascending colon following a colonic stimulus in the form of a macrogol drink. © 2015 John Wiley & Sons Ltd.

  14. A simple semi-automatic approach for land cover classification from multispectral remote sensing imagery.

    Directory of Open Access Journals (Sweden)

    Dong Jiang

    Land cover data represent a fundamental data source for various types of scientific research. The classification of land cover based on satellite data is a challenging task, and an efficient classification method is needed. In this study, an automatic scheme is proposed for the classification of land use using multispectral remote sensing images based on change detection and a semi-supervised classifier. The satellite image can be automatically classified using only the prior land cover map and existing images; therefore human involvement is reduced to a minimum, ensuring the operability of the method. The method was tested in the Qingpu District of Shanghai, China. Using Environment Satellite 1 (HJ-1) images of 2009 with 30 m spatial resolution, the areas were classified into five main types of land cover based on previous land cover data and spectral features. The results agreed well with validation land cover maps, with a Kappa value of 0.79 and statistical area biases in proportion of less than 6%. This study proposed a simple semi-automatic approach for land cover classification using prior maps, with satisfactory accuracy, integrating the accuracy of visual interpretation with the performance of automatic classification methods. The method can conveniently be used for land cover mapping in areas lacking ground reference information or for identifying regions of rapid land cover variation (such as rapid urbanization).
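
    The scheme, change detection to decide which prior-map labels are still trustworthy, then a classifier trained on those pixels, can be sketched as follows; the data, the 80th-percentile change threshold and the random-forest choice are all illustrative assumptions:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-ins: per-pixel spectra for two dates and a prior label map.
rng = np.random.default_rng(1)
n_pixels, n_bands = 1000, 4
old_image = rng.normal(size=(n_pixels, n_bands))
new_image = old_image + rng.normal(scale=0.05, size=(n_pixels, n_bands))
new_image[:100] += 1.5                          # the first 100 pixels changed
prior_labels = (old_image[:, 0] > 0).astype(int)  # toy two-class prior map

# 1) Change detection: a large spectral difference means the prior label
#    can no longer be trusted as a training sample.
change = np.linalg.norm(new_image - old_image, axis=1)
unchanged = change < np.percentile(change, 80)

# 2) Train on unchanged pixels, whose prior-map class is assumed still valid.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(new_image[unchanged], prior_labels[unchanged])

# 3) Re-classify every pixel of the new image, including changed areas.
updated_map = clf.predict(new_image)
print("relabelled changed pixels:", (updated_map[:100] != prior_labels[:100]).sum())
```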

  15. Conceptual design of semi-automatic wheelbarrow to overcome ergonomics problems among palm oil plantation workers

    Science.gov (United States)

    Nawik, N. S. M.; Deros, B. M.; Rahman, M. N. A.; Sukadarin, E. H.; Nordin, N.; Tamrin, S. B. M.; Bakar, S. A.; Norzan, M. L.

    2015-12-01

    Ergonomics problems are among the main issues faced by palm oil plantation workers, especially during the harvesting and collecting of fresh fruit bunches (FFB). The intensive manual handling and labour activities involved have been associated with a high prevalence of musculoskeletal disorders (MSDs) among palm oil plantation workers. New and safe technology for machines and equipment in palm oil plantations is very important in order to help workers reduce risks and injuries while working. The aim of this research is to improve the design of a wheelbarrow so that it is suitable for workers on small oil palm plantations. The wheelbarrow design was drawn using CATIA ergonomic features. The ergonomics assessment was performed by comparison with the existing wheelbarrow design. The conceptual design was developed based on the problems that had been reported by workers. From the analysis of these problems, a concept design for an ergonomic semi-automatic wheelbarrow that is safe and suitable for palm oil plantation workers was produced.

  16. Semi-Automatic Detection of Indigenous Settlement Features on Hispaniola through Remote Sensing Data

    Directory of Open Access Journals (Sweden)

    Till F. Sonnemann

    2017-12-01

    Satellite imagery has had limited application in the analysis of pre-colonial settlement archaeology in the Caribbean; visible evidence of wooden structures perishes quickly in tropical climates. Only slight topographic modifications remain, typically associated with middens. Nonetheless, surface scatters, as well as the soil characteristics they produce, can serve as quantifiable indicators of an archaeological site, detectable by analyzing remote sensing imagery. A variety of pre-processed, very diverse data sets went through a process of image registration, with the intention of combining multispectral bands to feed two different semi-automatic direct detection algorithms: a posterior probability approach and a frequentist approach. Two 5 × 5 km² areas in the northwestern Dominican Republic with diverse environments, sufficient imagery coverage, and a representative number of known indigenous site locations each served for one approach. Buffers around the locations of known sites, as well as areas with no likely archaeological evidence, were used as samples. The resulting maps offer quantifiable statistical outcomes for locations with pixel value combinations similar to those of the identified sites, indicating a higher probability of archaeological evidence. These trials, still very experimental and rather unvalidated since they have not been subsequently ground-truthed, show the variable potential of this method in diverse environments.

  17. Semi-automatic mapping for identifying complex geobodies in seismic images

    Science.gov (United States)

    Domínguez-C, Raymundo; Romero-Salcedo, Manuel; Velasquillo-Martínez, Luis G.; Shemeretov, Leonid

    2017-03-01

    Seismic images are composed of positive and negative seismic wave traces with different amplitudes (Robein 2010 Seismic Imaging: A Review of the Techniques, their Principles, Merits and Limitations (Houten: EAGE)). The association of these amplitudes together with a color palette forms complex visual patterns. The color intensity of such patterns is directly related to impedance contrasts: the higher the contrast, the higher the color intensity. Generally speaking, low impedance contrasts are depicted with low-tone colors, creating zones with different patterns whose features are not evident to the automated 3D mapping options available in commercial software. In this work, a workflow is proposed for the semi-automatic mapping of seismic images, focused on those areas with low-intensity colored zones that may be associated with geobodies of petroleum interest. The CIE L*A*B* color space was used to perform the seismic image processing, which helped find small but significant differences between pixel tones. This process generated binary masks that bound color regions to low-intensity colors. The three-dimensional mask projection allowed the construction of 3D structures for such zones (geobodies). The proposed method was applied to a set of digital images from a seismic cube and tested on four representative study cases. The obtained results are encouraging because interesting geobodies are obtained with a minimum of information.

  18. A semi-automatic method for extracting thin line structures in images as rooted tree network

    Energy Technology Data Exchange (ETDEWEB)

    Brazzini, Jacopo [Los Alamos National Laboratory; Dillard, Scott [Los Alamos National Laboratory; Soille, Pierre [EC - JRC

    2010-01-01

    This paper addresses the problem of semi-automatic extraction of line networks in digital images - e.g., road or hydrographic networks in satellite images, or blood vessels in medical images. For that purpose, we improve a generic method derived from morphological and hydrological concepts, consisting of minimum cost path estimation and flow simulation. While this approach fully exploits the local contrast and shape of the network, as well as its arborescent nature, we further incorporate local directional information about the structures in the image. Namely, an appropriate anisotropic metric is designed by using both the characteristic features of the target network and the eigen-decomposition of the gradient structure tensor of the image. Subsequently, geodesic propagation from a given seed with this metric is combined with hydrological operators for overland flow simulation to extract the line network. The algorithm is demonstrated for the extraction of blood vessels in a retina image and of a river network in a satellite image.
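
    The minimum cost path step can be illustrated with scikit-image's route_through_array on a toy image; for brevity the sketch uses a plain isotropic intensity cost rather than the paper's structure-tensor-based anisotropic metric:

```python
import numpy as np
from skimage.graph import route_through_array

# Toy image: a dark, curved "river" on a bright background.
img = np.ones((60, 60))
rr = np.arange(60)
cc = (20 + 10 * np.sin(rr / 9.0)).astype(int)
img[rr, cc] = 0.05
img[rr, np.clip(cc + 1, 0, 59)] = 0.1

# Cost is low on the dark line, so the geodesic follows the network.
cost = img + 1e-3
path, total_cost = route_through_array(cost, start=(0, cc[0]), end=(59, cc[-1]),
                                       fully_connected=True, geometric=True)
print(f"path length: {len(path)} pixels, accumulated cost: {total_cost:.3f}")
```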

  19. Semi-Automatic Science Workflow Synthesis for High-End Computing on the NASA Earth Exchange

    Data.gov (United States)

    National Aeronautics and Space Administration — Enhance capabilities for collaborative data analysis and modeling in Earth sciences. Develop components for automatic workflow capture, archiving and management....

  20. Usefulness of semi-automatic volumetry compared to established linear measurements in predicting lymph node metastases in MSCT

    Energy Technology Data Exchange (ETDEWEB)

    Buerke, Boris; Puesken, Michael; Heindel, Walter; Wessling, Johannes (Dept. of Clinical Radiology, Univ. of Muenster (Germany)), email: buerkeb@uni-muenster.de; Gerss, Joachim (Dept. of Medical Informatics and Biomathematics, Univ. of Muenster (Germany)); Weckesser, Matthias (Dept. of Nuclear Medicine, Univ. of Muenster (Germany))

    2011-06-15

    Background: Volumetry of lymph nodes potentially reflects asymmetric size alterations better than metric parameters (e.g. long-axis diameter), independently of lymph node orientation. Purpose: To distinguish between benign and malignant lymph nodes by comparing 2D and semi-automatic 3D measurements in MSCT. Material and Methods: FDG-18 PET-CT was performed in 33 patients prior to therapy for malignant melanoma at stage III/IV. One hundred and eighty-six cervico-axillary, abdominal and inguinal lymph nodes were evaluated independently by two radiologists, both manually and with the use of semi-automatic segmentation software. Long axis (LAD), short axis (SAD), maximal 3D diameter, volume and elongation were obtained. PET-CT, PET-CT follow-up and/or histology served as a combined reference standard. Statistics encompassed intra-class correlation coefficients and ROC curves. Results: Compared to manual assessment, semi-automatic inter-observer variability was found to be lower, e.g. at 2.4% (95% CI 0.05-4.8) for LAD. The standard of reference revealed metastases in 90 (48%) of 186 lymph nodes. Semi-automatic prediction of lymph node metastases revealed the highest areas under the ROC curves for volume (reader 1: 0.77, 95% CI 0.64-0.90; reader 2: 0.76, 95% CI 0.59-0.86) and SAD (reader 1: 0.76, 95% CI 0.64-0.88; reader 2: 0.75, 95% CI 0.62-0.89). The findings for LAD (reader 1: 0.73, 95% CI 0.60-0.86; reader 2: 0.71, 95% CI 0.57-0.85) and maximal 3D diameter (reader 1: 0.70, 95% CI 0.53-0.86; reader 2: 0.76, 95% CI 0.50-0.80) were substantially lower, and those for elongation (reader 1: 0.65, 95% CI 0.50-0.79; reader 2: 0.66, 95% CI 0.52-0.81) significantly lower (p < 0.05). Conclusion: Semi-automatic analysis of lymph nodes in malignant melanoma is supported by high segmentation quality and reproducibility. Compared to the established SAD, semi-automatic lymph node volumetry does not have an additive role for categorizing lymph nodes as normal or metastatic in malignant melanoma.

  1. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. The text discusses automatic electronic digital computers, symbolic language, Reverse Polish Notation, and the translation of Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, a general-purpose input routine, and the efficient use of memory are also elaborated. This publication is intended…

  2. Semi-automatic synthesis and biological evaluation of 18F-FCH as an oncologic PET tracer

    International Nuclear Information System (INIS)

    Wu Zhanhong; Wang Shizhen; Zhou Qian; Fu Zhe; Qiu Feichan; Huo Li

    2005-01-01

    18F-fluoromethylcholine (18F-FCH) is synthesized as a PET tracer. The semi-automatic synthesis assembly for 18F-FCH is modified from the CPCU (CTI). The radiochemical purity is measured by analytical HPLC. The radiochemical yield and the radiochemical purity of 18F-FCH are 15% and >99%, respectively. The total radiosynthesis time is 55 min after EOB. The labeled product exhibits low toxicity. The biodistribution in normal mice and the toxicity are studied. PET imaging with 18F-FCH is performed on a tumor xenograft murine model. The semi-automatic synthesis assembly is promising for routine clinical radiopharmaceutical preparation, and a preliminary study has shown the usefulness of 18F-FCH as an oncologic PET tracer. (authors)

  3. Derivation of groundwater flow-paths based on semi-automatic extraction of lineaments from remote sensing data

    OpenAIRE

    U. Mallast; R. Gloaguen; S. Geyer; T. Rödiger; C. Siebert

    2011-01-01

    In this paper we present a semi-automatic method to infer groundwater flow-paths based on the extraction of lineaments from digital elevation models. This method is especially adequate in remote and inaccessible areas where in-situ data are scarce. The combined method of linear filtering and object-based classification provides a lineament map with a high degree of accuracy. Subsequently, lineaments are differentiated into geological and morphological lineaments using auxili...

  4. A deformable-model approach to semi-automatic segmentation of CT images demonstrated by application to the spinal canal

    International Nuclear Information System (INIS)

    Burnett, Stuart S.C.; Starkschall, George; Stevens, Craig W.; Liao Zhongxing

    2004-01-01

    Because of the importance of accurately defining the target in radiation treatment planning, we have developed a deformable-template algorithm for the semi-automatic delineation of normal tissue structures on computed tomography (CT) images. We illustrate the method by applying it to the spinal canal. Segmentation is performed in three steps: (a) partial delineation of the anatomic structure is obtained by wavelet-based edge detection; (b) a deformable-model template is fitted to the edge set by chamfer matching; and (c) the template is relaxed away from its original shape into its final position. Appropriately chosen ranges for the model parameters limit the deformations of the template, accounting for interpatient variability. Our approach differs from those used in other deformable models in that it does not inherently require the modeling of forces. Instead, the spinal canal was modeled using Fourier descriptors derived from four sets of manually drawn contours. Segmentation was carried out, without manual intervention, on five CT data sets and the algorithm's performance was judged subjectively by two radiation oncologists. Two assessments were considered: in the first, segmentation on a random selection of 100 axial CT images was compared with the corresponding contours drawn manually by one of six dosimetrists, also chosen randomly; in the second assessment, the segmentation of each image in the five evaluable CT sets (a total of 557 axial images) was rated as either successful, unsuccessful, or requiring further editing. Contours generated by the algorithm were more likely than manually drawn contours to be considered acceptable by the oncologists. The mean proportions of acceptable contours were 93% (automatic) and 69% (manual). Automatic delineation of the spinal canal was deemed to be successful on 91% of the images, unsuccessful on 2% of the images, and requiring further editing on 7% of the images. Our deformable template algorithm thus gives a robust
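
    Step (b), chamfer matching, scores a candidate template placement by how close its points lie to the detected edges; a minimal scipy sketch, with our own function names and a translation-only search as a simplification:

        import numpy as np
        from scipy.ndimage import distance_transform_edt

        def chamfer_score(edge_map, template_pts, offset):
            """Mean distance from the shifted template points to the nearest
            detected edge; lower is a better fit. edge_map is binary
            (1 = edge), template_pts is an (N, 2) array of row/col points."""
            dist = distance_transform_edt(edge_map == 0)  # distance to nearest edge
            pts = np.round(template_pts + offset).astype(int)
            pts = np.clip(pts, 0, np.array(edge_map.shape) - 1)
            return dist[pts[:, 0], pts[:, 1]].mean()

        # an exhaustive search over small offsets would pick the placement
        # with the minimum chamfer score before the template is relaxed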

  5. Parallelization of the AliRoot event reconstruction by performing a semi- automatic source-code transformation

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    side bus or processor interconnections. Parallelism can only result in performance gain if the memory usage is optimized, memory locality is improved and the communication between threads is minimized. But the domain of concurrent programming has become a field for highly skilled experts, as the implementation of multithreading is difficult, error-prone and labor-intensive. A full re-implementation for parallel execution of existing offline frameworks, like AliRoot in ALICE, is thus unaffordable. An alternative method is to use a semi-automatic source-to-source transformation for getting a simple parallel design, with almost no interference between threads. This reduces the need of rewriting the develop...

  6. Speaker diarization and speech recognition in the semi-automatization of audio description: An exploratory study on future possibilities?

    Directory of Open Access Journals (Sweden)

    Héctor Delgado

    2015-12-01

    Full Text Available This article presents an overview of the technological components used in the process of audio description, and suggests a new scenario in which speech recognition, machine translation, and text-to-speech, with the corresponding human revision, could be used to increase audio description provision. The article focuses on a process in which both speaker diarization and speech recognition are used in order to obtain a semi-automatic transcription of the audio description track. The technical process is presented and experimental results are summarized.

  7. Speaker diarization and speech recognition in the semi-automatization of audio description: An exploratory study on future possibilities?

    Directory of Open Access Journals (Sweden)

    Héctor Delgado

    2015-06-01

    This article presents an overview of the technological components used in the process of audio description, and suggests a new scenario in which speech recognition, machine translation, and text-to-speech, with the corresponding human revision, could be used to increase audio description provision. The article focuses on a process in which both speaker diarization and speech recognition are used in order to obtain a semi-automatic transcription of the audio description track. The technical process is presented and experimental results are summarized.

  8. A semi-automatic 2D-to-3D video conversion with adaptive key-frame selection

    Science.gov (United States)

    Ju, Kuanyu; Xiong, Hongkai

    2014-11-01

    To compensate for the deficit of 3D content, 2D to 3D video conversion (2D-to-3D) has recently attracted more attention from both industrial and academic communities. Semi-automatic 2D-to-3D conversion, which estimates the corresponding depth of non-key-frames through key-frames, is more desirable owing to its advantage of balancing labor cost and 3D effects. The location of key-frames plays a role in the quality of depth propagation. This paper proposes a semi-automatic 2D-to-3D scheme with adaptive key-frame selection to keep temporal continuity more reliable and to reduce the depth propagation errors caused by occlusion. The potential key-frames are localized in terms of clustered color variation and motion intensity. The distance of the key-frame interval is also taken into account to keep the accumulated propagation errors under control and guarantee minimal user interaction. Once their depth maps are aligned with user interaction, the non-key-frame depth maps are automatically propagated by shifted bilateral filtering. Considering that the depth of objects may change due to object motion or camera zoom in/out effects, a bi-directional depth propagation scheme is adopted in which a non-key frame is interpolated from two adjacent key frames. The experimental results show that the proposed scheme performs better than existing 2D-to-3D schemes with a fixed key-frame interval.
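
    The core of such depth propagation can be sketched as a cross-bilateral filter: depths are carried over from the key frame with weights combining spatial proximity and colour similarity. This is a simplified, single-direction sketch under our own naming, not the paper's shifted bilateral filter:

        import numpy as np

        def propagate_depth(key_rgb, key_depth, tgt_rgb,
                            radius=5, sigma_s=3.0, sigma_c=10.0):
            """Carry depth from a key frame to a target frame: each target
            pixel averages key-frame depths in a window, weighted by spatial
            distance and colour similarity (cross-bilateral weights)."""
            key_rgb = key_rgb.astype(float)
            tgt_rgb = tgt_rgb.astype(float)
            h, w = tgt_rgb.shape[:2]
            ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
            spatial = np.exp(-(xs ** 2 + ys ** 2) / (2 * sigma_s ** 2))
            rgb_p = np.pad(key_rgb, ((radius, radius), (radius, radius), (0, 0)), mode='edge')
            d_p = np.pad(key_depth, radius, mode='edge')
            out = np.empty((h, w))
            for y in range(h):
                for x in range(w):
                    patch_rgb = rgb_p[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
                    patch_d = d_p[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
                    dc = ((patch_rgb - tgt_rgb[y, x]) ** 2).sum(axis=2)
                    wgt = spatial * np.exp(-dc / (2 * sigma_c ** 2))
                    out[y, x] = (wgt * patch_d).sum() / wgt.sum()
            return out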

  9. Development of computational algorithms for quantification of pulmonary structures

    International Nuclear Information System (INIS)

    Oliveira, Marcela de; Alvarez, Matheus; Alves, Allan F.F.; Miranda, Jose R.A.; Pina, Diana R.

    2012-01-01

    High-resolution computed tomography (HRCT) has become the imaging diagnostic exam most commonly used for the evaluation of the sequelae of Paracoccidioidomycosis. Subjective evaluation of the radiological abnormalities found on HRCT images does not provide an accurate quantification. Computer-aided diagnosis systems produce a more objective assessment of the abnormal patterns found in HRCT images. Thus, this research proposes the development of algorithms, in the MATLAB® computing environment, that can semi-automatically quantify pathologies such as pulmonary fibrosis and emphysema. The algorithm consists of selecting a region of interest (ROI) and, through the use of masks, density filters and morphological operators, obtaining a quantification of the ratio of the injured area to the area of healthy lung. The proposed method was tested on ten HRCT scans of patients with confirmed PCM. The results of the semi-automatic measurements were compared with subjective evaluations performed by a specialist in radiology, reaching a coincidence of 80% for emphysema and 58% for fibrosis. (author)
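
    A minimal sketch of this kind of density-mask quantification, assuming a CT slice in Hounsfield units and a precomputed lung mask; the -950 HU emphysema threshold is a common literature value, not necessarily the one used in the paper:

        import numpy as np
        from scipy import ndimage

        def emphysema_fraction(hu_slice, lung_mask, thresh=-950):
            """Fraction of the lung area whose density falls below the
            emphysema threshold, after a morphological opening that
            removes isolated noise pixels."""
            emphysema = (hu_slice < thresh) & lung_mask
            emphysema = ndimage.binary_opening(emphysema, structure=np.ones((3, 3)))
            return emphysema.sum() / lung_mask.sum()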

  10. Comparison of 2D radiography and a semi-automatic CT-based 3D method for measuring change in dorsal angulation over time in distal radius fractures

    Energy Technology Data Exchange (ETDEWEB)

    Christersson, Albert; Larsson, Sune [Uppsala University, Department of Orthopaedics, Uppsala (Sweden); Nysjoe, Johan; Malmberg, Filip; Sintorn, Ida-Maria; Nystroem, Ingela [Uppsala University, Centre for Image Analysis, Uppsala (Sweden); Berglund, Lars [Uppsala University, Uppsala Clinical Research Centre, UCR Statistics, Uppsala (Sweden)

    2016-06-15

    The aim of the present study was to compare the reliability and agreement between a computed tomography-based method (CT) and digitalised 2D radiographs (XR) when measuring change in dorsal angulation over time in distal radius fractures. Radiographs from 33 distal radius fractures treated with external fixation were retrospectively analysed. All fractures had been examined using both XR and CT at six times over 6 months postoperatively. The changes in dorsal angulation between the first reference images and the following examinations in every patient were calculated from 133 follow-up measurements by two assessors and repeated at two different time points. The measurements were analysed using Bland-Altman plots, comparing intra- and inter-observer agreement within and between XR and CT. The mean differences in intra- and inter-observer measurements for XR, CT, and between XR and CT were close to zero, implying equal validity. The average intra- and inter-observer limits of agreement for XR, CT, and between XR and CT were ±4.4°, ±1.9° and ±6.8°, respectively. For scientific purposes, the reliability of XR seems unacceptably low when measuring changes in dorsal angulation in distal radius fractures, whereas the reliability of the semi-automatic CT-based method was higher and is therefore preferable when a more precise method is requested. (orig.)
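
    The Bland-Altman limits of agreement used here are straightforward to compute; a minimal sketch for paired angle measurements (the function name is ours):

        import numpy as np

        def limits_of_agreement(a, b):
            """Bland-Altman bias and 95% limits of agreement between two
            paired series of measurements (e.g. two observers' angles)."""
            d = np.asarray(a, float) - np.asarray(b, float)
            bias = d.mean()
            half_width = 1.96 * d.std(ddof=1)
            return bias, bias - half_width, bias + half_width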

  11. Predicting new-onset of postoperative atrial fibrillation in patients undergoing cardiac surgery using semi-automatic reading of perioperative electrocardiograms

    DEFF Research Database (Denmark)

    Gu, Jiwei; Graff, Claus; Melgaard, Jacob

    2015-01-01

    P10 Predicting new-onset of postoperative atrial fibrillation in patients undergoing cardiac surgery using semi-automatic reading of perioperative electrocardiograms. Jiwei Gu, Claus Graff, Jacob Melgaard, Søren Lundbye-Christensen, Erik Berg Schmidt, Christian Torp-Pedersen, Kristinn Thorsteinsson......, Jan Jesper Andreasen. Aalborg, Denmark. Background: Postoperative new-onset atrial fibrillation (POAF) is the most common arrhythmia after cardiac surgery. The aim of this study was to evaluate whether semi-automatic readings of perioperative electrocardiograms (ECGs) are of any value in predicting POAF after...... ECG monitoring. A semi-automatic machine capable of reading different parameters of digitalized ECGs was used to read both lead-specific (P/QRS/T amplitudes/intervals) and global measurements (P-duration/QRS-duration/PR-interval/QT/Heart Rate/hypertrophy). Results: We divided the patients into two...

  12. AN/FSQ-7: the computer that shaped the Cold War

    CERN Document Server

    Ulmann, Bernd

    2014-01-01

    One of the most impressive computer systems ever built was the vacuum-tube-based behemoth AN/FSQ-7, which was the heart of the "Semi-Automatic Ground Environment" (SAGE). Machines of this type were children of the Cold War; they not only had a tremendous effect on this episode in politics but also generated a vast amount of spin-offs which still shape our world.

  13. Preliminary Investigation on the Effects of Shockwaves on Water Samples Using a Portable Semi-Automatic Shocktube

    Science.gov (United States)

    Wessley, G. Jims John

    2017-10-01

    The propagation of shock waves through any medium results in an instantaneous increase in pressure and temperature behind the shockwave. The scope for utilizing this sudden rise in pressure and temperature in new industrial, biological and commercial areas has been explored, and the opportunities are tremendous. This paper presents the design and testing of a portable semi-automatic shock tube on water samples mixed with salt. The preliminary analysis shows encouraging results, as the salinity of the water samples was reduced by up to 5% when bombarded with 250 shocks generated using a pressure ratio of 2.5. Paper used for normal printing is used as the diaphragm to generate the shocks. The impact of shocks of much higher intensity, obtained using different diaphragms, will lead to a greater reduction in the salinity of the sea water, thus leading to the production of potable water from saline water, which is the need of the hour.

  14. Semi-Automatic Removal of Foreground Stars from Images of Galaxies

    Science.gov (United States)

    Frei, Zsolt

    1996-07-01

    A new procedure, designed to remove foreground stars from galaxy profiles, is presented here. Although several programs exist for stellar and faint-object photometry, none of them treat star removal from the images very carefully. I present my attempt to develop such a system, and briefly compare the performance of my software to one of the well-known stellar photometry packages, DAOPhot (Stetson 1987). Major steps in my procedure are: (1) automatic construction of an empirical 2D point spread function from well-separated stars that are situated off the galaxy; (2) automatic identification of those peaks that are likely to be foreground stars, scaling the PSF and removing these stars, and patching residuals (in the automatically determined smallest possible area where residuals are truly significant); and (3) cosmetic fixing of remaining degradations in the image. The algorithm and software presented here are significantly better for automatic removal of foreground stars from images of galaxies than DAOPhot or similar packages, since: (a) the most suitable stars are selected automatically from the image for the PSF fit; (b) after star removal an intelligent and automatic procedure removes any possible residuals; (c) an unlimited number of images can be cleaned in one run without any user interaction whatsoever. (SECTION: Computing and Data Analysis)
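
    Step (2) above amounts to fitting a single amplitude per star; a minimal numpy sketch, assuming the empirical PSF stamp is centred on its peak and the star lies away from the image border (the function name is ours):

        import numpy as np

        def subtract_star(image, psf, y, x):
            """Scale the empirical PSF to the star at (y, x) by linear
            least squares and subtract it from the image in place."""
            ph, pw = psf.shape
            y0, x0 = y - ph // 2, x - pw // 2
            stamp = image[y0:y0 + ph, x0:x0 + pw]
            amp = (stamp * psf).sum() / (psf * psf).sum()  # best-fit amplitude
            image[y0:y0 + ph, x0:x0 + pw] -= amp * psf
            return image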

  15. Comparison Of Semi-Automatic And Automatic Slick Detection Algorithms For Jiyeh Power Station Oil Spill, Lebanon

    Science.gov (United States)

    Osmanoglu, B.; Ozkan, C.; Sunar, F.

    2013-10-01

    After air strikes on July 14 and 15, 2006, the Jiyeh Power Station started leaking oil into the eastern Mediterranean Sea. The power station is located about 30 km south of Beirut, and the slick covered about 170 km of coastline, threatening the neighboring countries Turkey and Cyprus. Due to the ongoing conflict between Israel and Lebanon, cleaning efforts could not start immediately, resulting in 12 000 to 15 000 tons of fuel oil leaking into the sea. In this paper we compare results from automatic and semi-automatic slick detection algorithms. The automatic detection method combines the probabilities calculated for each pixel from each image to obtain a joint probability, minimizing the adverse effects of the atmosphere on oil spill detection. The method can readily utilize X-, C- and L-band data where available. Furthermore, wind and wave speed observations can be used for a more accurate analysis. For this study, we utilize Envisat ASAR ScanSAR data. A probability map is generated based on the radar backscatter, the effect of wind and the dampening value. The semi-automatic algorithm is based on supervised classification. As the classifier, an Artificial Neural Network Multilayer Perceptron (ANN MLP) is used, since it is more flexible and efficient than the conventional maximum likelihood classifier for multisource and multi-temporal data. The learning algorithm for the ANN MLP is the Levenberg-Marquardt (LM) algorithm. Training and test data for the supervised classification are composed from the textural information created from the SAR images. This approach is semi-automatic because tuning the classifier parameters and composing the training data require human interaction. We point out the similarities and differences between the two methods and their results, as well as underline their advantages and disadvantages. Due to the lack of ground truth data, we compare the obtained results to each other, as well as to other published oil slick area assessments.

  16. Semi-automatic analysis of standard uptake values in serial PET/CT studies in patients with lung cancer and lymphoma

    Directory of Open Access Journals (Sweden)

    Ly John

    2012-04-01

    Full Text Available Abstract Background Changes in maximum standardised uptake values (SUVmax) between serial PET/CT studies are used to determine disease progression or regression in oncologic patients. Measuring these changes manually can be time-consuming in clinical routine. A semi-automatic method for calculation of SUVmax in serial PET/CT studies was developed and compared to a conventional manual method. The semi-automatic method first aligns the serial PET/CT studies based on the CT images. Thereafter, the reader selects an abnormal lesion in one of the PET studies. After this manual step, the program automatically detects the corresponding lesion in the other PET study, segments the two lesions and calculates the SUVmax in both studies as well as the difference between the SUVmax values. The results of the semi-automatic analysis were compared to those of a manual SUVmax analysis using a Philips PET/CT workstation. Three readers did the SUVmax readings in both methods. Sixteen patients with lung cancer or lymphoma who had undergone two PET/CT studies were included. There were a total of 26 lesions. Results Linear regression analysis of changes in SUVmax shows that intercepts and slopes are close to the line of identity for all readers (reader 1: intercept = 1.02, R2 = 0.96; reader 2: intercept = 0.97, R2 = 0.98; reader 3: intercept = 0.99, R2 = 0.98). Manual and semi-automatic methods agreed in all cases on whether SUVmax had increased or decreased between the serial studies. The average time to measure SUVmax changes in two serial PET/CT examinations was four to five times longer for the manual method than for the semi-automatic method for all readers (reader 1: 53.7 vs. 10.5 s; reader 2: 27.3 vs. 6.9 s; reader 3: 47.5 vs. 9.5 s; p Conclusions Good agreement was shown in the assessment of SUVmax changes between the manual and semi-automatic methods. The semi-automatic analysis was four to five times faster to perform than the manual analysis. These findings show the
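
    Once the two studies are co-registered, the SUVmax comparison itself is simple; a minimal sketch assuming two aligned PET volumes in SUV units and a seed voxel from the reader's click (the names and the fixed search box are our simplifications, not the program's segmentation):

        import numpy as np

        def suvmax_change(pet1, pet2, seed, r=5):
            """SUVmax inside a small cube around the selected lesion in two
            co-registered PET volumes, and the change between studies."""
            z, y, x = seed
            box = (slice(z - r, z + r + 1),
                   slice(y - r, y + r + 1),
                   slice(x - r, x + r + 1))
            s1, s2 = pet1[box].max(), pet2[box].max()
            return s1, s2, s2 - s1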

  17. Computer Operating System Maintenance.

    Science.gov (United States)

    1982-06-01

    FACILITY The Computer Management Information Facility (CMIF) system was developed by Rapp Systems to fulfill the need at the CRF to record and report on...computer center resource usage and utilization. The foundation of the CMIF system is a System 2000 data base (CRFMGMT) which stores and permits access

  18. RFA-cut: Semi-automatic segmentation of radiofrequency ablation zones with and without needles via optimal s-t-cuts.

    Science.gov (United States)

    Egger, Jan; Busse, Harald; Brandmaier, Philipp; Seider, Daniel; Gawlitza, Matthias; Strocka, Steffen; Voglreiter, Philip; Dokter, Mark; Hofmann, Michael; Kainz, Bernhard; Chen, Xiaojun; Hann, Alexander; Boechat, Pedro; Yu, Wei; Freisleben, Bernd; Alhonnoro, Tuomas; Pollari, Mika; Moche, Michael; Schmalstieg, Dieter

    2015-01-01

    In this contribution, we present a semi-automatic segmentation algorithm for radiofrequency ablation (RFA) zones via optimal s-t-cuts. Our interactive graph-based approach builds upon a polyhedron to construct the graph and was specifically designed for computed tomography (CT) acquisitions from patients who had RFA treatments of Hepatocellular Carcinomas (HCC). For evaluation, we used twelve post-interventional CT datasets from the clinical routine, and as evaluation metric we utilized the Dice Similarity Coefficient (DSC), which is commonly accepted for judging computer-aided medical segmentation tasks. Compared with pure manual slice-by-slice expert segmentations from interventional radiologists, we were able to achieve a DSC of about eighty percent, which is sufficient for our clinical needs. Moreover, our approach was able to handle images containing (DSC = 75.9%) and not containing (DSC = 78.1%) the RFA needles still in place. Additionally, we found no statistically significant difference (p < 0.423) between the segmentation results of the subgroups under a Mann-Whitney test. Finally, to the best of our knowledge, this is the first time a segmentation approach for CT scans including the RFA needles has been reported, and we show why another state-of-the-art segmentation method fails for these cases. Intraoperative scans including an RFA probe are very critical in clinical practice and need very careful segmentation and inspection to avoid under-treatment, which may result in tumor recurrence (up to 40%). If the decision can be made during the intervention, an additional ablation can be performed without removing the entire needle. This decreases the patient stress and the associated risks and costs of a separate intervention at a later date. Ultimately, the segmented ablation zone containing the RFA needle can be used for a precise ablation simulation, as the real needle position is known.
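
    The DSC used for evaluation is easy to reproduce; a minimal sketch for two binary masks (an illustrative function, not the authors' code):

        import numpy as np

        def dice(seg, gt):
            """Dice Similarity Coefficient between two binary masks."""
            seg = np.asarray(seg, bool)
            gt = np.asarray(gt, bool)
            return 2.0 * np.logical_and(seg, gt).sum() / (seg.sum() + gt.sum())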

  19. Semi-automatic scene generation using the Digital Anatomist Foundational Model.

    Science.gov (United States)

    Wong, B A; Rosse, C; Brinkley, J F

    1999-01-01

    A recent survey shows that a major impediment to more widespread use of computers in anatomy education is the inability to directly manipulate 3-D models, and to relate these to corresponding textual information. In the University of Washington Digital Anatomist Project we have developed a prototype Web-based scene generation program that combines the symbolic Foundational Model of Anatomy with 3-D models. A Web user can browse the Foundational Model (FM), then click to request that a 3-D scene be created of an object and its parts or branches. The scene is rendered by a graphics server, and a snapshot is sent to the Web client. The user can then manipulate the scene, adding new structures, deleting structures, rotating the scene, zooming, and saving the scene as a VRML file. Applications such as this, when fully realized with fast rendering and more anatomical content, have the potential to significantly change the way computers are used in anatomy education.

  20. Semi-automatic registration of 3D orthodontics models from photographs

    Science.gov (United States)

    Destrez, Raphaël; Treuillet, Sylvie; Lucas, Yves; Albouy-Kissi, Benjamin

    2013-03-01

    In orthodontics, a common practice used to diagnose and plan treatment is the dental cast. After digitization by a CT scanner or a laser scanner, the obtained 3D surface models can feed orthodontic numerical tools for computer-aided diagnosis and treatment planning. One of the critical pre-processing steps is the 3D registration of the dental arches to obtain the occlusion of these numerical models. For this task, we propose a vision-based method to automatically compute the registration from photos of the patient's mouth. From a set of matched singular points between two photos and the dental 3D models, the rigid transformation to apply to the mandible to bring it into contact with the maxilla can be computed by minimizing the reprojection errors. In a previous study, we established the feasibility of this visual registration approach with a manual selection of singular points. This paper addresses the issue of automatic point detection. Based on a priori knowledge, histogram thresholding and edge detection are used to extract specific points in the 2D images. Concurrently, curvature information detects the corresponding 3D points. To improve the quality of the final registration, we also introduce a combined optimization of the projection matrix with the 2D/3D point positions. These new developments are evaluated on real data by considering the reprojection errors and the deviation angles after registration with respect to the manual reference occlusion realized by a specialist.
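
    Minimizing reprojection error for a rigid transform is a standard least-squares problem; a minimal scipy sketch, assuming matched 3D model points pts3d, their 2D image observations pts2d and a known camera matrix K (all names are illustrative, not the authors' implementation):

        import numpy as np
        from scipy.optimize import least_squares
        from scipy.spatial.transform import Rotation

        def residuals(params, pts3d, pts2d, K):
            """Reprojection residuals for a rigid transform parameterised
            as a rotation vector (3) plus a translation (3)."""
            p = Rotation.from_rotvec(params[:3]).apply(pts3d) + params[3:]
            proj = (K @ p.T).T
            proj = proj[:, :2] / proj[:, 2:3]       # perspective division
            return (proj - pts2d).ravel()

        # fit = least_squares(residuals, np.zeros(6), args=(pts3d, pts2d, K))
        # R, t = Rotation.from_rotvec(fit.x[:3]).as_matrix(), fit.x[3:]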

  1. ROADS CENTRE-AXIS EXTRACTION IN AIRBORNE SAR IMAGES: AN APPROACH BASED ON ACTIVE CONTOUR MODEL WITH THE USE OF SEMI-AUTOMATIC SEEDING

    Directory of Open Access Journals (Sweden)

    R. G. Lotte

    2013-05-01

    Full Text Available Research dealing with computational methods for road extraction has increased considerably in the last two decades. The procedure is usually performed on optical or microwave (radar) sensor imagery. Radar images offer advantages over optical ones, for they allow the acquisition of scenes regardless of atmospheric and illumination conditions, besides the possibility of surveying regions where the terrain is hidden by the vegetation canopy, among others. Cartographic mapping based on these images is often accomplished manually, requiring considerable time and effort from the human interpreter. Maps for detecting new roads or updating the existing road network are among the most important cartographic products to date. There are currently many studies involving the extraction of roads by means of automatic or semi-automatic approaches. Each of them presents different solutions for different problems, leaving this task an open scientific issue. One of the preliminary steps for road extraction can be the seeding of points belonging to roads, which can be done using different methods with diverse levels of automation. The identified seed points are interpolated to form the initial road network, and are hence used as input for the extraction method proper. The present work introduces an innovative hybrid method for the extraction of road centre-axes in a synthetic aperture radar (SAR) airborne image. Initially, candidate points are seeded fully automatically using Self-Organizing Maps (SOM), followed by a pruning process based on specific metrics. The centre-axes are then detected by an open-curve active contour model (snakes). The obtained results were evaluated for quality with respect to completeness, correctness and redundancy.
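
    The final detection step uses an open-curve active contour; scikit-image ships a comparable snake implementation, so a minimal sketch of the idea (not the authors' model, with illustrative parameters) could look like:

        import numpy as np
        from skimage.filters import gaussian
        from skimage.segmentation import active_contour

        def refine_centre_axis(image, seed_points):
            """Relax an initial polyline (interpolated from seed points)
            onto the road centre-axis of a smoothed SAR image."""
            smoothed = gaussian(image, sigma=2, preserve_range=True)
            return active_contour(
                smoothed, seed_points.astype(float),
                boundary_condition='fixed',     # open curve, ends pinned
                alpha=0.01, beta=1.0, gamma=0.01)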

  2. Clinical evaluation of semi-automatic open-source algorithmic software segmentation of the mandibular bone: Practical feasibility and assessment of a new course of action.

    Science.gov (United States)

    Wallner, Jürgen; Hochegger, Kerstin; Chen, Xiaojun; Mischak, Irene; Reinbacher, Knut; Pau, Mauro; Zrnc, Tomislav; Schwenzer-Zimmerer, Katja; Zemann, Wolfgang; Schmalstieg, Dieter; Egger, Jan

    2018-01-01

    Computer-assisted technologies based on algorithmic software segmentation are a topic of increasing interest in complex surgical cases. However, due to functional instability, time-consuming software processes, personnel resources or license-based financial costs, many segmentation processes are often outsourced from clinical centers to third parties and industry. Therefore, the aim of this trial was to assess the practical feasibility of an easily available, functionally stable and license-free segmentation approach for use in clinical practice. In this retrospective, randomized, controlled trial, the accuracy and accordance of the open-source based segmentation algorithm GrowCut were assessed through comparison to the manually generated ground truth of the same anatomy, using 10 CT lower jaw datasets from the clinical routine. Assessment parameters were the segmentation time, the volume, the voxel number, the Dice Score and the Hausdorff distance. Overall, semi-automatic GrowCut segmentation times were about one minute. Mean Dice Score values of over 85% and Hausdorff distances below 33.5 voxels could be achieved between the algorithmic GrowCut-based segmentations and the manually generated ground truth schemes. Statistical differences between the assessment parameters were not significant (p 0.94) for any of the comparisons made between the two groups. Complete functionally stable and time-saving segmentations with high accuracy and high positive correlation could be performed by the presented interactive open-source based approach. In the cranio-maxillofacial complex, the used method could represent an algorithmic alternative to image-based segmentation in clinical practice, e.g. for surgical treatment planning or visualization of postoperative results, and offers several advantages. Due to its open-source basis, the method could be further developed by other groups or specialists. Systematic comparisons to other segmentation approaches or with a
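
    The Hausdorff distance reported above can be computed directly from the voxel coordinates of two binary masks; a minimal scipy sketch (the function name is ours):

        import numpy as np
        from scipy.spatial.distance import directed_hausdorff

        def hausdorff_voxels(mask_a, mask_b):
            """Symmetric Hausdorff distance (in voxels) between two binary
            segmentation masks, using their voxel coordinate sets."""
            a = np.argwhere(mask_a)
            b = np.argwhere(mask_b)
            return max(directed_hausdorff(a, b)[0],
                       directed_hausdorff(b, a)[0])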

  3. Reliable computer systems.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1993-11-01

    In this article, we looked at some decisions that apply to the design of reliable computer systems. We began with a discussion of several terms such as testability, then described some systems that call for highly reliable hardware and software. The article concluded with a discussion of methods that can be used to achieve higher reliability in computer systems. Reliability and fault tolerance in computers probably will continue to grow in importance. As more and more systems are computerized, people will want assurances about the reliability of these systems, and their ability to work properly even when sub-systems fail.

  4. Stiffness analysis of spring mechanism for semi automatic gripper motion of tendon driven remote manipulator

    International Nuclear Information System (INIS)

    Yu, Seung Nam; Lee, Jong Kwang

    2012-01-01

    Remote handling manipulators are widely used for performing hazardous tasks, and it is essential to ensure the reliable performance of such systems. Toward this end, tendon-driven mechanisms are adopted in such systems to reduce the weight of the distal parts of the manipulator while maintaining handling performance. In this study, several approaches for the design of a gripper system for a tendon-driven remote handling system are introduced. Basically, this gripper has an underactuated spring mechanism that is combined with a slave manipulator and triggered by a master operator. Based on the requirements of the specified tendon-driven mechanism, the connecting position of the spring system on the gripper mechanism is determined and a kinematic influence coefficient (KIC) analysis is performed. As a result, a suitable combination of components for the proper design of the target system is presented and verified

  5. Semi-automatic mapping of fault rocks on a Digital Outcrop Model, Gole Larghe Fault Zone (Southern Alps, Italy)

    Science.gov (United States)

    Vho, Alice; Bistacchi, Andrea

    2015-04-01

    A quantitative analysis of fault-rock distribution is of paramount importance for studies of fault zone architecture, fault and earthquake mechanics, and fluid circulation along faults at depth. Here we present a semi-automatic workflow for fault-rock mapping on a Digital Outcrop Model (DOM). This workflow has been developed on a real case study: the strike-slip Gole Larghe Fault Zone (GLFZ). It consists of a fault zone exhumed from ca. 10 km depth, hosted in granitoid rocks of the Adamello batholith (Italian Southern Alps). Individual seismogenic slip surfaces generally show green cataclasites (cemented by the precipitation of epidote and K-feldspar from hydrothermal fluids) and more or less well-preserved pseudotachylytes (black when well preserved, greenish to white when altered). First, a digital model of the outcrop is reconstructed with photogrammetric techniques, using a large number of high-resolution digital photographs processed with the VisualSFM software. By using high-resolution photographs, the DOM can have a much higher resolution than with LIDAR surveys, up to 0.2 mm/pixel. Then, image processing is performed to map the fault-rock distribution with the ImageJ-Fiji package. Green cataclasites and epidote/K-feldspar veins can be separated from the host rock (tonalite) quite easily using spectral analysis. In particular, band ratio and principal component analysis have been tested successfully. The mapping of black pseudotachylyte veins is trickier because the differences between the pseudotachylyte and biotite spectral signatures are not appreciable. For this reason we have tested different morphological processing tools aimed at identifying (and subtracting) the tiny biotite grains. We propose a solution based on binary images involving a combination of size and circularity thresholds. Comparing the results with manually segmented images, we noticed that major problems occur only when pseudotachylyte veins are very thin and discontinuous. After
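
    The size-and-circularity filtering can be sketched with scikit-image, using the usual circularity measure 4*pi*A/P^2; the thresholds below are illustrative, not the calibrated values of the workflow:

        import numpy as np
        from skimage.measure import label, regionprops

        def remove_round_grains(binary, min_area=20, max_circularity=0.6):
            """Keep elongated, vein-like components of a binary map and
            drop small, near-circular blobs (e.g. biotite grains)."""
            labelled = label(binary)
            keep = np.zeros_like(binary, dtype=bool)
            for region in regionprops(labelled):
                circularity = 4 * np.pi * region.area / (region.perimeter ** 2 + 1e-9)
                if region.area >= min_area and circularity < max_circularity:
                    keep[labelled == region.label] = True
            return keep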

  6. Fault tolerant computing systems

    International Nuclear Information System (INIS)

    Randell, B.

    1981-01-01

    Fault tolerance involves the provision of strategies for error detection, damage assessment, fault treatment and error recovery. A survey is given of the different sorts of strategies used in highly reliable computing systems, together with an outline of recent research on the problems of providing fault tolerance in parallel and distributed computing systems. (orig.)

  7. Development of computational algorithms for quantification of pulmonary structures; Desenvolvimento de algoritmos computacionais para quantificacao de estruturas pulmonares

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Marcela de; Alvarez, Matheus; Alves, Allan F.F.; Miranda, Jose R.A., E-mail: marceladeoliveira@ig.com.br [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Botucatu, SP (Brazil). Instituto de Biociencias. Departamento de Fisica e Biofisica; Pina, Diana R. [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Botucatu, SP (Brazil). Hospital das Clinicas. Departamento de Doencas Tropicais e Diagnostico por Imagem

    2012-12-15

    High-resolution computed tomography (HRCT) has become the imaging diagnostic exam most commonly used for the evaluation of the sequelae of Paracoccidioidomycosis. Subjective evaluation of the radiological abnormalities found on HRCT images does not provide an accurate quantification. Computer-aided diagnosis systems produce a more objective assessment of the abnormal patterns found in HRCT images. Thus, this research proposes the development of algorithms, in the MATLAB® computing environment, that can semi-automatically quantify pathologies such as pulmonary fibrosis and emphysema. The algorithm consists of selecting a region of interest (ROI) and, through the use of masks, density filters and morphological operators, obtaining a quantification of the ratio of the injured area to the area of healthy lung. The proposed method was tested on ten HRCT scans of patients with confirmed PCM. The results of the semi-automatic measurements were compared with subjective evaluations performed by a specialist in radiology, reaching a coincidence of 80% for emphysema and 58% for fibrosis. (author)

  8. A Semi-Automatic Approach to Construct Vietnamese Ontology from Online Text

    Science.gov (United States)

    Nguyen, Bao-An; Yang, Don-Lin

    2012-01-01

    An ontology is an effective formal representation of knowledge used commonly in artificial intelligence, semantic web, software engineering, and information retrieval. In open and distance learning, ontologies are used as knowledge bases for e-learning supplements, educational recommenders, and question answering systems that support students with…

  9. Comparison of acute and chronic traumatic brain injury using semi-automatic multimodal segmentation of MR volumes.

    Science.gov (United States)

    Irimia, Andrei; Chambers, Micah C; Alger, Jeffry R; Filippou, Maria; Prastawa, Marcel W; Wang, Bo; Hovda, David A; Gerig, Guido; Toga, Arthur W; Kikinis, Ron; Vespa, Paul M; Van Horn, John D

    2011-11-01

    Although neuroimaging is essential for prompt and proper management of traumatic brain injury (TBI), there is a regrettable and acute lack of robust methods for the visualization and assessment of TBI pathophysiology, especially for the purpose of improving clinical outcome metrics. Until now, the application of automatic segmentation algorithms to TBI in a clinical setting has remained an elusive goal because existing methods have, for the most part, been insufficiently robust to faithfully capture TBI-related changes in brain anatomy. This article introduces and illustrates the combined use of multimodal TBI segmentation and time point comparison using 3D Slicer, a widely used software environment whose TBI data processing solutions are openly available. For three representative TBI cases, semi-automatic tissue classification and 3D model generation are performed to enable intra-patient time point comparison of TBI using multimodal volumetrics and clinical atrophy measures. Identification and quantitative assessment of extra- and intra-cortical bleeding, lesions, edema, and diffuse axonal injury are demonstrated. The proposed tools allow cross-correlation of multimodal metrics from structural imaging (e.g., structural volume, atrophy measurements) with clinical outcome variables and other potential factors predictive of recovery. In addition, the workflows described are suitable for TBI clinical practice and patient monitoring, particularly for assessing damage extent and for the measurement of neuroanatomical change over time. With knowledge of general location, extent, and degree of change, such metrics can be associated with clinical measures and subsequently used to suggest viable treatment options.
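
    The volumetric part of such time point comparison reduces to counting labelled voxels; a minimal sketch assuming binary lesion masks and the voxel spacing from the image header (names are illustrative):

        import numpy as np

        def lesion_volume_ml(mask, spacing_mm):
            """Volume of a binary lesion mask in millilitres, given the
            voxel spacing in millimetres (e.g. from the NIfTI header)."""
            return mask.sum() * float(np.prod(spacing_mm)) / 1000.0

        # change between two co-registered time points:
        # delta = lesion_volume_ml(mask_t2, spacing) - lesion_volume_ml(mask_t1, spacing)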

  10. From Laser Scanning to Finite Element Analysis of Complex Buildings by Using a Semi-Automatic Procedure.

    Science.gov (United States)

    Castellazzi, Giovanni; D'Altri, Antonio Maria; Bitelli, Gabriele; Selvaggi, Ilenia; Lambertini, Alessandro

    2015-07-28

    In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects into three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys applied to buildings with irregular geometry, such as historical buildings. The procedure aims at solving the problems connected to the generation of finite element models of these complex structures by constructing a finely discretized geometry in a reduced amount of time, ready to be used in structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made of voxel elements. A comparison analysis with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation.
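
    The core of such a procedure is mapping points to an occupancy grid whose occupied cells become hexahedral (voxel) elements; a minimal numpy sketch under our own naming:

        import numpy as np

        def voxelize(points, voxel_size):
            """Occupancy grid of an (N, 3) point cloud; each occupied cell
            can be exported as one hexahedral (voxel) finite element."""
            mins = points.min(axis=0)
            idx = np.floor((points - mins) / voxel_size).astype(int)
            grid = np.zeros(idx.max(axis=0) + 1, dtype=bool)
            grid[idx[:, 0], idx[:, 1], idx[:, 2]] = True
            return grid, mins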

  11. A SEMI-AUTOMATIC RULE SET BUILDING METHOD FOR URBAN LAND COVER CLASSIFICATION BASED ON MACHINE LEARNING AND HUMAN KNOWLEDGE

    Directory of Open Access Journals (Sweden)

    H. Y. Gu

    2017-09-01

    Full Text Available A classification rule set, comprising features and decision rules, is important for land cover classification. In GEOBIA, the selection of features and decision rules is often based on an iterative trial-and-error approach; however, this is time-consuming and has poor versatility. This study puts forward a rule set building method for land cover classification based on human knowledge and machine learning. Machine learning is used to build rule sets efficiently, overcoming the iterative trial-and-error approach; human knowledge is used to address the insufficient use of prior knowledge by existing machine learning methods and to improve the versatility of the rule sets. A two-step workflow is introduced: first, an initial rule set is built based on Random Forest and a CART decision tree; second, the initial rule set is analyzed and validated based on human knowledge, with statistical confidence intervals used to determine its thresholds. The test site is located in Potsdam City. We utilised the TOP, DSM and ground truth data. The results show that the method can determine a rule set for land cover classification semi-automatically, and that there are stable features for the different land cover classes.
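
    The first step, deriving an initial rule set from a CART tree, can be sketched with scikit-learn, whose export_text prints the learned thresholds as readable rules (the feature names and toy data below are ours, not the study's):

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier, export_text

        # synthetic stand-ins for per-object features and class labels
        rng = np.random.default_rng(0)
        X = rng.random((200, 2))                     # e.g. NDVI, nDSM height
        y = (X[:, 0] > 0.5).astype(int)              # toy "vegetation" label
        tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
        print(export_text(tree, feature_names=["NDVI", "height"]))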

  12. Semi-Automatic Registration of Airborne and Terrestrial Laser Scanning Data Using Building Corner Matching with Boundaries as Reliability Check

    Directory of Open Access Journals (Sweden)

    Liang Cheng

    2013-11-01

    Full Text Available Data registration is a prerequisite for the integration of multi-platform laser scanning in various applications. A new approach is proposed for the semi-automatic registration of airborne and terrestrial laser scanning data of buildings without eaves. Firstly, an automatic calculation procedure for thresholds in the density of projected points (DoPP) method is introduced to extract boundary segments from the terrestrial laser scanning data. A new algorithm, using a self-extending procedure, is developed to recover the extracted boundary segments, which then intersect to form the corners of buildings. The building corners extracted from airborne and terrestrial laser scanning are reliably matched through an automatic iterative process in which boundaries from the two datasets are compared as a reliability check. The experimental results illustrate that the proposed approach provides both high reliability and high geometric accuracy (average error of 0.44 m/0.15 m in the horizontal/vertical direction for corresponding building corners) for the final registration of airborne laser scanning (ALS) and tripod-mounted terrestrial laser scanning (TLS) data.
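
    Given matched corner pairs, the registration itself is a classic least-squares rigid fit; a minimal sketch using the SVD (Kabsch) solution, which is equivalent to the quaternion formulation often used for this step:

        import numpy as np

        def rigid_fit(src, dst):
            """Least-squares rotation R and translation t with
            dst ~ R @ src + t, for matched (N, 3) corner arrays."""
            cs, cd = src.mean(axis=0), dst.mean(axis=0)
            H = (src - cs).T @ (dst - cd)
            U, _, Vt = np.linalg.svd(H)
            d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflection
            R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
            t = cd - R @ cs
            return R, t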

  13. Semi-automatic identification of punching areas for tissue microarray building: the tubular breast cancer pilot study

    Directory of Open Access Journals (Sweden)

    Beltrame Francesco

    2010-11-01

    Full Text Available Abstract Background Tissue MicroArray technology aims to perform immunohistochemical staining on hundreds of different tissue samples simultaneously. It allows faster analysis, considerably reducing the costs incurred in staining. A time-consuming phase of the methodology is the selection of tissue areas within paraffin blocks: no utilities have been developed for the identification of areas to be punched from the donor block and assembled in the recipient block. Results The presented work supports, in the specific case of a primary subtype of breast cancer (tubular breast cancer), the semi-automatic discrimination and localization between normal and pathological regions within the tissues. The diagnosis is performed by analysing specific morphological features of the sample, such as the absence of a double layer of cells around the lumen and the decay of a regular glands-and-lobules structure. These features are analysed using an algorithm which performs the extraction of morphological parameters from images and compares them to experimentally validated threshold values. Results are satisfactory, since in most of the cases the automatic diagnosis matches the response of the pathologists. In particular, on a total of 1296 sub-images showing normal and pathological areas of breast specimens, algorithm accuracy, sensitivity and specificity are 89%, 84% and 94%, respectively. Conclusions The proposed work is a first attempt to demonstrate that automation in the Tissue MicroArray field is feasible and that it can represent an important tool for scientists to cope with this high-throughput technique.

  14. Derivation of groundwater flow-paths based on semi-automatic extraction of lineaments from remote sensing data

    Directory of Open Access Journals (Sweden)

    U. Mallast

    2011-08-01

    Full Text Available In this paper we present a semi-automatic method to infer groundwater flow-paths based on the extraction of lineaments from digital elevation models. This method is especially adequate in remote and inaccessible areas where in-situ data are scarce. The combined method of linear filtering and object-based classification provides a lineament map with a high degree of accuracy. Subsequently, lineaments are differentiated into geological and morphological lineaments using auxiliary information and finally evaluated in terms of hydro-geological significance. Using the example of the western catchment of the Dead Sea (Israel/Palestine), the orientation and location of the differentiated lineaments are compared to characteristics of known structural features. We demonstrate that a strong correlation between lineaments and structural features exists. Using Euclidean distances between lineaments and wells provides an assessment criterion to evaluate the hydraulic significance of detected lineaments. Based on this analysis, we suggest that the statistical analysis of lineaments allows a delineation of flow-paths and thus yields significant information on groundwater movements. To validate the flow-paths we compare them to existing results of groundwater models that are based on well data.

  15. Segmentation of Multi-Isotope Imaging Mass Spectrometry Data for Semi-Automatic Detection of Regions of Interest

    Science.gov (United States)

    Poczatek, J. Collin; Turck, Christoph W.; Lechene, Claude

    2012-01-01

    Multi-isotope imaging mass spectrometry (MIMS) associates secondary ion mass spectrometry (SIMS) with the detection of several atomic masses, the use of stable isotopes as labels, and affiliated quantitative image-analysis software. By associating image and measure, MIMS allows one to obtain quantitative information about biological processes in sub-cellular domains. MIMS can be applied to a wide range of biomedical problems, in particular metabolism and cell fate [1], [2], [3]. In order to obtain morphologically pertinent data from MIMS images, we have to define regions of interest (ROIs). ROIs are drawn by hand, a tedious and time-consuming process. We have developed and successfully applied a support vector machine (SVM) for segmentation of MIMS images that allows fast, semi-automatic boundary detection of regions of interest. Using the SVM, high-quality ROIs (as compared to an expert's manual delineation) were obtained for 2 types of images derived from unrelated data sets. This automation simplifies, accelerates and improves the post-processing analysis of MIMS images. This approach has been integrated into “Open MIMS,” an ImageJ plugin for comprehensive analysis of MIMS images that is available online at http://www.nrims.hms.harvard.edu/NRIMS_ImageJ.php. PMID:22347386
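
    The SVM step can be sketched with scikit-learn: train on pixels from a few hand-drawn example ROIs, then predict a mask for the full image. Per-pixel features would be isotope counts or ratios; this mirrors the idea, not the plugin's implementation:

        import numpy as np
        from sklearn.svm import SVC

        def segment_roi(train_feats, train_labels, image_feats, shape):
            """Train an RBF-kernel SVM on labelled example pixels and
            predict a boolean ROI mask for an image of the given shape."""
            clf = SVC(kernel="rbf", gamma="scale").fit(train_feats, train_labels)
            return clf.predict(image_feats).reshape(shape).astype(bool)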

  16. From Point Clouds to Building Information Models: 3D Semi-Automatic Reconstruction of Indoors of Existing Buildings

    Directory of Open Access Journals (Sweden)

    Hélène Macher

    2017-10-01

    Full Text Available The creation of as-built Building Information Models requires the acquisition of the as-is state of existing buildings. Laser scanners are widely used to achieve this goal, since they permit the collection of information about object geometry in the form of point clouds, and provide a large amount of accurate data in a very fast way and with a high level of detail. Unfortunately, the scan-to-BIM (Building Information Model) process currently remains largely a manual process, which is time-consuming and error-prone. In this paper, a semi-automatic approach is presented for the 3D reconstruction of the indoors of existing buildings from point clouds. Several segmentations are performed so that point clouds corresponding to grounds, ceilings and walls are extracted. Based on these point clouds, the walls and slabs of buildings are reconstructed and described in the IFC format in order to be integrated into BIM software. The assessment of the approach is carried out on two datasets. The evaluation items are the degree of automation, the transferability of the approach and the geometric quality of the results of the 3D reconstruction. Additionally, quality indexes are introduced to inspect the results and detect potential reconstruction errors.

  17. Resolving Carbonate Platform Geometries on the Island of Bonaire, Caribbean Netherlands through Semi-Automatic GPR Facies Classification

    Science.gov (United States)

    Bowling, R. D.; Laya, J. C.; Everett, M. E.

    2018-05-01

    The study of exposed carbonate platforms provides observational constraints on regional tectonics and sea-level history. In this work, Miocene-aged carbonate platform units of the Seroe Domi Formation are investigated on the island of Bonaire in the Southern Caribbean. Ground penetrating radar (GPR) was used to probe near-surface structural geometries associated with these lithologies. The single cross-island transect described herein allowed for continuous mapping of geologic structures on kilometer length scales. Numerical analysis was applied to the data in the form of k-means clustering of structure-parallel vectors derived from image structure tensors. This methodology enables radar facies along the survey transect to be semi-automatically mapped. The results provide subsurface evidence to support previous surficial and outcrop observations, and reveal complex stratigraphy within the platform. From the GPR data analysis, progradational clinoform geometries were observed on the northeast side of the island, which supports the tectonic and depositional trends of the region. Furthermore, several leeward-side radar facies are identified which correlate to environments of deposition conducive to dolomitization via reflux mechanisms.
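
    A minimal sketch of this classification chain, assuming a 2D GPR section as a float array; local orientations come from a smoothed structure tensor and are grouped with k-means (parameter values are illustrative, not the paper's):

        import numpy as np
        from scipy.ndimage import gaussian_filter, sobel
        from sklearn.cluster import KMeans

        def radar_facies(section, sigma=3.0, k=4):
            """Cluster local reflector orientation into k radar facies."""
            gx = sobel(section, axis=1)              # horizontal gradient
            gz = sobel(section, axis=0)              # vertical gradient
            Jxx = gaussian_filter(gx * gx, sigma)
            Jxz = gaussian_filter(gx * gz, sigma)
            Jzz = gaussian_filter(gz * gz, sigma)
            theta = 0.5 * np.arctan2(2 * Jxz, Jxx - Jzz)   # dominant orientation
            # encode on the double angle so 0 and pi coincide
            feats = np.stack([np.cos(2 * theta), np.sin(2 * theta)], axis=-1)
            labels = KMeans(n_clusters=k, n_init=10).fit_predict(feats.reshape(-1, 2))
            return labels.reshape(section.shape)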

  18. Semi-Automatic Modelling of Building Façades with Shape Grammars Using Historic Building Information Modelling

    Science.gov (United States)

    Dore, C.; Murphy, M.

    2013-02-01

    This paper outlines a new approach for generating digital heritage models from laser scan or photogrammetric data using Historic Building Information Modelling (HBIM). HBIM is a plug-in for Building Information Modelling (BIM) software that uses parametric library objects and procedural modelling techniques to automate the modelling stage. The HBIM process involves a reverse engineering solution whereby parametric interactive objects representing architectural elements are mapped onto laser scan or photogrammetric survey data. A library of parametric architectural objects has been designed from historic manuscripts and architectural pattern books. These parametric objects were built using an embedded programming language within the ArchiCAD BIM software called Geometric Description Language (GDL). Procedural modelling techniques have been implemented with the same language to create a parametric building façade which automatically combines library objects based on architectural rules and proportions. Different configurations of the façade are controlled by user parameter adjustment. The automatically positioned elements of the façade can be subsequently refined using graphical editing while overlaying the model with orthographic imagery. Along with this semi-automatic method for generating façade models, manual plotting of library objects can also be used to generate a BIM model from survey data. After the 3D model has been completed conservation documents such as plans, sections, elevations and 3D views can be automatically generated for conservation projects.

  19. Semi-automatic building extraction in informal settlements from high-resolution satellite imagery

    Science.gov (United States)

    Mayunga, Selassie David

    The extraction of man-made features from digital remotely sensed images is considered an important step underpinning the management of human settlements in any country. Man-made features, and buildings in particular, are required for a variety of applications such as urban planning, the creation of geographical information system (GIS) databases and urban city models. Traditional man-made feature extraction methods are expensive in terms of equipment, labour intensive, require well-trained personnel and cannot cope with changing environments, particularly in dense urban settlement areas. This research presents an approach for extracting buildings in dense informal settlement areas using high-resolution satellite imagery. The proposed system uses a novel strategy of extracting a building by measuring a single point at the approximate centre of the building. The fine measurement of the building outlines is then effected using a modified snake model. The original snake model on which this framework is based incorporates an external constraint energy term tailored to preserving the convergence properties of the snake model; applying it to unstructured objects would negatively affect their actual shapes. The external constraint energy term was therefore removed from the original snake model formulation, thereby giving the ability to cope with the high variability of building shapes in informal settlement areas. The proposed building extraction system was tested on two areas with different situations. The first area was Tungi in Dar Es Salaam, Tanzania, where three sites were tested. This area is characterized by informal settlements, which are illegally established within the city boundaries. The second area was Oromocto in New Brunswick, Canada, where two sites were tested. The Oromocto area is mostly flat and the buildings are constructed using similar materials. Qualitative and quantitative measures were employed to evaluate the accuracy of the results as well as the performance

  20. Development of a semi-automatic alignment tool for accelerated localization of the prostate

    International Nuclear Information System (INIS)

    Hua Chiaho; Lovelock, D. Michael; Mageras, Gikas S.; Katz, Matthew S.; Mechalakos, James; Lief, Eugene P.; Hollister, Timothy; Lutz, Wendell R.; Zelefsky, Michael J.; Ling, Clifton C.

    2003-01-01

    Purpose: Delivering a high dose to the prostate with external beam radiation has been shown to improve local tumor control. However, this has to be performed carefully to avoid partially missing the target and delivering an excessive dose to surrounding normal tissues. One way to achieve safe dose escalation is to precisely localize the prostate immediately before daily treatment, so that the radiation can be accurately delivered to the target. Once the prostate position is determined with high confidence, the planning target volume (PTV) safety margin might be reduced to further lower rectal toxicity. A rapid computed tomography (CT)-based online prostate localization method is presented for this purpose. Methods and Materials: Immediately before each treatment session, the patient is immobilized and undergoes a CT scan in the treatment position using a CT scanner situated in the treatment room. At the CT console, the posterior, anterior, left, and right extents of the prostate are manually identified on each axial slice. The translational prostate displacements relative to the planned position are estimated by simultaneously fitting these identified extents from this CT scan to a template created from the finely sliced planning CT scan. A total of 106 serial CT scans from 8 prostate cancer patients, performed immediately before treatments, were used to retrospectively evaluate the precision of this daily prostate targeting method. The three-dimensional displacement of the prostate with respect to its planned position was estimated. Results: Five axial slices from each treatment CT scan were sufficient to produce a reliable correction when compared with prostate center of gravity (CoG) displacements calculated from physician-drawn contours. The differences (mean ± SD) between these two correction schemes in the right-left (R/L), posterior-anterior (P/A), and superior-inferior (S/I) directions are 0.0 ± 0.4 mm, 0.0 ± 0.7 mm, and -0.4 ± 1.9 mm, respectively. With daily CT extent

  1. Computer information systems framework

    International Nuclear Information System (INIS)

    Shahabuddin, S.

    1989-01-01

    Management information systems (MIS) is a commonly used term in the computer profession. The new information technology has led management to expect more from computers. The process of supplying information follows a well-defined procedure. MIS should be capable of providing usable information to the various areas and levels of an organization. MIS is different from data processing. MIS and the business hierarchy provide a good framework for the many organizations which are using computers. (A.B.)

  2. Attacks on computer systems

    Directory of Open Access Journals (Sweden)

    Dejan V. Vuletić

    2012-01-01

    Computer systems are a critical component of human society in the 21st century. The economic sector, defense, security, energy, telecommunications, industrial production, finance and other vital infrastructures depend on computer systems that operate at local, national or global scales. A particular problem is that, owing to the rapid development of ICT and the unstoppable growth of its application in all spheres of human society, the vulnerability and exposure of these systems to very serious potential dangers increase. This paper analyzes some typical attacks on computer systems.

  3. Rapid, semi-automatic fracture and contact mapping for point clouds, images and geophysical data

    Science.gov (United States)

    Thiele, Samuel T.; Grose, Lachlan; Samsu, Anindita; Micklethwaite, Steven; Vollgger, Stefan A.; Cruden, Alexander R.

    2017-12-01

    The advent of large digital datasets from unmanned aerial vehicle (UAV) and satellite platforms now challenges our ability to extract information across multiple scales in a timely manner, often meaning that the full value of the data is not realised. Here we adapt a least-cost-path solver and specially tailored cost functions to rapidly interpolate structural features between manually defined control points in point cloud and raster datasets. We implement the method in the geographic information system QGIS and the point cloud and mesh processing software CloudCompare. Using these implementations, the method can be applied to a variety of three-dimensional (3-D) and two-dimensional (2-D) datasets, including high-resolution aerial imagery, digital outcrop models, digital elevation models (DEMs) and geophysical grids. We demonstrate the algorithm with four diverse applications in which we extract (1) joint and contact patterns in high-resolution orthophotographs, (2) fracture patterns in a dense 3-D point cloud, (3) earthquake surface ruptures of the Greendale Fault associated with the Mw7.1 Darfield earthquake (New Zealand) from high-resolution light detection and ranging (lidar) data, and (4) oceanic fracture zones from bathymetric data of the North Atlantic. The approach improves the consistency of the interpretation process while retaining expert guidance and achieves significant improvements (35-65 %) in digitisation time compared to traditional methods. Furthermore, it opens up new possibilities for data synthesis and can quantify the agreement between datasets and an interpretation.
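
    At the core of such a tool is a least-cost-path solver over a raster of costs. The fragment below is a minimal Dijkstra implementation of that idea, assuming a 4-connected grid and a simple mean-of-endpoints step cost; the published tool uses specially tailored cost functions inside QGIS and CloudCompare.

```python
# Minimal least-cost-path solver over a 2-D cost raster (Dijkstra).
import heapq
import numpy as np

def least_cost_path(cost, start, goal):
    """Return the list of (row, col) cells linking start to goal."""
    rows, cols = cost.shape
    dist = np.full(cost.shape, np.inf)
    prev = {}
    dist[start] = 0.0
    heap = [(0.0, start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            break
        if d > dist[r, c]:
            continue                      # stale heap entry
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + 0.5 * (cost[r, c] + cost[nr, nc])
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(heap, (nd, (nr, nc)))
    path, node = [goal], goal
    while node != start:                  # walk predecessors back to start
        node = prev[node]
        path.append(node)
    return path[::-1]

# e.g. cost kept low along dark lineaments: cost = 1.0 + normalised_intensity
```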

  4. Petascale Computational Systems

    OpenAIRE

    Bell, Gordon; Gray, Jim; Szalay, Alex

    2007-01-01

    Computational science is changing to be data intensive. Supercomputers must be balanced systems, not just CPU farms but also petascale IO and networking arrays. Anyone building CyberInfrastructure should allocate resources to support a balanced Tier-1 through Tier-3 design.

  5. SEMI-AUTOMATIC CO-REGISTRATION OF PHOTOGRAMMETRIC AND LIDAR DATA USING BUILDINGS

    Directory of Open Access Journals (Sweden)

    C. Armenakis

    2012-07-01

    In this work, the co-registration steps between LiDAR and photogrammetric DSM 3D data are analyzed and a solution based on automated plane matching is proposed and implemented. For a robust 3D geometric transformation, both planes and points are used. Initially, planes are chosen as the co-registration primitives. To confine the search space for the plane matching, a sequential automatic building matching is performed first. For matching buildings from the LiDAR and the photogrammetric data, a similarity objective function is formed based on the roof height difference (RHD), the 3D histogram of the building attributes, and the boundary area of a building. A region-growing algorithm based on a Triangulated Irregular Network (TIN) is implemented to extract planes from both datasets. Next, an automatic successive process for identifying and matching corresponding planes from the two datasets has been developed and implemented. It is based on the building boundary region and determines plane pairs through a robust matching process, thus eliminating outlier pairs. The selected correct plane pairs are the input data for the geometric transformation process. The 3D conformal transformation method, in conjunction with the attitude quaternion, is applied to obtain the transformation parameters using the normal vectors of the corresponding plane pairs. Following the mapping of one dataset onto the coordinate system of the other, the Iterative Closest Point (ICP) algorithm is applied, using the corresponding building point clouds to further refine the transformation solution. The results indicate that the combination of planes and points improves the co-registration outcomes.
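
    The rotation-estimation step can be sketched in a few lines. The paper solves the 3D conformal transformation with an attitude quaternion; an equivalent rotation between matched plane-normal pairs can be recovered with the SVD-based Kabsch method, as the purely illustrative fragment below (synthetic normals) shows.

```python
# Sketch: rotation between matched plane normals via the Kabsch method.
import numpy as np

def rotation_from_normals(n_src, n_dst):
    """n_src, n_dst: (k, 3) arrays of corresponding plane normals."""
    H = n_src.T @ n_dst                       # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    return Vt.T @ D @ U.T                     # proper rotation, det = +1

# hypothetical matched roof-plane normals (LiDAR vs photogrammetric DSM)
src = np.array([[0, 0, 1], [0.71, 0, 0.71], [0, 0.71, 0.71]], float)
a = np.deg2rad(2.0)                           # simulate a 2-degree misalignment
Rz = np.array([[np.cos(a), -np.sin(a), 0],
               [np.sin(a),  np.cos(a), 0],
               [0, 0, 1]])
dst = src @ Rz.T
print(np.allclose(rotation_from_normals(src, dst), Rz))  # True
```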

  6. Semi-Automatic Classification Of Histopathological Images: Dealing With Inter-Slide Variations

    Directory of Open Access Journals (Sweden)

    Michael Gadermayr

    2016-06-01

    In the case of 50 available labelled sample patches from a certain whole slide image, the overall classification rate increased from 92 % to 98 % when the interactive labelling step was included. Even with only 20 labelled patches, accuracy already increased to 97 %. Without a pre-trained model, if training is performed on target domain data only, 88 % (20 labelled samples) and 95 % (50 labelled samples) accuracy, respectively, were obtained. If enough target domain data were available (about 20 images), the amount of source domain data was of minor relevance. The difference in outcome between a source domain training data set containing 100 patches from one whole slide image and a set containing 700 patches from seven images was lower than 1 %. Contrarily, without target domain data, the difference in accuracy between these two settings was 10 % (82 % compared to 92 %). The execution runtime between two interaction steps is significantly below one second (0.23 s), which is an important usability criterion. It proved to be beneficial to select specific target domain data in an active learning sense, based on the currently available trained model. While the experimental evaluation provided strong empirical evidence for increased classification performance with the proposed method, the additional manual effort can be kept at a low level. The labelling of e.g. 20 images per slide is surely less time consuming than the validation of a complete whole slide image processed with a fully automatic, but less reliable, segmentation approach. Finally, it should be highlighted that the proposed interaction protocol could easily be adapted to other histopathological classification or segmentation tasks, also for implementation in a clinical system.
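
    The interactive labelling step is, at its heart, an active-learning query loop. The following sketch (assumed features and classifier, not the paper's actual pipeline) pre-trains on source-domain patches and selects the least-confident target-domain patches for manual labelling.

```python
# Sketch of an active-learning query step for histopathology patches.
# Feature vectors and the classifier are assumptions for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X_src = rng.normal(size=(700, 64))            # source-domain patch features
y_src = rng.integers(0, 2, 700)               # their labels
X_tgt = rng.normal(size=(500, 64))            # patches from a new slide

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_src, y_src)

proba = clf.predict_proba(X_tgt)
uncertainty = 1.0 - proba.max(axis=1)         # least-confidence criterion
query = np.argsort(uncertainty)[-20:]         # ~20 patches to label by hand
print("patches to present for interactive labelling:", query)
# afterwards: retrain on source data plus the newly labelled target patches
```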

  7. Resilient computer system design

    CERN Document Server

    Castano, Victor

    2015-01-01

    This book presents a paradigm for designing new-generation resilient and evolving computer systems, including their key concepts, elements of supportive theory, methods of analysis and synthesis of ICT with new properties of evolving functioning, as well as implementation schemes and their prototyping. The book explains why new ICT applications require a complete redesign of computer systems to address challenges of extreme reliability, high performance, and power efficiency. The authors present a comprehensive treatment for designing the next generation of computers, especially addressing safety-critical, autonomous, real-time, military, banking, and wearable health-care systems. § Describes design solutions for a new computer system - an evolving reconfigurable architecture (ERA) that is free from drawbacks inherent in current ICT and related engineering models § Pursues simplicity, reliability, and scalability principles of design implemented through redundancy and reconfigurability; targeted for energy-,...

  8. New computer systems

    International Nuclear Information System (INIS)

    Faerber, G.

    1975-01-01

    Process computers have already become indispensable technical aids for monitoring and automation tasks in nuclear power stations. Yet there are still some problems connected with their use whose elimination should be the main objective in the development of new computer systems. In the paper, some of these problems are summarized, new tendencies in hardware development are outlined, and finally some new system concepts made possible by the hardware development are explained. (orig./AK) [de

  9. A Pharmacy Computer System

    OpenAIRE

    Claudia CIULCA-VLADAIA; Călin MUNTEAN

    2009-01-01

    Objective: To describe an evaluation model, seen from a customer's point of view, for the pharmacy computer system currently needed. Data Sources: literature research, ATTOFARM, WINFARM P.N.S., NETFARM, Info World - PHARMACY MANAGER and HIPOCRATE FARMACIE. Study Selection: Five pharmacy computer systems were selected due to their high rates of implementation at a national level. We used the new criteria recommended by the EUROREC Institute in EHR that modify the model of data exchanges between the E...

  10. SplitRacer - a semi-automatic tool for the analysis and interpretation of teleseismic shear-wave splitting

    Science.gov (United States)

    Reiss, Miriam Christina; Rümpker, Georg

    2017-04-01

    We present a semi-automatic, graphical user interface tool for the analysis and interpretation of teleseismic shear-wave splitting in MATLAB. Shear-wave splitting analysis is a standard tool to infer seismic anisotropy, which is often interpreted as due to lattice-preferred orientation of e.g. mantle minerals or shape-preferred orientation caused by cracks or alternating layers in the lithosphere, and hence provides a direct link to the earth's kinematic processes. The increasing number of permanent stations and temporary experiments results in comprehensive studies of seismic anisotropy world-wide. Their successive comparison with a growing number of global models of mantle flow further advances our understanding of the earth's interior. However, increasingly large data sets pose the inevitable question as to how to process them. Well-established routines and programs are accurate but often slow and impractical for analyzing a large amount of data. Additionally, shear-wave splitting results are seldom evaluated using the same quality criteria, which complicates a straightforward comparison. SplitRacer consists of several processing steps: i) download of data per FDSNWS; ii) direct reading of miniSEED files and an initial screening and categorizing of XKS waveforms using a pre-set SNR threshold; iii) analysis of the particle motion of selected phases and successive correction of the sensor misalignment based on the long axis of the particle motion; iv) splitting analysis of selected events: seismograms are first rotated into radial and transverse components, then the energy-minimization method is applied, which provides the polarization and delay time of the phase; to estimate errors, the analysis is done for different randomly chosen time windows; v) joint splitting analysis for all events for one station, where the energy content of all phases is inverted simultaneously. This allows a decrease in the influence of noise and an increase in the robustness of the measurement.
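
    Step iv), the energy-minimization method, amounts to a grid search over the fast-axis angle and the delay time. The toy implementation below (synthetic data, wrap-around time shift and assumed sign conventions) conveys the idea.

```python
# Toy grid search for shear-wave splitting parameters (phi, dt) by
# minimizing transverse energy after correction. Illustrative only.
import numpy as np

def splitting_search(north, east, max_lag, baz_deg):
    best = (np.inf, None, None)
    alpha = np.deg2rad(baz_deg)
    for phi in np.deg2rad(np.arange(0, 180, 1.0)):
        c, s = np.cos(phi), np.sin(phi)
        fast = c * north + s * east           # rotate into trial fast/slow
        slow = -s * north + c * east
        for lag in range(max_lag):
            slow_c = np.roll(slow, -lag)      # advance slow component (wraps)
            n = c * fast - s * slow_c         # rotate back to N/E
            e = s * fast + c * slow_c
            t = -np.sin(alpha) * n + np.cos(alpha) * e  # transverse trace
            energy = np.sum(t ** 2)
            if energy < best[0]:
                best = (energy, np.rad2deg(phi), lag)
    return best   # (min transverse energy, fast axis [deg], delay [samples])

rng = np.random.default_rng(1)
print(splitting_search(rng.normal(size=512), rng.normal(size=512), 20, 60.0))
```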

  11. Validation of a semi-automatic protocol for the assessment of the tear meniscus central area based on open-source software

    Science.gov (United States)

    Pena-Verdeal, Hugo; Garcia-Resua, Carlos; Yebra-Pimentel, Eva; Giraldez, Maria J.

    2017-08-01

    Purpose: Different lower tear meniscus parameters can be clinically assessed in dry eye diagnosis. The aim of this study was to propose and analyse the variability of a semi-automatic method for measuring the lower tear meniscus central area (TMCA) using open-source software. Material and methods: In a group of 105 subjects, one video of the lower tear meniscus after fluorescein instillation was recorded by a digital camera attached to a slit lamp. A short light beam (3x5 mm) with moderate illumination in the central portion of the meniscus (6 o'clock) was used. Images were extracted from each video by a masked observer. Using open-source software based on Java (NIH ImageJ), a further observer measured, in a masked and randomized order, the TMCA in the short-light-beam illuminated area by two methods: (1) a manual method, where the TMCA in the images was measured by hand; (2) a semi-automatic method, where the TMCA images were converted to 8-bit binary images, holes inside the resulting shape were filled, and the area of the isolated shape was obtained. Finally, the manual and semi-automatic measurements were compared. Results: A paired t-test showed no statistical difference between the results of the two techniques (p = 0.102). Pearson correlation between the techniques showed a significant, near-perfect positive correlation (r = 0.99). Conclusions: This study presented a useful tool to objectively measure the frontal central area of the meniscus in photographs with free open-source software.
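
    The semi-automatic chain (threshold, binarize, fill holes, measure the isolated area) can be reproduced with free Python tools in a few lines; the sketch below mirrors the ImageJ steps under an assumed file name and pixel calibration.

```python
# Sketch of the TMCA measurement: binarize, fill holes, measure the area.
import numpy as np
from scipy import ndimage
from skimage import io, color, filters, measure

img = color.rgb2gray(io.imread("meniscus_frame.png"))   # hypothetical RGB frame
binary = img > filters.threshold_otsu(img)              # fluorescein is bright
binary = ndimage.binary_fill_holes(binary)              # fill holes in the shape

labels = measure.label(binary)
region = max(measure.regionprops(labels), key=lambda p: p.area)  # largest blob
mm_per_px = 0.01                                        # assumed calibration
print("TMCA =", region.area * mm_per_px ** 2, "mm^2")
```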

  12. Computer network defense system

    Science.gov (United States)

    Urias, Vincent; Stout, William M. S.; Loverro, Caleb

    2017-08-22

    A method and apparatus for protecting virtual machines. A computer system creates a copy of a group of the virtual machines in an operating network in a deception network to form a group of cloned virtual machines in the deception network when the group of the virtual machines is accessed by an adversary. The computer system creates an emulation of components from the operating network in the deception network. The components are accessible by the group of the cloned virtual machines as if the group of the cloned virtual machines was in the operating network. The computer system moves network connections for the group of the virtual machines in the operating network used by the adversary from the group of the virtual machines in the operating network to the group of the cloned virtual machines, enabling protecting the group of the virtual machines from actions performed by the adversary.

  13. Computer system operation

    International Nuclear Information System (INIS)

    Lee, Young Jae; Lee, Hae Cho; Lee, Ho Yeun; Kim, Young Taek; Lee, Sung Kyu; Park, Jeong Suk; Nam, Ji Wha; Kim, Soon Kon; Yang, Sung Un; Sohn, Jae Min; Moon, Soon Sung; Park, Bong Sik; Lee, Byung Heon; Park, Sun Hee; Kim, Jin Hee; Hwang, Hyeoi Sun; Lee, Hee Ja; Hwang, In A.

    1993-12-01

    The report describes the operation and trouble shooting of the main computer and KAERINet. The results of the project are as follows: 1. The operation and trouble shooting of the main computer system (Cyber 170-875, Cyber 960-31, VAX 6320, VAX 11/780). 2. The operation and trouble shooting of KAERINet (PC to host connection, host to host connection, file transfer, electronic mail, X.25, CATV etc.). 3. The development of applications - Electronic Document Approval and Delivery System, installation of the ORACLE Utility Program. 22 tabs., 12 figs. (Author)

  14. Computer system operation

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young Jae; Lee, Hae Cho; Lee, Ho Yeun; Kim, Young Taek; Lee, Sung Kyu; Park, Jeong Suk; Nam, Ji Wha; Kim, Soon Kon; Yang, Sung Un; Sohn, Jae Min; Moon, Soon Sung; Park, Bong Sik; Lee, Byung Heon; Park, Sun Hee; Kim, Jin Hee; Hwang, Hyeoi Sun; Lee, Hee Ja; Hwang, In A [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1993-12-01

    The report describes the operation and trouble shooting of the main computer and KAERINet. The results of the project are as follows: 1. The operation and trouble shooting of the main computer system (Cyber 170-875, Cyber 960-31, VAX 6320, VAX 11/780). 2. The operation and trouble shooting of KAERINet (PC to host connection, host to host connection, file transfer, electronic mail, X.25, CATV etc.). 3. The development of applications - Electronic Document Approval and Delivery System, installation of the ORACLE Utility Program. 22 tabs., 12 figs. (Author)

  15. SplitRacer - a new Semi-Automatic Tool to Quantify And Interpret Teleseismic Shear-Wave Splitting

    Science.gov (United States)

    Reiss, M. C.; Rumpker, G.

    2017-12-01

    We have developed a semi-automatic, MATLAB-based GUI that combines standard seismological tasks for the analysis and interpretation of teleseismic shear-wave splitting. Shear-wave splitting analysis is widely used to infer seismic anisotropy, which can be interpreted in terms of lattice-preferred orientation of mantle minerals or shape-preferred orientation caused by fluid-filled cracks or alternating layers. Seismic anisotropy provides a unique link between directly observable surface structures and the more elusive dynamic processes in the mantle below. Thus, resolving the seismic anisotropy of the lithosphere/asthenosphere is of particular importance for geodynamic modeling and interpretation. The increasing number of seismic stations from temporary experiments and permanent installations creates a new basis for comprehensive studies of seismic anisotropy world-wide. However, the increasingly large data sets pose new challenges for the rapid and reliable analysis of teleseismic waveforms and for the interpretation of the measurements. Well-established routines and programs are available but are often impractical for analyzing large data sets from hundreds of stations. Additionally, shear-wave splitting results are seldom evaluated using the same well-defined quality criteria, which may complicate comparison with results from different studies. SplitRacer has been designed to overcome these challenges by incorporating the following processing steps: i) downloading of waveform data from multiple stations in mseed format using FDSNWS tools; ii) automated initial screening and categorizing of XKS waveforms using a pre-set SNR threshold; iii) particle-motion analysis of selected phases at longer periods to detect and correct for sensor misalignment; iv) splitting analysis of selected phases based on transverse-energy minimization for multiple, randomly selected, relevant time windows; v) one- and two-layer joint splitting analysis for all phases at one station by

  16. Semi-automatic measures of activity in selected south polar regions of Mars using morphological image analysis

    Science.gov (United States)

    Aye, Klaus-Michael; Portyankina, Ganna; Pommerol, Antoine; Thomas, Nicolas

    results of these semi-automatically determined seasonal fan count evolutions for the Inca City, Ithaca and Manhattan ROIs, compare these evolutionary patterns with each other and with surface reflectance evolutions from both HiRISE and CRISM for the same locations. References: Aye, K.-M. et al. (2010), LPSC 2010, 2707; Hansen, C. et al. (2010), Icarus, 205, Issue 1, p. 283-295; Kieffer, H.H. (2007), JGR 112; Portyankina, G. et al. (2010), Icarus, 205, Issue 1, p. 311-320; Thomas, N. et al. (2009), Vol. 4, EPSC2009-478

  17. Belle computing system

    International Nuclear Information System (INIS)

    Adachi, Ichiro; Hibino, Taisuke; Hinz, Luc; Itoh, Ryosuke; Katayama, Nobu; Nishida, Shohei; Ronga, Frederic; Tsukamoto, Toshifumi; Yokoyama, Masahiko

    2004-01-01

    We describe the present status of the computing system of the Belle experiment at the KEKB e+e- asymmetric-energy collider. So far, we have logged more than 160 fb-1 of data, corresponding to the world's largest data sample of 170M BB-bar pairs in the Υ(4S) energy region. A large amount of event data has to be processed to produce analysis event samples in a timely fashion, and Monte Carlo events have to be created to control systematic errors accurately. This requires stable and efficient usage of computing resources. Here, we review our computing model and then describe how we efficiently carry out DST/MC production in our system

  18. Computer Based Expert Systems.

    Science.gov (United States)

    Parry, James D.; Ferrara, Joseph M.

    1985-01-01

    Claims knowledge-based expert computer systems can meet needs of rural schools for affordable expert advice and support and will play an important role in the future of rural education. Describes potential applications in prediction, interpretation, diagnosis, remediation, planning, monitoring, and instruction. (NEC)

  19. Mining Department computer systems

    Energy Technology Data Exchange (ETDEWEB)

    1979-09-01

    Describes the main computer systems currently available, or being developed by the Mining Department of the UK National Coal Board. They are primarily for the use of mining and specialist engineers, but some of them have wider applications, particularly in the research and development and management statistics fields.

  20. DYMAC computer system

    International Nuclear Information System (INIS)

    Hagen, J.; Ford, R.F.

    1979-01-01

    The DYnamic Materials ACcountability program (DYMAC) has been monitoring nuclear material at the Los Alamos Scientific Laboratory plutonium processing facility since January 1978. This paper presents DYMAC's features and philosophy, especially as reflected in its computer system design. Early decisions and tradeoffs are evaluated through the benefit of a year's operating experience

  1. Production optimization of {sup 99}Mo/{sup 99m}Tc zirconium molybate gel generators at semi-automatic device: DISIGEG

    Energy Technology Data Exchange (ETDEWEB)

    Monroy-Guzman, F., E-mail: fabiola.monroy@inin.gob.mx [Instituto Nacional de Investigaciones Nucleares, Carretera Mexico-Toluca S/N, La Marquesa, Ocoyoacac, 52750, Estado de Mexico (Mexico); Rivero Gutierrez, T., E-mail: tonatiuh.rivero@inin.gob.mx [Instituto Nacional de Investigaciones Nucleares, Carretera Mexico-Toluca S/N, La Marquesa, Ocoyoacac, 52750, Estado de Mexico (Mexico); Lopez Malpica, I.Z.; Hernandez Cortes, S.; Rojas Nava, P.; Vazquez Maldonado, J.C. [Instituto Nacional de Investigaciones Nucleares, Carretera Mexico-Toluca S/N, La Marquesa, Ocoyoacac, 52750, Estado de Mexico (Mexico); Vazquez, A. [Instituto Mexicano del Petroleo, Eje Central Norte Lazaro Cardenas 152, Col. San Bartolo Atepehuacan, 07730, Mexico D.F. (Mexico)

    2012-01-15

    DISIGEG is a synthesis installation for zirconium {sup 99}Mo-molybdate gels for {sup 99}Mo/{sup 99m}Tc generator production, which has been designed, built and installed at the ININ. The device consists of a synthesis reactor and five systems controlled via keyboard: (1) raw material access, (2) chemical air stirring, (3) gel drying by air and infrared heating, (4) moisture removal and (5) gel extraction. DISIGEG operation is described, and the effects of the drying conditions of zirconium {sup 99}Mo-molybdate gels on {sup 99}Mo/{sup 99m}Tc generator performance were evaluated, as well as some physical-chemical properties of these gels. The results reveal that the temperature, time and air flow applied during the drying process directly affect zirconium {sup 99}Mo-molybdate gel generator performance. All gels prepared have a similar chemical structure, probably constituted by a three-dimensional network based on zirconium pentagonal bipyramids and molybdenum octahedra. Basic structural variations cause a change in gel porosity and permeability, favouring or inhibiting {sup 99m}TcO{sub 4}{sup -} diffusion into the matrix. The {sup 99m}TcO{sub 4}{sup -} eluates produced by {sup 99}Mo/{sup 99m}Tc zirconium {sup 99}Mo-molybdate gel generators prepared in DISIGEG, air dried at 80 °C for 5 h using an air flow of 90 mm, satisfied all the Pharmacopoeia regulations: {sup 99m}Tc yield between 70-75%, {sup 99}Mo breakthrough less than 3 × 10{sup -3}%, radiochemical purities about 97%, and sterile and pyrogen-free eluates with a pH of 6. - Highlights: ► {sup 99}Mo/{sup 99m}Tc generators based on {sup 99}Mo-molybdate gels were synthesized in a semi-automatic device. ► Generator performance depends on the synthesis conditions of the zirconium {sup 99}Mo-molybdate gel. ► {sup 99m}TcO{sub 4}{sup -} diffusion and yield in the generator depend on gel porosity and permeability.

  2. Morphometric synaptology of a whole neuron profile using a semiautomatic interactive computer system.

    Science.gov (United States)

    Saito, K; Niki, K

    1983-07-01

    We propose a new method of dealing with morphometric synaptology that processes all synapses and boutons around the HRP marked neuron on a large composite electron micrograph, rather than a qualitative or a piecemeal quantitative study of a particular synapse and/or bouton that is not positioned on the surface of the neuron. This approach requires the development of both neuroanatomical procedures, by which a specific whole neuronal profile is identified, and valuable specialized tools, which support the collection and analysis of a great volume of morphometric data from composite electron micrographs, in order to reduce the burden of the morphologist. The present report is also concerned with the total and reliable semi-automatic interactive computer system for gathering and analyzing morphometric data that has been under development in our laboratory. A morphologist performs the pattern recognition portion by using a large-sized tablet digitizer and a menu-sheet command, and the system registers the various morphometric values of many different neurons and performs statistical analysis. Some examples of morphometric measurements and analysis show the usefulness and efficiency of the proposed system and method.

  3. Ubiquitous Computing Systems

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind; Friday, Adrian

    2009-01-01

    ...While such growth is positive, the newest generation of ubicomp practitioners and researchers, isolated to specific tasks, are in danger of losing their sense of history and the broader perspective that has been so essential to the field's creativity and brilliance. Under the guidance of John Krumm, the topics covered include: ...applications; privacy protection in systems that connect personal devices and personal information; moving from the graphical to the ubiquitous computing user interface; and techniques that are revolutionizing the way we determine a person's location and understand other sensor measurements. While we needn't become...

  4. The Computational Sensorimotor Systems Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The Computational Sensorimotor Systems Lab focuses on the exploration, analysis, modeling and implementation of biological sensorimotor systems for both scientific...

  5. Fusion of dynamic contrast-enhanced magnetic resonance mammography at 3.0 T with X-ray mammograms: Pilot study evaluation using dedicated semi-automatic registration software

    Energy Technology Data Exchange (ETDEWEB)

    Dietzel, Matthias, E-mail: dietzelmatthias2@hotmail.com [Institute of Diagnostic and Interventional Radiology, Friedrich-Schiller-University Jena, Erlanger Allee 101, D-07740 Jena (Germany); Hopp, Torsten; Ruiter, Nicole [Karlsruhe Institute of Technology (KIT), Institute for Data Processing and Electronics, Postfach 3640, D-76021 Karlsruhe (Germany); Zoubi, Ramy [Institute of Diagnostic and Interventional Radiology, Friedrich-Schiller-University Jena, Erlanger Allee 101, D-07740 Jena (Germany); Runnebaum, Ingo B. [Clinic of Gynecology and Obstetrics, Friedrich-Schiller-University Jena, Bachstrasse 18, D-07743 Jena (Germany); Kaiser, Werner A. [Institute of Diagnostic and Interventional Radiology, Friedrich-Schiller-University Jena, Erlanger Allee 101, D-07740 Jena (Germany); Medical School, University of Harvard, 25 Shattuck Street, Boston, MA 02115 (United States); Baltzer, Pascal A.T. [Institute of Diagnostic and Interventional Radiology, Friedrich-Schiller-University Jena, Erlanger Allee 101, D-07740 Jena (Germany)

    2011-08-15

    Rationale and objectives: To evaluate the semi-automatic image registration accuracy of X-ray mammography (XR-M) with high-resolution high-field (3.0 T) MR-mammography (MR-M) in an initial pilot study. Material and methods: MR-M was acquired on a high-field clinical scanner at 3.0 T (T1-weighted 3D VIBE ± Gd). XR-M was obtained with state-of-the-art full-field digital systems. Seven patients with clearly delineable mass lesions >10 mm both in XR-M and MR-M were enrolled (exclusion criteria: previous breast surgery; surgical intervention between XR-M and MR-M). XR-M and MR-M were matched using a dedicated image-registration algorithm allowing semi-automatic non-linear deformation of MR-M based on finite-element modeling. To identify registration errors (RE), a virtual craniocaudal 2D mammogram was calculated by the software from MR-M (with and w/o Gadodiamide/Gd) and matched with the corresponding XR-M. To quantify REs, the geometric centers of the lesions in the virtual vs. conventional mammogram were subtracted. The robustness of registration was quantified by registration of XR-Ms to both MR-Ms, with and w/o Gadodiamide. Results: Image registration was performed successfully for all patients. Overall RE was 8.2 mm (1 min after Gd; confidence interval/CI: 2.0-14.4 mm, standard deviation/SD: 6.7 mm) vs. 8.9 mm (no Gd; CI: 4.0-13.9 mm, SD: 5.4 mm). The mean difference between pre- vs. post-contrast was 0.7 mm (SD: 1.9 mm). Conclusion: Image registration of high-field 3.0 T MR-mammography with X-ray mammography is feasible. For this study, applying a high-resolution protocol at 3.0 T, the registration was robust and the overall registration error was sufficient for clinical application.

  6. Fusion of dynamic contrast-enhanced magnetic resonance mammography at 3.0 T with X-ray mammograms: Pilot study evaluation using dedicated semi-automatic registration software

    International Nuclear Information System (INIS)

    Dietzel, Matthias; Hopp, Torsten; Ruiter, Nicole; Zoubi, Ramy; Runnebaum, Ingo B.; Kaiser, Werner A.; Baltzer, Pascal A.T.

    2011-01-01

    Rationale and objectives: To evaluate the semi-automatic image registration accuracy of X-ray mammography (XR-M) with high-resolution high-field (3.0 T) MR-mammography (MR-M) in an initial pilot study. Material and methods: MR-M was acquired on a high-field clinical scanner at 3.0 T (T1-weighted 3D VIBE ± Gd). XR-M was obtained with state-of-the-art full-field digital systems. Seven patients with clearly delineable mass lesions >10 mm both in XR-M and MR-M were enrolled (exclusion criteria: previous breast surgery; surgical intervention between XR-M and MR-M). XR-M and MR-M were matched using a dedicated image-registration algorithm allowing semi-automatic non-linear deformation of MR-M based on finite-element modeling. To identify registration errors (RE), a virtual craniocaudal 2D mammogram was calculated by the software from MR-M (with and w/o Gadodiamide/Gd) and matched with the corresponding XR-M. To quantify REs, the geometric centers of the lesions in the virtual vs. conventional mammogram were subtracted. The robustness of registration was quantified by registration of XR-Ms to both MR-Ms, with and w/o Gadodiamide. Results: Image registration was performed successfully for all patients. Overall RE was 8.2 mm (1 min after Gd; confidence interval/CI: 2.0-14.4 mm, standard deviation/SD: 6.7 mm) vs. 8.9 mm (no Gd; CI: 4.0-13.9 mm, SD: 5.4 mm). The mean difference between pre- vs. post-contrast was 0.7 mm (SD: 1.9 mm). Conclusion: Image registration of high-field 3.0 T MR-mammography with X-ray mammography is feasible. For this study, applying a high-resolution protocol at 3.0 T, the registration was robust and the overall registration error was sufficient for clinical application.

  7. Semi-automatic segmentation of gated blood pool emission tomographic images by watersheds: application to the determination of right and left ejection fractions

    International Nuclear Information System (INIS)

    Mariano-Goulart, D.; Collet, H.; Kotzki, P.-O.; Zanca, M.; Rossi, M.

    1998-01-01

    Tomographic multi-gated blood pool scintigraphy (TMUGA) is a widely available method which permits simultaneous assessment of right and left ventricular ejection fractions. However, the widespread clinical use of this technique is impeded by the lack of segmentation methods dedicated to an automatic analysis of ventricular activities. In this study we evaluated how a watershed algorithm succeeds in providing semi-automatic segmentation of ventricular activities in order to measure right and left ejection fractions by TMUGA. The left ejection fractions of 30 patients were evaluated both with TMUGA and with planar multi-gated blood pool scintigraphy (PMUGA). Likewise, the right ejection fractions of 25 patients were evaluated with first-pass scintigraphy (FP) and with TMUGA. The watershed algorithm was applied to the reconstructed slices in order to group together the voxels whose activity came from one specific cardiac cavity. First, the results of the watershed algorithm were compared with manual drawing around the left and right ventricles. Left ejection fractions evaluated by TMUGA with the watershed procedure were not significantly different (p = 0.30) from those based on manual outlines, whereas a small but significant difference was found for right ejection fractions (p = 0.004). Then, right and left ejection fractions evaluated by TMUGA (with the semi-automatic segmentation procedure) were compared with the results obtained by FP or PMUGA. Left ventricular ejection fractions evaluated by TMUGA showed an excellent correlation with those evaluated by PMUGA (r = 0.93; SEE = 5.93%; slope = 0.99; intercept = 4.17%). The measurements of these ejection fractions were significantly higher with TMUGA than with PMUGA (p < 0.01). The interoperator variability for the measurement of left ejection fractions by TMUGA was 4.6%. Right ventricular ejection fractions evaluated by TMUGA showed a good correlation with those evaluated by FP (r = 0.81; SEE = 6.68%; slope = 1.00; intercept = 0.85%) and were not
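
    The essential computation is a marker-controlled watershed segmentation of the activity volume followed by a count-based ejection fraction, EF = (ED - ES) / ED. A minimal sketch of that idea follows; the seeded markers and the background threshold are assumptions.

```python
# Sketch: watershed separation of ventricular activity and count-based EF.
import numpy as np
from skimage.segmentation import watershed

def ventricle_counts(volume, lv_seed, rv_seed):
    """volume: 3-D activity array; seeds: index tuples inside each ventricle."""
    markers = np.zeros_like(volume, dtype=int)
    markers[volume < 0.1 * volume.max()] = 3   # assumed background marker
    markers[lv_seed] = 1
    markers[rv_seed] = 2
    labels = watershed(-volume, markers)       # flood the activity "relief"
    return volume[labels == 1].sum(), volume[labels == 2].sum()

def ejection_fraction(ed_counts, es_counts):
    """Count-based ejection fraction: EF = (ED - ES) / ED."""
    return (ed_counts - es_counts) / ed_counts
```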

  8. Computer systems a programmer's perspective

    CERN Document Server

    Bryant, Randal E

    2016-01-01

    Computer systems: A Programmer’s Perspective explains the underlying elements common among all computer systems and how they affect general application performance. Written from the programmer’s perspective, this book strives to teach readers how understanding basic elements of computer systems and executing real practice can lead them to create better programs. Spanning across computer science themes such as hardware architecture, the operating system, and systems software, the Third Edition serves as a comprehensive introduction to programming. This book strives to create programmers who understand all elements of computer systems and will be able to engage in any application of the field--from fixing faulty software, to writing more capable programs, to avoiding common flaws. It lays the groundwork for readers to delve into more intensive topics such as computer architecture, embedded systems, and cybersecurity. This book focuses on systems that execute an x86-64 machine code, and recommends th...

  9. Digital curation: a proposal of a semi-automatic digital object selection-based model for digital curation in Big Data environments

    Directory of Open Access Journals (Sweden)

    Moisés Lima Dutra

    2016-08-01

    Introduction: This work presents a new approach to digital curation from a Big Data perspective. Objective: The objective is to propose techniques for digital curation for selecting and evaluating digital objects that take into account the volume, velocity, variety, veracity, and value of the data collected from multiple knowledge domains. Methodology: This is exploratory research of an applied nature, which addresses the research problem in a qualitative way. Heuristics allow this semi-automatic process to be carried out either by human curators or by software agents. Results: As a result, a model was proposed for searching, processing, evaluating and selecting digital objects to be processed by digital curation. Conclusions: It is possible to use Big Data environments as a source of information resources for digital curation; moreover, Big Data techniques and tools can support the search and selection process of information resources by digital curation.

  10. A novel semi-automatic snake robot for natural orifice transluminal endoscopic surgery: preclinical tests in animal and human cadaver models (with video).

    Science.gov (United States)

    Son, Jaebum; Cho, Chang Nho; Kim, Kwang Gi; Chang, Tae Young; Jung, Hyunchul; Kim, Sung Chun; Kim, Min-Tae; Yang, Nari; Kim, Tae-Yun; Sohn, Dae Kyung

    2015-06-01

    Natural orifice transluminal endoscopic surgery (NOTES) is an emerging surgical technique. We aimed to design, create, and evaluate a new semi-automatic snake robot for NOTES. The snake robot employs the characteristics of both a manual endoscope and a multi-segment snake robot. This robot is inserted and retracted manually, like a classical endoscope, while its shape is controlled using embedded robot technology. The feasibility of a prototype robot for NOTES was evaluated in animals and human cadavers. The transverse stiffness and maneuverability of the snake robot appeared satisfactory. It could be advanced through the anus as far as the peritoneal cavity without any injury to adjacent organs. Preclinical tests showed that the device could navigate the peritoneal cavity. The snake robot has advantages of high transverse force and intuitive control. This new robot may be clinically superior to conventional tools for transanal NOTES.

  11. Digi-Clima Grid: image processing and distributed computing for recovering historical climate data

    Directory of Open Access Journals (Sweden)

    Sergio Nesmachnow

    2015-12-01

    This article describes the Digi-Clima Grid project, whose main goals are to design and implement semi-automatic techniques for digitalizing and recovering historical climate records by applying parallel computing techniques over distributed computing infrastructures. The specific tool developed for image processing is described, and the implementation over grid and cloud infrastructures is reported. An experimental analysis over institutional and volunteer-based grid/cloud distributed systems demonstrates that the proposed approach is an efficient tool for recovering historical climate data. The parallel implementations make it possible to distribute the processing load, achieving accurate speedup values.
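
    The distribution pattern is a classic embarrassingly parallel fan-out of per-sheet image processing. A single-machine sketch of the same idea (hypothetical file layout and a placeholder processing routine) is shown below; a grid or cloud engine spreads the same map over distributed workers.

```python
# Sketch: fan per-sheet digitisation tasks out over a pool of workers.
from multiprocessing import Pool
from pathlib import Path

def digitise_sheet(path):
    """Placeholder for the real trace-extraction routine for one scan."""
    # ... image processing of the scanned climate record goes here ...
    return path.name, "ok"

if __name__ == "__main__":
    sheets = sorted(Path("scans").glob("*.tif"))   # assumed input directory
    with Pool(processes=8) as pool:
        for name, status in pool.imap_unordered(digitise_sheet, sheets):
            print(name, status)
```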

  12. Core status computing system

    International Nuclear Information System (INIS)

    Yoshida, Hiroyuki.

    1982-01-01

    Purpose: To calculate power distribution, flow rate and the like in the reactor core with high accuracy in a BWR type reactor. Constitution: Total flow rate signals, traverse incore probe (TIP) signals as the neutron detector signals, thermal power signals and pressure signals are inputted into a process computer, where the power distribution and the flow rate distribution in the reactor core are calculated. A function generator connected to the process computer calculates the absolute flow rate passing through optional fuel assemblies using, as variables, flow rate signals from the introduction part for fuel assembly flow rate signals, data signals from the introduction part for the geometrical configuration data at the flow rate measuring site of fuel assemblies, total flow rate signals for the reactor core and the signals from the process computer. Numerical values thus obtained are given to the process computer as correction signals to perform correction for the experimental data. (Moriyama, K.)

  13. Capability-based computer systems

    CERN Document Server

    Levy, Henry M

    2014-01-01

    Capability-Based Computer Systems focuses on computer programs and their capabilities. The text first elaborates capability- and object-based system concepts, including capability-based systems, object-based approach, and summary. The book then describes early descriptor architectures and explains the Burroughs B5000, Rice University Computer, and Basic Language Machine. The text also focuses on early capability architectures. Dennis and Van Horn's Supervisor; CAL-TSS System; MIT PDP-1 Timesharing System; and Chicago Magic Number Machine are discussed. The book then describes Plessey System 25

  14. Secure computing on reconfigurable systems

    OpenAIRE

    Fernandes Chaves, R.J.

    2007-01-01

    This thesis proposes a Secure Computing Module (SCM) for reconfigurable computing systems. The SCM provides a protected and reliable computational environment, where data security and protection against malicious attacks on the system are assured. The SCM is strongly based on encryption algorithms and on the attestation of the executed functions. The use of the SCM on reconfigurable devices has the advantage of being highly adaptable to the application and the user requirements, while providing high performa...

  15. Computer Security Systems Enable Access.

    Science.gov (United States)

    Riggen, Gary

    1989-01-01

    A good security system enables access and protects information from damage or tampering, but the most important aspects of a security system aren't technical. A security procedures manual addresses the human element of computer security. (MLW)

  16. Computable Types for Dynamic Systems

    NARCIS (Netherlands)

    P.J. Collins (Pieter); K. Ambos-Spies; B. Loewe; W. Merkle

    2009-01-01

    textabstractIn this paper, we develop a theory of computable types suitable for the study of dynamic systems in discrete and continuous time. The theory uses type-two effectivity as the underlying computational model, but we quickly develop a type system which can be manipulated abstractly, but for

  17. Computed tomography system

    International Nuclear Information System (INIS)

    Lambert, T.W.; Blake, J.E.

    1981-01-01

    This invention relates to computed tomography and is particularly concerned with determining the CT numbers of zones of interest in an image displayed on a cathode ray tube, which zones lie at the so-called level or center of the gray-scale window. (author)
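
    For readers unfamiliar with the terminology: a window of width W centred on level L maps CT numbers linearly onto the display gray scale. The standard mapping is sketched below; it is generic, not the specific method claimed by the patent.

```python
# The standard window/level mapping from CT numbers (HU) to 8-bit gray.
import numpy as np

def apply_window(ct, level, width):
    """Values inside the window span 0-255; outside values are clipped."""
    lo = level - width / 2.0
    hi = level + width / 2.0
    return np.clip((ct - lo) / (hi - lo) * 255.0, 0, 255).astype(np.uint8)

ct = np.array([-1000, -500, 0, 40, 80, 400])   # air ... soft tissue ... bone
print(apply_window(ct, level=40, width=400))   # soft-tissue window
```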

  18. Design and development of a prototypical software for semi-automatic generation of test methodologies and security checklists for IT vulnerability assessment in small- and medium-sized enterprises (SME)

    Science.gov (United States)

    Möller, Thomas; Bellin, Knut; Creutzburg, Reiner

    2015-03-01

    The aim of this paper is to show the recent progress in the design and prototypical development of a software suite Copra Breeder* for semi-automatic generation of test methodologies and security checklists for IT vulnerability assessment in small and medium-sized enterprises.

  19. Energy efficient distributed computing systems

    CERN Document Server

    Lee, Young-Choon

    2012-01-01

    The energy consumption issue in distributed computing systems raises various monetary, environmental and system performance concerns. Electricity consumption in the US doubled from 2000 to 2005.  From a financial and environmental standpoint, reducing the consumption of electricity is important, yet these reforms must not lead to performance degradation of the computing systems.  These contradicting constraints create a suite of complex problems that need to be resolved in order to lead to 'greener' distributed computing systems.  This book brings together a group of outsta

  20. Computational Systems Chemical Biology

    OpenAIRE

    Oprea, Tudor I.; May, Elebeoba E.; Leitão, Andrei; Tropsha, Alexander

    2011-01-01

    There is a critical need for improving the level of chemistry awareness in systems biology. The data and information related to the modulation of genes and proteins by small molecules continue to accumulate at the same time as simulation tools in systems biology and whole-body physiologically-based pharmacokinetics (PBPK) continue to evolve. We have called this emerging area at the interface between chemical biology and systems biology 'systems chemical biology' (SCB) (Oprea et al., 2007).

  1. Semi-automatic spray pyrolysis deposition of thin, transparent, titania films as blocking layers for dye-sensitized and perovskite solar cells.

    Science.gov (United States)

    Krýsová, Hana; Krýsa, Josef; Kavan, Ladislav

    2018-01-01

    For proper function of the negative electrode of dye-sensitized and perovskite solar cells, the deposition of a nonporous blocking film on the surface of F-doped SnO2 (FTO) glass substrates is required. Such a blocking film can minimise undesirable parasitic processes; for example, the back reaction of photoinjected electrons with the oxidized form of the redox mediator or with the hole-transporting medium can be avoided. In the present work, thin, transparent blocking TiO2 films are prepared by semi-automatic spray pyrolysis of precursors consisting of titanium diisopropoxide bis(acetylacetonate) as the main component. The variation in the layer thickness of the sprayed films is achieved by varying the number of spray cycles. The parameters investigated in this work were the deposition temperature (150, 300 and 450 °C), the number of spray cycles (20-200), the precursor composition (with/without deliberately added acetylacetone), the concentration (0.05 and 0.2 M) and subsequent post-calcination at 500 °C. The photo-electrochemical properties were evaluated in aqueous electrolyte solution under UV irradiation. The blocking properties were tested by cyclic voltammetry with a model redox probe undergoing a simple one-electron-transfer reaction. Semi-automatic spraying resulted in the formation of transparent, homogeneous TiO2 films, and the technique allows for easy upscaling to large electrode areas. A deposition temperature of 450 °C was necessary for the fabrication of highly photoactive TiO2 films. The blocking properties of the as-deposited TiO2 films (at 450 °C) were impaired by post-calcination at 500 °C, but this problem could be addressed by increasing the number of spray cycles. The modification of the precursor by adding acetylacetone resulted in TiO2 films exhibiting perfect blocking properties that were not influenced by post-calcination. These results will surely find use in the fabrication of large-scale dye-sensitized and perovskite solar cells.

  2. Students "Hacking" School Computer Systems

    Science.gov (United States)

    Stover, Del

    2005-01-01

    This article deals with students hacking school computer systems. School districts are getting tough with students "hacking" into school computers to change grades, poke through files, or just pit their high-tech skills against district security. Dozens of students have been prosecuted recently under state laws on identity theft and unauthorized…

  3. Computer systems and nuclear industry

    International Nuclear Information System (INIS)

    Nkaoua, Th.; Poizat, F.; Augueres, M.J.

    1999-01-01

    This article deals with computer systems in the nuclear industry. In most nuclear facilities it is necessary to handle a great deal of data and actions in order to help plant operators drive and control physical processes and to assure safety. The design of reactors requires reliable computer codes able to simulate neutronic, mechanical or thermo-hydraulic behaviour. Calculations and simulations play an important role in safety analysis. In each of these domains, computer systems have progressively appeared as efficient tools to challenge and master complexity. (A.C.)

  4. Comparison of a semi-automatic annotation tool and a natural language processing application for the generation of clinical statement entries.

    Science.gov (United States)

    Lin, Ching-Heng; Wu, Nai-Yuan; Lai, Wei-Shao; Liou, Der-Ming

    2015-01-01

    Electronic medical records with encoded entries should enhance the semantic interoperability of document exchange. However, it remains a challenge to encode narrative concepts and to transform the coded concepts into a standard entry-level document. This study aimed to use a novel approach for the generation of entry-level interoperable clinical documents. Using the HL7 clinical document architecture (CDA) as the example, we developed three pipelines to generate entry-level CDA documents. The first approach was a semi-automatic annotation pipeline (SAAP), the second was a natural language processing (NLP) pipeline, and the third merged the two pipelines above. We randomly selected 50 test documents from the i2b2 corpora to evaluate the performance of the three pipelines. The 50 randomly selected test documents contained 9365 words, including 588 Observation terms and 123 Procedure terms. For the Observation terms, the merged pipeline had a significantly higher F-measure than the NLP pipeline (0.89 vs 0.80), demonstrating its utility for generating entry-level interoperable clinical documents.

  5. Creation of individual ideally shaped stents using multi-slice CT: in vitro results from the semi-automatic virtual stent (SAVS) designer

    International Nuclear Information System (INIS)

    Hyodoh, Hideki; Katagiri, Yoshimi; Hyodoh, Kazusa; Akiba, Hidenari; Hareyama, Masato; Sakai, Toyohiko

    2005-01-01

    To plan stent-grafting for thoracic aortic aneurysms with complicated morphology, we created a virtual stent-grafting program [the Semi-Automatic Virtual Stent (SAVS) designer] using three-dimensional CT data. The usefulness of the SAVS designer was evaluated by measurement of transformed anatomical and straight stents. Curved model images (source, multi-planar reconstruction and volume rendering) were created, and a hollow virtual stent was produced by the SAVS designer. A straight Nitinol stent was transformed to match the curved configuration of the virtual stent. The accuracy of the anatomical stent was evaluated by experimental strain phantom studies in comparison with the straight stent. The mean separation length was 0 mm for the anatomical stent [22 mm outer diameter (OD)] and 5 mm for the straight stent (22 mm OD). The straight stent strain voltage was four times that of the anatomical stent at the stent end. The anatomical stent is useful because it fits the curved structure of the aorta and reduces the strain force compared to the straight stent. The SAVS designer can help to design and produce the anatomical stent. (orig.)

  6. Semi-Automatic Evaluation of Intrasubject and Inter-session Variability of Cerebral Activation Areas by Functional Magnetic Resonance Imaging (fMRI)

    International Nuclear Information System (INIS)

    Rascovsky, Simon; Delgado, Jorge Andres; Sanz, Alexander

    2008-01-01

    To verify the reproducibility of word generation, text comprehension, antonym generation and motor/somatosensory fMRI protocols in a test-retest evaluation, using a semi-automatic stereotactic localization method for activation comparison. Methods: Word generation, text comprehension, antonym generation and motor/somatosensory fMRI paradigms were applied to 8 healthy subjects in two separate sessions, and inter-session activations were evaluated through conjunction and cluster analysis. Results: Activations according to Brodmann areas were reproducible in 50%, 62.5% and 75% of cases for word generation, text comprehension and antonym generation, respectively. For the motor paradigms, conjoined right motor activations were found in 86% of subjects, and conjoined left activations in 100% of subjects. Conclusions: The semi-automatic method of determining inter-session areas of common activation allows functional cytoarchitectonic localization of fMRI activations with minimal intervention, and can be used as a quality control measure for the different paradigms used in fMRI, minimizing observer bias.

  7. A new generic method for the semi-automatic extraction of river and road networks in low and mid-resolution satellite images

    Energy Technology Data Exchange (ETDEWEB)

    Grazzini, Jacopo [Los Alamos National Laboratory; Dillard, Scott [PNNL; Soille, Pierre [EC JRC

    2010-10-21

    This paper addresses the problem of semi-automatic extraction of road or hydrographic networks in satellite images. For that purpose, we propose an approach combining concepts arising from mathematical morphology and hydrology. The method exploits both geometrical and topological characteristics of rivers/roads and their tributaries in order to reconstruct the complete networks. It assumes that the images satisfy the following two general assumptions, which are the minimum conditions for a road/river network to be identifiable and are usually verified in low- to mid-resolution satellite images: (i) visual constraint: most pixels composing the network have a similar spectral signature that is distinguishable from that of most of the surrounding areas; (ii) geometric constraint: a line is a region that is relatively long and narrow compared with other objects in the image. While this approach fully exploits the local (roads/rivers are modeled as elongated regions with a smooth spectral signature in the image and a maximum width) and global (they are structured like a tree) characteristics of the networks, further directional information about the image structures is incorporated. Namely, an appropriate anisotropic metric is designed using both the characteristic features of the target network and the eigen-decomposition of the gradient structure tensor of the image. Subsequently, geodesic propagation from a given network seed under this metric is combined with hydrological operators for overland flow simulation to extract the paths which contain most line evidence and to identify them with the target network.
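
    The directional ingredient, the gradient structure tensor, is straightforward to compute explicitly. The sketch below (assumed smoothing scales, not the paper's exact cost function) derives the tensor and an eigenvalue-based coherence measure that is high along line-like structures such as roads and rivers.

```python
# Gradient structure tensor and coherence of a 2-D image. Smoothing scales
# are assumptions; the paper builds its anisotropic metric on top of this.
import numpy as np
from scipy import ndimage

def structure_tensor(img, sigma_grad=1.0, sigma_avg=3.0):
    gx = ndimage.gaussian_filter(img, sigma_grad, order=(0, 1))  # d/dx
    gy = ndimage.gaussian_filter(img, sigma_grad, order=(1, 0))  # d/dy
    Jxx = ndimage.gaussian_filter(gx * gx, sigma_avg)
    Jxy = ndimage.gaussian_filter(gx * gy, sigma_avg)
    Jyy = ndimage.gaussian_filter(gy * gy, sigma_avg)
    return Jxx, Jxy, Jyy

def coherence(Jxx, Jxy, Jyy):
    """(lam1 - lam2)^2 / (lam1 + lam2)^2 in [0, 1]; high on lineaments."""
    root = np.sqrt((Jxx - Jyy) ** 2 + 4.0 * Jxy ** 2)
    lam1 = (Jxx + Jyy + root) / 2.0
    lam2 = (Jxx + Jyy - root) / 2.0
    return ((lam1 - lam2) / (lam1 + lam2 + 1e-12)) ** 2
```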

  8. Operating systems. [of computers

    Science.gov (United States)

    Denning, P. J.; Brown, R. L.

    1984-01-01

    A computer operating system creates a hierarchy of levels of abstraction, so that at a given level all details concerning lower levels can be ignored. This hierarchical structure separates functions according to their complexity, characteristic time scale, and level of abstraction. The lowest levels include the system's hardware; concepts associated explicitly with the coordination of multiple tasks appear at intermediate levels, which conduct 'primitive processes'. The software semaphore is the mechanism controlling primitive processes that must be synchronized. At higher levels lie, in rising order, access to the secondary storage devices of a particular machine, a 'virtual memory' scheme for managing the main and secondary memories, communication between processes by way of a mechanism called a 'pipe', access to external input and output devices, and a hierarchy of directories cataloguing the hardware and software objects to which access must be controlled.

  9. Retrofitting of NPP Computer systems

    International Nuclear Information System (INIS)

    Pettersen, G.

    1994-01-01

    Retrofitting of nuclear power plant control rooms is a continuing process for most utilities. This involves introducing and/or extending computer-based solutions for surveillance and control as well as improving the human-computer interface. The paper describes typical requirements when retrofitting NPP process computer systems, and focuses on the activities of the Institutt for energiteknikk, OECD Halden Reactor Project, with respect to such retrofitting, using examples from actual delivery projects. In particular, a project carried out for Forsmarksverket in Sweden, comprising an upgrade of the operator system in the control rooms of units 1 and 2, is described. As many of the problems of retrofitting NPP process computer systems are similar to those in other kinds of process industries, an example from a non-nuclear application area is also given

  10. Computer System Design System-on-Chip

    CERN Document Server

    Flynn, Michael J

    2011-01-01

    The next generation of computer system designers will be less concerned about details of processors and memories, and more concerned about the elements of a system tailored to particular applications. These designers will have a fundamental knowledge of processors and other elements in the system, but the success of their design will depend on the skills in making system-level tradeoffs that optimize the cost, performance and other attributes to meet application requirements. This book provides a new treatment of computer system design, particularly for System-on-Chip (SOC), which addresses th

  11. Semi-automatic determination of tin in marine materials by continuous flow hydride generation inductively coupled plasma atomic emission spectrometry

    International Nuclear Information System (INIS)

    Feng Yonglai; Narasaki, Hisataki; Chen Hongyuan; Tian Liching

    1997-01-01

    A semi-automated continuous flow hydride generation system with inductively coupled plasma atomic emission spectrometry (ICP-AES) was used for the determination of tin in marine materials. The effects of acids (H2SO4 and HCl) were studied. The analytical parameters were thoroughly investigated. Under optimized conditions, the detection limit is 0.4 ng/ml. Interferences from transition elements were investigated and seven masking reagents were tested. L-cysteine hydrochloride monohydrate (1%) was used to mask the interferences from foreign ions. Finally, the accuracy, checked with a marine standard reference material obtained from the National Research Council (NRC), was within the certified value. (orig.). With 6 figs., 4 tabs

  12. Reproducibility of a semi-automatic method for 6-point vertebral morphometry in a multi-centre trial

    International Nuclear Information System (INIS)

    Guglielmi, Giuseppe; Stoppino, Luca Pio; Placentino, Maria Grazia; D'Errico, Francesco; Palmieri, Francesco

    2009-01-01

    Purpose: To evaluate the reproducibility of a semi-automated system for vertebral morphometry (MorphoXpress) in a large multi-centre trial. Materials and methods: The study involved 132 clinicians (none of them radiologists) with different levels of experience across 20 osteo-centres in Italy. All had received training in using MorphoXpress. An expert radiologist was also involved, providing data used as the standard of reference. The test images originated from normal clinical activity and represented a variety of normally, under- and over-exposed films, depicting both normal anatomy and vertebral deformities. Each image was presented twice to the clinicians, in a random order. Using the software, the clinicians initially marked the midpoints of the upper and lower vertebrae so as to include as many of the vertebrae (T5-L4) as practical within each given image. MorphoXpress then performs the localisation of all morphometric points based on a statistical model-based vision system. Intra-operator as well as inter-operator agreement was calculated using the coefficient of variation and the mean and standard deviation of the difference between two measurements. Results: The overall intra-operator mean difference in vertebral heights was 1.61 ± 4.27% (1 S.D.), with an overall intra-operator coefficient of variation of 3.95%. The overall inter-operator mean difference in vertebral heights was 2.93 ± 5.38% (1 S.D.), with an overall inter-operator coefficient of variation of 6.89%. Conclusions: The technology tested here can facilitate reproducible quantitative morphometry suitable for large studies of vertebral deformities
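
    A sketch of the agreement statistics reported above, assuming paired repeated height measurements: percentage differences between the two readings and a test-retest coefficient of variation. The CV formula (SD of differences over sqrt(2) times the grand mean) is a common convention, not necessarily the authors' exact computation.

        import numpy as np

        def agreement(m1, m2):
            m1, m2 = np.asarray(m1, float), np.asarray(m2, float)
            # Paired percentage differences relative to the pair means
            diff_pct = 100.0 * (m1 - m2) / ((m1 + m2) / 2.0)
            # Test-retest coefficient of variation
            cv = 100.0 * np.std(m1 - m2, ddof=1) / (
                np.sqrt(2) * np.mean((m1 + m2) / 2.0))
            return diff_pct.mean(), diff_pct.std(ddof=1), cv

        # Hypothetical vertebral heights (mm), measured twice
        mean_d, sd_d, cv = agreement([21.0, 22.5, 19.8], [21.4, 22.1, 20.0])
        print(f"mean diff {mean_d:.2f}% +/- {sd_d:.2f}%, CV {cv:.2f}%")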

  13. A semi-automatic image-based close range 3D modeling pipeline using a multi-camera configuration.

    Science.gov (United States)

    Rau, Jiann-Yeou; Yeh, Po-Chia

    2012-01-01

    The generation of photo-realistic 3D models is an important task for the digital recording of cultural heritage objects. This study proposes an image-based 3D modeling pipeline which takes advantage of a multi-camera configuration and a multi-image matching technique that does not require any markers on or around the object. Multiple digital single lens reflex (DSLR) cameras are adopted and fixed with invariant relative orientations. Instead of photo-triangulation after image acquisition, calibration is performed to estimate the exterior orientation parameters of the multi-camera configuration, which can be processed fully automatically using coded targets. The calibrated orientation parameters of all cameras are applied to images taken using the same camera configuration. This means that when performing multi-image matching for surface point cloud generation, the orientation parameters remain the same as the calibrated results, even when the target has changed. Based on this invariant characteristic, the whole 3D modeling pipeline can be performed completely automatically once the whole system has been calibrated and the software seamlessly integrated. Several experiments were conducted to prove the feasibility of the proposed system. The imaged objects included a human being, eight Buddhist statues, and a stone sculpture. The results for the stone sculpture, obtained with several multi-camera configurations, were compared with a reference model acquired by an ATOS-I 2M active scanner. The best result has an absolute accuracy of 0.26 mm and a relative accuracy of 1:17,333. This demonstrates the feasibility of the proposed low-cost image-based 3D modeling pipeline and its applicability to the large quantity of antiques stored in museums.
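
    As a back-of-envelope check of the two accuracy figures quoted above (our inference, not stated in the abstract), the relative accuracy of 1:17,333 together with the 0.26 mm absolute accuracy implies a reference extent of roughly 4.5 m:

        # Relative accuracy = absolute accuracy / reference extent
        absolute_mm = 0.26
        ratio = 17333
        extent_m = absolute_mm * ratio / 1000.0
        print(f"implied reference extent: {extent_m:.2f} m")   # ~4.51 m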

  14. Computer control system of TRISTAN

    International Nuclear Information System (INIS)

    Kurokawa, Shin-ichi; Shinomoto, Manabu; Kurihara, Michio; Sakai, Hiroshi.

    1984-01-01

    For the operation of a large accelerator, an enormous number of electromagnets, power sources, vacuum equipment, high-frequency accelerating devices and so on must be connected and controlled harmoniously. For this purpose, a number of computers are adopted and connected in a network; in this way, a large laboratory-automation computer system that integrates and controls the whole facility is constructed. As a large-scale distributed system, functions such as electromagnet control, file processing and operation control are assigned to respective computers, and total control is made feasible by the network connection. At the same time, the CAMAC (computer-aided measurement and control) standard is adopted as the interface with the controlled equipment, to ensure the flexibility and expandability of the system. Moreover, the language ''NODAL'', which has network support functions, was developed to ease writing software without regard for the composition of the complex distributed system. The accelerator in the TRISTAN project is composed of an electron linear accelerator, a 6 GeV accumulation ring and a 30 GeV main ring. The two ring accelerators must be operated synchronously as one body, and are controlled with one computer system. The hardware and software are outlined. (Kako, I.)

  15. The Reliability of Technical and Tactical Tagging Analysis Conducted by a Semi-Automatic VTS in Soccer.

    Science.gov (United States)

    Beato, Marco; Jamil, Mikael; Devereux, Gavin

    2018-06-01

    The Video Tracking multiple-camera system (VTS) is a technology that records two-dimensional position data (x and y) at high sampling rates (over 25 Hz). The VTS is of great interest because it can record external load variables as well as collect technical and tactical parameters. Performance analysis has mainly focused on physical demands, and less attention has been afforded to technical and tactical factors. Digital.Stadium® VTS is a performance analysis device widely used at national and international levels (e.g. Italian Serie A, Euro 2016), and the reliability evaluation of its technical tagging analysis (e.g. shots, passes, assists, set pieces) could be paramount for its application in elite level competitions as well as in research studies. Two professional soccer teams, with 30 male players (age 23 ± 5 years, body mass 78.3 ± 6.9 kg, body height 1.81 ± 0.06 m), were monitored in the 2016 season during a friendly match, and data analysis was performed immediately after the game ended. This process was then replicated a week later (4 operators conducted the data analysis in each week). This study reports a near perfect relationship between the Match and its Replication: the R2 coefficients were highly significant for each of the technical variables considered, supporting the conclusion that the system records technical tagging data accurately.
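
    A minimal sketch of the match-replication reliability analysis described above: the squared Pearson correlation between week-1 and week-2 tagging counts. The counts below are invented for illustration.

        import numpy as np

        match       = np.array([34, 12,  8, 415, 5])   # e.g. shots, assists, ...
        replication = np.array([33, 12,  9, 410, 5])
        r = np.corrcoef(match, replication)[0, 1]
        print(f"R^2 = {r**2:.3f}")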

  16. Computer controlled high voltage system

    Energy Technology Data Exchange (ETDEWEB)

    Kunov, B; Georgiev, G; Dimitrov, L [and others]

    1996-12-31

    A multichannel computer controlled high-voltage power supply system is developed. The basic technical parameters of the system are: output voltage -100-3000 V, output current - 0-3 mA, maximum number of channels in one crate - 78. 3 refs.

  17. The ALICE Magnetic System Computation.

    CERN Document Server

    Klempt, W; CERN. Geneva; Swoboda, Detlef

    1995-01-01

    In this note we present the first results from the ALICE magnetic system computation, performed in three dimensions with the Vector Fields TOSCA code (version 6.5) [1]. For the calculations we have used the IBM RISC System 6000-370 and 6000-550 machines combined in the CERN PaRC UNIX cluster.

  18. Semi-automatic engineering and tailoring of high-efficiency Bragg-reflection waveguide samples for quantum photonic applications

    Science.gov (United States)

    Pressl, B.; Laiho, K.; Chen, H.; Günthner, T.; Schlager, A.; Auchter, S.; Suchomel, H.; Kamp, M.; Höfling, S.; Schneider, C.; Weihs, G.

    2018-04-01

    Semiconductor alloys of aluminum gallium arsenide (AlGaAs) exhibit strong second-order optical nonlinearities. This makes them prime candidates for the integration of devices for classical nonlinear optical frequency conversion or photon-pair production, for example, through the parametric down-conversion (PDC) process. Within this material system, Bragg-reflection waveguides (BRW) are a promising platform, but the specifics of the fabrication process and the peculiar optical properties of the alloys require careful engineering. Previously, BRW samples have been mostly derived analytically from design equations using a fixed set of aluminum concentrations. This approach limits the variety and flexibility of the device design. Here, we present a comprehensive guide to the design and analysis of advanced BRW samples and show how to automatize these tasks. Then, nonlinear optimization techniques are employed to tailor the BRW epitaxial structure towards a specific design goal. As a demonstration of our approach, we search for the optimal effective nonlinearity and mode overlap which indicate an improved conversion efficiency or PDC pair production rate. However, the methodology itself is much more versatile as any parameter related to the optical properties of the waveguide, for example the phasematching wavelength or modal dispersion, may be incorporated as a design goal. Further, we use the developed tools to gain a reliable insight into the fabrication tolerances and challenges of real-world sample imperfections. One such example is the common thickness gradient along the wafer, which strongly influences the photon-pair rate and spectral properties of the PDC process. Detailed models and a better understanding of the optical properties of a realistic BRW structure are not only useful for investigating current samples, but also provide important feedback for the design and fabrication of potential future turn-key devices.
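
    A hedged sketch of the optimization loop described above, with a toy surrogate standing in for the authors' electromagnetic figure of merit (e.g. nonlinearity-weighted mode overlap); the parameter names and target values are assumptions.

        import numpy as np
        from scipy.optimize import minimize

        def figure_of_merit(x):
            # x = (aluminum fraction, layer thickness in nm); toy surrogate
            # peaked at (0.35, 250), standing in for a full mode solver.
            al, t = x
            return np.exp(-((al - 0.35) ** 2) / 0.01) * \
                   np.exp(-((t - 250.0) ** 2) / 2e3)

        # Derivative-free search, since a real mode solver gives no gradients
        res = minimize(lambda x: -figure_of_merit(x), x0=[0.5, 300.0],
                       method="Nelder-Mead")
        print("optimal (Al fraction, thickness):", res.x)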

  19. Computational Intelligence for Engineering Systems

    CERN Document Server

    Madureira, A; Vale, Zita

    2011-01-01

    "Computational Intelligence for Engineering Systems" provides an overview and original analysis of new developments and advances in several areas of computational intelligence. Computational Intelligence have become the road-map for engineers to develop and analyze novel techniques to solve problems in basic sciences (such as physics, chemistry and biology) and engineering, environmental, life and social sciences. The contributions are written by international experts, who provide up-to-date aspects of the topics discussed and present recent, original insights into their own experien

  20. Opportunity for Realizing Ideal Computing System using Cloud Computing Model

    OpenAIRE

    Sreeramana Aithal; Vaikunth Pai T

    2017-01-01

    An ideal computing system is a computing system with ideal characteristics. The major components and their performance characteristics of such hypothetical system can be studied as a model with predicted input, output, system and environmental characteristics using the identified objectives of computing which can be used in any platform, any type of computing system, and for application automation, without making modifications in the form of structure, hardware, and software coding by an exte...

  1. Systems analysis and the computer

    Energy Technology Data Exchange (ETDEWEB)

    Douglas, A S

    1983-08-01

    The words systems analysis are used in at least two senses. Whilst the general nature of the topic is well understood in the OR community, the nature of the term as used by computer scientists is less familiar. In this paper, the nature of systems analysis as it relates to computer-based systems is examined from the point of view that the computer system is an automaton embedded in a human system, and some facets of this are explored. It is concluded that OR analysts and computer analysts have things to learn from each other and that this ought to be reflected in their education. The important role played by change in the design of systems is also highlighted, and it is concluded that, whilst the application of techniques developed in the artificial intelligence field has considerable relevance to constructing automata able to adapt to change in the environment, study of the human factors affecting the overall systems within which the automata are embedded has an even more important role. 19 references.

  2. Conceptual Design and Simulation of a Semi-Automatic Cell for the Washing and Preparation of a Corpse Prior to an Islamic Burial

    Directory of Open Access Journals (Sweden)

    A. Meghdari

    2012-07-01

    Washing the corpse and dressing the body prior to burial is an act of love and necessity in many religions. Applying robotics and automation technologies to the washing and preparation of a deceased Muslim in accordance with the Islamic Shari'at laws has been the challenging foundation of this research. With an increasing annual population growth resulting in an increase in the number of deaths (historically and/or immediately after a national disaster), the primary objectives of this project have been to automate part of this procedure to increase the speed of operation, to reduce the health risks to the personnel of the washing rooms “Ghassalkhaneh” at the cemeteries, and to enhance their quality of life. We have named and patented this semi-automated corpse preparation machine the “PaakShooy” or “پاک شوی” in Persian (Farsi), which means purifying the deceased. The whole process is composed of three operational units lined up in series: the automatic washing chamber, the drying cell and the semi-automatic shrouding table. This paper covers an introductory concept of the subject in Islam, a conceptual design of various machines and mechanisms to automate the important tasks in accordance with Islamic laws, and the final detailed design, graphic simulation and animation of the PaakShooy machine. In doing so, consultation with Islamic scholars has been a priority from the beginning of the project to the end, and a few Fatwa have been issued by some high-ranking Ayatollahs in support of the project. With a few modifications, the semi-automated PaakShooy machine may be adapted to conform to other religions/customs.

  3. An investigation into the factors that influence toolmark identifications on ammunition discharged from semi-automatic pistols recovered from car fires.

    Science.gov (United States)

    Collender, Mark A; Doherty, Kevin A J; Stanton, Kenneth T

    2017-01-01

    Following a shooting incident where a vehicle is used to convey the culprits to and from the scene, both the getaway car and the firearm are often deliberately burned in an attempt to destroy any forensic evidence which may be subsequently recovered. Here we investigate the factors that influence the ability to make toolmark identifications on ammunition discharged from pistols recovered from such car fires. This work was carried out by conducting a number of controlled furnace tests in conjunction with real car fire tests in which three 9mm semi-automatic pistols were burned. Comparisons between pre-burn and post burn test fired ammunition discharged from these pistols were then performed to establish if identifications were still possible. The surfaces of the furnace heated samples and car fire samples were examined following heating/burning to establish what factors had influenced their surface morphology. The primary influence on the surfaces of the furnace heated and car fire samples was the formation of oxide layers. The car fire samples were altered to a greater extent than the furnace heated samples. Identifications were still possible between pre- and post-burn discharged cartridge cases, but this was not the case for the discharged bullets. It is suggested that the reason for this is a difference between the types of firearms discharge-generated toolmarks impressed onto the base of cartridge cases compared to those striated along the surfaces of bullets. It was also found that the temperatures recorded in the front foot wells were considerably less than those recorded on top of the rear seats during the car fires. These factors should be assessed by forensic firearms examiners when performing casework involving pistols recovered from car fires. Copyright © 2016 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.

  4. The Evaluation of Computer Systems

    Directory of Open Access Journals (Sweden)

    Cezar Octavian Mihalcescu

    2007-01-01

    Generally, the evaluation of computer systems is especially interesting at present from several points of view: computer-related, managerial, sociological etc. The reasons for this extended interest are represented by the fact that IT becomes increasingly important for reaching the goals of an organization in general, and the strategic ones in particular. Evaluation means the estimation or determination of value, and is synonymous with measuring the value. Evaluating the economic value of computer systems should be studied at three levels: individually, at a group level and at an organization level.

  5. Computational system for geostatistical analysis

    Directory of Open Access Journals (Sweden)

    Vendrusculo Laurimar Gonçalves

    2004-01-01

    Geostatistics identifies the spatial structure of variables representing several phenomena, and its use is becoming more intense in agricultural activities. This paper describes a computer program, based on Windows interfaces (Borland Delphi), which performs spatial analyses of datasets through geostatistical tools: classical statistical calculations, averages, cross- and directional semivariograms, simple kriging estimates and jackknifing calculations. A published dataset of soil carbon and nitrogen was used to validate the system. The system was useful for the geostatistical analysis process, compared with the manipulation of computational routines in an MS-DOS environment. The Windows development approach allowed the user to model the semivariogram graphically, with a greater degree of interaction, functionality rarely available in similar programs. Given its characteristics of quick prototyping and simplicity when incorporating correlated routines, the Delphi environment presents the main advantage of permitting the evolution of this system.
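
    For reference, a sketch of the classical empirical semivariogram such a program computes, gamma(h) = half the mean squared increment over point pairs at lag h; this is generic geostatistics, not the published Delphi code.

        import numpy as np

        def semivariogram(coords, values, lags, tol):
            coords = np.asarray(coords, float)   # shape (n, 2)
            values = np.asarray(values, float)   # shape (n,)
            # Pairwise distances and half squared increments
            d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
            sq = 0.5 * (values[:, None] - values[None, :]) ** 2
            gamma = []
            for h in lags:
                # Upper triangle only, so each pair is counted once
                mask = (d > h - tol) & (d <= h + tol) & \
                       np.triu(np.ones_like(d, bool), 1)
                gamma.append(sq[mask].mean() if mask.any() else np.nan)
            return np.array(gamma)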

  6. Computer access security code system

    Science.gov (United States)

    Collins, Earl R., Jr. (Inventor)

    1990-01-01

    A security code system for controlling access to computer and computer-controlled entry situations comprises a plurality of subsets of alpha-numeric characters disposed in random order in matrices of at least two dimensions forming theoretical rectangles, cubes, etc., such that when access is desired, at least one pair of previously unused character subsets not found in the same row or column of the matrix is chosen at random and transmitted by the computer. The proper response to gain access is the transmittal of the subsets which complete the rectangle and/or parallelepiped whose opposite corners were defined by the first group of code. Once used, subsets are not used again, so as to absolutely defeat unauthorized access by eavesdropping and the like.
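
    A possible reading of the scheme, sketched under the assumption of a 2-D matrix with unique symbols: the computer issues two unused characters that share neither row nor column, and the valid response is the pair of characters at the rectangle's other two corners.

        import random, string

        N = 6
        # Unique symbols guaranteed by sampling without replacement
        symbols = random.sample(string.ascii_uppercase + string.digits, N * N)
        matrix = [symbols[i * N:(i + 1) * N] for i in range(N)]
        pos = {matrix[r][c]: (r, c) for r in range(N) for c in range(N)}

        def challenge():
            # Pick two characters forming opposite corners of a rectangle
            while True:
                a, b = random.sample(list(pos), 2)
                (r1, c1), (r2, c2) = pos[a], pos[b]
                if r1 != r2 and c1 != c2:
                    return a, b

        def valid_response(a, b, answer):
            (r1, c1), (r2, c2) = pos[a], pos[b]
            expected = {matrix[r1][c2], matrix[r2][c1]}   # other two corners
            return set(answer) == expected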

  7. Computational models of complex systems

    CERN Document Server

    Dabbaghian, Vahid

    2014-01-01

    Computational and mathematical models provide us with the opportunities to investigate the complexities of real world problems. They allow us to apply our best analytical methods to define problems in a clearly mathematical manner and exhaustively test our solutions before committing expensive resources. This is made possible by assuming parameter(s) in a bounded environment, allowing for controllable experimentation, not always possible in live scenarios. For example, simulation of computational models allows the testing of theories in a manner that is both fundamentally deductive and experimental in nature. The main ingredients for such research ideas come from multiple disciplines and the importance of interdisciplinary research is well recognized by the scientific community. This book provides a window to the novel endeavours of the research communities to present their works by highlighting the value of computational modelling as a research tool when investigating complex systems. We hope that the reader...

  8. Computer-Mediated Communication Systems

    Directory of Open Access Journals (Sweden)

    Bin Yu

    2011-10-01

    The essence of communication is to exchange and share information. Computers provide a new medium for human communication. A CMC system, composed of humans and computers, absorbs and then extends the advantages of all former formats of communication, embracing the instant interaction of oral communication, the abstract logic of printed dissemination, and the vivid images of film and television. It also creates a series of new communication formats, such as hypertext and multimedia, which are new methods of organizing information and new patterns of delivering messages across space. Benefiting from the continuous development of techniques and mechanisms, computer-mediated communication makes the dream of transmitting information across space and time come true, which will certainly have a great impact on our social lives.

  9. Construction, implementation and testing of an image identification system using computer vision methods for fruit flies with economic importance (Diptera: Tephritidae).

    Science.gov (United States)

    Wang, Jiang-Ning; Chen, Xiao-Lin; Hou, Xin-Wen; Zhou, Li-Bing; Zhu, Chao-Dong; Ji, Li-Qiang

    2017-07-01

    Many species of Tephritidae are damaging to fruit, which might negatively impact international fruit trade. Automatic or semi-automatic identification of fruit flies is greatly needed for diagnosing causes of damage and for quarantine protocols for economically relevant insects. A fruit fly image identification system named AFIS1.0 has been developed using 74 species belonging to six genera, which include the majority of pests in the Tephritidae. The system combines automated image identification and manual verification, balancing operability and accuracy. AFIS1.0 integrates image analysis and an expert system into a content-based image retrieval framework. In the automatic identification module, AFIS1.0 gives candidate identification results. Afterwards users can make a manual selection by comparing unidentified images with a subset of images corresponding to the automatic identification result. The system uses Gabor surface features in automated identification and yielded an overall classification success rate of 87% to the species level in an Independent Multi-part Image Automatic Identification Test. The system is useful for users with or without specific expertise on Tephritidae in the task of rapid and effective identification of fruit flies. It brings the application of computer vision technology to fruit fly recognition much closer to the production level. © 2016 Society of Chemical Industry.
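
    A hedged sketch of Gabor-filter feature extraction of the general kind that 'Gabor surface features' build on; the kernel construction and mean/std pooling here are generic choices, not the authors' exact recipe.

        import numpy as np
        from scipy.ndimage import convolve

        def gabor_kernel(freq, theta, sigma=3.0, size=15):
            # Cosine carrier at orientation theta under a Gaussian envelope
            half = size // 2
            y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
            xr = x * np.cos(theta) + y * np.sin(theta)
            env = np.exp(-(x**2 + y**2) / (2 * sigma**2))
            return env * np.cos(2 * np.pi * freq * xr)

        def gabor_features(image, freqs=(0.1, 0.2), n_theta=4):
            feats = []
            for f in freqs:
                for k in range(n_theta):
                    resp = convolve(image.astype(float),
                                    gabor_kernel(f, np.pi * k / n_theta))
                    feats += [resp.mean(), resp.std()]   # pooled statistics
            return np.array(feats)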

  10. Computer-aided system design

    Science.gov (United States)

    Walker, Carrie K.

    1991-01-01

    A technique has been developed for combining features of a systems architecture design and assessment tool and a software development tool. This technique reduces simulation development time and expands simulation detail. The Architecture Design and Assessment System (ADAS), developed at the Research Triangle Institute, is a set of computer-assisted engineering tools for the design and analysis of computer systems. The ADAS system is based on directed graph concepts and supports the synthesis and analysis of software algorithms mapped to candidate hardware implementations. Greater simulation detail is provided by the ADAS functional simulator. With the functional simulator, programs written in either Ada or C can be used to provide a detailed description of graph nodes. A Computer-Aided Software Engineering tool developed at the Charles Stark Draper Laboratory (CSDL CASE) automatically generates Ada or C code from engineering block diagram specifications designed with an interactive graphical interface. A technique to use the tools together has been developed, which further automates the design process.

  11. Computer-aided instruction system

    International Nuclear Information System (INIS)

    Teneze, Jean Claude

    1968-01-01

    This research thesis addresses the use of teleprocessing and time sharing by the RAX IBM system and the possibility of introducing a dialogue with the machine, to develop an application in which the computer plays the role of a teacher for different pupils at the same time. Two operating modes are thus exploited: a teacher-mode and a pupil-mode. The developed CAI (computer-aided instruction) system comprises a checker to check the course syntax in teacher-mode, a translator to trans-code the course written in teacher-mode into a form which can be processed by the execution programme, and the execution programme which presents the course in pupil-mode

  12. The CESR computer control system

    International Nuclear Information System (INIS)

    Helmke, R.G.; Rice, D.H.; Strohman, C.

    1986-01-01

    The control system for the Cornell Electron Storage Ring (CESR) has functioned satisfactorily since its implementation in 1979. Key characteristics are fast tuning response, almost exclusive use of FORTRAN as a programming language, and efficient coordinated ramping of CESR guide field elements. This original system has not, however, been able to keep pace with the increasing complexity of operation of CESR associated with performance upgrades. Limitations in address space, expandability, access to data system-wide, and program development impediments have prompted the undertaking of a major upgrade. The system under development accommodates up to 8 VAX computers for all applications programs. The database and communications semaphores reside in a shared multi-ported memory, and each hardware interface bus is controlled by a dedicated 32 bit micro-processor in a VME based system. (orig.)

  13. Semi-Automatic Mapping of Tidal Cracks in the Fast Ice Region near Zhongshan Station in East Antarctica Using Landsat-8 OLI Imagery

    Directory of Open Access Journals (Sweden)

    Fengming Hui

    2016-03-01

    Tidal cracks are linear features that appear parallel to coastlines in fast ice regions due to the action of periodic and non-periodic sea level oscillations. They can influence energy and heat exchange between the ocean, ice, and atmosphere, as well as human activities. In this paper, the LINE module of Geomatics 2015 software was used to automatically extract tidal cracks in fast ice regions near the Chinese Zhongshan Station in East Antarctica from Landsat-8 Operational Land Imager (OLI) data with resolutions of 15 m (panchromatic band 8) and 30 m (multispectral bands 1–7). The detected tidal cracks were determined based on matching between the output from the LINE module and manually-interpreted tidal cracks in OLI images. The ratio of the length of detected tidal cracks to the total length of interpreted cracks was used to evaluate the automated detection method. Results show that the vertical direction gradient is a better input to the LINE module than the top-of-atmosphere (TOA) reflectance input for estimating the presence of cracks, regardless of the examined resolution. Data with a resolution of 15 m also give better results in crack detection than data with a resolution of 30 m. The statistics also show that, in the results from the 15-m-resolution data, Band 8 performed best, with values of the above-mentioned ratio of 50.92 and 31.38 percent using the vertical gradient and the TOA reflectance methods, respectively. On the other hand, in the results from the 30-m-resolution data, Band 5 performed best, with ratios of 47.43 and 17.8 percent using the same methods, respectively. This implies that Band 8 was better for tidal crack detection than the multispectral fusion data (Bands 1–7), and that Band 5, with a resolution of 30 m, was best among the multispectral data. The semi-automatic mapping of tidal cracks will improve the safety of vehicle travel in fast ice regions.
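
    A minimal sketch of the 'vertical direction gradient' input evaluated above, assuming an OLI band held in a 2-D array: light smoothing followed by a north-south (row-wise) derivative, which accentuates cracks running roughly parallel to the coastline.

        import numpy as np
        from scipy.ndimage import sobel, gaussian_filter

        def vertical_gradient(band, sigma=1.0):
            # Suppress sensor noise before differentiating
            smoothed = gaussian_filter(band.astype(float), sigma)
            # Derivative along rows, i.e. the image's vertical direction
            return sobel(smoothed, axis=0)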

  14. Computer Security for the Computer Systems Manager.

    Science.gov (United States)

    1982-12-01

    power sources essential to system availability. Environmental degradation can cause system collapse or simply make the area uncomfortable to work in... attack (civil disobedience, military assault, arson, looting, sabotage, vandalism) * fire * smoke, dust, and dirt intrusion * bursting water pipes

  15. Computer Networks A Systems Approach

    CERN Document Server

    Peterson, Larry L

    2011-01-01

    This best-selling and classic book teaches you the key principles of computer networks with examples drawn from the real world of network and protocol design. Using the Internet as the primary example, the authors explain various protocols and networking technologies. Their systems-oriented approach encourages you to think about how individual network components fit into a larger, complex system of interactions. Whatever your perspective, whether it be that of an application developer, network administrator, or a designer of network equipment or protocols, you will come away with a "big picture" understanding.

  16. Automated validation of a computer operating system

    Science.gov (United States)

    Dervage, M. M.; Milberg, B. A.

    1970-01-01

    Programs apply selected input/output loads to a complex computer operating system and measure the performance of that system under such loads. The technique lends itself to the checkout of computer software designed to monitor automated complex industrial systems.

  17. Computer aided design tool for natural and artificial lighting engineering. Computer graphics as a tool for visual comfort and ergonomy; Outil de conception architecturale pour l'eclairage naturel/artificiel. Application de la synthese d'image pour la prise en compte des notions de confort et d'ergonomie visuels

    Energy Technology Data Exchange (ETDEWEB)

    Carre, S

    1998-07-01

    The aim of this work is to propose a computer-aided design tool for the photo-simulation of lighting systems. The proposed methods are adapted to the interactive modification of parameters concerning natural and artificial lighting and the placement of furniture. The code is based on the hierarchical radiosity method, using an adaptive decomposition of the emitting and receiving surfaces. The original meshing method chosen improves simulation time and is very efficient for managing the dynamic aspects. An adaptive spectral representation is proposed to reduce the memory size. A semi-automatic evaluation of the visual comfort criteria is integrated in the simulation. (A.L.B.)

  18. Semi-automatized segmentation method using image-based flow cytometry to study sperm physiology: the case of capacitation-induced tyrosine phosphorylation.

    Science.gov (United States)

    Matamoros-Volante, Arturo; Moreno-Irusta, Ayelen; Torres-Rodriguez, Paulina; Giojalas, Laura; Gervasi, María G; Visconti, Pablo E; Treviño, Claudia L

    2018-02-01

    Is image-based flow cytometry a useful tool to study intracellular events in human sperm such as protein tyrosine phosphorylation or signaling processes? Image-based flow cytometry is a powerful tool to study intracellular events in a relevant number of sperm cells, which enables a robust statistical analysis providing spatial resolution in terms of the specific subcellular localization of the labeling. Sperm capacitation is required for fertilization. During this process, spermatozoa undergo numerous physiological changes, via activation of different signaling pathways, which are not completely understood. Classical approaches for studying sperm physiology include conventional microscopy, flow cytometry and Western blotting. These techniques present disadvantages for obtaining detailed subcellular information of signaling pathways in a relevant number of cells. This work describes a new semi-automatized analysis using image-based flow cytometry which enables the study, at the subcellular and population levels, of different sperm parameters associated with signaling. The increase in protein tyrosine phosphorylation during capacitation is presented as an example. Sperm cells were isolated from seminal plasma by the swim-up technique. We evaluated the intensity and distribution of protein tyrosine phosphorylation in sperm incubated in non-capacitation and capacitation-supporting media for 1 and 18 h under different experimental conditions. We used an antibody against FER kinase and pharmacological inhibitors in an attempt to identify the kinases involved in protein tyrosine phosphorylation during human sperm capacitation. Semen samples from normospermic donors were obtained by masturbation after 2-3 days of sexual abstinence. We used the innovative technique image-based flow cytometry and image analysis tools to segment individual images of spermatozoa. We evaluated and quantified the regions of sperm where protein tyrosine phosphorylation takes place at the

  19. Computer aided training system development

    International Nuclear Information System (INIS)

    Midkiff, G.N.

    1987-01-01

    The first three phases of Training System Development (TSD) -- job and task analysis, curriculum design, and training material development -- are time consuming and labor intensive. The use of personal computers with a combination of commercial and custom-designed software resulted in a significant reduction in the man-hours required to complete these phases for a Health Physics Technician Training Program at a nuclear power station. This paper reports that each step in the training program project involved the use of personal computers: job survey data were compiled with a statistical package, task analysis was performed with custom software designed to interface with a commercial database management program. Job Performance Measures (tests) were generated by a custom program from data in the task analysis database, and training materials were drafted, edited, and produced using commercial word processing software

  20. CAESY - COMPUTER AIDED ENGINEERING SYSTEM

    Science.gov (United States)

    Wette, M. R.

    1994-01-01

    Many developers of software and algorithms for control system design have recognized that current tools have limits in both flexibility and efficiency. Many forces drive the development of new tools including the desire to make complex system modeling design and analysis easier and the need for quicker turnaround time in analysis and design. Other considerations include the desire to make use of advanced computer architectures to help in control system design, adopt new methodologies in control, and integrate design processes (e.g., structure, control, optics). CAESY was developed to provide a means to evaluate methods for dealing with user needs in computer-aided control system design. It is an interpreter for performing engineering calculations and incorporates features of both Ada and MATLAB. It is designed to be reasonably flexible and powerful. CAESY includes internally defined functions and procedures, as well as user defined ones. Support for matrix calculations is provided in the same manner as MATLAB. However, the development of CAESY is a research project, and while it provides some features which are not found in commercially sold tools, it does not exhibit the robustness that many commercially developed tools provide. CAESY is written in C-language for use on Sun4 series computers running SunOS 4.1.1 and later. The program is designed to optionally use the LAPACK math library. The LAPACK math routines are available through anonymous ftp from research.att.com. CAESY requires 4Mb of RAM for execution. The standard distribution medium is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. CAESY was developed in 1993 and is a copyrighted work with all copyright vested in NASA.

  1. Computed radiography systems performance evaluation

    International Nuclear Information System (INIS)

    Xavier, Clarice C.; Nersissian, Denise Y.; Furquim, Tania A.C.

    2009-01-01

    The performance of a computed radiography system was evaluated, according to the AAPM Report No. 93. Evaluation tests proposed by the publication were performed, and the following nonconformities were found: imaging plate (IP) dark noise, which compromises the clinical image acquired using the IP; exposure indicator uncalibrated, which can cause underexposure of the IP; nonlinearity of the system response, which causes overexposure; resolution limit below that declared by the manufacturer and erasure thoroughness uncalibrated, impairing the visualization of structures; Moire pattern visualized in the grid response; and IP throughput above that specified by the manufacturer. These nonconformities indicate that a digital imaging system's lack of calibration can cause an increase in dose so that image problems can be solved. (author)
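
    One of the checks above, system response linearity, can be illustrated with a hedged sketch: fit pixel value against log exposure and inspect the worst relative deviation from the fit. The exposure and pixel values below are invented.

        import numpy as np

        exposure_mR = np.array([0.1, 0.3, 1.0, 3.0, 10.0])
        pixel_value = np.array([510, 1010, 1490, 2020, 2485])

        # CR response is commonly modeled as linear in log exposure
        slope, intercept = np.polyfit(np.log10(exposure_mR), pixel_value, 1)
        fit = slope * np.log10(exposure_mR) + intercept
        max_dev = np.max(np.abs(pixel_value - fit) / fit) * 100
        print(f"max deviation from linear response: {max_dev:.1f}%")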

  2. Automated Computer Access Request System

    Science.gov (United States)

    Snook, Bryan E.

    2010-01-01

    The Automated Computer Access Request (AutoCAR) system is a Web-based account provisioning application that replaces the time-consuming paper-based computer-access request process at Johnson Space Center (JSC). AutoCAR combines rules-based and role-based functionality in one application to provide a centralized system that is easily and widely accessible. The system features a work-flow engine that facilitates request routing, a user registration directory containing contact information and user metadata, an access request submission and tracking process, and a system administrator account management component. This provides full, end-to-end disposition approval chain accountability from the moment a request is submitted. By blending both rules-based and role-based functionality, AutoCAR has the flexibility to route requests based on a user's nationality, JSC affiliation status, and other export-control requirements, while ensuring a user's request is addressed by either a primary or backup approver. All user accounts that are tracked in AutoCAR are recorded and mapped to the native operating system schema on the target platform where user accounts reside. This allows for future extensibility for supporting creation, deletion, and account management directly on the target platforms by way of AutoCAR. The system's directory-based lookup and day-to-day change analysis of directory information determines personnel moves, deletions, and additions, and automatically notifies a user via e-mail to revalidate his/her account access as a result of such changes. AutoCAR is a Microsoft classic active server page (ASP) application hosted on a Microsoft Internet Information Server (IIS).

  3. Computers as components principles of embedded computing system design

    CERN Document Server

    Wolf, Marilyn

    2012-01-01

    Computers as Components: Principles of Embedded Computing System Design, 3e, presents essential knowledge on embedded systems technology and techniques. Updated for today's embedded systems design methods, this edition features new examples including digital signal processing, multimedia, and cyber-physical systems. Author Marilyn Wolf covers the latest processors from Texas Instruments, ARM, and Microchip Technology plus software, operating systems, networks, consumer devices, and more. Like the previous editions, this textbook: Uses real processors to demonstrate both technology and techniques...

  4. Computer vision in control systems

    CERN Document Server

    Jain, Lakhmi

    2015-01-01

    Volume 1: This book is focused on the recent advances in computer vision methodologies and technical solutions using conventional and intelligent paradigms. The contributions include:
    · Morphological Image Analysis for Computer Vision Applications.
    · Methods for Detecting of Structural Changes in Computer Vision Systems.
    · Hierarchical Adaptive KL-based Transform: Algorithms and Applications.
    · Automatic Estimation for Parameters of Image Projective Transforms Based on Object-invariant Cores.
    · A Way of Energy Analysis for Image and Video Sequence Processing.
    · Optimal Measurement of Visual Motion Across Spatial and Temporal Scales.
    · Scene Analysis Using Morphological Mathematics and Fuzzy Logic.
    · Digital Video Stabilization in Static and Dynamic Scenes.
    · Implementation of Hadamard Matrices for Image Processing.
    · A Generalized Criterion ...

  5. `95 computer system operation project

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Taek; Lee, Hae Cho; Park, Soo Jin; Kim, Hee Kyung; Lee, Ho Yeun; Lee, Sung Kyu; Choi, Mi Kyung [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-12-01

    This report describes overall project works related to the operation of mainframe computers, the management of nuclear computer codes and the project of nuclear computer code conversion. The results of the project are as follows: 1. The operation and maintenance of the three mainframe computers and other utilities. 2. The management of the nuclear computer codes. 3. The finishing of the computer codes conversion project. 26 tabs., 5 figs., 17 refs. (Author)

  6. '95 computer system operation project

    International Nuclear Information System (INIS)

    Kim, Young Taek; Lee, Hae Cho; Park, Soo Jin; Kim, Hee Kyung; Lee, Ho Yeun; Lee, Sung Kyu; Choi, Mi Kyung

    1995-12-01

    This report describes overall project works related to the operation of mainframe computers, the management of nuclear computer codes and the project of nuclear computer code conversion. The results of the project are as follows: 1. The operation and maintenance of the three mainframe computers and other utilities. 2. The management of the nuclear computer codes. 3. The finishing of the computer codes conversion project. 26 tabs., 5 figs., 17 refs. (Author)

  7. Advanced topics in security computer system design

    International Nuclear Information System (INIS)

    Stachniak, D.E.; Lamb, W.R.

    1989-01-01

    The capability, performance, and speed of contemporary computer processors, plus the associated performance capability of the operating systems accommodating the processors, have enormously expanded the scope of possibilities for designers of nuclear power plant security computer systems. This paper addresses the choices that could be made by a designer of security computer systems working with contemporary computers and describes the improvement in functionality of contemporary security computer systems based on an optimally chosen design. Primary initial considerations concern the selection of (a) the computer hardware and (b) the operating system. Considerations for hardware selection concern processor and memory word length, memory capacity, and numerous processor features

  8. Computer systems and software engineering

    Science.gov (United States)

    Mckay, Charles W.

    1988-01-01

    The High Technologies Laboratory (HTL) was established in the fall of 1982 at the University of Houston Clear Lake. Research conducted at the High Tech Lab is focused upon computer systems and software engineering. There is a strong emphasis on the interrelationship of these areas of technology and the United States' space program. In Jan. of 1987, NASA Headquarters announced the formation of its first research center dedicated to software engineering. Operated by the High Tech Lab, the Software Engineering Research Center (SERC) was formed at the University of Houston Clear Lake. The High Tech Lab/Software Engineering Research Center promotes cooperative research among government, industry, and academia to advance the edge-of-knowledge and the state-of-the-practice in key topics of computer systems and software engineering which are critical to NASA. The center also recommends appropriate actions, guidelines, standards, and policies to NASA in matters pertinent to the center's research. Results of the research conducted at the High Tech Lab/Software Engineering Research Center have given direction to many decisions made by NASA concerning the Space Station Program.

  9. Trusted computing for embedded systems

    CERN Document Server

    Soudris, Dimitrios; Anagnostopoulos, Iraklis

    2015-01-01

    This book describes the state-of-the-art in trusted computing for embedded systems. It shows how a variety of security and trusted computing problems are addressed currently and what solutions are expected to emerge in the coming years. The discussion focuses on attacks aimed at hardware and software for embedded systems, and the authors describe specific solutions to create security features. Case studies are used to present new techniques designed as industrial security solutions. Coverage includes development of tamper resistant hardware and firmware mechanisms for lightweight embedded devices, as well as those serving as security anchors for embedded platforms required by applications such as smart power grids, smart networked and home appliances, environmental and infrastructure sensor networks, etc.
    · Enables readers to address a variety of security threats to embedded hardware and software;
    · Describes design of secure wireless sensor networks, to address secure authen...

  10. Method of semi-automatic high precision potentiometric titration for characterization of uranium compounds; Metodo de titulacao potenciometrica de alta precisao semi-automatizado para a caracterizacao de compostos de uranio

    Energy Technology Data Exchange (ETDEWEB)

    Cristiano, Barbara Fernandes G.; Dias, Fabio C.; Barros, Pedro D. de; Araujo, Radier Mario S. de; Delgado, Jose Ubiratan; Silva, Jose Wanderley S. da, E-mail: barbara@ird.gov.b, E-mail: fabio@ird.gov.b, E-mail: pedrodio@ird.gov.b, E-mail: radier@ird.gov.b, E-mail: delgado@ird.gov.b, E-mail: wanderley@ird.gov.b [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil); Lopes, Ricardo T., E-mail: ricardo@lin.ufrj.b [Universidade Federal do Rio de Janeiro (LIN/COPPE/UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-Graduacao de Engenharia. Lab. de Instrumentacao Nuclear

    2011-10-26

    The method of high precision potentiometric titration is widely used in the certification and characterization of uranium compounds. In order to reduce the analysis time and diminish the influence of the analyst, a semi-automatic version of the method was developed at the safeguards laboratory of CNEN-RJ, Brazil. The method was applied with traceability guaranteed by the use of a primary standard of potassium dichromate. The combined standard uncertainty in the determination of the total uranium concentration was of the order of 0.01%, better than the roughly 0.1% of the methods traditionally used by nuclear installations.
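
    An illustrative propagation, not the laboratory's actual uncertainty budget: combining assumed relative standard uncertainties of the main titration inputs in quadrature gives a figure of the order quoted above.

        import math

        u_rel = {                     # assumed component uncertainties, in %
            "dichromate standard": 0.005,
            "mass of sample":      0.004,
            "titration endpoint":  0.007,
        }
        # Uncorrelated contributions combine in quadrature
        u_combined = math.sqrt(sum(u**2 for u in u_rel.values()))
        print(f"combined standard uncertainty: {u_combined:.3f}%")   # ~0.009%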

  11. Portable computers - portable operating systems

    International Nuclear Information System (INIS)

    Wiegandt, D.

    1985-01-01

    Hardware development has made rapid progress over the past decade. Computers used to have attributes like ''general purpose'' or ''universal''; nowadays they are labelled ''personal'' and ''portable''. Recently, a major manufacturing company started marketing a portable version of their personal computer. But even for these small computers the old truth still holds that the biggest disadvantage of a computer is that it must be programmed; hardware by itself does not make a computer. (orig.)

  12. ELASTIC CLOUD COMPUTING ARCHITECTURE AND SYSTEM FOR HETEROGENEOUS SPATIOTEMPORAL COMPUTING

    Directory of Open Access Journals (Sweden)

    X. Shi

    2017-10-01

    Spatiotemporal computation implements a variety of different algorithms. When big data are involved, a desktop computer or standalone application may not be able to complete the computation task due to limited memory and computing power. Now that a variety of hardware accelerators and computing platforms are available to improve the performance of geocomputation, different algorithms may have different behavior on different computing infrastructure and platforms. Some are perfect for implementation on a cluster of graphics processing units (GPUs), while GPUs may not be useful on certain kinds of spatiotemporal computation. This is the same situation in utilizing a cluster of Intel's many-integrated-core (MIC) or Xeon Phi, as well as Hadoop or Spark platforms, to handle big spatiotemporal data. Furthermore, considering the energy efficiency requirement in general computation, a Field Programmable Gate Array (FPGA) may be a better solution for better energy efficiency when the performance of computation could be similar or better than GPUs and MICs. It is expected that an elastic cloud computing architecture and system that integrates all of GPUs, MICs, and FPGAs could be developed and deployed to support spatiotemporal computing over heterogeneous data types and computational problems.

  13. Elastic Cloud Computing Architecture and System for Heterogeneous Spatiotemporal Computing

    Science.gov (United States)

    Shi, X.

    2017-10-01

    Spatiotemporal computation implements a variety of different algorithms. When big data are involved, desktop computer or standalone application may not be able to complete the computation task due to limited memory and computing power. Now that a variety of hardware accelerators and computing platforms are available to improve the performance of geocomputation, different algorithms may have different behavior on different computing infrastructure and platforms. Some are perfect for implementation on a cluster of graphics processing units (GPUs), while GPUs may not be useful on certain kind of spatiotemporal computation. This is the same situation in utilizing a cluster of Intel's many-integrated-core (MIC) or Xeon Phi, as well as Hadoop or Spark platforms, to handle big spatiotemporal data. Furthermore, considering the energy efficiency requirement in general computation, Field Programmable Gate Array (FPGA) may be a better solution for better energy efficiency when the performance of computation could be similar or better than GPUs and MICs. It is expected that an elastic cloud computing architecture and system that integrates all of GPUs, MICs, and FPGAs could be developed and deployed to support spatiotemporal computing over heterogeneous data types and computational problems.

  14. Impact of new computing systems on finite element computations

    International Nuclear Information System (INIS)

    Noor, A.K.; Fulton, R.E.; Storaasi, O.O.

    1983-01-01

    Recent advances in computer technology that are likely to impact finite element computations are reviewed. The characteristics of supersystems, highly parallel systems, and small systems (mini and microcomputers) are summarized. The interrelations of numerical algorithms and software with parallel architectures are discussed. A scenario is presented for future hardware/software environment and finite element systems. A number of research areas which have high potential for improving the effectiveness of finite element analysis in the new environment are identified

  15. A remote assessment system with a vision robot and wearable sensors.

    Science.gov (United States)

    Zhang, Tong; Wang, Jue; Ren, Yumiao; Li, Jianjun

    2004-01-01

    This paper describes a remote rehabilitation assessment system under ongoing research that has a 6-degree-of-freedom, dual-eye vision robot to capture visual information, and a group of wearable sensors to acquire biomechanical signals. A server computer is fixed on the robot to provide services to the robot's controller and all the sensors. The robot is connected to the Internet by a wireless channel, as are the sensors to the robot. Rehabilitation professionals can semi-automatically run an assessment program via the Internet. The preliminary results show that the smart devices, including the robot and the sensors, can improve the quality of remote assessment and reduce the complexity of operation at a distance.

  16. Conflict Resolution in Computer Systems

    Directory of Open Access Journals (Sweden)

    G. P. Mojarov

    2015-01-01

    A conflict situation in a computer system (CS) is the phenomenon arising when processes have shared access to common resources and none of the involved processes can proceed, because each is waiting for resources locked by other processes which, in turn, are in a similar position. The conflict situation is also called a deadlock, and it has a quite clear impact on the CS state. Finding practical algorithms to resolve deadlocks is of significant applied importance for ensuring the information security of the computing process, and the present article is therefore aimed at solving this relevant problem. The gravity of the situation depends on the types of processes in a deadlock, the types of resources used, the number of processes, and many other factors. A disadvantage of the deadlock-prevention method used in many modern operating systems, based on preliminary planning of the resources required by a process, is obvious: the waiting time can be overlong. The prevention method based on interrupting a process and deallocating its resources is very specific and of little effect when there is a set of polytypic resources requested dynamically. The drawback of another method, preventing deadlocks by ordering resources, consists in the restriction of the possible sequences of resource requests. A different way of "struggle" against deadlocks is the avoidance of impasses, in which the appearance of a deadlock is predicted. Methods are known [1,4,5] to define and prevent the conditions under which deadlocks may occur. These use preliminary information on which resources a running process can request. Before allocating a free resource to the process, a test for a state “safety” condition is carried out. The state is "safe" if no deadlock can occur in the future as a result of the resource allocation to the process. Otherwise the state is considered to be "hazardous", and the resource allocation is postponed. The obvious...
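
    The 'safe state' test sketched above is essentially the banker's algorithm; a compact illustration follows, in which a state is safe if some completion order of all processes exists given the currently available resources.

        def is_safe(available, allocation, need):
            # available: free units per resource type
            # allocation: units currently held by each process
            # need: remaining units each process may still request
            work = list(available)
            done = [False] * len(allocation)
            progress = True
            while progress:
                progress = False
                for i, (alloc, nd) in enumerate(zip(allocation, need)):
                    if not done[i] and all(n <= w for n, w in zip(nd, work)):
                        # Process i can finish and release its allocation
                        work = [w + a for w, a in zip(work, alloc)]
                        done[i] = True
                        progress = True
            return all(done)

        # Example: 2 resource types, 3 processes; prints True (safe state)
        print(is_safe([3, 3],
                      [[0, 1], [2, 0], [3, 0]],
                      [[7, 2], [1, 2], [0, 3]]))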

  17. Integrated Computer System of Management in Logistics

    Science.gov (United States)

    Chwesiuk, Krzysztof

    2011-06-01

    This paper aims at presenting a concept of an integrated computer system of management in logistics, particularly in supply and distribution chains. Consequently, the paper includes the basic idea of the concept of computer-based management in logistics and components of the system, such as CAM and CIM systems in production processes, and management systems for storage, materials flow, and for managing transport, forwarding and logistics companies. The platform which integrates computer-aided management systems is that of electronic data interchange.

  18. Digital optical computers at the optoelectronic computing systems center

    Science.gov (United States)

    Jordan, Harry F.

    1991-01-01

    The Digital Optical Computing Program within the National Science Foundation Engineering Research Center for Opto-electronic Computing Systems has as its specific goal research on optical computing architectures suitable for use at the highest possible speeds. The program can be targeted toward exploiting the time domain because other programs in the Center are pursuing research on parallel optical systems, exploiting optical interconnection and optical devices and materials. Using a general purpose computing architecture as the focus, we are developing design techniques, tools and architecture for operation at the speed of light limit. Experimental work is being done with the somewhat low speed components currently available but with architectures which will scale up in speed as faster devices are developed. The design algorithms and tools developed for a general purpose, stored program computer are being applied to other systems such as optimally controlled optical communication networks.

  19. 32P-postlabeling assay for carcinogen-DNA adducts: description of beta shielding apparatus and semi-automatic spotting and washing devices that facilitate the handling of multiple samples

    International Nuclear Information System (INIS)

    Reddy, M.V.; Blackburn, G.R.

    1990-01-01

    The utilization of the 32P-postlabeling assay in combination with TLC for the sensitive detection and estimation of aromatic DNA adducts has been increasing. The procedure consists of 32P-labeling of carcinogen-adducted 3'-nucleotides in the DNA digests using γ-32P ATP and polynucleotide kinase, separation of the 32P-labeled adducts by TLC, and their detection by autoradiography. During both 32P-labeling and the initial phases of TLC, a relatively high amount of γ-32P ATP is handled when 30 samples are processed simultaneously. We describe the design of acrylic shielding apparatus, semi-automatic TLC spotting devices, and devices for the development and washing of multiple TLC plates, which not only provide substantial protection from exposure to 32P beta radiation but also allow quick and easy handling of a large number of samples. Specifically, the equipment includes: (i) a multi-tube carousel rack having 15 wells to hold capless Eppendorf tubes and a rotatable lid with an aperture to access individual tubes; (ii) a pipette shielder; (iii) two semi-automatic spotting devices to apply radioactive solutions to TLC plates; (iv) a multi-plate holder for TLC plates; and (v) a mechanical device for washing multiple TLC plates. Item (i) is small enough to be held in one hand, vortexed, and centrifuged to mix the solutions in each tube while beta radiation is shielded. Items (iii) and (iv) aid in the automation of the assay. (author)

  20. A distributed computer system for digitising machines

    International Nuclear Information System (INIS)

    Bairstow, R.; Barlow, J.; Waters, M.; Watson, J.

    1977-07-01

    This paper describes a Distributed Computing System, based on micro computers, for the monitoring and control of digitising tables used by the Rutherford Laboratory Bubble Chamber Research Group in the measurement of bubble chamber photographs. (author)

  1. Applied computation and security systems

    CERN Document Server

    Saeed, Khalid; Choudhury, Sankhayan; Chaki, Nabendu

    2015-01-01

    This book contains the extended versions of the works that were presented and discussed in the First International Doctoral Symposium on Applied Computation and Security Systems (ACSS 2014), held during April 18-20, 2014 in Kolkata, India. The symposium was jointly organized by the AGH University of Science & Technology, Cracow, Poland and the University of Calcutta, India. Volume I of this double-volume book contains fourteen high-quality book chapters in three different parts. Part 1, on Pattern Recognition, presents four chapters. Part 2, on Imaging and Healthcare Applications, contains four more book chapters. Part 3 of this volume, on Wireless Sensor Networking, includes as many as six chapters. Volume II of the book has three parts presenting a total of eleven chapters. Part 4 consists of five excellent chapters on Software Engineering, ranging from cloud service design to transactional memory. Part 5 in Volume II is on Cryptography with two book...

  2. Universal blind quantum computation for hybrid system

    Science.gov (United States)

    Huang, He-Liang; Bao, Wan-Su; Li, Tan; Li, Feng-Guang; Fu, Xiang-Qun; Zhang, Shuo; Zhang, Hai-Long; Wang, Xiang

    2017-08-01

    As progress on building quantum computers continues to advance, first-generation practical quantum computers will be available to ordinary users in the cloud style, similar to IBM's Quantum Experience today. Clients can remotely access the quantum servers using simple devices. In such a situation, it is of prime importance to keep the client's information secure. Blind quantum computation protocols enable a client with limited quantum technology to delegate her quantum computation to a quantum server without leaking any privacy. To date, blind quantum computation has been considered only for individual quantum systems. However, a practical universal quantum computer is likely to be a hybrid system. Here, we take the first step toward constructing a framework of blind quantum computation for hybrid systems, which provides a more feasible way toward scalable blind quantum computation.

  3. Torness computer system turns round data

    International Nuclear Information System (INIS)

    Dowler, E.; Hamilton, J.

    1989-01-01

    The Torness nuclear power station has two advanced gas-cooled reactors. A key feature is the distributed computer system, which covers both data processing and automatic control. The complete computer system has over 80 processors with 45,000 digital and 22,000 analogue input signals. The on-line control and monitoring system includes operating systems, plant data acquisition and processing, alarm and event detection, communications software, process management systems and database management software. Some features of the system are described. (UK)

  4. Computer Education with "Retired" Industrial Systems.

    Science.gov (United States)

    Nesin, Dan; And Others

    1980-01-01

    Describes a student-directed computer system revival project in the Electrical and Computer Engineering department at California State Polytechnic University, which originated when an obsolete computer was donated to the department. Discusses resulting effects in undergraduate course offerings, in student extracurricular activities, and in…

  5. The Northeast Utilities generic plant computer system

    International Nuclear Information System (INIS)

    Spitzner, K.J.

    1980-01-01

    A variety of computer manufacturers' equipment monitors plant systems in Northeast Utilities' (NU) nuclear and fossil power plants. The hardware configuration and the application software in each of these systems are essentially one of a kind. Over the next few years these computer systems will be replaced by the NU Generic System, whose prototype is under development for Millstone III, an 1150 MWe pressurized water reactor plant being constructed in Waterford, Connecticut. This paper discusses the Millstone III computer system design, concentrating on the special problems inherent in a distributed system configuration such as this. (auth)

  6. Distributed computer systems theory and practice

    CERN Document Server

    Zedan, H S M

    2014-01-01

    Distributed Computer Systems: Theory and Practice is a collection of papers dealing with the design and implementation of operating systems, including distributed systems, such as the amoeba system, argus, Andrew, and grapevine. One paper discusses the concepts and notations for concurrent programming, particularly language notation used in computer programming, synchronization methods, and also compares three classes of languages. Another paper explains load balancing or load redistribution to improve system performance, namely, static balancing and adaptive load balancing. For program effici

  7. Aviation Safety Modeling and Simulation (ASMM) Propulsion Fleet Modeling: A Tool for Semi-Automatic Construction of CORBA-based Applications from Legacy Fortran Programs

    Science.gov (United States)

    Sang, Janche

    2003-01-01

    Within NASA's Aviation Safety Program, NASA GRC participates in the Modeling and Simulation Project called ASMM. NASA GRC's focus is to characterize propulsion system performance from a fleet management and maintenance perspective and, through modeling and simulation, to predict the characteristics of two classes of commercial engines (CFM56 and GE90). In prior years, the High Performance Computing and Communication (HPCC) program funded NASA Glenn's development of a large-scale, detailed simulation for the analysis and design of aircraft engines, called the Numerical Propulsion System Simulation (NPSS). Three major aspects of this modeling (the integration of different engine components, the coupling of multiple disciplines, and engine component zooming at the appropriate level of fidelity) require relatively tight coupling of different analysis codes. Most of these codes in aerodynamics and solid mechanics are written in Fortran. Refitting these legacy Fortran codes with distributed objects can increase their reusability. Aviation Safety's modeling and simulation for characterizing fleet management has similar needs. The modeling and simulation of these propulsion systems use existing Fortran and C codes that are instrumental in determining the performance of the fleet. The research centers on building a CORBA-based development environment for programmers to easily wrap and couple legacy Fortran codes. This environment consists of a C++ wrapper library to hide the details of CORBA and an efficient remote variable scheme to facilitate data exchange between the client and the server model. Additionally, a Web Service model should also be constructed for evaluation of this technology's use over the next two to three years.
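    To illustrate the wrap-and-serve pattern without reproducing the CORBA middleware itself, the sketch below uses Python's standard xmlrpc as a stand-in: a "legacy" routine is hidden behind a wrapper and exposed to remote clients. The routine mean_pressure is a hypothetical placeholder for code that would really live in a compiled Fortran library:

        from xmlrpc.server import SimpleXMLRPCServer

        def mean_pressure(samples):
            # Hypothetical placeholder: in the real system this call would cross
            # into a compiled legacy Fortran routine through a wrapper layer.
            return sum(samples) / len(samples)

        server = SimpleXMLRPCServer(("localhost", 8000), allow_none=True)
        server.register_function(mean_pressure, "mean_pressure")
        print("Serving wrapped legacy routine on port 8000 ...")
        server.serve_forever()

    A client elsewhere would call xmlrpc.client.ServerProxy("http://localhost:8000/").mean_pressure([101.2, 99.8, 100.5]). CORBA additionally provides interface definitions (IDL) and cross-language bindings that this simplified stand-in omits.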

  8. Intelligent computing systems emerging application areas

    CERN Document Server

    Virvou, Maria; Jain, Lakhmi

    2016-01-01

    This book at hand explores emerging scientific and technological areas in which Intelligent Computing Systems provide efficient solutions and, thus, may play a role in the years to come. It demonstrates how Intelligent Computing Systems make use of computational methodologies that mimic nature-inspired processes to address real world problems of high complexity for which exact mathematical solutions, based on physical and statistical modelling, are intractable. Common intelligent computational methodologies are presented including artificial neural networks, evolutionary computation, genetic algorithms, artificial immune systems, fuzzy logic, swarm intelligence, artificial life, virtual worlds and hybrid methodologies based on combinations of the previous. The book will be useful to researchers, practitioners and graduate students dealing with mathematically-intractable problems. It is intended for both the expert/researcher in the field of Intelligent Computing Systems, as well as for the general reader in t...

  9. Nuclear power plant information management system 'NUPIMAS'

    International Nuclear Information System (INIS)

    Matsumoto, M.; Saruyama, I.; Kurokawa, Y.; Kayano, M.; Katto, S.

    1980-01-01

    NUPIMAS is an interactive computer graphics system used for the design of nuclear power plant piping and the production of the associated drawings. Data on piping, ducts, cable trays, equipment and buildings are stored in the computer, and the following conversational-mode design work is performed online by means of graphic displays, plotters and other devices: (1) Piping route study and interference checking. (2) Modification of piping routes and specifications. (3) Semi-automatic design of low-temperature piping supports. As the result of this design work, the following drawings and lists are produced and interactively refined by computer: (1) Composite drawings. (2) Piping assembly drawings and shop drawings. (3) Bill of material. (4) Welding procedure instructions. (5) Duct route drawings (isometric and 3-plane views). (6) Shop and assembly drawings of supports, etc. This system is already in practical use, with good results. (author)

  10. FPGA-accelerated simulation of computer systems

    CERN Document Server

    Angepat, Hari; Chung, Eric S; Hoe, James C; Chung, Eric S

    2014-01-01

    To date, the most common form of simulators of computer systems are software-based running on standard computers. One promising approach to improve simulation performance is to apply hardware, specifically reconfigurable hardware in the form of field programmable gate arrays (FPGAs). This manuscript describes various approaches of using FPGAs to accelerate software-implemented simulation of computer systems and selected simulators that incorporate those techniques. More precisely, we describe a simulation architecture taxonomy that incorporates a simulation architecture specifically designed f

  11. Honeywell modular automation system computer software documentation

    International Nuclear Information System (INIS)

    Cunningham, L.T.

    1997-01-01

    This document provides a Computer Software Documentation for a new Honeywell Modular Automation System (MAS) being installed in the Plutonium Finishing Plant (PFP). This system will be used to control new thermal stabilization furnaces in HA-21I

  12. Peregrine System | High-Performance Computing | NREL

    Science.gov (United States)

    Peregrine provides several classes of nodes that users access. Login Nodes: Peregrine has four login nodes, each of which has Intel E5 processors; in addition to the /scratch file systems, the /mss file system is mounted on all login nodes. Compute Nodes: Peregrine has 2592

  13. DDP-516 Computer Graphics System Capabilities

    Science.gov (United States)

    1972-06-01

    This report describes the capabilities of the DDP-516 Computer Graphics System. One objective of this report is to acquaint DOT management and project planners with the system's current capabilities, applications hardware and software. The Appendix i...

  14. Preventive maintenance for computer systems - concepts & issues ...

    African Journals Online (AJOL)

    Performing preventive maintenance activities for the computer is not optional. The computer is a sensitive and delicate device that needs adequate time and attention to make it work properly. In this paper, the concept and issues on how to prolong the life span of the system, that is, the way to make the system last long and ...

  15. A computable type theory for control systems

    NARCIS (Netherlands)

    P.J. Collins (Pieter); L. Guo; J. Baillieul

    2009-01-01

    In this paper, we develop a theory of computable types suitable for the study of control systems. The theory uses type-two effectivity as the underlying computational model, but we quickly develop a type system which can be manipulated abstractly, but for which all allowable operations

  16. Generalised Computability and Applications to Hybrid Systems

    DEFF Research Database (Denmark)

    Korovina, Margarita V.; Kudinov, Oleg V.

    2001-01-01

    We investigate the concept of generalised computability of operators and functionals defined on the set of continuous functions, first introduced in [9]. By working in the reals, with equality and without equality, we study properties of generalised computable operators and functionals. We also propose an interesting application to the formalisation of hybrid systems. We obtain a class of hybrid systems whose trajectories are computable in the sense of computable analysis. This research was supported in part by the RFBR (grants N 99-01-00485, N 00-01-00810) and by the Siberian Branch of RAS (a grant for young researchers, 2000).

  17. Computer-Supported Information Systems.

    Science.gov (United States)

    Mayhew, William H.

    1983-01-01

    The planning and implementation of a computerized management information system at a fictional small college is described. Nine key points are made regarding department involvement, centralization, gradual program implementation, lowering costs, system documentation, and upper-level administrative support. (MSE)

  18. MTA Computer Based Evaluation System.

    Science.gov (United States)

    Brenner, Lisa P.; And Others

    The MTA PLATO-based evaluation system, which has been implemented by a consortium of schools of medical technology, is designed to be general-purpose, modular, data-driven, and interactive, and to accommodate other national and local item banks. The system provides a comprehensive interactive item-banking system in conjunction with online student…

  19. Analysis of carotid artery plaque and wall boundaries on CT images by using a semi-automatic method based on level set model

    International Nuclear Information System (INIS)

    Saba, Luca; Sannia, Stefano; Ledda, Giuseppe; Gao, Hao; Acharya, U.R.; Suri, Jasjit S.

    2012-01-01

    The purpose of this study was to evaluate the potential of a semi-automated technique for the detection and measurement of carotid artery plaque. Twenty-two consecutive patients (18 males, 4 females; mean age 62 years) examined with MDCTA from January 2011 to March 2011 were included in this retrospective study. Carotid arteries were examined with a 16-multi-detector-row CT system, and for each patient the most diseased carotid was selected. In the first phase, the carotid plaque was identified and one experienced radiologist manually traced the inner and outer boundaries by using the polyline and radial distance methods (PDM and RDM, respectively). In the second phase, the carotid inner and outer boundaries were traced with an automated algorithm: the level set method (LSM). Data were compared by using Pearson rho correlation, Bland-Altman analysis, and regression. A total of 715 slices were analyzed. The mean thickness of the plaque using the reference PDM was 1.86 mm, whereas using the LSM-PDM it was 1.96 mm; using the reference RDM it was 2.06 mm, whereas using the LSM-RDM it was 2.03 mm. The correlation values between the references, the LSM, the PDM and the RDM were 0.8428, 0.9921, 0.745 and 0.6425. Bland-Altman analysis demonstrated very good agreement, in particular with the RDM. Results of our study indicate that the LSM can automatically measure the thickness of the plaque and that the best results are obtained with the RDM. Our results suggest that advanced computer-based algorithms can identify and trace plaque boundaries like an experienced human reader. (orig.)
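    To make the boundary-to-boundary measurement concrete, the following numpy sketch implements a generic radial distance measurement between traced inner and outer boundaries. It is an illustrative reconstruction under stated assumptions (centroid-based rays, linear interpolation), not the authors' implementation:

        import numpy as np

        def mean_radial_thickness(inner, outer, n_rays=360):
            """inner, outer: (N, 2) arrays of boundary points (x, y).
            Casts rays from the shared centroid and averages the
            outer-minus-inner radial distances."""
            centre = inner.mean(axis=0)
            def radius_at(boundary, angles):
                rel = boundary - centre
                theta = np.arctan2(rel[:, 1], rel[:, 0])
                r = np.hypot(rel[:, 0], rel[:, 1])
                order = np.argsort(theta)
                return np.interp(angles, theta[order], r[order], period=2 * np.pi)
            angles = np.linspace(-np.pi, np.pi, n_rays, endpoint=False)
            return float(np.mean(radius_at(outer, angles) - radius_at(inner, angles)))

        # Synthetic check: concentric circles of radius 3 and 5 -> thickness ~2 mm.
        t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
        inner = np.c_[3 * np.cos(t), 3 * np.sin(t)]
        outer = np.c_[5 * np.cos(t), 5 * np.sin(t)]
        print(round(mean_radial_thickness(inner, outer), 2))  # ~2.0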

  20. Analysis of carotid artery plaque and wall boundaries on CT images by using a semi-automatic method based on level set model

    Energy Technology Data Exchange (ETDEWEB)

    Saba, Luca; Sannia, Stefano; Ledda, Giuseppe [University of Cagliari - Azienda Ospedaliero Universitaria di Cagliari, Department of Radiology, Monserrato, Cagliari (Italy); Gao, Hao [University of Strathclyde, Signal Processing Centre for Excellence in Signal and Image Processing, Department of Electronic and Electrical Engineering, Glasgow (United Kingdom); Acharya, U.R. [Ngee Ann Polytechnic University, Department of Electronics and Computer Engineering, Clementi (Singapore); Suri, Jasjit S. [Biomedical Technologies Inc., Denver, CO (United States); Idaho State University (Aff.), Pocatello, ID (United States)

    2012-11-15

    The purpose of this study was to evaluate the potential of a semi-automated technique for the detection and measurement of carotid artery plaque. Twenty-two consecutive patients (18 males, 4 females; mean age 62 years) examined with MDCTA from January 2011 to March 2011 were included in this retrospective study. Carotid arteries were examined with a 16-multi-detector-row CT system, and for each patient the most diseased carotid was selected. In the first phase, the carotid plaque was identified and one experienced radiologist manually traced the inner and outer boundaries by using the polyline and radial distance methods (PDM and RDM, respectively). In the second phase, the carotid inner and outer boundaries were traced with an automated algorithm: the level set method (LSM). Data were compared by using Pearson rho correlation, Bland-Altman analysis, and regression. A total of 715 slices were analyzed. The mean thickness of the plaque using the reference PDM was 1.86 mm, whereas using the LSM-PDM it was 1.96 mm; using the reference RDM it was 2.06 mm, whereas using the LSM-RDM it was 2.03 mm. The correlation values between the references, the LSM, the PDM and the RDM were 0.8428, 0.9921, 0.745 and 0.6425. Bland-Altman analysis demonstrated very good agreement, in particular with the RDM. Results of our study indicate that the LSM can automatically measure the thickness of the plaque and that the best results are obtained with the RDM. Our results suggest that advanced computer-based algorithms can identify and trace plaque boundaries like an experienced human reader. (orig.)

  1. Automated Fuel Element Closure Welding System

    International Nuclear Information System (INIS)

    Wahlquist, D.R.

    1993-01-01

    The Automated Fuel Element Closure Welding System is a robotic device that will load and weld top end plugs onto nuclear fuel elements in a highly radioactive, inert-gas environment. The system was developed at Argonne National Laboratory-West as part of the Fuel Cycle Demonstration. The welding system performs four main functions: it (1) injects a small amount of a xenon/krypton gas mixture into specific fuel elements, (2) loads tiny end plugs into the tops of fuel element jackets, (3) welds the end plugs to the element jackets, and (4) performs a dimensional inspection of the pre- and post-welded fuel elements. The system components are modular to facilitate remote replacement of failed parts. The entire system can be operated remotely in manual, semi-automatic, or fully automatic modes using a computer control system. The welding system is currently undergoing software testing and functional checkout.

  2. Configuring a computer-controlled bar system

    OpenAIRE

    Šuštaršič, Nejc

    2010-01-01

    The principal goal of my diploma thesis is creating an application for configuring computer-controlled beverage dispensing systems. In the preamble of my thesis I present the theoretical platform for point-of-sale systems and beverage dispensing systems, which is required for understanding the target problem domain. As with many other fields, computer technologies entered the field of managing bars and restaurants quite some time ago. Basic components of every bar or restaurant a...

  3. Automating the segmentation of medical images for the production of voxel tomographic computational models

    International Nuclear Information System (INIS)

    Caon, M.

    2001-01-01

    Radiation dosimetry for diagnostic medical imaging procedures performed on humans requires anatomically accurate computational models. These may be constructed from medical images as voxel-based tomographic models. However, they are time consuming to produce and, as a consequence, few are available. This paper discusses the emergence of semi-automatic segmentation techniques and describes an application (iRAD) written in Microsoft Visual Basic that allows the bitmap of a medical image to be segmented interactively and semi-automatically while displayed in Microsoft Excel. iRAD will decrease the time required to construct voxel models. Copyright (2001) Australasian College of Physical Scientists and Engineers in Medicine
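    As a concrete illustration of the kind of semi-automatic segmentation such tools perform, the sketch below implements seeded region growing in Python: the user supplies a seed pixel and a tolerance, and the algorithm collects the connected region. iRAD itself is written in Visual Basic and its exact algorithm is not given here, so this is an assumed, generic technique:

        import numpy as np
        from collections import deque

        def region_grow(image, seed, tol=10):
            """Return a boolean mask of pixels connected to 'seed' whose
            intensity differs from the seed intensity by at most 'tol'."""
            h, w = image.shape
            mask = np.zeros((h, w), dtype=bool)
            seed_val = int(image[seed])
            queue = deque([seed])
            mask[seed] = True
            while queue:
                y, x = queue.popleft()
                for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < h and 0 <= nx < w and not mask[ny, nx]
                            and abs(int(image[ny, nx]) - seed_val) <= tol):
                        mask[ny, nx] = True
                        queue.append((ny, nx))
            return mask

        # Tiny synthetic example: a bright square on a dark background.
        img = np.zeros((8, 8), dtype=np.uint8)
        img[2:6, 2:6] = 200
        print(region_grow(img, (3, 3)).sum())  # 16 pixels segmented

    The "interactive" part of a semi-automatic tool reduces to choosing the seed and tolerance; the growth itself is automatic.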

  4. Support system for ATLAS distributed computing operations

    CERN Document Server

    Kishimoto, Tomoe; The ATLAS collaboration

    2018-01-01

    The ATLAS distributed computing system has allowed the experiment to successfully meet the challenges of LHC Run 2. In order for distributed computing to operate smoothly and efficiently, several support teams are organized in the ATLAS experiment. The ADCoS (ATLAS Distributed Computing Operation Shifts) is a dedicated group of shifters who follow and report failing jobs, failing data transfers between sites, degradation of ATLAS central computing services, and more. The DAST (Distributed Analysis Support Team) provides user support to resolve issues related to running distributed analysis on the grid. The CRC (Computing Run Coordinator) maintains a global view of the day-to-day operations. In this presentation, the status and operational experience of the support system for ATLAS distributed computing in LHC Run 2 will be reported. This report also includes operations experience from the grid site point of view, and an analysis of the errors that create the biggest waste of wallclock time. The report of oper...

  5. A cost modelling system for cloud computing

    OpenAIRE

    Ajeh, Daniel; Ellman, Jeremy; Keogh, Shelagh

    2014-01-01

    An advance in technology unlocks new opportunities for organizations to increase their productivity, efficiency and process automation while reducing the cost of doing business as well. The emergence of cloud computing addresses these prospects through the provision of agile systems that are scalable, flexible and reliable as well as cost effective. Cloud computing has made hosting and deployment of computing resources cheaper and easier with no up-front charges but pay per-use flexible payme...

  6. Computational Intelligence in Information Systems Conference

    CERN Document Server

    Au, Thien-Wan; Omar, Saiful

    2017-01-01

    This book constitutes the Proceedings of the Computational Intelligence in Information Systems conference (CIIS 2016), held in Brunei, November 18–20, 2016. The CIIS conference provides a platform for researchers to exchange the latest ideas and to present new research advances in general areas related to computational intelligence and its applications. The 26 revised full papers presented in this book have been carefully selected from 62 submissions. They cover a wide range of topics and application areas in computational intelligence and informatics.

  7. Distributed simulation of large computer systems

    International Nuclear Information System (INIS)

    Marzolla, M.

    2001-01-01

    Sequential simulation of large complex physical systems is often regarded as a computationally expensive task. In order to speed up complex discrete-event simulations, the paradigm of Parallel and Distributed Discrete Event Simulation (PDES) has been developed since the late 1970s. The authors analyze the applicability of PDES to the modeling and analysis of large computer systems; such systems are increasingly common in the area of High Energy and Nuclear Physics, because many modern experiments make use of large 'compute farms'. Some feasibility tests have been performed on a prototype distributed simulator.
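    For readers unfamiliar with the sequential baseline that PDES parallelizes, the following minimal event-list simulator illustrates discrete-event simulation in Python; it is a generic teaching sketch, not the prototype simulator described:

        import heapq
        import itertools

        def simulate(initial_events, horizon=20.0):
            """Run a sequential discrete-event simulation. Each event is
            (time, label, action); action(schedule) may schedule new events."""
            counter = itertools.count()  # tie-breaker so heapq never compares actions
            queue = []
            def schedule(t, label, action):
                heapq.heappush(queue, (t, next(counter), label, action))
            for t, label, action in initial_events:
                schedule(t, label, action)
            while queue:
                t, _, label, action = heapq.heappop(queue)
                if t > horizon:
                    break
                print(f"t={t:5.1f}  {label}")
                action(schedule)

        # Example workload: a job arrives every 3.0 time units until t=12,
        # and each job finishes service 1.0 time unit after it arrives.
        def arrival(schedule, clock=[0.0]):
            schedule(clock[0] + 1.0, "service complete", lambda s: None)
            clock[0] += 3.0
            if clock[0] <= 12.0:
                schedule(clock[0], "job arrival", arrival)

        simulate([(0.0, "job arrival", arrival)])

    PDES partitions this single event list across processors, which is what makes synchronization (conservative or optimistic) the central problem of the field.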

  8. Computer-aided dispatching system design specification

    Energy Technology Data Exchange (ETDEWEB)

    Briggs, M.G.

    1997-12-16

    This document defines the performance requirements for a graphic display dispatching system to support the Hanford Patrol Operations Center. This document reflects the as-built requirements for the system that was delivered by GTE Northwest, Inc. This system provides a commercial off-the-shelf computer-aided dispatching system and alarm monitoring system currently in operation at the Hanford Patrol Operations Center, Building 2721E. This system also provides alarm back-up capability for the Plutonium Finishing Plant (PFP).

  9. Computer-aided dispatching system design specification

    International Nuclear Information System (INIS)

    Briggs, M.G.

    1997-01-01

    This document defines the performance requirements for a graphic display dispatching system to support the Hanford Patrol Operations Center. This document reflects the as-built requirements for the system that was delivered by GTE Northwest, Inc. This system provides a commercial off-the-shelf computer-aided dispatching system and alarm monitoring system currently in operation at the Hanford Patrol Operations Center, Building 2721E. This system also provides alarm back-up capability for the Plutonium Finishing Plant (PFP)

  10. Fault-tolerant computing systems

    International Nuclear Information System (INIS)

    Dal Cin, M.; Hohl, W.

    1991-01-01

    Tests, Diagnosis and Fault Treatment were chosen as the guiding themes of the conference. However, the scope of the conference included reliability, availability, safety and security issues in software and hardware systems as well. The sessions organized for the conference, which was complemented by an industrial presentation, were: Keynote Address, Reconfiguration and Recovery, System-Level Diagnosis, Voting and Agreement, Testing, Fault-Tolerant Circuits, Array Testing, Modelling, Applied Fault Tolerance, Fault-Tolerant Arrays and Systems, Interconnection Networks, and Fault-Tolerant Software. One paper has been indexed separately in the database. (orig./HP)

  11. Computer-aided power systems analysis

    CERN Document Server

    Kusic, George

    2008-01-01

    Computer applications yield more insight into system behavior than is possible by using hand calculations on system elements. Computer-Aided Power Systems Analysis: Second Edition is a state-of-the-art presentation of basic principles and software for power systems in steady-state operation. Originally published in 1985, this revised edition explores power systems from the point of view of the central control facility. It covers the elements of transmission networks, bus reference frame, network fault and contingency calculations, power flow on transmission networks, generator base power setti

  12. Computer Algebra Systems in Undergraduate Instruction.

    Science.gov (United States)

    Small, Don; And Others

    1986-01-01

    Computer algebra systems (such as MACSYMA and muMath) can carry out many of the operations of calculus, linear algebra, and differential equations. Use of them with sketching graphs of rational functions and with other topics is discussed. (MNS)

  13. Computing for Decentralized Systems (lecture 2)

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    With the rise of Bitcoin, Ethereum, and other cryptocurrencies it is becoming apparent the paradigm shift towards decentralized computing. Computer engineers will need to understand this shift when developing systems in the coming years. Transferring value over the Internet is just one of the first working use cases of decentralized systems, but it is expected they will be used for a number of different services such as general purpose computing, data storage, or even new forms of governance. Decentralized systems, however, pose a series of challenges that cannot be addressed with traditional approaches in computing. Not having a central authority implies truth must be agreed upon rather than simply trusted and, so, consensus protocols, cryptographic data structures like the blockchain, and incentive models like mining rewards become critical for the correct behavior of decentralized system. This series of lectures will be a fast track to introduce these fundamental concepts through working examples and pra...

  14. Computing for Decentralized Systems (lecture 1)

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    With the rise of Bitcoin, Ethereum, and other cryptocurrencies it is becoming apparent the paradigm shift towards decentralized computing. Computer engineers will need to understand this shift when developing systems in the coming years. Transferring value over the Internet is just one of the first working use cases of decentralized systems, but it is expected they will be used for a number of different services such as general purpose computing, data storage, or even new forms of governance. Decentralized systems, however, pose a series of challenges that cannot be addressed with traditional approaches in computing. Not having a central authority implies truth must be agreed upon rather than simply trusted and, so, consensus protocols, cryptographic data structures like the blockchain, and incentive models like mining rewards become critical for the correct behavior of decentralized system. This series of lectures will be a fast track to introduce these fundamental concepts through working examples and pra...

  15. The structural robustness of multiprocessor computing system

    Directory of Open Access Journals (Sweden)

    N. Andronaty

    1996-03-01

    Full Text Available A model of a multiprocessor computing system based on transputers is described, which permits the evaluation of structural robustness (viability, survivability).

  16. Computer program for optical systems ray tracing

    Science.gov (United States)

    Ferguson, T. J.; Konn, H.

    1967-01-01

    Program traces rays of light through optical systems consisting of up to 65 different optical surfaces and computes the aberrations. For design purposes, paraxial tracings with astigmatism and third-order tracings are provided.
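    Paraxial tracing of the kind this program provides is conventionally expressed with 2x2 ray-transfer (ABCD) matrices. The sketch below illustrates the idea in Python under that standard formulation; the original program's Fortran-era internals are not reproduced here:

        import numpy as np

        def thin_lens(f):
            """Transfer matrix of a thin lens with focal length f."""
            return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

        def propagation(d):
            """Transfer matrix of free-space propagation over distance d."""
            return np.array([[1.0, d], [0.0, 1.0]])

        # Ray state: column vector (height y, angle u). Trace through a 100 mm
        # gap, a thin lens of focal length 50 mm, then another 100 mm gap.
        # Matrices compose right-to-left, so the first element is rightmost.
        system = propagation(100.0) @ thin_lens(50.0) @ propagation(100.0)
        ray_in = np.array([1.0, 0.0])   # parallel ray at height 1 mm
        ray_out = system @ ray_in
        print(ray_out)                  # [-1.0, -0.02]: inverted image at 2f

    Real surface-by-surface tracing adds refraction at each of the (up to 65) surfaces, but each step is still a small matrix or vector update of this form.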

  17. Embedded High Performance Scalable Computing Systems

    National Research Council Canada - National Science Library

    Ngo, David

    2003-01-01

    The Embedded High Performance Scalable Computing Systems (EHPSCS) program is a cooperative agreement between Sanders, A Lockheed Martin Company and DARPA that ran for three years, from Apr 1995 - Apr 1998...

  18. Development of a Semi-Automatic Technique for Flow Estimation using Optical Flow Registration and k-means Clustering on Two Dimensional Cardiovascular Magnetic Resonance Flow Images

    DEFF Research Database (Denmark)

    Brix, Lau; Christoffersen, Christian P. V.; Kristiansen, Martin Søndergaard

    was then categorized into groups by the k-means clustering method. Finally, the cluster containing the vessel under investigation was selected manually by a single mouse click. All calculations were performed on a Nvidia 8800 GTX graphics card using the Compute Unified Device Architecture (CUDA) extension to the C...
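    The clustering step described can be illustrated with a small, self-contained Lloyd's-iteration k-means in Python. The data, cluster count and the automatic cluster pick below are assumptions for illustration; in the paper the vessel cluster is selected manually with a single mouse click:

        import numpy as np

        def kmeans_1d(values, k, iters=50):
            """Plain Lloyd's iteration on 1-D data (e.g. velocity magnitudes)."""
            centres = np.quantile(values, np.linspace(0.0, 1.0, k))
            for _ in range(iters):
                labels = np.argmin(np.abs(values[:, None] - centres[None, :]), axis=1)
                for j in range(k):
                    if np.any(labels == j):      # leave empty clusters unchanged
                        centres[j] = values[labels == j].mean()
            return labels, centres

        # Synthetic "flow image" pixels: static tissue near 0, vessel flow near 80.
        rng = np.random.default_rng(0)
        velocities = np.concatenate([rng.normal(0, 5, 500), rng.normal(80, 5, 50)])
        labels, centres = kmeans_1d(velocities, k=2)
        vessel = int(np.argmax(centres))   # in the paper this pick is manual
        print((labels == vessel).sum())    # roughly 50 vessel pixels

    The paper runs this kind of computation on the GPU via CUDA; the numpy version above only conveys the clustering logic.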

  19. Computer networks in future accelerator control systems

    International Nuclear Information System (INIS)

    Dimmler, D.G.

    1977-03-01

    Some findings of a study concerning a computer based control and monitoring system for the proposed ISABELLE Intersecting Storage Accelerator are presented. Requirements for development and implementation of such a system are discussed. An architecture is proposed where the system components are partitioned along functional lines. Implementation of some conceptually significant components is reviewed

  20. Computer-Based Wireless Advertising Communication System

    Directory of Open Access Journals (Sweden)

    Anwar Al-Mofleh

    2009-10-01

    Full Text Available In this paper we describe the development of a computer-based wireless advertising communication system (CBWACS) that enables the user to advertise whatever he wants from his own office to the screen in front of the customer via a wireless communication system. This system consists of two PIC microcontrollers, a transmitter, a receiver, an LCD, a serial cable and an antenna. The main advantages of the system are its wireless structure and its reduced susceptibility to noise and other interference, because it uses digital communication techniques.

  1. Computer-Aided dispatching system design specification

    International Nuclear Information System (INIS)

    Briggs, M.G.

    1996-01-01

    This document defines the performance requirements for a graphic display dispatching system to support Hanford Patrol emergency response. The system is defined as a commercial off-the-shelf computer dispatching system providing both text and graphical display information while interfacing with the diverse reporting systems within the Hanford Facility. The system also provides expansion capabilities to integrate Hanford Fire and the Occurrence Notification Center, and provides back-up capabilities for the Plutonium Processing Facility

  2. Computer vision system R&D for EAST Articulated Maintenance Arm robot

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Linglong, E-mail: linglonglin@ipp.ac.cn; Song, Yuntao, E-mail: songyt@ipp.ac.cn; Yang, Yang, E-mail: yangy@ipp.ac.cn; Feng, Hansheng, E-mail: hsfeng@ipp.ac.cn; Cheng, Yong, E-mail: chengyong@ipp.ac.cn; Pan, Hongtao, E-mail: panht@ipp.ac.cn

    2015-11-15

    Highlights: • We discuss the image preprocessing, object detection and pose estimation algorithms under the poor light conditions of the inner vessel of the EAST tokamak. • The main pipeline, including contour detection, contour filtering, MER extraction, object location and pose estimation, is described in detail. • The technical issues encountered during the research are discussed. - Abstract: The Experimental Advanced Superconducting Tokamak (EAST) is the first fully superconducting tokamak device, constructed at the Institute of Plasma Physics, Chinese Academy of Sciences (ASIPP). The EAST Articulated Maintenance Arm (EAMA) robot provides the means of in-vessel maintenance, such as inspection and picking up fragments of the first wall. This paper presents a method to identify and locate the fragments semi-automatically by using computer vision. The use of computer vision for identification and location faces difficult challenges such as shadows, poor contrast, low illumination levels and little texture. The method developed in this paper enables credible identification of objects with shadows through invariant images and edge detection. The proposed algorithms are validated on our ASIPP robotics and computer vision platform (ARVP). The results show that the method can provide a 3D pose with reference to the robot base, so that objects of different shapes and sizes can be picked up successfully.
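    A condensed sketch of the named pipeline (contour detection, contour filtering, minimum-enclosing-rectangle extraction, object location) is shown below, assuming OpenCV 4's Python bindings; the actual EAMA implementation is not public, so the names and thresholds here are illustrative:

        import cv2
        import numpy as np

        def locate_fragments(gray, min_area=100.0):
            """Return (centre, size, angle) of the minimum-area rectangle
            around each sufficiently large bright object in a grayscale image."""
            _, binary = cv2.threshold(gray, 0, 255,
                                      cv2.THRESH_BINARY + cv2.THRESH_OTSU)
            contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                           cv2.CHAIN_APPROX_SIMPLE)
            # Filter out small contours (noise, reflections) before locating.
            return [cv2.minAreaRect(c) for c in contours
                    if cv2.contourArea(c) >= min_area]

        # Synthetic test image: one bright rectangular "fragment" on a dark wall.
        img = np.zeros((200, 200), dtype=np.uint8)
        cv2.rectangle(img, (60, 80), (140, 120), 255, thickness=-1)
        for (cx, cy), (w, h), angle in locate_fragments(img):
            print(f"fragment at ({cx:.0f}, {cy:.0f}), "
                  f"size {w:.0f}x{h:.0f}, angle {angle:.0f}")

    In the real system, the 2D rectangle centre would be combined with the stereo geometry to produce the 3D pose relative to the robot base.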

  3. Information systems and computing technology

    CERN Document Server

    Zhang, Lei

    2013-01-01

    Invited papers: Incorporating the multi-cross-sectional temporal effect in Geographically Weighted Logit Regression (K. Wu, B. Liu, B. Huang & Z. Lei); One shot learning human actions recognition using key poses (W.H. Zou, S.G. Li, Z. Lei & N. Dai); Band grouping pansharpening for WorldView-2 satellite images (X. Li); Research on GIS based haze trajectory data analysis system (Y. Wang, J. Chen, J. Shu & X. Wang). Regular papers: A warning model of systemic financial risks (W. Xu & Q. Wang); Research on smart mobile phone user experience with grounded theory (J.P. Wan & Y.H. Zhu); The software reliability analysis based on

  4. Computer automation of a dilution cryogenic system

    International Nuclear Information System (INIS)

    Nogues, C.

    1992-09-01

    This study was carried out in the framework of research on developing new techniques for low-temperature detectors for neutrinos and dark matter. The principles of low-temperature physics and of helium-4 and dilution cryostats are first reviewed. The cryogenic system used and the techniques for low-temperature thermometry and regulation are then described. The computer automation of the dilution cryogenic system involves: numerical measurement of the parameter set (pressure, temperature, flow rate); computer-assisted operation of the cryostat and the pump bench; numerical regulation of pressure and temperature; and full automation of operation sequences, allowing the system to evolve from one state to another (a temperature descent, for example).
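    The numerical regulation mentioned is commonly realized with a discrete PID loop; the sketch below illustrates that assumption in Python (the paper's actual control law and gains are not specified, and the "plant" here is a crude stand-in):

        class PID:
            def __init__(self, kp, ki, kd, setpoint):
                self.kp, self.ki, self.kd = kp, ki, kd
                self.setpoint = setpoint
                self.integral = 0.0
                self.prev_error = None

            def update(self, measurement, dt):
                """Return the actuator command for the latest measurement."""
                error = self.setpoint - measurement
                self.integral += error * dt
                derivative = (0.0 if self.prev_error is None
                              else (error - self.prev_error) / dt)
                self.prev_error = error
                return self.kp * error + self.ki * self.integral + self.kd * derivative

        # Toy plant: temperature relaxes toward the setpoint under the command.
        pid, temp = PID(kp=2.0, ki=0.5, kd=0.1, setpoint=4.2), 300.0
        for step in range(5):
            power = pid.update(temp, dt=1.0)
            temp += 0.01 * (power - (temp - 4.2))   # crude first-order response
            print(f"step {step}: T = {temp:.1f} K")

    In the real cryostat the loop period, gains and actuators (heater power, valve positions) would be tuned to the thermal time constants of the dilution stage.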

  5. Artificial immune system applications in computer security

    CERN Document Server

    Tan, Ying

    2016-01-01

    This book provides state-of-the-art information on the use, design, and development of the Artificial Immune System (AIS) and AIS-based solutions to computer security issues. Artificial Immune System: Applications in Computer Security focuses on the technologies and applications of AIS in malware detection proposed in recent years by the Computational Intelligence Laboratory of Peking University (CIL@PKU). It offers a theoretical perspective as well as practical solutions for readers interested in AIS, machine learning, pattern recognition and computer security. The book begins by introducing the basic concepts, typical algorithms, important features, and some applications of AIS. The second chapter introduces malware and its detection methods, especially for immune-based malware detection approaches. Successive chapters present a variety of advanced detection approaches for malware, including Virus Detection System, K-Nearest Neighbour (KNN), RBF networks, and Support Vector Machines (SVM), Danger theory, ...

  6. Replacement of the JRR-3 computer system

    Energy Technology Data Exchange (ETDEWEB)

    Kato, Tomoaki; Kobayashi, Kenichi; Suwa, Masayuki; Mineshima, Hiromi; Sato, Mitsugu [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2000-10-01

    The JRR-3 computer system has contributed to the stable operation of JRR-3 since 1990. However, about 10 years have passed since it was designed, and some problems have occurred. Under these circumstances, the old computer system should be replaced to ensure safe and stable operation. In this replacement, the system is improved with regard to the man-machine interface and maintenance efficiency. The new system consists of three functions: 'the function of management for operation information' (renewed function), 'the function of management for facility information' (new function), and 'the function of management for information publication' (new function). Through this replacement, the new JRR-3 computer system can contribute to safe and stable operation. (author)

  7. Quantum Computing in Solid State Systems

    CERN Document Server

    Ruggiero, B; Granata, C

    2006-01-01

    The aim of Quantum Computation in Solid State Systems is to report on recent theoretical and experimental results on the macroscopic quantum coherence of mesoscopic systems, as well as on solid state realization of qubits and quantum gates. Particular attention has been given to coherence effects in Josephson devices. Other solid state systems, including quantum dots, optical, ion, and spin devices which exhibit macroscopic quantum coherence are also discussed. Quantum Computation in Solid State Systems discusses experimental implementation of quantum computing and information processing devices, and in particular observations of quantum behavior in several solid state systems. On the theoretical side, the complementary expertise of the contributors provides models of the various structures in connection with the problem of minimizing decoherence.

  8. Role of computers in CANDU safety systems

    International Nuclear Information System (INIS)

    Hepburn, G.A.; Gilbert, R.S.; Ichiyen, N.M.

    1985-01-01

    Small digital computers are playing an expanding role in the safety systems of CANDU nuclear generating stations, both as active components in the trip logic and as monitoring and testing systems. The paper describes three recent applications: (i) a programmable controller was retrofitted to Bruce 'A' Nuclear Generating Station to handle trip setpoint modification as a function of booster rod insertion; (ii) a centralized monitoring computer to monitor both shutdown systems and the Emergency Coolant Injection system is currently being retrofitted to Bruce 'A'; (iii) process trips were implemented on the CANDU 600 design using microcomputers. While not truly a retrofit, this last feature was added very late in the design cycle to increase the margin against spurious trips, and has now seen about 4 unit-years of service at three separate sites. Committed future applications of computers in special safety systems are also described. (author)

  9. Replacement of the JRR-3 computer system

    International Nuclear Information System (INIS)

    Kato, Tomoaki; Kobayashi, Kenichi; Suwa, Masayuki; Mineshima, Hiromi; Sato, Mitsugu

    2000-01-01

    The JRR-3 computer system has contributed to the stable operation of JRR-3 since 1990. However, about 10 years have passed since it was designed, and some problems have occurred. Under these circumstances, the old computer system should be replaced to ensure safe and stable operation. In this replacement, the system is improved with regard to the man-machine interface and maintenance efficiency. The new system consists of three functions: 'the function of management for operation information' (renewed function), 'the function of management for facility information' (new function), and 'the function of management for information publication' (new function). Through this replacement, the new JRR-3 computer system can contribute to safe and stable operation. (author)

  10. Real time computer system with distributed microprocessors

    International Nuclear Information System (INIS)

    Heger, D.; Steusloff, H.; Syrbe, M.

    1979-01-01

    The usual centralized structure of computer systems, especially of process computer systems, cannot take sufficient advantage of the progress of very-large-scale integrated semiconductor technology with respect to increasing reliability and performance and decreasing expenses, especially for the external periphery. This, together with the increasing demands on process control systems, has led the authors to re-examine the structure of such systems in general and to adapt it to the new environment. Computer systems with distributed, optical-fibre-coupled microprocessors allow very favourable problem-solving with decentrally controlled buslines and functional redundancy with automatic fault diagnosis and reconfiguration. A suitable programming system supports these hardware properties: PEARL for multicomputer systems, a dynamic loader, and processor and network operating systems. The necessary design principles are established mainly theoretically and by value analysis. An optimal overall system of this new generation of process control systems was developed, supported by the results of two PDV projects (modular operating systems, input/output colour screen system as control panel), and tested by applying the system to the control of 28 pit furnaces of a steelworks. (orig.) [de

  11. Integration of scheduling and discrete event simulation systems to improve production flow planning

    Science.gov (United States)

    Krenczyk, D.; Paprocka, I.; Kempa, W. M.; Grabowik, C.; Kalinowski, K.

    2016-08-01

    The increased availability of data and of computer-aided technologies such as MRP I/II, ERP and MES systems allows producers to be more adaptive to market dynamics and to improve production scheduling. The integration of production scheduling with computer modelling, simulation and visualization systems can be useful in the analysis of production system constraints related to the efficiency of manufacturing systems. An integration methodology based on a semi-automatic model generation method is proposed to eliminate problems associated with model complexity and with the labour-intensive, time-consuming process of simulation model creation. Data mapping and data transformation techniques for the proposed method have been applied; a sketch of this mapping step is given below. This approach is illustrated through examples of a practical implementation of the proposed method using the KbRS scheduling system and the Enterprise Dynamics simulation system.
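    A minimal Python sketch of the data mapping and transformation step follows; the field names and structures are assumptions for illustration, not the actual KbRS or Enterprise Dynamics schemas:

        from dataclasses import dataclass

        @dataclass
        class SimOperation:
            job: str
            machine: str
            start: float
            duration: float

        def map_schedule(rows):
            """Transform scheduler export rows (dicts) into the operation
            objects a simulation model expects."""
            return [SimOperation(job=r["job_id"],
                                 machine=r["resource"],
                                 start=float(r["start_time"]),
                                 duration=float(r["end_time"]) - float(r["start_time"]))
                    for r in rows]

        rows = [{"job_id": "J1", "resource": "M1", "start_time": "0", "end_time": "4"},
                {"job_id": "J2", "resource": "M1", "start_time": "4", "end_time": "9"}]
        for op in map_schedule(rows):
            print(op)

    The point of the semi-automatic approach is that such transforms are generated or configured once per schema rather than hand-coded for every model.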

  12. Intelligent computational systems for space applications

    Science.gov (United States)

    Lum, Henry; Lau, Sonie

    Intelligent computational systems can be described as an adaptive computational system integrating both traditional computational approaches and artificial intelligence (AI) methodologies to meet the science and engineering data processing requirements imposed by specific mission objectives. These systems will be capable of integrating, interpreting, and understanding sensor input information; correlating that information to the "world model" stored within its data base and understanding the differences, if any; defining, verifying, and validating a command sequence to merge the "external world" with the "internal world model"; and, controlling the vehicle and/or platform to meet the scientific and engineering mission objectives. Performance and simulation data obtained to date indicate that the current flight processors baselined for many missions such as Space Station Freedom do not have the computational power to meet the challenges of advanced automation and robotics systems envisioned for the year 2000 era. Research issues which must be addressed to achieve greater than giga-flop performance for on-board intelligent computational systems have been identified, and a technology development program has been initiated to achieve the desired long-term system performance objectives.

  13. Wind power systems. Applications of computational intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Lingfeng [Toledo Univ., OH (United States). Dept. of Electrical Engineering and Computer Science; Singh, Chanan [Texas A and M Univ., College Station, TX (United States). Electrical and Computer Engineering Dept.; Kusiak, Andrew (eds.) [Iowa Univ., Iowa City, IA (United States). Mechanical and Industrial Engineering Dept.

    2010-07-01

    Renewable energy sources such as wind power have attracted much attention because they are environmentally friendly, do not produce carbon dioxide and other emissions, and can enhance a nation's energy security. For example, recently more significant amounts of wind power are being integrated into conventional power grids. Therefore, it is necessary to address various important and challenging issues related to wind power systems, which are significantly different from the traditional generation systems. This book is a resource for engineers, practitioners, and decision-makers interested in studying or using the power of computational intelligence based algorithms in handling various important problems in wind power systems at the levels of power generation, transmission, and distribution. Researchers have been developing biologically-inspired algorithms in a wide variety of complex large-scale engineering domains. Distinguished from the traditional analytical methods, the new methods usually accomplish the task through their computationally efficient mechanisms. Computational intelligence methods such as evolutionary computation, neural networks, and fuzzy systems have attracted much attention in electric power systems. Meanwhile, modern electric power systems are becoming more and more complex in order to meet the growing electricity market. In particular, the grid complexity is continuously enhanced by the integration of intermittent wind power as well as the current restructuring efforts in electricity industry. Quite often, the traditional analytical methods become less efficient or even unable to handle this increased complexity. As a result, it is natural to apply computational intelligence as a powerful tool to deal with various important and pressing problems in the current wind power systems. This book presents the state-of-the-art development in the field of computational intelligence applied to wind power systems by reviewing the most up

  14. Expert systems and computer based industrial systems

    International Nuclear Information System (INIS)

    Dunand, R.

    1989-01-01

    Framentec is the artificial intelligence subsidiary of FRAMATOME. It is involved in expert-system activities covering shells, development, methodology, and software for maintenance (Maintex), as well as consulting. Specific applications in the nuclear field are presented. The first is a prototype expert system to assist in piping support design; the second is an expert system that assists an ultrasonic testing operator in determining the nature of a welding defect; and the third is a welding machine diagnosis advisor. Maintex is a software tool that provides assistance in the repair of complex industrial equipment. (author)

  15. Present SLAC accelerator computer control system features

    International Nuclear Information System (INIS)

    Davidson, V.; Johnson, R.

    1981-02-01

    The current functional organization and state of software development of the computer control system of the Stanford Linear Accelerator is described. Included is a discussion of the distribution of functions throughout the system, the local controller features, and currently implemented features of the touch panel portion of the system. The functional use of our triplex of PDP11-34 computers sharing common memory is described. Also included is a description of the use of pseudopanel tables as data tables for closed loop control functions

  16. Terrace Layout Using a Computer Assisted System

    Science.gov (United States)

    Development of a web-based terrace design tool based on the MOTERR program is presented, along with representative layouts for conventional and parallel terrace systems. Using digital elevation maps and geographic information systems (GIS), this tool utilizes personal computers to rapidly construct ...

  17. Case Studies in Library Computer Systems.

    Science.gov (United States)

    Palmer, Richard Phillips

    Twenty descriptive case studies of computer applications in a variety of libraries are presented in this book. Computerized circulation, serial and acquisition systems in public, high school, college, university and business libraries are included. Each of the studies discusses: 1) the environment in which the system operates, 2) the objectives of…

  18. Slab cooling system design using computer simulation

    NARCIS (Netherlands)

    Lain, M.; Zmrhal, V.; Drkal, F.; Hensen, J.L.M.

    2007-01-01

    For a new technical library building in Prague, computer simulations were carried out to support the design of the slab cooling system and to optimize the capacity of the chillers. The paper presents the concept of the new technical library HVAC system, the model of the building, and results of the energy simulations for

  19. Honeywell Modular Automation System Computer Software Documentation

    International Nuclear Information System (INIS)

    CUNNINGHAM, L.T.

    1999-01-01

    This document provides a Computer Software Documentation for a new Honeywell Modular Automation System (MAS) being installed in the Plutonium Finishing Plant (PFP). This system will be used to control new thermal stabilization furnaces in HA-211 and vertical denitration calciner in HC-230C-2

  20. Computer Application Systems at the University.

    Science.gov (United States)

    Bazewicz, Mieczyslaw

    1979-01-01

    The results of the WASC Project at the Technical University of Wroclaw have confirmed the possibility of constructing information systems based on the recognized size and specifics of users' needs (the needs of the university) and have provided some solutions to the problem of collaboration among computer systems at remote universities. (Author/CMV)

  1. Cloud Computing for Standard ERP Systems

    DEFF Research Database (Denmark)

    Schubert, Petra; Adisa, Femi

    Cloud Computing is a topic that has gained momentum in the last years. Current studies show that an increasing number of companies is evaluating the promised advantages and considering making use of cloud services. In this paper we investigate the phenomenon of cloud computing and its importance for the operation of ERP systems. We argue that the phenomenon of cloud computing could lead to a decisive change in the way business software is deployed in companies. Our reference framework contains three levels (IaaS, PaaS, SaaS) and clarifies the meaning of public, private and hybrid clouds. The three levels of cloud computing and their impact on ERP systems operation are discussed. From the literature we identify areas for future research and propose a research agenda.

  2. Applying improved instrumentation and computer control systems

    International Nuclear Information System (INIS)

    Bevilacqua, F.; Myers, J.E.

    1977-01-01

    In-core and out-of-core instrumentation systems for the Cherokee-I reactor are described. The reactor has 61 in-core instrument assemblies. Continuous computer monitoring and processing of data from over 300 fixed detectors will be used to improve the manoeuvring of core power. The plant protection system is a standard package for the Combustion Engineering System 80, consisting of two independent systems, the reactor protection system and the engineered safety features actuation system, both of which are designed to meet NRC, ANS and IEEE design criteria and standards. The plant protection system has its own computer, which provides plant monitoring, alarming, logging and performance calculations. (U.K.)

  3. Unified Computational Intelligence for Complex Systems

    CERN Document Server

    Seiffertt, John

    2010-01-01

    Computational intelligence encompasses a wide variety of techniques that allow computation to learn, to adapt, and to seek. That is, they may be designed to learn information without explicit programming regarding the nature of the content to be retained, they may be imbued with the functionality to adapt to maintain their course within a complex and unpredictably changing environment, and they may help us seek out truths about our own dynamics and lives through their inclusion in complex system modeling. These capabilities place our ability to compute in a category apart from our ability to e…

  4. Computer networks ISE a systems approach

    CERN Document Server

    Peterson, Larry L

    2007-01-01

    Computer Networks, 4E is the only introductory computer networking book written by authors who have had first-hand experience with many of the protocols discussed in the book, who have actually designed some of them as well, and who are still actively designing the computer networks today. This newly revised edition continues to provide an enduring, practical understanding of networks and their building blocks through rich, example-based instruction. The authors' focus is on the why of network design, not just the specifications comprising today's systems but how key technologies and p…

  5. Computer surety: computer system inspection guidance. [Contains glossary]

    Energy Technology Data Exchange (ETDEWEB)

    1981-07-01

    This document discusses computer surety in NRC-licensed nuclear facilities from the perspective of physical protection inspectors. It gives background information and a glossary of computer terms, along with threats and computer vulnerabilities, methods used to harden computer elements, and computer audit controls.

  6. Computer-Aided dispatching system design specification

    Energy Technology Data Exchange (ETDEWEB)

    Briggs, M.G.

    1996-09-27

    This document defines the performance requirements for a graphic display dispatching system to support Hanford Patrol emergency response. It outlines the negotiated requirements as agreed to by GTE Northwest during technical contract discussions. The system is a commercial off-the-shelf computer dispatching system providing both text and graphic display information while interfacing with the diverse alarm reporting systems within the Hanford Site. It provides expansion capability to integrate Hanford Fire and the Occurrence Notification Center, as well as back-up capability for the Plutonium Finishing Plant (PFP).

  7. Operator support system using computational intelligence techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bueno, Elaine Inacio, E-mail: ebueno@ifsp.edu.br [Instituto Federal de Educacao, Ciencia e Tecnologia de Sao Paulo (IFSP), Sao Paulo, SP (Brazil); Pereira, Iraci Martinez, E-mail: martinez@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    Computational Intelligence Systems have been widely applied in Monitoring and Fault Detection Systems in several processes and in different kinds of applications. These systems use interdependent components ordered in modules, and are designed to ensure early detection and diagnosis of faults. Monitoring and Fault Detection Techniques can be divided into two categories: estimative and pattern recognition methods. The estimative methods use a mathematical model which describes the process behavior. The pattern recognition methods use a database to describe the process. In this work, an operator support system using Computational Intelligence Techniques was developed. This system shows the information obtained by different CI techniques in order to help operators make decisions in real time and to guide them in fault diagnosis before the normal alarm limits are reached. (author)

  8. Operator support system using computational intelligence techniques

    International Nuclear Information System (INIS)

    Bueno, Elaine Inacio; Pereira, Iraci Martinez

    2015-01-01

    Computational Intelligence Systems have been widely applied in Monitoring and Fault Detection Systems in several processes and in different kinds of applications. These systems use interdependent components ordered in modules, and are designed to ensure early detection and diagnosis of faults. Monitoring and Fault Detection Techniques can be divided into two categories: estimative and pattern recognition methods. The estimative methods use a mathematical model which describes the process behavior. The pattern recognition methods use a database to describe the process. In this work, an operator support system using Computational Intelligence Techniques was developed. This system shows the information obtained by different CI techniques in order to help operators make decisions in real time and to guide them in fault diagnosis before the normal alarm limits are reached. (author)
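
    The "estimative" category above can be made concrete with a minimal residual check: compare each measurement against a model prediction and alert when the residual grows, before the normal alarm limit is reached. The process model, alert band, and readings below are invented for illustration.

        # Toy process model: predicted outlet temperature as a function of power.
        def predicted_temperature(power_mw: float) -> float:
            return 20.0 + 3.1 * power_mw

        ALERT_RESIDUAL = 4.0   # early-warning band, tighter than the alarm limit

        def check(measured: float, power_mw: float) -> None:
            """Flag a growing model/measurement mismatch before an alarm trips."""
            residual = measured - predicted_temperature(power_mw)
            if abs(residual) > ALERT_RESIDUAL:
                print(f"early fault alert: residual {residual:+.1f} degC at {power_mw} MW")

        for mw, temp in [(10.0, 51.2), (10.0, 58.9)]:   # second reading is anomalous
            check(temp, mw)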

  9. Infrastructure Support for Collaborative Pervasive Computing Systems

    DEFF Research Database (Denmark)

    Vestergaard Mogensen, Martin

    Collaborative Pervasive Computing Systems (CPCS) are currently being deployed to support areas such as clinical work, emergency situations, education, ad-hoc meetings, and other areas involving information sharing and collaboration. These systems allow the users to work together synchronously, but from different places, by sharing information and coordinating activities. Several researchers have shown the value of such distributed collaborative systems. However, building these systems is by no means a trivial task and introduces a lot of yet unanswered questions. The aforementioned areas are all characterized by unstable, volatile environments, either due to the underlying components changing or the nomadic work habits of users. A major challenge, for the creators of collaborative pervasive computing systems, is the construction of infrastructures supporting the system. The complexity…

  10. Monitoring SLAC High Performance UNIX Computing Systems

    International Nuclear Information System (INIS)

    Lettsome, Annette K.

    2005-01-01

    Knowledge of the effectiveness and efficiency of computers is important when working with high performance systems. The monitoring of such systems is advantageous in order to foresee possible misfortunes or system failures. Ganglia is a software system designed for high performance computing systems to retrieve specific monitoring information. An alternative storage facility for Ganglia's collected data is needed since its default storage system, the round-robin database (RRD), struggles with data integrity. The creation of a script-driven MySQL database solves this dilemma. This paper describes the process followed in the creation and implementation of the MySQL database for use by Ganglia. Comparisons of data storage between the two databases are made using gnuplot and Ganglia's real-time graphical user interface.
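
    As a rough illustration of the idea in this record (not the paper's actual scripts), the sketch below reads the XML metric dump that a Ganglia gmond daemon publishes on its TCP port and writes selected values to a SQL table. sqlite3 stands in for MySQL so the example is self-contained; the host, default port, and table layout are assumptions.

        # Poll a gmond daemon for its XML metric report and store selected
        # metrics in a SQL table (sqlite3 here as a stand-in for MySQL).
        import socket
        import sqlite3
        import xml.etree.ElementTree as ET

        def fetch_gmond_xml(host="localhost", port=8649):
            """Read the full XML report that gmond emits on its TCP port."""
            chunks = []
            with socket.create_connection((host, port), timeout=10) as sock:
                while True:
                    data = sock.recv(4096)
                    if not data:          # gmond closes the socket after the dump
                        break
                    chunks.append(data)
            return b"".join(chunks).decode("utf-8", errors="replace")

        def store_metrics(xml_text, db_path="ganglia.db"):
            conn = sqlite3.connect(db_path)
            conn.execute("""CREATE TABLE IF NOT EXISTS metrics
                            (host TEXT, name TEXT, value TEXT, reported INTEGER)""")
            root = ET.fromstring(xml_text)
            for host in root.iter("HOST"):
                for metric in host.iter("METRIC"):
                    conn.execute("INSERT INTO metrics VALUES (?, ?, ?, ?)",
                                 (host.get("NAME"), metric.get("NAME"),
                                  metric.get("VAL"), int(host.get("REPORTED", 0))))
            conn.commit()
            conn.close()

        store_metrics(fetch_gmond_xml())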

  11. 'Micro-8' micro-computer system

    International Nuclear Information System (INIS)

    Yagi, Hideyuki; Nakahara, Yoshinori; Yamada, Takayuki; Takeuchi, Norio; Koyama, Kinji

    1978-08-01

    The Micro-8 micro-computer system has been developed to organize a data exchange network between various instruments and a computer group including a large computer system. Used for packet exchangers and terminal controllers, the system consists of ten kinds of standard boards, including a CPU board with an INTEL-8080 single-chip processor. The CPU architecture, bus architecture, interrupt control, and the functions of the standard boards are explained with circuit block diagrams. Operations of the basic I/O device, digital I/O board and communication adapter are described with definitions of the interrupt ramp status, I/O command, I/O mask, data register, etc. The appendixes contain circuit drawings, INTEL-8080 micro-processor specifications, bus connections, I/O address mappings, jumper connections for address selection, and interface connections. (author)

  12. Reliable computer systems design and evaluation

    CERN Document Server

    Siewiorek, Daniel

    2014-01-01

    Enhance your hardware/software reliability. Enhancement of system reliability has been a major concern of computer users and designers, and this major revision of the 1982 classic meets users' continuing need for practical information on this pressing topic. Included are case studies of reliable systems from manufacturers such as Tandem, Stratus, IBM, and Digital, as well as coverage of special systems such as the Galileo Orbiter fault protection system and AT&T telephone switching processors.

  13. Metasynthetic computing and engineering of complex systems

    CERN Document Server

    Cao, Longbing

    2015-01-01

    Provides a comprehensive overview and introduction to the concepts, methodologies, analysis, design and applications of metasynthetic computing and engineering. The author: Presents an overview of complex systems, especially open complex giant systems such as the Internet, complex behavioural and social problems, and actionable knowledge discovery and delivery in the big data era. Discusses ubiquitous intelligence in complex systems, including human intelligence, domain intelligence, social intelligence, network intelligence, data intelligence and machine intelligence, and their synergy thro…

  14. Appearance Constrained Semi-Automatic Segmentation from DCE-MRI is Reproducible and Feasible for Breast Cancer Radiomics: A Feasibility Study.

    Science.gov (United States)

    Veeraraghavan, Harini; Dashevsky, Brittany Z; Onishi, Natsuko; Sadinski, Meredith; Morris, Elizabeth; Deasy, Joseph O; Sutton, Elizabeth J

    2018-03-19

    We present a segmentation approach that combines GrowCut (GC) with a cancer-specific multi-parametric Gaussian Mixture Model (GCGMM) to produce accurate and reproducible segmentations. We evaluated GCGMM using a retrospectively collected set of 75 invasive ductal carcinomas comprising ERPR+ HER2- (n = 15), triple negative (TN) (n = 9), and ER-HER2+ (n = 57) cancers with variable presentation (mass and non-mass enhancement) and background parenchymal enhancement (mild and marked). Expert-delineated manual contours were used to assess segmentation performance using the Dice coefficient (DSC), mean surface distance (mSD), Hausdorff distance, and volume ratio (VR). GCGMM segmentations were significantly more accurate than GrowCut (GC) and fuzzy c-means clustering (FCM). GCGMM's segmentations and the texture features computed from them were the most reproducible compared with manual delineations and the other analyzed segmentation methods. Finally, a random forest (RF) classifier trained with leave-one-out cross-validation using features extracted from the GCGMM segmentation resulted in the best accuracy for ER-HER2+ vs. ERPR+/TN (GCGMM 0.95, expert 0.95, GC 0.90, FCM 0.92) and for ERPR+ HER2- vs. TN (GCGMM 0.92, expert 0.91, GC 0.77, FCM 0.83).
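
    For readers unfamiliar with the agreement metrics quoted above, the toy sketch below computes two of them, the Dice coefficient (DSC) and the volume ratio (VR), for a pair of binary masks; the random arrays merely stand in for real DCE-MRI segmentations.

        import numpy as np

        def dice_coefficient(seg, ref):
            """DSC = 2|A n B| / (|A| + |B|) for binary masks."""
            seg, ref = seg.astype(bool), ref.astype(bool)
            overlap = np.logical_and(seg, ref).sum()
            return 2.0 * overlap / (seg.sum() + ref.sum())

        def volume_ratio(seg, ref):
            """Ratio of segmented volume to reference volume."""
            return seg.astype(bool).sum() / ref.astype(bool).sum()

        # Toy 3-D masks standing in for algorithm and expert segmentations.
        rng = np.random.default_rng(0)
        expert = rng.random((32, 32, 32)) > 0.7
        algo = expert.copy()
        algo[0:4] = False   # simulate a small under-segmentation
        print(f"DSC = {dice_coefficient(algo, expert):.3f}, "
              f"VR = {volume_ratio(algo, expert):.3f}")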

  15. Plant computer system in nuclear power station

    International Nuclear Information System (INIS)

    Kato, Shinji; Fukuchi, Hiroshi

    1991-01-01

    In nuclear power stations, a centrally concentrated monitoring system has been adopted; large quantities of information and operational equipment are concentrated in central control rooms, which therefore become the important place of communication between plant and operators. Recently, due to the increase in unit capacity, the strengthening of safety, the problems of man-machine interfaces and so on, it has become important to concentrate information, to automate machinery and equipment, and to simplify them for improving the operational environment, reliability and so on. As an example of the relation between nuclear power stations and computer systems, to which attention has been paid recently as a man-machine interface, the Tsuruga Power Station of the Japan Atomic Power Company is described. The No. 2 plant in the Tsuruga Power Station is a PWR plant with 1160 MWe output, a home-built standardized plant, and the computer system adopted there is explained. The fundamental concept of the central control board, the process computer system, the design policy, basic system configuration, reliability and maintenance, CRT displays, and the computer system for the No. 1 BWR 357 MW plant are reported. (K.I.)

  16. Focus stacking: Comparing commercial top-end set-ups with a semi-automatic low budget approach. A possible solution for mass digitization of type specimens.

    Science.gov (United States)

    Brecko, Jonathan; Mathys, Aurore; Dekoninck, Wouter; Leponce, Maurice; VandenSpiegel, Didier; Semal, Patrick

    2014-01-01

    In this manuscript we present a focus stacking system composed of commercial photographic equipment. The system is inexpensive compared to high-end commercial focus stacking solutions. We tested this system and compared the results obtained with several different software packages (CombineZP, Auto-Montage, Helicon Focus and Zerene Stacker). We also compared our final stacked picture with pictures obtained from two high-end focus stacking solutions: a Leica MZ16A with DFC500 and a Leica Z6APO with DFC290. Zerene Stacker and Helicon Focus both provided satisfactory results. However, Zerene Stacker gives the user more possibilities in terms of control of the software, batch processing and retouching. The outcome of the test on high-end solutions demonstrates that our approach performs better in several ways. The resolution of the tested extended-focus pictures is much higher than that of the Leica systems. The flash lighting inside the Ikea closet creates an evenly illuminated picture, without struggling with filters, diffusers, etc. The largest benefit is the price of the set-up, which is approximately €3,000, 8 and 10 times less than the Leica Z6APO and Leica MZ16A set-ups respectively. Overall, this enables institutions to purchase multiple solutions or to start digitising the type collection on a large scale even with a small budget.

  17. Focus stacking: Comparing commercial top-end set-ups with a semi-automatic low budget approach. A possible solution for mass digitization of type specimens

    Directory of Open Access Journals (Sweden)

    Jonathan Brecko

    2014-12-01

    In this manuscript we present a focus stacking system composed of commercial photographic equipment. The system is inexpensive compared to high-end commercial focus stacking solutions. We tested this system and compared the results obtained with several different software packages (CombineZP, Auto-Montage, Helicon Focus and Zerene Stacker). We also compared our final stacked picture with pictures obtained from two high-end focus stacking solutions: a Leica MZ16A with DFC500 and a Leica Z6APO with DFC290. Zerene Stacker and Helicon Focus both provided satisfactory results. However, Zerene Stacker gives the user more possibilities in terms of control of the software, batch processing and retouching. The outcome of the test on high-end solutions demonstrates that our approach performs better in several ways. The resolution of the tested extended-focus pictures is much higher than that of the Leica systems. The flash lighting inside the Ikea closet creates an evenly illuminated picture, without struggling with filters, diffusers, etc. The largest benefit is the price of the set-up, which is approximately €3,000, 8 and 10 times less than the Leica Z6APO and Leica MZ16A set-ups respectively. Overall, this enables institutions to purchase multiple solutions or to start digitising the type collection on a large scale even with a small budget.
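
    Neither record spells out how focus stacking itself works, so the following is a deliberately naive sketch of the core idea: per-pixel selection of the sharpest frame by Laplacian magnitude, using OpenCV. The packages compared above add alignment, scale correction and far more sophisticated blending; the input path is hypothetical and the frames are assumed pre-aligned and equally sized.

        import glob
        import cv2
        import numpy as np

        # Load the (pre-aligned, equally sized) frames of the stack.
        frames = [cv2.imread(p) for p in sorted(glob.glob("stack/*.jpg"))]
        gray = [cv2.cvtColor(f, cv2.COLOR_BGR2GRAY) for f in frames]
        # Local sharpness: Laplacian magnitude, lightly smoothed to reduce noise.
        sharpness = [cv2.GaussianBlur(np.abs(cv2.Laplacian(g, cv2.CV_64F)), (9, 9), 0)
                     for g in gray]

        best = np.argmax(np.stack(sharpness), axis=0)   # sharpest frame per pixel
        stack = np.stack(frames)                        # shape (n, h, w, 3)
        h, w = best.shape
        composite = stack[best, np.arange(h)[:, None], np.arange(w)[None, :]]
        cv2.imwrite("stacked.png", composite)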

  18. Computer systems and networks: Status and perspectives

    International Nuclear Information System (INIS)

    Zacharov, Z.

    1981-01-01

    The properties of computers are discussed, both as separate units and in inter-coupled systems. The main elements of modern processor technology are reviewed and the associated peripheral components are discussed in the light of the prevailing rapid pace of developments. Particular emphasis is given to the impact of very large scale integrated circuitry in these developments. Computer networks are considered in some detail, including common-carrier and local-area networks, and the problem of inter-working is included in the discussion. Components of network systems and the associated technology are also among the topics treated. (orig.)

  19. Computer systems and networks status and perspectives

    CERN Document Server

    Zacharov, V

    1981-01-01

    The properties of computers are discussed, both as separate units and in inter-coupled systems. The main elements of modern processor technology are reviewed and the associated peripheral components are discussed in the light of the prevailing rapid pace of developments. Particular emphasis is given to the impact of very large scale integrated circuitry in these developments. Computer networks are considered in some detail, including common-carrier and local-area networks, and the problem of inter-working is included in the discussion. Components of network systems and the associated technology are also among the topics treated.

  20. Computation system for nuclear reactor core analysis

    International Nuclear Information System (INIS)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.; Petrie, L.M.

    1977-04-01

    This report documents a system which contains computer codes as modules developed to evaluate nuclear reactor core performance. The diffusion theory approximation to neutron transport may be applied with the VENTURE code treating up to three dimensions. The effect of exposure may be determined with the BURNER code, allowing depletion calculations to be made. The features and requirements of the system are discussed, along with aspects common to the computational modules; the modules themselves are documented elsewhere. User input data requirements, data file management, control, and the modules which perform general functions are described. Continuing development and implementation effort is enhancing the analysis capability available locally and to other installations from remote terminals.
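
    To make the "diffusion theory approximation" concrete, here is a toy one-group, one-dimensional eigenvalue solve by power iteration; VENTURE treats the same kind of problem in up to three dimensions with real cross sections, and BURNER adds depletion. All numbers below are invented.

        # Toy k-effective calculation: one group, 1-D slab, finite differences,
        # zero-flux boundaries, power iteration. Purely illustrative.
        import numpy as np

        n, h = 50, 2.0                            # mesh points, cm per mesh
        D, sig_a, nu_sig_f = 1.2, 0.012, 0.014    # toy cross-section data

        # Loss operator: -D d2/dx2 + sig_a discretized on the mesh.
        A = np.zeros((n, n))
        for i in range(n):
            A[i, i] = 2 * D / h**2 + sig_a
            if i > 0:
                A[i, i - 1] = -D / h**2
            if i < n - 1:
                A[i, i + 1] = -D / h**2

        phi, k = np.ones(n), 1.0
        for _ in range(200):                      # power iteration: A phi = (1/k) F phi
            phi_new = np.linalg.solve(A, nu_sig_f * phi / k)
            k *= phi_new.sum() / phi.sum()        # eigenvalue update (F is diagonal)
            phi = phi_new / np.linalg.norm(phi_new)
        print(f"k-effective ~ {k:.4f}")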

  1. Large computer systems and new architectures

    International Nuclear Information System (INIS)

    Bloch, T.

    1978-01-01

    The super-computers of today are becoming quite specialized and one can no longer expect to get all the state-of-the-art software and hardware facilities in one package. In order to achieve faster and faster computing it is necessary to experiment with new architectures, and the cost of developing each experimental architecture into a general-purpose computer system is too high when one considers the relatively small market for these computers. The result is that such computers are becoming 'back-ends' either to special systems (BSP, DAP) or to anything (CRAY-1). Architecturally the CRAY-1 is the most attractive today since it guarantees a speed gain of a factor of two over a CDC 7600 thus allowing us to regard any speed up resulting from vectorization as a bonus. It looks, however, as if it will be very difficult to make substantially faster computers using only pipe-lining techniques and that it will be necessary to explore multiple processors working on the same problem. The experience which will be gained with the BSP and the DAP over the next few years will certainly be most valuable in this respect. (Auth.)

  2. Framework for computer-aided systems design

    International Nuclear Information System (INIS)

    Esselman, W.H.

    1992-01-01

    Advanced computer technology, analytical methods, graphics capabilities, and expert systems contribute to significant changes in the design process. Continued progress is expected. Achieving the ultimate benefits of these computer-based design tools depends on successful research and development on a number of key issues. A fundamental understanding of the design process is a prerequisite to developing these computer-based tools. In this paper a hierarchical systems design approach is described, and methods by which computers can assist the designer are examined. A framework is presented for developing computer-based design tools for power plant design. These tools include expert experience bases, tutorials, aids in decision making, and tools to develop the requirements, constraints, and interactions among subsystems and components. Early consideration of the functional tasks is encouraged. Acquiring an expert's experience base is a fundamental research problem. Computer-based guidance should be provided in a manner that supports the creativity, heuristic approaches, decision making, and meticulousness of a good designer.

  3. Architecture, systems research and computational sciences

    CERN Document Server

    2012-01-01

    The Winter 2012 (vol. 14 no. 1) issue of the Nexus Network Journal is dedicated to the theme “Architecture, Systems Research and Computational Sciences”. This is an outgrowth of the session by the same name which took place during the eighth international, interdisciplinary conference “Nexus 2010: Relationships between Architecture and Mathematics”, held in Porto, Portugal, in June 2010. Today computer science is an integral part of even strictly historical investigations, such as those concerning the construction of vaults, where the computer is used to survey the existing building, analyse the data and draw the ideal solution. What the papers in this issue make especially evident is that information technology has had an impact at a much deeper level as well: architecture itself can now be considered as a manifestation of information and as a complex system. The issue is completed with other research papers, conference reports and book reviews.

  4. Expert-systems and computer-based industrial systems

    International Nuclear Information System (INIS)

    Terrien, J.F.

    1987-01-01

    Framatome makes wide use of expert systems, computer-assisted engineering, production management and personnel training. It has set up separate business units and subsidiaries and also participates in other companies which have the relevant expertise. Five examples of the products and services available in these areas are discussed: applied artificial intelligence and expert systems, integrated computer-aided design and engineering, structural analysis, computer-related products and services, and document management systems. The structure of the companies involved and the work they are doing are discussed. (UK)

  5. Towards a new PDG computing system

    International Nuclear Information System (INIS)

    Beringer, J; Dahl, O; Zyla, P; Jackson, K; McParland, C; Poon, S; Robertson, D

    2011-01-01

    The computing system that supports the worldwide Particle Data Group (PDG) of over 170 authors in the production of the Review of Particle Physics was designed more than 20 years ago. It has reached its scalability and usability limits and can no longer satisfy the requirements and wishes of PDG collaborators and users alike. We discuss the ongoing effort to modernize the PDG computing system, including requirements, architecture and status of implementation. The new system will provide improved user features and will fully support the PDG collaboration from distributed web-based data entry, work flow management, authoring and refereeing to data verification and production of the web edition and manuscript for the publisher. Cross-linking with other HEP information systems will be greatly improved.

  6. High performance computing on vector systems

    CERN Document Server

    Roller, Sabine

    2008-01-01

    Presents the developments in high-performance computing and simulation on modern supercomputer architectures. This book covers trends in hardware and software development in general and specifically the vector-based systems and heterogeneous architectures. It presents innovative fields like coupled multi-physics or multi-scale simulations.

  7. Model Checking - Automated Verification of Computational Systems

    Indian Academy of Sciences (India)

    Model Checking – Automated Verification of Computational Systems. Madhavan Mukund. General Article, Resonance – Journal of Science Education, Volume 14, Issue 7, July 2009, pp. 667-681.

  8. Computer programs simplify optical system analysis

    Science.gov (United States)

    1965-01-01

    The optical ray-trace computer program performs geometrical ray tracing. The energy-trace program calculates the relative monochromatic flux density on a specific target area. This program uses the ray-trace program as a subroutine to generate a representation of the optical system.

  9. Computing in Large-Scale Dynamic Systems

    NARCIS (Netherlands)

    Pruteanu, A.S.

    2013-01-01

    Software applications developed for large-scale systems have always been difficult to develop due to problems caused by the large number of computing devices involved. Above a certain network size (roughly one hundred), necessary services such as code updating, topology discovery and data…

  10. Personal computer based home automation system

    OpenAIRE

    Hellmuth, George F.

    1993-01-01

    The systems engineering process is applied in the development of the preliminary design of a home automation communication protocol. The objective of the communication protocol is to provide a means for a personal computer to communicate with adapted appliances in the home. A needs analysis is used to ascertain that a need exists for a home automation system. Numerous design alternatives are suggested and evaluated to determine the best possible protocol design. Coaxial cable...

  11. Analytical performance modeling for computer systems

    CERN Document Server

    Tay, Y C

    2013-01-01

    This book is an introduction to analytical performance modeling for computer systems, i.e., writing equations to describe their performance behavior. It is accessible to readers who have taken college-level courses in calculus and probability, networking and operating systems. This is not a training manual for becoming an expert performance analyst. Rather, the objective is to help the reader construct simple models for analyzing and understanding the systems that they are interested in. Describing a complicated system abstractly with mathematical equations requires a careful choice of assumpti…
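
    As a flavour of what "writing equations to describe performance behavior" means, below is the textbook M/M/1 single-server queue, which yields closed-form utilization, mean population and response time; this is standard queueing theory rather than material taken from the book.

        def mm1(lam: float, mu: float):
            """Closed-form M/M/1 results for Poisson arrivals (lam) and
            exponential service (mu)."""
            assert lam < mu, "queue is unstable unless arrival rate < service rate"
            rho = lam / mu               # server utilization
            n_mean = rho / (1 - rho)     # mean number of jobs in the system
            r_mean = 1 / (mu - lam)      # mean response time (Little's law: N = lam * R)
            return rho, n_mean, r_mean

        rho, n, r = mm1(lam=8.0, mu=10.0)   # 8 jobs/s offered to a 10 jobs/s server
        print(f"utilization={rho:.0%}, mean jobs in system={n:.1f}, "
              f"mean response={r:.2f}s")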

  12. Computer-aided control system design

    International Nuclear Information System (INIS)

    Lebenhaft, J.R.

    1986-01-01

    Control systems are typically implemented using conventional PID controllers, which are then tuned manually during plant commissioning to compensate for interactions between feedback loops. As plants increase in size and complexity, such controllers can fail to provide adequate process regulation. Multivariable methods can be utilized to overcome these limitations. At the Chalk River Nuclear Laboratories, modern control systems are designed and analyzed with the aid of MVPACK, a system of computer programs that appears to the user like a high-level calculator. The software package solves complicated control problems and provides useful insight into the dynamic response and stability of multivariable systems.

  13. Computer-aided protective system (CAPS)

    International Nuclear Information System (INIS)

    Squire, R.K.

    1988-01-01

    A method of improving the security of materials in transit is described. The system provides a continuously monitored position location system for the transport vehicle, an internal computer-based geographic delimiter that makes continuous comparisons of actual positions with the preplanned routing and schedule, and a tamper detection/reaction system. The position comparison is utilized to institute preprogrammed reactive measures if the carrier is taken off course or schedule, penetrated, or otherwise interfered with. The geographic locator could be an independent internal platform or an external signal-dependent system utilizing GPS, Loran or a similar source of geographic information; a small (micro) computer could provide adequate memory and computational capacity; ensuring the integrity of the system indicates the need for a tamper-proof container and built-in intrusion sensors. A variant of the system could provide real-time transmission of the vehicle position and condition to a central control point; such transmission could be encrypted to preclude spoofing.
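
    The position-comparison idea reduces to a check like the sketch below: raise an alarm when the reported position strays beyond a limit from the preplanned route. Measuring distance to the nearest waypoint is a simplification (a real system would also check route segments and schedule), and the waypoints and threshold are invented.

        import math

        def haversine_km(lat1, lon1, lat2, lon2):
            """Great-circle distance between two lat/lon points in kilometres."""
            p1, p2 = math.radians(lat1), math.radians(lat2)
            dphi = math.radians(lat2 - lat1)
            dlmb = math.radians(lon2 - lon1)
            a = (math.sin(dphi / 2) ** 2
                 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
            return 2 * 6371.0 * math.asin(math.sqrt(a))

        ROUTE = [(46.55, -119.49), (46.60, -119.30), (46.72, -119.10)]  # planned waypoints
        MAX_DEVIATION_KM = 2.0

        def off_course(lat, lon):
            """True if the vehicle is farther than the limit from every route point."""
            return min(haversine_km(lat, lon, wl, wo) for wl, wo in ROUTE) > MAX_DEVIATION_KM

        if off_course(46.40, -119.49):
            print("ALARM: carrier off planned route; initiate preprogrammed response")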

  14. Cluster Computing for Embedded/Real-Time Systems

    Science.gov (United States)

    Katz, D.; Kepner, J.

    1999-01-01

    Embedded and real-time systems, like other computing systems, seek to maximize computing power for a given price, and thus can significantly benefit from the advancing capabilities of cluster computing.

  15. Distributed computer control system for reactor optimization

    International Nuclear Information System (INIS)

    Williams, A.H.

    1983-01-01

    At the Oldbury power station a prototype distributed computer control system has been installed. This system is designed to support research and development into improved reactor temperature control methods. This work will lead to the development and demonstration of new optimal control systems for improvement of plant efficiency and increase of generated output. The system can collect plant data from special test instrumentation connected to dedicated scanners and from the station's existing data processing system. The system can also, via distributed microprocessor-based interface units, make adjustments to the desired reactor channel gas exit temperatures. The existing control equipment will then adjust the height of control rods to maintain operation at these temperatures. The design of the distributed system is based on extensive experience with distributed systems for direct digital control, operator display and plant monitoring. The paper describes various aspects of this system, with particular emphasis on: (1) the hierarchical system structure; (2) the modular construction of the system to facilitate installation, commissioning and testing, and to reduce maintenance to module replacement; (3) the integration of the system into the station's existing data processing system; (4) distributed microprocessor-based interfaces to the reactor controls, with extensive security facilities implemented by hardware and software; (5) data transfer using point-to-point and bussed data links; (6) man-machine communication based on VDUs with computer input push-buttons and touch-sensitive screens; and (7) the use of a software system supporting a high-level engineer-orientated programming language, at all levels in the system, together with comprehensive data link management.

  16. International Conference on Soft Computing Systems

    CERN Document Server

    Panigrahi, Bijaya

    2016-01-01

    The book is a collection of high-quality peer-reviewed research papers presented in International Conference on Soft Computing Systems (ICSCS 2015) held at Noorul Islam Centre for Higher Education, Chennai, India. These research papers provide the latest developments in the emerging areas of Soft Computing in Engineering and Technology. The book is organized in two volumes and discusses a wide variety of industrial, engineering and scientific applications of the emerging techniques. It presents invited papers from the inventors/originators of new applications and advanced technologies.

  17. Application engineering for process computer systems

    International Nuclear Information System (INIS)

    Mueller, K.

    1975-01-01

    The variety of tasks for process computers in nuclear power stations necessitates the centralization of all production stages from the planning stage to the delivery of the finished process computer system (PRA) to the user. This so-called 'application engineering' comprises all of the activities connected with the application of the PRA: a) establishment of the PRA concept, b) project counselling, c) handling of offers, d) handling of orders, e) internal handling of orders, f) technical counselling, g) establishing of parameters, h) monitoring deadlines, i) training of customers, j) compiling an operation manual. (orig./AK)

  18. Attacker Modelling in Ubiquitous Computing Systems

    DEFF Research Database (Denmark)

    Papini, Davide

    …in with our everyday life. This future is visible to everyone nowadays: terms like smartphone, cloud, sensor, network etc. are widely known and used in our everyday life. But what about the security of such systems? Ubiquitous computing devices can be limited in terms of energy, computing power and memory… The attacker remains somehow undefined and still under extensive investigation. This Thesis explores the nature of the ubiquitous attacker with a focus on how she interacts with the physical world, and it defines a model that captures the abilities of the attacker. Furthermore a quantitative implementation…

  19. Embedded systems for supporting computer accessibility.

    Science.gov (United States)

    Mulfari, Davide; Celesti, Antonio; Fazio, Maria; Villari, Massimo; Puliafito, Antonio

    2015-01-01

    Nowadays, customized AT software solutions allow their users to interact with various kinds of computer systems. Such tools are generally available on personal devices (e.g., smartphones, laptops and so on) commonly used by a person with a disability. In this paper, we investigate a way of using the aforementioned AT equipment in order to access many different devices without assistive preferences. The solution takes advantage of open source hardware and its core component consists of an affordable Linux embedded system: it grabs data coming from the assistive software, which runs on the user's personal device; then, after processing, it generates native keyboard and mouse HID commands for the target computing device controlled by the end user. This process supports any operating system available on the target machine and requires no specialized software installation; therefore the user with a disability can rely on a single assistive tool to control a wide range of computing platforms, including conventional computers and many kinds of mobile devices, which receive input commands through the USB HID protocol.

  20. Music Genre Classification Systems - A Computational Approach

    DEFF Research Database (Denmark)

    Ahrendt, Peter

    2006-01-01

    Automatic music genre classification is the classification of a piece of music into its corresponding genre (such as jazz or rock) by a computer. It is considered to be a cornerstone of the research area Music Information Retrieval (MIR) and closely linked to the other areas in MIR. It is thought that MIR will be a key element in the processing, searching and retrieval of digital music in the near future. This dissertation is concerned with music genre classification systems and in particular systems which use the raw audio signal as input to estimate the corresponding genre. This is in contrast to systems which use e.g. a symbolic representation or textual information about the music. The approach to music genre classification systems has here been system-oriented. In other words, all the different aspects of the systems have been considered and it is emphasized that the systems should…
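
    As a concrete, if simplistic, example of a raw-audio genre classifier of the kind the dissertation studies (not its actual models), the sketch below summarizes each clip by its mean MFCC vector and trains an off-the-shelf classifier; the file names are placeholders for a labelled training set.

        import numpy as np
        import librosa
        from sklearn.ensemble import RandomForestClassifier

        def clip_features(path):
            """Summarize a clip by the mean of its MFCC frames (crude but common)."""
            y, sr = librosa.load(path, sr=22050, mono=True)
            mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)   # shape (13, frames)
            return mfcc.mean(axis=1)

        # Placeholders for a labelled training set of audio files.
        files = ["jazz1.wav", "jazz2.wav", "rock1.wav", "rock2.wav"]
        genres = ["jazz", "jazz", "rock", "rock"]
        X = np.array([clip_features(f) for f in files])
        clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, genres)
        print(clf.predict([clip_features("unknown.wav")]))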

  1. Overview of the ATLAS distributed computing system

    CERN Document Server

    Elmsheuser, Johannes; The ATLAS collaboration

    2018-01-01

    The CERN ATLAS experiment successfully uses a worldwide computing infrastructure to support the physics program during LHC Run 2. The grid workflow system PanDA routinely manages 250 to 500 thousand concurrently running production and analysis jobs to process simulation and detector data. In total more than 300 PB of data is distributed over more than 150 sites in the WLCG and handled by the ATLAS data management system Rucio. To prepare for the ever-growing LHC luminosity in future runs, new developments are underway to use opportunistic resources such as HPCs more efficiently and to utilize new technologies. This presentation reviews and explains the outline and performance of the ATLAS distributed computing system and gives an outlook on new workflow and data management ideas for the beginning of LHC Run 3.

  2. Reactor safety: the Nova computer system

    International Nuclear Information System (INIS)

    Eisgruber, H.; Stadelmann, W.

    1991-01-01

    After instances of maloperation, the causes of defects, the effectiveness of the measures taken to control the situation, and possibilities to avoid future recurrences need to be investigated before the plant is restarted. The most important aspect in all these efforts is to check the sequence in time, and the completeness, of the control measures initiated automatically. For this verification, a computer system is used instead of time-consuming manual analytical techniques; it produces the necessary information almost in real time. The results are available within minutes after completion of the automatically initiated measures. As all short-term safety functions are initiated by automatic systems, their consistent and comprehensive verification results in a clearly higher level of safety. The report covers the development of the computer system and its implementation in the Gundremmingen nuclear power station. Similar plans are being pursued in Biblis and Muelheim-Kaerlich. (orig.)

  3. National Ignition Facility integrated computer control system

    International Nuclear Information System (INIS)

    Van Arsdall, P.J. LLNL

    1998-01-01

    The NIF design team is developing the Integrated Computer Control System (ICCS), which is based on an object-oriented software framework applicable to event-driven control systems. The framework provides an open, extensible architecture that is sufficiently abstract to construct future mission-critical control systems. The ICCS will become operational when the first 8 out of 192 beams are activated in mid 2000. The ICCS consists of 300 front-end processors attached to 60,000 control points coordinated by a supervisory system. Computers running either Solaris or VxWorks are networked over a hybrid configuration of switched fast Ethernet and asynchronous transfer mode (ATM). ATM carries digital motion video from sensors to operator consoles. Supervisory software is constructed by extending the reusable framework components for each specific application. The framework incorporates services for database persistence, system configuration, graphical user interface, status monitoring, event logging, scripting language, alert management, and access control. More than twenty collaborating software applications are derived from the common framework. The framework is interoperable among different kinds of computers and functions as a plug-in software bus by leveraging a common object request brokering architecture (CORBA). CORBA transparently distributes the software objects across the network. Because of the pivotal role played, CORBA was tested to ensure adequate performance

  4. Computer control system of TARN-2

    International Nuclear Information System (INIS)

    Watanabe, S.

    1989-01-01

    The CAMAC interface system is employed to regulate the power supplies, beam diagnostics and so on. Five CAMAC stations are located in the TARN-2 area and are linked with a serial highway system. The CAMAC serial highway is driven by a serial highway driver, Kinetic 3992, which is housed in the CAMAC powered crate and regulated by two methods: one by the mini computer through the standard branch-highway crate controller, named Type-A2, and the other by the microcomputer through the auxiliary crate controller. The CAMAC serial highway comprises two-way optical cables with a total length of 300 m. Each CAMAC station has serial and auxiliary crate controllers so as to realize alternative control with the local computer system. The interpreter INSBASIC is used in the main control computer. There are many kinds of 'device control functions' in INSBASIC. Because a 'device control function' embodies the physical operating procedure of a device, only knowledge of the logical operating procedure is required. A touch panel system is employed to regulate the complicated control flow without any knowledge of the usage of the device. A rotary encoder system, which is analogous to potentiometer operation, is also available for smooth adjustment of setting parameters. (author)

  5. Models of parallel computation :a survey and classification

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yunquan; CHEN Guoliang; SUN Guangzhong; MIAO Qiankun

    2007-01-01

    In this paper, the state-of-the-art parallel computational model research is reviewed. We introduce various models that were developed during the past decades. According to their target architecture features, especially memory organization, we classify these parallel computational models into three generations. These models and their characteristics are discussed based on this three-generation classification. We believe that with the ever increasing speed gap between the CPU and memory systems, incorporating non-uniform memory hierarchy into computational models will become unavoidable. With the emergence of multi-core CPUs, the parallelism hierarchy of current computing platforms becomes more and more complicated. Describing this complicated parallelism hierarchy in future computational models becomes more and more important. A semi-automatic toolkit that can extract model parameters and their values on real computers can reduce the model analysis complexity, thus allowing more complicated models with more parameters to be adopted. Hierarchical memory and hierarchical parallelism will be two very important features that should be considered in future model design and research.

  6. FFTF fission gas monitor computer system

    International Nuclear Information System (INIS)

    Hubbard, J.A.

    1987-01-01

    The Fast Flux Test Facility (FFTF) is a liquid-metal-cooled test reactor located on the Hanford site. A dual computer system has been developed to monitor the reactor cover gas to detect and characterize any fuel or test pin fission gas releases. The system acquires gamma spectra data, identifies isotopes, calculates specific isotope and overall cover gas activity, presents control room alarms and displays, and records and prints data and analysis reports. The fission gas monitor system makes extensive use of commercially available hardware and software, providing a reliable and easily maintained system. The design provides extensive automation of previous manual operations, reducing the need for operator training and minimizing the potential for operator error. The dual nature of the system allows one monitor to be taken out of service for periodic tests or maintenance without interrupting the overall system functions. A built-in calibrated gamma source can be controlled by the computer, allowing the system to provide rapid system self tests and operational performance reports

  7. Computer-controlled radiation monitoring system

    International Nuclear Information System (INIS)

    Homann, S.G.

    1994-01-01

    A computer-controlled radiation monitoring system was designed and installed at the Lawrence Livermore National Laboratory's Multiuser Tandem Laboratory (10 MV tandem accelerator from High Voltage Engineering Corporation). The system continuously monitors the photon and neutron radiation environment associated with the facility and automatically suspends accelerator operation if preset radiation levels are exceeded. The system has provided reliable real-time radiation monitoring over the past five years, and has been a valuable tool for maintaining personnel exposure as low as reasonably achievable.
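
    The interlock behaviour described, suspending operation when preset levels are exceeded, reduces to a polling loop like the sketch below; the detector channels, limits and hardware hooks are hypothetical stand-ins for the facility's instrumentation.

        import time

        LIMITS = {"photon_mrem_h": 2.0, "neutron_mrem_h": 1.0}   # illustrative presets

        def read_detector(channel: str) -> float:
            # Hypothetical hook: replace with the facility's detector readout.
            raise NotImplementedError

        def suspend_accelerator(reason: str) -> None:
            # Hypothetical hook: replace with the facility's interlock action.
            raise NotImplementedError

        def monitor(poll_s: float = 1.0) -> None:
            """Poll every channel; trip the interlock on the first exceeded limit."""
            while True:
                for channel, limit in LIMITS.items():
                    level = read_detector(channel)
                    if level > limit:
                        suspend_accelerator(f"{channel} at {level:.2f} exceeds {limit}")
                        return
                time.sleep(poll_s)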

  8. Analog system for computing sparse codes

    Science.gov (United States)

    Rozell, Christopher John; Johnson, Don Herrick; Baraniuk, Richard Gordon; Olshausen, Bruno A.; Ortman, Robert Lowell

    2010-08-24

    A parallel dynamical system for computing sparse representations of data, i.e., where the data can be fully represented in terms of a small number of non-zero code elements, and for reconstructing compressively sensed images. The system is based on the principles of thresholding and local competition, and solves a family of sparse approximation problems corresponding to various sparsity metrics. The system utilizes Locally Competitive Algorithms (LCAs), in which nodes in a population continually compete with neighboring units using (usually one-way) lateral inhibition to calculate coefficients representing an input in an overcomplete dictionary.
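
    A minimal discrete-time rendering of the LCA dynamics summarized above: each node integrates its match to the input, is inhibited by active neighbours through dictionary correlations, and is soft-thresholded to give a sparse code. The dictionary, step size and threshold are toy choices, not values from the patent.

        import numpy as np

        def lca(x, Phi, lam=0.1, tau=10.0, steps=200):
            """Sparse code of x in a (possibly overcomplete) dictionary Phi."""
            b = Phi.T @ x                              # drive: match to each atom
            G = Phi.T @ Phi - np.eye(Phi.shape[1])     # lateral inhibition weights
            u = np.zeros(Phi.shape[1])                 # internal node states
            for _ in range(steps):
                a = np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)   # soft threshold
                u += (b - u - G @ a) / tau             # leaky integration + competition
            return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

        rng = np.random.default_rng(1)
        Phi = rng.standard_normal((64, 128))
        Phi /= np.linalg.norm(Phi, axis=0)             # unit-norm atoms
        x = Phi[:, 5] * 1.5 + Phi[:, 40] * -0.8        # input built from two atoms
        a = lca(x, Phi)
        print("non-zero coefficients:", np.flatnonzero(np.abs(a) > 1e-6))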

  9. Decomposability queueing and computer system applications

    CERN Document Server

    Courtois, P J

    1977-01-01

    Decomposability: Queueing and Computer System Applications presents a set of powerful methods for systems analysis. This 10-chapter text covers the theory of nearly completely decomposable systems upon which specific analytic methods are based. The first chapters deal with some of the basic elements of a theory of nearly completely decomposable stochastic matrices, including the Simon-Ando theorems and perturbation theory. The succeeding chapters are devoted to the analysis of stochastic queueing networks, which appear as a key type of model. These chapters also discuss congestion problems in…

  10. Distributed computer controls for accelerator systems

    International Nuclear Information System (INIS)

    Moore, T.L.

    1988-09-01

    A distributed control system has been designed and installed at the Lawrence Livermore National Laboratory Multi-user Tandem Facility using an extremely modular approach in hardware and software. The two tiered, geographically organized design allowed total system implementation within four months with a computer and instrumentation cost of approximately $100K. Since the system structure is modular, application to a variety of facilities is possible. Such a system allows rethinking of the operational style of the facilities, making possible highly reproducible and unattended operation. The impact of industry standards, i.e., UNIX, CAMAC, and IEEE-802.3, and the use of a graphics-oriented controls software suite allowed the efficient implementation of the system. The definition, design, implementation, operation and total system performance will be discussed. 3 refs

  11. Distributed computer controls for accelerator systems

    Science.gov (United States)

    Moore, T. L.

    1989-04-01

    A distributed control system has been designed and installed at the Lawrence Livermore National Laboratory Multiuser Tandem Facility using an extremely modular approach in hardware and software. The two tiered, geographically organized design allowed total system implementation within four months with a computer and instrumentation cost of approximately $100k. Since the system structure is modular, application to a variety of facilities is possible. Such a system allows rethinking of operational style of the facilities, making possible highly reproducible and unattended operation. The impact of industry standards, i.e., UNIX, CAMAC, and IEEE-802.3, and the use of a graphics-oriented controls software suite allowed the effective implementation of the system. The definition, design, implementation, operation and total system performance will be discussed.

  12. Distributed computer controls for accelerator systems

    International Nuclear Information System (INIS)

    Moore, T.L.

    1989-01-01

    A distributed control system has been designed and installed at the Lawrence Livermore National Laboratory Multiuser Tandem Facility using an extremely modular approach in hardware and software. The two tiered, geographically organized design allowed total system implementation within four months with a computer and instrumentation cost of approximately $100k. Since the system structure is modular, application to a variety of facilities is possible. Such a system allows rethinking of operational style of the facilities, making possible highly reproducible and unattended operation. The impact of industry standards, i.e., UNIX, CAMAC, and IEEE-802.3, and the use of a graphics-oriented controls software suite allowed the effective implementation of the system. The definition, design, implementation, operation and total system performance will be discussed. (orig.)

  13. Applicability of Computational Systems Biology in Toxicology

    DEFF Research Database (Denmark)

    Kongsbak, Kristine Grønning; Hadrup, Niels; Audouze, Karine Marie Laure

    2014-01-01

    Systems biology as a research field has emerged within the last few decades. Systems biology, often defined as the antithesis of the reductionist approach, integrates information about individual components of a biological system. In integrative systems biology, large data sets from various sources and databases are used to model and predict effects of chemicals on, for instance, human health. In toxicology, computational systems biology enables identification of important pathways and molecules from large data sets; tasks that can be extremely laborious when performed by a classical literature search. Such information can be used to establish hypotheses on links between the chemical and human diseases, and can also be applied for designing more intelligent animal/cell experiments that can test the established hypotheses. Here, we describe how and why to apply an integrative systems biology method…

  14. Computer aided system engineering for space construction

    Science.gov (United States)

    Racheli, Ugo

    1989-01-01

    This viewgraph presentation covers the following topics. Construction activities envisioned for the assembly of large platforms in space (as well as interplanetary spacecraft and bases on extraterrestrial surfaces) require computational tools that exceed the capability of conventional construction management programs. The Center for Space Construction is investigating the requirements for new computational tools and, at the same time, suggesting the expansion of graduate and undergraduate curricula to include proficiency in Computer Aided Engineering (CAE) though design courses and individual or team projects in advanced space systems design. In the center's research, special emphasis is placed on problems of constructability and of the interruptability of planned activity sequences to be carried out by crews operating under hostile environmental conditions. The departure point for the planned work is the acquisition of the MCAE I-DEAS software, developed by the Structural Dynamics Research Corporation (SDRC), and its expansion to the level of capability denoted by the acronym IDEAS**2 currently used for configuration maintenance on Space Station Freedom. In addition to improving proficiency in the use of I-DEAS and IDEAS**2, it is contemplated that new software modules will be developed to expand the architecture of IDEAS**2. Such modules will deal with those analyses that require the integration of a space platform's configuration with a breakdown of planned construction activities and with a failure modes analysis to support computer aided system engineering (CASE) applied to space construction.

  15. Precision surveying system for PEP

    International Nuclear Information System (INIS)

    Gunn, J.; Lauritzen, T.; Sah, R.; Pellisier, P.F.

    1977-01-01

    A semi-automatic precision surveying system is being developed for PEP. Reference elevations for vertical alignment will be provided by a liquid level. The short range surveying will be accomplished using a Laser Surveying System featuring automatic data acquisition and analysis

  16. The Australian Computational Earth Systems Simulator

    Science.gov (United States)

    Mora, P.; Muhlhaus, H.; Lister, G.; Dyskin, A.; Place, D.; Appelbe, B.; Nimmervoll, N.; Abramson, D.

    2001-12-01

    Numerical simulation of the physics and dynamics of the entire earth system offers an outstanding opportunity for advancing earth system science and technology but represents a major challenge due to the range of scales and physical processes involved, as well as the magnitude of the software engineering effort required. However, new simulation and computer technologies are bringing this objective within reach. Under a special competitive national funding scheme to establish new Major National Research Facilities (MNRF), the Australian government together with a consortium of universities and research institutions has funded construction of the Australian Computational Earth Systems Simulator (ACcESS). The Simulator, or computational virtual earth, will provide the research infrastructure to the Australian earth systems science community required for simulations of dynamical earth processes at scales ranging from microscopic to global. It will consist of thematic supercomputer infrastructure and an earth systems simulation software system. The Simulator models and software will be constructed over a five year period by a multi-disciplinary team of computational scientists, mathematicians, earth scientists, civil engineers and software engineers. The construction team will integrate numerical simulation models (3D discrete elements/lattice solid model, particle-in-cell large deformation finite-element method, stress reconstruction models, multi-scale continuum models etc) with geophysical, geological and tectonic models, through advanced software engineering and visualization technologies. When fully constructed, the Simulator aims to provide the software and hardware infrastructure needed to model solid earth phenomena including global scale dynamics and mineralisation processes, crustal scale processes including plate tectonics, mountain building, interacting fault system dynamics, and micro-scale processes that control the geological, physical and dynamic…

  17. Upgrade plan for HANARO control computer system

    International Nuclear Information System (INIS)

    Kim, Min Jin; Kim, Young Ki; Jung, Hwan Sung; Choi, Young San; Woo, Jong Sub; Jun, Byung Jin

    2001-01-01

    A microprocessor-based digital control system, the Multi-Loop Controller (MLC), which was chosen to control HANARO, was introduced to the market in the early '80s and had been used to control petrochemical plants, paper mills and the Slowpoke reactor in Canada. Due to developments in computer technology, it has become an outdated model, and its production was discontinued a few years ago. Hence difficulty in acquiring spare parts is expected. To achieve stable reactor control during its lifetime and to avoid possible technical dependency on the manufacturer, a long-term replacement plan for the HANARO control computer system is under way. The plan includes several steps. This paper briefly introduces the methods of implementation and discusses the engineering activities of the plan.

  18. Interactive computer-enhanced remote viewing system

    International Nuclear Information System (INIS)

    Tourtellott, J.A.; Wagner, J.F.

    1995-01-01

    Remediation activities such as decontamination and decommissioning (D&D) typically involve materials and activities hazardous to humans. Robots are an attractive way to conduct such remediation, but for efficiency they need a good three-dimensional (3-D) computer model of the task space where they are to function. This model can be created from engineering plans and architectural drawings and from empirical data gathered by various sensors at the site. The model is used to plan robotic tasks and verify that selected paths are clear of obstacles. This report describes the development of an Interactive Computer-Enhanced Remote Viewing System (ICERVS), a software system to provide a reliable geometric description of a robotic task space, and enable robotic remediation to be conducted more effectively and more economically

  19. Interactive computer-enhanced remote viewing system

    Energy Technology Data Exchange (ETDEWEB)

    Tourtellott, J.A.; Wagner, J.F. [Mechanical Technology Incorporated, Latham, NY (United States)

    1995-10-01

    Remediation activities such as decontamination and decommissioning (D&D) typically involve materials and activities hazardous to humans. Robots are an attractive way to conduct such remediation, but for efficiency they need a good three-dimensional (3-D) computer model of the task space where they are to function. This model can be created from engineering plans and architectural drawings and from empirical data gathered by various sensors at the site. The model is used to plan robotic tasks and verify that selected paths are clear of obstacles. This report describes the development of an Interactive Computer-Enhanced Remote Viewing System (ICERVS), a software system to provide a reliable geometric description of a robotic task space, and enable robotic remediation to be conducted more effectively and more economically.

  20. Performance evaluation of a computed radiography system

    Energy Technology Data Exchange (ETDEWEB)

    Roussilhe, J.; Fallet, E. [Carestream Health France, 71 - Chalon/Saone (France); Mango, St.A. [Carestream Health, Inc. Rochester, New York (United States)

    2007-07-01

    Computed radiography (CR) standards have been formalized and published in Europe and in the US. The CR system classification is defined in those standards by a minimum normalized signal-to-noise ratio (SNRN) and a maximum basic spatial resolution (SRb). Both the signal-to-noise ratio (SNR) and the contrast sensitivity of a CR system depend on the dose (exposure time and conditions) at the detector. Because of its wide dynamic range, the same storage phosphor imaging plate can qualify for all six CR system classes. The exposure characteristics from 30 to 450 kV, the contrast sensitivity, and the spatial resolution of the KODAK INDUSTREX CR Digital System have been thoroughly evaluated. This paper will present some of the factors that determine the system's spatial resolution performance. (authors)

  1. TMX-U computer system in evolution

    International Nuclear Information System (INIS)

    Casper, T.A.; Bell, H.; Brown, M.; Gorvad, M.; Jenkins, S.; Meyer, W.; Moller, J.; Perkins, D.

    1986-01-01

    Over the past three years, the total TMX-U diagnostic data base has grown to exceed 10 megabytes from over 1300 channels, roughly triple the originally designed size. This acquisition and processing load has resulted in an experiment repetition rate exceeding 10 minutes per shot using the five original Hewlett-Packard HP-1000 computers with their shared disks. Our new diagnostics tend to be multichannel instruments, which, in our environment, can be more easily managed using local computers. For this purpose, we are using HP series 9000 computers for instrument control, data acquisition, and analysis. Fourteen such systems are operational, with processed-format output exchanged via a shared resource manager. We are presently implementing the necessary hardware and software changes to create a local area network allowing us to combine the data from these systems with our main data archive. The expansion of our diagnostic system using the parallel acquisition and processing concept allows us to increase our data base with a minimum of impact on the experimental repetition rate

  2. Honeywell Modular Automation System Computer Software Documentation

    International Nuclear Information System (INIS)

    STUBBS, A.M.

    2000-01-01

    The purpose of this Computer Software Document (CSWD) is to provide configuration control of the Honeywell Modular Automation System (MAS) in use at the Plutonium Finishing Plant (PFP). This CSWD describes the hardware and the PFP-developed software for control of stabilization furnaces. The Honeywell software can generate configuration reports for the developed control software; these reports are described in the following section and are attached as addenda. This plan applies to the PFP Engineering Manager, Thermal Stabilization Cognizant Engineers, and the Shift Technical Advisors responsible for the Honeywell MAS software/hardware and administration of the Honeywell System

  3. Radiation management computer system for Monju

    International Nuclear Information System (INIS)

    Aoyama, Kei; Yasutomo, Katsumi; Sudou, Takayuki; Yamashita, Masahiro; Hayata, Kenichi; Ueda, Hajime; Hosokawa, Hideo

    2002-01-01

    Radiation management at nuclear power research institutes, nuclear power stations and other such facilities is strictly controlled under Japanese laws and management policies. Recently, the momentous issues of more accurate radiation dose management and increased work efficiency have been discussed. Up to now, Fuji Electric Company has supplied a large number of radiation management systems to nuclear power stations and related nuclear facilities. We introduce a new radiation management computer system adopting WWW techniques for the Japan Nuclear Cycle Development Institute MONJU fast breeder reactor (MONJU). (author)

  4. A computer-based purchase management system

    International Nuclear Information System (INIS)

    Kuriakose, K.K.; Subramani, M.G.

    1989-01-01

    The details of a computer-based purchase management system developed to meet the specific requirements of the Madras Regional Purchase Unit (MRPU) are given. However, it can be easily modified to meet the requirements of any other purchase department. It covers various operations of MRPU, from indent processing to the preparation of purchase orders and reminders. In order to enable timely management action and control, facilities are provided to generate the necessary management information reports. The scope for further work is also discussed. The system is completely menu driven and user friendly. Appendices A and B contain the implemented menus and sample outputs, respectively. (author)

  5. Tools for Embedded Computing Systems Software

    Science.gov (United States)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talk and key figures from each workshop presentation, together with chairmen's summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  6. Computational modeling of shallow geothermal systems

    CERN Document Server

    Al-Khoury, Rafid

    2011-01-01

    A Step-by-step Guide to Developing Innovative Computational Tools for Shallow Geothermal Systems Geothermal heat is a viable source of energy and its environmental impact in terms of CO2 emissions is significantly lower than conventional fossil fuels. Shallow geothermal systems are increasingly utilized for heating and cooling of buildings and greenhouses. However, their utilization is inconsistent with the enormous amount of energy available underneath the surface of the earth. Projects of this nature are not getting the public support they deserve because of the uncertainties associated with

  7. Computer performance optimization systems, applications, processes

    CERN Document Server

    Osterhage, Wolfgang W

    2013-01-01

    Computing power performance was important at times when hardware was still expensive, because hardware had to be put to the best use. Later on this criterion was no longer critical, since hardware had become inexpensive. Meanwhile, however, people have realized that performance again plays a significant role, because of the major drain on system resources involved in developing complex applications. This book distinguishes between three levels of performance optimization: the system level, application level and business processes level. On each, optimizations can be achieved and cost-cutting p

  8. Interactive computer-enhanced remote viewing system

    International Nuclear Information System (INIS)

    Tourtellott, J.A.; Wagner, J.F.

    1995-01-01

    Remediation activities such as decontamination and decommissioning (D&D) typically involve materials and activities hazardous to humans. Robots are an attractive way to conduct such remediation, but for efficiency they need a good three-dimensional (3-D) computer model of the task space where they are to function. This model can be created from engineering plans and architectural drawings and from empirical data gathered by various sensors at the site. The model is used to plan robotic tasks and verify that selected paths are clear of obstacles. This need for a task space model is most pronounced in the remediation of obsolete production facilities and underground storage tanks. Production facilities at many sites contain compact process machinery and systems that were used to produce weapons-grade material. For many such systems, a complex maze of pipes (with potentially dangerous contents) must be removed, and this represents a significant D&D challenge. In an analogous way, the underground storage tanks at sites such as Hanford represent a challenge because of their limited entry and the tumbled profusion of in-tank hardware. In response to this need, the Interactive Computer-Enhanced Remote Viewing System (ICERVS) is being designed as a software system to: (1) provide a reliable geometric description of a robotic task space, and (2) enable robotic remediation to be conducted more effectively and more economically than with available techniques. A system such as ICERVS is needed because of the problems discussed below

  9. Production Management System for AMS Computing Centres

    Science.gov (United States)

    Choutko, V.; Demakov, O.; Egorov, A.; Eline, A.; Shan, B. S.; Shi, R.

    2017-10-01

    The Alpha Magnetic Spectrometer [1] (AMS) has collected over 95 billion cosmic ray events since it was installed on the International Space Station (ISS) on May 19, 2011. To cope with the enormous flux of events, AMS uses 12 computing centers in Europe, Asia and North America, which have different hardware and software configurations. The centers participate in data reconstruction and Monte-Carlo (MC) simulation [2] (data and MC production) as well as in physics analysis. A data production management system has been developed to facilitate data and MC production tasks in the AMS computing centers, including job acquiring, submitting, monitoring, transferring, and accounting. It was designed to be modular, lightweight, and easy to deploy. The system is based on a Deterministic Finite Automaton [3] model, and is implemented in the script languages Python and Perl with the built-in sqlite3 database on Linux operating systems. Different batch management systems, file system storages, and transfer protocols are supported. The details of the integration with the Open Science Grid are presented as well.
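
    As a sketch of the deterministic finite automaton idea described above, the fragment below models a job life cycle as a table of (state, event) → state transitions in Python, the same language the AMS system uses. The state and event names are illustrative assumptions, not the actual AMS implementation.

        # Minimal DFA-style job life cycle, in the spirit of the production
        # management system described above. States and events are assumed
        # for illustration; the real AMS state machine is not reproduced here.
        TRANSITIONS = {
            ("acquired",     "submit"):   "submitted",
            ("submitted",    "start"):    "running",
            ("running",      "finish"):   "transferring",
            ("running",      "fail"):     "failed",
            ("transferring", "done"):     "accounted",
            ("failed",       "resubmit"): "submitted",
        }

        class Job:
            def __init__(self, job_id):
                self.job_id = job_id
                self.state = "acquired"   # initial DFA state

            def handle(self, event):
                """Advance the DFA; reject transitions the table does not allow."""
                key = (self.state, event)
                if key not in TRANSITIONS:
                    raise ValueError(f"illegal event {event!r} in state {self.state!r}")
                self.state = TRANSITIONS[key]
                return self.state

        job = Job("mc-000001")
        for ev in ("submit", "start", "finish", "done"):
            print(job.job_id, "->", job.handle(ev))

    One attraction of the table representation is that determinism is enforced by construction: each (state, event) key can map to only one successor, so illegal or ambiguous transitions surface immediately.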

  10. Compact, open-architecture computed radiography system

    International Nuclear Information System (INIS)

    Huang, H.K.; Lim, A.; Kangarloo, H.; Eldredge, S.; Loloyan, M.; Chuang, K.S.

    1990-01-01

    Computed radiography (CR) was introduced in 1982, and its basic system design has not changed. Current CR systems have certain limitations: spatial resolution and signal-to-noise ratios are lower than those of screen-film systems, they are complicated and expensive to build, and they have a closed architecture. The authors of this paper designed and implemented a simpler, lower-cost, compact, open-architecture CR system to overcome some of these limitations. The open-architecture system is a manual-load single-plate reader that can fit on a desk top. Phosphor images are stored on a local disk and can be sent to any other computer through standard interfaces. Any manufacturer's plate can be read, with a scanning time of 90 seconds for a 35 x 43-cm plate. The standard pixel size is 174 μm and can be adjusted for higher spatial resolution. The data resolution is 12 bits/pixel over an x-ray exposure range of 0.01-100 mR

  11. A computer-aided continuous assessment system

    Directory of Open Access Journals (Sweden)

    B. C.H. Turton

    1996-12-01

    Full Text Available Universities within the United Kingdom have had to cope with a massive expansion in undergraduate student numbers over the last five years (Committee of Scottish University Principals, 1993; CVCP Briefing Note, 1994). In addition, there has been a move towards modularization and a closer monitoring of a student's progress throughout the year. Since the price/performance ratio of computer systems has continued to improve, Computer-Assisted Learning (CAL) has become an attractive option (Fry, 1990; Benford et al, 1994; Laurillard et al, 1994). To this end, the Universities Funding Council (UFC) has funded the Teaching and Learning Technology Programme (TLTP). However, universities also have a duty to assess as well as to teach. This paper describes a Computer-Aided Assessment (CAA) system capable of assisting in grading students and providing feedback. In this particular case, a continuously assessed course (Low-Level Languages) of over 100 students is considered. Typically, three man-days are required to mark one assessed piece of coursework from the students in this class. Any feedback on how the questions were dealt with by the student is of necessity brief. Most of the feedback is provided in a tutorial session that covers the pitfalls encountered by the majority of the students.

  12. RXY/DRXY-a postprocessing graphical system for scientific computation

    International Nuclear Information System (INIS)

    Jin Qijie

    1990-01-01

    Scientific computing requires computer graphics functions for its visualization. The development objectives and functions of a postprocessing graphical system for scientific computation are described, and its implementation is briefly outlined

  13. Computational needs for the RIA accelerator systems

    International Nuclear Information System (INIS)

    Ostroumov, P.N.; Nolen, J.A.; Mustapha, B.

    2006-01-01

    This paper discusses the computational needs for the full design and simulation of the RIA accelerator systems. Beam dynamics simulations are essential first to define and optimize the architectural design for both the driver linac and the post-accelerator. They are also important for studying different design options and various off-normal modes in order to decide on the best-performing and most cost-effective design. Due to the high-intensity primary beams, the beam-stripper interaction is a source of both radioactivation and beam contamination and should be carefully investigated and simulated for proper beam collimation and shielding. The targets and fragment separators area also needs very special attention in order to reduce any radiological hazards through careful shielding design. For all these simulations parallel computing is an absolute necessity

  14. System administration of ATLAS TDAQ computing environment

    Science.gov (United States)

    Adeel-Ur-Rehman, A.; Bujor, F.; Benes, J.; Caramarcu, C.; Dobson, M.; Dumitrescu, A.; Dumitru, I.; Leahu, M.; Valsan, L.; Oreshkin, A.; Popov, D.; Unel, G.; Zaytsev, A.

    2010-04-01

    This contribution gives a thorough overview of the activities of the ATLAS TDAQ SysAdmin group, which administers the TDAQ computing environment supporting the High Level Trigger, Event Filter and other subsystems of the ATLAS detector operating at the LHC collider at CERN. The current installation consists of approximately 1500 netbooted nodes managed by more than 60 dedicated servers, about 40 multi-screen user interface machines installed in the control rooms, and various hardware and service monitoring machines as well. In the final configuration, the online computer farm will be capable of hosting tens of thousands of applications running simultaneously. The software distribution requirements are met by a two-level NFS-based solution. The hardware and network monitoring systems of ATLAS TDAQ are based on NAGIOS, with a MySQL cluster behind it for accounting and storing the collected monitoring data, IPMI tools, CERN LANDB and dedicated tools developed by the group, e.g. ConfdbUI. The user management schema deployed in the TDAQ environment is founded on an authentication and role management system based on LDAP. External access to the ATLAS online computing facilities is provided by means of gateways supplied with an accounting system as well. Current activities of the group include deployment of a centralized storage system, testing and validating hardware solutions for future use within the ATLAS TDAQ environment including new multi-core blade servers, developing GUI tools for user authentication and role management, testing and validating 64-bit OS, and upgrading the existing TDAQ hardware components, authentication servers and gateways.

  15. Specialized Computer Systems for Environment Visualization

    Science.gov (United States)

    Al-Oraiqat, Anas M.; Bashkov, Evgeniy A.; Zori, Sergii A.

    2018-06-01

    The need for real-time image generation of landscapes arises in various fields as part of tasks solved by virtual and augmented reality systems, as well as geographic information systems. Such systems provide opportunities for collecting, storing, analyzing and graphically visualizing geographic data. Algorithmic, hardware and software tools for increasing the realism and efficiency of environment visualization in 3D visualization systems are proposed. This paper discusses a modified path tracing algorithm with a two-level hierarchy of bounding volumes that finds intersections with Axis-Aligned Bounding Boxes. The proposed algorithm eliminates branching and is hence better suited to implementation on multi-threaded CPUs and GPUs. A modified ROAM algorithm is used to solve the problem of qualitative visualization of reliefs and landscapes. The algorithm is implemented on parallel systems: clusters and Compute Unified Device Architecture networks. Results show that the implementation on MPI clusters is more efficient than on Graphics Processing Units/Graphics Processing Clusters and allows real-time synthesis. The organization and algorithms of a parallel GPU system for 3D pseudo-stereo image/video synthesis are also proposed, after analyzing which stages of 3D pseudo-stereo synthesis can be realized on a parallel GPU architecture. An experimental prototype of a specialized hardware-software system for 3D pseudo-stereo imaging and video was developed on the CPU/GPU. The experimental results show that the proposed adaptation of 3D pseudo-stereo imaging to the architecture of GPU systems is efficient, and that it accelerates the computational procedures of 3D pseudo-stereo synthesis for the anaglyph and anamorphic formats of the 3D stereo frame without performing optimization procedures. The acceleration is on average 11 and 54 times for the test GPUs.
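
    The branch elimination mentioned above can be illustrated with the classic slab formulation of the ray/AABB test, which replaces per-axis if/else logic with min/max operations. This is a simplified single-ray sketch under assumed conventions (inverse ray direction precomputed per axis), not the paper's GPU implementation.

        # Slab-method ray vs. Axis-Aligned Bounding Box test. Using min/max
        # instead of conditionals keeps the inner loop free of divergent
        # branches, which is what makes it attractive on GPUs.
        def ray_hits_aabb(origin, inv_dir, box_min, box_max):
            """origin, inv_dir, box_min, box_max: 3-tuples; inv_dir = 1/d per axis."""
            tmin, tmax = 0.0, float("inf")
            for o, inv, lo, hi in zip(origin, inv_dir, box_min, box_max):
                t1 = (lo - o) * inv
                t2 = (hi - o) * inv
                tmin = max(tmin, min(t1, t2))   # latest entry time so far
                tmax = min(tmax, max(t1, t2))   # earliest exit time so far
            # Degenerate rays lying exactly on a box face (0 * inf) are not handled.
            return tmin <= tmax

        # Ray from (-1, 0.5, 0.5) along +x against the unit cube: prints True.
        print(ray_hits_aabb((-1.0, 0.5, 0.5), (1.0, float("inf"), float("inf")),
                            (0.0, 0.0, 0.0), (1.0, 1.0, 1.0)))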

  16. Tutoring system for nondestructive testing using computer

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jin Koo; Koh, Sung Nam [Joong Ang Inspection Co.,Ltd., Seoul (Korea, Republic of); Shim, Yun Ju; Kim, Min Koo [Dept. of Computer Engineering, Aju University, Suwon (Korea, Republic of)

    1997-10-15

    This paper introduces a multimedia tutoring system for nondestructive testing using a personal computer. Nondestructive testing, one of the chief methods for inspecting welds and many other components, is very difficult for NDT inspectors to understand at a technical level without wide experience, and considerable repeated education and training are necessary for them to maintain their knowledge. A tutoring system that can simulate NDT work is suggested to solve the above problem. The tutoring system presents the basic theories of nondestructive testing in a book style with video images and hyperlinks, and it offers practice sessions in which users can simulate the testing equipment. The book-style presentation and simulation practices provide an effective and individual environment for learning nondestructive testing.

  17. Tutoring system for nondestructive testing using computer

    International Nuclear Information System (INIS)

    Kim, Jin Koo; Koh, Sung Nam; Shim, Yun Ju; Kim, Min Koo

    1997-01-01

    This paper introduces a multimedia tutoring system for nondestructive testing using a personal computer. Nondestructive testing, one of the chief methods for inspecting welds and many other components, is very difficult for NDT inspectors to understand at a technical level without wide experience, and considerable repeated education and training are necessary for them to maintain their knowledge. A tutoring system that can simulate NDT work is suggested to solve the above problem. The tutoring system presents the basic theories of nondestructive testing in a book style with video images and hyperlinks, and it offers practice sessions in which users can simulate the testing equipment. The book-style presentation and simulation practices provide an effective and individual environment for learning nondestructive testing.

  18. Visual computing scientific visualization and imaging systems

    CERN Document Server

    2014-01-01

    This volume aims to stimulate discussions on research involving the use of data and digital images as an approach to understanding, analyzing and visualizing phenomena and experiments. The emphasis is put not only on graphically representing data as a way of increasing its visual analysis, but also on the imaging systems which contribute greatly to the comprehension of real cases. Scientific Visualization and Imaging Systems encompass multidisciplinary areas, with applications in many knowledge fields such as Engineering, Medicine, Material Science, Physics, Geology, Geographic Information Systems, among others. This book is a selection of 13 revised and extended research papers presented at the International Conference on Advanced Computational Engineering and Experimenting (ACE-X) conferences of 2010 (Paris), 2011 (Algarve), 2012 (Istanbul) and 2013 (Madrid). The examples were particularly chosen from materials research, medical applications, general concepts applied in simulations and image analysis and ot...

  19. Implementing a modular system of computer codes

    International Nuclear Information System (INIS)

    Vondy, D.R.; Fowler, T.B.

    1983-07-01

    A modular computation system has been developed for nuclear reactor core analysis. The codes can be applied repeatedly in blocks without extensive user input data, as needed for reactor history calculations. The primary control options over the calculational paths and task assignments within the codes are blocked separately from other instructions, admitting ready access by user input instruction or directions from automated procedures and promoting flexible and diverse applications at minimum application cost. Data interfacing is done under formal specifications, with data files manipulated by an informed manager. This report emphasizes the system aspects and the development of useful capability; it is intended to be informative and useful to anyone developing a modular code system of similar sophistication. Overall, this report summarizes in a general way the many factors and difficulties that are faced in making reactor core calculations, based on the experience of the authors. It provides the background against which work on HTGR reactor physics is being carried out

  20. 2XIIB computer data acquisition system

    International Nuclear Information System (INIS)

    Tyler, G.C.

    1975-01-01

    All major plasma diagnostic measurements from the 2XIIB experiment are recorded, digitized, and stored by the computer data acquisition system. The raw data are then examined, correlated, and reduced, and useful portions are quickly retrieved to direct the future conduct of the plasma experiment. This is done in real time and on line while the data are current. The immediate availability of this pertinent data has accelerated the rate at which the 2XIIB personnel have been able to gain knowledge in the study of plasma containment and fusion interaction, and the uptime of the experiment is being used much more effectively than ever before. This paper describes the hardware configuration of our data system in relation to the various plasma parameters measured, the advantages of powerful software routines to reduce and correlate the data, the present plans for expansion of the system, and the problems we have had to overcome in certain areas to meet our original goals

  1. Interactive computer enhanced remote viewing system

    International Nuclear Information System (INIS)

    Smith, D.A.; Tourtellott, J.A.

    1994-01-01

    The Interactive, Computer Enhanced, Remote Viewing System (ICERVS) is a volumetric data system designed to help the Department of Energy (DOE) improve remote operations in hazardous sites by providing reliable and accurate maps of task spaces where robots will clean up nuclear wastes. The ICERVS mission is to acquire, store, integrate and manage all the sensor data for a site and to provide the necessary tools to facilitate its visualization and interpretation. Empirical sensor data enters through the Common Interface for Sensors and, after initial processing, is stored in the Volumetric Database. The data can be analyzed and displayed via a Graphic User Interface with a variety of visualization tools. Other tools permit the construction of geometric objects, such as wire frame models, to represent objects which the operator may recognize in the live TV image. A computer image can be generated that matches the viewpoint of the live TV camera at the remote site, facilitating access to site data. Lastly, the data can be gathered, processed, and transmitted in acceptable form to a robotic controller. Descriptions are given of all these components. The final phase of the ICERVS project, which has just begun, will produce a full-scale system and demonstrate it at a DOE site to be selected. A task added to this phase will adapt ICERVS to meet the needs of the dismantlement and decommissioning (D&D) work at the Oak Ridge National Laboratory (ORNL)

  2. Superconducting system for adiabatic quantum computing

    Energy Technology Data Exchange (ETDEWEB)

    Corato, V [Dipartimento di Ingegneria dell' Informazione, Second University of Naples, 81031 Aversa (Italy); Roscilde, T [Department of Physics and Astronomy, University of Southern California, Los Angeles, CA 90089-0484 (United States); Ruggiero, B [Istituto di Cibernetica ' E.Caianiello' del CNR, I-80078, Pozzuoli (Italy); Granata, C [Istituto di Cibernetica ' E.Caianiello' del CNR, I-80078, Pozzuoli (Italy); Silvestrini, P [Dipartimento di Ingegneria dell' Informazione, Second University of Naples, 81031 Aversa (Italy)

    2006-06-01

    We study the Hamiltonian of a system of inductively coupled flux qubits, which has been theoretically proposed for adiabatic quantum computation to handle NP problems. We study the evolution of a basic structure consisting of three coupled rf-SQUIDs upon tuning the external flux bias, and we show that the adiabatic nature of the evolution is guaranteed by the presence of the single-SQUID gap. We further propose a scheme and the first realization of an experimental device suitable for verifying the theoretical results.

  3. Computers start to think with expert systems

    Energy Technology Data Exchange (ETDEWEB)

    1983-03-21

    A growing number of professionals - notably in oil and mineral exploration, plasma research, medicine, VLSI circuit design, drug design and robotics - are beginning to use computerised expert systems. Such a system is a computer program that uses knowledge and inference procedures to solve problems which are sufficiently difficult to require significant human expertise for their solution. The facts constitute a body of information that is widely shared, publicly available and generally agreed upon by experts in the field. The heuristics are mostly private, and little discussed, rules of good judgement (rules of plausible reasoning, rules of good guessing, etc.) that characterise expert-level decision making in the field.

  4. QUASI-RANDOM TESTING OF COMPUTER SYSTEMS

    Directory of Open Access Journals (Sweden)

    S. V. Yarmolik

    2013-01-01

    Full Text Available Various modified random testing approaches have been proposed for computer system testing in the black-box environment. Their effectiveness has been evaluated on typical failure patterns by employing three measures, namely the P-measure, E-measure and F-measure. Quasi-random testing, a modified version of random testing, has been proposed and analyzed. The quasi-random Sobol sequences and modified Sobol sequences are used as the test patterns. Some new methods for Sobol sequence generation have also been proposed and analyzed.
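
    The flavour of quasi-random test generation can be seen with the base-2 van der Corput radical inverse, the one-dimensional building block of Sobol-type sequences. The failure region below is a toy assumption used only to show a low-discrepancy sequence revealing a small "block" failure pattern quickly; the paper's actual Sobol constructions and effectiveness measures are richer.

        # Quasi-random test generation sketch: the base-2 van der Corput radical
        # inverse is the 1-D building block of the Sobol sequences discussed above.
        def van_der_corput(n, base=2):
            """Return the n-th point of the base-b van der Corput sequence."""
            q, denom = 0.0, 1.0
            while n:
                n, rem = divmod(n, base)
                denom *= base
                q += rem / denom
            return q

        def fails(x):
            """Toy program under test: fails on the input block [0.30, 0.32)."""
            return 0.30 <= x < 0.32

        for i in range(1, 200):
            if fails(van_der_corput(i)):
                print("first failure-revealing test:", i)   # prints 10 (x = 0.3125)
                break

    Because consecutive points fill the unit interval evenly, the sequence lands in the 2%-wide failure block after only a handful of tests, whereas uniform random testing would need on average about fifty.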

  5. Programming guidelines for computer systems of NPPs

    International Nuclear Information System (INIS)

    Suresh babu, R.M.; Mahapatra, U.

    1999-09-01

    Software quality is assured by systematic development and adherence to established standards. All national and international software quality standards have made it mandatory for the software development organisation to produce programming guidelines as part of software documentation. This document contains a set of programming guidelines for detailed design and coding phases of software development cycle. These guidelines help to improve software quality by increasing visibility, verifiability, testability and maintainability. This can be used organisation-wide for various computer systems being developed for our NPPs. This also serves as a guide for reviewers. (author)

  6. An Applet-based Anonymous Distributed Computing System.

    Science.gov (United States)

    Finkel, David; Wills, Craig E.; Ciaraldi, Michael J.; Amorin, Kevin; Covati, Adam; Lee, Michael

    2001-01-01

    Defines anonymous distributed computing systems and focuses on the specifics of a Java, applet-based approach for large-scale, anonymous, distributed computing on the Internet. Explains the possibility of a large number of computers participating in a single computation and describes a test of the functionality of the system. (Author/LRW)

  7. Spectrometer user interface to computer systems

    International Nuclear Information System (INIS)

    Salmon, L.; Davies, M.; Fry, F.A.; Venn, J.B.

    1979-01-01

    A computer system for use in radiation spectrometry should be designed around the needs and comprehension of the user and his operating environment. To this end, the functions of the system should be built in a modular and independent fashion such that they can be joined to the back end of an appropriate user interface. The point that this interface should be designed, rather than just allowed to evolve, is illustrated by reference to four related computer systems of differing complexity and function. The physical user interfaces in all cases are keyboard terminals, and the virtues and otherwise of these devices are discussed and compared with others. The language interface needs to satisfy a number of requirements, often conflicting. Among these, simplicity and speed of operation compete with flexibility and scope. Both experienced and novice users need to be considered, and any individual's needs may vary from naive to complex. To be efficient and resilient, the implementation must use an operating system, but the user needs to be protected from its complex and unfamiliar syntax. At the same time the interface must allow the user access to all services appropriate to his needs. The user must also receive an impression of privacy in a multi-user system. The interface itself must be stable and exhibit continuity between implementations. Some of these conflicting needs have been overcome by the SABRE interface, with languages operating at several levels. The foundation is a simple semi-mnemonic command language that activates individual and independent functions. The commands can be used with positional parameters or in an interactive dialogue, the precise nature of which depends upon the operating environment and the user's experience. A command procedure or macro language allows combinations of commands with conditional branching and arithmetic features. Thus complex but repetitive operations are easily performed

  8. Design And Construction Of Controller System And Data Acquisition Of Creep Test Machine

    International Nuclear Information System (INIS)

    Farokhi; Arhatari, B.D.; DT. SonyTj.. Histori; Sudarno; Haryanto, Mudi; Triyadi, Ari

    2001-01-01

    The design and construction of a creep test machine have been carried out to obtain higher performance from the machine's controller and data acquisition system. The design and construction were realized by adding an automatic power control circuit, an interface, and a computer program on a PC. The interface circuit takes the form of a card that plugs into the ISA bus of a compatible IBM PC. The computer program is written in Turbo C++. With this modification, the test results show a reduction in measurement error from 80 μm to 90 μm. The modification also makes the creep test machine semi-automatic, decreasing dependence on the operator. Other advantages are easier reading of the result data, display of the result data in real time or from file, easier production of test result curves, and easier analysis of the result data

  9. Topics in computer simulations of statistical systems

    International Nuclear Information System (INIS)

    Salvador, R.S.

    1987-01-01

    Several computer simulations studying a variety of topics in statistical mechanics and lattice gauge theories are performed. The first study describes a Monte Carlo simulation performed on Ising systems defined on Sierpinski carpets of dimensions between one and four. The critical coupling and the exponent γ are measured as a function of dimension. The Ising gauge theory in d = 4 - epsilon, for epsilon → 0+, is then studied by performing a Monte Carlo simulation for the theory defined on fractals. A high-statistics Monte Carlo simulation for the three-dimensional Ising model is presented for lattices of sizes 8³ to 44³. All the data obtained agree completely, within statistical errors, with the forms predicted by finite-size scaling. Finally, a method to estimate numerically the partition function of statistical systems is developed
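
    For reference, the Metropolis Monte Carlo kernel underlying such Ising studies fits in a few lines. This sketch uses a small two-dimensional lattice and an assumed inverse temperature for brevity, whereas the work above treats fractal and three-dimensional lattices.

        # Metropolis Monte Carlo sketch for an Ising model: propose single-spin
        # flips and accept with probability min(1, exp(-beta * dE)).
        import math, random

        L, beta = 16, 0.44      # lattice size and inverse temperature (assumed)
        spin = [[random.choice((-1, 1)) for _ in range(L)] for _ in range(L)]

        def sweep():
            for _ in range(L * L):
                i, j = random.randrange(L), random.randrange(L)
                # Sum of the four nearest neighbours with periodic boundaries
                nn = (spin[(i + 1) % L][j] + spin[(i - 1) % L][j] +
                      spin[i][(j + 1) % L] + spin[i][(j - 1) % L])
                dE = 2 * spin[i][j] * nn    # energy cost of flipping spin (i, j)
                if dE <= 0 or random.random() < math.exp(-beta * dE):
                    spin[i][j] *= -1

        for _ in range(100):
            sweep()
        m = abs(sum(sum(row) for row in spin)) / (L * L)
        print("magnetisation per spin:", m)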

  10. FFTF integrated leak rate computer system

    International Nuclear Information System (INIS)

    Hubbard, J.A.

    1987-01-01

    The Fast Flux Test Facility (FFTF) is a liquid-metal-cooled test reactor located on the Hanford site. The FFTF is the only reactor of this type designed and operated to meet the licensing requirements of the Nuclear Regulatory Commission. Unique characteristics of the FFTF that present special challenges related to leak rate testing include thin-wall containment vessel construction, cover gas systems that penetrate containment, and a low-pressure design basis accident. The successful completion of the third FFTF integrated leak rate test 5 days ahead of schedule and 10% under budget was a major achievement for the Westinghouse Hanford Company. The success of this operational safety test was due in large part to a special local area network (LAN) of three IBM PC/XT computers, which monitored the sensor data, calculated the containment vessel leak rate, and displayed test results. The equipment configuration allowed continuous monitoring of the progress of the test independent of the data acquisition and analysis functions, and it also provided overall improved system reliability by permitting immediate switching to backup computers in the event of equipment failure

  11. COMPUTER-BASED REASONING SYSTEMS: AN OVERVIEW

    Directory of Open Access Journals (Sweden)

    CIPRIAN CUCU

    2012-12-01

    Full Text Available Argumentation is nowadays seen both as a skill that people use in various aspects of their lives, and as an educational technique that can support the transfer or creation of knowledge, thus aiding in the development of other skills or attitudes (e.g. communication or critical thinking). However, teaching argumentation and teaching with argumentation is still a rare practice, mostly due to the lack of available resources such as time or expert human tutors who are specialized in argumentation. Intelligent computer systems (i.e. systems that implement an inner representation of particular knowledge and try to emulate the behavior of humans) could allow more people to understand the purpose, techniques and benefits of argumentation. The proposed paper investigates state-of-the-art concepts of computer-based argumentation used in education and tries to develop a conceptual map showing benefits, limitations and relations between various concepts, focusing on the duality “learning to argue – arguing to learn”.

  12. Potential of Cognitive Computing and Cognitive Systems

    Science.gov (United States)

    Noor, Ahmed K.

    2015-01-01

    Cognitive computing and cognitive technologies are game changers for future engineering systems, as well as for engineering practice and training. They are major drivers for knowledge automation work, and the creation of cognitive products with higher levels of intelligence than current smart products. This paper gives a brief review of cognitive computing and some of the cognitive engineering systems activities. The potential of cognitive technologies is outlined, along with a brief description of future cognitive environments, incorporating cognitive assistants - specialized proactive intelligent software agents designed to follow and interact with humans and other cognitive assistants across the environments. The cognitive assistants engage, individually or collectively, with humans through a combination of adaptive multimodal interfaces, and advanced visualization and navigation techniques. The realization of future cognitive environments requires the development of a cognitive innovation ecosystem for the engineering workforce. The continuously expanding major components of the ecosystem include integrated knowledge discovery and exploitation facilities (incorporating predictive and prescriptive big data analytics); novel cognitive modeling and visual simulation facilities; cognitive multimodal interfaces; and cognitive mobile and wearable devices. The ecosystem will provide timely, engaging, personalized / collaborative, learning and effective decision making. It will stimulate creativity and innovation, and prepare the participants to work in future cognitive enterprises and develop new cognitive products of increasing complexity. http://www.aee.odu.edu/cognitivecomp

  13. Handbook for the Computer Security Certification of Trusted Systems

    National Research Council Canada - National Science Library

    Weissman, Clark

    1995-01-01

    Penetration testing is required for National Computer Security Center (NCSC) security evaluations of systems and products for the B2, B3, and A1 class ratings of the Trusted Computer System Evaluation Criteria (TCSEC...

  14. Grid Computing BOINC Redesign Mindmap with incentive system (gamification)

    OpenAIRE

    Kitchen, Kris

    2016-01-01

    Grid Computing BOINC Redesign Mindmap with incentive system (gamification). This is a viewable PDF of https://figshare.com/articles/Grid_Computing_BOINC_Redesign_Mindmap_with_incentive_system_gamification_/1265350

  15. Computational System For Rapid CFD Analysis In Engineering

    Science.gov (United States)

    Barson, Steven L.; Ascoli, Edward P.; Decroix, Michelle E.; Sindir, Munir M.

    1995-01-01

    Computational system comprising modular hardware and software sub-systems developed to accelerate and facilitate use of techniques of computational fluid dynamics (CFD) in engineering environment. Addresses integration of all aspects of CFD analysis process, including definition of hardware surfaces, generation of computational grids, CFD flow solution, and postprocessing. Incorporates interfaces for integration of all hardware and software tools needed to perform complete CFD analysis. Includes tools for efficient definition of flow geometry, generation of computational grids, computation of flows on grids, and postprocessing of flow data. System accepts geometric input from any of three basic sources: computer-aided design (CAD), computer-aided engineering (CAE), or definition by user.

  16. Multiaxis, Lightweight, Computer-Controlled Exercise System

    Science.gov (United States)

    Haynes, Leonard; Bachrach, Benjamin; Harvey, William

    2006-01-01

    The multipurpose, multiaxial, isokinetic dynamometer (MMID) is a computer-controlled system of exercise machinery that can serve as a means for quantitatively assessing a subject's muscle coordination, range of motion, strength, and overall physical condition with respect to a wide variety of forces, motions, and exercise regimens. The MMID is easily reconfigurable and compactly stowable and, in comparison with prior computer-controlled exercise systems, it weighs less, costs less, and offers more capabilities. Whereas a typical prior isokinetic exercise machine is limited to operation in only one plane, the MMID can operate along any path. In addition, the MMID is not limited to the isokinetic (constant-speed) mode of operation. The MMID provides for control and/or measurement of position, force, and/or speed of exertion in as many as six degrees of freedom simultaneously; hence, it can accommodate more complex, more nearly natural combinations of motions and, in so doing, offers greater capabilities for physical conditioning and evaluation. The MMID (see figure) includes as many as eight active modules, each of which can be anchored to a floor, wall, ceiling, or other fixed object. A cable is paid out from a reel in each module to a bar or other suitable object that is gripped and manipulated by the subject. The reel is driven by a DC brushless motor or other suitable electric motor via a gear reduction unit. The motor can be made to function as either a driver or an electromagnetic brake, depending on the required nature of the interaction with the subject. The module includes a force and a displacement sensor for real-time monitoring of the tension in and displacement of the cable, respectively. In response to commands from a control computer, the motor can be operated to generate a required tension in the cable, to displace the cable a required distance, or to reel the cable in or out at a required speed. The computer can be programmed, either locally or via

  17. The models of the life cycle of a computer system

    Directory of Open Access Journals (Sweden)

    Sorina-Carmen Luca

    2006-01-01

    Full Text Available The paper presents a comparative study of the patterns of the life cycle of a computer system. The advantages of each pattern are analyzed, and graphic schemes are presented that point out each stage and step in the evolution of a computer system. In the end, classifications of the methods of designing computer systems are discussed.

  18. 21 CFR 892.1200 - Emission computed tomography system.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Emission computed tomography system. 892.1200... (CONTINUED) MEDICAL DEVICES RADIOLOGY DEVICES Diagnostic Devices § 892.1200 Emission computed tomography system. (a) Identification. An emission computed tomography system is a device intended to detect the...

  19. 14 CFR 415.123 - Computing systems and software.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Computing systems and software. 415.123... Launch Vehicle From a Non-Federal Launch Site § 415.123 Computing systems and software. (a) An applicant's safety review document must describe all computing systems and software that perform a safety...

  20. Computer-Mediated Communications Systems: Will They Catch On?

    Science.gov (United States)

    Cook, Dave; Ridley, Michael

    1990-01-01

    Describes the use of CoSy, a computer conferencing system, by academic librarians at McMaster University in Ontario. Computer-mediated communications systems (CMCS) are discussed, the use of the system for electronic mail and computer conferencing is described, the perceived usefulness of CMCS is examined, and a sidebar explains details of the…

  1. Intelligent Computer Vision System for Automated Classification

    International Nuclear Information System (INIS)

    Jordanov, Ivan; Georgieva, Antoniya

    2010-01-01

    In this paper we investigate an Intelligent Computer Vision System applied to the recognition and classification of commercially available cork tiles. The system is capable of acquiring and processing gray images using several feature generation and analysis techniques. Its functionality includes image acquisition, feature extraction and preprocessing, and feature classification with neural networks (NN). We also discuss system test and validation results from the recognition and classification tasks. The system investigation also includes statistical feature processing (feature number and dimensionality reduction techniques) and classifier design (NN architecture, target coding, learning complexity and performance, and training with our own metaheuristic optimization method). The NNs trained with our genetic low-discrepancy search method (GLPτS) for global optimisation demonstrated very good generalisation abilities. In our view, the reported testing success rate of up to 95% is due to several factors: the combination of feature generation techniques; the application of Analysis of Variance (ANOVA) and Principal Component Analysis (PCA), which appeared to be very efficient for preprocessing the data; and the use of a suitable NN design and learning method.
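
    As an illustration of the dimensionality-reduction step mentioned above, the sketch below runs a plain PCA via the singular value decomposition on stand-in feature vectors. The feature counts and the number of retained components are assumptions; the real system derives its features from the cork-tile images.

        # PCA preprocessing sketch: project feature vectors onto their leading
        # principal components before feeding them to a classifier.
        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.normal(size=(60, 32))   # 60 samples, 32 raw features (assumed)
        Xc = X - X.mean(axis=0)         # center each feature

        # Principal axes come from the SVD of the centered data matrix
        U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
        k = 8                           # number of retained components (assumed)
        Z = Xc @ Vt[:k].T               # reduced features passed on to the NN

        explained = (S[:k] ** 2).sum() / (S ** 2).sum()
        print(Z.shape, f"variance retained: {explained:.2%}")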

  2. Design of a modular digital computer system, DRL 4. [for meeting future requirements of spaceborne computers

    Science.gov (United States)

    1972-01-01

    The design is reported of an advanced modular computer system designated the Automatically Reconfigurable Modular Multiprocessor System, which anticipates requirements for higher computing capacity and reliability for future spaceborne computers. Subjects discussed include: an overview of the architecture, mission analysis, synchronous and nonsynchronous scheduling control, reliability, and data transmission.

  3. Computer simulations of high pressure systems

    International Nuclear Information System (INIS)

    Wilkins, M.L.

    1977-01-01

    Numerical methods are capable of solving very difficult problems in solid mechanics and gas dynamics. In the design of engineering structures, critical decisions are possible if the behavior of materials is correctly described in the calculation. Problems of current interest require accurate analysis of stress-strain fields that range from very small elastic displacement to very large plastic deformation. A finite difference program is described that solves problems over this range and in two and three space-dimensions and time. A series of experiments and calculations serve to establish confidence in the plasticity formulation. The program can be used to design high pressure systems where plastic flow occurs. The purpose is to identify material properties, strength and elongation, that meet the operating requirements. An objective is to be able to perform destructive testing on a computer rather than on the engineering structure. Examples of topical interest are given

  4. The fundamentals of computational intelligence system approach

    CERN Document Server

    Zgurovsky, Mikhail Z

    2017-01-01

    This monograph is dedicated to the systematic presentation of the main trends, technologies and methods of computational intelligence (CI). The book pays particular attention to an important novel CI technology: fuzzy logic (FL) systems and fuzzy neural networks (FNN). Different FNNs, including the new class of cascade neo-fuzzy neural networks, are considered, and their training algorithms are described and analyzed. Applications of FNNs to forecasting in macroeconomics and at stock markets are examined. The book presents the problem of portfolio optimization under uncertainty, a novel theory of fuzzy portfolio optimization free of the drawbacks of the classical Markowitz model, and an application to portfolio optimization on the Ukrainian, Russian and American stock exchanges. The book also presents the problem of forecasting corporate bankruptcy risk under incomplete and fuzzy information, as well as new methods based on fuzzy sets theory and fuzzy neural networks and results of their application for bankruptcy ris...

  5. System Matrix Analysis for Computed Tomography Imaging

    Science.gov (United States)

    Flores, Liubov; Vidal, Vicent; Verdú, Gumersindo

    2015-01-01

    In practical applications of computed tomography imaging (CT), it is often the case that the set of projection data is incomplete owing to the physical conditions of the data acquisition process. On the other hand, the high radiation dose imposed on patients is also undesired. These issues demand that high quality CT images can be reconstructed from limited projection data. For this reason, iterative methods of image reconstruction have become a topic of increased research interest. Several algorithms have been proposed for few-view CT. We consider that the accurate solution of the reconstruction problem also depends on the system matrix that simulates the scanning process. In this work, we analyze the application of the Siddon method to generate elements of the matrix and we present results based on real projection data. PMID:26575482
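
    A two-dimensional sketch of the Siddon idea conveys how one row of the system matrix is formed: the ray's parametric crossings with the grid lines are merged and sorted, and each inter-crossing segment is assigned to the pixel containing its midpoint. The grid size, unit pixels and ray endpoints below are illustrative assumptions, not the paper's implementation.

        # Siddon-style row of a CT system matrix in 2-D: intersection length of
        # one ray with each pixel of an nx-by-ny grid of unit pixels whose
        # corner sits at the origin (negative coordinates are not handled).
        import math

        def siddon_row(p0, p1, nx, ny):
            (x0, y0), (x1, y1) = p0, p1
            dx, dy = x1 - x0, y1 - y0
            # Parametric values where the ray crosses every grid line
            alphas = {0.0, 1.0}
            if dx:
                alphas |= {(x - x0) / dx for x in range(nx + 1)}
            if dy:
                alphas |= {(y - y0) / dy for y in range(ny + 1)}
            alphas = sorted(a for a in alphas if 0.0 <= a <= 1.0)
            ray_len = math.hypot(dx, dy)
            row = {}
            for a, b in zip(alphas, alphas[1:]):
                mid = (a + b) / 2                 # midpoint identifies the pixel
                ix = int(x0 + mid * dx)
                iy = int(y0 + mid * dy)
                if 0 <= ix < nx and 0 <= iy < ny:
                    row[(ix, iy)] = row.get((ix, iy), 0.0) + (b - a) * ray_len
            return row

        # One ray through a 4x4 grid; the nonzero entries form one matrix row.
        print(siddon_row((0.0, 0.5), (4.0, 2.5), 4, 4))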

  6. Data processing device for computed tomography system

    International Nuclear Information System (INIS)

    Nakayama, N.; Ito, Y.; Iwata, K.; Nishihara, E.; Shibayama, S.

    1984-01-01

    A data processing device applied to a computed tomography system which examines a living body using X-ray radiation is disclosed. The X-rays which have penetrated the living body are converted into electric signals in a detecting section. The electric signals are acquired and converted from analog form into digital form in a data acquisition section, and then supplied to a matrix data-generating section included in the data processing device. This matrix data-generating section generates matrix data corresponding to a plurality of projection data. These matrix data are supplied to a partial sum-producing section, where the partial sums respectively corresponding to groups of the matrix data are calculated and then supplied to an accumulation section. In this accumulation section, the final value corresponding to the total sum of the matrix data is calculated, whereby the calculation for image reconstruction is performed

  7. Computer system for Monte Carlo experimentation

    International Nuclear Information System (INIS)

    Grier, D.A.

    1986-01-01

    A new computer system for Monte Carlo experimentation is presented. The new system speeds and simplifies the process of coding and preparing a Monte Carlo experiment; it also encourages the proper design of Monte Carlo experiments and the careful analysis of the experimental results. A new functional language is the core of this system. Monte Carlo experiments, and their experimental designs, are programmed in this new language; those programs are compiled into Fortran output. The Fortran output is then compiled and executed. The experimental results are analyzed with a standard statistics package such as Si, Isp, or Minitab or with a user-supplied program. Both the experimental results and the experimental design may be directly loaded into the workspace of those packages. The new functional language frees programmers from many of the details of programming an experiment. Experimental designs such as factorial, fractional factorial, or latin square are easily described by the control structures and expressions of the language. Specific mathematical models are generated by the routines of the language

  8. Computer aided operation of complex systems

    International Nuclear Information System (INIS)

    Goodstein, L.P.

    1985-09-01

    Advanced technology is having the effect that industrial systems are becoming more highly automated and do not rely on human intervention for the control of normally planned and/or predicted situations. Thus the role of the operator has shifted from manual controller to systems manager and supervisory controller. At the same time, the use of advanced information technology in the control room and its potential impact on human-machine capabilities place additional demands on the designer. This report deals with work carried out to describe the plant-operator relationship in order to systematize the design and evaluation of suitable information systems in the control room. The design process starts with the control requirements of the plant and transforms them into corresponding sets of decision-making tasks with an appropriate allocation of responsibilities between computer and operator. To make this cooperation more effective, appropriate forms of information display and access are identified. The conceptual work has been supported by experimental studies on a small-scale simulator. (author)

  9. A computing system for LBB considerations

    Energy Technology Data Exchange (ETDEWEB)

    Ikonen, K.; Miettinen, J.; Raiko, H.; Keskinen, R.

    1997-04-01

    A computing system has been developed at VTT Energy for making efficient leak-before-break (LBB) evaluations of piping components. The system consists of fracture mechanics and leak rate analysis modules which are linked via an interactive user interface, LBBCAL. The system enables quick tentative analysis of standard geometric and loading situations by means of fracture mechanics estimation schemes such as the R6, FAD, EPRI J, Battelle, plastic limit load and moments methods. Complex situations are handled with a separate in-house finite-element code, EPFM3D, which uses 20-noded isoparametric solid elements, automatic mesh generators and advanced color graphics. Analytical formulas and numerical procedures are available for leak area evaluation. A novel contribution to leak rate analysis is the CRAFLO code, which is based on a nonequilibrium two-phase flow model with phase slip. Its predictions are essentially comparable with those of the well-known SQUIRT2 code; additionally, it provides outputs for temperature, pressure and velocity distributions in the crack depth direction. An illustrative application to a circumferentially cracked elbow indicates, as expected, that a small margin relative to the saturation temperature of the coolant reduces the leak rate and is likely to influence the LBB implementation for intermediate-diameter (300 mm) primary circuit piping of BWR plants.

  10. Computational Modeling of Biological Systems From Molecules to Pathways

    CERN Document Server

    2012-01-01

    Computational modeling is emerging as a powerful new approach for studying and manipulating biological systems. Many diverse methods have been developed to model, visualize, and rationally alter these systems at various length scales, from atomic resolution to the level of cellular pathways. Processes taking place at larger time and length scales, such as molecular evolution, have also greatly benefited from new breeds of computational approaches. Computational Modeling of Biological Systems: From Molecules to Pathways provides an overview of established computational methods for the modeling of biologically and medically relevant systems. It is suitable for researchers and professionals working in the fields of biophysics, computational biology, systems biology, and molecular medicine.

  11. AGIS: Evolution of Distributed Computing Information system for ATLAS

    CERN Document Server

    Anisenkov, Alexey; The ATLAS collaboration; Alandes Pradillo, Maria; Karavakis, Edward

    2015-01-01

    The variety of the ATLAS Computing Infrastructure requires a central information system to define the topology of computing resources and to store the different parameters and configuration data which are needed by the various ATLAS software components. The ATLAS Grid Information System is the system designed to integrate configuration and status information about resources, services and topology of the computing infrastructure used by ATLAS Distributed Computing applications and services.

  12. Advances in Future Computer and Control Systems v.1

    CERN Document Server

    Lin, Sally; 2012 International Conference on Future Computer and Control Systems(FCCS2012)

    2012-01-01

    FCCS2012 is an integrated conference focusing on future computer and control systems. “Advances in Future Computer and Control Systems” presents the proceedings of the 2012 International Conference on Future Computer and Control Systems (FCCS2012), held April 21-22, 2012, in Changsha, China, and includes recent research results on future computer and control systems from researchers around the world.

  13. Advances in Future Computer and Control Systems v.2

    CERN Document Server

    Lin, Sally; 2012 International Conference on Future Computer and Control Systems(FCCS2012)

    2012-01-01

    FCCS2012 is an integrated conference focusing on future computer and control systems. “Advances in Future Computer and Control Systems” presents the proceedings of the 2012 International Conference on Future Computer and Control Systems (FCCS2012), held April 21-22, 2012, in Changsha, China, and includes recent research results on future computer and control systems from researchers around the world.

  14. Computing anticipatory systems with incursion and hyperincursion

    Science.gov (United States)

    Dubois, Daniel M.

    1998-07-01

    An anticipatory system is a system which contains a model of itself and/or of its environment in view of computing its present state as a function of the prediction of the model. With the concepts of incursion and hyperincursion, anticipatory discrete systems can be modelled, simulated and controlled. By definition an incursion, an inclusive or implicit recursion, can be written as x(t+1)=F[…,x(t-1),x(t),x(t+1),…], where the value of a variable x(t+1) at time t+1 is a function of this variable at past, present and future times. This is an extension of recursion. A hyperincursion is an incursion with multiple solutions. For example, chaos in the Pearl-Verhulst map model x(t+1)=a.x(t).[1-x(t)] is controlled by the following anticipatory incursive model: x(t+1)=a.x(t).[1-x(t+1)], which corresponds to the differential anticipatory equation dx(t)/dt=a.x(t).[1-x(t+1)]-x(t). The main part of this paper deals with the discretisation of differential equation systems of linear and non-linear oscillators. The non-linear oscillator is based on the Lotka-Volterra equations model. The discretisation is made by incursion. The incursive discrete equation system gives the same stability condition as the original differential equations, without numerical instabilities. The linearisation of the incursive discrete non-linear Lotka-Volterra equation system gives rise to the classical harmonic oscillator. The incursive discretisation of the linear oscillator is similar to defining backward and forward discrete derivatives. A generalized complex derivative is then considered and applied to the harmonic oscillator. Non-locality seems to be a property of anticipatory systems. With some mathematical assumptions, the Schrödinger quantum equation is derived for a particle in a uniform potential. Finally a hyperincursive system is given in the case of a neural stack memory.
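
    The incursive control of the Pearl-Verhulst map described above can be made concrete: since x(t+1)=a.x(t).[1-x(t+1)] is linear in the unknown x(t+1), it can be solved algebraically as x(t+1)=a.x(t)/(1+a.x(t)). The following minimal Python sketch (parameter values chosen for illustration only) contrasts the chaotic recursive map with its incursive counterpart:

        # Incursive control of the Pearl-Verhulst (logistic) map.
        # Recursive map:  x(t+1) = a*x(t)*(1 - x(t))     -> chaotic for a = 4
        # Incursive map:  x(t+1) = a*x(t)*(1 - x(t+1))   -> solved for x(t+1):
        #                 x(t+1) = a*x(t) / (1 + a*x(t))

        def recursive_step(x, a):
            return a * x * (1.0 - x)

        def incursive_step(x, a):
            # x(t+1) appears on both sides; solving the implicit equation:
            return a * x / (1.0 + a * x)

        a, x_rec, x_inc = 4.0, 0.3, 0.3
        for t in range(10):
            x_rec = recursive_step(x_rec, a)
            x_inc = incursive_step(x_inc, a)
            print(f"t={t+1:2d}  recursive={x_rec:.6f}  incursive={x_inc:.6f}")
        # The incursive trajectory converges smoothly to the fixed point
        # x* = (a-1)/a = 0.75, with no numerical instability.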

  15. Computer system architecture for laboratory automation

    International Nuclear Information System (INIS)

    Penney, B.K.

    1978-01-01

    This paper describes the various approaches that may be taken to provide computing resources for laboratory automation. Three distinct approaches are identified: the single dedicated small computer, shared use of a larger computer, and a distributed approach in which resources are provided by a number of computers linked together and working in some cooperative way. The significance of the microprocessor in laboratory automation is discussed, and it is shown that it is not simply a cheap replacement for the minicomputer. (Auth.)

  16. Computer systems: What the future holds

    Science.gov (United States)

    Stone, H. S.

    1976-01-01

    Development of computer architecture is discussed in terms of the proliferation of the microprocessor, the utility of the medium-scale computer, and the sheer computational power of the large-scale machine. Changes in applications brought about by ever-lower costs, smaller sizes, and faster switching times are also discussed.

  17. Computer-Aided Facilities Management Systems (CAFM).

    Science.gov (United States)

    Cyros, Kreon L.

    Computer-aided facilities management (CAFM) refers to a collection of software used with increasing frequency by facilities managers. The six major CAFM components are discussed with respect to their usefulness and popularity in facilities management applications: (1) computer-aided design; (2) computer-aided engineering; (3) decision support…

  18. Distributed computing system with dual independent communications paths between computers and employing split tokens

    Science.gov (United States)

    Rasmussen, Robert D. (Inventor); Manning, Robert M. (Inventor); Lewis, Blair F. (Inventor); Bolotin, Gary S. (Inventor); Ward, Richard S. (Inventor)

    1990-01-01

    This is a distributed computing system providing flexible fault tolerance, ease of software design and concurrency specification, and dynamic load balancing. The system comprises a plurality of computers, each having a first input/output interface and a second input/output interface for interfacing to communications networks; each second input/output interface includes a bypass for bypassing the associated computer. A global communications network interconnects the first input/output interfaces, giving each computer the ability to broadcast messages simultaneously to the remainder of the computers. A meshwork communications network interconnects the second input/output interfaces, giving each computer the ability to establish a communications link with another of the computers while bypassing the remainder. Each computer is controlled by a resident copy of a common operating system. Communication between computers is by means of split tokens, each having a moving first portion which is sent from computer to computer and a resident second portion which is held in the memory of at least one of the computers; the location of the second portion is part of the first portion. The split tokens represent both functions to be executed by the computers and data to be employed in the execution of those functions. The first input/output interfaces each include logic for detecting a collision between messages and for terminating the broadcasting of a message, whereby collisions between messages are detected and avoided.
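
    As an illustration of the split-token idea only (the record and field names below are hypothetical, not taken from the patent), a token can be modelled in Python as a small moving record naming the function to execute plus the location of a resident portion that holds the bulk data:

        from dataclasses import dataclass

        # Hypothetical sketch: the moving portion travels between computers,
        # while the resident portion (the data) stays in one computer's
        # memory; the moving portion records where the resident portion lives.

        @dataclass
        class ResidentPortion:
            data: list                # data employed in executing the function

        @dataclass
        class MovingPortion:
            function: str             # function the token represents
            home_node: int            # computer holding the resident portion
            resident_key: str         # lookup key on that computer

        # Per-node memories (node id -> key -> resident portion).
        node_memory = {0: {"job-42": ResidentPortion(data=[1.0, 2.0, 3.0])}, 1: {}}

        def execute(token: MovingPortion) -> float:
            # A receiving node fetches the resident data (here a direct
            # lookup; in the patented system, over the meshwork network)
            # and runs the requested function.
            resident = node_memory[token.home_node][token.resident_key]
            if token.function == "sum":
                return sum(resident.data)
            raise ValueError(f"unknown function {token.function!r}")

        token = MovingPortion(function="sum", home_node=0, resident_key="job-42")
        print(execute(token))         # -> 6.0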

  19. New computing systems, future computing environment, and their implications on structural analysis and design

    Science.gov (United States)

    Noor, Ahmed K.; Housner, Jerrold M.

    1993-01-01

    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.

  20. Computer-based control systems of nuclear power plants

    International Nuclear Information System (INIS)

    Kalashnikov, V.K.; Shugam, R.A.; Ol'shevsky, Yu.N.

    1975-01-01

    Computer-based control systems of nuclear power plants may be classified into those using computers for data acquisition only, those using computers for data acquisition and data processing, and those using computers for process control. In the present paper a brief review is given of the functions these systems perform, their applications in different nuclear power plants, and some of their characteristics. The trend towards hierarchical systems using redundant control computers is already clear from the control systems applied in the Canadian nuclear power plants, which were among the first to be equipped with process computers. The control system now under development for the large Soviet reactors of WWER type will also be based on the use of control computers. That part of the system concerned with controlling the reactor assembly is described in detail.

  1. Integration of process computer systems to Cofrentes NPP

    International Nuclear Information System (INIS)

    Saettone Justo, A.; Pindado Andres, R.; Buedo Jimenez, J.L.; Jimenez Fernandez-Sesma, A.; Delgado Muelas, J.A.

    1997-01-01

    The existence of three different process computer systems at Cofrentes NPP, and the ageing of two of them, have led to the need for their integration into a single real-time computer system, known as the Integrated ERIS-Computer System (SIEC), which covers the functionality of the three systems: the Process Computer (PC), the Emergency Response Information System (ERIS) and the Nuclear Calculation Computer (OCN). The paper describes the integration project, which has essentially consisted of the integration of the PC, ERIS and OCN databases into a single database, the migration of programs from the old process computer onto the new SIEC hardware-software platform, and the installation of a communications program to transmit all necessary data for the OCN programs from the SIEC computer, which in the new configuration is responsible for managing the databases of the whole system. (Author)

  2. Semi-Automatic Measurement of the Airway Dimension by Computed Tomography Using the Full-Width-Half-Maximum Method: a Study of the Measurement Accuracy according to the Orientation of an Artificial Airway

    International Nuclear Information System (INIS)

    Kim, Nam Kug; Seo, Joon Beom; Song, Koun Sik; Chae, Eun Jin; Kang, Suk Ho

    2008-01-01

    To develop an algorithm to measure the dimensions of an airway oriented obliquely on volumetric CT, and to assess the effect of the imaging parameters on the correct measurement of the airway dimensions. An airway phantom with 11 poly-acryl tubes of various lumen diameters and wall thicknesses was scanned using a 16-MDCT (multidetector CT) at various tilt angles (0, 30, 45, and 60 degrees). The CT images were reconstructed at various reconstruction kernels and thicknesses. The axis of each airway was determined using a 3D thinning algorithm, and images perpendicular to the axis were reconstructed. The luminal radius and wall thickness were measured by the full-width-half-maximum method. The influence of the CT parameters, airway size, and obliquity on the measured radius and wall thickness was assessed by comparing the actual dimensions of each tube with the estimated values. The 3D thinning algorithm correctly determined the axis of the oblique airway in all tubes (mean error: 0.91 ± 0.82 degrees). A sharper reconstruction kernel, a thicker image thickness and a larger tilt angle of the airway axis resulted in a significant decrease of the measured wall thickness and an increase of the measured luminal radius. Use of a standard kernel and a 0.75-mm slice thickness resulted in the most accurate measurement of the airway dimensions, independent of obliquity. The airway obliquity and imaging parameters have a strong influence on the accuracy of airway wall measurement. For accurate measurement of airway thickness, the CT images should be reconstructed with a standard kernel and a 0.75 mm slice thickness.
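
    The full-width-half-maximum rule used above places each edge where the intensity profile crosses the midpoint between its minimum and maximum, so a width is the distance between the two half-maximum crossings. A minimal Python sketch on a synthetic one-dimensional profile (not the authors' implementation; the Gaussian test profile is invented for illustration):

        import numpy as np

        def fwhm_width(profile, spacing=1.0):
            """Width of a peak at half maximum, with linear interpolation
            between samples; `spacing` is the pixel size."""
            base, peak = profile.min(), profile.max()
            half = base + 0.5 * (peak - base)
            idx = np.where(profile >= half)[0]
            i0, i1 = idx[0], idx[-1]
            # Interpolate the left crossing between samples i0-1 and i0.
            left = i0 - (profile[i0] - half) / (profile[i0] - profile[i0 - 1])
            # Interpolate the right crossing between samples i1 and i1+1.
            right = i1 + (profile[i1] - half) / (profile[i1] - profile[i1 + 1])
            return (right - left) * spacing

        # Synthetic profile: a Gaussian of known FWHM = 2*sqrt(2*ln 2)*sigma.
        x = np.arange(-50, 51, dtype=float)
        sigma = 8.0
        profile = np.exp(-x**2 / (2 * sigma**2))
        print(fwhm_width(profile))                 # measured, ~18.85
        print(2 * np.sqrt(2 * np.log(2)) * sigma)  # theoretical, ~18.84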

  3. The computer aided education and training system for accident management

    International Nuclear Information System (INIS)

    Yoneyama, Mitsuru; Kubota, Ryuji; Fujiwara, Tadashi; Sakuma, Hitoshi

    1999-01-01

    The education and training system for Accident Management was developed by the Japanese BWR group and Hitachi Ltd. The system is composed of two parts: a computer-aided instruction (CAI) education system, and an education and training system based on computer simulations. Both are designed to be executed on personal computers. The outlines of the CAI education system and the simulator-based education and training system are reported below. These systems provide plant operators and technical support center staff with effective education and training for accident management. (author)

  4. Embedded computer systems for control applications in EBR-II

    International Nuclear Information System (INIS)

    Carlson, R.B.; Start, S.E.

    1993-01-01

    The purpose of this paper is to describe the embedded computer systems approach taken at Experimental Breeder Reactor II (EBR-II) for non-safety-related systems. The hardware and software structures for typical embedded systems are presented. The embedded systems development process is described. Three examples are given which illustrate typical embedded computer applications in EBR-II.

  5. Design of Computer Fault Diagnosis and Troubleshooting System ...

    African Journals Online (AJOL)

    PROF. O. E. OSUAGWU

    2013-12-01

    Dec 1, 2013 ... 2Department of Computer Science. Cross River University ... owners in dealing with their computer problems especially when the time is limited and human expert is not ..... questions with the system responding to each of the ...

  6. Applications of membrane computing in systems and synthetic biology

    CERN Document Server

    Gheorghe, Marian; Pérez-Jiménez, Mario

    2014-01-01

    Membrane Computing was introduced as a computational paradigm in Natural Computing. The models introduced, called Membrane (or P) Systems, provide a coherent platform to describe and study living cells as computational systems. Membrane Systems have been investigated for their computational aspects and employed to model problems in other fields, like: Computer Science, Linguistics, Biology, Economy, Computer Graphics, Robotics, etc. Their inherent parallelism, heterogeneity and intrinsic versatility allow them to model a broad range of processes and phenomena, being also an efficient means to solve and analyze problems in a novel way. Membrane Computing has been used to model biological systems, becoming with time a thorough modeling paradigm comparable, in its modeling and predicting capabilities, to more established models in this area. This book is the result of the need to collect, in an organic way, different facets of this paradigm. The chapters of this book, together with the web pages accompanying th...

  7. Computer system for nuclear power plant parameter display

    International Nuclear Information System (INIS)

    Stritar, A.; Klobuchar, M.

    1990-01-01

    The computer system for efficient, cheap and simple presentation of data on the screen of the personal computer is described. The display is in alphanumerical or graphical form. The system can be used for the man-machine interface in the process monitoring system of the nuclear power plant. It represents the third level of the new process computer system of the Nuclear Power Plant Krsko. (author)

  8. Computers as Components Principles of Embedded Computing System Design

    CERN Document Server

    Wolf, Wayne

    2008-01-01

    This book was the first to bring essential knowledge on embedded systems technology and techniques under a single cover. This second edition has been updated to the state-of-the-art by reworking and expanding performance analysis with more examples and exercises, and coverage of electronic systems now focuses on the latest applications. Researchers, students, and savvy professionals schooled in hardware or software design, will value Wayne Wolf's integrated engineering design approach.The second edition gives a more comprehensive view of multiprocessors including VLIW and superscalar archite

  9. An operating system for future aerospace vehicle computer systems

    Science.gov (United States)

    Foudriat, E. C.; Berman, W. J.; Will, R. W.; Bynum, W. L.

    1984-01-01

    The requirements for future aerospace vehicle computer operating systems are examined in this paper. The computer architecture is assumed to be distributed, with a local area network connecting the nodes. Each node is assumed to provide a specific functionality. The network provides for communication so that the overall tasks of the vehicle are accomplished. The O/S structure is based upon the concept of objects. The mechanisms for integrating node-unique objects with node-common objects in order to implement both autonomy and cooperation between nodes are developed. The requirements for time-critical performance, reliability and recovery are discussed. Time-critical performance impacts all parts of the distributed operating system; e.g., its structure, the functional design of its objects, the language structure, etc. Throughout the paper the tradeoffs - concurrency, language structure, object recovery, binding, file structure, communication protocol, programmer freedom, etc. - are considered to arrive at a feasible, maximum-performance design. Reliability of the network system is considered. A parallel multipath bus structure is proposed for the control of delivery time for time-critical messages. The architecture also supports immediate recovery for the time-critical message system after a communication failure.

  10. Biocellion: accelerating computer simulation of multicellular biological system models.

    Science.gov (United States)

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-11-01

    Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information.

  11. Database Capture of Natural Language Echocardiographic Reports: A Unified Medical Language System Approach

    OpenAIRE

    Canfield, K.; Bray, B.; Huff, S.; Warner, H.

    1989-01-01

    We describe a prototype system for semi-automatic database capture of free-text echocardiography reports. The system is very simple and uses a Unified Medical Language System compatible architecture. We use this system and a large body of texts to create a patient database and develop a comprehensive hierarchical dictionary for echocardiography.

  12. Possible Computer Vision Systems and Automated or Computer-Aided Edging and Trimming

    Science.gov (United States)

    Philip A. Araman

    1990-01-01

    This paper discusses research which is underway to help our industry reduce costs, increase product volume and value recovery, and market more accurately graded and described products. The research is part of a team effort to help the hardwood sawmill industry automate with computer vision systems, and computer-aided or computer controlled processing. This paper...

  13. Computational systems analysis of dopamine metabolism.

    Directory of Open Access Journals (Sweden)

    Zhen Qi

    2008-06-01

    A prominent feature of Parkinson's disease (PD) is the loss of dopamine in the striatum, and many therapeutic interventions for the disease are aimed at restoring dopamine signaling. Dopamine signaling includes the synthesis, storage, release, and recycling of dopamine in the presynaptic terminal and activation of pre- and post-synaptic receptors and various downstream signaling cascades. As an aid that might facilitate our understanding of dopamine dynamics in the pathogenesis and treatment of PD, we have begun to merge currently available information and expert knowledge regarding presynaptic dopamine homeostasis into a computational model, following the guidelines of biochemical systems theory. After subjecting our model to mathematical diagnosis and analysis, we made direct comparisons between model predictions and experimental observations and found that the model exhibited a high degree of predictive capacity with respect to genetic and pharmacological changes in gene expression or function. Our results suggest potential approaches to restoring the dopamine imbalance and the associated generation of oxidative stress. While the proposed model of dopamine metabolism is preliminary, future extensions and refinements may eventually serve as an in silico platform for prescreening potential therapeutics, identifying immediate side effects, screening for biomarkers, and assessing the impact of risk factors of the disease.
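
    Biochemical systems theory, which the model above follows, writes every flux as a product of power laws (the S-system form). Below is a hedged one-pool caricature in Python, in which X merely stands in for presynaptic dopamine; all rate constants and exponents are invented placeholders, not values from the published model:

        # Generic Biochemical Systems Theory (S-system) form for one pool X:
        #   dX/dt = alpha * X^g  -  beta * X^h   (synthesis minus removal)
        # All parameters below are illustrative placeholders.
        alpha, g = 2.0, 0.0    # zeroth-order synthesis (exponent 0)
        beta, h = 0.5, 1.0     # first-order degradation/release

        def dxdt(x):
            return alpha * x**g - beta * x**h

        # Simple Euler integration towards the steady state
        # X* = (alpha/beta)^(1/(h-g)) = 4.0 for these parameters.
        x, dt = 0.1, 0.01
        for step in range(2000):
            x += dt * dxdt(x)
        print(x)   # approaches alpha/beta = 4.0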

  14. Quantum Computing in Fock Space Systems

    Science.gov (United States)

    Berezin, Alexander A.

    1997-04-01

    A Fock space system (FSS) has an unfixed number (N) of particles and/or degrees of freedom. In quantum computing (QC) the main requirement is sustainability of coherent Q-superpositions, which is normally favoured by a low-noise environment. The high-excitation/high-temperature (T) limit is hence discarded as unfeasible for QC. Conversely, if N is itself a quantized variable, the dimensionality of the Hilbert basis for qubits may increase faster (say, N-exponentially) than thermal noise (likely, in powers of N and T). Hence coherency may win over T-randomization. For this type of QC the speed (S) of factorization of long integers (with D digits) may increase with D (for 'ordinary' QC, speed polynomially decreases with D). This (apparent) paradox rests on non-monotonic bijectivity (cf. Georg Cantor's diagonal counting of rational numbers). This brings the entire aleph-null structurality ("Babylonian Library" of the infinite informational content of the integer field) to the superposition determining the state of a quantum analogue of the Turing machine head. The structure of integer infinitude (e.g. the distribution of primes) results in a direct "Platonic pressure" resembling a semi-virtual Casimir effect (pressure of cut-off vibrational modes). This "effect", the embodiment of the Pythagorean "Number is everything", renders the Gödelian barrier arbitrarily thin, and hence FSS-based QC can in principle be unlimitedly efficient (e.g. D/S may tend to zero when D tends to infinity).

  15. Computer aided information system for a PWR

    International Nuclear Information System (INIS)

    Vaidian, T.A.; Karmakar, G.; Rajagopal, R.; Shankar, V.; Patil, R.K.

    1994-01-01

    The computer aided information system (CAIS) is designed with a view to improving the performance of the operator. CAIS assists the plant operator in an advisory and support role, thereby reducing the workload level and potential human errors. The CAIS as explained here has been designed for a PWR of type KLT-40 used in Floating Nuclear Power Stations (FNPS). However, the underlying philosophy evolved in designing the CAIS can be suitably adapted for other types of nuclear power plants too (BWR, PHWR). Operator information is divided into three broad categories: a) continuously available information, b) automatically available information and c) on-demand information. Two touch screens are provided on the main control panel. One is earmarked for continuously available information and the other is dedicated to automatically available information. Both screens can be used at the operator's discretion for on-demand information. The automatically available information screen overrides the on-demand information screens. In addition to the above, CAIS has the features of event sequence recording, disturbance recording and information documentation. The CAIS design ensures that the operator is not overburdened with excess and unnecessary information, but at the same time adequate and well-formatted information is available. (author). 5 refs., 4 figs

  16. Time computations in anuran auditory systems

    Directory of Open Access Journals (Sweden)

    Gary J Rose

    2014-05-01

    Temporal computations are important in the acoustic communication of anurans. In many cases, calls between closely related species are nearly identical spectrally but differ markedly in temporal structure. Depending on the species, calls can differ in pulse duration, shape and/or rate (i.e., amplitude modulation), direction and rate of frequency modulation, and overall call duration. Also, behavioral studies have shown that anurans are able to discriminate between calls that differ in temporal structure. In the peripheral auditory system, temporal information is coded primarily in the spatiotemporal patterns of activity of auditory-nerve fibers. However, major transformations in the representation of temporal information occur in the central auditory system. In this review I summarize recent advances in understanding how temporal information is represented in the anuran midbrain, with particular emphasis on mechanisms that underlie selectivity for pulse duration and pulse rate (i.e., intervals between onsets of successive pulses). Two types of neurons have been identified that show selectivity for pulse rate: long-interval cells respond well to slow pulse rates but fail to spike or respond phasically to fast pulse rates; conversely, interval-counting neurons respond to intermediate or fast pulse rates, but only after a threshold number of pulses, presented at optimal intervals, have occurred. Duration selectivity is manifest as short-pass, band-pass or long-pass tuning. Whole-cell patch recordings, in vivo, suggest that excitation and inhibition are integrated in diverse ways to generate temporal selectivity. In many cases, activity-related enhancement or depression of excitatory or inhibitory processes appears to contribute to selective responses.
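
    The interval-counting behaviour described above is easy to caricature in code: a cell accumulates a count only while successive pulse onsets fall within an optimal interval window, resets on any off-interval, and fires once a threshold count is reached. A toy Python sketch with hypothetical parameter values:

        def interval_counting_neuron(onsets, optimal=(20.0, 40.0), threshold=4):
            """Spike times for a toy interval-counting cell: it spikes when
            `threshold` consecutive inter-pulse intervals (ms) fall inside
            the optimal window; a single off-interval resets the count."""
            lo, hi = optimal
            count, spikes = 0, []
            for prev, cur in zip(onsets, onsets[1:]):
                interval = cur - prev
                count = count + 1 if lo <= interval <= hi else 0
                if count >= threshold:
                    spikes.append(cur)
                    count = 0          # require a fresh run of good intervals
            return spikes

        fast = [i * 30.0 for i in range(10)]   # 30 ms intervals: in window
        slow = [i * 80.0 for i in range(10)]   # 80 ms intervals: out of window
        print(interval_counting_neuron(fast))  # spikes after every 4 intervals
        print(interval_counting_neuron(slow))  # [] -- never reaches threshold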

  17. Computer System Analysis for Decommissioning Management of Nuclear Reactor

    International Nuclear Information System (INIS)

    Nurokhim; Sumarbagiono

    2008-01-01

    Nuclear reactor decommissioning is a complex activity that should be planned and implemented carefully, and a computer-based system needs to be developed to support it. Some computer systems for the management of nuclear power reactor decommissioning have been studied. The software systems COSMARD and DEXUS, developed in Japan, and IDMT, developed in Italy, were used as models for analysis and discussion. It can be concluded that a computer system for nuclear reactor decommissioning management is quite complex, involving computer codes for radioactive inventory database calculation, calculation modules for the stages of the decommissioning phases, and spatial data system development for virtual reality. (author)

  18. Context-aware computing and self-managing systems

    CERN Document Server

    Dargie, Waltenegus

    2009-01-01

    Bringing together an extensively researched area with an emerging research issue, Context-Aware Computing and Self-Managing Systems presents the core contributions of context-aware computing in the development of self-managing systems, including devices, applications, middleware, and networks. The expert contributors reveal the usefulness of context-aware computing in developing autonomous systems that have practical application in the real world.The first chapter of the book identifies features that are common to both context-aware computing and autonomous computing. It offers a basic definit

  19. Modelling, abstraction, and computation in systems biology: A view from computer science.

    Science.gov (United States)

    Melham, Tom

    2013-04-01

    Systems biology is centrally engaged with computational modelling across multiple scales and at many levels of abstraction. Formal modelling, precise and formalised abstraction relationships, and computation also lie at the heart of computer science--and over the past decade a growing number of computer scientists have been bringing their discipline's core intellectual and computational tools to bear on biology in fascinating new ways. This paper explores some of the apparent points of contact between the two fields, in the context of a multi-disciplinary discussion on conceptual foundations of systems biology. Copyright © 2012 Elsevier Ltd. All rights reserved.

  20. Software Systems for High-performance Quantum Computing

    Energy Technology Data Exchange (ETDEWEB)

    Humble, Travis S [ORNL; Britt, Keith A [ORNL

    2016-01-01

    Quantum computing promises new opportunities for solving hard computational problems, but harnessing this novelty requires breakthrough concepts in the design, operation, and application of computing systems. We define some of the challenges facing the development of quantum computing systems, as well as software-based approaches that can be used to overcome these challenges. Following a brief overview of the state of the art, we present quantum programming and execution models, the development of architectures for hybrid high-performance computing systems, and the realization of software stacks for quantum networking. This leads to a discussion of the role that conventional computing plays in the quantum paradigm and how some of the current challenges for exascale computing overlap with those facing quantum computing.

  1. Designing an Assistant System Encouraging Ergonomic Computer Usage

    Directory of Open Access Journals (Sweden)

    Hüseyin GÜRÜLER

    2017-12-01

    Today, people of almost every age group are users of computers and computer-aided systems. Technology makes our lives easier, but it can also threaten our health. In recent years, one of the main causes of the proliferation of conditions such as lower back pain, neck pain, hernia, arthritis, visual disturbances and obesity has been incorrect computer usage, and the widespread use of computers is increasing their incidence. The purpose of this study is to direct computer users to use computers more carefully in terms of ergonomics. The user-interactive system developed for this purpose monitors the user's distance from the screen, calculates the viewing angle and the time spent looking at the screen, and provides an audio or text warning when necessary. It is expected that this system will reduce the health problems caused by frequent computer usage by encouraging individuals to use computers ergonomically.
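
    The warning logic of such an assistant reduces to comparing the measured distance, viewing angle, and continuous screen time against ergonomic thresholds. A minimal Python sketch; the threshold values below are illustrative assumptions, not those of the published system:

        # Toy warning logic for an ergonomic-usage assistant.
        # The thresholds are invented for illustration.
        MIN_DISTANCE_CM = 50     # screen should be roughly arm's length away
        MAX_ANGLE_DEG = 20       # gaze should not tilt far from horizontal
        MAX_SESSION_MIN = 45     # suggest a break after continuous use

        def warnings(distance_cm, look_angle_deg, session_min):
            msgs = []
            if distance_cm < MIN_DISTANCE_CM:
                msgs.append("Move farther from the screen.")
            if abs(look_angle_deg) > MAX_ANGLE_DEG:
                msgs.append("Adjust screen height to reduce neck strain.")
            if session_min > MAX_SESSION_MIN:
                msgs.append("Take a short break.")
            return msgs

        # Example: a user sitting too close for too long.
        for msg in warnings(distance_cm=38, look_angle_deg=5, session_min=62):
            print(msg)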

  2. Computer systems for nuclear installation data control

    International Nuclear Information System (INIS)

    1987-09-01

    The computer programs developed by the Divisao de Instalacoes Nucleares (DIN) of the Brazilian CNEN for data control on nuclear installations in Brazil are presented. The following computer programs are described: control of registered companies; control of industrial sources, irradiators and monitors; control of liable persons; control of industry irregularities; elaboration of credence tests; shielding analysis; and control of waste refuge [pt

  3. modeling workflow management in a distributed computing system

    African Journals Online (AJOL)

    Dr Obe

    communication system, which allows for computerized support. ... Keywords: Distributed computing system; Petri nets; Workflow management. ... A distributed operating system usually ... the questionnaire is returned with invalid data, ...

  4. EBR-II Cover Gas Cleanup System upgrade distributed control and front end computer systems

    International Nuclear Information System (INIS)

    Carlson, R.B.

    1992-01-01

    The Experimental Breeder Reactor II (EBR-II) Cover Gas Cleanup System (CGCS) control system was upgraded in 1991 to improve control and provide a graphical operator interface. The upgrade consisted of a main control computer, a distributed control computer, a front end input/output computer, a main graphics interface terminal, and a remote graphics interface terminal. This paper briefly describes the Cover Gas Cleanup System and the overall control system; gives the reasons behind the computer system structure; and then gives a detailed description of the distributed control computer, the front end computer, and how these computers interact with the main control computer. The descriptions cover both hardware and software.

  5. The hack attack - Increasing computer system awareness of vulnerability threats

    Science.gov (United States)

    Quann, John; Belford, Peter

    1987-01-01

    The paper discusses the electronic vulnerability of computer-based systems supporting NASA Goddard Space Flight Center (GSFC) to unauthorized users. To test the security of the systems and increase security awareness, NYMA, Inc. employed computer 'hackers' to attempt to infiltrate the systems under controlled conditions. Penetration procedures, methods, and descriptions are detailed in the paper. The exercise increased the security consciousness of GSFC management regarding the electronic vulnerability of the systems.

  6. Soft computing trends in nuclear energy system

    International Nuclear Information System (INIS)

    Paramasivan, B.

    2012-01-01

    In spite of the many advancements in the power and energy sector over the last two decades, its ability to deliver quality power with due consideration for planning, coordination, marketing, safety, stability, optimality and reliability remains critical. Though it appears simple from the outside, the internal structure of large-scale power systems is so complex that event management and decision making require formidable preliminary preparation, made harder still in the presence of uncertainties and contingencies. These aspects have attracted several researchers to carry out continued research in this field, and their valued contributions have significantly helped newcomers to understand the evolutionary growth in this sector, from phenomena, tools and methodologies to strategies for ensuring smooth, stable, safe, reliable and economic operation. The use of soft computing would accelerate interaction between the energy and technology research communities, with the aim of fostering unified development in the next generation. Monitoring the mechanical impact of a loose (detached or drifting) part in the reactor coolant system of a nuclear power plant is one of the essential functions for operation and maintenance of the plant. Large data tables are generated during this monitoring process. This data can be 'mined' to reveal latent patterns of interest to operation and maintenance. Rough set theory has been applied successfully to data mining. It can be used in the nuclear power industry and elsewhere to identify classes in datasets, find dependencies in relations and discover rules which are hidden in databases. An important role may be played by nuclear energy, provided that the major safety, waste and proliferation issues affecting current nuclear reactors are satisfactorily addressed. In this respect, a large effort has been under way for some years towards the development of advanced nuclear systems that would use

  7. Development of a computer design system for HVAC

    International Nuclear Information System (INIS)

    Miyazaki, Y.; Yotsuya, M.; Hasegawa, M.

    1993-01-01

    The development of a computer design system for HVAC (Heating, Ventilating and Air Conditioning) is presented in this paper. It supports the air conditioning design for a nuclear power plant and a reprocessing plant. This system integrates various computer design systems which were developed separately for the various design phases of HVAC. The purposes include centralizing the HVAC data, optimizing design, and reducing the design time. The centralized HVAC data are managed by a DBMS (Data Base Management System). The DBMS separates the computer design system into a calculation module and the data. The design system can thus be expanded easily in the future. 2 figs

  8. Application of computational intelligence in emerging power systems

    African Journals Online (AJOL)

    ... in electrical engineering applications. This paper highlights the application of computational intelligence methods to power system problems. Various types of CI methods, which are widely used in power systems, are also discussed in brief. Keywords: Power systems, computational intelligence, artificial intelligence.

  9. Operating System Concepts for Reconfigurable Computing: Review and Survey

    OpenAIRE

    Marcel Eckert; Dominik Meyer; Jan Haase; Bernd Klauer

    2016-01-01

    One of the key future challenges for reconfigurable computing is to enable higher design productivity and an easier way to use reconfigurable computing systems for users who are unfamiliar with the underlying concepts. One way of doing this is to provide standardization and abstraction, usually supported and enforced by an operating system. This article gives a historical review and a summary of ideas and key concepts for including reconfigurable computing aspects in operating systems. The arti...

  10. Development of a computer-aided digital reactivity computer system for PWRs

    International Nuclear Information System (INIS)

    Chung, S.-K.; Sung, K.-Y.; Kim, D.; Cho, D.-Y.

    1993-01-01

    Reactor physics tests at initial startup and after reloading are performed to verify the nuclear design and to ensure safe operation. Two kinds of reactivity computers, analog and digital, have been widely used in pressurized water reactor (PWR) core physics tests. The test data of both reactivity computers are displayed only on a strip chart recorder, and these data are managed by hand, so the accuracy of the test results depends on operator expertise and experience. This paper describes the development of the computer-aided digital reactivity computer system (DRCS), which is enhanced by system management software and an improved system for the application of the PWR core physics test.
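
    Although the abstract does not spell out the algorithm, digital reactivity computers conventionally solve the inverse point kinetics equations: the delayed-neutron precursor concentrations C_i are integrated from the measured neutron signal n(t), and the reactivity follows as rho(t) = beta + Lambda*(dn/dt)/n - (Lambda/n)*sum_i lambda_i*C_i. A minimal one-group Python sketch with illustrative constants (real systems use six delayed groups and calibrated plant parameters):

        # Hedged sketch of inverse point kinetics (one delayed group for
        # brevity; the parameter values are illustrative only).
        beta, lam, Lambda = 0.0065, 0.08, 2.0e-5
        dt = 0.01

        def reactivity(n_trace):
            """Reactivity history (delta-k/k) from a sampled
            neutron-population trace n(t), via inverse point kinetics."""
            # Start the precursor at equilibrium with the initial level.
            C = beta * n_trace[0] / (Lambda * lam)
            rho = []
            for n_prev, n in zip(n_trace, n_trace[1:]):
                dndt = (n - n_prev) / dt
                C += dt * (beta * n / Lambda - lam * C)   # precursor update
                rho.append(beta + Lambda * dndt / n - Lambda * lam * C / n)
            return rho

        # A steady power trace should read ~zero reactivity.
        steady = [1.0] * 1000
        print(reactivity(steady)[-1])    # ~0.0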

  11. Computational Models for Nonlinear Aeroelastic Systems, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Clear Science Corp. and Duke University propose to develop and demonstrate new and efficient computational methods of modeling nonlinear aeroelastic systems. The...

  12. Distributed Computations Environment Protection Using Artificial Immune Systems

    Directory of Open Access Journals (Sweden)

    A. V. Moiseev

    2011-12-01

    In this article the authors describe the possibility of applying artificial immune systems to the protection of distributed computing environments from certain types of malicious impacts.

  13. Simulation model of load balancing in distributed computing systems

    Science.gov (United States)

    Botygin, I. A.; Popov, V. N.; Frolov, S. G.

    2017-02-01

    The availability of high-performance computing, high-speed data transfer over networks, and the widespread use of software for design and pre-production in mechanical engineering have led large industrial enterprises and small engineering companies to implement complex computer systems for the efficient solution of production and management tasks. Such computer systems are generally built on the basis of distributed heterogeneous computer systems. The analytical problems solved by such systems are the key subjects of research, but the system-wide problems of efficient distribution (balancing) of the computational load and the placement of input, intermediate and output databases are no less important. The main tasks of this balancing system are load and condition monitoring of the compute nodes, and the selection of a node to which a user's request is routed in accordance with a predetermined algorithm, as sketched below. Load balancing is one of the most widely used methods of increasing the productivity of distributed computing systems through the optimal allocation of tasks between the computer system nodes. Therefore, the development of methods and algorithms for computing optimal schedules in a distributed system that dynamically changes its infrastructure is an important task.
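
    The node-selection step described above can be as simple as routing each request to the currently least-loaded node. A minimal Python simulation sketch; the node count and request-cost distribution are invented for illustration:

        import random

        # Minimal least-loaded dispatch simulation: each incoming request
        # is routed to the node with the smallest current load, which the
        # balancer learns from monitoring.
        random.seed(42)
        NUM_NODES = 4
        loads = [0.0] * NUM_NODES        # current load per compute node

        def dispatch(work):
            """Pick the least-loaded node (the predetermined algorithm
            here), assign the work to it, and return the chosen node id."""
            node = min(range(NUM_NODES), key=lambda i: loads[i])
            loads[node] += work
            return node

        for _ in range(20):
            dispatch(random.uniform(0.5, 2.0))    # requests of varying cost
        print([round(l, 2) for l in loads])       # loads end up roughly equal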

  14. Operators manual for a computer controlled impedance measurement system

    Science.gov (United States)

    Gordon, J.

    1987-02-01

    Operating instructions for a computer-controlled impedance measurement system based on Hewlett-Packard instrumentation are given. Hardware details, program listings, flowcharts and a practical application are included.

  15. Evolutionary Computing for Intelligent Power System Optimization and Control

    DEFF Research Database (Denmark)

    This new book focuses on how evolutionary computing techniques benefit engineering research and development tasks by converting practical problems of growing complexity into simple formulations, thus largely reducing development effort. The book begins with an overview of optimization theory and modern evolutionary computing techniques, and goes on to cover specific applications of evolutionary computing to power system optimization and control problems.

  16. Top 10 Threats to Computer Systems Include Professors and Students

    Science.gov (United States)

    Young, Jeffrey R.

    2008-01-01

    User awareness is growing in importance when it comes to computer security. Not long ago, keeping college networks safe from cyberattackers mainly involved making sure computers around campus had the latest software patches. New computer worms or viruses would pop up, taking advantage of some digital hole in the Windows operating system or in…

  17. Computational methods in power system analysis

    CERN Document Server

    Idema, Reijer

    2014-01-01

    This book treats state-of-the-art computational methods for power flow studies and contingency analysis. In the first part the authors present the relevant computational methods and mathematical concepts. In the second part, power flow and contingency analysis are treated. Furthermore, traditional methods to solve such problems are compared to modern solvers, developed using the knowledge of the first part of the book. Finally, these solvers are analyzed both theoretically and experimentally, clearly showing the benefits of the modern approach.

  18. Fusion energy division computer systems network

    International Nuclear Information System (INIS)

    Hammons, C.E.

    1980-12-01

    The Fusion Energy Division of the Oak Ridge National Laboratory (ORNL) operated by Union Carbide Corporation Nuclear Division (UCC-ND) is primarily involved in the investigation of problems related to the use of controlled thermonuclear fusion as an energy source. The Fusion Energy Division supports investigations of experimental fusion devices and related fusion theory. This memo provides a brief overview of the computing environment in the Fusion Energy Division and the computing support provided to the experimental effort and theory research

  19. Distributed computing environments for future space control systems

    Science.gov (United States)

    Viallefont, Pierre

    1993-01-01

    The aim of this paper is to present the results of a CNES research project on distributed computing systems. The purpose of this research was to study the impact of the use of new computer technologies in the design and development of future space applications. The first part of this study was a state-of-the-art review of distributed computing systems. One of the interesting ideas arising from this review is the concept of a 'virtual computer' allowing the distributed hardware architecture to be hidden from a software application. The 'virtual computer' can improve system performance by adapting the best architecture (addition of computers) to the software application without having to modify its source code. This concept can also decrease the cost and obsolescence of the hardware architecture. In order to verify the feasibility of the 'virtual computer' concept, a prototype representative of a distributed space application is being developed independently of the hardware architecture.

  20. Functional requirements for gas characterization system computer software

    International Nuclear Information System (INIS)

    Tate, D.D.

    1996-01-01

    This document provides the Functional Requirements for the computer software operating the Gas Characterization System (GCS), which monitors the combustible gases in the vapor space of selected tanks. The necessary computer functions are defined to support design, testing, operation, and change control. The GCS requires several individual computers to address the control and data acquisition functions of instruments and sensors. These computers are networked for communication, and must multitask to accommodate operation in parallel.

  1. Use of computer codes for system reliability analysis

    International Nuclear Information System (INIS)

    Sabek, M.; Gaafar, M.; Poucet, A.

    1988-01-01

    This paper gives a collective summary of the studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: the CAFTS-SALP software package, FRANTIC, FTAP, the computer code package RALLY, and the BOUNDS codes. Two reference study cases were executed by each code. The logic/probabilistic results obtained, as well as the computation times, are compared.

  2. Cyber Security on Nuclear Power Plant's Computer Systems

    International Nuclear Information System (INIS)

    Shin, Ick Hyun

    2010-01-01

    Computer systems are used in many different fields of industry, and most of us take great advantage of them. Because of the effectiveness and performance of computer systems, we have become highly dependent on them. But the more dependent we are on a computer system, the greater the risk we face when that system is unavailable, inaccessible or uncontrollable. SCADA (Supervisory Control And Data Acquisition) systems are broadly used for critical infrastructure such as transportation, electricity and water management, and if a SCADA system is vulnerable to cyber attack, the result could be a national disaster. In particular, if a nuclear power plant's main control systems were attacked by cyber terrorists, the consequences could be enormous: leaking radioactive material could be the terrorists' main purpose, achieved without the use of physical force. In this paper, different types of cyber attacks are described, and a possible structure for an NPP's computer network system is presented. The paper also discusses possible ways the NPP's computer system could be compromised, along with some suggestions for protection against cyber attacks.

  3. Realization of the computation process in the M-6000 computer for physical process automatization systems basing on CAMAC system

    International Nuclear Information System (INIS)

    Antonichev, G.M.; Vesenev, V.A.; Volkov, A.S.; Maslov, V.V.; Shilkin, I.P.; Bespalova, T.V.; Golutvin, I.A.; Nevskaya, N.A.

    1977-01-01

    Software for physical experiments using CAMAC devices and the M-6000 computer is further developed. The construction principles and operation of the data acquisition system and the system generator are described. Using the generator for the data acquisition system, the experimenter implements the logic for data exchange between the CAMAC devices and the computer.

  4. Total reduction of distorted echelle spectrograms - An automatic procedure. [for computer controlled microdensitometer

    Science.gov (United States)

    Peterson, R. C.; Title, A. M.

    1975-01-01

    A total reduction procedure, notable for its use of a computer-controlled microdensitometer for semi-automatically tracing curved spectra, is applied to distorted high-dispersion echelle spectra recorded by an image tube. Microdensitometer specifications are presented and the FORTRAN, TRACEN and SPOTS programs are outlined. The intensity spectrum of the photographic or electrographic plate is plotted on a graphic display. The time requirements are discussed in detail.

  5. Safety Metrics for Human-Computer Controlled Systems

    Science.gov (United States)

    Leveson, Nancy G; Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of the increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis proposes a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model is developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  6. Computer Generated Hologram System for Wavefront Measurement System Calibration

    Science.gov (United States)

    Olczak, Gene

    2011-01-01

    Computer Generated Holograms (CGHs) have been used for some time to calibrate interferometers that require nulling optics. A typical scenario is the testing of aspheric surfaces with an interferometer placed near the paraxial center of curvature. Existing CGH technology suffers from a reduced capacity to calibrate middle and high spatial frequencies. The root cause of this shortcoming is as follows: the CGH is not placed at an image conjugate of the asphere due to limitations imposed by the geometry of the test and the allowable size of the CGH. This innovation provides a calibration system where the imaging properties in calibration can be made comparable to the test configuration. Thus, if the test is designed to have good imaging properties, then middle and high spatial frequency errors in the test system can be well calibrated. The improved imaging properties are provided by a rudimentary auxiliary optic as part of the calibration system. The auxiliary optic is simple to characterize and align to the CGH. Use of the auxiliary optic also reduces the size of the CGH required for calibration and the density of the lines required for the CGH. The resulting CGH is less expensive than the existing technology and has reduced write error and alignment error sensitivities. This CGH system is suitable for any kind of calibration using an interferometer when high spatial resolution is required. It is especially well suited for tests that include segmented optical components or large apertures.

  7. Computer security of NPP instrumentation and control systems: categorization

    International Nuclear Information System (INIS)

    Klevtsov, A.L.; Simonov, A.A.; Trubchaninov, S.A.

    2016-01-01

    The paper is devoted to the categorization of NPP instrumentation and control (I&C) systems from the point of view of computer security, and to consideration of the computer security levels and zones used by the International Atomic Energy Agency (IAEA). The paper also describes the computer security degrees and zones regulated by the International Electrotechnical Commission (IEC) standard, and presents the computer security categorization of systems used by the U.S. Nuclear Regulatory Commission (NRC). The experts analyzed the main differences in I&C system computer security categorization accepted by the IAEA, IEC and U.S. NRC. Approaches to categorization that would be advisable for use in Ukraine during the development of regulations on NPP I&C system computer security are proposed in the paper.

  8. National electronic medical records integration on cloud computing system.

    Science.gov (United States)

    Mirza, Hebah; El-Masri, Samir

    2013-01-01

    Few healthcare providers have an advanced level of Electronic Medical Record (EMR) adoption; others have a low level, and most have no EMR at all. Cloud computing is a new and emerging technology that has been used in other industries with great success. Despite its attractive features, cloud computing has not yet been fairly utilized in the healthcare industry. This study presents an innovative healthcare cloud computing system for integrating the Electronic Health Record (EHR). The proposed cloud system applies cloud computing technology to the EHR system, to present a comprehensive integrated EHR environment.

  9. Use of computer codes for system reliability analysis

    International Nuclear Information System (INIS)

    Sabek, M.; Gaafar, M.; Poucet, A.

    1989-01-01

    This paper gives a summary of studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: the CAFTS-SALP software package, FRACTIC, FTAP, the computer code package RALLY, and BOUNDS. Two reference case studies were executed by each code. The probabilistic results obtained, as well as the computation times, are compared. The two cases studied are the auxiliary feedwater system of a 1300 MW PWR reactor and the emergency electrical power supply system. (author)

  10. Use of computer codes for system reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sabek, M.; Gaafar, M. (Nuclear Regulatory and Safety Centre, Atomic Energy Authority, Cairo (Egypt)); Poucet, A. (Commission of the European Communities, Ispra (Italy). Joint Research Centre)

    1989-01-01

    This paper gives a summary of studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: the CAFTS-SALP software package, FRACTIC, FTAP, the computer code package RALLY, and BOUNDS. Two reference case studies were executed by each code. The probabilistic results obtained, as well as the computation times, are compared. The two cases studied are the auxiliary feedwater system of a 1300 MW PWR reactor and the emergency electrical power supply system. (author).

  11. The Cc1 Project – System For Private Cloud Computing

    Directory of Open Access Journals (Sweden)

    J Chwastowski

    2012-01-01

    The main features of the Cloud Computing system developed at IFJ PAN are described. The project is financed from structural resources provided by the European Commission and the Polish Ministry of Science and Higher Education (Innovative Economy, National Cohesion Strategy). The system delivers a solution for carrying out computer calculations on a private cloud computing infrastructure. It consists of an intuitive web-based user interface, a module for user and resource administration, and an implementation of the standard EC2 interface. Thanks to its distributed character, the system allows a federation of geographically distant computer clusters to be integrated within a uniform user environment.
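    Because the system implements the standard EC2 interface, stock EC2 tooling should be able to drive it. The sketch below points an ordinary boto3 client at a private endpoint; the endpoint URL, credentials, and image ID are placeholders, not CC1's actual values.

```python
# Driving an EC2-compatible private cloud with standard tooling (boto3).
# Endpoint, credentials, and image ID below are illustrative placeholders.
import boto3

ec2 = boto3.client(
    "ec2",
    endpoint_url="https://cc1.example.org/ec2",  # assumed private endpoint
    aws_access_key_id="USER_KEY",
    aws_secret_access_key="USER_SECRET",
    region_name="private",
)

resp = ec2.run_instances(ImageId="ami-00000001", MinCount=1, MaxCount=1)
print(resp["Instances"][0]["InstanceId"])
```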

  12. Mechanisms of protection of information in computer networks and systems

    Directory of Open Access Journals (Sweden)

    Sergey Petrovich Evseev

    2011-10-01

    Protocols of information protection in computer networks and systems are investigated. The basic types of security threats arising from the use of computer networks are classified. The basic mechanisms, services and variants of realization of cryptosystems for maintaining authentication, integrity and confidentiality of transmitted information are examined, and their advantages and drawbacks are described. Perspective directions for the development of cryptographic transformations for maintaining information protection in computer networks and systems are defined and analyzed.
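    A minimal sketch of the three services named (confidentiality, integrity, authentication) as they are commonly realized today: an AEAD cipher such as AES-GCM provides all three in a shared-key setting. The key, nonce, and message below are illustrative; the article surveys mechanisms generally rather than this specific construction.

```python
# Confidentiality + integrity + (shared-key) authentication via AES-GCM,
# using the widely available `cryptography` package. Values are illustrative.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)
aead = AESGCM(key)

nonce = os.urandom(12)        # must be unique per message under one key
plaintext = b"meter reading: 42"
aad = b"node-17"              # authenticated but not encrypted

ciphertext = aead.encrypt(nonce, plaintext, aad)
# decrypt() verifies the authentication tag; tampering raises InvalidTag.
assert aead.decrypt(nonce, ciphertext, aad) == plaintext
```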

  13. Software Applications on the Peregrine System | High-Performance Computing

    Science.gov (United States)

    The Peregrine system's application catalog lists each package with its category and a short description. The entries recoverable from this record are:

    - General Algebraic Modeling System (GAMS): statistics and analysis; high-level modeling system for mathematical programming
    - Gurobi Optimizer: statistics and analysis; solver for mathematical programming
    - LAMMPS: chemistry; [description truncated; the record also carries a fragment reading "...reactivities, and vibrational, electronic and NMR spectra"]
    - R Statistical Computing Environment: statistics and analysis [description truncated]

  14. Surveillance Analysis Computer System (SACS) software requirements specification (SRS)

    International Nuclear Information System (INIS)

    Glasscock, J.A.; Flanagan, M.J.

    1995-09-01

    This document is the primary document establishing requirements for the Surveillance Analysis Computer System (SACS) Database, an Impact Level 3Q system. Its purpose is to provide the customer and the performing organization with the requirements for the SACS project.

  15. 10 CFR 35.457 - Therapy-related computer systems.

    Science.gov (United States)

    2010-01-01

    10 CFR 35.457, Therapy-related computer systems (Title 10, Energy; Nuclear Regulatory Commission; Medical Use of Byproduct Material; Manual Brachytherapy): The licensee shall perform acceptance testing on the treatment planning...

  16. Automatic design of optical systems by digital computer

    Science.gov (United States)

    Casad, T. A.; Schmidt, L. F.

    1967-01-01

    Computer program uses geometrical-optics techniques and a least-squares optimization method, implemented on digital computing equipment, for the automatic design of optical systems. It evaluates changes in various optical parameters, provides comprehensive ray tracing, and generally determines the acceptability of the optical system characteristics.
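    The damped least-squares step that such design programs iterate can be sketched in a few lines: adjust construction parameters x to shrink a merit vector of residuals r(x) via delta_x = (J^T J + lambda I)^{-1} J^T r. The residual model below is a toy stand-in for a real ray trace, included only to show the optimization mechanics.

```python
# Damped least-squares (Levenberg-Marquardt style) iteration for a toy merit
# function; in a real lens-design code, residuals() would be a ray trace.
import numpy as np

def residuals(x):
    # Hypothetical "aberration" targets, invented for illustration.
    return np.array([x[0] ** 2 + x[1] - 1.0, x[0] - 0.3 * x[1] ** 2])

def jacobian(x, h=1e-6):
    # Forward-difference Jacobian: column j holds d(residuals)/d(x_j).
    r0 = residuals(x)
    return np.column_stack([(residuals(x + h * e) - r0) / h
                            for e in np.eye(len(x))])

x, damping = np.array([1.0, 1.0]), 1e-2
for _ in range(20):
    r, J = residuals(x), jacobian(x)
    x -= np.linalg.solve(J.T @ J + damping * np.eye(len(x)), J.T @ r)

print("parameters:", x, "residuals:", residuals(x))
```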

  17. Entrepreneurial Health Informatics for Computer Science and Information Systems Students

    Science.gov (United States)

    Lawler, James; Joseph, Anthony; Narula, Stuti

    2014-01-01

    Corporate entrepreneurship is a critical area of curricula for computer science and information systems students. Few institutions of computer science and information systems, however, include entrepreneurship in their curricula. This paper presents entrepreneurial health informatics as a course in a concentration of Technology Entrepreneurship at a…

  18. Software For Computer-Aided Design Of Control Systems

    Science.gov (United States)

    Wette, Matthew

    1994-01-01

    Computer Aided Engineering System (CAESY) software developed to provide means to evaluate methods for dealing with users' needs in computer-aided design of control systems. Interpreter program for performing engineering calculations. Incorporates features of both Ada and MATLAB. Designed to be flexible and powerful. Includes internally defined functions and procedures, and provides for definition of functions and procedures by user. Written in C language.
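    A toy sketch of the interpreter pattern the brief describes: internally defined functions plus user-registered ones. This illustrates the concept only; it is not CAESY's actual language, syntax, or API.

```python
# Minimal extensible calculator: built-in functions plus user definitions.
import math

env = {"sin": math.sin, "cos": math.cos, "pi": math.pi}  # internal functions

def define(name, fn):
    # User-level extension point, analogous to user-defined procedures.
    env[name] = fn

def evaluate(expr: str):
    # eval() against a restricted namespace stands in for a real parser.
    return eval(expr, {"__builtins__": {}}, env)

define("db", lambda x: 20 * math.log10(x))   # user-defined gain in decibels
print(evaluate("db(10) + sin(pi / 2)"))      # -> 21.0
```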

  19. Quantum Accelerators for High-performance Computing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Humble, Travis S. [ORNL; Britt, Keith A. [ORNL; Mohiyaddin, Fahd A. [ORNL

    2017-11-01

    We define some of the programming and system-level challenges facing the application of quantum processing to high-performance computing. Alongside barriers to physical integration, prominent differences in the execution of quantum and conventional programs challenge the intersection of these computational models. Following a brief overview of the state of the art, we discuss recent advances in programming and execution models for hybrid quantum-classical computing. We then present a novel quantum-accelerator framework that uses specialized kernels to offload select workloads while integrating with existing computing infrastructure. We elaborate on the role of the host operating system in managing these unique accelerator resources, the prospects for deploying quantum modules, and the requirements placed on the language hierarchy connecting these different system components. We draw on recent advances in the modeling and simulation of quantum computing systems with the development of architectures for hybrid high-performance computing systems and the realization of software stacks for controlling quantum devices. Finally, we present simulation results that describe the expected system-level behavior of high-performance computing systems composed of compute nodes with quantum processing units. We describe performance for these hybrid systems in terms of time-to-solution, accuracy, and energy consumption, and we use simple application examples to estimate the performance advantage of quantum acceleration.
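    The offload pattern described, in which the host dispatches selected kernels to a quantum accelerator and runs everything else classically, can be sketched without any real quantum SDK. The QPU interface below is a stub; every name in it is an assumption made for illustration.

```python
# Host-side dispatch for a hypothetical quantum accelerator: offload a
# sampling kernel to the QPU when present, fall back to a classical path.
import random

def qpu_available() -> bool:
    return False  # stand-in for a resource query to the host OS

def qpu_sample(n_qubits: int, shots: int) -> list[int]:
    raise RuntimeError("no QPU attached")  # stub for a real device driver

def classical_sample(n_qubits: int, shots: int) -> list[int]:
    # Classical simulation stand-in for the offloaded kernel.
    return [random.getrandbits(n_qubits) for _ in range(shots)]

def run_kernel(n_qubits: int, shots: int) -> list[int]:
    backend = qpu_sample if qpu_available() else classical_sample
    return backend(n_qubits, shots)

print(run_kernel(n_qubits=4, shots=5))
```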

  20. Cloud Computing Based E-Learning System

    Science.gov (United States)

    Al-Zoube, Mohammed; El-Seoud, Samir Abou; Wyne, Mudasser F.

    2010-01-01

    Cloud computing technologies, although in their early stages, have managed to change the way applications are developed and accessed. These technologies are aimed at running applications as services over the Internet on a flexible infrastructure. Microsoft office applications, such as word processing, excel spreadsheet, access database…