WorldWideScience

Sample records for imaging-versus computed tomography-based

  1. Comparison of Combined X-Ray Radiography and Magnetic Resonance (XMR) Imaging-Versus Computed Tomography-Based Dosimetry for the Evaluation of Permanent Prostate Brachytherapy Implants

    International Nuclear Information System (INIS)

    Acher, Peter; Rhode, Kawal; Morris, Stephen; Gaya, Andrew; Miquel, Marc; Popert, Rick; Tham, Ivan; Nichol, Janette; McLeish, Kate; Deehan, Charles; Dasgupta, Prokar; Beaney, Ronald; Keevil, Stephen F.

    2008-01-01

    Purpose: To present a method for the dosimetric analysis of permanent prostate brachytherapy implants using a combination of stereoscopic X-ray radiography and magnetic resonance (MR) imaging (XMR) in an XMR facility, and to compare the clinical results between XMR- and computed tomography (CT)-based dosimetry. Methods and Materials: Patients who had received nonstranded iodine-125 permanent prostate brachytherapy implants underwent XMR and CT imaging 4 weeks later. Four observers outlined the prostate gland on both sets of images. Dose-volume histograms (DVHs) were derived, and agreement was compared among the observers and between the modalities. Results: A total of 30 patients were evaluated. Inherent XMR registration based on prior calibration and optical tracking required a further automatic seed registration step that revealed a median root mean square registration error of 4.2 mm (range, 1.6-11.4). The observers agreed significantly more closely on prostate base and apex positions as well as outlining contours on the MR images than on those from CT. Coefficients of variation were significantly higher for observed prostate volumes, D90, and V100 parameters on CT-based dosimetry as opposed to XMR. The XMR-based dosimetry showed little agreement with that from CT for all observers, with D90 95% limits of agreement ranges of 65, 118, 79, and 73 Gy for Observers 1, 2, 3, and 4, respectively. Conclusions: The study results showed that XMR-based dosimetry offers an alternative to other imaging modalities and registration methods with the advantages of MR-based prostate delineation and confident three-dimensional reconstruction of the implant. The XMR-derived dose-volume histograms differ from the CT-derived values and demonstrate less interobserver variability
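    The "95% limits of agreement" quoted for D90 are Bland-Altman statistics on the paired XMR- and CT-based estimates. A minimal sketch of how such limits are obtained from paired values follows; the function and variable names are illustrative, not the authors' code.

    ```python
    import numpy as np

    def limits_of_agreement(d90_xmr, d90_ct):
        """Bland-Altman 95% limits of agreement between paired D90 estimates (Gy)."""
        diff = np.asarray(d90_xmr, float) - np.asarray(d90_ct, float)
        bias = diff.mean()        # mean difference between modalities
        sd = diff.std(ddof=1)     # SD of the differences
        return bias - 1.96 * sd, bias + 1.96 * sd
    ```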

  2. Computed tomography-based subclassification of chronic obstructive pulmonary disease

    DEFF Research Database (Denmark)

    Dirksen, Asger; Wille, Mathilde M W

    2016-01-01

    Computed tomography (CT) is an obvious modality for subclassification of COPD. Traditionally, the pulmonary involvement of chronic obstructive pulmonary disease (COPD) in smokers is understood as a combination of deleterious effects of smoking on small airways (chronic bronchitis and small airways...... observed in COPD are subtle. Furthermore, recent results indicate that emphysema may also be the essential pathophysiologic mechanism behind the airflow limitation of COPD. The definition of COPD excludes bronchiectasis as a symptomatic subtype of COPD, and CT findings in chronic bronchitis...... and exacerbations of COPD are rather unspecific. This leaves emphysema as the most obvious candidate for subclassification of COPD. Both chest radiologists and pulmonary physicians are quite familiar with the appearance of various patterns of emphysema on HRCT, such as centrilobular, panlobular, and paraseptal...

  3. Diagnostic accuracy of magnetic resonance imaging versus computed tomography in stress fractures of the lumbar spine

    International Nuclear Information System (INIS)

    Ganiyusufoglu, A.K.; Onat, L.; Karatoprak, O.; Enercan, M.; Hamzaoglu, A.

    2010-01-01

    Aim: To compare the diagnostic accuracy of magnetic resonance imaging (MRI) with computed tomography (CT) in stress fractures of the lumbar spine. Materials and methods: Radiological and clinical data from 57 adolescents and young adults with a diagnosis of stress injury of the lumbar spine were retrospectively reviewed. All cases had undergone both 1.5 T MRI and 16-section CT examinations. All MRI and CT images were retrospectively reviewed and evaluated in separate sessions. The fracture morphology (complete/incomplete, localization) and vertebral levels were noted at both the CT and MRI examinations. Bone marrow/peri-osseous soft-tissue oedema was also determined at MRI. Results: In total, 73 complete and 32 incomplete stress fractures were detected with CT. Sixty-seven complete, 24 incomplete fractures and eight stress reactions were detected using MRI in the same study group. Marrow oedema was also seen in eight of the complete and 20 of the incomplete fractures. The specificity, sensitivity, and accuracy of MRI in detecting fracture lines were 99.6, 86.7, and 97.2%, respectively. MRI was more accurate at the lower lumbar levels in comparison to upper lumbar levels. Conclusion: MRI has a similar diagnostic accuracy to CT in determining complete fractures with or without accompanying marrow oedema and incomplete fractures with accompanying marrow oedema, especially at the lower lumbar levels, which constitutes 94% of all fractures. At upper lumbar levels and in the incomplete fractures of the pars interarticularis with marked surrounding sclerosis, MRI has apparent limitations compared to CT imaging.
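    The quoted sensitivity, specificity, and accuracy follow directly from per-lesion true/false positive and negative counts against the reference standard. The helper below is a generic sketch of those definitions, not code from the study.

    ```python
    def diagnostic_metrics(tp, fp, tn, fn):
        """Sensitivity, specificity, and accuracy from confusion-matrix counts."""
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        accuracy = (tp + tn) / (tp + fp + tn + fn)
        return sensitivity, specificity, accuracy
    ```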

  4. Diagnostic accuracy of magnetic resonance imaging versus computed tomography in stress fractures of the lumbar spine

    Energy Technology Data Exchange (ETDEWEB)

    Ganiyusufoglu, A.K., E-mail: kursady33@yahoo.co [Department of Radiology, Florence Nightingale Hospital, Istanbul (Turkey); Onat, L. [Department of Radiology, Florence Nightingale Hospital, Istanbul (Turkey); Karatoprak, O.; Enercan, M.; Hamzaoglu, A. [Department of Orthopedics and Traumatology, Florence Nightingale Hospital, Istanbul (Turkey)

    2010-11-15

    Aim: To compare the diagnostic accuracy of magnetic resonance imaging (MRI) with computed tomography (CT) in stress fractures of the lumbar spine. Materials and methods: Radiological and clinical data from 57 adolescents and young adults with a diagnosis of stress injury of the lumbar spine were retrospectively reviewed. All cases had undergone both 1.5 T MRI and 16-section CT examinations. All MRI and CT images were retrospectively reviewed and evaluated in separate sessions. The fracture morphology (complete/incomplete, localization) and vertebral levels were noted at both the CT and MRI examinations. Bone marrow/peri-osseous soft-tissue oedema was also determined at MRI. Results: In total, 73 complete and 32 incomplete stress fractures were detected with CT. Sixty-seven complete, 24 incomplete fractures and eight stress reactions were detected using MRI in the same study group. Marrow oedema was also seen in eight of the complete and 20 of the incomplete fractures. The specificity, sensitivity, and accuracy of MRI in detecting fracture lines were 99.6, 86.7, and 97.2%, respectively. MRI was more accurate at the lower lumbar levels in comparison to upper lumbar levels. Conclusion: MRI has a similar diagnostic accuracy to CT in determining complete fractures with or without accompanying marrow oedema and incomplete fractures with accompanying marrow oedema, especially at the lower lumbar levels, which constitutes 94% of all fractures. At upper lumbar levels and in the incomplete fractures of the pars interarticularis with marked surrounding sclerosis, MRI has apparent limitations compared to CT imaging.

  5. Slice image pretreatment for cone-beam computed tomography based on adaptive filter

    International Nuclear Information System (INIS)

    Huang Kuidong; Zhang Dinghua; Jin Yanfang

    2009-01-01

    According to the noise properties and the serial slice image characteristics of a Cone-Beam Computed Tomography (CBCT) system, a slice image pretreatment for CBCT based on an adaptive filter was proposed. A judging criterion for the noise type is established first, and all pixels are classified into two classes: an adaptive center weighted modified trimmed mean (ACWMTM) filter is used for pixels corrupted by Gaussian noise, and an adaptive median (AM) filter is used for pixels corrupted by impulse noise. In the ACWMTM filtering algorithm, the Gaussian noise standard deviation estimated in the current slice image with an offset window is replaced by the standard deviation estimated in the corresponding window of the adjacent slice image, so the filtering accuracy for serial images is improved. A pretreatment experiment on CBCT slice images of a hollow turbine blade wax model shows that the method performs well both in eliminating noise and in preserving detail. (authors)
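    The pixel-wise choice between a mean-type filter for Gaussian noise and a median filter for impulse noise can be illustrated with a short sketch. The impulse test and window size below are illustrative assumptions, not the authors' exact ACWMTM/AM implementation.

    ```python
    import numpy as np
    from scipy.ndimage import median_filter, uniform_filter

    def adaptive_denoise(slice_img, impulse_thresh=50.0, window=5):
        """Simplified adaptive filter: median filter for impulse-like pixels,
        local mean filter for the remaining (Gaussian-noise) pixels."""
        med = median_filter(slice_img, size=window)
        mean = uniform_filter(slice_img, size=window)
        # Flag pixels deviating strongly from the local median as impulse noise
        # (absolute threshold in image units; an illustrative criterion).
        impulse_mask = np.abs(slice_img - med) > impulse_thresh
        return np.where(impulse_mask, med, mean)
    ```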

  6. Noninvasive Computed Tomography-based Risk Stratification of Lung Adenocarcinomas in the National Lung Screening Trial.

    Science.gov (United States)

    Maldonado, Fabien; Duan, Fenghai; Raghunath, Sushravya M; Rajagopalan, Srinivasan; Karwoski, Ronald A; Garg, Kavita; Greco, Erin; Nath, Hrudaya; Robb, Richard A; Bartholmai, Brian J; Peikert, Tobias

    2015-09-15

    Screening for lung cancer using low-dose computed tomography (CT) reduces lung cancer mortality. However, in addition to a high rate of benign nodules, lung cancer screening detects a large number of indolent cancers that generally belong to the adenocarcinoma spectrum. Individualized management of screen-detected adenocarcinomas would be facilitated by noninvasive risk stratification. To validate that Computer-Aided Nodule Assessment and Risk Yield (CANARY), a novel image analysis software tool, successfully risk stratifies screen-detected lung adenocarcinomas based on clinical disease outcomes. We retrospectively identified 294 eligible patients diagnosed with lung adenocarcinoma spectrum lesions in the low-dose CT arm of the National Lung Screening Trial. The last low-dose CT scan before the diagnosis of lung adenocarcinoma was analyzed using CANARY blinded to clinical data. Based on their parametric CANARY signatures, all the lung adenocarcinoma nodules were risk stratified into three groups. CANARY risk groups were compared using survival analysis for progression-free survival. A total of 294 patients were included in the analysis. Kaplan-Meier analysis of all the 294 adenocarcinoma nodules stratified into the Good, Intermediate, and Poor CANARY risk groups yielded distinct progression-free survival curves (P < 0.0001). This observation was confirmed in the unadjusted and adjusted (age, sex, race, and smoking status) progression-free survival analysis of all stage I cases. CANARY allows the noninvasive risk stratification of lung adenocarcinomas into three groups with distinct post-treatment progression-free survival. Our results suggest that CANARY could ultimately facilitate individualized management of incidentally or screen-detected lung adenocarcinomas.
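    The progression-free survival comparison between CANARY risk groups rests on Kaplan-Meier estimation. A bare-bones Kaplan-Meier estimator is sketched below for illustration; in practice a dedicated survival-analysis library would be used.

    ```python
    import numpy as np

    def kaplan_meier(times, events):
        """Kaplan-Meier survival curve. times: follow-up times;
        events: 1 if progression observed, 0 if censored."""
        t = np.asarray(times, float)
        e = np.asarray(events, int)
        curve, s = [], 1.0
        for u in np.unique(t[e == 1]):          # distinct event times
            at_risk = np.sum(t >= u)
            d = np.sum((t == u) & (e == 1))
            s *= 1.0 - d / at_risk
            curve.append((u, s))
        return curve
    ```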

  7. Chest computed tomography-based scoring of thoracic sarcoidosis: Inter-rater reliability of CT abnormalities

    Energy Technology Data Exchange (ETDEWEB)

    Heuvel, D.A.V. den; Es, H.W. van; Heesewijk, J.P. van; Spee, M. [St. Antonius Hospital Nieuwegein, Department of Radiology, Nieuwegein (Netherlands); Jong, P.A. de [University Medical Center Utrecht, Department of Radiology, Utrecht (Netherlands); Zanen, P.; Grutters, J.C. [University Medical Center Utrecht, Division Heart and Lungs, Utrecht (Netherlands); St. Antonius Hospital Nieuwegein, Center of Interstitial Lung Diseases, Department of Pulmonology, Nieuwegein (Netherlands)

    2015-09-15

    To determine inter-rater reliability of sarcoidosis-related computed tomography (CT) findings that can be used for scoring of thoracic sarcoidosis. CT images of 51 patients with sarcoidosis were scored by five chest radiologists for various abnormal CT findings (22 in total) encountered in thoracic sarcoidosis. Using intra-class correlation coefficient (ICC) analysis, inter-rater reliability was analysed and reported according to the Guidelines for Reporting Reliability and Agreement Studies (GRRAS) criteria. A pre-specified sub-analysis was performed to investigate the effect of training. Scoring was trained in a distinct set of 15 scans in which all abnormal CT findings were represented. Median age of the 51 patients (36 men, 70 %) was 43 years (range 26 - 64 years). All radiographic stages were present in this group. ICC ranged from 0.91 for honeycombing to 0.11 for nodular margin (sharp versus ill-defined). The ICC was above 0.60 in 13 of the 22 abnormal findings. Sub-analysis for the best-trained observers demonstrated an ICC improvement for all abnormal findings and values above 0.60 for 16 of the 22 abnormalities. In our cohort, reliability between raters was acceptable for 16 thoracic sarcoidosis-related abnormal CT findings. (orig.)
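    Inter-rater reliability here is an intra-class correlation; a minimal sketch of the two-way random-effects, absolute-agreement, single-rater form ICC(2,1) is shown below (the study may have used a different ICC variant, so treat this as illustrative).

    ```python
    import numpy as np

    def icc_2_1(scores):
        """ICC(2,1) from an (n_subjects, k_raters) score matrix."""
        x = np.asarray(scores, float)
        n, k = x.shape
        grand = x.mean()
        msr = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # between subjects
        msc = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # between raters
        sse = ((x - grand) ** 2).sum() - (n - 1) * msr - (k - 1) * msc
        mse = sse / ((n - 1) * (k - 1))
        return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
    ```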

  8. Feasibility of computed tomography based thermometry during interstitial laser heating in bovine liver

    International Nuclear Information System (INIS)

    Pandeya, G.D.; Klaessens, J.H.G.M.; Greuter, M.J.W.; Oudkerk, M.; Schmidt, B.; Flohr, T.; Hillegersberg, R. van

    2011-01-01

    To assess the feasibility of computed tomography (CT) based thermometry during interstitial laser heating in the bovine liver. Four freshly excised cylindrical blocks of bovine tissue were heated using a continuous Nd:YAG laser (wavelength: 1064 nm, active length: 30 mm, power: 10-30 W). All tissues were imaged at least once before and 7 times during laser heating using CT, and temperatures were simultaneously measured with 5 calibrated thermal sensors. The dependency of the average CT number on temperature was analysed with regression analysis, and a CT thermal sensitivity was derived. During laser heating, a growing hypodense area was observed around the laser source, increasing in size as a function of time. The formation of the hypodense area was caused by a decline in CT numbers at increasing temperatures. The regression analysis showed an inverse linear dependency between temperature and average CT number of -0.65 ± 0.048 HU/°C (R² = 0.75) over the range of 18-85 °C in bovine liver. Non-invasive CT-based thermometry during interstitial laser heating is feasible in the bovine liver. CT-based thermometry could be further developed and may be of potential use during clinical LITT of the liver. (orig.)
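    With the reported sensitivity of about -0.65 HU/°C, temperature change can in principle be mapped from the change in CT number relative to a baseline scan. The helper below is a minimal sketch of that linear relation; the baseline temperature and the direct inversion are assumptions for illustration.

    ```python
    import numpy as np

    CT_THERMAL_SENSITIVITY = -0.65   # HU per degree Celsius (reported slope)

    def estimate_temperature(hu_current, hu_baseline, t_baseline_c=20.0):
        """Temperature (deg C) estimated from the CT-number change."""
        delta_hu = np.asarray(hu_current, float) - np.asarray(hu_baseline, float)
        return t_baseline_c + delta_hu / CT_THERMAL_SENSITIVITY

    # Example: a 13 HU drop from baseline corresponds to roughly a 20 deg C rise.
    print(estimate_temperature(37.0, 50.0))   # -> 40.0
    ```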

  9. A reconstruction algorithm for coherent scatter computed tomography based on filtered back-projection

    International Nuclear Information System (INIS)

    Stevendaal, U. van; Schlomka, J.-P.; Harding, A.; Grass, M.

    2003-01-01

    Coherent scatter computed tomography (CSCT) is a reconstructive x-ray imaging technique that yields the spatially resolved coherent-scatter form factor of the investigated object. Reconstruction from coherently scattered x-rays is commonly done using algebraic reconstruction techniques (ART). In this paper, we propose an alternative approach based on filtered back-projection. For the first time, a three-dimensional (3D) filtered back-projection technique using curved 3D back-projection lines is applied to two-dimensional coherent scatter projection data. The proposed algorithm is tested with simulated projection data as well as with projection data acquired with a demonstrator setup similar to a multi-line CT scanner geometry. While yielding comparable image quality as ART reconstruction, the modified 3D filtered back-projection algorithm is about two orders of magnitude faster. In contrast to iterative reconstruction schemes, it has the advantage that subfield-of-view reconstruction becomes feasible. This allows a selective reconstruction of the coherent-scatter form factor for a region of interest. The proposed modified 3D filtered back-projection algorithm is a powerful reconstruction technique to be implemented in a CSCT scanning system. This method gives coherent scatter CT the potential of becoming a competitive modality for medical imaging or nondestructive testing
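    The authors' method is a modified 3D filtered back-projection along curved lines specific to CSCT; for orientation, the standard 2D transmission-CT filtered back-projection that it generalizes can be reproduced with scikit-image as sketched below (a generic demo, not the CSCT algorithm itself).

    ```python
    import numpy as np
    from skimage.data import shepp_logan_phantom
    from skimage.transform import radon, iradon, rescale

    image = rescale(shepp_logan_phantom(), 0.5)                # small test phantom
    theta = np.linspace(0.0, 180.0, max(image.shape), endpoint=False)
    sinogram = radon(image, theta=theta)                        # forward projection
    # 'filter_name' is the argument name in recent scikit-image releases.
    fbp = iradon(sinogram, theta=theta, filter_name="ramp")     # filtered back-projection
    print("RMS error:", float(np.sqrt(np.mean((fbp - image) ** 2))))
    ```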

  10. Incomplete projection reconstruction of computed tomography based on the modified discrete algebraic reconstruction technique

    Science.gov (United States)

    Yang, Fuqiang; Zhang, Dinghua; Huang, Kuidong; Gao, Zongzhao; Yang, YaFei

    2018-02-01

    Based on the discrete algebraic reconstruction technique (DART), this study aims to address and test a new improved algorithm applied to incomplete projection data to generate a high quality reconstruction image by reducing the artifacts and noise in computed tomography. For the incomplete projections, an augmented Lagrangian based on compressed sensing is first used in the initial reconstruction for segmentation of the DART to get higher contrast graphics for boundary and non-boundary pixels. Then, the block matching 3D filtering operator was used to suppress the noise and to improve the gray distribution of the reconstructed image. Finally, simulation studies on the polychromatic spectrum were performed to test the performance of the new algorithm. Study results show a significant improvement in the signal-to-noise ratios (SNRs) and average gradients (AGs) of the images reconstructed from incomplete data. The SNRs and AGs of the new images reconstructed by DART-ALBM were on average 30%-40% and 10% higher than the images reconstructed by DART algorithms. Since the improved DART-ALBM algorithm has a better robustness to limited-view reconstruction, which not only makes the edge of the image clear but also makes the gray distribution of non-boundary pixels better, it has the potential to improve image quality from incomplete projections or sparse projections.
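    The DART step of snapping an intermediate reconstruction to a small set of known gray levels and then letting only boundary pixels vary in the next update can be sketched as follows; the gray levels and the 4-neighbour boundary test are illustrative choices, and the compressed-sensing initialization and block-matching 3D filtering of the paper are omitted.

    ```python
    import numpy as np

    def dart_segment(recon, gray_levels):
        """Snap a 2D reconstruction to known gray levels and flag boundary pixels."""
        levels = np.asarray(sorted(gray_levels), float)
        segmented = levels[np.argmin(np.abs(recon[..., None] - levels), axis=-1)]
        boundary = np.zeros(segmented.shape, dtype=bool)
        for axis in (0, 1):
            for shift in (1, -1):
                # np.roll wraps at the edges; acceptable for this sketch.
                boundary |= segmented != np.roll(segmented, shift, axis=axis)
        return segmented, boundary
    ```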

  11. Cone-beam x-ray luminescence computed tomography based on x-ray absorption dosage

    Science.gov (United States)

    Liu, Tianshuai; Rong, Junyan; Gao, Peng; Zhang, Wenli; Liu, Wenlei; Zhang, Yuanke; Lu, Hongbing

    2018-02-01

    With the advances of x-ray excitable nanophosphors, x-ray luminescence computed tomography (XLCT) has become a promising hybrid imaging modality. In particular, a cone-beam XLCT (CB-XLCT) system has demonstrated its potential in in vivo imaging with the advantage of fast imaging speed over other XLCT systems. Currently, the imaging models of most XLCT systems assume that nanophosphors emit light based on the intensity distribution of x-ray within the object, not completely reflecting the nature of the x-ray excitation process. To improve the imaging quality of CB-XLCT, an imaging model that adopts an excitation model of nanophosphors based on x-ray absorption dosage is proposed in this study. To solve the ill-posed inverse problem, a reconstruction algorithm that combines the adaptive Tikhonov regularization method with the imaging model is implemented for CB-XLCT reconstruction. Numerical simulations and phantom experiments indicate that compared with the traditional forward model based on x-ray intensity, the proposed dose-based model could improve the image quality of CB-XLCT significantly in terms of target shape, localization accuracy, and image contrast. In addition, the proposed model behaves better in distinguishing closer targets, demonstrating its advantage in improving spatial resolution.
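    The reconstruction combines the dose-based forward model with adaptive Tikhonov regularization; the underlying regularized least-squares solve can be sketched as below with a fixed regularization weight (the adaptive choice of the weight and the actual CB-XLCT system matrix are beyond this illustration).

    ```python
    import numpy as np

    def tikhonov_solve(A, b, lam=1e-3):
        """Solve min_x ||A x - b||^2 + lam * ||x||^2 via the normal equations."""
        n = A.shape[1]
        return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

    # Toy usage on an under-determined system, as arises in CB-XLCT.
    rng = np.random.default_rng(0)
    A = rng.normal(size=(40, 60))
    x = tikhonov_solve(A, rng.normal(size=40), lam=1e-2)
    ```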

  12. Cone Beam X-ray Luminescence Computed Tomography Based on Bayesian Method.

    Science.gov (United States)

    Zhang, Guanglei; Liu, Fei; Liu, Jie; Luo, Jianwen; Xie, Yaoqin; Bai, Jing; Xing, Lei

    2017-01-01

    X-ray luminescence computed tomography (XLCT), which aims to achieve molecular and functional imaging by X-rays, has recently been proposed as a new imaging modality. Combining the principles of X-ray excitation of luminescence-based probes and optical signal detection, XLCT naturally fuses functional and anatomical images and provides complementary information for a wide range of applications in biomedical research. In order to improve the data acquisition efficiency of previously developed narrow-beam XLCT, a cone beam XLCT (CB-XLCT) mode is adopted here to take advantage of the useful geometric features of cone beam excitation. Practically, a major hurdle in using cone beam X-ray for XLCT is that the inverse problem here is seriously ill-conditioned, making it difficult to achieve good image quality. In this paper, we propose a novel Bayesian method to tackle the bottleneck in CB-XLCT reconstruction. The method utilizes a local regularization strategy based on a Gaussian Markov random field to mitigate the ill-conditioning of CB-XLCT. An alternating optimization scheme is then used to automatically calculate all the unknown hyperparameters while an iterative coordinate descent algorithm is adopted to reconstruct the image with a voxel-based closed-form solution. Results of numerical simulations and mouse experiments show that the self-adaptive Bayesian method significantly improves the CB-XLCT image quality as compared with conventional methods.

  13. Chest computed tomography-based scoring of thoracic sarcoidosis: Inter-rater reliability of CT abnormalities

    International Nuclear Information System (INIS)

    Heuvel, D.A.V. den; Es, H.W. van; Heesewijk, J.P. van; Spee, M.; Jong, P.A. de; Zanen, P.; Grutters, J.C.

    2015-01-01

    To determine inter-rater reliability of sarcoidosis-related computed tomography (CT) findings that can be used for scoring of thoracic sarcoidosis. CT images of 51 patients with sarcoidosis were scored by five chest radiologists for various abnormal CT findings (22 in total) encountered in thoracic sarcoidosis. Using intra-class correlation coefficient (ICC) analysis, inter-rater reliability was analysed and reported according to the Guidelines for Reporting Reliability and Agreement Studies (GRRAS) criteria. A pre-specified sub-analysis was performed to investigate the effect of training. Scoring was trained in a distinct set of 15 scans in which all abnormal CT findings were represented. Median age of the 51 patients (36 men, 70 %) was 43 years (range 26 - 64 years). All radiographic stages were present in this group. ICC ranged from 0.91 for honeycombing to 0.11 for nodular margin (sharp versus ill-defined). The ICC was above 0.60 in 13 of the 22 abnormal findings. Sub-analysis for the best-trained observers demonstrated an ICC improvement for all abnormal findings and values above 0.60 for 16 of the 22 abnormalities. In our cohort, reliability between raters was acceptable for 16 thoracic sarcoidosis-related abnormal CT findings. (orig.)

  14. Computed tomography-based anatomic characterization of proximal aortic dissection with consideration for endovascular candidacy.

    Science.gov (United States)

    Moon, Michael C; Greenberg, Roy K; Morales, Jose P; Martin, Zenia; Lu, Qingsheng; Dowdall, Joseph F; Hernandez, Adrian V

    2011-04-01

    Proximal aortic dissections are life-threatening conditions that require immediate surgical intervention to avert an untreated mortality rate that approaches 50% at 48 hours. Advances in computed tomography (CT) imaging techniques have permitted increased characterization of aortic dissection that is necessary to assess the design and applicability of new treatment paradigms. All patients presenting during a 2-year period with acute proximal aortic dissections who underwent CT scanning were reviewed in an effort to establish a detailed assessment of their aortic anatomy. Imaging studies were assessed in an effort to document the location of the primary proximal fenestration, the proximal and distal extent of the dissection, and numerous morphologic measurements pertaining to the aortic valve, root, and ascending aorta to determine the potential for an endovascular exclusion of the ascending aorta. During the study period, 162 patients presented with proximal aortic dissections. Digital high-resolution preoperative CT imaging was performed on 76 patients, and 59 scans (77%) were of adequate quality to allow assessment of anatomic suitability for treatment with an endograft. In all cases, the dissection plane was detectable, yet the primary intimal fenestration was identified in only 41% of the studies. Scans showed that 24 patients (32%) appeared to be anatomically amenable to such a repair (absence of valvular involvement, appropriate length and diameter of proximal sealing regions, lack of need to occlude coronary vasculature). Of the 42 scans that were determined not to be favorable for endovascular repair, the most common exclusion finding was the absence of a proximal landing zone (n = 15; 36%). Appropriately protocoled CT imaging provides detailed anatomic information about the aortic root and ascending aorta, allowing the assessment of which dissections have proximal fenestrations that may be amenable to an endovascular repair.

  15. Small field dose delivery evaluations using cone beam optical computed tomography-based polymer gel dosimetry

    Directory of Open Access Journals (Sweden)

    Timothy Olding

    2011-01-01

    This paper explores the combination of cone beam optical computed tomography with an N-isopropylacrylamide (NIPAM)-based polymer gel dosimeter for three-dimensional dose imaging of small field deliveries. Initial investigations indicate that cone beam optical imaging of polymer gels is complicated by scattered stray light perturbation. This can lead to significant dosimetry failures in comparison to dose readout by magnetic resonance imaging (MRI). For example, only 60% of the voxels from an optical CT dose readout of a 1 L dosimeter passed a two-dimensional Low's gamma test (at a 3%, 3 mm criterion) relative to a treatment plan for a well-characterized pencil beam delivery. When the same dosimeter was probed by MRI, a 93% pass rate was observed. The optical dose measurement was improved after modifications to the dosimeter preparation, matching its performance with the imaging capabilities of the scanner. With the new dosimeter preparation, 99.7% of the optical CT voxels passed a Low's gamma test at the 3%, 3 mm criterion and 92.7% at a 2%, 2 mm criterion. The fitted interjar dose responses of a small sample set of modified dosimeters prepared (a) from the same gel batch and (b) from different gel batches prepared on the same day were found to be in agreement to within 3.6% and 3.8%, respectively, over the full dose range. Without drawing any statistical conclusions, this experiment gives a preliminary indication that intrabatch or interbatch NIPAM dosimeters prepared on the same day should be suitable for dose sensitivity calibration.
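    The pass rates quoted are from Low's gamma analysis, which combines a dose-difference and a distance-to-agreement criterion. A brute-force 2D sketch of a global gamma pass rate is given below; grid spacing, low-dose threshold, and criteria are parameters, and this is a simplified illustration rather than the analysis software used in the study.

    ```python
    import numpy as np

    def gamma_pass_rate(ref, ev, spacing_mm, dd_pct=3.0, dta_mm=3.0, thresh_pct=10.0):
        """Global 2D gamma pass rate (brute-force search over nearby reference pixels)."""
        ref, ev = np.asarray(ref, float), np.asarray(ev, float)
        dmax = ref.max()
        dd = dd_pct / 100.0 * dmax
        search = int(np.ceil(dta_mm / spacing_mm)) + 1
        ny, nx = ref.shape
        passed = total = 0
        for iy in range(ny):
            for ix in range(nx):
                if ev[iy, ix] < thresh_pct / 100.0 * dmax:
                    continue                              # skip low-dose region
                total += 1
                best = np.inf
                for dy in range(-search, search + 1):
                    for dx in range(-search, search + 1):
                        ry, rx = iy + dy, ix + dx
                        if 0 <= ry < ny and 0 <= rx < nx:
                            dist = spacing_mm * np.hypot(dy, dx)
                            g2 = (dist / dta_mm) ** 2 + ((ev[iy, ix] - ref[ry, rx]) / dd) ** 2
                            best = min(best, g2)
                passed += best <= 1.0
        return passed / max(total, 1)
    ```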

  16. Timing of computed tomography-based postimplant assessment following permanent transperineal prostate brachytherapy

    International Nuclear Information System (INIS)

    Prestidge, Bradley R.; Bice, William S.; Kiefer, Eric J.; Prete, James J.

    1998-01-01

    Purpose: To establish the rate of resolution of prostatic edema following transperineal interstitial permanent prostate brachytherapy, and to determine the results and impact of timing of the postimplant assessment on the dose-volume relationship. Methods and Materials: A series of 19 consecutive patients with early-stage adenocarcinoma of the prostate receiving transperineal interstitial permanent prostate brachytherapy, were enrolled in this study. Twelve received 125 I and seven received 103 Pd. Postoperative assessment included a computed tomographic (CT) scan on postoperative days 1, 8, 30, 90, and 180. On each occasion, CT scans were performed on a GE helical unit at 3-mm abutting slices, 15-cm field of view. Prostate volumes were outlined on CT scans by a single clinician. Following digitization of the volumes and radioactive sources, volumes and dose-volume histograms were calculated. The prostate volume encompassed by the 80% and 100% reference isodose volumes was calculated. Results: Preimplant transrectal ultrasound determined volumes varied from 17.5 to 38.6 cc (median 27.9 cc). Prostate volumes previously defined on 40 randomly selected postimplant CT scans were compared in a blinded fashion to a second CT-derived volume and ranged from -32% to +24%. The Pearson correlation coefficient for prostate CT volume reproducibility was 0.77 (p < 0.03). CT scan-determined volume performed on postoperative day 1 was an average of 41.4% greater than the volume determined by preimplant ultrasound. Significant decreases in average volume were seen during the first month postoperatively. Average volume decreased 14% from day 1 to day 8, 10% from day 8 to day 30, 3% from day 30 to day 90, and 2% thereafter. Coverage of the prostate volume by the 80% isodose volume increased from 85.6% on postoperative day 1 to 92.2% on postoperative day 180. The corresponding increase in the 100% reference dose coverage of the prostate volume ranged from 73.1% to 83.3% between
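    The coverage figures (fraction of the prostate volume enclosed by the 80% and 100% reference isodose) reduce to counting structure voxels at or above a dose level. A minimal sketch, assuming a dose grid and a boolean structure mask on the same grid, is shown below; it is not the authors' planning-system calculation.

    ```python
    import numpy as np

    def isodose_coverage(dose, structure_mask, prescribed_dose, level=1.0):
        """Fraction of the structure receiving at least level * prescribed_dose."""
        dose = np.asarray(dose, float)
        mask = np.asarray(structure_mask, bool)
        covered = (dose >= level * prescribed_dose) & mask
        return covered.sum() / mask.sum()

    # e.g. coverage by the 80% isodose: isodose_coverage(dose, prostate_mask, rx_dose, level=0.8)
    ```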

  17. Computed tomography versus magnetic resonance imaging versus bone scintigraphy for clinically suspected scaphoid fractures in patients with negative plain radiographs

    NARCIS (Netherlands)

    Mallee, Wouter H.; Wang, Junfeng; Poolman, Rudolf W.; Kloen, Peter; Maas, Mario; de Vet, Henrica C. W.; Doornberg, Job N.

    2015-01-01

    In clinically suspected scaphoid fractures, early diagnosis reduces the risk of non-union and minimises loss in productivity resulting from unnecessary cast immobilisation. Since initial radiographs do not exclude the possibility of a fracture, additional imaging is needed. Computed tomography (CT),

  18. Magnetic Resonance Imaging versus Computed Tomography and Different Imaging Modalities in Evaluation of Sinonasal Neoplasms Diagnosed by Histopathology

    Directory of Open Access Journals (Sweden)

    Mohammed A. Gomaa

    2013-01-01

    Objective: The study purpose was to detect the value of magnetic resonance imaging (MRI) compared to computed tomography (CT) and different imaging modalities, such as conventional radiology, in the evaluation of sinonasal neoplasms diagnosed by histopathology. Methods: Thirty patients (16 males and 14 females) complaining of symptoms related to the sinonasal tract were included. After thorough clinical and local examination, the patients were subjected to the following: conventional radiography, CT, MRI, and histopathological examination. Results: The nasal cavity was the most commonly involved site of sinonasal malignancy, followed by the maxillary sinuses; the least commonly affected site was the frontal sinuses. Benign sinonasal tumors were present in 14 cases. The most common benign lesion was juvenile nasopharyngeal angiofibroma (6 cases), followed by inverted papilloma (3 cases). Malignant sinonasal tumors were present in 16 cases: squamous cell carcinoma in 5 cases and undifferentiated carcinoma in 3 cases. Lymphoepithelioma and non-Hodgkin lymphoma were present in 2 cases each, while adenocarcinoma, chondrosarcoma, adenoid cystic carcinoma, and rhabdomyosarcoma were present in 1 case each. Conclusion: MRI, with its superior soft tissue contrast and multiplanar capability, is superior to CT in the pretreatment evaluation of primary malignant tumors of the sinonasal cavity.

  19. Potentials of high resolution magnetic resonance imaging versus computed tomography for preoperative local staging of colon cancer

    International Nuclear Information System (INIS)

    Rollven, Erik; Blomqvist, Lennart; Holm, Torbjorn; Glimelius, Bengt; Loerinc, Esther

    2013-01-01

    Background: Preoperative identification of locally advanced colon cancer is of importance in order to properly plan treatment. Purpose: To study high resolution T2-weighted magnetic resonance imaging (MRI) versus computed tomography (CT) for preoperative staging of colon cancer with surgery and histopathology as reference standard. Material and Methods: Twenty-eight patients with a total of 29 tumors were included. Patients were examined on a 1.5 T MR unit using a phased array body coil. T2 turbo spin-echo high resolution sequences were obtained in a coronal, transverse, and perpendicular plane to the long axis of the colon at the site of the tumor. Contrast-enhanced CT was performed using a protocol for metastasis staging. The examinations were independently evaluated by two gastrointestinal radiologists using criteria adapted to imaging for prediction of T-stage, N-stage, and extramural venous invasion. Based on the T-stage, tumors were divided into locally advanced (T3cd-T4) and not locally advanced (T1-T3ab). Surgical and histopathological findings served as reference standard. Results: Using MRI, T-stage, N-stage, and extramural venous invasion were correctly predicted for each observer in 90% and 93%, 72% and 69%, and 82% and 78% of cases, respectively. With CT the corresponding results were 79% and 76%, 72% and 72%, 78% and 67%. For MRI inter-observer agreements (Kappa statistics) were 0.79, 0.10, and 0.76. For CT the corresponding results were 0.64, 0.66, and 0.22. Conclusion: Patients with locally advanced colon cancer, defined as tumor stage T3cd-T4, can be identified by both high resolution MRI and CT, even when CT is performed with a metastasis staging protocol. MRI may have an advantage, due to its high soft tissue discrimination, in identifying certain prognostic factors such as T-stage and extramural venous invasion.
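    Inter-observer agreement is reported as Cohen's kappa; a minimal sketch of the unweighted statistic for two raters' categorical assignments (e.g., locally advanced vs. not) is given below for reference.

    ```python
    import numpy as np

    def cohens_kappa(rater1, rater2):
        """Unweighted Cohen's kappa for two raters' categorical labels."""
        r1, r2 = np.asarray(rater1), np.asarray(rater2)
        po = np.mean(r1 == r2)                                   # observed agreement
        pe = sum(np.mean(r1 == c) * np.mean(r2 == c)             # chance agreement
                 for c in np.union1d(r1, r2))
        return (po - pe) / (1 - pe)
    ```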

  20. Comparison of onboard low-field magnetic resonance imaging versus onboard computed tomography for anatomy visualization in radiotherapy.

    Science.gov (United States)

    Noel, Camille E; Parikh, Parag J; Spencer, Christopher R; Green, Olga L; Hu, Yanle; Mutic, Sasa; Olsen, Jeffrey R

    2015-01-01

    Onboard magnetic resonance imaging (OB-MRI) for daily localization and adaptive radiotherapy has been under development by several groups. However, no clinical studies have evaluated whether OB-MRI improves visualization of the target and organs at risk (OARs) compared to standard onboard computed tomography (OB-CT). This study compared visualization of patient anatomy on images acquired on the MRI-(60)Co ViewRay system to those acquired with OB-CT. Fourteen patients enrolled on a protocol approved by the Institutional Review Board (IRB) and undergoing image-guided radiotherapy for cancer in the thorax (n = 2), pelvis (n = 6), abdomen (n = 3) or head and neck (n = 3) were imaged with OB-MRI and OB-CT. For each of the 14 patients, the OB-MRI and OB-CT datasets were displayed side-by-side and independently reviewed by three radiation oncologists. Each physician was asked to evaluate which dataset offered better visualization of the target and OARs. A quantitative contouring study was performed on two abdominal patients to assess if OB-MRI could offer improved inter-observer segmentation agreement for adaptive planning. In total 221 OARs and 10 targets were compared for visualization on OB-MRI and OB-CT by each of the three physicians. The majority of physicians (two or more) evaluated visualization on MRI as better for 71% of structures, worse for 10% of structures, and equivalent for 14% of structures. 5% of structures were not visible on either. Physicians agreed unanimously for 74% and in majority for > 99% of structures. Targets were better visualized on MRI in 4/10 cases, and never on OB-CT. Low-field MR provides better anatomic visualization of many radiotherapy targets and most OARs as compared to OB-CT. Further studies with OB-MRI should be pursued.

  1. Potentials of high resolution magnetic resonance imaging versus computed tomography for preoperative local staging of colon cancer

    Energy Technology Data Exchange (ETDEWEB)

    Rollven, Erik; Blomqvist, Lennart [Dept. of Diagnostic Radiology, Karolinska Univ. Hospital Solna, Stockholm (Sweden); Dept. of Molecular Medicine and Surgery, Karolinska Inst., Stockholm (Sweden)], e-mail: erik.rollven@ki.se; Holm, Torbjorn [Dept. of Molecular Medicine and Surgery, Karolinska Inst., Stockholm (Sweden); Dept. of Surgery, Karolinska Univ. Hospital Solna, Stockholm (Sweden); Glimelius, Bengt [Dept. of Radiology, Oncology and Radiation Science, Uppsala Univ., Uppsala (Sweden); Dept. of Oncology and Pathology, Karolinska Inst., Stockholm (Sweden); Loerinc, Esther [Dept. of Oncology and Pathology, Karolinska Inst., Stockholm (Sweden); Dept. of Pathology, Karolinska Univ. Hospital, Solna, Sweden (Sweden)

    2013-09-15

    Background: Preoperative identification of locally advanced colon cancer is of importance in order to properly plan treatment. Purpose: To study high resolution T2-weighted magnetic resonance imaging (MRI) versus computed tomography (CT) for preoperative staging of colon cancer with surgery and histopathology as reference standard. Material and Methods: Twenty-eight patients with a total of 29 tumors were included. Patients were examined on a 1.5 T MR unit using a phased array body coil. T2 turbo spin-echo high resolution sequences were obtained in a coronal, transverse, and perpendicular plane to the long axis of the colon at the site of the tumor. Contrast-enhanced CT was performed using a protocol for metastasis staging. The examinations were independently evaluated by two gastrointestinal radiologists using criteria adapted to imaging for prediction of T-stage, N-stage, and extramural venous invasion. Based on the T-stage, tumors were divided into locally advanced (T3cd-T4) and not locally advanced (T1-T3ab). Surgical and histopathological findings served as reference standard. Results: Using MRI, T-stage, N-stage, and extramural venous invasion were correctly predicted for each observer in 90% and 93%, 72% and 69%, and 82% and 78% of cases, respectively. With CT the corresponding results were 79% and 76%, 72% and 72%, 78% and 67%. For MRI inter-observer agreements (Kappa statistics) were 0.79, 0.10, and 0.76. For CT the corresponding results were 0.64, 0.66, and 0.22. Conclusion: Patients with locally advanced colon cancer, defined as tumor stage T3cd-T4, can be identified by both high resolution MRI and CT, even when CT is performed with a metastasis staging protocol. MRI may have an advantage, due to its high soft tissue discrimination, in identifying certain prognostic factors such as T-stage and extramural venous invasion.

  2. Clinical application of calculated split renal volume using computed tomography-based renal volumetry after partial nephrectomy: Correlation with technetium-99m dimercaptosuccinic acid renal scan data.

    Science.gov (United States)

    Lee, Chan Ho; Park, Young Joo; Ku, Ja Yoon; Ha, Hong Koo

    2017-06-01

    To evaluate the clinical application of computed tomography-based measurement of renal cortical volume and split renal volume as a single tool to assess the anatomy and renal function in patients with renal tumors before and after partial nephrectomy, and to compare the findings with technetium-99m dimercaptosuccinic acid renal scan. The data of 51 patients with a unilateral renal tumor managed by partial nephrectomy were retrospectively analyzed. The renal cortical volume of tumor-bearing and contralateral kidneys was measured using ImageJ software. Split estimated glomerular filtration rate and split renal volume calculated using this renal cortical volume were compared with the split renal function measured with technetium-99m dimercaptosuccinic acid renal scan. A strong correlation between split renal function and split renal volume of the tumor-bearing kidney was observed before and after surgery (r = 0.89). Computed tomography-based renal volumetry thus had a strong correlation with the split renal function measured using the technetium-99m dimercaptosuccinic acid renal scan. Computed tomography-based split renal volume measurement before and after partial nephrectomy can be used as a single modality for anatomical and functional assessment of the tumor-bearing kidney. © 2017 The Japanese Urological Association.
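    Split renal volume is simply the tumour-bearing kidney's share of total cortical volume, and the volume-derived split function apportions total eGFR by that share. The helpers below sketch those two relations; apportioning eGFR strictly by volume fraction is an assumption for illustration.

    ```python
    def split_renal_volume_pct(vol_tumor_side_ml, vol_contralateral_ml):
        """Tumour-bearing kidney volume as a percentage of total cortical volume."""
        return 100.0 * vol_tumor_side_ml / (vol_tumor_side_ml + vol_contralateral_ml)

    def split_egfr(total_egfr, split_volume_pct):
        """Volume-derived split eGFR (total eGFR apportioned by volume share)."""
        return total_egfr * split_volume_pct / 100.0
    ```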

  3. Intermuscular pterygoid-temporal abscess following inferior alveolar nerve block anesthesia-A computer tomography based navigated surgical intervention: Case report and review.

    Science.gov (United States)

    Wallner, Jürgen; Reinbacher, Knut Ernst; Pau, Mauro; Feichtinger, Matthias

    2014-01-01

    Inferior alveolar nerve block (IANB) anesthesia is a common local anesthetic procedure. Although IANB anesthesia is known for its safety, complications can still occur. Immediate and delayed-onset disorders following IANB anesthesia, and their treatment, are well recognized today. We present a case of a patient who developed a symptomatic abscess in the pterygoid region as a result of several inferior alveolar nerve injections. Clinical symptoms included diffuse pain, reduced mouth opening, and jaw hypomobility, and persisted under first-step conservative treatment. Since image-based navigated interventions have gained in importance and are used for various procedures, a navigated surgical intervention was initiated as second-step therapy. Thus, a precise, atraumatic surgical intervention was performed with an optical tracking system in a difficult anatomical region. A symptomatic abscess was treated by a computed tomography-based navigated surgical intervention at our department. Advantages and disadvantages of this treatment strategy are evaluated.

  4. Intermuscular pterygoid-temporal abscess following inferior alveolar nerve block anesthesia–A computer tomography based navigated surgical intervention: Case report and review

    Science.gov (United States)

    Wallner, Jürgen; Reinbacher, Knut Ernst; Pau, Mauro; Feichtinger, Matthias

    2014-01-01

    Inferior alveolar nerve block (IANB) anesthesia is a common local anesthetic procedure. Although IANB anesthesia is known for its safety, complications can still occur. Immediate and delayed-onset disorders following IANB anesthesia, and their treatment, are well recognized today. We present a case of a patient who developed a symptomatic abscess in the pterygoid region as a result of several inferior alveolar nerve injections. Clinical symptoms included diffuse pain, reduced mouth opening, and jaw hypomobility, and persisted under first-step conservative treatment. Since image-based navigated interventions have gained in importance and are used for various procedures, a navigated surgical intervention was initiated as second-step therapy. Thus, a precise, atraumatic surgical intervention was performed with an optical tracking system in a difficult anatomical region. A symptomatic abscess was treated by a computed tomography-based navigated surgical intervention at our department. Advantages and disadvantages of this treatment strategy are evaluated. PMID:24987612

  5. Influence of Sinogram-Affirmed Iterative Reconstruction on Computed Tomography-Based Lung Volumetry and Quantification of Pulmonary Emphysema.

    Science.gov (United States)

    Baumueller, Stephan; Hilty, Regina; Nguyen, Thi Dan Linh; Weder, Walter; Alkadhi, Hatem; Frauenfelder, Thomas

    2016-01-01

    The purpose of this study was to evaluate the influence of sinogram-affirmed iterative reconstruction (SAFIRE) on quantification of lung volume and pulmonary emphysema in low-dose chest computed tomography compared with filtered back projection (FBP). Enhanced or nonenhanced low-dose chest computed tomography was performed in 20 patients with chronic obstructive pulmonary disease (group A) and in 20 patients without lung disease (group B). Data sets were reconstructed with FBP and SAFIRE strength levels 3 to 5. Two readers semiautomatically evaluated lung volumes and automatically quantified pulmonary emphysema, and another assessed image quality. Radiation dose parameters were recorded. Lung volume between FBP and SAFIRE 3 to 5 was not significantly different among both groups (all P > 0.05). When compared with those of FBP, total emphysema volume was significantly lower among reconstructions with SAFIRE 4 and 5 (mean difference, 0.56 and 0.79 L; all P < 0.05). Thus, while CT-based lung volumetry is not influenced by SAFIRE, quantification of pulmonary emphysema is affected at higher strength levels.
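    Emphysema volume in such analyses is typically the volume of lung voxels below a low-attenuation threshold; the sketch below uses the commonly cited -950 HU cut-off, which is an assumption here rather than a value stated in the abstract.

    ```python
    import numpy as np

    def emphysema_volume_litres(lung_hu, voxel_volume_mm3, threshold_hu=-950):
        """Total emphysema volume (L) = volume of lung voxels below threshold_hu."""
        n_emph = np.count_nonzero(np.asarray(lung_hu) < threshold_hu)
        return n_emph * voxel_volume_mm3 / 1.0e6   # mm^3 -> litres
    ```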

  6. Breakeven analysis of computed tomography (Based on utilization of whole body C.T. scanner of SNU hospital)

    International Nuclear Information System (INIS)

    Cheung, Hwan; Choi, Myung Jun; Yoo, Chang Ho

    1986-01-01

    The C.T. scanner, an important tool for image-based diagnostics, is one of the costliest types of medical equipment. At present, a total of 66 units are installed in Korea, and more will be added in the future. Both the price of the C.T. scanner and the scanning charge for using the equipment are very high compared with those of other kinds of medical equipment. A break-even analysis of computed tomography is therefore fundamental and essential both to rational hospital management and to keeping the charge for its use at an optimum level in consideration of the patient's medical expense burden. Even if the pursuit of profit is not the sole objective of a hospital, it cannot be denied that a break-even analysis provides an important factor for the decision-making process in hospital management. The purpose of the present study is to find ways and means to help rationalize hospital operation and improve its earning power through a break-even analysis of C.T. scanner operation. For this purpose, the total cost of the GE 8800 Whole Body C.T. Scanner installed at the Seoul National University Hospital was computed, and the records of its operation were analyzed. The expenses for its operation were divided into direct and indirect expenses depending on whether the cost was incurred in the C.T. room or not, and the actual cost was computed for each of these accounting units.
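    The break-even point itself follows the standard relation: the scan volume at which revenue covers total cost equals fixed cost divided by the contribution margin per scan. The figures in the example are hypothetical, not the study's actual cost data.

    ```python
    def breakeven_scans(annual_fixed_cost, charge_per_scan, variable_cost_per_scan):
        """Annual number of scans at which CT operation breaks even."""
        contribution_margin = charge_per_scan - variable_cost_per_scan
        return annual_fixed_cost / contribution_margin

    # Hypothetical example: fixed cost 300,000, charge 120, variable cost 40 per scan.
    print(breakeven_scans(300_000, 120, 40))   # -> 3750.0 scans per year
    ```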

  7. Use of 4-Dimensional Computed Tomography-Based Ventilation Imaging to Correlate Lung Dose and Function With Clinical Outcomes

    International Nuclear Information System (INIS)

    Vinogradskiy, Yevgeniy; Castillo, Richard; Castillo, Edward; Tucker, Susan L.; Liao, Zhongxing; Guerrero, Thomas; Martel, Mary K.

    2013-01-01

    Purpose: Four-dimensional computed tomography (4DCT)-based ventilation is an emerging imaging modality that can be used in the thoracic treatment planning process. The clinical benefit of using ventilation images in radiation treatment plans remains to be tested. The purpose of the current work was to test the potential benefit of using ventilation in treatment planning by evaluating whether dose to highly ventilated regions of the lung resulted in increased incidence of clinical toxicity. Methods and Materials: Pretreatment 4DCT data were used to compute pretreatment ventilation images for 96 lung cancer patients. Ventilation images were calculated using 4DCT data, deformable image registration, and a density-change based algorithm. Dose–volume and ventilation-based dose function metrics were computed for each patient. The ability of the dose–volume and ventilation-based dose–function metrics to predict for severe (grade 3+) radiation pneumonitis was assessed using logistic regression analysis, area under the curve (AUC) metrics, and bootstrap methods. Results: A specific patient example is presented that demonstrates how incorporating ventilation-based functional information can help separate patients with and without toxicity. The logistic regression significance values were all lower for the dose–function metrics (range P=.093-.250) than for their dose–volume equivalents (range, P=.331-.580). The AUC values were all greater for the dose–function metrics (range, 0.569-0.620) than for their dose–volume equivalents (range, 0.500-0.544). Bootstrap results revealed an improvement in model fit using dose–function metrics compared to dose–volume metrics that approached significance (range, P=.118-.155). Conclusions: To our knowledge, this is the first study that attempts to correlate lung dose and 4DCT ventilation-based function to thoracic toxicity after radiation therapy. Although the results were not significant at the .05 level, our data suggests
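    A ventilation-based dose-function metric weights each lung region's dose by how much ventilation it carries; one common form is the fraction of total ventilation receiving at least a given dose. The sketch below illustrates that idea and is an assumed definition, not necessarily the exact metrics used in the study.

    ```python
    import numpy as np

    def functional_vx(dose_gy, ventilation, threshold_gy):
        """Fraction of total ventilation located in regions receiving >= threshold_gy."""
        dose = np.asarray(dose_gy, float)
        vent = np.asarray(ventilation, float)
        return vent[dose >= threshold_gy].sum() / vent.sum()

    # e.g. functional V20: functional_vx(dose, ventilation_map, 20.0)
    ```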

  8. Increased incidence of adrenal gland injury in blunt abdominal trauma: a computed tomography-based study from Pakistan

    Directory of Open Access Journals (Sweden)

    Aziz Muhammad Usman

    2014-02-01

    Objective: To determine the frequency of adrenal injuries in patients presenting with blunt abdominal trauma by computed tomography (CT). Methods: During a 6-month period from January 1, 2011 to June 30, 2011, 82 emergency CT examinations were performed in the setting of major abdominal trauma and retrospectively reviewed for adrenal gland injuries. Results: A total of 7 patients were identified as having adrenal gland injuries (6 males and 1 female). Two patients had isolated adrenal gland injuries. In the other 5 patients with nonisolated injuries, injuries to the liver (1 case), spleen (1 case), retroperitoneum (2 cases), and mesentery (4 cases) were identified. Overall, 24 cases with liver injuries (29%), 11 cases with splenic injuries (13%), 54 cases with mesenteric injuries (65%), 14 cases with retroperitoneal injuries (17%), and 9 cases with renal injuries were identified. Conclusion: Adrenal gland injury was identified in 7 patients (11.7%) out of a total of 82 patients who underwent CT after major abdominal trauma. Most of these cases were nonisolated injuries. Our experience indicates that adrenal injury resulting from trauma is more common than suggested by other reports. The rise in incidence of adrenal injuries could be attributed to the mode of injury.

  9. The effects of gas diffusion layers structure on water transportation using X-ray computed tomography based Lattice Boltzmann method

    Science.gov (United States)

    Jinuntuya, Fontip; Whiteley, Michael; Chen, Rui; Fly, Ashley

    2018-02-01

    The Gas Diffusion Layer (GDL) of a Polymer Electrolyte Membrane Fuel Cell (PEMFC) plays a crucial role in overall cell performance. It is responsible for the dissemination of reactant gases from the gas supply channels to the reactant sites at the Catalyst Layer (CL), and the adequate removal of product water from reactant sites back to the gas channels. Existing research into water transport in GDLs has been simplified to 2D estimations of GDL structures or has relied on virtual stochastic models. This work uses X-ray computed tomography (XCT) to reconstruct three types of GDL in a model. These models are then analysed via Lattice Boltzmann methods to understand the water transport behaviours under differing contact angles and pressure differences. In this study, the three GDL samples were tested over the contact angles of 60°, 80°, 90°, 100°, 120° and 140° under applied pressure differences of 5 kPa, 10 kPa and 15 kPa. By varying the contact angle and pressure difference, it was found that the transition between stable displacement and capillary fingering is not a gradual process. Hydrophilic contact angles in the region of 60° < θ < 90° showed stable displacement properties, whereas contact angles in the region of 100° < θ < 140° displayed capillary fingering characteristics.

  10. Parotid gland sparing effect by computed tomography-based modified lower field margin in whole brain radiotherapy

    International Nuclear Information System (INIS)

    Cho, Oyeon; Chun, Mi Son; Oh, Young Taek; Kim, Mi Hwa; Park, Hae Jin; Nam, Sang Soo; Heo, Jae Sung; Noh, O Kyu; Park, Sung Ho

    2013-01-01

    The parotid gland can be considered an organ at risk in whole brain radiotherapy (WBRT). The purpose of this study was to evaluate the parotid gland sparing effect of computed tomography (CT)-based WBRT compared to a 2-dimensional plan with a conventional field margin. From January 2008 to April 2011, 53 patients underwent WBRT using CT-based simulation. A bilateral two-field arrangement was used and the prescribed dose was 30 Gy in 10 fractions. We compared the parotid dose between 2 radiotherapy plans using different lower field margins: a conventional field to the lower level of the atlas (CF) and a modified field fitted to the brain tissue (MF). Averages of the mean parotid dose of the 2 protocols with CF and MF were 17.4 Gy and 8.7 Gy, respectively (p < 0.001). The percentages of the whole brain volume receiving more than 98% of the prescribed dose were 99.7% for CF and 99.5% for MF. Compared to WBRT with CF, CT-based lower field margin modification is a simple and effective technique for sparing the parotid gland, while providing similar dose coverage of the whole brain.

  11. A dental implant-based registration method for measuring mandibular kinematics using cone beam computed tomography-based fluoroscopy.

    Science.gov (United States)

    Lin, Cheng-Chung; Chen, Chien-Chih; Chen, Yunn-Jy; Lu, Tung-Wu; Hong, Shih-Wun

    2014-01-01

    This study aimed to develop and evaluate experimentally an implant-based registration method for measuring three-dimensional (3D) kinematics of the mandible and dental implants in the mandible based on dental cone beam computed tomography (CBCT), modified to include fluoroscopic function. The proposed implant-based registration method was based on the registration of CBCT data of implants/bones with single-plane fluoroscopy images. Seven registration conditions that included one to three implants were evaluated experimentally for their performance in a cadaveric porcine head model. The implant-based registration method was shown to have measurement errors (SD) of less than -0.2 (0.3) mm, 1.1 (2.2) mm, and 0.7 degrees (1.3 degrees) for the in-plane translation, out-of-plane translation, and all angular components, respectively, regardless of the number of implants used. The corresponding errors were reduced to less than -0.1 (0.1) mm, -0.3 (1.7) mm, and 0.5 degrees (0.4 degrees) when three implants were used. An implant-based registration method was developed to measure the 3D kinematics of the mandible/implants. With its high accuracy and reliability, the new method will be useful for measuring the 3D motion of the bones/implants for relevant applications.

  12. Preclinical validation of automated dual-energy X-ray absorptiometry and computed tomography-based body composition measurements

    International Nuclear Information System (INIS)

    DEVRIESE, Joke; Pottel, Hans; BEELS, Laurence; VAN DE WIELE, Christophe; MAES, Alex; GHEYSENS, Olivier

    2016-01-01

    The aim of this study was to determine and validate a set of Hounsfield unit (HU) ranges to segment computed tomography (CT) images into tissue types and to test the validity of dual-energy X-ray absorptiometry (DXA) tissue segmentation on pure, unmixed porcine tissues. This preclinical prospective study was approved by the local ethical committee. Different quantities of porcine bone tissue (BT), lean tissue (LT) and adipose tissue (AT) were scanned using DXA and CT. Tissue type segmentation in DXA was performed via the standard clinical protocol and in CT through different sets of HU ranges. Percent coefficients of variation (%CV) were used to assess precision, while % differences of observed masses were tested against zero using the Wilcoxon signed-rank test. Total mass DXA measurements differed little but significantly (P=0.016) from true mass, while total mass CT measurements based on literature values showed non-significant (P=0.69) differences of 1.7% and 2.0%. BT mass estimates with DXA differed more from true mass (median -78.2 to -75.8%) than other tissue types (median -11.3 to -8.1%). Tissue mass estimates with CT and literature HU ranges showed small differences from true mass for every tissue type (median -10.4 to 8.8%). CT is thus the most suitable method for automated tissue segmentation and can become a valuable tool in quantitative nuclear medicine.
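    Segmenting CT into tissue classes by HU ranges amounts to thresholding the volume; the ranges below are illustrative placeholders, not the validated ranges determined in the study.

    ```python
    import numpy as np

    # Illustrative HU windows (assumptions for the sketch, not the study's values).
    HU_RANGES = {"adipose": (-190, -30), "lean": (-29, 150), "bone": (151, 3000)}

    def segment_by_hu(ct_hu):
        """Return a dict of boolean masks, one per tissue class, from HU thresholds."""
        ct = np.asarray(ct_hu)
        return {name: (ct >= lo) & (ct <= hi) for name, (lo, hi) in HU_RANGES.items()}
    ```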

  13. Spiral Computed Tomography Based Maxillary Sinus Imaging in Relation to Tooth Loss, Implant Placement and Potential Grafting Procedure

    Directory of Open Access Journals (Sweden)

    Reinhilde Jacobs

    2010-01-01

    Objectives: The purpose of the present study was to explore the maxillary sinus anatomy, its variations and volume in patients with a need for maxillary implant placement. Materials and Methods: Maxillary sinus data of 101 consecutive patients who underwent spiral computed tomography (CT) scans for preoperative implant planning in the maxilla at the Department of Periodontology, University Hospital, Catholic University of Leuven, Leuven, Belgium were retrospectively evaluated. The alveolar bone height was measured on serial cross-sectional images between the alveolar crest and the sinus floor, parallel to the tooth axis. In order to describe the size of the maxillary sinus, the anteroposterior (AP) and mediolateral (ML) diameters of the sinus were measured. Results: The results indicated that the alveolar bone height was significantly higher in the premolar regions in comparison to the molar region (n = 46, P < 0.05). Mucosal thickening of > 4 mm was found mostly at the level of the sinus floor. The present sample did not reveal any significant difference (P > 0.05) in maxillary sinus dimensions between partially dentate and edentulous subjects. Conclusions: Cross-sectional imaging can be used in order to obtain more accurate information on the morphology, variation, and the amount of maxillary bone adjacent to the maxillary sinus.

  14. Diagnostic value of thallium-201 myocardial perfusion IQ-SPECT without and with computed tomography-based attenuation correction to predict clinically significant and insignificant fractional flow reserve

    Science.gov (United States)

    Tanaka, Haruki; Takahashi, Teruyuki; Ohashi, Norihiko; Tanaka, Koichi; Okada, Takenori; Kihara, Yasuki

    2017-01-01

    The aim of this study was to clarify the predictive value of fractional flow reserve (FFR) determined by myocardial perfusion imaging (MPI) using thallium (Tl)-201 IQ-SPECT without and with computed tomography-based attenuation correction (CT-AC) for patients with stable coronary artery disease (CAD). We assessed 212 angiographically identified diseased vessels using adenosine-stress Tl-201 MPI-IQ-SPECT/CT in 84 consecutive, prospectively identified patients with stable CAD. We compared the FFR in 136 of the 212 diseased vessels with visual semiquantitative interpretations of the corresponding territories on MPI-IQ-SPECT images without and with CT-AC. FFR inversely correlated most accurately with regional summed difference scores (rSDS) in images both without and with CT-AC (r = -0.584 and r = -0.568, respectively). Tl-201 MPI-IQ-SPECT can predict FFR at an optimal cut-off of <0.80, and we propose a novel application of CT-AC to MPI-IQ-SPECT for predicting clinically significant and insignificant FFR even in nonobese patients. PMID:29390486

  15. Single-energy computed tomography-based pulmonary perfusion imaging: Proof-of-principle in a canine model

    Energy Technology Data Exchange (ETDEWEB)

    Yamamoto, Tokihiro, E-mail: toyamamoto@ucdavis.edu [Department of Radiation Oncology, University of California Davis School of Medicine, Sacramento, California 95817 (United States); Kent, Michael S.; Wisner, Erik R. [Department of Surgical and Radiological Sciences, University of California Davis School of Veterinary Medicine, Davis, California 95616 (United States); Johnson, Lynelle R.; Stern, Joshua A. [Department of Medicine and Epidemiology, University of California Davis School of Veterinary Medicine, Davis, California 95616 (United States); Qi, Lihong [Department of Public Health Sciences, University of California Davis, Davis, California 95616 (United States); Fujita, Yukio [Department of Radiation Oncology, Tokai University, Isehara, Kanagawa 259-1193 (Japan); Boone, John M. [Department of Radiology, University of California Davis School of Medicine, Sacramento, California 95817 (United States)

    2016-07-15

    Purpose: Radiotherapy (RT) that selectively avoids irradiating highly functional lung regions may reduce pulmonary toxicity, which is substantial in lung cancer RT. Single-energy computed tomography (CT) pulmonary perfusion imaging has several advantages (e.g., higher resolution) over other modalities and has great potential for widespread clinical implementation, particularly in RT. The purpose of this study was to establish proof-of-principle for single-energy CT perfusion imaging. Methods: Single-energy CT perfusion imaging is based on the following: (1) acquisition of end-inspiratory breath-hold CT scans before and after intravenous injection of iodinated contrast agents, (2) deformable image registration (DIR) for spatial mapping of those two CT image data sets, and (3) subtraction of the precontrast image data set from the postcontrast image data set, yielding a map of regional Hounsfield unit (HU) enhancement, a surrogate for regional perfusion. In a protocol approved by the institutional animal care and use committee, the authors acquired CT scans in the prone position for a total of 14 anesthetized canines (seven canines with normal lungs and seven canines with diseased lungs). The elastix algorithm was used for DIR. The accuracy of DIR was evaluated based on the target registration error (TRE) of 50 anatomic pulmonary landmarks per subject for 10 randomly selected subjects as well as on singularities (i.e., regions where the displacement vector field is not bijective). Prior to perfusion computation, HUs of the precontrast end-inspiratory image were corrected for variation in the lung inflation level between the precontrast and postcontrast end-inspiratory CT scans, using a model built from two additional precontrast CT scans at end-expiration and midinspiration. The authors also assessed spatial heterogeneity and gravitationally directed gradients of regional perfusion for normal lung subjects and diseased lung subjects using a two-sample two-tailed t-test.
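
    The subtraction step that yields the HU-enhancement perfusion surrogate (steps 2-3 above) can be sketched generically as follows, assuming the precontrast volume has already been deformably registered to the postcontrast breath-hold scan and that a binary lung mask is available. The normalization choice and variable names are illustrative assumptions, not taken from the paper.

```python
# Sketch of the subtraction step only: regional HU enhancement inside the lungs,
# normalized to the lung-mean enhancement so maps from different subjects are
# roughly comparable (an illustrative choice).
import numpy as np

def perfusion_surrogate(post_hu, pre_hu_registered, lung_mask):
    """Regional HU enhancement as a surrogate for regional perfusion."""
    enhancement = np.where(lung_mask, post_hu - pre_hu_registered, np.nan)
    mean_enh = np.nanmean(enhancement)
    return enhancement / mean_enh if mean_enh else enhancement

# Toy example: a small "well-perfused" core inside a uniform lung
post = np.full((4, 4, 4), -700.0); post[1:3, 1:3, 1:3] = -650.0
pre = np.full((4, 4, 4), -720.0)
mask = np.ones_like(post, dtype=bool)
print(np.nanmean(perfusion_surrogate(post, pre, mask)))   # ~1.0 by construction
```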

  16. Single-energy computed tomography-based pulmonary perfusion imaging: Proof-of-principle in a canine model.

    Science.gov (United States)

    Yamamoto, Tokihiro; Kent, Michael S; Wisner, Erik R; Johnson, Lynelle R; Stern, Joshua A; Qi, Lihong; Fujita, Yukio; Boone, John M

    2016-07-01

    Radiotherapy (RT) that selectively avoids irradiating highly functional lung regions may reduce pulmonary toxicity, which is substantial in lung cancer RT. Single-energy computed tomography (CT) pulmonary perfusion imaging has several advantages (e.g., higher resolution) over other modalities and has great potential for widespread clinical implementation, particularly in RT. The purpose of this study was to establish proof-of-principle for single-energy CT perfusion imaging. Single-energy CT perfusion imaging is based on the following: (1) acquisition of end-inspiratory breath-hold CT scans before and after intravenous injection of iodinated contrast agents, (2) deformable image registration (DIR) for spatial mapping of those two CT image data sets, and (3) subtraction of the precontrast image data set from the postcontrast image data set, yielding a map of regional Hounsfield unit (HU) enhancement, a surrogate for regional perfusion. In a protocol approved by the institutional animal care and use committee, the authors acquired CT scans in the prone position for a total of 14 anesthetized canines (seven canines with normal lungs and seven canines with diseased lungs). The elastix algorithm was used for DIR. The accuracy of DIR was evaluated based on the target registration error (TRE) of 50 anatomic pulmonary landmarks per subject for 10 randomly selected subjects as well as on singularities (i.e., regions where the displacement vector field is not bijective). Prior to perfusion computation, HUs of the precontrast end-inspiratory image were corrected for variation in the lung inflation level between the precontrast and postcontrast end-inspiratory CT scans, using a model built from two additional precontrast CT scans at end-expiration and midinspiration. The authors also assessed spatial heterogeneity and gravitationally directed gradients of regional perfusion for normal lung subjects and diseased lung subjects using a two-sample two-tailed t-test. The mean TRE

  17. Single-energy computed tomography-based pulmonary perfusion imaging: Proof-of-principle in a canine model

    International Nuclear Information System (INIS)

    Yamamoto, Tokihiro; Kent, Michael S.; Wisner, Erik R.; Johnson, Lynelle R.; Stern, Joshua A.; Qi, Lihong; Fujita, Yukio; Boone, John M.

    2016-01-01

    Purpose: Radiotherapy (RT) that selectively avoids irradiating highly functional lung regions may reduce pulmonary toxicity, which is substantial in lung cancer RT. Single-energy computed tomography (CT) pulmonary perfusion imaging has several advantages (e.g., higher resolution) over other modalities and has great potential for widespread clinical implementation, particularly in RT. The purpose of this study was to establish proof-of-principle for single-energy CT perfusion imaging. Methods: Single-energy CT perfusion imaging is based on the following: (1) acquisition of end-inspiratory breath-hold CT scans before and after intravenous injection of iodinated contrast agents, (2) deformable image registration (DIR) for spatial mapping of those two CT image data sets, and (3) subtraction of the precontrast image data set from the postcontrast image data set, yielding a map of regional Hounsfield unit (HU) enhancement, a surrogate for regional perfusion. In a protocol approved by the institutional animal care and use committee, the authors acquired CT scans in the prone position for a total of 14 anesthetized canines (seven canines with normal lungs and seven canines with diseased lungs). The elastix algorithm was used for DIR. The accuracy of DIR was evaluated based on the target registration error (TRE) of 50 anatomic pulmonary landmarks per subject for 10 randomly selected subjects as well as on singularities (i.e., regions where the displacement vector field is not bijective). Prior to perfusion computation, HUs of the precontrast end-inspiratory image were corrected for variation in the lung inflation level between the precontrast and postcontrast end-inspiratory CT scans, using a model built from two additional precontrast CT scans at end-expiration and midinspiration. The authors also assessed spatial heterogeneity and gravitationally directed gradients of regional perfusion for normal lung subjects and diseased lung subjects using a two-sample two-tailed t-test.

  18. Efficient approach for determining four-dimensional computed tomography-based internal target volume in stereotactic radiotherapy of lung cancer

    International Nuclear Information System (INIS)

    Yeo, Seung Gu; Kim, Eun Seog

    2013-01-01

    This study aimed to investigate efficient approaches for determining the internal target volume (ITV) from four-dimensional computed tomography (4D CT) images used in stereotactic body radiotherapy (SBRT) for patients with early-stage non-small cell lung cancer (NSCLC). 4D CT images were analyzed for 15 patients who received SBRT for stage I NSCLC. Three different ITVs were determined as follows: combining the clinical target volume (CTV) from all 10 respiratory phases (ITV_10Phases); combining the CTV from four respiratory phases, including two extreme phases (0% and 50%) plus two intermediate phases (20% and 70%) (ITV_4Phases); and combining the CTV from the two extreme phases (ITV_2Phases). The matching index (MI) of ITV_4Phases and ITV_2Phases was defined as the ratio of ITV_4Phases and ITV_2Phases, respectively, to ITV_10Phases. The tumor motion index (TMI) was defined as the ratio of ITV_10Phases to CTV_mean, the mean of the 10 CTVs delineated on the 10 respiratory phases. The ITVs were significantly different, in the order ITV_10Phases, ITV_4Phases, and ITV_2Phases. The MI of ITV_4Phases was significantly higher than that of ITV_2Phases, and the MI of ITV_4Phases was inversely related to TMI (r = -0.569, p = 0.034). In a subgroup with low TMI (n = 7), ITV_4Phases was not statistically different from ITV_10Phases (p = 0.192) and its MI was significantly higher than that of ITV_2Phases (p = 0.016). ITV_4Phases may be an efficient alternative to the optimal ITV_10Phases approach in SBRT for early-stage NSCLC with less tumor motion.
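
    A hedged sketch of how these phase-combination volumes and indices could be computed from per-phase binary CTV masks is given below. The phase selection (0%, 20%, 50%, 70% versus extremes only) follows the abstract; the data layout and helper names are illustrative assumptions, not the study's software.

```python
# Illustrative computation of ITV_10Phases, ITV_4Phases, ITV_2Phases, MI, and TMI
# from per-phase boolean CTV masks (one 3D array per respiratory phase).
import numpy as np

def itv_metrics(ctv_by_phase: dict, voxel_cc: float) -> dict:
    """ctv_by_phase maps phase label (0, 10, ..., 90) to a boolean mask."""
    def union_volume(phases):
        union = np.logical_or.reduce([ctv_by_phase[p] for p in phases])
        return union.sum() * voxel_cc

    itv10 = union_volume(range(0, 100, 10))        # all 10 phases
    itv4 = union_volume([0, 20, 50, 70])           # 2 extreme + 2 intermediate phases
    itv2 = union_volume([0, 50])                   # extreme phases only
    ctv_mean = np.mean([m.sum() * voxel_cc for m in ctv_by_phase.values()])

    return {
        "ITV_10Phases_cc": itv10,
        "MI_4": itv4 / itv10,                      # matching index of the 4-phase ITV
        "MI_2": itv2 / itv10,
        "TMI": itv10 / ctv_mean,                   # tumor motion index
    }
```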

  19. Total body height estimation using sacrum height in Anatolian Caucasians: multidetector computed tomography-based virtual anthropometry

    Energy Technology Data Exchange (ETDEWEB)

    Karakas, Hakki Muammer [Inonu University Medical Faculty, Turgut Ozal Medical Center, Department of Radiology, Malatya (Turkey); Celbis, Osman [Inonu University Medical Faculty Turgut Ozal Medical Center, Department of Forensic Medicine, Malatya (Turkey); Harma, Ahmet [Inonu University Medical Faculty Turgut Ozal Medical Center, Department of Orthopaedics and Traumatology, Malatya (Turkey); Alicioglu, Banu [Trakya University Medical Faculty, Department of Radiology, Edirne (Turkey); Trakya University Health Sciences Institute, Department of Anatomy, Edirne (Turkey)

    2011-05-15

    Estimation of total body height is a major step when a subject has to be identified from his/her skeletal structures. In the presence of decomposed skeletons and missing bones, estimation is usually based on regression equations for intact long bones. If these bones are fragmented or missing, alternative structures must be used. In this study, the value of sacrum height (SH) in total body height (TBH) estimation was investigated in a contemporary population of adult Anatolian Caucasians. Sixty-six men (41.6 ± 14.9 years) and 43 women (41.1 ± 14.2 years) were scanned with 64-row multidetector computed tomography (MDCT) to obtain high-resolution anthropometric data. SH of midsagittal sections was electronically measured. The technique and methodology were validated on a standard skeletal model. Sacrum height was 111.2 ± 12.6 mm (77-138 mm) in men and 104.7 ± 8.2 mm (89-125 mm) in women. The difference between the two sexes regarding SH was significant (p < 0.0001). SH did not significantly correlate with age in men, whereas the correlation was significant in women (p < 0.03). The correlation between SH and the stature was significant in men (r = 0.427, p < 0.0001) and insignificant in women. For men the regression equation was [Stature = (0.306 x SH) + 137.9] (r = 0.54, SEE = 56.9, p < 0.0001). Sacrum height is not susceptible to sex, or to age in men. In the presence of incomplete male skeletons, SH helps to determine the stature. This study is also one of the initial applications of MDCT in virtual anthropometric research. (orig.)
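
    As a worked example, the quoted male regression equation can be applied directly. The abstract does not state the output unit explicitly; with SH entered in millimetres the equation returns about 171.9 for the mean male SH, which is consistent with a stature in centimetres, but that unit interpretation is an assumption.

```python
# Worked example of the male regression equation quoted above:
# Stature = 0.306 x SH + 137.9, with SEE = 56.9 as reported.
# SH is entered in millimetres; the output unit (likely cm) is an assumption.
def estimate_stature(sacrum_height_mm: float) -> float:
    return 0.306 * sacrum_height_mm + 137.9

print(estimate_stature(111.2))   # mean male sacrum height from the study -> ~171.9
```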

  20. Morphological characteristics of the posterior malleolar fragment according to ankle fracture patterns: a computed tomography-based study.

    Science.gov (United States)

    Yi, Young; Chun, Dong-Il; Won, Sung Hun; Park, Suyeon; Lee, Sanghyeon; Cho, Jaeho

    2018-02-13

    The posterior malleolar fragment (PMF) of an ankle fracture can have various shapes depending on the injury mechanism. The purpose of this study was to evaluate the morphological characteristics of the PMF according to the ankle fracture pattern described in the Lauge-Hansen classification by using computed tomography (CT) images. We retrospectively analyzed CT data of 107 patients (107 ankles) who underwent surgery for trimalleolar fracture from January 2012 to December 2014. The patients were divided into two groups: 76 ankles in the supination-external rotation (SER) stage IV group and 31 ankles in the pronation-external rotation (PER) stage IV group. The PMF type of the two groups was assessed using the Haraguchi and Jan Bartonicek classification. The cross angle (α), fragment length ratio (FLR), fragment area ratio (FAR), sagittal angle (θ), and fragment height (FH) were measured to assess the morphological characteristics of the PMF. The PMF in the SER group mainly had a posterolateral shape, whereas that in the PER group mainly had a posteromedial two-part shape or a large posterolateral triangular shape (P = 0.02). The average cross angle was not significantly different between the two groups (SER group = 19.4°, PER group = 17.6°). The mean FLR and FH were significantly larger in the PER group than in the SER group (P = 0.024, P = 0.006). The mean fragment sagittal angle in the PER group was significantly smaller than that in the SER group (P = 0.017). With regard to the articular involvement, volume, and vertical nature, the SER-type fracture tends to have a smaller fragment due to the rotational force, whereas the PER-type fracture tends to have a larger fragment due to the combination of rotational and axial forces.

  1. Total body height estimation using sacrum height in Anatolian Caucasians: multidetector computed tomography-based virtual anthropometry

    International Nuclear Information System (INIS)

    Karakas, Hakki Muammer; Celbis, Osman; Harma, Ahmet; Alicioglu, Banu

    2011-01-01

    Estimation of total body height is a major step when a subject has to be identified from his/her skeletal structures. In the presence of decomposed skeletons and missing bones, estimation is usually based on regression equations for intact long bones. If these bones are fragmented or missing, alternative structures must be used. In this study, the value of sacrum height (SH) in total body height (TBH) estimation was investigated in a contemporary population of adult Anatolian Caucasians. Sixty-six men (41.6 ± 14.9 years) and 43 women (41.1 ± 14.2 years) were scanned with 64-row multidetector computed tomography (MDCT) to obtain high-resolution anthropometric data. SH of midsagittal sections was electronically measured. The technique and methodology were validated on a standard skeletal model. Sacrum height was 111.2 ± 12.6 mm (77-138 mm) in men and 104.7 ± 8.2 mm (89-125 mm) in women. The difference between the two sexes regarding SH was significant (p < 0.0001). SH did not significantly correlate with age in men, whereas the correlation was significant in women (p < 0.03). The correlation between SH and the stature was significant in men (r = 0.427, p < 0.0001) and was insignificant in women. For men the regression equation was [Stature = (0.306 x SH) + 137.9] (r = 0.54, SEE = 56.9, p < 0.0001). Sacrum height is not susceptible to sex, or to age in men. In the presence of incomplete male skeletons, SH helps to determine the stature. This study is also one of the initial applications of MDCT in virtual anthropometric research. (orig.)

  2. Cone Beam Computed Tomography-based Evaluation of the Anterior Teeth Position Changes obtained by Passive Self-ligating Brackets.

    Science.gov (United States)

    Rhoden, Fernando K; Maltagliati, Liliana Á; de Castro Ferreira Conti, Ana C; Almeida-Pedrin, Renata R; Filho, Leopoldino C; de Almeida Cardoso, Maurício

    2016-08-01

    The objective of this study was to evaluate the anterior teeth position changes obtained by passive self-ligating brackets using cone beam computed tomography (CBCT). Twenty patients with a mean age of 16.5 years, Class I malocclusion, constricted maxillary arch, and tooth crowding above 5 mm were enrolled in this study and treated with passive orthodontic self-ligating brackets. A sequence of stainless steel thermoset wires was used, ending with a 0.019" × 0.025" wire. The CBCT scans and dental casts were obtained prior to the installation of the orthodontic appliances (T1) and 30 days after installation of the rectangular 0.019" × 0.025" steel wire (T2). The measurements in CBCT were performed with the Anatomage software, and the dental casts were evaluated with a digital caliper with an accuracy of 0.01 mm. The CBCT data demonstrated mean buccal inclination of the upper and lower central incisors of 6.55° and 7.24°, respectively. The upper and lower lateral incisors showed 4.90° and 8.72°, respectively. The lower canines showed an average increase of 3.88° in buccal inclination and 1.96 mm in the transverse intercuspal distance. The upper canines showed a negative inclination with a mean of -0.36° and an average increase of 0.82 mm in the transverse distance, with a negative correlation with the initial crowding. Treatment with passive self-ligating brackets without obtaining spaces increases the buccal inclination of the upper and lower incisors with no correlation with the amount of initial tooth crowding. The intercanine distance tends to increase slightly, with different inclinations between the arches. With self-ligating brackets, the amount of initial dental crowding is not a limiting factor for the increase in buccal inclination of the anterior teeth.

  3. Energy-angle correlation correction algorithm for monochromatic computed tomography based on Thomson scattering X-ray source

    Science.gov (United States)

    Chi, Zhijun; Du, Yingchao; Huang, Wenhui; Tang, Chuanxiang

    2017-12-01

    The necessity for compact and relatively low cost x-ray sources with monochromaticity, continuous tunability of x-ray energy, high spatial coherence, straightforward polarization control, and high brightness has led to the rapid development of Thomson scattering x-ray sources. To meet the requirement of in-situ monochromatic computed tomography (CT) for large-scale and/or high-attenuation materials based on this type of x-ray source, there is an increasing demand for effective algorithms to correct the energy-angle correlation. In this paper, we take advantage of the parametrization of the x-ray attenuation coefficient to resolve this problem. The linear attenuation coefficient of a material can be decomposed into a linear combination of the energy-dependent photoelectric and Compton cross-sections in the keV energy regime without K-edge discontinuities, and the line integrals of the decomposition coefficients of the above two parts can be determined by performing two spectrally different measurements. After that, the line integral of the linear attenuation coefficient of an imaging object at a certain interested energy can be derived through the above parametrization formula, and monochromatic CT can be reconstructed at this energy using traditional reconstruction methods, e.g., filtered back projection or algebraic reconstruction technique. Not only can monochromatic CT be realized, but also the distributions of the effective atomic number and electron density of the imaging object can be retrieved at the expense of dual-energy CT scan. Simulation results validate our proposal and will be shown in this paper. Our results will further expand the scope of application for Thomson scattering x-ray sources.
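
    The two-basis decomposition described here can be illustrated in a simplified monoenergetic form. The sketch below captures only the algebra of the method; real implementations must integrate over the measured spectra, and the basis functions, acquisition energies, and scaling used here are illustrative assumptions.

```python
# Simplified two-basis (photoelectric + Compton) decomposition per ray:
# p(E) = a1 * f_pe(E) + a2 * f_kn(E); two measurements give a 2x2 system whose
# solution (a1, a2) lets a monochromatic projection be synthesised at any energy.
import numpy as np

def f_pe(E_keV):
    """Approximate photoelectric energy dependence."""
    return E_keV ** -3.0

def f_kn(E_keV):
    """Klein-Nishina total cross-section shape (arbitrary scale)."""
    a = E_keV / 511.0
    return ((1 + a) / a**2) * (2 * (1 + a) / (1 + 2 * a) - np.log(1 + 2 * a) / a) \
           + np.log(1 + 2 * a) / (2 * a) - (1 + 3 * a) / (1 + 2 * a) ** 2

def decompose(p_low, p_high, E_low=40.0, E_high=80.0):
    """p_low, p_high: line integrals of mu measured at the two energies."""
    A = np.array([[f_pe(E_low),  f_kn(E_low)],
                  [f_pe(E_high), f_kn(E_high)]])
    a1, a2 = np.linalg.solve(A, np.stack([p_low, p_high]))
    return a1, a2

def monochromatic_projection(a1, a2, E_keV):
    """Synthesised line integral at the requested energy."""
    return a1 * f_pe(E_keV) + a2 * f_kn(E_keV)
```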

  4. A computed tomography-based spatial normalization for the analysis of [18F] fluorodeoxyglucose positron emission tomography of the brain.

    Science.gov (United States)

    Cho, Hanna; Kim, Jin Su; Choi, Jae Yong; Ryu, Young Hoon; Lyoo, Chul Hyoung

    2014-01-01

    We developed a new computed tomography (CT)-based spatial normalization method and CT template to demonstrate its usefulness in spatial normalization of positron emission tomography (PET) images with [(18)F] fluorodeoxyglucose (FDG) PET studies in healthy controls. Seventy healthy controls underwent brain CT scan (120 KeV, 180 mAs, and 3 mm of thickness) and [(18)F] FDG PET scans using a PET/CT scanner. T1-weighted magnetic resonance (MR) images were acquired for all subjects. By averaging skull-stripped and spatially-normalized MR and CT images, we created skull-stripped MR and CT templates for spatial normalization. The skull-stripped MR and CT images were spatially normalized to each structural template. PET images were spatially normalized by applying spatial transformation parameters to normalize skull-stripped MR and CT images. A conventional perfusion PET template was used for PET-based spatial normalization. Regional standardized uptake values (SUV) measured by overlaying the template volume of interest (VOI) were compared to those measured with FreeSurfer-generated VOI (FSVOI). All three spatial normalization methods underestimated regional SUV values by 0.3-20% compared to those measured with FSVOI. The CT-based method showed slightly greater underestimation bias. Regional SUV values derived from all three spatial normalization methods were correlated significantly (p normalization may be an alternative method for structure-based spatial normalization of [(18)F] FDG PET when MR imaging is unavailable. Therefore, it is useful for PET/CT studies with various radiotracers whose uptake is expected to be limited to specific brain regions or highly variable within study population.

  5. Investigation of four-dimensional computed tomography-based pulmonary ventilation imaging in patients with emphysematous lung regions

    Energy Technology Data Exchange (ETDEWEB)

    Yamamoto, Tokihiro; Loo, Billy W Jr; Keall, Paul J [Department of Radiation Oncology, Stanford University School of Medicine, 875 Blake Wilbur Dr, Stanford, CA 94305-5847 (United States); Kabus, Sven; Lorenz, Cristian; Von Berg, Jens; Blaffert, Thomas [Department of Digital Imaging, Philips Research Europe, Roentgenstrasse 24-26, D-22335 Hamburg (Germany); Klinder, Tobias, E-mail: Tokihiro@stanford.edu [Clinical Informatics, Interventional, and Translational Solutions, Philips Research North America, Briarcliff Manor, NY 10510 (United States)

    2011-04-07

    A pulmonary ventilation imaging technique based on four-dimensional (4D) computed tomography (CT) has advantages over existing techniques. However, physiologically accurate 4D-CT ventilation imaging has not been achieved in patients. The purpose of this study was to evaluate 4D-CT ventilation imaging by correlating ventilation with emphysema. Emphysematous lung regions are less ventilated and can be used as surrogates for low ventilation. We tested the hypothesis: 4D-CT ventilation in emphysematous lung regions is significantly lower than in non-emphysematous regions. Four-dimensional CT ventilation images were created for 12 patients with emphysematous lung regions as observed on CT, using a total of four combinations of two deformable image registration (DIR) algorithms: surface-based (DIR_sur) and volumetric (DIR_vol), and two metrics: Hounsfield unit (HU) change (V_HU) and Jacobian determinant of deformation (V_Jac), yielding four ventilation image sets per patient. Emphysematous lung regions were detected by density masking. We tested our hypothesis using the one-tailed t-test. Visually, different DIR algorithms and metrics yielded spatially variant 4D-CT ventilation images. The mean ventilation values in emphysematous lung regions were consistently lower than in non-emphysematous regions for all the combinations of DIR algorithms and metrics. V_HU resulted in statistically significant differences for both DIR_sur (0.14 ± 0.14 versus 0.29 ± 0.16, p = 0.01) and DIR_vol (0.13 ± 0.13 versus 0.27 ± 0.15, p < 0.01). However, V_Jac resulted in non-significant differences for both DIR_sur (0.15 ± 0.07 versus 0.17 ± 0.08, p = 0.20) and DIR_vol (0.17 ± 0.08 versus 0.19 ± 0.09, p = 0.30). This study demonstrated the strong correlation between the HU-based 4D-CT ventilation and emphysema, which indicates the potential for HU-based 4D-CT ventilation imaging to achieve high physiologic accuracy.
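
    Of the two metrics named above, the Jacobian-based one (V_Jac) can be sketched generically from a DIR displacement field; the code below is an illustrative computation of local volume change, not the exact metric definition used in the paper, and the voxel-ordering and spacing conventions are assumptions.

```python
# Illustrative Jacobian-determinant ventilation surrogate: given a displacement
# vector field (DVF) from exhale to inhale, local volume change is det(I + grad u) - 1.
import numpy as np

def jacobian_ventilation(dvf: np.ndarray, spacing=(1.0, 1.0, 1.0)) -> np.ndarray:
    """dvf: array of shape (3, Z, Y, X) with displacements in mm, ordered (z, y, x)."""
    grads = np.empty((3, 3) + dvf.shape[1:])
    for i in range(3):                                   # displacement component u_i
        g = np.gradient(dvf[i], *spacing)                # derivatives along z, y, x
        for j in range(3):
            grads[i, j] = g[j]
    jac = np.moveaxis(grads, (0, 1), (-2, -1)) + np.eye(3)   # J = I + grad(u), voxel-wise
    return np.linalg.det(jac) - 1.0                      # fractional volume change

# Toy DVF: uniform 5% expansion along z only
dvf = np.zeros((3, 20, 20, 20))
dvf[0] = 0.05 * np.arange(20)[:, None, None]
print(jacobian_ventilation(dvf).mean())                  # ~0.05
```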

  6. Micro-computed tomography-based phenotypic approaches in embryology: procedural artifacts on assessments of embryonic craniofacial growth and development

    Directory of Open Access Journals (Sweden)

    Logan C Cairine

    2010-02-01

    Background: Growing demand for three-dimensional (3D) digital images of embryos for purposes of phenotypic assessment drives implementation of new histological and imaging techniques. Among these, micro-computed tomography (μCT) has recently been utilized as an effective and practical method for generating images at resolutions permitting 3D quantitative analysis of gross morphological attributes of developing tissues and organs in embryonic mice. However, histological processing in preparation for μCT scanning induces changes in organ size and shape. Establishing normative expectations for experimentally induced changes in size and shape will be an important feature of 3D μCT-based phenotypic assessments, especially if quantifying differences in the values of those parameters between comparison sets of developing embryos is a primary aim. Toward that end, we assessed the nature and degree of morphological artifacts attending μCT scanning following use of common fixatives, using a two-dimensional (2D) landmark geometric morphometric approach to track the accumulation of distortions affecting the embryonic head from the native, uterine state through to fixation and subsequent scanning. Results: Bouin's fixation reduced average centroid sizes of embryonic mouse crania by approximately 30% and substantially altered the morphometric shape, as measured by the shift in Procrustes distance from the unfixed state, after the data were normalized for naturally occurring shape variation. Subsequent μCT scanning produced negligible changes in size but did appear to reduce or even reverse fixation-induced random shape changes. Mixtures of paraformaldehyde + glutaraldehyde reduced average centroid sizes by 2-3%. Changes in craniofacial shape progressively increased post-fixation. Conclusions: The degree to which artifacts are introduced in the generation of random craniofacial shape variation relates to the degree of specimen dehydration during the initial fixation.
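
    For readers unfamiliar with the two shape statistics reported here (centroid size and Procrustes distance), a generic 2D-landmark sketch is given below. It is not the authors' analysis pipeline; the alignment is a plain orthogonal Procrustes fit and the toy landmarks are invented for illustration.

```python
# Generic geometric-morphometrics helpers for 2D landmark configurations.
import numpy as np

def centroid_size(lm: np.ndarray) -> float:
    """Root sum of squared distances of landmarks to their centroid; lm: (n, 2)."""
    return float(np.sqrt(((lm - lm.mean(axis=0)) ** 2).sum()))

def procrustes_distance(lm1: np.ndarray, lm2: np.ndarray) -> float:
    """Shape distance after removing translation, scale, and rotation."""
    a = (lm1 - lm1.mean(0)) / centroid_size(lm1)
    b = (lm2 - lm2.mean(0)) / centroid_size(lm2)
    u, _, vt = np.linalg.svd(b.T @ a)          # optimal rotation of b onto a
    b_rot = b @ (u @ vt)
    return float(np.sqrt(((a - b_rot) ** 2).sum()))

# Toy example: a unit square versus a slightly sheared square
sq = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], float)
sheared = sq + np.array([[0, 0], [0.05, 0], [0.05, 0], [0, 0]])
print(centroid_size(sq), procrustes_distance(sq, sheared))
```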

  7. Comparison of Real-Time Intraoperative Ultrasound-Based Dosimetry With Postoperative Computed Tomography-Based Dosimetry for Prostate Brachytherapy

    International Nuclear Information System (INIS)

    Nag, Subir; Shi Peipei; Liu Bingren; Gupta, Nilendu; Bahnson, Robert R.; Wang, Jian Z.

    2008-01-01

    Purpose: To evaluate whether real-time intraoperative ultrasound (US)-based dosimetry can replace conventional postoperative computed tomography (CT)-based dosimetry in prostate brachytherapy. Methods and Materials: Between December 2001 and November 2002, 82 patients underwent 103Pd prostate brachytherapy. An interplant treatment planning system was used for real-time intraoperative transrectal US-guided treatment planning. The dose distribution was updated according to the estimated seed position to obtain the dose-volume histograms. Postoperative CT-based dosimetry was performed a few hours later using the Theraplan-Plus treatment planning system. The dosimetric parameters obtained from the two imaging modalities were compared. Results: The results of this study revealed correlations between the US- and CT-based dosimetry. However, large variations were found in the implant-quality parameters of the two modalities, including the doses covering 100%, 90%, and 80% of the prostate volume and prostate volumes covered by 100%, 150%, and 200% of the prescription dose. The mean relative difference was 38% and 16% for doses covering 100% and 90% of the prostate volume and 10% and 21% for prostate volumes covered by 100% and 150% of the prescription dose, respectively. The CT-based volume covered by 200% of the prescription dose was about 30% greater than the US-based one. Compared with CT-based dosimetry, US-based dosimetry significantly underestimated the dose to normal organs, especially for the rectum. The average US-based maximal dose and volume covered by 100% of the prescription dose for the rectum was 72 Gy and 0.01 cm³, respectively, much lower than the 159 Gy and 0.65 cm³ obtained using CT-based dosimetry. Conclusion: Although dosimetry using intraoperative US-based planning provides preliminary real-time information, it does not accurately reflect the postoperative CT-based dosimetry. Until studies have determined whether US-based dosimetry or
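
    The implant-quality parameters compared in this record (doses covering a given fraction of the prostate, and volumes covered by a given fraction of the prescription dose) can be computed from a dose grid and a structure mask. The sketch below is a generic illustration of D90 and V100; the prescription dose and voxel size shown are example values, not study data.

```python
# Hedged sketch of two DVH metrics from a 3D dose grid and a binary structure mask:
# D90 = dose covering 90% of the structure volume; V100 = volume receiving at
# least the prescription dose (absolute and percent).
import numpy as np

def d90_v100(dose_gy: np.ndarray, mask: np.ndarray,
             prescription_gy: float, voxel_cc: float) -> dict:
    doses = np.sort(dose_gy[mask])[::-1]          # descending doses inside the structure
    d90 = doses[int(0.90 * doses.size) - 1]       # minimum dose to the best-covered 90%
    v100_cc = np.count_nonzero(doses >= prescription_gy) * voxel_cc
    v100_pct = 100.0 * v100_cc / (doses.size * voxel_cc)
    return {"D90_Gy": float(d90), "V100_cc": v100_cc, "V100_pct": v100_pct}

# Toy example with an invented dose distribution and prostate mask
dose = np.random.normal(140, 30, size=(40, 40, 40)).clip(0)
prostate = np.zeros_like(dose, dtype=bool); prostate[10:30, 10:30, 10:30] = True
print(d90_v100(dose, prostate, prescription_gy=125.0, voxel_cc=0.001))
```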

  8. Micro-computed tomography-based phenotypic approaches in embryology: procedural artifacts on assessments of embryonic craniofacial growth and development.

    Science.gov (United States)

    Schmidt, Eric J; Parsons, Trish E; Jamniczky, Heather A; Gitelman, Julian; Trpkov, Cvett; Boughner, Julia C; Logan, C Cairine; Sensen, Christoph W; Hallgrímsson, Benedikt

    2010-02-17

    Growing demand for three dimensional (3D) digital images of embryos for purposes of phenotypic assessment drives implementation of new histological and imaging techniques. Among these micro-computed tomography (microCT) has recently been utilized as an effective and practical method for generating images at resolutions permitting 3D quantitative analysis of gross morphological attributes of developing tissues and organs in embryonic mice. However, histological processing in preparation for microCT scanning induces changes in organ size and shape. Establishing normative expectations for experimentally induced changes in size and shape will be an important feature of 3D microCT-based phenotypic assessments, especially if quantifying differences in the values of those parameters between comparison sets of developing embryos is a primary aim. Toward that end, we assessed the nature and degree of morphological artifacts attending microCT scanning following use of common fixatives, using a two dimensional (2D) landmark geometric morphometric approach to track the accumulation of distortions affecting the embryonic head from the native, uterine state through to fixation and subsequent scanning. Bouin's fixation reduced average centroid sizes of embryonic mouse crania by approximately 30% and substantially altered the morphometric shape, as measured by the shift in Procrustes distance, from the unfixed state, after the data were normalized for naturally occurring shape variation. Subsequent microCT scanning produced negligible changes in size but did appear to reduce or even reverse fixation-induced random shape changes. Mixtures of paraformaldehyde + glutaraldehyde reduced average centroid sizes by 2-3%. Changes in craniofacial shape progressively increased post-fixation. The degree to which artifacts are introduced in the generation of random craniofacial shape variation relates to the degree of specimen dehydration during the initial fixation. Fixation methods that

  9. Variabilities of Magnetic Resonance Imaging-, Computed Tomography-, and Positron Emission Tomography-Computed Tomography-Based Tumor and Lymph Node Delineations for Lung Cancer Radiation Therapy Planning.

    Science.gov (United States)

    Karki, Kishor; Saraiya, Siddharth; Hugo, Geoffrey D; Mukhopadhyay, Nitai; Jan, Nuzhat; Schuster, Jessica; Schutzer, Matthew; Fahrner, Lester; Groves, Robert; Olsen, Kathryn M; Ford, John C; Weiss, Elisabeth

    2017-09-01

    To investigate interobserver delineation variability for gross tumor volumes of primary lung tumors and associated pathologic lymph nodes using magnetic resonance imaging (MRI), and to compare the results with computed tomography (CT) alone- and positron emission tomography (PET)-CT-based delineations. Seven physicians delineated the tumor volumes of 10 patients for the following scenarios: (1) CT only, (2) PET-CT fusion images registered to CT ("clinical standard"), and (3) postcontrast T1-weighted MRI registered with diffusion-weighted MRI. To compute interobserver variability, the median surface was generated from all observers' contours and used as the reference surface. A physician labeled the interface types (tumor to lung, atelectasis (collapsed lung), hilum, mediastinum, or chest wall) on the median surface. Contoured volumes and bidirectional local distances between individual observers' contours and the reference contour were analyzed. Computed tomography- and MRI-based tumor volumes normalized relative to PET-CT-based volumes were 1.62 ± 0.76 (mean ± standard deviation) and 1.38 ± 0.44, respectively. Volume differences between the imaging modalities were not significant. Between observers, the mean normalized volumes per patient averaged over all patients varied significantly by a factor of 1.6 (MRI) and 2.0 (CT and PET-CT) (P = 4.10 × 10⁻⁵ to 3.82 × 10⁻⁹). The tumor-atelectasis interface had a significantly higher variability than other interfaces for all modalities combined (P=.0006). The interfaces with the smallest uncertainties were tumor-lung (on CT) and tumor-mediastinum (on PET-CT and MRI). Although MRI-based contouring showed overall larger variability than PET-CT, contouring variability depended on the interface type and was not significantly different between modalities, despite the limited observer experience with MRI. Multimodality imaging and combining different imaging characteristics might be the best approach to define

  10. Prospective, blinded trial of whole-body magnetic resonance imaging versus computed tomography positron emission tomography in staging primary and recurrent cancer of the head and neck.

    LENUS (Irish Health Repository)

    O'Neill, J P

    2012-02-01

    OBJECTIVES: To compare the use of computed tomography - positron emission tomography and whole-body magnetic resonance imaging for the staging of head and neck cancer. PATIENTS AND METHODS: From January to July 2009, 15 consecutive head and neck cancer patients (11 men and four women; mean age 59 years; age range 19 to 81 years) underwent computed tomography - positron emission tomography and whole-body magnetic resonance imaging for pre-therapeutic evaluation. All scans were staged, as per the American Joint Committee on Cancer tumour-node-metastasis classification, by two blinded consultant radiologists, in two sittings. Diagnoses were confirmed by histopathological examination of endoscopic biopsies, and in some cases whole surgical specimens. RESULTS: Tumour staging showed a 74 per cent concordance, node staging an 80 per cent concordance and metastasis staging a 100 per cent concordance, comparing the two imaging modalities. CONCLUSION: This study found radiological staging discordance between the two imaging modalities. Whole-body magnetic resonance imaging is an emerging staging modality with superior visualisation of metastatic disease, which does not require exposure to ionising radiation.

  11. Diagnostic accuracy of a volume-rendered computed tomography movie and other computed tomography-based imaging methods in assessment of renal vascular anatomy for laparoscopic donor nephrectomy.

    Science.gov (United States)

    Yamamoto, Shingo; Tanooka, Masao; Ando, Kumiko; Yamano, Toshiko; Ishikura, Reiichi; Nojima, Michio; Hirota, Shozo; Shima, Hiroki

    2009-12-01

    To evaluate the diagnostic accuracy of computed tomography (CT)-based imaging methods for assessing renal vascular anatomy, imaging studies, including standard axial CT, three-dimensional volume-rendered CT (3DVR-CT), and a 3DVR-CT movie, were performed on 30 patients who underwent laparoscopic donor nephrectomy (10 right side, 20 left side) to predict the location of the renal arteries and the renal, adrenal, gonadal, and lumbar veins. These findings were compared with videos obtained during the operation. Two of 37 renal arteries observed intraoperatively were missed by standard axial CT and 3DVR-CT, whereas all arteries were identified by the 3DVR-CT movie. Two of 36 renal veins were missed by standard axial CT and 3DVR-CT, whereas 1 was missed by the 3DVR-CT movie. In 20 left renal hilar anatomical structures, 20 adrenal, 20 gonadal, and 22 lumbar veins were observed during the operation. Preoperatively, the standard axial CT, 3DVR-CT, and 3DVR-CT movie detected 11, 19, and 20 adrenal veins; 13, 14, and 19 gonadal veins; and 6, 11, and 15 lumbar veins, respectively. Overall, of 135 renal vascular structures, the standard axial CT, 3DVR-CT, and 3DVR-CT movie accurately detected 99 (73.3%), 113 (83.7%), and 126 (93.3%) vessels, respectively, indicating that the 3DVR-CT movie had a significantly higher detection rate than the other CT-based imaging methods and is a useful tool for assessing renal vascular anatomy before laparoscopic donor nephrectomy.

  12. Diagnostic value of thallium-201 myocardial perfusion IQ-SPECT without and with computed tomography-based attenuation correction to predict clinically significant and insignificant fractional flow reserve: A single-center prospective study.

    Science.gov (United States)

    Tanaka, Haruki; Takahashi, Teruyuki; Ohashi, Norihiko; Tanaka, Koichi; Okada, Takenori; Kihara, Yasuki

    2017-12-01

    The aim of this study was to clarify the predictive value of fractional flow reserve (FFR) determined by myocardial perfusion imaging (MPI) using thallium (Tl)-201 IQ-SPECT without and with computed tomography-based attenuation correction (CT-AC) for patients with stable coronary artery disease (CAD). We assessed 212 angiographically identified diseased vessels using adenosine-stress Tl-201 MPI-IQ-SPECT/CT in 84 consecutive, prospectively identified patients with stable CAD. We compared the FFR in 136 of the 212 diseased vessels with visual semiquantitative interpretations of the corresponding territories on MPI-IQ-SPECT images without and with CT-AC. FFR inversely correlated most accurately with regional summed difference scores (rSDS) in images both without and with CT-AC (r = -0.584 and r = -0.568, respectively). Tl-201 MPI-IQ-SPECT can predict FFR at an optimal cut-off of <0.80, and we propose a novel application of CT-AC to MPI-IQ-SPECT for predicting clinically significant and insignificant FFR even in nonobese patients.

  13. The influence of secondary reconstruction slice thickness on NewTom 3G cone beam computed tomography-based radiological interpretation of sheep mandibular condyle fractures.

    Science.gov (United States)

    Sirin, Yigit; Guven, Koray; Horasan, Sinan; Sencan, Sabri; Bakir, Baris; Barut, Oya; Tanyel, Cem; Aral, Ali; Firat, Deniz

    2010-11-01

    The objective of this study was to examine the diagnostic accuracy of the different secondary reconstruction slice thicknesses of cone beam computed tomography (CBCT) on artificially created mandibular condyle fractures. A total of 63 sheep heads with or without condylar fractures were scanned with a NewTom 3G CBCT scanner. Multiplanar reformatted (MPR) views in 0.2-mm, 1-mm, 2-mm, and 3-mm secondary reconstruction slice thicknesses were evaluated by 7 observers. Inter- and intraobserver agreements were calculated with weighted kappa statistics. The receiver operating characteristic (ROC) curve analysis was used to statistically compare the area under the curve (AUC) of each slice thickness. The kappa coefficients varied from fair to excellent. The AUCs of 0.2-mm and 1-mm slice thicknesses were found to be significantly higher than those of 2 mm and 3 mm for some types of fractures. CBCT was found to be accurate in detecting all variants of fractures at 0.2 mm and 1 mm. However, 2-mm and 3-mm slices were not suitable to detect fissure, complete, and comminuted types of mandibular condyle fractures. Copyright © 2010 Mosby, Inc. All rights reserved.

  14. Short-term Reproducibility of Computed Tomography-based Lung Density Measurements in Alpha-1 Antitrypsin Deficiency and Smokers with Emphysema

    International Nuclear Information System (INIS)

    Shaker, S.B.; Dirksen, A.; Laursen, L.C.; Maltbaek, N.; Christensen, L.; Sander, U.; Seersholm, N.; Skovgaard, L.T.; Nielsen, L.; Kok-Jensen, A.

    2004-01-01

    Purpose: To study the short-term reproducibility of lung density measurements by multi-slice computed tomography (CT) using three different radiation doses and three reconstruction algorithms. Material and Methods: Twenty-five patients with smoker's emphysema and 25 patients with alpha-1-antitrypsin deficiency underwent 3 scans at 2-week intervals. A low-dose protocol was applied, and images were reconstructed with bone, detail, and soft algorithms. Total lung volume (TLV), 15th percentile density (PD-15), and relative area at -910 Hounsfield units (RA-910) were obtained from the images using Pulmo-CMS software. Reproducibility of PD-15 and RA-910 and the influence of radiation dose, reconstruction algorithm, and type of emphysema were then analysed. Results: The overall coefficient of variation of volume-adjusted PD-15 for all combinations of radiation dose and reconstruction algorithm was 3.7%. The overall standard deviation of volume-adjusted RA-910 was 1.7% (corresponding to a coefficient of variation of 6.8%). Radiation dose, reconstruction algorithm, and type of emphysema had no significant influence on the reproducibility of PD-15 and RA-910. However, the bone algorithm and very low radiation dose result in overestimation of the extent of emphysema. Conclusion: Lung density measurement by CT is a sensitive marker for quantitating both subtypes of emphysema. A CT protocol with a radiation dose down to 16 mAs and a soft or detail reconstruction algorithm is recommended.
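
    The two densitometric indices named above can be computed directly from the HU values of segmented lung voxels, as in the generic sketch below. The sponge-model volume adjustment used in the study is not reproduced here, and the toy HU distribution is invented for illustration.

```python
# PD-15: HU value below which 15% of lung voxels lie.
# RA-910: percentage of lung voxels below -910 HU.
import numpy as np

def lung_density_indices(lung_hu: np.ndarray) -> dict:
    hu = lung_hu.ravel()
    pd15 = np.percentile(hu, 15)                           # 15th percentile density (HU)
    ra910 = 100.0 * np.count_nonzero(hu < -910) / hu.size  # relative area below -910 HU
    return {"PD15_HU": float(pd15), "RA910_percent": ra910}

# Toy example: a mildly emphysematous HU distribution
hu = np.random.normal(-860, 40, size=200_000)
print(lung_density_indices(hu))
```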

  15. Cone-beam computed tomography-based diagnosis and treatment simulation for a patient with a protrusive profile and a gummy smile

    Science.gov (United States)

    Imamura, Toshihiro; Kokai, Satoshi; Ono, Takashi

    2018-01-01

    For patients with bimaxillary protrusion, significant retraction and intrusion of the anterior teeth are sometimes essential to improve the facial profile. However, severe root resorption of the maxillary incisors occasionally occurs after treatment because of various factors. For instance, it has been reported that approximation or invasion of the incisive canal by the anterior tooth roots during retraction may cause apical root damage. Thus, determination of the position of the maxillary incisors is key for orthodontic diagnosis and treatment planning in such cases. Cone-beam computed tomography (CBCT) may be useful for simulating the post-treatment position of the maxillary incisors and surrounding structures in order to ensure safe teeth movement. Here, we present a case of Class II malocclusion with bimaxillary protrusion, wherein apical root damage due to treatment was minimized by pretreatment evaluation of the anatomical structures and simulation of the maxillary central incisor movement using CBCT. Considerable retraction and intrusion of the maxillary incisors, which resulted in a significant improvement in the facial profile and smile, were achieved without severe root resorption. Our findings suggest that CBCT-based diagnosis and treatment simulation may facilitate safe and dynamic orthodontic tooth movement, particularly in patients requiring maximum anterior tooth retraction. PMID:29732305

  16. Separation of hepatic iron and fat by dual-source dual-energy computed tomography based on material decomposition: an animal study.

    Science.gov (United States)

    Ma, Jing; Song, Zhi-Qiang; Yan, Fu-Hua

    2014-01-01

    To explore the feasibility of dual-source dual-energy computed tomography (DSDECT) for hepatic iron and fat separation in vivo. All of the procedures in this study were approved by the Research Animal Resource Center of Shanghai Ruijin Hospital. Sixty rats that underwent DECT scanning were divided into the normal group, fatty liver group, liver iron group, and coexisting liver iron and fat group, according to Prussian blue and HE staining. The data for each group were reconstructed and post-processed by an iron-specific, three-material decomposition algorithm. The iron enhancement value and the virtual non-iron contrast (VNC) value, which indicated overloaded liver iron and residual liver tissue, respectively, were measured. Spearman's correlation and one-way analysis of variance (ANOVA) were performed, respectively, to analyze statistically the correlations with the histopathological results and the differences among groups. The iron enhancement values were positively correlated with the iron pathology grading (r = 0.729, p < 0.001). VNC values were negatively correlated with the fat pathology grading (r = -0.642, p < 0.0001). The groups showed significantly different iron enhancement values and VNC values (F = 25.308, p < 0.001; F = 10.911, p < 0.001, respectively). Among the groups, significant differences in iron enhancement values were only observed between the iron-present and iron-absent groups, and differences in VNC values were only observed between the fat-present and fat-absent groups. Separation of hepatic iron and fat by dual-energy material decomposition in vivo was feasible, even when they coexisted.

  17. Comparison of computed tomography based parametric and patient-specific finite element models of the healthy and metastatic spine using a mesh-morphing algorithm.

    Science.gov (United States)

    O'Reilly, Meaghan Anne; Whyne, Cari Marisa

    2008-08-01

    A comparative analysis of parametric and patient-specific finite element (FE) modeling of spinal motion segments. To develop patient-specific FE models of spinal motion segments using mesh-morphing methods applied to a parametric FE model. To compare strain and displacement patterns in parametric and morphed models for both healthy and metastatically involved vertebrae. Parametric FE models may be limited in their ability to fully represent patient-specific geometries and material property distributions. Generation of multiple patient-specific FE models has been limited because of computational expense. Morphing methods have been successfully used to generate multiple specimen-specific FE models of caudal rat vertebrae. FE models of a healthy and a metastatic T6-T8 spinal motion segment were analyzed with and without patient-specific material properties. Parametric and morphed models were compared using a landmark-based morphing algorithm. Morphing of the parametric FE model and including patient-specific material properties both had a strong impact on magnitudes and patterns of vertebral strain and displacement. Small but important geometric differences can be represented through morphing of parametric FE models. The mesh-morphing algorithm developed provides a rapid method for generating patient-specific FE models of spinal motion segments.

  18. A study on optimal scan conditions of big bore multi-slice computed tomography based on radiation dose and image noise

    International Nuclear Information System (INIS)

    Lee, J. S.; Ye, S. J.; Kim, E. H.

    2011-01-01

    The newly introduced Big Bore computed tomography (CT) scanner makes it possible to increase the tube current-time product (mAs) to compensate for image degradation due to the larger gantry opening, but without a sound guideline for doing so. The objective of this paper is to derive optimal scan conditions for the Big Bore CT scanner, mainly relating to the dose of diagnostic CT. The weighted CT dose index (CTDI_w) was estimated for five typical protocols: head and neck, brain, paediatric, chest, and abdomen. Noise was analysed in a circle of 1 or 2 cm diameter in the CT image slice. The results showed that the measured CTDI_w values generally follow the theoretical rule at all scanning conditions of every protocol. Although image noise decreases with increasing mAs, the analysed image noise follows the theoretical rule only in specific protocols. This phenomenon is presumed to result from the photon energy spectra arriving at the detection system of the Big Bore scanner. (authors)
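
    For reference, the weighted CT dose index used above is conventionally combined from pencil-chamber measurements in a standard dosimetry phantom with a 1/3-2/3 weighting of the central and peripheral readings. The helper below encodes that standard weighting; the numeric readings in the example are invented, not values from the study.

```python
# CTDI_w = (1/3) * CTDI_100,center + (2/3) * mean(CTDI_100,peripheral), values in mGy.
def ctdi_w(center_mgy: float, peripheral_mgy: list) -> float:
    return center_mgy / 3.0 + 2.0 * (sum(peripheral_mgy) / len(peripheral_mgy)) / 3.0

print(ctdi_w(10.0, [14.0, 13.5, 14.2, 13.8]))   # example phantom readings
```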

  19. Computed Tomography-Based Occipital Condyle Morphometry in an Indian Population to Assess the Feasibility of Condylar Screws for Occipitocervical Fusion.

    Science.gov (United States)

    Srivastava, Abhishek; Nanda, Geetanjali; Mahajan, Rajat; Nanda, Ankur; Mishra, Nirajana; Karmaran, Srinivasa; Batra, Sahil; Chhabra, Harvinder Singh

    2017-12-01

    A retrospective computed tomography (CT)-based morphometric study of 82 occipital condyles in the Indian population, focusing on critical morphometric dimensions related to the placement of condylar screws. This study focused on determining the feasibility of placing occipital condylar screws in an Indian population using CT anatomical morphometric data. The occipital condylar screw is a novel technique being explored as one of the options in occipitocervical stabilization. Sex and ethnic variations in anatomical structures may restrict the feasibility of this technique in some populations. To the best of our knowledge, there are no CT-based data on an Indian population that assess the feasibility of occipital condylar screws. We measured the dimensions of 82 occipital condyles in 41 adults on coronal, sagittal, and axial reconstructed CT images. Differences were noted between the right and left sides and also between males and females. Statistical analysis was performed using the t-test. The morphometric evaluation of the occipital condyle shows that condylar screws are anatomically feasible in a large portion of the Indian population. However, because a small proportion of the population may not be suitable for this technique, meticulous study of the preoperative anatomy using detailed CT data is advised.

  20. Customized Computed Tomography-Based Boost Volumes in Breast-Conserving Therapy: Use of Three-Dimensional Histologic Information for Clinical Target Volume Margins

    International Nuclear Information System (INIS)

    Hanbeukers, Bianca; Borger, Jacques; Ende, Piet van den; Ent, Fred van der; Houben, Ruud; Jager, Jos; Keymeulen, Kristien; Murrer, Lars; Sastrowijoto, Suprapto; Vijver, Koen van de; Boersma, Liesbeth

    2009-01-01

    Purpose: To determine the difference in size between computed tomography (CT)-based irradiated boost volumes and simulator-based irradiated volumes in patients treated with breast-conserving therapy, and to analyze whether the use of anisotropic three-dimensional clinical target volume (CTV) margins based on the histologically determined free resection margins allows for a significant reduction of the CT-based boost volumes. Patients and Methods: The CT data from 49 patients were used to delineate a planning target volume (PTV) with isotropic CTV margins and to delineate a PTV_sim that mimicked the PTV as delineated in the era of conventional simulation. For 17 patients, a PTV with anisotropic CTV margins was defined by applying customized three-dimensional CTV margins according to the free excision margins in six directions. Boost treatment plans consisted of conformal portals for the CT-based PTVs and rectangular fields for the PTV_sim. Results: The irradiated volume (volume receiving ≥95% of the prescribed dose [V95]) for the PTV with isotropic CTV margins was 1.6 times greater than that for the PTV_sim: 228 cm³ vs. 147 cm³. For the PTV with anisotropic CTV margins, the V95 was similar to that for the PTV_sim (190 cm³ vs. 162 cm³; p = NS). The main determinant for the irradiated volume was the size of the excision cavity (p < .001), which was mainly related to the interval between surgery and the planning CT scan (p = .029). Conclusion: CT-based PTVs with isotropic margins for the CTV yield much greater irradiated volumes than fluoroscopically based PTVs. Applying individualized anisotropic CTV margins allowed for a significant reduction of the irradiated boost volume.

  1. Survival outcomes after radiation therapy for stage III non-small-cell lung cancer after adoption of computed tomography-based simulation.

    Science.gov (United States)

    Chen, Aileen B; Neville, Bridget A; Sher, David J; Chen, Kun; Schrag, Deborah

    2011-06-10

    Technical studies suggest that computed tomography (CT)-based simulation improves the therapeutic ratio for thoracic radiation therapy (TRT), although few studies have evaluated its use or impact on outcomes. We used the Surveillance, Epidemiology and End Results (SEER)-Medicare linked data to identify CT-based simulation for TRT among Medicare beneficiaries diagnosed with stage III non-small-cell lung cancer (NSCLC) between 2000 and 2005. Demographic and clinical factors associated with use of CT simulation were identified, and the impact of CT simulation on survival was analyzed using Cox models and propensity score analysis. The proportion of patients treated with TRT who had CT simulation increased from 2.4% in 1994 to 34.0% in 2000 to 77.6% in 2005. Of the 5,540 patients treated with TRT from 2000 to 2005, 60.1% had CT simulation. Geographic variation was seen in rates of CT simulation, with lower rates in rural areas and in the South and West compared with the Northeast and Midwest. Patients treated with chemotherapy were more likely to have CT simulation (65.2% v 51.2%; adjusted odds ratio, 1.67; 95% CI, 1.48 to 1.88). Controlling for demographic and clinical characteristics, CT simulation was associated with a lower risk of death (adjusted hazard ratio, 0.77; 95% CI, 0.73 to 0.82) compared with treatment without CT simulation. CT-based simulation has been widely, although not uniformly, adopted for the treatment of stage III NSCLC and is associated with higher survival among patients receiving TRT.

  2. Computed Tomography Based Three-dimensional Measurements of Spine Shortening Distance After Posterior Three-column Osteotomies for the Treatment of Severe and Stiff Scoliosis.

    Science.gov (United States)

    Li, Xue-Shi; Huang, Zi-Fang; Deng, Yao-Long; Fan, Heng-Wei; Sui, Wen-Yuan; Wang, Chong-Wen; Yang, Jun-Lin

    2017-07-15

    Retrospective study. The aim was to measure and analyze the changes in three-dimensional (3D) distances of the spinal column and spinal canal at three-column osteotomy sites and to address their clinical and neurologic significance. Three-column osteotomies were developed to treat severe and stiff spine deformities, with insufficient understanding of the safe limit of spine shortening and of the relationship between the shortening distance of the spinal column and that of the spinal canal. Records of 52 consecutive patients with severe and stiff scoliosis treated with three-column spine osteotomies at our institution from July 2013 to June 2015 were reviewed. The preoperative spinal cord function classification was type A in 31 cases, type B in 10 cases, and type C in 11 cases. The types of osteotomies carried out were extended pedicle subtraction osteotomy in nine patients and posterior vertebral column resection in 43 patients. Multimodality neuromonitoring strategies were adopted intraoperatively. 3D pre- and postoperative spine models were reconstructed from the computed tomography (CT) scans. The shortening distances of the convex and concave spinal column and of the spinal canal were measured and analyzed. The spinal column shortening distance (SCSD) measured on the 3D models (27.8 mm) was statistically shorter than that measured intraoperatively (32.8 mm), and differed between patients with a column strut graft and those with bone-on-bone fusion. The shortening of the spinal column cannot represent that of the central spinal canal in patients with severe scoliosis. The spinal column shortening procedure in appropriately selected patient groups with bone-on-bone fusion is a viable option, with the CCSD being significantly shorter than the convex SCSD. Level of Evidence: 4.

  3. Accuracy of Cup Positioning With the Computed Tomography-Based Two-dimensional to Three-Dimensional Matched Navigation System: A Prospective, Randomized Controlled Study.

    Science.gov (United States)

    Yamada, Kazuki; Endo, Hirosuke; Tetsunaga, Tomonori; Miyake, Takamasa; Sanki, Tomoaki; Ozaki, Toshifumi

    2018-01-01

    The accuracy of various navigation systems used for total hip arthroplasty has been described, but no publications reported the accuracy of cup orientation in computed tomography (CT)-based 2D-3D (two-dimensional to three-dimensional) matched navigation. In a prospective, randomized controlled study, 80 hips including 44 with developmental dysplasia of the hips were divided into a CT-based 2D-3D matched navigation group (2D-3D group) and a paired-point matched navigation group (PPM group). The accuracy of cup orientation (absolute difference between the intraoperative record and the postoperative measurement) was compared between groups. Additionally, multiple logistic regression analysis was performed to evaluate patient factors affecting the accuracy of cup orientation in each navigation. The accuracy of cup inclination was 2.5° ± 2.2° in the 2D-3D group and 4.6° ± 3.3° in the PPM group (P = .0016). The accuracy of cup anteversion was 2.3° ± 1.7° in the 2D-3D group and 4.4° ± 3.3° in the PPM group (P = .0009). In the PPM group, the presence of roof osteophytes decreased the accuracy of cup inclination (odds ratio 8.27, P = .0140) and the absolute value of pelvic tilt had a negative influence on the accuracy of cup anteversion (odds ratio 1.27, P = .0222). In the 2D-3D group, patient factors had no effect on the accuracy of cup orientation. The accuracy of cup positioning in CT-based 2D-3D matched navigation was better than in paired-point matched navigation, and was not affected by patient factors. It is a useful system for even severely deformed pelvises such as developmental dysplasia of the hips. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Separation of hepatic iron and fat by dual-source dual-energy computed tomography based on material decomposition: an animal study.

    Directory of Open Access Journals (Sweden)

    Jing Ma

    Full Text Available OBJECTIVE: To explore the feasibility of dual-source dual-energy computed tomography (DSDECT) for hepatic iron and fat separation in vivo. MATERIALS AND METHODS: All of the procedures in this study were approved by the Research Animal Resource Center of Shanghai Ruijin Hospital. Sixty rats that underwent DECT scanning were divided into the normal group, fatty liver group, liver iron group, and coexisting liver iron and fat group, according to Prussian blue and HE staining. The data for each group were reconstructed and post-processed by an iron-specific, three-material decomposition algorithm. The iron enhancement value and the virtual non-iron contrast value, which indicated overloaded liver iron and residual liver tissue, respectively, were measured. Spearman's correlation and one-way analysis of variance (ANOVA) were performed, respectively, to analyze statistically the correlations with the histopathological results and the differences among groups. RESULTS: The iron enhancement values were positively correlated with the iron pathology grading (r = 0.729, p < 0.001). Virtual non-iron contrast (VNC) values were negatively correlated with the fat pathology grading (r = -0.642, p < 0.0001). Different groups showed significantly different iron enhancement values and VNC values (F = 25.308, p < 0.001 and F = 10.911, p < 0.001, respectively). Among the groups, significant differences in iron enhancement values were only observed between the iron-present and iron-absent groups, and differences in VNC values were only observed between the fat-present and fat-absent groups. CONCLUSION: Separation of hepatic iron and fat by dual-energy material decomposition in vivo was feasible, even when they coexisted.
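
    The three-material decomposition used above can be viewed, per voxel, as a small linear system: the low- and high-energy attenuation measurements are modeled as volume-fraction-weighted sums of basis materials (here soft tissue, fat, and iron), with the fractions constrained to sum to one. A minimal numerical sketch with placeholder basis and measurement values (not the calibrated coefficients of the study):

      # Illustrative dual-energy three-material decomposition for a single voxel.
      import numpy as np

      # Rows: low-kV attenuation, high-kV attenuation, volume-conservation constraint.
      # Columns: soft tissue, fat, iron (placeholder basis values, not calibrated data).
      basis = np.array([[60.0, -110.0, 900.0],
                        [55.0,  -90.0, 500.0],
                        [ 1.0,    1.0,   1.0]])

      measured = np.array([40.0, 35.0, 1.0])        # [HU_low, HU_high, 1], placeholder voxel
      fractions = np.linalg.solve(basis, measured)  # volume fractions of the three materials
      print(dict(zip(["soft_tissue", "fat", "iron"], fractions.round(3))))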

  5. Prostate positioning using cone-beam computer tomography based on manual soft-tissue registration. Interobserver agreement between radiation oncologists and therapists

    Energy Technology Data Exchange (ETDEWEB)

    Jereczek-Fossa, B.A.; Pobbiati, C.; Fanti, P. [European Institute of Oncology, Department of Radiation Oncology, Milan (Italy); University of Milan, Milan (Italy); Santoro, L. [European Institute of Oncology, Department of Epidemiology and Biostatistics, Milan (Italy); Fodor, C.; Zerini, D. [European Institute of Oncology, Department of Radiation Oncology, Milan (Italy); Vigorito, S. [European Institute of Oncology, Department of Medical Physics, Milan (Italy); Baroni, G. [Politecnico di Milano, Department of Electronics Information and Bioengineering, Milan (Italy); De Cobelli, O. [European Institute of Oncology, Department of Urology, Milan (Italy); University of Milan, Milan (Italy); Orecchia, R. [European Institute of Oncology, Department of Radiation Oncology, Milan (Italy); National Center for Oncological Hadrontherapy (CNAO) Foundation, Pavia (Italy); University of Milan, Milan (Italy)

    2014-01-15

    To check the interobserver agreement between radiation oncologists and therapists (RTT) using an on- and off-line cone-beam computer tomography (CBCT) protocol for setup verification in the radiotherapy of prostate cancer. The CBCT data from six prostate cancer patients treated with hypofractionated intensity-modulated radiotherapy (IMRT) were independently reviewed off-line by four observers (one radiation oncologist, one junior and two senior RTTs) and benchmarked with on-line CBCT positioning performed by a radiation oncologist immediately prior to treatment. CBCT positioning was based on manual soft-tissue registration. Agreement between observers was evaluated using weighted Cohen's kappa statistics. In total, 152 CBCT-based prostate positioning procedures were reviewed by each observer. The mean (± standard deviation) of the differences between off- and on-line CBCT-simCT registration translations along the three directions (antero-posterior, latero-lateral and cranio-caudal) and rotation around the antero-posterior axis were - 0.7 (3.6) mm, 1.9 (2.7) mm, 0.9 (3.6) mm and - 1.8 (5.0) degrees, respectively. Satisfactory interobserver agreement was found, being substantial (weighted kappa > 0.6) in 10 of 16 comparisons and moderate (0.41-0.60) in the remaining six comparisons. CBCT interpretation performed by RTTs is comparable to that of radiation oncologists. Our study might be helpful in the quality assurance of radiotherapy and the optimization of competencies. Further investigation should include larger sample sizes, a greater number of observers and validated methodology in order to assess interobserver variability and its impact on high-precision prostate cancer IGRT. In the future, it should enable the wider implementation of complex and evolving radiotherapy technologies. (orig.)
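
    Weighted Cohen's kappa, used above to grade agreement between the off-line observers and the on-line reference, can be computed directly from paired categorical ratings; a minimal sketch assuming scikit-learn, with hypothetical ordinal shift categories rather than the study's actual data:

      # Illustrative weighted Cohen's kappa between two observers (hypothetical ratings).
      from sklearn.metrics import cohen_kappa_score

      # Ordinal categories per CBCT session, e.g. 0 = <2 mm, 1 = 2-5 mm, 2 = >5 mm shift.
      observer_rtt = [0, 1, 1, 2, 0, 1, 0, 2, 1, 0]
      observer_ref = [0, 1, 2, 2, 0, 1, 0, 1, 1, 0]

      kappa = cohen_kappa_score(observer_rtt, observer_ref, weights="linear")
      print(f"weighted kappa = {kappa:.2f}")  # >0.6 would count as substantial agreement here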

  6. On the relationship between calcified neurocysticercosis and epilepsy in an endemic village: A large-scale, computed tomography-based population study in rural Ecuador.

    Science.gov (United States)

    Del Brutto, Oscar H; Arroyo, Gianfranco; Del Brutto, Victor J; Zambrano, Mauricio; García, Héctor H

    2017-11-01

    Using a large-scale population-based study, we aimed to assess prevalence and patterns of presentation of neurocysticercosis (NCC) and its relationship with epilepsy in community-dwellers aged ≥20 years living in Atahualpa (rural Ecuador). In a three-phase epidemiological study, individuals with suspected seizures were identified during a door-to-door survey and an interview (phase I). Then, neurologists evaluated suspected cases and randomly selected negative persons to estimate epilepsy prevalence (phase II). In phase III, all participants were offered noncontrast computed tomography (CT) for identifying NCC cases. The independent association between NCC (exposure) and epilepsy (outcome) was assessed by the use of multivariate logistic regression models adjusted for age, sex, level of education, and alcohol intake. CT findings were subsequently compared to archived brain magnetic resonance imaging in a sizable subgroup of participants. Of 1,604 villagers aged ≥20 years, 1,462 (91%) were enrolled. Forty-one persons with epilepsy (PWE) were identified, for a crude prevalence of epilepsy of 28 per 1,000 population (95% confidence interval [CI] = 20.7-38.2). A head CT was performed in 1,228 (84%) of 1,462 participants, including 39 of 41 PWE. CT showed lesions consistent with calcified parenchymal brain cysticerci in 118 (9.6%) cases (95% CI = 8.1-11.4%). No patient had other forms of NCC. Nine of 39 PWE, as opposed to 109 of 1,189 participants without epilepsy, had NCC (23.1% vs. 9.2%, p = 0.004). This difference persisted in the adjusted logistic regression model (odds ratio = 3.04, 95% CI = 1.35-6.81, p = 0.007). This large CT-based study demonstrates that PWE had three times the odds of having NCC than those without epilepsy, providing robust epidemiological evidence favoring the relationship between NCC and epilepsy. Wiley Periodicals, Inc. © 2017 International League Against Epilepsy.

  7. Computed Tomography-Based Imaging of Voxel-Wise Lesion Water Uptake in Ischemic Brain: Relationship Between Density and Direct Volumetry.

    Science.gov (United States)

    Broocks, Gabriel; Flottmann, Fabian; Ernst, Marielle; Faizy, Tobias Djamsched; Minnerup, Jens; Siemonsen, Susanne; Fiehler, Jens; Kemmling, Andre

    2018-04-01

    Net water uptake per volume of brain tissue may be calculated by computed tomography (CT) density, and this imaging biomarker has recently been investigated as a predictor of lesion age in acute stroke. However, the hypothesis that measurements of CT density may be used to quantify net water uptake per volume of infarct lesion has not been validated by direct volumetric measurements so far. The purpose of this study was to (1) develop a theoretical relationship between CT density reduction and net water uptake per volume of ischemic lesions and (2) confirm this relationship by quantitative in vitro and in vivo CT image analysis using direct volumetric measurements. We developed a theoretical rationale for a linear relationship between net water uptake per volume of ischemic lesions and CT attenuation. The derived relationship between water uptake and CT density was tested in vitro in a set of increasingly diluted iodine solutions with successive CT measurements. Furthermore, the consistency of this relationship was evaluated using human in vivo CT images in a retrospective multicentric cohort. In 50 edematous infarct lesions, net water uptake was determined by direct measurement of the volumetric difference between the ischemic and normal hemisphere and was correlated with net water uptake calculated by ischemic density measurements. With regard to in vitro data, water uptake by density measurement was equivalent to direct volumetric measurement (r = 0.99, P < 0.001). In vivo, the mean lesion volume by direct volumetry was 44.7 ± 26.8 mL and the mean percent water uptake per lesion volume was 22.7% ± 7.4%. This was equivalent to percent water uptake obtained from density measurements: 21.4% ± 6.4%. The mean difference between percent water uptake by direct volumetry and percent water uptake by CT density was -1.79% ± 3.40%, which was not significantly different from 0 (P < 0.0001). Volume of water uptake in infarct lesions can be calculated quantitatively by relative CT density measurements. Voxel-wise imaging
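
    The per-lesion percent water uptake implied by this linear density relationship is commonly written as follows (a sketch of the relationship in the notation used in this literature, not the authors' exact derivation), with D denoting the mean CT density of the ischemic lesion and of mirrored normal tissue:

      \[ \%NWU = \left( 1 - \frac{D_{ischemic}}{D_{normal}} \right) \times 100 \]

    For example, a lesion measuring 28 HU against 35 HU in the mirrored normal hemisphere would give (1 - 28/35) x 100 = 20% water uptake, of the same order as the roughly 21-23% values reported above.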

  8. Computed tomography-based treatment planning for high-dose-rate brachytherapy using the tandem and ring applicator: influence of applicator choice on organ dose and inter-fraction adaptive planning

    Directory of Open Access Journals (Sweden)

    Vishruta A. Dumane

    2017-06-01

    Full Text Available Three-dimensional planning for high-dose-rate (HDR) brachytherapy in cervical cancer has been highly recommended by consensus guidelines such as those of the American Brachytherapy Society (ABS) and the Groupe Européen de Curiethérapie – European Society for Radiotherapy and Oncology (GEC-ESTRO). In this document, we describe our experience with computed tomography (CT)-based planning using the tandem/ring applicator. We discuss the influence of applicator geometry on doses to organs at risk (OARs), namely the bladder, rectum, and sigmoid. Through example cases with dose prescribed to point A, we demonstrate how adaptive planning can help achieve constraints to the OARs as per guidelines.

  9. A high-resolution computed tomography-based scoring system to differentiate the most infectious active pulmonary tuberculosis from community-acquired pneumonia in elderly and non-elderly patients

    International Nuclear Information System (INIS)

    Yeh, Jun-Jun; Chen, Solomon Chih-Cheng; Chen, Cheng-Ren; Yeh, Ting-Chun; Lin, Hsin-Kai; Hong, Jia-Bin; Wu, Bing-Tsang; Wu, Ming-Ting

    2014-01-01

    The objective of this study was to use high-resolution computed tomography (HRCT) imaging to predict the presence of smear-positive active pulmonary tuberculosis (PTB) in elderly (at least 65 years of age) and non-elderly patients (18-65 years of age). Patients with active pulmonary infections seen from November 2010 through December 2011 received HRCT chest imaging, sputum smears for acid-fast bacilli and sputum cultures for Mycobacterium tuberculosis. Smear-positive PTB was defined as at least one positive sputum smear and a positive culture for M. tuberculosis. Multivariate logistic regression analyses were performed to determine the HRCT predictors of smear-positive active PTB, and a prediction score was developed on the basis of receiver operating characteristic curve analysis. Of 1,255 patients included, 139 were diagnosed with smear-positive active PTB. According to ROC curve analysis, the sensitivity, specificity, positive predictive value, negative predictive value, false positive rates and false negative rates were 98.6 %, 95.8 %, 78.5 %, 99.8 %, 4.2 % and 1.4 %, respectively, for diagnosing smear-positive active PTB in elderly patients, and 100.0 %, 96.9 %, 76.5 %, 100.0 %, 3.1 % and 0.0 %, respectively, for non-elderly patients. HRCT can assist in the early diagnosis of the most infectious active PTB, thereby preventing transmission and minimizing unnecessary immediate respiratory isolation. (orig.)

  10. A high-resolution computed tomography-based scoring system to differentiate the most infectious active pulmonary tuberculosis from community-acquired pneumonia in elderly and non-elderly patients

    Energy Technology Data Exchange (ETDEWEB)

    Yeh, Jun-Jun [Ditmanson Medical Foundation Chia-Yi Christian Hospital, Section of Thoracic Imaging, Department of Chest Medicine and Family Medicine, Chiayi City (China); Chia Nan University of Pharmacy and Science, Tainan (China); Meiho University, Pingtung (China); Pingtung Christian Hospital, Pingtung (China); Chen, Solomon Chih-Cheng; Chen, Cheng-Ren [Ditmanson Medical Foundation Chia-Yi Christian Hospital, Department of Medical Research, Chiayi City (China); Yeh, Ting-Chun; Lin, Hsin-Kai; Hong, Jia-Bin; Wu, Bing-Tsang [Ditmanson Medical Foundation Chia-Yi Christian Hospital, Department of Family Medicine, Chiayi City (China); Wu, Ming-Ting [Department of Radiology, Kaohsiung Veterans General Hospital, Section of Thoracic and Circulation Imaging, Kaohsiung (China); School of Medicine, National Yang Ming University, Faculty of Medicine, Taipei (China)

    2014-10-15

    The objective of this study was to use high-resolution computed tomography (HRCT) imaging to predict the presence of smear-positive active pulmonary tuberculosis (PTB) in elderly (at least 65 years of age) and non-elderly patients (18-65 years of age). Patients with active pulmonary infections seen from November 2010 through December 2011 received HRCT chest imaging, sputum smears for acid-fast bacilli and sputum cultures for Mycobacterium tuberculosis. Smear-positive PTB was defined as at least one positive sputum smear and a positive culture for M. tuberculosis. Multivariate logistic regression analyses were performed to determine the HRCT predictors of smear-positive active PTB, and a prediction score was developed on the basis of receiver operating characteristic curve analysis. Of 1,255 patients included, 139 were diagnosed with smear-positive active PTB. According to ROC curve analysis, the sensitivity, specificity, positive predictive value, negative predictive value, false positive rates and false negative rates were 98.6 %, 95.8 %, 78.5 %, 99.8 %, 4.2 % and 1.4 %, respectively, for diagnosing smear-positive active PTB in elderly patients, and 100.0 %, 96.9 %, 76.5 %, 100.0 %, 3.1 % and 0.0 %, respectively, for non-elderly patients. HRCT can assist in the early diagnosis of the most infectious active PTB, thereby preventing transmission and minimizing unnecessary immediate respiratory isolation. (orig.)
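
    For reference, the diagnostic indices reported above follow directly from the 2 x 2 confusion counts (TP, FP, TN, FN):

      \[ Sens = \frac{TP}{TP+FN}, \quad Spec = \frac{TN}{TN+FP}, \quad PPV = \frac{TP}{TP+FP}, \quad NPV = \frac{TN}{TN+FN} \]

    with the false positive rate equal to 1 - specificity and the false negative rate equal to 1 - sensitivity, consistent with the 4.2 %/1.4 % (elderly) and 3.1 %/0.0 % (non-elderly) figures quoted.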

  11. The Different Volume Effects of Small-Bowel Toxicity During Pelvic Irradiation Between Gynecologic Patients With and Without Abdominal Surgery: A Prospective Study With Computed Tomography-Based Dosimetry

    International Nuclear Information System (INIS)

    Huang, E.-Y.; Sung, C.-C.; Ko, S.-F.; Wang, C.-J.; Yang, Kuender D.

    2007-01-01

    Purpose: To evaluate the effect of abdominal surgery on the volume effects of small-bowel toxicity during whole-pelvic irradiation in patients with gynecologic malignancies. Methods and Materials: From May 2003 through November 2006, 80 gynecologic patients without (Group I) or with (Group II) prior abdominal surgery were analyzed. We used a computed tomography (CT) planning system to measure the small-bowel volume and dosimetry. We acquired the small-bowel volumes receiving 10% (V10) to 100% (V100) of the prescribed dose, at 10% intervals. The onset and grade of diarrhea during whole-pelvic irradiation were recorded as small-bowel toxicity up to 39.6 Gy in 22 fractions. Results: The volume effect of Grade 2-3 diarrhea existed from V10 to V100 in Group I patients and from V60 to V100 in Group II patients on univariate analyses. The V40 of Group I and the V100 of Group II achieved the greatest statistical significance. The mean V40 was 281 ± 27 cm³ and 489 ± 34 cm³ (p < 0.001), and the corresponding V100 comparison was also significant (132 ± 19 cm³; p = 0.003). Multivariate analyses revealed that V40 (p = 0.001) and V100 (p = 0.027) were independent factors for the development of Grade 2-3 diarrhea in Groups I and II, respectively. Conclusions: Gynecologic patients without and with abdominal surgery have different volume effects on small-bowel toxicity during whole-pelvic irradiation. Low-dose volume can be used as a predictive index of Grade 2 or greater diarrhea in patients without abdominal surgery. Full-dose volume is more important than low-dose volume for Grade 2 or greater diarrhea in patients with abdominal surgery.
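
    The VX metrics above are absolute small-bowel volumes receiving at least X% of the prescribed dose; a minimal sketch of reading such values off a planning dose grid, assuming numpy and a hypothetical per-voxel dose array restricted to the contoured small bowel:

      # Illustrative computation of V10..V100 for the small bowel (hypothetical inputs).
      import numpy as np

      prescription_gy = 39.6                      # whole-pelvic prescription from the study
      voxel_volume_cc = 0.05                      # hypothetical voxel volume in cm^3
      bowel_dose_gy = np.load("bowel_dose.npy")   # hypothetical 1-D array, one entry per bowel voxel

      for pct in range(10, 101, 10):
          threshold = prescription_gy * pct / 100.0
          v_cc = np.count_nonzero(bowel_dose_gy >= threshold) * voxel_volume_cc
          print(f"V{pct}: {v_cc:.1f} cm^3")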

  12. 18F-Fluorodeoxyglucose Positron Emission Tomography/Computed Tomography-Based Radiotherapy Target Volume Definition in Non-Small-Cell Lung Cancer: Delineation by Radiation Oncologists vs. Joint Outlining With a PET Radiologist?

    International Nuclear Information System (INIS)

    Hanna, Gerard G.; Carson, Kathryn J.; Lynch, Tom; McAleese, Jonathan; Cosgrove, Vivian P.; Eakin, Ruth L.; Stewart, David P.; Zatari, Ashraf; O'Sullivan, Joe M.; Hounsell, Alan R.

    2010-01-01

    Purpose: (18)F-Fluorodeoxyglucose positron emission tomography/computed tomography (PET/CT) has benefits in target volume (TV) definition in radiotherapy treatment planning (RTP) for non-small-cell lung cancer (NSCLC); however, an optimal protocol for TV delineation has not been determined. We investigate volumetric and positional variation in gross tumor volume (GTV) delineation using a planning PET/CT among three radiation oncologists and a PET radiologist. Methods and Materials: RTP PET/CT scans were performed on 28 NSCLC patients (Stage IA-IIIB), of which 14 patients received prior induction chemotherapy. Three radiation oncologists and one PET radiologist working with a fourth radiation oncologist independently delineated the GTV on CT alone (GTV(CT)) and on fused PET/CT images (GTV(PETCT)). The mean percentage volume change (PVC) between GTV(CT) and GTV(PETCT) for the radiation oncologists and the PVC between GTV(CT) and GTV(PETCT) for the PET radiologist were compared using the Wilcoxon signed-rank test. Concordance index (CI) was used to assess both positional and volume change between GTV(CT) and GTV(PETCT) in a single measurement. Results: For all patients, a significant difference in PVC from GTV(CT) to GTV(PETCT) exists between the radiation oncologists (median, 5.9%) and the PET radiologist (median, -0.4%; p = 0.001). However, no significant difference in median concordance index (comparing GTV(CT) and GTV(FUSED) for individual cases) was observed (PET radiologist = 0.73; radiation oncologists = 0.66; p = 0.088). Conclusions: Percentage volume changes from GTV(CT) to GTV(PETCT) were lower for the PET radiologist than for the radiation oncologists, suggesting a lower impact of PET/CT in TV delineation for the PET radiologist than for the oncologists. Guidelines are needed to standardize the use of PET/CT for TV delineation in RTP.
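
    The two comparison metrics used here can be written compactly; with the concordance index understood as an intersection-over-union (Jaccard-type) measure of volume overlap, as commonly defined:

      \[ PVC = 100 \times \frac{GTV_{PETCT} - GTV_{CT}}{GTV_{CT}}, \qquad CI = \frac{|GTV_{CT} \cap GTV_{PETCT}|}{|GTV_{CT} \cup GTV_{PETCT}|} \]

    A CI of 1 indicates identical CT-only and fused volumes in both size and position, while a large positive or negative PVC indicates that adding PET information grew or shrank the delineated volume.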

  13. 18F-fluorodeoxyglucose positron emission tomography/computed tomography-based radiotherapy target volume definition in non-small-cell lung cancer: delineation by radiation oncologists vs. joint outlining with a PET radiologist?

    Science.gov (United States)

    Hanna, Gerard G; Carson, Kathryn J; Lynch, Tom; McAleese, Jonathan; Cosgrove, Vivian P; Eakin, Ruth L; Stewart, David P; Zatari, Ashraf; O'Sullivan, Joe M; Hounsell, Alan R

    2010-11-15

    (18)F-Fluorodeoxyglucose positron emission tomography/computed tomography (PET/CT) has benefits in target volume (TV) definition in radiotherapy treatment planning (RTP) for non-small-cell lung cancer (NSCLC); however, an optimal protocol for TV delineation has not been determined. We investigate volumetric and positional variation in gross tumor volume (GTV) delineation using a planning PET/CT among three radiation oncologists and a PET radiologist. RTP PET/CT scans were performed on 28 NSCLC patients (Stage IA-IIIB) of which 14 patients received prior induction chemotherapy. Three radiation oncologists and one PET radiologist working with a fourth radiation oncologist independently delineated the GTV on CT alone (GTV(CT)) and on fused PET/CT images (GTV(PETCT)). The mean percentage volume change (PVC) between GTV(CT) and GTV(PETCT) for the radiation oncologists and the PVC between GTV(CT) and GTV(PETCT) for the PET radiologist were compared using the Wilcoxon signed-rank test. Concordance index (CI) was used to assess both positional and volume change between GTV(CT) and GTV(PETCT) in a single measurement. For all patients, a significant difference in PVC from GTV(CT) to GTV(PETCT) exists between the radiation oncologist (median, 5.9%), and the PET radiologist (median, -0.4%, p = 0.001). However, no significant difference in median concordance index (comparing GTV(CT) and GTV(FUSED) for individual cases) was observed (PET radiologist = 0.73; radiation oncologists = 0.66; p = 0.088). Percentage volume changes from GTV(CT) to GTV(PETCT) were lower for the PET radiologist than for the radiation oncologists, suggesting a lower impact of PET/CT in TV delineation for the PET radiologist than for the oncologists. Guidelines are needed to standardize the use of PET/CT for TV delineation in RTP. Copyright © 2010 Elsevier Inc. All rights reserved.

  14. Comparison of Positron Emission Tomography Quantification Using Magnetic Resonance- and Computed Tomography-Based Attenuation Correction in Physiological Tissues and Lesions: A Whole-Body Positron Emission Tomography/Magnetic Resonance Study in 66 Patients.

    Science.gov (United States)

    Seith, Ferdinand; Gatidis, Sergios; Schmidt, Holger; Bezrukov, Ilja; la Fougère, Christian; Nikolaou, Konstantin; Pfannenberg, Christina; Schwenzer, Nina

    2016-01-01

    Attenuation correction (AC) in fully integrated positron emission tomography (PET)/magnetic resonance (MR) systems plays a key role for the quantification of tracer uptake. The aim of this prospective study was to assess the accuracy of standardized uptake value (SUV) quantification using MR-based AC in direct comparison with computed tomography (CT)-based AC of the same PET data set on a large patient population. Sixty-six patients (22 female; mean [SD], 61 [11] years) were examined by means of combined PET/CT and PET/MR (11C-choline, 18F-FDG, or 68Ga-DOTATATE) subsequently. Positron emission tomography images from PET/MR examinations were corrected with MR-derived AC based on tissue segmentation (PET(MR)). The same PET data were corrected using CT-based attenuation maps (μ-maps) derived from PET/CT after nonrigid registration of the CT to the MR-based μ-map (PET(MRCT)). Positron emission tomography SUVs were quantified placing regions of interest or volumes of interest in 6 different body regions as well as PET-avid lesions, respectively. The relative differences of quantitative PET values when using MR-based AC versus CT-based AC were varying depending on the organs and body regions assessed. In detail, the mean (SD) relative differences of PET SUVs were as follows: -7.8% (11.5%), blood pool; -3.6% (5.8%), spleen; -4.4% (5.6%)/-4.1% (6.2%), liver; -0.6% (5.0%), muscle; -1.3% (6.3%), fat; -40.0% (18.7%), bone; 1.6% (4.4%), liver lesions; -6.2% (6.8%), bone lesions; and -1.9% (6.2%), soft tissue lesions. In 10 liver lesions, distinct overestimations greater than 5% were found (up to 10%). In addition, overestimations were found in 2 bone lesions and 1 soft tissue lesion adjacent to the lung (up to 28.0%). Results obtained using different PET tracers show that MR-based AC is accurate in most tissue types, with SUV deviations generally of less than 10%. In bone, however, underestimations can be pronounced, potentially leading to inaccurate SUV quantifications. In

  15. The Relationships between Metabolic Disorders (Hypertension, Dyslipidemia, and Impaired Glucose Tolerance) and Computed Tomography-Based Indices of Hepatic Steatosis or Visceral Fat Accumulation in Middle-Aged Japanese Men.

    Science.gov (United States)

    Fujibayashi, Kazutoshi; Gunji, Toshiaki; Yokokawa, Hirohide; Naito, Toshio; Sasabe, Noriko; Okumura, Mitsue; Iijima, Kimiko; Shibuya, Katsuhiko; Hisaoka, Teruhiko; Fukuda, Hiroshi

    2016-01-01

    Most studies on the relationships between metabolic disorders (hypertension, dyslipidemia, and impaired glucose tolerance) and hepatic steatosis (HS) or visceral fat accumulation (VFA) have been cross-sectional, and thus, these relationships remain unclear. We conducted a retrospective cohort study to clarify the relationships between components of metabolic disorders and HS/VFA. The participants were 615 middle-aged men who were free from serious liver disorders, diabetes, and HS/VFA and underwent multiple general health check-ups at our institution between 2009 and 2013. The data from the initial and final check-ups were used. HS and VFA were assessed by computed tomography. HS was defined as a liver to spleen attenuation ratio of ≤1.0. VFA was defined as a visceral fat cross-sectional area of ≥100 cm2 at the level of the navel. Metabolic disorders were defined using Japan's metabolic syndrome diagnostic criteria. The participants were divided into four groups based on the presence (+) or absence (-) of HS/VFA. The onset rates of each metabolic disorder were compared among the four groups. Among the participants, 521, 55, 24, and 15 were classified as HS(-)/VFA(-), HS(-)/VFA(+), HS(+)/VFA(-), and HS(+)/VFA(+), respectively, at the end of the study. Impaired glucose tolerance was more common among the participants that exhibited HS or VFA (p = 0.05). On the other hand, dyslipidemia was more common among the participants that displayed VFA (p = 0.01). It is likely that VFA is associated with impaired glucose tolerance and dyslipidemia, while HS might be associated with impaired glucose tolerance. Unfortunately, our study failed to detect associations between HS/VFA and metabolic disorders due to the low number of subjects that exhibited fat accumulation. Although our observational study had major limitations, we consider that it obtained some interesting results. HS and VFA might affect different metabolic disorders. Further large-scale longitudinal studies are

  16. The Relationships between Metabolic Disorders (Hypertension, Dyslipidemia, and Impaired Glucose Tolerance) and Computed Tomography-Based Indices of Hepatic Steatosis or Visceral Fat Accumulation in Middle-Aged Japanese Men.

    Directory of Open Access Journals (Sweden)

    Kazutoshi Fujibayashi

    Full Text Available Most studies on the relationships between metabolic disorders (hypertension, dyslipidemia, and impaired glucose tolerance) and hepatic steatosis (HS) or visceral fat accumulation (VFA) have been cross-sectional, and thus, these relationships remain unclear. We conducted a retrospective cohort study to clarify the relationships between components of metabolic disorders and HS/VFA. The participants were 615 middle-aged men who were free from serious liver disorders, diabetes, and HS/VFA and underwent multiple general health check-ups at our institution between 2009 and 2013. The data from the initial and final check-ups were used. HS and VFA were assessed by computed tomography. HS was defined as a liver to spleen attenuation ratio of ≤1.0. VFA was defined as a visceral fat cross-sectional area of ≥100 cm2 at the level of the navel. Metabolic disorders were defined using Japan's metabolic syndrome diagnostic criteria. The participants were divided into four groups based on the presence (+) or absence (-) of HS/VFA. The onset rates of each metabolic disorder were compared among the four groups. Among the participants, 521, 55, 24, and 15 were classified as HS(-)/VFA(-), HS(-)/VFA(+), HS(+)/VFA(-), and HS(+)/VFA(+), respectively, at the end of the study. Impaired glucose tolerance was more common among the participants that exhibited HS or VFA (p = 0.05). On the other hand, dyslipidemia was more common among the participants that displayed VFA (p = 0.01). It is likely that VFA is associated with impaired glucose tolerance and dyslipidemia, while HS might be associated with impaired glucose tolerance. Unfortunately, our study failed to detect associations between HS/VFA and metabolic disorders due to the low number of subjects that exhibited fat accumulation. Although our observational study had major limitations, we consider that it obtained some interesting results. HS and VFA might affect different metabolic disorders. Further large-scale longitudinal studies

  17. Impacts of Digital Imaging versus Drawing on Student Learning in Undergraduate Biodiversity Labs

    Science.gov (United States)

    Basey, John M.; Maines, Anastasia P.; Francis, Clinton D.; Melbourne, Brett

    2014-01-01

    We examined the effects of documenting observations with digital imaging versus hand drawing in inquiry-based college biodiversity labs. Plant biodiversity labs were divided into two treatments, digital imaging (N = 221) and hand drawing (N = 238). Graduate-student teaching assistants (N = 24) taught one class in each treatment. Assessments…

  18. Scattered Neutron Tomography Based on A Neutron Transport Inverse Problem

    International Nuclear Information System (INIS)

    William Charlton

    2007-01-01

    Neutron radiography and computed tomography are commonly used techniques to non-destructively examine materials. Tomography refers to the cross-sectional imaging of an object from either transmission or reflection data collected by illuminating the object from many different directions

  19. Impact of electrocardiogram-gated multi-slice computed tomography-based aortic annular measurement in the evaluation of paravalvular leakage following transcatheter aortic valve replacement: the efficacy of the OverSized AortiC Annular ratio (OSACA ratio) in TAVR.

    Science.gov (United States)

    Maeda, Koichi; Kuratani, Toru; Torikai, Kei; Shimamura, Kazuo; Mizote, Isamu; Ichibori, Yasuhiro; Takeda, Yasuharu; Daimon, Takashi; Nakatani, Satoshi; Nanto, Shinsuke; Sawa, Yoshiki

    2013-07-01

    Even mild paravalvular leakage (PVL) after transcatheter aortic valve replacement (TAVR) is associated with increased late mortality. Electrocardiogram-gated multi-slice computed tomography (MSCT) enables detailed aortic annulus assessment. We describe the impact of MSCT for PVL following TAVR. Congruence between the prosthesis and annulus diameters affects PVL; therefore, we calculated the OverSized AortiC Annular ratio (OSACA ratio) and OSACA (transesophageal echocardiography, TEE) ratio as prosthesis diameter/annulus diameter on MSCT or TEE, respectively, and compared their relationship with PVL ≤ trace following TAVR. Of 36 consecutive patients undergoing TAVR (Group A), the occurrence of PVL ≤ trace (33.3%) was significantly related to the OSACA ratio (p = 0.00020). In receiver-operating characteristics analysis, the cutoff value of 1.03 for the OSACA ratio had the highest sum of sensitivity (75.0%) and specificity (91.7%; AUC = 0.87) with significantly higher discriminatory performance for PVL as compared to the OSACA (TEE) ratio (AUC = 0.69, p = 0.028). In nine consecutive patients (Group B) undergoing TAVR based on guidelines formulated from our experience with Group A, PVL ≤ trace was significantly more frequent (88.9%) than that in Group A (p = 0.0060). The OSACA ratio has a significantly higher discriminatory performance for PVL ≤ trace than the OSACA (TEE) ratio, and aortic annular measurement from MSCT is more accurate than that from TEE. © 2013 Wiley Periodicals, Inc.
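
    The OSACA ratio itself is a simple oversizing measure (prosthesis diameter divided by the MSCT-derived annulus diameter); a minimal sketch of the decision rule suggested by the ROC analysis above, with hypothetical measurements and the 1.03 cutoff taken from the reported analysis:

      # Illustrative OSACA ratio check (prosthesis oversizing relative to the MSCT annulus).
      def osaca_ratio(prosthesis_diameter_mm: float, msct_annulus_diameter_mm: float) -> float:
          return prosthesis_diameter_mm / msct_annulus_diameter_mm

      ratio = osaca_ratio(26.0, 24.8)   # hypothetical diameters in mm
      print(f"OSACA ratio = {ratio:.2f}")
      print("PVL <= trace expected" if ratio >= 1.03 else "higher risk of PVL")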

  20. Cerenkov luminescence tomography based on preconditioning orthogonal matching pursuit

    Science.gov (United States)

    Liu, Haixiao; Hu, Zhenhua; Wang, Kun; Tian, Jie; Yang, Xin

    2015-03-01

    Cerenkov luminescence imaging (CLI) is a novel optical imaging method that has been shown to be a potential substitute for traditional radionuclide imaging such as positron emission tomography (PET) and single-photon emission computed tomography (SPECT). This imaging method inherits the high sensitivity of nuclear medicine and the low cost of optical molecular imaging. To obtain the depth information of the radioactive isotope, Cerenkov luminescence tomography (CLT) is established and the 3D distribution of the isotope is reconstructed. However, because of strong absorption and scattering, reconstruction of the CLT sources is always converted into an ill-posed linear system that is hard to solve. In this work, the sparse nature of the light source was taken into account and the preconditioning orthogonal matching pursuit (POMP) method was established to effectively reduce the ill-posedness and obtain better reconstruction accuracy. To demonstrate the accuracy and speed of this algorithm, a heterogeneous numerical phantom experiment and an in vivo mouse experiment were conducted. Both the simulation result and the mouse experiment showed that our reconstruction method provides more accurate reconstruction results than the traditional Tikhonov regularization method and the ordinary orthogonal matching pursuit (OMP) method. Our reconstruction method will provide technical support for biological applications of Cerenkov luminescence.
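
    Orthogonal matching pursuit on a (pre)conditioned linear system is the core idea of the reconstruction described above; the sketch below uses plain OMP from scikit-learn with simple column normalization standing in for the preconditioning step, as an illustration of the general approach rather than the authors' POMP implementation:

      # Illustrative sparse recovery with OMP on a column-normalized ("preconditioned") system.
      import numpy as np
      from sklearn.linear_model import OrthogonalMatchingPursuit

      rng = np.random.default_rng(0)
      A = rng.normal(size=(200, 1000))        # stand-in for the CLT photon-propagation matrix
      x_true = np.zeros(1000)                 # sparse source distribution
      x_true[rng.choice(1000, size=5, replace=False)] = rng.uniform(1.0, 3.0, size=5)
      y = A @ x_true + 0.01 * rng.normal(size=200)

      norms = np.linalg.norm(A, axis=0)       # crude surrogate for the preconditioning step
      omp = OrthogonalMatchingPursuit(n_nonzero_coefs=5).fit(A / norms, y)
      x_hat = omp.coef_ / norms               # undo the column scaling

      print("recovered support:", np.flatnonzero(x_hat).tolist())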

  1. Principle of diffraction enhanced imaging (DEI) and computed tomography based on DEI method

    International Nuclear Information System (INIS)

    Zhu Peiping; Huang Wanxia; Yuan Qingxi; Wang Junyue; Zheng Xin; Shu Hang; Chen Bo; Liu Yijin; Li Enrong; Wu Ziyu; Yu Jian

    2006-01-01

    In the first part of this article, a more general DEI equation is derived using simple concepts. The new DEI equation not only covers everything that can be handled by the DEI equation proposed by Chapman, but also explains effects that the old equation cannot, such as the noise background caused by small-angle scattering reflected by the analyzer. In the second part, a DEI-PI-CT formula is proposed and the contour contrast caused by extinction of the refracted beam is qualitatively explained; then, based on the work of Ando's group, two formulae for refraction CT with the DEI method are proposed. Combining one refraction CT formula proposed by Dilmanian with the two refraction CT formulae proposed here, a complete CT framework can be constructed to reconstruct the three components of the gradient of the refractive index. (authors)

  2. Dynamic computed tomography based on spatio-temporal analysis in acute stroke: Preliminary study

    Energy Technology Data Exchange (ETDEWEB)

    Park, Ha Young; Pyeon, Do Yeong; Kim, Da Hye; Jung, Young Jin [Dongseo University, Busan (Korea, Republic of)

    2016-12-15

    Acute stroke is a common condition that requires fast diagnosis and treatment to save the patient's life; if no prompt surgical procedure is performed, it may cause lifelong disability due to brain damage. To diagnose stroke, brain perfusion CT examination, with rapid 3D angiography where possible, has been widely used. However, a low-dose technique should be applied, since high radiation exposure may cause secondary harm to the patient. The resulting degradation of the measured CT images can interfere with clinical assessment, because blood vessel shapes on the CT image are significantly affected by Gaussian noise. In this study, we employed a spatio-temporal technique to analyze dynamic (brain perfusion) CT data and improve image quality for successful clinical diagnosis. As a result, the proposed technique removed Gaussian noise successfully and demonstrated the potential of a new image segmentation technique for CT angiography. Qualitative evaluation conducted by skilled radiological technologists indicated significant quality improvement of the dynamic CT images. The proposed technique will be a useful tool in clinical brain perfusion CT examinations.
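
    As a rough illustration of spatio-temporal filtering of a dynamic series (not the authors' specific analysis), Gaussian noise in a perfusion stack can be suppressed by smoothing jointly over the temporal and spatial axes; the sketch assumes scipy and a hypothetical time-by-row-by-column array:

      # Illustrative spatio-temporal smoothing of a dynamic (perfusion) CT series.
      import numpy as np
      from scipy.ndimage import gaussian_filter

      frames = np.load("perfusion_series.npy")   # hypothetical array, shape (time, y, x)

      # Smooth gently along time (axis 0) and space (axes 1, 2); sigma values are illustrative.
      denoised = gaussian_filter(frames, sigma=(1.5, 1.0, 1.0))

      # Compare one voxel's time-attenuation curve before and after filtering (hypothetical voxel).
      print(frames[:5, 128, 128], denoised[:5, 128, 128])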

  3. Evaluation of thallium-201 myocardial emission computed tomography based on a comparison with postmortem findings

    International Nuclear Information System (INIS)

    Nagashima, Jun-ichi; Yamada, Hideo; Ohkawa, Shin-ichiro; Yonamine, Shigemichi; Nishino, Hideo; Yamagata, Atsushi; Suzuki, Yasuko; Tanno, Munehiko; Chiba, Kazuo

    1986-01-01

    A correlative study of myocardial perfusion assessed by (201)Tl myocardial ECT and the pathological findings of the heart was performed in 10 autopsied cases with a mean age of 77 years (range: 60 - 90 y). In 6 cases with myocardial infarction (MI), 7 perfusion defects were observed on the SPECT images, namely 3 in the anteroseptal wall, 1 in the anterolateral wall and 3 in the posterior wall. Seven MIs were also found on postmortem examination. Six MIs observed at autopsy corresponded to perfusion defects on SPECT images. For one myocardial perfusion defect at the inferoposterior portion on SPECT, a non-transmural MI was found at the anteroseptum. In one case with valvular disease, a false positive result was obtained at the posterior wall, where neither myocardial necrosis nor fibrosis was observed at autopsy; this case had aortic stenosis due to a bicuspid aortic valve at autopsy. The ventricle was divided into 16 segments in each of 4 short-axis images to evaluate the extent of MI. SPECT for extent of MI showed a sensitivity of 81.9 %, specificity of 96.0 % and diagnostic accuracy of 92.5 %. False negative segments tended to be observed at the margins of non-transmural MIs or in the basal half of the left ventricle (LV) with transmural MI, while false positive segments occurred at the posterior portion of the basal half of the LV. It was concluded that myocardial ECT is useful for evaluating the site and extent of MI. (author)

  4. Optimizing a micro-computed tomography-based surrogate measurement of bone-implant contact.

    Science.gov (United States)

    Meagher, Matthew J; Parwani, Rachna N; Virdi, Amarjit S; Sumner, Dale R

    2018-03-01

    Histology and backscatter scanning electron microscopy (bSEM) are the current gold standard methods for quantifying bone-implant contact (BIC), but are inherently destructive. Microcomputed tomography (μCT) is a non-destructive alternative, but attempts to validate μCT-based assessment of BIC in animal models have produced conflicting results. We previously showed in a rat model using a 1.5 mm diameter titanium implant that the extent of the metal-induced artefact precluded accurate measurement of bone sufficiently close to the interface to assess BIC. Recently introduced commercial laboratory μCT scanners have smaller voxels and improved imaging capabilities, possibly overcoming this limitation. The goals of the present study were to establish an approach for optimizing μCT imaging parameters and to validate μCT-based assessment of BIC. In an empirical parametric study using a 1.5 mm diameter titanium implant, we determined 90 kVp, 88 µA, 1.5 μm isotropic voxel size, 1600 projections/180°, and 750 ms integration time to be optimal. Using specimens from an in vivo rat experiment, we found significant correlations between bSEM and μCT for BIC with the manufacturer's automated analysis routine (r = 0.716, p = 0.003) or a line-intercept method (r = 0.797, p = 0.010). Thus, this newer generation scanner's improved imaging capability reduced the extent of the metal-induced artefact zone enough to permit assessment of BIC. © 2017 Orthopaedic Research Society. Published by Wiley Periodicals, Inc. J Orthop Res 36:979-986, 2018.

  5. Tomography based determination of permeability, Dupuit-Forchheimer coefficient, and interfacial heat transfer coefficient in reticulate porous ceramics

    International Nuclear Information System (INIS)

    Petrasch, Joerg; Meier, Fabian; Friess, Hansmartin; Steinfeld, Aldo

    2008-01-01

    A computer tomography based methodology is applied to determine the transport properties of fluid flow across porous media. A 3D digital representation of a 10-ppi reticulate porous ceramic (RPC) sample was generated by X-ray tomographic scans. Structural properties such as the porosity, specific interfacial surface area, pore-size distribution, mean survival time, two-point correlation function s2, and local geometry distribution of the RPC sample are directly extracted from the tomographic data. Reference solutions of the fluid flow governing equations are obtained for Re = 0.2-200 by applying finite volume direct pore-level numerical simulation (DPLS) using unstructured, body-fitted, tetrahedral mesh discretization. The permeability and the Dupuit-Forchheimer coefficient are determined from the reference solutions by DPLS, and compared to the values predicted by selected porous media flow models, namely: conduit-flow, hydraulic radius theory, drag models, mean survival time bound, s2-bound, fibrous bed correlations, and local porosity theory-based models. DPLS is further employed to determine the interfacial heat transfer coefficient and to derive a corresponding Nu-correlation, which is compared to empirical correlations
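
    The two flow coefficients extracted from the DPLS reference solutions enter the standard Darcy-Forchheimer (Dupuit-Forchheimer) pressure-drop relation; in generic notation (a statement of the model, not a result specific to this sample):

      \[ -\frac{dp}{dx} = \frac{\mu}{K}\, u_D + \rho\, F\, u_D^{2} \]

    where K is the permeability, F the Dupuit-Forchheimer coefficient, \mu and \rho the fluid dynamic viscosity and density, and u_D the superficial (Darcy) velocity; K is identified from the linear low-Re regime and F from the quadratic departure at higher Re.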

  6. Overview and outlook on muon survey tomography based on micromegas detectors for unreachable sites technology

    Directory of Open Access Journals (Sweden)

    Roche I. Lázaro

    2016-01-01

    Full Text Available The present document describes the functioning principles of the Muon Survey Tomography based on Micromegas detectors for Unreachable Sites Technology and its distinguishing features from other Micromegas-like detectors. Additionally, it addresses the challenges found while operating the first generation and the resulting improvements. Currently, the project Temporal Tomography of the Densitometry by the Measurement of Muons is focused on obtaining a reliable pulse from the micromesh, associated to the passing of a muon, in order to trigger the acquisition and operate in standalone mode. An outlook of the future steps of the project is provided as well.

  7. Computed tomographic findings in dogs with head trauma and development of a novel prognostic computed tomography-based scoring system.

    Science.gov (United States)

    Chai, Orit; Peery, Dana; Bdolah-Abram, Tali; Moscovich, Efrat; Kelmer, Efrat; Klainbart, Sigal; Milgram, Joshua; Shamir, Merav H

    2017-09-01

    OBJECTIVE To characterize CT findings and outcomes in dogs with head trauma and design a prognostic scale. ANIMALS 27 dogs admitted to the Koret School Veterinary Teaching Hospital within 72 hours after traumatic head injury that underwent CT imaging of the head. PROCEDURES Data were extracted from medical records regarding dog signalment, history, physical and neurologic examination findings, and modified Glasgow coma scale scores. All CT images were retrospectively evaluated by a radiologist unaware of dog status. Short-term (10 days after trauma) and long-term (≥ 6 months after trauma) outcomes were determined, and CT findings and other variables were analyzed for associations with outcome. A prognostic CT-based scale was developed on the basis of the results. RESULTS Cranial vault fractures, parenchymal abnormalities, or both were identified via CT in 24 of 27 (89%) dogs. Three (11%) dogs had only facial bone fractures. Intracranial hemorrhage was identified in 16 (59%) dogs, cranial vault fractures in 15 (56%), midline shift in 14 (52%), lateral ventricle asymmetry in 12 (44%), and hydrocephalus in 7 (26%). Hemorrhage and ventricular asymmetry were significantly and negatively associated with short- and long-term survival, respectively. The developed 7-point prognostic scale included points for hemorrhage, midline shift or lateral ventricle asymmetry, cranial vault fracture, and depressed fracture (1 point each) and infratentorial lesion (3 points). CONCLUSIONS AND CLINICAL RELEVANCE The findings reported here may assist in determining prognoses for other dogs with head trauma. The developed scale may be useful for outcome assessment of dogs with head trauma; however, it must be validated before clinical application.

  8. Pathological fracture prediction in patients with metastatic lesions can be improved with quantitative computed tomography based computer models

    NARCIS (Netherlands)

    Tanck, Esther; van Aken, Jantien B.; van der Linden, Yvette M.; Schreuder, H.W. Bart; Binkowski, Marcin; Huizenga, Henk; Verdonschot, Nico

    2009-01-01

    Purpose: In clinical practice, there is an urgent need to improve the prediction of fracture risk for cancer patients with bone metastases. The methods that are currently used to estimate fracture risk are dissatisfying, hence affecting the quality of life of patients with a limited life expectancy.

  9. Comparative assessment of liver tumor motion using cine-magnetic resonance imaging versus 4-dimensional computed tomography.

    Science.gov (United States)

    Fernandes, Annemarie T; Apisarnthanarax, Smith; Yin, Lingshu; Zou, Wei; Rosen, Mark; Plastaras, John P; Ben-Josef, Edgar; Metz, James M; Teo, Boon-Keng

    2015-04-01

    To compare the extent of tumor motion between 4-dimensional CT (4DCT) and cine-MRI in patients with hepatic tumors treated with radiation therapy. Patients with liver tumors who underwent 4DCT and 2-dimensional biplanar cine-MRI scans during simulation were retrospectively reviewed to determine the extent of target motion in the superior-inferior, anterior-posterior, and lateral directions. Cine-MRI was performed over 5 minutes. Tumor motion from MRI was determined by tracking the centroid of the gross tumor volume using deformable image registration. Motion estimates from 4DCT were performed by evaluation of the fiducial, residual contrast (or liver contour) positions in each CT phase. Sixteen patients with hepatocellular carcinoma (n=11), cholangiocarcinoma (n=3), and liver metastasis (n=2) were reviewed. Cine-MRI motion was larger than 4DCT for the superior-inferior direction in 50% of patients by a median of 3.0 mm (range, 1.5-7 mm), the anterior-posterior direction in 44% of patients by a median of 2.5 mm (range, 1-5.5 mm), and laterally in 63% of patients by a median of 1.1 mm (range, 0.2-4.5 mm). Cine-MRI frequently detects larger differences in hepatic intrafraction tumor motion when compared with 4DCT most notably in the superior-inferior direction, and may be useful when assessing the need for or treating without respiratory management, particularly in patients with unreliable 4DCT imaging. Margins wider than the internal target volume as defined by 4DCT were required to encompass nearly all the motion detected by cine-MRI for some of the patients in this study. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. MR imaging versus PET/CT for evaluation of pancreatic lesions

    Energy Technology Data Exchange (ETDEWEB)

    Belião, Sara, E-mail: sara.beliao@clix.pt [Department of Radiology Hospital S. Francisco Xavier, Estrada do Forte do Alto do Duque, 1495-005 Lisbon (Portugal); Ferreira, Alexandra, E-mail: alexandratavaresferreira@gmail.com [Department of Radiology, Hospital D. Estefânia, Rua Jacinta Marto, 1169-045 Lisbon (Portugal); Vierasu, Irina, E-mail: Ortansa-Irina.Vierasu@ulb.ac.be [Service de Médecine Nucléaire, Route de Lennik 808, 1070 Brussels (Belgium); Blocklet, Didier, E-mail: dblockle@ulb.ac.be [Service de Médecine Nucléaire, Route de Lennik 808, 1070 Brussels (Belgium); Goldman, Serge, E-mail: petscan@ulb.ac.be [Service de Médecine Nucléaire, Route de Lennik 808, 1070 Brussels (Belgium); Metens, Thierry, E-mail: tmetens@ulb.ac.be [Service de Radiologie – Imagerie par Resonance Magnétique, Route de Lennik 808, 1070 Brussels (Belgium); Matos, Celso, E-mail: cmatos@ulb.ac.be [Service de Radiologie – Imagerie par Resonance Magnétique, Route de Lennik 808, 1070 Brussels (Belgium)

    2012-10-15

    Purpose: To retrospectively determine the diagnostic accuracy of magnetic resonance imaging (MRI) and combined positron emission tomography/computed tomography (PET/CT) in the differential diagnosis of benign and malignant pancreatic lesions. Materials and methods: Twenty-seven patients (15 women/12 men, mean age 56.5 years) with MR imaging and PET/CT studies performed to differentiate benign and malignant pancreatic lesions were identified between October 2008 and October 2010. Both MR and PET/CT data sets were retrospectively and blindly evaluated by two independent readers (4 readers total) with different degrees of experience, using a visual five-point score system. The results were correlated with final diagnosis obtained by histopathology. Results: 17 patients had malignant diseases and 10 patients had benign diseases. Depending on the observer, the sensitivity, specificity, positive predictive value and negative predictive value of MRI varied between 88–94%, 50–80%, 75–89% and 71–89% respectively. Sensitivities, specificities, positive predictive values and negative predictive values of PET/CT were 73%, 56%, 73% and 56% respectively. The diagnostic accuracy of MR for the differential diagnosis of pancreatic lesions was 74–89%, compared with 67% for PET/CT. The weighted Cohen's kappa coefficient was 0.47 at MR and 0.53 at PET/CT. Conclusion: MRI achieved higher sensitivity and specificity in the differential diagnosis of pancreatic lesions.

  11. Prospective Comparison of the Diagnostic Accuracy of MR Imaging versus CT for Acute Appendicitis.

    Science.gov (United States)

    Repplinger, Michael D; Pickhardt, Perry J; Robbins, Jessica B; Kitchin, Douglas R; Ziemlewicz, Tim J; Hetzel, Scott J; Golden, Sean K; Harringa, John B; Reeder, Scott B

    2018-04-24

    Purpose To compare the accuracy of magnetic resonance (MR) imaging with that of computed tomography (CT) for the diagnosis of acute appendicitis in emergency department (ED) patients. Materials and Methods This was an institutional review board-approved, prospective, observational study of ED patients at an academic medical center (February 2012 to August 2014). Eligible patients were nonpregnant, were 12 years of age or older, and had a CT study ordered for evaluation for appendicitis. After informed consent was obtained, CT and MR imaging (with non-contrast material-enhanced, diffusion-weighted, and intravenous contrast-enhanced sequences) were performed in tandem, and the images were subsequently retrospectively interpreted in random order by three abdominal radiologists who were blinded to the patients' clinical outcomes. Likelihood of appendicitis was rated on a five-point scale for both CT and MR imaging. A composite reference standard of surgical and histopathologic results and clinical follow-up was used, arbitrated by an expert panel of three investigators. Test characteristics were calculated and reported as point estimates with 95% confidence intervals (CIs). Results Analysis included images of 198 patients (114 women [58%]; mean age, 31.6 years ± 14.2 [range, 12-81 years]; prevalence of appendicitis, 32.3%). The sensitivity and specificity were 96.9% (95% CI: 88.2%, 99.5%) and 81.3% (95% CI: 73.5%, 87.3%) for MR imaging and 98.4% (95% CI: 90.5%, 99.9%) and 89.6% (95% CI: 82.8%, 94.0%) for CT, respectively, when a cutoff point of 3 or higher was used. The positive and negative likelihood ratios were 5.2 (95% CI: 3.7, 7.7) and 0.04 (95% CI: 0, 0.11) for MR imaging and 9.4 (95% CI: 5.9, 16.4) and 0.02 (95% CI: 0.00, 0.06) for CT, respectively. Receiver operating characteristic curve analysis demonstrated that the optimal cutoff point to maximize accuracy was 4 or higher, at which point there was no difference between MR imaging and CT
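
    The likelihood ratios quoted above follow directly from sensitivity and specificity:

      \[ LR^{+} = \frac{sensitivity}{1 - specificity}, \qquad LR^{-} = \frac{1 - sensitivity}{specificity} \]

    For the MR figures reported (sensitivity 96.9%, specificity 81.3%) this gives LR+ ≈ 0.969/0.187 ≈ 5.2 and LR- ≈ 0.031/0.813 ≈ 0.04, matching the values above; the CT figures reproduce 9.4 and 0.02 in the same way.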

  12. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  14. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  15. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  17. Phantom-less bone mineral density (BMD) measurement using dual energy computed tomography-based 3-material decomposition

    Science.gov (United States)

    Hofmann, Philipp; Sedlmair, Martin; Krauss, Bernhard; Wichmann, Julian L.; Bauer, Ralf W.; Flohr, Thomas G.; Mahnken, Andreas H.

    2016-03-01

    Osteoporosis is a degenerative bone disease usually diagnosed at the manifestation of fragility fractures, which severely endanger the health of the elderly in particular. To ensure timely therapeutic countermeasures, noninvasive and widely applicable diagnostic methods are required. Currently the primary quantifiable indicator for bone stability, bone mineral density (BMD), is obtained either by DEXA (dual-energy X-ray absorptiometry) or qCT (quantitative CT). Both have respective advantages and disadvantages, with DEXA being considered the gold standard. For timely diagnosis of osteoporosis, another CT-based method is presented. A dual-energy CT reconstruction workflow is being developed to estimate BMD from lumbar spine (L1-L4) DE-CT images. The workflow is ROI-based and automated for practical use. A dual-energy 3-material decomposition algorithm is used to differentiate bone from soft-tissue and fat attenuation. The algorithm uses material attenuation coefficients at different beam energy levels. The bone fraction obtained from the three-material decomposition is used to calculate the amount of hydroxylapatite in the trabecular bone of the corpus vertebrae inside a predefined ROI. Calibrations have been performed to obtain volumetric bone mineral density (vBMD) without having to add a calibration phantom or to use special scan protocols or hardware. Accuracy and precision depend on image noise and are comparable to those of qCT. Clinical indications are in accordance with the DEXA gold standard. The decomposition-based workflow shows bone degradation effects normally not visible on standard CT images which would induce errors in normal qCT results.
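
    As a rough illustration of the decomposition step described above, the sketch below solves a per-voxel two-energy, three-material system (bone, soft tissue, fat) whose volume fractions sum to one. The attenuation coefficients are illustrative placeholders, not values from the paper.

```python
import numpy as np

# Assumed linear attenuation coefficients (1/cm) of the basis materials at the
# two effective beam energies; the numbers are illustrative only.
MU_LOW  = {"bone": 0.60, "soft": 0.23, "fat": 0.20}   # e.g. low-kVp image
MU_HIGH = {"bone": 0.40, "soft": 0.19, "fat": 0.17}   # e.g. high-kVp image

def decompose(mu_low, mu_high):
    """Return (bone, soft, fat) volume fractions for one voxel."""
    A = np.array([
        [MU_LOW["bone"],  MU_LOW["soft"],  MU_LOW["fat"]],
        [MU_HIGH["bone"], MU_HIGH["soft"], MU_HIGH["fat"]],
        [1.0,             1.0,             1.0],           # fractions sum to 1
    ])
    return np.linalg.solve(A, np.array([mu_low, mu_high, 1.0]))

bone, soft, fat = decompose(mu_low=0.295, mu_high=0.226)
print(round(bone, 3), round(soft, 3), round(fat, 3))   # -> 0.2 0.5 0.3
# The bone fraction, an assumed hydroxylapatite density, and the ROI volume
# would then yield a vBMD estimate, following the workflow sketched above.
```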

  18. SU-C-206-03: Metal Artifact Reduction in X-Ray Computed Tomography Based On Local Anatomical Similarity

    International Nuclear Information System (INIS)

    Dong, X; Yang, X; Rosenfield, J; Elder, E; Dhabaan, A

    2016-01-01

    Purpose: Metal implants such as orthopedic hardware and dental fillings cause severe bright and dark streaking in reconstructed CT images. These artifacts decrease image contrast and degrade HU accuracy, leading to inaccuracies in target delineation and dose calculation. Additionally, such artifacts negatively impact patient set-up in image guided radiation therapy (IGRT). In this work, we propose a novel method for metal artifact reduction which utilizes the anatomical similarity between neighboring CT slices. Methods: Neighboring CT slices show similar anatomy. Based on this anatomical similarity, the proposed method replaces corrupted CT pixels with pixels from adjacent, artifact-free slices. A gamma map, which is the weighted summation of relative HU error and distance error, is calculated for each pixel in the artifact-corrupted CT image. The minimum value in each pixel’s gamma map is used to identify a pixel from the adjacent CT slice to replace the corresponding artifact-corrupted pixel. This replacement only occurs if the minimum value in a particular pixel’s gamma map is larger than a threshold. The proposed method was evaluated with clinical images. Results: Highly attenuating dental fillings and hip implants cause severe streaking artifacts on CT images. The proposed method eliminates the dark and bright streaking and improves the implant delineation and visibility. In particular, the image non-uniformity in the central region of interest was reduced from 1.88 and 1.01 to 0.28 and 0.35, respectively. Further, the mean CT HU error was reduced from 328 HU and 460 HU to 60 HU and 36 HU, respectively. Conclusions: The proposed metal artifact reduction method replaces corrupted image pixels with pixels from neighboring slices that are free of metal artifacts. This method proved capable of suppressing streaking artifacts, improving HU accuracy and image detectability.
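
    A minimal sketch of the replacement rule described above, with assumed weights, search window and threshold (none of these values come from the abstract):

```python
import numpy as np

def gamma_map_mar(slice_cur, slice_adj, w_hu=1.0, w_dist=0.05,
                  search=3, thresh=1.0):
    """Replace pixels whose minimum gamma against the adjacent slice exceeds
    `thresh`, following the weighted relative-HU-error + distance-error rule
    described above. Weights and threshold are illustrative assumptions."""
    out = slice_cur.astype(float).copy()
    rows, cols = slice_cur.shape
    for i in range(rows):
        for j in range(cols):
            best_gamma, best_val = np.inf, out[i, j]
            for di in range(-search, search + 1):
                for dj in range(-search, search + 1):
                    ii, jj = i + di, j + dj
                    if not (0 <= ii < rows and 0 <= jj < cols):
                        continue
                    cand = float(slice_adj[ii, jj])
                    hu_err = abs(out[i, j] - cand) / (abs(cand) + 1.0)
                    gamma = w_hu * hu_err + w_dist * np.hypot(di, dj)
                    if gamma < best_gamma:
                        best_gamma, best_val = gamma, cand
            if best_gamma > thresh:      # only clearly corrupted pixels are replaced
                out[i, j] = best_val
    return out

# Tiny demo: a bright streak in the current slice, absent from the neighbour.
adj = np.full((32, 32), 40.0)
cur = adj.copy(); cur[16, :] = 3000.0      # simulated metal streak (HU)
print(gamma_map_mar(cur, adj)[16, :3])     # streak pixels pulled back toward 40
```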

  19. Influence of 18F-fluorodeoxyglucose-positron emission tomography on computed tomography-based radiation treatment planning for oesophageal cancer

    International Nuclear Information System (INIS)

    Everitt, C.; Leong, T.

    2006-01-01

    The addition of positron emission tomography (PET) information to CT-based radiotherapy treatment planning has the potential to improve target volume definition through more accurate localization of the primary tumour and involved regional lymph nodes. This case report describes the first patient enrolled in a prospective study evaluating the effects of coregistered positron emission tomography/CT images on radiotherapy treatment planning for oesophageal cancer. The results show that if combined positron emission tomography/CT is used for radiotherapy treatment planning, there may be alterations to the delineation of tumour volumes when compared with CT alone. For this patient, a geographic miss of tumour would have occurred if CT data alone were used for radiotherapy planning. Copyright (2006) Blackwell Publishing Asia Pty Ltd

  20. A virtual sinogram method to reduce dental metallic implant artefacts in computed tomography-based attenuation correction for PET

    NARCIS (Netherlands)

    Abdoli, Mehrsima; Ay, Mohammad Reza; Ahmadian, Alireza; Zaidi, Habib

    Objective Attenuation correction of PET data requires accurate determination of the attenuation map (mu map), which represents the spatial distribution of linear attenuation coefficients of different tissues at 511 keV. The presence of high-density metallic dental filling material in head and neck
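
    The abstract is truncated, but the mu-map concept it starts from can be illustrated with the widely used bilinear conversion from CT numbers to 511 keV attenuation coefficients; the breakpoints and coefficients below are representative values, not those of the cited method, and the inflated mu at metal-level HU is the kind of error that artefact-reduction schemes such as the proposed virtual sinogram aim to avoid.

```python
import numpy as np

def hu_to_mu_511keV(hu, mu_water=0.096, mu_bone=0.172):
    """Map CT numbers (HU) to linear attenuation coefficients (1/cm) at
    511 keV with a common bilinear model; slopes are representative values."""
    hu = np.asarray(hu, dtype=float)
    mu = np.where(
        hu <= 0,
        mu_water * (hu + 1000.0) / 1000.0,                # air-water segment
        mu_water + hu * (mu_bone - mu_water) / 1000.0,    # water-bone segment
    )
    return np.clip(mu, 0.0, None)

print(hu_to_mu_511keV([-1000, 0, 1000, 3000]))  # metal-level HU inflates mu here
```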

  1. SU-F-T-398: Improving Radiotherapy Treatment Planning Using Dual Energy Computed Tomography Based Tissue Characterization

    Energy Technology Data Exchange (ETDEWEB)

    Tomic, N; Bekerat, H; Seuntjens, J; Forghani, R; DeBlois, F; Devic, S [McGill University, Montreal, QC (Canada)

    2016-06-15

    Purpose: Both the kVp setting and the geometric distribution of various materials lead to significant changes in HU values, showing the largest discrepancy for high-Z materials and for the lowest CT scanning kVp setting. On the other hand, the dose distributions around low-energy brachytherapy sources are highly dependent on the architecture and composition of tissue heterogeneities in and around the implant. Both measurements and Monte Carlo calculations show that improper tissue characterization may lead to calculated dose errors of 90% for low-energy and around 10% for higher-energy photons. We investigated the ability of dual-energy CT (DECT) to characterize tissue-equivalent materials more accurately. Methods: We used the RMI-467 heterogeneity phantom scanned in DECT mode with 3 different set-ups: first, we placed high electron density (ED) plugs within the outer ring of the phantom; then we arranged high ED plugs within the inner ring; and finally ED plugs were randomly distributed. All three setups were scanned with the same DECT technique using a single-source DECT scanner with fast kVp switching (Discovery CT750HD; GE Healthcare). Images were transferred to a GE Advantage workstation for DECT analysis. Spectral Hounsfield unit curves (SHUACs) were then generated from 50 to 140 keV, in 10-keV increments, for each plug. Results: The dynamic range of Hounsfield units shrinks with increased photon energy as the attenuation coefficients decrease. Our results show that the spread of HUs for the three different geometrical setups is the smallest at 80 keV. Furthermore, among all the energies and all materials presented, the largest difference appears at high-Z tissue-equivalent plugs. Conclusion: Our results suggest that dose calculations at both megavoltage and low photon energies could benefit in the vicinity of bony structures if the 80 keV reconstructed monochromatic CT image is used with the DECT protocol utilized in this work.
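
    The energy dependence described in the results can be reproduced qualitatively from textbook attenuation values: computing HU = 1000·(mu − mu_water)/mu_water at several monochromatic energies shows the HU range across materials contracting as the energy increases. The coefficients below are illustrative, not the phantom measurements.

```python
import numpy as np

# Illustrative (not measured) linear attenuation coefficients (1/cm) for a few
# tissue-equivalent plugs and for water at three monochromatic energies (keV).
ENERGIES = [50, 80, 140]
MU_WATER = {50: 0.227, 80: 0.184, 140: 0.154}
MU_PLUGS = {
    "lung": {50: 0.070, 80: 0.056, 140: 0.047},
    "soft": {50: 0.232, 80: 0.188, 140: 0.157},
    "bone": {50: 0.573, 80: 0.337, 140: 0.235},
}

def hounsfield(mu, mu_water):
    return 1000.0 * (mu - mu_water) / mu_water

for e in ENERGIES:
    hu = {name: hounsfield(m[e], MU_WATER[e]) for name, m in MU_PLUGS.items()}
    spread = max(hu.values()) - min(hu.values())
    print(f"{e:>3} keV  HU range (lung..bone): {spread:7.0f}")
# The HU dynamic range contracts as the energy increases, the behaviour the
# spectral Hounsfield unit curves in the abstract describe.
```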

  2. Three-dimensional in vivo imaging of the murine liver: a micro-computed tomography-based anatomical study.

    Directory of Open Access Journals (Sweden)

    Teresa Fiebig

    Full Text Available Various murine models are currently used to study acute and chronic pathological processes of the liver, and the efficacy of novel therapeutic regimens. The increasing availability of high-resolution small animal imaging modalities presents researchers with the opportunity to precisely identify and describe pathological processes of the liver. To meet these demands, the objective of this study was to provide a three-dimensional illustration of the macroscopic anatomical location of the murine liver lobes and hepatic vessels using small animal imaging modalities. We analysed micro-CT images of the murine liver by integrating additional information from the published literature to develop comprehensive illustrations of the macroscopic anatomical features of the murine liver and hepatic vasculature. As a result, we provide updated three-dimensional illustrations of the macroscopic anatomy of the murine liver and hepatic vessels using micro-CT. The information presented here provides researchers working in the field of experimental liver disease with a comprehensive, easily accessible overview of the macroscopic anatomy of the murine liver.

  3. SU-C-206-03: Metal Artifact Reduction in X-Ray Computed Tomography Based On Local Anatomical Similarity

    Energy Technology Data Exchange (ETDEWEB)

    Dong, X; Yang, X; Rosenfield, J; Elder, E; Dhabaan, A [Emory University, Winship Cancer Institute, Atlanta, GA (United States)

    2016-06-15

    Purpose: Metal implants such as orthopedic hardware and dental fillings cause severe bright and dark streaking in reconstructed CT images. These artifacts decrease image contrast and degrade HU accuracy, leading to inaccuracies in target delineation and dose calculation. Additionally, such artifacts negatively impact patient set-up in image guided radiation therapy (IGRT). In this work, we propose a novel method for metal artifact reduction which utilizes the anatomical similarity between neighboring CT slices. Methods: Neighboring CT slices show similar anatomy. Based on this anatomical similarity, the proposed method replaces corrupted CT pixels with pixels from adjacent, artifact-free slices. A gamma map, which is the weighted summation of relative HU error and distance error, is calculated for each pixel in the artifact-corrupted CT image. The minimum value in each pixel’s gamma map is used to identify a pixel from the adjacent CT slice to replace the corresponding artifact-corrupted pixel. This replacement only occurs if the minimum value in a particular pixel’s gamma map is larger than a threshold. The proposed method was evaluated with clinical images. Results: Highly attenuating dental fillings and hip implants cause severe streaking artifacts on CT images. The proposed method eliminates the dark and bright streaking and improves the implant delineation and visibility. In particular, the image non-uniformity in the central region of interest was reduced from 1.88 and 1.01 to 0.28 and 0.35, respectively. Further, the mean CT HU error was reduced from 328 HU and 460 HU to 60 HU and 36 HU, respectively. Conclusions: The proposed metal artifact reduction method replaces corrupted image pixels with pixels from neighboring slices that are free of metal artifacts. This method proved capable of suppressing streaking artifacts, improving HU accuracy and image detectability.

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  5. COMPUTING

    CERN Multimedia

    M. Kasemann and P. McBride; edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini, and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  6. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated that they are fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing Readiness Challenges (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files with a high writing speed to tape.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful tests prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  12. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  13. Three-dimensional photoacoustic tomography based on graphics-processing-unit-accelerated finite element method.

    Science.gov (United States)

    Peng, Kuan; He, Ling; Zhu, Ziqiang; Tang, Jingtian; Xiao, Jiaying

    2013-12-01

    Compared with commonly used analytical reconstruction methods, the frequency-domain finite element method (FEM) based approach has proven to be an accurate and flexible algorithm for photoacoustic tomography. However, the FEM-based algorithm is computationally demanding, especially for three-dimensional cases. To enhance the algorithm's efficiency, in this work a parallel computational strategy is implemented in the framework of the FEM-based reconstruction algorithm using a graphic-processing-unit parallel frame named the "compute unified device architecture." A series of simulation experiments is carried out to test the accuracy and accelerating effect of the improved method. The results obtained indicate that the parallel calculation does not change the accuracy of the reconstruction algorithm, while its computational cost is significantly reduced by a factor of 38.9 with a GTX 580 graphics card using the improved method.

  14. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  15. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  16. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  17. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  18. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, are provided. The GlideInWMS and components installation are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each others time zones by monitoring/debugging pilot jobs sent from the facto...

  2. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  3. Mixed Total Variation and L1 Regularization Method for Optical Tomography Based on Radiative Transfer Equation

    Directory of Open Access Journals (Sweden)

    Jinping Tang

    2017-01-01

    Full Text Available Optical tomography is an emerging and important molecular imaging modality. The aim of optical tomography is to reconstruct optical properties of human tissues. In this paper, we focus on reconstructing the absorption coefficient based on the radiative transfer equation (RTE. It is an ill-posed parameter identification problem. Regularization methods have been broadly applied to reconstruct the optical coefficients, such as the total variation (TV regularization and the L1 regularization. In order to better reconstruct the piecewise constant and sparse coefficient distributions, TV and L1 norms are combined as the regularization. The forward problem is discretized with the discontinuous Galerkin method on the spatial space and the finite element method on the angular space. The minimization problem is solved by a Jacobian-based Levenberg-Marquardt type method which is equipped with a split Bregman algorithms for the L1 regularization. We use the adjoint method to compute the Jacobian matrix which dramatically improves the computation efficiency. By comparing with the other imaging reconstruction methods based on TV and L1 regularizations, the simulation results show the validity and efficiency of the proposed method.
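
    The combined penalty has a simple generic form; the toy objective below (a dense linear forward model standing in for the discretized RTE) shows how the TV and L1 terms enter, with hypothetical weights alpha and beta.

```python
import numpy as np

def tv_l1_objective(x, A, y, alpha=1e-2, beta=1e-3):
    """Data fidelity plus the mixed TV/L1 penalty the abstract combines;
    a 1-D toy stand-in for the RTE-based forward model."""
    residual = A @ x - y
    tv = np.sum(np.abs(np.diff(x)))     # total variation (piecewise-constant prior)
    l1 = np.sum(np.abs(x))              # sparsity prior
    return 0.5 * residual @ residual + alpha * tv + beta * l1

# Tiny demo: a piecewise-constant, sparse "absorption" profile fits best.
rng = np.random.default_rng(0)
x_true = np.zeros(40); x_true[10:18] = 0.5
A = rng.normal(size=(30, 40))
y = A @ x_true + 0.01 * rng.normal(size=30)
print(tv_l1_objective(x_true, A, y), tv_l1_objective(np.zeros(40), A, y))
```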

  4. Image reconstruction of fluorescent molecular tomography based on the tree structured Schur complement decomposition

    Directory of Open Access Journals (Sweden)

    Wang Jiajun

    2010-05-01

    Full Text Available Abstract Background The inverse problem of fluorescent molecular tomography (FMT often involves complex large-scale matrix operations, which may lead to unacceptable computational errors and complexity. In this research, a tree structured Schur complement decomposition strategy is proposed to accelerate the reconstruction process and reduce the computational complexity. Additionally, an adaptive regularization scheme is developed to improve the ill-posedness of the inverse problem. Methods The global system is decomposed level by level with the Schur complement system along two paths in the tree structure. The resultant subsystems are solved in combination with the biconjugate gradient method. The mesh for the inverse problem is generated incorporating the prior information. During the reconstruction, the regularization parameters are adaptive not only to the spatial variations but also to the variations of the objective function to tackle the ill-posed nature of the inverse problem. Results Simulation results demonstrate that the strategy of the tree structured Schur complement decomposition obviously outperforms the previous methods, such as the conventional Conjugate-Gradient (CG and the Schur CG methods, in both reconstruction accuracy and speed. As compared with the Tikhonov regularization method, the adaptive regularization scheme can significantly improve ill-posedness of the inverse problem. Conclusions The methods proposed in this paper can significantly improve the reconstructed image quality of FMT and accelerate the reconstruction process.
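
    One level of the decomposition can be sketched as a standard 2x2 block Schur-complement solve; the paper nests this recursively in a tree and uses an iterative (biconjugate gradient) solver, whereas dense solves are used here for brevity.

```python
import numpy as np

def schur_solve(A11, A12, A21, A22, b1, b2):
    """Solve [[A11, A12], [A21, A22]] [x1; x2] = [b1; b2] via the Schur
    complement of A11 - one level of the tree-structured decomposition."""
    A11_inv_A12 = np.linalg.solve(A11, A12)
    A11_inv_b1 = np.linalg.solve(A11, b1)
    S = A22 - A21 @ A11_inv_A12                      # Schur complement
    x2 = np.linalg.solve(S, b2 - A21 @ A11_inv_b1)
    x1 = A11_inv_b1 - A11_inv_A12 @ x2
    return x1, x2

rng = np.random.default_rng(1)
n1, n2 = 4, 3
M = rng.normal(size=(n1 + n2, n1 + n2)) + (n1 + n2) * np.eye(n1 + n2)
b = rng.normal(size=n1 + n2)
x1, x2 = schur_solve(M[:n1, :n1], M[:n1, n1:], M[n1:, :n1], M[n1:, n1:],
                     b[:n1], b[n1:])
print(np.allclose(np.concatenate([x1, x2]), np.linalg.solve(M, b)))  # True
```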

  5. 3D and 4D magnetic susceptibility tomography based on complex MR images

    Science.gov (United States)

    Chen, Zikuan; Calhoun, Vince D

    2014-11-11

    Magnetic susceptibility is the physical property for T2*-weighted magnetic resonance imaging (T2*MRI). The invention relates to methods for reconstructing an internal distribution (3D map) of magnetic susceptibility values, .chi. (x,y,z), of an object, from 3D T2*MRI phase images, by using Computed Inverse Magnetic Resonance Imaging (CIMRI) tomography. The CIMRI technique solves the inverse problem of the 3D convolution by executing a 3D Total Variation (TV) regularized iterative convolution scheme, using a split Bregman iteration algorithm. The reconstruction of .chi. (x,y,z) can be designed for low-pass, band-pass, and high-pass features by using a convolution kernel that is modified from the standard dipole kernel. Multiple reconstructions can be implemented in parallel, and averaging the reconstructions can suppress noise. 4D dynamic magnetic susceptibility tomography can be implemented by reconstructing a 3D susceptibility volume from a 3D phase volume by performing 3D CIMRI magnetic susceptibility tomography at each snapshot time.
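
    The forward model being inverted is the convolution of the susceptibility map with the unit dipole field, conveniently expressed in k-space; a minimal sketch of that forward step (the standard QSM dipole kernel, not the patent's exact implementation) is shown below.

```python
import numpy as np

def dipole_kernel(shape, b0_axis=2):
    """Standard k-space dipole kernel D = 1/3 - kz^2/|k|^2 used in the
    susceptibility-to-field convolution model."""
    ks = np.meshgrid(*[np.fft.fftfreq(n) for n in shape], indexing="ij")
    k2 = sum(k ** 2 for k in ks)
    k2[k2 == 0] = np.inf                       # avoid division by zero at DC
    return 1.0 / 3.0 - ks[b0_axis] ** 2 / k2

def susceptibility_to_field(chi):
    """Forward model: field perturbation = IFFT( D * FFT(chi) )."""
    D = dipole_kernel(chi.shape)
    return np.real(np.fft.ifftn(D * np.fft.fftn(chi)))

chi = np.zeros((32, 32, 32)); chi[12:20, 12:20, 12:20] = 1.0   # toy inclusion
field = susceptibility_to_field(chi)
print(field.shape, float(field.max()))
# The CIMRI reconstruction inverts this convolution with TV regularisation.
```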

  6. Electrical impedance tomography-based sensing skin for quantitative imaging of damage in concrete

    International Nuclear Information System (INIS)

    Hallaji, Milad; Pour-Ghaz, Mohammad; Seppänen, Aku

    2014-01-01

    This paper outlines the development of a large-area sensing skin for damage detection in concrete structures. The developed sensing skin consists of a thin layer of electrically conductive copper paint that is applied to the surface of the concrete. Cracking of the concrete substrate results in the rupture of the sensing skin, decreasing its electrical conductivity locally. The decrease in conductivity is detected with electrical impedance tomography (EIT) imaging. In previous works, electrically based sensing skins have provided only qualitative information on the damage on the substrate surface. In this paper, we study whether quantitative imaging of the damage is possible. We utilize application-specific models and computational methods in the image reconstruction, including a total variation (TV) prior model for the damage and an approximate correction of the modeling errors caused by the inhomogeneity of the painted sensing skin. The developed damage detection method is tested experimentally by applying the sensing skin to polymeric substrates and a reinforced concrete beam under four-point bending. In all test cases, the EIT-based sensing skin provides quantitative information on cracks and/or other damages on the substrate surface: featuring a very low conductivity in the damage locations, and a reliable indication of the lengths and shapes of the cracks. The results strongly support the applicability of the painted EIT-based sensing skin for damage detection in reinforced concrete elements and other substrates. (paper)
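
    A heavily simplified, linearised stand-in for the difference-imaging step is shown below: a hypothetical sensitivity matrix maps a local conductivity drop (the ruptured paint) to boundary-voltage changes, and a regularised least-squares step recovers it. The real reconstruction uses a nonlinear EIT forward model, a TV prior and modelling-error correction.

```python
import numpy as np

def linearized_eit_step(J, dv, reg=1e-2):
    """One-step linearised difference imaging: estimate the conductivity change
    explaining the boundary-voltage change dv. Tikhonov regularisation is a
    simple stand-in for the TV prior described above."""
    n = J.shape[1]
    return np.linalg.solve(J.T @ J + reg * np.eye(n), J.T @ dv)

rng = np.random.default_rng(2)
J = rng.normal(size=(64, 100))                 # hypothetical sensitivity (Jacobian)
dsigma_true = np.zeros(100)
dsigma_true[40:45] = -1.0                      # crack -> local conductivity drop
dv = J @ dsigma_true + 0.01 * rng.normal(size=64)
dsigma = linearized_eit_step(J, dv)
print(np.round(dsigma[38:47], 2))              # the dip marks the damaged region
```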

  7. Optimization of recommendations for abdomen computerized tomography based on reconstruction filters, voltage and tube current

    International Nuclear Information System (INIS)

    Silveira, Vinicius da Costa

    2015-01-01

    The use of computed tomography has increased significantly over the past decades. In Brazil its use increased more than twofold from 2008 to 2014, while abdomen procedures tripled over the same period. The high frequency of this procedure, combined with the increasing collective radiation dose from medical exposures, has driven the development of tools to maximize the benefit of CT imaging. This work aimed to establish optimized abdominal CT protocols through acquisition parameters and reconstruction techniques based on kernel filters. A sample of patients undergoing abdominal CT in a diagnostic center of Rio de Janeiro was assessed, and patient information and acquisition parameters were collected. Phantom CT image acquisitions were performed using different voltage values, with the tube current (mAs) adjusted to obtain the same CTDIvol as for patients with normal BMI. Afterwards, the CTDIvol values were reduced by 30%, 50% and 60%. All images were reconstructed with low-contrast filters (A) and standard filters (B). The CTDIvol values for patients with normal BMI were 7% higher than in patients with underweight BMI and 30%, 50% and 60% lower than in overweight, obese I and III patients, respectively. The evaluations of image quality showed that variation of the current (mA) and the reconstruction filters did not affect the Hounsfield values. When the contrast-to-noise ratio (CNR) was normalized to CTDIvol, the protocols acquired with a 60% reduction of CTDIvol at 140 kV and 80 kV showed a CNR 6% lower than the routine protocol. Modifications of the acquisition parameters did not affect spatial resolution, but post-processing with B filters reduced the spatial frequency by 16%. With a 30% dose reduction, lesions in the spleen had a CNR more than 10% higher than with routine protocols when acquired at 140 kV and post-processed with filter A. The image post-processing with filter A at an 80 kV voltage provided CNR values equal to the routine for the liver lesions with a 30
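
    The dose-normalised image-quality comparison can be illustrated with a small helper; dividing CNR by the square root of CTDIvol is one common convention (the paper's exact normalisation may differ), and the ROI values below are made up.

```python
import numpy as np

def cnr(roi_lesion, roi_background):
    """Contrast-to-noise ratio from two pixel ROIs (HU arrays)."""
    return abs(np.mean(roi_lesion) - np.mean(roi_background)) / np.std(roi_background)

def dose_normalized_cnr(cnr_value, ctdi_vol):
    """CNR divided by sqrt(CTDIvol), so protocols at different doses compare."""
    return cnr_value / np.sqrt(ctdi_vol)

rng = np.random.default_rng(3)
lesion = rng.normal(90, 12, size=500)        # hypothetical lesion HU samples
background = rng.normal(60, 12, size=500)    # hypothetical parenchyma HU samples
print(round(dose_normalized_cnr(cnr(lesion, background), ctdi_vol=10.0), 2))
```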

  8. Measurement of diabetic wounds with optical coherence tomography-based air-jet indentation system and a material testing system.

    Science.gov (United States)

    Choi, M-C; Cheung, K-K; Ng, G Y-F; Zheng, Y-P; Cheing, G L-Y

    2015-11-01

    The material testing system is a conventional but destructive method for measuring the biomechanical properties of wound tissues in basic research. The recently developed optical coherence tomography-based air-jet indentation system is a non-destructive method for measuring these properties of soft tissues in a non-contact manner. The aim of the study was to examine the correlation between the biomechanical properties of wound tissues measured by the two systems. Young male Sprague-Dawley rats with streptozotocin-induced diabetes were wounded by a 6 mm biopsy punch on their hind limbs. The biomechanical properties of wound tissues were assessed with the two systems on post-wounding days 3, 7, 10, 14, and 21. Wound sections were stained with picro-sirius red for analysis of the collagen fibres. Data obtained on the different days were charted to obtain the change in biomechanical properties across the time points, and then pooled to examine the correlation between measurements made by the two devices. Qualitative analysis was performed to determine any correlation between indentation stiffness measured by the air-jet indentation system and the orientation of collagen fibres. The indentation stiffness was significantly negatively correlated with the maximum load, maximum tensile stress, and Young's modulus measured by the material testing system (all comparisons significant), supporting the use of the air-jet indentation system to evaluate the biomechanical properties of wounds in a non-contact manner. It is a potential clinical device to examine the biomechanical properties of chronic wounds in vivo in a repeatable manner.

  9. Repeatability of Computerized Tomography-Based Anthropomorphic Measurements of Frailty in Patients With Pulmonary Fibrosis Undergoing Lung Transplantation.

    Science.gov (United States)

    McClellan, Taylor; Allen, Brian C; Kappus, Matthew; Bhatti, Lubna; Dafalla, Randa A; Snyder, Laurie D; Bashir, Mustafa R

    To determine interreader and intrareader repeatability of, and correlations among, computerized tomography-based anthropomorphic measurements in patients with pulmonary fibrosis undergoing lung transplantation. This was an institutional review board-approved, Health Insurance Portability and Accountability Act-compliant retrospective study of 23 randomly selected subjects (19 male and 4 female; median age = 69 years; range: 66-77 years) with idiopathic pulmonary fibrosis undergoing pulmonary transplantation, who had also undergone preoperative thoracoabdominal computerized tomography. Five readers of varying imaging experience independently performed the following cross-sectional area measurements at the inferior endplate of the L3 vertebral body: right and left psoas muscles, right and left paraspinal muscles, total abdominal musculature, and visceral and subcutaneous fat. The following measurements were obtained at the inferior endplate of T6: right and left paraspinal muscles with and without including the trapezius muscles, and subcutaneous fat. Three readers repeated all measurements to assess intrareader repeatability. Intrareader repeatability was nearly perfect (intraclass correlation coefficients = 0.99, P < 0.001). Interreader agreement was excellent across all 5 readers (intraclass correlation coefficients: 0.71-0.99, P < 0.001). Coefficients of variation between measures ranged from 3.2%-6.8% for abdominal measurements, but were higher for thoracic measurements, up to 23.9%. Correlation between total paraspinal and total psoas muscle area was strong (r² = 0.67, P < 0.001). Thoracic and abdominal musculature had a weaker correlation (r² = 0.35-0.38, P < 0.001). Measures of thoracic and abdominal muscle and fat area are highly repeatable in patients with pulmonary fibrosis undergoing lung transplantation. Measures of muscle area are strongly correlated among abdominal locations, but inversely correlated between abdominal and thoracic locations.
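
    Reader agreement of this kind is typically summarised with an intraclass correlation coefficient; the sketch below computes ICC(2,1) (two-way random effects, absolute agreement, single rater) on synthetic readings. The specific ICC form and the numbers are assumptions, since the abstract does not state which variant was used.

```python
import numpy as np

def icc_2_1(data):
    """ICC(2,1) for an (n subjects x k readers) array of measurements."""
    data = np.asarray(data, float)
    n, k = data.shape
    grand = data.mean()
    ms_r = k * np.sum((data.mean(axis=1) - grand) ** 2) / (n - 1)   # subjects
    ms_c = n * np.sum((data.mean(axis=0) - grand) ** 2) / (k - 1)   # readers
    ss_e = np.sum((data - data.mean(axis=1, keepdims=True)
                   - data.mean(axis=0, keepdims=True) + grand) ** 2)
    ms_e = ss_e / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# Toy example: 23 subjects, 5 readers, small reader offsets plus noise.
rng = np.random.default_rng(8)
truth = rng.normal(1500, 300, size=(23, 1))            # e.g. muscle area in mm^2
readings = truth + rng.normal(0, 30, size=(23, 5)) + np.array([0, 10, -5, 15, -10])
print(round(icc_2_1(readings), 3))                     # high agreement expected
```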

  10. Technetium-99m sestamibi myocardial tomography based on dipyridamole echocardiography testing in hypertensive patients with chest pain

    Energy Technology Data Exchange (ETDEWEB)

    Schillaci, O. [Section of Nuclear Medicine, Department of Experimental Medicine and Pathology, University "La Sapienza", Rome (Italy)]; Moroni, C. [Department of Internal Medicine, University "La Sapienza", Rome (Italy)]; Scopinaro, F. [Section of Nuclear Medicine, Department of Experimental Medicine and Pathology, University "La Sapienza", Rome (Italy)]; Tavolaro, R. [Section of Nuclear Medicine, Department of Experimental Medicine and Pathology, University "La Sapienza", Rome (Italy)]; Danieli, R. [Section of Nuclear Medicine, Department of Experimental Medicine and Pathology, University "La Sapienza", Rome (Italy)]; Bossini, A. [Department of Internal Medicine, University "La Sapienza", Rome (Italy)]; Cassone, R. [Department of Internal Medicine, University "La Sapienza", Rome (Italy)]; Colella, A.C. [Section of Nuclear Medicine, Department of Experimental Medicine and Pathology, University "La Sapienza", Rome (Italy)]

    1997-07-01

    The aim of this study was to evaluate the diagnostic capability of technetium-99m sestamibi tomography based on dipyridamole echocardiography testing in hypertensives with chest pain, and to compare the scintigraphic results with those of coronary angiography, exercise electrocardiography and dipyridamole echocardiography. Forty subjects with mild to moderate hypertension, chest pain and no previous myocardial infarction were submitted to 99mTc-sestamibi tomography (at rest and after high-dose dipyridamole echocardiography) and to exercise electrocardiography testing. At coronary angiography 22 patients (group A) had significant epicardial coronary artery disease (≥70% stenosis of at least one major vessel) and 18 had normal main coronary vessels (group B). Dipyridamole 99mTc-sestamibi imaging was positive in 21/22 patients of group A and in 5/18 of group B. Dipyridamole echocardiography was positive in 18/22 patients of group A and in 5/18 of group B. Exercise electrocardiography was positive in 15/22 patients of group A and in 11/18 of group B. Four out of five subjects in group B with positive results in all the tests showed a slow run-off of angiographic contrast medium, probably due to small-vessel disease. Significant epicardial coronary artery disease in hypertensives with chest pain is unlikely when dipyridamole 99mTc-sestamibi tomography is negative. When scintigraphy is positive, either epicardial coronary artery disease or small-vessel disease is possible. The association of scintigraphy with dipyridamole echocardiography testing allows the assessment of contractile function and myocardial perfusion by a single pharmacological stress. (orig./AJ). With 3 figs., 2 tabs.

  11. A Comparison of Accuracy of Image- versus Hardware-based Tracking Technologies in 3D Fusion in Aortic Endografting.

    Science.gov (United States)

    Rolls, A E; Maurel, B; Davis, M; Constantinou, J; Hamilton, G; Mastracci, T M

    2016-09-01

    Fusion of three-dimensional (3D) computed tomography and intraoperative two-dimensional imaging in endovascular surgery relies on manual rigid co-registration of bony landmarks and tracking of hardware to provide a 3D overlay (hardware-based tracking, HWT). An alternative technique (image-based tracking, IMT) uses image recognition to register and place the fusion mask. We present preliminary experience with an agnostic fusion technology that uses IMT, with the aim of comparing the accuracy of overlay for this technology with HWT. Data were collected prospectively for 12 patients. All devices were deployed using both IMT and HWT fusion assistance concurrently. Postoperative analysis of both systems was performed by three blinded expert observers, from selected time-points during the procedures, using the displacement of fusion rings, the overlay of vascular markings and the true ostia of the renal arteries. The mean overlay error and the deviation from the mean error were derived using image analysis software. Comparison of the mean overlay error was made between IMT and HWT. The validity of the point-picking technique was assessed. IMT was successful in all of the first 12 cases, whereas technical learning curve challenges thwarted HWT in four cases. When independent operators assessed the degree of accuracy of the overlay, the median error for IMT was 3.9 mm (IQR 2.89-6.24, max 9.5) versus 8.64 mm (IQR 6.1-16.8, max 24.5) for HWT (p = .001). Variance per observer was 0.69 mm² and the 95% limit of agreement ±1.63 mm. In this preliminary study, the magnitude of the displacement error from the "true anatomy" during image overlay was smaller for IMT than for HWT. This confirms that ongoing manual re-registration, as recommended by the manufacturer, should be performed for HWT systems to maintain accuracy. The error in position of the fusion markers for IMT was consistent, and thus may be considered predictable. Copyright © 2016 European Society for Vascular Surgery. Published by
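
    The rigid co-registration of bony landmarks that HWT relies on is, at its core, a least-squares rigid alignment; a minimal Kabsch-style sketch (the landmarks and transform below are made up) is:

```python
import numpy as np

def rigid_register(P, Q):
    """Least-squares rigid registration (Kabsch): rotation R and translation t
    mapping landmark set P onto Q (column-vector convention, R @ p + t = q)."""
    Pc, Qc = P - P.mean(0), Q - Q.mean(0)
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1, 1, d]) @ U.T
    t = Q.mean(0) - R @ P.mean(0)
    return R, t

rng = np.random.default_rng(4)
P = rng.normal(size=(6, 3))                     # hypothetical bony landmarks (CT)
theta = np.deg2rad(20)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
Q = P @ R_true.T + np.array([5.0, -2.0, 1.0])   # same landmarks, intra-op frame
R, t = rigid_register(P, Q)
err = np.linalg.norm(P @ R.T + t - Q, axis=1)
print(err.max() < 1e-9)                         # exact fit without noise
```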

  12. TU-CD-BRA-08: Single-Energy Computed Tomography-Based Pulmonary Perfusion Imaging: Proof-Of-Principle in a Canine Model

    Energy Technology Data Exchange (ETDEWEB)

    Yamamoto, T; Boone, J [University of California Davis School of Medicine, Sacramento, CA (United States); Kent, M; Wisner, E [University of California Davis School of Veterinary Medicine, Davis, CA (United States); Fujita, Y [Tokai University, Isehara (Japan)

    2015-06-15

    Purpose: Pulmonary perfusion imaging has provided significant insights into pulmonary diseases, and can be useful in radiotherapy. The purpose of this study was to prospectively establish proof-of-principle in a canine model for single-energy CT-based perfusion imaging, which has the potential for widespread clinical implementation. Methods: Single-energy CT perfusion imaging is based on: (1) acquisition of inspiratory breath-hold CT scans before and after intravenous injection of iodinated contrast medium, (2) deformable image registration (DIR) of the two CT image data sets, and (3) subtraction of the pre-contrast image from the post-contrast image, yielding a map of Hounsfield unit (HU) enhancement. These subtraction image data sets hypothetically represent perfused blood volume, a surrogate for perfusion. In an IACUC-approved clinical trial, we acquired pre- and post-contrast CT scans in the prone posture for six anesthetized, mechanically-ventilated dogs. The elastix algorithm was used for DIR. The registration accuracy was quantified using the target registration errors (TREs) for 50 pulmonary landmarks in each dog. The gradient of HU enhancement between gravity-dependent (ventral) and non-dependent (dorsal) regions was evaluated to quantify the known effect of gravity, i.e., greater perfusion in ventral regions. Results: The lung volume difference between the two scans was 4.3±3.5% on average (range 0.3%–10.1%). DIR demonstrated an average TRE of 0.7±1.0 mm. HU enhancement in lung parenchyma was 34±10 HU on average and varied considerably between individual dogs, indicating the need for improvement of the contrast injection protocol. HU enhancement in ventral (gravity-dependent) regions was found to be greater than in dorsal regions. A population-average ventral-to-dorsal gradient of HU enhancement was strong (R²=0.94) and statistically significant (p<0.01). Conclusion: This canine study demonstrated relatively accurate DIR and a strong ventral-to-dorsal gradient of HU enhancement, providing proof-of-principle for single-energy CT pulmonary perfusion imaging. This ongoing study will enroll more dogs and investigate the physiological significance. This study was supported by a Philips Healthcare/Radiological Society of North America (RSNA) Research Seed Grant (RSD1458)
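
    The subtraction and gradient analysis can be sketched in a few lines; the 2-D toy image, orientation convention (ventral = first rows) and noise levels below are illustrative stand-ins, not the canine data.

```python
import numpy as np

def enhancement_map(post, pre_registered, lung_mask):
    """Perfused-blood-volume surrogate: HU enhancement = post - pre, within the
    lung mask, after deformable registration has been applied to `pre`."""
    return np.where(lung_mask, post - pre_registered, np.nan)

def ventral_dorsal_gradient(dhu):
    """Slope of mean enhancement versus ventral-to-dorsal row index (HU/row)."""
    rows = np.arange(dhu.shape[0])
    means = np.nanmean(dhu, axis=1)
    slope, _ = np.polyfit(rows, means, 1)
    return slope

# Toy 2-D example with a gravity-dependent (ventral) enhancement excess.
rng = np.random.default_rng(5)
pre = rng.normal(-800, 20, size=(64, 64))                 # pre-contrast lung HU
gradient = np.linspace(45, 25, 64)[:, None]               # ventral rows enhance more
post = pre + gradient + rng.normal(0, 5, size=(64, 64))
mask = np.ones((64, 64), dtype=bool)
print(round(ventral_dorsal_gradient(enhancement_map(post, pre, mask)), 2))
```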

  13. Optimal C-arm angulation during transcatheter aortic valve replacement: Accuracy of a rotational C-arm computed tomography based three dimensional heart model.

    Science.gov (United States)

    Veulemans, Verena; Mollus, Sabine; Saalbach, Axel; Pietsch, Max; Hellhammer, Katharina; Zeus, Tobias; Westenfeld, Ralf; Weese, Jürgen; Kelm, Malte; Balzer, Jan

    2016-10-26

    To investigate the accuracy of a rotational C-arm CT-based 3D heart model to predict an optimal C-arm configuration during transcatheter aortic valve replacement (TAVR). Rotational C-arm CT (RCT) under rapid ventricular pacing was performed in 57 consecutive patients with severe aortic stenosis as part of the pre-procedural cardiac catheterization. With prototype software each RCT data set was segmented using a 3D heart model. From that, the line-of-perpendicularity curve was obtained, which generates a perpendicular view of the aortic annulus according to the right-cusp rule. To evaluate the accuracy of a model-based overlay we compared model- and expert-derived aortic root diameters. For all 57 patients in the RCT cohort diameter measurements were obtained from two independent operators and were compared to the model-based measurements. The inter-observer variability was measured to be in the range of 0°-12.96° of angular C-arm displacement for two independent operators. The model-to-operator agreement was 0°-13.82°. The model-based and expert measurements of aortic root diameters evaluated at the aortic annulus agreed well (r = 0.79). The rotational C-arm CT-based 3D heart model can thus help predict an optimal C-arm configuration, potentially simplifying current clinical workflows before and during TAVR.
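
    Conceptually, the line-of-perpendicularity curve is the set of gantry angulations whose beam direction lies in the annulus plane (orthogonal to its normal). The sketch below generates such a curve for a hypothetical annulus normal; the patient-axis and angle-sign conventions are assumptions, not those of the prototype software.

```python
import numpy as np

def line_of_perpendicularity(annulus_normal, n_points=36):
    """Candidate C-arm angulations whose beam direction is orthogonal to the
    annulus normal. Assumed patient axes: x = left, y = anterior, z = cranial;
    angle conventions are illustrative."""
    n = np.asarray(annulus_normal, float)
    n /= np.linalg.norm(n)
    # Orthonormal basis {u, w} of the plane perpendicular to n.
    helper = np.array([1.0, 0.0, 0.0]) if abs(n[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    u = np.cross(n, helper); u /= np.linalg.norm(u)
    w = np.cross(n, u)
    angles = []
    for phi in np.linspace(0, 2 * np.pi, n_points, endpoint=False):
        v = np.cos(phi) * u + np.sin(phi) * w               # candidate beam direction
        lao_rao = np.degrees(np.arctan2(v[0], -v[1]))       # +LAO / -RAO (assumed)
        cran_caud = np.degrees(np.arcsin(np.clip(v[2], -1, 1)))  # +cranial / -caudal
        angles.append((round(lao_rao, 1), round(cran_caud, 1)))
    return angles

# Example: a mildly tilted annulus normal (hypothetical segmentation output).
print(line_of_perpendicularity([0.2, -0.3, 0.93])[:5])
```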

  14. Novel high-resolution computed tomography-based radiomic classifier for screen-identified pulmonary nodules in the National Lung Screening Trial.

    Science.gov (United States)

    Peikert, Tobias; Duan, Fenghai; Rajagopalan, Srinivasan; Karwoski, Ronald A; Clay, Ryan; Robb, Richard A; Qin, Ziling; Sicks, JoRean; Bartholmai, Brian J; Maldonado, Fabien

    2018-01-01

    Optimization of the clinical management of screen-detected lung nodules is needed to avoid unnecessary diagnostic interventions. Herein we demonstrate the potential value of a novel radiomics-based approach for the classification of screen-detected indeterminate nodules. Independent quantitative variables assessing various radiologic nodule features such as sphericity, flatness, elongation, spiculation, lobulation and curvature were developed from the NLST dataset using 726 indeterminate nodules (all ≥ 7 mm; benign, n = 318 and malignant, n = 408). Multivariate analysis was performed using the least absolute shrinkage and selection operator (LASSO) method for variable selection and regularization in order to enhance the prediction accuracy and interpretability of the multivariate model. The bootstrapping method was then applied for internal validation, and the optimism-corrected AUC was reported for the final model. Eight of the originally considered 57 quantitative radiologic features were selected by LASSO multivariate modeling. These 8 features include variables capturing location: vertical location (Offset carina centroid z); size: volume estimate (Minimum enclosing brick); shape: flatness; density: texture analysis (Score Indicative of Lesion/Lung Aggression/Abnormality (SILA) texture); and surface characteristics: surface complexity (Maximum shape index and Average shape index) and estimates of surface curvature (Average positive mean curvature and Minimum mean curvature), all statistically significant in the model. This radiomics-based approach to characterizing screen-detected nodules appears extremely promising; however, independent external validation is needed.
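
    As a rough illustration of the modeling strategy described above (not the authors' code), an L1-penalized logistic model with a bootstrap optimism correction of the AUC can be sketched with scikit-learn; X and y are hypothetical stand-ins for the NLST radiomic feature matrix and benign/malignant labels.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.utils import resample

        def optimism_corrected_auc(X, y, n_boot=200, C=0.1, seed=0):
            # LASSO-style variable selection via an L1 penalty, followed by a Harrell-type
            # bootstrap correction of the apparent (resubstitution) AUC.
            rng = np.random.RandomState(seed)
            fit = lambda Xf, yf: LogisticRegression(penalty="l1", solver="liblinear", C=C).fit(Xf, yf)
            model = fit(X, y)
            apparent = roc_auc_score(y, model.decision_function(X))
            optimism = 0.0
            for _ in range(n_boot):
                Xb, yb = resample(X, y, stratify=y, random_state=rng)
                m = fit(Xb, yb)
                optimism += (roc_auc_score(yb, m.decision_function(Xb))
                             - roc_auc_score(y, m.decision_function(X))) / n_boot
            return model, apparent - optimism

    The selected features are those with non-zero coefficients in model.coef_; in practice the penalty strength C would be chosen by cross-validation.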

  15. Feasibility of differential quantification of 3D temporomandibular kinematics during various oral activities using a cone-beam computed tomography-based 3D fluoroscopic method

    Directory of Open Access Journals (Sweden)

    Chien-Chih Chen

    2013-06-01

    Conclusion: A new CBCT-based 3D fluoroscopic method was proposed and shown to be capable of quantitatively differentiating TMJ movement patterns among complicated functional activities. It also enabled a complete description of the rigid-body mandibular motion and descriptions of as many reference points as needed for future clinical applications. It will be helpful for dental practice and for a better understanding of the functions of the TMJ.

  16. Computed tomography-based lung nodule volumetry - do optimized reconstructions of routine protocols achieve similar accuracy, reproducibility and interobserver variability to that of special volumetry protocols?

    International Nuclear Information System (INIS)

    Bolte, H.; Riedel, C.; Knoess, N.; Hoffmann, B.; Heller, M.; Biederer, J.; Freitag, S.

    2007-01-01

    Purpose: The aim of this in vitro and ex vivo CT study was to investigate whether the use of a routine thorax protocol (RTP) with optimized reconstruction parameters can provide comparable accuracy, reproducibility and interobserver variability of volumetric analyses to that of a special volumetry protocol (SVP). Materials and Methods: To assess accuracy, 3 polyurethane (PU) spheres (35 HU; diameters: 4, 6 and 10 mm) were examined with a recommended SVP using a multislice CT (collimation 16 x 0.75 mm, pitch 1.25, 20 mAs, slice thickness 1 mm, increment 0.7 mm, medium kernel) and an optimized RTP (collimation 16 x 1.5 mm, pitch 1.25, 100 mAs, reconstructed slice thickness 2 mm, increment 0.4 mm, sharp kernel). For the assessment of intrascan and interscan reproducibility and interobserver variability, 20 artificial small pulmonary nodules were placed in a dedicated ex vivo chest phantom and examined with identical scan protocols. The artificial lesions consisted of a fat-wax-Lipiodol® mixture. Phantoms and ex vivo lesions were examined afterwards using commercial volumetry software. To describe accuracy, the relative deviations from the true volumes of the PU phantoms were calculated. For intrascan and interscan reproducibility and interobserver variability, the 95 % normal range (95 % NR) of relative deviations between two measurements was calculated. Results: For the SVP the achieved relative deviations for the 4, 6 and 10 mm PU phantoms were -14.3 %, -12.7 % and -6.8 %, and were 4.5 %, -0.6 % and -2.6 %, respectively, for the optimized RTP. SVP showed a 95 % NR of 0 % to 1.5 % for intrascan and a 95 % NR of -10.8 % to 2.9 % for interscan reproducibility. The 95 % NR for interobserver variability was -4.3 % to 3.3 %. The optimized RTP achieved a 95 % NR of -3.1 % to 4.3 % for intrascan reproducibility and a 95 % NR of -7.0 % to 3.5 % for interscan reproducibility. The 95 % NR for interobserver variability was -0.4 % to 6.8 %. (orig.)
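
    The two summary statistics quoted above, the relative deviation from the known sphere volume and a 95 % normal range of relative deviations between paired measurements, can be written out as follows. This is a generic formulation; the paper's exact definition of the 95 % NR may differ.

        import numpy as np

        def relative_deviation(measured_ml, true_ml):
            # Relative deviation (%) of measured nodule volumes from the known truth.
            return 100.0 * (np.asarray(measured_ml) - true_ml) / true_ml

        def normal_range_95(first_ml, second_ml):
            # 95 % normal range of relative deviations between paired measurements,
            # taken here as mean +/- 1.96 SD of the paired relative differences.
            first, second = np.asarray(first_ml, float), np.asarray(second_ml, float)
            rel = 200.0 * (second - first) / (second + first)
            return rel.mean() - 1.96 * rel.std(ddof=1), rel.mean() + 1.96 * rel.std(ddof=1)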

  17. TU-CD-BRA-08: Single-Energy Computed Tomography-Based Pulmonary Perfusion Imaging: Proof-Of-Principle in a Canine Model

    International Nuclear Information System (INIS)

    Yamamoto, T; Boone, J; Kent, M; Wisner, E; Fujita, Y

    2015-01-01

    Purpose: Pulmonary perfusion imaging has provided significant insights into pulmonary diseases, and can be useful in radiotherapy. The purpose of this study was to prospectively establish proof-of-principle in a canine model for single-energy CT-based perfusion imaging, which has the potential for widespread clinical implementation. Methods: Single-energy CT perfusion imaging is based on: (1) acquisition of inspiratory breath-hold CT scans before and after intravenous injection of iodinated contrast medium, (2) deformable image registration (DIR) of the two CT image data sets, and (3) subtraction of the pre-contrast image from the post-contrast image, yielding a map of Hounsfield unit (HU) enhancement. These subtraction image data sets hypothetically represent perfused blood volume, a surrogate for perfusion. In an IACUC-approved clinical trial, we acquired pre- and post-contrast CT scans in the prone posture for six anesthetized, mechanically-ventilated dogs. The elastix algorithm was used for DIR. The registration accuracy was quantified using the target registration errors (TREs) for 50 pulmonary landmarks in each dog. The gradient of HU enhancement between gravity-dependent (ventral) and non-dependent (dorsal) regions was evaluated to quantify the known effect of gravity, i.e., greater perfusion in ventral regions. Results: The lung volume difference between the two scans was 4.3±3.5% on average (range 0.3%–10.1%). DIR demonstrated an average TRE of 0.7±1.0 mm. HU enhancement in lung parenchyma was 34±10 HU on average and varied considerably between individual dogs, indicating the need for improvement of the contrast injection protocol. HU enhancement in ventral (gravity-dependent) regions was found to be greater than in dorsal regions. The population-average ventral-to-dorsal gradient of HU enhancement was strong (R² = 0.94) and statistically significant (p<0.01). Conclusion: This canine study demonstrated relatively accurate DIR and a strong ventral-to-dorsal gradient of HU enhancement, providing proof-of-principle for single-energy CT pulmonary perfusion imaging. This ongoing study will enroll more dogs and investigate the physiological significance. This study was supported by a Philips Healthcare/Radiological Society of North America (RSNA) Research Seed Grant (RSD1458)

  18. Diagnosis of drowning using post-mortem computed tomography based on the volume and density of fluid accumulation in the maxillary and sphenoid sinuses.

    Science.gov (United States)

    Kawasumi, Yusuke; Kawabata, Tomoyoshi; Sugai, Yusuke; Usui, Akihito; Hosokai, Yoshiyuki; Sato, Miho; Saito, Haruo; Ishibashi, Tadashi; Hayashizaki, Yoshie; Funayama, Masato

    2013-10-01

    Recent studies have reported that drowning victims frequently have fluid accumulation in the paranasal sinuses, most notably the maxillary and sphenoid sinuses. However, in our previous study, many non-drowning victims also had fluid accumulation in the sinuses. Therefore, we evaluated the qualitative difference in fluid accumulation between drowning and non-drowning cases in the present study. Thirty-eight drowning and 73 non-drowning cases were investigated retrospectively. The fluid volume and density of each case were calculated using a DICOM workstation. The drowning cases were compared with the non-drowning cases using the Mann-Whitney U-test because the data showed a non-normal distribution. The median fluid volume was 1.82 (range 0.02-11.7) ml in the drowning cases and 0.49 (0.03-8.7) ml in the non-drowning cases, and the median fluid density was 22 (-14 to 66) and 39 (-65 to 77) HU, respectively. Both volume and density differed significantly between the drowning and non-drowning cases (p=0.001, p=0.0007). Regarding cut-off levels in the ROC analysis, the points on the ROC curve closest to (0, 1) were 1.03 ml (sensitivity 68%, specificity 68%, PPV 53%, NPV 81%) and 27.5 HU (61%, 70%, 51%, 77%). The cut-offs according to the Youden index were 1.03 ml and 37.8 HU (84%, 51%, 47%, 86%). When the cut-off level was set at 1.03 ml and 27.5 HU, the sensitivity was 42%, specificity 45%, PPV 29% and NPV 60%. When the cut-off level was set at 1.03 ml and 37.8 HU, sensitivity was 58%, specificity 32%, PPV 31% and NPV 59%. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
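
    The two cut-off criteria used above (the ROC point closest to (0, 1) and the Youden index) can be computed directly from per-case data; the sketch below assumes the marker increases with the probability of drowning, as sinus-fluid volume does, so for fluid density, which was lower in the drowning cases, the marker would be negated first.

        import numpy as np
        from sklearn.metrics import roc_curve

        def roc_cutoffs(marker_values, is_drowning):
            # Returns (cut-off closest to the (0, 1) corner, Youden-index cut-off).
            fpr, tpr, thresholds = roc_curve(is_drowning, marker_values)
            closest = thresholds[np.argmin(np.hypot(fpr, 1.0 - tpr))]
            youden = thresholds[np.argmax(tpr - fpr)]
            return closest, youden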

  19. A PDE-constrained SQP algorithm for optical tomography based on the frequency-domain equation of radiative transfer

    International Nuclear Information System (INIS)

    Kim, Hyun Keol; Hielscher, Andreas H

    2009-01-01

    It is well acknowledged that transport-theory-based reconstruction algorithms can provide the most accurate reconstruction results, especially when small tissue volumes or highly absorbing media are considered. However, these codes have a high computational burden and often converge only slowly. Therefore, methods that accelerate the computation are highly desirable. To this end, we introduce in this work a partial-differential-equation (PDE) constrained approach to optical tomography that makes use of an all-at-once reduced Hessian sequential quadratic programming (rSQP) scheme. The proposed scheme treats the forward and inverse variables independently, which makes it possible to update the radiation intensities and the optical coefficients simultaneously by solving the forward and inverse problems, all at once. We evaluate the performance of the proposed scheme with numerical and experimental data, and find that the rSQP scheme can reduce the computation time by a factor of 10–25, as compared to the commonly employed limited memory BFGS method. At the same time, accuracy and robustness are not compromised, even in the presence of noise.

  20. A new approach to global seismic tomography based on regularization by sparsity in a novel 3D spherical wavelet basis

    Science.gov (United States)

    Loris, Ignace; Simons, Frederik J.; Daubechies, Ingrid; Nolet, Guust; Fornasier, Massimo; Vetter, Philip; Judd, Stephen; Voronin, Sergey; Vonesch, Cédric; Charléty, Jean

    2010-05-01

    Global seismic wavespeed models are routinely parameterized in terms of spherical harmonics, networks of tetrahedral nodes, rectangular voxels, or spherical splines. Up to now, Earth model parametrizations by wavelets on the three-dimensional ball remain uncommon. Here we propose such a procedure with the following three goals in mind: (1) The multiresolution character of a wavelet basis allows for the models to be represented with an effective spatial resolution that varies as a function of position within the Earth. (2) This property can be used to great advantage in the regularization of seismic inversion schemes by seeking the most sparse solution vector, in wavelet space, through iterative minimization of a combination of the ℓ2 (to fit the data) and ℓ1 norms (to promote sparsity in wavelet space). (3) With the continuing increase in high-quality seismic data, our focus is also on numerical efficiency and the ability to use parallel computing in reconstructing the model. In this presentation we propose a new wavelet basis to take advantage of these three properties. To form the numerical grid we begin with a surface tessellation known as the 'cubed sphere', a construction popular in fluid dynamics and computational seismology, coupled with a semi-regular radial subdivision that honors the major seismic discontinuities between the core-mantle boundary and the surface. This mapping first divides the volume of the mantle into six portions. In each 'chunk', two angular and one radial variable are used for parametrization. In the new variables, standard 'Cartesian' algorithms can more easily be used to perform the wavelet transform (or other common transforms). Edges between chunks are handled by special boundary filters. We highlight the benefits of this construction and use it to analyze the information present in several published seismic compressional-wavespeed models of the mantle, paying special attention to the statistics of wavelet and scaling coefficients.
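
    The sparsity-promoting inversion described in goal (2), minimizing a combination of an ℓ2 data misfit and an ℓ1 penalty on the wavelet coefficients, is commonly solved by iterative soft-thresholding. The sketch below is a generic illustration of that minimization, not the authors' solver; the matrix A is assumed to combine the seismic forward operator with the inverse wavelet transform, so the unknowns c are wavelet coefficients.

        import numpy as np

        def soft_threshold(x, t):
            return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

        def ista(A, d, lam, n_iter=500):
            # Solves min_c 0.5 * ||A c - d||_2^2 + lam * ||c||_1 by iterative soft-thresholding.
            step = 1.0 / np.linalg.norm(A, 2) ** 2   # inverse Lipschitz constant of the gradient
            c = np.zeros(A.shape[1])
            for _ in range(n_iter):
                grad = A.T @ (A @ c - d)
                c = soft_threshold(c - step * grad, step * lam)
            return c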

  1. Iterative image reconstruction for positron emission tomography based on a detector response function estimated from point source measurements

    International Nuclear Information System (INIS)

    Tohme, Michel S; Qi Jinyi

    2009-01-01

    The accuracy of the system model in an iterative reconstruction algorithm greatly affects the quality of reconstructed positron emission tomography (PET) images. For efficient computation in reconstruction, the system model in PET can be factored into a product of a geometric projection matrix and a sinogram blurring matrix, where the former is often computed based on analytical calculation, and the latter is estimated using Monte Carlo simulations. Direct measurement of a sinogram blurring matrix is difficult in practice because of the requirement of a collimated source. In this work, we propose a method to estimate the 2D blurring kernels from uncollimated point source measurements. Since the resulting sinogram blurring matrix stems from actual measurements, it can take into account the physical effects in the photon detection process that are difficult or impossible to model in a Monte Carlo (MC) simulation, and hence provide a more accurate system model. Another advantage of the proposed method over MC simulation is that it can easily be applied to data that have undergone a transformation to reduce the data size (e.g., Fourier rebinning). Point source measurements were acquired with high count statistics in a relatively fine grid inside the microPET II scanner using a high-precision 2D motion stage. A monotonically convergent iterative algorithm has been derived to estimate the detector blurring matrix from the point source measurements. The algorithm takes advantage of the rotational symmetry of the PET scanner and explicitly models the detector block structure. The resulting sinogram blurring matrix is incorporated into a maximum a posteriori (MAP) image reconstruction algorithm. The proposed method has been validated using a 3 × 3 line phantom, an ultra-micro resolution phantom and a ²²Na point source superimposed on a warm background. The results of the proposed method show improvements in both resolution and contrast ratio when compared with the MAP
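
    In the factored system model described above, the forward projection is a geometric projection followed by a sinogram blurring step. In much-simplified form, and with hypothetical array names, applying a measured 1D blurring kernel along the radial bins of a geometric sinogram looks like this:

        import numpy as np
        from scipy.ndimage import convolve1d

        def blurred_forward_projection(geometric_sinogram, radial_kernel):
            # y = B (G x): convolve each projection angle's radial profile with the
            # detector blurring kernel estimated from the point-source measurements.
            # Rows are assumed to index radial bins and columns projection angles.
            return convolve1d(geometric_sinogram, radial_kernel, axis=0, mode="nearest")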

  2. A novel post-processing scheme for two-dimensional electrical impedance tomography based on artificial neural networks.

    Directory of Open Access Journals (Sweden)

    Sébastien Martin

    Full Text Available Electrical Impedance Tomography (EIT) is a powerful non-invasive technique for imaging applications. The goal is to estimate the electrical properties of living tissues by measuring the potential at the boundary of the domain. Being safe with respect to patient health, non-invasive, and having no known hazards, EIT is an attractive and promising technology. However, it suffers from a particular technical difficulty, which consists of solving a nonlinear inverse problem in real time. Several nonlinear approaches have been proposed as a replacement for the linear solver, but in practice very few are capable of stable, high-quality, and real-time EIT imaging because of their very low robustness to errors and inaccurate modeling, or because they require considerable computational effort. In this paper, a post-processing technique based on an artificial neural network (ANN) is proposed to obtain a nonlinear solution to the inverse problem, starting from a linear solution. While common reconstruction methods based on ANNs estimate the solution directly from the measured data, the method proposed here enhances the solution obtained from a linear solver. Applying a linear reconstruction algorithm before applying an ANN reduces the effects of noise and modeling errors. Hence, this approach significantly reduces the error associated with solving 2D inverse problems using machine-learning-based algorithms. This work presents radical enhancements in the stability of nonlinear methods for biomedical EIT applications.
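
    A minimal sketch of the post-processing idea described above: train a network to map linear EIT reconstructions to reference conductivity images, rather than reconstructing directly from boundary voltages. A scikit-learn multilayer perceptron stands in for the paper's ANN; the training arrays and network size are hypothetical.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        def train_postprocessor(linear_recons, reference_images):
            # Rows are flattened 2D images: inputs from a linear EIT solver,
            # targets from simulation or phantom ground truth.
            net = MLPRegressor(hidden_layer_sizes=(256, 256), max_iter=2000, random_state=0)
            net.fit(linear_recons, reference_images)
            return net

        def refine(net, linear_image):
            # Enhance a single linear reconstruction while keeping its shape.
            return net.predict(linear_image.reshape(1, -1)).reshape(linear_image.shape)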

  3. Long ranging swept-source optical coherence tomography-based angiography outperforms its spectral-domain counterpart in imaging human skin microcirculations

    Science.gov (United States)

    Xu, Jingjiang; Song, Shaozhen; Men, Shaojie; Wang, Ruikang K.

    2017-11-01

    There is an increasing demand for imaging tools in clinical dermatology that can perform in vivo wide-field morphological and functional examination from surface to deep tissue regions at various skin sites of the human body. The conventional spectral-domain optical coherence tomography-based angiography (SD-OCTA) system struggles to meet these requirements due to its fundamental limitations in sensitivity roll-off, imaging range and imaging speed. To mitigate these issues, we demonstrate a swept-source OCTA (SS-OCTA) system by employing a swept source based on a vertical cavity surface-emitting laser. A series of comparisons between SS-OCTA and SD-OCTA is conducted. Benefiting from the high system sensitivity, long imaging range, and superior roll-off performance, the SS-OCTA system demonstrates better performance in imaging human skin than the SD-OCTA system. We show that SS-OCTA permits remarkable deep visualization of both structure and vasculature (up to ~2 mm penetration) with wide field-of-view capability (up to 18 × 18 mm²), enabling a more comprehensive assessment of the morphological features as well as functional blood vessel networks from the superficial epidermal to deep dermal layers. It is expected that the advantages of the SS-OCTA system will provide grounds for clinical translation, benefiting existing dermatological practice.

  4. Prognostic value of combined CT angiography and myocardial perfusion imaging versus invasive coronary angiography and nuclear stress perfusion imaging in the prediction of major adverse cardiovascular events

    DEFF Research Database (Denmark)

    Chen, Marcus Y.; Rochitte, Carlos E.; Arbab-Zadeh, Armin

    2017-01-01

    Purpose: To compare the prognostic importance (time to major adverse cardiovascular event [MACE]) of combined computed tomography (CT) angiography and CT myocardial stress perfusion imaging with that of combined invasive coronary angiography (ICA) and stress single photon emission CT myocardial p...

  5. Computed tomography of cryogenic cells

    International Nuclear Information System (INIS)

    Schneider, Gerd; Anderson, E.; Vogt, S.; Knochel, C.; Weiss, D.; LeGros, M.; Larabell, C.

    2001-01-01

    Due to the short wavelengths of X-rays and low numerical aperture of the Fresnel zone plates used as X-ray objectives, the depth of field is several microns. Within the focal depth, imaging a thick specimen is to a good approximation equivalent to projecting the specimen absorption. Therefore, computed tomography based on a tilt series of X-ray microscopic images can be used to reconstruct the local linear absorption coefficient and image the three-dimensional specimen structure. To preserve the structural integrity of biological objects during image acquisition, microscopy is performed at cryogenic temperatures. Tomography based on X-ray microscopic images was applied to study the distribution of male specific lethal 1 (MSL-1), a nuclear protein involved in dosage compensation in Drosophila melanogaster, which ensures that males with a single X chromosome have the same amount of most X-linked gene products as females with two X chromosomes. Tomographic reconstructions of X-ray microscopic images were used to compute the local three-dimensional linear absorption coefficient, revealing the arrangement of internal structures of Drosophila melanogaster cells. Combined with labelling techniques, nanotomography is a new technique to study the 3D distribution of selected proteins inside whole cells. We want to improve this technique with respect to resolution and specimen preparation. The resolution in the reconstruction can be significantly improved by reducing the angular step size to collect more viewing angles, which requires an automated data acquisition. In addition, fast-freezing with liquid ethane instead of cryogenic He gas will be applied to improve the vitrification of the hydrated samples. We also plan to apply cryo X-ray nanotomography in order to study different types of cells and their nuclear protein distributions.
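
    The reconstruction principle mentioned above, in which each image of the tilt series approximates a projection of the linear absorption coefficient, can be illustrated on a toy 2D phantom with scikit-image; this is a didactic sketch of filtered back-projection, not the cryo-nanotomography pipeline itself.

        import numpy as np
        from skimage.transform import radon, iradon

        phantom = np.zeros((128, 128))
        phantom[40:90, 50:80] = 1.0                              # toy absorbing structure
        angles = np.linspace(0.0, 180.0, 90, endpoint=False)     # projection angles (full range
                                                                 # here; real tilt series are limited)
        sinogram = radon(phantom, theta=angles)                  # projections of the absorption
        reconstruction = iradon(sinogram, theta=angles, filter_name="ramp")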

  6. Comparison of five segmentation tools for 18F-fluoro-deoxy-glucose-positron emission tomography-based target volume definition in head and neck cancer.

    NARCIS (Netherlands)

    Schinagl, D.A.X.; Vogel, W.V.; Hoffmann, A.L.; Dalen, J.A. van; Oyen, W.J.G.; Kaanders, J.H.A.M.

    2007-01-01

    PURPOSE: Target-volume delineation for radiation treatment to the head and neck area traditionally is based on physical examination, computed tomography (CT), and magnetic resonance imaging. Additional molecular imaging with (18)F-fluoro-deoxy-glucose (FDG)-positron emission tomography (PET) may

  7. Influence of Cone-Beam Computed Tomography filters on diagnosis of simulated endodontic complications.

    Science.gov (United States)

    Verner, F S; D'Addazio, P S; Campos, C N; Devito, K L; Almeida, S M; Junqueira, R B

    2017-11-01

    To evaluate the influence of cone-beam computed tomography (CBCT) filters on the diagnosis of simulated endodontic complications. Sixteen human teeth, in three mandibles, were submitted to the following simulated endodontic complications: (G1) fractured file, (G2) perforations in the canal walls, (G3) deviated cast post, and (G4) external root resorption. The mandibles were submitted to CBCT examination (I-Cat® Next Generation). Five oral radiologists evaluated the images independently with and without XoranCat® software filters. Accuracy, sensitivity and specificity were determined. ROC curves were calculated for each group with the filters, and the areas under the curves were compared using a one-way ANOVA test. McNemar's test was applied for pair-wise agreement between all images versus the gold standard and between original images versus images with filters. A significant difference between filtered and original images (P = 0.00 for all filters) was found only in the G1 group. There were no differences in the other groups. The filters did not improve the diagnosis of the simulated endodontic complications evaluated. Their diagnosis remains a major challenge in clinical practice. © 2016 International Endodontic Journal. Published by John Wiley & Sons Ltd.
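
    A sketch of the paired comparison used in the study: McNemar's test on correct/incorrect readings of the same lesions with and without a filter. The boolean arrays are hypothetical reading results, one entry per evaluated lesion.

        import numpy as np
        from statsmodels.stats.contingency_tables import mcnemar

        def paired_diagnostic_comparison(correct_without_filter, correct_with_filter):
            a = np.asarray(correct_without_filter, bool)
            b = np.asarray(correct_with_filter, bool)
            # 2 x 2 table of concordant/discordant pairs; the discordant cells drive the test.
            table = [[np.sum(a & b), np.sum(a & ~b)],
                     [np.sum(~a & b), np.sum(~a & ~b)]]
            return mcnemar(table, exact=True)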

  8. Optical Computing

    OpenAIRE

    Woods, Damien; Naughton, Thomas J.

    2008-01-01

    We consider optical computers that encode data using images and compute by transforming such images. We give an overview of a number of such optical computing architectures, including descriptions of the type of hardware commonly used in optical computing, as well as some of the computational efficiencies of optical devices. We go on to discuss optical computing from the point of view of computational complexity theory, with the aim of putting some old, and some very recent, re...

  9. Computed Tomography-Based Anatomic Assessment Overestimates Local Tumor Recurrence in Patients With Mass-like Consolidation After Stereotactic Body Radiotherapy for Early-Stage Non-Small Cell Lung Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Dunlap, Neal E. [Department of Radiation Oncology, University of Louisville, Louisville, KY (United States); Yang Wensha [Department of Radiation Oncology, Cedars Sinai Medical Center, Los Angeles, CA (United States); McIntosh, Alyson [Department of Radiation Oncology, John and Dorothy Morgan Cancer Center, Lehigh Valley Hospital, Allentown, PA (United States); Sheng, Ke [Department of Radiation Oncology, David Geffen School of Medicine at University of California Los Angeles, Los Angeles, CA (United States); Benedict, Stanley H.; Read, Paul W. [Department of Radiation Oncology, University of Virginia, Charlottesville, VA (United States); Larner, James M., E-mail: jml2p@virginia.edu [Department of Radiation Oncology, University of Virginia, Charlottesville, VA (United States)

    2012-12-01

    Purpose: To investigate pulmonary radiologic changes after lung stereotactic body radiotherapy (SBRT), to distinguish between mass-like fibrosis and tumor recurrence. Methods and Materials: Eighty consecutive patients treated with 3- to 5-fraction SBRT for early-stage peripheral non-small cell lung cancer with a minimum follow-up of 12 months were reviewed. The mean biologic equivalent dose received was 150 Gy (range, 78-180 Gy). Patients were followed with serial CT imaging every 3 months. The CT appearance of consolidation was defined as diffuse or mass-like. Progressive disease on CT was defined according to Response Evaluation Criteria in Solid Tumors 1.1. Positron emission tomography (PET) CT was used as an adjunct test. Tumor recurrence was defined as a standardized uptake value equal to or greater than the pretreatment value. Biopsy was used to further assess consolidation in select patients. Results: Median follow-up was 24 months (range, 12.0-36.0 months). Abnormal mass-like consolidation was identified in 44 patients (55%), whereas diffuse consolidation was identified in 12 patients (15%), at a median time from end of treatment of 10.3 months and 11.5 months, respectively. Tumor recurrence was found in 35 of 44 patients with mass-like consolidation using CT alone. Combined with PET, 10 of the 44 patients had tumor recurrence. Tumor size (hazard ratio 1.12, P=.05) and time to consolidation (hazard ratio 0.622, P=.03) were predictors for tumor recurrence. Three consecutive increases in volume and increasing volume at 12 months after treatment in mass-like consolidation were highly specific for tumor recurrence (100% and 80%, respectively). Patients with diffuse consolidation were more likely to develop grade ≥2 pneumonitis (odds ratio 26.5, P=.02) than those with mass-like consolidation (odds ratio 0.42, P=.07). Conclusion: Incorporating the kinetics of mass-like consolidation and PET to the current criteria for evaluating posttreatment response will increase the likelihood of correctly identifying patients with progressive disease after lung SBRT.

  10. Computer group

    International Nuclear Information System (INIS)

    Bauer, H.; Black, I.; Heusler, A.; Hoeptner, G.; Krafft, F.; Lang, R.; Moellenkamp, R.; Mueller, W.; Mueller, W.F.; Schati, C.; Schmidt, A.; Schwind, D.; Weber, G.

    1983-01-01

    The computer group has been reorganized to take charge of the general-purpose computers DEC10 and VAX and of the computer network (Dataswitch, DECnet, IBM connections to GSI and IPP, preparation for Datex-P). (orig.)

  11. Computer Engineers.

    Science.gov (United States)

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  12. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  13. Variance in predicted cup size by 2-dimensional vs 3-dimensional computerized tomography-based templating in primary total hip arthroplasty.

    Science.gov (United States)

    Osmani, Feroz A; Thakkar, Savyasachi; Ramme, Austin; Elbuluk, Ameer; Wojack, Paul; Vigdorchik, Jonathan M

    2017-12-01

    Preoperative total hip arthroplasty templating can be performed with radiographs using acetate prints, with digital viewing software, or with computed tomography (CT) images. Our hypothesis is that 3D templating is more precise and accurate in cup size prediction than 2D templating with acetate prints and digital templating software. Data collected from 45 patients undergoing robotic-assisted total hip arthroplasty compared cup sizes templated on acetate prints and OrthoView software to those from MAKOplasty software, which uses a CT scan. Kappa analysis determined the strength of agreement between each templating modality and the final size used. t tests compared mean cup-size variance from the final size for each templating technique. The intraclass correlation coefficient (ICC) determined the reliability of digital and acetate planning by comparing the predictions of the operating surgeon and a blinded adult reconstructive fellow. The Kappa values for CT-guided, digital, and acetate templating against the final size were 0.974, 0.233, and 0.262, respectively. Both digital and acetate templating significantly overpredicted cup size compared to CT-guided methods. CT-guided templating accurately predicted cup size, in contrast to the significant overpredictions of digital and acetate templating. CT-guided templating may also lead to better outcomes owing to bone stock preservation from the smaller and more accurate cup size it predicts compared with digital and acetate predictions.
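
    The agreement statistics reported above can be reproduced in outline with scikit-learn and scipy: Cohen's kappa between templated and implanted cup sizes, and a paired test on the absolute cup-size errors of two templating methods. The arrays are hypothetical per-patient cup sizes.

        import numpy as np
        from scipy.stats import ttest_rel
        from sklearn.metrics import cohen_kappa_score

        def templating_agreement(predicted_sizes, final_sizes):
            # Cup sizes treated as categorical labels, one per patient.
            return cohen_kappa_score(predicted_sizes, final_sizes)

        def compare_templating_errors(ct_pred, digital_pred, final_sizes):
            # Paired t-test on absolute errors of two templating methods against the final size.
            ct_err = np.abs(np.asarray(ct_pred) - np.asarray(final_sizes))
            dig_err = np.abs(np.asarray(digital_pred) - np.asarray(final_sizes))
            return ttest_rel(ct_err, dig_err)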

  14. Comparison of five segmentation tools for 18F-fluoro-deoxy-glucose-positron emission tomography-based target volume definition in head and neck cancer.

    Science.gov (United States)

    Schinagl, Dominic A X; Vogel, Wouter V; Hoffmann, Aswin L; van Dalen, Jorn A; Oyen, Wim J; Kaanders, Johannes H A M

    2007-11-15

    Target-volume delineation for radiation treatment to the head and neck area traditionally is based on physical examination, computed tomography (CT), and magnetic resonance imaging. Additional molecular imaging with (18)F-fluoro-deoxy-glucose (FDG)-positron emission tomography (PET) may improve definition of the gross tumor volume (GTV). In this study, five methods for tumor delineation on FDG-PET are compared with CT-based delineation. Seventy-eight patients with Stages II-IV squamous cell carcinoma of the head and neck area underwent coregistered CT and FDG-PET. The primary tumor was delineated on CT, and five PET-based GTVs were obtained: visual interpretation, applying an isocontour of a standardized uptake value of 2.5, using a fixed threshold of 40% and 50% of the maximum signal intensity, and applying an adaptive threshold based on the signal-to-background ratio. Absolute GTV volumes were compared, and overlap analyses were performed. The GTV method of applying an isocontour of a standardized uptake value of 2.5 failed to provide successful delineation in 45% of cases. For the other PET delineation methods, volume and shape of the GTV were influenced heavily by the choice of segmentation tool. On average, all threshold-based PET-GTVs were smaller than on CT. Nevertheless, PET frequently detected significant tumor extension outside the GTV delineated on CT (15-34% of PET volume). The choice of segmentation tool for target-volume definition of head and neck cancer based on FDG-PET images is not trivial because it influences both volume and shape of the resulting GTV. With adequate delineation, PET may add significantly to CT- and physical examination-based GTV definition.
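
    The threshold-based delineation methods compared in the study can be sketched directly from an SUV volume; the adaptive signal-to-background threshold below uses illustrative coefficients rather than the paper's calibrated ones, and visual delineation is omitted.

        import numpy as np

        def pet_gtv_masks(suv, background_suv):
            # Binary GTV masks from a 3D SUV volume for the threshold-based strategies.
            suv_max = suv.max()
            return {
                "SUV 2.5 isocontour": suv >= 2.5,
                "40% of max": suv >= 0.40 * suv_max,
                "50% of max": suv >= 0.50 * suv_max,
                # adaptive signal-to-background threshold (illustrative coefficients only)
                "adaptive SBR": suv >= 0.15 * suv_max + 0.50 * background_suv,
            }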

  15. Comparison of Five Segmentation Tools for 18F-Fluoro-Deoxy-Glucose-Positron Emission Tomography-Based Target Volume Definition in Head and Neck Cancer

    International Nuclear Information System (INIS)

    Schinagl, Dominic A.X.; Vogel, Wouter V.; Hoffmann, Aswin L.; Dalen, Jorn A. van; Oyen, Wim J.; Kaanders, Johannes H.A.M.

    2007-01-01

    Purpose: Target-volume delineation for radiation treatment to the head and neck area traditionally is based on physical examination, computed tomography (CT), and magnetic resonance imaging. Additional molecular imaging with ¹⁸F-fluoro-deoxy-glucose (FDG)-positron emission tomography (PET) may improve definition of the gross tumor volume (GTV). In this study, five methods for tumor delineation on FDG-PET are compared with CT-based delineation. Methods and Materials: Seventy-eight patients with Stages II-IV squamous cell carcinoma of the head and neck area underwent coregistered CT and FDG-PET. The primary tumor was delineated on CT, and five PET-based GTVs were obtained: visual interpretation, applying an isocontour of a standardized uptake value of 2.5, using a fixed threshold of 40% and 50% of the maximum signal intensity, and applying an adaptive threshold based on the signal-to-background ratio. Absolute GTV volumes were compared, and overlap analyses were performed. Results: The GTV method of applying an isocontour of a standardized uptake value of 2.5 failed to provide successful delineation in 45% of cases. For the other PET delineation methods, volume and shape of the GTV were influenced heavily by the choice of segmentation tool. On average, all threshold-based PET-GTVs were smaller than on CT. Nevertheless, PET frequently detected significant tumor extension outside the GTV delineated on CT (15-34% of PET volume). Conclusions: The choice of segmentation tool for target-volume definition of head and neck cancer based on FDG-PET images is not trivial because it influences both volume and shape of the resulting GTV. With adequate delineation, PET may add significantly to CT- and physical examination-based GTV definition

  16. Analog computing

    CERN Document Server

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  17. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design...... and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer’s point of view......, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture....

  18. Quantum Computing

    OpenAIRE

    Scarani, Valerio

    1998-01-01

    The aim of this thesis was to explain what quantum computing is. The information for the thesis was gathered from books, scientific publications, and news articles. The analysis of the information revealed that quantum computing can be broken down to three areas: theories behind quantum computing explaining the structure of a quantum computer, known quantum algorithms, and the actual physical realizations of a quantum computer. The thesis reveals that moving from classical memor...

  19. Computational Medicine

    DEFF Research Database (Denmark)

    Nygaard, Jens Vinge

    2017-01-01

    The Health Technology Program at Aarhus University applies computational biology to investigate the heterogeneity of tumours.

  20. Grid Computing

    Indian Academy of Sciences (India)

    A computing grid interconnects resources such as high performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers ...

  1. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Full Text Available Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  2. Quantum computers and quantum computations

    International Nuclear Information System (INIS)

    Valiev, Kamil' A

    2005-01-01

    This review outlines the principles of operation of quantum computers and their elements. The theory of ideal computers that do not interact with the environment and are immune to quantum decohering processes is presented. Decohering processes in quantum computers are investigated. The review considers methods for correcting quantum computing errors arising from the decoherence of the state of the quantum computer, as well as possible methods for the suppression of the decohering processes. A brief enumeration of proposed quantum computer realizations concludes the review. (reviews of topical problems)

  3. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  4. Pervasive Computing

    NARCIS (Netherlands)

    Silvis-Cividjian, N.

    This book provides a concise introduction to Pervasive Computing, otherwise known as Internet of Things (IoT) and Ubiquitous Computing (Ubicomp) which addresses the seamless integration of computing systems within everyday objects. By introducing the core topics and exploring assistive pervasive

  5. Computational vision

    CERN Document Server

    Wechsler, Harry

    1990-01-01

    The book is suitable for advanced courses in computer vision and image processing. In addition to providing an overall view of computational vision, it contains extensive material on topics that are not usually covered in computer vision texts (including parallel distributed processing and neural networks) and considers many real applications.

  6. Spatial Computation

    Science.gov (United States)

    2003-12-01

    Computation and today’s microprocessors with the approach to operating system architecture, and the controversy between microkernels and monolithic kernels... Both Spatial Computation and microkernels break away a relatively monolithic architecture into individual lightweight pieces, well specialized... for their particular functionality. Spatial Computation removes global signals and control, in the same way microkernels remove the global address

  7. Parallel computations

    CERN Document Server

    1982-01-01

    Parallel Computations focuses on parallel computation, with emphasis on algorithms used in a variety of numerical and physical applications and for many different types of parallel computers. Topics covered range from vectorization of fast Fourier transforms (FFTs) and of the incomplete Cholesky conjugate gradient (ICCG) algorithm on the Cray-1 to calculation of table lookups and piecewise functions. Single tridiagonal linear systems and vectorized computation of reactive flow are also discussed. Comprised of 13 chapters, this volume begins by classifying parallel computers and describing techn

  8. Human Computation

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  9. Quantum computation

    International Nuclear Information System (INIS)

    Deutsch, D.

    1992-01-01

    As computers become ever more complex, they inevitably become smaller. This leads to a need for components which are fabricated and operate on increasingly smaller size scales. Quantum theory is already taken into account in microelectronics design. This article explores how quantum theory will need to be incorporated into computers in future in order to give their components functionality. Computation tasks which depend on quantum effects will become possible. Physicists may have to reconsider their perspective on computation in the light of understanding developed in connection with universal quantum computers. (UK)

  10. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  11. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  12. 3D-CT vascular setting protocol using computer graphics for the evaluation of maxillofacial lesions

    Directory of Open Access Journals (Sweden)

    CAVALCANTI Marcelo de Gusmão Paraiso

    2001-01-01

    Full Text Available In this paper we present the aspect of a mandibular giant cell granuloma in spiral computed tomography-based three-dimensional (3D-CT reconstructed images using computer graphics, and demonstrate the importance of the vascular protocol in permitting better diagnosis, visualization and determination of the dimensions of the lesion. We analyzed 21 patients with maxillofacial lesions of neoplastic and proliferative origins. Two oral and maxillofacial radiologists analyzed the images. The usefulness of interactive 3D images reconstructed by means of computer graphics, especially using a vascular setting protocol for qualitative and quantitative analyses for the diagnosis, determination of the extent of lesions, treatment planning and follow-up, was demonstrated. The technique is an important adjunct to the evaluation of lesions in relation to axial CT slices and 3D-CT bone images.

  13. 3D-CT vascular setting protocol using computer graphics for the evaluation of maxillofacial lesions.

    Science.gov (United States)

    Cavalcanti, M G; Ruprecht, A; Vannier, M W

    2001-01-01

    In this paper we present the aspect of a mandibular giant cell granuloma in spiral computed tomography-based three-dimensional (3D-CT) reconstructed images using computer graphics, and demonstrate the importance of the vascular protocol in permitting better diagnosis, visualization and determination of the dimensions of the lesion. We analyzed 21 patients with maxillofacial lesions of neoplastic and proliferative origins. Two oral and maxillofacial radiologists analyzed the images. The usefulness of interactive 3D images reconstructed by means of computer graphics, especially using a vascular setting protocol for qualitative and quantitative analyses for the diagnosis, determination of the extent of lesions, treatment planning and follow-up, was demonstrated. The technique is an important adjunct to the evaluation of lesions in relation to axial CT slices and 3D-CT bone images.

  14. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  15. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease in which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  16. Computational biomechanics

    International Nuclear Information System (INIS)

    Ethier, C.R.

    2004-01-01

    Computational biomechanics is a fast-growing field that integrates modern biological techniques and computer modelling to solve problems of medical and biological interest. Modelling of blood flow in the large arteries is the best-known application of computational biomechanics, but there are many others. Described here is work being carried out in the laboratory on the modelling of blood flow in the coronary arteries and on the transport of viral particles in the eye. (author)

  17. Computational Composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.

    to understand the computer as a material like any other material we would use for design, like wood, aluminum, or plastic. That as soon as the computer forms a composition with other materials it becomes just as approachable and inspiring as other smart materials. I present a series of investigations of what...... Computational Composite, and Telltale). Through the investigations, I show how the computer can be understood as a material and how it partakes in a new strand of materials whose expressions come to be in context. I uncover some of their essential material properties and potential expressions. I develop a way...

  18. GPGPU COMPUTING

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2012-05-01

    Full Text Available Since the first idea of using GPUs for general purpose computing, things have evolved over the years and now there are several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware that they provide. A new emerging standard, OpenCL (Open Computing Language), tries to unify different GPU general computing API implementations and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL provides parallel computing using task-based and data-based parallelism. In this paper we will focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA. We will present the benefits of the CUDA programming model. We will also compare the two main approaches, CUDA and AMD APP (Stream), and the new framework, OpenCL, that tries to unify the GPGPU computing models.

  19. Quantum Computing

    Indian Academy of Sciences (India)

    Quantum Computing – Building Blocks of a Quantum Computer. C S Vijay and Vishal Gupta. General Article, Resonance – Journal of Science Education, Volume 5, Issue 9, September 2000, pp 69-81.

  20. Platform computing

    CERN Multimedia

    2002-01-01

    "Platform Computing releases first grid-enabled workload management solution for IBM eServer Intel and UNIX high performance computing clusters. This Out-of-the-box solution maximizes the performance and capability of applications on IBM HPC clusters" (1/2 page) .

  1. Quantum Computing

    Indian Academy of Sciences (India)

    In the first part of this article, we had looked at how quantum physics can be harnessed to make the building blocks of a quantum computer. In this concluding part, we look at algorithms which can exploit the power of this computational device, and some practical difficulties in building such a device. Quantum Algorithms.

  2. Quantum computing

    OpenAIRE

    Burba, M.; Lapitskaya, T.

    2017-01-01

    This article gives an elementary introduction to quantum computing. It is a draft for a book chapter of the "Handbook of Nature-Inspired and Innovative Computing", Eds. A. Zomaya, G.J. Milburn, J. Dongarra, D. Bader, R. Brent, M. Eshaghian-Wilner, F. Seredynski (Springer, Berlin Heidelberg New York, 2006).

  3. Computational Pathology

    Science.gov (United States)

    Louis, David N.; Feldman, Michael; Carter, Alexis B.; Dighe, Anand S.; Pfeifer, John D.; Bry, Lynn; Almeida, Jonas S.; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E.; Gilbertson, John R.; Sinard, John H.; Gerber, Georg K.; Galli, Stephen J.; Golden, Jeffrey A.; Becich, Michael J.

    2016-01-01

    Context We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. Objective To define the scope and needs of computational pathology. Data Sources A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. Conclusions The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and non-pathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology. PMID:26098131

  4. Cloud Computing

    DEFF Research Database (Denmark)

    Krogh, Simon

    2013-01-01

    with technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating...... the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production......), for instance, in establishing and maintaining trust between the involved parties (Sabherwal, 1999). So far, research in cloud computing has neglected this perspective and focused entirely on aspects relating to technology, economy, security and legal questions. While the core technologies of cloud computing (e...

  5. Computability theory

    CERN Document Server

    Weber, Rebecca

    2012-01-01

    What can we compute--even with unlimited resources? Is everything within reach? Or are computations necessarily drastically limited, not just in practice, but theoretically? These questions are at the heart of computability theory. The goal of this book is to give the reader a firm grounding in the fundamentals of computability theory and an overview of currently active areas of research, such as reverse mathematics and algorithmic randomness. Turing machines and partial recursive functions are explored in detail, and vital tools and concepts including coding, uniformity, and diagonalization are described explicitly. From there the material continues with universal machines, the halting problem, parametrization and the recursion theorem, and thence to computability for sets, enumerability, and Turing reduction and degrees. A few more advanced topics round out the book before the chapter on areas of research. The text is designed to be self-contained, with an entire chapter of preliminary material including re...

  6. Computational Streetscapes

    Directory of Open Access Journals (Sweden)

    Paul M. Torrens

    2016-09-01

    Full Text Available Streetscapes have presented a long-standing interest in many fields. Recently, there has been a resurgence of attention on streetscape issues, catalyzed in large part by computing. Because of computing, there is more understanding, vistas, data, and analysis of and on streetscape phenomena than ever before. This diversity of lenses trained on streetscapes permits us to address long-standing questions, such as how people use information while mobile, how interactions with people and things occur on streets, how we might safeguard crowds, how we can design services to assist pedestrians, and how we could better support special populations as they traverse cities. Amid each of these avenues of inquiry, computing is facilitating new ways of posing these questions, particularly by expanding the scope of what-if exploration that is possible. With assistance from computing, consideration of streetscapes now reaches across scales, from the neurological interactions that form among place cells in the brain up to informatics that afford real-time views of activity over whole urban spaces. For some streetscape phenomena, computing allows us to build realistic but synthetic facsimiles in computation, which can function as artificial laboratories for testing ideas. In this paper, I review the domain science for studying streetscapes from vantages in physics, urban studies, animation and the visual arts, psychology, biology, and behavioral geography. I also review the computational developments shaping streetscape science, with particular emphasis on modeling and simulation as informed by data acquisition and generation, data models, path-planning heuristics, artificial intelligence for navigation and way-finding, timing, synthetic vision, steering routines, kinematics, and geometrical treatment of collision detection and avoidance. I also discuss the implications that the advances in computing streetscapes might have on emerging developments in cyber

  7. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-01-01

    Full Text Available Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed over the last decade in the foreign scientific and educational literature, to substantiate its importance and practical utility, and to argue for its rightful place in Russian education. Methods. The research is based on the analysis of foreign studies of the phenomenon of computational thinking and of the ways it is formed in the process of education, and on comparing the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept of «computational thinking» is analyzed from the point of view of intuitive understanding and of scientific and applied aspects. It is shown how computational thinking has evolved along with the development of computer hardware and software. The practice-oriented interpretation of computational thinking that is dominant among educators is described, along with some ways of forming it. It is shown that computational thinking is a metasubject result of general education as well as one of its tools. From the point of view of the author, the purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning. The dynamics of the development of this concept are described; the process is connected with the evolution of computer and information technologies and with the growing number of tasks whose effective solution requires computational thinking. The author substantiates the claim that including «computational thinking» in the set of pedagogical concepts used in the national education system fills an existing gap. Practical significance. A new metasubject result of education associated with

  8. Improving limited-projection-angle fluorescence molecular tomography using a co-registered x-ray computed tomography scan.

    Science.gov (United States)

    Radrich, Karin; Ale, Angelique; Ermolayev, Vladimir; Ntziachristos, Vasilis

    2012-12-01

    We examine the improvement in imaging performance, such as axial resolution and signal localization, when employing limited-projection-angle fluorescence molecular tomography (FMT) together with x-ray computed tomography (XCT) measurements versus stand-alone FMT. For this purpose, we imaged living mice bearing a spontaneous lung tumor model with FMT and XCT under identical geometrical conditions, using fluorescent probes for cancer targeting. The XCT data were employed as structural prior information to guide the FMT reconstruction. Gold-standard images were obtained from fluorescence images of mouse cryoslices, which provided the ground truth for the fluorescence bio-distribution. Comparing stand-alone FMT images with images reconstructed from hybrid FMT and XCT data, we demonstrate marked improvements in image accuracy. This work relates to currently disseminated FMT systems using limited-projection scans and can be employed to enhance their performance.
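
    The abstract does not spell out how the XCT prior enters the inversion. One common formulation, given here only as a generic sketch under assumed notation rather than as the authors' actual implementation, is a penalized least-squares reconstruction in which a regularizer built from the XCT segmentation couples only voxels within the same anatomical segment:

        \hat{x} \;=\; \arg\min_{x \ge 0} \; \lVert W x - y \rVert_2^2 \;+\; \lambda \, \lVert L_{\mathrm{XCT}}\, x \rVert_2^2

    Here y denotes the boundary fluorescence measurements, W the photon-propagation (sensitivity) matrix, L_XCT the segmentation-derived regularization operator, and lambda the weight that trades data fidelity against the anatomical prior; all of these symbols are assumptions introduced for illustration.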

  9. Computer interfacing

    CERN Document Server

    Dixey, Graham

    1994-01-01

    This book explains how computers interact with the world around them and therefore how to make them a useful tool. Topics covered include descriptions of all the components that make up a computer, principles of data exchange, interaction with peripherals, serial communication, input devices, recording methods, computer-controlled motors, and printers.In an informative and straightforward manner, Graham Dixey describes how to turn what might seem an incomprehensible 'black box' PC into a powerful and enjoyable tool that can help you in all areas of your work and leisure. With plenty of handy

  10. Computational physics

    CERN Document Server

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.

  11. Computational physics

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1987-01-15

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October.

  12. Cloud Computing

    CERN Document Server

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for d

  13. Computational Viscoelasticity

    CERN Document Server

    Marques, Severino P C

    2012-01-01

    This text is a guide how to solve problems in which viscoelasticity is present using existing commercial computational codes. The book gives information on codes’ structure and use, data preparation  and output interpretation and verification. The first part of the book introduces the reader to the subject, and to provide the models, equations and notation to be used in the computational applications. The second part shows the most important Computational techniques: Finite elements formulation, Boundary elements formulation, and presents the solutions of Viscoelastic problems with Abaqus.

  14. Optical computing.

    Science.gov (United States)

    Stroke, G. W.

    1972-01-01

    Applications of the optical computer include an approach for increasing the sharpness of images obtained from the most powerful electron microscopes and fingerprint/credit card identification. The information-handling capability of the various optical computing processes is very great. Modern synthetic-aperture radars scan upward of 100,000 resolvable elements per second. Fields which have assumed major importance on the basis of optical computing principles are optical image deblurring, coherent side-looking synthetic-aperture radar, and correlative pattern recognition. Some examples of the most dramatic image deblurring results are shown.

  15. Computational physics

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October

  16. Phenomenological Computation?

    DEFF Research Database (Denmark)

    Brier, Søren

    2014-01-01

    Open peer commentary on the article “Info-computational Constructivism and Cognition” by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) Its basic concept of natural computing has neither been defined theoretically nor implemented practically. (2) It cannot...... encompass human concepts of subjective experience and intersubjective meaningful communication, which prevents it from being genuinely transdisciplinary. (3) Philosophically, it does not sufficiently accept the deep ontological differences between various paradigms such as von Foerster’s second-order...

  17. Essentials of cloud computing

    CERN Document Server

    Chandrasekaran, K

    2014-01-01

    Foreword; Preface; Computing Paradigms; Learning Objectives; Preamble; High-Performance Computing; Parallel Computing; Distributed Computing; Cluster Computing; Grid Computing; Cloud Computing; Biocomputing; Mobile Computing; Quantum Computing; Optical Computing; Nanocomputing; Network Computing; Summary; Review Points; Review Questions; Further Reading; Cloud Computing Fundamentals; Learning Objectives; Preamble; Motivation for Cloud Computing; The Need for Cloud Computing; Defining Cloud Computing; NIST Definition of Cloud Computing; Cloud Computing Is a Service; Cloud Computing Is a Platform; 5-4-3 Principles of Cloud Computing; Five Essential Charact

  18. Personal Computers.

    Science.gov (United States)

    Toong, Hoo-min D.; Gupta, Amar

    1982-01-01

    Describes the hardware, software, applications, and current proliferation of personal computers (microcomputers). Includes discussions of microprocessors, memory, output (including printers), application programs, the microcomputer industry, and major microcomputer manufacturers (Apple, Radio Shack, Commodore, and IBM). (JN)

  19. Computational Literacy

    DEFF Research Database (Denmark)

    Chongtay, Rocio; Robering, Klaus

    2016-01-01

    In recent years, there has been a growing interest in and recognition of the importance of Computational Literacy, a skill generally considered to be necessary for success in the 21st century. While much research has concentrated on requirements, tools, and teaching methodologies for the acquisit......In recent years, there has been a growing interest in and recognition of the importance of Computational Literacy, a skill generally considered to be necessary for success in the 21st century. While much research has concentrated on requirements, tools, and teaching methodologies...... for the acquisition of Computational Literacy at basic educational levels, focus on higher levels of education has been much less prominent. The present paper considers the case of courses for higher education programs within the Humanities. A model is proposed which conceives of Computational Literacy as a layered...

  20. Computing Religion

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Braxton, Donald M.; Upal, Afzal

    2012-01-01

    The computational approach has become an invaluable tool in many fields that are directly relevant to research in religious phenomena. Yet the use of computational tools is almost absent in the study of religion. Given that religion is a cluster of interrelated phenomena and that research...... concerning these phenomena should strive for multilevel analysis, this article argues that the computational approach offers new methodological and theoretical opportunities to the study of religion. We argue that the computational approach offers 1.) an intermediary step between any theoretical construct...... and its targeted empirical space and 2.) a new kind of data which allows the researcher to observe abstract constructs, estimate likely outcomes, and optimize empirical designs. Because sophisticated mulitilevel research is a collaborative project we also seek to introduce to scholars of religion some...

  1. Computational Controversy

    NARCIS (Netherlands)

    Timmermans, Benjamin; Kuhn, Tobias; Beelen, Kaspar; Aroyo, Lora

    2017-01-01

    Climate change, vaccination, abortion, Trump: Many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have

  2. Grid Computing

    Indian Academy of Sciences (India)

    IAS Admin

    emergence of supercomputers led to the use of computer simulation as an .... Scientific and engineering applications (e.g., TeraGrid secure gateway). Collaborative ... Encryption, privacy, protection from malicious software. Physical Layer.

  3. Computer tomographs

    International Nuclear Information System (INIS)

    Niedzwiedzki, M.

    1982-01-01

    Physical foundations of and developments in transmission and emission computed tomography are presented. On the basis of the available literature and private communications, a comparison is made of the various transmission tomographs. A new technique of emission computed tomography (ECT), previously unknown in Poland, is described. An evaluation of two ECT methods, namely positron and single-photon emission tomography, is made. (author)

  4. Computational sustainability

    CERN Document Server

    Kersting, Kristian; Morik, Katharina

    2016-01-01

    The book at hand gives an overview of the state of the art research in Computational Sustainability as well as case studies of different application scenarios. This covers topics such as renewable energy supply, energy storage and e-mobility, efficiency in data centers and networks, sustainable food and water supply, sustainable health, industrial production and quality, etc. The book describes computational methods and possible application scenarios.

  5. Computing farms

    International Nuclear Information System (INIS)

    Yeh, G.P.

    2000-01-01

    High-energy physics, nuclear physics, space sciences, and many other fields have large challenges in computing. In recent years, PCs have achieved performance comparable to the high-end UNIX workstations, at a small fraction of the price. We review the development and broad applications of commodity PCs as the solution to CPU needs, and look forward to the important and exciting future of large-scale PC computing

  6. Computational chemistry

    Science.gov (United States)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles, and for a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials, with and without the presence of gases. Computational chemistry has applications in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  7. Computational creativity

    Directory of Open Access Journals (Sweden)

    López de Mántaras Badia, Ramon

    2013-12-01

    Full Text Available New technologies, and in particular artificial intelligence, are drastically changing the nature of creative processes. Computers are playing very significant roles in creative activities such as music, architecture, fine arts, and science. Indeed, the computer is already a canvas, a brush, a musical instrument, and so on. However, we believe that we must aim at more ambitious relations between computers and creativity. Rather than just seeing the computer as a tool to help human creators, we could see it as a creative entity in its own right. This view has triggered a new subfield of Artificial Intelligence called Computational Creativity. This article addresses the question of the possibility of achieving computational creativity through some examples of computer programs capable of replicating some aspects of creative behavior in the fields of music and science.

  8. Computer Navigation-aided Resection of Sacral Chordomas

    Directory of Open Access Journals (Sweden)

    Yong-Kun Yang

    2016-01-01

    Full Text Available Background: Resection of sacral chordomas is challenging. The anatomy is complex, and there are often no bony landmarks to guide the resection. Achieving adequate surgical margins is, therefore, difficult, and the recurrence rate is high. Use of computer navigation may allow optimal preoperative planning and improve precision in tumor resection. The purpose of this study was to evaluate the safety and feasibility of computer navigation-aided resection of sacral chordomas. Methods: Between 2007 and 2013, a total of 26 patients with sacral chordoma who underwent computer navigation-aided surgery were included and followed for a minimum of 18 months. There were 21 primary cases and 5 recurrent cases, with a mean age of 55.8 years (range: 35-84 years). Tumors were located above the level of the S3 neural foramen in 23 patients and below the level of the S3 neural foramen in 3 patients. Three-dimensional images were reconstructed with a computed tomography-based navigation system combined with the magnetic resonance images using the navigation software. Tumors were resected via a posterior approach assisted by the computer navigation. Mean follow-up was 38.6 months (range: 18-84 months). Results: Mean operative time was 307 min. Mean intraoperative blood loss was 3065 ml. For computer navigation, the mean registration deviation during surgery was 1.7 mm. There were 18 wide resections, 4 marginal resections, and 4 intralesional resections. All patients were alive at the final follow-up, with 2 (7.7%) exhibiting tumor recurrence. The other 24 patients were tumor-free. The mean Musculoskeletal Tumor Society Score was 27.3 (range: 19-30). Conclusions: Computer-assisted navigation can be safely applied to the resection of sacral chordomas, allowing execution of preoperative plans and achievement of good oncological outcomes. Nevertheless, this needs to be accomplished by surgeons with adequate experience and skill.
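
    The report gives an intraoperative registration deviation of 1.7 mm but not the algorithm behind it. As a rough illustration of how a point-based rigid registration and its RMS error can be computed, the sketch below uses the standard Kabsch (SVD) fit of paired fiducials; the coordinates, noise level, and variable names are invented for the example and are not taken from the study.

      import numpy as np

      def rigid_register(src, dst):
          # Least-squares rigid transform (Kabsch): rotation R and translation t with dst ~ R @ src + t.
          src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
          H = (src - src_c).T @ (dst - dst_c)                           # 3x3 cross-covariance of centered points
          U, _, Vt = np.linalg.svd(H)
          D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])   # guard against a reflection
          R = Vt.T @ D @ U.T
          t = dst_c - R @ src_c
          return R, t

      # Invented fiducials: points picked on the preoperative CT vs. the same points tracked intraoperatively.
      rng = np.random.default_rng(0)
      ct_pts = np.array([[10.0, 0.0, 0.0], [0.0, 12.0, 0.0], [0.0, 0.0, 9.0], [5.0, 5.0, 5.0]])
      true_R, _ = np.linalg.qr(rng.normal(size=(3, 3)))                 # random orthogonal matrix
      if np.linalg.det(true_R) < 0:
          true_R[:, 0] *= -1.0                                          # make it a proper rotation
      or_pts = ct_pts @ true_R.T + np.array([2.0, -1.0, 3.0]) + rng.normal(scale=0.5, size=ct_pts.shape)

      R, t = rigid_register(ct_pts, or_pts)
      residuals = or_pts - (ct_pts @ R.T + t)
      rms = np.sqrt((residuals ** 2).sum(axis=1).mean())                # RMS fiducial registration error (e.g., mm)
      print(f"RMS registration error: {rms:.2f} mm")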

  9. Quantum computing

    International Nuclear Information System (INIS)

    Steane, Andrew

    1998-01-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from
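
    As a concrete, textbook illustration of the EPR-Bell correlations referred to above (standard material, not a result specific to this review): in the CHSH form, any local hidden-variable description of two spins measured along settings a, a' and b, b' must satisfy

        \lvert E(a,b) - E(a,b') + E(a',b) + E(a',b') \rvert \;\le\; 2,

    whereas measurements on a maximally entangled state can reach 2\sqrt{2} (the Tsirelson bound). The gap between these two values is one precise sense in which entanglement is a genuinely non-classical resource.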

  10. Quantum computing

    Energy Technology Data Exchange (ETDEWEB)

    Steane, Andrew [Department of Atomic and Laser Physics, University of Oxford, Clarendon Laboratory, Oxford (United Kingdom)

    1998-02-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from

  11. Multiparty Computations

    DEFF Research Database (Denmark)

    Dziembowski, Stefan

    here and discuss other problems caused by the adaptiveness. All protocols in the thesis are formally specified and the proofs of their security are given. [1]Ronald Cramer, Ivan Damgård, Stefan Dziembowski, Martin Hirt, and Tal Rabin. Efficient multiparty computations with dishonest minority......In this thesis we study a problem of doing Verifiable Secret Sharing (VSS) and Multiparty Computations in a model where private channels between the players and a broadcast channel is available. The adversary is active, adaptive and has an unbounded computing power. The thesis is based on two...... to a polynomial time black-box reduction, the complexity of adaptively secure VSS is the same as that of ordinary secret sharing (SS), where security is only required against a passive, static adversary. Previously, such a connection was only known for linear secret sharing and VSS schemes. We then show...

  12. Scientific computing

    CERN Document Server

    Trangenstein, John A

    2017-01-01

    This is the third of three volumes providing a comprehensive presentation of the fundamentals of scientific computing. This volume discusses topics that depend more on calculus than linear algebra, in order to prepare the reader for solving differential equations. This book and its companions show how to determine the quality of computational results, and how to measure the relative efficiency of competing methods. Readers learn how to determine the maximum attainable accuracy of algorithms, and how to select the best method for computing problems. This book also discusses programming in several languages, including C++, Fortran and MATLAB. There are 90 examples, 200 exercises, 36 algorithms, 40 interactive JavaScript programs, 91 references to software programs and 1 case study. Topics are introduced with goals, literature references and links to public software. There are descriptions of the current algorithms in GSLIB and MATLAB. This book could be used for a second course in numerical methods, for either ...

  13. Computational Psychiatry

    Science.gov (United States)

    Wang, Xiao-Jing; Krystal, John H.

    2014-01-01

    Psychiatric disorders such as autism and schizophrenia arise from abnormalities in brain systems that underlie cognitive, emotional and social functions. The brain is enormously complex, and its abundant feedback loops on multiple scales preclude intuitive explication of circuit functions. In close interplay with experiments, theory and computational modeling are essential for understanding how, precisely, neural circuits generate flexible behaviors and how their impairments give rise to psychiatric symptoms. This Perspective highlights recent progress in applying computational neuroscience to the study of mental disorders. We outline basic approaches, including identification of core deficits that cut across disease categories, biologically realistic modeling bridging cellular and synaptic mechanisms with behavior, and model-aided diagnosis. The need for new research strategies in psychiatry is urgent. Computational psychiatry potentially provides powerful tools for elucidating pathophysiology that may inform both diagnosis and treatment. To achieve this promise will require investment in cross-disciplinary training and research in this nascent field. PMID:25442941

  14. Computational artifacts

    DEFF Research Database (Denmark)

    Schmidt, Kjeld; Bansler, Jørgen P.

    2016-01-01

    The key concern of CSCW research is that of understanding computing technologies in the social context of their use, that is, as integral features of our practices and our lives, and to think of their design and implementation under that perspective. However, the question of the nature...... of that which is actually integrated in our practices is often discussed in confusing ways, if at all. The article aims to try to clarify the issue and in doing so revisits and reconsiders the notion of ‘computational artifact’....

  15. Computer security

    CERN Document Server

    Gollmann, Dieter

    2011-01-01

    A completely up-to-date resource on computer security Assuming no previous experience in the field of computer security, this must-have book walks you through the many essential aspects of this vast topic, from the newest advances in software and technology to the most recent information on Web applications security. This new edition includes sections on Windows NT, CORBA, and Java and discusses cross-site scripting and JavaScript hacking as well as SQL injection. Serving as a helpful introduction, this self-study guide is a wonderful starting point for examining the variety of competing sec

  16. Cloud Computing

    CERN Document Server

    Antonopoulos, Nick

    2010-01-01

    Cloud computing has recently emerged as a subject of substantial industrial and academic interest, though its meaning and scope is hotly debated. For some researchers, clouds are a natural evolution towards the full commercialisation of grid systems, while others dismiss the term as a mere re-branding of existing pay-per-use technologies. From either perspective, 'cloud' is now the label of choice for accountable pay-per-use access to third party applications and computational resources on a massive scale. Clouds support patterns of less predictable resource use for applications and services a

  17. Computational Logistics

    DEFF Research Database (Denmark)

    Pacino, Dario; Voss, Stefan; Jensen, Rune Møller

    2013-01-01

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized in to...... in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management.......This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized...

  18. Computational Logistics

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized in to...... in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management.......This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized...

  19. Computational engineering

    CERN Document Server

    2014-01-01

    The book presents state-of-the-art works in computational engineering. Focus is on mathematical modeling, numerical simulation, experimental validation and visualization in engineering sciences. In particular, the following topics are presented: constitutive models and their implementation into finite element codes, numerical models in nonlinear elasto-dynamics including seismic excitations, multiphase models in structural engineering and multiscale models of materials systems, sensitivity and reliability analysis of engineering structures, the application of scientific computing in urban water management and hydraulic engineering, and the application of genetic algorithms for the registration of laser scanner point clouds.

  20. Computer busses

    CERN Document Server

    Buchanan, William

    2000-01-01

    As more and more equipment is interface or'bus' driven, either by the use of controllers or directly from PCs, the question of which bus to use is becoming increasingly important both in industry and in the office. 'Computer Busses' has been designed to help choose the best type of bus for the particular application.There are several books which cover individual busses, but none which provide a complete guide to computer busses. The author provides a basic theory of busses and draws examples and applications from real bus case studies. Busses are analysed using from a top-down approach, helpin

  1. Reconfigurable Computing

    CERN Document Server

    Cardoso, Joao MP

    2011-01-01

    As the complexity of modern embedded systems increases, it becomes less practical to design monolithic processing platforms. As a result, reconfigurable computing is being adopted widely for more flexible design. Reconfigurable Computers offer the spatial parallelism and fine-grained customizability of application-specific circuits with the postfabrication programmability of software. To make the most of this unique combination of performance and flexibility, designers need to be aware of both hardware and software issues. FPGA users must think not only about the gates needed to perform a comp

  2. 18F-FDG PET-CT imaging versus bone marrow biopsy in pediatric Hodgkin's lymphoma: a quantitative assessment of marrow uptake and novel insights into clinical implications of marrow involvement

    International Nuclear Information System (INIS)

    Hassan, Aamna; Siddique, Maimoona; Bashir, Humayun; Riaz, Saima; Nawaz, M.K.; Wali, Rabia; Mahreen, Asma

    2017-01-01

    To evaluate whether positron emission tomography/computed tomography using fluorine-18 fluoro-deoxyglucose (18F-FDG PET-CT) predicts bone marrow involvement (BMI) in pediatric Hodgkin's lymphoma (pHL) with sufficient accuracy to supplant routine staging bone marrow biopsy (BMB), and to assess the clinical importance of marrow disease by comparing the prognosis of stage IV HL with BMI versus that without BMI. Data were retrospectively analyzed for all cases of pHL between July 2010 and June 2015 referred for staging 18F-FDG PET-CT scan and BMB. The reference standard was BMB. Stage IV patients were divided into three groups to compare their progression-free and overall survival: PET+ BMB-, PET+ BMB+, and PET- BMB-. Of the 784 patients, 83.3% were male and 16.7% female, with ages ranging from 2 to 18 years (mean 10.3 years). Among the total cases, 104 (13.3%) had BMI; of these, 100 were detected by PET imaging and 58 by BMB. BMB and 18F-FDG PET/CT scans were concordant for BMI detection in 728 patients (93%): positive concordance in 54 and negative in 674. Of the 56 discordant cases, four had false-negative PET scans and were upstaged by BMB, 46 with focal uptake were PET/CT-positive and BMB-negative (the biopsy not having been obtained from active sites), and six with diffuse uptake were false-positive on PET due to paraneoplastic marrow activation. The sensitivity, specificity, PPV, and NPV of PET for identifying BMI were 93.6%, 94%, 53%, and 99.4%, respectively. On quantitative assessment, the mean iBM-SUVmax of the bilateral iliac crests was significantly higher in those with BMI than in those without (p < 0.05). 18F-FDG PET-CT imaging is more sensitive than BMB for BMI detection in pHL staging. BMB should be limited to those with normal marrow uptake in the presence of poor risk factors, or to those with diffusely increased uptake, to exclude marrow involvement against a background of reactive marrow. (orig.)
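
    For readers who want to see how such figures are obtained, the four metrics follow directly from a 2x2 cross-tabulation of PET calls against the reference standard. The sketch below shows the arithmetic with invented counts chosen only for illustration; it does not reproduce the study's actual table.

      def diagnostic_metrics(tp, fp, fn, tn):
          # Standard 2x2-table metrics for a binary test against a reference standard.
          sensitivity = tp / (tp + fn)   # positives correctly called among all truly diseased
          specificity = tn / (tn + fp)   # negatives correctly called among all truly non-diseased
          ppv = tp / (tp + fp)           # probability of disease given a positive test
          npv = tn / (tn + fn)           # probability of no disease given a negative test
          return sensitivity, specificity, ppv, npv

      # Invented counts for illustration only; they are not the study's cross-tabulation.
      sens, spec, ppv, npv = diagnostic_metrics(tp=54, fp=46, fn=4, tn=680)
      print(f"sensitivity={sens:.1%}  specificity={spec:.1%}  PPV={ppv:.1%}  NPV={npv:.1%}")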

  3. {sup 18}F-FDG PET-CT imaging versus bone marrow biopsy in pediatric Hodgkin's lymphoma: a quantitative assessment of marrow uptake and novel insights into clinical implications of marrow involvement

    Energy Technology Data Exchange (ETDEWEB)

    Hassan, Aamna; Siddique, Maimoona; Bashir, Humayun; Riaz, Saima; Nawaz, M.K. [Shaukat Khanum Memorial Cancer Hospital and Research Centre, Department of Nuclear Medicine, Lahore (Pakistan); Wali, Rabia; Mahreen, Asma [Shaukat Khanum Memorial Cancer Hospital and Research Centre, Paediatric Oncology, Lahore (Pakistan)

    2017-07-15

    To evaluate whether positron emission tomography/computed tomography using fluorine-18 fluoro-deoxyglucose ({sup 18}F-FDG PET-CT) predicts bone marrow involvement (BMI) in pediatric Hodgkin's lymphoma (pHL) with sufficient accuracy to supplant routine staging bone marrow biopsy (BMB), and to assess the clinical importance of marrow disease by comparing the prognosis of stage IV HL with BMI versus that without BMI. Data were retrospectively analyzed for all cases of pHL between July 2010 and June 2015 referred for staging {sup 18}F-FDG PET-CT scan and BMB. The reference standard was BMB. Stage IV patients were divided into three groups to compare their progression-free and overall survival: PET+ BMB-, PET+ BMB+, and PET- BMB-. Of the 784 patients, 83.3% were male and 16.7% female, with ages ranging from 2 to 18 years (mean 10.3 years). Among the total cases, 104 (13.3%) had BMI; of these, 100 were detected by PET imaging and 58 by BMB. BMB and {sup 18}F-FDG PET/CT scans were concordant for BMI detection in 728 patients (93%): positive concordance in 54 and negative in 674. Of the 56 discordant cases, four had false-negative PET scans and were upstaged by BMB, 46 with focal uptake were PET/CT-positive and BMB-negative (the biopsy not having been obtained from active sites), and six with diffuse uptake were false-positive on PET due to paraneoplastic marrow activation. The sensitivity, specificity, PPV, and NPV of PET for identifying BMI were 93.6%, 94%, 53%, and 99.4%, respectively. On quantitative assessment, the mean iBM-SUV{sub max} of the bilateral iliac crests was significantly higher in those with BMI than in those without (p < 0.05). {sup 18}F-FDG PET-CT imaging is more sensitive than BMB for BMI detection in pHL staging. BMB should be limited to those with normal marrow uptake in the presence of poor risk factors, or to those with diffusely increased uptake, to exclude marrow involvement against a background of reactive marrow. (orig.)

  4. Riemannian computing in computer vision

    CERN Document Server

    Srivastava, Anuj

    2016-01-01

    This book presents a comprehensive treatise on Riemannian geometric computations and related statistical inferences in several computer vision problems. This edited volume includes chapter contributions from leading figures in the field of computer vision who are applying Riemannian geometric approaches in problems such as face recognition, activity recognition, object detection, biomedical image analysis, and structure-from-motion. Some of the mathematical entities that necessitate a geometric analysis include rotation matrices (e.g. in modeling camera motion), stick figures (e.g. for activity recognition), subspace comparisons (e.g. in face recognition), symmetric positive-definite matrices (e.g. in diffusion tensor imaging), and function-spaces (e.g. in studying shapes of closed contours).   ·         Illustrates Riemannian computing theory on applications in computer vision, machine learning, and robotics ·         Emphasis on algorithmic advances that will allow re-application in other...

  5. Statistical Computing

    Indian Academy of Sciences (India)

    inference and finite population sampling. Sudhakar Kunte. Elements of statistical computing are discussed in this series. ... which captain gets an option to decide whether to field first or bat first ... may of course not be fair, in the sense that the team which wins ... describe two methods of drawing a random number between 0.

  6. Computational biology

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Computation via biological devices has been the subject of close scrutiny since von Neumann’s early work some 60 years ago. In spite of the many relevant works in this field, the notion of programming biological devices seems to be, at best, ill-defined. While many devices are claimed or proved t...

  7. Computing News

    CERN Multimedia

    McCubbin, N

    2001-01-01

    We are still five years from the first LHC data, so we have plenty of time to get the computing into shape, don't we? Well, yes and no: there is time, but there's an awful lot to do! The recently-completed CERN Review of LHC Computing gives the flavour of the LHC computing challenge. The hardware scale for each of the LHC experiments is millions of 'SpecInt95' (SI95) units of cpu power and tens of PetaBytes of data storage. PCs today are about 20-30SI95, and expected to be about 100 SI95 by 2005, so it's a lot of PCs. This hardware will be distributed across several 'Regional Centres' of various sizes, connected by high-speed networks. How to realise this in an orderly and timely fashion is now being discussed in earnest by CERN, Funding Agencies, and the LHC experiments. Mixed in with this is, of course, the GRID concept...but that's a topic for another day! Of course hardware, networks and the GRID constitute just one part of the computing. Most of the ATLAS effort is spent on software development. What we ...

  8. Quantum Computation

    Indian Academy of Sciences (India)

    Quantum Computation – Particle and Wave Aspects of Algorithms. Apoorva Patel. Resonance – Journal of Science Education, General Article, Volume 16, Issue 9, September 2011, pp. 821-835.

  9. Cloud computing.

    Science.gov (United States)

    Wink, Diane M

    2012-01-01

    In this bimonthly series, the author examines how nurse educators can use Internet and Web-based technologies such as search, communication, and collaborative writing tools; social networking and social bookmarking sites; virtual worlds; and Web-based teaching and learning programs. This article describes how cloud computing can be used in nursing education.

  10. Computer Recreations.

    Science.gov (United States)

    Dewdney, A. K.

    1988-01-01

    Describes the creation of the computer program "BOUNCE," designed to simulate a weighted piston coming into equilibrium with a cloud of bouncing balls. The model follows the ideal gas law. Utilizes the critical event technique to create the model. Discusses another program, "BOOM," which simulates a chain reaction. (CW)

  11. [Grid computing

    CERN Multimedia

    Wolinsky, H

    2003-01-01

    "Turn on a water spigot, and it's like tapping a bottomless barrel of water. Ditto for electricity: Flip the switch, and the supply is endless. But computing is another matter. Even with the Internet revolution enabling us to connect in new ways, we are still limited to self-contained systems running locally stored software, limited by corporate, institutional and geographic boundaries" (1 page).

  12. Computational Finance

    DEFF Research Database (Denmark)

    Rasmussen, Lykke

    One of the major challenges in todays post-crisis finance environment is calculating the sensitivities of complex products for hedging and risk management. Historically, these derivatives have been determined using bump-and-revalue, but due to the increasing magnitude of these computations does...

  13. Optical Computing

    Indian Academy of Sciences (India)

    Optical computing technology is, in general, developing in two directions. One approach is ... current support in many places, with private companies as well as governments in several countries encouraging such research work. For example, much ... which enables more information to be carried and data to be processed.

  14. Computable Frames in Computable Banach Spaces

    Directory of Open Access Journals (Sweden)

    S.K. Kaushik

    2016-06-01

    Full Text Available We develop some parts of the frame theory in Banach spaces from the point of view of Computable Analysis. We define computable M-basis and use it to construct a computable Banach space of scalar valued sequences. Computable Xd frames and computable Banach frames are also defined and computable versions of sufficient conditions for their existence are obtained.
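
    For orientation, the classical (non-effective) notion that the paper renders computable can be stated as follows; this is the standard textbook definition rather than a quotation from the article. Given a Banach space X and an associated Banach sequence space X_d, a sequence (g_i) in the dual X* together with an operator S : X_d \to X is a Banach frame for X with respect to X_d if

        \text{(i)}\;\; (g_i(x))_i \in X_d \ \text{for every } x \in X;
        \text{(ii)}\;\; A\,\lVert x\rVert_X \;\le\; \lVert (g_i(x))_i \rVert_{X_d} \;\le\; B\,\lVert x\rVert_X \ \text{for constants } 0 < A \le B;
        \text{(iii)}\;\; S\big((g_i(x))_i\big) = x \ \text{for every } x \in X.

    The computable versions studied in the paper, as the abstract indicates, additionally impose effectivity requirements on these objects in the sense of Computable Analysis.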

  15. Algebraic computing

    International Nuclear Information System (INIS)

    MacCallum, M.A.H.

    1990-01-01

    The implementation of a new computer algebra system is time consuming: designers of general purpose algebra systems usually say it takes about 50 man-years to create a mature and fully functional system. Hence the range of available systems and their capabilities changes little between one general relativity meeting and the next, despite which there have been significant changes in the period since the last report. The introductory remarks aim to give a brief survey of the capabilities of the principal available systems and to highlight one or two trends. A reference to the most recent full survey of computer algebra in relativity is given, together with brief descriptions of Maple, REDUCE, SHEEP and other applications. (author)

  16. Computational Controversy

    OpenAIRE

    Timmermans, Benjamin; Kuhn, Tobias; Beelen, Kaspar; Aroyo, Lora

    2017-01-01

    Climate change, vaccination, abortion, Trump: Many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have appeared, using new data sources such as Wikipedia, which help us now better understand these phenomena. However, compared to what social sciences have discovered about such debates, the existing computati...

  17. Computed tomography

    International Nuclear Information System (INIS)

    Andre, M.; Resnick, D.

    1988-01-01

    Computed tomography (CT) has matured into a reliable and prominent tool for study of the muscoloskeletal system. When it was introduced in 1973, it was unique in many ways and posed a challenge to interpretation. It is in these unique features, however, that its advantages lie in comparison with conventional techniques. These advantages will be described in a spectrum of important applications in orthopedics and rheumatology

  18. Computed radiography

    International Nuclear Information System (INIS)

    Pupchek, G.

    2004-01-01

    Computed radiography (CR) is an image acquisition process that is used to create digital, 2-dimensional radiographs. CR employs a photostimulable phosphor-based imaging plate, replacing the standard x-ray film and intensifying screen combination. Conventional radiographic exposure equipment is used with no modification required to the existing system. CR can transform an analog x-ray department into a digital one and eliminates the need for chemicals, water, darkrooms and film processor headaches. (author)

  19. Computational universes

    International Nuclear Information System (INIS)

    Svozil, Karl

    2005-01-01

    Suspicions that the world might be some sort of a machine or algorithm existing 'in the mind' of some symbolic number cruncher have lingered from antiquity. Although popular at times, the most radical forms of this idea never reached mainstream. Modern developments in physics and computer science have lent support to the thesis, but empirical evidence is needed before it can begin to replace our contemporary world view

  20. Customizable computing

    CERN Document Server

    Chen, Yu-Ting; Gill, Michael; Reinman, Glenn; Xiao, Bingjun

    2015-01-01

    Since the end of Dennard scaling in the early 2000s, improving the energy efficiency of computation has been the main concern of the research community and industry. The large energy efficiency gap between general-purpose processors and application-specific integrated circuits (ASICs) motivates the exploration of customizable architectures, where one can adapt the architecture to the workload. In this Synthesis lecture, we present an overview and introduction of the recent developments on energy-efficient customizable architectures, including customizable cores and accelerators, on-chip memory

  1. Computed tomography

    International Nuclear Information System (INIS)

    Wells, P.; Davis, J.; Morgan, M.

    1994-01-01

    X-ray or gamma-ray transmission computed tomography (CT) is a powerful non-destructive evaluation (NDE) technique that produces two-dimensional cross-sectional images of an object without the need to physically section it. CT is also known by the acronym CAT, for computerised axial tomography. This review article presents a brief historical perspective on CT, its current status and the underlying physics. The mathematical fundamentals of computed tomography are developed for the simplest transmission CT modality. A description of CT scanner instrumentation is provided with an emphasis on radiation sources and systems. Examples of CT images are shown indicating the range of materials that can be scanned and the spatial and contrast resolutions that may be achieved. Attention is also given to the occurrence, interpretation and minimisation of various image artefacts that may arise. A final brief section is devoted to the principles and potential of a range of more recently developed tomographic modalities including diffraction CT, positron emission CT and seismic tomography. 57 refs., 2 tabs., 14 figs
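
    The mathematical fundamentals referred to here rest on two standard relations (stated generically, not quoted from the review): exponential attenuation of a monoenergetic beam along each ray, and the resulting identification of the log-transformed measurement with a line integral of the attenuation coefficient,

        I = I_0 \exp\!\Big(-\int_L \mu(x)\,\mathrm{d}s\Big)
        \quad\Longrightarrow\quad
        p_L = -\ln\frac{I}{I_0} = \int_L \mu(x)\,\mathrm{d}s .

    The collection of such line integrals over all rays is the Radon transform of \mu, and the cross-sectional image is obtained by inverting it, most commonly via filtered back projection.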

  2. Data on analysis of coronary atherosclerosis on computed tomography and 18F-sodium fluoride positron emission tomography

    Directory of Open Access Journals (Sweden)

    Toshiro Kitagawa

    2017-08-01

    Full Text Available This article contains the data showing illustrative examples of plaque classification on coronary computed tomography angiography (CCTA) and measurement of 18F-sodium fluoride (18F-NaF) uptake in coronary atherosclerotic lesions on positron emission tomography (PET). We divided the lesions into one of three plaque types on CCTA (calcified plaque, non-calcified plaque, partially calcified plaque). Focal 18F-NaF uptake of each lesion was quantified using the maximum tissue-to-background ratio. This article also provides a representative case with a non-calcified coronary plaque detected on CCTA and identified on 18F-NaF PET/non-contrast computed tomography based on the location of a vessel branch as a landmark. These data complement those reported by Kitagawa et al. (2017) [1].
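
    The snippet does not define the maximum tissue-to-background ratio. A commonly used form in coronary 18F-NaF imaging, offered here as a plausible reading rather than the authors' exact definition, normalizes the lesion's peak uptake by blood-pool activity:

        \mathrm{TBR}_{\max} \;=\; \frac{\mathrm{SUV}_{\max}(\text{lesion})}{\mathrm{SUV}_{\mathrm{mean}}(\text{blood pool})} .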

  3. Computing Services and Assured Computing

    Science.gov (United States)

    2006-05-01

    fighters’ ability to execute the mission.” We run IT systems that provide medical care, pay the warfighters, and manage maintenance; ...users • 1,400 applications • 18 facilities • 180 software vendors • 18,000+ copies of executive software products • virtually every type of mainframe and...

  4. Adrenocorticotrophin-dependent hypercortisolism: Imaging versus laboratory diagnosis

    Directory of Open Access Journals (Sweden)

    Tančić-Gajić Milina

    2012-01-01

    Full Text Available Introduction. Cushing’s syndrome results from inappropriate exposure to excessive glucocorticoids. Untreated, it has significant morbidity and mortality. Case Outline. A 38-year-old woman with a typical appearance of Cushing’s syndrome was admitted for further evaluation of hypercortisolism. The serum cortisol level was elevated without diurnal rhythm, and without adequate suppression of cortisol after the 1 mg dexamethasone suppression test. The 24-hour urinary free cortisol level was elevated. Differential diagnostic testing indicated an adrenocorticotrophin (ACTH)-dependent lesion of pituitary origin. Pituitary abnormalities were not observed during repeated MRI scanning. Inferior petrosal sinus sampling (IPSS) was performed: (1) the baseline ratio of ACTH in the inferior petrosal sinus to the periphery was <2; (2) the corticotropin-releasing hormone (CRH)-stimulated ratio of ACTH in the inferior petrosal sinus to the periphery was <3; (3) the baseline intersinus ratio of ACTH was <1.4; (4) the increase in inferior petrosal sinus and peripheral ACTH was more than 50 percent above the basal level after CRH; (5) the baseline ratio of ACTH in the internal jugular vein to the periphery was >1.7. Transsphenoidal exploration and removal of the pituitary tumor was performed, inducing iatrogenic hypopituitarism. The postoperative morning serum cortisol level was less than 50 nmol/l on adequate replacement therapy with hydrocortisone, levothyroxine and estro-progestagen. Conclusion. No single test provides absolute distinction, but the combined results of several tests generally provide a correct diagnosis of Cushing’s syndrome.

  5. Evaluation of Marfan syndrome: MR imaging versus CT

    International Nuclear Information System (INIS)

    Soulen, R.L.; Fishman, E.K.; Pyeritz, R.E.; Gott, V.L.; Zerhouni, E.A.

    1986-01-01

    Twenty-five patients with Marfan syndrome underwent both CT and MR imaging. MR images were interpreted in a blinded fashion and then compared with CT scans. MR imaging was found to be equivalent to CT in the detection of aortic, dural, and hip abnormalities in patients who had not been operated on. MR imaging was superior to CT in the evaluation of postoperative patients because the artifact produced by Björk-Shiley or St. Jude valves precludes adequate evaluation of the aortic root on CT, while producing only a small inferior field distortion (a ''pseudo-ventricular septal defect'') on MR imaging. The absence of radiation exposure is another major advantage of MR imaging in this relatively young population requiring serial studies. The authors conclude that MR imaging is the modality of choice for the evaluation and follow-up of patients with Marfan syndrome and offers an appropriate means of screening their kindred.

  6. Black Males and Television: New Images Versus Old Stereotypes.

    Science.gov (United States)

    Douglas, Robert L.

    1987-01-01

    This paper focuses on historic portrayal of black males in service and support roles in the media and their relation to social reality. Both television and films use glamorous sophisticated trappings seemingly to enhance the image of black males, but the personalities of the characters they play remain stereotypic. (VM)

  7. Computational neuroscience

    CERN Document Server

    Blackwell, Kim L

    2014-01-01

    Progress in Molecular Biology and Translational Science provides a forum for discussion of new discoveries, approaches, and ideas in molecular biology. It contains contributions from leaders in their fields and abundant references. This volume brings together different aspects of, and approaches to, molecular and multi-scale modeling, with applications to a diverse range of neurological diseases. Mathematical and computational modeling offers a powerful approach for examining the interaction between molecular pathways and ionic channels in producing neuron electrical activity. It is well accepted that non-linear interactions among diverse ionic channels can produce unexpected neuron behavior and hinder a deep understanding of how ion channel mutations bring about abnormal behavior and disease. Interactions with the diverse signaling pathways activated by G protein-coupled receptors or calcium influx add an additional level of complexity. Modeling is an approach to integrate myriad data sources into a cohesiv...

  8. Social Computing

    CERN Multimedia

    CERN. Geneva

    2011-01-01

    The past decade has witnessed a momentous transformation in the way people interact with each other. Content is now co-produced, shared, classified, and rated by millions of people, while attention has become the ephemeral and valuable resource that everyone seeks to acquire. This talk will describe how social attention determines the production and consumption of content within both the scientific community and social media, how its dynamics can be used to predict the future and the role that social media plays in setting the public agenda. About the speaker Bernardo Huberman is a Senior HP Fellow and Director of the Social Computing Lab at Hewlett Packard Laboratories. He received his Ph.D. in Physics from the University of Pennsylvania, and is currently a Consulting Professor in the Department of Applied Physics at Stanford University. He originally worked in condensed matter physics, ranging from superionic conductors to two-dimensional superfluids, and made contributions to the theory of critical p...

  9. computer networks

    Directory of Open Access Journals (Sweden)

    N. U. Ahmed

    2002-01-01

    Full Text Available In this paper, we construct a new dynamic model for the Token Bucket (TB) algorithm used in computer networks and use a systems approach for its analysis. This model is then augmented by adding a dynamic model for a multiplexor at an access node where the TB exercises a policing function. In the model, traffic policing, multiplexing and network utilization are formally defined. Based on the model, we study such issues as quality of service (QoS), traffic sizing and network dimensioning. We also propose an algorithm using feedback control to improve QoS and network utilization. Applying MPEG video traces as the input traffic to the model, we verify the usefulness and effectiveness of our model.
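
    The paper's own dynamic-systems model is not reproduced here, but the token bucket policer it builds on can be illustrated with a minimal Python sketch (names and parameters are illustrative): tokens accumulate at a fixed rate up to the bucket depth, and a packet conforms only if enough tokens are available when it arrives.

        # Minimal token-bucket policer sketch (illustrative; not the paper's dynamic model).
        # Tokens accrue at `rate` per second up to `burst`; a packet conforms if enough
        # tokens are available when it arrives, otherwise it is policed (dropped/marked).

        class TokenBucket:
            def __init__(self, rate, burst):
                self.rate = rate          # token refill rate (tokens/second)
                self.burst = burst        # bucket depth (maximum tokens)
                self.tokens = burst       # start with a full bucket
                self.last = 0.0           # time of last update (seconds)

            def conforms(self, arrival_time, packet_size):
                # Refill tokens for the elapsed time, capped at the bucket depth.
                self.tokens = min(self.burst,
                                  self.tokens + (arrival_time - self.last) * self.rate)
                self.last = arrival_time
                if packet_size <= self.tokens:
                    self.tokens -= packet_size
                    return True           # packet conforms and consumes tokens
                return False              # packet is non-conforming

        tb = TokenBucket(rate=1000.0, burst=1500.0)   # 1000 tokens/s, 1500-token bucket
        for t, size in [(0.0, 1000), (0.1, 1000), (1.5, 1000)]:
            print(t, size, tb.conforms(t, size))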

  10. Computer Tree

    Directory of Open Access Journals (Sweden)

    Onur AĞAOĞLU

    2014-12-01

    Full Text Available It is crucial that gifted and talented students be supported by different educational methods suited to their interests and skills. The science and arts centres (gifted centres) provide the Supportive Education Program for these students with an interdisciplinary perspective. In line with the program, an ICT lesson entitled “Computer Tree” serves to identify learner readiness levels and to define the basic conceptual framework. A language teacher also contributes to the process, since the lesson caters for the creative function of basic linguistic skills. The teaching technique is applied to students aged 9-11. The lesson introduces an evaluation process covering the basic knowledge, skills, and interests of the target group. Furthermore, it includes an observation process by way of peer assessment. The lesson is considered a good sample of planning for any subject, given the unpredicted convergence of visual and technical abilities with linguistic abilities.

  11. Computed tomography

    International Nuclear Information System (INIS)

    Boyd, D.P.

    1989-01-01

    This paper reports on computed tomographic (CT) scanning, which has improved computer-assisted imaging modalities for radiologic diagnosis. The advantage of this modality is its ability to image thin cross-sectional planes of the body, thus uncovering density information in three dimensions without tissue superposition problems. Because this enables vastly superior imaging of soft tissues in the brain and body, CT scanning was immediately successful and continues to grow in importance as improvements are made in speed, resolution, and cost efficiency. CT scanners are used for general purposes, and the more advanced machines are generally preferred in large hospitals, where the volume and variety of usage justify the cost. For imaging in the abdomen, a scanner with rapid speed is preferred because peristalsis, involuntary motion of the diaphragm, and even cardiac motion are present and can significantly degrade image quality. When contrast media are used in imaging to demonstrate scanner, immediate review of images, and multiformat hardcopy production. A second console is reserved for the radiologist to read images and perform the several types of image analysis that are available. Since CT images contain quantitative information in terms of density values and contours of organs, quantitation of volumes, areas, and masses is possible. This is accomplished with region-of-interest methods, which involve the electronic outlining of the selected region on the television display monitor with a trackball-controlled cursor. In addition, various image-processing options, such as edge enhancement (for viewing fine details of edges) or smoothing filters (for enhancing the detectability of low-contrast lesions), are useful tools.
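
    The region-of-interest quantitation described above can be sketched as follows; this is a minimal illustration assuming a 2-D slice of CT numbers (Hounsfield units), an operator-drawn boolean mask, and made-up pixel spacing, not the console software of any particular scanner.

        import numpy as np

        # Illustrative region-of-interest (ROI) quantitation on one CT slice.
        # `slice_hu` holds CT numbers in Hounsfield units; `roi_mask` marks the
        # operator-outlined region (True inside the drawn contour).
        pixel_mm = 0.7          # assumed in-plane pixel spacing (mm)
        slice_mm = 5.0          # assumed slice thickness (mm)

        slice_hu = np.full((512, 512), -1000.0)       # background: air
        slice_hu[200:300, 200:300] = 40.0             # a soft-tissue region (~40 HU)

        yy, xx = np.mgrid[0:512, 0:512]
        roi_mask = (yy - 250) ** 2 + (xx - 250) ** 2 < 40 ** 2   # circular ROI

        roi_values = slice_hu[roi_mask]
        mean_hu = roi_values.mean()                          # mean density in the ROI
        area_mm2 = roi_mask.sum() * pixel_mm ** 2            # ROI area
        volume_mm3 = area_mm2 * slice_mm                     # ROI volume for this slice

        print(f"mean HU = {mean_hu:.1f}, area = {area_mm2:.0f} mm^2, "
              f"volume = {volume_mm3:.0f} mm^3")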

  12. Cloud Computing: The Future of Computing

    OpenAIRE

    Aggarwal, Kanika

    2013-01-01

    Cloud computing has recently emerged as a new paradigm for hosting and delivering services over the Internet. Cloud computing is attractive to business owners as it eliminates the requirement for users to plan ahead for provisioning, and allows enterprises to start small and increase resources only when there is a rise in service demand. The basic principle of cloud computing is to distribute the computation across a great number of distributed computers, rather than local computers ...

  13. Computer Refurbishment

    International Nuclear Information System (INIS)

    Ichiyen, Norman; Chan, Dominic; Thompson, Paul

    2004-01-01

    The major activity for the 18-month refurbishment outage at the Point Lepreau Generating Station is the replacement of all 380 fuel channel and calandria tube assemblies and the lower portion of connecting feeder pipes. New Brunswick Power would also take advantage of this outage to conduct a number of repairs, replacements, inspections and upgrades (such as rewinding or replacing the generator, replacement of shutdown system trip computers, replacement of certain valves and expansion joints, inspection of systems not normally accessible, etc.). This would allow for an additional 25 to 30 years of operation. Among the systems to be replaced are the PDCs for both shutdown systems. Assessments have been completed for both the SDS1 and SDS2 PDCs, and it has been decided to replace the SDS2 PDCs with the same hardware and software approach that has been used successfully for the Wolsong 2, 3, and 4 and the Qinshan 1 and 2 SDS2 PDCs. For SDS1, it has been decided to use the same software development methodology that was used successfully for Wolsong and Qinshan, called the I A, and to use a new hardware platform in order to ensure successful operation for the 25-30 year station operating life. The selected supplier is Triconex, which uses a triple modular redundant architecture that will enhance the robustness/fault tolerance of the design with respect to equipment failures.

  14. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... Computed tomography (CT) of the sinuses ... What is CT (Computed Tomography) of the Sinuses? Computed tomography, more commonly known ...

  15. Illustrated computer tomography

    International Nuclear Information System (INIS)

    Takahashi, S.

    1983-01-01

    This book provides the following information: basic aspects of computed tomography; atlas of computed tomography of the normal adult; clinical application of computed tomography; and radiotherapy planning and computed tomography

  16. Analog and hybrid computing

    CERN Document Server

    Hyndman, D E

    2013-01-01

    Analog and Hybrid Computing focuses on the operations of analog and hybrid computers. The book first outlines the history of computing devices that influenced the creation of analog and digital computers. The types of problems to be solved on computers, computing systems, and digital computers are discussed. The text looks at the theory and operation of electronic analog computers, including linear and non-linear computing units and use of analog computers as operational amplifiers. The monograph examines the preparation of problems to be deciphered on computers. Flow diagrams, methods of ampl

  17. Cloud Computing Fundamentals

    Science.gov (United States)

    Furht, Borko

    In the introductory chapter we define the concept of cloud computing and cloud services, and we introduce layers and types of cloud computing. We discuss the differences between cloud computing and cloud services. New technologies that enabled cloud computing are presented next. We also discuss cloud computing features, standards, and security issues. We introduce the key cloud computing platforms, their vendors, and their offerings. We discuss cloud computing challenges and the future of cloud computing.

  18. Unconventional Quantum Computing Devices

    OpenAIRE

    Lloyd, Seth

    2000-01-01

    This paper investigates a variety of unconventional quantum computation devices, including fermionic quantum computers and computers that exploit nonlinear quantum mechanics. It is shown that unconventional quantum computing devices can in principle compute some quantities more rapidly than `conventional' quantum computers.

  19. Computing handbook computer science and software engineering

    CERN Document Server

    Gonzalez, Teofilo; Tucker, Allen

    2014-01-01

    Overview of Computer Science: Structure and Organization of Computing (Peter J. Denning); Computational Thinking (Valerie Barr); Algorithms and Complexity: Data Structures (Mark Weiss); Basic Techniques for Design and Analysis of Algorithms (Edward Reingold); Graph and Network Algorithms (Samir Khuller and Balaji Raghavachari); Computational Geometry (Marc van Kreveld); Complexity Theory (Eric Allender, Michael Loui, and Kenneth Regan); Formal Models and Computability (Tao Jiang, Ming Li, and Bala

  20. Specialized computer architectures for computational aerodynamics

    Science.gov (United States)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relatively high cost of performing these computations on commercially available general-purpose computers, a cost that is high in terms of dollar expenditure and/or elapsed time. Today's computing technology will support a program designed to create specialized computing facilities dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  1. Applied Parallel Computing Industrial Computation and Optimization

    DEFF Research Database (Denmark)

    Madsen, Kaj; Olesen, Dorte

    Proceedings of the Third International Workshop on Applied Parallel Computing in Industrial Problems and Optimization (PARA96).

  2. Further computer appreciation

    CERN Document Server

    Fry, T F

    2014-01-01

    Further Computer Appreciation is a comprehensive cover of the principles and aspects in computer appreciation. The book starts by describing the development of computers from the first to the third computer generations, to the development of processors and storage systems, up to the present position of computers and future trends. The text tackles the basic elements, concepts and functions of digital computers, computer arithmetic, input media and devices, and computer output. The basic central processor functions, data storage and the organization of data by classification of computer files,

  3. BONFIRE: benchmarking computers and computer networks

    OpenAIRE

    Bouckaert, Stefan; Vanhie-Van Gerwen, Jono; Moerman, Ingrid; Phillips, Stephen; Wilander, Jerker

    2011-01-01

    The benchmarking concept is not new in the field of computing or computer networking. With “benchmarking tools”, one usually refers to a program or set of programs, used to evaluate the performance of a solution under certain reference conditions, relative to the performance of another solution. Since the 1970s, benchmarking techniques have been used to measure the performance of computers and computer networks. Benchmarking of applications and virtual machines in an Infrastructure-as-a-Servi...

  4. Democratizing Computer Science

    Science.gov (United States)

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  5. Computing at Stanford.

    Science.gov (United States)

    Feigenbaum, Edward A.; Nielsen, Norman R.

    1969-01-01

    This article provides a current status report on the computing and computer science activities at Stanford University, focusing on the Computer Science Department, the Stanford Computation Center, the recently established regional computing network, and the Institute for Mathematical Studies in the Social Sciences. Also considered are such topics…

  6. Soft computing in computer and information science

    CERN Document Server

    Fray, Imed; Pejaś, Jerzy

    2015-01-01

    This book presents a carefully selected and reviewed collection of papers presented during the 19th Advanced Computer Systems conference, ACS-2014. From its beginning, the Advanced Computer Systems conference concentrated on methods and algorithms of artificial intelligence. Later years brought new areas of interest concerning technical informatics related to soft computing and some more technological aspects of computer science, such as multimedia and computer graphics, software engineering, web systems, information security and safety, and project management. These topics are represented in the present book under the categories Artificial Intelligence, Design of Information and Multimedia Systems, Information Technology Security, and Software Technologies.

  7. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  8. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... When the image slices are reassembled by computer software, the result is a very detailed multidimensional view ...

  9. Computers: Instruments of Change.

    Science.gov (United States)

    Barkume, Megan

    1993-01-01

    Discusses the impact of computers in the home, the school, and the workplace. Looks at changes in computer use by occupations and by industry. Provides information on new job titles in computer occupations. (JOW)

  10. DNA computing models

    CERN Document Server

    Ignatova, Zoya; Zimmermann, Karl-Heinz

    2008-01-01

    In this excellent text, the reader is given a comprehensive introduction to the field of DNA computing. The book emphasizes computational methods to tackle central problems of DNA computing, such as controlling living cells, building patterns, and generating nanomachines.

  11. Distributed multiscale computing

    NARCIS (Netherlands)

    Borgdorff, J.

    2014-01-01

    Multiscale models combine knowledge, data, and hypotheses from different scales. Simulating a multiscale model often requires extensive computation. This thesis evaluates distributing these computations, an approach termed distributed multiscale computing (DMC). First, the process of multiscale

  12. Computational Modeling | Bioenergy | NREL

    Science.gov (United States)

    cell walls and are the source of biofuels and biomaterials. Our modeling investigates their properties. Quantum Mechanical Models: NREL studies chemical and electronic properties and processes to reduce barriers. Computational Modeling: NREL uses computational modeling to increase the

  13. Computer Viruses: An Overview.

    Science.gov (United States)

    Marmion, Dan

    1990-01-01

    Discusses the early history and current proliferation of computer viruses that occur on Macintosh and DOS personal computers, mentions virus detection programs, and offers suggestions for how libraries can protect themselves and their users from damage by computer viruses. (LRW)

  14. Computer Virus and Trends

    OpenAIRE

    Tutut Handayani; Soenarto Usna,Drs.MMSI

    2004-01-01

    Since its first appearance in the mid-1980s, the computer virus has invited various controversies that last to this day. Along with the development of computer systems technology, computer viruses have found new ways to spread through a variety of existing communications media. This paper discusses several topics related to computer viruses, namely: the definition and history of computer viruses; the basics of computer viruses; the current state of computer viruses; and ...

  15. Plasticity: modeling & computation

    National Research Council Canada - National Science Library

    Borja, Ronaldo Israel

    2013-01-01

    .... "Plasticity Modeling & Computation" is a textbook written specifically for students who want to learn the theoretical, mathematical, and computational aspects of inelastic deformation in solids...

  16. Cloud Computing Quality

    Directory of Open Access Journals (Sweden)

    Anamaria Şiclovan

    2013-02-01

    Full Text Available Cloud computing was, and will be, a new way of providing Internet services and computing. This approach builds on many existing services, such as the Internet, grid computing, and Web services. As a system, cloud computing aims to provide on-demand services at a more acceptable price and infrastructure cost. It is, in effect, the transition from the computer to a service offered to consumers as a product delivered online. This paper describes the quality of cloud computing services, analyzing the advantages and characteristics it offers. It is a theoretical paper. Keywords: cloud computing, QoS, quality of cloud computing

  17. Computer hardware fault administration

    Science.gov (United States)

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-09-14

    Computer hardware fault administration carried out in a parallel computer, where the parallel computer includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying a location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.
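
    A minimal sketch of the routing idea in this abstract, assuming each network is a simple undirected graph and that the fault's location has already been identified; the graph layout and function names are illustrative, not the patented implementation.

        # Illustrative sketch of fault administration on two independent networks.
        # Each network is an undirected graph of compute nodes; if the defective link
        # blocks the route on the primary network, traffic is routed over the
        # secondary network instead.

        from collections import deque

        def shortest_path(links, src, dst, bad=frozenset()):
            """Breadth-first search that avoids links marked defective."""
            prev, queue = {src: None}, deque([src])
            while queue:
                node = queue.popleft()
                if node == dst:
                    path = []
                    while node is not None:
                        path.append(node)
                        node = prev[node]
                    return path[::-1]
                for nxt in links.get(node, ()):
                    edge = frozenset((node, nxt))
                    if nxt not in prev and edge not in bad:
                        prev[nxt] = node
                        queue.append(nxt)
            return None

        primary   = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}   # chain 0-1-2-3
        secondary = {0: [3], 3: [0, 2], 2: [3, 1], 1: [2]}   # independent chain

        defective = {frozenset((1, 2))}                      # fault located on primary
        route = shortest_path(primary, 0, 3, bad=defective)
        if route is None:                                    # primary cannot reach dst
            route = shortest_path(secondary, 0, 3)           # fall back to secondary
        print("route:", route)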

  18. Computer jargon explained

    CERN Document Server

    Enticknap, Nicholas

    2014-01-01

    Computer Jargon Explained is a feature in Computer Weekly publications that discusses 68 of the most commonly used technical computing terms. The book explains what the terms mean and why the terms are important to computer professionals. The text also discusses how the terms relate to the trends and developments that are driving the information technology industry. Computer jargon irritates non-computer people and in turn causes problems for computer people. The technology and the industry are changing so rapidly that it is very hard even for professionals to keep updated. Computer people do not

  19. Computers and data processing

    CERN Document Server

    Deitel, Harvey M

    1985-01-01

    Computers and Data Processing provides information pertinent to the advances in the computer field. This book covers a variety of topics, including the computer hardware, computer programs or software, and computer applications systems.Organized into five parts encompassing 19 chapters, this book begins with an overview of some of the fundamental computing concepts. This text then explores the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapters consider how computers present their results and explain the storage and retrieval of

  20. Computers in nuclear medicine

    International Nuclear Information System (INIS)

    Giannone, Carlos A.

    1999-01-01

    This chapter covers the capture and display of images on computers and the hardware and software used: personal computers, networks, and workstations. The use of special filters determines image quality.

  1. Optimization of recommendations for abdomen computerized tomography based on reconstruction filters, voltage and tube current; Otimizacao de protocolos de tomografia computadorizada de abdome com base nos filtros de reconstrucao, tensao e corrente do tubo

    Energy Technology Data Exchange (ETDEWEB)

    Silveira, Vinicius da Costa

    2015-07-01

    The use of computed tomography has increased significantly over the past decades. In Brazil its use more than doubled from 2008 to 2014, while abdominal procedures tripled. The high frequency of this procedure, combined with the increasing collective radiation dose from medical exposures, has driven the development of tools to maximize the benefit of CT images. This work aimed to establish optimized protocols for abdominal CT through acquisition parameters and reconstruction techniques based on filter kernels. A sample of patients undergoing abdominal CT in a diagnostic center in Rio de Janeiro was assessed, and patient information and acquisition parameters were collected. Phantom CT image acquisitions were performed using different voltage values, adjusting the tube current (mAs) to obtain the same CTDI{sub vol} value as for patients with normal BMI. Afterwards, the CTDIvol values were reduced by 30%, 50% and 60%. All images were reconstructed with low-contrast filters (A) and standard filters (B). The CTDIvol values for patients with normal BMI were 7% higher than in patients with underweight BMI and 30%, 50% and 60% lower than in the overweight, obese I and obese III patients, respectively. The evaluations of image quality showed that variation of the current (mA) and the reconstruction filters did not affect the Hounsfield values. When the contrast-to-noise ratio (CNR) was normalized to CTDIvol, the protocols acquired with a 60% reduction of CTDIvol at 140 kV and 80 kV showed a CNR 6% lower than the routine protocol. Modifications of the acquisition parameters did not affect spatial resolution, but post-processing with B filters reduced the spatial frequency by 16%. With the dose reduced by 30%, lesions in the spleen had a CNR more than 10% higher than routine protocols when acquired at 140 kV and post-processed with filter A. Image post-processing with filter A at a voltage of 80 kV provided CNR values equal to the routine protocol for liver lesions with a 30
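
    The abstract does not give the exact CNR definition used, so the following sketch only illustrates one common way a contrast-to-noise ratio and a dose-normalized figure of merit are computed from ROI statistics and CTDIvol; the ROI values are simulated, not data from the study.

        import numpy as np

        # Illustrative contrast-to-noise ratio (CNR) and a common dose-normalized
        # figure of merit; the exact definitions used in the thesis are not given in
        # the abstract, so the formulas and numbers here are assumptions.

        def cnr(lesion_roi, background_roi):
            """CNR = |mean_lesion - mean_background| / background noise (std)."""
            lesion_roi = np.asarray(lesion_roi, float)
            background_roi = np.asarray(background_roi, float)
            return abs(lesion_roi.mean() - background_roi.mean()) / background_roi.std()

        def dose_normalized_cnr(cnr_value, ctdi_vol):
            """One common normalization: CNR / sqrt(CTDIvol), since noise ~ 1/sqrt(dose)."""
            return cnr_value / np.sqrt(ctdi_vol)

        rng = np.random.default_rng(0)
        liver  = rng.normal(60.0, 12.0, 2000)    # simulated liver ROI, HU
        lesion = rng.normal(40.0, 12.0, 2000)    # simulated hypodense lesion ROI, HU

        c = cnr(lesion, liver)
        print(f"CNR = {c:.2f}, CNR/sqrt(CTDIvol) at 12 mGy = "
              f"{dose_normalized_cnr(c, 12.0):.2f}")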

  2. Advances in unconventional computing

    CERN Document Server

    2017-01-01

    Unconventional computing is a niche for interdisciplinary science, a cross-breed of computer science, physics, mathematics, chemistry, electronic engineering, biology, material science and nanotechnology. The aims of this book are to uncover and exploit principles and mechanisms of information processing in, and functional properties of, physical, chemical and living systems, in order to develop efficient algorithms, design optimal architectures and manufacture working prototypes of future and emergent computing devices. This first volume presents theoretical foundations of the future and emergent computing paradigms and architectures. The topics covered are computability, (non-)universality and complexity of computation; physics of computation, analog and quantum computing; reversible and asynchronous devices; cellular automata and other mathematical machines; P-systems and cellular computing; infinity and spatial computation; chemical and reservoir computing. The book is an encyclopedia, the first ever complete autho...

  3. Computability and unsolvability

    CERN Document Server

    Davis, Martin

    1985-01-01

    ""A clearly written, well-presented survey of an intriguing subject."" - Scientific American. Classic text considers general theory of computability, computable functions, operations on computable functions, Turing machines self-applied, unsolvable decision problems, applications of general theory, mathematical logic, Kleene hierarchy, computable functionals, classification of unsolvable decision problems and more.

  4. Mathematics for computer graphics

    CERN Document Server

    Vince, John

    2006-01-01

    Helps you understand the mathematical ideas used in computer animation, virtual reality, CAD, and other areas of computer graphics. This work also helps you to rediscover the mathematical techniques required to solve problems and design computer programs for computer graphic applications

  5. Computations and interaction

    NARCIS (Netherlands)

    Baeten, J.C.M.; Luttik, S.P.; Tilburg, van P.J.A.; Natarajan, R.; Ojo, A.

    2011-01-01

    We enhance the notion of a computation of the classical theory of computing with the notion of interaction. In this way, we enhance a Turing machine as a model of computation to a Reactive Turing Machine that is an abstract model of a computer as it is used nowadays, always interacting with the user

  6. Symbiotic Cognitive Computing

    OpenAIRE

    Farrell, Robert G.; Lenchner, Jonathan; Kephart, Jeffrey O.; Webb, Alan M.; Muller, Michael J.; Erikson, Thomas D.; Melville, David O.; Bellamy, Rachel K.E.; Gruen, Daniel M.; Connell, Jonathan H.; Soroker, Danny; Aaron, Andy; Trewin, Shari M.; Ashoori, Maryam; Ellis, Jason B.

    2016-01-01

    IBM Research is engaged in a research program in symbiotic cognitive computing to investigate how to embed cognitive computing in physical spaces. This article proposes 5 key principles of symbiotic cognitive computing.  We describe how these principles are applied in a particular symbiotic cognitive computing environment and in an illustrative application.  

  7. Computer scientist looks at reliability computations

    International Nuclear Information System (INIS)

    Rosenthal, A.

    1975-01-01

    Results from the theory of computational complexity are applied to reliability computations on fault trees and networks. A well known class of problems which almost certainly have no fast solution algorithms is presented. It is shown that even approximately computing the reliability of many systems is difficult enough to be in this class. In the face of this result, which indicates that for general systems the computation time will be exponential in the size of the system, decomposition techniques which can greatly reduce the effective size of a wide variety of realistic systems are explored
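
    A minimal sketch of the kind of fault-tree computation discussed, assuming independent basic events: AND gates multiply failure probabilities, OR gates combine complements. This bottom-up pass is exact only when no basic event is shared between branches; repeated events are part of what makes the general problem computationally hard.

        # Minimal fault-tree evaluation sketch, assuming independent basic events.
        # AND gate: product of child failure probabilities.
        # OR gate : 1 - product of child success probabilities.

        from math import prod

        def top_event_probability(node, p_basic):
            kind = node[0]
            if kind == "basic":
                return p_basic[node[1]]
            child_probs = [top_event_probability(c, p_basic) for c in node[1]]
            if kind == "and":
                return prod(child_probs)
            if kind == "or":
                return 1.0 - prod(1.0 - q for q in child_probs)
            raise ValueError(f"unknown gate {kind!r}")

        # TOP = (pump A fails AND pump B fails) OR valve fails
        tree = ("or", [("and", [("basic", "pumpA"), ("basic", "pumpB")]),
                       ("basic", "valve")])
        p = {"pumpA": 0.01, "pumpB": 0.02, "valve": 0.001}
        print(f"P(top event) = {top_event_probability(tree, p):.6f}")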

  8. Roadmap to greener computing

    CERN Document Server

    Nguemaleu, Raoul-Abelin Choumin

    2014-01-01

    A concise and accessible introduction to green computing and green IT, this book addresses how computer science and the computer infrastructure affect the environment and presents the main challenges in making computing more environmentally friendly. The authors review the methodologies, designs, frameworks, and software development tools that can be used in computer science to reduce energy consumption and still compute efficiently. They also focus on Computer Aided Design (CAD) and describe what design engineers and CAD software applications can do to support new streamlined business directi

  9. Brief: Managing computing technology

    International Nuclear Information System (INIS)

    Startzman, R.A.

    1994-01-01

    While computing is applied widely in the production segment of the petroleum industry, its effective application is the primary goal of computing management. Computing technology has changed significantly since the 1950's, when computers first began to influence petroleum technology. The ability to accomplish traditional tasks faster and more economically probably is the most important effect that computing has had on the industry. While speed and lower cost are important, are they enough? Can computing change the basic functions of the industry? When new computing technology is introduced improperly, it can clash with traditional petroleum technology. This paper examines the role of management in merging these technologies

  10. Computer mathematics for programmers

    CERN Document Server

    Abney, Darrell H; Sibrel, Donald W

    1985-01-01

    Computer Mathematics for Programmers presents the Mathematics that is essential to the computer programmer.The book is comprised of 10 chapters. The first chapter introduces several computer number systems. Chapter 2 shows how to perform arithmetic operations using the number systems introduced in Chapter 1. The third chapter covers the way numbers are stored in computers, how the computer performs arithmetic on real numbers and integers, and how round-off errors are generated in computer programs. Chapter 4 details the use of algorithms and flowcharting as problem-solving tools for computer p

  11. Parallel computing works

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-23

    An account of the Caltech Concurrent Computation Program (C{sup 3}P), a five-year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C{sup 3}P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C{sup 3}P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  12. The digital computer

    CERN Document Server

    Parton, K C

    2014-01-01

    The Digital Computer focuses on the principles, methodologies, and applications of the digital computer. The publication takes a look at the basic concepts involved in using a digital computer, simple autocode examples, and examples of working advanced design programs. Discussions focus on transformer design synthesis program, machine design analysis program, solution of standard quadratic equations, harmonic analysis, elementary wage calculation, and scientific calculations. The manuscript then examines commercial and automatic programming, how computers work, and the components of a computer

  13. Cloud computing for radiologists

    OpenAIRE

    Amit T Kharat; Amjad Safvi; S S Thind; Amarjit Singh

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as...

  14. Toward Cloud Computing Evolution

    OpenAIRE

    Susanto, Heru; Almunawar, Mohammad Nabil; Kang, Chen Chin

    2012-01-01

    Information Technology (IT) has shaped the success of organizations, giving them a solid foundation that increases both their level of efficiency and their productivity. The computing industry is witnessing a paradigm shift in the way computing is performed worldwide. There is a growing awareness among consumers and enterprises of accessing their IT resources extensively through a "utility" model known as "cloud computing." Cloud computing was initially rooted in distributed grid-based computing. ...

  15. Algorithmically specialized parallel computers

    CERN Document Server

    Snyder, Lawrence; Gannon, Dennis B

    1985-01-01

    Algorithmically Specialized Parallel Computers focuses on the concept and characteristics of an algorithmically specialized computer.This book discusses the algorithmically specialized computers, algorithmic specialization using VLSI, and innovative architectures. The architectures and algorithms for digital signal, speech, and image processing and specialized architectures for numerical computations are also elaborated. Other topics include the model for analyzing generalized inter-processor, pipelined architecture for search tree maintenance, and specialized computer organization for raster

  16. Synthetic Computation: Chaos Computing, Logical Stochastic Resonance, and Adaptive Computing

    Science.gov (United States)

    Kia, Behnam; Murali, K.; Jahed Motlagh, Mohammad-Reza; Sinha, Sudeshna; Ditto, William L.

    Nonlinearity and chaos can illustrate numerous behaviors and patterns, and one can select different patterns from this rich library of patterns. In this paper we focus on synthetic computing, a field that engineers and synthesizes nonlinear systems to obtain computation. We explain the importance of nonlinearity, and describe how nonlinear systems can be engineered to perform computation. More specifically, we provide an overview of chaos computing, a field that manually programs chaotic systems to build different types of digital functions. Also we briefly describe logical stochastic resonance (LSR), and then extend the approach of LSR to realize combinational digital logic systems via suitable concatenation of existing logical stochastic resonance blocks. Finally we demonstrate how a chaotic system can be engineered and mated with different machine learning techniques, such as artificial neural networks, random searching, and genetic algorithm, to design different autonomous systems that can adapt and respond to environmental conditions.
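
    As a minimal illustration of the chaos-computing idea (not this paper's specific constructions), the fully chaotic logistic map can be made to act as a logic gate: the binary inputs shift the map's state, one iteration is applied, and the result is read off by thresholding. The state offset, input step, and threshold below are one published-style parameter choice for an AND gate, assumed here for illustration only.

        # Illustrative chaos-computing logic gate (a sketch, not this paper's method).
        # Inputs are encoded as additive offsets to the state of the logistic map
        # f(x) = 4x(1-x); after one iteration the output is decoded by thresholding.

        def logistic(x):
            return 4.0 * x * (1.0 - x)

        def chaotic_and(i1, i2, x_star=0.0, delta=0.25, threshold=0.75):
            """AND gate; the parameter values are an assumed illustrative choice."""
            x0 = x_star + delta * (i1 + i2)    # encode the two binary inputs
            x1 = logistic(x0)                  # one chaotic iteration
            return 1 if x1 > threshold else 0  # decode by thresholding

        for a in (0, 1):
            for b in (0, 1):
                print(a, b, "->", chaotic_and(a, b))   # prints the AND truth table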

  17. Future Computer Requirements for Computational Aerodynamics

    Science.gov (United States)

    1978-01-01

    Recent advances in computational aerodynamics are discussed as well as motivations for and potential benefits of a National Aerodynamic Simulation Facility having the capability to solve fluid dynamic equations at speeds two to three orders of magnitude faster than presently possible with general computers. Two contracted efforts to define processor architectures for such a facility are summarized.

  18. Computers and Computation. Readings from Scientific American.

    Science.gov (United States)

    Fenichel, Robert R.; Weizenbaum, Joseph

    A collection of articles from "Scientific American" magazine has been put together at this time because the current period in computer science is one of consolidation rather than innovation. A few years ago, computer science was moving so swiftly that even the professional journals were more archival than informative; but today it is…

  19. Know Your Personal Computer Introduction to Computers

    Indian Academy of Sciences (India)

    Know Your Personal Computer: Introduction to Computers. Siddhartha Kumar Ghoshal. Series Article, Resonance – Journal of Science Education, Volume 1, Issue 1, January 1996, pp. 48-55.

  20. Heterotic computing: exploiting hybrid computational devices.

    Science.gov (United States)

    Kendon, Viv; Sebald, Angelika; Stepney, Susan

    2015-07-28

    Current computational theory deals almost exclusively with single models: classical, neural, analogue, quantum, etc. In practice, researchers use ad hoc combinations, realizing only recently that they can be fundamentally more powerful than the individual parts. A Theo Murphy meeting brought together theorists and practitioners of various types of computing, to engage in combining the individual strengths to produce powerful new heterotic devices. 'Heterotic computing' is defined as a combination of two or more computational systems such that they provide an advantage over either substrate used separately. This post-meeting collection of articles provides a wide-ranging survey of the state of the art in diverse computational paradigms, together with reflections on their future combination into powerful and practical applications. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  1. Cloud Computing for radiologists.

    Science.gov (United States)

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  2. Cloud Computing for radiologists

    International Nuclear Information System (INIS)

    Kharat, Amit T; Safvi, Amjad; Thind, SS; Singh, Amarjit

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future

  3. Cloud computing for radiologists

    Directory of Open Access Journals (Sweden)

    Amit T Kharat

    2012-01-01

    Full Text Available Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  4. Review of quantum computation

    International Nuclear Information System (INIS)

    Lloyd, S.

    1992-01-01

    Digital computers are machines that can be programmed to perform logical and arithmetical operations. Contemporary digital computers are ''universal,'' in the sense that a program that runs on one computer can, if properly compiled, run on any other computer that has access to enough memory space and time. Any one universal computer can simulate the operation of any other; and the set of tasks that any such machine can perform is common to all universal machines. Since Bennett's discovery that computation can be carried out in a non-dissipative fashion, a number of Hamiltonian quantum-mechanical systems have been proposed whose time-evolutions over discrete intervals are equivalent to those of specific universal computers. The first quantum-mechanical treatment of computers was given by Benioff, who exhibited a Hamiltonian system with a basis whose members corresponded to the logical states of a Turing machine. In order to make the Hamiltonian local, in the sense that its structure depended only on the part of the computation being performed at that time, Benioff found it necessary to make the Hamiltonian time-dependent. Feynman discovered a way to make the computational Hamiltonian both local and time-independent by incorporating the direction of computation in the initial condition. In Feynman's quantum computer, the program is a carefully prepared wave packet that propagates through different computational states. Deutsch presented a quantum computer that exploits the possibility of existing in a superposition of computational states to perform tasks that a classical computer cannot, such as generating purely random numbers, and carrying out superpositions of computations as a method of parallel processing. In this paper, we show that such computers, by virtue of their common function, possess a common form for their quantum dynamics

  5. Computers for imagemaking

    CERN Document Server

    Clark, D

    1981-01-01

    Computers for Image-Making tells the computer non-expert all he needs to know about Computer Animation. In the hands of expert computer engineers, computer picture-drawing systems have, since the earliest days of computing, produced interesting and useful images. As a result of major technological developments since then, it no longer requires the expert's skill to draw pictures; anyone can do it, provided they know how to use the appropriate machinery. This collection of specially commissioned articles reflects the diversity of user applications in this expanding field

  6. Computer Lexis and Terminology

    Directory of Open Access Journals (Sweden)

    Gintautas Grigas

    2011-04-01

    Full Text Available The computer has become a widely used tool in everyday work and at home. Every computer user sees texts on the screen containing many words naming new concepts. Those words come from the terminology used by specialists, and a common vocabulary between computer terminology and the lexis of everyday language comes into existence. The article deals with the part of computer terminology that passes into everyday usage and with the influence of ordinary language on computer terminology. The relationship between English and Lithuanian computer terminology, and the construction and pronunciation of acronyms, are discussed as well.

  7. Computations in plasma physics

    International Nuclear Information System (INIS)

    Cohen, B.I.; Killeen, J.

    1984-01-01

    A review of computer applications in plasma physics is presented. The contribution of computers to the investigation of magnetic and inertial plasma confinement and of charged-particle beam propagation is described. Typical uses of computers for the simulation and control of laboratory and space plasma experiments, and for data accumulation in these experiments, are considered. Basic computational methods applied in plasma physics are discussed. Future trends in computer use in plasma research are considered in terms of the increasing role of microprocessors and high-speed data plotters and the need for more powerful computers.

  8. Quantum computer science

    CERN Document Server

    Lanzagorta, Marco

    2009-01-01

    In this text we present a technical overview of the emerging field of quantum computation along with new research results by the authors. What distinguishes our presentation from that of others is our focus on the relationship between quantum computation and computer science. Specifically, our emphasis is on the computational model of quantum computing rather than on the engineering issues associated with its physical implementation. We adopt this approach for the same reason that a book on computer programming doesn't cover the theory and physical realization of semiconductors. Another distin

  9. Explorations in quantum computing

    CERN Document Server

    Williams, Colin P

    2011-01-01

    By the year 2020, the basic memory components of a computer will be the size of individual atoms. At such scales, the current theory of computation will become invalid. "Quantum computing" is reinventing the foundations of computer science and information theory in a way that is consistent with quantum physics - the most accurate model of reality currently known. Remarkably, this theory predicts that quantum computers can perform certain tasks breathtakingly faster than classical computers -- and, better yet, can accomplish mind-boggling feats such as teleporting information, breaking suppos

  10. Physics vs. computer science

    International Nuclear Information System (INIS)

    Pike, R.

    1982-01-01

    With computers becoming more frequently used in theoretical and experimental physics, physicists can no longer afford to be ignorant of the basic techniques and results of computer science. Computing principles belong in a physicist's tool box, along with experimental methods and applied mathematics, and the easiest way to educate physicists in computing is to provide, as part of the undergraduate curriculum, a computing course designed specifically for physicists. As well, the working physicist should interact with computer scientists, giving them challenging problems in return for their expertise. (orig.)

  11. Polymorphous computing fabric

    Science.gov (United States)

    Wolinski, Christophe Czeslaw [Los Alamos, NM; Gokhale, Maya B [Los Alamos, NM; McCabe, Kevin Peter [Los Alamos, NM

    2011-01-18

    Fabric-based computing systems and methods are disclosed. A fabric-based computing system can include a polymorphous computing fabric that can be customized on a per application basis and a host processor in communication with said polymorphous computing fabric. The polymorphous computing fabric includes a cellular architecture that can be highly parameterized to enable a customized synthesis of fabric instances for a variety of enhanced application performances thereof. A global memory concept can also be included that provides the host processor random access to all variables and instructions associated with the polymorphous computing fabric.

  12. Computer ray tracing speeds.

    Science.gov (United States)

    Robb, P; Pawlowski, B

    1990-05-01

    The results of measuring the ray trace speed and compilation speed of thirty-nine computers in fifty-seven configurations, ranging from personal computers to super computers, are described. A correlation of ray trace speed has been made with the LINPACK benchmark which allows the ray trace speed to be estimated using LINPACK performance data. The results indicate that the latest generation of workstations, using CPUs based on RISC (Reduced Instruction Set Computer) technology, are as fast or faster than mainframe computers in compute-bound situations.
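
    The paper's correlation itself is not reproduced here; the sketch below only shows how such an estimate could be formed by a least-squares fit of measured ray-trace speeds against LINPACK scores. The numeric pairs are placeholders for illustration, not the authors' measurements.

        import numpy as np

        # Illustrative sketch of estimating ray-trace speed from LINPACK performance.
        # The (LINPACK MFLOPS, rays/second) pairs below are placeholder values, NOT
        # the measurements reported by the paper; only the fitting idea is shown.

        linpack_mflops = np.array([1.0, 5.0, 12.0, 40.0, 160.0])
        rays_per_sec   = np.array([90.0, 460.0, 1100.0, 3700.0, 15000.0])

        # Fit rays/s ~= a * MFLOPS + b by least squares.
        a, b = np.polyfit(linpack_mflops, rays_per_sec, 1)

        new_machine_mflops = 25.0
        print(f"estimated ray-trace speed: {a * new_machine_mflops + b:.0f} rays/s")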

  13. Computing networks from cluster to cloud computing

    CERN Document Server

    Vicat-Blanc, Pascale; Guillier, Romaric; Soudan, Sebastien

    2013-01-01

    "Computing Networks" explores the core of the new distributed computing infrastructures we are using today:  the networking systems of clusters, grids and clouds. It helps network designers and distributed-application developers and users to better understand the technologies, specificities, constraints and benefits of these different infrastructures' communication systems. Cloud Computing will give the possibility for millions of users to process data anytime, anywhere, while being eco-friendly. In order to deliver this emerging traffic in a timely, cost-efficient, energy-efficient, and

  14. Computing Nash equilibria through computational intelligence methods

    Science.gov (United States)

    Pavlidis, N. G.; Parsopoulos, K. E.; Vrahatis, M. N.

    2005-03-01

    Nash equilibrium constitutes a central solution concept in game theory. The task of detecting the Nash equilibria of a finite strategic game remains a challenging problem to date. This paper investigates the effectiveness of three computational intelligence techniques, namely covariance matrix adaptation evolution strategies, particle swarm optimization, and differential evolution, to compute Nash equilibria of finite strategic games, as global minima of a real-valued, nonnegative function. An issue of particular interest is to detect more than one Nash equilibrium of a game. The performance of the considered computational intelligence methods on this problem is investigated using multistart and deflection.
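
    A minimal sketch of the general approach, assuming one standard way of recasting Nash equilibria as global minima: the objective sums the squared positive gains each player could obtain by deviating to a pure strategy, and it is zero exactly at an equilibrium. Here it is minimized with SciPy's differential evolution on a small 2x2 game; the game and the particular objective are illustrative choices, not necessarily the paper's exact formulation.

        import numpy as np
        from scipy.optimize import differential_evolution

        # Sketch: find a Nash equilibrium of a 2x2 bimatrix game by minimizing a
        # nonnegative function that vanishes exactly at equilibria.

        A = np.array([[1.0, -1.0], [-1.0, 1.0]])   # row player's payoffs (matching pennies)
        B = -A                                      # column player's payoffs

        def objective(z):
            p, q = z
            x = np.array([p, 1.0 - p])              # row player's mixed strategy
            y = np.array([q, 1.0 - q])              # column player's mixed strategy
            u1, u2 = x @ A @ y, x @ B @ y            # current expected payoffs
            # Gains from deviating to each pure strategy; all are <= 0 at equilibrium.
            g1 = np.maximum(A @ y - u1, 0.0)
            g2 = np.maximum(x @ B - u2, 0.0)
            return float(np.sum(g1 ** 2) + np.sum(g2 ** 2))

        result = differential_evolution(objective, bounds=[(0.0, 1.0), (0.0, 1.0)], seed=0)
        print("mixed strategies (p, q):", np.round(result.x, 3),
              "objective:", round(result.fun, 6))   # expects roughly (0.5, 0.5), 0.0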

  15. Reversible computing fundamentals, quantum computing, and applications

    CERN Document Server

    De Vos, Alexis

    2010-01-01

    Written by one of the few top internationally recognized experts in the field, this book concentrates on those topics that will remain fundamental, such as low power computing, reversible programming languages, and applications in thermodynamics. It describes reversible computing from various points of view: Boolean algebra, group theory, logic circuits, low-power electronics, communication, software, quantum computing. It is this multidisciplinary approach that makes it unique.Backed by numerous examples, this is useful for all levels of the scientific and academic community, from undergr

  16. Computing in high energy physics

    Energy Technology Data Exchange (ETDEWEB)

    Watase, Yoshiyuki

    1991-09-15

    The increasingly important role played by computing and computers in high energy physics is displayed in the 'Computing in High Energy Physics' series of conferences, bringing together experts in different aspects of computing - physicists, computer scientists, and vendors.

  17. Searching with Quantum Computers

    OpenAIRE

    Grover, Lov K.

    2000-01-01

    This article introduces quantum computation by analogy with probabilistic computation. A basic description of the quantum search algorithm is given by representing the algorithm as a C program in a novel way.
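
    Grover's article presents the algorithm as a C program; the sketch below instead simulates the quantum state classically in Python (an illustrative choice), showing the two steps of each Grover iteration: a sign flip on the marked item's amplitude and an inversion of all amplitudes about their mean.

        import math

        # Classical statevector simulation of Grover's search over N = 2**n items.
        # The oracle flips the sign of the marked item's amplitude; the diffusion step
        # inverts every amplitude about the mean. ~(pi/4)*sqrt(N) rounds are optimal.

        def grover(n_qubits, marked):
            n = 2 ** n_qubits
            amp = [1.0 / math.sqrt(n)] * n                 # uniform superposition
            rounds = int(math.floor(math.pi / 4 * math.sqrt(n)))
            for _ in range(rounds):
                amp[marked] = -amp[marked]                 # oracle: phase flip on target
                mean = sum(amp) / n
                amp = [2 * mean - a for a in amp]          # diffusion: invert about mean
            return amp

        amplitudes = grover(n_qubits=4, marked=5)
        print("P(measure marked item) =", round(amplitudes[5] ** 2, 3))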

  18. Book Review: Computational Topology

    DEFF Research Database (Denmark)

    Raussen, Martin

    2011-01-01

    Computational Topology by Herbert Edelsbrunner and John L. Harer. American Mathematical Society, 2010. ISBN 978-0-8218-4925-5.

  19. Essential numerical computer methods

    CERN Document Server

    Johnson, Michael L

    2010-01-01

    The use of computers and computational methods has become ubiquitous in biological and biomedical research. During the last 2 decades most basic algorithms have not changed, but what has is the huge increase in computer speed and ease of use, along with the corresponding orders of magnitude decrease in cost. A general perception exists that the only applications of computers and computer methods in biological and biomedical research are either basic statistical analysis or the searching of DNA sequence data bases. While these are important applications they only scratch the surface of the current and potential applications of computers and computer methods in biomedical research. The various chapters within this volume include a wide variety of applications that extend far beyond this limited perception. As part of the Reliable Lab Solutions series, Essential Numerical Computer Methods brings together chapters from volumes 210, 240, 321, 383, 384, 454, and 467 of Methods in Enzymology. These chapters provide ...

  20. Know Your Personal Computer

    Indian Academy of Sciences (India)

    computer with IBM PC .... read by a human and not translated by a compiler are called .... by different stages of education becomes a computer scientist. ... ancestors knew and carried out the semantic actions without question or comment.

  1. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... images. These images can be viewed on a computer monitor, printed on film or transferred to a ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ...

  2. SSCL computer planning

    International Nuclear Information System (INIS)

    Price, L.E.

    1990-01-01

    The SSC Laboratory is in the process of planning the acquisition of a substantial computing system to support the design of detectors. Advice has been sought from users and computer experts in several stages. This paper discusses this process.

  3. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... images. These images can be viewed on a computer monitor, printed on film or transferred to a ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ...

  4. Computational Science Facility (CSF)

    Data.gov (United States)

    Federal Laboratory Consortium — PNNL Institutional Computing (PIC) is focused on meeting DOE's mission needs and is part of PNNL's overarching research computing strategy. PIC supports large-scale...

  5. Quantum Computer Science

    Science.gov (United States)

    Mermin, N. David

    2007-08-01

    Preface; 1. Cbits and Qbits; 2. General features and some simple examples; 3. Breaking RSA encryption with a quantum computer; 4. Searching with a quantum computer; 5. Quantum error correction; 6. Protocols that use just a few Qbits; Appendices; Index.

  6. Computer Vision Syndrome.

    Science.gov (United States)

    Randolph, Susan A

    2017-07-01

    With the increased use of electronic devices with visual displays, computer vision syndrome is becoming a major public health issue. Improving the visual status of workers using computers results in greater productivity in the workplace and improved visual comfort.

  7. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... are the limitations of CT of the Sinuses? What is CT (Computed Tomography) of the Sinuses? Computed ... nasal cavity by small openings. top of page What are some common uses of the procedure? CT ...

  8. Computer Technology Directory.

    Science.gov (United States)

    Exceptional Parent, 1990

    1990-01-01

    This directory lists approximately 300 commercial vendors that offer computer hardware, software, and communication aids for children with disabilities. The company listings indicate computer compatibility and specific disabilities served by their products. (JDD)

  9. My Computer Is Learning.

    Science.gov (United States)

    Good, Ron

    1986-01-01

    Describes instructional uses of computer programs found in David Heiserman's book "Projects in Machine Intelligence for Your Home Computer." The programs feature "creatures" of various colors that move around within a rectangular white border. (JN)

  10. What is Computed Tomography?

    Science.gov (United States)

    ... Imaging Medical X-ray Imaging What is Computed Tomography? ... Chest X ray Image ... Computed Tomography (CT) Although also based on the variable absorption ...

  11. Joint Computing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Raised Floor Computer Space for High Performance Computing. The ERDC Information Technology Laboratory (ITL) provides a robust system of IT facilities to develop and...

  12. Computing for Belle

    CERN Multimedia

    CERN. Geneva

    2004-01-01

    2s-1, 10 times as much as we obtain now. This presentation describes Belle's efficient computing operations, struggles to manage large amounts of raw and physics data, and plans for Belle computing for Super KEKB/Belle.

  13. Computational Continuum Mechanics

    CERN Document Server

    Shabana, Ahmed A

    2011-01-01

    This text presents the theory of continuum mechanics using computational methods. Ideal for students and researchers, the second edition features a new chapter on computational geometry and finite element analysis.

  14. Applications of computer algebra

    CERN Document Server

    1985-01-01

    Today, certain computer software systems exist which surpass the computational ability of researchers when their mathematical techniques are applied to many areas of science and engineering. These computer systems can perform a large portion of the calculations seen in mathematical analysis. Despite this massive power, thousands of people use these systems as a routine resource for everyday calculations. These software programs are commonly called "Computer Algebra" systems. They have names such as MACSYMA, MAPLE, muMATH, REDUCE and SMP. They are receiving credit as a computational aid with increasing regularity in articles in the scientific and engineering literature. When most people think about computers and scientific research these days, they imagine a machine grinding away, processing numbers arithmetically. It is not generally realized that, for a number of years, computers have been performing non-numeric computations. This means, for example, that one inputs an equation and obtains a closed for...

  15. ICASE Computer Science Program

    Science.gov (United States)

    1985-01-01

    The Institute for Computer Applications in Science and Engineering computer science program is discussed in outline form. Information is given on such topics as problem decomposition, algorithm development, programming languages, and parallel architectures.

  16. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... ring, called a gantry. The computer workstation that processes the imaging information is located in a separate ... follows a spiral path. A special computer program processes this large volume of data to create two- ...

  17. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... ring, called a gantry. The computer workstation that processes the imaging information is located in a separate ... follows a spiral path. A special computer program processes this large volume of data to create two- ...

  18. Computed Tomography (CT) -- Head

    Medline Plus

  19. Intimacy and Computer Communication.

    Science.gov (United States)

    Robson, Dave; Robson, Maggie

    1998-01-01

    Addresses the relationship between intimacy and communication that is based on computer technology. Discusses definitions of intimacy and the nature of intimate conversations that use computers as a communications medium. Explores implications for counseling. (MKA)

  20. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ... ray beam follows a spiral path. A special computer program processes this large volume of data to ...

  1. Cognitive Computing for Security.

    Energy Technology Data Exchange (ETDEWEB)

    Debenedictis, Erik [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rothganger, Fredrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Aimone, James Bradley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Marinella, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Evans, Brian Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Warrender, Christina E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mickel, Patrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-01

    Final report for Cognitive Computing for Security LDRD 165613. It reports on the development of a hybrid general-purpose/neuromorphic computer architecture, with an emphasis on potential implementation with memristors.

  2. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    OpenAIRE

    Karlheinz Schwarz; Rainer Breitling; Christian Allen

    2013-01-01

    Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems in fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized ...

  3. Nanoelectronics: Metrology and Computation

    International Nuclear Information System (INIS)

    Lundstrom, Mark; Clark, Jason V.; Klimeck, Gerhard; Raman, Arvind

    2007-01-01

    Research in nanoelectronics poses new challenges for metrology, but advances in theory, simulation and computing and networking technology provide new opportunities to couple simulation and metrology. This paper begins with a brief overview of current work in computational nanoelectronics. Three examples of how computation can assist metrology will then be discussed. The paper concludes with a discussion of how cyberinfrastructure can help connect computing and metrology using the nanoHUB (www.nanoHUB.org) as a specific example

  4. Foundations of Neuromorphic Computing

    Science.gov (United States)

    2013-05-01

    paradigms: few sensors/complex computations and many sensors/simple computation. Challenges with nano-enabled neuromorphic chips: a wide variety of... (Foundations of Neuromorphic Computing, final technical report, May 2013, covering 2009 – Sep 2012; approved for public release.)

  5. Approximation and Computation

    CERN Document Server

    Gautschi, Walter; Rassias, Themistocles M

    2011-01-01

    Approximation theory and numerical analysis are central to the creation of accurate computer simulations and mathematical models. Research in these areas can influence the computational techniques used in a variety of mathematical and computational sciences. This collection of contributed chapters, dedicated to renowned mathematician Gradimir V. Milovanović, represents the recent work of experts in the fields of approximation theory and numerical analysis. These invited contributions describe new trends in these important areas of research including theoretic developments, new computational alg

  6. Computed tomography for radiographers

    International Nuclear Information System (INIS)

    Brooker, M.

    1986-01-01

    Computed tomography is regarded by many as a complicated union of sophisticated x-ray equipment and computer technology. This book overcomes these complexities. The rigid technicalities of the machinery and the clinical aspects of computed tomography are discussed including the preparation of patients, both physically and mentally, for scanning. Furthermore, the author also explains how to set up and run a computed tomography department, including advice on how the room should be designed

  7. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  8. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  9. Quantum mechanics and computation

    International Nuclear Information System (INIS)

    Cirac Sasturain, J. I.

    2000-01-01

    We review how some of the basic principles of Quantum Mechanics can be used in the field of computation. In particular, we explain why a quantum computer can perform certain tasks in a much more efficient way than the computers we have available nowadays. We give the requirements for a quantum system to be able to implement a quantum computer and illustrate these requirements in some particular physical situations. (Author) 16 refs

  10. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT,J.

    2004-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security.

  11. COMPUTER GAMES AND EDUCATION

    OpenAIRE

    Sukhov, Anton

    2018-01-01

    This paper is devoted to the study of the educational resources and possibilities of modern computer games. The "internal" educational aspects of computer games include an educational mechanism (a separate or integrated "tutorial") and the representation of a real or even fantastic educational process within virtual worlds. The "external" dimension represents the educational opportunities of computer games for personal and professional development in different genres of computer games (various transport, so...

  12. Man and computer

    International Nuclear Information System (INIS)

    Fischbach, K.F.

    1981-01-01

    The discussion of cultural and sociological consequences of computer evolution is hindered by human prejudice. For example, the sentence 'a computer is at best as intelligent as its programmer' veils actual developments. Theoretical limits of computer intelligence are the limits of intelligence in general. Modern computer systems replace not only human labour, but also human decision making and thereby human responsibility. The historical situation is unique. Human head-work is being automated and man is losing function. (orig.) [de

  13. Computational physics an introduction

    CERN Document Server

    Vesely, Franz J

    1994-01-01

    Author Franz J. Vesely offers students an introductory text on computational physics, providing them with the important basic numerical/computational techniques. His unique text sets itself apart from others by focusing on specific problems of computational physics. The author also provides a selection of modern fields of research. Students will benefit from the appendixes which offer a short description of some properties of computing and machines and outline the technique of 'Fast Fourier Transformation.'

  14. Computing environment logbook

    Science.gov (United States)

    Osbourn, Gordon C; Bouchard, Ann M

    2012-09-18

    A computing environment logbook logs events occurring within a computing environment. The events are displayed as a history of past events within the logbook of the computing environment. The logbook provides search functionality to search through the history of past events to find one or more selected past events, and further, enables an undo of the one or more selected past events.

  15. The Computer Revolution.

    Science.gov (United States)

    Berkeley, Edmund C.

    "The Computer Revolution", a part of the "Second Industrial Revolution", is examined with reference to the social consequences of computers. The subject is introduced in an opening section which discusses the revolution in the handling of information and the history, powers, uses, and working s of computers. A second section examines in detail the…

  16. Advances in physiological computing

    CERN Document Server

    Fairclough, Stephen H

    2014-01-01

    This edited collection will provide an overview of the field of physiological computing, i.e. the use of physiological signals as input for computer control. It will cover a breadth of current research, from brain-computer interfaces to telemedicine.

  17. Physics of quantum computation

    International Nuclear Information System (INIS)

    Belokurov, V.V.; Khrustalev, O.A.; Sadovnichij, V.A.; Timofeevskaya, O.D.

    2003-01-01

    In the paper, the modern status of the theory of quantum computation is considered. The fundamental principles of quantum computers and their basic notions such as quantum processors and computational basis states of the quantum Turing machine as well as the quantum Fourier transform are discussed. Some possible experimental realizations on the basis of NMR methods are given

  18. Quantum walk computation

    International Nuclear Information System (INIS)

    Kendon, Viv

    2014-01-01

    Quantum versions of random walks have diverse applications that are motivating experimental implementations as well as theoretical studies. Recent results showing quantum walks are “universal for quantum computation” relate to algorithms, to be run on quantum computers. We consider whether an experimental implementation of a quantum walk could provide useful computation before we have a universal quantum computer
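
    As a rough illustration of what a discrete-time quantum walk is (not the specific results surveyed in this record), the following sketch simulates a coined walk on a line; the step count and initial coin state are arbitrary assumptions:

```python
import numpy as np

# Coined discrete-time quantum walk on a line (illustrative parameters only).
steps = 50
npos = 2 * steps + 1                             # positions -steps..steps, origin at index `steps`
psi = np.zeros((npos, 2), dtype=complex)
psi[steps] = [1 / np.sqrt(2), 1j / np.sqrt(2)]   # symmetric initial coin state

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)     # Hadamard coin

for _ in range(steps):
    psi = psi @ H.T                              # toss the coin at every site
    shifted = np.zeros_like(psi)
    shifted[:-1, 0] = psi[1:, 0]                 # coin-0 amplitude moves one site left
    shifted[1:, 1] = psi[:-1, 1]                 # coin-1 amplitude moves one site right
    psi = shifted

prob = (np.abs(psi) ** 2).sum(axis=1)
positions = np.arange(npos) - steps
spread = np.sqrt((prob * positions ** 2).sum())
print(f"standard deviation after {steps} steps: {spread:.1f}")
```

    The spread grows linearly with the number of steps, in contrast to the square-root spreading of a classical random walk; this ballistic behaviour is one reason quantum walks are of interest as a computational primitive.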

  19. The Challenge of Computers.

    Science.gov (United States)

    Leger, Guy

    Computers may change teachers' lifestyles, teaching styles, and perhaps even their personal values. A brief survey of the history of computers demonstrates the incredible pace at which computer technology is moving ahead. The cost and size of microchips will continue to decline dramatically over the next 20 years, while the capability and variety…

  20. Visitor's Computer Guidelines | CTIO

    Science.gov (United States)

  1. Medical Computational Thinking

    DEFF Research Database (Denmark)

    Musaeus, Peter; Tatar, Deborah Gail; Rosen, Michael A.

    2017-01-01

    Computational thinking (CT) in medicine means deliberating when to pursue computer-mediated solutions to medical problems and evaluating when such solutions are worth pursuing in order to assist in medical decision making. Teaching computational thinking (CT) at medical school should be aligned...

  2. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... Physician Resources Professions Site Index A-Z Computed Tomography (CT) - Head Computed tomography (CT) of the head uses special x-ray ... What is CT Scanning of the Head? Computed tomography, more commonly known as a CT or CAT ...

  3. Emission computed tomography

    International Nuclear Information System (INIS)

    Ott, R.J.

    1986-01-01

    Emission Computed Tomography is a technique used for producing single or multiple cross-sectional images of the distribution of radionuclide labelled agents in vivo. The techniques of Single Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) are described with particular regard to the function of the detectors used to produce images and the computer techniques used to build up images. (UK)

  4. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... Physician Resources Professions Site Index A-Z Computed Tomography (CT) - Sinuses Computed tomography (CT) of the sinuses uses special x-ray equipment ... story here Images × Image Gallery Patient undergoing computed tomography (CT) scan. View full size with caption Pediatric Content ...

  5. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... Physician Resources Professions Site Index A-Z Computed Tomography (CT) - Head Computed tomography (CT) of the head uses special x-ray equipment ... story here Images × Image Gallery Patient undergoing computed tomography (CT) scan. View full size with caption Pediatric Content ...

  6. Beyond the Computer Literacy.

    Science.gov (United States)

    Streibel, Michael J.; Garhart, Casey

    1985-01-01

    Describes the approach taken in an education computing course for pre- and in-service teachers. Outlines the basic operational, analytical, and evaluation skills that are emphasized in the course, suggesting that these skills go beyond the attainment of computer literacy and can assist in the effective use of computers. (ML)

  7. Computer algebra applications

    International Nuclear Information System (INIS)

    Calmet, J.

    1982-01-01

    A survey of applications based either on fundamental algorithms in computer algebra or on the use of a computer algebra system is presented. Recent work in biology, chemistry, physics, mathematics and computer science is discussed. In particular, applications in high energy physics (quantum electrodynamics), celestial mechanics and general relativity are reviewed. (Auth.)

  8. Computer-assisted instruction

    NARCIS (Netherlands)

    Voogt, J.; Fisser, P.; Wright, J.D.

    2015-01-01

    Since the early days of computer technology in education in the 1960s, it was claimed that computers can assist instructional practice and hence improve student learning. Since then computer technology has developed, and its potential for education has increased. In this article, we first discuss

  9. Designing with computational intelligence

    CERN Document Server

    Lopes, Heitor; Mourelle, Luiza

    2017-01-01

    This book discusses a number of real-world applications of computational intelligence approaches. Using various examples, it demonstrates that computational intelligence has become a consolidated methodology for automatically creating new competitive solutions to complex real-world problems. It also presents a concise and efficient synthesis of different systems using computationally intelligent techniques.

  10. A new computing principle

    International Nuclear Information System (INIS)

    Fatmi, H.A.; Resconi, G.

    1988-01-01

    In 1954 while reviewing the theory of communication and cybernetics the late Professor Dennis Gabor presented a new mathematical principle for the design of advanced computers. During our work on these computers it was found that the Gabor formulation can be further advanced to include more recent developments in Lie algebras and geometric probability, giving rise to a new computing principle

  11. Computers and Information Flow.

    Science.gov (United States)

    Patrick, R. L.

    This paper is designed to fill the need for an easily understood introduction to the computing and data processing field for the layman who has, or can expect to have, some contact with it. Information provided includes the unique terminology and jargon of the field, the various types of computers and the scope of computational capabilities, and…

  12. Computer naratology: narrative templates in computer games

    OpenAIRE

    Praks, Vítězslav

    2009-01-01

    The relations and interactions between literature and computer games are examined. The study contains a theoretical analysis of the game as an aesthetic artefact. To play a game means to leave the practical world for the sake of a fictional world. Artistic communication has more similarities with game communication than with normal, practical communication. Game study can help us understand basic concepts of artistic communication (game rules - poetic rules, game world - fiction, function in game - meaning in art). Compute...

  13. Neural Computation and the Computational Theory of Cognition

    Science.gov (United States)

    Piccinini, Gualtiero; Bahar, Sonya

    2013-01-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism--neural processes are computations in the…

  14. Optical Doppler tomography based on a field programmable gate array

    DEFF Research Database (Denmark)

    Larsen, Henning Engelbrecht; Nilsson, Ronnie Thorup; Thrane, Lars

    2008-01-01

    We report the design of and results obtained by using a field programmable gate array (FPGA) to digitally process optical Doppler tomography signals. The processor fits into the analog signal path in an existing optical coherence tomography setup. We demonstrate both Doppler frequency and envelope extraction using the Hilbert transform, all in a single FPGA. An FPGA implementation has certain advantages over a general purpose digital signal processor (DSP) due to the fact that the processing elements operate in parallel, as opposed to the DSP, which is primarily a sequential processor.
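
    The envelope and Doppler-frequency extraction mentioned above can be reproduced offline in a few lines; the sketch below (NumPy/SciPy, with invented signal parameters) only illustrates the Hilbert-transform step, not the FPGA implementation described in the paper:

```python
import numpy as np
from scipy.signal import hilbert

# Toy interference signal: the sampling rate, Doppler shift and modulation are assumptions.
fs = 100_000.0                                  # sampling rate [Hz]
t = np.arange(0, 0.01, 1.0 / fs)
f_doppler = 2_000.0                             # simulated Doppler shift [Hz]
x = (1.0 + 0.5 * np.cos(2 * np.pi * 100.0 * t)) * np.cos(2 * np.pi * f_doppler * t)

analytic = hilbert(x)                           # analytic signal x(t) + i*H{x}(t)
envelope = np.abs(analytic)                     # envelope extraction
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) * fs / (2 * np.pi)   # instantaneous (Doppler) frequency

print(f"mean recovered Doppler frequency: {inst_freq.mean():.0f} Hz")
```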

  15. Microcomputed tomography-based assessment of retrieved dental implants

    NARCIS (Netherlands)

    Narra, N.; Antalainen, A.K.; Zipprich, H.; Sándor, G.K.; Wolff, J.

    2015-01-01

    Purpose: The aim of this study was to demonstrate the potential of microcomputed tomography (micro-CT) technology in the assessment of retrieved dental implants. Cases are presented to illustrate the value of micro-CT imaging techniques in determining possible mechanical causes for dental implant

  16. Projection model for flame chemiluminescence tomography based on lens imaging

    Science.gov (United States)

    Wan, Minggang; Zhuang, Jihui

    2018-04-01

    For flame chemiluminescence tomography (FCT) based on lens imaging, the projection model is essential because it formulates the mathematical relation between the flame projections captured by cameras and the chemiluminescence field, and, through this relation, the field is reconstructed. This work proposed the blurry-spot (BS) model, which takes more universal assumptions and has higher accuracy than the widely applied line-of-sight model. By combining the geometrical camera model and the thin-lens equation, the BS model takes into account perspective effect of the camera lens; by combining ray-tracing technique and Monte Carlo simulation, it also considers inhomogeneous distribution of captured radiance on the image plane. Performance of these two models in FCT was numerically compared, and results showed that using the BS model could lead to better reconstruction quality in wider application ranges.
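
    As a minimal reminder of the thin-lens relation that such lens-imaging projection models build on (this is not the BS model itself; the focal length and distances below are arbitrary assumptions):

```python
# Thin-lens relation 1/f = 1/d_o + 1/d_i, as used when modelling a camera lens.
def image_distance(f: float, d_object: float) -> float:
    """Image-side distance d_i for focal length f and object distance d_o (same units)."""
    return 1.0 / (1.0 / f - 1.0 / d_object)

f_mm = 50.0        # assumed focal length [mm]
d_o_mm = 500.0     # assumed object distance [mm]
d_i_mm = image_distance(f_mm, d_o_mm)
print(f"image distance = {d_i_mm:.1f} mm, magnification = {d_i_mm / d_o_mm:.3f}")
```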

  17. Hyperspectral tomography based on multi-mode absorption spectroscopy (MUMAS)

    Science.gov (United States)

    Dai, Jinghang; O'Hagan, Seamus; Liu, Hecong; Cai, Weiwei; Ewart, Paul

    2017-10-01

    This paper demonstrates a hyperspectral tomographic technique that can recover the temperature and concentration field of gas flows based on multi-mode absorption spectroscopy (MUMAS). This method relies on the recently proposed concept of nonlinear tomography, which can take full advantage of the nonlinear dependency of MUMAS signals on temperature and enables 2D spatial resolution of MUMAS, which is naturally a line-of-sight technique. The principles of MUMAS and nonlinear tomography, as well as the mathematical formulation of the inversion problem, are introduced. Proof-of-concept numerical demonstrations are presented using representative flame phantoms and assuming typical laser parameters. The results show that faithful reconstruction of the temperature distribution is achievable when a signal-to-noise ratio of 20 is assumed. This method can potentially be extended to simultaneously reconstructing distributions of temperature and the concentration of multiple flame species.

  18. Comparative micro computed tomography study of a vertebral body

    Science.gov (United States)

    Drews, Susanne; Beckmann, Felix; Herzen, Julia; Brunke, Oliver; Salmon, Phil; Friess, Sebastian; Laib, Andres; Koller, Bruno; Hemberger, Thomas; Müller-Gerbl, Magdalena; Müller, Bert

    2008-08-01

    Investigations of bony tissues are often performed using micro computed tomography based on X-rays, since the calcium distribution leads to superior contrast. Osteoporotic bone, for example, can be compared well with healthy bone with respect to density and morphology. Degenerative and rheumatoid diseases, however, usually start at the bone-cartilage interface, which is hardly accessible. The direct influence on the bone itself becomes visible only at a later stage. For the development of suitable therapies against degenerative cartilage damage, the exact three-dimensional description of the bone-cartilage interface is vital, as demonstrated for transplanted cartilage cells or bone-cartilage constructs in animal models. So far, the morphological characterization was restricted to magnetic resonance imaging (MRI) with poor spatial resolution or to time-consuming histological sectioning with appropriate spatial resolution only in two rather arbitrarily chosen directions. Therefore, one should develop μCT to extract the features of low-absorbing cartilage. The morphology and the volume of the intervertebral cartilage disc of lumbar motion segments have been determined for one PMMA-embedded specimen. Tomograms were recorded using nanotom® (Phoenix|x-ray, Wunstorf, Germany), μCT 35TM (Scanco Medical, Brütisellen, Switzerland), 1172TM and 1174TM (both Skyscan, Kontich, Belgium), as well as using the SRμCT at HASYLAB/DESY. Conventional and SRμCT can provide the morphology and the volume of cartilage between bones. With increasing acquisition time the signal-to-noise ratio improves, but the prominent artifacts in conventional μCT resulting from inhomogeneously distributed bony tissue prevent the exact segmentation of cartilage. SRμCT allows segmenting the cartilage but requires long periods of expensive beam time to obtain reasonable contrast.

  19. Quantum computing and spintronics

    International Nuclear Information System (INIS)

    Kantser, V.

    2007-01-01

    Attempts to build a computer that can operate according to quantum laws have led to the concepts of quantum computing algorithms and hardware. In this review we highlight recent developments which point the way to quantum computing on the basis of solid state nanostructures, after some general considerations concerning quantum information science and introducing a set of basic requirements for any quantum computer proposal. One of the major directions of research on the way to quantum computing is to exploit the spin (in addition to the orbital) degree of freedom of the electron, giving birth to the field of spintronics. We address a semiconductor approach based on spin-orbit coupling in semiconductor nanostructures. (authors)

  20. Theory of computation

    CERN Document Server

    Tourlakis, George

    2012-01-01

    Learn the skills and acquire the intuition to assess the theoretical limitations of computer programming Offering an accessible approach to the topic, Theory of Computation focuses on the metatheory of computing and the theoretical boundaries between what various computational models can do and not do—from the most general model, the URM (Unbounded Register Machines), to the finite automaton. A wealth of programming-like examples and easy-to-follow explanations build the general theory gradually, which guides readers through the modeling and mathematical analysis of computational pheno

  1. Computer Security Handbook

    CERN Document Server

    Bosworth, Seymour; Whyne, Eric

    2012-01-01

    The classic and authoritative reference in the field of computer security, now completely updated and revised With the continued presence of large-scale computers; the proliferation of desktop, laptop, and handheld computers; and the vast international networks that interconnect them, the nature and extent of threats to computer security have grown enormously. Now in its fifth edition, Computer Security Handbook continues to provide authoritative guidance to identify and to eliminate these threats where possible, as well as to lessen any losses attributable to them. With seventy-seven chapter

  2. Secure cloud computing

    CERN Document Server

    Jajodia, Sushil; Samarati, Pierangela; Singhal, Anoop; Swarup, Vipin; Wang, Cliff

    2014-01-01

    This book presents a range of cloud computing security challenges and promising solution paths. The first two chapters focus on practical considerations of cloud computing. In Chapter 1, Chandramouli, Iorga, and Chokani describe the evolution of cloud computing and the current state of practice, followed by the challenges of cryptographic key management in the cloud. In Chapter 2, Chen and Sion present a dollar cost model of cloud computing and explore the economic viability of cloud computing with and without security mechanisms involving cryptographic mechanisms. The next two chapters addres

  3. Scalable optical quantum computer

    Energy Technology Data Exchange (ETDEWEB)

    Manykin, E A; Mel' nichenko, E V [Institute for Superconductivity and Solid-State Physics, Russian Research Centre ' Kurchatov Institute' , Moscow (Russian Federation)

    2014-12-31

    A way of designing a scalable optical quantum computer based on the photon echo effect is proposed. Individual rare earth ions Pr³⁺, regularly located in the lattice of the orthosilicate (Y₂SiO₅) crystal, are suggested to be used as optical qubits. Operations with qubits are performed using coherent and incoherent laser pulses. The operation protocol includes both the method of measurement-based quantum computations and the technique of optical computations. Modern hybrid photon echo protocols, which provide a sufficient quantum efficiency when reading recorded states, are considered as most promising for quantum computations and communications. (quantum computer)

  4. Computing meaning v.4

    CERN Document Server

    Bunt, Harry; Pulman, Stephen

    2013-01-01

    This book is a collection of papers by leading researchers in computational semantics. It presents a state-of-the-art overview of recent and current research in computational semantics, including descriptions of new methods for constructing and improving resources for semantic computation, such as WordNet, VerbNet, and semantically annotated corpora. It also presents new statistical methods in semantic computation, such as the application of distributional semantics in the compositional calculation of sentence meanings. Computing the meaning of sentences, texts, and spoken or texted dialogue i
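
    As a toy illustration of the distributional, compositional approach mentioned in this record (the word vectors below are invented for the example and are not drawn from any of the resources the book describes):

```python
import numpy as np

# Invented 4-dimensional "word vectors"; additive composition of a two-word phrase
# and cosine similarity as a crude proxy for meaning overlap.
vectors = {
    "quantum":  np.array([0.9, 0.1, 0.0, 0.2]),
    "computer": np.array([0.8, 0.0, 0.3, 0.1]),
    "physics":  np.array([0.7, 0.2, 0.0, 0.4]),
}

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

phrase = vectors["quantum"] + vectors["computer"]      # simple additive composition
print("similarity(quantum computer, physics):",
      round(cosine(phrase, vectors["physics"]), 3))
```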

  5. Scalable optical quantum computer

    International Nuclear Information System (INIS)

    Manykin, E A; Mel'nichenko, E V

    2014-01-01

    A way of designing a scalable optical quantum computer based on the photon echo effect is proposed. Individual rare earth ions Pr³⁺, regularly located in the lattice of the orthosilicate (Y₂SiO₅) crystal, are suggested to be used as optical qubits. Operations with qubits are performed using coherent and incoherent laser pulses. The operation protocol includes both the method of measurement-based quantum computations and the technique of optical computations. Modern hybrid photon echo protocols, which provide a sufficient quantum efficiency when reading recorded states, are considered as most promising for quantum computations and communications. (quantum computer)

  6. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2005-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security. To achieve our goals we have established a close alliance with applied mathematicians and computer scientists at Stony Brook and Columbia Universities.

  7. Computer algebra and operators

    Science.gov (United States)

    Fateman, Richard; Grossman, Robert

    1989-01-01

    The symbolic computation of operator expansions is discussed. Some of the capabilities that prove useful when performing computer algebra computations involving operators are considered. These capabilities may be broadly divided into three areas: the algebraic manipulation of expressions from the algebra generated by operators; the algebraic manipulation of the actions of the operators upon other mathematical objects; and the development of appropriate normal forms and simplification algorithms for operators and their actions. Brief descriptions are given of the computer algebra computations that arise when working with various operators and their actions.

  8. Cloud Computing Bible

    CERN Document Server

    Sosinsky, Barrie

    2010-01-01

    The complete reference guide to the hot technology of cloud computing. Its potential for lowering IT costs makes cloud computing a major force for both IT vendors and users; it is expected to gain momentum rapidly with the launch of Office Web Apps later this year. Because cloud computing involves various technologies, protocols, platforms, and infrastructure elements, this comprehensive reference is just what you need if you'll be using or implementing cloud computing. Cloud computing offers significant cost savings by eliminating upfront expenses for hardware and software; its growing popularit

  9. Design of Computer Experiments

    DEFF Research Database (Denmark)

    Dehlendorff, Christian

    The main topic of this thesis is design and analysis of computer and simulation experiments and is dealt with in six papers and a summary report. Simulation and computer models have in recent years received increasingly more attention due to their increasing complexity and usability. Software packages make the development of rather complicated computer models using predefined building blocks possible. This implies that the range of phenomena that are analyzed by means of a computer model has expanded significantly. As the complexity grows, so does the need for efficient experimental designs and analysis methods, since complex computer models are often expensive to use in terms of computer time. The choice of performance parameter is an important part of the analysis of computer and simulation models, and Paper A introduces a new statistic for waiting times in health care units. The statistic...

  10. Computer in radiology

    International Nuclear Information System (INIS)

    Kuesters, H.

    1985-01-01

    With this publication, the author presents the requirements that user-specific software should fulfill to achieve an effective practice rationalisation through computer usage, and the hardware configuration necessary as basic equipment. This should make it more difficult in the future for sales representatives to sell radiologists unusable computer systems. Furthermore, questions shall be answered that were asked by computer-interested radiologists during the system presentation. On the one hand there still exists a prejudice against programmes of standard texts, and on the other hand undefined fears that handling a computer is too difficult and that one has to learn a computer language first to be able to work with computers. Finally, it is pointed out that real competitive advantages can be obtained through computer usage. (orig.) [de

  11. Programming in biomolecular computation

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Our goal is to provide a top-down approach to biomolecular computation. In spite of widespread discussion about connections between biology and computation, one question seems notable by its absence: Where are the programs? We identify a number of common features in programming that seem conspicuously absent from the literature on biomolecular computing; to partially redress this absence, we introduce a model of computation that is evidently programmable, by programs reminiscent of low-level computer machine code; and at the same time biologically plausible: its functioning is defined by a single and relatively small set of chemical-like reaction rules. Further properties: the model is stored-program: programs are the same as data, so programs are not only executable, but are also compilable and interpretable. It is universal: all computable functions can be computed (in natural ways...

  12. Computer assisted radiology

    International Nuclear Information System (INIS)

    Lemke, H.U.; Jaffe, C.C.; Felix, R.

    1993-01-01

    The proceedings of the CAR'93 symposium present the 126 oral papers and the 58 posters contributed to the four Technical Sessions entitled: (1) Image Management, (2) Medical Workstations, (3) Digital Image Generation - DIG, and (4) Application Systems - AS. Topics discussed in Session (1) are: picture archiving and communication systems, teleradiology, hospital information systems and radiological information systems, technology assessment and implications, standards, and data bases. Session (2) deals with computer vision, computer graphics, design and application, and man-computer interaction. Session (3) goes into the details of the diagnostic examination methods such as digital radiography, MRI, CT, nuclear medicine, ultrasound, digital angiography, and multimodality imaging. Session (4) is devoted to computer-assisted techniques, such as computer-assisted radiological diagnosis, knowledge-based systems, computer-assisted radiation therapy and computer-assisted surgical planning. (UWA). 266 figs [de

  13. DCE. Future IHEP's computing environment

    International Nuclear Information System (INIS)

    Zheng Guorui; Liu Xiaoling

    1995-01-01

    IHEP's computing environment consists of several different computing environments established on IHEP computer networks, of which the BES environment supporting HEP computing is the main part. In connection with the improvement and extension of the BES environment, the authors outline the development of these computing environments from the viewpoint of establishing a high energy physics (HEP) environment. The direction of development toward distributed computing for the IHEP computing environment, based on current trends in distributed computing, is presented

  14. Natural Computing in Computational Finance Volume 4

    CERN Document Server

    O’Neill, Michael; Maringer, Dietmar

    2012-01-01

    This book follows on from Natural Computing in Computational Finance Volumes I, II and III. As in the previous volumes of this series, the book consists of a series of chapters, each of which was selected following a rigorous, peer-reviewed selection process. The chapters illustrate the application of a range of cutting-edge natural computing and agent-based methodologies in computational finance and economics. The applications explored include option model calibration, financial trend reversal detection, enhanced indexation, algorithmic trading, corporate payout determination and agent-based modeling of liquidity costs, and trade strategy adaptation. While describing cutting edge applications, the chapters are written so that they are accessible to a wide audience. Hence, they should be of interest to academics, students and practitioners in the fields of computational finance and economics.

  15. Computational Biology and High Performance Computing 2000

    Energy Technology Data Exchange (ETDEWEB)

    Simon, Horst D.; Zorn, Manfred D.; Spengler, Sylvia J.; Shoichet, Brian K.; Stewart, Craig; Dubchak, Inna L.; Arkin, Adam P.

    2000-10-19

    The pace of extraordinary advances in molecular biology has accelerated in the past decade due in large part to discoveries coming from genome projects on human and model organisms. The advances in the genome project so far, happening well ahead of schedule and under budget, have exceeded any dreams by its protagonists, let alone formal expectations. Biologists expect the next phase of the genome project to be even more startling in terms of dramatic breakthroughs in our understanding of human biology, the biology of health and of disease. Only today can biologists begin to envision the necessary experimental, computational and theoretical steps necessary to exploit genome sequence information for its medical impact, its contribution to biotechnology and economic competitiveness, and its ultimate contribution to environmental quality. High performance computing has become one of the critical enabling technologies, which will help to translate this vision of future advances in biology into reality. Biologists are increasingly becoming aware of the potential of high performance computing. The goal of this tutorial is to introduce the exciting new developments in computational biology and genomics to the high performance computing community.

  16. COMPUTER-ASSISTED ACCOUNTING

    Directory of Open Access Journals (Sweden)

    SORIN-CIPRIAN TEIUŞAN

    2009-01-01

    Full Text Available What is computer-assisted accounting? Where is the place and what is the role of the computer in the financial-accounting activity? What is the position and importance of the computer in the accountant's activity? All these are questions that require scientific research in order to find the answers. The paper approaches the issue of the support granted to the accountant by the computer to organize and manage the accounting activity. Starting from the notions of accounting and computer, the concept of computer-assisted accounting is introduced; it has a general character and refers to accounting performed with the help of the computer, or to using the computer to automate the procedures performed by the person doing the accounting; it is a concept used to define the computer applications of the accounting activity. The arguments regarding the use of the computer to assist accounting target the informatization of accounting, the automation of financial-accounting activities, and the endowment of contemporary accounting with modern technology.

  17. Quantum analogue computing.

    Science.gov (United States)

    Kendon, Vivien M; Nemoto, Kae; Munro, William J

    2010-08-13

    We briefly review what a quantum computer is, what it promises to do for us and why it is so hard to build one. Among the first applications anticipated to bear fruit is the quantum simulation of quantum systems. While most quantum computation is an extension of classical digital computation, quantum simulation differs fundamentally in how the data are encoded in the quantum computer. To perform a quantum simulation, the Hilbert space of the system to be simulated is mapped directly onto the Hilbert space of the (logical) qubits in the quantum computer. This type of direct correspondence is how data are encoded in a classical analogue computer. There is no binary encoding, and increasing precision becomes exponentially costly: an extra bit of precision doubles the size of the computer. This has important consequences for both the precision and error-correction requirements of quantum simulation, and significant open questions remain about its practicality. It also means that the quantum version of analogue computers, continuous-variable quantum computers, becomes an equally efficient architecture for quantum simulation. Lessons from past use of classical analogue computers can help us to build better quantum simulators in future.

  18. COMPUTATIONAL SCIENCE CENTER

    International Nuclear Information System (INIS)

    DAVENPORT, J.

    2006-01-01

    Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to bring together

  19. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2006-11-01

    Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to

  20. Computation as Medium

    DEFF Research Database (Denmark)

    Jochum, Elizabeth Ann; Putnam, Lance

    2017-01-01

    Artists increasingly utilize computational tools to generate art works. Computational approaches to art making open up new ways of thinking about agency in interactive art because they invite participation and allow for unpredictable outcomes. Computational art is closely linked to the participatory turn in visual art, wherein spectators physically participate in visual art works. Unlike purely physical methods of interaction, computer-assisted interactivity affords artists and spectators more nuanced control of artistic outcomes. Interactive art brings together human bodies, computer code, and nonliving objects to create emergent art works. Computation is more than just a tool for artists; it is a medium for investigating new aesthetic possibilities for choreography and composition. We illustrate this potential through two artistic projects: an improvisational dance performance between a human...

  1. Introduction to morphogenetic computing

    CERN Document Server

    Resconi, Germano; Xu, Guanglin

    2017-01-01

    This book offers a concise introduction to morphogenetic computing, showing that its use makes global and local relations, defects in crystal non-Euclidean geometry databases with source and sink, genetic algorithms, and neural networks more stable and efficient. It also presents applications to database, language, nanotechnology with defects, biological genetic structure, electrical circuit, and big data structure. In Turing machines, input and output states form a system – when the system is in one state, the input is transformed into output. This computation is always deterministic and without any possible contradiction or defects. In natural computation there are defects and contradictions that have to be solved to give a coherent and effective computation. The new computation generates the morphology of the system that assumes different forms in time. Genetic process is the prototype of the morphogenetic computing. At the Boolean logic truth value, we substitute a set of truth (active sets) values with...

  2. The CMS Computing Model

    International Nuclear Information System (INIS)

    Bonacorsi, D.

    2007-01-01

    The CMS experiment at LHC has developed a baseline Computing Model addressing the needs of a computing system capable to operate in the first years of LHC running. It is focused on a data model with heavy streaming at the raw data level based on trigger, and on the achievement of the maximum flexibility in the use of distributed computing resources. The CMS distributed Computing Model includes a Tier-0 centre at CERN, a CMS Analysis Facility at CERN, several Tier-1 centres located at large regional computing centres, and many Tier-2 centres worldwide. The workflows have been identified, along with a baseline architecture for the data management infrastructure. This model is also being tested in Grid Service Challenges of increasing complexity, coordinated with the Worldwide LHC Computing Grid community

  3. Introduction to reversible computing

    CERN Document Server

    Perumalla, Kalyan S

    2013-01-01

    Few books comprehensively cover the software and programming aspects of reversible computing. Filling this gap, Introduction to Reversible Computing offers an expanded view of the field that includes the traditional energy-motivated hardware viewpoint as well as the emerging application-motivated software approach. Collecting scattered knowledge into one coherent account, the book provides a compendium of both classical and recently developed results on reversible computing. It explores up-and-coming theories, techniques, and tools for the application of rever

  4. Tracking and computing

    International Nuclear Information System (INIS)

    Niederer, J.

    1983-01-01

    This note outlines several ways in which large scale simulation computing and programming support may be provided to the SSC design community. One aspect of the problem is getting supercomputer power without the high cost and long lead times of large scale institutional computing. Another aspect is the blending of modern programming practices with more conventional accelerator design programs in ways that do not also swamp designers with the details of complicated computer technology

  5. Computing at Belle II

    International Nuclear Information System (INIS)

    Kuhr, Thomas

    2012-01-01

    Belle II, a next-generation B-factory experiment, will search for new physics effects in a data sample about 50 times larger than the one collected by its predecessor, the Belle experiment. To match the advances in accelerator and detector technology, the computing system and the software have to be upgraded as well. The Belle II computing model is presented and an overview of the distributed computing system and the offline software framework is given.

  6. Computing Conference at Bologna

    International Nuclear Information System (INIS)

    Anon.

    1980-01-01

    From 9-12 September a Europhysics Conference on Computing in High Energy and Nuclear Physics, organized by the Computational Physics Group of the European Physical Society, was held in Bologna, attracting some 150 participants. Its purpose was contact and exchange of information between experimental physicists (from both fields of research) and computer experts (on whom the successful outcome of the research has become increasingly dependent)

  7. Review on Computational Electromagnetics

    Directory of Open Access Journals (Sweden)

    P. Sumithra

    2017-03-01

    Full Text Available Computational electromagnetics (CEM) is applied to model the interaction of electromagnetic fields with objects such as antennas, waveguides and aircraft, and their environment, using Maxwell's equations. In this paper the strengths and weaknesses of various computational electromagnetic techniques are discussed. The performance of these techniques in terms of accuracy, memory and computational time for application-specific tasks such as modelling RCS (radar cross section), space applications, thin wires and antenna arrays is presented.

  8. CAD on personal computers

    International Nuclear Information System (INIS)

    Lee, Seong U; Cho, Cheol Ho; Ko, Il Du

    1990-02-01

    This book contains four studies of CAD on personal computers. The first covers computer graphics in computer-aided design, by Seong U Lee. The second is a graphics primer and programming with Fortran, by Seong U Lee. The third is the application of AutoCAD, by Il Do Ko. The last is the application of CAD in building construction design, by Cheol Ho Cho.

  9. Computational movement analysis

    CERN Document Server

    Laube, Patrick

    2014-01-01

    This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi

  10. Computational neurogenetic modeling

    CERN Document Server

    Benuskova, Lubica

    2010-01-01

    Computational Neurogenetic Modeling is a student text, introducing the scope and problems of a new scientific discipline - Computational Neurogenetic Modeling (CNGM). CNGM is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This new area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biol

  11. Research in computer forensics

    OpenAIRE

    Wai, Hor Cheong

    2002-01-01

    Approved for public release; distribution is unlimited. Computer Forensics involves the preservation, identification, extraction and documentation of computer evidence stored in the form of magnetically encoded information. With the proliferation of E-commerce initiatives and the increasing criminal activities on the web, this area of study is catching on in the IT industry and among the law enforcement agencies. The objective of the study is to explore the techniques of computer forensics ...

  12. Research in computer science

    Science.gov (United States)

    Ortega, J. M.

    1986-01-01

    Various graduate research activities in the field of computer science are reported. Among the topics discussed are: (1) failure probabilities in multi-version software; (2) Gaussian Elimination on parallel computers; (3) three dimensional Poisson solvers on parallel/vector computers; (4) automated task decomposition for multiple robot arms; (5) multi-color incomplete cholesky conjugate gradient methods on the Cyber 205; and (6) parallel implementation of iterative methods for solving linear equations.

  13. Computer information systems framework

    International Nuclear Information System (INIS)

    Shahabuddin, S.

    1989-01-01

    Management information systems (MIS) is a commonly used term in the computer profession. The new information technology has caused management to expect more from the computer. The process of supplying information follows a well defined procedure. MIS should be capable of providing usable information to the various areas and levels of an organization. MIS is different from data processing. MIS and the business hierarchy provide a good framework for many organizations which are using computers. (A.B.)

  14. Human Computer Music Performance

    OpenAIRE

    Dannenberg, Roger B.

    2012-01-01

    Human Computer Music Performance (HCMP) is the study of music performance by live human performers and real-time computer-based performers. One goal of HCMP is to create a highly autonomous artificial performer that can fill the role of a human, especially in a popular music setting. This will require advances in automated music listening and understanding, new representations for music, techniques for music synchronization, real-time human-computer communication, music generation, sound synt...

  15. Intelligent distributed computing

    CERN Document Server

    Thampi, Sabu

    2015-01-01

    This book contains a selection of refereed and revised papers of the Intelligent Distributed Computing Track originally presented at the third International Symposium on Intelligent Informatics (ISI-2014), September 24-27, 2014, Delhi, India.  The papers selected for this Track cover several Distributed Computing and related topics including Peer-to-Peer Networks, Cloud Computing, Mobile Clouds, Wireless Sensor Networks, and their applications.

  16. Genomics With Cloud Computing

    OpenAIRE

    Sukhamrit Kaur; Sandeep Kaur

    2015-01-01

    Abstract Genomics is the study of the genome, which produces large amounts of data and therefore requires large storage and computation power. These issues are addressed by cloud computing, which provides various cloud platforms for genomics. These platforms provide many services to users, such as easy access to data, easy sharing and transfer, storage in the hundreds of terabytes, and more computational power. Some cloud platforms are Google Genomics, DNAnexus and Globus Genomics. Various features of cloud computin...

  17. Parallel computing works!

    CERN Document Server

    Fox, Geoffrey C; Messina, Guiseppe C

    2014-01-01

    A clear illustration of how parallel computers can be successfully applied to large-scale scientific computations. This book demonstrates how a variety of applications in physics, biology, mathematics and other sciences were implemented on real parallel computers to produce new scientific results. It investigates issues of fine-grained parallelism relevant for future supercomputers with particular emphasis on hypercube architecture. The authors describe how they used an experimental approach to configure different massively parallel machines, design and implement basic system software, and develop

  18. Computer science I essentials

    CERN Document Server

    Raus, Randall

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Computer Science I includes fundamental computer concepts, number representations, Boolean algebra, switching circuits, and computer architecture.

  19. Discrete computational structures

    CERN Document Server

    Korfhage, Robert R

    1974-01-01

    Discrete Computational Structures describes discrete mathematical concepts that are important to computing, covering necessary mathematical fundamentals, computer representation of sets, graph theory, storage minimization, and bandwidth. The book also explains conceptual framework (Gorn trees, searching, subroutines) and directed graphs (flowcharts, critical paths, information network). The text's discussion of algebra concentrates on semigroups, groups, lattices, and propositional calculus, including a new tabular method of Boolean function minimization. The text emphasize

  20. Cloud Computing: An Overview

    Science.gov (United States)

    Qian, Ling; Luo, Zhiguo; Du, Yujian; Guo, Leitao

    In order to support the maximum number of users and elastic services with the minimum resources, Internet service providers invented cloud computing. Within a few years, the emerging technology became the hottest in the industry. From the publication of Google's core papers starting in 2003, to the commercialization of Amazon EC2 in 2006, and to the service offering of AT&T Synaptic Hosting, cloud computing has evolved from internal IT systems to public services, from cost-saving tools to revenue generators, and from ISPs to telecoms. This paper introduces the concept, history, and pros and cons of cloud computing, as well as the value chain and standardization efforts.

  1. Computational mathematics in China

    CERN Document Server

    Shi, Zhong-Ci

    1994-01-01

    This volume describes the most significant contributions made by Chinese mathematicians over the past decades in various areas of computational mathematics. Some of the results are quite important and complement Western developments in the field. The contributors to the volume range from noted senior mathematicians to promising young researchers. The topics include finite element methods, computational fluid mechanics, numerical solutions of differential equations, computational methods in dynamical systems, numerical algebra, approximation, and optimization. Containing a number of survey articles, the book provides an excellent way for Western readers to gain an understanding of the status and trends of computational mathematics in China.

  2. Multidisciplinary Computational Research

    National Research Council Canada - National Science Library

    Visbal, Miguel R

    2006-01-01

    The purpose of this work is to develop advanced multidisciplinary numerical simulation capabilities for aerospace vehicles with emphasis on highly accurate, massively parallel computational methods...

  3. Frontiers in Computer Education

    CERN Document Server

    Zhu, Egui; 2011 International Conference on Frontiers in Computer Education (ICFCE 2011)

    2012-01-01

    This book is the proceedings of the 2011 International Conference on Frontiers in Computer Education (ICFCE 2011) in Sanya, China, December 1-2, 2011. The contributions can be useful for researchers, software engineers, and programmers, all interested in promoting the computer and education development. Topics covered are computing and communication technology, network management, wireless networks, telecommunication, Signal and Image Processing, Machine Learning, educational management, educational psychology, educational system, education engineering, education technology and training.  The emphasis is on methods and calculi for computer science and education technology development, verification and verification tools support, experiences from doing developments, and the associated theoretical problems.

  4. Computers appreciated by marketers

    International Nuclear Information System (INIS)

    Mantho, M.

    1993-01-01

    The computer has been worth its weight in gold to the fuel-oil man. In fact, with falling prices on both software and machines, the worth is greater than gold. Every so often, about every three years, we ask some questions about the utilization of computers. This time, we looked into the future, to find out the acceptance of other marvels such as the cellular phone and the hand-held computer. At the moment, there isn't much penetration. Contact by two-way radio as well as computing meters on trucks still reign supreme

  5. Genomics With Cloud Computing

    Directory of Open Access Journals (Sweden)

    Sukhamrit Kaur

    2015-04-01

    Full Text Available Abstract Genomics is the study of the genome, which produces large amounts of data and therefore requires large storage and computation power. These issues are addressed by cloud computing, which provides various cloud platforms for genomics. These platforms provide many services to users, such as easy access to data, easy sharing and transfer, storage in the hundreds of terabytes, and more computational power. Some cloud platforms are Google Genomics, DNAnexus and Globus Genomics. Features that cloud computing brings to genomics include easy access to and sharing of data, data security, and lower cost for resources, but there are still some drawbacks, such as the long time needed to transfer data and limited network bandwidth.

  6. Computer Games and Art

    Directory of Open Access Journals (Sweden)

    Anton Sukhov

    2015-10-01

    Full Text Available This article is devoted to the search for relevant sources (primary and secondary) and characteristics of computer games that allow them to be included in the field of art (such as the creation of artistic games, computer graphics, active interaction with other forms of art, signs of a spiritual aesthetic act, the distinct temporality of computer games, “aesthetic illusion”, and interactivity). In general, modern computer games can be attributed to commercial art and popular culture (blockbuster games) and to elite forms of contemporary media art (author’s games, visionary games).

  7. Computers in engineering. 1988

    International Nuclear Information System (INIS)

    Tipnis, V.A.; Patton, E.M.

    1988-01-01

    These proceedings discuss the following subjects: Knowledge base systems; Computers in designing; uses of artificial intelligence; engineering optimization and expert systems of accelerators; and parallel processing in designing

  8. Numbers and computers

    CERN Document Server

    Kneusel, Ronald T

    2015-01-01

    This is a book about numbers and how those numbers are represented in and operated on by computers. It is crucial that developers understand this area because the numerical operations allowed by computers, and the limitations of those operations, especially in the area of floating point math, affect virtually everything people try to do with computers. This book aims to fill this gap by exploring, in sufficient but not overwhelming detail, just what it is that computers do with numbers. Divided into two parts, the first deals with standard representations of integers and floating point numb
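
    As a minimal illustration of the kind of floating-point limitation described above (an illustrative sketch, not taken from the book), the following Python snippet shows that binary floating point cannot represent many decimal fractions exactly, while integer arithmetic stays exact:

        import math

        # 0.1 and 0.2 have no exact binary representation, so the sum is slightly off.
        a = 0.1 + 0.2
        print(a)                      # 0.30000000000000004
        print(a == 0.3)               # False: exact comparison of floats is fragile
        print(math.isclose(a, 0.3))   # True: tolerance-based comparison is the usual remedy

        # Python integers are arbitrary precision, so large integer arithmetic stays exact.
        print(2**64 + 1)              # 18446744073709551617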

  9. Octopus: LLL's computing utility

    International Nuclear Information System (INIS)

    Anon.

    1978-01-01

    The Laboratory's Octopus network constitutes one of the greatest concentrations of computing power in the world. This power derives from the network's organization as well as from the size and capability of its computers, storage media, input/output devices, and communication channels. Being in a network enables these facilities to work together to form a unified computing utility that is accessible on demand directly from the users' offices. This computing utility has made a major contribution to the pace of research and development at the Laboratory; an adequate rate of progress in research could not be achieved without it. 4 figures

  10. Theory and Computation

    Data.gov (United States)

    Federal Laboratory Consortium — Flexible computational infrastructure, software tools and theoretical consultation are provided to support modeling and understanding of the structure and properties...

  11. Educational Computer Utilization and Computer Communications.

    Science.gov (United States)

    Singh, Jai P.; Morgan, Robert P.

    As part of an analysis of educational needs and telecommunications requirements for future educational satellite systems, three studies were carried out. 1) The role of the computer in education was examined and both current status and future requirements were analyzed. Trade-offs between remote time sharing and remote batch process were explored…

  12. Computer Aided Mathematics

    DEFF Research Database (Denmark)

    Sinclair, Robert

    1998-01-01

    Course notes of a PhD course held in 1998. The central idea is to introduce students to computational mathematics using object oriented programming in C++.

  13. Computer Network Operations Methodology

    Science.gov (United States)

    2004-03-01

    means of their computer information systems. Disrupt - This type of attack focuses on disrupting as “attackers might surreptitiously reprogram enemy...by reprogramming the computers that control distribution within the power grid. A disruption attack introduces disorder and inhibits the effective...between commanders. The use of methodologies is widespread and done subconsciously to assist individuals in decision making. The processes that

  14. Classroom Computer Network.

    Science.gov (United States)

    Lent, John

    1984-01-01

    This article describes a computer network system that connects several microcomputers to a single disk drive and one copy of software. Many schools are switching to networks as a cheaper and more efficient means of computer instruction. Teachers may be faced with copyright problems when reproducing programs. (DF)

  15. Hypercard Another Computer Tool.

    Science.gov (United States)

    Geske, Joel

    1991-01-01

    Describes "Hypercard," a computer application package usable in all three modes of instructional computing: tutor, tool, and tutee. Suggests using Hypercard in scholastic journalism programs to teach such topics as news, headlines, design, photography, and advertising. Argues that the ability to access, organize, manipulate, and comprehend…

  16. Can Computers See?

    Indian Academy of Sciences (India)

    Can Computers See? – Can Computers Understand Visual Data? Neelima Shrikhande. General Article, Resonance – Journal of Science Education, Volume 4, Issue 6, June 1999, pp. 45-56.

  17. Computational genomics of hyperthermophiles

    NARCIS (Netherlands)

    Werken, van de H.J.G.

    2008-01-01

    With the ever increasing number of completely sequenced prokaryotic genomes and the subsequent use of functional genomics tools, e.g. DNA microarray and proteomics, computational data analysis and the integration of microbial and molecular data is inevitable. This thesis describes the computational

  18. Computer Technology for Industry

    Science.gov (United States)

    1979-01-01

    In this age of the computer, more and more business firms are automating their operations for increased efficiency in a great variety of jobs, from simple accounting to managing inventories, from precise machining to analyzing complex structures. In the interest of national productivity, NASA is providing assistance both to longtime computer users and newcomers to automated operations. Through a special technology utilization service, NASA saves industry time and money by making available already developed computer programs which have secondary utility. A computer program is essentially a set of instructions which tells the computer how to produce desired information or effect by drawing upon its stored input. Developing a new program from scratch can be costly and time-consuming. Very often, however, a program developed for one purpose can readily be adapted to a totally different application. To help industry take advantage of existing computer technology, NASA operates the Computer Software Management and Information Center (COSMIC®), located at the University of Georgia. COSMIC maintains a large library of computer programs developed for NASA, the Department of Defense, the Department of Energy and other technology-generating agencies of the government. The Center gets a continual flow of software packages, screens them for adaptability to private sector usage, stores them and informs potential customers of their availability.

  19. Computers in construction

    DEFF Research Database (Denmark)

    Howard, Rob

    The evolution of technology, particularly computing in building, learning from the past in order to anticipate what may happen in the future.

  20. Computer Use Exposed

    NARCIS (Netherlands)

    J.M. Richter (Janneke)

    2009-01-01

    Ever since the introduction of the personal computer, our daily lives are influenced more and more by computers. A day in the life of a PhD student illustrates this: “At the breakfast table, I check my e-mail to see if the meeting later that day has been confirmed, and I check the time

  1. Can Computers Create?

    Science.gov (United States)

    Hausman, Carl R.

    1985-01-01

    To be creative, an act must have as its outcome something new in the way it is intelligible and valuable. Computers have restricted contexts of information and have no ability to weigh bits of information. Computer optimists presuppose either determinism or indeterminism, either of which abandons creativity. (MT)

  2. Personalized Empathic Computing (PEC)

    NARCIS (Netherlands)

    van Beusekom, W.; van den Broek, Egon; van der Heijden, M.; Janssen, J.H.; Spaak, E.

    2006-01-01

    Until a decade ago, computers were used only by experts, and solely for professional purposes. Nowadays, the personal computer (PC) is standard equipment in most western households and is used to gather information, play games, communicate, etc. In parallel, users' expectations increase and,

  3. Computers and Creativity.

    Science.gov (United States)

    Ten Dyke, Richard P.

    1982-01-01

    A traditional question is whether or not computers shall ever think like humans. This question is redirected to a discussion of whether computers shall ever be truly creative. Creativity is defined and a program is described that is designed to complete creatively a series problem in mathematics. (MP)

  4. Petascale Computational Systems

    OpenAIRE

    Bell, Gordon; Gray, Jim; Szalay, Alex

    2007-01-01

    Computational science is changing to be data intensive. Super-Computers must be balanced systems; not just CPU farms but also petascale IO and networking arrays. Anyone building CyberInfrastructure should allocate resources to support a balanced Tier-1 through Tier-3 design.

  5. Computer Software Reviews.

    Science.gov (United States)

    Hawaii State Dept. of Education, Honolulu. Office of Instructional Services.

    Intended to provide guidance in the selection of the best computer software available to support instruction and to make optimal use of schools' financial resources, this publication provides a listing of computer software programs that have been evaluated according to their currency, relevance, and value to Hawaii's educational programs. The…

  6. Emission computed tomography

    International Nuclear Information System (INIS)

    Budinger, T.F.; Gullberg, G.T.; Huesman, R.H.

    1979-01-01

    This chapter is devoted to the methods of computer assisted tomography for determination of the three-dimensional distribution of gamma-emitting radionuclides in the human body. The major applications of emission computed tomography are in biological research and medical diagnostic procedures. The objectives of these procedures are to make quantitative measurements of in vivo biochemical and hemodynamic functions

  7. Computers in writing instruction

    NARCIS (Netherlands)

    Schwartz, Helen J.; van der Geest, Thea; Smit-Kreuzen, Marlies

    1992-01-01

    For computers to be useful in writing instruction, innovations should be valuable for students and feasible for teachers to implement. Research findings yield contradictory results in measuring the effects of different uses of computers in writing, in part because of the methodological complexity of

  8. Nature, computation and complexity

    International Nuclear Information System (INIS)

    Binder, P-M; Ellis, G F R

    2016-01-01

    The issue of whether the unfolding of events in the world can be considered a computation is explored in this paper. We come to different conclusions for inert and for living systems (‘no’ and ‘qualified yes’, respectively). We suggest that physical computation as we know it exists only as a tool of complex biological systems: us. (paper)

  9. Computational Sociolinguistics: A Survey

    NARCIS (Netherlands)

    Nguyen, Dong-Phuong; Doğruöz, A. Seza; Rosé, Carolyn P.; de Jong, Franciska M.G.

    Language is a social phenomenon and variation is inherent to its social nature. Recently, there has been a surge of interest within the computational linguistics (CL) community in the social dimension of language. In this article we present a survey of the emerging field of “computational

  10. Fault tolerant computing systems

    International Nuclear Information System (INIS)

    Randell, B.

    1981-01-01

    Fault tolerance involves the provision of strategies for error detection, damage assessment, fault treatment and error recovery. A survey is given of the different sorts of strategies used in highly reliable computing systems, together with an outline of recent research on the problems of providing fault tolerance in parallel and distributed computing systems. (orig.)

  11. Theory and computational science

    International Nuclear Information System (INIS)

    Durham, P.

    1985-01-01

    The theoretical and computational science carried out at the Daresbury Laboratory in 1984/5 is detailed in the Appendix to the Daresbury Annual Report. The Theory, Computational Science and Applications Groups provide support work for the experimental projects conducted at Daresbury. Use of the FPS-164 processor is also described. (U.K.)

  12. Selecting Personal Computers.

    Science.gov (United States)

    Djang, Philipp A.

    1993-01-01

    Describes a Multiple Criteria Decision Analysis Approach for the selection of personal computers that combines the capabilities of Analytic Hierarchy Process and Integer Goal Programing. An example of how decision makers can use this approach to determine what kind of personal computers and how many of each type to purchase is given. (nine…

  13. Physicist or computer specialist?

    Energy Technology Data Exchange (ETDEWEB)

    Clifton, J S [University College Hospital, London (United Kingdom)

    1966-06-15

    Since to most clinicians physical and computer science are two of the great mysteries of the world, the physicist in a hospital is expected by clinicians to be fully conversant with, and competent to make profound pronouncements on, all methods of computing, specific computing problems, and the suitability of computing machinery ranging from desk calculators to Atlas. This is not surprising since the proportion of the syllabus devoted to physics and mathematics in an M.B. degree is indeed meagre, and the word 'computer' has been surrounded with an aura of mysticism which suggests that it is some fantastic piece of electronic gadgetry comprehensible only to a veritable genius. The clinician consequently turns to the only scientific colleague with whom he has direct contact - the medical physicist - and expects him to be an authority. The physicist is thus thrust, however unwillingly, into the forefront of the advance of computer assistance to scientific medicine. It is therefore essential for him to acquire sufficient knowledge of computing science to enable him to provide satisfactory answers to clinicians' queries, to proffer more detailed advice on programming, and to convince clinicians that the computer is really a 'simpleton' which can only add and subtract, and even that only under instruction.

  14. Theory of computational complexity

    CERN Document Server

    Du, Ding-Zhu

    2011-01-01

    DING-ZHU DU, PhD, is a professor in the Department of Computer Science at the University of Minnesota. KER-I KO, PhD, is a professor in the Department of Computer Science at the State University of New York at Stony Brook.

  15. Computer vision for sports

    DEFF Research Database (Denmark)

    Thomas, Graham; Gade, Rikke; Moeslund, Thomas B.

    2017-01-01

    fixed to players or equipment is generally not possible. This provides a rich set of opportunities for the application of computer vision techniques to help the competitors, coaches and audience. This paper discusses a selection of current commercial applications that use computer vision for sports...

  16. Basic principles of computers

    International Nuclear Information System (INIS)

    Royal, H.D.; Parker, J.A.; Holmen, B.L.

    1988-01-01

    This chapter presents preliminary concepts of computer operations. It describes the hardware used in a nuclear medicine computer system. It discusses the software necessary for acquisition and analysis of nuclear medicine studies. The chapter outlines the integrated package of hardware and software that is necessary to perform specific functions in nuclear medicine

  17. Teaching Using Computer Games

    Science.gov (United States)

    Miller, Lee Dee; Shell, Duane; Khandaker, Nobel; Soh, Leen-Kiat

    2011-01-01

    Computer games have long been used for teaching. Current reviews lack categorization and analysis using learning models which would help instructors assess the usefulness of computer games. We divide the use of games into two classes: game playing and game development. We discuss the Input-Process-Outcome (IPO) model for the learning process when…

  18. Text understanding for computers

    NARCIS (Netherlands)

    Kenter, T.M.

    2017-01-01

    A long-standing challenge for computers communicating with humans is to pass the Turing test, i.e., to communicate in such a way that it is impossible for humans to determine whether they are talking to a computer or another human being. The field of natural language understanding — which studies

  19. Advances in Computer Entertainment.

    NARCIS (Netherlands)

    Nijholt, Antinus; Romão, T.; Reidsma, Dennis; Unknown, [Unknown

    2012-01-01

    These are the proceedings of the 9th International Conference on Advances in Computer Entertainment ACE 2012). ACE has become the leading scientific forum for dissemination of cutting-edge research results in the area of entertainment computing. Interactive entertainment is one of the most vibrant

  20. Computers and Classroom Culture.

    Science.gov (United States)

    Schofield, Janet Ward

    This book explores the meaning of computer technology in schools. The book is based on data gathered from a two-year observation of more than 30 different classrooms in an urban high school: geometry classes in which students used artificially intelligent tutors; business classes in which students learned word processing; and computer science…

  1. Computer Literacy Education

    Science.gov (United States)

    1989-01-01

    Cognitive Aspect ," AEDS Journal, 18, 3 (Spring 1985) 150. "°Geoffrey Akst, "Computer Literacy: An Interview with Dr. Michael Hoban." Journal of Develop- m...1984. Cheng, Tina T.; Plake, Barbara; and Stevens, Dorothy Jo. "A Validation Study of the Computer Literacy Examination: Cognitive Aspect ." AEDS

  2. Ubiquitous human computing.

    Science.gov (United States)

    Zittrain, Jonathan

    2008-10-28

    Ubiquitous computing means network connectivity everywhere, linking devices and systems as small as a drawing pin and as large as a worldwide product distribution chain. What could happen when people are so readily networked? This paper explores issues arising from two possible emerging models of ubiquitous human computing: fungible networked brainpower and collective personal vital sign monitoring.

  3. Learning with Ubiquitous Computing

    Science.gov (United States)

    Rosenheck, Louisa

    2008-01-01

    If ubiquitous computing becomes a reality and is widely adopted, it will inevitably have an impact on education. This article reviews the background of ubiquitous computing and current research projects done involving educational "ubicomp." Finally it explores how ubicomp may and may not change education in both formal and informal settings and…

  4. Quantum Analog Computing

    Science.gov (United States)

    Zak, M.

    1998-01-01

    Quantum analog computing is based upon the similarity between the mathematical formalism of quantum mechanics and the phenomena to be computed. It exploits a dynamical convergence of several competing phenomena to an attractor which can represent an extremum of a function, an image, a solution to a system of ODEs, or a stochastic process.
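
    A classical toy analogue of computing by convergence to an attractor (an illustrative sketch, not the quantum formalism of the abstract) is gradient flow: in the Python snippet below the state evolves by gradient descent on an assumed example function f(x) = (x - 3)**2, and the fixed point of the dynamics is the minimizer, so running the dynamics computes the extremum.

        # Gradient of the illustrative function f(x) = (x - 3)**2.
        def f_grad(x):
            return 2.0 * (x - 3.0)

        x = 10.0      # arbitrary initial state
        dt = 0.1      # illustrative step size, small enough for convergence
        for _ in range(200):
            x -= dt * f_grad(x)   # each step contracts the state toward the attractor

        print(x)      # ~3.0: the attractor of the dynamics is the minimum of f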

  5. Computing in Research.

    Science.gov (United States)

    Ashenhurst, Robert L.

    The introduction and diffusion of automatic computing facilities during the 1960's is reviewed; it is described as a time when research strategies in a broad variety of disciplines changed to take advantage of the newfound power provided by the computer. Several types of typical problems encountered by researchers who adopted the new technologies,…

  6. Computational Cognitive Color Perception

    NARCIS (Netherlands)

    Ciftcioglu, O.; Bittermann, M.S.

    2016-01-01

    Comprehension of aesthetic color characteristics based on a computational model of visual perception and color cognition is presented. The computational comprehension is manifested by the machine’s capability of instantly assigning appropriate colors to the objects perceived. They form a scene

  7. Thinking about computational thinking

    NARCIS (Netherlands)

    Lu, J.J.; Fletcher, G.H.L.; Fitzgerald, S.; Guzdial, M.; Lewandowski, G.; Wolfman, S.A.

    2009-01-01

    Jeannette Wing's call for teaching Computational Thinking (CT) as a formative skill on par with reading, writing, and arithmetic places computer science in the category of basic knowledge. Just as proficiency in basic language arts helps us to effectively communicate and in basic math helps us to

  8. Computer Operating System Maintenance.

    Science.gov (United States)

    1982-06-01

    The Computer Management Information Facility (CMIF) system was developed by Rapp Systems to fulfill the need at the CRF to record and report on computer center resource usage and utilization. The foundation of the CMIF system is a System 2000 data base (CRFMGMT) which stores and permits access

  9. Computational Sociolinguistics: A Survey.

    NARCIS (Netherlands)

    de Jong, F.M.G.; Nguyen, Dong

    2016-01-01

    Language is a social phenomenon and variation is inherent to its social nature. Recently, there has been a surge of interest within the computational linguistics (CL) community in the social dimension of language. In this article we present a survey of the emerging field of “computational

  10. Simulation of quantum computers

    NARCIS (Netherlands)

    De Raedt, H; Michielsen, K; Hams, AH; Miyashita, S; Saito, K; Landau, DP; Lewis, SP; Schuttler, HB

    2001-01-01

    We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software

  11. Simulation of quantum computers

    NARCIS (Netherlands)

    Raedt, H. De; Michielsen, K.; Hams, A.H.; Miyashita, S.; Saito, K.

    2000-01-01

    We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software
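
    The two records above describe simulating quantum computer hardware as interacting spin-1/2 objects. The authors' software is not reproduced here; as a minimal state-vector sketch of the underlying idea (assuming NumPy is available), a one-qubit gate is simply a 2x2 unitary matrix applied to a complex state vector:

        import numpy as np

        # Single qubit (spin-1/2 object) initialised in the |0> state.
        state = np.array([1.0, 0.0], dtype=complex)

        # Hadamard gate as a 2x2 unitary matrix.
        H = np.array([[1, 1],
                      [1, -1]], dtype=complex) / np.sqrt(2)

        # "Executing the quantum program" amounts to matrix-vector multiplication.
        state = H @ state
        print(np.abs(state) ** 2)   # [0.5 0.5]: equal probabilities of measuring 0 or 1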

  12. Exercises in Computational Chemistry

    DEFF Research Database (Denmark)

    Spanget-Larsen, Jens

    2016-01-01

    A selection of HyperChem© PC-exercises in computational chemistry. Answers to most questions are appended (Roskilde University 2014-16).

  13. The Computational Materials Repository

    DEFF Research Database (Denmark)

    Landis, David D.; Hummelshøj, Jens S.; Nestorov, Svetlozar

    2012-01-01

    The possibilities for designing new materials based on quantum physics calculations are rapidly growing, but these design efforts lead to a significant increase in the amount of computational data created. The Computational Materials Repository (CMR) addresses this data challenge and provides...

  14. Programming in biomolecular computation

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2010-01-01

    … by programs reminiscent of low-level computer machine code; and at the same time biologically plausible: its functioning is defined by a single and relatively small set of chemical-like reaction rules. Further properties: the model is stored-program: programs are the same as data, so programs are not only executable, but are also compilable and interpretable. It is universal: all computable functions can be computed (in natural ways and without arcane encodings of data and algorithm); it is also uniform: new “hardware” is not needed to solve new problems; and (last but not least) it is Turing complete in a strong sense: a universal algorithm exists, that is able to execute any program, and is not asymptotically inefficient. A prototype model has been implemented (for now in silico on a conventional computer). This work opens new perspectives on just how computation may be specified at the biological level.

  15. Place-Specific Computing

    DEFF Research Database (Denmark)

    Messeter, Jörn; Johansson, Michael

    An increased interest in the notion of place has evolved in interaction design. Proliferation of wireless infrastructure, developments in digital media, and a ‘spatial turn’ in computing provide the base for place-specific computing as a suggested new genre of interaction design. In the REcult project place-specific computing is explored through design oriented research. This article reports six pilot studies where design students have designed concepts for place-specific computing in Berlin (Germany), Cape Town (South Africa), Rome (Italy) and Malmö (Sweden). Background and arguments for place-specific computing as a genre of interaction design are described. A total number of 36 design concepts designed for 16 designated zones in the four cities are presented. An analysis of the design concepts is presented indicating potentials, possibilities and problems as directions for future...

  16. Neuroscience, brains, and computers

    Directory of Open Access Journals (Sweden)

    Giorno Maria Innocenti

    2013-07-01

    Full Text Available This paper addresses the role of the neurosciences in establishing what the brain is and how states of the brain relate to states of the mind. The brain is viewed as a computational device performing operations on symbols. However, the brain is a special purpose computational device designed by evolution and development for survival and reproduction, in close interaction with the environment. The hardware of the brain (its structure) is very different from that of man-made computers. The computational style of the brain is also very different from traditional computers: the computational algorithms, instead of being sets of external instructions, are embedded in brain structure. Concerning the relationships between brain and mind a number of questions lie ahead. One of them is why and how only the human brain grasped the notion of God, probably only at the evolutionary stage attained by Homo sapiens.

  17. Parallelism in matrix computations

    CERN Document Server

    Gallopoulos, Efstratios; Sameh, Ahmed H

    2016-01-01

    This book is primarily intended as a research monograph that could also be used in graduate courses for the design of parallel algorithms in matrix computations. It assumes general but not extensive knowledge of numerical linear algebra, parallel architectures, and parallel programming paradigms. The book consists of four parts: (I) Basics; (II) Dense and Special Matrix Computations; (III) Sparse Matrix Computations; and (IV) Matrix functions and characteristics. Part I deals with parallel programming paradigms and fundamental kernels, including reordering schemes for sparse matrices. Part II is devoted to dense matrix computations such as parallel algorithms for solving linear systems, linear least squares, the symmetric algebraic eigenvalue problem, and the singular-value decomposition. It also deals with the development of parallel algorithms for special linear systems such as banded, Vandermonde, Toeplitz, and block Toeplitz systems. Part III addresses sparse matrix computations: (a) the development of pa...

  18. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    Directory of Open Access Journals (Sweden)

    Karlheinz Schwarz

    2013-09-01

    Full Text Available Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems in fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized below. In each section a further focusing will be provided by occasionally organizing special issues on topics of high interest, collecting papers on fundamental work in the field. More applied papers should be submitted to their corresponding specialist journals. To help us achieve our goal with this journal, we have an excellent editorial board to advise us on the exciting current and future trends in computation from methodology to application. We very much look forward to hearing all about the research going on across the world. [...

  19. Computational models of neuromodulation.

    Science.gov (United States)

    Fellous, J M; Linster, C

    1998-05-15

    Computational modeling of neural substrates provides an excellent theoretical framework for the understanding of the computational roles of neuromodulation. In this review, we illustrate, with a large number of modeling studies, the specific computations performed by neuromodulation in the context of various neural models of invertebrate and vertebrate preparations. We base our characterization of neuromodulations on their computational and functional roles rather than on anatomical or chemical criteria. We review the main framework in which neuromodulation has been studied theoretically (central pattern generation and oscillations, sensory processing, memory and information integration). Finally, we present a detailed mathematical overview of how neuromodulation has been implemented at the single cell and network levels in modeling studies. Overall, neuromodulation is found to increase and control computational complexity.

  20. Computational Ocean Acoustics

    CERN Document Server

    Jensen, Finn B; Porter, Michael B; Schmidt, Henrik

    2011-01-01

    Since the mid-1970s, the computer has played an increasingly pivotal role in the field of ocean acoustics. Faster and less expensive than actual ocean experiments, and capable of accommodating the full complexity of the acoustic problem, numerical models are now standard research tools in ocean laboratories. The progress made in computational ocean acoustics over the last thirty years is summed up in this authoritative and innovatively illustrated new text. Written by some of the field's pioneers, all Fellows of the Acoustical Society of America, Computational Ocean Acoustics presents the latest numerical techniques for solving the wave equation in heterogeneous fluid–solid media. The authors discuss various computational schemes in detail, emphasizing the importance of theoretical foundations that lead directly to numerical implementations for real ocean environments. To further clarify the presentation, the fundamental propagation features of the techniques are illustrated in color. Computational Ocean A...

  1. Computer tomography in otolaryngology

    International Nuclear Information System (INIS)

    Gradzki, J.

    1981-01-01

    The principles of design and the action of computer tomography, which has also been applied to the diagnosis of nose, ear and throat diseases, are discussed. Computer tomography makes possible visualization of the structures of the nose, nasal sinuses and facial skeleton in transverse and coronal planes. The method enables an accurate evaluation of the position and size of neoplasms in these regions and differentiation of inflammatory exudates from malignant masses. In otology, computer tomography is used particularly in the diagnosis of pontocerebellar angle tumours and otogenic brain abscesses. Computer tomography of the larynx and pharynx provides new diagnostic data owing to the possibility of obtaining transverse sections and visualization of cartilage. Computer tomograms of some cases are presented. (author)

  2. Offline computing and networking

    International Nuclear Information System (INIS)

    Appel, J.A.; Avery, P.; Chartrand, G.

    1985-01-01

    This note summarizes the work of the Offline Computing and Networking Group. The report is divided into two sections; the first deals with the computing and networking requirements and the second with the proposed way to satisfy those requirements. In considering the requirements, we have considered two types of computing problems. The first is CPU-intensive activity such as production data analysis (reducing raw data to DST), production Monte Carlo, or engineering calculations. The second is physicist-intensive computing such as program development, hardware design, physics analysis, and detector studies. For both types of computing, we examine a variety of issues. These included a set of quantitative questions: how much CPU power (for turn-around and for through-put), how much memory, mass-storage, bandwidth, and so on. There are also very important qualitative issues: what features must be provided by the operating system, what tools are needed for program design, code management, database management, and for graphics

  3. CERN School of Computing

    CERN Multimedia

    2007-01-01

    The 2007 CERN School of Computing, organised by CERN in collaboration with the University of Split (FESB) will be held from 20 to 31 August 2007 in Dubrovnik, Croatia. It is aimed at postgraduate students and research workers with a few years' experience in scientific physics, computing or related fields. Special themes this year are: GRID Technologies: The Grid track delivers unique theoretical and hands-on education on some of the most advanced GRID topics; Software Technologies: The Software track addresses some of the most relevant modern techniques and tools for large scale distributed software development and handling as well as for computer security; Physics Computing: The Physics Computing track focuses on informatics topics specific to the HEP community. After setting-the-scene lectures, it addresses data acquisition and ROOT. Grants from the European Union Framework Programme 6 (FP6) are available to participants to cover part or all of the cost of the School. More information can be found at...

  4. Computational invariant theory

    CERN Document Server

    Derksen, Harm

    2015-01-01

    This book is about the computational aspects of invariant theory. Of central interest is the question how the invariant ring of a given group action can be calculated. Algorithms for this purpose form the main pillars around which the book is built. There are two introductory chapters, one on Gröbner basis methods and one on the basic concepts of invariant theory, which prepare the ground for the algorithms. Then algorithms for computing invariants of finite and reductive groups are discussed. Particular emphasis lies on interrelations between structural properties of invariant rings and computational methods. Finally, the book contains a chapter on applications of invariant theory, covering fields as disparate as graph theory, coding theory, dynamical systems, and computer vision. The book is intended for postgraduate students as well as researchers in geometry, computer algebra, and, of course, invariant theory. The text is enriched with numerous explicit examples which illustrate the theory and should be ...

  5. Indirection and computer security.

    Energy Technology Data Exchange (ETDEWEB)

    Berg, Michael J.

    2011-09-01

    The discipline of computer science is built on indirection. David Wheeler famously said, 'All problems in computer science can be solved by another layer of indirection. But that usually will create another problem'. We propose that every computer security vulnerability is yet another problem created by the indirections in system designs and that focusing on the indirections involved is a better way to design, evaluate, and compare security solutions. We are not proposing that indirection be avoided when solving problems, but that understanding the relationships between indirections and vulnerabilities is key to securing computer systems. Using this perspective, we analyze common vulnerabilities that plague our computer systems, consider the effectiveness of currently available security solutions, and propose several new security solutions.
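
    A minimal, language-level illustration of the indirection trade-off described above (a toy sketch, not an example from the report): routing a call through a lookup table makes behaviour easy to change, and at the same time the table itself becomes a new thing that can be tampered with.

        # Direct call: no indirection, nothing to reconfigure, nothing extra to attack.
        def save_to_disk(data):
            print("saving", data)

        # One layer of indirection: a handler table resolved at run time.
        handlers = {"save": save_to_disk}

        def dispatch(action, data):
            handlers[action](data)

        dispatch("save", "report.txt")          # saving report.txt

        # The flexibility cuts both ways: whoever can rewrite the entry now
        # controls what "save" means -- the "another problem" of the quote.
        handlers["save"] = lambda data: print("exfiltrating", data)
        dispatch("save", "report.txt")          # exfiltrating report.txt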

  6. Non-Causal Computation

    Directory of Open Access Journals (Sweden)

    Ämin Baumeler

    2017-07-01

    Full Text Available Computation models such as circuits describe sequences of computation steps that are carried out one after the other. In other words, algorithm design is traditionally subject to the restriction imposed by a fixed causal order. We address a novel computing paradigm beyond quantum computing, replacing this assumption by mere logical consistency: We study non-causal circuits, where a fixed time structure within a gate is locally assumed whilst the global causal structure between the gates is dropped. We present examples of logically consistent non-causal circuits outperforming all causal ones; they imply that suppressing loops entirely is more restrictive than just avoiding the contradictions they can give rise to. That fact is already known for correlations as well as for communication, and we here extend it to computation.

  7. COMPARATIVE STUDY OF CLOUD COMPUTING AND MOBILE CLOUD COMPUTING

    OpenAIRE

    Nidhi Rajak*, Diwakar Shukla

    2018-01-01

    The present era is that of Information and Communication Technology (ICT), and a great deal of research is under way on Cloud Computing and Mobile Cloud Computing, covering security issues, data management, load balancing and so on. Cloud computing provides services to the end user over the Internet, and the primary objectives of this computing are resource sharing and pooling among the end users. Mobile Cloud Computing is a combination of Cloud Computing and Mobile Computing. Here, data is stored in...

  8. Community Cloud Computing

    Science.gov (United States)

    Marinos, Alexandros; Briscoe, Gerard

    Cloud Computing is rising fast, with its data centres growing at an unprecedented rate. However, this has come with concerns over privacy, efficiency at the expense of resilience, and environmental sustainability, because of the dependence on Cloud vendors such as Google, Amazon and Microsoft. Our response is an alternative model for the Cloud conceptualisation, providing a paradigm for Clouds in the community, utilising networked personal computers for liberation from the centralised vendor model. Community Cloud Computing (C3) offers an alternative architecture, created by combining the Cloud with paradigms from Grid Computing, principles from Digital Ecosystems, and sustainability from Green Computing, while remaining true to the original vision of the Internet. It is more technically challenging than Cloud Computing, having to deal with distributed computing issues, including heterogeneous nodes, varying quality of service, and additional security constraints. However, these are not insurmountable challenges, and with the need to retain control over our digital lives and the potential environmental consequences, it is a challenge we must pursue.

  9. Cloud Computing Governance Lifecycle

    Directory of Open Access Journals (Sweden)

    Soňa Karkošková

    2016-06-01

    Full Text Available Externally provisioned cloud services enable flexible and on-demand sourcing of IT resources. Cloud computing introduces new challenges, such as the need for business process redefinition, the establishment of specialized governance and management, new organizational structures and relationships with external providers, and the management of new types of risk arising from dependency on external providers. There is a general consensus that, in addition to challenges, cloud computing brings many benefits, but it is unclear how to achieve them. Cloud computing governance helps to create business value by obtaining benefits from the use of cloud computing services while optimizing investment and risk. The challenge organizations face in relation to governing cloud services is how to design and implement cloud computing governance to gain the expected benefits. This paper aims to provide guidance on the implementation activities of the proposed Cloud computing governance lifecycle from the cloud consumer perspective. The proposed model is based on the SOA Governance Framework and consists of a lifecycle for the implementation and continuous improvement of a cloud computing governance model.

  10. Navier-Stokes computer

    International Nuclear Information System (INIS)

    Hayder, M.E.

    1988-01-01

    A new scientific supercomputer, known as the Navier-Stokes Computer (NSC), has been designed. The NSC is a multi-purpose machine, and for applications in the field of computational fluid dynamics (CFD), this supercomputer is expected to yield a computational speed far exceeding that of present-day supercomputers. This computer has a few very powerful processors (known as nodes) connected by an internodal network. There are three versions of the NSC nodes: micro-, mini- and full-node. The micro-node was developed to prove, demonstrate and refine the key architectural features of the NSC. Architectures of the two recent versions of the NSC nodes are presented, with the main focus on the full-node. At a clock speed of 20 MHz, the mini- and the full-node have peak computational speeds of 200 and 640 MFLOPS, respectively. The full-node is the final version of the NSC nodes, and an NSC is expected to have 128 full-nodes. To test the suitability of different algorithms on the NSC architecture, an NSC simulator was developed. Some of the existing computational fluid dynamics codes were placed on this simulator to determine important and relevant issues relating to the efficient use of the NSC architecture
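
    The peak figures quoted above imply a fixed number of floating-point results per clock cycle, and the aggregate machine speed follows directly from them. The short Python check below simply re-derives those numbers from the abstract; the 128-node total is the planned configuration mentioned, not a measured value.

        clock_hz = 20e6            # 20 MHz clock, as quoted
        full_node_flops = 640e6    # 640 MFLOPS peak per full-node, as quoted

        # Floating-point operations completed per clock cycle on a full-node.
        print(full_node_flops / clock_hz)        # 32.0

        # Aggregate peak for the planned 128 full-node machine, in GFLOPS.
        print(128 * full_node_flops / 1e9)       # 81.92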

  11. Coping with distributed computing

    International Nuclear Information System (INIS)

    Cormell, L.

    1992-09-01

    The rapid increase in the availability of high-performance, cost-effective RISC/UNIX workstations has been both a blessing and a curse. The blessing of having extremely powerful computing engines available on the desktop is well known to many users. The user has tremendous freedom, flexibility, and control of his environment. That freedom can, however, become the curse of distributed computing. The user must become a system manager to some extent: he must worry about backups, maintenance, upgrades, etc. Traditionally these activities have been the responsibility of a central computing group. The central computing group, however, may find that it can no longer provide all of the traditional services. With the plethora of workstations now found on so many desktops throughout an entire campus or lab, the central computing group may be swamped by support requests. This talk will address several of these computer support and management issues by providing examples of the approaches taken at various HEP institutions. In addition, a brief review of commercial directions and products for distributed computing and management will be given

  12. Computing with synthetic protocells.

    Science.gov (United States)

    Courbet, Alexis; Molina, Franck; Amar, Patrick

    2015-09-01

    In this article we present a new kind of computing device that uses biochemical reaction networks as building blocks to implement logic gates. The architecture of a computing machine relies on these generic and composable building blocks, computation units, which can be used in multiple instances to perform complex boolean functions. Standard logical operations are implemented by biochemical networks, encapsulated and insulated within synthetic vesicles called protocells. These protocells are capable of exchanging energy and information with each other through transmembrane electron transfer. In the paradigm of computation we propose, protoputing, a machine can solve only one problem and therefore has to be built specifically. Thus, the programming phase of the standard computing paradigm is represented in our approach by the set of assembly instructions (specific attachments) that directs the wiring of the protocells that constitute the machine itself. To demonstrate the computing power of protocellular machines, we apply them to an NP-complete problem, known to be very demanding in computing power: the 3-SAT problem. We show how to program the assembly of a machine that can verify the satisfiability of a given boolean formula. We then show how to use the massive parallelism of these machines to verify, in less than 20 min, all the valuations of the input variables and output a fluorescent signal when the formula is satisfiable, or no signal at all otherwise.
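
    For readers unfamiliar with the benchmark, the sketch below shows in plain Python what "verifying all the valuations" of a 3-SAT formula means: every assignment of the input variables is enumerated, and the formula is declared satisfiable if any assignment makes all clauses true. It is a conventional sequential analogue for illustration only, not the protocellular implementation described by the authors, and the example formula is hypothetical.

        from itertools import product

        # Each clause is a list of literals; a literal is (variable_index, negated).
        # Hypothetical formula: (x0 or not x1 or x2) and (not x0 or x1 or not x2)
        formula = [
            [(0, False), (1, True), (2, False)],
            [(0, True), (1, False), (2, True)],
        ]

        def satisfiable(formula, n_vars):
            """Brute-force check: test every valuation of the n_vars input variables."""
            for valuation in product([False, True], repeat=n_vars):
                if all(any(valuation[v] != neg for v, neg in clause) for clause in formula):
                    return valuation  # a satisfying assignment was found
            return None

        print(satisfiable(formula, 3))

    A protocellular machine explores these valuations in parallel rather than one after another, which is how the authors can check all of them in under 20 minutes.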

  13. Research on cloud computing solutions

    OpenAIRE

    Liudvikas Kaklauskas; Vaida Zdanytė

    2015-01-01

    Cloud computing can be defined as a new style of computing in which dynamically scalable and often virtualized resources are provided as services over the Internet. Advantages of the cloud computing technology include cost savings, high availability, and easy scalability. Voas and Zhang described six phases of computing paradigms, from dummy terminals/mainframes, to PCs, to networking computing, to grid and cloud computing. There are four types of cloud computing: public cloud, private cloud, ...

  14. Forensic Computing (Dagstuhl Seminar 13482)

    OpenAIRE

    Freiling, Felix C.; Hornung, Gerrit; Polcák, Radim

    2014-01-01

    Forensic computing (sometimes also called digital forensics, computer forensics or IT forensics) is a branch of forensic science pertaining to digital evidence, i.e., any legal evidence that is processed by digital computer systems or stored on digital storage media. Forensic computing is a new discipline evolving within the intersection of several established research areas such as computer science, computer engineering and law. Forensic computing is rapidly gaining importance since the...

  15. Computer assisted radiology

    International Nuclear Information System (INIS)

    Lemke, H.U.; Rhodes, M.L.; Jaffee, C.C.; Felix, R.

    1987-01-01

    The organization of the book follows the plan of the meeting, with chapters representing the general meeting sessions and articles representing the meeting presentations. These are grouped by modality or kindred application, where relevant. Some sessions are not similarly divided, and individual papers are positioned, presumably, in order of presentation. Each section labeled workshop addresses a specific topic. The first session is on digital image generation and contains sections on magnetic resonance imaging, nuclear medicine, computed tomography, ultrasound, digital radiography, and digital subtraction angiography. The remaining sections are on application programming, picture archiving and communication systems, computer graphics, and computer vision

  16. Desktop grid computing

    CERN Document Server

    Cerin, Christophe

    2012-01-01

    Desktop Grid Computing presents common techniques used in numerous models, algorithms, and tools developed during the last decade to implement desktop grid computing. These techniques enable the solution of many important sub-problems for middleware design, including scheduling, data management, security, load balancing, result certification, and fault tolerance. The book's first part covers the initial ideas and basic concepts of desktop grid computing. The second part explores challenging current and future problems. Each chapter presents the sub-problems, discusses theoretical and practical
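
    As an illustration of one of the sub-problems listed above, the sketch below shows a common result-certification strategy in desktop grids: the same work unit is replicated on several untrusted volunteer nodes, and a result is accepted only when enough of them agree. This is a minimal, hypothetical example, not code from the book.

        from collections import Counter

        def certify(results, quorum=2):
            """Accept a replicated task's result only if at least `quorum` workers agree."""
            value, votes = Counter(results).most_common(1)[0]
            return value if votes >= quorum else None  # None -> re-schedule the work unit

        print(certify(["42", "42", "17"]))   # -> "42"  (certified by majority)
        print(certify(["42", "17", "99"]))   # -> None  (no agreement; recompute)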

  17. Computer science handbook

    CERN Document Server

    Tucker, Allen B

    2004-01-01

    Due to the great response to the famous Computer Science Handbook edited by Allen B. Tucker, … in 2004 Chapman & Hall/CRC published a second edition of this comprehensive reference book. Within more than 70 chapters, every one new or significantly revised, one can find any kind of information and references about computer science one can imagine. … All in all, there is absolutely nothing about computer science that cannot be found in the encyclopedia with its 110 survey articles … -Christoph Meinel, Zentralblatt MATH

  18. Reliable computer systems.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1993-11-01

    In this article, we looked at some decisions that apply to the design of reliable computer systems. We began with a discussion of several terms such as testability, then described some systems that call for highly reliable hardware and software. The article concluded with a discussion of methods that can be used to achieve higher reliability in computer systems. Reliability and fault tolerance in computers probably will continue to grow in importance. As more and more systems are computerized, people will want assurances about the reliability of these systems, and their ability to work properly even when sub-systems fail.
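
    One standard way to reason about "working properly even when sub-systems fail" is redundancy: if a module works with probability r and n independent replicas run in parallel, the system works as long as at least one replica does, giving reliability 1 - (1 - r)^n. The sketch below is an illustrative textbook calculation, not one drawn from the article.

        def parallel_reliability(r: float, n: int) -> float:
            """Reliability of n independent replicas in parallel: 1 - (1 - r)^n."""
            return 1 - (1 - r) ** n

        # A 90%-reliable module: each extra replica adds roughly another "nine".
        for n in (1, 2, 3):
            print(n, "replica(s):", round(parallel_reliability(0.9, n), 3))
        # -> roughly 0.9, 0.99, 0.999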

  19. Archives and the computer

    CERN Document Server

    Cook, Michael Garnet

    1986-01-01

    Archives and the Computer deals with the use of the computer and its systems and programs in archiving data and other related materials. The book covers topics such as the scope of automated systems in archives; systems for records management, archival description, and retrieval; and machine-readable archives. The selection also features examples of archives from different institutions such as the University of Liverpool, Berkshire County Record Office, and the National Maritime Museum.The text is recommended for archivists who would like to know more about the use of computers in archiving of

  20. Power plant process computer

    International Nuclear Information System (INIS)

    Koch, R.

    1982-01-01

    The concept of instrumentation and control in nuclear power plants incorporates the use of process computers for tasks which are on-line with respect to real-time requirements but are not part of closed-loop control. The general scope of tasks comprises alarm annunciation on CRTs, data logging, data recording for post-trip reviews and plant behaviour analysis, nuclear data computation, and graphic displays. Process computers are additionally used for dedicated tasks such as the aeroball measuring system and the turbine stress evaluator. Further applications are personal dose supervision and access monitoring. (orig.)