WorldWideScience

Sample records for imaging-versus computed tomography-based

  1. Computed tomography-based subclassification of chronic obstructive pulmonary disease

    DEFF Research Database (Denmark)

    Dirksen, Asger; Wille, Mathilde M W

    2016-01-01

    Computed tomography (CT) is an obvious modality for subclassification of COPD. Traditionally, the pulmonary involvement of chronic obstructive pulmonary disease (COPD) in smokers is understood as a combination of deleterious effects of smoking on small airways (chronic bronchitis and small airways...... observed in COPD are subtle. Furthermore, recent results indicate that emphysema may also be the essential pathophysiologic mechanism behind the airflow limitation of COPD. The definition of COPD excludes bronchiectasis as a symptomatic subtype of COPD, and CT findings in chronic bronchitis...

  2. Diagnostic accuracy of magnetic resonance imaging versus computed tomography in stress fractures of the lumbar spine

    Energy Technology Data Exchange (ETDEWEB)

    Ganiyusufoglu, A.K., E-mail: kursady33@yahoo.co [Department of Radiology, Florence Nightingale Hospital, Istanbul (Turkey); Onat, L. [Department of Radiology, Florence Nightingale Hospital, Istanbul (Turkey); Karatoprak, O.; Enercan, M.; Hamzaoglu, A. [Department of Orthopedics and Traumatology, Florence Nightingale Hospital, Istanbul (Turkey)

    2010-11-15

    Aim: To compare the diagnostic accuracy of magnetic resonance imaging (MRI) with computed tomography (CT) in stress fractures of the lumbar spine. Materials and methods: Radiological and clinical data from 57 adolescents and young adults with a diagnosis of stress injury of the lumbar spine were retrospectively reviewed. All cases had undergone both 1.5 T MRI and 16-section CT examinations. All MRI and CT images were retrospectively reviewed and evaluated in separate sessions. The fracture morphology (complete/incomplete, localization) and vertebral levels were noted at both the CT and MRI examinations. Bone marrow/peri-osseous soft-tissue oedema was also determined at MRI. Results: In total, 73 complete and 32 incomplete stress fractures were detected with CT. Sixty-seven complete, 24 incomplete fractures and eight stress reactions were detected using MRI in the same study group. Marrow oedema was also seen in eight of the complete and 20 of the incomplete fractures. The specificity, sensitivity, and accuracy of MRI in detecting fracture lines were 99.6, 86.7, and 97.2%, respectively. MRI was more accurate at the lower lumbar levels in comparison to the upper lumbar levels. Conclusion: MRI has a similar diagnostic accuracy to CT in determining complete fractures with or without accompanying marrow oedema and incomplete fractures with accompanying marrow oedema, especially at the lower lumbar levels, which constitute 94% of all fractures. At upper lumbar levels and in incomplete fractures of the pars interarticularis with marked surrounding sclerosis, MRI has apparent limitations compared to CT imaging.

  3. Feasibility of computed tomography based thermometry during interstitial laser heating in bovine liver

    NARCIS (Netherlands)

    Pandeya, G. D.; Klaessens, J. H. G. M.; Greuter, M. J. W.; Schmidt, B.; Flohr, T.; van Hillegersberg, R.; Oudkerk, M.

    2011-01-01

    To assess the feasibility of computed tomography (CT) based thermometry during interstitial laser heating in the bovine liver. Four freshly excised cylindrical blocks of bovine tissue were heated using a continuous Nd:YAG laser (wavelength: 1064 nm, active length: 30 mm, power: 10-30 W). All ti

  4. CdZnTe detector for computed tomography based on weighting potential

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Hyun Jong; Park, Chan Sun; Kim, Jung Su; Kim, Jung Min; Choi, Jong Hak; Kim, Ki Hyun [Korea University, Seoul (Korea, Republic of)

    2016-03-15

    Room-temperature operating CdZnTe (CZT) material is an innovative radiation detector which could reduce the patient dose to one-tenth the level of conventional CT (computed tomography) and mammography systems. The pixel size and pixel pitch in the imaging device determine the conversion efficiency for incident X- or gamma-rays and the signal cross-talk, that is, the image quality of the detector system. The weighting potential is the virtual potential determined by the position and geometry of the electrode. The weighting potential was obtained by computer-based simulation, solving the Poisson equation with proper boundary conditions. The pixel was optimized by considering the CIE (charge induced efficiency) and the signal cross-talk in the CT detector system. The pixel pitch was 1 mm and the detector thickness was 2 mm in the simulation. The optimized pixel size and inter-pixel distance for maximizing the CIE and minimizing the signal cross-talk are about 750 μm and 125 μm, respectively.
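
    The weighting potential described in this record follows the Shockley-Ramo framework: it is obtained by solving the Laplace equation with the pixel electrode of interest set to unit potential and all other electrodes grounded. The sketch below illustrates that idea on a simple 2D finite-difference grid, using the 2 mm thickness and 1 mm pitch quoted in the abstract; the grid spacing, three-pixel cross-section and relaxation scheme are illustrative assumptions, not the authors' simulation.

```python
import numpy as np

# Toy weighting-potential calculation for a pixelated CZT detector cross-section.
# The central pixel anode is held at 1, the cathode and neighbouring pixels at 0,
# and the Laplace equation is relaxed on a uniform grid (Jacobi iteration).
dx = 0.025                         # grid spacing in mm (assumed)
nz = int(2.0 / dx) + 1             # detector thickness: 2 mm (abstract)
nx = int(3.0 / dx) + 1             # three 1-mm pixels side by side (assumed)
phi = np.zeros((nz, nx))

pixel_width = 0.75                 # optimized pixel size from the abstract (750 um)
x = np.arange(nx) * dx
central = np.abs(x - x.mean()) <= pixel_width / 2
phi[-1, central] = 1.0             # anode plane: central pixel at unit potential

for _ in range(5000):              # relax the interior points
    phi[1:-1, 1:-1] = 0.25 * (phi[2:, 1:-1] + phi[:-2, 1:-1] +
                              phi[1:-1, 2:] + phi[1:-1, :-2])

# Profile along the pixel centre line: most of the induced signal develops
# close to the anode (the "small pixel effect" exploited by such detectors).
print(np.round(phi[:, nx // 2][::8], 3))
```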

  5. Computed tomography-based biomarker provides unique signature for diagnosis of COPD phenotypes and disease progression.

    Science.gov (United States)

    Galbán, Craig J; Han, Meilan K; Boes, Jennifer L; Chughtai, Komal A; Meyer, Charles R; Johnson, Timothy D; Galbán, Stefanie; Rehemtulla, Alnawaz; Kazerooni, Ella A; Martinez, Fernando J; Ross, Brian D

    2012-11-01

    Chronic obstructive pulmonary disease (COPD) is increasingly being recognized as a highly heterogeneous disorder, composed of varying pathobiology. Accurate detection of COPD subtypes by image biomarkers is urgently needed to enable individualized treatment, thus improving patient outcome. We adapted the parametric response map (PRM), a voxel-wise image analysis technique, for assessing COPD phenotype. We analyzed whole-lung computed tomography (CT) scans acquired at inspiration and expiration of 194 individuals with COPD from the COPDGene study. PRM identified the extent of functional small airways disease (fSAD) and emphysema as well as provided CT-based evidence that supports the concept that fSAD precedes emphysema with increasing COPD severity. PRM is a versatile imaging biomarker capable of diagnosing disease extent and phenotype while providing detailed spatial information of disease distribution and location. PRM's ability to differentiate between specific COPD phenotypes will allow for more accurate diagnosis of individual patients, complementing standard clinical techniques.
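
    The parametric response map described in this record is, at its core, a voxel-wise classification of spatially registered inspiration and expiration CT scans. The sketch below shows that classification step under common assumptions (registration already done; -950 HU and -856 HU as the emphysema and air-trapping thresholds, which are widely used values but are not stated in the abstract).

```python
import numpy as np

def prm_classify(hu_insp, hu_exp, lung_mask,
                 insp_thresh=-950.0, exp_thresh=-856.0):
    """Label each lung voxel as normal (0), fSAD (1) or emphysema (2)."""
    labels = np.full(hu_insp.shape, -1, dtype=np.int8)   # -1 = outside lung
    inside = lung_mask.astype(bool)
    emphysema = inside & (hu_insp < insp_thresh) & (hu_exp < exp_thresh)
    fsad = inside & (hu_insp >= insp_thresh) & (hu_exp < exp_thresh)
    normal = inside & (hu_exp >= exp_thresh)
    labels[normal], labels[fsad], labels[emphysema] = 0, 1, 2
    return labels

# Toy volumes standing in for deformably registered inspiration/expiration CT.
rng = np.random.default_rng(0)
insp = rng.normal(-870, 60, size=(4, 64, 64))
expi = rng.normal(-800, 70, size=(4, 64, 64))
mask = np.ones(insp.shape, dtype=bool)

labels = prm_classify(insp, expi, mask)
total = mask.sum()
print("fSAD fraction:     ", round((labels == 1).sum() / total, 3))
print("emphysema fraction:", round((labels == 2).sum() / total, 3))
```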

  6. Prediction of Human Vertebral Compressive Strength Using Quantitative Computed Tomography Based Nonlinear Finite Element Method

    Directory of Open Access Journals (Sweden)

    Ahad Zeinali

    2007-12-01

    Introduction: Because of the important role of vertebral compressive fracture (VCF) in increasing the patients’ death rate and reducing their quality of life, many studies have been conducted for a noninvasive prediction of vertebral compressive strength based on bone mineral density (BMD) determination and, recently, finite element analysis. In this study, a QCT-voxel based nonlinear finite element method is used for predicting vertebral compressive strength. Material and Methods: Four thoracolumbar vertebrae were excised from 3 cadavers with an average age of 42 years. They were then put in a water phantom and were scanned using QCT. Using a computer program prepared in MATLAB, detailed voxel based geometry and mechanical characteristics of the vertebra were extracted from the CT images. The three dimensional finite element models of the samples were created using the ANSYS computer program. The compressive strength of each vertebral body was calculated based on a linearly elastic-linearly plastic model and large deformation analysis in ANSYS and was compared to the value measured experimentally for that sample. Results: Based on the obtained results, the QCT-voxel based nonlinear finite element method (FEM) can predict vertebral compressive strength more effectively and accurately than the common QCT-voxel based linear FEM. The difference between the predicted strength values using this method and the measured ones was less than 1 kN for all the samples. Discussion and Conclusion: It seems that the QCT-voxel based nonlinear FEM used in this study can predict vertebral strength more effectively and accurately for each vertebra by considering its detailed geometric and densitometric characteristics.

  7. Cone Beam X-ray Luminescence Computed Tomography Based on Bayesian Method.

    Science.gov (United States)

    Zhang, Guanglei; Liu, Fei; Liu, Jie; Luo, Jianwen; Xie, Yaoqin; Bai, Jing; Xing, Lei

    2017-01-01

    X-ray luminescence computed tomography (XLCT), which aims to achieve molecular and functional imaging by X-rays, has recently been proposed as a new imaging modality. Combining the principles of X-ray excitation of luminescence-based probes and optical signal detection, XLCT naturally fuses functional and anatomical images and provides complementary information for a wide range of applications in biomedical research. In order to improve the data acquisition efficiency of previously developed narrow-beam XLCT, a cone beam XLCT (CB-XLCT) mode is adopted here to take advantage of the useful geometric features of cone beam excitation. Practically, a major hurdle in using cone beam X-ray for XLCT is that the inverse problem here is seriously ill-conditioned, hindering the achievement of good image quality. In this paper, we propose a novel Bayesian method to tackle the bottleneck in CB-XLCT reconstruction. The method utilizes a local regularization strategy based on a Gaussian Markov random field to mitigate the ill-conditioning of CB-XLCT. An alternating optimization scheme is then used to automatically calculate all the unknown hyperparameters while an iterative coordinate descent algorithm is adopted to reconstruct the image with a voxel-based closed-form solution. Results of numerical simulations and mouse experiments show that the self-adaptive Bayesian method significantly improves the CB-XLCT image quality as compared with conventional methods.
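
    The reconstruction strategy summarized above (a Gaussian Markov random field prior plus coordinate descent with a per-voxel closed-form update) can be illustrated on a toy linear inverse problem. In the sketch below, the forward matrix, the chain-graph prior and the fixed regularization weight are stand-in assumptions; the paper's actual CB-XLCT system matrix, 3D neighbourhood structure and automatic hyperparameter estimation are not reproduced.

```python
import numpy as np

def coordinate_descent_map(A, y, L, lam, n_sweeps=200):
    """Minimize ||y - A x||^2 + lam * x^T L x one voxel at a time,
    using the closed-form update for each voxel (all else held fixed)."""
    n = A.shape[1]
    x = np.zeros(n)
    col_sq = (A ** 2).sum(axis=0)          # A_j^T A_j for each column
    r = y - A @ x                          # running residual
    for _ in range(n_sweeps):
        for j in range(n):
            r_j = r + A[:, j] * x[j]       # residual without voxel j's contribution
            prior = lam * (L[j] @ x - L[j, j] * x[j])
            x_new = (A[:, j] @ r_j - prior) / (col_sq[j] + lam * L[j, j])
            r -= A[:, j] * (x_new - x[j])  # keep the residual consistent
            x[j] = x_new
    return x

# Toy ill-conditioned problem: a 1D "phantom" with two luminescent inclusions.
rng = np.random.default_rng(1)
n_vox, n_meas = 40, 25
A = np.abs(rng.normal(size=(n_meas, n_vox))) * np.exp(-rng.uniform(0, 3, (n_meas, n_vox)))
x_true = np.zeros(n_vox)
x_true[8:12], x_true[25:30] = 1.0, 0.6
y = A @ x_true + 0.01 * rng.normal(size=n_meas)

# Chain-graph Laplacian acting as the Gaussian MRF precision structure.
L = 2 * np.eye(n_vox) - np.eye(n_vox, k=1) - np.eye(n_vox, k=-1)
x_hat = coordinate_descent_map(A, y, L, lam=0.05)
print("max |x_hat - x_true|:", np.abs(x_hat - x_true).max().round(3))
```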

  8. Chest computed tomography-based scoring of thoracic sarcoidosis: Inter-rater reliability of CT abnormalities

    Energy Technology Data Exchange (ETDEWEB)

    Heuvel, D.A.V. den; Es, H.W. van; Heesewijk, J.P. van; Spee, M. [St. Antonius Hospital Nieuwegein, Department of Radiology, Nieuwegein (Netherlands); Jong, P.A. de [University Medical Center Utrecht, Department of Radiology, Utrecht (Netherlands); Zanen, P.; Grutters, J.C. [University Medical Center Utrecht, Division Heart and Lungs, Utrecht (Netherlands); St. Antonius Hospital Nieuwegein, Center of Interstitial Lung Diseases, Department of Pulmonology, Nieuwegein (Netherlands)

    2015-09-15

    To determine inter-rater reliability of sarcoidosis-related computed tomography (CT) findings that can be used for scoring of thoracic sarcoidosis. CT images of 51 patients with sarcoidosis were scored by five chest radiologists for various abnormal CT findings (22 in total) encountered in thoracic sarcoidosis. Using intra-class correlation coefficient (ICC) analysis, inter-rater reliability was analysed and reported according to the Guidelines for Reporting Reliability and Agreement Studies (GRRAS) criteria. A pre-specified sub-analysis was performed to investigate the effect of training. Scoring was trained in a distinct set of 15 scans in which all abnormal CT findings were represented. Median age of the 51 patients (36 men, 70 %) was 43 years (range 26 - 64 years). All radiographic stages were present in this group. ICC ranged from 0.91 for honeycombing to 0.11 for nodular margin (sharp versus ill-defined). The ICC was above 0.60 in 13 of the 22 abnormal findings. Sub-analysis for the best-trained observers demonstrated an ICC improvement for all abnormal findings and values above 0.60 for 16 of the 22 abnormalities. In our cohort, reliability between raters was acceptable for 16 thoracic sarcoidosis-related abnormal CT findings. (orig.)
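
    The inter-rater analysis above relies on the intra-class correlation coefficient. A minimal sketch of a two-way random-effects, absolute-agreement, single-rater ICC (ICC(2,1), one common choice; the abstract does not specify the exact ICC form) is shown below on a synthetic ratings matrix, since the study's 51-patient, five-rater data are not available.

```python
import numpy as np

def icc_2_1(ratings):
    """Two-way random-effects, absolute-agreement, single-rater ICC.
    ratings: (n_subjects, n_raters) array of scores."""
    n, k = ratings.shape
    grand = ratings.mean()
    ss_rows = k * ((ratings.mean(axis=1) - grand) ** 2).sum()   # subjects
    ss_cols = n * ((ratings.mean(axis=0) - grand) ** 2).sum()   # raters
    ss_err = ((ratings - grand) ** 2).sum() - ss_rows - ss_cols
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Synthetic scores: 51 "patients" rated by 5 "radiologists" with rater noise.
rng = np.random.default_rng(2)
true_score = rng.integers(0, 4, size=51).astype(float)
ratings = true_score[:, None] + rng.normal(0, 0.5, size=(51, 5))
print("ICC(2,1) =", round(icc_2_1(ratings), 2))
```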

  9. Use of a Computed Tomography Based Approach to Validate Noninvasive Devices to Measure Rotational Knee Laxity.

    Science.gov (United States)

    Neumann, Simon; Maas, Stefan; Waldmann, Danièle; Ricci, Pierre-Louis; Zürbes, Arno; Arnoux, Pierre-Jean; Walter, Frédéric; Kelm, Jens

    2015-01-01

    The purpose of this study is to validate a noninvasive rotational knee laxity measuring device called "Rotameter P2" with an approach based on Computed Tomography (CT). This CT approach using X-rays is hence invasive and can be regarded as a precise reference method that may also be applied to similar devices. An error due to imperfect femur fixation was observed but can be neglected for small torques. The most significant estimation error is due to the unavoidable soft tissue rotation and hence flexibility in the measurement chain. The error increases with the applied torque. The assessment showed that the rotational knee angle measured with the Rotameter is still overestimated because of thigh and femur displacement, soft tissue deformation, and measurement artefacts adding up to a maximum of 285% error at +15 Nm for the Internal Rotation of female volunteers. It may therefore be questioned whether such noninvasive devices for measuring the Tibia-Femoral Rotation (TFR) can help diagnose knee pathologies and investigate ligament reconstructive surgery.

  10. Application of a computed tomography based cystic fibrosis scoring system to chest tomosynthesis

    Science.gov (United States)

    Söderman, Christina; Johnsson, Åse; Vikgren, Jenny; Rystedt, Hans; Ivarsson, Jonas; Rossi Norrlund, Rauni; Nyberg Andersson, Lena; Bâth, Magnus

    2013-03-01

    In the monitoring of progression of lung disease in patients with cystic fibrosis (CF), recurrent computed tomography (CT) examinations are often used. The relatively new imaging technique chest tomosynthesis (CTS) may be an interesting alternative in the follow-up of these patients due to its visualization of the chest in slices at radiation doses and costs significantly lower than is the case with CT. A first step towards introducing CTS imaging in the diagnostics of CF patients is to establish a scoring system appropriate for evaluating the severity of CF pulmonary disease based on findings in CTS images. Previously, several such CF scoring systems based on CT imaging have been published. The purpose of the present study was to develop a CF scoring system for CTS, by starting from an existing scoring system dedicated for CT images and making the modifications regarded as necessary to make it appropriate for use with CTS images. In order to determine any necessary changes, three thoracic radiologists independently used a scoring system dedicated for CT on both CT and CTS images from CF patients. The results of the scoring were jointly evaluated by all the observers, which led to suggestions for changes to the scoring system. Suggested modifications include excluding the scoring of air trapping and scoring the findings in quadrants of the image instead of in each lung lobe.

  11. A New Computed Tomography-Based Radiographic Method to Detect Early Loosening of Total Wrist Implants

    Energy Technology Data Exchange (ETDEWEB)

    Olivecrona, H.; Noz, M.E.; Maguire, G.Q. Jr; Zeleznik, M.P.; Sollerman, C.; Olivecrona, L. [Dept. of Hand Surgery, Soedersjukhuset, Stockholm (Sweden)

    2007-11-15

    Background: Diagnosis of loosening of total wrist implants is usually late using routine radiographs. Switching modality to computed tomography (CT) should aid in early diagnosis. Purpose: To propose and evaluate the accuracy of a new CT method for assessing loosening of the carpal component in total wrist arthroplasty. Material and Methods: A protocol encompassing volume registration of paired CT scans of patients with unexplained pain in a prosthetically replaced wrist (used in clinical routine) is presented. Scans are acquired as a dynamic examination under torsional load. Using volume registration, the carpal component of the prosthesis is brought into spatial alignment. After registration, prosthetic loosening is diagnosed by a shift in position of the bones relative to the prosthesis. This study is a preclinical validation of this method using a human cadaverous arm with a cemented total wrist implant and tantalum markers. Seven CT scans of the arm were acquired. The scans were combined into 21 pairs of CT volumes. The carpal component was registered in each scan pair, and the residual mismatch of the surrounding tantalum markers and bone was analyzed both visually and numerically. Results: The detection limit for prosthetic movement was less than 1 mm. Conclusion: The results of this study demonstrate that CT volume registration holds promise to improve detection of movement of the carpal component at an earlier stage than is obtainable with plain radiography.
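
    The key step in this record is rigid volume registration of paired CT scans on the prosthetic (carpal) component, after which any residual shift of the surrounding bone suggests loosening. The sketch below shows one way such a registration could be set up with SimpleITK; the library choice, file names, metric, and the metal-intensity mask threshold are all assumptions for illustration, not the software or settings used in the study.

```python
import SimpleITK as sitk

# Two CT volumes of the same wrist acquired under different torsional loads
# (file names are placeholders). After aligning on the prosthesis, bone motion
# relative to the prosthesis can be inspected on the subtracted/overlaid volumes.
fixed = sitk.ReadImage("wrist_load_A.nii.gz", sitk.sitkFloat32)
moving = sitk.ReadImage("wrist_load_B.nii.gz", sitk.sitkFloat32)

reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
reg.SetMetricSamplingStrategy(reg.RANDOM)
reg.SetMetricSamplingPercentage(0.1)
reg.SetInterpolator(sitk.sitkLinear)
reg.SetOptimizerAsRegularStepGradientDescent(learningRate=1.0,
                                             minStep=1e-4,
                                             numberOfIterations=200)
reg.SetOptimizerScalesFromPhysicalShift()
initial = sitk.CenteredTransformInitializer(
    fixed, moving, sitk.Euler3DTransform(),
    sitk.CenteredTransformInitializerFilter.GEOMETRY)
reg.SetInitialTransform(initial, inPlace=False)

# Restrict the metric to bright (metal) voxels so the rigid transform aligns
# the carpal component rather than the whole wrist; 2500 HU is an assumed cutoff.
reg.SetMetricFixedMask(fixed > 2500)

transform = reg.Execute(fixed, moving)
aligned = sitk.Resample(moving, fixed, transform, sitk.sitkLinear, 0.0)
sitk.WriteImage(aligned, "wrist_load_B_aligned.nii.gz")
```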

  12. Ultrasonic computed tomography based on full-waveform inversion for bone quantitative imaging

    Science.gov (United States)

    Bernard, Simon; Monteiller, Vadim; Komatitsch, Dimitri; Lasaygues, Philippe

    2017-09-01

    We introduce an ultrasonic quantitative imaging method for long bones based on full-waveform inversion. The cost function is defined as the difference in the L2-norm sense between observed data and synthetic results at a given iteration of the iterative inversion process. For simplicity, and in order to reduce the computational cost, we use a two-dimensional acoustic approximation. The inverse problem is solved iteratively based on a quasi-Newton technique called the Limited-memory Broyden-Fletcher-Goldfarb-Shanno method. We show how the technique can be made to work well for benchmark models consisting of a single cylinder, and then five cylinders, the latter case including significant multiple diffraction effects. We then show images obtained for a tibia-fibula bone pair model. Convergence is fast, typically in 15 to 30 iterations in practice in each frequency band used. We discuss the so-called ‘cycle skipping’ effect that can occur in such full waveform inversion techniques and cause them to remain trapped in a local minimum of the cost function. We illustrate strategies that can be used in practice to avoid this. Future work should include viscoelastic materials rather than acoustic, and real data instead of synthetic data.
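
    The optimization structure described above, an L2-norm misfit between observed and synthetic data minimized with the limited-memory BFGS quasi-Newton method, is sketched below on a toy problem. A small linear operator stands in for the acoustic wave-propagation solver, so this shows only the inversion loop, not the physics or the multi-frequency-band strategy of the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Toy "full-waveform-inversion-like" problem: recover a model vector from data
# generated by a linear forward operator, by minimizing an L2 misfit with L-BFGS.
rng = np.random.default_rng(3)
n_model, n_data = 30, 60
G = rng.normal(size=(n_data, n_model))           # stand-in forward operator
m_true = np.zeros(n_model)
m_true[10:20] = 1.0                              # "slow" inclusion in the model
d_obs = G @ m_true + 0.01 * rng.normal(size=n_data)

def misfit_and_gradient(m):
    residual = G @ m - d_obs
    cost = 0.5 * residual @ residual             # L2-norm misfit
    grad = G.T @ residual                        # adjoint of the forward operator
    return cost, grad

result = minimize(misfit_and_gradient, x0=np.zeros(n_model),
                  jac=True, method="L-BFGS-B", options={"maxiter": 100})
print("converged:", result.success, " final misfit:", round(result.fun, 6))
print("max model error:", np.abs(result.x - m_true).max().round(3))
```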

  13. Small field dose delivery evaluations using cone beam optical computed tomography-based polymer gel dosimetry

    Directory of Open Access Journals (Sweden)

    Timothy Olding

    2011-01-01

    This paper explores the combination of cone beam optical computed tomography with an N-isopropylacrylamide (NIPAM)-based polymer gel dosimeter for three-dimensional dose imaging of small field deliveries. Initial investigations indicate that cone beam optical imaging of polymer gels is complicated by scattered stray light perturbation. This can lead to significant dosimetry failures in comparison to dose readout by magnetic resonance imaging (MRI). For example, only 60% of the voxels from an optical CT dose readout of a 1 L dosimeter passed a two-dimensional Low's gamma test (at a 3%, 3 mm criterion), relative to a treatment plan for a well-characterized pencil beam delivery. When the same dosimeter was probed by MRI, a 93% pass rate was observed. The optical dose measurement was improved after modifications to the dosimeter preparation, matching its performance with the imaging capabilities of the scanner. With the new dosimeter preparation, 99.7% of the optical CT voxels passed a Low's gamma test at the 3%, 3 mm criterion and 92.7% at a 2%, 2 mm criterion. The fitted interjar dose responses of a small sample set of modified dosimeters prepared (a) from the same gel batch and (b) from different gel batches prepared on the same day were found to be in agreement to within 3.6% and 3.8%, respectively, over the full dose range. Without drawing any statistical conclusions, this experiment gives a preliminary indication that intrabatch or interbatch NIPAM dosimeters prepared on the same day should be suitable for dose sensitivity calibration.
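
    The gamma test referred to above (Low et al.) scores each measured dose point by the best combined dose-difference/distance-to-agreement match to the reference distribution; a point passes if its gamma value is at most 1. The brute-force 2D sketch below illustrates the 3%/3 mm criterion on toy data and is not an optimized or clinically validated implementation.

```python
import numpy as np

def gamma_pass_rate(ref, meas, spacing_mm, dd_percent=3.0, dta_mm=3.0):
    """Fraction of points passing a global-dose gamma test (brute force, 2D)."""
    dd = dd_percent / 100.0 * ref.max()               # global dose criterion
    search = int(np.ceil(dta_mm / spacing_mm)) + 1    # half-width of search window
    ny, nx = ref.shape
    passed = 0
    for i in range(ny):
        for j in range(nx):
            best = np.inf
            for di in range(-search, search + 1):
                for dj in range(-search, search + 1):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < ny and 0 <= jj < nx:
                        dist2 = (di ** 2 + dj ** 2) * spacing_mm ** 2
                        dose2 = (ref[ii, jj] - meas[i, j]) ** 2
                        best = min(best, dist2 / dta_mm ** 2 + dose2 / dd ** 2)
            passed += best <= 1.0
    return passed / (ny * nx)

# Toy Gaussian "pencil beam" dose, slightly shifted, standing in for the
# planned vs. measured distributions on a 1 mm grid.
y, x = np.mgrid[0:60, 0:60] * 1.0
ref = np.exp(-((x - 30) ** 2 + (y - 30) ** 2) / 150.0)
meas = np.exp(-((x - 31) ** 2 + (y - 30) ** 2) / 150.0)
print("gamma pass rate (3%/3 mm):", round(gamma_pass_rate(ref, meas, 1.0), 3))
```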

  14. Computed tomography-based distribution of involved lymph nodes in patients with upper esophageal cancer.

    Science.gov (United States)

    Li, M; Liu, Y; Xu, L; Huang, Y; Li, W; Yu, J; Kong, L

    2015-06-01

    Delineating the nodal clinical target volume (ctvn) remains a challenging task in patients with cervical or upper thoracic esophageal carcinoma (ec). In particular, the extent of the lymph area that should be included in the irradiation field remains controversial. In the present study, the extent of the ctvn was determined based on the incidence of lymph node involvement mapped by computed tomography (ct) imaging. Our study included 468 patients who were diagnosed with cervical and upper thoracic ec and who received staging information between June 2005 and April 2011. The anatomic distribution of metastatic regional lymph nodes was mapped using ct images and grouped using the levels established by the Radiation Therapy Oncology Group. The probability of the various groups being involved was examined. If a lymph node group had a probability of 10% or more of being involved, it was considered at high risk for metastasis, and elective treatment as part of the ctvn was recommended. Lymph node involvement was mapped by ct in 256 patients (54.7%). Not all lymph node groups should be included in the ctvn. For cervical lesions, the involved lymph nodes were located mainly between the hyoid bone and the arcus aortae; the recommended ctvn should consist of the neck lymph nodes at levels iii and iv (supraclavicular group) and thoracic groups 2 and 3P. In upper thoracic ec patients, most of the involved lymph nodes were distributed between the cricoid cartilage and the subcarinal area; the ctvn should cover the supraclavicular group and thoracic nodal groups 2, 3P, 4, 5, and 7. Our ct-based study indicates a specific distribution and incidence of metastatic lymph node groups in patients with cervical and upper thoracic ec. The results suggest that regional lymph node groups should be electively included in the ctvn for precise radiation administration.

  15. Studying primate carpal kinematics in three dimensions using a computed-tomography-based markerless registration method.

    Science.gov (United States)

    Orr, Caley M; Leventhal, Evan L; Chivers, Spencer F; Marzke, Mary W; Wolfe, Scott W; Crisco, Joseph J

    2010-04-01

    The functional morphology of the wrist pertains to a number of important questions in primate evolutionary biology, including that of hominins. Reconstructing locomotor and manipulative capabilities of the wrist in extinct species requires a detailed understanding of wrist biomechanics in extant primates and the relationship between carpal form and function. The kinematics of carpal movement, and the role individual joints play in providing mobility and stability of the wrist, is central to such efforts. However, there have been few detailed biomechanical studies of the nonhuman primate wrist. This is largely because of the complexity of wrist morphology and the considerable technical challenges involved in tracking the movements of the many small bones that compose the carpus. The purpose of this article is to introduce and outline a method adapted from human clinical studies of three-dimensional (3D) carpal kinematics for use in a comparative context. The method employs computed tomography of primate cadaver forelimbs in increments throughout the wrist's range of motion, coupled with markerless registration of 3D polygon models based on inertial properties of each bone. The 3D kinematic principles involved in extracting motion axis parameters that describe bone movement are reviewed. In addition, a set of anatomically based coordinate systems embedded in the radius, capitate, hamate, lunate, and scaphoid is presented for the benefit of other primate functional morphologists interested in studying carpal kinematics. Finally, a brief demonstration of how the application of these methods can elucidate the mechanics of the wrist in primates illustrates the closer-packing of carpals in chimpanzees than in orangutans, which may help to stabilize the midcarpus and produce a more rigid wrist beneficial for efficient hand posturing during knuckle-walking locomotion.

  16. Combining Computed Tomography-Based Bone Density Assessment with FRAX Screening in Men with Prostate Cancer.

    Science.gov (United States)

    McDonald, Andrew M; Jones, Joseph A; Cardan, Rex A; Saag, Kenneth S; Mayhew, David L; Fiveash, John B

    2016-10-01

    To investigate the addition of a computed tomography (CT)-based method of osteoporosis screening to FRAX without bone mineral density (BMD) fracture risk assessment in men undergoing radiotherapy for prostate cancer, we reviewed the records of all patients with localized prostate cancer treated with external beam radiotherapy at our institution between 2001 and 2012. The 10-yr probability of hip fracture was calculated using the FRAX algorithm without BMD. The CT attenuation of the L5 trabecular bone (L5CT) was assessed by contouring the trabecular bone on a single CT slice at the level of the midvertebral body and by averaging the Hounsfield units (HU) of all included voxels. L5CT values of 105 and 130 HU were used as screening thresholds. The clinical characteristics of additional patients identified by each L5CT screening threshold value were compared to patients whose estimated 10-yr risk of hip fracture was 3% or greater by FRAX without BMD. A total of 609 patients treated between 2001 and 2012 had CT scans available for review and complete clinical information allowing for FRAX without BMD risk calculation. Seventy-four (12.2%) patients had an estimated 10-yr risk of hip fracture of 3% or greater. An additional 22 (3.6%) and 71 (11.6%) patients were identified by CT screening when thresholds L5CT = 105 HU and L5CT = 130 HU were used, respectively. Compared to the group of patients identified by FRAX without BMD, the additional patients identified by CT screening at each L5CT threshold level tended to be younger and heavier, and were more likely to be African-American or treated without androgen deprivation therapy. These results suggest that the addition of CT-based screening to FRAX without BMD risk assessment identifies additional men with different underlying clinical characteristics who may be at risk for osteoporosis and may benefit from pharmacological therapy to increase BMD and reduce fracture risk.
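
    The CT-based screening step described above reduces to averaging the Hounsfield units inside an L5 trabecular bone contour on a single mid-vertebral slice and flagging patients below a threshold (105 or 130 HU in the study). A minimal sketch on synthetic data follows; the contour here is a placeholder circle rather than a real segmentation.

```python
import numpy as np

def l5ct_flag(ct_slice_hu, trabecular_mask, threshold_hu=130.0):
    """Mean trabecular HU on the slice and whether it falls below the threshold."""
    mean_hu = float(ct_slice_hu[trabecular_mask].mean())
    return mean_hu, mean_hu < threshold_hu

# Synthetic 512x512 CT slice with a low-density "vertebral body" region.
rng = np.random.default_rng(4)
ct_slice = rng.normal(40, 20, size=(512, 512))             # soft-tissue background
yy, xx = np.ogrid[:512, :512]
body = (yy - 300) ** 2 + (xx - 256) ** 2 < 40 ** 2          # placeholder contour
ct_slice[body] = rng.normal(95, 30, size=body.sum())        # osteopenic trabecular bone

mean_hu, flagged = l5ct_flag(ct_slice, body, threshold_hu=130.0)
print(f"L5CT = {mean_hu:.1f} HU, flag for osteoporosis work-up: {flagged}")
```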

  17. Proton radiography and proton computed tomography based on time-resolved dose measurements

    Science.gov (United States)

    Testa, Mauro; Verburg, Joost M.; Rose, Mark; Min, Chul Hee; Tang, Shikui; Hassane Bentefour, El; Paganetti, Harald; Lu, Hsiao-Ming

    2013-11-01

    We present a proof of principle study of proton radiography and proton computed tomography (pCT) based on time-resolved dose measurements. We used a prototype, two-dimensional, diode-array detector capable of fast dose rate measurements, to acquire proton radiographic images expressed directly in water equivalent path length (WEPL). The technique is based on the time dependence of the dose distribution delivered by a proton beam traversing a range modulator wheel in passive scattering proton therapy systems. The dose rate produced in the medium by such a system is periodic and has a unique pattern in time at each point along the beam path and thus encodes the WEPL. By measuring the time-dose pattern at the point of interest, the WEPL to this point can be decoded. If one measures the time-dose patterns at points on a plane behind the patient for a beam with sufficient energy to penetrate the patient, the obtained 2D distribution of the WEPL forms an image. The technique requires only a 2D dosimeter array and it uses only the clinical beam for a fraction of a second with negligible dose to the patient. We first evaluated the accuracy of the technique in determining the WEPL for static phantoms aiming at beam range verification of the brain fields of medulloblastoma patients. Accurate beam ranges for these fields can significantly reduce the dose to the cranial skin of the patient and thus the risk of permanent alopecia. Second, we investigated the potential features of the technique for real-time imaging of a moving phantom. Real-time tumor tracking by proton radiography could provide more accurate validation of tumor motion models due to the more sensitive dependence of the proton beam on tissue density compared with x-rays. Our radiographic technique is rapid (~100 ms) and simultaneous over the whole field; it can image mobile tumors without the interplay effect that is inherently challenging for methods based on pencil beams. Third, we present the reconstructed p

  18. Clinical impact of computed tomography-based image-guided brachytherapy for cervix cancer using the tandem-ring applicator - the Addenbrooke's experience.

    Science.gov (United States)

    Tan, L T; Coles, C E; Hart, C; Tait, E

    2009-04-01

    We report our initial 3-year experience of chemoradiotherapy for cervical cancer with computed tomography-based image-guided high dose rate (HDR) brachytherapy using the tandem-ring applicator. Twenty-eight patients were treated between February 2005 and December 2007. All patients received initial external beam radiotherapy (EBRT) followed by HDR brachytherapy (planned dose 21 Gy to point A in three fractions over 8 days). For each insertion, a computed tomography scan was obtained with the brachytherapy applicator in situ. The cervix, uterus and organs at risk (OAR) were contoured on the computed tomography images to create an individualised dosimetry plan. The D90 (the dose delivered to 90% of the tumour target), V100 (the percentage of tumour target volume receiving 100% of the prescribed dose) and the minimum dose in the most exposed 2 cm³ volume (D2cc) of rectum, bladder and bowel were recorded. The equivalent dose in 2 Gy fractions delivered by EBRT and brachytherapy was calculated. The 3-year cancer-specific survival was 81%, with a pelvic control rate of 96%. In 24 patients, a D90 ≥ 74 Gy (α/β = 10) was achieved. The only patient with local recurrence had a D90 of 63.8 Gy (α/β = 10). The overall actuarial risk of serious late morbidity was 14%. Seventeen patients had satisfactory OAR doses using the standard loading pattern. Seven patients had modifications to reduce the risk of toxicity, whereas two had modifications to improve the tumour dose. Comparison with a previous cohort of patients treated with chemoradiotherapy and a conventionally planned low dose rate triple source brachytherapy technique showed an improvement in local pelvic control of 20% (P=0.04). The implementation of a computed tomography-based tandem-ring HDR brachytherapy technique in conjunction with individual dose adaptation has resulted in a significant improvement in local control at Addenbrooke's without increasing the risk of serious toxicity, and with little
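
    The equivalent dose in 2 Gy fractions mentioned above is conventionally computed with the linear-quadratic conversion EQD2 = n·d·(d + α/β)/(2 + α/β). The sketch below applies it to the brachytherapy schedule given in the abstract (21 Gy to point A in three fractions, α/β = 10 for tumour); the EBRT prescription of 45 Gy in 25 fractions is an illustrative assumption, not taken from the abstract.

```python
def eqd2(n_fractions: int, dose_per_fraction: float, alpha_beta: float) -> float:
    """Equivalent dose in 2 Gy fractions from the linear-quadratic model."""
    total = n_fractions * dose_per_fraction
    return total * (dose_per_fraction + alpha_beta) / (2.0 + alpha_beta)

alpha_beta_tumour = 10.0
ebrt = eqd2(25, 1.8, alpha_beta_tumour)   # assumed EBRT: 45 Gy in 25 fractions
hdr = eqd2(3, 7.0, alpha_beta_tumour)     # abstract: 21 Gy to point A in 3 fractions
print(f"EBRT EQD2  = {ebrt:.1f} Gy")
print(f"HDR  EQD2  = {hdr:.1f} Gy")
print(f"Total EQD2 = {ebrt + hdr:.1f} Gy (alpha/beta = 10)")
```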

  19. Are we ready for positron emission tomography/computed tomography-based target volume definition in lymphoma radiation therapy?

    Science.gov (United States)

    Yeoh, Kheng-Wei; Mikhaeel, N George

    2013-01-01

    Fluorine-18 fluorodeoxyglucose (FDG)-positron emission tomography (PET)/computed tomography (CT) has become indispensable for the clinical management of lymphomas. With consistent evidence that it is more accurate than anatomic imaging in the staging and response assessment of many lymphoma subtypes, its utility continues to increase. There have therefore been efforts to incorporate PET/CT data into radiation therapy decision making and in the planning process. Further, there have also been studies investigating target volume definition for radiation therapy using PET/CT data. This article will critically review the literature and ongoing studies on the above topics, examining the value and methods of adding PET/CT data to the radiation therapy treatment algorithm. We will also discuss the various challenges and the areas where more evidence is required.

  20. Conventional four field radiotherapy versus computed tomography-based treatment planning in cancer cervix: A dosimetric study

    Directory of Open Access Journals (Sweden)

    Abhishek Gulia

    2013-01-01

    Background: With advancements in imaging, wide variations in pelvic anatomy have been observed, thus raising doubts about adequate target volume coverage by conventional external radiotherapy fields based on bony landmarks. The present study evaluates the need for integrating computed tomography (CT)-based planning in the treatment of carcinoma cervix. Aims: To estimate inadequacies in target volume coverage when using conventional planning based on bony landmarks. Materials and Methods: The study consisted of 50 patients. Target volume delineation was done on planning CT scans, according to the guidelines given in the literature. The volume of the target receiving 95% of the prescribed dose (V95) was calculated after superimposing a conventional four field box on the digitally reconstructed radiograph. The geographic miss with the conventional four field box technique was compared with the CT-based target volume delineation. Results: In 48 out of 50 patients, the conventional four field box failed to encompass the target volume. The areas of miss were at the superior and lateral borders of the anterior-posterior fields, and the anterior border of the lateral fields. The median V95 for conventional fields marked with bony landmarks was only 89.4% as compared to 93% for target delineation based on CT contouring. Conclusions: Our study shows inadequate target volume coverage with the conventional four field box technique. We recommend routine use of CT-based planning for treatment with radiotherapy in carcinoma cervix.

  1. Increased incidence of adrenal gland injury in blunt abdominal trauma: a computed tomography-based study from Pakistan

    Directory of Open Access Journals (Sweden)

    Aziz Muhammad Usman

    2014-02-01

    Objective: To determine the frequency of adrenal injuries in patients presenting with blunt abdominal trauma by computed tomography (CT). Methods: During a 6 month period from January 1, 2011 to June 30, 2011, 82 emergency CT examinations were performed in the setting of major abdominal trauma and retrospectively reviewed for adrenal gland injuries. Results: A total of 7 patients were identified as having adrenal gland injuries (6 males and 1 female). Two patients had isolated adrenal gland injuries. In the other 5 patients with nonisolated injuries, injuries to the liver (1 case), spleen (1 case), retroperitoneum (2 cases) and mesentery (4 cases) were identified. Overall, 24 cases with liver injuries (29%), 11 cases with splenic injuries (13%), 54 cases with mesenteric injuries (65%), 14 cases (17%) with retroperitoneal injuries and 9 cases with renal injuries were identified. Conclusion: Adrenal gland injury is identified in 7 patients (11.7%) out of a total of 82 patients who underwent CT after major abdominal trauma. Most of these cases were nonisolated injuries. Our experience indicates that adrenal injury resulting from trauma is more common than suggested by other reports. The rise in incidence of adrenal injuries could be attributed to the mode of injury.

  2. Spiral Computed Tomography Based Maxillary Sinus Imaging in Relation to Tooth Loss, Implant Placement and Potential Grafting Procedure

    Directory of Open Access Journals (Sweden)

    Reinhilde Jacobs

    2010-01-01

    Objectives: The purpose of the present study was to explore the maxillary sinus anatomy, its variations and volume in patients with a need for maxillary implant placement. Materials and Methods: Maxillary sinus data of 101 consecutive patients who underwent spiral computed tomography (CT) scans for preoperative implant planning in the maxilla at the Department of Periodontology, University Hospital, Catholic University of Leuven, Leuven, Belgium were retrospectively evaluated. The alveolar bone height was measured on serial cross-sectional images between alveolar crest and sinus floor, parallel to the tooth axis. In order to describe the size of the maxillary sinus, anteroposterior (AP) and mediolateral (ML) diameters of the sinus were measured. Results: The results indicated that the alveolar bone height was significantly higher in the premolar regions in comparison to the molar region (n = 46, P 4 mm mucosal thickening mostly at the level of the sinus floor. The present sample did not allow revealing any significant difference (P > 0.05) in maxillary sinus dimensions for partially dentate and edentulous subjects. Conclusions: Cross-sectional imaging can be used in order to obtain more accurate information on the morphology, variation, and the amount of maxillary bone adjacent to the maxillary sinus.

  3. Spiral computed tomography based maxillary sinus imaging in relation to tooth loss, implant placement and potential grafting procedure.

    Science.gov (United States)

    Shahbazian, Maryam; Xue, Dong; Hu, Yuqian; van Cleynenbreugel, Johan; Jacobs, Reinhilde

    2010-01-01

    The purpose of the present study was to explore the maxillary sinus anatomy, its variations and volume in patients with a need for maxillary implant placement. Maxillary sinus data of 101 consecutive patients who underwent spiral computed tomography (CT) scans for preoperative implant planning in the maxilla at the Department of Periodontology, University Hospital, Catholic University of Leuven, Leuven, Belgium were retrospectively evaluated. The alveolar bone height was measured on serial cross-sectional images between alveolar crest and sinus floor, parallel to the tooth axis. In order to describe the size of the maxillary sinus anteroposterior (AP) and mediolateral (ML) diameters of the sinus were measured. The results indicated that the alveolar bone height was significantly higher in the premolar regions in comparison to the molar region (n = 46, P maxillary sinuses were mostly located in the first premolar (49%) and second molar (84%) regions, respectively. Maxillary sinus septa were identified in 47% of the maxillary antra. Almost 2/3 (66%) of the patients showed major (> 4 mm) mucosal thickening mostly at the level of the sinus floor. The present sample did not allow revealing any significant difference (P > 0.05) in maxillary sinus dimensions for partially dentate and edentulous subjects. Cross-sectional imaging can be used in order to obtain more accurate information on the morphology, variation, and the amount of maxillary bone adjacent to the maxillary sinus.

  4. Single-energy computed tomography-based pulmonary perfusion imaging: Proof-of-principle in a canine model.

    Science.gov (United States)

    Yamamoto, Tokihiro; Kent, Michael S; Wisner, Erik R; Johnson, Lynelle R; Stern, Joshua A; Qi, Lihong; Fujita, Yukio; Boone, John M

    2016-07-01

    Radiotherapy (RT) that selectively avoids irradiating highly functional lung regions may reduce pulmonary toxicity, which is substantial in lung cancer RT. Single-energy computed tomography (CT) pulmonary perfusion imaging has several advantages (e.g., higher resolution) over other modalities and has great potential for widespread clinical implementation, particularly in RT. The purpose of this study was to establish proof-of-principle for single-energy CT perfusion imaging. Single-energy CT perfusion imaging is based on the following: (1) acquisition of end-inspiratory breath-hold CT scans before and after intravenous injection of iodinated contrast agents, (2) deformable image registration (DIR) for spatial mapping of those two CT image data sets, and (3) subtraction of the precontrast image data set from the postcontrast image data set, yielding a map of regional Hounsfield unit (HU) enhancement, a surrogate for regional perfusion. In a protocol approved by the institutional animal care and use committee, the authors acquired CT scans in the prone position for a total of 14 anesthetized canines (seven canines with normal lungs and seven canines with diseased lungs). The elastix algorithm was used for DIR. The accuracy of DIR was evaluated based on the target registration error (TRE) of 50 anatomic pulmonary landmarks per subject for 10 randomly selected subjects as well as on singularities (i.e., regions where the displacement vector field is not bijective). Prior to perfusion computation, HUs of the precontrast end-inspiratory image were corrected for variation in the lung inflation level between the precontrast and postcontrast end-inspiratory CT scans, using a model built from two additional precontrast CT scans at end-expiration and midinspiration. The authors also assessed spatial heterogeneity and gravitationally directed gradients of regional perfusion for normal lung subjects and diseased lung subjects using a two-sample two-tailed t-test. The mean TRE
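
    The final step of the single-energy CT perfusion method described above is a voxel-wise subtraction of the registered pre-contrast scan from the post-contrast scan, giving a regional HU-enhancement map as a perfusion surrogate. The sketch below assumes the deformable registration and inflation-level HU correction have already been applied; the arrays are placeholders for those registered volumes.

```python
import numpy as np

def hu_enhancement_map(pre_hu, post_hu, lung_mask):
    """Regional HU enhancement inside the lung mask; NaN elsewhere."""
    return np.where(lung_mask, post_hu - pre_hu, np.nan)

# Placeholder arrays standing in for registered pre/post-contrast breath-hold CT.
rng = np.random.default_rng(5)
shape = (8, 64, 64)
pre = rng.normal(-850, 40, size=shape)
post = pre + rng.normal(25, 10, size=shape)      # iodine enhancement
mask = np.ones(shape, dtype=bool)

perf = hu_enhancement_map(pre, post, mask)
print("mean regional HU enhancement:", round(float(np.nanmean(perf)), 1))
print("spatial heterogeneity (SD):  ", round(float(np.nanstd(perf)), 1))
```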

  5. Total body height estimation using sacrum height in Anatolian Caucasians: multidetector computed tomography-based virtual anthropometry

    Energy Technology Data Exchange (ETDEWEB)

    Karakas, Hakki Muammer [Inonu University Medical Faculty, Turgut Ozal Medical Center, Department of Radiology, Malatya (Turkey); Celbis, Osman [Inonu University Medical Faculty Turgut Ozal Medical Center, Department of Forensic Medicine, Malatya (Turkey); Harma, Ahmet [Inonu University Medical Faculty Turgut Ozal Medical Center, Department of Orthopaedics and Traumatology, Malatya (Turkey); Alicioglu, Banu [Trakya University Medical Faculty, Department of Radiology, Edirne (Turkey); Trakya University Health Sciences Institute, Department of Anatomy, Edirne (Turkey)

    2011-05-15

    Estimation of total body height is a major step when a subject has to be identified from his/her skeletal structures. In the presence of decomposed skeletons and missing bones, estimation is usually based on regression equations for intact long bones. If these bones are fragmented or missing, alternative structures must be used. In this study, the value of sacrum height (SH) in total body height (TBH) estimation was investigated in a contemporary population of adult Anatolian Caucasians. Sixty-six men (41.6 ± 14.9 years) and 43 women (41.1 ± 14.2 years) were scanned with 64-row multidetector computed tomography (MDCT) to obtain high-resolution anthropometric data. SH of midsagittal sections was electronically measured. The technique and methodology were validated on a standard skeletal model. Sacrum height was 111.2 ± 12.6 mm (77-138 mm) in men and 104.7 ± 8.2 mm (89-125 mm) in women. The difference between the two sexes regarding SH was significant (p < 0.0001). SH did not significantly correlate with age in men, whereas the correlation was significant in women (p < 0.03). The correlation between SH and the stature was significant in men (r = 0.427, p < 0.0001) and was insignificant in women. For men the regression equation was [Stature = (0.306 x SH) + 137.9] (r = 0.54, SEE = 56.9, p < 0.0001). Sacrum height is not susceptible to sex, or to age in men. In the presence of incomplete male skeletons, SH helps to determine the stature. This study is also one of the initial applications of MDCT in virtual anthropometric research. (orig.)
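
    The male regression equation reported above can be applied directly; the small worked example below assumes, based on the reported mean values, that SH is entered in millimetres and the estimated stature is returned in centimetres.

```python
def estimate_stature_male(sacrum_height_mm: float) -> float:
    """Stature estimate from the reported male equation: 0.306 * SH + 137.9."""
    return 0.306 * sacrum_height_mm + 137.9

# Minimum, mean and maximum male sacrum heights reported in the study.
for sh in (77.0, 111.2, 138.0):
    print(f"SH = {sh:5.1f} mm -> estimated stature {estimate_stature_male(sh):.1f} cm")
```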

  6. Investigation of four-dimensional computed tomography-based pulmonary ventilation imaging in patients with emphysematous lung regions.

    Science.gov (United States)

    Yamamoto, Tokihiro; Kabus, Sven; Klinder, Tobias; Lorenz, Cristian; von Berg, Jens; Blaffert, Thomas; Loo, Billy W; Keall, Paul J

    2011-04-07

    A pulmonary ventilation imaging technique based on four-dimensional (4D) computed tomography (CT) has advantages over existing techniques. However, physiologically accurate 4D-CT ventilation imaging has not been achieved in patients. The purpose of this study was to evaluate 4D-CT ventilation imaging by correlating ventilation with emphysema. Emphysematous lung regions are less ventilated and can be used as surrogates for low ventilation. We tested the hypothesis: 4D-CT ventilation in emphysematous lung regions is significantly lower than in non-emphysematous regions. Four-dimensional CT ventilation images were created for 12 patients with emphysematous lung regions as observed on CT, using a total of four combinations of two deformable image registration (DIR) algorithms: surface-based (DIR(sur)) and volumetric (DIR(vol)), and two metrics: Hounsfield unit (HU) change (V(HU)) and Jacobian determinant of deformation (V(Jac)), yielding four ventilation image sets per patient. Emphysematous lung regions were detected by density masking. We tested our hypothesis using the one-tailed t-test. Visually, different DIR algorithms and metrics yielded spatially variant 4D-CT ventilation images. The mean ventilation values in emphysematous lung regions were consistently lower than in non-emphysematous regions for all the combinations of DIR algorithms and metrics. V(HU) resulted in statistically significant differences for both DIR(sur) (0.14 ± 0.14 versus 0.29 ± 0.16, p = 0.01) and DIR(vol) (0.13 ± 0.13 versus 0.27 ± 0.15, p Jac) resulted in non-significant differences for both DIR(sur) (0.15 ± 0.07 versus 0.17 ± 0.08, p = 0.20) and DIR(vol) (0.17 ± 0.08 versus 0.19 ± 0.09, p = 0.30). This study demonstrated the strong correlation between the HU-based 4D-CT ventilation and emphysema, which indicates the potential for HU-based 4D-CT ventilation imaging to achieve high physiologic accuracy. A further study is needed to confirm these results.
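
    The hypothesis test described above compares mean 4D-CT ventilation inside and outside density-masked emphysematous regions. The sketch below mirrors that comparison on toy data; the -950 HU density-mask threshold and the two-sided t-test are assumptions (the abstract used a one-tailed test and does not state the threshold).

```python
import numpy as np
from scipy import stats

def compare_ventilation(ventilation, hu_insp, lung_mask, hu_thresh=-950.0):
    """Mean ventilation in density-masked emphysematous vs. other lung voxels."""
    lung = lung_mask.astype(bool)
    emphysema = lung & (hu_insp < hu_thresh)
    normal = lung & ~emphysema
    t, p = stats.ttest_ind(ventilation[emphysema], ventilation[normal])
    return ventilation[emphysema].mean(), ventilation[normal].mean(), p

# Toy registered volumes standing in for a 4D-CT ventilation map and the
# corresponding inspiratory CT of the same patient.
rng = np.random.default_rng(6)
hu = rng.normal(-880, 60, size=(4, 64, 64))
vent = np.clip(rng.normal(0.25, 0.1, size=hu.shape) + 0.002 * (hu + 950), 0, None)
mask = np.ones(hu.shape, dtype=bool)

v_emph, v_norm, p = compare_ventilation(vent, hu, mask)
print(f"mean ventilation: emphysema {v_emph:.3f} vs non-emphysema {v_norm:.3f} (p = {p:.2g})")
```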

  7. Healing Process of Osteonecrotic Lesions of the Femoral Head Following Transtrochanteric Rotational Osteotomy: A Computed Tomography-Based Study

    Science.gov (United States)

    Lakhotia, Devendra; Swaminathan, Siva; Oh, Jong Keon; Moon, Jun Gyu; Dwivedi, Chirayu; Hong, Suk Joo

    2017-01-01

    Background Transtrochanteric rotational osteotomy (TRO) is a controversial hip-preserving procedure with a variable success rate. The healing process of femoral head osteonecrosis after TRO has been poorly explained till now. This study aimed to evaluate the healing process of previously transposed necrotic lesion after a TRO for nontraumatic osteonecrosis of the femoral head using computed tomography (CT). Methods Among 52 patients (58 hips) who had preserved original femoral head after TRO, we retrospectively reviewed 27 patients (28 hips) who had undergone sequential CT scans and had no major complication following TRO. The average age was 34 years (range, 18 to 59 years). The mean follow-up period was 9.1 years. We evaluated the reparative process of the transposed osteonecrotic lesion with CT scans. Results Plain radiographs of the osteonecrotic lesion revealed sclerotic and lucent changes in 14 hips (50%) and normal bony architecture in the other 14 hips (50%) at the final follow-up. CT scans of the osteonecrotic lesions showed cystic changes with heterogeneous sclerosis in 13 hips (46%), normal trabecular bone with or without small cysts in 9 hips (32%), and fragmentation of the necrotic lesion in 6 hips (22%). Seventeen hips (60%) showed minimal (13 hips) to mild (4 hips) nonprogressive collapse of the transposed osteonecrotic area. The collapse of the transposed osteonecrotic area on the CT scan was significantly associated with the healing pattern (p = 0.009), as all 6 patients (6 hips) with fragmentation of the necrotic lesion had minimal (5 hips) to mild (1 hip) collapse. Furthermore, a significant association was found between the collapse of the transposed osteonecrotic area on the CT scan of 17 hips (60%) and postoperative Harris hip score (p = 0.021). We observed no differences among the healing patterns on CT scans with regard to age, gender, etiology, staging, preoperative lesion type, preoperative intact area, percentage of necrotic area

  8. Healing Process of Osteonecrotic Lesions of the Femoral Head Following Transtrochanteric Rotational Osteotomy: A Computed Tomography-Based Study.

    Science.gov (United States)

    Lakhotia, Devendra; Swaminathan, Siva; Shon, Won Yong; Oh, Jong Keon; Moon, Jun Gyu; Dwivedi, Chirayu; Hong, Suk Joo

    2017-03-01

    Transtrochanteric rotational osteotomy (TRO) is a controversial hip-preserving procedure with a variable success rate. The healing process of femoral head osteonecrosis after TRO has been poorly explained till now. This study aimed to evaluate the healing process of previously transposed necrotic lesion after a TRO for nontraumatic osteonecrosis of the femoral head using computed tomography (CT). Among 52 patients (58 hips) who had preserved original femoral head after TRO, we retrospectively reviewed 27 patients (28 hips) who had undergone sequential CT scans and had no major complication following TRO. The average age was 34 years (range, 18 to 59 years). The mean follow-up period was 9.1 years. We evaluated the reparative process of the transposed osteonecrotic lesion with CT scans. Plain radiographs of the osteonecrotic lesion revealed sclerotic and lucent changes in 14 hips (50%) and normal bony architecture in the other 14 hips (50%) at the final follow-up. CT scans of the osteonecrotic lesions showed cystic changes with heterogeneous sclerosis in 13 hips (46%), normal trabecular bone with or without small cysts in 9 hips (32%), and fragmentation of the necrotic lesion in 6 hips (22%). Seventeen hips (60%) showed minimal (13 hips) to mild (4 hips) nonprogressive collapse of the transposed osteonecrotic area. The collapse of the transposed osteonecrotic area on the CT scan was significantly associated with the healing pattern (p = 0.009), as all 6 patients (6 hips) with fragmentation of the necrotic lesion had minimal (5 hips) to mild (1 hip) collapse. Furthermore, a significant association was found between the collapse of the transposed osteonecrotic area on the CT scan of 17 hips (60%) and postoperative Harris hip score (p = 0.021). We observed no differences among the healing patterns on CT scans with regard to age, gender, etiology, staging, preoperative lesion type, preoperative intact area, percentage of necrotic area, direction of rotation and

  9. Myocardial Perfusion Imaging Versus Computed Tomography Angiography-Derived Fractional Flow Reserve Testing in Stable Patients With Intermediate-Range Coronary Lesions

    DEFF Research Database (Denmark)

    Nørgaard, Bjarne L; Gormsen, Lars C; Bøtker, Hans Erik

    2017-01-01

    BACKGROUND: Data on the clinical utility of coronary computed tomography angiography-derived fractional flow reserve (FFRCT) are sparse. In patients with intermediate (40-70%) coronary stenosis determined by coronary computed tomography angiography, we investigated the association of replacing st...

  10. Variabilities of Magnetic Resonance Imaging-, Computed Tomography-, and Positron Emission Tomography-Computed Tomography-Based Tumor and Lymph Node Delineations for Lung Cancer Radiation Therapy Planning.

    Science.gov (United States)

    Karki, Kishor; Saraiya, Siddharth; Hugo, Geoffrey D; Mukhopadhyay, Nitai; Jan, Nuzhat; Schuster, Jessica; Schutzer, Matthew; Fahrner, Lester; Groves, Robert; Olsen, Kathryn M; Ford, John C; Weiss, Elisabeth

    2017-09-01

    To investigate interobserver delineation variability for gross tumor volumes of primary lung tumors and associated pathologic lymph nodes using magnetic resonance imaging (MRI), and to compare the results with computed tomography (CT) alone- and positron emission tomography (PET)-CT-based delineations. Seven physicians delineated the tumor volumes of 10 patients for the following scenarios: (1) CT only, (2) PET-CT fusion images registered to CT ("clinical standard"), and (3) postcontrast T1-weighted MRI registered with diffusion-weighted MRI. To compute interobserver variability, the median surface was generated from all observers' contours and used as the reference surface. A physician labeled the interface types (tumor to lung, atelectasis (collapsed lung), hilum, mediastinum, or chest wall) on the median surface. Contoured volumes and bidirectional local distances between individual observers' contours and the reference contour were analyzed. Computed tomography- and MRI-based tumor volumes normalized relative to PET-CT-based volumes were 1.62 ± 0.76 (mean ± standard deviation) and 1.38 ± 0.44, respectively. Volume differences between the imaging modalities were not significant. Between observers, the mean normalized volumes per patient averaged over all patients varied significantly by a factor of 1.6 (MRI) and 2.0 (CT and PET-CT) (P = 4.10 × 10⁻⁵ to 3.82 × 10⁻⁹). The tumor-atelectasis interface had a significantly higher variability than other interfaces for all modalities combined (P = .0006). The interfaces with the smallest uncertainties were tumor-lung (on CT) and tumor-mediastinum (on PET-CT and MRI). Although MRI-based contouring showed overall larger variability than PET-CT, contouring variability depended on the interface type and was not significantly different between modalities, despite the limited observer experience with MRI. Multimodality imaging and combining different imaging characteristics might be the best approach to define

  11. Prospective, blinded trial of whole-body magnetic resonance imaging versus computed tomography positron emission tomography in staging primary and recurrent cancer of the head and neck.

    LENUS (Irish Health Repository)

    O'Neill, J P

    2012-02-01

    OBJECTIVES: To compare the use of computed tomography - positron emission tomography and whole-body magnetic resonance imaging for the staging of head and neck cancer. PATIENTS AND METHODS: From January to July 2009, 15 consecutive head and neck cancer patients (11 men and four women; mean age 59 years; age range 19 to 81 years) underwent computed tomography - positron emission tomography and whole-body magnetic resonance imaging for pre-therapeutic evaluation. All scans were staged, as per the American Joint Committee on Cancer tumour-node-metastasis classification, by two blinded consultant radiologists, in two sittings. Diagnoses were confirmed by histopathological examination of endoscopic biopsies, and in some cases whole surgical specimens. RESULTS: Tumour staging showed a 74 per cent concordance, node staging an 80 per cent concordance and metastasis staging a 100 per cent concordance, comparing the two imaging modalities. CONCLUSION: This study found radiological staging discordance between the two imaging modalities. Whole-body magnetic resonance imaging is an emerging staging modality with superior visualisation of metastatic disease, which does not require exposure to ionising radiation.

  12. Simulated lesion, human observer performance comparison between thin-section dedicated breast CT images versus computed thick-section simulated projection images of the breast

    Science.gov (United States)

    Chen, L.; Boone, J. M.; Abbey, C. K.; Hargreaves, J.; Bateni, C.; Lindfors, K. K.; Yang, K.; Nosratieh, A.; Hernandez, A.; Gazi, P.

    2015-04-01

    The objective of this study was to compare the lesion detection performance of human observers between thin-section computed tomography images of the breast and thick-section (>40 mm) simulated projection images of the breast. Three radiologists and six physicists each executed a two-alternative forced-choice (2AFC) study involving simulated spherical lesions placed mathematically into breast images produced on a prototype dedicated breast CT scanner. The breast image data sets from 88 patients were used to create 352 pairs of image data. Spherical lesions with diameters of 1, 2, 3, 5, and 11 mm were simulated and adaptively positioned into 3D breast CT image data sets; the native thin section (0.33 mm) images were averaged to produce images with different slice thicknesses; average section thicknesses of 0.33, 0.71, 1.5 and 2.9 mm were representative of breast CT; the average 43 mm slice thickness served to produce the simulated projection images of the breast. The percent correct of the human observers' responses was evaluated in the 2AFC experiments. Radiologists' lesion detection performance was significantly (p physicist observer; however, trends in performance were similar. Human observers demonstrate significantly better mass-lesion detection performance on thin-section CT images of the breast, compared to thick-section simulated projection images of the breast.

  13. Diagnostic accuracy of diffusion-weighted magnetic resonance imaging versus positron emission tomography/computed tomography for early response assessment of liver metastases to Y90-radioembolization.

    Science.gov (United States)

    Barabasch, Alexandra; Kraemer, Nils A; Ciritsis, Alexander; Hansen, Nienke L; Lierfeld, Marco; Heinzel, Alexander; Trautwein, Christian; Neumann, Ulf; Kuhl, Christiane K

    2015-06-01

    Patients with hepatic metastases who are candidates for Y90-radioembolization (Y90-RE) usually have advanced tumor stages with involvement of both liver lobes. Per current guidelines, these patients have usually undergone several cycles of potentially hepatotoxic systemic chemotherapy before Y90-RE is at all considered, requiring split (lobar) treatment sessions to reduce hepatic toxicity. Assessing response to Y90-RE early, that is, already after the first lobar session, would be helpful to avoid an ineffective and potentially hepatotoxic second lobar treatment. We investigated the accuracy with which diffusion-weighted magnetic resonance imaging (DWI-MRI) and positron emission tomography/computed tomography (PET/CT) can provide this information. An institutional review board-approved prospective intraindividual comparison trial on 35 patients who underwent fluorodeoxyglucose PET/CT and DWI-MRI within 6 weeks before and 6 weeks after Y90-RE to treat secondary-progressive liver metastases from solid cancers (20 colorectal, 13 breast, 2 other) was performed. An increase of minimal apparent diffusion coefficient (ADCmin) or decrease of maximum standard uptake value (SUVmax) by at least 30% was regarded as positive response. Long-term clinical and imaging follow-up was used to distinguish true- from false-response classifications. On the basis of long-term follow-up, 23 (66%) of 35 patients responded to the Y90 treatment. No significant changes of metastasis size or contrast enhancement were observable on pretreatment versus posttreatment CT or magnetic resonance images. However, overall SUVmax decreased from 8.0 ± 3.9 to 5.5 ± 2.2. Diffusion-weighted magnetic resonance imaging appears superior to PET/CT for early response assessment in patients with hepatic metastases of common solid tumors. It may be used in between lobar treatment sessions to guide further management of patients who undergo Y90-RE for hepatic metastases.
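
    As a minimal sketch of the response criterion quoted above, the function below flags a lesion as responding when ADCmin increases or SUVmax decreases by at least 30% relative to baseline; the function and variable names are illustrative and not taken from the paper.

```python
# Minimal sketch of the >=30% change criterion; names are illustrative.

def classify_response(adc_min_pre, adc_min_post, suv_max_pre, suv_max_post,
                      threshold=0.30):
    """Return (DWI-MRI response, PET/CT response) for one lesion.

    An increase of ADCmin (DWI-MRI) or a decrease of SUVmax (PET/CT) by at
    least `threshold` relative to baseline counts as a positive response.
    """
    adc_increase = (adc_min_post - adc_min_pre) / adc_min_pre
    suv_decrease = (suv_max_pre - suv_max_post) / suv_max_pre
    return adc_increase >= threshold, suv_decrease >= threshold


# Example: SUVmax falls from 8.0 to 5.5 (about -31%), ADCmin rises by only 10%.
print(classify_response(adc_min_pre=1.0e-3, adc_min_post=1.1e-3,
                        suv_max_pre=8.0, suv_max_post=5.5))
```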

  14. Thermal-stress analysis of ceramic laminate veneer restorations with different incisal preparations using micro-computed tomography-based 3D finite element models.

    Science.gov (United States)

    Celebi, Alper Tunga; Icer, Esra; Eren, Meltem Mert; Baykasoglu, Cengiz; Mugan, Ata; Yildiz, Esra

    2017-11-01

    The main objective of this study was to investigate the thermal behavior of ceramic laminate veneer restorations of the maxillary central incisor with different incisal preparations, such as butt joint and palatinal chamfer, using the finite element method. A further aim was to understand the effect of different thermal loads simulating the intake of hot and cold liquids in the mouth. Three-dimensional solid models of the sound tooth and the prepared veneer restorations were obtained from micro-computed tomography images. Each ceramic veneer restoration was made up of ceramic, luting resin cement and an adhesive layer, which were generated from the scanned images using computer-aided design software. The solid model also included the remaining dental tissues, such as the periodontal ligament and the surrounding cortical and spongy bone. Time-dependent linear thermal analyses were carried out to compare the temperature changes and stress distributions of the sound and restored tooth models. The liquid first contacts the crown area, where the maximum stresses were obtained. For the restorations, stresses on the palatinal surfaces were larger than on the buccal surfaces. Toward the interior tissues the effect of the thermal load diminished, and smaller stresses were obtained near the pulp and root-dentin regions. The palatinal chamfer restoration showed comparatively larger stresses than the butt joint preparation. In addition, cold thermal loading produced larger temperature changes and stress distributions than hot thermal loading, independent of the restoration technique. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. The Plastered Skulls from the Pre-Pottery Neolithic B Site of Yiftahel (Israel) – A Computed Tomography-Based Analysis

    Science.gov (United States)

    Slon, Viviane; Sarig, Rachel; Hershkovitz, Israel; Khalaily, Hamoudi; Milevski, Ianir

    2014-01-01

    Three plastered skulls, dating to the Pre-Pottery Neolithic B, were found at the site of Yiftahel, in the Lower Galilee (Israel). The skulls underwent refitting and restoration processes, details of which are described herein. All three belong to adults, of which two appear to be males and one appears to be a female. Virtual cross-sections were studied and a density analysis of the plaster was performed using computed tomography scans. These were utilized to yield information regarding the modeling process. Similarities and differences between the Yiftahel and other plastered skulls from the Levant are examined. The possible role of skull plastering within a society undergoing a shift from a hunting-gathering way of life to a food producing strategy is discussed. PMID:24586625

  16. [Evaluation on Ability to Detect the Intracranial Hematoma with Different Density Using C-Arm Cone-beam Computed Tomography Based on Animal Model].

    Science.gov (United States)

    Zhou, Mi; Zeng, Yongming; Yu, Renqiang; Zhou, Yang; Xu, Rui; Sun, Jingkun; Gao, Zhimei

    2016-02-01

    This study aims to evaluate the ability of C-arm cone-beam CT to detect intracranial hematomas in canine models. Twenty-one healthy canines were divided into seven groups of three animals each. Autologous blood and contrast agent (3 mL) were slowly injected into the left or right frontal lobe of each animal. Canines in the first group, the control group, were injected with autologous blood only, without contrast agent. Each animal in all seven groups was scanned with C-arm cone-beam CT and multislice computed tomography (MSCT) after 5 minutes. The attenuation values and their standard deviations in the hematoma and in uniform brain tissue were measured to calculate the image noise, signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR). A scale with scores of 1-3 was used to rate the quality of the reconstructed images of the different hematomas as a subjective evaluation, and all experimental data underwent statistical analysis. The results revealed that when the density of the hematoma was less than 65 HU, hematomas were not clearly visible on C-arm CT images; when the density was more than 65 HU, hematomas showed clearly on both C-arm CT and MSCT images and their scores were close. Agreement between the two physicians was highly reliable. Comparable results were obtained with C-arm cone-beam CT and MSCT in measuring SD value, SNR, and CNR. With a reasonable choice of the density detection range for intracranial hematoma, C-arm cone-beam CT could be effectively applied to monitoring intracranial hemorrhage during interventional diagnosis and treatment.
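
    The noise, SNR and CNR measurements described above can be illustrated with the short sketch below, which uses one common set of definitions (noise taken as the standard deviation of uniform brain tissue); the exact formulas used in the study are not given in the abstract.

```python
import numpy as np

def roi_stats(hu_values):
    """Mean attenuation (HU) and standard deviation inside a region of interest."""
    hu = np.asarray(hu_values, dtype=float)
    return hu.mean(), hu.std(ddof=1)

def snr_cnr(hematoma_roi, brain_roi):
    mean_h, _sd_h = roi_stats(hematoma_roi)
    mean_b, sd_b = roi_stats(brain_roi)
    noise = sd_b                       # image noise: SD of uniform brain tissue
    snr = mean_h / noise               # signal-to-noise ratio of the hematoma
    cnr = (mean_h - mean_b) / noise    # contrast-to-noise ratio vs. brain background
    return noise, snr, cnr

# Example with made-up HU samples (hematoma ~70 HU, brain ~35 HU).
rng = np.random.default_rng(0)
print(snr_cnr(rng.normal(70, 5, 200), rng.normal(35, 5, 200)))
```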

  17. A computed tomography-based spatial normalization for the analysis of [{sup 18}F]fluorodeoxyglucose positron emission tomography of the brain

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Hanna; Choi, Jae Yong; Ryu, Young Hoon; Lyoo, Chul Hyoung [Gangnam Severance Hospital, Yonsei University College of Medicine, Seoul (Korea, Republic of); Kim, Jin Su [Molecular Imaging Research Center, Korea Institute Radiological and Medical Science, Seoul(Korea, Republic of)

    2014-12-15

    We developed a new computed tomography (CT)-based spatial normalization method and CT template to demonstrate its usefulness in spatial normalization of positron emission tomography (PET) images with [{sup 18}F] fluorodeoxyglucose (FDG) PET studies in healthy controls. Seventy healthy controls underwent brain CT scan (120 KeV, 180 mAs, and 3 mm of thickness) and [{sup 18}F] FDG PET scans using a PET/CT scanner. T1-weighted magnetic resonance (MR) images were acquired for all subjects. By averaging skull-stripped and spatially-normalized MR and CT images, we created skull-stripped MR and CT templates for spatial normalization. The skull-stripped MR and CT images were spatially normalized to each structural template. PET images were spatially normalized by applying spatial transformation parameters to normalize skull-stripped MR and CT images. A conventional perfusion PET template was used for PET-based spatial normalization. Regional standardized uptake values (SUV) measured by overlaying the template volume of interest (VOI) were compared to those measured with FreeSurfer-generated VOI (FSVOI). All three spatial normalization methods underestimated regional SUV values by 0.3-20% compared to those measured with FSVOI. The CT-based method showed slightly greater underestimation bias. Regional SUV values derived from all three spatial normalization methods were correlated significantly (p < 0.0001) with those measured with FSVOI. CT-based spatial normalization may be an alternative method for structure-based spatial normalization of [18F] FDG PET when MR imaging is unavailable. Therefore, it is useful for PET/CT studies with various radiotracers whose uptake is expected to be limited to specific brain regions or highly variable within study population.

  18. Computed tomography-based prediction of the straight antegrade humeral nail's entry point and exposure of "critical types": truth or fiction?

    Science.gov (United States)

    Euler, Simon A; Hengg, Clemens; Boos, Matthias; Dornan, Grant J; Turnbull, Travis Lee; Wambacher, Markus; Kralinger, Franz S; Millett, Peter J; Petri, Maximilian

    2017-05-01

    Straight antegrade intramedullary nailing of proximal humerus fractures has shown promising clinical results. However, up to 36% of all humeri seem to be "critical types" in terms of the potential violation of the supraspinatus (SSP) tendon footprint by the nail's insertion zone. The aims of this study were to evaluate if a computed tomography (CT) scan could reliably predict the nail's entry point on the humeral head and if it would be possible to preoperatively estimate the individual risk of iatrogenic violation of the SSP tendon footprint by evaluating the uninjured contralateral humerus. Twenty matched pairs of human cadaveric shoulders underwent CT scans, and the entry point for an antegrade nail as well as measurements regarding critical distances between the entry point and the rotator cuff were determined. Next, gross anatomic measurements of the same data were performed and compared. Furthermore, specimens were reviewed for critical types. Overall, 42.5% of all specimens were found to be critical types. The CT measurements exhibited excellent intra-rater and inter-rater reliability (intraclass correlation coefficients >0.90). Similarly, excellent agreement between the CT scan and gross anatomic measurements in contralateral shoulders (intraclass correlation coefficients >0.88) was found. Assessing the uninjured contralateral side, CT can reliably predict the entry point in antegrade humeral nailing and preoperatively identify critical types of humeral heads at risk of iatrogenic implantation damage to the SSP tendon footprint. This study may help surgeons in the decision-making process on which surgical technique should be used without putting the patient at risk for iatrogenic, implant-related damage to the rotator cuff. Copyright © 2017 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.

  19. Four-Dimensional Computed Tomography Based Respiratory-Gated Radiotherapy with Respiratory Guidance System: Analysis of Respiratory Signals and Dosimetric Comparison

    Directory of Open Access Journals (Sweden)

    Jung Ae Lee

    2014-01-01

    Full Text Available Purpose. To investigate the effectiveness of a respiratory guidance system in 4-dimensional computed tomography (4DCT)-based respiratory-gated radiation therapy (RGRT) by comparing respiratory signals and dosimetric analyses of treatment plans. Methods. The respiratory amplitude and period of free, audio device-guided, and complex system-guided breathing were evaluated in eleven patients with lung or liver cancers. The dosimetric parameters were assessed by comparing the free-breathing CT plan and the 4DCT-based 30–70% maximal intensity projection (MIP) plan. Results. Complex system-guided breathing showed significantly less variation in respiratory amplitude and period compared with free or audio-guided breathing, in terms of the root mean square errors (RMSE) of full inspiration (P=0.031), full expiration (P=0.007), and period (P=0.007). The dosimetric parameters, including V5 Gy, V10 Gy, V20 Gy, V30 Gy, V40 Gy, and V50 Gy of normal liver or lung, were superior in the 4DCT MIP plan compared with the free-breathing CT plan. Conclusions. The reproducibility and regularity of respiratory amplitude and period were significantly improved with complex system-guided breathing compared with free or audio-guided breathing. In addition, the treatment plan based on the 4DCT MIP images acquired with complex system-guided breathing showed better normal tissue sparing than the plan based on the free-breathing CT.
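
    The RMSE-based regularity measure referred to above can be sketched as follows; taking the RMSE of the per-cycle end-inspiration positions about their mean is an assumption, since the abstract does not state the reference value used.

```python
import numpy as np

# Regularity metric: RMSE of per-cycle end-inspiration (or end-expiration)
# amplitudes, and of the cycle periods, about their mean value.

def rmse_about_mean(values):
    v = np.asarray(values, dtype=float)
    return float(np.sqrt(np.mean((v - v.mean()) ** 2)))

# Peak-inspiration amplitudes (mm) per breathing cycle, invented for illustration.
free_breathing_peaks = [11.8, 9.5, 13.1, 10.2, 12.6]
guided_peaks         = [11.0, 11.3, 10.8, 11.1, 11.2]

print("free breathing RMSE:", rmse_about_mean(free_breathing_peaks))
print("guided breathing RMSE:", rmse_about_mean(guided_peaks))
```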

  20. Prostate positioning using cone-beam computer tomography based on manual soft-tissue registration. Interobserver agreement between radiation oncologists and therapists

    Energy Technology Data Exchange (ETDEWEB)

    Jereczek-Fossa, B.A.; Pobbiati, C.; Fanti, P. [European Institute of Oncology, Department of Radiation Oncology, Milan (Italy); University of Milan, Milan (Italy); Santoro, L. [European Institute of Oncology, Department of Epidemiology and Biostatistics, Milan (Italy); Fodor, C.; Zerini, D. [European Institute of Oncology, Department of Radiation Oncology, Milan (Italy); Vigorito, S. [European Institute of Oncology, Department of Medical Physics, Milan (Italy); Baroni, G. [Politecnico di Milano, Department of Electronics Information and Bioengineering, Milan (Italy); De Cobelli, O. [European Institute of Oncology, Department of Urology, Milan (Italy); University of Milan, Milan (Italy); Orecchia, R. [European Institute of Oncology, Department of Radiation Oncology, Milan (Italy); National Center for Oncological Hadrontherapy (CNAO) Foundation, Pavia (Italy); University of Milan, Milan (Italy)

    2014-01-15

    To check the interobserver agreement between radiation oncologists and therapists (RTT) using an on- and off-line cone-beam computer tomography (CBCT) protocol for setup verification in the radiotherapy of prostate cancer. The CBCT data from six prostate cancer patients treated with hypofractionated intensity-modulated radiotherapy (IMRT) were independently reviewed off-line by four observers (one radiation oncologist, one junior and two senior RTTs) and benchmarked with on-line CBCT positioning performed by a radiation oncologist immediately prior to treatment. CBCT positioning was based on manual soft-tissue registration. Agreement between observers was evaluated using weighted Cohen's kappa statistics. In total, 152 CBCT-based prostate positioning procedures were reviewed by each observer. The mean (± standard deviation) of the differences between off- and on-line CBCT-simCT registration translations along the three directions (antero-posterior, latero-lateral and cranio-caudal) and rotation around the antero-posterior axis were - 0.7 (3.6) mm, 1.9 (2.7) mm, 0.9 (3.6) mm and - 1.8 (5.0) degrees, respectively. Satisfactory interobserver agreement was found, being substantial (weighted kappa > 0.6) in 10 of 16 comparisons and moderate (0.41-0.60) in the remaining six comparisons. CBCT interpretation performed by RTTs is comparable to that of radiation oncologists. Our study might be helpful in the quality assurance of radiotherapy and the optimization of competencies. Further investigation should include larger sample sizes, a greater number of observers and validated methodology in order to assess interobserver variability and its impact on high-precision prostate cancer IGRT. In the future, it should enable the wider implementation of complex and evolving radiotherapy technologies. (orig.)
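
    The weighted Cohen's kappa used for the agreement analysis can be sketched as below; linear weights are assumed here because the abstract does not specify the weighting scheme, and the example ratings are invented.

```python
import numpy as np

def weighted_kappa(rater_a, rater_b, n_categories):
    """Weighted Cohen's kappa for two observers rating on an ordinal scale."""
    a, b = np.asarray(rater_a), np.asarray(rater_b)
    # Observed agreement matrix (proportions).
    observed = np.zeros((n_categories, n_categories))
    for i, j in zip(a, b):
        observed[i, j] += 1
    observed /= observed.sum()
    # Expected matrix from the marginal distributions.
    expected = np.outer(observed.sum(axis=1), observed.sum(axis=0))
    # Linear disagreement weights: 0 on the diagonal, growing with |i - j|.
    idx = np.arange(n_categories)
    weights = np.abs(idx[:, None] - idx[None, :]) / (n_categories - 1)
    return 1.0 - (weights * observed).sum() / (weights * expected).sum()

# Example: two observers, three ordinal categories (0, 1, 2).
print(weighted_kappa([0, 1, 2, 1, 0, 2, 1], [0, 1, 2, 2, 0, 2, 1], 3))
```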

  1. Are pedicle screw perforation rates influenced by distance from the reference frame in multilevel registration using a computed tomography-based navigation system in the setting of scoliosis?

    Science.gov (United States)

    Uehara, Masashi; Takahashi, Jun; Ikegami, Shota; Kuraishi, Shugo; Shimizu, Masayuki; Futatsugi, Toshimasa; Oba, Hiroki; Kato, Hiroyuki

    2017-04-01

    Pedicle screw fixation is commonly employed for the surgical correction of scoliosis but carries a risk of serious neurovascular or visceral structure events during screw insertion. To avoid these complications, we have been using a computed tomography (CT)-based navigation system during pedicle screw placement. As this could also prolong operation time, multilevel registration for pedicle screw insertion for posterior scoliosis surgery was developed to register three consecutive vertebrae in a single time with CT-based navigation. The reference frame was set either at the caudal end of three consecutive vertebrae or at one or two vertebrae inferior to the most caudal registered vertebra, and then pedicle screws were inserted into the three consecutive registered vertebrae and into the one or two adjacent vertebrae. This study investigated the perforation rates of vertebrae at zero, one, two, three, or four or more levels above or below the vertebra at which the reference frame was set. This is a retrospective, single-center, single-surgeon study. One hundred sixty-one scoliosis patients who had undergone pedicle screw fixation were reviewed. Screw perforation rates were evaluated by postoperative CT. We evaluated 161 scoliosis patients (34 boys and 127 girls; mean±standard deviation age: 14.6±2.8 years) who underwent pedicle screw fixation guided by a CT-based navigation system between March 2006 and December 2015. A total of 2,203 pedicle screws were inserted into T2-L5 using multilevel registration with CT-based navigation. The overall perforation rates for Grade 1, 2, or 3, Grade 2 or 3 (major perforations), and Grade 3 perforations (violations) were as follows: vertebrae at which the reference frame was set: 15.9%, 6.1%, and 2.5%; one vertebra above or below the reference frame vertebra: 16.5%, 4.0%, and 1.2%; two vertebrae above or below the reference frame vertebra: 20.7%, 8.7%, and 2.3%; three vertebrae above or below the reference frame vertebra: 23

  2. Computed tomography-based morphometric analysis of cervical pedicles in Indian population: A pilot study to assess feasibility of transpedicular screw fixation

    Directory of Open Access Journals (Sweden)

    A R Patwardhan

    2012-01-01

    Full Text Available Background: Cervical transpedicular screw fixation is safe and is likely to become the gold standard for cervical spine fixation. However, the use of cervical transpedicular screws in the Asian population can be limited because the transverse pedicle diameter in this group may not be adequate to accommodate a 3.5-mm pedicle screw, risking injury to the vital structures located in close proximity to the pedicles. Thus, lateral mass fixation remains the mainstay of treatment. The present study evaluated the transverse cervical pedicle diameter of the C2-C7 vertebrae in a pilot study of 27 Indian subjects using computed tomography (CT) imaging and evaluated the feasibility of transpedicular screw fixation in them. Aims: To evaluate the feasibility of transpedicular screw fixation in the Indian population. Settings and Design: The cervical pedicle diameter differs between Asian and non-Asian populations. The authors studied the transverse pedicle diameter of C2-C7 of the cervical spine in the Indian population using CT measurements. This cross-sectional study was carried out at a tertiary care centre over four months, from October 2010 to December 2010. Material and Methods: Measurements of the cervical pedicles were performed on the CT workstation from CT images taken at 2.5-mm intervals. The transverse pedicle diameter was defined as the outermost diameter of the pedicle, taken perpendicular to the axis of the pedicle at its narrowest point, and measured in millimeters (±0.1 mm). Statistical Analysis: Descriptive statistics were used to represent the percentage of cervical pedicles with a transverse diameter of less than 5 mm in male and female subjects at the C2-C7 levels. Since no previous study had been done in India, we initiated the study with a sample size of 27 as a pilot study. The statistical analysis was performed using SPSS software. Results: The mean transverse diameters of the cervical pedicles of C2, C3, C4, C5, C6 and C7 in

  3. Nuclear stress perfusion imaging versus computed tomography coronary angiography for identifying patients with obstructive coronary artery disease as defined by conventional angiography: insights from the CorE-64 multicenter study

    Directory of Open Access Journals (Sweden)

    Yutaka Tanami

    2014-08-01

    Full Text Available We investigated the diagnostic accuracy of computed tomography angiography (CTA) versus myocardial perfusion imaging (MPI) for detecting obstructive coronary artery disease (CAD) as defined by conventional quantitative coronary angiography (QCA). Sixty-three patients who were enrolled in the CorE-64 multicenter study underwent CTA, MPI, and QCA imaging. All subjects were referred for cardiac catheterization with suspected or known coronary artery disease. The diagnostic accuracy of quantitative CTA and MPI for identifying patients with 50% or greater coronary arterial stenosis by QCA was evaluated using receiver operating characteristic (ROC) analysis. Pre-defined subgroups were patients with known CAD and those with a calcium score of 400 or over. Diagnostic accuracy by ROC analysis revealed a greater area under the curve (AUC) for CTA than MPI for all 63 patients: 0.95 [95% confidence interval (CI): 0.89-1.00] vs 0.65 (95% CI: 0.53-0.77), respectively (P<0.01). Sensitivity, specificity, positive and negative predictive values were 0.93, 0.95, 0.97, and 0.88, respectively, for CTA and 0.85, 0.45, 0.74, and 0.63, respectively, for MPI. In 48 patients without known CAD, the AUC was 0.96 for CTA and 0.67 for SPECT (P<0.01). There was no significant difference in AUC for CTA in patients with a calcium score below 400 versus over 400 (0.93 vs 0.95), but the AUC differed for SPECT (0.61 vs 0.95; P<0.01). In a direct comparison, CTA is markedly superior to MPI for detecting obstructive coronary artery disease in patients. Even in subgroups traditionally more challenging for CTA, SPECT does not offer similarly good diagnostic accuracy. CTA may be considered the non-invasive test of choice if diagnosis of obstructive CAD is the purpose of imaging.
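
    A non-parametric AUC of the kind compared above can be estimated with the Mann-Whitney statistic, as in the sketch below; the scores and labels are invented and only illustrate the computation, not the CorE-64 data.

```python
import numpy as np

def auc_mann_whitney(scores, labels):
    """AUC = probability that a random diseased patient scores higher than a
    random non-diseased patient (ties counted as 0.5)."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    pos, neg = scores[labels], scores[~labels]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

labels    = [1, 1, 1, 0, 0, 1, 0, 0, 1, 0]   # obstructive CAD by QCA (invented)
cta_score = [9, 8, 7, 2, 3, 9, 1, 4, 8, 2]   # e.g. maximal stenosis grade on CTA
mpi_score = [6, 3, 7, 5, 2, 4, 6, 3, 5, 4]   # e.g. perfusion defect score on MPI
print(auc_mann_whitney(cta_score, labels), auc_mann_whitney(mpi_score, labels))
```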

  4. Efficacy of Stent-Retriever Thrombectomy in Magnetic Resonance Imaging Versus Computed Tomographic Perfusion-Selected Patients in SWIFT PRIME Trial (Solitaire FR With the Intention for Thrombectomy as Primary Endovascular Treatment for Acute Ischemic Stroke).

    Science.gov (United States)

    Menjot de Champfleur, Nicolas; Saver, Jeffrey L; Goyal, Mayank; Jahan, Reza; Diener, Hans-Christoph; Bonafe, Alain; Levy, Elad I; Pereira, Vitor M; Cognard, Christophe; Yavagal, Dileep R; Albers, Gregory W

    2017-06-01

    The majority of patients enrolled in SWIFT PRIME trial (Solitaire FR With the Intention for Thrombectomy as Primary Endovascular Treatment for Acute Ischemic Stroke) had computed tomographic perfusion (CTP) imaging before randomization; 34 patients were randomized after magnetic resonance imaging (MRI). Patients with middle cerebral artery and distal carotid occlusions were randomized to treatment with tPA (tissue-type plasminogen activator) alone or tPA+stentriever thrombectomy. The primary outcome was the distribution of the modified Rankin Scale score at 90 days. Patients with the target mismatch profile for enrollment were identified on MRI and CTP. MRI selection was performed in 34 patients; CTP in 139 patients. Baseline National Institutes of Health Stroke Scale score was 17 in both groups. Target mismatch profile was present in 95% (MRI) versus 83% (CTP). A higher percentage of the MRI group was transferred from an outside hospital (P=0.02), and therefore, the time from stroke onset to randomization was longer in the MRI group (P=0.003). Time from emergency room arrival to randomization did not differ in CTP versus MRI-selected patients. Baseline ischemic core volumes were similar in both groups. Reperfusion rates (>90%/TICI [Thrombolysis in Cerebral Infarction] score 3) did not differ in the stentriever-treated patients in the MRI versus CTP groups. The primary efficacy analysis (90-day mRS score) demonstrated a statistically significant benefit in both subgroups (MRI, P=0.02; CTP, P=0.01). Infarct growth was reduced in the stentriever-treated group in both MRI and CTP groups. Time to randomization was significantly longer in MRI-selected patients; however, site arrival to randomization times were not prolonged, and the benefits of endovascular therapy were similar. URL: http://www.clinicaltrials.gov. Unique identifier: NCT01657461. © 2017 American Heart Association, Inc.

  5. A high-resolution computed tomography-based scoring system to differentiate the most infectious active pulmonary tuberculosis from community-acquired pneumonia in elderly and non-elderly patients

    Energy Technology Data Exchange (ETDEWEB)

    Yeh, Jun-Jun [Ditmanson Medical Foundation Chia-Yi Christian Hospital, Section of Thoracic Imaging, Department of Chest Medicine and Family Medicine, Chiayi City (China); Chia Nan University of Pharmacy and Science, Tainan (China); Meiho University, Pingtung (China); Pingtung Christian Hospital, Pingtung (China); Chen, Solomon Chih-Cheng; Chen, Cheng-Ren [Ditmanson Medical Foundation Chia-Yi Christian Hospital, Department of Medical Research, Chiayi City (China); Yeh, Ting-Chun; Lin, Hsin-Kai; Hong, Jia-Bin; Wu, Bing-Tsang [Ditmanson Medical Foundation Chia-Yi Christian Hospital, Department of Family Medicine, Chiayi City (China); Wu, Ming-Ting [Department of Radiology, Kaohsiung Veterans General Hospital, Section of Thoracic and Circulation Imaging, Kaohsiung (China); School of Medicine, National Yang Ming University, Faculty of Medicine, Taipei (China)

    2014-10-15

    The objective of this study was to use high-resolution computed tomography (HRCT) imaging to predict the presence of smear-positive active pulmonary tuberculosis (PTB) in elderly (at least 65 years of age) and non-elderly patients (18-65 years of age). Patients with active pulmonary infections seen from November 2010 through December 2011 received HRCT chest imaging, sputum smears for acid-fast bacilli and sputum cultures for Mycobacterium tuberculosis. Smear-positive PTB was defined as at least one positive sputum smear and a positive culture for M. tuberculosis. Multivariate logistic regression analyses were performed to determine the HRCT predictors of smear-positive active PTB, and a prediction score was developed on the basis of receiver operating characteristic curve analysis. Of 1,255 patients included, 139 were diagnosed with smear-positive active PTB. According to ROC curve analysis, the sensitivity, specificity, positive predictive value, negative predictive value, false positive rates and false negative rates were 98.6 %, 95.8 %, 78.5 %, 99.8 %, 4.2 % and 1.4 %, respectively, for diagnosing smear-positive active PTB in elderly patients, and 100.0 %, 96.9 %, 76.5 %, 100.0 %, 3.1 % and 0.0 %, respectively, for non-elderly patients. HRCT can assist in the early diagnosis of the most infectious active PTB, thereby preventing transmission and minimizing unnecessary immediate respiratory isolation. (orig.)
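
    The performance measures listed above (sensitivity, specificity, predictive values and error rates) follow directly from a 2x2 confusion matrix of the HRCT score against the sputum reference standard, as in the sketch below; the counts shown are illustrative, not the study's data.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)            # positive predictive value
    npv = tn / (tn + fn)            # negative predictive value
    fpr = fp / (fp + tn)            # false positive rate = 1 - specificity
    fnr = fn / (fn + tp)            # false negative rate = 1 - sensitivity
    return dict(sensitivity=sensitivity, specificity=specificity,
                ppv=ppv, npv=npv, fpr=fpr, fnr=fnr)

# Illustrative counts only.
print(diagnostic_metrics(tp=45, fp=10, fn=5, tn=140))
```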

  6. Comparison of Positron Emission Tomography Quantification Using Magnetic Resonance- and Computed Tomography-Based Attenuation Correction in Physiological Tissues and Lesions: A Whole-Body Positron Emission Tomography/Magnetic Resonance Study in 66 Patients.

    Science.gov (United States)

    Seith, Ferdinand; Gatidis, Sergios; Schmidt, Holger; Bezrukov, Ilja; la Fougère, Christian; Nikolaou, Konstantin; Pfannenberg, Christina; Schwenzer, Nina

    2016-01-01

    Attenuation correction (AC) in fully integrated positron emission tomography (PET)/magnetic resonance (MR) systems plays a key role for the quantification of tracer uptake. The aim of this prospective study was to assess the accuracy of standardized uptake value (SUV) quantification using MR-based AC in direct comparison with computed tomography (CT)-based AC of the same PET data set on a large patient population. Sixty-six patients (22 female; mean [SD], 61 [11] years) were examined by means of combined PET/CT and PET/MR (11C-choline, 18F-FDG, or 68Ga-DOTATATE) subsequently. Positron emission tomography images from PET/MR examinations were corrected with MR-derived AC based on tissue segmentation (PET(MR)). The same PET data were corrected using CT-based attenuation maps (μ-maps) derived from PET/CT after nonrigid registration of the CT to the MR-based μ-map (PET(MRCT)). Positron emission tomography SUVs were quantified placing regions of interest or volumes of interest in 6 different body regions as well as PET-avid lesions, respectively. The relative differences of quantitative PET values when using MR-based AC versus CT-based AC were varying depending on the organs and body regions assessed. In detail, the mean (SD) relative differences of PET SUVs were as follows: -7.8% (11.5%), blood pool; -3.6% (5.8%), spleen; -4.4% (5.6%)/-4.1% (6.2%), liver; -0.6% (5.0%), muscle; -1.3% (6.3%), fat; -40.0% (18.7%), bone; 1.6% (4.4%), liver lesions; -6.2% (6.8%), bone lesions; and -1.9% (6.2%), soft tissue lesions. In 10 liver lesions, distinct overestimations greater than 5% were found (up to 10%). In addition, overestimations were found in 2 bone lesions and 1 soft tissue lesion adjacent to the lung (up to 28.0%). Results obtained using different PET tracers show that MR-based AC is accurate in most tissue types, with SUV deviations generally of less than 10%. In bone, however, underestimations can be pronounced, potentially leading to inaccurate SUV quantifications. In
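
    The per-region relative differences reported above amount to the percentage deviation of the MR-based AC SUVs from the CT-based AC SUVs of the same PET data, as in the sketch below; variable names and example values are illustrative.

```python
import numpy as np

def relative_suv_difference(suv_mr_ac, suv_ct_ac):
    """Per-region percentage difference of MR-based AC relative to CT-based AC."""
    suv_mr = np.asarray(suv_mr_ac, dtype=float)
    suv_ct = np.asarray(suv_ct_ac, dtype=float)
    diff = 100.0 * (suv_mr - suv_ct) / suv_ct
    return diff.mean(), diff.std(ddof=1)

# Example: bone ROIs, where segmentation-based MR attenuation maps ignore
# cortical bone and therefore tend to underestimate uptake.
print(relative_suv_difference([3.1, 2.8, 3.4], [5.0, 4.9, 5.6]))
```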

   7. The Relationships between Metabolic Disorders (Hypertension, Dyslipidemia, and Impaired Glucose Tolerance) and Computed Tomography-Based Indices of Hepatic Steatosis or Visceral Fat Accumulation in Middle-Aged Japanese Men.

    Directory of Open Access Journals (Sweden)

    Kazutoshi Fujibayashi

    Full Text Available Most studies on the relationships between metabolic disorders (hypertension, dyslipidemia, and impaired glucose tolerance) and hepatic steatosis (HS) or visceral fat accumulation (VFA) have been cross-sectional, and thus these relationships remain unclear. We conducted a retrospective cohort study to clarify the relationships between components of metabolic disorders and HS/VFA. The participants were 615 middle-aged men who were free from serious liver disorders, diabetes, and HS/VFA and underwent multiple general health check-ups at our institution between 2009 and 2013. The data from the initial and final check-ups were used. HS and VFA were assessed by computed tomography. HS was defined as a liver to spleen attenuation ratio of ≤1.0. VFA was defined as a visceral fat cross-sectional area of ≥100 cm2 at the level of the navel. Metabolic disorders were defined using Japan's metabolic syndrome diagnostic criteria. The participants were divided into four groups based on the presence (+) or absence (-) of HS/VFA. The onset rates of each metabolic disorder were compared among the four groups. Among the participants, 521, 55, 24, and 15 were classified as HS(-)/VFA(-), HS(-)/VFA(+), HS(+)/VFA(-), and HS(+)/VFA(+), respectively, at the end of the study. Impaired glucose tolerance was more common among the participants that exhibited HS or VFA (p = 0.05). On the other hand, dyslipidemia was more common among the participants that displayed VFA (p = 0.01). It is likely that VFA is associated with impaired glucose tolerance and dyslipidemia, while HS might be associated with impaired glucose tolerance. Unfortunately, our study failed to detect associations between HS/VFA and metabolic disorders due to the low number of subjects that exhibited fat accumulation. Although our observational study had major limitations, we consider that it obtained some interesting results. HS and VFA might affect different metabolic disorders. Further large-scale longitudinal studies

  8. Comparison of the clinical accuracy of cervical (C2-C7) pedicle screw insertion assisted by fluoroscopy, computed tomography-based navigation, and intraoperative three-dimensional C-arm navigation

    Institute of Scientific and Technical Information of China (English)

    LIU Ya-jun; TIAN Wei; LIU Bo; LI Qin; HU Lin; LI Zhi-yu; YUAN Qiang; L(U) Yan-wei; SUN Yu-zhen

    2010-01-01

    Background The complicated anatomy of the cervical spine and the variation among pedicles reduce the accuracy and increase the risk of neurovascular complications associated with screw implantation in this region. In this study, we compared the accuracy of cervical (C2-C7) pedicle screw fixation assisted by X-ray fluoroscopy, computed tomography (CT)-based navigation, or intraoperative three-dimensional (3D) C-arm navigation. Methods This prospective cohort study was performed in 82 consecutive patients who underwent cervical pedicle screw fixation. The accuracy of screw insertion was assessed by postoperative CT scan with 3D reconstruction and graded as: excellent (screw completely within pedicle); acceptable (≤1 mm of screw outside the pedicle cortex); poor (>1 mm of screw outside the pedicle cortex). Results A total of 145 screws were inserted in 24 patients who underwent C-arm fluoroscopy. Of these, 96 screws (66.2%) were excellent, 37 (25.5%) were acceptable, and 12 (8.3%) were poor. One hundred and fifty-nine screws were inserted in 29 patients in the CT-based navigation group. Among these, 141 (88.7%) were excellent, 14 (8.8%) were acceptable, and 4 (2.5%) were poor. A total of 140 screws were inserted in 29 patients in the intraoperative 3D C-arm navigation group, of which 127 (90.7%) were excellent and 13 (9.3%) were acceptable. No severe or permanent neurovascular complications associated with screw insertion were observed in any patient. Conclusions CT-based and intraoperative 3D C-arm navigation were similarly accurate, and both were significantly more accurate than C-arm fluoroscopy for guiding cervical pedicle screw fixation. They were able to accurately guide the angle and depth of screw placement using visual 3D images. These two techniques are therefore preferable for high-risk cervical pedicle screw fixation. The ease and convenience of intraoperative 3D C-arm navigation suggests that it may replace virtual

  9. Four-dimensional computed tomography based assessment and analysis of lung tumor motion during free-breathing respiration%基于四维CT影像肺内肿瘤运动度的测量与分析

    Institute of Scientific and Technical Information of China (English)

    王彦; 包勇; 张黎; 樊卫; 邓小武; 陈明

    2010-01-01

    Objective To quantify respiration-induced motion of lung tumors using four-dimensional computed tomography (4DCT) images, to analyze the factors influencing this motion, and to identify the characteristics of tumors with large motion. Methods Forty-three patients with lung tumors underwent 4DCT scanning during free breathing before treatment, yielding 44 measurable intrapulmonary lesions. The gross tumor volume (GTV) was contoured by the same physician on each of the ten respiratory-phase data sets, the GTV centroids were determined with treatment planning software (ADAC Pinnacle 7.4f), and the motion amplitude of the centroid was measured in the superior-inferior (SI), left-right and anterior-posterior directions. Clinical variables and anatomic factors potentially related to tumor motion were analyzed statistically, and tumors with a motion amplitude greater than 5 mm in any direction were examined for common characteristics. Results Lung tumor motion was associated with T stage, GTV volume, the SI location of the tumor in the lung, and the degree of attachment to fixed structures such as the chest wall, mediastinum and spine. In 10 patients the tumor motion amplitude exceeded 5 mm; all of these tumors were located in the lower and posterior thorax, with the largest motion in the SI direction (maximum 14.4 mm). For 95% of the lung tumors, the motion amplitude was less than 11.8 mm in the SI direction, 4.6 mm in the anterior-posterior direction and 2.7 mm in the left-right direction. Conclusions Respiration-induced lung tumor motion is influenced by tumor location, volume, T stage and the degree of attachment to fixed structures. Isolated tumors in the lower lobes showed the largest motion, mainly in the SI direction, followed by tumors in the posterior segments of the upper lobes.
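
    The motion measurement described above reduces to the peak-to-peak excursion of the GTV centroid across the ten respiratory phases along each axis; the sketch below illustrates this with an invented set of centroid positions.

```python
import numpy as np

def motion_amplitude(centroids_mm):
    """Peak-to-peak centroid excursion (mm) along each axis."""
    c = np.asarray(centroids_mm, dtype=float)   # shape (n_phases, 3) = (SI, AP, LR)
    return c.max(axis=0) - c.min(axis=0)

# Ten phases of a lower-lobe tumor: large SI motion, small AP/LR motion (invented).
phases = np.array([[0.0, 0.0, 0.0], [3.1, 0.4, 0.2], [6.0, 0.9, 0.3],
                   [8.7, 1.3, 0.5], [10.2, 1.6, 0.6], [9.4, 1.5, 0.6],
                   [7.1, 1.1, 0.4], [4.6, 0.7, 0.3], [2.0, 0.3, 0.1],
                   [0.5, 0.1, 0.0]])
si, ap, lr = motion_amplitude(phases)
print(f"SI {si:.1f} mm, AP {ap:.1f} mm, LR {lr:.1f} mm")
```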

  10. Impacts of Digital Imaging versus Drawing on Student Learning in Undergraduate Biodiversity Labs

    Science.gov (United States)

    Basey, John M.; Maines, Anastasia P.; Francis, Clinton D.; Melbourne, Brett

    2014-01-01

    We examined the effects of documenting observations with digital imaging versus hand drawing in inquiry-based college biodiversity labs. Plant biodiversity labs were divided into two treatments, digital imaging (N = 221) and hand drawing (N = 238). Graduate-student teaching assistants (N = 24) taught one class in each treatment. Assessments…

  11. Scattered Neutron Tomography Based on A Neutron Transport Inverse Problem

    Energy Technology Data Exchange (ETDEWEB)

    William Charlton

    2007-07-01

    Neutron radiography and computed tomography are commonly used techniques to non-destructively examine materials. Tomography refers to the cross-sectional imaging of an object from either transmission or reflection data collected by illuminating the object from many different directions.

  12. Diffuse optical tomography based on time-resolved compressive sensing

    Science.gov (United States)

    Farina, A.; Betcke, M.; Di Sieno, L.; Bassi, A.; Ducros, N.; Pifferi, A.; Valentini, G.; Arridge, S.; D'Andrea, C.

    2017-02-01

    Diffuse Optical Tomography (DOT) can be described as a highly multidimensional problem generating a huge data set with long acquisition/computational times. Biological tissue behaves as a low pass filter in the spatial frequency domain, hence compressive sensing approaches, based on both patterned illumination and detection, are useful to reduce the data set while preserving the information content. In this work, a multiple-view time-domain compressed sensing DOT system is presented and experimentally validated on non-planar tissue-mimicking phantoms containing absorbing inclusions.

  13. Guided Wave Tomography Based on Full-Waveform Inversion.

    Science.gov (United States)

    Rao, Jing; Ratassepp, Madis; Fan, Zheng

    2016-02-29

    In this paper, a guided wave tomography method based on Full Waveform Inversion (FWI) is developed for accurate and high-resolution reconstruction of the remaining wall thickness in isotropic plates. The forward model is computed in the frequency domain by solving a full-wave equation in a two-dimensional acoustic model, accounting for higher-order effects such as diffractions and multiple scattering. Both numerical simulations and experiments were carried out to obtain the signals of a dispersive guided mode propagating through defects. The inversion was based on local optimization of a waveform misfit function between modeled and measured data, and was applied iteratively to discrete frequency components from low to high frequencies. The resulting wave velocity maps were then converted to thickness maps by the dispersion characteristics of the selected guided modes. The results suggest that the FWI method is capable of reconstructing the thickness map of an irregularly shaped defect accurately on a 10 mm thick plate, with a thickness error within 0.5 mm.
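
    As a toy illustration of the frequency-stepped misfit minimization described above, the sketch below inverts a linearized forward operator per frequency by gradient descent; the real method solves a full acoustic wave equation, so everything here (operators, frequencies, model) is an invented stand-in.

```python
import numpy as np

rng = np.random.default_rng(0)
n_model, n_data = 50, 80
true_model = np.zeros(n_model)
true_model[20:30] = 1.0                      # "defect" region in the model vector

frequencies = [50e3, 100e3, 200e3]           # swept low to high (Hz), illustrative
operators = {f: rng.normal(size=(n_data, n_model)) for f in frequencies}
measured = {f: operators[f] @ true_model for f in frequencies}

model = np.zeros(n_model)                    # starting model
for f in frequencies:                        # sweep frequencies low -> high
    A, d = operators[f], measured[f]
    step = 1.0 / np.linalg.norm(A.T @ A, 2)  # safe gradient-descent step size
    for _ in range(200):                     # local optimisation of the misfit
        residual = A @ model - d
        model -= step * (A.T @ residual)     # gradient of 0.5*||A m - d||^2

print("max reconstruction error:", np.abs(model - true_model).max().round(3))
```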

  14. Analysis of sagittal anatomic structure of upper airway in patients with ankylosing spondylitis: computed tomography-based three-dimensional reconstruction%强直性脊柱炎患者上气道矢状位解剖结构分析:CT三维重建法

    Institute of Scientific and Technical Information of China (English)

    王幸双; 汪小海; 李文媛; 佟琪; 朱斌

    2013-01-01

    Objective To investigate the characteristics of the sagittal anatomic structure of the upper airway in patients with ankylosing spondylitis using three-dimensional reconstruction based on computed tomography (CT). Methods Thirty-one male patients with ankylosing spondylitis, aged 20-60 yr (AS group), and 41 male patients without difficult airways, aged 20-60 yr (control group), who underwent spiral CT scanning of the head and neck in our hospital from January 2007 to February 2011, were enrolled in the study. Reconstructed images of the upper airway were obtained using an AW4.4 workstation, and six distances (D1-D6) and four angles (α-δ) were recorded and analyzed: (1) D1, the arc distance between the upper central incisor and the root of the epiglottis; D2, the distance between the upper central incisor and the root of the epiglottis; D3 and D4, the lengths of the maxilla and mandible; D5, the distance between the root of the epiglottis and the midpoint of the glottis; D6, the distance between the end of the mandible and the midpoint of the glottis; (2) angle α, the angle between lines D2 and D5; angle β, the angle between line D2 and the line from the lower edge of the upper central incisor to the midpoint of the glottis; angle γ, the angle between lines D4 and D6; angle δ, the angle formed at the trailing edge of the hard palate by the lower edge of the upper central incisor and the root of the epiglottis. Results Compared with the control group, no significant changes were found in D1, D2, D3, D4 and D5 (P > 0.05), while D6, angle α and angle δ were significantly increased and angle β and angle γ were decreased in the AS group (P < 0.05). Conclusion The anatomic structure of the upper airway in patients with ankylosing spondylitis shows characteristic changes, and a laryngoscope blade with a large degree of curvature may be helpful for successful tracheal intubation in these patients.

  15. Dynamic computed tomography based on spatio-temporal analysis in acute stroke: Preliminary study

    Energy Technology Data Exchange (ETDEWEB)

    Park, Ha Young; Pyeon, Do Yeong; Kim, Da Hye; Jung, Young Jin [Dongseo University, Busan (Korea, Republic of)

    2016-12-15

    Acute stroke is a common disease that requires fast diagnosis and treatment to save the patient's life; without prompt intervention, acute stroke may cause lifelong disability due to brain damage. To diagnose stroke, brain perfusion CT examination, combined where possible with rapid 3D angiography, has been widely used. However, a low-dose technique should be applied, since high radiation exposure may cause secondary damage to the patient. The resulting degradation of the measured CT images may interfere with clinical assessment, because blood vessel shapes on the CT image are significantly affected by Gaussian noise. In this study, we employed a spatio-temporal technique to analyze dynamic (brain perfusion) CT data and improve image quality for successful clinical diagnosis. The proposed technique removed Gaussian noise successfully and demonstrated the possibility of a new image segmentation technique for CT angiography. A qualitative evaluation conducted by skilled radiological technologists indicated significant quality improvement of the dynamic CT images. The proposed technique will be a useful tool for clinical application in brain perfusion CT examination.
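
    The abstract does not spell out the spatio-temporal analysis used; as a rough illustration of the underlying idea, the sketch below applies joint smoothing across time and space to a synthetic dynamic CT series, which suppresses Gaussian noise better than frame-by-frame filtering because the vessel signal is correlated over time while the noise is not.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(1)
n_time, ny, nx = 20, 64, 64
clean = np.zeros((n_time, ny, nx))
clean[:, 30:34, 20:44] = np.linspace(0, 200, n_time)[:, None, None]  # enhancing vessel
noisy = clean + rng.normal(0, 25, clean.shape)                       # Gaussian noise (HU)

# Joint spatio-temporal smoothing: sigma given as (time, y, x).
denoised = gaussian_filter(noisy, sigma=(2.0, 1.0, 1.0))

err_noisy = np.abs(noisy - clean).mean()
err_denoised = np.abs(denoised - clean).mean()
print(f"mean abs error: noisy {err_noisy:.1f} HU, denoised {err_denoised:.1f} HU")
```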

  16. Principle of diffraction enhanced imaging (DEI) and computed tomography based on DEI method

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    In the first part of this article, a more general DEI equation is derived using simple concepts. The new DEI equation not only covers everything that can be handled by the DEI equation proposed by Chapman, but also explains effects that cannot be explained with the old DEI equation, such as the noise background caused by the small-angle scattering reflected by the analyzer. In the second part, a DEI-PI-CT formula is proposed and the contour contrast caused by the extinction of the refracted beam is qualitatively explained; then, based on the work of Ando's group, two formulae for refraction CT with the DEI method are proposed. Combining one refraction CT formula proposed by Dilmanian with the two refraction CT formulae proposed by us, a complete CT algorithm framework can be built to reconstruct the three components of the gradient of the refractive index.

  17. Refractive index tomography based on optical coherence tomography and tomographic reconstruction algorithm

    Science.gov (United States)

    Kitazawa, Takahiro; Nomura, Takanori

    2017-09-01

    Refractive index (RI) tomography based on optical coherence tomography (OCT), rather than quantitative phase imaging (QPI), is proposed. In conventional RI tomography, the phase unwrapping process deteriorates measurement accuracy owing to unwrapping errors. To eliminate the unwrapping process, the introduction of OCT is proposed, because OCT directly provides optical thickness. The proposed method can improve measurement accuracy owing to the removal of the phase unwrapping error. The feasibility of the method is confirmed by numerical simulations and optical experiments. These results show that the proposed method can reduce measurement errors even when an object shows phase changes much larger than a wavelength.
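
    Under a strong simplification, the reconstruction idea above can be sketched as filtered back projection of the background-subtracted optical thickness (treated as the line integral of the refractive-index contrast); the phantom, angles and noise below are invented for illustration.

```python
import numpy as np
from skimage.transform import radon, iradon

n_background = 1.33
delta_n = np.zeros((128, 128))
yy, xx = np.mgrid[:128, :128]
delta_n[(yy - 70) ** 2 + (xx - 55) ** 2 < 15 ** 2] = 0.04   # inclusion (n = 1.37)

angles = np.linspace(0.0, 180.0, 90, endpoint=False)
optical_thickness = radon(delta_n, theta=angles)            # stand-in for OCT measurements
optical_thickness += np.random.default_rng(0).normal(0.0, 0.05, optical_thickness.shape)

# Filtered back projection of the optical-thickness sinogram, no phase unwrapping.
ri_map = n_background + iradon(optical_thickness, theta=angles, filter_name="ramp")
print("reconstructed RI at the inclusion centre:", round(float(ri_map[70, 55]), 3))
```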

  18. Pathological fracture prediction in patients with metastatic lesions can be improved with quantitative computed tomography based computer models.

    NARCIS (Netherlands)

    Tanck, E.J.M.; Aken, J.B. van; Linden, Y.M. van der; Schreuder, H.W.B.; Binkowski, M.; Huizenga, H.; Verdonschot, N.J.J.

    2009-01-01

    PURPOSE: In clinical practice, there is an urgent need to improve the prediction of fracture risk for cancer patients with bone metastases. The methods that are currently used to estimate fracture risk are dissatisfying, hence affecting the quality of life of patients with a limited life expectancy.

  19. Pathological fracture prediction in patients with metastatic lesions can be improved with quantitative computed tomography based computer models

    NARCIS (Netherlands)

    Tanck, Esther; Aken, van Jantien B.; Linden, van der Yvette M.; Schreuder, H.W. Bart; Binkowski, Marcin; Huizenga, Henk; Verdonschot, Nico

    2009-01-01

    Purpose: In clinical practice, there is an urgent need to improve the prediction of fracture risk for cancer patients with bone metastases. The methods that are currently used to estimate fracture risk are dissatisfying, hence affecting the quality of life of patients with a limited life expectancy.

  20. Comparative assessment of liver tumor motion using cine-magnetic resonance imaging versus 4-dimensional computed tomography.

    Science.gov (United States)

    Fernandes, Annemarie T; Apisarnthanarax, Smith; Yin, Lingshu; Zou, Wei; Rosen, Mark; Plastaras, John P; Ben-Josef, Edgar; Metz, James M; Teo, Boon-Keng

    2015-04-01

    To compare the extent of tumor motion between 4-dimensional CT (4DCT) and cine-MRI in patients with hepatic tumors treated with radiation therapy. Patients with liver tumors who underwent 4DCT and 2-dimensional biplanar cine-MRI scans during simulation were retrospectively reviewed to determine the extent of target motion in the superior-inferior, anterior-posterior, and lateral directions. Cine-MRI was performed over 5 minutes. Tumor motion from MRI was determined by tracking the centroid of the gross tumor volume using deformable image registration. Motion estimates from 4DCT were performed by evaluation of the fiducial, residual contrast (or liver contour) positions in each CT phase. Sixteen patients with hepatocellular carcinoma (n=11), cholangiocarcinoma (n=3), and liver metastasis (n=2) were reviewed. Cine-MRI motion was larger than 4DCT for the superior-inferior direction in 50% of patients by a median of 3.0 mm (range, 1.5-7 mm), the anterior-posterior direction in 44% of patients by a median of 2.5 mm (range, 1-5.5 mm), and laterally in 63% of patients by a median of 1.1 mm (range, 0.2-4.5 mm). Cine-MRI frequently detects larger differences in hepatic intrafraction tumor motion when compared with 4DCT most notably in the superior-inferior direction, and may be useful when assessing the need for or treating without respiratory management, particularly in patients with unreliable 4DCT imaging. Margins wider than the internal target volume as defined by 4DCT were required to encompass nearly all the motion detected by cine-MRI for some of the patients in this study. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. Comparative Assessment of Liver Tumor Motion Using Cine–Magnetic Resonance Imaging Versus 4-Dimensional Computed Tomography

    Energy Technology Data Exchange (ETDEWEB)

    Fernandes, Annemarie T. [Department of Radiation Oncology, University of Pennsylvania, Philadelphia, Pennsylvania (United States); Apisarnthanarax, Smith [Department of Radiation Oncology, University of Washington, Seattle, Washington (United States); Yin, Lingshu [Department of Radiation Oncology, University of Pennsylvania, Philadelphia, Pennsylvania (United States); Zou, Wei [Rutgers Cancer Institute of New Jersey, New Brunswick, New Jersey (United States); Rosen, Mark [Department of Radiology, University of Pennsylvania, Philadelphia, Pennsylvania (United States); Plastaras, John P.; Ben-Josef, Edgar; Metz, James M. [Department of Radiation Oncology, University of Pennsylvania, Philadelphia, Pennsylvania (United States); Teo, Boon-Keng, E-mail: kevin.teo@uphs.upenn.edu [Department of Radiation Oncology, University of Pennsylvania, Philadelphia, Pennsylvania (United States)

    2015-04-01

    Purpose: To compare the extent of tumor motion between 4-dimensional CT (4DCT) and cine-MRI in patients with hepatic tumors treated with radiation therapy. Methods and Materials: Patients with liver tumors who underwent 4DCT and 2-dimensional biplanar cine-MRI scans during simulation were retrospectively reviewed to determine the extent of target motion in the superior–inferior, anterior–posterior, and lateral directions. Cine-MRI was performed over 5 minutes. Tumor motion from MRI was determined by tracking the centroid of the gross tumor volume using deformable image registration. Motion estimates from 4DCT were performed by evaluation of the fiducial, residual contrast (or liver contour) positions in each CT phase. Results: Sixteen patients with hepatocellular carcinoma (n=11), cholangiocarcinoma (n=3), and liver metastasis (n=2) were reviewed. Cine-MRI motion was larger than 4DCT for the superior–inferior direction in 50% of patients by a median of 3.0 mm (range, 1.5-7 mm), the anterior–posterior direction in 44% of patients by a median of 2.5 mm (range, 1-5.5 mm), and laterally in 63% of patients by a median of 1.1 mm (range, 0.2-4.5 mm). Conclusions: Cine-MRI frequently detects larger differences in hepatic intrafraction tumor motion when compared with 4DCT most notably in the superior–inferior direction, and may be useful when assessing the need for or treating without respiratory management, particularly in patients with unreliable 4DCT imaging. Margins wider than the internal target volume as defined by 4DCT were required to encompass nearly all the motion detected by cine-MRI for some of the patients in this study.

  2. Experiments of Tomography-Based SAR Techniques with P-Band Polarimetric Data

    Science.gov (United States)

    Lombardini, F.; Pardini, M.

    2009-04-01

    New opportunities are arising in the synthetic aperture radar (SAR) observation of forest scenarios, especially with decimetric and metric radio wavelengths, which are capable of penetrating into volumes. Given its capability for three-dimensional imaging of the scattering properties of the observed scene, SAR Tomography (Tomo-SAR) is a good candidate for analyzing the vertical structure of the forest. In this work, results are presented of the application of tomography-based SAR techniques to P-band airborne data over a boreal forest from the ESA BioSAR-1 project. Results of an adaptive tomographic analysis are presented, including a low-resolution dataset that emulates a satellite acquisition. To mitigate the geometric perspective effects due to the poor range resolution, the principle of applying common-band pre-filtering to tomography is introduced. Then, a coherent layer canceller is derived so that interferometric techniques conceived for single-layer scenarios can be applied to two-layer scenarios. Finally, a stabilized adaptive polarimetric Tomo-SAR (PolTomo-SAR) method is proposed for estimating the 3D polarimetric scattering mechanism of the scene with low distortions.

  3. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  4. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months, activities were focused on data operations, on testing and reinforcing shift and operational procedures for data production and transfer, on MC production, and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  6. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  8. SU-C-206-03: Metal Artifact Reduction in X-Ray Computed Tomography Based On Local Anatomical Similarity

    Energy Technology Data Exchange (ETDEWEB)

    Dong, X; Yang, X; Rosenfield, J; Elder, E; Dhabaan, A [Emory University, Winship Cancer Institute, Atlanta, GA (United States)

    2016-06-15

    Purpose: Metal implants such as orthopedic hardware and dental fillings cause severe bright and dark streaking in reconstructed CT images. These artifacts decrease image contrast and degrade HU accuracy, leading to inaccuracies in target delineation and dose calculation. Additionally, such artifacts negatively impact patient set-up in image guided radiation therapy (IGRT). In this work, we propose a novel method for metal artifact reduction which utilizes the anatomical similarity between neighboring CT slices. Methods: Neighboring CT slices show similar anatomy. Based on this anatomical similarity, the proposed method replaces corrupted CT pixels with pixels from adjacent, artifact-free slices. A gamma map, which is the weighted summation of relative HU error and distance error, is calculated for each pixel in the artifact-corrupted CT image. The minimum value in each pixel’s gamma map is used to identify a pixel from the adjacent CT slice to replace the corresponding artifact-corrupted pixel. This replacement only occurs if the minimum value in a particular pixel’s gamma map is larger than a threshold. The proposed method was evaluated with clinical images. Results: Highly attenuating dental fillings and hip implants cause severe streaking artifacts on CT images. The proposed method eliminates the dark and bright streaking and improves the implant delineation and visibility. In particular, the image non-uniformity in the central region of interest was reduced from 1.88 and 1.01 to 0.28 and 0.35, respectively. Further, the mean CT HU error was reduced from 328 HU and 460 HU to 60 HU and 36 HU, respectively. Conclusions: The proposed metal artifact reduction method replaces corrupted image pixels with pixels from neighboring slices that are free of metal artifacts. This method proved capable of suppressing streaking artifacts, improving HU accuracy and image detectability.
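    As a rough illustration of the slice-replacement idea described above, the Python sketch below replaces flagged pixels with the best-matching pixel from a registered, artifact-free neighbouring slice. The gamma weighting, HU scale, search radius and acceptance rule are illustrative assumptions rather than the authors' published parameters, and the corrupt-pixel mask is assumed to be computed elsewhere.

        import numpy as np

        def replace_corrupted_pixels(corrupt_slice, ref_slice, corrupt_mask,
                                     search_radius=3, w_hu=0.5, w_dist=0.5,
                                     hu_scale=100.0, gamma_max=1.0):
            # For each flagged pixel, search a small window in the neighbouring slice,
            # score candidates by a weighted sum of relative HU difference and spatial
            # distance (a simplified "gamma"), and copy the best candidate if the
            # match is acceptable. All parameter values are placeholders.
            out = corrupt_slice.astype(np.float32).copy()
            ny, nx = out.shape
            for y, x in zip(*np.nonzero(corrupt_mask)):
                y0, y1 = max(0, y - search_radius), min(ny, y + search_radius + 1)
                x0, x1 = max(0, x - search_radius), min(nx, x + search_radius + 1)
                window = ref_slice[y0:y1, x0:x1].astype(np.float32)
                yy, xx = np.mgrid[y0:y1, x0:x1]
                dist = np.hypot(yy - y, xx - x) / search_radius
                gamma = w_hu * np.abs(window - out[y, x]) / hu_scale + w_dist * dist
                iy, ix = np.unravel_index(np.argmin(gamma), gamma.shape)
                if gamma[iy, ix] <= gamma_max:
                    out[y, x] = window[iy, ix]
            return out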

  9. A virtual sinogram method to reduce dental metallic implant artefacts in computed tomography-based attenuation correction for PET

    NARCIS (Netherlands)

    Abdoli, Mehrsima; Ay, Mohammad Reza; Ahmadian, Alireza; Zaidi, Habib

    2010-01-01

    Objective Attenuation correction of PET data requires accurate determination of the attenuation map (mu map), which represents the spatial distribution of linear attenuation coefficients of different tissues at 511 keV. The presence of high-density metallic dental filling material in head and neck X

  10. Phantom-less bone mineral density (BMD) measurement using dual energy computed tomography-based 3-material decomposition

    Science.gov (United States)

    Hofmann, Philipp; Sedlmair, Martin; Krauss, Bernhard; Wichmann, Julian L.; Bauer, Ralf W.; Flohr, Thomas G.; Mahnken, Andreas H.

    2016-03-01

    Osteoporosis is a degenerative bone disease usually diagnosed at the manifestation of fragility fractures, which particularly endanger the health of the elderly. To ensure timely therapeutic countermeasures, noninvasive and widely applicable diagnostic methods are required. Currently the primary quantifiable indicator for bone stability, bone mineral density (BMD), is obtained either by DEXA (Dual-energy X-ray absorptiometry) or qCT (quantitative CT). Both have respective advantages and disadvantages, with DEXA being considered the gold standard. For timely diagnosis of osteoporosis, another CT-based method is presented. A Dual Energy CT reconstruction workflow is being developed to estimate BMD from lumbar spine (L1-L4) DE-CT images. The workflow is ROI-based and automated for practical use. A dual energy 3-material decomposition algorithm is used to differentiate bone from soft tissue and fat attenuation. The algorithm uses material attenuation coefficients at different beam energy levels. The bone fraction of the three different tissues is used to calculate the amount of hydroxylapatite in the trabecular bone of the corpus vertebrae inside a predefined ROI. Calibrations have been performed to obtain volumetric bone mineral density (vBMD) without having to add a calibration phantom or to use special scan protocols or hardware. Accuracy and precision are dependent on image noise and comparable to qCT images. Clinical indications are in accordance with the DEXA gold standard. The decomposition-based workflow shows bone degradation effects normally not visible on standard CT images, which would induce errors in normal qCT results.
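    To make the per-voxel decomposition step concrete, the following sketch solves a small linear system for the bone, soft-tissue and fat volume fractions from the two measured attenuations; the coefficient matrix and the mineral density scaling are purely illustrative placeholders, not the calibrated values from this work.

        import numpy as np

        # Illustrative linear attenuation coefficients (1/cm) at the low- and
        # high-kVp beams for bone mineral, soft tissue and fat (placeholder values).
        MU = np.array([[0.60, 0.23, 0.19],    # low-kVp:  bone, soft tissue, fat
                       [0.40, 0.21, 0.18]])   # high-kVp: bone, soft tissue, fat

        def material_fractions(mu_low, mu_high):
            # Solve MU @ f = measured attenuations, with the constraint sum(f) = 1.
            A = np.vstack([MU, np.ones(3)])
            b = np.array([mu_low, mu_high, 1.0])
            return np.linalg.solve(A, b)      # volume fractions (bone, soft, fat)

        # Example: a trabecular-bone voxel measured at both energies.
        f_bone, f_soft, f_fat = material_fractions(0.259, 0.223)
        vbmd_mg_cm3 = f_bone * 1200.0         # assumed mineral density scaling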

  11. Computer

    CERN Document Server

    Atkinson, Paul

    2011-01-01

    The pixelated rectangle we spend most of our day staring at in silence is not the television, as many long feared, but the computer: the ubiquitous portal of work and personal lives. At this point, the computer is so common we almost don't notice it in our view. It's difficult to envision that not that long ago it was a gigantic, room-sized structure accessible only to a few, inspiring as much awe and respect as fear and mystery. Now that the machine has decreased in size and increased in popular use, the computer has become a prosaic appliance, little more noted than a toaster. These dramati...

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger at roughly 11 MB per event of RAW. The central collisions are more complex and...

  13. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride. Edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  15. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  16. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files with a high writing speed to tape.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier-1s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful tests prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  20. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  1. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  2. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  3. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  4. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, and we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  6. COMPUTING

    CERN Document Server

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity has been lower as the Run 1 samples are being completed and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. A GlideInWMS installation and its components are now deployed at CERN, adding to the GlideInWMS factory in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  9. An X-Ray Tomography Based Modeling Solution For Chemical Vapor Infiltration Of Ceramic Matrix Composites

    Science.gov (United States)

    Ros, William; Vignoles, Gérard L.; Germain, Christian

    2010-05-01

    A numerical tool for the simulation of Chemical Vapor Infiltration of carbon/carbon composites is introduced. The structure of the fibrous medium can be studied by high resolution X-Ray Computed Micro Tomography. Gas transport in various regimes is simulated by a random walk technique whilst the morphological evolution of the fluid/solid interface is handled by a Marching Cube technique. The program can be used to evaluate effective diffusivity and first order reaction rate. The numerical tool is validated by comparing computed effective properties of a straight slit pore with reactive walls to their analytical expression. Simulation of CVI processing of a real, complex medium is then presented.

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well-functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase the availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a four-fold increase in throughput with respect to the LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  11. Automation of Global Adjoint Tomography Based on ASDF and Workflow Management Tools

    Science.gov (United States)

    Lei, W.; Ruan, Y.; Bozdag, E.; Smith, J. A.; Modrak, R. T.; Krischer, L.; Chen, Y.; Lefebvre, M. P.; Tromp, J.

    2016-12-01

    Global adjoint tomography is computationally expensive, requiring thousands of wavefield simulations and massive data processing. Through a collaboration with the Oak Ridge National Laboratory computing group and an allocation on the `Titan' GPU-accelerated supercomputer, we have begun to assimilate waveform data from more than 4,000 earthquakes, from 1995 to 2015, in our inversions. However, since conventional file formats and signal processing tools were not designed for parallel processing of massive data volumes, use of such tools in high-resolution global inversions leads to major bottlenecks. To overcome such problems and allow for continued scientific progress, we designed the Adaptive Seismic Data Format (ASDF) and developed a set of processing tools based on ASDF, covering signal processing (pytomo3d), time window selection (pyflex) and adjoint source construction (pyadjoint). These new tools greatly enhance the reproducibility and accountability of our research while taking full advantage of parallel computing, showing superior scaling on modern computational platforms. The entire inversion workflow, intrinsically complex and sensitive to human errors, is carefully handled and automated by modern workflow management tools, preventing data contamination and saving a huge amount of time. Our starting model GLAD-M15 (Bozdag et al., 2016), an elastic model with transversely isotropic upper mantle, is based on 253 earthquakes and 15 nonlinear conjugate gradient iterations. We have now completed source inversions for more than 1,000 earthquakes and have started structural inversions using a quasi-Newton optimization algorithm. We will discuss the challenges of large-scale workflows on HPC systems, the solutions offered by our new adjoint tomography tools, and the initial tomographic results obtained using the new expanded dataset.

  12. 2D Magnetic resonance electrical property tomography based on B1(-) field mapping.

    Science.gov (United States)

    Yuqing Wan; Negishi, Michiro; Constable, R Todd

    2014-01-01

    Magnetic Resonance Electrical Property Tomography (MREPT) is a method to visualize electrical conductivity and permittivity distributions in the object. Traditional MREPT relies on either radio frequency (RF) transmit field (B1(+)) mapping, or the use of a transmit/receive RF coil, to compute tissue electrical conductivity and permittivity. This paper introduces an alternative approach based on the reconstructed receive field (B1(-)). By solving a system of homogeneous equations consisting of the signal ratios from multi-channel receive coils, the receive field distribution, with both magnitude and phase, can be computed. Similar to the B1(+)-based MREPT method, the conductivity and permittivity in the imaging object can be calculated from the B1(-) field. We demonstrated the feasibility of imaging electrical property contrasts through computer-simulated studies and phantom experiments. Although this study focuses on 2D reconstruction, the presented method can be extended to full 3D. This method can be applied to regular MR imaging collected with multi-channel receive coils; therefore, tissue anomalies based on electrical properties can potentially be revealed with higher imaging quality, providing useful information for clinical diagnosis.
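    As a hedged numerical sketch of the final electrical-property step only: given a complex receive-field map (obtained elsewhere, for example from the multi-channel signal-ratio system described above), a homogeneous Helmholtz-type relation yields conductivity. The Larmor frequency, grid spacing and the locally homogeneous-field assumption are illustrative.

        import numpy as np

        MU0 = 4e-7 * np.pi                    # vacuum permeability (H/m)

        def conductivity_from_field(b1, dx, larmor_hz=128e6):
            # Homogeneous Helmholtz approximation: sigma ~ Im(lap(B1)/B1) / (mu0*omega).
            # b1: complex 2D receive-field map (arbitrary units), dx: pixel size in metres.
            omega = 2.0 * np.pi * larmor_hz
            lap = sum(np.gradient(np.gradient(b1, dx, axis=a), dx, axis=a)
                      for a in (0, 1))
            return np.imag(lap / b1) / (MU0 * omega)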

  13. Eddy Current Tomography Based on a Finite Difference Forward Model with Additive Regularization

    Science.gov (United States)

    Trillon, A.; Girard, A.; Idier, J.; Goussard, Y.; Sirois, F.; Dubost, S.; Paul, N.

    2010-02-01

    Eddy current tomography is a nondestructive evaluation technique used for characterization of metal components. It is an inverse problem acknowledged as difficult to solve since it is both ill-posed and nonlinear. Our goal is to derive an inversion technique with improved tradeoff between quality of the results, computational requirements and ease of implementation. This is achieved by fully accounting for the nonlinear nature of the forward problem by means of a system of bilinear equations obtained through a finite difference modeling of the problem. The bilinear character of equations with respect to the electric field and the relative conductivity is taken advantage of through a simple contrast source inversion-like scheme. The ill-posedness is dealt with through the addition of regularization terms to the criterion, the form of which is determined according to computational constraints and the piecewise constant nature of the medium. Therefore an edge-preserving functional is selected. The performance of the resulting method is illustrated using 2D synthetic data examples.

  14. Mixed Total Variation and L1 Regularization Method for Optical Tomography Based on Radiative Transfer Equation

    Directory of Open Access Journals (Sweden)

    Jinping Tang

    2017-01-01

    Full Text Available Optical tomography is an emerging and important molecular imaging modality. The aim of optical tomography is to reconstruct optical properties of human tissues. In this paper, we focus on reconstructing the absorption coefficient based on the radiative transfer equation (RTE). It is an ill-posed parameter identification problem. Regularization methods have been broadly applied to reconstruct the optical coefficients, such as the total variation (TV) regularization and the L1 regularization. In order to better reconstruct the piecewise constant and sparse coefficient distributions, TV and L1 norms are combined as the regularization. The forward problem is discretized with the discontinuous Galerkin method in the spatial variable and the finite element method in the angular variable. The minimization problem is solved by a Jacobian-based Levenberg-Marquardt type method which is equipped with a split Bregman algorithm for the L1 regularization. We use the adjoint method to compute the Jacobian matrix, which dramatically improves the computation efficiency. By comparison with other image reconstruction methods based on TV and L1 regularizations, the simulation results show the validity and efficiency of the proposed method.
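    To make the mixed regularization concrete, the small sketch below shows the two penalty terms and the shrinkage (soft-thresholding) operator on which a split Bregman treatment of the L1 term relies; the RTE forward solver, DG/FEM discretization and the Levenberg-Marquardt outer loop of the paper are not reproduced, and the weights alpha and beta are placeholders.

        import numpy as np

        def soft_threshold(v, t):
            # Shrinkage operator: proximal map of t*||.||_1, the workhorse of split Bregman.
            return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

        def mixed_penalty(x, alpha, beta):
            # Anisotropic total variation plus L1 penalty for a 2D coefficient map x.
            tv = np.abs(np.diff(x, axis=0)).sum() + np.abs(np.diff(x, axis=1)).sum()
            return alpha * tv + beta * np.abs(x).sum()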

  15. Image reconstruction of fluorescent molecular tomography based on the tree structured Schur complement decomposition

    Directory of Open Access Journals (Sweden)

    Wang Jiajun

    2010-05-01

    Full Text Available Background: The inverse problem of fluorescent molecular tomography (FMT) often involves complex large-scale matrix operations, which may lead to unacceptable computational errors and complexity. In this research, a tree structured Schur complement decomposition strategy is proposed to accelerate the reconstruction process and reduce the computational complexity. Additionally, an adaptive regularization scheme is developed to mitigate the ill-posedness of the inverse problem. Methods: The global system is decomposed level by level with the Schur complement system along two paths in the tree structure. The resultant subsystems are solved in combination with the biconjugate gradient method. The mesh for the inverse problem is generated incorporating the prior information. During the reconstruction, the regularization parameters are adaptive not only to the spatial variations but also to the variations of the objective function to tackle the ill-posed nature of the inverse problem. Results: Simulation results demonstrate that the strategy of the tree structured Schur complement decomposition obviously outperforms the previous methods, such as the conventional Conjugate-Gradient (CG) and the Schur CG methods, in both reconstruction accuracy and speed. As compared with the Tikhonov regularization method, the adaptive regularization scheme can significantly mitigate the ill-posedness of the inverse problem. Conclusions: The methods proposed in this paper can significantly improve the reconstructed image quality of FMT and accelerate the reconstruction process.
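    The elementary building block behind the tree-structured decomposition is the Schur complement elimination of one block of unknowns; a minimal dense-matrix sketch is shown below. The paper applies this recursively along the tree and solves the resulting subsystems with a biconjugate gradient method, whereas this sketch uses direct solves for brevity.

        import numpy as np

        def schur_solve(A, B, C, D, f, g):
            # Solve the block system [[A, B], [C, D]] [x; y] = [f; g] via the
            # Schur complement S = D - C A^{-1} B.
            Ainv_B = np.linalg.solve(A, B)
            Ainv_f = np.linalg.solve(A, f)
            S = D - C @ Ainv_B
            y = np.linalg.solve(S, g - C @ Ainv_f)
            x = Ainv_f - Ainv_B @ y
            return x, y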

  16. Sound field of thermoacoustic tomography based on a modified finite-difference time-domain method

    Institute of Scientific and Technical Information of China (English)

    ZHANG Chi; WANG Yuanyuan

    2009-01-01

    A modified finite-difference time-domain (FDTD) method is proposed for the sound field simulation of thermoacoustic tomography (TAT) in a medium with inhomogeneous acoustic speed. First, the basic equations of the TAT are discretized to difference ones by the FDTD. Then the electromagnetic pulse, the excitation source of the TAT, is modified twice to eliminate the error introduced by high frequency electromagnetic waves. Computer simulations are carried out to validate this method. It is shown that the FDTD method has better accuracy than the commonly used time-of-flight (TOF) method in TAT with inhomogeneous acoustic speed. The error of the FDTD is ten times smaller than that of the TOF in simulations with acoustic speed differences larger than 50%. This FDTD method is thus an efficient one for the sound field simulation of the TAT and can provide the theoretical basis for the study of reconstruction algorithms of the TAT in acoustically heterogeneous media.
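    As an illustration of the kind of finite-difference time-domain update involved, the sketch below advances the 2D scalar wave equation on a heterogeneous speed map with a simple leapfrog scheme. The electromagnetic-pulse modification and the TAT-specific source model from the paper are not included, boundaries are periodic rather than absorbing, and the time step is assumed to satisfy the CFL condition dt <= dx / (c_max * sqrt(2)).

        import numpy as np

        def fdtd_2d(c, src, nt, dx, dt):
            # Leapfrog update of p_tt = c^2 * lap(p) with a point source.
            # c: speed map (m/s); src: (iy, ix, waveform); periodic boundaries via np.roll.
            ny, nx = c.shape
            p_prev = np.zeros((ny, nx))
            p = np.zeros((ny, nx))
            iy, ix, wav = src
            frames = []
            for n in range(nt):
                lap = (np.roll(p, 1, 0) + np.roll(p, -1, 0) +
                       np.roll(p, 1, 1) + np.roll(p, -1, 1) - 4.0 * p) / dx**2
                p_next = 2.0 * p - p_prev + (c * dt) ** 2 * lap
                if n < len(wav):
                    p_next[iy, ix] += wav[n]     # inject the excitation waveform
                p_prev, p = p, p_next
                frames.append(p.copy())
            return frames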

  17. Beam hardening correction for interior tomography based on exponential formed model and radon inversion transform

    Science.gov (United States)

    Chen, Siyu; Zhang, Hanming; Li, Lei; Xi, Xiaoqi; Han, Yu; Yan, Bin

    2016-10-01

    X-ray computed tomography (CT) has been extensively applied in industrial non-destructive testing (NDT). However, in practical applications, the X-ray beam polychromaticity often results in beam hardening problems for image reconstruction. The beam hardening artifacts, which manifest as cupping, streaks and flares, not only degrade the image quality, but also disturb the subsequent analyses. Unfortunately, conventional CT scanning requires that the scanned object be completely covered by the field of view (FOV); state-of-the-art beam hardening correction methods only consider the ideal scanning configuration, and often suffer problems for interior tomography due to the projection truncation. Aiming at this problem, this paper proposes a beam hardening correction method based on the Radon inversion transform for interior tomography. Experimental results show that, compared to the conventional correction algorithms, the proposed approach achieves excellent performance in both beam hardening artifact reduction and truncation artifact suppression. Therefore, the presented method has significant theoretical and practical value for artifact correction in industrial CT.

  18. Transducer combination for high-quality ultrasound tomography based on speed of sound imaging

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Hun; Park, Kwan Kyu [Dept. of Mechanical Engineering, Hanyang University, Seoul (Korea, Republic of)

    2016-02-15

    The type of ultrasound transducer used influences the quality of a reconstructed ultrasound image. This study analyzed the effect of transducer type on ultrasound computed tomography (UCT) image quality. The UCT was modeled in an ultrasound simulator by using a 5 cm anatomy model and a ring-shaped 5 MHz array of 128 transducers, which considered attenuation, refraction, and reflection. Speed-of-sound images were reconstructed by the Radon transform as the UCT image modality. Acoustic impedance images were also reconstructed by the delay-and-sum (DAS) method, which considered the speed of sound information. To determine the optimal combination of transducers, point-source, flat, and focused transducers were tested in combination as transmitters and receivers; UCT images were constructed from each combination. The combination of point-source/flat transducer as transmitting and receiving devices presented the best reconstructed image quality. In UCT implementation, the combination of a flat transducer for transmitting and a point transducer for receiving permitted acceptable image quality.
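    A minimal sketch of the speed-of-sound reconstruction step, assuming scikit-image is available and simplifying the ring-array geometry to parallel rays: times of flight are line integrals of slowness, so filtered back-projection of the pixel-normalised time-of-flight sinogram recovers a slowness map whose reciprocal is the speed of sound. The detector spacing and the clipping value are assumptions.

        import numpy as np
        from skimage.transform import iradon     # assumes scikit-image

        def speed_of_sound_map(tof_sino_s, thetas_deg, pixel_mm):
            # tof_sino_s: (n_detectors, n_angles) times of flight in seconds along
            # parallel rays; dividing by the pixel size gives the unit-spacing
            # sinogram of slowness that iradon expects.
            slowness = iradon(tof_sino_s / (pixel_mm * 1e-3), theta=thetas_deg,
                              filter_name='ramp', circle=True)   # s/m per pixel
            slowness = np.clip(slowness, 1.0 / 4000.0, None)      # cap speeds at 4000 m/s
            return 1.0 / slowness                                 # speed of sound (m/s)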

  19. Locating Impedance Change in Electrical Impedance Tomography Based on Multilevel BP Neural Network

    Institute of Scientific and Technical Information of China (English)

    彭源; 莫玉龙

    2003-01-01

    Electrical impedance tomography (EIT) is a new computer tomography technology, which reconstructs an impedance (resistivity, conductivity) distribution, or change of impedance, by making voltage and current measurements on the object's periphery. Image reconstruction in EIT is an ill-posed, non-linear inverse problem. A method for locating the impedance change in EIT is proposed in this paper, in which a multilevel BP neural network (MBPNN) is used to express the non-linear relation between the impedance change inside the object and the voltage change measured on the surface of the object. Thus, the location of the impedance change can be determined from the measured voltage variation on the surface. The impedance change is then reconstructed using a linear approximation method. MBPNN can determine the impedance change location accurately without a long training time. It alleviates some noise effects and can be expanded, ensuring high precision and spatial resolution of the reconstructed image that are not achievable with the back-projection method.
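    In the same spirit, a single feed-forward regressor (rather than the paper's multilevel BP network) can be trained to map boundary-voltage changes to the perturbation location. Everything below is an assumption for illustration: the 208-measurement layout, the toy Gaussian "forward model" standing in for a real EIT solver, and the layer sizes.

        import numpy as np
        from sklearn.neural_network import MLPRegressor   # assumes scikit-learn

        rng = np.random.default_rng(0)
        n_meas = 208                                       # e.g. a 16-electrode protocol
        centers = rng.uniform(-1.0, 1.0, size=(2000, 2))   # perturbation locations (x, y)

        # Toy forward model: each measurement responds with a Gaussian-like
        # sensitivity to the perturbation position (stand-in for a real solver).
        probe = rng.uniform(-1.0, 1.0, size=(n_meas, 2))
        dv = np.exp(-np.sum((centers[:, None, :] - probe[None, :, :]) ** 2, axis=2) / 0.1)

        net = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=3000, random_state=0)
        net.fit(dv, centers)                               # voltage changes -> (x, y)
        xy_hat = net.predict(dv[:5])                       # locate five perturbations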

  20. In vivo bioluminescence tomography based on multi-view projection and 3D surface reconstruction

    Science.gov (United States)

    Zhang, Shuang; Wang, Kun; Leng, Chengcai; Deng, Kexin; Hu, Yifang; Tian, Jie

    2015-03-01

    Bioluminescence tomography (BLT) is a powerful optical molecular imaging modality, which enables non-invasive real-time in vivo imaging as well as 3D quantitative analysis in preclinical studies. In order to solve the inverse problem and reconstruct inner light sources accurately, prior structural information is commonly necessary and obtained from computed tomography or magnetic resonance imaging. This strategy requires an expensive hybrid imaging system, a complicated operation protocol and the possible involvement of ionizing radiation. The overall robustness highly depends on the fusion accuracy between the optical and structural information. In this study we present a pure optical bioluminescence tomographic system (POBTS) and a novel BLT method based on multi-view projection acquisition and 3D surface reconstruction. The POBTS acquired a sparse set of white light surface images and bioluminescent images of a mouse. Then the white light images were applied to an approximate surface model to generate a high quality textured 3D surface reconstruction of the mouse. After that we integrated multi-view luminescent images based on the previous reconstruction, and applied an algorithm to calibrate and quantify the surface luminescent flux in 3D. Finally, the internal bioluminescence source reconstruction was achieved with this prior information. A BALB/c mouse model bearing a 4T1-fLuc breast tumor was used to evaluate the performance of the new system and technique. Compared with the conventional hybrid optical-CT approach using the same inverse reconstruction method, the reconstruction accuracy of this technique was improved. The distance error between the actual and reconstructed internal source was decreased by 0.184 mm.

  1. Measurement of diabetic wounds with optical coherence tomography-based air-jet indentation system and a material testing system.

    Science.gov (United States)

    Choi, M-C; Cheung, K-K; Ng, G Y-F; Zheng, Y-P; Cheing, G L-Y

    2015-11-01

    The material testing system is a conventional but destructive method for measuring the biomechanical properties of wound tissues in basic research. The recently developed optical coherence tomography-based air-jet indentation system is a non-destructive method for measuring these properties of soft tissues in a non-contact manner. The aim of the study was to examine the correlation between the biomechanical properties of wound tissues measured by the two systems. Young male Sprague-Dawley rats with streptozotocin-induced diabetes were wounded by a 6 mm biopsy punch on their hind limbs. The biomechanical properties of wound tissues were assessed with the two systems on post-wounding days 3, 7, 10, 14, and 21. Wound sections were stained with picro-sirius red for analysis of the collagen fibres. Data obtained on the different days were charted to obtain the change in biomechanical properties across the time points, and then pooled to examine the correlation between measurements made by the two devices. Qualitative analysis was performed to determine any correlation between indentation stiffness measured by the air-jet indentation system and the orientation of collagen fibres. The indentation stiffness is significantly negatively correlated to the maximum load, maximum tensile stress, and Young's modulus measured by the material testing system (all p-values significant), supporting the use of the optical coherence tomography-based air-jet indentation system to evaluate the biomechanical properties of wounds in a non-contact manner. It is a potential clinical device to examine the biomechanical properties of chronic wounds in vivo in a repeatable manner.

  2. Acoustic Emission tomography based on simultaneous algebraic reconstruction technique to visualize the damage source location in Q235B steel plate

    Science.gov (United States)

    Jiang, Yu; Xu, Feiyun; Xu, Bingsheng

    2015-12-01

    Acoustic Emission (AE) tomography based on the Simultaneous Algebraic Reconstruction Technique (SART), which combines the traditional location algorithm with the SART algorithm by using AE events as its signal sources, is a new visualization method for inspecting and locating internal damage in a structure. In this paper, the proposed method is applied to examine and visualize two man-made damage source locations in a Q235B steel plate to validate its effectiveness. Firstly, a Q235B steel plate specimen with two holes is fabricated and the pencil lead break (PLB) signal is taken as the exciting source for AE tomography. Secondly, a six-step description of the SART algorithm is provided and the three-dimensional (3D) image containing the damage source locations is visualized by using the proposed algorithm in terms of a locally varying wave velocity distribution. It is shown that AE tomography based on SART has great potential in the application of structural damage detection. Finally, to further improve the quality of 3D imaging, the Median Filter and the Adaptive Median Filter are used to reduce the noise resulting from AE tomography. The experimental results indicate that the Median Filter is the optimal method to remove salt-and-pepper noise.
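    For reference, a compact SART iteration for a travel-time system t ≈ A s, where A holds per-ray path lengths through the voxels and s is the slowness (reciprocal velocity) per voxel, looks roughly as follows. The relaxation factor, iteration count and non-negativity constraint are illustrative choices, and the event localisation and median filtering steps of the paper are not included.

        import numpy as np

        def sart(A, t, n_iter=20, relax=0.5):
            # Simultaneous Algebraic Reconstruction Technique for t ~ A @ s.
            s = np.zeros(A.shape[1])
            row_sums = A.sum(axis=1).astype(float)
            col_sums = A.sum(axis=0).astype(float)
            row_sums[row_sums == 0] = 1.0
            col_sums[col_sums == 0] = 1.0
            for _ in range(n_iter):
                residual = (t - A @ s) / row_sums        # per-ray normalised misfit
                s = s + relax * (A.T @ residual) / col_sums
                s = np.clip(s, 0.0, None)                # slowness is non-negative
            return s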

  3. Can a scoring system of computed tomography-based metric parameters accurately predict shock wave lithotripsy stone-free rates and aid in the development of treatment strategies?

    Directory of Open Access Journals (Sweden)

    Yasser ALI Badran

    2016-01-01

    Conclusion: Stone size, stone density (HU), and SSD are simple to calculate and can be reported by radiologists; applying the combined score helps to augment the predictive power of SWL, reduce cost, and improve treatment strategies.

  4. Feasibility of differential quantification of 3D temporomandibular kinematics during various oral activities using a cone-beam computed tomography-based 3D fluoroscopic method

    Directory of Open Access Journals (Sweden)

    Chien-Chih Chen

    2013-06-01

    Conclusion: A new CBCT-based 3D fluoroscopic method was proposed and shown to be capable of quantitatively differentiating TMJ movement patterns among complicated functional activities. It also enabled a complete description of the rigid-body mandibular motion and descriptions of as many reference points as needed for future clinical applications. It will be helpful for dental practice and for a better understanding of the functions of the TMJ.

  5. Diagnosis of drowning using post-mortem computed tomography based on the volume and density of fluid accumulation in the maxillary and sphenoid sinuses

    Energy Technology Data Exchange (ETDEWEB)

    Kawasumi, Yusuke, E-mail: ssu@rad.med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Clinical Imaging, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Kawabata, Tomoyoshi; Sugai, Yusuke [Tohoku University Graduate School of Medicine, Department of Clinical Imaging, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Usui, Akihito, E-mail: t7402r0506@med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Diagnostic Image Analysis, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Hosokai, Yoshiyuki, E-mail: hosokai@med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Diagnostic Image Analysis, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Sato, Miho, E-mail: meifan58@m.tains.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Clinical Imaging, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Saito, Haruo, E-mail: hsaito@med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Diagnostic Image Analysis, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Ishibashi, Tadashi, E-mail: tisibasi@med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Clinical Imaging, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Hayashizaki, Yoshie, E-mail: yoshie@forensic.med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Forensic Medicine, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Funayama, Masato, E-mail: funayama@forensic.med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Forensic Medicine, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan)

    2013-10-01

    Recent studies have reported that drowning victims frequently have fluid accumulation in the paranasal sinuses, most notably the maxillary and sphenoid sinuses. However, in our previous study, many non-drowning victims also had fluid accumulation in the sinuses. Therefore, we evaluated the qualitative difference in fluid accumulation between drowning and non-drowning cases in the present study. Thirty-eight drowning and 73 non-drowning cases were investigated retrospectively. The fluid volume and density of each case were calculated using a DICOM workstation. The drowning cases were compared with the non-drowning cases using the Mann–Whitney U-test because the data showed a non-normal distribution. The median fluid volume was 1.82 (range 0.02–11.7) ml in the drowning cases and 0.49 (0.03–8.7) ml in the non-drowning cases, and the median fluid density was 22 (−14 to 66) and 39 (−65 to 77) HU, respectively. Both volume and density differed significantly between the drowning and non-drowning cases (p = 0.001, p = 0.0007). Regarding cut-off levels in the ROC analysis, the points on the ROC curve closest to (0, 1) were 1.03 ml (sensitivity 68%, specificity 68%, PPV 53%, NPV 81%) and 27.5 HU (61%, 70%, 51%, 77%). The Youden indices were 1.03 ml and 37.8 HU (84%, 51%, 47%, 86%). When the cut-off level was set at 1.03 ml and 27.5 HU, the sensitivity was 42%, specificity 45%, PPV 29% and NPV 60%. When the cut-off level was set at 1.03 ml and 37.8 HU, sensitivity was 58%, specificity 32%, PPV 31% and NPV 59%.
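    The two cut-off rules quoted above, the point closest to (0, 1) and the Youden index, can be computed from an ROC curve as in this small sketch, assuming scikit-learn and binary labels with 1 = drowning.

        import numpy as np
        from sklearn.metrics import roc_curve    # assumes scikit-learn

        def roc_cutoffs(labels, values):
            # labels: 1 = drowning, 0 = non-drowning; values: fluid volume or density.
            fpr, tpr, thr = roc_curve(labels, values)
            youden = thr[np.argmax(tpr - fpr)]                      # Youden index cut-off
            closest = thr[np.argmin(fpr ** 2 + (1.0 - tpr) ** 2)]   # closest to (0, 1)
            return youden, closest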

  6. Optimal C-arm angulation during transcatheter aortic valve replacement: Accuracy of a rotational C-arm computed tomography based three dimensional heart model.

    Science.gov (United States)

    Veulemans, Verena; Mollus, Sabine; Saalbach, Axel; Pietsch, Max; Hellhammer, Katharina; Zeus, Tobias; Westenfeld, Ralf; Weese, Jürgen; Kelm, Malte; Balzer, Jan

    2016-10-26

    To investigate the accuracy of a rotational C-arm CT-based 3D heart model to predict an optimal C-arm configuration during transcatheter aortic valve replacement (TAVR). Rotational C-arm CT (RCT) under rapid ventricular pacing was performed in 57 consecutive patients with severe aortic stenosis as part of the pre-procedural cardiac catheterization. With prototype software, each RCT data set was segmented using a 3D heart model. From this, the line-of-perpendicularity curve was obtained, which generates a perpendicular view of the aortic annulus according to the right-cusp rule. To evaluate the accuracy of a model-based overlay we compared model- and expert-derived aortic root diameters. For all 57 patients in the RCT cohort, diameter measurements were obtained from two independent operators and were compared to the model-based measurements. The inter-observer variability was measured to be in the range of 0°-12.96° of angular C-arm displacement for two independent operators. The model-to-operator agreement was 0°-13.82°. The model-based and expert measurements of aortic root diameters evaluated at the aortic annulus (r = 0.79, r = 0.93 and r = 0.92; P < 0.01) correlated at a high level and the Bland-Altman analysis showed good agreement. The interobserver measurements did not show a significant bias. Automatic segmentation of the aortic root using an anatomical model can accurately predict an optimal C-arm configuration, potentially simplifying current clinical workflows before and during TAVR.

  7. TU-CD-BRA-08: Single-Energy Computed Tomography-Based Pulmonary Perfusion Imaging: Proof-Of-Principle in a Canine Model

    Energy Technology Data Exchange (ETDEWEB)

    Yamamoto, T; Boone, J [University of California Davis School of Medicine, Sacramento, CA (United States); Kent, M; Wisner, E [University of California Davis School of Veterinary Medicine, Davis, CA (United States); Fujita, Y [Tokai University, Isehara (Japan)

    2015-06-15

    Purpose: Pulmonary perfusion imaging has provided significant insights into pulmonary diseases, and can be useful in radiotherapy. The purpose of this study was to prospectively establish proof-of-principle in a canine model for single-energy CT-based perfusion imaging, which has the potential for widespread clinical implementation. Methods: Single-energy CT perfusion imaging is based on: (1) acquisition of inspiratory breath-hold CT scans before and after intravenous injection of iodinated contrast medium, (2) deformable image registration (DIR) of the two CT image data sets, and (3) subtraction of the pre-contrast image from post-contrast image, yielding a map of Hounsfield unit (HU) enhancement. These subtraction image data sets hypothetically represent perfused blood volume, a surrogate for perfusion. In an IACUC-approved clinical trial, we acquired pre- and post-contrast CT scans in the prone posture for six anesthetized, mechanically-ventilated dogs. The elastix algorithm was used for DIR. The registration accuracy was quantified using the target registration errors (TREs) for 50 pulmonary landmarks in each dog. The gradient of HU enhancement between gravity-dependent (ventral) and non-dependent (dorsal) regions was evaluated to quantify the known effect of gravity, i.e., greater perfusion in ventral regions. Results: The lung volume difference between the two scans was 4.3±3.5% on average (range 0.3%–10.1%). DIR demonstrated an average TRE of 0.7±1.0 mm. HU enhancement in lung parenchyma was 34±10 HU on average and varied considerably between individual dogs, indicating the need for improvement of the contrast injection protocol. HU enhancement in ventral (gravity-dependent) regions was found to be greater than in dorsal regions. A population average ventral-to-dorsal gradient of HU enhancement was strong (R² = 0.94) and statistically significant (p<0.01). Conclusion: This canine study demonstrated relatively accurate DIR and a strong ventral-to-dorsal gradient of HU enhancement, providing proof-of-principle for single-energy CT pulmonary perfusion imaging. This ongoing study will enroll more dogs and investigate the physiological significance. This study was supported by a Philips Healthcare/Radiological Society of North America (RSNA) Research Seed Grant (RSD1458)
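    A bare-bones sketch of the subtraction step, assuming the post-contrast volume has already been deformably registered onto the pre-contrast volume (the study used the elastix algorithm) and that a binary lung mask is available; the axis chosen for the ventral-dorsal direction is an assumption.

        import numpy as np

        def hu_enhancement_map(pre_ct, post_ct_registered, lung_mask):
            # Post-contrast minus pre-contrast HU, restricted to the lung.
            enhancement = (post_ct_registered - pre_ct).astype(np.float32)
            enhancement[~lung_mask] = np.nan
            return enhancement

        def ventral_dorsal_profile(enhancement, ap_axis=1):
            # Mean enhancement per slab along the assumed ventral-to-dorsal axis,
            # used to quantify the gravity-dependent gradient.
            return np.array([np.nanmean(np.take(enhancement, i, axis=ap_axis))
                             for i in range(enhancement.shape[ap_axis])])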

  8. SPECTRAL DOMAIN OPTICAL COHERENCE TOMOGRAPHY-BASED MICROSTRUCTURAL ANALYSIS OF RETINAL ARCHITECTURE POST INTERNAL LIMITING MEMBRANE PEELING FOR SURGERY OF IDIOPATHIC MACULAR HOLE REPAIR.

    Science.gov (United States)

    Modi, Aditya; Giridhar, Anantharaman; Gopalakrishnan, Mahesh

    2017-02-01

    Spectral domain optical coherence tomography-based analysis of retinal architecture after internal limiting membrane peeling for macular hole surgery. Prospective, interventional study. Fifty eyes underwent the surgical procedure with a minimum internal limiting membrane peel of 3 mm diameter. Automatic segmentation software was used to assess individual layers preoperatively and postoperatively, 1.5 millimeters medial and lateral to the fovea at the 3-month postoperative visit. Main outcome measures were final central macular thickness and variation in individual retinal layer thickness. Mean central macular thickness postoperatively was 201 microns. Retinal thickening was observed 1.5 mm medial to the fovea (statistically significant). Internal limiting membrane peeling is associated with significant alteration of the inner retinal architecture, especially in the ganglion cell layer, which can adversely influence the functional outcome of the surgery and makes it imperative to avoid peeling the internal limiting membrane over a larger surface area.

  9. Interface and permittivity simultaneous reconstruction in electrical capacitance tomography based on boundary and finite-elements coupling method.

    Science.gov (United States)

    Ren, Shangjie; Dong, Feng

    2016-06-28

    Electrical capacitance tomography (ECT) is a non-destructive detection technique for imaging the permittivity distributions inside an observed domain from the capacitance measurements on its boundary. Owing to its advantages of non-contact, non-radiation, high speed and low cost, ECT is promising in the measurements of many industrial or biological processes. However, in practical industrial or biological systems, a deposit is normally seen on the inner wall of its pipe or vessel. As the actual region of interest (ROI) of ECT is surrounded by the deposit layer, the capacitance measurements become weakly sensitive to the permittivity perturbation occurring at the ROI. When there is a major permittivity difference between the deposit and the ROI, this kind of shielding effect is significant, and the permittivity reconstruction becomes challenging. To deal with the issue, an interface and permittivity simultaneous reconstruction approach is proposed. Both the permittivity at the ROI and the geometry of the deposit layer are recovered using the block coordinate descent method. The boundary and finite-elements coupling method is employed to improve the computational efficiency. The performance of the proposed method is evaluated with the simulation tests. This article is part of the themed issue 'Supersensing through industrial process tomography'.
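    The simultaneous-reconstruction idea can be sketched as a generic block coordinate descent over the two parameter blocks (ROI permittivity and deposit-interface geometry). The loss function below stands in for the boundary/finite-element forward model plus data misfit, and the optimizer choice is an assumption, not the paper's scheme.

        import numpy as np
        from scipy.optimize import minimize      # assumes SciPy

        def block_coordinate_descent(loss, perm0, iface0, n_outer=10):
            # Alternately update the permittivity block and the interface block
            # while holding the other fixed.
            perm = np.asarray(perm0, dtype=float)
            iface = np.asarray(iface0, dtype=float)
            for _ in range(n_outer):
                perm = minimize(lambda p: loss(p, iface), perm, method='L-BFGS-B').x
                iface = minimize(lambda g: loss(perm, g), iface, method='L-BFGS-B').x
            return perm, iface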

  10. Extracting a high-quality data space for stereo-tomography based on a 3D structure tensor algorithm and kinematic de-migration

    Science.gov (United States)

    Xiong, Kai; Yang, Kai; Wang, Yu-Xiang

    2017-08-01

    To extract a high-quality data space (the so-called kinematic invariants) is a key factor in a successful implementation of stereo-tomography. The structure tensor algorithm has proven to be a robust tool for picking the kinematic invariants for stereo-tomography. However, if the data contain many diffractions and other noise, extracting the data space directly in the data domain can be unreliable. Meanwhile, for any reflector, we aim to pick as many of the relevant primary reflections as possible over a wide offset range. To achieve this, in this paper, we design a scheme to extract a high-quality data space for stereo-tomography based on a 3D structure tensor and kinematic de-migration. Firstly, we apply automatic, dense volumetric picking of residual move-out (RMO) and structural dip in the depth-migrated domain with an advanced 3D structure tensor algorithm. Then, a set of key horizons is picked manually in a few selected depth-migrated common offset gathers. Finally, all the picked horizons are extrapolated along the offset axis based on the RMO information picked in advance. Thus, the initial high-density points picked in the depth-migrated volume are greatly refined. After this processing, a final, refined data space for stereo-tomography is extracted through kinematic de-migration. We demonstrate the correctness and robustness of the presented scheme with synthetic and real data examples.
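
    A rough sketch of the kind of 3D structure tensor computation such volumetric picking relies on; the smoothing scales, the NumPy/SciPy implementation and the eigenvector convention are illustrative assumptions, not the authors' algorithm:

```python
# Sketch: per-voxel 3x3 structure tensor of a 3-D volume and its dominant
# eigenvector (a proxy for the local structural dip/normal direction).
import numpy as np
from scipy.ndimage import gaussian_filter

def structure_tensor_orientation(volume, grad_sigma=1.0, window_sigma=2.0):
    gz, gy, gx = np.gradient(gaussian_filter(volume, grad_sigma))
    grads = (gz, gy, gx)
    T = np.empty(volume.shape + (3, 3))
    for i in range(3):
        for j in range(3):
            # smoothed outer products g_i * g_j fill the tensor componentwise
            T[..., i, j] = gaussian_filter(grads[i] * grads[j], window_sigma)
    eigvals, eigvecs = np.linalg.eigh(T)       # eigenvalues in ascending order
    return eigvecs[..., :, -1]                 # eigenvector of largest eigenvalue

vol = np.random.default_rng(1).normal(size=(32, 32, 32))
print(structure_tensor_orientation(vol).shape)   # (32, 32, 32, 3)
```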

  11. Computed tomography of cryogenic cells

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, Gerd; Anderson, E.; Vogt, S.; Knochel, C.; Weiss, D.; LeGros, M.; Larabell, C.

    2001-08-30

    Due to the short wavelengths of X-rays and the low numerical aperture of the Fresnel zone plates used as X-ray objectives, the depth of field is several microns. Within the focal depth, imaging a thick specimen is to a good approximation equivalent to projecting the specimen absorption. Therefore, computed tomography based on a tilt series of X-ray microscopic images can be used to reconstruct the local linear absorption coefficient and image the three-dimensional specimen structure. To preserve the structural integrity of biological objects during image acquisition, microscopy is performed at cryogenic temperatures. Tomography based on X-ray microscopic images was applied to study the distribution of male specific lethal 1 (MSL-1), a nuclear protein involved in dosage compensation in Drosophila melanogaster, which ensures that males with a single X chromosome have the same amount of most X-linked gene products as females with two X chromosomes. Tomographic reconstructions of X-ray microscopic images were used to compute the local three-dimensional linear absorption coefficient, revealing the arrangement of internal structures of Drosophila melanogaster cells. Combined with labelling techniques, nanotomography is a new way to study the 3D distribution of selected proteins inside whole cells. We want to improve this technique with respect to resolution and specimen preparation. The resolution in the reconstruction can be significantly improved by reducing the angular step size to collect more viewing angles, which requires automated data acquisition. In addition, fast-freezing with liquid ethane instead of cryogenic He gas will be applied to improve the vitrification of the hydrated samples. We also plan to apply cryo X-ray nanotomography to study different types of cells and their nuclear protein distributions.
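
    The tomographic step described above can be illustrated with a minimal filtered back-projection example on a synthetic slice; the phantom, the limited tilt range and the use of scikit-image are assumptions for illustration and omit the alignment and cryo-specific processing of the real experiment:

```python
# Sketch: simulate a tilt series of projections of a 2-D slice and reconstruct
# the slice (a stand-in for the local linear absorption coefficient).
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon

slice_true = shepp_logan_phantom()              # synthetic specimen slice
angles = np.linspace(-70.0, 70.0, 141)          # limited tilt range, in degrees
tilt_series = radon(slice_true, theta=angles)   # simulated projections
reconstruction = iradon(tilt_series, theta=angles)
print(reconstruction.shape)
```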

  12. Comparison of five segmentation tools for 18F-fluoro-deoxy-glucose-positron emission tomography-based target volume definition in head and neck cancer.

    NARCIS (Netherlands)

    Schinagl, D.A.X.; Vogel, W.V.; Hoffmann, A.L.; Dalen, J.A. van; Oyen, W.J.G.; Kaanders, J.H.A.M.

    2007-01-01

    PURPOSE: Target-volume delineation for radiation treatment to the head and neck area traditionally is based on physical examination, computed tomography (CT), and magnetic resonance imaging. Additional molecular imaging with (18)F-fluoro-deoxy-glucose (FDG)-positron emission tomography (PET) may imp

  13. Three-dimensional noninvasive ultrasound Joule heat tomography based on the acousto-electric effect using unipolar pulses: a simulation study.

    Science.gov (United States)

    Yang, Renhuan; Li, Xu; Song, Aiguo; He, Bin; Yan, Ruqiang

    2012-11-21

    Electrical properties of biological tissues are highly sensitive to their physiological and pathological status. It is therefore of importance to image the electrical properties of biological tissues. However, the spatial resolution of conventional electrical impedance tomography (EIT) is generally poor. Recently, hybrid imaging modalities combining electric conductivity contrast and ultrasonic resolution based on the acousto-electric effect have attracted considerable attention. In this study, we propose a novel three-dimensional (3D) noninvasive ultrasound Joule heat tomography (UJHT) approach based on the acousto-electric effect using unipolar ultrasound pulses. As the Joule heat density distribution is highly dependent on the conductivity distribution, an accurate and high-resolution mapping of the Joule heat density distribution is expected to give important information that is closely related to the conductivity contrast. The advantages of the proposed ultrasound Joule heat tomography using unipolar pulses include its simple inverse solution, better performance than UJHT using common bipolar pulses, and its independence of a priori knowledge of the conductivity distribution of the imaging object. Computer simulation results show that, using the proposed method, it is feasible to perform high-spatial-resolution Joule heat imaging in an inhomogeneous conductive medium. Application of this technique to tumor scanning is also investigated by a series of computer simulations.
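
    For reference, the quantity being mapped is commonly written, in the quasi-static picture and not necessarily in the paper's notation, as

        Q(\mathbf{r}) = \sigma(\mathbf{r})\,|\nabla\phi(\mathbf{r})|^{2} = |\mathbf{J}(\mathbf{r})|^{2}/\sigma(\mathbf{r}),

    where \sigma is the electrical conductivity, \phi the electric potential and \mathbf{J} the current density; this pointwise dependence on \sigma is why a high-resolution Joule heat map carries information about the conductivity contrast.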

  14. Computed tomography-based anatomic assessment overestimates local tumor recurrence in patients with mass-like consolidation after stereotactic body radiotherapy for early-stage non-small cell lung cancer.

    Science.gov (United States)

    Dunlap, Neal E; Yang, Wensha; McIntosh, Alyson; Sheng, Ke; Benedict, Stanley H; Read, Paul W; Larner, James M

    2012-12-01

    To investigate pulmonary radiologic changes after lung stereotactic body radiotherapy (SBRT) and to distinguish between mass-like fibrosis and tumor recurrence. Eighty consecutive patients treated with 3- to 5-fraction SBRT for early-stage peripheral non-small cell lung cancer with a minimum follow-up of 12 months were reviewed. The mean biologic equivalent dose received was 150 Gy (range, 78-180 Gy). Patients were followed with serial CT imaging every 3 months. The CT appearance of consolidation was defined as diffuse or mass-like. Progressive disease on CT was defined according to Response Evaluation Criteria in Solid Tumors 1.1. Positron emission tomography (PET) CT was used as an adjunct test. Tumor recurrence was defined as a standardized uptake value equal to or greater than the pretreatment value. Biopsy was used to further assess consolidation in select patients. Median follow-up was 24 months (range, 12.0-36.0 months). Abnormal mass-like consolidation was identified in 44 patients (55%), whereas diffuse consolidation was identified in 12 patients (15%), at a median time from end of treatment of 10.3 months and 11.5 months, respectively. Tumor recurrence was found in 35 of 44 patients with mass-like consolidation using CT alone. Combined with PET, 10 of the 44 patients had tumor recurrence. Tumor size (hazard ratio 1.12, P=.05) and time to consolidation (hazard ratio 0.622, P=.03) were predictors for tumor recurrence. Three consecutive increases in volume and increasing volume at 12 months after treatment in mass-like consolidation were highly specific for tumor recurrence (100% and 80%, respectively). Patients with diffuse consolidation were more likely to develop grade ≥ 2 pneumonitis (odds ratio 26.5, P=.02) than those with mass-like consolidation (odds ratio 0.42, P=.07). Incorporating the kinetics of mass-like consolidation and PET into the current criteria for evaluating posttreatment response will increase the likelihood of correctly identifying patients with progressive disease after lung SBRT. Copyright © 2012 Elsevier Inc. All rights reserved.
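
    As an illustration only (an assumed helper, not the authors' software), the two CT kinetic criteria reported above can be expressed as a simple check on a series of follow-up volumes:

```python
# Hypothetical helper: flag mass-like consolidation as suspicious for recurrence
# if there are three consecutive volume increases, or if the volume is still
# rising at/after 12 months post treatment. Inputs are illustrative.
def suspicious_for_recurrence(volumes_cc, months):
    """volumes_cc and months are parallel lists from serial follow-up CT."""
    rises = [b > a for a, b in zip(volumes_cc, volumes_cc[1:])]
    three_consecutive = any(all(rises[i:i + 3]) for i in range(len(rises) - 2))
    late = [v for v, m in zip(volumes_cc, months) if m >= 12]
    rising_after_12 = len(late) >= 2 and late[-1] > late[0]
    return three_consecutive or rising_after_12

print(suspicious_for_recurrence([10, 12, 14, 17], [3, 6, 9, 12]))   # True
```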

  15. Computed Tomography-Based Anatomic Assessment Overestimates Local Tumor Recurrence in Patients With Mass-like Consolidation After Stereotactic Body Radiotherapy for Early-Stage Non-Small Cell Lung Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Dunlap, Neal E. [Department of Radiation Oncology, University of Louisville, Louisville, KY (United States); Yang Wensha [Department of Radiation Oncology, Cedars Sinai Medical Center, Los Angeles, CA (United States); McIntosh, Alyson [Department of Radiation Oncology, John and Dorothy Morgan Cancer Center, Lehigh Valley Hospital, Allentown, PA (United States); Sheng, Ke [Department of Radiation Oncology, David Geffen School of Medicine at University of California Los Angeles, Los Angeles, CA (United States); Benedict, Stanley H.; Read, Paul W. [Department of Radiation Oncology, University of Virginia, Charlottesville, VA (United States); Larner, James M., E-mail: jml2p@virginia.edu [Department of Radiation Oncology, University of Virginia, Charlottesville, VA (United States)

    2012-12-01

    Purpose: To investigate pulmonary radiologic changes after lung stereotactic body radiotherapy (SBRT) and to distinguish between mass-like fibrosis and tumor recurrence. Methods and Materials: Eighty consecutive patients treated with 3- to 5-fraction SBRT for early-stage peripheral non-small cell lung cancer with a minimum follow-up of 12 months were reviewed. The mean biologic equivalent dose received was 150 Gy (range, 78-180 Gy). Patients were followed with serial CT imaging every 3 months. The CT appearance of consolidation was defined as diffuse or mass-like. Progressive disease on CT was defined according to Response Evaluation Criteria in Solid Tumors 1.1. Positron emission tomography (PET) CT was used as an adjunct test. Tumor recurrence was defined as a standardized uptake value equal to or greater than the pretreatment value. Biopsy was used to further assess consolidation in select patients. Results: Median follow-up was 24 months (range, 12.0-36.0 months). Abnormal mass-like consolidation was identified in 44 patients (55%), whereas diffuse consolidation was identified in 12 patients (15%), at a median time from end of treatment of 10.3 months and 11.5 months, respectively. Tumor recurrence was found in 35 of 44 patients with mass-like consolidation using CT alone. Combined with PET, 10 of the 44 patients had tumor recurrence. Tumor size (hazard ratio 1.12, P=.05) and time to consolidation (hazard ratio 0.622, P=.03) were predictors for tumor recurrence. Three consecutive increases in volume and increasing volume at 12 months after treatment in mass-like consolidation were highly specific for tumor recurrence (100% and 80%, respectively). Patients with diffuse consolidation were more likely to develop grade ≥2 pneumonitis (odds ratio 26.5, P=.02) than those with mass-like consolidation (odds ratio 0.42, P=.07). Conclusion: Incorporating the kinetics of mass-like consolidation and PET into the current criteria for evaluating posttreatment response will increase the likelihood of correctly identifying patients with progressive disease after lung SBRT.

  16. Computer Engineers.

    Science.gov (United States)

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  17. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  18. Computer Virus

    Institute of Scientific and Technical Information of China (English)

    高振桥

    2002-01-01

    If you work with a computer, it is certain that you cannot avoid dealing with at least one computer virus. But how much do you know about it? Well, actually, a computer virus is not a biological one that causes illnesses in people. It is a kind of computer program

  19. Grid Computing

    Indian Academy of Sciences (India)

    2016-05-01

    A computing grid interconnects resources such as high performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers on demand. In this article, we describe the grid computing model and enumerate the major differences between grid and cloud computing.

  20. Analog computing

    CERN Document Server

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  1. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design...... and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer’s point of view......, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture....

  2. Computational chemistry

    OpenAIRE

    2000-01-01

    Computational chemistry has come of age. With significant strides in computer hardware and software over the last few decades, computational chemistry has achieved full partnership with theory and experiment as a tool for understanding and predicting the behavior of a broad range of chemical, physical, and biological phenomena. The Nobel Prize award to John Pople and Walter Kohn in 1998 highlighted the importance of these advances in computational chemistry. With massively parallel computers ...

  3. Duality Computing in Quantum Computers

    Institute of Scientific and Technical Information of China (English)

    LONG Gui-Lu; LIU Yang

    2008-01-01

    In this letter, we propose a duality computing mode, which resembles the particle-wave duality property observed when a quantum system such as a quantum computer passes through a double-slit. In this mode, computing operations are not necessarily unitary. The duality mode provides a natural link between classical computing and quantum computing. In addition, the duality mode provides a new tool for quantum algorithm design.

  4. Computational manufacturing

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    This paper presents a general framework for computational manufacturing. The methodology of computational manufacturing aims at integrating computational geometry, machining principle, sensor information fusion, optimization, computational intelligence and virtual prototyping to solve problems of the modeling, reasoning, control, planning and scheduling of manufacturing processes and systems. There are three typical problems in computational manufacturing, i.e., scheduling (time-domain), geometric reasoning (space-domain) and decision-making (interaction between time-domain and space-domain). Some theoretical fundamentals of computational manufacturing are also discussed.

  5. Contextual Computing

    CERN Document Server

    Porzel, Robert

    2011-01-01

    This book uses the latest in knowledge representation and human-computer interaction to address the problem of contextual computing in artificial intelligence. It uses high-level context to solve some challenging problems in natural language understanding.

  6. Computer Algebra.

    Science.gov (United States)

    Pavelle, Richard; And Others

    1981-01-01

    Describes the nature and use of computer algebra and its applications to various physical sciences. Includes diagrams illustrating, among others, a computer algebra system and flow chart of operation of the Euclidean algorithm. (SK)

  7. Computational dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Siebert, B.R.L.; Thomas, R.H.

    1996-01-01

    The paper presents a definition of the term "Computational Dosimetry" that is interpreted as the sub-discipline of computational physics which is devoted to radiation metrology. It is shown that computational dosimetry is more than a mere collection of computational methods. Computational simulations directed at basic understanding and modelling are important tools provided by computational dosimetry, while another very important application is the support that it can give to the design, optimization and analysis of experiments. However, the primary task of computational dosimetry is to reduce the variance in the determination of absorbed dose (and its related quantities), for example in the disciplines of radiological protection and radiation therapy. In this paper emphasis is given to the discussion of potential pitfalls in the applications of computational dosimetry and recommendations are given for their avoidance. The need for comparison of calculated and experimental data whenever possible is strongly stressed.

  8. Computational Medicine

    DEFF Research Database (Denmark)

    Nygaard, Jens Vinge

    2017-01-01

    The Health Technology Program at Aarhus University applies computational biology to investigate the heterogeneity of tumours.

  9. Quantum computing

    OpenAIRE

    Li, Shu-Shen; Long, Gui-lu; Bai, Feng-Shan; Feng, Song-Lin; Zheng, Hou-Zhi

    2001-01-01

    Quantum computing is a quickly growing research field. This article introduces the basic concepts of quantum computing, recent developments in quantum searching, and decoherence in a possible quantum dot realization.

  10. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Full Text Available Green computing is all about using computers in a smarter and more eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  11. Computable models

    CERN Document Server

    Turner, Raymond

    2009-01-01

    Computational models can be found everywhere in present day science and engineering. In providing a logical framework and foundation for the specification and design of specification languages, Raymond Turner uses this framework to introduce and study computable models. In doing so he presents the first systematic attempt to provide computational models with a logical foundation. Computable models have wide-ranging applications from programming language semantics and specification languages, through to knowledge representation languages and formalism for natural language semantics. They are al

  12. Phenomenological Computation?

    DEFF Research Database (Denmark)

    Brier, Søren

    2014-01-01

    Open peer commentary on the article “Info-computational Constructivism and Cognition” by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) Its basic concept of natural computing has neither been defined theoretically nor implemented practically. (2) It cannot en...... cybernetics and Maturana and Varela’s theory of autopoiesis, which are both erroneously taken to support info-computationalism....

  13. Computing fundamentals introduction to computers

    CERN Document Server

    Wempen, Faithe

    2014-01-01

    The absolute beginner's guide to learning basic computer skills Computing Fundamentals, Introduction to Computers gets you up to speed on basic computing skills, showing you everything you need to know to conquer entry-level computing courses. Written by a Microsoft Office Master Instructor, this useful guide walks you step-by-step through the most important concepts and skills you need to be proficient on the computer, using nontechnical, easy-to-understand language. You'll start at the very beginning, getting acquainted with the actual, physical machine, then progress through the most common

  14. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  15. Computational Complexity

    Directory of Open Access Journals (Sweden)

    J. A. Tenreiro Machado

    2017-02-01

    Full Text Available Complex systems (CS) involve many elements that interact at different scales in time and space. The challenges in modeling CS led to the development of novel computational tools with applications in a wide range of scientific areas. The computational problems posed by CS exhibit intrinsic difficulties that are a major concern in Computational Complexity Theory. [...

  16. Computational Composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.

    of the new microprocessors and network technologies. However, the understanding of the computer represented within this program poses a challenge for the intentions of the program. The computer is understood as a multitude of invisible intelligent information devices which confines the computer as a tool...

  17. Distributed Computing.

    Science.gov (United States)

    Ryland, Jane N.

    1988-01-01

    The microcomputer revolution, in which small and large computers have gained tremendously in capability, has created a distributed computing environment. This circumstance presents administrators with the opportunities and the dilemmas of choosing appropriate computing resources for each situation. (Author/MSE)

  18. Computational vision

    CERN Document Server

    Wechsler, Harry

    1990-01-01

    The book is suitable for advanced courses in computer vision and image processing. In addition to providing an overall view of computational vision, it contains extensive material on topics that are not usually covered in computer vision texts (including parallel distributed processing and neural networks) and considers many real applications.

  19. Phenomenological Computation?

    DEFF Research Database (Denmark)

    Brier, Søren

    2014-01-01

    Open peer commentary on the article “Info-computational Constructivism and Cognition” by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) Its basic concept of natural computing has neither been defined theoretically nor implemented practically. (2) It cannot en...

  20. Computer Ease.

    Science.gov (United States)

    Drenning, Susan; Getz, Lou

    1992-01-01

    Computer Ease is an intergenerational program designed to put an Ohio elementary school's computer lab, software library, staff, and students at the disposal of older adults desiring to become computer literate. Three 90-minute instructional sessions allow seniors to experience 1-to-1 high-tech instruction by enthusiastic, nonthreatening…

  1. 3D-CT vascular setting protocol using computer graphics for the evaluation of maxillofacial lesions

    Directory of Open Access Journals (Sweden)

    CAVALCANTI Marcelo de Gusmão Paraiso

    2001-01-01

    Full Text Available In this paper we present the aspect of a mandibular giant cell granuloma in spiral computed tomography-based three-dimensional (3D-CT) reconstructed images using computer graphics, and demonstrate the importance of the vascular protocol in permitting better diagnosis, visualization and determination of the dimensions of the lesion. We analyzed 21 patients with maxillofacial lesions of neoplastic and proliferative origins. Two oral and maxillofacial radiologists analyzed the images. The usefulness of interactive 3D images reconstructed by means of computer graphics, especially using a vascular setting protocol for qualitative and quantitative analyses for the diagnosis, determination of the extent of lesions, treatment planning and follow-up, was demonstrated. The technique is an important adjunct to the evaluation of lesions in relation to axial CT slices and 3D-CT bone images.

  2. Human Computation

    CERN Document Server

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  3. Computer science

    CERN Document Server

    Blum, Edward K

    2011-01-01

    Computer Science: The Hardware, Software and Heart of It focuses on the deeper aspects of the two recognized subdivisions of Computer Science, Software and Hardware. These subdivisions are shown to be closely interrelated as a result of the stored-program concept. Computer Science: The Hardware, Software and Heart of It includes certain classical theoretical computer science topics such as Unsolvability (e.g. the halting problem) and Undecidability (e.g. Godel's incompleteness theorem) that treat problems that exist under the Church-Turing thesis of computation. These problem topics explain in

  4. Computer Science Research: Computation Directorate

    Energy Technology Data Exchange (ETDEWEB)

    Durst, M.J. (ed.); Grupe, K.F. (ed.)

    1988-01-01

    This report contains short papers in the following areas: large-scale scientific computation; parallel computing; general-purpose numerical algorithms; distributed operating systems and networks; knowledge-based systems; and technology information systems.

  5. Network Tomography Based on Additive Metrics

    CERN Document Server

    Ni, Jian

    2008-01-01

    Inference of the network structure (e.g., routing topology) and dynamics (e.g., link performance) is an essential component in many network design and management tasks. In this paper we propose a new, general framework for analyzing and designing routing topology and link performance inference algorithms using ideas and tools from phylogenetic inference in evolutionary biology. The framework is applicable to a variety of measurement techniques. Based on the framework we introduce and develop several polynomial-time distance-based inference algorithms with provable performance. We provide sufficient conditions for the correctness of the algorithms. We show that the algorithms are consistent (return correct topology and link performance with an increasing sample size) and robust (can tolerate a certain level of measurement errors). In addition, we establish certain optimality properties of the algorithms (i.e., they achieve the optimal $l_\\infty$-radius) and demonstrate their effectiveness via model simulation.
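
    As background on the additive (tree) metrics the framework builds on, the classical four-point condition can be checked directly; the snippet below is illustrative only and is not one of the paper's inference algorithms (the distance dictionary and point labels are assumptions):

```python
# Four-point condition: for every quadruple, the two largest of the three
# pairwise-sum combinations must be equal (up to a tolerance). Distances are
# stored in a dict keyed by pairs in the input order of `points`.
from itertools import combinations

def is_additive(points, d, tol=1e-9):
    for w, x, y, z in combinations(points, 4):
        sums = sorted([d[w, x] + d[y, z], d[w, y] + d[x, z], d[w, z] + d[x, y]])
        if sums[2] - sums[1] > tol:
            return False
    return True

pts = ["a", "b", "c", "d"]                      # four nodes on a path graph
d = {("a", "b"): 1, ("a", "c"): 2, ("a", "d"): 3,
     ("b", "c"): 1, ("b", "d"): 2, ("c", "d"): 1}
print(is_additive(pts, d))                      # True
```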

  6. Computerized ionospheric tomography based on geosynchronous SAR

    Science.gov (United States)

    Hu, Cheng; Tian, Ye; Dong, Xichao; Wang, Rui; Long, Teng

    2017-02-01

    Computerized ionospheric tomography (CIT) based on spaceborne synthetic aperture radar (SAR) is an emerging technique for constructing three-dimensional (3-D) images of the ionosphere. Current studies are all based on low Earth orbit synthetic aperture radar (LEO SAR), which is limited by its long repeat period and small coverage. In this paper, a novel ionospheric 3-D CIT technique based on geosynchronous SAR (GEO SAR) is put forward. First, several influences of the complex atmospheric environment on GEO SAR focusing are analyzed in detail, including background ionosphere and multiple scattering effects (induced by turbulent ionosphere), tropospheric effects, and random noise. Then the corresponding GEO SAR signal model is constructed with consideration of the temporally variant background ionosphere within the long GEO SAR integration time (typically at the 100 s to 1000 s level). Concurrently, an accurate total electron content (TEC) retrieval method based on GEO SAR data is put forward through subband division in range and subaperture division in azimuth, obtaining TEC values that vary with azimuth time. The processing steps of GEO SAR CIT are given and discussed. Owing to its short repeat period and large coverage area, GEO SAR CIT has the potential to cover a specific region continuously and completely and consequently offers excellent real-time performance. Finally, the TEC retrieval and GEO SAR CIT construction are performed in a numerical study based on meteorological data. The feasibility and correctness of the proposed methods are verified.
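
    For orientation, the dispersive ionospheric contribution that such a TEC retrieval exploits is, to first order and in standard notation (not necessarily the paper's),

        \Delta L_{\mathrm{iono}} \approx \frac{40.3\,\mathrm{TEC}}{f^{2}} \;\; [\mathrm{m}], \qquad \mathrm{TEC\ in\ electrons/m^{2}},\ f\ \mathrm{in\ Hz},

    so splitting the signal into range subbands (different carrier frequencies f) and azimuth subapertures (different times) makes the TEC, and its variation over the long integration time, observable.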

  7. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  8. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  9. Computer Literacy: Teaching Computer Ethics.

    Science.gov (United States)

    Troutner, Joanne

    1986-01-01

    Suggests learning activities for teaching computer ethics in three areas: (1) equal access; (2) computer crime; and (3) privacy. Topics include computer time, advertising, class enrollments, copyright law, sabotage ("worms"), the Privacy Act of 1974 and the Freedom of Information Act of 1966. (JM)

  10. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  11. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease in which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  12. Quantum Computing

    CERN Document Server

    Steane, A M

    1998-01-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarise not just quantum computing, but the whole subject of quantum information theory. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, the review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the EPR experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from classical information theory, and, arguably, quantum from classical physics. Basic quantum information ideas are described, including key distribution, teleportation, data compression, quantum error correction, the universal quantum computer and qua...

  13. Computer Virus

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Computer viruses are small software programs that are designed to spread from one computer to another and to interfere with computer operation. A virus might delete data on your computer, use your e-mail program to spread itself to other computers, or even erase everything on your hard disk. Viruses are most easily spread by attachments in e-mail messages or instant messaging messages. That is why it is essential that you never

  14. Fog computing

    OpenAIRE

    Poplštein, Karel

    2016-01-01

    The purpose of this bachelor's thesis is to address fog computing technology, which emerged as a possible solution to the requirements of the internet of things and aims to lower latency and network bandwidth use by moving a substantial part of computing operations to the network edge. The thesis identifies advantages as well as potential threats and analyses possible solutions to these problems, proceeding to a comparison of cloud and fog computing and specifying areas of use for both of them. Finally...

  15. Biological computation

    CERN Document Server

    Lamm, Ehud

    2011-01-01

    Introduction and Biological Background; Biological Computation; The Influence of Biology on Mathematics - Historical Examples; Biological Introduction; Models and Simulations; Cellular Automata; Biological Background; The Game of Life; General Definition of Cellular Automata; One-Dimensional Automata; Examples of Cellular Automata; Comparison with a Continuous Mathematical Model; Computational Universality; Self-Replication; Pseudo Code; Evolutionary Computation; Evolutionary Biology and Evolutionary Computation; Genetic Algorithms; Example Applications; Analysis of the Behavior of Genetic Algorithms; Lamarckian Evolution; Genet

  16. Cloud Computing

    CERN Document Server

    Mirashe, Shivaji P

    2010-01-01

    Computing as you know it is about to change, your applications and documents are going to move from the desktop into the cloud. I'm talking about cloud computing, where applications and files are hosted on a "cloud" consisting of thousands of computers and servers, all linked together and accessible via the Internet. With cloud computing, everything you do is now web based instead of being desktop based. You can access all your programs and documents from any computer that's connected to the Internet. How will cloud computing change the way you work? For one thing, you're no longer tied to a single computer. You can take your work anywhere because it's always accessible via the web. In addition, cloud computing facilitates group collaboration, as all group members can access the same programs and documents from wherever they happen to be located. Cloud computing might sound far-fetched, but chances are you're already using some cloud applications. If you're using a web-based email program, such as Gmail or Ho...

  17. GPGPU COMPUTING

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2012-05-01

    Full Text Available Since the first idea of using GPUs for general-purpose computing, things have evolved over the years and now there are several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware that they provide. A new emerging standard, OpenCL (Open Computing Language), tries to unify different GPU general computing API implementations and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL provides parallel computing using task-based and data-based parallelism. In this paper we will focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA. We will present the benefits of the CUDA programming model. We will also compare the two main approaches, CUDA and AMD APP (STREAM), and the new framework, OpenCL, that tries to unify the GPGPU computing models.

  18. Computational Sustainability

    OpenAIRE

    Eaton, Eric; University of Pennsylvania; Gomes, Carla P.; Cornell University; Williams, Brian; Massachusetts Institute of Technology

    2014-01-01

    Computational sustainability problems, which exist in dynamic environments with high amounts of uncertainty, provide a variety of unique challenges to artificial intelligence research and the opportunity for significant impact upon our collective future. This editorial provides an overview of artificial intelligence for computational sustainability, and introduces this special issue of AI Magazine.

  19. Platform computing

    CERN Multimedia

    2002-01-01

    "Platform Computing releases first grid-enabled workload management solution for IBM eServer Intel and UNIX high performance computing clusters. This Out-of-the-box solution maximizes the performance and capability of applications on IBM HPC clusters" (1/2 page) .

  20. Grid Computing

    Science.gov (United States)

    Foster, Ian

    2001-08-01

    The term "Grid Computing" refers to the use, for computational purposes, of emerging distributed Grid infrastructures: that is, network and middleware services designed to provide on-demand and high-performance access to all important computational resources within an organization or community. Grid computing promises to enable both evolutionary and revolutionary changes in the practice of computational science and engineering based on new application modalities such as high-speed distributed analysis of large datasets, collaborative engineering and visualization, desktop access to computation via "science portals," rapid parameter studies and Monte Carlo simulations that use all available resources within an organization, and online analysis of data from scientific instruments. In this article, I examine the status of Grid computing circa 2000, briefly reviewing some relevant history, outlining major current Grid research and development activities, and pointing out likely directions for future work. I also present a number of case studies, selected to illustrate the potential of Grid computing in various areas of science.

  1. Computational Deception

    NARCIS (Netherlands)

    Nijholt, Antinus; Acosta, P.S.; Cravo, P.

    2010-01-01

    In the future our daily life interactions with other people, with computers, robots and smart environments will be recorded and interpreted by computers or embedded intelligence in environments, furniture, robots, displays, and wearables. These sensors record our activities, our behaviour, and our

  2. Computational Science

    Institute of Scientific and Technical Information of China (English)

    K. Li

    2007-01-01

    Computer science is the discipline that anchors the computer industry which has been improving processor performance, communication bandwidth and storage capacity on the so called "Moore's law" curve or at the rate of doubling every 18 to 24 months during the past decades.

  3. Granular Computing

    Institute of Scientific and Technical Information of China (English)

    2004-01-01

    The basic ideas and principles of granular computing (GrC) have been studied explicitly or implicitly in many fields in isolation. With the recent renewed and fast growing interest, it is time to extract the commonality from a diversity of fields and to study systematically and formally the domain independent principles of granular computing in a unified model. A framework of granular computing can be established by applying its own principles. We examine such a framework from two perspectives, granular computing as structured thinking and structured problem solving. From the philosophical perspective or the conceptual level, granular computing focuses on structured thinking based on multiple levels of granularity. The implementation of such a philosophy in the application level deals with structured problem solving.

  4. Multiparty Computations

    DEFF Research Database (Denmark)

    Dziembowski, Stefan

    In this thesis we study the problem of doing Verifiable Secret Sharing (VSS) and Multiparty Computations in a model where private channels between the players and a broadcast channel are available. The adversary is active, adaptive and has unbounded computing power. The thesis is based on two...... an impossibility result indicating that a similar equivalence does not hold for Multiparty Computation (MPC): we show that even if protocols are given black-box access for free to an idealized secret sharing scheme secure for the access structure in question, it is not possible to handle all relevant access...
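
    A minimal sketch of the kind of threshold secret sharing that VSS protocols build on: plain Shamir sharing over a prime field, without the verification layer that the thesis actually studies; the field size and parameters are illustrative choices.

```python
# Shamir (t, n) secret sharing over a prime field: a degree-(t-1) polynomial
# with the secret as constant term; any t shares reconstruct it by Lagrange
# interpolation at x = 0. Not verifiable -- VSS adds commitments on top.
import random

P = 2**61 - 1          # a prime modulus (illustrative choice)

def share(secret, n, t):
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P   # den^-1 mod P
    return secret

shares = share(123456789, n=5, t=3)
print(reconstruct(shares[:3]))   # 123456789
```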

  5. Computational Streetscapes

    Directory of Open Access Journals (Sweden)

    Paul M. Torrens

    2016-09-01

    Full Text Available Streetscapes have presented a long-standing interest in many fields. Recently, there has been a resurgence of attention on streetscape issues, catalyzed in large part by computing. Because of computing, there is more understanding, vistas, data, and analysis of and on streetscape phenomena than ever before. This diversity of lenses trained on streetscapes permits us to address long-standing questions, such as how people use information while mobile, how interactions with people and things occur on streets, how we might safeguard crowds, how we can design services to assist pedestrians, and how we could better support special populations as they traverse cities. Amid each of these avenues of inquiry, computing is facilitating new ways of posing these questions, particularly by expanding the scope of what-if exploration that is possible. With assistance from computing, consideration of streetscapes now reaches across scales, from the neurological interactions that form among place cells in the brain up to informatics that afford real-time views of activity over whole urban spaces. For some streetscape phenomena, computing allows us to build realistic but synthetic facsimiles in computation, which can function as artificial laboratories for testing ideas. In this paper, I review the domain science for studying streetscapes from vantages in physics, urban studies, animation and the visual arts, psychology, biology, and behavioral geography. I also review the computational developments shaping streetscape science, with particular emphasis on modeling and simulation as informed by data acquisition and generation, data models, path-planning heuristics, artificial intelligence for navigation and way-finding, timing, synthetic vision, steering routines, kinematics, and geometrical treatment of collision detection and avoidance. I also discuss the implications that the advances in computing streetscapes might have on emerging developments in cyber
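
    One ingredient named above, grid-based path planning for simulated pedestrians, can be sketched with a small A* search; the grid, unit step costs and Manhattan heuristic are illustrative assumptions rather than a model taken from the reviewed literature:

```python
# A* on a 4-connected grid: 0 = walkable cell, 1 = blocked cell.
import heapq

def astar(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])   # Manhattan heuristic
    frontier = [(h(start), 0, start, [start])]
    seen = set()
    while frontier:
        _, cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                heapq.heappush(frontier, (cost + 1 + h((nr, nc)), cost + 1,
                                          (nr, nc), path + [(nr, nc)]))
    return None   # no path exists

grid = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))
```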

  6. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-01-01

    Full Text Available Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed in the foreign scientific and educational literature over the last decade, and to substantiate its importance, practical utility and right to recognition in Russian education. Methods. The research is based on the analysis of foreign studies of the phenomenon of computational thinking and the ways of its formation in the process of education, and on comparing the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept of «computational thinking» is analyzed from the point of view of intuitive understanding as well as scientific and applied aspects. It is shown how computational thinking has evolved alongside the development of computer hardware and software. The practice-oriented interpretation of computational thinking that is dominant among educators is described, along with some ways of forming it. It is shown that computational thinking is a metasubject result of general education as well as one of its tools. From the point of view of the author, the purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning. The dynamics of the development of this concept is described; this process is connected with the evolution of computer and information technologies as well as the increasing number of tasks for whose effective solution computational thinking is required. The author substantiates the claim that including «computational thinking» in the set of pedagogical concepts used in the national education system fills an existing gap. Practical significance. A new metasubject result of education associated with

  7. Chromatin computation.

    Directory of Open Access Journals (Sweden)

    Barbara Bryant

    Full Text Available In living cells, DNA is packaged along with protein and RNA into chromatin. Chemical modifications to nucleotides and histone proteins are added, removed and recognized by multi-functional molecular complexes. Here I define a new computational model, in which chromatin modifications are information units that can be written onto a one-dimensional string of nucleosomes, analogous to the symbols written onto cells of a Turing machine tape, and chromatin-modifying complexes are modeled as read-write rules that operate on a finite set of adjacent nucleosomes. I illustrate the use of this "chromatin computer" to solve an instance of the Hamiltonian path problem. I prove that chromatin computers are computationally universal--and therefore more powerful than the logic circuits often used to model transcription factor control of gene expression. Features of biological chromatin provide a rich instruction set for efficient computation of nontrivial algorithms in biological time scales. Modeling chromatin as a computer shifts how we think about chromatin function, suggests new approaches to medical intervention, and lays the groundwork for the engineering of a new class of biological computing machines.
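
    A toy sketch of the read-write rule idea described above, with a hypothetical three-letter alphabet of nucleosome marks and a single made-up spreading rule (the article's actual rule sets are richer and operate on real modification types):

```python
# Hypothetical marks: 'M' = methylated, 'A' = acetylated, 'U' = unmodified.
# A "chromatin-modifying complex" is modeled as a rule that scans adjacent
# nucleosomes and rewrites a matching window.
def apply_rule(nucleosomes, pattern, write):
    marks = list(nucleosomes)
    i = 0
    while i + len(pattern) <= len(marks):
        if marks[i:i + len(pattern)] == list(pattern):
            marks[i:i + len(write)] = list(write)
            i += len(write)
        else:
            i += 1
    return "".join(marks)

state = "UUMAUUMAUU"
state = apply_rule(state, pattern="MA", write="MM")   # rule that spreads 'M'
print(state)   # UUMMUUMMUU
```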

  8. Computing methods

    CERN Document Server

    Berezin, I S

    1965-01-01

    Computing Methods, Volume 2 is a five-chapter text that presents numerical methods for solving sets of several mathematical equations. This volume covers the solution of sets of linear algebraic equations, high-degree equations and transcendental equations, numerical methods for finding eigenvalues, and approximate methods for solving ordinary differential equations, partial differential equations and integral equations. The book is intended as a text-book for students in mechanical mathematical and physics-mathematical faculties specializing in computer mathematics and persons interested in the
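
    As a small illustration of one class of methods the volume covers, here is a transcendental equation solved by Newton's method; the example equation x = cos(x) is chosen arbitrarily:

```python
# Newton's method for f(x) = x - cos(x) = 0.
import math

def newton(f, df, x0, tol=1e-12, max_iter=50):
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

root = newton(lambda x: x - math.cos(x), lambda x: 1 + math.sin(x), x0=1.0)
print(root)   # ~0.739085...
```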

  9. Cloud Computing

    CERN Document Server

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for d

  10. Computer interfacing

    CERN Document Server

    Dixey, Graham

    1994-01-01

    This book explains how computers interact with the world around them and therefore how to make them a useful tool. Topics covered include descriptions of all the components that make up a computer, principles of data exchange, interaction with peripherals, serial communication, input devices, recording methods, computer-controlled motors, and printers.In an informative and straightforward manner, Graham Dixey describes how to turn what might seem an incomprehensible 'black box' PC into a powerful and enjoyable tool that can help you in all areas of your work and leisure. With plenty of handy

  11. Computational physics

    CERN Document Server

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.
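
    A tiny example of two techniques named above, a centered finite-difference derivative and a fast Fourier transform, computed with NumPy on an assumed test signal:

```python
# Periodic test signal sin(x) on [0, 2*pi): its centered finite-difference
# derivative should approximate cos(x), and its FFT peaks at wavenumber 1.
import numpy as np

x = np.linspace(0.0, 2.0 * np.pi, 1000, endpoint=False)
f = np.sin(x)

dfdx = (np.roll(f, -1) - np.roll(f, 1)) / (2.0 * (x[1] - x[0]))
spectrum = np.fft.rfft(f)

print(np.max(np.abs(dfdx - np.cos(x))))   # small discretization error
print(np.argmax(np.abs(spectrum)))        # 1
```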

  12. Animation graphic interface for the space shuttle onboard computer

    Science.gov (United States)

    Wike, Jeffrey; Griffith, Paul

    1989-01-01

    Graphics interfaces designed to operate on space qualified hardware challenge software designers to display complex information under processing power and physical size constraints. Under contract to Johnson Space Center, MICROEXPERT Systems is currently constructing an intelligent interface for the LASER DOCKING SENSOR (LDS) flight experiment. Part of this interface is a graphic animation display for Rendezvous and Proximity Operations. The displays have been designed in consultation with Shuttle astronauts. The displays show multiple views of a satellite relative to the shuttle, coupled with numeric attitude information. The graphics are generated using position data received by the Shuttle Payload and General Support Computer (PGSC) from the Laser Docking Sensor. Some of the design considerations include crew member preferences in graphic data representation, single versus multiple window displays, mission tailoring of graphic displays, realistic 3D images versus generic icon representations of real objects, the physical relationship of the observers to the graphic display, how numeric or textual information should interface with graphic data, in what frame of reference objects should be portrayed, recognizing conditions of display information-overload, and screen format and placement consistency.

  13. Computing Religion

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Braxton, Donald M.; Upal, Afzal

    2012-01-01

    The computational approach has become an invaluable tool in many fields that are directly relevant to research in religious phenomena. Yet the use of computational tools is almost absent in the study of religion. Given that religion is a cluster of interrelated phenomena and that research...... concerning these phenomena should strive for multilevel analysis, this article argues that the computational approach offers new methodological and theoretical opportunities to the study of religion. We argue that the computational approach offers 1.) an intermediary step between any theoretical construct...... and its targeted empirical space and 2.) a new kind of data which allows the researcher to observe abstract constructs, estimate likely outcomes, and optimize empirical designs. Because sophisticated mulitilevel research is a collaborative project we also seek to introduce to scholars of religion some...

  14. COMPUTERS HAZARDS

    Directory of Open Access Journals (Sweden)

    Andrzej Augustynek

    2007-01-01

    Full Text Available In June 2006, over 12.6 million Polish users of the Web were registered. On average, each of them spent 21 hours and 37 minutes per month browsing the Web. That is why the psychological aspects of computer use have become an urgent research subject. The results of research into the development of the Polish information society, carried out at AGH University of Science and Technology under the leadership of Leslaw H. Haber from 2000 to the present, indicate the emergence of dynamic changes in the ways computers are used and in their circumstances. One of the interesting regularities has been the inversely proportional relation between the level of computer skills and the frequency of Web use. It has been found that in 2005, compared to 2000, the following changes occurred: a significant drop in the number of students who never used computers and the Web; a remarkable increase in computer knowledge and skills (particularly pronounced in the case of first-year students); a decreasing gap in computer skills between students of the first and the third year and between male and female students; and a declining popularity of computer games. It has also been demonstrated that the hazard of computer screen addiction was highest in the case of unemployed youth outside the school system: as much as 12% of this group of young people were addicted to computers, and the large amount of leisure time they enjoyed induced them to excessive use of the Web. Polish housewives are another population group at risk of addiction to the Web. The duration of long Web chats carried out by ever younger youths has been another matter of concern. Since the phenomenon of computer addiction is relatively new, no specific therapy methods have been developed. In general, the therapy applied to computer addiction syndrome is similar to the techniques applied in cases of alcohol or gambling addiction. Individual and group

  15. Quantum Computers

    Science.gov (United States)

    2010-03-04

    efficient or less costly than their classical counterparts. A large-scale quantum computer is certainly an extremely ambitious goal, appearing to us... outperform the largest classical supercomputers in solving some specific problems important for data encryption. In the long term, another application... which the quantum computer depends, causing the quantum mechanically destructive process known as decoherence. Decoherence comes in several forms

  16. Computational sustainability

    CERN Document Server

    Kersting, Kristian; Morik, Katharina

    2016-01-01

    The book at hand gives an overview of the state of the art research in Computational Sustainability as well as case studies of different application scenarios. This covers topics such as renewable energy supply, energy storage and e-mobility, efficiency in data centers and networks, sustainable food and water supply, sustainable health, industrial production and quality, etc. The book describes computational methods and possible application scenarios.

  17. Computational oncology.

    Science.gov (United States)

    Lefor, Alan T

    2011-08-01

    Oncology research has traditionally been conducted using techniques from the biological sciences. The new field of computational oncology has forged a new relationship between the physical sciences and oncology to further advance research. By applying physics and mathematics to oncologic problems, new insights will emerge into the pathogenesis and treatment of malignancies. One major area of investigation in computational oncology centers around the acquisition and analysis of data, using improved computing hardware and software. Large databases of cellular pathways are being analyzed to understand the interrelationship among complex biological processes. Computer-aided detection is being applied to the analysis of routine imaging data including mammography and chest imaging to improve the accuracy and detection rate for population screening. The second major area of investigation uses computers to construct sophisticated mathematical models of individual cancer cells as well as larger systems using partial differential equations. These models are further refined with clinically available information to more accurately reflect living systems. One of the major obstacles in the partnership between physical scientists and the oncology community is communications. Standard ways to convey information must be developed. Future progress in computational oncology will depend on close collaboration between clinicians and investigators to further the understanding of cancer using these new approaches.

  18. Computational chemistry

    Science.gov (United States)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers, modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials with and without the presence of gases. Computational chemistry has applications in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  19. Computer Navigation-aided Resection of Sacral Chordomas

    Directory of Open Access Journals (Sweden)

    Yong-Kun Yang

    2016-01-01

    Full Text Available Background: Resection of sacral chordomas is challenging. The anatomy is complex, and there are often no bony landmarks to guide the resection. Achieving adequate surgical margins is, therefore, difficult, and the recurrence rate is high. Use of computer navigation may allow optimal preoperative planning and improve precision in tumor resection. The purpose of this study was to evaluate the safety and feasibility of computer navigation-aided resection of sacral chordomas. Methods: Between 2007 and 2013, a total of 26 patients with sacral chordoma who underwent computer navigation-aided surgery were included and followed for a minimum of 18 months. There were 21 primary cases and 5 recurrent cases, with a mean age of 55.8 years (range: 35-84 years). Tumors were located above the level of the S3 neural foramen in 23 patients and below the level of the S3 neural foramen in 3 patients. Three-dimensional images were reconstructed with a computed tomography-based navigation system combined with the magnetic resonance images using the navigation software. Tumors were resected via a posterior approach assisted by the computer navigation. Mean follow-up was 38.6 months (range: 18-84 months). Results: Mean operative time was 307 min. Mean intraoperative blood loss was 3065 ml. For computer navigation, the mean registration deviation during surgery was 1.7 mm. There were 18 wide resections, 4 marginal resections, and 4 intralesional resections. All patients were alive at the final follow-up, with 2 (7.7%) exhibiting tumor recurrence. The other 24 patients were tumor-free. The mean Musculoskeletal Tumor Society Score was 27.3 (range: 19-30). Conclusions: Computer-assisted navigation can be safely applied to the resection of sacral chordomas, allowing execution of preoperative plans and achieving good oncological outcomes. Nevertheless, this needs to be accomplished by surgeons with adequate experience and skill.

  20. Computational creativity

    Directory of Open Access Journals (Sweden)

    López de Mántaras Badia, Ramon

    2013-12-01

    Full Text Available New technologies, and in particular artificial intelligence, are drastically changing the nature of creative processes. Computers are playing very significant roles in creative activities such as music, architecture, fine arts, and science. Indeed, the computer is already a canvas, a brush, a musical instrument, and so on. However, we believe that we must aim at more ambitious relations between computers and creativity. Rather than just seeing the computer as a tool to help human creators, we could see it as a creative entity in its own right. This view has triggered a new subfield of Artificial Intelligence called Computational Creativity. This article addresses the question of the possibility of achieving computational creativity through some examples of computer programs capable of replicating some aspects of creative behavior in the fields of music and science.

  1. Improved detection of bone metastases from lung cancer in the thoracic cage using 5- and 1-mm axial images versus a new CT software generating rib unfolding images: comparison with standard ¹⁸F-FDG-PET/CT.

    Science.gov (United States)

    Homann, Georg; Mustafa, Deedar F; Ditt, Hendrik; Spengler, Werner; Kopp, Hans-Georg; Nikolaou, Konstantin; Horger, Marius

    2015-04-01

    To evaluate the performance of a dedicated computed tomography (CT) software called "bone reading" generating rib unfolded images for improved detection of rib metastases in patients with lung cancer in comparison to readings of 5- and 1-mm axial CT images and (18)F-fluorodeoxyglucose positron emission tomography/computed tomography (FDG-PET/CT). Ninety consecutive patients who underwent (18)F-FDG-PET/CT and chest CT scanning between 2012 and 2014 at our institution were analyzed retrospectively. Chest CT scans with 5- and 1-mm slice thickness were interpreted blindly and separately focused on the detection of rib metastases (location, number, cortical vs. medullary, and osteoblastic vs. sclerotic). Subsequent image analysis of unfolded 1 mm-based CT rib images was performed. For all three data sets the reading time was registered. Finally, results were compared to those of FDG-PET. Validation was based on FDG-PET positivity for osteolytic and mixed osteolytic/osteoblastic focal rib lesions and follow-up for sclerotic PET-negative lesions. A total of 47 metastatic rib lesions were found on FDG-PET/CT plus another 30 detected by CT bone reading and confirmed by follow-up CT. Twenty-nine lesions were osteolytic, 14 were mixed osteolytic/osteoblastic, and 34 were sclerotic. On a patient-based analysis, CT (5 mm), CT (1 mm), and CT (1-mm bone reading) yielded a sensitivity, specificity, and accuracy of 76.5/97.3/93, 81.3/97.3/94, and 88.2/95.9/92, respectively. On segment-based (unfolded rib) analysis, the sensitivity, specificity, and accuracy of the three evaluations were 47.7/95.7/67, 59.5/95.8/77, and 94.8/88.2/92, respectively. Reading time for 5 mm/1 mm axial images and unfolded images was 40.5/50.7/21.56 seconds, respectively. The use of unfolded rib images in patients with lung cancer improves sensitivity and specificity of rib metastasis detection in comparison to 5- and 1-mm CT slice reading. Moreover, it may reduce the reading time.
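
    The sensitivity, specificity, and accuracy values quoted above are standard confusion-matrix quantities; the following minimal Python helper (with made-up counts, not the study's data) shows how such figures are computed:

      def diagnostic_metrics(tp, fp, tn, fn):
          """Sensitivity, specificity and accuracy from confusion-matrix counts."""
          sensitivity = tp / (tp + fn)
          specificity = tn / (tn + fp)
          accuracy = (tp + tn) / (tp + fp + tn + fn)
          return sensitivity, specificity, accuracy

      # Hypothetical counts for illustration only (not the counts from this study):
      print(diagnostic_metrics(tp=90, fp=8, tn=60, fn=5))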

  2. Cross-Hole Radar Travel-Time Tomography Based on Digital Image Segmentation

    Institute of Scientific and Technical Information of China (English)

    曲昕馨; 李桐林; 王飞

    2014-01-01

    The effectiveness of cross-hole radar tomography depends mainly on the quality of the extracted first arrival-times. The digital image segmentation method, based on the projection onto convex sets (POCS) technique, extracts first arrival-times by segmenting a color image of the energy ratio, and had previously been applied to refracted seismic first arrival-time extraction. We applied the digital image segmentation method to cross-hole radar travel-time tomography for the first time, reconstructing the velocity field using an iteratively linearized inversion approach. During the inversion, the LSQR algorithm was employed to solve the system of linear equations, the Jacobian matrix was constructed by a curved ray tracing technique, and the travel times were calculated using the Multistencils Fast Marching Method (MSFM). We employed a synthetic data set and a field data set to test the effectiveness of the digital image segmentation method in travel-time tomography. For comparison, a traditional energy ratio method was also used for first arrival-time extraction. The results show that tomography based on the digital image segmentation method is more accurate, with smaller residuals, and can provide a more reliable estimate of the underground velocity field.
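
    The inversion loop described here (linearize, build a Jacobian of ray path lengths, solve the linear system with LSQR, update the slowness model) can be sketched as below. The curved ray tracer and the fast-marching forward solver are left as user-supplied callables, and all names and shapes are illustrative assumptions rather than the authors' code:

      import numpy as np
      from scipy.sparse.linalg import lsqr

      def invert_traveltimes(t_obs, slowness0, trace_rays, forward_times,
                             n_iter=10, damp=0.1):
          """Iteratively linearized travel-time inversion (sketch).

          t_obs         observed first arrival-times, shape (n_rays,)
          slowness0     starting slowness model, shape (n_cells,)
          trace_rays    returns the Jacobian of ray path lengths (n_rays x n_cells)
                        for the current model (e.g. from curved ray tracing)
          forward_times returns predicted travel times for a model
                        (e.g. from a fast marching solver such as MSFM)
          """
          m = slowness0.copy()
          for _ in range(n_iter):
              J = trace_rays(m)                 # path-length Jacobian
              r = t_obs - forward_times(m)      # travel-time residuals
              dm = lsqr(J, r, damp=damp)[0]     # damped least-squares update
              m = m + dm
          return m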

  3. Cloud Computing

    DEFF Research Database (Denmark)

    Krogh, Simon

    2013-01-01

    The second half of the 20th century has been characterized by an explosive development in information technology (Maney, Hamm, & O'Brien, 2011). Processing power, storage capacity and network bandwidth have increased exponentially, resulting in new possibilities and shifting IT paradigms. In step...... with technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating...... the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production...

  4. Quantum computers.

    Science.gov (United States)

    Ladd, T D; Jelezko, F; Laflamme, R; Nakamura, Y; Monroe, C; O'Brien, J L

    2010-03-04

    Over the past several decades, quantum information science has emerged to seek answers to the question: can we gain some advantage by storing, transmitting and processing information encoded in systems that exhibit unique quantum properties? Today it is understood that the answer is yes, and many research groups around the world are working towards the highly ambitious technological goal of building a quantum computer, which would dramatically improve computational power for particular tasks. A number of physical systems, spanning much of modern physics, are being developed for quantum computation. However, it remains unclear which technology, if any, will ultimately prove successful. Here we describe the latest developments for each of the leading approaches and explain the major challenges for the future.

  5. Computational mechanics

    Energy Technology Data Exchange (ETDEWEB)

    Raboin, P J

    1998-01-01

    The Computational Mechanics thrust area is a vital and growing facet of the Mechanical Engineering Department at Lawrence Livermore National Laboratory (LLNL). This work supports the development of computational analysis tools in the areas of structural mechanics and heat transfer. Over 75 analysts depend on thrust area-supported software running on a variety of computing platforms to meet the demands of LLNL programs. Interactions with the Department of Defense (DOD) High Performance Computing and Modernization Program and the Defense Special Weapons Agency are of special importance as they support our ParaDyn project in its development of new parallel capabilities for DYNA3D. Working with DOD customers has been invaluable to driving this technology in directions mutually beneficial to the Department of Energy. Other projects associated with the Computational Mechanics thrust area include work with the Partnership for a New Generation Vehicle (PNGV) for "Springback Predictability" and with the Federal Aviation Administration (FAA) for the "Development of Methodologies for Evaluating Containment and Mitigation of Uncontained Engine Debris." In this report for FY-97, there are five articles detailing three code development activities and two projects that synthesized new code capabilities with new analytic research in damage/failure and biomechanics. The articles this year are: (1) Energy- and Momentum-Conserving Rigid-Body Contact for NIKE3D and DYNA3D; (2) Computational Modeling of Prosthetics: A New Approach to Implant Design; (3) Characterization of Laser-Induced Mechanical Failure Damage of Optical Components; (4) Parallel Algorithm Research for Solid Mechanics Applications Using Finite Element Analysis; and (5) An Accurate One-Step Elasto-Plasticity Algorithm for Shell Elements in DYNA3D.

  6. Computational Artifacts

    DEFF Research Database (Denmark)

    Schmidt, Kjeld; Bansler, Jørgen P.

    2016-01-01

    The key concern of CSCW research is that of understanding computing technologies in the social context of their use, that is, as integral features of our practices and our lives, and to think of their design and implementation under that perspective. However, the question of the nature of that which is actually integrated in our practices is often discussed in confusing ways, if at all. The article aims to try to clarify the issue and in doing so revisits and reconsiders the notion of ‘computational artifact’....

  7. Distributed computing

    CERN Document Server

    Van Renesse, R

    1991-01-01

    This series will start with an introduction to distributed computing systems. Distributed computing paradigms will be presented followed by a discussion on how several important contemporary distributed operating systems use these paradigms. Topics will include processing paradigms, storage paradigms, scalability and robustness. Throughout the course everything will be illustrated by modern distributed systems notably the Amoeba distributed operating system of the Free University in Amsterdam and the Plan 9 operating system of AT&T Bell Laboratories. Plan 9 is partly designed and implemented by Ken Thompson, the main person behind the successful UNIX operating system.

  8. Computer busses

    CERN Document Server

    Buchanan, William

    2000-01-01

    As more and more equipment is interface- or 'bus'-driven, either by the use of controllers or directly from PCs, the question of which bus to use is becoming increasingly important both in industry and in the office. 'Computer Busses' has been designed to help choose the best type of bus for the particular application. There are several books which cover individual busses, but none which provide a complete guide to computer busses. The author provides a basic theory of busses and draws examples and applications from real bus case studies. Busses are analysed using a top-down approach, helping

  9. Cloud Computing

    CERN Document Server

    Antonopoulos, Nick

    2010-01-01

    Cloud computing has recently emerged as a subject of substantial industrial and academic interest, though its meaning and scope are hotly debated. For some researchers, clouds are a natural evolution towards the full commercialisation of grid systems, while others dismiss the term as a mere re-branding of existing pay-per-use technologies. From either perspective, 'cloud' is now the label of choice for accountable pay-per-use access to third party applications and computational resources on a massive scale. Clouds support patterns of less predictable resource use for applications and services a

  10. Computational artifacts

    DEFF Research Database (Denmark)

    Schmidt, Kjeld; Bansler, Jørgen P.

    2016-01-01

    The key concern of CSCW research is that of understanding computing technologies in the social context of their use, that is, as integral features of our practices and our lives, and to think of their design and implementation under that perspective. However, the question of the nature...... of that which is actually integrated in our practices is often discussed in confusing ways, if at all. The article aims to try to clarify the issue and in doing so revisits and reconsiders the notion of ‘computational artifact’....

  11. Reconfigurable Computing

    CERN Document Server

    Cardoso, Joao MP

    2011-01-01

    As the complexity of modern embedded systems increases, it becomes less practical to design monolithic processing platforms. As a result, reconfigurable computing is being adopted widely for more flexible design. Reconfigurable Computers offer the spatial parallelism and fine-grained customizability of application-specific circuits with the postfabrication programmability of software. To make the most of this unique combination of performance and flexibility, designers need to be aware of both hardware and software issues. FPGA users must think not only about the gates needed to perform a comp

  12. COMPUTATIONAL THINKING

    OpenAIRE

    Evgeniy K. Khenner

    2016-01-01

    Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed over the last decade in the foreign scientific and educational literature, and to substantiate its importance, practical utility, and its right to a place in Russian education. Methods. The research is based on the analysis of foreign studies of the phenomenon of computational thinking and the ways of its formation in the process of education;...

  13. Computational engineering

    CERN Document Server

    2014-01-01

    The book presents state-of-the-art works in computational engineering. Focus is on mathematical modeling, numerical simulation, experimental validation and visualization in engineering sciences. In particular, the following topics are presented: constitutive models and their implementation into finite element codes, numerical models in nonlinear elasto-dynamics including seismic excitations, multiphase models in structural engineering and multiscale models of materials systems, sensitivity and reliability analysis of engineering structures, the application of scientific computing in urban water management and hydraulic engineering, and the application of genetic algorithms for the registration of laser scanner point clouds.

  14. Computer immunology.

    Science.gov (United States)

    Forrest, Stephanie; Beauchemin, Catherine

    2007-04-01

    This review describes a body of work on computational immune systems that behave analogously to the natural immune system. These artificial immune systems (AIS) simulate the behavior of the natural immune system and in some cases have been used to solve practical engineering problems such as computer security. AIS have several strengths that can complement wet lab immunology. It is easier to conduct simulation experiments and to vary experimental conditions, for example, to rule out hypotheses; it is easier to isolate a single mechanism to test hypotheses about how it functions; agent-based models of the immune system can integrate data from several different experiments into a single in silico experimental system.

  15. Computer systems

    Science.gov (United States)

    Olsen, Lola

    1992-01-01

    In addition to the discussions, Ocean Climate Data Workshop hosts gave participants an opportunity to hear about, see, and test for themselves some of the latest computer tools now available for those studying climate change and the oceans. Six speakers described computer systems and their functions. The introductory talks were followed by demonstrations to small groups of participants and some opportunities for participants to get hands-on experience. After this familiarization period, attendees were invited to return during the course of the Workshop and have one-on-one discussions and further hands-on experience with these systems. Brief summaries or abstracts of introductory presentations are addressed.

  16. Computer viruses

    Science.gov (United States)

    Denning, Peter J.

    1988-01-01

    The worm, Trojan horse, bacterium, and virus are destructive programs that attack information stored in a computer's memory. Virus programs, which propagate by incorporating copies of themselves into other programs, are a growing menace in the late-1980s world of unprotected, networked workstations and personal computers. Limited immunity is offered by memory protection hardware, digitally authenticated object programs, and antibody programs that kill specific viruses. Additional immunity can be gained from the practice of digital hygiene, primarily the refusal to use software from untrusted sources. Full immunity requires attention in a social dimension, the accountability of programmers.

  17. Computer security

    CERN Document Server

    Gollmann, Dieter

    2011-01-01

    A completely up-to-date resource on computer security Assuming no previous experience in the field of computer security, this must-have book walks you through the many essential aspects of this vast topic, from the newest advances in software and technology to the most recent information on Web applications security. This new edition includes sections on Windows NT, CORBA, and Java and discusses cross-site scripting and JavaScript hacking as well as SQL injection. Serving as a helpful introduction, this self-study guide is a wonderful starting point for examining the variety of competing sec

  18. Computational Literacy

    DEFF Research Database (Denmark)

    Chongtay, Rocio; Robering, Klaus

    2016-01-01

    for the acquisition of Computational Literacy at basic educational levels, focus on higher levels of education has been much less prominent. The present paper considers the case of courses for higher education programs within the Humanities. A model is proposed which conceives of Computational Literacy as a layered...... set of skills rather than one single skill. Skills acquisition at these layers can be tailored to the specific needs of students. The work presented here builds upon experience from courses for such students from the Humanities in which programming is taught as a tool for other purposes. Results...

  19. Induction of social behavior in zebrafish: live versus computer animated fish as stimuli.

    Science.gov (United States)

    Qin, Meiying; Wong, Albert; Seguin, Diane; Gerlai, Robert

    2014-06-01

    The zebrafish offers an excellent compromise between system complexity and practical simplicity and has been suggested as a translational research tool for the analysis of human brain disorders associated with abnormalities of social behavior. Unlike laboratory rodents zebrafish are diurnal, thus visual cues may be easily utilized in the analysis of their behavior and brain function. Visual cues, including the sight of conspecifics, have been employed to induce social behavior in zebrafish. However, the method of presentation of these cues and the question of whether computer animated images versus live stimulus fish have differential effects have not been systematically analyzed. Here, we compare the effects of five stimulus presentation types: live conspecifics in the experimental tank or outside the tank, playback of video-recorded live conspecifics, computer animated images of conspecifics presented by two software applications, the previously employed General Fish Animator, and a new application Zebrafish Presenter. We report that all stimuli were equally effective and induced a robust social response (shoaling) manifesting as reduced distance between stimulus and experimental fish. We conclude that presentation of live stimulus fish, or 3D images, is not required and 2D computer animated images are sufficient to induce robust and consistent social behavioral responses in zebrafish.

  20. Riemannian computing in computer vision

    CERN Document Server

    Srivastava, Anuj

    2016-01-01

    This book presents a comprehensive treatise on Riemannian geometric computations and related statistical inferences in several computer vision problems. This edited volume includes chapter contributions from leading figures in the field of computer vision who are applying Riemannian geometric approaches in problems such as face recognition, activity recognition, object detection, biomedical image analysis, and structure-from-motion. Some of the mathematical entities that necessitate a geometric analysis include rotation matrices (e.g. in modeling camera motion), stick figures (e.g. for activity recognition), subspace comparisons (e.g. in face recognition), symmetric positive-definite matrices (e.g. in diffusion tensor imaging), and function-spaces (e.g. in studying shapes of closed contours). The book illustrates Riemannian computing theory on applications in computer vision, machine learning, and robotics, with emphasis on algorithmic advances that will allow re-application in other...
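
    For one of the entities mentioned, symmetric positive-definite (SPD) matrices, the affine-invariant Riemannian distance has a convenient closed form; the sketch below illustrates the general idea and is not an excerpt from the book:

      import numpy as np
      from scipy.linalg import logm, fractional_matrix_power

      def spd_distance(A, B):
          """Affine-invariant Riemannian distance between SPD matrices A and B."""
          A_inv_sqrt = fractional_matrix_power(A, -0.5)
          M = A_inv_sqrt @ B @ A_inv_sqrt        # congruence transform; M is again SPD
          return np.linalg.norm(logm(M), 'fro')  # Frobenius norm of the matrix logarithm

      A = np.array([[2.0, 0.3], [0.3, 1.0]])
      B = np.array([[1.5, 0.1], [0.1, 2.0]])
      print(spd_distance(A, B))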

  1. Computational biology

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Computation via biological devices has been the subject of close scrutiny since von Neumann’s early work some 60 years ago. In spite of the many relevant works in this field, the notion of programming biological devices seems to be, at best, ill-defined. While many devices are claimed or proved t...

  2. Computational Physics.

    Science.gov (United States)

    Borcherds, P. H.

    1986-01-01

    Describes an optional course in "computational physics" offered at the University of Birmingham. Includes an introduction to numerical methods and presents exercises involving fast-Fourier transforms, non-linear least-squares, Monte Carlo methods, and the three-body problem. Recommends adding laboratory work into the course in the…

  3. Computational Finance

    DEFF Research Database (Denmark)

    Rasmussen, Lykke

    One of the major challenges in todays post-crisis finance environment is calculating the sensitivities of complex products for hedging and risk management. Historically, these derivatives have been determined using bump-and-revalue, but due to the increasing magnitude of these computations does...

  4. Computational Finance

    DEFF Research Database (Denmark)

    Rasmussen, Lykke

    One of the major challenges in todays post-crisis finance environment is calculating the sensitivities of complex products for hedging and risk management. Historically, these derivatives have been determined using bump-and-revalue, but due to the increasing magnitude of these computations does...

  5. Computational Logistics

    DEFF Research Database (Denmark)

    Pacino, Dario; Voss, Stefan; Jensen, Rune Møller

    2013-01-01

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized...... in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management....

  6. Computational Logistics

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized...... in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management....

  7. Computing News

    CERN Multimedia

    McCubbin, N

    2001-01-01

    We are still five years from the first LHC data, so we have plenty of time to get the computing into shape, don't we? Well, yes and no: there is time, but there's an awful lot to do! The recently-completed CERN Review of LHC Computing gives the flavour of the LHC computing challenge. The hardware scale for each of the LHC experiments is millions of 'SpecInt95' (SI95) units of cpu power and tens of PetaBytes of data storage. PCs today are about 20-30SI95, and expected to be about 100 SI95 by 2005, so it's a lot of PCs. This hardware will be distributed across several 'Regional Centres' of various sizes, connected by high-speed networks. How to realise this in an orderly and timely fashion is now being discussed in earnest by CERN, Funding Agencies, and the LHC experiments. Mixed in with this is, of course, the GRID concept...but that's a topic for another day! Of course hardware, networks and the GRID constitute just one part of the computing. Most of the ATLAS effort is spent on software development. What we ...

  8. Computational trigonometry

    Energy Technology Data Exchange (ETDEWEB)

    Gustafson, K. [Univ. of Colorado, Boulder, CO (United States)

    1994-12-31

    By means of the author's earlier theory of antieigenvalues and antieigenvectors, a new computational approach to iterative methods is presented. This enables an explicit trigonometric understanding of iterative convergence and provides new insights into the sharpness of error bounds. Direct applications to Gradient descent, Conjugate gradient, GCR(k), Orthomin, CGN, GMRES, CGS, and other matrix iterative schemes will be given.
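
    For a symmetric positive-definite matrix, the first antieigenvalue (the cosine of the largest angle by which the operator can turn a vector) has the closed form 2*sqrt(lambda_min*lambda_max)/(lambda_min + lambda_max); the following sketch, an illustration rather than material from the paper, checks this against the variational definition numerically:

      import numpy as np

      def first_antieigenvalue(A):
          """Closed-form first antieigenvalue of an SPD matrix A."""
          eig = np.linalg.eigvalsh(A)
          lo, hi = eig[0], eig[-1]
          return 2.0 * np.sqrt(lo * hi) / (lo + hi)

      # Cross-check against the variational definition min_x <Ax,x>/(|Ax| |x|),
      # sampled over random directions (the sampled minimum approaches 0.6 from above).
      A = np.diag([1.0, 4.0, 9.0])
      xs = np.random.default_rng(0).normal(size=(20000, 3))
      Ax = xs @ A
      vals = np.einsum('ij,ij->i', Ax, xs) / (np.linalg.norm(Ax, axis=1) *
                                              np.linalg.norm(xs, axis=1))
      print(first_antieigenvalue(A), vals.min())   # 0.6 and a value slightly above it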

  9. [Grid computing

    CERN Multimedia

    Wolinsky, H

    2003-01-01

    "Turn on a water spigot, and it's like tapping a bottomless barrel of water. Ditto for electricity: Flip the switch, and the supply is endless. But computing is another matter. Even with the Internet revolution enabling us to connect in new ways, we are still limited to self-contained systems running locally stored software, limited by corporate, institutional and geographic boundaries" (1 page).

  10. {sup 18}F-FDG PET-CT imaging versus bone marrow biopsy in pediatric Hodgkin's lymphoma: a quantitative assessment of marrow uptake and novel insights into clinical implications of marrow involvement

    Energy Technology Data Exchange (ETDEWEB)

    Hassan, Aamna; Siddique, Maimoona; Bashir, Humayun; Riaz, Saima; Nawaz, M.K. [Shaukat Khanum Memorial Cancer Hospital and Research Centre, Department of Nuclear Medicine, Lahore (Pakistan); Wali, Rabia; Mahreen, Asma [Shaukat Khanum Memorial Cancer Hospital and Research Centre, Paediatric Oncology, Lahore (Pakistan)

    2017-07-15

    To evaluate whether positron emission tomography/computed tomography using fluorine-18 fluoro-deoxyglucose ({sup 18}F-FDG PET-CT) predicts bone marrow involvement (BMI) in pediatric Hodgkin's lymphoma (pHL) with sufficient accuracy to supplant routine staging bone marrow biopsy (BMB), and to assess the clinical importance of marrow disease by comparing the prognosis of stage IV HL with BMI versus that without BMI. Data were retrospectively analyzed for all cases of pHL between July 2010 and June 2015 referred for staging {sup 18}F-FDG PET-CT scan and BMB. The reference standard was BMB. Stage IV patients were divided into three groups to compare their progression-free and overall survival: PET+ BMB-, PET+ BMB+, and PET- BMB-. Of the 784 patients, 83.3% were male and 16.7% female, with age ranging from 2 to 18 years (mean 10.3 years). Among the total cases, 104 (13.3%) had BMI; of these, 100 were detected by PET imaging and 58 by BMB. BMB and {sup 18}F-FDG PET/CT scans were concordant for BMI detection in 728 patients (93%): positive concordance in 54 and negative in 674. Of the 56 discordant cases, four had a false-negative PET scans and were upstaged by BMB, 46 with focal uptake were PET/CT-positive and BMB-negative (not obtained from active sites), and six with diffuse uptake were false-positive on PET due to paraneoplastic marrow activation. The sensitivity, specificity, PPV, and NPV of PET for identifying BMI was 93.6, 94, 53, and 99.4% respectively. On quantitative assessment, mean iBM-SUV{sub max} of bilateral iliac crests was significantly higher in those with BMI versus those without (p < 0.05). {sup 18}F-FDG PET-CT imaging is more sensitive than BMB for BMI detection in pHL staging. BMB should be limited to those with normal marrow uptake in the presence of poor risk factors or those with diffusely increased uptake to exclude marrow involvement in the background of reactive marrow. (orig.)

  11. Computable Frames in Computable Banach Spaces

    Directory of Open Access Journals (Sweden)

    S.K. Kaushik

    2016-06-01

    Full Text Available We develop some parts of the frame theory in Banach spaces from the point of view of Computable Analysis. We define a computable M-basis and use it to construct a computable Banach space of scalar-valued sequences. Computable Xd frames and computable Banach frames are also defined and computable versions of sufficient conditions for their existence are obtained.

  12. Computational Combustion

    Energy Technology Data Exchange (ETDEWEB)

    Westbrook, C K; Mizobuchi, Y; Poinsot, T J; Smith, P J; Warnatz, J

    2004-08-26

    Progress in the field of computational combustion over the past 50 years is reviewed. Particular attention is given to those classes of models that are common to most system modeling efforts, including fluid dynamics, chemical kinetics, liquid sprays, and turbulent flame models. The developments in combustion modeling are placed into the time-dependent context of the accompanying exponential growth in computer capabilities and Moore's Law. Superimposed on this steady growth, the occasional sudden advances in modeling capabilities are identified and their impacts are discussed. Integration of submodels into system models for spark ignition, diesel and homogeneous charge, compression ignition engines, surface and catalytic combustion, pulse combustion, and detonations are described. Finally, the current state of combustion modeling is illustrated by descriptions of a very large jet lifted 3D turbulent hydrogen flame with direct numerical simulation and 3D large eddy simulations of practical gas burner combustion devices.

  13. Computational Electromagnetics

    CERN Document Server

    Rylander, Thomas; Bondeson, Anders

    2013-01-01

    Computational Electromagnetics is a young and growing discipline, expanding as a result of the steadily increasing demand for software for the design and analysis of electrical devices. This book introduces three of the most popular numerical methods for simulating electromagnetic fields: the finite difference method, the finite element method and the method of moments. In particular it focuses on how these methods are used to obtain valid approximations to the solutions of Maxwell's equations, using, for example, "staggered grids" and "edge elements." The main goal of the book is to make the reader aware of different sources of errors in numerical computations, and also to provide the tools for assessing the accuracy of numerical methods and their solutions. To reach this goal, convergence analysis, extrapolation, von Neumann stability analysis, and dispersion analysis are introduced and used frequently throughout the book. Another major goal of the book is to provide students with enough practical understanding
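
    Of the three methods named, the finite difference approach is the simplest to show in a few lines; below is a minimal one-dimensional leapfrog update on a staggered grid in normalized units (free space, perfectly conducting ends), given as an illustration rather than code from the book:

      import numpy as np

      nz, nt = 200, 400
      dt = 0.5                          # Courant number c*dt/dz = 0.5 <= 1 for stability
      Ex = np.zeros(nz)                 # electric field at integer grid points
      Hy = np.zeros(nz - 1)             # magnetic field at half grid points

      for n in range(nt):
          Hy += dt * np.diff(Ex)                            # update H from the curl of E
          Ex[1:-1] += dt * np.diff(Hy)                      # update E from the curl of H
          Ex[nz // 4] += np.exp(-((n - 30) / 10.0) ** 2)    # soft Gaussian source
      # Ex[0] and Ex[-1] stay zero, acting as perfectly conducting boundaries.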

  14. Computational Physics

    Science.gov (United States)

    Thijssen, Jos

    2013-10-01

    1. Introduction; 2. Quantum scattering with a spherically symmetric potential; 3. The variational method for the Schrödinger equation; 4. The Hartree-Fock method; 5. Density functional theory; 6. Solving the Schrödinger equation in periodic solids; 7. Classical equilibrium statistical mechanics; 8. Molecular dynamics simulations; 9. Quantum molecular dynamics; 10. The Monte Carlo method; 11. Transfer matrix and diagonalisation of spin chains; 12. Quantum Monte Carlo methods; 13. The finite element method for partial differential equations; 14. The lattice Boltzmann method for fluid dynamics; 15. Computational methods for lattice field theories; 16. High performance computing and parallelism; Appendix A. Numerical methods; Appendix B. Random number generators; References; Index.

  15. Computational Electromagnetics

    Science.gov (United States)

    2011-02-20

    a collaboration between Caltech's postdoctoral associate N. Albin and OB have shown that, for a variety of reasons, the first-order... KZK approximation. Cited works: "... KZK approximation", Nathan Albin, Oscar P. Bruno, Theresa Y. Cheung and Robin O. Cleveland, preprint (2011); "A Spectral FC Solver for the Compressible Navier-Stokes Equations in General Domains I: Explicit time-stepping", Nathan Albin and Oscar P. Bruno, to appear in Journal of Computational Physics.

  16. Computer files.

    Science.gov (United States)

    Malik, M

    1995-02-01

    From what has been said, several recommendations can be made for users of small personal computers regardless of which operating system they use. If your computer has a large hard disk not specially required by any single application, organize the disk into a small number of volumes. You will then be using the computer as if it had several smaller disks, which will help you to create a logical file structure. The size of individual volumes has to be selected carefully with respect to the files kept in each volume. Otherwise, it may be that you will have too much space in one volume and not enough in another. In each volume, organize the structure of directories and subdirectories logically so that they correspond to the logic of your file content. Be aware of the fact that the directories suggested as default when installing new software are often not the optimum. For instance, it is better to put different graphics packages under a common subdirectory rather than to install them at the same level as all other packages including statistics, text processors, etc. Create a special directory for each task for which you use the computer. Note that it is a bad practice to keep many different and logically unsorted files in the root directory of any of your volumes. Only system and important service files should be kept there. Although any file may be written all over the disk, access to it will be faster if it is written over the minimum number of cylinders. From time to time, use special programs that reorganize your files in this way.(ABSTRACT TRUNCATED AT 250 WORDS)

  17. Everything Computes

    Institute of Scientific and Technical Information of China (English)

    Bill; Hofmann

    1999-01-01

    Dear American Professor, I am a student in Beijing. At the beginning of last semester, we four roommates gathered some 10,000 yuan (a big sum here, approximately 1,150 USD) and bought a computer, which is our joint property. Since the computer came into our room, it was used round the clock except the time we were having classes. So even at midnight, when I woke up from the dream, I could still see

  18. Computer Spectrometers

    Science.gov (United States)

    Dattani, Nikesh S.

    2017-06-01

    Ideally, the cataloguing of spectroscopic linelists would not demand laborious and expensive experiments. Whatever an experiment might achieve, the same information would be attainable by running a calculation on a computer. Kolos and Wolniewicz were the first to demonstrate that calculations on a computer can outperform even the most sophisticated molecular spectroscopic experiments of the time, when their 1964 calculations of the dissociation energies of H_2 and D_{2} were found to be more than 1 cm^{-1} larger than the best experiments by Gerhard Herzberg, suggesting the experiment violated a strict variational principle. As explained in his Nobel Lecture, it took 5 more years for Herzberg to perform an experiment which caught up to the accuracy of the 1964 calculations. Today, numerical solutions to the Schrödinger equation, supplemented with relativistic and higher-order quantum electrodynamics (QED) corrections can provide ro-vibrational spectra for molecules that we strongly believe to be correct, even in the absence of experimental data. Why do we believe these calculated spectra are correct if we do not have experiments against which to test them? All evidence seen so far suggests that corrections due to gravity or other forces are not needed for a computer simulated QED spectrum of ro-vibrational energy transitions to be correct at the precision of typical spectrometers. Therefore a computer-generated spectrum can be considered to be as good as one coming from a more conventional spectrometer, and this has been shown to be true not just for the H_2 energies back in 1964, but now also for several other molecules. So are we at the stage where we can launch an array of calculations, each with just the atomic number changed in the input file, to reproduce the NIST energy level databases? Not quite. But I will show that for the 6e^- molecule Li_2, we have reproduced the vibrational spacings to within 0.001 cm^{-1} of the experimental spectrum, and I will

  19. Data on analysis of coronary atherosclerosis on computed tomography and 18F-sodium fluoride positron emission tomography

    Directory of Open Access Journals (Sweden)

    Toshiro Kitagawa

    2017-08-01

    Full Text Available This article contains the data showing illustrative examples of plaque classification on coronary computed tomography angiography (CCTA) and measurement of 18F-sodium fluoride (18F-NaF) uptake in coronary atherosclerotic lesions on positron emission tomography (PET). We divided the lesions into one of three plaque types on CCTA (calcified plaque, non-calcified plaque, partially calcified plaque). Focal 18F-NaF uptake of each lesion was quantified using the maximum tissue-to-background ratio. This article also provides a representative case with a non-calcified coronary plaque detected on CCTA and identified on 18F-NaF PET/non-contrast computed tomography based on the location of a vessel branch as a landmark. These data complement those reported by Kitagawa et al. (2017) [1].

  20. Customizable computing

    CERN Document Server

    Chen, Yu-Ting; Gill, Michael; Reinman, Glenn; Xiao, Bingjun

    2015-01-01

    Since the end of Dennard scaling in the early 2000s, improving the energy efficiency of computation has been the main concern of the research community and industry. The large energy efficiency gap between general-purpose processors and application-specific integrated circuits (ASICs) motivates the exploration of customizable architectures, where one can adapt the architecture to the workload. In this Synthesis lecture, we present an overview and introduction of the recent developments on energy-efficient customizable architectures, including customizable cores and accelerators, on-chip memory

  1. Computer vision

    Science.gov (United States)

    Gennery, D.; Cunningham, R.; Saund, E.; High, J.; Ruoff, C.

    1981-01-01

    The field of computer vision is surveyed and assessed, key research issues are identified, and possibilities for a future vision system are discussed. The problems of descriptions of two- and three-dimensional worlds are discussed. The representation of such features as texture, edges, curves, and corners is detailed. Recognition methods are described in which cross correlation coefficients are maximized or numerical values for a set of features are measured. Object tracking is discussed in terms of the robust matching algorithms that must be devised. Stereo vision, camera control and calibration, and the hardware and systems architecture are discussed.

  2. Tensor computations in computer algebra systems

    CERN Document Server

    Korolkova, A V; Sevastyanov, L A

    2014-01-01

    This paper considers three types of tensor computations. On their basis, we attempt to formulate criteria that must be satisfied by a computer algebra system dealing with tensors. We briefly overview the current state of tensor computations in different computer algebra systems. The tensor computations are illustrated with appropriate examples implemented in specific systems: Cadabra and Maxima.

  3. Computational crystallization.

    Science.gov (United States)

    Altan, Irem; Charbonneau, Patrick; Snell, Edward H

    2016-07-15

    Crystallization is a key step in macromolecular structure determination by crystallography. While a robust theoretical treatment of the process is available, due to the complexity of the system, the experimental process is still largely one of trial and error. In this article, efforts in the field are discussed together with a theoretical underpinning using a solubility phase diagram. Prior knowledge has been used to develop tools that computationally predict the crystallization outcome and define mutational approaches that enhance the likelihood of crystallization. For the most part these tools are based on binary outcomes (crystal or no crystal), and the full information contained in an assembly of crystallization screening experiments is lost. The potential of this additional information is illustrated by examples where new biological knowledge can be obtained and where a target can be sub-categorized to predict which class of reagents provides the crystallization driving force. Computational analysis of crystallization requires complete and correctly formatted data. While massive crystallization screening efforts are under way, the data available from many of these studies are sparse. The potential for this data and the steps needed to realize this potential are discussed.

  4. Computer Tree

    Directory of Open Access Journals (Sweden)

    Onur AĞAOĞLU

    2014-12-01

    Full Text Available It is crucial that gifted and talented students be supported by different educational methods according to their interests and skills. The science and arts centres (gifted centres) provide the Supportive Education Program for these students with an interdisciplinary perspective. In line with the program, an ICT lesson entitled "Computer Tree" serves to identify learner readiness levels and define the basic conceptual framework. A language teacher also contributes to the process, since it caters for the creative function of the basic linguistic skills. The teaching technique is applied at the 9-11 age level. The lesson introduces an evaluation process covering the basic knowledge, skills, and interests of the target group. Furthermore, it includes an observation process by way of peer assessment. The lesson is considered to be a good example of planning for any subject, for the unpredicted convergence of visual and technical abilities with linguistic abilities.

  5. computer networks

    Directory of Open Access Journals (Sweden)

    N. U. Ahmed

    2002-01-01

    Full Text Available In this paper, we construct a new dynamic model for the Token Bucket (TB) algorithm used in computer networks and use a systems approach for its analysis. This model is then augmented by adding a dynamic model for a multiplexor at an access node where the TB exercises a policing function. In the model, traffic policing, multiplexing and network utilization are formally defined. Based on the model, we study such issues as quality of service (QoS), traffic sizing and network dimensioning. We also propose an algorithm using feedback control to improve QoS and network utilization. Applying MPEG video traces as the input traffic to the model, we verify the usefulness and effectiveness of our model.
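
    For readers unfamiliar with it, the classical token bucket that the paper's dynamic model builds on admits a packet only when enough tokens, replenished at a fixed rate up to a fixed depth, are available; the sketch below shows the textbook algorithm, not the authors' model:

      import time

      class TokenBucket:
          """Classical token-bucket policer: 'rate' tokens per second, depth 'burst'."""

          def __init__(self, rate, burst):
              self.rate = float(rate)
              self.burst = float(burst)
              self.tokens = float(burst)      # the bucket starts full
              self.last = time.monotonic()

          def allow(self, size=1.0):
              """Return True if a packet needing 'size' tokens conforms, else False."""
              now = time.monotonic()
              self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
              self.last = now
              if self.tokens >= size:
                  self.tokens -= size
                  return True
              return False

      bucket = TokenBucket(rate=100.0, burst=10.0)   # 100 tokens/s, bursts up to 10
      print(bucket.allow(size=4.0))                  # True: the bucket starts full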

  6. Social Computing

    CERN Document Server

    CERN. Geneva

    2011-01-01

    The past decade has witnessed a momentous transformation in the way people interact with each other. Content is now co-produced, shared, classified, and rated by millions of people, while attention has become the ephemeral and valuable resource that everyone seeks to acquire. This talk will describe how social attention determines the production and consumption of content within both the scientific community and social media, how its dynamics can be used to predict the future and the role that social media plays in setting the public agenda. About the speaker Bernardo Huberman is a Senior HP Fellow and Director of the Social Computing Lab at Hewlett Packard Laboratories. He received his Ph.D. in Physics from the University of Pennsylvania, and is currently a Consulting Professor in the Department of Applied Physics at Stanford University. He originally worked in condensed matter physics, ranging from superionic conductors to two-dimensional superfluids, and made contributions to the theory of critical p...

  7. Computational neuroscience

    CERN Document Server

    Blackwell, Kim L

    2014-01-01

    Progress in Molecular Biology and Translational Science provides a forum for discussion of new discoveries, approaches, and ideas in molecular biology. It contains contributions from leaders in their fields and abundant references. This volume brings together different aspects of, and approaches to, molecular and multi-scale modeling, with applications to a diverse range of neurological diseases. Mathematical and computational modeling offers a powerful approach for examining the interaction between molecular pathways and ionic channels in producing neuron electrical activity. It is well accepted that non-linear interactions among diverse ionic channels can produce unexpected neuron behavior and hinder a deep understanding of how ion channel mutations bring about abnormal behavior and disease. Interactions with the diverse signaling pathways activated by G protein coupled receptors or calcium influx adds an additional level of complexity. Modeling is an approach to integrate myriad data sources into a cohesiv...

  8. Brain computer

    Directory of Open Access Journals (Sweden)

    Sarah N. Abdulkader

    2015-07-01

    Full Text Available Brain computer interface technology represents a rapidly growing field of research with application systems. Its contributions in medical fields range from prevention to neuronal rehabilitation for serious injuries. Mind reading and remote communication have their unique fingerprint in numerous fields such as educational, self-regulation, production, marketing, security as well as games and entertainment. It creates a mutual understanding between users and the surrounding systems. This paper shows the application areas that could benefit from brain waves in facilitating or achieving their goals. We also discuss major usability and technical challenges that face brain signals utilization in various components of BCI system. Different solutions that aim to limit and decrease their effects have also been reviewed.

  9. Computational micromechanics

    Science.gov (United States)

    Ortiz, M.

    1996-09-01

    Selected issues in computational micromechanics are reviewed, with particular emphasis on multiple-scale problems and micromechanical models of material behavior. Examples considered include: the bridging of atomistic and continuum scales, with application to nanoindentation and the brittle-to-ductile transition; the development of dislocation-based constitutive relations for pure metallic crystals and intermetallic compounds, with applications to fracture of single crystals and bicrystals; the simulation of non-planar three-dimensional crack growth at the microscale, with application to mixed mode I-III effective behavior and crack trapping and bridging in fiber-reinforced composites; and the direct micromechanical simulation of fragmentation of brittle solids and subsequent flow of the comminuted phase.

  10. Differential diagnosis of ACTH-dependent hypercortisolism: imaging versus laboratory.

    Science.gov (United States)

    Andrioli, Massimiliano; Pecori Giraldi, Francesca; De Martin, Martina; Cattaneo, Agnese; Carzaniga, Chiara; Cavagnini, Francesco

    2009-01-01

    Differential diagnosis of ACTH-dependent Cushing's syndrome often presents major difficulties. Diagnostic troubles are increased by the suboptimal specificity of endocrine tests, the rarity of ectopic ACTH secretion and the frequent incidental discovery of pituitary adenomas. A 43-year-old female presented with mild signs and symptoms of hypercortisolism, and initial hormonal tests and results of pituitary imaging (7-mm adenoma) were suggestive of Cushing's disease. However, inadequate response to corticotrophin-releasing hormone and failure to suppress after 8 mg dexamethasone pointed towards an ectopic source. Total body CT scan visualized only a small, non-specific nodule in the right posterior costophrenic excavation. Inferior petrosal sinus sampling revealed an absent center:periphery ACTH gradient but octreoscan and (18)F-FDG-PET-CT failed to detect abnormal tracer accumulation. We weighed results of the laboratory with those of imaging and decided to remove the lung nodule. Pathology identified a typical, ACTH-staining carcinoid and the diagnosis was confirmed by postsurgical hypoadrenalism. In conclusion, imaging may prove unsatisfactory or even misleading for the etiological diagnosis of ACTH-dependent Cushing's syndrome and should therefore be interpreted only in context with results of hormonal dynamic testing.

  11. Experimental DNA computing

    NARCIS (Netherlands)

    Henkel, Christiaan

    2005-01-01

    Because of their information storing and processing capabilities, nucleic acids are interesting building blocks for molecular scale computers. Potential applications of such DNA computers range from massively parallel computation to computational gene therapy. In this thesis, several implementations

  12. Analog and hybrid computing

    CERN Document Server

    Hyndman, D E

    2013-01-01

    Analog and Hybrid Computing focuses on the operations of analog and hybrid computers. The book first outlines the history of computing devices that influenced the creation of analog and digital computers. The types of problems to be solved on computers, computing systems, and digital computers are discussed. The text looks at the theory and operation of electronic analog computers, including linear and non-linear computing units and use of analog computers as operational amplifiers. The monograph examines the preparation of problems to be deciphered on computers. Flow diagrams, methods of ampl

  13. Study of Quantum Computing

    Directory of Open Access Journals (Sweden)

    Prashant Anil Patil

    2012-04-01

    Full Text Available This paper gives detailed information about quantum computers, the differences between quantum computers and traditional computers, and the basis of quantum computers, which are in some ways similar to yet still different from traditional computers. Many research groups are working towards the highly technological goal of building a quantum computer, which would dramatically improve computational power for particular tasks. Quantum computers are very useful for computation in science and research: large amounts of data and information can be computed, processed, stored, retrieved, transmitted and displayed in less time and with an accuracy that traditional computers cannot provide.

  14. Computing handbook computer science and software engineering

    CERN Document Server

    Gonzalez, Teofilo; Tucker, Allen

    2014-01-01

    Overview of Computer Science: Structure and Organization of Computing (Peter J. Denning); Computational Thinking (Valerie Barr); Algorithms and Complexity: Data Structures (Mark Weiss); Basic Techniques for Design and Analysis of Algorithms (Edward Reingold); Graph and Network Algorithms (Samir Khuller and Balaji Raghavachari); Computational Geometry (Marc van Kreveld); Complexity Theory (Eric Allender, Michael Loui, and Kenneth Regan); Formal Models and Computability (Tao Jiang, Ming Li, and Bala

  15. Computing with functionals—computability theory or computer science?

    OpenAIRE

    Normann, Dag

    2006-01-01

    We review some of the history of the computability theory of functionals of higher types, and we will demonstrate how contributions from logic and theoretical computer science have shaped this still active subject.

  16. Program Facilitates Distributed Computing

    Science.gov (United States)

    Hui, Joseph

    1993-01-01

    KNET computer program facilitates distribution of computing between UNIX-compatible local host computer and remote host computer, which may or may not be UNIX-compatible. Capable of automatic remote log-in. User communicates interactively with remote host computer. Data output from remote host computer directed to local screen, to local file, and/or to local process. Conversely, data input from keyboard, local file, or local process directed to remote host computer. Written in ANSI standard C language.

  17. Applied Parallel Computing Industrial Computation and Optimization

    DEFF Research Database (Denmark)

    Madsen, Kaj; NA NA NA Olesen, Dorte

    Proceedings of the Third International Workshop on Applied Parallel Computing in Industrial Problems and Optimization (PARA96)

  18. Further computer appreciation

    CERN Document Server

    Fry, T F

    2014-01-01

    Further Computer Appreciation is a comprehensive cover of the principles and aspects in computer appreciation. The book starts by describing the development of computers from the first to the third computer generations, to the development of processors and storage systems, up to the present position of computers and future trends. The text tackles the basic elements, concepts and functions of digital computers, computer arithmetic, input media and devices, and computer output. The basic central processor functions, data storage and the organization of data by classification of computer files,

  19. Democratizing Computer Science

    Science.gov (United States)

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  1. Computational thinking and thinking about computing.

    Science.gov (United States)

    Wing, Jeannette M

    2008-10-28

    Computational thinking will influence everyone in every field of endeavour. This vision poses a new educational challenge for our society, especially for our children. In thinking about computing, we need to be attuned to the three drivers of our field: science, technology and society. Accelerating technological advances and monumental societal demands force us to revisit the most basic scientific questions of computing.

  2. Locating Impedance Change in Electrical Impedance Tomography Based on Multilevel BP Neural Network%基于多级BP神经网络的EIT阻抗变化位置的确定

    Institute of Scientific and Technical Information of China (English)

    彭源; 莫玉龙

    2003-01-01

    Electrical impedance tomography (EIT) is a new computer tomography technology, which reconstructs an impedance (resistivity, conductivity) distribution, or change of impedance, by making voltage and current measurements on the object's periphery. Image reconstruction in EIT is an ill-posed, non-linear inverse problem. A method for finding the place of impedance change in EIT is proposed in this paper, in which a multilevel BP neural network (MBPNN) is used to express the non-linear relation between the impedance change inside the object and the voltage change measured on the surface of the object. Thus, the location of the impedance change can be determined from the measured voltage variation on the surface. The impedance change is then reconstructed using a linear approximation method. The MBPNN can determine the location of the impedance change accurately without a long training time. It alleviates some noise effects and can be extended, ensuring a precision and spatial resolution of the reconstructed image that are not possible with the back-projection method.
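
    The record above only summarizes the approach, so the following is a minimal, self-contained sketch of the core idea rather than the authors' implementation: a small feed-forward network trained with backpropagation learns to map a vector of boundary-voltage changes to the coarse grid cell containing the impedance change. The measurement count, grid size, sensitivity matrix and training data are all invented for illustration, and the multilevel (coarse-to-fine) refinement described in the paper is collapsed to a single level.

```python
# Sketch only: synthetic EIT-style localization with a small MLP and backprop.
import numpy as np

rng = np.random.default_rng(0)

N_MEAS = 16    # number of boundary voltage-difference measurements (assumed)
N_CELLS = 9    # 3x3 coarse grid of candidate impedance-change locations (assumed)

# Synthetic "sensitivity" of each measurement to each cell, standing in for a
# linearized EIT forward model.
sensitivity = rng.normal(size=(N_MEAS, N_CELLS))

def make_batch(n):
    """Synthetic training data: voltage-change vectors and one-hot cell labels."""
    cells = rng.integers(0, N_CELLS, size=n)
    x = sensitivity[:, cells].T + 0.05 * rng.normal(size=(n, N_MEAS))
    y = np.eye(N_CELLS)[cells]
    return x, y, cells

# One hidden layer trained with plain gradient descent (backpropagation).
W1 = rng.normal(scale=0.1, size=(N_MEAS, 32)); b1 = np.zeros(32)
W2 = rng.normal(scale=0.1, size=(32, N_CELLS)); b2 = np.zeros(N_CELLS)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    logits = h @ W2 + b2
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    return h, p / p.sum(axis=1, keepdims=True)

lr = 0.5
for step in range(2000):
    x, y, _ = make_batch(64)
    h, p = forward(x)
    grad_logits = (p - y) / len(x)                       # softmax cross-entropy gradient
    grad_h = (grad_logits @ W2.T) * (1.0 - h ** 2)       # backprop through tanh
    W2 -= lr * h.T @ grad_logits; b2 -= lr * grad_logits.sum(axis=0)
    W1 -= lr * x.T @ grad_h;      b1 -= lr * grad_h.sum(axis=0)

x_test, _, cells_test = make_batch(500)
pred = forward(x_test)[1].argmax(axis=1)
print("location accuracy on synthetic data:", (pred == cells_test).mean())
```

    In the paper's multilevel scheme the predicted coarse cell would then seed a finer network, and a separate linear step would reconstruct the magnitude of the impedance change; both refinements are omitted here.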

  3. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  4. Soft computing in computer and information science

    CERN Document Server

    Fray, Imed; Pejaś, Jerzy

    2015-01-01

    This book presents a carefully selected and reviewed collection of papers presented during the 19th Advanced Computer Systems conference ACS-2014. The Advanced Computer Systems conference concentrated from its beginning on methods and algorithms of artificial intelligence. Later years brought new areas of interest concerning technical informatics related to soft computing and some more technological aspects of computer science, such as multimedia and computer graphics, software engineering, web systems, information security and safety, and project management. These topics are represented in the present book under the categories Artificial Intelligence, Design of Information and Multimedia Systems, Information Technology Security and Software Technologies.

  5. Cloud Computing (4)

    Institute of Scientific and Technical Information of China (English)

    Wang Bai; Xu Liutong

    2010-01-01

    8 Case Study: Cloud computing is still a new phenomenon. Although many IT giants are developing their own cloud computing infrastructures, platforms, software, and services, few have really succeeded in becoming cloud computing providers.

  6. PR Educators Stress Computers.

    Science.gov (United States)

    Fleming, Charles A.

    1988-01-01

    Surveys the varied roles computers play in public relations education. Asserts that, because computers are used extensively in the public relations field, students should become acquainted with the varied capabilities of computers and their role in public relations practice. (MM)

  7. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING: APPLICATION OF COMPUTATIONAL BIOPHYSICAL TRANSPORT, COMPUTATIONAL CHEMISTRY, AND COMPUTATIONAL BIOLOGY

    Science.gov (United States)

    Computational toxicology (CompTox) leverages the significant gains in computing power and computational techniques (e.g., numerical approaches, structure-activity relationships, bioinformatics) realized over the last few years, thereby reducing costs and increasing efficiency i...

  8. DNA computing models

    CERN Document Server

    Ignatova, Zoya; Zimmermann, Karl-Heinz

    2008-01-01

    In this excellent text, the reader is given a comprehensive introduction to the field of DNA computing. The book emphasizes computational methods to tackle central problems of DNA computing, such as controlling living cells, building patterns, and generating nanomachines.

  9. Avoiding Computer Viruses.

    Science.gov (United States)

    Rowe, Joyce; And Others

    1989-01-01

    The threat of computer sabotage is a real concern to business teachers and others responsible for academic computer facilities. Teachers can minimize the possibility. Eight suggestions for avoiding computer viruses are given. (JOW)

  10. Computer Viruses: An Overview.

    Science.gov (United States)

    Marmion, Dan

    1990-01-01

    Discusses the early history and current proliferation of computer viruses that occur on Macintosh and DOS personal computers, mentions virus detection programs, and offers suggestions for how libraries can protect themselves and their users from damage by computer viruses. (LRW)

  11. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available Computed Tomography (CT) - Head: Computed tomography (CT) of the head uses special ... What is CT Scanning of the Head? Computed tomography, more commonly known as a CT ...

  12. Distributed Computing: An Overview

    OpenAIRE

    Md. Firoj Ali; Rafiqul Zaman Khan

    2015-01-01

    Decreases in hardware costs and advances in computer networking technologies have led to increased interest in the use of large-scale parallel and distributed computing systems. Distributed computing systems offer the potential for improved performance and resource sharing. In this paper we present an overview of distributed computing, studying the differences between parallel and distributed computing, the terminologies used in distributed computing, task allocation in distribute...

  13. Introduction to computers

    OpenAIRE

    Rajaraman, A

    1995-01-01

    An article on computer applications for knowledge processing, intended to generate awareness among librarians of the possibilities offered by ICT to improve services. Compares computers and the human brain, provides a historical perspective of the development of computer technology, explains the components of the computer and computer languages, identifies the areas where computers can be applied and their benefits. Explains available storage systems and the database management process. Points out ...

  14. A Review on Modern Distributed Computing Paradigms: Cloud Computing, Jungle Computing and Fog Computing

    OpenAIRE

    Hajibaba, Majid; Gorgin, Saeid

    2014-01-01

    Distributed computing attempts to improve performance in large-scale computing problems by resource sharing. Moreover, rising low-cost computing power coupled with advances in communications/networking and the advent of big data now enables new distributed computing paradigms such as Cloud, Jungle and Fog computing. Cloud computing brings a number of advantages to consumers in terms of accessibility and elasticity. It is based on centralization of resources that possess huge processing po...

  15. Cloud Computing (1)

    Institute of Scientific and Technical Information of China (English)

    Wang Bai; Xu Liutong

    2010-01-01

    Editor's Desk: Cloud computing is a topic of intense interest in the Internet field. Major IT giants have launched their own cloud computing products. This four-part lecture series will discuss cloud computing technology in the following aspects: The first part provides a brief description of the origin and characteristics of cloud computing from the user's point of view; the other parts introduce typical applications of cloud computing, technically analyze the specific content within the cloud, its components, architecture and computational paradigm, compare cloud computing to other distributed computing technologies, and discuss its successful cases, commercial models, related technical and economic issues, and development trends.

  16. Cloud Computing (2)

    Institute of Scientific and Technical Information of China (English)

    Wang Bai; Xu Liutong

    2010-01-01

    Editor's Desk: Cloud computing is a topic of intense interest in the Internet field. Major IT giants have launched their own cloud computing products. This four-part lecture series discusses cloud computing technology in the following aspects: The first part provided a brief description of the origin and characteristics of cloud computing from the user's point of view; the other parts introduce typical applications of cloud computing, technically analyze the specific content within the cloud, its components, architecture and computational paradigm, compare cloud computing to other distributed computing technologies, and discuss its successful cases, commercial models, related technical and economic issues, and development trends.

  17. Computer hardware fault administration

    Science.gov (United States)

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-09-14

    Computer hardware fault administration carried out in a parallel computer, where the parallel computer includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying a location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.
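
    The patent-style abstract above describes the mechanism only in outline, so the following is a small illustrative sketch (an assumption about how such routing could look, not the patented implementation): two independent networks join the same compute nodes, and when the primary network's shortest path would cross a link flagged as defective, the message falls back to a route over the secondary network. The topologies, node identifiers and the BFS routing are all invented for illustration.

```python
# Sketch only: route around a defective link using a second, independent network.
from collections import deque

def shortest_path(adj, src, dst, bad_links=frozenset()):
    """BFS shortest path that refuses to traverse any link listed in bad_links."""
    prev, queue, seen = {}, deque([src]), {src}
    while queue:
        node = queue.popleft()
        if node == dst:
            path = [dst]
            while path[-1] != src:
                path.append(prev[path[-1]])
            return path[::-1]
        for nxt in adj.get(node, ()):
            link = frozenset((node, nxt))
            if nxt not in seen and link not in bad_links:
                seen.add(nxt); prev[nxt] = node; queue.append(nxt)
    return None

# Two independent networks over the same four compute nodes (toy topology).
primary   = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}   # a simple chain
secondary = {0: [2, 3], 1: [3], 2: [0], 3: [0, 1]}   # different wiring

defective = {frozenset((1, 2))}                       # fault identified on this link

route = shortest_path(primary, 0, 3, defective)
if route is None:                                     # primary unusable: fall back
    route = shortest_path(secondary, 0, 3)
    print("routing around defective link via secondary network:", route)
else:
    print("primary network route:", route)
```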

  18. Computers and data processing

    CERN Document Server

    Deitel, Harvey M

    1985-01-01

    Computers and Data Processing provides information pertinent to the advances in the computer field. This book covers a variety of topics, including the computer hardware, computer programs or software, and computer applications systems.Organized into five parts encompassing 19 chapters, this book begins with an overview of some of the fundamental computing concepts. This text then explores the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapters consider how computers present their results and explain the storage and retrieval of

  19. Computer jargon explained

    CERN Document Server

    Enticknap, Nicholas

    2014-01-01

    Computer Jargon Explained is a feature in Computer Weekly publications that discusses 68 of the most commonly used technical computing terms. The book explains what the terms mean and why the terms are important to computer professionals. The text also discusses how the terms relate to the trends and developments that are driving the information technology industry. Computer jargon irritates non-computer people and in turn causes problems for computer people. The technology and the industry are changing so rapidly; it is very hard even for professionals to keep updated. Computer people do not

  20. Cloud Computing Quality

    Directory of Open Access Journals (Sweden)

    Anamaria Şiclovan

    2013-02-01

    Full Text Available Cloud computing was, and will remain, a new way of providing Internet services and computing. This computing approach is based on many existing services, such as the Internet, grid computing, and Web services. Cloud computing as a system aims to provide on-demand services that are more acceptable in price and infrastructure. It is exactly the transition from the computer to a service offered to consumers as a product delivered online. This paper is meant to describe the quality of cloud computing services, analyzing the advantages and characteristics it offers. It is a theoretical paper. Keywords: Cloud computing, QoS, quality of cloud computing

  1. Joint Computing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Raised Floor Computer Space for High Performance Computing The ERDC Information Technology Laboratory (ITL) provides a robust system of IT facilities to develop and...

  2. Optimization of protocols for abdominal computed tomography based on reconstruction filters, voltage and tube current; Otimizacao de protocolos de tomografia computadorizada de abdome com base nos filtros de reconstrucao, tensao e corrente do tubo

    Energy Technology Data Exchange (ETDEWEB)

    Silveira, Vinicius da Costa

    2015-07-01

    The use of computed tomography has increased significantly over the past decades. In Brazil, use more than doubled from 2008 to 2014, while abdominal procedures tripled over the same period. The high frequency of this procedure, combined with the increasing collective radiation dose from medical exposures, has driven the development of tools to maximize the benefit of CT images. This work aimed to establish optimized protocols for abdominal CT based on acquisition parameters and reconstruction techniques using different filter kernels. A sample of patients undergoing abdominal CT at a diagnostic center in Rio de Janeiro was assessed, and patient information and acquisition parameters were collected. Phantom CT images were acquired using different voltage values, with the tube current (mAs) adjusted to obtain the same CTDIvol as for patients with normal BMI. Afterwards, the CTDIvol values were reduced by 30%, 50% and 60%. All images were reconstructed with low-contrast filters (A) and standard filters (B). The CTDIvol values for patients with normal BMI were 7% higher than in patients with underweight BMI, and 30%, 50% and 60% lower than in overweight, obese I and obese III patients, respectively. The image quality evaluations showed that variation of the tube current (mA) and the reconstruction filters did not affect the Hounsfield values. When the contrast-to-noise ratio (CNR) was normalized to CTDIvol, the protocols acquired with a 60% reduction of CTDIvol at 140 kV and 80 kV showed a CNR 6% lower than the routine protocol. Modifications of the acquisition parameters did not affect spatial resolution, but post-processing with B filters reduced the spatial frequency by 16%. With the dose reduced by 30%, lesions in the spleen had a CNR more than 10% higher than the routine protocols when acquired at 140 kV and post-processed with filter A. Post-processing with filter A at 80 kV provided CNR values equal to the routine protocol for the liver lesions with a 30
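
    The abstract compares protocols by contrast-to-noise ratio (CNR) "normalized to CTDIvol" without stating the exact formula, so the toy calculation below assumes a common dose-normalized figure of merit, CNR divided by the square root of CTDIvol. The ROI values, noise levels and dose values are made up and are not taken from the study.

```python
# Illustrative sketch: dose-normalized CNR comparison under assumed formulas.
import math

def cnr(mean_lesion_hu, mean_background_hu, noise_sd_hu):
    """Contrast-to-noise ratio from mean ROI values (HU) and image noise (SD, HU)."""
    return abs(mean_lesion_hu - mean_background_hu) / noise_sd_hu

def dose_normalized_cnr(cnr_value, ctdi_vol_mgy):
    """Assumed normalization: CNR per square root of CTDIvol (mGy)."""
    return cnr_value / math.sqrt(ctdi_vol_mgy)

# Hypothetical routine protocol vs. a protocol acquired with 30% less dose.
routine = dose_normalized_cnr(cnr(90, 60, 12), ctdi_vol_mgy=10.0)
reduced = dose_normalized_cnr(cnr(90, 60, 15), ctdi_vol_mgy=7.0)

print(f"routine protocol:   {routine:.2f}")
print(f"30% dose reduction: {reduced:.2f}")
print(f"relative change:    {100 * (reduced / routine - 1):+.0f}%")
```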

  3. Method of GPS Water Vapor Tomography Based on Kalman Filter and Its Application%基于Kalman滤波的GPS水汽层析方法及其应用

    Institute of Scientific and Technical Information of China (English)

    毕研盟; 杨光林; 聂晶

    2011-01-01

    A tomography method based on Kalman filtering was developed. The Kalman filter is an efficient, easily computed filter that can estimate the state of a system from a series of measurements. The method was tested in a small GPS network experiment performed in the Hainan region, and the vertical structure of water vapor above the GPS sites was successfully retrieved. The results show that the tomographic water vapor vertical profiles agree well with profiles from radiosondes. Kalman-filter tomography can retrieve correct and reliable water vapor vertical structure even when the a priori information contains a ±50% bias. A preliminary analysis suggests that the results are stable because the method avoids, to some degree, the ill-posedness of the tomographic equations and is insensitive to the a priori water vapor information; the tomographic results therefore depend more on the raw GPS observation data.
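
    As a rough illustration of the estimation machinery the abstract refers to (and not the authors' code), the sketch below runs a linear Kalman filter that updates a layered water-vapor state from slant-delay-style observations, starting from an a priori profile biased by +50% as discussed above. The layer geometry, observation weights, noise levels and the "true" profile are all invented.

```python
# Sketch only: a scalar-observation Kalman filter over a layered water-vapor state.
import numpy as np

rng = np.random.default_rng(1)

n_layers = 5
true_profile = np.array([12.0, 8.0, 5.0, 2.5, 1.0])   # per-layer water vapor (made up)

x = 1.5 * true_profile                                 # a priori state with +50% bias
P = np.eye(n_layers) * 25.0                            # a priori covariance
Q = np.eye(n_layers) * 0.01                            # process noise (random walk)
R_var = 0.05                                           # observation noise variance

def random_ray():
    """One synthetic slant observation: weight of each layer along the ray path."""
    return rng.uniform(0.5, 2.0, size=n_layers)

for epoch in range(300):
    P = P + Q                                          # prediction step
    H = random_ray().reshape(1, -1)                    # observation geometry (1 x n)
    z = H @ true_profile + rng.normal(scale=np.sqrt(R_var))
    S = H @ P @ H.T + R_var                            # innovation covariance
    K = P @ H.T / S                                    # Kalman gain (n x 1)
    x = x + (K * (z - H @ x)).ravel()                  # update step
    P = (np.eye(n_layers) - K @ H) @ P

print("true profile     :", true_profile)
print("filtered estimate:", np.round(x, 2))
```

    Despite the deliberately biased starting profile, the repeated updates pull the estimate toward values consistent with the observations, which is the behavior the abstract reports for the real GPS data.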

  4. Advances in unconventional computing

    CERN Document Server

    2017-01-01

    The unconventional computing is a niche for interdisciplinary science, cross-bred of computer science, physics, mathematics, chemistry, electronic engineering, biology, material science and nanotechnology. The aims of this book are to uncover and exploit principles and mechanisms of information processing in and functional properties of physical, chemical and living systems to develop efficient algorithms, design optimal architectures and manufacture working prototypes of future and emergent computing devices. This first volume presents theoretical foundations of the future and emergent computing paradigms and architectures. The topics covered are computability, (non-)universality and complexity of computation; physics of computation, analog and quantum computing; reversible and asynchronous devices; cellular automata and other mathematical machines; P-systems and cellular computing; infinity and spatial computation; chemical and reservoir computing. The book is the encyclopedia, the first ever complete autho...

  5. The effect of in situ/in vitro three-dimensional quantitative computed tomography image voxel size on the finite element model of human vertebral cancellous bone.

    Science.gov (United States)

    Lu, Yongtao; Engelke, Klaus; Glueer, Claus-C; Morlock, Michael M; Huber, Gerd

    2014-11-01

    The quantitative computed tomography-based finite element modeling technique is a promising clinical tool for the prediction of bone strength. However, quantitative computed tomography-based finite element models have been created from image datasets with different image voxel sizes. The aim of this study was to investigate whether image voxel size influences the finite element models. In all, 12 thoracolumbar vertebrae were scanned prior to autopsy (in situ) using two different quantitative computed tomography scan protocols, which resulted in image datasets with two different voxel sizes (0.29 × 0.29 × 1.3 mm³ vs 0.18 × 0.18 × 0.6 mm³). Eight of them were scanned after autopsy (in vitro) and the datasets were reconstructed with two voxel sizes (0.32 × 0.32 × 0.6 mm³ vs. 0.18 × 0.18 × 0.3 mm³). Finite element models with cuboid volumes of interest extracted from the vertebral cancellous part were created and inhomogeneous bilinear bone properties were defined. Axial compression was simulated. No effect of voxel size was detected on the apparent bone mineral density for either the in situ or the in vitro case. However, the apparent modulus and yield strength showed significant differences between the two voxel-size groups (in situ and in vitro). In conclusion, the image voxel size may have to be considered when the finite element voxel modeling technique is used in clinical applications.

  6. Computer Viruses. Technology Update.

    Science.gov (United States)

    Ponder, Tim, Comp.; Ropog, Marty, Comp.; Keating, Joseph, Comp.

    This document provides general information on computer viruses, how to help protect a computer network from them, and measures to take if a computer becomes infected. Highlights include the origins of computer viruses; virus contraction; a description of some common virus types (File Virus, Boot Sector/Partition Table Viruses, Trojan Horses, and…

  7. Great Principles of Computing

    OpenAIRE

    Denning, Peter J.

    2008-01-01

    The Great Principles of Computing is a framework for understanding computing as a field of science. The website ... April 2008 (Rev. 8/31/08).

  8. The Computer Manpower Evolution

    Science.gov (United States)

    Rooney, Joseph J.

    1975-01-01

    Advances and employment outlook in the field of computer science are discussed as well as the problems related to improving the quality of computer education. Specific computer jobs discussed include: data processing machine repairers, systems analysts, programmers, computer and peripheral equipment operators, and keypunch operators. (EA)

  9. Elementary School Computer Literacy.

    Science.gov (United States)

    New York City Board of Education, Brooklyn, NY.

    This curriculum guide presents lessons for computer literacy instruction in the elementary grades. The first section of the guide includes 22 lessons on hardware, covering such topics as how computers work, keyboarding, word processing, and computer peripherals. The 13 lessons in the second section cover social topics related to the computer,…

  10. My Computer Romance

    Science.gov (United States)

    Campbell, Gardner

    2007-01-01

    In this article, the author relates the big role computers have played in his life as a writer. The author narrates that he has been using a computer for nearly twenty years now. He relates that computers have set his writing free. When he started writing, he was just using an electric typewriter. He also relates that his romance with computers is also a…

  11. Computability and unsolvability

    CERN Document Server

    Davis, Martin

    1985-01-01

    ""A clearly written, well-presented survey of an intriguing subject."" - Scientific American. Classic text considers general theory of computability, computable functions, operations on computable functions, Turing machines self-applied, unsolvable decision problems, applications of general theory, mathematical logic, Kleene hierarchy, computable functionals, classification of unsolvable decision problems and more.

  12. Students’ Choice for Computers

    Institute of Scientific and Technical Information of China (English)

    Cai; Wei

    2015-01-01

    Nowadays, computers are widely used as useful tools for our daily life, so you can see students using computers everywhere. The purpose of our survey is to find out the answers to the following questions: 1. What brand of computers do students often choose? 2. What is the most important factor in choosing a computer, in students' view? 3. What do students want to do with computers most? After that, we hope the students will know what kind of computers they really need and how many factors must be thought about when buying computers.

  13. Study on Parallel Computing

    Institute of Scientific and Technical Information of China (English)

    Guo-Liang Chen; Guang-Zhong Sun; Yun-Quan Zhang; Ze-Yao Mo

    2006-01-01

    In this paper, we present a general survey on parallel computing. The main contents include the parallel computer system, which is the hardware platform of parallel computing; the parallel algorithm, which is its theoretical base; and parallel programming, which is its software support. After that, we also introduce some parallel applications and enabling technologies. We argue that parallel computing research should form an integrated methodology of "architecture - algorithm - programming - application". Only in this way can parallel computing research achieve continuous development and become more realistic.

  14. Roadmap to greener computing

    CERN Document Server

    Nguemaleu, Raoul-Abelin Choumin

    2014-01-01

    A concise and accessible introduction to green computing and green IT, this book addresses how computer science and the computer infrastructure affect the environment and presents the main challenges in making computing more environmentally friendly. The authors review the methodologies, designs, frameworks, and software development tools that can be used in computer science to reduce energy consumption and still compute efficiently. They also focus on Computer Aided Design (CAD) and describe what design engineers and CAD software applications can do to support new streamlined business directi

  15. Computer mathematics for programmers

    CERN Document Server

    Abney, Darrell H; Sibrel, Donald W

    1985-01-01

    Computer Mathematics for Programmers presents the Mathematics that is essential to the computer programmer.The book is comprised of 10 chapters. The first chapter introduces several computer number systems. Chapter 2 shows how to perform arithmetic operations using the number systems introduced in Chapter 1. The third chapter covers the way numbers are stored in computers, how the computer performs arithmetic on real numbers and integers, and how round-off errors are generated in computer programs. Chapter 4 details the use of algorithms and flowcharting as problem-solving tools for computer p
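
    As a concrete illustration of the round-off behaviour mentioned in the blurb (not an excerpt from the book), the short example below shows that binary floating point cannot represent 0.1 exactly and that repeated addition accumulates the representation error; exact decimal arithmetic is one common remedy when decimal fractions matter.

```python
# Round-off demonstration: 0.1 has no exact binary floating-point representation.
total = 0.0
for _ in range(10):
    total += 0.1

print(total == 1.0)          # False: the accumulated sum is not exactly 1.0
print(repr(total))           # 0.9999999999999999
print(repr(0.1 + 0.2))       # 0.30000000000000004

# Exact decimal arithmetic avoids this particular class of round-off error.
from decimal import Decimal
print(sum(Decimal("0.1") for _ in range(10)) == Decimal("1.0"))   # True
```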

  16. Parallel computing works

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-23

    An account of the Caltech Concurrent Computation Program (C³P), a five year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C³P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C³P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  17. Computation in Classical Mechanics

    CERN Document Server

    Timberlake, Todd

    2007-01-01

    There is a growing consensus that physics majors need to learn computational skills, but many departments are still devoid of computation in their physics curriculum. Some departments may lack the resources or commitment to create a dedicated course or program in computational physics. One way around this difficulty is to include computation in a standard upper-level physics course. An intermediate classical mechanics course is particularly well suited for including computation. We discuss the ways we have used computation in our classical mechanics courses, focusing on how computational work can improve students' understanding of physics as well as their computational skills. We present examples of computational problems that serve these two purposes. In addition, we provide information about resources for instructors who would like to include computation in their courses.

  18. Research on Comparison of Cloud Computing and Grid Computing

    OpenAIRE

    Liu Yuxi; Wang Jianhua

    2012-01-01

    The development of the computer industry is promoted by the progress of distributed computing, parallel computing and grid computing, from which the cloud computing movement arises. This study describes the types of cloud computing services and the similarities and differences between cloud computing and grid computing, discusses the aspects in which cloud computing improves on grid computing, notes common problems faced by both, and addresses some security issues.

  19. Cloud Computing (3)

    Institute of Scientific and Technical Information of China (English)

    Wang Bai; Xu Liutong

    2010-01-01

    Editor's Desk: In the preceding two parts of this series, several aspects of cloud computing-including definition, classification, characteristics, typical applications, and service levels-were discussed. This part continues with a discussion of Cloud Computing Open Architecture and the Market-Oriented Cloud. A comparison is made between cloud computing and other distributed computing technologies, and Google's cloud platform is analyzed to determine how distributed computing is implemented in its particular model.

  20. Distributed computing in bioinformatics.

    Science.gov (United States)

    Jain, Eric

    2002-01-01

    This paper provides an overview of methods and current applications of distributed computing in bioinformatics. Distributed computing is a strategy of dividing a large workload among multiple computers to reduce processing time, or to make use of resources such as programs and databases that are not available on all computers. Participating computers may be connected either through a local high-speed network or through the Internet.
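
    As a minimal sketch of the workload-splitting idea described above (not code from the paper), the example below divides an embarrassingly parallel bioinformatics-style task, computing the GC content of many DNA sequences, across several worker processes. The local workers stand in for separate computers, and the task and data are invented for illustration; a real deployment would ship the chunks over a network.

```python
# Sketch only: split independent work units across worker processes.
from multiprocessing import Pool
import random

def gc_content(sequence):
    """Fraction of G/C bases in one DNA sequence (a small, independent work unit)."""
    return (sequence.count("G") + sequence.count("C")) / len(sequence)

def main():
    random.seed(0)
    # A large workload: many sequences that can be processed independently.
    sequences = ["".join(random.choice("ACGT") for _ in range(1000))
                 for _ in range(200)]

    with Pool(processes=4) as pool:          # four workers share the load
        results = pool.map(gc_content, sequences)

    print(f"processed {len(results)} sequences, mean GC = {sum(results)/len(results):.3f}")

if __name__ == "__main__":
    main()
```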

  1. The digital computer

    CERN Document Server

    Parton, K C

    2014-01-01

    The Digital Computer focuses on the principles, methodologies, and applications of the digital computer. The publication takes a look at the basic concepts involved in using a digital computer, simple autocode examples, and examples of working advanced design programs. Discussions focus on transformer design synthesis program, machine design analysis program, solution of standard quadratic equations, harmonic analysis, elementary wage calculation, and scientific calculations. The manuscript then examines commercial and automatic programming, how computers work, and the components of a computer

  2. Toward Cloud Computing Evolution

    OpenAIRE

    Susanto, Heru; Almunawar, Mohammad Nabil; Kang, Chen Chin

    2012-01-01

    Information Technology (IT) shaped the success of organizations, giving them a solid foundation that increases both their level of efficiency as well as productivity. The computing industry is witnessing a paradigm shift in the way computing is performed worldwide. There is a growing awareness among consumers and enterprises to access their IT resources extensively through a "utility" model known as "cloud computing." Cloud computing was initially rooted in distributed grid-based computing. ...

  3. Ion Trap Quantum Computing

    Science.gov (United States)

    2011-12-01

    In an inspiring speech at the MIT Physics of Computation 1st Conference in 1981, Feynman proposed the development of a computer that would obey the ... on ion trap based quantum computing for physics and computer science students would include lecture notes, slides, lesson plans, a syllabus, reading lists, videos, demonstrations, and laboratories.

  4. Algorithmically specialized parallel computers

    CERN Document Server

    Snyder, Lawrence; Gannon, Dennis B

    1985-01-01

    Algorithmically Specialized Parallel Computers focuses on the concept and characteristics of an algorithmically specialized computer. This book discusses the algorithmically specialized computers, algorithmic specialization using VLSI, and innovative architectures. The architectures and algorithms for digital signal, speech, and image processing and specialized architectures for numerical computations are also elaborated. Other topics include the model for analyzing generalized inter-processor, pipelined architecture for search tree maintenance, and specialized computer organization for raster

  5. Heterogeneous Distributed Computing for Computational Aerosciences

    Science.gov (United States)

    Sunderam, Vaidy S.

    1998-01-01

    The research supported under this award focuses on heterogeneous distributed computing for high-performance applications, with particular emphasis on computational aerosciences. The overall goal of this project was to investigate issues in, and develop solutions for, the efficient execution of computational aeroscience codes in heterogeneous concurrent computing environments. In particular, we worked in the context of the PVM[1] system and, subsequent to detailed conversion efforts and performance benchmarking, devised novel techniques to increase the efficacy of heterogeneous networked environments for computational aerosciences. Our work has been based upon the NAS Parallel Benchmark suite, but has also recently expanded in scope to include the NAS I/O benchmarks as specified in the NHT-1 document. In this report we summarize our research accomplishments under the auspices of the grant.

  6. Understanding Student Computational Thinking with Computational Modeling

    CERN Document Server

    Aiken, John M; Douglas, Scott S; Burk, John B; Scanlon, Erin M; Thoms, Brian D; Schatz, Michael F

    2012-01-01

    Recently, the National Research Council's framework for next generation science standards highlighted "computational thinking" as one of its "fundamental practices". Students taking a physics course that employed the Arizona State University's Modeling Instruction curriculum were taught to construct computational models of physical systems. Student computational thinking was assessed using a proctored programming assignment, written essay, and a series of think-aloud interviews, where the students produced and discussed a computational model of a baseball in motion via a high-level programming environment (VPython). Roughly a third of the students in the study were successful in completing the programming assignment. Student success on this assessment was tied to how students synthesized their knowledge of physics and computation. On the essay and interview assessments, students displayed unique views of the relationship between force and motion; those who spoke of this relationship in causal (rather than obs...

  7. Cloud Computing for radiologists.

    Science.gov (United States)

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  8. DNA based computers II

    CERN Document Server

    Landweber, Laura F; Baum, Eric B

    1998-01-01

    The fledgling field of DNA computers began in 1994 when Leonard Adleman surprised the scientific community by using DNA molecules, protein enzymes, and chemicals to solve an instance of a hard computational problem. This volume presents results from the second annual meeting on DNA computers held at Princeton only one and one-half years after Adleman's discovery. By drawing on the analogy between DNA computing and cutting-edge fields of biology (such as directed evolution), this volume highlights some of the exciting progress in the field and builds a strong foundation for the theory of molecular computation. DNA computing is a radically different approach to computing that brings together computer science and molecular biology in a way that is wholly distinct from other disciplines. This book outlines important advances in the field and offers comprehensive discussion on potential pitfalls and the general practicality of building DNA based computers.

  9. Duality quantum computing

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    In this article, we review the development of a newly proposed quantum computer, the duality computer (or duality quantum computer), and the duality mode of quantum computers. The duality computer is based on the particle-wave duality principle of quantum mechanics. Compared to an ordinary quantum computer, the duality quantum computer is a quantum computer on the move, passing through a multi-slit. It offers more computing operations than is possible with an ordinary quantum computer. The two most distinctive operations are the quantum division operation and the quantum combiner operation. The division operation divides the wave function of a quantum computer into many attenuated, identical parts. The combiner operation combines the wave functions in different parts into a single part. The duality mode is a way in which a quantum computer with some extra qubit resource simulates a duality computer. The main structure of the duality quantum computer and the duality mode, their mathematical description and algorithm designs are reviewed.

  10. Computers for imagemaking

    CERN Document Server

    Clark, D

    1981-01-01

    Computers for Image-Making tells the computer non-expert all he needs to know about Computer Animation. In the hands of expert computer engineers, computer picture-drawing systems have, since the earliest days of computing, produced interesting and useful images. As a result of major technological developments since then, it no longer requires the expert's skill to draw pictures; anyone can do it, provided they know how to use the appropriate machinery. This collection of specially commissioned articles reflects the diversity of user applications in this expanding field

  11. Language and Computers

    CERN Document Server

    Dickinson, Markus; Meurers, Detmar

    2012-01-01

    Language and Computers introduces students to the fundamentals of how computers are used to represent, process, and organize textual and spoken information. Concepts are grounded in real-world examples familiar to students’ experiences of using language and computers in everyday life. A real-world introduction to the fundamentals of how computers process language, written specifically for the undergraduate audience, introducing key concepts from computational linguistics. Offers a comprehensive explanation of the problems computers face in handling natural language Covers a broad spectru

  12. Computer techniques for electromagnetics

    CERN Document Server

    Mittra, R

    1973-01-01

    Computer Techniques for Electromagnetics discusses the ways in which computer techniques solve practical problems in electromagnetics. It discusses the impact of the emergence of high-speed computers in the study of electromagnetics. This text provides a brief background on the approaches used by mathematical analysts in solving integral equations. It also demonstrates how to use computer techniques in computing current distribution, radar scattering, and waveguide discontinuities, and inverse scattering. This book will be useful for students looking for a comprehensive text on computer techni

  13. Polymorphous computing fabric

    Science.gov (United States)

    Wolinski, Christophe Czeslaw; Gokhale, Maya B.; McCabe, Kevin Peter

    2011-01-18

    Fabric-based computing systems and methods are disclosed. A fabric-based computing system can include a polymorphous computing fabric that can be customized on a per application basis and a host processor in communication with said polymorphous computing fabric. The polymorphous computing fabric includes a cellular architecture that can be highly parameterized to enable a customized synthesis of fabric instances for a variety of enhanced application performances thereof. A global memory concept can also be included that provides the host processor random access to all variables and instructions associated with the polymorphous computing fabric.

  14. Explorations in quantum computing

    CERN Document Server

    Williams, Colin P

    2011-01-01

    By the year 2020, the basic memory components of a computer will be the size of individual atoms. At such scales, the current theory of computation will become invalid. ""Quantum computing"" is reinventing the foundations of computer science and information theory in a way that is consistent with quantum physics - the most accurate model of reality currently known. Remarkably, this theory predicts that quantum computers can perform certain tasks breathtakingly faster than classical computers -- and, better yet, can accomplish mind-boggling feats such as teleporting information, breaking suppos

  15. Systematic review of perfusion imaging with computed tomography and magnetic resonance in acute ischemic stroke: heterogeneity of acquisition and postprocessing parameters: a translational medicine research collaboration multicentre acute stroke imaging study.

    Science.gov (United States)

    Dani, Krishna A; Thomas, Ralph G R; Chappell, Francesca M; Shuler, Kirsten; Muir, Keith W; Wardlaw, Joanna M

    2012-02-01

    Heterogeneity of acquisition and postprocessing parameters for magnetic resonance- and computed tomography-based perfusion imaging in acute stroke may limit comparisons between studies, but the current degree of heterogeneity in the literature has not been precisely defined. We examined articles published before August 30, 2009 that reported perfusion thresholds, average lesion perfusion values, or correlations of perfusion deficit volumes in acute stroke patients. Computed tomography perfusion and 49 perfusion-weighted imaging studies were included from 7152 articles. Although certain parameters were reported frequently, consistently, and in line with the Roadmap proposals, we found substantial heterogeneity in other parameters, and there was considerable variation and underreporting of postprocessing methodology. There is substantial scope to increase homogeneity in future studies, e.g., through reporting standards.

  16. Reversible computing fundamentals, quantum computing, and applications

    CERN Document Server

    De Vos, Alexis

    2010-01-01

    Written by one of the few top internationally recognized experts in the field, this book concentrates on those topics that will remain fundamental, such as low power computing, reversible programming languages, and applications in thermodynamics. It describes reversible computing from various points of view: Boolean algebra, group theory, logic circuits, low-power electronics, communication, software, quantum computing. It is this multidisciplinary approach that makes it unique.Backed by numerous examples, this is useful for all levels of the scientific and academic community, from undergr

  17. Computing networks from cluster to cloud computing

    CERN Document Server

    Vicat-Blanc, Pascale; Guillier, Romaric; Soudan, Sebastien

    2013-01-01

    "Computing Networks" explores the core of the new distributed computing infrastructures we are using today:  the networking systems of clusters, grids and clouds. It helps network designers and distributed-application developers and users to better understand the technologies, specificities, constraints and benefits of these different infrastructures' communication systems. Cloud Computing will give the possibility for millions of users to process data anytime, anywhere, while being eco-friendly. In order to deliver this emerging traffic in a timely, cost-efficient, energy-efficient, and

  18. Computer Intrusions and Attacks.

    Science.gov (United States)

    Falk, Howard

    1999-01-01

    Examines some frequently encountered unsolicited computer intrusions, including computer viruses, worms, Java applications, trojan horses or vandals, e-mail spamming, hoaxes, and cookies. Also discusses virus-protection software, both for networks and for individual users. (LRW)

  19. The Global Computer

    DEFF Research Database (Denmark)

    Sharp, Robin

    2002-01-01

    This paper describes a Danish project, involving partners from Copenhagen University, DTU, the University of Southern Denmark, Aalborg University, Copenhagen Business School and UNI-C, for exploiting Grid technology to provide computer resources for applications with very large computational...

  20. Optical Quantum Computing

    National Research Council Canada - National Science Library

    Jeremy L. O'Brien

    2007-01-01

    In 2001, all-optical quantum computing became feasible with the discovery that scalable quantum computing is possible using only single-photon sources, linear optical elements, and single-photon detectors...

  1. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... the Sinuses? What is CT (Computed Tomography) of the Sinuses? Computed tomography, more commonly known as a ... What are some common uses of the procedure? CT of the sinuses is primarily used ...

  2. Applications of computer algebra

    CERN Document Server

    1985-01-01

    Today, certain computer software systems exist which surpass the computational ability of researchers when their mathematical techniques are applied to many areas of science and engineering. These computer systems can perform a large portion of the calculations seen in mathematical analysis. Despite this massive power, thousands of people use these systems as a routine resource for everyday calculations. These software programs are commonly called "Computer Algebra" systems. They have names such as MACSYMA, MAPLE, muMATH, REDUCE and SMP. They are receiving credit as a computational aid with increasing regularity in articles in the scientific and engineering literature. When most people think about computers and scientific research these days, they imagine a machine grinding away, processing numbers arithmetically. It is not generally realized that, for a number of years, computers have been performing non-numeric computations. This means, for example, that one inputs an equation and obtains a closed for...

  3. Cognitive Computing for Security.

    Energy Technology Data Exchange (ETDEWEB)

    Debenedictis, Erik [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rothganger, Fredrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Aimone, James Bradley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Marinella, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Evans, Brian Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Warrender, Christina E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mickel, Patrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-01

    Final report for Cognitive Computing for Security LDRD 165613. It reports on the development of a hybrid general-purpose/neuromorphic computer architecture, with an emphasis on potential implementation with memristors.

  4. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... the limitations of CT of the Sinuses? What is CT (Computed Tomography) of the Sinuses? Computed tomography, ... the body being studied. How is the procedure performed? The technologist begins by positioning ...

  5. Applying Computational Intelligence

    CERN Document Server

    Kordon, Arthur

    2010-01-01

    Offers guidelines on creating value from the application of computational intelligence methods. This work introduces a methodology for effective real-world application of computational intelligence while minimizing development cost, and outlines the critical, underestimated technology marketing efforts required

  6. ICASE Computer Science Program

    Science.gov (United States)

    1985-01-01

    The Institute for Computer Applications in Science and Engineering computer science program is discussed in outline form. Information is given on such topics as problem decomposition, algorithm development, programming languages, and parallel architectures.

  7. Computational Science Facility (CSF)

    Data.gov (United States)

    Federal Laboratory Consortium — PNNL Institutional Computing (PIC) is focused on meeting DOE's mission needs and is part of PNNL's overarching research computing strategy. PIC supports large-scale...

  8. Computational Continuum Mechanics

    CERN Document Server

    Shabana, Ahmed A

    2011-01-01

    This text presents the theory of continuum mechanics using computational methods. Ideal for students and researchers, the second edition features a new chapter on computational geometry and finite element analysis.

  9. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... images. These images can be viewed on a computer monitor, printed on film or transferred to a ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ...

  10. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... images. These images can be viewed on a computer monitor, printed on film or transferred to a ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ...

  11. The Global Computer

    DEFF Research Database (Denmark)

    Sharp, Robin

    2002-01-01

    This paper describes a Danish project, involving partners from Copenhagen University, DTU, the University of Southern Denmark, Aalborg University, Copenhagen Business School and UNI-C, for exploiting Grid technology to provide computer resources for applications with very large computational...

  12. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... images. These images can be viewed on a computer monitor, printed on film or transferred to a ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ...

  13. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... images. These images can be viewed on a computer monitor, printed on film or transferred to a ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ...

  14. Computational Science Facility (CSF)

    Data.gov (United States)

    Federal Laboratory Consortium — PNNL Institutional Computing (PIC) is focused on meeting DOE's mission needs and is part of PNNL's overarching research computing strategy. PIC supports large-scale...

  15. Cloud Computing Quality

    Directory of Open Access Journals (Sweden)

    Anamaria Şiclovan

    2013-02-01

    Full Text Available

    Cloud computing was, and will continue to be, a new way of providing Internet services and computing resources. This computing approach builds on many existing services, such as the Internet, grid computing, and Web services. As a system, cloud computing aims to provide on-demand services that are more acceptable in terms of price and infrastructure. It is, in essence, the transition from the computer as a product to a service offered to consumers online. This paper describes the quality of cloud computing services, analyzing the advantages and characteristics it offers. It is a theoretical paper.

    Keywords: Cloud computing, QoS, quality of cloud computing

  16. Intelligent Computer Graphics 2012

    CERN Document Server

    Miaoulis, Georgios

    2013-01-01

    In Computer Graphics, the use of intelligent techniques started more recently than in other research areas. However, during the last two decades, the use of intelligent Computer Graphics techniques has grown year after year, and more and more interesting techniques are presented in this area. The purpose of this volume is to present current work of the Intelligent Computer Graphics community, a community that grows year after year. This volume is a kind of continuation of the previously published Springer volumes “Artificial Intelligence Techniques for Computer Graphics” (2008), “Intelligent Computer Graphics 2009” (2009), “Intelligent Computer Graphics 2010” (2010) and “Intelligent Computer Graphics 2011” (2011). Usually, this kind of volume contains, every year, selected extended papers from the corresponding 3IA Conference of the year. However, the current volume is made from directly reviewed and selected papers, submitted for publication in the volume “Intelligent Computer Gr...

  17. Joint Computing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Raised Floor Computer Space for High Performance ComputingThe ERDC Information Technology Laboratory (ITL) provides a robust system of IT facilities to develop and...

  18. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... the Sinuses? What is CT (Computed Tomography) of the Sinuses? Computed tomography, more commonly known as a ... of page What are some common uses of the procedure? CT of the sinuses is primarily used ...

  19. Computer Vision Syndrome.

    Science.gov (United States)

    Randolph, Susan A

    2017-07-01

    With the increased use of electronic devices with visual displays, computer vision syndrome is becoming a major public health issue. Improving the visual status of workers using computers results in greater productivity in the workplace and improved visual comfort.

  20. Book Review: Computational Topology

    DEFF Research Database (Denmark)

    Raussen, Martin

    2011-01-01

    Computational Topology by Herbert Edelsbrunner and John L. Harer. American Mathematical Society, 2010 - ISBN 978-0-8218-4925-5

  1. Comparative micro computed tomography study of a vertebral body

    Science.gov (United States)

    Drews, Susanne; Beckmann, Felix; Herzen, Julia; Brunke, Oliver; Salmon, Phil; Friess, Sebastian; Laib, Andres; Koller, Bruno; Hemberger, Thomas; Müller-Gerbl, Magdalena; Müller, Bert

    2008-08-01

    Investigations of bony tissues are often performed using micro computed tomography based on X-rays, since the calcium distribution leads to superior contrast. Osteoporotic bone, for example, can be readily compared with healthy bone with respect to density and morphology. Degenerative and rheumatoid diseases, however, usually start at the bone-cartilage interface, which is hardly accessible. The direct influence on the bone itself becomes visible only at a later stage. For the development of suitable therapies against degenerative cartilage damage, an exact three-dimensional description of the bone-cartilage interface is vital, as demonstrated for transplanted cartilage cells or bone-cartilage constructs in animal models. So far, morphological characterization has been restricted to magnetic resonance imaging (MRI) with poor spatial resolution, or to time-consuming histological sectioning with appropriate spatial resolution in only two rather arbitrarily chosen directions. Therefore, μCT should be developed to extract the features of low-absorbing cartilage. The morphology and the volume of the intervertebral cartilage disc of lumbar motion segments have been determined for one PMMA-embedded specimen. Tomograms were recorded using nanotom® (Phoenix|x-ray, Wunstorf, Germany), μCT 35™ (Scanco Medical, Brütisellen, Switzerland), 1172™ and 1174™ (both Skyscan, Kontich, Belgium), as well as using the SRμCT at HASYLAB/DESY. Both conventional and SRμCT can provide the morphology and the volume of cartilage between bones. As the acquisition time increases, the signal-to-noise ratio improves, but the prominent artifacts in conventional μCT resulting from inhomogeneously distributed bony tissue prevent the exact segmentation of cartilage. SRμCT allows the cartilage to be segmented but requires long periods of expensive beam time to obtain reasonable contrast.

  2. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT,J.

    2004-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security.

  3. Computer system identification

    OpenAIRE

    Lesjak, Borut

    2008-01-01

    The concept of computer system identity in computer science bears just as much importance as does the identity of an individual in a human society. Nevertheless, the identity of a computer system is incomparably harder to determine, because there is no standard system of identification we could use and, moreover, a computer system during its life-time is quite indefinite, since all of its regular and necessary hardware and software upgrades soon make it almost unrecognizable: after a number o...

  4. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  5. Introduction to Quantum Computation

    Science.gov (United States)

    Ekert, A.

    A computation is a physical process. It may be performed by a piece of electronics or on an abacus, or in your brain, but it is a process that takes place in nature and as such it is subject to the laws of physics. Quantum computers are machines that rely on characteristically quantum phenomena, such as quantum interference and quantum entanglement in order to perform computation. In this series of lectures I want to elaborate on the computational power of such machines.

  6. Computational intelligence in optimization

    CERN Document Server

    Tenne, Yoel

    2010-01-01

    This volume presents a collection of recent studies covering the spectrum of computational intelligence applications with emphasis on their application to challenging real-world problems. Topics covered include: Intelligent agent-based algorithms, Hybrid intelligent systems, Cognitive and evolutionary robotics, Knowledge-Based Engineering, fuzzy sets and systems, Bioinformatics and Bioengineering, Computational finance and Computational economics, Data mining, Machine learning, and Expert systems. ""Computational Intelligence in Optimization"" is a comprehensive reference for researchers, prac

  7. Computational physics an introduction

    CERN Document Server

    Vesely, Franz J

    1994-01-01

    Author Franz J. Vesely offers students an introductory text on computational physics, providing them with the important basic numerical/computational techniques. His unique text sets itself apart from others by focusing on specific problems of computational physics. The author also provides a selection of modern fields of research. Students will benefit from the appendixes which offer a short description of some properties of computing and machines and outline the technique of 'Fast Fourier Transformation.'

  8. Applications of membrane computing

    CERN Document Server

    Ciobanu, Gabriel; Păun, Gheorghe

    2006-01-01

    Membrane computing is a branch of natural computing which investigates computing models abstracted from the structure and functioning of living cells and from their interactions in tissues or higher-order biological structures. The models considered, called membrane systems (P systems), are parallel, distributed computing models, processing multisets of symbols in cell-like compartmental architectures. In many applications membrane systems have considerable advantages - among these are their inherently discrete nature, parallelism, transparency, scalability and nondeterminism. In dedicated cha

  9. Biomolecular computation for bionanotechnology

    CERN Document Server

    Liu, Jian-Qin

    2006-01-01

    Computers built with moleware? The drive toward non-silicon computing is underway, and this first-of-its-kind guide to molecular computation gives researchers a firm grasp of the technologies, biochemical details, and theoretical models at the cutting edge. It explores advances in molecular biology and nanotechnology and illuminates how the convergence of various technologies is propelling computational capacity beyond the limitations of traditional hardware technology and into the realm of moleware.

  10. Computably regular topological spaces

    OpenAIRE

    Weihrauch, Klaus

    2013-01-01

    This article continues the study of computable elementary topology started by the author and T. Grubba in 2009 and extends the author's 2010 study of axioms of computable separation. Several computable T3- and Tychonoff separation axioms are introduced and their logical relation is investigated. A number of implications between these axioms are proved and several implications are excluded by counter examples, however, many questions have not yet been answered. Known results on computable metr...

  11. Mobile collaborative cloudless computing

    OpenAIRE

    Cruz, Nuno Miguel Machado, 1978-

    2015-01-01

    Doctoral thesis, Informatics (Informatics Engineering), Universidade de Lisboa, Faculdade de Ciências, 2015. Although the computational power of mobile devices has been increasing, it is still not enough for some classes of applications. At present, these applications delegate the computing burden to servers located on the Internet. This model assumes always-on Internet connectivity and implies a non-negligible latency. Cloud computing is an innovative computing paradigm wh...

  12. Approximation and Computation

    CERN Document Server

    Gautschi, Walter; Rassias, Themistocles M

    2011-01-01

    Approximation theory and numerical analysis are central to the creation of accurate computer simulations and mathematical models. Research in these areas can influence the computational techniques used in a variety of mathematical and computational sciences. This collection of contributed chapters, dedicated to renowned mathematician Gradimir V. Milovanović, represents the recent work of experts in the fields of approximation theory and numerical analysis. These invited contributions describe new trends in these important areas of research, including theoretic developments, new computational alg

  13. Space Spurred Computer Graphics

    Science.gov (United States)

    1983-01-01

    Dicomed Corporation was asked by NASA in the early 1970s to develop processing capabilities for recording images sent from Mars by Viking spacecraft. The company produced a film recorder which increased the intensity levels and the capability for color recording. This development led to a strong technology base resulting in sophisticated computer graphics equipment. Dicomed systems are used to record CAD (computer aided design) and CAM (computer aided manufacturing) equipment, to update maps and produce computer generated animation.

  14. People Shaping Educational Computing.

    Science.gov (United States)

    Blair, Marjorie; Lobello, Sharon

    1984-01-01

    Discusses contributions to educational computing of Seymour Papert, LOGO creator; Irwin Hoffman, first school-based computer education program developer; Dorothy Deringer, National Science Foundation's monitor and supporter of educational computing projects; Sherwin Steffin, educational software company vice-president; and Jessie Muse, National…

  15. Advances in physiological computing

    CERN Document Server

    Fairclough, Stephen H

    2014-01-01

    This edited collection will provide an overview of the field of physiological computing, i.e. the use of physiological signals as input for computer control. It will cover a breadth of current research, from brain-computer interfaces to telemedicine.

  16. Women and Computer Science.

    Science.gov (United States)

    Breene, L. Anne

    1992-01-01

    Discusses issues concerning women in computer science education, and in the workplace, and sex bias in the computer science curriculum. Concludes that computing environment has not improved for women over last 20 years. Warns that, although number of white males entering college is declining, need for scientists and engineers is not. (NB)

  17. Computers at the Crossroads.

    Science.gov (United States)

    Ediger, Marlow

    1988-01-01

    Discusses reasons for the lack of computer and software use in the classroom, especially on the elementary level. Highlights include deficiencies in available software, including lack of interaction and type of feedback; philosophies of computer use; the psychology of learning and computer use; and suggestions for developing quality software. (4…

  18. Computer Training at Harwell

    Science.gov (United States)

    Hull, John

    1969-01-01

    By using teletypewriters connected to the Harwell multi-access computing system, lecturers can easily demonstrate the operation of the computer in the classroom; this saves time and eliminates errors and staff can carry out exercises using the main computer. (EB)

  19. Designing with computational intelligence

    CERN Document Server

    Lopes, Heitor; Mourelle, Luiza

    2017-01-01

    This book discusses a number of real-world applications of computational intelligence approaches. Using various examples, it demonstrates that computational intelligence has become a consolidated methodology for automatically creating new competitive solutions to complex real-world problems. It also presents a concise and efficient synthesis of different systems using computationally intelligent techniques.

  20. Computer applications in bioprocessing.

    Science.gov (United States)

    Bungay, H R

    2000-01-01

    Biotechnologists have stayed at the forefront for practical applications for computing. As hardware and software for computing have evolved, the latest advances have found eager users in the area of bioprocessing. Accomplishments and their significance can be appreciated by tracing the history and the interplay between the computing tools and the problems that have been solved in bioprocessing.

  1. Computing environment logbook

    Science.gov (United States)

    Osbourn, Gordon C; Bouchard, Ann M

    2012-09-18

    A computing environment logbook logs events occurring within a computing environment. The events are displayed as a history of past events within the logbook of the computing environment. The logbook provides search functionality to search through the history of past events to find one or more selected past events, and further, enables an undo of the one or more selected past events.

  2. Education for Computers

    Science.gov (United States)

    Heslep, Robert D.

    2012-01-01

    The computer engineers who refer to the education of computers do not have a definite idea of education and do not bother to justify the fuzzy ones to which they allude. Hence, they logically cannot specify the features a computer must have in order to be educable. This paper puts forth a non-standard, but not arbitrary, concept of education that…

  3. Computer-assisted instruction

    NARCIS (Netherlands)

    Voogt, J.; Fisser, P.; Wright, J.D.

    2015-01-01

    Since the early days of computer technology in education in the 1960s, it was claimed that computers can assist instructional practice and hence improve student learning. Since then computer technology has developed, and its potential for education has increased. In this article, we first discuss th

  4. Computational Thinking Patterns

    Science.gov (United States)

    Ioannidou, Andri; Bennett, Vicki; Repenning, Alexander; Koh, Kyu Han; Basawapatna, Ashok

    2011-01-01

    The iDREAMS project aims to reinvent Computer Science education in K-12 schools, by using game design and computational science for motivating and educating students through an approach we call Scalable Game Design, starting at the middle school level. In this paper we discuss the use of Computational Thinking Patterns as the basis for our…

  5. Ethics and Computer Scientists.

    Science.gov (United States)

    Pulliam, Sylvia Clark

    The purpose of this study was to explore the perceptions that computer science educators have about computer ethics. The study focused on four areas: (1) the extent to which computer science educators believe that ethically inappropriate practices are taking place (both on campus and throughout society); (2) perceptions of such educators about…

  6. Deductive Computer Programming. Revision

    Science.gov (United States)

    1989-09-30

    Fragments of the report's publication list survive in this record: Lecture Notes in Computer Science 354; "...automata", in Temporal Logic in Specification, Lecture Notes in Computer Science 398, Springer-Verlag, 1989, pp. 124-164; [MP4] Z. Manna and A. Pnueli, ... Lecture Notes in Computer Science 372, Springer-Verlag, 1989, pp. 534-558. Contributions to books: [MP5] Z. Manna and A. Pnueli, "An exercise in the

  7. The Next Computer Revolution.

    Science.gov (United States)

    Peled, Abraham

    1987-01-01

    Discusses some of the future trends in the use of the computer in our society, suggesting that computing is now entering a new phase in which it will grow exponentially more powerful, flexible, and sophisticated in the next decade. Describes some of the latest breakthroughs in computer hardware and software technology. (TW)

  8. Computational Social Creativity.

    Science.gov (United States)

    Saunders, Rob; Bown, Oliver

    2015-01-01

    This article reviews the development of computational models of creativity where social interactions are central. We refer to this area as computational social creativity. Its context is described, including the broader study of creativity, the computational modeling of other social phenomena, and computational models of individual creativity. Computational modeling has been applied to a number of areas of social creativity and has the potential to contribute to our understanding of creativity. A number of requirements for computational models of social creativity are common in artificial life and computational social science simulations. Three key themes are identified: (1) computational social creativity research has a critical role to play in understanding creativity as a social phenomenon and advancing computational creativity by making clear epistemological contributions in ways that would be challenging for other approaches; (2) the methodologies developed in artificial life and computational social science carry over directly to computational social creativity; and (3) the combination of computational social creativity with individual models of creativity presents significant opportunities and poses interesting challenges for the development of integrated models of creativity that have yet to be realized.

  9. Mixing Computations and Proofs

    Directory of Open Access Journals (Sweden)

    Michael Beeson

    2016-01-01

    Full Text Available We examine the relationship between proof and computation in mathematics, especially in formalized mathematics. We compare the various approaches to proofs with a significant computational component, including (i) verifying the algorithms, (ii) verifying the results of the unverified algorithms, and (iii) trusting an external computation.

  10. Research on ionospheric tomography based on variable pixel height

    Science.gov (United States)

    Zheng, Dunyong; Li, Peiqing; He, Jie; Hu, Wusheng; Li, Chaokui

    2016-05-01

    A novel ionospheric tomography technique based on variable pixel height was developed for the tomographic reconstruction of the ionospheric electron density distribution. The method considers the height of each pixel as an unknown variable, which is retrieved during the inversion process together with the electron density values. In contrast to conventional computerized ionospheric tomography (CIT), which parameterizes the model with a fixed pixel height, the variable-pixel-height computerized ionospheric tomography (VHCIT) model applies a disturbance to the height of each pixel. In comparison with conventional CIT models, the VHCIT technique achieved superior results in a numerical simulation. A careful validation of the reliability and superiority of VHCIT was performed. According to the results of the statistical analysis of the average root mean square errors, the proposed model offers a 15% improvement over conventional CIT models.
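
    For illustration only, the sketch below shows the conventional pixel-based formulation that VHCIT extends: slant measurements are modelled as line integrals through a pixel grid, and the electron densities are recovered with a Kaczmarz/ART iteration. The geometry, sizes and values are assumptions; the variable-height idea would simply enlarge the unknown vector with per-pixel height perturbations.

```python
# Minimal pixel-based ionospheric tomography sketch (not the VHCIT algorithm).
import numpy as np

rng = np.random.default_rng(1)
n_pixels = 100                       # 10 x 10 grid of ionospheric pixels (fixed heights)
n_rays = 60                          # slant TEC measurements, fewer than the unknowns

x_true = rng.uniform(1e11, 1e12, n_pixels)       # "true" electron densities (el/m^3)
A = rng.uniform(0.0, 50e3, (n_rays, n_pixels))   # ray path length inside each pixel (m)
A[rng.random(A.shape) > 0.3] = 0.0               # each ray only crosses a subset of pixels
b = A @ x_true                                   # simulated slant TEC observations

x = np.full(n_pixels, 5e11)          # initial guess
for sweep in range(50):              # Kaczmarz/ART: project onto one ray equation at a time
    for i in range(n_rays):
        a = A[i]
        norm2 = a @ a
        if norm2 > 0.0:
            x += (b[i] - a @ x) / norm2 * a

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(f"relative reconstruction error: {rel_err:.2f}")
```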

  11. Application of optical coherence tomography based microangiography for cerebral imaging

    Science.gov (United States)

    Baran, Utku; Wang, Ruikang K.

    2016-03-01

    Requirements of in vivo rodent brain imaging are hard to satisfy using traditional technologies such as magnetic resonance imaging and two-photon microscopy. Optical coherence tomography (OCT) is an emerging tool that can easily reach at high speeds and provide high resolution volumetric images with a relatively large field of view for rodent brain imaging. Here, we provide the overview of recent developments of functional OCT based imaging techniques for neuroscience applications on rodents. Moreover, a summary of OCT-based microangiography (OMAG) studies for stroke and traumatic brain injury cases on rodents are provided.

  12. Optical Doppler tomography based on a field programmable gate array

    DEFF Research Database (Denmark)

    Larsen, Henning Engelbrecht; Nilsson, Ronnie Thorup; Thrane, Lars

    2008-01-01

    We report the design of and results obtained by using a field programmable gate array (FPGA) to digitally process optical Doppler tomography signals. The processor fits into the analog signal path in an existing optical coherence tomography setup. We demonstrate both Doppler frequency and envelope... extraction using the Hilbert transform, all in a single FPGA. An FPGA implementation has certain advantages over a general-purpose digital signal processor (DSP) because the processing elements operate in parallel, whereas the DSP is primarily a sequential processor.
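
    As a software counterpart to the FPGA processing described above, the following minimal Python sketch (an illustration under assumed signal parameters, not the authors' implementation) extracts both the envelope and the instantaneous Doppler frequency of a simulated fringe signal via the Hilbert transform.

```python
# Hilbert-transform-based envelope and Doppler-frequency extraction (sketch).
import numpy as np
from scipy.signal import hilbert

fs = 100e3                          # sampling rate in Hz (assumed)
t = np.arange(0, 0.01, 1 / fs)      # 10 ms of simulated fringe signal
f_doppler = 2.5e3                   # simulated Doppler shift in Hz (assumed)
fringe = (1 + 0.5 * np.sin(2 * np.pi * 40 * t)) * np.cos(2 * np.pi * f_doppler * t)

analytic = hilbert(fringe)                       # analytic signal via the Hilbert transform
envelope = np.abs(analytic)                      # structural (OCT) envelope
phase = np.unwrap(np.angle(analytic))            # instantaneous phase
doppler_hz = np.diff(phase) * fs / (2 * np.pi)   # instantaneous Doppler frequency

print(f"peak envelope:         {envelope.max():.2f}")
print(f"mean Doppler estimate: {doppler_hz.mean():.1f} Hz")
```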

  13. Computerized tomography-based anatomic description of the porcine liver.

    Science.gov (United States)

    Bekheit, Mohamed; Bucur, Petru O; Wartenberg, Mylene; Vibert, Eric

    2017-04-01

    Knowledge of the anatomic features is imperative for successful modeling of the different surgical situations. This study aims to describe the anatomic features of the porcine liver using computerized tomography (CT) scanning. Thirty large, white, female pigs were included in this study. The CT image acquisition was performed as a four-phase contrast study. Subsequently, analysis of the images was performed using syngo.via software (Siemens) to subtract mainly the hepatic artery and its branches. Analysis of the portal and hepatic veins division pattern was performed using the Myrian XP-Liver 1.14.1 software (Intrasense). The mean total liver volume was 915 ± 159 mL. The largest sector in the liver was the right medial one representing around 28 ± 5.7% of the total liver volume. Next in order is the right lateral sector constituting around 24 ± 5%. Its volume is very close to the volume of the left medial sector, which represents around 22 ± 4.7% of the total liver volume. The caudate lobe represents around 8 ± 2% of the total liver volume. The portal vein did not show distinct right and left divisions, but rather consecutive branches that come off the main trunk. The hepatic artery frequently trifurcates into a left trunk that gives off the right gastric artery and the artery to the left lateral sector, the middle hepatic artery that supplies both the right and the left medial sectors, and the right hepatic artery trunk that divides to give an anterior branch to the right lateral lobe, a branch to the right medial lobe, and at least a branch to the caudate lobe. Frequently, there is a posterior branch that crosses behind the portal vein to the right lateral lobe. The suprahepatic veins join the inferior vena cava in three distinct openings. There are communications between the suprahepatic veins that drain the adjacent sectors. The vein from the right lateral and the right medial sectors drains into a common trunk. The vein from the left lateral and from the left medial sectors drains into a common trunk. A separate opening is usually encountered draining the right medial sector. The caudate lobe drains separately into the inferior vena cava caudal to the other veins. Knowledge of the anatomic features of the porcine liver is crucial to the performance of a successful surgical procedure. We herein describe the CT-depicted anatomic features of the porcine liver. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Microcomputed tomography-based assessment of retrieved dental implants

    NARCIS (Netherlands)

    Narra, N.; Antalainen, A.K.; Zipprich, H.; Sándor, G.K.; Wolff, J.

    2015-01-01

    Purpose: The aim of this study was to demonstrate the potential of microcomputed tomography (micro-CT) technology in the assessment of retrieved dental implants. Cases are presented to illustrate the value of micro-CT imaging techniques in determining possible mechanical causes for dental implant

  15. Electron tomography based on a total variation minimization reconstruction technique

    Energy Technology Data Exchange (ETDEWEB)

    Goris, B., E-mail: bart.goris@ua.ac.be [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Van den Broek, W. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Batenburg, K.J. [Centrum Wiskunde and Informatica, Science Park 123, NL-1098XG Amsterdam (Netherlands); Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium); Heidari Mezerji, H.; Bals, S. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium)

    2012-02-15

    The 3D reconstruction of a tilt series for electron tomography is mostly carried out using the weighted backprojection (WBP) algorithm or using one of the iterative algorithms such as the simultaneous iterative reconstruction technique (SIRT). However, it is known that these reconstruction algorithms cannot compensate for the missing wedge. Here, we apply a new reconstruction algorithm for electron tomography, which is based on compressive sensing. This is a field in image processing specialized in finding a sparse solution or a solution with a sparse gradient to a set of ill-posed linear equations. Therefore, it can be applied to electron tomography where the reconstructed objects often have a sparse gradient at the nanoscale. Using a combination of different simulated and experimental datasets, it is shown that missing wedge artefacts are reduced in the final reconstruction. Moreover, it seems that the reconstructed datasets have a higher fidelity and are easier to segment in comparison to reconstructions obtained by more conventional iterative algorithms. -- Highlights: ► A reconstruction algorithm for electron tomography is investigated based on total variation minimization. ► Missing wedge artefacts are reduced by this algorithm. ► The reconstruction is easier to segment. ► More reliable quantitative information can be obtained.
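
    The following is a deliberately small illustration of total-variation-regularized reconstruction in the compressive-sensing spirit described above; it is not the paper's algorithm. A piecewise-constant phantom (sparse gradient) is recovered from fewer random linear measurements than unknowns by plain subgradient descent; all sizes and parameters are assumptions.

```python
# TV-regularized reconstruction sketch: min ||Ax - b||^2 + lam * TV(x).
import numpy as np

rng = np.random.default_rng(0)
n = 16
x_true = np.zeros((n, n))
x_true[4:12, 5:11] = 1.0                 # piecewise-constant phantom (sparse gradient)

m = 120                                  # fewer measurements than n*n unknowns
A = rng.standard_normal((m, n * n)) / np.sqrt(m)
b = A @ x_true.ravel()

def tv_subgrad(img):
    """Subgradient of the anisotropic total variation of img."""
    gx = np.diff(img, axis=0, append=img[-1:, :])   # forward differences, zero at far edge
    gy = np.diff(img, axis=1, append=img[:, -1:])
    sx, sy = np.sign(gx), np.sign(gy)
    # d/dx[i,j] of sum(|gx| + |gy|) = sign(gx[i-1,j]) - sign(gx[i,j]) + (same along columns)
    return (np.roll(sx, 1, axis=0) - sx) + (np.roll(sy, 1, axis=1) - sy)

x = np.zeros(n * n)
lam, step = 0.05, 0.05
for _ in range(3000):
    grad = A.T @ (A @ x - b) + lam * tv_subgrad(x.reshape(n, n)).ravel()
    x -= step * grad

rel_err = np.linalg.norm(x - x_true.ravel()) / np.linalg.norm(x_true)
print(f"relative reconstruction error: {rel_err:.2f}")
```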

  16. Bioluminescence tomography based on the phase approximation model

    OpenAIRE

    Cong, W; Wang, G.

    2010-01-01

    A reconstruction method of bioluminescence sources is proposed based on a phase approximation model. Compared with the diffuse approximation, this phase approximation model more correctly predicts bioluminescence photon propagation in biological tissues, so that bioluminescence tomography can accurately locate and quantify the distribution of bioluminescence sources. The compressive sensing (CS) technique is applied to regularize the inverse source reconstruction to enhance numerical stabilit...

  17. Optical coherence tomography-based micro-particle image velocimetry.

    Science.gov (United States)

    Mujat, Mircea; Ferguson, R Daniel; Iftimia, Nicusor; Hammer, Daniel X; Nedyalkov, Ivaylo; Wosnik, Martin; Legner, Hartmut

    2013-11-15

    We present a new application of optical coherence tomography (OCT), widely used in biomedical imaging, to flow analysis in near-wall hydrodynamics for marine research. This unique capability, called OCT micro-particle image velocimetry, provides a high-resolution view of microscopic flow phenomena and measurement of flow statistics within the first millimeter of a boundary layer. The technique is demonstrated in a small flow cuvette and in a water tunnel.

  18. Quantum State Tomography Based on Quantum Games Theoretic Setup

    CERN Document Server

    Nawaz, Ahmad

    2009-01-01

    We develop a technique for single qubit quantum state tomography using the mathematical setup of generalized quantization scheme for games. In our technique Alice sends an unknown pure quantum state to Bob who appends it with |0><0| and then applies the unitary operators on the appended quantum state and finds the payoffs for Alice and himself. It is shown that for a particular set of unitary operators these elements become equal to Stokes parameters for an unknown quantum state. In this way an unknown quantum state can be measured and reconstructed. Strictly speaking this technique is not a game as no strategic competitions are involved.
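
    The link between the payoffs and the Stokes parameters can be made concrete with standard single-qubit tomography (a generic illustration, not the game-theoretic protocol itself): once the three Pauli expectation values of an unknown state are known, the density matrix follows directly.

```python
# Reconstructing a single-qubit state from its Stokes parameters:
# rho = (I + s_x X + s_y Y + s_z Z) / 2.
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# An example pure state playing the role of the unknown state.
psi = np.array([np.cos(np.pi / 8), np.exp(1j * np.pi / 4) * np.sin(np.pi / 8)])
rho_true = np.outer(psi, psi.conj())

# Stokes parameters are the Pauli expectation values of the unknown state.
s = [np.real(np.trace(rho_true @ P)) for P in (X, Y, Z)]

# Reconstruction from the Stokes parameters.
rho_rec = 0.5 * (I2 + s[0] * X + s[1] * Y + s[2] * Z)
print(np.allclose(rho_rec, rho_true))   # True
```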

  19. Video-rate volumetric optical coherence tomography-based microangiography

    Science.gov (United States)

    Baran, Utku; Wei, Wei; Xu, Jingjiang; Qi, Xiaoli; Davis, Wyatt O.; Wang, Ruikang K.

    2016-04-01

    Video-rate volumetric optical coherence tomography (vOCT) is relatively young in the field of OCT imaging but has great potential in biomedical applications. Due to the recent development of the MHz range swept laser sources, vOCT has started to gain attention in the community. Here, we report the first in vivo video-rate volumetric OCT-based microangiography (vOMAG) system by integrating an 18-kHz resonant microelectromechanical system (MEMS) mirror with a 1.6-MHz FDML swept source operating at ˜1.3 μm wavelength. Because the MEMS scanner can offer an effective B-frame rate of 36 kHz, we are able to engineer vOMAG with a video rate up to 25 Hz. This system was utilized for real-time volumetric in vivo visualization of cerebral microvasculature in mice. Moreover, we monitored the blood perfusion dynamics during stimulation within mouse ear in vivo. We also discussed this system's limitations. Prospective MEMS-enabled OCT probes with a real-time volumetric functional imaging capability can have a significant impact on endoscopic imaging and image-guided surgery applications.

  20. How Computers Work: Computational Thinking for Everyone

    Directory of Open Access Journals (Sweden)

    Rex Page

    2013-01-01

    Full Text Available What would you teach if you had only one course to help students grasp the essence of computation and perhaps inspire a few of them to make computing a subject of further study? Assume they have the standard college prep background. This would include basic algebra, but not necessarily more advanced mathematics. They would have written a few term papers, but would not have written computer programs. They could surf and twitter, but could not exclusive-or and nand. What about computers would interest them or help them place their experience in context? This paper provides one possible answer to this question by discussing a course that has completed its second iteration. Grounded in classical logic, elucidated in digital circuits and computer software, it expands into areas such as CPU components and massive databases. The course has succeeded in garnering the enthusiastic attention of students with a broad range of interests, exercising their problem solving skills, and introducing them to computational thinking.

  1. The science of computing - Parallel computation

    Science.gov (United States)

    Denning, P. J.

    1985-01-01

    Although parallel computation architectures have been known for computers since the 1920s, it was only in the 1970s that microelectronic component technologies advanced to the point where it became feasible to incorporate multiple processors in one machine. Concomitantly, the development of algorithms for parallel processing also lagged due to hardware limitations. The speed of computing with solid-state chips is limited by gate switching delays. The physical limit implies that a 1 Gflop operational speed is the maximum for sequential processors. A computer recently introduced features a 'hypercube' architecture with 128 processors connected in networks at 5, 6 or 7 points per grid, depending on the design choice. Its computing speed rivals that of supercomputers, but at a fraction of the cost. The added speed with less hardware is due to parallel processing, which utilizes algorithms representing different parts of an equation that can be broken into simpler statements and processed simultaneously. Present, highly developed computer languages like FORTRAN, PASCAL, COBOL, etc., rely on sequential instructions. Thus, increased emphasis will now be directed at parallel processing algorithms to exploit the new architectures.
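
    A toy example of the idea sketched in the abstract, splitting one computation into independent parts that are evaluated simultaneously and then combined (the function and chunk sizes are illustrative assumptions):

```python
# Splitting a computation into parts processed in parallel, then combining them.
from concurrent.futures import ProcessPoolExecutor

def partial_sum(bounds):
    """Sum of squares over the half-open range [lo, hi)."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    n, workers = 10_000_000, 4
    step = n // workers
    chunks = [(k * step, n if k == workers - 1 else (k + 1) * step)
              for k in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        total = sum(pool.map(partial_sum, chunks))   # the partial sums run concurrently
    print(total)
```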

  2. Neural Computation and the Computational Theory of Cognition

    Science.gov (United States)

    Piccinini, Gualtiero; Bahar, Sonya

    2013-01-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism--neural processes are computations in the…

  3. Scalable distributed computing hierarchy: cloud, fog and dew computing

    OpenAIRE

    Skala, Karolj; Davidović, Davor; Afgan, Enis; Sović, Ivan; Šojat, Zorislav

    2015-01-01

    The paper considers the conceptual approach for organization of the vertical hierarchical links between the scalable distributed computing paradigms: Cloud Computing, Fog Computing and Dew Computing. In this paper, the Dew Computing is described and recognized as a new structural layer in the existing distributed computing hierarchy. In the existing computing hierarchy, the Dew computing is positioned as the ground level for the Cloud and Fog computing paradigms. Vertical, complementary, hier...

  4. Computer algebra and operators

    Science.gov (United States)

    Fateman, Richard; Grossman, Robert

    1989-01-01

    The symbolic computation of operator expansions is discussed. Some of the capabilities that prove useful when performing computer algebra computations involving operators are considered. These capabilities may be broadly divided into three areas: the algebraic manipulation of expressions from the algebra generated by operators; the algebraic manipulation of the actions of the operators upon other mathematical objects; and the development of appropriate normal forms and simplification algorithms for operators and their actions. Brief descriptions are given of the computer algebra computations that arise when working with various operators and their actions.

  5. Computer Security Handbook

    CERN Document Server

    Bosworth, Seymour; Whyne, Eric

    2012-01-01

    The classic and authoritative reference in the field of computer security, now completely updated and revised With the continued presence of large-scale computers; the proliferation of desktop, laptop, and handheld computers; and the vast international networks that interconnect them, the nature and extent of threats to computer security have grown enormously. Now in its fifth edition, Computer Security Handbook continues to provide authoritative guidance to identify and to eliminate these threats where possible, as well as to lessen any losses attributable to them. With seventy-seven chapter

  6. Perspectives in Computation

    CERN Document Server

    Geroch, Robert

    2009-01-01

    Computation is the process of applying a procedure or algorithm to the solution of a mathematical problem. Mathematicians and physicists have been occupied for many decades pondering which problems can be solved by which procedures, and, for those that can be solved, how this can most efficiently be done. In recent years, quantum mechanics has augmented our understanding of the process of computation and of its limitations. Perspectives in Computation covers three broad topics: the computation process and its limitations, the search for computational efficiency, and the role of quantum mechani

  7. Rough-Granular Computing

    Institute of Scientific and Technical Information of China (English)

    Andrzej Skowron

    2006-01-01

    Solving complex problems by multi-agent systems in distributed environments requires new approximate reasoning methods based on new computing paradigms. One such recently emerging computing paradigm is Granular Computing (GC). We discuss the Rough-Granular Computing (RGC) approach to modeling of computations in complex adaptive systems and multiagent systems as well as for approximate reasoning about the behavior of such systems. The RGC methods have been successfully applied for solving complex problems in areas such as identification of objects or behavioral patterns by autonomous systems, web mining, and sensor fusion.

  8. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2005-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security. To achieve our goals we have established a close alliance with applied mathematicians and computer scientists at Stony Brook and Columbia Universities.

  9. Computing meaning v.4

    CERN Document Server

    Bunt, Harry; Pulman, Stephen

    2013-01-01

    This book is a collection of papers by leading researchers in computational semantics. It presents a state-of-the-art overview of recent and current research in computational semantics, including descriptions of new methods for constructing and improving resources for semantic computation, such as WordNet, VerbNet, and semantically annotated corpora. It also presents new statistical methods in semantic computation, such as the application of distributional semantics in the compositional calculation of sentence meanings. Computing the meaning of sentences, texts, and spoken or texted dialogue i

  10. Cloud Computing Bible

    CERN Document Server

    Sosinsky, Barrie

    2010-01-01

    The complete reference guide to the hot technology of cloud computing. Its potential for lowering IT costs makes cloud computing a major force for both IT vendors and users; it is expected to gain momentum rapidly with the launch of Office Web Apps later this year. Because cloud computing involves various technologies, protocols, platforms, and infrastructure elements, this comprehensive reference is just what you need if you'll be using or implementing cloud computing. Cloud computing offers significant cost savings by eliminating upfront expenses for hardware and software; its growing popularit

  11. Replacing the computer mouse

    OpenAIRE

    Dernoncourt, Franck

    2014-01-01

    In a few months the computer mouse will be half-a-century-old. It is known to have many drawbacks, the main ones being: loss of productivity due to constant switching between keyboard and mouse, and health issues such as RSI. Like the keyboard, it is an unnatural human-computer interface. However the vast majority of computer users still use computer mice nowadays. In this article, we explore computer mouse alternatives. Our research shows that moving the mouse cursor can be done efficiently ...

  12. Computation over Mismatched Channels

    CERN Document Server

    Karamchandani, Nikhil; Diggavi, Suhas

    2012-01-01

    We consider the problem of distributed computation of a target function over a multiple-access channel. If the target and channel functions are matched (i.e., compute the same function), significant performance gains can be obtained by jointly designing the computation and communication tasks. However, in most situations there is mismatch between these two functions. In this work, we analyze the impact of this mismatch on the performance gains achievable with joint computation and communication designs over separation-based designs. We show that for most pairs of target and channel functions there is no such gain, and separation of computation and communication is optimal.

  13. Essential numerical computer methods

    CERN Document Server

    Johnson, Michael L

    2010-01-01

    The use of computers and computational methods has become ubiquitous in biological and biomedical research. During the last 2 decades most basic algorithms have not changed, but what has is the huge increase in computer speed and ease of use, along with the corresponding orders of magnitude decrease in cost. A general perception exists that the only applications of computers and computer methods in biological and biomedical research are either basic statistical analysis or the searching of DNA sequence data bases. While these are important applications they only scratch the surface

  14. Theory of computation

    CERN Document Server

    Tourlakis, George

    2012-01-01

    Learn the skills and acquire the intuition to assess the theoretical limitations of computer programming Offering an accessible approach to the topic, Theory of Computation focuses on the metatheory of computing and the theoretical boundaries between what various computational models can do and not do—from the most general model, the URM (Unbounded Register Machines), to the finite automaton. A wealth of programming-like examples and easy-to-follow explanations build the general theory gradually, which guides readers through the modeling and mathematical analysis of computational pheno

  15. Analogue computing methods

    CERN Document Server

    Welbourne, D

    1965-01-01

    Analogue Computing Methods presents the field of analogue computation and simulation in a compact and convenient form, providing an outline of models and analogues that have been produced to solve physical problems for the engineer and how to use and program the electronic analogue computer. This book consists of six chapters. The first chapter provides an introduction to analogue computation and discusses certain mathematical techniques. The electronic equipment of an analogue computer is covered in Chapter 2, while its use to solve simple problems, including the method of scaling is elaborat

  16. Secure cloud computing

    CERN Document Server

    Jajodia, Sushil; Samarati, Pierangela; Singhal, Anoop; Swarup, Vipin; Wang, Cliff

    2014-01-01

    This book presents a range of cloud computing security challenges and promising solution paths. The first two chapters focus on practical considerations of cloud computing. In Chapter 1, Chandramouli, Iorga, and Chokani describe the evolution of cloud computing and the current state of practice, followed by the challenges of cryptographic key management in the cloud. In Chapter 2, Chen and Sion present a dollar cost model of cloud computing and explore the economic viability of cloud computing with and without security mechanisms involving cryptographic mechanisms. The next two chapters addres

  17. Topology for computing

    CERN Document Server

    Zomorodian, Afra J

    2005-01-01

    The emerging field of computational topology utilizes theory from topology and the power of computing to solve problems in diverse fields. Recent applications include computer graphics, computer-aided design (CAD), and structural biology, all of which involve understanding the intrinsic shape of some real or abstract space. A primary goal of this book is to present basic concepts from topology and Morse theory to enable a non-specialist to grasp and participate in current research in computational topology. The author gives a self-contained presentation of the mathematical concepts from a comp

  18. Trust Based Pervasive Computing

    Institute of Scientific and Technical Information of China (English)

    LI Shiqun; Shane Balfe; ZHOU Jianying; CHEN Kefei

    2006-01-01

    A pervasive computing environment is a distributed and mobile space. A trust relationship must be established and ensured between devices and the systems in the pervasive computing environment. The trusted computing (TC) technology introduced by the Trusted Computing Group is a distributed-system-wide approach to the provision of integrity protection of resources. The TC's notion of trust and security can be described as conformed system behaviors of a platform environment such that the conformation can be attested to a remote challenger. In this paper the trust requirements in a pervasive/ubiquitous environment are analyzed. Then security schemes for pervasive computing are proposed using primitives offered by TC technology.

  19. Cloud Computing Technologies

    Directory of Open Access Journals (Sweden)

    Sean Carlin

    2012-06-01

    Full Text Available This paper outlines the key characteristics that cloud computing technologies possess and illustrates the cloud computing stack containing the three essential services (SaaS, PaaS and IaaS) that have come to define the technology and its delivery model. The underlying virtualization technologies that make cloud computing possible are also identified and explained. The various challenges that face cloud computing technologies today are investigated and discussed. The future of cloud computing technologies along with its various applications and trends are also explored, giving a brief outlook of where and how the technology will progress into the future.

  20. Dental application of novel finite element analysis software for three-dimensional finite element modeling of a dentulous mandible from its computed tomography images.

    Science.gov (United States)

    Nakamura, Keiko; Tajima, Kiyoshi; Chen, Ker-Kong; Nagamatsu, Yuki; Kakigawa, Hiroshi; Masumi, Shin-ich

    2013-12-01

    This study focused on the application of novel finite-element analysis software for constructing a finite-element model from the computed tomography data of a human dentulous mandible. The finite-element model is necessary for evaluating the mechanical response of the alveolar part of the mandible, resulting from occlusal force applied to the teeth during biting. Commercially available patient-specific general computed tomography-based finite-element analysis software was solely applied to the finite-element analysis for the extraction of computed tomography data. The mandibular bone with teeth was extracted from the original images. Both the enamel and the dentin were extracted after image processing, and the periodontal ligament was created from the segmented dentin. The constructed finite-element model was reasonably accurate using a total of 234,644 nodes and 1,268,784 tetrahedral and 40,665 shell elements. The elastic moduli of the heterogeneous mandibular bone were determined from the bone density data of the computed tomography images. The results suggested that the software applied in this study is both useful and powerful for creating a more accurate three-dimensional finite-element model of a dentulous mandible from the computed tomography data without the need for any other software.
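
    The abstract notes that heterogeneous elastic moduli were assigned from the bone density in the CT data. A generic sketch of this kind of pipeline is shown below, mapping Hounsfield units to apparent density and then to Young's modulus via a power law; the calibration constants and power-law coefficients are illustrative assumptions, not the values used in the study.

```python
# Generic HU -> apparent density -> Young's modulus mapping (illustrative values).
import numpy as np

def hounsfield_to_density(hu):
    """Assumed linear calibration from Hounsfield units to apparent density (g/cm^3)."""
    return np.clip(0.001 * hu + 1.0, 0.05, None)

def density_to_youngs_modulus(rho):
    """Assumed power-law density-modulus relation E = a * rho**b (MPa)."""
    a, b = 6850.0, 1.49            # example coefficients, not the study's values
    return a * rho ** b

hu_voxels = np.array([200.0, 600.0, 1200.0])    # example CT voxel values
E = density_to_youngs_modulus(hounsfield_to_density(hu_voxels))
for hu, e in zip(hu_voxels, E):
    print(f"HU {hu:6.0f} -> E ≈ {e:8.0f} MPa")
```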

  1. Transoesophageal ultrasound and computer tomographic assessment of the equine cricoarytenoid dorsalis muscle: Relationship between muscle geometry and exercising laryngeal function.

    Science.gov (United States)

    Kenny, M; Cercone, M; Rawlinson, J J; Ducharme, N G; Bookbinder, L; Thompson, M; Cheetham, J

    2017-05-01

    Early detection of recurrent laryngeal neuropathy (RLN) is of considerable interest to the equine industry. To describe two imaging modalities, transoesophageal ultrasound (TEU) and computed tomography (CT) with multiplanar reconstruction to assess laryngeal muscle geometry, and determine the relationship between cricoarytenoid dorsalis (CAD) geometry and function. Two-phase study evaluating CAD geometry in experimental horses and horses with naturally occurring RLN. Equine CAD muscle volume was determined from CT scan sets using volumetric reconstruction with LiveWire. The midbody and caudal dorsal-ventral thickness of the CAD muscle was determined using a TEU in the same horses; and in horses with a range of severity of RLN (n = 112). Transoesophageal ultrasound was able to readily image the CAD muscles and lower left:right CAD thickness ratios were observed with increasing disease severity. Computed tomography based muscle volume correlated very closely with ex vivo muscle volume (R² = 0.77). Computed tomography reconstruction can accurately determine intrinsic laryngeal muscle geometry. A relationship between TEU measurements of CAD geometry and laryngeal function was established. These imaging techniques could be used to track the response of the CAD muscle to restorative surgical treatments such as nerve muscle pedicle graft, nerve anastomosis and functional electrical stimulation. © 2016 EVJ Ltd.

  2. Richard Feynman and computation

    Science.gov (United States)

    Hey, Tony

    1999-04-01

    The enormous contribution of Richard Feynman to modern physics is well known, both to teaching through his famous Feynman Lectures on Physics, and to research with his Feynman diagram approach to quantum field theory and his path integral formulation of quantum mechanics. Less well known perhaps is his long-standing interest in the physics of computation and this is the subject of this paper. Feynman lectured on computation at Caltech for most of the last decade of his life, first with John Hopfield and Carver Mead, and then with Gerry Sussman. The story of how these lectures came to be written up as the Feynman Lectures on Computation is briefly recounted. Feynman also discussed the fundamentals of computation with other legendary figures of the computer science and physics community such as Ed Fredkin, Rolf Landauer, Carver Mead, Marvin Minsky and John Wheeler. He was also instrumental in stimulating developments in both nanotechnology and quantum computing. During the 1980s Feynman re-visited long-standing interests both in parallel computing with Geoffrey Fox and Danny Hillis, and in reversible computation and quantum computing with Charles Bennett, Norman Margolus, Tom Toffoli and Wojciech Zurek. This paper records Feynman's links with the computational community and includes some reminiscences about his involvement with the fundamentals of computing.

  3. ALMA correlator computer systems

    Science.gov (United States)

    Pisano, Jim; Amestica, Rodrigo; Perez, Jesus

    2004-09-01

    We present a design for the computer systems which control, configure, and monitor the Atacama Large Millimeter Array (ALMA) correlator and process its output. Two distinct computer systems implement this functionality: a rack- mounted PC controls and monitors the correlator, and a cluster of 17 PCs process the correlator output into raw spectral results. The correlator computer systems interface to other ALMA computers via gigabit Ethernet networks utilizing CORBA and raw socket connections. ALMA Common Software provides the software infrastructure for this distributed computer environment. The control computer interfaces to the correlator via multiple CAN busses and the data processing computer cluster interfaces to the correlator via sixteen dedicated high speed data ports. An independent array-wide hardware timing bus connects to the computer systems and the correlator hardware ensuring synchronous behavior and imposing hard deadlines on the control and data processor computers. An aggregate correlator output of 1 gigabyte per second with 16 millisecond periods and computational data rates of approximately 1 billion floating point operations per second define other hard deadlines for the data processing computer cluster.
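
    A back-of-the-envelope check, using only the figures quoted above, of what these hard deadlines imply per 16 millisecond period:

```python
# Sizing arithmetic derived from the figures quoted in the abstract.
output_rate = 1e9            # aggregate correlator output, bytes per second
period = 16e-3               # hard-deadline period, seconds
data_ports = 16              # dedicated high-speed data ports
cluster_nodes = 17           # PCs in the data processing cluster
flops = 1e9                  # approximate floating point operations per second

bytes_per_period = output_rate * period
print(f"data per period:        {bytes_per_period / 1e6:.0f} MB")
print(f"per data port / period: {bytes_per_period / data_ports / 1e6:.0f} MB")
print(f"ops per period (total): {flops * period:.2e}")
print(f"ops per node / period:  {flops * period / cluster_nodes:.2e}")
```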

  4. Desktop Computing Integration Project

    Science.gov (United States)

    Tureman, Robert L., Jr.

    1992-01-01

    The Desktop Computing Integration Project for the Human Resources Management Division (HRMD) of LaRC was designed to help division personnel use personal computing resources to perform job tasks. The three goals of the project were to involve HRMD personnel in desktop computing, link mainframe data to desktop capabilities, and to estimate training needs for the division. The project resulted in increased usage of personal computers by Awards specialists, an increased awareness of LaRC resources to help perform tasks, and personal computer output that was used in presentation of information to center personnel. In addition, the necessary skills for HRMD personal computer users were identified. The Awards Office was chosen for the project because of the consistency of their data requests and the desire of employees in that area to use the personal computer.

  5. Hyperswitch Communication Network Computer

    Science.gov (United States)

    Peterson, John C.; Chow, Edward T.; Priel, Moshe; Upchurch, Edwin T.

    1993-01-01

    Hyperswitch Communications Network (HCN) computer is prototype multiple-processor computer being developed. Incorporates improved version of hyperswitch communication network described in "Hyperswitch Network For Hypercube Computer" (NPO-16905). Designed to support high-level software and expansion of itself. HCN computer is message-passing, multiple-instruction/multiple-data computer offering significant advantages over older single-processor and bus-based multiple-processor computers, with respect to price/performance ratio, reliability, availability, and manufacturing. Design of HCN operating-system software provides flexible computing environment accommodating both parallel and distributed processing. Also achieves balance among the following competing factors: performance in processing and communications, ease of use, and tolerance of (and recovery from) faults.

  6. Community Cloud Computing

    CERN Document Server

    Marinos, Alexandros

    2009-01-01

    Cloud Computing is rising fast, with its data centres growing at an unprecedented rate. However, this has come with concerns over privacy, efficiency at the expense of resilience, and environmental sustainability, because of the dependence on Cloud vendors such as Google, Amazon and Microsoft. Our response is an alternative model for the Cloud conceptualisation, providing a paradigm for Clouds in the community, utilising networked personal computers for liberation from the centralised vendor model. Community Cloud Computing (C3) offers an alternative architecture, created by combining the Cloud with paradigms from Grid Computing, principles from Digital Ecosystems, and sustainability from Green Computing, while remaining true to the original vision of the Internet. It is more technically challenging than Cloud Computing, having to deal with distributed computing issues, including heterogeneous nodes, varying quality of service, and additional security constraints. However, these are not insurmountable challenge...

  7. Natural Computing in Computational Finance Volume 4

    CERN Document Server

    O’Neill, Michael; Maringer, Dietmar

    2012-01-01

    This book follows on from Natural Computing in Computational Finance Volumes I, II and III. As in the previous volumes of this series, the book consists of a series of chapters, each of which was selected following a rigorous, peer-reviewed selection process. The chapters illustrate the application of a range of cutting-edge natural computing and agent-based methodologies in computational finance and economics. The applications explored include option model calibration, financial trend reversal detection, enhanced indexation, algorithmic trading, corporate payout determination and agent-based modeling of liquidity costs, and trade strategy adaptation. While describing cutting-edge applications, the chapters are written so that they are accessible to a wide audience. Hence, they should be of interest to academics, students and practitioners in the fields of computational finance and economics.

  8. Computational Biology and High Performance Computing 2000

    Energy Technology Data Exchange (ETDEWEB)

    Simon, Horst D.; Zorn, Manfred D.; Spengler, Sylvia J.; Shoichet, Brian K.; Stewart, Craig; Dubchak, Inna L.; Arkin, Adam P.

    2000-10-19

    The pace of extraordinary advances in molecular biology has accelerated in the past decade due in large part to discoveries coming from genome projects on human and model organisms. The advances in the genome project so far, happening well ahead of schedule and under budget, have exceeded even the dreams of its protagonists, let alone formal expectations. Biologists expect the next phase of the genome project to be even more startling in terms of dramatic breakthroughs in our understanding of human biology, the biology of health and of disease. Only today can biologists begin to envision the experimental, computational and theoretical steps necessary to exploit genome sequence information for its medical impact, its contribution to biotechnology and economic competitiveness, and its ultimate contribution to environmental quality. High performance computing has become one of the critical enabling technologies, which will help to translate this vision of future advances in biology into reality. Biologists are increasingly becoming aware of the potential of high performance computing. The goal of this tutorial is to introduce the exciting new developments in computational biology and genomics to the high performance computing community.

  9. COMPUTER-ASSISTED ACCOUNTING

    Directory of Open Access Journals (Sweden)

    SORIN-CIPRIAN TEIUŞAN

    2009-01-01

    Full Text Available What is computer-assisted accounting? Where is the place and what is the role of the computer in the financial-accounting activity? What is the position and importance of the computer in the accountant’s activity? All these are questions that require scientific research in order to find the answers. The paper approaches the issue of the support granted to the accountant to organize and manage the accounting activity by the computer. Starting from the notions of accounting and computer, the concept of computer-assisted accounting is introduced, it has a general character and it refers to the accounting performed with the help of the computer or using the computer to automate the procedures performed by the person who is doing the accounting activity; this is a concept used to define the computer applications of the accounting activity. The arguments regarding the use of the computer to assist accounting targets the accounting informatization, the automating of the financial-accounting activities and the endowment with modern technology of the contemporary accounting.

  10. Core of Cloud Computing

    Directory of Open Access Journals (Sweden)

    Prof. C.P.Chandgude

    2017-04-01

    Full Text Available Advances in computing facilities date back to the 1960s with the introduction of mainframes. Each computing paradigm has had its own issues, and cloud computing was introduced with these in mind. Cloud computing has its roots in older technologies such as hardware virtualization, distributed computing, internet technologies, and autonomic computing. Cloud computing can be described with two models: the service model and the deployment model. Among the several services it provides, cloud management's primary role is resource provisioning. While there are several such benefits of cloud computing, there are challenges in adopting public clouds because of the dependency on infrastructure that is shared by many enterprises. In this paper, we present the core knowledge of cloud computing, highlighting its key concepts, deployment models, service models, benefits, and security issues related to cloud data. The aim of this paper is to provide a better understanding of cloud computing and to identify important research directions in this field.

  11. Quantum analogue computing.

    Science.gov (United States)

    Kendon, Vivien M; Nemoto, Kae; Munro, William J

    2010-08-13

    We briefly review what a quantum computer is, what it promises to do for us and why it is so hard to build one. Among the first applications anticipated to bear fruit is the quantum simulation of quantum systems. While most quantum computation is an extension of classical digital computation, quantum simulation differs fundamentally in how the data are encoded in the quantum computer. To perform a quantum simulation, the Hilbert space of the system to be simulated is mapped directly onto the Hilbert space of the (logical) qubits in the quantum computer. This type of direct correspondence is how data are encoded in a classical analogue computer. There is no binary encoding, and increasing precision becomes exponentially costly: an extra bit of precision doubles the size of the computer. This has important consequences for both the precision and error-correction requirements of quantum simulation, and significant open questions remain about its practicality. It also means that the quantum version of analogue computers, continuous-variable quantum computers, becomes an equally efficient architecture for quantum simulation. Lessons from past use of classical analogue computers can help us to build better quantum simulators in future.
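
    The cost argument in the abstract (no binary encoding, so an extra bit of precision doubles the size of the machine) can be made concrete with a toy calculation; the linear versus exponential scaling below is the only point being illustrated:

      # Toy scaling comparison: binary encoding adds one register bit (or qubit)
      # per extra bit of precision, while a direct analogue encoding must double
      # the number of distinguishable levels, i.e. the size of the machine.
      for bits in range(1, 11):
          digital_cost = bits        # grows linearly with precision
          analogue_cost = 2 ** bits  # grows exponentially with precision
          print(f"{bits:2d} bits of precision: digital ~{digital_cost:2d} units, "
                f"analogue ~{analogue_cost:4d} units")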

  12. Serious computer games in computer science education

    Directory of Open Access Journals (Sweden)

    Jože Rugelj

    2015-11-01

    Full Text Available The role and importance of serious computer games in contemporary educational practice are presented in this paper, as well as the theoretical fundamentals that justify their use in different forms of education. We present a project for designing and developing serious games that takes place within the curriculum for computer science teachers' education, as independent project work in teams. In this project work, students have to use their knowledge in the fields of didactics and computer science to develop games. Each developed game is tested and evaluated in schools in the framework of the students' practical training. The results of the evaluation can help students improve their games and verify to what extent the specified learning goals have been achieved.

  13. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2006-11-01

    Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host to the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to

  14. New computing systems and their impact on computational mechanics

    Science.gov (United States)

    Noor, Ahmed K.

    1989-01-01

    Recent advances in computer technology that are likely to impact computational mechanics are reviewed. The technical needs for computational mechanics technology are outlined. The major features of new and projected computing systems, including supersystems, parallel processing machines, special-purpose computing hardware, and small systems are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed, and a novel partitioning strategy is outlined for maximizing the degree of parallelism on multiprocessor computers with a shared memory.
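
    The abstract does not describe the partitioning strategy itself; as a generic illustration of the idea, the sketch below statically partitions a set of mesh elements into contiguous blocks, one per processor of a shared-memory machine (this is a standard block partitioning, not the novel strategy outlined in the paper):

      def block_partition(n_items, n_workers):
          """Split indices 0..n_items-1 into contiguous blocks, one per worker.

          The remainder is spread over the first few workers so block sizes
          differ by at most one element.
          """
          base, extra = divmod(n_items, n_workers)
          blocks, start = [], 0
          for w in range(n_workers):
              size = base + (1 if w < extra else 0)
              blocks.append(range(start, start + size))
              start += size
          return blocks

      # e.g. ten finite elements distributed over four shared-memory processors
      for worker, block in enumerate(block_partition(10, 4)):
          print(f"processor {worker}: elements {list(block)}")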

  15. Programming in biomolecular computation

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2010-01-01

    Our goal is to provide a top-down approach to biomolecular computation. In spite of widespread discussion about connections between biology and computation, one question seems notable by its absence: Where are the programs? We introduce a model of computation that is evidently programmable, by programs reminiscent of low-level computer machine code; and at the same time biologically plausible: its functioning is defined by a single and relatively small set of chemical-like reaction rules. Further properties: the model is stored-program: programs are the same as data, so programs are not only executable, but are also compilable and interpretable. It is universal: all computable functions can be computed (in natural ways and without arcane encodings of data and algorithm); it is also uniform: new “hardware” is not needed to solve new problems; and (last but not least) it is Turing complete.

  16. Programming in Biomolecular Computation

    DEFF Research Database (Denmark)

    Hartmann, Lars; Jones, Neil; Simonsen, Jakob Grue

    2010-01-01

    Our goal is to provide a top-down approach to biomolecular computation. In spite of widespread discussion about connections between biology and computation, one question seems notable by its absence: Where are the programs? We introduce a model of computation that is evidently programmable, by programs reminiscent of low-level computer machine code; and at the same time biologically plausible: its functioning is defined by a single and relatively small set of chemical-like reaction rules. Further properties: the model is stored-program: programs are the same as data, so programs are not only executable, but are also compilable and interpretable, and it is Turing complete in a strong sense: a universal algorithm exists, that is able to execute any program, and is not asymptotically inefficient. A prototype model has been implemented (for now in silico on a conventional computer). This work opens new perspectives on just how computation may be specified at the biological level.

  17. Computation with narrow CTCs

    CERN Document Server

    Say, A C Cem

    2011-01-01

    We examine some variants of computation with closed timelike curves (CTCs), where various restrictions are imposed on the memory of the computer, and the information carrying capacity and range of the CTC. We give full characterizations of the classes of languages recognized by polynomial time probabilistic and quantum computers that can send a single classical bit to their own past. Such narrow CTCs are demonstrated to add the power of limited nondeterminism to deterministic computers, and lead to exponential speedup in constant-space probabilistic and quantum computation. We show that, given a time machine with constant negative delay, one can implement CTC-based computations without the need to know about the runtime beforehand.

  18. CLOUD COMPUTING AND SECURITY

    Directory of Open Access Journals (Sweden)

    Asharani Shinde

    2015-10-01

    Full Text Available This document gives an insight into cloud computing, giving an overview of its key features as well as a detailed study of how cloud computing works. Cloud computing lets you access all your applications and documents from anywhere in the world, freeing you from the confines of the desktop and thus making it easier for group members in different locations to collaborate. Certainly, cloud computing can bring about strategic, transformational and even revolutionary benefits fundamental to future enterprise computing, but it also offers immediate and pragmatic opportunities to improve efficiencies today while cost-effectively and systematically setting the stage for strategic change. As this technology makes computing, sharing and networking easy and interesting, we should also think about the security and privacy of information. Thus, the key points to be discussed are what the cloud is, its key features, current applications, future status, and security issues together with possible solutions.

  19. Introduction to morphogenetic computing

    CERN Document Server

    Resconi, Germano; Xu, Guanglin

    2017-01-01

    This book offers a concise introduction to morphogenetic computing, showing that its use makes global and local relations, defects in crystal non-Euclidean geometry databases with source and sink, genetic algorithms, and neural networks more stable and efficient. It also presents applications to databases, language, nanotechnology with defects, biological genetic structure, electrical circuits, and big data structures. In Turing machines, input and output states form a system: when the system is in one state, the input is transformed into output. This computation is always deterministic and without any possible contradiction or defects. In natural computation there are defects and contradictions that have to be solved to give a coherent and effective computation. The new computation generates the morphology of the system, which assumes different forms in time. The genetic process is the prototype of morphogenetic computing. For the Boolean logic truth value, we substitute a set of truth values (active sets) with...

  20. Electronics and computer acronyms

    CERN Document Server

    Brown, Phil

    1988-01-01

    Electronics and Computer Acronyms presents a list of almost 2,500 acronyms related to electronics and computers. The material for this book is drawn from a number of subject areas, including electrical, electronics, computers, telecommunications, fiber optics, microcomputers/microprocessors, audio, video, and information technology. The acronyms also encompass avionics, military, data processing, instrumentation, units, measurement, standards, services, organizations, associations, and companies. This dictionary offers a comprehensive and broad view of electronics and all that is associated wi

  1. Parallel computing works!

    CERN Document Server

    Fox, Geoffrey C; Messina, Guiseppe C

    2014-01-01

    A clear illustration of how parallel computers can be successfully applied to large-scale scientific computations. This book demonstrates how a variety of applications in physics, biology, mathematics and other sciences were implemented on real parallel computers to produce new scientific results. It investigates issues of fine-grained parallelism relevant for future supercomputers with particular emphasis on hypercube architecture. The authors describe how they used an experimental approach to configure different massively parallel machines, design and implement basic system software, and develop

  2. Computational neurogenetic modeling

    CERN Document Server

    Benuskova, Lubica

    2010-01-01

    Computational Neurogenetic Modeling is a student text, introducing the scope and problems of a new scientific discipline - Computational Neurogenetic Modeling (CNGM). CNGM is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This new area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biol

  3. Computer science I essentials

    CERN Document Server

    Raus, Randall

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Computer Science I includes fundamental computer concepts, number representations, Boolean algebra, switching circuits, and computer architecture.

  4. Introduction to reversible computing

    CERN Document Server

    Perumalla, Kalyan S

    2013-01-01

    Few books comprehensively cover the software and programming aspects of reversible computing. Filling this gap, Introduction to Reversible Computing offers an expanded view of the field that includes the traditional energy-motivated hardware viewpoint as well as the emerging application-motivated software approach. Collecting scattered knowledge into one coherent account, the book provides a compendium of both classical and recently developed results on reversible computing. It explores up-and-coming theories, techniques, and tools for the application of rever

  5. Research in computer science

    Science.gov (United States)

    Ortega, J. M.

    1986-01-01

    Various graduate research activities in the field of computer science are reported. Among the topics discussed are: (1) failure probabilities in multi-version software; (2) Gaussian Elimination on parallel computers; (3) three dimensional Poisson solvers on parallel/vector computers; (4) automated task decomposition for multiple robot arms; (5) multi-color incomplete cholesky conjugate gradient methods on the Cyber 205; and (6) parallel implementation of iterative methods for solving linear equations.

  6. Computable de Finetti measures

    CERN Document Server

    Freer, Cameron E

    2009-01-01

    We prove a uniformly computable version of de Finetti's theorem on exchangeable sequences of real random variables. As a consequence, exchangeable stochastic processes in probabilistic functional programming languages can be automatically rewritten as procedures that do not modify non-local state. Along the way, we prove that a distribution on the unit interval is computable if and only if its moments are uniformly computable.

  7. Computational movement analysis

    CERN Document Server

    Laube, Patrick

    2014-01-01

    This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi

  8. Discrete computational structures

    CERN Document Server

    Korfhage, Robert R

    1974-01-01

    Discrete Computational Structures describes discrete mathematical concepts that are important to computing, covering necessary mathematical fundamentals, computer representation of sets, graph theory, storage minimization, and bandwidth. The book also explains conceptual framework (Gorn trees, searching, subroutines) and directed graphs (flowcharts, critical paths, information network). The text discusses algebra, particularly as it applies to semigroups, groups, lattices, and propositional calculus, including a new tabular method of Boolean function minimization. The text emphasize

  9. Computer and Applied Ethics

    OpenAIRE

    越智, 貢

    2014-01-01

    With this essay I treat some problems raised by the new developments in science and technology, that is, those about Computer Ethics to show how and how far Applied Ethics differs from traditional ethics. I take up backgrounds on which Computer Ethics rests, particularly historical conditions of morality. Differences of conditions in time and space explain how Computer Ethics and Applied Ethics are not any traditional ethics in concrete cases. But I also investigate the normative rea...

  10. Computing with Colored Tangles

    Directory of Open Access Journals (Sweden)

    Avishy Y. Carmi

    2015-07-01

    Full Text Available We suggest a diagrammatic model of computation based on an axiom of distributivity. A diagram of a decorated colored tangle, similar to those that appear in low-dimensional topology, plays the role of a circuit diagram. Equivalent diagrams represent bisimilar computations. We prove that our model of computation is Turing complete and that, with bounded resources, it can decide any language in the complexity class IP, sometimes with better performance parameters than the corresponding classical protocols.

  11. Intelligent distributed computing

    CERN Document Server

    Thampi, Sabu

    2015-01-01

    This book contains a selection of refereed and revised papers of the Intelligent Distributed Computing Track originally presented at the third International Symposium on Intelligent Informatics (ISI-2014), September 24-27, 2014, Delhi, India.  The papers selected for this Track cover several Distributed Computing and related topics including Peer-to-Peer Networks, Cloud Computing, Mobile Clouds, Wireless Sensor Networks, and their applications.

  12. Sensor sentinel computing device

    Science.gov (United States)

    Damico, Joseph P.

    2016-08-02

    Technologies pertaining to authenticating data output by sensors in an industrial environment are described herein. A sensor sentinel computing device receives time-series data from a sensor by way of a wireline connection. The sensor sentinel computing device generates a validation signal that is a function of the time-series signal. The sensor sentinel computing device then transmits the validation signal to a programmable logic controller in the industrial environment.
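
    The abstract states only that the validation signal is a function of the time-series signal; a keyed hash over the serialized samples is one plausible such function and is used purely for illustration here (the key and data layout are assumptions, not details from the patent):

      import hashlib
      import hmac
      import json

      def validation_signal(samples, key):
          """Illustrative validation function: an HMAC over the serialized samples.

          The patent abstract does not specify which function of the time series
          the sentinel computes; a keyed hash is just one plausible choice.
          """
          payload = json.dumps(samples, separators=(",", ":")).encode()
          return hmac.new(key, payload, hashlib.sha256).hexdigest()

      # Hypothetical sensor readings (time, value) forwarded to the PLC with a tag
      readings = [(0.000, 4.02), (0.010, 4.05), (0.020, 4.01)]
      tag = validation_signal(readings, key=b"shared-secret")
      print(tag)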

  13. Computer aided production engineering

    Energy Technology Data Exchange (ETDEWEB)

    1986-01-01

    This book presents the following contents: CIM in avionics; computer analysis of product designs for robot assembly; a simulation decision mould for manpower forecast and its application; development of flexible manufacturing system; advances in microcomputer applications in CAD/CAM; an automated interface between CAD and process planning; CAM and computer vision; low friction pneumatic actuators for accurate robot control; robot assembly of printed circuit boards; information systems design for computer integrated manufacture; and a CAD engineering language to aid manufacture.

  14. Factors Affecting Computer Anxiety in High School Computer Science Students.

    Science.gov (United States)

    Hayek, Linda M.; Stephens, Larry

    1989-01-01

    Examines factors related to computer anxiety measured by the Computer Anxiety Index (CAIN). Achievement in two programing courses was inversely related to computer anxiety. Students who had a home computer and had computer experience before high school had lower computer anxiety than those who had not. Lists 14 references. (YP)

  15. Handheld-computers

    NARCIS (Netherlands)

    Ramaekers, P.; Huiskes, J.

    1994-01-01

    Het Proefstation voor de Varkenshouderij (Research Station for Pig Husbandry) is investigating the added value of handheld computers for recording diseases and treatments, compared with written records.

  16. Numbers and computers

    CERN Document Server

    Kneusel, Ronald T

    2015-01-01

    This is a book about numbers and how those numbers are represented in and operated on by computers. It is crucial that developers understand this area because the numerical operations allowed by computers, and the limitations of those operations, especially in the area of floating point math, affect virtually everything people try to do with computers. This book aims to fill this gap by exploring, in sufficient but not overwhelming detail, just what it is that computers do with numbers. Divided into two parts, the first deals with standard representations of integers and floating point numb

  17. Frontiers in Computer Education

    CERN Document Server

    Zhu, Egui; 2011 International Conference on Frontiers in Computer Education (ICFCE 2011)

    2012-01-01

    This book is the proceedings of the 2011 International Conference on Frontiers in Computer Education (ICFCE 2011) in Sanya, China, December 1-2, 2011. The contributions can be useful for researchers, software engineers, and programmers, all interested in promoting the computer and education development. Topics covered are computing and communication technology, network management, wireless networks, telecommunication, Signal and Image Processing, Machine Learning, educational management, educational psychology, educational system, education engineering, education technology and training.  The emphasis is on methods and calculi for computer science and education technology development, verification and verification tools support, experiences from doing developments, and the associated theoretical problems.

  18. Theory and Computation

    Data.gov (United States)

    Federal Laboratory Consortium — Flexible computational infrastructure, software tools and theoretical consultation are provided to support modeling and understanding of the structure and properties...

  19. Introduction to quantum computers

    CERN Document Server

    Berman, Gennady P; Mainieri, Ronnie; Tsifrinovich, Vladimir I

    1998-01-01

    Quantum computing promises to solve problems which are intractable on digital computers. Highly parallel quantum algorithms can decrease the computational time for some problems by many orders of magnitude. This important book explains how quantum computers can do these amazing things. Several algorithms are illustrated: the discrete Fourier transform, Shor’s algorithm for prime factorization; algorithms for quantum logic gates; physical implementations of quantum logic gates in ion traps and in spin chains; the simplest schemes for quantum error correction; correction of errors caused by im

  20. Computer assisted audit techniques

    Directory of Open Access Journals (Sweden)

    Dražen Danić

    2008-12-01

    Full Text Available The purpose of this work is to point to the possibilities for more efficient auditing. In an environment of increasingly intensive use of computer techniques, the aims and volume of auditing do not change when the audit is performed in a computerized information environment. Computer-assisted audit techniques (CAATs) can improve the efficiency and productivity of audit procedures. In a computerized information system, CAATs are the ways in which an auditor can use the computer to gather, or to help gather, audit evidence. There are several reasons why auditors apply computer techniques that assist auditing. Most often, they do so to improve audit efficiency when the volume of data is large. Whether, and to what degree, auditors will apply such techniques depends on several factors; the most important are the auditors' computer knowledge, professional skill and experience, the availability and adequacy of computer support, the infeasibility of manual tests, efficiency, and time limits. Through several examples from practice, we show the possibilities of ACL as one of the CAAT tools.

  1. COMPUTER SUPPORT MANAGEMENT PRODUCTION

    Directory of Open Access Journals (Sweden)

    Svetlana Trajković

    2014-10-01

    Full Text Available In the modern age, highly advanced technology gives computer support an important role in production management. Computer applications in production, in the organization of production systems, and in the organization of management and business are gaining in importance. We live in a time of ever more intensive use of computer technology, which opens a broad and important area for the application of computer systems in production, as well as methods that enable their successful implementation, such as in production management. Computer technology speeds up the processing and transfer of the information needed for decision-making at various levels of management. Computer applications in production management and in the organizational management of the production system are becoming ever more widespread. The new generation of computers caused the first technological revolution in industry. Building on these solutions, industry has been able to use modern computer technology in manufacturing, automation and production management.

  2. Computational approaches to vision

    Science.gov (United States)

    Barrow, H. G.; Tenenbaum, J. M.

    1986-01-01

    Vision is examined in terms of a computational process, and the competence, structure, and control of computer vision systems are analyzed. Theoretical and experimental data on the formation of a computer vision system are discussed. Consideration is given to early vision, the recovery of intrinsic surface characteristics, higher levels of interpretation, and system integration and control. A computational visual processing model is proposed and its architecture and operation are described. Examples of state-of-the-art vision systems, which include some of the levels of representation and processing mechanisms, are presented.

  3. Modeling Trusted Computing

    Institute of Scientific and Technical Information of China (English)

    CHEN Shuyi; WEN Yingyou; ZHAO Hong

    2006-01-01

    In this paper, a formal approach based on predicate logic is proposed for representing and reasoning about trusted computing models. Predicates are defined to represent the characteristics of the objects and the relationships among these objects in a trusted system, according to trusted computing specifications. Inference rules for the trusted relation are also given. With the proposed semantics, some trusted computing models are formalized and verified, which shows that predicate calculus provides a general and effective method for modeling and reasoning about trusted computing systems.

  4. Discrete and computational geometry

    CERN Document Server

    Devadoss, Satyan L

    2011-01-01

    Discrete geometry is a relatively new development in pure mathematics, while computational geometry is an emerging area in applications-driven computer science. Their intermingling has yielded exciting advances in recent years, yet what has been lacking until now is an undergraduate textbook that bridges the gap between the two. Discrete and Computational Geometry offers a comprehensive yet accessible introduction to this cutting-edge frontier of mathematics and computer science. This book covers traditional topics such as convex hulls, triangulations, and Voronoi diagrams, as well a

  5. Proto-computational Thinking

    DEFF Research Database (Denmark)

    Tatar, Deborah Gail; Harrison, Steve; Stewart, Michael

    2017-01-01

    , the observation that computing is usually about some non-computational thing can lead to an approach that integrates computational thinking instruction with existing core curricular classes. A social justice argument can be made for this approach, because all students take courses in the core curriculum...... in plausible theories of change and a number of different educational projects suitable for classroom instruction. However, a major outcome of the study was to advance the importance of proto-computational thinking (PCT). We argue that, in the absence of preexisting use of representational tools for thinking...

  6. Cloud Computing: An Overview

    Science.gov (United States)

    Qian, Ling; Luo, Zhiguo; Du, Yujian; Guo, Leitao

    In order to support the maximum number of users and elastic services with the minimum resources, Internet service providers invented cloud computing. Within a few years, emerging cloud computing has become the hottest technology. From the publication of core papers by Google since 2003, to the commercialization of Amazon EC2 in 2006, and to the service offering of AT&T Synaptic Hosting, cloud computing has evolved from internal IT systems to public services, from cost-saving tools to revenue generators, and from ISPs to telecoms. This paper introduces the concept, history, pros and cons of cloud computing, as well as the value chain and standardization efforts.

  7. Computational approaches to vision

    Science.gov (United States)

    Barrow, H. G.; Tenenbaum, J. M.

    1986-01-01

    Vision is examined in terms of a computational process, and the competence, structure, and control of computer vision systems are analyzed. Theoretical and experimental data on the formation of a computer vision system are discussed. Consideration is given to early vision, the recovery of intrinsic surface characteristics, higher levels of interpretation, and system integration and control. A computational visual processing model is proposed and its architecture and operation are described. Examples of state-of-the-art vision systems, which include some of the levels of representation and processing mechanisms, are presented.

  8. Computer Games and Art

    Directory of Open Access Journals (Sweden)

    Anton Sukhov

    2015-10-01

    Full Text Available This article is devoted to the search for relevant sources (primary and secondary) and the characteristics of computer games that allow them to be included in the field of art (such as the creation of artistic games, computer graphics, active interaction with other forms of art, signs of a spiritual aesthetic act, the distinctive temporality of computer games, “aesthetic illusion”, and interactivity). In general, modern computer games can be attributed to commercial art and popular culture (blockbuster games) and to elite forms of contemporary media art (author’s games, visionary games).

  9. Annual review of computer science

    Energy Technology Data Exchange (ETDEWEB)

    Traub, J.F. (Columbia Univ., New York, NY (USA)); Grosz, B.J. (Harvard Univ., Cambridge, MA (USA)); Lampson, B.W. (Digital Equipment Corp. (US)); Nilsson, N.J. (Stanford Univ., CA (USA))

    1988-01-01

    This book contains the annual review of computer science. Topics covered include: Database security, parallel algorithmic techniques for combinatorial computation, algebraic complexity theory, computer applications in manufacturing, and computational geometry.

  10. Computational matter: evolving computational solutions in materials

    NARCIS (Netherlands)

    Miller, Julian F.; Broersma, Hajo; Silva, Sara

    2015-01-01

    Natural Evolution has been exploiting the physical properties of matter since life first appeared on earth. Evolution-in-materio (EIM) attempts to program matter so that computational problems can be solved. The beauty of this approach is that artificial evolution may be able to utilize unknown phys

  11. Educational Computer Utilization and Computer Communications.

    Science.gov (United States)

    Singh, Jai P.; Morgan, Robert P.

    As part of an analysis of educational needs and telecommunications requirements for future educational satellite systems, three studies were carried out. 1) The role of the computer in education was examined and both current status and future requirements were analyzed. Trade-offs between remote time sharing and remote batch process were explored…

  12. Computations in Plasma Physics.

    Science.gov (United States)

    Cohen, Bruce I.; Killeen, John

    1983-01-01

    Discusses contributions of computers to research in magnetic and inertial-confinement fusion, charged-particle-beam propagation, and space sciences. Considers use in design/control of laboratory and spacecraft experiments and in data acquisition; and reviews major plasma computational methods and some of the important physics problems they…

  13. Simulation of quantum computers

    NARCIS (Netherlands)

    De Raedt, H; Michielsen, K; Hams, AH; Miyashita, S; Saito, K; Landau, DP; Lewis, SP; Schuttler, HB

    2001-01-01

    We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software con
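
    A minimal state-vector illustration of the mapping described above, in which a register of spin-1/2 objects is a complex vector of dimension 2^n and a one-spin gate acts through Kronecker products with identities on the remaining spins (a generic sketch, not the authors' simulation software):

      import numpy as np

      I = np.eye(2)
      H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard on one spin-1/2

      def apply_single_spin_gate(state, gate, target, n_spins):
          """Apply a one-spin gate to spin `target` of an n-spin state vector."""
          op = np.array([[1.0]])
          for spin in range(n_spins):
              op = np.kron(op, gate if spin == target else I)
          return op @ state

      n = 3
      state = np.zeros(2 ** n, dtype=complex)
      state[0] = 1.0                                  # the state |000>
      state = apply_single_spin_gate(state, H, target=0, n_spins=n)
      print(np.round(state, 3))                       # equal superposition on spin 0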

  14. Computers in construction

    DEFF Research Database (Denmark)

    Howard, Rob

    The evolution of technology, particularly computing in building, learning from the past in order to anticipate what may happen in the future.

  15. Student Computer Use.

    Science.gov (United States)

    Education Statistics Quarterly, 1999

    1999-01-01

    Presents rates of student computer use by grade level, frequency of use, reason for use, and family income. In 1996, 79% of 4th graders, 91% of 8th graders, and 96% of 11th graders were using a computer at home or at school to write stories and papers. (Author/SLD)

  16. Asynchronous Multiparty Computation

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Geisler, Martin; Krøigaard, Mikkel

    2009-01-01

    We propose an asynchronous protocol for general multiparty computation. The protocol has perfect security and communication complexity  where n is the number of parties, |C| is the size of the arithmetic circuit being computed, and k is the size of elements in the underlying field. The protocol g...

  17. Teaching Using Computer Games

    Science.gov (United States)

    Miller, Lee Dee; Shell, Duane; Khandaker, Nobel; Soh, Leen-Kiat

    2011-01-01

    Computer games have long been used for teaching. Current reviews lack categorization and analysis using learning models which would help instructors assess the usefulness of computer games. We divide the use of games into two classes: game playing and game development. We discuss the Input-Process-Outcome (IPO) model for the learning process when…

  18. Computer Aided Mathematics

    DEFF Research Database (Denmark)

    Sinclair, Robert

    1998-01-01

    Course notes of a PhD course held in 1998. The central idea is to introduce students to computational mathematics using object oriented programming in C++.

  19. The Computational Materials Repository

    DEFF Research Database (Denmark)

    Landis, David D.; Hummelshøj, Jens S.; Nestorov, Svetlozar

    2012-01-01

    The possibilities for designing new materials based on quantum physics calculations are rapidly growing, but these design efforts lead to a significant increase in the amount of computational data created. The Computational Materials Repository (CMR) addresses this data challenge and provides...

  20. Programming in biomolecular computation

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Our goal is to provide a top-down approach to biomolecular computation. In spite of widespread discussion about connections between biology and computation, one question seems notable by its absence: Where are the programs? We identify a number of common features in programming that seem conspicu...

  1. Fault tolerant computing systems

    CERN Document Server

    Randell, B

    1981-01-01

    Fault tolerance involves the provision of strategies for error detection, damage assessment, fault treatment and error recovery. A survey is given of the different sorts of strategies used in highly reliable computing systems, together with an outline of recent research on the problems of providing fault tolerance in parallel and distributed computing systems. (15 refs).

  2. Advances in Computer Entertainment.

    NARCIS (Netherlands)

    Nijholt, Antinus; Romão, T.; Reidsma, Dennis; Unknown, [Unknown

    2012-01-01

    These are the proceedings of the 9th International Conference on Advances in Computer Entertainment ACE 2012). ACE has become the leading scientific forum for dissemination of cutting-edge research results in the area of entertainment computing. Interactive entertainment is one of the most vibrant

  3. Computer Use Exposed

    NARCIS (Netherlands)

    J.M. Richter (Janneke)

    2009-01-01

    Ever since the introduction of the personal computer, our daily lives are influenced more and more by computers. A day in the life of a PhD student illustrates this: “At the breakfast table, I check my e-mail to see if the meeting later that day has been confirmed, and I check the time

  4. A Home Computer Primer.

    Science.gov (United States)

    Stone, Antonia

    1982-01-01

    Provides general information on currently available microcomputers, computer programs (software), hardware requirements, software sources, costs, computer games, and programing. Includes a list of popular microcomputers, providing price category, model, list price, software (cassette, tape, disk), monitor specifications, amount of random access…

  5. Text analysis and computers

    OpenAIRE

    1995-01-01

    Content: Erhard Mergenthaler: Computer-assisted content analysis (3-32); Udo Kelle: Computer-aided qualitative data analysis: an overview (33-63); Christian Mair: Machine-readable text corpora and the linguistic description of languages (64-75); Jürgen Krause: Principles of content analysis for information retrieval systems (76-99); Conference Abstracts (100-131).

  6. Computer controlled antenna system

    Science.gov (United States)

    Raumann, N. A.

    1972-01-01

    The application of small computers using digital techniques for operating the servo and control system of large antennas is discussed. The advantages of the system are described. The techniques were evaluated with a forty foot antenna and the Sigma V computer. Programs have been completed which drive the antenna directly without the need for a servo amplifier, antenna position programmer or a scan generator.

  7. Computer Virus Protection

    Science.gov (United States)

    Rajala, Judith B.

    2004-01-01

    A computer virus is a program--a piece of executable code--that has the unique ability to replicate. Like biological viruses, computer viruses can spread quickly and are often difficult to eradicate. They can attach themselves to just about any type of file, and are spread by replicating and being sent from one individual to another. Simply having…

  8. Logic via Computer Programming.

    Science.gov (United States)

    Wieschenberg, Agnes A.

    This paper proposed the question "How do we teach logical thinking and sophisticated mathematics to unsophisticated college students?" One answer among many is through the writing of computer programs. The writing of computer algorithms is mathematical problem solving and logic in disguise and it may attract students who would otherwise stop…

  9. Computer Anxiety and Instruction.

    Science.gov (United States)

    Baumgarte, Roger

    While the computer is commonly viewed as a tool for simplifying and enriching lives, many individuals react to this technology with feelings of anxiety, paranoia, and alienation. These reactions may have potentially serious career and educational consequences. Fear of computers reflects a generalized fear of current technology and is most…

  10. Advances in Computer Entertainment.

    NARCIS (Netherlands)

    Nijholt, Antinus; Romão, T.; Reidsma, Dennis; Unknown, [Unknown

    2012-01-01

    These are the proceedings of the 9th International Conference on Advances in Computer Entertainment ACE 2012). ACE has become the leading scientific forum for dissemination of cutting-edge research results in the area of entertainment computing. Interactive entertainment is one of the most vibrant a

  11. Quantum Knitting Computer

    OpenAIRE

    Fujii, Toshiyuki; Matsuo, Shigemasa; Hatakenaka, Noriyuki

    2009-01-01

    We propose a fluxon-controlled quantum computer incorporated with three-qubit quantum error correction using special gate operations, i.e., joint-phase and SWAP gate operations, inherent in capacitively coupled superconducting flux qubits. The proposed quantum computer acts exactly like a knitting machine at home.

  12. Quantum Analog Computing

    Science.gov (United States)

    Zak, M.

    1998-01-01

    Quantum analog computing is based upon the similarity between the mathematical formalism of quantum mechanics and the phenomena to be computed. It exploits a dynamical convergence of several competing phenomena to an attractor which can represent an extremum of a function, an image, a solution to a system of ODEs, or a stochastic process.
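
    The idea that an attractor can encode an extremum of a function has a simple classical analogue: a gradient flow whose fixed point is the minimum. The toy integration below illustrates only that correspondence and is not the quantum formalism of the paper:

      def f_prime(x):
          return 2.0 * (x - 3.0)    # derivative of f(x) = (x - 3)**2

      x, dt = 10.0, 0.05            # arbitrary start point and step size
      for _ in range(200):
          x -= dt * f_prime(x)      # forward-Euler integration of dx/dt = -f'(x)

      print(f"attractor at x = {x:.4f}  (true minimum of f at x = 3)")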

  13. Computer Processed Evaluation.

    Science.gov (United States)

    Griswold, George H.; Kapp, George H.

    A student testing system was developed consisting of computer generated and scored equivalent but unique repeatable tests based on performance objectives for undergraduate chemistry classes. The evaluation part of the computer system, made up of four separate programs written in FORTRAN IV, generates tests containing varying numbers of multiple…

  14. Computer Aided Mathematics

    DEFF Research Database (Denmark)

    Sinclair, Robert

    1998-01-01

    Course notes of a PhD course held in 1998. The central idea is to introduce students to computational mathematics using object oriented programming in C++.......Course notes of a PhD course held in 1998. The central idea is to introduce students to computational mathematics using object oriented programming in C++....

  15. Preventing Computer Glitches

    Science.gov (United States)

    Goldsborough, Reid

    2009-01-01

    It has been said that a computer lets a person make more mistakes faster than any other invention in human history, with the possible exceptions of handguns and tequila. Computers also make mistakes on their own, whether they're glitches, conflicts, bugs, crashes, or failures. Avoiding glitches is considerably less frustrating than trying to fix…

  16. Learning with Ubiquitous Computing

    Science.gov (United States)

    Rosenheck, Louisa

    2008-01-01

    If ubiquitous computing becomes a reality and is widely adopted, it will inevitably have an impact on education. This article reviews the background of ubiquitous computing and current research projects done involving educational "ubicomp." Finally it explores how ubicomp may and may not change education in both formal and informal settings and…

  17. Computational physics: a perspective.

    Science.gov (United States)

    Stoneham, A M

    2002-06-15

    Computing comprises three distinct strands: hardware, software and the ways they are used in real or imagined worlds. Its use in research is more than writing or running code. Having something significant to compute and deploying judgement in what is attempted and achieved are especially challenging. In science or engineering, one must define a central problem in computable form, run such software as is appropriate and, last but by no means least, convince others that the results are both valid and useful. These several strands are highly interdependent. A major scientific development can transform disparate aspects of information and computer technologies. Computers affect the way we do science, as well as changing our personal worlds. Access to information is being transformed, with consequences beyond research or even science. Creativity in research is usually considered uniquely human, with inspiration a central factor. Scientific and technological needs are major forces in innovation, and these include hardware and software opportunities. One can try to define the scientific needs for established technologies (atomic energy, the early semiconductor industry), for rapidly developing technologies (advanced materials, microelectronics) and for emerging technologies (nanotechnology, novel information technologies). Did these needs define new computing, or was science diverted into applications of then-available codes? Regarding credibility, why is it that engineers accept computer realizations when designing engineered structures, whereas predictive modelling of materials has yet to achieve industrial confidence outside very special cases? The tensions between computing and traditional science are complex, unpredictable and potentially powerful.

  18. Computational chemistry at Janssen.

    Science.gov (United States)

    van Vlijmen, Herman; Desjarlais, Renee L; Mirzadegan, Tara

    2016-12-19

    Computer-aided drug discovery activities at Janssen are carried out by scientists in the Computational Chemistry group of the Discovery Sciences organization. This perspective gives an overview of the organizational and operational structure, the science, internal and external collaborations, and the impact of the group on Drug Discovery at Janssen.

  19. Computers in construction

    DEFF Research Database (Denmark)

    Howard, Rob

    The evolution of technology, particularly computing in building, learning from the past in order to anticipate what may happen in the future.

  20. Computer-assisted Crystallization.

    Science.gov (United States)

    Semeister, Joseph J., Jr.; Dowden, Edward

    1989-01-01

    To avoid the tedious task of recording temperatures, a computer was used to calculate the heat of crystallization of the compound sodium thiosulfate. The computer-interfacing procedures are described. Provides pictures of laboratory equipment and typical graphs from experiments. (YP)
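
    The article does not reproduce the calculation itself; a common classroom approach, sketched here with purely hypothetical masses and temperatures and a simplified heat balance, estimates the heat released during crystallization from calorimeter readings via q = m·c·ΔT and converts it to a molar value:

      # All numbers below are hypothetical and the heat balance is simplified
      # (only the water is assumed to absorb the released heat).
      MOLAR_MASS = 248.18    # g/mol, sodium thiosulfate pentahydrate
      C_WATER = 4.184        # J/(g*K), specific heat of water

      m_sample = 20.0        # g of supercooled sodium thiosulfate pentahydrate
      m_water = 100.0        # g of water in the calorimeter
      t_initial = 21.5       # deg C, before crystallization is triggered
      t_final = 29.8         # deg C, after crystallization

      q_released = m_water * C_WATER * (t_final - t_initial)          # joules
      heat_per_mole = q_released / (m_sample / MOLAR_MASS) / 1000.0   # kJ/mol

      print(f"q = {q_released:.0f} J, heat of crystallization = {heat_per_mole:.1f} kJ/mol")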

  1. Exercises in Computational Chemistry

    DEFF Research Database (Denmark)

    Spanget-Larsen, Jens

    2016-01-01

    A selection of HyperChem© PC-exercises in computational chemistry. Answers to most questions are appended (Roskilde University 2014-16).

  2. Computational chemistry at Janssen

    Science.gov (United States)

    van Vlijmen, Herman; Desjarlais, Renee L.; Mirzadegan, Tara

    2016-12-01

    Computer-aided drug discovery activities at Janssen are carried out by scientists in the Computational Chemistry group of the Discovery Sciences organization. This perspective gives an overview of the organizational and operational structure, the science, internal and external collaborations, and the impact of the group on Drug Discovery at Janssen.

  3. Exercises in Computational Chemistry

    DEFF Research Database (Denmark)

    Spanget-Larsen, Jens

    2016-01-01

    A selection of HyperChem© PC-exercises in computational chemistry. Answers to most questions are appended (Roskilde University 2014-16).

  4. Programming the social computer.

    Science.gov (United States)

    Robertson, David; Giunchiglia, Fausto

    2013-03-28

    The aim of 'programming the global computer' was identified by Milner and others as one of the grand challenges of computing research. At the time this phrase was coined, it was natural to assume that this objective might be achieved primarily through extending programming and specification languages. The Internet, however, has brought with it a different style of computation that (although harnessing variants of traditional programming languages) operates in a style different to those with which we are familiar. The 'computer' on which we are running these computations is a social computer in the sense that many of the elementary functions of the computations it runs are performed by humans, and successful execution of a program often depends on properties of the human society over which the program operates. These sorts of programs are not programmed in a traditional way and may have to be understood in a way that is different from the traditional view of programming. This shift in perspective raises new challenges for the science of the Web and for computing in general.

  5. The Computing World

    Science.gov (United States)

    1992-04-01

    ...before Zuse would finish the machine. The British ended up receiving a smuggled replica of the German message-scrambling device. Alan Turing applied... general purpose computer followed closely by Alan Turing and his programmable digital computer. These pioneers thus launched the modern era of...

  6. Testing On Computers

    Directory of Open Access Journals (Sweden)

    Michael Russell

    1999-06-01

    Full Text Available Russell and Haney (1997) reported that open-ended test items administered on paper may underestimate the achievement of students accustomed to writing on computers. This study builds on Russell and Haney's work by examining the effect of taking open-ended tests on computers and on paper for students with different levels of computer skill. Using items from the Massachusetts Comprehensive Assessment System (MCAS) and the National Assessment of Educational Progress (NAEP), this study focuses on language arts, science and math tests administered to eighth grade students. In addition, information on students' prior computer use and keyboarding speed was collected. Unlike the previous study that found large effects for open-ended writing and science items, this study reports mixed results. For the science test, performance on computers had a positive group effect. For the two language arts tests, an overall group effect was not found. However, for students whose keyboarding speed is at least 0.5 or one-half of a standard deviation above the mean, performing the language arts test on computer had a moderate positive effect. Conversely, for students whose keyboarding speed was 0.5 standard deviations below the mean, performing the tests on computer had a substantial negative effect. For the math test, performing the test on computer had an overall negative effect, but this effect became less pronounced as keyboarding speed increased. Implications are discussed in terms of testing policies and future research.
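
    As a hedged illustration of the kind of comparison reported above (not the study's actual analysis or data), the sketch below computes a standardized mean difference between computer and paper administrations within two keyboarding-speed bands. All scores are hypothetical, and the pooled standard deviation is simplified to the standard deviation of the combined scores.

    ```python
    # Hedged sketch, not the study's analysis: a mode-of-administration effect
    # examined within keyboarding-speed bands via a standardized mean difference.
    import statistics

    def effect_size(group_a, group_b):
        """Mean difference divided by the SD of the combined scores (simplification)."""
        pooled_sd = statistics.pstdev(group_a + group_b)
        return (statistics.mean(group_a) - statistics.mean(group_b)) / pooled_sd

    # Hypothetical language-arts scores for fast typists (>= +0.5 SD keyboarding speed).
    computer_fast = [14, 15, 13, 16, 15, 14]
    paper_fast    = [12, 13, 12, 14, 13, 12]

    # Hypothetical scores for slow typists (<= -0.5 SD keyboarding speed).
    computer_slow = [9, 10, 8, 9, 10, 9]
    paper_slow    = [12, 12, 11, 13, 12, 11]

    print("Effect (fast typists):", round(effect_size(computer_fast, paper_fast), 2))
    print("Effect (slow typists):", round(effect_size(computer_slow, paper_slow), 2))
    ```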

  7. Computer Aided Lecturing.

    Science.gov (United States)

    Van Meter, Donald E.

    1994-01-01

    Surveyed students taking a natural resource conservation course to determine the effects of computer software that provides tools for creating and managing visual presentations to students. Results indicated that 94% of the respondents believed computer-aided lectures helped them and recommended their continued use; note taking was more effective,…

  8. Theory of computational complexity

    CERN Document Server

    Du, Ding-Zhu

    2011-01-01

    DING-ZHU DU, PhD, is a professor in the Department of Computer Science at the University of Minnesota. KER-I KO, PhD, is a professor in the Department of Computer Science at the State University of New York at Stony Brook.

  9. Ubiquitous Human Computing

    OpenAIRE

    Zittrain, Jonathan L.

    2008-01-01

    Ubiquitous computing means network connectivity everywhere, linking devices and systems as small as a thumb tack and as large as a worldwide product distribution chain. What could happen when people are so readily networked? This short essay explores issues arising from two possible emerging models of ubiquitous human computing: fungible networked brainpower and collective personal vital sign monitoring.

  10. Preventing Computer Glitches

    Science.gov (United States)

    Goldsborough, Reid

    2009-01-01

    It has been said that a computer lets a person make more mistakes faster than any other invention in human history, with the possible exceptions of handguns and tequila. Computers also make mistakes on their own, whether they're glitches, conflicts, bugs, crashes, or failures. Avoiding glitches is considerably less frustrating than trying to fix…

  11. Neuroscience, brains, and computers

    Directory of Open Access Journals (Sweden)

    Giorno Maria Innocenti

    2013-07-01

    Full Text Available This paper addresses the role of the neurosciences in establishing what the brain is and how states of the brain relate to states of the mind. The brain is viewed as a computational device performing operations on symbols. However, the brain is a special purpose computational device designed by evolution and development for survival and reproduction, in close interaction with the environment. The hardware of the brain (its structure) is very different from that of man-made computers. The computational style of the brain is also very different from traditional computers: the computational algorithms, instead of being sets of external instructions, are embedded in brain structure. Concerning the relationships between brain and mind a number of questions lie ahead. One of them is why and how only the human brain grasped the notion of God, probably only at the evolutionary stage attained by Homo sapiens.

  12. Non-Causal Computation

    Directory of Open Access Journals (Sweden)

    Ämin Baumeler

    2017-07-01

    Full Text Available Computation models such as circuits describe sequences of computation steps that are carried out one after the other. In other words, algorithm design is traditionally subject to the restriction imposed by a fixed causal order. We address a novel computing paradigm beyond quantum computing, replacing this assumption by mere logical consistency: We study non-causal circuits, where a fixed time structure within a gate is locally assumed whilst the global causal structure between the gates is dropped. We present examples of logically consistent non-causal circuits outperforming all causal ones; they imply that suppressing loops entirely is more restrictive than just avoiding the contradictions they can give rise to. That fact is already known for correlations as well as for communication, and we here extend it to computation.
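
    The consistency condition described above can be illustrated with a toy example (ours, not the paper's): two gates whose inputs depend on each other's outputs, so there is no fixed causal order, and the "circuit" is logically consistent for a given external input exactly when the gate equations have a unique solution.

    ```python
    # Hedged sketch of the logical-consistency idea behind non-causal circuits:
    # gates a and b each read the other's output; consistency for input (x, y)
    # means exactly one assignment of (a, b) satisfies both gate equations.
    # The particular gates below are illustrative only.
    from itertools import product

    def consistent_assignments(x, y):
        """Return all (a, b) with a = b AND x and b = (NOT a) OR y."""
        solutions = []
        for a, b in product([0, 1], repeat=2):
            if a == (b & x) and b == ((1 - a) | y):
                solutions.append((a, b))
        return solutions

    for x, y in product([0, 1], repeat=2):
        sols = consistent_assignments(x, y)
        status = "consistent" if len(sols) == 1 else "inconsistent/ambiguous"
        print(f"input x={x}, y={y}: {sols} -> {status}")
    ```

    For x=1, y=0 the loop has no solution at all, which is the kind of contradiction a logically consistent non-causal circuit must avoid; for the other inputs the assignment is unique.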

  13. CERN School of Computing

    CERN Multimedia

    2007-01-01

    The 2007 CERN School of Computing, organised by CERN in collaboration with the University of Split (FESB) will be held from 20 to 31 August 2007 in Dubrovnik, Croatia. It is aimed at postgraduate students and research workers with a few years' experience in scientific physics, computing or related fields. Special themes this year are: GRID Technologies: The Grid track delivers unique theoretical and hands-on education on some of the most advanced GRID topics; Software Technologies: The Software track addresses some of the most relevant modern techniques and tools for large scale distributed software development and handling as well as for computer security; Physics Computing: The Physics Computing track focuses on informatics topics specific to the HEP community. After setting-the-scene lectures, it addresses data acquisition and ROOT. Grants from the European Union Framework Programme 6 (FP6) are available to participants to cover part or all of the cost of the School. More information can be found at...

  14. Place-Specific Computing

    DEFF Research Database (Denmark)

    Messeter, Jörn

    2009-01-01

    An increased interest in the notion of place has evolved in interaction design based on the proliferation of wireless infrastructures, developments in digital media, and a ‘spatial turn’ in computing. In this article, place-specific computing is suggested as a genre of interaction design...... that addresses the shaping of interactions among people, place-specific resources and global socio-technical networks, mediated by digital technology, and influenced by the structuring conditions of place. The theoretical grounding for place-specific computing is located in the meeting between conceptions...... examples of place-specific computing are presented from a series of pilot studies, conducted in close collaboration with design students in Malmö, Berlin, Cape Town and Rome, that generated 36 design concepts in the genre. Reflecting on these examples, issues in the design of place-specific computing...

  15. Medical Computational Thinking

    DEFF Research Database (Denmark)

    Musaeus, Peter; Tatar, Deborah Gail; Rosen, Michael A.

    2017-01-01

    Computational thinking (CT) in medicine means deliberating when to pursue computer-mediated solutions to medical problems and evaluating when such solutions are worth pursuing in order to assist in medical decision making. Teaching computational thinking (CT) at medical school should be aligned...... with learning objectives, teaching and assessment methods, and overall pedagogical mission of the individual medical school in relation to society. Medical CT as part of the medical curriculum could help educate novices (medical students and physicians in training) in the analysis and design of complex...... healthcare organizations, which increasingly rely on computer technology. Such teaching should engage novices in information practices where they learn to perceive practices of computer technology as directly involved in the provision of patient care. However, medical CT as a teaching and research field...

  16. Computational Ocean Acoustics

    CERN Document Server

    Jensen, Finn B; Porter, Michael B; Schmidt, Henrik

    2011-01-01

    Since the mid-1970s, the computer has played an increasingly pivotal role in the field of ocean acoustics. Faster and less expensive than actual ocean experiments, and capable of accommodating the full complexity of the acoustic problem, numerical models are now standard research tools in ocean laboratories. The progress made in computational ocean acoustics over the last thirty years is summed up in this authoritative and innovatively illustrated new text. Written by some of the field's pioneers, all Fellows of the Acoustical Society of America, Computational Ocean Acoustics presents the latest numerical techniques for solving the wave equation in heterogeneous fluid–solid media. The authors discuss various computational schemes in detail, emphasizing the importance of theoretical foundations that lead directly to numerical implementations for real ocean environments. To further clarify the presentation, the fundamental propagation features of the techniques are illustrated in color. Computational Ocean A...
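
    As a minimal sketch of the kind of numerical modelling the book covers (this example is not taken from the book), the following explicit finite-difference scheme propagates a pressure pulse according to the 1-D acoustic wave equation with a uniform sound speed of 1500 m/s as a stand-in for sea water.

    ```python
    # Minimal leapfrog finite-difference scheme for u_tt = c^2 u_xx with fixed ends.
    import math

    C = 1500.0            # sound speed (m/s)
    LENGTH = 1000.0       # domain length (m)
    NX = 201              # grid points
    DX = LENGTH / (NX - 1)
    DT = 0.9 * DX / C     # time step chosen to satisfy the CFL stability condition
    STEPS = 400

    r2 = (C * DT / DX) ** 2

    # Initial condition: a Gaussian pressure pulse in the middle of the domain.
    u_prev = [math.exp(-((i * DX - LENGTH / 2) / 50.0) ** 2) for i in range(NX)]
    u_curr = list(u_prev)  # start at rest (zero initial velocity)

    for _ in range(STEPS):
        u_next = [0.0] * NX
        for i in range(1, NX - 1):
            u_next[i] = (2 * u_curr[i] - u_prev[i]
                         + r2 * (u_curr[i + 1] - 2 * u_curr[i] + u_curr[i - 1]))
        u_prev, u_curr = u_curr, u_next

    print("max |u| after propagation:", max(abs(v) for v in u_curr))
    ```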

  17. Indirection and computer security.

    Energy Technology Data Exchange (ETDEWEB)

    Berg, Michael J.

    2011-09-01

    The discipline of computer science is built on indirection. David Wheeler famously said, 'All problems in computer science can be solved by another layer of indirection. But that usually will create another problem'. We propose that every computer security vulnerability is yet another problem created by the indirections in system designs and that focusing on the indirections involved is a better way to design, evaluate, and compare security solutions. We are not proposing that indirection be avoided when solving problems, but that understanding the relationships between indirections and vulnerabilities is key to securing computer systems. Using this perspective, we analyze common vulnerabilities that plague our computer systems, consider the effectiveness of currently available security solutions, and propose several new security solutions.
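
    A concrete (and deliberately tiny) illustration of the theme, ours rather than the authors': a dispatch table is a layer of indirection, and whoever can write to it controls where the call goes, so the indirection itself becomes something that must be secured.

    ```python
    # Hedged illustration: indirection through a lookup table as an attack surface.

    def withdraw(amount):
        return f"withdrew {amount}"

    def audit_log(amount):
        return f"logged {amount}"

    # The indirection: callers look operations up by name instead of calling directly.
    dispatch = {"withdraw": withdraw, "audit": audit_log}

    def handle(op_name, amount):
        return dispatch[op_name](amount)

    print(handle("withdraw", 100))   # normal use of the indirection

    # If untrusted code can mutate the table, the same call site now does something else.
    dispatch["withdraw"] = lambda amount: "funds redirected!"
    print(handle("withdraw", 100))   # the vulnerability lives in the indirection layer
    ```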

  18. Parallelism in matrix computations

    CERN Document Server

    Gallopoulos, Efstratios; Sameh, Ahmed H

    2016-01-01

    This book is primarily intended as a research monograph that could also be used in graduate courses for the design of parallel algorithms in matrix computations. It assumes general but not extensive knowledge of numerical linear algebra, parallel architectures, and parallel programming paradigms. The book consists of four parts: (I) Basics; (II) Dense and Special Matrix Computations; (III) Sparse Matrix Computations; and (IV) Matrix functions and characteristics. Part I deals with parallel programming paradigms and fundamental kernels, including reordering schemes for sparse matrices. Part II is devoted to dense matrix computations such as parallel algorithms for solving linear systems, linear least squares, the symmetric algebraic eigenvalue problem, and the singular-value decomposition. It also deals with the development of parallel algorithms for special linear systems such as banded, Vandermonde, Toeplitz, and block Toeplitz systems. Part III addresses sparse matrix computations: (a) the development of pa...
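
    To make the flavour of such parallel kernels concrete, here is an illustrative sketch (not an algorithm from the book): a dense matrix-vector product split into row blocks and evaluated by a pool of worker processes.

    ```python
    # Illustrative row-block parallel matrix-vector product.
    from concurrent.futures import ProcessPoolExecutor
    import random

    def block_matvec(block_and_vector):
        """Multiply one horizontal block of the matrix by the shared vector."""
        block, vector = block_and_vector
        return [sum(a * x for a, x in zip(row, vector)) for row in block]

    def parallel_matvec(matrix, vector, workers=4):
        n = len(matrix)
        size = (n + workers - 1) // workers
        blocks = [(matrix[i:i + size], vector) for i in range(0, n, size)]
        result = []
        with ProcessPoolExecutor(max_workers=workers) as pool:
            for partial in pool.map(block_matvec, blocks):
                result.extend(partial)
        return result

    if __name__ == "__main__":
        n = 400
        A = [[random.random() for _ in range(n)] for _ in range(n)]
        x = [random.random() for _ in range(n)]
        y = parallel_matvec(A, x)
        print("first entries of A @ x:", [round(v, 3) for v in y[:3]])
    ```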

  19. Fostering Computational Thinking

    CERN Document Server

    Caballero, Marcos D; Schatz, Michael F

    2011-01-01

    Students taking introductory physics are rarely exposed to computational modeling. In a one-semester large lecture introductory calculus-based mechanics course at Georgia Tech, students learned to solve physics problems using the VPython programming environment. During the term 1357 students in this course solved a suite of fourteen computational modeling homework questions delivered using an online commercial course management system. Their proficiency with computational modeling was evaluated in a proctored environment using a novel central force problem. The majority of students (60.4%) successfully completed the evaluation. Analysis of erroneous student-submitted programs indicated that a small set of student errors explained why most programs failed. We discuss the design and implementation of the computational modeling homework and evaluation, the results from the evaluation and the implications for instruction in computational modeling in introductory STEM courses.
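
    The course used VPython; as a minimal stand-in (plain Python, with assumed parameter values), the sketch below integrates motion under an attractive central force with the Euler-Cromer method, the style of problem used in the proctored evaluation described above.

    ```python
    # Euler-Cromer integration of motion under a central force a = -GM r_hat / r^2.
    import math

    G_M = 1.0          # strength of the central force (arbitrary units, assumed)
    DT = 0.001
    STEPS = 20000

    # Initial position and velocity (assumed values giving a bound orbit).
    x, y = 1.0, 0.0
    vx, vy = 0.0, 0.8

    for _ in range(STEPS):
        r = math.hypot(x, y)
        ax, ay = -G_M * x / r**3, -G_M * y / r**3
        vx += ax * DT                               # update velocity first...
        vy += ay * DT
        x += vx * DT                                # ...then position (Euler-Cromer)
        y += vy * DT

    print(f"position after {STEPS} steps: ({x:.3f}, {y:.3f}), r = {math.hypot(x, y):.3f}")
    ```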

  20. Ubiquitous Computing Systems

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind; Friday, Adrian

    2009-01-01

    First introduced two decades ago, the term ubiquitous computing is now part of the common vernacular. Ubicomp, as it is commonly called, has grown not just quickly but broadly so as to encompass a wealth of concepts and technology that serves any number of purposes across all of human endeavor......, an original ubicomp pioneer, Ubiquitous Computing Fundamentals brings together eleven ubiquitous computing trailblazers who each report on his or her area of expertise. Starting with a historical introduction, the book moves on to summarize a number of self-contained topics. Taking a decidedly human...... perspective, the book includes discussion on how to observe people in their natural environments and evaluate the critical points where ubiquitous computing technologies can improve their lives. Among a range of topics this book examines: How to build an infrastructure that supports ubiquitous computing...

  1. Unconditionally verifiable blind computation

    CERN Document Server

    Fitzsimons, Joseph F

    2012-01-01

    Blind Quantum Computing (BQC) allows a client to have a server carry out a quantum computation for them such that the client's input, output and computation remain private. Recently the authors together with Broadbent proposed a universal unconditionally secure BQC scheme where the client only needs to be able to prepare single qubits in separable states randomly chosen from a finite set and send them to the server, who has the balance of the required quantum computational resources. A desirable property for any BQC protocol is verification, whereby the client can verify with high probability whether the server has followed the instructions of the protocol, or if there has been some deviation resulting in a corrupted output state. A verifiable BQC protocol can be viewed as an interactive proof system leading to consequences for complexity theory. In this paper we extend the BQC protocol presented in [Broadbent, Fitzsimons and Kashefi, FOCS 2009 p517] with new functionality allowing blind computational basis m...

  2. Blind Quantum Computation

    CERN Document Server

    Arrighi, P; Arrighi, Pablo; Salvail, Louis

    2003-01-01

    We investigate the possibility of having someone carry out the work of executing a function for you, but without letting him learn anything about your input. Say Alice wants Bob to compute some well-known function f upon her input x, but wants to prevent Bob from learning anything about x. The situation arises for instance if client Alice has limited computational resources in comparison with mistrusted server Bob, or if x is an inherently mobile piece of data. Could there be a protocol whereby Bob is forced to compute f(x) "blindly", i.e. without observing x? We provide such a blind computation protocol for the class of functions which admit an efficient procedure to generate random input-output pairs, e.g. factorization. The setting is quantum, the security is unconditional, the eavesdropper is as malicious as can be. Keywords: Secure Circuit Evaluation, Secure Two-party Computation, Information Hiding, Information gain vs disturbance.

  3. Place-Specific Computing

    DEFF Research Database (Denmark)

    Messeter, Jörn; Johansson, Michael

    An increased interest in the notion of place has evolved in interaction design. Proliferation of wireless infrastructure, developments in digital media, and a ‘spatial turn’ in computing provide the base for place-specific computing as a suggested new genre of interaction design. In the REcult...... project place-specific computing is explored through design oriented research. This article reports six pilot studies where design students have designed concepts for place-specific computing in Berlin (Germany), Cape Town (South Africa), Rome (Italy) and Malmö (Sweden). Background and arguments...... for place-specific computing as a genre of interaction design are described. A total number of 36 design concepts designed for 16 designated zones in the four cities are presented. An analysis of the design concepts is presented indicating potentials, possibilities and problems as directions for future...

  4. Computational invariant theory

    CERN Document Server

    Derksen, Harm

    2015-01-01

    This book is about the computational aspects of invariant theory. Of central interest is the question how the invariant ring of a given group action can be calculated. Algorithms for this purpose form the main pillars around which the book is built. There are two introductory chapters, one on Gröbner basis methods and one on the basic concepts of invariant theory, which prepare the ground for the algorithms. Then algorithms for computing invariants of finite and reductive groups are discussed. Particular emphasis lies on interrelations between structural properties of invariant rings and computational methods. Finally, the book contains a chapter on applications of invariant theory, covering fields as disparate as graph theory, coding theory, dynamical systems, and computer vision. The book is intended for postgraduate students as well as researchers in geometry, computer algebra, and, of course, invariant theory. The text is enriched with numerous explicit examples which illustrate the theory and should be ...
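
    A small illustrative sketch (not one of the book's algorithms verbatim): for a finite group in characteristic zero, averaging over the group action (the Reynolds operator) turns any polynomial into an invariant. Here the group is Z/2 acting on (x, y) by sign flip, and the averaged low-degree monomials x^2, x*y, y^2 generate the invariant ring.

    ```python
    # Reynolds operator R(f) = (1/|G|) * sum over g of g.f, for a tiny finite group.
    from sympy import symbols, Rational, expand

    x, y = symbols("x y")

    # Group elements given as substitutions (the identity and the sign flip).
    group = [
        {x: x, y: y},
        {x: -x, y: -y},
    ]

    def reynolds(poly):
        """Average poly over the group action."""
        return expand(Rational(1, len(group)) *
                      sum(poly.subs(g, simultaneous=True) for g in group))

    for m in [x, y, x**2, x*y, y**2]:
        print(f"R({m}) = {reynolds(m)}")
    # Degree-1 monomials average to 0; the surviving images x**2, x*y, y**2 are invariants.
    ```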

  5. Distributed computer control systems

    Energy Technology Data Exchange (ETDEWEB)

    Suski, G.J.

    1986-01-01

    This book focuses on recent advances in the theory, applications and techniques for distributed computer control systems. Contents (partial): Real-time distributed computer control in a flexible manufacturing system. Semantics and implementation problems of channels in a DCCS specification. Broadcast protocols in distributed computer control systems. Design considerations of distributed control architecture for a thermal power plant. The conic toolset for building distributed systems. Network management issues in distributed control systems. Interprocessor communication system architecture in a distributed control system environment. Uni-level homogenous distributed computer control system and optimal system design. A-nets for DCCS design. A methodology for the specification and design of fault tolerant real time systems. An integrated computer control system - architecture design, engineering methodology and practical experience.

  6. The Computer Festival

    Institute of Scientific and Technical Information of China (English)

    1998-01-01

    The Beijing neighborhood of Zhongguancun is considered China’s Silicon Valley. After ten years of rapid development, it has carved out a dominant position for itself with respect to computer markets, technology and labor force. Even on an average day, the famous "Computer Street" attracts a large number of visitors and consumers, but at a recent computer fair, the crowds were even larger. The purpose of the festival was to encourage computer use in homes and offices, to further promote the development of high-tech production and to keep pushing the modernization of information in China. The once-a-year computer festival will probably become a new custom in Chinese people’s lives.

  7. Philosophy of Computer Science

    Directory of Open Access Journals (Sweden)

    Aatami Järvinen

    2014-06-01

    Full Text Available The diversity and interdisciplinarity of computer science, and the multiplicity of its uses in other sciences, make it difficult to define the discipline and to prescribe how it should be practised; they also cause friction between computer scientists from different branches. Because of how they are structured, computer science programs are criticized for not offering adequate methodological training or a deep understanding of different research traditions. To work toward a solution, some institutions have decided to include in their curricula courses that enable students to gain awareness of epistemological and methodological issues in computer science, as well as to give meaning to the practice of computer scientists. In this article the needs and objectives of courses on the philosophy of computer science are analyzed, and their structure and management are explained.

  8. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    Directory of Open Access Journals (Sweden)

    Karlheinz Schwarz

    2013-09-01

    Full Text Available Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems in fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized below. In each section a further focusing will be provided by occasionally organizing special issues on topics of high interest, collecting papers on fundamental work in the field. More applied papers should be submitted to their corresponding specialist journals. To help us achieve our goal with this journal, we have an excellent editorial board to advise us on the exciting current and future trends in computation from methodology to application. We very much look forward to hearing all about the research going on across the world. [...]

  9. Computers and neurosurgery.

    Science.gov (United States)

    Shaikhouni, Ammar; Elder, J Bradley

    2012-11-01

    At the turn of the twentieth century, the only computational device used in neurosurgical procedures was the brain of the surgeon. Today, most neurosurgical procedures rely at least in part on the use of a computer to help perform surgeries accurately and safely. The techniques that revolutionized neurosurgery were mostly developed after the 1950s. Just before that era, the transistor was invented in the late 1940s, and the integrated circuit was invented in the late 1950s. During this time, the first automated, programmable computational machines were introduced. The rapid progress in the field of neurosurgery not only occurred hand in hand with the development of modern computers, but one also can state that modern neurosurgery would not exist without computers. The focus of this article is the impact modern computers have had on the practice of neurosurgery. Neuroimaging, neuronavigation, and neuromodulation are examples of tools in the armamentarium of the modern neurosurgeon that owe each step in their evolution to progress made in computer technology. Advances in computer technology central to innovations in these fields are highlighted, with particular attention to neuroimaging. Developments over the last 10 years in areas of sensors and robotics that promise to transform the practice of neurosurgery further are discussed. Potential impacts of advances in computers related to neurosurgery in developing countries and underserved regions are also discussed. As this article illustrates, the computer, with its underlying and related technologies, is central to advances in neurosurgery over the last half century. Copyright © 2012 Elsevier Inc. All rights reserved.

  10. Computing Algebraic Immunity by Reconfigurable Computer

    Science.gov (United States)

    2012-09-01

    the linear system, then the amount of computation required is O((n choose d)^ω), where ω is the well-known “exponent of Gaussian reduction” (ω = 3 for Gauss)... f(x1, x2, x3) = x1x2 ⊕ x1x3 ⊕ x2x3. The top half of Table 2 shows the minterm canonical form of f̄. Here, the first (leftmost) column represents all...
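
    The paper works with a reconfigurable-hardware, linear-algebra formulation; the brute-force sketch below only illustrates the definition on the three-variable function quoted in the excerpt: the algebraic immunity of f is the least degree d such that f or f ⊕ 1 has a nonzero annihilator g of degree at most d (i.e. f(x)·g(x) = 0 for all x).

    ```python
    # Naive algebraic-immunity computation for f(x1,x2,x3) = x1x2 + x1x3 + x2x3 over GF(2).
    from itertools import combinations, product

    N = 3

    def f(x):
        x1, x2, x3 = x
        return (x1 & x2) ^ (x1 & x3) ^ (x2 & x3)

    def monomials_up_to(d):
        """All variable subsets of size <= d; each subset is one monomial."""
        return [s for k in range(d + 1) for s in combinations(range(N), k)]

    def eval_anf(coeffs, monos, x):
        """Evaluate a Boolean function given by its ANF coefficients."""
        value = 0
        for c, mono in zip(coeffs, monos):
            if c and all(x[i] for i in mono):
                value ^= 1
        return value

    def has_nonzero_annihilator(func, d):
        monos = monomials_up_to(d)
        for coeffs in product([0, 1], repeat=len(monos)):
            if any(coeffs) and all(func(x) * eval_anf(coeffs, monos, x) == 0
                                   for x in product([0, 1], repeat=N)):
                return True
        return False

    for d in range(N + 1):
        if has_nonzero_annihilator(f, d) or has_nonzero_annihilator(lambda x: 1 ^ f(x), d):
            print("algebraic immunity of f:", d)
            break
    ```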

  11. Computer surety: computer system inspection guidance. [Contains glossary

    Energy Technology Data Exchange (ETDEWEB)

    1981-07-01

    This document discusses computer surety in NRC-licensed nuclear facilities from the perspective of physical protection inspectors. It gives background information and a glossary of computer terms, along with threats and computer vulnerabilities, methods used to harden computer elements, and computer audit controls.

  12. Neural computation and the computational theory of cognition.

    Science.gov (United States)

    Piccinini, Gualtiero; Bahar, Sonya

    2013-04-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism-neural processes are computations in the generic sense. After that, we reject on empirical grounds the common assimilation of neural computation to either analog or digital computation, concluding that neural computation is sui generis. Analog computation requires continuous signals; digital computation requires strings of digits. But current neuroscientific evidence indicates that typical neural signals, such as spike trains, are graded like continuous signals but are constituted by discrete functional elements (spikes); thus, typical neural signals are neither continuous signals nor strings of digits. It follows that neural computation is sui generis. Finally, we highlight three important consequences of a proper understanding of neural computation for the theory of cognition. First, understanding neural computation requires a specially designed mathematical theory (or theories) rather than the mathematical theories of analog or digital computation. Second, several popular views about neural computation turn out to be incorrect. Third, computational theories of cognition that rely on non-neural notions of computation ought to be replaced or reinterpreted in terms of neural computation.

  13. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  14. Computation and Asymptotics

    CERN Document Server

    Ramnath, Rudrapatna V

    2012-01-01

    This book addresses the task of computation from the standpoint of asymptotic analysis and multiple scales that may be inherent in the system dynamics being studied. This is in contrast to the usual methods of numerical analysis and computation. The technical literature is replete with numerical methods such as Runge-Kutta approach and its variations, finite element methods, and so on. However, not much attention has been given to asymptotic methods for computation, although such approaches have been widely applied with great success in the analysis of dynamic systems. The presence of differen
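
    A hedged sketch of the book's theme, using a standard textbook example rather than one of the author's: for the weakly damped oscillator x'' + eps·x' + x = 0, the leading-order two-timescale approximation x(t) ≈ exp(-eps·t/2)·cos(t) can be checked against direct numerical integration and agrees to within the expected O(eps) error.

    ```python
    # Asymptotic (multiple-scales) approximation vs. numerical integration.
    import math

    EPS = 0.1
    DT = 0.001
    T_END = 30.0

    # Numerical reference via the semi-implicit (Euler-Cromer) scheme.
    x, v = 1.0, 0.0
    t = 0.0
    while t < T_END:
        a = -EPS * v - x
        v += a * DT
        x += v * DT
        t += DT

    asymptotic = math.exp(-EPS * T_END / 2) * math.cos(T_END)
    print(f"numerical  x({T_END}) = {x: .5f}")
    print(f"asymptotic x({T_END}) = {asymptotic: .5f}")
    ```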

  15. Computer science handbook

    CERN Document Server

    Tucker, Allen B

    2004-01-01

    Due to the great response to the famous Computer Science Handbook edited by Allen B. Tucker, … in 2004 Chapman & Hall/CRC published a second edition of this comprehensive reference book. Within more than 70 chapters, every one new or significantly revised, one can find any kind of information and references about computer science one can imagine. … All in all, there is absolutely nothing about computer science that cannot be found in the encyclopedia with its 110 survey articles …-Christoph Meinel, Zentralblatt MATH

  16. Convergence: Computing and communications

    Energy Technology Data Exchange (ETDEWEB)

    Catlett, C. [National Center for Supercomputing Applications, Champaign, IL (United States)

    1996-12-31

    This paper highlights the operations of the National Center for Supercomputing Applications (NCSA). NCSA is developing and implementing a national strategy to create, use, and transfer advanced computing and communication tools and information technologies for science, engineering, education, and business. The primary focus of the presentation is historical and expected growth in the computing capacity, personal computer performance, and Internet and WorldWide Web sites. Data are presented to show changes over the past 10 to 20 years in these areas. 5 figs., 4 tabs.

  17. Single neuron computation

    CERN Document Server

    McKenna, Thomas M; Zornetzer, Steven F

    1992-01-01

    This book contains twenty-two original contributions that provide a comprehensive overview of computational approaches to understanding a single neuron structure. The focus on cellular-level processes is twofold. From a computational neuroscience perspective, a thorough understanding of the information processing performed by single neurons leads to an understanding of circuit- and systems-level activity. From the standpoint of artificial neural networks (ANNs), a single real neuron is as complex an operational unit as an entire ANN, and formalizing the complex computations performed by real n
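
    As a minimal sketch (our example, not a model from the book), about the simplest single-neuron computation one can simulate is a leaky integrate-and-fire neuron driven by a constant input current; all parameter values below are illustrative.

    ```python
    # Leaky integrate-and-fire neuron with constant input current (illustrative values).
    TAU_M = 20.0       # membrane time constant (ms)
    V_REST = -70.0     # resting potential (mV)
    V_THRESH = -54.0   # spike threshold (mV)
    V_RESET = -70.0    # reset potential after a spike (mV)
    R_M = 10.0         # membrane resistance (MOhm)
    I_EXT = 1.8        # injected current (nA)
    DT = 0.1           # time step (ms)

    v = V_REST
    spike_times = []
    t = 0.0
    while t < 200.0:
        dv = (-(v - V_REST) + R_M * I_EXT) / TAU_M
        v += dv * DT
        if v >= V_THRESH:
            spike_times.append(round(t, 1))
            v = V_RESET
        t += DT

    print(f"{len(spike_times)} spikes in 200 ms, first few at t = {spike_times[:5]} ms")
    ```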

  18. Computational Abstraction Steps

    DEFF Research Database (Denmark)

    Thomsen, Lone Leth; Thomsen, Bent; Nørmark, Kurt

    2010-01-01

    In this paper we discuss computational abstraction steps as a way to create class abstractions from concrete objects, and from examples. Computational abstraction steps are regarded as symmetric counterparts to computational concretisation steps, which are well-known in terms of function calls... and class instantiations. Our teaching experience shows that many novice programmers find it difficult to write programs with abstractions that materialise to concrete objects later in the development process. The contribution of this paper is the idea of initiating a programming process by creating...
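
    To make the idea tangible, here is a small illustrative sketch of our own (not taken from the paper): start from a concrete object, take an abstraction step to a class, and then take the symmetric concretisation step by re-creating the original object as an instance.

    ```python
    # Abstraction step: from a concrete object to a class, and back again.
    import random

    # Step 1: a concrete object, written directly as data.
    concrete_die = {"sides": 6, "current_face": 3}

    # Step 2: the abstraction step -- capture the structure and behaviour as a class.
    class Die:
        def __init__(self, sides, current_face=1):
            self.sides = sides
            self.current_face = current_face

        def roll(self):
            self.current_face = random.randint(1, self.sides)
            return self.current_face

    # Step 3: the concretisation step -- the original object becomes an instance.
    die = Die(**concrete_die)
    print(die.current_face, die.roll())
    ```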

  19. Attacks on computer systems

    Directory of Open Access Journals (Sweden)

    Dejan V. Vuletić

    2012-01-01

    Full Text Available Computer systems are a critical component of the human society in the 21st century. Economic sector, defense, security, energy, telecommunications, industrial production, finance and other vital infrastructure depend on computer systems that operate at local, national or global scales. A particular problem is that, due to the rapid development of ICT and the unstoppable growth of its application in all spheres of the human society, their vulnerability and exposure to very serious potential dangers increase. This paper analyzes some typical attacks on computer systems.

  20. Practical scientific computing

    CERN Document Server

    Muhammad, A

    2011-01-01

    Scientific computing is about developing mathematical models, numerical methods and computer implementations to study and solve real problems in science, engineering, business and even social sciences. Mathematical modelling requires deep understanding of classical numerical methods. This essential guide provides the reader with sufficient foundations in these areas to venture into more advanced texts. The first section of the book presents numEclipse, an open source tool for numerical computing based on the notion of MATLAB®. numEclipse is implemented as a plug-in for Eclipse, a leading integ