WorldWideScience

Sample records for previous computational results

  1. Automated, computer interpreted radioimmunoassay results

    International Nuclear Information System (INIS)

    Hill, J.C.; Nagle, C.E.; Dworkin, H.J.; Fink-Bennett, D.; Freitas, J.E.; Wetzel, R.; Sawyer, N.; Ferry, D.; Hershberger, D.

    1984-01-01

90,000 radioimmunoassay results have been interpreted and transcribed automatically using software developed for a Hewlett-Packard Model 1000 minicomputer system with conventional dot-matrix printers. The computer program correlates the results of a combination of assays, interprets them, and prints a report ready for physician review and signature within minutes of completion of the assay. The authors designed and wrote a computer program to query their patient database for radioassay laboratory results and to produce a computer-generated interpretation of these results, using an algorithm that produces normal and abnormal interpretives. Their laboratory assays 50,000 patient samples each year using 28 different radioassays, of which 85% have been interpreted using the computer program. Allowances are made for drug and patient history, and individualized reports are generated with regard to the patient's age and sex. Finalized reports remain subject to change by the nuclear physician at the time of final review. Automated, computerized interpretation has realized cost savings through reduced personnel and personnel time, and has provided uniformity of interpretation among the five physicians. Prior to computerization of interpretations, all radioassay results had to be dictated and reviewed for signing by one of the resident or staff physicians, and turnaround times for reports were generally two to three days, whereas the computerized interpretation system allows reports generally to be issued the day assays are completed
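The kind of rule-based interpretive generation described above can be sketched in a few lines. This is a hypothetical illustration only: the assay name, reference limits, and report wording below are invented, not taken from the 1984 system.

```python
# Hypothetical sketch of a rule-based radioassay interpretive generator.
# Assay names, reference ranges, and wording are invented for illustration.

def interpret_assay(name, value, ranges, age, sex):
    """Return one interpretive line for a radioassay result.

    `ranges` maps (assay name, sex) -> (low, high) reference limits; age and
    sex are carried into the report so the physician can weigh them at review.
    """
    low, high = ranges[(name, sex)]
    if value < low:
        status = "below the reference range"
    elif value > high:
        status = "above the reference range"
    else:
        status = "within the reference range"
    return f"{name}: {value:.1f} ({status} for a {age}-year-old, sex {sex})"

# Example with made-up thyroxine (T4) limits:
ranges = {("T4", "F"): (4.5, 12.0), ("T4", "M"): (4.5, 12.0)}
print(interpret_assay("T4", 14.2, ranges, 62, "F"))
```

As in the system described, such output would still be reviewed and signed by a physician before release.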

  2. Emphysema and bronchiectasis in COPD patients with previous pulmonary tuberculosis: computed tomography features and clinical implications

    Directory of Open Access Journals (Sweden)

    Jin J

    2018-01-01

Full Text Available Jianmin Jin,1 Shuling Li,2 Wenling Yu,2 Xiaofang Liu,1 Yongchang Sun1,3 1Department of Respiratory and Critical Care Medicine, Beijing Tongren Hospital, Capital Medical University, Beijing, 2Department of Radiology, Beijing Tongren Hospital, Capital Medical University, Beijing, 3Department of Respiratory and Critical Care Medicine, Peking University Third Hospital, Beijing, China. Background: Pulmonary tuberculosis (PTB) is a risk factor for COPD, but the clinical characteristics and the chest imaging features (emphysema and bronchiectasis) of COPD with previous PTB have not been studied well. Methods: The presence, distribution, and severity of emphysema and bronchiectasis in COPD patients with and without previous PTB were evaluated by high-resolution computed tomography (HRCT) and compared. Demographic data, respiratory symptoms, lung function, and sputum culture of Pseudomonas aeruginosa were also compared between patients with and without previous PTB. Results: A total of 231 COPD patients (82.2% ex- or current smokers, 67.5% male) were consecutively enrolled. Patients with previous PTB (45.0%) had more severe (p=0.045) and longer history (p=0.008) of dyspnea, more exacerbations in the previous year (p=0.011), and more positive cultures of P. aeruginosa (p=0.001), compared with those without PTB. Patients with previous PTB showed a higher prevalence of bronchiectasis (p<0.001), which was more significant in lungs with tuberculosis (TB) lesions, and a higher percentage of more severe bronchiectasis (Bhalla score ≥2, p=0.031), compared with those without previous PTB. The overall prevalence of emphysema was not different between patients with and without previous PTB, but in those with previous PTB, a higher number of subjects with middle (p=0.001) and lower (p=0.019) lobe emphysema, higher severity score (p=0.028), higher prevalence of panlobular emphysema (p=0.013), and more extensive centrilobular emphysema (p=0.039) were observed. Notably, in patients with

  3. Low-dose computed tomography image restoration using previous normal-dose scan

    International Nuclear Information System (INIS)

    Ma, Jianhua; Huang, Jing; Feng, Qianjin; Zhang, Hua; Lu, Hongbing; Liang, Zhengrong; Chen, Wufan

    2011-01-01

Purpose: In current computed tomography (CT) examinations, the associated x-ray radiation dose is of significant concern to patients and operators. A simple and cost-effective means to perform the examinations is to lower the milliampere-seconds (mAs) or kVp parameter (delivering less x-ray energy to the body) as low as reasonably achievable in data acquisition. However, lowering the mAs parameter unavoidably increases data noise, and that noise propagates into the CT image if no adequate noise control is applied during image reconstruction. Since a previously scanned normal-dose diagnostic CT image may be available in some clinical applications, such as CT perfusion imaging and CT angiography (CTA), this paper presents an innovative way to utilize the normal-dose scan as a priori information to induce signal restoration of the current low-dose CT image series. Methods: Unlike conventional local operations on neighboring image voxels, the nonlocal means (NLM) algorithm utilizes the redundancy of information across the whole image. This paper adapts the NLM to utilize the redundancy of information in the previous normal-dose scan and further explores ways to optimize the nonlocal weights for low-dose image restoration in the NLM framework. The resulting algorithm is called the previous normal-dose scan induced nonlocal means (ndiNLM). Because of the optimized nature of the nonlocal weight calculation, the ndiNLM algorithm does not depend heavily on image registration between the current low-dose and the previous normal-dose CT scans. Furthermore, the smoothing parameter involved in the ndiNLM algorithm can be adaptively estimated based on the image noise relationship between the current low-dose and the previous normal-dose scanning protocols. Results: Qualitative and quantitative evaluations were carried out on a physical phantom as well as clinical abdominal and brain perfusion CT scans in terms of accuracy and resolution properties. The gain by the use
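The core idea behind prior-scan-induced nonlocal means can be sketched as follows: each low-dose pixel is replaced by a weighted average of pixels from the previous normal-dose scan, with weights given by patch similarity. This is a toy, brute-force illustration; the patch and search sizes, the smoothing parameter `h`, and the lack of any weight optimization or adaptive parameter estimation are simplifications of the published ndiNLM method.

```python
import numpy as np

def ndi_nlm(low, prior, patch=3, search=2, h=0.5):
    """Toy previous-normal-dose-scan-induced NLM restoration.

    For each pixel of the low-dose image `low`, average pixels of the
    normal-dose `prior` whose surrounding patches resemble the patch
    around the current pixel; weights decay with patch distance.
    """
    pad = patch // 2
    lowp = np.pad(low, pad + search, mode="reflect")
    prip = np.pad(prior, pad + search, mode="reflect")
    out = np.zeros_like(low, dtype=float)
    H, W = low.shape
    for i in range(H):
        for j in range(W):
            ci, cj = i + pad + search, j + pad + search
            ref = lowp[ci - pad:ci + pad + 1, cj - pad:cj + pad + 1]
            wsum = vsum = 0.0
            for di in range(-search, search + 1):
                for dj in range(-search, search + 1):
                    ni, nj = ci + di, cj + dj
                    cand = prip[ni - pad:ni + pad + 1, nj - pad:nj + pad + 1]
                    # Gaussian-weighted patch similarity
                    w = np.exp(-np.mean((ref - cand) ** 2) / h**2)
                    wsum += w
                    vsum += w * prip[ni, nj]
            out[i, j] = vsum / wsum
    return out
```

Note that the averaged intensities come from the prior scan, which is what lets a clean previous acquisition restore a noisy low-dose series.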

  4. The Importance of Business Model Factors for Cloud Computing Adoption: Role of Previous Experiences

    Directory of Open Access Journals (Sweden)

    Bogataj Habjan Kristina

    2017-08-01

Full Text Available Background and Purpose: Bringing several opportunities for more effective and efficient IT governance and service exploitation, cloud computing is expected to impact the European and global economies significantly. Market data show that, despite many advantages and promised benefits, the adoption of cloud computing is not as fast and widespread as foreseen. This situation shows the need for further exploration of the potential of cloud computing and its implementation on the market. The purpose of this research was to identify the individual business model factors with the highest impact on cloud computing adoption. In addition, the aim was to identify differences in opinion regarding the importance of business model factors for cloud computing adoption according to companies’ previous experiences with cloud computing services.

  5. Value of computed tomography pelvimetry in patients with a previous cesarean section

    International Nuclear Information System (INIS)

    Yamani, Tarik Y.; Rouzi, Abdulrahim A.

    1998-01-01

A case-control study was conducted at the Department of Obstetrics and Gynaecology, King Abdulaziz University Hospital, Jeddah, Saudi Arabia, to determine the value of computed tomography (CT) pelvimetry in patients with a previous cesarean section. Between January 1993 and December 1995, 219 pregnant women with one previous cesarean were studied: 100 had antenatal CT pelvimetry for assessment of the pelvis, and 119 did not have CT pelvimetry and served as controls. Fifty-one women (51%) in the CT pelvimetry group were delivered by cesarean section: 23 (23%) underwent elective cesarean section for contracted pelvis based upon the findings of CT pelvimetry, and 28 (28%) underwent emergency cesarean section after a trial of labor. In the group without CT pelvimetry, 26 women (21.8%) underwent emergency cesarean section. This was a statistically significant difference (P=0.02). There were no statistically significant differences in birthweight or Apgar scores between the groups, and there was no perinatal or maternal mortality in this study. Computed tomography pelvimetry increased the rate of cesarean delivery without any benefit in immediate delivery outcomes; the practice of documenting the adequacy of the pelvis by CT pelvimetry before vaginal birth after cesarean should therefore be abandoned. (author)

  6. Impossibility results for distributed computing

    CERN Document Server

    Attiya, Hagit

    2014-01-01

To understand the power of distributed systems, it is necessary to understand their inherent limitations: what problems cannot be solved in particular systems, or without sufficient resources (such as time or space). This book presents key techniques for proving such impossibility results and applies them to a variety of problems in a variety of system models. Insights gained from these results are highlighted, aspects of a problem that make it difficult are isolated, and features of an architecture that make it inadequate for solving certain problems efficiently are identified.

  7. Determination of the Boltzmann constant with cylindrical acoustic gas thermometry: new and previous results combined

    Science.gov (United States)

    Feng, X. J.; Zhang, J. T.; Lin, H.; Gillis, K. A.; Mehl, J. B.; Moldover, M. R.; Zhang, K.; Duan, Y. N.

    2017-10-01

We report a new determination of the Boltzmann constant k_B using a cylindrical acoustic gas thermometer. We determined the length of the copper cavity from measurements of its microwave resonance frequencies. This contrasts with our previous work (Zhang et al 2011 Int. J. Thermophys. 32 1297; Lin et al 2013 Metrologia 50 417; Feng et al 2015 Metrologia 52 S343), which determined the length of a different cavity using two-color optical interferometry. In this new study, the half-widths of the acoustic resonances are closer to their theoretical values than in our previous work. Despite significant changes in resonator design and in the way the cylinder length is determined, the value of k_B is substantially unchanged. We combined this result with our four previous results to calculate a global weighted mean of our k_B determinations. The calculation follows CODATA's method (Mohr and Taylor 2000 Rev. Mod. Phys. 72 351) for obtaining a weighted mean value of k_B that accounts for the correlations among the measured quantities in this work and in our four previous determinations of k_B. The weighted mean k̂_B is 1.380 6484(28) × 10⁻²³ J K⁻¹ with a relative standard uncertainty of 2.0 × 10⁻⁶. The corresponding value of the universal gas constant is 8.314 459(17) J K⁻¹ mol⁻¹ with a relative standard uncertainty of 2.0 × 10⁻⁶.
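A weighted mean that accounts for correlations among measurements, as in the CODATA scheme cited above, is a generalized least-squares fit of a single value to the data. A minimal sketch follows; the numbers in the example are invented for illustration and are not the paper's k_B data.

```python
import numpy as np

def correlated_weighted_mean(x, cov):
    """Weighted mean of correlated measurements (generalized least squares).

    Minimizes (x - m*1)^T C^{-1} (x - m*1) over m, giving
    m = (1^T C^{-1} x) / (1^T C^{-1} 1) and u(m)^2 = 1 / (1^T C^{-1} 1).
    """
    x = np.asarray(x, float)
    Cinv = np.linalg.inv(np.asarray(cov, float))
    ones = np.ones(len(x))
    denom = ones @ Cinv @ ones
    mean = (ones @ Cinv @ x) / denom
    unc = 1.0 / np.sqrt(denom)
    return mean, unc

# Illustrative only: two measurements with standard uncertainties 2 and 3
# and covariance 3 (correlation coefficient 0.5).
cov = np.array([[4.0, 3.0], [3.0, 9.0]])
m, u = correlated_weighted_mean([10.0, 12.0], cov)
```

With a diagonal covariance matrix this reduces to the familiar inverse-variance weighted mean; positive correlations inflate the combined uncertainty relative to the uncorrelated case.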

  8. Computations for a condenser. Experimental results

    International Nuclear Information System (INIS)

    Walden, Jean.

    1975-01-01

Computations for a condenser are presented together with experimental results. The computations concern the steam flux at the condenser inlet and inside the tube bundle; experimental results are given for the flux inside the condenser sleeve and the flow passing through the tube bundle [fr]

  9. High-Throughput Computational Assessment of Previously Synthesized Semiconductors for Photovoltaic and Photoelectrochemical Devices

    DEFF Research Database (Denmark)

    Kuhar, Korina; Pandey, Mohnish; Thygesen, Kristian Sommer

    2018-01-01

    Using computational screening we identify materials with potential use as light absorbers in photovoltaic or photoelectrochemical devices. The screening focuses on compounds of up to three different chemical elements which are abundant and nontoxic. A prescreening is carried out based on informat...

  10. Initial results of CyberKnife treatment for recurrent previously irradiated head and neck cancer

    International Nuclear Information System (INIS)

    Himei, Kengo; Katsui, Kuniaki; Yoshida, Atsushi

    2003-01-01

The purpose of this study was to evaluate the efficacy of CyberKnife treatment for recurrent, previously irradiated head and neck cancer. Thirty-one patients with recurrent previously irradiated head and neck cancer treated with a CyberKnife from July 1999 to March 2002 at Okayama Kyokuto Hospital were retrospectively studied. The accumulated dose was 28-80 Gy (median 60 Gy). The interval between CyberKnife treatment and previous radiotherapy was 0.4-429.5 months (median 16.3 months). Primary lesions were nasopharynx: 7, maxillary sinus: 6, tongue: 5, ethmoid sinus: 3, and others: 1. The pathology was squamous cell carcinoma: 25, adenoid cystic carcinoma: 4, and others: 2. Symptoms were pain: 8, and nasal bleeding: 2. The prescribed marginal dose was 15.0-40.3 Gy (median 32.3 Gy). The response rate (complete response (CR) + partial response (PR)) and local control rate (CR + PR + no change (NC)) were 74% and 94%, respectively. Regarding improvement of symptoms, pain disappeared in 4 cases, was relieved in 4 cases, and was unchanged in 2 cases; nasal bleeding disappeared in 2 cases. Adverse effects were observed as mucositis in 5 cases and neck swelling in one case. The prognosis of recurrent previously irradiated head and neck cancer is estimated to be poor. Our early experience shows that CyberKnife is expected to be a feasible treatment for recurrent previously irradiated head and neck cancer, with reduced adverse effects and maintenance of useful quality of life (QOL) for patients. (author)

  11. Air Space Proportion in Pterosaur Limb Bones Using Computed Tomography and Its Implications for Previous Estimates of Pneumaticity

    Science.gov (United States)

    Martin, Elizabeth G.; Palmer, Colin

    2014-01-01

Air Space Proportion (ASP) is a measure of how much air is present within a bone, which allows for a quantifiable comparison of pneumaticity between specimens and species. Measured from zero to one, a higher ASP means more air and less bone. Conventionally, it is estimated from measurements of the internal and external bone diameter, or by analyzing cross-sections. To date, the only pterosaur ASP study has been carried out by visual inspection of sectioned bones within matrix. Here, computed tomography (CT) scans are used to calculate ASP in a small sample of pterosaur wing bones (mainly phalanges) and to assess how the values change throughout the bone. These results show higher ASPs than previous pterosaur pneumaticity studies and, more significantly, higher ASP values in the heads of wing bones than in the shafts. This suggests that pneumaticity has been underestimated previously in pterosaurs, birds, and other archosaurs when shaft cross-sections are used to estimate ASP. Furthermore, ASP in pterosaurs is higher than that found in birds and most sauropod dinosaurs, giving them among the highest ASP values of animals studied so far and supporting the view that pterosaurs were some of the most pneumatized animals to have lived. The high degree of pneumaticity found in pterosaurs is proposed to be a response to the wing bone bending stiffness requirements of flight rather than a means to reduce mass, as is often suggested. Mass reduction may be a secondary result of pneumaticity that subsequently aids flight. PMID:24817312
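The two conventional routes to ASP mentioned above can be sketched directly: for an idealized circular cross-section, ASP reduces to the squared ratio of internal to external diameter, and from a segmented CT slice it is simply the air fraction of the cross-sectional area. This is an illustrative sketch, not the authors' measurement pipeline.

```python
import numpy as np

def asp_from_diameters(inner_d, outer_d):
    """ASP of an idealized circular cross-section.

    Air area / total area = (pi/4 * inner_d**2) / (pi/4 * outer_d**2),
    which reduces to (inner_d / outer_d) ** 2.
    """
    if not 0 <= inner_d <= outer_d:
        raise ValueError("need 0 <= inner_d <= outer_d")
    return (inner_d / outer_d) ** 2

def asp_from_ct_slice(mask_bone, mask_air):
    """ASP from a segmented CT slice.

    `mask_bone` and `mask_air` are boolean arrays marking cortical bone and
    internal air space; ASP is air voxels over all voxels inside the bone
    outline.
    """
    air = float(np.count_nonzero(mask_air))
    total = air + float(np.count_nonzero(mask_bone))
    return air / total

# A thin-walled tube, 8 mm internal / 10 mm external diameter:
print(asp_from_diameters(8.0, 10.0))  # 0.64
```

Computing ASP per slice along the bone is what reveals the head-versus-shaft difference reported above.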

  12. A randomised clinical trial of intrapartum fetal monitoring with computer analysis and alerts versus previously available monitoring

    Directory of Open Access Journals (Sweden)

    Santos Cristina

    2010-10-01

Full Text Available Abstract Background: Intrapartum fetal hypoxia remains an important cause of death and permanent handicap, and in a significant proportion of cases there is evidence of suboptimal care related to fetal surveillance. Cardiotocographic (CTG) monitoring remains the basis of intrapartum surveillance, but its interpretation by healthcare professionals lacks reproducibility and the technology has not been shown to improve clinically important outcomes. The addition of fetal electrocardiogram analysis has increased the potential to avoid adverse outcomes, but CTG interpretation remains its main weakness. A program for computerised analysis of intrapartum fetal signals, incorporating real-time alerts for healthcare professionals, has recently been developed. There is a need to determine whether this technology can result in better perinatal outcomes. Methods/design: This is a multicentre randomised clinical trial. Inclusion criteria are: women aged ≥16 years, able to provide written informed consent, singleton pregnancies ≥36 weeks, cephalic presentation, no known major fetal malformations, in labour but excluding active second stage, planned for continuous CTG monitoring, and no known contraindication for vaginal delivery. Eligible women will be randomised using a computer-generated randomisation sequence to one of two arms: continuous computer analysis of fetal monitoring signals with real-time alerts (intervention arm) or continuous CTG monitoring as previously performed (control arm). Electrocardiographic monitoring and fetal scalp blood sampling will be available in both arms. The primary outcome measure is the incidence of fetal metabolic acidosis (umbilical artery pH and BDecf > 12 mmol/L). Secondary outcome measures are: caesarean section and instrumental vaginal delivery rates, use of fetal blood sampling, and 5-minute Apgar score. Discussion: This study will provide evidence of the impact of intrapartum monitoring with computer analysis and real

  13. Does a previous prostate biopsy-related acute bacterial prostatitis affect the results of radical prostatectomy?

    Directory of Open Access Journals (Sweden)

    Hakan Türk

Full Text Available ABSTRACT Objective: The standard technique for obtaining a histologic diagnosis of prostatic carcinoma is transrectal ultrasound-guided prostate biopsy. Acute bacterial prostatitis (ABP), which may develop after prostate biopsy, can cause periprostatic inflammation and fibrosis. In this study, we performed a retrospective review of our database to determine whether a history of ABP affects the outcome of radical prostatectomy (RP). Materials and Methods: 441 RP patients operated in our clinic from 2002 to 2014 were included in the study group. All patients’ demographic values, PSA levels, biopsy and radical prostatectomy specimen pathology results, and perioperative/postoperative complications were evaluated. Results: There were 41 patients who developed acute prostatitis following biopsy and 397 patients who did not. Mean blood loss, transfusion rate, and operation time were significantly higher in ABP patients. Hospitalization period and reoperation rates were similar in both groups; however, postoperative complications were significantly higher in the ABP group. Conclusion: Even though it does not affect oncological outcomes, we would like to warn surgeons of potential complications during surgery in ABP patients.

  14. Does a previous prostate biopsy-related acute bacterial prostatitis affect the results of radical prostatectomy?

    Science.gov (United States)

    Türk, Hakan; Ün, Sitki; Arslan, Erkan; Zorlu, Ferruh

    2018-01-01

The standard technique for obtaining a histologic diagnosis of prostatic carcinoma is transrectal ultrasound-guided prostate biopsy. Acute bacterial prostatitis (ABP), which may develop after prostate biopsy, can cause periprostatic inflammation and fibrosis. In this study, we performed a retrospective review of our database to determine whether a history of ABP affects the outcome of radical prostatectomy (RP). 441 RP patients operated in our clinic from 2002 to 2014 were included in the study group. All patients' demographic values, PSA levels, biopsy and radical prostatectomy specimen pathology results, and perioperative/postoperative complications were evaluated. There were 41 patients who developed acute prostatitis following biopsy and 397 patients who did not. Mean blood loss, transfusion rate, and operation time were significantly higher in ABP patients. Hospitalization period and reoperation rates were similar in both groups; however, postoperative complications were significantly higher in the ABP group. Even though it does not affect oncological outcomes, we would like to warn surgeons of potential complications during surgery in ABP patients. Copyright® by the International Brazilian Journal of Urology.

  15. Computer-aided detection system performance on current and previous digital mammograms in patients with contralateral metachronous breast cancer

    International Nuclear Information System (INIS)

    Kim, Seung Ja; Moon, Woo Kyung; Cho, Nariya; Chang, Jung Min

    2012-01-01

    Background: The computer-aided detection (CAD) system is widely used for screening mammography. The performance of the CAD system for contralateral breast cancer has not been reported for women with a history of breast cancer. Purpose: To retrospectively evaluate the performance of a CAD system on current and previous mammograms in patients with contralateral metachronous breast cancer. Material and Methods: During a 3-year period, 4945 postoperative patients had follow-up examinations, from whom we selected 55 women with contralateral breast cancers. Among them, 38 had visible malignant signs on the current mammograms. We analyzed the sensitivity and false-positive marks of the system on the current and previous mammograms according to lesion type and breast density. Results: The total visible lesion components on the current mammograms included 27 masses and 14 calcifications in 38 patients. The case-based sensitivity for all lesion types was 63.2% (24/38) with false-positive marks of 0.71 per patient. The lesion-based sensitivity for masses and calcifications was 59.3% (16/27) and 71.4% (10/14), respectively. The lesion-based sensitivity for masses in fatty and dense breasts was 68.8% (11/16) and 45.5% (5/11), respectively. The lesion-based sensitivity for calcifications in fatty and dense breasts was 100.0% (3/3) and 63.6% (7/11), respectively. The total visible lesion components on the previous mammograms included 13 masses and three calcifications in 16 patients, and the sensitivity for all lesion types was 31.3% (5/16) with false-positive marks of 0.81 per patient. On these mammograms, the sensitivity for masses and calcifications was 30.8% (4/13) and 33.3% (1/3), respectively. The sensitivity in fatty and dense breasts was 28.6% (2/7) and 33.3% (3/9), respectively. Conclusion: In the women with a history of breast cancer, the sensitivity of the CAD system in visible contralateral breast cancer was lower than in most previous reports using the same CAD
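The performance figures quoted throughout this abstract are simple ratios; a minimal sketch follows, using the abstract's own numbers. The function names are invented for illustration.

```python
# Sensitivity and false-positive rate of a CAD system are simple ratios;
# function names here are illustrative, not from any CAD vendor API.

def sensitivity(detected, total):
    """Fraction of lesions (or cases) correctly marked by the CAD system."""
    return detected / total

def false_positive_marks_per_patient(false_marks, n_patients):
    """Average number of false-positive CAD marks per patient."""
    return false_marks / n_patients

# Case-based sensitivity on current mammograms, as quoted above: 24 of 38.
print(f"{100 * sensitivity(24, 38):.1f}%")  # 63.2%
```

The same ratio applied lesion-by-lesion (e.g. 16 of 27 masses) gives the lesion-based figures reported above.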

  16. New computation results for the solar dynamo

    International Nuclear Information System (INIS)

    Csada, I.K.

    1983-01-01

The analytical solution to the solar dynamo equation leads to a relatively simple algorithm for computation in terms of kinematic models. The internal and external velocities are taken to be in the form of axisymmetric meridional circulation and differential rotation, respectively. Purely radial expanding motions in the corona are also taken into consideration. Numerical results are presented in terms of the velocity parameters for the period of field reversal, the decay time, and the magnitudes and phases of the first four multipoles. (author)

  17. Surgical Results of Trabeculectomy and Ahmed Valve Implantation Following a Previous Failed Trabeculectomy in Primary Congenital Glaucoma Patients

    OpenAIRE

    Lee, Naeun; Ma, Kyoung Tak; Bae, Hyoung Won; Hong, Samin; Seong, Gong Je; Hong, Young Jae; Kim, Chan Yun

    2015-01-01

    Purpose To compare the surgical results of trabeculectomy and Ahmed glaucoma valve implantation after a previous failed trabeculectomy. Methods A retrospective comparative case series review was performed on 31 eye surgeries in 20 patients with primary congenital glaucoma who underwent trabeculectomy or Ahmed glaucoma valve implantation after a previous failed trabeculectomy with mitomycin C. Results The preoperative mean intraocular pressure was 25.5 mmHg in the trabeculectomy group and 26.9...

  18. Surgical results of trabeculectomy and Ahmed valve implantation following a previous failed trabeculectomy in primary congenital glaucoma patients.

    Science.gov (United States)

    Lee, Naeun; Ma, Kyoung Tak; Bae, Hyoung Won; Hong, Samin; Seong, Gong Je; Hong, Young Jae; Kim, Chan Yun

    2015-04-01

    To compare the surgical results of trabeculectomy and Ahmed glaucoma valve implantation after a previous failed trabeculectomy. A retrospective comparative case series review was performed on 31 eye surgeries in 20 patients with primary congenital glaucoma who underwent trabeculectomy or Ahmed glaucoma valve implantation after a previous failed trabeculectomy with mitomycin C. The preoperative mean intraocular pressure was 25.5 mmHg in the trabeculectomy group and 26.9 mmHg in the Ahmed glaucoma valve implantation group (p = 0.73). The 48-month postoperative mean intraocular pressure was 19.6 mmHg in the trabeculectomy group and 20.2 mmHg in the Ahmed glaucoma valve implantation group (p = 0.95). The 12-month trabeculectomy success rate was 69%, compared with 64% for Ahmed glaucoma valve implantation, and the 48-month success rates were 42% and 36% for trabeculectomy and valve implantation, respectively. The success rates following the entire follow-up period were not significantly different between the two groups (p > 0.05 by log rank test). Postoperative complications occurred in 25% of the trabeculectomy-operated eyes and 9% of the Ahmed-implanted eyes (p = 0.38). There was no significant difference in surgical outcome between the trabeculectomy and Ahmed glaucoma valve implantation groups, neither of which had favorable results. However, the trabeculectomy group demonstrated a higher prevalence of adverse complications such as post-operative endophthalmitis.

  19. Everolimus for Previously Treated Advanced Gastric Cancer: Results of the Randomized, Double-Blind, Phase III GRANITE-1 Study

    Science.gov (United States)

    Ohtsu, Atsushi; Ajani, Jaffer A.; Bai, Yu-Xian; Bang, Yung-Jue; Chung, Hyun-Cheol; Pan, Hong-Ming; Sahmoud, Tarek; Shen, Lin; Yeh, Kun-Huei; Chin, Keisho; Muro, Kei; Kim, Yeul Hong; Ferry, David; Tebbutt, Niall C.; Al-Batran, Salah-Eddin; Smith, Heind; Costantini, Chiara; Rizvi, Syed; Lebwohl, David; Van Cutsem, Eric

    2013-01-01

    Purpose The oral mammalian target of rapamycin inhibitor everolimus demonstrated promising efficacy in a phase II study of pretreated advanced gastric cancer. This international, double-blind, phase III study compared everolimus efficacy and safety with that of best supportive care (BSC) in previously treated advanced gastric cancer. Patients and Methods Patients with advanced gastric cancer that progressed after one or two lines of systemic chemotherapy were randomly assigned to everolimus 10 mg/d (assignment schedule: 2:1) or matching placebo, both given with BSC. Randomization was stratified by previous chemotherapy lines (one v two) and region (Asia v rest of the world [ROW]). Treatment continued until disease progression or intolerable toxicity. Primary end point was overall survival (OS). Secondary end points included progression-free survival (PFS), overall response rate, and safety. Results Six hundred fifty-six patients (median age, 62.0 years; 73.6% male) were enrolled. Median OS was 5.4 months with everolimus and 4.3 months with placebo (hazard ratio, 0.90; 95% CI, 0.75 to 1.08; P = .124). Median PFS was 1.7 months and 1.4 months in the everolimus and placebo arms, respectively (hazard ratio, 0.66; 95% CI, 0.56 to 0.78). Common grade 3/4 adverse events included anemia, decreased appetite, and fatigue. The safety profile was similar in patients enrolled in Asia versus ROW. Conclusion Compared with BSC, everolimus did not significantly improve overall survival for advanced gastric cancer that progressed after one or two lines of previous systemic chemotherapy. The safety profile observed for everolimus was consistent with that observed for everolimus in other cancers. PMID:24043745

  20. Locating previously unknown patterns in data-mining results: a dual data- and knowledge-mining method

    Directory of Open Access Journals (Sweden)

    Knaus William A

    2006-03-01

    Full Text Available Abstract Background Data mining can be utilized to automate analysis of substantial amounts of data produced in many organizations. However, data mining produces large numbers of rules and patterns, many of which are not useful. Existing methods for pruning uninteresting patterns have only begun to automate the knowledge acquisition step (which is required for subjective measures of interestingness, hence leaving a serious bottleneck. In this paper we propose a method for automatically acquiring knowledge to shorten the pattern list by locating the novel and interesting ones. Methods The dual-mining method is based on automatically comparing the strength of patterns mined from a database with the strength of equivalent patterns mined from a relevant knowledgebase. When these two estimates of pattern strength do not match, a high "surprise score" is assigned to the pattern, identifying the pattern as potentially interesting. The surprise score captures the degree of novelty or interestingness of the mined pattern. In addition, we show how to compute p values for each surprise score, thus filtering out noise and attaching statistical significance. Results We have implemented the dual-mining method using scripts written in Perl and R. We applied the method to a large patient database and a biomedical literature citation knowledgebase. The system estimated association scores for 50,000 patterns, composed of disease entities and lab results, by querying the database and the knowledgebase. It then computed the surprise scores by comparing the pairs of association scores. Finally, the system estimated statistical significance of the scores. Conclusion The dual-mining method eliminates more than 90% of patterns with strong associations, thus identifying them as uninteresting. We found that the pruning of patterns using the surprise score matched the biomedical evidence in the 100 cases that were examined by hand. 
The method automates the acquisition of
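The comparison at the heart of the dual-mining method, scoring how far a pattern's strength in the database diverges from its strength in the knowledgebase and attaching a p value, can be sketched as a two-proportion z-test. This is an illustrative stand-in for the published scoring formula, not a reproduction of it.

```python
import math

def surprise_score(p_db, n_db, p_kb, n_kb):
    """Toy surprise score for one mined pattern.

    Compares the pattern's association strength in the database (proportion
    p_db over n_db records) with its strength in the knowledgebase (p_kb over
    n_kb citations) via a two-proportion z-statistic; returns the z score and
    a two-sided p value. A large |z| with a small p value flags the pattern
    as potentially novel/interesting.
    """
    pooled = (p_db * n_db + p_kb * n_kb) / (n_db + n_kb)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_db + 1 / n_kb))
    z = (p_db - p_kb) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal p value
    return z, p_value

# A pattern equally strong in both sources is unsurprising (z ≈ 0, p ≈ 1);
# one much stronger in the database than the literature scores high.
print(surprise_score(0.5, 1000, 0.3, 1000))
```

Thresholding on the p value is what lets such a system prune the bulk of strongly associated but well-known (hence uninteresting) patterns.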

  1. Passive acoustic monitoring using a towed hydrophone array results in identification of a previously unknown beaked whale habitat.

    Science.gov (United States)

    Yack, Tina M; Barlow, Jay; Calambokidis, John; Southall, Brandon; Coates, Shannon

    2013-09-01

Beaked whales are a diverse and species-rich taxon. They spend the vast majority of their time submerged, regularly diving to depths of hundreds to thousands of meters, typically occur in small groups, and behave inconspicuously at the surface. These factors make them extremely difficult to detect using standard visual survey methods. However, recent advancements in acoustic detection capabilities have made passive acoustic monitoring (PAM) a viable alternative. Beaked whales can be discriminated from other odontocetes by the unique characteristics of their echolocation clicks. In 2009 and 2010, PAM methods using towed hydrophone arrays were tested. These methods proved highly effective for real-time detection of beaked whales in the Southern California Bight (SCB) and were subsequently implemented in 2011 to successfully detect and track beaked whales during the ongoing Southern California Behavioral Response Study. The three-year field effort has resulted in (1) the successful classification and tracking of Cuvier's (Ziphius cavirostris), Baird's (Berardius bairdii), and unidentified Mesoplodon beaked whale species and (2) the identification of areas of previously unknown beaked whale habitat use. Identification of habitat use areas will contribute to a better understanding of the complex relationship between beaked whale distribution, occurrence, and preferred habitat characteristics on a relatively small spatial scale. These findings will also provide information that can be used to promote more effective management and conservation of beaked whales in the SCB, a heavily used Naval operation and training region.

  2. COMPARISON OF THE RESULTS OF BLOOD GLUCOSE SELF-MONITORING AND CONTINUOUS GLUCOSE MONITORING IN PREGNANT WOMEN WITH PREVIOUS DIABETES MELLITUS

    Directory of Open Access Journals (Sweden)

    A. V. Dreval'

    2015-01-01

    Full Text Available Background: Pregnancy is one of the indications for continuous glucose monitoring (CGM). The data on its efficiency in pregnant women are contradictory. Aim: To compare the results of blood glucose self-monitoring (SMBG) and CGM in pregnant women with previous diabetes mellitus. Materials and methods: We performed a cross-sectional comparative study of glycemia in 18 pregnant women with previous type 1 (87.8% of patients) and type 2 diabetes (22.2% of patients) with various degrees of glycemic control. Their age was 27.7 ± 4.9 years. At study entry, the patients were at 17.2 ± 6.1 weeks of gestation. CGM and SMBG were performed in all patients for 5.4 ± 1.5 days. Depending on their HbA1c levels, all patients were divided into two groups: group 1 – 12 women with HbA1c above the target (8.5 ± 1%), and group 2 – 6 women with HbA1c levels within the target (5.6 ± 0.3%). Results: According to SMBG results, women from group 2 had above-the-target glycemia levels before breakfast, at 1 hour after breakfast and at bedtime: 6.2 ± 1.6, 8.7 ± 2.1, and 5.7 ± 1.9 mmol/L, respectively. According to CGM, patients from group 1 had higher postprandial glycemia than those from group 2 (8.0 ± 2.1 and 6.9 ± 1.8 mmol/L, respectively, p = 0.03). The analysis of glycemia during the daytime revealed a significant difference between the groups only at 1 hour after dinner (7.1 ± 1.4 mmol/L in group 1 and 5.8 ± 0.9 mmol/L in group 2, p = 0.041), and the difference was close to significant before lunch (6.0 ± 2.2 mmol/L in group 1 and 4.8 ± 1.0 mmol/L in group 2, p = 0.053). Comparison of SMBG and CGM results demonstrated a significant difference only at one timepoint (at 1 hour after lunch) and only in group 1: median glycemia was 7.4 [6.9; 8.1] mmol/L by SMBG and 6 [5.4; 6.6] mmol/L by CGM measurement (p = 0.001). Lower median values by CGM measurement could be explained by averaging of three successive measurements carried out in the

  3. Membrane computing: brief introduction, recent results and applications.

    Science.gov (United States)

    Păun, Gheorghe; Pérez-Jiménez, Mario J

    2006-07-01

    The internal organization and functioning of living cells, as well as their cooperation in tissues and higher order structures, can be a rich source of inspiration for computer science, not fully exploited at the present date. Membrane computing is an answer to this challenge, well developed at the theoretical (mathematical and computability theory) level, already having several applications (via usual computers), but without having yet a bio-lab implementation. After briefly discussing some general issues related to natural computing, this paper provides an informal introduction to membrane computing, focused on the main ideas, the main classes of results and of applications. Then, three recent achievements, of three different types, are briefly presented, with emphasis on the usefulness of membrane computing as a framework for devising models of interest for biological and medical research.
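
    The flavor of membrane computing can be conveyed with a toy, single-membrane P system in which multiset rewriting rules are applied in a maximally parallel way: all applicable rule instances fire, and their products become available only in the next step. The rule encoding and function names below are illustrative assumptions (and nondeterministic rule choice is ignored), not material from the paper.

```python
from collections import Counter

def step(multiset, rules):
    """One maximally parallel step in a single membrane: keep consuming
    objects with applicable rules until none can fire, then add all
    products produced in this step."""
    ms = Counter(multiset)
    produced = Counter()
    fired = True
    while fired:
        fired = False
        for lhs, rhs in rules:
            need = Counter(lhs)
            if all(ms[s] >= c for s, c in need.items()):
                ms -= need               # consume the left-hand side
                produced += Counter(rhs)  # defer products to step's end
                fired = True
    return ms + produced

# Rule "ab -> c" consumes one a and one b, producing one c.
# From aaabb it fires twice, leaving one unmatched a plus two c's.
print(sorted(step("aaabb", [("ab", "c")]).elements()))
```

    Real P systems add nested membranes, communication rules, and membrane dissolution, which this sketch deliberately omits.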

  4. Overview of JET post-mortem results following the 2007-9 operational period, and comparisons with previous campaigns

    International Nuclear Information System (INIS)

    Coad, J P; Gruenhagen, S; Widdowson, A; Hole, D E; Hakola, A; Koivuranta, S; Likonen, J; Rubel, M

    2011-01-01

    In 2010, all the plasma-facing components were removed from JET so that the carbon-based surfaces could be replaced with beryllium (Be) or tungsten as part of the ITER-like wall (ILW) project. This gives unprecedented opportunities for post-mortem analyses of these plasma-facing surfaces; this paper reviews the data obtained so far and relates the information to studies of tiles removed during previous JET shutdowns. The general pattern of erosion/deposition at the JET divertor has been maintained, with deposition of impurities in the scrape-off layer (SOL) at the inner divertor and preferential removal of carbon and transport into the corner. However, the remaining films in the SOL contain very high Be/C ratios at the surface. The first measurements of erosion using a tile profiler have been completed, with up to 200 microns erosion being recorded at points on the inner wall guard limiters.

  5. Ifosfamide in previously untreated disseminated neuroblastoma. Results of Study 3A of the European Neuroblastoma Study Group.

    Science.gov (United States)

    Kellie, S J; De Kraker, J; Lilleyman, J S; Bowman, A; Pritchard, J

    1988-05-01

    A prospective study of the effectiveness of ifosfamide as a single agent in the management of previously untreated patients with Evans stage IV neuroblastoma was undertaken. Eighteen children aged more than 1 year were treated with ifosfamide (IFX) 3 g/m2 daily for 2 days immediately after diagnosis and 3 weeks later. Treatment was continued with combination chemotherapy using vincristine, cyclophosphamide, cisplatinum and etoposide (OPEC) or a variant. Mesna (2-mercaptoethane sulphonate) was given to all patients during IFX treatment to prevent urotoxicity. Eight of the 18 patients (44%) responded to IFX. Nine had greater than 66% reduction in baseline tumor volume. Of 15 evaluable patients with raised pre-treatment urinary catecholamine excretion, six (40%) achieved greater than 50% reduction in pretreatment levels. Two of 10 patients evaluable for bone marrow response had complete clearance. Toxicity was mild in all patients. Upon completing 'first line' therapy, only four patients (22%) achieved a good partial remission (GPR) or complete response (CR). Median survival was 11 months. There was a lower rate of attaining GPR and shortened median survival in patients receiving phase II IFX before OPEC or variant, compared to patients with similar pre-treatment characteristics treated with OPEC from diagnosis in an earlier study.

  6. Long-term follow-up results of 131I treatment of recurrent hyperthyroidism previously treated by subtotal thyroidectomy

    International Nuclear Information System (INIS)

    Bal, C.S.; Padhy, A.K.; Nair, P.G.

    1998-01-01

    Full text: In patients with recurrent hyperthyroidism following previous subtotal thyroidectomy for Graves' disease or toxic MNG, radioiodine therapy is often recommended. However, our knowledge of the long-term effect of 131I in this subset of patients is limited. 47 patients presented with post-surgery recurrence at the thyroid clinic of the Nuclear Medicine Department from 1972 to 1996. Mean age of patients at presentation was 43 years (range 23-67 years); 10 were males, 28 had Graves' disease and the rest toxic MNG. Time of recurrence following surgery varied widely from 6 months to 32 years: 21% recurred within a year and 75% before the tenth year; however, 15% recurred beyond 20 years. 11 patients (23.4%) were aged more than 50 years at the time of recurrence. 34 patients (72%) needed a single dose of 131I (mean dose 288 MBq, range 107-740 MBq) and the remaining 13 patients needed multiple doses of 131I to be free of thyrotoxicosis (7 patients: 2 doses, 3 patients: 3 doses, 2 patients: 4 doses and the last one 5 doses). 38 patients required ≤370 MBq for this purpose. One individual needed the maximum, 1480 MBq in divided doses, to become euthyroid. The maximum duration of follow-up was 26 years, with a mean follow-up of 10 years. 5 patients were lost to follow-up after their 131I therapy. The end point considered was confirmed hypothyroidism or euthyroidism at the last visit. 26 patients (62%) were euthyroid and 16 (38%) were hypothyroid after the 10-year mean follow-up period; however, hypothyroidism at the end of one year was present in eleven patients (26%). Comparison with age-, sex-, type-of-gland-, time-of-131I-treatment- and RAIU-matched non-operated thyrotoxic patients revealed a first-year hypothyroidism rate of 9% and a cumulative hypothyroidism rate of 36% after 9.8 years of follow-up (range 1-26 years). This study reveals that 15% of patients recur even after 20 years, indicating the need for life-long follow-up after thyroidectomy. 
The 131I treatment in these patients shows a high initial hypothyroidism rate

  7. Continuation of the summarizing interim report on previous results of the Gorleben site survey as of May 1983

    International Nuclear Information System (INIS)

    1990-04-01

    In addition to results from the 1983 interim report, this report contains, in order to supplement the surface explorations, seismic reflection measurements, hydrogeologic and seismologic investigations, sorption experiments, and studies of glacial development in the site region and of long-term safety of final waste repositories in salt domes. The site's high grade of suitability for becoming a final radioactive waste repository, the legal basis as well as quality assurance are evaluated. (orig.) [de

  8. Effects of 60 Hz electromagnetic fields on early growth in three plant species and a replication of previous results

    Energy Technology Data Exchange (ETDEWEB)

    Davis, M.S. [Univ. of Sunderland (United Kingdom). Ecology Centre

    1996-05-01

    In an attempt to replicate the findings of Smith et al., seeds of Raphanus sativus L. (radish), Sinapis alba L. (mustard), and Hordeum vulgare L. (barley) were grown for between 9 and 21 days in continuous electromagnetic fields (EMFs) at ion-cyclotron resonance conditions for stimulation of Ca²⁺ (B_H = 78.3 µT, B_HAC = 40 µT peak-peak at 60 Hz, B_v = 0). On harvesting, radish showed results similar to those of Smith et al. Dry stem weight and plant height were both significantly greater (Mann-Whitney tests, Ps < 0.05) in EMF-exposed plants than in control plants in each EMF experiment. Wet root weight was significantly greater in EMF-exposed plants in two out of three experiments, as were dry leaf weight, dry whole weight, and stem diameter. Dry root weight, wet leaf weight, and wet whole weight were significantly greater in EMF-exposed plants in one of three experiments. All significant differences indicated an increase in weight or size in the EMF-exposed plants. In each of the sham experiments, no differences between exposed and control plants were evident. Mustard plants failed to respond to the EMFs in any of the plant parameters measured. In one experiment, barley similarly failed to respond, but in another it showed significantly greater wet root weight and significantly smaller stem diameter and dry seed weight at the end of the experiment in exposed plants compared to control plants. Although these results give no clue about the underlying bioelectromagnetic mechanism, they demonstrate that, at least for one EMF-sensitive biosystem, results can be independently replicated in another laboratory. Such replication is crucial in establishing the validity of bioelectromagnetic science.
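
    The Mann-Whitney tests used throughout the study compare two independent samples via the rank-based U statistic: the number of (exposed, control) pairs in which the exposed value exceeds the control value, with ties counted as half. The sketch below uses hypothetical plant weights and shows only the statistic itself, not the p-value lookup.

```python
def mann_whitney_u(x, y):
    """U statistic for sample x versus sample y: count pairs (xi, yj)
    with xi > yj, scoring ties as 0.5."""
    u = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u += 1.0
            elif xi == yj:
                u += 0.5
    return u

# Hypothetical dry stem weights (mg): EMF-exposed vs. control seedlings.
exposed = [12.1, 13.4, 11.8, 14.0, 12.9]
control = [10.2, 11.1, 11.9, 10.8, 11.5]
print(mann_whitney_u(exposed, control))  # max possible here is 5 * 5 = 25
```

    A U close to the maximum (as with these hypothetical values) indicates the exposed sample is shifted upward relative to the control sample, the kind of difference the study reports for radish.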

  9. Making the best use of our previous results as a clue for interpreting kinetics of scintigraphic agents

    Directory of Open Access Journals (Sweden)

    Tsuyoshi Sato

    2011-08-01

    Full Text Available Up to now, we have performed scintigraphy with 201-thallium chloride (201-TlCl) and 99m-Tc-hexakis-2-methoxy-isobutyl-isonitrile (99m-Tc-MIBI) for malignant tumors and lymphoscintigraphy with 99m-Tc-rhenium-colloid (99m-Tc-Re) and 99m-Tc-human-serum-albumin-diethylene-triamine-penta-acetic-acid (99m-Tc-HSA-D) for lymph node metastasis. In this article, we re-evaluated the scintigraphic images retrospectively in the hope that the results might provide a clue, however small, for dentists trying to improve the accuracy of diagnosis of malignant tumors. From the scintigraphy, we obtained the tumor retention index as a factor to estimate the uptake of radioactive agents in tumor cells. Moreover, we assessed the transport proteins Na+/K+-ATPase and permeability-glycoprotein (P-gp) expressed on the cell membrane that might regulate the kinetics of the radioactive agents. The tumor retention index, the transport-protein expression, and the histopathologic findings of the tumors correlated relatively well with one another. The tumor retention index clearly distinguished malignant from benign tumors. The transport proteins showed distinct expression in accordance with the malignancy of the tumor, and the uptake clearly depended upon the expression of the transport proteins. Moreover, lymph node metastasis was detected well by lymphoscintigraphy with 99m-Tc-Re and 99m-Tc-HSA-D.

  10. Imprecise results: Utilizing partial computations in real-time systems

    Science.gov (United States)

    Lin, Kwei-Jay; Natarajan, Swaminathan; Liu, Jane W.-S.

    1987-01-01

    In real-time systems, a computation may not have time to complete its execution because of deadline requirements. In such cases, no result except the approximate results produced by the computations up to that point will be available. It is desirable to utilize these imprecise results if possible. Two approaches are proposed to enable computations to return imprecise results when executions cannot be completed normally. The milestone approach records results periodically, and if a deadline is reached, returns the last recorded result. The sieve approach demarcates sections of code which can be skipped if the time available is insufficient. By using these approaches, the system is able to produce imprecise results when deadlines are reached. The design of the Concord project is described which supports imprecise computations using these techniques. Also presented is a general model of imprecise computations using these techniques, as well as one which takes into account the influence of the environment, showing where the latter approach fits into this model.
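
    The milestone approach described above can be sketched as a loop that records its latest partial result and simply returns the last recorded milestone when the deadline expires. Using a series approximation of pi as the computation is purely an illustrative assumption for this sketch.

```python
import time

def milestone_pi(deadline_s):
    """Milestone approach: refine an estimate of pi via the Leibniz series,
    recording each partial sum as a milestone; when the deadline arrives,
    return the last recorded (imprecise but usable) result."""
    start = time.monotonic()
    milestone = 0.0        # last recorded milestone
    sign, k = 1.0, 0
    while time.monotonic() - start < deadline_s:
        milestone += sign * 4.0 / (2 * k + 1)  # record next partial sum
        sign, k = -sign, k + 1
    return milestone

# A tight 10 ms deadline still yields a usable approximation of pi.
approx = milestone_pi(0.01)
print(abs(approx - 3.14159) < 0.5)
```

    The sieve approach would instead wrap skippable sections of code in time-budget checks; both strategies trade precision for a guaranteed on-time answer.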

  11. Results of a Research Evaluating Quality of Computer Science Education

    Science.gov (United States)

    Záhorec, Ján; Hašková, Alena; Munk, Michal

    2012-01-01

    The paper presents the results of an international research on a comparative assessment of the current status of computer science education at the secondary level (ISCED 3A) in Slovakia, the Czech Republic, and Belgium. Evaluation was carried out based on 14 specific factors gauging the students' point of view. The authors present qualitative…

  12. Computational fluid dynamics simulations and validations of results

    CSIR Research Space (South Africa)

    Sitek, MA

    2013-09-01

    Full Text Available Wind flow influence on a high-rise building is analyzed. The research covers full-scale tests, wind-tunnel experiments and numerical simulations. In the present paper computational model used in simulations is described and the results, which were...

  13. Computation of Quasiperiodic Normally Hyperbolic Invariant Tori: Rigorous Results

    Science.gov (United States)

    Canadell, Marta; Haro, Àlex

    2017-12-01

    The development of efficient methods for detecting quasiperiodic oscillations and computing the corresponding invariant tori is a subject of great importance in dynamical systems and their applications in science and engineering. In this paper, we prove the convergence of a new Newton-like method for computing quasiperiodic normally hyperbolic invariant tori carrying quasiperiodic motion in smooth families of real-analytic dynamical systems. The main result is stated as an a posteriori KAM-like theorem that allows controlling the inner dynamics on the torus with appropriate detuning parameters, in order to obtain a prescribed quasiperiodic motion. The Newton-like method leads to several fast and efficient computational algorithms, which are discussed and tested in a companion paper (Canadell and Haro in J Nonlinear Sci, 2017. doi: 10.1007/s00332-017-9388-z), in which new mechanisms of breakdown are presented.

  14. Computer and compiler effects on code results: status report

    International Nuclear Information System (INIS)

    1996-01-01

    Within the framework of the international effort on the assessment of computer codes, which are designed to describe the overall reactor coolant system (RCS) thermalhydraulic response, core damage progression, and fission product release and transport during severe accidents, there has been a continuous debate as to whether the code results are influenced by different code users or by different computers or compilers. The first aspect, the 'Code User Effect', has been investigated already. In this paper the other aspects will be discussed and proposals are given how to make large system codes insensitive to different computers and compilers. Hardware errors and memory problems are not considered in this report. The codes investigated herein are integrated code systems (e. g. ESTER, MELCOR) and thermalhydraulic system codes with extensions for severe accident simulation (e. g. SCDAP/RELAP, ICARE/CATHARE, ATHLET-CD), and codes to simulate fission product transport (e. g. TRAPMELT, SOPHAEROS). Since all of these codes are programmed in Fortran 77, the discussion herein is based on this programming language although some remarks are made about Fortran 90. Some observations about different code results by using different computers are reported and possible reasons for this unexpected behaviour are listed. Then methods are discussed how to avoid portability problems

  15. Initial water quantification results using neutron computed tomography

    Science.gov (United States)

    Heller, A. K.; Shi, L.; Brenizer, J. S.; Mench, M. M.

    2009-06-01

    Neutron computed tomography is an important imaging tool in the field of non-destructive testing and in fundamental research for many engineering applications. Contrary to X-rays, neutrons can be attenuated by some light materials, such as hydrogen, but can penetrate many heavy materials. Thus, neutron computed tomography is useful in obtaining important three-dimensional information about a sample's interior structure and material properties that other traditional methods cannot provide. The neutron computed tomography system at the Pennsylvania State University's Radiation Science and Engineering Center is being utilized to develop a water quantification technique for investigation of water distribution in fuel cells under normal conditions. A hollow aluminum cylinder test sample filled with a known volume of water was constructed for purposes of testing the quantification technique. Transmission images of the test sample at different angles were easily acquired through the synthesis of a dedicated image acquisition computer driving a rotary table controller and an in-house developed synchronization software package. After data acquisition, Octopus (version 8.2) and VGStudio Max (version 1.2) were used to perform cross-sectional and three-dimensional reconstructions of the sample, respectively. The initial reconstructions and water quantification results are presented.
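
    Water quantification from neutron transmission images typically rests on inverting an exponential (Beer-Lambert-type) attenuation law, since hydrogen makes water strongly attenuating. The sketch below, including the nominal attenuation coefficient, is an illustrative assumption rather than the calibration actually used in the paper.

```python
import math

# Nominal macroscopic attenuation coefficient of water for thermal
# neutrons, in cm^-1 (assumed round value for illustration).
SIGMA_WATER = 3.5

def water_thickness_cm(transmission):
    """Invert the Beer-Lambert law I/I0 = exp(-sigma * t) to recover the
    water thickness t along a beam path from the measured transmission."""
    return -math.log(transmission) / SIGMA_WATER

# A pixel transmitting 50% of the incident beam corresponds to roughly
# 0.2 cm of water along that ray.
print(round(water_thickness_cm(0.5), 3))
```

    Applied pixel by pixel to a transmission image (and combined over angles in the tomographic reconstruction), this inversion yields the water distribution the paper sets out to quantify.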

  16. Initial water quantification results using neutron computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Heller, A.K. [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States)], E-mail: axh174@psu.edu; Shi, L.; Brenizer, J.S.; Mench, M.M. [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States)

    2009-06-21

    Neutron computed tomography is an important imaging tool in the field of non-destructive testing and in fundamental research for many engineering applications. Contrary to X-rays, neutrons can be attenuated by some light materials, such as hydrogen, but can penetrate many heavy materials. Thus, neutron computed tomography is useful in obtaining important three-dimensional information about a sample's interior structure and material properties that other traditional methods cannot provide. The neutron computed tomography system at Pennsylvania State University's Radiation Science and Engineering Center is being utilized to develop a water quantification technique for investigation of water distribution in fuel cells under normal conditions. A hollow aluminum cylinder test sample filled with a known volume of water was constructed for purposes of testing the quantification technique. Transmission images of the test sample at different angles were easily acquired through the synthesis of a dedicated image acquisition computer driving a rotary table controller and an in-house developed synchronization software package. After data acquisition, Octopus (version 8.2) and VGStudio Max (version 1.2) were used to perform cross-sectional and three-dimensional reconstructions of the sample, respectively. The initial reconstructions and water quantification results are presented.

  17. CMS results in the Combined Computing Readiness Challenge CCRC'08

    International Nuclear Information System (INIS)

    Bonacorsi, D.; Bauerdick, L.

    2009-01-01

    During February and May 2008, CMS participated in the Combined Computing Readiness Challenge (CCRC'08) together with all other LHC experiments. The purpose of this worldwide exercise was to check the readiness of the Computing infrastructure for LHC data taking. Another set of major CMS tests called Computing, Software and Analysis challenge (CSA'08) - as well as CMS cosmic runs - was also running at the same time: CCRC augmented the load on computing with additional tests to validate and stress-test all CMS computing workflows at full data taking scale, also extending this to the global WLCG community. CMS exercised most aspects of the CMS computing model, with very comprehensive tests. During May 2008, CMS moved more than 3.6 Petabytes among more than 300 links in the complex Grid topology. CMS demonstrated that it is able to safely move data out of CERN to the Tier-1 sites, sustaining more than 600 MB/s as a daily average for more than seven days in a row, with enough headroom and with hourly peaks of up to 1.7 GB/s. CMS ran hundreds of simultaneous jobs at each Tier-1 site, re-reconstructing and skimming hundreds of millions of events. After re-reconstruction the fresh AOD (Analysis Object Data) has to be synchronized between Tier-1 centers: CMS demonstrated that the required inter-Tier-1 transfers are achievable within a few days. CMS also showed that skimmed analysis data sets can be transferred to Tier-2 sites for analysis at sufficient rate, regionally as well as inter-regionally, achieving all goals in about 90% of >200 links. Simultaneously, CMS also ran a large Tier-2 analysis exercise, where realistic analysis jobs were submitted to a large set of Tier-2 sites by a large number of people to produce a chaotic workload across the systems, and with more than 400 analysis users in May. Taken all together, CMS routinely achieved submissions of 100k jobs/day, with peaks up to 200k jobs/day. The achieved results in CCRC'08 - focussing on the distributed

  18. BaBar computing - From collisions to physics results

    CERN Multimedia

    CERN. Geneva

    2004-01-01

    The BaBar experiment at SLAC studies B-physics at the Upsilon(4S) resonance using the high-luminosity e+e- collider PEP-II at the Stanford Linear Accelerator Center (SLAC). Taking, processing and analyzing the very large data samples is a significant computing challenge. This presentation will describe the entire BaBar computing chain and illustrate the solutions chosen as well as their evolution with the ever higher luminosity being delivered by PEP-II. This will include data acquisition and software triggering in a high availability, low-deadtime online environment, a prompt, automated calibration pass through the data SLAC and then the full reconstruction of the data that takes place at INFN-Padova within 24 hours. Monte Carlo production takes place in a highly automated fashion in 25+ sites. The resulting real and simulated data is distributed and made available at SLAC and other computing centers. For analysis a much more sophisticated skimming pass has been introduced in the past year, ...

  19. FPGAs in High Performance Computing: Results from Two LDRD Projects.

    Energy Technology Data Exchange (ETDEWEB)

    Underwood, Keith D; Ulmer, Craig D.; Thompson, David; Hemmert, Karl Scott

    2006-11-01

    Field programmable gate arrays (FPGAs) have been used as alternative computational devices for over a decade; however, they have not been used for traditional scientific computing due to their perceived lack of floating-point performance. In recent years, there has been a surge of interest in alternatives to traditional microprocessors for high performance computing. Sandia National Labs began two projects to determine whether FPGAs would be a suitable alternative to microprocessors for high performance scientific computing and, if so, how they should be integrated into the system. We present results that indicate that FPGAs could have a significant impact on future systems. FPGAs have the potential to have order of magnitude levels of performance wins on several key algorithms; however, there are serious questions as to whether the system integration challenge can be met. Furthermore, there remain challenges in FPGA programming and system level reliability when using FPGA devices. Acknowledgment: Arun Rodrigues provided valuable support and assistance in the use of the Structural Simulation Toolkit within an FPGA context. Curtis Janssen and Steve Plimpton provided valuable insights into the workings of two Sandia applications (MPQC and LAMMPS, respectively).

  20. A computer-based measure of resultant achievement motivation.

    Science.gov (United States)

    Blankenship, V

    1987-08-01

    Three experiments were conducted to develop a computer-based measure of individual differences in resultant achievement motivation (RAM) on the basis of level-of-aspiration, achievement motivation, and dynamics-of-action theories. In Experiment 1, the number of atypical shifts and greater responsiveness to incentives on 21 trials with choices among easy, intermediate, and difficult levels of an achievement-oriented game were positively correlated and were found to differentiate the 62 subjects (31 men, 31 women) on the amount of time they spent at a nonachievement task (watching a color design) 1 week later. In Experiment 2, test-retest reliability was established with the use of 67 subjects (15 men, 52 women). Point and no-point trials were offered in blocks, with point trials first for half the subjects and no-point trials first for the other half. Reliability was higher for the atypical-shift measure than for the incentive-responsiveness measure and was higher when points were offered first. In Experiment 3, computer anxiety was manipulated by creating a simulated computer breakdown in the experimental condition. Fifty-nine subjects (13 men, 46 women) were randomly assigned to the experimental condition or to one of two control conditions (an interruption condition and a no-interruption condition). Subjects with low RAM, as demonstrated by a low number of typical shifts, took longer to choose the achievement-oriented task, as predicted by the dynamics-of-action theory. The difference was evident in all conditions and most striking in the computer-breakdown condition. A change of focus from atypical to typical shifts is discussed.

  1. Implementation of an electronic medical record system in previously computer-naïve primary care centres: a pilot study from Cyprus.

    Science.gov (United States)

    Samoutis, George; Soteriades, Elpidoforos S; Kounalakis, Dimitris K; Zachariadou, Theodora; Philalithis, Anastasios; Lionis, Christos

    2007-01-01

    The computer-based electronic medical record (EMR) is an essential new technology in health care, contributing to high-quality patient care and efficient patient management. The majority of southern European countries, however, have not yet implemented universal EMR systems and many efforts are still ongoing. We describe the development of an EMR system and its pilot implementation and evaluation in two previously computer-naïve public primary care centres in Cyprus. One urban and one rural primary care centre along with their personnel (physicians and nurses) were selected to participate. Both qualitative and quantitative evaluation tools were used during the implementation phase. Qualitative data analysis was based on the framework approach, whereas quantitative assessment was based on a nine-item questionnaire and EMR usage parameters. Two public primary care centres participated, and a total of ten health professionals served as EMR system evaluators. Physicians and nurses rated EMR relatively highly, while patients were the most enthusiastic supporters for the new information system. Major implementation impediments were the physicians' perceptions that EMR usage negatively affected their workflow, physicians' legal concerns, lack of incentives, system breakdowns, software design problems, transition difficulties and lack of familiarity with electronic equipment. The importance of combining qualitative and quantitative evaluation tools is highlighted. More efforts are needed for the universal adoption and routine use of EMR in the primary care system of Cyprus as several barriers to adoption exist; however, none is insurmountable. Computerised systems could improve efficiency and quality of care in Cyprus, benefiting the entire population.

  2. Methodics of computing the results of monitoring the exploratory gallery

    Directory of Open Access Journals (Sweden)

    Krúpa Víazoslav

    2000-09-01

    Full Text Available At the building site of the Višňové-Dubná skala motorway tunnel, priority is given to the driving of an exploratory gallery that provides detailed geological, engineering-geological, hydrogeological and geotechnical research. This research is based on gathering information for the planned use of a full-profile driving machine that would drive the motorway tunnel. In the part of the exploratory gallery driven by the TBM method, comprehensive information about the parameters of the driving process is gathered by a computer monitoring system mounted on the driving machine. This monitoring system is based on the industrial computer PC 104. It records 4 basic values of the driving process: the electromotor performance of the driving machine Voest-Alpine ATB 35HA, the speed of the driving advance, the rotation speed of the disintegrating head of the TBM, and the total head pressure. The pressure force is evaluated from the pressure in the hydraulic cylinders of the machine. From these values, the strength of the rock mass, the angle of internal friction, etc. are mathematically calculated; these values characterize rock mass properties and their changes. To define the effectiveness of the driving process, the value of specific energy and the working ability of the driving head are used. The article defines the methodics of computing the gathered monitoring information, prepared for the driving machine Voest-Alpine ATB 35H at the Institute of Geotechnics SAS. It describes the input forms (protocols) of the developed method created in an EXCEL program and shows selected samples of the graphical elaboration of the first monitoring results obtained from the exploratory gallery driving process in the Višňové-Dubná skala motorway tunnel.
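
    The specific energy used above as an effectiveness measure is commonly defined as cutterhead power divided by the volumetric excavation rate (face area times advance rate). The sketch below, with hypothetical machine values, illustrates that calculation; it is not the exact formula used at the Institute of Geotechnics SAS.

```python
import math

def specific_energy_mj_per_m3(power_kw, advance_m_per_h, head_diameter_m):
    """Specific energy of excavation: cutterhead power divided by the
    volumetric excavation rate (face area x advance rate)."""
    area_m2 = math.pi * (head_diameter_m / 2) ** 2
    volume_rate_m3_per_h = area_m2 * advance_m_per_h
    power_mj_per_h = power_kw * 3.6  # 1 kWh = 3.6 MJ
    return power_mj_per_h / volume_rate_m3_per_h

# Hypothetical values for an exploratory-gallery TBM: 350 kW cutterhead
# power, 2.0 m/h advance, 3.5 m head diameter.
print(round(specific_energy_mj_per_m3(350, 2.0, 3.5), 1))
```

    Higher specific energy at constant machine settings indicates harder, less efficiently disintegrated rock, which is why the quantity tracks changes in rock mass properties along the gallery.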

  3. Thermodynamic properties of indan: Experimental and computational results

    International Nuclear Information System (INIS)

    Chirico, Robert D.; Steele, William V.; Kazakov, Andrei F.

    2016-01-01

    Highlights: • Heat capacities were measured for the temperature range (5 to 445) K. • Vapor pressures were measured for the temperature range (338 to 495) K. • Densities at saturation pressure were measured from T = (323 to 523) K. • Computed and experimentally derived properties for ideal gas entropies are in excellent accord. • Thermodynamic consistency analysis revealed anomalous literature data. - Abstract: Measurements leading to the calculation of thermodynamic properties in the ideal-gas state for indan (Chemical Abstracts registry number [496-11-7], 2,3-dihydro-1H-indene) are reported. Experimental methods were adiabatic heat-capacity calorimetry, differential scanning calorimetry, comparative ebulliometry, and vibrating-tube densitometry. Molar thermodynamic functions (enthalpies, entropies, and Gibbs energies) for the condensed and ideal-gas states were derived from the experimental studies at selected temperatures. Statistical calculations were performed based on molecular geometry optimization and vibrational frequencies calculated at the B3LYP/6-31+G(d, p) level of theory. Computed ideal-gas properties derived with the rigid-rotor harmonic-oscillator approximation are shown to be in excellent accord with ideal-gas entropies derived from thermophysical property measurements of this research, as well as with experimental heat capacities for the ideal-gas state reported in the literature. Literature spectroscopic studies and ab initio calculations report a range of values for the barrier to ring puckering. Results of the present work are consistent with a large barrier that allows use of the rigid-rotor harmonic-oscillator approximation for ideal-gas entropy and heat-capacity calculations, even with the stringent uncertainty requirements imposed by the calorimetric and physical property measurements reported here. All experimental results are compared with property values reported in the literature.
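
    In the rigid-rotor harmonic-oscillator approximation used in the paper, each vibrational mode contributes a closed-form term to the ideal-gas entropy, S_vib = R [x/(e^x - 1) - ln(1 - e^(-x))] with x = hcν̃/kT. The sketch below shows that per-mode term; the function name and example wavenumbers are illustrative, not indan's actual B3LYP frequencies.

```python
import math

R = 8.31446          # gas constant, J mol^-1 K^-1
H = 6.62607e-34      # Planck constant, J s
KB = 1.380649e-23    # Boltzmann constant, J K^-1
C = 2.99792458e10    # speed of light, cm s^-1 (wavenumbers are in cm^-1)

def vib_entropy(wavenumbers_cm, temperature_k):
    """Vibrational entropy (J mol^-1 K^-1) in the harmonic-oscillator
    approximation, summed over the given normal-mode wavenumbers."""
    s = 0.0
    for nu in wavenumbers_cm:
        x = H * C * nu / (KB * temperature_k)
        s += R * (x / math.expm1(x) - math.log(-math.expm1(-x)))
    return s

# A soft 100 cm^-1 mode contributes far more entropy at 298.15 K than a
# stiff 1000 cm^-1 mode, which is why low-frequency modes (such as a
# ring-puckering mode with a low barrier) dominate the vibrational entropy.
print(vib_entropy([100.0], 298.15) > vib_entropy([1000.0], 298.15))
```

    This sensitivity to low frequencies is also why the paper's conclusion about a large ring-puckering barrier matters for the validity of the harmonic treatment.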

  4. Computer processing of the Δλ/λ measured results

    International Nuclear Information System (INIS)

    Draguniene, V.J.; Makariuniene, E.K.

    1979-01-01

    For processing experimental data on the influence of the chemical environment on radioactive decay constants, five programs were written in Fortran for the DUBNA monitoring system on the BESM-6 computer. Each program corresponds to a definite stage of data processing and yields a definite answer. The first and second programs calculate the ratio of the pulse numbers measured with different sources and the mean value of the dispersions. The third program averages the ratios of the pulse numbers. The fourth and fifth determine the change of the radioactive decay constant. The programs permit processing of the experimental data starting from the pulse numbers obtained directly in the experiments, can treat a whole file of experimental results, and calculate the various errors at all stages of the computation. The printed output is formatted for convenient use.
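The first stages described (pulse-number ratios and their averaging with errors) amount to standard Poisson error propagation and an inverse-variance weighted mean. A compact sketch with illustrative function names, not the original Fortran routines:

```python
import math

def ratio_with_error(n1, n2):
    """Ratio of two pulse counts with Poisson (sqrt(N)) error propagation."""
    r = n1 / n2
    sigma = r * math.sqrt(1.0 / n1 + 1.0 / n2)
    return r, sigma

def weighted_mean(values, sigmas):
    """Inverse-variance weighted mean of the ratios and its uncertainty."""
    weights = [1.0 / s ** 2 for s in sigmas]
    mean = sum(v * w for v, w in zip(values, weights)) / sum(weights)
    err = math.sqrt(1.0 / sum(weights))
    return mean, err
```

A relative change of the decay constant then follows from the deviation of the averaged source ratio from unity.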

  5. Transversity results and computations in symplectic field theory

    International Nuclear Information System (INIS)

    Fabert, Oliver

    2008-01-01

    Although the definition of symplectic field theory suggests that one has to count holomorphic curves in cylindrical manifolds R x V equipped with a cylindrical almost complex structure J, it is already well-known from Gromov-Witten theory that, due to the presence of multiply-covered curves, we in general cannot achieve transversality for all moduli spaces even for generic choices of J. In this thesis we treat the transversality problem of symplectic field theory in two important cases. In the first part of this thesis we are concerned with the rational symplectic field theory of Hamiltonian mapping tori, which is also called the Floer case. For this observe that in the general geometric setup for symplectic field theory, the contact manifolds can be replaced by mapping tori M_φ of symplectic manifolds (M, ω_M) with symplectomorphisms φ. While the cylindrical contact homology of M_φ is given by the Floer homologies of powers of φ, the other algebraic invariants of symplectic field theory for M_φ provide natural generalizations of symplectic Floer homology. For symplectically aspherical M and Hamiltonian φ we study the moduli spaces of rational curves and prove a transversality result, which does not need the polyfold theory by Hofer, Wysocki and Zehnder and allows us to compute the full contact homology of M_φ ≅ S¹ x M. The second part of this thesis is devoted to the branched covers of trivial cylinders over closed Reeb orbits, which are the trivial examples of punctured holomorphic curves studied in rational symplectic field theory. Since all moduli spaces of trivial curves with virtual dimension one cannot be regular, we use obstruction bundles in order to find compact perturbations making the Cauchy-Riemann operator transversal to the zero section and show that the algebraic count of elements in the resulting regular moduli spaces is zero. Once the analytical foundations of symplectic field theory are established, our result implies that the

  6. Transversity results and computations in symplectic field theory

    Energy Technology Data Exchange (ETDEWEB)

    Fabert, Oliver

    2008-02-21

    Although the definition of symplectic field theory suggests that one has to count holomorphic curves in cylindrical manifolds R x V equipped with a cylindrical almost complex structure J, it is already well-known from Gromov-Witten theory that, due to the presence of multiply-covered curves, we in general cannot achieve transversality for all moduli spaces even for generic choices of J. In this thesis we treat the transversality problem of symplectic field theory in two important cases. In the first part of this thesis we are concerned with the rational symplectic field theory of Hamiltonian mapping tori, which is also called the Floer case. For this observe that in the general geometric setup for symplectic field theory, the contact manifolds can be replaced by mapping tori M_φ of symplectic manifolds (M, ω_M) with symplectomorphisms φ. While the cylindrical contact homology of M_φ is given by the Floer homologies of powers of φ, the other algebraic invariants of symplectic field theory for M_φ provide natural generalizations of symplectic Floer homology. For symplectically aspherical M and Hamiltonian φ we study the moduli spaces of rational curves and prove a transversality result, which does not need the polyfold theory by Hofer, Wysocki and Zehnder and allows us to compute the full contact homology of M_φ ≅ S¹ x M. The second part of this thesis is devoted to the branched covers of trivial cylinders over closed Reeb orbits, which are the trivial examples of punctured holomorphic curves studied in rational symplectic field theory. Since all moduli spaces of trivial curves with virtual dimension one cannot be regular, we use obstruction bundles in order to find compact perturbations making the Cauchy-Riemann operator transversal to the zero section and show that the algebraic count of elements in the resulting regular moduli spaces is zero.
Once the analytical foundations of symplectic

  7. Bomb-Pulse Chlorine-36 At The Proposed Yucca Mountain Repository Horizon: An Investigation Of Previous Conflicting Results And Collection Of New Data

    International Nuclear Information System (INIS)

    J. Cizdziel

    2006-01-01

    Previous studies by scientists at Los Alamos National Laboratory (LANL) found elevated ratios of chlorine-36 to total chloride (³⁶Cl/Cl) in samples of rock collected from the Exploratory Studies Facility (ESF) and the Enhanced Characterization of the Repository Block (ECRB) at Yucca Mountain as the tunnels were excavated. The data were interpreted as an indication that fluids containing 'bomb-pulse' ³⁶Cl reached the repository horizon in the ∼50 years since the peak period of above-ground nuclear testing. Moreover, the data support the concept that so-called fast pathways for infiltration not only exist but are active, possibly through a combination of porous media, faults and/or other geologic features. Due to the significance of ³⁶Cl data to conceptual models of unsaturated zone flow and transport, the United States Geological Survey (USGS) was requested by the Department of Energy (DOE) to design and implement a study to validate the LANL findings. The USGS chose to drill new boreholes at select locations across zones where bomb-pulse ratios had previously been identified. The drill cores were analyzed at Lawrence Livermore National Laboratory (LLNL) for ³⁶Cl/Cl using both active and passive leaches, with the USGS/LLNL concluding that the active leach extracted too much rock-Cl and the passive leach did not show bomb-pulse ratios. Because consensus was not reached between the USGS/LLNL and LANL on several fundamental points, including the conceptual strategy for sampling, interpretation and use of tritium (³H) data, and the importance and interpretation of blanks, in addition to the presence or absence of bomb-pulse ³⁶Cl, an evaluation by an independent entity, the University of Nevada, Las Vegas (UNLV), using new samples was initiated. This report is the result of that study. 
The overall objectives of the UNLV study were to investigate the source or sources of the conflicting results from the previous validation study, and to obtain additional data to

  8. Positron Computed Tomography: Current State, Clinical Results and Future Trends

    Science.gov (United States)

    Schelbert, H. R.; Phelps, M. E.; Kuhl, D. E.

    1980-09-01

    An overview is presented of positron computed tomography: its advantages over single photon emission tomography, its use in metabolic studies of the heart and chemical investigation of the brain, and future trends. (ACR)

  9. Positron computed tomography: current state, clinical results and future trends

    Energy Technology Data Exchange (ETDEWEB)

    Schelbert, H.R.; Phelps, M.E.; Kuhl, D.E.

    1980-09-01

    An overview is presented of positron computed tomography: its advantages over single photon emission tomography, its use in metabolic studies of the heart and chemical investigation of the brain, and future trends. (ACR)

  10. Positron computed tomography: current state, clinical results and future trends

    International Nuclear Information System (INIS)

    Schelbert, H.R.; Phelps, M.E.; Kuhl, D.E.

    1980-01-01

    An overview is presented of positron computed tomography: its advantages over single photon emission tomography, its use in metabolic studies of the heart and chemical investigation of the brain, and future trends

  11. Evaluation of results for computed tomography in head region

    International Nuclear Information System (INIS)

    Himeji, Toshiharu

    1983-01-01

    Over the 2 years and 5 months from April 1980 to May 1982, computed tomography (CT) examinations of the head region were performed with a TCT-60A scanner (TOSHIBA); the results are evaluated as follows. 1) CT was performed on 1228 patients, for a total of 1513 scans: plain CT (86.1%), contrast-enhanced (CE) CT (7.3%), and both methods (6.6%). Patients underwent from one scan (85.3%), two scans (9.6%), or three scans (3.3%), up to seven scans. The cases comprised 720 males (58.6%) and 508 females (41.4%); most patients were between 40 and over 70 years old, but young patients (under 10 years old) accounted for 15.3%. This reflects the advantage of CT as an easy and safe procedure that causes no bodily injury. 2) Most patients were referred from the internal medicine clinic, followed by the pediatric clinic, surgery, and the orthopedic department; cases came from all clinical departments of our hospital (CT scanning was very useful for neurological examination). 3) CT diagnoses included cerebral infarction in 128 cases (10.4%), cerebral hemorrhage in 19 (1.5%), and brain tumor in 24 (2.3%), with small numbers of other craniocerebral diseases. 4) Patients from internal medicine most often complained of cerebrovascular symptoms; in the pediatric clinic the chief complaints were suspected mental retardation and neurological signs; in surgery, suspected brain metastases from other malignancies; and in orthopedic surgery, skull injury or traffic accidents. (J.P.N.)

  12. [Usage patterns of internet and computer games : Results of an observational study of Tyrolean adolescents].

    Science.gov (United States)

    Riedl, David; Stöckl, Andrea; Nussbaumer, Charlotte; Rumpold, Gerhard; Sevecke, Kathrin; Fuchs, Martin

    2016-12-01

    The use of digital media such as the Internet and computer games has greatly increased. In the western world, almost all young people regularly use these technologies. Against this background, forms of use with possible negative consequences for young people have been recognized and scientifically examined. The aim of our study was therefore to investigate the prevalence of pathological use of these technologies in a sample of young Tyroleans. 398 students (average age 15.2 years, SD ± 2.3 years, 34.2% female) were surveyed by means of the structured questionnaires CIUS (Internet), CSV-S (computer games) and SWE (self-efficacy). Additionally, sociodemographic data were collected. In line with previous studies, 7.7% of the adolescents in our sample met criteria for problematic internet use and 3.3% for pathological internet use; 5.4% of the sample reported pathological computer game usage. The most important factor influencing our results was the gender of the subjects: intensive users of the Internet and computer games were more often young men, whereas young women showed significantly fewer signs of pathological computer game use. A significant percentage of Tyrolean adolescents showed difficulties in developing competent media use, indicating the growing importance of prevention measures such as media education. In a follow-up project, a sample of adolescents with mental disorders will be examined concerning their media use and compared with our school sample.

  13. Antimicrobial usage in German acute care hospitals: results of the third national point prevalence survey and comparison with previous national point prevalence surveys.

    Science.gov (United States)

    Aghdassi, Seven Johannes Sam; Gastmeier, Petra; Piening, Brar Christian; Behnke, Michael; Peña Diaz, Luis Alberto; Gropmann, Alexander; Rosenbusch, Marie-Luise; Kramer, Tobias Siegfried; Hansen, Sonja

    2018-04-01

    Previous point prevalence surveys (PPSs) revealed the potential for improving antimicrobial usage (AU) in German acute care hospitals. Data from the 2016 German national PPS on healthcare-associated infections and AU were used to evaluate efforts in antimicrobial stewardship (AMS). A national PPS in Germany was organized by the German National Reference Centre for Surveillance of Nosocomial Infections in 2016 as part of the European PPS initiated by the ECDC. The data were collected in May and June 2016. Results were compared with data from the PPS 2011. A total of 218 hospitals with 64 412 observed patients participated in the PPS 2016. The prevalence of patients with AU was 25.9% (95% CI 25.6%-26.3%). No significant increase or decrease in AU prevalence was revealed in the group of all participating hospitals. Prolonged surgical prophylaxis was found to be common (56.1% of all surgical prophylaxes on the prevalence day), but significantly less prevalent than in 2011 (P < 0.01). The most frequently administered antimicrobial groups were penicillins plus β-lactamase inhibitors (BLIs) (23.2%), second-generation cephalosporins (12.9%) and fluoroquinolones (11.3%). Significantly more penicillins plus BLIs and fewer second-generation cephalosporins and fluoroquinolones were used in 2016. Overall, an increase in the consumption of broad-spectrum antimicrobials was noted. For 68.7% of all administered antimicrobials, the indication was documented in the patient notes. The current data reaffirm the points of improvement that previous data identified and reveal that recent efforts in AMS in German hospitals require further intensification.
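The quoted prevalence with its 95% CI (25.9%, 25.6%-26.3%) is a binomial proportion with a normal-approximation (Wald) interval, which is excellent at n = 64 412. A quick sketch; the numerator below is back-calculated from the reported prevalence and is therefore approximate:

```python
import math

def prevalence_ci(k, n, z=1.96):
    """Point estimate and Wald 95% CI for a prevalence of k cases among n patients."""
    p = k / n
    se = math.sqrt(p * (1.0 - p) / n)
    return p, p - z * se, p + z * se
```

With k = 16683 and n = 64412 this approximately reproduces the interval reported above, to within rounding of the published numbers.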

  14. Energy-resolved computed tomography: first experimental results

    International Nuclear Information System (INIS)

    Shikhaliev, Polad M

    2008-01-01

    First experimental results with energy-resolved computed tomography (CT) are reported. The contrast-to-noise ratio (CNR) in CT has been improved with x-ray energy weighting for the first time. Further, x-ray energy weighting improved the CNR in material decomposition CT when applied to CT projections prior to dual-energy subtraction. The existing CT systems use an energy (charge) integrating x-ray detector that provides a signal proportional to the energy of the x-ray photon. Thus, the x-ray photons with lower energies are scored less than those with higher energies. This underestimates the contribution of lower energy photons that would provide higher contrast. The highest CNR can be achieved if the x-ray photons are scored by a factor that would increase as the x-ray energy decreases. This could be performed by detecting each x-ray photon separately and measuring its energy. The energy selective CT data could then be saved, and any weighting factor could be applied digitally to a detected x-ray photon. The CT system includes a photon counting detector with linear arrays of pixels made from cadmium zinc telluride (CZT) semiconductor. A cylindrical phantom with 10.2 cm diameter made from tissue-equivalent material was used for CT imaging. The phantom included contrast elements representing calcifications, iodine, adipose and glandular tissue. The x-ray tube voltage was 120 kVp. The energy selective CT data were acquired, and used to generate energy-weighted and material-selective CT images. The energy-weighted and material decomposition CT images were generated using a single CT scan at a fixed x-ray tube voltage. For material decomposition the x-ray spectrum was digitally split into low- and high-energy parts and dual-energy subtraction was applied. The x-ray energy weighting resulted in CNR improvement of calcifications and iodine by a factor of 1.40 and 1.63, respectively, as compared to conventional charge integrating CT. 
The x-ray energy weighting was also applied
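The weighting idea is easy to state concretely: a charge-integrating detector effectively weights each counted photon by its energy E, a plain photon counter weights all photons equally, and projection-based energy weighting boosts low energies (a weight proportional to E⁻³ is the commonly cited near-optimal choice for attenuation contrast). A sketch over coarse energy bins; the bin energies and counts in the usage are illustrative, not the CZT detector's actual binning:

```python
def weighted_signal(counts, energies_kev, mode):
    """Combine per-energy-bin photon counts into one detector signal.

    mode = "integrating": weight proportional to E   (charge-integrating CT)
    mode = "counting":    weight = 1                 (plain photon counting)
    mode = "weighted":    weight proportional to E^-3 (low-energy emphasis)
    """
    if mode == "integrating":
        weights = [e for e in energies_kev]
    elif mode == "counting":
        weights = [1.0 for _ in energies_kev]
    elif mode == "weighted":
        weights = [e ** -3.0 for e in energies_kev]
    else:
        raise ValueError(mode)
    return sum(w * c for w, c in zip(weights, counts))
```

Contrast between two materials improves under the E⁻³ weighting because the low-energy bins, where attenuation differences are largest, dominate the weighted sum.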

  15. Computer evaluation of the results of batch fermentations

    Energy Technology Data Exchange (ETDEWEB)

    Nyeste, L; Sevella, B

    1980-01-01

    A useful aid to the mathematical modeling of fermentation systems, for the kinetic evaluation of batch fermentations, is described. The generalized logistic equation may be used to describe the growth curves, substrate consumption, and product formation. A computer procedure was developed to fit the equation to experimental points, automatically determining the equation constants by an iterative algorithm for non-linear least squares. By attaching this procedure to different master programs for various fermentations, complex kinetic evaluation of fermentations becomes possible. Based on the easily treatable generalized logistic equation, different kinetic characteristics, e.g. rates, specific rates, and yields, can be calculated by computer. The possibility of committing subjective errors was thereby reduced to a minimum. Employment of the method is demonstrated on some fermentation processes, and problems arising in the course of application are discussed.
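Fitting the logistic equation is ordinarily done by iterative non-linear least squares, as the abstract describes; for a fixed carrying capacity K the fit can even be linearized, which allows a compact, dependency-free sketch (the linearization trick is our illustration, not the paper's algorithm):

```python
import math

def logistic(t, K, A, r):
    """Logistic growth curve x(t) = K / (1 + A*exp(-r*t))."""
    return K / (1.0 + A * math.exp(-r * t))

def fit_logistic_linearized(ts, ys, K):
    """Least-squares fit of A and r for known K, via ln(K/y - 1) = ln A - r*t."""
    zs = [math.log(K / y - 1.0) for y in ys]
    n = float(len(ts))
    tm = sum(ts) / n
    zm = sum(zs) / n
    slope = (sum((t - tm) * (z - zm) for t, z in zip(ts, zs))
             / sum((t - tm) ** 2 for t in ts))
    return math.exp(zm - slope * tm), -slope  # (A, r)
```

Rates and specific rates then follow by differentiating the fitted curve analytically rather than from noisy point-to-point differences.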

  16. Techniques for animation of CFD results. [computational fluid dynamics

    Science.gov (United States)

    Horowitz, Jay; Hanson, Jeffery C.

    1992-01-01

    Video animation is becoming increasingly vital to the computational fluid dynamics researcher, not just for presentation, but for recording and comparing dynamic visualizations that are beyond the current capabilities of even the most powerful graphic workstation. To meet these needs, Lewis Research Center has recently established a facility to provide users with easy access to advanced video animation capabilities. However, producing animation that is both visually effective and scientifically accurate involves various technological and aesthetic considerations that must be understood both by the researcher and those supporting the visualization process. These considerations include: scan conversion, color conversion, and spatial ambiguities.

  17. Operational statistical analysis of the results of computer-based testing of students

    Directory of Open Access Journals (Sweden)

    Виктор Иванович Нардюжев

    2018-12-01

    Full Text Available The article is devoted to the statistical analysis of the results of computer-based testing for the evaluation of educational achievements of students. The issue is relevant because computer-based testing in Russian universities has become an important method for evaluating both the educational achievements of students and the quality of the modern educational process. Using modern methods and programs for statistical analysis of computer-based testing results, and for assessing the quality of developed tests, is a relevant problem for every university teacher. The article shows how the authors solve this problem using their own program “StatInfo”. For several years the program has been successfully applied in a credit system of education at such technological stages as loading computer-based testing protocols into a database, formation of queries, and generation of reports, lists, and matrices of answers for statistical analysis of the quality of test items. The methodology, experience, and some results of its usage by university teachers are described in the article. Related topics (test development, models, algorithms, technologies, and software for large-scale computer-based testing) have been discussed by the authors in their previous publications, which are presented in the reference list.
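The per-item statistics such a program derives from a matrix of answers are classical test theory quantities: item difficulty (proportion correct) and item discrimination (point-biserial correlation of the item with the corrected total score). A minimal sketch; the function is our illustration, not part of “StatInfo”:

```python
import math

def item_stats(matrix):
    """matrix[i][j] = 1 if student i answered item j correctly, else 0.
    Returns per-item (difficulty, discrimination) lists."""
    n = len(matrix)
    totals = [sum(row) for row in matrix]
    difficulties, discriminations = [], []
    for j in range(len(matrix[0])):
        col = [row[j] for row in matrix]
        p = sum(col) / n                               # difficulty: share correct
        rest = [totals[i] - col[i] for i in range(n)]  # total score minus this item
        mr = sum(rest) / n
        cov = sum((c - p) * (t - mr) for c, t in zip(col, rest)) / n
        var_rest = sum((t - mr) ** 2 for t in rest) / n
        var_item = p * (1.0 - p)
        disc = (cov / math.sqrt(var_rest * var_item)
                if var_rest > 0 and var_item > 0 else 0.0)
        difficulties.append(p)
        discriminations.append(disc)
    return difficulties, discriminations
```

Items with very high or very low difficulty, or with near-zero (or negative) discrimination, are the usual candidates for revision.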

  18. Office and ambulatory blood pressure control with a fixed-dose combination of candesartan and hydrochlorothiazide in previously uncontrolled hypertensive patients: results of CHILI CU Soon

    Science.gov (United States)

    Mengden, Thomas; Hübner, Reinhold; Bramlage, Peter

    2011-01-01

    Background Fixed-dose combinations of candesartan 32 mg and hydrochlorothiazide (HCTZ) have been shown to be effective in clinical trials. Upon market entry we conducted a noninterventional study to document the safety and effectiveness of this fixed-dose combination in an unselected population in primary care and to compare blood pressure (BP) values obtained during office measurement (OBPM) with ambulatory blood pressure measurement (ABPM). Methods CHILI CU Soon was a prospective, noninterventional, noncontrolled, open-label, multicenter study with a follow-up of at least 10 weeks. High-risk patients aged ≥18 years with previously uncontrolled hypertension were started on candesartan 32 mg in a fixed-dose combination with either 12.5 mg or 25 mg HCTZ. OBPM and ABPM reduction and adverse events were documented. Results A total of 4131 patients (52.8% male) with a mean age of 63.0 ± 11.0 years were included. BP was 162.1 ± 14.8/94.7 ± 9.2 mmHg during office visits at baseline. After 10 weeks of candesartan 32 mg/12.5 mg or 25 mg HCTZ, mean BP had lowered to 131.7 ± 10.5/80.0 ± 6.6 mmHg (P < 0.0001 for both comparisons). The correlation between OBPM and ABPM was good (r = 0.589 for systolic BP and r = 0.389 for diastolic BP during the day). Of those who were normotensive upon OBPM, 35.1% had high ABPM during the day, 49.3% were nondippers, and 3.4% were inverted dippers. Forty-nine adverse events (1.19%) were reported, of which seven (0.17%) were regarded as serious. Conclusion Candesartan 32 mg in a fixed-dose combination with either 12.5 mg or 25 mg HCTZ is safe and effective for further BP lowering irrespective of the prior antihypertensive drug class that failed to control BP. PMID:22241950

  19. Verification of SACI-2 computer code comparing with experimental results of BIBLIS-A and LOOP-7 computer code

    International Nuclear Information System (INIS)

    Soares, P.A.; Sirimarco, L.F.

    1984-01-01

    SACI-2 is a computer code created to study the dynamic behaviour of a PWR nuclear power plant. To evaluate the quality of its results, SACI-2 was used to recalculate commissioning tests done at the BIBLIS-A nuclear power plant and to calculate postulated transients for the Angra-2 reactor. The SACI-2 results agreed well both with the BIBLIS-A commissioning-test measurements and with the Angra-2 transients calculated by the KWU LOOP-7 computer code. (E.G.) [pt

  20. Meningitis tuberculosa: Clinical findings and results of cranial computed tomography

    International Nuclear Information System (INIS)

    Trautmann, M.; Loddenkemper, R.; Hoffmann, H.G.; Krankenhaus Zehlendorf, Berlin; Allgemeines Krankenhaus Altona

    1982-01-01

    Based on nine of our own observations between 1977 and 1981, new diagnostic facilities in tuberculous meningitis are discussed. For differentiation from viral meningitis, measurement of CSF lactic acid concentration in addition to that of CSF glucose has proved to be of value in recent years. In accordance with the literature, two cases of this series which were examined for CSF lactic acid concentration showed markedly elevated levels of 8.4 and 10.4 mmol/l, respectively. In contrast, in viral meningitis values of less than 3.5 mmol/l are usually found. Additionally, the presence of hypochloremia and hyponatremia, which could be demonstrated in 6 of our 9 patients, may raise the suspicion of tuberculous etiology. In the series presented, cranial computed tomography was of greatest diagnostic value, enabling the diagnosis of hydrocephalus internus in 5, and basal arachnoiditis in 2 cases. (orig.) [de

  1. Meningitis tuberculosa: Clinical findings and results of cranial computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Trautmann, M.; Loddenkemper, R.; Hoffmann, H.G.

    1982-10-01

    Based on nine of our own observations between 1977 and 1981, new diagnostic facilities in tuberculous meningitis are discussed. For differentiation from viral meningitis, measurement of CSF lactic acid concentration in addition to that of CSF glucose has proved to be of value in recent years. In accordance with the literature, two cases of this series which were examined for CSF lactic acid concentration showed markedly elevated levels of 8.4 and 10.4 mmol/l, respectively. In contrast, in viral meningitis values of less than 3.5 mmol/l are usually found. Additionally, the presence of hypochloremia and hyponatremia, which could be demonstrated in 6 of our 9 patients, may raise the suspicion of tuberculous etiology. In the series presented, cranial computed tomography was of greatest diagnostic value, enabling the diagnosis of hydrocephalus internus in 5, and basal arachnoiditis in 2 cases.

  2. Dynamic computed tomography scanning of benign bone lesions: Preliminary results

    International Nuclear Information System (INIS)

    Levine, E.; Neff, J.R.

    1983-01-01

    The majority of benign bone lesions can be evaluated adequately using conventional radiologic techniques. However, it is not always possible to differentiate reliably between different types of benign bone lesions on the basis of plain film appearances alone. Dynamic computed tomography (CT) scanning provides a means for further characterizing such lesions by assessing their degree of vascularity. Thus, it may help in distinguishing an osteoid osteoma, which has a hypervascular nidus, from a Brodie's abscess, which is avascular. Dynamic CT scanning may also help in the differentiation between a fluid-containing simple bone cyst, which is avascular, and other solid or semi-solid benign bone lesions which show varying degrees of vascularity. However, because of the additional irradiation involved, dynamic CT scanning should be reserved for evaluation of selected patients with benign bone lesions in whom the plain film findings are not definitive and in whom the CT findings may have a significant influence on management. (orig.)

  3. Office and ambulatory blood pressure control with a fixed-dose combination of candesartan and hydrochlorothiazide in previously uncontrolled hypertensive patients: results of CHILI CU Soon

    Directory of Open Access Journals (Sweden)

    Bramlage P

    2011-12-01

    Full Text Available Thomas Mengden1, Reinhold Hübner2, Peter Bramlage3. 1Kerckhoff-Klinik GmbH, Bad Nauheim; 2Takeda Pharma GmbH, Aachen; 3Institut für Kardiovaskuläre Pharmakologie und Epidemiologie, Mahlow, Germany. Background: Fixed-dose combinations of candesartan 32 mg and hydrochlorothiazide (HCTZ) have been shown to be effective in clinical trials. Upon market entry we conducted a noninterventional study to document the safety and effectiveness of this fixed-dose combination in an unselected population in primary care and to compare blood pressure (BP) values obtained during office measurement (OBPM) with ambulatory blood pressure measurement (ABPM). Methods: CHILI CU Soon was a prospective, noninterventional, noncontrolled, open-label, multicenter study with a follow-up of at least 10 weeks. High-risk patients aged ≥18 years with previously uncontrolled hypertension were started on candesartan 32 mg in a fixed-dose combination with either 12.5 mg or 25 mg HCTZ. OBPM and ABPM reduction and adverse events were documented. Results: A total of 4131 patients (52.8% male) with a mean age of 63.0 ± 11.0 years were included. BP was 162.1 ± 14.8/94.7 ± 9.2 mmHg during office visits at baseline. After 10 weeks of candesartan 32 mg/12.5 mg or 25 mg HCTZ, mean BP had lowered to 131.7 ± 10.5/80.0 ± 6.6 mmHg (P < 0.0001 for both comparisons). BP reduction was comparable irrespective of prior or concomitant medication. In patients for whom physicians regarded an ABPM to be necessary (because of suspected noncontrol over 24 hours), ABP at baseline was 158.2/93.7 mmHg during the day and 141.8/85.2 mmHg during the night. At the last visit, BP had significantly reduced to 133.6/80.0 mmHg and 121.0/72.3 mmHg, respectively, resulting in 20.8% being normotensive over 24 hours (<130/80 mmHg). The correlation between OBPM and ABPM was good (r = 0.589 for systolic BP and r = 0.389 for diastolic BP during the day). Of those who were normotensive upon OBPM, 35.1% had high ABPM during the

  4. Presentation of RELAP5 results on the personal computer

    International Nuclear Information System (INIS)

    Salamun, I.; Stritar, A.

    1991-01-01

    DrALF is a program for graphical presentation of RELAP5 results. Results may be displayed in two different forms, as graphs with different zoom capabilities and as drawings or nodalizations with different variables displayed on a background picture. (author)

  5. Computer usage and national energy consumption: Results from a field-metering study

    Energy Technology Data Exchange (ETDEWEB)

    Desroches, Louis-Benoit; Fuchs, Heidi; Greenblatt, Jeffery; Pratt, Stacy; Willem, Henry; Claybaugh, Erin; Beraki, Bereket; Nagaraju, Mythri; Price, Sarah; Young, Scott [all: Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Analysis & Environmental Impacts Dept., Environmental Energy Technologies Division]

    2014-12-01

    The electricity consumption of miscellaneous electronic loads (MELs) in the home has grown in recent years, and is expected to continue rising. Consumer electronics, in particular, are characterized by swift technological innovation, with varying impacts on energy use. Desktop and laptop computers make up a significant share of MELs electricity consumption, but their national energy use is difficult to estimate, given uncertainties around shifting user behavior. This report analyzes usage data from 64 computers (45 desktop, 11 laptop, and 8 unknown) collected in 2012 as part of a larger field monitoring effort of 880 households in the San Francisco Bay Area, and compares our results to recent values from the literature. We find that desktop computers are used for an average of 7.3 hours per day (median = 4.2 h/d), while laptops are used for a mean 4.8 hours per day (median = 2.1 h/d). The results for laptops are likely underestimated since they can be charged in other, unmetered outlets. Average unit annual energy consumption (AEC) for desktops is estimated to be 194 kWh/yr (median = 125 kWh/yr), and for laptops 75 kWh/yr (median = 31 kWh/yr). We estimate national annual energy consumption for desktop computers to be 20 TWh. National annual energy use for laptops is estimated to be 11 TWh, markedly higher than previous estimates, likely reflective of laptops drawing more power in On mode in addition to greater market penetration. This result for laptops, however, carries relatively higher uncertainty compared to desktops. Different study methodologies and definitions, changing usage patterns, and uncertainty about how consumers use computers must be considered when interpreting our results with respect to existing analyses. 
Finally, as energy consumption in On mode is predominant, we outline several energy savings opportunities: improved power management (defaulting to low-power modes after periods of inactivity as well as power scaling), matching the rated power
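The AEC arithmetic behind this record (per-mode power draw times usage hours, scaled up to the installed stock) can be sketched as below. All wattages, the sleep/off split, and the 100-million-unit stock size are round illustrative assumptions; only the 7.3 h/d mean On time comes from the abstract.

```python
# Sketch of a unit annual energy consumption (AEC) estimate for a desktop.
# Wattages, mode split, and stock size are assumed values for illustration;
# only the 7.3 h/d mean On time is taken from the record above.

def annual_energy_kwh(mode_profile):
    """mode_profile maps mode -> (average power in W, hours per day)."""
    total_hours = sum(h for _, h in mode_profile.values())
    assert abs(total_hours - 24) < 1e-9, "mode hours must cover the day"
    return sum(w * h * 365 / 1000 for w, h in mode_profile.values())

desktop = {
    "on":    (60.0, 7.3),    # mean daily usage from the field data
    "sleep": (3.0, 4.0),     # assumed
    "off":   (1.0, 12.7),    # assumed (remainder of the day)
}
unit_aec = annual_energy_kwh(desktop)    # kWh per unit per year
national_twh = unit_aec * 100e6 / 1e9    # assumed stock of 100M desktops
print(f"unit AEC = {unit_aec:.0f} kWh/yr, national = {national_twh:.1f} TWh/yr")
```

With these placeholder inputs the sketch lands in the same order of magnitude as the report's estimates (194 kWh/yr per unit, 20 TWh nationally), which is the point of the exercise rather than a reproduction of the monitored data.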

  6. Culture Negative Listeria monocytogenes Meningitis Resulting in Hydrocephalus and Severe Neurological Sequelae in a Previously Healthy Immunocompetent Man with Penicillin Allergy

    DEFF Research Database (Denmark)

    Gaini, Shahin; Karlsen, Gunn Hege; Nandy, Anirban

    2015-01-01

    A previously healthy 74-year-old Caucasian man with penicillin allergy was admitted with evolving headache, confusion, fever, and neck stiffness. Treatment for bacterial meningitis with dexamethasone and monotherapy ceftriaxone was started. The cerebrospinal fluid showed negative microscopy...... the catheter. The patient had severe neurological sequelae. This case report emphasises the importance of covering empirically for Listeria monocytogenes in all patients with penicillin allergy with suspected bacterial meningitis. The case also shows that it is possible to have significant infection...

  7. Treatment of Previously Treated Facial Capillary Malformations: Results of Single-Center Retrospective Objective 3-Dimensional Analysis of the Efficacy of Large Spot 532 nm Lasers.

    Science.gov (United States)

    Kwiek, Bartłomiej; Ambroziak, Marcin; Osipowicz, Katarzyna; Kowalewski, Cezary; Rożalski, Michał

    2018-06-01

Current treatment of facial capillary malformations (CM) has limited efficacy. To assess the efficacy of large spot 532 nm lasers for the treatment of previously treated facial CM with the use of 3-dimensional (3D) image analysis. Forty-three white patients aged 6 to 59 years were included in this study. Patients had 3D photography performed before and after treatment with a 532 nm Nd:YAG laser with large spot and contact cooling. Objective analysis of percentage improvement based on 3D digital assessment of combined color and area improvement (global clearance effect [GCE]) was performed. The median maximal improvement achieved during the treatment (GCE) was 59.1%. The mean number of laser procedures required to achieve this improvement was 6.2 (range 1-16). Improvement of minimum 25% (GCE25) was achieved by 88.4% of patients, a minimum of 50% (GCE50) by 61.1%, a minimum of 75% (GCE75) by 25.6%, and a minimum of 90% (GCE90) by 4.6%. Patients previously treated with pulsed dye lasers showed a significantly smaller response than those treated with other modalities (GCE 37.3% vs 61.8%, respectively). A large spot 532 nm laser is effective in previously treated patients with facial CM.

  8. Introducing handheld computing into a residency program: preliminary results from qualitative and quantitative inquiry.

    OpenAIRE

    Manning, B.; Gadd, C. S.

    2001-01-01

    Although published reports describe specific handheld computer applications in medical training, we know very little yet about how, and how well, handheld computing fits into the spectrum of information resources available for patient care and physician training. This paper reports preliminary quantitative and qualitative results from an evaluation study designed to track changes in computer usage patterns and computer-related attitudes before and after introduction of handheld computing. Pre...

  9. Results of diagnosis of pancreatic cancer by computed tomography (CT)

    International Nuclear Information System (INIS)

    Kimura, Kazue; Okuaki, Koji; Ito, Masami; Katakura, Toshihiko; Suzuki, Kenji

    1981-01-01

Results of examination of pancreatic diseases, especially pancreatic cancer, conducted by CT during the past 3 years are summarized. An EMI CT scanner (Type 5000 or 5005) was used. During the 3 years from September 1976 to August 1979, a total of 1961 patients were examined by CT, and the upper abdomen was examined in 772 of these patients. In 97 patients, positive findings were obtained in the CT image of the pancreas. In 52 of these patients, the findings were confirmed operatively or by autopsy. Though cancer of the pancreas was diagnosed by CT in 30 patients, it was confirmed in 20 by surgical operation and in 1 by autopsy. Of the 9 misdiagnosed cases, 4 were cases of infiltration of the pancreas by carcinoma of the stomach or bile duct, and the other 5 were one case each of lipoma of the abdominal wall, normal pancreas, hyperplasia of the islets of Langerhans in the tail of the pancreas, abscess between the pancreas and the posterior wall of the stomach, and choledocholithiasis. A case diagnosed by CT as cholelithiasis was a carcinoma measuring 5 x 5 x 6 cm located in the head of the pancreas, complicated by choledocholithiasis. The 22 patients with carcinoma of the pancreas were 9 with lesions less than 3.5 x 3.0 x 3.0 cm in size who could be operated on radically, 6 who underwent exploratory laparotomy or autopsy, and 7 in whom operation was impossible. False negative and false positive CT results are also discussed. (author)
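The counts in this record (30 CT diagnoses of pancreatic cancer, of which 20 were confirmed at operation and 1 at autopsy, against 9 misdiagnoses) translate directly into a positive predictive value; a minimal sketch using only the numbers stated in the abstract:

```python
# Positive predictive value (PPV) of CT for pancreatic cancer,
# computed from the counts reported in this record.
ct_positive = 30          # pancreatic cancer diagnosed by CT
confirmed = 20 + 1        # confirmed by surgery (20) and autopsy (1)
misdiagnosed = 9          # false positive CT diagnoses
assert confirmed + misdiagnosed == ct_positive

ppv = confirmed / ct_positive
print(f"PPV = {ppv:.2f}")   # 21/30 = 0.70
```

Sensitivity and specificity cannot be derived from the abstract alone, since the total number of cancers among all 772 upper-abdomen examinations is not given.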

  10. Drug-Induced QT Prolongation as a Result of an Escitalopram Overdose in a Patient with Previously Undiagnosed Congenital Long QT Syndrome

    Directory of Open Access Journals (Sweden)

    Paul Singh

    2014-01-01

    Full Text Available We present a case of drug-induced QT prolongation caused by an escitalopram overdose in a patient with previously undiagnosed congenital LQTS. A 15-year-old Caucasian female presented following a suicide attempt via an escitalopram overdose. The patient was found to have a prolonged QT interval with episodes of torsades de pointes. The patient was admitted to the telemetry unit and treated. Despite the resolution of the torsades de pointes, she continued to demonstrate a persistently prolonged QT interval. She was seen by the cardiology service and diagnosed with congenital long QT syndrome. This case illustrates the potential for an escitalopram overdose to cause an acute QT prolongation in a patient with congenital LQTS and suggests the importance of a screening electrocardiogram prior to the initiation of SSRIs, especially in patients at high risk for QT prolongation.
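Screening for QT prolongation, as the case report recommends before starting SSRIs, rests on the rate-corrected QT interval. The sketch below uses the standard Bazett correction; the formula is textbook ECG practice rather than anything taken from this case, and both the example numbers and the 470 ms cut-off (one commonly cited threshold for adolescent females) are assumptions for illustration.

```python
import math

def qtc_bazett(qt_ms, heart_rate_bpm):
    """Bazett-corrected QT (ms): QTc = QT / sqrt(RR), with RR in seconds."""
    rr_s = 60.0 / heart_rate_bpm
    return qt_ms / math.sqrt(rr_s)

# Hypothetical numbers, not taken from the case report:
qtc = qtc_bazett(qt_ms=480, heart_rate_bpm=75)
prolonged = qtc > 470   # assumed screening threshold, in ms
print(f"QTc = {qtc:.0f} ms, prolonged: {prolonged}")
```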

  11. Results of comparative RBMK neutron computation using VNIIEF codes (cell computation, 3D statics, 3D kinetics). Final report

    Energy Technology Data Exchange (ETDEWEB)

Grebennikov, A.N.; Zhitnik, A.K.; Zvenigorodskaya, O.A. [and others]

    1995-12-31

In conformity with the protocol of the Workshop under Contract "Assessment of RBMK reactor safety using modern Western Codes", VNIIEF performed a series of neutronics computations to compare Western and VNIIEF codes and to assess whether VNIIEF codes are suitable for safety assessment computations for RBMK-type reactors. The work was carried out in close collaboration with M.I. Rozhdestvensky and L.M. Podlazov, NIKIET employees. The effort involved: (1) cell computations with the WIMS and EKRAN codes (an improved modification of the LOMA code) and the S-90 code (VNIIEF Monte Carlo), covering cell, polycell, and burnup computations; (2) 3D computation of static states with the KORAT-3D and NEU codes and comparison with results of computation with the NESTLE code (USA), performed in the geometry and with the neutron constants provided by the American party; (3) 3D computation of neutron kinetics with the KORAT-3D and NEU codes. These computations were performed in two formulations, both developed in collaboration with NIKIET. The formulation of the first problem agrees as closely as possible with one of the NESTLE problems and imitates gas bubble travel through a core. The second problem is a model of the RBMK as a whole with imitation of the movement of control and protection system (CPS) controls in the core.

  12. Computational bone remodelling simulations and comparisons with DEXA results.

    Science.gov (United States)

    Turner, A W L; Gillies, R M; Sekel, R; Morris, P; Bruce, W; Walsh, W R

    2005-07-01

    Femoral periprosthetic bone loss following total hip replacement is often associated with stress shielding. Extensive bone resorption may lead to implant or bone failure and complicate revision surgery. In this study, an existing strain-adaptive bone remodelling theory was modified and combined with anatomic three-dimensional finite element models to predict alterations in periprosthetic apparent density. The theory incorporated an equivalent strain stimulus and joint and muscle forces from 45% of the gait cycle. Remodelling was simulated for three femoral components with different design philosophies: cobalt-chrome alloy, two-thirds proximally coated; titanium alloy, one-third proximally coated; and a composite of cobalt-chrome surrounded by polyaryletherketone, fully coated. Theoretical bone density changes correlated significantly with clinical densitometry measurements (DEXA) after 2 years across the Gruen zones (R2>0.67, p<0.02), with average differences of less than 5.4%. The results suggest that a large proportion of adaptive bone remodelling changes seen clinically with these implants may be explained by a consistent theory incorporating a purely mechanical stimulus. This theory could be applied to pre-clinical testing of new implants, investigation of design modifications, and patient-specific implant selection.
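The strain-adaptive theory referenced in this record follows the general form of Huiskes-style remodelling rules. The sketch below is a minimal one-element version with a dead band ("lazy zone") around the reference stimulus; all parameter values are assumed for illustration and are not the modified theory or loading data used in the paper.

```python
# Minimal sketch of a strain-adaptive bone remodelling rule with a lazy zone:
# apparent density rho changes only when the local stimulus S deviates from
# the reference stimulus S_ref by more than a dead-band fraction s.
# Parameter values are illustrative, not taken from the paper.

def remodel(rho, S, S_ref, B=1.0, s=0.35, dt=0.1, rho_min=0.01, rho_max=1.73):
    """One explicit time step of d(rho)/dt = B * (stimulus error)."""
    lo, hi = (1 - s) * S_ref, (1 + s) * S_ref
    if S > hi:                  # overloading -> densification
        rho += B * (S - hi) * dt
    elif S < lo:                # stress shielding -> resorption
        rho += B * (S - lo) * dt
    return min(max(rho, rho_min), rho_max)   # clamp to physical bounds (g/cm^3)

rho = 1.0
for _ in range(100):            # shielded element: stimulus well below the band
    rho = remodel(rho, S=0.4, S_ref=1.0)
print(f"density after shielding: {rho:.2f} g/cm^3")
```

Run over a full finite element mesh with element-wise stimuli from the gait-cycle loads, this kind of update converges toward the periprosthetic density patterns that are then compared against DEXA zones.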

  13. Late tamoxifen in patients previously operated for breast cancer without postoperative tamoxifen: 5-year results of a single institution randomised study

    International Nuclear Information System (INIS)

    Veronesi, Andrea; Miolo, GianMaria; Magri, Maria D; Crivellari, Diana; Scalone, Simona; Bidoli, Ettore; Lombardi, Davide

    2010-01-01

A population of breast cancer patients exists who, for various reasons, never received adjuvant post-operative tamoxifen (TAM). This study aimed to evaluate the role of late TAM in these patients. From 1997 to 2003, patients aged 35 to 75 years, operated more than 2 years previously for monolateral breast cancer without adjuvant TAM, with no signs of metastases and no contraindication to TAM were randomized to TAM 20 mg/day orally for 2 years or follow-up alone. Events were categorized as locoregional relapse, distant metastases, metachronous breast cancer, tumours other than breast cancer and death from any causes, whichever occurred first. The sample size (197 patients per arm, plus 10% allowance) was based on the assumption of a 30% decrease in the number of events occurring at a rate of 5% annually in the 10 years following randomization. Four hundred and thirty-three patients were randomized in the study (TAM 217, follow-up 216). Patients characteristics (TAM/follow-up) included: median age 55/55 years, median time from surgery 25/25 months (range, 25-288/25-294), in situ carcinoma 18/24, oestrogen receptor (ER) positive in 75/68, negative in 70/57, unknown in 72/91 patients. Previous adjuvant treatment included chemotherapy in 131/120 and an LHRH analogue in 11/13 patients. Thirty-six patients prematurely discontinued TAM after a median of 1 month, mostly because of subjective intolerance. Eighty-three events (TAM 39, follow-up 44) occurred: locoregional relapse in 10/8, distant metastases in 14/16, metachronous breast cancer in 4/10, other tumours in 11/10 patients. Fewer ER-positive secondary breast cancers occurred in the TAM treated patients than in follow-up patients (1 vs 10, p = 0.005). Event-free survival was similar in both groups of patients. This 5-year analysis revealed significantly fewer metachronous ER-positive breast cancers in the TAM treated patients. No other statistically significant differences have emerged thus far

  14. Lack of association of variants previously associated with anti-TNF medication response in rheumatoid arthritis patients: results from a homogeneous Greek population.

    Directory of Open Access Journals (Sweden)

    Maria I Zervou

Full Text Available Treatment strategies blocking tumor necrosis factor (anti-TNF) have proven very successful in patients with rheumatoid arthritis (RA), showing beneficial effects in approximately 50-60% of the patients. However, a significant subset of patients does not respond to anti-TNF agents, for reasons that are still unknown. The aim of this study was to validate five single nucleotide polymorphisms (SNPs) of PTPRC, CD226, AFF3, MyD88 and CHUK gene loci that have previously been reported to predict anti-TNF outcome. In addition, two markers of RA susceptibility, namely TRAF1/C5 and STAT4 were assessed, in a cohort of anti-TNF-treated RA patients, from the homogeneous Greek island of Crete, Greece. The RA patient cohort consisted of 183 patients treated with either of 3 anti-TNF biologic agents (infliximab, adalimumab and etanercept) from the Clinic of Rheumatology of the University Hospital of Crete. The SNPs were genotyped by TaqMan assays or following the Restriction Fragments Length Polymorphisms (RFLPs) approach. Disease activity score in 28 joints (DAS28) at baseline and after 6 months were available for all patients and analysis of good versus poor response at 6 months was performed for each SNP. None of the 7 genetic markers correlated with treatment response. We conclude that the gene polymorphisms under investigation are not strongly predictive of anti-TNF response in RA patients from Greece.

  15. Lack of association of variants previously associated with anti-TNF medication response in rheumatoid arthritis patients: results from a homogeneous Greek population.

    Science.gov (United States)

    Zervou, Maria I; Myrthianou, Efsevia; Flouri, Irene; Plant, Darren; Chlouverakis, Gregory; Castro-Giner, Francesc; Rapsomaniki, Panayiota; Barton, Anne; Boumpas, Dimitrios T; Sidiropoulos, Prodromos; Goulielmos, George N

    2013-01-01

    Treatment strategies blocking tumor necrosis factor (anti-TNF) have proven very successful in patients with rheumatoid arthritis (RA), showing beneficial effects in approximately 50-60% of the patients. However, a significant subset of patients does not respond to anti-TNF agents, for reasons that are still unknown. The aim of this study was to validate five single nucleotide polymorphisms (SNPs) of PTPRC, CD226, AFF3, MyD88 and CHUK gene loci that have previously been reported to predict anti-TNF outcome. In addition, two markers of RA susceptibility, namely TRAF1/C5 and STAT4 were assessed, in a cohort of anti-TNF-treated RA patients, from the homogeneous Greek island of Crete, Greece. The RA patient cohort consisted of 183 patients treated with either of 3 anti-TNF biologic agents (infliximab, adalimumab and etanercept) from the Clinic of Rheumatology of the University Hospital of Crete. The SNPs were genotyped by TaqMan assays or following the Restriction Fragments Length Polymorphisms (RFLPs) approach. Disease activity score in 28 joints (DAS28) at baseline and after 6 months were available for all patients and analysis of good versus poor response at 6 months was performed for each SNP. None of the 7 genetic markers correlated with treatment response. We conclude that the gene polymorphisms under investigation are not strongly predictive of anti-TNF response in RA patients from Greece.

  16. Culture Negative Listeria monocytogenes Meningitis Resulting in Hydrocephalus and Severe Neurological Sequelae in a Previously Healthy Immunocompetent Man with Penicillin Allergy

    Directory of Open Access Journals (Sweden)

    Shahin Gaini

    2015-01-01

    Full Text Available A previously healthy 74-year-old Caucasian man with penicillin allergy was admitted with evolving headache, confusion, fever, and neck stiffness. Treatment for bacterial meningitis with dexamethasone and monotherapy ceftriaxone was started. The cerebrospinal fluid showed negative microscopy for bacteria, no bacterial growth, and negative polymerase chain reaction for bacterial DNA. The patient developed hydrocephalus on a second CT scan of the brain on the 5th day of admission. An external ventricular catheter was inserted and Listeria monocytogenes grew in the cerebrospinal fluid from the catheter. The patient had severe neurological sequelae. This case report emphasises the importance of covering empirically for Listeria monocytogenes in all patients with penicillin allergy with suspected bacterial meningitis. The case also shows that it is possible to have significant infection and inflammation even with negative microscopy, negative cultures, and negative broad range polymerase chain reaction in cases of Listeria meningitis. Follow-up spinal taps can be necessary to detect the presence of Listeria monocytogenes.

  17. Can GSTM1 and GSTT1 polymorphisms predict clinical outcomes of chemotherapy in gastric and colorectal cancers? A result based on the previous reports

    Directory of Open Access Journals (Sweden)

    Liu H

    2016-06-01

Full Text Available Haixia Liu,1,* Wei Shi,2,* Lianli Zhao,3 Dianlu Dai,4 Jinghua Gao,5 Xiangjun Kong6 1Department of Ultrasound, 2Office of Medical Statistics, 3Human Resource Department, 4Department of Surgical Oncology, 5Department of Medical Oncology, 6Central Laboratory, Cangzhou Central Hospital, Yunhe District, Cangzhou, People’s Republic of China *These authors contributed equally to this study and should be considered co-first authors Background: Gastric and colorectal cancers remain the major causes of cancer-related death. Although chemotherapy improves the prognosis of the patients with gastrointestinal cancers, some patients do not benefit from therapy and are exposed to the adverse effects. The polymorphisms in genes including GSTM1 and GSTT1 have been explored to predict therapeutic efficacy; however, the results were inconsistent and inconclusive. Materials and methods: A systematic review and meta-analysis was performed by searching relevant studies about the association between the GSTM1 and GSTT1 polymorphisms and chemotherapy efficacy in gastrointestinal cancers in databases such as PubMed, EMBASE, Web of Science, Chinese National Knowledge Infrastructure, and Wanfang database up to January 10, 2016. Subgroup analyses were also performed according to ethnicity, cancer type, evaluation criteria, study type, chemotherapy type, and age. Results: A total of 19 articles containing 3,217 cases were finally included. Overall analysis suggested that no significant association was found between the GSTM1 and GSTT1 polymorphisms and overall toxicity, neurotoxicity, neutropenia, gastrointestinal toxicity, tumor response, or progression-free survival, while the GSTM1 polymorphism was associated with overall survival (OS; hazard ratio = 1.213, 95% confidence interval = 1.060–1.388, P = 0.005). Subgroup analyses suggested that neurotoxicity was associated with GSTM1 polymorphism in the Asian population, neutropenia was associated with GSTM1 polymorphism in palliative

  18. Contamination of the Alluvium of the Nitra River in Slovakia by Cadmium, Mercury and Lead as a Result of Previous Intense Industrial Activity.

    Science.gov (United States)

    Vollmannova, A; Kujovsky, M; Stanovic, R; Arvay, J; Harangozo, L

    2016-10-01

    The Nitra river is one of the most polluted rivers in the Slovak Republic. The aim of the study was to estimate the risk of Cd, Pb and Hg contamination of riverside sediments and alluvial soil in the vicinity of the Nitra river. The pseudototal Cd (all Cd forms except for residual fraction) and total Hg contents in riverside sediments (0.74-1.88 and 0.06-5.44 mg/kg, respectively) exceeded the limits for Cd and Hg in sandy soils (0.4 and 0.15 mg/kg). In three chosen localities in the flood plain of the Nitra river the soil content of mobile Pb forms (0.10-0.32 mg/kg), the pseudototal Cd (0.25-2.52 mg/kg) and total Hg content (0.03-1.6 mg/kg) exceeded the limits for Pb, Cd and Hg in loamy soils (0.1, 0.7 and 0.5 mg/kg, respectively). The obtained results confirmed the risk of Pb, Cd, Hg contamination caused by industrial activity in the vicinity of the Nitra river.
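The limit comparisons in this record reduce to simple ratio checks. The sketch below screens the quoted maximum concentrations against the quoted limits, with values and soil-type pairings copied from the abstract; a ratio above 1 flags an exceedance.

```python
# Screening the reported maxima (mg/kg) against the limit values quoted in
# this record: sandy-soil limits for riverside sediments, loamy-soil limits
# for the alluvial soils of the Nitra flood plain.
sediments_sandy = {            # element: (max measured, limit)
    "Cd (pseudototal)": (1.88, 0.4),
    "Hg (total)":       (5.44, 0.15),
}
alluvial_loamy = {
    "Pb (mobile)":      (0.32, 0.1),
    "Cd (pseudototal)": (2.52, 0.7),
    "Hg (total)":       (1.60, 0.5),
}

for name, table in [("riverside sediments", sediments_sandy),
                    ("alluvial soils", alluvial_loamy)]:
    for element, (measured, limit) in table.items():
        ratio = measured / limit
        print(f"{name}: {element}: {measured} vs limit {limit} -> {ratio:.1f}x")
```

The largest exceedance in the record is total Hg in the sediments, whose maximum is more than 30 times the sandy-soil limit.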

  19. Computations for the 1:5 model of the THTR pressure vessel compared with experimental results

    International Nuclear Information System (INIS)

    Stangenberg, F.

    1972-01-01

In this report, experimental results measured in 1971 on the 1:5 model of the prestressed concrete pressure vessel of the THTR nuclear power station Schmehausen are compared with the results of axisymmetric computations. Linear-elastic computations were performed, as well as approximate computations for overload pressures taking into consideration the influences of the load history (prestressing, temperature, creep) and the effects of the steel components. (orig.) [de]

  20. Some gender issues in educational computer use: results of an international comparative survey

    OpenAIRE

    Janssen Reinen, I.A.M.; Plomp, T.

    1993-01-01

    In the framework of the Computers in Education international study of the International Association for the Evaluation of Educational Achievement (IEA), data have been collected concerning the use of computers in 21 countries. This article examines some results regarding the involvement of women in the implementation and use of computers in the educational practice of elementary, lower secondary and upper secondary education in participating countries. The results show that in many countries ...

  1. Computation and experiment results of the grounding model of Three Gorges Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Wen Xishan; Zhang Yuanfang; Yu Jianhui; Chen Cixuan [Wuhan University of Hydraulic and Electrical Engineering (China); Qin Liming; Xu Jun; Shu Lianfu [Yangtze River Water Resources Commission, Wuhan (China)

    1999-07-01

A model for the computation of the grounding parameters of the grids of Three Gorges Power Plant (TGPP) on the Yangtze River is presented in this paper. Using this model, computation and analysis of the grounding grids are carried out. The results show that the reinforcement grid of the dam is the main body for current dissipation; it must be reliably welded to form a good grounding grid. The experimental results show that the method and program of the computations are correct. (UK)
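The record does not reproduce the grounding model itself. As an illustration of the kind of lumped parameter such models ultimately produce, the sketch below evaluates Sverak's grid-resistance approximation from IEEE Std 80 with purely hypothetical inputs; it is not the authors' formulation and the numbers are not TGPP data.

```python
import math

def grid_resistance(rho, area_m2, total_conductor_m, depth_m):
    """Sverak's approximation (IEEE Std 80) of grounding-grid resistance (ohm).

    rho: soil resistivity (ohm-m); area_m2: area occupied by the grid;
    total_conductor_m: total buried conductor length; depth_m: burial depth.
    """
    return rho * (1.0 / total_conductor_m
                  + (1.0 / math.sqrt(20.0 * area_m2))
                  * (1.0 + 1.0 / (1.0 + depth_m * math.sqrt(20.0 / area_m2))))

# Hypothetical example values (not TGPP data):
rg = grid_resistance(rho=100.0, area_m2=10_000.0,
                     total_conductor_m=2_000.0, depth_m=0.5)
print(f"Rg = {rg:.3f} ohm")
```

As expected for such formulas, the resistance falls as the grid area grows, which is one reason large dam reinforcement grids dominate current dissipation.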

  2. Constructing a paleo-DEM in an urban area by the example of the city of Aachen, Germany: Methods and previous results

    Science.gov (United States)

    Pröschel, Bernhard; Lehmkuhl, Frank

    2017-04-01

Reconstructing paleo-landscapes in urban areas is always a special challenge since the research area often witnessed constant human impact over long time periods. Dense building development is a major difficulty, particularly in regard to accessibility to in-situ soils and archaeological findings. It is therefore necessary to use data from various sources and combine methods from different fields to gain a detailed picture of the former topography. The area occupied by the city of Aachen today looks back on a long history of human influence. Traces of human activity can be dated back to the Neolithic period. The first architectural structures and the first road network were built by the Romans about 2000 years ago. From then on, the area of Aachen was more or less continuously inhabited, forming today's city. This long history is represented by archaeological findings throughout the city. Several meters of settlement deposits, covering different eras, are present in many locations. Therefore, it can be assumed that the modern topography significantly differs from the pre-Roman topography. The main objective of this project is a reconstruction of the paleo-topography of Aachen in order to gain new insights on the spatial preconditions that the first settlers found. Moreover, further attention is given to the question whether and to what extent a paleo-DEM can help to clarify specific open archaeological and historical questions. The main source for the reconstruction is the corpus of archaeological excavation reports of the past 150 years, provided by municipal and regional archives. After analyzing these written accounts, we linked this information to drill data provided by the Geological Service of North Rhine-Westphalia. Together with additional sources like geological and hydrological maps, we generated a GIS-based terrain model. 
The result is a high-resolution terrain model, representing the undisturbed pre-Roman topography of the inner city of Aachen without any

  3. Results From the Phase III Randomized Trial of Onartuzumab Plus Erlotinib Versus Erlotinib in Previously Treated Stage IIIB or IV Non-Small-Cell Lung Cancer: METLung.

    Science.gov (United States)

    Spigel, David R; Edelman, Martin J; O'Byrne, Kenneth; Paz-Ares, Luis; Mocci, Simonetta; Phan, See; Shames, David S; Smith, Dustin; Yu, Wei; Paton, Virginia E; Mok, Tony

    2017-02-01

    Purpose The phase III OAM4971g study (METLung) examined the efficacy and safety of onartuzumab plus erlotinib in patients with locally advanced or metastatic non-small-cell lung cancer selected by MET immunohistochemistry whose disease had progressed after treatment with a platinum-based chemotherapy regimen. Patients and Methods Patients were randomly assigned at a one-to-one ratio to receive onartuzumab (15 mg/kg intravenously on day 1 of each 21-day cycle) plus daily oral erlotinib 150 mg or intravenous placebo plus daily oral erlotinib 150 mg. The primary end point was overall survival (OS) in the intent-to-treat population. Secondary end points included median progression-free survival, overall response rate, biomarker analysis, and safety. Results A total of 499 patients were enrolled (onartuzumab, n = 250; placebo, n = 249). Median OS was 6.8 versus 9.1 months for onartuzumab versus placebo (stratified hazard ratio [HR], 1.27; 95% CI, 0.98 to 1.65; P = .067), with a greater number of deaths in the onartuzumab arm (130 [52%] v 114 [46%]). Median progression-free survival was 2.7 versus 2.6 months (stratified HR, 0.99; 95% CI, 0.81 to 1.20; P = .92), and overall response rate was 8.4% and 9.6% for onartuzumab versus placebo, respectively. Exploratory analyses using MET fluorescence in situ hybridization status and gene expression showed no benefit for onartuzumab; patients with EGFR mutations showed a trend toward shorter OS with onartuzumab treatment (HR, 4.68; 95% CI, 0.97 to 22.63). Grade 3 to 5 adverse events were reported by 56.0% and 51.2% of patients, with serious AEs in 33.9% and 30.7%, for experimental versus control arms, respectively. Conclusion Onartuzumab plus erlotinib did not improve clinical outcomes, with shorter OS in the onartuzumab arm, compared with erlotinib in patients with MET-positive non-small-cell lung cancer.

  4. Evidence of previous but not current transmission of chikungunya virus in southern and central Vietnam: Results from a systematic review and a seroprevalence study in four locations.

    Directory of Open Access Journals (Sweden)

    Tran Minh Quan

    2018-02-01

Full Text Available Arbovirus infections are a serious concern in tropical countries due to their high levels of transmission and morbidity. With the outbreaks of chikungunya virus (CHIKV) in surrounding regions in recent years and the fact that the environment in Vietnam is suitable for the vectors of CHIKV, the possibility of transmission of CHIKV in Vietnam is of great interest. However, information about CHIKV activity in Vietnam remains limited. In order to address this question, we performed a systematic review of CHIKV in Vietnam and a CHIKV seroprevalence survey. The seroprevalence survey tested for CHIKV IgG in population serum samples from individuals of all ages in 2015 from four locations in Vietnam. The four locations were An Giang province (n = 137), Ho Chi Minh City (n = 136), Dak Lak province (n = 137), and Hue City (n = 136). The findings give us evidence of some CHIKV activity: 73/546 of overall samples were seropositive (13.4%). The age-adjusted seroprevalences were 12.30% (6.58-18.02), 13.42% (7.16-19.68), 7.97% (3.56-12.38), and 3.72% (1.75-5.69) in An Giang province, Ho Chi Minh City, Dak Lak province, and Hue City, respectively. However, the age-stratified seroprevalence suggests that the last transmission ended around 30 years ago, consistent with results from the systematic review. We see no evidence for on-going transmission in three of the locations, though with some evidence of recent exposure in Dak Lak, most likely due to transmission in neighbouring countries. Before the 1980s, when transmission was occurring, we estimate on average 2-4% of the population were infected each year in HCMC, An Giang, and Hue (though transmission ended earlier in Hue). We estimate lower transmission in Dak Lak, with around 1% of the population infected each year. In conclusion, we find evidence of past CHIKV transmission in central and southern Vietnam, but no evidence of recent sustained transmission. When transmission of CHIKV did occur, it appeared to be widespread and
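Inferring "transmission ended around 30 years ago" from age-stratified seroprevalence is commonly done with serocatalytic models. A minimal sketch under assumed parameters (a constant force of infection of 3%/yr that stopped 30 years before the survey, illustrative values in the ranges the abstract reports, not the authors' fitted estimates):

```python
import math

# Serocatalytic sketch: if transmission ran at constant force of infection
# lam until it stopped t_stop years before the survey, a person of age `age`
# was exposed for max(0, age - t_stop) years, so
# P(seropositive) = 1 - exp(-lam * exposure).
# lam and t_stop are illustrative assumptions, not the authors' estimates.

def seroprevalence(age, lam=0.03, t_stop=30):
    exposure = max(0.0, age - t_stop)
    return 1.0 - math.exp(-lam * exposure)

for age in (20, 40, 60):
    print(f"age {age}: {100 * seroprevalence(age):.1f}% seropositive")
```

Under this model, everyone younger than the transmission cut-off is seronegative while seroprevalence rises with age among older cohorts, which is exactly the age-stratified signature the survey uses to date the end of transmission.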

  5. 14th annual Results and Review Workshop on High Performance Computing in Science and Engineering

    CERN Document Server

    Nagel, Wolfgang E; Resch, Michael M; Transactions of the High Performance Computing Center, Stuttgart (HLRS) 2011; High Performance Computing in Science and Engineering '11

    2012-01-01

    This book presents the state-of-the-art in simulation on supercomputers. Leading researchers present results achieved on systems of the High Performance Computing Center Stuttgart (HLRS) for the year 2011. The reports cover all fields of computational science and engineering, ranging from CFD to computational physics and chemistry, to computer science, with a special emphasis on industrially relevant applications. Presenting results for both vector systems and microprocessor-based systems, the book allows readers to compare the performance levels and usability of various architectures. As HLRS

  6. Columnar modelling of nucleation burst evolution in the convective boundary layer – first results from a feasibility study Part IV: A compilation of previous observations for valuation of simulation results from a columnar modelling study

    Directory of Open Access Journals (Sweden)

    O. Hellmuth

    2006-01-01

Full Text Available In the preceding Papers I, II and III a revised columnar high-order modelling approach to model gas-aerosol-turbulence interactions in the convective boundary layer (CBL) was proposed, and simulation results of two synthetic nucleation scenarios (binary vs. ternary) on new particle formation (NPF) in the anthropogenically influenced CBL were presented and discussed. The purpose of the present finishing Paper IV is twofold: Firstly, an attempt is made to compile previous observational findings on NPF bursts in the CBL, obtained from a number of field experiments. Secondly, the scenario simulations discussed in Paper III will be evaluated with respect to the role of CBL turbulence in NPF burst evolution. It was demonstrated that completely different nucleation mechanisms can lead to the occurrence of NPF bursts in the surface layer, but the corresponding evolution patterns strongly differ with respect to the origin, amplitude and phase of the NPF burst as well as with respect to the time-height evolution of turbulent vertical fluxes and double correlation terms of physicochemical and aerosol-dynamical variables. The large differences between the binary and ternary case scenarios indicate that ammonia (NH3) cannot be considered as a time-independent tuning parameter in nucleation modelling. Its contribution to the evolution of the NPF burst pattern is much more complicated and reflects the influence of CBL turbulence as well as the strong non-linearity of the ternary nucleation rate. The impact of water (H2O) vapour on the nucleation rate varies considerably depending on the nucleation mechanism considered. 
According to the classical theory of binary nucleation involving H2O and sulphuric acid (H2SO4), H2O vapour favours NPF; according to the classical theory of ternary nucleation involving H2O, H2SO4 and NH3, and according to organic nucleation via chemical reactions involving stabilised Criegee intermediates (SCIs), H2O vapour disfavours nucleation, and

  7. CT-guided Irreversible Electroporation in an Acute Porcine Liver Model: Effect of Previous Transarterial Iodized Oil Tissue Marking on Technical Parameters, 3D Computed Tomographic Rendering of the Electroporation Zone, and Histopathology

    International Nuclear Information System (INIS)

    Sommer, C. M.; Fritz, S.; Vollherbst, D.; Zelzer, S.; Wachter, M. F.; Bellemann, N.; Gockner, T.; Mokry, T.; Schmitz, A.; Aulmann, S.; Stampfl, U.; Pereira, P.; Kauczor, H. U.; Werner, J.; Radeleff, B. A.

    2015-01-01

    Purpose: To evaluate the effect of previous transarterial iodized oil tissue marking (ITM) on technical parameters, three-dimensional (3D) computed tomographic (CT) rendering of the electroporation zone, and histopathology after CT-guided irreversible electroporation (IRE) in an acute porcine liver model as a potential strategy to improve IRE performance. Methods: After Ethics Committee approval was obtained, in five landrace pigs, two IREs of the right and left liver (RL and LL) were performed under CT guidance with identical electroporation parameters. Before IRE, transarterial marking of the LL was performed with iodized oil. Nonenhanced and contrast-enhanced CT examinations followed. One hour after IRE, animals were killed and livers collected. Mean resulting voltage and amperage during IRE were assessed. For 3D CT rendering of the electroporation zone, parameters for size and shape were analyzed. Quantitative data were compared by the Mann–Whitney test. Histopathological differences were assessed. Results: Mean resulting voltage and amperage were 2,545.3 ± 66.0 V and 26.1 ± 1.8 A for RL, and 2,537.3 ± 69.0 V and 27.7 ± 1.8 A for LL, without significant differences. Short axis, volume, and sphericity index were 16.5 ± 4.4 mm, 8.6 ± 3.2 cm³, and 1.7 ± 0.3 for RL, and 18.2 ± 3.4 mm, 9.8 ± 3.8 cm³, and 1.7 ± 0.3 for LL, without significant differences. For RL and LL, the electroporation zone consisted of severely widened hepatic sinusoids containing erythrocytes and showed homogeneous apoptosis. For LL, iodized oil could be detected in the center and at the rim of the electroporation zone. Conclusion: There is no adverse effect of previous ITM on technical parameters, 3D CT rendering of the electroporation zone, and histopathology after CT-guided IRE of the liver.

  8. CT-guided Irreversible Electroporation in an Acute Porcine Liver Model: Effect of Previous Transarterial Iodized Oil Tissue Marking on Technical Parameters, 3D Computed Tomographic Rendering of the Electroporation Zone, and Histopathology

    Energy Technology Data Exchange (ETDEWEB)

    Sommer, C. M., E-mail: christof.sommer@med.uni-heidelberg.de [University Hospital Heidelberg, Department of Diagnostic and Interventional Radiology (Germany); Fritz, S., E-mail: stefan.fritz@med.uni-heidelberg.de [University Hospital Heidelberg, Department of General Visceral and Transplantation Surgery (Germany); Vollherbst, D., E-mail: dominikvollherbst@web.de [University Hospital Heidelberg, Department of Diagnostic and Interventional Radiology (Germany); Zelzer, S., E-mail: s.zelzer@dkfz-heidelberg.de [German Cancer Research Center (dkfz), Medical and Biological Informatics (Germany); Wachter, M. F., E-mail: fredericwachter@googlemail.com; Bellemann, N., E-mail: nadine.bellemann@med.uni-heidelberg.de; Gockner, T., E-mail: theresa.gockner@med.uni-heidelberg.de; Mokry, T., E-mail: theresa.mokry@med.uni-heidelberg.de; Schmitz, A., E-mail: anne.schmitz@med.uni-heidelberg.de [University Hospital Heidelberg, Department of Diagnostic and Interventional Radiology (Germany); Aulmann, S., E-mail: sebastian.aulmann@mail.com [University Hospital Heidelberg, Department of General Pathology (Germany); Stampfl, U., E-mail: ulrike.stampfl@med.uni-heidelberg.de [University Hospital Heidelberg, Department of Diagnostic and Interventional Radiology (Germany); Pereira, P., E-mail: philippe.pereira@slk-kliniken.de [SLK Kliniken Heilbronn GmbH, Clinic for Radiology, Minimally-invasive Therapies and Nuclear Medicine (Germany); Kauczor, H. U., E-mail: hu.kauczor@med.uni-heidelberg.de [University Hospital Heidelberg, Department of Diagnostic and Interventional Radiology (Germany); Werner, J., E-mail: jens.werner@med.uni-heidelberg.de [University Hospital Heidelberg, Department of General Visceral and Transplantation Surgery (Germany); Radeleff, B. A., E-mail: boris.radeleff@med.uni-heidelberg.de [University Hospital Heidelberg, Department of Diagnostic and Interventional Radiology (Germany)

    2015-02-15

    Purpose: To evaluate the effect of previous transarterial iodized oil tissue marking (ITM) on technical parameters, three-dimensional (3D) computed tomographic (CT) rendering of the electroporation zone, and histopathology after CT-guided irreversible electroporation (IRE) in an acute porcine liver model as a potential strategy to improve IRE performance. Methods: After Ethics Committee approval was obtained, in five landrace pigs, two IREs of the right and left liver (RL and LL) were performed under CT guidance with identical electroporation parameters. Before IRE, transarterial marking of the LL was performed with iodized oil. Nonenhanced and contrast-enhanced CT examinations followed. One hour after IRE, animals were killed and livers collected. Mean resulting voltage and amperage during IRE were assessed. For 3D CT rendering of the electroporation zone, parameters for size and shape were analyzed. Quantitative data were compared by the Mann–Whitney test. Histopathological differences were assessed. Results: Mean resulting voltage and amperage were 2,545.3 ± 66.0 V and 26.1 ± 1.8 A for RL, and 2,537.3 ± 69.0 V and 27.7 ± 1.8 A for LL, without significant differences. Short axis, volume, and sphericity index were 16.5 ± 4.4 mm, 8.6 ± 3.2 cm³, and 1.7 ± 0.3 for RL, and 18.2 ± 3.4 mm, 9.8 ± 3.8 cm³, and 1.7 ± 0.3 for LL, without significant differences. For RL and LL, the electroporation zone consisted of severely widened hepatic sinusoids containing erythrocytes and showed homogeneous apoptosis. For LL, iodized oil could be detected in the center and at the rim of the electroporation zone. Conclusion: There is no adverse effect of previous ITM on technical parameters, 3D CT rendering of the electroporation zone, and histopathology after CT-guided IRE of the liver.

  9. [Results of the marketing research study "Acceptance of physician's office computer systems"].

    Science.gov (United States)

    Steinhausen, D; Brinkmann, F; Engelhard, A

    1998-01-01

    We report on a market research study of the acceptance of computer systems in physicians' practices. 11,000 returned questionnaires from physicians--users and nonusers--were analysed. We found that most physicians used their computers in a limited way, i.e. as an accounting device. Concerning the level of utilisation, there are differences between men and women, west and east, and young and old. In this study we also analysed the computer-use behaviour of gynaecologists. As a result, two thirds of all nonusers do not intend to use a computer in the future.

  10. Technical Note. The Concept of a Computer System for Interpretation of Tight Rocks Using X-Ray Computed Tomography Results

    Directory of Open Access Journals (Sweden)

    Habrat Magdalena

    2017-03-01

    Full Text Available The article presents the concept of a computer system for interpreting unconventional oil and gas deposits with the use of X-ray computed tomography results. The functional principles of the proposed solution are presented in the article. The main goal is to design a complex and useful tool, in the form of specialist computer software, for the qualitative and quantitative interpretation of images obtained from X-ray computed tomography, devoted to the prospecting and identification of unconventional hydrocarbon deposits. The article focuses on the use of X-ray computed tomography as a basis for the analysis of tight rocks, considering especially the functional principles of the system to be developed by the authors. These principles cover graphical visualization of rock structure, qualitative and quantitative interpretation of the rock-sample visualization model, and a description of the parameters handled by the quantitative interpretation module.
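    One elementary building block of such quantitative CT interpretation is segmenting pore space from rock matrix and reporting a porosity fraction. The sketch below illustrates the idea only; the single global threshold, the function name, and the synthetic volume are assumptions for illustration, not the authors' system.

    ```python
    import numpy as np

    def porosity_from_ct(volume, pore_threshold):
        """Estimate porosity as the fraction of voxels darker than pore_threshold.

        A minimal sketch of threshold-based quantitative interpretation;
        real tools use calibrated, often locally adaptive, segmentation.
        """
        pores = volume < pore_threshold
        return pores.sum() / volume.size

    # Synthetic 3D "CT volume": matrix voxels ~200 HU-like units, pores ~50
    rng = np.random.default_rng(0)
    volume = np.full((64, 64, 64), 200.0)
    mask = rng.random(volume.shape) < 0.1   # roughly 10% pore voxels
    volume[mask] = 50.0
    print(round(porosity_from_ct(volume, 100.0), 2))
    ```

    With a clean bimodal histogram like this synthetic one, the recovered porosity equals the injected pore fraction exactly; on real scans the threshold choice dominates the error.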

  11. Increasing the trustworthiness of research results: the role of computers in qualitative text analysis

    Science.gov (United States)

    Lynne M. Westphal

    2000-01-01

    By using computer packages designed for qualitative data analysis, a researcher can increase the trustworthiness (i.e., validity and reliability) of conclusions drawn from qualitative research results. This paper examines trustworthiness issues and the role of computer software (QSR's NUD*IST) in the context of a current research project investigating the social...

  12. Validation of thermohydraulic codes by comparison of experimental results with computer simulations

    International Nuclear Information System (INIS)

    Madeira, A.A.; Galetti, M.R.S.; Pontedeiro, A.C.

    1989-01-01

    The results obtained by simulating three cases from the CANON depressurization experiment, using version 7.6 of the TRAC-PF1 computer code implemented on the VAX-11/750 computer of the Brazilian CNEN, are presented. The CANON experiment was chosen as the first thermo-hydraulic standard problem to be discussed at ENFIR for comparing results from different computer codes with experimentally obtained results. The ability of the TRAC-PF1 code to predict the depressurization phase of a loss-of-primary-coolant accident in pressurized water reactors is evaluated. (M.C.K.) [pt

  13. Re-Computation of Numerical Results Contained in NACA Report No. 496

    Science.gov (United States)

    Perry, Boyd, III

    2015-01-01

    An extensive examination of NACA Report No. 496 (NACA 496), "General Theory of Aerodynamic Instability and the Mechanism of Flutter," by Theodore Theodorsen, is described. The examination included checking equations and solution methods and re-computing interim quantities and all numerical examples in NACA 496. The checks revealed that NACA 496 contains computational shortcuts (time- and effort-saving devices for engineers of the time) and clever artifices (employed in its solution methods), but, unfortunately, also contains numerous tripping points (aspects of NACA 496 that have the potential to cause confusion) and some errors. The re-computations were performed employing the methods and procedures described in NACA 496, but using modern computational tools. With some exceptions, the magnitudes and trends of the original results were in fair-to-very-good agreement with the re-computed results. The exceptions included what are speculated to be computational errors in the original in some instances and transcription errors in the original in others. Independent flutter calculations were performed and, in all cases, including those where the original and re-computed results differed significantly, were in excellent agreement with the re-computed results. Appendix A contains NACA 496; Appendix B contains a Matlab(Registered) program that performs the re-computation of results; Appendix C presents three alternate solution methods, with examples, for the two-degree-of-freedom solution method of NACA 496; Appendix D contains the three-degree-of-freedom solution method (outlined in NACA 496 but never implemented), with examples.
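    The central special function of NACA 496 is Theodorsen's function C(k) = H₁⁽²⁾(k) / (H₁⁽²⁾(k) + i·H₀⁽²⁾(k)), where Hₙ⁽²⁾ are Hankel functions of the second kind and k is the reduced frequency. Evaluating it with a modern special-function library is exactly the kind of re-computation the report describes; the snippet below is an illustrative sketch, not the report's Matlab program.

    ```python
    from scipy.special import hankel2

    def theodorsen(k):
        """Theodorsen's function C(k) = H1^(2)(k) / (H1^(2)(k) + i*H0^(2)(k))."""
        h0 = hankel2(0, k)
        h1 = hankel2(1, k)
        return h1 / (h1 + 1j * h0)

    # Known limits: C(k) -> 1 as k -> 0 (quasi-steady) and -> 0.5 as k -> infinity
    for k in (0.1, 1.0, 10.0):
        c = theodorsen(k)
        print(k, round(c.real, 4), round(c.imag, 4))
    ```

    The real and imaginary parts reproduce the classic tabulated F(k) and G(k) values, e.g. C(1.0) ≈ 0.539 − 0.100i.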

  14. Comparing Results from Constant Comparative and Computer Software Methods: A Reflection about Qualitative Data Analysis

    Science.gov (United States)

    Putten, Jim Vander; Nolen, Amanda L.

    2010-01-01

    This study compared qualitative research results obtained by manual constant comparative analysis with results obtained by computer software analysis of the same data. An investigation of issues of trustworthiness and accuracy ensued. Results indicated that the inductive constant comparative data analysis generated 51 codes and two coding levels…

  15. Laparoscopy After Previous Laparotomy

    Directory of Open Access Journals (Sweden)

    Zulfo Godinjak

    2006-11-01

    Full Text Available Following abdominal surgery, extensive adhesions often occur and can cause difficulties during laparoscopic operations. However, previous laparotomy is not considered a contraindication for laparoscopy. The aim of this study is to show that insertion of the Veres needle in the region of the umbilicus is a safe method for creating a pneumoperitoneum for laparoscopic operations after previous laparotomy. In the last three years, we have performed 144 laparoscopic operations in patients who had previously undergone one or two laparotomies. Pathology of the digestive system, genital organs, Cesarean section or abdominal war injuries were the most common causes of previous laparotomy. During those operations, and while entering the abdominal cavity, we experienced no complications, while in 7 patients we converted to laparotomy following diagnostic laparoscopy. In all patients, the Veres needle and trocar were inserted in the umbilical region, i.e. the closed laparoscopy technique. In no patient were adhesions found in the region of the umbilicus, and no abdominal organs were injured.

  16. Separation of electron ion ring components (computational simulation and experimental results)

    International Nuclear Information System (INIS)

    Aleksandrov, V.S.; Dolbilov, G.V.; Kazarinov, N.Yu.; Mironov, V.I.; Novikov, V.G.; Perel'shtejn, Eh.A.; Sarantsev, V.P.; Shevtsov, V.F.

    1978-01-01

    The available polarization value of electron-ion rings in the acceleration regime, and the separation of the ring components at the final stage of acceleration, are studied. The results of computational simulation using the macroparticle method, and of experiments on ring acceleration and separation, are given. A comparison of the calculated results with experiment is presented.

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and events that are harder to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  18. Golimumab in patients with active rheumatoid arthritis who have previous experience with tumour necrosis factor inhibitors: results of a long-term extension of the randomised, double-blind, placebo-controlled GO-AFTER study through week 160

    NARCIS (Netherlands)

    Smolen, Josef S.; Kay, Jonathan; Landewé, Robert B. M.; Matteson, Eric L.; Gaylis, Norman; Wollenhaupt, Jurgen; Murphy, Frederick T.; Zhou, Yiying; Hsia, Elizabeth C.; Doyle, Mittie K.

    2012-01-01

    The aim of this study was to assess long-term golimumab therapy in patients with rheumatoid arthritis (RA) who discontinued previous tumour necrosis factor alpha (TNFα) inhibitor(s) for any reason. Results through week 24 of this multicentre, randomised, double-blind, placebo-controlled study of

  19. Results of work of neurological clinic in first year of computer tomograph application

    Energy Technology Data Exchange (ETDEWEB)

    Volejnik, V; Nettl, S; Heger, L [Karlova Univ., Hradec Kralove (Czechoslovakia). Lekarska Fakulta

    1980-11-01

    The results are analyzed of one year's use of a computer tomograph (CT) by a department of neurology. Detailed comparisons with corresponding PEG and CT findings showed the accuracy of CT examinations in the descriptions of the width of the subarachnoid spaces and of the ventricular system. The advantages of CT are assessed from the medical, economic, and ethical points of view.

  20. On the Integration of Computer Algebra Systems (CAS) by Canadian Mathematicians: Results of a National Survey

    Science.gov (United States)

    Buteau, Chantal; Jarvis, Daniel H.; Lavicza, Zsolt

    2014-01-01

    In this article, we outline the findings of a Canadian survey study (N = 302) that focused on the extent of computer algebra systems (CAS)-based technology use in postsecondary mathematics instruction. Results suggest that a considerable number of Canadian mathematicians use CAS in research and teaching. CAS use in research was found to be the…

  1. Results of work of neurological clinic in first year of computer tomograph application

    International Nuclear Information System (INIS)

    Volejnik, V.; Nettl, S.; Heger, L.

    1980-01-01

    The results are analyzed of one year's use of a computer tomograph (CT) by a department of neurology. Detailed comparisons with corresponding PEG and CT findings showed the accuracy of CT examinations in the descriptions of the width of the subarachnoid spaces and of the ventricular system. The advantages of CT are assessed from the medical, economic, and ethical points of view. (author)

  2. Technique and results of the spinal computed tomography in the diagnosis of cervical disc disease

    International Nuclear Information System (INIS)

    Artmann, H.; Salbeck, R.; Grau, H.

    1985-01-01

    We describe a patient-positioning technique with traction on the arms during cervical spinal computed tomography, which draws the shoulders downwards by about one to three cervical segments. This method improves image quality in 96% of cases at cervical segment 6/7 and in 81% at cervicothoracic segment 7/1, to a degree that permits reliable assessment of the soft tissues in the spinal canal. The diagnostic reliability of computed tomography for cervical disc herniation is thus improved, decreasing the need for myelography. The results of 396 cervical spinal computed tomographies are presented. (orig.) [de

  3. Cloud Computing (SaaS Adoption as a Strategic Technology: Results of an Empirical Study

    Directory of Open Access Journals (Sweden)

    Pedro R. Palos-Sanchez

    2017-01-01

    Full Text Available The present study empirically analyzes the factors that determine the adoption of the cloud computing (SaaS) model in firms where this strategy is considered strategic for executing their activity. A research model has been developed to evaluate the factors that influence the intention to use cloud computing, combining the variables found in the technology acceptance model (TAM) with external variables such as top management support, training, communication, organization size, and technological complexity. Data compiled from 150 companies in Andalusia (Spain) are used to test the formulated hypotheses. The results of this study reflect what critical factors should be considered and how they are interrelated. They also show the organizational demands that must be considered by companies wishing to implement a real management model adapted to the digital economy, especially those related to cloud computing.

  4. Discrete ordinates cross-section generation in parallel plane geometry -- 2: Computational results

    International Nuclear Information System (INIS)

    Yavuz, M.

    1998-01-01

    In Ref. 1, the author presented inverse discrete ordinates (S_N) methods for cross-section generation with an arbitrary scattering anisotropy of order L (L ≤ N − 1) in parallel plane geometry. The solution techniques depend on the S_N eigensolutions. The eigensolutions are determined by the inverse simplified S_N method (ISS_N), which uses the surface Green's function matrices (T and R). Inverse problems are generally designed so that experimentally measured physical quantities can be used in the formulations. In the formulations, although T and R (TR matrices) are measurable quantities, the author does not have such data to check the adequacy and accuracy of the methods. However, it is possible to compute TR matrices by S_N methods. The author presents computational results and computationally observed properties.

  5. Research on Computer-Based Education for Reading Teachers: A 1989 Update. Results of the First National Assessment of Computer Competence.

    Science.gov (United States)

    Balajthy, Ernest

    Results of the 1985-86 National Assessment of Educational Progress (NAEP) survey of American students' knowledge of computers suggest that American schools have a long way to go before computers can be said to have made a significant impact. The survey covered the 3rd, 7th, and 11th grade levels and assessed competence in knowledge of computers,…

  6. Computational needs survey of NASA automation and robotics missions. Volume 1: Survey and results

    Science.gov (United States)

    Davis, Gloria J.

    1991-01-01

    NASA's operational use of advanced processor technology in space systems lags behind its commercial development by more than eight years. One of the factors contributing to this is that mission computing requirements are frequently unknown, unstated, misrepresented, or simply not available in a timely manner. NASA must provide clear common requirements to make better use of available technology, to cut development lead time on deployable architectures, and to increase the utilization of new technology. A preliminary set of advanced mission computational processing requirements of automation and robotics (A&R) systems is provided for use by NASA, industry, and academic communities. These results were obtained in an assessment of the computational needs of current projects throughout NASA. A high percentage of responses indicated a general need for enhanced computational capabilities beyond the currently available 80386 and 68020 processor technology. Because of the need for faster processors and more memory, 90 percent of the polled automation projects have reduced or will reduce the scope of their implementation capabilities. The requirements are presented with respect to their targeted environment, identifying the applications required, system performance levels necessary to support them, and the degree to which they are met with typical programmatic constraints. Volume one includes the survey and results. Volume two contains the appendixes.

  7. A result-driven minimum blocking method for PageRank parallel computing

    Science.gov (United States)

    Tao, Wan; Liu, Tao; Yu, Wei; Huang, Gan

    2017-01-01

    Matrix blocking is a common method for improving the computational efficiency of PageRank, but the blocking rules are hard to determine and the subsequent calculation is complicated. To tackle these problems, we propose a minimum blocking method driven by result needs to accomplish a parallel implementation of the PageRank algorithm. The minimum blocking stores only the elements necessary for the result matrix; in return, the subsequent calculation becomes simple and I/O transmission overhead is reduced. We ran experiments on several matrices of different sizes and degrees of sparsity. The results show that the proposed method has better computational efficiency than traditional blocking methods.
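    For reference, the computation being blocked is the standard PageRank power iteration. The sketch below shows that baseline only; the paper's result-driven minimum blocking and its parallel I/O scheme are not reproduced here, and the function name and the 4-node graph are illustrative assumptions.

    ```python
    import numpy as np

    def pagerank(adj, damping=0.85, tol=1e-10):
        """Plain power-iteration PageRank on a dense adjacency matrix.

        adj[i, j] = 1 if page i links to page j. Dangling nodes (no
        out-links) are treated as linking uniformly to every page.
        """
        n = adj.shape[0]
        out_deg = adj.sum(axis=1)
        # Column-stochastic transition matrix M: column j is where j's rank flows
        M = np.where(out_deg[:, None] > 0,
                     adj / np.maximum(out_deg, 1)[:, None],
                     1.0 / n).T
        r = np.full(n, 1.0 / n)
        while True:
            r_new = damping * (M @ r) + (1 - damping) / n
            if np.abs(r_new - r).sum() < tol:
                return r_new
            r = r_new

    # Tiny 4-node example: every other node links to node 0
    adj = np.array([[0, 1, 1, 0],
                    [1, 0, 0, 0],
                    [1, 0, 0, 1],
                    [1, 0, 0, 0]], dtype=float)
    r = pagerank(adj)
    print(int(np.argmax(r)))  # → 0: the most-linked node gets the most rank
    ```

    Blocking methods partition M (and r) so each worker touches only the sub-blocks it needs; the paper's contribution is choosing those blocks from what the result matrix actually requires.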

  8. Previous ISD Program Review.

    Science.gov (United States)

    1981-03-01

    report. The detail required for such a review would be unwieldy and would consume inordinate amounts of time. The result of the document review will...attempts have been made at writing specific behavioral objectives (SBOs). These, however, have proven to be inadequate in that they are not stated in... behavioral terms (e.g., "will understand," "will have a knowledge of," etc.). C. Development of CRO/CRTs? In nearly all cases, ISD teams are just

  9. Altered reward processing in pathological computer gamers--ERP-results from a semi-natural gaming-design.

    Science.gov (United States)

    Duven, Eva C P; Müller, Kai W; Beutel, Manfred E; Wölfling, Klaus

    2015-01-01

    Internet Gaming Disorder has been added as a research diagnosis in Section III of the DSM-5. Previous findings from neuroscientific research indicate an enhanced motivational attention toward cues related to computer games, similar to findings in substance-related addictions. On the other hand, in clinical observational studies, tolerance effects are reported by patients with Internet Gaming Disorder. In the present study we investigated whether enhanced motivational attention or tolerance effects are present in patients with Internet Gaming Disorder. A clinical sample fulfilling the diagnostic criteria for Internet Gaming Disorder was recruited from the Outpatient Clinic for Behavioral Addictions in Mainz, Germany. In a semi-natural EEG design, participants played a computer game during the recording of event-related potentials to assess reward processing. The results indicated an attenuated P300 in patients with Internet Gaming Disorder in response to rewards in comparison to healthy controls, while the latency of the N100 was prolonged and its amplitude increased. Our findings support the hypothesis that tolerance effects are present in patients with Internet Gaming Disorder when actively playing computer games. In addition, the initial orienting toward the gaming reward is suggested to consume more capacity in patients with Internet Gaming Disorder, as similarly reported by studies with other methodological backgrounds in substance-related addiction disorders.

  10. Engagement, Persistence, and Gender in Computer Science: Results of a Smartphone ESM Study

    Science.gov (United States)

    Milesi, Carolina; Perez-Felkner, Lara; Brown, Kevin; Schneider, Barbara

    2017-01-01

    While the underrepresentation of women in the fast-growing STEM field of computer science (CS) has been much studied, no consensus exists on the key factors influencing this widening gender gap. Possible suspects include gender differences in aptitude, interest, and academic environment. Our study contributes to this literature by applying student engagement research to study the experiences of college students studying CS, to assess the degree to which differences in men and women's engagement may help account for gender inequity in the field. Specifically, we use the Experience Sampling Method (ESM) to evaluate in real-time the engagement of college students during varied activities and environments. Over the course of a full week in fall semester and a full week in spring semester, 165 students majoring in CS at two Research I universities were “beeped” several times a day via a smartphone app prompting them to fill out a short questionnaire including open-ended and scaled items. These responses were paired with administrative and over 2 years of transcript data provided by their institutions. We used mean comparisons and logistic regression analysis to compare enrollment and persistence patterns among CS men and women. Results suggest that despite the obstacles associated with women's underrepresentation in computer science, women are more likely to continue taking computer science courses when they felt challenged and skilled in their initial computer science classes. We discuss implications for further research. PMID:28487664

  11. Engagement, Persistence, and Gender in Computer Science: Results of a Smartphone ESM Study.

    Science.gov (United States)

    Milesi, Carolina; Perez-Felkner, Lara; Brown, Kevin; Schneider, Barbara

    2017-01-01

    While the underrepresentation of women in the fast-growing STEM field of computer science (CS) has been much studied, no consensus exists on the key factors influencing this widening gender gap. Possible suspects include gender differences in aptitude, interest, and academic environment. Our study contributes to this literature by applying student engagement research to study the experiences of college students studying CS, to assess the degree to which differences in men and women's engagement may help account for gender inequity in the field. Specifically, we use the Experience Sampling Method (ESM) to evaluate in real-time the engagement of college students during varied activities and environments. Over the course of a full week in fall semester and a full week in spring semester, 165 students majoring in CS at two Research I universities were "beeped" several times a day via a smartphone app prompting them to fill out a short questionnaire including open-ended and scaled items. These responses were paired with administrative and over 2 years of transcript data provided by their institutions. We used mean comparisons and logistic regression analysis to compare enrollment and persistence patterns among CS men and women. Results suggest that despite the obstacles associated with women's underrepresentation in computer science, women are more likely to continue taking computer science courses when they felt challenged and skilled in their initial computer science classes. We discuss implications for further research.

  12. Engagement, Persistence, and Gender in Computer Science: Results of a Smartphone ESM Study

    Directory of Open Access Journals (Sweden)

    Carolina Milesi

    2017-04-01

    Full Text Available While the underrepresentation of women in the fast-growing STEM field of computer science (CS) has been much studied, no consensus exists on the key factors influencing this widening gender gap. Possible suspects include gender differences in aptitude, interest, and academic environment. Our study contributes to this literature by applying student engagement research to study the experiences of college students studying CS, to assess the degree to which differences in men and women's engagement may help account for gender inequity in the field. Specifically, we use the Experience Sampling Method (ESM) to evaluate in real-time the engagement of college students during varied activities and environments. Over the course of a full week in fall semester and a full week in spring semester, 165 students majoring in CS at two Research I universities were “beeped” several times a day via a smartphone app prompting them to fill out a short questionnaire including open-ended and scaled items. These responses were paired with administrative and over 2 years of transcript data provided by their institutions. We used mean comparisons and logistic regression analysis to compare enrollment and persistence patterns among CS men and women. Results suggest that despite the obstacles associated with women's underrepresentation in computer science, women are more likely to continue taking computer science courses when they felt challenged and skilled in their initial computer science classes. We discuss implications for further research.

  13. Degenerative dementia: nosological aspects and results of single photon emission computed tomography

    International Nuclear Information System (INIS)

    Dubois, B.; Habert, M.O.

    1999-01-01

    Ten years ago, the diagnostic discussion of dementia in elderly patients was limited to two pathologies: Alzheimer's disease and Pick's disease. In recent years, this framework of primary degenerative dementias has fragmented. The different diseases and the results obtained with single photon emission computed tomography are discussed, for example: fronto-temporal dementia, primary progressive aphasia, progressive apraxia, visuospatial dysfunction, dementia with Lewy bodies, and cortico-basal degeneration. (N.C.)

  14. First results with twisted mass fermions towards the computation of parton distribution functions on the lattice

    International Nuclear Information System (INIS)

    Alexandrou, Constantia; Cyprus Institute, Nicosia; Deutsches Elektronen-Synchrotron; Cichy, Krzysztof; Poznan Univ.; Drach, Vincent; Garcia-Ramos, Elena; Humboldt-Universitaet, Berlin; Hadjiyiannakou, Kyriakos; Jansen, Karl; Steffens, Fernanda; Wiese, Christian

    2014-11-01

    We report on our exploratory study for the evaluation of the parton distribution functions from lattice QCD, based on a new method proposed in Ref. [arXiv:1305.1539]. Using the example of the nucleon, we compare two different methods to compute the matrix elements needed, and investigate the application of gauge link smearing. We also present first results from a large production ensemble and discuss the future challenges related to this method.

  15. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  16. Flexibility of Bricard's linkages and other structures via resultants and computer algebra.

    Science.gov (United States)

    Lewis, Robert H; Coutsias, Evangelos A

    2016-07-01

    Flexibility of structures is extremely important for chemistry and robotics. Following our earlier work, we study flexibility using polynomial equations, resultants, and a symbolic algorithm of our creation that analyzes the resultant. We show that the software solves a classic arrangement of quadrilaterals in the plane due to Bricard. We fill in several gaps in Bricard's work and discover new flexible arrangements that he was apparently unaware of. This provides strong evidence for the maturity of the software, and is a wonderful example of mathematical discovery via computer assisted experiment.
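
    A minimal illustration (not the authors' own software) of the core operation behind the analysis described above: using a polynomial resultant to eliminate a variable from a system of equations.

```python
from sympy import symbols, resultant, expand

x, y = symbols('x y')
f = x**2 + y**2 - 1      # unit circle
g = x - y                # line x = y

# Res(f, g, x) eliminates x: the roots of the resultant in y are the
# y-coordinates of the intersection points of the two curves.
r = expand(resultant(f, g, x))
print(r)   # 2*y**2 - 1
```

    For a flexibility analysis, the same elimination is applied to the loop-closure equations of a linkage; a resultant that vanishes identically signals a flexible arrangement.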

  17. Feature Extraction on Brain Computer Interfaces using Discrete Dyadic Wavelet Transform: Preliminary Results

    International Nuclear Information System (INIS)

    Gareis, I; Gentiletti, G; Acevedo, R; Rufiner, L

    2011-01-01

    The purpose of this work is to evaluate different feature-extraction alternatives for detecting the event-related evoked potential signal in brain computer interfaces, trying to minimize the time employed and the classification error, in terms of the sensitivity and specificity of the method, looking for alternatives to coherent averaging. In this context, the results obtained by performing feature extraction with the discrete dyadic wavelet transform using different mother wavelets are presented. For the classification, a single-layer perceptron was used. The results obtained with and without the wavelet decomposition were compared, showing an improvement in the classification rate, the specificity and the sensitivity for the feature vectors obtained using some mother wavelets.
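
    A hedged sketch of the feature-extraction scheme described above, using a hand-rolled Haar wavelet as a stand-in mother wavelet (the paper evaluates several): the epoch is decomposed dyadically and the coefficients are concatenated into the classifier's input vector. The sampling rate, decomposition depth, and test signal are illustrative assumptions.

```python
import numpy as np

def haar_dwt(signal, levels):
    """Dyadic Haar decomposition: detail coefficients per level plus the
    final approximation (orthonormal, so signal energy is preserved)."""
    a = np.asarray(signal, dtype=float)
    details = []
    for _ in range(levels):
        detail = (a[0::2] - a[1::2]) / np.sqrt(2)
        a = (a[0::2] + a[1::2]) / np.sqrt(2)
        details.append(detail)
    return details, a

fs = 256
t = np.arange(fs) / fs
epoch = np.sin(2 * np.pi * 10 * t)              # stand-in for one EEG epoch
details, approx = haar_dwt(epoch, levels=3)
features = np.concatenate(details + [approx])   # feature vector for the perceptron
print(features.shape)                           # (256,)
```

    With a real mother wavelet family (e.g. Daubechies), only the filter coefficients change; the feature vector is assembled the same way.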

  18. Cognitive impairment and computer tomography image in patients with arterial hypertension -preliminary results

    International Nuclear Information System (INIS)

    Yaneva-Sirakova, T.; Tarnovska-Kadreva, R.; Traykov, L.; Zlatareva, D.

    2012-01-01

    Arterial hypertension is the leading risk factor for cognitive impairment, but cognitive impairment develops in only some of the patients with poor control. On the other hand, not all of the patients with white matter changes have a cognitive deficit. There may be a variety of reasons for this: the accuracy of the methods for blood pressure measurement, the specific brain localization, or some other reason. Here are the preliminary results of a study of the potential correlation between self-measured, office-, and ambulatory-monitored blood pressure, central aortic blood pressure, minimal cognitive impairment, and the specific brain image on contrast computed tomography. We expect to answer the question whether central aortic or self-measured blood pressure has the leading role in the development of cognitive impairment in the presence of a specific neuroimaging finding, as well as what is the prerequisite for the clinical manifestation of cognitive dysfunction in patients with computed tomographic pathology. (authors)

  19. First results from a combined analysis of CERN computing infrastructure metrics

    Science.gov (United States)

    Duellmann, Dirk; Nieke, Christian

    2017-10-01

    The IT Analysis Working Group (AWG) has been formed at CERN across individual computing units and the experiments to attempt a cross-cutting analysis of computing infrastructure and application metrics. In this presentation we will describe the first results obtained using medium/long term data (1 month to 1 year) correlating box level metrics, job level metrics from LSF and HTCondor, IO metrics from the physics analysis disk pools (EOS) and networking and application level metrics from the experiment dashboards. We will cover in particular the measurement of hardware performance and prediction of job duration, the latency sensitivity of different job types and a search for bottlenecks with the production job mix in the current infrastructure. The presentation will conclude with the proposal of a small set of metrics to simplify drawing conclusions also in the more constrained environment of public cloud deployments.

  20. Comparison of Experimental Surface and Flow Field Measurements to Computational Results of the Juncture Flow Model

    Science.gov (United States)

    Roozeboom, Nettie H.; Lee, Henry C.; Simurda, Laura J.; Zilliac, Gregory G.; Pulliam, Thomas H.

    2016-01-01

    Wing-body juncture flow fields on commercial aircraft configurations are challenging to compute accurately. The NASA Advanced Air Vehicle Program's juncture flow committee is designing an experiment to provide data to improve Computational Fluid Dynamics (CFD) modeling in the juncture flow region. Preliminary design of the model was done using CFD, yet CFD tends to over-predict the separation in the juncture flow region. Risk reduction wind tunnel tests were requisitioned by the committee to obtain a better understanding of the flow characteristics of the designed models. NASA Ames Research Center's Fluid Mechanics Lab performed one of the risk reduction tests. The results of one case, accompanied by CFD simulations, are presented in this paper. Experimental results suggest the wall-mounted wind tunnel model produces a thicker boundary layer on the fuselage than the CFD predictions, resulting in a larger wing horseshoe vortex suppressing the side-of-body separation in the juncture flow region. Compared to experimental results, CFD predicts that a thinner boundary layer on the fuselage generates a weaker wing horseshoe vortex, resulting in a larger side-of-body separation.

  1. Two-dimensional computer simulation of hypervelocity impact cratering: some preliminary results for Meteor Crater, Arizona

    International Nuclear Information System (INIS)

    Bryan, J.B.; Burton, D.E.; Cunningham, M.E.; Lettis, L.A. Jr.

    1978-06-01

    A computational approach used for subsurface explosion cratering was extended to hypervelocity impact cratering. Meteor (Barringer) Crater, Arizona, was selected for the first computer simulation because it is one of the most thoroughly studied craters. It is also an excellent example of a simple, bowl-shaped crater and is one of the youngest terrestrial impact craters. Initial conditions for this calculation included a meteorite impact velocity of 15 km/s, meteorite mass of 1.67 x 10^8 kg, with a corresponding kinetic energy of 1.88 x 10^16 J (4.5 megatons). A two-dimensional Eulerian finite difference code called SOIL was used for this simulation of a cylindrical iron projectile impacting at normal incidence into a limestone target. For this initial calculation, a Tillotson equation-of-state description for iron and limestone was used with no shear strength. Results obtained for this preliminary calculation of the formation of Meteor Crater are in good agreement with field measurements. A color movie based on this calculation was produced using computer-generated graphics. 19 figures, 5 tables, 63 references
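
    A quick consistency check of the stated initial conditions: the kinetic energy E = 1/2 m v^2 for a 1.67e8 kg meteorite at 15 km/s, and its TNT equivalent (taking 1 megaton as about 4.184e15 J).

```python
m = 1.67e8           # meteorite mass, kg
v = 15e3             # impact velocity, m/s
E = 0.5 * m * v**2   # kinetic energy, J
print(E)             # ~1.88e16 J, matching the abstract
print(E / 4.184e15)  # ~4.5 megatons TNT equivalent
```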

  2. Two-dimensional computer simulation of hypervelocity impact cratering: some preliminary results for Meteor Crater, Arizona

    International Nuclear Information System (INIS)

    Bryan, J.B.; Burton, D.E.; Cunningham, M.E.; Lettis, L.A. Jr.

    1978-04-01

    A computational approach used for subsurface explosion cratering has been extended to hypervelocity impact cratering. Meteor (Barringer) Crater, Arizona, was selected for our first computer simulation because it was the most thoroughly studied. It is also an excellent example of a simple, bowl-shaped crater and is one of the youngest terrestrial impact craters. Shoemaker estimates that the impact occurred about 20,000 to 30,000 years ago [Roddy (1977)]. Initial conditions for this calculation included a meteorite impact velocity of 15 km/s, meteorite mass of 1.67E+08 kg, with a corresponding kinetic energy of 1.88E+16 J (4.5 megatons). A two-dimensional Eulerian finite difference code called SOIL was used for this simulation of a cylindrical iron projectile impacting at normal incidence into a limestone target. For this initial calculation a Tillotson equation-of-state description for iron and limestone was used with no shear strength. A color movie based on this calculation was produced using computer-generated graphics. Results obtained for this preliminary calculation of the formation of Meteor Crater, Arizona, are in good agreement with measurements of Meteor Crater

  3. Operating Wireless Sensor Nodes without Energy Storage: Experimental Results with Transient Computing

    Directory of Open Access Journals (Sweden)

    Faisal Ahmed

    2016-12-01

    Energy harvesting is increasingly used for powering wireless sensor network nodes. Recently, it has been suggested to combine it with the concept of transient computing whereby the wireless sensor nodes operate without energy storage capabilities. This new combined approach brings benefits, for instance ultra-low power nodes and reduced maintenance, but also raises new challenges, foremost dealing with nodes that may be left without power for various time periods. Although transient computing has been demonstrated on microcontrollers, reports on experiments with wireless sensor nodes are still scarce in the literature. In this paper, we describe our experiments with solar, thermal, and RF energy harvesting sources that are used to power sensor nodes (including wireless ones without energy storage, but with transient computing capabilities. The results show that the selected solar and thermal energy sources can operate both the wired and wireless nodes without energy storage, whereas in our specific implementation, the developed RF energy source can only be used for the selected nodes without wireless connectivity.

  4. Two-dimensional computer simulation of hypervelocity impact cratering: some preliminary results for Meteor Crater, Arizona

    Energy Technology Data Exchange (ETDEWEB)

    Bryan, J.B.; Burton, D.E.; Cunningham, M.E.; Lettis, L.A. Jr.

    1978-06-01

    A computational approach used for subsurface explosion cratering was extended to hypervelocity impact cratering. Meteor (Barringer) Crater, Arizona, was selected for the first computer simulation because it is one of the most thoroughly studied craters. It is also an excellent example of a simple, bowl-shaped crater and is one of the youngest terrestrial impact craters. Initial conditions for this calculation included a meteorite impact velocity of 15 km/s, meteorite mass of 1.67 x 10^8 kg, with a corresponding kinetic energy of 1.88 x 10^16 J (4.5 megatons). A two-dimensional Eulerian finite difference code called SOIL was used for this simulation of a cylindrical iron projectile impacting at normal incidence into a limestone target. For this initial calculation, a Tillotson equation-of-state description for iron and limestone was used with no shear strength. Results obtained for this preliminary calculation of the formation of Meteor Crater are in good agreement with field measurements. A color movie based on this calculation was produced using computer-generated graphics. 19 figures, 5 tables, 63 references.

  5. Results of application of automatic computation of static corrections on data from the South Banat Terrain

    Science.gov (United States)

    Milojević, Slavka; Stojanovic, Vojislav

    2017-04-01

    Due to the continuous development of seismic acquisition and processing methods, increasing the signal-to-noise ratio is always a current goal. The correct application of the latest software solutions improves the processing results and justifies their development. Correct computation and application of static corrections is one of the most important tasks in pre-processing, and this phase is of great importance for the subsequent processing steps. Static corrections are applied to seismic data to compensate for the effects of irregular topography, the difference between the elevations of source and receiver points relative to the datum, the effect of the low-velocity near-surface layer (weathering correction), or any other effects that influence the spatial and temporal position of seismic traces. The refraction statics method is the most common method for computing static corrections. It is successful both in resolving long-period statics problems and in determining differences in statics caused by abrupt lateral velocity changes in the near-surface layer. XtremeGeo FlatironsTM is a program whose main purpose is the computation of static corrections by the refraction statics method; it supports picking of first arrivals, checking of geometry, multiple methods for the analysis and modelling of statics, analysis of refractor anisotropy, and tomography (Eikonal tomography). The exploration area is located on the southern edge of the Pannonian Plain, in flat terrain with altitudes of 50 to 195 meters. The largest part of the exploration area covers Deliblato Sands, where the geological structure of the terrain and the large differences in altitude significantly affect the calculation of static corrections.
    The XtremeGeo FlatironsTM software has powerful visualization and statistical analysis tools, which contribute to a significantly more accurate assessment of the near-surface geometry.
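
    An illustrative sketch (all values hypothetical) of the elevation/weathering static correction that refraction statics ultimately delivers: the vertical travel time from each station down to the datum, through the slow weathering layer and the sub-weathering material, is removed from the trace.

```python
def elevation_static(elev, weathering_thickness, v_weathering, v_replacement, datum):
    """Static shift in seconds (negative = trace shifted to earlier times)."""
    t_weathering = weathering_thickness / v_weathering
    t_bedrock = (elev - weathering_thickness - datum) / v_replacement
    return -(t_weathering + t_bedrock)

# Hypothetical station in the Deliblato Sands area: 160 m elevation,
# 12 m of dry sand at 600 m/s over material at 1800 m/s, datum at 50 m.
shift = elevation_static(elev=160.0, weathering_thickness=12.0,
                         v_weathering=600.0, v_replacement=1800.0, datum=50.0)
print(round(shift * 1000, 1), "ms")   # -74.4 ms
```

    The refraction statics workflow estimates the weathering thickness and velocities from the picked first arrivals; the correction itself then reduces to arithmetic of this form per source and receiver station.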

  6. Thermodynamic properties of 1-naphthol: Mutual validation of experimental and computational results

    International Nuclear Information System (INIS)

    Chirico, Robert D.; Steele, William V.; Kazakov, Andrei F.

    2015-01-01

    Highlights: • Heat capacities were measured for the temperature range 5 K to 445 K. • Vapor pressures were measured for the temperature range 370 K to 570 K. • Computed and derived properties for ideal gas entropies are in excellent accord. • The enthalpy of combustion was measured and shown to be consistent with reliable literature values. • Thermodynamic consistency analysis revealed anomalous literature data. - Abstract: Thermodynamic properties for 1-naphthol (Chemical Abstracts registry number [90-15-3]) in the ideal-gas state are reported based on both experimental and computational methods. Measured properties included the triple-point temperature, enthalpy of fusion, and heat capacities for the crystal and liquid phases by adiabatic calorimetry; vapor pressures by inclined-piston manometry and comparative ebulliometry; and the enthalpy of combustion of the crystal phase by oxygen bomb calorimetry. Critical properties were estimated. Entropies for the ideal-gas state were derived from the experimental studies for the temperature range 298.15 ⩽ T/K ⩽ 600, and independent statistical calculations were performed based on molecular geometry optimization and vibrational frequencies calculated at the B3LYP/6-31+G(d,p) level of theory. The mutual validation of the independent experimental and computed results is achieved with a scaling factor of 0.975 applied to the calculated vibrational frequencies. This same scaling factor was successfully applied in the analysis of results for other polycyclic molecules, as described in a series of recent articles by this research group. This article reports the first extension of this approach to a hydroxy-aromatic compound. All experimental results are compared with property values reported in the literature. Thermodynamic consistency between properties is used to show that several studies in the literature are erroneous. The enthalpy of combustion for 1-naphthol was also measured in this research, and excellent
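
    A sketch of how the frequency scaling enters the statistical calculations mentioned above: the harmonic-oscillator vibrational entropy per mode is S_vib/R = x/(exp(x) - 1) - ln(1 - exp(-x)) with x = h*c*nu/(kB*T), evaluated with the computed frequencies multiplied by 0.975. The frequency list below is purely illustrative, not the 1-naphthol values.

```python
import math

H, C_CM, KB = 6.62607015e-34, 2.99792458e10, 1.380649e-23  # SI units; c in cm/s
R = 8.314462618  # J/(mol K)

def s_vib(freqs_cm, T, scale=1.0):
    """Harmonic-oscillator vibrational entropy, J/(mol K)."""
    s = 0.0
    for nu in freqs_cm:
        x = H * C_CM * scale * nu / (KB * T)
        s += x / math.expm1(x) - math.log(1.0 - math.exp(-x))
    return R * s

freqs = [500.0, 1000.0, 1500.0, 3000.0]   # hypothetical modes, cm^-1
print(s_vib(freqs, 298.15))               # unscaled frequencies
print(s_vib(freqs, 298.15, scale=0.975))  # scaled; slightly larger entropy
```

    Scaling the frequencies down by 2.5% raises the computed entropy slightly, which is the direction of the adjustment used to bring the calculated ideal-gas entropies into accord with the calorimetric values.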

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  8. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  10. Multidetector computed tomography of urolithiasis. Technique and results; Multidetektor-Computertomografie der Urolithiasis. Technik und Ergebnisse

    Energy Technology Data Exchange (ETDEWEB)

    Karul, M.; Regier, M. [Universitaetsklinikum Hamburg-Eppendorf, Hamburg (Germany). Zentrum fuer Radiologie und Endoskopie; Heuer, R. [Universitaetsklinikum Hamburg-Eppendorf, Hamburg (Germany). Zentrum fuer Operative Medizin

    2013-02-15

    Acute urolithiasis is diagnosed by unenhanced multidetector computed tomography (MDCT). This examination assesses the functional and anatomical likelihood that a ureteral calculus will pass; its localization and size are important parameters for further therapy. Alternatively, chronic urolithiasis can be ruled out by magnetic resonance urography (MRU). MRU is the first choice especially in pregnant women and children for reasons of radiation hygiene. Contrast-enhanced MDCT must be emphasized as an alternative to intravenous urography (IVU) for the diagnosis of complex urinary drainage and suspected disorders of the involved kidney. This review illustrates the principles of the different tests and their clinical relevance. (orig.)

  11. Computational fluid dynamics in three dimensional angiography: Preliminary hemodynamic results of various proximal geometry

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ha Youn; Park, Sung Tae; Bae, Won Kyoung; Goo, Dong Erk [Dept. of Radiology, Soonchunhyang University Hospital, Seoul (Korea, Republic of)

    2014-12-15

    We studied the influence of proximal geometry on the results of computational fluid dynamics (CFD). We made five models with different proximal geometries from three-dimensional angiography of a 63-year-old woman with an intracranial aneurysm. CFD results were analyzed as peak systolic velocity (PSV) at the inlet and outlet, as well as the flow velocity profile at the level proximal to the internal carotid artery (ICA) aneurysm. The modified model of the cavernous segment with proximal tubing showed a faster PSV at the outlet than at the inlet. The PSVs at the outlets of the other models were slower than those at the inlets. The flow velocity profiles immediately proximal to the ICA aneurysm showed similar patterns in all models, suggesting that proximal vessel geometry can affect CFD results.

  12. [Computer-assisted analysis of the results of training in internal medicine].

    Science.gov (United States)

    Vrbová, H; Spunda, M

    1991-06-01

    Analysis of the results of teaching clinical disciplines has, in the long run, an impact on the standard and value of medical care. It requires processing of quantitative and qualitative data, and the selection of the indicators to be followed and of the procedures used for their processing is of fundamental importance. The submitted investigation is an example of how to process the results of an effectiveness analysis in teaching internal medicine by computer. As an indicator of effectiveness the authors selected the percentage of students who had an opportunity, during the given period of their studies, to observe a certain pathological condition; data were collected by a questionnaire survey. The system differentiates the students' experience (whether the student examined the patient himself or the patient was only demonstrated) and the place of observation (at the university teaching hospital or at a regional non-teaching hospital attachment). It also permits forming subgroups of respondents, combining them as desired, and comparing their results. The described software support comprises primary processing of the output of the questionnaire survey: the questionnaires are transformed and stored by groups of respondents in data files of suitable format (programme SDFORM), and the results are processed and presented as an output listing or interactively on the display (programme SDRESULT). Using these programmes, the authors processed the results of a survey made among students during and after completion of their studies, covering a series of 70 recommended pathological conditions. As an example, the authors compare the results of observations of 20 selected pathological conditions important for diagnosis and therapy in primary care in the final stage of the medical course in 1981 and 1985.

  13. High Speed Civil Transport (HSCT) Isolated Nacelle Transonic Boattail Drag Study and Results Using Computational Fluid Dynamics (CFD)

    Science.gov (United States)

    Midea, Anthony C.; Austin, Thomas; Pao, S. Paul; DeBonis, James R.; Mani, Mori

    2005-01-01

    Nozzle boattail drag is significant for the High Speed Civil Transport (HSCT) and can be as high as 25 percent of the overall propulsion system thrust at transonic conditions. Thus, nozzle boattail drag has the potential to create a thrust drag pinch and can reduce HSCT aircraft aerodynamic efficiencies at transonic operating conditions. In order to accurately predict HSCT performance, it is imperative that nozzle boattail drag be accurately predicted. Previous methods to predict HSCT nozzle boattail drag were suspect in the transonic regime. In addition, previous prediction methods were unable to account for complex nozzle geometry and were not flexible enough for engine cycle trade studies. A computational fluid dynamics (CFD) effort was conducted by NASA and McDonnell Douglas to evaluate the magnitude and characteristics of HSCT nozzle boattail drag at transonic conditions. A team of engineers used various CFD codes and provided consistent, accurate boattail drag coefficient predictions for a family of HSCT nozzle configurations. The CFD results were incorporated into a nozzle drag database that encompassed the entire HSCT flight regime and provided the basis for an accurate and flexible prediction methodology.

  14. Development of computer code SIMPSEX for simulation of FBR fuel reprocessing flowsheets: II. additional benchmarking results

    International Nuclear Information System (INIS)

    Shekhar Kumar; Koganti, S.B.

    2003-07-01

    Benchmarking and application of a computer code SIMPSEX for high plutonium FBR flowsheets was reported recently in an earlier report (IGC-234). Improvements and recompilation of the code (Version 4.01, March 2003) required re-validation with the existing benchmarks as well as additional benchmark flowsheets. Improvements in the high Pu region (Pu Aq >30 g/L) resulted in better results in the 75% Pu flowsheet benchmark. Below 30 g/L Pu Aq concentration, results were identical to those from the earlier version (SIMPSEX Version 3, code compiled in 1999). In addition, 13 published flowsheets were taken as additional benchmarks. Eleven of these flowsheets have a wide range of feed concentrations and few of them are β-γ active runs with FBR fuels having a wide distribution of burnup and Pu ratios. A published total partitioning flowsheet using externally generated U(IV) was also simulated using SIMPSEX. SIMPSEX predictions were compared with listed predictions from conventional SEPHIS, PUMA, PUNE and PUBG. SIMPSEX results were found to be comparable and better than the result from above listed codes. In addition, recently reported UREX demo results along with AMUSE simulations are also compared with SIMPSEX predictions. Results of the benchmarking SIMPSEX with these 14 benchmark flowsheets are discussed in this report. (author)

  15. Performance of various mathematical methods for computer-aided processing of radioimmunoassay results

    International Nuclear Information System (INIS)

    Vogt, W.; Sandel, P.; Langfelder, Ch.; Knedel, M.

    1978-01-01

    The performance of six algorithms was compared for the computer-aided determination of radioimmunological end results: weighted and unweighted linear logit-log regression; quadratic logit-log regression; smoothing spline interpolation with a large and a small smoothing factor, respectively; polygonal interpolation; and manual curve fitting, on the basis of three radioimmunoassays with different reference-curve characteristics (digoxin, estriol, human chorionic somatomammotrophin (HCS)). Particular attention was paid to the accuracy of the approximation at the intermediate points on the curve, i.e. those points that lie midway between two standard concentrations. These concentrations were prepared by weighing and inserted as unknown samples. In the case of digoxin and estriol the polygonal interpolation provided the best results, while the weighted logit-log regression proved superior in the case of HCS. (Auth.)
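
    A minimal sketch of the linear logit-log approach (here unweighted): the bound fraction B/B0 of each standard is logit-transformed and regressed on log(dose), and unknowns are read off by inverting the fitted line. The standard concentrations and binding values below are synthetic, not assay data from the study.

```python
import numpy as np

def logit(p):
    return np.log(p / (1 - p))

doses = np.array([0.5, 1, 2, 4, 8, 16])   # standard concentrations
b_b0 = 1 / (1 + (doses / 3.0) ** 1.2)     # simulated bound fractions B/B0

# Unweighted linear fit: logit(B/B0) = a + b * log(dose)
b, a = np.polyfit(np.log(doses), logit(b_b0), 1)

# Invert the calibration line to recover the dose of an "unknown" sample
unknown_b_b0 = 0.5
dose_est = np.exp((logit(unknown_b_b0) - a) / b)
print(round(dose_est, 2))   # 3.0, the simulated ED50
```

    The weighted variant differs only in giving each standard a weight (typically from replicate counting variance) in the least-squares fit.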

  16. Results of the First National Assessment of Computer Competence (The Printout).

    Science.gov (United States)

    Balajthy, Ernest

    1988-01-01

    Discusses the findings of the National Assessment of Educational Progress 1985-86 survey of American students' computer competence, focusing on findings of interest to reading teachers who use computers. (MM)

  17. TEMPEST: A three-dimensional time-dependent computer program for hydrothermal analysis: Volume 2, Assessment and verification results

    International Nuclear Information System (INIS)

    Eyler, L.L.; Trent, D.S.; Budden, M.J.

    1983-09-01

    During the course of the TEMPEST computer code development a concurrent effort was conducted to assess the code's performance and the validity of computed results. The results of this work are presented in this document. The principal objective of this effort was to assure the code's computational correctness for a wide range of hydrothermal phenomena typical of fast breeder reactor application. 47 refs., 94 figs., 6 tabs

  18. Introducing Computer-Based Testing in High-Stakes Exams in Higher Education: Results of a Field Experiment

    Science.gov (United States)

    Boevé, Anja J.; Meijer, Rob R.; Albers, Casper J.; Beetsma, Yta; Bosker, Roel J.

    2015-01-01

    The introduction of computer-based testing in high-stakes examining in higher education is developing rather slowly due to institutional barriers (the need of extra facilities, ensuring test security) and teacher and student acceptance. From the existing literature it is unclear whether computer-based exams will result in similar results as paper-based exams and whether student acceptance can change as a result of administering computer-based exams. In this study, we compared results from a computer-based and paper-based exam in a sample of psychology students and found no differences in total scores across the two modes. Furthermore, we investigated student acceptance and change in acceptance of computer-based examining. After taking the computer-based exam, fifty percent of the students preferred paper-and-pencil exams over computer-based exams and about a quarter preferred a computer-based exam. We conclude that computer-based exam total scores are similar as paper-based exam scores, but that for the acceptance of high-stakes computer-based exams it is important that students practice and get familiar with this new mode of test administration. PMID:26641632

  20. General-Purpose Computation with Neural Networks: A Survey of Complexity Theoretic Results

    Czech Academy of Sciences Publication Activity Database

    Šíma, Jiří; Orponen, P.

    2003-01-01

Roč. 15, č. 12 (2003), s. 2727-2778 ISSN 0899-7667 R&D Projects: GA AV ČR IAB2030007; GA ČR GA201/02/1456 Institutional research plan: AV0Z1030915 Keywords : computational power * computational complexity * perceptrons * radial basis functions * spiking neurons * feedforward networks * recurrent networks * probabilistic computation * analog computation Subject RIV: BA - General Mathematics Impact factor: 2.747, year: 2003

  1. MELMRK 2.0: A description of computer models and results of code testing

    International Nuclear Information System (INIS)

    Wittman, R.S.; Denny, V.; Mertol, A.

    1992-01-01

An advanced version of the MELMRK computer code has been developed that provides detailed models for conservation of mass, momentum, and thermal energy within relocating streams of molten metallics during meltdown of Savannah River Site (SRS) reactor assemblies. In addition to a mechanistic treatment of transport phenomena within a relocating stream, MELMRK 2.0 retains the MOD1 capability for real-time coupling of the in-depth thermal response of participating assembly heat structures and, further, augments this capability with models for self-heating of relocating melt owing to steam oxidation of metallics and fission product decay power. As was the case for MELMRK 1.0, the MOD2 version offers state-of-the-art numerics for solving coupled sets of nonlinear differential equations. Principal features include application of multi-dimensional Newton-Raphson techniques to accelerate convergence behavior and direct matrix inversion to advance primitive variables from one iterate to the next. Additionally, MELMRK 2.0 provides logical event flags for managing the broad range of code options available for treating such features as (1) coexisting flow regimes, (2) dynamic transitions between flow regimes, and (3) linkages between heatup and relocation code modules. The purpose of this report is to provide a detailed description of the MELMRK 2.0 computer models for melt relocation. Also included are illustrative results for code testing, as well as an integrated calculation for meltdown of a Mark 31a assembly.

  2. Reflectivity of 1D photonic crystals: A comparison of computational schemes with experimental results

    Science.gov (United States)

    Pérez-Huerta, J. S.; Ariza-Flores, D.; Castro-García, R.; Mochán, W. L.; Ortiz, G. P.; Agarwal, V.

    2018-04-01

We report the reflectivity of one-dimensional finite and semi-infinite photonic crystals, computed through the coupling to Bloch modes (BM) and through a transfer matrix method (TMM), and their comparison to the experimental spectral line shapes of porous silicon (PS) multilayer structures. Both methods reproduce a forbidden photonic bandgap (PBG), but slowly-converging oscillations are observed in the TMM as the number of layers increases to infinity, while a smooth converged behavior is obtained with BM. The experimental reflectivity spectra are in good agreement with the TMM results for multilayer structures with a small number of periods. However, for structures with a large number of periods, the measured spectral line shapes exhibit better agreement with the smooth behavior predicted by BM.
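The transfer matrix method compared in this record is standard enough to sketch. Below is an illustrative normal-incidence implementation using the characteristic-matrix formulation; the refractive indices, thicknesses, and ambient/substrate media in the usage comments are arbitrary example values, not parameters from the paper:

```python
import numpy as np

def tmm_reflectivity(n_layers, d_layers, wavelength, n_in=1.0, n_out=1.0):
    """Reflectivity of a 1D multilayer at normal incidence via the
    characteristic-matrix form of the transfer matrix method."""
    M = np.eye(2, dtype=complex)
    for n, d in zip(n_layers, d_layers):
        delta = 2 * np.pi * n * d / wavelength   # phase thickness of the layer
        layer = np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
        M = M @ layer
    # Amplitude reflection coefficient from the stacked matrix and the
    # incident/exit media admittances (n at normal incidence).
    B = M[0, 0] + M[0, 1] * n_out
    C = M[1, 0] + M[1, 1] * n_out
    r = (n_in * B - C) / (n_in * B + C)
    return abs(r) ** 2
```

For a quarter-wave high/low stack, reflectivity at the design wavelength grows toward unity as periods are added, which is the finite-structure behavior the TMM captures.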

  3. First principle calculations of effective exchange integrals: Comparison between SR (BS) and MR computational results

    Energy Technology Data Exchange (ETDEWEB)

    Yamaguchi, Kizashi [Institute for Nano Science Design Center, Osaka University, 1-3 Machikaneyama, Toyonaka, Osaka 560-8531, Japan and TOYOTA Physical and Chemical Research Institute, Nagakute, Aichi, 480-1192 (Japan); Nishihara, Satomichi; Saito, Toru; Yamanaka, Shusuke; Kitagawa, Yasutaka; Kawakami, Takashi; Yamada, Satoru; Isobe, Hiroshi; Okumura, Mitsutaka [Department of Chemistry, Graduate School of Science, Osaka University, 1-1 Machikaneyama, Toyonaka, Osaka 560-0043 (Japan)

    2015-01-22

First principle calculations of effective exchange integrals (J) in the Heisenberg model for diradical species were performed by both symmetry-adapted (SA) multi-reference (MR) and broken-symmetry (BS) single-reference (SR) methods. Mukherjee-type (Mk) state-specific (SS) MR coupled-cluster (CC) calculations using natural orbital (NO) references from ROHF, UHF, UDFT and CASSCF solutions were carried out to elucidate J values for di- and poly-radical species. Spin-unrestricted Hartree-Fock (UHF) based coupled-cluster (CC) computations were also performed for these species. Comparison between UHF-NO(UNO)-MkMRCC and BS UHF-CC computational results indicated that spin contamination of UHF-CC solutions still remains at the SD level. In order to eliminate the spin contamination, an approximate spin-projection (AP) scheme was applied to UCC, and the AP procedure indeed corrected the error to yield good agreement with MkMRCC in energy. CC doubles with spin-unrestricted Brueckner orbitals (UBD) was furthermore employed for these species, showing that the spin contamination involved in UHF solutions is largely suppressed, and the AP scheme for UBCCD therefore easily removed the rest of the spin contamination. We also performed spin-unrestricted pure- and hybrid-density functional theory (UDFT) calculations of diradical and polyradical species. Three different computational schemes for total spin angular momenta were examined for the AP correction of the hybrid (H) UDFT. HUDFT calculations followed by AP, HUDFT(AP), yielded S-T gaps that were qualitatively in good agreement with those of MkMRCCSD, UHF-CC(AP) and UB-CC(AP). Thus a systematic comparison among MkMRCCSD, UCC(AP), UBD(AP) and UDFT(AP) was performed concerning the first-principle calculation of J values in di- and poly-radical species. It was found that BS (AP) methods reproduce MkMRCCSD results, indicating their applicability to large exchange-coupled systems.
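The approximate spin-projection (AP) scheme referenced in this record is commonly applied through Yamaguchi's formula, J = (E_BS - E_HS) / (<S^2>_HS - <S^2>_BS); the abstract does not spell the formula out, so this sketch assumes that standard form and the H = -2J S1.S2 sign convention:

```python
HARTREE_TO_CM1 = 219474.6313632  # conversion factor, hartree -> cm^-1

def yamaguchi_j(e_bs, e_hs, s2_bs, s2_hs):
    """Effective exchange integral J (in the units of the input energies)
    from broken-symmetry (low-spin) and high-spin single-point energies and
    their <S^2> expectation values. Sign convention: H = -2J S1.S2, so a
    negative J indicates antiferromagnetic coupling."""
    return (e_bs - e_hs) / (s2_hs - s2_bs)
```

A J in hartree can then be multiplied by `HARTREE_TO_CM1` for comparison with spectroscopic values.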

  4. An analytical model for backscattered luminance in fog: comparisons with Monte Carlo computations and experimental results

    International Nuclear Information System (INIS)

    Taillade, Frédéric; Dumont, Eric; Belin, Etienne

    2008-01-01

We propose an analytical model for backscattered luminance in fog and derive an expression for the visibility signal-to-noise ratio as a function of meteorological visibility distance. The model uses single scattering processes. It is based on the Mie theory and the geometry of the optical device (emitter and receiver). In particular, we present an overlap function and take the phase function of fog into account. The results for the backscattered luminance obtained with our analytical model are compared to simulations made using the Monte Carlo method based on multiple scattering processes. An excellent agreement is found in that the discrepancy between the results is smaller than the Monte Carlo standard uncertainties. If no account is taken of the geometry of the optical device, the results of the model-estimated backscattered luminance differ from the simulations by a factor of 20. We also conclude that the signal-to-noise ratio computed with the Monte Carlo method and our analytical model is in good agreement with experimental results, since the mean difference between the calculations and the experimental measurements is smaller than the experimental uncertainty.

  5. Development and application of a new deterministic method for calculating computer model result uncertainties

    International Nuclear Information System (INIS)

    Maerker, R.E.; Worley, B.A.

    1989-01-01

Interest in research into the field of uncertainty analysis has recently been stimulated by a need, in high-level waste repository design assessment, for uncertainty information in the form of response complementary cumulative distribution functions (CCDFs) to show compliance with regulatory requirements. The solution to this problem must rely on the analysis of computer code models, which, however, employ parameters that can have large uncertainties. The motivation for the research presented in this paper is a search for a deterministic uncertainty analysis approach that could serve as an improvement over methods that make exclusive use of statistical techniques. A deterministic uncertainty analysis (DUA) approach based on the use of first-derivative information is the method studied in the present procedure. The present method has been applied to a high-level nuclear waste repository problem involving use of the codes ORIGEN2, SAS, and BRINETEMP in series, and the resulting CCDF of a BRINETEMP result of interest is compared with that obtained through a completely statistical analysis.
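First-derivative uncertainty propagation of the kind this record describes can be sketched as follows. The central-difference step size and the assumption of independent parameters (diagonal covariance) are illustrative choices, not details taken from the paper:

```python
import numpy as np

def first_order_uncertainty(f, x0, sigmas, rel_step=1e-6):
    """Propagate parameter standard deviations through a scalar model f
    using central-difference first derivatives:
        var(f) ~= sum_i (df/dx_i * sigma_i)^2
    (valid to first order, assuming independent parameters)."""
    x0 = np.asarray(x0, dtype=float)
    grads = np.empty_like(x0)
    for i in range(x0.size):
        h = rel_step * max(abs(x0[i]), 1.0)
        xp, xm = x0.copy(), x0.copy()
        xp[i] += h
        xm[i] -= h
        grads[i] = (f(xp) - f(xm)) / (2 * h)  # central difference
    return float(np.sqrt(np.sum((grads * np.asarray(sigmas)) ** 2)))
```

A deterministic pass like this needs only 2n model evaluations for n parameters, versus the hundreds or thousands required by a purely statistical (sampling) analysis.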

  6. Result

    Indian Academy of Sciences (India)

With reference to the detailed evaluation of bids submitted, the following agencies have been selected for award of the contract on an L1 (lowest bidder) basis. 1. M/s. CITO INFOTECH, Bengaluru (for procurement of desktop computers). 2. M/s. MCCANNINFO SOLUTION, Mumbai (for procurement of laptop computers)

  7. Iterative reconstruction for quantitative computed tomography analysis of emphysema: consistent results using different tube currents

    Directory of Open Access Journals (Sweden)

    Yamashiro T

    2015-02-01

Tsuneo Yamashiro,1 Tetsuhiro Miyara,1 Osamu Honda,2 Noriyuki Tomiyama,2 Yoshiharu Ohno,3 Satoshi Noma,4 Sadayuki Murayama1 On behalf of the ACTIve Study Group 1Department of Radiology, Graduate School of Medical Science, University of the Ryukyus, Nishihara, Okinawa, Japan; 2Department of Radiology, Osaka University Graduate School of Medicine, Suita, Osaka, Japan; 3Department of Radiology, Kobe University Graduate School of Medicine, Kobe, Hyogo, Japan; 4Department of Radiology, Tenri Hospital, Tenri, Nara, Japan Purpose: To assess the advantages of iterative reconstruction for quantitative computed tomography (CT) analysis of pulmonary emphysema. Materials and methods: Twenty-two patients with pulmonary emphysema underwent chest CT imaging using identical scanners with three different tube currents: 240, 120, and 60 mA. Scan data were converted to CT images using Adaptive Iterative Dose Reduction using Three Dimensional Processing (AIDR3D) and a conventional filtered-back-projection mode. Thus, six scans with and without AIDR3D were generated per patient. All other scanning and reconstruction settings were fixed. The percent low attenuation area (LAA%; < -950 Hounsfield units) and the lung density 15th percentile were automatically measured using a commercial workstation. Comparisons of LAA% and 15th percentile results between scans with and without AIDR3D were made by Wilcoxon signed-rank tests. Associations between body weight and measurement errors among these scans were evaluated by Spearman rank correlation analysis. Results: Overall, scan series without AIDR3D had higher LAA% and lower 15th percentile values than those with AIDR3D at each tube current (P<0.0001). For scan series without AIDR3D, lower tube currents resulted in higher LAA% values and lower 15th percentiles. The extent of emphysema was significantly different between each pair among scans when not using AIDR3D (LAA%, P<0.0001; 15th percentile, P<0.01), but was not
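The two emphysema indices measured in this study (LAA% below -950 HU and the 15th percentile of the lung density histogram) are straightforward to compute from an array of lung-voxel HU values; this sketch assumes the lung voxels have already been segmented out of the CT volume:

```python
import numpy as np

def emphysema_indices(lung_hu, threshold=-950.0, percentile=15.0):
    """Quantitative emphysema metrics from a flat array of lung-voxel HU
    values: LAA% (percentage of voxels below `threshold`) and the HU value
    at the given low percentile of the lung density histogram."""
    hu = np.asarray(lung_hu, dtype=float)
    laa_percent = 100.0 * np.mean(hu < threshold)      # fraction -> percent
    perc_hu = float(np.percentile(hu, percentile))     # e.g. 15th percentile
    return laa_percent, perc_hu
```

Noisier reconstructions push voxel values into the tails of the histogram, which is why lower tube currents without iterative reconstruction inflate LAA% and depress the 15th percentile, as the abstract reports.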

  8. Results of computer assisted mini-incision subvastus approach for total knee arthroplasty.

    Science.gov (United States)

    Turajane, Thana; Larbpaiboonpong, Viroj; Kongtharvonskul, Jatupon; Maungsiri, Samart

    2009-12-01

The mini-incision subvastus approach preserves the soft tissue of the knee. Its advantages include reduced blood loss, reduced pain, self rehabilitation and faster recovery. However, whether it allows the visualization, component alignment, and blood preservation needed to achieve better outcomes and prevent early failure of the Total Knee Arthroplasty (TKA) has been debated. Computer navigation has been introduced to improve alignment and reduce blood loss. The purpose of this study was to evaluate the short-term outcomes of the combined computer assisted mini-incision subvastus approach for Total Knee Arthroplasty (CMS-TKA). A prospective case series of the initial 80 patients who underwent CMS-TKA from January 2007 to October 2008 was carried out. The patients' conditions were classified into 2 groups: the simple OA knee (varus deformity less than 15 degrees, BMI less than 20%, no associated deformities) and the complex deformity (varus deformity more than 15 degrees, BMI more than 20%, associated flexion contracture). There were 59 patients in group 1 and 21 in group 2. Of the 80 knees, 38 were on the left and 42 on the right. The results of CMS-TKA [mean (range)] in group 1 : group 2 were, respectively: incision length [10.88 (8-13) : 11.92 (10-14)], operation time [118 (111.88-125.12) : 131 (119.29-143.71) minutes], lateral releases (0 in both groups), postoperative range of motion in flexion [94.5 (90-100) : 95.25 (90-105) degrees] and extension [1.75 (0-5) : 1.5 (0-5) degrees], blood loss in 24 hours [489.09 (414.7-563.48) : 520 (503.46-636.54) ml], blood transfusion [1 (0-1) unit in both groups], preoperative tibiofemoral angle [varus 4 (0-10) : varus 17.14 (15.7-18.5) degrees], postoperative tibiofemoral angle [valgus 1.38 (0-4) : valgus 2.85 (2.1-3.5) degrees], tibiofemoral angle outliers (85% both

  9. The use of computers in education worldwide : results from a comparative survey in 18 countries

    NARCIS (Netherlands)

    Pelgrum, W.J.; Plomp, T.

    1991-01-01

    In 1989, the International Association for the Evaluation of Educational Achievement (IEA) Computers in Education study collected data on computer use in elementary, and lower- and upper-secondary education in 22 countries. Although all data sets from the participating countries had not been

  10. Computed tomography of the chest in blunt thoracic trauma: results of a prospective study

    International Nuclear Information System (INIS)

    Blostein, P.; Hodgman, C.

    1998-01-01

Blunt thoracic injuries detected by computed tomography of the chest infrequently require immediate therapy. If immediate therapy is needed, findings will be visible on plain roentgenograms or on clinical exam. Routine computed tomography of the chest in blunt trauma is not recommended but may be helpful in selected cases. (N.C.)

  11. Compound analysis of gallstones using dual energy computed tomography-Results in a phantom model

    Energy Technology Data Exchange (ETDEWEB)

Bauer, Ralf W.; Schulz, Julian R. [Department of Diagnostic and Interventional Radiology, Clinic of the Goethe University Frankfurt, Theodor-Stern-Kai 7, 60596 Frankfurt (Germany)]; Zedler, Barbara [Department of Forensic Medicine, Clinic of the Goethe University Frankfurt, Kennedyallee 104, 60596 Frankfurt (Germany)]; Graf, Thomas G. [Siemens AG Healthcare Sector, Computed Tomography, Physics and Applications, Siemensstrasse 1, 91313 Forchheim (Germany)]; Vogl, Thomas J. [Department of Diagnostic and Interventional Radiology, Clinic of the Goethe University Frankfurt, Theodor-Stern-Kai 7, 60596 Frankfurt (Germany)]

    2010-07-15

Purpose: The potential of dual energy computed tomography (DECT) for the analysis of gallstone compounds was investigated. The main goal was to find parameters that can reliably define high-percentage (>70%) cholesterol stones without calcium components. Materials and methods: 35 gallstones were analyzed with DECT using a phantom model. Stone samples were put into specimen containers filled with formalin. Containers were put into a water-filled cylindrical acrylic glass phantom. DECT scans were performed using a tube voltage/current of 140 kV/83 mAs (tube A) and 80 kV/340 mAs (tube B). ROI measurements were performed to determine the CT attenuation of each sector of the stones that had a different appearance on the CT images. Finally, semi-quantitative infrared spectroscopy (FTIR) of these sectors was performed for chemical analysis. Results: ROI measurements were performed in 45 different sectors of 35 gallstones. Sectors containing >70% cholesterol and no calcium component (n = 20) on FTIR could be identified with 95% sensitivity and 100% specificity on DECT. These sectors showed typical attenuation of -8 ± 4 HU at 80 kV and +22 ± 3 HU at 140 kV. Even the presence of a small calcium component (<10%) hindered the reliable identification of cholesterol components as such. Conclusion: Dual energy CT allows for reliable identification of gallstones containing a high percentage of cholesterol and no calcium component in this pre-clinical phantom model. Results from in vivo or anthropomorphic phantom trials will have to confirm these results. This may enable the identification of patients eligible for non-surgical treatment options in the future.
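The reported dual-energy attenuation values suggest a simple rule-of-thumb classifier. The tolerance window below is an illustrative assumption: the paper reports mean attenuations of -8 ± 4 HU at 80 kV and +22 ± 3 HU at 140 kV for high-cholesterol sectors, not an explicit decision rule:

```python
def is_high_cholesterol(hu_80kv, hu_140kv, tol=10.0):
    """Flag a stone sector as likely >70% cholesterol without calcium when
    its dual-energy attenuations fall near the values reported in the
    abstract (about -8 HU at 80 kV and +22 HU at 140 kV). The +/- `tol`
    window is a hypothetical choice for illustration, not a validated
    clinical criterion."""
    return abs(hu_80kv - (-8.0)) <= tol and abs(hu_140kv - 22.0) <= tol
```

Calcified sectors attenuate far more strongly at both energies, so they fall well outside any such window, which is consistent with the abstract's finding that even small calcium fractions break the cholesterol signature.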

  12. 3D ultrasound computer tomography: Hardware setup, reconstruction methods and first clinical results

    Science.gov (United States)

    Gemmeke, Hartmut; Hopp, Torsten; Zapf, Michael; Kaiser, Clemens; Ruiter, Nicole V.

    2017-11-01

    A promising candidate for improved imaging of breast cancer is ultrasound computer tomography (USCT). Current experimental USCT systems are still focused in elevation dimension resulting in a large slice thickness, limited depth of field, loss of out-of-plane reflections, and a large number of movement steps to acquire a stack of images. 3D USCT emitting and receiving spherical wave fronts overcomes these limitations. We built an optimized 3D USCT, realizing for the first time the full benefits of a 3D system. The point spread function could be shown to be nearly isotropic in 3D, to have very low spatial variability and fit the predicted values. The contrast of the phantom images is very satisfactory in spite of imaging with a sparse aperture. The resolution and imaged details of the reflectivity reconstruction are comparable to a 3 T MRI volume. Important for the obtained resolution are the simultaneously obtained results of the transmission tomography. The KIT 3D USCT was then tested in a pilot study on ten patients. The primary goals of the pilot study were to test the USCT device, the data acquisition protocols, the image reconstruction methods and the image fusion techniques in a clinical environment. The study was conducted successfully; the data acquisition could be carried out for all patients with an average imaging time of six minutes per breast. The reconstructions provide promising images. Overlaid volumes of the modalities show qualitative and quantitative information at a glance. This paper gives a summary of the involved techniques, methods, and first results.

  14. Initial results from a prototype whole-body photon-counting computed tomography system.

    Science.gov (United States)

    Yu, Z; Leng, S; Jorgensen, S M; Li, Z; Gutjahr, R; Chen, B; Duan, X; Halaweish, A F; Yu, L; Ritman, E L; McCollough, C H

X-ray computed tomography (CT) with energy-discriminating capabilities presents exciting opportunities for increased dose efficiency and improved material decomposition analyses. However, due to constraints imposed by the inability of photon-counting detectors (PCD) to respond accurately at high photon flux, to date there has been no clinical application of PCD-CT. Recently, our lab installed a research prototype system consisting of two x-ray sources and two corresponding detectors, one using an energy-integrating detector (EID) and the other using a PCD. In this work, we report the first third-party evaluation of this prototype CT system using both phantoms and a cadaver head. The phantom studies demonstrated several promising characteristics of the PCD sub-system, including improved longitudinal spatial resolution and reduced beam hardening artifacts, relative to the EID sub-system. More importantly, we found that the PCD sub-system offers excellent pulse pileup control in cases of x-ray flux up to 550 mA at 140 kV, which corresponds to approximately 2.5 × 10^11 photons per cm^2 per second. In an anthropomorphic phantom and a cadaver head, the PCD sub-system provided image quality comparable to the EID sub-system for the same dose level. Our results demonstrate the potential of the prototype system to produce clinically acceptable images in vivo.

  15. Computer-assisted comparison of analysis and test results in transportation experiments

    International Nuclear Information System (INIS)

    Knight, R.D.; Ammerman, D.J.; Koski, J.A.

    1998-01-01

    As a part of its ongoing research efforts, Sandia National Laboratories' Transportation Surety Center investigates the integrity of various containment methods for hazardous materials transport, subject to anomalous structural and thermal events such as free-fall impacts, collisions, and fires in both open and confined areas. Since it is not possible to conduct field experiments for every set of possible conditions under which an actual transportation accident might occur, accurate modeling methods must be developed which will yield reliable simulations of the effects of accident events under various scenarios. This requires computer software which is capable of assimilating and processing data from experiments performed as benchmarks, as well as data obtained from numerical models that simulate the experiment. Software tools which can present all of these results in a meaningful and useful way to the analyst are a critical aspect of this process. The purpose of this work is to provide software resources on a long term basis, and to ensure that the data visualization capabilities of the Center keep pace with advancing technology. This will provide leverage for its modeling and analysis abilities in a rapidly evolving hardware/software environment

  16. Results of computer-tomographic examination in different forms and course of schizophrenia

    International Nuclear Information System (INIS)

    Stojchev, R.

    1991-01-01

Data are reported from a clinical and computer-tomographic study of 103 schizophrenic patients. Those with the simple form of the disease had the most pronounced evidence of dilated third and lateral ventricles (41.8% of cases for the third ventricle and 72.4% for the lateral ventricles). All patients with the circular, simple and catatonic forms had signs of pathology of the cortical sulci. Regarding the ventricular system, evidence of pathology prevailed in cases of impetus-progredient and constantly progredient course, whereas with respect to cortical pathology the results were almost identical in all three types of psychosis: 95.2% of cases with a constantly progredient course and 95.6% with an impetus-progredient course. Attention is called to the 'surprising' findings of organic brain injury in patients with the paranoid and circular forms of the disease, as well as in the most benign (from a clinical point of view) impetus course. It is assumed that morphologic changes in the brain of schizophrenic patients are a natural phenomenon, but they have so far not been the subject of comprehensive study, perhaps because of prejudice or the lack of appropriate methods for examining the brain during life. 6 figs., 15 refs

  17. Comparison of Computational and Experimental Microphone Array Results for an 18%-Scale Aircraft Model

    Science.gov (United States)

    Lockard, David P.; Humphreys, William M.; Khorrami, Mehdi R.; Fares, Ehab; Casalino, Damiano; Ravetta, Patricio A.

    2015-01-01

    An 18%-scale, semi-span model is used as a platform for examining the efficacy of microphone array processing using synthetic data from numerical simulations. Two hybrid RANS/LES codes coupled with Ffowcs Williams-Hawkings solvers are used to calculate 97 microphone signals at the locations of an array employed in the NASA LaRC 14x22 tunnel. Conventional, DAMAS, and CLEAN-SC array processing is applied in an identical fashion to the experimental and computational results for three different configurations involving deploying and retracting the main landing gear and a part span flap. Despite the short time records of the numerical signals, the beamform maps are able to isolate the noise sources, and the appearance of the DAMAS synthetic array maps is generally better than those from the experimental data. The experimental CLEAN-SC maps are similar in quality to those from the simulations indicating that CLEAN-SC may have less sensitivity to background noise. The spectrum obtained from DAMAS processing of synthetic array data is nearly identical to the spectrum of the center microphone of the array, indicating that for this problem array processing of synthetic data does not improve spectral comparisons with experiment. However, the beamform maps do provide an additional means of comparison that can reveal differences that cannot be ascertained from spectra alone.
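Conventional (delay-and-sum) beamforming of the kind applied to both the experimental and synthetic array signals can be sketched for a single frequency as follows. The free-field monopole steering model and the small 2D geometry are simplifying assumptions for illustration, not the actual 97-microphone processing chain used in the study:

```python
import numpy as np

def conventional_beamform(mic_xy, pressures, grid_xy, k):
    """Single-frequency conventional beamforming: steer a free-field
    monopole model to each grid point and correlate it with the measured
    complex microphone pressures. `k` is the acoustic wavenumber."""
    out = np.empty(len(grid_xy))
    for g, gpt in enumerate(grid_xy):
        r = np.linalg.norm(mic_xy - gpt, axis=1)   # mic-to-grid distances
        e = np.exp(1j * k * r) / r                 # monopole steering vector
        w = e / np.linalg.norm(e)                  # unit-norm weights
        out[g] = abs(np.vdot(w, pressures)) ** 2   # beamformer output power
    return out
```

By the Cauchy-Schwarz inequality, the output peaks where the steering vector is most parallel to the measured pressures, i.e. at the source location; DAMAS and CLEAN-SC then deconvolve the array point-spread function out of such a map.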

  18. Implementation of depression screening in antenatal clinics through tablet computers: results of a feasibility study.

    Science.gov (United States)

    Marcano-Belisario, José S; Gupta, Ajay K; O'Donoghue, John; Ramchandani, Paul; Morrison, Cecily; Car, Josip

    2017-05-10

Mobile devices may facilitate depression screening in the waiting area of antenatal clinics. This can present implementation challenges, of which we focused on survey layout and technology deployment. We assessed the feasibility of using tablet computers to administer a socio-demographic survey, the Whooley questions and the Edinburgh Postnatal Depression Scale (EPDS) to 530 pregnant women attending National Health Service (NHS) antenatal clinics across England. We randomised participants to one of two layout versions of these surveys: (i) a scrolling layout where each survey was presented on a single screen; or (ii) a paging layout where only one question appeared on the screen at any given time. Overall, 85.10% of eligible pregnant women agreed to take part. Of these, 90.95% completed the study procedures. Approximately 23% of participants answered Yes to at least one Whooley question, and approximately 13% of them scored 10 points or more on the EPDS. We observed no association between survey layout and the responses given to the Whooley questions, the median EPDS scores, the number of participants at increased risk of self-harm, or the number of participants asking for technical assistance. However, we observed a difference in the number of participants at each EPDS scoring interval (p = 0.008), which provides an indication of a woman's risk of depression. A scrolling layout resulted in faster completion times (median = 4 min 46 s) than a paging layout (median = 5 min 33 s) (p = 0.024). However, the clinical significance of this difference (47.5 s) is yet to be determined. Tablet computers can be used for depression screening in the waiting area of antenatal clinics. This requires the careful consideration of clinical workflows, and technology-related issues such as connectivity and security. An association between survey layout and EPDS scoring intervals needs to be explored further to determine if it corresponds to a survey layout effect
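The screening logic described in the abstract reduces to two simple flags. The function below is a sketch of those published criteria (a "yes" to at least one of the two Whooley questions, and an EPDS total of 10 or more), not the study's actual software:

```python
def screening_flags(whooley_answers, epds_items):
    """Screening flags as described in the study: a positive Whooley screen
    is 'yes' (True) to at least one of the two case-finding questions; the
    EPDS risk threshold is a total score of 10 or more across the ten
    items (each scored 0-3)."""
    whooley_positive = any(whooley_answers)
    epds_total = sum(epds_items)
    return whooley_positive, epds_total >= 10
```

In a tablet deployment, such flags would typically trigger a clinician review pathway rather than an automated diagnosis.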

  19. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  20. VX hydrolysis by human serum paraoxonase 1: a comparison of experimental and computational results.

    Directory of Open Access Journals (Sweden)

    Matthew W Peterson

    Full Text Available Human serum paraoxonase 1 (HuPON1) is an enzyme that has been shown to hydrolyze a variety of chemicals including the nerve agent VX. While wildtype HuPON1 does not exhibit sufficient activity against VX to be used as an in vivo countermeasure, it has been suggested that increasing HuPON1's organophosphorus hydrolase activity by one or two orders of magnitude would make the enzyme suitable for this purpose. The binding interaction between HuPON1 and VX has recently been modeled, but the mechanism for VX hydrolysis is still unknown. In this study, we created a transition state model for VX hydrolysis (VX(ts)) in water using quantum mechanical/molecular mechanical simulations, and docked the transition state model to 22 experimentally characterized HuPON1 variants using AutoDock Vina. The HuPON1-VX(ts) complexes were grouped by reaction mechanism using a novel clustering procedure. The average Vina interaction energies for different clusters were compared to the experimentally determined activities of HuPON1 variants to determine which computational procedures best predict how well HuPON1 variants will hydrolyze VX. The analysis showed that only conformations which have the attacking hydroxyl group of VX(ts) coordinated by the sidechain oxygen of D269 have a significant correlation with experimental results. The results from this study can be used for further characterization of how HuPON1 hydrolyzes VX and design of HuPON1 variants with increased activity against VX.
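
    Comparing cluster-averaged docking energies with measured activities, as done above, amounts to a simple correlation analysis. A generic Pearson-correlation sketch with hypothetical per-variant numbers (not the study's data):

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-variant data: mean Vina interaction energy (kcal/mol,
# more negative = tighter binding) vs. measured relative hydrolysis activity.
energies = [-6.1, -5.8, -6.5, -5.2, -6.9]
activities = [1.0, 0.7, 1.4, 0.3, 1.9]
print(round(pearson(energies, activities), 3))
```

    A strongly negative coefficient here would mirror the study's finding that lower (more favorable) interaction energies track higher activity, for the mechanistically consistent cluster only.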

  1. Previously unknown species of Aspergillus.

    Science.gov (United States)

    Gautier, M; Normand, A-C; Ranque, S

    2016-08-01

    The use of multi-locus DNA sequence analysis has led to the description of previously unknown 'cryptic' Aspergillus species, whereas classical morphology-based identification of Aspergillus remains limited to the section or species-complex level. The current literature highlights two main features concerning these 'cryptic' Aspergillus species. First, the prevalence of such species in clinical samples is relatively high compared with emergent filamentous fungal taxa such as Mucorales, Scedosporium or Fusarium. Second, it is clearly important to identify these species in the clinical laboratory because of the high frequency of antifungal drug-resistant isolates of such Aspergillus species. Matrix-assisted laser desorption/ionization-time of flight mass spectrometry (MALDI-TOF MS) has recently been shown to enable the identification of filamentous fungi with an accuracy similar to that of DNA sequence-based methods. As MALDI-TOF MS is well suited to the routine clinical laboratory workflow, it facilitates the identification of these 'cryptic' Aspergillus species at the routine mycology bench. The rapid establishment of enhanced filamentous fungi identification facilities will lead to a better understanding of the epidemiology and clinical importance of these emerging Aspergillus species. Based on routine MALDI-TOF MS-based identification results, we provide original insights into the key interpretation issues of a positive Aspergillus culture from a clinical sample. Which ubiquitous species, frequently isolated from air samples, are rarely involved in invasive human disease? Can the species, together with the type of biological sample, indicate Aspergillus carriage, colonization or infection in a patient? Highly accurate routine filamentous fungi identification is central to enhance the understanding of these previously unknown Aspergillus species, with a vital impact on further improved patient care. Copyright © 2016 European Society of Clinical Microbiology and

  2. New results on classical problems in computational geometry in the plane

    DEFF Research Database (Denmark)

    Abrahamsen, Mikkel

    In this thesis, we revisit three classical problems in computational geometry in the plane. An obstacle that often occurs as a subproblem in more complicated problems is to compute the common tangents of two disjoint, simple polygons. For instance, the common tangents turn up in problems related to visibility, collision avoidance, shortest paths, etc. We provide a remarkably simple algorithm to compute all (at most four) common tangents of two disjoint simple polygons. Given each polygon as a read-only array of its corners in cyclic order, the algorithm runs in linear time and constant workspace, and is the first to achieve the two complexity bounds simultaneously. The set of common tangents provides basic information about the convex hulls of the polygons—whether they are nested, overlapping, or disjoint—and our algorithm thus also decides this relationship. One of the best-known problems in computational...
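
    What "common tangent" means here can be illustrated with a brute-force check for convex polygons: a line through one vertex of each polygon is tangent to a convex polygon exactly when all of that polygon's vertices lie on one side of the line. The sketch below is illustrative only; the quadratic-time enumeration and the convexity assumption are not the thesis's linear-time, constant-workspace algorithm for simple polygons.

```python
def cross(o, a, b):
    """2D cross product (a-o) x (b-o): >0 if b lies left of the ray o->a."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def tangent_to(p, q, poly):
    """Line p-q is tangent to convex poly if all vertices lie on one side."""
    sides = [cross(p, q, r) for r in poly]
    return all(s >= 0 for s in sides) or all(s <= 0 for s in sides)

def common_tangents(P, Q):
    """Brute-force enumeration of vertex pairs spanning a common tangent.
    Assumes convex polygons in general position (illustration only)."""
    return [(p, q) for p in P for q in Q
            if tangent_to(p, q, P) and tangent_to(p, q, Q)]

# Two disjoint triangles in general position: two outer and two inner tangents.
P = [(0, 0), (2, 0), (0, 2)]
Q = [(5, 1), (7, 1), (5, 3)]
print(len(common_tangents(P, Q)))  # → 4
```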

  3. Speed test results and hardware/software study of computational speed problem, appendix D

    Science.gov (United States)

    1984-01-01

    The HP9845C is a desktop computer which was tested and evaluated for processing speed. A study was made to determine the availability and approximate cost of computers and/or hardware accessories necessary to meet the 20 ms sample period speed requirement. Additional requirements were that the control algorithm could be programmed in a high-level language and that the machine have sufficient storage to store the data from a complete experiment.

  4. Risk perception and risk management in cloud computing: results from a case study of Swiss companies

    OpenAIRE

    Brender, Nathalie; Markov, Iliya

    2013-01-01

    In today's economic turmoil, the pay-per-use pricing model of cloud computing, its flexibility and scalability and the potential for better security and availability levels are alluring to both SMEs and large enterprises. However, cloud computing is fraught with security risks which need to be carefully evaluated before any engagement in this area. This article elaborates on the most important risks inherent to the cloud such as information security, regulatory compliance, data location, inve...

  5. Computer-Based Cognitive Training for Mild Cognitive Impairment: Results from a Pilot Randomized, Controlled Trial

    OpenAIRE

    Barnes, Deborah E.; Yaffe, Kristine; Belfor, Nataliya; Jagust, William J.; DeCarli, Charles; Reed, Bruce R.; Kramer, Joel H.

    2009-01-01

    We performed a pilot randomized, controlled trial of intensive, computer-based cognitive training in 47 subjects with mild cognitive impairment (MCI). The intervention group performed exercises specifically designed to improve auditory processing speed and accuracy for 100 minutes/day, 5 days/week for 6 weeks; the control group performed more passive computer activities (reading, listening, visuospatial game) for similar amounts of time. Subjects had a mean age of 74 years and 60% were men; 7...

  6. Calculating buoy response for a wave energy converter—A comparison of two computational methods and experimental results

    Directory of Open Access Journals (Sweden)

    Linnea Sjökvist

    2017-05-01

    Full Text Available When designing a wave power plant, reliable and fast simulation tools are required. Computational fluid dynamics (CFD) software provides high accuracy but with a very high computational cost, and in operational, moderate sea states, linear potential flow theories may be sufficient to model the hydrodynamics. In this paper, a model is built in COMSOL Multiphysics to solve for the hydrodynamic parameters of a point-absorbing wave energy device. The results are compared with a linear model where the hydrodynamic parameters are computed using WAMIT, and to experimental results from the Lysekil research site. The agreement with experimental data is good for both numerical models.
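
    In the linear regime described above, the heave of a point absorber can be approximated by a driven mass-spring-damper equation, (m + A) z'' + B z' + C z = F0 cos(wt), where A and B are the added mass and radiation (plus PTO) damping that a potential-flow code such as WAMIT would supply, and C = rho * g * Awl is the hydrostatic stiffness. A time-stepping sketch with hypothetical coefficients, checked against the closed-form steady-state amplitude:

```python
import math

# Hypothetical point-absorber heave model: (m + A) z'' + B z' + C z = F(t)
m, A = 3000.0, 1200.0      # buoy mass and added mass [kg] (illustrative)
B = 800.0                  # radiation + PTO damping [kg/s] (illustrative)
C = 1025.0 * 9.81 * 7.0    # hydrostatic stiffness rho*g*Awl [N/m], Awl = 7 m^2
F0, omega = 2.0e4, 0.8     # wave excitation amplitude [N], frequency [rad/s]

def simulate(T=600.0, dt=0.005):
    """Semi-implicit Euler time stepping; returns peak heave over the last 100 s,
    by which time the start-up transient has decayed."""
    z, v, peak, t = 0.0, 0.0, 0.0, 0.0
    while t < T:
        a = (F0 * math.cos(omega * t) - B * v - C * z) / (m + A)
        v += a * dt
        z += v * dt
        if t > T - 100.0:
            peak = max(peak, abs(z))
        t += dt
    return peak

# Closed-form steady-state amplitude of the same linear oscillator:
amp = F0 / math.sqrt((C - (m + A) * omega**2)**2 + (B * omega)**2)
print(simulate(), amp)
```

    Agreement between the time-stepped peak and the analytic amplitude is the kind of cross-check a linear frequency-domain model permits; CFD is needed once the response leaves this linear regime.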

  7. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  8. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  9. Initial quality performance results using a phantom to simulate chest computed radiography

    Directory of Open Access Journals (Sweden)

    Muhogora Wilbroad

    2011-01-01

    Full Text Available The aim of this study was to develop a homemade phantom for quantitative quality control in chest computed radiography (CR). The phantom was constructed from copper, aluminium, and polymethylmethacrylate (PMMA) plates as well as Styrofoam materials. Depending on combinations, the literature suggests that these materials can simulate the attenuation and scattering characteristics of lung, heart, and mediastinum. The lung, heart, and mediastinum regions were simulated by 10 mm x 10 mm x 0.5 mm, 10 mm x 10 mm x 0.5 mm and 10 mm x 10 mm x 1 mm copper plates, respectively. A test object of 100 mm x 100 mm and 0.2 mm thick copper was positioned at each region for CNR measurements. The phantom was exposed to x-rays generated by different tube potentials that covered settings in clinical use: 110-120 kVp (HVL=4.26-4.66 mm Al) at a source image distance (SID) of 180 cm. An approach similar to the recommended method in digital mammography was applied to determine the CNR values of phantom images produced by a Kodak CR 850A system with post-processing turned off. Subjective contrast-detail studies were also carried out by using images of the Leeds TOR CDR test object acquired under similar exposure conditions as during CNR measurements. For clinical kVp conditions relevant to chest radiography, the CNR was highest over the 90-100 kVp range. The CNR data correlated with the results of contrast-detail observations. The values of clinical tube potentials at which CNR is highest are regarded as optimal kVp settings. The simplicity of the phantom construction can offer easy implementation of a related quality control program.
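
    The CNR approach borrowed from digital mammography reduces to comparing a test-object ROI with an adjacent background ROI: CNR = (mean signal - mean background) / standard deviation of background. A minimal sketch with hypothetical pixel values:

```python
from statistics import mean, pstdev

def cnr(signal_roi, background_roi):
    """Contrast-to-noise ratio in the digital-mammography QC style:
    (mean signal - mean background) / population std dev of background."""
    return (mean(signal_roi) - mean(background_roi)) / pstdev(background_roi)

# Hypothetical pixel values: an ROI over the 0.2 mm copper test object
# and an adjacent background ROI in the simulated lung region.
signal = [132, 130, 131, 133, 129, 131]
background = [120, 122, 121, 119, 120, 118]
print(round(cnr(signal, background), 2))  # → 8.52
```

    Repeating this measurement across tube potentials and plotting CNR against kVp is what identifies the optimum range reported above.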

  10. [Excessive computer usage in adolescents--results of a psychometric evaluation].

    Science.gov (United States)

    Grüsser, Sabine M; Thalemann, Ralf; Albrecht, Ulrike; Thalemann, Carolin N

    2005-03-01

    Excessive computer and video game playing among children is being critically discussed from a pedagogic and public health point of view. To date, no reliable data on this phenomenon exist for Germany. In the present study, the excessive usage of computer and video games is seen as a rewarding behavior which can, due to learning mechanisms, become a prominent and inadequate strategy for children to cope with negative emotions like frustration, uneasiness and fears. In the survey, 323 children ranging in age from 11 to 14 years were asked about their video game playing behavior. Criteria for excessive computer and video game playing were developed in accordance with the criteria for dependency and pathological gambling (DSM-IV, ICD-10). Data show that 9.3% (N = 30) of the children fulfill all criteria for excessive computer and video game playing. Furthermore, these children differ from their classmates with respect to watching television, communication patterns, the ability to concentrate in school lectures and the preferred strategies for coping with negative emotions. In accordance with findings in studies about substance-related addiction, the data suggest that excessive computer and video game players use their excessive rewarding behavior specifically as an inadequate stress-coping strategy.

  11. Computational results with a branch and cut code for the capacitated vehicle routing problem

    Energy Technology Data Exchange (ETDEWEB)

    Augerat, P.; Naddef, D. [Institut National Polytechnique, 38 - Grenoble (France); Belenguer, J.M.; Benavent, E.; Corberan, A. [Valencia Univ. (Spain); Rinaldi, G. [Consiglio Nazionale delle Ricerche, Rome (Italy)

    1995-09-01

    The Capacitated Vehicle Routing Problem (CVRP) we consider in this paper consists in the optimization of the distribution of goods from a single depot to a given set of customers with known demand, using a given number of vehicles of fixed capacity. There are many practical routing applications in the public sector, such as school bus routing, pick up and mail delivery, and in the private sector, such as the dispatching of delivery trucks. We present a Branch and Cut algorithm to solve the CVRP which is based on a partial polyhedral description of the corresponding polytope. The valid inequalities used in our method can be found in Cornuejols and Harche (1993), Harche and Rinaldi (1991) and in Augerat and Pochet (1995). We concentrated mainly on the design of separation procedures for several classes of valid inequalities. The capacity constraints (generalized sub-tour elimination inequalities) happen to play a crucial role in the development of a cutting plane algorithm for the CVRP. A large number of separation heuristics have been implemented and compared for these inequalities. Heuristic separation algorithms have also been implemented for other classes of valid inequalities that lead to significant improvements: comb and extended comb inequalities, generalized capacity inequalities and hypo-tour inequalities. The resulting cutting plane algorithm has been applied to a set of instances taken from the literature and the lower bounds obtained are better than the ones previously known. Some branching strategies have been implemented to develop a Branch and Cut algorithm that has been able to solve large CVRP instances, some of which had never been solved before. (authors). 32 refs., 3 figs., 10 tabs.
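
    The capacity (generalized sub-tour elimination) inequalities mentioned above require, for every customer set S, that at least 2 * ceil(d(S)/Q) route edges cross the boundary of S. A minimal sketch of evaluating one such cut against a fractional LP solution, with toy data; the separation heuristics in the paper search for sets S that violate it:

```python
from math import ceil

def capacity_cut_violation(S, demand, Q, x):
    """Violation of the capacity inequality x(delta(S)) >= 2*ceil(d(S)/Q).

    demand: dict customer -> demand; Q: vehicle capacity;
    x: dict frozenset({i, j}) -> LP edge value (node 0 is the depot).
    Returns rhs - lhs; a positive value means the cut is violated.
    """
    S = set(S)
    lhs = sum(v for e, v in x.items() if len(e & S) == 1)  # edges crossing S
    rhs = 2 * ceil(sum(demand[i] for i in S) / Q)
    return rhs - lhs

# Toy fractional solution: one route 0-1-2-3-0 serving total demand 12 > Q.
demand, Q = {1: 4, 2: 5, 3: 3}, 10
x = {frozenset({0, 1}): 1.0, frozenset({1, 2}): 1.0,
     frozenset({2, 3}): 1.0, frozenset({0, 3}): 1.0}
print(capacity_cut_violation({1, 2, 3}, demand, Q, x))  # d(S)=12 needs 2 vehicles
```

    Here the boundary of S = {1, 2, 3} is crossed by only two edges while two vehicles (four crossings) are required, so the inequality is violated and would be added as a cutting plane.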

  12. Computer use and ulnar neuropathy: results from a case-referent study

    DEFF Research Database (Denmark)

    Andersen, JH; Frost, P.; Fuglsang-Frederiksen, A.

    2012-01-01

    We aimed to evaluate associations between vocational computer use and 1) ulnar neuropathy, and 2) ulnar neuropathy-like symptoms as distinguished by electroneurography. We identified all patients aged 18-65 years, examined at the Department of Neurophysiology on suspicion of ulnar neuropathy, 2001... was performed by conditional logistic regression. There was a negative association between daily hours of computer use and the two outcomes of interest. Participants who reported their elbow to be in contact with their working table for 2 hours or more during the workday had an elevated risk for ulnar...

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  14. Effective Computer-Aided Assessment of Mathematics; Principles, Practice and Results

    Science.gov (United States)

    Greenhow, Martin

    2015-01-01

    This article outlines some key issues for writing effective computer-aided assessment (CAA) questions in subjects with substantial mathematical or statistical content, especially the importance of control of random parameters and the encoding of wrong methods of solution (mal-rules) commonly used by students. The pros and cons of using CAA and…

  15. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger at roughly 11 MB per event of RAW. The central collisions are more complex and...

  16. Manned systems utilization analysis (study 2.1). Volume 3: LOVES computer simulations, results, and analyses

    Science.gov (United States)

    Stricker, L. T.

    1975-01-01

    The LOVES computer program was employed to analyze the geosynchronous portion of NASA's 1973 automated satellite mission model from 1980 to 1990. The objectives of the analyses were: (1) to demonstrate the capability of the LOVES code to provide the depth and accuracy of data required to support the analyses; and (2) to trade off the concept of space servicing automated satellites composed of replaceable modules against the concept of replacing expendable satellites upon failure. The computer code proved to be an invaluable tool in analyzing the logistic requirements of the various test cases required in the tradeoff. It is indicated that the concept of space servicing offers the potential for substantial savings in the cost of operating automated satellite systems.

  17. Using the genome aggregation database, computational pathogenicity prediction tools, and patch clamp heterologous expression studies to demote previously published long QT syndrome type 1 mutations from pathogenic to benign.

    Science.gov (United States)

    Clemens, Daniel J; Lentino, Anne R; Kapplinger, Jamie D; Ye, Dan; Zhou, Wei; Tester, David J; Ackerman, Michael J

    2018-04-01

    Mutations in the KCNQ1-encoded Kv7.1 potassium channel cause long QT syndrome (LQTS) type 1 (LQT1). It has been suggested that ∼10%-20% of rare LQTS case-derived variants in the literature may have been published erroneously as LQT1-causative mutations and may be "false positives." The purpose of this study was to determine which previously published KCNQ1 case variants are likely false positives. A list of all published, case-derived KCNQ1 missense variants (MVs) was compiled. The occurrence of each MV within the Genome Aggregation Database (gnomAD) was assessed. Eight in silico tools were used to predict each variant's pathogenicity. Case-derived variants that were either (1) too frequently found in gnomAD or (2) absent in gnomAD but predicted to be pathogenic by ≤2 tools were considered potential false positives. Three of these variants were characterized functionally using whole-cell patch clamp technique. Overall, there were 244 KCNQ1 case-derived MVs. Of these, 29 (12%) were seen in ≥10 individuals in gnomAD and are demotable. However, 157 of 244 MVs (64%) were absent in gnomAD. Of these, 7 (4%) were predicted to be pathogenic by ≤2 tools, 3 of which we characterized functionally. There was no significant difference in current density between heterozygous KCNQ1-F127L, -P477L, or -L619M variant-containing channels compared to KCNQ1-WT. This study offers preliminary evidence for the demotion of 32 (13%) previously published LQT1 MVs. Of these, 29 were demoted because of their frequent sighting in gnomAD. Additionally, in silico analysis and in vitro functional studies have facilitated the demotion of 3 ultra-rare MVs (F127L, P477L, L619M). Copyright © 2017 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.
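
    The demotion criteria described above reduce to a simple triage over gnomAD occurrence counts and in-silico tool votes. A sketch with hypothetical labels; the study's actual workflow also involved manual review and patch clamp follow-up:

```python
def classify_variant(gnomad_count, pathogenic_tool_votes):
    """Triage rule sketched from the study: variants seen in >= 10 gnomAD
    individuals are demotable; variants absent from gnomAD but called
    pathogenic by <= 2 of the 8 in-silico tools are flagged for functional
    (patch clamp) follow-up; everything else keeps its published status."""
    if gnomad_count >= 10:
        return "demote"
    if gnomad_count == 0 and pathogenic_tool_votes <= 2:
        return "functional follow-up"
    return "retain pending further evidence"

print(classify_variant(25, 5))   # frequent in gnomAD
print(classify_variant(0, 1))    # absent, weak in-silico support
```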

  18. Highlights from the previous volumes

    Science.gov (United States)

    Vergini Eduardo, G.; Pan, Y.; al., Vardi R. et; al., Akkermans Eric et; et al.

    2014-01-01

    Semiclassical propagation up to the Heisenberg time
    Superconductivity and magnetic order in the half-Heusler compound ErPdBi
    An experimental evidence-based computational paradigm for new logic-gates in neuronal activity
    Universality in the symmetric exclusion process and diffusive systems

  19. The determination of surface of powders by BET method using nitrogen and krypton with computer calculation of the results

    International Nuclear Information System (INIS)

    Dembinski, W.; Zlotowski, T.

    1973-01-01

    A computer program written in FORTRAN for calculating the final results of specific surface analysis based on BET theory has been described. Two gases, nitrogen and krypton, were used. A technical description of the measuring apparatus is presented, as well as the theoretical basis of the calculations together with a statistical analysis of the results for uranium compound powders. (author)
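
    The BET calculation such a program performs can be sketched as a least-squares fit of the linearized BET equation, 1/(v((p0/p) - 1)) = ((c - 1)/(vm*c))(p/p0) + 1/(vm*c), followed by conversion of the monolayer volume vm to a specific surface. A minimal sketch with hypothetical nitrogen isotherm data, assuming the usual 0.162 nm^2 N2 cross-section:

```python
N_A = 6.022e23          # Avogadro's number [1/mol]
SIGMA_N2 = 0.162e-18    # cross-section of an adsorbed N2 molecule [m^2]

def bet_surface(rel_pressures, volumes, mass_g):
    """Least-squares fit of the BET transform; returns (vm, c, S in m^2/g).

    rel_pressures: p/p0 values; volumes: adsorbed gas [cm^3 STP]."""
    xs = rel_pressures
    ys = [1.0 / (v * (1.0 / x - 1.0)) for x, v in zip(xs, volumes)]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    vm = 1.0 / (slope + intercept)   # monolayer volume [cm^3 STP]
    c = 1.0 + slope / intercept      # BET constant
    surface = vm * N_A * SIGMA_N2 / (22414.0 * mass_g)  # [m^2/g]
    return vm, c, surface

# Hypothetical isotherm for a 1 g sample (values fabricated for illustration):
p_rel = [0.05, 0.10, 0.20, 0.30]
v_ads = [1.769129, 2.038736, 2.403846, 2.791995]
vm, c, s = bet_surface(p_rel, v_ads, mass_g=1.0)
print(round(vm, 3), round(c, 1), round(s, 2))
```

    A statistical check on the fit (residuals, confidence interval of the slope) is what the statistical analysis mentioned in the abstract would add on top of this core calculation.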

  20. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  2. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  4. Formulation, computation and improvement of steady state security margins in power systems. Part II: Results

    International Nuclear Information System (INIS)

    Echavarren, F.M.; Lobato, E.; Rouco, L.; Gomez, T.

    2011-01-01

    A steady state security margin for a particular operating point can be defined as the distance from this initial point to the secure operating limits of the system. Four of the most used steady state security margins are the power flow feasibility margin, the contingency feasibility margin, the load margin to voltage collapse, and the total transfer capability between system areas. This is the second part of a two part paper. Part I has proposed a novel framework of a general model able to formulate, compute and improve any steady state security margin. In Part II the performance of the general model is validated by solving a variety of practical situations in modern real power systems. Actual examples of the Spanish power system will be used for this purpose. The same computation and improvement algorithms outlined in Part I have been applied for the four security margins considered in the study, outlining the convenience of defining a general framework valid for the four of them. The general model is used here in Part II to compute and improve: (a) the power flow feasibility margin (assessing the influence of the reactive power generation limits in the Spanish power system), (b) the contingency feasibility margin (assessing the influence of transmission and generation capacity in maintaining a correct voltage profile), (c) the load margin to voltage collapse (assessing the location and quantity of loads that must be shed in order to be far away from voltage collapse) and (d) the total transfer capability (assessing the export import pattern of electric power between different areas of the Spanish system). (author)
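
    Item (c), the load margin to voltage collapse, can be illustrated on the textbook two-bus case (a single lossless line, unity-power-factor load), where the power flow has a real voltage solution only up to a maximum loading. A bisection sketch with illustrative per-unit values, not the paper's general model:

```python
def feasible(P, E=1.0, X=0.5):
    """Two-bus lossless system with a unity-power-factor load: a voltage
    solution of V^4 - E^2 V^2 + X^2 P^2 = 0 exists iff the discriminant
    E^4 - 4 X^2 P^2 is non-negative (collapse point at P = E^2 / (2X))."""
    return E**4 - 4.0 * X**2 * P**2 >= 0.0

def load_margin(P0, E=1.0, X=0.5, tol=1e-9):
    """Bisection for the distance from operating load P0 to voltage collapse."""
    lo, hi = P0, P0 + 10.0          # assume collapse occurs within +10 pu
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if feasible(mid, E, X) else (lo, mid)
    return lo - P0

print(round(load_margin(0.6), 4))   # analytic collapse at E^2/(2X) = 1.0 pu
```

    The paper's framework generalizes this one-dimensional distance-to-infeasibility idea to full network models and to the other three margins.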

  5. Formulation, computation and improvement of steady state security margins in power systems. Part II: Results

    Energy Technology Data Exchange (ETDEWEB)

    Echavarren, F.M.; Lobato, E.; Rouco, L.; Gomez, T. [School of Engineering of Universidad Pontificia Comillas, C/Alberto Aguilera, 23, 28015 Madrid (Spain)

    2011-02-15

    A steady state security margin for a particular operating point can be defined as the distance from this initial point to the secure operating limits of the system. Four of the most used steady state security margins are the power flow feasibility margin, the contingency feasibility margin, the load margin to voltage collapse, and the total transfer capability between system areas. This is the second part of a two part paper. Part I has proposed a novel framework of a general model able to formulate, compute and improve any steady state security margin. In Part II the performance of the general model is validated by solving a variety of practical situations in modern real power systems. Actual examples of the Spanish power system will be used for this purpose. The same computation and improvement algorithms outlined in Part I have been applied for the four security margins considered in the study, outlining the convenience of defining a general framework valid for the four of them. The general model is used here in Part II to compute and improve: (a) the power flow feasibility margin (assessing the influence of the reactive power generation limits in the Spanish power system), (b) the contingency feasibility margin (assessing the influence of transmission and generation capacity in maintaining a correct voltage profile), (c) the load margin to voltage collapse (assessing the location and quantity of loads that must be shed in order to be far away from voltage collapse) and (d) the total transfer capability (assessing the export import pattern of electric power between different areas of the Spanish system). (author)

  6. Usefulness of indirect alcohol biomarkers for predicting recidivism of drunk-driving among previously convicted drunk-driving offenders: results from the recidivism of alcohol-impaired driving (ROAD) study.

    Science.gov (United States)

    Maenhout, Thomas M; Poll, Anneleen; Vermassen, Tijl; De Buyzere, Marc L; Delanghe, Joris R

    2014-01-01

    In several European countries, drivers under the influence (DUI), suspected of chronic alcohol abuse are referred for medical and psychological examination. This study (the ROAD study, or Recidivism Of Alcohol-impaired Driving) investigated the usefulness of indirect alcohol biomarkers for predicting drunk-driving recidivism in previously convicted drunk-driving offenders. The ROAD study is a prospective study (2009-13) that was performed on 517 randomly selected drivers in Belgium. They were convicted for drunk-driving for which their licence was confiscated. The initial post-arrest blood samples were collected and analysed for percentage carbohydrate-deficient transferrin (%CDT), transaminase activities [alanine amino transferase (ALT), aspartate amino transferase (AST)], gamma-glutamyltransferase (γGT) and red cell mean corpuscular volume (MCV). The observation time for each driver was 3 years and dynamic. A logistic regression analysis revealed that ln(%CDT) (P drunk-driving. The ROAD index (which includes ln(%CDT), ln(γGT), -ln(ALT) and the sex of the driver) was calculated and had a significantly higher area under the receiver operator characteristic curve (0.71) than the individual biomarkers for drunk-driving recidivism. Drivers with a high risk of recidivating (ROAD index ≥ 25%; third tertile) could be distinguished from drivers with an intermediate risk (16% ≤ ROAD index drunk-driving. The association with gamma-glutamyltransferase, alanine amino transferase and the sex of the driver could have additional value for identifying drunk-drivers at intermediate risk of recidivism. Non-specific indirect alcohol markers, such as alanine amino transferase, gamma-glutamyltransferase, aspartate amino transferase and red cell mean corpuscular volume have minimal added value to % carbohydrate-deficient transferrin for distinguishing drunk drivers with a low or high risk of recidivism. © 2013 Society for the Study of Addiction.
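
    The ROAD index described above is a logistic combination of ln(%CDT), ln(γGT), -ln(ALT) and the driver's sex. A structural sketch with hypothetical coefficients (the published model's coefficients are deliberately not reproduced here), using the tertile cut-offs quoted in the abstract:

```python
from math import exp, log

def road_index(cdt_pct, ggt, alt, male):
    """Sketch of a ROAD-style recidivism index: a logistic model over
    ln(%CDT), ln(gammaGT), -ln(ALT) and sex. All coefficients below are
    hypothetical placeholders, not the study's fitted values."""
    b0, b_cdt, b_ggt, b_alt, b_sex = -3.0, 1.8, 0.6, 0.9, 0.7  # hypothetical
    z = (b0 + b_cdt * log(cdt_pct) + b_ggt * log(ggt)
         - b_alt * log(alt) + b_sex * (1 if male else 0))
    return 1.0 / (1.0 + exp(-z))    # predicted probability of recidivism

def risk_tertile(p):
    """Tertile cut-offs from the abstract: >= 25% high; 16-25% intermediate."""
    return "high" if p >= 0.25 else ("intermediate" if p >= 0.16 else "low")

print(risk_tertile(road_index(cdt_pct=2.5, ggt=40.0, alt=25.0, male=True)))
```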

  7. Computational Fluid Dynamics Modeling Of Scaled Hanford Double Shell Tank Mixing - CFD Modeling Sensitivity Study Results

    International Nuclear Information System (INIS)

    Jackson, V.L.

    2011-01-01

    The primary purpose of the tank mixing and sampling demonstration program is to mitigate the technical risks associated with the ability of the Hanford tank farm delivery and certification systems to measure and deliver a uniformly mixed high-level waste (HLW) feed to the Waste Treatment and Immobilization Plant (WTP). Uniform feed to the WTP is a requirement of 24590-WTP-ICD-MG-01-019, ICD-19 - Interface Control Document for Waste Feed, although the exact definition of uniform is evolving in this context. Computational Fluid Dynamics (CFD) modeling has been used to assist in evaluating scale-up issues, study operational parameters, and predict mixing performance at full scale.

  8. Integrated Berth Allocation and Quay Crane Assignment Problem: Set partitioning models and computational results

    DEFF Research Database (Denmark)

    Iris, Cagatay; Pacino, Dario; Røpke, Stefan

    2015-01-01

    Most of the operational problems in container terminals are strongly interconnected. In this paper, we study the integrated Berth Allocation and Quay Crane Assignment Problem in seaport container terminals. We will extend the current state-of-the-art by proposing novel set partitioning models....... To improve the performance of the set partitioning formulations, a number of variable reduction techniques are proposed. Furthermore, we analyze the effects of different discretization schemes and the impact of using a time-variant/invariant quay crane allocation policy. Computational experiments show...

  9. Inhaling Difluoroethane Computer Cleaner Resulting in Acute Kidney Injury and Chronic Kidney Disease

    Directory of Open Access Journals (Sweden)

    Kristen Calhoun

    2018-01-01

    Full Text Available Difluoroethane is the active ingredient in various computer cleaners and is increasingly abused by teenagers due to its ease of access, quick onset of euphoric effects, and lack of detectability on current urine drug screens. The substance has detrimental effects on various organ systems; however, its effects on the kidneys remain largely unreported. The following case report adds new information to the developing topic of acute kidney injury in patients abusing difluoroethane inhalants. In addition, it is one of the first to show a possible relationship between prolonged difluoroethane abuse and the development of chronic kidney disease in the absence of other predisposing risk factors.

  10. Preparing computers for affective communication: a psychophysiological concept and preliminary results.

    Science.gov (United States)

    Whang, Min Cheol; Lim, Joa Sang; Boucsein, Wolfram

    Despite rapid advances in technology, computers remain incapable of responding to human emotions. An exploratory study was conducted to find out what physiological parameters might be useful to differentiate among 4 emotional states, based on 2 dimensions: pleasantness versus unpleasantness and arousal versus relaxation. The 4 emotions were induced by exposing 26 undergraduate students to different combinations of olfactory and auditory stimuli, selected in a pretest from 12 stimuli by subjective ratings of arousal and valence. Changes in electroencephalographic (EEG), heart rate variability, and electrodermal measures were used to differentiate the 4 emotions. EEG activity separates pleasantness from unpleasantness only in the aroused but not in the relaxed domain, where electrodermal parameters are the differentiating ones. All three classes of parameters contribute to a separation between arousal and relaxation in the positive valence domain, whereas the latency of the electrodermal response is the only differentiating parameter in the negative domain. We discuss how such a psychophysiological approach may be incorporated into a systemic model of a computer responsive to affective communication from the user.

  11. Model of Procedure Usage – Results from a Qualitative Study to Inform Design of Computer-Based Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Johanna H Oxstrand; Katya L Le Blanc

    2012-07-01

    The nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. As a step toward the goal of improving procedure use performance, researchers, together with the nuclear industry, have been looking at replacing the current paper-based procedures with computer-based procedure systems. The concept of computer-based procedures is not new by any means; however, most research has focused on procedures used in the main control room. Procedures reviewed in these efforts are mainly emergency operating procedures and normal operating procedures. Based on lessons learned from these previous efforts, we are now exploring a less-studied application for computer-based procedures - field procedures, i.e. procedures used by nuclear equipment operators and maintenance technicians. The Idaho National Laboratory, the Institute for Energy Technology, and participants from the U.S. commercial nuclear industry are collaborating in an applied research effort with the objective of developing requirements and specifications for a computer-based procedure system to be used by field operators. The goal is to identify the types of human errors that can be mitigated by using computer-based procedures and how best to design the computer-based procedures to do this. The underlying philosophy in the research effort is “Stop – Start – Continue”, i.e. what features from the use of paper-based procedures should we not incorporate (Stop), what should we keep (Continue), and what new features or work processes should be added (Start). One step in identifying the Stop – Start – Continue was to conduct a baseline study in which affordances related to the current usage of paper-based procedures were identified. The purpose of the study was to develop a model of paper-based procedure use which will help to identify desirable features for computer-based procedure prototypes. Affordances such as note taking, markups

  12. Preoperative screening: value of previous tests.

    Science.gov (United States)

    Macpherson, D S; Snow, R; Lofgren, R P

    1990-12-15

    To determine the frequency of tests done in the year before elective surgery that might substitute for preoperative screening tests and to determine the frequency of test results that change from a normal value to a value likely to alter perioperative management. Retrospective cohort analysis of computerized laboratory data (complete blood count, sodium, potassium, and creatinine levels, prothrombin time, and partial thromboplastin time). Urban tertiary care Veterans Affairs Hospital. Consecutive sample of 1109 patients who had elective surgery in 1988. At admission, 7549 preoperative tests were done, 47% of which duplicated tests performed in the previous year. Of 3096 previous results that were normal as defined by hospital reference range and done closest to the time of but before admission (median interval, 2 months), 13 (0.4%; 95% CI, 0.2% to 0.7%) had repeat values outside a range considered acceptable for surgery. Most of the abnormalities were predictable from the patient's history, and most were not noted in the medical record. Of 461 previous tests that were abnormal, 78 (17%; CI, 13% to 20%) had repeat values at admission outside a range considered acceptable for surgery (P less than 0.001, comparing the frequency of clinically important abnormalities in patients with normal previous results with that in patients with abnormal previous results). Physicians evaluating patients preoperatively could safely substitute the previous test results analyzed in this study for preoperative screening tests if the previous tests are normal and no obvious indication for retesting is present.
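The quoted proportion confidence intervals (e.g. 13 of 3096, 0.4%, CI 0.2% to 0.7%) can be reproduced with a standard Wilson score interval for a binomial proportion. This is a generic sketch of the arithmetic, not necessarily the paper's exact statistical method:

```python
import math

def wilson_ci(k, n, z=1.96):
    """Wilson score interval for a binomial proportion k/n
    (z = 1.96 gives an approximate 95% interval)."""
    p = k / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

lo, hi = wilson_ci(13, 3096)
print(f"{13/3096:.1%} (CI {lo:.1%} to {hi:.1%})")  # → 0.4% (CI 0.2% to 0.7%)
```

The same function applied to the abnormal-previous-test group, `wilson_ci(78, 461)`, brackets the reported 17% point estimate.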

  13. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  14. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  15. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period into the preparation and monitoring of the February tests of Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  16. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns lead by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  17. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  20. Fuel behaviour and fission product release under realistic hydrogen conditions comparisons between HEVA 06 test results and Vulcain computations

    International Nuclear Information System (INIS)

    Dumas, J.M.; Lhiaubet, G.

    1989-07-01

    The HEVA 06 test was designed to simulate the conditions existing at the time when fission products are released from irradiated fuel under hydrogen conditions occurring in a PWR core at low pressure. The test conditions were defined from results provided by the core degradation module of the ESCADRE system (1): VULCAIN. This computer code has been recently used to analyse the early core degradation of a 900 MWe PWR in the AF accident sequence (as defined in WASH - 1400, USNRC - 1975). In this scenario, the core would begin to uncover about one day after scram with the system pressure at about 0.4 MPa. The fission product release starts 70 minutes after core dewatering. The F.P. are transferred to the core outlet in an increasingly hydrogen-rich steam atmosphere. The carrier gas is nearly pure hydrogen in the time period 100 - 130 minutes after core uncovering. A large release of F.P. is predicted in the upper part of the core when the steam starvation occurs. At that time, two thirds of the cladding have been oxidised on average. Before each HEVA test a fuel sample with a burn-up of 36 GWd/tU is reirradiated in order to observe the release of short-lived fission products. A pre-oxidation was first conducted in the HEVA 06 test at a temperature of 1300 °C and controlled to reach a 2/3 cladding oxidation state. Then the steam was progressively replaced by hydrogen and a heat-up rate of 1.5 °C/s was induced to reach a temperature of 2100 °C. The fuel was maintained at this temperature for half an hour in hydrogen. The volatile F.P. release kinetics were observed by on-line gamma spectrometry. Pre-test calculations of F.P. release kinetics performed with the EMIS module based on the CORSOR models (3) are compared with the test results. Measured releases of cesium and iodine are markedly lower than those predicted. Axial and radial F.P. distributions in the fuel pellets are available from gamma tomography measurements performed after the test.
Tellurium seems
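The CORSOR family of models mentioned above expresses the fractional release rate as k = A·exp(B·T) with T in °C. The sketch below integrates that rate at constant temperature; the coefficients A and B are illustrative placeholders only, not the CORSOR coefficients for any specific fission product species.

```python
import math

def corsor_release_fraction(A, B, temp_c, minutes):
    """CORSOR-type fission product release at constant temperature.

    Release rate coefficient k = A * exp(B * T) (fraction per minute,
    T in degrees C), integrated over a hold time t:
        F = 1 - exp(-k * t)
    A and B here are placeholders for illustration, NOT fitted CORSOR
    coefficients for any species.
    """
    k = A * math.exp(B * temp_c)
    return 1.0 - math.exp(-k * minutes)

# Hypothetical coefficients, 30-minute hold at 2100 C as in HEVA 06:
print(corsor_release_fraction(A=1e-7, B=0.005, temp_c=2100, minutes=30))
```

The exponential temperature dependence is why most of the predicted release occurs during the high-temperature hold rather than the pre-oxidation phase at 1300 °C.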

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, are provided. GlideInWMS and its components are now also deployed at CERN, in addition to the GlideInWMS factory located in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  2. Conception of spent fuel and radioactive wastes management in Poland based on the results of previous work performed within the framework of the Governmental Strategic Programme realised under the patronage of the National Atomic Energy Agency

    International Nuclear Information System (INIS)

    Wlodarski, J.; Chwaszczewski, S.; Slizowski, K.; Frankowski, Z.

    1999-01-01

    About 300 cubic meters of solid and solidified radioactive wastes of low and medium activity are produced each year in Poland. Such materials, after processing, are stored in the Institute of Atomic Energy at Swierk or in the National Repository for Radioactive Wastes in Rozan. About 6000 spent fuel elements are temporarily stored in water pools at Swierk. The assumptions and strategy of future spent fuel and radioactive waste management are presented, taking into account the operation of the first nuclear power plants (after 2010). The Governmental Strategic Programme (GSP) for the years 1997-1999 concerning this topic is then described, and some results of the work performed within the framework of the GSP are given

  3. Intensive versus Guideline Blood Pressure and Lipid Lowering in Patients with Previous Stroke: Main Results from the Pilot ‘Prevention of Decline in Cognition after Stroke Trial’ (PODCAST) Randomised Controlled Trial

    Science.gov (United States)

    Scutt, Polly; Blackburn, Daniel J.; Ankolekar, Sandeep; Krishnan, Kailash; Ballard, Clive; Burns, Alistair; Mant, Jonathan; Passmore, Peter; Pocock, Stuart; Reckless, John; Sprigg, Nikola; Stewart, Rob; Wardlaw, Joanna M.; Ford, Gary A.

    2017-01-01

    Background Stroke is associated with the development of cognitive impairment and dementia. We assessed the effect of intensive blood pressure (BP) and/or lipid lowering on cognitive outcomes in patients with recent stroke in a pilot trial. Methods In a multicentre, partial-factorial trial, patients with recent stroke, absence of dementia, and systolic BP (SBP) 125–170 mmHg were assigned randomly to at least 6 months of intensive (target SBP Addenbrooke’s Cognitive Examination-Revised (ACE-R). Results We enrolled 83 patients, mean age 74.0 (6.8) years, and median 4.5 months after stroke. The median follow-up was 24 months (range 1–48). Mean BP was significantly reduced with intensive compared to guideline treatment (difference –10·6/–5·5 mmHg; pcognition, intensive BP and lipid lowering were feasible and safe, but did not alter cognition over two years. The association between intensive lipid lowering and improved scores for some secondary outcomes suggests further trials are warranted. Trial Registration ISRCTN ISRCTN85562386 PMID:28095412

  4. Results from the First Two Flights of the Static Computer Memory Integrity Testing Experiment

    Science.gov (United States)

    Hancock, Thomas M., III

    1999-01-01

    This paper details the scientific objectives, experiment design, data collection method, and post flight analysis following the first two flights of the Static Computer Memory Integrity Testing (SCMIT) experiment. SCMIT is designed to detect soft-event upsets in passive magnetic memory. A soft-event upset is a change in the logic state of active or passive forms of magnetic memory, commonly referred to as a "bit flip". In its mildest form, a soft-event upset can cause software exceptions, unexpected events, spacecraft safing (ending data collection), or corrupted fault protection and error recovery capabilities. In its most severe form, loss of the mission or spacecraft can occur. Analysis after the first flight (in 1991 during STS-40) identified possible soft-event upsets to 25% of the experiment detectors. Post flight analysis after the second flight (in 1997 on STS-87) failed to find any evidence of soft-event upsets. The SCMIT experiment is currently scheduled for a third flight in December 1999 on STS-101.
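The kind of bit-flip detection SCMIT performs can be illustrated by XOR-comparing a post-flight memory readback against the pre-flight reference pattern. This is a generic sketch of the technique, not the actual SCMIT analysis code:

```python
def find_bitflips(reference: bytes, readback: bytes):
    """Return (byte_index, bit_index) for every bit that differs
    between the pre-flight reference pattern and the post-flight
    readback of the memory under test."""
    assert len(reference) == len(readback)
    flips = []
    for i, (a, b) in enumerate(zip(reference, readback)):
        diff = a ^ b  # set bits mark positions where the two bytes differ
        for bit in range(8):
            if diff & (1 << bit):
                flips.append((i, bit))
    return flips

# A single simulated upset in byte 2, bit 5:
ref = bytes([0x55] * 4)
read = bytearray(ref)
read[2] ^= 1 << 5
print(find_bitflips(ref, bytes(read)))  # → [(2, 5)]
```

An empty result on the second flight's data would correspond to the "no evidence of soft-event upsets" finding quoted above.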

  5. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a four-fold increase in throughput with respect to the LCG Resource Broker was observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  6. [Computer optical topography: a study of the repeatability of the results of human body model examination].

    Science.gov (United States)

    Sarnadskiĭ, V N

    2007-01-01

    The problem of repeatability of the results of examination of a plastic human body model is considered. The model was examined in 7 positions using an optical topograph for kyphosis diagnosis. The examination was performed under television camera monitoring. It was shown that variation of the model position in the camera view affected the repeatability of the results of topographic examination, especially if the model-to-camera distance was changed. A study of the repeatability of the results of optical topographic examination can help to increase the reliability of the topographic method, which is widely used for medical screening of children and adolescents.

  7. results

    Directory of Open Access Journals (Sweden)

    Salabura Piotr

    2017-01-01

    Full Text Available HADES experiment at GSI is the only high precision experiment probing nuclear matter in the beam energy range of a few AGeV. Pion, proton and ion beams are used to study rare dielectron and strangeness probes to diagnose properties of strongly interacting matter in this energy regime. Selected results from p + A and A + A collisions are presented and discussed.

  8. UPVAPOR: computer application for the analysis of the results of Cofrentes Nuclear Power Plant production

    International Nuclear Information System (INIS)

    Palomo, M. J.; Baraza Peregrin, A.; Bucho Piqueras, L.; Vaquer Perez, J. I.; Lopez Lopez, B.

    2010-01-01

    UPVapor is a software application developed for the Results Group of the Cofrentes Nuclear Power Plant. The application presents a graphical analysis environment in which the user has many registered variables available for configuring graphics. It saves considerable working time because it allows users to carry out their own analyses without having to call on specialist analysts.

  9. Conversion of IVA Human Computer Model to EVA Use and Evaluation and Comparison of the Result to Existing EVA Models

    Science.gov (United States)

    Hamilton, George S.; Williams, Jermaine C.

    1998-01-01

    This paper describes the methods, rationale, and comparative results of the conversion of an intravehicular (IVA) 3D human computer model (HCM) to extravehicular (EVA) use and compares the converted model to an existing model on another computer platform. The task of accurately modeling a spacesuited human figure in software is daunting: the suit restricts the human's joint range of motion (ROM) and does not have joints collocated with human joints. The modeling of the variety of materials needed to construct a space suit (e.g., metal bearings, rigid fiberglass torso, flexible cloth limbs and rubber coated gloves) attached to a human figure is currently out of reach of desktop computer hardware and software. Therefore a simplified approach was taken. The HCM's body parts were enlarged and the joint ROM was restricted to match the existing spacesuit model. This basic approach could be used to model other restrictive environments in industry such as chemical or fire protective clothing. In summary, the approach provides a moderate-fidelity, usable tool which will run on current notebook computers.

  10. Measurement of single kidney contrast media clearance by multiphasic spiral computed tomography: preliminary results

    International Nuclear Information System (INIS)

    Hackstein, N.; Puille, M.F.; Bak, Benjamin H.; Scharwat, Oliver; Rau, W.S.

    2001-01-01

    Objective. We present preliminary results of a new method (hereinafter called 'CT-clearance') to measure single kidney contrast media clearance by performing multiphasic helical CT of the kidneys. CT-clearance was calculated according to an extension of the Patlak plot. In contrast to prior investigators, who repeatedly measured a single slice, this method makes it possible to calculate single kidney clearance from at least three spiral CTs, utilizing the whole kidney volume. Methods. Spiral CT of the kidneys was performed unenhanced and about 30 and 100 s after administration of about 120 ml iopromide. The summed density of the whole kidneys and the aortic density were calculated from these data, and renal contrast media clearance was then derived by CT-clearance in 29 patients. As a reference, serum clearance was calculated in 24 patients using a modified one-exponential slope model. Information on the relative kidney function was gained by renal scintigraphy with Tc99m-MAG-3 or Tc99m-DMSA in 29 patients. Results. Linear regression analysis revealed a correlation coefficient of CT-clearance with serum clearance of r=0.78 with Cl (CT) [ml/min]=22.2+1.03 * Cl (serum), n=24. Linear regression of the relative kidney function (rkf) of the right kidney calculated by CT-clearance compared to scintigraphy results provided a correlation coefficient r=0.89 with rkf(CT)[%]=18.6+0.58 * rkf(scintigraphy), n=29. Conclusion. The obtained results of contrast media clearance measured by CT-clearance are in the physiological range of the parameter. Future studies should be performed to improve the methodology with the aim of higher accuracy. More specifically, better determination of the aortic density curve might improve the accuracy
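A Patlak plot estimates clearance as the slope of kidney activity normalized by arterial concentration against the time-integral of arterial concentration normalized the same way. The sketch below fits that slope by ordinary least squares from three or more CT phases; the function name, trapezoidal input integration, and the assumption of zero pre-contrast aortic density are mine, and this is a generic illustration rather than the paper's extended Patlak method.

```python
def patlak_clearance(times, aorta, kidney):
    """Estimate clearance as the Patlak-plot slope.

    times  : acquisition times (min) of >= 3 CT phases
    aorta  : aortic contrast density Ca(t) at those times
    kidney : whole-kidney summed contrast density K(t)

    Plots y = K(t)/Ca(t) against x = (integral of Ca)/Ca(t) and
    returns the least-squares slope. Assumes Ca = 0 at t = 0
    (pre-contrast scan) and uses trapezoidal integration.
    """
    xs, ys, integral = [], [], 0.0
    prev_t, prev_c = 0.0, 0.0
    for t, ca, k in zip(times, aorta, kidney):
        integral += 0.5 * (ca + prev_c) * (t - prev_t)
        xs.append(integral / ca)
        ys.append(k / ca)
        prev_t, prev_c = t, ca
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
```

With a constant arterial input and a kidney curve built as K(t) = Cl·∫Ca dτ + V·Ca, the function recovers the clearance Cl exactly, which is a useful sanity check for the fitting step.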

  11. Exploring organizational crises from a legitimation perspective: Results from a computer simulation and illustrative cases

    OpenAIRE

    Breitsohl, Heiko

    2008-01-01

    Organizational crises are rare, yet they fundamentally influence the evolution of organizations. An aspect of crises deserving more attention is the interaction of organizations and their stakeholders during a crisis from a legitimation perspective. This paper presents a simulation model mapping causal relationships behind this interaction. Results suggest that the nature and timing of organizational response to crises has considerable effect on the success and duration of attempts of regaini...

  12. Comparing the Floating Point Systems, Inc. AP-190L to representative scientific computers: some benchmark results

    International Nuclear Information System (INIS)

    Brengle, T.A.; Maron, N.

    1980-01-01

    Results are presented of comparative timing tests made by running a typical FORTRAN physics simulation code on the following machines: DEC PDP-10 with KI processor; DEC PDP-10, KI processor, and FPS AP-190L; CDC 7600; and CRAY-1. Factors such as DMA overhead, code size for the AP-190L, and the relative utilization of floating point functional units for the different machines are discussed. 1 table

  13. Sintering of hardmetals in different conditions: experimental results of 2-D dilatometry and computer simulations

    International Nuclear Information System (INIS)

    Gasik, M.; Zhang, B.; Kaskiala, M.; Yilkeraelae, J.

    2001-01-01

    Properties of WC-Co functionally gradated materials (FGM) manufactured by powder metallurgy from nanograin powders are studied. A new optical system (a 2-D dilatometer) has been developed, using a high-resolution CCD camera and dedicated image-processing software. Sintering of WC-Co hard metals with different cobalt and grain growth inhibitor contents was performed for various conditions (substrate, heating rate, temperature) and the resulting anisotropy was measured. (author)

  14. A flashing driven moderator cooling system for CANDU reactors: Experimental and computational results

    International Nuclear Information System (INIS)

    Khartabil, H.F.

    2000-01-01

    A flashing-driven passive moderator cooling system is being developed at AECL for CANDU reactors. Preliminary simulations and experiments showed that the concept was feasible at normal operating power. However, flow instabilities were observed at low powers under conditions of variable and constant calandria inlet temperatures. This finding contradicted code predictions that suggested the loop should be stable at all powers if the calandria inlet temperature was constant. This paper discusses a series of separate-effects tests that were used to identify the sources of low-power instabilities in the experiments, and it explores methods to avoid them. It concludes that low-power instabilities can be avoided, thereby eliminating the discrepancy between the experimental and code results. Two factors were found to be important for loop stability: (1) oscillations in the calandria outlet temperature, and (2) flashing superheat requirements, and the presence of nucleation sites. By addressing these factors, we could make the loop operate in a stable manner over the whole power range and we could obtain good agreement between the experimental and code results. (author)

  15. Review of current results in computational studies of hydrocarbon phase and transport properties in nanoporous structures

    Science.gov (United States)

    Stroev, N.; Myasnikov, A.

    2017-12-01

This article provides a general overview of the main simulation results on the behavior of gases and liquids under confinement, namely hydrocarbons in shale formations, and of the current understanding of such phenomena. In addition to the key effects, obtained by different research groups, that have to be taken into account when building reservoir simulation software, a list of methods is briefly covered. A comprehensive understanding of both fluid phase equilibrium and transport properties in nanoscale structures is of great importance for many scientific and technical disciplines, especially for petroleum engineering, which must consider hydrocarbon behavior in complex shale formations whose development increases with time. Recent estimates show that a significant amount of resources is trapped inside organic matter and clays, which have extremely low permeability yet great economic potential. The issue is not only of practical importance, since existing conventional approaches are by definition unable to capture the complicated physics involved, but also of fundamental value. Research into the processes connected with such deposits is necessary both for the evaluation of petroleum reservoir deposits and for hydrodynamic simulators. That is why the review is divided into two major parts: equilibrium states of hydrocarbons, and their transport properties under highly confined conditions.

  16. Building Capacity Through Hands-on Computational Internships to Assure Reproducible Results and Implementation of Digital Documentation in the ICERT REU Program

    Science.gov (United States)

    Gomez, R.; Gentle, J.

    2015-12-01

Modern data pipelines and computational processes require that meticulous methodologies be applied to ensure that source data, algorithms, and results are properly curated, managed and retained while remaining discoverable, accessible, and reproducible. Given the complexity of understanding the scientific problem domain being researched, combined with the overhead of learning to use advanced computing technologies, it becomes paramount that the next generation of scientists and researchers learn to embrace best practices. The Integrative Computational Education and Research Traineeship (ICERT) is a National Science Foundation (NSF) Research Experience for Undergraduates (REU) Site at the Texas Advanced Computing Center (TACC). During Summer 2015, two ICERT interns joined the 3DDY project. 3DDY converts geospatial datasets into file types that can take advantage of new formats, such as natural user interfaces, interactive visualization, and 3D printing. Mentored by TACC researchers for ten weeks, students with no previous background in computational science learned to use scripts to build the first prototype of the 3DDY application, and leveraged Wrangler, the newest high-performance computing (HPC) resource at TACC. Test datasets for quadrangles in central Texas were used to assemble the 3DDY workflow and code. Test files were successfully converted into the stereolithography (STL) format, which is amenable for use with 3D printers. Test files and scripts were documented and shared using the Figshare site, while metadata for the 3DDY application was documented using OntoSoft. These efforts validated a straightforward set of workflows to transform geospatial data and established the first prototype version of 3DDY. Adding the data and software management procedures helped students realize a broader set of tangible results (e.g. Figshare entries) and better document their progress and the final state of their work for the research group and community.
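The core 3DDY transformation, a gridded elevation dataset turned into an STL mesh, can be sketched as follows. The actual 3DDY scripts are not reproduced in this record, so the function name, the per-cell triangulation, and the ASCII STL output below are illustrative assumptions only.

```python
def heightmap_to_stl(grid, cell=1.0):
    """Convert a 2-D elevation grid into an ASCII STL string.

    Illustrative sketch only: each grid cell is split into two triangles,
    and facet normals are left as (0, 0, 1) placeholders, which most
    slicers recompute anyway.
    """
    rows, cols = len(grid), len(grid[0])
    lines = ["solid terrain"]
    for i in range(rows - 1):
        for j in range(cols - 1):
            # Corner vertices of the current cell, scaled to model units.
            v00 = (j * cell, i * cell, grid[i][j])
            v10 = ((j + 1) * cell, i * cell, grid[i][j + 1])
            v01 = (j * cell, (i + 1) * cell, grid[i + 1][j])
            v11 = ((j + 1) * cell, (i + 1) * cell, grid[i + 1][j + 1])
            for tri in ((v00, v10, v11), (v00, v11, v01)):
                lines.append("  facet normal 0 0 1")
                lines.append("    outer loop")
                for x, y, z in tri:
                    lines.append(f"      vertex {x:.3f} {y:.3f} {z:.3f}")
                lines.append("    endloop")
                lines.append("  endfacet")
    lines.append("endsolid terrain")
    return "\n".join(lines)
```

A real pipeline would additionally close the mesh with side and bottom walls so the solid is printable; that bookkeeping is omitted here.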

  17. Experimental and computer simulation results of the spot welding process using SORPAS software

    International Nuclear Information System (INIS)

    Al-Jader, M A; Cullen, J D; Athi, N; Al-Shamma'a, A I

    2009-01-01

The highly competitive nature of the automotive industry drives demand for improvements and increased precision engineering in resistance spot welding. Currently there are about 4300 weld points on the average steel vehicle. Current industrial monitoring systems check the quality of the nugget after processing 15 cars, once every two weeks. The nuggets are examined off-line using a destructive process, which takes approximately 10 days to complete, causing a long delay in the production process. This paper presents a simulation of spot-welding growth curves, along with a comparison to growth curves obtained on an industrial spot-welding machine. The correlation with experimental results shows that SORPAS simulations can be used as an off-line measurement to reduce factory energy usage.

  18. Thermodynamic properties of 9-fluorenone: Mutual validation of experimental and computational results

    International Nuclear Information System (INIS)

    Chirico, Robert D.; Kazakov, Andrei F.; Steele, William V.

    2012-01-01

Highlights: ► Heat capacities were measured for the temperature range 5 K to 520 K. ► Vapor pressures were measured for the temperature range 368 K to 668 K. ► The enthalpy of combustion was measured and the enthalpy of formation was derived. ► Calculated and derived properties for the ideal gas are in excellent accord. ► Thermodynamic consistency analysis revealed anomalous literature data. - Abstract: Measurements leading to the calculation of thermodynamic properties for 9-fluorenone (IUPAC name 9H-fluoren-9-one and Chemical Abstracts registry number [486-25-9]) in the ideal-gas state are reported. Experimental methods were adiabatic heat-capacity calorimetry, inclined-piston manometry, comparative ebulliometry, and combustion calorimetry. Critical properties were estimated. Molar entropies for the ideal-gas state were derived from the experimental studies at selected temperatures T between T = 298.15 K and T = 600 K, and independent statistical calculations were performed based on molecular geometry optimization and vibrational frequencies calculated at the B3LYP/6-31+G(d,p) level of theory. Values derived with the independent methods are shown to be in excellent accord with a scaling factor of 0.975 applied to the calculated frequencies. This same scaling factor was successfully applied in the analysis of results for other polycyclic molecules, as described in recent articles by this research group. All experimental results are compared with property values reported in the literature. Thermodynamic consistency between properties is used to show that several studies in the literature are erroneous.
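The effect of the 0.975 frequency scaling factor can be illustrated with a harmonic-oscillator vibrational entropy of the kind used in such statistical calculations. The constants below are standard, but the model is a generic rigid-rotor/harmonic-oscillator sketch, not the paper's exact computation.

```python
import math

# Physical constants (SI).
H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e10    # speed of light, cm/s
KB = 1.380649e-23    # Boltzmann constant, J/K
R = 8.314462618      # gas constant, J/(mol K)

def vibrational_entropy(freqs_cm1, T=298.15, scale=0.975):
    """Harmonic-oscillator vibrational entropy from scaled wavenumbers.

    Each mode contributes R * [x / (e^x - 1) - ln(1 - e^-x)],
    where x = h c (scale * nu) / (kB T).
    """
    s = 0.0
    for nu in freqs_cm1:
        x = H * C * (scale * nu) / (KB * T)
        s += R * (x / math.expm1(x) - math.log(-math.expm1(-x)))
    return s  # J/(mol K)
```

Because scaling lowers the frequencies, the scaled entropy is slightly larger than the unscaled one, which is the direction in which the scaling nudges calculated values toward calorimetric data.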

  19. Aging adult skull remains through radiological density estimates: A comparison of different computed tomography systems and the use of computer simulations to judge the accuracy of results.

    Science.gov (United States)

    Obert, Martin; Kubelt, Carolin; Schaaf, Thomas; Dassinger, Benjamin; Grams, Astrid; Gizewski, Elke R; Krombach, Gabriele A; Verhoff, Marcel A

    2013-05-10

The objective of this article was to explore age-at-death estimates in forensic medicine that are methodically based on age-dependent, radiologically defined bone-density (HC) decay, investigated here with a standard clinical computed tomography (CT) system. Such density decay was previously discovered with a high-resolution flat-panel CT in the skulls of adult females. The development of a standard-CT methodology for age estimation, with thousands of installations, would have the advantage of being applicable everywhere, whereas only a few flat-panel prototype CT systems are in use worldwide. A multi-slice CT scanner (MSCT) was used to obtain 22,773 images from 173 European human skulls (89 male, 84 female), taken from a population of patients of the Department of Neuroradiology at the University Hospital Giessen and Marburg during 2010 and 2011. An automated image analysis was carried out to evaluate HC in all images. The age dependence of HC was studied by correlation analysis, and the prediction accuracy of age-at-death estimates was calculated. Computer simulations were carried out to explore the influence of noise on the accuracy of age predictions. Human skull HC values scatter strongly as a function of age for both sexes. Adult male skull bone-density remains constant during lifetime. Adult female HC decays during lifetime, as indicated by a correlation coefficient (CC) of -0.53. Prediction errors of age-at-death estimates for both scanners used are in the range of ±18 years at a 75% confidence interval (CI). Computer simulations indicate that this is the best that can be expected for such noisy data. Our results indicate that HC decay is indeed present in adult females and that it can be demonstrated both by standard and by high-resolution CT methods applied to different subject groups of an identical population. The weak correlation between HC and age found by both CT methods, however, only enables a method to estimate age-at-death with limited accuracy.
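The kind of computer simulation described, noisy linear HC decay with age followed by age prediction via regression, might be sketched as below. All numerical parameters here are invented for illustration; they are not the study's values.

```python
import random
import statistics

def simulate_age_prediction(n=173, slope=-0.5, noise_sd=10.0, seed=1):
    """Monte Carlo sketch of how scatter limits age-at-death accuracy.

    HC is modeled as a linear decay with age plus Gaussian noise, with
    ages drawn uniformly from 20-90 years. Returns the HC-age correlation
    coefficient and the 75th percentile of absolute prediction errors.
    """
    rng = random.Random(seed)
    ages = [rng.uniform(20.0, 90.0) for _ in range(n)]
    hc = [100.0 + slope * a + rng.gauss(0.0, noise_sd) for a in ages]
    mh, ma = statistics.fmean(hc), statistics.fmean(ages)
    sxy = sum((h - mh) * (a - ma) for h, a in zip(hc, ages))
    sxx = sum((h - mh) ** 2 for h in hc)
    syy = sum((a - ma) ** 2 for a in ages)
    cc = sxy / (sxx ** 0.5 * syy ** 0.5)
    b = sxy / sxx  # regression of age on HC, the prediction direction
    errors = sorted(abs((ma + b * (h - mh)) - a) for h, a in zip(hc, ages))
    half_width_75 = errors[int(0.75 * n)]  # 75% of errors fall below this
    return cc, half_width_75
```

Raising `noise_sd` relative to the signal drives the correlation toward zero and widens the error band, which is the mechanism the paper's simulations use to argue that ±18 years is near the attainable limit for such noisy data.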

  20. Applying standardized uptake values in gallium-67-citrate single-photon emission computed tomography/computed tomography studies and their correlation with blood test results in representative organs.

    Science.gov (United States)

    Toriihara, Akira; Daisaki, Hiromitsu; Yamaguchi, Akihiro; Yoshida, Katsuya; Isogai, Jun; Tateishi, Ukihide

    2018-05-21

Recently, semiquantitative analysis using the standardized uptake value (SUV) has been introduced in bone single-photon emission computed tomography/computed tomography (SPECT/CT). Our purposes were to apply an SUV-based semiquantitative analytic method to gallium-67 citrate SPECT/CT and to evaluate the correlation between the SUV of physiological uptake and blood test results in representative organs. The accuracy of the semiquantitative method was validated using a National Electrical Manufacturers Association body phantom study (sphere-to-background radioactivity ratio = 4:1). Thereafter, 59 patients (34 male and 25 female; mean age, 66.9 years) who had undergone gallium-67 citrate SPECT/CT were retrospectively enrolled in the study. The mean SUV of physiological uptake was calculated for the following organs: the lungs, right atrium, liver, kidneys, spleen, gluteal muscles, and bone marrow. The correlation between physiological uptake and blood test results was evaluated using Pearson's correlation coefficient. The phantom study revealed only a 1% error between theoretical and actual SUVs in the background, suggesting sufficient accuracy of the scatter and attenuation corrections. However, a partial volume effect could not be overlooked, particularly in small spheres with diameters of less than 28 mm. The highest mean SUV was observed in the liver (range: 0.44-4.64), followed by bone marrow (range: 0.33-3.60), spleen (range: 0.52-2.12), and kidneys (range: 0.42-1.45). There was no significant correlation between hepatic uptake and liver function, renal uptake and renal function, or bone marrow uptake and blood cell count (P>0.05). The physiological uptake in gallium-67 citrate SPECT/CT can be represented as SUVs, which are not significantly correlated with the corresponding blood test results.
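The SUV itself is conventionally defined as the tissue activity concentration normalized by injected activity per unit body weight. A minimal sketch under that standard body-weight definition (the abstract does not spell out the paper's exact formula) is:

```python
def suv_mean(mean_conc_bq_ml, injected_bq, weight_kg):
    """Body-weight-normalized SUV.

    SUV = tissue concentration / (injected activity / body weight),
    with 1 kg of tissue taken as 1000 mL. Decay correction of the
    injected activity to scan time is assumed to have been applied
    by the caller.
    """
    return mean_conc_bq_ml / (injected_bq / (weight_kg * 1000.0))
```

A uniform distribution of the tracer over the whole body would give SUV = 1 everywhere, which is why organ values above and below 1 are read as relative concentration and washout.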

  1. Initial results of the quality control in 11 computed tomography scanners at Curitiba

    International Nuclear Information System (INIS)

    Kodlulovich, S; Oliveira, L.; Jakubiak, R.R.; Miquelin, C.A.

    2008-01-01

The aim of this study was to evaluate the image quality of 11 scanners installed in public and private centers of Curitiba, Brazil. This sample represents 30% of the CT scanners in the city so far. The ACR CT accreditation phantom was used to verify the compliance of scanner performance with international quality requirements. The results indicate that efforts should be concentrated on the maintenance of the equipment and on specific training of the technicians. Most of the scanners showed some non-conformity. In 27.5% of the sample the positioning requirement was not met. The CT number accuracy evaluation showed that in 72.3% of the scanners the CT numbers were out of the tolerance range, reaching values 35% greater than the limit. The low-contrast resolution criteria were not met in 9% of the scanners. The main concern is that there is no specific program to evaluate the image quality of the CT scanners, nor to estimate the CT doses in the procedures. (author)
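A CT number accuracy check of the kind described reduces to comparing the measured mean HU of each phantom material against a tolerance range. The ranges below are placeholders loosely modeled on ACR CT phantom guidance, not the limits used in this study.

```python
# Placeholder tolerance ranges in Hounsfield units (HU); the exact
# limits applied in the survey are not given in the abstract.
TOLERANCES = {
    "water": (-7.0, 7.0),
    "air": (-1005.0, -970.0),
    "polyethylene": (-107.0, -84.0),
    "bone": (850.0, 970.0),
    "acrylic": (110.0, 135.0),
}

def check_ct_numbers(measured):
    """Return materials whose measured mean HU falls outside tolerance."""
    failures = {}
    for material, hu in measured.items():
        lo, hi = TOLERANCES[material]
        if not lo <= hu <= hi:
            failures[material] = hu
    return failures
```

Running this per scanner and counting non-empty failure dictionaries would reproduce summary statistics like the 72.3% out-of-tolerance figure reported above.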

  2. A computer model to forecast wetland vegetation changes resulting from restoration and protection in coastal Louisiana

    Science.gov (United States)

    Visser, Jenneke M.; Duke-Sylvester, Scott M.; Carter, Jacoby; Broussard, Whitney P.

    2013-01-01

    The coastal wetlands of Louisiana are a unique ecosystem that supports a diversity of wildlife as well as a diverse community of commercial interests of both local and national importance. The state of Louisiana has established a 5-year cycle of scientific investigation to provide up-to-date information to guide future legislation and regulation aimed at preserving this critical ecosystem. Here we report on a model that projects changes in plant community distribution and composition in response to environmental conditions. This model is linked to a suite of other models and requires input from those that simulate the hydrology and morphology of coastal Louisiana. Collectively, these models are used to assess how alternative management plans may affect the wetland ecosystem through explicit spatial modeling of the physical and biological processes affected by proposed modifications to the ecosystem. We have also taken the opportunity to advance the state-of-the-art in wetland plant community modeling by using a model that is more species-based in its description of plant communities instead of one based on aggregated community types such as brackish marsh and saline marsh. The resulting model provides an increased level of ecological detail about how wetland communities are expected to respond. In addition, the output from this model provides critical inputs for estimating the effects of management on higher trophic level species though a more complete description of the shifts in habitat.

  3. A complex-plane strategy for computing rotating polytropic models - Numerical results for strong and rapid differential rotation

    International Nuclear Information System (INIS)

    Geroyannis, V.S.

    1990-01-01

In this paper, a numerical method called the complex-plane strategy is implemented in the computation of polytropic models distorted by strong and rapid differential rotation. The differential rotation model results from a direct generalization of the classical model within the framework of the complex-plane strategy; this generalization yields very strong differential rotation. Accordingly, the polytropic models assume extremely distorted interiors, while their boundaries are only slightly distorted. For an accurate simulation of differential rotation, a versatile method called the multiple partition technique is developed and implemented. It is shown that the method remains reliable up to rotation states where other elaborate techniques fail to give accurate results. 11 refs

  4. Charged-particle thermonuclear reaction rates: IV. Comparison to previous work

    International Nuclear Information System (INIS)

    Iliadis, C.; Longland, R.; Champagne, A.E.; Coc, A.

    2010-01-01

    We compare our Monte Carlo reaction rates (see Paper II of this issue) to previous results that were obtained by using the classical method of computing thermonuclear reaction rates. For each reaction, the comparison is presented using two types of graphs: the first shows the change in reaction rate uncertainties, while the second displays our new results normalized to the previously recommended reaction rate. We find that the rates have changed significantly for almost all reactions considered here. The changes are caused by (i) our new Monte Carlo method of computing reaction rates (see Paper I of this issue), and (ii) newly available nuclear physics information (see Paper III of this issue).
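The Monte Carlo idea being compared against classical rates can be sketched for a single narrow resonance: sample the resonance strength from a lognormal distribution and report rate percentiles instead of a single recommended value. The unit prefactor and all parameter values below are schematic assumptions, not the papers' actual inputs.

```python
import math
import random

def monte_carlo_rate(omega_gamma, factor_uncertainty, e_res_kev, t9,
                     n_samples=10000, seed=7):
    """Schematic Monte Carlo rate for one narrow resonance.

    The strength omega_gamma is sampled lognormally with the given
    factor uncertainty; each sample is weighted by the Boltzmann
    factor exp(-E_r / kT). The physical prefactor is set to 1, so
    only relative uncertainties are meaningful.
    """
    rng = random.Random(seed)
    kt_kev = 86.17 * t9  # kT in keV at temperature T9 (GK)
    mu = math.log(omega_gamma)
    sigma = math.log(factor_uncertainty)
    samples = sorted(
        math.exp(rng.gauss(mu, sigma)) * math.exp(-e_res_kev / kt_kev)
        for _ in range(n_samples)
    )
    low, med, high = (samples[int(q * n_samples)] for q in (0.16, 0.50, 0.84))
    return low, med, high
```

The 16th/50th/84th percentiles play the role of the "low", "recommended", and "high" rates; plotting their ratios against the classical rate is exactly the kind of comparison graph the paper describes.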

  5. Implementation of the Principal Component Analysis onto High-Performance Computer Facilities for Hyperspectral Dimensionality Reduction: Results and Comparisons

    Directory of Open Access Journals (Sweden)

    Ernestina Martel

    2018-06-01

Full Text Available Dimensionality reduction represents a critical preprocessing step for increasing the efficiency and the performance of many hyperspectral imaging algorithms. However, dimensionality reduction algorithms, such as Principal Component Analysis (PCA), suffer from their computationally demanding nature, making their implementation onto high-performance computer architectures advisable for applications under strict latency constraints. This work presents the implementation of the PCA algorithm onto two different high-performance devices, namely an NVIDIA Graphics Processing Unit (GPU) and a Kalray manycore, uncovering a highly valuable set of tips and tricks for taking full advantage of the inherent parallelism of these high-performance computing platforms and, hence, reducing the time required to process a given hyperspectral image. Moreover, the results achieved with different hyperspectral images have been compared with those obtained with a recently published field-programmable gate array (FPGA)-based implementation of the PCA algorithm, providing, for the first time in the literature, a comprehensive analysis highlighting the pros and cons of each option.
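The PCA computation being ported to GPU and manycore targets is, at its core, a covariance eigendecomposition. A minimal CPU reference in NumPy (a sketch, not the paper's optimized implementation) looks like this:

```python
import numpy as np

def pca_reduce(X, n_components):
    """Reduce the spectral dimension of a (pixels x bands) matrix.

    Plain covariance-eigendecomposition PCA; the high-performance
    versions discussed above parallelize exactly these dense
    linear-algebra steps (centering, covariance, eigensolve, projection).
    """
    Xc = X - X.mean(axis=0)                  # center each band
    cov = (Xc.T @ Xc) / (X.shape[0] - 1)     # bands x bands covariance
    eigvals, eigvecs = np.linalg.eigh(cov)   # ascending eigenvalues
    order = np.argsort(eigvals)[::-1][:n_components]
    return Xc @ eigvecs[:, order]            # projected pixels
```

For hyperspectral cubes the covariance is small (bands x bands) while the projection is large (pixels x bands), which is why the matrix products, rather than the eigensolve, dominate and benefit most from parallel hardware.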

  6. Preliminary results of very fast computation of Moment Magnitude and focal mechanism in the context of tsunami warning

    Science.gov (United States)

    Schindelé, François; Roch, Julien; Rivera, Luis

    2015-04-01

Various methodologies have recently been developed to compute the moment magnitude and the focal mechanism, thanks to real-time access to numerous broad-band seismic data. Several methods were implemented at the CENALT, in particular the W-phase method developed by H. Kanamori and L. Rivera. For earthquakes of magnitude in the range 6.5-9.0, this method provides accurate results in less than 40 minutes. In the context of tsunami warning in the Mediterranean, a small basin impacted in less than one hour, with small sources but some with high tsunami potential (Boumerdes 2003), a comprehensive tsunami warning system should include very fast computation of the seismic parameters. The Mw value, the focal depth and the type of fault (reverse, normal, strike-slip) are the most relevant parameters for tsunami warning. Preliminary results will be presented using data from the North-East Atlantic and Mediterranean region for the recent period 2010-2014. This work is funded by the project ASTARTE (Assessment, Strategy And Risk Reduction for Tsunamis in Europe), FP7-ENV2013 6.4-3, Grant 603839.
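Once a W-phase inversion yields the scalar seismic moment, the conversion to Mw is the standard IASPEI relation; only that final step is sketched here, since the waveform inversion itself is beyond a short example.

```python
import math

def moment_magnitude(m0_newton_meters):
    """Moment magnitude from scalar seismic moment M0 (in N m).

    Mw = (2/3) * (log10(M0) - 9.1), the IASPEI standard form of the
    Hanks-Kanamori relation.
    """
    return (2.0 / 3.0) * (math.log10(m0_newton_meters) - 9.1)
```

For example, a moment of about 1.26e21 N m corresponds to Mw 8.0, which sits in the 6.5-9.0 range over which the W-phase method is quoted as accurate.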

  7. Quality of human-computer interaction - results of a national usability survey of hospital-IT in Germany

    Directory of Open Access Journals (Sweden)

    Bundschuh Bettina B

    2011-11-01

Full Text Available Abstract Background: Due to the increasing functionality of medical information systems, it is hard to imagine day-to-day work in hospitals without IT support. Therefore, the design of dialogues between humans and information systems is one of the most important issues to be addressed in health care. This survey presents an analysis of the current quality level of human-computer interaction of healthcare IT in German hospitals, focused on the users' point of view. Methods: To evaluate the usability of clinical IT according to the design principles of EN ISO 9241-10, the IsoMetrics Inventory, an assessment tool, was used. The focus of this paper is on suitability for the task, training effort and conformity with user expectations, differentiated by information system. Effectiveness was evaluated with a focus on interoperability and functionality of different IT systems. Results: 4521 persons from 371 hospitals visited the start page of the study, and 1003 persons from 158 hospitals completed the questionnaire. The results show relevant variations between different information systems. Conclusions: Specialised information systems with defined functionality received better assessments than clinical information systems in general. This could be attributed to the improved customisation of these specialised systems for specific working environments. The results can be used as reference data for the evaluation and benchmarking of human-computer engineering in the clinical health IT context in future studies.

  8. Nonlinear excitation of electron cyclotron waves by a monochromatic strong microwave: computer simulation analysis of the MINIX results

    Energy Technology Data Exchange (ETDEWEB)

    Matsumoto, H.; Kimura, T.

    1986-01-01

    Triggered by the experimental results of the MINIX, a computer simulation study was initiated on the nonlinear excitation of electrostatic electron cyclotron waves by a monochromatic electromagnetic wave such as the transmitted microwave in the MINIX. The model used assumes that both of the excited waves and exciting (pumping) electromagnetic wave as well as the idler electromagnetic wave propagate in the direction perpendicular to the external magnetic field. The simulation code used for this study was the one-and-two-half dimensional electromagnetic particle code named KEMPO. The simulation result shows the high power electromagnetic wave produces both the backscattered electromagnetic wave and electrostatic electron cyclotron waves as a result of nonlinear parametric instability. Detailed nonlinear microphysics related to the wave excitation is discussed in terms of the nonlinear wave-wave couplings and associated ponderomotive force produced by the high power electromagnetic waves. 2 references, 4 figures.

  9. Nonlinear excitation of electron cyclotron waves by a monochromatic strong microwave: computer simulation analysis of the MINIX results

    International Nuclear Information System (INIS)

    Matsumoto, H.; Kimura, T.

    1986-01-01

    Triggered by the experimental results of the MINIX, a computer simulation study was initiated on the nonlinear excitation of electrostatic electron cyclotron waves by a monochromatic electromagnetic wave such as the transmitted microwave in the MINIX. The model used assumes that both of the excited waves and exciting (pumping) electromagnetic wave as well as the idler electromagnetic wave propagate in the direction perpendicular to the external magnetic field. The simulation code used for this study was the one-and-two-half dimensional electromagnetic particle code named KEMPO. The simulation result shows the high power electromagnetic wave produces both the backscattered electromagnetic wave and electrostatic electron cyclotron waves as a result of nonlinear parametric instability. Detailed nonlinear microphysics related to the wave excitation is discussed in terms of the nonlinear wave-wave couplings and associated ponderomotive force produced by the high power electromagnetic waves. 2 references, 4 figures

  10. Influence of chamber type integrated with computer-assisted semen analysis (CASA) system on the results of boar semen evaluation.

    Science.gov (United States)

    Gączarzewicz, D

    2015-01-01

The objective of the study was to evaluate the effect of different types of chambers used in computer-assisted semen analysis (CASA) on boar sperm concentration and motility parameters. CASA measurements were performed on 45 ejaculates by comparing three commonly used chambers: the Leja chamber (LJ), the Makler chamber (MK) and a microscopic slide with coverslip (SL). Concentration results obtained with CASA were verified by manual counting in a Bürker hemocytometer (BH). No significant differences were found between the concentrations determined with BH vs. LJ and SL, whereas higher (p0.05). The results obtained show that CASA assessment of boar semen should account for the effect of the counting chamber on the results of sperm motility and concentration, which confirms the need for further study on standardizing the automatic analysis of boar semen.

  11. Pattern recognition, neural networks, genetic algorithms and high performance computing in nuclear reactor diagnostics. Results and perspectives

    International Nuclear Information System (INIS)

    Dzwinel, W.; Pepyolyshev, N.

    1996-01-01

The main goal of this paper is to present our experience in developing the diagnostic system for the IBR-2 nuclear reactor (Dubna, Russia). The authors show the principal results of system modifications intended to make it work more reliably and much faster. The former requires the adoption of new data-processing techniques; the latter, the implementation of the newest computational facilities. The results of applying clustering techniques and a method for visualizing multi-dimensional information directly on the operator display are presented. Experiences with neural nets, used for predicting reactor operation, are discussed. Genetic algorithms were also tested, to reduce the quantity of data and to extract the most informative components of the analyzed spectra. (authors)

  12. Computer Class Role Playing Games, an innovative teaching methodology based on STEM and ICT: first experimental results

    Science.gov (United States)

    Maraffi, S.

    2016-12-01

Context/Purpose: We experimented with a new teaching and learning technology: a Computer Class Role Playing Game (RPG) that delivers educational activity in classrooms through an interactive game. This approach is new; there are some experiences with educational games, but they are mainly individual rather than class-based. Gaming all together in a class, with a single goal for the whole class, enhances peer collaboration, cooperative problem solving and friendship. Methods: To perform the research we tested the games in several classes of different grades, collecting specific questionnaires from teachers and pupils. Results: The experimental results were outstanding: the RPG interactive activity exceeded by 50% the overall satisfaction with traditional lessons or PowerPoint-supported teaching. Interpretation: The appreciation of the RPG was in agreement with the class-level outcome identified by the teacher after the experimentation. Our work received excellent feedback from teachers on the efficacy of this new teaching methodology and on the achieved results. Using a new methodology closer to the students' point of view improves the innovation and creative capacities of learners, and it supports the new role of the teacher as the learners' "coach". Conclusion: This paper presents the first experimental results of this new technology, based on a computer game which projects onto a wall of the classroom an adventure lived by the students. The plots of the adventures are designed for deeper learning of Science, Technology, Engineering, Mathematics (STEM) and Social Sciences & Humanities (SSH). The pupils participate by interacting with the game through their own tablets or smartphones. The game is based on a mixed-reality learning environment, giving the students the feeling of being "IN the adventure".

  13. Diagnostic reference levels for common computed tomography (CT) examinations: results from the first Nigerian nationwide dose survey.

    Science.gov (United States)

    Ekpo, Ernest U; Adejoh, Thomas; Akwo, Judith D; Emeka, Owujekwe C; Modu, Ali A; Abba, Mohammed; Adesina, Kudirat A; Omiyi, David O; Chiegwu, Uche H

    2018-01-29

To explore doses from common adult computed tomography (CT) examinations and propose national diagnostic reference levels (nDRLs) for Nigeria. This retrospective study was approved by the Nnamdi Azikiwe University and University Teaching Hospital Institutional Review Boards (IRB: NAUTH/CS/66/Vol8/84) and involved dose surveys of adult CT examinations across the six geographical regions of Nigeria and Abuja from January 2016 to August 2017. Dose data for adult head, chest and abdomen/pelvis CT examinations were extracted from patient folders. The median, 75th and 25th percentile volume CT dose index (CTDIvol) and dose-length product (DLP) were computed for each of these procedures. Effective doses (E) for these examinations were estimated using the k conversion factor as described in ICRP Publication 103 (E = k × DLP). The proposed 75th percentile CTDIvol values for head, chest, and abdomen/pelvis are 61 mGy, 17 mGy, and 20 mGy, respectively. The corresponding DLPs are 1310 mGy.cm, 735 mGy.cm, and 1486 mGy.cm, respectively. The effective doses were 2.75 mSv (head), 10.29 mSv (chest), and 22.29 mSv (abdomen/pelvis). Findings demonstrate wide dose variations within and across centres in Nigeria. The results also show CTDIvol values comparable to international standards, but considerably higher DLPs and effective doses.
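The DRL bookkeeping described (percentiles of the DLP distribution, then E = k × DLP at the 75th percentile) can be sketched as follows. The k coefficients below are approximate adult values consistent with the reported doses; treat them as placeholders rather than the study's exact table.

```python
import statistics

# Approximate adult k coefficients in mSv per mGy.cm (placeholders).
K_FACTORS = {"head": 0.0021, "chest": 0.014, "abdomen_pelvis": 0.015}

def drl_summary(dlp_values, region):
    """25th/50th/75th percentile DLP and effective dose at the 75th.

    The 75th percentile is the conventional choice for a proposed DRL.
    """
    q = statistics.quantiles(dlp_values, n=4, method="inclusive")
    p25, median, p75 = q[0], q[1], q[2]
    effective_dose = K_FACTORS[region] * p75  # E = k x DLP
    return p25, median, p75, effective_dose
```

Applied per examination type across the surveyed centres, this yields exactly the CTDIvol/DLP percentile tables and effective doses quoted in the abstract.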

  14. When probabilistic seismic hazard climbs volcanoes: the Mt. Etna case, Italy - Part 2: Computational implementation and first results

    Science.gov (United States)

    Peruzza, Laura; Azzaro, Raffaele; Gee, Robin; D'Amico, Salvatore; Langer, Horst; Lombardo, Giuseppe; Pace, Bruno; Pagani, Marco; Panzera, Francesco; Ordaz, Mario; Suarez, Miguel Leonardo; Tusa, Giuseppina

    2017-11-01

    This paper describes the model implementation and presents results of a probabilistic seismic hazard assessment (PSHA) for the Mt. Etna volcanic region in Sicily, Italy, considering local volcano-tectonic earthquakes. Working in a volcanic region presents new challenges not typically faced in standard PSHA, which are broadly due to the nature of the local volcano-tectonic earthquakes, the cone shape of the volcano and the attenuation properties of seismic waves in the volcanic region. These have been accounted for through the development of a seismic source model that integrates data from different disciplines (historical and instrumental earthquake datasets, tectonic data, etc.; presented in Part 1, by Azzaro et al., 2017) and through the development and software implementation of original tools for the computation, such as a new ground-motion prediction equation and magnitude-scaling relationship specifically derived for this volcanic area, and the capability to account for the surficial topography in the hazard calculation, which influences source-to-site distances. Hazard calculations have been carried out after updating the most recent releases of two widely used PSHA software packages (CRISIS, as in Ordaz et al., 2013; the OpenQuake engine, as in Pagani et al., 2014). Results are computed for short- to mid-term exposure times (10 % probability of exceedance in 5 and 30 years, Poisson and time dependent) and spectral amplitudes of engineering interest. A preliminary exploration of the impact of site-specific response is also presented for the densely inhabited Etna's eastern flank, and the change in expected ground motion is finally commented on. These results do not account for M > 6 regional seismogenic sources which control the hazard at long return periods. However, by focusing on the impact of M risk reduction.

  15. The Effect of Animation in Multimedia Computer-Based Learning and Learning Style to the Learning Results

    Directory of Open Access Journals (Sweden)

    Muhammad RUSLI

    2017-10-01

Full Text Available The effectiveness of learning depends on four main elements: the content, the desired learning outcome, the instructional method and the delivery media. The integration of these four elements can be manifested in a learning module, an approach called multimedia learning, or learning by using multimedia. In learning with computer-based multimedia, two main things need to be considered so that the learning process can run effectively: how the content is presented, and the learner's preferred way of accepting and processing information into meaningful knowledge. The first concerns how the content is visualized and how people learn; the second concerns the learning style of the learner. This research investigates the effect of the type of visualization (static vs. animated) in multimedia computer-based learning, and of learning style (visual vs. verbal), on students' ability to apply the concepts, procedures and principles of Java programming. Visualization type acted as the independent variable, and the students' learning style as a moderator variable. The instructional strategies followed the Component Display Theory of Merrill, and the multimedia presentation format followed the Seven Principles of Multimedia Learning of Mayer and Moreno. Learning with the multimedia computer-based material was carried out in the classroom. The subjects of this research were students of STMIK-STIKOM Bali in the odd semester of 2016-2017 who took the Java programming course. The experimental design used a 2 × 2 multivariate analysis of variance (MANOVA) with a sample of 138 students in four classes. Based on the results of the analysis, it can be concluded that animation in interactive multimedia learning had a positive effect on students' learning outcomes, particularly in applying the concepts, procedures, and principles of Java programming.

  16. Concomitant Use of Transcranial Direct Current Stimulation and Computer-Assisted Training for the Rehabilitation of Attention in Traumatic Brain Injured Patients: Behavioral and Neuroimaging Results.

    Science.gov (United States)

    Sacco, Katiuscia; Galetto, Valentina; Dimitri, Danilo; Geda, Elisabetta; Perotti, Francesca; Zettin, Marina; Geminiani, Giuliano C

    2016-01-01

    Divided attention (DA), the ability to distribute cognitive resources among two or more simultaneous tasks, may be severely compromised after traumatic brain injury (TBI), resulting in problems with numerous activities involved with daily living. So far, no research has investigated whether the use of non-invasive brain stimulation associated with neuropsychological rehabilitation might contribute to the recovery of such cognitive function. The main purpose of this study was to assess the effectiveness of 10 transcranial direct current stimulation (tDCS) sessions combined with computer-assisted training; it also intended to explore the neural modifications induced by the treatment. Thirty-two patients with severe TBI participated in the study: 16 were part of the experimental group, and 16 part of the control group. The treatment included 20' of tDCS, administered twice a day for 5 days. The electrodes were placed on the dorso-lateral prefrontal cortex. Their location varied across patients and it depended on each participant's specific area of damage. The control group received sham tDCS. After each tDCS session, the patient received computer-assisted cognitive training on DA for 40'. The results showed that the experimental group significantly improved in DA performance between pre- and post-treatment, showing faster reaction times (RTs), and fewer omissions. No improvement was detected between the baseline assessment (i.e., 1 month before treatment) and the pre-training assessment, or within the control group. Functional magnetic resonance imaging (fMRI) data, obtained on the experimental group during a DA task, showed post-treatment lower cerebral activations in the right superior temporal gyrus (BA 42), right and left middle frontal gyrus (BA 6), right postcentral gyrus (BA 3) and left inferior frontal gyrus (BA 9). We interpreted such neural changes as normalization of previously abnormal hyperactivations.
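
The pre/post reaction-time comparison reported above can be illustrated with a paired t statistic; the values below are invented for illustration and are not the study's data.

```python
import math
from statistics import mean, stdev

# Invented pre- and post-treatment reaction times (ms) for n = 8
# hypothetical patients; the study's real data are not reproduced here.
pre  = [612, 580, 645, 598, 630, 575, 660, 610]
post = [560, 542, 601, 570, 588, 549, 615, 575]

# Paired differences: negative means faster (improved) after treatment.
diffs = [b - a for a, b in zip(pre, post)]

# Paired t statistic: mean difference over its standard error.
t_stat = mean(diffs) / (stdev(diffs) / math.sqrt(len(diffs)))
```

A large negative t on the difference scores corresponds to the reported post-treatment speed-up; the study's actual analysis also covered omission errors and the sham control group.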

  17. Concomitant use of transcranial Direct Current Stimulation and computer-assisted training for the rehabilitation of attention in traumatic brain injured patients: behavioral and neuroimaging results

    Directory of Open Access Journals (Sweden)

    Katiuscia Sacco

    2016-03-01

    Full Text Available Divided attention, the ability to distribute cognitive resources among two or more simultaneous tasks, may be severely compromised after traumatic brain injury (TBI), resulting in problems with numerous activities involved with daily living. So far, no research has investigated whether the use of non-invasive brain stimulation associated with neuropsychological rehabilitation might contribute to the recovery of such cognitive function. The main purpose of this study was to assess the effectiveness of 10 tDCS sessions combined with computer-assisted training; it also intended to explore the neural modifications induced by the treatment. Thirty-two patients with severe TBI participated in the study: sixteen were part of the experimental group, and sixteen part of the control group. The treatment included 20’ of tDCS, administered twice a day for 5 days. The electrodes were placed on the dorso-lateral prefrontal cortex. Their location varied across patients and it depended on each participant’s specific area of damage. The control group received sham tDCS. After each tDCS session, the patient received computer-assisted cognitive training on divided attention for 40’. The results showed that the experimental group significantly improved in divided attention performance between pre- and post-treatment, showing faster reaction times, and fewer omissions. No improvement was detected between the baseline assessment (i.e., one month before treatment) and the pre-training assessment, or within the control group. Functional magnetic resonance imaging data, obtained on the experimental group during a divided attention task, showed post-treatment lower cerebral activations in the right superior temporal gyrus (BA 42), right and left middle frontal gyrus (BA 6), right postcentral gyrus (BA 3) and left inferior frontal gyrus (BA 9). We interpreted such neural changes as normalization of previously abnormal hyperactivations.

  18. Method of processing results of tests of heating surfaces of a steam generator on a digital computer

    Energy Technology Data Exchange (ETDEWEB)

    Glusker, B.N.

    1975-03-01

    At present, the information obtained by testing steam generators in high-capacity generating units is processed manually. This takes a long time and does not always permit all of the information to be processed, which diminishes the value of the experimental work. In addition, such processing is, as a rule, done after a considerable part of the tests has been completed, and occasionally only after all of them. In that case it is impossible to redirect or correct the experiment while it is under way, which leads to duplication of experiments and lengthens the period of adjustment and exploratory work on industrial plants. An algorithm was therefore developed for automated processing, on digital computers, of the hydraulic and temperature conditions of the heating surfaces in steam generators; it forms part of the general algorithm for processing the results of thermal tests of steam generators. It includes calculation of all characteristics determining the thermal and hydraulic conditions of the heating surfaces. The processing program includes a subprogram for determining the thermophysical and thermodynamic properties of water and steam.
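
As an illustration of the kind of water/steam property subprogram mentioned, the sketch below evaluates the textbook Antoine correlation for the saturation pressure of water; the correlation and coefficients are standard literature values, not those of the original code.

```python
# Antoine correlation for the saturation pressure of water,
# log10(P/mmHg) = A - B / (C + T/degC), valid roughly 1-100 degC.
# Coefficients are the common literature set for water in this range.
A, B, C = 8.07131, 1730.63, 233.426

def saturation_pressure_mmhg(t_celsius: float) -> float:
    """Water saturation pressure in mmHg from temperature in degC."""
    return 10.0 ** (A - B / (C + t_celsius))

# At 100 degC the correlation should return about 760 mmHg (1 atm).
p_boiling = saturation_pressure_mmhg(100.0)
```

A production code would instead use a full steam-table formulation covering the whole operating range of a steam generator; the Antoine form only shows the shape such a subprogram takes.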

  19. A computer-aided audit system for respiratory therapy consult evaluations: description of a method and early results.

    Science.gov (United States)

    Kester, Lucy; Stoller, James K

    2013-05-01

    Use of respiratory therapist (RT)-guided protocols enhances allocation of respiratory care. In the context that optimal protocol use requires a system for auditing respiratory care plans to assure adherence to protocols and expertise of the RTs generating the care plan, a live audit system has been in longstanding use in our Respiratory Therapy Consult Service. Growth in the number of RT positions and the need to audit more frequently has prompted development of a new, computer-aided audit system. The number and results of audits using the old and new systems were compared (for the periods May 30, 2009 through May 30, 2011 and January 1, 2012 through May 30, 2012, respectively). In contrast to the original, live system requiring a patient visit by the auditor, the new system involves completion of a respiratory therapy care plan using patient information in the electronic medical record, both by the RT generating the care plan and the auditor. Completing audits in the new system also uses an electronic respiratory therapy management system. The degrees of concordance between the audited RT's care plans and the "gold standard" care plans using the old and new audit systems were similar. Use of the new system was associated with an almost doubling of the rate of audits (ie, 11 per month vs 6.1 per month). The new, computer-aided audit system increased capacity to audit more RTs performing RT-guided consults while preserving accuracy as an audit tool. Ensuring that RTs adhere to the audit process remains the challenge for the new system, and is the rate-limiting step.

  20. TSOAK-M1: a computer code to determine tritium reaction/adsorption/release parameters from experimental results of air-detritiation tests

    International Nuclear Information System (INIS)

    Land, R.H.; Maroni, V.A.; Minkoff, M.

    1979-01-01

    A computer code has been developed which permits the determination of tritium reaction (T₂ to HTO)/adsorption/release and instrument correction parameters from enclosure (building) detritiation test data. The code is based on a simplified model which treats each parameter as a normalized time-independent constant throughout the data-unfolding steps. Because of the complicated four-dimensional mathematical surface generated by the resulting differential equation system, occasional local-minima effects are observed, but these effects can be overcome in most instances by selecting a series of trial guesses for the initial parameter values and observing the reproducibility of final parameter values for cases where the best overall fit to experimental data is achieved. The code was then used to analyze existing small-cubicle test data with good success, and the resulting normalized parameters were employed to evaluate hypothetical reactor-building detritiation scenarios. It was concluded from the latter evaluation that the complications associated with moisture formation, adsorption, and release, particularly in terms of extended cleanup times, may not be as great as was previously thought. It is recommended that the validity of the TSOAK-M1 model be tested using data from detritiation tests conducted on large experimental enclosures (5 to 10 cm³) and, if possible, actual facility buildings.
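
TSOAK-M1 unfolds several coupled parameters at once, but in the simplest limiting case (no moisture formation, adsorption, or re-release) enclosure detritiation reduces to first-order decay, C(t) = C0·exp(-kt), whose rate constant can be recovered by log-linear least squares. A minimal sketch on synthetic data, not the code's actual fitting procedure:

```python
import math

# Synthetic detritiation data: concentration decaying with k = 0.5 per hour
# from C0 = 100 (arbitrary units). In the limiting case with no wall
# adsorption or re-release, C(t) = C0 * exp(-k * t).
times = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
conc = [100.0 * math.exp(-0.5 * t) for t in times]

# Log-linear least squares: ln C = ln C0 - k t, so -slope estimates k.
n = len(times)
xbar = sum(times) / n
ybar = sum(math.log(c) for c in conc) / n
slope = (sum((t - xbar) * (math.log(c) - ybar) for t, c in zip(times, conc))
         / sum((t - xbar) ** 2 for t in times))
k_fit = -slope                     # recovered cleanup constant, 1/h
halftime = math.log(2) / k_fit     # cleanup half-time, h
```

The full model's moisture and adsorption terms distort this single-exponential picture, which is precisely why TSOAK-M1 fits several parameters over a four-dimensional surface instead.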

  1. Post-mortem computed tomography findings of the lungs: Retrospective review and comparison with autopsy results of 30 infant cases

    Energy Technology Data Exchange (ETDEWEB)

    Kawasumi, Yusuke, E-mail: ssu@rad.med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Clinical Imaging, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Usui, Akihito, E-mail: t7402r0506@med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Diagnostic Image Analysis, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Hosokai, Yoshiyuki, E-mail: hosokai@med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Diagnostic Image Analysis, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Igari, Yui, E-mail: igari@forensic.med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Forensic Medicine, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Hosoya, Tadashi [Tohoku University Graduate School of Medicine, Department of Forensic Medicine, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Hayashizaki, Yoshie, E-mail: yoshie@forensic.med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Forensic Medicine, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Saito, Haruo, E-mail: hsaito@med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Diagnostic Image Analysis, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Ishibashi, Tadashi, E-mail: tisibasi@med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Clinical Imaging, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Funayama, Masato, E-mail: funayama@forensic.med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Forensic Medicine, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan)

    2015-04-15

    Highlights: •Infant cases frequently show a diffuse increase in the concentration of lung fields on post-mortem computed tomography (PMCT). •In this study, twenty-two of the thirty sudden infant death cases showed increasing concentration in the entire lung field. •Based on the autopsy results, the lungs simply collapsed and no other abnormal lung findings were identified. •The radiologist should not consider increasing concentration in all lung fields as simply a pulmonary disorder when diagnosing the cause of infant death using PMCT. -- Abstract: Objectives: Infant cases frequently show a diffuse increase in the concentration of lung fields on post-mortem computed tomography (PMCT). However, the lungs often show simply atelectasis at autopsy in the absence of any other abnormal changes. Thus, we retrospectively reviewed the PMCT findings of lungs following sudden infant death and correlated them with the autopsy results. Materials and methods: We retrospectively reviewed infant cases (0 year) who had undergone PMCT and a forensic autopsy at our institution between May 2009 and June 2013. Lung opacities were classified according to their type (consolidation, ground-glass opacity, or mixed) and their distribution (bilateral diffuse, or with areas of sparing). Statistical analysis was performed to assess the relationships among lung opacities, causes of death and resuscitation attempt. Results: Thirty infant cases were selected, which included 22 sudden and unexplained deaths and 8 other causes of death. Resuscitation was attempted in 22 of 30 cases. Bilateral diffuse opacities were observed in 21 of the 30 cases. Of the 21 cases, 18 were sudden and unexplained deaths. Areas of sparing were observed in 4 sudden and unexplained deaths and 5 other causes of death. Distribution of opacities was not significantly associated with causes of death or resuscitation attempt. The 21 cases with bilateral diffuse opacities included 6 consolidations (4 sudden and unexplained

  2. Post-mortem computed tomography findings of the lungs: Retrospective review and comparison with autopsy results of 30 infant cases

    International Nuclear Information System (INIS)

    Kawasumi, Yusuke; Usui, Akihito; Hosokai, Yoshiyuki; Igari, Yui; Hosoya, Tadashi; Hayashizaki, Yoshie; Saito, Haruo; Ishibashi, Tadashi; Funayama, Masato

    2015-01-01

    Highlights: •Infant cases frequently show a diffuse increase in the concentration of lung fields on post-mortem computed tomography (PMCT). •In this study, twenty-two of the thirty sudden infant death cases showed increasing concentration in the entire lung field. •Based on the autopsy results, the lungs simply collapsed and no other abnormal lung findings were identified. •The radiologist should not consider increasing concentration in all lung fields as simply a pulmonary disorder when diagnosing the cause of infant death using PMCT. -- Abstract: Objectives: Infant cases frequently show a diffuse increase in the concentration of lung fields on post-mortem computed tomography (PMCT). However, the lungs often show simply atelectasis at autopsy in the absence of any other abnormal changes. Thus, we retrospectively reviewed the PMCT findings of lungs following sudden infant death and correlated them with the autopsy results. Materials and methods: We retrospectively reviewed infant cases (0 year) who had undergone PMCT and a forensic autopsy at our institution between May 2009 and June 2013. Lung opacities were classified according to their type (consolidation, ground-glass opacity, or mixed) and their distribution (bilateral diffuse, or with areas of sparing). Statistical analysis was performed to assess the relationships among lung opacities, causes of death and resuscitation attempt. Results: Thirty infant cases were selected, which included 22 sudden and unexplained deaths and 8 other causes of death. Resuscitation was attempted in 22 of 30 cases. Bilateral diffuse opacities were observed in 21 of the 30 cases. Of the 21 cases, 18 were sudden and unexplained deaths. Areas of sparing were observed in 4 sudden and unexplained deaths and 5 other causes of death. Distribution of opacities was not significantly associated with causes of death or resuscitation attempt. The 21 cases with bilateral diffuse opacities included 6 consolidations (4 sudden and unexplained

  3. Thermodynamic properties of xanthone: Heat capacities, phase-transition properties, and thermodynamic-consistency analyses using computational results

    International Nuclear Information System (INIS)

    Chirico, Robert D.; Kazakov, Andrei F.

    2015-01-01

    Highlights: • Heat capacities were measured for the temperature range (5 to 520) K. • The enthalpy of combustion was measured and the enthalpy of formation was derived. • Thermodynamic-consistency analysis resolved inconsistencies in literature enthalpies of sublimation. • An inconsistency in literature enthalpies of combustion was resolved. • Application of computational chemistry in consistency analysis was demonstrated successfully. - Abstract: Heat capacities and phase-transition properties for xanthone (IUPAC name 9H-xanthen-9-one and Chemical Abstracts registry number [90-47-1]) are reported for the temperature range 5 < T/K < 524. Statistical calculations were performed and thermodynamic properties for the ideal gas were derived based on molecular geometry optimization and vibrational frequencies calculated at the B3LYP/6-31+G(d,p) level of theory. These results are combined with sublimation pressures from the literature to allow critical evaluation of inconsistent enthalpies of sublimation for xanthone, also reported in the literature. Literature values for the enthalpy of combustion of xanthone are re-assessed, a revision is recommended for one result, and a new value for the enthalpy of formation of the ideal gas is derived. Comparisons with thermophysical properties reported in the literature are made for all other reported and derived properties, where possible
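
The statistical-calculation step described above (computed vibrational frequencies to ideal-gas properties) rests on the harmonic-oscillator partition function. A minimal sketch of the vibrational contribution to the heat capacity, using illustrative frequencies rather than xanthone's actual B3LYP values:

```python
import math

R = 8.314462618          # gas constant, J/(mol K)
H = 6.62607015e-34       # Planck constant, J s
KB = 1.380649e-23        # Boltzmann constant, J/K
C_CM = 2.99792458e10     # speed of light, cm/s

def vib_cv(freqs_cm, temp):
    """Harmonic-oscillator vibrational heat capacity, J/(mol K).

    Each mode contributes R * x^2 * e^x / (e^x - 1)^2 with
    x = h c nu / (kB T), the Einstein function.
    """
    cv = 0.0
    for nu in freqs_cm:
        x = H * C_CM * nu / (KB * temp)
        cv += R * x * x * math.exp(x) / (math.exp(x) - 1.0) ** 2
    return cv

# Illustrative (not xanthone's actual) frequencies in cm^-1:
cv_vib = vib_cv([500.0, 1000.0, 1600.0, 3100.0], 298.15)
```

Low-frequency modes contribute nearly the classical limit R each at room temperature, while stiff C-H stretches near 3100 cm⁻¹ contribute almost nothing; the full property calculation also adds translational, rotational, and internal-rotation terms.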

  4. Mid-term results of off-pump coronary artery bypass grafting assessed by multi-slice computed tomography

    International Nuclear Information System (INIS)

    Yoshida, Seijiro; Nitta, Yoshio; Oda, Katsuhiko

    2004-01-01

    Off-pump coronary artery bypass (OPCAB) has recently increased in popularity, but the long-term results are still unknown. We evaluated the mid-term results of OPCAB surgery using multi-slice computed tomography (MSCT), a non-invasive method of postoperative evaluation. Thirty-one consecutive patients who had undergone OPCAB surgery at least 2 years before the study were selected. Ages ranged from 50 to 79 years (mean 66.9±6.5), and the ratio of men to women was 26:5. Coronary angiography was performed in all patients at 2 weeks postoperatively. Follow-up was complete, with a mean of 30.9 months. There were no hospital deaths and 1 non-cardiac late death. Graft patency rates on coronary angiography were: left internal thoracic artery (LITA) 30/30 (100%), right internal thoracic artery (RITA) 2/2 (100%), radial artery (RA) 14/15 (93%), and saphenous vein graft (SVG) 15/17 (88%). No graft became occluded on the MSCT study, and all patients have remained angina-free during the follow-up period. We suggest that OPCAB is feasible in most patients, with good patency and low mortality. MSCT is an effective follow-up method for morphological assessment and noninvasive quantitative evaluation of the bypass grafts. (author)

  5. Nonlinear ultrasound propagation through layered liquid and tissue-equivalent media: computational and experimental results at high frequency

    International Nuclear Information System (INIS)

    Williams, Ross; Cherin, Emmanuel; Lam, Toby Y J; Tavakkoli, Jahangir; Zemp, Roger J; Foster, F Stuart

    2006-01-01

    Nonlinear propagation has been demonstrated to have a significant impact on ultrasound imaging. An efficient computational algorithm is presented to simulate nonlinear ultrasound propagation through layered liquid and tissue-equivalent media. Results are compared with hydrophone measurements. This study was undertaken to investigate the role of nonlinear propagation in high frequency ultrasound micro-imaging. The acoustic field of a focused transducer (20 MHz centre frequency, f-number 2.5) was simulated for layered media consisting of water and tissue-mimicking phantom, for several wide-bandwidth source pulses. The simulation model accounted for the effects of diffraction, attenuation and nonlinearity, with transmission and refraction at layer boundaries. The parameter of nonlinearity, B/A, of the water and tissue-mimicking phantom were assumed to be 5.2 and 7.4, respectively. The experimentally measured phantom B/A value found using a finite-amplitude insert-substitution method was shown to be 7.4 ± 0.6. Relative amounts of measured second and third harmonic pressures as a function of the fundamental pressures at the focus were in good agreement with simulations. Agreement within 3% was found between measurements and simulations of the beam widths of the fundamental and second harmonic signals following propagation through the tissue phantom. The results demonstrate significant nonlinear propagation effects for high frequency imaging beams
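
For orientation, the textbook lossless plane-wave result gives the expected scale of second-harmonic growth: with coefficient of nonlinearity β = 1 + B/2A, acoustic Mach number ε = p1/(ρ0·c0²), and shock-formation distance x̄ = 1/(βεk), the quasilinear second-harmonic ratio is p2/p1 ≈ σ/2 where σ = z/x̄. This is only a rough estimate (the study's simulation additionally handles diffraction, attenuation, focusing, and layer boundaries), and the source amplitude below is an assumed value.

```python
import math

# Lossless plane-wave estimate of second-harmonic growth (textbook
# quasilinear result, not the paper's full layered simulation).
rho0, c0 = 998.0, 1482.0        # water density (kg/m^3) and sound speed (m/s)
b_over_a = 5.2                  # B/A of water, as assumed in the study
f, p1 = 20e6, 1e6               # 20 MHz source, assumed 1 MPa amplitude

beta = 1.0 + b_over_a / 2.0     # coefficient of nonlinearity
k = 2.0 * math.pi * f / c0      # wavenumber, 1/m
eps = p1 / (rho0 * c0 ** 2)     # acoustic Mach number
x_shock = 1.0 / (beta * eps * k)    # shock-formation distance, m

sigma = 0.005 / x_shock         # dimensionless distance at z = 5 mm
p2_over_p1 = sigma / 2.0        # quasilinear second-harmonic ratio
```

At 20 MHz the shock distance is only millimeters, which is why nonlinear effects matter even over the short propagation paths of high-frequency micro-imaging.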

  6. Percutaneous computed tomography-guided core needle biopsy of soft tissue tumors: results and correlation with surgical specimen analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chojniak, Rubens; Grigio, Henrique Ramos; Bitencourt, Almir Galvao Vieira; Pinto, Paula Nicole Vieira; Tyng, Chiang J.; Cunha, Isabela Werneck da; Aguiar Junior, Samuel; Lopes, Ademar, E-mail: chojniak@uol.com.br [Hospital A.C. Camargo, Sao Paulo, SP (Brazil)

    2012-09-15

    Objective: To evaluate the efficacy of percutaneous computed tomography (CT)-guided core needle biopsy of soft tissue tumors in obtaining appropriate samples for histological analysis, and to compare its diagnoses with surgical pathology results, where available. Materials and Methods: The authors reviewed medical records, imaging and histological reports of 262 patients with soft-tissue tumors submitted to CT-guided core needle biopsy in an oncologic reference center between 2003 and 2009. Results: Appropriate samples were obtained in 215 (82.1%) of the 262 patients. The most prevalent tumors were sarcomas (38.6%), metastatic carcinomas (28.8%), benign mesenchymal tumors (20.5%) and lymphomas (9.3%). Histological grading was feasible in 92.8% of sarcoma patients, with the majority of them (77.9%) classified as high grade tumors. Of the total sample, 116 patients (44.3%) underwent surgical excision and diagnosis confirmation. Core biopsy demonstrated 94.6% accuracy in the identification of sarcomas, with 96.4% sensitivity and 89.5% specificity. A significant intermethod agreement on histological grading was observed between core biopsy and surgical resection (p < 0.001; kappa = 0.75). Conclusion: CT-guided core needle biopsy demonstrated high diagnostic accuracy in the evaluation of soft tissue tumors as well as in the histological grading of sarcomas, allowing appropriate therapeutic planning. (author)
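
The sensitivity, specificity, and Cohen's kappa reported above are simple functions of a 2 x 2 biopsy-versus-surgery table; the sketch below uses invented counts, not the study's data.

```python
# Hypothetical biopsy-vs-surgery confusion matrix (counts are invented,
# not the study's): rows = core biopsy call, columns = surgical pathology.
tp, fn = 45, 5      # sarcoma on surgery: biopsy positive / biopsy negative
fp, tn = 2, 18      # non-sarcoma on surgery: biopsy positive / negative
n = tp + fn + fp + tn

sensitivity = tp / (tp + fn)        # true-positive rate
specificity = tn / (tn + fp)        # true-negative rate
accuracy = (tp + tn) / n

# Cohen's kappa: observed agreement corrected for chance agreement.
p_o = accuracy
p_e = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
kappa = (p_o - p_e) / (1 - p_e)
```

Kappa in the 0.6-0.8 range is conventionally read as substantial agreement, which matches the interpretation the authors give their grading concordance.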

  7. Results of the deepest all-sky survey for continuous gravitational waves on LIGO S6 data running on the Einstein@Home volunteer distributed computing project

    NARCIS (Netherlands)

    Abbott, B. P.; Abbott, R.; Abbott, T. D.; Abernathy, M. R.; Acemese, F.; Ackley, K.; Adams, C.; Phythian-Adams, A.T.; Addesso, P.; Adhikari, R. X.; Adya, V. B.; Affeldt, C.; Agathos, M.; Agatsuma, K.; Aggarwa, N.; Aguiar, O. D.; Aiello, L.; Ain, A.; Allen, B.; Allocca, A.; Altin, P. A.; Anderson, S. B.; Anderson, W. G.; Arai, K.; Araya, M. C.; Arceneaux, C. C.; Areeda, J. S.; Arnaud, N.; Arun, K. G.; Ascenzi, S.; Ashton, G.; Ast, M.; Aston, S. M.; Astone, P.; Aufmuth, P.; Aulbert, C.; Babak, S.; Bacon, P.; Bader, M. K. M.; Baker, P. T.; Baldaccini, F.; Ballardin, G.; Ballmer, S. W.; Barayoga, J. C.; Barclay, S. E.; Barish, B. C.; Arker, Bd.; Barone, F.; Barr, B.; Barsotti, L.; Barsuglia, M.; Barta, D.; Bartlett, J.; Bartos, I.; Bassiri, R.; Basti, A.; Batch, J. C.; Baune, C.; Bavigadda, V.; Bazzan, M.; Bejger, M.; Be, A. S.; Berger, B. K.; Bergmann, G.; Berry, C. P. L.; Bersanetti, D.; Bertolini, A.; Betzwieser, J.; Bhagwat, S.; Bhandare, R.; Bilenko, I. A.; Billingsley, G.; Birch, M.J.; Birney, R.; Biscans, S.; Bisht, A.; Bitoss, M.; Biwer, C.; Bizouard, M. A.; Blackburn, J. K.; Blair, C. D.; Blair, D. G.; Blair, R. M.; Bloemen, S.; Bock, O.; Boer, M.; Bogaert, J.G.; Bogan, C.; Bohe, A.; Bond, C.; Bondu, F.; Bonnand, R.; Boom, B. A.; Bork, R.; Boschi, V.; Bose, S.; Boutfanais, Y.; Bozzi, A.; Bradaschia, C.; Brady, P. R.; Braginsky, V. B.; Branchesi, M.; Brau, J. E.; Briant, T.; Brillet, A.; Brinkmann, M.; Brisson, V.; Brockill, P.; Broida, J. E.; Brooks, A. F.; Brown, A.D.; Brown, D.; Brown, N. M.; Brunett, S.; Buchanan, C. C.; Buikema, A.; Bulik, O.; Bulten, H. J.; Buonanno, A.; Buskulic, D.; Buy, C.; Byer, R. L.; Cabero, M.; Cadonati, L.; Cagnoli, C.; Cahillane, C.; Bustillo, J. Calderon; Callister, T. A.; Calloni, E.; Camp, J. B.; Cannon, K. C.; Cao, J.; Capano, C. D.; Capocasa, E.; Carbognani, F.; Caride, S.; Diaz, J. Casanueva; Casentini, C.; Caudill, S.; Cavaglia, M.; Cavalier, F.; Cavalieri, R.; Cella, G.; Cepeda, C. B.; Baiardi, L. 
Cerboni; Cerretani, G.; Cesarini, E.; Chamberlin, S. J.; Chan, M.; Chao, D. S.; Charlton, P.; Chassande-Mottin, E.; Cheeseboro, B. D.; Chen, H. Y.; Chen, Y; Cheng, C.; Chincarini, A.; Chiummo, A.; Cho, H. S.; Cho, M.; Chow, J. H.; Christensen, N.; Chu, Q.; Chua, S. S. Y.; Chung, S.; Ciani, G.; Clara, F.; Clark, J. A.; Cleva, F.; Coccia, E.; Cohadon, P. -F.; Colla, A.; Collette, C. G.; Cominsky, L.; Constancio, M., Jr.; Conte, A.; Conti, L.; Cook, D.; Corbitt, T. R.; Cornish, N.; Corsi, A.; Cortese, S.; Costa, C. A.; Coughlin, M. W.; Coughlin, S. B.; Coulon, J. -P.; Countryman, S. T.; Couvares, P.; Cowan, E. E.; Coward, D. M.; Cowart, M. J.; Coyne, D. C.; Coyne, R.; Craig, K.; Creighton, J. D. E.; Creighton, T. D.; Cripe, J.; Crowder, S. G.; Cumming, A.; Cunningham, Laura; Cuoco, E.; Dal Canton, T.; Danilishin, S. L.; D'Antonio, S.; Danzmann, K.; Darman, N. S.; Dasgupta, A.; Costa, C. F. Da Silva; Dattilo, V.; Dave, I.; Davier, M.; Davies, G. S.; Daw, E. J.; Day, R.; De, S.; DeBra, D.; Debreczeni, G.; Degallaix, J.; De laurentis, M.; Deleglise, S.; Del Pozzo, W.; Denker, T.; Dent, T.; Dergachev, V.A.; De Rosa, R.; DeRosa, R. T.; DeSalvo, R.; Devine, R. C.; Dhurandhar, S.; Diaz, M. C.; Di Fiore, L.; Giovanni, M. Di; Di Girolamo, T.; Di Lieto, A.; Di Pace, S.; Di Palma, I.; Di Virgilio, A.; Dolique, V.; Donovan, F.; Dooley, K. L.; Doravari, S.; Douglas, R.; Downes, T. P.; Drago, M.; Dreyer, R. W. P.; Driggers, J. C.; Ducrot, M.; Dwyer, S. E.; Edo, T. B.; Edwards, M. C.; Effler, A.; Egizenstein, H. -B.; Ehrens, P.; Eichholel, J.; Eikenberry, S. S.; Engels, W.; Essick, R. C.; Etzel, T.; Evans, M.; Evans, T. M.; Everett, R.; Factourovich, M.; Fafone, O.; Fair, H.; Fairhurst, S.; Fan, X.; Fang, Q.; Farinon, S.; Farr, B.; Far, W. M.; Favata, M.; Fays, M.; Fehrmann, H.; Fejer, M. M.; Fenyvesi, E.; Ferrante, I.; Ferreira, E. C.; Ferrini, F.; Fidecaro, F.; Fiori, I.; Fiorucci, D.; Fisher, R. P.; Flaminio, R.; Fletcher, M.M.; Fournier, J. -D.; Frasca, J. 
-D; Frasconi, F.; Frei, Z.; Freise, A.; Frey, R.; Frey, V.; Fritsche, P.; Frolov, V. V.; Fulda, P.; Fyffe, M.; Gabbard, H. A. G.; Gair, J. R.; Gammaitoni, L.; Gaonkar, S. G.; Garuti, F.; Gaur, G.; Gehrels, N.; Gemme, G.; Geng, P.; Genin, E.; Gennai, A.; George, J.; Gergely, L.; Germain, V.; Ghosh, Abhirup; Ghosh, Archisman; Ghosh, S.; Giaime, J. A.; Giardina, K. D.; Giazotto, A.; Gi, K.; Glaetke, A.; Goetz, E.; Goetz, R.; Gondan, L.; Gonzalez, Idelmis G.; Castro, J. M. Gonzalez; Gopakumar, A.; Gordon, N. A.; Gorodetsky, M. L.; Gossan, S. E.; Lee-Gosselin, M.; Gouaty, R.; Grado, A.; Graef, C.; Graff, P. B.; Granata, M.; Granta, A.; Gras, S.; Cray, C.; Greco, G.; Green, A. C.; Groot, P.; Grote, H.; Grunewald, S.; Guidi, G. M.; Guo, X.; Gupta, A.; Gupta, M. K.; Gushwa, K. E.; Gustafson, E. K.; Gustafson, R.; Hacker, J. J.; Hall, B. R.; Hall, E. D.; Hammond, G.; Haney, M.; Hanke, M. M.; Hanks, J.; Hanna, C.; Hanson, J.; Hardwick, T.; Harms, J.; Harry, G. M.; Harry, I. W.; Hart, M. J.; Hartman, M. T.; Haster, C. -J.; Haughian, K.; Heidmann, A.; Heintze, M. C.; Heitmann, H.; Hello, P.; Hemming, G.; Hendry, M.; Heng, S.; Hennig, J.; Henry, J.A.; Heptonsta, A. W.; Heurs, M.; Hild, S.; Hoak, D.; Hofman, D.; Holt, K.; Holz, D. E.; Hopkins, P.; Hough, J.; Houston, E. A.; Howel, E. J.; Hu, Y. M.; Huang, O.; Huerta, E. A.; Huet, D.; Hughey, B.; Husa, S.; Huttner, S. H.; Huynh-Dinh, T.; Indik, N.; Ingram, D. R.; Inta, R.; Isa, H. N.; Isac, J. -M.; Isi, M.; Isogai, T.; Lyer, B. R.; Fzumi, K.; Jaccimin, T.; Jang, D.H.; Jani, K.; Jaranowski, P.; Jawahar, S.; Jian, L.; Jimenez-Forteza, F.; Johnson, W.; Jones, I.D.; Jones, R.; Jones, R.; Jonker, R. J. G.; Ju, L.; Wads, k; Kalaghatgi, C. V.; Kalogera, V.; Kandhasamy, S.; Kang, G.; Kanner, J. B.; Kapadia, S. J.; Karki, S.; Karvinen, K. S.; Kasprzack, M.; Katsavounidis, E.; Katzman, W.; Kaufer, S.; Kaur, T.; Kawabe, K.; Kefelian, F.; Keh, M. S.; Keite, D.; Kelley, D. B.; Kells, W.; Kennedy, R.E.; Key, J. S.; Khalili, F. 
Y.; Khan, I.; Khan, S.; Khan, Z.; Khazanov, E. A.; Kijbunchoo, N.; Kim, Chi-Woong; Kim, Chunglee; Kim, J.; Kim, K.; Kim, Namjun; Kim, W.; Kimbre, S. J.; King, E. J.; King, P. J.; Kisse, J. S.; Klein, B.; Kleybolte, L.; Klimenko, S.; Koehlenbeck, S. M.; Koley, S.; Kondrashov, V.; Kontos, A.; Korobko, M.; Korth, W. Z.; Kowalska, I.; Kozak, D. B.; Kringe, V.; Krishnan, B.; Krolak, A.; Krueger, C.; Kuehn, G.; Kumar, P.; Kumar, R.; Kuo, L.; Kutynia, A.; Lackey, B. D.; Landry, M.; Lange, J.; Lantz, B.; Lasky, P. D.; Laxen, M.; Lazzaro, C.; Leaci, P.; Leavey, S.; Lebigot, E. O.; Lee, C. H.; Lee, K.H.; Lee, M.H.; Lee, K.; Lenon, A.; Leonardi, M.; Leong, J. R.; Leroy, N.; Letendre, N.; Levin, Y.; Lewis, J. B.; Li, T. G. F.; Libson, A.; Littenberg, T. B.; Lockerbie, N. A.; Lombardi, A. L.; London, L. T.; Lord, J. E.; Lorenzini, M.; Loriette, V.; Lormand, M.; Losurdo, G.; Lough, J. D.; Liick, H.; Lundgren, A. P.; Lynch, R.; Ivia, Y.; Machenschalk, B.; Maclnnis, M.; Macleod, D. M.; Magafia-Sandoval, F.; Zertuche, L. Magafia; Magee, R. M.; Majorana, E.; Maksimovic, I.; Malvezzi, V.; Man, N.; Mandic, V.; Mangano, V.; Manse, G. L.; Manske, M.; Mantovani, M.; Marchesoni, F.; Marion, F.; Marka, S.; Marka, Z.; Markosyan, A. S.; Maros, E.; Martelli, F.; Martellini, L.; Martin, I. W.; Martynov, D. V.; Marx, J. N.; Mason, K.; Masserot, A.; Massinger, T. J.; Masso-Reid, M.; Mastrogiovanni, S.; Matiehard, F.; Matone, L.; Mavalvala, N.; Mazumder, N.; McCarthy, R.; McClelland, D. E.; McCormick, S.; McGuire, S. C.; McIntyre, G.; McIver, J.; McManus, D. J.; McRae, T.; McWilliams, S. T.; Meacher, D.; Meadors, G. D.; Meidam, J.; Melatos, A.; Mende, G.; Mercer, R. A.; Merilh, E. L.; Merzougui, M.; Meshkov, S.; Messenger, C.; Messick, C.; Metzdorff, R.; Meyers, P. M.; Mezzani, F.; Miao, H.; Miche, C.; Middleton, H.; Mikhailov, E. E.; Milano, L.; Miller, A. L.; Miller, A.; Miller, B. B.; Miller, J.; Millhouse, M.; Minenkov, Y.; Ming, J.; Mirshekari, S.; Mishra, C.; Mitra, S.; Mitrofanov, V. 
P.; Mitselmakher, G.; Mittleman, R.; Moggi, A.; Mohan, M.; Mohapatra, S. R. P.; Montani, M.; Moore, B.C.; Moore, C. J.; Moraru, D.; Moreno, G.; Morriss, S. R.; Mossavi, K.; Mours, B.; Mow-Lowry, C. M.; Mueller, G.; Muir, A. W.; Mukherjee, Arunava; Mukherjee, S.D.; Mukherjee, S.; Mukund, N.; Mullavey, A.; Munch, J.; Murphy, D. J.; Murray, P.G.; Mytidis, A.; Nardecehia, I.; Naticchioni, L.; Nayak, R. K.; Nedkova, K.; Nelemans, G.; Nelson, T. J. N.; Gutierrez-Neri, M.; Neunzert, A.; Newton, G.; Nguyen, T. T.; Nielsen, A. B.; Nissanke, S.; Nitz, A.; Nocera, F.; Nolting, D.; Normandin, M. E. N.; Nuttall, L. K.; Oberling, J.; Ochsner, E.; O'Dell, J.; Oelker, E.; Ogin, G. H.; Oh, J. J.; Hang, S.; Ohme, F.; Oliver, M.; Oppermann, P.; Ram, Richard J.; O'Reilly, B.; O'Shaughnessy, R.; Ottaway, D. J.; Overmier, H.; Owen, B. J.; Pai, A.; Pai, S. A.; Palamos, J. R.; Palashov, O.; Palomba, C.; Pal-Singh, A.; Pan, H.; Pankow, C.; Pannarale, F.; Pant, B. C.; Paoletti, F.; Paoli, A.; Papa, M. A.; Paris, H. R.; Parker, W.S; Pascucci, D.; Pasqualetti, A.; Passaquieti, R.; Passuello, D.; Patricelli, B.; Patrick, Z.; Pearlstone, B. L.; Pedraza, M.; Pedurand, R.; Pekowsky, L.; Pele, A.; Penn, S.; Perreca, A.; Perri, L. M.; Phelps, M.; Piccinni, . J.; Pichot, M.; Piergiovanni, F.; Pierro, V.; Pillant, G.; Pinard, L.; Pinto, I. M.; Pitkin, M.; Poe, M.; Poggiani, R.; Popolizio, P.; Post, A.; Powel, J.; Prasad, J.; Predoi, V.; Prestegard, T.; Price, L. R.; Prijatelj, M.; Principe, M.; Privitera, S.; Prix, R.; Prodi, G. A.; Prokhorov, L. G.; Puncken, .; Punturo, M.; Purrer, PuppoM.; Qi, H.; Qin, J.; Qiu, S.; Quetschke, V.; Quintero, E. A.; Quitzow-James, R.; Raab, F. J.; Rabeling, D. S.; Radkins, H.; Raffai, P.; Raja, S.; Rajan, C.; Rakhmanov, M.; Rapagnani, P.; Raymond, V.; Razzano, M.; Re, V.; Read, J.; Reed, C. M.; Regimbau, T.; Rei, L.; Reid, S.; Reitze, D. H.; Rew, H.; Reyes, S. D.; Ricci, F.; Riles, K.; Rizzo, M.; Robertson, N. 
A.; Robie, R.; Robinet, F.; Rocchi, A.; Rolland, L.; Rollins, J. G.; Roma, V. J.; Romano, R.; Romanov, G.; Romie, J. H.; Rowan, RosiliskaS.; Ruggi, RiidigerP.; Ryan, K.; Sachdev, Perminder S; Sadecki, T.; Sadeghian, L.; Sakellariadou, M.; Salconi, L.; Saleem, M.; Salemi, F.; Samajdar, A.; Sammut, L.; Sanchez, E. J.; Sandberg, V.; Sandeen, B.; Sanders, J. R.; Sassolas, B.; Saulson, P. R.; Sauter, E. S.; Savage, R. L.; Sawadsky, A.; Schale, P.; Schilling, R.; Schmidt, J; Schmidt, P.; Schnabe, R.; Schofield, R. M. S.; Schonbeck, A.; Schreiber, K.E.C.; Schuette, D.; Schutz, B. F.; Scott, J.; Scott, S. M.; Sellers, D.; Sengupta, A. S.; Sentenac, D.; Sequino, V.; Sergeev, A.; Setyawati, Y.; Shaddock, D. A.; Shaffer, T. J.; Shahriar, M. S.; Shaltev, M.; Shapiro, B.; Shawhan, P.; Sheperd, A.; Shoemaker, D. H.; Shoemaker, D. M.; Sielleez, K.; Siemens, X.; Sieniawska, M.; Sigg, D.; Silva, António Dias da; Singer, A.; Singer, L. P.; Singh, A.; Singh, R.; Singhal, A.; Sintes, A. M.; Slagmolen, B. J. J.; Smith, R. J. E.; Smith, N.D.; Smith, R. J. E.; Son, E. J.; Sorazus, B.; Sorrentino, F.; Souradeep, T.; Srivastava, A. K.; Staley, A.; Steinke, M.; Steinlechner, J.; Steinlechner, S.; Steinmeyer, D.; Stephens, B. C.; Stone, R.; Strain, K. A.; Straniero, N.; Stratta, G.; Strauss, N. A.; Strigin, S. E.; Sturani, R.; Stuver, A. L.; Summerscales, T. Z.; Sunil, Suns; Sutton, P. J.; Swinkels, B. L.; Szczepariczyk, M. J.; Tacca, M.; Talukder, D.; Tanner, D. B.; Tapai, M.; Tarabrin, S. P.; Taracchini, A.; Taylor, W.R.; Theeg, T.; Thirugnanasambandam, M. P.; Thomas, E. G.; Thomas, M.; Thomas, P.; Thorne, K. A.; Thrane, E.; Tiwari, S.; Tiwari, V.; Tokmakov, K. V.; Toland, K.; Tomlinson, C.; Tonelli, M.; Tomasi, Z.; Torres, C. V.; Tome, C.; Tot, D.; Travasso, F.; Traylor, G.; Trifire, D.; Tringali, M. C.; Trozz, L.; Tse, M.; Turconi, M.; Tuyenbayev, D.; Ugolini, D.; Unnikrishnan, C. S.; Urban, A. L.; Usman, S. 
A.; Vahlbruch, H.; Valente, G.; Valdes, G.; van Bake, N.; Van Beuzekom, Martin; Van den Brand, J. F. J.; Van Den Broeck, C.F.F.; Vander-Hyde, D. C.; van der Schaaf, L.; Van Heilningen, J. V.; Van Vegge, A. A.; Vardaro, M.; Vass, S.; Vaslith, M.; Vaulin, R.; Vecchio, A.; Vedovato, G.; Veitch, J.; Veitch, P.J.; Venkateswara, K.; Verkindt, D.; Vetrano, F.; Vicere, A.; Vinciguerra, S.; Vine, D. J.; Vinet, J. -Y.; Vitale, S.; Vo, T.; Vocca, H.; Vorvick, C.; Voss, D. V.; Vousden, W. D.; Vyatchanin, S. P.; Wade, A. R.; Wade, L. E.; Wade, MT; Walker, M.; Wallace, L.; Walsh, S.; Vvang, G.; Wang, O.; Wang, X.; Wiang, Y.; Ward, R. L.; Wiarner, J.; Was, M.; Weaver, B.; Wei, L. -W.; Weinert, M.; Weinstein, A. J.; Weiss, R.; Wen, L.; Weliels, P.; Westphal, T.; Wette, K.; Whelan, J. T.; Whiting, B. F.; Williams, D.R.; Williamson, A. R.; Willis, J. L.; WilIke, B.; Wimmer, M. H.; Whinkler, W.; Wipf, C. C.; De Witte, H.; Woan, G.; Woehler, J.; Worden, J.; Wright, J.L.; Wu, D. S.; Wu, G.; Yablon, J.; Yam, W.; Yamamoto, H.; Yancey, C. C.; Yu, H.; Yvert, M.; Zadrozny, A.; Zangrando, L.; Zanolin, M.; Zendri, J. P.; Zevin, M.; Zhang, L.; Zhang, M.; Zhang, Y.; Zhao, C.; Zhou, M.; Zhou, Z.; Zhu, S.J.; Zhu, X.; Zucker, M. E.; Zuraw, S. E.; Zweizig, J.

    2016-01-01

    We report results of a deep all-sky search for periodic gravitational waves from isolated neutron stars in data from the S6 LIGO science run. The search was possible thanks to the computing power provided by the volunteers of the Einstein@Home distributed computing project. We find no significant

  8. Quantifying differences between computational results and measurements in the case of a large-scale well-confined fire scenario

    International Nuclear Information System (INIS)

    Audouin, L.; Chandra, L.; Consalvi, J.-L.; Gay, L.; Gorza, E.; Hohm, V.; Hostikka, S.; Ito, T.; Klein-Hessling, W.; Lallemand, C.; Magnusson, T.; Noterman, N.; Park, J.S.; Peco, J.; Rigollet, L.; Suard, S.; Van-Hees, P.

    2011-01-01

    Research Highlights: → We performed a numerical benchmark in the framework of an OECD experimental program of a pool fire in a well-confined compartment. → The benchmark involved 17 participants using 8 fire models: 3 CFD (field) models and 5 zone models. → We investigated the capabilities of validation metrics for a real large-scale fire. → Six quantities were compared during the whole fire duration. → It is important to consider more than one metric for the validation process. - Abstract: The objective of this work was to quantify comparisons between several computational results and measurements performed during a pool fire scenario in a well-confined compartment. This collaborative work was initiated under the framework of the OECD fire research program and involves the most frequently used fire models in the fire community, including field and zone models. The experimental scenario was conducted at the French Institut de Radioprotection et de Surete Nucleaire (IRSN) and deals with a full-scale liquid pool fire in a confined and mechanically ventilated compartment representative of nuclear power plants. The practical use of different metric operators and their ability to report the capabilities of fire models are presented. The quantitative comparisons between measurements and numerical results obtained from 'open' calculations concern six important quantities from a safety viewpoint: gas temperature, oxygen concentration, wall temperature, total heat flux, compartment pressure and ventilation flow rate during the whole fire duration. The results indicate that it is important to use more than one metric for the validation process in order to obtain information on the uncertainties associated with different aspects of fire safety.
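    The point that no single metric suffices can be illustrated with a minimal sketch (the metric forms below are common, simplified choices, not necessarily the operators used in the benchmark): a normalized Euclidean distance over a whole time series and a peak-value difference can rank the same simulation differently.

```python
import numpy as np

def euclidean_metric(measured, simulated):
    """Normalized Euclidean distance between two time series (0 = perfect match)."""
    measured = np.asarray(measured, float)
    simulated = np.asarray(simulated, float)
    return np.linalg.norm(simulated - measured) / np.linalg.norm(measured)

def peak_metric(measured, simulated):
    """Signed relative difference of the peak values."""
    return (np.max(simulated) - np.max(measured)) / np.max(measured)

# Synthetic example: a gas-temperature trace and a model run that overshoots
# the peak and lags slightly in time (illustrative data, not from the paper)
t = np.linspace(0.0, 1.0, 200)
measured = 20.0 + 300.0 * np.exp(-((t - 0.40) / 0.15) ** 2)
simulated = 20.0 + 330.0 * np.exp(-((t - 0.45) / 0.15) ** 2)

print(euclidean_metric(measured, simulated))
print(peak_metric(measured, simulated))
```

    The whole-series metric penalizes the time lag while the peak metric ignores it, which is one concrete reason to report several metrics side by side.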

  9. An analysis of true- and false-positive results of vocal fold uptake in positron emission tomography-computed tomography imaging.

    Science.gov (United States)

    Seymour, N; Burkill, G; Harries, M

    2018-03-01

    Positron emission tomography-computed tomography with fluorine-18 fluorodeoxy-D-glucose has a major role in the investigation of head and neck cancers. Fluorine-18 fluorodeoxy-D-glucose is not a tumour-specific tracer and can also accumulate in benign pathology. Therefore, positron emission tomography-computed tomography scan interpretation difficulties are common in the head and neck, which can produce false-positive results. This study aimed to investigate patients detected as having abnormal vocal fold uptake on fluorine-18 fluorodeoxy-D-glucose positron emission tomography-computed tomography. Positron emission tomography-computed tomography scans were identified over a 15-month period where reports contained evidence of unilateral vocal fold uptake or vocal fold pathology. Patients' notes and laryngoscopy results were analysed. Forty-six patients were identified as having abnormal vocal fold uptake on positron emission tomography-computed tomography. Twenty-three patients underwent positron emission tomography-computed tomography and flexible laryngoscopy: 61 per cent of patients had true-positive positron emission tomography-computed tomography scans and 39 per cent had false-positive scan results. Most patients referred to ENT for abnormal findings on positron emission tomography-computed tomography scans had true-positive findings. Asymmetrical fluorine-18 fluorodeoxy-D-glucose uptake should raise suspicion of vocal fold pathology, accepting a false-positive rate of approximately 40 per cent.

  10. Functional analysis of rare variants in mismatch repair proteins augments results from computation-based predictive methods

    Science.gov (United States)

    Arora, Sanjeevani; Huwe, Peter J.; Sikder, Rahmat; Shah, Manali; Browne, Amanda J.; Lesh, Randy; Nicolas, Emmanuelle; Deshpande, Sanat; Hall, Michael J.; Dunbrack, Roland L.; Golemis, Erica A.

    2017-01-01

    The cancer-predisposing Lynch Syndrome (LS) arises from germline mutations in DNA mismatch repair (MMR) genes, predominantly MLH1, MSH2, MSH6, and PMS2. A major challenge for clinical diagnosis of LS is the frequent identification of variants of uncertain significance (VUS) in these genes, as it is often difficult to determine variant pathogenicity, particularly for missense variants. Generic programs such as SIFT and PolyPhen-2, and MMR gene-specific programs such as PON-MMR and MAPP-MMR, are often used to predict deleterious or neutral effects of VUS in MMR genes. We evaluated the performance of multiple predictive programs in the context of functional biologic data for 15 VUS in MLH1, MSH2, and PMS2. Using cell line models, we characterized VUS predicted to range from neutral to pathogenic on mRNA and protein expression, basal cellular viability, viability following treatment with a panel of DNA-damaging agents, and functionality in DNA damage response (DDR) signaling, benchmarking to wild-type MMR proteins. Our results suggest that the MMR gene-specific classifiers do not always align with the experimental phenotypes related to DDR. Our study highlights the importance of complementary experimental and computational assessment to develop future predictors for the assessment of VUS. PMID:28494185

  11. Preliminary results of BRAVO project: brain computer interfaces for Robotic enhanced Action in Visuo-motOr tasks.

    Science.gov (United States)

    Bergamasco, Massimo; Frisoli, Antonio; Fontana, Marco; Loconsole, Claudio; Leonardis, Daniele; Troncossi, Marco; Foumashi, Mohammad Mozaffari; Parenti-Castelli, Vincenzo

    2011-01-01

    This paper presents the preliminary results of the project BRAVO (Brain computer interfaces for Robotic enhanced Action in Visuo-motOr tasks). The objective of this project is to define a new approach to the development of assistive and rehabilitative robots for motor impaired users to perform complex visuomotor tasks that require a sequence of reaches, grasps and manipulations of objects. BRAVO aims at developing new robotic interfaces and HW/SW architectures for rehabilitation and regain/restoration of motor function in patients with upper limb sensorimotor impairment through extensive rehabilitation therapy and active assistance in the execution of Activities of Daily Living. The final system developed within this project will include a robotic arm exoskeleton and a hand orthosis that will be integrated together for providing force assistance. The main novelty that BRAVO introduces is the control of the robotic assistive device through the active prediction of intention/action. The system will actually integrate the information about the movement carried out by the user with a prediction of the performed action through an interpretation of current gaze of the user (measured through eye-tracking), brain activation (measured through BCI) and force sensor measurements. © 2011 IEEE

  12. Requirements for Computer Based-Procedures for Nuclear Power Plant Field Operators. Results from a Qualitative Study

    International Nuclear Information System (INIS)

    Le Blanc, Katya; Oxstrand, J.H.; Waicosky, T.

    2012-01-01

    Although computer-based procedures (CBPs) have been investigated as a way to enhance operator performance on procedural tasks in the nuclear industry for almost thirty years, they are not currently widely deployed at United States utilities. One of the barriers to the wide-scale deployment of CBPs is the lack of operational experience with CBPs that could serve as a sound basis for justifying their use at nuclear utilities. Utilities are hesitant to adopt CBPs because of concern over potential costs of implementation and concern over regulatory approval. Regulators require a sound technical basis for the use of any procedure at the utilities; without operating experience to support the use of CBPs, it is difficult to establish such a technical basis. In an effort to begin the process of developing a technical basis for CBPs, researchers at Idaho National Laboratory are partnering with industry to explore CBPs with the objective of defining requirements for CBPs and developing an industry-wide vision and path forward for their use. This paper describes the results from a qualitative study aimed at defining requirements for CBPs to be used by field operators and maintenance technicians. (author)

  13. Using scattering theory to compute invariant manifolds and numerical results for the laser-driven Hénon-Heiles system.

    Science.gov (United States)

    Blazevski, Daniel; Franklin, Jennifer

    2012-12-01

    Scattering theory is a convenient way to describe systems that are subject to time-dependent perturbations which are localized in time. Using scattering theory, one can compute time-dependent invariant objects for the perturbed system knowing the invariant objects of the unperturbed system. In this paper, we use scattering theory to give numerical computations of invariant manifolds appearing in laser-driven reactions. In this setting, invariant manifolds separate regions of phase space that lead to different outcomes of the reaction and can be used to compute reaction rates.
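    The unperturbed system that such scattering-theory computations start from can be integrated directly. A minimal sketch (the standard Hénon-Heiles Hamiltonian without the time-dependent laser term, which the paper adds on top):

```python
import numpy as np
from scipy.integrate import solve_ivp

def henon_heiles(t, state):
    """Equations of motion for H = (px^2 + py^2)/2 + (x^2 + y^2)/2 + x^2*y - y^3/3."""
    x, y, px, py = state
    return [px, py, -x - 2.0 * x * y, -y - x * x + y * y]

def energy(state):
    x, y, px, py = state
    return 0.5 * (px**2 + py**2) + 0.5 * (x**2 + y**2) + x**2 * y - y**3 / 3.0

# Integrate one trajectory at a sub-escape energy and check energy conservation
state0 = [0.0, 0.1, 0.35, 0.0]
sol = solve_ivp(henon_heiles, (0.0, 50.0), state0, rtol=1e-10, atol=1e-12)
print(energy(state0), energy(sol.y[:, -1]))
```

    Invariant manifolds of the perturbed system are then obtained by mapping trajectories of this flow through the scattering (wave) map associated with the time-localized perturbation.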

  14. An examination of intrinsic errors in electronic structure methods using the Environmental Molecular Sciences Laboratory computational results database and the Gaussian-2 set

    International Nuclear Information System (INIS)

    Feller, D.; Peterson, K.A.

    1998-01-01

    The Gaussian-2 (G2) collection of atoms and molecules has been studied with Hartree–Fock and correlated levels of theory, ranging from second-order perturbation theory to coupled cluster theory with noniterative inclusion of triple excitations. By exploiting the systematic convergence properties of the correlation consistent family of basis sets, complete basis set limits were estimated for a large number of the G2 energetic properties. Deviations with respect to experimentally derived energy differences corresponding to rigid molecules were obtained for 15 basis set/method combinations, as well as the estimated complete basis set limit. The latter values are necessary for establishing the intrinsic error for each method. In order to perform this analysis, the information generated in the present study was combined with the results of many previous benchmark studies in an electronic database, where it is available for use by other software tools. Such tools can assist users of electronic structure codes in making appropriate basis set and method choices that will increase the likelihood of achieving their accuracy goals without wasteful expenditures of computer resources. © 1998 American Institute of Physics.
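    The systematic convergence exploited here is often modeled as E(n) = E_CBS + B·exp(-C·n), where n is the cardinal number of the correlation consistent basis set; with three consecutive points the limit has a closed form. A sketch of that three-point exponential extrapolation (one common scheme, not necessarily the authors' exact choice):

```python
import math

def cbs_limit(e2, e3, e4):
    """Complete-basis-set limit from the model E(n) = E_CBS + B*exp(-C*n),
    evaluated at consecutive cardinal numbers n = 2, 3, 4 (e.g. cc-pVDZ/TZ/QZ).
    Algebraically, E_CBS = (E2*E4 - E3^2) / (E2 + E4 - 2*E3)."""
    return (e2 * e4 - e3 * e3) / (e2 + e4 - 2.0 * e3)

# Synthetic check: energies (hartree) generated from the model itself
E_cbs, B, C = -76.35, 0.9, 1.4
e2, e3, e4 = (E_cbs + B * math.exp(-C * n) for n in (2, 3, 4))
print(cbs_limit(e2, e3, e4))
```

    On data that truly follow the exponential model, the formula recovers the limit exactly; on real correlation energies it gives an estimate whose residual error is what the paper calls the intrinsic error of the method.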

  15. Books average previous decade of economic misery.

    Science.gov (United States)

    Bentley, R Alexander; Acerbi, Alberto; Ormerod, Paul; Lampos, Vasileios

    2014-01-01

    For the 20th century since the Depression, we find a strong correlation between a 'literary misery index' derived from English language books and a moving average of the previous decade of the annual U.S. economic misery index, which is the sum of inflation and unemployment rates. We find a peak in the goodness of fit at 11 years for the moving average. The fit between the two misery indices holds when using different techniques to measure the literary misery index, and this fit is significantly better than other possible correlations with different emotion indices. To check the robustness of the results, we also analysed books written in the German language and obtained very similar correlations with the German economic misery index. The results suggest that millions of books published every year average the authors' shared economic experiences over the past decade.
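    The core computation, a trailing decade average correlated against a second series while scanning the window length for the best fit, can be sketched as follows (the data here are synthetic stand-ins, not the indices from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical annual "economic misery" series, and a "literary misery" series
# constructed as its trailing 11-year average plus noise, mimicking the finding
years = np.arange(1930, 2010)
econ = rng.normal(0.0, 1.0, years.size).cumsum()

def trailing_mean(series, window):
    """Average of the previous `window` annual values, ending at each year."""
    kernel = np.ones(window) / window
    return np.convolve(series, kernel, mode="valid")

window = 11
literary = trailing_mean(econ, window) + rng.normal(0.0, 0.2, years.size - window + 1)

def fit(w):
    """Pearson correlation between the w-year trailing mean and the literary series,
    aligned at the final year."""
    m = trailing_mean(econ, w)
    n = min(m.size, literary.size)
    return np.corrcoef(m[-n:], literary[-n:])[0, 1]

# Scan candidate windows for the peak in goodness of fit
best = max(range(2, 21), key=fit)
print(best, fit(best))
```

    With real indices, the scan over windows is what produces the reported peak at 11 years.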

  17. Mechanical Behavior of Nanostructured and Ultrafine Grained Materials under Shock Wave Loadings. Experimental Data and Results of Computer Simulation.

    Science.gov (United States)

    Skripnyak, Vladimir

    2011-06-01

    Features of the mechanical behavior of nanostructured (NS) and ultrafine-grained (UFG) metal and ceramic materials under quasistatic and shock wave loadings are discussed in this report. Multilevel models, developed within the approach of computational mechanics of materials, were used to simulate the mechanical behavior of UFG and NS metals and ceramics. Comparisons of simulation results with experimental data are presented. The models of the mechanical behavior of nanostructured metal alloys take into account several structural factors influencing the mechanical behavior of materials (type of crystal lattice, density of dislocations, size of dislocation substructures, concentration and size of phase precipitates, and distribution of grain sizes). Results show that the strain-rate sensitivity of the yield stress of UFG and polycrystalline alloys varies over the strain-rate range from 10^3 up to 10^6 1/s, but the difference between the Hugoniot elastic limits of UFG and coarse-grained alloys may not be considerable. The spall strength and the yield stress of UFG and NS alloys depend not only on grain size but on a number of factors, such as the distribution of grain sizes, the concentration and sizes of voids and cracks, and the concentration and sizes of phase precipitates. Some titanium alloys with grain sizes from 300 to 500 nm have quasi-static yield and tensile strengths twice as high as those of their coarse-grained counterparts, but the spall strength of the UFG titanium alloys is only 10 percent above that of coarse-grained alloys. At the same time, it was found that the spall strength of bulk UFG aluminium and magnesium alloys with precipitation strengthening is essentially higher in comparison with coarse-grained counterparts. A considerable decrease in the strain before failure of UFG alloys was predicted at high strain rates. The Hugoniot elastic limits of oxide nanoceramics depend not only on the porosity, but also on the sizes and volume distribution of voids.
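    The contrast between a strong but weakly rate-sensitive UFG alloy and a softer but more rate-sensitive coarse-grained one can be sketched with a Johnson-Cook style rate-hardening term (an assumed, illustrative constitutive form with made-up parameters, not the multilevel model or data from the report):

```python
import math

def jc_yield(sigma0, C, rate, ref_rate=1.0):
    """Johnson-Cook style rate hardening: sigma = sigma0 * (1 + C*ln(rate/ref_rate)).
    sigma0 in MPa, strain rates in 1/s."""
    return sigma0 * (1.0 + C * math.log(rate / ref_rate))

# Illustrative parameters: coarse-grained (softer, more rate sensitive)
# vs UFG (stronger quasi-statically, less rate sensitive)
sigma_cg = [jc_yield(400.0, 0.03, r) for r in (1e3, 1e6)]
sigma_ufg = [jc_yield(800.0, 0.01, r) for r in (1e3, 1e6)]

print(sigma_cg, sigma_ufg)
```

    Over the 10^3 to 10^6 1/s range the relative hardening of the coarse-grained alloy exceeds that of the UFG alloy, so the strength gap narrows at shock-relevant rates, which is consistent with the report's observation that Hugoniot elastic limits of the two microstructures may differ little.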

  18. Models and procedures for interval evaluating the results of control of knowledge in computer systems testing of Navy

    Directory of Open Access Journals (Sweden)

    D. A. Pechnikov

    2018-01-01

    Full Text Available. To implement effective military and professional training of Navy specialists, a corresponding educational and material base is needed. As a result of the 1990s reduction in the branches of the military-industrial complex that develop weapons and equipment for the Navy, the latest models of this technology are now produced not in batches but in individual copies, and the production of dedicated training samples is out of the question. Under these conditions, only virtual analogues of military equipment and weapons developed by means of information technology, i.e., training and simulation systems (TOS), can be considered as means capable of providing military-professional training. At the modern level of the development of information technologies, testing is the only universal technical means of monitoring the knowledge of students. However, the knowledge-control procedures in modern computer testing systems fail to meet the requirements placed on them in two respects: (1) they provide no possibility of evaluating the error of the test results; and (2) they provide no possibility of stopping the testing once the specified reliability of its results is achieved. In order to effectively implement operational criteria-based pedagogical control of knowledge in the process of training specialists of the Navy, and to enable joint analysis and processing of evaluations of learning outcomes, it is advisable to implement the following practical recommendations: 1. Formulating the teacher's system of preferences regarding the quality of trainee training, and the teacher's system of preferences regarding the significance of individual test tasks, should be considered the most essential steps in preparing a test for practical use. 2. A teacher who first enters his preference systems should check their actual compliance on a sample of 5-10 such test results that cover the full

  19. Functional computed tomography imaging of tumor-induced angiogenesis. Preliminary results of new tracer kinetic modeling using a computer discretization approach

    International Nuclear Information System (INIS)

    Kaneoya, Katsuhiko; Ueda, Takuya; Suito, Hiroshi

    2008-01-01

    The aim of this study was to establish functional computed tomography (CT) imaging as a method for assessing tumor-induced angiogenesis. Functional CT imaging was mathematically analyzed for 14 renal cell carcinomas by means of two-compartment modeling using a computer-discretization approach. The model incorporated diffusible kinetics of contrast medium including leakage from the capillary to the extravascular compartment and back-flux to the capillary compartment. The correlations between functional CT parameters [relative blood volume (rbv), permeability 1 (Pm1), and permeability 2 (Pm2)] and histopathological markers of angiogenesis [microvessel density (MVD) and vascular endothelial growth factor (VEGF)] were statistically analyzed. The modeling was successfully performed, showing similarity between the mathematically simulated curve and the measured time-density curve. There were significant linear correlations between MVD grade and Pm1 (r=0.841, P=0.001) and between VEGF grade and Pm2 (r=0.804, P=0.005) by Pearson's correlation coefficient. This method may be a useful tool for the assessment of tumor-induced angiogenesis. (author)
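    The two-compartment structure described in the abstract, leakage from the capillary to the extravascular space and back-flux to the capillary, can be sketched as a small discretized simulation (a simplified form loosely following the abstract's rbv/Pm1/Pm2 parameters; the paper's actual equations and values may differ):

```python
import numpy as np

def simulate_tissue_curve(t, aif, rbv, pm1, pm2):
    """Two-compartment tracer kinetics, integrated with a forward-Euler step:
        dCe/dt = pm1 * Ca(t) - pm2 * Ce(t)   (leakage in, back-flux out)
        Ct(t)  = rbv * Ca(t) + Ce(t)         (vascular + extravascular signal)
    """
    ce = np.zeros_like(t)
    dt = t[1] - t[0]
    for i in range(1, t.size):
        ce[i] = ce[i - 1] + dt * (pm1 * aif[i - 1] - pm2 * ce[i - 1])
    return rbv * aif + ce

# Illustrative gamma-variate arterial input function (seconds, arbitrary units)
t = np.linspace(0.0, 120.0, 1201)
aif = (t / 10.0) ** 2 * np.exp(-t / 10.0)

curve = simulate_tissue_curve(t, aif, rbv=0.05, pm1=0.02, pm2=0.01)
print(float(curve.max()))
```

    Fitting such a simulated curve to the measured time-density curve (e.g. by least squares over rbv, pm1, pm2) is the discretization-based estimation step the study performs for each tumor.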

  20. Design of a digital beam attenuation system for computed tomography. Part II. Performance study and initial results

    International Nuclear Information System (INIS)

    Szczykutowicz, Timothy P.; Mistretta, Charles A.

    2013-01-01

    reduction of ≈4 times relative to flat field CT. The dynamic range for the DBA prototype was 3.7 compared to 84.2 for the flat field scan. Conclusions: Based on the results presented in this paper and the companion paper [T. Szczykutowicz and C. Mistretta, “Design of a digital beam attenuation system for computed tomography. Part I. System design and simulation framework,” Med. Phys. 40, 021905 (2013)], FFMCT implemented via the DBA device seems feasible and should result in both a dose reduction and an improvement in image quality as judged by noise uniformity and scatter reduction. In addition, the dynamic range reduction achievable using the DBA may allow photon counting imaging to become a clinical reality. This study may allow for yet another step to be taken in the field of patient specific dose modulation.

  1. Computer assisted tomography tandem and ovoids (CATTO): results of a 3D CT based assessment of bladder and rectal doses

    International Nuclear Information System (INIS)

    Gebara, Wade; Weeks, Ken; Hahn, Carol; Montana, Gustavo; Anscher, Mitchell

    1996-01-01

    Purpose: To compare bladder and rectal dose rates in tandem and ovoid applications using two different dosimetry systems: traditional orthogonal radiograph-based dosimetry (TORD) vs. computer assisted tomography tandem and ovoids dosimetry (CATTO). Materials and Methods: From August 1992 through February 1996, 22 patients with carcinoma of the uterine cervix received the brachytherapy component of their radiotherapy with a CT-compatible Fletcher-Suit-Delclos device. Three-dimensional (3D) anatomic reconstructions were created with axial CT images. Three-dimensional dose calculations were then performed, and the isodose map was superimposed on the 3D anatomic reconstructions. Maximum bladder (Bmax) and rectal (Rmax) dose rates were determined by calculating the dose rate to each point on the 3D surface of those organs. Three-dimensional computer displays were also obtained to determine the anatomic positions of the largest dose. Additionally, orthogonal radiography, with contrast in a Foley catheter balloon and a radio-opaque rectal tube, was used to define rectal and bladder points. The dose rates at these points were calculated using a commercial treatment planning system. The effect of the tungsten shielding was ignored in the TORD calculations, but included in the CATTO calculations. Bladder and rectal dose rates determined by each dosimetry system were compared. Results: The Bmax calculated using the CATTO system was higher in all 22 patients when compared with the TORD system. The average Bmax for the patients using TORD was 43.4 cGy/hr, as compared to 86.2 cGy/hr using the CATTO system (p = 0.0083). The location of Bmax on CATTO was never at the Foley bulb, where the maximum bladder dose was calculated with TORD. It was located approximately 1 cm superior to the colpostats and just anterior to the tandem in 16 of 22 patients. Rmax was higher in 17 of 22 patients using the CATTO system when compared with TORD. The average Rmax using TORD
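    The key difference between the two systems, a single reference point versus a maximum taken over every point of a 3D organ surface, can be sketched with a toy dose calculation (inverse-square falloff from point sources only; attenuation, anisotropy, and shielding ignored; geometry and strengths are made up):

```python
import numpy as np

def max_surface_dose_rate(sources, strengths, surface_points):
    """Sum inverse-square contributions from point sources at each surface point
    and return the maximum dose rate and the point where it occurs, which is
    the idea behind CATTO's Bmax/Rmax search."""
    surface = np.asarray(surface_points, float)
    total = np.zeros(len(surface))
    for src, s in zip(np.asarray(sources, float), strengths):
        r2 = np.sum((surface - src) ** 2, axis=1)
        total += s / r2
    i = int(np.argmax(total))
    return total[i], surface[i]

# Toy geometry (cm): three tandem sources on the z-axis and a crude grid of
# "bladder wall" points offset 2 cm anteriorly
sources = [(0, 0, 2), (0, 0, 4), (0, 0, 6)]
strengths = [1.0, 1.0, 1.0]
wall = [(2, y, z) for y in range(-2, 3) for z in range(0, 9)]

dmax, where = max_surface_dose_rate(sources, strengths, wall)
print(dmax, where)
```

    The maximum lands on the wall point nearest the middle of the source line rather than at any single pre-chosen reference point, mirroring the study's finding that Bmax was never at the Foley bulb used by TORD.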

  2. Preliminary Results of Emergency Computed Tomography-Guided Ventricular Drain Placement-Precision for the Most Difficult Cases.

    Science.gov (United States)

    Nowacki, Andreas; Wagner, Franca; Söll, Nicole; Hakim, Arsany; Beck, Jürgen; Raabe, Andreas; Z'Graggen, Werner J

    2018-04-05

    External ventricular drainage (EVD) catheter placement is one of the most commonly performed neurosurgical procedures. The study's objective was to compare a computed tomography (CT) bolt scan-guided approach for the placement of EVDs with conventional landmark-based insertion. In this retrospective case-control study, we analyzed patients undergoing bolt-kit EVD catheter placement, either CT-guided or landmark-based, between 2013 and 2016. The CT bolt scan-guided approach was based on a dose-reduced CT scan after bolt fixation, with immediate image reconstruction along the axis of the bolt to evaluate the putative insertion axis. If needed, the angulation of the bolt was corrected and the procedure repeated before the catheter was inserted. The primary endpoint was the accuracy of insertion. Secondary endpoints were the overall number of attempts, duration of intervention, complication rates, and cumulative radiation dose. In total, 34 patients were included in the final analysis. In the group undergoing CT-guided placement, the average ventricle width was significantly smaller (P = 0.04) and the average midline shift significantly more pronounced (P = 0.01). CT-guided placement resulted in correct positioning of the catheter in the ipsilateral frontal horn in 100% of cases, compared with 63% for landmark-guided insertion (P = 0.01). Application of the CT-guided approach increased the total number of CT scans (3.6 ± 1.9 vs. 1.84 ± 2.0) and the overall radiation dose (3.34 ± 1.61 mSv vs. 1.55 ± 1.66 mSv) compared with the freehand insertion group. No differences were found for the other secondary outcome parameters. CT-guided bolt-kit EVD catheter placement is feasible and accurate in the most difficult cases. Copyright © 2018 Elsevier Inc. All rights reserved.

  3. Result from systematic compilation of barrel bolt findings in S/KWU type PWRs in the context of computational analysis

    International Nuclear Information System (INIS)

    Kilian, R.; Devrient, B.; Koenig, G.; Stanislowski, M.; Widera, M.; Beusekom, R. van; Wermelinger, T.

    2015-01-01

    In 2005 intergranular stress corrosion cracking (IGSCC) of barrel bolts was observed in several S/KWU type PWRs. The bolts, known as star bolts, are made of SS type 316Ti (German Material No. 1.4571) bars which are cold worked to adjust the required mechanical properties. This damage mechanism had until then been poorly understood under PWR primary conditions. Therefore an extended joint research program was launched by AREVA GmbH and VGB e.V. to clarify the specific conditions which contributed to the observed findings on barrel bolts. A systematic analysis of the parameters affecting IGSCC, such as material, heats, environment and mechanical load, was performed based on a compilation of plant data from all six S/KWU PWRs with a comparable core barrel design using barrel and baffle bolts made from type 316Ti. Using the outcome of this systematic data compilation, additional computational fluid dynamics calculations in combination with radiolysis calculations were performed. The results showed that a combination of reduced volume exchange due to local flow conditions and radiolysis reactions forming oxidizing species, such as dissolved oxygen and/or hydrogen peroxide, may locally affect the corrosion behavior of cold worked austenitic stainless steels. Therefore, small local volumes with oxidizing water chemistry conditions are assumed to lead to the IGSCC of cold worked type 316Ti. The comparison of the initial cold worked microstructure by TEM with the cold worked and in-service irradiated microstructure (void formation, dislocation loop density, etc.) clearly reveals that neutron irradiation hardening in terms of IASCC (Irradiation-Assisted Stress Corrosion Cracking) is not the leading mechanism for these cases of barrel bolt cracking in the analyzed PWRs. (authors)

  4. Highly Parallel Computing Architectures by using Arrays of Quantum-dot Cellular Automata (QCA): Opportunities, Challenges, and Recent Results

    Science.gov (United States)

    Fijany, Amir; Toomarian, Benny N.

    2000-01-01

    There has been significant improvement in the performance of VLSI devices in terms of size, power consumption, and speed in recent years, and this trend may continue for the near future. However, it is well known that major obstacles, i.e., the physical limits of feature size reduction and the ever increasing cost of foundries, would prevent the long-term continuation of this trend. This has motivated the exploration of fundamentally new technologies that do not depend on the conventional feature-size approach. Such technologies are expected to enable scaling to continue to the ultimate level, i.e., molecular and atomistic size. Quantum computing, quantum-dot-based computing, DNA-based computing, and biologically inspired computing are examples of such new technologies. In particular, quantum-dot-based computing using Quantum-dot Cellular Automata (QCA) has recently been intensely investigated as a promising new technology capable of offering significant improvement over conventional VLSI in terms of reduced feature size (and hence increased integration level), reduced power consumption, and increased switching speed. Quantum-dot-based computing and memory in general, and QCA specifically, are intriguing to NASA due to their high packing density (10(exp 11) - 10(exp 12) per square cm), low power consumption (no transfer of current), and potentially higher radiation tolerance. Under the Revolutionary Computing Technology (RCT) Program at the NASA/JPL Center for Integrated Space Microelectronics (CISM), we have been investigating the potential applications of QCA for the space program. To this end, exploiting the intrinsic features of QCA, we have designed novel QCA-based circuits for co-planar (i.e., single-layer) and compact implementation of a class of data permutation matrices, a class of interconnection networks, and a bit-serial processor. Building upon these circuits, we have developed novel algorithms and QCA
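The record above describes QCA circuit design at a high level. As a generic illustration of the logic primitive such designs are conventionally built from (not the authors' actual circuits), the three-input majority gate yields AND and OR by pinning one input:

```python
def majority(a: int, b: int, c: int) -> int:
    """Three-input majority vote, the fundamental logic gate in QCA designs."""
    return 1 if a + b + c >= 2 else 0

def qca_and(a: int, b: int) -> int:
    # Fixing one majority-gate input to 0 realises AND
    return majority(a, b, 0)

def qca_or(a: int, b: int) -> int:
    # Fixing one majority-gate input to 1 realises OR
    return majority(a, b, 1)
```

In physical QCA, the fixed input is a cell held at a constant polarization; inversion is provided by a separate inverter cell arrangement.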

  5. 22 CFR 40.91 - Certain aliens previously removed.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Certain aliens previously removed. 40.91... IMMIGRANTS UNDER THE IMMIGRATION AND NATIONALITY ACT, AS AMENDED Aliens Previously Removed § 40.91 Certain aliens previously removed. (a) 5-year bar. An alien who has been found inadmissible, whether as a result...

  6. The impact of optimizing received solar radiation and energy disposal on architectural design results by using computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Rezaei, Davood; Farajzadeh Khosroshahi, Samaneh; Sadegh Falahat, Mohammad [Zanjan University (Iran, Islamic Republic of)], email: d_rezaei@znu.ac.ir, email: ronas_66@yahoo.com, email: Safalahat@yahoo.com

    2011-07-01

    In order to minimize the energy consumption of a building, it is important to make optimal use of solar energy. The aim of this paper is to introduce the use of computer modeling in the early stages of design to optimize the solar radiation received and the energy disposal in an architectural design. Computer modeling was performed on two different projects located in Los Angeles, USA, using ECOTECT software. Changes were made to the designs following analysis of the modeling results, and a subsequent analysis was carried out on the optimized designs. Results showed that computer simulation allows the designer to set the analysis criteria and improve the energy performance of a building before it is constructed; moreover, it can be used for a wide range of optimization levels. This study pointed out that computer simulation should be performed in the design stage to optimize a building's energy performance.
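Solar-radiation analyses of the kind ECOTECT performs ultimately rest on the incidence angle between the sun and each surface. A minimal sketch of that core calculation (hypothetical surface and sun positions, not data from the paper):

```python
import math

def direct_irradiance(dni, sun_altitude_deg, sun_azimuth_deg,
                      surface_tilt_deg, surface_azimuth_deg):
    """Direct-beam irradiance (W/m^2) on a tilted surface: DNI * cos(incidence),
    clamped to zero when the sun is behind the surface."""
    alt = math.radians(sun_altitude_deg)
    tilt = math.radians(surface_tilt_deg)
    az_diff = math.radians(sun_azimuth_deg - surface_azimuth_deg)
    cos_incidence = (math.sin(alt) * math.cos(tilt)
                     + math.cos(alt) * math.sin(tilt) * math.cos(az_diff))
    return dni * max(cos_incidence, 0.0)
```

For example, a horizontal surface under a zenith sun receives the full direct normal irradiance, while a wall facing away from the sun receives none; a design tool sums such contributions over time and sky directions.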

  7. Assessment of the Results from Conducted Experimental Training in Computer Networks and Communications in the Laboratory Exercises

    Directory of Open Access Journals (Sweden)

    Gencho Stoitsov

    2017-05-01

    Full Text Available This article describes a conducted educational research, related to the use of virtual models and appropriate software in order to acquire practical knowledge and skills in laboratory work in the subject "Computer Networks and Communications" (CNC at the FMI at PU "Paisii Hilendarski".

  8. Synthesis of radiolabelled aryl azides from diazonium salts: experimental and computational results permit the identification of the preferred mechanism.

    Science.gov (United States)

    Joshi, Sameer M; de Cózar, Abel; Gómez-Vallejo, Vanessa; Koziorowski, Jacek; Llop, Jordi; Cossío, Fernando P

    2015-05-28

    Experimental and computational studies on the formation of aryl azides from the corresponding diazonium salts support a stepwise mechanism via acyclic zwitterionic intermediates. The low energy barriers associated with both transition structures are compatible with very fast and efficient processes, thus making this method suitable for the chemical synthesis of radiolabelled aryl azides.

  9. Ten Years toward Equity: Preliminary Results from a Follow-Up Case Study of Academic Computing Culture

    Directory of Open Access Journals (Sweden)

    Tanya L. Crenshaw

    2017-05-01

    Full Text Available Just over 10 years ago, we conducted a culture study of the Computer Science Department at the flagship University of Illinois at Urbana-Champaign, one of the top five computing departments in the country. The study found that while the department placed an emphasis on research, it did so in a way that, in conjunction with a lack of communication and transparency, devalued teaching and mentoring, and negatively impacted the professional development, education, and sense of belonging of the students. As one part of a multi-phase case study spanning over a decade, this manuscript presents preliminary findings from our latest work at the university. We detail early comparisons between data gathered at the Department of Computer Science at the University of Illinois at Urbana-Champaign in 2005 and our most recent pilot case study, a follow-up research project completed in 2016. Though we have not yet completed the full data collection, we find it worthwhile to reflect on the pilot case study data we have collected thus far. Our data reveals improvements in the perceptions of undergraduate teaching quality and undergraduate peer mentoring networks. However, we also found evidence of continuing feelings of isolation, incidents of bias, policy opacity, and uneven policy implementation that are areas of concern, particularly with respect to historically underrepresented groups. We discuss these preliminary follow-up findings, offer research and methodological reflections, and share next steps for applied research that aims to create positive cultural change in computing.

  10. Evaluating a computational support tool for set-based configuration of production systems : Results from an industrial case

    NARCIS (Netherlands)

    Unglert, Johannes; Hoekstra, Sipke; Jauregui Becker, Juan Manuel

    2017-01-01

    This paper describes research conducted in the context of an industrial case dealing with the design of reconfigurable cellular manufacturing systems. Reconfiguring such systems represents a complex task due to the interdependencies between the constituent subsystems. A novel computational tool was

  11. The Effects of Computer-Aided Instruction on Learning and Attitudes in Economic Principles Courses: Revised Results.

    Science.gov (United States)

    Henry, Mark

    1979-01-01

    Recounts statistical inaccuracies in an article on computer-aided instruction in economics courses on the college level. The article, published in the J. Econ. Ed (Fall 1978), erroneously placed one student in the TIPS group instead of the control group. Implications of this alteration are discussed. (DB)

  12. Craniocerebral trauma--congruence between post-mortem computed tomography diagnoses and autopsy results: a 2-year retrospective study

    DEFF Research Database (Denmark)

    Jacobsen, Christina; Lynnerup, Niels

    2010-01-01

    Computed tomography (CT) has been used routinely at the Department of Forensic Medicine, University of Copenhagen since 2002. A retrospective study was performed in order to correlate CT-scan based diagnoses of cranial and cerebral lesions with macroscopic autopsy diagnoses in 56 cases. The CT-sc...

  13. Replicated Computations Results (RCR) report for “A holistic approach for collaborative workload execution in volunteer clouds”

    DEFF Research Database (Denmark)

    Vandin, Andrea

    2018-01-01

    “A Holistic Approach for Collaborative Workload Execution in Volunteer Clouds” [3] proposes a novel approach to task scheduling in volunteer clouds. Volunteer clouds are decentralized cloud systems based on collaborative task execution, where clients voluntarily share their own unused computational...

  14. Medulloblastoma: long-term results for patients treated with definitive radiation therapy during the computed tomography era

    International Nuclear Information System (INIS)

    Merchant, Thomas E.; Wang, M.-H.; Haida, Toni; Lindsley, Karen L.; Finlay, Jonathan; Dunkel, Ira J.; Rosenblum, Marc K.; Leibel, Steven A.

    1996-01-01

    Purpose: We performed a retrospective evaluation of the patterns of failure and outcome for medulloblastoma patients treated with craniospinal irradiation therapy during the computed tomography (CT) era. Materials and Methods: The records of 100 patients treated at Memorial Sloan-Kettering Cancer Center between 1979 and 1994 were reviewed. CT scans or magnetic resonance imaging were used to guide surgical intervention and evaluate the extent of resection postoperatively. All patients were treated with conventional fractionation (1.8 Gy/day) and the majority received full-dose neuraxis radiation therapy and > 50 Gy to the primary site. Results: With a median follow-up of 100 months, the median, 5-year, and 10-year actuarial overall survival for the entire group were 58 months, 50%, and 25%, respectively. The median, 5- and 10-year actuarial disease-free survivals were 37 months, 41%, and 27%, respectively. Patients with localized disease (no evidence of disease beyond the primary site) had significantly improved overall (p < 0.02) and disease-free (p < 0.02) survivals compared to those with non-localized disease. For patients with localized disease, the 5- and 10-year overall survival rates were 59% and 31%, whereas the disease-free survivals were 49% and 31%, respectively. Disease-free and overall survivals at similar intervals for patients with non-localized disease were 29% and 30% (5 years), and 29% and 20% (10 years), respectively. Sixty-four of 100 patients failed treatment. Local failure as any component of first failure occurred in 35% of patients, or 55% (35 of 64) of all failures, and as the only site of first failure in 14%, or 22% (14 of 64) of all failures. For patients presenting with localized disease (n = 68), local failure as any component of first failure occurred in 32% (22 of 68) and in 18% (12 of 68) as the only site. A multivariate analysis showed that M stage was the only prognostic factor to influence overall survival. For disease-free survival
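Actuarial survival figures like these are conventionally obtained with the Kaplan-Meier product-limit estimator. A minimal sketch on hypothetical follow-up data (not the study's patient records):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit survival estimate.
    times: follow-up durations (e.g. months); events: 1 = death, 0 = censored.
    Returns a list of (time, S(t)) pairs, one per distinct event time."""
    survival = 1.0
    curve = []
    at_risk = len(times)
    for t in sorted(set(times)):
        deaths = sum(1 for ti, e in zip(times, events) if ti == t and e == 1)
        if deaths:
            survival *= 1 - deaths / at_risk
            curve.append((t, survival))
        # everyone with follow-up ending at t (dead or censored) leaves the risk set
        at_risk -= sum(1 for ti in times if ti == t)
    return curve
```

For instance, with four hypothetical patients followed for 10, 20, 20 and 30 months (one censored at 20), the estimate steps down at each observed death.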

  15. Risk of cancer incidence before the age of 15 years after exposure to ionising radiation from computed tomography: results from a German cohort study

    Energy Technology Data Exchange (ETDEWEB)

    Krille, L. [University Medical Center Mainz, Institute of Medical Biostatistics, Epidemiology and Informatics, Mainz (Germany); International Agency for Research on Cancer, Lyon (France); Dreger, S.; Zeeb, H. [University of Bremen, Leibniz - Institute for Prevention Research and Epidemiology - BIPS, Research Focus Health Sciences Bremen, Bremen (Germany); Schindel, R.; Blettner, M. [University Medical Center Mainz, Institute of Medical Biostatistics, Epidemiology and Informatics, Mainz (Germany); Albrecht, T. [Vivantes, Klinikum Neukoelln, Institut fuer Radiologie und Interventionelle Therapie, Berlin (Germany); Asmussen, M. [Zentralinstitut fuer Bildgebende Diagnostik, Staedtisches Klinikum Karlsruhe, Karlsruhe (Germany); Barkhausen, J. [Universitaetsklinikum Schleswig Holstein, Klinik fuer Radiologie und Nuklearmedizin, Campus Luebeck, Luebeck (Germany); Berthold, J.D. [Medizinische Hochschule Hannover, Institut fuer Diagnostische und Interventionelle Radiologie, Hannover (Germany); Chavan, A. [Klinikum Oldenburg GmbH, Institut fuer Diagnostische and Interventionelle Radiologie, Oldenburg (Germany); Claussen, C. [Universitaetsklinikum Tuebingen, Abt. fuer Diagnostische und Interventionelle Radiologie, Tuebingen (Germany); Forsting, M. [Universitaetsklinikum Essen, Institut fuer Diagnostische und Interventionelle Radiologie und Neuroradiologie, Essen (Germany); Gianicolo, E.A.L. [University Medical Center Mainz, Institute of Medical Biostatistics, Epidemiology and Informatics, Mainz (Germany); National Research Council, Institute of Clinical Physiology, Lecce (Italy); Jablonka, K. [Klinikum Bremen-Mitte, Klinik fuer Radiologische Diagnostik und Nuklearmedizin, Bremen (Germany); Jahnen, A. [Centre de Recherche Public Henri Tudor, Luxembourg (Luxembourg); Langer, M. [Universitaetsklinikum Freiburg, Klinik fuer Radiologie, Freiburg (Germany); Laniado, M. 
[Universitaetsklinikum Carl Gustav Carus Dresden, Institut und Poliklinik fuer Radiologische Diagnostik, Dresden (Germany); Lotz, J. [Universitaetsmedizin Goettingen, Institut fuer Diagnostische und Interventionelle Radiologie, Goettingen (Germany); Mentzel, H.J. [Universitaetsklinikum Jena, Institut fuer Diagnostische und Interventionelle Radiologie, Sektion Kinderradiologie, Jena (Germany); Queisser-Wahrendorf, A. [Universitaetsmedizin Mainz, Zentrum fuer Kinder- und Jugendmedizin, Mainz (Germany); Rompel, O. [Universitaetsklinikum Erlangen, Radiologisches Institut, Erlangen (Germany); Schlick, I. [Klinikum Nuernberg Sued, Institut fuer Radiologie und Neuroradiologie, Nuremberg (Germany); Schneider, K.; Seidenbusch, M. [Institut fuer Klinische Radiologie, Klinikum der Universitaet Muenchen, Dr. von Haunersches Kinderspital, Munich (Germany); Schumacher, M. [Universitaetsklinik Freiburg, Klinik fuer Neuroradiologie, Neurozentrum, Freiburg (Germany); Spix, C. [University Medical Center Mainz, German Childhood Cancer Registry, Mainz (Germany); Spors, B. [Charite - Universitaetsmedizin Berlin, Kinderradiologie, Standort Campus Virchow Klinikum, Berlin (Germany); Staatz, G. [Universitaetsmedizin Mainz, Klinik und Poliklinik fuer diagnostische und interventionelle Radiologie, Sektion Kinderradiologie, Mainz (Germany); Vogl, T. [Klinikum der Johann Wolfgang Goethe-Universitaet Frankfurt/Main, Institut fuer Diagnostische und Interventionelle Radiologie, Frankfurt (Germany); Wagner, J. [Vivantes, Klinikum im Friedrichshain, Institut fuer Radiologie und Interventionelle Therapie, Berlin (Germany); Weisser, G. [Universitaetsklinikum Mannheim, Institut fuer Klinische Radiologie und Nuklearmedizin, Mannheim (Germany)

    2015-03-15

    The aim of this cohort study was to assess the risk of developing cancer, specifically leukaemia, tumours of the central nervous system and lymphoma, before the age of 15 years in children previously exposed to computed tomography (CT) in Germany. Data for children with at least one CT between 1980 and 2010 were abstracted from 20 hospitals. Cancer cases occurring between 1980 and 2010 were identified by stochastic linkage with the German Childhood Cancer Registry (GCCR). For all cases and a sample of non-cases, radiology reports were reviewed to assess the underlying medical conditions at time of the CT. Cases were only included if diagnosis occurred at least 2 years after the first CT and no signs of cancer were recorded in the radiology reports. Standardised incidence ratios (SIR) using incidence rates from the general population were estimated. The cohort included information on 71,073 CT examinations in 44,584 children contributing 161,407 person-years at risk with 46 cases initially identified through linkage with the GCCR. Seven cases had to be excluded due to signs possibly suggestive of cancer at the time of first CT. Overall, more cancer cases were observed (O) than expected (E), but this was mainly driven by unexpected and possibly biased results for lymphomas. For leukaemia, the SIR (SIR = O/E) was 1.72 (95 % CI 0.89-3.01, O = 12), and for CNS tumours, the SIR was 1.35 (95 % CI 0.54-2.78, O = 7). Despite careful examination of the medical information, confounding by indication or reverse causation cannot be ruled out completely and may explain parts of the excess. Furthermore, the CT exposure may have been underestimated as only data from the participating clinics were available. This should be taken into account when interpreting risk estimates. (orig.)
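The intervals reported above are consistent with exact (Garwood) Poisson confidence limits on the observed count. A self-contained sketch, assuming the expected count implied by SIR = O/E (E ≈ 12/1.72 for the leukaemia result):

```python
import math

def poisson_cdf(k, lam):
    """P(X <= k) for X ~ Poisson(lam), computed as a finite sum."""
    term = math.exp(-lam)
    total = term
    for x in range(1, k + 1):
        term *= lam / x
        total += term
    return total

def _bisect(g, lo, hi, iters=200):
    # g must be increasing on [lo, hi]; returns the root of g = 0
    for _ in range(iters):
        mid = (lo + hi) / 2
        if g(mid) < 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def sir_with_ci(observed, expected, alpha=0.05):
    """Standardised incidence ratio O/E with exact two-sided Poisson limits."""
    upper_bracket = 10.0 * observed + 10
    # lower limit: smallest lam with P(X >= observed | lam) = alpha/2
    lo = 0.0 if observed == 0 else _bisect(
        lambda lam: (1 - poisson_cdf(observed - 1, lam)) - alpha / 2,
        0.0, upper_bracket)
    # upper limit: lam with P(X <= observed | lam) = alpha/2
    hi = _bisect(lambda lam: alpha / 2 - poisson_cdf(observed, lam),
                 0.0, upper_bracket)
    return observed / expected, lo / expected, hi / expected
```

Applied to the leukaemia figures (O = 12, E ≈ 6.98), this reproduces an SIR of 1.72 with limits close to the reported 0.89-3.01.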

  16. Risk of cancer incidence before the age of 15 years after exposure to ionising radiation from computed tomography: results from a German cohort study.

    Science.gov (United States)

    Krille, L; Dreger, S; Schindel, R; Albrecht, T; Asmussen, M; Barkhausen, J; Berthold, J D; Chavan, A; Claussen, C; Forsting, M; Gianicolo, E A L; Jablonka, K; Jahnen, A; Langer, M; Laniado, M; Lotz, J; Mentzel, H J; Queißer-Wahrendorf, A; Rompel, O; Schlick, I; Schneider, K; Schumacher, M; Seidenbusch, M; Spix, C; Spors, B; Staatz, G; Vogl, T; Wagner, J; Weisser, G; Zeeb, H; Blettner, M

    2015-03-01

    The aim of this cohort study was to assess the risk of developing cancer, specifically leukaemia, tumours of the central nervous system and lymphoma, before the age of 15 years in children previously exposed to computed tomography (CT) in Germany. Data for children with at least one CT between 1980 and 2010 were abstracted from 20 hospitals. Cancer cases occurring between 1980 and 2010 were identified by stochastic linkage with the German Childhood Cancer Registry (GCCR). For all cases and a sample of non-cases, radiology reports were reviewed to assess the underlying medical conditions at time of the CT. Cases were only included if diagnosis occurred at least 2 years after the first CT and no signs of cancer were recorded in the radiology reports. Standardised incidence ratios (SIR) using incidence rates from the general population were estimated. The cohort included information on 71,073 CT examinations in 44,584 children contributing 161,407 person-years at risk with 46 cases initially identified through linkage with the GCCR. Seven cases had to be excluded due to signs possibly suggestive of cancer at the time of first CT. Overall, more cancer cases were observed (O) than expected (E), but this was mainly driven by unexpected and possibly biased results for lymphomas. For leukaemia, the SIR (SIR = O/E) was 1.72 (95 % CI 0.89-3.01, O = 12), and for CNS tumours, the SIR was 1.35 (95 % CI 0.54-2.78, O = 7). Despite careful examination of the medical information, confounding by indication or reverse causation cannot be ruled out completely and may explain parts of the excess. Furthermore, the CT exposure may have been underestimated as only data from the participating clinics were available. This should be taken into account when interpreting risk estimates.

  17. A new computer-based counselling system for the promotion of physical activity in patients with chronic diseases--results from a pilot study.

    Science.gov (United States)

    Becker, Annette; Herzberg, Dominikus; Marsden, Nicola; Thomanek, Sabine; Jung, Hartmut; Leonhardt, Corinna

    2011-05-01

    To develop a computer-based counselling system (CBCS) for the improvement of attitudes towards physical activity in chronically ill patients and to pilot its efficacy and acceptance in primary care. The system is tailored to patients' disease and motivational stage. During a pilot study in five German general practices, patients answered questions before, directly and 6 weeks after using the CBCS. Outcome criteria were attitudes and self-efficacy. Qualitative interviews were performed to identify acceptance indicators. Seventy-nine patients participated (mean age: 64.5 years, 53% males; 38% without previous computer experience). Patients' affective and cognitive attitudes changed significantly, self-efficacy showed only minor changes. Patients mentioned no difficulties in interacting with the CBCS. However, perception of the system's usefulness was inconsistent. Computer-based counselling for physical activity related attitudes in patients with chronic diseases is feasible, but the circumstances of use with respect to the target group and its integration into the management process have to be clarified in future studies. This study adds to the understanding of computer-based counselling in primary health care. Acceptance indicators identified in this study will be validated as part of a questionnaire on technology acceptability in a subsequent study. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  18. Improving the management of diabetes in hospitalized patients: the results of a computer-based house staff training program.

    Science.gov (United States)

    Vaidya, Anand; Hurwitz, Shelley; Yialamas, Maria; Min, Le; Garg, Rajesh

    2012-07-01

    Poorly controlled diabetes in hospitalized patients is associated with poor clinical outcomes. We hypothesized that computer-based diabetes training could improve house staff knowledge and comfort for the management of diabetes in a large tertiary-care hospital. We implemented a computer-based training program on inpatient diabetes for internal medicine house staff at the Brigham and Women's Hospital (Boston, MA) in September 2009. House staff were required to complete the program and answer a set of questions, before and after the program, to evaluate their level of comfort and knowledge of inpatient diabetes. Chart reviews of all non-critically ill patients with diabetes managed by house staff in August 2009 (before the program) and December 2009 (after the program) were performed. Chart reviews were also performed for August 2008 and December 2008 to compare house staff management practices when the computer-based educational program was not available. A significant increase in comfort levels and knowledge in the management of inpatient diabetes was seen among house staff at all levels of training (Pstaff compared with junior house staff. Nonsignificant trends suggesting increased use of basal-bolus insulin (P=0.06) and decreased use of sliding-scale insulin (P=0.10) were seen following the educational intervention in 2009, whereas no such change was seen in 2008 (P>0.90). Overall, house staff evaluated the training program as "very relevant" and the technology interface as "good." A computer-based diabetes training program can improve the comfort and knowledge of house staff and potentially improve their insulin administration practices at large academic centers.

  19. Migration goals and risk management in cloud computing: A review of state of the art and survey results on practitioners

    OpenAIRE

    Islam, Shareeful; Fenz, Stefan; Weippl, Edgar; Kalloniatis, Christos

    2016-01-01

    Organizations are now seriously considering adopting cloud computing in their existing business context, but migrating data, applications and services into the cloud does not come without substantial risks. These risks are significant barriers to wider cloud adoption. Cloud computing has attracted a lot of attention from both the research and industry communities in recent years. There are works that consolidate the existing work on cloud migration and technology. However, there is no secondary...

  20. Computer-monitored radionuclide tracking of three-dimensional mandibular movements. Part II: experimental setup and preliminary results - Posselt diagram

    Energy Technology Data Exchange (ETDEWEB)

    Salomon, J.A.; Waysenson, B.D.; Warshaw, B.D.

    1979-04-01

    This article describes a new method to track mandibular movements using a computer-assisted radionuclide kinematics technique. The usefulness of various image-enhancement techniques is discussed, and the reproduction of physiologic displacements is shown. Vertical, lateral, and protrusive envelopes of motion of a point on a tooth of a complete denture mounted on a semiadjustable articulator were measured. As a demonstrative example of the validity of this approach, the reproduced motion of the dental point clearly traces the Posselt diagram.

  1. Fast Virtual Fractional Flow Reserve Based Upon Steady-State Computational Fluid Dynamics Analysis: Results From the VIRTU-Fast Study.

    Science.gov (United States)

    Morris, Paul D; Silva Soto, Daniel Alejandro; Feher, Jeroen F A; Rafiroiu, Dan; Lungu, Angela; Varma, Susheel; Lawford, Patricia V; Hose, D Rodney; Gunn, Julian P

    2017-08-01

    Fractional flow reserve (FFR)-guided percutaneous intervention is superior to standard assessment but remains underused. The authors have developed a novel "pseudotransient" analysis protocol for computing virtual fractional flow reserve (vFFR) based upon angiographic images and steady-state computational fluid dynamics. This protocol generates vFFR results in 189 s (cf >24 h for transient analysis) using a desktop PC, with <1% error relative to that of full-transient computational fluid dynamics analysis. Sensitivity analysis demonstrated that physiological lesion significance was influenced less by coronary or lesion anatomy (33%) and more by microvascular physiology (59%). If coronary microvascular resistance can be estimated, vFFR can be accurately computed in less time than it takes to make invasive measurements.
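FFR itself is simply the ratio of mean distal coronary pressure to mean aortic pressure under hyperaemia, with 0.80 the conventional cut-off for a flow-limiting lesion. A minimal sketch with illustrative pressure values (not VIRTU-Fast data):

```python
def fractional_flow_reserve(p_distal, p_aortic):
    """FFR = mean distal coronary pressure / mean aortic pressure,
    measured (or computed) under hyperaemic conditions."""
    return p_distal / p_aortic

def lesion_significant(ffr, threshold=0.80):
    """Conventional cut-off: FFR <= 0.80 indicates a flow-limiting lesion."""
    return ffr <= threshold
```

A virtual FFR pipeline such as the one described computes p_distal from a CFD pressure field rather than from an invasive pressure wire, but the final ratio and decision threshold are the same.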

  2. Underestimation of Severity of Previous Whiplash Injuries

    Science.gov (United States)

    Naqui, SZH; Lovell, SJ; Lovell, ME

    2008-01-01

    INTRODUCTION We noted a report that more significant symptoms may be expressed after second whiplash injuries through a suggested cumulative effect, including degeneration. We wondered if patients were underestimating the severity of their earlier injury. PATIENTS AND METHODS We studied recent medicolegal reports to assess subjects with a second whiplash injury. They had been asked whether their earlier injury was worse, the same or lesser in severity. RESULTS From the study cohort, 101 patients (87%) felt that they had fully recovered from their first injury and 15 (13%) had not. Seventy-six subjects considered their first injury to be of lesser severity, 24 worse and 16 the same. Of the 24 who felt the violence of their first accident was worse, only 8 had worse symptoms; 16 felt their symptoms were mainly the same as or less than their symptoms from their second injury. Statistical analysis of the data revealed that the proportion of those claiming a difference who said the previous injury was lesser was 76% (95% CI 66–84%). The observed proportion with a lesser injury was considerably higher than the 50% anticipated. CONCLUSIONS We feel that subjects may underestimate the severity of an earlier injury and associated symptoms. Reasons for this may include secondary gain rather than any proposed cumulative effect. PMID:18201501
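The quoted 95% CI of 66-84% for 76 of 100 respondents is consistent with an exact (Clopper-Pearson) binomial interval. A pure-Python sketch, solving for the limits by bisection on the binomial CDF (the paper does not state which interval method it used):

```python
import math

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, x) * p**x * (1 - p)**(n - x) for x in range(k + 1))

def clopper_pearson(successes, n, alpha=0.05):
    """Exact two-sided (1 - alpha) confidence interval for a proportion."""
    def bisect(g, lo, hi, iters=100):
        # g must be increasing on [lo, hi]
        for _ in range(iters):
            mid = (lo + hi) / 2
            if g(mid) < 0:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2
    # lower limit: P(X >= successes | p) = alpha/2
    lower = 0.0 if successes == 0 else bisect(
        lambda p: (1 - binom_cdf(successes - 1, n, p)) - alpha / 2, 0.0, 1.0)
    # upper limit: P(X <= successes | p) = alpha/2
    upper = 1.0 if successes == n else bisect(
        lambda p: alpha / 2 - binom_cdf(successes, n, p), 0.0, 1.0)
    return lower, upper
```

Running clopper_pearson(76, 100) gives limits near 0.66 and 0.84, matching the reported interval after rounding.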

  3. Comparative analysis of the results obtained by computer code ASTEC V2 and RELAP 5.3.2 for small leak ID 80 for VVER 1000

    International Nuclear Information System (INIS)

    Atanasova, B.; Grudev, P.

    2011-01-01

    The purpose of this report is to present the results obtained by simulation and subsequent analysis of an emergency mode with a small leak of ID 80 for WWER 1000/B320 - Kozloduy NPP Units 5 and 6. Calculations were performed with the ASTEC v2 computer code, used for the calculation of severe accidents, which was developed by the French and German organisations IRSN and GRS. The integral RELAP5 computer code is used as a reference for comparison of results. The analyses focus on the processes occurring in the in-vessel phase of the emergency mode with significant core damage. The main thermohydraulic parameters, the start of reactor core degradation, and the subsequent fuel relocation up to reactor vessel failure are evaluated in the analysis. The RELAP5 computer code is used as a reference code to compare the results obtained up to early core degradation, which occurs after core uncovery and the rise of fuel temperatures above 1200 °C

  4. Evaluation of Visualization Tools for Computer Network Defense Analysts: Display Design, Methods, and Results for a User Study

    Science.gov (United States)

    2016-11-01

    Excerpts from the study's user feedback and survey instrument: "...once I was finished with an alert it should have been removed from my view. Also I should be able to remove noise from my view with a filter..." "Connections were very hard to follow, information was displayed in a non-intuitive manner, correlations..." "...(iPhone, Android phone, tablets, etc.)? Please choose only one of the following: [Yes, No]" "How often, on a daily basis, do you use computers"

  5. Automatic electromagnetic valve for previous vacuum

    International Nuclear Information System (INIS)

    Granados, C. E.; Martin, F.

    1959-01-01

    A valve is described which maintains the vacuum of an installation when the electric current fails. It also admits air into the fore-vacuum (backing) pump to prevent oil from ascending into the vacuum tubes. (Author)

  6. Results from a pilot study of a computer-based role-playing game for young people with psychosis.

    Science.gov (United States)

    Olivet, Jeffrey; Haselden, Morgan; Piscitelli, Sarah; Kenney, Rachael; Shulman, Alexander; Medoff, Deborah; Dixon, Lisa

    2018-03-15

    Recent research on first episode psychosis (FEP) has demonstrated the effectiveness of coordinated specialty care (CSC) models to support young adults and their families, yet few tools exist to promote engagement in care. This study aimed to develop a prototype computer-based role-playing game (RPG) designed for young people who have experienced FEP, and conduct a pilot study to determine feasibility and test whether the game improves consumers' attitudes toward treatment and recovery. Twenty young people with FEP who were receiving services at a CSC program enrolled in the study and played the game for 1 hour. Pre- and post-quantitative assessments measured change in hope, recovery, stigma, empowerment and engagement in treatment. Qualitative interviews explored participants' experience with the game and ideas for further product development. Participants showed significant increase in positive attitudes toward recovery. The qualitative findings further demonstrated the game's positive impact across these domains. Of all game features, participants most highly valued video testimonials of other young adults with FEP telling their stories of hope and recovery. These findings provide modest support for the potential benefits of this type of computer-based RPG, if further developed for individuals experiencing psychosis. © 2018 John Wiley & Sons Australia, Ltd.

  7. Light Water Reactor Sustainability Program: Computer-based procedure for field activities: results from three evaluations at nuclear power plants

    International Nuclear Information System (INIS)

    2014-01-01

    Nearly all activities that involve human interaction with the systems of a nuclear power plant are guided by procedures. The paper-based procedures (PBPs) currently used by industry have a demonstrated history of ensuring safety; however, improving procedure use could yield tremendous gains in efficiency and safety. One potential way to improve procedure-based activities is through the use of computer-based procedures (CBPs). Computer-based procedures provide the opportunity to incorporate context-driven job aids (such as drawings, photos, and just-in-time training) into the CBP system. One obvious advantage of this capability is reducing the time spent tracking down the applicable documentation. Additionally, human performance tools can be integrated into the CBP system in such a way that they help the worker focus on the task rather than the tools. Some tools can be completely incorporated into the CBP system, such as pre-job briefs, placekeeping, correct component verification, and peer checks. Other tools can be partly integrated in a fashion that reduces the time and labor required, such as concurrent and independent verification. Another benefit of CBPs compared to PBPs is dynamic procedure presentation. PBPs are static documents, which limits the degree to which the information presented can be tailored to the task and the conditions when the procedure is executed. The CBP system can be configured to display only the relevant steps based on operating mode, plant status, and the task at hand. A dynamic presentation of the procedure (also known as context-sensitive procedures) will guide the user down the path of relevant steps based on the current conditions. This feature will reduce the user's workload and inherently reduce the risk of incorrectly marking a step as not applicable, as well as the risk of incorrectly performing a step that should have been marked as not applicable. As part of the Department of Energy's (DOE) Light Water Reactor Sustainability Program
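The context-sensitive presentation described above amounts to filtering procedure steps against the current plant state. A hypothetical sketch (step texts and context keys invented for illustration, not taken from any actual CBP product):

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Step:
    text: str
    # applicability predicate over the current context; default: always applies
    applies: Callable[[Dict], bool] = lambda ctx: True

def relevant_steps(procedure, context):
    """Return only the steps applicable under the current operating context,
    so the user never has to mark steps 'not applicable' by hand."""
    return [s.text for s in procedure if s.applies(context)]

procedure = [
    Step("Verify pump A is running"),
    Step("Start backup pump B", lambda ctx: ctx["mode"] == "shutdown"),
    Step("Log reactor power", lambda ctx: ctx["mode"] == "power-operation"),
]

steps = relevant_steps(procedure, {"mode": "power-operation"})
```

A real CBP system would draw the context from live plant data and validated operating-mode indications rather than a hand-built dictionary, but the selection logic is of this shape.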

  8. Comparative analysis of the results of surgery for juvenile nasopharyngeal angiofibroma with the use of 3D reconstructions of computed tomography angiography

    Directory of Open Access Journals (Sweden)

    N. S. Grachev

    2017-01-01

    Full Text Available Rationale: The relapse rate after surgery for juvenile angiofibroma of the nasopharynx and/or skull base is in the range of 23 to 27.5%, which is mostly related to diagnostic issues. Aim: To perform a comparative analysis of the results of surgical treatment for juvenile nasopharyngeal and skull base angiofibroma, based on our technique of 3D reconstruction of computed tomography angiograms, in patients with primary tumors and with relapses. Materials and methods: We retrospectively analyzed the data from 32 patients with juvenile nasopharyngeal and skull base angiofibroma who had been diagnosed and treated from 2013 to 2017 (42 surgeries). Multislice computed tomography (MSCT) angiography with 3D reconstruction was used for the planning of surgical approaches. At days 3 to 7 after surgery, in 31 patients with stages II, IIIa and IIIb (according to the U. Fisch classification modified by R. Andrews, 1989), we looked for residual tumor tissue by MSCT with standard analysis and with 3D MSCT angiography reconstructions, comparing them with the corresponding baseline images. The patients were divided into two groups: group 1, 17 patients with primary tumors (median age 13.5 years); group 2, 14 patients who had been previously operated on (median age 14 years). Both groups were comparable in their clinical and demographic characteristics, as well as in tumor staging (p > 0.05). Results: The relapse rate was 22.58% (7/31 patients), being 11.76% (2/17) in group 1 and 35.71% (5/14) in group 2 (p > 0.05). In each group, the maximal difference in the resected tumor volume was found in stage II patients, with more radical resection in the patients with primary tumors (p < 0.05). Contrast-enhanced MSCT showed residual tumor masses in 19 patients (8 with primary tumors and 11 with relapses). Of those, 10 patients (3 with primary tumors and 7 who had undergone surgery earlier) required second surgeries (4 patients were curatively operated, and 2

  9. Application of Computer Aided Design (CADD) in data display and integration of numerical and field results - Stripa phase 3

    International Nuclear Information System (INIS)

    Press, D.E.; Halliday, S.M.; Gale, J.E.

    1990-12-01

    Existing CAD/CADD systems have been reviewed, and the micro-computer compatible solids modelling CADD software SilverScreen was selected for use in constructing a CADD model of the Stripa site. Maps of the Stripa mine drifts, shafts, raises and stopes were digitized and used to create three-dimensional images of the north-eastern part of the mine and the SCV site. In addition, the use of CADD sub-programs to display variations in fracture geometry and hydraulic heads has been demonstrated. The database developed in this study is available as raw digitized files, processed data files, SilverScreen script files, or in DXF or IGES formats, all of which are described in this report. (au)

  10. Improved operating scenarios of the DIII-D tokamak as a result of the addition of UNIX computer systems

    International Nuclear Information System (INIS)

    Henline, P.A.

    1995-10-01

    The increased use of UNIX-based computer systems for machine control, data handling and analysis has greatly enhanced the operating scenarios and operating efficiency of the DIII-D tokamak. This paper will describe some of these UNIX systems and their specific uses. These include the plasma control system, the electron cyclotron heating control system, the analysis of electron temperature and density measurements, and the general data acquisition system (which is collecting over 130 Mbytes of data). The speed and total capability of these systems have dramatically affected the ability to operate DIII-D. The improved operating scenarios include better plasma shape control, due to the more thorough MHD calculations done between shots, and the new ability to see the time dependence of profile data as it relates across different spatial locations in the tokamak. Other analyses that lead to improved operating abilities will also be described.

  11. Evaluation of the web-based computer-tailored FATaintPHAT intervention to promote energy balance among adolescents: Results from a school cluster randomized trial

    NARCIS (Netherlands)

    N.P.M. Ezendam (Nicole); J. Brug (Hans); A. Oenema (Anke)

    2012-01-01

    Objective: To evaluate the short- and long-term results of FATaintPHAT, a Web-based computer-tailored intervention aiming to increase physical activity, decrease sedentary behavior, and promote healthy eating to contribute to the prevention of excessive weight gain among adolescents.

  12. Evaluation of the Web-Based Computer-Tailored FATaintPHAT Intervention to Promote Energy Balance Among Adolescents Results From a School Cluster Randomized Trial

    NARCIS (Netherlands)

    Ezendam, N.P.M.; Brug, J.; Oenema, A.

    2012-01-01

    Objective: To evaluate the short- and long-term results of FATaintPHAT, a Web-based computer-tailored intervention aiming to increase physical activity, decrease sedentary behavior, and promote healthy eating to contribute to the prevention of excessive weight gain among adolescents. Design: Cluster

  13. 3D computation of the shape of etched tracks in CR-39 for oblique particle incidence and comparison with experimental results

    International Nuclear Information System (INIS)

    Doerschel, B.; Hermsdorf, D.; Reichelt, U.; Starke, S.; Wang, Y.

    2003-01-01

    Computation of the shape of etch pits requires knowledge of the varying track etch rate along the particle trajectories. Experiments with alpha particles and 7Li ions entering CR-39 detectors under different angles showed that this function is not affected by the inclination of the particle trajectory with respect to the normal to the detector surface. Track formation for oblique particle incidence can, therefore, be simulated using the track etch rates determined for perpendicular incidence. 3D computation of the track shape was performed applying a model recently described in the literature. A special program has been written for computing the x, y, z coordinates of points on the etch pit walls. In addition, the etch pit profiles in sagittal sections as well as the contours of the etch pit openings on the detector surface have been determined experimentally. Computed and experimental results were in good agreement, confirming the applicability of the 3D computational model in combination with the functions for the depth-dependent track etch rates determined experimentally.
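As an illustration of the geometry involved, the simplest constant-etch-rate model (normal incidence, conical pit) can be sketched as below. This is a simplification: the paper itself uses depth-dependent track etch rates and oblique incidence, and the rate values here are hypothetical.

```python
import math

def etch_pit(v_bulk, v_track, t):
    """Conical etch-pit geometry for normal incidence and a constant
    track etch rate (a simplification of the depth-dependent rates
    used in the paper).  Returns (depth, opening radius) after
    etching time t."""
    assert v_track > v_bulk, "a pit only forms when Vt exceeds Vb"
    v = v_track / v_bulk               # etch-rate ratio V = Vt/Vb
    depth = (v_track - v_bulk) * t     # pit depth below the etched surface
    radius = v_bulk * t * math.sqrt((v - 1.0) / (v + 1.0))  # opening radius
    return depth, radius

# Illustrative rates in um/h, not values from the paper:
depth, radius = etch_pit(v_bulk=1.2, v_track=2.4, t=6.0)
```

With a rate ratio V = 2, the pit opening stays smaller than the removed bulk layer, as expected for a cone steeper than 45 degrees.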

  14. 77 FR 70176 - Previous Participation Certification

    Science.gov (United States)

    2012-11-23

    ... participants' previous participation in government programs and ensure that the past record is acceptable prior... information is designed to be 100 percent automated and digital submission of all data and certifications is... government programs and ensure that the past record is acceptable prior to granting approval to participate...

  15. On the Tengiz petroleum deposit previous study

    International Nuclear Information System (INIS)

    Nysangaliev, A.N.; Kuspangaliev, T.K.

    1997-01-01

    A previous study of the Tengiz petroleum deposit is described. Some considerations about the structure of the productive formation and the specific properties of the petroleum-bearing collectors are presented. Recommendations are given on their detailed study and on using experience from the exploration and development of petroleum deposits that are analogous in the most important geological and industrial parameters. (author)

  16. Subsequent pregnancy outcome after previous foetal death

    NARCIS (Netherlands)

    Nijkamp, J. W.; Korteweg, F. J.; Holm, J. P.; Timmer, A.; Erwich, J. J. H. M.; van Pampus, M. G.

    Objective: A history of foetal death is a risk factor for complications and foetal death in subsequent pregnancies as most previous risk factors remain present and an underlying cause of death may recur. The purpose of this study was to evaluate subsequent pregnancy outcome after foetal death and to

  17. Clinical role of 18F-fluorodeoxyglucose positron emission tomography/computed tomography in post-operative follow up of gastric cancer: Initial results

    Institute of Scientific and Technical Information of China (English)

    Long Sun; Xin-Hui Su; Yong-Song Guan; Wei-Ming Pan; Zuo-Ming Luo; Ji-Hong Wei; Hua Wu

    2008-01-01

    AIM: To evaluate the clinical role of 18F-fluorodeoxyglucose positron emission tomography/computed tomography (18F-FDG PET/CT) in detection of gastric cancer recurrence after initial surgical resection. METHODS: In the period from January 2007 to May 2008, 23 patients who had previous surgical resection of histopathologically diagnosed gastric cancer underwent a total of 25 18F-FDG PET/CT scans as follow-up visits in our center. The standard of reference for tumor recurrence consisted of histopathologic confirmation or clinical follow-up information for at least 5 mo after PET/CT examinations. RESULTS: PET/CT was positive in 14 patients (61%) and negative in 9 (39%). When correlated with the final diagnosis, which was confirmed by histopathologic evidence of tumor recurrence in 8 of the 23 patients (35%) and by clinical follow-up in 15 (65%), PET/CT was true positive in 12 patients, false positive in 2, true negative in 8 and false negative in 2. Overall, the accuracy of PET/CT was 82.6%, the negative predictive value (NPV) was 77.7%, and the positive predictive value (PPV) was 85.7%. The 2 false positive PET/CT findings were actually chronic inflammatory tissue lesions. For the two patients with false negative PET/CT, the final diagnosis was recurrence of mucinous adenocarcinoma in the anastomosis in one patient and abdominal wall metastasis in the other. Importantly, PET/CT revealed true-positive findings in 11 (47.8%) patients who had negative or no definite findings by CT. PET/CT revealed extra-abdominal metastases in 7 patients and additional esophageal carcinoma in one patient. Clinical treatment decisions were changed in 7 (30.4%) patients after introducing PET/CT into their conventional post-operative follow-up program. CONCLUSION: Whole-body 18F-FDG PET/CT was highly effective in discriminating true recurrence in post-operative patients with gastric cancer and had important impacts on clinical decisions in a considerable portion of patients.
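The counts reported in the abstract (TP = 12, FP = 2, TN = 8, FN = 2) can be turned into the standard test-performance metrics. The PPV reproduces the reported 85.7% exactly; the reported accuracy (82.6%) and NPV (77.7%) appear to have been computed on slightly different denominators (per patient rather than per scan), so they differ marginally from the raw 2x2 values.

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard diagnostic-test metrics from a 2x2 confusion matrix."""
    total = tp + fp + tn + fn
    return {
        "accuracy": (tp + tn) / total,
        "ppv": tp / (tp + fp),        # positive predictive value
        "npv": tn / (tn + fn),        # negative predictive value
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
    }

# Counts reported in the abstract:
m = diagnostic_metrics(tp=12, fp=2, tn=8, fn=2)
```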

  18. Integrated design of Nb-based superalloys: Ab initio calculations, computational thermodynamics and kinetics, and experimental results

    International Nuclear Information System (INIS)

    Ghosh, G.; Olson, G.B.

    2007-01-01

    An optimal integration of modern computational tools and efficient experimentation is presented for the accelerated design of Nb-based superalloys. Integrated within a systems engineering framework, we have used ab initio methods along with alloy theory tools to predict phase stability of solid solutions and intermetallics, accelerating the assessment of thermodynamic and kinetic databases and enabling comprehensive predictive design of multicomponent multiphase microstructures as dynamic systems. Such an approach is also applicable to the accelerated design and development of other high-performance materials. Based on established principles underlying Ni-based superalloys, the central microstructural concept is a precipitation-strengthened system in which coherent cubic aluminide phase(s) provide both creep strengthening and a source of Al for Al2O3 passivation, enabled by a Nb-based alloy matrix with the required ductile-to-brittle transition temperature, atomic transport kinetics and oxygen solubility behaviors. Ultrasoft and PAW pseudopotentials, as implemented in VASP, are used to calculate the total energy, density of states and bonding charge densities of aluminides with B2 and L21 structures relevant to this research. Characterization of prototype alloys by transmission and analytical electron microscopy demonstrates the precipitation of B2 or L21 aluminide in a (Nb) matrix. Employing the Thermo-Calc and DICTRA software systems, thermodynamic and kinetic databases are developed for substitutional alloying elements and interstitial oxygen to enhance the diffusivity ratio of Al to O for promotion of Al2O3 passivation. However, the oxidation study of a Nb-Hf-Al alloy, with higher solubility of Al in (Nb) than in binary Nb-Al alloys, at 1300 deg. C shows the presence of a mixed oxide layer of NbAlO4 and HfO2 exhibiting parabolic growth

  19. Subsequent childbirth after a previous traumatic birth.

    Science.gov (United States)

    Beck, Cheryl Tatano; Watson, Sue

    2010-01-01

    Nine percent of new mothers in the United States who participated in the Listening to Mothers II Postpartum Survey screened positive for meeting the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition criteria for posttraumatic stress disorder after childbirth. Women who have had a traumatic birth experience report fewer subsequent children and a longer length of time before their second baby. Childbirth-related posttraumatic stress disorder impacts couples' physical relationship, communication, conflict, emotions, and bonding with their children. The purpose of this study was to describe the meaning of women's experiences of a subsequent childbirth after a previous traumatic birth. Phenomenology was the research design used. An international sample of 35 women participated in this Internet study. Women were asked, "Please describe in as much detail as you can remember your subsequent pregnancy, labor, and delivery following your previous traumatic birth." Colaizzi's phenomenological data analysis approach was used to analyze the stories of the 35 women. Data analysis yielded four themes: (a) riding the turbulent wave of panic during pregnancy; (b) strategizing: attempts to reclaim their body and complete the journey to motherhood; (c) bringing reverence to the birthing process and empowering women; and (d) still elusive: the longed-for healing birth experience. Subsequent childbirth after a previous birth trauma has the potential to either heal or retraumatize women. During pregnancy, women need permission and encouragement to grieve their prior traumatic births to help remove the burden of their invisible pain.

  20. Preliminary results for validation of Computational Fluid Dynamics for prediction of flow through a split vane spacer grid

    International Nuclear Information System (INIS)

    Rashkovan, A.; Novog, D.R.

    2012-01-01

    This paper presents the results of CFD simulations of turbulent flow past a spacer grid with mixing vanes. This study summarizes the first stage of an ongoing numerical blind exercise organized by the OECD-NEA. McMaster University, along with other participants, plans to submit a numerical prediction of the detailed flow field and turbulence characteristics of the flow past a 5x5 rod bundle with a spacer grid equipped with two types of mixing vanes. The results will be compared with blind experimental measurements performed in Korea. Because a number of modeling strategies are suggested in the literature for such flows, we have performed a series of tests to assess the mesh requirements, flow steadiness, turbulence modeling and wall treatment effects. Results of these studies are reported in the present paper. (author)

  1. Previous utilization of service does not improve timely booking in ...

    African Journals Online (AJOL)

    Previous utilization of service does not improve timely booking in antenatal care: Cross sectional study ... Journal Home > Vol 24, No 3 (2010) > ... Results: Past experience on antenatal care service utilization did not come out as a predictor for ...

  2. The cost-effectiveness of the RSI QuickScan intervention programme for computer workers: Results of an economic evaluation alongside a randomised controlled trial.

    Science.gov (United States)

    Speklé, Erwin M; Heinrich, Judith; Hoozemans, Marco J M; Blatter, Birgitte M; van der Beek, Allard J; van Dieën, Jaap H; van Tulder, Maurits W

    2010-11-11

    The costs of arm, shoulder and neck symptoms are high. In order to decrease these costs, employers implement interventions aimed at reducing these symptoms. One frequently used intervention is the RSI QuickScan intervention programme. It establishes a risk profile of the target population and subsequently advises interventions following a decision tree based on that risk profile. The purpose of this study was to perform an economic evaluation, from both the societal and the company perspective, of the RSI QuickScan intervention programme for computer workers. In this study, effectiveness was defined at three levels: exposure to risk factors, prevalence of arm, shoulder and neck symptoms, and days of sick leave. The economic evaluation was conducted alongside a randomised controlled trial (RCT). Participating computer workers from 7 companies (N = 638) were assigned to either the intervention group (N = 320) or the usual care group (N = 318) by means of cluster randomisation (N = 50). The intervention consisted of a tailor-made programme, based on a previously established risk profile. At baseline and at 6- and 12-month follow-up, the participants completed the RSI QuickScan questionnaire. Analyses to estimate the effect of the intervention were done according to the intention-to-treat principle. To compare costs between groups, confidence intervals for cost differences were computed by bias-corrected and accelerated bootstrapping. The mean intervention costs, paid by the employer, were 59 euro per participant in the intervention group and 28 euro in the usual care group. Mean total health care and non-health care costs per participant were 108 euro in both groups. As to cost-effectiveness, improvements in received information on healthy computer use as well as in work posture and movement were observed, at higher costs. With regard to the other risk factors, symptoms and sick leave, only small and non-significant effects were found. In this study, the RSI Quick
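The bootstrapped cost comparison used in the trial can be sketched in miniature. The sketch below is a plain percentile bootstrap on hypothetical per-participant cost figures; the study itself used the bias-corrected and accelerated (BCa) variant, whose correction terms are omitted here.

```python
import random

def percentile_bootstrap_ci(a, b, n_boot=5000, alpha=0.05, seed=1):
    """Percentile bootstrap CI for the difference in mean costs between
    groups a and b.  Simplified: the trial used the bias-corrected and
    accelerated (BCa) variant, whose corrections are omitted here."""
    rng = random.Random(seed)
    diffs = []
    for _ in range(n_boot):
        ra = [rng.choice(a) for _ in a]   # resample each group with replacement
        rb = [rng.choice(b) for _ in b]
        diffs.append(sum(ra) / len(ra) - sum(rb) / len(rb))
    diffs.sort()
    lo = diffs[int((alpha / 2) * n_boot)]
    hi = diffs[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Hypothetical per-participant costs in euro, not trial data:
intervention = [59, 70, 45, 62, 80, 55, 66, 49, 73, 58]
usual_care = [28, 35, 22, 40, 30, 25, 33, 27, 38, 31]
lo, hi = percentile_bootstrap_ci(intervention, usual_care)
```

A cost difference whose interval excludes zero, as in this toy example, would indicate a statistically meaningful difference between groups.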

  3. Brain Computer Interfaces on Track to Home: Results of the Evaluation at Disabled End-Users's Homes and Lessons Learnt

    Directory of Open Access Journals (Sweden)

    Felip Miralles

    2015-11-01

    The BackHome system is a multi-functional BCI system, the final outcome of a User Centred Design approach, whose ambition is to move BCI systems from laboratories into the homes of people in need, for independent home use. The paper presents the results of testing and evaluation of the BackHome system with end-users at their own homes. Results show moderate to good acceptance from end-users, caregivers and therapists, who reported promising usability levels, good user satisfaction and good levels of control in the use of services and home support based on remote monitoring tools.

  4. Computer simulations of large asteroid impacts into oceanic and continental sites--preliminary results on atmospheric, cratering and ejecta dynamics

    Science.gov (United States)

    Roddy, D.J.; Schuster, S.H.; Rosenblatt, M.; Grant, L.B.; Hassig, P.J.; Kreyenhagen, K.N.

    1987-01-01

    Computer simulations have been completed that describe passage of a 10-km-diameter asteroid through the Earth's atmosphere and the subsequent cratering and ejecta dynamics caused by impact of the asteroid into both oceanic and continental sites. The asteroid was modeled as a spherical body moving vertically at 20 km/s with a kinetic energy of 2.6 × 10^30 ergs (6.2 × 10^7 Mt). Detailed material modeling of the asteroid, ocean, crustal units, sedimentary unit, and mantle included effects of strength and fracturing, generic asteroid and rock properties, porosity, saturation, lithostatic stresses, and geothermal contributions, each selected to simulate impact and geologic conditions that were as realistic as possible. Calculation of the passage of the asteroid through a U.S. Standard Atmosphere showed development of a strong bow shock wave followed by a highly shock-compressed and heated air mass. Rapid expansion of this shocked air created a large low-density region that also expanded away from the impact area. Shock temperatures in air reached ~20,000 K near the surface of the uplifting crater rim and were as high as ~2000 K at more than 30 km range and 10 km altitude. Calculations to 30 s showed that the shock fronts in the air and in most of the expanding shocked air mass preceded the formation of the crater, ejecta, and rim uplift and did not interact with them. As cratering developed, uplifted rim and target material were ejected into the very low density, shock-heated air immediately above the forming crater, and complex interactions could be expected. Calculations of the impact events showed equally dramatic effects on the oceanic and continental targets through an interval of 120 s. Despite geologic differences in the targets, both cratering events developed comparable dynamic flow fields and by ~29 s had formed similar-sized transient craters ~39 km deep and ~62 km across. Transient-rim uplift of ocean and crust reached a maximum altitude of nearly
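The quoted impact energy is easy to sanity-check. Assuming a stony density of about 2.5 g/cm^3 (the abstract gives the diameter, speed and energy, but not the density), the kinetic energy comes out within about 1% of the stated 2.6 × 10^30 ergs and 6.2 × 10^7 Mt:

```python
import math

# Sanity check of the stated impact energy, assuming a stony density
# of 2500 kg/m^3 (an assumption; the abstract does not give a density).
diameter_m = 10_000.0
speed_m_s = 20_000.0
density_kg_m3 = 2500.0

volume = (4.0 / 3.0) * math.pi * (diameter_m / 2.0) ** 3  # ~5.24e11 m^3
mass = density_kg_m3 * volume                             # ~1.3e15 kg
ke_joule = 0.5 * mass * speed_m_s ** 2
ke_erg = ke_joule * 1.0e7            # 1 J = 1e7 erg
ke_megatons = ke_joule / 4.184e15    # 1 Mt TNT = 4.184e15 J
```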

  5. Comparison of the results of several heat transfer computer codes when applied to a hypothetical nuclear waste repository

    International Nuclear Information System (INIS)

    Claiborne, H.C.; Wagner, R.S.; Just, R.A.

    1979-12-01

    A direct comparison of transient thermal calculations was made with the heat transfer codes HEATING5, THAC-SIP-3D, ADINAT, SINDA, TRUMP, and TRANCO for a hypothetical nuclear waste repository. With the exception of TRUMP and SINDA (actually closer to the earlier CINDA3G version), the codes agreed to within ±5% for the temperature rises as a function of time. The TRUMP results agreed within ±5% up to about 50 years, where the maximum temperature occurs, and then began an oscillatory behavior with up to 25% deviations at longer times. This could have resulted from time steps that were too large or from some unknown system problems. The available version of the SINDA code was not compatible with the IBM compiler without using an alternative method for handling a variable thermal conductivity. The results were about 40% low, but reasonable agreement was obtained by assuming a uniform thermal conductivity; however, a programming error was later discovered in the alternative method. Some work is required on the IBM version to make it compatible with the system and still use the recommended method of handling variable thermal conductivity. TRANCO can only be run as a 2-D model, and TRUMP and CINDA apparently required longer running times and did not agree in the 2-D case; therefore, only HEATING5, THAC-SIP-3D, and ADINAT were used for the 3-D model calculations. These codes agreed within ±5%; at distances of about 1 ft from the waste canister edge, temperature rises were also close to those predicted by the 3-D model.
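The transient conduction problem these codes solve can be illustrated, in one dimension, by a minimal explicit finite-difference (FTCS) scheme. This is a generic textbook sketch with illustrative numbers, not an excerpt from any of the codes compared above:

```python
def ftcs_step(T, alpha, dx, dt):
    """One explicit (FTCS) time step of 1-D heat conduction,
    dT/dt = alpha * d2T/dx2, with fixed-temperature end nodes.
    Stable only when r = alpha*dt/dx**2 <= 0.5."""
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "explicit scheme unstable for r > 0.5"
    return [T[0]] + [
        T[i] + r * (T[i + 1] - 2.0 * T[i] + T[i - 1])
        for i in range(1, len(T) - 1)
    ] + [T[-1]]

# Illustrative setup: hot plane held at 100 units, far boundary at 0.
T = [100.0] + [0.0] * 20
for _ in range(200):
    T = ftcs_step(T, alpha=1.0e-6, dx=0.1, dt=2000.0)  # r = 0.2, stable
```

The ±5% code-to-code agreement reported above is about discretization and material-model choices layered on exactly this kind of time-marching core.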

  6. The acoustics of public squares/places: A comparison between results from a computer simulation program and measurements in situ

    DEFF Research Database (Denmark)

    Paini, Dario; Rindel, Jens Holger; Gade, Anders

    2004-01-01

    In the context of a PhD thesis in which the main purpose is to analyse the importance of the public square/place ("agora") as a meeting point of sound and music, with particular regard to its use for concerts (amplified or not), a first step was taken, making comparisons between measurements in situ ... or a band during, for instance, music summer festivals) and the best position for the audience. A further result could be to propose some acoustic adjustments to achieve better acoustic quality by considering the acoustic parameters which are typically used for concert halls and opera houses.
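Among the acoustic parameters typically used for concert halls, reverberation time is the most basic. As a hedged illustration, Sabine's formula on hypothetical square dimensions (open or semi-open squares need corrections that full room-acoustics simulation handles more faithfully):

```python
def sabine_rt60(volume_m3, surfaces):
    """Sabine reverberation time T60 = 0.161 * V / A, where A is the
    total equivalent absorption area: sum over surfaces of
    (area * absorption coefficient).  Illustrative only; open squares
    violate Sabine's diffuse-field assumptions."""
    absorption = sum(area * alpha for area, alpha in surfaces)
    return 0.161 * volume_m3 / absorption

# Hypothetical square bounded by facades: (area in m^2, absorption coeff)
t60 = sabine_rt60(volume_m3=40_000.0, surfaces=[(2000.0, 0.05), (1200.0, 0.3)])
```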

  7. Preliminary results for the average glandular dose to the breast measured with TLDs and computed with conversion factors

    International Nuclear Information System (INIS)

    Sardo, Luiz T.L.; Almeida, Claudio D.; Coutinho, Celia M.C.

    2013-01-01

    In mammography exams there is a risk of breast cancer induced by the dose absorbed by the glandular tissue. According to the National Institute of Cancer, INCA, breast cancer is the second most frequent type in the world and the most common among women, hence the need to monitor the mean glandular dose, D_G. Measuring methods for D_G have been established by several authors; among them, the method of Dance is one of the best known. In this study a measurement method was used with TL dosimeters inserted in a breast-tissue-equivalent phantom, BTE, with 46% glandularity, exposed using Mo/Mo and Mo/Rh target/filter combinations at 28 kV. To validate this measurement method, the results were compared with the calculation method used by Dance, which obtains D_G from the measured incident air kerma, K_i, and conversion factors accounting mainly for the beam quality, the compressed breast thickness and the glandularity. The comparison of the measured D_G with the dose obtained by the method of Dance showed that for thicknesses of 4.0 and 6.0 cm the doses were consistent. For the thickness of 5.0 cm the difference was higher, indicating that the glandularity may have an influence and suggesting further investigation. (author)
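The Dance calculation method referenced above has the simple multiplicative form D_G = g · c · s · K_i. The sketch below uses placeholder factor values for illustration, not the tabulated factors from Dance's papers:

```python
def mean_glandular_dose(k_i_mgy, g, c, s):
    """Dance-method estimate D_G = g * c * s * K_i, where g converts
    incident air kerma to glandular dose for a 50%-glandular breast,
    c corrects for glandularity and s for the X-ray spectrum.
    The factor values used below are placeholders, not Dance's tables."""
    return g * c * s * k_i_mgy

# Placeholder inputs: 8 mGy incident air kerma, illustrative factors.
dg = mean_glandular_dose(k_i_mgy=8.0, g=0.2, c=1.0, s=1.0)  # mGy
```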

  8. [Interactions of DNA bases with individual water molecules. Molecular mechanics and quantum mechanics computation results vs. experimental data].

    Science.gov (United States)

    Gonzalez, E; Lino, J; Deriabina, A; Herrera, J N F; Poltev, V I

    2013-01-01

    To elucidate details of the DNA-water interactions we performed calculations and a systematic search for minima of the interaction energy of systems consisting of one of the DNA bases and one or two water molecules. The results of calculations using two molecular mechanics (MM) force fields and the correlated ab initio method MP2/6-31G(d,p) of quantum mechanics (QM) have been compared with one another and with experimental data. The calculations demonstrated a qualitative agreement between the geometry characteristics of most of the local energy minima obtained via the different methods. The deepest minima revealed by the MM and QM methods correspond to a water molecule positioned between two neighboring hydrophilic centers of the base, forming hydrogen bonds with both. Nevertheless, the relative depth of some minima and the peculiarities of mutual water-base positions in these minima depend on the method used. The analysis revealed the insignificance of some differences in the results of calculations performed via different methods and the importance of others for the description of DNA hydration. The calculations via MM methods enable us to reproduce quantitatively all the experimental data on the enthalpies of complex formation of a single water molecule with the set of mono-, di-, and trimethylated bases, as well as on water molecule locations near base hydrophilic atoms in the crystals of DNA duplex fragments, while some of these data cannot be rationalized by QM calculations.
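The molecular-mechanics energies compared in the paper are sums of pairwise atom-atom terms. As a self-contained illustration (not the actual force fields used in the study), here is the 12-6 Lennard-Jones component, whose minimum at r = 2^(1/6)·σ with depth −ε is the kind of feature such energy-minimum searches locate:

```python
def lennard_jones(r, epsilon, sigma):
    """12-6 Lennard-Jones pair energy, one ingredient of pairwise
    molecular-mechanics potentials (electrostatic terms, also present
    in the force fields compared in the paper, are omitted here)."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 * sr6 - sr6)

# The minimum sits at r = 2**(1/6) * sigma with energy -epsilon.
# Parameter values are illustrative, not from any published force field.
r_min = 2.0 ** (1.0 / 6.0) * 3.2
e_min = lennard_jones(r_min, epsilon=0.5, sigma=3.2)
```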

  9. Implant breast reconstruction after salvage mastectomy in previously irradiated patients.

    Science.gov (United States)

    Persichetti, Paolo; Cagli, Barbara; Simone, Pierfranco; Cogliandro, Annalisa; Fortunato, Lucio; Altomare, Vittorio; Trodella, Lucio

    2009-04-01

    The most common surgical approach in case of local tumor recurrence after quadrantectomy and radiotherapy is salvage mastectomy. Breast reconstruction is the subsequent phase of the treatment, and the plastic surgeon has to operate on previously irradiated and manipulated tissues. The medical literature suggests that breast reconstruction with tissue expanders is not a viable option, considering previous radiotherapy a contraindication. The purpose of this retrospective study is to evaluate the influence of previous radiotherapy on 2-stage breast reconstruction (tissue expander/implant). Only patients with analogous timing of radiation therapy and the same demolitive and reconstructive procedures were recruited. The results of this study show that, after salvage mastectomy in previously irradiated patients, implant reconstruction is still possible. Further comparative studies are, of course, advisable to draw any conclusion on the possibility to perform implant reconstruction in previously irradiated patients.

  10. Results of small break LOCA analysis for Kuosheng nuclear power plant using the RELAP5YA computer code

    International Nuclear Information System (INIS)

    Wang, L.C.; Jeng, S.C.; Chung, N.M.

    2004-01-01

    One lesson learned from the Three Mile Island (TMI) accident was that the analysis methods used by Nuclear Steam Supply System (NSSS) vendors and/or nuclear fuel suppliers for small break Loss Of Coolant Accident (LOCA) analysis for compliance with Appendix K to 10CFR50 should be revised, documented and submitted for USNRC approval, and that plant-specific calculations using NRC-approved models for small-break LOCA to show compliance with 10CFR50.46 should be submitted for NRC approval. A study by Taiwan Power Company (TPC), under the guidance of Yankee Atomic Electric Company (YAEC), has been undertaken to perform this analysis for the Kuosheng nuclear power plant. This paper presents the results of the analysis, which are useful in satisfying the same requirements of the Republic Of China Atomic Energy Commission (ROCAEC). (author)

  11. Development of computer software to analyze entire LANDSAT scenes and to summarize classification results of variable-size polygons

    Science.gov (United States)

    Turner, B. J. (Principal Investigator); Baumer, G. M.; Myers, W. L.; Sykes, S. G.

    1981-01-01

    The Forest Pest Management Division (FPMD) of the Pennsylvania Bureau of Forestry has the responsibility for conducting annual surveys of the State's forest lands to accurately detect, map, and appraise forest insect infestations. A standardized, timely, and cost-effective method of accurately surveying forests and their condition should enhance the probability of suppressing infestations. The repetitive and synoptic coverage provided by LANDSAT (formerly ERTS) makes such satellite-derived data potentially attractive as a survey medium for monitoring forest insect damage over large areas. Forest Pest Management Division personnel have expressed keen interest in LANDSAT data and have informally cooperated with NASA/Goddard Space Flight Center (GSFC) since 1976 in the development of techniques to facilitate their use. The results of this work indicate that it may be feasible to use LANDSAT digital data to conduct annual surveys of insect defoliation of hardwood forests.

  12. Computation Results from a Parametric Study to Determine Bounding Critical Systems of Homogeneously Water-Moderated Mixed Plutonium--Uranium Oxides

    Energy Technology Data Exchange (ETDEWEB)

    Shimizu, Y.

    2001-01-11

    This report provides computational results of an extensive study to examine the following: (1) infinite-media neutron-multiplication factors; (2) material bucklings; (3) bounding infinite-media critical concentrations; (4) bounding finite critical dimensions of water-reflected and homogeneously water-moderated one-dimensional systems (i.e., spheres, cylinders of infinite length, and slabs that are infinite in two dimensions) comprised of various proportions and densities of plutonium oxides and uranium oxides, each having various isotopic compositions; and (5) sensitivity coefficients of delta k-eff with respect to delta changes in critical dimensions, determined for each of the three geometries studied. The study was undertaken to support the development of a standard sponsored by the International Standards Organization (ISO) under Technical Committee 85, Nuclear Energy (TC 85)--Subcommittee 5, Nuclear Fuel Technology (SC 5)--Working Group 8, Standardization of Calculations, Procedures and Practices Related to Criticality Safety (WG 8). The designation and title of the ISO TC 85/SC 5/WG 8 standard working draft is WD 14941, ''Nuclear energy--Fissile materials--Nuclear criticality control and safety of plutonium-uranium oxide fuel mixtures outside of reactors.'' Various ISO member participants performed similar computational studies using their indigenous computational codes to provide comparative results for analysis in the development of the standard.
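The connection between items (2) and (4), material bucklings and bounding critical dimensions, is elementary in one-group diffusion theory: a bare sphere is critical when its geometric buckling equals the material buckling. The buckling value below is illustrative, and the study's codes of course treat reflection, extrapolation distance and multigroup effects that this sketch ignores:

```python
import math

def critical_bare_sphere_radius(b2_material):
    """One-group diffusion theory: a bare sphere is critical when its
    geometric buckling (pi/R)**2 equals the material buckling B_m**2,
    so R = pi / B_m.  Ignores the extrapolation distance and the water
    reflector treated in the study's actual calculations."""
    return math.pi / math.sqrt(b2_material)

# Illustrative material buckling in cm^-2, not a value from the report:
r_crit = critical_bare_sphere_radius(b2_material=0.04)
```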

  13. Report of the evaluation by the Ad Hoc Review Committee on Computational Science and Engineering. Result evaluation in fiscal year 2000

    International Nuclear Information System (INIS)

    2001-06-01

    The Research Evaluation Committee, which consisted of 14 members from outside of the Japan Atomic Energy Research Institute (JAERI), set up an Ad Hoc Review Committee on Computational Science and Engineering in accordance with the 'Fundamental Guideline for the Evaluation of Research and Development (R and D) at JAERI' and its subsidiary regulations in order to evaluate the R and D accomplishments achieved for five years from Fiscal Year 1995 to Fiscal Year 1999 at Center for Promotion of Computational Science and Engineering of JAERI. The Ad Hoc Review Committee consisted of seven specialists from outside of JAERI. The Ad Hoc Review Committee conducted its activities from December 2000 to March 2001. The evaluation was performed on the basis of the materials submitted in advance and of the oral presentations made at the Ad Hoc Review Committee meeting which was held on December 27, 2000, in line with the items, viewpoints, and criteria for the evaluation specified by the Research Evaluation Committee. The result of the evaluation by the Ad Hoc Review Committee was submitted to the Research Evaluation Committee, and was judged to be appropriate at its meeting held on March 16, 2001. This report describes the result of the evaluation by the Ad Hoc Review Committee on Computational Science and Engineering. (author)

  14. Prospective Coronary Heart Disease Screening in Asymptomatic Hodgkin Lymphoma Patients Using Coronary Computed Tomography Angiography: Results and Risk Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Girinsky, Theodore, E-mail: girinsky.theodore@orange.fr [Department of Radiation Oncology, Institut Gustave Roussy, Villejuif (France); M’Kacher, Radhia [Laboratory of Radiobiology and Oncology, Institut de Radiobiologie Cellulaire et Moleculaire/Direction des Sciences Vivantes/Commissariat Energie Atomique, Fontenay aux Roses (France); Lessard, Nathalie [Department of Radiation Oncology, Institut Gustave Roussy, Villejuif (France); Koscielny, Serge [Biostatistics and Epidemiology Unit, Institut Gustave Roussy, Villejuif (France); Elfassy, Eric; Raoux, François [Department of Radiology, Marie Lannelongue, Chatenay-Malabry (France); Carde, Patrice [Department of Hematology, Institut Gustave Roussy, Villejuif (France); Santos, Marcos Dos [Department of Radiation Oncology, Institut Gustave Roussy, Villejuif (France); Margainaud, Jean-Pierre [Department of Head and Neck Surgery, Institut Gustave Roussy, Villejuif (France); Sabatier, Laure [Laboratory of Radiobiology and Oncology, Institut de Radiobiologie Cellulaire et Moleculaire/Direction des Sciences Vivantes/Commissariat Energie Atomique, Fontenay aux Roses (France); Ghalibafian, Mithra [Department of Radiation Oncology, Institut Gustave Roussy, Villejuif (France); Paul, Jean-François [Department of Radiology, Marie Lannelongue, Chatenay-Malabry (France)

    2014-05-01

    Purpose: To prospectively investigate the coronary artery status using coronary CT angiography (CCTA) in patients with Hodgkin lymphoma treated with combined modalities and mediastinal irradiation. Methods and Materials: All consecutive asymptomatic patients with Hodgkin lymphoma entered the study during follow-up, from August 2007 to May 2012. Coronary CT angiography was performed, and risk factors were recorded along with leukocyte telomere length (LTL) measurements. Results: One hundred seventy-nine patients entered the 5-year study. The median follow-up was 11.6 years (range, 2.1-40.2 years), and the median interval between treatment and the CCTA was 9.5 years (range, 0.5-40 years). Coronary artery abnormalities were demonstrated in 46 patients (26%). Coronary CT angiography abnormalities were detected in nearly 15% of the patients within the first 5 years after treatment. A significant increase (34%) occurred 10 years after treatment (P=.05). Stenoses were mostly nonostial. Severe stenoses were observed in 12 (6.7%) of the patients, entailing either angioplasty with stent placement or bypass grafting in 10 of them (5.5%). A multivariate analysis demonstrated that age at treatment, hypertension, and hypercholesterolemia, as well as radiation dose to the coronary artery origins, were prognostic factors. In the group of patients with LTL measurements, hypertension and LTL were the only independent risk factors. Conclusions: The findings suggest that CCTA can identify asymptomatic individuals at risk of acute coronary artery disease who might require either preventive or curative measures. Conventional risk factors and the radiation dose to coronary artery origins were independent prognostic factors. The prognostic value of LTL needs further investigation.

  15. Computing Educator Attitudes about Motivation

    OpenAIRE

    Settle, Amber; Sedlak, Brian

    2016-01-01

    While motivation is of great interest to computing educators, relatively little work has been done on understanding faculty attitudes toward student motivation. Two previous qualitative studies of instructor attitudes found results identical to those from other disciplines, but neither study considered whether instructors perceive student motivation to be more important in certain computing classes. In this work we present quantitative results about the perceived importance of student motivat...

  16. [Electronic cigarettes - effects on health. Previous reports].

    Science.gov (United States)

    Napierała, Marta; Kulza, Maksymilian; Wachowiak, Anna; Jabłecka, Katarzyna; Florek, Ewa

    2014-01-01

    Electronic cigarettes (e-cigarettes) have recently become very popular on the tobacco-products market. These products are considered potentially less harmful compared to traditional tobacco products. However, current reports indicate that producers' statements regarding the composition of e-liquids are not always sufficient, and consumers often lack reliable information on the quality of the product they use. This paper contains a review of previous reports on the composition of e-cigarettes and their impact on health. Most of the observed health effects were related to symptoms of the respiratory tract, mouth, and throat, neurological complications, and sensory organs. Particularly hazardous effects of e-cigarettes were: pneumonia, congestive heart failure, confusion, convulsions, hypotension, aspiration pneumonia, second-degree facial burns, blindness, chest pain, and rapid heartbeat. The literature contains no information on passive exposure to the aerosols released during e-cigarette smoking. Furthermore, information regarding the long-term use of these products is likewise unavailable.

  17. Computational results for the effects of external disturbances on transition location of bodies of revolution from subsonic to supersonic speeds and comparisons with experimental data

    Science.gov (United States)

    Goradia, S. H.; Bobbitt, P. J.; Harvey, W. D.

    1989-01-01

    Computational experiments have been performed for a few configurations in order to investigate the effects of external flow disturbances on the extent of laminar flow and wake drag. Theoretical results have been compared with experimental data for the AEDC cone, for Mach numbers from subsonic to supersonic, and for both free-flight and wind-tunnel environments. The comparisons have been found to be very satisfactory, thus establishing the utility of the present method for the design and development of laminar flow configurations and for the assessment of wind tunnel data. Results of calculations concerning the effects of unit Reynolds number on transition are also presented. In addition to the AEDC cone, computations have been performed for an ogive body of revolution at zero angle of attack and supersonic Mach numbers. Results are presented for transition Reynolds number and wake drag for external disturbances corresponding to free air and the test section of the AEDC-VKF tunnel. These results have been found to compare quite well with wind tunnel data both when surface suction is applied and when it is absent.

  18. Validation of one-dimensional module of MARS 2.1 computer code by comparison with the RELAP5/MOD3.3 developmental assessment results

    International Nuclear Information System (INIS)

    Lee, Y. J.; Bae, S. W.; Chung, B. D.

    2003-02-01

    This report records the results of the code validation for the one-dimensional module of the MARS 2.1 thermal hydraulics analysis code by means of result comparison with the RELAP5/MOD3.3 computer code. For the validation calculations, simulations of the RELAP5 code development assessment problems, which consist of 22 simulation problems in 3 categories, have been selected. The results of the 3 categories of simulations demonstrate that the one-dimensional module of the MARS 2.1 code and the RELAP5/MOD3.3 code are essentially the same code. This is expected, as the two codes have basically the same set of field equations, constitutive equations, and main thermal-hydraulic models. The results suggest that the high level of code validity of RELAP5/MOD3.3 can be directly applied to the MARS one-dimensional module.

  19. Short-distance expansion for the electromagnetic half-space Green's tensor: general results and an application to radiative lifetime computations

    International Nuclear Information System (INIS)

    Panasyuk, George Y; Schotland, John C; Markel, Vadim A

    2009-01-01

    We obtain a short-distance expansion for the half-space, frequency-domain electromagnetic Green's tensor. The small parameter of the theory is ωε₁L/c, where ω is the frequency, ε₁ is the permittivity of the upper half-space, in which both the source and the point of observation are located, and which is assumed to be transparent, c is the speed of light in vacuum, and L is a characteristic length, defined as the distance from the point of observation to the reflected (with respect to the planar interface) position of the source. In the case when the lower half-space (the substrate) is characterized by a complex permittivity ε₂, we compute the expansion to third order. For the case when the substrate is a transparent dielectric, we compute the imaginary part of the Green's tensor to seventh order. The analytical calculations are verified numerically. The practical utility of the obtained expansion is demonstrated by computing the radiative lifetime of two electromagnetically interacting molecules in the vicinity of a transparent dielectric substrate. The computation is performed in the strong-interaction regime, where the quasi-particle pole approximation is inapplicable. In this regime, the integral representation for the half-space Green's tensor is difficult to use, while its electrostatic limiting expression is grossly inadequate. However, the analytical expansion derived in this paper can be used directly and efficiently. The results of this study are also relevant to nano-optics and near-field imaging, especially when tomographic image reconstruction is involved.

  20. Interobserver agreement of the injury diagnoses obtained by postmortem computed tomography of traffic fatality victims and a comparison with autopsy results

    DEFF Research Database (Denmark)

    Leth, Peter Mygind; Struckmann, Henrik; Lauritsen, Jens

    2013-01-01

    The present study investigated the interobserver variation between a radiologist and a forensic pathologist in 994 injury diagnoses obtained by postmortem computed tomography (CT) of 67 traffic fatality victims, and the results were compared with diagnoses obtained by autopsy. The injuries were...... system, but the pathologist diagnosed more organ injuries. We recommend the use of a radiologist as a consultant for the evaluation of postmortem CT images. Training in radiology should be included in forensic medicine postgraduate training. CT was superior to autopsy in detecting abnormal air...

  1. Degenerative dementia: nosological aspects and results of single photon emission computed tomography; Les demences degeneratives: aspects nosologiques et resultats de la tomographie d'emission monophotonique

    Energy Technology Data Exchange (ETDEWEB)

    Dubois, B.; Habert, M.O. [Hopital Pitie-Salpetriere, 75 - Paris (France)

    1999-12-01

    Ten years ago, the diagnostic discussion of dementia in an elderly patient was limited to two pathologies: Alzheimer's disease and Pick's disease. In recent years, this framework of primary degenerative dementias has fragmented. The different diseases and the results obtained with single photon emission computed tomography are discussed, for example: fronto-temporal dementia, primary progressive aphasia, progressive apraxia, visuo-spatial dysfunction, dementia with Lewy bodies, and cortico-basal degeneration. (N.C.)

  2. Integration of computer-aided diagnosis/detection (CAD) results in a PACS environment using CAD-PACS toolkit and DICOM SR

    International Nuclear Information System (INIS)

    Le, Anh H.T.; Liu, Brent; Huang, H.K.

    2009-01-01

    Picture Archiving and Communication System (PACS) is a mature technology in health care delivery for daily clinical imaging service and data management. Computer-aided detection and diagnosis (CAD) utilizes computer methods to obtain quantitative measurements from medical images and clinical information to assist clinicians to assess a patient's clinical state more objectively. CAD needs image input and related information from PACS to improve its accuracy; and PACS benefits from CAD results online and available at the PACS workstation as a second reader to assist physicians in the decision making process. Currently, these two technologies remain as two separate independent systems with only minimal system integration. This paper describes a universal method to integrate CAD results with PACS in its daily clinical environment. The method is based on Health Level 7 (HL7) and Digital imaging and communications in medicine (DICOM) standards, and Integrating the Healthcare Enterprise (IHE) workflow profiles. In addition, the integration method is Health Insurance Portability and Accountability Act (HIPAA) compliant. The paper presents (1) the clinical value and advantages of integrating CAD results in a PACS environment, (2) DICOM Structured Reporting formats and some important IHE workflow profiles utilized in the system integration, (3) the methodology using the CAD-PACS integration toolkit, and (4) clinical examples with step-by-step workflows of this integration. (orig.)

  3. Classification Method to Define Synchronization Capability Limits of Line-Start Permanent-Magnet Motor Using Mesh-Based Magnetic Equivalent Circuit Computation Results

    Directory of Open Access Journals (Sweden)

    Bart Wymeersch

    2018-04-01

    Line-start permanent-magnet synchronous motors (LS-PMSM) are energy-efficient synchronous motors that can start asynchronously due to a squirrel cage in the rotor. The drawback with this motor type, however, is the chance of failure to synchronize after start-up. To identify the problem and the stable operation limits, synchronization at various parameter combinations is investigated. Accurate knowledge of the operation limits that assure synchronization with the utility grid requires an accurate classification of parameter combinations. Because many simulations have to be executed for this, a rapid evaluation method is indispensable. Several modeling methods exist to simulate the dynamic behavior in the time domain, and this paper discusses them. In order to include spatial factors and magnetic nonlinearities on the one hand, and to restrict the computation time on the other, a magnetic equivalent circuit (MEC) modeling method is developed. To accelerate numerical convergence, a mesh-based analysis method is applied. The novelty in this paper is the implementation of a support vector machine (SVM) to classify the results of simulations at various parameter combinations into successful or unsuccessful synchronization, in order to define the synchronization capability limits. It is explained how these techniques benefit the simulation time and the evaluation process. The results of the MEC modeling correspond to those obtained with finite element analysis (FEA), despite the reduced computation time. In addition, simulation results obtained with MEC modeling are experimentally validated.
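
The classification step described in this record can be sketched with a linear classifier over simulated parameter combinations. The paper uses an SVM on MEC simulation results; the stand-in below uses a simple perceptron, a synthetic (load torque, load inertia) parameter grid, and an invented stability rule, purely for illustration:

```python
# Classify synthetic (load torque, load inertia) start-up simulations
# as synchronized (+1) or not (-1). The stability rule and the data
# are invented; a real study would label points by dynamic simulation.
import numpy as np

rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 2.0, size=(600, 2))            # per-unit torque, inertia
score = pts[:, 0] + 0.5 * pts[:, 1] - 1.5             # toy stability rule
keep = np.abs(score) > 0.2                            # keep a separation margin
X = np.hstack([pts[keep], np.ones((keep.sum(), 1))])  # bias via extra column
y = np.where(score[keep] <= 0, 1, -1)                 # +1 = synchronizes

w = np.zeros(3)
converged = False
for _ in range(1000):                                 # perceptron training
    mistakes = 0
    for xi, yi in zip(X, y):
        if yi * (xi @ w) <= 0:                        # misclassified: update
            w += yi * xi
            mistakes += 1
    if mistakes == 0:                                 # clean pass: separated
        converged = True
        break

train_acc = float(np.mean(np.sign(X @ w) == y))
print(converged, train_acc)
```

The learned boundary w plays the role of the synchronization capability limit; an SVM (as in the paper) would additionally maximize the margin and support nonlinear kernels.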

  4. (hydronsan) in previously untreated cases of pulmo

    African Journals Online (AJOL)

    B., M.D., King George V. Hospital, Durban. SUMMARY. The results of a randomized, single-blind, between-patient trial of various combinations of rifampicin, .... clearing after 16 weeks of trial is higher in the rifampicin-treated groups than in the ethambutol-plus-Hydronsan group. Table III shows the radiological clearing.

  5. The program system for the automatic graphical representation, on the Calcomp recorder, of the results of numerical or hybrid simulations on the E.A.I. 8900 Computer

    International Nuclear Information System (INIS)

    Neel, Daniele

    1970-01-01

    This report was the first subject of a thesis submitted by Madame Daniele Neel on 25 May 1970 to the Faculte des Sciences in Paris in order to obtain the degree of doctor-engineer. The differential equations, treated by hybrid calculation, were solved continuously by the analog machine while the digital computer sampled the results at different times. The program system was divided into two parts. A card-index system was built up progressively from the results (even when they arrived in real time); the results were displayed graphically either directly, as a curve or a series of curves on a storage oscilloscope screen, or by a delayed system using a digital plotter. The graphs obtained were ready to be inserted in a report and contained all the relevant information. The second subject, 'The hybrid calculation - Generalities and Bibliography', was covered by note CEA-N-1345. (author) [fr

  6. Computer-Based Training in Math and Working Memory Improves Cognitive Skills and Academic Achievement in Primary School Children: Behavioral Results

    Directory of Open Access Journals (Sweden)

    Noelia Sánchez-Pérez

    2018-01-01

    Student academic achievement has been positively related to further development outcomes, such as the attainment of higher educational, employment, and socioeconomic aspirations. Among all the academic competences, mathematics has been identified as an essential skill in the field of international leadership as well as for those seeking positions in disciplines related to science, technology, and engineering. Given its positive consequences, studies have designed trainings to enhance children's mathematical skills. Additionally, the ability to regulate and control actions and cognitions, i.e., executive functions (EF), has been associated with school success, which has resulted in a strong effort to develop EF training programs to improve students' EF and academic achievement. The present study examined the efficacy of a school computer-based training composed of two components, namely, working memory and mathematics tasks. Among the advantages of using a computer-based training program is the ease with which it can be implemented in school settings and the ease by which the difficulty of the tasks can be adapted to fit the child's ability level. To test the effects of the training, children's cognitive skills (EF and IQ) and their school achievement (math and language grades and abilities) were evaluated. The results revealed a significant improvement in cognitive skills, such as non-verbal IQ and inhibition, and better school performance in math and reading among the children who participated in the training compared to those children who did not. Most of the improvements were related to training on WM tasks. These findings confirmed the efficacy of a computer-based training that combined WM and mathematics activities as part of the school routines based on the training's impact on children's academic competences and cognitive skills.

  7. Computer-Based Training in Math and Working Memory Improves Cognitive Skills and Academic Achievement in Primary School Children: Behavioral Results.

    Science.gov (United States)

    Sánchez-Pérez, Noelia; Castillo, Alejandro; López-López, José A; Pina, Violeta; Puga, Jorge L; Campoy, Guillermo; González-Salinas, Carmen; Fuentes, Luis J

    2017-01-01

    Student academic achievement has been positively related to further development outcomes, such as the attainment of higher educational, employment, and socioeconomic aspirations. Among all the academic competences, mathematics has been identified as an essential skill in the field of international leadership as well as for those seeking positions in disciplines related to science, technology, and engineering. Given its positive consequences, studies have designed trainings to enhance children's mathematical skills. Additionally, the ability to regulate and control actions and cognitions, i.e., executive functions (EF), has been associated with school success, which has resulted in a strong effort to develop EF training programs to improve students' EF and academic achievement. The present study examined the efficacy of a school computer-based training composed of two components, namely, working memory and mathematics tasks. Among the advantages of using a computer-based training program is the ease with which it can be implemented in school settings and the ease by which the difficulty of the tasks can be adapted to fit the child's ability level. To test the effects of the training, children's cognitive skills (EF and IQ) and their school achievement (math and language grades and abilities) were evaluated. The results revealed a significant improvement in cognitive skills, such as non-verbal IQ and inhibition, and better school performance in math and reading among the children who participated in the training compared to those children who did not. Most of the improvements were related to training on WM tasks. These findings confirmed the efficacy of a computer-based training that combined WM and mathematics activities as part of the school routines based on the training's impact on children's academic competences and cognitive skills.

  8. Secondary recurrent miscarriage is associated with previous male birth.

    LENUS (Irish Health Repository)

    Ooi, Poh Veh

    2012-01-31

    Secondary recurrent miscarriage (RM) is defined as three or more consecutive pregnancy losses after delivery of a viable infant. Previous reports suggest that a firstborn male child is associated with less favourable subsequent reproductive potential, possibly due to maternal immunisation against male-specific minor histocompatibility antigens. In a retrospective cohort study of 85 cases of secondary RM we aimed to determine if secondary RM was associated with (i) gender of previous child, maternal age, or duration of miscarriage history, and (ii) increased risk of pregnancy complications. Fifty-three women (62.0%; 53/85) gave birth to a male child prior to RM compared to 32 (38.0%; 32/85) who gave birth to a female child (p=0.002). The majority (91.7%; 78/85) had uncomplicated, term deliveries and normal birth weight neonates, with one quarter of the women previously delivered by Caesarean section. All had routine RM investigations and 19.0% (16/85) had an abnormal result. Fifty-seven women conceived again and 33.3% (19/57) miscarried, but there was no significant difference in failure rates between those with a previous male or female child (13/32 vs. 6/25, p=0.2). When patients with abnormal results were excluded, or when women with only one previous child were considered, there was still no difference in these rates. A previous male birth may be associated with an increased risk of secondary RM but numbers preclude concluding whether this increases recurrence risk. The suggested association with previous male birth provides a basis for further investigations at a molecular level.

  9. Secondary recurrent miscarriage is associated with previous male birth.

    LENUS (Irish Health Repository)

    Ooi, Poh Veh

    2011-01-01

    Secondary recurrent miscarriage (RM) is defined as three or more consecutive pregnancy losses after delivery of a viable infant. Previous reports suggest that a firstborn male child is associated with less favourable subsequent reproductive potential, possibly due to maternal immunisation against male-specific minor histocompatibility antigens. In a retrospective cohort study of 85 cases of secondary RM we aimed to determine if secondary RM was associated with (i) gender of previous child, maternal age, or duration of miscarriage history, and (ii) increased risk of pregnancy complications. Fifty-three women (62.0%; 53/85) gave birth to a male child prior to RM compared to 32 (38.0%; 32/85) who gave birth to a female child (p=0.002). The majority (91.7%; 78/85) had uncomplicated, term deliveries and normal birth weight neonates, with one quarter of the women previously delivered by Caesarean section. All had routine RM investigations and 19.0% (16/85) had an abnormal result. Fifty-seven women conceived again and 33.3% (19/57) miscarried, but there was no significant difference in failure rates between those with a previous male or female child (13/32 vs. 6/25, p=0.2). When patients with abnormal results were excluded, or when women with only one previous child were considered, there was still no difference in these rates. A previous male birth may be associated with an increased risk of secondary RM but numbers preclude concluding whether this increases recurrence risk. The suggested association with previous male birth provides a basis for further investigations at a molecular level.

  10. Incidence of traumatic carotid and vertebral artery dissections: results of cervical vessel computed tomography angiogram as a mandatory scan component in severely injured patients

    Directory of Open Access Journals (Sweden)

    Schicho A

    2018-01-01

    Andreas Schicho,1 Lukas Luerken,1 Ramona Meier,1 Antonio Ernstberger,2 Christian Stroszczynski,1 Andreas Schreyer,1 Lena-Marie Dendl,1 Stephan Schleder1 (1Department of Radiology, 2Department of Trauma Surgery, University Medical Center, Regensburg, Germany). Purpose: The aim of this study was to evaluate the true incidence of cervical artery dissections (CeADs) in trauma patients with an Injury Severity Score (ISS) of ≥16, since a head-and-neck computed tomography angiogram (CTA) is not a compulsory component of whole-body trauma computed tomography (CT) protocols. Patients and methods: A total of 230 consecutive trauma patients with an ISS of ≥16 admitted to our Level I trauma center during a 24-month period were prospectively included. Standardized whole-body CT in a 256-detector-row scanner included a head-and-neck CTA. Incidence, mortality, patient and trauma characteristics, and concomitant injuries were recorded and analyzed retrospectively in patients with carotid artery dissection (CAD) and vertebral artery dissection (VAD). Results: Of the 230 patients included, 6.5% had a CeAD, 5.2% had a CAD, and 1.7% had a VAD. One patient had both CAD and VAD. For both CAD and VAD, mortality was 25%. One death was caused by fatal cerebral ischemia due to high-grade CAD. A total of 41.6% of the patients with traumatic CAD and 25% of the patients with VAD had neurological sequelae. Conclusion: Mandatory head-and-neck CTA yields a higher CeAD incidence than reported before. We highly recommend the compulsory inclusion of a head-and-neck CTA in whole-body CT routines for severely injured patients. Keywords: polytrauma, carotid artery, vertebral artery, dissection, blunt trauma, computed tomography angiogram

  11. Final results of the 'Benchmark on computer simulation of radioactive nuclides production rate and heat generation rate in a spallation target'

    International Nuclear Information System (INIS)

    Janczyszyn, J.; Pohorecki, W.; Domanska, G.; Maiorino, R.J.; David, J.C.; Velarde, F.A.

    2011-01-01

    A benchmark has been organized to assess the computer simulation of nuclide production and heat generation in a spallation lead target. The physical models applied for the calculation of thick lead target activation do not produce satisfactory results for the majority of analysed nuclides; however, better or worse quantitative compliance with the experimental results can be observed. Analysis of the quality of the calculated results shows the best performance for heavy nuclides (A: 170-190). Almost all intermediate nuclides (A: 60-130) are underestimated, while those with A: 130-170 are mainly overestimated. The shape of the activity distribution in the target is well reproduced in calculations by all models, but the numerical comparison shows performance similar to that for the whole target. The Isabel model yields the best results. As for the whole-target heating rate, the results from all participants are consistent, with only small differences observed between the physical models. The heating distributions within the target, however, are less similar. The quantitative comparison of the distributions yielded by the different spallation reaction models shows no serious differences over the major part of the target - generally below 10%. However, in the outermost parts of the target front layers and in the part of the target at its end, beyond the primary proton range, a spread higher than 40% is obtained.
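
Comparisons of this kind are usually reported as calculated-to-experimental (C/E) activity ratios grouped by mass-number band; a sketch of that bookkeeping, with invented nuclides and activities (not the benchmark's data):

```python
# Group hypothetical calculated-to-experimental (C/E) activity ratios
# by mass-number band, mirroring the benchmark's comparison. All
# nuclides and activities below are invented for illustration.
nuclides = {
    # name: (mass number A, calculated activity, measured activity), in Bq
    "Co-60":  (60,  7.2e3, 9.0e3),
    "I-126":  (126, 1.5e4, 2.1e4),
    "Gd-148": (148, 6.6e3, 5.0e3),
    "Lu-173": (173, 4.9e3, 4.8e3),
    "Au-194": (194, 1.1e4, 1.0e4),
}

def band(a):
    # Mass-number bands used in the qualitative discussion above
    if 60 <= a <= 130:
        return "A 60-130"
    if a <= 170:
        return "A 130-170"
    return "A 170-190"

ce = {}
for a, calc, meas in nuclides.values():
    ce.setdefault(band(a), []).append(calc / meas)

for label, ratios in sorted(ce.items()):
    mean = sum(ratios) / len(ratios)
    print(label, round(mean, 2))  # C/E < 1: underestimated; > 1: overestimated
```

The invented values were chosen to reproduce the qualitative pattern the abstract describes: underestimation for A 60-130, overestimation for A 130-170, and near-unity C/E for A 170-190.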

  12. Managing previously disposed waste to today's standards

    International Nuclear Information System (INIS)

    1990-01-01

    A Radioactive Waste Management Complex (RWMC) was established at the Idaho National Engineering Laboratory (INEL) in 1952 for controlled disposal of radioactive waste generated at the INEL. Between 1954 and 1970, waste characterized by long-lived, alpha-emitting radionuclides from the Rocky Flats Plant was also buried at this site. Migration of radionuclides and other hazardous substances from the buried waste has recently been detected. A Buried Waste Program (BWP) was established to manage cleanup of the buried waste. This program has four objectives: (1) determine contaminant sources, (2) determine extent of contamination, (3) mitigate migration, and (4) recommend an alternative for long-term management of the waste. Activities designed to meet these objectives have been under way since the inception of the program. The regulatory environment governing these activities is evolving. Pursuant to permitting activities under the Resource Conservation and Recovery Act (RCRA), the Department of Energy (DOE) and the Environmental Protection Agency (EPA) entered into a Consent Order Compliance Agreement (COCA) for cleanup of past-practice disposal units at the INEL. Subsequent to identification of the RWMC as a release site, cleanup activities proceeded under dual regulatory coverage of RCRA and the Atomic Energy Act. DOE, EPA, and the State of Idaho are negotiating a RCRA/Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) Interagency Agreement (IAG) for management of waste disposal sites at the INEL as a result of the November 1989 listing of the INEL on the National Priority List (NPL). Decision making for selection of cleanup technology will be conducted under the CERCLA process, supplemented as required to meet the requirements of the National Environmental Policy Act (NEPA). 7 figs

  13. Investigation of previously derived Hyades, Coma, and M67 reddenings

    International Nuclear Information System (INIS)

    Taylor, B.J.

    1980-01-01

    New Hyades polarimetry and field star photometry have been obtained to check the Hyades reddening, which was found to be nonzero in a previous paper. The new Hyades polarimetry implies essentially zero reddening; this is also true of polarimetry published by Behr (which was incorrectly interpreted in the previous paper). Four photometric techniques which are presumed to be insensitive to blanketing are used to compare the Hyades to nearby field stars; these four techniques also yield essentially zero reddening. When all of these results are combined with others which the author has previously published and a simultaneous solution for the Hyades, Coma, and M67 reddenings is made, the results are E(B-V) = 3 ± 2 (σ) mmag, -1 ± 3 (σ) mmag, and 46 ± 6 (σ) mmag, respectively. No support for a nonzero Hyades reddening is offered by the new results. When the newly obtained reddenings for the Hyades, Coma, and M67 are compared with results from techniques given by Crawford and by users of the David Dunlap Observatory photometric system, no differences between the new and other reddenings are found which are larger than about 2 sigma. The author had previously found that the M67 main-sequence stars have about the same blanketing as that of Coma and less blanketing than the Hyades; this conclusion is essentially unchanged by the revised reddenings
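
A combined solution of this kind reduces, in the simplest case, to an inverse-variance weighted mean of independent estimates with quoted sigmas; a minimal sketch with hypothetical E(B-V) inputs (not the paper's measurements):

```python
# Inverse-variance weighted mean of independent reddening estimates.
# The two E(B-V) values and sigmas below are hypothetical inputs.

def weighted_mean(values, sigmas):
    # Weight each estimate by 1/sigma^2; the combined sigma follows
    # from the summed weights.
    weights = [1.0 / s ** 2 for s in sigmas]
    mean = sum(w * v for w, v in zip(weights, values)) / sum(weights)
    sigma = (1.0 / sum(weights)) ** 0.5
    return mean, sigma

# Two hypothetical E(B-V) estimates for one cluster, in mmag:
m, s = weighted_mean([2.0, 4.0], [3.0, 4.0])
print(round(m, 2), round(s, 2))  # combined estimate and its sigma
```

The combined sigma is always smaller than the smallest input sigma, which is why pooling several blanketing-insensitive techniques tightens the reddening estimate.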

  14. 26 CFR 1.1402(f)-1 - Computation of partner's net earnings from self-employment for taxable year which ends as result...

    Science.gov (United States)

    2010-04-01

    ... TAXES Tax on Self-Employment Income § 1.1402(f)-1 Computation of partner's net earnings from self...—(1) In general. The rules for the computation of a partner's net earnings from self-employment are... self-employment computed under such rules for the last taxable year of a deceased partner, if a partner...

  15. Effects on mortality, treatment, and time management as a result of routine use of total body computed tomography in blunt high-energy trauma patients.

    Science.gov (United States)

    van Vugt, Raoul; Kool, Digna R; Deunk, Jaap; Edwards, Michael J R

    2012-03-01

    Currently, total body computed tomography (TBCT) is being rapidly implemented in the evaluation of trauma patients. With this review, we aim to evaluate the clinical implications (mortality, change in treatment, and time management) of the routine use of TBCT in adult blunt high-energy trauma patients compared with a conservative approach using conventional radiography, ultrasound, and selective computed tomography. A literature search for original studies on TBCT in blunt high-energy trauma patients was performed. Two independent observers included studies concerning mortality, change of treatment, and/or time management as outcome measures. For each article, relevant data were extracted and analyzed, and quality was assessed according to the Oxford levels of evidence. From 183 articles initially identified, the observers included nine original studies in consensus. One of three studies described a significant difference in mortality; four described a change of treatment in 2% to 27% of patients because of the use of TBCT. Five studies found a gain in time with the use of immediate routine TBCT. Eight studies scored a level of evidence of 2b and one of 3b. The current literature is predominantly of suboptimal design to prove definitively that the routine use of TBCT results in improved survival of blunt high-energy trauma patients. TBCT can lead to a change of treatment and improves time intervals in the emergency department as compared with its selective use.

  16. The difference of canine, first and second premolar tooth size resulted from cone beam computed tomography imaging with Moyers Prediction Table on the working study model

    Directory of Open Access Journals (Sweden)

    Julies Hariani Sugiaman

    2011-03-01

    Full Text Available A study model is one of the standard orthodontic components important for diagnosis and treatment planning, but in some patients with a strong gag reflex it is difficult to obtain such models. A new device that can show the condition of the patient's mouth in three planes (axial, sagittal, and coronal) is expected to be an alternative when a study model is difficult to obtain. The purpose of this study was to find out whether there are any differences in the mesiodistal size of the canine, first and second premolar obtained from CBCT imaging compared with Moyers analysis on the study models. The method of the research is comparative descriptive. Measurements were made on 10 CBCT imaging results and 10 study models. The mesiodistal sizes on the CBCT images were measured with the available computer program, the mesiodistal sizes on the study models were measured with a sliding caliper, and the sizes of the canine, first and second premolar teeth from CBCT imaging were then compared with the results of the Moyers method analysis on the study models. A t-test was used to determine whether tooth size differed between the CBCT images and the study models, with significance determined from the p-value and the t table.
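The t-test comparison described can be sketched as follows; the tooth widths below are hypothetical illustrative values, and a full analysis would also consult degrees of freedom and a p-value table:

```python
import math

def welch_t(a, b):
    """Welch's two-sample t statistic for independent samples."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

# Hypothetical mesiodistal widths (mm) measured on CBCT and on the casts:
cbct = [7.1, 6.9, 7.3, 7.0, 7.2]
cast = [7.0, 7.0, 7.2, 7.1, 7.1]
t = welch_t(cbct, cast)
```

The computed statistic is then compared against the critical value from a t table at the chosen significance level to decide whether the two measurement methods differ.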

  17. Comparison of long-term results of computer-assisted anti-stigma education and reading anti-stigma educational materials.

    Science.gov (United States)

    Finkelstein, Joseph; Lapshin, Oleg; Wasserman, Evgeny

    2007-10-11

    Professionals working with psychiatric patients very often have negative beliefs and attitudes about their clients. We designed our study to investigate the effectiveness of anti-stigma interventions among university students who are trained to provide special education. The objective of our study was to compare the sustainability of the effect of two anti-stigma education programs. We enrolled 91 college students from the School of Special Education at the Herzen Russian State Pedagogic University (St Petersburg, Russia). Of those, 36 read two articles and a World Health Organization brochure (reading group, RG) devoted to the problem of psychiatric stigma, and 32 studied an anti-stigma web-based program (program group, PG). Twenty-three students were in a control group (CG) and received no intervention. The second study visit, at six months, was completed by 65 students. To measure the level of stigma we used the Community Attitudes toward the Mentally Ill (CAMI) questionnaire. The web-based program was based on the Computer-assisted Education system (CO-ED), which we described previously. The CO-ED system provides self-paced interactive education driven by adult learning theories. At the time of their first visit the age of the study participants was 19.0+/-1.2 years; of them, 99% were females. After the intervention, the level of stigma assessed by CAMI decreased in PG from 24.0+/-5.0 to 15.8+/-4.6 points, while in RG it dropped from 24.1+/-6.1 to 20.3+/-6.4 points. At six months the level of stigma in PG was significantly lower than in CG and RG (20.2+/-6.2 in CG, 21.3+/-6.5 in RG, and 18.7+/-4.9 in PG). Both interactive web-based education and reading anti-stigma materials can be effective in reducing psychiatric stigma among university students; the effect of the interactive web-based education based on adult learning theories was more stable, as assessed at six months.

  18. Computing nucleon EDM on a lattice

    Science.gov (United States)

    Abramczyk, Michael; Aoki, Sinya; Blum, Tom; Izubuchi, Taku; Ohki, Hiroshi; Syritsyn, Sergey

    2018-03-01

    I will discuss briefly recent changes in the methodology of computing the baryon EDM on a lattice. The associated correction substantially reduces presently existing lattice values for the proton and neutron theta-induced EDMs, so that even the most precise previous lattice results become consistent with zero. On one hand, this change removes previous disagreements between these lattice results and the phenomenological estimates of the nucleon EDM. On the other hand, the nucleon EDM becomes much harder to compute on a lattice. In addition, I will review the progress in computing quark chromo-EDM-induced nucleon EDM using chiral quark action.

  19. Computing nucleon EDM on a lattice

    Energy Technology Data Exchange (ETDEWEB)

    Abramczyk, Michael; Izubuchi, Taku

    2017-06-18

    I will discuss briefly recent changes in the methodology of computing the baryon EDM on a lattice. The associated correction substantially reduces presently existing lattice values for the proton and neutron theta-induced EDMs, so that even the most precise previous lattice results become consistent with zero. On one hand, this change removes previous disagreements between these lattice results and the phenomenological estimates of the nucleon EDM. On the other hand, the nucleon EDM becomes much harder to compute on a lattice. In addition, I will review the progress in computing quark chromo-EDM-induced nucleon EDM using chiral quark action.

  20. Determining who responds better to a computer- vs. human-delivered physical activity intervention: results from the community health advice by telephone (CHAT) trial

    Science.gov (United States)

    2013-01-01

    Background Little research has explored who responds better to an automated vs. human advisor for health behaviors in general, and for physical activity (PA) promotion in particular. The purpose of this study was to explore baseline factors (i.e., demographics, motivation, interpersonal style, and external resources) that moderate intervention efficacy delivered by either a human or automated advisor. Methods Data were from the CHAT Trial, a 12-month randomized controlled trial to increase PA among underactive older adults (full trial N = 218) via a human advisor or an automated interactive voice response advisor. Trial results indicated significant increases in PA in both interventions by 12 months that were maintained at 18 months. Regression was used to explore moderation of the two interventions. Results Results indicated that amotivation (i.e., lack of intent toward PA) moderated 12-month PA (d = 0.55), whereas the remaining baseline factors did not (p > 0.12). Conclusions Results provide preliminary evidence for generating hypotheses about pathways for supporting later clinical decision-making with regard to the use of human- vs. computer-delivered interventions for PA promotion. PMID:24053756

  1. When probabilistic seismic hazard climbs volcanoes: the Mt. Etna case, Italy – Part 2: Computational implementation and first results

    Directory of Open Access Journals (Sweden)

    L. Peruzza

    2017-11-01

    Full Text Available This paper describes the model implementation and presents results of a probabilistic seismic hazard assessment (PSHA) for the Mt. Etna volcanic region in Sicily, Italy, considering local volcano-tectonic earthquakes. Working in a volcanic region presents new challenges not typically faced in standard PSHA, which are broadly due to the nature of the local volcano-tectonic earthquakes, the cone shape of the volcano and the attenuation properties of seismic waves in the volcanic region. These have been accounted for through the development of a seismic source model that integrates data from different disciplines (historical and instrumental earthquake datasets, tectonic data, etc.; presented in Part 1, by Azzaro et al., 2017) and through the development and software implementation of original tools for the computation, such as a new ground-motion prediction equation and magnitude-scaling relationship specifically derived for this volcanic area, and the capability to account for the surficial topography in the hazard calculation, which influences source-to-site distances. Hazard calculations have been carried out after updating the most recent releases of two widely used PSHA software packages (CRISIS, as in Ordaz et al., 2013; the OpenQuake engine, as in Pagani et al., 2014). Results are computed for short- to mid-term exposure times (10 % probability of exceedance in 5 and 30 years, Poisson and time dependent) and spectral amplitudes of engineering interest. A preliminary exploration of the impact of site-specific response is also presented for the densely inhabited Etna's eastern flank, and the change in expected ground motion is finally commented on. These results do not account for M > 6 regional seismogenic sources, which control the hazard at long return periods; they focus instead on the impact of M < 6 local volcano-tectonic earthquakes, which dominate the hazard at the short- to mid-term exposure times considered.
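For a Poisson occurrence model, the hazard numbers quoted above relate probability of exceedance P, exposure time T, and return period Tr through P = 1 - exp(-T/Tr). A minimal sketch of that bookkeeping (illustrative only, not the paper's calculation):

```python
import math

def return_period(p_exceed, years):
    """Return period Tr implied by a Poisson probability of exceedance
    p_exceed over an exposure time of `years`: P = 1 - exp(-T/Tr)."""
    return -years / math.log(1.0 - p_exceed)

def prob_exceedance(return_period_yr, years):
    """Inverse relation: Poisson probability of exceedance in `years`."""
    return 1.0 - math.exp(-years / return_period_yr)

# 10 % probability of exceedance over the 5- and 30-year exposure times:
tr5 = return_period(0.10, 5)    # ~47.5 years
tr30 = return_period(0.10, 30)  # ~284.7 years
```

This shows why short exposure times probe short return periods: the 5- and 30-year windows correspond to return periods of roughly 47 and 285 years, well below the long return periods controlled by the M > 6 regional sources.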

  2. Previous climatic alterations are caused by the sun

    International Nuclear Information System (INIS)

    Groenaas, Sigbjoern

    2003-01-01

    The article surveys the scientific results of previous research into the sun's contribution to climatic alterations. The author concludes that there is evidence of eight cold periods since the last ice age and that these alterations were largely due to climatic effects of the sun. However, such effects account for only a fraction of the registered global warming; human activities are assumed to contribute the rest of the greenhouse effect.

  3. Comparison of endoscopic ultrasonography and multislice spiral computed tomography for the preoperative staging of gastric cancer - results of a single institution study of 610 Chinese patients.

    Directory of Open Access Journals (Sweden)

    Xing-Yu Feng

    Full Text Available BACKGROUND: This study compared the performance of endoscopic ultrasonography (EUS and multislice spiral computed tomography (MSCT in the preoperative staging of gastric cancer. METHODOLOGY/PRINCIPAL FINDINGS: A total of 610 patients participated in this study, all of whom had undergone surgical resection, had confirmed gastric cancer and were evaluated with EUS and MSCT. Tumor staging was evaluated using the Tumor-Node-Metastasis (TNM staging and Japanese classification. The results from the imaging modalities were compared with the postoperative histopathological outcomes. The overall accuracies of EUS and MSCT for the T staging category were 76.7% and 78.2% (P=0.537, respectively. Stratified analysis revealed that the accuracy of EUS for T1 and T2 staging was significantly higher than that of MSCT (P<0.001 for both and that the accuracy of MSCT in T3 and T4 staging was significantly higher than that of EUS (P<0.001 and 0.037, respectively. The overall accuracy of MSCT was 67.2% when using the 13th edition Japanese classification, and this percentage was significantly higher than the accuracy of EUS (49.3% and MSCT (44.6% when using the 6th edition UICC classification (P<0.001 for both values. CONCLUSIONS/SIGNIFICANCE: Our results demonstrated that the overall accuracies of EUS and MSCT for preoperative staging were not significantly different. We suggest that a combination of EUS and MSCT is required for preoperative evaluation of TNM staging.

  4. Computer security

    CERN Document Server

    Gollmann, Dieter

    2011-01-01

    A completely up-to-date resource on computer security Assuming no previous experience in the field of computer security, this must-have book walks you through the many essential aspects of this vast topic, from the newest advances in software and technology to the most recent information on Web applications security. This new edition includes sections on Windows NT, CORBA, and Java and discusses cross-site scripting and JavaScript hacking as well as SQL injection. Serving as a helpful introduction, this self-study guide is a wonderful starting point for examining the variety of competing sec

  5. Patterns of students' computer use and relations to their computer and information literacy

    DEFF Research Database (Denmark)

    Bundsgaard, Jeppe; Gerick, Julia

    2017-01-01

    Background: Previous studies have shown that there is a complex relationship between students' computer and information literacy (CIL) and their use of information and communication technologies (ICT) for both recreational and school use. Methods: This study seeks to dig deeper into these complex relations by identifying different patterns of students' school-related and recreational computer use in the 21 countries participating in the International Computer and Information Literacy Study (ICILS 2013). Results: Latent class analysis (LCA) of the student questionnaire and performance data from ..., raising important questions about differences in contexts. Keywords: ICILS, Computer use, Latent class analysis (LCA), Computer and information literacy.

  6. [Fatal amnioinfusion with previous choriocarcinoma in a parturient woman].

    Science.gov (United States)

    Hrgović, Z; Bukovic, D; Mrcela, M; Hrgović, I; Siebzehnrübl, E; Karelovic, D

    2004-04-01

    The case of a 36-year-old tercipara who developed choriocarcinoma in a previous pregnancy is described. During her first term labour the patient suffered cardiac arrest, so resuscitation and caesarean section were performed. A male newborn was delivered in good condition, but despite intensive therapy and resuscitation the parturient died with a picture of disseminated intravascular coagulation (DIC). Autopsy and histology showed no sign of malignant disease, so it was not possible to connect the previous choriocarcinoma with the amniotic fluid embolism. The site of the choriocarcinoma may have been a "locus minoris resistentiae" that later resulted in a failure of placentation, though this was hard to prove. On autopsy we found pulmonary embolism with microthrombosis of the terminal circulation and punctiform bleeding in the mucosa, consistent with DIC.

  7. Initial results on computational performance of Intel Many Integrated Core (MIC) architecture: implementation of the Weather and Research Forecasting (WRF) Purdue-Lin microphysics scheme

    Science.gov (United States)

    Mielikainen, Jarno; Huang, Bormin; Huang, Allen H.

    2014-10-01

    The Purdue-Lin scheme is a relatively sophisticated microphysics scheme in the Weather Research and Forecasting (WRF) model. The scheme includes six classes of hydrometeors: water vapor, cloud water, rain, cloud ice, snow and graupel. The scheme is very suitable for massively parallel computation as there are no interactions among horizontal grid points. In this paper, we accelerate the Purdue-Lin scheme using Intel Many Integrated Core Architecture (MIC) hardware. The Intel Xeon Phi is a high-performance coprocessor consisting of up to 61 cores. The Xeon Phi is connected to a CPU via the PCI Express (PCIe) bus. In this paper, we discuss in detail the code optimization issues encountered while tuning the Purdue-Lin microphysics Fortran code for the Xeon Phi. In particular, getting good performance required utilizing multiple cores, using the wide vector operations, and making efficient use of memory. The results show that the optimizations improved performance of the original code on the Xeon Phi 5110P by a factor of 4.2x. Furthermore, the same optimizations improved performance on an Intel Xeon E5-2603 CPU by a factor of 1.2x compared to the original code.
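The kind of restructuring described, replacing per-gridpoint scalar loops with wide vector operations, can be illustrated in NumPy (assuming NumPy is available; this is a language-neutral sketch of the idea, not the WRF Fortran code itself):

```python
import numpy as np

def saturation_scalar(t):
    """Per-gridpoint saturation vapor pressure (Pa), Bolton's formula."""
    return 611.2 * np.exp(17.67 * (t - 273.15) / (t - 29.65))

def loop_version(temps):
    # One grid point at a time: no vectorization, poor use of SIMD units.
    out = np.empty_like(temps)
    for i in range(temps.size):
        out.flat[i] = saturation_scalar(temps.flat[i])
    return out

def vector_version(temps):
    # Whole-array expression: maps onto the coprocessor's wide vector units.
    return 611.2 * np.exp(17.67 * (temps - 273.15) / (temps - 29.65))

# A small 2-D temperature field (K) standing in for a model grid:
temps = np.linspace(250.0, 310.0, 10000).reshape(100, 100)
```

Because the scheme has no horizontal interactions, every grid point is independent, which is exactly what makes this whole-array formulation (and the equivalent Fortran array syntax or OpenMP SIMD loops on the Xeon Phi) legal.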

  8. Prevalence of and risk factors for methamphetamine use in northern Thai youth: results of an audio-computer-assisted self-interviewing survey with urine testing.

    Science.gov (United States)

    Sattah, Martin V; Supawitkul, Somsak; Dondero, Timothy J; Kilmarx, Peter H; Young, Nancy L; Mastro, Timothy D; Chaikummao, Supaporn; Manopaiboon, Chomnad; Griensven, Frits van

    2002-07-01

    Data from drug treatment facilities, drug seizures and drug arrests suggest rapidly increasing methamphetamine use by adolescents in Thailand. However, limited quantitative data are available about the prevalence of its use or correlates of use. The purpose of our study was therefore to estimate the prevalence of methamphetamine use and to identify possible risk factors. Cross-sectional survey using anonymous audio-computer-assisted self-interview and urine specimen analysis. Chiang Rai Province, Thailand. 1725 students, 15-21 years of age (893 male and 832 female) attending one of three vocational schools in Chiang Rai Province. Three hundred and fifty male and 150 female students reported a history of having ever used methamphetamine. In addition, 128 male and 49 female students had positive urine test results, indicating recent methamphetamine use; 27 of these students denied having ever used methamphetamine. According to history, urine test, or both, 41.3% of male students and 19.0% of female students used methamphetamine. In multivariate analysis, methamphetamine use was highly correlated with the use of other substances, sexual activity, peer pressure, positive attitudes toward methamphetamine, and absence of a family confidant. Methamphetamine use is common among adolescent students in northern Thailand. Demographic, behavioral and psychosocial correlates of methamphetamine use identified in this study may be helpful for the design and implementation of preventive interventions.

  9. Estimation of energetic efficiency of heat supply in front of the aircraft at supersonic accelerated flight. Part II. Mathematical model of the trajectory boost part and computational results

    Science.gov (United States)

    Latypov, A. F.

    2009-03-01

    Fuel economy was estimated along the boost trajectory of an aerospace plane with energy supplied to the free stream. Initial and final flight velocities were given. A model of planing flight above cold air in an infinite isobaric thermal wake was used. Fuel consumption was compared along optimal trajectories. The calculations were done for a combined power plant consisting of a ramjet and a liquid-propellant engine. An exergy model was constructed in the first part of the paper for estimating the ramjet thrust and specific impulse. To estimate the aerodynamic drag of the aircraft, a quadratic dependence on aerodynamic lift is used. The energy for flow heating is obtained at the expense of an equivalent decrease in the exergy of the combustion products. Dependencies are obtained for the increase of the range coefficient of cruise flight at different Mach numbers. In the second part of the paper, a mathematical model is presented for the boost part of the flight trajectory of the flying vehicle, together with computational results on reducing fuel expenses along the boost trajectory for a given value of the energy supplied in front of the aircraft.

  10. Kidnapping Detection and Recognition in Previous Unknown Environment

    Directory of Open Access Journals (Sweden)

    Yang Tian

    2017-01-01

    Full Text Available An unnoticed event referred to as kidnapping makes the estimation result of localization incorrect. In a previously unknown environment, an incorrect localization result causes an incorrect mapping result in Simultaneous Localization and Mapping (SLAM) by kidnapping. In this situation, the explored area and unexplored area are divided, making kidnapping recovery difficult. To provide sufficient information on kidnapping, a framework to judge whether kidnapping has occurred and to identify the type of kidnapping with filter-based SLAM is proposed. The framework is called double kidnapping detection and recognition (DKDR) and performs two checks, before and after the "update" process, with different metrics in real time. To explain one of the principles of DKDR, we describe a property of filter-based SLAM that corrects the mapping result of the environment using the current observations after the "update" process. Two classical filter-based SLAM algorithms, Extended Kalman Filter (EKF) SLAM and Particle Filter (PF) SLAM, are modified to show that DKDR can be simply and widely applied in existing filter-based SLAM algorithms. Furthermore, a technique to determine the adapted thresholds of the metrics in real time without prior data is presented. Both simulated and experimental results demonstrate the validity and accuracy of the proposed method.
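The double-check idea, comparing observation against estimate both before and after the filter update, can be sketched very schematically. The 1-D state, absolute-difference metric and fixed thresholds below are hypothetical stand-ins for the paper's metrics and adaptive thresholds:

```python
def innovation(observed, predicted):
    """Distance between what the robot observes and what its estimate predicts."""
    return abs(observed - predicted)

def detect_kidnapping(obs, pred_before, pred_after,
                      thresh_before=1.0, thresh_after=1.0):
    """Flag kidnapping when the observation disagrees with the state
    estimate both before AND after the filter 'update' step.
    A mismatch only before the update can be ordinary noise that the
    update corrects; a persistent mismatch suggests the robot was moved."""
    check1 = innovation(obs, pred_before) > thresh_before
    check2 = innovation(obs, pred_after) > thresh_after
    return check1 and check2

# Normal operation: observation close to prediction.
normal = detect_kidnapping(5.1, 5.0, 5.05)    # False
# Sudden large mismatch that survives the update step.
kidnapped = detect_kidnapping(12.0, 5.0, 5.5)  # True
```

In a real EKF or PF implementation the innovation would be a Mahalanobis distance or particle-weight statistic, and the thresholds would be adapted online, as the paper describes.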

  11. Computational chemistry

    Science.gov (United States)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and of modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and in a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials, with and without the presence of gases. Computational chemistry has application in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  12. The personality trait of behavioral inhibition modulates perceptions of moral character and performance during the trust game: behavioral results and computational modeling

    Directory of Open Access Journals (Sweden)

    Milen L. Radell

    2016-02-01

    Full Text Available Decisions based on trust are critical for human social interaction. We judge the trustworthiness of partners in social interactions based on a number of partner characteristics as well as experiences with those partners. These decisions are also influenced by personality. The current study examined how the personality trait of behavioral inhibition, which involves the tendency to avoid or withdraw from novelty in both social and non-social situations, is related to explicit ratings of trustworthiness as well as decisions made in the trust game. In the game, healthy young adults interacted with three fictional partners who were portrayed as trustworthy, untrustworthy or neutral through biographical information. Participants could choose to keep $1 or send $3 of virtual money to a partner. The partner could then choose to send $1.5 back to the participant or to keep the entire amount. On any trial in which the participant chose to send, the partner always reciprocated with 50% probability, irrespective of how that partner was portrayed in the biography. Behavioral inhibition was assessed through a self-report questionnaire. Finally, a reinforcement learning computational model was fit to the behavior of each participant. Self-reported ratings of trust confirmed that all participants, irrespective of behavioral inhibition, perceived differences in the moral character of the three partners (trustworthiness of good > neutral > bad partner). Decisions made in the game showed that inhibited participants tended to trust the neutral partner less than uninhibited participants. In contrast, this was not reflected in the ratings of the neutral partner (either pre- or post-game), indicating a dissociation between ratings of trustworthiness and decisions made by inhibited participants. Computational modeling showed that this was due to lower initial trust of the neutral partner rather than a higher learning rate associated with loss, suggesting an implicit bias.
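The kind of reinforcement-learning account described, an initial-trust parameter updated by prediction errors, can be sketched with a simple delta rule. Parameter names and values here are illustrative, not the authors' fitted model:

```python
def simulate_trust(outcomes, v0=0.5, alpha=0.2):
    """Delta-rule value update for one trust-game partner.
    v0       : initial trust (expected probability of reciprocation)
    alpha    : learning rate
    outcomes : 1 if the partner reciprocated on a trial, 0 if not
    Returns the trajectory of trust values, starting at v0."""
    v = v0
    history = [v]
    for r in outcomes:
        v = v + alpha * (r - v)  # prediction-error update
        history.append(v)
    return history

# Same 50%-reciprocation experience, different initial trust: a lower v0
# (as modeled for inhibited participants facing the neutral partner)
# keeps the trust value lower throughout the same outcome sequence.
outcomes = [1, 0, 1, 1, 0, 1]
high = simulate_trust(outcomes, v0=0.6)
low = simulate_trust(outcomes, v0=0.3)
```

Because the update is a linear contraction toward the observed outcome, the two trajectories never cross, which is how a lower initial value alone (with an unchanged learning rate) can explain persistently lower trusting behavior.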

  13. Evaluation of effectiveness of a computer system (CAD) in the identification of lung nodules with low-dose MSCT: scanning technique and preliminary results

    International Nuclear Information System (INIS)

    Fraioli, Francesco; Catalano, Carlo; Almberger, Maria; Bertoletti, Linda; Cantisani, Vito; Danti, Massimiliano; Pediconi, Federica; Passariello, Roberto

    2005-01-01

    Purpose: Evaluation of the effectiveness of a computer-aided diagnosis (CAD) system in the identification of pulmonary nodules. Materials and methods: Two observers (A1, A2) with different levels of experience independently evaluated 20 chest MSCT studies with and without the aid of a CAD system (LungCheck, R2 Technology, Inc.). The study parameters were as follows: 140 kV, 40 mAs, collimation 4x1 mm, slice thickness 1.25 mm, reconstruction interval 1.0 mm. The observers analysed the images with and without CAD and evaluated: 1) nodule size (longer axis); 2) number and location of nodules; 3) reading time for each observer. The gold standard was the evaluation of both readers in consensus with the aid of the CAD system. Results: Without CAD support the two readers identified 77 (A1) and 79 (A2) nodules, and with CAD 81 (A1) and 82 (A2) nodules. Working in consensus, the two observers identified 81 nodules without the aid of CAD and 84 nodules with it. The total number of nodules identified by CAD was 104, of which 25 were false positives, with 5 false negatives. The average reading time with the aid of CAD decreased by as much as 40% for both observers. Conclusions: The preliminary results of our study suggest that CAD is an accurate automatic support tool for the identification of pulmonary nodules. It reduces reading time and automatically supplies the size, volume, density and number of nodules, and is thus useful both in screening programmes and in the follow-up of cancer patients, in whom comparison of the images is particularly difficult.
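From the counts reported (104 CAD detections, of which 25 were false positives, with 5 false negatives), the stand-alone per-nodule sensitivity of the CAD system follows directly; a back-of-the-envelope check, not the authors' analysis:

```python
detections = 104   # nodule candidates flagged by the CAD system
false_pos = 25     # candidates that were not real nodules
false_neg = 5      # real nodules the CAD system missed

true_pos = detections - false_pos        # 79 correctly flagged nodules
actual_nodules = true_pos + false_neg    # 84, matching the consensus reading
sensitivity = true_pos / actual_nodules  # ~0.94
ppv = true_pos / detections              # ~0.76 positive predictive value
```

The figures are internally consistent: the 84 actual nodules implied by the CAD counts equal the 84 found by the two readers in consensus with CAD, which is the study's gold standard.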

  14. Comparison of intra-aortic computed tomography angiography to conventional angiography in the presurgical visualization of the Adamkiewicz artery: first results in patients with thoracoabdominal aortic aneurysms

    International Nuclear Information System (INIS)

    Clarencon, Frederic; Maria, Federico di; Cormier, Evelyne; Sourour, Nader; Gabrieli, Joseph; Iosif, Christina; Chiras, Jacques; Gaudric, Julien; Koskas, Fabien; Jenny, Catherine

    2013-01-01

    The aim of this study was to compare the sensitivity of intra-aortic computed tomography angiography (IA-CTA) to that of regular spinal digital subtraction angiography for the presurgical location of the Adamkiewicz artery (AKA). Thirty patients (21 males, 9 females; mean age 64 years) underwent IA-CTA for location of the AKA before surgery for an aneurysm (n = 24) or dissection (n = 6) of the thoracoabdominal aorta. After femoral artery puncture, a pigtail catheter was positioned at the origin of the descending aorta. CT acquisition was performed with an intra-aortic iodinated contrast media injection (15 mL/s, 120 mL). The visualization of the AKA and the location of the feeder(s) to the AKA were independently evaluated by two observers. Interrater agreement was calculated using a kappa test. A spinal angiogram by selective catheterization was systematically performed to confirm the results of the IA-CTA. The AKA was visualized by IA-CTA in 27/30 cases (90 %); in 26/31 (84 %) cases, the continuity with the aorta was satisfactorily seen. Interrater agreement was good for the visualization of the AKA and its feeder(s): 0.625 and 0.87, respectively. In 75 % of the cases in which the AKA was visualized, selective catheterization confirmed the results of the IA-CTA. In the remaining 25 % of the cases, selective catheterization could not be performed due to marked vessel tortuosity or ostium stenosis. IA-CTA is a technique feasible in daily practice that presents good sensitivity for the location of the AKA. (orig.)

  15. Monitoring the Microgravity Environment Quality On-board the International Space Station Using Soft Computing Techniques. Part 2; Preliminary System Performance Results

    Science.gov (United States)

    Jules, Kenol; Lin, Paul P.; Weiss, Daniel S.

    2002-01-01

    This paper presents the preliminary performance results of the artificial intelligence monitoring system in full operational mode using near real-time acceleration data downlinked from the International Space Station. A preliminary microgravity environment characterization analysis for the International Space Station (Increment-2), obtained with the monitoring system, is presented, together with a comparison between the system's predicted performance based on ground test data for the US laboratory "Destiny" module and its actual on-orbit performance using measured acceleration data from that module. Finally, preliminary on-orbit disturbance magnitude levels are presented for the Experiment of Physics of Colloids in Space and compared with ground test data acquired at the Microgravity Emission Laboratory, NASA Glenn Research Center, Cleveland, Ohio. The monitoring system was developed by the NASA Glenn Principal Investigator Microgravity Services Project to help principal investigator teams identify the primary vibratory disturbance sources that are active, at any moment in time, on board the International Space Station and that might impact the microgravity environment their experiments are exposed to. From the Principal Investigator Microgravity Services web site, the teams can monitor in near real time, via a dynamic graphical display implemented in Java, which events are on, such as crew activities, pumps, fans, centrifuges, compressors, crew exercise and structural modes, and decide whether or not to run their experiments, whenever that is an option, based on the acceleration magnitude and frequency sensitivity associated with each experiment. This monitoring system detects primarily the vibratory disturbance sources. The system has built-in capability to detect both known

  16. Development and Flight Results of a PC104/QNX-Based On-Board Computer and Software for the YES2 Tether Experiment

    Science.gov (United States)

    Spiliotopoulos, I.; Mirmont, M.; Kruijff, M.

    2008-08-01

    This paper highlights the flight preparation and mission performance of a PC104-based On-Board Computer for ESA's second Young Engineers' Satellite (YES2), with additional attention to the flight software design and to experience with QNX as a multi-process real-time operating system. This combination of Commercial-Off-The-Shelf (COTS) technologies is an accessible option for small satellites with high computational demands.

  17. Multiparty Computations

    DEFF Research Database (Denmark)

    Dziembowski, Stefan

    In this thesis we study the problem of doing Verifiable Secret Sharing (VSS) and Multiparty Computations in a model where private channels between the players and a broadcast channel are available. The adversary is active, adaptive and has unbounded computing power. The thesis is based on two … up to a polynomial-time black-box reduction, the complexity of adaptively secure VSS is the same as that of ordinary secret sharing (SS), where security is only required against a passive, static adversary. Previously, such a connection was only known for linear secret sharing and VSS schemes. We then show … here and discuss other problems caused by the adaptiveness. All protocols in the thesis are formally specified and proofs of their security are given. [1] Ronald Cramer, Ivan Damgård, Stefan Dziembowski, Martin Hirt, and Tal Rabin. Efficient multiparty computations with dishonest minority …
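The thesis builds on ordinary secret sharing (SS). As background, Shamir's classic polynomial scheme — the base construction that VSS protocols extend with verifiability — can be sketched in a few lines; the prime modulus and parameters below are illustrative, not taken from the thesis:

```python
import random

P = 2**127 - 1  # a Mersenne prime, used as the field modulus (illustrative choice)

def make_shares(secret, k, n):
    """Split `secret` into n shares, any k of which reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    # share for player x is the degree-(k-1) polynomial evaluated at x
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term (the secret)."""
    total = 0
    for j, (xj, yj) in enumerate(shares):
        num = den = 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        total = (total + yj * num * pow(den, P - 2, P)) % P  # Fermat inverse
    return total
```

Plain Shamir sharing is only secure against a passive adversary; the VSS schemes discussed in the thesis add verification so that an active adversary cannot distribute inconsistent shares.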

  18. Evaluation of the bond resistance between metal and ceramics resulting from different previous oxidation times

    Directory of Open Access Journals (Sweden)

    Stefan Fiuza de Carvalho DEKON

    1999-01-01

    Full Text Available The objective of this research was to evaluate porcelain-alloy bonding strength using a Ni-Cr alloy submitted to different pre-oxidation times with the Vita-VMK ceramic system, using the test preconized by CHIODI NETTO. The results led to the following conclusions: the control group (no pre-oxidation) showed the best values, a statistically significant difference compared with the other groups; the different pre-oxidation times markedly reduced the values obtained and were similar to each other; and the group submitted to sandblasting after a five-minute pre-oxidation showed values similar to those of the groups treated with pre-oxidation and no subsequent sandblasting.

  19. Computing in Hydraulic Engineering Education

    Science.gov (United States)

    Duan, J. G.

    2011-12-01

    Civil engineers, pioneers of our civilization, are rarely perceived as leaders and innovators in modern society because the profession has lagged in technology innovation. This has contributed to a decline in the prestige of the civil engineering profession, reduced federal funding for deteriorating infrastructure, and difficulty attracting the most talented high-school students. Infusing cutting-edge computer technology and stimulating creativity and innovation are therefore critical challenges for civil engineering education. To better prepare our graduates to innovate, this paper discusses the adoption of problem-based collaborative learning and the integration of civil engineering computing into a traditional civil engineering curriculum. Three interconnected courses, Open Channel Flow, Computational Hydraulics, and Sedimentation Engineering, were developed with an emphasis on computational simulation. Open Channel Flow focuses on the principles of free-surface flow and the application of computational models, preparing students for the second course, Computational Hydraulics, which introduces the fundamental principles of computational hydraulics, including finite difference and finite element methods. This course complements Open Channel Flow by giving students an in-depth understanding of computational methods. The third course, Sedimentation Engineering, covers the fundamentals of sediment transport and river engineering, so students can apply the knowledge and programming skills gained in the previous courses to develop computational models for simulating sediment transport. These courses effectively equipped students with the skills and knowledge needed to complete thesis and dissertation research.
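As an example of the kind of computational exercise such an Open Channel Flow course might assign, here is a hedged sketch of finding the normal depth of a rectangular channel from Manning's equation by bisection; the channel width, roughness, slope, and discharge are made-up values, not from the paper:

```python
import math

def manning_discharge(y, b, n, S):
    """Discharge (m^3/s) in a rectangular channel of width b (m) at depth y (m),
    Manning roughness n and bed slope S, from Q = (1/n) A R^(2/3) sqrt(S)."""
    A = b * y               # flow area
    R = A / (b + 2 * y)     # hydraulic radius = area / wetted perimeter
    return (1.0 / n) * A * R ** (2.0 / 3.0) * math.sqrt(S)

def normal_depth(Q, b, n, S, lo=1e-6, hi=50.0):
    """Normal depth by bisection; discharge grows monotonically with depth."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if manning_discharge(mid, b, n, S) < Q:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Bisection is chosen here because Manning's discharge is monotone in depth, so the root is bracketed and convergence is guaranteed.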

  20. The Next Step in Deployment of Computer Based Procedures For Field Workers: Insights And Results From Field Evaluations at Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna; Le Blanc, Katya L.; Bly, Aaron

    2015-02-01

    The paper-based procedures currently used for nearly all activities in the commercial nuclear power industry have a long history of ensuring safe operation of the plants. However, there is potential to greatly increase efficiency and safety by improving how the human operator interacts with the procedures. One way to achieve these improvements is through the use of computer-based procedures (CBPs). A CBP system offers a wide variety of improvements, such as context-driven job aids, integrated human performance tools (e.g., placekeeping, correct component verification, etc.), and dynamic step presentation. The latter means that the CBP system displays only the steps relevant to the operating mode, plant status, and the task at hand. A dynamic presentation of the procedure (also known as context-sensitive procedures) guides the operator down the path of relevant steps based on the current conditions. This feature reduces the operator's workload and inherently reduces both the risk of incorrectly marking a step as not applicable and the risk of performing a step that should have been marked as not applicable. The research team at the Idaho National Laboratory has developed a prototype CBP system for field workers, which has been evaluated from a human factors and usability perspective in four laboratory studies. Based on the results of each study, revisions were made to the CBP system. However, a crucial step toward gaining the end users' (e.g., auxiliary operators, maintenance technicians, etc.) acceptance is to put the system in their hands and let them use it as part of their everyday work activities. In the spring of 2014 the first field evaluation of the INL CBP system was conducted at a nuclear power plant. Auxiliary operators conduct a functional test of one of three backup air compressors each week. During the field evaluation activity, one auxiliary operator conducted the test with the paper-based procedure while a second auxiliary operator

  1. Complex tibial plateau fractures treated by hybrid external fixation system: A correlation of followup computed tomography derived quality of reduction with clinical results

    Directory of Open Access Journals (Sweden)

    Konstantinos Kateros

    2018-01-01

    Full Text Available Background: Tibial plateau fractures are common and result from high-energy injuries. The principles of treatment include respect for the soft tissues, restoration of the congruity of the articular surface, and restoration of the anatomic alignment of the lower limb to enable early movement of the knee joint. Various surgical fixation methods can achieve these principles of treatment. Recognition of the particular fracture pattern is important, as this guides the surgical approach required to adequately stabilize the fracture. This study evaluates the results of combined treatment with an external fixator and limited internal fixation, along with the advantages of using postoperative computed tomography (CT) scanning after implant removal. Materials and Methods: 55 patients with a mean age of 42 years (range 17–65 years) with tibial plateau fracture were managed in our institution between October 2010 and September 2013. Twenty fractures were classified as Schatzker VI and 35 as Schatzker V. There were 8 open fractures (2 Gustilo-Anderson 3A and 6 Gustilo-Anderson 2). All fractures were treated with closed reduction and hybrid external fixation (n = 21/38.2%) or with minimal open reduction internal fixation and a hybrid system (n = 34/61.8%). After removal of the fixators, a CT scan was scheduled for all cases for correlation with the results. At final followup, the American Knee Society Score (AKSS) was administered. Results: All patients were evaluated with a minimum of 12 months (range 12–21 months) followup. Average time to union was 15.5 weeks (range 13–19 weeks). The postoperative joint congruity as evaluated on the postoperative CT scan was 5° in 19 cases (35%). Patients with residual joint depression 4.5 mm displayed a 100% chance of getting poor-fair scores in both the AKSS knee and AKSS function scores. The association of a postoperative mechanical axis within 5° of the contralateral limb and improved knee scores was statistically

  2. Five-year clinical and functional multislice computed tomography angiographic results after coronary implantation of the fully resorbable polymeric everolimus-eluting scaffold in patients with de novo coronary artery disease

    DEFF Research Database (Denmark)

    Onuma, Yoshinobu; Dudek, Dariusz; Thuesen, Leif

    2013-01-01

    This study sought to demonstrate the 5-year clinical and functional multislice computed tomography angiographic results after implantation of the fully resorbable everolimus-eluting scaffold (Absorb BVS, Abbott Vascular, Santa Clara, California).

  3. Analysis of previous screening examinations for patients with breast cancer

    International Nuclear Information System (INIS)

    Lee, Eun Hye; Cha, Joo Hee; Han, Dae Hee; Choi, Young Ho; Hwang, Ki Tae; Ryu, Dae Sik; Kwak, Jin Ho; Moon, Woo Kyung

    2007-01-01

    We wanted to improve the quality of subsequent screening by reviewing the previous screening examinations of breast cancer patients. Twenty-four breast cancer patients who had undergone previous screening were enrolled. All 24 had mammograms and 15 also had sonograms. We retrospectively reviewed the screening examinations according to the BI-RADS criteria and categorized the results into false negative, true negative, true positive and occult cancers. We also categorized the causes of false negative cancers into misperception, misinterpretation and technical factors, and then analyzed the contributing factors. Review of the previous screening revealed 66.7% (16/24) false negative, 25.0% (6/24) true negative, and 8.3% (2/24) true positive cancers. False negative cancers were caused by the mammogram in 56.3% (9/16) and by the sonogram in 43.7% (7/16) of cases. Among the false negative cases, all misperceptions were related to mammograms and were attributed to dense breasts, lesions located at the edge of glandular tissue or of the image, and findings seen on one view only. Almost all misinterpretations were related to sonograms and attributed to loose application of the final assessment. To improve the quality of breast screening, it is essential to overcome the main causes of false negative examinations, namely misperception and misinterpretation. We need systematic education and strict application of the final assessment categories of BI-RADS. For effective communication among physicians, it is also necessary to educate them properly about BI-RADS.

  4. [Indication for limited surgery on small lung cancer tumors measuring 1cm or less in diameter on preoperative computed tomography and long-term results].

    Science.gov (United States)

    Togashi, K; Koike, T; Emura, I; Usuda, H

    2008-07-01

    Non-invasive lung cancers show a good prognosis after limited surgery, but the outcome for invasive lung cancers is still uncertain. We investigated the indications for limited surgery for small lung cancer tumors measuring 1 cm or less in diameter on preoperative computed tomography (CT). This study retrospectively analyzed 1,245 patients who underwent complete resection of lung cancer between 1989 and 2004 in our hospital. Sixty-two patients (5%) had tumors measuring 1 cm or less in diameter. The probability of survival was calculated using the Kaplan-Meier method. All tumors were detected at medical checkups; 52% of the patients had no definitive diagnosis of lung cancer before surgery. Adenocarcinoma was histologically diagnosed in 49 patients (79%). Other histologic types included squamous cell carcinoma (8), large cell carcinoma (1), small cell carcinoma (1), carcinoid (2), and adenosquamous cell carcinoma (1). Fifty-seven patients (92%) had pathologic stage IA disease; the other stages were IB (2), IIA (1), and IIIB (2). There were 14 bronchioloalveolar carcinomas (25% of the stage IA diseases). The 5-year survival rate of stage IA patients was 90%. The 5-year survival rate of patients with tumors measuring 1 cm or less in diameter was 91% after lobectomy or pneumonectomy, and 90% after wedge resection or segmentectomy. There were 3 deaths from cancer recurrence, while there were no deaths among the 14 patients with bronchioloalveolar carcinoma. After limited surgery, non-invasive cancer showed good long-term results, while invasive cancer showed a recurrence rate of 2.3% to 79% even when the tumor measured 1 cm or less in diameter on preoperative CT.
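The Kaplan-Meier method mentioned above estimates a survival curve as a running product over event times. A minimal sketch (the follow-up times and event indicators below are invented, not the study's data):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve.
    times: follow-up time for each patient.
    events: 1 = death observed at that time, 0 = censored.
    Returns (time, survival probability) pairs at each event time."""
    survival = 1.0
    curve = []
    for t in sorted(set(times)):
        d = sum(1 for ti, e in zip(times, events) if ti == t and e == 1)  # deaths at t
        n = sum(1 for ti in times if ti >= t)                             # still at risk
        if d > 0:
            survival *= 1 - d / n   # multiply by the conditional survival at t
            curve.append((t, survival))
    return curve
```

Censored patients (events = 0) contribute to the at-risk count until their follow-up ends but never produce a step in the curve.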

  5. Screening for early lung cancer with low-dose spiral computed tomography: results of annual follow-up examinations in asymptomatic smokers

    International Nuclear Information System (INIS)

    Diederich, Stefan; Thomas, Michael; Semik, Michael; Lenzen, Horst; Roos, Nikolaus; Weber, Anushe; Heindel, Walter; Wormanns, Dag

    2004-01-01

    The aim of this study was analysis of incidence results in a prospective one-arm feasibility study of lung cancer screening with low-radiation-dose spiral computed tomography in heavy smokers. Eight hundred seventeen smokers (≥40 years, ≥20 pack years of smoking history) underwent baseline low-dose CT. Biopsy was recommended in nodules >10 mm with CT morphology suggesting malignancy. In all other lesions follow-up with low-dose CT was recommended. Annual repeat CT was offered to all study participants. Six hundred sixty-eight (81.8%) of the 817 subjects underwent annual repeat CT with a total of 1735 follow-up years. Follow-up of non-calcified nodules present at baseline CT demonstrated growth in 11 of 792 subjects. Biopsy was performed in 8 of 11 growing nodules 7 of which represented lung cancer. Of 174 new nodules, 3 represented lung cancer. The 10 screen-detected lung cancers were all non-small cell cancer (6 stage IA, 1 stage IB, 1 stage IIIA, 2 stage IV). Five symptom-diagnosed cancers (2 small cell lung cancer: 1 limited disease, 1 extensive disease, 3 central/endobronchial non-small cell lung cancer, 2 stage IIIA, 1 stage IIIB) were diagnosed because of symptoms in the 12-month interval between two annual CT scans. Incidence of lung cancer was lower than prevalence, screen-detected cancers were smaller, and stage I was found in 70% (7 of 10) of screen-detected tumors. Only 27% (4 of 15) of invasive procedures was performed for benign lesions; however, 33% (5 of 15) of all cancers diagnosed in the population were symptom-diagnosed cancers (3 central NSCLC, all stage III, 2 SCLC) demonstrating the limitations of CT screening. (orig.)

  6. The impact of round window vs cochleostomy surgical approaches on interscalar excursions in the cochlea: Preliminary results from a flat-panel computed tomography study

    Directory of Open Access Journals (Sweden)

    Nicole T. Jiam

    2016-09-01

    Full Text Available Objective: To evaluate the incidence of interscalar excursions between round window (RW) and cochleostomy approaches for cochlear implant (CI) insertion. Methods: This was a retrospective case comparison. Flat-panel CT (FPCT) scans of 8 CI users with Med-El standard-length electrode arrays were collected. The surgical technique was identified by a combination of operative notes and FPCT imaging. Four cochleae underwent round window insertion and 4 cochleae underwent cochleostomy approaches anterior and inferior to the round window. Results: In our pilot study, cochleostomy approaches were associated with a higher likelihood of interscalar excursion. Within the cochleostomy group, we found 29% of electrode contacts (14 of 48 electrodes) to be outside the scala tympani. On the other hand, 8.5% of the electrode contacts (4 of 47 electrodes) in the round window insertion group were extra-scalar to the scala tympani. These displacements occurred at a mean angle of occurrence of 364° ± 133°, near the apex of the cochlea. Round window electrode displacements tended to localize at angles of occurrence of 400° or greater; cochleostomy electrode displacements occurred at angles of occurrence of 19°–490°. Conclusions: Currently, the optimal surgical approach for standard CI electrode insertion is highly debated, partly owing to a lack of post-operative assessment of intracochlear electrode contacts. Based on our preliminary findings, the cochleostomy approach is associated with an increased likelihood of interscalar excursions, and these findings should be further evaluated in future prospective studies. Keywords: Cochlear implantation, Round window insertion, Cochleostomy, Interscalar excursion, Electrode position, Flat-panel computed tomography, Surgical approach

  7. Noninvasive assessment of coronary artery disease by multislice spiral computed tomography using a new retrospectively ECG-gated image reconstruction technique. Comparison with angiographic results

    Energy Technology Data Exchange (ETDEWEB)

    Sato, Yuichi; Matsumoto, Naoya; Kato, Masahiko [Nihon Univ., Tokyo (Japan). Surugadai Hospital] [and others]

    2003-04-01

    The present study was designed to investigate the accuracy of multislice spiral computed tomography (MSCT) in detecting coronary artery disease, compared with coronary angiography (CAG), using a new retrospectively ECG-gated reconstruction method that reduces cardiac motion artifact. The study group comprised 54 consecutive patients undergoing MSCT and CAG. MSCT was performed using a SOMATOM Volume Zoom (4-detector-row, Siemens, Germany) with slice thickness 1.0 mm, pitch 1.5 (table feed: 1.5 mm per rotation) and gantry rotation time 500 ms. Metoprolol (20-60 mg) was administered orally prior to MSCT imaging. ECG-gated image reconstruction was performed with the reconstruction window (250 ms) positioned immediately before atrial contraction in order to reduce the cardiac motion artifact caused by the abrupt diastolic ventricular movement occurring during the rapid filling and atrial contraction periods. Following inspection of the volume rendering images, multiplanar reconstruction images and axial images of the left main coronary artery (LMCA), left anterior descending artery (LAD), left circumflex artery (LCx) and right coronary artery (RCA) were obtained and evaluated for luminal narrowing. The results were compared with those obtained by CAG. Of 216 coronary arteries, 206 (95.4%) were assessable; 10 arteries were excluded from the analysis because of severe calcification (n=4), stents (n=3) or insufficient contrast enhancement (n=3). The sensitivity for detecting coronary stenoses ≥50% was 93.5% and the specificity for defining luminal narrowing <50% was 97.2%. The positive predictive value and the negative predictive value were 93.5% and 97.2%, respectively. The sensitivity was still satisfactory (80.6%) even when non-assessable arteries were included in the analysis. The new retrospectively ECG-gated reconstruction method for MSCT has excellent diagnostic accuracy in detecting significant coronary artery stenoses. (author)
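The sensitivity, specificity, and predictive values quoted above follow from the standard 2×2 confusion-matrix definitions. A minimal sketch (the counts below are made up for illustration, not the study's data):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard screening-test metrics from a 2x2 confusion matrix."""
    return {
        "sensitivity": tp / (tp + fn),   # true positive rate
        "specificity": tn / (tn + fp),   # true negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# hypothetical counts for illustration
m = diagnostic_metrics(tp=80, fp=10, tn=90, fn=20)
```

Note that PPV and NPV, unlike sensitivity and specificity, depend on disease prevalence in the studied population.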

  8. Pancreatic gross tumor volume contouring on computed tomography (CT) compared with magnetic resonance imaging (MRI): Results of an international contouring conference.

    Science.gov (United States)

    Hall, William A; Heerkens, Hanne D; Paulson, Eric S; Meijer, Gert J; Kotte, Alexis N; Knechtges, Paul; Parikh, Parag J; Bassetti, Michael F; Lee, Percy; Aitken, Katharine L; Palta, Manisha; Myrehaug, Sten; Koay, Eugene J; Portelance, Lorraine; Ben-Josef, Edgar; Erickson, Beth A

    Accurate identification of the gross tumor volume (GTV) in pancreatic adenocarcinoma is challenging. We sought to understand differences in GTV delineation using pancreatic computed tomography (CT) compared with magnetic resonance imaging (MRI). Twelve attending radiation oncologists were convened for an international contouring symposium. All participants had a clinical and research interest in pancreatic adenocarcinoma. CT and MRI scans from 3 pancreatic cases were used for contouring. CT and MRI GTVs were analyzed and compared. Interobserver variability was compared using Dice's similarity coefficient (DSC), Hausdorff distances, and Jaccard indices. Mann-Whitney tests were used to check for significant differences. Consensus contours on CT and MRI scans and constructed count maps were used to visualize the agreement. Agreement regarding the optimal method to determine GTV definition using MRI was reached. Six contour sets (3 from CT and 3 from MRI) were obtained and compared for each observer, totaling 72 contour sets. The mean volume of contours on CT was significantly larger at 57.48 mL compared with a mean of 45.76 mL on MRI, P = .011. The standard deviation obtained from the CT contours was significantly larger than the standard deviation from the MRI contours (P = .027). The mean DSC was 0.73 for the CT and 0.72 for the MRI (P = .889). The conformity index measurement was similar for CT and MRI (P = .58). Count maps were created to highlight differences in the contours from CT and MRI. Using MRI as a primary image set to define a pancreatic adenocarcinoma GTV resulted in smaller contours compared with CT. No differences in DSC or the conformity index were seen between MRI and CT. A stepwise method is recommended as an approach to contour a pancreatic GTV using MRI. Copyright © 2017 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
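Dice's similarity coefficient and the Jaccard index used to compare the contours reduce to simple set-overlap ratios. A minimal sketch over voxel-index sets (the toy "contours" below are illustrative rectangles, not the study's data):

```python
def dice(a, b):
    """Dice similarity coefficient between two voxel-index sets: 2|A∩B| / (|A|+|B|)."""
    return 2 * len(a & b) / (len(a) + len(b))

def jaccard(a, b):
    """Jaccard index between two voxel-index sets: |A∩B| / |A∪B|."""
    return len(a & b) / len(a | b)

# toy contours as sets of (x, y) voxel indices (illustrative only)
ct_gtv  = {(x, y) for x in range(0, 10) for y in range(0, 10)}   # 100 voxels
mri_gtv = {(x, y) for x in range(2, 10) for y in range(0, 10)}   # 80 voxels, nested
```

The two measures are monotonically related (Jaccard = Dice / (2 − Dice)), which is why studies reporting both rarely reach different conclusions from them.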

  9. MCNP HPGe detector benchmark with previously validated Cyltran model.

    Science.gov (United States)

    Hau, I D; Russ, W R; Bronson, F

    2009-05-01

    An exact copy of the detector model generated for Cyltran was reproduced as an MCNP input file, and the detection efficiency was calculated following the methodology used in previous experimental measurements and simulations of a 280 cm³ HPGe detector. Below 1000 keV the MCNP data agreed with the Cyltran results to within 0.5%, while above this energy the difference between MCNP and Cyltran increased to about 6% at 4800 keV, depending on the electron cut-off energy.
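The agreement figures quoted (within 0.5% below 1000 keV, about 6% at 4800 keV) correspond to a simple relative difference between the two codes' efficiency curves. As a sketch, with invented efficiency values standing in for the benchmark data:

```python
# hypothetical detection efficiencies at a few gamma energies (keV), for illustration
energies    = [122, 662, 1000, 4800]
eff_cyltran = [0.0520, 0.0210, 0.0150, 0.0042]
eff_mcnp    = [0.0521, 0.0209, 0.0150, 0.0044]

# percent relative difference of MCNP with respect to the Cyltran reference
rel_diff_pct = [abs(m - c) / c * 100 for m, c in zip(eff_mcnp, eff_cyltran)]
```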

  10. Comparing between predicted output temperature of flat-plate solar collector and experimental results: computational fluid dynamics and artificial neural network

    Directory of Open Access Journals (Sweden)

    F Nadi

    2017-05-01

    Full Text Available Introduction: The significance of solar energy as a renewable energy source, clean and without damage to the environment, for the production of electricity and heat is great. Furthermore, owing to the oil crisis, as well as its potential to reduce home heating costs by 70%, solar energy has been a favorite of many researchers over the past two decades. Solar collectors are devices for collecting solar radiant energy: this energy is converted into heat, which is then transferred to a fluid (usually air or water). A key component of performance improvement in a solar heating system is therefore optimization of the solar collector under different testing conditions. However, estimating output parameters under different testing conditions is costly, time consuming and often impossible. As a result, the smart use of neural networks, as well as of CFD (computational fluid dynamics), to predict the properties from which the desired output would be obtained is valuable. To the best of our knowledge, there are no studies that compare experimental results with both CFD and ANN. Materials and Methods: A corrugated galvanized iron sheet 2 m long, 1 m wide and 0.5 mm thick was used as an absorber plate for absorbing the incident solar radiation (Fig. 1 and 2). The corrugations in the absorber induced turbulent air flow and improved the heat transfer coefficient. The k-ε turbulence model was used for the computational fluid dynamics simulation. The following assumptions are made in the analysis: (1) air is a continuous, incompressible medium; (2) the flow is steady and turbulent, owing to the high flow velocity; (3) the thermo-physical properties of the absorber sheet and the absorber tube are constant with respect to the operating temperature; (4) the bottom side of the absorber tube and the absorber plate is assumed to be adiabatic.
Artificial neural network: In this research, a one-hidden-layer feed-forward network based on the
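The record does not specify the network further; a minimal NumPy sketch of a one-hidden-layer feed-forward network trained by gradient descent is shown below. The input features (irradiance, ambient temperature, air mass flow), the synthetic target, and all hyperparameters are invented for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
# hypothetical inputs: [solar irradiance W/m^2, ambient temp °C, air mass flow kg/s]
X = rng.uniform([300, 10, 0.01], [1000, 40, 0.05], size=(200, 3))
# synthetic outlet-temperature target (a made-up smooth function of the inputs)
y = (30 + 0.02 * X[:, 0] + 0.5 * X[:, 1] - 100 * X[:, 2]).reshape(-1, 1)

# standardize inputs and target
Xn = (X - X.mean(0)) / X.std(0)
yn = (y - y.mean()) / y.std()

W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros(8)   # one hidden layer of 8 tanh units
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

def forward(Z):
    H = np.tanh(Z @ W1 + b1)
    return H, H @ W2 + b2

losses, lr = [], 0.05
for _ in range(500):                     # full-batch gradient descent
    H, pred = forward(Xn)
    err = pred - yn
    losses.append(float((err ** 2).mean()))
    gW2 = H.T @ err / len(Xn); gb2 = err.mean(0)
    dH = (err @ W2.T) * (1 - H ** 2)     # backprop through tanh
    gW1 = Xn.T @ dH / len(Xn); gb1 = dH.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1
```

A real study would train on measured collector data and validate on held-out test conditions; this only illustrates the network structure and update rule.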

  11. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-01-01

    Full Text Available Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed over the last decade in the foreign scientific and educational literature, and to substantiate its importance, its practical utility and its right to affirmation in Russian education. Methods. The research is based on an analysis of foreign studies of the phenomenon of computational thinking and of the ways it is formed in the process of education, and on comparing the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept of «computational thinking» is analyzed from the point of view of intuitive understanding and of scientific and applied aspects. It is shown how computational thinking has evolved with the development of computer hardware and software. The practice-oriented interpretation of computational thinking that is dominant among educators is described, along with some ways of forming it. It is shown that computational thinking is a metasubject result of general education as well as one of its tools. In the author's view, the purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning. The dynamics of the development of this concept is described, a process connected with the evolution of computer and information technologies and with the growing number of tasks whose effective solution requires computational thinking. The author substantiates the claim that including «computational thinking» in the set of pedagogical concepts used in the national education system fills an existing gap. Practical significance. A new metasubject result of education associated with

  12. HEART TRANSPLANTATION IN PATIENTS WITH PREVIOUS OPEN HEART SURGERY

    Directory of Open Access Journals (Sweden)

    R. Sh. Saitgareev

    2016-01-01

    Full Text Available Heart transplantation (HTx) to date remains the most effective and radical method of treatment for patients with end-stage heart failure. The deficit of donor hearts forces increasing resort to various long-term mechanical circulatory support systems, including as a «bridge» to follow-up HTx. According to the ISHLT Registry, the proportion of recipients who had previously undergone cardiopulmonary bypass surgery increased from 40% in the period from 2004 to 2008 to 49.6% for the period from 2009 to 2015. HTx performed in such repeated patients, on the one hand, involves considerable technical difficulties and high risks; on the other hand, there is often no medical alternative to HTx, and unless dictated by absolute contraindications, denial of the surgery is equivalent to 100% mortality. This review summarizes the results of a number of published studies aimed at understanding the immediate and late results of HTx in patients who previously underwent open heart surgery. The effect of resternotomy during HTx, the specific features associated with its implementation in recipients previously operated on the open heart, and its effects on immediate and long-term survival are considered in this review. Results of studies analyzing the risk factors for perioperative complications in repeated recipients are also presented. Separately, the risks of HTx after implantation of long-term mechanical circulatory support systems are examined. The literature does not allow a clear definition of the impact of earlier open heart surgery on the course of the perioperative period and on the prognosis of survival in recipients undergoing HTx. On the other hand, provided the HTx and the perioperative period proceed normally, the risks in this clinical situation are justified, and the long-term prognosis of recipients with previous open heart surgery is comparable to that of patients who underwent primary HTx. Studies

  13. A School Competition on the computation of the solar parallax using observations from the Mercury Transit of 9 May 2016 - Results and Discussion

    Science.gov (United States)

    Zender, Joe; Barnes, Rebecca; Zuidervaart, Huib; Benkhoff, Johannes; Martinez, Santa; Breitfellner, Michel; Almeida, Miguel

    2017-04-01

    On 9 May 2016 an intriguing and rare event occurred: seen from most countries in Europe, Mercury, the planet nearest to the Sun, crossed the Sun's disk. Such a phenomenon is better known for the Moon, since during a solar eclipse it gets dark (or darker), so everyone notices that something special is going on. But as Mercury is very, very small compared to the Sun, one will never notice a Mercury transit unaided. It was the famous astronomer Johannes Kepler who realized in 1601 that Mercury (or Venus) transits could be observed from the Earth. Later, in 1691, Edmund Halley published a mathematical algorithm to compute the solar parallax (from which one can determine the distance from the Earth to the Sun) from observations made during a transit. It is sad to note that neither scientist had the chance to witness a Mercury transit during his lifetime. Well before the event, the ESA Communication Office announced a school competition to observe the Mercury transit and repeat the measurements proposed by Edmund Halley and other scientists since then. Several hints were given on the observation possibilities (telescope, binoculars, solar glasses), and examples of the algorithms, in the form of written formulae or spreadsheet formulae, were provided. All schools were encouraged to share their data with each other, and the needed support was provided by ESA. After the transit, all school teams were asked to provide their results and an accompanying report, to give us a picture of each team's technical, mathematical, and social activities in preparation for the event and during the event itself. In our presentation, we give a short overview of the participants and their efforts. We analyze our expectations for the school competition against the results, as seen from a scientist's point of view (1st and 3rd authors) and a science communicator's point of view (2nd author), and give our perspective on upcoming planetary eclipse opportunities, i.e. the Mercury
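Halley's idea reduces, in its simplest form, to trigonometric parallax: once the solar parallax angle (the angle the Earth's radius subtends as seen from the Sun) is known, the Earth-Sun distance follows directly. A minimal sketch using the modern value of the solar parallax, about 8.794 arcseconds (the transit measurement itself, which derives this angle from timing differences between observers, is not reproduced here):

```python
import math

R_EARTH_KM = 6378.137  # equatorial radius of the Earth in km

def distance_from_parallax(parallax_arcsec):
    """Distance at which the Earth's radius subtends the given parallax angle."""
    parallax_rad = math.radians(parallax_arcsec / 3600.0)
    return R_EARTH_KM / math.tan(parallax_rad)

au_km = distance_from_parallax(8.794)  # roughly one astronomical unit
```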

  14. A Faster Algorithm for Computing Motorcycle Graphs

    KAUST Repository

    Vigneron, Antoine E.; Yan, Lie

    2014-01-01

    We present a new algorithm for computing motorcycle graphs that runs in (Formula presented.) time for any (Formula presented.), improving on all previously known algorithms. The main application of this result is to computing the straight skeleton of a polygon. It allows us to compute the straight skeleton of a non-degenerate polygon with (Formula presented.) holes in (Formula presented.) expected time. If all input coordinates are (Formula presented.)-bit rational numbers, we can compute the straight skeleton of a (possibly degenerate) polygon with (Formula presented.) holes in (Formula presented.) expected time. In particular, it means that we can compute the straight skeleton of a simple polygon in (Formula presented.) expected time if all input coordinates are (Formula presented.)-bit rationals, while all previously known algorithms have worst-case running time (Formula presented.). © 2014 Springer Science+Business Media New York.

  15. A Faster Algorithm for Computing Motorcycle Graphs

    KAUST Repository

    Vigneron, Antoine E.

    2014-08-29

    We present a new algorithm for computing motorcycle graphs that runs in (Formula presented.) time for any (Formula presented.), improving on all previously known algorithms. The main application of this result is to computing the straight skeleton of a polygon. It allows us to compute the straight skeleton of a non-degenerate polygon with (Formula presented.) holes in (Formula presented.) expected time. If all input coordinates are (Formula presented.)-bit rational numbers, we can compute the straight skeleton of a (possibly degenerate) polygon with (Formula presented.) holes in (Formula presented.) expected time. In particular, it means that we can compute the straight skeleton of a simple polygon in (Formula presented.) expected time if all input coordinates are (Formula presented.)-bit rationals, while all previously known algorithms have worst-case running time (Formula presented.). © 2014 Springer Science+Business Media New York.

  16. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... When the image slices are reassembled by computer software, the result is a very detailed multidimensional view ...

  17. Computation of local exchange coefficients in strongly interacting one-dimensional few-body systems: local density approximation and exact results

    DEFF Research Database (Denmark)

    Marchukov, O. V.; Eriksen, E. H.; Midtgaard, J. M.

    2016-01-01

    -trivial geometric factors that depend solely on the geometry of the confinement through the single-particle eigenstates of the external potential. To obtain accurate effective Hamiltonians to describe such systems one needs to be able to compute these geometric factors with high precision which is difficult due...

  18. Cloud Computing

    CERN Document Server

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for d

  19. Optical computing.

    Science.gov (United States)

    Stroke, G. W.

    1972-01-01

    Applications of the optical computer include an approach for increasing the sharpness of images obtained from the most powerful electron microscopes and fingerprint/credit card identification. The information-handling capability of the various optical computing processes is very great. Modern synthetic-aperture radars scan upward of 100,000 resolvable elements per second. Fields which have assumed major importance on the basis of optical computing principles are optical image deblurring, coherent side-looking synthetic-aperture radar, and correlative pattern recognition. Some examples of the most dramatic image deblurring results are shown.

  20. Computer-aided detection (CAD) and assessment of malignant lesions in the liver and lung using a novel PET/CT software tool. Initial results

    International Nuclear Information System (INIS)

    Hahn, Steffen; Heusner, T.; Forsting, M.; Antoch, G.; Zhou, X.; Zhan, Y.; Peng, Z.; Hamami, M.; Bockisch, A.

    2010-01-01

    Purpose: To determine the feasibility of a PET/CT software tool (PET computer-aided detection: PET-CAD) for automated detection and assessment of pulmonary and hepatic lesions. Materials and Methods: 20 consecutive patients with colorectal liver metastases and 20 consecutive patients suffering from non-small cell lung cancer (NSCLC) were examined with FDG-PET/CT. In a first step the maximum standardized uptake values (SUVmax) of non-tumorous liver and lung tissues were determined manually. This value was used as a threshold value for software-based lesion detection. The number of lesions detected, their SUVmax, and their sizes in the x, y, and z-planes, as automatically provided by PET-CAD, were compared to visual lesion detection and manual measurements on CT. Results: The sensitivity for automated detection was 96% (86-99%) for colorectal liver metastases and 90% (70-99%) for lung lesions. The positive predictive value was 80% for liver and 68% for lung lesions. The mean SUVmax of all lung lesions was 9.3 and 8.8 for the liver lesions. When assessed by PET-CAD, the mean lesion sizes for liver lesions in the x, y, and z-planes were 4.3 cm, 4.6 cm, and 4.2 cm compared to 3.5 cm, 3.8 cm, and 3.6 cm for manual measurements. The mean lesion sizes of lung lesions were 7.4 cm, 7.7 cm, and 8.4 cm in the x, y, and z-planes when assessed by PET-CAD compared to 5.8 cm, 6.1 cm, and 7.1 cm when measured manually. Using manual assessment, the lesion sizes were significantly smaller in all planes (p < 0.005). Conclusion: Software tools for automated lesion detection and assessment are expected to improve the clinical PET/CT workflow. Before implementation in the clinical routine, further improvements to the measurement accuracy are required. (orig.)
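    The detection step described here, using the background SUVmax as a threshold, can be sketched on a toy SUV map (the array values and threshold are invented; a real CAD tool would additionally cluster voxels into lesions and suppress false positives):

    ```python
    import numpy as np

    # Toy SUV map: one bright focus surrounded by background-level tissue.
    suv = np.array([[1.2, 1.5, 1.4],
                    [1.3, 9.1, 8.7],
                    [1.1, 1.6, 1.2]])

    background_suvmax = 2.5               # SUVmax measured on non-tumorous tissue
    candidates = suv > background_suvmax  # boolean mask of candidate lesion voxels
    print(int(candidates.sum()))          # 2 candidate voxels
    ```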

  1. The Effects of the Previous Outcome on Probabilistic Choice in Rats

    Science.gov (United States)

    Marshall, Andrew T.; Kirkpatrick, Kimberly

    2014-01-01

    This study examined the effects of previous outcomes on subsequent choices in a probabilistic-choice task. Twenty-four rats were trained to choose between a certain outcome (1 or 3 pellets) versus an uncertain outcome (3 or 9 pellets), delivered with a probability of .1, .33, .67, and .9 in different phases. Uncertain outcome choices increased with the probability of uncertain food. Additionally, uncertain choices increased with the probability of uncertain food following both certain-choice outcomes and unrewarded uncertain choices. However, following uncertain-choice food outcomes, there was a tendency to choose the uncertain outcome in all cases, indicating that the rats continued to “gamble” after successful uncertain choices, regardless of the overall probability or magnitude of food. A subsequent manipulation, in which the probability of uncertain food varied within each session as a function of the previous uncertain outcome, examined how the previous outcome and probability of uncertain food affected choice in a dynamic environment. Uncertain-choice behavior increased with the probability of uncertain food. The rats exhibited increased sensitivity to probability changes and a greater degree of win–stay/lose–shift behavior than in the static phase. Simulations of two sequential choice models were performed to explore the possible mechanisms of reward value computations. The simulation results supported an exponentially decaying value function that updated as a function of trial (rather than time). These results emphasize the importance of analyzing global and local factors in choice behavior and suggest avenues for the future development of sequential-choice models. PMID:23205915
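    The supported model, an exponentially decaying value function updated per trial, can be sketched as a standard delta-rule update (the learning rate and reward sequence are illustrative assumptions, not the paper's fitted parameters):

    ```python
    def update(value, reward, alpha=0.2):
        """One trial's update; past rewards decay geometrically by (1 - alpha)."""
        return value + alpha * (reward - value)

    v = 0.0
    rewards = [9, 0, 9, 9, 0]   # e.g. uncertain-choice payoffs across 5 trials
    for r in rewards:
        v = update(v, r)

    # The recursion weights reward r_k by alpha * (1 - alpha)**(n - k), so
    # recent outcomes dominate -- consistent with the win-stay/lose-shift
    # sensitivity reported above.
    n = len(rewards)
    closed_form = sum(0.2 * (1 - 0.2) ** (n - 1 - i) * r
                      for i, r in enumerate(rewards))
    print(round(v, 5), round(closed_form, 5))  # identical: 3.32928 3.32928
    ```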

  2. Impact of previously disadvantaged land-users on sustainable ...

    African Journals Online (AJOL)

    Impact of previously disadvantaged land-users on sustainable agricultural ... about previously disadvantaged land users involved in communal farming systems ... of input, capital, marketing, information and land use planning, with effect on ...

  3. Milky Way Past Was More Turbulent Than Previously Known

    Science.gov (United States)

    2004-04-01

    Results of 1001 observing nights shed new light on our Galaxy [1] Summary A team of astronomers from Denmark, Switzerland and Sweden [2] has achieved a major breakthrough in our understanding of the Milky Way, the galaxy in which we live. After more than 1,000 nights of observations spread over 15 years, they have determined the spatial motions of more than 14,000 solar-like stars residing in the neighbourhood of the Sun. For the first time, the changing dynamics of the Milky Way since its birth can now be studied in detail and with a stellar sample sufficiently large to allow a sound analysis. The astronomers find that our home galaxy has led a much more turbulent and chaotic life than previously assumed. PR Photo 10a/04: Distribution on the sky of the observed stars. PR Photo 10b/04: Stars in the solar neighbourhood and the Milky Way galaxy (artist's view). PR Video Clip 04/04: The motions of the observed stars during the past 250 million years. Unknown history Home is the place we know best. But not so in the Milky Way - the galaxy in which we live. Our knowledge of our nearest stellar neighbours has long been seriously incomplete and - worse - skewed by prejudice concerning their behaviour. Stars were generally selected for observation because they were thought to be "interesting" in some sense, not because they were typical. This has resulted in a biased view of the evolution of our Galaxy. The Milky Way started out just after the Big Bang as one or more diffuse blobs of gas of almost pure hydrogen and helium. With time, it assembled into the flattened spiral galaxy which we inhabit today. Meanwhile, generation after generation of stars were formed, including our Sun some 4,700 million years ago. But how did all this really happen? Was it a rapid process? Was it violent or calm? When were all the heavier elements formed? How did the Milky Way change its composition and shape with time? Answers to these and many other questions are 'hot' topics for the

  4. Incidence of Acneform Lesions in Previously Chemically Damaged Persons-2004

    Directory of Open Access Journals (Sweden)

    N Dabiri

    2008-04-01

    Full Text Available ABSTRACT: Introduction & Objective: Chemical weapons, especially nitrogen mustard, which was used against Iranian troops in the Iraq-Iran war, have several harmful effects on the skin. Some other chemical agents can also cause acneform lesions on the skin. The purpose of this study was to compare the incidence of acneform lesions in previously chemically damaged soldiers and non-chemically damaged persons. Materials & Methods: In this descriptive and analytical study, 180 chemically damaged soldiers who had been referred to a dermatology clinic between 2000 and 2004, and forty non-chemically damaged people, were chosen randomly and examined for acneform lesions. SPSS software was used for statistical analysis of the data. Results: The mean age of the experimental group was 37.5 ± 5.2 years and that of the control group was 38.7 ± 5.9 years. The mean percentage of chemical damage in cases was 31 percent, and the time since the chemical damage was 15.2 ± 1.1 years. Ninety-seven cases (53.9 percent) of the subjects and 19 people (47.5 percent) of the control group had some degree of acne. No significant difference was found in incidence, degree of lesions, site of lesions, or age of subjects between the two groups. No significant correlation was noted between the percentage of chemical damage and the incidence and degree of lesions in the case group. Conclusion: The incidence of acneform lesions among previously chemically injured people was not higher than in normal controls.

  5. Computer-enhanced interventions for drug use and HIV risk in the emergency room: preliminary results on psychological precursors of behavior change.

    Science.gov (United States)

    Bonar, Erin E; Walton, Maureen A; Cunningham, Rebecca M; Chermack, Stephen T; Bohnert, Amy S B; Barry, Kristen L; Booth, Brenda M; Blow, Frederic C

    2014-01-01

    This article describes process data from a randomized controlled trial among 781 adults recruited in the emergency department who reported recent drug use and were randomized to: intervener-delivered brief intervention (IBI) assisted by computer, computerized BI (CBI), or enhanced usual care (EUC). Analyses examined differences between baseline and post-intervention on psychological constructs theoretically related to changes in drug use and HIV risk: importance, readiness, intention, help-seeking, and confidence. Compared to EUC, participants receiving the IBI significantly increased in confidence and intentions; CBI patients increased importance, readiness, confidence, and help-seeking. Both groups increased relative to the EUC in likelihood of condom use with regular partners. Examining BI components suggested that benefits of change and tools for change were associated with changes in psychological constructs. Delivering BIs targeting drug use and HIV risk using computers appears promising for implementation in healthcare settings. This trial is ongoing and future work will report behavioral outcomes. © 2013.

  6. Computation accuracy of flow conditions around a very large floating structure using a multi-layer model. Comparison with experimental results; Taso model ni yoru choogata futai mawari no ryukyo keisan seido ni tsuite. Jikken tono hikaku

    Energy Technology Data Exchange (ETDEWEB)

    Kyotsuka, Y [Kyushu University, Fukuoka (Japan); Omori, H; Nakagawa, H; Kobayashi, M [Mitsui Engineering and Shipbuilding Co. Ltd., Tokyo (Japan)

    1996-04-10

    As one of the environmental problems in the sea area surrounding a very large floating structure (VLFS), changes in flow conditions are important, being one of the factors governing the prediction of subsequent diffusion and ecosystem response. Although multi-layer models are in wide use for computing flow conditions and diffusion in inner bays, their applicability should be reexamined because they do not account for VLFSs. In this study, flow velocity profiles around a barge were therefore measured in shallow-water towing tests and compared with results computed using a multi-layer model. The multi-layer model computed the flow velocity profiles by dividing the flow region into a normal region and the region under the VLFS, and determined the pressures under the VLFS from a 2-D Poisson's equation. A slip condition was used as the boundary condition at the bottom, considering the number of layers under the VLFS. A further numerical computation was conducted with a 2-D MAC method, in particular to compare the flow in the wake of the VLFS with the experimental one. Both sets of computed results agreed well with the experiments. 3 refs., 9 figs., 1 tab.
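    The determination of pressure under the VLFS via a 2-D Poisson equation can be illustrated with a minimal Jacobi-iteration sketch (grid size, boundary conditions, and source term are assumptions for illustration, not the authors' actual solver):

    ```python
    import numpy as np

    def solve_poisson(f, h=1.0, tol=1e-6, max_iter=20000):
        """Jacobi iteration for p_xx + p_yy = f with p = 0 on the boundary."""
        p = np.zeros_like(f)
        for _ in range(max_iter):
            p_new = p.copy()
            # Five-point stencil: average of neighbours minus h^2 * source.
            p_new[1:-1, 1:-1] = 0.25 * (p[2:, 1:-1] + p[:-2, 1:-1]
                                        + p[1:-1, 2:] + p[1:-1, :-2]
                                        - h * h * f[1:-1, 1:-1])
            if np.max(np.abs(p_new - p)) < tol:
                return p_new
            p = p_new
        return p

    f = np.zeros((21, 21))
    f[10, 10] = -1.0            # sink term standing in for the pressure forcing
    p = solve_poisson(f)
    print(p.shape, p[10, 10] > 0)  # (21, 21) True
    ```

    Jacobi iteration is chosen here only for brevity; production codes typically use faster solvers (SOR, multigrid, or FFT-based methods) for the same equation.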

  7. Determining root correspondence between previously and newly detected objects

    Science.gov (United States)

    Paglieroni, David W.; Beer, N Reginald

    2014-06-17

    A system that applies attribute and topology based change detection to networks of objects that were detected on previous scans of a structure, roadway, or area of interest. The attributes capture properties or characteristics of the previously detected objects, such as location, time of detection, size, elongation, orientation, etc. The topology of the network of previously detected objects is maintained in a constellation database that stores attributes of previously detected objects and implicitly captures the geometrical structure of the network. A change detection system detects change by comparing the attributes and topology of new objects detected on the latest scan to the constellation database of previously detected objects.
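    The comparison of newly detected objects against the constellation database can be sketched as attribute-based nearest-neighbor matching (the record fields and distance threshold below are illustrative assumptions, not the patented system's logic, which also exploits network topology):

    ```python
    import math

    def match_detections(previous, new, max_dist=5.0):
        """previous/new: lists of dicts with 'id', 'x', 'y' attribute keys.

        Each new object is matched to the nearest previously detected object,
        or flagged as 'new' if none lies within max_dist.
        """
        changes = []
        for obj in new:
            best_id, best_d = None, float("inf")
            for prior in previous:
                d = math.hypot(obj["x"] - prior["x"], obj["y"] - prior["y"])
                if d < best_d:
                    best_id, best_d = prior["id"], d
            if best_d <= max_dist:
                changes.append((obj["id"], "matched", best_id))
            else:
                changes.append((obj["id"], "new", None))
        return changes

    prev = [{"id": "A", "x": 0.0, "y": 0.0}, {"id": "B", "x": 10.0, "y": 0.0}]
    new = [{"id": "n1", "x": 1.0, "y": 1.0}, {"id": "n2", "x": 50.0, "y": 50.0}]
    print(match_detections(prev, new))
    # [('n1', 'matched', 'A'), ('n2', 'new', None)]
    ```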

  8. Fire Risk Scoping Study: Investigation of nuclear power plant fire risk, including previously unaddressed issues

    International Nuclear Information System (INIS)

    Lambright, J.A.; Nowlen, S.P.; Nicolette, V.F.; Bohn, M.P.

    1989-01-01

    An investigation of nuclear power plant fire risk issues raised as a result of the USNRC sponsored Fire Protection Research Program at Sandia National Laboratories has been performed. The specific objectives of this study were (1) to review and requantify fire risk scenarios from four fire probabilistic risk assessments (PRAs) in light of updated data bases made available as a result of USNRC sponsored Fire Protection Research Program and updated computer fire modeling capabilities, (2) to identify potentially significant fire risk issues that have not been previously addressed in a fire risk context and to quantify the potential impact of those identified fire risk issues where possible, and (3) to review current fire regulations and plant implementation practices for relevance to the identified unaddressed fire risk issues. In performance of the fire risk scenario requantifications several important insights were gained. It was found that utilization of a more extensive operational experience base resulted in both fire occurrence frequencies and fire duration times (i.e., time required for fire suppression) increasing significantly over those assumed in the original works. Additionally, some thermal damage threshold limits assumed in the original works were identified as being nonconservative based on more recent experimental data. Finally, application of the COMPBRN III fire growth model resulted in calculation of considerably longer fire damage times than those calculated in the original works using COMPBRN I. 14 refs., 2 figs., 16 tabs

  9. Evaluation of an interdisciplinary re-isolation policy for patients with previous Clostridium difficile diarrhea.

    Science.gov (United States)

    Boone, N; Eagan, J A; Gillern, P; Armstrong, D; Sepkowitz, K A

    1998-12-01

    Diarrhea caused by Clostridium difficile is increasingly recognized as a nosocomial problem. The effectiveness and cost of a new program to decrease nosocomial spread by identifying patients scheduled for readmission who were previously positive for toxin were evaluated. The Memorial Sloan-Kettering Cancer Center is a 410-bed comprehensive cancer center in New York City. Many patients are readmitted during their course of cancer therapy. In 1995, as a result of concern about the nosocomial spread of C difficile, we implemented a policy that all patients who were positive for C difficile toxin in the previous 6 months, with no subsequent toxin-negative stool as an outpatient, would be placed into contact isolation on readmission pending evaluation of stool specimens. Patients who were previously positive for C difficile toxin were identified to infection control and admitting office databases via computer. Admitting personnel contacted infection control with all readmissions to determine whether a private room was required. Between July 1, 1995, and June 30, 1996, 47 patients who were previously positive for C difficile toxin were readmitted. Before their first scheduled readmission, the specimens for 15 (32%) of these patients were negative for C difficile toxin. They were subsequently cleared as outpatients and were readmitted without isolation. Workup of the remaining 32 patients revealed that the specimens for 7 patients were positive for C difficile toxin, and 86 isolation days were used. An additional 25 patients used 107 isolation days and were either cleared after a negative specimen was obtained in-house or discharged without having an appropriate specimen sent. Four patients (9%) had recurring C difficile after having toxin-negative stools. We estimate (because outpatient specimens were not collected) the cost incurred at $48,500 annually, including the incremental cost of hospital isolation and equipment. Our policy to control the spread of nosocomial C
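    The readmission rule described here, isolate unless a toxin-negative stool followed the last positive result within the 6-month window, can be sketched as follows (the field names and the 183-day window encoding are illustrative assumptions, not the hospital's actual database logic):

    ```python
    from datetime import date, timedelta

    def needs_isolation(results, readmission_date):
        """results: list of (date, 'positive' | 'negative') toxin assay tuples."""
        window_start = readmission_date - timedelta(days=183)  # ~6 months
        recent_positives = [d for d, r in results
                            if r == "positive" and d >= window_start]
        if not recent_positives:
            return False                       # no recent positive: no isolation
        last_positive = max(recent_positives)
        # Cleared if any toxin-negative result follows the last positive.
        cleared = any(d > last_positive and r == "negative" for d, r in results)
        return not cleared

    history = [(date(2024, 1, 10), "positive"), (date(2024, 2, 1), "negative")]
    print(needs_isolation(history, date(2024, 3, 1)))   # False (cleared)
    print(needs_isolation([(date(2024, 1, 10), "positive")],
                          date(2024, 3, 1)))            # True (not cleared)
    ```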

  10. Fiscal 1997 report on the results of the international standardization R and D. International standards for computers/manikins; 1997 nendo seika hokokusho kokusai hyojun soseigata kenkyu kaihatsu. Computer manikin ni kansuru kokusai hyojun kikaku

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    Through the development of computer manikins (CM), which assess human adaptability to products and environments, a draft for international standardization was worked out to propose to ISO. The draft was prepared through the development of a 'structure model' that changes based on human attributes, a study of a 'motion model' enabling changes in posture and movement, a study of an 'evaluation model' evaluating attainment ranges and ecodynamic loads, and the development of 'computer functions' realizing the above-mentioned functions. A CM was developed with the following characteristics: a function to reproduce the 'structure model' based on the human body dimensional measurement values regulated as measurement items in ISO 7250; a function to change posture and movement based on joint-range-of-motion data; and a function to evaluate geometrical human adaptability such as attainment ranges. The above-mentioned functions were realized as a plug-in to Autodesk Mechanical Desktop 2.0, and a modular-structure platform was constructed which enables wide-ranging cross-industry options and functional expansion as CM technology advances. 7 refs., 41 figs., 18 tabs.

  11. [(18)F]-fluorocholine positron-emission/computed tomography for lymph node staging of patients with prostate cancer: preliminary results of a prospective study

    DEFF Research Database (Denmark)

    Poulsen, Mads H; Bouchelouche, Kirsten; Gerke, Oke

    2010-01-01

    Study Type - Diagnostic (case series) Level of Evidence 4 OBJECTIVES To evaluate prospectively [(18)F]-fluorocholine positron-emission/computed tomography (FCH PET/CT) for lymph node staging of prostate cancer before intended curative therapy, and to determine whether imaging 15 or 60 min after......; the corresponding 95% confidence intervals were 29.2-100%, 77.2-99.9%, 19.4-99.4% and 83.9-100%, respectively. Values of SUV(max) at early and late imaging were not significantly different. CONCLUSIONS This small series supports the use of FCH PET/CT as a tool for lymph node staging of patients with prostate cancer...

  12. Dynamic contrast-enhanced computed tomography as a potential biomarker in patients with metastatic renal cell carcinoma: preliminary results from the Danish Renal Cancer Group Study-1

    DEFF Research Database (Denmark)

    Mains, Jill Rachel; Donskov, Frede; Pedersen, Erik Morre

    2014-01-01

    OBJECTIVES: The aim of this study was to explore the impact of dynamic contrast-enhanced (DCE) computer tomography (CT) as a biomarker in metastatic renal cell carcinoma (mRCC). MATERIALS AND METHODS: Twelve patients with favorable or intermediate Memorial Sloan Kettering Cancer Center risk group...... blinded to treatment group. The DCE-CT scans were performed at baseline, at weeks 5 and 10, and thereafter every third month. Blood flow (BF; mL/min/100 mL), peak enhancement (Hounsfield units), time to peak (seconds), and blood volume (BV; mL/100 g) were calculated. Parameters for DCE-CT were correlated...

  13. Computed secondary-particle energy spectra following nonelastic neutron interactions with 12C for En between 15 and 60 MeV: Comparisons of results from two calculational methods

    International Nuclear Information System (INIS)

    Dickens, J.K.

    1991-04-01

    The organic scintillation detector response code SCINFUL has been used to compute secondary-particle energy spectra, dσ/dE, following nonelastic neutron interactions with 12C for incident neutron energies between 15 and 60 MeV. The resulting spectra are compared with published similar spectra computed by Brenner and Prael, who used an intranuclear cascade code including alpha clustering, a particle pickup mechanism, and a theoretical approach to sequential decay via intermediate particle-unstable states. The similarities of and the differences between the results of the two approaches are discussed. 16 refs., 44 figs., 2 tabs

  14. Is Previous Respiratory Disease a Risk Factor for Lung Cancer?

    Science.gov (United States)

    Denholm, Rachel; Schüz, Joachim; Straif, Kurt; Stücker, Isabelle; Jöckel, Karl-Heinz; Brenner, Darren R.; De Matteis, Sara; Boffetta, Paolo; Guida, Florence; Brüske, Irene; Wichmann, Heinz-Erich; Landi, Maria Teresa; Caporaso, Neil; Siemiatycki, Jack; Ahrens, Wolfgang; Pohlabeln, Hermann; Zaridze, David; Field, John K.; McLaughlin, John; Demers, Paul; Szeszenia-Dabrowska, Neonila; Lissowska, Jolanta; Rudnai, Peter; Fabianova, Eleonora; Dumitru, Rodica Stanescu; Bencko, Vladimir; Foretova, Lenka; Janout, Vladimir; Kendzia, Benjamin; Peters, Susan; Behrens, Thomas; Vermeulen, Roel; Brüning, Thomas; Kromhout, Hans

    2014-01-01

    Rationale: Previous respiratory diseases have been associated with increased risk of lung cancer. Respiratory conditions often co-occur and few studies have investigated multiple conditions simultaneously. Objectives: Investigate lung cancer risk associated with chronic bronchitis, emphysema, tuberculosis, pneumonia, and asthma. Methods: The SYNERGY project pooled information on previous respiratory diseases from 12,739 case subjects and 14,945 control subjects from 7 case–control studies conducted in Europe and Canada. Multivariate logistic regression models were used to investigate the relationship between individual diseases adjusting for co-occurring conditions, and patterns of respiratory disease diagnoses and lung cancer. Analyses were stratified by sex, and adjusted for age, center, ever-employed in a high-risk occupation, education, smoking status, cigarette pack-years, and time since quitting smoking. Measurements and Main Results: Chronic bronchitis and emphysema were positively associated with lung cancer, after accounting for other respiratory diseases and smoking (e.g., in men: odds ratio [OR], 1.33; 95% confidence interval [CI], 1.20–1.48 and OR, 1.50; 95% CI, 1.21–1.87, respectively). A positive relationship was observed between lung cancer and pneumonia diagnosed 2 years or less before lung cancer (OR, 3.31; 95% CI, 2.33–4.70 for men), but not longer. Co-occurrence of chronic bronchitis and emphysema and/or pneumonia had a stronger positive association with lung cancer than chronic bronchitis “only.” Asthma had an inverse association with lung cancer, the association being stronger with an asthma diagnosis 5 years or more before lung cancer compared with shorter. Conclusions: Findings from this large international case–control consortium indicate that after accounting for co-occurring respiratory diseases, chronic bronchitis and emphysema continue to have a positive association with lung cancer. PMID:25054566

  15. Twelve previously unknown phage genera are ubiquitous in global oceans.

    Science.gov (United States)

    Holmfeldt, Karin; Solonenko, Natalie; Shah, Manesh; Corrier, Kristen; Riemann, Lasse; Verberkmoes, Nathan C; Sullivan, Matthew B

    2013-07-30

    Viruses are fundamental to ecosystems ranging from oceans to humans, yet our ability to study them is bottlenecked by the lack of ecologically relevant isolates, resulting in "unknowns" dominating culture-independent surveys. Here we present genomes from 31 phages infecting multiple strains of the aquatic bacterium Cellulophaga baltica (Bacteroidetes) to provide data for an underrepresented and environmentally abundant bacterial lineage. Comparative genomics delineated 12 phage groups that (i) each represent a new genus, and (ii) represent one novel and four well-known viral families. This diversity contrasts the few well-studied marine phage systems, but parallels the diversity of phages infecting human-associated bacteria. Although all 12 Cellulophaga phages represent new genera, the podoviruses and icosahedral, nontailed ssDNA phages were exceptional, with genomes up to twice as large as those previously observed for each phage type. Structural novelty was also substantial, requiring experimental phage proteomics to identify 83% of the structural proteins. The presence of uncommon nucleotide metabolism genes in four genera likely underscores the importance of scavenging nutrient-rich molecules as previously seen for phages in marine environments. Metagenomic recruitment analyses suggest that these particular Cellulophaga phages are rare and may represent a first glimpse into the phage side of the rare biosphere. However, these analyses also revealed that these phage genera are widespread, occurring in 94% of 137 investigated metagenomes. Together, this diverse and novel collection of phages identifies a small but ubiquitous fraction of unknown marine viral diversity and provides numerous environmentally relevant phage-host systems for experimental hypothesis testing.

  16. Black box integration of computer-aided diagnosis into PACS deserves a second chance: results of a usability study concerning bone age assessment.

    Science.gov (United States)

    Geldermann, Ina; Grouls, Christoph; Kuhl, Christiane; Deserno, Thomas M; Spreckelsen, Cord

    2013-08-01

    Usability aspects of different integration concepts for picture archiving and communication systems (PACS) and computer-aided diagnosis (CAD) were investigated using the example of BoneXpert, a program that determines skeletal age from a radiograph of the left hand. CAD-PACS integration was assessed according to its levels: data, function, presentation, and context integration, focusing on usability aspects. A user-based study design was selected. Statements of seven experienced radiologists using two alternative types of integration provided by BoneXpert were acquired and analyzed using a mixed-methods approach based on think-aloud records and a questionnaire. In both variants, the CAD module (BoneXpert) was easily integrated into the workflow, found comprehensible, and fit into the conceptual framework of the radiologists. Weak points of the software integration concerned data and context integration. Surprisingly, visualization of intermediate image-processing states (presentation integration) was found less important than efficient handling and fast computation. Seamlessly integrating CAD into the PACS without additional work steps or unnecessary interrupts, and without visualizing intermediate images, may considerably improve software performance and user acceptance.

  17. Influence of Previous Knowledge in Torrance Tests of Creative Thinking

    Directory of Open Access Journals (Sweden)

    María Aranguren

    2015-07-01

    Full Text Available The aim of this work is to analyze the influence of study field, expertise, and participation in recreational activities on performance in the Torrance Tests of Creative Thinking (TTCT, 1974). Several hypotheses were postulated to explore the possible effects of previous knowledge on the TTCT verbal and TTCT figural outcomes of university students. Participants in this study included 418 students from five study fields: Psychology; Philosophy and Literature; Music; Engineering; and Journalism and Advertising (Communication Sciences). The results seem to indicate that study field, expertise, and participation in recreational activities have no influence on either of the TTCT tests. Instead, the findings seem to suggest some kind of interaction between certain skills needed to succeed in specific study fields and performance on creativity tests such as the TTCT. These results imply that the TTCT is a useful and valid instrument to measure creativity and that some cognitive processes involved in innovative thinking can be promoted using different intervention programs in schools and universities, regardless of the students' study field.

  18. Computational Science: Ensuring America's Competitiveness

    National Research Council Canada - National Science Library

    Reed, Daniel A; Bajcsy, Ruzena; Fernandez, Manuel A; Griffiths, Jose-Marie; Mott, Randall D; Dongarra, J. J; Johnson, Chris R; Inouye, Alan S; Miner, William; Matzke, Martha K; Ponick, Terry L

    2005-01-01

    ... previously deemed intractable. Yet, despite the great opportunities and needs, universities and the Federal government have not effectively recognized the strategic significance of computational science in either...

  19. Functional high-resolution computed tomography of pulmonary vascular and airway reactions. Experimental results. Funktionelle HR-CT der Lunge. Experimentelle Untersuchungen pulmonaler Gefaess- und Atemwegsreaktionen

    Energy Technology Data Exchange (ETDEWEB)

    Herold, C.J. (Universitaetsklinik fuer Radiodiagnostik, Vienna (Austria) Johns Hopkins Medical Institutions, Baltimore, MD (United States). Dept. of Radiology); Brown, R.H. (Johns Hopkins Medical Institutions, Baltimore, MD (United States). Dept. of Radiology Johns Hopkins Medical Institutions, Baltimore, MD (United States). Dept. of Anesthesiology and Intensive Care Medicine Johns Hopkins Medical Institutions, Baltimore, MD (United States). Dept. of Physiology); Wetzel, R.C.; Herold, S.M. (Johns Hopkins Medical Institutions, Baltimore, MD (United States). Dept. of Anesthesiology and Intensive Care Medicine); Zeerhouni, E.A. (Johns Hopkins Medical Institutions, Baltimore, MD (United States). Dept. of Radiology)

    1993-03-01

    We describe the use of high-resolution computed tomography (HRCT) for assessment of the function of pulmonary vessels and airways. With its excellent spatial resolution, HRCT is able to demonstrate pulmonary structures as small as 300 μm and can be used to monitor changes following various stimuli. HRCT also provides information about structures smaller than 300 μm through measurement of parenchymal background density. To date, sequential, spiral and ultrafast HRCT techniques have been used in a variety of challenges to gather information about the anatomical correlates of traditional physiological measurements, thus making anatomical-physiological correlation possible. HRCT of bronchial reactivity can demonstrate the location and time course of aerosol-induced bronchoconstriction and may show changes not apparent on spirometry. HRCT of the pulmonary vascular system visualizes adaptations of vessels during hypoxia and intravascular volume loading and elucidates cardiorespiratory interactions. Experimental studies provide a basis for potential clinical applications of this method. (orig.).

  20. [Potentialities of computed tomography and ultrasound in diagnosis of hormonally active adrenal diseases: results of comparison of CT and US with operative and histological data].

    Science.gov (United States)

    Denisova, L B; Vorontsova, S V; Emel'ianova, L N

    2000-01-01

    The data given in the paper suggest that X-ray computed tomography (CT) is highly effective in detecting all types of hormonally active adrenal abnormalities. CT used in hormonally active adrenal diseases yielded data on major quantitative and qualitative (primarily densitometric) criteria that can be used in assessing images of the adrenal area in these patients. Ultrasound study (USS), performed at the first stage of the topical diagnostic search, was informative in detecting adrenal tumor lesions, the technique being highly sensitive in the diagnosis of adrenal pheochromocytomas and adenocarcinomas, but less informative than CT in the detection of hormonally active adrenocortical adenomas (aldosterone-producing ones in particular). The diagnosis of various adrenocortical hyperplasias and the differentiation of hyperplastic and tumorous forms of hypercorticoidism are a prerogative of CT, which substantially supplements USS findings in such cases.

  1. Feasibility and efficacy of a computer-based intervention aimed at preventing reading decoding deficits among children undergoing active treatment for medulloblastoma: results of a randomized trial.

    Science.gov (United States)

    Palmer, Shawna L; Leigh, Laurie; Ellison, Susan C; Onar-Thomas, Arzu; Wu, Shengjie; Qaddoumi, Ibrahim; Armstrong, Gregory T; Wright, Karen; Wetmore, Cynthia; Broniscer, Alberto; Gajjar, Amar

    2014-05-01

    To investigate the feasibility of a computer-based reading intervention completed by patients diagnosed with a brain tumor. Patients were randomized to the intervention (n = 43) or standard-of-care group (n = 38). The intervention consisted of 30 sessions using Fast ForWord® exercises in a game-like format. Change in reading decoding scores over time since diagnosis was examined. Gender, race, parent education, parent marital status, and age at diagnosis were examined as covariates. Seventeen patients (39.5%) were able to complete the target goal of 30 intervention sessions. Females had significantly greater training time than males (p = .022). Age at diagnosis was associated with average training time per session for females (r = .485, p = .041). No significant differences were found in reading scores between the randomized groups. The study was well accepted by families, and adherence by patients undergoing radiation therapy for medulloblastoma was moderate. Suggestions for improved methodology are discussed.

  2. Computers and data processing

    CERN Document Server

    Deitel, Harvey M

    1985-01-01

    Computers and Data Processing provides information pertinent to the advances in the computer field. This book covers a variety of topics, including the computer hardware, computer programs or software, and computer applications systems.Organized into five parts encompassing 19 chapters, this book begins with an overview of some of the fundamental computing concepts. This text then explores the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapters consider how computers present their results and explain the storage and retrieval of

  3. 49 CFR 173.23 - Previously authorized packaging.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 2 2010-10-01 2010-10-01 false Previously authorized packaging. 173.23 Section... REQUIREMENTS FOR SHIPMENTS AND PACKAGINGS Preparation of Hazardous Materials for Transportation § 173.23 Previously authorized packaging. (a) When the regulations specify a packaging with a specification marking...

  4. 28 CFR 10.5 - Incorporation of papers previously filed.

    Science.gov (United States)

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Incorporation of papers previously filed... CARRYING ON ACTIVITIES WITHIN THE UNITED STATES Registration Statement § 10.5 Incorporation of papers previously filed. Papers and documents already filed with the Attorney General pursuant to the said act and...

  5. 75 FR 76056 - FEDERAL REGISTER CITATION OF PREVIOUS ANNOUNCEMENT:

    Science.gov (United States)

    2010-12-07

    ... SECURITIES AND EXCHANGE COMMISSION Sunshine Act Meeting FEDERAL REGISTER CITATION OF PREVIOUS ANNOUNCEMENT: STATUS: Closed meeting. PLACE: 100 F Street, NE., Washington, DC. DATE AND TIME OF PREVIOUSLY ANNOUNCED MEETING: Thursday, December 9, 2010 at 2 p.m. CHANGE IN THE MEETING: Time change. The closed...

  6. No discrimination against previous mates in a sexually cannibalistic spider

    Science.gov (United States)

    Fromhage, Lutz; Schneider, Jutta M.

    2005-09-01

    In several animal species, females discriminate against previous mates in subsequent mating decisions, increasing the potential for multiple paternity. In spiders, female choice may take the form of selective sexual cannibalism, which has been shown to bias paternity in favor of particular males. If cannibalistic attacks function to restrict a male's paternity, females may have little interest to remate with males having survived such an attack. We therefore studied the possibility of female discrimination against previous mates in sexually cannibalistic Argiope bruennichi, where females almost always attack their mate at the onset of copulation. We compared mating latency and copulation duration of males having experienced a previous copulation either with the same or with a different female, but found no evidence for discrimination against previous mates. However, males copulated significantly shorter when inserting into a used, compared to a previously unused, genital pore of the female.

  7. Calculation of limits for significant unidirectional changes in two or more serial results of a biomarker based on a computer simulation model

    DEFF Research Database (Denmark)

    Lund, Flemming; Petersen, Per Hyltoft; Fraser, Callum G

    2015-01-01

    BACKGROUND: Reference change values (RCVs) were introduced more than 30 years ago and provide objective tools for assessment of the significance of differences in two consecutive results from an individual. However, in practice, more results are usually available for monitoring. Using simulated data from healthy individuals, a series of up to 20 results from an individual was generated using different values for the within-subject biological variation plus the analytical variation. Each new result in this series was compared to the initial measurement result, and limits for significant unidirectional change were derived as the presented factors: the first result is multiplied by the appropriate factor for increase or decrease, which gives the limits for a significant difference.
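
    For the classical two-result case that RCVs address, the limit can be sketched as follows. This is a generic illustration of the standard RCV formula only, not the series-length-dependent factors derived in the paper, and the 3% and 5% coefficients of variation are made-up inputs:

    ```python
    import math

    def reference_change_value(cv_a, cv_i, z=1.96):
        """Two-sided reference change value as a fraction of the first result.

        RCV = sqrt(2) * z * sqrt(CVa^2 + CVi^2), with the analytical (CVa) and
        within-subject biological (CVi) coefficients of variation as fractions.
        """
        return math.sqrt(2) * z * math.sqrt(cv_a ** 2 + cv_i ** 2)

    # Hypothetical example: CVa = 3%, CVi = 5%
    rcv = reference_change_value(0.03, 0.05)
    upper_factor = 1 + rcv  # multiply the first result by this for the upper limit
    lower_factor = 1 - rcv  # ... and by this for the lower limit
    ```

    A second result outside [first × lower_factor, first × upper_factor] would then be a significant change at the chosen z.
    
    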

  8. Aortic pseudoaneurysm detected on external jugular venous distention following a Bentall procedure 10 years previously.

    Science.gov (United States)

    Fukunaga, Naoto; Shomura, Yu; Nasu, Michihiro; Okada, Yukikatsu

    2010-11-01

    An asymptomatic 49-year-old woman was admitted for surgery for an aortic pseudoaneurysm. She had Marfan syndrome and had undergone an emergency Bentall procedure 10 years previously. About six months before admission, she had noticed distended bilateral external jugular veins, which became distended only in the supine position, without any other symptoms. Enhanced computed tomography revealed an aortic pseudoaneurysm originating from the previous distal anastomosis site. During induction of general anesthesia in the supine position, bilateral external jugular venous distention was remarkable. Immediately after a successful operation, the distention completely resolved. The present case emphasizes the importance of physical examination in the diagnosis of asymptomatic life-threatening disease in patients with a history of previous aortic surgery.

  9. A Faster Algorithm for Computing Straight Skeletons

    KAUST Repository

    Mencel, Liam A.

    2014-01-01

    computation in O(n (log n) log r) time. It improves on the previously best known algorithm for this reduction, which is randomised, and runs in expected O(n √(h+1) log² n) time for a polygon with h holes. Using known motorcycle graph algorithms, our result

  10. Coronary collateral vessels in patients with previous myocardial infarction

    International Nuclear Information System (INIS)

    Nakatsuka, M.; Matsuda, Y.; Ozaki, M.

    1987-01-01

    To assess the degree of collateral vessels after myocardial infarction, coronary angiograms, left ventriculograms, and exercise thallium-201 myocardial scintigrams of 36 patients with previous myocardial infarction were reviewed. All 36 patients had total occlusion of infarct-related coronary artery and no more than 70% stenosis in other coronary arteries. In 19 of 36 patients with transient reduction of thallium-201 uptake in the infarcted area during exercise (Group A), good collaterals were observed in 10 patients, intermediate collaterals in 7 patients, and poor collaterals in 2 patients. In 17 of 36 patients without transient reduction of thallium-201 uptake in the infarcted area during exercise (Group B), good collaterals were seen in 2 patients, intermediate collaterals in 7 patients, and poor collaterals in 8 patients (p less than 0.025). Left ventricular contractions in the infarcted area were normal or hypokinetic in 10 patients and akinetic or dyskinetic in 9 patients in Group A. In Group B, 1 patient had hypokinetic contraction and 16 patients had akinetic or dyskinetic contraction (p less than 0.005). Thus, patients with transient reduction of thallium-201 uptake in the infarcted area during exercise had well developed collaterals and preserved left ventricular contraction, compared to those in patients without transient reduction of thallium-201 uptake in the infarcted area during exercise. These results suggest that the presence of viable myocardium in the infarcted area might be related to the degree of collateral vessels

  11. Multispecies Coevolution Particle Swarm Optimization Based on Previous Search History

    Directory of Open Access Journals (Sweden)

    Danping Wang

    2017-01-01

    Full Text Available A hybrid coevolution particle swarm optimization algorithm with a dynamic multispecies strategy based on K-means clustering and a nonrevisit strategy based on a Binary Space Partitioning fitness tree (called MCPSO-PSH) is proposed. Previous search history, memorized in the Binary Space Partitioning fitness tree, can effectively restrain the individuals' revisit phenomenon. The whole population is partitioned into several subspecies, and cooperative coevolution is realized by an information communication mechanism between subspecies, which can enhance the global search ability of particles and avoid premature convergence to a local optimum. To demonstrate the power of the method, comparisons between the proposed algorithm and state-of-the-art algorithms are grouped into three categories: 10 basic benchmark functions (10-dimensional and 30-dimensional), 10 CEC2005 benchmark functions (30-dimensional), and a real-world problem (multilevel image segmentation). Experimental results show that MCPSO-PSH displays competitive performance compared to other swarm-based or evolutionary algorithms in terms of solution accuracy and statistical tests.
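
    The subspecies idea can be illustrated with a minimal sketch: partition the swarm with a few k-means steps each iteration and let every particle follow its species' best instead of a single global best. This is a generic illustration only, not the authors' MCPSO-PSH; the BSP-tree nonrevisit strategy is omitted and all coefficients are assumed values:

    ```python
    import numpy as np

    def kmeans_labels(points, k, rng, steps=5):
        """Assign each point to the nearest of k centroids (tiny k-means)."""
        centroids = points[rng.choice(len(points), size=k, replace=False)]
        for _ in range(steps):
            dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
            labels = dists.argmin(axis=1)
            for j in range(k):
                if np.any(labels == j):
                    centroids[j] = points[labels == j].mean(axis=0)
        return labels

    def multispecies_pso(f, dim=2, n=30, k=3, iters=200, seed=0):
        rng = np.random.default_rng(seed)
        lo, hi = -5.0, 5.0
        x = rng.uniform(lo, hi, (n, dim))
        v = np.zeros((n, dim))
        pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
        for _ in range(iters):
            labels = kmeans_labels(x, k, rng)          # dynamic species partition
            for j in range(k):
                idx = np.flatnonzero(labels == j)
                if idx.size == 0:
                    continue
                sbest = pbest[idx[pbest_val[idx].argmin()]]  # species best, not global best
                r1, r2 = rng.random((2, idx.size, dim))
                v[idx] = (0.7 * v[idx]
                          + 1.5 * r1 * (pbest[idx] - x[idx])
                          + 1.5 * r2 * (sbest - x[idx]))
            x = np.clip(x + v, lo, hi)
            vals = np.array([f(p) for p in x])
            better = vals < pbest_val
            pbest[better], pbest_val[better] = x[better], vals[better]
        return pbest[pbest_val.argmin()], pbest_val.min()

    # Toy run on the 2-D sphere function
    best_x, best_val = multispecies_pso(lambda p: float((p ** 2).sum()))
    ```

    Restricting the social attractor to the species best is what preserves several search regions at once; the information exchange between subspecies in the actual paper goes beyond this sketch.
    
    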

  12. Computational fluid mechanics

    Science.gov (United States)

    Hassan, H. A.

    1993-01-01

    Two papers are included in this progress report. In the first, the compressible Navier-Stokes equations have been used to compute leading edge receptivity of boundary layers over parabolic cylinders. Natural receptivity at the leading edge was simulated and Tollmien-Schlichting waves were observed to develop in response to an acoustic disturbance, applied through the farfield boundary conditions. To facilitate comparison with previous work, all computations were carried out at a free stream Mach number of 0.3. The spatial and temporal behavior of the flowfields are calculated through the use of finite volume algorithms and Runge-Kutta integration. The results are dominated by strong decay of the Tollmien-Schlichting wave due to the presence of the mean flow favorable pressure gradient. The effects of numerical dissipation, forcing frequency, and nose radius are studied. The Strouhal number is shown to have the greatest effect on the unsteady results. In the second paper, a transition model for low-speed flows, previously developed by Young et al., which incorporates first-mode (Tollmien-Schlichting) disturbance information from linear stability theory has been extended to high-speed flow by incorporating the effects of second mode disturbances. The transition model is incorporated into a Reynolds-averaged Navier-Stokes solver with a one-equation turbulence model. Results using a variable turbulent Prandtl number approach demonstrate that the current model accurately reproduces available experimental data for first and second-mode dominated transitional flows. The performance of the present model shows significant improvement over previous transition modeling attempts.

  13. Is Whole-Body Computed Tomography the Standard Work-up for Severely-Injured Children? Results of a Survey among German Trauma Centers.

    Science.gov (United States)

    Bayer, J; Reising, K; Kuminack, K; Südkamp, N P; Strohm, P C

    2015-01-01

    Whole-body computed tomography is accepted as the standard procedure in the primary diagnostic work-up of polytraumatised adults in the emergency room. Whether the same algorithm should be applied in the primary work-up of children is still controversial. The aim of this study was to survey the participation of German trauma centres in the care of polytraumatised children and the hospital-dependent use of whole-body computed tomography for initial patient work-up. A questionnaire was mailed to every department of traumatology registered in the DGU (German Trauma Society) databank. We received 60.32% of the questionnaires, and after applying exclusion criteria 269 (53.91%) were suitable for statistical analysis. In the three-tiered German hospital system, no statistical difference was seen in general participation in paediatric polytrauma care between hospitals of different tiers (p = 0.315). Even at the lowest hospital level, 69.47% of hospitals stated that they participate in polytrauma care for children; at intermediate- and highest-level hospitals, 91.89% and 95.24%, respectively, stated that they are involved in paediatric polytrauma care. Children with suspected multiple injuries or polytrauma received significantly fewer primary whole-body CTs at lowest-level compared with intermediate-level hospitals (36.07% vs. 56.57%; p = 0.015) and at lowest-level compared with highest-level hospitals (36.07% vs. 68.42%; p = 0.001). Comparing the use of whole-body CT in intermediate- with highest-level hospitals, a non-significant increase in its use was seen in highest-level hospitals (56.57% vs. 68.42%; p = 0.174). According to our survey, the care of polytraumatised children in Germany is not limited to specialised hospitals or a defined hospital level of care. Additionally, there is no established radiological standard in the work-up of the polytraumatised child. However, at higher hospital care levels a higher percentage of hospitals employs whole-body CT for primary

  14. Shielding Benchmark Computational Analysis

    International Nuclear Information System (INIS)

    Hunter, H.T.; Slater, C.O.; Holland, L.B.; Tracz, G.; Marshall, W.J.; Parsons, J.L.

    2000-01-01

    Over the past several decades, nuclear science has relied on experimental research to verify and validate information about shielding nuclear radiation for a variety of applications. These benchmarks are compared with results from computer code models and are useful for the development of more accurate cross-section libraries, computer code development of radiation transport modeling, and building accurate tests for miniature shielding mockups of new nuclear facilities. When documenting measurements, one must describe many parts of the experimental results to allow a complete computational analysis. Both old and new benchmark experiments, by any definition, must provide a sound basis for modeling more complex geometries required for quality assurance and cost savings in nuclear project development. Benchmarks may involve one or many materials and thicknesses, types of sources, and measurement techniques. In this paper the benchmark experiments of varying complexity are chosen to study the transport properties of some popular materials and thicknesses. These were analyzed using three-dimensional (3-D) models and continuous energy libraries of MCNP4B2, a Monte Carlo code developed at Los Alamos National Laboratory, New Mexico. A shielding benchmark library provided the experimental data and allowed a wide range of choices for source, geometry, and measurement data. The experimental data had often been used in previous analyses by reputable groups such as the Cross Section Evaluation Working Group (CSEWG) and the Organization for Economic Cooperation and Development/Nuclear Energy Agency Nuclear Science Committee (OECD/NEANSC)

  15. Reconfigurable Computing

    CERN Document Server

    Cardoso, Joao MP

    2011-01-01

    As the complexity of modern embedded systems increases, it becomes less practical to design monolithic processing platforms. As a result, reconfigurable computing is being adopted widely for more flexible design. Reconfigurable Computers offer the spatial parallelism and fine-grained customizability of application-specific circuits with the postfabrication programmability of software. To make the most of this unique combination of performance and flexibility, designers need to be aware of both hardware and software issues. FPGA users must think not only about the gates needed to perform a comp

  16. RESRAD-BUILD: A computer model for analyzing the radiological doses resulting from the remediation and occupancy of buildings contaminated with radioactive material

    International Nuclear Information System (INIS)

    Yu, C.; LePoire, D.J.; Jones, L.G.

    1994-11-01

    The RESRAD-BUILD computer code is a pathway analysis model designed to evaluate the potential radiological dose incurred by an individual who works or lives in a building contaminated with radioactive material. The transport of radioactive material inside the building from one compartment to another is calculated with an indoor air quality model. The air quality model considers the transport of radioactive dust particulates and radon progeny due to air exchange, deposition and resuspension, and radioactive decay and ingrowth. A single run of the RESRAD-BUILD code can model a building with up to three compartments, 10 distinct source geometries, and 10 receptor locations. A shielding material can be specified between each source-receptor pair for external gamma dose calculations. Six exposure pathways are considered in the RESRAD-BUILD code: (1) external exposure directly from the source; (2) external exposure to materials deposited on the floor; (3) external exposure due to air submersion; (4) inhalation of airborne radioactive particulates; (5) inhalation of aerosol indoor radon progeny; and (6) inadvertent ingestion of radioactive material, either directly from the sources or from materials deposited on the surfaces of the building compartments.
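
    For a single compartment with a constant source, the kind of mass balance such an indoor air quality model solves reduces to one equation. The sketch below computes the steady-state air concentration under removal by air exchange, deposition, and decay; all parameter values are hypothetical, not RESRAD-BUILD defaults:

    ```python
    import math

    # Hypothetical one-compartment parameters (not RESRAD-BUILD defaults)
    volume = 50.0                      # room volume, m^3
    source = 1.0e-3                    # airborne release rate, Bq/s
    lam_exchange = 0.8 / 3600.0        # 0.8 air changes per hour, in 1/s
    lam_deposit = 0.5 / 3600.0         # deposition loss rate, 1/s
    t_half = 30.07 * 365.25 * 86400.0  # Cs-137 half-life, in seconds
    lam_decay = math.log(2) / t_half   # radioactive decay constant, 1/s

    # dC/dt = S/V - (lam_exchange + lam_deposit + lam_decay) * C = 0 at steady state
    lam_total = lam_exchange + lam_deposit + lam_decay
    c_air = source / (volume * lam_total)  # steady-state air concentration, Bq/m^3
    ```

    For a long-lived nuclide like Cs-137 the decay term is negligible next to air exchange and deposition; the multi-compartment model in the code couples several such balances through inter-compartment air flows.
    
    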

  17. [Misinterpretation of the anteversion in computer-assisted acetabular cup navigation as a result of a simplified palpation method of the frontal pelvic plane].

    Science.gov (United States)

    Richolt, J A; Rittmeister, M E

    2006-01-01

    Computer-assisted navigation of the acetabular cup in THR requires reliable digitisation of the bony landmarks defining the frontal pelvic plane by user-driven palpation. According to the system recommendations, the subcutaneous fat should be held aside during epicutaneous digitisation. To improve intraoperative practicability this is often neglected in the symphysis area; in these cases the fat is merely compressed, not pushed aside. In this study, soft-tissue thickness was assessed by ultrasound and pelvic geometry was measured in 72 patients to quantify the potential misinterpretation of cup anteversion caused by the simplified palpation. As reference we used data from the same patients acquired by the recommended palpation technique. Anteversion misinterpretation averaged 8.2° (range 2-24°). There were no correlations between soft-tissue thickness or misinterpretation and body weight, height, or pelvic size. Anteversion misinterpretation was highly significantly worse compared with the reference data. In 31% of the patients a navigation system would have misjudged the anteversion by more than 10°, and in 81% by more than 5°. Therefore the simplified palpation should not be used. For epicutaneous digitisation of the bony landmarks it is mandatory to push the subcutaneous fat aside.

  18. Low-dose computed tomography of the paranasal sinus and facial skull using a high-pitch dual-source system - First clinical results

    International Nuclear Information System (INIS)

    Schell, Boris; Bauer, Ralf W.; Lehnert, Thomas; Kerl, J.M.; Vogl, Thomas J.; Mack, Martin G.; Hambek, Markus; May, Angelika

    2011-01-01

    Computed tomography (CT) of the paranasal sinus is the standard diagnostic tool for a wide range of indications, mostly in younger patients. This study aims to assess the image quality of sinus CT obtained with a high-pitch dual-source technique, with special regard to radiation dose. Examinations were performed on a second-generation dual-source CT with a pitch factor of 3.0 (dual-source mode). Images were compared with those acquired with a pitch factor of 0.9 on the same system (single-source mode) and with those of a 16-slice CT. Image quality was evaluated by four blinded readers using a 5-point scale (1 = poor, 5 = excellent). Comparison of the dose-length product (DLP) was used to estimate radiation exposure. Seventy-three consecutive patients underwent imaging with the proposed CT protocols. The readers rated the image quality of the dual-source image sets nearly as good (3.62) as the single-source images on the same device (4.18) and those of the 16-slice CT (3.7). The DLP was roughly halved [51 mGy·cm vs. 97.8 mGy·cm vs. 116.9 mGy·cm (p < 0.01)]. Using the proposed dual-source mode when examining the paranasal sinus, diagnostic image quality can be achieved while drastically lowering the patient's radiation exposure. (orig.)
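
    The reported dose-length products translate into effective-dose estimates via E = k · DLP. The sketch below uses an assumed head-region conversion coefficient of 0.0021 mSv/(mGy·cm), a commonly quoted value that is not given in the abstract itself:

    ```python
    # DLP values from the abstract, in mGy*cm
    dlp = {
        "dual-source, pitch 3.0": 51.0,
        "single-source, pitch 0.9": 97.8,
        "16-slice CT": 116.9,
    }

    K_HEAD = 0.0021  # assumed head-region conversion coefficient, mSv/(mGy*cm)

    # Effective dose estimate E = K_HEAD * DLP for each protocol
    effective_dose = {name: K_HEAD * d for name, d in dlp.items()}

    # Relative dose saving of the high-pitch mode vs. the same scanner at pitch 0.9
    saving = 1.0 - dlp["dual-source, pitch 3.0"] / dlp["single-source, pitch 0.9"]
    ```

    With these numbers the high-pitch protocol saves just under half the dose relative to the conventional mode on the same scanner, consistent with the abstract's "roughly halved" DLP.
    
    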

  19. Clinical results with beta-methyl-p-(123I)iodophenylpentadecanoic acid, single-photon emission computed tomography in cardiac disease.

    Science.gov (United States)

    Nishimura, T; Uehara, T; Shimonagata, T; Nagata, S; Haze, K

    1994-01-01

    This study was undertaken to evaluate the relationships between myocardial perfusion and metabolism. Simultaneous beta-methyl-p-(123I)iodophenylpentadecanoic acid (123I-BMIPP) and thallium-201 myocardial single-photon emission computed tomography (SPECT) were performed in 25 patients with myocardial infarction (group A) and 16 patients with hypertrophic cardiomyopathy (group B). The severity scores of 123I-BMIPP and 201Tl myocardial SPECT images were evaluated semiquantitatively by segmental analysis. In Group A, dissociations between thallium- and 123I-BMIPP-imaged defects were frequently observed in patients with successful reperfusion compared with those with no reperfusion and those with reinfarction. In four patients with successful reperfusion, repeated 123I-BMIPP and 201Tl myocardial SPECT showed gradual improvement of the 123I-BMIPP severity score compared with the thallium severity score. In group B, dissociations between thallium- and 123I-BMIPP-imaged defects were also demonstrated in hypertrophic myocardium. In addition, nonhypertrophic myocardium also had decreased 123I-BMIPP uptake. In groups A and B, 123I-BMIPP severity scores correlated well with left ventricular function compared with thallium severity scores. These findings indicate that 123I-BMIPP is a suitable agent for the assessment of functional integrity, because left ventricular wall motion is energy dependent and 123I-BMIPP may reflect an aspect of myocardial energy production. This agent may be useful for the early detection and patient management of various heart diseases as an alternative to positron emission tomographic study.

  20. Business model elements impacting cloud computing adoption

    DEFF Research Database (Denmark)

    Bogataj, Kristina; Pucihar, Andreja; Sudzina, Frantisek

    The paper presents a proposed research framework for identification of business model elements impacting Cloud Computing Adoption. We provide a definition of main Cloud Computing characteristics, discuss previous findings on factors impacting Cloud Computing Adoption, and investigate technology a...

  1. Scientific computing

    CERN Document Server

    Trangenstein, John A

    2017-01-01

    This is the third of three volumes providing a comprehensive presentation of the fundamentals of scientific computing. This volume discusses topics that depend more on calculus than linear algebra, in order to prepare the reader for solving differential equations. This book and its companions show how to determine the quality of computational results, and how to measure the relative efficiency of competing methods. Readers learn how to determine the maximum attainable accuracy of algorithms, and how to select the best method for computing problems. This book also discusses programming in several languages, including C++, Fortran and MATLAB. There are 90 examples, 200 exercises, 36 algorithms, 40 interactive JavaScript programs, 91 references to software programs and 1 case study. Topics are introduced with goals, literature references and links to public software. There are descriptions of the current algorithms in GSLIB and MATLAB. This book could be used for a second course in numerical methods, for either ...

  2. Personality disorders in previously detained adolescent females: a prospective study

    NARCIS (Netherlands)

    Krabbendam, A.; Colins, O.F.; Doreleijers, T.A.H.; van der Molen, E.; Beekman, A.T.F.; Vermeiren, R.R.J.M.

    2015-01-01

    This longitudinal study investigated the predictive value of trauma and mental health problems for the development of antisocial personality disorder (ASPD) and borderline personality disorder (BPD) in previously detained women. The participants were 229 detained adolescent females who were assessed

  3. Payload specialist Reinhard Furrer show evidence of previous blood sampling

    Science.gov (United States)

    1985-01-01

    Payload specialist Reinhard Furrer shows evidence of previous blood sampling while Wubbo J. Ockels, Dutch payload specialist (only partially visible), extends his right arm after a sample has been taken. Both men show bruises on their arms.

  4. Choice of contraception after previous operative delivery at a family ...

    African Journals Online (AJOL)

    Choice of contraception after previous operative delivery at a family planning clinic in Northern Nigeria. Amina Mohammed‑Durosinlorun, Joel Adze, Stephen Bature, Caleb Mohammed, Matthew Taingson, Amina Abubakar, Austin Ojabo, Lydia Airede ...

  5. A previous hamstring injury affects kicking mechanics in soccer players.

    Science.gov (United States)

    Navandar, Archit; Veiga, Santiago; Torres, Gonzalo; Chorro, David; Navarro, Enrique

    2018-01-10

    Although the kicking skill is influenced by limb dominance and sex, how a previous hamstring injury affects kicking has not been studied in detail. Thus, the objective of this study was to evaluate the effect of sex and limb dominance on kicking in limbs with and without a previous hamstring injury. 45 professional players (males: n=19, previously injured players=4, age=21.16 ± 2.00 years; females: n=19, previously injured players=10, age=22.15 ± 4.50 years) performed 5 kicks each with their preferred and non-preferred limb at a target 7 m away, which were recorded with a three-dimensional motion capture system. Kinematic and kinetic variables were extracted for the backswing, leg cocking, leg acceleration and follow-through phases. A shorter backswing (20.20 ± 3.49% vs 25.64 ± 4.57%), and differences in knee flexion angle (58 ± 10° vs 72 ± 14°) and hip flexion velocity (8 ± 0 rad/s vs 10 ± 2 rad/s) were observed in previously injured, non-preferred limb kicks for females. A lower peak hip linear velocity (3.50 ± 0.84 m/s vs 4.10 ± 0.45 m/s) was observed in previously injured, preferred limb kicks of females. These differences occurred in the backswing and leg-cocking phases, where the hamstring muscles were the most active. A variation in the functioning of the hamstring muscles and that of the gluteus maximus and iliopsoas in the case of a previous injury could account for the differences observed in the kicking pattern. Therefore, the effects of a previous hamstring injury must be considered while designing rehabilitation programs to re-educate the kicking movement.

  6. Quantitative coronary plaque analysis predicts high-risk plaque morphology on coronary computed tomography angiography: results from the ROMICAT II trial.

    Science.gov (United States)

    Liu, Ting; Maurovich-Horvat, Pál; Mayrhofer, Thomas; Puchner, Stefan B; Lu, Michael T; Ghemigian, Khristine; Kitslaar, Pieter H; Broersen, Alexander; Pursnani, Amit; Hoffmann, Udo; Ferencik, Maros

    2018-02-01

    Semi-automated software can provide quantitative assessment of atherosclerotic plaques on coronary CT angiography (CTA). The relationship between established qualitative high-risk plaque features and quantitative plaque measurements has not been studied. We analyzed the association between quantitative plaque measurements and qualitative high-risk plaque features on coronary CTA. We included 260 patients with plaque who underwent coronary CTA in the Rule Out Myocardial Infarction/Ischemia Using Computer Assisted Tomography (ROMICAT) II trial. Quantitative plaque assessment and qualitative plaque characterization were performed on a per coronary segment basis. Quantitative coronary plaque measurements included plaque volume, plaque burden, remodeling index, and diameter stenosis. In qualitative analysis, high-risk plaque was present if positive remodeling, low CT attenuation plaque, napkin-ring sign or spotty calcium were detected. Univariable and multivariable logistic regression analyses were performed to assess the association between quantitative and qualitative high-risk plaque assessment. Among 888 segments with coronary plaque, high-risk plaque was present in 391 (44.0%) segments by qualitative analysis. In quantitative analysis, segments with high-risk plaque had higher total plaque volume, low CT attenuation plaque volume, plaque burden and remodeling index. Quantitatively assessed low CT attenuation plaque volume (odds ratio 1.12 per 1 mm³, 95% CI 1.04-1.21), positive remodeling (odds ratio 1.25 per 0.1, 95% CI 1.10-1.41) and plaque burden (odds ratio 1.53 per 0.1, 95% CI 1.08-2.16) were associated with high-risk plaque. Quantitative coronary plaque characteristics (low CT attenuation plaque volume, positive remodeling and plaque burden) measured by semi-automated software correlated with qualitative assessment of high-risk plaque features.

  7. Sacrococcygeal pilonidal disease: analysis of previously proposed risk factors

    Directory of Open Access Journals (Sweden)

    Ali Harlak

    2010-01-01

    Full Text Available PURPOSE: Sacrococcygeal pilonidal disease is the source of one of the most common surgical problems among young adults. While male gender, obesity, occupations requiring sitting, deep natal clefts, excessive body hair, poor body hygiene and excessive sweating are described as the main risk factors for this disease, most of these need to be verified in a clinical trial. The present study aimed to evaluate the value and effect of these factors on pilonidal disease. METHOD: The previously proposed main risk factors were evaluated in a prospective case-control study that included 587 patients with pilonidal disease and 2,780 healthy controls. RESULTS: Stiffness of body hair, number of baths and time spent seated per day were the three most predictive risk factors. Adjusted odds ratios were 9.23, 6.33 and 4.03, respectively (p<0.001). With an adjusted odds ratio of 1.3 (p<0.001), body mass index was another risk factor. Family history did not differ statistically between the groups, and no specific occupation was associated with the disease. CONCLUSIONS: Hairy people who sit for more than six hours a day and who bathe twice a week or less are at a 219-fold increased risk of sacrococcygeal pilonidal disease compared with those without these risk factors. People with a great deal of hair have a greater need to clean their intergluteal sulcus. People whose work requires sitting for long periods should choose more comfortable seats and should also try to stand whenever possible.
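
    The adjusted odds ratios above come from a multivariable model, but the underlying computation can be illustrated on a single 2×2 exposure table. The sketch below computes a crude odds ratio with a Wald 95% confidence interval; the counts used are hypothetical, purely for illustration, and are not the study's data.

    ```python
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """Crude odds ratio and 95% Wald CI for a 2x2 table:
        a = exposed cases, b = exposed controls,
        c = unexposed cases, d = unexposed controls."""
        or_ = (a * d) / (b * c)
        se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
        lo = math.exp(math.log(or_) - z * se)
        hi = math.exp(math.log(or_) + z * se)
        return or_, lo, hi

    # Hypothetical counts for illustration only (not from the study):
    odds, lo, hi = odds_ratio_ci(120, 80, 467, 2700)
    print(f"OR = {odds:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
    ```

    An adjusted odds ratio additionally controls for the other covariates via a multivariable logistic model, which is what the study reports.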

  8. Impact of Students’ Class Attendance on Recalling Previously Acquired Information

    Directory of Open Access Journals (Sweden)

    Camellia Hemyari

    2018-03-01

    Full Text Available Background: In recent years, the availability of class material, including typed lectures, the professor's PowerPoint slides, sound recordings, and even videos, has made a group of students feel that it is unnecessary to attend classes. These students usually read and memorize the typed lectures within two or three days before the exams and usually pass the tests even with a low attendance rate. The question, then, is how effective this learning system is and how long the one-night memorized lessons may last. Methods: A group of medical students (62 out of 106), whose class attendance and educational achievements in the Medical Mycology and Parasitology course had been recorded two years earlier, was selected, and their knowledge of this course was tested with multiple-choice questions (MCQ) designed from the previous lectures. Results: Although the students' mean re-exam score at the end of the externship was lower than the corresponding final score, a significant association was found between the scores on these two exams (r=0.48, P=0.01). Moreover, a significant negative association was found between the number of absences and re-exam scores (r=-0.26, P=0.037). Conclusion: As our findings show, recall of acquired lessons is preserved over a long period and is associated with students' attendance. Many factors, including the generation effect (by taking notes) and cued recall (via slide pictures), might play a significant role in the better recall of learned information in students with good class attendance. Keywords: STUDENT, MEMORY, LONG-TERM, RECALL, ABSENTEEISM, LEARNING
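
    The r values reported above are Pearson correlation coefficients. A minimal sketch of how such a coefficient is computed from paired exam scores follows; the score pairs are hypothetical, for illustration only, and are not the study's data.

    ```python
    import math

    def pearson_r(x, y):
        """Pearson correlation coefficient for two equal-length sequences."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    # Hypothetical paired scores (final exam vs re-exam), illustration only:
    final_scores = [78, 85, 62, 90, 71, 66, 80]
    reexam_scores = [70, 80, 55, 84, 75, 60, 72]
    print(f"r = {pearson_r(final_scores, reexam_scores):.2f}")
    ```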

  9. Repeat immigration: A previously unobserved source of heterogeneity?

    Science.gov (United States)

    Aradhya, Siddartha; Scott, Kirk; Smith, Christopher D

    2017-07-01

    Register data allow for nuanced analyses of heterogeneities between sub-groups which are not observable in other data sources. One heterogeneity for which register data is particularly useful is in identifying unique migration histories of immigrant populations, a group of interest across disciplines. Years since migration is a commonly used measure of integration in studies seeking to understand the outcomes of immigrants. This study constructs detailed migration histories to test whether misclassified migrations may mask important heterogeneities. In doing so, we identify a previously understudied group of migrants called repeat immigrants, and show that they differ systematically from permanent immigrants. In addition, we quantify the degree to which migration information is misreported in the registers. The analysis is carried out in two steps. First, we estimate income trajectories for repeat immigrants and permanent immigrants to understand the degree to which they differ. Second, we test data validity by cross-referencing migration information with changes in income to determine whether there are inconsistencies indicating misreporting. From the first part of the analysis, the results indicate that repeat immigrants systematically differ from permanent immigrants in terms of income trajectories. Furthermore, income trajectories differ based on the way in which years since migration is calculated. The second part of the analysis suggests that misreported migration events, while present, are negligible. Repeat immigrants differ in terms of income trajectories, and may differ in terms of other outcomes as well. Furthermore, this study underlines that Swedish registers provide a reliable data source to analyze groups which are unidentifiable in other data sources.

  10. Brain-Computer Interface-based robotic end effector system for wrist and hand rehabilitation: results of a three-armed randomized controlled trial for chronic stroke

    Directory of Open Access Journals (Sweden)

    Kai Keng eAng

    2014-07-01

    Full Text Available The objective of this study was to investigate the efficacy of an Electroencephalography (EEG)-based Motor Imagery (MI) Brain-Computer Interface (BCI) coupled with a Haptic Knob (HK) robot for arm rehabilitation in stroke patients. In this three-arm, single-blind, randomized controlled trial; 21 chronic hemiplegic stroke patients (Fugl-Meyer Motor Assessment (FMMA) score 10-50), recruited after pre-screening for MI BCI ability, were randomly allocated to BCI-HK, HK or Standard Arm Therapy (SAT) groups. All groups received 18 sessions of intervention over 6 weeks, 3 sessions per week, 90 minutes per session. The BCI-HK group received 1 hour of BCI coupled with HK intervention, and the HK group received 1 hour of HK intervention per session. Both BCI-HK and HK groups received 120 trials of robot-assisted hand grasping and knob manipulation followed by 30 minutes of therapist-assisted arm mobilization. The SAT group received 1.5 hours of therapist-assisted arm mobilization and forearm pronation-supination movements incorporating wrist control and grasp-release functions. In all, 14 males, 7 females, mean age 54.2 years, mean stroke duration 385.1 days, with baseline FMMA score 27.0 were recruited. The primary outcome measure was upper-extremity FMMA scores measured mid-intervention at week 3, end-intervention at week 6, and follow-up at weeks 12 and 24. Seven, 8 and 7 subjects underwent BCI-HK, HK and SAT interventions respectively. FMMA score improved in all groups, but no intergroup differences were found at any time points. Significantly larger motor gains were observed in the BCI-HK group compared to the SAT group at weeks 3, 12 and 24, but motor gains in the HK group did not differ from the SAT group at any time point. In conclusion, BCI-HK is effective, safe, and may have the potential for enhancing motor recovery in chronic stroke when combined with therapist-assisted arm mobilization.

  11. Brain-computer interface-based robotic end effector system for wrist and hand rehabilitation: results of a three-armed randomized controlled trial for chronic stroke.

    Science.gov (United States)

    Ang, Kai Keng; Guan, Cuntai; Phua, Kok Soon; Wang, Chuanchu; Zhou, Longjiang; Tang, Ka Yin; Ephraim Joseph, Gopal J; Kuah, Christopher Wee Keong; Chua, Karen Sui Geok

    2014-01-01

    The objective of this study was to investigate the efficacy of an Electroencephalography (EEG)-based Motor Imagery (MI) Brain-Computer Interface (BCI) coupled with a Haptic Knob (HK) robot for arm rehabilitation in stroke patients. In this three-arm, single-blind, randomized controlled trial; 21 chronic hemiplegic stroke patients (Fugl-Meyer Motor Assessment (FMMA) score 10-50), recruited after pre-screening for MI BCI ability, were randomly allocated to BCI-HK, HK or Standard Arm Therapy (SAT) groups. All groups received 18 sessions of intervention over 6 weeks, 3 sessions per week, 90 min per session. The BCI-HK group received 1 h of BCI coupled with HK intervention, and the HK group received 1 h of HK intervention per session. Both BCI-HK and HK groups received 120 trials of robot-assisted hand grasping and knob manipulation followed by 30 min of therapist-assisted arm mobilization. The SAT group received 1.5 h of therapist-assisted arm mobilization and forearm pronation-supination movements incorporating wrist control and grasp-release functions. In all, 14 males, 7 females, mean age 54.2 years, mean stroke duration 385.1 days, with baseline FMMA score 27.0 were recruited. The primary outcome measure was upper extremity FMMA scores measured mid-intervention at week 3, end-intervention at week 6, and follow-up at weeks 12 and 24. Seven, 8 and 7 subjects underwent BCI-HK, HK and SAT interventions respectively. FMMA score improved in all groups, but no intergroup differences were found at any time points. Significantly larger motor gains were observed in the BCI-HK group compared to the SAT group at weeks 3, 12, and 24, but motor gains in the HK group did not differ from the SAT group at any time point. In conclusion, BCI-HK is effective, safe, and may have the potential for enhancing motor recovery in chronic stroke when combined with therapist-assisted arm mobilization.

  12. Can the possibility of transverse iliosacral screw fixation for first sacral segment be predicted preoperatively? Results of a computational cadaveric study.

    Science.gov (United States)

    Jeong, Jin-Hoon; Jin, Jin Woo; Kang, Byoung Youl; Jung, Gu-Hee

    2017-10-01

    The purpose of this study was to predict the possibility of transverse iliosacral (TIS) screw fixation into the first sacral segment (S1) and to introduce practical anatomical variables using conventional computed tomography (CT) scans. A total of 82 cadaveric sacra (42 males and 40 females) were used for continuous 1.0-mm slice CT scans, which were imported into Mimics® software to produce a three-dimensional pelvis model. The anterior height (BH) and superior width (BW) of the elevated sacral segment were measured, followed by verification of the safe zones (SZ-S1 and SZ-S2) in a true lateral view. Their vertical (VD-S1 and VD-S2) and horizontal (HD-S1 and HD-S2) distances were measured. A VD-S1 of less than 7 mm was classified as an impossible sacrum, since transverse fixation of a 7.0-mm IS screw could not be done safely. Fourteen models (16.7%; six females, eight males) were assigned as impossible sacra. There was no statistical significance regarding gender (p=0.626) or height (p=0.419). The average values were as follows: BW, 31.4 mm (SD 2.9); BH, 16.7 mm (SD 6.8); VD-S1, 13.4 mm (SD 6.1); HD-S1, 22.5 mm (SD 4.5); SZ-S1, 239.5 mm² (SD 137.1); VD-S2, 15.5 mm (SD 3.0); HD-S2, 18.3 mm (SD 2.9); and SZ-S2, 221.1 mm² (SD 68.5). Logistic regression analysis identified BH (p=0.001) and HD-S1 (p=0.02) as the only statistically significant variables for predicting the possibility. Receiver operating characteristic curve analysis established cut-off values for BH and HD-S1 of the impossible sacrum of 20.6 mm and 18.6 mm, respectively. BH and HD-S1 could be used to predict the possibility of TIS screw fixation. If BH exceeds 20.6 mm or HD-S1 is less than 18.6 mm, TIS screw fixation for S1 should not be undertaken because of the narrowed safe zone. Copyright © 2017 Elsevier Ltd. All rights reserved.
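
    The cut-off values above come from receiver operating characteristic (ROC) analysis. One common way to pick such a cut-off is Youden's J statistic (sensitivity + specificity − 1); the sketch below applies it to hypothetical BH measurements, which are illustrative only and not the study's data.

    ```python
    def youden_cutoff(values, labels):
        """Pick the threshold maximizing Youden's J = sensitivity + specificity - 1.
        labels: 1 = positive class (e.g. 'impossible sacrum'), 0 = negative."""
        best_j, best_cut = -1.0, None
        for cut in sorted(set(values)):
            tp = sum(1 for v, l in zip(values, labels) if v >= cut and l == 1)
            fn = sum(1 for v, l in zip(values, labels) if v < cut and l == 1)
            tn = sum(1 for v, l in zip(values, labels) if v < cut and l == 0)
            fp = sum(1 for v, l in zip(values, labels) if v >= cut and l == 0)
            sens = tp / (tp + fn) if tp + fn else 0.0
            spec = tn / (tn + fp) if tn + fp else 0.0
            j = sens + spec - 1
            if j > best_j:
                best_j, best_cut = j, cut
        return best_cut, best_j

    # Hypothetical BH measurements (mm) and labels, illustration only:
    bh = [12.0, 15.5, 18.0, 19.9, 21.0, 23.5, 25.0, 26.4]
    impossible = [0, 0, 0, 0, 1, 1, 1, 1]
    cut, j = youden_cutoff(bh, impossible)
    print(f"cut-off = {cut} mm, J = {j:.2f}")
    ```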

  13. A Faster Algorithm for Computing Straight Skeletons

    KAUST Repository

    Cheng, Siu-Wing

    2014-09-01

    We present a new algorithm for computing the straight skeleton of a polygon. For a polygon with n vertices, among which r are reflex vertices, we give a deterministic algorithm that reduces the straight skeleton computation to a motorcycle graph computation in O(n (log n) log r) time. It improves on the previously best known algorithm for this reduction, which is randomized, and runs in expected O(n √(h+1) log² n) time for a polygon with h holes. Using known motorcycle graph algorithms, our result yields improved time bounds for computing straight skeletons. In particular, we can compute the straight skeleton of a non-degenerate polygon in O(n (log n) log r + r^(4/3 + ε)) time for any ε > 0. On degenerate input, our time bound increases to O(n (log n) log r + r^(17/11 + ε)).

  14. A Faster Algorithm for Computing Straight Skeletons

    KAUST Repository

    Mencel, Liam A.

    2014-05-06

    We present a new algorithm for computing the straight skeleton of a polygon. For a polygon with n vertices, among which r are reflex vertices, we give a deterministic algorithm that reduces the straight skeleton computation to a motorcycle graph computation in O(n (log n) log r) time. It improves on the previously best known algorithm for this reduction, which is randomised, and runs in expected O(n √(h+1) log² n) time for a polygon with h holes. Using known motorcycle graph algorithms, our result yields improved time bounds for computing straight skeletons. In particular, we can compute the straight skeleton of a non-degenerate polygon in O(n (log n) log r + r^(4/3 + ε)) time for any ε > 0. On degenerate input, our time bound increases to O(n (log n) log r + r^(17/11 + ε))

  15. A Faster Algorithm for Computing Straight Skeletons

    KAUST Repository

    Cheng, Siu-Wing; Mencel, Liam A.; Vigneron, Antoine E.

    2014-01-01

    We present a new algorithm for computing the straight skeleton of a polygon. For a polygon with n vertices, among which r are reflex vertices, we give a deterministic algorithm that reduces the straight skeleton computation to a motorcycle graph computation in O(n (log n) log r) time. It improves on the previously best known algorithm for this reduction, which is randomized, and runs in expected O(n √(h+1) log² n) time for a polygon with h holes. Using known motorcycle graph algorithms, our result yields improved time bounds for computing straight skeletons. In particular, we can compute the straight skeleton of a non-degenerate polygon in O(n (log n) log r + r^(4/3 + ε)) time for any ε > 0. On degenerate input, our time bound increases to O(n (log n) log r + r^(17/11 + ε)).
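
    To get a feel for the improvement these bounds represent, the two reduction costs can be evaluated numerically. The sketch below compares n (log n) log r against n √(h+1) log² n for illustrative parameter values; hidden constants are ignored, so only the ratio of the two numbers is meaningful.

    ```python
    import math

    def new_bound(n, r):
        # Deterministic reduction cost from this paper (constants omitted)
        return n * math.log2(n) * math.log2(r)

    def old_bound(n, h):
        # Expected cost of the previous randomized reduction (constants omitted)
        return n * math.sqrt(h + 1) * math.log2(n) ** 2

    # Illustrative parameter values, chosen arbitrarily:
    n, r, h = 1_000_000, 1000, 50
    print(f"new ~ {new_bound(n, r):.3e}, old ~ {old_bound(n, h):.3e}")
    ```

    For these values the deterministic bound is roughly an order of magnitude smaller, chiefly because it replaces a log n factor and the √(h+1) dependence with a log r factor.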

  16. Long-term safety assessment of trench-type surface repository at Chernobyl, Ukraine - computer model and comparison with results from simplified models

    International Nuclear Information System (INIS)

    Haverkamp, B.; Krone, J.; Shybetskyi, I.

    2013-01-01

    The Radioactive Waste Disposal Facility (RWDF) Buryakovka was constructed in 1986 as part of the intervention measures after the accident at Chernobyl NPP (ChNPP). Today, the surface repository for solid low and intermediate level waste (LILW) is still being operated, but its maximum capacity is nearly reached. Long-existing plans for increasing the capacity of the facility shall be implemented in the framework of the European Commission INSC Programme (Instrument for Nuclear Safety Co-operation). Within the first phase of this project, DBE Technology GmbH prepared a safety analysis report of the facility in its current state (SAR) and a preliminary safety analysis report (PSAR) for a future extended facility based on the planned enlargement. In addition to a detailed mathematical model, simplified models have also been developed to verify the results of the former and enhance confidence in the results. Comparison of the results shows that - depending on the boundary conditions - simplifications such as modeling the multi-trench repository as one generic trench may have very limited influence on the overall results compared with the general uncertainties associated with such long-term calculations. In addition to their value for verifying more complex models, which is important to increase confidence in the overall results, such simplified models can also offer the possibility to carry out time-consuming calculations, like probabilistic calculations or detailed sensitivity analyses, in an economic manner. (authors)

  17. Detailed computational fluid dynamics calculations in order to assess respective safety issues regarding existing nuclear power plant. Interpretation and presentation of results

    International Nuclear Information System (INIS)

    2014-11-01

    The results of the analysis showed that, in the case of injection of cold water into the loops, water temperature stratification in the pipes will occur if all the RCPs (Reactor Coolant Pumps) are stopped. This temperature stratification can lead to temperature misreading by sensors if they are located improperly, and may lead to incorrect operation of the PTS (Pressurized Thermal Shock) protection system. Therefore, the results of the current analysis can be used to choose the correct locations for the temperature sensors to be installed in order to determine whether the PTS protection system should be activated

  18. A Computer Prescribing Order Entry-Clinical Decision Support system designed for neonatal care: results of the 'preselected prescription' concept at the bedside.

    Science.gov (United States)

    Gouyon, B; Iacobelli, S; Saliba, E; Quantin, C; Pignolet, A; Jacqz-Aigrain, E; Gouyon, J B

    2017-02-01

    The neonatal intensive care units (NICUs) are at the highest risk of drug dose errors of all hospital wards. NICUs also have the most complicated prescription modalities. Computerization of the prescription process is currently recommended to decrease the risk of preventable adverse drug effects (pADEs) in NICUs. However, Computer Prescribing Order Entry-Clinical Decision Support (C.P.O.E./C.D.S.) systems have been poorly studied in NICUs, and their technical compatibility with neonatal specificities has been limited. We set up a performance study of the preselected prescription of drugs for neonates, which limited the role of the prescriber to choosing the drugs and their indications. A single 29-bed neonatal ward used this neonatal C.P.O.E./C.D.S. system for all prescriptions for all hospitalized newborns over an 18-month period. The preselected prescription of drugs was based on the indication, gestational age, body weight and post-natal age. The therapeutic protocols were provided by a formulary reference (330 drugs) that had been specifically designed for newborns. The preselected prescription also gave complete information about the preparation and administration of drugs by nurses. The prescriber was allowed to modify the preselected prescription, but alarms provided a warning when the prescription was outside the recommended range. The main clinical characteristics and all items of each line of prescription were stored in a data warehouse, thus enabling this study to take place. Seven hundred and sixty successive newborns (from 24 to 42 weeks' gestation) were prescribed 52 392 lines of prescription corresponding to 65 drugs. About 30.4% of neonates had at least one off-license prescription. A prescription outside the recommended range for daily dose was recorded for 1.0% of all drug prescriptions. WHAT IS NEW?: The C.P.O.E./C.D.S. systems can currently provide a complete preselected prescription in NICUs according to dose rules, which are specific to
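
    The dose-range alarm described above can be sketched as a simple check of the prescribed daily dose against a weight-based formulary range. The schema, drug name and limits below are hypothetical illustrations, not taken from the paper's formulary:

    ```python
    def check_daily_dose(drug, dose_mg_per_kg_day, weight_kg, formulary):
        """Return an alarm message if the prescribed daily dose falls outside
        the formulary's recommended range, else None. Hypothetical schema:
        formulary maps drug name -> (low, high) recommended mg/kg/day."""
        low, high = formulary[drug]
        total = dose_mg_per_kg_day * weight_kg
        rec_low, rec_high = low * weight_kg, high * weight_kg
        if not (rec_low <= total <= rec_high):
            return (f"ALARM: {drug} {total:.1f} mg/day outside "
                    f"recommended {rec_low:.1f}-{rec_high:.1f} mg/day")
        return None

    # Hypothetical formulary entry (mg/kg/day), illustration only:
    formulary = {"caffeine citrate": (5.0, 10.0)}
    print(check_daily_dose("caffeine citrate", 12.0, 1.5, formulary))
    ```

    A real neonatal formulary would additionally condition the range on gestational age, post-natal age and indication, as the paper describes.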

  19. A survey of computational physics introductory computational science

    CERN Document Server

    Landau, Rubin H; Bordeianu, Cristian C

    2008-01-01

    Computational physics is a rapidly growing subfield of computational science, in large part because computers can solve previously intractable problems or simulate natural processes that do not have analytic solutions. The next step beyond Landau's First Course in Scientific Computing and a follow-up to Landau and Páez's Computational Physics, this text presents a broad survey of key topics in computational physics for advanced undergraduates and beginning graduate students, including new discussions of visualization tools, wavelet analysis, molecular dynamics, and computational fluid dynamics

  20. Erlotinib-induced rash spares previously irradiated skin

    International Nuclear Information System (INIS)

    Lips, Irene M.; Vonk, Ernest J.A.; Koster, Mariska E.Y.; Houwing, Ronald H.

    2011-01-01

    Erlotinib is an epidermal growth factor receptor inhibitor prescribed to patients with locally advanced or metastasized non-small cell lung carcinoma after failure of at least one earlier chemotherapy treatment. Approximately 75% of the patients treated with erlotinib develop acneiform skin rashes. A patient treated with erlotinib 3 months after finishing concomitant treatment with chemotherapy and radiotherapy for non-small cell lung cancer is presented. Unexpectedly, the part of the skin that had been included in his previous radiotherapy field was completely spared from the erlotinib-induced acneiform skin rash. The exact mechanism of erlotinib-induced rash sparing in previously irradiated skin is unclear. The underlying mechanism of this phenomenon needs to be explored further, because the number of patients being treated with a combination of both therapeutic modalities is increasing. The therapeutic effect of erlotinib in the area of the previously irradiated lesion should be assessed. (orig.)

  1. Reasoning with Previous Decisions: Beyond the Doctrine of Precedent

    DEFF Research Database (Denmark)

    Komárek, Jan

    2013-01-01

    … law method', but they are no less rational and intellectually sophisticated. The reason for the rather conceited attitude of some comparatists is the dominance of the common law paradigm of precedent and the accompanying 'case law method'. If we want to understand how courts and lawyers in different jurisdictions use previous judicial decisions in their argument, we need to move beyond the concept of precedent to a wider notion, which would embrace practices and theories in legal systems outside the Common law tradition. This article presents the concept of 'reasoning with previous decisions' as such an alternative and develops its basic models. The article first points out several shortcomings inherent in limiting the inquiry into reasoning with previous decisions to the common law paradigm (1). On the basis of numerous examples provided in section (1), I will present two basic models of reasoning…

  2. [Prevalence of previously diagnosed diabetes mellitus in Mexico].

    Science.gov (United States)

    Rojas-Martínez, Rosalba; Basto-Abreu, Ana; Aguilar-Salinas, Carlos A; Zárate-Rojas, Emiliano; Villalpando, Salvador; Barrientos-Gutiérrez, Tonatiuh

    2018-01-01

    To compare the prevalence of previously diagnosed diabetes in 2016 with that in previous national surveys and to describe treatment and its complications. Mexico's national surveys Ensa 2000 and Ensanut 2006, 2012 and 2016 were used. For 2016, logistic regression models and measures of central tendency and dispersion were obtained. The prevalence of previously diagnosed diabetes in 2016 was 9.4%. The increase of 2.2% relative to 2012 was not significant and was only observed in patients older than 60 years. While preventive measures have increased, access to medical treatment and lifestyle have not changed. Treatment has been modified, with an increase in insulin use and a decrease in hypoglycaemic agents. Population aging, lack of screening actions and the increase in diabetes complications will lead to an increase in the burden of disease. Policy measures targeting primary and secondary prevention of diabetes are crucial.

  3. Human population doses: Comparative analysis of CREAM code results with currently computer codes of Nuclear Regulatory Authority; Dosis en la poblacion: comparacion de los resultados del codigo CREAM con resultados de modelos vigentes en la ARN

    Energy Technology Data Exchange (ETDEWEB)

    Alonso Jimenez, Maria Teresa; Curti, Adriana [Autoridad Regulatoria Nuclear, Buenos Aires (Argentina)]. E-mail: mtalonso@sede.arn.gov.ar; acurti@sede.arn.gov.ar

    2001-07-01

    The Nuclear Regulatory Authority is performing an analysis with PC CREAM, developed at the NRPB, as part of updating the computer programs and models used for calculating the transfer of radionuclides through the environment. To verify CREAM dose assessments for local scenarios, this paper presents a comparison of population doses assessed with the computer codes currently in use and with CREAM, for unitary releases of the main radionuclides in nuclear power plant discharges. The results of atmospheric dispersion processes and the transfer of radionuclides through the environment for local scenarios are analysed. The programs used are PLUME for atmospheric dispersion, FARMLAND for the transfer of radionuclides into foodstuffs following atmospheric deposition in the terrestrial environment, and ASSESSOR for individual and collective dose assessments. This paper presents the general assumptions made for the dose assessments. The results show some differences between doses due to differences in the models, in the level of complexity of the same models, or in parameters. (author)

  4. Impact of energy conservation policy measures on innovation, investment and long-term development of the Swiss economy. Results from the computable induced technical change and energy (CITE) model - Final report

    Energy Technology Data Exchange (ETDEWEB)

    Bretschger, L.; Ramer, R.; Schwark, F.

    2010-09-15

    This comprehensive final report for the Swiss Federal Office of Energy (SFOE) presents the results of a study of the Computable Induced Technical Change and Energy (CITE) model. The authors note that, in the past two centuries, the Swiss economy experienced an unprecedented increase in living standards. At the same time, the stock of various natural resources declined and environmental conditions changed substantially. The sustainability of a low-energy and low-carbon society, as well as an optimal transition to this state, is evaluated. An economic analysis is made, and the CITE and CGE (Computable General Equilibrium) numerical simulation models are discussed. The results obtained are presented and discussed.

  5. Cardiovascular magnetic resonance in adults with previous cardiovascular surgery.

    Science.gov (United States)

    von Knobelsdorff-Brenkenhoff, Florian; Trauzeddel, Ralf Felix; Schulz-Menger, Jeanette

    2014-03-01

    Cardiovascular magnetic resonance (CMR) is a versatile non-invasive imaging modality that serves a broad spectrum of indications in clinical cardiology and has proven evidence. Most of the numerous applications are appropriate in patients with previous cardiovascular surgery in the same manner as in non-surgical subjects. However, some specifics have to be considered. This review article is intended to provide information about the application of CMR in adults with previous cardiovascular surgery. In particular, the two main scenarios, i.e. following coronary artery bypass surgery and following heart valve surgery, are highlighted. Furthermore, several pictorial descriptions of other potential indications for CMR after cardiovascular surgery are given.

  6. Computer programs for locating and fitting full energie peak in γ-ray spectra. Test and rules for an estimation of the main results

    International Nuclear Information System (INIS)

    1980-12-01

    Following the various interlaboratory tests on gamma-spectrum analysis organised by the 'Laboratoire de Metrologie des Rayonnements Ionisants' and by the International Atomic Energy Agency, it appeared useful to organise the same type of intercomparison among the different suppliers of data acquisition and analysis systems based on a minicomputer or microprocessor. Four spectra were chosen from those of the interlaboratory tests. The test dealt with locating the total absorption (full energy) peaks at different levels in a complex spectrum and calculating their main parameters. Four suppliers participated in the intercomparison with their own software. The results allow a few tests to be suggested for trying out new software, or for comparing results with standards [fr

  7. Geospatial Analysis Tool Kit for Regional Climate Datasets (GATOR) : An Open-source Tool to Compute Climate Statistic GIS Layers from Argonne Climate Modeling Results

    Science.gov (United States)

    2017-08-01

    This large repository of climate model results for North America (Wang and Kotamarthi 2013, 2014, 2015) is stored in Network Common Data Form (NetCDF; UCAR/Unidata Program Center, Boulder, CO, available at http://www.unidata.ucar.edu/software/netcdf). … parametric approach. This introduces uncertainty, because the parametric models are only as good as the available observations that form the basis for

  8. Parallel computing works

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-23

    An account of the Caltech Concurrent Computation Program (C³P), a five-year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C³P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high-performance computing facility based exclusively on parallel computers. While the initial focus of C³P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  9. Optical Computing

    OpenAIRE

    Woods, Damien; Naughton, Thomas J.

    2008-01-01

    We consider optical computers that encode data using images and compute by transforming such images. We give an overview of a number of such optical computing architectures, including descriptions of the type of hardware commonly used in optical computing, as well as some of the computational efficiencies of optical devices. We go on to discuss optical computing from the point of view of computational complexity theory, with the aim of putting some old, and some very recent, re...

  10. Computer ray tracing speeds.

    Science.gov (United States)

    Robb, P; Pawlowski, B

    1990-05-01

    The results of measuring the ray trace speed and compilation speed of thirty-nine computers in fifty-seven configurations, ranging from personal computers to super computers, are described. A correlation of ray trace speed has been made with the LINPACK benchmark which allows the ray trace speed to be estimated using LINPACK performance data. The results indicate that the latest generation of workstations, using CPUs based on RISC (Reduced Instruction Set Computer) technology, are as fast or faster than mainframe computers in compute-bound situations.
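    The correlation described above can be sketched as a simple power-law fit: given (LINPACK MFLOPS, measured rays/s) pairs, a least-squares fit in log-log space yields an estimator for machines with known LINPACK scores. The benchmark figures below are invented for illustration and are not the paper's data.

```python
import math

# Hypothetical (LINPACK MFLOPS, measured rays/s) pairs -- illustrative only,
# not the measurements reported in the article.
measurements = [
    (1.0, 2_000),
    (4.0, 8_500),
    (16.0, 33_000),
    (64.0, 130_000),
]

# Fit rays/s ~= a * MFLOPS^b by ordinary least squares in log-log space.
xs = [math.log(m) for m, _ in measurements]
ys = [math.log(r) for _, r in measurements]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
a = math.exp(my - b * mx)

def estimate_rays_per_second(linpack_mflops: float) -> float:
    """Predict ray-trace throughput from a LINPACK score via the fitted power law."""
    return a * linpack_mflops ** b

print(round(estimate_rays_per_second(32.0)))
```

With these invented points the fitted exponent is close to 1, i.e. ray-trace speed scales roughly linearly with LINPACK performance, which is the kind of relationship such a correlation study exploits.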

  11. On the Evaluation of Computational Results Obtained from Solving Systems of Linear Equations with Matlab: The Dual-Affine Scaling Interior Point Method

    International Nuclear Information System (INIS)

    Murfi, Hendri; Basaruddin, T.

    2001-01-01

    The interior point method for linear programming has gained extraordinary interest as an alternative to the simplex method since Karmarkar presented a polynomial-time algorithm for linear programming based on interior point methods. In implementations of such algorithms, two factors heavily impact performance: the data structure and the method used to solve the linear equation system within the algorithm. This paper describes solving the linear equation system in a variant of the algorithm called the dual-affine scaling algorithm. We then experimentally evaluate the results of several methods, both direct and iterative. The experimental evaluation used Matlab
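    The trade-off the authors evaluate, a direct versus an iterative solve of the linear system arising in each interior-point iteration, can be illustrated with a minimal sketch. Python/NumPy is used here rather than the paper's Matlab, and the small SPD system is a stand-in, not their test problem.

```python
import numpy as np

# Build a symmetric positive definite system A x = b, the kind of
# normal-equations system each dual-affine scaling iteration must solve.
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50 * np.eye(50)   # SPD by construction
b = rng.standard_normal(50)

# Direct method: one LU factorization and solve.
x_direct = np.linalg.solve(A, b)

def conjugate_gradient(A, b, tol=1e-10, max_iter=500):
    """Plain conjugate gradient, a standard iterative method for SPD systems."""
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

x_iter = conjugate_gradient(A, b)
print(np.max(np.abs(x_direct - x_iter)) < 1e-8)
```

Both routes reach the same solution here; which one is faster in practice depends on the sparsity and conditioning of A, which is exactly the kind of comparison the paper reports.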

  12. Squamous cell carcinoma arising in previously burned or irradiated skin

    International Nuclear Information System (INIS)

    Edwards, M.J.; Hirsch, R.M.; Broadwater, J.R.; Netscher, D.T.; Ames, F.C.

    1989-01-01

    Squamous cell carcinoma (SCC) arising in previously burned or irradiated skin was reviewed in 66 patients treated between 1944 and 1986. Healing of the initial injury was complicated in 70% of patients. Mean interval from initial injury to diagnosis of SCC was 37 years. The overwhelming majority of patients presented with a chronic intractable ulcer in previously injured skin. The regional relapse rate after surgical excision was very high, 58% of all patients. Predominant patterns of recurrence were in local skin and regional lymph nodes (93% of recurrences). Survival rates at 5, 10, and 20 years were 52%, 34%, and 23%, respectively. Five-year survival rates in previously burned and irradiated patients were not significantly different (53% and 50%, respectively). This review, one of the largest reported series, better defines SCC arising in previously burned or irradiated skin as a locally aggressive disease that is distinct from SCC arising in sunlight-damaged skin. An increased awareness of the significance of chronic ulceration in scar tissue may allow earlier diagnosis. Regional disease control and survival depend on surgical resection of all known disease and may require radical lymph node dissection or amputation

  13. Outcome Of Pregnancy Following A Previous Lower Segment ...

    African Journals Online (AJOL)

    Background: A previous caesarean section is an important variable that influences patient management in subsequent pregnancies. A trial of vaginal delivery in such patients is a feasible alternative to a secondary section, thus helping to reduce the caesarean section rate and its associated co-morbidities. Objective: To ...

  14. 24 CFR 1710.552 - Previously accepted state filings.

    Science.gov (United States)

    2010-04-01

    ... of Substantially Equivalent State Law § 1710.552 Previously accepted state filings. (a) Materials... and contracts or agreements contain notice of purchaser's revocation rights. In addition see § 1715.15..., unless the developer is obligated to do so in the contract. (b) If any such filing becomes inactive or...

  15. The job satisfaction of principals of previously disadvantaged schools

    African Journals Online (AJOL)

    The aim of this study was to identify influences on the job satisfaction of previously disadvantaged ..... I am still riding the cloud … I hope it lasts. .... as a way of creating a climate and culture in schools where individuals are willing to explore.

  16. Haemophilus influenzae type f meningitis in a previously healthy boy

    DEFF Research Database (Denmark)

    Ronit, Andreas; Berg, Ronan M G; Bruunsgaard, Helle

    2013-01-01

    Non-serotype b strains of Haemophilus influenzae are extremely rare causes of acute bacterial meningitis in immunocompetent individuals. We report a case of acute bacterial meningitis in a 14-year-old boy, who was previously healthy and had been immunised against H influenzae serotype b (Hib...

  17. Research Note Effects of previous cultivation on regeneration of ...

    African Journals Online (AJOL)

    We investigated the effects of previous cultivation on regeneration potential under miombo woodlands in a resettlement area, a spatial product of Zimbabwe's land reforms. We predicted that cultivation would affect population structure, regeneration, recruitment and potential grazing capacity of rangelands. Plant attributes ...

  18. Cryptococcal meningitis in a previously healthy child | Chimowa ...

    African Journals Online (AJOL)

    An 8-year-old previously healthy female presented with a 3 weeks history of headache, neck stiffness, deafness, fever and vomiting and was diagnosed with cryptococcal meningitis. She had documented hearing loss and was referred to tertiary-level care after treatment with fluconazole did not improve her neurological ...

  19. Rapid fish stock depletion in previously unexploited seamounts: the ...

    African Journals Online (AJOL)

    Rapid fish stock depletion in previously unexploited seamounts: the case of Beryx splendens from the Sierra Leone Rise (Gulf of Guinea) ... A spectral analysis and red-noise spectra procedure (REDFIT) algorithm was used to identify the red-noise spectrum from the gaps in the observed time-series of catch per unit effort by ...

  20. 18 CFR 154.302 - Previously submitted material.

    Science.gov (United States)

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Previously submitted material. 154.302 Section 154.302 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY... concurrently with the rate change filing. There must be furnished to the Director, Office of Energy Market...

  1. Process cells dismantling of EUREX plant: previous activities

    International Nuclear Information System (INIS)

    Gili, M.

    1998-01-01

    In the '98-'99 period some process cells of the EUREX plant will be dismantled, in order to make room there for the liquid waste conditioning plant 'CORA'. This report summarises the previous activities (plant rinsing campaigns and inactive Cell 014 dismantling) carried out over the past three years and the experience gained [it

  2. The job satisfaction of principals of previously disadvantaged schools

    African Journals Online (AJOL)

    The aim of this study was to identify influences on the job satisfaction of previously disadvantaged school principals in North-West Province. Evans's theory of job satisfaction, morale and motivation was useful as a conceptual framework. A mixedmethods explanatory research design was important in discovering issues with ...

  3. Obstructive pulmonary disease in patients with previous tuberculosis ...

    African Journals Online (AJOL)

    Obstructive pulmonary disease in patients with previous tuberculosis: Pathophysiology of a community-based cohort. B.W. Allwood, R Gillespie, M Galperin-Aizenberg, M Bateman, H Olckers, L Taborda-Barata, G.L. Calligaro, Q Said-Hartley, R van Zyl-Smit, C.B. Cooper, E van Rikxoort, J Goldin, N Beyers, E.D. Bateman ...

  4. Abiraterone in metastatic prostate cancer without previous chemotherapy

    NARCIS (Netherlands)

    Ryan, Charles J.; Smith, Matthew R.; de Bono, Johann S.; Molina, Arturo; Logothetis, Christopher J.; de Souza, Paul; Fizazi, Karim; Mainwaring, Paul; Piulats, Josep M.; Ng, Siobhan; Carles, Joan; Mulders, Peter F. A.; Basch, Ethan; Small, Eric J.; Saad, Fred; Schrijvers, Dirk; van Poppel, Hendrik; Mukherjee, Som D.; Suttmann, Henrik; Gerritsen, Winald R.; Flaig, Thomas W.; George, Daniel J.; Yu, Evan Y.; Efstathiou, Eleni; Pantuck, Allan; Winquist, Eric; Higano, Celestia S.; Taplin, Mary-Ellen; Park, Youn; Kheoh, Thian; Griffin, Thomas; Scher, Howard I.; Rathkopf, Dana E.; Boyce, A.; Costello, A.; Davis, I.; Ganju, V.; Horvath, L.; Lynch, R.; Marx, G.; Parnis, F.; Shapiro, J.; Singhal, N.; Slancar, M.; van Hazel, G.; Wong, S.; Yip, D.; Carpentier, P.; Luyten, D.; de Reijke, T.

    2013-01-01

    Abiraterone acetate, an androgen biosynthesis inhibitor, improves overall survival in patients with metastatic castration-resistant prostate cancer after chemotherapy. We evaluated this agent in patients who had not received previous chemotherapy. In this double-blind study, we randomly assigned

  5. Response to health insurance by previously uninsured rural children.

    Science.gov (United States)

    Tilford, J M; Robbins, J M; Shema, S J; Farmer, F L

    1999-08-01

    To examine the healthcare utilization and costs of previously uninsured rural children. Four years of claims data from a school-based health insurance program located in the Mississippi Delta. All children who were not Medicaid-eligible or were uninsured were eligible for limited benefits under the program. The 1987 National Medical Expenditure Survey (NMES) was used to compare utilization of services. The study represents a natural experiment in the provision of insurance benefits to a previously uninsured population. Premiums for the claims cost were set with little or no information on expected use of services. Claims from the insurer were used to form a panel data set. Mixed model logistic and linear regressions were estimated to determine the response to insurance for several categories of health services. The use of services increased over time and approached the level of utilization in the NMES. Conditional medical expenditures also increased over time. Actuarial estimates of claims cost greatly exceeded actual claims cost. The provision of a limited medical, dental, and optical benefit package cost approximately $20-$24 per member per month in claims paid. An important uncertainty in providing health insurance to previously uninsured populations is whether a pent-up demand exists for health services. Evidence of a pent-up demand for medical services was not supported in this study of rural school-age children. States considering partnerships with private insurers to implement the State Children's Health Insurance Program could lower premium costs by assembling basic data on previously uninsured children.

  6. Computer Vision Syndrome.

    Science.gov (United States)

    Randolph, Susan A

    2017-07-01

    With the increased use of electronic devices with visual displays, computer vision syndrome is becoming a major public health issue. Improving the visual status of workers using computers results in greater productivity in the workplace and improved visual comfort.

  7. 3D face reconstruction from 2D pictures: first results of a web-based computer aided system for aesthetic procedures.

    Science.gov (United States)

    Oliveira-Santos, Thiago; Baumberger, Christian; Constantinescu, Mihai; Olariu, Radu; Nolte, Lutz-Peter; Alaraibi, Salman; Reyes, Mauricio

    2013-05-01

    The human face is a vital component of our identity and many people undergo medical aesthetics procedures in order to achieve an ideal or desired look. However, communication between physician and patient is fundamental to understand the patient's wishes and to achieve the desired results. To date, most plastic surgeons rely on either "free hand" 2D drawings on picture printouts or computerized picture morphing. Alternatively, hardware dependent solutions allow facial shapes to be created and planned in 3D, but they are usually expensive or complex to handle. To offer a simple and hardware independent solution, we propose a web-based application that uses 3 standard 2D pictures to create a 3D representation of the patient's face on which facial aesthetic procedures such as filling, skin clearing or rejuvenation, and rhinoplasty are planned in 3D. The proposed application couples a set of well-established methods together in a novel manner to optimize 3D reconstructions for clinical use. Face reconstructions performed with the application were evaluated by two plastic surgeons and also compared to ground truth data. Results showed the application can provide accurate 3D face representations to be used in clinics (within an average of 2 mm error) in less than 5 min.

  8. Reoperative sentinel lymph node biopsy after previous mastectomy.

    Science.gov (United States)

    Karam, Amer; Stempel, Michelle; Cody, Hiram S; Port, Elisa R

    2008-10-01

    Sentinel lymph node (SLN) biopsy is the standard of care for axillary staging in breast cancer, but many clinical scenarios questioning the validity of SLN biopsy remain. Here we describe our experience with reoperative-SLN (re-SLN) biopsy after previous mastectomy. Review of the SLN database from September 1996 to December 2007 yielded 20 procedures done in the setting of previous mastectomy. SLN biopsy was performed using radioisotope with or without blue dye injection superior to the mastectomy incision, in the skin flap in all patients. In 17 of 20 patients (85%), re-SLN biopsy was performed for local or regional recurrence after mastectomy. Re-SLN biopsy was successful in 13 of 20 patients (65%) after previous mastectomy. Of the 13 patients, 2 had positive re-SLN, and completion axillary dissection was performed, with 1 having additional positive nodes. In the 11 patients with negative re-SLN, 2 patients underwent completion axillary dissection demonstrating additional negative nodes. One patient with a negative re-SLN experienced chest wall recurrence combined with axillary recurrence 11 months after re-SLN biopsy. All others remained free of local or axillary recurrence. Re-SLN biopsy was unsuccessful in 7 of 20 patients (35%). In three of seven patients, axillary dissection was performed, yielding positive nodes in two of the three. The remaining four of seven patients all had previous modified radical mastectomy, so underwent no additional axillary surgery. In this small series, re-SLN was successful after previous mastectomy, and this procedure may play some role when axillary staging is warranted after mastectomy.

  9. Radioiodine treatment of recurrent hyperthyroidism in patients previously treated for Graves' disease by subtotal thyroidectomy

    DEFF Research Database (Denmark)

    Vestergaard, H; Laurberg, P

    1992-01-01

    Radioiodine therapy is often employed for treatment of patients with relapse of hyperthyroidism due to Graves' disease after previous thyroid surgery. Little is known about the outcome of this treatment compared to patients with no previous surgery. A total of 20 patients who had received surgical treatment for Graves' hyperthyroidism 1-46 years previously and with relapse of the hyperthyroidism, and 25 patients with hyperthyroidism due to Graves' disease and no previous thyroid surgery, were treated with radioiodine following the same protocol. Early after treatment the previously operated patients showed a higher sensitivity to radioiodine, with more cases of early hypothyroidism, than non-operated patients. However, after 50 months of follow-up the outcome was identical. The results indicate that frequent assessment is necessary after radioiodine treatment of previously operated patients, since …

  10. Predictive factors for the development of diabetes in women with previous gestational diabetes mellitus

    DEFF Research Database (Denmark)

    Damm, P.; Kühl, C.; Bertelsen, Aksel

    1992-01-01

    OBJECTIVES: The purpose of this study was to determine the incidence of diabetes in women with previous dietary-treated gestational diabetes mellitus and to identify predictive factors for development of diabetes. STUDY DESIGN: Two to 11 years post partum, glucose tolerance was investigated in 241 women with previous dietary-treated gestational diabetes mellitus and 57 women without previous gestational diabetes mellitus (control group). RESULTS: Diabetes developed in 42 (17.4%) women with previous gestational diabetes mellitus (3.7% insulin-dependent diabetes mellitus and 13.7% non-insulin-dependent diabetes mellitus) … of previous patients with gestational diabetes mellitus in whom plasma insulin was measured during an oral glucose tolerance test in late pregnancy, a low insulin response at diagnosis was found to be an independent predictive factor for diabetes development. CONCLUSIONS: Women with previous dietary …

  11. Challenging previous conceptions of vegetarianism and eating disorders.

    Science.gov (United States)

    Fisak, B; Peterson, R D; Tantleff-Dunn, S; Molnar, J M

    2006-12-01

    The purpose of this study was to replicate and expand upon previous research that has examined the potential association between vegetarianism and disordered eating. Limitations of previous research studies are addressed, including possible low reliability of measures of eating pathology within vegetarian samples, use of only a few dietary restraint measures, and a paucity of research examining potential differences in body image and food choice motives of vegetarians versus nonvegetarians. Two hundred and fifty-six college students completed a number of measures of eating pathology and body image, and a food choice motives questionnaire. Interestingly, no significant differences were found between vegetarians and nonvegetarians in measures of eating pathology or body image. However, significant differences in food choice motives were found. Implications for both researchers and clinicians are discussed.

  12. Previously unreported abnormalities in Wolfram Syndrome Type 2.

    Science.gov (United States)

    Akturk, Halis Kaan; Yasa, Seda

    2017-01-01

    Wolfram syndrome (WFS) is a rare autosomal recessive disease with non-autoimmune childhood-onset insulin-dependent diabetes and optic atrophy. WFS type 2 (WFS2) differs from WFS type 1 (WFS1) by upper intestinal ulcers, a bleeding tendency and the lack of diabetes insipidus. Lifespan is short due to related comorbidities. Only a few families have been reported with this syndrome with the CISD2 mutation. Here we report two siblings with a clinical diagnosis of WFS2, previously misdiagnosed with type 1 diabetes mellitus and diabetic retinopathy-related blindness. We report possible additional clinical and laboratory findings that have not been previously reported, such as asymptomatic hypoparathyroidism, osteomalacia, growth hormone (GH) deficiency and hepatomegaly. Even though not a requirement for the diagnosis of WFS2 currently, our case series confirms hypogonadotropic hypogonadism to be also a feature of this syndrome, as reported before. © Polish Society for Pediatric Endocrinology and Diabetology.

  13. Influence of previous knowledge in Torrance tests of creative thinking

    OpenAIRE

    Aranguren, María; Consejo Nacional de Investigaciones Científicas y Técnicas CONICET

    2015-01-01

    The aim of this work is to analyze the influence of study field, expertise and recreational activities participation in Torrance Tests of Creative Thinking (TTCT, 1974) performance. Several hypotheses were postulated to explore the possible effects of previous knowledge in TTCT verbal and TTCT figural university students’ outcomes. Participants in this study included 418 students from five study fields: Psychology;Philosophy and Literature, Music; Engineering; and Journalism and Advertisin...

  14. Dose-reduced 16-slice multidetector-row spiral computed tomography in children with bronchoscopically suspected vascular tracheal stenosis - initial results

    International Nuclear Information System (INIS)

    Honnef, D.; Wildberger, J.E.; Das, M.; Hohl, C.; Mahnken, A.; Guenther, R.W.; Staatz, G.; Schnoering, H.; Vazquez-Jimenez, J.

    2006-01-01

    Purpose: To evaluate the diagnostic accuracy of contrast-enhanced dose-reduced 16-slice multidetector-row CT (MDCT) in newborns and infants with fiberoptic bronchoscopically suspected vascular-induced tracheal stenosis. Materials and Methods: 12 children (4 days to 3 years, 1.2-13.5 kg body weight) were examined using i.v. contrast-enhanced 16-slice MDCT (SOMATOM Sensation 16, Forchheim, Germany) without breath-hold and under sedation (11/12). All MDCTs were performed with a dose reduction. The beam collimation was 16 x 0.75 mm, except in the case of one child. MPRs along the tracheal axis in the x-, y- and z-directions and volume-rendering-reconstructions (VRTs) were calculated based on a secondary raw data set in addition to conventional axial slices. 2 radiologists used a three-point grade scale to evaluate the image quality, motion, and contrast media artifacts as well as the usefulness of the 2D- and 3D-reconstructions for determining the diagnosis. Statistical analysis was performed on the basis of a Kappa test. Results: In all cases the cause of the fiberoptic bronchoscopically suspected tracheal stenosis was revealed: compression due to the brachiocephalic trunk (n=7), double aortic arch (n=2), lusorian artery (n=1), vascular compression of the left main bronchus (n=2). In 3 patients further thoracic anomalies, such as tracheobronchial (n=2), and vascular (n=2) and vertebral (n=1) anomalies were found. The attenuation in the anomalous vessels was 307±140 HU. The image noise was 9.8±1.9 HU. The mean dose reduction was 82.7±3.2% compared to a standard adult thoracic CT. All examinations were rated as diagnostically good (median 1, range 1, k=1). 3D images did not show any stair artifacts (median 2, range 1-2, k=1). The image noise was minor to moderate and hardly any motion artifacts were seen (median 1, range 1-2, k=0.8). Contrast media artifacts were rated zero to minor (median 1.5, range 1-2, k=0.676). MPRs (median 1, range 1, k=1) and VRTs (median 1

  15. Computer group

    International Nuclear Information System (INIS)

    Bauer, H.; Black, I.; Heusler, A.; Hoeptner, G.; Krafft, F.; Lang, R.; Moellenkamp, R.; Mueller, W.; Mueller, W.F.; Schati, C.; Schmidt, A.; Schwind, D.; Weber, G.

    1983-01-01

    The computer group has been reorganized to take charge of the general-purpose computers DEC10 and VAX and the computer network (Dataswitch, DECnet, IBM connections to GSI and IPP, preparation for Datex-P). (orig.)

  16. Computer Engineers.

    Science.gov (United States)

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  17. Improvements on the seismic catalog previous to the 2011 El Hierro eruption.

    Science.gov (United States)

    Domínguez Cerdeña, Itahiza; del Fresno, Carmen

    2017-04-01

    Precursors to the submarine eruption of El Hierro (Canary Islands) in 2011 included 10,000 low-magnitude earthquakes and 5 cm of crustal deformation during the 81 days preceding the eruption onset on 10 October. Seismicity revealed a 20 km horizontal migration from the north to the south of the island, with depths ranging from 10 to 17 km and deeper events occurring further south. The earthquakes of the seismic catalog were manually picked by the IGN almost in real time, but there has been no subsequent revision yet to check for new, non-located events, and the completeness magnitude of the catalog varies strongly over the swarm due to the variable number of events per day. In this work we used different techniques to improve the quality of the seismic catalog. First, we applied different automatic algorithms to detect new events, including the STA/LTA method. Then, we used a semiautomatic system to correlate the new P and S detections with known phases from the original catalog. The newly detected earthquakes were also located using the Hypoellipse algorithm. The resulting new catalog includes 15,000 new events, mainly concentrated in the last weeks of the swarm, and we ensure a completeness magnitude of 1.2 during the whole series. As the seismicity from the original catalog had already been relocated using the hypoDD algorithm, we improved the locations of the new events using a master-cluster relocation. This method consists in relocating earthquakes towards a cluster of well-located events instead of towards a single event, as in the master-event method. In our case this cluster corresponds to the relocated earthquakes from the original catalog. Finally, we obtained a new equation for the local magnitude estimation which allows us to include corrections for each seismic station in order to avoid local effects. The resulting magnitude catalog fits better with the moment magnitude catalog obtained for the strong earthquakes of this series in previous studies
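    A minimal version of the STA/LTA detector mentioned above: the trigger fires when the short-term average of signal energy exceeds the long-term average by a threshold factor. The window lengths and threshold below are illustrative assumptions, not the study's settings.

```python
def sta_lta_trigger(signal, sta_len=5, lta_len=20, threshold=3.0):
    """Return sample indices where the short-term/long-term average ratio
    of the squared amplitude exceeds the trigger threshold.
    Window lengths and threshold are assumed values for illustration."""
    energy = [s * s for s in signal]
    triggers = []
    for i in range(lta_len, len(signal)):
        sta = sum(energy[i - sta_len:i]) / sta_len     # short-term average
        lta = sum(energy[i - lta_len:i]) / lta_len     # long-term average
        if lta > 0 and sta / lta >= threshold:
            triggers.append(i)
    return triggers

# Quiet background with one burst: the trigger fires just after the burst onset.
trace = [0.1] * 100 + [2.0] * 10 + [0.1] * 100
print(sta_lta_trigger(trace)[0])
```

Production detectors (e.g. recursive STA/LTA variants) add detrending, tapering and de-trigger logic, but the ratio test above is the core of the classic method.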

  18. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  19. Acceleration of planes segmentation using normals from previous frame

    Science.gov (United States)

    Gritsenko, Pavel; Gritsenko, Igor; Seidakhmet, Askar; Abduraimov, Azizbek

    2017-12-01

    One of the major problems in the integration of robots is making them able to function in a human environment. In terms of computer vision, a major feature of human-made rooms is the presence of planes [1, 2, 20, 21, 23]. In this article, we present an algorithm dedicated to increasing the speed of plane segmentation. The algorithm uses information about the location of a plane and its normal vector to speed up the segmentation process in the next frame. In conjunction with this, we address aspects of ICP SLAM such as performance and map representation.
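    The carry-over idea can be sketched as follows: rather than re-segmenting from scratch, points close to the previous frame's plane (unit normal n and offset d with n·p = d) are taken as seed inliers for the current frame. The function name and threshold are hypothetical illustrations, not the authors' code.

```python
import numpy as np

def inliers_from_previous_plane(points, n, d, threshold=0.02):
    """Return a boolean mask of points within `threshold` (same units as the
    point cloud) of the plane n . p = d carried over from the previous frame.
    `threshold` is an assumed value, not taken from the paper."""
    n = n / np.linalg.norm(n)            # ensure unit normal
    distances = np.abs(points @ n - d)   # point-to-plane distances
    return distances < threshold

# Synthetic floor plane at z = 0 plus one off-plane point.
pts = np.array([[0.0, 0.0, 0.0],
                [1.0, 2.0, 0.01],
                [0.5, 0.5, 1.0]])
mask = inliers_from_previous_plane(pts, n=np.array([0.0, 0.0, 1.0]), d=0.0)
print(mask.tolist())
```

Seeding with the previous frame's plane reduces the work to a single pass over the points; a full fit (e.g. RANSAC) is only needed when the carried-over plane no longer explains enough inliers.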

  20. Computer scientist looks at reliability computations

    International Nuclear Information System (INIS)

    Rosenthal, A.

    1975-01-01

    Results from the theory of computational complexity are applied to reliability computations on fault trees and networks. A well-known class of problems that almost certainly admit no fast solution algorithms is presented. It is shown that even approximately computing the reliability of many systems is hard enough to fall into this class. In the face of this result, which indicates that for general systems the computation time will be exponential in the size of the system, decomposition techniques which can greatly reduce the effective size of a wide variety of realistic systems are explored
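    The decomposition idea is easiest to see for series/parallel structures, where reliability reduces in time linear in the number of components instead of enumerating the exponentially many component states. A minimal sketch with invented component reliabilities, assuming independent failures:

```python
def series(*rs):
    """Reliability of independent components in series: all must work."""
    out = 1.0
    for r in rs:
        out *= r
    return out

def parallel(*rs):
    """Reliability of independent redundant components: at least one must work."""
    prob_all_fail = 1.0
    for r in rs:
        prob_all_fail *= (1.0 - r)
    return 1.0 - prob_all_fail

# Invented example: two redundant pumps (0.9 each) feeding one valve (0.95).
print(round(series(parallel(0.9, 0.9), 0.95), 4))
```

General fault trees are not series/parallel, which is exactly where the hardness result bites; the decomposition techniques the paper explores aim to peel off reducible substructures like these before attacking the irreducible core.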

  1. Computers for imagemaking

    CERN Document Server

    Clark, D

    1981-01-01

    Computers for Image-Making tells the computer non-expert all he needs to know about Computer Animation. In the hands of expert computer engineers, computer picture-drawing systems have, since the earliest days of computing, produced interesting and useful images. As a result of major technological developments since then, it no longer requires the expert's skill to draw pictures; anyone can do it, provided they know how to use the appropriate machinery. This collection of specially commissioned articles reflects the diversity of user applications in this expanding field

  2. Quantum computer science

    CERN Document Server

    Lanzagorta, Marco

    2009-01-01

    In this text we present a technical overview of the emerging field of quantum computation along with new research results by the authors. What distinguishes our presentation from that of others is our focus on the relationship between quantum computation and computer science. Specifically, our emphasis is on the computational model of quantum computing rather than on the engineering issues associated with its physical implementation. We adopt this approach for the same reason that a book on computer programming doesn't cover the theory and physical realization of semiconductors. Another distin

  3. Physics vs. computer science

    International Nuclear Information System (INIS)

    Pike, R.

    1982-01-01

    With computers becoming more frequently used in theoretical and experimental physics, physicists can no longer afford to be ignorant of the basic techniques and results of computer science. Computing principles belong in a physicist's tool box, along with experimental methods and applied mathematics, and the easiest way to educate physicists in computing is to provide, as part of the undergraduate curriculum, a computing course designed specifically for physicists. As well, the working physicist should interact with computer scientists, giving them challenging problems in return for their expertise. (orig.)

  4. Moyamoya disease in a child with previous acute necrotizing encephalopathy

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Taik-Kun; Cha, Sang Hoon; Chung, Kyoo Byung; Kim, Jung Hyuck; Kim, Baek Hyun; Chung, Hwan Hoon [Department of Diagnostic Radiology, Korea University College of Medicine, Ansan Hospital, 516 Kojan-Dong, Ansan City, Kyungki-Do 425-020 (Korea); Eun, Baik-Lin [Department of Pediatrics, Korea University College of Medicine, Seoul (Korea)

    2003-09-01

    A previously healthy 24-day-old boy presented with a 2-day history of fever and had a convulsion on the day of admission. MRI showed abnormal signal in the thalami, caudate nuclei and central white matter. Acute necrotising encephalopathy was diagnosed, other causes having been excluded after biochemical and haematological analysis of blood, urine and CSF. He recovered, but with spastic quadriparesis. At the age of 28 months, he suffered sudden deterioration of consciousness and motor weakness of his right limbs. MRI was consistent with an acute cerebrovascular accident. Angiography showed bilateral middle cerebral artery stenosis or frank occlusion with numerous lenticulostriate collateral vessels consistent with moyamoya disease. (orig.)

  5. Dissociation in decision bias mechanism between probabilistic information and previous decision

    Directory of Open Access Journals (Sweden)

    Yoshiyuki Kaneko

    2015-05-01

    Target detection performance is known to be influenced by events in the previous trials. It has not been clear, however, whether this bias effect is due to the previous sensory stimulus, motor response, or decision. Also it remains open whether or not the previous trial effect emerges via the same mechanism as the effect of knowledge about the target probability. In the present study, we asked normal human subjects to make a decision about the presence or absence of a visual target. We presented a pre-cue indicating the target probability before the stimulus, and also a decision-response mapping cue after the stimulus so as to tease apart the effect of decision from that of motor response. We found that the target detection performance was significantly affected by the probability cue in the current trial and also by the decision in the previous trial. While the information about the target probability modulated the decision criteria, the previous decision modulated the sensitivity to target-relevant sensory signals (d-prime). Using functional magnetic resonance imaging, we also found that activation in the left intraparietal sulcus was decreased when the probability cue indicated a high probability of the target. By contrast, activation in the right inferior frontal gyrus was increased when the subjects made a target-present decision in the previous trial, but this change was observed specifically when the target was present in the current trial. Activation in these regions was associated with individual differences in the decision computation parameters. We argue that the previous decision biases the target detection performance by modulating the processing of target-selective information, and this mechanism is distinct from modulation of decision criteria due to expectation of a target.

  6. Dissociation in decision bias mechanism between probabilistic information and previous decision

    Science.gov (United States)

    Kaneko, Yoshiyuki; Sakai, Katsuyuki

    2015-01-01

    Target detection performance is known to be influenced by events in the previous trials. It has not been clear, however, whether this bias effect is due to the previous sensory stimulus, motor response, or decision. It also remains open whether or not the previous trial effect emerges via the same mechanism as the effect of knowledge about the target probability. In the present study, we asked normal human subjects to make a decision about the presence or absence of a visual target. We presented a pre-cue indicating the target probability before the stimulus, and also a decision-response mapping cue after the stimulus so as to tease apart the effect of decision from that of motor response. We found that the target detection performance was significantly affected by the probability cue in the current trial and also by the decision in the previous trial. While the information about the target probability modulated the decision criteria, the previous decision modulated the sensitivity to target-relevant sensory signals (d-prime). Using functional magnetic resonance imaging (fMRI), we also found that activation in the left intraparietal sulcus (IPS) was decreased when the probability cue indicated a high probability of the target. By contrast, activation in the right inferior frontal gyrus (IFG) was increased when the subjects made a target-present decision in the previous trial, but this change was observed specifically when the target was present in the current trial. Activation in these regions was associated with individual differences in the decision computation parameters. We argue that the previous decision biases the target detection performance by modulating the processing of target-selective information, and this mechanism is distinct from modulation of decision criteria due to expectation of a target. PMID:25999844
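
The dissociation the authors report separates two standard signal detection theory quantities: sensitivity (d-prime) and the decision criterion (c), both derived from hit and false-alarm rates. As a minimal sketch of how these measures are computed (the rates below are hypothetical illustrations, not values from the study):

```python
from statistics import NormalDist

def sdt_measures(hit_rate: float, fa_rate: float) -> tuple[float, float]:
    """Compute sensitivity (d') and decision criterion (c) from
    hit and false-alarm rates, per standard signal detection theory."""
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, criterion

# Hypothetical rates: a probability cue shifts c (bias), while a change
# in d' reflects altered processing of target-relevant signals.
d, c = sdt_measures(0.85, 0.20)
print(f"d' = {d:.3f}, c = {c:.3f}")
```

In this framework, the probability cue would move c while leaving d-prime fixed, whereas the previous decision would change d-prime itself.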

  7. Quantum walk computation

    International Nuclear Information System (INIS)

    Kendon, Viv

    2014-01-01

    Quantum versions of random walks have diverse applications that are motivating experimental implementations as well as theoretical studies. Recent results showing that quantum walks are “universal for quantum computation” relate to algorithms designed to be run on quantum computers. We consider whether an experimental implementation of a quantum walk could provide useful computation before we have a universal quantum computer.

  8. Validation of One-Dimensional Module of MARS-KS1.2 Computer Code By Comparison with the RELAP5/MOD3.3/patch3 Developmental Assessment Results

    International Nuclear Information System (INIS)

    Bae, S. W.; Chung, B. D.

    2010-07-01

    This report records the results of the code validation for the one-dimensional module of the MARS-KS thermal hydraulics analysis code by comparing its results with those of the RELAP5/MOD3.3 computer code. For the validation calculations, simulations of the RELAP5 Code Developmental Assessment Problem, which consists of 22 simulation problems in 3 categories, have been selected. The results of the 3 categories of simulations demonstrate that the one-dimensional module of the MARS code and the RELAP5/MOD3.3 code are essentially the same code. This is expected, as the two codes have basically the same set of field equations, constitutive equations and main thermal hydraulic models. The result suggests that the high level of code validity of RELAP5/MOD3.3 can be directly applied to the MARS one-dimensional module.
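
The quantitative core of such a code-to-code comparison is a deviation metric between the two codes' predictions for the same assessment problem. A minimal sketch (the time histories and the figure of merit below are hypothetical, not taken from the report):

```python
import math

# Hypothetical time histories of one figure of merit (e.g. a fluid
# temperature, K) computed by MARS-KS and RELAP5 for the same problem.
mars   = [565.0, 571.2, 580.4, 588.9, 592.1, 590.3]
relap5 = [565.0, 571.0, 580.9, 588.5, 592.4, 590.1]

# Root-mean-square deviation between the two codes, and the same
# deviation relative to the mean RELAP5 value.
rms = math.sqrt(sum((m - r) ** 2 for m, r in zip(mars, relap5)) / len(mars))
rel = rms / (sum(relap5) / len(relap5))
print(f"RMS deviation: {rms:.3f} K ({rel:.5%})")
```

A near-zero RMS deviation across all assessment problems is what supports the conclusion that the two codes are essentially the same.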

  9. Spatial Computing and Spatial Practices

    DEFF Research Database (Denmark)

    Brodersen, Anders; Büsher, Monika; Christensen, Michael

    2007-01-01

    The gathering momentum behind the research agendas of pervasive, ubiquitous and ambient computing, set in motion by Mark Weiser (1991), offer dramatic opportunities for information systems design. They raise the possibility of "putting computation where it belongs" by exploding computing power out...... the "disappearing computer" we have, therefore, carried over from previous research an interdisciplinary perspective, and a focus on the sociality of action (Suchman 1987)....

  10. Recent computational chemistry

    International Nuclear Information System (INIS)

    Onishi, Taku

    2015-01-01

    Thanks to earlier developments in quantum theory and calculation methods, we can now investigate quantum phenomena in real materials and molecules, and design functional materials by computation. Because limits and open problems still exist in theory, cooperation between theory and computation is becoming more important for clarifying unknown quantum mechanisms and discovering more efficient functional materials; it is likely to become the next-generation standard. Finally, our theoretical methodology for boundary solids is introduced.

  11. Recent computational chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Onishi, Taku [Department of Chemistry for Materials, and The Center of Ultimate Technology on nano-Electronics, Mie University (Japan); Center for Theoretical and Computational Chemistry, Department of Chemistry, University of Oslo (Norway)

    2015-12-31

    Thanks to earlier developments in quantum theory and calculation methods, we can now investigate quantum phenomena in real materials and molecules, and design functional materials by computation. Because limits and open problems still exist in theory, cooperation between theory and computation is becoming more important for clarifying unknown quantum mechanisms and discovering more efficient functional materials; it is likely to become the next-generation standard. Finally, our theoretical methodology for boundary solids is introduced.

  12. Unconditionally verifiable blind quantum computation

    Science.gov (United States)

    Fitzsimons, Joseph F.; Kashefi, Elham

    2017-07-01

    Blind quantum computing (BQC) allows a client to have a server carry out a quantum computation for them such that the client's input, output, and computation remain private. A desirable property for any BQC protocol is verification, whereby the client can verify with high probability whether the server has followed the instructions of the protocol or if there has been some deviation resulting in a corrupted output state. A verifiable BQC protocol can be viewed as an interactive proof system leading to consequences for complexity theory. We previously proposed [A. Broadbent, J. Fitzsimons, and E. Kashefi, in Proceedings of the 50th Annual Symposium on Foundations of Computer Science, Atlanta, 2009 (IEEE, Piscataway, 2009), p. 517] a universal and unconditionally secure BQC scheme where the client only needs to be able to prepare single qubits in separable states randomly chosen from a finite set and send them to the server, who has the balance of the required quantum computational resources. In this paper we extend that protocol with additional functionality allowing blind computational basis measurements, which we use to construct another verifiable BQC protocol based on a different class of resource states. We rigorously prove that the probability of failing to detect an incorrect output is exponentially small in a security parameter, while resource overhead remains polynomial in this parameter. This resource state allows entangling gates to be performed between arbitrary pairs of logical qubits with only constant overhead. This is a significant improvement on the original scheme, which required that all computations first be put into a nearest-neighbor form, incurring linear overhead in the number of qubits. Such an improvement has important consequences for efficiency and fault-tolerance thresholds.

  13. Proteomics Analysis Reveals Previously Uncharacterized Virulence Factors in Vibrio proteolyticus

    Directory of Open Access Journals (Sweden)

    Ann Ray

    2016-07-01

    Members of the genus Vibrio include many pathogens of humans and marine animals that share genetic information via horizontal gene transfer. Hence, the Vibrio pan-genome carries the potential to establish new pathogenic strains by sharing virulence determinants, many of which have yet to be characterized. Here, we investigated the virulence properties of Vibrio proteolyticus, a Gram-negative marine bacterium previously identified as part of the Vibrio consortium isolated from diseased corals. We found that V. proteolyticus causes actin cytoskeleton rearrangements followed by cell lysis in HeLa cells in a contact-independent manner. In search of the responsible virulence factor involved, we determined the V. proteolyticus secretome. This proteomics approach revealed various putative virulence factors, including active type VI secretion systems and effectors with virulence toxin domains; however, these type VI secretion systems were not responsible for the observed cytotoxic effects. Further examination of the V. proteolyticus secretome led us to hypothesize and subsequently demonstrate that a secreted hemolysin, belonging to a previously uncharacterized clan of the leukocidin superfamily, was the toxin responsible for the V. proteolyticus-mediated cytotoxicity in both HeLa cells and macrophages. Clearly, there remains an armory of yet-to-be-discovered virulence factors in the Vibrio pan-genome that will undoubtedly provide a wealth of knowledge on how a pathogen can manipulate host cells.

  14. Relationship of deer and moose populations to previous winters' snow

    Science.gov (United States)

    Mech, L.D.; McRoberts, R.E.; Peterson, R.O.; Page, R.E.

    1987-01-01

    (1) Linear regression was used to relate snow accumulation during single and consecutive winters with white-tailed deer (Odocoileus virginianus) fawn:doe ratios, moose (Alces alces) twinning rates and calf:cow ratios, and annual changes in deer and moose populations. Significant relationships were found between snow accumulation during individual winters and these dependent variables during the following year. However, the strongest relationships were between the dependent variables and the sums of the snow accumulations over the previous three winters, which explained 36-51% of the variability. (2) Significant relationships were also found between winter vulnerability of moose calves and the sum of the snow accumulations in the current, and up to seven previous, winters, with about 49% of the variability explained. (3) No relationship was found between wolf numbers and the above dependent variables. (4) These relationships imply that winter influences on maternal nutrition can accumulate for several years and that this cumulative effect strongly determines fecundity and/or calf and fawn survivability. Although wolf (Canis lupus L.) predation is the main direct mortality agent on fawns and calves, wolf density itself appears to be secondary to winter weather in influencing the deer and moose populations.
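
The analysis in point (1) can be sketched as an ordinary least-squares regression of a population variable on snow accumulation summed over the previous three winters. A minimal sketch with NumPy (all numbers below are invented illustrations, not the study's data):

```python
import numpy as np

# Hypothetical series: snow accumulation (cm) per winter and the
# fawn:doe ratio observed the following year.
snow  = np.array([120, 95, 150, 180, 110, 90, 160, 140, 130, 100], float)
ratio = np.array([0.50, 0.55, 0.45, 0.46, 0.36, 0.33, 0.43, 0.49, 0.41, 0.35])

# Sum snow over the previous three winters (the strongest predictor
# reported); the first usable observation is year 3.
snow3 = np.array([snow[i - 3:i].sum() for i in range(3, len(snow))])
y = ratio[3:]

# Ordinary least squares: ratio = a + b * snow3
b, a = np.polyfit(snow3, y, 1)
pred = a + b * snow3
r2 = 1 - ((y - pred) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(f"slope = {b:.5f}, R^2 = {r2:.2f}")
```

A negative slope here would correspond to the study's interpretation: deeper cumulative snow, poorer maternal nutrition, lower fawn:doe ratio the following year.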

  15. [ANTITHROMBOTIC MEDICATION IN PREGNANT WOMEN WITH PREVIOUS INTRAUTERINE GROWTH RESTRICTION].

    Science.gov (United States)

    Neykova, K; Dimitrova, V; Dimitrov, R; Vakrilova, L

    2016-01-01

    To analyze pregnancy outcome in patients who were on antithrombotic medication (AM) because of a previous pregnancy with fetal intrauterine growth restriction (IUGR). The studied group (SG) included 21 pregnancies in 15 women with a history of previous IUGR. The patients were on low-dose aspirin (LDA) and/or low-molecular-weight heparin (LMWH). Pregnancy outcome was compared to that in two more groups: 1) a primary group (PG) including the previous 15 pregnancies with IUGR of the same women; 2) a control group (CG) including 45 pregnancies of women matched for parity with the ones in the SG, with no history of IUGR and without medication. The SG, PG and CG were compared for the following: mean gestational age (g.a.) at birth, mean birth weight (BW), proportion of cases with early preeclampsia (PE), IUGR (total, moderate, and severe), intrauterine fetal death (IUFD), neonatal death (NND), admission to NICU, and cesarean section (CS) because of chronic or acute fetal distress (FD) related to IUGR, PE or placental abruption. Student's t-test was applied to assess differences between the groups. P values < 0.05 were considered statistically significant. The differences between the SG and the PG regarding mean g.a. at delivery (33.7 and 29.8 w.g. respectively) and the proportion of babies admitted to NICU (66.7% vs. 71.4%) were not statistically significant. The mean BW in the SG (2114.7 g) was significantly higher than in the PG (1090.8 g). In the SG compared with the PG there were significantly fewer cases of IUFD (14.3% and 53.3% respectively), early PE (9.5% vs. 46.7%), and moderate and severe IUGR (10.5% and 36.8% vs. 41.7% and 58.3%). Neonatal mortality in the SG (5.6%) was significantly lower than in the PG (57.1%). The proportion of CS for FD was not significantly different: 53.3% in the SG and 57.1% in the PG. On the other hand, comparison between the SG and the CG demonstrated significantly lower g.a. at delivery in the SG (33.7 vs. 38 w.g.) and lower BW (2114 vs. 3094 g).

  16. LHCb Computing Resources: 2012 re-assessment, 2013 request and 2014 forecast

    CERN Document Server

    Graciani Diaz, Ricardo

    2012-01-01

    This note covers the following aspects: re-assessment of computing resource usage estimates for the 2012 data-taking period, request of computing resource needs for 2013, and a first forecast of the 2014 needs, when restart of data-taking is foreseen. Estimates are based on 2011 experience, as well as on the results of a simulation of the computing model described in the document. Differences in the model and deviations in the estimates from previously presented results are stressed.

  17. A methodology for modeling photocatalytic reactors for indoor pollution control using previously estimated kinetic parameters

    Energy Technology Data Exchange (ETDEWEB)

    Passalia, Claudio; Alfano, Orlando M. [INTEC - Instituto de Desarrollo Tecnologico para la Industria Quimica, CONICET - UNL, Gueemes 3450, 3000 Santa Fe (Argentina); FICH - Departamento de Medio Ambiente, Facultad de Ingenieria y Ciencias Hidricas, Universidad Nacional del Litoral, Ciudad Universitaria, 3000 Santa Fe (Argentina); Brandi, Rodolfo J., E-mail: rbrandi@santafe-conicet.gov.ar [INTEC - Instituto de Desarrollo Tecnologico para la Industria Quimica, CONICET - UNL, Gueemes 3450, 3000 Santa Fe (Argentina); FICH - Departamento de Medio Ambiente, Facultad de Ingenieria y Ciencias Hidricas, Universidad Nacional del Litoral, Ciudad Universitaria, 3000 Santa Fe (Argentina)

    2012-04-15

    Highlights: ► Indoor pollution control via photocatalytic reactors. ► Scaling-up methodology based on previously determined mechanistic kinetics. ► Radiation interchange model between catalytic walls using configuration factors. ► Modeling and experimental validation of a complex geometry photocatalytic reactor. - Abstract: A methodology for modeling photocatalytic reactors for their application in indoor air pollution control is carried out. The methodology implies, firstly, the determination of intrinsic reaction kinetics for the removal of formaldehyde. This is achieved by means of a simple geometry, continuous reactor operating under kinetic control regime and steady state. The kinetic parameters were estimated from experimental data by means of a nonlinear optimization algorithm. The second step was the application of the obtained kinetic parameters to a very different photoreactor configuration. In this case, the reactor is a corrugated wall type using nanosize TiO2 as catalyst irradiated by UV lamps that provided a spatially uniform radiation field. The radiative transfer within the reactor was modeled through a superficial emission model for the lamps, the ray tracing method and the computation of view factors. The velocity and concentration fields were evaluated by means of a commercial CFD tool (Fluent 12) where the radiation model was introduced externally. The results of the model were compared experimentally in a corrugated wall, bench scale reactor constructed in the laboratory. The overall pollutant conversion showed good agreement between model predictions and experiments, with a root mean square error less than 4%.
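
The first step of the methodology, estimating intrinsic kinetic parameters from kinetic-control data, can be sketched with a Langmuir-Hinshelwood rate law, a form commonly used for photocatalytic oxidation (the rate law, parameter values, and data below are assumptions for illustration, not the paper's actual model or measurements). The reciprocal form is linear in 1/C, so even a plain least-squares fit recovers the parameters:

```python
import numpy as np

# Hypothetical kinetic-control data: formaldehyde concentration C and
# observed removal rate r, generated here from known parameters.
C = np.array([2.0, 4.0, 8.0, 16.0, 32.0])
k_true, K_true = 0.9, 0.25              # parameters used to generate data
r = k_true * K_true * C / (1 + K_true * C)

# Langmuir-Hinshelwood: r = k*K*C / (1 + K*C).
# Reciprocal form is linear in 1/C:  1/r = (1/(k*K)) * (1/C) + 1/k
slope, intercept = np.polyfit(1 / C, 1 / r, 1)
k_fit = 1 / intercept
K_fit = 1 / (slope * k_fit)
print(f"k = {k_fit:.3f}, K = {K_fit:.3f}")
```

With noisy experimental data one would instead fit the nonlinear form directly, as the authors do with a nonlinear optimization algorithm; the linearized fit above only illustrates the structure of the estimation problem.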

  18. Analog computing

    CERN Document Server

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  19. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design...... and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer’s point of view......, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture....

  20. Quantum Computing

    OpenAIRE

    Scarani, Valerio

    1998-01-01

    The aim of this thesis was to explain what quantum computing is. The information for the thesis was gathered from books, scientific publications, and news articles. The analysis of the information revealed that quantum computing can be broken down to three areas: theories behind quantum computing explaining the structure of a quantum computer, known quantum algorithms, and the actual physical realizations of a quantum computer. The thesis reveals that moving from classical memor...