WorldWideScience

Sample records for significant results computer

  1. Mapping the Most Significant Computer Hacking Events to a Temporal Computer Attack Model

    OpenAIRE

    Heerden, Renier; Pieterse, Heloise; Irwin, Barry

    2012-01-01

    Part 4: Section 3: ICT for Peace and War; International audience; This paper presents eight of the most significant computer hacking events (also known as computer attacks). These events were selected because of their unique impact, methodology, or other properties. A temporal computer attack model is presented that can be used to model computer-based attacks. This model consists of the following stages: Target Identification, Reconnaissance, Attack, and Post-Attack Reconnaissance. The...

  2. Clinical significance of measurement of hepatic volume by computed tomography

    International Nuclear Information System (INIS)

    Sato, Hiroyuki; Matsuda, Yoshiro; Takada, Akira

    1984-01-01

    Hepatic volumes were measured by computed tomography (CT) in 91 patients with chronic liver diseases. Mean hepatic volume in alcoholic liver disease was significantly larger than that in non-alcoholic liver disease. Hepatic volumes in the majority of decompensated liver cirrhosis were significantly smaller than those of compensated liver cirrhosis. In liver cirrhosis, significant correlations between hepatic volume and various hepatic tests which reflect the total functioning hepatic cell masses were found. Combinations of hepatic volume with ICG maximum removal rate and with serum cholinesterase activity were most useful for the assessment of prognosis in liver cirrhosis. These results indicated that estimation of hepatic volume by CT is useful for analysis of pathophysiology and prognosis of chronic liver diseases, and for diagnosis of alcoholic liver diseases. (author)

  3. Automated, computer interpreted radioimmunoassay results

    International Nuclear Information System (INIS)

    Hill, J.C.; Nagle, C.E.; Dworkin, H.J.; Fink-Bennett, D.; Freitas, J.E.; Wetzel, R.; Sawyer, N.; Ferry, D.; Hershberger, D.

    1984-01-01

    90,000 radioimmunoassay results have been interpreted and transcribed automatically using software developed for a Hewlett Packard Model 1000 minicomputer system with conventional dot-matrix printers. The computer program correlates the results of a combination of assays, interprets them, and prints a report ready for physician review and signature within minutes of completion of the assay. The authors designed and wrote a computer program to query their patient database for radioassay laboratory results and to produce a computer-generated interpretation of these results using an algorithm that produces normal and abnormal interpretives. Their laboratory assays 50,000 patient samples each year using 28 different radioassays; of these, 85% have been interpreted using the computer program. Allowances are made for drug and patient history, and individualized reports are generated with regard to the patient's age and sex. Finalization of reports is still subject to change by the nuclear physician at the time of final review. Automated, computerized interpretations have realized cost savings through reduced personnel and personnel time, and have provided uniformity of the interpretations among the five physicians. Prior to computerization of interpretations, all radioassay results had to be dictated and reviewed for signing by one of the resident or staff physicians, and turnaround times for reports were generally two to three days, whereas the computerized interpretation system allows reports to be issued, in general, the day assays are completed

  4. Significant decimal digits for energy representation on short-word computers

    International Nuclear Information System (INIS)

    Sartori, E.

    1989-01-01

    The general belief that single-precision floating-point numbers always have at least seven significant decimal digits on short-word computers such as IBM machines is erroneous. Seven significant digits are, however, required for representing the energy variable in nuclear cross-section data sets containing sharp p-wave resonances at 0 K. It is suggested that, on short-word computers, the energy variable either be stored in double precision or that cross-section resonances be reconstructed to room temperature or higher
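
    The precision limit described above is easy to demonstrate. The following minimal sketch (ours, not from the report; the energies are illustrative and not taken from any evaluated data file) shows that two grid energies 0.01 eV apart near 1 MeV collapse onto the same IEEE-754 single-precision value, while double precision keeps them distinct.

    ```python
    # Minimal sketch of the single-precision problem for resonance energy grids.
    import numpy as np

    e1 = 1.0e6          # eV, a resonance energy grid point (illustrative)
    e2 = 1.0e6 + 0.01   # eV, a neighbouring point 0.01 eV away

    print(np.float32(e1) == np.float32(e2))   # True  -> the two energies become identical
    print(np.float64(e1) == np.float64(e2))   # False -> double precision keeps them distinct
    print(np.spacing(np.float32(1.0e6)))      # ~0.0625 eV, the float32 grid spacing at 1 MeV
    ```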

  5. Significance of a postenhancement computed tomography findings in liver cirrhosis: In view of hemodynamics

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Suck Hong; Kim, Byung Soo [Pusan National University College of Medicine, Pusan (Korea, Republic of)

    1985-04-15

    We observed a significant sign on postenhancement computed tomography of liver cirrhosis, namely visualization of the portal venous branches. During postenhancement computed tomography scanning of the liver, the portal vein cannot be identified in the liver parenchyma in 84% of patients without known cirrhosis (including chronic active hepatitis). The two conditions share the same hemodynamic changes, in that there is diffuse fibrosis and a resultant decrease in the vascular bed. Visualization of the intrahepatic portal branches on postenhancement computed tomography is attributable to decreased diffusion ability and portal hypertension.

  6. Re-Computation of Numerical Results Contained in NACA Report No. 496

    Science.gov (United States)

    Perry, Boyd, III

    2015-01-01

    An extensive examination of NACA Report No. 496 (NACA 496), "General Theory of Aerodynamic Instability and the Mechanism of Flutter," by Theodore Theodorsen, is described. The examination included checking equations and solution methods and re-computing interim quantities and all numerical examples in NACA 496. The checks revealed that NACA 496 contains computational shortcuts (time- and effort-saving devices for engineers of the time) and clever artifices (employed in its solution methods), but, unfortunately, also contains numerous tripping points (aspects of NACA 496 that have the potential to cause confusion) and some errors. The re-computations were performed employing the methods and procedures described in NACA 496, but using modern computational tools. With some exceptions, the magnitudes and trends of the original results were in fair-to-very-good agreement with the re-computed results. The exceptions included what are speculated to be computational errors in the original in some instances and transcription errors in the original in others. Independent flutter calculations were performed and, in all cases, including those where the original and re-computed results differed significantly, were in excellent agreement with the re-computed results. Appendix A contains NACA 496; Appendix B contains a Matlab (Registered) program that performs the re-computation of results; Appendix C presents three alternate solution methods, with examples, for the two-degree-of-freedom solution method of NACA 496; Appendix D contains the three-degree-of-freedom solution method (outlined in NACA 496 but never implemented), with examples.

  7. Computations for a condenser. Experimental results

    International Nuclear Information System (INIS)

    Walden, Jean.

    1975-01-01

    Computations for condensers are presented with experimental results. The computations are concerned with the steam flux at the condenser input, and inside the tube bundle. Experimental results are given for the flux inside the condenser sleeve and the flow passing through the tube bundle [fr

  8. BaBar computing - From collisions to physics results

    CERN Multimedia

    CERN. Geneva

    2004-01-01

    The BaBar experiment at SLAC studies B-physics at the Upsilon(4S) resonance using the high-luminosity e+e- collider PEP-II at the Stanford Linear Accelerator Center (SLAC). Taking, processing and analyzing the very large data samples is a significant computing challenge. This presentation will describe the entire BaBar computing chain and illustrate the solutions chosen as well as their evolution with the ever higher luminosity being delivered by PEP-II. This will include data acquisition and software triggering in a high availability, low-deadtime online environment, a prompt, automated calibration pass through the data at SLAC and then the full reconstruction of the data that takes place at INFN-Padova within 24 hours. Monte Carlo production takes place in a highly automated fashion in 25+ sites. The resulting real and simulated data is distributed and made available at SLAC and other computing centers. For analysis a much more sophisticated skimming pass has been introduced in the past year, ...

  9. Research on Computer-Based Education for Reading Teachers: A 1989 Update. Results of the First National Assessment of Computer Competence.

    Science.gov (United States)

    Balajthy, Ernest

    Results of the 1985-86 National Assessment of Educational Progress (NAEP) survey of American students' knowledge of computers suggest that American schools have a long way to go before computers can be said to have made a significant impact. The survey covered the 3rd, 7th, and 11th grade levels and assessed competence in knowledge of computers,…

  10. Computation of spatial significance of mountain objects extracted from multiscale digital elevation models

    International Nuclear Information System (INIS)

    Sathyamoorthy, Dinesh

    2014-01-01

    The derivation of spatial significance is an important aspect of geospatial analysis, and various methods have therefore been proposed to compute the spatial significance of entities based on spatial distances to other entities within the cluster. This paper studies the spatial significance of mountain objects extracted from multiscale digital elevation models (DEMs). At each scale, the value of the spatial significance index (SSI) of a mountain object is the minimum number of morphological dilation iterations required to occupy all the other mountain objects in the terrain. The mountain object with the lowest value of SSI is the spatially most significant mountain object, indicating that it has the shortest distance to the other mountain objects. It is observed that, as the area of the mountain objects decreases with increasing scale, the distances between the mountain objects increase, resulting in increasing values of SSI. The results obtained indicate that the strategic location of a mountain object at the centre of the terrain is more important than its size in determining its reach to other mountain objects and thus its spatial significance
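
    As an illustration of the index described above, the following sketch (our assumptions, not the paper's code; the function name, label array and toy terrain are invented for the example) counts SciPy binary-dilation iterations until the chosen object covers every other labelled object.

    ```python
    # Minimal sketch of a dilation-based spatial significance index (SSI).
    import numpy as np
    from scipy import ndimage

    def spatial_significance_index(labels, target, max_iter=1000):
        """labels: 2-D integer array of mountain objects (0 = background)."""
        grown = labels == target                 # start from the target object
        others = (labels > 0) & ~grown           # pixels of all other objects
        for ssi in range(1, max_iter + 1):
            grown = ndimage.binary_dilation(grown)
            if np.all(others <= grown):          # every other object is now occupied
                return ssi
        return None                              # not reached within max_iter

    # Toy terrain with three single-pixel "mountain" objects; object 2 sits centrally,
    # so it reaches the others first and gets the lowest SSI.
    labels = np.zeros((9, 9), dtype=int)
    labels[1, 1], labels[4, 4], labels[7, 7] = 1, 2, 3
    for k in (1, 2, 3):
        print(k, spatial_significance_index(labels, k))
    ```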

  11. Structural mode significance using INCA. [Interactive Controls Analysis computer program

    Science.gov (United States)

    Bauer, Frank H.; Downing, John P.; Thorpe, Christopher J.

    1990-01-01

    Structural finite element models are often too large to be used in the design and analysis of control systems. Model reduction techniques must be applied to reduce the structural model to manageable size. In the past, engineers either performed the model order reduction by hand or used distinct computer programs to retrieve the data, to perform the significance analysis and to reduce the order of the model. To expedite this process, the latest version of INCA has been expanded to include an interactive graphical structural mode significance and model order reduction capability.

  12. Computational Approach to Annotating Variants of Unknown Significance in Clinical Next Generation Sequencing.

    Science.gov (United States)

    Schulz, Wade L; Tormey, Christopher A; Torres, Richard

    2015-01-01

    Next generation sequencing (NGS) has become a common technology in the clinical laboratory, particularly for the analysis of malignant neoplasms. However, most mutations identified by NGS are variants of unknown clinical significance (VOUS). Although the approach to define these variants differs by institution, software algorithms that predict variant effect on protein function may be used. However, these algorithms commonly generate conflicting results, potentially adding uncertainty to interpretation. In this review, we examine several computational tools used to predict whether a variant has clinical significance. In addition to describing the role of these tools in clinical diagnostics, we assess their efficacy in analyzing known pathogenic and benign variants in hematologic malignancies. Copyright© by the American Society for Clinical Pathology (ASCP).

  13. Imprecise results: Utilizing partial computations in real-time systems

    Science.gov (United States)

    Lin, Kwei-Jay; Natarajan, Swaminathan; Liu, Jane W.-S.

    1987-01-01

    In real-time systems, a computation may not have time to complete its execution because of deadline requirements. In such cases, no result except the approximate results produced by the computations up to that point will be available. It is desirable to utilize these imprecise results if possible. Two approaches are proposed to enable computations to return imprecise results when executions cannot be completed normally. The milestone approach records results periodically, and if a deadline is reached, returns the last recorded result. The sieve approach demarcates sections of code which can be skipped if the time available is insufficient. By using these approaches, the system is able to produce imprecise results when deadlines are reached. The design of the Concord project is described which supports imprecise computations using these techniques. Also presented is a general model of imprecise computations using these techniques, as well as one which takes into account the influence of the environment, showing where the latter approach fits into this model.
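
    A minimal sketch of the milestone idea described above (ours, not the Concord implementation; the deadline value and the Leibniz-series stand-in computation are illustrative): intermediate results are recorded as the computation proceeds, and the most recent one is returned when the deadline expires instead of no result at all.

    ```python
    # Minimal sketch of the "milestone" approach to imprecise computation.
    import time

    def milestone_computation(deadline_s):
        start = time.monotonic()
        milestone = 0.0                     # last recorded (imprecise) result
        term, sign = 1, 1.0
        while True:
            milestone += 4.0 * sign / (2 * term - 1)   # Leibniz series for pi
            term, sign = term + 1, -sign
            if time.monotonic() - start >= deadline_s:
                return milestone, term      # return the imprecise result at the deadline

    pi_estimate, terms_used = milestone_computation(deadline_s=0.01)
    print(f"pi ~ {pi_estimate:.6f} after {terms_used} terms (deadline reached)")
    ```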

  14. Significance of computed tomography for diagnosis of heart diseases

    International Nuclear Information System (INIS)

    Senda, Kohei; Sakuma, Sadayuki

    1983-01-01

    Computed tomography (CT) with a 2-second scanner was carried out on 105 cases with various heart diseases in order to detect the CT findings in each disease. The significance of CT as an imaging study was evaluated in comparison with scintigraphic, echographic and roentgenographic studies. CT with contrast enhancement in moderate inspiration was able to demonstrate accurately organic changes of intra- and extracardiac structures. Compared with other imaging studies, CT was superior in the detection of calcified or intracardiac mass lesions, despite its low value in evaluating cardiac function or dynamics. (author)

  15. Membrane computing: brief introduction, recent results and applications.

    Science.gov (United States)

    Păun, Gheorghe; Pérez-Jiménez, Mario J

    2006-07-01

    The internal organization and functioning of living cells, as well as their cooperation in tissues and higher order structures, can be a rich source of inspiration for computer science, not fully exploited at the present date. Membrane computing is an answer to this challenge, well developed at the theoretical (mathematical and computability theory) level, already having several applications (via usual computers), but without having yet a bio-lab implementation. After briefly discussing some general issues related to natural computing, this paper provides an informal introduction to membrane computing, focused on the main ideas, the main classes of results and of applications. Then, three recent achievements, of three different types, are briefly presented, with emphasis on the usefulness of membrane computing as a framework for devising models of interest for biological and medical research.

  16. Significance of computed tomography in urology

    International Nuclear Information System (INIS)

    Harada, Takashi

    1981-01-01

    More than five years have passed since computed tomography (CT) was first introduced in this country for practical use. However, cumulative diagnostic experience in urology has not yet been discussed thoroughly. In the Department of Urology of Kansai Medical University, more than 120 CT examinations were performed over the past three years, and the instrument employed during this period advanced from a first-generation scanner (ACTA 150) to a third-generation one (CT-3W) this year. Seventy of these cases were pelvic lesions; retroperitoneal surveys were made in the rest. As a result, detection of space-occupying masses in the kidney, the adrenal gland and their surroundings was comparatively easy with this method, but there are several pitfalls that can lead to misdiagnosis of pelvic organs. It seems difficult to obtain reliable results for closely packed viscera with tightly adherent connective tissue in a confined space. However, these difficulties can be overcome by bladder insufflation with olive oil, for instance, and by scanning in the prone position. Contrast enhancement by injection of dye also gives more definite results in genitourinary tract assessment. Moreover, there is much benefit in the diagnosis of renal parenchymal changes, including lacerating renal trauma that cannot be differentiated by conventional methods. Bolus injection of contrast material also allows CT values to be calculated from a region of interest on the tomogram and enables the values to be fitted to a time-activity curve, as in scintillation scanning. In the coming years, new devices in this field, including emission CT, NMR-CT and others, will open new possibilities for ideal diagnostic facilities in urology. (author)

  17. Prognostic significance of tumor size of small lung adenocarcinomas evaluated with mediastinal window settings on computed tomography.

    Directory of Open Access Journals (Sweden)

    Yukinori Sakao

    Full Text Available BACKGROUND: We aimed to clarify that the size of the lung adenocarcinoma evaluated using mediastinal window on computed tomography is an important and useful modality for predicting invasiveness, lymph node metastasis and prognosis in small adenocarcinoma. METHODS: We evaluated 176 patients with small lung adenocarcinomas (diameter, 1-3 cm) who underwent standard surgical resection. Tumours were examined using computed tomography with thin section conditions (1.25 mm thick on high-resolution computed tomography), with tumour dimensions evaluated under two settings: lung window and mediastinal window. We also determined the patient age, gender, preoperative nodal status, tumour size, tumour disappearance ratio, preoperative serum carcinoembryonic antigen levels and pathological status (lymphatic vessel, vascular vessel or pleural invasion). Recurrence-free survival was used for prognosis. RESULTS: Lung window, mediastinal window, tumour disappearance ratio and preoperative nodal status were significant predictive factors for recurrence-free survival in univariate analyses. Areas under the receiver operator curves for recurrence were 0.76, 0.73 and 0.65 for mediastinal window, tumour disappearance ratio and lung window, respectively. Lung window, mediastinal window, tumour disappearance ratio, preoperative serum carcinoembryonic antigen levels and preoperative nodal status were significant predictive factors for lymph node metastasis in univariate analyses; areas under the receiver operator curves were 0.61, 0.76, 0.72 and 0.66, for lung window, mediastinal window, tumour disappearance ratio and preoperative serum carcinoembryonic antigen levels, respectively. Lung window, mediastinal window, tumour disappearance ratio, preoperative serum carcinoembryonic antigen levels and preoperative nodal status were significant factors for lymphatic vessel, vascular vessel or pleural invasion in univariate analyses; areas under the receiver operator curves were 0

  18. Prognostic Significance of Tumor Size of Small Lung Adenocarcinomas Evaluated with Mediastinal Window Settings on Computed Tomography

    Science.gov (United States)

    Sakao, Yukinori; Kuroda, Hiroaki; Mun, Mingyon; Uehara, Hirofumi; Motoi, Noriko; Ishikawa, Yuichi; Nakagawa, Ken; Okumura, Sakae

    2014-01-01

    Background We aimed to clarify that the size of the lung adenocarcinoma evaluated using mediastinal window on computed tomography is an important and useful modality for predicting invasiveness, lymph node metastasis and prognosis in small adenocarcinoma. Methods We evaluated 176 patients with small lung adenocarcinomas (diameter, 1–3 cm) who underwent standard surgical resection. Tumours were examined using computed tomography with thin section conditions (1.25 mm thick on high-resolution computed tomography) with tumour dimensions evaluated under two settings: lung window and mediastinal window. We also determined the patient age, gender, preoperative nodal status, tumour size, tumour disappearance ratio, preoperative serum carcinoembryonic antigen levels and pathological status (lymphatic vessel, vascular vessel or pleural invasion). Recurrence-free survival was used for prognosis. Results Lung window, mediastinal window, tumour disappearance ratio and preoperative nodal status were significant predictive factors for recurrence-free survival in univariate analyses. Areas under the receiver operator curves for recurrence were 0.76, 0.73 and 0.65 for mediastinal window, tumour disappearance ratio and lung window, respectively. Lung window, mediastinal window, tumour disappearance ratio, preoperative serum carcinoembryonic antigen levels and preoperative nodal status were significant predictive factors for lymph node metastasis in univariate analyses; areas under the receiver operator curves were 0.61, 0.76, 0.72 and 0.66, for lung window, mediastinal window, tumour disappearance ratio and preoperative serum carcinoembryonic antigen levels, respectively. Lung window, mediastinal window, tumour disappearance ratio, preoperative serum carcinoembryonic antigen levels and preoperative nodal status were significant factors for lymphatic vessel, vascular vessel or pleural invasion in univariate analyses; areas under the receiver operator curves were 0.60, 0.81, 0

  19. FPGAs in High Perfomance Computing: Results from Two LDRD Projects.

    Energy Technology Data Exchange (ETDEWEB)

    Underwood, Keith D; Ulmer, Craig D.; Thompson, David; Hemmert, Karl Scott

    2006-11-01

    Field programmable gate arrays (FPGAs) have been used as alternative computational devices for over a decade; however, they have not been used for traditional scientific computing due to their perceived lack of floating-point performance. In recent years, there has been a surge of interest in alternatives to traditional microprocessors for high performance computing. Sandia National Labs began two projects to determine whether FPGAs would be a suitable alternative to microprocessors for high performance scientific computing and, if so, how they should be integrated into the system. We present results that indicate that FPGAs could have a significant impact on future systems. FPGAs have the potential to have order of magnitude levels of performance wins on several key algorithms; however, there are serious questions as to whether the system integration challenge can be met. Furthermore, there remain challenges in FPGA programming and system level reliability when using FPGA devices. Acknowledgment: Arun Rodrigues provided valuable support and assistance in the use of the Structural Simulation Toolkit within an FPGA context. Curtis Janssen and Steve Plimpton provided valuable insights into the workings of two Sandia applications (MPQC and LAMMPS, respectively).

  20. Computer usage and national energy consumption: Results from a field-metering study

    Energy Technology Data Exchange (ETDEWEB)

    Desroches, Louis-Benoit [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Analysis & Environmental Impacts Dept., Environmental Energy Technologies Division; Fuchs, Heidi [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Analysis & Environmental Impacts Dept., Environmental Energy Technologies Division; Greenblatt, Jeffery [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Analysis & Environmental Impacts Dept., Environmental Energy Technologies Division; Pratt, Stacy [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Analysis & Environmental Impacts Dept., Environmental Energy Technologies Division; Willem, Henry [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Analysis & Environmental Impacts Dept., Environmental Energy Technologies Division; Claybaugh, Erin [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Analysis & Environmental Impacts Dept., Environmental Energy Technologies Division; Beraki, Bereket [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Analysis & Environmental Impacts Dept., Environmental Energy Technologies Division; Nagaraju, Mythri [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Analysis & Environmental Impacts Dept., Environmental Energy Technologies Division; Price, Sarah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Analysis & Environmental Impacts Dept., Environmental Energy Technologies Division; Young, Scott [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Analysis & Environmental Impacts Dept., Environmental Energy Technologies Division

    2014-12-01

    The electricity consumption of miscellaneous electronic loads (MELs) in the home has grown in recent years, and is expected to continue rising. Consumer electronics, in particular, are characterized by swift technological innovation, with varying impacts on energy use. Desktop and laptop computers make up a significant share of MELs electricity consumption, but their national energy use is difficult to estimate, given uncertainties around shifting user behavior. This report analyzes usage data from 64 computers (45 desktop, 11 laptop, and 8 unknown) collected in 2012 as part of a larger field monitoring effort of 880 households in the San Francisco Bay Area, and compares our results to recent values from the literature. We find that desktop computers are used for an average of 7.3 hours per day (median = 4.2 h/d), while laptops are used for a mean 4.8 hours per day (median = 2.1 h/d). The results for laptops are likely underestimated since they can be charged in other, unmetered outlets. Average unit annual energy consumption (AEC) for desktops is estimated to be 194 kWh/yr (median = 125 kWh/yr), and for laptops 75 kWh/yr (median = 31 kWh/yr). We estimate national annual energy consumption for desktop computers to be 20 TWh. National annual energy use for laptops is estimated to be 11 TWh, markedly higher than previous estimates, likely reflective of laptops drawing more power in On mode in addition to greater market penetration. This result for laptops, however, carries relatively higher uncertainty compared to desktops. Different study methodologies and definitions, changing usage patterns, and uncertainty about how consumers use computers must be considered when interpreting our results with respect to existing analyses. Finally, as energy consumption in On mode is predominant, we outline several energy savings opportunities: improved power management (defaulting to low-power modes after periods of inactivity as well as power scaling), matching the rated power
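
    The step from per-unit consumption to national totals is simple arithmetic. The sketch below uses the per-unit AEC estimates quoted above; the installed-base figures are hypothetical placeholders (not data from the study) chosen only so the totals land near the report's 20 TWh and 11 TWh.

    ```python
    # Minimal sketch: national consumption = per-unit AEC x installed stock.
    aec_kwh = {"desktop": 194, "laptop": 75}          # kWh/yr per unit (from the abstract)
    stock_millions = {"desktop": 103, "laptop": 147}  # assumed installed base (illustrative only)

    for kind in aec_kwh:
        national_twh = aec_kwh[kind] * stock_millions[kind] * 1e6 / 1e9
        print(f"{kind}: {national_twh:.1f} TWh/yr")
    ```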

  1. Rackspace: Significance of Cloud Computing to CERN

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    The research collaboration between Rackspace and CERN is contributing to how OpenStack cloud computing will move science work around the world for CERN, and to reducing the barriers between clouds for Rackspace.

  2. Significance of frontal cortical atrophy in Parkinson's disease: computed tomographic study

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kyung Sang; Suh, Jung Ho; Chung, Tae Sub; Kim, Dong Ik [College of Medicine, Yonsei University, Seoul (Korea, Republic of)

    1987-10-15

    Fifty-five patients with Parkinson's disease were evaluated clinically and with brain computed tomography (CT) in order to determine the incidence of frontal cortical and subcortical atrophy. Twenty age-matched healthy controls were also scanned. The CT criteria of frontal cortical atrophy used in this study were the maximum width of the frontal hemispheric cortical sulci and the width of the anterior interhemispheric fissure between the frontal lobes, compared with the maximum width of the hemispheric cortical sulci excluding the frontal lobes. The criteria of frontal subcortical atrophy were the bifrontal index, the bicaudate index and the Evans index. The results are as follows: 1. Cortical atrophic changes in Parkinson's disease were more prominent in the frontal lobe than with other causes of cortical atrophy. 2. Frontal cortical and subcortical atrophic changes were also more prominent in Parkinson's disease than in the age-matched control group. 3. Subcortical atrophic changes in the frontal lobe were always associated with cortical atrophic changes. 4. Changes of the basal ganglia were hardly seen in Parkinson's disease. 5. Cortical atrophic changes in the frontal lobe must be one of the significant findings in Parkinson's disease.

  3. Significance of frontal cortical atrophy in Parkinson's disease: computed tomographic study

    International Nuclear Information System (INIS)

    Lee, Kyung Sang; Suh, Jung Ho; Chung, Tae Sub; Kim, Dong Ik

    1987-01-01

    Fifty-five patients with Parkinson's disease were evaluated clinically and with brain computed tomography (CT) in order to determine the incidence of frontal cortical and subcortical atrophy. Twenty age-matched healthy controls were also scanned. The CT criteria of frontal cortical atrophy used in this study were the maximum width of the frontal hemispheric cortical sulci and the width of the anterior interhemispheric fissure between the frontal lobes, compared with the maximum width of the hemispheric cortical sulci excluding the frontal lobes. The criteria of frontal subcortical atrophy were the bifrontal index, the bicaudate index and the Evans index. The results are as follows: 1. Cortical atrophic changes in Parkinson's disease were more prominent in the frontal lobe than with other causes of cortical atrophy. 2. Frontal cortical and subcortical atrophic changes were also more prominent in Parkinson's disease than in the age-matched control group. 3. Subcortical atrophic changes in the frontal lobe were always associated with cortical atrophic changes. 4. Changes of the basal ganglia were hardly seen in Parkinson's disease. 5. Cortical atrophic changes in the frontal lobe must be one of the significant findings in Parkinson's disease

  4. Noninvasive Coronary Angiography using 64-Detector-Row Computed Tomography in Patients with a Low to Moderate Pretest Probability of Significant Coronary Artery Disease

    International Nuclear Information System (INIS)

    Schlosser, T.; Mohrs, O.K.; Magedanz, A.; Nowak, B.; Voigtlaender, T.; Barkhausen, J.; Schmermund, A.

    2007-01-01

    Purpose: To evaluate the value of 64-detector-row computed tomography for ruling out high-grade coronary stenoses in patients with a low to moderate pretest probability of significant coronary artery disease. Material and Methods: The study included 61 patients with a suspicion of coronary artery disease on the basis of atypical angina or ambiguous findings in noninvasive stress testing and a class II indication for invasive coronary angiography (ICA). All patients were examined by 64-detector-row computed tomography angiography (CTA) and ICA. On a coronary segmental level, the presence of significant (>50% diameter) stenoses was examined. Results: In a total of 915 segments, CTA detected 62 significant stenoses. Thirty-four significant stenoses were confirmed by ICA, whereas 28 stenoses could not be confirmed by ICA. Twenty-two of them showed wall irregularities on ICA, and six were angiographically normal. Accordingly, on a coronary segmental basis, 28 false-positive and 0 false-negative findings resulted in a sensitivity of 100%, a specificity of 96.8%, a positive predictive value of 54.8%, and a negative predictive value of 100%. The diagnostic accuracy was 96.9%. Conclusion: Sixty-four-detector-row computed tomography reliably detects significant coronary stenoses in patients with suspected coronary artery disease and appears to be helpful in the selection of patients who need to undergo ICA. Calcified and non-calcified plaques are detected. Grading of stenoses in areas with calcification is difficult. Frequently, stenosis severity is overestimated by 64-detector-row computed tomography
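
    The per-segment figures quoted above can be reproduced from the reported counts (34 true positives, 28 false positives, 0 false negatives out of 915 segments); only the true-negative count is derived by subtraction. A minimal worked check:

    ```python
    # Worked check of the per-segment diagnostic figures quoted in the abstract.
    tp, fp, fn, total = 34, 28, 0, 915
    tn = total - tp - fp - fn                 # 853 segments negative on both CTA and ICA

    sensitivity = tp / (tp + fn)              # 1.000  -> 100%
    specificity = tn / (tn + fp)              # 0.968  -> 96.8%
    ppv = tp / (tp + fp)                      # 0.548  -> 54.8%
    npv = tn / (tn + fn)                      # 1.000  -> 100%
    accuracy = (tp + tn) / total              # 0.969  -> 96.9%
    print(sensitivity, specificity, ppv, npv, accuracy)
    ```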

  5. Noninvasive Coronary Angiography using 64-Detector-Row Computed Tomography in Patients with a Low to Moderate Pretest Probability of Significant Coronary Artery Disease

    Energy Technology Data Exchange (ETDEWEB)

    Schlosser, T.; Mohrs, O.K.; Magedanz, A.; Nowak, B.; Voigtlaender, T.; Barkhausen, J.; Schmermund, A. [Dept. of Diagnostic and Interventional Radiology and Neuroradiology, University Hospital Essen, Essen (Germany)

    2007-04-15

    Purpose: To evaluate the value of 64-detector-row computed tomography for ruling out high-grade coronary stenoses in patients with a low to moderate pretest probability of significant coronary artery disease. Material and Methods: The study included 61 patients with a suspicion of coronary artery disease on the basis of atypical angina or ambiguous findings in noninvasive stress testing and a class II indication for invasive coronary angiography (ICA). All patients were examined by 64-detector-row computed tomography angiography (CTA) and ICA. On a coronary segmental level, the presence of significant (>50% diameter) stenoses was examined. Results: In a total of 915 segments, CTA detected 62 significant stenoses. Thirty-four significant stenoses were confirmed by ICA, whereas 28 stenoses could not be confirmed by ICA. Twenty-two of them showed wall irregularities on ICA, and six were angiographically normal. Accordingly, on a coronary segmental basis, 28 false-positive and 0 false-negative findings resulted in a sensitivity of 100%, a specificity of 96.8%, a positive predictive value of 54.8%, and a negative predictive value of 100%. The diagnostic accuracy was 96.9%. Conclusion: Sixty-four-detector-row computed tomography reliably detects significant coronary stenoses in patients with suspected coronary artery disease and appears to be helpful in the selection of patients who need to undergo ICA. Calcified and non-calcified plaques are detected. Grading of stenoses in areas with calcification is difficult. Frequently, stenosis severity is overestimated by 64-detector-row computed tomography.

  6. Head multidetector computed tomography: emergency medicine physicians overestimate the pretest probability and legal risk of significant findings.

    Science.gov (United States)

    Baskerville, Jerry Ray; Herrick, John

    2012-02-01

    This study focuses on clinically assigned prospective estimated pretest probability and pretest perception of legal risk as independent variables in the ordering of multidetector computed tomographic (MDCT) head scans. Our primary aim is to measure the association between pretest probability of a significant finding and pretest perception of legal risk. Secondarily, we measure the percentage of MDCT scans that physicians would not order if there was no legal risk. This study is a prospective, cross-sectional, descriptive analysis of patients 18 years and older for whom emergency medicine physicians ordered a head MDCT. We collected a sample of 138 patients subjected to head MDCT scans. The prevalence of a significant finding in our population was 6%, yet the pretest probability expectation of a significant finding was 33%. The legal risk presumed was even more dramatic at 54%. These data support the hypothesis that physicians presume the legal risk to be significantly higher than the risk of a significant finding. A total of 21% or 15% patients (95% confidence interval, ±5.9%) would not have been subjected to MDCT if there was no legal risk. Physicians overestimated the probability that the computed tomographic scan would yield a significant result and indicated an even greater perceived medicolegal risk if the scan was not obtained. Physician test-ordering behavior is complex, and our study queries pertinent aspects of MDCT testing. The magnification of legal risk vs the pretest probability of a significant finding is demonstrated. Physicians significantly overestimated pretest probability of a significant finding on head MDCT scans and presumed legal risk. Copyright © 2012 Elsevier Inc. All rights reserved.

  7. [Usage patterns of internet and computer games : Results of an observational study of Tyrolean adolescents].

    Science.gov (United States)

    Riedl, David; Stöckl, Andrea; Nussbaumer, Charlotte; Rumpold, Gerhard; Sevecke, Kathrin; Fuchs, Martin

    2016-12-01

    The use of digital media such as the Internet and computer games has greatly increased. In the Western world, almost all young people regularly use these technologies. Against this background, forms of use with possible negative consequences for young people have been recognized and scientifically examined. The aim of our study was therefore to investigate the prevalence of pathological use of these technologies in a sample of young Tyrolean people. 398 students (average age 15.2 years, SD ± 2.3 years, 34.2% female) were interviewed by means of the structured questionnaires CIUS (internet use), CSV-S (computer gaming) and SWE (self-efficacy). Additionally, sociodemographic data were collected. In line with previous studies, 7.7% of the adolescents in our sample met criteria for problematic internet use and 3.3% for pathological internet use; 5.4% of the sample reported pathological computer game usage. The most important factor influencing our results was the gender of the subjects: intensive users of the Internet and computer games were more often young men, whereas young women showed significantly fewer signs of pathological computer game use. A significant percentage of Tyrolean adolescents showed difficulties in the development of competent media use, indicating the growing significance of prevention measures such as media education. In a follow-up project, a sample of adolescents with mental disorders will be examined concerning their media use and compared with our school sample.

  8. Calculation of limits for significant unidirectional changes in two or more serial results of a biomarker based on a computer simulation model

    DEFF Research Database (Denmark)

    Lund, Flemming; Petersen, Per Hyltoft; Fraser, Callum G

    2015-01-01

    BACKGROUND: Reference change values (RCVs) were introduced more than 30 years ago and provide objective tools for assessment of the significance of differences in two consecutive results from an individual. However, in practice, more results are usually available for monitoring, and using the RCV ... METHODS: From simulated data for healthy individuals, a series of up to 20 results from an individual was generated using different values for the within-subject biological variation plus the analytical variation. Each new result in this series was compared to the initial measurement result. ... RESULTS: Limits for significant unidirectional changes in two or more serial results are obtained using the presented factors: the first result is multiplied by the appropriate factor for increase or decrease, which gives the limits for a significant difference.
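
    For context, the sketch below computes the conventional two-result reference change value, RCV = sqrt(2)·Z·sqrt(CVA² + CVI²), which the simulated multi-result factors described above generalize to longer series; the CV values are illustrative, and the paper's own factors are not reproduced here.

    ```python
    # Minimal sketch of the conventional two-result reference change value (RCV).
    import math

    def reference_change_value(cv_analytical, cv_within_subject, z=1.96):
        """All CVs in percent; z=1.96 for a two-sided 95% limit (1.645 one-sided)."""
        return math.sqrt(2) * z * math.hypot(cv_analytical, cv_within_subject)

    rcv = reference_change_value(cv_analytical=3.0, cv_within_subject=6.0)  # illustrative CVs
    first_result = 100.0
    print(f"RCV = {rcv:.1f}% -> significant if the next result falls outside "
          f"{first_result * (1 - rcv / 100):.1f} to {first_result * (1 + rcv / 100):.1f}")
    ```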

  9. Introduction of e-learning in dental radiology reveals significantly improved results in final examination.

    Science.gov (United States)

    Meckfessel, Sandra; Stühmer, Constantin; Bormann, Kai-Hendrik; Kupka, Thomas; Behrends, Marianne; Matthies, Herbert; Vaske, Bernhard; Stiesch, Meike; Gellrich, Nils-Claudius; Rücker, Martin

    2011-01-01

    Because a traditionally instructed dental radiology lecture course is very time-consuming and labour-intensive, online courseware, including an interactive-learning module, was implemented to support the lectures. The purpose of this study was to evaluate the perceptions of students who have worked with web-based courseware as well as the effect on their results in final examinations. Users (n(3+4)=138) had access to the e-program from any networked computer at any time. Two groups (n(3)=71, n(4)=67) had to pass a final exam after using the e-course. Results were compared with two groups (n(1)=42, n(2)=48) who had studied the same content by attending traditional lectures. In addition a survey of the students was statistically evaluated. Most of the respondents reported a positive attitude towards e-learning and would have appreciated more access to computer-assisted instruction. Two years after initiating the e-course the failure rate in the final examination dropped significantly, from 40% to less than 2%. The very positive response to the e-program and improved test scores demonstrated the effectiveness of our e-course as a learning aid. Interactive modules in step with clinical practice provided learning that is not achieved by traditional teaching methods alone. To what extent staff savings are possible is part of a further study. Copyright © 2010 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  10. Significance of Computed Tomography in the Diagnosis of Cerebrovascular Accidents

    Directory of Open Access Journals (Sweden)

    Sumnima Acharya

    2014-06-01

    Full Text Available Introduction: Cerebrovascular accident (CVA) is defined as abrupt onset of a neurological deficit that is attributable to a focal vascular cause. CT scan is a widely available, affordable, non-invasive and relatively accurate investigation in patients with stroke and is important to identify stroke pathology and exclude mimics. The aim of this study is to establish the diagnostic significance of computed tomography in cerebrovascular accidents and to differentiate between cerebral infarction and cerebral haemorrhage with CT for better management of CVA. Methods: A one-year observational cross-sectional study was conducted in 100 patients who presented at the department of radiodiagnosis from the emergency department or ward within the one-year study period with a clinical diagnosis of stroke and had a brain CT scan done within one to fourteen days of onset. Results: A total of 100 patients were studied; 66 were male and 34 were female, with a male/female ratio of 1.9:1. The maximum number of cases (39%) was in the age group of 61-80 years. Among the 100 patients, 55 cases were clinically diagnosed as hemorrhagic stroke and 45 cases were clinically diagnosed with an infarct. Of the 55 hemorrhagic cases, two were diagnosed as both hemorrhage and infarct by CT scan, one had normal CT scan findings, and one had subdural haemorrhage; these four cases were excluded when comparing the clinical diagnosis with the CT scan finding. Among the 51 clinically diagnosed cases of hemorrhagic stroke, 32 (62.7%) were proved by CT scan to be hemorrhagic stroke, and among the clinically diagnosed cases of infarct, 39 (86.7%) were proved by CT scan to be infarct, which is statistically significant (p < 0.001). A significant agreement between clinical and CT diagnosis was observed, as indicated by a kappa value of 0.49. Sensitivity, specificity, positive predictive value and negative predictive value of clinical findings as compared to CT in diagnosing hemorrhage were 84.2%, 67.2%, 62.8% and 86
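
    The reported kappa of 0.49 can be checked from the counts given above, assuming each discordant case received the other CT diagnosis once the four excluded cases were removed (n = 96); the arithmetic below is ours, not the paper's.

    ```python
    # Worked check of Cohen's kappa from the clinical-vs-CT counts in the abstract.
    import numpy as np

    #                 CT: haemorrhage  CT: infarct
    table = np.array([[32,             19],   # clinical: haemorrhage (n = 51)
                      [ 6,             39]])  # clinical: infarct     (n = 45)

    n = table.sum()
    po = np.trace(table) / n                              # observed agreement, ~0.74
    pe = (table.sum(1) * table.sum(0)).sum() / n**2       # chance agreement,   ~0.49
    kappa = (po - pe) / (1 - pe)
    print(f"kappa = {kappa:.2f}")                         # ~0.49, matching the abstract
    ```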

  11. Are studies reporting significant results more likely to be published?

    Science.gov (United States)

    Koletsi, Despina; Karagianni, Anthi; Pandis, Nikolaos; Makou, Margarita; Polychronopoulou, Argy; Eliades, Theodore

    2009-11-01

    Our objective was to assess the hypothesis that there are variations of the proportion of articles reporting a significant effect, with a higher percentage of those articles published in journals with impact factors. The contents of 5 orthodontic journals (American Journal of Orthodontics and Dentofacial Orthopedics, Angle Orthodontist, European Journal of Orthodontics, Journal of Orthodontics, and Orthodontics and Craniofacial Research), published between 2004 and 2008, were hand-searched. Articles with statistical analysis of data were included in the study and classified into 4 categories: behavior and psychology, biomaterials and biomechanics, diagnostic procedures and treatment, and craniofacial growth, morphology, and genetics. In total, 2622 articles were examined, with 1785 included in the analysis. Univariate and multivariate logistic regression analyses were applied with statistical significance as the dependent variable, and whether the journal had an impact factor, the subject, and the year were the independent predictors. A higher percentage of articles showed significant results relative to those without significant associations (on average, 88% vs 12%) for those journals. Overall, these journals published significantly more studies with significant results, ranging from 75% to 90% (P = 0.02). Multivariate modeling showed that journals with impact factors had a 100% increased probability of publishing a statistically significant result compared with journals with no impact factor (odds ratio [OR], 1.99; 95% CI, 1.19-3.31). Compared with articles on biomaterials and biomechanics, all other subject categories showed lower probabilities of significant results. Nonsignificant findings in behavior and psychology and diagnosis and treatment were 1.8 (OR, 1.75; 95% CI, 1.51-2.67) and 3.5 (OR, 3.50; 95% CI, 2.27-5.37) times more likely to be published, respectively. Journals seem to prefer reporting significant results; this might be because of authors

  12. Initial water quantification results using neutron computed tomography

    Science.gov (United States)

    Heller, A. K.; Shi, L.; Brenizer, J. S.; Mench, M. M.

    2009-06-01

    Neutron computed tomography is an important imaging tool in the field of non-destructive testing and in fundamental research for many engineering applications. Contrary to X-rays, neutrons can be attenuated by some light materials, such as hydrogen, but can penetrate many heavy materials. Thus, neutron computed tomography is useful in obtaining important three-dimensional information about a sample's interior structure and material properties that other traditional methods cannot provide. The neutron computed tomography system at the Pennsylvania State University's Radiation Science and Engineering Center is being utilized to develop a water quantification technique for investigation of water distribution in fuel cells under normal conditions. A hollow aluminum cylinder test sample filled with a known volume of water was constructed for purposes of testing the quantification technique. Transmission images of the test sample at different angles were easily acquired through the synthesis of a dedicated image acquisition computer driving a rotary table controller and an in-house developed synchronization software package. After data acquisition, Octopus (version 8.2) and VGStudio Max (version 1.2) were used to perform cross-sectional and three-dimensional reconstructions of the sample, respectively. The initial reconstructions and water quantification results are presented.

  13. Significant prognosticators after primary radiotherapy in 903 nondisseminated nasopharyngeal carcinoma evaluated by computer tomography

    International Nuclear Information System (INIS)

    Teo, P.; Yu, P.; Lee, W.Y.; Leung, S.F.; Kwan, W.H.; Yu, K.H.; Choi, P.; Johnson, P.J.

    1996-01-01

    Purpose: To evaluate the significant prognosticators in nasopharyngeal carcinoma (NPC). Methods and Materials: From 1984 to 1989, 903 treatment-naive nondisseminated (M0) NPC were given primary radical radiotherapy to 60-62.5 Gy in 6 weeks. All patients had computed tomographic (CT) and endoscopic evaluation of the primary tumor. Potentially significant parameters (the patient's age and sex, the anatomical structures infiltrated by the primary lesion, the cervical nodal characteristics, the tumor histological subtypes, and various treatment variables) were analyzed by both monovariate and multivariate methods for each of the five clinical endpoints: actuarial survival, disease-free survival, free from distant metastasis, free from local failure, and free from regional failure. Results: The significant prognosticators predicting for an increased risk of distant metastases and poorer survival included male sex, skull base and cranial nerve(s) involvement, advanced Ho's N level, and presence of fixed or partially fixed nodes or nodes contralateral to the side of the bulk of the nasopharyngeal primary. Advanced patient age led to significantly worse survival and poorer local tumor control. Local and regional failures were both increased by tumor infiltrating the skull base and/or the cranial nerves. In addition, regional failure was increased significantly by advancing Ho's N level. Parapharyngeal tumor involvement was the strongest independent prognosticator that determined distant metastasis and survival rates in the absence of the overriding prognosticators of skull base infiltration, cranial nerve(s) palsy, and cervical nodal metastasis. Conclusions: The significant prognosticators are delineated after the advent of CT and these should form the foundation of the modern stage classification for NPC

  14. Computer and compiler effects on code results: status report

    International Nuclear Information System (INIS)

    1996-01-01

    Within the framework of the international effort on the assessment of computer codes, which are designed to describe the overall reactor coolant system (RCS) thermalhydraulic response, core damage progression, and fission product release and transport during severe accidents, there has been a continuous debate as to whether the code results are influenced by different code users or by different computers or compilers. The first aspect, the 'Code User Effect', has been investigated already. In this paper the other aspects will be discussed and proposals are given how to make large system codes insensitive to different computers and compilers. Hardware errors and memory problems are not considered in this report. The codes investigated herein are integrated code systems (e.g. ESTER, MELCOR) and thermalhydraulic system codes with extensions for severe accident simulation (e.g. SCDAP/RELAP, ICARE/CATHARE, ATHLET-CD), and codes to simulate fission product transport (e.g. TRAPMELT, SOPHAEROS). Since all of these codes are programmed in Fortran 77, the discussion herein is based on this programming language although some remarks are made about Fortran 90. Some observations about different code results by using different computers are reported and possible reasons for this unexpected behaviour are listed. Then methods are discussed how to avoid portability problems

  15. Computational fluid dynamics simulations and validations of results

    CSIR Research Space (South Africa)

    Sitek, MA

    2013-09-01

    Full Text Available Wind flow influence on a high-rise building is analyzed. The research covers full-scale tests, wind-tunnel experiments and numerical simulations. In the present paper the computational model used in the simulations is described, and the results, which were...

  16. Verification of SACI-2 computer code comparing with experimental results of BIBLIS-A and LOOP-7 computer code

    International Nuclear Information System (INIS)

    Soares, P.A.; Sirimarco, L.F.

    1984-01-01

    SACI-2 is a computer code created to study the dynamic behaviour of a PWR nuclear power plant. To evaluate the quality of its results, SACI-2 was used to recalculate commissioning tests performed at the BIBLIS-A nuclear power plant and to calculate postulated transients for the Angra-2 reactor. The SACI-2 results for BIBLIS-A showed good agreement with the measurements, as did its results for Angra-2 in comparison with the KWU LOOP-7 computer code. (E.G.) [pt

  17. Initial water quantification results using neutron computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Heller, A.K. [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States)], E-mail: axh174@psu.edu; Shi, L.; Brenizer, J.S.; Mench, M.M. [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States)

    2009-06-21

    Neutron computed tomography is an important imaging tool in the field of non-destructive testing and in fundamental research for many engineering applications. Contrary to X-rays, neutrons can be attenuated by some light materials, such as hydrogen, but can penetrate many heavy materials. Thus, neutron computed tomography is useful in obtaining important three-dimensional information about a sample's interior structure and material properties that other traditional methods cannot provide. The neutron computed tomography system at Pennsylvania State University's Radiation Science and Engineering Center is being utilized to develop a water quantification technique for investigation of water distribution in fuel cells under normal conditions. A hollow aluminum cylinder test sample filled with a known volume of water was constructed for purposes of testing the quantification technique. Transmission images of the test sample at different angles were easily acquired through the synthesis of a dedicated image acquisition computer driving a rotary table controller and an in-house developed synchronization software package. After data acquisition, Octopus (version 8.2) and VGStudio Max (version 1.2) were used to perform cross-sectional and three-dimensional reconstructions of the sample, respectively. The initial reconstructions and water quantification results are presented.

  18. Validation of thermohydraulic codes by comparison of experimental results with computer simulations

    International Nuclear Information System (INIS)

    Madeira, A.A.; Galetti, M.R.S.; Pontedeiro, A.C.

    1989-01-01

    The results obtained by simulating three cases from the CANON depressurization experiment, using version 7.6 of the TRAC-PF1 computer code implemented on the VAX-11/750 computer of the Brazilian CNEN, are presented. The CANON experiment was chosen as the first thermal-hydraulic standard problem to be discussed at ENFIR for comparing results from different computer codes with results obtained experimentally. The ability of the TRAC-PF1 code to predict the depressurization phase of a loss-of-primary-coolant accident in pressurized water reactors is evaluated. (M.C.K.) [pt

  19. Reproducibility of Dynamic Computed Tomography Brain Perfusion Measurements in Patients with Significant Carotid Artery Stenosis

    International Nuclear Information System (INIS)

    Serafin, Z.; Kotarski, M.; Karolkiewicz, M.; Mindykowski, R.; Lasek, W.; Molski, S.; Gajdzinska, M.; Nowak-Nowacka, A.

    2009-01-01

    Background: Perfusion computed tomography (PCT) determination is a minimally invasive and widely available technique for brain blood flow assessment, but its application may be restricted by large variation of results. Purpose: To determine the intraobserver, interobserver, and inter examination variability of brain PCT absolute measurements in patients with significant carotid artery stenosis (CAS), and to evaluate the effect of the use of relative perfusion values on PCT reproducibility. Material and Methods: PCT imaging was completed in 61 patients before endarterectomy, and in 38 of these within 4 weeks after treatment. Cerebral blood flow (CBF), cerebral blood volume (CBV), time to peak (TTP), and peak enhancement intensity (PEI) were calculated with the maximum slope method. Inter examination variability was evaluated based on perfusion of hemisphere contralateral to the treated CAS, from repeated examinations. Interobserver and intraobserver variability were established for the untreated side, based on pretreatment examination. Results: Interobserver and intraobserver variability were highest for CBF measurement (28.8% and 32.5%, respectively), and inter examination variability was the highest for CBV (24.1%). Intraobserver and interobserver variability were higher for absolute perfusion values compared with their respective ratios for CBF and TTP. The only statistically significant difference between perfusion values measured by two observers was for CBF (mean 78.3 vs. 67.5 ml/100 g/min). The inter examination variability of TTP (12.1%) was significantly lower than the variability of other absolute perfusion measures, and the inter examination variability of ratios was significantly lower than absolute values for all the parameters. Conclusion: In longitudinal studies of patients with chronic cerebral ischemia, PCT ratios and either TTP or CBV are more suitable measures than absolute CBF values, because of their considerably lower inter- and intraobserver

  20. Reproducibility of Dynamic Computed Tomography Brain Perfusion Measurements in Patients with Significant Carotid Artery Stenosis

    Energy Technology Data Exchange (ETDEWEB)

    Serafin, Z.; Kotarski, M.; Karolkiewicz, M.; Mindykowski, R.; Lasek, W.; Molski, S.; Gajdzinska, M.; Nowak-Nowacka, A. (Dept. of Radiology and Diagnostic Imaging, and Dept. of General and Vascular Surgery, Nicolaus Copernicus Univ., Collegium Medicum, Bydgoszcz (Poland))

    2009-02-15

    Background: Perfusion computed tomography (PCT) determination is a minimally invasive and widely available technique for brain blood flow assessment, but its application may be restricted by large variation of results. Purpose: To determine the intraobserver, interobserver, and inter examination variability of brain PCT absolute measurements in patients with significant carotid artery stenosis (CAS), and to evaluate the effect of the use of relative perfusion values on PCT reproducibility. Material and Methods: PCT imaging was completed in 61 patients before endarterectomy, and in 38 of these within 4 weeks after treatment. Cerebral blood flow (CBF), cerebral blood volume (CBV), time to peak (TTP), and peak enhancement intensity (PEI) were calculated with the maximum slope method. Inter examination variability was evaluated based on perfusion of hemisphere contralateral to the treated CAS, from repeated examinations. Interobserver and intraobserver variability were established for the untreated side, based on pretreatment examination. Results: Interobserver and intraobserver variability were highest for CBF measurement (28.8% and 32.5%, respectively), and inter examination variability was the highest for CBV (24.1%). Intraobserver and interobserver variability were higher for absolute perfusion values compared with their respective ratios for CBF and TTP. The only statistically significant difference between perfusion values measured by two observers was for CBF (mean 78.3 vs. 67.5 ml/100 g/min). The inter examination variability of TTP (12.1%) was significantly lower than the variability of other absolute perfusion measures, and the inter examination variability of ratios was significantly lower than absolute values for all the parameters. Conclusion: In longitudinal studies of patients with chronic cerebral ischemia, PCT ratios and either TTP or CBV are more suitable measures than absolute CBF values, because of their considerably lower inter- and intraobserver

  1. Significance evaluation in factor graphs

    DEFF Research Database (Denmark)

    Madsen, Tobias; Hobolth, Asger; Jensen, Jens Ledet

    2017-01-01

    in genomics and the multiple-testing issues accompanying them, accurate significance evaluation is of great importance. We here address the problem of evaluating statistical significance of observations from factor graph models. Results Two novel numerical approximations for evaluation of statistical...... significance are presented. First a method using importance sampling. Second a saddlepoint approximation based method. We develop algorithms to efficiently compute the approximations and compare them to naive sampling and the normal approximation. The individual merits of the methods are analysed both from....... Conclusions The applicability of saddlepoint approximation and importance sampling is demonstrated on known models in the factor graph framework. Using the two methods we can substantially improve computational cost without compromising accuracy. This contribution allows analyses of large datasets...
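
    A minimal Python sketch of the importance-sampling idea for estimating a small tail probability (here under a standard normal null purely for illustration; the paper works with factor graph models): sampling from a proposal shifted into the tail and reweighting by the likelihood ratio gives a usable estimate where naive sampling would need billions of draws.

      import numpy as np

      rng = np.random.default_rng(0)

      def tail_prob_importance(t, n=100_000):
          """Estimate p = P(Z >= t) for Z ~ N(0, 1) by importance sampling."""
          x = rng.normal(loc=t, scale=1.0, size=n)          # proposal centered at the threshold
          log_w = -0.5 * x**2 + 0.5 * (x - t) ** 2          # log likelihood ratio N(0,1)/N(t,1)
          return np.mean(np.exp(log_w) * (x >= t))

      print(tail_prob_importance(5.0))   # close to the exact value of about 2.9e-7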

  2. Computation of Quasiperiodic Normally Hyperbolic Invariant Tori: Rigorous Results

    Science.gov (United States)

    Canadell, Marta; Haro, Àlex

    2017-12-01

    The development of efficient methods for detecting quasiperiodic oscillations and computing the corresponding invariant tori is a subject of great importance in dynamical systems and their applications in science and engineering. In this paper, we prove the convergence of a new Newton-like method for computing quasiperiodic normally hyperbolic invariant tori carrying quasiperiodic motion in smooth families of real-analytic dynamical systems. The main result is stated as an a posteriori KAM-like theorem that allows controlling the inner dynamics on the torus with appropriate detuning parameters, in order to obtain a prescribed quasiperiodic motion. The Newton-like method leads to several fast and efficient computational algorithms, which are discussed and tested in a companion paper (Canadell and Haro in J Nonlinear Sci, 2017. doi: 10.1007/s00332-017-9388-z), in which new mechanisms of breakdown are presented.

  3. New computation results for the solar dynamo

    International Nuclear Information System (INIS)

    Csada, I.K.

    1983-01-01

    The analytical solution to the solar dynamo equation leads to a relatively simple algorithm for the computation in terms of kinematic models. The internal and external velocities are taken to be in the form of axisymmetric meridional circulation and differential rotation, respectively. Purely radial expanding motions in the corona are also taken into consideration. Numerical results are presented in terms of the velocity parameters for the period of field reversal, decay time, and the magnitudes and phases of the first four multipoles. (author)

  4. A computer-based measure of resultant achievement motivation.

    Science.gov (United States)

    Blankenship, V

    1987-08-01

    Three experiments were conducted to develop a computer-based measure of individual differences in resultant achievement motivation (RAM) on the basis of level-of-aspiration, achievement motivation, and dynamics-of-action theories. In Experiment 1, the number of atypical shifts and greater responsiveness to incentives on 21 trials with choices among easy, intermediate, and difficult levels of an achievement-oriented game were positively correlated and were found to differentiate the 62 subjects (31 men, 31 women) on the amount of time they spent at a nonachievement task (watching a color design) 1 week later. In Experiment 2, test-retest reliability was established with the use of 67 subjects (15 men, 52 women). Point and no-point trials were offered in blocks, with point trials first for half the subjects and no-point trials first for the other half. Reliability was higher for the atypical-shift measure than for the incentive-responsiveness measure and was higher when points were offered first. In Experiment 3, computer anxiety was manipulated by creating a simulated computer breakdown in the experimental condition. Fifty-nine subjects (13 men, 46 women) were randomly assigned to the experimental condition or to one of two control conditions (an interruption condition and a no-interruption condition). Subjects with low RAM, as demonstrated by a low number of typical shifts, took longer to choose the achievement-oriented task, as predicted by the dynamics-of-action theory. The difference was evident in all conditions and most striking in the computer-breakdown condition. A change of focus from atypical to typical shifts is discussed.

  5. Some gender issues in educational computer use: results of an international comparative survey

    OpenAIRE

    Janssen Reinen, I.A.M.; Plomp, T.

    1993-01-01

    In the framework of the Computers in Education international study of the International Association for the Evaluation of Educational Achievement (IEA), data have been collected concerning the use of computers in 21 countries. This article examines some results regarding the involvement of women in the implementation and use of computers in the educational practice of elementary, lower secondary and upper secondary education in participating countries. The results show that in many countries ...

  6. [Results of the marketing research study "Acceptance of physician's office computer systems"].

    Science.gov (United States)

    Steinhausen, D; Brinkmann, F; Engelhard, A

    1998-01-01

    We report on a market research study on the acceptance of computer systems in physicians' practices. A total of 11,000 returned questionnaires from physicians--users and nonusers--were analysed. We found that most of the physicians used their computers in a limited way, i.e. as a device for accounting. The level of utilisation differed between men and women, West and East, and young and old. We also analysed the computer-use behaviour of gynaecologists. Two thirds of all nonusers do not intend to use a computer in the future.

  7. Prevalence and clinical significance of pleural microbubbles in computed tomography of thoracic empyema

    International Nuclear Information System (INIS)

    Smolikov, A.; Smolyakov, R.; Riesenberg, K.; Schlaeffer, F.; Borer, A.; Cherniavsky, E.; Gavriel, A.; Gilad, J.

    2006-01-01

    AIM: To determine the prevalence and clinical significance of pleural microbubbles in thoracic empyema. MATERIALS AND METHODS: The charts of 71 consecutive patients with empyema were retrospectively reviewed for relevant demographic, laboratory, microbiological, therapeutic and outcome data. Computed tomography (CT) images were reviewed for various signs of empyema as well as pleural microbubbles. Two patient groups, with and without microbubbles were compared. RESULTS: Mean patient age was 49 years and 72% were males. Microbubbles were detected in 58% of patients. There were no significant differences between patients with and without microbubbles in regard to pleural fluid chemistry. A causative organism was identified in about 75% of cases in both. There was no difference in the rates of pleural thickening and enhancement, increased extra-pleural fat attenuation, air-fluid levels or loculations. Microbubbles were diagnosed after a mean of 7.8 days from admission. Thoracentesis before CT was performed in 90 and 57% of patients with and without microbubbles (p=0.0015), respectively. Patients with microbubbles were more likely to require repeated drainage (65.9 versus 36.7%, p=0.015) and surgical decortication (31.7 versus 6.7%, p=0.011). Mortalities were 9.8 and 6.6% respectively (p=0.53). CONCLUSION: Pleural microbubbles are commonly encountered in CT imaging of empyema but have not been systematically studied to date. Microbubbles may be associated with adverse outcome such as repeated drainage or surgical decortication. The sensitivity and specificity of this finding and its prognostic implications need further assessment

  8. Significance of triplane computed tomography in otolaryngology

    International Nuclear Information System (INIS)

    Taiji, Hidenobu; Namiki, Hideo; Kano, Shigeru; Hojoh, Yoshio

    1985-01-01

    The authors obtained direct sagittal CT scans of the head using a new method for positioning the patient's head in the sitting position. Direct sagittal scans are more useful than computed reformatted scans because of their better spatial and density resolution. Triplane CT (axial, coronal, and sagittal CT) greatly improves three-dimensional recognition of the intracranial and facial structures and of the extent of a lesion. A series of patients with various nasal and oropharyngeal tumors was examined with triplane CT. The advantages of direct sagittal scans are (1) recognition of the localization and extension of the lesion, (2) evaluation of the extent of deep facial and nasopharyngeal tumors, especially in the intracranial and intraorbital regions, and (3) more accurate staging of maxillary cancer. (author)

  9. Computation and experiment results of the grounding model of Three Gorges Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Wen Xishan; Zhang Yuanfang; Yu Jianhui; Chen Cixuan [Wuhan University of Hydraulic and Electrical Engineering (China); Qin Liming; Xu Jun; Shu Lianfu [Yangtze River Water Resources Commission, Wuhan (China)

    1999-07-01

    A model for the computation of the grounding parameters of the grids of the Three Gorges Power Plant (TGPP) on the Yangtze River is presented in this paper. Using this model, computation and analysis of the grounding grids are carried out. The results show that the reinforcing grid of the dam is the main body of current dissipation; it must be reliably welded to form a good grounding grid. The experimental results show that the method and program used for the computations are correct. (UK)

  10. A result-driven minimum blocking method for PageRank parallel computing

    Science.gov (United States)

    Tao, Wan; Liu, Tao; Yu, Wei; Huang, Gan

    2017-01-01

    Matrix blocking is a common method for improving the computational efficiency of PageRank, but the blocking rules are hard to determine and the subsequent calculation is complicated. To tackle these problems, we propose a minimum blocking method driven by result needs to accomplish a parallel implementation of the PageRank algorithm. Minimum blocking stores only the elements that are necessary for the result matrix. In return, the subsequent calculation becomes simple and I/O transmission is reduced. We carried out experiments on several matrices of different sizes and sparsity. The results show that the proposed method has better computational efficiency than traditional blocking methods.
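
    For reference, a minimal Python sketch of the plain PageRank power iteration that blocking schemes parallelize; the blocking itself is not reproduced here, and the toy graph is an illustrative placeholder.

      import numpy as np
      from scipy.sparse import csr_matrix

      def pagerank(adj, d=0.85, tol=1e-10, max_iter=100):
          """adj[i][j] = 1 if page j links to page i; returns the PageRank vector."""
          A = np.asarray(adj, dtype=float)
          out_deg = A.sum(axis=0)
          out_deg[out_deg == 0] = 1.0          # avoid division by zero for dangling pages
          M = csr_matrix(A / out_deg)          # column-stochastic link matrix
          n = A.shape[0]
          r = np.full(n, 1.0 / n)
          for _ in range(max_iter):
              r_new = d * (M @ r) + (1.0 - d) / n
              if np.abs(r_new - r).sum() < tol:
                  break
              r = r_new
          return r

      # Toy 3-page graph: 0 -> 1, 1 -> 2, 2 -> 0 and 2 -> 1
      print(pagerank([[0, 0, 1], [1, 0, 1], [0, 1, 0]]))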

  11. The significance of computed tomography in optic neuropathy

    International Nuclear Information System (INIS)

    Awai, Tsugumi; Yasutake, Hirohide; Ono, Yoshiko; Kumagai, Kazuhisa; Kairada, Kensuke

    1981-01-01

    Computed tomography (CT scan) has become one of the important and useful modes of examination for ophthalmological and neuro-ophthalmological disorders. CT scan (EMI scan) was performed on 21 patients with optic neuropathy in order to detect the cause. Of these 21 patients, the CT scan was abnormal in six. These six patients were verified, histopathologically, as having chromophobe pituitary adenoma, craniopharyngioma, plasmacytoma arising from the sphenoid sinus, optic nerve glioma, and giant aneurysm of the anterior communicating artery. The practical diagnostic value of the CT scan for optic neuropathy is discussed. (author)

  12. Results of comparative RBMK neutron computation using VNIIEF codes (cell computation, 3D statics, 3D kinetics). Final report

    Energy Technology Data Exchange (ETDEWEB)

    Grebennikov, A.N.; Zhitnik, A.K.; Zvenigorodskaya, O.A.; et al.

    1995-12-31

    In conformity with the protocol of the Workshop under Contract "Assessment of RBMK reactor safety using modern Western Codes", VNIIEF performed a neutronics computation series to compare Western and VNIIEF codes and assess whether VNIIEF codes are suitable for RBMK-type reactor safety assessment computation. The work was carried out in close collaboration with M.I. Rozhdestvensky and L.M. Podlazov, NIKIET employees. The effort involved: (1) cell computations with the WIMS, EKRAN codes (improved modification of the LOMA code) and the S-90 code (VNIIEF Monte Carlo), including cell, polycell, and burnup computations; (2) 3D computation of static states with the KORAT-3D and NEU codes and comparison with results of computation with the NESTLE code (USA). The computations were performed in the geometry and using the neutron constants presented by the American party; (3) 3D computation of neutron kinetics with the KORAT-3D and NEU codes. These computations were performed in two formulations, both developed in collaboration with NIKIET. The formulation of the first problem agrees as closely as possible with one of the NESTLE problems and imitates gas bubble travel through a core. The second problem is a model of the RBMK as a whole with imitation of control and protection system (CPS) control movement in a core.

  13. Introducing Computer-Based Testing in High-Stakes Exams in Higher Education: Results of a Field Experiment.

    Science.gov (United States)

    Boevé, Anja J; Meijer, Rob R; Albers, Casper J; Beetsma, Yta; Bosker, Roel J

    2015-01-01

    The introduction of computer-based testing in high-stakes examining in higher education is developing rather slowly due to institutional barriers (the need of extra facilities, ensuring test security) and teacher and student acceptance. From the existing literature it is unclear whether computer-based exams will yield results similar to those of paper-based exams and whether student acceptance can change as a result of administering computer-based exams. In this study, we compared results from a computer-based and a paper-based exam in a sample of psychology students and found no differences in total scores across the two modes. Furthermore, we investigated student acceptance and change in acceptance of computer-based examining. After taking the computer-based exam, fifty percent of the students preferred paper-and-pencil exams over computer-based exams and about a quarter preferred a computer-based exam. We conclude that computer-based exam total scores are similar to paper-based exam scores, but that for the acceptance of high-stakes computer-based exams it is important that students practice and get familiar with this new mode of test administration.
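
    A minimal Python sketch of the kind of mode comparison reported above: an independent-samples t-test on exam total scores from the two administration modes; the score lists are illustrative placeholders, not data from the study.

      from scipy import stats

      computer_based = [31, 28, 35, 27, 30, 33, 29, 32]
      paper_based = [30, 29, 34, 26, 31, 32, 28, 33]

      t, p = stats.ttest_ind(computer_based, paper_based)
      print(f"t = {t:.2f}, p = {p:.3f}")   # a large p-value is consistent with "no difference in total scores"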

  14. Introducing handheld computing into a residency program: preliminary results from qualitative and quantitative inquiry.

    OpenAIRE

    Manning, B.; Gadd, C. S.

    2001-01-01

    Although published reports describe specific handheld computer applications in medical training, we know very little yet about how, and how well, handheld computing fits into the spectrum of information resources available for patient care and physician training. This paper reports preliminary quantitative and qualitative results from an evaluation study designed to track changes in computer usage patterns and computer-related attitudes before and after introduction of handheld computing. Pre...

  15. Fast Virtual Fractional Flow Reserve Based Upon Steady-State Computational Fluid Dynamics Analysis: Results From the VIRTU-Fast Study.

    Science.gov (United States)

    Morris, Paul D; Silva Soto, Daniel Alejandro; Feher, Jeroen F A; Rafiroiu, Dan; Lungu, Angela; Varma, Susheel; Lawford, Patricia V; Hose, D Rodney; Gunn, Julian P

    2017-08-01

    Fractional flow reserve (FFR)-guided percutaneous intervention is superior to standard assessment but remains underused. The authors have developed a novel "pseudotransient" analysis protocol for computing virtual fractional flow reserve (vFFR) based upon angiographic images and steady-state computational fluid dynamics. This protocol generates vFFR results in 189 s (cf >24 h for transient analysis) using a desktop PC, with <1% error relative to that of full-transient computational fluid dynamics analysis. Sensitivity analysis demonstrated that physiological lesion significance was influenced less by coronary or lesion anatomy (33%) and more by microvascular physiology (59%). If coronary microvascular resistance can be estimated, vFFR can be accurately computed in less time than it takes to make invasive measurements.
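
    A minimal Python sketch of how a virtual FFR value is formed once a CFD (or reduced-order) model has returned the trans-lesion pressure drop at hyperaemic flow; the quadratic pressure-loss coefficients and flow value are assumptions for illustration, not the VIRTU-Fast protocol itself.

      def ffr(p_aortic_mmhg, dp_lesion_mmhg):
          """FFR = distal pressure / aortic pressure."""
          return (p_aortic_mmhg - dp_lesion_mmhg) / p_aortic_mmhg

      def lesion_dp(flow_ml_s, k_viscous=0.08, k_expansion=0.015):
          """Quadratic pressure-loss model dP = a*Q + b*Q**2 (coefficients assumed)."""
          return k_viscous * flow_ml_s + k_expansion * flow_ml_s ** 2

      q_hyperaemic = 4.0                       # ml/s, assumed hyperaemic flow
      print(round(ffr(100.0, lesion_dp(q_hyperaemic)), 2))   # ~0.99 for this mild example lesion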

  16. Intensity-Modulated Radiotherapy Results in Significant Decrease in Clinical Toxicities Compared With Conventional Wedge-Based Breast Radiotherapy

    International Nuclear Information System (INIS)

    Harsolia, Asif; Kestin, Larry; Grills, Inga; Wallace, Michelle; Jolly, Shruti; Jones, Cortney; Lala, Moinaktar; Martinez, Alvaro; Schell, Scott; Vicini, Frank A.

    2007-01-01

    Purpose: We have previously demonstrated that intensity-modulated radiotherapy (IMRT) with a static multileaf collimator process results in a more homogeneous dose distribution compared with conventional wedge-based whole breast irradiation (WBI). In the present analysis, we reviewed the acute and chronic toxicity of this IMRT approach compared with conventional wedge-based treatment. Methods and Materials: A total of 172 patients with Stage 0-IIB breast cancer were treated with lumpectomy followed by WBI. All patients underwent treatment planning computed tomography and received WBI (median dose, 45 Gy) followed by a boost to 61 Gy. Of the 172 patients, 93 (54%) were treated with IMRT, and the 79 patients (46%) treated with wedge-based RT in a consecutive fashion immediately before this cohort served as the control group. The median follow-up was 4.7 years. Results: A significant reduction in acute Grade 2 or worse dermatitis, edema, and hyperpigmentation was seen with IMRT compared with wedges. A trend was found toward reduced acute Grade 3 or greater dermatitis (6% vs. 1%, p = 0.09) in favor of IMRT. Chronic Grade 2 or worse breast edema was significantly reduced with IMRT compared with conventional wedges. No difference was found in cosmesis scores between the two groups. In patients with larger breasts (≥1,600 cm³, n = 64), IMRT resulted in reduced acute (Grade 2 or greater) breast edema (0% vs. 36%, p < 0.001) and hyperpigmentation (3% vs. 41%, p 0.001) and chronic (Grade 2 or greater) long-term edema (3% vs. 30%, p 0.007). Conclusion: The use of IMRT in the treatment of the whole breast results in a significant decrease in acute dermatitis, edema, and hyperpigmentation and a reduction in the development of chronic breast edema compared with conventional wedge-based RT

  17. Introducing Computer-Based Testing in High-Stakes Exams in Higher Education: Results of a Field Experiment

    Science.gov (United States)

    Boevé, Anja J.; Meijer, Rob R.; Albers, Casper J.; Beetsma, Yta; Bosker, Roel J.

    2015-01-01

    The introduction of computer-based testing in high-stakes examining in higher education is developing rather slowly due to institutional barriers (the need of extra facilities, ensuring test security) and teacher and student acceptance. From the existing literature it is unclear whether computer-based exams will yield results similar to those of paper-based exams and whether student acceptance can change as a result of administering computer-based exams. In this study, we compared results from a computer-based and a paper-based exam in a sample of psychology students and found no differences in total scores across the two modes. Furthermore, we investigated student acceptance and change in acceptance of computer-based examining. After taking the computer-based exam, fifty percent of the students preferred paper-and-pencil exams over computer-based exams and about a quarter preferred a computer-based exam. We conclude that computer-based exam total scores are similar to paper-based exam scores, but that for the acceptance of high-stakes computer-based exams it is important that students practice and get familiar with this new mode of test administration. PMID:26641632

  18. Technical Note. The Concept of a Computer System for Interpretation of Tight Rocks Using X-Ray Computed Tomography Results

    Directory of Open Access Journals (Sweden)

    Habrat Magdalena

    2017-03-01

    The article presents the concept of a computer system for interpreting unconventional oil and gas deposits using X-ray computed tomography results, and describes the functional principles of the proposed solution. The main goal is to design a comprehensive and useful tool in the form of specialist computer software for qualitative and quantitative interpretation of images obtained from X-ray computed tomography, devoted to the prospecting and identification of unconventional hydrocarbon deposits. The article focuses on the use of X-ray computed tomography as a basis for the analysis of tight rocks, with particular attention to the functional principles of the system to be developed by the authors. These include graphical visualization of rock structure, qualitative and quantitative interpretation of the model used for visualizing rock samples, and a description of the parameters handled by the quantitative interpretation module.

  19. 14th annual Results and Review Workshop on High Performance Computing in Science and Engineering

    CERN Document Server

    Nagel, Wolfgang E; Resch, Michael M; Transactions of the High Performance Computing Center, Stuttgart (HLRS) 2011; High Performance Computing in Science and Engineering '11

    2012-01-01

    This book presents the state-of-the-art in simulation on supercomputers. Leading researchers present results achieved on systems of the High Performance Computing Center Stuttgart (HLRS) for the year 2011. The reports cover all fields of computational science and engineering, ranging from CFD to computational physics and chemistry, to computer science, with a special emphasis on industrially relevant applications. Presenting results for both vector systems and microprocessor-based systems, the book allows readers to compare the performance levels and usability of various architectures. As HLRS

  20. Results of application of automatic computation of static corrections on data from the South Banat Terrain

    Science.gov (United States)

    Milojević, Slavka; Stojanovic, Vojislav

    2017-04-01

    Owing to the continuous development of seismic acquisition and processing methods, increasing the signal-to-noise ratio is a constant goal. Correct application of the latest software solutions improves the processing results and justifies their development. Correct computation and application of static corrections is one of the most important tasks in pre-processing, and this phase is of great importance for the subsequent processing steps. Static corrections are applied to seismic data in order to compensate for the effects of irregular topography, for the difference between the elevations of source and receiver points relative to the datum (reduction) level, and for the near-surface low-velocity layer (weathering correction), or for any other factor that influences the spatial and temporal position of seismic traces. The refraction statics method is the most common method for the computation of static corrections. It is successful both in resolving long-period statics problems and in determining differences in statics caused by abrupt lateral velocity changes in the near-surface layer. XtremeGeo Flatirons™ is a program whose main purpose is the computation of static corrections by a refraction statics method, and it supports the following procedures: picking of first arrivals, checking of geometry, multiple methods for statics analysis and modelling, analysis of refractor anisotropy, and tomography (Eikonal Tomography). The exploration area is located on the southern edge of the Pannonian Plain, in flat terrain with altitudes of 50 to 195 meters. The largest part of the exploration area covers Deliblato Sands, where the geological structure of the terrain and the large differences in altitude significantly affect the calculation of static corrections. The XtremeGeo Flatirons™ software has powerful visualization and statistical analysis tools, which contribute to a significantly more accurate assessment of the near-surface geometry.
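
    A minimal Python sketch of the per-station elevation-plus-weathering static correction that a refraction-statics workflow ultimately applies; the velocities, datum, and layer thickness below are illustrative assumptions, not values from the survey.

      def static_correction_ms(elevation_m, datum_m, weathering_thickness_m,
                               v_weathering_m_s=600.0, v_replacement_m_s=2000.0):
          """Time shift (ms) that moves a station from the surface to the datum."""
          t_weathering = weathering_thickness_m / v_weathering_m_s      # remove the slow weathered layer
          t_replacement = (elevation_m - weathering_thickness_m - datum_m) / v_replacement_m_s
          return -1000.0 * (t_weathering + t_replacement)               # negative shift: traces move up to the datum

      print(round(static_correction_ms(elevation_m=150.0, datum_m=50.0, weathering_thickness_m=20.0), 1))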

  1. Computations for the 1:5 model of the THTR pressure vessel compared with experimental results

    International Nuclear Information System (INIS)

    Stangenberg, F.

    1972-01-01

    In this report, experimental results measured in 1971 on the 1:5 model of the prestressed concrete pressure vessel of the THTR nuclear power station Schmehausen are compared with the results of axisymmetric computations. Linear-elastic computations were performed, as well as approximate computations for overload pressures taking into consideration the influence of the load history (prestressing, temperature, creep) and the effects of the steel components. (orig.) [de

  2. Interpreting Statistical Significance Test Results: A Proposed New "What If" Method.

    Science.gov (United States)

    Kieffer, Kevin M.; Thompson, Bruce

    As the 1994 publication manual of the American Psychological Association emphasized, "p" values are affected by sample size. As a result, it can be helpful to interpret the results of statistical significance tests in a sample-size context by conducting so-called "what if" analyses. However, these methods can be inaccurate…
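
    A minimal Python sketch of such a "what if" analysis: hold the observed effect size fixed and recompute the p-value at other hypothetical sample sizes (a one-sample t-test with Cohen's d = 0.30 is assumed here for illustration).

      from scipy import stats

      def what_if_p(cohens_d, n):
          """Two-tailed p-value of a one-sample t-test with effect size d and sample size n."""
          t = cohens_d * n ** 0.5
          return 2.0 * stats.t.sf(abs(t), df=n - 1)

      for n in (10, 30, 100, 300):
          print(n, round(what_if_p(0.30, n), 4))
      # The identical effect moves from "non-significant" to "significant" purely as n grows,
      # which is exactly what the "what if" exercise is meant to expose.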

  3. Results of the deepest all-sky survey for continuous gravitational waves on LIGO S6 data running on the Einstein@Home volunteer distributed computing project

    NARCIS (Netherlands)

    Abbott, B. P.; Abbott, R.; Abbott, T. D.; et al. (LIGO Scientific Collaboration and Virgo Collaboration)

    2016-01-01

    We report results of a deep all-sky search for periodic gravitational waves from isolated neutron stars in data from the S6 LIGO science run. The search was possible thanks to the computing power provided by the volunteers of the Einstein@Home distributed computing project. We find no significant

  4. Separation of electron ion ring components (computational simulation and experimental results)

    International Nuclear Information System (INIS)

    Aleksandrov, V.S.; Dolbilov, G.V.; Kazarinov, N.Yu.; Mironov, V.I.; Novikov, V.G.; Perel'shtejn, Eh.A.; Sarantsev, V.P.; Shevtsov, V.F.

    1978-01-01

    The attainable polarization of electron-ion rings in the acceleration regime and the separation of the ring components at the final stage of acceleration are studied. The results of computational simulation using the macroparticle method and of experiments on ring acceleration and separation are given, and the calculated results are compared with experiment

  5. Results of a Research Evaluating Quality of Computer Science Education

    Science.gov (United States)

    Záhorec, Ján; Hašková, Alena; Munk, Michal

    2012-01-01

    The paper presents the results of an international research on a comparative assessment of the current status of computer science education at the secondary level (ISCED 3A) in Slovakia, the Czech Republic, and Belgium. Evaluation was carried out based on 14 specific factors gauging the students' point of view. The authors present qualitative…

  6. Cone-beam computed tomography analysis of accessory maxillary ostium and Haller cells: Prevalence and clinical significance

    Energy Technology Data Exchange (ETDEWEB)

    Ali, Ibrahim K.; Sansare, Kaustubh; Karjodkar, Freny R.; Vanga, Kavita; Salve, Prashant [Dept. of Oral Medicine and Radiology, Nair Hospital Dental College, Mumbai (India); Pawar, Ajinkya M. [Dept. of Conservative Dentistry and Endodontics, Nair Hospital Dental College, Mumbai (India)

    2017-03-15

    This study aimed to evaluate the prevalence of Haller cells and accessory maxillary ostium (AMO) in cone-beam computed tomography (CBCT) images, and to analyze the relationships among Haller cells, AMO, and maxillary sinusitis. Volumetric CBCT scans from 201 patients were retrieved from our institution's Digital Imaging and Communications in Medicine (DICOM) archive. Two observers evaluated the presence of Haller cells, AMO, and maxillary sinusitis in the CBCT scans. AMO was observed in 114 patients, of whom 27 (23.7%) had AMO exclusively on the right side, 26 (22.8%) only on the left side, and 61 (53.5%) bilaterally. Haller cells were identified in 73 (36.3%) patients. In 24 (32.9%) they were present exclusively on the right side, in 17 (23.3%) they were only present on the left side, and in 32 (43.8%) they were located bilaterally. Of the 73 (36.3%) patients with Haller cells, maxillary sinusitis was also present in 50 (68.5%). Using the chi-square test, a significant association was observed between AMO and maxillary sinusitis in the presence of Haller cells. Our results showed AMO and Haller cells to be associated with maxillary sinusitis. This study provides evidence for the usefulness of CBCT in imaging the bony anatomy of the sinonasal complex with significantly higher precision and a smaller radiation dose.
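
    A minimal Python sketch of the chi-square association test mentioned above; the 2x2 counts are illustrative placeholders (variant present/absent versus sinusitis yes/no), not the study's full contingency table.

      from scipy.stats import chi2_contingency

      table = [[50, 23],    # variant present: sinusitis yes / no
               [45, 83]]    # variant absent:  sinusitis yes / no (counts assumed)

      chi2, p, dof, expected = chi2_contingency(table)
      print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")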

  7. CMS results in the Combined Computing Readiness Challenge CCRC'08

    International Nuclear Information System (INIS)

    Bonacorsi, D.; Bauerdick, L.

    2009-01-01

    During February and May 2008, CMS participated in the Combined Computing Readiness Challenge (CCRC'08) together with all other LHC experiments. The purpose of this worldwide exercise was to check the readiness of the computing infrastructure for LHC data taking. Another set of major CMS tests called the Computing, Software and Analysis challenge (CSA'08) - as well as CMS cosmic runs - were also running at the same time: CCRC augmented the load on computing with additional tests to validate and stress-test all CMS computing workflows at full data taking scale, also extending this to the global WLCG community. CMS exercised most aspects of the CMS computing model, with very comprehensive tests. During May 2008, CMS moved more than 3.6 Petabytes among more than 300 links in the complex Grid topology. CMS demonstrated that it is able to safely move data out of CERN to the Tier-1 sites, sustaining more than 600 MB/s as a daily average for more than seven days in a row, with enough headroom and with hourly peaks of up to 1.7 GB/s. CMS ran hundreds of simultaneous jobs at each Tier-1 site, re-reconstructing and skimming hundreds of millions of events. After re-reconstruction the fresh AOD (Analysis Object Data) has to be synchronized between Tier-1 centers: CMS demonstrated that the required inter-Tier-1 transfers are achievable within a few days. CMS also showed that skimmed analysis data sets can be transferred to Tier-2 sites for analysis at a sufficient rate, regionally as well as inter-regionally, achieving all goals in about 90% of >200 links. Simultaneously, CMS also ran a large Tier-2 analysis exercise, where realistic analysis jobs were submitted to a large set of Tier-2 sites by a large number of people to produce a chaotic workload across the systems, with more than 400 analysis users in May. Taken all together, CMS routinely achieved submissions of 100k jobs/day, with peaks up to 200k jobs/day. The achieved results in CCRC'08 - focussing on the distributed

  8. Operational statistical analysis of the results of computer-based testing of students

    Directory of Open Access Journals (Sweden)

    Viktor Ivanovich Nardyuzhev

    2018-12-01

    The article is devoted to the statistical analysis of computer-based testing results used for evaluating students' educational achievements. The topic is relevant because computer-based testing in Russian universities has become an important method for evaluating students' educational achievements and the quality of the modern educational process. The use of modern methods and programs for statistical analysis of computer-based testing results and for assessing the quality of developed tests is a pressing problem for every university teacher. The article shows how the authors solve this problem using their own program “StatInfo”. For several years the program has been successfully applied in a credit system of education at such technological stages as loading computer-based testing protocols into a database, formation of queries, and generation of reports, lists, and matrices of answers for statistical analysis of test-item quality. The methodology, experience, and some results of its usage by university teachers are described in the article. Related topics of test development, models, algorithms, technologies, and software for large-scale computer-based testing have been discussed by the authors in their previous publications, which are presented in the reference list.
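
    A minimal Python sketch of the classical item statistics that a test-analysis program such as the authors' “StatInfo” typically derives from a matrix of answers (rows = examinees, columns = items, 1 = correct); the matrix is an illustrative placeholder.

      import numpy as np

      answers = np.array([[1, 1, 0, 1],
                          [1, 0, 0, 1],
                          [1, 1, 1, 1],
                          [0, 0, 0, 1],
                          [1, 1, 0, 0]])

      difficulty = answers.mean(axis=0)                 # proportion correct per item
      total = answers.sum(axis=1)                       # total score per examinee
      discrimination = np.array([np.corrcoef(answers[:, j], total)[0, 1]
                                 for j in range(answers.shape[1])])   # point-biserial per item
      print("difficulty:", difficulty)
      print("discrimination:", np.round(discrimination, 2))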

  9. Transversity results and computations in symplectic field theory

    International Nuclear Information System (INIS)

    Fabert, Oliver

    2008-01-01

    Although the definition of symplectic field theory suggests that one has to count holomorphic curves in cylindrical manifolds R × V equipped with a cylindrical almost complex structure J, it is already well-known from Gromov-Witten theory that, due to the presence of multiply-covered curves, we in general cannot achieve transversality for all moduli spaces even for generic choices of J. In this thesis we treat the transversality problem of symplectic field theory in two important cases. In the first part of this thesis we are concerned with the rational symplectic field theory of Hamiltonian mapping tori, which is also called the Floer case. For this observe that in the general geometric setup for symplectic field theory, the contact manifolds can be replaced by mapping tori M_φ of symplectic manifolds (M, ω_M) with symplectomorphisms φ. While the cylindrical contact homology of M_φ is given by the Floer homologies of powers of φ, the other algebraic invariants of symplectic field theory for M_φ provide natural generalizations of symplectic Floer homology. For symplectically aspherical M and Hamiltonian φ we study the moduli spaces of rational curves and prove a transversality result, which does not need the polyfold theory by Hofer, Wysocki and Zehnder and allows us to compute the full contact homology of M_φ ≅ S¹ × M. The second part of this thesis is devoted to the branched covers of trivial cylinders over closed Reeb orbits, which are the trivial examples of punctured holomorphic curves studied in rational symplectic field theory. Since all moduli spaces of trivial curves with virtual dimension one cannot be regular, we use obstruction bundles in order to find compact perturbations making the Cauchy-Riemann operator transversal to the zero section and show that the algebraic count of elements in the resulting regular moduli spaces is zero. Once the analytical foundations of symplectic field theory are established, our result implies that the

  10. Thermodynamic properties of indan: Experimental and computational results

    International Nuclear Information System (INIS)

    Chirico, Robert D.; Steele, William V.; Kazakov, Andrei F.

    2016-01-01

    Highlights: • Heat capacities were measured for the temperature range (5 to 445) K. • Vapor pressures were measured for the temperature range (338 to 495) K. • Densities at saturation pressure were measured from T = (323 to 523) K. • Computed and experimentally derived properties for ideal gas entropies are in excellent accord. • Thermodynamic consistency analysis revealed anomalous literature data. - Abstract: Measurements leading to the calculation of thermodynamic properties in the ideal-gas state for indan (Chemical Abstracts registry number [496-11-7], 2,3-dihydro-1H-indene) are reported. Experimental methods were adiabatic heat-capacity calorimetry, differential scanning calorimetry, comparative ebulliometry, and vibrating-tube densitometry. Molar thermodynamic functions (enthalpies, entropies, and Gibbs energies) for the condensed and ideal-gas states were derived from the experimental studies at selected temperatures. Statistical calculations were performed based on molecular geometry optimization and vibrational frequencies calculated at the B3LYP/6-31+G(d, p) level of theory. Computed ideal-gas properties derived with the rigid-rotor harmonic-oscillator approximation are shown to be in excellent accord with ideal-gas entropies derived from thermophysical property measurements of this research, as well as with experimental heat capacities for the ideal-gas state reported in the literature. Literature spectroscopic studies and ab initio calculations report a range of values for the barrier to ring puckering. Results of the present work are consistent with a large barrier that allows use of the rigid-rotor harmonic-oscillator approximation for ideal-gas entropy and heat-capacity calculations, even with the stringent uncertainty requirements imposed by the calorimetric and physical property measurements reported here. All experimental results are compared with property values reported in the literature.
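
    A minimal Python sketch of the harmonic-oscillator vibrational entropy term used in rigid-rotor harmonic-oscillator ideal-gas calculations of the kind described above; the three wavenumbers are made-up placeholders, not the computed B3LYP frequencies for indan.

      import numpy as np

      R = 8.314462618        # J/(mol K)
      H = 6.62607015e-34     # J s
      KB = 1.380649e-23      # J/K
      C = 2.99792458e10      # speed of light, cm/s

      def s_vib(wavenumbers_cm1, T=298.15):
          """Harmonic-oscillator vibrational entropy, J/(mol K)."""
          x = H * C * np.asarray(wavenumbers_cm1, float) / (KB * T)
          return R * np.sum(x / np.expm1(x) - np.log1p(-np.exp(-x)))

      print(round(s_vib([400.0, 800.0, 1600.0]), 2))   # contribution of three illustrative modes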

  11. Discrete ordinates cross-section generation in parallel plane geometry -- 2: Computational results

    International Nuclear Information System (INIS)

    Yavuz, M.

    1998-01-01

    In Ref. 1, the author presented inverse discrete ordinates (S_N) methods for cross-section generation with an arbitrary scattering anisotropy of order L (L ≤ N - 1) in parallel plane geometry. The solution techniques depend on the S_N eigensolutions. The eigensolutions are determined by the inverse simplified S_N method (ISS_N), which uses the surface Green's function matrices (T and R). Inverse problems are generally designed so that experimentally measured physical quantities can be used in the formulations. In the formulations, although T and R (TR matrices) are measurable quantities, the author does not have such data to check the adequacy and accuracy of the methods. However, it is possible to compute TR matrices by S_N methods. The author presents computational results and computationally observed properties

  12. Specialized computer architectures for computational aerodynamics

    Science.gov (United States)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relative high cost of performing these computations on commercially available general purpose computers, a cost high with respect to dollar expenditure and/or elapsed time. Today's computing technology will support a program designed to create specialized computing facilities to be dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  13. Increasing the trustworthiness of research results: the role of computers in qualitative text analysis

    Science.gov (United States)

    Lynne M. Westphal

    2000-01-01

    By using computer packages designed for qualitative data analysis, a researcher can increase the trustworthiness (i.e., validity and reliability) of conclusions drawn from qualitative research results. This paper examines trustworthiness issues and the role of computer software (QSR's NUD*IST) in the context of a current research project investigating the social...

  14. [Study on computed tomography features of nasal septum cellule and its clinical significance].

    Science.gov (United States)

    Huang, Dingqiang; Li, Wanrong; Gao, Liming; Xu, Guanqiang; Ou, Xiaoyi; Tang, Guangcai

    2008-03-01

    To investigate the features of the nasal septum cellule in computed tomographic (CT) images and their clinical significance. CT scan data of the nasal septum in 173 patients were randomly obtained from January 2001 to June 2005. Prevalence and clinical features were retrospectively summarized for the 19 patients with a nasal septum cellule. (1) Nineteen of the 173 patients had a nasal septum cellule. (2) In all 19 cases the cellule was located in the perpendicular plate of the ethmoid bone; in 8 cases it was located in the upper part of the nasal septum and in 11 in the middle. (3) Seven patients had nasal diseases related to the nasal septum cellule: 3 cases of inflammation, 2 cases of bone fracture, 1 case of cholesterol granuloma, and 1 case of mucocele. The nasal septum cellule is an anatomic variation of the nasal septum bone, and its features can provide further understanding of some related diseases.

  15. Transversity results and computations in symplectic field theory

    Energy Technology Data Exchange (ETDEWEB)

    Fabert, Oliver

    2008-02-21

    Although the definition of symplectic field theory suggests that one has to count holomorphic curves in cylindrical manifolds R x V equipped with a cylindrical almost complex structure J, it is already well-known from Gromov-Witten theory that, due to the presence of multiply-covered curves, we in general cannot achieve transversality for all moduli spaces even for generic choices of J. In this thesis we treat the transversality problem of symplectic field theory in two important cases. In the first part of this thesis we are concerned with the rational symplectic field theory of Hamiltonian mapping tori, which is also called the Floer case. For this observe that in the general geometric setup for symplectic field theory, the contact manifolds can be replaced by mapping tori M_φ of symplectic manifolds (M, ω_M) with symplectomorphisms φ. While the cylindrical contact homology of M_φ is given by the Floer homologies of powers of φ, the other algebraic invariants of symplectic field theory for M_φ provide natural generalizations of symplectic Floer homology. For symplectically aspherical M and Hamiltonian φ we study the moduli spaces of rational curves and prove a transversality result, which does not need the polyfold theory by Hofer, Wysocki and Zehnder and allows us to compute the full contact homology of M_φ ≅ S¹ x M. The second part of this thesis is devoted to the branched covers of trivial cylinders over closed Reeb orbits, which are the trivial examples of punctured holomorphic curves studied in rational symplectic field theory. Since all moduli spaces of trivial curves with virtual dimension one cannot be regular, we use obstruction bundles in order to find compact perturbations making the Cauchy-Riemann operator transversal to the zero section and show that the algebraic count of elements in the resulting regular moduli spaces is zero. Once the analytical foundations of symplectic

  16. Applying standardized uptake values in gallium-67-citrate single-photon emission computed tomography/computed tomography studies and their correlation with blood test results in representative organs.

    Science.gov (United States)

    Toriihara, Akira; Daisaki, Hiromitsu; Yamaguchi, Akihiro; Yoshida, Katsuya; Isogai, Jun; Tateishi, Ukihide

    2018-05-21

    Recently, semiquantitative analysis using the standardized uptake value (SUV) has been introduced in bone single-photon emission computed tomography/computed tomography (SPECT/CT). Our purposes were to apply an SUV-based semiquantitative analytic method to gallium-67 (Ga)-citrate SPECT/CT and to evaluate the correlation between the SUV of physiological uptake and blood test results in representative organs. The accuracy of the semiquantitative method was validated using a National Electrical Manufacturers Association body phantom study (radioactivity ratio of sphere : background=4 : 1). Thereafter, 59 patients (34 male and 25 female; mean age, 66.9 years) who had undergone Ga-citrate SPECT/CT were retrospectively enrolled in the study. A mean SUV of physiological uptake was calculated for the following organs: the lungs, right atrium, liver, kidneys, spleen, gluteal muscles, and bone marrow. The correlation between physiological uptakes and blood test results was evaluated using Pearson's correlation coefficient. The phantom study revealed only 1% error between theoretical and actual SUVs in the background, suggesting sufficient accuracy of the scatter and attenuation corrections. However, a partial volume effect could not be overlooked, particularly in small spheres with a diameter of less than 28 mm. The highest mean SUV was observed in the liver (range: 0.44-4.64), followed by bone marrow (range: 0.33-3.60), spleen (range: 0.52-2.12), and kidneys (range: 0.42-1.45). There was no significant correlation between hepatic uptake and liver function, renal uptake and renal function, or bone marrow uptake and blood cell count (P>0.05). The physiological uptake in Ga-citrate SPECT/CT can be represented as SUVs, which are not significantly correlated with the corresponding blood test results.
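
    The abstract does not spell out the SUV definition used by the workstation; the sketch below implements the standard body-weight-normalised SUV (tissue activity concentration divided by injected activity per gram of body weight) as a rough illustration of how a mean SUV for an organ region could be obtained. Function and parameter names are hypothetical, not taken from the paper.

```python
import numpy as np

def mean_suv_bw(roi_activity_bq_per_ml, injected_dose_mbq, body_weight_kg):
    """Mean body-weight-normalised SUV for a region of interest (ROI).

    roi_activity_bq_per_ml -- decay-corrected activity concentrations of the
                              ROI voxels (Bq/mL), as reconstructed by a
                              quantitative SPECT/CT workstation.
    """
    mean_conc = float(np.mean(roi_activity_bq_per_ml))                # Bq/mL
    dose_per_gram = injected_dose_mbq * 1e6 / (body_weight_kg * 1e3)  # Bq/g
    # Assuming a tissue density of ~1 g/mL, the SUV is dimensionless.
    return mean_conc / dose_per_gram

# Example: 120 MBq injected, 65 kg patient, mean ROI concentration 4 kBq/mL
print(mean_suv_bw([4000.0], 120.0, 65.0))   # ~2.2
```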

  17. Diagnostic significance of rib series in minor thorax trauma compared to plain chest film and computed tomography.

    Science.gov (United States)

    Hoffstetter, Patrick; Dornia, Christian; Schäfer, Stephan; Wagner, Merle; Dendl, Lena M; Stroszczynski, Christian; Schreyer, Andreas G

    2014-01-01

    Rib series (RS) are a special radiological technique to improve the visualization of the bony parts of the chest. The aim of this study was to evaluate the diagnostic accuracy of rib series in minor thorax trauma. Retrospective study of 56 patients who received RS; 39 patients were additionally evaluated by plain chest film (PCF). All patients underwent computed tomography (CT) of the chest. RS and PCF were re-read independently by three radiologists, and the results were compared with CT as the gold standard. Sensitivity, specificity, and negative and positive predictive values were calculated. The significance of differences in findings was determined by the McNemar test, and interobserver variability by Cohen's kappa test. 56 patients were evaluated (34 men, 22 women; mean age 61 years). In 22 patients one or more rib fractures could be identified by CT. In 18 of these cases (82%) the correct diagnosis was made by RS, and in 16 cases (73%) the correct number of involved ribs was detected. These differences were significant (p = 0.03). Specificity was 100%, and the negative and positive predictive values were 85% and 100%. Kappa values for interobserver agreement were 0.92-0.96. Sensitivity of PCF was 46% and was significantly lower (p = 0.008) compared to CT. Rib series does not seem to be a useful examination in evaluating minor thorax trauma. CT seems to be the method of choice to detect rib fractures, but the clinical value of the radiological proof has to be discussed and investigated in larger follow-up studies.
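
    The accuracy figures quoted above follow from a 2x2 comparison against the CT gold standard; the sketch below shows that computation with illustrative cell counts (22 fracture patients, 18 detected on RS, no false positives). These counts reproduce the reported sensitivity, specificity and PPV; the per-reader counts behind the 85% NPV are not given in the abstract, so the NPV below is only approximate.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 table vs. the gold standard."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Illustrative counts consistent with the abstract: 22/56 patients had CT-proven
# fractures, 18 of them were detected on rib series, and there were no false positives.
print(diagnostic_metrics(tp=18, fp=0, fn=4, tn=34))
# -> sensitivity 0.82, specificity 1.0, ppv 1.0, npv ~0.89
```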

  18. Computer Self-Efficacy: A Practical Indicator of Student Computer Competency in Introductory IS Courses

    Directory of Open Access Journals (Sweden)

    Rex Karsten

    1998-01-01

    Full Text Available Students often receive their first college-level computer training in introductory information systems courses. Students and faculty frequently expect this training to develop a level of student computer competence that will support computer use in future courses. In this study, we applied measures of computer self-efficacy to students in a typical introductory IS course. The measures provided useful evidence that student perceptions of their ability to use computers effectively in the future significantly improved as a result of their training experience. The computer self-efficacy measures also provided enhanced insight into course-related factors of practical concern to IS educators. Study results also suggest computer self-efficacy measures may be a practical and informative means of assessing computer-training outcomes in the introductory IS course context

  19. TEMPEST: A three-dimensional time-dependent computer program for hydrothermal analysis: Volume 2, Assessment and verification results

    International Nuclear Information System (INIS)

    Eyler, L.L.; Trent, D.S.; Budden, M.J.

    1983-09-01

    During the course of the TEMPEST computer code development a concurrent effort was conducted to assess the code's performance and the validity of computed results. The results of this work are presented in this document. The principal objective of this effort was to assure the code's computational correctness for a wide range of hydrothermal phenomena typical of fast breeder reactor application. 47 refs., 94 figs., 6 tabs

  20. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-01-01

    Full Text Available Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed over the last decade in the foreign scientific and educational literature, and to substantiate its importance, practical utility and right to a place in Russian education. Methods. The research is based on the analysis of foreign studies of the phenomenon of computational thinking and the ways of its formation in the process of education, and on comparing the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept of «computational thinking» is analyzed from the point of view of intuitive understanding and of scientific and applied aspects. It is shown how computational thinking has evolved along with the development of computer hardware and software. The practice-oriented interpretation of computational thinking which is dominant among educators is described, along with some ways of its formation. It is shown that computational thinking is a metasubject result of general education as well as one of its tools. From the point of view of the author, purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning. The dynamics of the development of this concept is described; this process is connected with the evolution of computer and information technologies as well as with the growing number of tasks whose effective solution requires computational thinking. The author substantiates the claim that including «computational thinking» in the set of pedagogical concepts used in the national education system fills an existing gap. Practical significance. New metasubject result of education associated with

  1. Interface between computational fluid dynamics (CFD) and plant analysis computer codes

    International Nuclear Information System (INIS)

    Coffield, R.D.; Dunckhorst, F.F.; Tomlinson, E.T.; Welch, J.W.

    1993-01-01

    Computational fluid dynamics (CFD) can provide valuable input to the development of advanced plant analysis computer codes. The types of interfacing discussed in this paper will directly contribute to modeling and accuracy improvements throughout the plant system and should result in significant reduction of design conservatisms that have been applied to such analyses in the past

  2. First results from a combined analysis of CERN computing infrastructure metrics

    Science.gov (United States)

    Duellmann, Dirk; Nieke, Christian

    2017-10-01

    The IT Analysis Working Group (AWG) has been formed at CERN across individual computing units and the experiments to attempt a cross-cutting analysis of computing infrastructure and application metrics. In this presentation we will describe the first results obtained using medium/long-term data (1 month to 1 year) correlating box-level metrics, job-level metrics from LSF and HTCondor, IO metrics from the physics analysis disk pools (EOS), and networking and application-level metrics from the experiment dashboards. We will cover in particular the measurement of hardware performance and prediction of job duration, the latency sensitivity of different job types and a search for bottlenecks with the production job mix in the current infrastructure. The presentation will conclude with the proposal of a small set of metrics to simplify drawing conclusions also in the more constrained environment of public cloud deployments.

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  4. Computer processing of the Δlambda/lambda measured results

    International Nuclear Information System (INIS)

    Draguniene, V.J.; Makariuniene, E.K.

    1979-01-01

    For the processing of experimental data on the influence of the chemical environment on radioactive decay constants, five programs have been written in Fortran, in the version for the DUBNA monitoring system on the BESM-6 computer. Each program corresponds to a definite stage of data processing and yields a definite answer. The first and second programs calculate the ratio of the pulse numbers measured with different sources and the mean value of the dispersions. The third program averages the ratios of the pulse numbers, and the fourth and fifth determine the change of the radioactive decay constant. The programs permit processing of the experimental data starting from the pulse numbers obtained directly in the experiments. They allow treating a whole file of experimental results and calculating the various errors at all stages of the calculation. The printed output of the obtained results is convenient to use.
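
    The original Fortran programs are not reproduced in the abstract; the sketch below only illustrates the first stages of the described chain (ratios of pulse numbers from two sources, error propagation, and an error-weighted mean of the ratios), under the assumption of Poisson counting statistics. Converting the mean ratio into a decay-constant change requires the decay model and timing details that the abstract does not give, so that step is omitted.

```python
import numpy as np

def weighted_mean_ratio(counts_source_1, counts_source_2):
    """Ratios of pulse numbers measured with two sources, their Poisson errors,
    and the error-weighted mean ratio with its uncertainty."""
    n1 = np.asarray(counts_source_1, dtype=float)
    n2 = np.asarray(counts_source_2, dtype=float)
    ratios = n1 / n2
    sigmas = ratios * np.sqrt(1.0 / n1 + 1.0 / n2)   # Poisson error propagation
    weights = 1.0 / sigmas**2
    mean = np.sum(weights * ratios) / np.sum(weights)
    err = 1.0 / np.sqrt(np.sum(weights))
    return ratios, sigmas, mean, err

# Example with two short series of pulse counts:
print(weighted_mean_ratio([100500, 99800, 100200], [100100, 100300, 99900])[2:])
```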

  5. Operating Wireless Sensor Nodes without Energy Storage: Experimental Results with Transient Computing

    Directory of Open Access Journals (Sweden)

    Faisal Ahmed

    2016-12-01

    Full Text Available Energy harvesting is increasingly used for powering wireless sensor network nodes. Recently, it has been suggested to combine it with the concept of transient computing whereby the wireless sensor nodes operate without energy storage capabilities. This new combined approach brings benefits, for instance ultra-low power nodes and reduced maintenance, but also raises new challenges, foremost dealing with nodes that may be left without power for various time periods. Although transient computing has been demonstrated on microcontrollers, reports on experiments with wireless sensor nodes are still scarce in the literature. In this paper, we describe our experiments with solar, thermal, and RF energy harvesting sources that are used to power sensor nodes (including wireless ones without energy storage, but with transient computing capabilities. The results show that the selected solar and thermal energy sources can operate both the wired and wireless nodes without energy storage, whereas in our specific implementation, the developed RF energy source can only be used for the selected nodes without wireless connectivity.

  6. Computational needs survey of NASA automation and robotics missions. Volume 1: Survey and results

    Science.gov (United States)

    Davis, Gloria J.

    1991-01-01

    NASA's operational use of advanced processor technology in space systems lags behind its commercial development by more than eight years. One of the factors contributing to this is that mission computing requirements are frequently unknown, unstated, misrepresented, or simply not available in a timely manner. NASA must provide clear common requirements to make better use of available technology, to cut development lead time on deployable architectures, and to increase the utilization of new technology. A preliminary set of advanced mission computational processing requirements of automation and robotics (A&R) systems is provided for use by NASA, industry, and academic communities. These results were obtained in an assessment of the computational needs of current projects throughout NASA. The high percentage of responses indicated a general need for enhanced computational capabilities beyond the currently available 80386 and 68020 processor technology. Because of the need for faster processors and more memory, 90 percent of the polled automation projects have reduced or will reduce the scope of their implementation capabilities. The requirements are presented with respect to their targeted environment, identifying the applications required, the system performance levels necessary to support them, and the degree to which they are met with typical programmatic constraints. Volume one includes the survey and results. Volume two contains the appendixes.

  7. What is the clinical significance of chest CT when the chest x-ray result is normal in patients with blunt trauma?

    Science.gov (United States)

    Kea, Bory; Gamarallage, Ruwan; Vairamuthu, Hemamalini; Fortman, Jonathan; Lunney, Kevin; Hendey, Gregory W; Rodriguez, Robert M

    2013-08-01

    Computed tomography (CT) has been shown to detect more injuries than plain radiography in patients with blunt trauma, but it is unclear whether these injuries are clinically significant. This study aimed to determine the proportion of patients with normal chest x-ray (CXR) result and injury seen on CT and abnormal initial CXR result and no injury on CT and to characterize the clinical significance of injuries seen on CT as determined by a trauma expert panel. Patients with blunt trauma older than 14 years who received emergency department chest imaging as part of their evaluation at 2 urban level I trauma centers were enrolled. An expert trauma panel a priori classified thoracic injuries and subsequent interventions as major, minor, or no clinical significance. Of 3639 participants, 2848 (78.3%) had CXR alone and 791 (21.7%) had CXR and chest CT. Of 589 patients who had chest CT after a normal CXR result, 483 (82.0% [95% confidence interval (CI), 78.7%-84.9%]) had normal CT results, and 106 (18.0% [95% CI, 15.1%-21.3%]) had CTs diagnosing injuries, primarily rib fractures, pulmonary contusion, and incidental pneumothorax. Twelve patients had injuries classified as clinically major (2.0% [95% CI, 1.2%-3.5%]), 78 were clinically minor (13.2% [95% CI, 10.7%-16.2%]), and 16 were clinically insignificant (2.7% [95% CI, 1.7%-4.4%]). Of 202 patients with CXRs suggesting injury, 177 (87.6% [95% CI, 82.4%-91.5%]) had chest CTs confirming injury and 25 (12.4% [95% CI, 8.5%-17.6%]) had no injury on CT. Chest CT after a normal CXR result in patients with blunt trauma detects injuries, but most do not lead to changes in patient management. Copyright © 2013 Elsevier Inc. All rights reserved.

  8. Comparative analysis of the results obtained by computer code ASTEC V2 and RELAP 5.3.2 for small leak ID 80 for VVER 1000

    International Nuclear Information System (INIS)

    Atanasova, B.; Grudev, P.

    2011-01-01

    The purpose of this report is to present the results obtained by simulation and subsequent analysis of an emergency mode for a small leak with ID 80 for WWER 1000/B320 - Kozloduy NPP Units 5 and 6. Calculations were performed with the ASTEC v2 computer code for severe accident calculations, which was developed by the French and German organizations IRSN and GRS. The integral RELAP5 computer code is used as a reference for comparison of the results. The analyses focus on the processes occurring during the in-vessel phase of the emergency mode with significant core damage. The main thermohydraulic parameters, the start of reactor core degradation and the subsequent fuel relocation up to reactor vessel failure are evaluated in the analysis. RELAP5 is used as a reference code to compare the results obtained up to early core degradation, which occurs after the core is uncovered and the fuel temperature exceeds 1200 °C

  9. Estimation of the genetically significant dose resulting from diagnostic radiology

    International Nuclear Information System (INIS)

    Angerstein, W.

    1978-01-01

    Based on the average gonad dose received per examination or per film and on the frequency of x-ray examinations (36 million per annum), the mean annual gonad dose to individuals in the GDR has been determined to be 33 mR. Considering different age groups of patients and the fact that the gonad dose to children is often significantly reduced in comparison to adults, estimates of the genetically significant dose (GSD) range from 7 to 19 mR per annum. Examinations of women have accounted for about 66 per cent of the GSD. The highest contributions to the GSD result from examinations of the following organs: kidneys, colon, bile duct (only in women), lumbar spine, pelvis, hips, and proximal femur. Despite their high frequency, examinations of the stomach account for only about 3 per cent of the GSD. All thorax examinations (nearly 10,000,000 per annum) contribute less than 0.5 per cent, and the most frequent x-ray examinations of the skeletal system, skull, cervical spine, and teeth account for less than 3 per cent. The GSD values obtained are comparable with those from countries such as India, Japan, the Netherlands, the USSR, and the USA. (author)
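
    For readers unfamiliar with the quantity, the genetically significant dose is essentially an examination-frequency- and child-expectancy-weighted mean gonad dose. The sketch below shows the textbook (UNSCEAR-style) form of that weighted average with hypothetical input structures; it is not the author's exact computation, which distinguishes age groups, sexes and examination types in more detail.

```python
def genetically_significant_dose(exam_classes, population_classes):
    """GSD as a child-expectancy-weighted mean gonad dose (mR per year).

    exam_classes       -- iterable of (exams_per_year, gonad_dose_mR, child_expectancy)
                          for each examination type / age / sex class
    population_classes -- iterable of (persons, child_expectancy) for each
                          age / sex class of the whole population
    """
    numerator = sum(n * d * w for n, d, w in exam_classes)
    denominator = sum(p * w for p, w in population_classes)
    return numerator / denominator
```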

  10. [Computer-assisted analysis of the results of training in internal medicine].

    Science.gov (United States)

    Vrbová, H; Spunda, M

    1991-06-01

    Analysis of the results of teaching clinical disciplines has, in the long run, an impact on the standard and value of medical care. It requires processing of quantitative and qualitative data. The selection of the indicators to be followed up and of the procedures used for their processing is of fundamental importance. The submitted investigation is an example of how computer techniques can be used to process the results of an effectiveness analysis of teaching internal medicine. As an indicator of effectiveness the authors selected the percentage of students who had an opportunity during the given period of their studies to observe a certain pathological condition; as the method of data collection, a questionnaire survey was used. The system permits differentiation of the students' experience (whether the student examined the patient himself or the patient was only demonstrated) and of the place of observation (university teaching hospital or regional non-teaching hospital attachment). It also permits forming sub-groups of respondents, combining them as desired, and comparing their results. The described computer programme support comprises primary processing of the output of the questionnaire survey: the questionnaires are transformed and stored by groups of respondents in data files of suitable format (programme SDFORM), and the results are processed and presented as an output listing or interactively on the display (programme SDRESULT). Using the above programmes, the authors processed the results of a survey made among students during and after completion of their studies covering a series of 70 recommended pathological conditions. As an example, the authors compare the results of observations of 20 selected pathological conditions important for diagnosis and therapy in primary care in the final stage of the medical course in 1981 and 1985.

  11. Computational enzyme design approaches with significant biological outcomes: progress and challenges

    OpenAIRE

    Li, Xiaoman; Zhang, Ziding; Song, Jiangning

    2012-01-01

    Enzymes are powerful biocatalysts, however, so far there is still a large gap between the number of enzyme-based practical applications and that of naturally occurring enzymes. Multiple experimental approaches have been applied to generate nearly all possible mutations of target enzymes, allowing the identification of desirable variants with improved properties to meet the practical needs. Meanwhile, an increasing number of computational methods have been developed to assist in the modificati...

  12. Clinical significance of adrenal computed tomography in Addison's disease

    International Nuclear Information System (INIS)

    Sun, Zhong-Hua; Nomura, Kaoru; Toraya, Shohzoh; Ujihara, Makoto; Horiba, Nobuo; Suda, Toshihiro; Tsushima, Toshio; Demura, Hiroshi; Kono, Atsushi

    1992-01-01

    Adrenal computed tomographic (CT) scanning was conducted in twelve patients with Addison's disease during the clinical course. In tuberculous Addison's disease (n=8), three of the four patients examined during the first two years after disease onset had bilaterally enlarged adrenals, while one of the four had a unilaterally enlarged one. At least one adrenal gland was enlarged in all six patients examined during the first four years after onset. Thereafter, the adrenal glands were atrophied bilaterally, in contrast to the adrenal glands in idiopathic Addison's disease, which were atrophied bilaterally from disease onset (n=2). Adrenal calcification was a less sensitive clue in tracing pathogenesis, i.e., adrenal calcification was observed in five of the eight patients with tuberculous Addison's disease, but not in the idiopathic patients. Thus, adrenal CT scanning can show the etiology of Addison's disease (infection or autoimmunity) and the phase of Addison's disease secondary to tuberculosis, which may be clinically important for initiating antituberculous treatment. (author)

  13. Summary of the most significant results reported in this session

    CERN Document Server

    Sens, J C

    1980-01-01

    The most interesting although speculative result is the observation of a 4 standard deviation effect at 5.3 GeV in the ψK0π- and ψK-π+ mass plots (SPS Exp. WA11) with a cross-section of 180 nb (assuming a 1% branching ratio). This is a candidate bare b-state. The next most significant experimental result is the observation of Λc+ at the CERN Intersecting Storage Rings (ISR). This state was discovered at BNL by Samios et al. and has since been seen in several neutrino experiments. It was seen at the ISR by Lockman et al. about a year ago (reported at Budapest) but not in a convincing way. The analysis has now been improved, and the result shows a peak which is most clearly present in the summed Λ(3π)+ and K-pπ+ mass spectra. The signal has furthermore been seen in Exp. R606 (reported by F. Muller in this parallel session) in both Λ3π and pK-π+. The most convincing signal comes from the Split-Field Magnet (SFM) in K-pπ+. The three observations together, all at the ISR, make this an...

  14. Iterative reconstruction for quantitative computed tomography analysis of emphysema: consistent results using different tube currents

    Directory of Open Access Journals (Sweden)

    Yamashiro T

    2015-02-01

    Full Text Available Tsuneo Yamashiro,1 Tetsuhiro Miyara,1 Osamu Honda,2 Noriyuki Tomiyama,2 Yoshiharu Ohno,3 Satoshi Noma,4 Sadayuki Murayama1 On behalf of the ACTIve Study Group 1Department of Radiology, Graduate School of Medical Science, University of the Ryukyus, Nishihara, Okinawa, Japan; 2Department of Radiology, Osaka University Graduate School of Medicine, Suita, Osaka, Japan; 3Department of Radiology, Kobe University Graduate School of Medicine, Kobe, Hyogo, Japan; 4Department of Radiology, Tenri Hospital, Tenri, Nara, Japan Purpose: To assess the advantages of iterative reconstruction for quantitative computed tomography (CT) analysis of pulmonary emphysema. Materials and methods: Twenty-two patients with pulmonary emphysema underwent chest CT imaging using identical scanners with three different tube currents: 240, 120, and 60 mA. Scan data were converted to CT images using Adaptive Iterative Dose Reduction using Three Dimensional Processing (AIDR3D) and a conventional filtered-back projection mode. Thus, six scans with and without AIDR3D were generated per patient. All other scanning and reconstruction settings were fixed. The percent low attenuation area (LAA%; < -950 Hounsfield units) and the lung density 15th percentile were automatically measured using a commercial workstation. Comparisons of LAA% and 15th percentile results between scans with and without AIDR3D were made by Wilcoxon signed-rank tests. Associations between body weight and measurement errors among these scans were evaluated by Spearman rank correlation analysis. Results: Overall, scan series without AIDR3D had higher LAA% and lower 15th percentile values than those with AIDR3D at each tube current (P<0.0001). For scan series without AIDR3D, lower tube currents resulted in higher LAA% values and lower 15th percentiles. The extent of emphysema was significantly different between each pair among scans when not using AIDR3D (LAA%, P<0.0001; 15th percentile, P<0.01), but was not
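
    The two indices compared above are straightforward to compute once the lung voxels have been segmented; the sketch below is a generic re-implementation (not the commercial workstation's algorithm) of the percent low-attenuation area below -950 HU and the 15th-percentile lung density.

```python
import numpy as np

def emphysema_indices(lung_hu, threshold=-950.0):
    """LAA% and 15th-percentile density from segmented lung voxel HU values."""
    hu = np.asarray(lung_hu, dtype=float)
    laa_percent = 100.0 * np.mean(hu < threshold)   # fraction of voxels below the threshold
    percentile_15 = np.percentile(hu, 15)           # 15th percentile of the density histogram
    return laa_percent, percentile_15

# Example on synthetic data: noisier (lower-dose) reconstructions widen the HU
# histogram and therefore tend to inflate LAA% and lower the 15th percentile.
rng = np.random.default_rng(1)
print(emphysema_indices(rng.normal(-870, 60, 100_000)))
print(emphysema_indices(rng.normal(-870, 90, 100_000)))
```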

  15. Significance testing in ridge regression for genetic data

    Directory of Open Access Journals (Sweden)

    De Iorio Maria

    2011-09-01

    Full Text Available Abstract Background Technological developments have increased the feasibility of large scale genetic association studies. Densely typed genetic markers are obtained using SNP arrays, next-generation sequencing technologies and imputation. However, SNPs typed using these methods can be highly correlated due to linkage disequilibrium among them, and standard multiple regression techniques fail with these data sets due to their high dimensionality and correlation structure. There has been increasing interest in using penalised regression in the analysis of high dimensional data. Ridge regression is one such penalised regression technique which does not perform variable selection, instead estimating a regression coefficient for each predictor variable. It is therefore desirable to obtain an estimate of the significance of each ridge regression coefficient. Results We develop and evaluate a test of significance for ridge regression coefficients. Using simulation studies, we demonstrate that the performance of the test is comparable to that of a permutation test, with the advantage of a much-reduced computational cost. We introduce the p-value trace, a plot of the negative logarithm of the p-values of ridge regression coefficients with increasing shrinkage parameter, which enables the visualisation of the change in p-value of the regression coefficients with increasing penalisation. We apply the proposed method to a lung cancer case-control data set from EPIC, the European Prospective Investigation into Cancer and Nutrition. Conclusions The proposed test is a useful alternative to a permutation test for the estimation of the significance of ridge regression coefficients, at a much-reduced computational cost. The p-value trace is an informative graphical tool for evaluating the results of a test of significance of ridge regression coefficients as the shrinkage parameter increases, and the proposed test makes its production computationally feasible.
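
    The paper's exact test statistic is not reproduced in the abstract; the sketch below shows one common Wald-type approximation for ridge-coefficient significance (the covariance of the ridge estimator with an effective-degrees-of-freedom variance estimate) and how a p-value trace over increasing shrinkage could be assembled from it. Treat it as an illustration of the idea rather than the authors' method; `X` and `y` are placeholder names for a standardised genotype matrix and phenotype vector.

```python
import numpy as np
from scipy import stats

def ridge_pvalues(X, y, lam):
    """Wald-type p-values for ridge regression coefficients (X assumed standardised)."""
    n, p = X.shape
    A = X.T @ X + lam * np.eye(p)
    H = np.linalg.solve(A, X.T)               # beta_hat = H @ y
    beta = H @ y
    dof = n - np.trace(X @ H)                 # effective residual degrees of freedom
    sigma2 = np.sum((y - X @ beta) ** 2) / dof
    se = np.sqrt(sigma2 * np.einsum("ij,ij->i", H, H))   # sqrt(diag(H H^T)) * sigma
    return 2.0 * stats.norm.sf(np.abs(beta / se))

# p-value trace: -log10(p) per predictor as the shrinkage parameter increases
# lams = np.logspace(-2, 3, 30)
# trace = np.array([-np.log10(ridge_pvalues(X, y, lam)) for lam in lams])
```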

  16. Defining Spaces of Potential Art: The significance of representation in computer-aided creativity

    DEFF Research Database (Denmark)

    Dahlstedt, Palle

    2005-01-01

    One way of looking at the creative process is as a search in a space of possible answers. One way of simulating such a process is through evolutionary algorithms, i.e., simulated evolution by random variation and selection. The search space is defined by the chosen genetic representation, a kind of formal description, and the ways of navigating the space are defined by the choice of genetic operators (e.g., mutations). In creative systems, such as computer-aided music composition tools, these choices determine the efficiency of the system, in terms of the diversity of the results, the degree of novelty and the coherence within the material. Based on various implementations developed during five years of research, and experiences from real-life artistic applications, I will explain and discuss these mechanisms, from a perspective of the creative artist.
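
    As a toy illustration of the point about representation and operators (and not of Dahlstedt's actual composition tools), the sketch below evolves a short integer "melody": the genome and the mutation operator fix what the search space is and how it is traversed, while the hand-written fitness function stands in for the aesthetic judgement that would normally guide selection.

```python
import random

def mutate(genome, rate=0.2):
    """Genetic operator: small random pitch perturbations define how the space is navigated."""
    return [g + random.choice([-2, -1, 1, 2]) if random.random() < rate else g
            for g in genome]

def fitness(genome):
    """Toy stand-in for aesthetic judgement: reward interval variety, penalise large leaps."""
    intervals = [abs(b - a) for a, b in zip(genome, genome[1:])]
    return len(set(intervals)) - 0.3 * sum(i for i in intervals if i > 7)

def evolve(length=16, pop_size=20, generations=100):
    """Simple (mu + lambda)-style evolution: keep the best quarter, refill by mutation."""
    pop = [[random.randint(48, 72) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 4]
        pop = parents + [mutate(random.choice(parents)) for _ in range(pop_size - len(parents))]
    return max(pop, key=fitness)

print(evolve())
```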

  17. [Excessive computer usage in adolescents--results of a psychometric evaluation].

    Science.gov (United States)

    Grüsser, Sabine M; Thalemann, Ralf; Albrecht, Ulrike; Thalemann, Carolin N

    2005-03-01

    Excessive computer and video game playing among children is being critically discussed from a pedagogic and public health point of view. To date, no reliable data for this phenomenon in Germany exists. In the present study, the excessive usage of computer and video games is seen as a rewarding behavior which can, due to learning mechanisms, become a prominent and inadequate strategy for children to cope with negative emotions like frustration, uneasiness and fears. In the survey, 323 children ranging in age from 11 to 14 years were asked about their video game playing behavior. Criteria for excessive computer and video game playing were developed in accordance with the criteria for dependency and pathological gambling (DSM-IV, ICD-10). Data show that 9.3% (N = 30) of the children fulfill all criteria for excessive computer and video game playing. Furthermore, these children differ from their class mates with respect to watching television, communication patterns, the ability to concentrate in school lectures and the preferred strategies coping with negative emotions. In accordance with findings in studies about substance-related addiction, data suggest that excessive computer and video game players use their excessive rewarding behavior specifically as an inadequate stress coping strategy.

  18. Methodics of computing the results of monitoring the exploratory gallery

    Directory of Open Access Journals (Sweden)

    Krúpa Víazoslav

    2000-09-01

    Full Text Available At building site of motorway tunnel Višòové-Dubná skala , the priority is given to driving of exploration galley that secures in detail: geologic, engineering geology, hydrogeology and geotechnics research. This research is based on gathering information for a supposed use of the full profile driving machine that would drive the motorway tunnel. From a part of the exploration gallery which is driven by the TBM method, a fulfilling information is gathered about the parameters of the driving process , those are gathered by a computer monitoring system. The system is mounted on a driving machine. This monitoring system is based on the industrial computer PC 104. It records 4 basic values of the driving process: the electromotor performance of the driving machine Voest-Alpine ATB 35HA, the speed of driving advance, the rotation speed of the disintegrating head TBM and the total head pressure. The pressure force is evaluated from the pressure in the hydraulic cylinders of the machine. Out of these values, the strength of rock mass, the angle of inner friction, etc. are mathematically calculated. These values characterize rock mass properties as their changes. To define the effectivity of the driving process, the value of specific energy and the working ability of driving head is used. The article defines the methodics of computing the gathered monitoring information, that is prepared for the driving machine Voest – Alpine ATB 35H at the Institute of Geotechnics SAS. It describes the input forms (protocols of the developed method created by an EXCEL program and shows selected samples of the graphical elaboration of the first monitoring results obtained from exploratory gallery driving process in the Višòové – Dubná skala motorway tunnel.

  19. Computational mathematics in China

    CERN Document Server

    Shi, Zhong-Ci

    1994-01-01

    This volume describes the most significant contributions made by Chinese mathematicians over the past decades in various areas of computational mathematics. Some of the results are quite important and complement Western developments in the field. The contributors to the volume range from noted senior mathematicians to promising young researchers. The topics include finite element methods, computational fluid mechanics, numerical solutions of differential equations, computational methods in dynamical systems, numerical algebra, approximation, and optimization. Containing a number of survey articles, the book provides an excellent way for Western readers to gain an understanding of the status and trends of computational mathematics in China.

  20. Highly Parallel Computing Architectures by using Arrays of Quantum-dot Cellular Automata (QCA): Opportunities, Challenges, and Recent Results

    Science.gov (United States)

    Fijany, Amir; Toomarian, Benny N.

    2000-01-01

    There has been significant improvement in the performance of VLSI devices, in terms of size, power consumption, and speed, in recent years, and this trend may continue for the near future. However, it is a well-known fact that there are major obstacles, i.e., the physical limitation of feature size reduction and the ever increasing cost of foundries, that would prevent the long-term continuation of this trend. This has motivated the exploration of some fundamentally new technologies that are not dependent on the conventional feature-size approach. Such technologies are expected to enable scaling to continue to the ultimate level, i.e., molecular and atomistic size. Quantum computing, quantum dot-based computing, DNA-based computing, biologically inspired computing, etc., are examples of such new technologies. In particular, quantum-dot-based computing by using Quantum-dot Cellular Automata (QCA) has recently been intensely investigated as a promising new technology capable of offering significant improvement over conventional VLSI in terms of reduction of feature size (and hence increase in integration level), reduction of power consumption, and increase of switching speed. Quantum dot-based computing and memory in general, and QCA specifically, are intriguing to NASA due to their high packing density (10^11 - 10^12 per square cm), low power consumption (no transfer of current) and potentially higher radiation tolerance. Under the Revolutionary Computing Technology (RTC) Program at the NASA/JPL Center for Integrated Space Microelectronics (CISM), we have been investigating the potential applications of QCA for the space program. To this end, exploiting the intrinsic features of QCA, we have designed novel QCA-based circuits for co-planar (i.e., single-layer) and compact implementation of a class of data permutation matrices, a class of interconnection networks, and a bit-serial processor. Building upon these circuits, we have developed novel algorithms and QCA

  1. Functional computed tomography imaging of tumor-induced angiogenesis. Preliminary results of new tracer kinetic modeling using a computer discretization approach

    International Nuclear Information System (INIS)

    Kaneoya, Katsuhiko; Ueda, Takuya; Suito, Hiroshi

    2008-01-01

    The aim of this study was to establish functional computed tomography (CT) imaging as a method for assessing tumor-induced angiogenesis. Functional CT imaging was mathematically analyzed for 14 renal cell carcinomas by means of two-compartment modeling using a computer-discretization approach. The model incorporated diffusible kinetics of contrast medium including leakage from the capillary to the extravascular compartment and back-flux to the capillary compartment. The correlations between functional CT parameters [relative blood volume (rbv), permeability 1 (Pm1), and permeability 2 (Pm2)] and histopathological markers of angiogenesis [microvessel density (MVD) and vascular endothelial growth factor (VEGF)] were statistically analyzed. The modeling was successfully performed, showing similarity between the mathematically simulated curve and the measured time-density curve. There were significant linear correlations between MVD grade and Pm1 (r=0.841, P=0.001) and between VEGF grade and Pm2 (r=0.804, P=0.005) by Pearson's correlation coefficient. This method may be a useful tool for the assessment of tumor-induced angiogenesis. (author)
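
    The authors' discretized formulation is not given in the abstract; for orientation, the sketch below fits a generic two-compartment leakage model (a plasma fraction plus a leakage term with back-flux, similar in spirit to an extended Tofts model) to a tissue time-density curve using a rectangle-rule convolution and uniformly sampled time points. Variable names such as the arterial input `ca` and tissue curve `ct` are placeholders.

```python
import numpy as np
from scipy.optimize import curve_fit

def two_compartment(t, ca, vp, k1, k2):
    """Tissue enhancement: vp*Ca(t) + k1 * integral of Ca(tau)*exp(-k2*(t-tau)) dtau.

    Assumes uniformly sampled time points t and arterial input ca of the same length."""
    dt = t[1] - t[0]
    leak = np.array([np.sum(ca[: i + 1] * np.exp(-k2 * (t[i] - t[: i + 1]))) * dt
                     for i in range(len(t))])
    return vp * ca + k1 * leak

# Hypothetical fit of a measured tissue curve `ct` given the arterial input `ca`:
# popt, _ = curve_fit(lambda t, vp, k1, k2: two_compartment(t, ca, vp, k1, k2),
#                     t, ct, p0=(0.05, 0.2, 0.5), bounds=(0.0, [1.0, 5.0, 5.0]))
```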

  2. p-Curve and Effect Size: Correcting for Publication Bias Using Only Significant Results.

    Science.gov (United States)

    Simonsohn, Uri; Nelson, Leif D; Simmons, Joseph P

    2014-11-01

    Journals tend to publish only statistically significant evidence, creating a scientific record that markedly overstates the size of effects. We provide a new tool that corrects for this bias without requiring access to nonsignificant results. It capitalizes on the fact that the distribution of significant p values, p-curve, is a function of the true underlying effect. Researchers armed only with sample sizes and test results of the published findings can correct for publication bias. We validate the technique with simulations and by reanalyzing data from the Many-Labs Replication project. We demonstrate that p-curve can arrive at conclusions opposite that of existing tools by reanalyzing the meta-analysis of the "choice overload" literature. © The Author(s) 2014.
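
    To make the core idea concrete (the distribution of significant p-values depends on the true underlying effect), the sketch below simulates many two-sample t-tests at a chosen true effect size and bins the significant p-values into the usual .01-wide bins. It illustrates the principle the method exploits, not the authors' actual effect-size estimation routine.

```python
import numpy as np
from scipy import stats

def simulated_p_curve(true_d, n_per_group=20, n_studies=50_000, seed=0):
    """Share of significant p-values falling in (0,.01], (.01,.02], ..., (.04,.05]."""
    rng = np.random.default_rng(seed)
    a = rng.normal(0.0, 1.0, (n_studies, n_per_group))
    b = rng.normal(true_d, 1.0, (n_studies, n_per_group))
    p = stats.ttest_ind(b, a, axis=1).pvalue
    sig = p[p < 0.05]
    counts, _ = np.histogram(sig, bins=[0.0, 0.01, 0.02, 0.03, 0.04, 0.05])
    return counts / counts.sum()

# A null effect gives a flat p-curve (~20% per bin); a true effect piles mass
# onto the smallest p-values (a "right-skewed" p-curve):
print(simulated_p_curve(0.0))
print(simulated_p_curve(0.5))
```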

  3. An assessment of future computer system needs for large-scale computation

    Science.gov (United States)

    Lykos, P.; White, J.

    1980-01-01

    Data ranging from specific computer capability requirements to opinions about the desirability of a national computer facility are summarized. It is concluded that considerable attention should be given to improving the user-machine interface. Otherwise, increased computer power may not improve the overall effectiveness of the machine user. Significant improvement in throughput requires highly concurrent systems plus the willingness of the user community to develop problem solutions for that kind of architecture. An unanticipated result was the expression of need for an on-going cross-disciplinary users group/forum in order to share experiences and to more effectively communicate needs to the manufacturers.

  4. Clinical significance of computed tomographic arteriography for minute hepatocellular carcinoma

    Energy Technology Data Exchange (ETDEWEB)

    Itoh, H; Matsui, O; Suzuki, M; Ida, M; Kitagawa, K [Kanazawa Univ. (Japan). School of Medicine

    1982-03-01

    Computed tomographic arteriography (CTA) can clearly demonstrate minute hepatocellular carcinoma (H.C.C.) of more than 2 cm in diameter as an enhanced mass lesion. In this case the precise localization of the H.C.C. becomes so obvious that CTA plays an important role in evaluating its resectability. However, H.C.C. of 1 cm to 2 cm in diameter, which is visualized with celiac and infusion hepatic angiography, becomes more difficult to detect, and H.C.C. of less than 1 cm in diameter can hardly be recognized, nor diagnosed as a malignant nodule, by CTA; therefore it appears that for these sizes of H.C.C. the detectability of CTA is not superior to that of hepatic angiography.

  5. An analysis of true- and false-positive results of vocal fold uptake in positron emission tomography-computed tomography imaging.

    Science.gov (United States)

    Seymour, N; Burkill, G; Harries, M

    2018-03-01

    Positron emission tomography-computed tomography with fluorine-18 fluorodeoxy-D-glucose has a major role in the investigation of head and neck cancers. Fluorine-18 fluorodeoxy-D-glucose is not a tumour-specific tracer and can also accumulate in benign pathology. Therefore, positron emission tomography-computed tomography scan interpretation difficulties are common in the head and neck, which can produce false-positive results. This study aimed to investigate patients detected as having abnormal vocal fold uptake on fluorine-18 fluorodeoxy-D-glucose positron emission tomography-computed tomography. Positron emission tomography-computed tomography scans were identified over a 15-month period where reports contained evidence of unilateral vocal fold uptake or vocal fold pathology. Patients' notes and laryngoscopy results were analysed. Forty-six patients were identified as having abnormal vocal fold uptake on positron emission tomography-computed tomography. Twenty-three patients underwent positron emission tomography-computed tomography and flexible laryngoscopy: 61 per cent of patients had true-positive positron emission tomography-computed tomography scans and 39 per cent had false-positive scan results. Most patients referred to ENT for abnormal findings on positron emission tomography-computed tomography scans had true-positive findings. Asymmetrical fluorine-18 fluorodeoxy-D-glucose uptake should raise suspicion of vocal fold pathology, accepting a false-positive rate of approximately 40 per cent.

  6. Computer evaluation of the results of batch fermentations

    Energy Technology Data Exchange (ETDEWEB)

    Nyeste, L; Sevella, B

    1980-01-01

    A useful aid to the mathematical modeling of fermentation systems, for the kinetic evaluation of batch fermentations, is described. The generalized logistic equation may be used to describe the growth curves, substrate consumption, and product formation. A computer procedure was developed to fit the equation to experimental points, automatically determining the equation constants on the basis of an iterative non-linear least-squares algorithm. By applying the procedure to different master programs of various fermentations, the complex kinetic evaluation of fermentations becomes possible. Based on the analytically easily treatable generalized logistic equation, it is possible to calculate by computer different kinetic characteristics, e.g. rates, specific rates, yields, etc. The possibility of committing subjective errors was reduced to a minimum. Employment of the method is demonstrated on some fermentation processes, and problems arising in the course of application are discussed.
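
    The abstract does not state which parameterisation of the generalized logistic equation is used; the sketch below fits one common Richards-type form to a measured growth curve with scipy's non-linear least-squares routine, mirroring the iterative fitting the paper describes. The data names (`t_data`, `x_data`) are placeholders.

```python
import numpy as np
from scipy.optimize import curve_fit

def generalized_logistic(t, lower, upper, rate, t_mid, nu):
    """Richards-type generalized logistic curve rising from `lower` to `upper`."""
    return lower + (upper - lower) / (1.0 + np.exp(-rate * (t - t_mid))) ** (1.0 / nu)

# Hypothetical fit to measured biomass values x_data sampled at times t_data:
# popt, pcov = curve_fit(generalized_logistic, t_data, x_data,
#                        p0=(x_data.min(), x_data.max(), 0.5, t_data.mean(), 1.0),
#                        maxfev=10_000)
# Kinetic characteristics such as rates then follow from the fitted curve, e.g.:
# growth_rate = np.gradient(generalized_logistic(t_data, *popt), t_data)
```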

  7. Dynamic computed tomography scanning of benign bone lesions: Preliminary results

    International Nuclear Information System (INIS)

    Levine, E.; Neff, J.R.

    1983-01-01

    The majority of benign bone lesions can be evaluated adequately using conventional radiologic techniques. However, it is not always possible to differentiate reliably between different types of benign bone lesions on the basis of plain film appearances alone. Dynamic computed tomography (CT) scanning provides a means for further characterizing such lesions by assessing their degree of vascularity. Thus, it may help in distinguishing an osteoid osteoma, which has a hypervascular nidus, from a Brodie's abscess, which is avascular. Dynamic CT scanning may also help in the differentiation between a fluid-containing simple bone cyst, which is avascular, and other solid or semi-solid benign bone lesions which show varying degrees of vascularity. However, because of the additional irradiation involved, dynamic CT scanning should be reserved for evaluation of selected patients with benign bone lesions in whom the plain film findings are not definitive and in whom the CT findings may have a significant influence on management. (orig.)

  8. Calculating buoy response for a wave energy converter—A comparison of two computational methods and experimental results

    Directory of Open Access Journals (Sweden)

    Linnea Sjökvist

    2017-05-01

    Full Text Available When designing a wave power plant, reliable and fast simulation tools are required. Computational fluid dynamics (CFD software provides high accuracy but with a very high computational cost, and in operational, moderate sea states, linear potential flow theories may be sufficient to model the hydrodynamics. In this paper, a model is built in COMSOL Multiphysics to solve for the hydrodynamic parameters of a point-absorbing wave energy device. The results are compared with a linear model where the hydrodynamical parameters are computed using WAMIT, and to experimental results from the Lysekil research site. The agreement with experimental data is good for both numerical models.

  9. Flexibility of Bricard's linkages and other structures via resultants and computer algebra.

    Science.gov (United States)

    Lewis, Robert H; Coutsias, Evangelos A

    2016-07-01

    Flexibility of structures is extremely important for chemistry and robotics. Following our earlier work, we study flexibility using polynomial equations, resultants, and a symbolic algorithm of our creation that analyzes the resultant. We show that the software solves a classic arrangement of quadrilaterals in the plane due to Bricard. We fill in several gaps in Bricard's work and discover new flexible arrangements that he was apparently unaware of. This provides strong evidence for the maturity of the software, and is a wonderful example of mathematical discovery via computer assisted experiment.
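
    As a minimal illustration of the general technique (not of Bricard's actual quadrilateral system, which is larger), the sketch below uses SymPy to eliminate one variable from a pair of toy polynomial constraints with a resultant; the vanishing locus of the resultant describes where both constraints can be met simultaneously, which is the kind of condition a flexibility analysis inspects.

```python
import sympy as sp

x, y = sp.symbols("x y")

# Two toy polynomial constraints standing in for loop-closure equations.
f = x**2 + y**2 - 4
g = x * y - 1

# Eliminate y: the resultant is a polynomial in x alone whose roots are the
# x-values at which the two constraints admit a common solution in y.
res = sp.resultant(f, g, y)
print(sp.expand(res))   # x**4 - 4*x**2 + 1
```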

  10. Cloud Computing (SaaS) Adoption as a Strategic Technology: Results of an Empirical Study

    Directory of Open Access Journals (Sweden)

    Pedro R. Palos-Sanchez

    2017-01-01

    Full Text Available The present study empirically analyzes the factors that determine the adoption of the cloud computing (SaaS) model in firms where this strategy is considered strategic for executing their activity. A research model has been developed to evaluate the factors that influence the intention of using cloud computing, combining the variables found in the technology acceptance model (TAM) with other external variables such as top management support, training, communication, organization size, and technological complexity. Data compiled from 150 companies in Andalusia (Spain) are used to test the formulated hypotheses. The results of this study reflect what critical factors should be considered and how they are interrelated. They also show the organizational demands that must be considered by those companies wishing to implement a real management model adapted to the digital economy, especially those related to cloud computing.

  11. Technique and results of the spinal computed tomography in the diagnosis of cervical disc disease

    International Nuclear Information System (INIS)

    Artmann, H.; Salbeck, R.; Grau, H.

    1985-01-01

    We describe a technique of patient positioning with traction of the arms during cervical spinal computed tomography which allows the shoulders to be drawn downwards by about one to three cervical segments. By this method the quality of the images can be improved in 96% of cases in the cervical segment 6/7 and in 81% in the cervicothoracic segment 7/1, to such a degree that a reliable judgement of the soft tissues in the spinal canal becomes possible. The diagnostic reliability of computed tomography of cervical disc herniation is thus improved, so that the need for myelography is decreasing. The results of 396 cervical spinal computed tomographies are presented. (orig.)

  12. Thermodynamic properties of 1-naphthol: Mutual validation of experimental and computational results

    International Nuclear Information System (INIS)

    Chirico, Robert D.; Steele, William V.; Kazakov, Andrei F.

    2015-01-01

    Highlights: • Heat capacities were measured for the temperature range 5 K to 445 K. • Vapor pressures were measured for the temperature range 370 K to 570 K. • Computed and derived properties for ideal gas entropies are in excellent accord. • The enthalpy of combustion was measured and shown to be consistent with reliable literature values. • Thermodynamic consistency analysis revealed anomalous literature data. - Abstract: Thermodynamic properties for 1-naphthol (Chemical Abstracts registry number [90-15-3]) in the ideal-gas state are reported based on both experimental and computational methods. Measured properties included the triple-point temperature, enthalpy of fusion, and heat capacities for the crystal and liquid phases by adiabatic calorimetry; vapor pressures by inclined-piston manometry and comparative ebulliometry; and the enthalpy of combustion of the crystal phase by oxygen bomb calorimetry. Critical properties were estimated. Entropies for the ideal-gas state were derived from the experimental studies for the temperature range 298.15 ⩽ T/K ⩽ 600, and independent statistical calculations were performed based on molecular geometry optimization and vibrational frequencies calculated at the B3LYP/6-31+G(d,p) level of theory. The mutual validation of the independent experimental and computed results is achieved with a scaling factor of 0.975 applied to the calculated vibrational frequencies. This same scaling factor was successfully applied in the analysis of results for other polycyclic molecules, as described in a series of recent articles by this research group. This article reports the first extension of this approach to a hydroxy-aromatic compound. All experimental results are compared with property values reported in the literature. Thermodynamic consistency between properties is used to show that several studies in the literature are erroneous. The enthalpy of combustion for 1-naphthol was also measured in this research, and excellent
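
    As a pointer to how the scaled frequencies enter the statistical calculation, the sketch below evaluates the harmonic-oscillator (rigid-rotor harmonic-oscillator) vibrational entropy contribution from a list of computed wavenumbers with the 0.975 scaling factor mentioned above. It is a textbook formula, not the authors' full ideal-gas entropy code, which also includes translational, rotational and any internal-rotation terms; the example wavenumbers are hypothetical.

```python
import numpy as np

H = 6.62607015e-34       # Planck constant, J s
C_CM = 2.99792458e10     # speed of light, cm/s
KB = 1.380649e-23        # Boltzmann constant, J/K
R = 8.314462618          # gas constant, J/(mol K)

def vibrational_entropy(freqs_cm1, temperature=298.15, scale=0.975):
    """Harmonic-oscillator vibrational entropy S_vib in J/(mol K) from scaled wavenumbers."""
    x = H * C_CM * scale * np.asarray(freqs_cm1, dtype=float) / (KB * temperature)
    return R * np.sum(x / np.expm1(x) - np.log1p(-np.exp(-x)))

# Example with a few hypothetical wavenumbers (cm^-1):
print(vibrational_entropy([200.0, 450.0, 1600.0, 3600.0]))
```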

  13. Comparing Results from Constant Comparative and Computer Software Methods: A Reflection about Qualitative Data Analysis

    Science.gov (United States)

    Putten, Jim Vander; Nolen, Amanda L.

    2010-01-01

    This study compared qualitative research results obtained by manual constant comparative analysis with results obtained by computer software analysis of the same data. An investigation of issues of trustworthiness and accuracy ensued. Results indicated that the inductive constant comparative data analysis generated 51 codes and two coding levels…

  14. General-Purpose Computation with Neural Networks: A Survey of Complexity Theoretic Results

    Czech Academy of Sciences Publication Activity Database

    Šíma, Jiří; Orponen, P.

    2003-01-01

    Roč. 15, č. 12 (2003), s. 2727-2778 ISSN 0899-7667 R&D Projects: GA AV ČR IAB2030007; GA ČR GA201/02/1456 Institutional research plan: AV0Z1030915 Keywords : computational power * computational complexity * perceptrons * radial basis functions * spiking neurons * feedforward networks * recurrent networks * probabilistic computation * analog computation Subject RIV: BA - General Mathematics Impact factor: 2.747, year: 2003

  15. Results of the First National Assessment of Computer Competence (The Printout).

    Science.gov (United States)

    Balajthy, Ernest

    1988-01-01

    Discusses the findings of the National Assessment of Educational Progress 1985-86 survey of American students' computer competence, focusing on findings of interest to reading teachers who use computers. (MM)

  16. Influence of chamber type integrated with computer-assisted semen analysis (CASA) system on the results of boar semen evaluation.

    Science.gov (United States)

    Gączarzewicz, D

    2015-01-01

    The objective of the study was to evaluate the effect of different types of chambers used in computer-assisted semen analysis (CASA) on boar sperm concentration and motility parameters. CASA measurements were performed on 45 ejaculates by comparing three commonly used chambers: Leja chamber (LJ), Makler chamber (MK) and microscopic slide-coverslip (SL). Concentration results obtained with CASA were verified by manual counting on a Bürker hemocytometer (BH). No significant differences were found between the concentrations determined with BH vs. LJ and SL, whereas higher (p0.05). The results obtained show that CASA assessment of boar semen should account for the effect of counting chamber on the results of sperm motility and concentration, which confirms the need for further study on standardizing the automatic analysis of boar semen.

  17. Engagement, Persistence, and Gender in Computer Science: Results of a Smartphone ESM Study.

    Science.gov (United States)

    Milesi, Carolina; Perez-Felkner, Lara; Brown, Kevin; Schneider, Barbara

    2017-01-01

    While the underrepresentation of women in the fast-growing STEM field of computer science (CS) has been much studied, no consensus exists on the key factors influencing this widening gender gap. Possible suspects include gender differences in aptitude, interest, and academic environment. Our study contributes to this literature by applying student engagement research to study the experiences of college students studying CS, to assess the degree to which differences in men and women's engagement may help account for gender inequity in the field. Specifically, we use the Experience Sampling Method (ESM) to evaluate in real-time the engagement of college students during varied activities and environments. Over the course of a full week in fall semester and a full week in spring semester, 165 students majoring in CS at two Research I universities were "beeped" several times a day via a smartphone app prompting them to fill out a short questionnaire including open-ended and scaled items. These responses were paired with administrative and over 2 years of transcript data provided by their institutions. We used mean comparisons and logistic regression analysis to compare enrollment and persistence patterns among CS men and women. Results suggest that despite the obstacles associated with women's underrepresentation in computer science, women are more likely to continue taking computer science courses when they felt challenged and skilled in their initial computer science classes. We discuss implications for further research.

  18. Engagement, Persistence, and Gender in Computer Science: Results of a Smartphone ESM Study

    Science.gov (United States)

    Milesi, Carolina; Perez-Felkner, Lara; Brown, Kevin; Schneider, Barbara

    2017-01-01

    While the underrepresentation of women in the fast-growing STEM field of computer science (CS) has been much studied, no consensus exists on the key factors influencing this widening gender gap. Possible suspects include gender differences in aptitude, interest, and academic environment. Our study contributes to this literature by applying student engagement research to study the experiences of college students studying CS, to assess the degree to which differences in men and women's engagement may help account for gender inequity in the field. Specifically, we use the Experience Sampling Method (ESM) to evaluate in real-time the engagement of college students during varied activities and environments. Over the course of a full week in fall semester and a full week in spring semester, 165 students majoring in CS at two Research I universities were “beeped” several times a day via a smartphone app prompting them to fill out a short questionnaire including open-ended and scaled items. These responses were paired with administrative and over 2 years of transcript data provided by their institutions. We used mean comparisons and logistic regression analysis to compare enrollment and persistence patterns among CS men and women. Results suggest that despite the obstacles associated with women's underrepresentation in computer science, women are more likely to continue taking computer science courses when they felt challenged and skilled in their initial computer science classes. We discuss implications for further research. PMID:28487664

  19. Engagement, Persistence, and Gender in Computer Science: Results of a Smartphone ESM Study

    Directory of Open Access Journals (Sweden)

    Carolina Milesi

    2017-04-01

    Full Text Available While the underrepresentation of women in the fast-growing STEM field of computer science (CS) has been much studied, no consensus exists on the key factors influencing this widening gender gap. Possible suspects include gender differences in aptitude, interest, and academic environment. Our study contributes to this literature by applying student engagement research to study the experiences of college students studying CS, to assess the degree to which differences in men and women's engagement may help account for gender inequity in the field. Specifically, we use the Experience Sampling Method (ESM) to evaluate in real-time the engagement of college students during varied activities and environments. Over the course of a full week in fall semester and a full week in spring semester, 165 students majoring in CS at two Research I universities were “beeped” several times a day via a smartphone app prompting them to fill out a short questionnaire including open-ended and scaled items. These responses were paired with administrative and over 2 years of transcript data provided by their institutions. We used mean comparisons and logistic regression analysis to compare enrollment and persistence patterns among CS men and women. Results suggest that despite the obstacles associated with women's underrepresentation in computer science, women are more likely to continue taking computer science courses when they felt challenged and skilled in their initial computer science classes. We discuss implications for further research.

  20. Computation of complexity measures of morphologically significant zones decomposed from binary fractal sets via multiscale convexity analysis

    International Nuclear Information System (INIS)

    Lim, Sin Liang; Koo, Voon Chet; Daya Sagar, B.S.

    2009-01-01

    Multiscale convexity analysis of certain fractal binary objects, like the 8-segment Koch quadric, Koch triadic, and random Koch quadric and triadic islands, is performed via (i) morphologic openings with respect to recursively changing the size of a template, and (ii) construction of convex hulls through half-plane closings. Based on the scale vs convexity measure relationship, transition levels between the morphologic regimes are determined as crossover scales. These crossover scales are taken as the basis to segment binary fractal objects into various morphologically prominent zones. Each segmented zone is characterized through normalized morphologic complexity measures. Despite the fact that there is no notably significant relationship between the zone-wise complexity measures and fractal dimensions computed by the conventional box counting method, fractal objects, whether they are generated deterministically or by introducing randomness, possess morphologically significant sub-zones with varied degrees of spatial complexities. Classification of realistic fractal sets and/or fields according to sub-zones possessing varied degrees of spatial complexities provides insight to explore links with the physical processes involved in the formation of fractal-like phenomena.
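
    As an illustration of the procedure summarized above (morphologic openings with a template of increasing size, with a convexity measure recorded at each scale), the following sketch computes a scale vs. convexity profile for a small binary object. It is a minimal sketch, not the authors' implementation: the disk-shaped template, the area-ratio convexity measure, and the toy object are all assumptions.

```python
# Minimal sketch of a scale vs. convexity-measure curve for a binary object.
# Not the authors' implementation; structuring element and measure are assumptions.
import numpy as np
from scipy import ndimage
from skimage.morphology import convex_hull_image

def convexity_profile(binary_img, max_radius=20):
    """Area of the opened object divided by the area of its convex hull,
    for circular structuring elements of increasing radius."""
    hull_area = convex_hull_image(binary_img).sum()
    measures = []
    for r in range(1, max_radius + 1):
        y, x = np.ogrid[-r:r + 1, -r:r + 1]
        disk = x**2 + y**2 <= r**2          # circular template of radius r
        opened = ndimage.binary_opening(binary_img, structure=disk)
        measures.append(opened.sum() / hull_area)
    return np.asarray(measures)

# Toy object: a square with a rough (concave) boundary.
img = np.zeros((128, 128), dtype=bool)
img[32:96, 32:96] = True
img[60:68, 90:96] = False                    # notch introduces concavity
profile = convexity_profile(img)
# Crossover scales would be read off where the profile changes regime.
print(np.round(profile, 3))
```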

  1. Mathematical structures for computer graphics

    CERN Document Server

    Janke, Steven J

    2014-01-01

    A comprehensive exploration of the mathematics behind the modeling and rendering of computer graphics scenes Mathematical Structures for Computer Graphics presents an accessible and intuitive approach to the mathematical ideas and techniques necessary for two- and three-dimensional computer graphics. Focusing on the significant mathematical results, the book establishes key algorithms used to build complex graphics scenes. Written for readers with various levels of mathematical background, the book develops a solid foundation for graphics techniques and fills in relevant grap

  2. New results on classical problems in computational geometry in the plane

    DEFF Research Database (Denmark)

    Abrahamsen, Mikkel

    In this thesis, we revisit three classical problems in computational geometry in the plane. An obstacle that often occurs as a subproblem in more complicated problems is to compute the common tangents of two disjoint, simple polygons. For instance, the common tangents turn up in problems related to visibility, collision avoidance, shortest paths, etc. We provide a remarkably simple algorithm to compute all (at most four) common tangents of two disjoint simple polygons. Given each polygon as a read-only array of its corners in cyclic order, the algorithm runs in linear time and constant workspace and is the first to achieve the two complexity bounds simultaneously. The set of common tangents provides basic information about the convex hulls of the polygons—whether they are nested, overlapping, or disjoint—and our algorithm thus also decides this relationship. One of the best-known problems in computational...
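
    For context on the tangent subproblem discussed above: a line through one vertex of each of two convex polygons is a common outer tangent exactly when every vertex of both polygons lies on the same side of it. The sketch below applies that check naively in O(nm) time for convex inputs only; it is not the linear-time, constant-workspace algorithm for simple polygons described in the thesis.

```python
# Naive O(n*m) enumeration of common outer tangents of two convex polygons.
# Illustrative only; the thesis algorithm handles simple polygons in linear
# time and constant workspace, which this sketch does not attempt.
def cross(o, a, b):
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def outer_tangents(P, Q):
    tangents = []
    for p in P:
        for q in Q:
            sides = [cross(p, q, v) for v in P + Q if v not in (p, q)]
            if all(s >= 0 for s in sides) or all(s <= 0 for s in sides):
                tangents.append((p, q))
    return tangents

square = [(0, 0), (2, 0), (2, 2), (0, 2)]
triangle = [(5, 0), (7, 0), (6, 2)]
for p, q in outer_tangents(square, triangle):
    print(p, "--", q)
```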

  3. Psychology of computer use: XXXII. Computer screen-savers as distractors.

    Science.gov (United States)

    Volk, F A; Halcomb, C G

    1994-12-01

    The differences in performance of 16 male and 16 female undergraduates on three cognitive tasks were investigated in the presence of visual distractors (computer-generated dynamic graphic images). These tasks included skilled and unskilled proofreading and listening comprehension. The visually demanding task of proofreading (skilled and unskilled) showed no significant decreases in performance in the distractor conditions. Results showed significant decrements, however, in performance on listening comprehension in at least one of the distractor conditions.

  4. Replacing gasoline with corn ethanol results in significant environmental problem-shifting.

    Science.gov (United States)

    Yang, Yi; Bae, Junghan; Kim, Junbeum; Suh, Sangwon

    2012-04-03

    Previous studies on the life-cycle environmental impacts of corn ethanol and gasoline focused almost exclusively on energy balance and greenhouse gas (GHG) emissions and largely overlooked the influence of regional differences in agricultural practices. This study compares the environmental impact of gasoline and E85 taking into consideration 12 different environmental impacts and regional differences among 19 corn-growing states. Results show that E85 does not outperform gasoline when a wide spectrum of impacts is considered. If the impacts are aggregated using weights developed by the National Institute of Standards and Technology (NIST), overall, E85 generates approximately 6% to 108% (23% on average) greater impact compared with gasoline, depending on where corn is produced, primarily because corn production induces significant eutrophication impacts and requires intensive irrigation. If GHG emissions from the indirect land use changes are considered, the differences increase to between 16% and 118% (33% on average). Our study indicates that replacing gasoline with corn ethanol may only result in shifting the net environmental impacts primarily toward increased eutrophication and greater water scarcity. These results suggest that the environmental criteria used in the Energy Independence and Security Act (EISA) be re-evaluated to include additional categories of environmental impact beyond GHG emissions.
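
    The aggregation step mentioned above (combining several impact categories into one score using NIST-style weights) is a weighted sum of normalized impacts. The sketch below uses invented category names, weights, and impact values purely to show the arithmetic; it does not reproduce the study's data or the actual NIST weighting set.

```python
# Hedged illustration of weighted impact aggregation (fabricated numbers).
weights = {"GHG": 0.30, "eutrophication": 0.25, "water_use": 0.25, "acidification": 0.20}

# Impacts of each fuel, normalized to gasoline = 1.0 in every category (assumed values).
gasoline = {"GHG": 1.00, "eutrophication": 1.00, "water_use": 1.00, "acidification": 1.00}
e85      = {"GHG": 0.80, "eutrophication": 1.90, "water_use": 1.70, "acidification": 1.10}

def aggregate(impacts, weights):
    return sum(weights[c] * impacts[c] for c in weights)

ratio = aggregate(e85, weights) / aggregate(gasoline, weights)
print(f"E85 aggregate impact relative to gasoline: {ratio:.2f}")  # >1 means worse overall
```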

  5. Degenerative dementia: nosological aspects and results of single photon emission computed tomography

    International Nuclear Information System (INIS)

    Dubois, B.; Habert, M.O.

    1999-01-01

    Ten years ago, the differential diagnosis of dementia in the elderly patient was limited to two pathologies: Alzheimer's disease and Pick's disease. In recent years, this framework of primary degenerative dementias has been broken up into a number of distinct entities. The different diseases and the results obtained with single photon emission computed tomography are discussed, for example fronto-temporal dementia, primary progressive aphasia, progressive apraxia, visuo-spatial dysfunction, dementia with Lewy bodies, and cortico-basal degeneration. (N.C.)

  6. Writing Apprehension, Computer Anxiety and Telecomputing: A Pilot Study.

    Science.gov (United States)

    Harris, Judith; Grandgenett, Neal

    1992-01-01

    A study measured graduate students' writing apprehension and computer anxiety levels before and after using electronic mail, computer conferencing, and remote database searching facilities during an educational technology course. Results indicated that postcourse computer anxiety levels were significantly related to usage statistics. Precourse writing…

  7. Results of work of neurological clinic in first year of computer tomograph application

    Energy Technology Data Exchange (ETDEWEB)

    Volejnik, V; Nettl, S; Heger, L [Karlova Univ., Hradec Kralove (Czechoslovakia). Lekarska Fakulta]

    1980-11-01

    The results are analyzed of one year's use of a computer tomograph (CT) by a department of neurology. Detailed comparisons with corresponding PEG and CT findings showed the accuracy of CT examinations in the descriptions of the width of the subarachnoid spaces and of the ventricular system. The advantages of CT are assessed from the medical, economic, and ethical points of view.

  8. Results of work of neurological clinic in first year of computer tomograph application

    International Nuclear Information System (INIS)

    Volejnik, V.; Nettl, S.; Heger, L.

    1980-01-01

    The results are analyzed of one year's use of a computer tomograph (CT) by a department of neurology. Detailed comparisons with corresponding PEG and CT findings showed the accuracy of CT examinations in the descriptions of the width of the subarachnoid spaces and of the ventricular system. The advantages of CT are assessed from the medical, economic, and ethical points of view. (author)

  9. Phasic firing in vasopressin cells: understanding its functional significance through computational models.

    Directory of Open Access Journals (Sweden)

    Duncan J MacGregor

    Full Text Available Vasopressin neurons, responding to input generated by osmotic pressure, use an intrinsic mechanism to shift from slow irregular firing to a distinct phasic pattern, consisting of long bursts and silences lasting tens of seconds. With increased input, bursts lengthen, eventually shifting to continuous firing. The phasic activity remains asynchronous across the cells and is not reflected in the population output signal. Here we have used a computational vasopressin neuron model to investigate the functional significance of the phasic firing pattern. We generated a concise model of the synaptic input driven spike firing mechanism that gives a close quantitative match to vasopressin neuron spike activity recorded in vivo, tested against endogenous activity and experimental interventions. The integrate-and-fire based model provides a simple physiological explanation of the phasic firing mechanism involving an activity-dependent slow depolarising afterpotential (DAP) generated by a calcium-inactivated potassium leak current. This is modulated by the slower, opposing, action of activity-dependent dendritic dynorphin release, which inactivates the DAP, the opposing effects generating successive periods of bursting and silence. Model cells are not spontaneously active, but fire when perturbed by random perturbations mimicking synaptic input. We constructed one population of such phasic neurons, and another population of similar cells but which lacked the ability to fire phasically. We then studied how these two populations differed in the way that they encoded changes in afferent inputs. By comparison with the non-phasic population, the phasic population responds linearly to increases in tonic synaptic input. Non-phasic cells respond to transient elevations in synaptic input in a way that strongly depends on background activity levels, phasic cells in a way that is independent of background levels, and show a similar strong linearization of the response
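
    The burst mechanism described above (a spike-triggered depolarising afterpotential that is slowly inactivated by activity-dependent dynorphin) can be caricatured with a leaky integrate-and-fire model. Everything in the sketch below, including the time constants, increments, and threshold, is an assumed toy parameterization rather than the published model equations.

```python
# Toy integrate-and-fire neuron with a spike-activated DAP and a slower
# dynorphin-like variable that inactivates the DAP (assumed parameters).
import numpy as np

rng = np.random.default_rng(0)
dt, T = 1.0, 200_000                  # ms per step, total steps
v_rest, v_thresh, v_reset = -65.0, -50.0, -70.0
tau_v, tau_dap, tau_dyn = 20.0, 2_000.0, 10_000.0

v, dap, dyn = v_rest, 0.0, 0.0
spikes = []
for t in range(T):
    syn = 2.5 * rng.standard_normal()               # random synaptic perturbation
    drive = dap * (1.0 - dyn)                       # dynorphin gates the DAP
    v += dt * (-(v - v_rest) / tau_v + drive + syn)
    dap -= dt * dap / tau_dap
    dyn -= dt * dyn / tau_dyn
    if v >= v_thresh:                               # spike
        spikes.append(t * dt)
        v = v_reset
        dap += 0.15                                 # fast regenerative afterpotential
        dyn = min(dyn + 0.01, 1.0)                  # slow activity-dependent inhibition (capped)
print(f"{len(spikes)} spikes in {T * dt / 1000:.0f} s")
```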

  10. A High-Throughput Computational Framework for Identifying Significant Copy Number Aberrations from Array Comparative Genomic Hybridisation Data

    Directory of Open Access Journals (Sweden)

    Ian Roberts

    2012-01-01

    Full Text Available Reliable identification of copy number aberrations (CNA) from comparative genomic hybridization data would be improved by the availability of a generalised method for processing large datasets. To this end, we developed swatCGH, a data analysis framework and region detection heuristic for computational grids. swatCGH analyses sequentially displaced (sliding) windows of neighbouring probes and applies adaptive thresholds of varying stringency to identify the 10% of each chromosome that contains the most frequently occurring CNAs. We used the method to analyse a published dataset, comparing data preprocessed using four different DNA segmentation algorithms, and two methods for prioritising the detected CNAs. The consolidated list of the most commonly detected aberrations confirmed the value of swatCGH as a simplified high-throughput method for identifying biologically significant CNA regions of interest.
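
    The sliding-window heuristic described above can be illustrated briefly: average the log2 ratios of neighbouring probes in displaced windows and keep the windows whose mean exceeds an adaptive threshold. The sketch below uses synthetic data and a single percentile cut-off standing in for swatCGH's adaptive thresholds; it is not the swatCGH code.

```python
# Simplified sliding-window scan for copy number aberrations on one chromosome.
# Synthetic data and a single percentile threshold; swatCGH itself applies
# several stringencies and runs on a computational grid.
import numpy as np

rng = np.random.default_rng(1)
log2_ratio = rng.normal(0.0, 0.15, size=1000)       # background probes
log2_ratio[400:450] += 0.8                          # simulated gain

window, step = 20, 5
starts = np.arange(0, log2_ratio.size - window + 1, step)
means = np.array([log2_ratio[s:s + window].mean() for s in starts])

threshold = np.percentile(np.abs(means), 90)        # keep the top 10% of windows
flagged = starts[np.abs(means) >= threshold]
print("candidate CNA windows start at probes:", flagged[:10], "...")
```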

  11. Computer Game Lugram - Version for Blind Children

    Directory of Open Access Journals (Sweden)

    V. Delić

    2011-06-01

    Full Text Available Computer games have undoubtedly become an integral part of educational activities of children. However, since computer games typically abound with audio and visual effects, most of them are completely useless for children with disabilities. Specifically, computer games dealing with the basics of geometry can contribute to mathematics education, but they require significant modifications in order to be suitable for the visually impaired children. The paper presents the results of research and adaptation of the educational computer game Lugram to the needs of completely blind children, as well as the testing of the prototype, whose results are encouraging to further research and development in the same direction.

  12. Analysis of a Model for Computer Virus Transmission

    Directory of Open Access Journals (Sweden)

    Peng Qin

    2015-01-01

    Full Text Available Computer viruses remain a significant threat to computer networks. In this paper, the incorporation of new computers to the network and the removing of old computers from the network are considered. Meanwhile, the computers are equipped with antivirus software on the computer network. The computer virus model is established. Through the analysis of the model, disease-free and endemic equilibrium points are calculated. The stability conditions of the equilibria are derived. To illustrate our theoretical analysis, some numerical simulations are also included. The results provide a theoretical basis to control the spread of computer virus.
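
    A generic compartmental model of the kind analysed in the paper (susceptible and infected computers, with new machines joining the network, old ones removed, and antivirus software curing infections) can be integrated numerically as below. The rate constants and the exact form of the equations are assumptions for illustration, not the system studied in the paper.

```python
# Generic SIS-style computer-virus model with node turnover (assumed rates).
import numpy as np
from scipy.integrate import solve_ivp

b, mu, beta, gamma = 5.0, 0.01, 0.002, 0.2   # inflow, removal, infection, cure rates

def rhs(t, y):
    S, I = y
    return [b - beta * S * I - mu * S + gamma * I,
            beta * S * I - gamma * I - mu * I]

sol = solve_ivp(rhs, (0.0, 200.0), [500.0, 1.0], dense_output=True)
S_end, I_end = sol.y[:, -1]
print(f"endemic-like state after 200 time units: S = {S_end:.1f}, I = {I_end:.1f}")
```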

  13. Predictive modeling of liquid-sodium thermal–hydraulics experiments and computations

    International Nuclear Information System (INIS)

    Arslan, Erkan; Cacuci, Dan G.

    2014-01-01

    Highlights: • We applied the predictive modeling method of Cacuci and Ionescu-Bujor (2010). • We assimilated data from sodium flow experiments. • We used computational fluid dynamics simulations of sodium experiments. • The predictive modeling method greatly reduced uncertainties in predicted results. - Abstract: This work applies the predictive modeling procedure formulated by Cacuci and Ionescu-Bujor (2010) to assimilate data from liquid-sodium thermal–hydraulics experiments in order to reduce systematically the uncertainties in the predictions of computational fluid dynamics (CFD) simulations. The predicted CFD-results for the best-estimate model parameters and results describing sodium-flow velocities and temperature distributions are shown to be significantly more precise than the original computations and experiments, in that the predicted uncertainties for the best-estimate results and model parameters are significantly smaller than both the originally computed and the experimental uncertainties

  14. [The significance of dermatologic management in computer-assisted occupational dermatology consultation].

    Science.gov (United States)

    Rakoski, J; Borelli, S

    1989-01-15

    At our occupational outpatient clinic, 230 patients were treated over a period of about 15 months. With the help of a standardized questionnaire, we registered all the data regarding the relevant substances the patients came into contact with during their work, as well as their various jobs since leaving school. The patients were repeatedly seen and trained in procedures of skin care and skin protection. If required, we took steps to find new jobs for them within their employing company; this was done in cooperation with the trade cooperative association according to the dermatological insurance consultancy. If these proceedings did not work out, the patient had to change his profession altogether. All data were computerized. As an example of this computer-based documentation, we present the data of barbers.

  15. Computing and the Crisis: The Significant Role of New Information Technologies in the Current Socio-economic Meltdown

    Directory of Open Access Journals (Sweden)

    David Hakken

    2010-08-01

    Full Text Available There is good reason to be concerned about the long-term implications of the current crisis for the reproduction of contemporary social formations. Thus there is an urgent need to understand its character, especially its distinctive features. This article identifies profound ambiguities in valuing assets as new and key economic features of this crisis, ambiguities traceable to the dominant, “computationalist” computing used to develop new financial instruments. After some preliminaries, the article identifies four specific ways in which computerization of finance is generative of crisis. It then demonstrates how computationalist computing is linked to other efforts to extend commodification based on the ideology of so-called “intellectual property” (IP). Several other accounts for the crisis are considered and then demonstrated to have less explanatory value. After considering how some commons-oriented (e.g., Free/Libre and/or Open Source Software development projects) forms of computing also undermine the IP project, the article concludes with a brief discussion of what research on Socially Robust and Enduring Computing might contribute to fostering alternative, non-crisis generative ways to compute.

  16. Partnership in Computational Science

    Energy Technology Data Exchange (ETDEWEB)

    Huray, Paul G.

    1999-02-24

    This is the final report for the "Partnership in Computational Science" (PICS) award in an amount of $500,000 for the period January 1, 1993 through December 31, 1993. A copy of the proposal with its budget is attached as Appendix A. This report first describes the consequent significance of the DOE award in building infrastructure of high performance computing in the Southeast and then describes the work accomplished under this grant and a list of publications resulting from it.

  17. Computer Game Lugram - Version for Blind Children

    OpenAIRE

    V. Delić; N. Vujnović Sedlar; B. Lučić

    2011-01-01

    Computer games have undoubtedly become an integral part of educational activities of children. However, since computer games typically abound with audio and visual effects, most of them are completely useless for children with disabilities. Specifically, computer games dealing with the basics of geometry can contribute to mathematics education, but they require significant modifications in order to be suitable for the visually impaired children. The paper presents the results of research and ...

  18. Nonlinear ultrasound propagation through layered liquid and tissue-equivalent media: computational and experimental results at high frequency

    International Nuclear Information System (INIS)

    Williams, Ross; Cherin, Emmanuel; Lam, Toby Y J; Tavakkoli, Jahangir; Zemp, Roger J; Foster, F Stuart

    2006-01-01

    Nonlinear propagation has been demonstrated to have a significant impact on ultrasound imaging. An efficient computational algorithm is presented to simulate nonlinear ultrasound propagation through layered liquid and tissue-equivalent media. Results are compared with hydrophone measurements. This study was undertaken to investigate the role of nonlinear propagation in high frequency ultrasound micro-imaging. The acoustic field of a focused transducer (20 MHz centre frequency, f-number 2.5) was simulated for layered media consisting of water and tissue-mimicking phantom, for several wide-bandwidth source pulses. The simulation model accounted for the effects of diffraction, attenuation and nonlinearity, with transmission and refraction at layer boundaries. The parameter of nonlinearity, B/A, of the water and tissue-mimicking phantom were assumed to be 5.2 and 7.4, respectively. The experimentally measured phantom B/A value found using a finite-amplitude insert-substitution method was shown to be 7.4 ± 0.6. Relative amounts of measured second and third harmonic pressures as a function of the fundamental pressures at the focus were in good agreement with simulations. Agreement within 3% was found between measurements and simulations of the beam widths of the fundamental and second harmonic signals following propagation through the tissue phantom. The results demonstrate significant nonlinear propagation effects for high frequency imaging beams

  19. An Evaluation of Computer Anxiety in Sport Organizations

    Directory of Open Access Journals (Sweden)

    Esmaeily Nerges

    2012-01-01

    Full Text Available The computer, with its widespread influence over the world, has involved everyone in recent years. The growing demand for this technology is associated somehow with anxiety and stress. Therefore, the present study was implemented in order to investigate the amount of computer anxiety in Iranian sport organizations. 574 managers and experts were selected as a sample by random selection. The measurement tool was the standard Computer Anxiety Questionnaire (1987) of Heinz, Glass and Knight, for which the face and content validities were confirmed by an expert academic group. Confirmatory factor analysis, the independent t-test, point-biserial (rPbis) and serial (rser) correlation coefficients, and one-way ANOVA were used in order to analyze the data. The results demonstrated that there is no significant relationship between age (P=0.821), sex (P=0.599) and computer anxiety, but the relationships between educational level (P<0.025), organizational post (P<0.035), work history (P<0.037), work experience (P<0.004) and computer anxiety were significant. The results also demonstrated that there is a significant difference between the computer anxiety observed in the organizations of physical education, sport federations and the departments of physical education in the schools (P<0.037, F(2,347) = 3.339). In the end we came to the conclusion that computer anxiety is a dynamic process with a variety of dimensions and complexities that should not be ignored easily and must be studied attentively.

  20. Safety Metrics for Human-Computer Controlled Systems

    Science.gov (United States)

    Leveson, Nancy G; Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  1. Cognitive impairment and computer tomography image in patients with arterial hypertension -preliminary results

    International Nuclear Information System (INIS)

    Yaneva-Sirakova, T.; Tarnovska-Kadreva, R.; Traykov, L.; Zlatareva, D.

    2012-01-01

    Arterial hypertension is the leading risk factor for cognitive impairment, but cognitive impairment develops only in some of the patients with poor control. On the other hand, not all of the patients with white matter changes have a cognitive deficit. There may be a variety of reasons for this: the accuracy of methods for blood pressure measurement, the specific brain localization, or some other reason. Here are the preliminary results of a study of the potential correlation between self-measured, office- and ambulatory-monitored blood pressure, central aortic blood pressure, minimal cognitive impairment and the specific brain image on contrast computed tomography. We expect to answer the question of whether central aortic or self-measured blood pressure has the leading role in the development of cognitive impairment in the presence of a specific neuroimaging finding, as well as what the prerequisite is for the clinical manifestation of cognitive dysfunction in patients with computed tomographic pathology. (authors)

  2. Computational problems in engineering

    CERN Document Server

    Mladenov, Valeri

    2014-01-01

    This book provides readers with modern computational techniques for solving a variety of problems from electrical, mechanical, civil and chemical engineering. Mathematical methods are presented in a unified manner, so they can be applied consistently to problems in applied electromagnetics, strength of materials, fluid mechanics, heat and mass transfer, environmental engineering, biomedical engineering, signal processing, automatic control and more.   • Features contributions from distinguished researchers on significant aspects of current numerical methods and computational mathematics; • Presents actual results and innovative methods that provide numerical solutions, while minimizing computing times; • Includes new and advanced methods and modern variations of known techniques that can solve difficult scientific problems efficiently.

  3. Comparison of Experimental Surface and Flow Field Measurements to Computational Results of the Juncture Flow Model

    Science.gov (United States)

    Roozeboom, Nettie H.; Lee, Henry C.; Simurda, Laura J.; Zilliac, Gregory G.; Pulliam, Thomas H.

    2016-01-01

    Wing-body juncture flow fields on commercial aircraft configurations are challenging to compute accurately. The NASA Advanced Air Vehicle Program's juncture flow committee is designing an experiment to provide data to improve Computational Fluid Dynamics (CFD) modeling in the juncture flow region. Preliminary design of the model was done using CFD, yet CFD tends to over-predict the separation in the juncture flow region. Risk reduction wind tunnel tests were requisitioned by the committee to obtain a better understanding of the flow characteristics of the designed models. NASA Ames Research Center's Fluid Mechanics Lab performed one of the risk reduction tests. The results of one case, accompanied by CFD simulations, are presented in this paper. Experimental results suggest the wall-mounted wind tunnel model produces a thicker boundary layer on the fuselage than the CFD predictions, resulting in a larger wing horseshoe vortex that suppresses the side-of-body separation in the juncture flow region. Compared with the experimental results, CFD predicts that a thinner boundary layer on the fuselage generates a weaker wing horseshoe vortex, resulting in a larger side-of-body separation.

  4. Differences in prevalence of self-reported musculoskeletal symptoms among computer and non-computer users in a Nigerian population: a cross-sectional study

    Directory of Open Access Journals (Sweden)

    Ayanniyi O

    2010-08-01

    Full Text Available Abstract Background Literature abounds on the prevalent nature of Self Reported Musculoskeletal Symptoms (SRMS) among computer users, but studies that actually compared this with non computer users are meagre, thereby reducing the strength of the evidence. This study compared the prevalence of SRMS between computer and non computer users and assessed the risk factors associated with SRMS. Methods A total of 472 participants comprising equal numbers of age and sex matched computer and non computer users were assessed for the presence of SRMS. Information concerning musculoskeletal symptoms and discomforts from the neck, shoulders, upper back, elbows, wrists/hands, low back, hips/thighs, knees and ankles/feet were obtained using the Standardized Nordic questionnaire. Results The prevalence of SRMS was significantly higher in the computer users than the non computer users both over the past 7 days (χ2 = 39.11, p = 0.001) and during the past 12 month durations (χ2 = 53.56, p = 0.001). The odds of reporting musculoskeletal symptoms was least for participants above the age of 40 years (OR = 0.42, 95% CI = 0.31-0.64 over the past 7 days and OR = 0.61; 95% CI = 0.47-0.77 during the past 12 months) and also reduced in female participants. Increasing daily hours and accumulated years of computer use and tasks of data processing and designs/graphics significantly (p < 0.05) increased the odds of reporting musculoskeletal symptoms. Conclusion The prevalence of SRMS was significantly higher in the computer users than the non computer users and younger age, being male, working longer hours daily, increasing years of computer use, data entry tasks and computer designs/graphics were the significant risk factors for reporting musculoskeletal symptoms among the computer users. Computer use may explain the increase in prevalence of SRMS among the computer users.

  5. Elucidating reaction mechanisms on quantum computers

    Science.gov (United States)

    Reiher, Markus; Wiebe, Nathan; Svore, Krysta M.; Wecker, Dave; Troyer, Matthias

    2017-01-01

    With rapid recent advances in quantum technology, we are close to the threshold of quantum devices whose computational powers can exceed those of classical supercomputers. Here, we show that a quantum computer can be used to elucidate reaction mechanisms in complex chemical systems, using the open problem of biological nitrogen fixation in nitrogenase as an example. We discuss how quantum computers can augment classical computer simulations used to probe these reaction mechanisms, to significantly increase their accuracy and enable hitherto intractable simulations. Our resource estimates show that, even when taking into account the substantial overhead of quantum error correction, and the need to compile into discrete gate sets, the necessary computations can be performed in reasonable time on small quantum computers. Our results demonstrate that quantum computers will be able to tackle important problems in chemistry without requiring exorbitant resources. PMID:28674011

  6. Elucidating reaction mechanisms on quantum computers

    Science.gov (United States)

    Reiher, Markus; Wiebe, Nathan; Svore, Krysta M.; Wecker, Dave; Troyer, Matthias

    2017-07-01

    With rapid recent advances in quantum technology, we are close to the threshold of quantum devices whose computational powers can exceed those of classical supercomputers. Here, we show that a quantum computer can be used to elucidate reaction mechanisms in complex chemical systems, using the open problem of biological nitrogen fixation in nitrogenase as an example. We discuss how quantum computers can augment classical computer simulations used to probe these reaction mechanisms, to significantly increase their accuracy and enable hitherto intractable simulations. Our resource estimates show that, even when taking into account the substantial overhead of quantum error correction, and the need to compile into discrete gate sets, the necessary computations can be performed in reasonable time on small quantum computers. Our results demonstrate that quantum computers will be able to tackle important problems in chemistry without requiring exorbitant resources.

  7. Elucidating reaction mechanisms on quantum computers.

    Science.gov (United States)

    Reiher, Markus; Wiebe, Nathan; Svore, Krysta M; Wecker, Dave; Troyer, Matthias

    2017-07-18

    With rapid recent advances in quantum technology, we are close to the threshold of quantum devices whose computational powers can exceed those of classical supercomputers. Here, we show that a quantum computer can be used to elucidate reaction mechanisms in complex chemical systems, using the open problem of biological nitrogen fixation in nitrogenase as an example. We discuss how quantum computers can augment classical computer simulations used to probe these reaction mechanisms, to significantly increase their accuracy and enable hitherto intractable simulations. Our resource estimates show that, even when taking into account the substantial overhead of quantum error correction, and the need to compile into discrete gate sets, the necessary computations can be performed in reasonable time on small quantum computers. Our results demonstrate that quantum computers will be able to tackle important problems in chemistry without requiring exorbitant resources.

  8. Feature Extraction on Brain Computer Interfaces using Discrete Dyadic Wavelet Transform: Preliminary Results

    International Nuclear Information System (INIS)

    Gareis, I; Gentiletti, G; Acevedo, R; Rufiner, L

    2011-01-01

    The purpose of this work is to evaluate different feature extraction alternatives to detect the event-related evoked potential signal on brain computer interfaces, trying to minimize the time employed and the classification error, in terms of sensitivity and specificity of the method, looking for alternatives to coherent averaging. In this context, the results obtained when performing the feature extraction with the discrete dyadic wavelet transform using different mother wavelets are presented. For the classification, a single-layer perceptron was used. The results obtained with and without the wavelet decomposition were compared, showing an improvement in the classification rate, the specificity and the sensitivity for the feature vectors obtained using some mother wavelets.
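
    To make the pipeline above concrete, the sketch below extracts features from single-trial epochs with a discrete dyadic wavelet decomposition and classifies them with a single-layer perceptron. The synthetic data, the 'db4' mother wavelet, the decomposition level, and the use of scikit-learn's Perceptron are illustrative assumptions rather than the authors' exact configuration.

```python
# Wavelet-coefficient features for ERP detection with a single-layer perceptron.
# Synthetic epochs stand in for real EEG; wavelet and level are assumptions.
import numpy as np
import pywt
from sklearn.linear_model import Perceptron
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n_trials, n_samples = 400, 256
X_raw = rng.normal(0, 1, (n_trials, n_samples))
y = rng.integers(0, 2, n_trials)
X_raw[y == 1, 100:140] += 0.8                      # crude "evoked potential"

def wavelet_features(epoch, wavelet="db4", level=4):
    coeffs = pywt.wavedec(epoch, wavelet, level=level)
    return np.concatenate(coeffs[:2])              # approximation + coarsest detail

X = np.array([wavelet_features(e) for e in X_raw])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = Perceptron(max_iter=1000).fit(X_tr, y_tr)
print("held-out accuracy:", round(clf.score(X_te, y_te), 3))
```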

  9. Performance of various mathematical methods for computer-aided processing of radioimmunoassay results

    International Nuclear Information System (INIS)

    Vogt, W.; Sandel, P.; Langfelder, Ch.; Knedel, M.

    1978-01-01

    The performance of six algorithms was compared for computer-aided determination of radioimmunological end results. These were weighted and unweighted linear logit-log regression; quadratic logit-log regression; smoothing spline interpolation with a large and a small smoothing factor, respectively; polygonal interpolation; and manual curve fitting, on the basis of three radioimmunoassays with different reference curve characteristics (digoxin, estriol, human chorionic somatomammotrophin (HCS)). Particular emphasis was placed on the accuracy of the approximation at the intermediate points on the curve, i.e. those points that lie midway between two standard concentrations. These concentrations were obtained by weighing and inserted as unknown samples. In the case of digoxin and estriol the polygonal interpolation provided the best results, while the weighted logit-log regression proved superior in the case of HCS. (Auth.)
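
    Of the algorithms compared above, linear logit-log regression is easy to state: transform the bound fraction y = B/B0 to logit(y) = ln(y/(1-y)), fit it linearly against log concentration, and invert the fit to read off concentrations of unknowns. The sketch below uses invented calibration data purely to show the transformation and the back-calculation.

```python
# Unweighted linear logit-log calibration for a radioimmunoassay standard curve.
# Calibration values are invented for illustration.
import numpy as np

conc = np.array([0.5, 1, 2, 5, 10, 20])                      # standard concentrations
b_over_b0 = np.array([0.88, 0.79, 0.66, 0.45, 0.30, 0.18])   # bound fractions

logit = np.log(b_over_b0 / (1.0 - b_over_b0))
slope, intercept = np.polyfit(np.log(conc), logit, 1)

def concentration(y):
    """Invert the fitted standard curve for an unknown's bound fraction y."""
    return np.exp((np.log(y / (1.0 - y)) - intercept) / slope)

print(f"estimated concentration at B/B0 = 0.5: {concentration(0.5):.2f}")
```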

  10. Energy-Aware Computation Offloading of IoT Sensors in Cloudlet-Based Mobile Edge Computing.

    Science.gov (United States)

    Ma, Xiao; Lin, Chuang; Zhang, Han; Liu, Jianwei

    2018-06-15

    Mobile edge computing is proposed as a promising computing paradigm to relieve the excessive burden of data centers and mobile networks, which is induced by the rapid growth of Internet of Things (IoT). This work introduces the cloud-assisted multi-cloudlet framework to provision scalable services in cloudlet-based mobile edge computing. Due to the constrained computation resources of cloudlets and limited communication resources of wireless access points (APs), IoT sensors with identical computation offloading decisions interact with each other. To optimize the processing delay and energy consumption of computation tasks, theoretic analysis of the computation offloading decision problem of IoT sensors is presented in this paper. In more detail, the computation offloading decision problem of IoT sensors is formulated as a computation offloading game and the condition of Nash equilibrium is derived by introducing the tool of a potential game. By exploiting the finite improvement property of the game, the Computation Offloading Decision (COD) algorithm is designed to provide decentralized computation offloading strategies for IoT sensors. Simulation results demonstrate that the COD algorithm can significantly reduce the system cost compared with the random-selection algorithm and the cloud-first algorithm. Furthermore, the COD algorithm can scale well with increasing IoT sensors.
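
    The finite-improvement dynamics exploited by the COD algorithm can be mimicked with a toy best-response loop: each sensor repeatedly picks the cheaper of local execution or offloading, with the offloading cost growing in the number of sensors currently offloading. The cost model and parameters below are invented for illustration and are not the paper's formulation.

```python
# Toy best-response dynamics for a computation-offloading game (assumed costs).
import random

random.seed(3)
n = 12
local_cost = [random.uniform(4.0, 10.0) for _ in range(n)]   # per-sensor local cost
base_offload, congestion = 2.0, 0.8                          # offload cost parameters
choice = [0] * n                                              # 0 = local, 1 = offload

changed = True
while changed:                       # finite improvement property => terminates
    changed = False
    for i in range(n):
        others = sum(choice) - choice[i]
        offload_cost = base_offload + congestion * (others + 1)
        best = 1 if offload_cost < local_cost[i] else 0
        if best != choice[i]:
            choice[i] = best
            changed = True

print("offloading sensors at equilibrium:", [i for i, c in enumerate(choice) if c == 1])
```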

  11. Gene expression results in lipopolysaccharide-stimulated monocytes depend significantly on the choice of reference genes

    Directory of Open Access Journals (Sweden)

    Øvstebø Reidun

    2010-05-01

    Full Text Available Abstract Background Gene expression in lipopolysaccharide (LPS)-stimulated monocytes is mainly studied by quantitative real-time reverse transcription PCR (RT-qPCR), using GAPDH (glyceraldehyde 3-phosphate dehydrogenase) or ACTB (beta-actin) as reference gene for normalization. Expression of traditional reference genes has been shown to vary substantially under certain conditions, leading to invalid results. To investigate whether traditional reference genes are stably expressed in LPS-stimulated monocytes or if RT-qPCR results are dependent on the choice of reference genes, we have assessed and evaluated gene expression stability of twelve candidate reference genes in this model system. Results Twelve candidate reference genes were quantified by RT-qPCR in LPS-stimulated, human monocytes and evaluated using the programs geNorm, Normfinder and BestKeeper. geNorm ranked PPIB (cyclophilin B), B2M (beta-2-microglobulin) and PPIA (cyclophilin A) as the best combination for gene expression normalization in LPS-stimulated monocytes. Normfinder suggested TBP (TATA-box binding protein) and B2M as the best combination. Compared to these combinations, normalization using GAPDH alone resulted in significantly higher changes of TNF-α (tumor necrosis factor-alpha) and IL10 (interleukin 10) expression. Moreover, a significant difference in TNF-α expression between monocytes stimulated with equimolar concentrations of LPS from N. meningitidis and E. coli, respectively, was identified when using the suggested combinations of reference genes for normalization, but stayed unrecognized when employing a single reference gene, ACTB or GAPDH. Conclusions Gene expression levels in LPS-stimulated monocytes based on RT-qPCR results differ significantly when normalized to a single gene or a combination of stably expressed reference genes. Proper evaluation of reference gene stability is therefore mandatory before reporting RT-qPCR results in LPS-stimulated monocytes.
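
    The geNorm stability statistic referred to above is the average standard deviation of the pairwise log2 expression ratios between a candidate gene and every other candidate; the least stable gene (highest M) is removed iteratively. The sketch below computes M for a small synthetic expression matrix; it only shows the calculation and does not reproduce the study's data or the full geNorm workflow.

```python
# geNorm-style gene-stability measure M on a synthetic expression matrix.
# Rows are samples, columns are candidate reference genes (relative quantities).
import numpy as np

rng = np.random.default_rng(4)
genes = ["PPIB", "B2M", "PPIA", "GAPDH", "ACTB"]
expr = rng.lognormal(mean=0.0, sigma=0.2, size=(20, len(genes)))
expr[:, 3] *= rng.lognormal(0.0, 0.6, 20)        # make "GAPDH" deliberately unstable

def stability_M(expr):
    log_expr = np.log2(expr)
    n = log_expr.shape[1]
    M = np.empty(n)
    for j in range(n):
        ratios = log_expr[:, [j]] - np.delete(log_expr, j, axis=1)
        M[j] = ratios.std(axis=0, ddof=1).mean()  # mean pairwise-ratio SD
    return M

for gene, m in sorted(zip(genes, stability_M(expr)), key=lambda g: g[1]):
    print(f"{gene}: M = {m:.3f}")
```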

  12. Computer scientist looks at reliability computations

    International Nuclear Information System (INIS)

    Rosenthal, A.

    1975-01-01

    Results from the theory of computational complexity are applied to reliability computations on fault trees and networks. A well known class of problems which almost certainly have no fast solution algorithms is presented. It is shown that even approximately computing the reliability of many systems is difficult enough to be in this class. In the face of this result, which indicates that for general systems the computation time will be exponential in the size of the system, decomposition techniques which can greatly reduce the effective size of a wide variety of realistic systems are explored
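
    As a counterpoint to the hardness result discussed above, the easy special case is worth showing: when basic events are independent and no event is repeated, the top-event probability of a fault tree follows directly from the AND/OR gate formulas. The tiny example below uses invented event probabilities; it is the tractable special case, not the general network problem the abstract refers to.

```python
# Exact top-event probability for a small fault tree with independent basic events.
# AND gate: product of inputs; OR gate: 1 - product of complements.
def and_gate(*p):
    out = 1.0
    for x in p:
        out *= x
    return out

def or_gate(*p):
    out = 1.0
    for x in p:
        out *= (1.0 - x)
    return 1.0 - out

# Invented basic-event probabilities.
pump_fails, valve_fails, sensor_fails, operator_misses = 1e-3, 5e-4, 1e-2, 1e-1

cooling_lost = or_gate(pump_fails, valve_fails)           # either component failing
shutdown_fails = and_gate(sensor_fails, operator_misses)  # both barriers must fail
top_event = and_gate(cooling_lost, shutdown_fails)
print(f"P(top event) = {top_event:.2e}")
```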

  13. Significant improvement in one-dimensional cursor control using Laplacian electroencephalography over electroencephalography

    Science.gov (United States)

    Boudria, Yacine; Feltane, Amal; Besio, Walter

    2014-06-01

    Objective. Brain-computer interfaces (BCIs) based on electroencephalography (EEG) have been shown to accurately detect mental activities, but the acquisition of high levels of control require extensive user training. Furthermore, EEG has low signal-to-noise ratio and low spatial resolution. The objective of the present study was to compare the accuracy between two types of BCIs during the first recording session. EEG and tripolar concentric ring electrode (TCRE) EEG (tEEG) brain signals were recorded and used to control one-dimensional cursor movements. Approach. Eight human subjects were asked to imagine either ‘left’ or ‘right’ hand movement during one recording session to control the computer cursor using TCRE and disc electrodes. Main results. The obtained results show a significant improvement in accuracies using TCREs (44%-100%) compared to disc electrodes (30%-86%). Significance. This study developed the first tEEG-based BCI system for real-time one-dimensional cursor movements and showed high accuracies with little training.

  14. Evaluation of Computer Simulations for Teaching Apparel Merchandising Concepts.

    Science.gov (United States)

    Jolly, Laura D.; Sisler, Grovalynn

    1988-01-01

    The study developed and evaluated computer simulations for teaching apparel merchandising concepts. Evaluation results indicated that teaching method (computer simulation versus case study) does not significantly affect cognitive learning. Student attitudes varied, however, according to topic (profitable merchandising analysis versus retailing…

  15. Two-dimensional computer simulation of hypervelocity impact cratering: some preliminary results for Meteor Crater, Arizona

    Energy Technology Data Exchange (ETDEWEB)

    Bryan, J.B.; Burton, D.E.; Cunningham, M.E.; Lettis, L.A. Jr.

    1978-06-01

    A computational approach used for subsurface explosion cratering was extended to hypervelocity impact cratering. Meteor (Barringer) Crater, Arizona, was selected for the first computer simulation because it is one of the most thoroughly studied craters. It is also an excellent example of a simple, bowl-shaped crater and is one of the youngest terrestrial impact craters. Initial conditions for this calculation included a meteorite impact velocity of 15 km/s, meteorite mass of 1.67 x 10^8 kg, with a corresponding kinetic energy of 1.88 x 10^16 J (4.5 megatons). A two-dimensional Eulerian finite difference code called SOIL was used for this simulation of a cylindrical iron projectile impacting at normal incidence into a limestone target. For this initial calculation, a Tillotson equation-of-state description for iron and limestone was used with no shear strength. Results obtained for this preliminary calculation of the formation of Meteor Crater are in good agreement with field measurements. A color movie based on this calculation was produced using computer-generated graphics. 19 figures, 5 tables, 63 references.

  16. Two-dimensional computer simulation of hypervelocity impact cratering: some preliminary results for Meteor Crater, Arizona

    International Nuclear Information System (INIS)

    Bryan, J.B.; Burton, D.E.; Cunningham, M.E.; Lettis, L.A. Jr.

    1978-06-01

    A computational approach used for subsurface explosion cratering was extended to hypervelocity impact cratering. Meteor (Barringer) Crater, Arizona, was selected for the first computer simulation because it is one of the most thoroughly studied craters. It is also an excellent example of a simple, bowl-shaped crater and is one of the youngest terrestrial impact craters. Initial conditions for this calculation included a meteorite impact velocity of 15 km/s, meteorite mass of 1.67 x 10^8 kg, with a corresponding kinetic energy of 1.88 x 10^16 J (4.5 megatons). A two-dimensional Eulerian finite difference code called SOIL was used for this simulation of a cylindrical iron projectile impacting at normal incidence into a limestone target. For this initial calculation, a Tillotson equation-of-state description for iron and limestone was used with no shear strength. Results obtained for this preliminary calculation of the formation of Meteor Crater are in good agreement with field measurements. A color movie based on this calculation was produced using computer-generated graphics. 19 figures, 5 tables, 63 references

  17. Cloud Computing Adoption Business Model Factors: Does Enterprise Size Matter?

    OpenAIRE

    Bogataj Habjan, Kristina; Pucihar, Andreja

    2017-01-01

    This paper presents the results of research investigating the impact of business model factors on cloud computing adoption. The introduced research model consists of 40 cloud computing business model factors, grouped into eight factor groups. Their impact and importance for cloud computing adoption were investigated among enterprises in Slovenia. Furthermore, differences in opinion according to enterprise size were investigated. Research results show no statistically significant impacts of in...

  18. Positron computed tomography: current state, clinical results and future trends

    International Nuclear Information System (INIS)

    Schelbert, H.R.; Phelps, M.E.; Kuhl, D.E.

    1980-01-01

    An overview is presented of positron computed tomography: its advantages over single photon emission tomography, its use in metabolic studies of the heart and chemical investigation of the brain, and future trends

  19. Diagnostic significance and therapeutic consequences of computed tomography (patient outcome research). Pt. 1. Diagnosis in traumatology

    International Nuclear Information System (INIS)

    Schroeder, R.J.; Hidajat, N.; Vogl, T.; Haas, N.; Suedkamp, N.; Schedel, H.; Felix, R.

    1995-01-01

    During 1993, 201 primary traumatologic patients underwent 230 computed tomography examinations. 87% of the CTs were performed entirely without contrast media, 2.6% exclusively supported by intravenously given contrast media, 9.1% in both ways, and 1.3% after intra-articular contrast media administration. 97.4% served for primary diagnostic purposes and 2.6% for the control of therapeutic results. In 47.8% of the CTs, the principal diagnosis was known before CT. In 52.2%, the diagnosis could not be established by other methods without CT. The CT diagnoses were correctly positive in 58.7% and correctly negative in 41.3%. In 60.9% of the CTs there was no indication for operation in the examined body region; in 39.1% an operation followed. (orig.)

  20. Coupled-Flow Simulation of HP-LP Turbines Has Resulted in Significant Fuel Savings

    Science.gov (United States)

    Veres, Joseph P.

    2001-01-01

    Our objective was to create a high-fidelity Navier-Stokes computer simulation of the flow through the turbines of a modern high-bypass-ratio turbofan engine. The simulation would have to capture the aerodynamic interactions between closely coupled high- and low-pressure turbines. A computer simulation of the flow in the GE90 turbofan engine's high-pressure (HP) and low-pressure (LP) turbines was created at GE Aircraft Engines under contract with the NASA Glenn Research Center. The three-dimensional steady-state computer simulation was performed using Glenn's average-passage approach named APNASA. The areas upstream and downstream of each blade row mutually interact with each other during engine operation. The embedded blade row operating conditions are modeled since the average passage equations in APNASA actively include the effects of the adjacent blade rows. The turbine airfoils, platforms, and casing are actively cooled by compressor bleed air. Hot gas leaks around the tips of rotors through labyrinth seals. The flow exiting the high work HP turbines is partially transonic and, therefore, has a strong shock system in the transition region. The simulation was done using 121 processors of a Silicon Graphics Origin 2000 (NAS 02K) cluster at the NASA Ames Research Center, with a parallel efficiency of 87 percent in 15 hr. The typical average-passage analysis mesh size per blade row was 280 by 45 by 55, or approx. 700,000 grid points. The total number of blade rows was 18 for a combined HP and LP turbine system including the struts in the transition duct and exit guide vane, which contain 12.6 million grid points. Design cycle turnaround time requirements ran typically from 24 to 48 hr of wall clock time. The number of iterations for convergence was 10,000 at 8.03x10(exp -5) sec/iteration/grid point (NAS O2K). Parallel processing by up to 40 processors is required to meet the design cycle time constraints. This is the first-ever flow simulation of an HP and LP

  1. Positron Computed Tomography: Current State, Clinical Results and Future Trends

    Science.gov (United States)

    Schelbert, H. R.; Phelps, M. E.; Kuhl, D. E.

    1980-09-01

    An overview is presented of positron computed tomography: its advantages over single photon emission tomography, its use in metabolic studies of the heart and chemical investigation of the brain, and future trends. (ACR)

  2. Positron computed tomography: current state, clinical results and future trends

    Energy Technology Data Exchange (ETDEWEB)

    Schelbert, H.R.; Phelps, M.E.; Kuhl, D.E.

    1980-09-01

    An overview is presented of positron computed tomography: its advantages over single photon emission tomography, its use in metabolic studies of the heart and chemical investigation of the brain, and future trends. (ACR)

  3. Energy-resolved computed tomography: first experimental results

    International Nuclear Information System (INIS)

    Shikhaliev, Polad M

    2008-01-01

    First experimental results with energy-resolved computed tomography (CT) are reported. The contrast-to-noise ratio (CNR) in CT has been improved with x-ray energy weighting for the first time. Further, x-ray energy weighting improved the CNR in material decomposition CT when applied to CT projections prior to dual-energy subtraction. The existing CT systems use an energy (charge) integrating x-ray detector that provides a signal proportional to the energy of the x-ray photon. Thus, the x-ray photons with lower energies are scored less than those with higher energies. This underestimates contribution of lower energy photons that would provide higher contrast. The highest CNR can be achieved if the x-ray photons are scored by a factor that would increase as the x-ray energy decreases. This could be performed by detecting each x-ray photon separately and measuring its energy. The energy selective CT data could then be saved, and any weighting factor could be applied digitally to a detected x-ray photon. The CT system includes a photon counting detector with linear arrays of pixels made from cadmium zinc telluride (CZT) semiconductor. A cylindrical phantom with 10.2 cm diameter made from tissue-equivalent material was used for CT imaging. The phantom included contrast elements representing calcifications, iodine, adipose and glandular tissue. The x-ray tube voltage was 120 kVp. The energy selective CT data were acquired, and used to generate energy-weighted and material-selective CT images. The energy-weighted and material decomposition CT images were generated using a single CT scan at a fixed x-ray tube voltage. For material decomposition the x-ray spectrum was digitally split into low- and high-energy parts and dual-energy subtraction was applied. The x-ray energy weighting resulted in CNR improvement of calcifications and iodine by a factor of 1.40 and 1.63, respectively, as compared to conventional charge integrating CT. The x-ray energy weighting was also applied
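
    The weighting idea in the abstract, scoring each counted photon by a factor that grows as its energy falls rather than by its deposited charge, can be shown in a few lines. The bin energies, counts, and the specific w(E) proportional to E^-3 weighting below are assumptions for illustration; actual weights and CNR gains depend on the spectrum, object, and detector.

```python
# Photon-counting energy weighting versus charge integration (assumed numbers).
# Counts per energy bin for a background pixel and a contrast-element pixel.
import numpy as np

energies = np.array([35.0, 55.0, 75.0, 95.0, 115.0])   # bin centres in keV (assumed)
background = np.array([900.0, 700.0, 500.0, 300.0, 150.0])
contrast   = np.array([780.0, 640.0, 480.0, 295.0, 149.0])  # low-E bins attenuate most

def signal(counts, weights):
    return np.sum(weights * counts)

w_integrating = energies                 # charge-integrating weights ~ E
w_energy      = energies ** -3.0         # photon-counting weighting ~ E^-3 (assumed)

for name, w in [("charge-integrating", w_integrating), ("E^-3 weighting", w_energy)]:
    contrast_signal = signal(background, w) - signal(contrast, w)
    noise = np.sqrt(np.sum(w**2 * background))          # Poisson error propagation
    print(f"{name:20s} relative CNR = {contrast_signal / noise:.3f}")
```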

  4. Computer use and ulnar neuropathy: results from a case-referent study

    DEFF Research Database (Denmark)

    Andersen, JH; Frost, P.; Fuglsang-Frederiksen, A.

    2012-01-01

    We aimed to evaluate associations between vocational computer use and 1) ulnar neuropathy, and 2) ulnar neuropathy-like symptoms as distinguished by electroneurography. We identified all patients aged 18-65 years, examined at the Department of Neurophysiology on suspicion of ulnar neuropathy, 2001... Analysis was performed by conditional logistic regression. There was a negative association between daily hours of computer use and the two outcomes of interest. Participants who reported their elbow to be in contact with their working table for 2 hours or more during the workday had an elevated risk for ulnar...

  5. Computer-Based Training in Math and Working Memory Improves Cognitive Skills and Academic Achievement in Primary School Children: Behavioral Results

    Directory of Open Access Journals (Sweden)

    Noelia Sánchez-Pérez

    2018-01-01

    Full Text Available Student academic achievement has been positively related to further development outcomes, such as the attainment of higher educational, employment, and socioeconomic aspirations. Among all the academic competences, mathematics has been identified as an essential skill in the field of international leadership as well as for those seeking positions in disciplines related to science, technology, and engineering. Given its positive consequences, studies have designed trainings to enhance children's mathematical skills. Additionally, the ability to regulate and control actions and cognitions, i.e., executive functions (EF), has been associated with school success, which has resulted in a strong effort to develop EF training programs to improve students' EF and academic achievement. The present study examined the efficacy of a school computer-based training composed of two components, namely, working memory and mathematics tasks. Among the advantages of using a computer-based training program is the ease with which it can be implemented in school settings and the ease by which the difficulty of the tasks can be adapted to fit the child's ability level. To test the effects of the training, children's cognitive skills (EF and IQ) and their school achievement (math and language grades and abilities) were evaluated. The results revealed a significant improvement in cognitive skills, such as non-verbal IQ and inhibition, and better school performance in math and reading among the children who participated in the training compared to those children who did not. Most of the improvements were related to training on WM tasks. These findings confirmed the efficacy of a computer-based training that combined WM and mathematics activities as part of the school routines based on the training's impact on children's academic competences and cognitive skills.

  6. Computer-Based Training in Math and Working Memory Improves Cognitive Skills and Academic Achievement in Primary School Children: Behavioral Results.

    Science.gov (United States)

    Sánchez-Pérez, Noelia; Castillo, Alejandro; López-López, José A; Pina, Violeta; Puga, Jorge L; Campoy, Guillermo; González-Salinas, Carmen; Fuentes, Luis J

    2017-01-01

    Student academic achievement has been positively related to further development outcomes, such as the attainment of higher educational, employment, and socioeconomic aspirations. Among all the academic competences, mathematics has been identified as an essential skill in the field of international leadership as well as for those seeking positions in disciplines related to science, technology, and engineering. Given its positive consequences, studies have designed trainings to enhance children's mathematical skills. Additionally, the ability to regulate and control actions and cognitions, i.e., executive functions (EF), has been associated with school success, which has resulted in a strong effort to develop EF training programs to improve students' EF and academic achievement. The present study examined the efficacy of a school computer-based training composed of two components, namely, working memory and mathematics tasks. Among the advantages of using a computer-based training program is the ease with which it can be implemented in school settings and the ease by which the difficulty of the tasks can be adapted to fit the child's ability level. To test the effects of the training, children's cognitive skills (EF and IQ) and their school achievement (math and language grades and abilities) were evaluated. The results revealed a significant improvement in cognitive skills, such as non-verbal IQ and inhibition, and better school performance in math and reading among the children who participated in the training compared to those children who did not. Most of the improvements were related to training on WM tasks. These findings confirmed the efficacy of a computer-based training that combined WM and mathematics activities as part of the school routines based on the training's impact on children's academic competences and cognitive skills.

  7. Definition of bulky disease in early stage Hodgkin lymphoma in computed tomography era: prognostic significance of measurements in the coronal and transverse planes.

    Science.gov (United States)

    Kumar, Anita; Burger, Irene A; Zhang, Zhigang; Drill, Esther N; Migliacci, Jocelyn C; Ng, Andrea; LaCasce, Ann; Wall, Darci; Witzig, Thomas E; Ristow, Kay; Yahalom, Joachim; Moskowitz, Craig H; Zelenetz, Andrew D

    2016-10-01

    Disease bulk is an important prognostic factor in early stage Hodgkin lymphoma, but its definition is unclear in the computed tomography era. This retrospective analysis investigated the prognostic significance of bulky disease measured in transverse and coronal planes on computed tomography imaging. Early stage Hodgkin lymphoma patients (n=185) treated with chemotherapy with or without radiotherapy from 2000-2010 were included. The longest diameter of the largest lymph node mass was measured in transverse and coronal axes on pre-treatment imaging. The optimal cut off for disease bulk was maximal diameter greater than 7 cm measured in either the transverse or coronal plane. Thirty patients with maximal transverse diameter of 7 cm or under were found to have bulk in coronal axis. The 4-year overall survival was 96.5% (CI: 93.3%, 100%) and 4-year relapse-free survival was 86.8% (CI: 81.9%, 92.1%) for all patients. Relapse-free survival at four years for bulky patients was 80.5% (CI: 73%, 88.9%) compared to 94.4% (CI: 89.1%, 100%) for non-bulky; Cox HR 4.21 (CI: 1.43, 12.38) (P=0.004). In bulky patients, relapse-free survival was not impacted in patients treated with chemoradiotherapy; however, it was significantly lower in patients treated with chemotherapy alone. In an independent validation cohort of 38 patients treated with chemotherapy alone, patients with bulky disease had an inferior relapse-free survival [at 4 years, 71.1% (CI: 52.1%, 97%) vs 94.1% (CI: 83.6%, 100%), Cox HR 5.27 (CI: 0.62, 45.16); P=0.09]. Presence of bulky disease on multidimensional computed tomography imaging is a significant prognostic factor in early stage Hodgkin lymphoma. Coronal reformations may be included for routine Hodgkin lymphoma staging evaluation. In future, our definition of disease bulk may be useful in identifying patients who are most appropriate for chemotherapy alone. Copyright© Ferrata Storti Foundation.

  8. SU-E-T-222: Computational Optimization of Monte Carlo Simulation On 4D Treatment Planning Using the Cloud Computing Technology

    International Nuclear Information System (INIS)

    Chow, J

    2015-01-01

    Purpose: This study evaluated the efficiency of 4D lung radiation treatment planning using Monte Carlo simulation on the cloud. The EGSnrc Monte Carlo code was used in dose calculation on the 4D-CT image set. Methods: The 4D lung radiation treatment plan was created by the DOSCTP linked to the cloud, based on the Amazon elastic compute cloud platform. Dose calculation was carried out by Monte Carlo simulation on the 4D-CT image set on the cloud, and results were sent to the FFD4D image deformation program for dose reconstruction. The dependence of computing time for the treatment plan on the number of compute nodes was optimized with variations of the number of CT image sets in the breathing cycle and the dose reconstruction time of the FFD4D. Results: It is found that the dependence of computing time on the number of compute nodes was affected by the diminishing return of the number of nodes used in Monte Carlo simulation. Moreover, the performance of the 4D treatment planning could be optimized by using fewer than 10 compute nodes on the cloud. The effects of the number of image sets and dose reconstruction time on the dependence of computing time on the number of nodes were not significant when more than 15 compute nodes were used in Monte Carlo simulations. Conclusion: The issue of long computing time in 4D treatment planning, requiring Monte Carlo dose calculations in all CT image sets in the breathing cycle, can be solved using cloud computing technology. It is concluded that the optimized number of compute nodes selected in simulation should be between 5 and 15, as the dependence of computing time on the number of nodes is significant

  9. SU-E-T-222: Computational Optimization of Monte Carlo Simulation On 4D Treatment Planning Using the Cloud Computing Technology

    Energy Technology Data Exchange (ETDEWEB)

    Chow, J [Princess Margaret Cancer Center, Toronto, ON (Canada)

    2015-06-15

    Purpose: This study evaluated the efficiency of 4D lung radiation treatment planning using Monte Carlo simulation on the cloud. The EGSnrc Monte Carlo code was used in dose calculation on the 4D-CT image set. Methods: The 4D lung radiation treatment plan was created by the DOSCTP linked to the cloud, based on the Amazon elastic compute cloud platform. Dose calculation was carried out by Monte Carlo simulation on the 4D-CT image set on the cloud, and results were sent to the FFD4D image deformation program for dose reconstruction. The dependence of computing time for the treatment plan on the number of compute nodes was optimized with variations of the number of CT image sets in the breathing cycle and the dose reconstruction time of the FFD4D. Results: It is found that the dependence of computing time on the number of compute nodes was affected by the diminishing return of the number of nodes used in Monte Carlo simulation. Moreover, the performance of the 4D treatment planning could be optimized by using fewer than 10 compute nodes on the cloud. The effects of the number of image sets and dose reconstruction time on the dependence of computing time on the number of nodes were not significant when more than 15 compute nodes were used in Monte Carlo simulations. Conclusion: The issue of long computing time in 4D treatment planning, requiring Monte Carlo dose calculations in all CT image sets in the breathing cycle, can be solved using cloud computing technology. It is concluded that the optimized number of compute nodes selected in simulation should be between 5 and 15, as the dependence of computing time on the number of nodes is significant.
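
    The diminishing return with node count reported in this abstract is what a simple fixed-overhead scaling model predicts: the Monte Carlo part divides among nodes while the dose-reconstruction step does not. The sketch below uses invented timings (not the study's measurements) purely to illustrate why gains flatten beyond roughly 10-15 nodes.

      # Toy scaling model: total planning time = fixed serial part (e.g. FFD4D dose
      # reconstruction) + Monte Carlo part divided among cloud compute nodes.
      # The timings below are assumptions for illustration, not measured values.
      def planning_time(nodes, mc_time=600.0, recon_time=120.0):
          """Estimated 4D planning time (minutes) when using `nodes` compute nodes."""
          return recon_time + mc_time / nodes

      for n in (1, 5, 10, 15, 20, 40):
          print(f"{n:3d} nodes -> {planning_time(n):7.1f} min")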

  10. Sense of coherence is significantly associated with both metabolic syndrome and lifestyle in Japanese computer software office workers

    Directory of Open Access Journals (Sweden)

    Yusaku Morita

    2014-12-01

    Full Text Available Objectives: Sense of coherence (SOC) is an individual characteristic related to a positive life orientation, leading to effective coping. Little is known about the relationship between SOC and metabolic syndrome (MetS). This cross-sectional study aimed at testing the hypothesis that workers with a strong SOC have fewer atherosclerotic risk factors, including MetS, and healthier lifestyle behaviors. Material and Methods: One hundred and sixty-seven computer software workers aged 20–64 years underwent a periodical health examination including assessment of body mass index, waist circumference, blood pressure, blood lipid levels, fasting blood sugar (FBS) levels and lifestyle behaviors (walking duration, smoking status, nutrition, alcohol consumption, and sleep duration). During this period, the participants also completed a 29-item questionnaire of SOC and the Brief Job Stress Questionnaire to assess job stressors such as job strain and workplace social support. Results: Our results showed that the participants with a stronger SOC were likely to walk for at least 1 h a day, to eat slowly or at a moderate speed, and to sleep for at least 6 h. Compared with the participants with the weakest SOC, those with the strongest SOC had a significantly lower odds ratio (OR) for being overweight (OR = 0.31; 95% confidence interval (CI): 0.11–0.81), and having higher FBS levels (OR = 0.11; 95% CI: 0.02–0.54), dyslipidemia (OR = 0.29; 95% CI: 0.09–0.84), and MetS (OR = 0.12; 95% CI: 0.02–0.63), even after adjusting for age, gender and job stressors. Conclusions: High SOC is associated with a healthy lifestyle and fewer atherosclerotic risk factors, including MetS.

  11. Advances in ATLAS@Home towards a major ATLAS computing resource

    CERN Document Server

    Cameron, David; The ATLAS collaboration

    2018-01-01

    The volunteer computing project ATLAS@Home has been providing a stable computing resource for the ATLAS experiment since 2013. It has recently undergone some significant developments and as a result has become one of the largest resources contributing to ATLAS computing, by expanding its scope beyond traditional volunteers and into exploitation of idle computing power in ATLAS data centres. Removing the need for virtualization on Linux and instead using container technology has made the entry barrier for data centre participation significantly lower, and in this paper we describe the implementation and results of this change. We also present other recent changes and improvements in the project. In early 2017 the ATLAS@Home project was merged into a combined LHC@Home platform, providing a unified gateway to all CERN-related volunteer computing projects. The ATLAS Event Service shifts data processing from file-level to event-level and we describe how ATLAS@Home was incorporated into this new paradigm. The finishing...

  12. GPU-computing in econophysics and statistical physics

    Science.gov (United States)

    Preis, T.

    2011-03-01

    A recent trend in computer science and related fields is general purpose computing on graphics processing units (GPUs), which can yield impressive performance. With multiple cores connected by high memory bandwidth, today's GPUs offer resources for non-graphics parallel processing. This article provides a brief introduction to the field of GPU computing and includes examples. In particular, computationally expensive analyses employed in a financial market context are coded on a graphics card architecture, which leads to a significant reduction of computing time. In order to demonstrate the wide range of possible applications, a standard model in statistical physics - the Ising model - is ported to a graphics card architecture as well, resulting in large speedup values.
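
    The Ising model maps well to GPUs because a checkerboard decomposition lets half of the lattice sites be updated simultaneously. The sketch below is a minimal CPU/NumPy illustration of that data-parallel update pattern, not the author's GPU code; lattice size, temperature and sweep count are arbitrary.

      import numpy as np

      rng = np.random.default_rng(0)

      def checkerboard_sweep(spins, beta, J=1.0):
          """One Metropolis sweep of the 2D Ising model with a checkerboard update.

          Sites of one colour have no nearest-neighbour couplings to each other, so
          they can be flipped in parallel -- the structure exploited by GPU threads.
          """
          ii, jj = np.indices(spins.shape)
          for colour in (0, 1):
              mask = (ii + jj) % 2 == colour
              nbrs = (np.roll(spins, 1, 0) + np.roll(spins, -1, 0) +
                      np.roll(spins, 1, 1) + np.roll(spins, -1, 1))   # periodic boundaries
              dE = 2.0 * J * spins * nbrs                             # energy cost of flipping
              accept = rng.random(spins.shape) < np.exp(-beta * dE)   # Metropolis criterion
              spins[mask & accept] *= -1
          return spins

      spins = rng.choice([-1, 1], size=(64, 64))
      for _ in range(200):
          checkerboard_sweep(spins, beta=0.6)
      print("|magnetisation| per spin:", abs(spins.mean()))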

  13. VX hydrolysis by human serum paraoxonase 1: a comparison of experimental and computational results.

    Directory of Open Access Journals (Sweden)

    Matthew W Peterson

    Full Text Available Human Serum paraoxonase 1 (HuPON1) is an enzyme that has been shown to hydrolyze a variety of chemicals including the nerve agent VX. While wildtype HuPON1 does not exhibit sufficient activity against VX to be used as an in vivo countermeasure, it has been suggested that increasing HuPON1's organophosphorous hydrolase activity by one or two orders of magnitude would make the enzyme suitable for this purpose. The binding interaction between HuPON1 and VX has recently been modeled, but the mechanism for VX hydrolysis is still unknown. In this study, we created a transition state model for VX hydrolysis (VX(ts)) in water using quantum mechanical/molecular mechanical simulations, and docked the transition state model to 22 experimentally characterized HuPON1 variants using AutoDock Vina. The HuPON1-VX(ts) complexes were grouped by reaction mechanism using a novel clustering procedure. The average Vina interaction energies for different clusters were compared to the experimentally determined activities of HuPON1 variants to determine which computational procedures best predict how well HuPON1 variants will hydrolyze VX. The analysis showed that only conformations which have the attacking hydroxyl group of VX(ts) coordinated by the sidechain oxygen of D269 have a significant correlation with experimental results. The results from this study can be used for further characterization of how HuPON1 hydrolyzes VX and design of HuPON1 variants with increased activity against VX.
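
    The key computational step described here, correlating cluster-averaged AutoDock Vina interaction energies with measured enzymatic activities, can be sketched in a few lines. The energies and activities below are invented placeholders (not the study's values), and scipy is assumed to be available.

      import numpy as np
      from scipy import stats

      # Hypothetical cluster-averaged Vina interaction energies (kcal/mol) for a set of
      # HuPON1 variants, paired with hypothetical experimentally measured VX-hydrolysis
      # activities for the same variants.
      vina_energy = np.array([-6.8, -6.1, -7.2, -5.9, -6.5, -7.0, -6.3])
      activity = np.array([0.42, 0.18, 0.61, 0.12, 0.30, 0.55, 0.25])

      r, p_r = stats.pearsonr(vina_energy, activity)
      rho, p_rho = stats.spearmanr(vina_energy, activity)
      print(f"Pearson  r   = {r:+.2f} (p = {p_r:.3f})")
      print(f"Spearman rho = {rho:+.2f} (p = {p_rho:.3f})")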

  14. Computational fluid dynamics in three dimensional angiography: Preliminary hemodynamic results of various proximal geometry

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ha Youn; Park, Sung Tae; Bae, Won Kyoung; Goo, Dong Erk [Dept. of Radiology, Soonchunhyang University Hospital, Seoul (Korea, Republic of)

    2014-12-15

    We studied the influence of proximal geometry on the results of computational fluid dynamics (CFD). We made five models of different proximal geometry from three-dimensional angiography of a 63-year-old woman with an intracranial aneurysm. CFD results were analyzed as peak systolic velocity (PSV) at the inlet and outlet, as well as the flow velocity profile proximal to the internal carotid artery (ICA) aneurysm. The modified cavernous model with proximal tubing showed a faster PSV at the outlet than at the inlet. The PSVs at the outlets of the other models were slower than those at their inlets. The flow velocity profiles immediately proximal to the ICA aneurysm showed similar patterns in all models, suggesting that proximal vessel geometries could affect CFD results.

  15. Two-dimensional computer simulation of hypervelocity impact cratering: some preliminary results for Meteor Crater, Arizona

    International Nuclear Information System (INIS)

    Bryan, J.B.; Burton, D.E.; Cunningham, M.E.; Lettis, L.A. Jr.

    1978-04-01

    A computational approach used for subsurface explosion cratering has been extended to hypervelocity impact cratering. Meteor (Barringer) Crater, Arizona, was selected for our first computer simulation because it was the most thoroughly studied. It is also an excellent example of a simple, bowl-shaped crater and is one of the youngest terrestrial impact craters. Shoemaker estimates that the impact occurred about 20,000 to 30,000 years ago [Roddy (1977)]. Initial conditions for this calculation included a meteorite impact velocity of 15 km/s, a meteorite mass of 1.57E+08 kg, and a corresponding kinetic energy of 1.88E+16 J (4.5 megatons). A two-dimensional Eulerian finite difference code called SOIL was used for this simulation of a cylindrical iron projectile impacting at normal incidence into a limestone target. For this initial calculation a Tillotson equation-of-state description for iron and limestone was used with no shear strength. A color movie based on this calculation was produced using computer-generated graphics. Results obtained for this preliminary calculation of the formation of Meteor Crater, Arizona, are in good agreement with Meteor Crater measurements

  16. Development of a Computational Steering Framework for High Performance Computing Environments on Blue Gene/P Systems

    KAUST Repository

    Danani, Bob K.

    2012-01-01

    of simulation results in a post-processing step is now transformed into a real-time interactive workflow that significantly reduces development and testing time. Computational steering provides the capability to direct or re-direct the progress of a simulation

  17. Computation and measurement of air temperature distribution of an industrial melt blowing die

    Directory of Open Access Journals (Sweden)

    Wu Li-Li

    2014-01-01

    Full Text Available The air flow field of the dual slot die of an HDF-6D melt blowing non-woven machine is computed numerically. A temperature measurement system is built to measure air temperatures. The computed results agree with the measurements, confirming the correctness of the computation. The results are of great practical significance for actual melt blowing production.

  18. Computer self-efficacy - is there a gender gap in tertiary level introductory computing classes?

    Directory of Open Access Journals (Sweden)

    Shirley Gibbs

    Full Text Available This paper explores the relationship between introductory computing students, self-efficacy, and gender. Since the use of computers has become more common there has been speculation that the confidence and ability to use them differs between genders. Self-efficacy is an important and useful concept used to describe how a student may perceive their own ability or confidence in using and learning new technology. A survey of students in an introductory computing class has been completed intermittently since the late 1990s. Although some questions have been adapted to reflect changing technology, the aim of the survey has remained unchanged. In this study self-efficacy is measured using two self-rating questions. Students are asked to rate their confidence using a computer and also asked to give their perception of their computing knowledge. This paper examines these two aspects of a person's computer self-efficacy in order to identify any differences that may occur between genders in two introductory computing classes, one in 1999 and the other in 2012. Results from the 1999 survey are compared with those from the survey completed in 2012 and investigated to ascertain if the perception that males were more likely to display higher computer self-efficacy levels than their female classmates does or did exist in a class of this type. Results indicate that while overall there has been a general increase in self-efficacy levels in 2012 compared with 1999, there is no significant gender gap.

  19. Numerical computation of homogeneous slope stability.

    Science.gov (United States)

    Xiao, Shuangshuang; Li, Kemin; Ding, Xiaohua; Liu, Tong

    2015-01-01

    To simplify the computational process of homogeneous slope stability, improve computational accuracy, and find multiple potential slip surfaces of a complex geometric slope, this study utilized the limit equilibrium method to derive expression equations of overall and partial factors of safety. This study transformed the search for the minimum factor of safety (FOS) into a constrained nonlinear programming problem and applied an exhaustive method (EM) and a particle swarm optimization algorithm (PSO) to this problem. In simple slope examples, the computational results using an EM and PSO were close to those obtained using other methods. Compared to the EM, the PSO had a small computation error and a significantly shorter computation time. As a result, the PSO could precisely calculate the slope FOS with high efficiency. The example of the multistage slope analysis indicated that this slope had two potential slip surfaces. The factors of safety were 1.1182 and 1.1560, respectively. The differences between these and the minimum FOS (1.0759) were small, but the positions of the slip surfaces were completely different from the critical slip surface (CSS).
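
    For readers unfamiliar with the optimization step, the sketch below shows a generic particle swarm search for the minimum factor of safety over slip-circle parameters. The objective function is only a smooth placeholder standing in for the paper's limit-equilibrium FOS evaluation, and all bounds and PSO constants are assumptions.

      import numpy as np

      rng = np.random.default_rng(1)

      def fos(x):
          """Placeholder for the limit-equilibrium factor of safety as a function of
          slip-circle parameters (xc, yc, radius); the real evaluation is not reproduced."""
          xc, yc, r = x
          return 1.2 + 0.01 * ((xc - 12.0) ** 2 + (yc - 20.0) ** 2 + (r - 15.0) ** 2)

      def pso_minimize(f, lower, upper, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
          dim = len(lower)
          pos = rng.uniform(lower, upper, size=(n_particles, dim))
          vel = np.zeros_like(pos)
          pbest, pbest_val = pos.copy(), np.apply_along_axis(f, 1, pos)
          gbest = pbest[np.argmin(pbest_val)].copy()
          gbest_val = pbest_val.min()
          for _ in range(iters):
              r1, r2 = rng.random((2, n_particles, dim))
              vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
              pos = np.clip(pos + vel, lower, upper)
              vals = np.apply_along_axis(f, 1, pos)
              better = vals < pbest_val
              pbest[better], pbest_val[better] = pos[better], vals[better]
              if pbest_val.min() < gbest_val:
                  gbest_val = pbest_val.min()
                  gbest = pbest[np.argmin(pbest_val)].copy()
          return gbest, gbest_val

      best_x, best_fos = pso_minimize(fos,
                                      lower=np.array([0.0, 10.0, 5.0]),
                                      upper=np.array([30.0, 40.0, 30.0]))
      print("critical slip-circle parameters:", np.round(best_x, 2), " min FOS:", round(best_fos, 4))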

  20. Low velocity gunshot wounds result in significant contamination regardless of ballistic characteristics.

    Science.gov (United States)

    Weinstein, Joseph; Putney, Emily; Egol, Kenneth

    2014-01-01

    Controversy exists among the orthopedic community regarding the treatment of gunshot injuries. No consistent treatment algorithm exists for low-energy gunshot wound (GSW) trauma. The purpose of this study was to critically examine the wound contamination following low velocity GSW based upon bullet caliber and clothing fiber type found within the injury track. Four types of handguns were fired at ballistic gel from a 10-foot distance. Various clothing materials (denim, cotton, polyester, and wool) were applied circumferentially around the tissue agar in a loose manner. A total of 32 specimens were examined. Each caliber handgun was fired a minimum of 5 times into a gel. Regardless of bullet caliber, there was gross contamination of the entire bullet track in 100% of specimens in all scenarios and for all fiber types. Furthermore, as would be expected, the degree of contamination appeared to increase as the size of the bullet increased. Low velocity GSWs result in significant contamination regardless of bullet caliber and jacket type. Based upon our results, further investigation of low velocity GSW tracks is warranted. Further clinical investigation should focus on the degree to which debridement should be undertaken.

  1. Prognostic Significance of Tumor Size of Small Lung Adenocarcinomas Evaluated with Mediastinal Window Settings on Computed Tomography

    OpenAIRE

    Sakao, Yukinori; Kuroda, Hiroaki; Mun, Mingyon; Uehara, Hirofumi; Motoi, Noriko; Ishikawa, Yuichi; Nakagawa, Ken; Okumura, Sakae

    2014-01-01

    BACKGROUND: We aimed to clarify that the size of the lung adenocarcinoma evaluated using mediastinal window on computed tomography is an important and useful modality for predicting invasiveness, lymph node metastasis and prognosis in small adenocarcinoma. METHODS: We evaluated 176 patients with small lung adenocarcinomas (diameter, 1-3 cm) who underwent standard surgical resection. Tumours were examined using computed tomography with thin section conditions (1.25 mm thick on high-resolution ...

  2. High Performance Computing in Science and Engineering '99 : Transactions of the High Performance Computing Center

    CERN Document Server

    Jäger, Willi

    2000-01-01

    The book contains reports about the most significant projects from science and engineering of the Federal High Performance Computing Center Stuttgart (HLRS). They were carefully selected in a peer-review process and are showcases of an innovative combination of state-of-the-art modeling, novel algorithms and the use of leading-edge parallel computer technology. The projects of HLRS are using supercomputer systems operated jointly by university and industry and therefore a special emphasis has been put on the industrial relevance of results and methods.

  3. Significance of cranial computer tomography for the early diagnosis of peri- and postnatal damage

    Energy Technology Data Exchange (ETDEWEB)

    Richter, E I

    1981-01-01

    The technical possibilities of examination with craniocerebral computed tomography in the peri- and postnatal period are reported. Typical tomographic images from a 17 1/2-month period in our own patient material of 327 children are presented. The special advantages of this new, technically demanding method are exact diagnoses, the possibility of longitudinal follow-up, and its complete harmlessness to the child.

  4. Modelling the Intention to Adopt Cloud Computing Services: A Transaction Cost Theory Perspective

    Directory of Open Access Journals (Sweden)

    Ogan Yigitbasioglu

    2014-11-01

    Full Text Available This paper uses transaction cost theory to study cloud computing adoption. A model is developed and tested with data from an Australian survey. According to the results, perceived vendor opportunism and perceived legislative uncertainty around cloud computing were significantly associated with perceived cloud computing security risk. There was also a significant negative relationship between perceived cloud computing security risk and the intention to adopt cloud services. This study also reports on adoption rates of cloud computing in terms of applications, as well as the types of services used.

  5. Iranian EFL Teachers' Sense of Professional Identity and their Computer Literacy

    Directory of Open Access Journals (Sweden)

    Toktam Abtahi

    2016-03-01

    Full Text Available This study examines Iranian EFL teachers' sense of professional identity and their computer literacy. To this end, 718 EFL teachers from different cities in Iran filled out job satisfaction, occupational commitment, and computer literacy questionnaires. SPSS software was employed to summarize the collected data. Independent Sample t-tests and Pearson Product-Moment Correlations were run to check the level of significance. For qualitative data collection, five open-ended questions were added to the end of the job satisfaction questionnaire. The obtained answers were categorized and the frequency for each category was calculated. The results revealed that computer literacy has a significant relationship with continuance commitment, job satisfaction, and gender. The results further suggested that teachers' computer literacy provides an encouraging base for their professional identity.

  6. Computer-aided design and computer science technology

    Science.gov (United States)

    Fulton, R. E.; Voigt, S. J.

    1976-01-01

    A description is presented of computer-aided design requirements and the resulting computer science advances needed to support aerospace design. The aerospace design environment is examined, taking into account problems of data handling and aspects of computer hardware and software. The interactive terminal is normally the primary interface between the computer system and the engineering designer. Attention is given to user aids, interactive design, interactive computations, the characteristics of design information, data management requirements, hardware advancements, and computer science developments.

  7. On the elastostatic significance of four boundary integrals involving biharmonic functions

    DEFF Research Database (Denmark)

    Christiansen, Søren

    1998-01-01

    For a biharmonic function U, depending upon two space variables, it is known that four curve integrals, which involve U and some derivatives of U evaluated at a closed boundary, must be equal to zero. When U plays the role of an Airy stress function, we investigate the elastostatic significance o...... with the values of the four integrals. The computer algebra system Maple V has been an invaluable tool. By suitable comparisons among the various results obtained we are led to the conclusions about the elastostatic significance of the integrals....

  8. The significance of routine thoracic computed tomography in patients with blunt chest trauma.

    Science.gov (United States)

    Çorbacıoğlu, Seref Kerem; Er, Erhan; Aslan, Sahin; Seviner, Meltem; Aksel, Gökhan; Doğan, Nurettin Özgür; Güler, Sertaç; Bitir, Aysen

    2015-05-01

    The purpose of this study is to investigate whether the use of thoracic computed tomography (TCT) as part of nonselective computed tomography (CT) guidelines is superior to selective CT during the diagnosis of blunt chest trauma. This study was planned as a prospective cohort study, and it was conducted at the emergency department between 2013 and 2014. A total of 260 adult patients who did not meet the exclusion criteria were enrolled in the study. All patients were evaluated by an emergency physician, and their primary surveys were completed based on the Advanced Trauma Life Support (ATLS) principles. Based on the initial findings and ATLS recommendations, patients in whom thoracic CT was indicated were determined (selective CT group). Routine CTs were then performed on all patients. Thoracic injuries were found in 97 (37.3%) patients following routine TCT. In 53 (20%) patients, thoracic injuries were found by selective CT. Routine TCT was able to detect chest injury in 44 (16%) patients for whom selective TCT would not otherwise be ordered based on the EP evaluation (nonselective TCT group). Five (2%) patients in this nonselective TCT group required tube thoracostomy, while there was no additional treatment provided for thoracic injuries in the remaining 39 (15%). In conclusion, we found that the nonselective TCT method was superior to the selective TCT method in detecting thoracic injuries in patients with blunt trauma. Furthermore, we were able to demonstrate that the nonselective TCT method can change the course of patient management albeit at low rates. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. A Study of the Correlation between Computer Games and Adolescent Behavioral Problems.

    Science.gov (United States)

    Shokouhi-Moqhaddam, Solmaz; Khezri-Moghadam, Noshiravan; Javanmard, Zeinab; Sarmadi-Ansar, Hassan; Aminaee, Mehran; Shokouhi-Moqhaddam, Majid; Zivari-Rahman, Mahmoud

    2013-01-01

    Today, due to developing communicative technologies, computer games and other audio-visual media, as social phenomena, are very attractive and have a great effect on children and adolescents. The increasing popularity of these games among children and adolescents results in public uncertainty about their plausible harmful effects. This study aimed to investigate the correlation between computer games and behavioral problems in male guidance school students. This was a descriptive-correlative study on 384 randomly chosen male guidance school students. They were asked to answer the researcher's questionnaire about computer games and Achenbach's Youth Self-Report (YSR). The results of this study indicated a statistically significant direct correlation (at the 95% confidence level) between the amount of game playing among adolescents and anxiety/depression, withdrawn/depression, rule-breaking behaviors, aggression, and social problems. However, there was no statistically significant correlation between the amount of computer game usage and physical complaints, thinking problems, and attention problems. In addition, there was a significant correlation between the students' place of living and their parents' job, and using computer games. Computer games lead to anxiety, depression, withdrawal, rule-breaking behavior, aggression, and social problems in adolescents.

  10. A Study of the Correlation between Computer Games and Adolescent Behavioral Problems

    Science.gov (United States)

    Shokouhi-Moqhaddam, Solmaz; Khezri-Moghadam, Noshiravan; Javanmard, Zeinab; Sarmadi-Ansar, Hassan; Aminaee, Mehran; Shokouhi-Moqhaddam, Majid; Zivari-Rahman, Mahmoud

    2013-01-01

    Background Today, due to developing communicative technologies, computer games and other audio-visual media, as social phenomena, are very attractive and have a great effect on children and adolescents. The increasing popularity of these games among children and adolescents results in public uncertainty about their plausible harmful effects. This study aimed to investigate the correlation between computer games and behavioral problems in male guidance school students. Methods This was a descriptive-correlative study on 384 randomly chosen male guidance school students. They were asked to answer the researcher's questionnaire about computer games and Achenbach’s Youth Self-Report (YSR). Findings The results of this study indicated a statistically significant direct correlation (at the 95% confidence level) between the amount of game playing among adolescents and anxiety/depression, withdrawn/depression, rule-breaking behaviors, aggression, and social problems. However, there was no statistically significant correlation between the amount of computer game usage and physical complaints, thinking problems, and attention problems. In addition, there was a significant correlation between the students’ place of living and their parents’ job, and using computer games. Conclusion Computer games lead to anxiety, depression, withdrawal, rule-breaking behavior, aggression, and social problems in adolescents. PMID:24494157

  11. Abdominal alterations in disseminated paracoccidioidomycosis: computed tomography findings

    Energy Technology Data Exchange (ETDEWEB)

    Vermelho, Marli Batista Fernandes; Correia, Ademir Silva; Michailowsky, Tania Cibele de Almeida; Suzart, Elizete Kazumi Kuniyoshi; Ibanes, Aline Santos; Almeida, Lanamar Aparecida; Khoury, Zarifa; Barba, Mario Flores, E-mail: marlivermelho@globo.com [Instituto de Infectologia Emilio Ribas (IIER), Sao Paulo, SP (Brazil)

    2015-03-15

    Objective: to evaluate the incidence and spectrum of abdominal computed tomography imaging findings in patients with paracoccidioidomycosis. Materials and methods: retrospective analysis of abdominal computed tomography images of 26 patients with disseminated paracoccidioidomycosis. Results: abnormal abdominal tomographic findings were observed in 18 patients (69.2%), while no significant finding was observed in the other 8 (30.8%) patients. Conclusion: computed tomography has demonstrated to play a relevant role in the screening and detection of abdominal abnormalities in patients with disseminated paracoccidioidomycosis. (author)

  12. Prevalance of neck pain in computer users

    International Nuclear Information System (INIS)

    Sabeen, F.; Bashir, M.S.; Hussain, S.I.

    2013-01-01

    Prolonged use of computers during daily work activities and recreation is often cited as a cause of neck pain. Neck pain and computer use are clearly connected due to extended periods of sitting in a fixed position with no breaks to stretch the neck muscles. Prolonged computer use with the neck bent forward causes the anterior neck muscles to gradually become shorter and tighter, while the muscles in the back of the neck grow longer and weaker. These changes lead to the development of neck pain. Objectives: To find the incidence of neck pain in computer users, the association between neck pain and prolonged sitting in a poor posture, the effect of breaks during prolonged work, and the association between the type of chair used during prolonged sitting and the occurrence of neck pain. Methodology: For this observational study, data were collected through questionnaires from office workers (computer users) and students. Results: Of 50 persons, 72% of computer users had neck pain. A strong association was found between neck pain and prolonged computer use (p = 0.001). Those who took breaks during their work had less neck pain. No significant association was found between the type of chair in use and neck pain. Neck pain and the type of system in use also had no significant association. Conclusion: Duration of computer use and frequency of breaks are associated with neck pain at work. Severe neck pain was found in people who use a computer for more than 5 hours a day. (author)

  13. Cloud Computing Bible

    CERN Document Server

    Sosinsky, Barrie

    2010-01-01

    The complete reference guide to the hot technology of cloud computingIts potential for lowering IT costs makes cloud computing a major force for both IT vendors and users; it is expected to gain momentum rapidly with the launch of Office Web Apps later this year. Because cloud computing involves various technologies, protocols, platforms, and infrastructure elements, this comprehensive reference is just what you need if you'll be using or implementing cloud computing.Cloud computing offers significant cost savings by eliminating upfront expenses for hardware and software; its growing popularit

  14. An Emotional Agent Model Based on Granular Computing

    Directory of Open Access Journals (Sweden)

    Jun Hu

    2012-01-01

    Full Text Available Affective computing is of great significance for intelligent information processing and for harmonious communication between humans and computers. A new emotional agent model is proposed in this paper to give agents the ability to handle emotions, based on granular computing theory and the traditional BDI agent model. Firstly, a new emotion knowledge base based on granular computing for emotion expression is presented in the model. Secondly, a new emotional reasoning algorithm based on granular computing is proposed. Thirdly, a new emotional agent model based on granular computing is presented. Finally, based on the model, an emotional agent for patient assistance in hospital is realized; experimental results show that it handles simple emotions efficiently.

  15. International Conference of Intelligence Computation and Evolutionary Computation ICEC 2012

    CERN Document Server

    Intelligence Computation and Evolutionary Computation

    2013-01-01

    2012 International Conference of Intelligence Computation and Evolutionary Computation (ICEC 2012) is held on July 7, 2012 in Wuhan, China. This conference is sponsored by Information Technology & Industrial Engineering Research Center.  ICEC 2012 is a forum for presentation of new research results of intelligent computation and evolutionary computation. Cross-fertilization of intelligent computation, evolutionary computation, evolvable hardware and newly emerging technologies is strongly encouraged. The forum aims to bring together researchers, developers, and users from around the world in both industry and academia for sharing state-of-art results, for exploring new areas of research and development, and to discuss emerging issues facing intelligent computation and evolutionary computation.

  16. Effective Computer-Aided Assessment of Mathematics; Principles, Practice and Results

    Science.gov (United States)

    Greenhow, Martin

    2015-01-01

    This article outlines some key issues for writing effective computer-aided assessment (CAA) questions in subjects with substantial mathematical or statistical content, especially the importance of control of random parameters and the encoding of wrong methods of solution (mal-rules) commonly used by students. The pros and cons of using CAA and…

  17. Low density in liver of idiopathic portal hypertension. A computed tomographic observation with possible diagnostic significance

    Energy Technology Data Exchange (ETDEWEB)

    Ishito, Hiroyuki

    1988-01-01

    In order to evaluate the diagnostic value of low density in liver on computed tomography (CT), CT scans of 11 patients with idiopathic portal hypertension (IPH) were compared with those from 22 cirrhotic patients, two patients with scarred liver and 16 normal subjects. Low densities on plain CT scans in patients with IPH were distinctly different from those observed in normal liver. Some of the low densities had irregular shape with unclear margin and were scattered near the liver surface, and others had vessel-like structures with unclear margin and extended as far as near the liver surface. Ten of the 11 patients with IPH had low densities mentioned above, while none of the 22 cirrhotic patients had such low densities. The present results suggest that the presence of low densities in liver on plain CT scan is clinically beneficial in diagnosis of IPH.

  18. Computing quantum discord is NP-complete

    International Nuclear Information System (INIS)

    Huang, Yichen

    2014-01-01

    We study the computational complexity of quantum discord (a measure of quantum correlation beyond entanglement), and prove that computing quantum discord is NP-complete. Therefore, quantum discord is computationally intractable: the running time of any algorithm for computing quantum discord is believed to grow exponentially with the dimension of the Hilbert space so that computing quantum discord in a quantum system of moderate size is not possible in practice. As by-products, some entanglement measures (namely entanglement cost, entanglement of formation, relative entropy of entanglement, squashed entanglement, classical squashed entanglement, conditional entanglement of mutual information, and broadcast regularization of mutual information) and constrained Holevo capacity are NP-hard/NP-complete to compute. These complexity-theoretic results are directly applicable in common randomness distillation, quantum state merging, entanglement distillation, superdense coding, and quantum teleportation; they may offer significant insights into quantum information processing. Moreover, we prove the NP-completeness of two typical problems: linear optimization over classical states and detecting classical states in a convex set, providing evidence that working with classical states is generically computationally intractable. (paper)

  19. Computer versus paper--does it make any difference in test performance?

    Science.gov (United States)

    Karay, Yassin; Schauber, Stefan K; Stosch, Christoph; Schüttpelz-Brauns, Katrin

    2015-01-01

    CONSTRUCT: In this study, we examine the differences in test performance between the paper-based and the computer-based version of the Berlin formative Progress Test. In this context it is the first study that allows controlling for students' prior performance. Computer-based tests make possible a more efficient examination procedure for test administration and review. Although university staff will benefit largely from computer-based tests, the question arises if computer-based tests influence students' test performance. A total of 266 German students from the 9th and 10th semester of medicine (comparable with the 4th-year North American medical school schedule) participated in the study (paper = 132, computer = 134). The allocation of the test format was conducted as a randomized matched-pair design in which students were first sorted according to their prior test results. The organizational procedure, the examination conditions, the room, and seating arrangements, as well as the order of questions and answers, were identical in both groups. The sociodemographic variables and pretest scores of both groups were comparable. The test results from the paper and computer versions did not differ. The groups remained within the allotted time, but students using the computer version (particularly the high performers) needed significantly less time to complete the test. In addition, we found significant differences in guessing behavior. Low performers using the computer version guess significantly more than low-performing students in the paper-pencil version. Participants in computer-based tests are not at a disadvantage in terms of their test results. The computer-based test required less processing time. The reason for the longer processing time when using the paper-pencil version might be due to the time needed to write the answer down, controlling for transferring the answer correctly. It is still not known why students using the computer version (particularly low

  20. Significant ELCAP analysis results: Summary report. [End-use Load and Consumer Assessment Program

    Energy Technology Data Exchange (ETDEWEB)

    Pratt, R.G.; Conner, C.C.; Drost, M.K.; Miller, N.E.; Cooke, B.A.; Halverson, M.A.; Lebaron, B.A.; Lucas, R.G.; Jo, J.; Richman, E.E.; Sandusky, W.F. (Pacific Northwest Lab., Richland, WA (USA)); Ritland, K.G. (Ritland Associates, Seattle, WA (USA)); Taylor, M.E. (USDOE Bonneville Power Administration, Portland, OR (USA)); Hauser, S.G. (Solar Energy Research Inst., Golden, CO (USA))

    1991-02-01

    The evolution of the End-Use Load and Consumer Assessment Program (ELCAP) since 1983 at Bonneville Power Administration (Bonneville) has been eventful and somewhat tortuous. The birth pangs of a data set so large and encompassing as this have been overwhelming at times. The early adolescent stage of data set development and use has now been reached and preliminary results of early analyses of the data are becoming well known. However, the full maturity of the data set and the corresponding wealth of analytic insights are not fully realized. This document is in some sense a milestone in the brief history of the program. It is a summary of the results of the first five years of the program, principally containing excerpts from a number of previous reports. It is meant to highlight significant accomplishments and analytical results, with a focus on the principal results. Many of the results have a broad application in the utility load research community in general, although the real breadth of the data set remains largely unexplored. The first section of the document introduces the data set: how the buildings were selected, how the metering equipment was installed, and how the data set has been prepared for analysis. Each of the sections that follow the introduction summarize a particular analytic result. A large majority of the analyses to date involve the residential samples, as these were installed first and had highest priority on the analytic agenda. Two exploratory analyses using commercial data are included as an introduction to the commercial analyses that are currently underway. Most of the sections reference more complete technical reports which the reader should refer to for details of the methodology and for more complete discussion of the results. Sections have been processed separately for inclusion on the data base.

  1. A shift from significance test to hypothesis test through power analysis in medical research.

    Science.gov (United States)

    Singh, G

    2006-01-01

    Until recently, the medical research literature exhibited a substantial dominance of Fisher's significance test approach to statistical inference, which concentrates on the probability of type I error, over the Neyman-Pearson hypothesis test, which considers the probabilities of both type I and type II errors. Fisher's approach dichotomises results into significant or non-significant on the basis of a P value. The Neyman-Pearson approach speaks of acceptance or rejection of the null hypothesis. Based on the same underlying theory, the two approaches address the same objective and reach conclusions in their own way. Advances in computing techniques and the availability of statistical software have resulted in the increasing application of power calculations in medical research, and thereby in reporting the results of significance tests in light of the power of the test as well. The significance test approach, when it incorporates power analysis, contains the essence of the hypothesis test approach. It may be safely argued that the rising application of power analysis in medical research may have initiated a shift from Fisher's significance test to the Neyman-Pearson hypothesis test procedure.
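
    As a concrete illustration of the power calculations the article refers to, the snippet below sizes a hypothetical two-arm comparison under the Neyman-Pearson framework using statsmodels; the effect size, alpha and target power are assumptions chosen only for the example.

      from statsmodels.stats.power import TTestIndPower

      analysis = TTestIndPower()

      # Hypothetical planning numbers: standardized effect size (Cohen's d) of 0.5,
      # two-sided alpha of 0.05, and a target power of 0.80.
      n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.80,
                                         alternative='two-sided')
      print(f"required sample size per group: {n_per_group:.0f}")

      # Conversely, the power actually achieved with 50 subjects per group.
      achieved = analysis.power(effect_size=0.5, nobs1=50, alpha=0.05)
      print(f"power with n = 50 per group: {achieved:.2f}")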

  2. Numerical Computation of Homogeneous Slope Stability

    Directory of Open Access Journals (Sweden)

    Shuangshuang Xiao

    2015-01-01

    Full Text Available To simplify the computational process of homogeneous slope stability, improve computational accuracy, and find multiple potential slip surfaces of a complex geometric slope, this study utilized the limit equilibrium method to derive expression equations of overall and partial factors of safety. This study transformed the solution of the minimum factor of safety (FOS) to solving of a constrained nonlinear programming problem and applied an exhaustive method (EM) and particle swarm optimization algorithm (PSO) to this problem. In simple slope examples, the computational results using an EM and PSO were close to those obtained using other methods. Compared to the EM, the PSO had a small computation error and a significantly shorter computation time. As a result, the PSO could precisely calculate the slope FOS with high efficiency. The example of the multistage slope analysis indicated that this slope had two potential slip surfaces. The factors of safety were 1.1182 and 1.1560, respectively. The differences between these and the minimum FOS (1.0759) were small, but the positions of the slip surfaces were completely different than the critical slip surface (CSS).

  3. Framework for Computation Offloading in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Dejan Kovachev

    2012-12-01

    Full Text Available The inherently limited processing power and battery lifetime of mobile phones hinder the possible execution of computationally intensive applications like content-based video analysis or 3D modeling. Offloading of computationally intensive application parts from the mobile platform into a remote cloud infrastructure or nearby idle computers addresses this problem. This paper presents our Mobile Augmentation Cloud Services (MACS) middleware which enables adaptive extension of Android application execution from a mobile client into the cloud. Applications are developed by using the standard Android development pattern. The middleware does the heavy lifting of adaptive application partitioning, resource monitoring and computation offloading. These elastic mobile applications can run as usual mobile applications, but they can also use remote computing resources transparently. Two prototype applications using the MACS middleware demonstrate the benefits of the approach. The evaluation shows that applications that involve costly computations can benefit from offloading with around 95% energy savings and significant performance gains compared to local execution only.
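
    The core decision behind such offloading middleware can be summarised as: ship a task to the cloud only if remote execution plus data transfer beats local execution in both time and energy. The sketch below illustrates that rule with invented cost figures; it does not reproduce the MACS partitioning algorithm.

      from dataclasses import dataclass

      @dataclass
      class Task:
          name: str
          local_time_s: float     # estimated execution time on the phone
          local_energy_j: float   # estimated battery energy on the phone
          cloud_time_s: float     # estimated execution time on a cloud node
          payload_mb: float       # data shipped to/from the cloud

      def should_offload(task, bandwidth_mbps=8.0, radio_power_w=1.2):
          """Offload when remote compute plus transfer is cheaper in time AND energy.
          Bandwidth and radio power are illustrative assumptions."""
          transfer_time = task.payload_mb * 8.0 / bandwidth_mbps     # seconds
          remote_time = task.cloud_time_s + transfer_time
          remote_energy = radio_power_w * transfer_time              # radio energy only
          return remote_time < task.local_time_s and remote_energy < task.local_energy_j

      video_analysis = Task("content-based video analysis", 40.0, 55.0, 3.0, 12.0)
      print(should_offload(video_analysis))   # True under these assumed figures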

  4. On the Integration of Computer Algebra Systems (CAS) by Canadian Mathematicians: Results of a National Survey

    Science.gov (United States)

    Buteau, Chantal; Jarvis, Daniel H.; Lavicza, Zsolt

    2014-01-01

    In this article, we outline the findings of a Canadian survey study (N = 302) that focused on the extent of computer algebra systems (CAS)-based technology use in postsecondary mathematics instruction. Results suggest that a considerable number of Canadian mathematicians use CAS in research and teaching. CAS use in research was found to be the…

  5. Results of computer assisted mini-incision subvastus approach for total knee arthroplasty.

    Science.gov (United States)

    Turajane, Thana; Larbpaiboonpong, Viroj; Kongtharvonskul, Jatupon; Maungsiri, Samart

    2009-12-01

    The mini-incision subvastus approach is a soft-tissue-preserving approach to the knee. Its advantages include reduced blood loss, reduced pain, self-rehabilitation and faster recovery. However, whether improved visualization, component alignment and blood preservation can be achieved well enough to give better outcomes and prevent early failure of Total Knee Arthroplasty (TKA) has been debated. Computer navigation has been introduced to improve alignment and reduce blood loss. The purpose of this study was to evaluate the short-term outcomes of the combined computer-assisted mini-incision subvastus approach for Total Knee Arthroplasty (CMS-TKA). A prospective case series of the initial 80 patients who underwent CMS-TKA from January 2007 to October 2008 was carried out. The patients' conditions were classified into 2 groups: the simple OA knee (varus deformity less than 15 degrees, BMI less than 20%, no associated deformities) and the complex deformity (varus deformity more than 15 degrees, BMI more than 20%, associated with flexion contracture). There were 59 patients in group 1 and 21 patients in group 2. Of the 80 knees, 38 were on the left and 42 on the right. The results of CMS-TKA [mean (range)] in group 1: group 2 were, respectively, incision length [10.88 (8-13): 11.92 (10-14)], operation time [118 (111.88-125.12): 131 (119.29-143.71) minutes], lateral releases (0 in both groups), postoperative range of motion in flexion [94.5 (90-100): 95.25 (90-105) degrees] and extension [1.75 (0-5): 1.5 (0-5) degrees], blood loss in 24 hours [489.09 (414.7-563.48): 520 (503.46-636.54) ml] and blood transfusion [1 (0-1) unit in both groups], preoperative tibiofemoral angle [varus = 4 (varus 0-10): varus = 17.14 (varus 15.7-18.5) degrees], postoperative tibiofemoral angle [valgus = 1.38 (valgus 0-4): valgus = 2.85 (valgus 2.1-3.5) degrees], tibiofemoral angle outlier (85% both

  6. Application of protons to computer tomography

    International Nuclear Information System (INIS)

    Hanson, K.M.; Bradbury, J.N.; Cannon, T.M.; Hutson, R.L.; Laubacher, D.B.; Macek, R.; Paciotti, M.A.; Taylor, C.A.

    1977-01-01

    It was demonstrated that the application of protons to computed tomography can result in a significant dose advantage relative to x rays. Thus, at the same dose as is delivered by contemporary commercial x-ray scanners, a proton scanner could produce reconstructions with a factor of 2 or more improvement in density resolution. Whether such an improvement can result in significantly better diagnoses of human disease is an open question which can only be answered by the implementation of a proton scanner in a clinical situation

  7. First principle calculations of effective exchange integrals: Comparison between SR (BS) and MR computational results

    Energy Technology Data Exchange (ETDEWEB)

    Yamaguchi, Kizashi [Institute for Nano Science Design Center, Osaka University, 1-3 Machikaneyama, Toyonaka, Osaka 560-8531, Japan and TOYOTA Physical and Chemical Research Institute, Nagakute, Aichi, 480-1192 (Japan); Nishihara, Satomichi; Saito, Toru; Yamanaka, Shusuke; Kitagawa, Yasutaka; Kawakami, Takashi; Yamada, Satoru; Isobe, Hiroshi; Okumura, Mitsutaka [Department of Chemistry, Graduate School of Science, Osaka University, 1-1 Machikaneyama, Toyonaka, Osaka 560-0043 (Japan)

    2015-01-22

    First principle calculations of effective exchange integrals (J) in the Heisenberg model for diradical species were performed by both symmetry-adapted (SA) multi-reference (MR) and broken-symmetry (BS) single reference (SR) methods. Mukherjee-type (Mk) state specific (SS) MR coupled-cluster (CC) calculations by the use of natural orbital (NO) references of ROHF, UHF, UDFT and CASSCF solutions were carried out to elucidate J values for di- and poly-radical species. Spin-unrestricted Hartree Fock (UHF) based coupled-cluster (CC) computations were also performed to these species. Comparison between UHF-NO(UNO)-MkMRCC and BS UHF-CC computational results indicated that spin-contamination of UHF-CC solutions still remains at the SD level. In order to eliminate the spin contamination, approximate spin-projection (AP) scheme was applied for UCC, and the AP procedure indeed corrected the error to yield good agreement with MkMRCC in energy. The CC double with spin-unrestricted Brueckner's orbital (UBD) was furthermore employed for these species, showing that spin-contamination involved in UHF solutions is largely suppressed, and therefore AP scheme for UBCCD removed easily the rest of spin-contamination. We also performed spin-unrestricted pure- and hybrid-density functional theory (UDFT) calculations of diradical and polyradical species. Three different computational schemes for total spin angular momentums were examined for the AP correction of the hybrid (H) UDFT. HUDFT calculations followed by AP, HUDFT(AP), yielded the S-T gaps that were qualitatively in good agreement with those of MkMRCCSD, UHF-CC(AP) and UB-CC(AP). Thus a systematic comparison among MkMRCCSD, UCC(AP) UBD(AP) and UDFT(AP) was performed concerning with the first principle calculations of J values in di- and poly-radical species. It was found that BS (AP) methods reproduce MkMRCCSD results, indicating their applicability to large exchange coupled systems.
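
    For orientation, the approximate spin-projection (AP) estimate of the effective exchange integral mentioned in this abstract is commonly evaluated with Yamaguchi's formula J = (E_BS - E_HS) / (<S^2>_HS - <S^2>_BS) from broken-symmetry (BS) and high-spin (HS) solutions. The snippet below evaluates that formula for invented energies and spin expectation values; it is a generic illustration, not data from this study.

      # Yamaguchi approximate spin-projection (AP) estimate of the effective exchange
      # integral J from broken-symmetry (BS) and high-spin (HS) solutions.
      HARTREE_TO_CM1 = 219474.63

      def j_yamaguchi(e_bs, e_hs, s2_bs, s2_hs):
          """J = (E_BS - E_HS) / (<S^2>_HS - <S^2>_BS)."""
          return (e_bs - e_hs) / (s2_hs - s2_bs)

      # Invented diradical example: BS singlet lies slightly below the HS triplet.
      e_bs, e_hs = -230.41235, -230.41101   # total energies in hartree
      s2_bs, s2_hs = 1.02, 2.01             # <S^2> expectation values

      J = j_yamaguchi(e_bs, e_hs, s2_bs, s2_hs)
      print(f"J = {J * HARTREE_TO_CM1:.1f} cm^-1")   # negative => antiferromagnetic coupling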

  8. Computer assisted diagnosis of benign bone tumours

    International Nuclear Information System (INIS)

    Samardziski, M.; Zafiroski, G.; Janevska, V.; Miladinova, D.; Popeska, Z.

    2004-01-01

    Background. The aim of this study is to determine the correlation between computer-assisted diagnosis (CAD) of benign bone tumours (BBT) and their histological type. Patients and method. Altogether 120 patients were included in two groups. The retrospective group comprised 68 patients in whom the histological type of BBT was known prior to computer analysis. The prospective group comprised 52 patients in whom the histological type of BBT was unknown prior to computer analysis. The computer program was efficient and easy to use. Results. The average percentage of histological types confirmed with CAD in the retrospective and prospective groups was 72.06% and 76.92%, respectively. Histological confirmation of CAD in specific BBT was 91.42% for enchondroma, 96.15% for osteoid osteoma, and 98.08% for osteochondroma. The significantly lower percentage of CAD confirmation for fibroma, chondromixoid fibroma, osteoclastoma, desmoplastic fibroma and osteoblastoma is understandable given their adverse biological character or complex anatomic localization. Conclusions. The results speak in favour of the assumption that a computer-assisted diagnosis of bone tumours program may improve the diagnostic accuracy of the examiner. (author)

  9. Post-mortem computed tomography findings of the lungs: Retrospective review and comparison with autopsy results of 30 infant cases

    Energy Technology Data Exchange (ETDEWEB)

    Kawasumi, Yusuke, E-mail: ssu@rad.med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Clinical Imaging, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Usui, Akihito, E-mail: t7402r0506@med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Diagnostic Image Analysis, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Hosokai, Yoshiyuki, E-mail: hosokai@med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Diagnostic Image Analysis, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Igari, Yui, E-mail: igari@forensic.med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Forensic Medicine, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Hosoya, Tadashi [Tohoku University Graduate School of Medicine, Department of Forensic Medicine, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Hayashizaki, Yoshie, E-mail: yoshie@forensic.med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Forensic Medicine, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Saito, Haruo, E-mail: hsaito@med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Diagnostic Image Analysis, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Ishibashi, Tadashi, E-mail: tisibasi@med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Clinical Imaging, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Funayama, Masato, E-mail: funayama@forensic.med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Forensic Medicine, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan)

    2015-04-15

    Highlights: •Infant cases frequently show a diffuse increase in the concentration of lung fields on post-mortem computed tomography (PMCT). •In this study, twenty-two of the thirty sudden infant death cases showed increasing concentration in the entire lung field. •Based on the autopsy results, the lungs simply collapsed and no other abnormal lung findings were identified. •The radiologist should not consider increasing concentration in all lung fields as simply a pulmonary disorder when diagnosing the cause of infant death using PMCT. -- Abstract: Objectives: Infant cases frequently show a diffuse increase in the concentration of lung fields on post-mortem computed tomography (PMCT). However, the lungs often simply show atelectasis at autopsy in the absence of any other abnormal changes. Thus, we retrospectively reviewed the PMCT findings of lungs following sudden infant death and correlated them with the autopsy results. Materials and methods: We retrospectively reviewed infant cases (0 year) who had undergone PMCT and a forensic autopsy at our institution between May 2009 and June 2013. Lung opacities were classified according to their type (consolidation, ground-glass opacity or mixed) and their distribution (bilateral diffuse or with areas of sparing). Statistical analysis was performed to assess the relationships among lung opacities, causes of death and resuscitation attempt. Results: Thirty infant cases were selected, which included 22 sudden and unexplained deaths and 8 other causes of death. Resuscitation was attempted in 22 of 30 cases. Bilateral diffuse opacities were observed in 21 of the 30 cases. Of the 21 cases, 18 were sudden and unexplained deaths. Areas of sparing were observed in 4 sudden and unexplained deaths and 5 other causes of death. Distribution of opacities was not significantly associated with causes of death or resuscitation attempt. The 21 cases with bilateral diffuse opacities included 6 consolidations (4 sudden and unexplained

  10. Post-mortem computed tomography findings of the lungs: Retrospective review and comparison with autopsy results of 30 infant cases

    International Nuclear Information System (INIS)

    Kawasumi, Yusuke; Usui, Akihito; Hosokai, Yoshiyuki; Igari, Yui; Hosoya, Tadashi; Hayashizaki, Yoshie; Saito, Haruo; Ishibashi, Tadashi; Funayama, Masato

    2015-01-01

    Highlights: •Infant cases frequently show a diffuse increase in the concentration of lung fields on post-mortem computed tomography (PMCT). •In this study, twenty-two of the thirty sudden infant death cases showed increasing concentration in the entire lung field. •Based on the autopsy results, the lungs simply collapsed and no other abnormal lung findings were identified. •The radiologist should not consider increasing concentration in all lung fields as simply a pulmonary disorder when diagnosing the cause of infant death using PMCT. -- Abstract: Objectives: Infant cases frequently show a diffuse increase in the concentration of lung fields on post-mortem computed tomography (PMCT). However, the lungs often simply show atelectasis at autopsy in the absence of any other abnormal changes. Thus, we retrospectively reviewed the PMCT findings of lungs following sudden infant death and correlated them with the autopsy results. Materials and methods: We retrospectively reviewed infant cases (0 year) who had undergone PMCT and a forensic autopsy at our institution between May 2009 and June 2013. Lung opacities were classified according to their type (consolidation, ground-glass opacity or mixed) and their distribution (bilateral diffuse or with areas of sparing). Statistical analysis was performed to assess the relationships among lung opacities, causes of death and resuscitation attempt. Results: Thirty infant cases were selected, which included 22 sudden and unexplained deaths and 8 other causes of death. Resuscitation was attempted in 22 of 30 cases. Bilateral diffuse opacities were observed in 21 of the 30 cases. Of the 21 cases, 18 were sudden and unexplained deaths. Areas of sparing were observed in 4 sudden and unexplained deaths and 5 other causes of death. Distribution of opacities was not significantly associated with causes of death or resuscitation attempt. The 21 cases with bilateral diffuse opacities included 6 consolidations (4 sudden and unexplained

  11. Altered reward processing in pathological computer gamers--ERP-results from a semi-natural gaming-design.

    Science.gov (United States)

    Duven, Eva C P; Müller, Kai W; Beutel, Manfred E; Wölfling, Klaus

    2015-01-01

    Internet Gaming Disorder has been added as a research diagnosis in Section III of the DSM-5. Previous findings from neuroscientific research indicate an enhanced motivational attention toward cues related to computer games, similar to findings in substance-related addictions. On the other hand, in clinical observational studies, tolerance effects are reported by patients with Internet Gaming Disorder. In the present study we investigated whether enhanced motivational attention or tolerance effects are present in patients with Internet Gaming Disorder. A clinical sample fulfilling the diagnostic criteria for Internet Gaming Disorder was recruited from the Outpatient Clinic for Behavioral Addictions in Mainz, Germany. In a semi-natural EEG design, participants played a computer game during the recording of event-related potentials to assess reward processing. The results indicated an attenuated P300 for patients with Internet Gaming Disorder in response to rewards in comparison to healthy controls, while the latency of the N100 was prolonged and its amplitude was increased. Our findings support the hypothesis that tolerance effects are present in patients with Internet Gaming Disorder when they actively play computer games. In addition, the initial orienting toward the gaming reward is suggested to consume more capacity in patients with Internet Gaming Disorder, which has been similarly reported by studies with other methodological backgrounds in substance-related addictions.

  12. High Performance Computing in Science and Engineering '98 : Transactions of the High Performance Computing Center

    CERN Document Server

    Jäger, Willi

    1999-01-01

    The book contains reports about the most significant projects from science and industry that are using the supercomputers of the Federal High Performance Computing Center Stuttgart (HLRS). These projects are from different scientific disciplines, with a focus on engineering, physics and chemistry. They were carefully selected in a peer-review process and are showcases for an innovative combination of state-of-the-art physical modeling, novel algorithms and the use of leading-edge parallel computer technology. As HLRS is in close cooperation with industrial companies, special emphasis has been put on the industrial relevance of results and methods.

  13. Whole-body computed tomography in trauma patients: optimization of the patient scanning position significantly shortens examination time while maintaining diagnostic image quality

    Directory of Open Access Journals (Sweden)

    Hickethier T

    2018-05-01

    Full Text Available Tilman Hickethier,1,* Kamal Mammadov,1,* Bettina Baeßler,1 Thorsten Lichtenstein,1 Jochen Hinkelbein,2 Lucy Smith,3 Patrick Sven Plum,4 Seung-Hun Chon,4 David Maintz,1 De-Hua Chang1 1Department of Radiology, University Hospital of Cologne, Cologne, Germany; 2Department of Anesthesiology and Intensive Care Medicine, University Hospital of Cologne, Cologne, Germany; 3Faculty of Medicine, Memorial University of Newfoundland, St. John’s, Canada; 4Department of General, Visceral and Cancer Surgery, University Hospital of Cologne, Cologne, Germany *These authors contributed equally to this work Background: The study was conducted to compare examination time and artifact vulnerability of whole-body computed tomographies (wbCTs) for trauma patients using conventional or optimized patient positioning. Patients and methods: Examination time was measured in 100 patients scanned with the conventional protocol (Group A: arms positioned alongside the body for head and neck imaging and over the head for trunk imaging) and 100 patients scanned with the optimized protocol (Group B: arms flexed on a chest pillow without repositioning). Additionally, the influence of the two different scanning protocols on image quality in the most relevant body regions was assessed by two blinded readers. Results: Total wbCT duration was about 35% or 3:46 min shorter in B than in A. Artifacts in the aorta (27% vs 6%), liver (40% vs 8%) and spleen (27% vs 5%) occurred significantly more often in B than in A. No incident of non-diagnostic image quality was reported, and no significant differences for lungs and spine were found. Conclusion: An optimized wbCT positioning protocol for trauma patients allows a significant reduction of examination time while still maintaining diagnostic image quality. Keywords: CT scan, polytrauma, acute care, time requirement, positioning

  14. Quantum Computing's Classical Problem, Classical Computing's Quantum Problem

    OpenAIRE

    Van Meter, Rodney

    2013-01-01

    Tasked with the challenge to build better and better computers, quantum computing and classical computing face the same conundrum: the success of classical computing systems. Small quantum computing systems have been demonstrated, and intermediate-scale systems are on the horizon, capable of calculating numeric results or simulating physical systems far beyond what humans can do by hand. However, to be commercially viable, they must surpass what our wildly successful, highly advanced classica...

  15. Finding the most significant common sequence and structure motifs in a set of RNA sequences

    DEFF Research Database (Denmark)

    Gorodkin, Jan; Heyer, L.J.; Stormo, G.D.

    1997-01-01

    We present a computational scheme to locally align a collection of RNA sequences using sequence and structure constraints. In addition, the method searches for the resulting alignments with the most significant common motifs among all possible collections. The first part utilizes a simplified

  16. Quantum computing with trapped ions

    Energy Technology Data Exchange (ETDEWEB)

    Hughes, R.J.

    1998-01-01

    The significance of quantum computation for cryptography is discussed. Following a brief survey of the requirements for quantum computational hardware, an overview of the ion trap quantum computation project at Los Alamos is presented. The physical limitations to quantum computation with trapped ions are analyzed and an assessment of the computational potential of the technology is made.

  17. Significance of computed tomography in the diagnosis of the mediastinal mass lesions

    Energy Technology Data Exchange (ETDEWEB)

    Kimura, Masanori; Takashima, Tsutomu; Suzuki, Masayuki; Itoh, Hiroshi; Hirose, Jinichiro; Choto, Shuichi (Kanazawa Univ. (Japan). School of Medicine)

    1983-08-01

    Thirty cases of mediastinal mass lesions were examined by computed tomography and the diagnostic ability of CT was retrospectively evaluated. We divided them into two major groups: cystic and solid lesions. Cysts and cystic teratomas were differentiated on the thickness of their wall. Pericardial cysts were typically present at the cardiophrenic angle. In the solid mediastinal lesions, the presence of calcific and/or fatty components, the presence of necrosis, the irregularity of the margin and the obliteration of the surrounding fat layer were the clues to differential diagnosis and to the evaluation of their invasiveness. Although differential diagnosis of the solid anterior mediastinal tumors was often difficult, teratomas with calcific and fatty components were easily diagnosed. Invasiveness of malignant thymoma and other malignant lesions was successfully evaluated to some extent. Neurogenic posterior mediastinal tumors were easily diagnosed because of the presence of spine deformity and a typical dumbbell-shaped appearance. We stress that our diagnostic approach is useful to differentiate the mediastinal mass lesions.

  18. Significance of computed tomography in the diagnosis of the mediastinal mass lesions

    International Nuclear Information System (INIS)

    Kimura, Masanori; Takashima, Tsutomu; Suzuki, Masayuki; Itoh, Hiroshi; Hirose, Jinichiro; Choto, Shuichi

    1983-01-01

    Thirty cases of mediastinal mass lesions were examined by computed tomography and the diagnostic ability of CT was retrospectively evaluated. We divided them into two major groups: cystic and solid lesions. Cysts and cystic teratomas were differentiated on the thickness of their wall. Pericardial cysts were typically present at the cardiophrenic angle. In the solid mediastinal lesions, the presence of calcific and/or fatty components, the presence of necrosis, the irregularity of the margin and the obliteration of the surrounding fat layer were the clues to differential diagnosis and to the evaluation of their invasiveness. Although differential diagnosis of the solid anterior mediastinal tumors was often difficult, teratomas with calcific and fatty components were easily diagnosed. Invasiveness of malignant thymoma and other malignant lesions was successfully evaluated to some extent. Neurogenic posterior mediastinal tumors were easily diagnosed because of the presence of spine deformity and a typical dumbbell-shaped appearance. We stress that our diagnostic approach is useful to differentiate the mediastinal mass lesions. (author)

  19. Building a symbolic computer algebra toolbox to compute 2D Fourier transforms in polar coordinates.

    Science.gov (United States)

    Dovlo, Edem; Baddour, Natalie

    2015-01-01

    The development of a symbolic computer algebra toolbox for the computation of two-dimensional (2D) Fourier transforms in polar coordinates is presented. Multidimensional Fourier transforms are widely used in image processing, tomographic reconstructions and in fact any application that requires a multidimensional convolution. By examining a function in the frequency domain, additional information and insights may be obtained. The advantages of our method include: •The implementation of the 2D Fourier transform in polar coordinates within the toolbox via the combination of two significantly simpler transforms. •The modular approach, along with the lookup tables implemented, helps avoid the issue of indeterminate results which may occur when attempting to directly evaluate the transform. •The concept also helps prevent unnecessary computation of already known transforms, thereby saving memory and processing time.
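
    The decomposition the first bullet point refers to can be illustrated numerically, even though the published toolbox itself is symbolic. A standard way to build a 2D Fourier transform in polar coordinates from two simpler transforms is a Fourier series in the angle combined with an nth-order Hankel transform in the radius; the sketch below assumes that decomposition, and the test function, grid sizes and sign/normalisation convention are illustrative choices rather than anything taken from the toolbox.

      # Numerical sketch (not the authors' symbolic toolbox): a 2D Fourier transform in
      # polar coordinates built from an angular Fourier series plus nth-order Hankel
      # transforms. Grids, the test function and the normalisation are assumed.
      import numpy as np
      from scipy.special import jv  # Bessel function of the first kind

      def hankel_transform(f_r, r, rho, order):
          """Quadrature approximation of F(rho) = integral f(r) J_n(rho r) r dr."""
          dr = np.gradient(r)
          return np.array([np.sum(f_r * jv(order, p * r) * r * dr) for p in rho])

      def fourier_transform_polar(f, r, theta, rho, psi, n_max=8):
          """2D FT of f(r, theta) evaluated on a polar frequency grid (rho, psi)."""
          F = np.zeros((len(rho), len(psi)), dtype=complex)
          dtheta = theta[1] - theta[0]
          for n in range(-n_max, n_max + 1):
              # Angular Fourier coefficient f_n(r).
              f_n = (f * np.exp(-1j * n * theta)[None, :]).sum(axis=1) * dtheta / (2 * np.pi)
              # Radial part: nth-order Hankel transform of f_n.
              F_n = hankel_transform(f_n, r, rho, n)
              # Reassemble with the 2*pi*(-i)^n*e^{i*n*psi} factor (convention-dependent).
              F += 2 * np.pi * (-1j) ** n * F_n[:, None] * np.exp(1j * n * psi)[None, :]
          return F

      # Example: a radially symmetric Gaussian, whose 2D transform is again a Gaussian.
      r = np.linspace(1e-6, 10, 400)
      theta = np.linspace(0, 2 * np.pi, 128, endpoint=False)
      f = np.exp(-r**2)[:, None] * np.ones_like(theta)[None, :]
      rho, psi = np.linspace(0, 5, 50), np.linspace(0, 2 * np.pi, 64, endpoint=False)
      F = fourier_transform_polar(f, r, theta, rho, psi)  # ~ pi * exp(-rho**2 / 4)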

  20. Students Computer Skills in Faculty of Education

    Directory of Open Access Journals (Sweden)

    Mehmet Caglar

    2010-09-01

    Full Text Available Nowadays, the usage of technology is not a privilege but an obligation. Technological developments influence the structures and functions of educational institutions. It is also expected from teachers that they integrate technology in their lessons in order to educate the individuals of the information society. This research covered 145 (68 female, 78 male) students studying in the Near East University Faculty of Education. The Computer Skills Scale developed by Güçlü (2010) was used as a data collecting tool. Data were analysed using the SPSS software program. In this study, students’ computer skills were investigated; the variations in the relationships between computer skills and (a) gender, (b) family’s net monthly income, (c) presence of computers at home, (d) presence of a computer laboratory at school and (e) parents’ computer skills were examined. Frequency analysis, percentage and mean calculations were used. In addition, t-tests and multivariate analysis were used to look at the relationships between different variables. As a result of this study, a statistically significant relationship was found between the computer skills of students who had a computer at home and the computer skills of those who did not.

  1. Improving the management of diabetes in hospitalized patients: the results of a computer-based house staff training program.

    Science.gov (United States)

    Vaidya, Anand; Hurwitz, Shelley; Yialamas, Maria; Min, Le; Garg, Rajesh

    2012-07-01

    Poorly controlled diabetes in hospitalized patients is associated with poor clinical outcomes. We hypothesized that computer-based diabetes training could improve house staff knowledge and comfort for the management of diabetes in a large tertiary-care hospital. We implemented a computer-based training program on inpatient diabetes for internal medicine house staff at the Brigham and Women's Hospital (Boston, MA) in September 2009. House staff were required to complete the program and answer a set of questions, before and after the program, to evaluate their level of comfort and knowledge of inpatient diabetes. Chart reviews of all non-critically ill patients with diabetes managed by house staff in August 2009 (before the program) and December 2009 (after the program) were performed. Chart reviews were also performed for August 2008 and December 2008 to compare house staff management practices when the computer-based educational program was not available. A significant increase in comfort levels and knowledge in the management of inpatient diabetes was seen among house staff at all levels of training (P < 0.05), for senior as well as junior house staff. Nonsignificant trends suggesting increased use of basal-bolus insulin (P=0.06) and decreased use of sliding-scale insulin (P=0.10) were seen following the educational intervention in 2009, whereas no such change was seen in 2008 (P>0.90). Overall, house staff evaluated the training program as "very relevant" and the technology interface as "good." A computer-based diabetes training program can improve the comfort and knowledge of house staff and potentially improve their insulin administration practices at large academic centers.

  2. High-performance computing for airborne applications

    International Nuclear Information System (INIS)

    Quinn, Heather M.; Manuzatto, Andrea; Fairbanks, Tom; Dallmann, Nicholas; Desgeorges, Rose

    2010-01-01

    Recently, there have been attempts to move common satellite tasks to unmanned aerial vehicles (UAVs). UAVs are significantly cheaper to buy than satellites and easier to deploy on an as-needed basis. The more benign radiation environment also allows for an aggressive adoption of state-of-the-art commercial computational devices, which increases the amount of data that can be collected. There are a number of commercial computing devices currently available that are well-suited to high-performance computing. These devices range from specialized computational devices, such as field-programmable gate arrays (FPGAs) and digital signal processors (DSPs), to traditional computing platforms, such as microprocessors. Even though the radiation environment is relatively benign, these devices could be susceptible to single-event effects. In this paper, we will present radiation data for high-performance computing devices in an accelerated neutron environment. These devices include a multi-core digital signal processor, two field-programmable gate arrays, and a microprocessor. From these results, we found that all of these devices are suitable for many airplane environments without reliability problems.

  3. Multicenter study of quantitative computed tomography analysis using a computer-aided three-dimensional system in patients with idiopathic pulmonary fibrosis.

    Science.gov (United States)

    Iwasawa, Tae; Kanauchi, Tetsu; Hoshi, Toshiko; Ogura, Takashi; Baba, Tomohisa; Gotoh, Toshiyuki; Oba, Mari S

    2016-01-01

    To evaluate the feasibility of automated quantitative analysis with a three-dimensional (3D) computer-aided system (i.e., Gaussian histogram normalized correlation, GHNC) of computed tomography (CT) images from different scanners. Each institution's review board approved the research protocol. Informed patient consent was not required. The participants in this multicenter prospective study were 80 patients (65 men, 15 women) with idiopathic pulmonary fibrosis. Their mean age was 70.6 years. Computed tomography (CT) images were obtained by four different scanners set at different exposures. We measured the extent of fibrosis using GHNC, and used Pearson's correlation analysis, Bland-Altman plots, and kappa analysis to directly compare the GHNC results with manual scoring by radiologists. Multiple linear regression analysis was performed to determine the association between the CT data and forced vital capacity (FVC). For each scanner, the extent of fibrosis as determined by GHNC was significantly correlated with the radiologists' score. In multivariate analysis, the extent of fibrosis as determined by GHNC was significantly correlated with FVC (p < 0.001). There was no significant difference between the results obtained using different CT scanners. Gaussian histogram normalized correlation was feasible, irrespective of the type of CT scanner used.
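
    The statistical comparison described above (Pearson correlation against the radiologists' scores, Bland-Altman agreement, and a regression of lung function on the CT measure) can be sketched in a few lines. The data below are simulated and the variable names are illustrative assumptions, not values from the study.

      # Sketch (with made-up data) of the comparison described in the abstract: Pearson
      # correlation between the computer-derived fibrosis extent and the radiologists'
      # score, Bland-Altman limits of agreement, and a linear model relating the CT
      # measure to forced vital capacity (FVC). All values are simulated.
      import numpy as np
      from scipy import stats
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      ghnc_extent = rng.uniform(5, 60, size=80)                  # % fibrosis from the 3D system
      radiologist_score = ghnc_extent + rng.normal(0, 5, 80)     # visual score (simulated)
      fvc = 100 - 0.8 * ghnc_extent + rng.normal(0, 8, 80)       # % predicted FVC (simulated)

      # Pearson correlation between automated and manual scoring.
      r, p = stats.pearsonr(ghnc_extent, radiologist_score)

      # Bland-Altman statistics: mean bias and 95% limits of agreement.
      diff = ghnc_extent - radiologist_score
      bias = diff.mean()
      loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))

      # Linear regression of FVC on the automated extent of fibrosis.
      model = sm.OLS(fvc, sm.add_constant(ghnc_extent)).fit()
      print(f"r = {r:.2f} (p = {p:.3g}), bias = {bias:.2f}, LoA = {loa}")
      print(model.summary().tables[1])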

  4. Techniques for animation of CFD results. [computational fluid dynamics

    Science.gov (United States)

    Horowitz, Jay; Hanson, Jeffery C.

    1992-01-01

    Video animation is becoming increasingly vital to the computational fluid dynamics researcher, not just for presentation, but for recording and comparing dynamic visualizations that are beyond the current capabilities of even the most powerful graphic workstation. To meet these needs, Lewis Research Center has recently established a facility to provide users with easy access to advanced video animation capabilities. However, producing animation that is both visually effective and scientifically accurate involves various technological and aesthetic considerations that must be understood both by the researcher and those supporting the visualization process. These considerations include: scan conversion, color conversion, and spatial ambiguities.

  5. Impossibility results for distributed computing

    CERN Document Server

    Attiya, Hagit

    2014-01-01

    To understand the power of distributed systems, it is necessary to understand their inherent limitations: what problems cannot be solved in particular systems, or without sufficient resources (such as time or space). This book presents key techniques for proving such impossibility results and applies them to a variety of different problems in a variety of different system models. Insights gained from these results are highlighted, aspects of a problem that make it difficult are isolated, features of an architecture that make it inadequate for solving certain problems efficiently are identified

  6. African-American males in computer science---Examining the pipeline for clogs

    Science.gov (United States)

    Stone, Daryl Bryant

    " self-efficacy between lower-level computer science majors and upper-level computer science majors. (5) There is no significant difference in "Computer Science Degree" self-efficacy between each of the five groups of students. Finally, the researcher selected African-American male students attending six primary schools, including the predominately African-American elementary, middle and high school that the researcher attended during his own academic career. Additionally, a racially mixed elementary, middle and high school was selected from the same county in Maryland. Bowie State University provided both the underclass and upperclass computer science majors surveyed in this study. Of the five hypotheses, the sample provided enough evidence to support the claim that there are significant differences in the "Computer Science Degree" self-efficacy between each of the five groups of students. ANOVA analysis by question and total self-efficacy scores provided more results of statistical significance. Additionally, factor analysis and review of the qualitative data provide more insightful results. Overall, the data suggest 'a clog' may exist in the middle school level and students attending racially mixed schools were more confident in their computer, math and science skills. African-American males admit to spending lots of time on social networking websites and emailing, but are 'dis-aware' of the skills and knowledge needed to study in the computing disciplines. The majority of the subjects knew little, if any, AAMs in the 'computing discipline pipeline'. The collegian African-American males, in this study, agree that computer programming is a difficult area and serves as a 'major clog in the pipeline'.

  7. HOME COMPUTER USE AND THE DEVELOPMENT OF HUMAN CAPITAL*

    Science.gov (United States)

    Malamud, Ofer; Pop-Eleches, Cristian

    2012-01-01

    This paper uses a regression discontinuity design to estimate the effect of home computers on child and adolescent outcomes by exploiting a voucher program in Romania. Our main results indicate that home computers have both positive and negative effects on the development of human capital. Children who won a voucher to purchase a computer had significantly lower school grades but show improved computer skills. There is also some evidence that winning a voucher increased cognitive skills, as measured by Raven’s Progressive Matrices. We do not find much evidence for an effect on non-cognitive outcomes. Parental rules regarding homework and computer use attenuate the effects of computer ownership, suggesting that parental monitoring and supervision may be important mediating factors. PMID:22719135
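
    As a rough illustration of the regression discontinuity idea used above, the effect of the voucher can be estimated as the jump in an outcome at the eligibility cutoff, using a linear fit on each side of the cutoff within a bandwidth. Everything below is simulated toy data with invented variable names, not the Romanian program data.

      # Toy regression-discontinuity sketch: voucher eligibility jumps at an income
      # cutoff, so the treatment effect is the jump in the outcome at that cutoff.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(42)
      income = rng.uniform(-50, 50, 2000)          # running variable, centred at the cutoff
      voucher = (income < 0).astype(float)         # below-cutoff households get a voucher
      grades = 70 + 0.05 * income - 2.0 * voucher + rng.normal(0, 5, 2000)  # simulated outcome

      # Local linear RD estimate: regress the outcome on the treatment dummy, the
      # running variable, and their interaction, within a bandwidth around the cutoff.
      bw = 20
      inside = np.abs(income) < bw
      X = np.column_stack([voucher, income, voucher * income])[inside]
      model = sm.OLS(grades[inside], sm.add_constant(X)).fit()
      print(model.params[1])   # coefficient on the voucher dummy ~ effect at the cutoff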

  8. Office ergonomics: deficiencies in computer workstation design.

    Science.gov (United States)

    Shikdar, Ashraf A; Al-Kindi, Mahmoud A

    2007-01-01

    The objective of this research was to study and identify ergonomic deficiencies in computer workstation design in typical offices. Physical measurements and a questionnaire were used to study 40 workstations. Major ergonomic deficiencies were found in physical design and layout of the workstations, employee postures, work practices, and training. The consequences in terms of user health and other problems were significant. Forty-five percent of the employees used nonadjustable chairs, 48% of computers faced windows, 90% of the employees used computers more than 4 hrs/day, 45% of the employees adopted bent and unsupported back postures, and 20% used office tables for computers. Major problems reported were eyestrain (58%), shoulder pain (45%), back pain (43%), arm pain (35%), wrist pain (30%), and neck pain (30%). These results indicated serious ergonomic deficiencies in office computer workstation design, layout, and usage. Strategies to reduce or eliminate ergonomic deficiencies in computer workstation design were suggested.

  9. The Evaluation of CEIT Teacher Candidates in Terms of Computer Games, Educational Use of Computer Games and Game Design Qualifications

    Directory of Open Access Journals (Sweden)

    Hakkı BAĞCI

    2014-04-01

    Full Text Available Computer games have an important usage potential in the education of today’s digital student profile. Also, computer teachers, known as technology leaders in schools, are the main stakeholders of this potential. In this study, the opinions of computer teacher candidates about computer games are examined from different perspectives. 119 computer teacher candidates participated in this study, and the data were collected by a questionnaire. As a result of this study, computer teacher candidates have positive opinions about the usage of computer games in education and they see themselves as qualified for the analysis and design of educational games. But they partially have negative attitudes about some risks like addiction and loss of time. Also, the candidates who attended educational game courses and play games on their mobile phones have more positive opinions, and they see themselves as more qualified than others. Males have more positive opinions about computer games than females, but in terms of educational games and the analysis and design of computer games, there is no significant difference between males and females.

  10. Brief: Managing computing technology

    International Nuclear Information System (INIS)

    Startzman, R.A.

    1994-01-01

    While computing is applied widely in the production segment of the petroleum industry, its effective application is the primary goal of computing management. Computing technology has changed significantly since the 1950's, when computers first began to influence petroleum technology. The ability to accomplish traditional tasks faster and more economically probably is the most important effect that computing has had on the industry. While speed and lower cost are important, are they enough? Can computing change the basic functions of the industry? When new computing technology is introduced improperly, it can clash with traditional petroleum technology. This paper examines the role of management in merging these technologies

  11. Computer-based, personalized cognitive training versus classical computer games: a randomized double-blind prospective trial of cognitive stimulation.

    Science.gov (United States)

    Peretz, Chava; Korczyn, Amos D; Shatil, Evelyn; Aharonson, Vered; Birnboim, Smadar; Giladi, Nir

    2011-01-01

    Many studies have suggested that cognitive training can result in cognitive gains in healthy older adults. We investigated whether personalized computerized cognitive training provides greater benefits than those obtained by playing conventional computer games. This was a randomized double-blind interventional study. Self-referred healthy older adults (n = 155, 68 ± 7 years old) were assigned to either a personalized, computerized cognitive training or to a computer games group. Cognitive performance was assessed at baseline and after 3 months by a neuropsychological assessment battery. Differences in cognitive performance scores between and within groups were evaluated using mixed effects models in 2 approaches: adherence only (AO; n = 121) and intention to treat (ITT; n = 155). Both groups improved in cognitive performance. The improvement was significant in the personalized cognitive training group as well as in the computer games group. Personalized cognitive training was significantly more effective than computer games in improving visuospatial working memory (p = 0.0001), visuospatial learning (p = 0.0012) and focused attention (p = 0.0019). Personalized, computerized cognitive training appears to be more effective than computer games in improving cognitive performance in healthy older adults. Further studies are needed to evaluate the ecological validity of these findings. Copyright © 2011 S. Karger AG, Basel.

  12. Speed test results and hardware/software study of computational speed problem, appendix D

    Science.gov (United States)

    1984-01-01

    The HP9845C is a desktop computer which was tested and evaluated for processing speed. A study was made to determine the availability and approximate cost of computers and/or hardware accessories necessary to meet the 20 ms sample period speed requirements. Additional requirements were that the control algorithm could be programmed in a high-level language and that the machine have sufficient storage to store the data from a complete experiment.

  13. Computing Gröbner fans

    DEFF Research Database (Denmark)

    Fukuda, K.; Jensen, Anders Nedergaard; Thomas, R.R.

    2005-01-01

    This paper presents algorithms for computing the Gröbner fan of an arbitrary polynomial ideal. The computation involves enumeration of all reduced Gröbner bases of the ideal. Our algorithms are based on a uniform definition of the Gröbner fan that applies to both homogeneous and non-homogeneous ideals and a proof that this object is a polyhedral complex. We show that the cells of a Gröbner fan can easily be oriented acyclically and with a unique sink, allowing their enumeration by the memory-less reverse search procedure. The significance of this follows from the fact that Gröbner fans are not always normal fans of polyhedra, in which case reverse search applies automatically. Computational results using our implementation of these algorithms in the software package Gfan are included.
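
    A small illustration of the object being enumerated (this is not the Gfan package itself): the same ideal can have different reduced Gröbner bases under different term orders, and the Gröbner fan partitions the space of weight vectors according to which reduced basis results. The ideal below is an arbitrary example chosen for illustration.

      # Different term orders can give different reduced Groebner bases of one ideal;
      # each reduced basis corresponds to a cell of the Groebner fan. Example ideal assumed.
      from sympy import groebner, symbols

      x, y = symbols('x y')
      ideal = [x**2 + y**2 - 1, x*y - 2]

      for order in ('lex', 'grlex', 'grevlex'):
          gb = groebner(ideal, x, y, order=order)
          print(order, list(gb))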

  14. A shift from significance test to hypothesis test through power analysis in medical research

    Directory of Open Access Journals (Sweden)

    Singh Girish

    2006-01-01

    Full Text Available Until recently, the medical research literature exhibited substantial dominance of Fisher's significance test approach to statistical inference, which concentrates more on the probability of type I error, over the Neyman-Pearson hypothesis test, which considers the probabilities of both type I and type II error. Fisher's approach dichotomises results into significant or not significant with a P value. The Neyman-Pearson approach talks of acceptance or rejection of the null hypothesis. Based on the same theory, these two approaches deal with the same objective and conclude in their own ways. The advancement in computing techniques and the availability of statistical software have resulted in the increasing application of power calculations in medical research, and thereby in reporting the results of significance tests in the light of the power of the test as well. The significance test approach, when it incorporates power analysis, contains the essence of the hypothesis test approach. It may be safely argued that the rising application of power analysis in medical research may have initiated a shift from Fisher's significance test to the Neyman-Pearson hypothesis test procedure.
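
    The shift the article describes can be made concrete with a small power calculation: rather than only reporting whether P < 0.05, one fixes the type I error, the smallest effect size of interest and the desired power, and solves for the sample size (or, conversely, reports the power a completed study actually had). The effect size, power and sample size below are illustrative assumptions.

      # Power-analysis sketch: solve for sample size given alpha, effect size and power,
      # or for achieved power given the sample size actually enrolled. Values assumed.
      from statsmodels.stats.power import TTestIndPower

      analysis = TTestIndPower()

      # Sample size per group for a two-sample t-test detecting a medium effect (d = 0.5)
      # with alpha = 0.05 (type I error) and power = 0.80 (type II error = 0.20).
      n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8,
                                         alternative='two-sided')

      # Conversely, the achieved power of a study that enrolled 40 patients per group.
      achieved_power = analysis.solve_power(effect_size=0.5, nobs1=40, alpha=0.05,
                                            alternative='two-sided')
      print(round(n_per_group), round(achieved_power, 2))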

  15. Determination of the smoking gun of intent: significance testing of forced choice results in social security claimants.

    Science.gov (United States)

    Binder, Laurence M; Chafetz, Michael D

    2018-01-01

    Significantly below-chance findings on forced choice tests have been described as revealing "the smoking gun of intent" that proved malingering. The issues of probability levels, one-tailed vs. two-tailed tests, and the combining of PVT scores on significantly below-chance findings were addressed in a previous study, with a recommendation of a probability level of .20 to test the significance of below-chance results. The purpose of the present study was to determine the rate of below-chance findings in a Social Security Disability claimant sample using the previous recommendations. We compared the frequency of below-chance results on forced choice performance validity tests (PVTs) at two levels of significance, .05 and .20, and when using significance testing on individual subtests of the PVTs compared with total scores in claimants for Social Security Disability in order to determine the rate of the expected increase. The frequency of significant results increased with the higher level of significance for each subtest of the PVT and when combining individual test sections to increase the number of test items, with up to 20% of claimants showing significantly below-chance results at the higher p-value. These findings are discussed in light of Social Security Administration policy, showing an impact on policy issues concerning child abuse and neglect, and the importance of using these techniques in evaluations for Social Security Disability.
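
    The underlying calculation is a one-tailed binomial test against chance performance on a two-alternative forced choice test; combining subtests increases the number of items and hence the sensitivity of the test. The sketch below uses the .20 probability level recommended in the earlier study, as described in the abstract; the item counts and scores are invented examples.

      # One-tailed binomial test for significantly below-chance forced-choice performance.
      # Item counts and scores are invented; the alpha = .20 criterion follows the abstract.
      from scipy.stats import binom

      def below_chance_p(correct, n_items, p_chance=0.5):
          """Probability of scoring `correct` or fewer items by guessing alone."""
          return binom.cdf(correct, n_items, p_chance)

      # A single 36-item subtest with 13/36 correct.
      p_single = below_chance_p(13, 36)

      # Combining two 36-item subtests (72 items, 27 correct) increases sensitivity.
      p_combined = below_chance_p(27, 72)

      alpha = 0.20  # the more liberal criterion recommended for below-chance findings
      print(p_single, p_single < alpha)
      print(p_combined, p_combined < alpha)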

  16. The Effect of Computer Games on Students’ Critical Thinking Disposition and Educational Achievement

    Directory of Open Access Journals (Sweden)

    Mohammad Seifi

    2015-10-01

    Full Text Available The main aim of this research was to investigate the effect of computer games on students’ critical thinking disposition and educational achievement. The research method was descriptive, and its type was causal-comparative. The sample included 270 female high school students in Andimeshk town selected by the multistage cluster method. The Ricketts questionnaire was used to assess critical thinking, and researcher-made questionnaires were used to assess computer game use. A t-test and one-way ANOVA were employed to analyze the data. The findings of the study showed that playing computer games has no significant effect on critical thinking; however, there was a significant effect of playing computer games on students’ educational achievement (P < 0.05). Furthermore, the results showed that the type of computer game has no significant effect on students’ disposition to critical thinking and their educational achievement. Keywords: Computer games, disposition to critical thinking, educational achievement, secondary students

  17. Quantum Heterogeneous Computing for Satellite Positioning Optimization

    Science.gov (United States)

    Bass, G.; Kumar, V.; Dulny, J., III

    2016-12-01

    Hard optimization problems occur in many fields of academic study and practical situations. We present results in which quantum heterogeneous computing is used to solve a real-world optimization problem: satellite positioning. Optimization problems like this can scale very rapidly with problem size, and become unsolvable with traditional brute-force methods. Typically, such problems have been approximately solved with heuristic approaches; however, these methods can take a long time to calculate and are not guaranteed to find optimal solutions. Quantum computing offers the possibility of producing significant speed-up and improved solution quality. There are now commercially available quantum annealing (QA) devices that are designed to solve difficult optimization problems. These devices have 1000+ quantum bits, but they have significant hardware size and connectivity limitations. We present a novel heterogeneous computing stack that combines QA and classical machine learning and allows the use of QA on problems larger than the quantum hardware could solve in isolation. We begin by analyzing the satellite positioning problem with a heuristic solver, the genetic algorithm. The classical computer's comparatively large available memory can explore the full problem space and converge to a solution relatively close to the true optimum. The QA device can then evolve directly to the optimal solution within this more limited space. Preliminary experiments, using the Quantum Monte Carlo (QMC) algorithm to simulate QA hardware, have produced promising results. Working with problem instances with known global minima, we find a solution within 8% in a matter of seconds, and within 5% in a few minutes. Future studies include replacing QMC with commercially available quantum hardware and exploring more problem sets and model parameters. Our results have important implications for how heterogeneous quantum computing can be used to solve difficult optimization problems in any
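
    The two-stage idea described above can be caricatured in a few lines: a classical evolutionary search explores the full space, and a second, more constrained solver (here a simple local search standing in for the quantum annealing stage) refines the best candidate. The objective function and every parameter below are invented placeholders, not the satellite-positioning model or the actual quantum hardware workflow.

      # Toy hybrid-optimization sketch: coarse classical search followed by local refinement.
      import numpy as np

      rng = np.random.default_rng(1)

      def objective(x):
          # Rastrigin function: a standard multimodal test objective (assumed example).
          return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

      def genetic_search(dim=4, pop=60, gens=200, bounds=(-5.12, 5.12)):
          # Simple evolutionary search: truncation selection plus Gaussian mutation.
          lo, hi = bounds
          pop_x = rng.uniform(lo, hi, size=(pop, dim))
          for _ in range(gens):
              fit = np.array([objective(x) for x in pop_x])
              parents = pop_x[np.argsort(fit)[: pop // 2]]
              children = parents[rng.integers(0, len(parents), pop)]
              pop_x = np.clip(children + rng.normal(0, 0.3, children.shape), lo, hi)
          fit = np.array([objective(x) for x in pop_x])
          return pop_x[np.argmin(fit)]

      def local_refine(x0, steps=2000, scale=0.05):
          # Stand-in for the annealing stage: greedy search in a small neighbourhood.
          best, best_f = x0, objective(x0)
          for _ in range(steps):
              cand = best + rng.normal(0, scale, best.shape)
              f = objective(cand)
              if f < best_f:
                  best, best_f = cand, f
          return best, best_f

      coarse = genetic_search()
      refined, value = local_refine(coarse)
      print(value)  # typically close to the global minimum of 0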

  18. Quality of human-computer interaction - results of a national usability survey of hospital-IT in Germany

    Directory of Open Access Journals (Sweden)

    Bundschuh Bettina B

    2011-11-01

    Full Text Available Abstract Background Due to the increasing functionality of medical information systems, it is hard to imagine day to day work in hospitals without IT support. Therefore, the design of dialogues between humans and information systems is one of the most important issues to be addressed in health care. This survey presents an analysis of the current quality level of human-computer interaction of healthcare-IT in German hospitals, focused on the users' point of view. Methods To evaluate the usability of clinical-IT according to the design principles of EN ISO 9241-10 the IsoMetrics Inventory, an assessment tool, was used. The focus of this paper has been put on suitability for task, training effort and conformity with user expectations, differentiated by information systems. Effectiveness has been evaluated with the focus on interoperability and functionality of different IT systems. Results 4521 persons from 371 hospitals visited the start page of the study, while 1003 persons from 158 hospitals completed the questionnaire. The results show relevant variations between different information systems. Conclusions Specialised information systems with defined functionality received better assessments than clinical information systems in general. This could be attributed to the improved customisation of these specialised systems for specific working environments. The results can be used as reference data for evaluation and benchmarking of human computer engineering in clinical health IT context for future studies.

  19. Environmental significance of atmospheric emission resulting from in situ burning of oiled salt marsh

    International Nuclear Information System (INIS)

    Devai, I.; DeLaune, R.D.; Henry, C.B. Jr.; Roberts, P.O.; Lindau, C.W.

    1998-01-01

    The environmental significance of atmospheric emissions resulting from in-situ burning, used as a remediation technique for the removal of petroleum hydrocarbons entering Louisiana coastal salt marshes, was quantified. The research documented the atmospheric pollutants produced and emitted during the burning of oil-contaminated wetlands. Samples collected from the smoke plume contained a variety of gaseous sulfur and carbon compounds. Carbonyl sulfide and carbon disulfide were the main volatile sulfur compounds. In contrast, concentrations of sulfur dioxide were almost negligible. Concentrations of methane and carbon dioxide in the smoke plume increased compared to ambient levels. Air samples collected for aromatic hydrocarbons in the smoke plume were dominated by pyrogenic or combustion-derived aromatic hydrocarbons. The particulate fraction was dominated by phenanthrene and the C-1 and C-2 alkylated phenanthrene homologues. The vapor fraction was dominated by naphthalene and the C-1 to C-3 naphthalene homologues. (author)

  20. Functional analysis of rare variants in mismatch repair proteins augments results from computation-based predictive methods

    Science.gov (United States)

    Arora, Sanjeevani; Huwe, Peter J.; Sikder, Rahmat; Shah, Manali; Browne, Amanda J.; Lesh, Randy; Nicolas, Emmanuelle; Deshpande, Sanat; Hall, Michael J.; Dunbrack, Roland L.; Golemis, Erica A.

    2017-01-01

    ABSTRACT The cancer-predisposing Lynch Syndrome (LS) arises from germline mutations in DNA mismatch repair (MMR) genes, predominantly MLH1, MSH2, MSH6, and PMS2. A major challenge for clinical diagnosis of LS is the frequent identification of variants of uncertain significance (VUS) in these genes, as it is often difficult to determine variant pathogenicity, particularly for missense variants. Generic programs such as SIFT and PolyPhen-2, and MMR gene-specific programs such as PON-MMR and MAPP-MMR, are often used to predict deleterious or neutral effects of VUS in MMR genes. We evaluated the performance of multiple predictive programs in the context of functional biologic data for 15 VUS in MLH1, MSH2, and PMS2. Using cell line models, we characterized VUS predicted to range from neutral to pathogenic on mRNA and protein expression, basal cellular viability, viability following treatment with a panel of DNA-damaging agents, and functionality in DNA damage response (DDR) signaling, benchmarking to wild-type MMR proteins. Our results suggest that the MMR gene-specific classifiers do not always align with the experimental phenotypes related to DDR. Our study highlights the importance of complementary experimental and computational assessment to develop future predictors for the assessment of VUS. PMID:28494185

  1. Significance and management of computed tomography detected pulmonary nodules: a report from the National Wilms Tumor Study Group

    International Nuclear Information System (INIS)

    Meisel, Jay A.; Guthrie, Katherine A.; Breslow, Norman E.; Donaldson, Sarah S.; Green, Daniel M.

    1999-01-01

    Purpose: To define the optimal treatment for children with Wilms tumor who have pulmonary nodules identified on chest computed tomography (CT) scan, but have a negative chest radiograph, we evaluated the outcome of all such patients randomized or followed on National Wilms Tumor Study (NWTS)-3 and -4. Patients and Methods: We estimated the event-free and overall survival percentages of 53 patients with favorable histology tumors and pulmonary densities identified only by CT scan (CT-only) who were treated as Stage IV with intensive doxorubicin-containing chemotherapy and whole-lung irradiation, and compared these to the event-free and overall survival percentages of 37 CT-only patients who were treated less aggressively based on the extent of locoregional disease with 2 or 3 drugs, and without whole-lung irradiation. Results: The 4-year event-free and overall survival percentages of the 53 patients with CT-only nodules and favorable histology Wilms tumor who were treated as Stage IV were 89% and 91%, respectively. The 4-year event-free and overall survival percentages for the 37 patients with CT-only nodules and favorable histology who were treated according to the extent of locoregional disease were 80% and 85%, respectively. The differences observed between the 2 groups were not statistically significant. Among the patients who received whole-lung irradiation, there were fewer pulmonary relapses, but more deaths attributable to lung toxicity. Conclusions: The current data raise the possibility that children with Wilms tumor and CT-only pulmonary nodules who receive whole lung irradiation have fewer pulmonary relapses, but a greater number of deaths due to treatment toxicity. The role of whole lung irradiation in the treatment of this group of patients cannot be definitively determined based on the present data. Prolonged follow-up of this group of patients is necessary to accurately estimate the frequency of late, treatment-related mortality

  2. Dry eye syndrome among computer users

    Science.gov (United States)

    Gajta, Aurora; Turkoanje, Daniela; Malaescu, Iosif; Marin, Catalin-Nicolae; Koos, Marie-Jeanne; Jelicic, Biljana; Milutinovic, Vuk

    2015-12-01

    Dry eye syndrome is characterized by eye irritation due to changes of the tear film. Symptoms include itching, foreign body sensations, mucous discharge and transitory vision blurring. Less frequent symptoms include photophobia and eye tiredness. The aim of the work was to determine the quality of the tear film and the potential risk of ocular dryness in persons who spend more than 8 hours using computers, and possible correlations between severity of symptoms (dry eye symptoms anamnesis) and clinical signs assessed by: Schirmer test I, TBUT (tear break-up time) and TFT (tear ferning test). The results show that subjects using computers have significantly shorter TBUT (less than 5 s for 56% of subjects and less than 10 s for 37% of subjects); TFT type II/III was found in 50% of subjects and type III in 31% of subjects, compared to computer non-users (TFT type I and II was present in 85.71% of subjects). Visual display terminal use of more than 8 hours daily has been identified as a significant risk factor for dry eye. All persons who spend substantial time using computers are advised to use artificial tear drops in order to minimize the symptoms of dry eye syndrome and prevent serious complications.

  3. An initiative to improve the management of clinically significant test results in a large health care network.

    Science.gov (United States)

    Roy, Christopher L; Rothschild, Jeffrey M; Dighe, Anand S; Schiff, Gordon D; Graydon-Baker, Erin; Lenoci-Edwards, Jennifer; Dwyer, Cheryl; Khorasani, Ramin; Gandhi, Tejal K

    2013-11-01

    The failure of providers to communicate and follow up clinically significant test results (CSTR) is an important threat to patient safety. The Massachusetts Coalition for the Prevention of Medical Errors has endorsed the creation of systems to ensure that results can be received and acknowledged. In 2008 a task force was convened that represented clinicians, laboratories, radiology, patient safety, risk management, and information systems in a large health care network with the goals of providing recommendations and a road map for improvement in the management of CSTR and of implementing this improvement plan during the subsequent five years. In drafting its charter, the task force broadened the scope from "critical" results to "clinically significant" ones; clinically significant was defined as any result that requires further clinical action to avoid morbidity or mortality, regardless of the urgency of that action. The task force recommended four key areas for improvement--(1) standardization of policies and definitions, (2) robust identification of the patient's care team, (3) enhanced results management/tracking systems, and (4) centralized quality reporting and metrics. The task force faced many challenges in implementing these recommendations, including disagreements on definitions of CSTR and on who should have responsibility for CSTR, changes to established work flows, limitations of resources and of existing information systems, and definition of metrics. This large-scale effort to improve the communication and follow-up of CSTR in a health care network continues with ongoing work to address implementation challenges, refine policies, prepare for a new clinical information system platform, and identify new ways to measure the extent of this important safety problem.

  4. CMS software and computing for LHC Run 2

    CERN Document Server

    INSPIRE-00067576

    2016-11-09

    The CMS offline software and computing system has successfully met the challenge of LHC Run 2. In this presentation, we will discuss how the entire system was improved in anticipation of increased trigger output rate, increased rate of pileup interactions and the evolution of computing technology. The primary goals behind these changes were to increase the flexibility of computing facilities wherever possible, to increase our operational efficiency, and to decrease the computing resources needed to accomplish the primary offline computing workflows. These changes have resulted in a new approach to distributed computing in CMS for Run 2 and for the future as the LHC luminosity should continue to increase. We will discuss changes and plans to our data federation, which was one of the key changes towards a more flexible computing model for Run 2. Our software framework and algorithms also underwent significant changes. We will summarize our experience with a new multi-threaded framework as deployed on ou...

  5. Scientific Computing Kernels on the Cell Processor

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Samuel W.; Shalf, John; Oliker, Leonid; Kamil, Shoaib; Husbands, Parry; Yelick, Katherine

    2007-04-04

    The slowing pace of commodity microprocessor performance improvements combined with ever-increasing chip power demands has become of utmost concern to computational scientists. As a result, the high performance computing community is examining alternative architectures that address the limitations of modern cache-based designs. In this work, we examine the potential of using the recently-released STI Cell processor as a building block for future high-end computing systems. Our work contains several novel contributions. First, we introduce a performance model for Cell and apply it to several key scientific computing kernels: dense matrix multiply, sparse matrix vector multiply, stencil computations, and 1D/2D FFTs. The difficulty of programming Cell, which requires assembly level intrinsics for the best performance, makes this model useful as an initial step in algorithm design and evaluation. Next, we validate the accuracy of our model by comparing results against published hardware results, as well as our own implementations on a 3.2GHz Cell blade. Additionally, we compare Cell performance to benchmarks run on leading superscalar (AMD Opteron), VLIW (Intel Itanium2), and vector (Cray X1E) architectures. Our work also explores several different mappings of the kernels and demonstrates a simple and effective programming model for Cell's unique architecture. Finally, we propose modest microarchitectural modifications that could significantly increase the efficiency of double-precision calculations. Overall results demonstrate the tremendous potential of the Cell architecture for scientific computations in terms of both raw performance and power efficiency.
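
    A first-order performance model of the kind the abstract mentions can be as simple as a roofline bound: a kernel's attainable rate is limited by the smaller of the machine's peak compute rate and its memory bandwidth multiplied by the kernel's arithmetic intensity. The hardware figures and per-kernel intensities below are rough, assumed numbers for illustration, not the paper's measured or modeled values.

      # Roofline-style bound: attainable GFlop/s = min(peak, bandwidth * arithmetic intensity).
      # Hardware numbers are rough assumptions for a first-generation Cell processor.
      peak_gflops = 14.6        # double-precision peak (assumed)
      bandwidth_gbs = 25.6      # XDR memory bandwidth in GB/s (assumed)

      kernels = {
          "SpMV (CSR)":   0.16,   # flops per byte moved (textbook estimate)
          "7-pt stencil": 0.33,
          "1D FFT":       1.5,
          "dense matmul": 8.0,
      }

      for name, intensity in kernels.items():
          attainable = min(peak_gflops, bandwidth_gbs * intensity)
          bound = "memory-bound" if attainable < peak_gflops else "compute-bound"
          print(f"{name:>14}: <= {attainable:5.1f} GFlop/s ({bound})")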

  6. Computer tomography in Caisson's disease

    International Nuclear Information System (INIS)

    Horvath, F.; Csobaly, S.; Institute for Advanced Training of Physicians, Budapest

    1981-01-01

    Computed tomography was performed on 20 patients with the early stages of Caisson osteoarthropathy, as well as on other patients with chronic bone infarcts. From their results the authors have formed the opinion that CT is valuable not only in the diagnosis of early cases, but also in providing significant information concerning the osteopathy and bone infarcts. (orig.)

  7. Implementation of depression screening in antenatal clinics through tablet computers: results of a feasibility study.

    Science.gov (United States)

    Marcano-Belisario, José S; Gupta, Ajay K; O'Donoghue, John; Ramchandani, Paul; Morrison, Cecily; Car, Josip

    2017-05-10

    Mobile devices may facilitate depression screening in the waiting area of antenatal clinics. This can present implementation challenges, of which we focused on survey layout and technology deployment. We assessed the feasibility of using tablet computers to administer a socio-demographic survey, the Whooley questions and the Edinburgh Postnatal Depression Scale (EPDS) to 530 pregnant women attending National Health Service (NHS) antenatal clinics across England. We randomised participants to one of two layout versions of these surveys: (i) a scrolling layout where each survey was presented on a single screen; or (ii) a paging layout where only one question appeared on the screen at any given time. Overall, 85.10% of eligible pregnant women agreed to take part. Of these, 90.95% completed the study procedures. Approximately 23% of participants answered Yes to at least one Whooley question, and approximately 13% of them scored 10 points or more on the EPDS. We observed no association between survey layout and the responses given to the Whooley questions, the median EPDS scores, the number of participants at increased risk of self-harm, and the number of participants asking for technical assistance. However, we observed a difference in the number of participants at each EPDS scoring interval (p = 0.008), which provides an indication of a woman's risk of depression. A scrolling layout resulted in faster completion times (median = 4 min 46 s) than a paging layout (median = 5 min 33 s) (p = 0.024). However, the clinical significance of this difference (47.5 s) is yet to be determined. Tablet computers can be used for depression screening in the waiting area of antenatal clinics. This requires the careful consideration of clinical workflows, and technology-related issues such as connectivity and security. An association between survey layout and EPDS scoring intervals needs to be explored further to determine if it corresponds to a survey layout effect.

  8. Vanderbilt University: Campus Computing Environment.

    Science.gov (United States)

    CAUSE/EFFECT, 1988

    1988-01-01

    Despite the decentralized nature of computing at Vanderbilt, there is significant evidence of cooperation and use of each other's resources by the various computing entities. Planning for computing occurs in every school and department. Caravan, a campus-wide network, is described. (MLW)

  9. Computed tomography of the chest in blunt thoracic trauma: results of a prospective study

    International Nuclear Information System (INIS)

    Blostein, P.; Hodgman, C.

    1998-01-01

    Blunt thoracic injuries detected by computed tomography of the chest infrequently require immediate therapy. If immediate therapy is needed, findings will be visible on plain roentgenograms or on clinical exam. Routine computed tomography of the chest in blunt trauma is not recommended but may be helpful in selected cases. (N.C.)

  10. High-integrity software, computation and the scientific method

    International Nuclear Information System (INIS)

    Hatton, L.

    2012-01-01

    Computation rightly occupies a central role in modern science. Datasets are enormous and the processing implications of some algorithms are equally staggering. With the continuing difficulties in quantifying the results of complex computations, it is of increasing importance to understand the role of computation in the essentially Popperian scientific method. In this paper, some of the problems with computation, for example the long-term unquantifiable presence of undiscovered defects, problems with programming languages and process issues, will be explored with numerous examples. One of the aims of the paper is to understand the implications of trying to produce high-integrity software and the limitations which still exist. Unfortunately, Computer Science itself suffers from an inability to be suitably critical of its practices and has operated in a largely measurement-free vacuum since its earliest days. Within computer science itself, this has not been so damaging in that it simply leads to unconstrained creativity and a rapid turnover of new technologies. In the applied sciences, however, which have to depend on computational results, such unquantifiability significantly undermines trust. It is time this particular demon was put to rest. (author)

  11. On Elasticity Measurement in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Wei Ai

    2016-01-01

    Full Text Available Elasticity is the foundation of cloud performance and can be considered as a great advantage and a key benefit of cloud computing. However, there is no clear, concise, and formal definition of elasticity measurement, and thus no effective approach to elasticity quantification has been developed so far. Existing work on elasticity lacks a solid and technical way of defining elasticity measurement, and existing definitions of elasticity metrics have not been accurate enough to capture the essence of elasticity measurement. In this paper, we present a new definition of elasticity measurement and propose a quantifying and measuring method using a continuous-time Markov chain (CTMC) model, which is easy to use for precise calculation of the elasticity value of a cloud computing platform. Our numerical results demonstrate the basic parameters affecting elasticity as measured by the proposed measurement approach. Furthermore, our simulation and experimental results validate that the proposed measurement approach is not only correct but also robust, and is effective in computing and comparing the elasticity of cloud platforms. Our research in this paper makes a significant contribution to quantitative measurement of elasticity in cloud computing.
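
    The paper's elasticity metric itself is not reproduced here, but the CTMC machinery it builds on can be illustrated. The following is a minimal sketch that computes the steady-state distribution of a small continuous-time Markov chain from its generator matrix Q (solving pi Q = 0 with the probabilities summing to 1); the three states and the generator values are purely illustrative assumptions.

```python
import numpy as np

def ctmc_steady_state(Q):
    """Solve pi @ Q = 0 subject to sum(pi) = 1 for an irreducible CTMC."""
    n = Q.shape[0]
    # Replace one balance equation with the normalization constraint.
    A = np.vstack([Q.T[:-1], np.ones(n)])
    b = np.zeros(n)
    b[-1] = 1.0
    return np.linalg.solve(A, b)

# Illustrative 3-state chain: under-provisioned, just-right, over-provisioned.
Q = np.array([[-0.4,  0.4,  0.0],
              [ 0.2, -0.5,  0.3],
              [ 0.0,  0.6, -0.6]])
print(ctmc_steady_state(Q))  # [0.25 0.5 0.25]
```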

  12. Parallel Computing in SCALE

    International Nuclear Information System (INIS)

    DeHart, Mark D.; Williams, Mark L.; Bowman, Stephen M.

    2010-01-01

    The SCALE computational architecture has remained basically the same since its inception 30 years ago, although constituent modules and capabilities have changed significantly. This SCALE concept was intended to provide a framework whereby independent codes can be linked to provide a more comprehensive capability than possible with the individual programs - allowing flexibility to address a wide variety of applications. However, the current system was designed originally for mainframe computers with a single CPU and with significantly less memory than today's personal computers. It has been recognized that the present SCALE computation system could be restructured to take advantage of modern hardware and software capabilities, while retaining many of the modular features of the present system. Preliminary work is being done to define specifications and capabilities for a more advanced computational architecture. This paper describes the state of current SCALE development activities and plans for future development. With the release of SCALE 6.1 in 2010, a new phase of evolutionary development will be available to SCALE users within the TRITON and NEWT modules. The SCALE (Standardized Computer Analyses for Licensing Evaluation) code system developed by Oak Ridge National Laboratory (ORNL) provides a comprehensive and integrated package of codes and nuclear data for a wide range of applications in criticality safety, reactor physics, shielding, isotopic depletion and decay, and sensitivity/uncertainty (S/U) analysis. Over the last three years, since the release of version 5.1 in 2006, several important new codes have been introduced within SCALE, and significant advances applied to existing codes. Many of these new features became available with the release of SCALE 6.0 in early 2009. However, beginning with SCALE 6.1, a first generation of parallel computing is being introduced. In addition to near-term improvements, a plan for longer term SCALE enhancement

  13. MELMRK 2.0: A description of computer models and results of code testing

    International Nuclear Information System (INIS)

    Wittman, R.S.; Denny, V.; Mertol, A.

    1992-01-01

    An advanced version of the MELMRK computer code has been developed that provides detailed models for conservation of mass, momentum, and thermal energy within relocating streams of molten metallics during meltdown of Savannah River Site (SRS) reactor assemblies. In addition to a mechanistic treatment of transport phenomena within a relocating stream, MELMRK 2.0 retains the MOD1 capability for real-time coupling of the in-depth thermal response of participating assembly heat structure and, further, augments this capability with models for self-heating of relocating melt owing to steam oxidation of metallics and fission product decay power. As was the case for MELMRK 1.0, the MOD2 version offers state-of-the-art numerics for solving coupled sets of nonlinear differential equations. Principal features include application of multi-dimensional Newton-Raphson techniques to accelerate convergence behavior and direct matrix inversion to advance primitive variables from one iterate to the next. Additionally, MELMRK 2.0 provides logical event flags for managing the broad range of code options available for treating such features as (1) coexisting flow regimes, (2) dynamic transitions between flow regimes, and (3) linkages between heatup and relocation code modules. The purpose of this report is to provide a detailed description of the MELMRK 2.0 computer models for melt relocation. Also included are illustrative results for code testing, as well as an integrated calculation for meltdown of a Mark 31a assembly
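
    The abstract mentions multi-dimensional Newton-Raphson iteration for the coupled nonlinear equation sets. The following is a generic, minimal sketch of that technique with an assumed two-equation system for illustration; it is not the MELMRK melt-relocation equations themselves.

```python
import numpy as np

def newton_raphson(F, J, x0, tol=1e-10, max_iter=50):
    """Solve F(x) = 0 for a system of nonlinear equations by Newton-Raphson iteration."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        dx = np.linalg.solve(J(x), -F(x))  # solve J(x) dx = -F(x)
        x = x + dx
        if np.linalg.norm(dx) < tol:
            break
    return x

# Illustrative system: x^2 + y^2 = 4 and x*y = 1.
F = lambda v: np.array([v[0]**2 + v[1]**2 - 4.0, v[0]*v[1] - 1.0])
J = lambda v: np.array([[2*v[0], 2*v[1]], [v[1], v[0]]])
print(newton_raphson(F, J, [2.0, 0.5]))  # converges to approximately (1.932, 0.518)
```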

  14. [False positive results or what's the probability that a significant P-value indicates a true effect?].

    Science.gov (United States)

    Cucherat, Michel; Laporte, Silvy

    2017-09-01

    The use of statistical tests is central in the clinical trial. At the statistical level, obtaining P < 0.05 provides information about the plausibility of the existence of a treatment effect. Even with P < 0.05, however, the probability that the result is a false positive can be very high. This is the case if the power is low, if there is an inflation of the alpha risk, or if the result is exploratory or a chance discovery. This possibility is important to take into consideration when interpreting the results of clinical trials, in order to avoid pushing ahead results that are significant in appearance but are likely to actually be false positive results. Copyright © 2017 Société française de pharmacologie et de thérapeutique. Published by Elsevier Masson SAS. All rights reserved.
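
    The abstract's point can be made concrete with the standard relation between the prior probability of a true effect, statistical power and the alpha level. The following is a minimal sketch; the numeric scenarios are illustrative assumptions, not values from the paper.

```python
def prob_true_effect(prior, power, alpha=0.05):
    """Probability that a 'significant' result reflects a true effect.

    PPV = power * prior / (power * prior + alpha * (1 - prior))
    """
    true_pos = power * prior
    false_pos = alpha * (1.0 - prior)
    return true_pos / (true_pos + false_pos)

# Well-powered test of a plausible hypothesis vs. an underpowered exploratory one.
print(prob_true_effect(prior=0.5, power=0.80))  # ~0.94
print(prob_true_effect(prior=0.1, power=0.20))  # ~0.31: most significant results would be false positives
```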

  15. How the detector resolution affects the clinical significance of SBRT pre-treatment quality assurance results.

    Science.gov (United States)

    Bruschi, A; Esposito, M; Pini, S; Ghirelli, A; Zatelli, G; Russo, S

    2017-12-01

    The aim of this work was to study how the detector resolution can affect the clinical significance of SBRT pre-treatment volumetric modulated arc therapy (VMAT) verification results. Three detectors (PTW OCTAVIUS 4D 729, 1500 and 1000 SRS) used in five configurations with different resolution were compared: 729, 729 merged, 1500, 1500 merged and 1000 SRS. Absolute local gamma passing rates of 3D pre-treatment quality assurance (QA) were evaluated for 150 dose distributions in 30 plans. Five different kinds of error were introduced in order to establish the detection sensitivity of the three devices. Percentage dosimetric differences were evaluated between the planned dose-volume histogram (DVH) and patients' predicted DVH calculated by PTW DVH 4D® software. The mean gamma passing rates and the standard deviations were 92.4% ± 3.7%, 94.6% ± 1.8%, 95.3% ± 4.2%, 97.4% ± 2.5% and 97.6% ± 1.4% respectively for 729, 729 merged, 1500, 1500 merged and 1000 SRS with a 2% local dose/2 mm criterion. The same trend was found in the sensitivity analysis: using a tight gamma analysis criterion (2%L/1 mm), only the 1000 SRS detected every kind of error, while 729 and 1500 merged detected three and four kinds of error respectively. Regarding dose metrics extracted from DVH curves, D50% was within the tolerance level in more than 90% of cases only for the 1000 SRS. The detector resolution can significantly affect the clinical significance of SBRT pre-treatment verification results. The choice of a detector with resolution suitable to the investigated field size is of main importance to avoid false positives. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
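
    For readers unfamiliar with the passing rates quoted above, here is a minimal 1D sketch of a local-dose gamma analysis. Real QA software works on 2D/3D dose grids and handles interpolation and low-dose thresholds, which this sketch omits; the dose profiles are illustrative assumptions, not measured data.

```python
import numpy as np

def gamma_passing_rate(pos, dose_ref, dose_eval, dose_crit=0.02, dta_mm=2.0):
    """1D local-dose gamma analysis: fraction of reference points with gamma <= 1."""
    gammas = []
    for p_r, d_r in zip(pos, dose_ref):
        # Local dose criterion: the dose tolerance scales with the reference dose at this point.
        dd = (dose_eval - d_r) / (dose_crit * d_r)
        dist = (pos - p_r) / dta_mm
        gammas.append(np.min(np.sqrt(dd**2 + dist**2)))
    return np.mean(np.array(gammas) <= 1.0)

# Illustrative profiles sampled every 1 mm; the evaluated dose deviates uniformly by 1%.
pos = np.arange(0.0, 50.0, 1.0)
dose_ref = 2.0 + 0.5 * np.exp(-((pos - 25.0) / 8.0) ** 2)
dose_eval = dose_ref * 1.01
print(gamma_passing_rate(pos, dose_ref, dose_eval))  # 1.0 with the 2%/2 mm criterion
```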

  16. Nurses' computer literacy and attitudes towards the use of computers in health care.

    Science.gov (United States)

    Gürdaş Topkaya, Sati; Kaya, Nurten

    2015-05-01

    This descriptive and cross-sectional study was designed to address nurses' computer literacy and attitudes towards the use of computers in health care and to determine the correlation between these two variables. This study was conducted with the participation of 688 nurses who worked at two university-affiliated hospitals. These nurses were chosen using a stratified random sampling method. The data were collected using the Multicomponent Assessment of Computer Literacy and the Pretest for Attitudes Towards Computers in Healthcare Assessment Scale v. 2. The nurses, in general, had positive attitudes towards computers, and their computer literacy was good. Computer literacy in general had significant positive correlations with individual elements of computer competency and with attitudes towards computers. If the computer is to be an effective and beneficial part of the health-care system, it is necessary to help nurses improve their computer competency. © 2014 Wiley Publishing Asia Pty Ltd.

  17. Description of test facilities devoted to research on sodium aerosols - some significant results

    Energy Technology Data Exchange (ETDEWEB)

    Dolias, M; Lafon, A; Vidard, M; Schaller, K H [DRNR/STRS - Centre de Cadarache, Saint-Paul-lez-Durance (France)

    1977-01-01

    This communication is dedicated to the description of the CEA (French Atomic Energy Authority) test facilities located at CADARACHE, which are used for the study of sodium aerosol behaviour. These test loops are necessary for studying the operation of equipment such as filters, sodium vapour traps, condensers and separators. It is also possible to study the effect of characteristic parameters on the formation, coagulation and carry-over of sodium aerosols in the cover gas. Sodium aerosol deposits in a vertical annular space configuration with a cold area in its upper part are also studied. Some significant results emphasize the importance of operating conditions on the formation of aerosols. (author)

  18. Quantum analogue computing.

    Science.gov (United States)

    Kendon, Vivien M; Nemoto, Kae; Munro, William J

    2010-08-13

    We briefly review what a quantum computer is, what it promises to do for us and why it is so hard to build one. Among the first applications anticipated to bear fruit is the quantum simulation of quantum systems. While most quantum computation is an extension of classical digital computation, quantum simulation differs fundamentally in how the data are encoded in the quantum computer. To perform a quantum simulation, the Hilbert space of the system to be simulated is mapped directly onto the Hilbert space of the (logical) qubits in the quantum computer. This type of direct correspondence is how data are encoded in a classical analogue computer. There is no binary encoding, and increasing precision becomes exponentially costly: an extra bit of precision doubles the size of the computer. This has important consequences for both the precision and error-correction requirements of quantum simulation, and significant open questions remain about its practicality. It also means that the quantum version of analogue computers, continuous-variable quantum computers, becomes an equally efficient architecture for quantum simulation. Lessons from past use of classical analogue computers can help us to build better quantum simulators in future.

  19. Non-invasive imaging of myocardial bridge by coronary computed tomography angiography: the value of transluminal attenuation gradient to predict significant dynamic compression

    Energy Technology Data Exchange (ETDEWEB)

    Li, Yuehua; Yu, Mengmeng; Zhang, Jiayin; Li, Minghua [Shanghai Jiao Tong University Affiliated Sixth People' s Hospital, Institute of Diagnostic and Interventional Radiology, Shanghai (China); Lu, Zhigang; Wei, Meng [Shanghai Jiao Tong University Affiliated Sixth People' s Hospital, Department of Cardiology, Shanghai (China)

    2017-05-15

    To study the diagnostic value of the transluminal attenuation gradient (TAG) measured by coronary computed tomography angiography (CCTA) for identifying relevant dynamic compression of a myocardial bridge (MB). Patients with confirmed MB who underwent both CCTA and ICA within one month were retrospectively included. TAG was defined as the linear regression coefficient between luminal attenuation and distance. The TAG of the MB vessel and the length and depth of the MB were measured and correlated with the presence and degree of dynamic compression observed at ICA. Systolic compression ≥50% was considered significant. 302 patients with confirmed MB lesions were included. TAG was lowest (-17.4 ± 6.7 HU/10 mm) in patients with significant dynamic compression and highest in patients without MB compression (-9.5 ± 4.3 HU/10 mm, p < 0.001). Linear correlation revealed a significant relation between the percentage of systolic compression and TAG (Pearson correlation, r = -0.52, p < 0.001) and no significant relation between the percentage of systolic compression and MB depth or length. ROC curve analysis determined the best cut-off value of TAG as -14.8 HU/10 mm (area under curve = 0.813, 95% confidence interval = 0.764-0.855, p < 0.001), which yielded high diagnostic accuracy (82.1%, 248/302). The degree of ICA-assessed systolic compression of MB significantly correlates with TAG but not MB depth or length. (orig.)
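
    Since TAG is defined above as the linear regression coefficient between luminal attenuation and distance, the computation itself is straightforward. The following is a minimal sketch using an assumed set of centerline measurements for illustration, not patient data from the study.

```python
import numpy as np

def transluminal_attenuation_gradient(distance_mm, attenuation_hu):
    """TAG: slope of luminal attenuation vs. distance, reported in HU per 10 mm."""
    slope_hu_per_mm, _ = np.polyfit(distance_mm, attenuation_hu, 1)
    return slope_hu_per_mm * 10.0

# Illustrative attenuation measurements along a vessel centerline (5 mm sampling).
distance = np.array([0.0, 5.0, 10.0, 15.0, 20.0, 25.0, 30.0])
attenuation = np.array([480.0, 472.0, 465.0, 455.0, 448.0, 440.0, 430.0])
tag = transluminal_attenuation_gradient(distance, attenuation)
print(f"TAG = {tag:.1f} HU/10 mm")  # about -16.5, below the -14.8 cut-off reported above
```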

  20. Cryptography, quantum computation and trapped ions

    Energy Technology Data Exchange (ETDEWEB)

    Hughes, Richard J.

    1998-03-01

    The significance of quantum computation for cryptography is discussed. Following a brief survey of the requirements for quantum computational hardware, an overview of the ion trap quantum computation project at Los Alamos is presented. The physical limitations to quantum computation with trapped ions are analyzed and an assessment of the computational potential of the technology is made.

  1. The Computational Materials Repository

    DEFF Research Database (Denmark)

    Landis, David D.; Hummelshøj, Jens S.; Nestorov, Svetlozar

    2012-01-01

    The possibilities for designing new materials based on quantum physics calculations are rapidly growing, but these design efforts lead to a significant increase in the amount of computational data created. The Computational Materials Repository (CMR) addresses this data challenge and provides...

  2. Preparing computers for affective communication: a psychophysiological concept and preliminary results.

    Science.gov (United States)

    Whang, Min Cheol; Lim, Joa Sang; Boucsein, Wolfram

    Despite rapid advances in technology, computers remain incapable of responding to human emotions. An exploratory study was conducted to find out what physiological parameters might be useful to differentiate among 4 emotional states, based on 2 dimensions: pleasantness versus unpleasantness and arousal versus relaxation. The 4 emotions were induced by exposing 26 undergraduate students to different combinations of olfactory and auditory stimuli, selected in a pretest from 12 stimuli by subjective ratings of arousal and valence. Changes in electroencephalographic (EEG), heart rate variability, and electrodermal measures were used to differentiate the 4 emotions. EEG activity separates pleasantness from unpleasantness only in the aroused but not in the relaxed domain, where electrodermal parameters are the differentiating ones. All three classes of parameters contribute to a separation between arousal and relaxation in the positive valence domain, whereas the latency of the electrodermal response is the only differentiating parameter in the negative domain. We discuss how such a psychophysiological approach may be incorporated into a systemic model of a computer responsive to affective communication from the user.

  3. A Review on the Role of Color and Light in Affective Computing

    Directory of Open Access Journals (Sweden)

    Marina V. Sokolova

    2015-08-01

    Full Text Available Light and color are ubiquitous environmental factors which have an influence on human beings. Hence, light and color issues have to be considered especially significant in human-computer interaction (HCI) and fundamental in affective computing. Affective computing is an interdisciplinary research field which aims to integrate issues dealing with emotions and computers. As a consequence, it seems important to provide an updated review on the significance of light and color in affective computing. With this purpose, the relationship between HCI/affective computing and the emotions affected by light and color is introduced first. So far, there is a considerable number of studies and experiments that offer empirical results on the topic. In addition, the color models generally used in affective computing are briefly described. The review on the usage of color and light in affective computing includes a detailed study of the characteristics of methods and the most recent research trends. The paper is complemented with the study of the importance of light and color from demographic, gender and cultural perspectives.

  4. Parallel quantum computing in a single ensemble quantum computer

    International Nuclear Information System (INIS)

    Long Guilu; Xiao, L.

    2004-01-01

    We propose a parallel quantum computing mode for an ensemble quantum computer. In this mode, some qubits are in pure states while other qubits are in mixed states. It enables a single ensemble quantum computer to perform 'single-instruction, multiple-data' type parallel computation. Parallel quantum computing can provide additional speedup in Grover's algorithm and Shor's algorithm. In addition, it also makes fuller use of qubit resources in an ensemble quantum computer. As a result, some qubits discarded in the preparation of an effective pure state in the Schulman-Vazirani and the Cleve-DiVincenzo algorithms can be reutilized.

  5. The impact of optimizing received solar radiation and energy disposal on architectural design results by using computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Rezaei, Davood; Farajzadeh Khosroshahi, Samaneh; Sadegh Falahat, Mohammad [Zanjan University (Iran, Islamic Republic of)], email: d_rezaei@znu.ac.ir, email: ronas_66@yahoo.com, email: Safalahat@yahoo.com

    2011-07-01

    In order to minimize the energy consumption of a building, it is important to make optimum use of solar energy. The aim of this paper is to introduce the use of computer modeling in the early stages of design to optimize the solar radiation received and energy disposal in an architectural design. Computer modeling was performed on 2 different projects located in Los Angeles, USA, using ECOTECT software. Changes were made to the designs following analysis of the modeling results, and a subsequent analysis was carried out on the optimized designs. Results showed that the computer simulation allows the designer to set the analysis criteria and improve the energy performance of a building before it is constructed; moreover, it can be used for a wide range of optimization levels. This study pointed out that computer simulation should be performed in the design stage to optimize a building's energy performance.

  6. Computer science handbook

    CERN Document Server

    Tucker, Allen B

    2004-01-01

    Due to the great response to the famous Computer Science Handbook edited by Allen B. Tucker, … in 2004 Chapman & Hall/CRC published a second edition of this comprehensive reference book. Within more than 70 chapters, every one new or significantly revised, one can find any kind of information and references about computer science one can imagine. … All in all, there is absolutely nothing about computer science that cannot be found in the encyclopedia with its 110 survey articles …-Christoph Meinel, Zentralblatt MATH

  7. First results with twisted mass fermions towards the computation of parton distribution functions on the lattice

    International Nuclear Information System (INIS)

    Alexandrou, Constantia; Cyprus Institute, Nicosia; Deutsches Elektronen-Synchrotron; Cichy, Krzysztof; Poznan Univ.; Drach, Vincent; Garcia-Ramos, Elena; Humboldt-Universitaet, Berlin; Hadjiyiannakou, Kyriakos; Jansen, Karl; Steffens, Fernanda; Wiese, Christian

    2014-11-01

    We report on our exploratory study for the evaluation of the parton distribution functions from lattice QCD, based on a new method proposed in Ref. arXiv:1305.1539. Using the example of the nucleon, we compare two different methods to compute the matrix elements needed, and investigate the application of gauge link smearing. We also present first results from a large production ensemble and discuss the future challenges related to this method.

  8. Computer-assisted comparison of analysis and test results in transportation experiments

    International Nuclear Information System (INIS)

    Knight, R.D.; Ammerman, D.J.; Koski, J.A.

    1998-01-01

    As a part of its ongoing research efforts, Sandia National Laboratories' Transportation Surety Center investigates the integrity of various containment methods for hazardous materials transport, subject to anomalous structural and thermal events such as free-fall impacts, collisions, and fires in both open and confined areas. Since it is not possible to conduct field experiments for every set of possible conditions under which an actual transportation accident might occur, accurate modeling methods must be developed which will yield reliable simulations of the effects of accident events under various scenarios. This requires computer software which is capable of assimilating and processing data from experiments performed as benchmarks, as well as data obtained from numerical models that simulate the experiment. Software tools which can present all of these results in a meaningful and useful way to the analyst are a critical aspect of this process. The purpose of this work is to provide software resources on a long term basis, and to ensure that the data visualization capabilities of the Center keep pace with advancing technology. This will provide leverage for its modeling and analysis abilities in a rapidly evolving hardware/software environment

  9. The determination of surface of powders by BET method using nitrogen and krypton with computer calculation of the results

    International Nuclear Information System (INIS)

    Dembinski, W.; Zlotowski, T.

    1973-01-01

    A computer program written in the FORTRAN language for calculation of the final results of specific surface analysis based on the BET theory is described. Two gases, nitrogen and krypton, were used. A technical description of the measuring apparatus is presented, as well as the theoretical basis of the calculations, together with a statistical analysis of the results for uranium compound powders. (author)
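
    A minimal sketch of the kind of BET evaluation such a program performs is shown below, assuming nitrogen adsorption data with volumes at STP. The isotherm points, sample mass and helper names are illustrative assumptions, not the measurements or the FORTRAN routines described in the record.

```python
import numpy as np

def bet_specific_surface(p_rel, v_ads_cm3_stp, sample_mass_g,
                         sigma_nm2=0.162, v_molar_cm3=22414.0):
    """Specific surface area (m^2/g) from the linearized BET equation.

    Linear form: (p/p0) / (v * (1 - p/p0)) = 1/(vm*c) + ((c-1)/(vm*c)) * (p/p0)
    so the monolayer volume is vm = 1 / (slope + intercept).
    """
    y = p_rel / (v_ads_cm3_stp * (1.0 - p_rel))
    slope, intercept = np.polyfit(p_rel, y, 1)
    vm = 1.0 / (slope + intercept)            # monolayer volume, cm^3 STP
    n_avogadro = 6.022e23
    area_m2 = vm / v_molar_cm3 * n_avogadro * sigma_nm2 * 1e-18
    return area_m2 / sample_mass_g

# Illustrative isotherm in the usual BET range (p/p0 = 0.05-0.30).
p_rel = np.array([0.05, 0.10, 0.15, 0.20, 0.25, 0.30])
v_ads = np.array([2.1, 2.5, 2.8, 3.1, 3.4, 3.7])   # cm^3 STP
print(f"{bet_specific_surface(p_rel, v_ads, sample_mass_g=1.0):.1f} m^2/g")
```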

  10. Computer ray tracing speeds.

    Science.gov (United States)

    Robb, P; Pawlowski, B

    1990-05-01

    The results of measuring the ray trace speed and compilation speed of thirty-nine computers in fifty-seven configurations, ranging from personal computers to supercomputers, are described. A correlation of ray trace speed has been made with the LINPACK benchmark which allows the ray trace speed to be estimated using LINPACK performance data. The results indicate that the latest generation of workstations, using CPUs based on RISC (Reduced Instruction Set Computer) technology, are as fast or faster than mainframe computers in compute-bound situations.

  11. Extreme Scale Computing to Secure the Nation

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D L; McGraw, J R; Johnson, J R; Frincke, D

    2009-11-10

    absence of nuclear testing, a program to: (1) Support a focused, multifaceted program to increase the understanding of the enduring stockpile; (2) Predict, detect, and evaluate potential problems of the aging of the stockpile; (3) Refurbish and re-manufacture weapons and components, as required; and (4) Maintain the science and engineering institutions needed to support the nation's nuclear deterrent, now and in the future'. This program continues to fulfill its national security mission by adding significant new capabilities for producing scientific results through large-scale computational simulation coupled with careful experimentation, including sub-critical nuclear experiments permitted under the CTBT. To develop the computational science and the computational horsepower needed to support its mission, SBSS initiated the Accelerated Strategic Computing Initiative, later renamed the Advanced Simulation & Computing (ASC) program (sidebar: 'History of ASC Computing Program Computing Capability'). The modern 3D computational simulation capability of the ASC program supports the assessment and certification of the current nuclear stockpile through calibration with past underground test (UGT) data. While an impressive accomplishment, continued evolution of national security mission requirements will demand computing resources at a significantly greater scale than we have today. In particular, continued observance and potential Senate confirmation of the Comprehensive Test Ban Treaty (CTBT) together with the U.S. administration's promise for a significant reduction in the size of the stockpile and the inexorable aging and consequent refurbishment of the stockpile all demand increasing refinement of our computational simulation capabilities. Assessment of the present and future stockpile with increased confidence of the safety and reliability without reliance upon calibration with past or future test data is a long-term goal of the ASC program. This

  12. Implementation of the Principal Component Analysis onto High-Performance Computer Facilities for Hyperspectral Dimensionality Reduction: Results and Comparisons

    Directory of Open Access Journals (Sweden)

    Ernestina Martel

    2018-06-01

    Full Text Available Dimensionality reduction represents a critical preprocessing step in order to increase the efficiency and the performance of many hyperspectral imaging algorithms. However, dimensionality reduction algorithms, such as the Principal Component Analysis (PCA), suffer from their computationally demanding nature, making it advisable to implement them on high-performance computer architectures for applications under strict latency constraints. This work presents the implementation of the PCA algorithm on two different high-performance devices, namely, an NVIDIA Graphics Processing Unit (GPU) and a Kalray manycore, uncovering a highly valuable set of tips and tricks in order to take full advantage of the inherent parallelism of these high-performance computing platforms, and hence reducing the time that is required to process a given hyperspectral image. Moreover, the results obtained with different hyperspectral images have been compared with the ones that were obtained with a field programmable gate array (FPGA)-based implementation of the PCA algorithm that has been recently published, providing, for the first time in the literature, a comprehensive analysis in order to highlight the pros and cons of each option.
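
    For reference, the PCA dimensionality-reduction step itself can be sketched in a few lines of NumPy applied to a hyperspectral cube (rows x columns x bands). This is a serial illustration only and deliberately ignores the GPU/manycore parallelization that is the actual subject of the paper; the random cube stands in for a real image.

```python
import numpy as np

def pca_reduce(cube, n_components):
    """Project a hyperspectral cube (rows, cols, bands) onto its leading principal components."""
    rows, cols, bands = cube.shape
    pixels = cube.reshape(-1, bands).astype(np.float64)
    centered = pixels - pixels.mean(axis=0)
    cov = centered.T @ centered / (pixels.shape[0] - 1)    # bands x bands covariance
    eigvals, eigvecs = np.linalg.eigh(cov)                 # eigenvalues in ascending order
    top = eigvecs[:, ::-1][:, :n_components]               # leading eigenvectors
    return (centered @ top).reshape(rows, cols, n_components)

# Illustrative random cube standing in for a real hyperspectral image.
cube = np.random.rand(64, 64, 200)
reduced = pca_reduce(cube, n_components=10)
print(reduced.shape)  # (64, 64, 10)
```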

  13. Determining who responds better to a computer- vs. human-delivered physical activity intervention: results from the community health advice by telephone (CHAT) trial

    Science.gov (United States)

    2013-01-01

    Background Little research has explored who responds better to an automated vs. human advisor for health behaviors in general, and for physical activity (PA) promotion in particular. The purpose of this study was to explore baseline factors (i.e., demographics, motivation, interpersonal style, and external resources) that moderate intervention efficacy delivered by either a human or automated advisor. Methods Data were from the CHAT Trial, a 12-month randomized controlled trial to increase PA among underactive older adults (full trial N = 218) via a human advisor or automated interactive voice response advisor. Trial results indicated significant increases in PA in both interventions by 12 months that were maintained at 18-months. Regression was used to explore moderation of the two interventions. Results Results indicated amotivation (i.e., lack of intent in PA) moderated 12-month PA (d = 0.55, p  0.12). Conclusions Results provide preliminary evidence for generating hypotheses about pathways for supporting later clinical decision-making with regard to the use of either human- vs. computer-delivered interventions for PA promotion. PMID:24053756

  14. Higher-order techniques in computational electromagnetics

    CERN Document Server

    Graglia, Roberto D

    2016-01-01

    Higher-Order Techniques in Computational Electromagnetics explains 'high-order' techniques that can significantly improve the accuracy, computational cost, and reliability of computational techniques for high-frequency electromagnetics, such as antennas, microwave devices and radar scattering applications.

  15. 3D computation of the shape of etched tracks in CR-39 for oblique particle incidence and comparison with experimental results

    International Nuclear Information System (INIS)

    Doerschel, B.; Hermsdorf, D.; Reichelt, U.; Starke, S.; Wang, Y.

    2003-01-01

    Computation of the shape of etch pits requires knowledge of the varying track etch rate along the particle trajectories. Experiments with alpha particles and 7Li ions entering CR-39 detectors at different angles showed that this function is not affected by the inclination of the particle trajectory with respect to the normal on the detector surface. Track formation for oblique particle incidence can, therefore, be simulated using the track etch rates determined for perpendicular incidence. 3D computation of the track shape was performed applying a model recently described in the literature. A special program has been written for computing the x,y,z coordinates of points on the etch pit walls. In addition, the etch pit profiles in sagittal sections as well as the contours of the etch pit openings on the detector surface have been determined experimentally. Computed and experimental results were in good agreement, confirming the applicability of the 3D computational model in combination with the functions for the depth-dependent track etch rates determined experimentally.

  16. Constructing Precisely Computing Networks with Biophysical Spiking Neurons.

    Science.gov (United States)

    Schwemmer, Michael A; Fairhall, Adrienne L; Denéve, Sophie; Shea-Brown, Eric T

    2015-07-15

    While spike timing has been shown to carry detailed stimulus information at the sensory periphery, its possible role in network computation is less clear. Most models of computation by neural networks are based on population firing rates. In equivalent spiking implementations, firing is assumed to be random such that averaging across populations of neurons recovers the rate-based approach. Recently, however, Denéve and colleagues have suggested that the spiking behavior of neurons may be fundamental to how neuronal networks compute, with precise spike timing determined by each neuron's contribution to producing the desired output (Boerlin and Denéve, 2011; Boerlin et al., 2013). By postulating that each neuron fires to reduce the error in the network's output, it was demonstrated that linear computations can be performed by networks of integrate-and-fire neurons that communicate through instantaneous synapses. This left open, however, the possibility that realistic networks, with conductance-based neurons with subthreshold nonlinearity and the slower timescales of biophysical synapses, may not fit into this framework. Here, we show how the spike-based approach can be extended to biophysically plausible networks. We then show that our network reproduces a number of key features of cortical networks including irregular and Poisson-like spike times and a tight balance between excitation and inhibition. Lastly, we discuss how the behavior of our model scales with network size or with the number of neurons "recorded" from a larger computing network. These results significantly increase the biological plausibility of the spike-based approach to network computation. We derive a network of neurons with standard spike-generating currents and synapses with realistic timescales that computes based upon the principle that the precise timing of each spike is important for the computation. We then show that our network reproduces a number of key features of cortical networks

  17. Preliminary Results of Emergency Computed Tomography-Guided Ventricular Drain Placement-Precision for the Most Difficult Cases.

    Science.gov (United States)

    Nowacki, Andreas; Wagner, Franca; Söll, Nicole; Hakim, Arsany; Beck, Jürgen; Raabe, Andreas; Z'Graggen, Werner J

    2018-04-05

    External ventricular drainage (EVD) catheter placement is one of the most commonly performed neurosurgical procedures. The study's objective was to compare a computed tomography (CT) bolt scan-guided approach for the placement of EVDs with conventional landmark-based insertion. In this retrospective case-control study, we analyzed patients undergoing bolt-kit EVD catheter placement, either CT-guided or landmark-based, between 2013 and 2016. The CT bolt scan-guided approach was based on a dose-reduced CT scan after bolt fixation with immediate image reconstruction along the axis of the bolt to evaluate the putative insertion axis. If needed, angulation of the bolt was corrected and the procedure repeated before the catheter was inserted. Primary endpoint was the accuracy of insertion. Secondary endpoints were the overall number of attempts, duration of intervention, complication rates, and cumulative radiation dose. In total, 34 patients were included in the final analysis. In the group undergoing CT-guided placement, the average ventricle width was significantly smaller (P = 0.04) and average midline shift significantly more pronounced (P = 0.01). CT-guided placement resulted in correct positioning of the catheter in the ipsilateral frontal horn in all 100% of the cases compared with landmark-guided insertion (63%; P = 0.01). Application of the CT-guided approach increased the number of total CT scans (3.6 ± 1.9) and the overall radiation dose (3.34 ± 1.61 mSv) compared with the freehand insertion group (1.84 ± 2.0 mSv and 1.55 ± 1.66 mSv). No differences were found for the other secondary outcome parameters. CT-guided bolt-kit EVD catheter placement is feasible and accurate in the most difficult cases. Copyright © 2018 Elsevier Inc. All rights reserved.

  18. Evaluation of results for computed tomography in head region

    International Nuclear Information System (INIS)

    Himeji, Toshiharu

    1983-01-01

    Over the 2 years and 5 months from April 1980 to May 1982, computed tomography (CT) of the head region was performed with a TCT-60A scanner (Toshiba), and the results were evaluated as follows. 1) CT was performed in 1228 patients, for a total of 1513 scans. The examinations comprised plain CT (86.1%), contrast-enhanced (CE) CT (7.3%) and both methods (6.6%), and ranged from a single scan per patient (85.3%) through 2 scans (9.6%) and 3 scans (3.3%) up to 7 scans. The cases were 720 males (58.6%) and 508 females (41.4%); most patients were between 40 and over 70 years of age, but young patients (under 10 years) accounted for 15.3%, reflecting the advantage of CT as an easy and safe procedure that causes no bodily injury. 2) By referring department, most patients came from the internal medicine clinic, followed by the pediatric clinic, surgery and the orthopedic department; cases were also referred from all other clinical departments of the hospital (CT was very useful for neurological examination). 3) The CT diagnoses included cerebral infarction in 128 cases (10.4%), cerebral hemorrhage in 19 (1.5%) and brain tumor in 24 (2.3%), with small numbers of other craniocerebral diseases. 4) Patients from internal medicine often complained of cerebrovascular symptoms; in the pediatric clinic the chief complaints were suspected mental retardation and neurological signs; in the surgery department, suspected brain metastases from other malignant cancers; and in orthopedic surgery, skull injury or traffic accidents. (J.P.N.)

  19. Development and application of a new deterministic method for calculating computer model result uncertainties

    International Nuclear Information System (INIS)

    Maerker, R.E.; Worley, B.A.

    1989-01-01

    Interest in research into the field of uncertainty analysis has recently been stimulated as a result of a need in high-level waste repository design assessment for uncertainty information in the form of response complementary cumulative distribution functions (CCDFs) to show compliance with regulatory requirements. The solution to this problem must obviously rely on the analysis of computer code models, which, however, employ parameters that can have large uncertainties. The motivation for the research presented in this paper is a search for a method involving a deterministic uncertainty analysis approach that could serve as an improvement over those methods that make exclusive use of statistical techniques. A deterministic uncertainty analysis (DUA) approach based on the use of first derivative information is the method studied in the present procedure. The present method has been applied to a high-level nuclear waste repository problem involving use of the codes ORIGEN2, SAS, and BRINETEMP in series, and the resulting CDF of a BRINETEMP result of interest is compared with that obtained through a completely statistical analysis
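
    The essence of a first-derivative-based deterministic uncertainty analysis can be sketched generically: propagate parameter variances through the model using sensitivities estimated by finite differences. This is an illustration under assumed inputs, not the ORIGEN2/SAS/BRINETEMP calculation or the CCDF construction described in the report.

```python
import numpy as np

def first_order_uncertainty(model, params, sigmas, rel_step=1e-4):
    """First-order (derivative-based) propagation: var(R) ~ sum_i (dR/dp_i)^2 * sigma_i^2."""
    params = np.asarray(params, dtype=float)
    base = model(params)
    var = 0.0
    for i, (p, s) in enumerate(zip(params, sigmas)):
        h = rel_step * max(abs(p), 1.0)
        perturbed = params.copy()
        perturbed[i] += h
        dR_dp = (model(perturbed) - base) / h   # forward-difference sensitivity
        var += (dR_dp * s) ** 2
    return base, np.sqrt(var)

# Illustrative model: a response that depends nonlinearly on three uncertain parameters.
model = lambda p: 300.0 + 50.0 * p[0] * np.exp(-p[1]) + 4.0 * p[2] ** 2
value, sigma = first_order_uncertainty(model, params=[2.0, 0.5, 3.0], sigmas=[0.1, 0.05, 0.2])
print(value, sigma)
```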

  20. Workstation computer systems for in-core fuel management

    International Nuclear Information System (INIS)

    Ciccone, L.; Casadei, A.L.

    1992-01-01

    The advancement of powerful engineering workstations has made it possible to have thermal-hydraulics and accident analysis computer programs operating efficiently with a significant performance/cost ratio compared to large mainframe computers. Today, nuclear utilities are acquiring independent engineering analysis capability for fuel management and safety analyses. Computer systems currently available to utility organizations vary widely, thus requiring that this software be operational on a number of computer platforms. Recognizing these trends, Westinghouse adopted a software development life cycle process for its software development activities, which strictly controls the development, testing and qualification of design computer codes. In addition, software standards to ensure maximum portability were developed and implemented, including adherence to FORTRAN 77 and the use of uniform system interfaces and auxiliary routines. A comprehensive test matrix was developed for each computer program to ensure that evolution of code versions preserves the licensing basis. In addition, the results of such test matrices establish the Quality Assurance basis and consistency for the same software operating on different computer platforms. (author). 4 figs

  1. Preliminary results of very fast computation of Moment Magnitude and focal mechanism in the context of tsunami warning

    Science.gov (United States)

    Schindelé, François; Roch, Julien; Rivera, Luis

    2015-04-01

    Various methodologies were recently developed to compute the moment magnitude and the focal mechanism, thanks to real-time access to numerous broad-band seismic data. Several methods were implemented at the CENALT, in particular the W-Phase method developed by H. Kanamori and L. Rivera. For earthquakes of magnitudes in the range 6.5-9.0, this method provides accurate results in less than 40 minutes. In the context of tsunami warning in the Mediterranean, a small basin impacted in less than one hour and with small sources, some of which have high tsunami potential (Boumerdes 2003), a comprehensive tsunami warning system in the region should include very fast computation of the seismic parameters. The value of Mw, the focal depth and the type of fault (reverse, normal, strike-slip) are the most relevant parameters for tsunami warning. Preliminary results will be presented using data from the North-eastern and Mediterranean region for the recent period 2010-2014. This work is funded by project ASTARTE (Assessment, Strategy And Risk Reduction for Tsunamis in Europe), FP7-ENV2013 6.4-3, Grant 603839.
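
    Once a seismic moment M0 has been inverted (for example by a W-phase inversion), the moment magnitude follows from the standard relation Mw = (2/3)(log10 M0 - 9.1) with M0 in N·m. The sketch below illustrates only that final conversion; the example moment is an assumed value, not a result from the abstract.

```python
import math

def moment_magnitude(m0_newton_metre):
    """Moment magnitude from seismic moment (IASPEI convention, M0 in N*m)."""
    return (2.0 / 3.0) * (math.log10(m0_newton_metre) - 9.1)

# Illustrative seismic moment of 4.0e19 N*m.
print(f"Mw = {moment_magnitude(4.0e19):.2f}")  # about 7.0
```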

  2. Geometric computations with interval and new robust methods applications in computer graphics, GIS and computational geometry

    CERN Document Server

    Ratschek, H

    2003-01-01

    This undergraduate and postgraduate text will familiarise readers with interval arithmetic and related tools to gain reliable and validated results and logically correct decisions for a variety of geometric computations, plus the means for alleviating the effects of errors. It also considers computations on geometric point-sets, which are neither robust nor reliable in processing with standard methods. The authors provide two effective tools for obtaining correct results: (a) interval arithmetic, and (b) ESSA, the new powerful algorithm which improves many geometric computations and makes th
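
    A minimal sketch of the interval-arithmetic idea the book builds on: operations return enclosing intervals, and a geometric predicate (here a 2x2 determinant/orientation test) is only decided when the result interval excludes zero. This is an illustration under assumed inputs, not the book's ESSA algorithm, and it ignores directed rounding, which a real validated implementation would need.

```python
class Interval:
    """Closed interval [lo, hi]; rounding control is omitted in this sketch."""
    def __init__(self, lo, hi=None):
        self.lo, self.hi = (lo, lo if hi is None else hi)

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

def orientation(ax, ay, bx, by, cx, cy):
    """Sign of the determinant (b-a) x (c-a); 'uncertain' if the interval straddles zero."""
    det = (bx - ax) * (cy - ay) - (by - ay) * (cx - ax)
    if det.lo > 0:
        return "counter-clockwise"
    if det.hi < 0:
        return "clockwise"
    return "uncertain"

# Points carrying a small measurement uncertainty of +/- 0.01.
pt = lambda x, y: (Interval(x - 0.01, x + 0.01), Interval(y - 0.01, y + 0.01))
a, b, c = pt(0.0, 0.0), pt(1.0, 0.0), pt(0.5, 1.0)
print(orientation(*a, *b, *c))  # counter-clockwise
```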

  3. Percutaneous computed tomography-guided core needle biopsy of soft tissue tumors: results and correlation with surgical specimen analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chojniak, Rubens; Grigio, Henrique Ramos; Bitencourt, Almir Galvao Vieira; Pinto, Paula Nicole Vieira; Tyng, Chiang J.; Cunha, Isabela Werneck da; Aguiar Junior, Samuel; Lopes, Ademar, E-mail: chojniak@uol.com.br [Hospital A.C. Camargo, Sao Paulo, SP (Brazil)

    2012-09-15

    Objective: To evaluate the efficacy of percutaneous computed tomography (CT)-guided core needle biopsy of soft tissue tumors in obtaining appropriate samples for histological analysis, and to compare its diagnosis with the results of surgical pathology where available. Materials and Methods: The authors reviewed medical records, imaging and histological reports of 262 patients with soft-tissue tumors submitted to CT-guided core needle biopsy in an oncologic reference center between 2003 and 2009. Results: Appropriate samples were obtained in 215 (82.1%) out of the 262 patients. The most prevalent tumors were sarcomas (38.6%), metastatic carcinomas (28.8%), benign mesenchymal tumors (20.5%) and lymphomas (9.3%). Histological grading was feasible in 92.8% of sarcoma patients, with the majority of them (77.9%) being classified as high grade tumors. Out of the total sample, 116 patients (44.3%) underwent surgical excision and diagnosis confirmation. Core biopsy demonstrated 94.6% accuracy in the identification of sarcomas, with 96.4% sensitivity and 89.5% specificity. A significant intermethod agreement about histological grading was observed between core biopsy and surgical resection (p < 0.001; kappa = 0.75). Conclusion: CT-guided core needle biopsy demonstrated a high diagnostic accuracy in the evaluation of soft tissue tumors as well as in the histological grading of sarcomas, allowing appropriate therapeutic planning. (author)

  4. Gender Differences in Computer Ethics among Business Administration Students

    Directory of Open Access Journals (Sweden)

    Ali ACILAR

    2010-12-01

    Full Text Available Because of the various benefits and advantages that computers and the Internet offer, these technologies have become an essential part of our daily life. The dependence on these technologies has been continuously and rapidly increasing. The use of computers and the Internet has also become an important part of instruction in academic environments. Even though the pervasive use of computers and the Internet has many benefits for almost everyone, it has also increased the use of these technologies for illegal purposes or unethical activities such as spamming, making illegal copies of software, violations of privacy, hacking and computer viruses. The main purpose of this study is to explore gender differences in computer ethics among Business Administration students and examine their attitudes towards ethical use of computers. Results from 248 students in the Department of Business Administration at a public university in Turkey reveal that significant differences exist between male and female students' attitudes towards ethical use of computers.

  5. High-speed linear optics quantum computing using active feed-forward.

    Science.gov (United States)

    Prevedel, Robert; Walther, Philip; Tiefenbacher, Felix; Böhi, Pascal; Kaltenbaek, Rainer; Jennewein, Thomas; Zeilinger, Anton

    2007-01-04

    As information carriers in quantum computing, photonic qubits have the advantage of undergoing negligible decoherence. However, the absence of any significant photon-photon interaction is problematic for the realization of non-trivial two-qubit gates. One solution is to introduce an effective nonlinearity by measurements resulting in probabilistic gate operations. In one-way quantum computation, the random quantum measurement error can be overcome by applying a feed-forward technique, such that the future measurement basis depends on earlier measurement results. This technique is crucial for achieving deterministic quantum computation once a cluster state (the highly entangled multiparticle state on which one-way quantum computation is based) is prepared. Here we realize a concatenated scheme of measurement and active feed-forward in a one-way quantum computing experiment. We demonstrate that, for a perfect cluster state and no photon loss, our quantum computation scheme would operate with good fidelity and that our feed-forward components function with very high speed and low error for detected photons. With present technology, the individual computational step (in our case the individual feed-forward cycle) can be operated in less than 150 ns using electro-optical modulators. This is an important result for the future development of one-way quantum computers, whose large-scale implementation will depend on advances in the production and detection of the required highly entangled cluster states.

  6. Shadow Replication: An Energy-Aware, Fault-Tolerant Computational Model for Green Cloud Computing

    Directory of Open Access Journals (Sweden)

    Xiaolong Cui

    2014-08-01

    Full Text Available As the demand for cloud computing continues to increase, cloud service providers face the daunting challenge of meeting the negotiated SLA, in terms of reliability and timely performance, while achieving cost-effectiveness. This challenge is further compounded by the increasing likelihood of failure in large-scale clouds and the rising impact of energy consumption and CO2 emissions on the environment. This paper proposes Shadow Replication, a novel fault-tolerance model for cloud computing, which seamlessly addresses failure at scale, while minimizing energy consumption and reducing its impact on the environment. The basic tenet of the model is to associate with the main process a suite of shadow processes that execute concurrently, but initially at a much reduced execution speed, to overcome failures as they occur. Two computationally feasible schemes are proposed to achieve Shadow Replication. A performance evaluation framework is developed to analyze these schemes and compare their performance to traditional replication-based fault tolerance methods, focusing on the inherent tradeoff between fault tolerance, the specified SLA and profit maximization. The results show that Shadow Replication leads to significant energy reduction and is better suited for compute-intensive execution models, where a profit increase of up to 30% can be achieved due to reduced energy consumption.

  7. The use of computers in education worldwide : results from a comparative survey in 18 countries

    NARCIS (Netherlands)

    Pelgrum, W.J.; Plomp, T.

    1991-01-01

    In 1989, the International Association for the Evaluation of Educational Achievement (IEA) Computers in Education study collected data on computer use in elementary, and lower- and upper-secondary education in 22 countries. Although all data sets from the participating countries had not been

  8. The effect of ergonomic training and intervention on reducing occupational stress among computer users

    Directory of Open Access Journals (Sweden)

    T. Yektaee

    2014-05-01

    Result: According to the covariance analysis, ergonomic training and interventions lead to a reduction of occupational stress among computer users. Conclusion: Training computer users and informing them of ergonomic principles, as well as providing interventions such as correction of posture, reducing the duration of work time, and using an armrest and footrest, would have significant implications in reducing occupational stress among computer users.

  9. A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems

    Science.gov (United States)

    Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  10. Conversion of IVA Human Computer Model to EVA Use and Evaluation and Comparison of the Result to Existing EVA Models

    Science.gov (United States)

    Hamilton, George S.; Williams, Jermaine C.

    1998-01-01

    This paper describes the methods, rationale, and comparative results of the conversion of an intravehicular (IVA) 3D human computer model (HCM) to extravehicular (EVA) use and compares the converted model to an existing model on another computer platform. The task of accurately modeling a spacesuited human figure in software is daunting: the suit restricts the human's joint range of motion (ROM) and does not have joints collocated with human joints. The modeling of the variety of materials needed to construct a space suit (e. g. metal bearings, rigid fiberglass torso, flexible cloth limbs and rubber coated gloves) attached to a human figure is currently out of reach of desktop computer hardware and software. Therefore a simplified approach was taken. The HCM's body parts were enlarged and the joint ROM was restricted to match the existing spacesuit model. This basic approach could be used to model other restrictive environments in industry such as chemical or fire protective clothing. In summary, the approach provides a moderate fidelity, usable tool which will run on current notebook computers.

  11. Safety Significance of the Halden IFA-650 LOCA Test Results

    International Nuclear Information System (INIS)

    Fuketa, Toyoshi; Nagase, Fumihisa; Grandjean, Claude; Petit, Marc; Hozer, Zoltan; Kelppe, Seppo; Khvostov, Grigori; Hafidi, Biya; Therache, Benjamin; Heins, Lothar; Valach, Mojmir; Voglewede, John; Wiesenack, Wolfgang

    2010-01-01

    CSNI therefore posed the question to the Working Group on Fuel Safety (WGFS): How could the Halden LOCA tests affect regulation? The WGFS agreed that the main safety concern would be fuel dispersal (and hence the potential for loss of coolable geometry) occurring at relatively low temperature, i.e. 800 deg. C. In order to assess the applicability of the IFA-650.4 results to actual power plant situations and the possible impact on safety criteria, a number of aspects should be clarified before considering a safety significance of the Halden IFA-650 series results: - Representativeness for NPP cases - Gas flow - Relocation - Burnup effect - Repeatability - Power history These items will be discussed one by one in this CSNI report. On April 17, 2009, test 650.9 was carried out with 650.4 sibling fuel. The target cladding peak temperature was 1100 deg. C in this case, but otherwise the experimental conditions were very similar. In many respects, 650.9 repeated the 650.4 experiment, e.g. by showing clear signs of fuel relocation which was confirmed by gamma scanning later on. The WGFS therefore decided that 650.9 should be considered as well for this CSNI report. Mention is also made of IFA-650.3, which failed with a small crack in a weak spot induced by thermocouple welding, and IFA-650.5 which involved ballooning and fuel ejection under conditions of restricted gas flow

  12. Boxers--computed tomography, EEG, and neurological evaluation

    International Nuclear Information System (INIS)

    Ross, R.J.; Cole, M.; Thompson, J.S.; Kim, K.H.

    1983-01-01

    During the last three years, 40 ex-boxers were examined to determine the effects of boxing in regard to their neurological status and the computed tomographic (CT) appearance of the brain. Thirty-eight of these patients had a CT scan of the brain, and 24 had a complete neurological examination including an EEG. The results demonstrate a significant relationship between the number of bouts fought and CT changes indicating cerebral atrophy. Positive neurological findings were not significantly correlated with the number of bouts. Electroencephalographic abnormalities were significantly correlated with the number of bouts fought. Computed tomography and EEG of the brain should be considered as part of a regular neurological examination for active boxers and, if possible, before and after each match, to detect not only the effects of acute life-threatening brain trauma such as subdural hematomas and brain hemorrhages, but the more subtle and debilitating long-term changes of cerebral atrophy

  13. NET-COMPUTER: Internet Computer Architecture and its Application in E-Commerce

    Directory of Open Access Journals (Sweden)

    P. O. Umenne

    2012-12-01

    Research in Intelligent Agents has yielded interesting results, some of which have been translated into commercial ventures. Intelligent Agents are executable software components that represent the user, perform tasks on behalf of the user and, when a task terminates, send the result to the user. Intelligent Agents are best suited for the Internet: a collection of computers connected together in a world-wide computer network. The Swarm and HYDRA computer architectures for Agent execution were developed at the University of Surrey, UK in the 90s. The objective of the research was to develop a software-based computer architecture on which Agent execution could be explored. The combination of Intelligent Agents and the HYDRA computer architecture gave rise to a new computer concept: the NET-Computer, in which the computing resources reside on the Internet. The Internet computers form the hardware and software resources, and the user is provided with a simple interface to access the Internet and run user tasks. The Agents autonomously roam the Internet (NET-Computer) executing the tasks. A growing segment of the Internet is E-Commerce, online shopping for products and services. The Internet computing resources provide a marketplace for product suppliers and consumers alike. Consumers are looking for suppliers selling products and services, while suppliers are looking for buyers. Searching the vast amount of information available on the Internet causes a great deal of problems for both consumers and suppliers. Intelligent Agents executing on the NET-Computer can surf through the Internet and select specific information of interest to the user. The simulation results show that Intelligent Agents executing on the HYDRA computer architecture could be applied in E-Commerce.

  14. Testing statistical significance scores of sequence comparison methods with structure similarity

    Directory of Open Access Journals (Sweden)

    Leunissen Jack AM

    2006-10-01

    Background: In recent years the Smith-Waterman sequence comparison algorithm has gained popularity due to improved implementations and rapidly increasing computing power. However, the quality and sensitivity of a database search are not only determined by the algorithm but also by the statistical significance testing for an alignment. The e-value is the most commonly used statistical validation method for sequence database searching. The CluSTr database and the Protein World database have been created using an alternative statistical significance test: a Z-score based on Monte-Carlo statistics. Several papers have described the superiority of the Z-score as compared to the e-value, using simulated data. We were interested in whether this could be validated when applied to existing, evolutionarily related protein sequences. Results: All experiments are performed on the ASTRAL SCOP database. The Smith-Waterman sequence comparison algorithm with both e-value and Z-score statistics is evaluated, using ROC, CVE and AP measures. The BLAST and FASTA algorithms are used as reference. We find that two out of three Smith-Waterman implementations with e-value are better at predicting structural similarities between proteins than the Smith-Waterman implementation with Z-score. SSEARCH especially has very high scores. Conclusion: The compute-intensive Z-score does not have a clear advantage over the e-value. The Smith-Waterman implementations give generally better results than their heuristic counterparts. We recommend using the SSEARCH algorithm combined with e-values for pairwise sequence comparisons.
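
    For illustration only, the following is a minimal sketch of the Monte-Carlo Z-score idea described in this record: a local alignment score is compared with scores obtained after shuffling one sequence, so the Z-score measures how far the true score lies above the shuffled-score distribution. The scoring scheme (match/mismatch/gap values) and the shuffle count are simplifying assumptions, not the CluSTr/Protein World implementation.

        import random

        def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
            """Return the best local alignment score between sequences a and b."""
            rows, cols = len(a) + 1, len(b) + 1
            prev = [0] * cols
            best = 0
            for i in range(1, rows):
                cur = [0] * cols
                for j in range(1, cols):
                    s = match if a[i - 1] == b[j - 1] else mismatch
                    cur[j] = max(0, prev[j - 1] + s, prev[j] + gap, cur[j - 1] + gap)
                    best = max(best, cur[j])
                prev = cur
            return best

        def monte_carlo_z(query, subject, n_shuffles=200, seed=0):
            """Z-score of the true score against scores of shuffled subjects."""
            rng = random.Random(seed)
            true_score = smith_waterman(query, subject)
            shuffled_scores = []
            for _ in range(n_shuffles):
                s = list(subject)
                rng.shuffle(s)                      # preserves composition, destroys order
                shuffled_scores.append(smith_waterman(query, "".join(s)))
            mean = sum(shuffled_scores) / n_shuffles
            var = sum((x - mean) ** 2 for x in shuffled_scores) / (n_shuffles - 1)
            return (true_score - mean) / (var ** 0.5 or 1.0)

        if __name__ == "__main__":
            print(monte_carlo_z("HEAGAWGHEE", "PAWHEAE"))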

  15. Computer vision syndrome among computer office workers in a developing country: an evaluation of prevalence and risk factors.

    Science.gov (United States)

    Ranasinghe, P; Wathurapatha, W S; Perera, Y S; Lamabadusuriya, D A; Kulatunga, S; Jayawardana, N; Katulanda, P

    2016-03-09

    Computer vision syndrome (CVS) is a group of visual symptoms experienced in relation to the use of computers. Nearly 60 million people suffer from CVS globally, resulting in reduced productivity at work and reduced quality of life of the computer worker. The present study aims to describe the prevalence of CVS and its associated factors among a nationally-representative sample of Sri Lankan computer workers. Two thousand five hundred computer office workers were invited for the study from all nine provinces of Sri Lanka between May and December 2009. A self-administered questionnaire was used to collect socio-demographic data, symptoms of CVS and its associated factors. A binary logistic regression analysis was performed in all patients with 'presence of CVS' as the dichotomous dependent variable and age, gender, duration of occupation, daily computer usage, pre-existing eye disease, not using a visual display terminal (VDT) filter, adjusting brightness of screen, use of contact lenses, angle of gaze and ergonomic practices knowledge as the continuous/dichotomous independent variables. A similar binary logistic regression analysis was performed in all patients with 'severity of CVS' as the dichotomous dependent variable and other continuous/dichotomous independent variables. Sample size was 2210 (response rate 88.4%). Mean age was 30.8 ± 8.1 years and 50.8% of the sample were males. The 1-year prevalence of CVS in the study population was 67.4%. Female gender (OR: 1.28), duration of occupation (OR: 1.07), daily computer usage (OR: 1.10), pre-existing eye disease (OR: 4.49), not using a VDT filter (OR: 1.02), use of contact lenses (OR: 3.21) and ergonomics practices knowledge (OR: 1.24) were all significantly associated with the presence of CVS. The duration of occupation (OR: 1.04) and presence of pre-existing eye disease (OR: 1.54) were significantly associated with the presence of 'severe CVS'. Sri Lankan computer workers had a high prevalence of CVS. Female gender
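
    As a sketch of the kind of binary logistic regression described above (presence of CVS as the dichotomous outcome, odds ratios from exponentiated coefficients), the snippet below fits such a model on synthetic data; the column names, effect sizes and data are illustrative assumptions, not the authors' dataset.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 500
        df = pd.DataFrame({
            "female": rng.integers(0, 2, n),
            "duration_years": rng.uniform(1, 20, n),
            "daily_hours": rng.uniform(2, 10, n),
            "eye_disease": rng.integers(0, 2, n),
            "contact_lenses": rng.integers(0, 2, n),
        })
        # Synthetic outcome: higher odds with pre-existing eye disease and longer daily use.
        logit = -2.0 + 1.5 * df.eye_disease + 0.25 * df.daily_hours + 0.3 * df.female
        df["cvs"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

        fit = smf.logit(
            "cvs ~ female + duration_years + daily_hours + eye_disease + contact_lenses",
            data=df,
        ).fit(disp=False)

        odds_ratios = np.exp(fit.params)                 # coefficients -> odds ratios
        conf_int = np.exp(fit.conf_int())                # 95% CIs on the OR scale
        print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))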

  16. Algorithm for computing significance levels using the Kolmogorov-Smirnov statistic and valid for both large and small samples

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, S.E.; Fields, D.E.

    1983-10-01

    The KSTEST code presented here is designed to perform the Kolmogorov-Smirnov one-sample test. The code may be used as a stand-alone program or the principal subroutines may be excerpted and used to service other programs. The Kolmogorov-Smirnov one-sample test is a nonparametric goodness-of-fit test. A number of codes to perform this test are in existence, but they suffer from the inability to provide meaningful results in the case of small sample sizes (number of values less than or equal to 80). The KSTEST code overcomes this inadequacy by using two distinct algorithms. If the sample size is greater than 80, an asymptotic series developed by Smirnov is evaluated. If the sample size is 80 or less, a table of values generated by Birnbaum is referenced. Valid results can be obtained from KSTEST when the sample contains from 3 to 300 data points. The program was developed on a Digital Equipment Corporation PDP-10 computer using the FORTRAN-10 language. The code size is approximately 450 card images and the typical CPU execution time is 0.19 s.
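
    For illustration, a minimal sketch of the large-sample branch of such a Kolmogorov-Smirnov one-sample test: the empirical D statistic plus Smirnov's asymptotic series (with Stephens' finite-sample correction) for the significance level. The small-sample table lookup used by KSTEST for n <= 80 is not reproduced here; this is not the original FORTRAN-10 code.

        import math

        def ks_one_sample(data, cdf):
            """Return (D, p) for a sample tested against a hypothesized CDF."""
            x = sorted(data)
            n = len(x)
            d = 0.0
            for i, xi in enumerate(x, start=1):
                f = cdf(xi)
                d = max(d, i / n - f, f - (i - 1) / n)   # two-sided one-sample D
            lam = (math.sqrt(n) + 0.12 + 0.11 / math.sqrt(n)) * d   # Stephens' correction
            p = 2.0 * sum((-1) ** (k - 1) * math.exp(-2.0 * k * k * lam * lam)
                          for k in range(1, 101))
            return d, max(0.0, min(1.0, p))

        if __name__ == "__main__":
            import random
            random.seed(1)
            sample = [random.random() for _ in range(200)]
            # Test against the uniform(0,1) CDF.
            print(ks_one_sample(sample, lambda t: min(max(t, 0.0), 1.0)))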

  17. The role of dedicated data computing centers in the age of cloud computing

    Science.gov (United States)

    Caramarcu, Costin; Hollowell, Christopher; Strecker-Kellogg, William; Wong, Antonio; Zaytsev, Alexandr

    2017-10-01

    Brookhaven National Laboratory (BNL) anticipates significant growth in scientific programs with large computing and data storage needs in the near future and has recently reorganized support for scientific computing to meet these needs. A key component is the enhanced role of the RHIC-ATLAS Computing Facility (RACF) in support of high-throughput and high-performance computing (HTC and HPC) at BNL. This presentation discusses the evolving role of the RACF at BNL, in light of its growing portfolio of responsibilities and its increasing integration with cloud (academic and for-profit) computing activities. We also discuss BNL’s plan to build a new computing center to support the new responsibilities of the RACF and present a summary of the cost benefit analysis done, including the types of computing activities that benefit most from a local data center vs. cloud computing. This analysis is partly based on an updated cost comparison of Amazon EC2 computing services and the RACF, which was originally conducted in 2012.

  18. Design of Computer Experiments

    DEFF Research Database (Denmark)

    Dehlendorff, Christian

    The main topic of this thesis is design and analysis of computer and simulation experiments and is dealt with in six papers and a summary report. Simulation and computer models have in recent years received increasing attention due to their increasing complexity and usability. Software packages make the development of rather complicated computer models using predefined building blocks possible. This implies that the range of phenomena that are analyzed by means of a computer model has expanded significantly. As the complexity grows, so does the need for efficient experimental designs and analysis methods, since the complex computer models often are expensive to use in terms of computer time. The choice of performance parameter is an important part of the analysis of computer and simulation models, and Paper A introduces a new statistic for waiting times in health care units. The statistic...

  19. High Speed Civil Transport (HSCT) Isolated Nacelle Transonic Boattail Drag Study and Results Using Computational Fluid Dynamics (CFD)

    Science.gov (United States)

    Midea, Anthony C.; Austin, Thomas; Pao, S. Paul; DeBonis, James R.; Mani, Mori

    2005-01-01

    Nozzle boattail drag is significant for the High Speed Civil Transport (HSCT) and can be as high as 25 percent of the overall propulsion system thrust at transonic conditions. Thus, nozzle boattail drag has the potential to create a thrust drag pinch and can reduce HSCT aircraft aerodynamic efficiencies at transonic operating conditions. In order to accurately predict HSCT performance, it is imperative that nozzle boattail drag be accurately predicted. Previous methods to predict HSCT nozzle boattail drag were suspect in the transonic regime. In addition, previous prediction methods were unable to account for complex nozzle geometry and were not flexible enough for engine cycle trade studies. A computational fluid dynamics (CFD) effort was conducted by NASA and McDonnell Douglas to evaluate the magnitude and characteristics of HSCT nozzle boattail drag at transonic conditions. A team of engineers used various CFD codes and provided consistent, accurate boattail drag coefficient predictions for a family of HSCT nozzle configurations. The CFD results were incorporated into a nozzle drag database that encompassed the entire HSCT flight regime and provided the basis for an accurate and flexible prediction methodology.

  20. Computer games and prosocial behaviour.

    Science.gov (United States)

    Mengel, Friederike

    2014-01-01

    We relate different self-reported measures of computer use to individuals' propensity to cooperate in the Prisoner's dilemma. The average cooperation rate is positively related to the self-reported amount participants spend playing computer games. None of the other computer time use variables (including time spent on social media, browsing internet, working etc.) are significantly related to cooperation rates.

  1. Central Computer Science Concepts to Research-Based Teacher Training in Computer Science: An Experimental Study

    Science.gov (United States)

    Zendler, Andreas; Klaudt, Dieter

    2012-01-01

    The significance of computer science for economics and society is undisputed. In particular, computer science is acknowledged to play a key role in schools (e.g., by opening multiple career paths). The provision of effective computer science education in schools is dependent on teachers who are able to properly represent the discipline and whose…

  2. Angiographic assessment of initial balloon angioplasty results.

    Science.gov (United States)

    Gardiner, Geoffrey A; Sullivan, Kevin L; Halpern, Ethan J; Parker, Laurence; Beck, Margaret; Bonn, Joseph; Levin, David C

    2004-10-01

    To determine the influence of three factors involved in the angiographic assessment of balloon angioplasty (interobserver variability, operator bias, and the definition used to determine success) on the primary (technical) results of angioplasty in the peripheral arteries. Percent stenosis in 107 lesions in lower-extremity arteries was graded by three independent, experienced vascular radiologists ("observers") before and after balloon angioplasty and their estimates were compared with the initial interpretations reported by the physician performing the procedure ("operator") and an automated quantitative computer analysis. Observer variability was measured with use of intraclass correlation coefficients and SD. Differences among the operator, observers, and the computer were analyzed with use of the Wilcoxon signed-rank test and analysis of variance. For each evaluator, the results in this series of lesions were interpreted with three different definitions of success. Estimation of residual stenosis varied by an average range of 22.76% with an average SD of 8.99. The intraclass correlation coefficients averaged 0.59 for residual stenosis after angioplasty for the three observers but decreased to 0.36 when the operator was included as the fourth evaluator. There was good to very good agreement among the three independent observers and the computer, but poor correlation with the operator (P definition of success was used. Significant differences among the operator, the three observers, and the computer were not present when the definition of success was based on less than 50% residual stenosis. Observer variability and bias in the subjective evaluation of peripheral angioplasty can have a significant influence on the reported initial success rates. This effect can be largely eliminated with the use of residual stenosis of less than 50% to define success. Otherwise, meaningful evaluation of angioplasty results will require independent panels of evaluators or
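
    To make the interobserver-agreement measure concrete, the sketch below computes a two-way random-effects intraclass correlation, ICC(2,1), one common formulation for agreement among several raters grading the same lesions; the ratings are hypothetical values, not the study data, and the study may have used a different ICC variant.

        import numpy as np

        def icc_2_1(ratings):
            """ratings: (n_targets, k_raters) array of percent-stenosis estimates."""
            r = np.asarray(ratings, dtype=float)
            n, k = r.shape
            grand = r.mean()
            msr = k * ((r.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # between targets
            msc = n * ((r.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # between raters
            sse = ((r - grand) ** 2).sum() - (n - 1) * msr - (k - 1) * msc
            mse = sse / ((n - 1) * (k - 1))                             # residual mean square
            return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

        if __name__ == "__main__":
            lesions = [[30, 35, 40], [60, 55, 65], [20, 25, 20], [80, 75, 70], [50, 45, 55]]
            print(round(icc_2_1(lesions), 3))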

  3. Results from a pilot study of a computer-based role-playing game for young people with psychosis.

    Science.gov (United States)

    Olivet, Jeffrey; Haselden, Morgan; Piscitelli, Sarah; Kenney, Rachael; Shulman, Alexander; Medoff, Deborah; Dixon, Lisa

    2018-03-15

    Recent research on first episode psychosis (FEP) has demonstrated the effectiveness of coordinated specialty care (CSC) models to support young adults and their families, yet few tools exist to promote engagement in care. This study aimed to develop a prototype computer-based role-playing game (RPG) designed for young people who have experienced FEP, and conduct a pilot study to determine feasibility and test whether the game improves consumers' attitudes toward treatment and recovery. Twenty young people with FEP who were receiving services at a CSC program enrolled in the study and played the game for 1 hour. Pre- and post-quantitative assessments measured change in hope, recovery, stigma, empowerment and engagement in treatment. Qualitative interviews explored participants' experience with the game and ideas for further product development. Participants showed significant increase in positive attitudes toward recovery. The qualitative findings further demonstrated the game's positive impact across these domains. Of all game features, participants most highly valued video testimonials of other young adults with FEP telling their stories of hope and recovery. These findings provide modest support for the potential benefits of this type of computer-based RPG, if further developed for individuals experiencing psychosis. © 2018 John Wiley & Sons Australia, Ltd.

  4. Computation of Asteroid Proper Elements on the Grid

    Science.gov (United States)

    Novakovic, B.; Balaz, A.; Knezevic, Z.; Potocnik, M.

    2009-12-01

    A procedure for the gridification of the computation of asteroid proper orbital elements is described. The need to speed up the time-consuming computations and make them more efficient is justified by the large increase of observational data expected from the next generation all-sky surveys. We give the basic notion of proper elements and of the contemporary theories and methods used to compute them for different populations of objects. Proper elements for nearly 70,000 asteroids have been derived since the beginning of the use of the Grid infrastructure for this purpose. The average time for the catalog updates is significantly shortened with respect to the time needed with stand-alone workstations. We also present the basics of Grid computing, the concepts of Grid middleware and its Workload Management System. The practical steps we undertook to efficiently gridify our application are described in full detail. We present the results of a comprehensive testing of the performance of different Grid sites, and offer some practical conclusions based on the benchmark results and on our experience. Finally, we propose some possibilities for future work.

  5. The Effect of Animation in Multimedia Computer-Based Learning and Learning Style to the Learning Results

    Directory of Open Access Journals (Sweden)

    Muhammad RUSLI

    2017-10-01

    The effectiveness of learning depends on four main elements: the content, the desired learning outcome, the instructional method and the delivery media. The integration of those four elements can be manifested in a learning module, which is called multimedia learning or learning by using multimedia. In learning with computer-based multimedia, two main things need to be noticed so that the learning process can run effectively: how the content is presented, and the learner's preferred way of accepting and processing the information into meaningful knowledge. The first is related to the way the content is visualized and how people learn. The second is related to the learning style of the learner. This research aims to investigate the effect of the type of visualization (static vs. animated) in multimedia computer-based learning, and of learning style (visual vs. verbal), on the students' capability in applying the concepts, procedures and principles of Java programming. Visualization type acts as the independent variable, and the learning style of the students acts as a moderator variable. The instructional strategies followed the Component Display Theory of Merrill, and the format of presentation of the multimedia followed the Seven Principles of Multimedia Learning of Mayer and Moreno. Learning with the multimedia computer-based learning was done in the classroom. The subjects of this research were students of STMIK-STIKOM Bali in the odd semester of 2016-2017 who followed the course of Java programming. The experimental design used a 2 x 2 multivariate analysis of variance (MANOVA) with a sample of 138 students in 4 classes. Based on the results of the analysis, it can be concluded that animation in interactive multimedia learning gave a positive effect in improving students' learning outcomes, particularly in applying the concepts, procedures, and principles of Java programming.

  6. Cluster Computing for Embedded/Real-Time Systems

    Science.gov (United States)

    Katz, D.; Kepner, J.

    1999-01-01

    Embedded and real-time systems, like other computing systems, seek to maximize computing power for a given price, and thus can significantly benefit from the advancing capabilities of cluster computing.

  7. Cognitive cooperation groups mediated by computers and internet present significant improvement of cognitive status in older adults with memory complaints: a controlled prospective study

    Directory of Open Access Journals (Sweden)

    Rodrigo de Rosso Krug

    Objective: To estimate the effect of participating in cognitive cooperation groups, mediated by computers and the internet, on the Mini-Mental State Examination (MMSE) percent variation of outpatients with memory complaints attending two memory clinics. Methods: A prospective controlled intervention study carried out from 2006 to 2013 with 293 elders. The intervention group (n = 160) attended a cognitive cooperation group (20 sessions of 1.5 hours each). The control group (n = 133) received routine medical care. Outcome was the percent variation in the MMSE. Control variables included gender, age, marital status, schooling, hypertension, diabetes, dyslipidaemia, hypothyroidism, depression, vascular diseases, polymedication, use of benzodiazepines, exposure to tobacco, sedentary lifestyle, obesity and functional capacity. The final model was obtained by multivariate linear regression. Results: The intervention group obtained an independent positive variation of 24.39% (95% CI = 14.86/33.91) in the MMSE compared to the control group. Conclusion: The results suggested that cognitive cooperation groups, mediated by computers and the internet, are associated with cognitive status improvement of older adults in memory clinics.
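
    As an illustrative sketch of the multivariate linear regression step described above (MMSE percent variation as the outcome, group membership as the exposure, adjusted for covariates), the snippet below fits such a model on synthetic data; the variable names, effect sizes and data are assumptions for demonstration, not the study's records.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(42)
        n = 293
        df = pd.DataFrame({
            "group": rng.integers(0, 2, n),            # 1 = cognitive cooperation group
            "age": rng.uniform(60, 90, n),
            "schooling_years": rng.integers(0, 16, n),
            "depression": rng.integers(0, 2, n),
        })
        # Synthetic outcome: the intervention adds roughly 24 points of percent variation.
        df["mmse_pct_change"] = (24 * df.group - 0.2 * (df.age - 75)
                                 - 5 * df.depression + rng.normal(0, 10, n))

        fit = smf.ols("mmse_pct_change ~ group + age + schooling_years + depression",
                      data=df).fit()
        # Adjusted group effect and its 95% confidence interval.
        print(fit.params["group"], fit.conf_int().loc["group"].tolist())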

  8. Computer codes developed in FRG to analyse hypothetical meltdown accidents

    International Nuclear Information System (INIS)

    Hassmann, K.; Hosemann, J.P.; Koerber, H.; Reineke, H.

    1978-01-01

    It is the purpose of this paper to give the status of all significant computer codes developed in the core melt-down project which is incorporated in the light water reactor safety research program of the Federal Ministry of Research and Technology. For standard pressurized water reactors, results of some computer codes will be presented, describing the course and the duration of the hypothetical core meltdown accident. (author)

  9. Numerical Studies of Magnetohydrodynamic Activity Resulting from Inductive Transients. Final Report

    International Nuclear Information System (INIS)

    Sovinec, Carl R.

    2005-01-01

    This report describes results from numerical studies of transients in magnetically confined plasmas. The work has been performed by University of Wisconsin graduate students James Reynolds and Giovanni Cone and by the Principal Investigator through support from contract DE-FG02-02ER54687, a Junior Faculty in Plasma Science award from the DOE Office of Science. Results from the computations have added significantly to our knowledge of magnetized plasma relaxation in the reversed-field pinch (RFP) and spheromak. In particular, they have distinguished relaxation activity expected in sustained configurations from transient effects that can persist over a significant fraction of the plasma discharge. We have also developed the numerical capability for studying electrostatic current injection in the spherical torus (ST). These configurations are being investigated as plasma confinement schemes in the international effort to achieve controlled thermonuclear fusion for environmentally benign energy production. Our numerical computations have been performed with the NIMROD code (http://nimrodteam.org) using local computing resources and massively parallel computing hardware at the National Energy Research Scientific Computing Center. Direct comparisons of simulation results for the spheromak with laboratory measurements verify the effectiveness of our numerical approach. The comparisons have been published in refereed journal articles by this group and by collaborators at Lawrence Livermore National Laboratory (see Section 4). In addition to the technical products, this grant has supported the graduate education of the two participating students for three years

  10. Computation Results from a Parametric Study to Determine Bounding Critical Systems of Homogeneously Water-Moderated Mixed Plutonium--Uranium Oxides

    Energy Technology Data Exchange (ETDEWEB)

    Shimizu, Y.

    2001-01-11

    This report provides computational results of an extensive study to examine the following: (1) infinite media neutron-multiplication factors; (2) material bucklings; (3) bounding infinite media critical concentrations; (4) bounding finite critical dimensions of water-reflected and homogeneously water-moderated one-dimensional systems (i.e., spheres, cylinders of infinite length, and slabs that are infinite in two dimensions) that were comprised of various proportions and densities of plutonium oxides and uranium oxides, each having various isotopic compositions; and (5) sensitivity coefficients of delta k-eff with respect to critical geometry delta dimensions were determined for each of the three geometries that were studied. The study was undertaken to support the development of a standard that is sponsored by the International Standards Organization (ISO) under Technical Committee 85, Nuclear Energy (TC 85)--Subcommittee 5, Nuclear Fuel Technology (SC 5)--Working Group 8, Standardization of Calculations, Procedures and Practices Related to Criticality Safety (WG 8). The designation and title of the ISO TC 85/SC 5/WG 8 standard working draft is WD 14941, ''Nuclear energy--Fissile materials--Nuclear criticality control and safety of plutonium-uranium oxide fuel mixtures outside of reactors.'' Various ISO member participants performed similar computational studies using their indigenous computational codes to provide comparative results for analysis in the development of the standard.

  11. A new computer-based counselling system for the promotion of physical activity in patients with chronic diseases--results from a pilot study.

    Science.gov (United States)

    Becker, Annette; Herzberg, Dominikus; Marsden, Nicola; Thomanek, Sabine; Jung, Hartmut; Leonhardt, Corinna

    2011-05-01

    To develop a computer-based counselling system (CBCS) for the improvement of attitudes towards physical activity in chronically ill patients and to pilot its efficacy and acceptance in primary care. The system is tailored to patients' disease and motivational stage. During a pilot study in five German general practices, patients answered questions before, directly and 6 weeks after using the CBCS. Outcome criteria were attitudes and self-efficacy. Qualitative interviews were performed to identify acceptance indicators. Seventy-nine patients participated (mean age: 64.5 years, 53% males; 38% without previous computer experience). Patients' affective and cognitive attitudes changed significantly, self-efficacy showed only minor changes. Patients mentioned no difficulties in interacting with the CBCS. However, perception of the system's usefulness was inconsistent. Computer-based counselling for physical activity related attitudes in patients with chronic diseases is feasible, but the circumstances of use with respect to the target group and its integration into the management process have to be clarified in future studies. This study adds to the understanding of computer-based counselling in primary health care. Acceptance indicators identified in this study will be validated as part of a questionnaire on technology acceptability in a subsequent study. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  12. All-Particle Multiscale Computation of Hypersonic Rarefied Flow

    Science.gov (United States)

    Jun, E.; Burt, J. M.; Boyd, I. D.

    2011-05-01

    This study examines a new hybrid particle scheme used as an alternative means of multiscale flow simulation. The hybrid particle scheme employs the direct simulation Monte Carlo (DSMC) method in rarefied flow regions and the low diffusion (LD) particle method in continuum flow regions. The numerical procedures of the low diffusion particle method are implemented within an existing DSMC algorithm. The performance of the LD-DSMC approach is assessed by studying Mach 10 nitrogen flow over a sphere with a global Knudsen number of 0.002. The hybrid scheme results show good overall agreement with results from standard DSMC and CFD computation. Subcell procedures are utilized to improve computational efficiency and reduce sensitivity to DSMC cell size in the hybrid scheme. This makes it possible to perform the LD-DSMC simulation on a much coarser mesh that leads to a significant reduction in computation time.

  13. Comparison between children dilated computer and retinoscopy

    Directory of Open Access Journals (Sweden)

    Li-Li Qi

    2015-06-01

    AIM: To compare computer optometry and retinoscopy before and after mydriasis in children, and to determine whether computer refraction is applicable in children. METHODS: The records of 500 children (1 000 eyes) with ametropia in our hospital were analyzed. The children first received computer optometry, then 10 g/L atropine sulfate eye gel was instilled; after 3 days, computer optometry and retinoscopy were performed and the results of the two methods were compared. RESULTS: The spherical reading of the computer optometry group was 2.70±2.75 DS and the cylinder reading was 1.54±1.10 DC, lower than those of the retinoscopy group (PPPP>0.05. Before mydriasis, astigmatism was 1.54±1.10 D and the astigmatic axis was (14.38±11.11)°. After mydriasis, astigmatism was 1.45±1.21 D and the astigmatic axis was (12.78±10.31)°, significantly higher than those of retinoscopy (P. CONCLUSION: Optometry in children concerns their visual development. Computer optometry and retinoscopy each have pros and cons. Since computer optometry cannot replace retinoscopy, it can be used as an auxiliary tool for rapid refraction.

  14. Internal Clock Drift Estimation in Computer Clusters

    Directory of Open Access Journals (Sweden)

    Hicham Marouani

    2008-01-01

    Most computers have several high-resolution timing sources, from the programmable interrupt timer to the cycle counter. Yet, even at a precision of one cycle in ten million, clocks may drift significantly in a single second at a clock frequency of several GHz. When tracing low-level system events in computer clusters, such as packet sending or reception, each computer system records its own events using an internal clock. In order to properly understand the global system behavior and performance, as reported by the events recorded on each computer, it is important to estimate precisely the clock differences and drift between the different computers in the system. This article studies the clock precision and stability of several computer systems with different architectures. It also studies the typical network delay characteristics, since time synchronization algorithms rely on the exchange of network packets and are dependent on the symmetry of the delays. A very precise clock, based on the atomic time provided by the GPS satellite network, was used as a reference to measure clock drifts and network delays. The results obtained are of immediate use to all applications which depend on computer clock or network time synchronization accuracy.
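
    The following is a minimal sketch of one common way to estimate relative clock offset and drift between two machines: each network round trip yields an NTP-style offset sample, and a least-squares line over the samples gives the offset (intercept) and drift rate (slope). The timestamps below are synthetic; a real tracer would record them on packet send/receive events, and this is not the article's own estimation procedure.

        import numpy as np

        def offset_sample(t0, t1, t2, t3):
            """t0/t3: local send/receive times; t1/t2: remote receive/send times."""
            return ((t1 - t0) + (t2 - t3)) / 2.0

        def estimate_drift(local_times, offsets):
            """Fit offset(t) = a*t + b; 'a' is the drift rate (s/s), 'b' the initial offset."""
            a, b = np.polyfit(np.asarray(local_times), np.asarray(offsets), 1)
            return a, b

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            t_local = np.linspace(0.0, 100.0, 50)                 # seconds on the local clock
            true_drift, true_offset = 2e-6, 0.010                 # 2 ppm drift, 10 ms offset
            samples = []
            for t0 in t_local:
                delay = rng.uniform(1e-4, 5e-4)                   # per-trip network delay (symmetric here)
                t1 = t0 + delay + true_offset + true_drift * t0   # remote receive timestamp
                t2 = t1 + 1e-5                                    # remote send timestamp
                t3 = t0 + 2 * delay + 1e-5                        # local receive timestamp
                samples.append(offset_sample(t0, t1, t2, t3))
            print(estimate_drift(t_local, samples))               # approximately (2e-6, 0.010)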

  15. Computers for imagemaking

    CERN Document Server

    Clark, D

    1981-01-01

    Computers for Image-Making tells the computer non-expert all he needs to know about Computer Animation. In the hands of expert computer engineers, computer picture-drawing systems have, since the earliest days of computing, produced interesting and useful images. As a result of major technological developments since then, it no longer requires the expert's skill to draw pictures; anyone can do it, provided they know how to use the appropriate machinery. This collection of specially commissioned articles reflects the diversity of user applications in this expanding field

  16. Quantum computer science

    CERN Document Server

    Lanzagorta, Marco

    2009-01-01

    In this text we present a technical overview of the emerging field of quantum computation along with new research results by the authors. What distinguishes our presentation from that of others is our focus on the relationship between quantum computation and computer science. Specifically, our emphasis is on the computational model of quantum computing rather than on the engineering issues associated with its physical implementation. We adopt this approach for the same reason that a book on computer programming doesn't cover the theory and physical realization of semiconductors. Another distin

  17. Computed tomographic findings of intracranial pyogenic abscess

    International Nuclear Information System (INIS)

    Kim, S. J.; Suh, J. H.; Park, C. Y.; Lee, K. C.; Chung, S. S.

    1982-01-01

    The early diagnosis and effective treatment of brain abscess pose a difficult clinical problem. With the advent of computed tomography, however, it appears that mortality due to intracranial abscess has significantly diminished. 54 cases of intracranial pyogenic abscess are presented. Etiologic factors and computed tomographic findings are analyzed and the following results are obtained. 1. The common etiologic factors are otitis media, post operation, and head trauma, in order of frequency. 2. The most common initial computed tomographic finding of brain abscess is ring contrast enhancement with surrounding brain edema. 3. The most characteristic computed tomographic finding of ring contrast enhancement is a smooth, thin-walled ring of contrast enhancement. 4. Most thick, irregular ring contrast enhancements are abscesses associated with cyanotic heart disease or poor operation. 5. The most common finding of epidural and subdural empyema is a crescentic radiolucent area with thin-wall contrast enhancement without surrounding brain edema in the convexity of the brain

  18. Reflectivity of 1D photonic crystals: A comparison of computational schemes with experimental results

    Science.gov (United States)

    Pérez-Huerta, J. S.; Ariza-Flores, D.; Castro-García, R.; Mochán, W. L.; Ortiz, G. P.; Agarwal, V.

    2018-04-01

    We report the reflectivity of one-dimensional finite and semi-infinite photonic crystals, computed through the coupling to Bloch modes (BM) and through a transfer matrix method (TMM), and their comparison to the experimental spectral line shapes of porous silicon (PS) multilayer structures. Both methods reproduce a forbidden photonic bandgap (PBG), but slowly-converging oscillations are observed in the TMM as the number of layers increases to infinity, while a smooth converged behavior is presented with BM. The experimental reflectivity spectra is in good agreement with the TMM results for multilayer structures with a small number of periods. However, for structures with large amount of periods, the measured spectral line shapes exhibit better agreement with the smooth behavior predicted by BM.
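
    As a concrete illustration of the transfer matrix method mentioned above, the sketch below computes the normal-incidence reflectivity of a finite 1D multilayer via characteristic matrices; the refractive indices, thicknesses and number of periods are placeholder values, not the porous-silicon parameters of the measured samples, and the Bloch-mode calculation is not reproduced here.

        import numpy as np

        def layer_matrix(n, d, wavelength):
            """Characteristic matrix of one homogeneous layer at normal incidence."""
            delta = 2.0 * np.pi * n * d / wavelength
            return np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                             [1j * n * np.sin(delta), np.cos(delta)]], dtype=complex)

        def reflectivity(layers, wavelength, n_in=1.0, n_sub=3.5):
            """layers: sequence of (refractive index, thickness) from the incidence side."""
            m = np.eye(2, dtype=complex)
            for n, d in layers:
                m = m @ layer_matrix(n, d, wavelength)
            b, c = m @ np.array([1.0, n_sub], dtype=complex)
            r = (n_in * b - c) / (n_in * b + c)
            return float(abs(r) ** 2)

        if __name__ == "__main__":
            lam0 = 800.0                                   # design wavelength, nm
            n_hi, n_lo = 2.2, 1.4                          # placeholder layer indices
            pair = [(n_hi, lam0 / (4 * n_hi)), (n_lo, lam0 / (4 * n_lo))]
            stack = pair * 10                              # 10-period quarter-wave stack
            for lam in (650.0, 800.0, 950.0):
                print(lam, reflectivity(stack, lam))       # high R inside the photonic bandgap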

  19. Teaching programming to non-STEM novices: a didactical study of computational thinking and non-STEM computing education

    DEFF Research Database (Denmark)

    Spangsberg, Thomas Hvid

    research approach. Computational thinking plays a significant role in computing education, but it is still unclear how it should be interpreted to best serve its purpose. Constructionism and Computational Making seem to be promising frameworks for doing this. With regard to specific teaching activities...

  20. Migration goals and risk management in cloud computing: A review of state of the art and survey results on practitioners

    OpenAIRE

    Islam, Shareeful; Fenz, Stefan; Weippl, Edgar; Kalloniatis, Christos

    2016-01-01

    Organizations are now seriously considering adopting cloud into the existing business context, but migrating data, applications and services into the cloud does not come without substantial risks. These risks are significant barriers to wider cloud adoption. Cloud computing has obtained a lot of attention from both research and industry communities in recent years. There are works that consolidate the existing work on cloud migration and technology. However, there is no secondary...

  1. Segmentation process significantly influences the accuracy of 3D surface models derived from cone beam computed tomography

    NARCIS (Netherlands)

    Fourie, Zacharias; Damstra, Janalt; Schepers, Rutger H; Gerrits, Pieter; Ren, Yijin

    AIMS: To assess the accuracy of surface models derived from 3D cone beam computed tomography (CBCT) with two different segmentation protocols. MATERIALS AND METHODS: Seven fresh-frozen cadaver heads were used. There was no conflict of interests in this study. CBCT scans were made of the heads and 3D

  2. Computer Activities for Persons With Dementia.

    Science.gov (United States)

    Tak, Sunghee H; Zhang, Hongmei; Patel, Hetal; Hong, Song Hee

    2015-06-01

    The study examined participant's experience and individual characteristics during a 7-week computer activity program for persons with dementia. The descriptive study with mixed methods design collected 612 observational logs of computer sessions from 27 study participants, including individual interviews before and after the program. Quantitative data analysis included descriptive statistics, correlational coefficients, t-test, and chi-square. Content analysis was used to analyze qualitative data. Each participant averaged 23 sessions and 591min for 7 weeks. Computer activities included slide shows with music, games, internet use, and emailing. On average, they had a high score of intensity in engagement per session. Women attended significantly more sessions than men. Higher education level was associated with a higher number of different activities used per session and more time spent on online games. Older participants felt more tired. Feeling tired was significantly correlated with a higher number of weeks with only one session attendance per week. More anticholinergic medications taken by participants were significantly associated with a higher percentage of sessions with disengagement. The findings were significant at p < .05. Qualitative content analysis indicated tailoring computer activities appropriate to individual's needs and functioning is critical. All participants needed technical assistance. A framework for tailoring computer activities may provide guidance on developing and maintaining treatment fidelity of tailored computer activity interventions among persons with dementia. Practice guidelines and education protocols may assist caregivers and service providers to integrate computer activities into homes and aging services settings. © The Author 2015. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  3. Head and eye movement as pointing modalities for eyewear computers

    DEFF Research Database (Denmark)

    Jalaliniya, Shahram; Mardanbeigi, Diako; Pederson, Thomas

    2014-01-01

    While the new generation of eyewear computers has increased expectations of a wearable computer, providing input to these devices is still challenging. Hand-held devices, voice commands, and hand gestures have already been explored to provide input to wearable devices. In this paper, we examined using head and eye movements to point on a graphical user interface of a wearable computer. The performance of users in head and eye pointing has been compared with mouse pointing as a baseline method. The results of our experiment showed that eye pointing is significantly faster than head pointing.

  4. Comparison of endoscopic ultrasonography and multislice spiral computed tomography for the preoperative staging of gastric cancer - results of a single institution study of 610 Chinese patients.

    Directory of Open Access Journals (Sweden)

    Xing-Yu Feng

    BACKGROUND: This study compared the performance of endoscopic ultrasonography (EUS) and multislice spiral computed tomography (MSCT) in the preoperative staging of gastric cancer. METHODOLOGY/PRINCIPAL FINDINGS: A total of 610 patients participated in this study, all of whom had undergone surgical resection, had confirmed gastric cancer and were evaluated with EUS and MSCT. Tumor staging was evaluated using the Tumor-Node-Metastasis (TNM) staging and the Japanese classification. The results from the imaging modalities were compared with the postoperative histopathological outcomes. The overall accuracies of EUS and MSCT for the T staging category were 76.7% and 78.2%, respectively (P=0.537). Stratified analysis revealed that the accuracy of EUS for T1 and T2 staging was significantly higher than that of MSCT (P<0.001 for both) and that the accuracy of MSCT in T3 and T4 staging was significantly higher than that of EUS (P<0.001 and 0.037, respectively). The overall accuracy of MSCT was 67.2% when using the 13th edition Japanese classification, and this percentage was significantly higher than the accuracy of EUS (49.3%) and MSCT (44.6%) when using the 6th edition UICC classification (P<0.001 for both values). CONCLUSIONS/SIGNIFICANCE: Our results demonstrated that the overall accuracies of EUS and MSCT for preoperative staging were not significantly different. We suggest that a combination of EUS and MSCT is required for preoperative evaluation of TNM staging.

  5. Computer tomography in Caisson's disease

    Energy Technology Data Exchange (ETDEWEB)

    Horvath, F.; Csobaly, S.

    1981-07-01

    Computed tomography was performed on 20 patients in the early stages of Caisson osteoarthropathy, as well as on other patients with chronic bone infarcts. From their results the authors conclude that CT is valuable not only in the diagnosis of early cases, but also in providing significant information concerning the osteopathy and bone infarcts.

  6. Multidetector computed tomography of urolithiasis. Technique and results; Multidetektor-Computertomografie der Urolithiasis. Technik und Ergebnisse

    Energy Technology Data Exchange (ETDEWEB)

    Karul, M.; Regier, M. [Universitaetsklinikum Hamburg-Eppendorf, Hamburg (Germany). Zentrum fuer Radiologie und Endoskopie; Heuer, R. [Universitaetsklinikum Hamburg-Eppendorf, Hamburg (Germany). Zentrum fuer Operative Medizin

    2013-02-15

    Acute urolithiasis is diagnosed by unenhanced multidetector computed tomography (MDCT). This examination assesses the functional and anatomical likelihood of passing a ureteral calculus, whose localization and dimensions are important parameters for further therapy. Alternatively, chronic urolithiasis can be ruled out by magnetic resonance urography (MRU). MRU is the first choice, especially in pregnant women and children, for reasons of radiation protection. Enhanced MDCT must be emphasized as an alternative to intravenous urography (IVU) for the diagnosis of complex urinary drainage and suspected disorders of the involved kidney. This review illustrates the principles of the different tests and their clinical relevance. (orig.)

  7. Substituting computers for services - potential to reduce ICT's environmental footprint

    Energy Technology Data Exchange (ETDEWEB)

    Plepys, A. [The International Inst. for Industrial Environmental Economics at Lund Univ. (Sweden)

    2004-07-01

    The environmental footprint of IT products is significant and, in spite of manufacturing and product design improvements, growing consumption of electronics results in increasing absolute environmental impact. Computers have a short technological lifespan, and much of their built-in performance, although necessary, remains idle for most of the time. Today, most computers used in non-residential sectors are connected to networks. The premise of this paper is that computer networks are an untapped resource which could allow addressing the environmental impacts of IT products by centralising and sharing computing resources. The article presents results of a comparative study of two computing architectures. The first is the traditional decentralised PC-based system and the second a centralised server-based computing (SBC) system. Both systems deliver equivalent functions to the final users and can therefore be compared on a one-to-one basis. The study evaluates product lifespan, energy consumption in the use stage, product design and its environmental implications in manufacturing. (orig.)

  8. NET-COMPUTER: Internet Computer Architecture and its Application in E-Commerce

    OpenAIRE

    P. O. Umenne; M. O. Odhiambo

    2012-01-01

    Research in Intelligent Agents has yielded interesting results, some of which have been translated into commercial ventures. Intelligent Agents are executable software components that represent the user, perform tasks on behalf of the user and when the task terminates, the Agents send the result to the user. Intelligent Agents are best suited for the Internet: a collection of computers connected together in a world-wide computer network. Swarm and HYDRA computer architectures for Agents' ex...

  9. An Algebra-Based Introductory Computational Neuroscience Course with Lab.

    Science.gov (United States)

    Fink, Christian G

    2017-01-01

    A course in computational neuroscience has been developed at Ohio Wesleyan University which requires no previous experience with calculus or computer programming, and which exposes students to theoretical models of neural information processing and techniques for analyzing neural data. The exploration of theoretical models of neural processes is conducted in the classroom portion of the course, while data analysis techniques are covered in lab. Students learn to program in MATLAB and are offered the opportunity to conclude the course with a final project in which they explore a topic of their choice within computational neuroscience. Results from a questionnaire administered at the beginning and end of the course indicate significant gains in student facility with core concepts in computational neuroscience, as well as with analysis techniques applied to neural data.

  10. Analytical and empirical mathematics with computers

    International Nuclear Information System (INIS)

    Wolfram, S.

    1986-01-01

    In this presentation, some of the practical methodological and theoretical implications of computation for the mathematical sciences are discussed. Computers are becoming an increasingly significant tool for research in the mathematical sciences. This paper discusses some of the fundamental ways in which computers have and can be used to do mathematics

  11. Preparing Future Secondary Computer Science Educators

    Science.gov (United States)

    Ajwa, Iyad

    2007-01-01

    Although nearly every college offers a major in computer science, many computer science teachers at the secondary level have received little formal training. This paper presents details of a project that could make a significant contribution to national efforts to improve computer science education by combining teacher education and professional…

  12. Computers and data processing

    CERN Document Server

    Deitel, Harvey M

    1985-01-01

    Computers and Data Processing provides information pertinent to the advances in the computer field. This book covers a variety of topics, including the computer hardware, computer programs or software, and computer applications systems.Organized into five parts encompassing 19 chapters, this book begins with an overview of some of the fundamental computing concepts. This text then explores the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapters consider how computers present their results and explain the storage and retrieval of

  13. The clinical significance of Fuji computed radiography on lateral chest radiogram

    International Nuclear Information System (INIS)

    Kifune, Kouichi

    1995-01-01

    The purpose of this study was to clarify the benefits of digital lateral chest radiogram. In the basic study, the modulation transfer factor (MTF) and the wiener spectra (WS) of conventional screen film (CSF) and Fuji computed radiography (FCR) were measured. The visibility of the simulated nodules on FCR using 3 human bodies was subjectively compared with that on CSF by 13 observers. In the clinical study, the visibility of the normal structures on FCR was subjectively compared with that on CSF using 50 lateral chest radiograms by 10 observers. The diagnostic performance to detect pulmonary nodules on FCR was also compared with that on CSF using each 30 positive and negative cases by 8 observers. In the basic study, the MTF of FCR was superior to that of CSF, and the WS of FCR displayed in half size was superior to that of CSF. In all exposure conditions, the visibility of the nodules on FCR in the pulmonary apex was inferior to that on CSF, while FCR was superior to CSF in the other lung field. However, the visibility of the nodules on FCR in the pulmonary apex was improved when the exposure condition was increased. In the clinical study, the visibility of the normal structures on FCR was comparable or superior to that on CSF except for interlobar fissure due to resolution properties. The diagnostic performance of pulmonary nodules on FCR was comparable to that on CSF especially in classifying the marginal character and diameter of the nodules. According to the location of the nodules, the detectability of FCR was superior to that of CSF in the retrosternal space and tended to be inferior to that of CSF in the pulmonary apex. An adequate exposure condition should be considered before discussing the visibility and detectability of abnormal shadow in the lateral chest radiogram. In conclusion, the digital lateral chest radiogram is superior to the CSF images, mainly because of wide latitude in FCR. (author)

  14. The Research of the Parallel Computing Development from the Angle of Cloud Computing

    Science.gov (United States)

    Peng, Zhensheng; Gong, Qingge; Duan, Yanyu; Wang, Yun

    2017-10-01

    Cloud computing is the development of parallel computing, distributed computing and grid computing. The development of cloud computing makes parallel computing come into people's lives. Firstly, this paper expounds the concept of cloud computing and introduces several traditional parallel programming models. Secondly, it analyzes and studies the principles, advantages and disadvantages of OpenMP, MPI and MapReduce respectively. Finally, it compares the MPI and OpenMP models with MapReduce from the angle of cloud computing. The results of this paper are intended to provide a reference for the development of parallel computing.
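
    To make the MapReduce programming model concrete, here is a toy word count expressed as explicit map, shuffle and reduce phases in plain Python with process pools standing in for cluster workers; this is only an illustrative sketch, not Hadoop, Spark, or the models compared in the paper.

        from collections import defaultdict
        from multiprocessing import Pool

        def map_phase(line):
            """Emit (key, value) pairs from one input record."""
            return [(word.lower(), 1) for word in line.split()]

        def reduce_phase(item):
            """Aggregate all values collected for one key."""
            key, values = item
            return key, sum(values)

        if __name__ == "__main__":
            records = ["cloud computing grew from parallel computing",
                       "parallel computing and grid computing"]
            with Pool(2) as pool:                        # the 'map' tasks run in parallel
                mapped = [kv for part in pool.map(map_phase, records) for kv in part]
            shuffled = defaultdict(list)                 # group values by key ('shuffle')
            for key, value in mapped:
                shuffled[key].append(value)
            with Pool(2) as pool:                        # the 'reduce' tasks run in parallel
                counts = dict(pool.map(reduce_phase, shuffled.items()))
            print(counts)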

  15. Nutcracker or left renal vein compression phenomenon: multidetector computed tomography findings and clinical significance

    Energy Technology Data Exchange (ETDEWEB)

    Cuellar i Calabria, Hug; Quiroga Gomez, Sergi; Sebastia Cerqueda, Carmen; Boye de la Presa, Rosa; Miranda, Americo; Alvarez-Castells, Agusti [Hospitals Universitaris Vall D' Hebron, Institut de Diagnostic Per La Imatge, Servei De Radiodiagnostic, Barcelona (Spain)

    2005-08-01

    The use of multidetector computed tomography (MDCT) in routine abdominal explorations has increased the detection of the nutcracker phenomenon, defined as left renal vein (LRV) compression by adjacent anatomic structures. The embryology and anatomy of the nutcracker phenomenon are relevant as a background for the nutcracker syndrome, a rare cause of hematuria as well as other symptoms. MDCT examples of collateral renal vein circulation (gonadal, ureteric, azygous, lumbar, capsular) and aortomesenteric (anterior) and retroaortic (posterior) nutcracker phenomena in patients with no urologic complaint are shown as well as studies performed on patients with gross hematuria of uncertain origin. Incidental observation of collateral veins draining the LRV in abdominal MDCT explorations of asymptomatic patients may be a sign of a compensating nutcracker phenomenon. Imbalance between LRV compression and development of collateral circulation may lead to symptomatic nutcracker syndrome. (orig.)

  16. Nutcracker or left renal vein compression phenomenon: multidetector computed tomography findings and clinical significance

    International Nuclear Information System (INIS)

    Cuellar i Calabria, Hug; Quiroga Gomez, Sergi; Sebastia Cerqueda, Carmen; Boye de la Presa, Rosa; Miranda, Americo; Alvarez-Castells, Agusti

    2005-01-01

    The use of multidetector computed tomography (MDCT) in routine abdominal explorations has increased the detection of the nutcracker phenomenon, defined as left renal vein (LRV) compression by adjacent anatomic structures. The embryology and anatomy of the nutcracker phenomenon are relevant as a background for the nutcracker syndrome, a rare cause of hematuria as well as other symptoms. MDCT examples of collateral renal vein circulation (gonadal, ureteric, azygous, lumbar, capsular) and aortomesenteric (anterior) and retroaortic (posterior) nutcracker phenomena in patients with no urologic complaint are shown as well as studies performed on patients with gross hematuria of uncertain origin. Incidental observation of collateral veins draining the LRV in abdominal MDCT explorations of asymptomatic patients may be a sign of a compensating nutcracker phenomenon. Imbalance between LRV compression and development of collateral circulation may lead to symptomatic nutcracker syndrome. (orig.)

  17. Standardized computer-based organized reporting of EEG

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C.

    2017-01-01

    Standardized terminology for computer-based assessment and reporting of EEG has been previously developed in Europe. The International Federation of Clinical Neurophysiology established a taskforce in 2013 to develop this further, and to reach international consensus. This work resulted in the second, revised version of SCORE (Standardized Computer-based Organized Reporting of EEG), which is presented in this paper. The revised terminology was implemented in a software package (SCORE EEG), which was tested in clinical practice on 12,160 EEG recordings. Standardized terms implemented in SCORE... In the end, the diagnostic significance is scored, using a standardized list of terms. SCORE has specific modules for scoring seizures (including seizure semiology and ictal EEG patterns), neonatal recordings (including features specific for this age group), and for Critical Care EEG Terminology. SCORE...

  18. Performing quantum computing experiments in the cloud

    Science.gov (United States)

    Devitt, Simon J.

    2016-09-01

    Quantum computing technology has reached a second renaissance in the past five years. Increased interest from both the private and public sector combined with extraordinary theoretical and experimental progress has solidified this technology as a major advancement in the 21st century. As anticipated by many, some of the first realizations of quantum computing technology have occurred over the cloud, with users logging onto dedicated hardware over the classical internet. Recently, IBM has released the Quantum Experience, which allows users to access a five-qubit quantum processor. In this paper we take advantage of this online availability of actual quantum hardware and present four quantum information experiments. We utilize the IBM chip to realize protocols in quantum error correction, quantum arithmetic, quantum graph theory, and fault-tolerant quantum computation by accessing the device remotely through the cloud. While the results are subject to significant noise, the correct results are returned from the chip. This demonstrates the power of experimental groups opening up their technology to a wider audience and will hopefully allow for the next stage of development in quantum information technology.
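
    These are not the paper's error-correction or arithmetic circuits; purely for illustration, the sketch below simulates the ideal, noise-free statevector of the kind of tiny circuit (here, a two-qubit Bell pair) a user might submit to a cloud-hosted device, which is a common way to sanity-check the result expected from real hardware.

        import numpy as np

        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)          # Hadamard gate
        I2 = np.eye(2)
        CNOT = np.array([[1, 0, 0, 0],
                         [0, 1, 0, 0],
                         [0, 0, 0, 1],
                         [0, 0, 1, 0]])                        # control = qubit 0 (left factor)

        state = np.zeros(4); state[0] = 1.0                    # start in |00>
        state = np.kron(H, I2) @ state                         # H on qubit 0
        state = CNOT @ state                                   # entangle -> (|00> + |11>)/sqrt(2)

        probs = np.abs(state) ** 2
        for basis, p in zip(["00", "01", "10", "11"], probs):
            print(basis, round(float(p), 3))                   # ideally 0.5 / 0 / 0 / 0.5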

  19. A programming approach to computability

    CERN Document Server

    Kfoury, A J; Arbib, Michael A

    1982-01-01

    Computability theory is at the heart of theoretical computer science. Yet, ironically, many of its basic results were discovered by mathematical logicians prior to the development of the first stored-program computer. As a result, many texts on computability theory strike today's computer science students as far removed from their concerns. To remedy this, we base our approach to computability on the language of while-programs, a lean subset of PASCAL, and postpone consideration of such classic models as Turing machines, string-rewriting systems, and mu-recursive functions till the final chapter. Moreover, we balance the presentation of unsolvability results such as the unsolvability of the Halting Problem with a presentation of the positive results of modern programming methodology, including the use of proof rules, and the denotational semantics of programs. Computer science seeks to provide a scientific basis for the study of information processing, the solution of problems by algorithms, and the design ...

  20. A regional land use survey based on remote sensing and other data: A report on a LANDSAT and computer mapping project, volume 2

    Science.gov (United States)

    Nez, G. (Principal Investigator); Mutter, D.

    1977-01-01

    The author has identified the following significant results. The project mapped land use/cover classifications from LANDSAT computer compatible tape data and combined those results with other multisource data via computer mapping/compositing techniques to analyze various land use planning/natural resource management problems. Data were analyzed on 1:24,000 scale maps at 1.1 acre resolution. LANDSAT analysis software and linkages with other computer mapping software were developed. Significant results were also achieved in training, communication, and identification of needs for developing the LANDSAT/computer mapping technologies into operational tools for use by decision makers.

  1. Using Amazon's Elastic Compute Cloud to scale CMS' compute hardware dynamically.

    CERN Document Server

    Melo, Andrew Malone

    2011-01-01

    Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud-computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long-term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost structure and transient nature of EC2 services make them inappropriate for some CMS production services and functions. We also found that the resources are not truly on-demand, as limits and caps on usage are imposed. Our trial workflows allow us t...

  2. Performing an allreduce operation on a plurality of compute nodes of a parallel computer

    Science.gov (United States)

    Faraj, Ahmad [Rochester, MN

    2012-04-17

    Methods, apparatus, and products are disclosed for performing an allreduce operation on a plurality of compute nodes of a parallel computer. Each compute node includes at least two processing cores. Each processing core has contribution data for the allreduce operation. Performing an allreduce operation on a plurality of compute nodes of a parallel computer includes: establishing one or more logical rings among the compute nodes, each logical ring including at least one processing core from each compute node; performing, for each logical ring, a global allreduce operation using the contribution data for the processing cores included in that logical ring, yielding a global allreduce result for each processing core included in that logical ring; and performing, for each compute node, a local allreduce operation using the global allreduce results for each processing core on that compute node.
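
    A minimal Python sketch of the hierarchical scheme described above is given below; it is an illustration, not the disclosed implementation. The node/core layout and the use of summation as the reduction operator are assumptions made for the example: one logical ring is formed per core index (one core from every node), a global allreduce is performed within each ring, and a local allreduce on each node then combines the per-ring results.

      # Hedged sketch: hierarchical allreduce over "nodes" of "cores", reduction = sum.
      def allreduce(node_data):
          # node_data[n][c] is the contribution of core c on node n.
          num_nodes = len(node_data)
          cores_per_node = len(node_data[0])

          # Step 1: one logical ring per core index, containing one core from every node.
          ring_results = []
          for c in range(cores_per_node):
              ring = [node_data[n][c] for n in range(num_nodes)]
              ring_results.append(sum(ring))      # global allreduce within the ring

          # Step 2: local allreduce on each node combines the per-ring results.
          total = sum(ring_results)
          return [[total] * cores_per_node for _ in range(num_nodes)]

      if __name__ == "__main__":
          data = [[1, 2], [3, 4], [5, 6]]         # 3 nodes x 2 cores
          print(allreduce(data))                  # every core ends with 21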

  3. The clinical significance of isolated loss of lordosis on cervical spine computed tomography in blunt trauma patients: a prospective evaluation of 1,007 patients.

    Science.gov (United States)

    Mejaddam, Ali Y; Kaafarani, Haytham M A; Ramly, Elie P; Avery, Laura L; Yeh, Dante D; King, David R; de Moya, Marc A; Velmahos, George C

    2015-11-01

    A negative computed tomographic (CT) scan may be used to rule out cervical spine (c-spine) injury after trauma. Loss of lordosis (LOL) is frequently found as the only CT abnormality. We investigated whether LOL should preclude c-spine clearance. All adult trauma patients with isolated LOL at our Level I trauma center (February 1, 2011 to May 31, 2012) were prospectively evaluated. The primary outcome was clinically significant injury on magnetic resonance imaging (MRI), flexion-extension views, and/or repeat physical examination. Of 3,333 patients (40 ± 17 years, 60% men) with a c-spine CT, 1,007 (30%) had isolated LOL. Among 841 patients with a Glasgow Coma Scale score of 15, no abnormalities were found on MRI, flexion-extension views, and/or repeat examinations, and all collars were removed. Among 166 patients with Glasgow Coma Scale less than 15, 3 (0.3%) had minor abnormal MRI findings but no clinically significant injury. Isolated LOL on c-spine CT is not associated with a clinically significant injury and should not preclude c-spine clearance. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. A History of Computer Numerical Control.

    Science.gov (United States)

    Haggen, Gilbert L.

    Computer numerical control (CNC) has evolved from the first significant counting method--the abacus. Babbage had perhaps the greatest impact on the development of modern day computers with his analytical engine. Hollerith's functioning machine with punched cards was used in tabulating the 1890 U.S. Census. In order for computers to become a…

  5. A complex-plane strategy for computing rotating polytropic models - Numerical results for strong and rapid differential rotation

    International Nuclear Information System (INIS)

    Geroyannis, V.S.

    1990-01-01

    In this paper, a numerical method, called complex-plane strategy, is implemented in the computation of polytropic models distorted by strong and rapid differential rotation. The differential rotation model results from a direct generalization of the classical model, in the framework of the complex-plane strategy; this generalization yields very strong differential rotation. Accordingly, the polytropic models assume extremely distorted interiors, while their boundaries are slightly distorted. For an accurate simulation of differential rotation, a versatile method, called multiple partition technique is developed and implemented. It is shown that the method remains reliable up to rotation states where other elaborate techniques fail to give accurate results. 11 refs

  6. Wing-Body Aeroelasticity Using Finite-Difference Fluid/Finite-Element Structural Equations on Parallel Computers

    Science.gov (United States)

    Byun, Chansup; Guruswamy, Guru P.; Kutler, Paul (Technical Monitor)

    1994-01-01

    In recent years significant advances have been made for parallel computers in both hardware and software. Now parallel computers have become viable tools in computational mechanics. Many application codes developed on conventional computers have been modified to benefit from parallel computers. Significant speedups in some areas have been achieved by parallel computations. For single-discipline use of both fluid dynamics and structural dynamics, computations have been made on wing-body configurations using parallel computers. However, only a limited amount of work has been completed in combining these two disciplines for multidisciplinary applications. The prime reason is the increased level of complication associated with a multidisciplinary approach. In this work, procedures to compute aeroelasticity on parallel computers using direct coupling of fluid and structural equations will be investigated for wing-body configurations. The parallel computer selected for computations is an Intel iPSC/860 computer which is a distributed-memory, multiple-instruction, multiple data (MIMD) computer with 128 processors. In this study, the computational efficiency issues of parallel integration of both fluid and structural equations will be investigated in detail. The fluid and structural domains will be modeled using finite-difference and finite-element approaches, respectively. Results from the parallel computer will be compared with those from the conventional computers using a single processor. This study will provide an efficient computational tool for the aeroelastic analysis of wing-body structures on MIMD type parallel computers.

  7. Computer-aided drug design at Boehringer Ingelheim

    Science.gov (United States)

    Muegge, Ingo; Bergner, Andreas; Kriegl, Jan M.

    2017-03-01

    Computer-Aided Drug Design (CADD) is an integral part of the drug discovery endeavor at Boehringer Ingelheim (BI). CADD contributes to the evaluation of new therapeutic concepts, identifies small molecule starting points for drug discovery, and develops strategies for optimizing hit and lead compounds. The CADD scientists at BI benefit from the global use and development of both software platforms and computational services. A number of computational techniques developed in-house have significantly changed the way early drug discovery is carried out at BI. In particular, virtual screening in vast chemical spaces, which can be accessed by combinatorial chemistry, has added a new option for the identification of hits in many projects. Recently, a new framework has been implemented allowing fast, interactive predictions of relevant on and off target endpoints and other optimization parameters. In addition to the introduction of this new framework at BI, CADD has been focusing on the enablement of medicinal chemists to independently perform an increasing amount of molecular modeling and design work. This is made possible through the deployment of MOE as a global modeling platform, allowing computational and medicinal chemists to freely share ideas and modeling results. Furthermore, a central communication layer called the computational chemistry framework provides broad access to predictive models and other computational services.

  8. The relationship between computer games and quality of life in adolescents.

    Science.gov (United States)

    Dolatabadi, Nayereh Kasiri; Eslami, Ahmad Ali; Mostafavi, Firooze; Hassanzade, Akbar; Moradi, Azam

    2013-01-01

    Playing computer games is growing rapidly among teenagers. This popular phenomenon can cause physical and psychosocial problems for them. Therefore, this study examined the relationship between computer games and quality of life domains in adolescents aged 12-15 years. In a cross-sectional study using a 2-stage stratified cluster sampling method, 444 male and female students in Borkhar were selected. The data collection tools consisted of 1) the World Health Organization Quality Of Life - BREF questionnaire and 2) a personal information questionnaire. The data were analyzed by Pearson correlation, Spearman correlation, chi-square, independent t-tests and analysis of covariance. The total mean score of quality of life in students was 67.11±13.34. The results showed a significant relationship between the age of starting to play games and the overall quality of life score and its four domains (range r=-0.13 to -0.18). The mean overall quality of life score in computer game users was 68.27±13.03, while it was 64.81±13.69 among those who did not play computer games, and the difference was significant (P=0.01). There were also significant differences in the environmental and mental health domains between the two groups. Playing computer games for a short time under parental supervision can have positive effects on quality of life in adolescents. However, spending long hours playing computer games may have negative long-term effects.

  9. On the Computational Capabilities of Physical Systems. Part 1; The Impossibility of Infallible Computation

    Science.gov (United States)

    Wolpert, David H.; Koga, Dennis (Technical Monitor)

    2000-01-01

    In this first of two papers, strong limits on the accuracy of physical computation are established. First it is proven that there cannot be a physical computer C to which one can pose any and all computational tasks concerning the physical universe. Next it is proven that no physical computer C can correctly carry out any computational task in the subset of such tasks that can be posed to C. This result holds whether the computational tasks concern a system that is physically isolated from C, or instead concern a system that is coupled to C. As a particular example, this result means that there cannot be a physical computer that can, for any physical system external to that computer, take the specification of that external system's state as input and then correctly predict its future state before that future state actually occurs; one cannot build a physical computer that can be assured of correctly 'processing information faster than the universe does'. The results also mean that there cannot exist an infallible, general-purpose observation apparatus, and that there cannot be an infallible, general-purpose control apparatus. These results do not rely on systems that are infinite, and/or non-classical, and/or obey chaotic dynamics. They also hold even if one uses an infinitely fast, infinitely dense computer, with computational powers greater than that of a Turing Machine. This generality is a direct consequence of the fact that a novel definition of computation - a definition of 'physical computation' - is needed to address the issues considered in these papers. While this definition does not fit into the traditional Chomsky hierarchy, the mathematical structure and impossibility results associated with it have parallels in the mathematics of the Chomsky hierarchy. The second in this pair of papers presents a preliminary exploration of some of this mathematical structure, including in particular that of prediction complexity, which is a 'physical computation

  10. Results of computer-tomographic examination in different forms and course of schizophrenia

    International Nuclear Information System (INIS)

    Stojchev, R.

    1991-01-01

    Data are reported from a clinical and computer-tomographic study of 103 schizophrenic patients. Those with the simple form of the disease had the most pronounced evidence of dilated III and lateral ventricles (41.8% of cases for the III ventricle and 72.4% for the lateral ventricles). All patients with the circular, simple and catatonic forms had signs of pathology of the cortical sulci. Regarding the ventricular system, evidence of pathology prevailed in cases of impetus-progredient and constantly progredient course, whereas with respect to cortical pathology the results were almost identical in all three types of psychosis - 95.2% of cases of constantly progredient and 95.6% of impetus-progredient course. Attention is called to the 'surprising' finding of organic brain injury in patients with the paranoid and circular forms of the disease, as well as in the clinically most benign impetus course. It is assumed that morphologic changes in the brain of schizophrenic patients are a natural phenomenon, but so far they have not been the subject of comprehensive studies, perhaps because of prejudice or a lack of appropriate methods for examining the brain during the patient's lifetime. 6 figs., 15 refs

  11. Computation of asteroid proper elements on the Grid

    Directory of Open Access Journals (Sweden)

    Novaković B.

    2009-01-01

    Full Text Available A procedure of gridification of the computation of asteroid proper orbital elements is described. The need to speed up the time consuming computations and make them more efficient is justified by the large increase of observational data expected from the next generation all sky surveys. We give the basic notion of proper elements and of the contemporary theories and methods used to compute them for different populations of objects. Proper elements for nearly 70,000 asteroids are derived since the beginning of use of the Grid infrastructure for the purpose. The average time for the catalogs update is significantly shortened with respect to the time needed with stand-alone workstations. We also present basics of the Grid computing, the concepts of Grid middleware and its Workload management system. The practical steps we undertook to efficiently gridify our application are described in full detail. We present the results of a comprehensive testing of the performance of different Grid sites, and offer some practical conclusions based on the benchmark results and on our experience. Finally, we propose some possibilities for the future work.

  12. Computation of Asteroid Proper Elements on the Grid

    Directory of Open Access Journals (Sweden)

    Novaković, B.

    2009-12-01

    Full Text Available A procedure of gridification of the computation of asteroid proper orbital elements is described. The need to speed up the time consuming computations and make them more efficient is justified by the large increase of observational data expected from the next generation all sky surveys. We give the basic notion of proper elements and of the contemporary theories and methods used to compute them for different populations of objects. Proper elements for nearly 70,000 asteroids are derived since the beginning of use of the Grid infrastructure for the purpose. The average time for the catalogs update is significantly shortened with respect to the time needed with stand-alone workstations. We also present basics of the Grid computing, the concepts of Grid middleware and its Workload management system. The practical steps we undertook to efficiently gridify our application are described in full detail. We present the results of a comprehensive testing of the performance of different Grid sites, and offer some practical conclusions based on the benchmark results and on our experience. Finally, we propose some possibilities for the future work.

  13. Metaanalysis of Diagnostic Performance of Computed Coronary Tomography Angiography, Computed Tomography Perfusion and Computed Tomography-Fractional Flow Reserve in Functional Myocardial Ischemia Assessment versus Invasive Fractional Flow Reserve

    Science.gov (United States)

    Gonzalez, Jorge A.; Lipinski, Michael J.; Flors, Lucia F.; Shaw, Peter; Kramer, Christopher M.; Salerno, Michael

    2015-01-01

    We sought to compare the diagnostic performance of computed coronary tomography angiography (CCTA), computed tomography perfusion (CTP) and computed tomography fractional flow reserve (CT-FFR) for assessing the functional significance of coronary stenosis as defined by invasive fractional flow reserve (FFR), in patients with known or suspected coronary artery disease. CCTA has proven clinically useful for excluding obstructive CAD due to its high sensitivity and negative predictive value (NPV); however, the ability of CCTA to identify functionally significant CAD has remained challenging. We searched PubMed/Medline for studies evaluating CCTA, CTP or CT-FFR for the non-invasive detection of obstructive CAD as compared to catheter-derived FFR as the reference standard. Pooled sensitivity, specificity, PPV, NPV, likelihood ratios (LR), and odds ratios (OR) of all diagnostic tests were assessed. Eighteen studies involving a total of 1535 patients were included. CCTA demonstrated a pooled sensitivity of 0.92, specificity of 0.43, PPV of 0.56 and NPV of 0.87 on a per-patient level. CT-FFR and CTP increased the specificity to 0.72 and 0.77, respectively (P=0.004 and P=0.0009), resulting in higher point estimates for PPV of 0.70 and 0.83, respectively. There was no improvement in sensitivity. The CTP protocol involved more radiation (3.5 mSv CCTA vs 9.6 mSv CTP) and a higher volume of iodinated contrast (145 mL). In conclusion, CTP and CT-FFR improve the specificity of CCTA for detecting functionally significant stenosis as defined by invasive FFR on a per-patient level; both techniques could advance the ability to non-invasively detect the functional significance of coronary lesions. PMID:26347004
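
    The dependence of PPV and NPV on specificity can be made concrete with the standard Bayes relations; the sketch below is not taken from the meta-analysis, and the 50% pre-test probability is an assumed, illustrative prevalence, with only the pooled sensitivity and specificity values quoted from the abstract.

      # Standard relations between sensitivity, specificity, prevalence, PPV and NPV;
      # the prevalence value below is a hypothetical example, not taken from the study.
      def predictive_values(sensitivity, specificity, prevalence):
          tp = sensitivity * prevalence
          fn = (1 - sensitivity) * prevalence
          fp = (1 - specificity) * (1 - prevalence)
          tn = specificity * (1 - prevalence)
          ppv = tp / (tp + fp)
          npv = tn / (tn + fn)
          return ppv, npv

      # Pooled CCTA estimates from the abstract, with an assumed 50% pre-test probability.
      print(predictive_values(0.92, 0.43, 0.50))
      # Raising specificity (as reported for CTP / CT-FFR) mainly improves PPV.
      print(predictive_values(0.92, 0.77, 0.50))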

  14. Parallel computing works

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-23

    An account of the Caltech Concurrent Computation Program (C{sup 3}P), a five-year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C{sup 3}P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C{sup 3}P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  15. Volunteered Cloud Computing for Disaster Management

    Science.gov (United States)

    Evans, J. D.; Hao, W.; Chettri, S. R.

    2014-12-01

    Disaster management relies increasingly on interpreting earth observations and running numerical models, which require significant computing capacity - usually on short notice and at irregular intervals. Peak computing demand during event detection, hazard assessment, or incident response may exceed agency budgets; however, some of it can be met through volunteered computing, which distributes subtasks to participating computers via the Internet. This approach has enabled large projects in mathematics, basic science, and climate research to harness the slack computing capacity of thousands of desktop computers. This capacity is likely to diminish as desktops give way to battery-powered mobile devices (laptops, smartphones, tablets) in the consumer market; but as cloud computing becomes commonplace, it may offer significant slack capacity -- if its users are given an easy, trustworthy mechanism for participating. Such a "volunteered cloud computing" mechanism would also offer several advantages over traditional volunteered computing: tasks distributed within a cloud have fewer bandwidth limitations; granular billing mechanisms allow small slices of "interstitial" computing at no marginal cost; and virtual storage volumes allow in-depth, reversible machine reconfiguration. Volunteered cloud computing is especially suitable for "embarrassingly parallel" tasks, including ones requiring large data volumes: examples in disaster management include near-real-time image interpretation, pattern / trend detection, or large model ensembles. In the context of a major disaster, we estimate that cloud users (if suitably informed) might volunteer hundreds to thousands of CPU cores across a large provider such as Amazon Web Services. To explore this potential, we are building a volunteered cloud computing platform and targeting it to a disaster management context. Using a lightweight, fault-tolerant network protocol, this platform helps cloud users join parallel computing projects

  16. Restructuring the CS 1 classroom: Examining the effect of open laboratory-based classes vs. closed laboratory-based classes on Computer Science 1 students' achievement and attitudes toward computers and computer courses

    Science.gov (United States)

    Henderson, Jean Foster

    The purpose of this study was to assess the effect of classroom restructuring involving computer laboratories on student achievement and student attitudes toward computers and computer courses. The effects of the targeted student attributes of gender, previous programming experience, math background, and learning style were also examined. The open lab-based class structure consisted of a traditional lecture class with a separate, unscheduled lab component in which lab assignments were completed outside of class; the closed lab-based class structure integrated a lab component within the lecture class so that half the class was reserved for lecture and half the class was reserved for students to complete lab assignments by working cooperatively with each other and under the supervision and guidance of the instructor. The sample consisted of 71 students enrolled in four intact classes of Computer Science I during the fall and spring semesters of the 2006--2007 school year at two southern universities: two classes were held in the fall (one at each university) and two classes were held in the spring (one at each university). A counterbalanced repeated measures design was used in which all students experienced both class structures for half of each semester. The order of control and treatment was rotated among the four classes. All students received the same amount of class and instructor time. A multivariate analysis of variance (MANOVA) via a multiple regression strategy was used to test the study's hypotheses. Although the overall MANOVA model was statistically significant, independent follow-up univariate analyses relative to each dependent measure found that the only significant research factor was math background: Students whose mathematics background was at the level of Calculus I or higher had significantly higher student achievement than students whose mathematics background was less than Calculus I. The results suggest that classroom structures that

  17. Web-based thyroid imaging reporting and data system: Malignancy risk of atypia of undetermined significance or follicular lesion of undetermined significance thyroid nodules calculated by a combination of ultrasonography features and biopsy results.

    Science.gov (United States)

    Choi, Young Jun; Baek, Jung Hwan; Shin, Jung Hee; Shim, Woo Hyun; Kim, Seon-Ok; Lee, Won-Hong; Song, Dong Eun; Kim, Tae Yong; Chung, Ki-Wook; Lee, Jeong Hyun

    2018-05-13

    The purpose of this study was to construct a web-based predictive model using ultrasound characteristics and subcategorized biopsy results for thyroid nodules of atypia of undetermined significance/follicular lesion of undetermined significance (AUS/FLUS) to stratify the risk of malignancy. Data included 672 thyroid nodules from 656 patients from a historical cohort. We analyzed ultrasound images of thyroid nodules and biopsy results according to nuclear atypia and architectural atypia. Multivariate logistic regression analysis was performed to predict whether nodules were diagnosed as malignant or benign. The ultrasound features, including spiculated margin, marked hypoechogenicity, calcifications, biopsy results, and cytologic atypia, showed significant differences between groups. A 13-point risk scoring system was developed, and the area under the curve (AUC) of the receiver operating characteristic (ROC) curve of the development and validation sets were 0.837 and 0.830, respectively (http://www.gap.kr/thyroidnodule_b3.php). We devised a web-based predictive model using the combined information of ultrasound characteristics and biopsy results for AUS/FLUS thyroid nodules to stratify the malignant risk. © 2018 Wiley Periodicals, Inc.
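
    The study's web-based model assigns points to ultrasound and cytology features and sums them into a 13-point score; the sketch below illustrates that general idea with hypothetical features, weights, and data (none of which reproduce the published scoring system), together with a rank-based AUC computation of the kind used to report the 0.83-0.84 figures.

      # Hedged sketch of a point-based risk score and its ROC AUC; the features and
      # weights below are hypothetical, not the published 13-point model for AUS/FLUS.
      def risk_score(nodule):
          points = 0
          points += 3 if nodule["spiculated_margin"] else 0
          points += 3 if nodule["marked_hypoechogenicity"] else 0
          points += 2 if nodule["calcifications"] else 0
          points += 5 if nodule["nuclear_atypia"] else 0
          return points                           # 0..13

      def auc(scores, labels):
          # Probability that a random malignant nodule outranks a random benign one.
          pos = [s for s, y in zip(scores, labels) if y == 1]
          neg = [s for s, y in zip(scores, labels) if y == 0]
          wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
          return wins / (len(pos) * len(neg))

      # Tiny illustrative dataset (hypothetical nodules; 1 = malignant, 0 = benign).
      nodules = [
          {"spiculated_margin": 1, "marked_hypoechogenicity": 0, "calcifications": 1, "nuclear_atypia": 1},
          {"spiculated_margin": 0, "marked_hypoechogenicity": 0, "calcifications": 0, "nuclear_atypia": 0},
          {"spiculated_margin": 0, "marked_hypoechogenicity": 1, "calcifications": 0, "nuclear_atypia": 1},
          {"spiculated_margin": 0, "marked_hypoechogenicity": 0, "calcifications": 1, "nuclear_atypia": 0},
      ]
      labels = [1, 0, 1, 0]
      scores = [risk_score(n) for n in nodules]
      print(scores, auc(scores, labels))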

  18. Processing computed tomography images by using personal computer

    International Nuclear Information System (INIS)

    Seto, Kazuhiko; Fujishiro, Kazuo; Seki, Hirofumi; Yamamoto, Tetsuo.

    1994-01-01

    Processing of CT images was attempted by using a popular personal computer. The program for image processing was written with a C compiler. The original images, acquired with a CT scanner (TCT-60A, Toshiba), were transferred to the computer on 8-inch flexible diskettes. Many fundamental image-processing operations were performed, such as displaying the image on the monitor, calculating CT values, and drawing profile curves. The results showed that a popular personal computer has the ability to process CT images. It seemed that the 8-inch flexible diskette was still a useful medium for transferring image data. (author)

  19. Remote sensing of oceanic primary production: Computations using a spectral model

    Digital Repository Service at National Institute of Oceanography (India)

    Sathyendranath, S.; Platt, T.; Caverhill, C.M.; Warnock, R.E.; Lewis, M.R.

    A spectral model of underwater irradiance is coupled with a spectral version of the photosynthesis-light relationship to compute oceanic primary production. The results are shown to be significantly different from those obtained using...
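
    A hedged sketch of such a coupled spectral computation is given below: spectral surface irradiance is attenuated with depth per wavelength (Beer-Lambert), summed into usable light, passed through a saturating photosynthesis-light response, and integrated over the water column. All coefficients and the specific P-I form are illustrative assumptions, not values or equations from the paper.

      import math

      # Hedged sketch: wavelength- and depth-resolved primary production from a spectral
      # irradiance model coupled to a saturating photosynthesis-light (P-I) curve.
      # All coefficients are illustrative placeholders, not values from the paper.
      wavelengths = range(400, 701, 10)                        # nm
      surface_irradiance = {w: 1.0 for w in wavelengths}       # relative units
      attenuation = {w: 0.04 + 0.0001 * (w - 400) for w in wavelengths}  # per metre
      pb_max, alpha_b = 5.0, 0.08                              # assimilation number, initial slope

      def production_at_depth(z):
          # Spectrally integrated usable light at depth z (Beer-Lambert per wavelength).
          light = sum(surface_irradiance[w] * math.exp(-attenuation[w] * z)
                      for w in wavelengths)
          # Saturating P-I response (no photoinhibition term in this sketch).
          return pb_max * (1.0 - math.exp(-alpha_b * light / pb_max))

      # Trapezoidal integration over the water column (0-100 m, 1 m steps).
      depths = [i * 1.0 for i in range(101)]
      column_production = sum(0.5 * (production_at_depth(z1) + production_at_depth(z2))
                              for z1, z2 in zip(depths[:-1], depths[1:]))
      print(round(column_production, 2))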

  20. Wind power systems. Applications of computational intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Lingfeng [Toledo Univ., OH (United States). Dept. of Electrical Engineering and Computer Science; Singh, Chanan [Texas A and M Univ., College Station, TX (United States). Electrical and Computer Engineering Dept.; Kusiak, Andrew (eds.) [Iowa Univ., Iowa City, IA (United States). Mechanical and Industrial Engineering Dept.

    2010-07-01

    Renewable energy sources such as wind power have attracted much attention because they are environmentally friendly, do not produce carbon dioxide and other emissions, and can enhance a nation's energy security. For example, recently more significant amounts of wind power are being integrated into conventional power grids. Therefore, it is necessary to address various important and challenging issues related to wind power systems, which are significantly different from the traditional generation systems. This book is a resource for engineers, practitioners, and decision-makers interested in studying or using the power of computational intelligence based algorithms in handling various important problems in wind power systems at the levels of power generation, transmission, and distribution. Researchers have been developing biologically-inspired algorithms in a wide variety of complex large-scale engineering domains. Distinguished from the traditional analytical methods, the new methods usually accomplish the task through their computationally efficient mechanisms. Computational intelligence methods such as evolutionary computation, neural networks, and fuzzy systems have attracted much attention in electric power systems. Meanwhile, modern electric power systems are becoming more and more complex in order to meet the growing electricity market. In particular, the grid complexity is continuously enhanced by the integration of intermittent wind power as well as the current restructuring efforts in electricity industry. Quite often, the traditional analytical methods become less efficient or even unable to handle this increased complexity. As a result, it is natural to apply computational intelligence as a powerful tool to deal with various important and pressing problems in the current wind power systems. This book presents the state-of-the-art development in the field of computational intelligence applied to wind power systems by reviewing the most up

  1. Deterministic computation of functional integrals

    International Nuclear Information System (INIS)

    Lobanov, Yu.Yu.

    1995-09-01

    A new method of numerical integration in functional spaces is described. This method is based on the rigorous definition of a functional integral in a complete separable metric space and on the use of approximation formulas which we constructed for this kind of integral. The method is applicable to the solution of some partial differential equations and to the calculation of various characteristics in quantum physics. No preliminary discretization of space and time is required in this method, nor are simplifying assumptions such as semi-classical or mean-field approximations, collective excitations, or the introduction of ''short-time'' propagators necessary in our approach. The constructed approximation formulas satisfy the condition of being exact on a given class of functionals, namely polynomial functionals of a given degree. The employment of these formulas replaces the evaluation of a functional integral by the computation of an ''ordinary'' (Riemannian) integral of low dimension, thus allowing the use of the preferable deterministic algorithms (normally Gaussian quadratures) in computations rather than the traditional stochastic (Monte Carlo) methods which are commonly used for the problem under consideration. The results of applying the method to the computation of the Green function of the Schroedinger equation in imaginary time, as well as the study of some models of Euclidean quantum mechanics, are presented. The comparison with results of other authors shows that our method gives a significant (by an order of magnitude) economy of computer time and memory versus other known methods while providing results with the same or better accuracy. The functional measure of the Gaussian type is considered and some of its particular cases, namely the conditional Wiener measure in quantum statistical mechanics and the functional measure in a Schwartz distribution space in two-dimensional quantum field theory, are studied in detail. Numerical examples demonstrating the
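
    As a small illustration of the deterministic-quadrature idea (a textbook Gauss-Hermite rule, not the approximation formulas constructed in the paper), the sketch below evaluates expectations with respect to a one-dimensional Gaussian measure without any Monte Carlo sampling.

      import numpy as np

      # Hedged illustration: a deterministic Gauss-Hermite quadrature for an integral
      # with respect to a Gaussian measure, the kind of low-dimensional quadrature that
      # replaces stochastic (Monte Carlo) sampling. This is a textbook rule, not the
      # paper's approximation formulas for functional integrals.
      def gaussian_expectation(f, n_points=20):
          # E[f(X)] for X ~ N(0, 1), using the probabilists' Hermite nodes and weights.
          nodes, weights = np.polynomial.hermite_e.hermegauss(n_points)
          return float(np.sum(weights * f(nodes)) / np.sqrt(2.0 * np.pi))

      # Examples: E[X^2] = 1 and E[cos X] = exp(-1/2) for a standard normal variable.
      print(gaussian_expectation(lambda x: x**2))        # ~1.0
      print(gaussian_expectation(lambda x: np.cos(x)))   # ~0.6065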

  2. Cone beam computed tomography in endodontic

    Energy Technology Data Exchange (ETDEWEB)

    Durack, Conor; Patel, Shanon [Unit of Endodontology, Department of Conservative Dentistry, King' s College London, London (United Kingdom)

    2012-07-01

    Cone beam computed tomography (CBCT) is a contemporary radiological imaging system designed specifically for use on the maxillofacial skeleton. The system overcomes many of the limitations of conventional radiography by producing undistorted, three-dimensional images of the area under examination. These properties make this form of imaging particularly suitable for use in endodontics. The clinician can obtain an enhanced appreciation of the anatomy being assessed, leading to an improvement in the detection of endodontic disease and resulting in more effective treatment planning. In addition, CBCT operates with a significantly lower effective radiation dose when compared with conventional computed tomography (CT). The purpose of this paper is to review the current literature relating to the limitations and potential applications of CBCT in endodontic practice. (author)

  3. Cone beam computed tomography in endodontic

    International Nuclear Information System (INIS)

    Durack, Conor; Patel, Shanon

    2012-01-01

    Cone beam computed tomography (CBCT) is a contemporary radiological imaging system designed specifically for use on the maxillofacial skeleton. The system overcomes many of the limitations of conventional radiography by producing undistorted, three-dimensional images of the area under examination. These properties make this form of imaging particularly suitable for use in endodontics. The clinician can obtain an enhanced appreciation of the anatomy being assessed, leading to an improvement in the detection of endodontic disease and resulting in more effective treatment planning. In addition, CBCT operates with a significantly lower effective radiation dose when compared with conventional computed tomography (CT). The purpose of this paper is to review the current literature relating to the limitations and potential applications of CBCT in endodontic practice. (author)

  4. Immersive visualization of dynamic CFD model results

    International Nuclear Information System (INIS)

    Comparato, J.R.; Ringel, K.L.; Heath, D.J.

    2004-01-01

    With immersive visualization the engineer has the means for vividly understanding problem causes and discovering opportunities to improve design. Software can generate an interactive world in which collaborators experience the results of complex mathematical simulations such as computational fluid dynamic (CFD) modeling. Such software, while providing unique benefits over traditional visualization techniques, presents special development challenges. The visualization of large quantities of data interactively requires both significant computational power and shrewd data management. On the computational front, commodity hardware is outperforming large workstations in graphical quality and frame rates. Also, 64-bit commodity computing shows promise in enabling interactive visualization of large datasets. Initial interactive transient visualization methods and examples are presented, as well as development trends in commodity hardware and clustering. Interactive, immersive visualization relies on relevant data being stored in active memory for fast response to user requests. For large or transient datasets, data management becomes a key issue. Techniques for dynamic data loading and data reduction are presented as means to increase visualization performance. (author)

  5. Quantum computing

    International Nuclear Information System (INIS)

    Steane, Andrew

    1998-01-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from

  6. Quantum computing

    Energy Technology Data Exchange (ETDEWEB)

    Steane, Andrew [Department of Atomic and Laser Physics, University of Oxford, Clarendon Laboratory, Oxford (United Kingdom)

    1998-02-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from

  7. Quantum walk computation

    International Nuclear Information System (INIS)

    Kendon, Viv

    2014-01-01

    Quantum versions of random walks have diverse applications that are motivating experimental implementations as well as theoretical studies. Recent results showing quantum walks are “universal for quantum computation” relate to algorithms, to be run on quantum computers. We consider whether an experimental implementation of a quantum walk could provide useful computation before we have a universal quantum computer

  8. Hardware for soft computing and soft computing for hardware

    CERN Document Server

    Nedjah, Nadia

    2014-01-01

    Single and Multi-Objective Evolutionary Computation (MOEA),  Genetic Algorithms (GAs), Artificial Neural Networks (ANNs), Fuzzy Controllers (FCs), Particle Swarm Optimization (PSO) and Ant colony Optimization (ACO) are becoming omnipresent in almost every intelligent system design. Unfortunately, the application of the majority of these techniques is complex and so requires a huge computational effort to yield useful and practical results. Therefore, dedicated hardware for evolutionary, neural and fuzzy computation is a key issue for designers. With the spread of reconfigurable hardware such as FPGAs, digital as well as analog hardware implementations of such computation become cost-effective. The idea behind this book is to offer a variety of hardware designs for soft computing techniques that can be embedded in any final product. Also, to introduce the successful application of soft computing technique to solve many hard problem encountered during the design of embedded hardware designs. Reconfigurable em...

  9. Diagnostic significance of computed tomography in gastric cancer

    International Nuclear Information System (INIS)

    Kang, Eun Young; Cha, Sang Hoon; Seol, Hae Young; Chung, Kyoo Byung; Suh, Won Hyuck

    1985-01-01

    Gastric cancer is the most common gastrointestinal malignancy in Korea. Identification and evaluation of gastric mass lesions and regional-distant metastasis by abdominal CT scan are important for treatment planning and prognostic implications in gastric cancer patients. The authors retrospectively reviewed CT scans of 61 cases of pathologically proven gastric cancer over a recent 20-month period, from July 1983 to February 1985, at the Department of Radiology, Korea University, Hae Wha Hospital. The results were as follows: 1. Of the 61 cases, there were 50 cases of advanced adenocarcinoma, 8 cases of early gastric cancer, 2 cases of leiomyosarcoma, and 1 case of lymphoma. 2. The male-to-female ratio was 2:1. Age distribution was from 24 to 75 years, with the peak incidence in the 6th decade. 3. The most frequent site of involvement was the gastric antrum (51%). 4. 48 of 50 patients with advanced gastric adenocarcinoma (96%) had a wall thickness greater than 1 cm, and all 8 cases of early gastric cancer had a wall thickness less than 1 cm. Regional lymph node tumor infiltration was found in 100% of cases with gastric wall thickness greater than 2.0 cm, in 64% of cases of 1.5 to 2.0 cm, in 50% of cases of 1.0 to 1.5 cm, and in 12.5% of cases of less than 1.0 cm. 5. Comparing regional lymph node enlargement on CT with histological tumor infiltration of regional lymph nodes, sensitivity was 52%, specificity was 87%, and reliability was 66%. 6. The structures involved by distant metastasis were the retroperitoneal lymph nodes in 15 cases, the liver in 8, and the pancreas in 3. 7. The diagnostic accuracy of CT staging was about 68% by correlation with the surgical and histological findings. 8. CT is an accurate and simple tool for evaluating the size, shape, and extent of gastric malignancies, as well as distant metastasis.

  10. Computer Anxiety, Academic Stress, and Academic Procrastination on College Students

    Directory of Open Access Journals (Sweden)

    Wahyu Rahardjo

    2013-01-01

    Full Text Available Academic procrastination is commonly found among college students. A lack of understanding of how to make the best use of computer technology may lead to anxiety about operating computers and hence cause postponement in completing course assignments that involve computer work. On the other hand, failure to achieve the academic targets expected by parents and/or the students themselves also makes students less focused and leads to a tendency to postpone the completion of many course assignments. The aim of this research is to investigate the contribution of computer anxiety and academic stress to procrastination among students. A total of 65 students majoring in psychology participated in this study. The results showed that computer anxiety and academic stress play a significant role in influencing academic procrastination among social sciences students. In terms of academic procrastination tendencies, computer anxiety, and academic stress, male students have a higher percentage than female students.

  11. Handbook of computational quantum chemistry

    CERN Document Server

    Cook, David B

    2005-01-01

    Quantum chemistry forms the basis of molecular modeling, a tool widely used to obtain important chemical information and visual images of molecular systems. Recent advances in computing have resulted in considerable developments in molecular modeling, and these developments have led to significant achievements in the design and synthesis of drugs and catalysts. This comprehensive text provides upper-level undergraduates and graduate students with an introduction to the implementation of quantum ideas in molecular modeling, exploring practical applications alongside theoretical explanations.

  12. Prevalence of computer vision syndrome in Erbil

    Directory of Open Access Journals (Sweden)

    Dler Jalal Ahmed

    2018-04-01

    Full Text Available Background and objective: Nearly all colleges, universities and homes today regularly use video display terminals, such as computers, iPads, mobile phones, and TVs. Very little research has been carried out on Kurdish users to reveal the effect of video display terminals on the eye and vision. This study aimed to evaluate the prevalence of computer vision syndrome among computer users. Methods: A hospital-based cross-sectional study was conducted in the Ophthalmology Department of Rizgary and Erbil teaching hospitals in Erbil city. Those who had used computers in the months preceding the date of this study were included. Results: Among 173 participants aged between 8 and 48 years (mean age 23.28±6.6 years), the prevalence of computer vision syndrome was found to be 89.65%. The most disturbing symptom was eye irritation (79.8%), followed by blurred vision (75.7%). Participants who used visual display terminals for more than six hours per day were at higher risk of developing nearly all symptoms of computer vision syndrome. A significant correlation was found between time spent on the computer and symptoms such as headache (P <0.001), redness (P <0.001), eye irritation (P <0.001), blurred vision (P <0.001) and neck pain (P <0.001). Conclusion: The present study demonstrates that more than three-fourths of the participants had one of the symptoms of computer vision syndrome while working on visual display terminals. Keywords: Computer vision syndrome; Headache; Neck pain; Blurred vision.

  13. Improvements in fast-response flood modeling: desktop parallel computing and domain tracking

    Energy Technology Data Exchange (ETDEWEB)

    Judi, David R [Los Alamos National Laboratory; Mcpherson, Timothy N [Los Alamos National Laboratory; Burian, Steven J [UNIV. OF UTAH

    2009-01-01

    It is becoming increasingly important to have the ability to accurately forecast flooding, as flooding accounts for the most losses due to natural disasters in the world and the United States. Flood inundation modeling has been dominated by one-dimensional approaches. These models are computationally efficient and are considered by many engineers to produce reasonably accurate water surface profiles. However, because the profiles estimated in these models must be superimposed on digital elevation data to create a two-dimensional map, the result may be sensitive to the ability of the elevation data to capture relevant features (e.g. dikes/levees, roads, walls, etc...). Moreover, one-dimensional models do not explicitly represent the complex flow processes present in floodplains and urban environments, and because two-dimensional models based on the shallow water equations have significantly greater ability to determine flow velocity and direction, the National Research Council (NRC) has recommended that two-dimensional models be used over one-dimensional models for flood inundation studies. This paper has shown that two-dimensional flood modeling computational time can be greatly reduced through the use of Java multithreading on multi-core computers, which effectively provides a means for parallel computing on a desktop computer. In addition, this paper has shown that when desktop parallel computing is coupled with a domain tracking algorithm, significant computation time can be eliminated when computations are completed only on inundated cells. The drastic reduction in computational time shown here enhances the ability of two-dimensional flood inundation models to be used as a near-real-time flood forecasting tool, an engineering design tool, or a planning tool. Perhaps even of greater significance, the reduction in computation time makes the incorporation of risk and uncertainty/ensemble forecasting more feasible for flood inundation modeling (NRC 2000; Sayers et al
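
    The sketch below illustrates the two ingredients described above in Python rather than Java: a thread pool updates only the tracked (inundated plus frontier) cells each step. The update rule is a trivial placeholder standing in for the shallow-water solver, and the grid layout is an assumption made for the example.

      from concurrent.futures import ThreadPoolExecutor

      # Hedged sketch of "domain tracking" plus desktop parallelism: only cells that are
      # currently wet (or adjacent to wet cells) are updated each step, split across a
      # thread pool. The update rule is a trivial placeholder, not a shallow-water solver.
      def step(depths, active, n_workers=4):
          def update(cell):
              r, c = cell
              # Placeholder update: blend with neighbours (stands in for the real solver).
              neigh = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
              vals = [depths.get(n, 0.0) for n in neigh]
              return cell, 0.5 * depths.get(cell, 0.0) + 0.5 * sum(vals) / len(vals)

          with ThreadPoolExecutor(max_workers=n_workers) as pool:
              results = dict(pool.map(update, active))
          depths.update(results)

          # Track the domain: keep cells that are wet, plus their dry neighbours.
          wet = {c for c, d in depths.items() if d > 1e-6}
          frontier = {(r + dr, c + dc) for (r, c) in wet
                      for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))}
          return wet | frontier

      depths = {(5, 5): 2.0}                  # a single wet cell on a sparse grid
      active = {(5, 5)}
      for _ in range(3):
          active = step(depths, active)
      print(len(active), round(sum(depths.values()), 3))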

  14. Software designs of image processing tasks with incremental refinement of computation.

    Science.gov (United States)

    Anastasia, Davide; Andreopoulos, Yiannis

    2010-08-01

    Software realizations of computationally demanding image processing tasks (e.g., image transforms and convolution) do not currently provide graceful degradation when their clock-cycle budgets are reduced, e.g., when delay deadlines are imposed in a multitasking environment to meet throughput requirements. This is an important obstacle in the quest for full utilization of modern programmable platforms' capabilities, since worst-case considerations must be in place for reasonable quality of results. In this paper, we propose (and make available online) platform-independent software designs performing bitplane-based computation combined with an incremental packing framework in order to realize block transforms, 2-D convolution and frame-by-frame block matching. The proposed framework realizes incremental computation: progressive processing of input-source increments improves the output quality monotonically. Comparisons with the equivalent non-incremental software realization of each algorithm reveal that, for the same precision of the result, the proposed approach can lead to comparable or faster execution, while it can be arbitrarily terminated and provide the result up to the computed precision. Application examples with region-of-interest based incremental computation, task scheduling per frame, and energy-distortion scalability verify that our proposal provides significant performance scalability with graceful degradation.
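
    The bitplane idea can be sketched as follows (an illustration, not the software designs released with the paper): an integer inner product is accumulated from the most significant bitplane downwards, so stopping after any number of bitplanes returns a coarser but monotonically improving approximation of the exact result.

      # Hedged sketch of bitplane-based incremental computation: an inner product is
      # accumulated from the most significant bitplane downwards, so the computation can
      # be stopped after any number of bitplanes and still return an approximate result.
      def incremental_dot(x, w, bits=8, planes_to_process=8):
          result = 0
          for b in range(bits - 1, bits - 1 - planes_to_process, -1):
              # Extract bitplane b of the input samples (0/1 per sample).
              plane = [(xi >> b) & 1 for xi in x]
              # Its contribution to the exact dot product is 2^b times the sum of the
              # weights whose corresponding input bit is set.
              result += (1 << b) * sum(wi for pi, wi in zip(plane, w) if pi)
          return result

      x = [200, 17, 96, 255]          # 8-bit input samples
      w = [1, 2, 3, 4]                # integer "filter" weights
      print(incremental_dot(x, w, planes_to_process=3))   # coarse result, top 3 bitplanes
      print(incremental_dot(x, w, planes_to_process=8))   # exact result
      print(sum(xi * wi for xi, wi in zip(x, w)))          # reference value: 1542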

  15. Value of radio density determined by enhanced computed tomography for the differential diagnosis of lung masses

    International Nuclear Information System (INIS)

    Xie, Min

    2011-01-01

    Lung masses are often difficult to differentiate when their clinical symptoms and shapes or densities on computed tomography images are similar. However, with different pathological contents, they may appear differently on plain and enhanced computed tomography. Objectives: To determine the value of enhanced computed tomography for the differential diagnosis of lung masses based on the differences in radio density with and without enhancement. Patients and Methods: Thirty-six patients with lung cancer, 36 with pulmonary tuberculosis and 10 with inflammatory lung pseudo tumors diagnosed by computed tomography and confirmed by pathology in our hospital were selected. The mean ± SD radio densities of lung masses in the three groups of patients were calculated based on the results of plain and enhanced computed tomography. Results: There were no significant differences in the radio densities of the masses detected by plain computed tomography among patients with inflammatory lung pseudo tumors, tuberculosis and lung cancer (P > 0.05). However, there were significant differences (P < 0.01) between all the groups in terms of radio densities of masses detected by enhanced computed tomography. Conclusions: The radio densities of lung masses detected by enhanced computed tomography could potentially be used to differentiate between lung cancer, pulmonary tuberculosis and inflammatory lung pseudo tumors.

  16. Computer-Based Cognitive Training for Mild Cognitive Impairment: Results from a Pilot Randomized, Controlled Trial

    OpenAIRE

    Barnes, Deborah E.; Yaffe, Kristine; Belfor, Nataliya; Jagust, William J.; DeCarli, Charles; Reed, Bruce R.; Kramer, Joel H.

    2009-01-01

    We performed a pilot randomized, controlled trial of intensive, computer-based cognitive training in 47 subjects with mild cognitive impairment (MCI). The intervention group performed exercises specifically designed to improve auditory processing speed and accuracy for 100 minutes/day, 5 days/week for 6 weeks; the control group performed more passive computer activities (reading, listening, visuospatial game) for similar amounts of time. Subjects had a mean age of 74 years and 60% were men; 7...

  17. Cystic adventitial disease of popliteal artery with significant stenosis

    International Nuclear Information System (INIS)

    Gupta, Ranjana; Mittal, Puneet; Gupta, Praveen; Jindal, Nancy

    2013-01-01

    Cystic adventitial disease of popliteal artery is a rare condition of unknown etiology which usually presents in middle-aged men. We present Doppler and computed tomography angiography findings in a case of cystic adventitial disease with significant obstruction of popliteal artery, with secondary narrowing of popliteal vein

  18. Computer aided training system development

    International Nuclear Information System (INIS)

    Midkiff, G.N.

    1987-01-01

    The first three phases of Training System Development (TSD) -- job and task analysis, curriculum design, and training material development -- are time consuming and labor intensive. The use of personal computers with a combination of commercial and custom-designed software resulted in a significant reduction in the man-hours required to complete these phases for a Health Physics Technician Training Program at a nuclear power station. This paper reports that each step in the training program project involved the use of personal computers: job survey data were compiled with a statistical package, task analysis was performed with custom software designed to interface with a commercial database management program. Job Performance Measures (tests) were generated by a custom program from data in the task analysis database, and training materials were drafted, edited, and produced using commercial word processing software

  19. Psychology of computer use: XXIV. Computer-related stress among technical college students.

    Science.gov (United States)

    Ballance, C T; Rogers, S U

    1991-10-01

    Hudiburg's Computer Technology Hassles Scale, along with a measure of global stress and a scale on attitudes toward computers, were administered to 186 students in a two-year technical college. Hudiburg's work with the hassles scale as a measure of "technostress" was affirmed. Moderate, but statistically significant, correlations among the three scales are reported. No relationship between the hassles scale and achievement as measured by GPA was detected.

  20. Computer Class Role Playing Games, an innovative teaching methodology based on STEM and ICT: first experimental results

    Science.gov (United States)

    Maraffi, S.

    2016-12-01

    Context/Purpose: We tested a new teaching and learning technology: a Computer Class Role Playing Game (RPG) that delivers educational activity in classrooms through an interactive game. The approach is new; there are some experiences with educational games, but they are mainly individual rather than class-based. Playing all together as a class, with a single goal for the whole class, enhances peer collaboration, cooperative problem solving and friendship. Methods: To carry out the research we ran the games in several classes at different grade levels, collecting dedicated questionnaires from teachers and pupils. Results: The experimental results were outstanding: overall satisfaction with the interactive RPG activity exceeded that of traditional lessons or PowerPoint-supported teaching by 50%. Interpretation: The appreciation of the RPG was in agreement with the class-level outcomes identified by the teachers after the experimentation. The experience received excellent feedback from teachers on the efficacy of this new teaching methodology and on the results achieved. Using a methodology closer to the students' point of view improves the innovation and creative capacities of learners, and it supports the new role of the teacher as the learners' “coach”. Conclusion: This paper presents the first experimental results on the application of this new technology, based on a computer game which projects onto a wall in the classroom an adventure lived by the students. The plots of the adventures are designed for deeper learning of Science, Technology, Engineering, Mathematics (STEM) and Social Sciences & Humanities (SSH). The pupils participate by interacting with the game through their own tablets or smartphones. The game is based on a mixed-reality learning environment, giving the students the feeling of being “IN the adventure”.

  1. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... When the image slices are reassembled by computer software, the result is a very detailed multidimensional view ...

  2. Unconditionally verifiable blind quantum computation

    Science.gov (United States)

    Fitzsimons, Joseph F.; Kashefi, Elham

    2017-07-01

    Blind quantum computing (BQC) allows a client to have a server carry out a quantum computation for them such that the client's input, output, and computation remain private. A desirable property for any BQC protocol is verification, whereby the client can verify with high probability whether the server has followed the instructions of the protocol or if there has been some deviation resulting in a corrupted output state. A verifiable BQC protocol can be viewed as an interactive proof system leading to consequences for complexity theory. We previously proposed [A. Broadbent, J. Fitzsimons, and E. Kashefi, in Proceedings of the 50th Annual Symposium on Foundations of Computer Science, Atlanta, 2009 (IEEE, Piscataway, 2009), p. 517] a universal and unconditionally secure BQC scheme where the client only needs to be able to prepare single qubits in separable states randomly chosen from a finite set and send them to the server, who has the balance of the required quantum computational resources. In this paper we extend that protocol with additional functionality allowing blind computational basis measurements, which we use to construct another verifiable BQC protocol based on a different class of resource states. We rigorously prove that the probability of failing to detect an incorrect output is exponentially small in a security parameter, while resource overhead remains polynomial in this parameter. This resource state allows entangling gates to be performed between arbitrary pairs of logical qubits with only constant overhead. This is a significant improvement on the original scheme, which required that all computations to be performed first be put into a nearest-neighbor form, incurring linear overhead in the number of qubits. Such an improvement has important consequences for efficiency and fault-tolerance thresholds.

  3. Recent development in computational actinide chemistry

    International Nuclear Information System (INIS)

    Li Jun

    2008-01-01

    Ever since the Manhattan project in World War II, actinide chemistry has been essential for nuclear science and technology. Yet scientists still seek the ability to interpret and predict chemical and physical properties of actinide compounds and materials using first-principle theory and computational modeling. Actinide compounds are challenging to computational chemistry because of their complicated electron correlation effects and relativistic effects, including spin-orbit coupling effects. There have been significant developments in theoretical studies on actinide compounds in the past several years. The theoretical capabilities coupled with new experimental characterization techniques now offer a powerful combination for unraveling the complexities of actinide chemistry. In this talk, we will provide an overview of our own research in this field, with particular emphasis on applications of relativistic density functional and ab initio quantum chemical methods to the geometries, electronic structures, spectroscopy and excited-state properties of small actinide molecules such as CUO and UO 2 and some large actinide compounds relevant to separation and environment science. The performance of various density functional approaches and wavefunction theory-based electron correlation methods will be compared. The results of computational modeling on the vibrational, electronic, and NMR spectra of actinide compounds will be briefly discussed as well [1-4]. We will show that progress in relativistic quantum chemistry, computer hardware and computational chemistry software has enabled computational actinide chemistry to emerge as a powerful and predictive tool for research in actinide chemistry. (authors)

  4. Building Capacity Through Hands-on Computational Internships to Assure Reproducible Results and Implementation of Digital Documentation in the ICERT REU Program

    Science.gov (United States)

    Gomez, R.; Gentle, J.

    2015-12-01

    Modern data pipelines and computational processes require that meticulous methodologies be applied in order to ensure that the source data, algorithms, and results are properly curated, managed and retained while remaining discoverable, accessible, and reproducible. Given the complexity of understanding the scientific problem domain being researched, combined with the overhead of learning to use advanced computing technologies, it becomes paramount that the next generation of scientists and researchers learn to embrace best practices. The Integrative Computational Education and Research Traineeship (ICERT) is a National Science Foundation (NSF) Research Experience for Undergraduates (REU) Site at the Texas Advanced Computing Center (TACC). During Summer 2015, two ICERT interns joined the 3DDY project. 3DDY converts geospatial datasets into file types that can take advantage of new formats, such as natural user interfaces, interactive visualization, and 3D printing. Mentored by TACC researchers for ten weeks, students with no previous background in computational science learned to use scripts to build the first prototype of the 3DDY application, and leveraged Wrangler, the newest high performance computing (HPC) resource at TACC. Test datasets for quadrangles in central Texas were used to assemble the 3DDY workflow and code. Test files were successfully converted into a stereo lithographic (STL) format, which is amenable for use with 3D printers. Test files and the scripts were documented and shared using the Figshare site while metadata was documented for the 3DDY application using OntoSoft. These efforts validated a straightforward set of workflows to transform geospatial data and established the first prototype version of 3DDY. Adding the data and software management procedures helped students realize a broader set of tangible results (e.g. Figshare entries), better document their progress and the final state of their work for the research group and community.
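
    The abstract does not show the 3DDY scripts themselves, but the geospatial-to-STL step it describes can be sketched roughly as below: a small elevation grid is triangulated and written as an ASCII STL surface. The grid values and the output file name are hypothetical.

      # Triangulate a tiny elevation grid and write it as ASCII STL, the kind of
      # geospatial-to-3D-printing conversion the 3DDY workflow performs. The grid,
      # spacing and output file name are hypothetical.
      import numpy as np

      z = np.array([[0.0, 1.0, 1.5],
                    [0.5, 2.0, 1.0],
                    [0.0, 1.0, 0.5]])   # elevation samples on a regular grid
      dx = dy = 1.0                     # grid spacing in the same units as z

      def facet(p1, p2, p3):
          # one STL facet: normal vector plus the three triangle vertices
          n = np.cross(p2 - p1, p3 - p1)
          n = n / (np.linalg.norm(n) or 1.0)
          lines = [f"  facet normal {n[0]:e} {n[1]:e} {n[2]:e}", "    outer loop"]
          lines += [f"      vertex {p[0]:e} {p[1]:e} {p[2]:e}" for p in (p1, p2, p3)]
          return "\n".join(lines + ["    endloop", "  endfacet"])

      facets = []
      rows, cols = z.shape
      for i in range(rows - 1):
          for j in range(cols - 1):
              # split each grid cell into two triangles
              p00 = np.array([j * dx, i * dy, z[i, j]])
              p10 = np.array([(j + 1) * dx, i * dy, z[i, j + 1]])
              p01 = np.array([j * dx, (i + 1) * dy, z[i + 1, j]])
              p11 = np.array([(j + 1) * dx, (i + 1) * dy, z[i + 1, j + 1]])
              facets += [facet(p00, p10, p11), facet(p00, p11, p01)]

      with open("quad_demo.stl", "w") as out:
          out.write("solid terrain\n" + "\n".join(facets) + "\nendsolid terrain\n")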

  5. Computing the scattering properties of participating media using Lorenz-Mie theory

    DEFF Research Database (Denmark)

    Frisvad, Jeppe Revall; Christensen, Niels Jørgen; Jensen, Henrik Wann

    2007-01-01

    is capable of handling both absorbing host media and non-spherical particles, which significantly extends the classes of media and materials that can be modeled. We use the theory to compute optical properties for different types of ice and ocean water, and we derive a novel appearance model for milk...... parameterized by the fat and protein contents. Our results show that we are able to match measured scattering properties in cases where the classical Lorenz-Mie theory breaks down, and we can compute properties for media that cannot be measured using existing techniques in computer graphics....

  6. The evolution of computer technology

    CERN Document Server

    Kamar, Haq

    2018-01-01

    Today it seems that computers occupy every single space in life. This book traces the evolution of computers from their humble beginnings as simple calculators up to the modern-day jack-of-all-trades devices like the iPhone. Readers will learn about how computers evolved from humongous military-issue refrigerators to the spiffy, delicate, and intriguing devices that many modern people feel they can't live without anymore. Readers will also discover the historical significance of computers, and their pivotal roles in World War II, the Space Race, and the emergence of modern Western powers.

  7. Clinical significance of computed tomography in the measurement of thyroid volume after operation for Basedow's disease

    International Nuclear Information System (INIS)

    Kasuga, Yoshio; Miyakawa, Makoto; Sugenoya, Akira

    1986-01-01

    The postoperative volume of the thyroid glands was measured using computed tomography (CT) in 16 patients with Basedow's disease. In the group which had normal postoperative thyroid function and did not need to receive T4, CT showed an increase of thyroid volume. In three of the four patients who needed to receive it, CT showed decreased thyroid volume, as compared with that immediately after operation. CT has proved to serve as a tool for measuring postoperative thyroid volume for Basedow's disease in relation to postoperative prognosis. (Namekawa, K.)

  8. Calculation of normalised organ and effective doses to adult reference computational phantoms from contemporary computed tomography scanners

    International Nuclear Information System (INIS)

    Jansen, Jan T.M.; Shrimpton, Paul C.

    2010-01-01

    The general-purpose Monte Carlo radiation transport code MCNPX has been used to simulate photon transport and energy deposition in anthropomorphic phantoms due to the x-ray exposure from the Philips iCT 256 and Siemens Definition CT scanners, together with the previously studied General Electric 9800. The MCNPX code was compiled with the Intel FORTRAN compiler and run on a Linux PC cluster. A patch has been successfully applied to reduce computing times by about 4%. The International Commission on Radiological Protection (ICRP) has recently published the Adult Male (AM) and Adult Female (AF) reference computational voxel phantoms as successors to the Medical Internal Radiation Dose (MIRD) stylised hermaphrodite mathematical phantoms that form the basis for the widely-used ImPACT CT dosimetry tool. Comparisons of normalised organ and effective doses calculated for a range of scanner operating conditions have demonstrated significant differences in results (in excess of 30%) between the voxel and mathematical phantoms as a result of variations in anatomy. These analyses illustrate the significant influence of choice of phantom on normalised organ doses and the need for standardisation to facilitate comparisons of dose. Further such dose simulations are needed in order to update the ImPACT CT Patient Dosimetry spreadsheet for contemporary CT practice. (author)

  9. On The Computational Capabilities of Physical Systems. Part 2; Relationship With Conventional Computer Science

    Science.gov (United States)

    Wolpert, David H.; Koga, Dennis (Technical Monitor)

    2000-01-01

    In the first of this pair of papers, it was proven that there cannot be a physical computer to which one can properly pose any and all computational tasks concerning the physical universe. It was then further proven that no physical computer C can correctly carry out all computational tasks that can be posed to C. As a particular example, this result means that there can be no physical computer that can, for any physical system external to that computer, take the specification of that external system's state as input and then correctly predict its future state before that future state actually occurs; one cannot build a physical computer that can be assured of correctly "processing information faster than the universe does". These results do not rely on systems that are infinite, and/or non-classical, and/or obey chaotic dynamics. They also hold even if one uses an infinitely fast, infinitely dense computer, with computational powers greater than that of a Turing Machine. This generality is a direct consequence of the fact that a novel definition of computation - "physical computation" - is needed to address the issues considered in these papers, which concern real physical computers. While this novel definition does not fit into the traditional Chomsky hierarchy, the mathematical structure and impossibility results associated with it have parallels in the mathematics of the Chomsky hierarchy. This second paper of the pair presents a preliminary exploration of some of this mathematical structure. Analogues of Chomskian results concerning universal Turing Machines and the Halting theorem are derived, as are results concerning the (im)possibility of certain kinds of error-correcting codes. In addition, an analogue of algorithmic information complexity, "prediction complexity", is elaborated. A task-independent bound is derived on how much the prediction complexity of a computational task can differ for two different reference universal physical computers used to solve that task.

  10. An analytical model for backscattered luminance in fog: comparisons with Monte Carlo computations and experimental results

    International Nuclear Information System (INIS)

    Taillade, Frédéric; Dumont, Eric; Belin, Etienne

    2008-01-01

    We propose an analytical model for backscattered luminance in fog and derive an expression for the visibility signal-to-noise ratio as a function of meteorological visibility distance. The model uses single scattering processes. It is based on the Mie theory and the geometry of the optical device (emitter and receiver). In particular, we present an overlap function and take the phase function of fog into account. The results of the backscattered luminance obtained with our analytical model are compared to simulations made using the Monte Carlo method based on multiple scattering processes. An excellent agreement is found in that the discrepancy between the results is smaller than the Monte Carlo standard uncertainties. If we take no account of the geometry of the optical device, the results of the model-estimated backscattered luminance differ from the simulations by a factor 20. We also conclude that the signal-to-noise ratio computed with the Monte Carlo method and our analytical model is in good agreement with experimental results since the mean difference between the calculations and experimental measurements is smaller than the experimental uncertainty

  11. Velocity-Autocorrelation Function in Liquids, Deduced from Neutron Incoherent Scattering Results

    DEFF Research Database (Denmark)

    Carneiro, Kim

    1976-01-01

    The Fourier transform p(ω) of the velocity-autocorrelation function is derived from neutron incoherent scattering results, obtained from the two liquids Ar and H2. The quality and significance of the results are discussed with special emphasis on the long-time t^(-3/2) tail, found in computer simula...

  12. Model of Procedure Usage – Results from a Qualitative Study to Inform Design of Computer-Based Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Johanna H Oxstrand; Katya L Le Blanc

    2012-07-01

    The nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. As a step toward the goal of improving procedure use performance, researchers, together with the nuclear industry, have been looking at replacing the current paper-based procedures with computer-based procedure systems. The concept of computer-based procedures is not new by any means; however, most research has focused on procedures used in the main control room. Procedures reviewed in these efforts are mainly emergency operating procedures and normal operating procedures. Based on lessons learned from these previous efforts, we are now exploring a less familiar application of computer-based procedures - field procedures, i.e. procedures used by nuclear equipment operators and maintenance technicians. The Idaho National Laboratory, the Institute for Energy Technology, and participants from the U.S. commercial nuclear industry are collaborating in an applied research effort with the objective of developing requirements and specifications for a computer-based procedure system to be used by field operators. The goal is to identify the types of human errors that can be mitigated by using computer-based procedures and how to best design the computer-based procedures to do this. The underlying philosophy in the research effort is “Stop – Start – Continue”, i.e. what features from the use of paper-based procedures should we not incorporate (Stop), what should we keep (Continue), and what new features or work processes should be added (Start). One step in identifying the Stop – Start – Continue was to conduct a baseline study where affordances related to the current usage of paper-based procedures were identified. The purpose of the study was to develop a model of paper-based procedure use which will help to identify desirable features for computer-based procedure prototypes. Affordances such as note taking, markups

  13. 3D ultrasound computer tomography: Hardware setup, reconstruction methods and first clinical results

    Science.gov (United States)

    Gemmeke, Hartmut; Hopp, Torsten; Zapf, Michael; Kaiser, Clemens; Ruiter, Nicole V.

    2017-11-01

    A promising candidate for improved imaging of breast cancer is ultrasound computer tomography (USCT). Current experimental USCT systems are still focused in elevation dimension resulting in a large slice thickness, limited depth of field, loss of out-of-plane reflections, and a large number of movement steps to acquire a stack of images. 3D USCT emitting and receiving spherical wave fronts overcomes these limitations. We built an optimized 3D USCT, realizing for the first time the full benefits of a 3D system. The point spread function could be shown to be nearly isotropic in 3D, to have very low spatial variability and fit the predicted values. The contrast of the phantom images is very satisfactory in spite of imaging with a sparse aperture. The resolution and imaged details of the reflectivity reconstruction are comparable to a 3 T MRI volume. Important for the obtained resolution are the simultaneously obtained results of the transmission tomography. The KIT 3D USCT was then tested in a pilot study on ten patients. The primary goals of the pilot study were to test the USCT device, the data acquisition protocols, the image reconstruction methods and the image fusion techniques in a clinical environment. The study was conducted successfully; the data acquisition could be carried out for all patients with an average imaging time of six minutes per breast. The reconstructions provide promising images. Overlaid volumes of the modalities show qualitative and quantitative information at a glance. This paper gives a summary of the involved techniques, methods, and first results.

  14. Noise tolerant spatiotemporal chaos computing.

    Science.gov (United States)

    Kia, Behnam; Kia, Sarvenaz; Lindner, John F; Sinha, Sudeshna; Ditto, William L

    2014-12-01

    We introduce and design a noise tolerant chaos computing system based on a coupled map lattice (CML) and the noise reduction capabilities inherent in coupled dynamical systems. The resulting spatiotemporal chaos computing system is more robust to noise than a single map chaos computing system. In this CML-based approach to computing, under the coupled dynamics, the local noise from different nodes of the lattice diffuses across the lattice, and the noise contributions attenuate one another, resulting in a system with less noise content and a more robust chaos computing architecture.
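
    A minimal sketch of the kind of lattice involved, assuming diffusively coupled logistic maps with independent per-node noise (illustrative parameters, not the authors' exact construction):

      # Diffusively coupled logistic maps with independent additive noise at every
      # node. The parameters are illustrative, not the authors' exact construction.
      import numpy as np

      def logistic(x, r=4.0):
          return r * x * (1.0 - x)

      def cml_step(x, eps, noise_std, rng):
          fx = logistic(x)
          # nearest-neighbour diffusive coupling on a ring (periodic boundaries)
          coupled = (1.0 - eps) * fx + 0.5 * eps * (np.roll(fx, 1) + np.roll(fx, -1))
          # each node receives its own noise; states are clipped back into [0, 1]
          return np.clip(coupled + rng.normal(0.0, noise_std, x.size), 0.0, 1.0)

      # run an uncoupled and a coupled lattice from the same initial condition and
      # print the node-to-node spread, so the effect of coupling on the independent
      # per-node noise can be compared
      for eps in (0.0, 0.4):
          rng = np.random.default_rng(1)
          x = np.full(64, 0.3)
          for _ in range(50):
              x = cml_step(x, eps, noise_std=0.01, rng=rng)
          print(f"eps = {eps}: spread across the lattice = {x.std():.4f}")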

  15. Efficient computation of global sensitivity indices using sparse polynomial chaos expansions

    International Nuclear Information System (INIS)

    Blatman, Geraud; Sudret, Bruno

    2010-01-01

    Global sensitivity analysis aims at quantifying the relative importance of uncertain input variables on the response of a mathematical model of a physical system. ANOVA-based indices such as the Sobol' indices are well-known in this context. These indices are usually computed by direct Monte Carlo or quasi-Monte Carlo simulation, which may prove hardly applicable to computationally demanding industrial models. In the present paper, sparse polynomial chaos (PC) expansions are introduced in order to compute sensitivity indices. An adaptive algorithm allows the analyst to build up a PC-based metamodel that only contains the significant terms whereas the PC coefficients are computed by least-squares regression using a computer experimental design. The accuracy of the metamodel is assessed by leave-one-out cross validation. Due to the genuine orthogonality properties of the PC basis, ANOVA-based sensitivity indices are post-processed analytically. This paper also develops a bootstrap technique which eventually yields confidence intervals on the results. The approach is illustrated on various application examples up to 21 stochastic dimensions. Accurate results are obtained at a computational cost 2-3 orders of magnitude smaller than that associated with Monte Carlo simulation.
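
    The post-processing step described above (Sobol' indices read directly off the PC coefficients) can be sketched as follows. This is a plain least-squares PC expansion on a fixed total-degree Legendre basis for a toy two-variable model, not the adaptive sparse algorithm of the paper; all model and sample-size choices are illustrative.

      # Least-squares polynomial chaos expansion on a fixed total-degree Legendre
      # basis for a toy model with two independent U(-1, 1) inputs, with first-order
      # Sobol' indices post-processed from the coefficients. Illustrative only;
      # this is not the paper's adaptive sparse algorithm.
      import itertools
      import numpy as np
      from scipy.special import eval_legendre

      rng = np.random.default_rng(0)

      def model(x):
          # toy response, low polynomial degree so the expansion captures it exactly
          return x[:, 0] + 0.5 * x[:, 1] ** 2 + x[:, 0] * x[:, 1]

      dim, degree, n_samples = 2, 3, 200
      x = rng.uniform(-1.0, 1.0, size=(n_samples, dim))    # computer experimental design
      y = model(x)

      # total-degree multi-index set and the corresponding Legendre design matrix
      alphas = [a for a in itertools.product(range(degree + 1), repeat=dim) if sum(a) <= degree]
      psi = np.column_stack([
          np.prod([eval_legendre(a_i, x[:, i]) for i, a_i in enumerate(a)], axis=0)
          for a in alphas])
      coef, *_ = np.linalg.lstsq(psi, y, rcond=None)        # least-squares regression

      # E[P_n(U)^2] = 1/(2n+1) for U ~ U(-1, 1): variance contribution of each term
      norms = np.array([np.prod([1.0 / (2 * a_i + 1) for a_i in a]) for a in alphas])
      contrib = coef ** 2 * norms
      total_var = sum(c for c, a in zip(contrib, alphas) if sum(a) > 0)

      for i in range(dim):
          # first-order index: keep only the terms in which variable i alone appears
          s_i = sum(c for c, a in zip(contrib, alphas) if a[i] > 0 and sum(a) == a[i])
          print(f"S_{i + 1} ≈ {s_i / total_var:.3f}")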

  16. Efficient computation of global sensitivity indices using sparse polynomial chaos expansions

    Energy Technology Data Exchange (ETDEWEB)

    Blatman, Geraud, E-mail: geraud.blatman@edf.f [Clermont Universite, IFMA, EA 3867, Laboratoire de Mecanique et Ingenieries, BP 10448, F-63000 Clermont-Ferrand (France); EDF, R and D Division - Site des Renardieres, F-77818 Moret-sur-Loing (France); Sudret, Bruno, E-mail: sudret@phimeca.co [Clermont Universite, IFMA, EA 3867, Laboratoire de Mecanique et Ingenieries, BP 10448, F-63000 Clermont-Ferrand (France); Phimeca Engineering, Centre d' Affaires du Zenith, 34 rue de Sarlieve, F-63800 Cournon d' Auvergne (France)

    2010-11-15

    Global sensitivity analysis aims at quantifying the relative importance of uncertain input variables on the response of a mathematical model of a physical system. ANOVA-based indices such as the Sobol' indices are well-known in this context. These indices are usually computed by direct Monte Carlo or quasi-Monte Carlo simulation, which may prove hardly applicable to computationally demanding industrial models. In the present paper, sparse polynomial chaos (PC) expansions are introduced in order to compute sensitivity indices. An adaptive algorithm allows the analyst to build up a PC-based metamodel that only contains the significant terms whereas the PC coefficients are computed by least-squares regression using a computer experimental design. The accuracy of the metamodel is assessed by leave-one-out cross validation. Due to the genuine orthogonality properties of the PC basis, ANOVA-based sensitivity indices are post-processed analytically. This paper also develops a bootstrap technique which eventually yields confidence intervals on the results. The approach is illustrated on various application examples up to 21 stochastic dimensions. Accurate results are obtained at a computational cost 2-3 orders of magnitude smaller than that associated with Monte Carlo simulation.

  17. Report of the evaluation by the Ad Hoc Review Committee on Computational Science and Engineering. Result evaluation in fiscal year 2000

    International Nuclear Information System (INIS)

    2001-06-01

    The Research Evaluation Committee, which consisted of 14 members from outside of the Japan Atomic Energy Research Institute (JAERI), set up an Ad Hoc Review Committee on Computational Science and Engineering in accordance with the 'Fundamental Guideline for the Evaluation of Research and Development (R and D) at JAERI' and its subsidiary regulations in order to evaluate the R and D accomplishments achieved for five years from Fiscal Year 1995 to Fiscal Year 1999 at Center for Promotion of Computational Science and Engineering of JAERI. The Ad Hoc Review Committee consisted of seven specialists from outside of JAERI. The Ad Hoc Review Committee conducted its activities from December 2000 to March 2001. The evaluation was performed on the basis of the materials submitted in advance and of the oral presentations made at the Ad Hoc Review Committee meeting which was held on December 27, 2000, in line with the items, viewpoints, and criteria for the evaluation specified by the Research Evaluation Committee. The result of the evaluation by the Ad Hoc Review Committee was submitted to the Research Evaluation Committee, and was judged to be appropriate at its meeting held on March 16, 2001. This report describes the result of the evaluation by the Ad Hoc Review Committee on Computational Science and Engineering. (author)

  18. Computer code MLCOSP for multiple-correlation and spectrum analysis with a hybrid computer

    International Nuclear Information System (INIS)

    Oguma, Ritsuo; Fujii, Yoshio; Usui, Hozumi; Watanabe, Koichi

    1975-10-01

    Usage of the computer code MLCOSP (Multiple Correlation and Spectrum), developed for the hybrid computer installed in JAERI, is described. Functions of the hybrid computer and its terminal devices are utilized ingeniously in the code to reduce the complexity of the data handling that occurs in the analysis of multivariable experimental data and to keep the analysis in perspective. Features of the code are as follows: experimental data can be fed to the digital computer through the analog part of the hybrid computer by connecting a data recorder; the computed results are displayed in figures, and hardcopies are taken when necessary; series messages to the code are shown on the terminal, so man-machine communication is possible; and data can also be entered through a keyboard, so case studies based on the results of analysis are possible. (auth.)

  19. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  20. Comparison of tests of accommodation for computer users.

    Science.gov (United States)

    Kolker, David; Hutchinson, Robert; Nilsen, Erik

    2002-04-01

    With the increased use of computers in the workplace and at home, optometrists are finding more patients presenting with symptoms of Computer Vision Syndrome. Among these symptomatic individuals, research supports that accommodative disorders are the most common vision finding. A prepresbyopic group (N= 30) and a presbyopic group (N = 30) were selected from a private practice. Assignment to a group was determined by age, accommodative amplitude, and near visual acuity with their distance prescription. Each subject was given a thorough vision and ocular health examination, then administered several nearpoint tests of accommodation at a computer working distance. All the tests produced similar results in the presbyopic group. For the prepresbyopic group, the tests yielded very different results. To effectively treat symptomatic VDT users, optometrists must assess the accommodative system along with the binocular and refractive status. For presbyopic patients, all nearpoint tests studied will yield virtually the same result. However, the method of testing accommodation, as well as the test stimulus presented, will yield significantly different responses for prepresbyopic patients. Previous research indicates that a majority of patients prefer the higher plus prescription yielded by the Gaussian image test.

  1. Computer-enhanced interventions for drug use and HIV risk in the emergency room: preliminary results on psychological precursors of behavior change.

    Science.gov (United States)

    Bonar, Erin E; Walton, Maureen A; Cunningham, Rebecca M; Chermack, Stephen T; Bohnert, Amy S B; Barry, Kristen L; Booth, Brenda M; Blow, Frederic C

    2014-01-01

    This article describes process data from a randomized controlled trial among 781 adults recruited in the emergency department who reported recent drug use and were randomized to: intervener-delivered brief intervention (IBI) assisted by computer, computerized BI (CBI), or enhanced usual care (EUC). Analyses examined differences between baseline and post-intervention on psychological constructs theoretically related to changes in drug use and HIV risk: importance, readiness, intention, help-seeking, and confidence. Compared to EUC, participants receiving the IBI significantly increased in confidence and intentions; CBI patients increased importance, readiness, confidence, and help-seeking. Both groups increased relative to the EUC in likelihood of condom use with regular partners. Examining BI components suggested that benefits of change and tools for change were associated with changes in psychological constructs. Delivering BIs targeting drug use and HIV risk using computers appears promising for implementation in healthcare settings. This trial is ongoing and future work will report behavioral outcomes. © 2013.

  2. Computer system architecture for laboratory automation

    International Nuclear Information System (INIS)

    Penney, B.K.

    1978-01-01

    This paper describes the various approaches that may be taken to provide computing resources for laboratory automation. Three distinct approaches are identified, the single dedicated small computer, shared use of a larger computer, and a distributed approach in which resources are provided by a number of computers, linked together, and working in some cooperative way. The significance of the microprocessor in laboratory automation is discussed, and it is shown that it is not simply a cheap replacement of the minicomputer. (Auth.)

  3. Development of a Computational Steering Framework for High Performance Computing Environments on Blue Gene/P Systems

    KAUST Repository

    Danani, Bob K.

    2012-07-01

    Computational steering has revolutionized the traditional workflow in high performance computing (HPC) applications. The standard workflow that consists of preparation of an application’s input, running of a simulation, and visualization of simulation results in a post-processing step is now transformed into a real-time interactive workflow that significantly reduces development and testing time. Computational steering provides the capability to direct or re-direct the progress of a simulation application at run-time. It allows modification of application-defined control parameters at run-time using various user-steering applications. In this project, we propose a computational steering framework for HPC environments that provides an innovative solution and easy-to-use platform, which allows users to connect and interact with running application(s) in real-time. This framework uses RealityGrid as the underlying steering library and adds several enhancements to the library to enable steering support for Blue Gene systems. Included in the scope of this project is the development of a scalable and efficient steering relay server that supports many-to-many connectivity between multiple steered applications and multiple steering clients. Steered applications can range from intermediate simulation and physical modeling applications to complex computational fluid dynamics (CFD) applications or advanced visualization applications. The Blue Gene supercomputer presents special challenges for remote access because the compute nodes reside on private networks. This thesis presents an implemented solution and demonstrates it on representative applications. Thorough implementation details and application enablement steps are also presented in this thesis to encourage direct usage of this framework.

  4. Short-distance expansion for the electromagnetic half-space Green's tensor: general results and an application to radiative lifetime computations

    International Nuclear Information System (INIS)

    Panasyuk, George Y; Schotland, John C; Markel, Vadim A

    2009-01-01

    We obtain a short-distance expansion for the half-space, frequency domain electromagnetic Green's tensor. The small parameter of the theory is ωε1L/c, where ω is the frequency, ε1 is the permittivity of the upper half-space, in which both the source and the point of observation are located, and which is assumed to be transparent, c is the speed of light in vacuum and L is a characteristic length, defined as the distance from the point of observation to the reflected (with respect to the planar interface) position of the source. In the case when the lower half-space (the substrate) is characterized by a complex permittivity ε2, we compute the expansion to third order. For the case when the substrate is a transparent dielectric, we compute the imaginary part of the Green's tensor to seventh order. The analytical calculations are verified numerically. The practical utility of the obtained expansion is demonstrated by computing the radiative lifetime of two electromagnetically interacting molecules in the vicinity of a transparent dielectric substrate. The computation is performed in the strong interaction regime when the quasi-particle pole approximation is inapplicable. In this regime, the integral representation for the half-space Green's tensor is difficult to use while its electrostatic limiting expression is grossly inadequate. However, the analytical expansion derived in this paper can be used directly and efficiently. The results of this study are also relevant to nano-optics and near-field imaging, especially when tomographic image reconstruction is involved.

  5. The relationship between computer games and quality of life in adolescents

    Science.gov (United States)

    Dolatabadi, Nayereh Kasiri; Eslami, Ahmad Ali; Mostafavi, Firooze; Hassanzade, Akbar; Moradi, Azam

    2013-01-01

    Background: Playing computer games is becoming increasingly common among teenagers. This popular phenomenon can cause physical and psychosocial issues in them. Therefore, this study examined the relationship between computer games and quality of life domains in adolescents aged 12-15 years. Materials and Methods: In a cross-sectional study using the 2-stage stratified cluster sampling method, 444 male and female students in Borkhar were selected. The data collection tool consisted of 1) the World Health Organization Quality Of Life – BREF questionnaire and 2) a personal information questionnaire. The data were analyzed by Pearson correlation, Spearman correlation, chi-square, independent t-tests and analysis of covariance. Findings: The total mean score of quality of life in students was 67.11±13.34. The results showed a significant relationship between the age of starting to play games and the overall quality of life score and its four domains (range r=–0.13 to –0.18). The mean overall quality of life score in computer game users was 68.27±13.03, while it was 64.81±13.69 among those who did not play computer games, and the difference was significant (P=0.01). There were also significant differences between the two groups who did and did not play computer games in the environmental and mental health domains. Conclusion: Playing computer games for a short time under parental supervision can have positive effects on quality of life in adolescents. However, spending long hours playing computer games may have negative long-term effects. PMID:24083270

  6. Lymphography and computed tomography in the assessment of lymphnode invasion by bladder cancer. Comparison of diagnostic value

    International Nuclear Information System (INIS)

    Leguay, O.; Bellin, M.F.; Richard, F.; Mallet, A.; Grellet, J.

    1989-01-01

    The diagnostic value of lymphography and computed tomography (CT) in the assessment of lymph node invasion by bladder cancers was compared on the basis of 30 observations. Although computed tomography apparently yields better results (reliability: 93%) than lymphography (reliability: 87%), the difference is not statistically significant. A review of the literature shows that statistical analysis of such results has seldom been carried out. The combination of both exploration techniques seems to improve predictive values, but this improvement was not statistically significant in the present study. [fr]

  7. Training directionally selective motion pathways can significantly improve reading efficiency

    Science.gov (United States)

    Lawton, Teri

    2004-06-01

    This study examined whether perceptual learning at early levels of visual processing would facilitate learning at higher levels of processing. This was examined by determining whether training the motion pathways by practicing left-right movement discrimination, as found previously, would improve the reading skills of inefficient readers significantly more than another computer game, a word discrimination game, or the reading program offered by the school. This controlled validation study found that practicing left-right movement discrimination 5-10 minutes twice a week (rapidly) for 15 weeks doubled reading fluency, and significantly improved all reading skills by more than one grade level, whereas inefficient readers in the control groups barely improved on these reading skills. In contrast to previous studies of perceptual learning, these experiments show that perceptual learning of direction discrimination significantly improved reading skills determined at higher levels of cognitive processing, thereby being generalized to a new task. The deficits in reading performance and attentional focus experienced by the person who struggles when reading are suggested to result from an information overload caused by timing deficits in the direction-selectivity network proposed by Russell De Valois et al. (2000), deficits that go away following practice on direction discrimination. This study found that practicing direction discrimination rapidly transitions the inefficient 7-year-old reader to an efficient reader.

  8. Optical computing.

    Science.gov (United States)

    Stroke, G. W.

    1972-01-01

    Applications of the optical computer include an approach for increasing the sharpness of images obtained from the most powerful electron microscopes and fingerprint/credit card identification. The information-handling capability of the various optical computing processes is very great. Modern synthetic-aperture radars scan upward of 100,000 resolvable elements per second. Fields which have assumed major importance on the basis of optical computing principles are optical image deblurring, coherent side-looking synthetic-aperture radar, and correlative pattern recognition. Some examples of the most dramatic image deblurring results are shown.

  9. Development of computer code SIMPSEX for simulation of FBR fuel reprocessing flowsheets: II. additional benchmarking results

    International Nuclear Information System (INIS)

    Shekhar Kumar; Koganti, S.B.

    2003-07-01

    Benchmarking and application of the computer code SIMPSEX for high plutonium FBR flowsheets were reported recently in an earlier report (IGC-234). Improvements and recompilation of the code (Version 4.01, March 2003) required re-validation with the existing benchmarks as well as additional benchmark flowsheets. Improvements in the high Pu region (aqueous Pu > 30 g/L) resulted in better results in the 75% Pu flowsheet benchmark. Below an aqueous Pu concentration of 30 g/L, results were identical to those from the earlier version (SIMPSEX Version 3, code compiled in 1999). In addition, 13 published flowsheets were taken as additional benchmarks. Eleven of these flowsheets have a wide range of feed concentrations, and a few of them are β-γ active runs with FBR fuels having a wide distribution of burnup and Pu ratios. A published total partitioning flowsheet using externally generated U(IV) was also simulated using SIMPSEX. SIMPSEX predictions were compared with listed predictions from conventional SEPHIS, PUMA, PUNE and PUBG. SIMPSEX results were found to be comparable to or better than the results from the above-listed codes. In addition, recently reported UREX demo results along with AMUSE simulations are also compared with SIMPSEX predictions. Results of benchmarking SIMPSEX with these 14 benchmark flowsheets are discussed in this report. (author)

  10. Physics vs. computer science

    International Nuclear Information System (INIS)

    Pike, R.

    1982-01-01

    With computers becoming more frequently used in theoretical and experimental physics, physicists can no longer afford to be ignorant of the basic techniques and results of computer science. Computing principles belong in a physicist's tool box, along with experimental methods and applied mathematics, and the easiest way to educate physicists in computing is to provide, as part of the undergraduate curriculum, a computing course designed specifically for physicists. As well, the working physicist should interact with computer scientists, giving them challenging problems in return for their expertise. (orig.)

  11. CBT for depression: a pilot RCT comparing mobile phone vs. computer

    Directory of Open Access Journals (Sweden)

    Watts Sarah

    2013-02-01

    Full Text Available Abstract Background This paper reports the results of a pilot randomized controlled trial comparing the delivery modality (mobile phone/tablet or fixed computer) of a cognitive behavioural therapy intervention for the treatment of depression. The aim was to establish whether a previously validated computerized program (The Sadness Program) remained efficacious when delivered via a mobile application. Method 35 participants were recruited with Major Depression (80% female) and randomly allocated to access the program using a mobile app (on either a mobile phone or iPad) or a computer. Participants completed 6 lessons, weekly homework assignments, and received weekly email contact from a clinical psychologist or psychiatrist until completion of lesson 2. After lesson 2 email contact was only provided in response to participant request, or in response to a deterioration in psychological distress scores. The primary outcome measure was the Patient Health Questionnaire 9 (PHQ-9). Of the 35 participants recruited, 68.6% completed 6 lessons and 65.7% completed the 3-months follow up. Attrition was handled using mixed-model repeated-measures ANOVA. Results Both the Mobile and Computer Groups were associated with statistically significant benefits in the PHQ-9 at post-test. At 3 months follow up, the reduction seen for both groups remained significant. Conclusions These results provide evidence to indicate that delivering a CBT program using a mobile application can result in clinically significant improvements in outcomes for patients with depression. Trial registration Australian New Zealand Clinical Trials Registry ACTRN 12611001257954

  12. Evaluation of Musculoskeletal Disorders among computer Users in Isfahan

    Directory of Open Access Journals (Sweden)

    Ayoub Ghanbary

    2015-08-01

    Full Text Available Along with the widespread use of computers, work-related musculoskeletal disorders (MSDs) have become the most prevalent ergonomic problem among computer users. Evaluating musculoskeletal disorders among computer users makes it possible to carry out interventions to reduce them. The aim of the present study was to assess musculoskeletal disorders among computer users at Isfahan University with the Rapid Office Strain Assessment (ROSA) method and the Nordic questionnaire. This cross-sectional study was conducted on 96 computer users at Isfahan University. The data were analyzed using correlation, linear regression, descriptive statistics and ANOVA tests in SPSS 20. The data collection tools were the Nordic questionnaire and the Rapid Office Strain Assessment checklist. The results of the Nordic questionnaire showed that musculoskeletal disorders in computer users were more prevalent in the shoulder (62.1%), neck (54.9%) and back (53.1%) than in other parts of the body. Based on the ROSA risk level, 19 individuals were in the low-risk area, 50 in the notification area and 27 in the hazard area requiring ergonomic intervention. The prevalence of musculoskeletal disorders was higher in women than in men. The ANOVA test also showed a direct and significant correlation of age and work experience with the final ROSA score (p<0.001). The study results showed that the prevalence of MSDs among computer users at Isfahan University is rather high, and ergonomic interventions such as computer workstation redesign, educating users about ergonomic principles for computer work, reducing working hours at the computer, and keeping the elbows close to the body at an angle between 90 and 120 degrees should be carried out to reduce musculoskeletal disorders.

  13. CBT for depression: a pilot RCT comparing mobile phone vs. computer.

    Science.gov (United States)

    Watts, Sarah; Mackenzie, Anna; Thomas, Cherian; Griskaitis, Al; Mewton, Louise; Williams, Alishia; Andrews, Gavin

    2013-02-07

    This paper reports the results of a pilot randomized controlled trial comparing the delivery modality (mobile phone/tablet or fixed computer) of a cognitive behavioural therapy intervention for the treatment of depression. The aim was to establish whether a previously validated computerized program (The Sadness Program) remained efficacious when delivered via a mobile application. 35 participants were recruited with Major Depression (80% female) and randomly allocated to access the program using a mobile app (on either a mobile phone or iPad) or a computer. Participants completed 6 lessons, weekly homework assignments, and received weekly email contact from a clinical psychologist or psychiatrist until completion of lesson 2. After lesson 2 email contact was only provided in response to participant request, or in response to a deterioration in psychological distress scores. The primary outcome measure was the Patient Health Questionnaire 9 (PHQ-9). Of the 35 participants recruited, 68.6% completed 6 lessons and 65.7% completed the 3-months follow up. Attrition was handled using mixed-model repeated-measures ANOVA. Both the Mobile and Computer Groups were associated with statistically significantly benefits in the PHQ-9 at post-test. At 3 months follow up, the reduction seen for both groups remained significant. These results provide evidence to indicate that delivering a CBT program using a mobile application, can result in clinically significant improvements in outcomes for patients with depression. Australian New Zealand Clinical Trials Registry ACTRN 12611001257954.

  14. Computational intelligence for decision support in cyber-physical systems

    CERN Document Server

    Ali, A; Riaz, Zahid

    2014-01-01

    This book is dedicated to applied computational intelligence and soft computing techniques with special reference to decision support in Cyber Physical Systems (CPS), where the physical as well as the communication segment of the networked entities interact with each other. The joint dynamics of such systems result in a complex combination of computers, software, networks and physical processes all combined to establish a process flow at system level. This volume provides the audience with an in-depth vision about how to ensure dependability, safety, security and efficiency in real time by making use of computational intelligence in various CPS applications ranging from the nano-world to large scale wide area systems of systems. Key application areas include healthcare, transportation, energy, process control and robotics where intelligent decision support has key significance in establishing dynamic, ever-changing and high confidence future technologies. A recommended text for graduate students and researche...

  15. Computational neural network regression model for Host based Intrusion Detection System

    Directory of Open Access Journals (Sweden)

    Sunil Kumar Gautam

    2016-09-01

    Full Text Available Gathering and storing information in a secure system is a challenging task in the face of increasing cyber-attacks. Computational neural network techniques designed for intrusion detection systems exist which provide security to a single machine and to the machines of an entire network. In this paper, we have used two types of computational neural network models, namely, the Generalized Regression Neural Network (GRNN) model and the Multilayer Perceptron Neural Network (MPNN) model, for a Host-based Intrusion Detection System using log files that are generated by a single personal computer. The simulation results show the correctly classified percentage of the normal and abnormal (intrusion) classes using a confusion matrix. On the basis of the results and discussion, we found that the Host-based Intrusion Systems Model (HISM) significantly improved the detection accuracy while retaining a minimum false alarm rate.
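
    A rough sketch in the spirit of the MPNN model above, assuming scikit-learn and synthetic stand-in features (the paper's actual log-file feature set is not given in the abstract, and scikit-learn provides no GRNN, so only the multilayer perceptron half is illustrated):

      # Multilayer perceptron classifying host log records as normal vs. intrusion.
      # The four features are synthetic stand-ins for log-derived attributes; the
      # paper's actual feature set is not given in the abstract.
      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.neural_network import MLPClassifier
      from sklearn.metrics import confusion_matrix

      rng = np.random.default_rng(0)
      # hypothetical per-record features: event rate, failed logins, bytes written, distinct processes
      normal    = rng.normal([10, 0.2, 5e4, 12], [3, 0.5, 2e4, 4], size=(1000, 4))
      intrusion = rng.normal([40, 3.0, 2e5, 30], [10, 1.5, 8e4, 8], size=(250, 4))
      X = np.vstack([normal, intrusion])
      y = np.r_[np.zeros(len(normal)), np.ones(len(intrusion))]

      X_train, X_test, y_train, y_test = train_test_split(
          X, y, test_size=0.3, random_state=0, stratify=y)
      clf = make_pipeline(StandardScaler(),
                          MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0))
      clf.fit(X_train, y_train)

      # rows are the true normal/intrusion classes, columns the predictions; the
      # diagonal holds the correctly classified counts, as in a confusion-matrix report
      print(confusion_matrix(y_test, clf.predict(X_test)))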

  16. Using Amazon's Elastic Compute Cloud to dynamically scale CMS computational resources

    International Nuclear Information System (INIS)

    Evans, D; Fisk, I; Holzman, B; Pordes, R; Tiradani, A; Melo, A; Sheldon, P; Metson, S

    2011-01-01

    Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost-structure and transient nature of EC2 services make them inappropriate for some CMS production services and functions. We also found that the resources are not truly 'on-demand' as limits and caps on usage are imposed. Our trial workflows allow us to make a cost comparison between EC2 resources and dedicated CMS resources at a University, and conclude that it is most cost effective to purchase dedicated resources for the 'base-line' needs of experiments such as CMS. However, if the ability to use cloud computing resources is built into an experiment's software framework before demand requires their use, cloud computing resources make sense for bursting during times when spikes in usage are required.

  17. Iterative concurrent reconstruction algorithms for emission computed tomography

    International Nuclear Information System (INIS)

    Brown, J.K.; Hasegawa, B.H.; Lang, T.F.

    1994-01-01

    Direct reconstruction techniques, such as those based on filtered backprojection, are typically used for emission computed tomography (ECT), even though it has been argued that iterative reconstruction methods may produce better clinical images. The major disadvantage of iterative reconstruction algorithms, and a significant reason for their lack of clinical acceptance, is their computational burden. We outline a new class of ''concurrent'' iterative reconstruction techniques for ECT in which the reconstruction process is reorganized such that a significant fraction of the computational processing occurs concurrently with the acquisition of ECT projection data. These new algorithms use the 10-30 min required for acquisition of a typical SPECT scan to iteratively process the available projection data, significantly reducing the requirements for post-acquisition processing. These algorithms are tested on SPECT projection data from a Hoffman brain phantom acquired with 2 x 10^5 counts in 64 views, each having 64 projections. The SPECT images are reconstructed as 64 x 64 tomograms, starting with six angular views. Other angular views are added to the reconstruction process sequentially, in a manner that reflects their availability for a typical acquisition protocol. The results suggest that if T_s of concurrent processing is used, the reconstruction processing time required after completion of the data acquisition can be reduced by at least (1/3)T_s. (Author)
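
    The concurrent idea can be illustrated generically (this is not the authors' specific algorithm): an MLEM-style update that iterates on whichever projection views have already been acquired and folds in the remaining views as they arrive. Geometry, counts and sizes below are toy stand-ins.

      # MLEM-style reconstruction that begins iterating on the projection views
      # already acquired and folds in later views as they become available. The
      # geometry, counts and sizes are toy stand-ins, far smaller than a real scan.
      import numpy as np

      rng = np.random.default_rng(0)
      n_pix, n_views, bins = 16, 8, 12
      A = rng.random((n_views, bins, n_pix))             # toy per-view system matrices
      x_true = rng.random(n_pix)
      y = np.array([rng.poisson(Av @ x_true * 50.0) for Av in A])   # noisy projections, view by view

      x = np.ones(n_pix)                                 # MLEM starting estimate
      acquired = 2                                       # views available when iteration begins
      for it in range(20):
          k = min(acquired + it, n_views)                # more views become usable as acquisition proceeds
          Asub = A[:k].reshape(-1, n_pix)
          ysub = y[:k].reshape(-1)
          proj = np.maximum(Asub @ x, 1e-12)
          # multiplicative MLEM update restricted to the views acquired so far
          x *= (Asub.T @ (ysub / proj)) / np.maximum(Asub.sum(axis=0), 1e-12)

      # compare the shapes of the normalised estimate and the normalised true image
      print("max abs error (normalised):", np.abs(x / x.sum() - x_true / x_true.sum()).max())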

  18. Task-induced frequency modulation features for brain-computer interfacing

    Science.gov (United States)

    Jayaram, Vinay; Hohmann, Matthias; Just, Jennifer; Schölkopf, Bernhard; Grosse-Wentrup, Moritz

    2017-10-01

    Objective. Task-induced amplitude modulation of neural oscillations is routinely used in brain-computer interfaces (BCIs) for decoding subjects’ intents, and underlies some of the most robust and common methods in the field, such as common spatial patterns and Riemannian geometry. While there has been some interest in phase-related features for classification, both techniques usually presuppose that the frequencies of neural oscillations remain stable across various tasks. We investigate here whether features based on task-induced modulation of the frequency of neural oscillations enable decoding of subjects’ intents with an accuracy comparable to task-induced amplitude modulation. Approach. We compare cross-validated classification accuracies using the amplitude and frequency modulated features, as well as a joint feature space, across subjects in various paradigms and pre-processing conditions. We show results with a motor imagery task, a cognitive task, and also preliminary results in patients with amyotrophic lateral sclerosis (ALS), as well as using common spatial patterns and Laplacian filtering. Main results. The frequency features alone do not significantly out-perform traditional amplitude modulation features, and in some cases perform significantly worse. However, across both tasks and pre-processing in healthy subjects the joint space significantly out-performs either the frequency or amplitude features alone. The only exception is the ALS patients, for whom the dataset is of insufficient size to draw any statistically significant conclusions. Significance. Task-induced frequency modulation is robust and straightforward to compute, and increases performance when added to standard amplitude modulation features across paradigms. This allows more information to be extracted from the EEG signal cheaply and can be used throughout the field of BCIs.
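
    A hedged sketch of the two feature families for one channel: band power in a fixed band (amplitude modulation) and the peak frequency within that band (frequency modulation), concatenated into a joint feature vector. Synthetic signals stand in for recorded EEG; this is not the authors' exact pipeline.

      # Per-epoch band power (amplitude-modulation feature) and in-band peak
      # frequency (frequency-modulation feature) from Welch spectra, stacked into a
      # joint feature vector. Synthetic signals stand in for recorded EEG.
      import numpy as np
      from scipy.signal import welch

      fs, n_epochs, n_samples = 250, 20, 500          # 2-second epochs at 250 Hz
      rng = np.random.default_rng(0)
      t = np.arange(n_samples) / fs

      def features(epoch, band=(8.0, 13.0)):
          f, pxx = welch(epoch, fs=fs, nperseg=256)
          in_band = (f >= band[0]) & (f <= band[1])
          band_power = np.trapz(pxx[in_band], f[in_band])     # amplitude feature
          peak_freq = f[in_band][np.argmax(pxx[in_band])]     # frequency feature
          return band_power, peak_freq

      X = []
      for k in range(n_epochs):
          alpha = 10.0 + 0.5 * (k % 2)                # toy task-dependent frequency shift
          epoch = np.sin(2 * np.pi * alpha * t) + 0.5 * rng.standard_normal(n_samples)
          X.append(features(epoch))

      X = np.asarray(X)                               # joint space: [band power, peak frequency]
      print(X[:4])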

  19. Reduction of energy consumption peaks in a greenhouse by computer control

    Energy Technology Data Exchange (ETDEWEB)

    Amsen, M.G.; Froesig Nielsen, O.; Jacobsen, L.H. (Danish Research Service for Plant and Soil Science, Research Centre for Horticulture, Department of Horticultural Engineering, Aarslev (DK))

    1990-01-01

    In this paper, the results of using a computer for environmental control in one greenhouse are compared with those of using modified analogue control equipment in another. Energy consumption peaks can be almost entirely prevented by properly applying computer control and an appropriate control strategy. Both treatments were based upon negative DIF, i.e. low day and high night minimum set points (14 deg. C/22 deg. C) for room temperature. No difference in production time or quality was observed in six different pot plant species; only Kalanchoe showed a significant increase in fresh weight and dry weight. By applying computer control, the lack of flexibility of analogue control can be avoided and more accurate room temperature control can be obtained. (author).
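    A highly simplified sketch of this kind of strategy (negative-DIF minimum set points plus a cap on how fast heating power may rise, so that consumption peaks are avoided) is given below. The 14 deg. C/22 deg. C set points follow the abstract; the proportional controller, plant model, time step and power limits are invented for illustration only.

        def heating_power(hour, room_temp, power_prev,
                          day_set=14.0, night_set=22.0,
                          max_power=100.0, max_ramp=10.0):
            """Heating power (arbitrary units) for one hourly time step.

            Negative-DIF strategy: minimum set point 14 deg C by day, 22 deg C by night.
            The allowed increase per step (max_ramp) limits energy consumption peaks."""
            set_point = day_set if 6 <= hour < 18 else night_set
            error = set_point - room_temp
            wanted = max(0.0, min(max_power, 20.0 * error))   # simple proportional control
            return min(wanted, power_prev + max_ramp)         # ramp limit prevents peaks

        # toy simulation of one day with a crude room model
        room, power = 12.0, 0.0
        for hour in range(24):
            power = heating_power(hour, room, power)
            room += 0.05 * power - 0.5                        # heat gain minus losses (invented)
            print(f"{hour:02d}:00  power={power:5.1f}  room temperature={room:4.1f} C")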

  20. An Introduction to Parallel Cluster Computing Using PVM for Computer Modeling and Simulation of Engineering Problems

    International Nuclear Information System (INIS)

    Spencer, VN

    2001-01-01

    An investigation has been conducted regarding the ability of clustered personal computers to improve the performance of executing software simulations for solving engineering problems. The power and utility of personal computers continue to grow exponentially through advances in computing capabilities such as newer microprocessors, advances in microchip technologies, electronic packaging, and cost-effective gigabyte-size hard drive capacity. Many engineering problems require significant computing power. Traditionally, such computations have had to be done by high-performance computer systems that cost millions of dollars and need gigabytes of memory to complete the task. Alternatively, it is feasible to provide adequate computing in the form of clustered personal computers. This method cuts cost and size by linking (clustering) personal computers together across a network. Clusters also have the advantage that they can be used as stand-alone computers when they are not operating as a parallel computer. Parallel computing software to exploit clusters is available for computer operating systems like Unix, Windows NT, or Linux. This project concentrates on the use of Windows NT and the Parallel Virtual Machine (PVM) system to solve an engineering dynamics problem written in Fortran.

  1. Computational bone remodelling simulations and comparisons with DEXA results.

    Science.gov (United States)

    Turner, A W L; Gillies, R M; Sekel, R; Morris, P; Bruce, W; Walsh, W R

    2005-07-01

    Femoral periprosthetic bone loss following total hip replacement is often associated with stress shielding. Extensive bone resorption may lead to implant or bone failure and complicate revision surgery. In this study, an existing strain-adaptive bone remodelling theory was modified and combined with anatomic three-dimensional finite element models to predict alterations in periprosthetic apparent density. The theory incorporated an equivalent strain stimulus and joint and muscle forces from 45% of the gait cycle. Remodelling was simulated for three femoral components with different design philosophies: cobalt-chrome alloy, two-thirds proximally coated; titanium alloy, one-third proximally coated; and a composite of cobalt-chrome surrounded by polyaryletherketone, fully coated. Theoretical bone density changes correlated significantly with clinical densitometry measurements (DEXA) after 2 years across the Gruen zones (R2>0.67, p<0.02), with average differences of less than 5.4%. The results suggest that a large proportion of adaptive bone remodelling changes seen clinically with these implants may be explained by a consistent theory incorporating a purely mechanical stimulus. This theory could be applied to pre-clinical testing of new implants, investigation of design modifications, and patient-specific implant selection.
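    The general form of strain-adaptive remodelling rules of this kind can be sketched as follows. This is a generic textbook-style update with a "lazy zone", not the modified theory used in the study; the constants, reference stimulus and loading values are illustrative assumptions.

        import numpy as np

        def remodel(density, stimulus, s_ref=0.0025, lazy=0.35, rate=50.0, dt=1.0,
                    rho_min=0.05, rho_max=1.8):
            """One remodelling step: the density change is driven by the error between
            the local mechanical stimulus (e.g. strain energy density per unit density)
            and a reference value, with a dead ("lazy") zone around the reference."""
            low, high = (1 - lazy) * s_ref, (1 + lazy) * s_ref
            drho = np.where(stimulus > high, rate * (stimulus - high),
                   np.where(stimulus < low, rate * (stimulus - low), 0.0))
            return np.clip(density + dt * drho, rho_min, rho_max)

        # toy example: a stress-shielded region, an overloaded region and one in the lazy zone
        density = np.array([1.0, 1.0, 1.0])            # apparent density, g/cm^3
        stimulus = np.array([0.0010, 0.0040, 0.0026])
        for step in range(5):
            density = remodel(density, stimulus)
        print(density)   # the shielded region loses density, the overloaded one gains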

  2. Computer-aided detection (CAD) of solid pulmonary nodules in chest x-ray equivalent ultralow dose chest CT - first in-vivo results at dose levels of 0.13 mSv

    Energy Technology Data Exchange (ETDEWEB)

    Messerli, Michael, E-mail: Michael.Messerli@usz.ch [Division of Radiology and Nuclear Medicine, Cantonal Hospital St. Gallen (Switzerland); Kluckert, Thomas; Knitel, Meinhard [Division of Radiology and Nuclear Medicine, Cantonal Hospital St. Gallen (Switzerland); Rengier, Fabian [Department of Diagnostic and Interventional Radiology, University Hospital Heidelberg (Germany); Warschkow, René [Department of Surgery, Cantonal Hospital St. Gallen (Switzerland); Alkadhi, Hatem [Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, University Zurich (Switzerland); Leschka, Sebastian [Division of Radiology and Nuclear Medicine, Cantonal Hospital St. Gallen (Switzerland); Institute of Diagnostic and Interventional Radiology, University Hospital Zurich, University Zurich (Switzerland); Wildermuth, Simon; Bauer, Ralf W. [Division of Radiology and Nuclear Medicine, Cantonal Hospital St. Gallen (Switzerland)

    2016-12-15

    Highlights: • Computer-aided detection (CAD) of solid pulmonary nodules was compared in 202 patients in standard dose and ultralow dose CT. • The per–nodule sensitivity of CAD was 70% in standard dose CT and 68% in ultralow dose CT. • The per–nodule sensitivity of CAD in standard dose CT was similar to ultralow dose CT in all size subgroups (all p > 0.05). • Adding CAD markings in ultralow dose CT significantly improved the sensitivity of two radiologists from 77% to 88% and from 66% to 79%, respectively. • CAD can serve as an excellent second reader for nodule detection in CT even at dose levels similar to chest X-ray. - Abstract: Objectives: To determine the value of computer-aided detection (CAD) for solid pulmonary nodules in ultralow radiation dose single-energy computed tomography (CT) of the chest using third-generation dual-source CT at 100 kV and fixed tube current at 70 mAs with tin filtration. Methods: 202 consecutive patients undergoing clinically indicated standard dose chest CT (1.8 ± 0.7 mSv) were prospectively included and scanned with an additional ultralow dose CT (0.13 ± 0.01 mSv) in the same session. Standard of reference (SOR) was established by consensus reading of standard dose CT by two radiologists. CAD was performed in standard dose and ultralow dose CT with two different reconstruction kernels. CAD detection rate of nodules was evaluated including subgroups of different nodule sizes (<5, 5–7, >7 mm). Sensitivity was further analysed in multivariable mixed effects logistic regression. Results: The SOR included 279 solid nodules (mean diameter 4.3 ± 3.4 mm, range 1–24 mm). There was no significant difference in per–nodule sensitivity of CAD in standard dose with 70% compared to 68% in ultralow dose CT both overall and in different size subgroups (all p > 0.05). CAD led to a significant increase of sensitivity for both radiologists reading the ultralow dose CT scans (all p < 0.001). In multivariable analysis, the use

  3. Computer-aided detection (CAD) of solid pulmonary nodules in chest x-ray equivalent ultralow dose chest CT - first in-vivo results at dose levels of 0.13 mSv

    International Nuclear Information System (INIS)

    Messerli, Michael; Kluckert, Thomas; Knitel, Meinhard; Rengier, Fabian; Warschkow, René; Alkadhi, Hatem; Leschka, Sebastian; Wildermuth, Simon; Bauer, Ralf W.

    2016-01-01

    Highlights: • Computer-aided detection (CAD) of solid pulmonary nodules was compared in 202 patients in standard dose and ultralow dose CT. • The per–nodule sensitivity of CAD was 70% in standard dose CT and 68% in ultralow dose CT. • The per–nodule sensitivity of CAD in standard dose CT was similar to ultralow dose CT in all size subgroups (all p > 0.05). • Adding CAD markings in ultralow dose CT significantly improved the sensitivity of two radiologists from 77% to 88% and from 66% to 79%, respectively. • CAD can serve as an excellent second reader for nodule detection in CT even at dose levels similar to chest X-ray. - Abstract: Objectives: To determine the value of computer-aided detection (CAD) for solid pulmonary nodules in ultralow radiation dose single-energy computed tomography (CT) of the chest using third-generation dual-source CT at 100 kV and fixed tube current at 70 mAs with tin filtration. Methods: 202 consecutive patients undergoing clinically indicated standard dose chest CT (1.8 ± 0.7 mSv) were prospectively included and scanned with an additional ultralow dose CT (0.13 ± 0.01 mSv) in the same session. Standard of reference (SOR) was established by consensus reading of standard dose CT by two radiologists. CAD was performed in standard dose and ultralow dose CT with two different reconstruction kernels. CAD detection rate of nodules was evaluated including subgroups of different nodule sizes (<5, 5–7, >7 mm). Sensitivity was further analysed in multivariable mixed effects logistic regression. Results: The SOR included 279 solid nodules (mean diameter 4.3 ± 3.4 mm, range 1–24 mm). There was no significant difference in per–nodule sensitivity of CAD in standard dose with 70% compared to 68% in ultralow dose CT both overall and in different size subgroups (all p > 0.05). CAD led to a significant increase of sensitivity for both radiologists reading the ultralow dose CT scans (all p < 0.001). In multivariable analysis, the use

  4. Parallel algorithms and cluster computing

    CERN Document Server

    Hoffmann, Karl Heinz

    2007-01-01

    This book presents major advances in high performance computing as well as major advances due to high performance computing. It contains a collection of papers in which results achieved in the collaboration of scientists from computer science, mathematics, physics, and mechanical engineering are presented. From the science problems to the mathematical algorithms and on to the effective implementation of these algorithms on massively parallel and cluster computers we present state-of-the-art methods and technology as well as exemplary results in these fields. This book shows that problems which seem superficially distinct become intimately connected on a computational level.

  5. Computers and clinical arrhythmias.

    Science.gov (United States)

    Knoebel, S B; Lovelace, D E

    1983-02-01

    Cardiac arrhythmias are ubiquitous in normal and abnormal hearts. These disorders may be life-threatening or benign, symptomatic or unrecognized. Arrhythmias may be the precursor of sudden death, a cause or effect of cardiac failure, a clinical reflection of acute or chronic disorders, or a manifestation of extracardiac conditions. Progress is being made toward unraveling the diagnostic and therapeutic problems involved in arrhythmogenesis. Many of the advances would not be possible, however, without the availability of computer technology. To preserve the proper balance and purposeful progression of computer usage, engineers and physicians have been exhorted not to work independently in this field. Both should learn some of the other's trade. The two disciplines need to come together to solve important problems with computers in cardiology. The intent of this article was to acquaint the practicing cardiologist with some of the extant and envisioned computer applications and some of the problems with both. We conclude that computer-based database management systems are necessary for sorting out the clinical factors of relevance for arrhythmogenesis, but computer database management systems are beset with problems that will require sophisticated solutions. The technology for detecting arrhythmias on routine electrocardiograms is quite good but human over-reading is still required, and the rationale for computer application in this setting is questionable. Systems for qualitative, continuous monitoring and review of extended time ECG recordings are adequate with proper noise rejection algorithms and editing capabilities. The systems are limited presently for clinical application to the recognition of ectopic rhythms and significant pauses. Attention should now be turned to the clinical goals for detection and quantification of arrhythmias. We should be asking the following questions: How quantitative do systems need to be? Are computers required for the detection of

  6. [Effects of long-term isolation and anticipation of significant event on sleep: results of the project "Mars-520"].

    Science.gov (United States)

    Zavalko, I M; Rasskazova, E I; Gordeev, S A; Palatov, S Iu; Kovrov, G V

    2013-01-01

    The purpose of the research was to study the effect of long-term isolation on night sleep. The data were collected during an international ground simulation of an interplanetary manned flight--"Mars-500". Polysomnographic recordings of six healthy men were performed before, four times during, and after the 520-day confinement. During the isolation, sleep efficiency and delta-latency decreased, while sleep latency increased. Post-hoc analysis demonstrated significant differences between the background measurement and the last measurement during isolation (1.5 months before the end of the experiment). The frequency of nights with low sleep efficiency rose on the eve of events important for the crew (the simulation of the Mars landing and the end of the confinement). Two weeks after the landing simulation, the number of nights with low sleep efficiency decreased significantly. Therefore, anticipation of a significant event under conditions of long-term isolation might result in sleep worsening in previously healthy men, predominantly as difficulties getting to sleep.

  7. Pattern recognition, neural networks, genetic algorithms and high performance computing in nuclear reactor diagnostics. Results and perspectives

    International Nuclear Information System (INIS)

    Dzwinel, W.; Pepyolyshev, N.

    1996-01-01

    The main goal of this paper is the presentation of our experience in the development of the diagnostic system for the IBR-2 (Russia - Dubna) nuclear reactor. The authors show the principal results of the system modifications made to let it work more reliably and much faster. The former required the adaptation of new techniques of data processing; the latter, the implementation of the newest computational facilities. The results of the application of clustering techniques and of a method for visualization of multi-dimensional information directly on the operator display are presented. The experiences with neural nets, used for prediction of the reactor operation, are discussed. Genetic algorithms were also tested to reduce the quantity of data and to extract the most informative components of the analyzed spectra. (authors)

  8. Cloud Computing with iPlant Atmosphere.

    Science.gov (United States)

    McKay, Sheldon J; Skidmore, Edwin J; LaRose, Christopher J; Mercer, Andre W; Noutsos, Christos

    2013-10-15

    Cloud Computing refers to distributed computing platforms that use virtualization software to provide easy access to physical computing infrastructure and data storage, typically administered through a Web interface. Cloud-based computing provides access to powerful servers, with specific software and virtual hardware configurations, while eliminating the initial capital cost of expensive computers and reducing the ongoing operating costs of system administration, maintenance contracts, power consumption, and cooling. This eliminates a significant barrier to entry into bioinformatics and high-performance computing for many researchers. This is especially true of free or modestly priced cloud computing services. The iPlant Collaborative offers a free cloud computing service, Atmosphere, which allows users to easily create and use instances on virtual servers preconfigured for their analytical needs. Atmosphere is a self-service, on-demand platform for scientific computing. This unit demonstrates how to set up, access and use cloud computing in Atmosphere. Copyright © 2013 John Wiley & Sons, Inc.

  9. Computer Vision Syndrome.

    Science.gov (United States)

    Randolph, Susan A

    2017-07-01

    With the increased use of electronic devices with visual displays, computer vision syndrome is becoming a major public health issue. Improving the visual status of workers using computers results in greater productivity in the workplace and improved visual comfort.

  10. "What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"

    Science.gov (United States)

    Ozturk, Elif

    2012-01-01

    The present paper reviews two motivations for conducting "what if" analyses using Excel and "R" in order to understand statistical significance tests in the context of sample size. "What if" analyses can be used to teach students what statistical significance tests really do and in applied research either prospectively to estimate what sample size…
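    A minimal Python analogue of the Excel/R "what if" analysis described above (not the author's spreadsheet) holds the observed means and standard deviations fixed and asks how the p value of an independent-samples t test would change if the sample size had been different.

        from scipy import stats

        def what_if_p(mean1, mean2, sd, n):
            """p value of an independent-samples t test if each group had n cases
            with the given (fixed) means and a common standard deviation."""
            result = stats.ttest_ind_from_stats(mean1, sd, n, mean2, sd, n)
            return result.pvalue

        # the same 0.3-SD mean difference becomes "significant" once n is large enough
        for n in (10, 30, 50, 100, 200):
            print(n, round(what_if_p(100.0, 103.0, 10.0, n), 4))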

  11. Innovative Phase Change Approach for Significant Energy Savings

    Science.gov (United States)

    2016-09-01

    FINAL REPORT, ESTCP Project EW-201138, September 2016, Dr. Aly H. Shaaban. The project concerns technology related to the production, use, transmission, storage, control, or conservation of energy intended to reduce the need for additional energy supplies; the conditions set for operation specified that a computer with a broadband wireless card be used for data collection and transmission.

  12. Meningitis tuberculosa: Clinical findings and results of cranial computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Trautmann, M.; Loddenkemper, R.; Hoffmann, H.G.

    1982-10-01

    Guided by nine of our own observations between 1977 and 1981, new diagnostic facilities in tuberculous meningitis are discussed. For differentiation from viral meningitis, measurement of the CSF lactic acid concentration in addition to that of CSF glucose has proved to be of value in recent years. In accordance with the literature, two cases of this series which were examined for CSF lactic acid concentration showed markedly elevated levels of 8.4 and 10.4 mmol/l, respectively. In contrast to this, in viral meningitis values of less than 3.5 mmol/l are usually found. Additionally, the presence of hypochloremia and hyponatremia, which could be demonstrated in 6 of our 9 patients, may raise the suspicion of tuberculous etiology. In the series presented, cranial computed tomography was of greatest diagnostic value, enabling the diagnosis of hydrocephalus internus in 5, and basal arachnoiditis in 2 cases.

  13. The Potential of the Cell Processor for Scientific Computing

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Samuel; Shalf, John; Oliker, Leonid; Husbands, Parry; Kamil, Shoaib; Yelick, Katherine

    2005-10-14

    The slowing pace of commodity microprocessor performance improvements combined with ever-increasing chip power demands has become of utmost concern to computational scientists. As a result, the high performance computing community is examining alternative architectures that address the limitations of modern cache-based designs. In this work, we examine the potential of the forthcoming STI Cell processor as a building block for future high-end computing systems. Our work contains several novel contributions. We are the first to present quantitative Cell performance data on scientific kernels and show direct comparisons against leading superscalar (AMD Opteron), VLIW (Intel Itanium2), and vector (Cray X1) architectures. Since neither Cell hardware nor cycle-accurate simulators are currently publicly available, we develop both analytical models and simulators to predict kernel performance. Our work also explores the complexity of mapping several important scientific algorithms onto the Cell's unique architecture. Additionally, we propose modest microarchitectural modifications that could significantly increase the efficiency of double-precision calculations. Overall results demonstrate the tremendous potential of the Cell architecture for scientific computations in terms of both raw performance and power efficiency.

  14. Feasibility of an automatic computer-assisted algorithm for the detection of significant coronary artery disease in patients presenting with acute chest pain

    International Nuclear Information System (INIS)

    Kang, Ki-Woon; Chang, Hyuk-Jae; Shim, Hackjoon; Kim, Young-Jin; Choi, Byoung-Wook; Yang, Woo-In; Shim, Jee-Young; Ha, Jongwon; Chung, Namsik

    2012-01-01

    Automatic computer-assisted detection (auto-CAD) of significant coronary artery disease (CAD) in coronary computed tomography angiography (cCTA) has been shown to have relatively high accuracy. However, to date, scarce data are available regarding the performance of auto-CAD in the setting of acute chest pain. This study sought to demonstrate the feasibility of an auto-CAD algorithm for cCTA in patients presenting with acute chest pain. We retrospectively investigated 398 consecutive patients (229 male, mean age 50 ± 21 years) who had acute chest pain and underwent cCTA between Apr 2007 and Jan 2011 in the emergency department (ED). All cCTA data were analyzed using an auto-CAD algorithm for the detection of >50% CAD on cCTA. The accuracy of auto-CAD was compared with the formal radiology report. In 380 of 398 patients (18 were excluded due to failure of data processing), per-patient analysis of auto-CAD revealed the following: sensitivity 94%, specificity 63%, positive predictive value (PPV) 76%, and negative predictive value (NPV) 89%. After the exclusion of 37 cases that were interpreted as invalid by the auto-CAD algorithm, the NPV was further increased up to 97%, considering the false-negative cases in the formal radiology report, and was confirmed by subsequent invasive angiogram during the index visit. We successfully demonstrated the high accuracy of an auto-CAD algorithm, compared with the formal radiology report, for the detection of >50% CAD on cCTA in the setting of acute chest pain. The auto-CAD algorithm can be used to facilitate the decision-making process in the ED.
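    The per-patient accuracy figures quoted above follow directly from a 2 x 2 table of the index test against the reference standard; the small sketch below (with made-up counts, not the study's data) shows how sensitivity, specificity, PPV and NPV are obtained.

        def diagnostic_metrics(tp, fp, fn, tn):
            """Standard accuracy measures of an index test (here: auto-CAD detection of
            >50% stenosis) against a reference standard, from the 2 x 2 counts."""
            return {
                "sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp),
                "ppv": tp / (tp + fp),
                "npv": tn / (tn + fn),
            }

        # illustrative counts only (not the published data)
        print(diagnostic_metrics(tp=120, fp=38, fn=8, tn=214))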

  15. Computational Methods in Stochastic Dynamics Volume 2

    CERN Document Server

    Stefanou, George; Papadopoulos, Vissarion

    2013-01-01

    The considerable influence of inherent uncertainties on structural behavior has led the engineering community to recognize the importance of a stochastic approach to structural problems. Issues related to uncertainty quantification and its influence on the reliability of the computational models are continuously gaining in significance. In particular, the problems of dynamic response analysis and reliability assessment of structures with uncertain system and excitation parameters have been the subject of continuous research over the last two decades as a result of the increasing availability of powerful computing resources and technology.   This book is a follow up of a previous book with the same subject (ISBN 978-90-481-9986-0) and focuses on advanced computational methods and software tools which can highly assist in tackling complex problems in stochastic dynamic/seismic analysis and design of structures. The selected chapters are authored by some of the most active scholars in their respective areas and...

  16. Integration of computer-aided diagnosis/detection (CAD) results in a PACS environment using CAD-PACS toolkit and DICOM SR

    International Nuclear Information System (INIS)

    Le, Anh H.T.; Liu, Brent; Huang, H.K.

    2009-01-01

    Picture Archiving and Communication System (PACS) is a mature technology in health care delivery for daily clinical imaging service and data management. Computer-aided detection and diagnosis (CAD) utilizes computer methods to obtain quantitative measurements from medical images and clinical information to assist clinicians to assess a patient's clinical state more objectively. CAD needs image input and related information from PACS to improve its accuracy; and PACS benefits from CAD results online and available at the PACS workstation as a second reader to assist physicians in the decision making process. Currently, these two technologies remain as two separate independent systems with only minimal system integration. This paper describes a universal method to integrate CAD results with PACS in its daily clinical environment. The method is based on Health Level 7 (HL7) and Digital imaging and communications in medicine (DICOM) standards, and Integrating the Healthcare Enterprise (IHE) workflow profiles. In addition, the integration method is Health Insurance Portability and Accountability Act (HIPAA) compliant. The paper presents (1) the clinical value and advantages of integrating CAD results in a PACS environment, (2) DICOM Structured Reporting formats and some important IHE workflow profiles utilized in the system integration, (3) the methodology using the CAD-PACS integration toolkit, and (4) clinical examples with step-by-step workflows of this integration. (orig.)

  17. Comparability of Computer-based and Paper-based Versions of Writing Section of PET in Iranian EFL Context

    Directory of Open Access Journals (Sweden)

    Mohammad Mohammadi

    2010-11-01

    Full Text Available Computer technology has provided language testing experts with the opportunity to develop computerized versions of traditional paper-based language tests. New generations of TOEFL and Cambridge IELTS, BULATS, KET, PET are good examples of computer-based language tests. Since this new method of testing introduces new factors into the realm of language assessment (e.g. modes of test delivery, familiarity with computers, etc.), the question may be whether the two modes of computer- and paper-based tests comparably measure the same construct, and hence, whether the scores obtained from the two modes can be used interchangeably. Accordingly, the present study aimed to investigate the comparability of the paper- and computer-based versions of a writing test. The data for this study were collected by administering the writing section of a Cambridge Preliminary English Test (PET) to eighty Iranian intermediate EFL learners through the two modes of computer- and paper-based testing. Besides, a computer familiarity questionnaire was used to divide participants into two groups with high and low computer familiarity. The results of the independent samples t-test revealed that there was no statistically significant difference between the learners' computer- and paper-based writing scores. The results of the paired samples t-test showed no statistically significant difference between high- and low-computer-familiar groups on computer-based writing. The researchers concluded that the two modes comparably measured the same construct.
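    A comparability check of this kind can be reproduced in a few lines; the sketch below uses SciPy rather than the authors' tools, with invented score vectors, and runs a paired comparison of the two delivery modes for the same learners together with the cross-mode correlation that is typically reported alongside it.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        paper_scores = rng.normal(14.0, 2.0, 80).round(1)                  # invented writing scores
        computer_scores = (paper_scores + rng.normal(0.0, 1.5, 80)).round(1)

        # do the two delivery modes produce comparable scores for the same learners?
        t_res = stats.ttest_rel(paper_scores, computer_scores)
        r, _ = stats.pearsonr(paper_scores, computer_scores)

        print(f"paired t = {t_res.statistic:.2f}, p = {t_res.pvalue:.3f}")
        print(f"cross-mode correlation r = {r:.2f}")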

  18. Determining the haemodynamic significance of arterial stenosis: the relationship between CT angiography, computational fluid dynamics, and non-invasive fractional flow reserve

    International Nuclear Information System (INIS)

    Pang, C.L.; Alcock, R.; Pilkington, N.; Reis, T.; Roobottom, C.

    2016-01-01

    Coronary artery disease causes significant morbidity and mortality worldwide. Invasive coronary angiography (ICA) is currently the reference standard investigation. Fractional flow reserve (FFR) complements traditional ICA by providing extra information on blood flow, which has convincingly led to better patient management and improved cost-effectiveness. Computed tomography coronary angiography (CTCA) is suitable for the investigation of chest pain, especially in the low- and intermediate-risk groups. FFR generated using CT data (producing FFR_CT) may improve the positive predictive value of CTCA. The basic science of FFR_CT is like a "black box" to most imaging professionals. A fundamental principle is that good quality CTCA is likely to make any post-processing easier and more reliable. Both diagnostic and observational studies have suggested that the accuracy and the short-term outcome of using FFR_CT are both comparable with FFR in ICA. More multidisciplinary research with further refined diagnostic and longer-term observational studies will hopefully pinpoint the role of FFR_CT in existing clinical pathways.

  19. Tools for computational finance

    CERN Document Server

    Seydel, Rüdiger U

    2017-01-01

    Computational and numerical methods are used in a number of ways across the field of finance. It is the aim of this book to explain how such methods work in financial engineering. By concentrating on the field of option pricing, a core task of financial engineering and risk analysis, this book explores a wide range of computational tools in a coherent and focused manner and will be of use to anyone working in computational finance. Starting with an introductory chapter that presents the financial and stochastic background, the book goes on to detail computational methods using both stochastic and deterministic approaches. Now in its sixth edition, Tools for Computational Finance has been significantly revised and contains:    Several new parts such as a section on extended applications of tree methods, including multidimensional trees, trinomial trees, and the handling of dividends; Additional material in the field of generating normal variates with acceptance-rejection methods, and on Monte Carlo methods...
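    As a flavour of the tree methods mentioned in this description (a generic Cox-Ross-Rubinstein sketch, not code taken from the book), a European call can be priced on a recombining binomial tree in a few lines; the parameter values are arbitrary.

        import math

        def crr_european_call(s0, k, r, sigma, t, steps):
            """Cox-Ross-Rubinstein binomial tree price of a European call option."""
            dt = t / steps
            u = math.exp(sigma * math.sqrt(dt))        # up factor
            d = 1.0 / u                                # down factor
            q = (math.exp(r * dt) - d) / (u - d)       # risk-neutral up probability
            disc = math.exp(-r * dt)

            # option values at maturity, node j = number of up moves
            values = [max(s0 * u**j * d**(steps - j) - k, 0.0) for j in range(steps + 1)]
            # backward induction through the tree
            for _ in range(steps):
                values = [disc * (q * values[j + 1] + (1 - q) * values[j])
                          for j in range(len(values) - 1)]
            return values[0]

        # converges towards the Black-Scholes value (about 10.45 for these inputs)
        print(crr_european_call(s0=100.0, k=100.0, r=0.05, sigma=0.2, t=1.0, steps=200))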

  20. Computational creativity

    Directory of Open Access Journals (Sweden)

    López de Mántaras Badia, Ramon

    2013-12-01

    Full Text Available New technologies, and in particular artificial intelligence, are drastically changing the nature of creative processes. Computers are playing very significant roles in creative activities such as music, architecture, fine arts, and science. Indeed, the computer is already a canvas, a brush, a musical instrument, and so on. However, we believe that we must aim at more ambitious relations between computers and creativity. Rather than just seeing the computer as a tool to help human creators, we could see it as a creative entity in its own right. This view has triggered a new subfield of Artificial Intelligence called Computational Creativity. This article addresses the question of the possibility of achieving computational creativity through some examples of computer programs capable of replicating some aspects of creative behavior in the fields of music and science.

  1. NNSA's Computing Strategy, Acquisition Plan, and Basis for Computing Time Allocation

    Energy Technology Data Exchange (ETDEWEB)

    Nikkel, D J

    2009-07-21

    This report is in response to the Omnibus Appropriations Act, 2009 (H.R. 1105; Public Law 111-8) in its funding of the National Nuclear Security Administration's (NNSA) Advanced Simulation and Computing (ASC) Program. This bill called for a report on ASC's plans for computing and platform acquisition strategy in support of stockpile stewardship. Computer simulation is essential to the stewardship of the nation's nuclear stockpile. Annual certification of the country's stockpile systems, Significant Finding Investigations (SFIs), and execution of Life Extension Programs (LEPs) are dependent on simulations employing the advanced ASC tools developed over the past decade plus; indeed, without these tools, certification would not be possible without a return to nuclear testing. ASC is an integrated program involving investments in computer hardware (platforms and computing centers), software environments, integrated design codes and physical models for these codes, and validation methodologies. The significant progress ASC has made in the past derives from its focus on mission and from its strategy of balancing support across the key investment areas necessary for success. All these investment areas must be sustained for ASC to adequately support current stockpile stewardship mission needs and to meet ever more difficult challenges as the weapons continue to age or undergo refurbishment. The appropriations bill called for this report to address three specific issues, which are responded to briefly here but are expanded upon in the subsequent document: (1) Identify how computing capability at each of the labs will specifically contribute to stockpile stewardship goals, and on what basis computing time will be allocated to achieve the goal of a balanced program among the labs. (2) Explain the NNSA's acquisition strategy for capacity and capability of machines at each of the labs and how it will fit within the existing budget constraints. (3

  2. Cloud Computing: Strategies for Cloud Computing Adoption

    OpenAIRE

    Shimba, Faith

    2010-01-01

    The advent of cloud computing in recent years has sparked an interest from different organisations, institutions and users to take advantage of web applications. This is a result of the new economic model for the Information Technology (IT) department that cloud computing promises. The model promises a shift from an organisation required to invest heavily for limited IT resources that are internally managed, to a model where the organisation can buy or rent resources that are managed by a clo...

  3. The performance of low-cost commercial cloud computing as an alternative in computational chemistry.

    Science.gov (United States)

    Thackston, Russell; Fortenberry, Ryan C

    2015-05-05

    The growth of commercial cloud computing (CCC) as a viable means of computational infrastructure is largely unexplored for the purposes of quantum chemistry. In this work, the PSI4 suite of computational chemistry programs is installed on five different types of Amazon Web Services CCC platforms. The performance for a set of electronically excited state single-point energies is compared between these CCC platforms and typical, "in-house" physical machines. Further considerations are made for the number of cores or virtual CPUs (vCPUs, for the CCC platforms), but no considerations are made for full parallelization of the program (even though parallelization of the BLAS library is implemented), complete high-performance computing cluster utilization, or steal time. Even with this most pessimistic view of the computations, CCC resources are shown to be more cost effective for significant numbers of typical quantum chemistry computations. Large numbers of large computations are still best utilized by more traditional means, but smaller-scale research may be more effectively undertaken through CCC services. © 2015 Wiley Periodicals, Inc.

  4. Computer-enhanced thallium scintigrams in asymptomatic men with abnormal exercise tests

    International Nuclear Information System (INIS)

    Uhl, G.S.; Kay, T.N.; Hickman, J.R. Jr.

    1981-01-01

    The use of treadmill testing in asymptomatic patients and those with an atypical chest pain syndrome is increasing, yet the proportion of false positive stress electrocardiograms increases as the prevalence of disease decreases. To determine the diagnostic accuracy of computer-enhanced thallium perfusion scintigraphy in this subgroup of patients, multigated thallium scans were obtained after peak exercise and 3 or 4 hours after exercise, and the raw images were enhanced by a computer before interpretations were made. The patient group consisted of 191 asymptomatic U.S. Air Force aircrewmen who had an abnormal exercise electrocardiogram. Of these, 135 had normal coronary angiographic findings, 15 had subcritical coronary stenosis (less than 50 percent diameter narrowing) and 41 had significant coronary artery disease. Use of computer enhancement resulted in only four false positive and two false negative scintigrams. The small subgroup with subcritical coronary disease had equivocal results on thallium scintigraphy, 10 men having abnormal scans and 5 showing no defects. The clinical significance of such subcritical disease is unclear, but it can be detected with thallium scintigraphy. Thallium scintigrams that have been enhanced by readily available computer techniques are an accurate diagnostic tool even in asymptomatic patients with an easily interpretable abnormal maximal stress electrocardiogram. Thallium scans can be effectively used in counseling asymptomatic patients on the likelihood of their having coronary artery disease.

  5. Personal computers in high energy physics

    International Nuclear Information System (INIS)

    Quarrie, D.R.

    1987-01-01

    The role of personal computers within HEP is expanding as their capabilities increase and their cost decreases. Already they offer greater flexibility than many low-cost graphics terminals for a comparable cost and in addition they can significantly increase the productivity of physicists and programmers. This talk will discuss existing uses for personal computers and explore possible future directions for their integration into the overall computing environment. (orig.)

  6. Using Robotics and Game Design to Enhance Children's Self-Efficacy, STEM Attitudes, and Computational Thinking Skills

    Science.gov (United States)

    Leonard, Jacqueline; Buss, Alan; Gamboa, Ruben; Mitchell, Monica; Fashola, Olatokunbo S.; Hubert, Tarcia; Almughyirah, Sultan

    2016-12-01

    This paper describes the findings of a pilot study that used robotics and game design to develop middle school students' computational thinking strategies. One hundred and twenty-four students engaged in LEGO® EV3 robotics and created games using Scalable Game Design software. The results of the study revealed students' pre-post self-efficacy scores on the construct of computer use declined significantly, while the constructs of videogaming and computer gaming remained unchanged. When these constructs were analyzed by type of learning environment, self-efficacy on videogaming increased significantly in the combined robotics/gaming environment compared with the gaming-only context. Student attitudes toward STEM, however, did not change significantly as a result of the study. Finally, children's computational thinking (CT) strategies varied by method of instruction as students who participated in holistic game development (i.e., Project First) had higher CT ratings. This study contributes to the STEM education literature on the use of robotics and game design to influence self-efficacy in technology and CT, while informing the research team about the adaptations needed to ensure project fidelity during the remaining years of the study.

  7. Cloud Computing

    CERN Document Server

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for d

  8. Third International Conference on Computational Science, Engineering and Information Technology (CCSEIT-2013), v.1

    CERN Document Server

    Kumar, Ashok; Annamalai, Annamalai

    2013-01-01

      This book is the proceedings of Third International Conference on Computational Science, Engineering and Information Technology (CCSEIT-2013) that was held in Konya, Turkey, on June 7-9. CCSEIT-2013 provided an excellent international forum for sharing knowledge and results in theory, methodology and applications of computational science, engineering and information technology. This book contains research results, projects, survey work and industrial experiences representing significant advances in the field. The different contributions collected in this book cover five main areas: algorithms, data structures and applications;  wireless and mobile networks; computer networks and communications; natural language processing and information theory; cryptography and information security.  

  9. The Relation between Accounting Result and Tax Result in the Case of the Profit Tax

    Directory of Open Access Journals (Sweden)

    Băcanu Mihaela-Nicoleta

    2017-01-01

    Full Text Available Accounting and taxation are two connected domains in Romania. The proof that these are connected is the computation of the profit tax, for which the tax result is computed based on the accounting result. The scope of the paper is to present the relation between the accounting result and the tax result. There is a direct relation but also an indirect relation between the two results, taking into consideration the way of computing the tax result, but also the professional judgment applied when the revenues and the expenses are recorded in the accounting register. The paper also analyzes which one of the two results influences the other.
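    The direct relation between the two results can be made concrete with a small sketch. It assumes the usual adjustment pattern (tax result = accounting result + non-deductible expenses - non-taxable revenues - fiscal deductions) and a 16% profit tax rate, the Romanian rate in force when the paper was written; the figures are invented.

        def profit_tax(accounting_result, non_deductible_expenses=0.0,
                       non_taxable_revenues=0.0, fiscal_deductions=0.0, rate=0.16):
            """Tax (fiscal) result derived from the accounting result, and the resulting
            profit tax, under the adjustment scheme assumed above."""
            tax_result = (accounting_result + non_deductible_expenses
                          - non_taxable_revenues - fiscal_deductions)
            tax = max(tax_result, 0.0) * rate
            return tax_result, tax

        # invented example: accounting profit of 100,000 RON
        tax_result, tax = profit_tax(100_000, non_deductible_expenses=12_000,
                                     non_taxable_revenues=5_000)
        print(tax_result, tax)   # 107000.0 and 17120.0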

  10. Games at work: the recreational use of computer games during working hours.

    Science.gov (United States)

    Reinecke, Leonard

    2009-08-01

    The present study investigated the recreational use of video and computer games in the workplace. In an online survey, 833 employed users of online casual games reported on their use of computer games during working hours. The data indicate that playing computer games in the workplace elicits substantial levels of recovery experience. Recovery experience associated with gameplay was the strongest predictor for the use of games in the workplace. Furthermore, individuals with higher levels of work-related fatigue reported stronger recovery experience during gameplay and showed a higher tendency to play games during working hours than did persons with lower levels of work strain. Additionally, the social situation at work was found to have a significant influence on the use of games. Persons receiving less social support from colleagues and supervisors played games at work more frequently than did individuals with higher levels of social support. Furthermore, job control was positively related to the use of games at work. In sum, the results of the present study illustrate that computer games have a significant recovery potential. Implications of these findings for research on personal computer use during work and for games research in general are discussed.

  11. Advanced computations in plasma physics

    International Nuclear Information System (INIS)

    Tang, W.M.

    2002-01-01

    Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. In this paper we review recent progress and future directions for advanced simulations in magnetically confined plasmas with illustrative examples chosen from magnetic confinement research areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to

  12. Calculations of reactor-accident consequences, Version 2. CRAC2: computer code user's guide

    International Nuclear Information System (INIS)

    Ritchie, L.T.; Johnson, J.D.; Blond, R.M.

    1983-02-01

    The CRAC2 computer code is a revision of the Calculation of Reactor Accident Consequences computer code, CRAC, developed for the Reactor Safety Study. The CRAC2 computer code incorporates significant modeling improvements in the areas of weather sequence sampling and emergency response, and refinements to the plume rise, atmospheric dispersion, and wet deposition models. New output capabilities have also been added. This guide is to facilitate the informed and intelligent use of CRAC2. It includes descriptions of the input data, the output results, the file structures, control information, and five sample problems

  13. Computational fluid mechanics

    Science.gov (United States)

    Hassan, H. A.

    1993-01-01

    Two papers are included in this progress report. In the first, the compressible Navier-Stokes equations have been used to compute leading edge receptivity of boundary layers over parabolic cylinders. Natural receptivity at the leading edge was simulated and Tollmien-Schlichting waves were observed to develop in response to an acoustic disturbance, applied through the farfield boundary conditions. To facilitate comparison with previous work, all computations were carried out at a free stream Mach number of 0.3. The spatial and temporal behavior of the flowfields are calculated through the use of finite volume algorithms and Runge-Kutta integration. The results are dominated by strong decay of the Tollmien-Schlichting wave due to the presence of the mean flow favorable pressure gradient. The effects of numerical dissipation, forcing frequency, and nose radius are studied. The Strouhal number is shown to have the greatest effect on the unsteady results. In the second paper, a transition model for low-speed flows, previously developed by Young et al., which incorporates first-mode (Tollmien-Schlichting) disturbance information from linear stability theory has been extended to high-speed flow by incorporating the effects of second mode disturbances. The transition model is incorporated into a Reynolds-averaged Navier-Stokes solver with a one-equation turbulence model. Results using a variable turbulent Prandtl number approach demonstrate that the current model accurately reproduces available experimental data for first and second-mode dominated transitional flows. The performance of the present model shows significant improvement over previous transition modeling attempts.

  14. Algebraic computing

    International Nuclear Information System (INIS)

    MacCallum, M.A.H.

    1990-01-01

    The implementation of a new computer algebra system is time consuming: designers of general purpose algebra systems usually say it takes about 50 man-years to create a mature and fully functional system. Hence the range of available systems and their capabilities usually changes little between one general relativity meeting and the next; despite this, there have been significant changes in the period since the last report. The introductory remarks aim to give a brief survey of the capabilities of the principal available systems and to highlight one or two trends. A reference to the most recent full survey of computer algebra in relativity is given, together with brief descriptions of Maple, REDUCE, SHEEP and other applications. (author)

  15. '95 computer system operation project

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Taek; Lee, Hae Cho; Park, Soo Jin; Kim, Hee Kyung; Lee, Ho Yeun; Lee, Sung Kyu; Choi, Mi Kyung [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-12-01

    This report describes overall project works related to the operation of mainframe computers, the management of nuclear computer codes and the project of nuclear computer code conversion. The results of the project are as follows: 1. The operation and maintenance of the three mainframe computers and other utilities. 2. The management of the nuclear computer codes. 3. The finishing of the computer codes conversion project. 26 tabs., 5 figs., 17 refs. (Author)

  16. '95 computer system operation project

    International Nuclear Information System (INIS)

    Kim, Young Taek; Lee, Hae Cho; Park, Soo Jin; Kim, Hee Kyung; Lee, Ho Yeun; Lee, Sung Kyu; Choi, Mi Kyung

    1995-12-01

    This report describes overall project works related to the operation of mainframe computers, the management of nuclear computer codes and the project of nuclear computer code conversion. The results of the project are as follows: 1. The operation and maintenance of the three mainframe computers and other utilities. 2. The management of the nuclear computer codes. 3. The finishing of the computer codes conversion project. 26 tabs., 5 figs., 17 refs. (Author)

  17. Meningitis tuberculosa: Clinical findings and results of cranial computed tomography

    International Nuclear Information System (INIS)

    Trautmann, M.; Loddenkemper, R.; Hoffmann, H.G.; Krankenhaus Zehlendorf, Berlin; Allgemeines Krankenhaus Altona

    1982-01-01

    Guided by nine of our own observations between 1977 and 1981, new diagnostic facilities in tuberculous meningitis are discussed. For differentiation from viral meningitis, measurement of the CSF lactic acid concentration in addition to that of CSF glucose has proved to be of value in recent years. In accordance with the literature, two cases of this series which were examined for CSF lactic acid concentration showed markedly elevated levels of 8.4 and 10.4 mmol/l, respectively. In contrast to this, in viral meningitis values of less than 3.5 mmol/l are usually found. Additionally, the presence of hypochloremia and hyponatremia, which could be demonstrated in 6 of our 9 patients, may raise the suspicion of tuberculous etiology. In the series presented, cranial computed tomography was of greatest diagnostic value, enabling the diagnosis of hydrocephalus internus in 5, and basal arachnoiditis in 2 cases. (orig.)

  18. Visual ergonomic aspects of glare on computer displays: glossy screens and angular dependence

    Science.gov (United States)

    Brunnström, Kjell; Andrén, Börje; Konstantinides, Zacharias; Nordström, Lukas

    2007-02-01

    Recently, flat panel computer displays and notebook computers designed with a so-called glare panel, i.e. a highly glossy screen, have emerged on the market. The shiny look of the display appeals to customers, and there are also arguments that contrast, colour saturation, etc. improve when a glare panel is used. LCD displays often suffer from angle-dependent picture quality. This has been made even more pronounced by the introduction of Prism Light Guide plates into displays for notebook computers. The TCO label is the leading labelling system for computer displays. Currently about 50% of all computer displays on the market are certified according to the TCO requirements. The requirements are periodically updated to keep up with the technical development and the latest research in e.g. visual ergonomics. The gloss level of the screen and the angular dependence have recently been investigated in user studies. A study of the effect of highly glossy screens compared to matt screens has been performed. The results show a slight advantage for the glossy screen when no disturbing reflexes are present; however, the difference was not statistically significant. When disturbing reflexes are present, the advantage turns into a larger disadvantage, and this difference is statistically significant. Another study, of angular dependence, has also been performed. The results indicate a linear relationship between picture quality and the centre luminance of the screen.

  19. The Computational Physics Program of the national MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1989-01-01

    Since June 1974, the MFE Computer Center has been engaged in a significant computational physics effort. The principal objective of the Computational Physics Group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. Another major objective of the group is to develop efficient algorithms and programming techniques for current and future generations of supercomputers. The Computational Physics Group has been involved in several areas of fusion research. One main area is the application of Fokker-Planck/quasilinear codes to tokamaks. Another major area is the investigation of resistive magnetohydrodynamics in three dimensions, with applications to tokamaks and compact toroids. A third area is the investigation of kinetic instabilities using a 3-D particle code; this work is often coupled with the task of numerically generating equilibria which model experimental devices. Ways to apply statistical closure approximations to study tokamak-edge plasma turbulence have been under examination, with the hope of being able to explain anomalous transport. Also, we are collaborating in an international effort to evaluate fully three-dimensional linear stability of toroidal devices. In addition to these computational physics studies, the group has developed a number of linear systems solvers for general classes of physics problems and has been making a major effort at ascertaining how to efficiently utilize multiprocessor computers. A summary of these programs is included in this paper. 6 tabs

  20. Postoperative myocardial infarction documented by technetium pyrophosphate scan using single-photon emission computed tomography: Significance of intraoperative myocardial ischemia and hemodynamic control

    International Nuclear Information System (INIS)

    Cheng, D.C.; Chung, F.; Burns, R.J.; Houston, P.L.; Feindel, C.M.

    1989-01-01

    The aim of this prospective study was to document postoperative myocardial infarction (PMI) by technetium pyrophosphate scan using single-photon emission computed tomography (TcPPi-SPECT) in 28 patients undergoing elective coronary bypass grafting (CABG). The relationships of intraoperative electrocardiographic myocardial ischemia, hemodynamic responses, and pharmacological requirements to this incidence of PMI were correlated. Radionuclide cardioangiography and TcPPi-SPECT were performed 24 h preoperatively and 48 h postoperatively. A standard high-dose fentanyl anesthetic protocol was used. Twenty-five percent of the elective CABG patients sustained PMI, as documented by TcPPi-SPECT, with an infarcted mass of 38.0 +/- 5.5 g. No significant difference in demographics, preoperative right and left ventricular function, number of coronary vessels grafted, or aortic cross-clamp time was observed between the PMI and non-PMI groups. The distribution of patients using preoperative beta-adrenergic blocking drugs or calcium channel blocking drugs was found to have no correlation with the outcome of PMI. As well, no significant differences in hemodynamic changes or pharmacological requirements were observed between the PMI and non-PMI groups during the prebypass or postbypass periods, indicating that careful intraoperative control of hemodynamic indices did not prevent PMI in these patients. However, the incidence of prebypass ischemia was 39.3% and correlated significantly with the outcome of a positive TcPPi-SPECT, denoting a 3.9-fold increased risk of developing PMI. Prebypass ischemic changes in leads II and V5 were shown to correlate with increased CPK-MB release (P less than 0.05) and tended to occur more frequently with lateral myocardial infarction.

  1. Postoperative myocardial infarction documented by technetium pyrophosphate scan using single-photon emission computed tomography: Significance of intraoperative myocardial ischemia and hemodynamic control

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, D.C.; Chung, F.; Burns, R.J.; Houston, P.L.; Feindel, C.M. (Toronto Hospital, Ontario (Canada))

    1989-12-01

    The aim of this prospective study was to document postoperative myocardial infarction (PMI) by technetium pyrophosphate scan using single-photon emission computed tomography (TcPPi-SPECT) in 28 patients undergoing elective coronary bypass grafting (CABG). The relationships of intraoperative electrocardiographic myocardial ischemia, hemodynamic responses, and pharmacological requirements to this incidence of PMI were correlated. Radionuclide cardioangiography and TcPPi-SPECT were performed 24 h preoperatively and 48 h postoperatively. A standard high-dose fentanyl anesthetic protocol was used. Twenty-five percent of elective CABG patients were complicated with PMI, as documented by TcPPi-SPECT with an infarcted mass of 38.0 +/- 5.5 g. No significant difference in demographic, preoperative right and left ventricular function, number of coronary vessels grafted, or aortic cross-clamp time was observed between the PMI and non-PMI groups. The distribution of patients using preoperative beta-adrenergic blocking drugs or calcium channel blocking drugs was found to have no correlation with the outcome of PMI. As well, no significant differences in hemodynamic changes or pharmacological requirements were observed in the PMI and non-PMI groups during prebypass or postbypass periods, indicating careful intraoperative control of hemodynamic indices did not prevent the outcome of PMI in these patients. However, the incidence of prebypass ischemia was 39.3% and significantly correlated with the outcome of positive TcPPi-SPECT, denoting a 3.9-fold increased risk of developing PMI. Prebypass ischemic changes in leads II and V5 were shown to correlate with increased CPK-MB release (P less than 0.05) and tended to occur more frequently with lateral myocardial infarction.

  2. The position of a standard optical computer mouse affects cardiorespiratory responses during the operation of a computer under time constraints

    Directory of Open Access Journals (Sweden)

    Shunji Sako

    2014-08-01

    Full Text Available Objectives: This study investigated the association between task-induced stress and fatigue by examining the cardiovascular responses of subjects using different mouse positions while operating a computer under time constraints. Material and Methods: Sixteen young, healthy men participated in the study, which examined the use of optical mouse devices affixed to laptop computers. Two mouse positions were investigated: (1) the distal position (DP), in which the subjects place their forearms on the desk accompanied by the abduction and flexion of their shoulder joints, and (2) the proximal position (PP), in which the subjects place only their wrists on the desk without using an armrest. The subjects continued each task for 16 min. We assessed differences in several characteristics according to mouse position, including expired gas values, autonomic nerve activities (based on cardiorespiratory responses), operating efficiencies (based on word counts), and fatigue levels (based on the visual analog scale – VAS). Results: Oxygen consumption (VO2), the ratio of inspiration time to respiration time (Ti/Ttotal), respiratory rate (RR), minute ventilation (VE), and the ratio of expiration to inspiration (Te/Ti) were significantly lower when the participants were performing the task in the DP than those obtained in the PP. Tidal volume (VT), carbon dioxide output rates (VCO2/VE), and oxygen extraction fractions (VO2/VE) were significantly higher for the DP than they were for the PP. No significant difference in VAS was observed between the positions; however, as the task progressed, autonomic nerve activities were lower and operating efficiencies were significantly higher for the DP than they were for the PP. Conclusions: Our results suggest that the DP has fewer effects on cardiorespiratory functions, causes lower levels of sympathetic nerve activity and mental stress, and produces a higher total workload than the PP. This suggests that the DP is preferable to the PP when

  3. Communication: Proper treatment of classically forbidden electronic transitions significantly improves detailed balance in surface hopping

    Energy Technology Data Exchange (ETDEWEB)

    Sifain, Andrew E. [Department of Physics and Astronomy, University of Southern California, Los Angeles, California 90089-0485 (United States); Wang, Linjun [Department of Chemistry, Zhejiang University, Hangzhou 310027 (China); Prezhdo, Oleg V. [Department of Physics and Astronomy, University of Southern California, Los Angeles, California 90089-0485 (United States); Department of Chemistry, University of Southern California, Los Angeles, California 90089-1062 (United States)

    2016-06-07

    Surface hopping is the most popular method for nonadiabatic molecular dynamics. Many have reported that it does not rigorously attain detailed balance at thermal equilibrium, but does so approximately. We show that convergence to the Boltzmann populations is significantly improved when the nuclear velocity is reversed after a classically forbidden hop. The proposed prescription significantly reduces the total number of classically forbidden hops encountered along a trajectory, suggesting that some randomization in nuclear velocity is needed when classically forbidden hops constitute a large fraction of attempted hops. Our results are verified computationally using two- and three-level quantum subsystems, coupled to a classical bath undergoing Langevin dynamics.

  4. COMPUTER VISION SYNDROME: A SHORT REVIEW.

    OpenAIRE

    Sameena; Mohd Inayatullah

    2012-01-01

    Computers are probably one of the biggest scientific inventions of the modern era, and since then they have become an integral part of our life. The increased usage of computers has led to a variety of ocular symptoms which includes eye strain, tired eyes, irritation, redness, blurred vision, and diplopia, collectively referred to as Computer Vision Syndrome (CVS). CVS may have a significant impact not only on visual comfort but also occupational productivity...

  5. Computers in Academic Architecture Libraries.

    Science.gov (United States)

    Willis, Alfred; And Others

    1992-01-01

    Computers are widely used in architectural research and teaching in U.S. schools of architecture. A survey of libraries serving these schools sought information on the emphasis placed on computers by the architectural curriculum, accessibility of computers to library staff, and accessibility of computers to library patrons. Survey results and…

  6. Initial results from a prototype whole-body photon-counting computed tomography system.

    Science.gov (United States)

    Yu, Z; Leng, S; Jorgensen, S M; Li, Z; Gutjahr, R; Chen, B; Duan, X; Halaweish, A F; Yu, L; Ritman, E L; McCollough, C H

    X-ray computed tomography (CT) with energy-discriminating capabilities presents exciting opportunities for increased dose efficiency and improved material decomposition analyses. However, due to constraints imposed by the inability of photon-counting detectors (PCD) to respond accurately at high photon flux, to date there has been no clinical application of PCD-CT. Recently, our lab installed a research prototype system consisting of two x-ray sources and two corresponding detectors, one using an energy-integrating detector (EID) and the other using a PCD. In this work, we report the first third-party evaluation of this prototype CT system using both phantoms and a cadaver head. The phantom studies demonstrated several promising characteristics of the PCD sub-system, including improved longitudinal spatial resolution and reduced beam hardening artifacts, relative to the EID sub-system. More importantly, we found that the PCD sub-system offers excellent pulse pileup control in cases of x-ray flux up to 550 mA at 140 kV, which corresponds to approximately 2.5×10^11 photons per cm^2 per second. In an anthropomorphic phantom and a cadaver head, the PCD sub-system provided image quality comparable to the EID sub-system for the same dose level. Our results demonstrate the potential of the prototype system to produce clinically-acceptable images in vivo.

  7. Computing with networks of spiking neurons on a biophysically motivated floating-gate based neuromorphic integrated circuit.

    Science.gov (United States)

    Brink, S; Nease, S; Hasler, P

    2013-09-01

    Results are presented from several spiking network experiments performed on a novel neuromorphic integrated circuit. The networks are discussed in terms of their computational significance, which includes applications such as arbitrary spatiotemporal pattern generation and recognition, winner-take-all competition, stable generation of rhythmic outputs, and volatile memory. Analogies to the behavior of real biological neural systems are also noted. The alternatives for implementing the same computations are discussed and compared from a computational efficiency standpoint, with the conclusion that implementing neural networks on neuromorphic hardware is significantly more power efficient than numerical integration of model equations on traditional digital hardware. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. Optical Computers and Space Technology

    Science.gov (United States)

    Abdeldayem, Hossin A.; Frazier, Donald O.; Penn, Benjamin; Paley, Mark S.; Witherow, William K.; Banks, Curtis; Hicks, Rosilen; Shields, Angela

    1995-01-01

    The rapidly increasing demand for greater speed and efficiency on the information superhighway requires significant improvements over conventional electronic logic circuits. Optical interconnections and optical integrated circuits are strong candidates to overcome the severe limitations that conventional electronic logic circuits impose on the growth in speed and complexity of present-day computation. The new optical technology has increased the demand for high quality optical materials. NASA's recent involvement in processing optical materials in space has demonstrated that a new and unique class of high quality optical materials is processable in a microgravity environment. Microgravity processing can induce improved order in these materials and could have a significant impact on the development of optical computers. We will discuss NASA's role in processing these materials and report on some of the associated nonlinear optical properties which are quite useful for optical computer technology.

  9. Executing a gather operation on a parallel computer

    Science.gov (United States)

    Archer, Charles J [Rochester, MN; Ratterman, Joseph D [Rochester, MN

    2012-03-20

    Methods, apparatus, and computer program products are disclosed for executing a gather operation on a parallel computer according to embodiments of the present invention. Embodiments include configuring, by the logical root, a result buffer on the logical root, the result buffer having positions, each position corresponding to a ranked node in the operational group and for storing contribution data gathered from that ranked node. Embodiments also include repeatedly for each position in the result buffer: determining, by each compute node of an operational group, whether the current position in the result buffer corresponds with the rank of the compute node, if the current position in the result buffer corresponds with the rank of the compute node, contributing, by that compute node, the compute node's contribution data, if the current position in the result buffer does not correspond with the rank of the compute node, contributing, by that compute node, a value of zero for the contribution data, and storing, by the logical root in the current position in the result buffer, results of a bitwise OR operation of all the contribution data by all compute nodes of the operational group for the current position, the results received through the global combining network.
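
    The buffer-position scheme described above can be emulated with a standard message-passing reduction: for each position, the node whose rank matches contributes its data while every other node contributes zero, and the root combines the contributions with a bitwise OR. The sketch below is a rough illustration using mpi4py (assumed available); the payload, variable names, and use of MPI Reduce as a stand-in for the global combining network are illustrative assumptions, not the patented implementation.

```python
# Minimal sketch of the gather-by-bitwise-OR scheme, assuming mpi4py is available.
# Run with e.g.: mpiexec -n 4 python gather_or.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()
root = 0  # plays the role of the "logical root"

# Each node's contribution data (illustrative payload).
contribution = np.array([rank * 100 + 7], dtype=np.uint64)

# Result buffer exists only on the root, one position per ranked node.
result = np.zeros(size, dtype=np.uint64) if rank == root else None

for pos in range(size):
    # A node contributes its data only when the current position matches its rank;
    # otherwise it contributes zero, the identity element for bitwise OR.
    send = contribution if pos == rank else np.zeros(1, dtype=np.uint64)
    recv = np.empty(1, dtype=np.uint64) if rank == root else None
    comm.Reduce(send, recv, op=MPI.BOR, root=root)  # combine across all nodes
    if rank == root:
        result[pos] = recv[0]

if rank == root:
    print("gathered:", result)
```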

  10. Verification of thermal-hydraulic computer codes against standard problems for WWER reflooding

    International Nuclear Information System (INIS)

    Alexander D Efanov; Vladimir N Vinogradov; Victor V Sergeev; Oleg A Sudnitsyn

    2005-01-01

    Full text of publication follows: The computational assessment of reactor core component behavior under accident conditions is impossible without knowledge of the thermal-hydraulic processes involved. The adequacy of results obtained with computer codes with respect to the real processes is verified by carrying out a number of standard problems. In 2000-2003, three Russian standard problems on WWER core reflooding were carried out using experiments on the cooldown of a full-height, electrically heated WWER 37-rod bundle model in regimes of bottom (SP-1), top (SP-2) and combined (SP-3) reflooding. Representatives from eight MINATOM organizations took part in this work, in the course of which 'blind' and posttest calculations were performed using various versions of the RELAP5, ATHLET, CATHARE, COBRA-TF, TRAP, and KORSAR computer codes. The paper presents a brief description of the test facility, test section, test scenarios and conditions, as well as the basic results of the computational analysis of the experiments. The analysis of the test data revealed a significantly non-one-dimensional nature of cooldown and rewetting of heater rods heated to a high temperature in the model bundle. This was most pronounced for top and combined reflooding. The verification of the reflooding computer codes showed that most of the codes fairly predict the peak rod temperature and the time of bundle cooldown; the exceptions are the results of calculations with the ATHLET and CATHARE codes. The nature and rate of rewetting front advance in the lower half of the bundle are fairly predicted by practically all computer codes. The disagreement between the calculations and experimental results for the upper half of the bundle is caused by the difficulty of simulating multidimensional effects with 1-D computer codes. In this regard, the quasi-two-dimensional computer code COBRA-TF offers certain advantages. Overall, the closest

  11. Afrika Statistika ISSN 2316-090X A Bayesian significance test of ...

    African Journals Online (AJOL)

    of the generalized likelihood ratio test to detect a change in binomial ... computational simplicity to the problem of calculating posterior marginals. ... the impact of a single outlier on the performance of the Bayesian significance test of change.

  12. Risk perception and risk management in cloud computing: results from a case study of Swiss companies

    OpenAIRE

    Brender, Nathalie; Markov, Iliya

    2013-01-01

    In today's economic turmoil, the pay-per-use pricing model of cloud computing, its flexibility and scalability and the potential for better security and availability levels are alluring to both SMEs and large enterprises. However, cloud computing is fraught with security risks which need to be carefully evaluated before any engagement in this area. This article elaborates on the most important risks inherent to the cloud such as information security, regulatory compliance, data location, inve...

  13. Motivation in computer-assisted instruction.

    Science.gov (United States)

    Hu, Amanda; Shewokis, Patricia A; Ting, Kimberly; Fung, Kevin

    2016-08-01

    and 2) or senior (year 3 and 4). There were no significant differences in anatomy scores based on educational modality. There was significant interaction of educational modality by year [F(1,96) = 4.12, P = 0.045, ω(2)  = 0.031]. For the total score, there was a significant effect of year [F(1,96) = 22.28, P motivational score, the total IMMS score had two significant effects. With educational modality [F(1,96) = 5.18, P = 0.025, ω(2)  = 0.041], the 3D group (12.4 ± 2.8) scored significantly higher than the written text group (11.7 ± 3.2). With year [F(1,96) = 25.31, P motivation scores (P motivational levels. Computer-aided instruction was found to have a greater positive impact on senior medical students with higher anatomy and motivational scores. Higher anatomy scores were positively associated with higher motivational scores. Computer-aided instruction may be better targeted toward senior students. N/A. Laryngoscope, 126:S5-S13, 2016. © 2016 The American Laryngological, Rhinological and Otological Society, Inc.

  14. Human-Computer Interfaces for Wearable Computers: A Systematic Approach to Development and Evaluation

    OpenAIRE

    Witt, Hendrik

    2007-01-01

    The research presented in this thesis examines user interfaces for wearable computers. Wearable computers are a special kind of mobile computers that can be worn on the body. Furthermore, they integrate themselves even more seamlessly into different activities than a mobile phone or a personal digital assistant can. The thesis investigates the development and evaluation of user interfaces for wearable computers. In particular, it presents fundamental research results as well as supporting softw...

  15. Computational performance of a smoothed particle hydrodynamics simulation for shared-memory parallel computing

    Science.gov (United States)

    Nishiura, Daisuke; Furuichi, Mikito; Sakaguchi, Hide

    2015-09-01

    The computational performance of a smoothed particle hydrodynamics (SPH) simulation is investigated for three types of current shared-memory parallel computer devices: many integrated core (MIC) processors, graphics processing units (GPUs), and multi-core CPUs. We are especially interested in efficient shared-memory allocation methods for each chipset, because the efficient data access patterns differ between compute unified device architecture (CUDA) programming for GPUs and OpenMP programming for MIC processors and multi-core CPUs. We first introduce several parallel implementation techniques for the SPH code, and then examine these on our target computer architectures to determine the most effective algorithms for each processor unit. In addition, we evaluate the effective computing performance and power efficiency of the SPH simulation on each architecture, as these are critical metrics for overall performance in a multi-device environment. In our benchmark test, the GPU is found to produce the best arithmetic performance as a standalone device unit, and gives the most efficient power consumption. The multi-core CPU obtains the most effective computing performance. The computational speed of the MIC processor on Xeon Phi approached that of two Xeon CPUs. This indicates that using MICs is an attractive choice for existing SPH codes on multi-core CPUs parallelized by OpenMP, as it gains computational acceleration without the need for significant changes to the source code.

  16. Computers and conversation

    CERN Document Server

    Luff, Paul; Gilbert, Nigel G

    1986-01-01

    In the past few years a branch of sociology, conversation analysis, has begun to have a significant impact on the design of human–computer interaction (HCI). The investigation of human–human dialogue has emerged as a fruitful foundation for interactive system design. This book includes eleven original chapters by leading researchers who are applying conversation analysis to HCI. The fundamentals of conversation analysis are outlined, a number of systems are described, and a critical view of their value for HCI is offered. Computers and Conversation will be of interest to all concerned

  17. Computational error and complexity in science and engineering computational error and complexity

    CERN Document Server

    Lakshmikantham, Vangipuram; Chui, Charles K; Chui, Charles K

    2005-01-01

    The book "Computational Error and Complexity in Science and Engineering” pervades all the science and engineering disciplines where computation occurs. Scientific and engineering computation happens to be the interface between the mathematical model/problem and the real world application. One needs to obtain good quality numerical values for any real-world implementation. Just mathematical quantities symbols are of no use to engineers/technologists. Computational complexity of the numerical method to solve the mathematical model, also computed along with the solution, on the other hand, will tell us how much computation/computational effort has been spent to achieve that quality of result. Anyone who wants the specified physical problem to be solved has every right to know the quality of the solution as well as the resources spent for the solution. The computed error as well as the complexity provide the scientific convincing answer to these questions. Specifically some of the disciplines in which the book w...

  18. A Parallel and Distributed Surrogate Model Implementation for Computational Steering

    KAUST Repository

    Butnaru, Daniel

    2012-06-01

    Understanding the influence of multiple parameters in a complex simulation setting is a difficult task. In the ideal case, the scientist can freely steer such a simulation and is immediately presented with the results for a certain configuration of the input parameters. Such an exploration process is, however, not possible if the simulation is computationally too expensive. For these cases we present in this paper a scalable computational steering approach utilizing a fast surrogate model as a substitute for the time-consuming simulation. The surrogate model we propose is based on the sparse grid technique, and we identify the main computational tasks associated with its evaluation and its extension. We further show how distributed data management combined with the specific use of accelerators allows us to approximate and deliver simulation results to a high-resolution visualization system in real-time. This significantly enhances the steering workflow and facilitates the interactive exploration of large datasets. © 2012 IEEE.
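
    As a loose illustration of the computational-steering idea, the sketch below precomputes an assumed expensive model on a coarse regular parameter grid and then answers interactive queries by interpolation. It uses scipy's RegularGridInterpolator rather than the sparse grid technique of the paper, and the function `expensive_simulation`, grid ranges, and resolutions are placeholder assumptions.

```python
# Hedged sketch: replace an expensive simulation with a grid-based interpolating surrogate.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def expensive_simulation(a, b):
    # Placeholder for a time-consuming solver call.
    return np.sin(3 * a) * np.exp(-b) + 0.1 * a * b

# Offline: sample the parameter space once (here a coarse 2-D grid).
a_grid = np.linspace(0.0, 1.0, 21)
b_grid = np.linspace(0.0, 2.0, 21)
values = np.array([[expensive_simulation(a, b) for b in b_grid] for a in a_grid])

surrogate = RegularGridInterpolator((a_grid, b_grid), values)

# Online: steering queries are answered immediately from the surrogate.
query = np.array([[0.37, 1.42]])
print("surrogate estimate:", surrogate(query)[0])
print("true value        :", expensive_simulation(0.37, 1.42))
```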

  19. A Compute Environment of ABC95 Array Computer Based on Multi-FPGA Chip

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The ABC95 array computer is a multi-function-network computer based on FPGA technology. The multi-function network supports conflict-free processor access to data in memory and supports processor-to-processor data access over an enhanced MESH network. The ABC95 instruction set includes control instructions, scalar instructions, and vector instructions; the network instructions are the main focus here. A programming environment for ABC95 array computer assembly language is designed, and a programming environment for the ABC95 array computer under VC++ is presented. It includes functions to load ABC95 array computer programs and data, store results, run programs, and so on. In particular, the data type for conflict-free access on the ABC95 array computer is defined. The results show that these technologies allow programs for the ABC95 array computer to be developed effectively.

  20. Desktop grid computing

    CERN Document Server

    Cerin, Christophe

    2012-01-01

    Desktop Grid Computing presents common techniques used in numerous models, algorithms, and tools developed during the last decade to implement desktop grid computing. These techniques enable the solution of many important sub-problems for middleware design, including scheduling, data management, security, load balancing, result certification, and fault tolerance. The book's first part covers the initial ideas and basic concepts of desktop grid computing. The second part explores challenging current and future problems. Each chapter presents the sub-problems, discusses theoretical and practical

  1. CRITICAL ISSUES IN HIGH END COMPUTING - FINAL REPORT

    Energy Technology Data Exchange (ETDEWEB)

    Corones, James [Krell Institute

    2013-09-23

    High-End computing (HEC) has been a driver for advances in science and engineering for the past four decades. Increasingly HEC has become a significant element in the national security, economic vitality, and competitiveness of the United States. Advances in HEC provide results that cut across traditional disciplinary and organizational boundaries. This program provides opportunities to share information about HEC systems and computational techniques across multiple disciplines and organizations through conferences and exhibitions of HEC advances held in Washington DC so that mission agency staff, scientists, and industry can come together with White House, Congressional and Legislative staff in an environment conducive to the sharing of technical information, accomplishments, goals, and plans. A common thread across this series of conferences is the understanding of computational science and applied mathematics techniques across a diverse set of application areas of interest to the Nation. The specific objectives of this program are: Program Objective 1. To provide opportunities to share information about advances in high-end computing systems and computational techniques between mission critical agencies, agency laboratories, academics, and industry. Program Objective 2. To gather pertinent data and address specific topics of wide interest to mission critical agencies. Program Objective 3. To promote a continuing discussion of critical issues in high-end computing. Program Objective 4. To provide a venue where a multidisciplinary scientific audience can discuss the difficulties applying computational science techniques to specific problems and can specify future research that, if successful, will eliminate these problems.

  2. Most significant preliminary results of the probabilistic safety analysis on the Juragua nuclear power plant

    International Nuclear Information System (INIS)

    Perdomo, Manuel

    1995-01-01

    Since 1990 the Group for PSA Development and Applications (GDA/APS) has been working on the Level-1 PSA for the Juragua-1 NPP, as a part of an IAEA Technical Assistance Project. The main objective of this study, which is still under way, is to assess, in a preliminary way, the reactor design safety to find its potential 'weak points' at the construction stage, using a generic data base. At the same time, the study allows the PSA team to become familiar with the plant design and with analysis techniques for the future operational PSA of the plant. This paper presents the most significant preliminary results of the study, which reveal some advantages of the safety characteristics of the plant design in comparison with the homologous VVER-440 reactors and some areas where slight modifications would improve the plant safety, considering the level of detail at which the study is carried out. (author). 13 refs, 1 fig, 2 tabs

  3. Assessing attitudes toward computers and the use of Internet resources among undergraduate microbiology students

    Science.gov (United States)

    Anderson, Delia Marie Castro

    Computer literacy and use have become commonplace in our colleges and universities. In an environment that demands the use of technology, educators should be knowledgeable of the components that make up the overall computer attitude of students and be willing to investigate the processes and techniques of effective teaching and learning that can take place with computer technology. The purpose of this study is twofold. First, it investigates the relationship between computer attitudes and gender, ethnicity, and computer experience. Second, it addresses the question of whether, and to what extent, students' attitudes toward computers change over a 16 week period in an undergraduate microbiology course that supplements the traditional lecture with computer-driven assignments. Multiple regression analyses, using data from the Computer Attitudes Scale (Loyd & Loyd, 1985), showed that, in the experimental group, no significant relationships were found between computer anxiety and gender or ethnicity or between computer confidence and gender or ethnicity. However, students who used computers the longest (p = .001) and who were self-taught (p = .046) had the lowest computer anxiety levels. Likewise, students who used computers the longest (p = .001) and who were self-taught (p = .041) had the highest confidence levels. No significant relationships between computer liking, usefulness, or the use of Internet resources and gender, ethnicity, or computer experience were found. Dependent T-tests were performed to determine whether computer attitude scores (pretest and posttest) increased over a 16-week period for students who had been exposed to computer-driven assignments and other Internet resources. Results showed that students in the experimental group were less anxious about working with computers and considered computers to be more useful. In the control group, no significant changes in computer anxiety, confidence, liking, or usefulness were noted. Overall, students in

  4. Total variation-based neutron computed tomography

    Science.gov (United States)

    Barnard, Richard C.; Bilheux, Hassina; Toops, Todd; Nafziger, Eric; Finney, Charles; Splitter, Derek; Archibald, Rick

    2018-05-01

    We perform the neutron computed tomography reconstruction problem via an inverse problem formulation with a total variation penalty. In the case of highly under-resolved angular measurements, the total variation penalty suppresses high-frequency artifacts which appear in filtered back projections. In order to efficiently compute solutions for this problem, we implement a variation of the split Bregman algorithm; due to the error-forgetting nature of the algorithm, the computational cost of updating can be significantly reduced via very inexact approximate linear solvers. We present the effectiveness of the algorithm in the significantly low-angular sampling case using synthetic test problems as well as data obtained from a high flux neutron source. The algorithm removes artifacts and can even roughly capture small features when an extremely low number of angles are used.
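
    To make the total variation penalty and the split Bregman iteration mentioned above concrete, the sketch below applies them to a 1-D denoising problem, min_u 0.5*||u - f||^2 + lam*||Du||_1, which is a simplified stand-in for the tomographic reconstruction in the paper. The parameter values, the dense difference operator, and the synthetic signal are illustrative assumptions.

```python
# Hedged sketch: split Bregman iteration for 1-D total variation denoising,
# min_u 0.5*||u - f||^2 + lam*||D u||_1, with D the forward-difference operator.
import numpy as np

def tv_denoise_split_bregman(f, lam=0.5, mu=5.0, n_iter=100):
    n = len(f)
    # Forward-difference matrix D (dense here for clarity; sparse in practice).
    D = np.eye(n, k=1) - np.eye(n)
    D[-1, :] = 0.0
    A = np.eye(n) + mu * D.T @ D                        # system matrix for the u-update
    u, d, b = f.copy(), np.zeros(n), np.zeros(n)
    shrink = lambda x, t: np.sign(x) * np.maximum(np.abs(x) - t, 0.0)
    for _ in range(n_iter):
        u = np.linalg.solve(A, f + mu * D.T @ (d - b))  # quadratic subproblem
        Du = D @ u
        d = shrink(Du + b, lam / mu)                    # soft-thresholding step
        b = b + Du - d                                  # Bregman variable update
    return u

# Piecewise-constant signal corrupted by noise.
rng = np.random.default_rng(0)
truth = np.concatenate([np.zeros(50), np.ones(50), 0.3 * np.ones(50)])
noisy = truth + 0.1 * rng.standard_normal(truth.size)
denoised = tv_denoise_split_bregman(noisy)
print("RMSE noisy   :", np.sqrt(np.mean((noisy - truth) ** 2)))
print("RMSE denoised:", np.sqrt(np.mean((denoised - truth) ** 2)))
```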

  5. The (In)Significance of Socio-Demographic Factors as Possible Determinants of Vietnamese Social Scientists’ Contribution-Adjusted Productivity: Preliminary Results from 2008–2017 Scopus Data

    Directory of Open Access Journals (Sweden)

    Thu-Trang Vuong

    2017-12-01

    Full Text Available As collaboration has become widespread in academia, and the number of authors per article has increased, the publication count is no longer an accurate indicator of scientific output in many cases. To overcome this limitation, this study defined and computed a relative count of publications called ‘CP’ (credit-based contribution points), based on the sequence-determines-credit (SDC) method, which takes into account the level of contribution of each author. Analyses were done on a sample of 410 Vietnamese social scientists whose publications were indexed in the Scopus database during 2008–2017. The results showed that the average CP of Vietnamese researchers in the field of social sciences and humanities is very low: more than 88% of authors have a CP less than five over a span of 10 years. Researchers with a higher CP were mostly 40–50 years old; however, even for this sub-group, the mean CP was only 3.07. Multiple attributes of first-authorship—including knowledge, research skills, and critical thinking—could boost the CP by a ratio of 1:1.06. There is no evidence of gender differences in productivity; however, there is a regional difference. These findings offer significant insights into the education system in regard to science and technology, namely policy implications for science funding and management strategies for research funds.
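
    The credit-based contribution points depend on a per-author weighting derived from byline position. The sketch below assumes a simple rank-based variant in which the k-th of n authors receives weight (n - k + 1), normalized so each paper distributes one point in total; the exact SDC weighting used in the study may differ, so treat the formula, the helper names, and the toy author lists as assumptions.

```python
# Hedged sketch: contribution-adjusted publication counts under an assumed
# sequence-determines-credit (SDC) style weighting.
from collections import defaultdict

def sdc_weights(n_authors):
    # Assumption: position k (1-indexed) gets weight (n - k + 1), normalized to sum to 1.
    raw = [n_authors - k + 1 for k in range(1, n_authors + 1)]
    total = sum(raw)
    return [w / total for w in raw]

def contribution_points(papers):
    # papers: list of author lists, in byline order.
    cp = defaultdict(float)
    for authors in papers:
        for author, w in zip(authors, sdc_weights(len(authors))):
            cp[author] += w
    return dict(cp)

papers = [
    ["A. Nguyen", "B. Tran"],            # A gets 2/3, B gets 1/3
    ["B. Tran", "C. Le", "A. Nguyen"],   # B gets 3/6, C gets 2/6, A gets 1/6
]
print(contribution_points(papers))
```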

  6. Development of a computer-aided digital reactivity computer system for PWRs

    International Nuclear Information System (INIS)

    Chung, S.-K.; Sung, K.-Y.; Kim, D.; Cho, D.-Y.

    1993-01-01

    Reactor physics tests at initial startup and after reloading are performed to verify the nuclear design and to ensure safe operation. Two kinds of reactivity computers, analog and digital, have been widely used in pressurized water reactor (PWR) core physics tests. The test data of both reactivity computers are displayed only on a strip chart recorder, and these data are managed by hand, so the accuracy of the test results depends on operator expertise and experience. This paper describes the development of the computer-aided digital reactivity computer system (DRCS), which is enhanced by system management software and an improved system for the application of the PWR core physics test
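
    A digital reactivity computer of the kind described typically derives reactivity from the measured neutron flux signal by inverse point kinetics: the delayed-neutron precursor balances are integrated from the flux history and reactivity follows from the point-kinetics equation. The sketch below shows that calculation on a synthetic flux trace; the six-group constants, generation time, Euler integration, and flux ramp are illustrative assumptions, not the DRCS implementation.

```python
# Hedged sketch: reactivity from a measured flux signal via inverse point kinetics.
# The six-group constants and flux trace are illustrative assumptions.
import numpy as np

beta_i = np.array([0.000215, 0.001424, 0.001274, 0.002568, 0.000748, 0.000273])
lam_i  = np.array([0.0124, 0.0305, 0.111, 0.301, 1.14, 3.01])   # decay constants, 1/s
beta   = beta_i.sum()
Lambda = 2.0e-5                                                  # prompt neutron generation time, s

dt = 0.01
t = np.arange(0.0, 20.0, dt)
n = 1.0 + 0.05 * np.clip(t - 5.0, 0.0, None)                     # synthetic measured flux (ramp after t = 5 s)

# Start from equilibrium precursor concentrations consistent with n(0).
C = beta_i * n[0] / (Lambda * lam_i)
rho = np.zeros_like(n)
for k in range(1, len(t)):
    dndt = (n[k] - n[k - 1]) / dt
    # Explicit Euler update of dC_i/dt = (beta_i/Lambda) * n - lam_i * C_i
    C = C + dt * (beta_i / Lambda * n[k] - lam_i * C)
    # Inverse point kinetics: rho = beta + Lambda*(dn/dt)/n - (Lambda/n) * sum(lam_i * C_i)
    rho[k] = beta + Lambda * dndt / n[k] - (Lambda / n[k]) * np.sum(lam_i * C)

print("reactivity at t = 10 s: %.1f pcm" % (rho[t.searchsorted(10.0)] * 1e5))
```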

  7. Patterns of students' computer use and relations to their computer and information literacy

    DEFF Research Database (Denmark)

    Bundsgaard, Jeppe; Gerick, Julia

    2017-01-01

    Background: Previous studies have shown that there is a complex relationship between students’ computer and information literacy (CIL) and their use of information and communication technologies (ICT) for both recreational and school use. Methods: This study seeks to dig deeper into these complex...... relations by identifying different patterns of students’ school-related and recreational computer use in the 21 countries participating in the International Computer and Information Literacy Study (ICILS 2013). Results: Latent class analysis (LCA) of the student questionnaire and performance data from......, raising important questions about differences in contexts. Keywords: ICILS, Computer use, Latent class analysis (LCA), Computer and information literacy....

  8. Effects on mortality, treatment, and time management as a result of routine use of total body computed tomography in blunt high-energy trauma patients.

    Science.gov (United States)

    van Vugt, Raoul; Kool, Digna R; Deunk, Jaap; Edwards, Michael J R

    2012-03-01

    Currently, total body computed tomography (TBCT) is rapidly being implemented in the evaluation of trauma patients. With this review, we aim to evaluate the clinical implications (mortality, change in treatment, and time management) of the routine use of TBCT in adult blunt high-energy trauma patients compared with a conservative approach using conventional radiography, ultrasound, and selective computed tomography. A literature search for original studies on TBCT in blunt high-energy trauma patients was performed. Two independent observers included studies concerning mortality, change of treatment, and/or time management as outcome measures. For each article, relevant data were extracted and analyzed. In addition, the quality according to the Oxford levels of evidence was assessed. From 183 articles initially identified, the observers included nine original studies in consensus. One of three studies described a significant difference in mortality; four described a change of treatment in 2% to 27% of patients because of the use of TBCT. Five studies found a gain in time with the use of immediate routine TBCT. Eight studies scored a level of evidence of 2b and one of 3b. The current literature is predominantly of suboptimal design to prove definitively that the routine use of TBCT results in improved survival of blunt high-energy trauma patients. TBCT can lead to a change of treatment and improves time intervals in the emergency department as compared with its selective use.

  9. Computer vision syndrome: a study of knowledge and practices in university students.

    Science.gov (United States)

    Reddy, S C; Low, C K; Lim, Y P; Low, L L; Mardina, F; Nursaleha, M P

    2013-01-01

    Computer vision syndrome (CVS) is a condition in which a person experiences one or more eye symptoms as a result of prolonged work on a computer. To determine the prevalence of CVS symptoms, knowledge and practices of computer use in students studying in different universities in Malaysia, and to evaluate the association of various factors in computer use with the occurrence of symptoms. In a cross sectional, questionnaire survey study, data were collected from college students regarding demography, use of spectacles, duration of daily continuous use of computer, symptoms of CVS, preventive measures taken to reduce the symptoms, use of a radiation filter on the computer screen, and lighting in the room. A total of 795 students, aged between 18 and 25 years, from five universities in Malaysia were surveyed. The prevalence of symptoms of CVS (one or more) was found to be 89.9%; the most disturbing symptom was headache (19.7%) followed by eye strain (16.4%). Students who used a computer for more than 2 hours per day experienced significantly more symptoms of CVS (p=0.0001). Looking at far objects in between the work was significantly (p=0.0008) associated with a lower frequency of CVS symptoms. The use of a radiation filter on the screen (p=0.6777) did not help in reducing the CVS symptoms. Ninety percent of university students in Malaysia experienced symptoms related to CVS, which were seen more often in those who used a computer for more than 2 hours continuously per day. © NEPjOPH.

  10. Computer games and fine motor skills.

    Science.gov (United States)

    Borecki, Lukasz; Tolstych, Katarzyna; Pokorski, Mieczyslaw

    2013-01-01

    The study seeks to determine the influence of computer games on fine motor skills in young adults, an area of incomplete understanding and verification. We hypothesized that computer gaming could have a positive influence on basic motor skills, such as precision, aiming, speed, dexterity, or tremor. We examined 30 habitual game users (F/M - 3/27; age range 20-25 years) of the highly interactive game Counter Strike, in which players impersonate soldiers on a battlefield, and 30 age- and gender-matched subjects who declared never to play games. Selected tests from the Vienna Test System were used to assess fine motor skills and tremor. The results demonstrate that the game users scored appreciably better than the control subjects in all tests employed. In particular, the players did significantly better in the precision of arm-hand movements, as expressed by a lower time of errors, 1.6 ± 0.6 vs. 2.8 ± 0.6 s, a lower error rate, 13.6 ± 0.3 vs. 20.4 ± 2.2, and a shorter total time of performing a task, 14.6 ± 2.9 vs. 32.1 ± 4.5 s in non-players, respectively; p computer games on psychomotor functioning. We submit that playing computer games may be a useful training tool to increase fine motor skills and movement coordination.

  11. Computer-Assisted Instruction to Teach DOS Commands: A Pilot Study.

    Science.gov (United States)

    McWeeney, Mark G.

    1992-01-01

    Describes a computer-assisted instruction (CAI) program used to teach DOS commands. Pretest and posttest results for 65 graduate students using the program are reported, and it is concluded that the CAI program significantly aided the students. Sample screen displays for the program and several questions from the pre/posttest are included. (nine…

  12. Potent corticosteroid cream (mometasone furoate) significantly reduces acute radiation dermatitis: results from a double-blind, randomized study

    International Nuclear Information System (INIS)

    Bostroem, Aasa; Lindman, Henrik; Swartling, Carl; Berne, Berit; Bergh, Jonas

    2001-01-01

    Purpose: Radiation-induced dermatitis is a very common side effect of radiation therapy, and may necessitate interruption of the therapy. There is a substantial lack of evidence-based treatments for this condition. The aim of this study was to investigate the effect of mometasone furoate cream (MMF) on radiation dermatitis in a prospective, double-blind, randomized study. Material and methods: The study comprised 49 patients with node-negative breast cancer. They were operated on with sector resection and scheduled for postoperative radiotherapy using photons with identical radiation qualities and dosage to the breast parenchyma. The patients were randomized to receive either MMF or emollient cream. The cream was applied on the irradiated skin twice a week from the start of radiotherapy until the 12th fraction (24 Gy) and thereafter once daily until 3 weeks after completion of radiation. Both groups additionally received non-blinded emollient cream daily. The intensity of the acute radiation dermatitis was evaluated on a weekly basis regarding erythema and pigmentation, using a reflectance spectrophotometer together with visual scoring of the skin reactions. Results: MMF in combination with emollient cream treatment significantly decreased acute radiation dermatitis (P=0.0033) compared with emollient cream alone. There was no significant difference in pigmentation between the two groups. Conclusions: Adding MMF, a potent topical corticosteroid, to an emollient cream is statistically significantly more effective than emollient cream alone in reducing acute radiation dermatitis

  13. Advances and challenges in computational plasma science

    International Nuclear Information System (INIS)

    Tang, W M; Chan, V S

    2005-01-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. This

  14. Tracing monadic computations and representing effects

    Directory of Open Access Journals (Sweden)

    Maciej Piróg

    2012-02-01

    Full Text Available In functional programming, monads are supposed to encapsulate computations, effectfully producing the final result, but keeping to themselves the means of acquiring it. For various reasons, we sometimes want to reveal the internals of a computation. To make that possible, in this paper we introduce monad transformers that add the ability to automatically accumulate observations about the course of execution as an effect. We discover that if we treat the resulting trace as the actual result of the computation, we can find new functionality in existing monads, notably when working with non-terminating computations.

  15. Identity-Based Authentication for Cloud Computing

    Science.gov (United States)

    Li, Hongwei; Dai, Yuanshun; Tian, Ling; Yang, Haomiao

    Cloud computing is a recently developed technology for complex systems with massive-scale services shared among numerous users. Therefore, authentication of both users and services is a significant issue for the trust and security of cloud computing. The SSL Authentication Protocol (SAP), once applied in cloud computing, becomes so complicated that users face a heavy load in both computation and communication. This paper, based on the identity-based hierarchical model for cloud computing (IBHMCC) and its corresponding encryption and signature schemes, presents a new identity-based authentication protocol for cloud computing and services. Through simulation testing, it is shown that the authentication protocol is more lightweight and efficient than SAP, especially on the user side. This merit, together with great scalability, makes the model well suited to the massive-scale cloud.

  16. Optimized blind gamma-ray pulsar searches at fixed computing budget

    International Nuclear Information System (INIS)

    Pletsch, Holger J.; Clark, Colin J.

    2014-01-01

    The sensitivity of blind gamma-ray pulsar searches in multiple years worth of photon data, as from the Fermi LAT, is primarily limited by the finite computational resources available. Addressing this 'needle in a haystack' problem, here we present methods for optimizing blind searches to achieve the highest sensitivity at fixed computing cost. For both coherent and semicoherent methods, we consider their statistical properties and study their search sensitivity under computational constraints. The results validate a multistage strategy, where the first stage scans the entire parameter space using an efficient semicoherent method and promising candidates are then refined through a fully coherent analysis. We also find that for the first stage of a blind search incoherent harmonic summing of powers is not worthwhile at fixed computing cost for typical gamma-ray pulsars. Further enhancing sensitivity, we present efficiency-improved interpolation techniques for the semicoherent search stage. Via realistic simulations we demonstrate that overall these optimizations can significantly lower the minimum detectable pulsed fraction by almost 50% at the same computational expense.

  17. Pancreatic gross tumor volume contouring on computed tomography (CT) compared with magnetic resonance imaging (MRI): Results of an international contouring conference.

    Science.gov (United States)

    Hall, William A; Heerkens, Hanne D; Paulson, Eric S; Meijer, Gert J; Kotte, Alexis N; Knechtges, Paul; Parikh, Parag J; Bassetti, Michael F; Lee, Percy; Aitken, Katharine L; Palta, Manisha; Myrehaug, Sten; Koay, Eugene J; Portelance, Lorraine; Ben-Josef, Edgar; Erickson, Beth A

    Accurate identification of the gross tumor volume (GTV) in pancreatic adenocarcinoma is challenging. We sought to understand differences in GTV delineation using pancreatic computed tomography (CT) compared with magnetic resonance imaging (MRI). Twelve attending radiation oncologists were convened for an international contouring symposium. All participants had a clinical and research interest in pancreatic adenocarcinoma. CT and MRI scans from 3 pancreatic cases were used for contouring. CT and MRI GTVs were analyzed and compared. Interobserver variability was compared using Dice's similarity coefficient (DSC), Hausdorff distances, and Jaccard indices. Mann-Whitney tests were used to check for significant differences. Consensus contours on CT and MRI scans and constructed count maps were used to visualize the agreement. Agreement regarding the optimal method to determine GTV definition using MRI was reached. Six contour sets (3 from CT and 3 from MRI) were obtained and compared for each observer, totaling 72 contour sets. The mean volume of contours on CT was significantly larger at 57.48 mL compared with a mean of 45.76 mL on MRI, P = .011. The standard deviation obtained from the CT contours was significantly larger than the standard deviation from the MRI contours (P = .027). The mean DSC was 0.73 for the CT and 0.72 for the MRI (P = .889). The conformity index measurement was similar for CT and MRI (P = .58). Count maps were created to highlight differences in the contours from CT and MRI. Using MRI as a primary image set to define a pancreatic adenocarcinoma GTV resulted in smaller contours compared with CT. No differences in DSC or the conformity index were seen between MRI and CT. A stepwise method is recommended as an approach to contour a pancreatic GTV using MRI. Copyright © 2017 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
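
    The agreement metrics used above, Dice's similarity coefficient and the Jaccard index, compare the overlap of two delineated volumes. The sketch below computes both for a pair of binary contour masks; the toy circular masks standing in for CT and MRI GTVs are assumptions for illustration.

```python
# Hedged sketch: Dice similarity coefficient and Jaccard index for two binary contour masks.
import numpy as np

def dice(a, b):
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def jaccard(a, b):
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return inter / union

# Two toy GTV masks on a 100x100 grid (circles offset by a few voxels).
yy, xx = np.mgrid[0:100, 0:100]
mask_ct  = (xx - 50) ** 2 + (yy - 50) ** 2 <= 20 ** 2
mask_mri = (xx - 54) ** 2 + (yy - 50) ** 2 <= 18 ** 2
print("DSC    : %.3f" % dice(mask_ct, mask_mri))
print("Jaccard: %.3f" % jaccard(mask_ct, mask_mri))
```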

  18. A fast algorithm for sparse matrix computations related to inversion

    International Nuclear Information System (INIS)

    Li, S.; Wu, W.; Darve, E.

    2013-01-01

    We have developed a fast algorithm for computing certain entries of the inverse of a sparse matrix. Such computations are critical to many applications, such as the calculation of non-equilibrium Green’s functions G^r and G^< for nano-devices. The FIND (Fast Inverse using Nested Dissection) algorithm is optimal in the big-O sense. However, in practice, FIND suffers from two problems due to the width-2 separators used by its partitioning scheme. One problem is the presence of a large constant factor in the computational cost of FIND. The other problem is that the partitioning scheme used by FIND is incompatible with most existing partitioning methods and libraries for nested dissection, which all use width-1 separators. Our new algorithm resolves these problems by thoroughly decomposing the computation process such that width-1 separators can be used, resulting in a significant speedup over FIND for realistic devices — up to twelve-fold in simulation. The new algorithm also has the added advantage that desired off-diagonal entries can be computed for free. Consequently, our algorithm is faster than the current state-of-the-art recursive methods for meshes of any size. Furthermore, the framework used in the analysis of our algorithm is the first attempt to explicitly apply the widely-used relationship between mesh nodes and matrix computations to the problem of multiple eliminations with reuse of intermediate results. This framework makes our algorithm easier to generalize, and also easier to compare against other methods related to elimination trees. Finally, our accuracy analysis shows that the algorithms that require back-substitution are subject to significant extra round-off errors, which become extremely large even for some well-conditioned matrices or matrices with only moderately large condition numbers. When compared to these back-substitution algorithms, our algorithm is generally a few orders of magnitude more accurate, and our produced round-off errors
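
    For context on what computing 'certain entries of the inverse of a sparse matrix' involves, the sketch below extracts selected diagonal entries of A^{-1} using a sparse LU factorization and unit right-hand sides. This is the naive reference approach that specialized algorithms such as FIND aim to outperform, not the FIND algorithm itself; the tridiagonal test matrix is an assumption.

```python
# Hedged sketch: selected entries of the inverse of a sparse matrix via sparse LU.
# This is a naive reference approach, not the FIND algorithm described above.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import splu

n = 200
# 1-D Laplacian-like sparse test matrix (assumption, standing in for a device Hamiltonian).
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
lu = splu(A)

wanted = [0, 50, 100, 199]          # entries (i, i) of A^{-1} that we care about
for i in wanted:
    e = np.zeros(n)
    e[i] = 1.0
    col = lu.solve(e)               # i-th column of A^{-1}
    print(f"(A^-1)[{i},{i}] = {col[i]:.6f}")
```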

  19. A fast algorithm for sparse matrix computations related to inversion

    Energy Technology Data Exchange (ETDEWEB)

    Li, S., E-mail: lisong@stanford.edu [Institute for Computational and Mathematical Engineering, Stanford University, 496 Lomita Mall, Durand Building, Stanford, CA 94305 (United States); Wu, W. [Department of Electrical Engineering, Stanford University, 350 Serra Mall, Packard Building, Room 268, Stanford, CA 94305 (United States); Darve, E. [Institute for Computational and Mathematical Engineering, Stanford University, 496 Lomita Mall, Durand Building, Stanford, CA 94305 (United States); Department of Mechanical Engineering, Stanford University, 496 Lomita Mall, Durand Building, Room 209, Stanford, CA 94305 (United States)

    2013-06-01

    We have developed a fast algorithm for computing certain entries of the inverse of a sparse matrix. Such computations are critical to many applications, such as the calculation of non-equilibrium Green’s functions G{sup r} and G{sup <} for nano-devices. The FIND (Fast Inverse using Nested Dissection) algorithm is optimal in the big-O sense. However, in practice, FIND suffers from two problems due to the width-2 separators used by its partitioning scheme. One problem is the presence of a large constant factor in the computational cost of FIND. The other problem is that the partitioning scheme used by FIND is incompatible with most existing partitioning methods and libraries for nested dissection, which all use width-1 separators. Our new algorithm resolves these problems by thoroughly decomposing the computation process such that width-1 separators can be used, resulting in a significant speedup over FIND for realistic devices — up to twelve-fold in simulation. The new algorithm also has the added advantage that desired off-diagonal entries can be computed for free. Consequently, our algorithm is faster than the current state-of-the-art recursive methods for meshes of any size. Furthermore, the framework used in the analysis of our algorithm is the first attempt to explicitly apply the widely-used relationship between mesh nodes and matrix computations to the problem of multiple eliminations with reuse of intermediate results. This framework makes our algorithm easier to generalize, and also easier to compare against other methods related to elimination trees. Finally, our accuracy analysis shows that the algorithms that require back-substitution are subject to significant extra round-off errors, which become extremely large even for some well-conditioned matrices or matrices with only moderately large condition numbers. When compared to these back-substitution algorithms, our algorithm is generally a few orders of magnitude more accurate, and our produced round

  20. Secure Data Access Control for Fog Computing Based on Multi-Authority Attribute-Based Signcryption with Computation Outsourcing and Attribute Revocation.

    Science.gov (United States)

    Xu, Qian; Tan, Chengxiang; Fan, Zhijie; Zhu, Wenye; Xiao, Ya; Cheng, Fujia

    2018-05-17

    Nowadays, fog computing provides computation, storage, and application services to end users in the Internet of Things. One of the major concerns in fog computing systems is how fine-grained access control can be imposed. As a logical combination of attribute-based encryption and attribute-based signature, Attribute-based Signcryption (ABSC) can provide confidentiality and anonymous authentication for sensitive data and is more efficient than traditional "encrypt-then-sign" or "sign-then-encrypt" strategy. Thus, ABSC is suitable for fine-grained access control in a semi-trusted cloud environment and is gaining more and more attention recently. However, in many existing ABSC systems, the computation cost required for the end users in signcryption and designcryption is linear with the complexity of signing and encryption access policy. Moreover, only a single authority that is responsible for attribute management and key generation exists in the previous proposed ABSC schemes, whereas in reality, mostly, different authorities monitor different attributes of the user. In this paper, we propose OMDAC-ABSC, a novel data access control scheme based on Ciphertext-Policy ABSC, to provide data confidentiality, fine-grained control, and anonymous authentication in a multi-authority fog computing system. The signcryption and designcryption overhead for the user is significantly reduced by outsourcing the undesirable computation operations to fog nodes. The proposed scheme is proven to be secure in the standard model and can provide attribute revocation and public verifiability. The security analysis, asymptotic complexity comparison, and implementation results indicate that our construction can balance the security goals with practical efficiency in computation.