WorldWideScience

Sample records for significantly reduced computational

  1. Sucralfate significantly reduces ciprofloxacin concentrations in serum.

    OpenAIRE

    Garrelts, J C; Godley, P J; Peterie, J D; Gerlach, E H; Yakshe, C C

    1990-01-01

    The effect of sucralfate on the bioavailability of ciprofloxacin was evaluated in eight healthy subjects utilizing a randomized, crossover design. The area under the concentration-time curve from 0 to 12 h was reduced from 8.8 to 1.1 micrograms.h/ml by sucralfate (P less than 0.005). Similarly, the maximum concentration of ciprofloxacin in serum was reduced from 2.0 to 0.2 micrograms/ml (P less than 0.005). We conclude that concurrent ingestion of sucralfate significantly reduces the concentr...

  2. Rackspace: Significance of Cloud Computing to CERN

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    The research collaboration between Rackspace and CERN is contributing to how OpenStack cloud computing will move science work around the world for CERN, and to reducing the barriers between clouds for Rackspace.

  3. Quilting after mastectomy significantly reduces seroma formation

    African Journals Online (AJOL)

    reduce or prevent seroma formation among mastectomy patients ... of this prospective study is to evaluate the effect of surgical quilting ... Seroma was more common in smokers (p=0.003) and was not decreased by the .... explain its aetiology.

  4. Structural mode significance using INCA [Interactive Controls Analysis computer program]

    Science.gov (United States)

    Bauer, Frank H.; Downing, John P.; Thorpe, Christopher J.

    1990-01-01

    Structural finite element models are often too large to be used in the design and analysis of control systems. Model reduction techniques must be applied to reduce the structural model to manageable size. In the past, engineers either performed the model order reduction by hand or used distinct computer programs to retrieve the data, to perform the significance analysis and to reduce the order of the model. To expedite this process, the latest version of INCA has been expanded to include an interactive graphical structural mode significance and model order reduction capability.

  5. Significance of triplane computed tomography in otolaryngology

    International Nuclear Information System (INIS)

    Taiji, Hidenobu; Namiki, Hideo; Kano, Shigeru; Hojoh, Yoshio

    1985-01-01

    The authors obtained direct sagittal CT scans of the head using a new method for positioning the patient's head in the sitting position. Direct sagittal scans are more useful than computed rearranged scans because of their better spatial and density resolution. Triplane CT (axial, coronal, and sagittal CT) greatly improves three-dimensional recognition of intracranial and facial structures and of the extent of a lesion. A series of patients with various nasal and oropharyngeal tumors was examined with triplane CT. The advantages of direct sagittal scans are (1) recognition of the localization and extension of the lesion, (2) evaluation of the extent of deep facial and nasopharyngeal tumors, especially in the intracranial and intraorbital regions, and (3) more accurate staging of maxillary cancer. (author)

  6. Significance of computed tomography in urology

    International Nuclear Information System (INIS)

    Harada, Takashi

    1981-01-01

    More than five years have passed since computed tomography (CT) was first introduced in this country for practical use; however, cumulative diagnostic experience in urology has not yet been discussed thoroughly. In the Department of Urology of Kansai Medical University, CT diagnosis was performed more than 120 times over the past three years, and the instrument employed during this period has advanced from a first-generation scanner (ACTA 150) to a third-generation unit (CT-3W) this year. Seventy of these cases were pelvic lesions, and retroperitoneal surveys were made in the rest. Detection of space-occupying masses in the kidney, adrenal gland and their surroundings was comparatively easy with this method, but there are several pitfalls that can lead to misdiagnosis of the pelvic organs. It is difficult to obtain reliable results for closely packed viscera with tightly adherent connective tissue in a small space. However, these difficulties can be overcome, for instance, by bladder insufflation with olive oil and by scanning in the prone position. Contrast enhancement by injection of dye also gives more definite results in genitourinary tract assessment. Moreover, CT is of considerable benefit in the diagnosis of renal parenchymal changes, including lacerating renal trauma that cannot be differentiated by conventional methods. Bolus injection of contrast material also allows CT values obtained from a region of interest on the tomogram to be calculated and fitted to a time-activity curve, as in scintillation scanning. In the coming years, new devices in this field, including emission CT, NMR-CT and others, will open new prospects for an ideal diagnostic facility in urology. (author)

  7. Next-generation nozzle check valve significantly reduces operating costs

    Energy Technology Data Exchange (ETDEWEB)

    Roorda, O. [SMX International, Toronto, ON (Canada)]

    2009-01-15

    Check valves perform an important function in preventing reverse flow and protecting plant and mechanical equipment. However, the variety of different types of valves and extreme differences in performance even within one type can change maintenance requirements and life cycle costs, amounting to millions of dollars over the typical 15-year design life of piping components. A next-generation non-slam nozzle check valve which prevents return flow has greatly reduced operating costs by protecting the mechanical equipment in a piping system. This article described the check valve varieties such as the swing check valve, a dual-plate check valve, and nozzle check valves. Advancements in optimized design of a non-slam nozzle check valve were also discussed, with particular reference to computer flow modelling such as computational fluid dynamics; computer stress modelling such as finite element analysis; and flow testing (using rapid prototype development and flow loop testing), both to improve dynamic performance and reduce hydraulic losses. The benefits of maximized dynamic performance and minimized pressure loss from the new designed valve were also outlined. It was concluded that this latest non-slam nozzle check valve design has potential applications in natural gas, liquefied natural gas, and oil pipelines, including subsea applications, as well as refineries, and petrochemical plants among others, and is suitable for horizontal and vertical installation. The result of this next-generation nozzle check valve design is not only superior performance, and effective protection of mechanical equipment but also minimized life cycle costs. 1 fig.

  8. Mapping the Most Significant Computer Hacking Events to a Temporal Computer Attack Model

    OpenAIRE

    Heerden, Renier; Pieterse, Heloise; Irwin, Barry

    2012-01-01

    Part 4: Section 3: ICT for Peace and War; International audience; This paper presents eight of the most significant computer hacking events (also known as computer attacks). These events were selected because of their unique impact, methodology, or other properties. A temporal computer attack model is presented that can be used to model computer based attacks. This model consists of the following stages: Target Identification, Reconnaissance, Attack, and Post-Attack Reconnaissance stages. The...

  9. Clinical significance of measurement of hepatic volume by computed tomography

    International Nuclear Information System (INIS)

    Sato, Hiroyuki; Matsuda, Yoshiro; Takada, Akira

    1984-01-01

    Hepatic volumes were measured by computed tomography (CT) in 91 patients with chronic liver diseases. Mean hepatic volume in alcoholic liver disease was significantly larger than that in non-alcoholic liver disease. Hepatic volumes in the majority of decompensated liver cirrhosis were significantly smaller than those of compensated liver cirrhosis. In liver cirrhosis, significant correlations between hepatic volume and various hepatic tests which reflect the total functioning hepatic cell masses were found. Combinations of hepatic volume with ICG maximum removal rate and with serum cholinesterase activity were most useful for the assessment of prognosis in liver cirrhosis. These results indicated that estimation of hepatic volume by CT is useful for analysis of pathophysiology and prognosis of chronic liver diseases, and for diagnosis of alcoholic liver diseases. (author)

  10. The significance of sensory appeal for reduced meat consumption.

    Science.gov (United States)

    Tucker, Corrina A

    2014-10-01

    Reducing meat (over-)consumption as a way to help address environmental deterioration will require a range of strategies, and any such strategies will benefit from understanding how individuals might respond to various meat consumption practices. To investigate how New Zealanders perceive such a range of practices, in this instance in vitro meat, eating nose-to-tail, entomophagy and reducing meat consumption, focus groups involving a total of 69 participants were held around the country. While it is the damaging environmental implications of intensive farming practices and the projected continuation of increasing global consumer demand for meat products that has propelled this research, when asked to consider variations on the conventional meat-centric diet common to many New Zealanders, it was the sensory appeal of the areas considered that was deemed most problematic. While an ecological rationale for considering these 'meat' alternatives was recognised and considered important by most, transforming this value into action looks far less promising given the recurrent sensory objections to consuming different protein-based foods or of reducing meat consumption. This article considers the responses of focus group participants in relation to each of the dietary practices outlined, and offers suggestions on ways to encourage a more environmentally viable diet. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. Significance of computed tomography for diagnosis of heart diseases

    International Nuclear Information System (INIS)

    Senda, Kohei; Sakuma, Sadayuki

    1983-01-01

    Computed tomography (CT) with a 2-second scanner was carried out on 105 cases with various heart diseases in order to identify the CT findings in each disease. The significance of CT as an imaging study was evaluated in comparison with scintigraphic, echographic and roentgenographic studies. CT with contrast enhancement during moderate inspiration was able to demonstrate accurately organic changes of intra- and extracardiac structures. Compared with other imaging studies, CT was superior in detecting calcified or intracardiac mass lesions, despite its limited value in evaluating cardiac function or dynamics. (author)

  12. PA positioning significantly reduces testicular dose during sacroiliac joint radiography

    Energy Technology Data Exchange (ETDEWEB)

    Mekis, Nejc [Faculty of Health Sciences, University of Ljubljana (Slovenia)]; Mc Entee, Mark F., E-mail: mark.mcentee@ucd.i [School of Medicine and Medical Science, University College Dublin 4 (Ireland)]; Stegnar, Peter [Jozef Stefan International Postgraduate School, Ljubljana (Slovenia)]

    2010-11-15

    Radiation dose to the testes in the antero-posterior (AP) and postero-anterior (PA) projection of the sacroiliac joint (SIJ) was measured with and without a scrotal shield. Entrance surface dose, the dose received by the testicles and the dose area product (DAP) were used. DAP measurements revealed that the dose received by the phantom in the PA position is 12.6% lower than in the AP position (p ≤ 0.009), with no statistically significant reduction in image quality (p ≤ 0.483). The dose received by the testes in the PA projection in SIJ imaging is 93.1% lower than in the AP projection when not using protection (p ≤ 0.020) and 94.9% lower with protection (p ≤ 0.019). The dose received by the testicles was not changed by the use of a scrotal shield in the AP position (p ≤ 0.559), but was lowered by its use in the PA position (p ≤ 0.058). Use of the PA projection in SIJ imaging significantly lowers the dose received by the testes compared with the AP projection, without significant loss of image quality.

  13. PA positioning significantly reduces testicular dose during sacroiliac joint radiography

    International Nuclear Information System (INIS)

    Mekis, Nejc; Mc Entee, Mark F.; Stegnar, Peter

    2010-01-01

    Radiation dose to the testes in the antero-posterior (AP) and postero-anterior (PA) projection of the sacroiliac joint (SIJ) was measured with and without a scrotal shield. Entrance surface dose, the dose received by the testicles and the dose area product (DAP) were used. DAP measurements revealed that the dose received by the phantom in the PA position is 12.6% lower than in the AP position (p ≤ 0.009), with no statistically significant reduction in image quality (p ≤ 0.483). The dose received by the testes in the PA projection in SIJ imaging is 93.1% lower than in the AP projection when not using protection (p ≤ 0.020) and 94.9% lower with protection (p ≤ 0.019). The dose received by the testicles was not changed by the use of a scrotal shield in the AP position (p ≤ 0.559), but was lowered by its use in the PA position (p ≤ 0.058). Use of the PA projection in SIJ imaging significantly lowers the dose received by the testes compared with the AP projection, without significant loss of image quality.

  14. Significance of Computed Tomography in the Diagnosis of Cerebrovascular Accidents

    Directory of Open Access Journals (Sweden)

    Sumnima Acharya

    2014-06-01

    Full Text Available Introduction: Cerebrovascular Accident (CVA) is defined as the abrupt onset of a neurological deficit that is attributable to a focal vascular cause. CT scan is a widely available, affordable, non-invasive and relatively accurate investigation in patients with stroke and is important to identify stroke pathology and exclude mimics. The aim of this study is to establish the diagnostic significance of computed tomography in cerebrovascular accident and to differentiate between cerebral infarction and cerebral haemorrhage with CT for better management of CVA. Methods: A one-year observational cross-sectional study was conducted in 100 patients who presented at the department of radiodiagnosis from the emergency department or ward within the one-year study period with a clinical diagnosis of stroke, and who had a brain CT scan done within one to fourteen days of onset. Results: A total of 100 patients were studied; 66 were male and 34 were female, a male/female ratio of 1.9:1. The maximum number of cases (39%) was in the age group of 61-80 years. Among the 100 patients, 55 cases were clinically diagnosed as hemorrhagic stroke and 45 cases were clinically diagnosed with an infarct. Of the 55 hemorrhagic cases, two were diagnosed as both hemorrhage and infarct by CT scan, one had normal CT scan findings and one had a subdural haemorrhage. These four cases were excluded when comparing the clinical diagnosis with the CT scan findings. Among the 51 clinically diagnosed cases of hemorrhagic stroke, 32 (62.7%) were confirmed by CT scan as hemorrhagic stroke, and among the clinically diagnosed cases of infarct, 39 (86.7%) were confirmed by CT scan as infarct, which is statistically significant (p < 0.001). A significant agreement between clinical and CT diagnosis was observed, as indicated by a kappa value of 0.49. Sensitivity, specificity, positive predictive value and negative predictive value of clinical findings as compared to CT in diagnosing hemorrhage were 84.2%, 67.2%, 62.8% and 86

  15. The effect of ergonomic training and intervention on reducing occupational stress among computer users

    Directory of Open Access Journals (Sweden)

    T. Yektaee

    2014-05-01

    Result: According to covariance analysis, ergonomic training and interventions lead to a reduction in the occupational stress of computer users. Conclusion: Training computer users and informing them of ergonomic principles, as well as providing interventions such as correction of posture, reducing the duration of work time, and using an armrest and footrest, would have significant implications for reducing occupational stress among computer users.

  16. Reducing Computational Overhead of Network Coding with Intrinsic Information Conveying

    DEFF Research Database (Denmark)

    Heide, Janus; Zhang, Qi; Pedersen, Morten V.

    This paper investigated the possibility of intrinsic information conveying in network coding systems. The information is embedded into the coding vector by constructing the vector based on a set of predefined rules. This information can subsequently be retrieved by any receiver. The starting point is RLNC (Random Linear Network Coding), and the goal is to reduce the amount of coding operations both at the coding and decoding node, and at the same time remove the need for dedicated signaling messages. In a traditional RLNC system, coding operations take up significant computational resources and add … the coding operations must be performed in a particular way, which we introduce. Finally, we evaluate the suggested system and find that the amount of coding can be significantly reduced both at nodes that recode and decode.
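
    The toy sketch below (plain Python, not the authors' scheme) illustrates the general idea of embedding information in a coding vector through a predefined rule: coefficients are drawn at random over a small prime field, and the last one is fixed so that their sum modulo the field size equals the embedded symbol, which any receiver can read directly from the vector without decoding the payload. The field size and vector length are arbitrary choices for illustration.

    ```python
    # Toy illustration of conveying information intrinsically in an RLNC-style
    # coding vector via a predefined construction rule (not the published scheme).
    import random

    Q = 251  # prime field size chosen for this sketch; real systems often use GF(2^8)

    def make_coding_vector(length: int, embedded_symbol: int) -> list[int]:
        coeffs = [random.randrange(Q) for _ in range(length - 1)]
        coeffs.append((embedded_symbol - sum(coeffs)) % Q)  # enforce the predefined rule
        return coeffs

    def read_embedded_symbol(coeffs: list[int]) -> int:
        # Any receiver recovers the embedded symbol from the vector alone.
        return sum(coeffs) % Q

    vec = make_coding_vector(8, embedded_symbol=42)
    assert read_embedded_symbol(vec) == 42
    ```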

  17. The significance of computed tomography in optic neuropathy

    International Nuclear Information System (INIS)

    Awai, Tsugumi; Yasutake, Hirohide; Ono, Yoshiko; Kumagai, Kazuhisa; Kairada, Kensuke

    1981-01-01

    Computed tomography (CT scan) has become one of the important and useful modes of examination for ophthalmological and neuro-ophthalmological disorders. CT scans (EMI scans) were performed on 21 patients with optic neuropathy in order to detect the cause. Of these 21 patients, the CT scan was abnormal in six. These six patients were verified, histopathologically, as having chromophobe pituitary adenoma, craniopharyngioma, plasmocytoma arising from the sphenoidal sinus, optic nerve glioma and a giant aneurysm of the anterior communicating artery. The practical diagnostic value of the CT scan in optic neuropathy is discussed. (author)

  18. Clinical significance of computed tomographic arteriography for minute hepatocellular carcinoma

    Energy Technology Data Exchange (ETDEWEB)

    Itoh, H; Matsui, O; Suzuki, M; Ida, M; Kitagawa, K [Kanazawa Univ. (Japan). School of Medicine]

    1982-03-01

    Computed tomographic arteriography (CTA) can clearly demonstrate minute hepatocellular carcinoma (H.C.C.) of more than 2 cm in diameter as an enhanced mass lesion. In such cases the precise localization of the H.C.C. becomes so obvious that CTA plays an important role in evaluating resectability. However, H.C.C. between 1 cm and 2 cm in diameter, which is visualized with celiac and infusion hepatic angiography, is more difficult to detect, and H.C.C. of less than 1 cm in diameter can hardly be recognized, nor diagnosed as a malignant nodule, by CTA; therefore, for these sizes of H.C.C. the detectability of CTA is not superior to that of hepatic angiography.

  19. Clinical significance of adrenal computed tomography in Addison's disease

    International Nuclear Information System (INIS)

    Sun, Zhong-Hua; Nomura, Kaoru; Toraya, Shohzoh; Ujihara, Makoto; Horiba, Nobuo; Suda, Toshihiro; Tsushima, Toshio; Demura, Hiroshi; Kono, Atsushi

    1992-01-01

    Adrenal computed tomographic (CT) scanning was conducted in twelve patients with Addison's disease during the clinical course. In tuberculous Addison's disease (n=8), three of the four patients examined during the first two years after disease onset had bilaterally enlarged adrenals, while one of the four had a unilaterally enlarged gland. At least one adrenal gland was enlarged after onset in all six patients examined during the first four years. Thereafter, the adrenal glands were atrophied bilaterally, in contrast to the adrenal glands in idiopathic Addison's disease, which were atrophied bilaterally from disease onset (n=2). Adrenal calcification was a less sensitive clue in tracing pathogenesis; it was observed in five of the eight patients with tuberculous Addison's disease, but not in idiopathic patients. Thus, adrenal CT scanning can indicate the etiology of Addison's disease (infection or autoimmunity) and the phase of Addison's disease secondary to tuberculosis, which may be clinically important for initiating antituberculous treatment. (author)

  20. Computational toxicology: Its essential role in reducing drug attrition.

    Science.gov (United States)

    Naven, R T; Louise-May, S

    2015-12-01

    Predictive toxicology plays a critical role in reducing the failure rate of new drugs in pharmaceutical research and development. Despite recent gains in our understanding of drug-induced toxicity, however, it is urgent that the utility and limitations of our current predictive tools be determined in order to identify gaps in our understanding of mechanistic and chemical toxicology. Using recently published computational regression analyses of in vitro and in vivo toxicology data, it will be demonstrated that significant gaps remain in early safety screening paradigms. More strategic analyses of these data sets will allow for a better understanding of their domain of applicability and help identify those compounds that cause significant in vivo toxicity but which are currently mis-predicted by in silico and in vitro models. These 'outliers' and falsely predicted compounds are metaphorical lighthouses that shine light on existing toxicological knowledge gaps, and it is essential that these compounds are investigated if attrition is to be reduced significantly in the future. As such, the modern computational toxicologist is more productively engaged in understanding these gaps and driving investigative toxicology towards addressing them. © The Author(s) 2015.

  1. Reduced Calibration Curve for Proton Computed Tomography

    International Nuclear Information System (INIS)

    Yevseyeva, Olga; Assis, Joaquim de; Evseev, Ivan; Schelin, Hugo; Paschuk, Sergei; Milhoretto, Edney; Setti, Joao; Diaz, Katherin; Hormaza, Joel; Lopes, Ricardo

    2010-01-01

    pCT deals with relatively thick targets such as the human head or trunk. Thus, the fidelity of pCT as a tool for proton therapy planning depends on the accuracy of the physical formulas used for proton interaction with thick absorbers. Although the actual overall accuracy of the proton stopping power in the Bethe-Bloch domain is about 1%, analytical calculations and Monte Carlo simulations with codes like TRIM/SRIM, MCNPX and GEANT4 do not agree with each other. Attempts to validate the codes against experimental data for thick absorbers encounter some difficulties: only a few data sets are available, and the existing data sets have been acquired at different initial proton energies and for different absorber materials. In this work we compare the results of our Monte Carlo simulations with existing experimental data in terms of a reduced calibration curve, i.e., the range-energy dependence normalized on the range scale by the full projected CSDA range for a given initial proton energy in a given material, taken from the NIST PSTAR database, and on the final proton energy scale by the given initial energy of the protons. This approach is almost energy and material independent. The results of our analysis are important for pCT development because contradictions observed at arbitrarily low initial proton energies can now be easily scaled to typical pCT energies.
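
    A minimal sketch of the normalization described above, assuming hypothetical depth-energy data and a CSDA range value that would in practice be looked up in the NIST PSTAR database:

    ```python
    import numpy as np

    def reduced_calibration_curve(depth_mm, energy_mev, e0_mev, csda_range_mm):
        """Map a range-energy curve onto the reduced (dimensionless) scales:
        depth is normalized by the full projected CSDA range for the initial
        energy E0, and the residual proton energy is normalized by E0."""
        reduced_depth = np.asarray(depth_mm) / csda_range_mm
        reduced_energy = np.asarray(energy_mev) / e0_mev
        return reduced_depth, reduced_energy

    # Illustrative example: 200 MeV protons in water; the CSDA range of about
    # 260 mm and the residual energies below are rough, made-up numbers.
    depth = [0.0, 100.0, 200.0, 250.0]    # mm
    energy = [200.0, 130.0, 60.0, 20.0]   # MeV
    rd, re = reduced_calibration_curve(depth, energy, e0_mev=200.0, csda_range_mm=260.0)
    ```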

  2. A REDUCE program for symbolic computation of Puiseux expansions

    International Nuclear Information System (INIS)

    Gerdt, V.P.; Tiller, P.

    1991-01-01

    A program is described for the computation of Puiseux expansions of algebraic functions. The Newton diagram method is used to construct the initial coefficients of all the Puiseux series at a given point. The program is written in the computer algebra language REDUCE. Some illustrative examples are given. 20 refs

  3. Reduced order methods for modeling and computational reduction

    CERN Document Server

    Rozza, Gianluigi

    2014-01-01

    This monograph addresses the state of the art of reduced order methods for modeling and computational reduction of complex parametrized systems, governed by ordinary and/or partial differential equations, with a special emphasis on real time computing techniques and applications in computational mechanics, bioengineering and computer graphics.  Several topics are covered, including: design, optimization, and control theory in real-time with applications in engineering; data assimilation, geometry registration, and parameter estimation with special attention to real-time computing in biomedical engineering and computational physics; real-time visualization of physics-based simulations in computer science; the treatment of high-dimensional problems in state space, physical space, or parameter space; the interactions between different model reduction and dimensionality reduction approaches; the development of general error estimation frameworks which take into account both model and discretization effects. This...

  4. Computation of spatial significance of mountain objects extracted from multiscale digital elevation models

    International Nuclear Information System (INIS)

    Sathyamoorthy, Dinesh

    2014-01-01

    The derivation of spatial significance is an important aspect of geospatial analysis, and hence various methods have been proposed to compute the spatial significance of entities based on their spatial distances to other entities within a cluster. This paper is aimed at studying the spatial significance of mountain objects extracted from multiscale digital elevation models (DEMs). At each scale, the value of the spatial significance index (SSI) of a mountain object is the minimum number of morphological dilation iterations required to occupy all the other mountain objects in the terrain. The mountain object with the lowest value of SSI is the spatially most significant mountain object, indicating that it has the shortest distance to the other mountain objects. It is observed that as the area of the mountain objects decreases with increasing scale, the distances between the mountain objects increase, resulting in increasing values of SSI. The results obtained indicate that the strategic location of a mountain object at the centre of the terrain is more important than its size in determining its reach to other mountain objects and thus its spatial significance
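
    A minimal sketch of how the SSI described above could be computed for one mountain object, assuming the objects are available as binary masks on a common raster grid; the masks, the structuring element (default cross-shaped connectivity) and the iteration cap are assumptions of this sketch:

    ```python
    import numpy as np
    from scipy.ndimage import binary_dilation

    def spatial_significance_index(obj_mask, other_masks, max_iter=10000):
        """SSI of one mountain object: the minimum number of morphological
        dilation iterations needed for its footprint to cover every pixel of
        all other mountain objects."""
        grown = obj_mask.copy()
        target = np.logical_or.reduce(other_masks)   # union of the other objects
        for n in range(1, max_iter + 1):
            grown = binary_dilation(grown)
            if np.all(grown[target]):                # all other objects occupied
                return n
        return None  # not reached within the iteration cap
    ```

    The spatially most significant object at a given scale is then the one with the smallest returned value.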

  5. Sodium-Reduced Meat and Poultry Products Contain a Significant Amount of Potassium from Food Additives.

    Science.gov (United States)

    Parpia, Arti Sharma; Goldstein, Marc B; Arcand, JoAnne; Cho, France; L'Abbé, Mary R; Darling, Pauline B

    2018-05-01

    counterparts (mean difference [95% CI]: 486 [334-638]; P …). … additives appearing on the product label ingredient list did not significantly differ between the two groups. Potassium additives are frequently added to sodium-reduced MPPs in amounts that significantly contribute to the potassium load for patients with impaired renal handling of potassium caused by chronic kidney disease and certain medications. Patients requiring potassium restriction should be counseled to be cautious regarding the potassium content of sodium-reduced MPPs and encouraged to make food choices accordingly. Copyright © 2018 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.

  6. A pilot weight reduction program over one year significantly reduced DNA strand breaks in obese subjects

    Directory of Open Access Journals (Sweden)

    Karl-Heinz Wagner

    2015-05-01

    Conclusion: A sustainable lifestyle change under supervision, including physical activity and diet quality, over a period of one year was not only responsible for reducing body weight and BMI but also led to a significant reduction in all parameters of the comet assay. These results underline the importance of body weight reduction and highlight the positive changes in DNA stability.

  7. Significant decimal digits for energy representation on short-word computers

    International Nuclear Information System (INIS)

    Sartori, E.

    1989-01-01

    The general belief that single-precision floating point numbers always have at least seven significant decimal digits on short-word computers such as IBM machines is erroneous. However, seven significant digits are required for representing the energy variable in nuclear cross-section data sets containing sharp p-wave resonances at 0 Kelvin. It is suggested either that the energy variable be stored in double precision or that cross-section resonances be reconstructed to room temperature or higher on short-word computers
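
    The point can be illustrated numerically: near 2 MeV, adjacent single-precision values are about 0.1 eV apart, so resonance grid points spaced more finely than that collapse onto the same value, while double precision resolves them easily. The energies below are illustrative, not taken from a real cross-section library:

    ```python
    import numpy as np

    e = 2.0000123e6  # eV, an energy grid point near a sharp resonance (illustrative)

    print(np.spacing(np.float32(e)))   # ~0.125 eV: single-precision resolution here
    print(np.spacing(np.float64(e)))   # ~2e-10 eV: double-precision resolution

    # Two grid points 0.01 eV apart become indistinguishable in single precision:
    print(np.float32(2000012.30) == np.float32(2000012.31))   # True
    print(np.float64(2000012.30) == np.float64(2000012.31))   # False
    ```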

  8. The significance of reduced respiratory chain enzyme activities: clinical, biochemical and radiological associations.

    Science.gov (United States)

    Mordekar, S R; Guthrie, P; Bonham, J R; Olpin, S E; Hargreaves, I; Baxter, P S

    2006-03-01

    Mitochondrial diseases are an important group of neurometabolic disorders in children with varied clinical presentations and diagnosis that can be difficult to confirm. To report the significance of reduced respiratory chain enzyme (RCE) activity in muscle biopsy samples from children. Retrospective odds ratio was used to compare clinical and biochemical features, DNA studies, neuroimaging, and muscle biopsies in 18 children with and 48 without reduced RCE activity. Children with reduced RCE activity were significantly more likely to have consanguineous parents, to present with acute encephalopathy and lactic acidaemia and/or within the first year of life; to have an axonal neuropathy, CSF lactate >4 mmol/l; and/or to have signal change in the basal ganglia. There were positive associations with a maternal family history of possible mitochondrial cytopathy; a presentation with failure to thrive and lactic acidaemia, ragged red fibres, reduced fibroblast fatty acid oxidation and with an abnormal allopurinol loading test. There was no association with ophthalmic abnormalities, deafness, epilepsy or myopathy. The association of these clinical, biochemical and radiological features with reduced RCE activity suggests a possible causative link.

  9. Computational design of patterned interfaces using reduced order models

    International Nuclear Information System (INIS)

    Vattre, A.J.; Abdolrahim, N.; Kolluri, K.; Demkowicz, M.J.

    2014-01-01

    Patterning is a familiar approach for imparting novel functionalities to free surfaces. We extend the patterning paradigm to interfaces between crystalline solids. Many interfaces have non-uniform internal structures comprised of misfit dislocations, which in turn govern interface properties. We develop and validate a computational strategy for designing interfaces with controlled misfit dislocation patterns by tailoring interface crystallography and composition. Our approach relies on a novel method for predicting the internal structure of interfaces: rather than obtaining it from resource-intensive atomistic simulations, we compute it using an efficient reduced order model based on anisotropic elasticity theory. Moreover, our strategy incorporates interface synthesis as a constraint on the design process. As an illustration, we apply our approach to the design of interfaces with rapid, 1-D point defect diffusion. Patterned interfaces may be integrated into the microstructure of composite materials, markedly improving performance. (authors)

  10. Significance of a postenhancement computed tomography findings in liver cirrhosis: In view of hemodynamics

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Suck Hong; Kim, Byung Soo [Pusan National University College of Medicine, Pusan (Korea, Republic of)

    1985-04-15

    We observed a significant sign in postenhancement computed tomography of liver cirrhosis, namely visualization of the portal venous branches. During postenhancement computed tomography scanning of the liver, the portal vein cannot be identified within the liver parenchyma in 84% of patients without known cirrhosis (including chronic active hepatitis). The two conditions share the same hemodynamic changes in that there is diffuse fibrosis and a resultant decrease in the vascular bed. Visualization of intrahepatic portal branches on postenhancement computed tomography is attributable to decreased diffusion ability and portal hypertension.

  11. Fixed-point image orthorectification algorithms for reduced computational cost

    Science.gov (United States)

    French, Joseph Clinton

    Imaging systems have been applied to many new applications in recent years. With the advent of low-cost, low-power focal planes and more powerful, lower-cost computers, remote sensing applications have become more widespread. Many of these applications require some form of geolocation, especially when relative distances are desired. However, when greater global positional accuracy is needed, orthorectification becomes necessary. Orthorectification is the process of projecting an image onto a Digital Elevation Map (DEM), which removes terrain distortions and corrects the perspective distortion by changing the viewing angle to be perpendicular to the projection plane. Orthorectification is used in disaster tracking, landscape management, wildlife monitoring and many other applications. However, orthorectification is a computationally expensive process due to the floating point operations and divisions in the algorithm. To reduce the computational cost of on-board processing, two novel algorithm modifications are proposed. One modification is projection using fixed-point arithmetic. Fixed-point arithmetic removes the floating point operations and reduces the processing time by operating only on integers. The second modification is replacement of the division inherent in projection with a multiplication by the inverse. The inverse must otherwise be computed iteratively; therefore, the inverse is replaced with a linear approximation. As a result of these modifications, the processing time of projection is reduced by a factor of 1.3x, with an average pixel position error of 0.2% of a pixel size, for 128-bit integer processing, and by over 4x, with an average pixel position error of less than 13% of a pixel size, for 64-bit integer processing. A secondary inverse function approximation is also developed that replaces the linear approximation with a quadratic. The quadratic approximation produces a more accurate approximation of the inverse, allowing for an integer multiplication calculation
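
    A toy sketch (not the dissertation's code) of the two modifications described: Q16.16 fixed-point arithmetic, and a division replaced by multiplication with a reciprocal that is linearly approximated around a reference value. The format, reference point and numbers are arbitrary illustrations:

    ```python
    # Q16.16 fixed-point: 16 integer bits, 16 fractional bits.
    FRAC_BITS = 16
    ONE = 1 << FRAC_BITS

    def to_fixed(x: float) -> int:
        return int(round(x * ONE))

    def fixed_mul(a: int, b: int) -> int:
        return (a * b) >> FRAC_BITS

    def approx_recip(d: int, d0: int) -> int:
        """Linear approximation of 1/d around a reference d0 (both Q16.16):
        1/d ~= 2/d0 - d/d0^2. The 1/d0 term would be precomputed offline."""
        inv_d0 = ONE * ONE // d0
        return 2 * inv_d0 - fixed_mul(fixed_mul(inv_d0, inv_d0), d)

    # Example: 3.0 / 1.97 via multiplication by the approximated reciprocal,
    # linearized at d0 = 2.0.
    num, den, d0 = to_fixed(3.0), to_fixed(1.97), to_fixed(2.0)
    print(fixed_mul(num, approx_recip(den, d0)) / ONE)   # ~1.5225 (exact: 1.5228...)
    ```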

  12. Male circumcision significantly reduces prevalence and load of genital anaerobic bacteria.

    Science.gov (United States)

    Liu, Cindy M; Hungate, Bruce A; Tobian, Aaron A R; Serwadda, David; Ravel, Jacques; Lester, Richard; Kigozi, Godfrey; Aziz, Maliha; Galiwango, Ronald M; Nalugoda, Fred; Contente-Cuomo, Tania L; Wawer, Maria J; Keim, Paul; Gray, Ronald H; Price, Lance B

    2013-04-16

    Male circumcision reduces female-to-male HIV transmission. Hypothesized mechanisms for this protective effect include decreased HIV target cell recruitment and activation due to changes in the penis microbiome. We compared the coronal sulcus microbiota of men from a group of uncircumcised controls (n = 77) and from a circumcised intervention group (n = 79) at enrollment and year 1 follow-up in a randomized circumcision trial in Rakai, Uganda. We characterized microbiota using 16S rRNA gene-based quantitative PCR (qPCR) and pyrosequencing, log response ratio (LRR), Bayesian classification, nonmetric multidimensional scaling (nMDS), and permutational multivariate analysis of variance (PerMANOVA). At baseline, men in both study arms had comparable coronal sulcus microbiota; however, by year 1, circumcision decreased the total bacterial load and reduced microbiota biodiversity. Specifically, the prevalence and absolute abundance of 12 anaerobic bacterial taxa decreased significantly in the circumcised men. While aerobic bacterial taxa also increased postcircumcision, these gains were minor. The reduction in anaerobes may partly account for the effects of circumcision on reduced HIV acquisition. The bacterial changes identified in this study may play an important role in the HIV risk reduction conferred by male circumcision. Decreasing the load of specific anaerobes could reduce HIV target cell recruitment to the foreskin. Understanding the mechanisms that underlie the benefits of male circumcision could help to identify new intervention strategies for decreasing HIV transmission, applicable to populations with high HIV prevalence where male circumcision is culturally less acceptable.

  13. A chimpanzee recognizes synthetic speech with significantly reduced acoustic cues to phonetic content.

    Science.gov (United States)

    Heimbauer, Lisa A; Beran, Michael J; Owren, Michael J

    2011-07-26

    A long-standing debate concerns whether humans are specialized for speech perception, which some researchers argue is demonstrated by the ability to understand synthetic speech with significantly reduced acoustic cues to phonetic content. We tested a chimpanzee (Pan troglodytes) that recognizes 128 spoken words, asking whether she could understand such speech. Three experiments presented 48 individual words, with the animal selecting a corresponding visuographic symbol from among four alternatives. Experiment 1 tested spectrally reduced, noise-vocoded (NV) synthesis, originally developed to simulate input received by human cochlear-implant users. Experiment 2 tested "impossibly unspeechlike" sine-wave (SW) synthesis, which reduces speech to just three moving tones. Although receiving only intermittent and noncontingent reward, the chimpanzee performed well above chance level, including when hearing synthetic versions for the first time. Recognition of SW words was least accurate but improved in experiment 3 when natural words in the same session were rewarded. The chimpanzee was more accurate with NV than SW versions, as were 32 human participants hearing these items. The chimpanzee's ability to spontaneously recognize acoustically reduced synthetic words suggests that experience rather than specialization is critical for speech-perception capabilities that some have suggested are uniquely human. Copyright © 2011 Elsevier Ltd. All rights reserved.

  14. Defibrillator charging before rhythm analysis significantly reduces hands-off time during resuscitation

    DEFF Research Database (Denmark)

    Hansen, L. K.; Folkestad, L.; Brabrand, M.

    2013-01-01

    BACKGROUND: Our objective was to reduce hands-off time during cardiopulmonary resuscitation, as increased hands-off time leads to higher mortality. METHODS: The European Resuscitation Council (ERC) 2005 and ERC 2010 guidelines were compared with an alternative sequence (ALT). Pulseless ventricular … physicians were included. All had prior experience in advanced life support. Interruptions of chest compressions were shorter using ALT (mean, 6.7 vs 13.0 seconds). Analyzing data for ventricular tachycardia scenarios only, hands-off time was shorter using ALT (mean, 7.1 vs 18.2 seconds). In ERC 2010 vs ALT, 12 … physicians were included. Two physicians had no prior experience in advanced life support. Hands-off time was reduced using ALT (mean, 3.9 vs 5.6 seconds). Looking solely at ventricular tachycardia scenarios, hands-off time was shortened using ALT (mean, 4.5 vs 7.6 seconds). No significant reduction …

  15. Reduced content of chloroatranol and atranol in oak moss absolute significantly reduces the elicitation potential of this fragrance material.

    Science.gov (United States)

    Andersen, Flemming; Andersen, Kirsten H; Bernois, Armand; Brault, Christophe; Bruze, Magnus; Eudes, Hervé; Gadras, Catherine; Signoret, Anne-Cécile J; Mose, Kristian F; Müller, Boris P; Toulemonde, Bernard; Andersen, Klaus Ejner

    2015-02-01

    Oak moss absolute, an extract from the lichen Evernia prunastri, is a valued perfume ingredient but contains extreme allergens. To compare the elicitation properties of two preparations of oak moss absolute: 'classic oak moss', the historically used preparation, and 'new oak moss', with reduced contents of the major allergens atranol and chloroatranol. The two preparations were compared in randomized double-blinded repeated open application tests and serial dilution patch tests in 30 oak moss-sensitive volunteers and 30 non-allergic control subjects. In both test models, new oak moss elicited significantly less allergic contact dermatitis in oak moss-sensitive subjects than classic oak moss. The control subjects did not react to either of the preparations. New oak moss is still a fragrance allergen, but elicits less allergic contact dermatitis in previously oak moss-sensitized individuals, suggesting that new oak moss is less allergenic to non-sensitized individuals. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  16. Four-phonon scattering significantly reduces intrinsic thermal conductivity of solids

    Science.gov (United States)

    Feng, Tianli; Lindsay, Lucas; Ruan, Xiulin

    2017-10-01

    For decades, the three-phonon scattering process has been considered to govern thermal transport in solids, while the role of higher-order four-phonon scattering has been persistently unclear and so ignored. However, recent quantitative calculations of three-phonon scattering have often shown a significant overestimation of thermal conductivity as compared to experimental values. In this Rapid Communication we show that four-phonon scattering is generally important in solids and can remedy such discrepancies. For silicon and diamond, the predicted thermal conductivity is reduced by 30% at 1000 K after including four-phonon scattering, bringing predictions in excellent agreement with measurements. For the projected ultrahigh-thermal conductivity material, zinc-blende BAs, a competitor of diamond as a heat sink material, four-phonon scattering is found to be strikingly strong as three-phonon processes have an extremely limited phase space for scattering. The four-phonon scattering reduces the predicted thermal conductivity from 2200 to 1400 W/m K at room temperature. The reduction at 1000 K is 60%. We also find that optical phonon scattering rates are largely affected, being important in applications such as phonon bottlenecks in equilibrating electronic excitations. Recognizing that four-phonon scattering is expensive to calculate, in the end we provide some guidelines on how to quickly assess the significance of four-phonon scattering, based on energy surface anharmonicity and the scattering phase space. Our work clears the decades-long fundamental question of the significance of higher-order scattering, and points out ways to improve thermoelectrics, thermal barrier coatings, nuclear materials, and radiative heat transfer.

  17. Incorporation of catalytic dehydrogenation into fischer-tropsch synthesis to significantly reduce carbon dioxide emissions

    Science.gov (United States)

    Huffman, Gerald P.

    2012-11-13

    A new method of producing liquid transportation fuels from coal and other hydrocarbons that significantly reduces carbon dioxide emissions by combining Fischer-Tropsch synthesis with catalytic dehydrogenation is claimed. Catalytic dehydrogenation (CDH) of the gaseous products (C1-C4) of Fischer-Tropsch synthesis (FTS) can produce large quantities of hydrogen while converting the carbon to multi-walled carbon nanotubes (MWCNT). Incorporation of CDH into a FTS-CDH plant converting coal to liquid fuels can eliminate all or most of the CO2 emissions from the water-gas shift (WGS) reaction that is currently used to elevate the H2 level of coal-derived syngas for FTS. Additionally, the FTS-CDH process saves large amounts of water used by the WGS reaction and produces a valuable by-product, MWCNT.

  18. Nano-CL-20/HMX Cocrystal Explosive for Significantly Reduced Mechanical Sensitivity

    Directory of Open Access Journals (Sweden)

    Chongwei An

    2017-01-01

    Full Text Available A spray drying method was used to prepare cocrystals of hexanitrohexaazaisowurtzitane (CL-20) and cyclotetramethylene tetranitramine (HMX). The raw materials and cocrystals were characterized using scanning electron microscopy, X-ray diffraction, differential scanning calorimetry, Raman spectroscopy, and Fourier transform infrared spectroscopy. The impact and friction sensitivity of the cocrystals were tested and analyzed. Results show that, after preparation by the spray drying method, the microparticles were spherical in shape and 0.5–5 µm in size. The particles formed aggregates of numerous tiny plate-like cocrystals, with the CL-20/HMX cocrystals having thicknesses below 100 nm. The cocrystals were formed by C–H⋯O bonding between –NO2 (CL-20) and –CH2– (HMX). The nanococrystal explosive exhibited a drop height of 47.3 cm and an explosion probability of 64% in friction testing. Compared with raw HMX, the cocrystals displayed significantly reduced mechanical sensitivity.

  19. Implementation of standardized follow-up care significantly reduces peritonitis in children on chronic peritoneal dialysis.

    Science.gov (United States)

    Neu, Alicia M; Richardson, Troy; Lawlor, John; Stuart, Jayne; Newland, Jason; McAfee, Nancy; Warady, Bradley A

    2016-06-01

    The Standardizing Care to improve Outcomes in Pediatric End stage renal disease (SCOPE) Collaborative aims to reduce peritonitis rates in pediatric chronic peritoneal dialysis patients by increasing implementation of standardized care practices. To assess this, monthly care bundle compliance and annualized monthly peritonitis rates were evaluated from 24 SCOPE centers that were participating at collaborative launch and that provided peritonitis rates for the 13 months prior to launch. Changes in bundle compliance were assessed using either a logistic regression model or a generalized linear mixed model. Changes in average annualized peritonitis rates over time were illustrated using the latter model. In the first 36 months of the collaborative, 644 patients with 7977 follow-up encounters were included. The likelihood of compliance with follow-up care practices increased significantly (odds ratio 1.15, 95% confidence interval 1.10, 1.19). Mean monthly peritonitis rates significantly decreased from 0.63 episodes per patient year (95% confidence interval 0.43, 0.92) prelaunch to 0.42 (95% confidence interval 0.31, 0.57) at 36 months postlaunch. A sensitivity analysis confirmed that as mean follow-up compliance increased, peritonitis rates decreased, reaching statistical significance at 80% at which point the prelaunch rate was 42% higher than the rate in the months following achievement of 80% compliance. In its first 3 years, the SCOPE Collaborative has increased the implementation of standardized follow-up care and demonstrated a significant reduction in average monthly peritonitis rates. Copyright © 2016 International Society of Nephrology. Published by Elsevier Inc. All rights reserved.

  20. Intensity-modulated radiotherapy significantly reduces xerostomia compared with conventional radiotherapy

    International Nuclear Information System (INIS)

    Braam, Petra M.; Terhaard, Chris H.J. M.D.; Roesink, Judith M.; Raaijmakers, Cornelis P.J.

    2006-01-01

    Purpose: Xerostomia is a severe complication after radiotherapy for oropharyngeal cancer, as the salivary glands are in close proximity with the primary tumor. Intensity-modulated radiotherapy (IMRT) offers theoretical advantages for normal tissue sparing. A Phase II study was conducted to determine the value of IMRT for salivary output preservation compared with conventional radiotherapy (CRT). Methods and Materials: A total of 56 patients with oropharyngeal cancer were prospectively evaluated. Of these, 30 patients were treated with IMRT and 26 with CRT. Stimulated parotid salivary flow was measured before, 6 weeks, and 6 months after treatment. A complication was defined as a stimulated parotid flow rate <25% of the preradiotherapy flow rate. Results: The mean dose to the parotid glands was 48.1 Gy (SD 14 Gy) for CRT and 33.7 Gy (SD 10 Gy) for IMRT (p < 0.005). The mean parotid flow ratio 6 weeks and 6 months after treatment was respectively 41% and 64% for IMRT and respectively 11% and 18% for CRT. As a result, 6 weeks after treatment, the number of parotid flow complications was significantly lower after IMRT (55%) than after CRT (87%) (p = 0.002). The number of complications 6 months after treatment was 56% for IMRT and 81% for CRT (p = 0.04). Conclusions: IMRT significantly reduces the number of parotid flow complications for patients with oropharyngeal cancer

  1. Induction-heating MOCVD reactor with significantly improved heating efficiency and reduced harmful magnetic coupling

    KAUST Repository

    Li, Kuang-Hui; Alotaibi, Hamad S.; Sun, Haiding; Lin, Ronghui; Guo, Wenzhe; Torres-Castanedo, Carlos G.; Liu, Kaikai; Galan, Sergio V.; Li, Xiaohang

    2018-01-01

    In a conventional induction-heating III-nitride metalorganic chemical vapor deposition (MOCVD) reactor, the induction coil is outside the chamber. Therefore, the magnetic field does not couple well with the susceptor, leading to compromised heating efficiency and harmful coupling with the gas inlet and thus possible overheating. Hence, the gas inlet has to be at a minimum distance away from the susceptor. Because of the elongated flow path, premature reactions can be more severe, particularly between Al- and B-containing precursors and NH3. Here, we propose a structure that can significantly improve the heating efficiency and allow the gas inlet to be closer to the susceptor. Specifically, the induction coil is designed to surround the vertical cylinder of a T-shaped susceptor comprising the cylinder and a top horizontal plate holding the wafer substrate within the reactor. Therefore, the cylinder couples most of the magnetic field and serves as the thermal source for the plate. Furthermore, the plate can block and thus significantly reduce the uncoupled magnetic field above the susceptor, thereby allowing the gas inlet to be closer. The results show an approximately 140% increase in the heating efficiency and a 2.6-fold increase in the susceptor coupling efficiency, as well as a 90% reduction in the harmful magnetic flux on the gas inlet.

  2. Induction-heating MOCVD reactor with significantly improved heating efficiency and reduced harmful magnetic coupling

    KAUST Repository

    Li, Kuang-Hui

    2018-02-23

    In a conventional induction-heating III-nitride metalorganic chemical vapor deposition (MOCVD) reactor, the induction coil is outside the chamber. Therefore, the magnetic field does not couple well with the susceptor, leading to compromised heating efficiency and harmful coupling with the gas inlet and thus possible overheating. Hence, the gas inlet has to be at a minimum distance away from the susceptor. Because of the elongated flow path, premature reactions can be more severe, particularly between Al- and B-containing precursors and NH3. Here, we propose a structure that can significantly improve the heating efficiency and allow the gas inlet to be closer to the susceptor. Specifically, the induction coil is designed to surround the vertical cylinder of a T-shaped susceptor comprising the cylinder and a top horizontal plate holding the wafer substrate within the reactor. Therefore, the cylinder couples most of the magnetic field and serves as the thermal source for the plate. Furthermore, the plate can block and thus significantly reduce the uncoupled magnetic field above the susceptor, thereby allowing the gas inlet to be closer. The results show an approximately 140% increase in the heating efficiency and a 2.6-fold increase in the susceptor coupling efficiency, as well as a 90% reduction in the harmful magnetic flux on the gas inlet.

  3. MRPack: Multi-Algorithm Execution Using Compute-Intensive Approach in MapReduce

    Science.gov (United States)

    2015-01-01

    Large quantities of data have been generated from multiple sources at exponential rates in the last few years. These data are generated at high velocity, as real-time and streaming data, in a variety of formats. These characteristics give rise to challenges in their modeling, computation, and processing. Hadoop MapReduce (MR) is a well-known data-intensive distributed processing framework that uses the distributed file system (DFS) for Big Data. Current implementations of MR only support execution of a single algorithm in the entire Hadoop cluster. In this paper, we propose MapReducePack (MRPack), a variation of MR that supports execution of a set of related algorithms in a single MR job. We exploit the computational capability of a cluster by increasing the compute-intensiveness of MapReduce while maintaining its data-intensive approach. It uses the available computing resources by dynamically managing the task assignment and intermediate data. Intermediate data from multiple algorithms are managed using multi-key and skew-mitigation strategies. The performance study of the proposed system shows that it is time, I/O, and memory efficient compared to the default MapReduce. The proposed approach reduces the execution time by 200% with an approximately 50% decrease in I/O cost. Complexity and qualitative results analysis shows significant performance improvement. PMID:26305223
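
    A toy sketch (plain Python, not the MRPack implementation) of the multi-key idea: a single map phase applies several related algorithms to each record and tags the intermediate key with an algorithm id, so one job yields per-algorithm results. The algorithms and records here are hypothetical:

    ```python
    from collections import defaultdict

    # Two related "algorithms" sharing one pass over the data.
    ALGORITHMS = {
        "wordcount": lambda rec: [(w, 1) for w in rec.split()],
        "charcount": lambda rec: [("chars", len(rec))],
    }

    def map_phase(records):
        for rec in records:
            for alg, fn in ALGORITHMS.items():
                for key, value in fn(rec):
                    yield (alg, key), value   # composite (algorithm, key) intermediate key

    def reduce_phase(pairs):
        grouped = defaultdict(list)
        for composite_key, value in pairs:
            grouped[composite_key].append(value)
        return {k: sum(v) for k, v in grouped.items()}

    print(reduce_phase(map_phase(["to be or not to be"])))
    ```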

  4. Reduced frontal and occipital lobe asymmetry on the CT-scans of schizophrenic patients. Its specificity and clinical significance

    International Nuclear Information System (INIS)

    Falkai, P.; Schneider, T.; Greve, B.; Klieser, E.; Bogerts, B.

    1995-01-01

    Frontal and occipital lobe widths were determined on the computed tomographic (CT) scans of 135 schizophrenic patients, 158 neuropsychiatrically healthy subjects and 102 psychiatric control subjects, including patients with affective psychosis, neurosis and schizoaffective psychosis. Most healthy right-handed subjects demonstrate a relative enlargement of the right frontal as well as the left occipital lobe compared with the opposite hemisphere. These normal frontal and occipital lobe asymmetries were selectively reduced in schizophrenics (frontal: 5%, p < .0005; occipital: 3%, p < .05), irrespective of the pathophysiological subgroup. Schizophrenic neuroleptic non-responders showed a significant reduction of frontal lobe asymmetry (3%, p < .05), while no correlation between BPRS subscores and disturbed cerebral laterality could be detected. In sum, the present study demonstrates disturbed cerebral lateralisation in schizophrenic patients, supporting the hypothesis of interrupted early brain development in schizophrenia. (author)

  5. Using lytic bacteriophages to eliminate or significantly reduce contamination of food by foodborne bacterial pathogens.

    Science.gov (United States)

    Sulakvelidze, Alexander

    2013-10-01

    Bacteriophages (also called 'phages') are viruses that kill bacteria. They are arguably the oldest (3 billion years old, by some estimates) and most ubiquitous (total number estimated to be 10^30–10^32) known organisms on Earth. Phages play a key role in maintaining microbial balance in every ecosystem where bacteria exist, and they are part of the normal microflora of all fresh, unprocessed foods. Interest in various practical applications of bacteriophages has been gaining momentum recently, with perhaps the most attention focused on using them to improve food safety. That approach, called 'phage biocontrol', typically includes three main types of applications: (i) using phages to treat domesticated livestock in order to reduce their intestinal colonization with, and shedding of, specific bacterial pathogens; (ii) treatments for decontaminating inanimate surfaces in food-processing facilities and other food establishments, so that foods processed on those surfaces are not cross-contaminated with the targeted pathogens; and (iii) post-harvest treatments involving direct applications of phages onto the harvested foods. This mini-review primarily focuses on the last type of intervention, which has been gaining the most momentum recently. Indeed, the results of recent studies dealing with improving food safety, and several recent regulatory approvals of various commercial phage preparations developed for post-harvest food safety applications, strongly support the idea that lytic phages may provide a safe, environmentally-friendly, and effective approach for significantly reducing contamination of various foods with foodborne bacterial pathogens. However, some important technical and nontechnical problems may need to be addressed before phage biocontrol protocols can become an integral part of routine food safety intervention strategies implemented by food industries in the USA. © 2013 Society of Chemical Industry.

  6. Pharmacological kynurenine 3-monooxygenase enzyme inhibition significantly reduces neuropathic pain in a rat model.

    Science.gov (United States)

    Rojewska, Ewelina; Piotrowska, Anna; Makuch, Wioletta; Przewlocka, Barbara; Mika, Joanna

    2016-03-01

    Recent studies have highlighted the involvement of the kynurenine pathway in the pathology of neurodegenerative diseases, but the role of this system in neuropathic pain requires further extensive research. Therefore, the aim of our study was to examine the role of kynurenine 3-monooxygenase (Kmo), an enzyme that is important in this pathway, in a rat model of neuropathy after chronic constriction injury (CCI) to the sciatic nerve. For the first time, we demonstrated that the injury-induced increase in the Kmo mRNA levels in the spinal cord and the dorsal root ganglia (DRG) was reduced by chronic administration of the microglial inhibitor minocycline and that this effect paralleled a decrease in the intensity of neuropathy. Further, minocycline administration alleviated the lipopolysaccharide (LPS)-induced upregulation of Kmo mRNA expression in microglial cell cultures. Moreover, we demonstrated that not only indirect inhibition of Kmo using minocycline but also direct inhibition using Kmo inhibitors (Ro61-6048 and JM6) decreased neuropathic pain intensity on the third and the seventh days after CCI. Chronic Ro61-6048 administration diminished the protein levels of IBA-1, IL-6, IL-1beta and NOS2 in the spinal cord and/or the DRG. Both Kmo inhibitors potentiated the analgesic properties of morphine. In summary, our data suggest that in a neuropathic pain model, inhibiting Kmo function significantly reduces pain symptoms and enhances the effectiveness of morphine. The results of our studies show that the kynurenine pathway is an important mediator of neuropathic pain pathology and indicate that Kmo represents a novel pharmacological target for the treatment of neuropathy. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Quantized Average Consensus on Gossip Digraphs with Reduced Computation

    Science.gov (United States)

    Cai, Kai; Ishii, Hideaki

    The authors have recently proposed a class of randomized gossip algorithms which solve the distributed averaging problem on directed graphs, with the constraint that each node has an integer-valued state. The essence of this algorithm is to maintain local records, called “surplus”, of individual state updates, thereby achieving quantized average consensus even though the state sum of all nodes is not preserved. In this paper we study a modified version of this algorithm, whose feature is primarily in reducing both computation and communication effort. Concretely, each node needs to update fewer local variables, and can transmit its surplus using only one bit. Under this modified algorithm we prove that reaching the average is ensured for arbitrary strongly connected graphs. The condition of arbitrary strong connection is less restrictive than those known in the literature for either real-valued or quantized states; in particular, it does not require the network to have the special structure known as being balanced. Finally, we provide numerical examples to illustrate the convergence result, with emphasis on convergence time analysis.

  8. A computer language for reducing activation analysis data

    International Nuclear Information System (INIS)

    Friedman, M.H.; Tanner, J.T.

    1978-01-01

    A program, written in FORTRAN, which defines a language for reducing activation analysis data is described. An attempt was made to optimize the choice of commands and their definitions so as to concisely express what should be done, make the language natural to use and easy to learn, arrange a system of checks to guard against communication errors and have the language be inclusive. Communications are effected through commands, and these can be given in almost any order. Consistency checks are done and diagnostic messages are printed automatically to guard against the incorrect use of commands. Default options on the commands allow instructions to be expressed concisely while providing a capability to specify details for the data reduction process. The program has been implemented on a UNIVAC 1108 computer. A complete description of the commands, the algorithms used, and the internal consistency checks used is given elsewhere. The applications of the program and the methods for obtaining data automatically have already been described. (T.G.)
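
    The command-driven design summarized above can be illustrated with a small modern sketch (not the original FORTRAN/UNIVAC implementation): commands may be given in almost any order, each command carries default options, and a consistency check prints a diagnostic before any reduction runs. The command names and defaults below are invented for the example.

```python
# Illustrative command interpreter with defaults and consistency checks;
# command names, options and ordering are invented for this sketch.
DEFAULTS = {"READ": {"format": "ascii"},
            "CALIBRATE": {"source": "standard"},
            "REPORT": {"units": "ppm"}}

def run(commands):
    seen = {}
    for name, options in commands:                   # commands accepted in any order
        if name not in DEFAULTS:
            raise ValueError(f"diagnostic: unknown command {name!r}")
        seen[name] = {**DEFAULTS[name], **options}   # defaults keep commands concise
    # consistency check: a report without data to reduce is a user error
    if "REPORT" in seen and "READ" not in seen:
        raise ValueError("diagnostic: REPORT requested but no READ command given")
    for name in ("READ", "CALIBRATE", "REPORT"):     # fixed execution order
        if name in seen:
            print(f"executing {name} with {seen[name]}")

run([("REPORT", {}), ("READ", {"format": "binary"})])
```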

  9. Intriguing model significantly reduces boarding of psychiatric patients, need for inpatient hospitalization.

    Science.gov (United States)

    2015-01-01

    As new approaches to the care of psychiatric emergencies emerge, one solution is gaining particular traction. Under the Alameda model, which has been put into practice in Alameda County, CA, patients who are brought to regional EDs with emergency psychiatric issues are quickly transferred to a designated emergency psychiatric facility as soon as they are medically stabilized. This alleviates boarding problems in area EDs while also quickly connecting patients with specialized care. With data in hand on the model's effectiveness, developers believe the approach could alleviate boarding problems in other communities as well. The model is funded through a billing code established by California's Medicaid program for crisis stabilization services. Currently, only 22% of the patients brought to the emergency psychiatric facility ultimately need to be hospitalized; the other 78% are able to go home or to an alternative situation. In a 30-day study of the model, involving five community hospitals in Alameda County, CA, researchers found that ED boarding times were as much as 80% lower than comparable ED averages, and that patients were stabilized at least 75% of the time, significantly reducing the need for inpatient hospitalization.

  10. Significantly reduced hypoxemic events in morbidly obese patients undergoing gastrointestinal endoscopy: Predictors and practice effect

    Directory of Open Access Journals (Sweden)

    Basavana Gouda Goudra

    2014-01-01

    Background: Providing anesthesia for gastrointestinal (GI) endoscopy procedures in morbidly obese patients is a challenge for a variety of reasons. The negative impact of obesity on the respiratory system combined with a need to share the upper airway and necessity to preserve the spontaneous ventilation, together add to difficulties. Materials and Methods: This retrospective cohort study included patients with a body mass index (BMI) >40 kg/m² that underwent out-patient GI endoscopy between September 2010 and February 2011. Patient data was analyzed for procedure, airway management technique as well as hypoxemic and cardiovascular events. Results: A total of 119 patients met the inclusion criteria. Our innovative airway management technique resulted in a lower rate of intraoperative hypoxemic events compared with any published data available. Frequency of desaturation episodes showed statistically significant relation to previous history of obstructive sleep apnea (OSA). These desaturation episodes were found to be statistically independent of increasing BMI of patients. Conclusion: Pre-operative history of OSA irrespective of associated BMI values can be potentially used as a predictor of intra-procedural desaturation. With suitable modification of anesthesia technique, it is possible to reduce the incidence of adverse respiratory events in morbidly obese patients undergoing GI endoscopy procedures, thereby avoiding the need for endotracheal intubation.

  11. Significantly reduced hypoxemic events in morbidly obese patients undergoing gastrointestinal endoscopy: Predictors and practice effect.

    Science.gov (United States)

    Goudra, Basavana Gouda; Singh, Preet Mohinder; Penugonda, Lakshmi C; Speck, Rebecca M; Sinha, Ashish C

    2014-01-01

    Providing anesthesia for gastrointestinal (GI) endoscopy procedures in morbidly obese patients is a challenge for a variety of reasons. The negative impact of obesity on the respiratory system combined with a need to share the upper airway and necessity to preserve the spontaneous ventilation, together add to difficulties. This retrospective cohort study included patients with a body mass index (BMI) >40 kg/m² that underwent out-patient GI endoscopy between September 2010 and February 2011. Patient data was analyzed for procedure, airway management technique as well as hypoxemic and cardiovascular events. A total of 119 patients met the inclusion criteria. Our innovative airway management technique resulted in a lower rate of intraoperative hypoxemic events compared with any published data available. Frequency of desaturation episodes showed statistically significant relation to previous history of obstructive sleep apnea (OSA). These desaturation episodes were found to be statistically independent of increasing BMI of patients. Pre-operative history of OSA irrespective of associated BMI values can be potentially used as a predictor of intra-procedural desaturation. With suitable modification of anesthesia technique, it is possible to reduce the incidence of adverse respiratory events in morbidly obese patients undergoing GI endoscopy procedures, thereby avoiding the need for endotracheal intubation.

  12. Reduced-Complexity Direction of Arrival Estimation Using Real-Valued Computation with Arbitrary Array Configurations

    Directory of Open Access Journals (Sweden)

    Feng-Gang Yan

    2018-01-01

    A low-complexity algorithm is presented to dramatically reduce the complexity of the multiple signal classification (MUSIC) algorithm for direction of arrival (DOA) estimation, in which both tasks of eigenvalue decomposition (EVD) and spectral search are implemented with efficient real-valued computations, leading to about 75% complexity reduction as compared to the standard MUSIC. Furthermore, the proposed technique has no dependence on array configurations and is hence suitable for arbitrary array geometries, which shows a significant implementation advantage over most state-of-the-art unitary estimators including unitary MUSIC (U-MUSIC). Numerical simulations over a wide range of scenarios are conducted to show the performance of the new technique, which demonstrates that with a significantly reduced computational complexity, the new approach is able to provide accuracy close to that of the standard MUSIC.
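
    For context, the complex-valued MUSIC pseudospectrum that the above algorithm accelerates can be sketched as follows for a uniform linear array. The real-valued EVD and real-valued spectral search that yield the reported ~75% complexity reduction are the paper's contribution and are not reproduced here; this is only the baseline estimator, with simulation parameters chosen arbitrarily.

```python
# Baseline (complex-valued) MUSIC pseudospectrum for a uniform linear array.
import numpy as np

def music_spectrum(X, n_sources, d=0.5, angles=np.linspace(-90, 90, 361)):
    """X: M x N matrix of array snapshots; d: element spacing in wavelengths."""
    M, N = X.shape
    R = X @ X.conj().T / N                      # sample covariance matrix
    _, eigvecs = np.linalg.eigh(R)              # eigenvalues in ascending order
    En = eigvecs[:, :M - n_sources]             # noise subspace (smallest eigenvalues)
    P = np.empty(angles.size)
    for k, theta in enumerate(np.deg2rad(angles)):
        a = np.exp(2j * np.pi * d * np.arange(M) * np.sin(theta))   # steering vector
        P[k] = 1.0 / np.real(a.conj() @ En @ En.conj().T @ a)       # pseudospectrum
    return angles, P

# Example: two uncorrelated sources at -20 and +30 degrees, 8 elements, 200 snapshots.
M, N, d = 8, 200, 0.5
true_doas = np.deg2rad([-20.0, 30.0])
A = np.exp(2j * np.pi * d * np.outer(np.arange(M), np.sin(true_doas)))
S = (np.random.randn(2, N) + 1j * np.random.randn(2, N)) / np.sqrt(2)
X = A @ S + 0.1 * (np.random.randn(M, N) + 1j * np.random.randn(M, N))
angles, P = music_spectrum(X, n_sources=2)
print("strongest peak near", angles[np.argmax(P)], "degrees")
```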

  13. Reducing image noise in computed tomography (CT) colonography: effect of an integrated circuit CT detector.

    Science.gov (United States)

    Liu, Yu; Leng, Shuai; Michalak, Gregory J; Vrieze, Thomas J; Duan, Xinhui; Qu, Mingliang; Shiung, Maria M; McCollough, Cynthia H; Fletcher, Joel G

    2014-01-01

    To investigate whether the integrated circuit (IC) detector results in reduced noise in computed tomography (CT) colonography (CTC). Three hundred sixty-six consecutive patients underwent clinically indicated CTC using the same CT scanner system, except for a difference in CT detectors (IC or conventional). Image noise, patient size, and scanner radiation output (volume CT dose index) were quantitatively compared between patient cohorts using each detector system, with separate comparisons for the abdomen and pelvis. For the abdomen and pelvis, despite significantly larger patient sizes in the IC detector cohort (both P 0.18). Based on the observed image noise reduction, radiation dose could alternatively be reduced by approximately 20% to result in similar levels of image noise. Computed tomography colonography images acquired using the IC detector had significantly lower noise than images acquired using the conventional detector. This noise reduction can permit further radiation dose reduction in CTC.

  14. Environmental program with operational cases to reduce risk to the marine environment significantly

    International Nuclear Information System (INIS)

    Cline, J.T.; Forde, R.

    1991-01-01

    In this paper, Amoco Norway Oil Company's environmental program is detailed, followed by example operational programs and achievements aimed at minimizing environmental risks to the marine environment at the Valhall platform. With a corporate goal to be a leader in protecting the environment, the appropriate strategies and policies that form the basis of the environmental management system are incorporated in the quality assurance programs. Also included in the program are necessary organizational structures, responsibilities of environmental affairs and line organization personnel, compliance procedures and a waste task force obliged to implement operations improvements. An internal environmental audit system has been initiated, in addition to corporate level audits, which, when communicated to the line organization, closes the environmental management loop through experience feedback. Environmental projects underway are significantly decreasing the extent and/or risk of pollution from offshore activities. Cradle-to-grave responsibility is assumed, with waste separated offshore and onshore followed by disposal in audited sites. A $5 MM program is underway to control produced oily solids and reduce oil in produced water, aiming at less than 20 ppm. When oil-based mud is used in deeper hole sections, drill solids disposed at sea average less than 60 g oil/kg dry cuttings using appropriate shaker screens, and a washing/centrifuge system to remove fines. Certain oily liquid wastes are being injected down hole whereas previously they were burned using a mud burner. Finally, a program is underway with a goal to eliminate sea discharge of oil on cuttings through injection disposal of oily wastes, drilling with alternative muds such as a cationic water base mud, and/or proper onshore disposal of oily wastes.

  15. Simultaneous bilateral stereotactic procedure for deep brain stimulation implants: a significant step for reducing operation time.

    Science.gov (United States)

    Fonoff, Erich Talamoni; Azevedo, Angelo; Angelos, Jairo Silva Dos; Martinez, Raquel Chacon Ruiz; Navarro, Jessie; Reis, Paul Rodrigo; Sepulveda, Miguel Ernesto San Martin; Cury, Rubens Gisbert; Ghilardi, Maria Gabriela Dos Santos; Teixeira, Manoel Jacobsen; Lopez, William Omar Contreras

    2016-07-01

    OBJECT Currently, bilateral procedures involve 2 sequential implants in each of the hemispheres. The present report demonstrates the feasibility of simultaneous bilateral procedures during the implantation of deep brain stimulation (DBS) leads. METHODS Fifty-seven patients with movement disorders underwent bilateral DBS implantation in the same study period. The authors compared the time required for the surgical implantation of deep brain electrodes in 2 randomly assigned groups. One group of 28 patients underwent traditional sequential electrode implantation, and the other 29 patients underwent simultaneous bilateral implantation. Clinical outcomes of the patients with Parkinson's disease (PD) who had undergone DBS implantation of the subthalamic nucleus using either of the 2 techniques were compared. RESULTS Overall, a reduction of 38.51% in total operating time for the simultaneous bilateral group (136.4 ± 20.93 minutes) as compared with that for the traditional consecutive approach (220.3 ± 27.58 minutes) was observed. Regarding clinical outcomes in the PD patients who underwent subthalamic nucleus DBS implantation, comparing the preoperative off-medication condition with the off-medication/on-stimulation condition 1 year after the surgery in both procedure groups, there was a mean 47.8% ± 9.5% improvement in the Unified Parkinson's Disease Rating Scale Part III (UPDRS-III) score in the simultaneous group, while the sequential group experienced 47.5% ± 15.8% improvement (p = 0.96). Moreover, a marked reduction in the levodopa-equivalent dose from preoperatively to postoperatively was similar in these 2 groups. The simultaneous bilateral procedure presented major advantages over the traditional sequential approach, with a shorter total operating time. CONCLUSIONS A simultaneous stereotactic approach significantly reduces the operation time in bilateral DBS procedures, resulting in decreased microrecording time, contributing to the optimization of functional

  16. Computational Approach to Annotating Variants of Unknown Significance in Clinical Next Generation Sequencing.

    Science.gov (United States)

    Schulz, Wade L; Tormey, Christopher A; Torres, Richard

    2015-01-01

    Next generation sequencing (NGS) has become a common technology in the clinical laboratory, particularly for the analysis of malignant neoplasms. However, most mutations identified by NGS are variants of unknown clinical significance (VOUS). Although the approach to define these variants differs by institution, software algorithms that predict variant effect on protein function may be used. However, these algorithms commonly generate conflicting results, potentially adding uncertainty to interpretation. In this review, we examine several computational tools used to predict whether a variant has clinical significance. In addition to describing the role of these tools in clinical diagnostics, we assess their efficacy in analyzing known pathogenic and benign variants in hematologic malignancies. Copyright© by the American Society for Clinical Pathology (ASCP).
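
    Because the in-silico predictors discussed above often disagree, laboratories commonly combine their outputs. The sketch below shows one simple consensus rule; the tool names, call encoding, and 2-of-3 threshold are illustrative assumptions, not a validated clinical pipeline.

```python
# Toy consensus over conflicting variant-effect predictions.
# Tool names and the 2-of-3 rule are illustrative assumptions only.
def consensus_call(predictions, damaging_votes_needed=2):
    """predictions: dict mapping tool name to a 'damaging'/'tolerated' call."""
    votes = sum(1 for call in predictions.values() if call == "damaging")
    if votes >= damaging_votes_needed:
        return "likely damaging"
    if votes == 0:
        return "likely benign"
    return "variant of unknown significance"

variant = {"tool_A": "damaging", "tool_B": "tolerated", "tool_C": "damaging"}
print(consensus_call(variant))   # -> likely damaging
```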

  17. Soil nitrate reducing processes drivers, mechanisms for spatial variation, and significance for nitrous oxide production

    OpenAIRE

    Giles, M.; Morley, N.; Baggs, E.M.; Daniell, T.J.

    2012-01-01

    The microbial processes of denitrification and dissimilatory nitrate reduction to ammonium (DNRA) are two important nitrate reducing mechanisms in soil, which are responsible for the loss of nitrate (NO3−) and production of the potent greenhouse gas, nitrous oxide (N2O). A number of factors are known to control these processes, including O2 concentrations and moisture content, N, C, pH, and the size and community structure of nitrate reducing organisms responsible for the ...

  18. Pegasus project. DLC coating and low viscosity oil reduce energy losses significantly

    Energy Technology Data Exchange (ETDEWEB)

    Doerwald, Dave; Jacobs, Ruud [Hauzer Techno Coating (Netherlands). Tribological Coatings

    2012-03-15

    Pegasus, the flying horse from Greek mythology, is a suitable name for the research project initiated by a German automotive OEM with participation of Hauzer Techno Coating and several automotive suppliers. It will enable future automotive vehicles to reduce fuel consumption without losing power. The project described in this article focuses on the rear differential, because reducing friction here can contribute considerably to efficiency improvement of the whole vehicle. Surfaces, coating and oil viscosity have been investigated and interesting conclusions have been reached. (orig.)

  19. Mindfulness significantly reduces self-reported levels of anxiety and depression

    DEFF Research Database (Denmark)

    Würtzen, Hanne; Dalton, Susanne Oksbjerg; Elsass, Peter

    2013-01-01

    INTRODUCTION: As the incidence of and survival from breast cancer continue to rise, interventions to reduce anxiety and depression before, during and after treatment are needed. Previous studies have reported positive effects of a structured 8-week group mindfulness-based stress reduction program...

  20. Soil nitrate reducing processes – drivers, mechanisms for spatial variation, and significance for nitrous oxide production

    Science.gov (United States)

    Giles, Madeline; Morley, Nicholas; Baggs, Elizabeth M.; Daniell, Tim J.

    2012-01-01

    The microbial processes of denitrification and dissimilatory nitrate reduction to ammonium (DNRA) are two important nitrate reducing mechanisms in soil, which are responsible for the loss of nitrate (NO3−) and production of the potent greenhouse gas, nitrous oxide (N2O). A number of factors are known to control these processes, including O2 concentrations and moisture content, N, C, pH, and the size and community structure of nitrate reducing organisms responsible for the processes. There is an increasing understanding associated with many of these controls on flux through the nitrogen cycle in soil systems. However, there remains uncertainty about how the nitrate reducing communities are linked to environmental variables and the flux of products from these processes. The high spatial variability of environmental controls and microbial communities across small sub centimeter areas of soil may prove to be critical in determining why an understanding of the links between biotic and abiotic controls has proved elusive. This spatial effect is often overlooked as a driver of nitrate reducing processes. An increased knowledge of the effects of spatial heterogeneity in soil on nitrate reduction processes will be fundamental in understanding the drivers, location, and potential for N2O production from soils. PMID:23264770

  1. Soil nitrate reducing processes – drivers, mechanisms for spatial variation and significance for nitrous oxide production

    Directory of Open Access Journals (Sweden)

    Madeline Eleanore Giles

    2012-12-01

    The microbial processes of denitrification and dissimilatory nitrate reduction to ammonium (DNRA) are two important nitrate reducing mechanisms in soil, which are responsible for the loss of nitrate (NO3−) and production of the potent greenhouse gas, nitrous oxide (N2O). A number of factors are known to control these processes, including O2 concentrations and moisture content, N, C, pH and the size and community structure of nitrate reducing organisms responsible for the processes. There is an increasing understanding associated with many of these controls on flux through the nitrogen cycle in soil systems. However, there remains uncertainty about how the nitrate reducing communities are linked to environmental variables and the flux of products from these processes. The high spatial variability of environmental controls and microbial communities across small sub-cm areas of soil may prove to be critical in determining why an understanding of the links between biotic and abiotic controls has proved elusive. This spatial effect is often overlooked as a driver of nitrate reducing processes. An increased knowledge of the effects of spatial heterogeneity in soil on nitrate reduction processes will be fundamental in understanding the drivers, location and potential for N2O production from soils.

  2. Soil nitrate reducing processes - drivers, mechanisms for spatial variation, and significance for nitrous oxide production.

    Science.gov (United States)

    Giles, Madeline; Morley, Nicholas; Baggs, Elizabeth M; Daniell, Tim J

    2012-01-01

    The microbial processes of denitrification and dissimilatory nitrate reduction to ammonium (DNRA) are two important nitrate reducing mechanisms in soil, which are responsible for the loss of nitrate (NO3−) and production of the potent greenhouse gas, nitrous oxide (N2O). A number of factors are known to control these processes, including O2 concentrations and moisture content, N, C, pH, and the size and community structure of nitrate reducing organisms responsible for the processes. There is an increasing understanding associated with many of these controls on flux through the nitrogen cycle in soil systems. However, there remains uncertainty about how the nitrate reducing communities are linked to environmental variables and the flux of products from these processes. The high spatial variability of environmental controls and microbial communities across small sub-centimeter areas of soil may prove to be critical in determining why an understanding of the links between biotic and abiotic controls has proved elusive. This spatial effect is often overlooked as a driver of nitrate reducing processes. An increased knowledge of the effects of spatial heterogeneity in soil on nitrate reduction processes will be fundamental in understanding the drivers, location, and potential for N2O production from soils.

  3. Reducing constraints on quantum computer design by encoded selective recoupling

    International Nuclear Information System (INIS)

    Lidar, D.A.; Wu, L.-A.

    2002-01-01

    The requirement of performing both single-qubit and two-qubit operations in the implementation of universal quantum logic often leads to very demanding constraints on quantum computer design. We show here how to eliminate the need for single-qubit operations in a large subset of quantum computer proposals: those governed by isotropic and XXZ- and XY-type anisotropic exchange interactions. Our method employs an encoding of one logical qubit into two physical qubits, while logic operations are performed using an analogue of the NMR selective recoupling method.
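
    A standard way to picture the encoding mentioned above (a textbook illustration of the code space, not the paper's full recoupling construction) is the two-physical-qubit logical qubit on which the flip-flop part of the exchange interaction already acts as a logical operation, with no single-qubit pulse required:

```latex
% Two physical qubits encode one logical qubit:
\lvert 0_L \rangle = \lvert 01 \rangle , \qquad \lvert 1_L \rangle = \lvert 10 \rangle .
% Restricted to this code space, the XY (flip-flop) part of the exchange
% interaction acts as a logical bit flip:
\tfrac{1}{2}\bigl( X_1 X_2 + Y_1 Y_2 \bigr)\Big|_{\mathrm{code}} = X_L .
```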

  4. Reducing dysfunctional beliefs about sleep does not significantly improve insomnia in cognitive behavioral therapy.

    Science.gov (United States)

    Okajima, Isa; Nakajima, Shun; Ochi, Moeko; Inoue, Yuichi

    2014-01-01

    The present study examined whether improvement of insomnia is mediated by a reduction in sleep-related dysfunctional beliefs through cognitive behavioral therapy for insomnia. In total, 64 patients with chronic insomnia received cognitive behavioral therapy for insomnia consisting of 6 biweekly individual treatment sessions of 50 minutes in length. Participants were asked to complete the Athens Insomnia Scale and the Dysfunctional Beliefs and Attitudes about Sleep scale both at the baseline and at the end of treatment. The results showed that although cognitive behavioral therapy for insomnia greatly reduced individuals' scores on both scales, the decrease in dysfunctional beliefs and attitudes about sleep with treatment did not seem to mediate improvement in insomnia. The findings suggest that sleep-related dysfunctional beliefs endorsed by patients with chronic insomnia may be attenuated by cognitive behavioral therapy for insomnia, but changes in such beliefs are not likely to play a crucial role in reducing the severity of insomnia.

  5. Significant prognosticators after primary radiotherapy in 903 nondisseminated nasopharyngeal carcinoma evaluated by computer tomography

    International Nuclear Information System (INIS)

    Teo, P.; Yu, P.; Lee, W.Y.; Leung, S.F.; Kwan, W.H.; Yu, K.H.; Choi, P.; Johnson, P.J.

    1996-01-01

    Purpose: To evaluate the significant prognosticators in nasopharyngeal carcinoma (NPC). Methods and Materials: From 1984 to 1989, 903 treatment-naive nondisseminated (M0) NPC were given primary radical radiotherapy to 60-62.5 Gy in 6 weeks. All patients had computed tomographic (CT) and endoscopic evaluation of the primary tumor. Potentially significant parameters (the patient's age and sex, the anatomical structures infiltrated by the primary lesion, the cervical nodal characteristics, the tumor histological subtypes, and various treatment variables) were analyzed by both monovariate and multivariate methods for each of the five clinical endpoints: actuarial survival, disease-free survival, free from distant metastasis, free from local failure, and free from regional failure. Results: The significant prognosticators predicting for an increased risk of distant metastases and poorer survival included male sex, skull base and cranial nerve(s) involvement, advanced Ho's N level, and presence of fixed or partially fixed nodes or nodes contralateral to the side of the bulk of the nasopharyngeal primary. Advanced patient age led to significantly worse survival and poorer local tumor control. Local and regional failures were both increased by tumor infiltrating the skull base and/or the cranial nerves. In addition, regional failure was increased significantly by advancing Ho's N level. Parapharyngeal tumor involvement was the strongest independent prognosticator that determined distant metastasis and survival rates in the absence of the overriding prognosticators of skull base infiltration, cranial nerve(s) palsy, and cervical nodal metastasis. Conclusions: The significant prognosticators are delineated after the advent of CT and these should form the foundation of the modern stage classification for NPC.

  6. [Study on computed tomography features of nasal septum cellule and its clinical significance].

    Science.gov (United States)

    Huang, Dingqiang; Li, Wanrong; Gao, Liming; Xu, Guanqiang; Ou, Xiaoyi; Tang, Guangcai

    2008-03-01

    To investigate the features of nasal septum cellule in computed tomographic (CT) images and its clinical significance. CT scan data of the nasal septum in 173 patients were randomly obtained from January 2001 to June 2005. Prevalence and clinical features were summarized retrospectively in the data of the 19 patients with nasal septum cellule. (1) Nineteen cases with nasal septum cellule were found among the 173 patients. (2) All nasal septum cellules of the 19 cases were located in the perpendicular plate of the ethmoid bone; 8 were located in the upper part of the nasal septum and 11 in the middle. (3) There were in total seven patients with nasal diseases related to nasal septum cellule: 3 cases with inflammation, 2 with bone fracture, 1 with cholesterol granuloma, and 1 with mucocele. Nasal septum cellule is an anatomic variation of the nasal septum bone, and its features can provide further understanding of some diseases related to nasal septum cellule.

  7. Reducing power consumption while synchronizing a plurality of compute nodes during execution of a parallel application

    Science.gov (United States)

    Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Peters, Amanda E [Cambridge, MA; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN

    2012-04-17

    Methods, apparatus, and products are disclosed for reducing power consumption while synchronizing a plurality of compute nodes during execution of a parallel application that include: beginning, by each compute node, performance of a blocking operation specified by the parallel application, each compute node beginning the blocking operation asynchronously with respect to the other compute nodes; reducing, for each compute node, power to one or more hardware components of that compute node in response to that compute node beginning the performance of the blocking operation; and restoring, for each compute node, the power to the hardware components having power reduced in response to all of the compute nodes beginning the performance of the blocking operation.

  8. Reducing power consumption while synchronizing a plurality of compute nodes during execution of a parallel application

    Science.gov (United States)

    Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Peters, Amanda A [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN

    2012-01-10

    Methods, apparatus, and products are disclosed for reducing power consumption while synchronizing a plurality of compute nodes during execution of a parallel application that include: beginning, by each compute node, performance of a blocking operation specified by the parallel application, each compute node beginning the blocking operation asynchronously with respect to the other compute nodes; reducing, for each compute node, power to one or more hardware components of that compute node in response to that compute node beginning the performance of the blocking operation; and restoring, for each compute node, the power to the hardware components having power reduced in response to all of the compute nodes beginning the performance of the blocking operation.
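
    A minimal sketch of the idea in the two records above, using mpi4py's blocking barrier as the collective operation. The power-control call is a hypothetical platform hook (a real system would use DVFS or a vendor-specific interface); it is not part of MPI and is not the patented implementation.

```python
# Sketch only: reduce node power while waiting in a blocking collective and
# restore it once every rank has entered the operation (i.e., when the
# barrier completes). set_power_state() is a hypothetical platform hook.
from mpi4py import MPI

def set_power_state(level):
    # placeholder for a DVFS / firmware call, e.g. lowering CPU frequency
    print(f"power state -> {level}")

comm = MPI.COMM_WORLD

# ... application work; ranks arrive at the barrier at different times ...
set_power_state("low")     # throttle hardware while this rank blocks
comm.Barrier()             # blocking operation; returns after all ranks arrive
set_power_state("normal")  # all nodes have begun the operation, restore power
```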

  9. Reproducibility of Dynamic Computed Tomography Brain Perfusion Measurements in Patients with Significant Carotid Artery Stenosis

    Energy Technology Data Exchange (ETDEWEB)

    Serafin, Z.; Kotarski, M.; Karolkiewicz, M.; Mindykowski, R.; Lasek, W.; Molski, S.; Gajdzinska, M.; Nowak-Nowacka, A. (Dept. of Radiology and Diagnostic Imaging, and Dept. of General and Vascular Surgery, Nicolaus Copernicus Univ., Collegium Medicum, Bydgoszcz (Poland))

    2009-02-15

    Background: Perfusion computed tomography (PCT) determination is a minimally invasive and widely available technique for brain blood flow assessment, but its application may be restricted by large variation of results. Purpose: To determine the intraobserver, interobserver, and inter examination variability of brain PCT absolute measurements in patients with significant carotid artery stenosis (CAS), and to evaluate the effect of the use of relative perfusion values on PCT reproducibility. Material and Methods: PCT imaging was completed in 61 patients before endarterectomy, and in 38 of these within 4 weeks after treatment. Cerebral blood flow (CBF), cerebral blood volume (CBV), time to peak (TTP), and peak enhancement intensity (PEI) were calculated with the maximum slope method. Inter examination variability was evaluated based on perfusion of hemisphere contralateral to the treated CAS, from repeated examinations. Interobserver and intraobserver variability were established for the untreated side, based on pretreatment examination. Results: Interobserver and intraobserver variability were highest for CBF measurement (28.8% and 32.5%, respectively), and inter examination variability was the highest for CBV (24.1%). Intraobserver and interobserver variability were higher for absolute perfusion values compared with their respective ratios for CBF and TTP. The only statistically significant difference between perfusion values measured by two observers was for CBF (mean 78.3 vs. 67.5 ml/100 g/min). The inter examination variability of TTP (12.1%) was significantly lower than the variability of other absolute perfusion measures, and the inter examination variability of ratios was significantly lower than absolute values for all the parameters. Conclusion: In longitudinal studies of patients with chronic cerebral ischemia, PCT ratios and either TTP or CBV are more suitable measures than absolute CBF values, because of their considerably lower inter- and intraobserver

  10. Reproducibility of Dynamic Computed Tomography Brain Perfusion Measurements in Patients with Significant Carotid Artery Stenosis

    International Nuclear Information System (INIS)

    Serafin, Z.; Kotarski, M.; Karolkiewicz, M.; Mindykowski, R.; Lasek, W.; Molski, S.; Gajdzinska, M.; Nowak-Nowacka, A.

    2009-01-01

    Background: Perfusion computed tomography (PCT) determination is a minimally invasive and widely available technique for brain blood flow assessment, but its application may be restricted by large variation of results. Purpose: To determine the intraobserver, interobserver, and inter examination variability of brain PCT absolute measurements in patients with significant carotid artery stenosis (CAS), and to evaluate the effect of the use of relative perfusion values on PCT reproducibility. Material and Methods: PCT imaging was completed in 61 patients before endarterectomy, and in 38 of these within 4 weeks after treatment. Cerebral blood flow (CBF), cerebral blood volume (CBV), time to peak (TTP), and peak enhancement intensity (PEI) were calculated with the maximum slope method. Inter examination variability was evaluated based on perfusion of hemisphere contralateral to the treated CAS, from repeated examinations. Interobserver and intraobserver variability were established for the untreated side, based on pretreatment examination. Results: Interobserver and intraobserver variability were highest for CBF measurement (28.8% and 32.5%, respectively), and inter examination variability was the highest for CBV (24.1%). Intraobserver and interobserver variability were higher for absolute perfusion values compared with their respective ratios for CBF and TTP. The only statistically significant difference between perfusion values measured by two observers was for CBF (mean 78.3 vs. 67.5 ml/100 g/min). The inter examination variability of TTP (12.1%) was significantly lower than the variability of other absolute perfusion measures, and the inter examination variability of ratios was significantly lower than absolute values for all the parameters. Conclusion: In longitudinal studies of patients with chronic cerebral ischemia, PCT ratios and either TTP or CBV are more suitable measures than absolute CBF values, because of their considerably lower inter- and intraobserver

  11. The Evolution of Polymer Composition during PHA Accumulation: The Significance of Reducing Equivalents

    Directory of Open Access Journals (Sweden)

    Liliana Montano-Herrera

    2017-03-01

    This paper presents a systematic investigation into monomer development during mixed culture Polyhydroxyalkanoates (PHA) accumulation involving concurrent active biomass growth and polymer storage. A series of mixed culture PHA accumulation experiments, using several different substrate-feeding strategies, was carried out. The feedstock comprised volatile fatty acids, which were applied as single carbon sources, as mixtures, or in series, using a fed-batch feed-on-demand controlled bioprocess. A dynamic trend in active biomass growth as well as polymer composition was observed. The observations were consistent over replicate accumulations. Metabolic flux analysis (MFA) was used to investigate metabolic activity through time. It was concluded that carbon flux, and consequently copolymer composition, could be linked with how reducing equivalents are generated.

  12. Significantly reduced c-axis thermal diffusivity of graphene-based papers

    Science.gov (United States)

    Han, Meng; Xie, Yangsu; Liu, Jing; Zhang, Jingchao; Wang, Xinwei

    2018-06-01

    Owing to their very high thermal conductivity as well as large surface-to-volume ratio, graphene-based films/papers have been proposed as promising candidates for lightweight thermal interface materials and lateral heat spreaders. In this work, we study the cross-plane (c-axis) thermal conductivity (k_c) and diffusivity (α_c) of two typical graphene-based papers, which are partially reduced graphene paper (PRGP) and graphene oxide paper (GOP), and compare their thermal properties with highly-reduced graphene paper and graphite. The determined α_c of PRGP varies from (1.02 ± 0.09) × 10⁻⁷ m² s⁻¹ at 295 K to (2.31 ± 0.18) × 10⁻⁷ m² s⁻¹ at 12 K. This low α_c is mainly attributed to the strong phonon scattering at the grain boundaries and defect centers due to the small grain sizes and high-level defects. For GOP, α_c varies from (1.52 ± 0.05) × 10⁻⁷ m² s⁻¹ at 295 K to (2.28 ± 0.08) × 10⁻⁷ m² s⁻¹ at 12.5 K. The cross-plane thermal transport of GOP is attributed to the high density of functional groups between carbon layers which provide weak thermal transport tunnels across the layers in the absence of direct energy coupling among layers. This work sheds light on the understanding and optimizing of the nanostructure of graphene-based paper-like materials for desired thermal performance.

  13. Technological significances to reduce the material problems. Feasibility of heat flux reduction

    International Nuclear Information System (INIS)

    Yamazaki, Seiichiro; Shimada, Michiya.

    1994-01-01

    For a divertor plate in a fusion power reactor, a high temperature coolant must be used for heat removal to keep thermal efficiency high. It makes the temperature and thermal stress of wall materials higher than the design limits. Issues of the coolant itself, e.g. burnout of high temperature water, will also become a serious problem. Sputtering erosion of the surface material will be a great concern of its lifetime. Therefore, it is necessary to reduce the heat and particle loads to the divertor plate technologically. The feasibility of some technological methods of heat reduction, such as separatrix sweeping, is discussed. As one of the most promising ideas, the methods of radiative cooling of the divertor plasma are summarized based on the recent results of large tokamaks. The feasibility of remote radiative cooling and gas divertor is discussed. The ideas are considered in recent design studies of tokamak power reactors and experimental reactors. By way of example, conceptual designs of divertor plate for the steady state tokamak power reactor are described. (author)

  14. Thrombolysis significantly reduces transient myocardial ischaemia following first acute myocardial infarction

    DEFF Research Database (Denmark)

    Mickley, H; Pless, P; Nielsen, J R

    1992-01-01

    In order to investigate whether thrombolysis affects residual myocardial ischaemia, we prospectively performed a predischarge maximal exercise test and early out-of-hospital ambulatory ST segment monitoring in 123 consecutive men surviving a first acute myocardial infarction (AMI). Seventy-four patients fulfilled our criteria for thrombolysis, but only the last 35 patients included received thrombolytic therapy. As thrombolysis was not available in our Department at the start of the study, the first 39 patients included were conservatively treated (controls). No significant differences in baseline clinical characteristics were found between the two groups. In-hospital atrial fibrillation and digoxin therapy were more prevalent in controls (P less than 0.05). During exercise, thrombolysed patients reached a higher maximal work capacity compared with controls: 160 +/- 41 vs 139 +/- 34 W (P...

  15. Selenium Supplementation Significantly Reduces Thyroid Autoantibody Levels in Patients with Chronic Autoimmune Thyroiditis

    DEFF Research Database (Denmark)

    Wichman, Johanna Eva Märta; Winther, Kristian Hillert; Bonnema, Steen Joop

    2016-01-01

    BACKGROUND: Selenium supplementation may decrease circulating thyroid autoantibodies in patients with chronic autoimmune thyroiditis (AIT), but the available trials are heterogeneous. This study expands and critically reappraises the knowledge on this topic. METHODS: A literature search identified 3366 records. Controlled trials in adults (≥18 years of age) with AIT, comparing selenium with or without levothyroxine (LT4), versus placebo and/or LT4, were eligible. Assessed outcomes were serum thyroid peroxidase (TPOAb) and thyroglobulin (TgAb) autoantibody levels, and immunomodulatory effects ... and LT4-untreated. Heterogeneity was estimated using I², and quality of evidence was assessed per outcome, using the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) guidelines. RESULTS: In LT4-treated populations, the selenium group had significantly lower TPOAb levels after...

  16. A case of gastric endocrine cell carcinoma which was significantly reduced in size by radiotherapy

    International Nuclear Information System (INIS)

    Azakami, Kiyoshi; Nishida, Kouji; Tanikawa, Ken

    2016-01-01

    In 2010, the World Health Organization classified gastric neuroendocrine tumors (NETs) into three types: NET grade (G) 1, NET G2 and neuroendocrine carcinoma (NEC). NECs are associated with a very poor prognosis. The patient was an 84-year-old female who was initially diagnosed by gastrointestinal endoscope with type 3 advanced gastric cancer with stenosis of the gastric cardia. Her overall status and performance status did not allow for operations or intensive chemotherapy. Palliative radiotherapy was performed and resulted in a significant reduction in the size of the tumor as well as the improvement of the obstructive symptoms. She died 9 months after radiotherapy. An autopsy provided a definitive diagnosis of gastric endocrine cell carcinoma, and the effectiveness of radiotherapy was pathologically-confirmed. Palliative radiotherapy may be a useful treatment option for providing symptom relief, especially for old patients with unresectable advanced gastric neuroendocrine carcinoma. (author)

  17. Ad libitum Mediterranean and Low Fat Diets both Significantly Reduce Hepatic Steatosis: a Randomized Controlled Trial.

    Science.gov (United States)

    Properzi, Catherine; O'Sullivan, Therese A; Sherriff, Jill L; Ching, Helena L; Jeffrey, Garry P; Buckley, Rachel F; Tibballs, Jonathan; MacQuillan, Gerry C; Garas, George; Adams, Leon A

    2018-05-05

    Although diet induced weight loss is first-line treatment for patients with non-alcoholic fatty liver disease (NAFLD), long-term maintenance is difficult. The optimal diet for either improvement in NAFLD or associated cardio-metabolic risk factors regardless of weight loss, is unknown. We examined the effect of two ad libitum isocaloric diets [Mediterranean (MD) or Low Fat (LF)] on hepatic steatosis and cardio-metabolic risk factors. Subjects with NAFLD were randomized to a 12-week blinded dietary intervention (MD vs LF). Hepatic steatosis was determined via magnetic resonance spectroscopy (MRS). From a total of 56 subjects enrolled, 49 subjects completed the intervention and 48 were included for analysis. During the intervention, subjects on the MD had significantly higher total and monounsaturated fat but lower carbohydrate and sodium intakes compared to LF subjects (pfat reduction between the groups (p=0.32), with mean (SD) relative reductions of 25.0% (±25.3%) in LF and 32.4% (±25.5%) in MD. Liver enzymes also improved significantly in both groups. Weight loss was minimal and not different between groups [-1.6 (±2.1)kg in LF vs -2.1 (±2.5)kg in MD, (p=0.52)]. Within-group improvements in the Framingham risk score, total cholesterol, serum triglyceride, and HbA1c were observed in the MD (all pvs. 64%, p=0.048). Ad libitum low fat and Mediterranean diets both improve hepatic steatosis to a similar degree. This article is protected by copyright. All rights reserved. © 2018 by the American Association for the Study of Liver Diseases.

  18. Reducing musculoskeletal disorders among computer operators: comparison between ergonomics interventions at the workplace.

    Science.gov (United States)

    Levanon, Yafa; Gefen, Amit; Lerman, Yehuda; Givon, Uri; Ratzon, Navah Z

    2012-01-01

    Typing is associated with musculoskeletal disorders (MSDs) caused by multiple risk factors. This control study aimed to evaluate the efficacy of a workplace intervention for reducing MSDs among computer workers. Sixty-six subjects with and without MSD were assigned consecutively to one of three groups: ergonomics intervention (work site and body posture adjustments, muscle activity training and exercises) accompanied with biofeedback training, the same ergonomics intervention without biofeedback and a control group. Evaluation of MSDs, body posture, psychosocial status, upper extremity (UE) kinematics and muscle surface electromyography was carried out before and after the intervention in the workplace and the motion lab. Our main hypothesis that significant differences in the reduction of MSDs would exist between subjects in the study groups and controls was confirmed (χ² = 13.3; p = 0.001). Significant changes were found in UE kinematics and posture as well. Both ergonomics interventions effectively reduced MSD and improved body posture. This study aimed to test the efficacy of an individual workplace intervention programme among computer workers by evaluating musculoskeletal disorders (MSDs), body posture, upper extremity kinematics, muscle activity and psychosocial factors. The proposed ergonomics interventions effectively reduced MSDs and improved body posture.

  19. Social networking strategies that aim to reduce obesity have achieved significant although modest results.

    Science.gov (United States)

    Ashrafian, Hutan; Toma, Tania; Harling, Leanne; Kerr, Karen; Athanasiou, Thanos; Darzi, Ara

    2014-09-01

    The global epidemic of obesity continues to escalate. Obesity accounts for an increasing proportion of the international socioeconomic burden of noncommunicable disease. Online social networking services provide an effective medium through which information may be exchanged between obese and overweight patients and their health care providers, potentially contributing to superior weight-loss outcomes. We performed a systematic review and meta-analysis to assess the role of these services in modifying body mass index (BMI). Our analysis of twelve studies found that interventions using social networking services produced a modest but significant 0.64 percent reduction in BMI from baseline for the 941 people who participated in the studies' interventions. We recommend that social networking services that target obesity should be the subject of further clinical trials. Additionally, we recommend that policy makers adopt reforms that promote the use of anti-obesity social networking services, facilitate multistakeholder partnerships in such services, and create a supportive environment to confront obesity and its associated noncommunicable diseases. Project HOPE—The People-to-People Health Foundation, Inc.

  20. Targeting Heparin to Collagen within Extracellular Matrix Significantly Reduces Thrombogenicity and Improves Endothelialization of Decellularized Tissues.

    Science.gov (United States)

    Jiang, Bin; Suen, Rachel; Wertheim, Jason A; Ameer, Guillermo A

    2016-12-12

    Thrombosis within small-diameter vascular grafts limits the development of bioartificial, engineered vascular conduits, especially those derived from extracellular matrix (ECM). Here we describe an easy-to-implement strategy to chemically modify vascular ECM by covalently linking a collagen binding peptide (CBP) to heparin to form a heparin derivative (CBP-heparin) that selectively binds a subset of collagens. Modification of ECM with CBP-heparin leads to increased deposition of functional heparin (by ∼7.2-fold measured by glycosaminoglycan composition) and a corresponding reduction in platelet binding (>70%) and whole blood clotting (>80%) onto the ECM. Furthermore, addition of CBP-heparin to the ECM stabilizes long-term endothelial cell attachment to the lumen of ECM-derived vascular conduits, potentially through recruitment of heparin-binding growth factors that ultimately improve the durability of endothelialization in vitro. Overall, our findings provide a simple yet effective method to increase deposition of functional heparin on the surface of ECM-based vascular grafts and thereby minimize thrombogenicity of decellularized tissue, overcoming a significant challenge in tissue engineering of bioartificial vessels and vascularized organs.

  1. Numerical Feynman integrals with physically inspired interpolation: Faster convergence and significant reduction of computational cost

    Directory of Open Access Journals (Sweden)

    Nikesh S. Dattani

    2012-03-01

    One of the most successful methods for calculating reduced density operator dynamics in open quantum systems, that can give numerically exact results, uses Feynman integrals. However, when simulating the dynamics for a given amount of time, the number of time steps that can realistically be used with this method is always limited; therefore, one often obtains an approximation of the reduced density operator at a sparse grid of points in time. Instead of relying only on ad hoc interpolation methods (such as splines) to estimate the system density operator in between these points, I propose a method that uses physical information to assist with this interpolation. This method is tested on a physically significant system, on which its use allows important qualitative features of the density operator dynamics to be captured with as little as two time steps in the Feynman integral. This method allows for an enormous reduction in the amount of memory and CPU time required for approximating density operator dynamics within a desired accuracy. Since this method does not change the way the Feynman integral itself is calculated, the value of the density operator approximation at the points in time used to discretize the Feynman integral will be the same whether or not this method is used, but its approximation in between these points in time is considerably improved by this method. A list of ways in which this proposed method can be further improved is presented in the last section of the article.
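
    The flavour of physically informed interpolation can be conveyed with a generic toy example: sparse samples of a decaying, oscillating density-matrix element are fitted with a damped-cosine ansatz instead of an ad hoc spline. The ansatz, parameter values, and data below are invented for illustration and are not the interpolation scheme of the article.

```python
# Toy example: interpolate sparse "density-matrix element" samples with a
# physically motivated damped-cosine ansatz rather than a plain spline.
# The functional form and numbers are assumptions made for this illustration.
import numpy as np
from scipy.optimize import curve_fit

def ansatz(t, a, gamma, omega, phi):
    return a * np.exp(-gamma * t) * np.cos(omega * t + phi)

rng = np.random.default_rng(0)
t_sparse = np.linspace(0.0, 5.0, 8)                  # few, expensive time points
rho_sparse = ansatz(t_sparse, 0.5, 0.4, 2.0, 0.1) + 0.005 * rng.standard_normal(8)

params, _ = curve_fit(ansatz, t_sparse, rho_sparse, p0=[0.4, 0.3, 1.8, 0.0])
t_fine = np.linspace(0.0, 5.0, 200)
rho_interp = ansatz(t_fine, *params)                 # dense estimate between samples
print("fitted (a, gamma, omega, phi):", np.round(params, 3))
```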

  2. Significance of frontal cortical atrophy in Parkinson's disease: computed tomographic study

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kyung Sang; Suh, Jung Ho; Chung, Tae Sub; Kim, Dong Ik [College of Medicine, Yonsei University, Seoul (Korea, Republic of)

    1987-10-15

    Fifty-five patients with Parkinson's disease were evaluated clinically and with brain computed tomography (CT) in order to determine the incidence of frontal cortical and subcortical atrophy. Twenty cases of an age-related healthy control group were also scanned. The CT criteria of frontal cortical atrophy used in this study were the maximum width of the frontal hemispheric cortical sulci and the width of the anterior interhemispheric fissure between the frontal lobes, compared with the maximum width of the hemispheric cortical sulci excluding the frontal lobes. The criteria of frontal subcortical atrophy were the bifrontal index, bicaudate index, and Evans index. The results are as follows: 1. Cortical atrophic changes in Parkinson's disease were more prominent in the frontal lobe than with other causes of cortical atrophy. 2. Frontal cortical and subcortical atrophic changes were also more prominent in Parkinson's disease than in the age-related control group. 3. Subcortical atrophic changes in the frontal lobe were always associated with cortical atrophic changes. 4. Changes of the basal ganglia were hardly seen in Parkinson's disease. 5. Cortical atrophic changes in the frontal lobe must be one of the significant findings in Parkinson's disease.

  3. Significance of frontal cortical atrophy in Parkinson's disease: computed tomographic study

    International Nuclear Information System (INIS)

    Lee, Kyung Sang; Suh, Jung Ho; Chung, Tae Sub; Kim, Dong Ik

    1987-01-01

    Fifty-five patients with Parkinson's disease were evaluated clinically and with brain computed tomography (CT) in order to determine the incidence of frontal cortical and subcortical atrophy. Twenty cases of an age-related healthy control group were also scanned. The CT criteria of frontal cortical atrophy used in this study were the maximum width of the frontal hemispheric cortical sulci and the width of the anterior interhemispheric fissure between the frontal lobes, compared with the maximum width of the hemispheric cortical sulci excluding the frontal lobes. The criteria of frontal subcortical atrophy were the bifrontal index, bicaudate index, and Evans index. The results are as follows: 1. Cortical atrophic changes in Parkinson's disease were more prominent in the frontal lobe than with other causes of cortical atrophy. 2. Frontal cortical and subcortical atrophic changes were also more prominent in Parkinson's disease than in the age-related control group. 3. Subcortical atrophic changes in the frontal lobe were always associated with cortical atrophic changes. 4. Changes of the basal ganglia were hardly seen in Parkinson's disease. 5. Cortical atrophic changes in the frontal lobe must be one of the significant findings in Parkinson's disease.

  4. Prevalence and clinical significance of pleural microbubbles in computed tomography of thoracic empyema

    International Nuclear Information System (INIS)

    Smolikov, A.; Smolyakov, R.; Riesenberg, K.; Schlaeffer, F.; Borer, A.; Cherniavsky, E.; Gavriel, A.; Gilad, J.

    2006-01-01

    AIM: To determine the prevalence and clinical significance of pleural microbubbles in thoracic empyema. MATERIALS AND METHODS: The charts of 71 consecutive patients with empyema were retrospectively reviewed for relevant demographic, laboratory, microbiological, therapeutic and outcome data. Computed tomography (CT) images were reviewed for various signs of empyema as well as pleural microbubbles. Two patient groups, with and without microbubbles were compared. RESULTS: Mean patient age was 49 years and 72% were males. Microbubbles were detected in 58% of patients. There were no significant differences between patients with and without microbubbles in regard to pleural fluid chemistry. A causative organism was identified in about 75% of cases in both. There was no difference in the rates of pleural thickening and enhancement, increased extra-pleural fat attenuation, air-fluid levels or loculations. Microbubbles were diagnosed after a mean of 7.8 days from admission. Thoracentesis before CT was performed in 90 and 57% of patients with and without microbubbles (p=0.0015), respectively. Patients with microbubbles were more likely to require repeated drainage (65.9 versus 36.7%, p=0.015) and surgical decortication (31.7 versus 6.7%, p=0.011). Mortalities were 9.8 and 6.6% respectively (p=0.53). CONCLUSION: Pleural microbubbles are commonly encountered in CT imaging of empyema but have not been systematically studied to date. Microbubbles may be associated with adverse outcome such as repeated drainage or surgical decortication. The sensitivity and specificity of this finding and its prognostic implications need further assessment

  5. Thyroid function appears to be significantly reduced in Space-borne MDS mice

    Science.gov (United States)

    Saverio Ambesi-Impiombato, Francesco; Curcio, Francesco; Fontanini, Elisabetta; Perrella, Giuseppina; Spelat, Renza; Zambito, Anna Maria; Damaskopoulou, Eleni; Peverini, Manola; Albi, Elisabetta

    It is known that prolonged space flights induce changes in the human cardiovascular, musculoskeletal and nervous systems, whose function is regulated by the thyroid gland, but until now no data were reported about thyroid damage during space missions. We have demonstrated in vitro that, during space missions (Italian Soyuz Mission "ENEIDE" in 2005, Shuttle STS-120 "ESPERIA" in 2007), thyroid cells cultured in vitro did not respond to thyroid stimulating hormone (TSH) treatment; they appeared healthy and alive, despite being in a pro-apoptotic state characterised by a variation of sphingomyelin metabolism and a consequent increase in ceramide content. The insensitivity to TSH was largely due to a rearrangement of specific cell membrane microdomains acting as platforms for the TSH receptor (TEXUS-44 mission in 2008). To study whether these effects were also present in vivo, as part of the Mouse Drawer System (MDS) Tissue Sharing Program, we performed experiments in mice maintained onboard the International Space Station during the long-duration (90 days) exploration mission STS-129. After return to Earth, the thyroids isolated from the 3 animals were in part immediately frozen to study the morphological modifications in space and in part immediately used to study the effect of TSH treatment. For this purpose small fragments of tissue were treated with 10^-7 or 10^-8 M TSH for 1 hour, using untreated fragments as controls. The fragments were then fixed with absolute ethanol for 10 min at room temperature and centrifuged for 20 min at 3000 x g. The supernatants were used for cAMP analysis whereas the pellets were used for protein determination and for immunoblotting analysis of the TSH receptor, sphingomyelinase and sphingomyelin synthase. The results showed a modification of the thyroid structure, and the values of cAMP production after treatment with 10^-7 M TSH for 1 hour were significantly lower than those obtained in Earth's gravity. The treatment with TSH

  6. Reducing power consumption during execution of an application on a plurality of compute nodes

    Science.gov (United States)

    Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Peters, Amanda E [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN

    2012-06-05

    Methods, apparatus, and products are disclosed for reducing power consumption during execution of an application on a plurality of compute nodes that include: executing, by each compute node, an application, the application including power consumption directives corresponding to one or more portions of the application; identifying, by each compute node, the power consumption directives included within the application during execution of the portions of the application corresponding to those identified power consumption directives; and reducing power, by each compute node, to one or more components of that compute node according to the identified power consumption directives during execution of the portions of the application corresponding to those identified power consumption directives.
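
    The directive mechanism described above can be pictured with a short, hypothetical sketch; the PowerControl interface, the directive names and the application portions below are invented for illustration and are not taken from the patent text:

        # Hypothetical sketch: a compute node honouring power-consumption directives
        # embedded in an application. The PowerControl API and directive names are
        # illustrative assumptions, not the interface described above.
        class PowerControl:
            def reduce(self, component, level):
                # A real node would call a vendor power-management interface here.
                print(f"reducing power of {component} to {level}")

            def restore(self, component):
                print(f"restoring full power to {component}")

        def run_portion(name, directives, power):
            # Apply any directives attached to this portion of the application ...
            for component, level in directives.get(name, []):
                power.reduce(component, level)
            print(f"executing portion '{name}'")
            # ... and restore full power once the portion has finished.
            for component, _ in directives.get(name, []):
                power.restore(component)

        if __name__ == "__main__":
            power = PowerControl()
            # Directives map portions of the application to (component, power level) pairs.
            directives = {"file_io": [("cpu", "low")], "compute": [("network", "idle")]}
            for portion in ["file_io", "compute", "finalize"]:
                run_portion(portion, directives, power)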

  7. New scanning technique using Adaptive Statistical Iterative Reconstruction (ASIR) significantly reduced the radiation dose of cardiac CT

    International Nuclear Information System (INIS)

    Tumur, Odgerel; Soon, Kean; Brown, Fraser; Mykytowycz, Marcus

    2013-01-01

    The aims of our study were to evaluate the effect of application of the Adaptive Statistical Iterative Reconstruction (ASIR) algorithm on the radiation dose of coronary computed tomography angiography (CCTA) and its effects on image quality of CCTA, and to evaluate the effects of various patient and CT scanning factors on the radiation dose of CCTA. This was a retrospective study that included 347 consecutive patients who underwent CCTA at a tertiary university teaching hospital between 1 July 2009 and 20 September 2011. Analysis was performed comparing patient demographics, scan characteristics, radiation dose and image quality in two groups of patients in whom conventional Filtered Back Projection (FBP) or ASIR was used for image reconstruction. There were 238 patients in the FBP group and 109 patients in the ASIR group. There was no difference between the groups in the use of prospective gating, scan length or tube voltage. In the ASIR group, a significantly lower tube current was used compared with the FBP group, 550 mA (450–600) vs. 650 mA (500–711.25) (median (interquartile range)), respectively, P<0.001. There was a 27% effective radiation dose reduction in the ASIR group compared with the FBP group, 4.29 mSv (2.84–6.02) vs. 5.84 mSv (3.88–8.39) (median (interquartile range)), respectively, P<0.001. Although ASIR was associated with increased image noise compared with FBP (39.93±10.22 vs. 37.63±18.79 (mean ± standard deviation), respectively, P<0.001), it did not affect the signal intensity, signal-to-noise ratio, contrast-to-noise ratio or the diagnostic quality of CCTA. Application of ASIR reduces the radiation dose of CCTA without affecting the image quality.
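
    As a quick arithmetic check of the quoted figure, the relative dose reduction implied by the two reported medians works out to roughly 27%; a minimal sketch using only the values from the abstract above:

        # Relative effective-dose reduction implied by the reported median doses.
        fbp_dose_msv = 5.84   # median effective dose with Filtered Back Projection
        asir_dose_msv = 4.29  # median effective dose with ASIR
        reduction = (fbp_dose_msv - asir_dose_msv) / fbp_dose_msv
        print(f"relative dose reduction: {reduction:.1%}")  # about 26.5%, i.e. the ~27% quoted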

  8. New scanning technique using Adaptive Statistical Iterative Reconstruction (ASIR) significantly reduced the radiation dose of cardiac CT.

    Science.gov (United States)

    Tumur, Odgerel; Soon, Kean; Brown, Fraser; Mykytowycz, Marcus

    2013-06-01

    The aims of our study were to evaluate the effect of application of Adaptive Statistical Iterative Reconstruction (ASIR) algorithm on the radiation dose of coronary computed tomography angiography (CCTA) and its effects on image quality of CCTA and to evaluate the effects of various patient and CT scanning factors on the radiation dose of CCTA. This was a retrospective study that included 347 consecutive patients who underwent CCTA at a tertiary university teaching hospital between 1 July 2009 and 20 September 2011. Analysis was performed comparing patient demographics, scan characteristics, radiation dose and image quality in two groups of patients in whom conventional Filtered Back Projection (FBP) or ASIR was used for image reconstruction. There were 238 patients in the FBP group and 109 patients in the ASIR group. There was no difference between the groups in the use of prospective gating, scan length or tube voltage. In the ASIR group, significantly lower tube current was used compared with the FBP group, 550 mA (450-600) vs. 650 mA (500-711.25) (median (interquartile range)), respectively, P < 0.001. There was a 27% effective radiation dose reduction in the ASIR group compared with the FBP group, 4.29 mSv (2.84-6.02) vs. 5.84 mSv (3.88-8.39) (median (interquartile range)), respectively, P < 0.001. Although ASIR was associated with increased image noise compared with FBP (39.93 ± 10.22 vs. 37.63 ± 18.79 (mean ± standard deviation), respectively, P < 0.001), it did not affect the signal intensity, signal-to-noise ratio, contrast-to-noise ratio or the diagnostic quality of CCTA. Application of ASIR reduces the radiation dose of CCTA without affecting the image quality. © 2013 The Authors. Journal of Medical Imaging and Radiation Oncology © 2013 The Royal Australian and New Zealand College of Radiologists.

  9. A strategy for reducing turnaround time in design optimization using a distributed computer system

    Science.gov (United States)

    Young, Katherine C.; Padula, Sharon L.; Rogers, James L.

    1988-01-01

    There is a need to explore methods for reducing the lengthy computer turnaround or clock time associated with engineering design problems. Different strategies can be employed to reduce this turnaround time. One strategy is to run validated analysis software on a network of existing smaller computers so that portions of the computation can be done in parallel. This paper focuses on the implementation of this method using two types of problems. The first type is a traditional structural design optimization problem, which is characterized by a simple data flow and a complicated analysis. The second type of problem uses an existing computer program designed to study multilevel optimization techniques. This problem is characterized by complicated data flow and a simple analysis. The paper shows that distributed computing can be a viable means for reducing computational turnaround time for engineering design problems that lend themselves to decomposition. Parallel computing can be accomplished with a minimal cost in terms of hardware and software.
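
    The idea of farming independent analysis cases out to several processors to cut clock time can be illustrated with a minimal sketch; the analyze function and the decomposition into independent design cases are assumptions for illustration, not the software referred to above:

        # Minimal sketch of reducing turnaround time by running independent analyses
        # in parallel; a network of smaller computers would follow the same pattern
        # with a distributed task queue instead of a local process pool.
        from multiprocessing import Pool
        import time

        def analyze(design_case):
            # Stand-in for a validated analysis code applied to one design case.
            time.sleep(0.1)
            return design_case, design_case ** 2

        if __name__ == "__main__":
            design_cases = list(range(16))
            start = time.perf_counter()
            with Pool(processes=4) as pool:
                results = pool.map(analyze, design_cases)
            elapsed = time.perf_counter() - start
            print(f"{len(results)} cases analyzed in {elapsed:.2f} s")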

  10. Explaining Delusions: Reducing Uncertainty Through Basic and Computational Neuroscience.

    Science.gov (United States)

    Feeney, Erin J; Groman, Stephanie M; Taylor, Jane R; Corlett, Philip R

    2017-03-01

    Delusions, the fixed false beliefs characteristic of psychotic illness, have long defied understanding despite their response to pharmacological treatments (e.g., D2 receptor antagonists). However, it can be challenging to discern what makes beliefs delusional compared with other unusual or erroneous beliefs. We suggest mapping the putative biology to clinical phenomenology with a cognitive psychology of belief, culminating in a teleological approach to beliefs and brain function supported by animal and computational models. We argue that organisms strive to minimize uncertainty about their future states by forming and maintaining a set of beliefs (about the organism and the world) that are robust, but flexible. If uncertainty is generated endogenously, beliefs begin to depart from consensual reality and can manifest into delusions. Central to this scheme is the notion that formal associative learning theory can provide an explanation for the development and persistence of delusions. Beliefs, in animals and humans, may be associations between representations (e.g., of cause and effect) that are formed by minimizing uncertainty via new learning and attentional allocation. Animal research has equipped us with a deep mechanistic basis of these processes, which is now being applied to delusions. This work offers the exciting possibility of completing revolutions of translation, from the bedside to the bench and back again. The more we learn about animal beliefs, the more we may be able to apply to human beliefs and their aberrations, enabling a deeper mechanistic understanding. © The Author 2017. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  11. Phasic firing in vasopressin cells: understanding its functional significance through computational models.

    Directory of Open Access Journals (Sweden)

    Duncan J MacGregor

    Full Text Available Vasopressin neurons, responding to input generated by osmotic pressure, use an intrinsic mechanism to shift from slow irregular firing to a distinct phasic pattern, consisting of long bursts and silences lasting tens of seconds. With increased input, bursts lengthen, eventually shifting to continuous firing. The phasic activity remains asynchronous across the cells and is not reflected in the population output signal. Here we have used a computational vasopressin neuron model to investigate the functional significance of the phasic firing pattern. We generated a concise model of the synaptic input driven spike firing mechanism that gives a close quantitative match to vasopressin neuron spike activity recorded in vivo, tested against endogenous activity and experimental interventions. The integrate-and-fire based model provides a simple physiological explanation of the phasic firing mechanism involving an activity-dependent slow depolarising afterpotential (DAP generated by a calcium-inactivated potassium leak current. This is modulated by the slower, opposing, action of activity-dependent dendritic dynorphin release, which inactivates the DAP, the opposing effects generating successive periods of bursting and silence. Model cells are not spontaneously active, but fire when perturbed by random perturbations mimicking synaptic input. We constructed one population of such phasic neurons, and another population of similar cells but which lacked the ability to fire phasically. We then studied how these two populations differed in the way that they encoded changes in afferent inputs. By comparison with the non-phasic population, the phasic population responds linearly to increases in tonic synaptic input. Non-phasic cells respond to transient elevations in synaptic input in a way that strongly depends on background activity levels, phasic cells in a way that is independent of background levels, and show a similar strong linearization of the response
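
    The ingredients of the mechanism described above (leaky integration of random synaptic input, an activity-dependent depolarising afterpotential, and a slower opposing dynorphin-like signal) can be caricatured in a few lines of code; the parameter values below are arbitrary illustrative choices, not those of the published model:

        # Toy integrate-and-fire cell with an activity-dependent DAP that is slowly
        # inactivated by a dynorphin-like variable. Parameters are illustrative only.
        import random

        dt = 0.001                                        # time step (s)
        v_rest, v_thresh, v_reset = -65.0, -50.0, -65.0   # mV
        tau_m, tau_dap, tau_dyn = 0.02, 2.0, 20.0         # membrane, DAP, dynorphin time constants (s)
        v, dap, dyn = v_rest, 0.0, 0.0
        spike_times = []

        for step in range(int(100.0 / dt)):               # 100 s of simulated activity
            t = step * dt
            if random.random() < 150 * dt:                # random synaptic input (~150 Hz)
                v += 4.0                                  # each EPSP depolarises by 4 mV
            dap_drive = dap * max(0.0, 1.0 - dyn)         # dynorphin inactivates the DAP
            v += dt * -(v - v_rest - dap_drive) / tau_m   # leak toward a DAP-shifted baseline
            dap -= dt * dap / tau_dap                     # the DAP decays over seconds
            dyn -= dt * dyn / tau_dyn                     # dynorphin decays more slowly
            if v >= v_thresh:                             # spike: reset and update slow variables
                spike_times.append(t)
                v = v_reset
                dap += 2.0                                # each spike builds up the DAP ...
                dyn += 0.02                               # ... and the opposing dynorphin signal

        print(f"{len(spike_times)} spikes in 100 s of simulated activity")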

  12. Reducing power consumption while performing collective operations on a plurality of compute nodes

    Science.gov (United States)

    Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Peters, Amanda E [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN

    2011-10-18

    Methods, apparatus, and products are disclosed for reducing power consumption while performing collective operations on a plurality of compute nodes that include: receiving, by each compute node, instructions to perform a type of collective operation; selecting, by each compute node from a plurality of collective operations for the collective operation type, a particular collective operation in dependence upon power consumption characteristics for each of the plurality of collective operations; and executing, by each compute node, the selected collective operation.
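
    The selection step amounts to each node picking, from the implementations available for the requested operation type, the one with the most favourable power profile; the operation names and wattage figures below are invented purely for illustration:

        # Hypothetical sketch: choosing the collective-operation implementation with
        # the lowest expected power consumption. Names and figures are invented.
        POWER_PROFILE_WATTS = {
            "allreduce": {"recursive_doubling": 42.0, "ring": 35.5, "tree": 38.0},
            "broadcast": {"binomial_tree": 21.0, "scatter_allgather": 24.5},
        }

        def select_collective(op_type):
            # Pick the implementation of this operation type with the lowest
            # expected power consumption characteristics on this node.
            candidates = POWER_PROFILE_WATTS[op_type]
            return min(candidates, key=candidates.get)

        if __name__ == "__main__":
            for op in ("allreduce", "broadcast"):
                print(op, "->", select_collective(op))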

  13. Complex functionality with minimal computation: Promise and pitfalls of reduced-tracer ocean biogeochemistry models

    Science.gov (United States)

    Galbraith, Eric D.; Dunne, John P.; Gnanadesikan, Anand; Slater, Richard D.; Sarmiento, Jorge L.; Dufour, Carolina O.; de Souza, Gregory F.; Bianchi, Daniele; Claret, Mariona; Rodgers, Keith B.; Marvasti, Seyedehsafoura Sedigh

    2015-12-01

    Earth System Models increasingly include ocean biogeochemistry models in order to predict changes in ocean carbon storage, hypoxia, and biological productivity under climate change. However, state-of-the-art ocean biogeochemical models include many advected tracers that significantly increase the computational resources required, forcing a trade-off with spatial resolution. Here, we compare a state-of-the-art model with 30 prognostic tracers (TOPAZ) with two reduced-tracer models, one with 6 tracers (BLING), and the other with 3 tracers (miniBLING). The reduced-tracer models employ parameterized, implicit biological functions, which nonetheless capture many of the most important processes resolved by TOPAZ. All three are embedded in the same coupled climate model. Despite the large difference in tracer number, the absence of tracers for living organic matter is shown to have a minimal impact on the transport of nutrient elements, and the three models produce similar mean annual preindustrial distributions of macronutrients, oxygen, and carbon. Significant differences do exist among the models, in particular in the seasonal cycle of biomass and export production, but it does not appear that these are necessary consequences of the reduced tracer number. With increasing CO2, changes in dissolved oxygen and anthropogenic carbon uptake are very similar across the different models. Thus, while the reduced-tracer models do not explicitly resolve the diversity and internal dynamics of marine ecosystems, we demonstrate that such models are applicable to a broad suite of major biogeochemical concerns, including anthropogenic change. These results are very promising for the further development and application of reduced-tracer biogeochemical models that incorporate "sub-ecosystem-scale" parameterizations.

  14. Computational enzyme design approaches with significant biological outcomes: progress and challenges

    OpenAIRE

    Li, Xiaoman; Zhang, Ziding; Song, Jiangning

    2012-01-01

    Enzymes are powerful biocatalysts, however, so far there is still a large gap between the number of enzyme-based practical applications and that of naturally occurring enzymes. Multiple experimental approaches have been applied to generate nearly all possible mutations of target enzymes, allowing the identification of desirable variants with improved properties to meet the practical needs. Meanwhile, an increasing number of computational methods have been developed to assist in the modificati...

  15. Significance of cranial computer tomography for the early diagnosis of peri- and postnatal damage

    Energy Technology Data Exchange (ETDEWEB)

    Richter, E I

    1981-01-01

    Examination techniques for craniocerebral computed tomography in the peri- and postnatal period are reported. Some typical tomographic images from our own patient material of 327 children, collected over a period of 17 1/2 months, are demonstrated. The special advantages of this new, technically extensive method are: exact diagnoses, the possibility of longitudinal follow-up, and absolute harmlessness to the child.

  16. Clinical significance of computed tomography in the measurement of thyroid volume after operation for Basedow's disease

    International Nuclear Information System (INIS)

    Kasuga, Yoshio; Miyakawa, Makoto; Sugenoya, Akira

    1986-01-01

    The postoperative volume of the thyroid glands was measured using computed tomography (CT) in 16 patients with Basedow's disease. In the group which had normal postoperative thyroid function and did not need to receive T4, CT showed an increase in thyroid volume. In three of the four patients who needed to receive it, CT showed a decreased thyroid volume compared with that immediately after operation. CT has proved to serve as a tool for measuring postoperative thyroid volume in Basedow's disease in relation to postoperative prognosis. (Namekawa, K.)

  17. Interaction between FOXO1A-209 Genotype and Tea Drinking is Significantly Associated with Reduced Mortality at Advanced Ages

    DEFF Research Database (Denmark)

    Zeng, Yi; Chen, Huashuai; Ni, Ting

    2016-01-01

    Based on the genotypic/phenotypic data from Chinese Longitudinal Healthy Longevity Survey (CLHLS) and Cox proportional hazard model, the present study demonstrates that interactions between carrying FOXO1A-209 genotypes and tea drinking are significantly associated with lower risk of mortality...... at advanced ages. Such significant association is replicated in two independent Han Chinese CLHLS cohorts (p =0.028-0.048 in the discovery and replication cohorts, and p =0.003-0.016 in the combined dataset). We found the associations between tea drinking and reduced mortality are much stronger among carriers...... of the FOXO1A-209 genotype compared to non-carriers, and drinking tea is associated with a reversal of the negative effects of carrying FOXO1A-209 minor alleles, that is, from a substantially increased mortality risk to substantially reduced mortality risk at advanced ages. The impacts are considerably...

  18. Nutcracker or left renal vein compression phenomenon: multidetector computed tomography findings and clinical significance

    International Nuclear Information System (INIS)

    Cuellar i Calabria, Hug; Quiroga Gomez, Sergi; Sebastia Cerqueda, Carmen; Boye de la Presa, Rosa; Miranda, Americo; Alvarez-Castells, Agusti

    2005-01-01

    The use of multidetector computed tomography (MDCT) in routine abdominal explorations has increased the detection of the nutcracker phenomenon, defined as left renal vein (LRV) compression by adjacent anatomic structures. The embryology and anatomy of the nutcracker phenomenon are relevant as a background for the nutcracker syndrome, a rare cause of hematuria as well as other symptoms. MDCT examples of collateral renal vein circulation (gonadal, ureteric, azygous, lumbar, capsular) and aortomesenteric (anterior) and retroaortic (posterior) nutcracker phenomena in patients with no urologic complaint are shown as well as studies performed on patients with gross hematuria of uncertain origin. Incidental observation of collateral veins draining the LRV in abdominal MDCT explorations of asymptomatic patients may be a sign of a compensating nutcracker phenomenon. Imbalance between LRV compression and development of collateral circulation may lead to symptomatic nutcracker syndrome. (orig.)

  19. Low density in liver of idiopathic portal hypertension. A computed tomographic observation with possible diagnostic significance

    Energy Technology Data Exchange (ETDEWEB)

    Ishito, Hiroyuki

    1988-01-01

    In order to evaluate the diagnostic value of low density in liver on computed tomography (CT), CT scans of 11 patients with idiopathic portal hypertension (IPH) were compared with those from 22 cirrhotic patients, two patients with scarred liver and 16 normal subjects. Low densities on plain CT scans in patients with IPH were distinctly different from those observed in normal liver. Some of the low densities had irregular shape with unclear margin and were scattered near the liver surface, and others had vessel-like structures with unclear margin and extended as far as near the liver surface. Ten of the 11 patients with IPH had low densities mentioned above, while none of the 22 cirrhotic patients had such low densities. The present results suggest that the presence of low densities in liver on plain CT scan is clinically beneficial in diagnosis of IPH.

  20. Nutcracker or left renal vein compression phenomenon: multidetector computed tomography findings and clinical significance

    Energy Technology Data Exchange (ETDEWEB)

    Cuellar i Calabria, Hug; Quiroga Gomez, Sergi; Sebastia Cerqueda, Carmen; Boye de la Presa, Rosa; Miranda, Americo; Alvarez-Castells, Agusti [Hospitals Universitaris Vall D' Hebron, Institut de Diagnostic Per La Imatge, Servei De Radiodiagnostic, Barcelona (Spain)

    2005-08-01

    The use of multidetector computed tomography (MDCT) in routine abdominal explorations has increased the detection of the nutcracker phenomenon, defined as left renal vein (LRV) compression by adjacent anatomic structures. The embryology and anatomy of the nutcracker phenomenon are relevant as a background for the nutcracker syndrome, a rare cause of hematuria as well as other symptoms. MDCT examples of collateral renal vein circulation (gonadal, ureteric, azygous, lumbar, capsular) and aortomesenteric (anterior) and retroaortic (posterior) nutcracker phenomena in patients with no urologic complaint are shown as well as studies performed on patients with gross hematuria of uncertain origin. Incidental observation of collateral veins draining the LRV in abdominal MDCT explorations of asymptomatic patients may be a sign of a compensating nutcracker phenomenon. Imbalance between LRV compression and development of collateral circulation may lead to symptomatic nutcracker syndrome. (orig.)

  1. Diagnostic significance and therapeutic consequences of computed tomography (patient outcome research). Pt. 1. Diagnosis in traumatology

    International Nuclear Information System (INIS)

    Schroeder, R.J.; Hidajat, N.; Vogl, T.; Haas, N.; Suedkamp, N.; Schedel, H.; Felix, R.

    1995-01-01

    During 1993, 201 primary traumatologic patients underwent 230 computed tomography examinations. 87% of the CTs were performed entirely without contrast media, 2.6% exclusively with intravenously administered contrast media, 9.1% with both approaches, and 1.3% after intra-articular contrast media administration. 97.4% served primary diagnostic purposes and 2.6% the control of therapeutic results. In 47.8% of the CTs, the principal diagnosis was known before CT. In 52.2%, the diagnosis could not be established by other methods without CT. The CT diagnoses were correctly positive in 58.7% and correctly negative in 41.3%. 60.9% of CTs demonstrated that there was no indication for operation in the examined body region; in 39.1% an operation followed. (orig.) [de]

  2. Defining Spaces of Potential Art: The significance of representation in computer-aided creativity

    DEFF Research Database (Denmark)

    Dahlstedt, Palle

    2005-01-01

    One way of looking at the creative process is as a search in a space of possible answers. One way of simulating such a process is through evolutionary algorithms, i.e., simulated evolution by random variation and selection. The search space is defined by the chosen genetic representation, a kind...... of formal description, and the ways of navigating the space are defined by the choice of genetic operators (e.g., mutations). In creative systems, such as computer-aided music composition tools, these choices determine the efficiency of the system, in terms of the diversity of the results, the degree...... of novelty and the coherence within the material. Based on various implementations developed during five years of research, and experiences from real-life artistic applications, I will explain and discuss these mechanisms, from a perspective of the creative artist....

  3. [The significance of dermatologic management in computer-assisted occupational dermatology consultation].

    Science.gov (United States)

    Rakoski, J; Borelli, S

    1989-01-15

    At our occupational outpatient clinic, 230 patients were treated over a period of about 15 months. With the help of a standardized questionnaire, we registered all data regarding the relevant substances the patients had contact with during their work, as well as their various jobs since leaving school. The patients were seen repeatedly and trained in procedures of skin care and skin protection. If required, we took steps to find new jobs for them within their employing company; this was done in cooperation with the trade cooperative association in accordance with the dermatological insurance consultancy. If these proceedings did not work out, the patient had to change his profession altogether. All data were computerized. As an example of this computer-based documentation we present the data of barbers.

  4. The significance of routine thoracic computed tomography in patients with blunt chest trauma.

    Science.gov (United States)

    Çorbacıoğlu, Seref Kerem; Er, Erhan; Aslan, Sahin; Seviner, Meltem; Aksel, Gökhan; Doğan, Nurettin Özgür; Güler, Sertaç; Bitir, Aysen

    2015-05-01

    The purpose of this study is to investigate whether the use of thoracic computed tomography (TCT) as part of nonselective computed tomography (CT) guidelines is superior to selective CT during the diagnosis of blunt chest trauma. This study was planned as a prospective cohort study, and it was conducted at the emergency department between 2013 and 2014. A total of 260 adult patients who did not meet the exclusion criteria were enrolled in the study. All patients were evaluated by an emergency physician, and their primary surveys were completed based on the Advanced Trauma Life Support (ATLS) principles. Based on the initial findings and ATLS recommendations, patients in whom thoracic CT was indicated were determined (selective CT group). Routine CTs were then performed on all patients. Thoracic injuries were found in 97 (37.3%) patients following routine TCT. In 53 (20%) patients, thoracic injuries were found by selective CT. Routine TCT was able to detect chest injury in 44 (16%) patients for whom selective TCT would not otherwise be ordered based on the EP evaluation (nonselective TCT group). Five (2%) patients in this nonselective TCT group required tube thoracostomy, while there was no additional treatment provided for thoracic injuries in the remaining 39 (15%). In conclusion, we found that the nonselective TCT method was superior to the selective TCT method in detecting thoracic injuries in patients with blunt trauma. Furthermore, we were able to demonstrate that the nonselective TCT method can change the course of patient management albeit at low rates. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Significance of computed tomography in the diagnosis of the mediastinal mass lesions

    Energy Technology Data Exchange (ETDEWEB)

    Kimura, Masanori; Takashima, Tsutomu; Suzuki, Masayuki; Itoh, Hiroshi; Hirose, Jinichiro; Choto, Shuichi (Kanazawa Univ. (Japan). School of Medicine)

    1983-08-01

    Thirty cases of mediastinal mass lesions were examined by computed tomography and the diagnostic ability of CT was retrospectively evaluated. We divided them into two major groups: cystic and solid lesions. Cysts and cystic teratomas were differentiated on the basis of the thickness of their walls. Pericardial cysts were typically present at the cardiophrenic angle. In the solid mediastinal lesions, the presence of calcific and/or fatty components, the presence of necrosis, the irregularity of the margin and the obliteration of the surrounding fat layer were the clues to differential diagnosis and to the evaluation of invasiveness. Although differential diagnosis of the solid anterior mediastinal tumors was often difficult, teratomas with calcific and fatty components were easily diagnosed. Invasiveness of malignant thymomas and other malignant lesions was successfully evaluated to some extent. Neurogenic posterior mediastinal tumors were easily diagnosed because of the presence of spine deformity and a typical dumbbell-shaped appearance. We stress that our diagnostic approach is useful in differentiating mediastinal mass lesions.

  6. Significance of computed tomography in the diagnosis of the mediastinal mass lesions

    International Nuclear Information System (INIS)

    Kimura, Masanori; Takashima, Tsutomu; Suzuki, Masayuki; Itoh, Hiroshi; Hirose, Jinichiro; Choto, Shuichi

    1983-01-01

    Thirty cases of mediastinal mass lesions were examined by computed tomography and the diagnostic ability of CT was retrospectively evaluated. We divided them into two major groups: cystic and solid lesions. Cysts and cystic teratomas were differentiated on the basis of the thickness of their walls. Pericardial cysts were typically present at the cardiophrenic angle. In the solid mediastinal lesions, the presence of calcific and/or fatty components, the presence of necrosis, the irregularity of the margin and the obliteration of the surrounding fat layer were the clues to differential diagnosis and to the evaluation of invasiveness. Although differential diagnosis of the solid anterior mediastinal tumors was often difficult, teratomas with calcific and fatty components were easily diagnosed. Invasiveness of malignant thymomas and other malignant lesions was successfully evaluated to some extent. Neurogenic posterior mediastinal tumors were easily diagnosed because of the presence of spine deformity and a typical dumbbell-shaped appearance. We stress that our diagnostic approach is useful in differentiating mediastinal mass lesions. (author)

  7. The clinical significance of Fuji computed radiography on lateral chest radiogram

    International Nuclear Information System (INIS)

    Kifune, Kouichi

    1995-01-01

    The purpose of this study was to clarify the benefits of the digital lateral chest radiogram. In the basic study, the modulation transfer factor (MTF) and the Wiener spectra (WS) of conventional screen film (CSF) and Fuji computed radiography (FCR) were measured. The visibility of simulated nodules on FCR using 3 human bodies was subjectively compared with that on CSF by 13 observers. In the clinical study, the visibility of the normal structures on FCR was subjectively compared with that on CSF using 50 lateral chest radiograms by 10 observers. The diagnostic performance in detecting pulmonary nodules on FCR was also compared with that on CSF using 30 positive and 30 negative cases by 8 observers. In the basic study, the MTF of FCR was superior to that of CSF, and the WS of FCR displayed in half size was superior to that of CSF. In all exposure conditions, the visibility of the nodules on FCR in the pulmonary apex was inferior to that on CSF, while FCR was superior to CSF in the other lung fields. However, the visibility of the nodules on FCR in the pulmonary apex was improved when the exposure condition was increased. In the clinical study, the visibility of the normal structures on FCR was comparable or superior to that on CSF except for the interlobar fissure, owing to resolution properties. The diagnostic performance for pulmonary nodules on FCR was comparable to that on CSF, especially in classifying the marginal character and diameter of the nodules. According to the location of the nodules, the detectability of FCR was superior to that of CSF in the retrosternal space and tended to be inferior to that of CSF in the pulmonary apex. An adequate exposure condition should be considered before discussing the visibility and detectability of abnormal shadows in the lateral chest radiogram. In conclusion, the digital lateral chest radiogram is superior to the CSF images, mainly because of the wide latitude of FCR. (author)

  8. [Diagnostic significance of multi-slice computed tomography imaging in congenital inner ear malformations].

    Science.gov (United States)

    Ma, Hui; Han, Ping; Liang, Bo; Liu, Fang; Tian, Zhi-Liang; Lei, Zi-Qiao; Li, You-Lin; Kong, Wei-Jia

    2005-04-01

    To evaluate the feasibility and usability of multi-slice computed tomography (MSCT) in congenital inner ear malformations. Forty-four patients with sensorineural hearing loss (SNHL) were examined by a Somatom Sensation 16 (Siemens, Germany) CT scanner with the following parameters: 120 kV, 100 mAs, 0.75 mm collimation, 1 mm reconstruction increment, a pitch factor of 1 and a field of view of 100 mm. The axial images of the ears of interest were reconstructed with a 0.1 mm reconstruction increment and a field of view of 50 mm. The 3D reconstructions were done with the volume rendering technique (VRT) on the workstation (3D Virtuoso and Wizard, Siemens). Twenty-five patients were normal and 19 patients (36 ears) had congenital inner ear malformations among the 44 patients scanned with MSCT. Of the malformations, the axial, MPR and VRT images could all display the site and degree in 33 ears. VRT images were superior to the axial images in displaying the malformations in 3 ears with small lateral semicircular canal malformations. The malformations were Michel deformity (1 ear), common cavity deformity (3 ears), incomplete partition I (3 ears), incomplete partition II (Mondini deformity, 5 ears), vestibular and semicircular canal malformations (14 ears), dilated vestibular aqueduct (16 ears, of which 6 ears were accompanied by other malformations), and internal auditory canal malformation (8 ears, all accompanied by other malformations). MSCT allows comprehensive assessment of various congenital inner ear malformations through high-quality MPR and VRT reconstructions. VRT images can display the site and degree of the malformations three-dimensionally and intuitively. It is very useful for cochlear implantation.

  9. A Recombinant Multi-Stage Vaccine against Paratuberculosis Significantly Reduces Bacterial Level in Tissues without Interference in Diagnostics

    DEFF Research Database (Denmark)

    Jungersen, Gregers; Thakur, Aneesh; Aagaard, C.

    , PPDj-specific IFN-γ responses or positive PPDa or PPDb skin tests developed in vaccinees. Antibodies and cell-mediated immune responses were developed against FET11 antigens, however. At necropsy 8 or 12 months of age, relative Map burden was determined in a number of gut tissues by quantitative IS900...... PCR and revealed significantly reduced levels of Map and reduced histopathology. Diagnostic tests for antibody responses and cell-mediated immune responses, used as surrogates of infection, corroborated the observed vaccine efficacy: Five of seven non‐vaccinated calves seroconverted in ID Screen......-γ assay responses from 40 to 52 weeks compared to non-vaccinated calves. These results indicate the FET11 vaccine can be used to accelerate eradication of paratuberculosis while surveillance or test-and-manage control programs for tuberculosis and Johne’s disease remain in place. Funded by EMIDA ERA...

  10. Reducing the Digital Divide among Children Who Received Desktop or Hybrid Computers for the Home

    Directory of Open Access Journals (Sweden)

    Gila Cohen Zilka

    2016-06-01

    Full Text Available Researchers and policy makers have been exploring ways to reduce the digital divide. Parameters commonly used to examine the digital divide worldwide, as well as in this study, are: (a) the digital divide in the accessibility and mobility of the ICT infrastructure and of the content infrastructure (e.g., sites used in school); and (b) the digital divide in literacy skills. In the present study we examined the degree of effectiveness of receiving a desktop or hybrid computer for the home in reducing the digital divide among children of low socio-economic status aged 8-12 from various localities across Israel. The sample consisted of 1,248 respondents assessed in two measurements. As part of the mixed-method study, 128 children were also interviewed. Findings indicate that after the children received desktop or hybrid computers, changes occurred in their frequency of access, mobility, and computer literacy. Differences were found between the groups: hybrid computers reduce disparities and promote work with the computer and surfing the Internet more than do desktop computers. Narrowing the digital divide for this age group has many implications for the acquisition of skills and study habits, and consequently, for the realization of individual potential. The children spoke about self improvement as a result of exposure to the digital environment, about a sense of empowerment and of improvement in their advantage in the social fabric. Many children expressed a desire to continue their education and expand their knowledge of computer applications, the use of software, of games, and more. Therefore, if there is no computer in the home and it is necessary to decide between a desktop and a hybrid computer, a hybrid computer is preferable.

  11. Lime and Phosphate Amendment Can Significantly Reduce Uptake of Cd and Pb by Field-Grown Rice

    Directory of Open Access Journals (Sweden)

    Rongbo Xiao

    2017-03-01

    Full Text Available Agricultural soils are suffering from increasing heavy metal pollution, among which, paddy soil polluted by heavy metals is frequently reported and has elicited great public concern. In this study, we carried out field experiments on paddy soil around a Pb-Zn mine to study amelioration effects of four soil amendments on uptake of Cd and Pb by rice, and to make recommendations for paddy soil heavy metal remediation, particularly for combined pollution of Cd and Pb. The results showed that all the four treatments can significantly reduce the Cd and Pb content in the late rice grain compared with the early rice, among which, the combination amendment of lime and phosphate had the best remediation effects where rice grain Cd content was reduced by 85% and 61%, respectively, for the late rice and the early rice, and by 30% in the late rice grain for Pb. The high reduction effects under the Ca + P treatment might be attributed to increase of soil pH from 5.5 to 6.7. We also found that influence of the Ca + P treatment on rice production was insignificant, while the available Cd and Pb content in soil was reduced by 16.5% and 11.7%, respectively.

  12. Reduced bone mineral density is not associated with significantly reduced bone quality in men and women practicing long-term calorie restriction with adequate nutrition.

    Science.gov (United States)

    Villareal, Dennis T; Kotyk, John J; Armamento-Villareal, Reina C; Kenguva, Venkata; Seaman, Pamela; Shahar, Allon; Wald, Michael J; Kleerekoper, Michael; Fontana, Luigi

    2011-02-01

    Calorie restriction (CR) reduces bone quantity but not bone quality in rodents. Nothing is known regarding the long-term effects of CR with adequate intake of vitamin and minerals on bone quantity and quality in middle-aged lean individuals. In this study, we evaluated body composition, bone mineral density (BMD), and serum markers of bone turnover and inflammation in 32 volunteers who had been eating a CR diet (approximately 35% less calories than controls) for an average of 6.8 ± 5.2 years (mean age 52.7 ± 10.3 years) and 32 age- and sex-matched sedentary controls eating Western diets (WD). In a subgroup of 10 CR and 10 WD volunteers, we also measured trabecular bone (TB) microarchitecture of the distal radius using high-resolution magnetic resonance imaging. We found that the CR volunteers had significantly lower body mass index than the WD volunteers (18.9 ± 1.2 vs. 26.5 ± 2.2 kg m(-2) ; P = 0.0001). BMD of the lumbar spine (0.870 ± 0.11 vs. 1.138 ± 0.12 g cm(-2) , P = 0.0001) and hip (0.806 ± 0.12 vs. 1.047 ± 0.12 g cm(-2) , P = 0.0001) was also lower in the CR than in the WD group. Serum C-terminal telopeptide and bone-specific alkaline phosphatase concentration were similar between groups, while serum C-reactive protein (0.19 ± 0.26 vs. 1.46 ± 1.56 mg L(-1) , P = 0.0001) was lower in the CR group. Trabecular bone microarchitecture parameters such as the erosion index (0.916 ± 0.087 vs. 0.877 ± 0.088; P = 0.739) and surface-to-curve ratio (10.3 ± 1.4 vs. 12.1 ± 2.1, P = 0.440) were not significantly different between groups. These findings suggest that markedly reduced BMD is not associated with significantly reduced bone quality in middle-aged men and women practicing long-term calorie restriction with adequate nutrition.

  13. Smoking cessation programmes in radon affected areas: can they make a significant contribution to reducing radon-induced lung cancers?

    International Nuclear Information System (INIS)

    Denman, A.R.; Groves-Kirkby, C.J.; Timson, K.; Shield, G.; Rogers, S.; Phillips, P.S.

    2008-01-01

    Domestic radon levels in parts of the UK are sufficiently high to increase the risk of lung cancer in the occupants. Public health campaigns in Northamptonshire, a designated radon affected area with 6.3% of homes having average radon levels over the UK action level of 200 Bq m-3, have encouraged householders to test for radon and then to carry out remediation in their homes, but have been only partially successful. Only 40% of Northamptonshire houses have been tested, and only 15% of householders finding raised levels proceed to remediate. Of those who did remediate, only 9% smoked, compared to a countywide average of 28.8%. This is unfortunate, since radon and smoking combine to place the individual at higher risk by a factor of around 4, and suggests that current strategies to reduce domestic radon exposure are not reaching those most at risk. During 2004-5, the NHS Stop Smoking Services in Northamptonshire assisted 2,808 smokers to quit to the 4-week stage, with some 30% of 4-week quitters remaining quitters at 1 year. We consider whether smoking cessation campaigns make significant contributions to radon risk reduction on their own, by assessing individual occupants' risk of developing lung cancer from knowledge of their age, gender, and smoking habits, together with the radon level in their house. The results demonstrate that smoking cessation programmes have significant added value in radon affected areas, and contribute a greater health benefit than reducing radon levels in the smokers' homes, whilst they remain smokers. Additionally, results are presented from a questionnaire-based survey of quitters, addressing their reasons for seeking help in quitting smoking, and whether knowledge of radon risks influenced this decision. The impact of these findings on future public health campaigns to reduce the impact of radon and smoking is discussed. (author)

  14. Computer input devices: neutral party or source of significant error in manual lesion segmentation?

    Science.gov (United States)

    Chen, James Y; Seagull, F Jacob; Nagy, Paul; Lakhani, Paras; Melhem, Elias R; Siegel, Eliot L; Safdar, Nabile M

    2011-02-01

    Lesion segmentation involves outlining the contour of an abnormality on an image to distinguish boundaries between normal and abnormal tissue and is essential to track malignant and benign disease in medical imaging for clinical, research, and treatment purposes. A laser optical mouse and a graphics tablet were used by radiologists to segment 12 simulated reference lesions per subject in two groups (one group comprised three lesion morphologies in two sizes, one for each input device for each device two sets of six, composed of three morphologies in two sizes each). Time for segmentation was recorded. Subjects completed an opinion survey following segmentation. Error in contour segmentation was calculated using root mean square error. Error in area of segmentation was calculated compared to the reference lesion. 11 radiologists segmented a total of 132 simulated lesions. Overall error in contour segmentation was less with the graphics tablet than with the mouse (P Error in area of segmentation was not significantly different between the tablet and the mouse (P = 0.62). Time for segmentation was less with the tablet than the mouse (P = 0.011). All subjects preferred the graphics tablet for future segmentation (P = 0.011) and felt subjectively that the tablet was faster, easier, and more accurate (P = 0.0005). For purposes in which accuracy in contour of lesion segmentation is of the greater importance, the graphics tablet is superior to the mouse in accuracy with a small speed benefit. For purposes in which accuracy of area of lesion segmentation is of greater importance, the graphics tablet and mouse are equally accurate.
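
    The root-mean-square contour error and the area comparison used above can be illustrated with a small sketch; the pairwise correspondence of contour points and the example coordinates are simplifying assumptions, not the study's actual matching scheme:

        # Simplified RMS contour error and area difference between a segmented
        # contour and a reference contour (illustrative coordinates only).
        import math

        def rms_contour_error(segmented, reference):
            # Contours are lists of (x, y) points assumed to correspond pairwise.
            squared = [(sx - rx) ** 2 + (sy - ry) ** 2
                       for (sx, sy), (rx, ry) in zip(segmented, reference)]
            return math.sqrt(sum(squared) / len(squared))

        def polygon_area(points):
            # Shoelace formula for the enclosed area of a closed contour.
            n = len(points)
            return abs(sum(points[i][0] * points[(i + 1) % n][1]
                           - points[(i + 1) % n][0] * points[i][1]
                           for i in range(n))) / 2.0

        reference = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0), (0.0, 10.0)]
        segmented = [(0.5, -0.3), (10.2, 0.4), (9.7, 10.1), (-0.2, 9.8)]
        print("RMS contour error:", round(rms_contour_error(segmented, reference), 2))
        print("area difference:", round(polygon_area(segmented) - polygon_area(reference), 2))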

  15. Lipid Replacement Therapy Drink Containing a Glycophospholipid Formulation Rapidly and Significantly Reduces Fatigue While Improving Energy and Mental Clarity

    Directory of Open Access Journals (Sweden)

    Robert Settineri

    2011-08-01

    Full Text Available Background: Fatigue is the most common complaint of patients seeking general medical care and is often treated with stimulants. It is also important in various physical activities of relatively healthy men and women, such as sports performance. Recent clinical trials using patients with chronic fatigue have shown the benefit of Lipid Replacement Therapy in restoring mitochondrial electron transport function and reducing moderate to severe chronic fatigue. Methods: Lipid Replacement Therapy was administered for the first time as an all-natural functional food drink (60 ml) containing polyunsaturated glycophospholipids but devoid of stimulants or herbs, to reduce fatigue. This preliminary study used the Piper Fatigue Survey instrument as well as a supplemental questionnaire to assess the effects of the glycophospholipid drink on fatigue and the acceptability of the test drink in adult men and women. A volunteer group of 29 subjects of mean age 56.2±4.5 years with various fatigue levels were randomly recruited in a clinical health fair setting to participate in an afternoon open label trial on the effects of the test drink. Results: Using the Piper Fatigue instrument, overall fatigue among participants was reduced within the 3-hour seminar by a mean of 39.6% (p<0.0001). All of the subcategories of fatigue showed significant reductions. Some subjects responded within 15 minutes, and the majority responded within one hour with increased energy and activity and perceived improvements in cognitive function, mental clarity and focus. The test drink was determined to be quite acceptable in terms of taste and appearance. There were no adverse events from the energy drink during the study. Functional Foods in Health and Disease 2011; 8:245-254. Conclusions: The Lipid Replacement Therapy functional food drink appeared to be a safe, acceptable and potentially useful new method to reduce fatigue, sustain energy and improve perceptions of mental function.

  16. Significant Association between Sulfate-Reducing Bacteria and Uranium-Reducing Microbial Communities as Revealed by a Combined Massively Parallel Sequencing-Indicator Species Approach

    Science.gov (United States)

    Cardenas, Erick; Wu, Wei-Min; Leigh, Mary Beth; Carley, Jack; Carroll, Sue; Gentry, Terry; Luo, Jian; Watson, David; Gu, Baohua; Ginder-Vogel, Matthew; Kitanidis, Peter K.; Jardine, Philip M.; Zhou, Jizhong; Criddle, Craig S.; Marsh, Terence L.; Tiedje, James M.

    2010-01-01

    Massively parallel sequencing has provided a more affordable and high-throughput method to study microbial communities, although it has mostly been used in an exploratory fashion. We combined pyrosequencing with a strict indicator species statistical analysis to test if bacteria specifically responded to ethanol injection that successfully promoted dissimilatory uranium(VI) reduction in the subsurface of a uranium contamination plume at the Oak Ridge Field Research Center in Tennessee. Remediation was achieved with a hydraulic flow control consisting of an inner loop, where ethanol was injected, and an outer loop for flow-field protection. This strategy reduced uranium concentrations in groundwater to levels below 0.126 μM and created geochemical gradients in electron donors from the inner-loop injection well toward the outer loop and downgradient flow path. Our analysis with 15 sediment samples from the entire test area found significant indicator species that showed a high degree of adaptation to the three different hydrochemical-created conditions. Castellaniella and Rhodanobacter characterized areas with low pH, heavy metals, and low bioactivity, while sulfate-, Fe(III)-, and U(VI)-reducing bacteria (Desulfovibrio, Anaeromyxobacter, and Desulfosporosinus) were indicators of areas where U(VI) reduction occurred. The abundance of these bacteria, as well as the Fe(III) and U(VI) reducer Geobacter, correlated with the hydraulic connectivity to the substrate injection site, suggesting that the selected populations were a direct response to electron donor addition by the groundwater flow path. A false-discovery-rate approach was implemented to discard false-positive results by chance, given the large amount of data compared. PMID:20729318

  17. Significant association between sulfate-reducing bacteria and uranium-reducing microbial communities as revealed by a combined massively parallel sequencing-indicator species approach.

    Science.gov (United States)

    Cardenas, Erick; Wu, Wei-Min; Leigh, Mary Beth; Carley, Jack; Carroll, Sue; Gentry, Terry; Luo, Jian; Watson, David; Gu, Baohua; Ginder-Vogel, Matthew; Kitanidis, Peter K; Jardine, Philip M; Zhou, Jizhong; Criddle, Craig S; Marsh, Terence L; Tiedje, James M

    2010-10-01

    Massively parallel sequencing has provided a more affordable and high-throughput method to study microbial communities, although it has mostly been used in an exploratory fashion. We combined pyrosequencing with a strict indicator species statistical analysis to test if bacteria specifically responded to ethanol injection that successfully promoted dissimilatory uranium(VI) reduction in the subsurface of a uranium contamination plume at the Oak Ridge Field Research Center in Tennessee. Remediation was achieved with a hydraulic flow control consisting of an inner loop, where ethanol was injected, and an outer loop for flow-field protection. This strategy reduced uranium concentrations in groundwater to levels below 0.126 μM and created geochemical gradients in electron donors from the inner-loop injection well toward the outer loop and downgradient flow path. Our analysis with 15 sediment samples from the entire test area found significant indicator species that showed a high degree of adaptation to the three different hydrochemical-created conditions. Castellaniella and Rhodanobacter characterized areas with low pH, heavy metals, and low bioactivity, while sulfate-, Fe(III)-, and U(VI)-reducing bacteria (Desulfovibrio, Anaeromyxobacter, and Desulfosporosinus) were indicators of areas where U(VI) reduction occurred. The abundance of these bacteria, as well as the Fe(III) and U(VI) reducer Geobacter, correlated with the hydraulic connectivity to the substrate injection site, suggesting that the selected populations were a direct response to electron donor addition by the groundwater flow path. A false-discovery-rate approach was implemented to discard false-positive results by chance, given the large amount of data compared.
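
    The false-discovery-rate step mentioned at the end corresponds in spirit to a Benjamini-Hochberg style correction over the many indicator-species tests; a generic sketch of that procedure (not the authors' exact implementation, and with invented p-values) is:

        # Generic Benjamini-Hochberg false-discovery-rate filter applied to a set of
        # p-values from indicator-species tests; the values are invented examples.
        def benjamini_hochberg(pvalues, alpha=0.05):
            m = len(pvalues)
            ranked = sorted(range(m), key=lambda i: pvalues[i])
            largest_passing_rank = 0
            for rank, idx in enumerate(ranked, start=1):
                if pvalues[idx] <= rank / m * alpha:
                    largest_passing_rank = rank
            keep = [False] * m
            for rank, idx in enumerate(ranked, start=1):
                if rank <= largest_passing_rank:
                    keep[idx] = True
            return keep

        pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.60, 0.74]
        print(benjamini_hochberg(pvals))  # True marks tests retained after FDR control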

  18. Reducing Eating Disorder Onset in a Very High Risk Sample with Significant Comorbid Depression: A Randomized Controlled Trial

    Science.gov (United States)

    Taylor, C. Barr; Kass, Andrea E.; Trockel, Mickey; Cunning, Darby; Weisman, Hannah; Bailey, Jakki; Sinton, Meghan; Aspen, Vandana; Schecthman, Kenneth; Jacobi, Corinna; Wilfley, Denise E.

    2015-01-01

    Objective Eating disorders (EDs) are serious problems among college-age women and may be preventable. An indicated on-line eating disorder (ED) intervention, designed to reduce ED and comorbid pathology, was evaluated. Method 206 women (M age = 20 ± 1.8 years; 51% White/Caucasian, 11% African American, 10% Hispanic, 21% Asian/Asian American, 7% other) at very high risk for ED onset (i.e., with high weight/shape concerns plus a history of being teased, current or lifetime depression, and/or non-clinical levels of compensatory behaviors) were randomized to a 10-week, Internet-based, cognitive-behavioral intervention or wait-list control. Assessments included the Eating Disorder Examination (EDE to assess ED onset), EDE-Questionnaire, Structured Clinical Interview for DSM Disorders, and Beck Depression Inventory-II. Results ED attitudes and behaviors improved more in the intervention than control group (p = 0.02, d = 0.31); although ED onset rate was 27% lower, this difference was not significant (p = 0.28, NNT = 15). In the subgroup with highest shape concerns, ED onset rate was significantly lower in the intervention than control group (20% versus 42%, p = 0.025, NNT = 5). For the 27 individuals with depression at baseline, depressive symptomatology improved more in the intervention than control group (p = 0.016, d = 0.96); although ED onset rate was lower in the intervention than control group, this difference was not significant (25% versus 57%, NNT = 4). Conclusions An inexpensive, easily disseminated intervention might reduce ED onset among those at highest risk. Low adoption rates need to be addressed in future research. PMID:26795936

  19. Reducing eating disorder onset in a very high risk sample with significant comorbid depression: A randomized controlled trial.

    Science.gov (United States)

    Taylor, C Barr; Kass, Andrea E; Trockel, Mickey; Cunning, Darby; Weisman, Hannah; Bailey, Jakki; Sinton, Meghan; Aspen, Vandana; Schecthman, Kenneth; Jacobi, Corinna; Wilfley, Denise E

    2016-05-01

    Eating disorders (EDs) are serious problems among college-age women and may be preventable. An indicated online eating disorder (ED) intervention, designed to reduce ED and comorbid pathology, was evaluated. 206 women (M age = 20 ± 1.8 years; 51% White/Caucasian, 11% African American, 10% Hispanic, 21% Asian/Asian American, 7% other) at very high risk for ED onset (i.e., with high weight/shape concerns plus a history of being teased, current or lifetime depression, and/or nonclinical levels of compensatory behaviors) were randomized to a 10-week, Internet-based, cognitive-behavioral intervention or waitlist control. Assessments included the Eating Disorder Examination (EDE, to assess ED onset), EDE-Questionnaire, Structured Clinical Interview for DSM Disorders, and Beck Depression Inventory-II. ED attitudes and behaviors improved more in the intervention than control group (p = .02, d = 0.31); although ED onset rate was 27% lower, this difference was not significant (p = .28, NNT = 15). In the subgroup with highest shape concerns, ED onset rate was significantly lower in the intervention than control group (20% vs. 42%, p = .025, NNT = 5). For the 27 individuals with depression at baseline, depressive symptomatology improved more in the intervention than control group (p = .016, d = 0.96); although ED onset rate was lower in the intervention than control group, this difference was not significant (25% vs. 57%, NNT = 4). An inexpensive, easily disseminated intervention might reduce ED onset among those at highest risk. Low adoption rates need to be addressed in future research. (c) 2016 APA, all rights reserved).

  20. Potent corticosteroid cream (mometasone furoate) significantly reduces acute radiation dermatitis: results from a double-blind, randomized study

    International Nuclear Information System (INIS)

    Bostroem, Aasa; Lindman, Henrik; Swartling, Carl; Berne, Berit; Bergh, Jonas

    2001-01-01

    Purpose: Radiation-induced dermatitis is a very common side effect of radiation therapy, and may necessitate interruption of the therapy. There is a substantial lack of evidence-based treatments for this condition. The aim of this study was to investigate the effect of mometasone furoate cream (MMF) on radiation dermatitis in a prospective, double-blind, randomized study. Material and methods: The study comprised 49 patients with node-negative breast cancer. They were operated on with sector resection and scheduled for postoperative radiotherapy using photons with identical radiation qualities and dosage to the breast parenchyma. The patients were randomized to receive either MMF or emollient cream. The cream was applied on the irradiated skin twice a week from the start of radiotherapy until the 12th fraction (24 Gy) and thereafter once daily until 3 weeks after completion of radiation. Both groups additionally received non-blinded emollient cream daily. The intensity of the acute radiation dermatitis was evaluated on a weekly basis regarding erythema and pigmentation, using a reflectance spectrophotometer together with visual scoring of the skin reactions. Results: MMF in combination with emollient cream treatment significantly decreased acute radiation dermatitis (P=0.0033) compared with emollient cream alone. There was no significant difference in pigmentation between the two groups. Conclusions: Adding MMF, a potent topical corticosteroid, to an emollient cream is statistically significantly more effective than emollient cream alone in reducing acute radiation dermatitis.

  1. Reducing the Computational Complexity of Reconstruction in Compressed Sensing Nonuniform Sampling

    DEFF Research Database (Denmark)

    Grigoryan, Ruben; Jensen, Tobias Lindstrøm; Arildsen, Thomas

    2013-01-01

    sparse signals, but requires computationally expensive reconstruction algorithms. This can be an obstacle for real-time applications. The reduction of complexity is achieved by applying a multi-coset sampling procedure. This proposed method reduces the size of the dictionary matrix, the size...

  2. Walking with a four wheeled walker (rollator) significantly reduces EMG lower-limb muscle activity in healthy subjects.

    Science.gov (United States)

    Suica, Zorica; Romkes, Jacqueline; Tal, Amir; Maguire, Clare

    2016-01-01

    To investigate the immediate effect of four-wheeled-walker (rollator) walking on lower-limb muscle activity and trunk-sway in healthy subjects. In this cross-sectional design, electromyographic (EMG) data were collected in six lower-limb muscle groups and trunk-sway was measured as peak-to-peak angular displacement of the centre-of-mass (level L2/3) in the sagittal and frontal planes using the SwayStar balance system. Nineteen subjects walked at self-selected speed, first without a rollator and then, in randomised order, (1) with a rollator and (2) with a rollator with increased weight-bearing. Rollator-walking caused statistically significant reductions in EMG activity in lower-limb muscle groups and effect-sizes were medium to large. Increased weight-bearing increased the effect. Trunk-sway in the sagittal and frontal planes showed no statistically significant difference between conditions. Rollator-walking reduces lower-limb muscle activity but trunk-sway remains unchanged, as stability is likely gained through forces generated by the upper limbs. Short-term stability is gained but the long-term effect is unclear and requires investigation. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Modest hypoxia significantly reduces triglyceride content and lipid droplet size in 3T3-L1 adipocytes

    Energy Technology Data Exchange (ETDEWEB)

    Hashimoto, Takeshi, E-mail: thashimo@fc.ritsumei.ac.jp [Faculty of Sport and Health Science, Ritsumeikan University, 1-1-1 Nojihigashi, Kusatsu, Shiga 525-8577 (Japan); Yokokawa, Takumi; Endo, Yuriko [Faculty of Sport and Health Science, Ritsumeikan University, 1-1-1 Nojihigashi, Kusatsu, Shiga 525-8577 (Japan); Iwanaka, Nobumasa [Ritsumeikan Global Innovation Research Organization, Ritsumeikan University, 1-1-1 Nojihigashi, Kusatsu, Shiga 525-8577 (Japan); Higashida, Kazuhiko [Faculty of Sport and Health Science, Ritsumeikan University, 1-1-1 Nojihigashi, Kusatsu, Shiga 525-8577 (Japan); Faculty of Sport Science, Waseda University, 2-579-15 Mikajima, Tokorozawa, Saitama 359-1192 (Japan); Taguchi, Sadayoshi [Faculty of Sport and Health Science, Ritsumeikan University, 1-1-1 Nojihigashi, Kusatsu, Shiga 525-8577 (Japan)

    2013-10-11

    Highlights: •Long-term hypoxia decreased the size of LDs and lipid storage in 3T3-L1 adipocytes. •Long-term hypoxia increased basal lipolysis in 3T3-L1 adipocytes. •Hypoxia decreased lipid-associated proteins in 3T3-L1 adipocytes. •Hypoxia decreased basal glucose uptake and lipogenic proteins in 3T3-L1 adipocytes. •Hypoxia-mediated lipogenesis may be an attractive therapeutic target against obesity. -- Abstract: Background: A previous study has demonstrated that endurance training under hypoxia results in a greater reduction in body fat mass compared to exercise under normoxia. However, the cellular and molecular mechanisms that underlie this hypoxia-mediated reduction in fat mass remain uncertain. Here, we examine the effects of modest hypoxia on adipocyte function. Methods: Differentiated 3T3-L1 adipocytes were incubated at 5% O2 for 1 week (long-term hypoxia, HL) or one day (short-term hypoxia, HS) and compared with a normoxia control (NC). Results: HL, but not HS, resulted in a significant reduction in lipid droplet size and triglyceride content (by 50%) compared to NC (p < 0.01). As estimated by glycerol release, isoproterenol-induced lipolysis was significantly lowered by hypoxia, whereas the release of free fatty acids under the basal condition was prominently enhanced with HL compared to NC or HS (p < 0.01). Lipolysis-associated proteins, such as perilipin 1 and hormone-sensitive lipase, were unchanged, whereas adipose triglyceride lipase and its activator protein CGI-58 were decreased with HL in comparison to NC. Interestingly, such lipogenic proteins as fatty acid synthase, lipin-1, and peroxisome proliferator-activated receptor gamma were decreased. Furthermore, the uptake of glucose, the major precursor of 3-glycerol phosphate for triglyceride synthesis, was significantly reduced in HL compared to NC or HS (p < 0.01). Conclusion: We conclude that hypoxia has a direct impact on reducing the triglyceride content and lipid droplet size via

  4. Modest hypoxia significantly reduces triglyceride content and lipid droplet size in 3T3-L1 adipocytes

    International Nuclear Information System (INIS)

    Hashimoto, Takeshi; Yokokawa, Takumi; Endo, Yuriko; Iwanaka, Nobumasa; Higashida, Kazuhiko; Taguchi, Sadayoshi

    2013-01-01

    Highlights: •Long-term hypoxia decreased the size of LDs and lipid storage in 3T3-L1 adipocytes. •Long-term hypoxia increased basal lipolysis in 3T3-L1 adipocytes. •Hypoxia decreased lipid-associated proteins in 3T3-L1 adipocytes. •Hypoxia decreased basal glucose uptake and lipogenic proteins in 3T3-L1 adipocytes. •Hypoxia-mediated lipogenesis may be an attractive therapeutic target against obesity. -- Abstract: Background: A previous study has demonstrated that endurance training under hypoxia results in a greater reduction in body fat mass compared to exercise under normoxia. However, the cellular and molecular mechanisms that underlie this hypoxia-mediated reduction in fat mass remain uncertain. Here, we examine the effects of modest hypoxia on adipocyte function. Methods: Differentiated 3T3-L1 adipocytes were incubated at 5% O2 for 1 week (long-term hypoxia, HL) or one day (short-term hypoxia, HS) and compared with a normoxia control (NC). Results: HL, but not HS, resulted in a significant reduction in lipid droplet size and triglyceride content (by 50%) compared to NC (p < 0.01). As estimated by glycerol release, isoproterenol-induced lipolysis was significantly lowered by hypoxia, whereas the release of free fatty acids under the basal condition was prominently enhanced with HL compared to NC or HS (p < 0.01). Lipolysis-associated proteins, such as perilipin 1 and hormone-sensitive lipase, were unchanged, whereas adipose triglyceride lipase and its activator protein CGI-58 were decreased with HL in comparison to NC. Interestingly, such lipogenic proteins as fatty acid synthase, lipin-1, and peroxisome proliferator-activated receptor gamma were decreased. Furthermore, the uptake of glucose, the major precursor of 3-glycerol phosphate for triglyceride synthesis, was significantly reduced in HL compared to NC or HS (p < 0.01). Conclusion: We conclude that hypoxia has a direct impact on reducing the triglyceride content and lipid droplet size via

  5. Head multidetector computed tomography: emergency medicine physicians overestimate the pretest probability and legal risk of significant findings.

    Science.gov (United States)

    Baskerville, Jerry Ray; Herrick, John

    2012-02-01

    This study focuses on clinically assigned prospective estimated pretest probability and pretest perception of legal risk as independent variables in the ordering of multidetector computed tomographic (MDCT) head scans. Our primary aim is to measure the association between pretest probability of a significant finding and pretest perception of legal risk. Secondarily, we measure the percentage of MDCT scans that physicians would not order if there was no legal risk. This study is a prospective, cross-sectional, descriptive analysis of patients 18 years and older for whom emergency medicine physicians ordered a head MDCT. We collected a sample of 138 patients subjected to head MDCT scans. The prevalence of a significant finding in our population was 6%, yet the pretest probability expectation of a significant finding was 33%. The legal risk presumed was even more dramatic at 54%. These data support the hypothesis that physicians presume the legal risk to be significantly higher than the risk of a significant finding. A total of 21% or 15% patients (95% confidence interval, ±5.9%) would not have been subjected to MDCT if there was no legal risk. Physicians overestimated the probability that the computed tomographic scan would yield a significant result and indicated an even greater perceived medicolegal risk if the scan was not obtained. Physician test-ordering behavior is complex, and our study queries pertinent aspects of MDCT testing. The magnification of legal risk vs the pretest probability of a significant finding is demonstrated. Physicians significantly overestimated pretest probability of a significant finding on head MDCT scans and presumed legal risk. Copyright © 2012 Elsevier Inc. All rights reserved.

  6. A computational environment for creating and testing reduced chemical kinetic mechanisms

    Energy Technology Data Exchange (ETDEWEB)

    Montgomery, C.J.; Swensen, D.A.; Harding, T.V.; Cremer, M.A.; Bockelie, M.J. [Reaction Engineering International, Salt Lake City, UT (USA)

    2002-02-01

    This paper describes software called computer assisted reduced mechanism problem solving environment (CARM-PSE) that gives the engineer the ability to rapidly set up, run and examine large numbers of problems comparing detailed and reduced (approximate) chemistry. CARM-PSE integrates the automatic chemical mechanism reduction code CARM and the codes that simulate perfectly stirred reactors and plug flow reactors into a user-friendly computational environment. CARM-PSE gives the combustion engineer the ability to easily test chemical approximations over many hundreds of combinations of inputs in a multidimensional parameter space. The demonstration problems compare detailed and reduced chemical kinetic calculations for methane-air combustion, including nitrogen oxide formation, in a stirred reactor and selective non-catalytic reduction of NOx in coal combustion flue gas.

  7. A pilot study: Horticulture-related activities significantly reduce stress levels and salivary cortisol concentration of maladjusted elementary school children.

    Science.gov (United States)

    Lee, Min Jung; Oh, Wook; Jang, Ja Soon; Lee, Ju Young

    2018-04-01

    The effects of three horticulture-related activities (HRAs), including floral arranging, planting, and flower pressing were compared to see if they influenced changes on a stress scale and on salivary cortisol concentrations (SCC) in maladjusted elementary school children. Twenty maladjusted elementary school children were randomly assigned either to an experimental or control group. The control group carried out individual favorite indoor activities under the supervision of a teacher. Simultaneously, the ten children in the experimental group participated in a HRA program consisting of flower arrangement (FA), planting (P), and flower pressing (PF) activities, in which the other ten children in the control group did not take part. During nine sessions, the activities were completed as follows: FA-FA-FA, P-P-P, and PF-PF-PF; each session lasted 40 min and took place once a week. For the quantitative analysis of salivary cortisol, saliva was collected from the experimental group one week before the HRAs and immediately after the activities for 9 consecutive weeks at the same time each session. In the experimental group, stress scores of interpersonal relationship, school life, personal problems, and home life decreased after the HRAs by 1.3, 1.8, 4.2, and 1.3 points, respectively. In particular, the stress score of school life was significantly reduced (P < 0.01). In addition, from the investigation of the SCCs for the children before and after repeating HRAs three times, it was found that flower arrangement, planting, and flower pressing activities reduced the SCCs by ≥37% compared to the SCCs prior to taking part in the HRAs. These results indicate that HRAs are associated with a reduction in the stress levels of maladjusted elementary school children. Copyright © 2018. Published by Elsevier Ltd.

  8. Cerebral Embolic Protection During Transcatheter Aortic Valve Replacement Significantly Reduces Death and Stroke Compared With Unprotected Procedures.

    Science.gov (United States)

    Seeger, Julia; Gonska, Birgid; Otto, Markus; Rottbauer, Wolfgang; Wöhrle, Jochen

    2017-11-27

    The aim of this study was to evaluate the impact of cerebral embolic protection on stroke-free survival in patients undergoing transcatheter aortic valve replacement (TAVR). Imaging data on cerebral embolic protection devices have demonstrated a significant reduction in number and volume of cerebral lesions. A total of 802 consecutive patients were enrolled. The Sentinel cerebral embolic protection device (Claret Medical Inc., Santa Rosa, California) was used in 34.9% (n = 280) of consecutive patients. In 65.1% (n = 522) of patients, TAVR was performed in the identical setting except without cerebral embolic protection. Neurological follow-up was done within 7 days post-procedure. The primary endpoint was a composite of all-cause mortality or all-stroke according to Valve Academic Research Consortium-2 criteria within 7 days. Propensity score matching was performed to account for possible confounders. Both filters of the device were successfully positioned in 280 of 305 (91.8%) consecutive patients. With use of cerebral embolic protection, the rate of disabling and nondisabling stroke was significantly reduced from 4.6% to 1.4% (p = 0.03; odds ratio: 0.29, 95% confidence interval: 0.10 to 0.93) in the propensity-matched population (n = 560). The primary endpoint occurred significantly less frequently, with 2.1% (n = 6 of 280) in the protected group compared with 6.8% (n = 19 of 280) in the control group (p = 0.01; odds ratio: 0.30; 95% confidence interval: 0.12 to 0.77). In multivariable analysis, Society of Thoracic Surgeons score for mortality (p = 0.02) and TAVR without protection (p = 0.02) were independent predictors for the primary endpoint. In patients undergoing TAVR, use of a cerebral embolic protection device demonstrated a significantly higher rate of stroke-free survival compared with unprotected TAVR. Copyright © 2017 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.

  9. Dermal application of nitric oxide releasing acidified nitrite-containing liniments significantly reduces blood pressure in humans.

    Science.gov (United States)

    Opländer, Christian; Volkmar, Christine M; Paunel-Görgülü, Adnana; Fritsch, Thomas; van Faassen, Ernst E; Mürtz, Manfred; Grieb, Gerrit; Bozkurt, Ahmet; Hemmrich, Karsten; Windolf, Joachim; Suschek, Christoph V

    2012-02-15

    Vascular ischemic diseases, hypertension, and other systemic hemodynamic and vascular disorders may be the result of impaired bioavailability of nitric oxide (NO). Not only NO but also its active derivatives such as nitrite or nitroso compounds are important effector and signal molecules with vasodilating properties. Our previous findings point to a therapeutic potential of cutaneous administration of NO in the treatment of systemic hemodynamic disorders. Unfortunately, no reliable data are available on the mechanisms, kinetics and biological responses of dermal application of nitric oxide in humans in vivo. The aim of the study was to close this gap and to explore the therapeutic potential of dermal nitric oxide application. We characterized, with human skin in vitro and in vivo, the capacity of NO, applied in a NO-releasing acidified form of nitrite-containing liniments, to penetrate the epidermis and to influence local as well as systemic hemodynamic parameters. We found that dermal application of NO led to a very rapid and significant transepidermal translocation of NO into the underlying tissue. Depending on the size of treated skin area, this translocation manifests itself through a significant systemic increase of the NO derivatives nitrite and nitroso compounds, respectively. In parallel, this translocation was accompanied by an increased systemic vasodilatation and blood flow as well as reduced blood pressure. We here give evidence that in humans dermal application of NO has a therapeutic potential for systemic hemodynamic disorders that might arise from local or systemic insufficient availability of NO or its bioactive NO derivatives, respectively. Copyright © 2012 Elsevier Inc. All rights reserved.

  10. Significant change of local atomic configurations at surface of reduced activation Eurofer steels induced by hydrogenation treatments

    Energy Technology Data Exchange (ETDEWEB)

    Greculeasa, S.G.; Palade, P.; Schinteie, G. [National Institute for Materials Physics, P.O. Box MG-7, 77125, Bucharest-Magurele (Romania); Kuncser, A.; Stanciu, A. [National Institute for Materials Physics, P.O. Box MG-7, 77125, Bucharest-Magurele (Romania); University of Bucharest, Faculty of Physics, 77125, Bucharest-Magurele (Romania); Lungu, G.A. [National Institute for Materials Physics, P.O. Box MG-7, 77125, Bucharest-Magurele (Romania); Porosnicu, C.; Lungu, C.P. [National Institute for Laser, Plasma and Radiation Physics, 77125, Bucharest-Magurele (Romania); Kuncser, V., E-mail: kuncser@infim.ro [National Institute for Materials Physics, P.O. Box MG-7, 77125, Bucharest-Magurele (Romania)

    2017-04-30

    Highlights: • Engineering of Eurofer slab properties by hydrogenation treatments. • Hydrogenation modifies significantly the local atomic configurations at the surface. • Hydrogenation increases the expulsion of the Cr atoms toward the very surface. • Approaching binomial atomic distribution by hydrogenation in the next surface 100 nm. - Abstract: Reduced-activation steels such as Eurofer alloys are candidates for supporting plasma facing components in tokamak-like nuclear fusion reactors. In order to investigate the impact of hydrogen/deuterium insertion in their crystalline lattice, annealing treatments in hydrogen atmosphere have been applied on Eurofer slabs. The resulting samples have been analyzed with respect to local structure and atomic configuration both before and after successive annealing treatments, by X-ray diffractometry (XRD), scanning electron microscopy and energy dispersive spectroscopy (SEM-EDS), X-ray photoelectron spectroscopy (XPS) and conversion electron Mössbauer spectroscopy (CEMS). The corroborated data point to a bcc-type structure of the non-hydrogenated alloy, with an average alloy composition approaching Fe0.9Cr0.1 along a depth of about 100 nm. EDS elemental maps do not indicate surface inhomogeneities in concentration whereas the Mössbauer spectra prove significant deviations from a homogeneous alloying. The hydrogenation increases the expulsion of the Cr atoms toward the surface layer and decreases their oxidation, with considerable influence on the surface properties of the steel. The hydrogenation treatment is therefore proposed as a potential alternative for a convenient engineering of the surface of different Fe-Cr based alloys.

  11. Optical trapping of nanoparticles with significantly reduced laser powers by using counter-propagating beams (Presentation Recording)

    Science.gov (United States)

    Zhao, Chenglong; LeBrun, Thomas W.

    2015-08-01

    Gold nanoparticles (GNP) have wide applications ranging from nanoscale heating to cancer therapy and biological sensing. Optical trapping of GNPs as small as 18 nm has been successfully achieved with laser power as high as 855 mW, but such high powers can damage trapped particles (particularly biological systems) as well as heat the fluid, thereby destabilizing the trap. In this article, we show that counter-propagating beams (CPB) can successfully trap GNP with laser powers reduced by a factor of 50 compared to that with a single beam. The trapping position of a GNP inside a counter-propagating trap can be easily modulated by either changing the relative power or position of the two beams. Furthermore, we find that under our conditions, while a single beam most stably traps a single particle, the counter-propagating beam can more easily trap multiple particles. This (CPB) trap is compatible with the feedback control system we recently demonstrated to increase the trapping lifetimes of nanoparticles by more than an order of magnitude. Thus, we believe that the future development of advanced trapping techniques combining counter-propagating traps together with control systems should significantly extend the capabilities of optical manipulation of nanoparticles for prototyping and testing 3D nanodevices and bio-sensing.

  12. Secukinumab Significantly Reduces Psoriasis-Related Work Impairment and Indirect Costs Compared With Ustekinumab and Etanercept in the United Kingdom.

    Science.gov (United States)

    Warren, R B; Halliday, A; Graham, C N; Gilloteau, I; Miles, L; McBride, D

    2018-05-30

    Psoriasis causes work productivity impairment that increases with disease severity. Whether differential treatment efficacy translates into differential indirect cost savings is unknown. To assess work hours lost and indirect costs associated with secukinumab versus ustekinumab and etanercept in the United Kingdom (UK). This was a post hoc analysis of work impairment data collected in the CLEAR study (secukinumab vs. ustekinumab) and applied to the FIXTURE study (secukinumab vs. etanercept). Weighted weekly and annual average indirect costs per patient per treatment were calculated from (1) overall work impairment derived from Work Productivity and Activity Impairment data collected in CLEAR at 16 and 52 weeks by Psoriasis Area and Severity Index (PASI) response level; (2) weekly/annual work productivity loss by PASI response level; (3) weekly and annual indirect costs by PASI response level, based on hours of work productivity loss; and (4) weighted average indirect costs for each treatment. In the primary analysis, work impairment data for employed patients in CLEAR at Week 16 were used to compare secukinumab and ustekinumab. Secondary analyses were conducted at different timepoints and with patient cohorts, including FIXTURE. In CLEAR, 452 patients (67%) were employed at baseline. At Week 16, percentages of weekly work impairment/mean hours lost decreased with higher PASI response level (PASI 50-74: 13.3%/4.45 hours; PASI 75-89: 6.4%/2.14 hours; PASI ≥90: 4.9%/1.65 hours). Weighted mean weekly/annual work hours lost were significantly lower for secukinumab than ustekinumab (1.96/102.51 vs. 2.40/125.12; P=0.0006). Results were consistent for secukinumab versus etanercept (2.29/119.67 vs. 3.59/187.17). Secukinumab significantly reduced work impairment and associated indirect costs of psoriasis compared with ustekinumab and etanercept at Week 16 through 52 in the UK. This article is protected by copyright. All rights reserved.

  13. Implementation of a solution Cloud Computing with MapReduce model

    International Nuclear Information System (INIS)

    Baya, Chalabi

    2014-01-01

    In recent years, large-scale computer systems have emerged to meet the demands of high storage, supercomputing, and applications using very large data sets. The emergence of Cloud Computing offers the potential for analysis and processing of large data sets. MapReduce is the most popular programming model used to support the development of such applications. It was initially designed by Google for building large-scale datacenters, to provide Web search services with rapid response and high availability. In this paper we test the K-means clustering algorithm in a Cloud Computing environment. The algorithm is implemented on MapReduce and was chosen because its characteristics are representative of many iterative data analysis algorithms. We then modify the CloudSim framework to simulate the MapReduce execution of K-means clustering on different Cloud Computing platforms, depending on their size and the characteristics of the target platforms. The experiments show that the implementation of K-means clustering gives good results, especially for large data sets, and that the Cloud infrastructure has an influence on these results.
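    To make the map-reduce formulation of K-means mentioned above concrete, here is a minimal, self-contained Python sketch of one K-means iteration expressed as map and reduce phases; plain functions stand in for Hadoop tasks, and the data, cluster count, and helper names are illustrative assumptions rather than details taken from the paper.

```python
import numpy as np

def map_phase(points, centroids):
    """Map: emit (nearest-centroid index, (point, 1)) for every data point."""
    for p in points:
        idx = int(np.argmin(np.linalg.norm(centroids - p, axis=1)))
        yield idx, (p, 1)

def reduce_phase(pairs, k, dim):
    """Reduce: average the points assigned to each centroid index."""
    sums, counts = np.zeros((k, dim)), np.zeros(k)
    for idx, (p, c) in pairs:
        sums[idx] += p
        counts[idx] += c
    return np.array([sums[i] / counts[i] if counts[i] else sums[i] for i in range(k)])

def kmeans_mapreduce(points, k, iterations=10, seed=0):
    rng = np.random.default_rng(seed)
    centroids = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iterations):
        centroids = reduce_phase(map_phase(points, centroids), k, points.shape[1])
    return centroids

if __name__ == "__main__":
    data = np.vstack([np.random.randn(100, 2), np.random.randn(100, 2) + 5])
    print(kmeans_mapreduce(data, k=2))
```

    In a real Hadoop deployment the map and reduce phases would run as distributed tasks over partitions of the data; the control loop above only illustrates the data flow.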

  14. Effective computation of stochastic protein kinetic equation by reducing stiffness via variable transformation

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Lijin, E-mail: ljwang@ucas.ac.cn [School of Mathematical Sciences, University of Chinese Academy of Sciences, Beijing 100049 (China)

    2016-06-08

    The stochastic protein kinetic equations can be stiff for certain parameters, which makes their numerical simulation rely on very small time step sizes, resulting in large computational cost and accumulated round-off errors. For such situations, we provide a method of reducing the stiffness of the stochastic protein kinetic equation by means of a kind of variable transformation. Theoretical and numerical analyses show the effectiveness of this method. Its generalization to a more general class of stochastic differential equation models is also discussed.
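    The paper's specific protein kinetic model and transformation are not reproduced here; the sketch below only illustrates the general idea on a toy stochastic differential equation, using the exact Itô transformation Y = ln X of geometric Brownian motion so that the transformed equation has constant coefficients and tolerates larger Euler-Maruyama steps. All parameter values are illustrative assumptions.

```python
import numpy as np

def euler_maruyama_original(x0, a, b, dt, n, rng):
    """Simulate dX = a*X dt + b*X dW directly (can be stiff for large |a| or b)."""
    x = np.empty(n + 1); x[0] = x0
    for i in range(n):
        dw = rng.normal(0.0, np.sqrt(dt))
        x[i + 1] = x[i] + a * x[i] * dt + b * x[i] * dw
    return x

def euler_maruyama_transformed(x0, a, b, dt, n, rng):
    """Simulate Y = ln X, where dY = (a - b**2 / 2) dt + b dW, then map back."""
    y = np.empty(n + 1); y[0] = np.log(x0)
    drift = a - 0.5 * b ** 2
    for i in range(n):
        dw = rng.normal(0.0, np.sqrt(dt))
        y[i + 1] = y[i] + drift * dt + b * dw
    return np.exp(y)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    path = euler_maruyama_transformed(x0=1.0, a=-5.0, b=0.5, dt=0.01, n=1000, rng=rng)
    print(path[-1])
```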

  15. Reduced computational cost in the calculation of worst case response time for real time systems

    OpenAIRE

    Urriza, José M.; Schorb, Lucas; Orozco, Javier D.; Cayssials, Ricardo

    2009-01-01

    Modern Real Time Operating Systems require reduced computational costs even though microprocessors become more powerful each day. It is usual that Real Time Operating Systems for embedded systems have advanced features to administer the resources of the applications they support. In order to guarantee either the schedulability of the system or the schedulability of a new task in a dynamic Real Time System, it is necessary to know the Worst Case Response Time of the Real Time tasks ...
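    The abstract is truncated, so the authors' specific cost-reduction technique is not visible here. As general background only, the classical worst-case response time of a fixed-priority task is the fixed point of R_i = C_i + sum over higher-priority tasks j of ceil(R_i / T_j) * C_j; the Python sketch below implements that standard recurrence under the usual deadline-equals-period assumption, with an illustrative task set.

```python
import math

def worst_case_response_time(tasks, i):
    """Fixed-point iteration for task i; tasks = [(C, T), ...], highest priority first."""
    c_i, t_i = tasks[i]
    r = c_i
    while True:
        # interference from all higher-priority tasks released during window r
        interference = sum(math.ceil(r / t_j) * c_j for c_j, t_j in tasks[:i])
        r_next = c_i + interference
        if r_next == r:
            return r        # converged: schedulable if r <= deadline
        if r_next > t_i:
            return None     # exceeds the period: deemed unschedulable here
        r = r_next

if __name__ == "__main__":
    task_set = [(1, 4), (2, 6), (3, 12)]   # (C, T) pairs, highest priority first
    for i in range(len(task_set)):
        print(i, worst_case_response_time(task_set, i))
```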

  16. Prognostic significance of tumor size of small lung adenocarcinomas evaluated with mediastinal window settings on computed tomography.

    Directory of Open Access Journals (Sweden)

    Yukinori Sakao

    Full Text Available BACKGROUND: We aimed to clarify that the size of the lung adenocarcinoma evaluated using mediastinal window on computed tomography is an important and useful modality for predicting invasiveness, lymph node metastasis and prognosis in small adenocarcinoma. METHODS: We evaluated 176 patients with small lung adenocarcinomas (diameter, 1-3 cm) who underwent standard surgical resection. Tumours were examined using computed tomography with thin section conditions (1.25 mm thick on high-resolution computed tomography) with tumour dimensions evaluated under two settings: lung window and mediastinal window. We also determined the patient age, gender, preoperative nodal status, tumour size, tumour disappearance ratio, preoperative serum carcinoembryonic antigen levels and pathological status (lymphatic vessel, vascular vessel or pleural invasion). Recurrence-free survival was used for prognosis. RESULTS: Lung window, mediastinal window, tumour disappearance ratio and preoperative nodal status were significant predictive factors for recurrence-free survival in univariate analyses. Areas under the receiver operator curves for recurrence were 0.76, 0.73 and 0.65 for mediastinal window, tumour disappearance ratio and lung window, respectively. Lung window, mediastinal window, tumour disappearance ratio, preoperative serum carcinoembryonic antigen levels and preoperative nodal status were significant predictive factors for lymph node metastasis in univariate analyses; areas under the receiver operator curves were 0.61, 0.76, 0.72 and 0.66, for lung window, mediastinal window, tumour disappearance ratio and preoperative serum carcinoembryonic antigen levels, respectively. Lung window, mediastinal window, tumour disappearance ratio, preoperative serum carcinoembryonic antigen levels and preoperative nodal status were significant factors for lymphatic vessel, vascular vessel or pleural invasion in univariate analyses; areas under the receiver operator curves were 0

  17. Prognostic Significance of Tumor Size of Small Lung Adenocarcinomas Evaluated with Mediastinal Window Settings on Computed Tomography

    Science.gov (United States)

    Sakao, Yukinori; Kuroda, Hiroaki; Mun, Mingyon; Uehara, Hirofumi; Motoi, Noriko; Ishikawa, Yuichi; Nakagawa, Ken; Okumura, Sakae

    2014-01-01

    Background We aimed to clarify that the size of the lung adenocarcinoma evaluated using mediastinal window on computed tomography is an important and useful modality for predicting invasiveness, lymph node metastasis and prognosis in small adenocarcinoma. Methods We evaluated 176 patients with small lung adenocarcinomas (diameter, 1–3 cm) who underwent standard surgical resection. Tumours were examined using computed tomography with thin section conditions (1.25 mm thick on high-resolution computed tomography) with tumour dimensions evaluated under two settings: lung window and mediastinal window. We also determined the patient age, gender, preoperative nodal status, tumour size, tumour disappearance ratio, preoperative serum carcinoembryonic antigen levels and pathological status (lymphatic vessel, vascular vessel or pleural invasion). Recurrence-free survival was used for prognosis. Results Lung window, mediastinal window, tumour disappearance ratio and preoperative nodal status were significant predictive factors for recurrence-free survival in univariate analyses. Areas under the receiver operator curves for recurrence were 0.76, 0.73 and 0.65 for mediastinal window, tumour disappearance ratio and lung window, respectively. Lung window, mediastinal window, tumour disappearance ratio, preoperative serum carcinoembryonic antigen levels and preoperative nodal status were significant predictive factors for lymph node metastasis in univariate analyses; areas under the receiver operator curves were 0.61, 0.76, 0.72 and 0.66, for lung window, mediastinal window, tumour disappearance ratio and preoperative serum carcinoembryonic antigen levels, respectively. Lung window, mediastinal window, tumour disappearance ratio, preoperative serum carcinoembryonic antigen levels and preoperative nodal status were significant factors for lymphatic vessel, vascular vessel or pleural invasion in univariate analyses; areas under the receiver operator curves were 0.60, 0.81, 0

  18. A Flexible Computational Framework Using R and Map-Reduce for Permutation Tests of Massive Genetic Analysis of Complex Traits.

    Science.gov (United States)

    Mahjani, Behrang; Toor, Salman; Nettelblad, Carl; Holmgren, Sverker

    2017-01-01

    In quantitative trait locus (QTL) mapping, significance of putative QTL is often determined using permutation testing. The computational needs to calculate the significance level are immense: 10^4 up to 10^8 or even more permutations can be needed. We have previously introduced the PruneDIRECT algorithm for multiple QTL scan with epistatic interactions. This algorithm has specific strengths for permutation testing. Here, we present a flexible, parallel computing framework for identifying multiple interacting QTL using the PruneDIRECT algorithm, which uses the map-reduce model as implemented in Hadoop. The framework is implemented in R, a widely used software tool among geneticists. This enables users to rearrange algorithmic steps to adapt genetic models, search algorithms, and parallelization steps to their needs in a flexible way. Our work underlines the maturity of accessing distributed parallel computing for computationally demanding bioinformatics applications through building workflows within existing scientific environments. We investigate the PruneDIRECT algorithm, comparing its performance to exhaustive search and the DIRECT algorithm using our framework on a public cloud resource. We find that PruneDIRECT is vastly superior for permutation testing, and perform 2 × 10^5 permutations for a 2D QTL problem in 15 hours, using 100 cloud processes.
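    The framework above is written in R on Hadoop; as a language-neutral illustration of the map-reduce structure of permutation testing it describes, the Python sketch below distributes permutations over worker processes (the map step) and combines the per-permutation maximum test statistics into an empirical null distribution (the reduce step). The single-marker statistic and the simulated data are deliberately simplistic assumptions and are not PruneDIRECT.

```python
import numpy as np
from multiprocessing import Pool

def max_statistic(args):
    """Map step: one permutation -> maximum single-marker association statistic."""
    genotypes, phenotype, seed = args
    rng = np.random.default_rng(seed)
    permuted = rng.permutation(phenotype)
    # squared correlation between each standardized marker and the permuted phenotype
    g = (genotypes - genotypes.mean(axis=0)) / genotypes.std(axis=0)
    p = (permuted - permuted.mean()) / permuted.std()
    r2 = (g.T @ p / len(p)) ** 2
    return float(r2.max())

def permutation_threshold(genotypes, phenotype, n_perm=1000, alpha=0.05, workers=4):
    """Reduce step: pool the per-permutation maxima and take the (1 - alpha) quantile."""
    jobs = [(genotypes, phenotype, s) for s in range(n_perm)]
    with Pool(workers) as pool:
        null_max = pool.map(max_statistic, jobs)
    return float(np.quantile(null_max, 1 - alpha))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    G = rng.integers(0, 3, size=(200, 50)).astype(float)   # toy genotype matrix
    y = rng.normal(size=200)                                # toy phenotype
    print(permutation_threshold(G, y, n_perm=200))
```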

  19. Substituting computers for services - potential to reduce ICT's environmental footprint

    Energy Technology Data Exchange (ETDEWEB)

    Plepys, A. [The International Inst. for Industrial Environmental Economics at Lund Univ. (Sweden)

    2004-07-01

    The environmental footprint of IT products is significant and, in spite of manufacturing and product design improvements, growing consumption of electronics results in increasing absolute environmental impact. Computers have a short technological lifespan, and much of the built-in performance, although necessary, remains idle for most of the time. Today, most computers used in non-residential sectors are connected to networks. The premise of this paper is that computer networks are an untapped resource, which could allow addressing environmental impacts of IT products through centralising and sharing computing resources. The article presents results of a comparative study of two computing architectures. The first one is the traditional decentralised PC-based system and the second is a centralised server-based computing (SBC) system. Both systems deliver equivalent functions to the final users and thus can be compared on a one-to-one basis. The study evaluates product lifespan, energy consumption in the use stage, product design and its environmental implications in manufacturing. (orig.)

  20. Reduced opiate use after total knee arthroplasty using computer-assisted cryotherapy.

    Science.gov (United States)

    Thijs, Elke; Schotanus, Martijn G M; Bemelmans, Yoeri F L; Kort, Nanne P

    2018-05-03

    Despite multimodal pain management and advances in anesthetic techniques, total knee arthroplasty (TKA) remains painful during the early postoperative phase. This trial investigated whether computer-assisted cryotherapy (CAC) is effective in reducing pain and the consumption of opioids in patients operated for TKA following an outpatient surgery pathway. Sixty patients scheduled for primary TKA were included in this prospective, double-blind, randomized controlled trial receiving CAC at 10-12 °C (Cold-group, n = 30) or at 21 °C (Warm-group, n = 30) during the first 7 days after TKA according to a fixed schedule. All patients received the same pre-, peri- and postoperative care with a multimodal pain protocol. Pain was assessed before and after every session of cryotherapy using the numerical rating scale for pain (NRS-pain). The consumption of opioids was strictly noted during the first 4 postoperative days. Secondary outcomes were knee swelling, visual hematoma and patient reported outcome measures (PROMs). These parameters were measured pre-, 1, 2 and 6 weeks postoperatively. In both study groups, a reduction in NRS-pain after every CAC session was seen during the postoperative period of 7 days. A mean reduction of 0.9 and 0.7 on the NRS-pain was seen for the Cold-group (P = 0.008) and Warm-group (n.s.), respectively. A significantly (P = 0.001) lower number of opioids was used by the Cold-group during the acute postoperative phase of 4 days: 47 and 83 tablets for the Cold- and Warm-group, respectively. No difference could be observed for secondary outcomes and adverse effects between both study groups. Postoperative CAC can be of added value in patients following an outpatient surgery pathway for TKA, resulting in reduced experienced pain and consumption of opioids during the first postoperative days.

  1. Oxidation of naturally reduced uranium in aquifer sediments by dissolved oxygen and its potential significance to uranium plume persistence

    Science.gov (United States)

    Davis, J. A.; Smith, R. L.; Bohlke, J. K.; Jemison, N.; Xiang, H.; Repert, D. A.; Yuan, X.; Williams, K. H.

    2015-12-01

    The occurrence of naturally reduced zones is common in alluvial aquifers in the western U.S.A. due to the burial of woody debris in flood plains. Such reduced zones are usually heterogeneously dispersed in these aquifers and characterized by high concentrations of organic carbon, reduced mineral phases, and reduced forms of metals, including uranium(IV). The persistence of high concentrations of dissolved uranium(VI) at uranium-contaminated aquifers on the Colorado Plateau has been attributed to slow oxidation of insoluble uranium(IV) mineral phases found in association with these reducing zones, although there is little understanding of the relative importance of various potential oxidants. Four field experiments were conducted within an alluvial aquifer adjacent to the Colorado River near Rifle, CO, wherein groundwater associated with the naturally reduced zones was pumped into a gas-impermeable tank, mixed with a conservative tracer (Br-), bubbled with a gas phase composed of 97% O2 and 3% CO2, and then returned to the subsurface in the same well from which it was withdrawn. Within minutes of re-injection of the oxygenated groundwater, dissolved uranium(VI) concentrations increased from less than 1 μM to greater than 2.5 μM, demonstrating that oxygen can be an important oxidant for uranium in such field systems if supplied to the naturally reduced zones. Dissolved Fe(II) concentrations decreased to the detection limit, but increases in sulfate could not be detected due to high background concentrations. Changes in nitrogen species concentrations were variable. The results contrast with other laboratory and field results in which oxygen was introduced to systems containing high concentrations of mackinawite (FeS), rather than the more crystalline iron sulfides found in aged, naturally reduced zones. The flux of oxygen to the naturally reduced zones in the alluvial aquifers occurs mainly through interactions between groundwater and gas phases at the water table

  2. Reducing the throughput time of the diagnostic track involving CT scanning with computer simulation

    International Nuclear Information System (INIS)

    Lent, Wineke A.M. van; Deetman, Joost W.; Teertstra, H. Jelle; Muller, Sara H.; Hans, Erwin W.; Harten, Wim H. van

    2012-01-01

    Introduction: To examine the use of computer simulation to reduce the time between the CT request and the consult in which the CT report is discussed (diagnostic track) while restricting idle time and overtime. Methods: After a pre-implementation analysis in our case study hospital, three scenarios were evaluated by computer simulation on access time, overtime and idle time of the CT; after implementation these same aspects were evaluated again. Effects on throughput time were measured for outpatient short-term and urgent requests only. Conclusion: The pre-implementation analysis showed an average CT access time of 9.8 operating days and an average diagnostic track of 14.5 operating days. Based on the outcomes of the simulation, management changed the capacity for the different patient groups to facilitate a diagnostic track of 10 operating days, with a CT access time of 7 days. After the implementation of changes, the average diagnostic track duration was 12.6 days with an average CT access time of 7.3 days. The fraction of patients with a total throughput time within 10 days increased from 29% to 44%, while the utilization remained constant at 82%, the idle time increased by 11% and the overtime decreased by 82%. The fraction of patients that completed the diagnostic track within 10 days improved by 52%. Computer simulation proved useful for studying the effects of proposed scenarios in radiology management. Besides the tangible effects, the simulation increased the awareness that optimizing capacity allocation can reduce access times.

  3. Reducing the throughput time of the diagnostic track involving CT scanning with computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Lent, Wineke A.M. van, E-mail: w.v.lent@nki.nl [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital (NKI-AVL), P.O. Box 90203, 1006 BE Amsterdam (Netherlands); University of Twente, IGS Institute for Innovation and Governance Studies, Department of Health Technology Services Research (HTSR), Enschede (Netherlands); Deetman, Joost W., E-mail: j.deetman@nki.nl [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital (NKI-AVL), P.O. Box 90203, 1006 BE Amsterdam (Netherlands); Teertstra, H. Jelle, E-mail: h.teertstra@nki.nl [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital (NKI-AVL), P.O. Box 90203, 1006 BE Amsterdam (Netherlands); Muller, Sara H., E-mail: s.muller@nki.nl [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital (NKI-AVL), P.O. Box 90203, 1006 BE Amsterdam (Netherlands); Hans, Erwin W., E-mail: e.w.hans@utwente.nl [University of Twente, School of Management and Governance, Dept. of Industrial Engineering and Business Intelligence Systems, Enschede (Netherlands); Harten, Wim H. van, E-mail: w.v.harten@nki.nl [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital (NKI-AVL), P.O. Box 90203, 1006 BE Amsterdam (Netherlands); University of Twente, IGS Institute for Innovation and Governance Studies, Department of Health Technology Services Research (HTSR), Enschede (Netherlands)

    2012-11-15

    Introduction: To examine the use of computer simulation to reduce the time between the CT request and the consult in which the CT report is discussed (diagnostic track) while restricting idle time and overtime. Methods: After a pre-implementation analysis in our case study hospital, three scenarios were evaluated by computer simulation on access time, overtime and idle time of the CT; after implementation these same aspects were evaluated again. Effects on throughput time were measured for outpatient short-term and urgent requests only. Conclusion: The pre-implementation analysis showed an average CT access time of 9.8 operating days and an average diagnostic track of 14.5 operating days. Based on the outcomes of the simulation, management changed the capacity for the different patient groups to facilitate a diagnostic track of 10 operating days, with a CT access time of 7 days. After the implementation of changes, the average diagnostic track duration was 12.6 days with an average CT access time of 7.3 days. The fraction of patients with a total throughput time within 10 days increased from 29% to 44%, while the utilization remained constant at 82%, the idle time increased by 11% and the overtime decreased by 82%. The fraction of patients that completed the diagnostic track within 10 days improved by 52%. Computer simulation proved useful for studying the effects of proposed scenarios in radiology management. Besides the tangible effects, the simulation increased the awareness that optimizing capacity allocation can reduce access times.

  4. Challenges in reducing the computational time of QSTS simulations for distribution system analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Deboever, Jeremiah [Georgia Inst. of Technology, Atlanta, GA (United States); Zhang, Xiaochen [Georgia Inst. of Technology, Atlanta, GA (United States); Reno, Matthew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Broderick, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grijalva, Santiago [Georgia Inst. of Technology, Atlanta, GA (United States); Therrien, Francis [CME International T& D, St. Bruno, QC (Canada)

    2017-06-01

    The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which could take conventional computers a computational time of 10 to 120 hours when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.

  5. A High-Throughput Computational Framework for Identifying Significant Copy Number Aberrations from Array Comparative Genomic Hybridisation Data

    Directory of Open Access Journals (Sweden)

    Ian Roberts

    2012-01-01

    Full Text Available Reliable identification of copy number aberrations (CNA) from comparative genomic hybridization data would be improved by the availability of a generalised method for processing large datasets. To this end, we developed swatCGH, a data analysis framework and region detection heuristic for computational grids. swatCGH analyses sequentially displaced (sliding) windows of neighbouring probes and applies adaptive thresholds of varying stringency to identify the 10% of each chromosome that contains the most frequently occurring CNAs. We used the method to analyse a published dataset, comparing data preprocessed using four different DNA segmentation algorithms, and two methods for prioritising the detected CNAs. The consolidated list of the most commonly detected aberrations confirmed the value of swatCGH as a simplified high-throughput method for identifying biologically significant CNA regions of interest.
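    swatCGH itself is an R framework for computational grids; the Python sketch below is only a toy illustration of the sliding-window idea described above: score overlapping windows of neighbouring probes and keep the most aberrant fraction of each chromosome. The window size, threshold quantile, and simulated log2 ratios are assumptions for illustration, not parameters of the package.

```python
import numpy as np

def sliding_window_scores(log2_ratios, window=10):
    """Mean log2 ratio of each sequentially displaced (sliding) window of probes."""
    kernel = np.ones(window) / window
    return np.convolve(log2_ratios, kernel, mode="valid")

def top_aberrant_regions(log2_ratios, window=10, keep_fraction=0.10):
    """Flag windows whose |mean log2 ratio| falls in the top `keep_fraction`."""
    scores = np.abs(sliding_window_scores(log2_ratios, window))
    threshold = np.quantile(scores, 1.0 - keep_fraction)
    starts = np.where(scores >= threshold)[0]
    return [(int(s), int(s + window)) for s in starts]   # probe index ranges

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    ratios = rng.normal(0, 0.2, size=500)
    ratios[200:240] += 0.8            # a simulated gained region
    print(top_aberrant_regions(ratios)[:5])
```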

  6. Vaccination of pigs two weeks before infection significantly reduces transmission of foot-and-mouth disease virus

    NARCIS (Netherlands)

    Eble, P.L.; Bouma, A.; Bruin, de M.G.M.; Hemert-Kluitenberg, van F.; Oirschot, van J.T.; Dekker, A.

    2004-01-01

    The objective of this study was to investigate whether, and at what time interval, vaccination could reduce transmission of foot-and-mouth disease virus (FMDV) among pigs. Reduction of virus transmission by vaccination was determined experimentally. Transmission of FMDV was studied in three groups of

  7. Computer-based training (CBT) intervention reduces workplace violence and harassment for homecare workers.

    Science.gov (United States)

    Glass, Nancy; Hanson, Ginger C; Anger, W Kent; Laharnar, Naima; Campbell, Jacquelyn C; Weinstein, Marc; Perrin, Nancy

    2017-07-01

    The study examines the effectiveness of a workplace violence and harassment prevention and response program with female homecare workers in a consumer-driven model of care. Homecare workers were randomized to either computer-based training (CBT only) or computer-based training with homecare worker peer facilitation (CBT + peer). Participants completed measures on confidence, incidents of violence and harassment, and health and work outcomes at baseline and at 3 and 6 months post-baseline. Homecare workers reported improved confidence to prevent and respond to workplace violence and harassment and a reduction in incidents of workplace violence and harassment in both groups at 6-month follow-up. A decrease in negative health and work outcomes associated with violence and harassment was not reported in the groups. CBT alone or with trained peer facilitation with homecare workers can increase confidence and reduce incidents of workplace violence and harassment in a consumer-driven model of care. © 2017 Wiley Periodicals, Inc.

  8. REDUCED DATA FOR CURVE MODELING – APPLICATIONS IN GRAPHICS, COMPUTER VISION AND PHYSICS

    Directory of Open Access Journals (Sweden)

    Małgorzata Janik

    2013-06-01

    Full Text Available In this paper we consider the problem of modeling curves in Rn via interpolation without a priori specified interpolation knots. We discuss two approaches to estimate the missing knots for non-parametric data (i.e. a collection of points). The first approach (uniform evaluation) is based on a blind guess in which knots are chosen uniformly. The second approach (cumulative chord parameterization) incorporates the geometry of the distribution of data points. More precisely, the difference between consecutive knots is equal to the Euclidean distance between data points qi+1 and qi. The second method partially compensates for the loss of the information carried by the reduced data. We also present the application of the above schemes for fitting non-parametric data in computer graphics (light-source motion rendering), in computer vision (image segmentation) and in physics (high-velocity particle trajectory modeling). Though experiments are conducted for points in R2 and R3, the entire method is equally applicable in Rn.
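    To make the cumulative chord construction above concrete, here is a short Python sketch: the knots start at zero and each increment equals the Euclidean distance between consecutive data points, after which the knots are fed to an off-the-shelf cubic spline. The example data are an assumption for illustration, and the spline is a stand-in for whatever interpolation scheme the paper actually uses.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def cumulative_chord_knots(points):
    """t_0 = 0 and t_{i+1} - t_i = ||q_{i+1} - q_i||: the cumulative chord lengths."""
    steps = np.linalg.norm(np.diff(points, axis=0), axis=1)
    return np.concatenate(([0.0], np.cumsum(steps)))

def fit_curve(points):
    """Fit a vector-valued cubic spline using cumulative chord knots."""
    t = cumulative_chord_knots(points)
    return CubicSpline(t, points), t

if __name__ == "__main__":
    pts = np.array([[0.0, 0.0], [1.0, 2.0], [3.0, 3.0], [6.0, 2.0]])
    spline, t = fit_curve(pts)
    print(t)                          # knots: 0, |q1-q0|, |q1-q0|+|q2-q1|, ...
    print(spline(t[-1] / 2.0))        # a point on the fitted curve
```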

  9. Computer-based versus in-person interventions for preventing and reducing stress in workers.

    Science.gov (United States)

    Kuster, Anootnara Talkul; Dalsbø, Therese K; Luong Thanh, Bao Yen; Agarwal, Arnav; Durand-Moreau, Quentin V; Kirkehei, Ingvild

    2017-08-30

    Chronic exposure to stress has been linked to several negative physiological and psychological health outcomes. Among employees, stress and its associated effects can also result in productivity losses and higher healthcare costs. In-person (face-to-face) and computer-based (web- and mobile-based) stress management interventions have been shown to be effective in reducing stress in employees compared to no intervention. However, it is unclear if one form of intervention delivery is more effective than the other. It is conceivable that computer-based interventions are more accessible, convenient, and cost-effective. To compare the effects of computer-based interventions versus in-person interventions for preventing and reducing stress in workers. We searched CENTRAL, MEDLINE, PubMed, Embase, PsycINFO, NIOSHTIC, NIOSHTIC-2, HSELINE, CISDOC, and two trials registers up to February 2017. We included randomised controlled studies that compared the effectiveness of a computer-based stress management intervention (using any technique) with a face-to-face intervention that had the same content. We included studies that measured stress or burnout as an outcome, and used workers from any occupation as participants. Three authors independently screened and selected 75 unique studies for full-text review from 3431 unique reports identified from the search. We excluded 73 studies based on full-text assessment. We included two studies. Two review authors independently extracted stress outcome data from the two included studies. We contacted study authors to gather additional data. We used standardised mean differences (SMDs) with 95% confidence intervals (CIs) to report study results. We did not perform meta-analyses due to variability in the primary outcome and considerable statistical heterogeneity. We used the GRADE approach to rate the quality of the evidence. Two studies met the inclusion criteria, including a total of 159 participants in the included arms of the studies

  10. GATE Monte Carlo simulation of dose distribution using MapReduce in a cloud computing environment.

    Science.gov (United States)

    Liu, Yangchuan; Tang, Yuguo; Gao, Xin

    2017-12-01

    The GATE Monte Carlo simulation platform has good application prospects in treatment planning and quality assurance. However, accurate dose calculation using GATE is time-consuming. The purpose of this study is to implement a novel cloud computing method for accurate GATE Monte Carlo simulation of dose distribution using MapReduce. An Amazon Machine Image installed with Hadoop and GATE is created to set up Hadoop clusters on Amazon Elastic Compute Cloud (EC2). Macros, the input files for GATE, are split into a number of self-contained sub-macros. Through Hadoop Streaming, the sub-macros are executed by GATE in Map tasks and the sub-results are aggregated into final outputs in Reduce tasks. As an evaluation, GATE simulations were performed in a cubical water phantom for X-ray photons of 6 and 18 MeV. The parallel simulation on the cloud computing platform is as accurate as the single-threaded simulation on a local server and the simulation correctness is not affected by the failure of some worker nodes. The cloud-based simulation time is approximately inversely proportional to the number of worker nodes. For the simulation of 10 million photons on a cluster with 64 worker nodes, time decreases of 41× and 32× were achieved compared to the single worker node case and the single-threaded case, respectively. The test of Hadoop's fault tolerance showed that the simulation correctness was not affected by the failure of some worker nodes. The results verify that the proposed method provides a feasible cloud computing solution for GATE.
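    A minimal sketch of the split-and-aggregate pattern described above, under the assumption that each sub-job saves its 3D dose distribution as a numpy array: the requested number of primaries is divided across self-contained sub-jobs (the map step) and the per-job dose arrays are summed afterwards (the reduce step). The file names and array format are assumptions; the actual GATE macro syntax and Hadoop Streaming plumbing are not reproduced here.

```python
import numpy as np

def split_primaries(total_primaries, n_jobs):
    """Divide the requested number of primaries into near-equal sub-jobs."""
    base, remainder = divmod(total_primaries, n_jobs)
    return [base + (1 if i < remainder else 0) for i in range(n_jobs)]

def aggregate_dose(dose_files):
    """Reduce step: sum the per-job dose distributions into one array."""
    total = None
    for path in dose_files:
        dose = np.load(path)              # assumed: each job saved a 3D dose array
        total = dose if total is None else total + dose
    return total

if __name__ == "__main__":
    print(split_primaries(10_000_000, 64))       # primaries per sub-job
    # After the 64 sub-jobs finish, for example:
    # final_dose = aggregate_dose([f"dose_{i:03d}.npy" for i in range(64)])
```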

  11. ClusterSignificance: A bioconductor package facilitating statistical analysis of class cluster separations in dimensionality reduced data

    DEFF Research Database (Denmark)

    Serviss, Jason T.; Gådin, Jesper R.; Eriksson, Per

    2017-01-01

    , e.g. genes in a specific pathway, alone can separate samples into these established classes. Despite this, the evaluation of class separations is often subjective and performed via visualization. Here we present the ClusterSignificance package: a set of tools designed to assess the statistical significance of class separations downstream of dimensionality reduction algorithms. In addition, we demonstrate the design and utility of the ClusterSignificance package and utilize it to determine the importance of long non-coding RNA expression in the identity of multiple hematological malignancies....
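    ClusterSignificance is a Bioconductor/R package; as a rough Python analogue of the idea it describes (asking whether two labelled groups separate more than expected by chance after dimensionality reduction), the sketch below projects data with PCA, scores separation as the distance between class centroids, and builds a permutation null by shuffling the labels. The scoring choice is an assumption for illustration, not the package's actual statistic.

```python
import numpy as np
from sklearn.decomposition import PCA

def separation_score(projected, labels):
    """Distance between the two class centroids in the reduced space."""
    a = projected[labels == 0].mean(axis=0)
    b = projected[labels == 1].mean(axis=0)
    return float(np.linalg.norm(a - b))

def cluster_separation_pvalue(data, labels, n_perm=1000, n_components=2, seed=0):
    rng = np.random.default_rng(seed)
    projected = PCA(n_components=n_components).fit_transform(data)
    observed = separation_score(projected, labels)
    null = [separation_score(projected, rng.permutation(labels)) for _ in range(n_perm)]
    return (1 + sum(s >= observed for s in null)) / (1 + n_perm)

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    x = np.vstack([rng.normal(0, 1, (30, 20)), rng.normal(1, 1, (30, 20))])
    y = np.array([0] * 30 + [1] * 30)
    print(cluster_separation_pvalue(x, y))
```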

  12. Evaluation of a prototype correction algorithm to reduce metal artefacts in flat detector computed tomography of scaphoid fixation screws.

    Science.gov (United States)

    Filli, Lukas; Marcon, Magda; Scholz, Bernhard; Calcagni, Maurizio; Finkenstädt, Tim; Andreisek, Gustav; Guggenberger, Roman

    2014-12-01

    The aim of this study was to evaluate a prototype correction algorithm to reduce metal artefacts in flat detector computed tomography (FDCT) of scaphoid fixation screws. FDCT has gained interest in imaging small anatomic structures of the appendicular skeleton. Angiographic C-arm systems with flat detectors allow fluoroscopy and FDCT imaging in a one-stop procedure emphasizing their role as an ideal intraoperative imaging tool. However, FDCT imaging can be significantly impaired by artefacts induced by fixation screws. Following ethical board approval, commercially available scaphoid fixation screws were inserted into six cadaveric specimens in order to fix artificially induced scaphoid fractures. FDCT images corrected with the algorithm were compared to uncorrected images both quantitatively and qualitatively by two independent radiologists in terms of artefacts, screw contour, fracture line visibility, bone visibility, and soft tissue definition. Normal distribution of variables was evaluated using the Kolmogorov-Smirnov test. In case of normal distribution, quantitative variables were compared using paired Student's t tests. The Wilcoxon signed-rank test was used for quantitative variables without normal distribution and all qualitative variables. A p value of < 0.05 was considered to indicate statistically significant differences. Metal artefacts were significantly reduced by the correction algorithm (p < 0.001), and the fracture line was more clearly defined (p < 0.01). The inter-observer reliability was "almost perfect" (intra-class correlation coefficient 0.85, p < 0.001). The prototype correction algorithm in FDCT for metal artefacts induced by scaphoid fixation screws may facilitate intra- and postoperative follow-up imaging. Flat detector computed tomography (FDCT) is a helpful imaging tool for scaphoid fixation. The correction algorithm significantly reduces artefacts in FDCT induced by scaphoid fixation screws. This may facilitate intra

  13. Cone-beam computed tomography analysis of accessory maxillary ostium and Haller cells: Prevalence and clinical significance

    Energy Technology Data Exchange (ETDEWEB)

    Ali, Ibrahim K.; Sansare, Kaustubh; Karjodkar, Freny R.; Vanga, Kavita; Salve, Prashant [Dept. of Oral Medicine and Radiology, Nair Hospital Dental College, Mumbai (India); Pawar, Ajinkya M. [Dept. of Conservative Dentistry and Endodontics, Nair Hospital Dental College, Mumbai (India)

    2017-03-15

    This study aimed to evaluate the prevalence of Haller cells and accessory maxillary ostium (AMO) in cone-beam computed tomography (CBCT) images, and to analyze the relationships among Haller cells, AMO, and maxillary sinusitis. Volumetric CBCT scans from 201 patients were retrieved from our institution's Digital Imaging and Communications in Medicine archive folder. Two observers evaluated the presence of Haller cells, AMO, and maxillary sinusitis in the CBCT scans. AMO was observed in 114 patients, of whom 27 (23.7%) had AMO exclusively on the right side, 26 (22.8%) only on the left side, and 61 (53.5%) bilaterally. Haller cells were identified in 73 (36.3%) patients. In 24 (32.9%) they were present exclusively on the right side, in 17 (23.3%) they were only present on the left side, and in 32 (43.8%) they were located bilaterally. Of the 73 (36.3%) patients with Haller cells, maxillary sinusitis was also present in 50 (68.5%). Chi-square testing showed a significant association between AMO and maxillary sinusitis in the presence of Haller cells. Our results showed AMO and Haller cells to be associated with maxillary sinusitis. This study provides evidence for the usefulness of CBCT in imaging the bony anatomy of the sinonasal complex with significantly higher precision and a smaller radiation dose.
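
    The association test used above is a standard chi-square test on a contingency table; a minimal sketch follows. The first row uses counts quoted in the abstract (73 patients with Haller cells, 50 of whom had sinusitis); the second row is a hypothetical placeholder, since the full table is not reported here.

```python
# Chi-square test of association on a 2x2 contingency table, as in the study
# above. Only the first row follows the abstract; the second row is made up.
from scipy.stats import chi2_contingency

#                 sinusitis   no sinusitis
table = [[50, 23],    # Haller cells present (from the abstract)
         [40, 88]]    # Haller cells absent (hypothetical counts)

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")
```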

  14. Computation of complexity measures of morphologically significant zones decomposed from binary fractal sets via multiscale convexity analysis

    International Nuclear Information System (INIS)

    Lim, Sin Liang; Koo, Voon Chet; Daya Sagar, B.S.

    2009-01-01

    Multiscale convexity analysis of certain fractal binary objects, such as the 8-segment Koch quadric, the Koch triadic, and random Koch quadric and triadic islands, is performed via (i) morphologic openings with respect to a template of recursively changing size, and (ii) construction of convex hulls through half-plane closings. Based on the scale vs. convexity measure relationship, transition levels between the morphologic regimes are determined as crossover scales. These crossover scales are taken as the basis to segment binary fractal objects into various morphologically prominent zones. Each segmented zone is characterized through normalized morphologic complexity measures. Although there is no notably significant relationship between the zone-wise complexity measures and the fractal dimensions computed by the conventional box counting method, fractal objects, whether generated deterministically or by introducing randomness, possess morphologically significant sub-zones with varied degrees of spatial complexity. Classification of realistic fractal sets and/or fields according to sub-zones possessing varied degrees of spatial complexity provides insight for exploring links with the physical processes involved in the formation of fractal-like phenomena.
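
    For reference, the conventional box counting method mentioned above can be sketched briefly. This is an illustrative Python implementation of box counting only, not the multiscale convexity analysis itself.

```python
# Box-counting fractal dimension of a binary image (the conventional method the
# abstract compares against; not the authors' morphological analysis).
import numpy as np


def box_count(image, box_size):
    """Count boxes of side box_size containing at least one foreground pixel."""
    h, w = image.shape
    h_trim, w_trim = h - h % box_size, w - w % box_size
    trimmed = image[:h_trim, :w_trim]
    blocks = trimmed.reshape(h_trim // box_size, box_size, w_trim // box_size, box_size)
    return np.count_nonzero(blocks.any(axis=(1, 3)))


def fractal_dimension(image, box_sizes=(2, 4, 8, 16, 32, 64)):
    counts = [box_count(image, s) for s in box_sizes]
    # Slope of log(count) vs log(1/size) estimates the box-counting dimension.
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
    return slope


if __name__ == "__main__":
    # Sanity check: a filled square should give a dimension close to 2.
    img = np.zeros((256, 256), dtype=bool)
    img[64:192, 64:192] = True
    print(f"estimated dimension: {fractal_dimension(img):.2f}")
```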

  15. Reduced-Order Computational Model for Low-Frequency Dynamics of Automobiles

    Directory of Open Access Journals (Sweden)

    A. Arnoux

    2013-01-01

    Full Text Available A reduced-order model is constructed to predict, for the low-frequency range, the dynamical responses in the stiff parts of an automobile constituted of stiff and flexible parts. The vehicle therefore has many elastic modes in this range due to the presence of many flexible parts and equipment. A non-standard reduced-order model is introduced: the family of elastic modes is not used and is replaced by an adapted vector basis of the admissible space of global displacements. Such a construction requires a decomposition of the domain of the structure into subdomains in order to control the spatial wavelength of the global displacements. The fast marching method is used to carry out the subdomain decomposition. A probabilistic model of uncertainties is introduced, and the parameters controlling the level of uncertainties are estimated by solving a statistical inverse problem. The methodology is validated with a large computational model of an automobile.

  16. Optimizing contrast agents with respect to reducing beam hardening in nonmedical X-ray computed tomography experiments.

    Science.gov (United States)

    Nakashima, Yoshito; Nakano, Tsukasa

    2014-01-01

    Iodine is commonly used as a contrast agent in nonmedical science and engineering, for example, to visualize Darcy flow in porous geological media using X-ray computed tomography (CT). Undesirable beam hardening artifacts occur when a polychromatic X-ray source is used, which makes the quantitative analysis of CT images difficult. To optimize the chemistry of a contrast agent in terms of beam hardening reduction, we performed computer simulations and generated synthetic CT images of a homogeneous cylindrical sand-pack (diameter, 28 or 56 mm; porosity, 39 vol.%) saturated with aqueous suspensions of heavy elements, assuming the use of a polychromatic medical CT scanner. The degree of cupping derived from the beam hardening was assessed using the reconstructed CT images to find the chemistry of the suspension that induced the least cupping. The results showed that (i) the degree of cupping depended on the position of the K absorption edge of the heavy element relative to the peak of the polychromatic incident X-ray spectrum, (ii) (53)I was not an ideal contrast agent because it causes marked cupping, and (iii) a single element much heavier than (53)I ((64)Gd to (79)Au) reduced the cupping artifact significantly, and a four-heavy-element mixture of elements from (64)Gd to (79)Au reduced the artifact most significantly.

  17. Diagnostic significance of rib series in minor thorax trauma compared to plain chest film and computed tomography.

    Science.gov (United States)

    Hoffstetter, Patrick; Dornia, Christian; Schäfer, Stephan; Wagner, Merle; Dendl, Lena M; Stroszczynski, Christian; Schreyer, Andreas G

    2014-01-01

    Rib series (RS) are a special radiological technique to improve the visualization of the bony parts of the chest. The aim of this study was to evaluate the diagnostic accuracy of rib series in minor thorax trauma. This was a retrospective study of 56 patients who received RS; 39 patients were additionally evaluated by plain chest film (PCF). All patients underwent computed tomography (CT) of the chest. RS and PCF were re-read independently by three radiologists, and the results were compared with CT as the gold standard. Sensitivity, specificity, and negative and positive predictive values were calculated. The significance of differences in findings was determined by the McNemar test, and interobserver variability by Cohen's kappa test. Fifty-six patients were evaluated (34 men, 22 women, mean age 61 years). In 22 patients one or more rib fractures could be identified by CT. In 18 of these cases (82%) the correct diagnosis was made by RS, and in 16 cases (73%) the correct number of involved ribs was detected. These differences were significant (p = 0.03). Specificity was 100%, and the negative and positive predictive values were 85% and 100%. Kappa values for interobserver agreement were 0.92-0.96. Sensitivity of PCF was 46% and was significantly lower (p = 0.008) compared to CT. Rib series do not seem to be a useful examination for evaluating minor thorax trauma. CT seems to be the method of choice to detect rib fractures, but the clinical value of the radiological proof has to be discussed and investigated in larger follow-up studies.
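
    The accuracy measures reported above follow directly from a 2x2 table against the CT reference standard. The sketch below is a generic illustration; the counts are approximate placeholders loosely based on the abstract, not the study's raw data.

```python
# Diagnostic accuracy measures from a 2x2 confusion table (reference: CT).
# The counts below are illustrative placeholders, not the study's data.
def diagnostic_accuracy(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }


if __name__ == "__main__":
    # Example: 18 of 22 fracture cases detected, no false positives among 34 negatives.
    print(diagnostic_accuracy(tp=18, fp=0, fn=4, tn=34))
```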

  18. An enhanced technique for mobile cloudlet offloading with reduced computation using compression in the cloud

    Science.gov (United States)

    Moro, A. C.; Nadesh, R. K.

    2017-11-01

    The cloud computing paradigm has transformed the way we do business in today’s world. Services on the cloud have come a long way since just providing basic storage or software on demand. One of the fastest growing factors in this is mobile cloud computing. With the option of offloading now available, mobile users can offload entire applications onto cloudlets. Given the problems regarding availability and the limited storage capacity of these mobile cloudlets, it becomes difficult for the mobile user to decide when to use local memory and when to use a cloudlet. Hence, we examine a fast algorithm that decides, based on an offloading probability, whether the mobile user should use a cloudlet or rely on local memory. We have partially implemented the algorithm that decides whether a task can be carried out locally or given to a cloudlet. Because performing the complete computation is a burden on the mobile device, we offload this computation to the cloud in our paper. Furthermore, we apply a file compression technique before sending the file to the cloud to further reduce the load.
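
    A toy version of such a decision-plus-compression step might look like the following. This is an illustrative sketch only; the threshold, the sizes and the notion of an offload probability are assumptions, not the paper's algorithm.

```python
# Sketch of an offloading decision followed by compression before upload
# (illustrative; the threshold, sizes and upload step are assumptions).
import gzip


def should_offload(task_size_bytes, local_free_bytes, offload_probability, threshold=0.5):
    """Offload when local memory is insufficient or the offload probability is high."""
    return task_size_bytes > local_free_bytes or offload_probability >= threshold


def compress_for_upload(payload: bytes) -> bytes:
    return gzip.compress(payload)


if __name__ == "__main__":
    payload = b"sensor log line\n" * 10_000
    if should_offload(len(payload), local_free_bytes=64_000, offload_probability=0.7):
        compressed = compress_for_upload(payload)
        print(f"offloading {len(compressed)} bytes (raw {len(payload)})")
```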

  19. A novel multi-stage subunit vaccine against paratuberculosis induces significant immunity and reduces bacterial burden in tissues (P4304)

    DEFF Research Database (Denmark)

    Thakur, Aneesh; Aagaard, Claus; Riber, Ulla

    2013-01-01

    Effective control of paratuberculosis is hindered by lack of a vaccine preventing infection, transmission and without diagnostic interference with tuberculosis. We have developed a novel multi-stage recombinant subunit vaccine in which a fusion of four early expressed MAP antigens is combined...... characterized by a significant containment of bacterial burden in gut tissues compared to non-vaccinated animals. There was no cross-reaction with bovine tuberculosis in vaccinated animals. This novel multi-stage vaccine has the potential to become a marker vaccine for paratuberculosis....

  20. Reduced expression of circRNA hsa_circ_0003159 in gastric cancer and its clinical significance.

    Science.gov (United States)

    Tian, Mengqian; Chen, Ruoyu; Li, Tianwen; Xiao, Bingxiu

    2018-03-01

    Circular RNAs (circRNAs) play a crucial role in the occurrence of several diseases including cancers. However, little is known about circRNAs' diagnostic value for gastric cancer, one of the most common causes of cancer mortality worldwide. The hsa_circ_0003159 levels in 108 paired gastric cancer tissues and adjacent non-tumorous tissues from surgical patients with gastric cancer were first detected by real-time quantitative reverse transcription-polymerase chain reaction. Then, the relationships between hsa_circ_0003159 expression levels in gastric cancer tissues and the clinicopathological factors of patients with gastric cancer were analyzed. Finally, its diagnostic value was evaluated through the receiver operating characteristic curve. Compared with paired adjacent non-tumorous tissues, hsa_circ_0003159 expression was significantly down-regulated in gastric cancer tissues. Moreover, we found that hsa_circ_0003159 expression levels were significantly negatively associated with gender, distal metastasis, and tumor-node-metastasis stage. All of the results suggest that hsa_circ_0003159 may be a potential cancer marker in patients with gastric cancer. © 2017 Wiley Periodicals, Inc.
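
    Evaluating diagnostic value through a receiver operating characteristic (ROC) curve, as done above, can be sketched with standard tooling. The data below are simulated placeholders, not the study's measurements; because expression is lower in tumours, the negated expression level serves as the score.

```python
# Sketch of evaluating a candidate expression marker with a ROC curve
# (simulated data for illustration only).
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(0)
# 1 = tumour tissue, 0 = adjacent non-tumorous tissue.
labels = np.array([1] * 50 + [0] * 50)
expression = np.concatenate([rng.normal(1.0, 0.6, 50), rng.normal(2.0, 0.6, 50)])
scores = -expression  # lower expression in tumours, so negate for the ROC score

fpr, tpr, thresholds = roc_curve(labels, scores)
print(f"AUC = {roc_auc_score(labels, scores):.2f}")
```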

  1. β-Hydroxy-β-methylbutyrate (HMB) supplementation and resistance exercise significantly reduce abdominal adiposity in healthy elderly men.

    Science.gov (United States)

    Stout, Jeffrey R; Fukuda, David H; Kendall, Kristina L; Smith-Ryan, Abbie E; Moon, Jordan R; Hoffman, Jay R

    2015-04-01

    The effects of 12-weeks of HMB ingestion and resistance training (RT) on abdominal adiposity were examined in 48 men (66-78 yrs). All participants were randomly assigned to 1 of 4 groups: no-training placebo (NT-PL), HMB only (NT-HMB), RT with PL (RT-PL), or HMB with RT (RT-HMB). DXA was used to estimate abdominal fat mass (AFM) by placing the region of interest over the L1-L4 region of the spine. Outcomes were assessed by ANCOVA, with Bonferroni-corrected pairwise comparisons. Baseline AFM values were used as the covariate. The ANCOVA indicated a significant difference (p = 0.013) between group means for the adjusted posttest AFM values (mean (kg) ± SE: NT-PL = 2.59 ± 0.06; NT-HMB = 2.59 ± 0.61; RT-PL = 2.59 ± 0.62; RT-HMB = 2.34 ± 0.61). The pairwise comparisons indicated that AFM following the intervention period in the RT-HMB group was significantly less than NT-PL (p = 0.013), NT-HMB (p = 0.011), and RT-PL (p = 0.010). These data suggested that HMB in combination with 12 weeks of RT decreased AFM in elderly men. Copyright © 2015. Published by Elsevier Inc.

  2. Wind Erosion Caused by Land Use Changes Significantly Reduces Ecosystem Carbon Storage and Carbon Sequestration Potentials in Grassland

    Science.gov (United States)

    Li, P.; Chi, Y. G.; Wang, J.; Liu, L.

    2017-12-01

    Wind erosion exerts a fundamental influence on the biotic and abiotic processes associated with the ecosystem carbon (C) cycle. However, how wind erosion under different land use scenarios will affect the ecosystem C balance and its capacity for future C sequestration is poorly quantified. Here, we established an experiment in a temperate steppe in Inner Mongolia and simulated different intensities of land use: control, 50% aboveground vegetation removal (50R), 100% vegetation removal (100R) and tillage (TI). We monitored lateral and vertical carbon flux components and soil characteristics from 2013 to 2016. Our study reveals three key findings relating to the driving factors, magnitude and consequences of wind erosion for the ecosystem C balance: (1) The frequency of heavy wind exerts a fundamental control over the severity of soil erosion, and its interaction with precipitation and vegetation characteristics explained 69% of the variation in erosion intensity. (2) With increases in land use intensity, the lateral C flux induced by wind erosion increased rapidly, equivalent to 33%, 86%, 111% and 183% of the net ecosystem exchange of the control site for the control, 50R, 100R and TI sites, respectively. (3) After three years of treatment, erosion-induced loss of fine fractions led to 31%, 43% and 85% permanent loss of C sequestration potential in the surface 5 cm of soil for the 50R, 100R and TI sites. Overall, our study demonstrates that the lateral C flux associated with wind erosion is too large to be ignored. The loss of C-enriched fine particles not only reduces current ecosystem C content, but also results in irreversible loss of future soil C sequestration potential. Dynamic soil characteristics need to be considered when projecting future ecosystem C balance in aeolian landscapes. We also propose that, to maintain the sustainability of grassland ecosystems, land managers should focus on implementing appropriate land use rather than relying on subsequent management of degraded soils.

  3. Postoperative Stiffness Requiring Manipulation Under Anesthesia Is Significantly Reduced After Simultaneous Versus Staged Bilateral Total Knee Arthroplasty.

    Science.gov (United States)

    Meehan, John P; Monazzam, Shafagh; Miles, Troy; Danielsen, Beate; White, Richard H

    2017-12-20

    After adjustment for relevant risk factors, the 90-day odds ratio (OR) of undergoing manipulation after simultaneous bilateral TKA was significantly lower than that for unilateral TKA (OR = 0.70; 95% confidence interval [CI], 0.57 to 0.86) and staged bilateral TKA (OR = 0.71; 95% CI, 0.57 to 0.90). Similarly, at 180 days, the odds of undergoing manipulation were significantly lower after simultaneous bilateral TKA than after both unilateral TKA (OR = 0.71; 95% CI, 0.59 to 0.84) and staged bilateral TKA (OR = 0.76; 95% CI, 0.63 to 0.93). The frequency of manipulation was significantly associated with younger age, fewer comorbidities, black race, and the absence of obesity. Although the ORs were small (close to 1), simultaneous bilateral TKA had a significantly decreased rate of stiffness requiring manipulation under anesthesia at 90 days and 180 days after knee replacement compared with that after staged bilateral TKA and unilateral TKA. Therapeutic Level III. See Instructions for Authors for a complete description of levels of evidence.

  4. Significance of surface functionalization of Gold Nanorods for reduced effect on IgG stability and minimization of cytotoxicity

    Energy Technology Data Exchange (ETDEWEB)

    Alex, Sruthi Ann; Rajiv, Sundaramoorthy [Centre for Nanobiotechnology, VIT University, Vellore (India); Chakravarty, Sujay [UGC-DAE CSR, Kalpakkam, Node, Kokilamedu (India); Chandrasekaran, N. [Centre for Nanobiotechnology, VIT University, Vellore (India); Mukherjee, Amitava, E-mail: amit.mookerjea@gmail.com [Centre for Nanobiotechnology, VIT University, Vellore (India)

    2017-02-01

    • Reduction of the side effect of AuNRs by modifying the capping. • Polymer-coated AuNRs are safe for in vitro assays but hamper protein functioning. • PEG-AuNRs reduced toxicity to lymphocyte cells and had a lesser effect on IgG. • Highlights the importance of neutral PEGylated particles for theranostic applications.

  5. Significance of surface functionalization of Gold Nanorods for reduced effect on IgG stability and minimization of cytotoxicity

    International Nuclear Information System (INIS)

    Alex, Sruthi Ann; Rajiv, Sundaramoorthy; Chakravarty, Sujay; Chandrasekaran, N.; Mukherjee, Amitava

    2017-01-01

    • Reduction of the side effect of AuNRs by modifying the capping. • Polymer-coated AuNRs are safe for in vitro assays but hamper protein functioning. • PEG-AuNRs reduced toxicity to lymphocyte cells and had a lesser effect on IgG. • Highlights the importance of neutral PEGylated particles for theranostic applications.

  6. [Intra-Articular Application of Tranexamic Acid Significantly Reduces Blood Loss and Transfusion Requirement in Primary Total Knee Arthroplasty].

    Science.gov (United States)

    Lošťák, J; Gallo, J; Špička, J; Langová, K

    2016-01-01

    PURPOSE OF THE STUDY The aim of this prospective study was to investigate the effect of topical application of tranexamic acid (TXA, Exacyl) on the amount of post-operative blood loss, and blood transfusion requirement in patients undergoing primary total knee arthroplasty (TKA). Attention was paid to early complications potentially associated with TXA administration, such as haematoma, wound exudate, or knee swelling. In addition, the economic benefit of TXA treatment was also taken into account. MATERIAL AND METHODS The study included 238 patients (85 men and 153 women) who underwent primary total knee arthroplasty (TKA) at our department between January 2013 and November 2015. A group of 119 patients (41 men and 78 women) received intraarticular TXA injections according to the treatment protocol (TXA group). A control group matched in basic characteristics to the TXA group also consisted of 119 patients. The average age in the TXA group was 69.8 years, and the most frequent indication for TKA surgery was primary knee osteoarthritis (81.5%). In each patient, post-operative volume of blood lost from drains and total blood loss including hidden blood loss were recorded, as well as post-operative haemoglobin and haematocrit levels. On discharge of each patient from hospital, the size and site of a haematoma; wound exudate, if present after post-operative day 4; joint swelling; range of motion and early revision surgery, if performed, were evaluated. Requirements of analgesic drugs after surgery were also recorded. RESULTS In the TXA group, blood losses from drains were significantly lower than in the control group (456.7 ± 270.8 vs 640.5 ±448.2; p = 0.004). The median value for blood losses from drains was lower by 22% and the average value for total blood loss, including hidden losses, was also lower than in the control group (762.4 ± 345.2 ml vs 995.5 ± 457.3 ml). The difference in the total amount of blood loss between the two groups was significant (p = 0

  7. Weight loss significantly reduces serum lipocalin-2 levels in overweight and obese women with polycystic ovary syndrome.

    Science.gov (United States)

    Koiou, Ekaterini; Tziomalos, Konstantinos; Katsikis, Ilias; Kandaraki, Eleni A; Kalaitzakis, Emmanuil; Delkos, Dimitrios; Vosnakis, Christos; Panidis, Dimitrios

    2012-01-01

    Serum lipocalin-2 levels are elevated in obese patients. We assessed serum lipocalin-2 levels in polycystic ovary syndrome (PCOS) and the effects of weight loss or metformin on these levels. Forty-seven overweight/obese patients with PCOS [body mass index (BMI) >27 kg/m(2)] were instructed to follow a low-calorie diet, to exercise, and were given orlistat or sibutramine for 6 months. Twenty-five normal weight patients with PCOS, together with normal weight controls and 25 overweight/obese healthy female volunteers, comprised the control groups. Serum lipocalin-2 levels did not differ between overweight/obese patients with PCOS and overweight/obese controls (p = 0.258), or between normal weight patients with PCOS and normal weight controls (p = 0.878). Lipocalin-2 levels were higher in overweight/obese patients with PCOS than in normal weight patients with PCOS. Weight loss resulted in a significant fall in lipocalin-2 levels, whereas in normal weight patients with PCOS, treatment with metformin did not affect lipocalin-2 levels (p = 0.484). In conclusion, PCOS per se is not associated with elevated lipocalin-2 levels. Weight loss induces a significant reduction in lipocalin-2 levels in overweight/obese patients with PCOS.

  8. The co-registration of initial PET with the radiotherapy planning CT significantly reduces the variability of the anatomo-clinical target volume in childhood Hodgkin disease

    International Nuclear Information System (INIS)

    Metwally, H.; Blouet, A.; David, I.; Rives, M.; Izar, F.; Courbon, F.; Filleron, T.; Laprie, A.; Plat, G.; Vial, J.

    2009-01-01

    There is considerable interobserver variability in the definition of the anatomo-clinical target volume (CTV) in children with Hodgkin disease. In this study, co-registration of the FDG PET with the planning computed tomography led to significantly greater consistency in the clinical target volume definition. (N.C.)

  9. Sense of coherence is significantly associated with both metabolic syndrome and lifestyle in Japanese computer software office workers

    Directory of Open Access Journals (Sweden)

    Yusaku Morita

    2014-12-01

    Full Text Available Objectives: Sense of coherence (SOC is an individual characteristic related to a positive life orientation, leading to effective coping. Little is known about the relationship between SOC and metabolic syndrome (MetS. This cross-sectional study aimed at testing the hypothesis that workers with a strong SOC have fewer atherosclerotic risk factors, including MetS, and healthier lifestyle behaviors. Material and Methods: One hundred and sixty-seven computer software workers aged 20–64 years underwent a periodical health examination including assessment of body mass index, waist circumference, blood pressure, blood lipid levels, fasting blood sugar (FBS levels and lifestyle behaviors (walking duration, smoking status, nutrition, alcohol consumption, and sleep duration. During this period, the participants also completed a 29-item questionnaire of SOC and the Brief Job Stress Questionnaire to assess job stressors such as job strain and workplace social support. Results: Our results showed that the participants with a stronger SOC were likely to walk for at least 1 h a day, to eat slowly or at a moderate speed, and to sleep for at least 6 h. Compared with the participants with the weakest SOC, those with the strongest SOC had a significantly lower odds ratio (OR for being overweight (OR = 0.31; 95% confidence interval (CI: 0.11–0.81, and having higher FBS levels (OR = 0.11; 95% CI: 0.02–0.54, dyslipidemia (OR = 0.29; 95% CI: 0.09–0.84, and MetS (OR = 0.12; 95% CI: 0.02–0.63, even after adjusting for age, gender and job stressors. Conclusions: High SOC is associated with a healthy lifestyle and fewer atherosclerotic risk factors, including MetS.

  10. Significance and management of computed tomography detected pulmonary nodules: a report from the National Wilms Tumor Study Group

    International Nuclear Information System (INIS)

    Meisel, Jay A.; Guthrie, Katherine A.; Breslow, Norman E.; Donaldson, Sarah S.; Green, Daniel M.

    1999-01-01

    Purpose: To define the optimal treatment for children with Wilms tumor who have pulmonary nodules identified on chest computed tomography (CT) scan, but have a negative chest radiograph, we evaluated the outcome of all such patients randomized or followed on National Wilms Tumor Study (NWTS)-3 and -4. Patients and Methods: We estimated the event-free and overall survival percentages of 53 patients with favorable histology tumors and pulmonary densities identified only by CT scan (CT-only) who were treated as Stage IV with intensive doxorubicin-containing chemotherapy and whole-lung irradiation, and compared these to the event-free and overall survival percentages of 37 CT-only patients who were treated less aggressively based on the extent of locoregional disease with 2 or 3 drugs, and without whole-lung irradiation. Results: The 4-year event-free and overall survival percentages of the 53 patients with CT-only nodules and favorable histology Wilms tumor who were treated as Stage IV were 89% and 91%, respectively. The 4-year event-free and overall survival percentages for the 37 patients with CT-only nodules and favorable histology who were treated according to the extent of locoregional disease were 80% and 85%, respectively. The differences observed between the 2 groups were not statistically significant. Among the patients who received whole-lung irradiation, there were fewer pulmonary relapses, but more deaths attributable to lung toxicity. Conclusions: The current data raise the possibility that children with Wilms tumor and CT-only pulmonary nodules who receive whole lung irradiation have fewer pulmonary relapses, but a greater number of deaths due to treatment toxicity. The role of whole lung irradiation in the treatment of this group of patients cannot be definitively determined based on the present data. Prolonged follow-up of this group of patients is necessary to accurately estimate the frequency of late, treatment-related mortality

  11. Computation of 3-D magnetostatic fields using a reduced scalar potential

    International Nuclear Information System (INIS)

    Biro, O.; Preis, K.; Vrisk, G.; Richter, K.R.

    1993-01-01

    The paper presents some improvements to the finite element computation of static magnetic fields in three dimensions using a reduced magnetic scalar potential. New methods are described for obtaining an edge element representation of the rotational part of the magnetic field from a given source current distribution. In the case when the current distribution is not known in advance, a boundary value problem is set up in terms of a current vector potential. An edge element representation of the solution can be directly used in the subsequent magnetostatic calculation. The magnetic field in a D.C. arc furnace is calculated by first determining the current distribution in terms of a current vector potential. A three dimensional problem involving a permanent magnet as well as a coil is solved and the magnetic field in some points is compared with measurement results

  12. Behavior Life Style Analysis for Mobile Sensory Data in Cloud Computing through MapReduce

    Science.gov (United States)

    Hussain, Shujaat; Bang, Jae Hun; Han, Manhyung; Ahmed, Muhammad Idris; Amin, Muhammad Bilal; Lee, Sungyoung; Nugent, Chris; McClean, Sally; Scotney, Bryan; Parr, Gerard

    2014-01-01

    Cloud computing has revolutionized healthcare in today's world, as it can be seamlessly integrated into mobile applications and sensor devices. The sensory data are then transferred from these devices to public and private clouds. In this paper, a hybrid and distributed environment is built which is capable of collecting data from a mobile phone application and storing it in the cloud. We developed an activity recognition application and transferred the data to the cloud for further processing. The big data technology Hadoop MapReduce is employed to analyze the data and create a timeline of the user's activities. These activities are visualized to find useful health analytics and trends. In this paper a big data solution is proposed to analyze the sensory data and give insights into user behavior and lifestyle trends. PMID:25420151

  13. Behavior Life Style Analysis for Mobile Sensory Data in Cloud Computing through MapReduce

    Directory of Open Access Journals (Sweden)

    Shujaat Hussain

    2014-11-01

    Full Text Available Cloud computing has revolutionized healthcare in today’s world, as it can be seamlessly integrated into mobile applications and sensor devices. The sensory data are then transferred from these devices to public and private clouds. In this paper, a hybrid and distributed environment is built which is capable of collecting data from a mobile phone application and storing it in the cloud. We developed an activity recognition application and transferred the data to the cloud for further processing. The big data technology Hadoop MapReduce is employed to analyze the data and create a timeline of the user’s activities. These activities are visualized to find useful health analytics and trends. In this paper a big data solution is proposed to analyze the sensory data and give insights into user behavior and lifestyle trends.

  14. Reduced-dose C-arm computed tomography applications at a pediatric institution

    Energy Technology Data Exchange (ETDEWEB)

    Acord, Michael; Shellikeri, Sphoorti; Vatsky, Seth; Srinivasan, Abhay; Krishnamurthy, Ganesh; Keller, Marc S.; Cahill, Anne Marie [The Children' s Hospital of Philadelphia, Department of Radiology, Philadelphia, PA (United States)

    2017-12-15

    Reduced-dose C-arm computed tomography (CT) uses flat-panel detectors to acquire real-time 3-D images in the interventional radiology suite to assist with anatomical localization and procedure planning. To describe dose-reduction techniques for C-arm CT at a pediatric institution and to provide guidance for implementation. We conducted a 5-year retrospective study on procedures using an institution-specific reduced-dose protocol: 5 or 8 s Dyna Rotation, 248/396 projection images/acquisition and 0.1-0.17 μGy/projection dose at the detector with 0.3/0.6/0.9-mm copper (Cu) filtration. We categorized cases by procedure type and average patient age and calculated C-arm CT and total dose area product (DAP). Two hundred twenty-two C-arm CT-guided procedures were performed with a dose-reduction protocol. The most common procedures were temporomandibular and sacroiliac joint injections (48.6%) and sclerotherapy (34.2%). C-arm CT was utilized in cases of difficult percutaneous access in less common applications such as cecostomy and gastrostomy placement, foreign body retrieval and thoracentesis. C-arm CT accounted for between 9.9% and 80.7% of the total procedural DAP. Dose-reducing techniques can preserve image quality for intervention while reducing radiation exposure to the child. This technology has multiple applications within pediatric interventional radiology and can be considered as an adjunctive imaging tool in a variety of procedures, particularly when percutaneous access is challenging despite routine fluoroscopic or ultrasound guidance. (orig.)

  15. Doing Very Big Calculations on Modest Size Computers: Reducing the Cost of Exact Diagonalization Using Singular Value Decomposition

    International Nuclear Information System (INIS)

    Weinstein, M.

    2012-01-01

    I will talk about a new way of implementing Lanczos and contraction algorithms to diagonalize lattice Hamiltonians that dramatically reduces the memory required to do the computation, without restricting to variational ansatzes. (author)

  16. A Comprehensive study on Cloud Green Computing: To Reduce Carbon Footprints Using Clouds

    OpenAIRE

    Chiranjeeb Roy Chowdhury, Arindam Chatterjee, Alap Sardar, Shalabh Agarwal, Asoke Nath

    2013-01-01

    Cloud computing and green computing are two of the most emergent areas in information communication technology (ICT), with immense applications across the entire globe. The future trends of ICT will be more towards cloud computing and green computing. Due to tremendous improvements in computer networks, people now prefer network-based computing instead of doing everything with in-house computing. In any business sector, daily business and individual computing are now migrating from individual hard drives...

  17. Prognostic Significance of Tumor Size of Small Lung Adenocarcinomas Evaluated with Mediastinal Window Settings on Computed Tomography

    OpenAIRE

    Sakao, Yukinori; Kuroda, Hiroaki; Mun, Mingyon; Uehara, Hirofumi; Motoi, Noriko; Ishikawa, Yuichi; Nakagawa, Ken; Okumura, Sakae

    2014-01-01

    BACKGROUND: We aimed to clarify whether the size of a lung adenocarcinoma evaluated using mediastinal window settings on computed tomography is an important and useful measure for predicting invasiveness, lymph node metastasis and prognosis in small adenocarcinoma. METHODS: We evaluated 176 patients with small lung adenocarcinomas (diameter, 1-3 cm) who underwent standard surgical resection. Tumours were examined using computed tomography with thin section conditions (1.25 mm thick on high-resolution ...

  18. Study of Propagation Mechanisms in Dynamical Railway Environment to Reduce Computation Time of 3D Ray Tracing Simulator

    Directory of Open Access Journals (Sweden)

    Siham Hairoud

    2013-01-01

    Full Text Available In order to better assess the behaviour of the propagation channel in a confined environment such as a railway tunnel for subway applications, we present an optimization method for a deterministic channel simulator based on 3D ray tracing combined with the laws of geometrical optics and the uniform theory of diffraction. This tool requires a detailed description of the environment. Thus, the complexity of this model is directly tied to the complexity of the environment, and specifically to the number of facets that compose it. In this paper, we propose an algorithm to identify facets that have no significant impact on the wave propagation. Removing them allows us to simplify the description of the geometry of the modelled environment and thereby to reduce the complexity of our model and its computation time. A comparative study between the full and simplified environments is conducted and shows the impact of the proposed method on the characteristic parameters of the propagation channel. The computation time for the simplified environment is 6 times lower than that of the full model, without significant degradation of simulation accuracy.

  19. Ultrafast and scalable cone-beam CT reconstruction using MapReduce in a cloud computing environment.

    Science.gov (United States)

    Meng, Bowen; Pratx, Guillem; Xing, Lei

    2011-12-01

    Four-dimensional CT (4DCT) and cone beam CT (CBCT) are widely used in radiation therapy for accurate tumor target definition and localization. However, high-resolution and dynamic image reconstruction is computationally demanding because of the large amount of data processed. Efficient use of these imaging techniques in the clinic requires high-performance computing. The purpose of this work is to develop a novel ultrafast, scalable and reliable image reconstruction technique for 4D CBCT/CT using a parallel computing framework called MapReduce. We show the utility of MapReduce for solving large-scale medical physics problems in a cloud computing environment. In this work, we accelerated the Feldkamp-Davis-Kress (FDK) algorithm by porting it to Hadoop, an open-source MapReduce implementation. Gated phases from a 4DCT scan were reconstructed independently. Following the MapReduce formalism, Map functions were used to filter and backproject subsets of projections, and a Reduce function to aggregate those partial backprojections into the whole volume. MapReduce automatically parallelized the reconstruction process on a large cluster of computer nodes. As a validation, reconstruction of a digital phantom and an acquired CatPhan 600 phantom was performed on a commercial cloud computing environment using the proposed 4D CBCT/CT reconstruction algorithm. Speedup of reconstruction time was found to be roughly linear with the number of nodes employed. For instance, greater than 10 times speedup was achieved using 200 nodes for all cases, compared to the same code executed on a single machine. Without modifying the code, faster reconstruction is readily achievable by allocating more nodes in the cloud computing environment. The root mean square error between the images obtained using MapReduce and a single-threaded reference implementation was on the order of 10^(-7). Our study also proved that cloud computing with MapReduce is fault tolerant: the reconstruction completed
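
    The Map/Reduce decomposition described above (Map: filter and backproject a subset of projections; Reduce: sum the partial volumes) can be sketched schematically. The sketch below uses a parallel-beam geometry and a simple ramp filter as stand-ins, so it illustrates only the decomposition, not the actual FDK cone-beam implementation.

```python
# Schematic of the MapReduce decomposition of filtered backprojection:
# each "map task" filters and backprojects a subset of projections and the
# "reduce" step sums the partial volumes (illustration only, not FDK itself).
import numpy as np


def ramp_filter(projection):
    freqs = np.fft.fftfreq(projection.shape[-1])
    return np.real(np.fft.ifft(np.fft.fft(projection) * np.abs(freqs)))


def backproject(projection, angle_rad, size):
    xs, ys = np.meshgrid(np.arange(size) - size / 2, np.arange(size) - size / 2)
    t = xs * np.cos(angle_rad) + ys * np.sin(angle_rad) + size / 2
    idx = np.clip(t.astype(int), 0, size - 1)
    return projection[idx]


def map_task(subset):
    """One map task: filter and backproject its subset of (angle, projection) pairs."""
    size = len(subset[0][1])
    partial = np.zeros((size, size))
    for angle, proj in subset:
        partial += backproject(ramp_filter(proj), angle, size)
    return partial


def reduce_task(partial_volumes):
    """Reduce: aggregate the partial backprojections into the whole volume."""
    return sum(partial_volumes)


if __name__ == "__main__":
    size, n_proj = 64, 90
    angles = np.linspace(0, np.pi, n_proj, endpoint=False)
    projections = [(a, np.ones(size)) for a in angles]     # dummy projection data
    subsets = [projections[i::4] for i in range(4)]        # split across 4 "map tasks"
    image = reduce_task(map_task(s) for s in subsets)
    print(image.shape)
```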

  20. The use of wavelet filters for reducing noise in posterior fossa Computed Tomography images

    International Nuclear Information System (INIS)

    Pita-Machado, Reinado; Perez-Diaz, Marlen; Lorenzo-Ginori, Juan V.; Bravo-Pino, Rolando

    2014-01-01

    Wavelet transform based de-noising, such as wavelet shrinkage, gives good results in CT and affects the spatial resolution very little. Some applications are reconstruction methods, while others are a posteriori de-noising methods. De-noising after reconstruction is very difficult because the noise is non-stationary and has an unknown distribution. Methods that work in the sinogram space do not have this problem, because at that point they always work with a known noise distribution. On the other hand, the posterior fossa in a head CT is a very complex region for physicians, because it is commonly affected by artifacts and noise which are not eliminated during the reconstruction procedure. This can lead to false positive evaluations. The purpose of our present work is to compare different wavelet shrinkage de-noising filters applied in the sinogram space to reduce noise, particularly in images of the posterior fossa within CT scans. This work describes an experimental search for the best wavelets to reduce Poisson noise in computed tomography (CT) scans. Results showed that de-noising with wavelet filters improved the quality of the posterior fossa region in terms of an increased CNR, without noticeable structural distortions.
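
    A minimal wavelet-shrinkage sketch (here applied directly to a noisy image rather than the sinogram) is shown below for illustration. It assumes the PyWavelets library; the wavelet choice, the universal-threshold rule and the Poisson noise model are generic assumptions, not the specific filters compared in the study.

```python
# Wavelet shrinkage (soft thresholding of detail coefficients) with PyWavelets.
import numpy as np
import pywt


def wavelet_shrink(image, wavelet="db4", level=3):
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    # Universal threshold, with sigma estimated from the finest diagonal band.
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    threshold = sigma * np.sqrt(2 * np.log(image.size))
    shrunk = [coeffs[0]] + [
        tuple(pywt.threshold(band, threshold, mode="soft") for band in detail)
        for detail in coeffs[1:]
    ]
    return pywt.waverec2(shrunk, wavelet)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clean = np.zeros((128, 128))
    clean[32:96, 32:96] = 100.0
    noisy = rng.poisson(clean + 10).astype(float)   # Poisson-like counting noise
    denoised = wavelet_shrink(noisy)
    print(noisy.std(), denoised.std())
```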

  1. Computer Game Play Reduces Intrusive Memories of Experimental Trauma via Reconsolidation-Update Mechanisms.

    Science.gov (United States)

    James, Ella L; Bonsall, Michael B; Hoppitt, Laura; Tunbridge, Elizabeth M; Geddes, John R; Milton, Amy L; Holmes, Emily A

    2015-08-01

    Memory of a traumatic event becomes consolidated within hours. Intrusive memories can then flash back repeatedly into the mind's eye and cause distress. We investigated whether reconsolidation-the process during which memories become malleable when recalled-can be blocked using a cognitive task and whether such an approach can reduce these unbidden intrusions. We predicted that reconsolidation of a reactivated visual memory of experimental trauma could be disrupted by engaging in a visuospatial task that would compete for visual working memory resources. We showed that intrusive memories were virtually abolished by playing the computer game Tetris following a memory-reactivation task 24 hr after initial exposure to experimental trauma. Furthermore, both memory reactivation and playing Tetris were required to reduce subsequent intrusions (Experiment 2), consistent with reconsolidation-update mechanisms. A simple, noninvasive cognitive-task procedure administered after emotional memory has already consolidated (i.e., > 24 hours after exposure to experimental trauma) may prevent the recurrence of intrusive memories of those emotional events. © The Author(s) 2015.

  2. Computing and the Crisis: The Significant Role of New Information Technologies in the Current Socio-economic Meltdown

    Directory of Open Access Journals (Sweden)

    David Hakken

    2010-08-01

    Full Text Available There is good reason to be concerned about the long-term implications of the current crisis for the reproduction of contemporary social formations. Thus there is an urgent need to understand its character, especially its distinctive features. This article identifies profound ambiguities in valuing assets as new and key economic features of this crisis, ambiguities traceable to the dominant, “computationalist” computing used to develop new financial instruments. After some preliminaries, the article identifies four specific ways in which computerization of finance is generative of crisis. It then demonstrates how computationalist computing is linked to other efforts to extend commodification based on the ideology of so-called “intellectual property” (IP). Several other accounts of the crisis are considered and then demonstrated to have less explanatory value. After considering how some commons-oriented (e.g., Free/Libre and/or Open Source Software development projects) forms of computing also undermine the IP project, the article concludes with a brief discussion of what research on Socially Robust and Enduring Computing might contribute to fostering alternative, non-crisis generative ways to compute.

  3. Evaluation of two iterative techniques for reducing metal artifacts in computed tomography.

    Science.gov (United States)

    Boas, F Edward; Fleischmann, Dominik

    2011-06-01

    To evaluate two methods for reducing metal artifacts in computed tomography (CT)--the metal deletion technique (MDT) and the selective algebraic reconstruction technique (SART)--and compare these methods with filtered back projection (FBP) and linear interpolation (LI). The institutional review board approved this retrospective HIPAA-compliant study; informed patient consent was waived. Simulated projection data were calculated for a phantom that contained water, soft tissue, bone, and iron. Clinical projection data were obtained retrospectively from 11 consecutively identified CT scans with metal streak artifacts, with a total of 178 sections containing metal. Each scan was reconstructed using FBP, LI, SART, and MDT. The simulated scans were evaluated quantitatively by calculating the average error in Hounsfield units for each pixel compared with the original phantom. Two radiologists who were blinded to the reconstruction algorithms used qualitatively evaluated the clinical scans, ranking the overall severity of artifacts for each algorithm. P values for comparisons of the image quality ranks were calculated from the binomial distribution. The simulations showed that MDT reduces artifacts due to photon starvation, beam hardening, and motion and does not introduce new streaks between metal and bone. MDT had the lowest average error (76% less than FBP, 42% less than LI, 17% less than SART). Blinded comparison of the clinical scans revealed that MDT had the best image quality 100% of the time (95% confidence interval: 72%, 100%). LI had the second best image quality, and SART and FBP had the worst image quality. On images from two CT scans, as compared with images generated by the scanner, MDT revealed information of potential clinical importance. For a wide range of scans, MDT yields reduced metal streak artifacts and better-quality images than does FBP, LI, or SART. http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.11101782/-/DC1. RSNA, 2011
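
    As a point of reference, the linear interpolation (LI) approach compared above can be sketched in a few lines: projection samples passing through metal are replaced by values interpolated from their neighbours along each row of the sinogram. This is an illustrative sketch only, not the MDT or SART algorithms evaluated in the study.

```python
# Linear-interpolation (LI) metal artifact reduction in the sinogram domain:
# samples flagged as passing through metal are replaced by values interpolated
# from the remaining samples in the same projection row (illustration only).
import numpy as np


def li_correct(sinogram, metal_mask):
    corrected = sinogram.astype(float).copy()
    cols = np.arange(sinogram.shape[1])
    for row in range(sinogram.shape[0]):
        bad = metal_mask[row]
        if bad.any() and (~bad).any():
            corrected[row, bad] = np.interp(cols[bad], cols[~bad], sinogram[row, ~bad])
    return corrected


if __name__ == "__main__":
    sino = np.random.default_rng(0).normal(10, 1, (180, 256))
    mask = np.zeros_like(sino, dtype=bool)
    mask[:, 120:136] = True   # hypothetical metal trace
    print(li_correct(sino, mask).shape)
```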

  4. A hybrid solution using computational prediction and measured data to accurately determine process corrections with reduced overlay sampling

    Science.gov (United States)

    Noyes, Ben F.; Mokaberi, Babak; Mandoy, Ram; Pate, Alex; Huijgen, Ralph; McBurney, Mike; Chen, Owen

    2017-03-01

    Reducing overlay error via an accurate APC feedback system is one of the main challenges in high volume production of the current and future nodes in the semiconductor industry. The overlay feedback system directly affects the number of dies meeting overlay specification and the number of layers requiring dedicated exposure tools through the fabrication flow. Increasing the former number and reducing the latter number is beneficial for the overall efficiency and yield of the fabrication process. An overlay feedback system requires accurate determination of the overlay error, or fingerprint, on exposed wafers in order to determine corrections to be automatically and dynamically applied to the exposure of future wafers. Since current and future nodes require correction per exposure (CPE), the resolution of the overlay fingerprint must be high enough to accommodate CPE in the overlay feedback system, or overlay control module (OCM). Determining a high resolution fingerprint from measured data requires extremely dense overlay sampling that takes a significant amount of measurement time. For static corrections this is acceptable, but in an automated dynamic correction system this method creates extreme bottlenecks for the throughput of said system as new lots have to wait until the previous lot is measured. One solution is using a less dense overlay sampling scheme and employing computationally up-sampled data to a dense fingerprint. That method uses a global fingerprint model over the entire wafer; measured localized overlay errors are therefore not always represented in its up-sampled output. This paper will discuss a hybrid system shown in Fig. 1 that combines a computationally up-sampled fingerprint with the measured data to more accurately capture the actual fingerprint, including local overlay errors. Such a hybrid system is shown to result in reduced modelled residuals while determining the fingerprint, and better on-product overlay performance.

  5. Using a Cloud Computing System to Reduce Door-to-Balloon Time in Acute ST-Elevation Myocardial Infarction Patients Transferred for Percutaneous Coronary Intervention.

    Science.gov (United States)

    Ho, Chi-Kung; Chen, Fu-Cheng; Chen, Yung-Lung; Wang, Hui-Ting; Lee, Chien-Ho; Chung, Wen-Jung; Lin, Cheng-Jui; Hsueh, Shu-Kai; Hung, Shin-Chiang; Wu, Kuan-Han; Liu, Chu-Feng; Kung, Chia-Te; Cheng, Cheng-I

    2017-01-01

    This study evaluated the impact on clinical outcomes of using a cloud computing system to reduce percutaneous coronary intervention hospital door-to-balloon (DTB) time for ST segment elevation myocardial infarction (STEMI). A total of 369 patients before and after implementation of the transfer protocol were enrolled. Of these patients, 262 were transferred through the protocol while the other 107 patients were transferred through the traditional referral process. There were no significant differences in DTB time, pain to door of STEMI receiving center arrival time, and pain to balloon time between the two groups. Pain to electrocardiography time in patients with Killip I/II and catheterization laboratory to balloon time in patients with Killip III/IV were significantly reduced in the group transferred through the protocol compared with the traditional referral process group. However, use of the cloud computing system in our present protocol did not reduce DTB time.

  6. Assessment of right ventricular function using gated blood pool single photon emission computed tomography in inferior myocardial infarction with or without hemodynamically significant right ventricular infarction

    International Nuclear Information System (INIS)

    Takahashi, Masaharu

    1992-01-01

    Right ventricular function was assessed using gated blood pool single photon emission computed tomography (GSPECT) in 10 normal subjects and 14 patients with inferior myocardial infarction. Three-dimensional background subtraction was achieved by applying an optimal cut-off level. The patient group consisted of 6 patients with definite hemodynamic abnormalities indicative of right ventricular infarction (RVI) and 8 other patients with a significant obstructive lesion at the proximal portion of the right coronary artery without obvious hemodynamic signs of RVI. Right ventricular regional wall motion abnormalities were demonstrated on GSPECT functional images, and the indices of right ventricular function (i.e., the right ventricular ejection fraction (RVEF), the right ventricular peak ejection rate (RVPER) and the right ventricular peak filling rate (RVPFR)) were significantly reduced in the patient group, not only in the patients with definite RVI but also in those without hemodynamic signs of RVI. This indicates that right ventricular function can be impaired, even in the absence of definite hemodynamic signs, when the proximal portion of the right coronary artery is obstructed. It is concluded that GSPECT is reliable for the assessment of right ventricular function and regional wall motion, and is also useful for the diagnosis of RVI. (author)

  7. Segmentation process significantly influences the accuracy of 3D surface models derived from cone beam computed tomography

    NARCIS (Netherlands)

    Fourie, Zacharias; Damstra, Janalt; Schepers, Rutger H; Gerrits, Pieter; Ren, Yijin

    AIMS: To assess the accuracy of surface models derived from 3D cone beam computed tomography (CBCT) with two different segmentation protocols. MATERIALS AND METHODS: Seven fresh-frozen cadaver heads were used. There was no conflict of interests in this study. CBCT scans were made of the heads and 3D

  8. From meatless Mondays to meatless Sundays: motivations for meat reduction among vegetarians and semi-vegetarians who mildly or significantly reduce their meat intake.

    Science.gov (United States)

    De Backer, Charlotte J S; Hudders, Liselot

    2014-01-01

    This study explores vegetarians' and semi-vegetarians' motives for reducing their meat intake. Participants are categorized as vegetarians (remove all meat from their diet); semi-vegetarians (significantly reduce meat intake: at least three days a week); or light semi-vegetarians (mildly reduce meat intake: once or twice a week). Most differences appear between vegetarians and both groups of semi-vegetarians. Animal-rights and ecological concerns, together with taste preferences, predict vegetarianism, while an increase in health motives increases the odds of being semi-vegetarian. Even within each group, subgroups with different motives appear, and it is recommended that future researchers pay more attention to these differences.

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  10. Computer-aided detection system for chest radiography: reducing report turnaround times of examinations with abnormalities.

    Science.gov (United States)

    Kao, E-Fong; Liu, Gin-Chung; Lee, Lo-Yeh; Tsai, Huei-Yi; Jaw, Twei-Shiun

    2015-06-01

    The ability to give high priority to examinations with pathological findings could be very useful to radiologists with large work lists who wish to first evaluate the most critical studies. A computer-aided detection (CAD) system for identifying chest examinations with abnormalities has therefore been developed. To evaluate the effectiveness of a CAD system on report turnaround times of chest examinations with abnormalities. The CAD system was designed to automatically mark chest examinations with possible abnormalities in the work list of radiologists interpreting chest examinations. The system evaluation was performed in two phases: two radiologists interpreted the chest examinations without CAD in phase 1 and with CAD in phase 2. The time information recorded by the radiology information system was then used to calculate the turnaround times. All chest examinations were reviewed by two other radiologists and were divided into normal and abnormal groups. The turnaround times for the examinations with pathological findings with and without the CAD system assistance were compared. The sensitivity and specificity of the CAD for chest abnormalities were 0.790 and 0.697, respectively, and use of the CAD system decreased the turnaround time for chest examinations with abnormalities by 44%. The turnaround times required for radiologists to identify chest examinations with abnormalities could be reduced by using the CAD system. This system could be useful for radiologists with large work lists who wish to first evaluate the most critical studies. © The Foundation Acta Radiologica 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.

  11. Signal Amplification Technique (SAT): an approach for improving resolution and reducing image noise in computed tomography

    International Nuclear Information System (INIS)

    Phelps, M.E.; Huang, S.C.; Hoffman, E.J.; Plummer, D.; Carson, R.

    1981-01-01

    Spatial resolution improvements in computed tomography (CT) have been limited by the large and unique error propagation properties of this technique. The desire to provide maximum image resolution has resulted in the use of reconstruction filter functions designed to produce tomographic images with resolution as close as possible to the intrinsic detector resolution. Thus, many CT systems produce images with excessive noise with the system resolution determined by the detector resolution rather than the reconstruction algorithm. CT is a rigorous mathematical technique which applies an increasing amplification to increasing spatial frequencies in the measured data. This mathematical approach to spatial frequency amplification cannot distinguish between signal and noise and therefore both are amplified equally. We report here a method in which tomographic resolution is improved by using very small detectors to selectively amplify the signal and not noise. Thus, this approach is referred to as the signal amplification technique (SAT). SAT can provide dramatic improvements in image resolution without increases in statistical noise or dose because increases in the cutoff frequency of the reconstruction algorithm are not required to improve image resolution. Alternatively, in cases where image counts are low, such as in rapid dynamic or receptor studies, statistical noise can be reduced by lowering the cutoff frequency while still maintaining the best possible image resolution. A possible system design for a positron CT system with SAT is described

  12. The impact of slice-reduced computed tomography on histogram-based densitometry assessment of lung fibrosis in patients with systemic sclerosis.

    Science.gov (United States)

    Nguyen-Kim, Thi Dan Linh; Maurer, Britta; Suliman, Yossra A; Morsbach, Fabian; Distler, Oliver; Frauenfelder, Thomas

    2018-04-01

    To evaluate the usability of slice-reduced sequential computed tomography (CT) compared to standard high-resolution CT (HRCT) in patients with systemic sclerosis (SSc) for qualitative and quantitative assessment of interstitial lung disease (ILD) with respect to (I) detection of lung parenchymal abnormalities, (II) qualitative and semiquantitative visual assessment, (III) quantification of ILD by histograms and (IV) accuracy for the 20% cut-off discrimination. From standard chest HRCT of 60 SSc patients, sequential 9-slice computed tomography (reduced HRCT) was retrospectively reconstructed. ILD was assessed by visual scoring and quantitative histogram parameters. Results from standard and reduced HRCT were compared using non-parametric tests and analysed by univariate linear regression analyses. With respect to the detection of parenchymal abnormalities, only the detection of intrapulmonary bronchiectasis was significantly lower in reduced HRCT compared to standard HRCT (P=0.039). No differences were found comparing visual scores for fibrosis severity and extension from standard and reduced HRCT (P=0.051-0.073). All scores correlated significantly with histogram parameters derived from both standard and reduced HRCT. Significantly higher values of kurtosis and skewness were found for reduced HRCT. Histogram parameters from reduced HRCT showed significant discrimination at the 20% fibrosis cut-off (sensitivity 88% for both kurtosis and skewness; specificity 81% for kurtosis and 86% for skewness; cut-off kurtosis ≤26, cut-off skewness ≤4). Histogram parameters derived from the reduced HRCT approach could thus discriminate at a threshold of 20% lung fibrosis with high sensitivity and specificity. Hence, it might be used to detect early disease progression of lung fibrosis in the context of monitoring and treatment of SSc patients.
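
    The histogram indices used above (kurtosis and skewness of the lung attenuation distribution) can be computed in a few lines. The sketch below uses simulated attenuation values in place of a real segmented lung, purely for illustration.

```python
# Histogram-based densitometry indices (kurtosis and skewness) of lung
# attenuation values; the values below are simulated placeholders.
import numpy as np
from scipy.stats import kurtosis, skew


def histogram_indices(hu_values):
    return {"kurtosis": kurtosis(hu_values), "skewness": skew(hu_values)}


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    aerated_hu = rng.normal(-800, 60, 50_000)     # mostly aerated lung voxels
    fibrotic_hu = rng.normal(-300, 120, 5_000)    # denser, fibrotic voxels
    print(histogram_indices(np.concatenate([aerated_hu, fibrotic_hu])))
```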

  13. Computer tablet distraction reduces pain and anxiety in pediatric burn patients undergoing hydrotherapy: A randomized trial.

    Science.gov (United States)

    Burns-Nader, Sherwood; Joe, Lindsay; Pinion, Kelly

    2017-09-01

    Distraction is often used in conjunction with analgesics to minimize pain in pediatric burn patients during treatment procedures. Computer tablets provide many options for distraction items in one tool and are often used during medical procedures. Few studies have examined the effectiveness of tablet distraction in improving the care of pediatric burn patients. This study examines the effectiveness of tablet distraction provided by a child life specialist to minimize pain and anxiety in pediatric burn patients undergoing hydrotherapy. Thirty pediatric patients (4-12) undergoing hydrotherapy for the treatment of burns participated in this randomized clinical trial. The tablet distraction group received tablet distraction provided by a child life specialist while those in the control group received standard care. Pain was assessed through self-reports and observation reports. Anxiety was assessed through behavioral observations. Length of procedure was also recorded. Nurses reported significantly less pain for the tablet distraction group compared to the control group. There was no significant difference between groups on self-reported pain. The tablet distraction group displayed significantly less anxiety during the procedure compared to the control group. Also, the tablet distraction group returned to baseline after the procedure while those in the control group displayed higher anxiety post-procedure. There was no difference in the length of the procedure between groups. These findings suggest tablet distraction provided by a child life specialist may be an effective method for improving pain and anxiety in children undergoing hydrotherapy treatment for burns. Copyright © 2017 Elsevier Ltd and ISBI. All rights reserved.

  14. Tilting the jaw to improve the image quality or to reduce the dose in cone-beam computed tomography

    International Nuclear Information System (INIS)

    Luckow, Marlen; Deyhle, Hans; Beckmann, Felix; Dagassan-Berndt, Dorothea; Müller, Bert

    2011-01-01

    Objective: To improve the image quality in cone-beam computed tomography (CBCT) by tilting a mandible containing two dental titanium implants within the relevant range of motion. Materials and methods: Using the mandible of a five-month-old pig, CBCT was performed varying the accelerating voltage, the beam current, the starting rotation angle of the mandible in the source-detector plane and the tilt angle of the jaw with respect to the source-detector plane. The different datasets were automatically registered to micro-CT data to extract the common volume and the deviation from the pre-defined standard that characterizes the image quality. Results: The variations of the accelerating voltage, beam current and rotation within the source-detector plane showed the expected quantitative behavior, indicating an appropriate choice of the image quality factor. Tilting the porcine mandible by about 14° improves the image quality by almost a factor of two. Conclusions: Tilting a mandible with two dental implants can be used to significantly reduce the artifacts from strongly X-ray-absorbing materials in the CBCT images. The comparison of 14° jaw tilting with the currently recommended arrangement in plane with the teeth demonstrates that the applied exposure time and the related dose can be reduced by a factor of four without decreasing the image quality.

  15. Evaluation of a prototype correction algorithm to reduce metal artefacts in flat detector computed tomography of scaphoid fixation screws

    Energy Technology Data Exchange (ETDEWEB)

    Filli, Lukas; Finkenstaedt, Tim; Andreisek, Gustav; Guggenberger, Roman [University Hospital of Zurich, Department of Diagnostic and Interventional Radiology, Zurich (Switzerland); Marcon, Magda [University Hospital of Zurich, Department of Diagnostic and Interventional Radiology, Zurich (Switzerland); University of Udine, Institute of Diagnostic Radiology, Department of Medical and Biological Sciences, Udine (Italy); Scholz, Bernhard [Imaging and Therapy Division, Siemens AG, Healthcare Sector, Forchheim (Germany); Calcagni, Maurizio [University Hospital of Zurich, Division of Plastic Surgery and Hand Surgery, Zurich (Switzerland)

    2014-12-15

    The aim of this study was to evaluate a prototype correction algorithm to reduce metal artefacts in flat detector computed tomography (FDCT) of scaphoid fixation screws. FDCT has gained interest in imaging small anatomic structures of the appendicular skeleton. Angiographic C-arm systems with flat detectors allow fluoroscopy and FDCT imaging in a one-stop procedure emphasizing their role as an ideal intraoperative imaging tool. However, FDCT imaging can be significantly impaired by artefacts induced by fixation screws. Following ethical board approval, commercially available scaphoid fixation screws were inserted into six cadaveric specimens in order to fix artificially induced scaphoid fractures. FDCT images corrected with the algorithm were compared to uncorrected images both quantitatively and qualitatively by two independent radiologists in terms of artefacts, screw contour, fracture line visibility, bone visibility, and soft tissue definition. Normal distribution of variables was evaluated using the Kolmogorov-Smirnov test. In case of normal distribution, quantitative variables were compared using paired Student's t tests. The Wilcoxon signed-rank test was used for quantitative variables without normal distribution and all qualitative variables. A p value of < 0.05 was considered to indicate statistically significant differences. Metal artefacts were significantly reduced by the correction algorithm (p < 0.001), and the fracture line was more clearly defined (p < 0.01). The inter-observer reliability was ''almost perfect'' (intra-class correlation coefficient 0.85, p < 0.001). The prototype correction algorithm in FDCT for metal artefacts induced by scaphoid fixation screws may facilitate intra- and postoperative follow-up imaging. (orig.)

  16. Evaluation of a prototype correction algorithm to reduce metal artefacts in flat detector computed tomography of scaphoid fixation screws

    International Nuclear Information System (INIS)

    Filli, Lukas; Finkenstaedt, Tim; Andreisek, Gustav; Guggenberger, Roman; Marcon, Magda; Scholz, Bernhard; Calcagni, Maurizio

    2014-01-01

    The aim of this study was to evaluate a prototype correction algorithm to reduce metal artefacts in flat detector computed tomography (FDCT) of scaphoid fixation screws. FDCT has gained interest in imaging small anatomic structures of the appendicular skeleton. Angiographic C-arm systems with flat detectors allow fluoroscopy and FDCT imaging in a one-stop procedure emphasizing their role as an ideal intraoperative imaging tool. However, FDCT imaging can be significantly impaired by artefacts induced by fixation screws. Following ethical board approval, commercially available scaphoid fixation screws were inserted into six cadaveric specimens in order to fix artificially induced scaphoid fractures. FDCT images corrected with the algorithm were compared to uncorrected images both quantitatively and qualitatively by two independent radiologists in terms of artefacts, screw contour, fracture line visibility, bone visibility, and soft tissue definition. Normal distribution of variables was evaluated using the Kolmogorov-Smirnov test. In case of normal distribution, quantitative variables were compared using paired Student's t tests. The Wilcoxon signed-rank test was used for quantitative variables without normal distribution and all qualitative variables. A p value of < 0.05 was considered to indicate statistically significant differences. Metal artefacts were significantly reduced by the correction algorithm (p < 0.001), and the fracture line was more clearly defined (p < 0.01). The inter-observer reliability was ''almost perfect'' (intra-class correlation coefficient 0.85, p < 0.001). The prototype correction algorithm in FDCT for metal artefacts induced by scaphoid fixation screws may facilitate intra- and postoperative follow-up imaging. (orig.)

  17. Virtually going green: The role of quantum computational chemistry in reducing pollution and toxicity in chemistry

    Science.gov (United States)

    Stevens, Jonathan

    2017-07-01

    Continuing advances in computational chemistry have permitted quantum mechanical calculations to assist research in green chemistry and to contribute to the greening of chemical practice. Presented here are recent examples illustrating the contribution of computational quantum chemistry to green chemistry, including the possibility of using computation as a green alternative to experiments, as well as contributions to greener catalysis and the search for greener solvents. Examples of applications of computation to ambitious projects for green synthetic chemistry using carbon dioxide are also presented.

  18. Image quality analysis to reduce dental artifacts in head and neck imaging with dual-source computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Ketelsen, D.; Werner, M.K.; Thomas, C.; Tsiflikas, I.; Reimann, A.; Claussen, C.D.; Heuschmid, M. [Tuebingen Univ. (Germany). Abt. fuer Diagnostische und Interventionelle Radiologie; Koitschev, A. [Tuebingen Univ. (Germany). Abt. fuer Hals-Nasen-Ohrenheilkunde

    2009-01-15

    Purpose: Important oropharyngeal structures can be obscured by metallic artifacts from dental implants. The aim of this study was to compare the image quality of multiplanar reconstructions (MPR) and an angulated spiral in dual-source computed tomography (DSCT) of the neck. Materials and Methods: Sixty-two patients were included for neck imaging with DSCT. MPRs from an axial dataset and an additional short spiral parallel to the mouth floor were acquired. Leading anatomical structures were then evaluated with respect to the extent to which they were affected by dental artifacts using a visual scale ranging from 1 (least artifacts) to 4 (most artifacts). Results: In MPR, 87.1% of anatomical structures had significant artifacts (3.12 ± 0.86), while in the angulated slices the leading anatomical structures of the oropharynx showed negligible artifacts (1.28 ± 0.46). The diagnostic gain of the primarily angulated slices with respect to artifact severity was significant (p < 0.01). Conclusion: MPRs cannot reduce dental artifacts sufficiently. In patients with dental artifacts overlying the anatomical structures of the oropharynx, an additional short angulated spiral parallel to the floor of the mouth is recommended and should be applied in daily routine. Owing to the static gantry design of DSCT, the use of a flexible head holder is essential. (orig.)

  19. Algorithm for computing significance levels using the Kolmogorov-Smirnov statistic and valid for both large and small samples

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, S.E.; Fields, D.E.

    1983-10-01

    The KSTEST code presented here is designed to perform the Kolmogorov-Smirnov one-sample test. The code may be used as a stand-alone program or the principal subroutines may be excerpted and used to service other programs. The Kolmogorov-Smirnov one-sample test is a nonparametric goodness-of-fit test. A number of codes to perform this test are in existence, but they suffer from the inability to provide meaningful results in the case of small sample sizes (number of values less than or equal to 80). The KSTEST code overcomes this inadequacy by using two distinct algorithms. If the sample size is greater than 80, an asymptotic series developed by Smirnov is evaluated. If the sample size is 80 or less, a table of values generated by Birnbaum is referenced. Valid results can be obtained from KSTEST when the sample contains from 3 to 300 data points. The program was developed on a Digital Equipment Corporation PDP-10 computer using the FORTRAN-10 language. The code size is approximately 450 card images and the typical CPU execution time is 0.19 s.
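
    A modern, hedged equivalent of this two-regime approach is available in SciPy, whose one-sample Kolmogorov-Smirnov test likewise switches between an exact small-sample computation and the asymptotic Smirnov series depending on sample size; the library call below is standard SciPy, but the data and reference distribution are illustrative only.

        # One-sample Kolmogorov-Smirnov goodness-of-fit test with SciPy
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)
        sample = rng.normal(loc=0.0, scale=1.0, size=30)   # small sample, n <= 80

        # Compare the sample against a fully specified N(0, 1) reference CDF.
        result = stats.kstest(sample, "norm", args=(0.0, 1.0))
        print(f"D = {result.statistic:.4f}, p = {result.pvalue:.4f}")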

  20. Transport coefficient computation based on input/output reduced order models

    Science.gov (United States)

    Hurst, Joshua L.

    equation. It is a recursive method that solves nonlinear ODEs by solving an LTV system at each iteration to obtain a new, closer solution. LTV models are derived for both Gosling and Lees-Edwards type models. Particular attention is given to SLLOD Lees-Edwards models because they are in the form most amenable to Taylor series expansion and are the model most commonly used to examine viscosity. With the linear models developed, a method is presented to calculate viscosity based on LTI Gosling models, but it is shown to have some limitations. To address these issues, LTV SLLOD models are analyzed with both Balanced Truncation and POD, and both show that significant order reduction is possible. Examining the singular values of both techniques shows that Balanced Truncation has the potential to offer greater reduction, which should be expected as it is based on the input/output mapping rather than just the state information, as in POD. Obtaining reduced-order systems that capture the property of interest is challenging. For Balanced Truncation, reduced-order models for 1-D LJ and FENE systems are obtained and are shown to capture the output of interest fairly well. However, numerical challenges currently limit this analysis to small-order systems. Suggestions are presented to extend the method to larger systems. In addition, reduced second-order systems are obtained from POD. Here the challenge is extending the solution beyond the original period used for the projection, in particular identifying the manifold along which the solution travels. The remaining challenges are presented and discussed.
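
    As a concrete illustration of the Balanced Truncation machinery mentioned above, the sketch below computes Hankel singular values for a small stable LTI system using SciPy; the state-space matrices are illustrative placeholders, not the molecular-dynamics models discussed in the record.

        # Hankel singular values via the controllability/observability Gramians
        import numpy as np
        from scipy.linalg import solve_continuous_lyapunov, eigvals

        A = np.array([[-1.0, 0.5], [0.0, -3.0]])   # stable system dx/dt = Ax + Bu
        B = np.array([[1.0], [1.0]])
        C = np.array([[1.0, 0.0]])                 # output y = Cx

        # Lyapunov equations: A Wc + Wc A^T + B B^T = 0 and A^T Wo + Wo A + C^T C = 0
        Wc = solve_continuous_lyapunov(A, -B @ B.T)
        Wo = solve_continuous_lyapunov(A.T, -C.T @ C)

        # States with small Hankel singular values contribute little to the
        # input/output map and are the ones truncated in Balanced Truncation.
        hsv = np.sqrt(np.sort(np.real(eigvals(Wc @ Wo)))[::-1])
        print("Hankel singular values:", hsv)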

  1. Reduced iodinated contrast media for abdominal imaging by dual-layer spectral detector computed tomography for patients with kidney disease

    Directory of Open Access Journals (Sweden)

    Hirokazu Saito, MD

    2018-04-01

    Contrast-enhanced computed tomography using iodinated contrast media is useful for the diagnosis of gastrointestinal diseases. However, contrast-induced nephropathy remains problematic for patients with kidney disease. Although current guidelines recommend using the minimal dose of contrast media necessary to obtain adequate images for diagnosis, obtaining adequate images with sufficient contrast enhancement is difficult with conventional computed tomography using reduced contrast media. Dual-layer spectral detector computed tomography enables the simultaneous acquisition of low- and high-energy data and the retrospective reconstruction of virtual monochromatic images ranging from 40 to 200 keV. Low-energy virtual monochromatic images can enhance image contrast, thereby facilitating reduced contrast media. In case 1, abdominal computed tomography angiography at 50 keV using 40% of the conventional dose of contrast media revealed the artery that was the source of diverticular bleeding in the ascending colon. In case 2, ischemia of the transverse colon was diagnosed by contrast-enhanced computed tomography and iodine-selective imaging using 40% of the conventional dose of contrast media. In case 3, advanced esophagogastric junctional cancer was staged and preoperative abdominal computed tomography angiography could be obtained with 30% of the conventional dose of contrast media. However, the texture of virtual monochromatic images may be a limitation at low energy. Keywords: Virtual monochromatic images, Contrast-induced nephropathy

  2. Scaling Watershed Models: Modern Approaches to Science Computation with MapReduce, Parallelization, and Cloud Optimization

    Science.gov (United States)

    Environmental models are products of the computer architecture and software tools available at the time of development. Scientifically sound algorithms may persist in their original state even as system architectures and software development approaches evolve and progress. Dating...

  3. Segmentation process significantly influences the accuracy of 3D surface models derived from cone beam computed tomography

    International Nuclear Information System (INIS)

    Fourie, Zacharias; Damstra, Janalt; Schepers, Rutger H.; Gerrits, Peter O.; Ren Yijin

    2012-01-01

    Aims: To assess the accuracy of surface models derived from 3D cone beam computed tomography (CBCT) with two different segmentation protocols. Materials and methods: Seven fresh-frozen cadaver heads were used. There was no conflict of interest in this study. CBCT scans were made of the heads and 3D surface models of the mandible were created using two different segmentation protocols. One series of 3D models was segmented by a commercial software company, while the other series was segmented by an experienced 3D clinician. The heads were then macerated following a standard process. A high-resolution laser surface scanner was used to make a 3D model of the macerated mandibles, which acted as the reference 3D model or “gold standard”. The 3D models generated from the two rendering protocols were compared with the “gold standard” using a point-based rigid registration algorithm to superimpose the three 3D models. The linear difference at 25 anatomic and cephalometric landmarks between the laser surface scan and the 3D models generated from the two rendering protocols was measured repeatedly in two sessions with a one-week interval. Results: The agreement between the repeated measurements was excellent (ICC = 0.923–1.000). The mean deviation from the gold standard of the 3D models generated by the commercial software (CS) group was 0.330 mm ± 0.427, while the mean deviation of the clinician's (DS) rendering was 0.763 mm ± 0.392. The surface models segmented by both the CS and DS protocols tended to be larger than the reference models. In the DS group, the biggest mean differences with the LSS models were found at the points ConLatR (CI: 0.83–1.23), ConMedR (CI: −3.16 to 2.25), CoLatL (CI: −0.68 to 2.23), Spine (CI: 1.19–2.28), ConAntL (CI: 0.84–1.69), ConSupR (CI: −1.12 to 1.47) and RetMolR (CI: 0.84–1.80). Conclusion: The commercially segmented models resembled reality more closely than the clinician-segmented models. If 3D models are needed for surgical drilling

  4. Improving Quality of Service and Reducing Power Consumption with WAN accelerator in Cloud Computing Environments

    OpenAIRE

    Shin-ichi Kuribayashi

    2013-01-01

    The widespread use of cloud computing services is expected to degrade Quality of Service and to increase the power consumption of ICT devices, since the distance to a server becomes longer than before. Migration of virtual machines over a wide area can solve many problems, such as load balancing and power saving, in cloud computing environments. This paper proposes to dynamically apply a WAN accelerator within the network when a virtual machine is moved to a distant center, in order to preve...

  5. Biofeedback effectiveness to reduce upper limb muscle activity during computer work is muscle specific and time pressure dependent

    DEFF Research Database (Denmark)

    Vedsted, Pernille; Søgaard, Karen; Blangsted, Anne Katrine

    2011-01-01

    Continuous electromyographic (EMG) activity level is considered a risk factor in developing muscle disorders. EMG biofeedback is known to be useful in reducing EMG activity in working muscles during computer work. The purpose was to test the following hypotheses: (1) unilateral biofeedback from the trapezius (TRA) can reduce bilateral TRA activity but not extensor digitorum communis (EDC) activity; (2) biofeedback from EDC can reduce activity in EDC but not in TRA; (3) biofeedback is more effective in the no-time-constraint than in the time-constraint working condition. Eleven healthy women performed computer work during two different working conditions (time constraint/no time constraint) while receiving biofeedback. Biofeedback was given from the right TRA or EDC through two modes (visual/auditory), using EMG or mechanomyography as the biofeedback source. During control sessions (no biofeedback), EMG...

  6. Cloud Computing Security Model with Combination of Data Encryption Standard Algorithm (DES) and Least Significant Bit (LSB)

    Science.gov (United States)

    Basri, M.; Mawengkang, H.; Zamzami, E. M.

    2018-03-01

    Limited local storage is one reason to switch to cloud storage. The confidentiality and security of data stored in the cloud are very important. One way to maintain the confidentiality and security of such data is to use cryptographic techniques. The Data Encryption Standard (DES) is a block cipher used as a standard symmetric encryption algorithm. DES produces 8 cipher blocks that are combined into one ciphertext, but this ciphertext is weak against brute-force attacks. Therefore, the final 8 cipher blocks are converted into 8 random images using the Least Significant Bit (LSB) algorithm, which embeds the DES cipher output in the images before they are merged into one.
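
    A minimal sketch of the two building blocks named above (DES encryption followed by LSB embedding) is given below. It assumes the PyCryptodome package and a single grayscale numpy image as the cover; the paper's scheme of splitting the ciphertext across 8 separate images is simplified here for brevity.

        # DES encryption followed by LSB steganography (illustrative sketch)
        import numpy as np
        from Crypto.Cipher import DES
        from Crypto.Util.Padding import pad

        key = b"8bytekey"                              # DES requires an 8-byte key
        cipher = DES.new(key, DES.MODE_ECB)            # ECB keeps the sketch short; not secure in practice
        ciphertext = cipher.encrypt(pad(b"cloud data to protect", DES.block_size))

        # Embed the ciphertext bits into the least significant bit of each pixel.
        cover = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
        bits = np.unpackbits(np.frombuffer(ciphertext, dtype=np.uint8))
        stego = cover.copy().ravel()
        stego[: bits.size] = (stego[: bits.size] & 0xFE) | bits
        stego = stego.reshape(cover.shape)

        # Recover the ciphertext from the stego image's least significant bits.
        recovered = np.packbits(stego.ravel()[: bits.size] & 1).tobytes()
        assert recovered == ciphertext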

  7. The transhumanism of Ray Kurzweil. Is biological ontology reducible to computation?

    Directory of Open Access Journals (Sweden)

    Javier Monserrat

    2016-02-01

    Computer programs, primarily machine-vision engineering and the programming of somatic sensors, have already made it possible, and will do so more perfectly in the future, to build highly sophisticated androids or cyborgs. They will collaborate with man and open new moral reflections on respecting the ontological dignity of the new humanoid machines. In addition, both humans and the new androids will be connected to huge external computer networks that will raise to almost incredible levels the efficiency in the mastery of body and nature. However, our current scientific knowledge about, on the one hand, the hardware and software that will support both the humanoid machines and the external computer networks, built with existing engineering (and also with foreseeable medium- and even long-term engineering), and, on the other hand, animal and human behavior arising from the neural-biological structures that produce a psychic system, allows us to establish that there is no scientific basis for speaking of an ontological identity between computational machines and man. Accordingly, different ontologies (computational machines and biological entities) will produce different functional systems. There may be simulation, but never ontological identity. These ideas are essential to assess the transhumanism of Ray Kurzweil.

  8. A Computer-Based Interactive Multimedia Program to Reduce HIV Transmission for Women with Intellectual Disability

    Science.gov (United States)

    Delaine, Khaya

    2011-01-01

    Background Despite recent recognition of the need for preventive sexual health materials for people with intellectual disability (ID), there have been remarkably few health-based interventions designed for people with mild to moderate ID. The purpose of this study was to evaluate the effects of a computer-based interactive multimedia (CBIM) program to teach HIV/AIDS knowledge, skills, and decision-making. Methods Twenty-five women with mild to moderate intellectual disability evaluated the program. The study used a quasi-experimental within-subjects design to assess the efficacy of the CBIM program. Research participants completed five qualitative and quantitative instruments that assessed HIV knowledge, and decision-making skills regarding HIV prevention practices and condom application skills (i.e., demonstration of skills opening a condom and putting it on a model penis). In addition, 18 service providers who work with women with ID reviewed the program and completed a demographics questionnaire and a professional customer satisfaction survey. Results Women with ID showed statistically significant increases from pretest to posttest in all knowledge and skill domains. Furthermore, the statistical gains were accompanied by medium to large effect sizes. Overall, service providers rated the program highly on several outcome measures (stimulation, relevance, and usability). Conclusions The results of this study indicate the CBIM program was effective in increasing HIV/AIDS knowledge and skills among women with ID, who live both semi-independently and independently, in a single-session intervention. Since the CBIM program is not dependent on staff for instructional delivery, it is a highly efficient teaching tool; and CBIM is an efficacious means to provide behavioral health content, compensating for the dearth of available health promotion materials for people with ID. As such, it has a potential for broad distribution and implementation by medical practitioners, and

  9. SLMRACE: a noise-free RACE implementation with reduced computational time

    Science.gov (United States)

    Chauvin, Juliet; Provenzi, Edoardo

    2017-05-01

    We present a faster and noise-free implementation of the RACE algorithm. RACE has mixed characteristics between the famous Retinex model of Land and McCann and the automatic color equalization (ACE) color-correction algorithm. The original random spray-based RACE implementation suffers from two main problems: its computational time and the presence of noise. Here, we will show that it is possible to adapt two techniques recently proposed by Banić et al. to the RACE framework in order to drastically decrease the computational time and noise generation. The implementation will be called smart-light-memory-RACE (SLMRACE).

  10. Artificial Neural Networks for Reducing Computational Effort in Active Truncated Model Testing of Mooring Lines

    DEFF Research Database (Denmark)

    Christiansen, Niels Hørbye; Voie, Per Erlend Torbergsen; Høgsberg, Jan Becker

    2015-01-01

    simultaneously, this method is very demanding in terms of numerical efficiency and computational power. Therefore, this method has not yet proved to be feasible. It has recently been shown how a hybrid method combining classical numerical models and artificial neural networks (ANN) can provide a dramatic...... prior to the experiment, and with a properly trained ANN it is no problem to obtain accurate simulations much faster than real time, without any need for large computational capacity. The present study demonstrates how this hybrid method can be applied to the active truncated experiments yielding a system...

  11. Prenatal prochloraz treatment significantly increases pregnancy length and reduces offspring weight but does not affect social-olfactory memory in rats

    DEFF Research Database (Denmark)

    Dmytriyeva, Oksana; Klementiev, Boris; Berezin, Vladimir

    2013-01-01

    Metabolites of the commonly used imidazole fungicide prochloraz are androgen receptor antagonists. They have been shown to block androgen-driven development and compromise reproductive function. We tested the effect of prochloraz on cognitive behavior following exposure to this fungicide during...... the perinatal period. Pregnant Wistar rats were administered a 200mg/kg dose of prochloraz on gestational day (GD) 7, GD11, and GD15. The social recognition test (SRT) was performed on 7-week-old male rat offspring. We found an increase in pregnancy length and a significantly reduced pup weight on PND15 and PND...

  12. Using a Cloud Computing System to Reduce Door-to-Balloon Time in Acute ST-Elevation Myocardial Infarction Patients Transferred for Percutaneous Coronary Intervention

    Directory of Open Access Journals (Sweden)

    Chi-Kung Ho

    2017-01-01

    Background. This study evaluated the impact on clinical outcomes of using a cloud computing system to reduce hospital door-to-balloon (DTB) time for percutaneous coronary intervention in ST-segment elevation myocardial infarction (STEMI). Methods. A total of 369 patients before and after implementation of the transfer protocol were enrolled. Of these patients, 262 were transferred through the protocol while the other 107 patients were transferred through the traditional referral process. Results. There were no significant differences in DTB time, pain-to-door (arrival at the STEMI receiving center) time, and pain-to-balloon time between the two groups. Pain-to-electrocardiography time in patients with Killip I/II and catheterization-laboratory-to-balloon time in patients with Killip III/IV were significantly reduced in the protocol group compared to the traditional referral group (both p<0.05). There were also no remarkable differences in the complication rate and 30-day mortality between the two groups. The multivariate analysis revealed that the independent predictors of 30-day mortality were older age, advanced Killip class, and a higher troponin-I level. Conclusions. This study showed that the present transfer protocol could reduce pain-to-electrocardiography time in Killip I/II patients and catheterization-laboratory-to-balloon time in Killip III/IV patients. However, using the cloud computing system in the present protocol did not reduce DTB time.

  13. Noninvasive Coronary Angiography using 64-Detector-Row Computed Tomography in Patients with a Low to Moderate Pretest Probability of Significant Coronary Artery Disease

    Energy Technology Data Exchange (ETDEWEB)

    Schlosser, T.; Mohrs, O.K.; Magedanz, A.; Nowak, B.; Voigtlaender, T.; Barkhausen, J.; Schmermund, A. [Dept. of Diagnostic and Interventional Radiology and Neuroradiology, University Hospital Essen, Essen (Germany)

    2007-04-15

    Purpose: To evaluate the value of 64-detector-row computed tomography for ruling out high-grade coronary stenoses in patients with a low to moderate pretest probability of significant coronary artery disease. Material and Methods: The study included 61 patients with a suspicion of coronary artery disease on the basis of atypical angina or ambiguous findings in noninvasive stress testing and a class II indication for invasive coronary angiography (ICA). All patients were examined by 64-detector-row computed tomography angiography (CTA) and ICA. On a coronary segmental level, the presence of significant (>50% diameter) stenoses was examined. Results: In a total of 915 segments, CTA detected 62 significant stenoses. Thirty-four significant stenoses were confirmed by ICA, whereas 28 stenoses could not be confirmed by ICA. Twenty-two of them showed wall irregularities on ICA, and six were angiographically normal. Accordingly, on a coronary segmental basis, 28 false-positive and 0 false-negative findings resulted in a sensitivity of 100%, a specificity of 96.8%, a positive predictive value of 54.8%, and a negative predictive value of 100%. The diagnostic accuracy was 96.9%. Conclusion: Sixty-four-detector-row computed tomography reliably detects significant coronary stenoses in patients with suspected coronary artery disease and appears to be helpful in the selection of patients who need to undergo ICA. Calcified and non-calcified plaques are detected. Grading of stenoses in areas with calcification is difficult. Frequently, stenosis severity is overestimated by 64-detector-row computed tomography.
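
    The reported diagnostic measures follow directly from the segment-level counts given above (34 true positives, 28 false positives, 0 false negatives out of 915 segments). A short check in Python, with the remaining segments taken as true negatives:

        # Per-segment diagnostic accuracy of CTA versus ICA (counts from the record)
        tp, fp, fn, total = 34, 28, 0, 915
        tn = total - tp - fp - fn

        sensitivity = tp / (tp + fn)          # 1.000
        specificity = tn / (tn + fp)          # 0.968
        ppv = tp / (tp + fp)                  # 0.548
        npv = tn / (tn + fn)                  # 1.000
        accuracy = (tp + tn) / total          # 0.969
        print(sensitivity, specificity, ppv, npv, accuracy)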

  14. Noninvasive Coronary Angiography using 64-Detector-Row Computed Tomography in Patients with a Low to Moderate Pretest Probability of Significant Coronary Artery Disease

    International Nuclear Information System (INIS)

    Schlosser, T.; Mohrs, O.K.; Magedanz, A.; Nowak, B.; Voigtlaender, T.; Barkhausen, J.; Schmermund, A.

    2007-01-01

    Purpose: To evaluate the value of 64-detector-row computed tomography for ruling out high-grade coronary stenoses in patients with a low to moderate pretest probability of significant coronary artery disease. Material and Methods: The study included 61 patients with a suspicion of coronary artery disease on the basis of atypical angina or ambiguous findings in noninvasive stress testing and a class II indication for invasive coronary angiography (ICA). All patients were examined by 64-detector-row computed tomography angiography (CTA) and ICA. On a coronary segmental level, the presence of significant (>50% diameter) stenoses was examined. Results: In a total of 915 segments, CTA detected 62 significant stenoses. Thirty-four significant stenoses were confirmed by ICA, whereas 28 stenoses could not be confirmed by ICA. Twenty-two of them showed wall irregularities on ICA, and six were angiographically normal. Accordingly, on a coronary segmental basis, 28 false-positive and 0 false-negative findings resulted in a sensitivity of 100%, a specificity of 96.8%, a positive predictive value of 54.8%, and a negative predictive value of 100%. The diagnostic accuracy was 96.9%. Conclusion: Sixty-four-detector-row computed tomography reliably detects significant coronary stenoses in patients with suspected coronary artery disease and appears to be helpful in the selection of patients who need to undergo ICA. Calcified and non-calcified plaques are detected. Grading of stenoses in areas with calcification is difficult. Frequently, stenosis severity is overestimated by 64-detector-row computed tomography

  15. The Effort to Reduce a Muscle Fatigue Through Gymnastics Relaxation and Ergonomic Approach for Computer Users in Central Building State University of Medan

    Science.gov (United States)

    Gultom, Syamsul; Darma Sitepu, Indra; Hasibuan, Nurman

    2018-03-01

    Fatigue due to long and continuous computer use can lead to problems associated with decreased performance and work motivation. Specific targets in the first phase of this research have been achieved, namely: (1) identifying complaints among workers using computers, using the Bourdon Wiersma test kit; and (2) finding a suitable relaxation and work-posture design as a solution to reduce muscle fatigue in computer-based workers. The study follows a research-and-development method, which aims to produce new products or refine existing ones. The final product is a prototype back-holder, a monitoring filter, and a relaxation exercise routine, together with a manual on how to perform it in front of the computer, intended to lower the fatigue level of computer users in Unimed's Administration Center. In the first phase, observations and interviews were conducted and the fatigue level of employees using computers at Unimed's Administration Center was identified with the Bourdon Wiersma test, with the following results: (1) the average speed score of respondents in BAUK, BAAK and BAPSI after working was 8.4 (WS 13), which falls in a good-enough category; (2) the average accuracy score of respondents in BAUK, BAAK and BAPSI after working was 5.5 (WS 8), which falls in a doubtful category, showing that computer users at the Unimed Administration Center experienced significant tiredness; and (3) the average consistency score of computer users at Unimed's Administration Center after working was 5.5 (WS 8), also in a doubtful category, which means that computer users at the Unimed Administration Center suffered extreme fatigue. In phase II, based on the results of the first phase of this research, the researcher offers

  16. MUSIDH, multiple use of simulated demographic histories, a novel method to reduce computation time in microsimulation models of infectious diseases.

    Science.gov (United States)

    Fischer, E A J; De Vlas, S J; Richardus, J H; Habbema, J D F

    2008-09-01

    Microsimulation of infectious diseases requires the simulation of many life histories of interacting individuals. In particular, relatively rare infections such as leprosy need to be studied in very large populations. Computation time increases disproportionately with the size of the simulated population. We present a novel method, MUSIDH, an acronym for multiple use of simulated demographic histories, to reduce computation time. Demographic history refers to the processes of birth, death and all other demographic events that should be unrelated to the natural course of an infection (hence the restriction to non-fatal infections). MUSIDH attaches a fixed number of infection histories to each demographic history, and these infection histories interact as if they were the infection histories of separate individuals. With two examples, mumps and leprosy, we show that the method can give a factor-50 reduction in computation time at the cost of a small loss in precision. The largest reductions are obtained for rare infections with complex demographic histories.
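
    A toy sketch of the idea follows: a single simulated demographic history (here reduced to a lifespan) is reused for several independent infection histories, so the expensive demographic part of the simulation runs only once per group of simulated individuals. The models and parameters are illustrative stand-ins, not those of the mumps or leprosy applications.

        # MUSIDH-style reuse of demographic histories (toy illustration)
        import numpy as np

        rng = np.random.default_rng(1)
        histories_per_lifespan = 50        # the "multiple use" factor
        n_lifespans = 1_000

        def simulate_lifespan():
            return rng.exponential(scale=70.0)       # stand-in demographic model

        def simulate_infection(lifespan, hazard=0.02):
            t = rng.exponential(scale=1.0 / hazard)  # age at (non-fatal) infection
            return t if t < lifespan else None       # None = never infected

        infected = 0
        for _ in range(n_lifespans):
            life = simulate_lifespan()               # costly part, run once
            infected += sum(simulate_infection(life) is not None
                            for _ in range(histories_per_lifespan))

        print("fraction ever infected:",
              infected / (n_lifespans * histories_per_lifespan))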

  17. Reducing usage of the computational resources by event driven approach to model predictive control

    Science.gov (United States)

    Misik, Stefan; Bradac, Zdenek; Cela, Arben

    2017-08-01

    This paper deals with real-time optimal control of dynamic systems while also considering the constraints to which these systems might be subject. The main objective of this work is to propose a simple modification of the existing Model Predictive Control approach to better suit the needs of computationally resource-constrained real-time systems. An example using a model of a mechanical system is presented, and the performance of the proposed method is evaluated in a simulated environment.

  18. A computer-assisted motivational social network intervention to reduce alcohol, drug and HIV risk behaviors among Housing First residents.

    Science.gov (United States)

    Kennedy, David P; Hunter, Sarah B; Chan Osilla, Karen; Maksabedian, Ervant; Golinelli, Daniela; Tucker, Joan S

    2016-03-15

    Individuals transitioning from homelessness to housing face challenges to reducing alcohol, drug and HIV risk behaviors. To aid in this transition, this study developed and will test a computer-assisted intervention that delivers personalized social network feedback by an intervention facilitator trained in motivational interviewing (MI). The intervention goal is to enhance motivation to reduce high-risk alcohol and other drug (AOD) use and reduce HIV risk behaviors. In this Stage 1b pilot trial, 60 individuals who are transitioning from homelessness to housing will be randomly assigned to the intervention or control condition. The intervention condition consists of four biweekly social network sessions conducted using MI. AOD use and HIV risk behaviors will be monitored prior to and immediately following the intervention and compared to control participants' behaviors to explore whether the intervention was associated with any systematic changes in AOD use or HIV risk behaviors. Social network health interventions are an innovative approach for reducing future AOD use and HIV risk problems, but little is known about their feasibility, acceptability, and efficacy. The current study develops and pilot-tests a computer-assisted intervention that incorporates social network visualizations and MI techniques to reduce high-risk AOD use and HIV behaviors among the formerly homeless. ClinicalTrials.gov: NCT02140359.

  19. The effectiveness of the anti-CD11d treatment is reduced in rat models of spinal cord injury that produce significant levels of intraspinal hemorrhage.

    Science.gov (United States)

    Geremia, N M; Hryciw, T; Bao, F; Streijger, F; Okon, E; Lee, J H T; Weaver, L C; Dekaban, G A; Kwon, B K; Brown, A

    2017-09-01

    We have previously reported that administration of a CD11d monoclonal antibody (mAb) improves recovery in a clip-compression model of SCI. In this model the CD11d mAb reduces the infiltration of activated leukocytes into the injured spinal cord (as indicated by reduced intraspinal MPO). However not all anti-inflammatory strategies have reported beneficial results, suggesting that success of the CD11d mAb treatment may depend on the type or severity of the injury. We therefore tested the CD11d mAb treatment in a rat hemi-contusion model of cervical SCI. In contrast to its effects in the clip-compression model, the CD11d mAb treatment did not improve forelimb function nor did it significantly reduce MPO levels in the hemi-contused cord. To determine if the disparate results using the CD11d mAb were due to the biomechanical nature of the cord injury (compression SCI versus contusion SCI) or to the spinal level of the injury (12th thoracic level versus cervical) we further evaluated the CD11d mAb treatment after a T12 contusion SCI. In contrast to the T12 clip compression SCI, the CD11d mAb treatment did not improve locomotor recovery or significantly reduce MPO levels after T12 contusion SCI. Lesion analyses revealed increased levels of hemorrhage after contusion SCI compared to clip-compression SCI. SCI that is accompanied by increased intraspinal hemorrhage would be predicted to be refractory to the CD11d mAb therapy as this approach targets leukocyte diapedesis through the intact vasculature. These results suggest that the disparate results of the anti-CD11d treatment in contusion and clip-compression models of SCI are due to the different pathophysiological mechanisms that dominate these two types of spinal cord injuries. Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.

  20. Intra-operative cone beam computed tomography can help avoid reinterventions and reduce CT follow up after infrarenal EVAR.

    Science.gov (United States)

    Törnqvist, P; Dias, N; Sonesson, B; Kristmundsson, T; Resch, T

    2015-04-01

    Re-interventions after endovascular abdominal aortic aneurysm repair (EVAR) are common and therefore a strict imaging follow up protocol is required. The purpose of this study was to evaluate whether cone beam computed tomography (CBCT) can detect intra-operative complications and to compare this with angiography and the 1 month CT follow up (computed tomography angiography [CTA]). Fifty-one patients (44 men) were enrolled in a prospective trial. Patients underwent completion angiography and CBCT during infrarenal EVAR. Contrast was used except when pre-operative renal insufficiency was present or if the maximum contrast dose threshold was reached. CBCT reconstruction included the top of the stent graft to the iliac bifurcation. Endoleaks, kinks, or compressions were recorded. CBCT was technically successful in all patients. Twelve endoleaks were detected on completion digital subtraction angiography (CA). CBCT detected 4/5 type 1 endoleaks, but only one type 2 endoleak. CTA identified eight type 2 endoleaks and one residual type I endoleak. Two cases of stent compression were seen on CA. CBCT revealed five stent compressions and one kink, which resulted in four intra-operative adjunctive manoeuvres. CTA identified all cases of kinks or compressions that were left untreated. Two of them were corrected later. No additional kinks/compressions were found on CTA. Groin closure consisted of 78 fascia sutures, nine cut downs, and 11 percutaneous sutures. Seven femoral artery pseudoaneurysms (<1 cm) were detected on CTA, but no intervention was needed. CA is better than CBCT in detecting and categorizing endoleaks but CBCT (with or without contrast) is better than CA for detection of kinks or stentgraft compression. CTA plus CBCT identified all significant complications noted on the 1 month follow up CTA. The use of intra-operative CA and CBCT could replace early CTA after standard EVAR thus reducing overall radiation and contrast use. Technical development might further

  1. Region-oriented CT image representation for reducing computing time of Monte Carlo simulations

    International Nuclear Information System (INIS)

    Sarrut, David; Guigues, Laurent

    2008-01-01

    Purpose. We propose a new method for efficient particle transportation in voxelized geometry for Monte Carlo simulations. We describe its use for calculating dose distribution in CT images for radiation therapy. Material and methods. The proposed approach, based on an implicit volume representation named segmented volume, coupled with an adapted segmentation procedure and a distance map, allows us to minimize the number of boundary crossings, which slow down the simulation. The method was implemented with the GEANT4 toolkit and compared to four other methods: one box per voxel, parameterized volumes, octree-based volumes, and nested parameterized volumes. For each representation, we compared dose distribution, time, and memory consumption. Results. The proposed method allows us to decrease computational time by up to a factor of 15, while keeping memory consumption low, and without any modification of the transportation engine. The speed-up is related to the geometry complexity and the number of different materials used. We obtained an optimal number of steps by removing all unnecessary steps between adjacent voxels sharing a similar material. However, the cost of each step is increased. When the number of steps cannot be decreased enough, due, for example, to a large number of material boundaries, such a method is not considered suitable. Conclusion. This feasibility study shows that optimizing the representation of an image in memory potentially increases computing efficiency. We used the GEANT4 toolkit, but we could potentially use other Monte Carlo simulation codes. The method introduces a tradeoff between speed and geometry accuracy, allowing computational time gain. However, simulations with GEANT4 remain slow and further work is needed to speed up the procedure while preserving the desired accuracy.
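
    The core idea (collapsing runs of voxels that share a material so that a transport step crosses far fewer boundaries) can be illustrated on a single image row. The run-length view below is a hedged, stand-alone sketch in Python, not the GEANT4 implementation described in the record.

        # Merge adjacent voxels with the same material label into homogeneous runs
        import numpy as np

        def run_length_segments(material_row):
            """Return (start_index, length, material) for each homogeneous run."""
            change = np.flatnonzero(np.diff(material_row)) + 1
            starts = np.concatenate(([0], change))
            lengths = np.diff(np.concatenate((starts, [material_row.size])))
            return [(int(s), int(n), int(m))
                    for s, n, m in zip(starts, lengths, material_row[starts])]

        row = np.array([0, 0, 0, 1, 1, 2, 2, 2, 2, 0])   # material label per voxel
        print(run_length_segments(row))
        # [(0, 3, 0), (3, 2, 1), (5, 4, 2), (9, 1, 0)] -> 4 segments instead of 10 voxels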

  2. Hypoxis hemerocallidea Significantly Reduced Hyperglycaemia and Hyperglycaemic-Induced Oxidative Stress in the Liver and Kidney Tissues of Streptozotocin-Induced Diabetic Male Wistar Rats

    Directory of Open Access Journals (Sweden)

    Oluwafemi O. Oguntibeju

    2016-01-01

    Background. Hypoxis hemerocallidea is a native plant that grows in the Southern African regions and is well known for its beneficial medicinal effects in the treatment of diabetes, cancer, and high blood pressure. Aim. This study evaluated the effects of Hypoxis hemerocallidea on oxidative stress biomarkers, hepatic injury, and other selected biomarkers in the liver and kidneys of healthy nondiabetic and streptozotocin (STZ)-induced diabetic male Wistar rats. Materials and Methods. Rats were injected intraperitoneally with 50 mg/kg of STZ to induce diabetes. The plant extract (Hypoxis hemerocallidea, 200 mg/kg or 800 mg/kg aqueous solution) was administered orally, daily, for 6 weeks. Antioxidant activities were analysed using a Multiskan Spectrum plate reader while other serum biomarkers were measured using the RANDOX chemistry analyser. Results. Both dosages (200 mg/kg and 800 mg/kg) of Hypoxis hemerocallidea significantly reduced the blood glucose levels in the STZ-induced diabetic groups. Activities of liver enzymes were increased in the diabetic control group and in the diabetic group treated with 800 mg/kg, whereas the 200 mg/kg dosage ameliorated hepatic injury. In the hepatic tissue, the oxygen radical absorbance capacity (ORAC), ferric reducing antioxidant power (FRAP), catalase, and total glutathione were reduced in the diabetic control group. However, treatment with both doses improved the antioxidant status. The FRAP and catalase activities in the kidney were elevated in the STZ-induced diabetic group treated with 800 mg/kg of the extract, possibly due to compensatory responses. Conclusion. Hypoxis hemerocallidea demonstrated antihyperglycemic and antioxidant effects, especially in the liver tissue.

  3. Prenatal prochloraz treatment significantly increases pregnancy length and reduces offspring weight but does not affect social-olfactory memory in rats.

    Science.gov (United States)

    Dmytriyeva, Oksana; Klementiev, Boris; Berezin, Vladimir; Bock, Elisabeth

    2013-07-01

    Metabolites of the commonly used imidazole fungicide prochloraz are androgen receptor antagonists. They have been shown to block androgen-driven development and compromise reproductive function. We tested the effect of prochloraz on cognitive behavior following exposure to this fungicide during the perinatal period. Pregnant Wistar rats were administered a 200 mg/kg dose of prochloraz on gestational day (GD) 7, GD11, and GD15. The social recognition test (SRT) was performed on 7-week-old male rat offspring. We found an increase in pregnancy length and a significantly reduced pup weight on PND15 and PND40 but no effect of prenatal prochloraz exposure on social investigation or acquisition of social-olfactory memory. Copyright © 2012 Elsevier GmbH. All rights reserved.

  4. Reducing the Digital Divide among Children Who Received Desktop or Hybrid Computers for the Home

    Science.gov (United States)

    Zilka, Gila Cohen

    2016-01-01

    Researchers and policy makers have been exploring ways to reduce the digital divide. Parameters commonly used to examine the digital divide worldwide, as well as in this study, are: (a) the digital divide in the accessibility and mobility of the ICT infrastructure and of the content infrastructure (e.g., sites used in school); and (b) the digital…

  5. A Computer-Based Intervention to Reduce Internalized Heterosexism in Men

    Science.gov (United States)

    Lin, Yen-Jui; Israel, Tania

    2012-01-01

    Internalized heterosexism (IH) is a strong predictor of the psychological well-being of lesbian, gay, bisexual (LGB), or other same-sex attracted individuals. To respond to the call for interventions to address IH, the current study developed and tested an online intervention to reduce IH among gay, bisexual, and other same-sex attracted men. A…

  6. Impaired bone formation in ovariectomized mice reduces implant integration as indicated by longitudinal in vivo micro-computed tomography.

    Science.gov (United States)

    Li, Zihui; Kuhn, Gisela; Schirmer, Michael; Müller, Ralph; Ruffoni, Davide

    2017-01-01

    Although osteoporotic bone, with low bone mass and deteriorated bone architecture, provides a less favorable mechanical environment than healthy bone for implant fixation, there is no general agreement on the impact of osteoporosis on peri-implant bone (re)modeling, which is ultimately responsible for the long term stability of the bone-implant system. Here, we inserted an implant in a mouse model mimicking estrogen deficiency-induced bone loss and we monitored with longitudinal in vivo micro-computed tomography the spatio-temporal changes in bone (re)modeling and architecture, considering the separate contributions of trabecular, endocortical and periosteal surfaces. Specifically, 12 week-old C57BL/6J mice underwent OVX/SHM surgery; 9 weeks after we inserted special metal-ceramics implants into the 6th caudal vertebra and we measured bone response with in vivo micro-CT weekly for the following 6 weeks. Our results indicated that ovariectomized mice showed a reduced ability to increase the thickness of the cortical shell close to the implant because of impaired peri-implant bone formation, especially at the periosteal surface. Moreover, we observed that healthy mice had a significantly higher loss of trabecular bone far from the implant than estrogen depleted animals. Such behavior suggests that, in healthy mice, the substantial increase in peri-implant bone formation which rapidly thickened the cortex to secure the implant may raise bone resorption elsewhere and, specifically, in the trabecular network of the same bone but far from the implant. Considering the already deteriorated bone structure of estrogen depleted mice, further bone loss seemed to be hindered. The obtained knowledge on the dynamic response of diseased bone following implant insertion should provide useful guidelines to develop advanced treatments for osteoporotic fracture fixation based on local and selective manipulation of bone turnover in the peri-implant region.

  7. Impaired bone formation in ovariectomized mice reduces implant integration as indicated by longitudinal in vivo micro-computed tomography.

    Directory of Open Access Journals (Sweden)

    Zihui Li

    Although osteoporotic bone, with low bone mass and deteriorated bone architecture, provides a less favorable mechanical environment than healthy bone for implant fixation, there is no general agreement on the impact of osteoporosis on peri-implant bone (re)modeling, which is ultimately responsible for the long-term stability of the bone-implant system. Here, we inserted an implant in a mouse model mimicking estrogen deficiency-induced bone loss and we monitored with longitudinal in vivo micro-computed tomography the spatio-temporal changes in bone (re)modeling and architecture, considering the separate contributions of trabecular, endocortical and periosteal surfaces. Specifically, 12 week-old C57BL/6J mice underwent OVX/SHM surgery; 9 weeks after we inserted special metal-ceramics implants into the 6th caudal vertebra and we measured bone response with in vivo micro-CT weekly for the following 6 weeks. Our results indicated that ovariectomized mice showed a reduced ability to increase the thickness of the cortical shell close to the implant because of impaired peri-implant bone formation, especially at the periosteal surface. Moreover, we observed that healthy mice had a significantly higher loss of trabecular bone far from the implant than estrogen depleted animals. Such behavior suggests that, in healthy mice, the substantial increase in peri-implant bone formation which rapidly thickened the cortex to secure the implant may raise bone resorption elsewhere and, specifically, in the trabecular network of the same bone but far from the implant. Considering the already deteriorated bone structure of estrogen depleted mice, further bone loss seemed to be hindered. The obtained knowledge on the dynamic response of diseased bone following implant insertion should provide useful guidelines to develop advanced treatments for osteoporotic fracture fixation based on local and selective manipulation of bone turnover in the peri-implant region.

  8. Reducing radiation dose to the female breast during conventional and dedicated breast computed tomography

    Science.gov (United States)

    Rupcich, Franco John

    The purpose of this study was to quantify the effectiveness of techniques intended to reduce dose to the breast during CT coronary angiography (CTCA) scans with respect to task-based image quality, and to evaluate the effectiveness of optimal energy weighting in improving contrast-to-noise ratio (CNR), and thus the potential for reducing breast dose, during energy-resolved dedicated breast CT. A database quantifying organ dose for several radiosensitive organs irradiated during CTCA, including the breast, was generated using Monte Carlo simulations. This database facilitates estimation of organ-specific dose deposited during CTCA protocols using arbitrary x-ray spectra or tube-current modulation schemes without the need to run Monte Carlo simulations. The database was used to estimate breast dose for simulated CT images acquired for a reference protocol and five protocols intended to reduce breast dose. For each protocol, the performance of two tasks (detection of signals with unknown locations) was compared over a range of breast dose levels using a task-based, signal-detectability metric: the estimator of the area under the exponential free-response relative operating characteristic curve, AFE. For large-diameter/medium-contrast signals, when maintaining equivalent AFE, the 80 kV partial, 80 kV, 120 kV partial, and 120 kV tube-current modulated protocols reduced breast dose by 85%, 81%, 18%, and 6%, respectively, while the shielded protocol increased breast dose by 68%. Results for the small-diameter/high-contrast signal followed similar trends, but with smaller magnitude of the percent changes in dose. The 80 kV protocols demonstrated the greatest reduction to breast dose, however, the subsequent increase in noise may be clinically unacceptable. Tube output for these protocols can be adjusted to achieve more desirable noise levels with lesser dose reduction. The improvement in CNR of optimally projection-based and image-based weighted images relative to photon

  9. Low tube voltage dual source computed tomography to reduce contrast media doses in adult abdomen examinations: A phantom study

    Energy Technology Data Exchange (ETDEWEB)

    Thor, Daniel [Department of Diagnostic Medical Physics, Karolinska University Hospital, Stockholm 14186 (Sweden); Brismar, Torkel B., E-mail: torkel.brismar@gmail.com; Fischer, Michael A. [Department of Clinical Science, Intervention and Technology at Karolinska Institutet and Department of Radiology, Karolinska University Hospital in Huddinge, Stockholm 14186 (Sweden)

    2015-09-15

    Purpose: To evaluate the potential of low tube voltage dual source (DS) single energy (SE) and dual energy (DE) computed tomography (CT) to reduce contrast media (CM) dose in adult abdominal examinations of various sizes while maintaining soft tissue and iodine contrast-to-noise ratio (CNR). Methods: Four abdominal phantoms simulating a body mass index of 16 to 35 kg/m{sup 2} with four inserted syringes of 0, 2, 4, and 8 mgI/ml CM were scanned using a 64-slice DS-CT scanner. Six imaging protocols were used; one single source (SS) reference protocol (120 kV, 180 reference mAs), four low kV SE protocols (70 and 80 kV using both SS and DS), and one DE protocol at 80/140 kV. Potential CM reduction with unchanged CNRs relative to the 120 kV protocol was calculated along with the corresponding increase in radiation dose. Results: The potential contrast media reductions were determined to be approximately 53% for DS 70 kV, 51% for SS 70 kV, 44% for DS 80 kV, 40% for SS 80 kV, and 20% for DE (all differences were significant, P < 0.05). Constant CNR could be achieved by using DS 70 kV for small to medium phantom sizes (16–26 kg/m{sup 2}) and for all sizes (16–35 kg/m{sup 2}) when using DS 80 kV and DE. Corresponding radiation doses increased by 60%–107%, 23%–83%, and 6%–12%, respectively. Conclusions: DS single energy CT can be used to reduce CM dose by 44%–53% with maintained CNR in adult abdominal examinations at the cost of an increased radiation dose. DS dual-energy CT allows reduction of CM dose by 20% at similar radiation dose as compared to a standard 120 kV single source.

  10. Unenhanced computed tomography in acute renal colic reduces cost outside radiology department

    DEFF Research Database (Denmark)

    Lauritsen, J.; Andersen, J.R.; Nordling, J.

    2008-01-01

    BACKGROUND: Unenhanced multidetector computed tomography (UMDCT) is well established as the procedure of choice for radiologic evaluation of patients with renal colic. The procedure has both clinical and financial consequences for departments of surgery and radiology. However, the financial effect...... outside the radiology department is poorly elucidated. PURPOSE: To evaluate the financial consequences outside of the radiology department, a retrospective study comparing the ward occupation of patients examined with UMDCT to that of intravenous urography (IVU) was performed. MATERIAL AND METHODS...... saved the hospital USD 265,000 every 6 months compared to the use of IVU. CONCLUSION: Use of UMDCT compared to IVU in patients with renal colic leads to cost savings outside the radiology department.

  11. Glycophospholipid Formulation with NADH and CoQ10 Significantly Reduces Intractable Fatigue in Western Blot-Positive ‘Chronic Lyme Disease’ Patients: Preliminary Report

    Directory of Open Access Journals (Sweden)

    Garth L. Nicolson

    2012-03-01

    Full Text Available Background: An open label 8-week preliminary study was conducted in a small number of patients to determine if a combination oral supplement containing a mixture of phosphoglycolipids, coenzyme Q10 and microencapsulated NADH and other nutrients could affect fatigue levels in long-term, Western blot-positive, multi-symptom ‘chronic Lyme disease’ patients (also called ‘post-treatment Lyme disease’ or ‘post Lyme syndrome’) with intractable fatigue. Methods: The subjects in this study were 6 males (mean age = 45.1 ± 12.4 years) and 10 females (mean age = 54.6 ± 7.4 years) with ‘chronic Lyme disease’ (determined by multiple symptoms and positive Western blot analysis) who had been symptomatic with chronic fatigue for an average of 12.7 ± 6.6 years. They had been seen by multiple physicians (13.3 ± 7.6) and had used many other remedies, supplements and drugs (14.4 ± 7.4) without fatigue relief. Fatigue was monitored at 0, 7, 30 and 60 days using a validated instrument, the Piper Fatigue Scale. Results: Patients in this preliminary study responded to the combination test supplement, showing a 26% reduction in overall fatigue by the end of the 8-week trial (p < 0.0003). Analysis of subcategories of fatigue indicated that there were significant improvements in the ability to complete tasks and activities as well as significant improvements in mood and cognitive abilities. Regression analysis of the data indicated that reductions in fatigue were consistent and occurred with a high degree of confidence (R² = 0.998). Functional Foods in Health and Disease 2012, 2(3):35-47. Conclusions: The combination supplement was a safe and effective method to significantly reduce intractable fatigue in long-term patients with Western blot-positive ‘chronic Lyme disease.’

  12. Control Synthesis of Discrete-Time T-S Fuzzy Systems: Reducing the Conservatism Whilst Alleviating the Computational Burden.

    Science.gov (United States)

    Xie, Xiangpeng; Yue, Dong; Zhang, Huaguang; Peng, Chen

    2017-09-01

    The augmented multi-indexed matrix approach acts as a powerful tool for reducing the conservatism of control synthesis for discrete-time Takagi-Sugeno fuzzy systems. However, its computational burden is sometimes too heavy as a tradeoff. Reducing the conservatism while alleviating the computational burden is an ideal but very challenging problem. This paper works toward an efficient way of achieving a satisfactory answer. Different from the augmented multi-indexed matrix approach in the literature, we aim to design a more efficient slack variable approach under a general framework of homogeneous matrix polynomials. Thanks to the introduction of a new extended representation for homogeneous matrix polynomials, related matrices with the same coefficient are collected together into one set, and thus the redundant terms of the augmented multi-indexed matrix approach can be removed, i.e., the computational burden can be alleviated. More importantly, because more useful information is involved in the control design, the conservatism of the proposed approach is also less than that of the augmented multi-indexed matrix approach. Finally, numerical experiments are given to show the effectiveness of the proposed approach.

  13. Long-term use of amiodarone before heart transplantation significantly reduces early post-transplant atrial fibrillation and is not associated with increased mortality after heart transplantation

    Directory of Open Access Journals (Sweden)

    Rivinius R

    2016-02-01

    group (P=0.0123). There was no statistically significant difference between patients with and without long-term use of amiodarone prior to HTX in 1-year (P=0.8596), 2-year (P=0.8620), 5-year (P=0.2737), or overall follow-up mortality after HTX (P=0.1049). Moreover, Kaplan–Meier survival analysis showed no statistically significant difference in overall survival (P=0.1786). Conclusion: Long-term use of amiodarone in patients before HTX significantly reduces early post-transplant AF and is not associated with increased mortality after HTX. Keywords: amiodarone, atrial fibrillation, heart failure, heart transplantation, mortality

  14. Some computer realizations of the REDUCE-3 calculations for exclusive processes

    International Nuclear Information System (INIS)

    Darbaidze, Ya.Z.; Merebashvili, Z.V.; Rostovtsev, V.A.

    1990-01-01

    The REDUCE-3 algorithm for the calculation of the squared gauge invariant set of tree diagrams is given in the α³ order of perturbation theory. The necessity of using such program packages as factorizator, 'COLOR'-factor and so on is shown. The correctness of the calculation of the infrared radiation corrections as compared with manual calculations is discussed. An example of applying the programs is given for matrix and noncommutative algebras, where the well-known supersymmetric commutation relation is proved. (author)

  15. A proper choice of route significantly reduces air pollution exposure--a study on bicycle and bus trips in urban streets.

    Science.gov (United States)

    Hertel, Ole; Hvidberg, Martin; Ketzel, Matthias; Storm, Lars; Stausgaard, Lizzi

    2008-01-15

    A proper selection of route through the urban area may significantly reduce air pollution exposure. This is the main conclusion of the presented study. Air pollution exposure is determined for two selected cohorts along the route from home to the workplace and back. Exposure is determined with a street pollution model for three scenarios: bicycling along the shortest possible route, bicycling along a low-exposure route along less trafficked streets, and finally taking the shortest trip using public transport. Furthermore, calculations are performed for trips taking place both inside and outside the traffic rush hours. The results show that the accumulated air pollution exposure for the low-exposure route is between 10% and 30% lower for the primary pollutants (NOx and CO). However, the difference is insignificant and in some cases even negative for the secondary pollutants (NO2 and PM10/PM2.5). Considering only the contribution from traffic in the travelled streets, the accumulated air pollution exposure is between 54% and 67% lower for the low-exposure route. The bus generally follows highly trafficked streets, and the accumulated exposure along the bus route is therefore between 79% and 115% higher than for the high-exposure bicycle route (the short bicycle route). Travelling outside the rush hour periods reduces the accumulated exposure by between 10% and 30% for the primary pollutants, and between 5% and 20% for the secondary pollutants. The study indicates that a web-based route planner for selecting a low-exposure route through the city might be a good service for the public. In addition, the public may be advised to travel outside rush hour periods.
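
    The accumulated exposure compared here is, in essence, a time integral of modelled street-level concentration along each route. A minimal sketch of that bookkeeping follows; the segment lengths, speeds and concentrations are purely illustrative assumptions, and the study itself used a dedicated street pollution model that is not reproduced here.

    ```python
    # Hypothetical route segments: (length in km, travel speed in km/h,
    # modelled street-level NOx concentration in ug/m3).
    short_route = [(0.8, 15, 95.0), (1.2, 15, 120.0), (0.5, 15, 80.0)]
    low_exposure_route = [(1.1, 15, 55.0), (1.6, 15, 60.0), (0.7, 15, 45.0)]

    def accumulated_exposure(segments):
        """Sum of concentration x transit time over all route segments (ug/m3 * h)."""
        return sum(conc * (length / speed) for length, speed, conc in segments)

    for name, route in [("short", short_route), ("low-exposure", low_exposure_route)]:
        print(f"{name:>12s} route: {accumulated_exposure(route):6.1f} ug/m3 * h")
    ```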

  16. Tobacco Town: Computational Modeling of Policy Options to Reduce Tobacco Retailer Density.

    Science.gov (United States)

    Luke, Douglas A; Hammond, Ross A; Combs, Todd; Sorg, Amy; Kasman, Matt; Mack-Crane, Austen; Ribisl, Kurt M; Henriksen, Lisa

    2017-05-01

    To identify the behavioral mechanisms and effects of tobacco control policies designed to reduce tobacco retailer density. We developed the Tobacco Town agent-based simulation model to examine 4 types of retailer reduction policies: (1) random retailer reduction, (2) restriction by type of retailer, (3) limiting proximity of retailers to schools, and (4) limiting proximity of retailers to each other. The model examined the effects of these policies alone and in combination across 4 different types of towns, defined by 2 levels of population density (urban vs suburban) and 2 levels of income (higher vs lower). Model results indicated that reduction of retailer density has the potential to decrease accessibility of tobacco products by driving up search and purchase costs. Policy effects varied by town type: proximity policies worked better in dense, urban towns whereas retailer type and random retailer reduction worked better in less-dense, suburban settings. Comprehensive retailer density reduction policies have excellent potential to reduce the public health burden of tobacco use in communities.

  17. Reduced service of the “IT Service Desk” (Computing Helpdesk) on the afternoon of Friday 8 October 2010

    CERN Multimedia

    IT Department

    2010-01-01

    Please note that due to relocation, the “IT Service Desk” will be operating a reduced service on Friday 8 October from 12:30. In particular, the telephone line 78888 will not be available and users are invited to submit their requests by e-mail (Computing.Helpdesk@cern.ch). E-mail requests will be treated as normal, but some delays are possible. In the event of urgent problems you may call the IT Manager on Duty on 163013. We also take this opportunity to remind you about the “IT Service Status Board”, where all computing incidents and scheduled interventions are updated online. Please see: http://cern.ch/it-servicestatus. Normal service will resume at 8:30 a.m. on Monday 11 October. Thank you in advance for your understanding. The CERN “User Support” Team (IT-UDS-HUS)

  18. A Rosa canina - Urtica dioica - Harpagophytum procumbens/zeyheri Combination Significantly Reduces Gonarthritis Symptoms in a Randomized, Placebo-Controlled Double-Blind Study.

    Science.gov (United States)

    Moré, Margret; Gruenwald, Joerg; Pohl, Ute; Uebelhack, Ralf

    2017-12-01

    The special formulation MA212 (Rosaxan) is composed of rosehip (Rosa canina L.) puree/juice concentrate, nettle (Urtica dioica L.) leaf extract, and devil's claw (Harpagophytum procumbens DC. ex Meisn. or Harpagophytum zeyheri Decne.) root extract and also supplies vitamin D. It is a food for special medical purposes ([EU] No 609/2013) for the dietary management of pain in patients with gonarthritis. This 12-week randomized, placebo-controlled, double-blind, parallel-design study aimed to investigate the efficacy and safety of MA212 versus placebo in patients with gonarthritis. A 3D-HPLC fingerprint (3-dimensional high pressure liquid chromatography fingerprint) of MA212 demonstrated the presence of its herbal ingredients. Ninety-two randomized patients consumed 40 mL of MA212 (n = 46) or placebo (n = 44) daily. The Western Ontario and McMaster Universities Arthritis Index (WOMAC), quality-of-life scores at 0, 6, and 12 weeks, and analgesic consumption were documented. Statistically, the initial WOMAC subscores/scores did not differ between groups. During the study, their means significantly improved in both groups. The mean pre-post change of the WOMAC pain score (primary endpoint) was 29.87 in the MA212 group and 10.23 in the placebo group. The group difference demonstrated a significant superiority in favor of MA212 (p_U < 0.001; p_t < 0.001). Group comparisons of all WOMAC subscores/scores at 6 and 12 weeks reached the same levels of significance. Compared to placebo, both physical and mental quality of life significantly improved with MA212. There was a trend towards reduced analgesic consumption with MA212 compared to placebo. In the final efficacy evaluation, physicians (p_Chi < 0.001) and patients (p_Chi < 0.001) rated MA212 superior to placebo. MA212 was well tolerated. This study demonstrates excellent efficacy for MA212 in gonarthritis patients. Georg Thieme Verlag KG Stuttgart · New York.

  19. Non-conforming finite-element formulation for cardiac electrophysiology: an effective approach to reduce the computation time of heart simulations without compromising accuracy

    Science.gov (United States)

    Hurtado, Daniel E.; Rojas, Guillermo

    2018-04-01

    Computer simulations constitute a powerful tool for studying the electrical activity of the human heart, but the computational effort remains prohibitively high. In order to recover accurate conduction velocities and wavefront shapes, the mesh size in linear element (Q1) formulations cannot exceed 0.1 mm. Here we propose a novel non-conforming finite-element formulation for the non-linear cardiac electrophysiology problem that results in accurate wavefront shapes and lower mesh dependence of the conduction velocity, while retaining the same number of global degrees of freedom as Q1 formulations. As a result, coarser discretizations of cardiac domains can be employed in simulations without significant loss of accuracy, thus reducing the overall computational effort. We demonstrate the applicability of our formulation in biventricular simulations using a coarse mesh size of ~1 mm, and show that the activation wave pattern closely follows that obtained in fine-mesh simulations at a fraction of the computation time, thus improving the accuracy-efficiency trade-off of cardiac simulations.

  20. Fast Discrete Fourier Transform Computations Using the Reduced Adder Graph Technique

    Directory of Open Access Journals (Sweden)

    Andrew G. Dempster

    2007-01-01

    Full Text Available It has recently been shown that the n-dimensional reduced adder graph (RAG-n) technique is beneficial for many DSP applications such as for FIR and IIR filters, where multipliers can be grouped in multiplier blocks. This paper highlights the importance of DFT and FFT as DSP objects and also explores how the RAG-n technique can be applied to these algorithms. This RAG-n DFT will be shown to be of low complexity and possess an attractively regular VLSI data flow when implemented with the Rader DFT algorithm or the Bluestein chirp-z algorithm. ASIC synthesis data are provided and demonstrate the low complexity and high speed of the design when compared to other alternatives.

  1. Fast Discrete Fourier Transform Computations Using the Reduced Adder Graph Technique

    Directory of Open Access Journals (Sweden)

    Dempster Andrew G

    2007-01-01

    Full Text Available It has recently been shown that the n-dimensional reduced adder graph (RAG-n) technique is beneficial for many DSP applications such as for FIR and IIR filters, where multipliers can be grouped in multiplier blocks. This paper highlights the importance of DFT and FFT as DSP objects and also explores how the RAG-n technique can be applied to these algorithms. This RAG-n DFT will be shown to be of low complexity and possess an attractively regular VLSI data flow when implemented with the Rader DFT algorithm or the Bluestein chirp-z algorithm. ASIC synthesis data are provided and demonstrate the low complexity and high speed of the design when compared to other alternatives.

  2. Reduced radiation exposure to the mammary glands in thoracic computed tomography using organ-based tube-current modulation

    International Nuclear Information System (INIS)

    Munechika, Jiro; Ohgiya, Yoshimitsu; Gokan, Takehiko; Hashimoto, Toshi; Iwai, Tsugunori

    2013-01-01

    Organ-based tube-current modulation has been used to reduce radiation exposure to specific organs. However, there are no reports yet published on reducing radiation exposure in clinical cases. In this study, we assessed the reduction in radiation exposure to the mammary glands during thoracic computed tomography (CT) using X-CARE. In a phantom experiment, the use of X-CARE reduced radiation exposure at the midline of the precordial region by a maximum of 45.1%. In our corresponding clinical study, CT was performed using X-CARE in 15 patients, and without X-CARE in another 15. Compared to the non-X-CARE group, radiation exposure was reduced in the X-CARE group at the midline of the precordial region by 22.3% (P < 0.05). X-CARE thus reduced radiation exposure at the midline of the precordial region and allowed us to obtain consistent CT values without increasing noise. However, this study revealed increases in radiation exposure at the lateral sides of the breasts. It is conceivable that patients' breasts were laterally displaced by gravity under the standard thoracic imaging conditions. Further studies that consider factors such as body size and adjustment of imaging conditions may be needed in the future. (author)

  3. "No zone" approach in penetrating neck trauma reduces unnecessary computed tomography angiography and negative explorations.

    Science.gov (United States)

    Ibraheem, Kareem; Khan, Muhammad; Rhee, Peter; Azim, Asad; O'Keeffe, Terence; Tang, Andrew; Kulvatunyou, Narong; Joseph, Bellal

    2018-01-01

    The most recent management guidelines advocate computed tomography angiography (CTA) for any suspected vascular or aero-digestive injuries in all zones and give zone II injuries special consideration. We hypothesized that physical examination can safely guide CTA use in a "no zone" approach. An 8-year retrospective analysis of all adult trauma patients with penetrating neck trauma (PNT) was performed. We included all patients in whom the platysma was violated. Patients were classified into three groups as follows: hard signs, soft signs, and asymptomatic. CTA use, positive CTA (contrast extravasation, dissection, or intimal flap) and operative details were reported. Primary outcomes were positive CTA and therapeutic neck exploration (TNE) (defined by repair of major vascular or aero-digestive injuries). A total of 337 patients with PNT met the inclusion criteria. Eighty-two patients had hard signs and all of them went to the operating room, of which 59 (72%) had TNE. One hundred fifty-six patients had soft signs, of which CTA was performed in 121 (78%), with positive findings in 12 (10%) patients. The remaining 35 (22%) underwent initial neck exploration, of which 14 (40%) were therapeutic, yielding a high rate of negative explorations. Ninety-nine patients were asymptomatic, of which CTA was performed in 79 (80%), with positive findings in 3 (4%); however, none of these patients required TNE. On subanalysis based on symptoms, there was no difference in the rate of TNE between the neck zones in patients with hard signs (P = 0.23) or soft signs (P = 0.51). Regardless of the zone of injury, asymptomatic patients did not require a TNE. Physical examination regardless of the zone of injury should be the primary guide to CTA or TNE in patients with PNT. Following traditional zone-based guidelines can result in unnecessary negative explorations in patients with soft signs and may need rethinking. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Myocardial fatty acid imaging with 123I-BMIPP in patients with chronic right ventricular pressure overload. Clinical significance of reduced uptake in interventricular septum

    International Nuclear Information System (INIS)

    Hori, Yoshiro; Ishida, Yoshio; Fukuchi, Kazuki; Hayashida, Kouhei; Takamiya, Makoto

    2002-01-01

    Regionally reduced 123I-beta-methyliodophenyl pentadecanoic acid (123I-BMIPP) uptake in the interventricular septum (SEP) is observed in some patients with chronic right ventricular (RV) pressure overload. We studied the significance of this finding by comparing it with mean pulmonary arterial pressure (mPAP). 123I-BMIPP SPECT imaging was carried out in 21 patients with pulmonary hypertension (PH; 51 ± 14 years; 11 men and 10 women; 7 with primary pulmonary hypertension, 11 with pulmonary thromboembolism, and 3 with atrial septal defect). mPAP ranged from 25 to 81 mmHg (49 ± 16 mmHg). Using a midventricular horizontal long-axis plane, regional BMIPP distributions in the RV free wall and SEP were estimated by referring to those in the LV free wall. Count ratios of the RV free wall and SEP to the LV free wall (RV/LV, SEP/LV) were determined by ROI analysis. RV/LV showed a linear correlation with mPAP (r=0.42). However, SEP/LV was inversely correlated with mPAP (r=-0.49). When SEP/RV was compared among three regions of SEP in each patient, basal SEP/RV was most sensitively decreased in response to increased mPAP (r=-0.70). These results suggest that the assessment of septal tracer uptake in 123I-BMIPP SPECT imaging is useful for evaluating the severity of RV pressure overload in patients with PH. (author)

  5. Low-power hardware implementation of movement decoding for brain computer interface with reduced-resolution discrete cosine transform.

    Science.gov (United States)

    Minho Won; Albalawi, Hassan; Xin Li; Thomas, Donald E

    2014-01-01

    This paper describes a low-power hardware implementation of movement decoding for a brain-computer interface. Our proposed hardware design is facilitated by two novel ideas: (i) an efficient feature extraction method based on a reduced-resolution discrete cosine transform (DCT), and (ii) a new dual look-up-table hardware architecture that performs the discrete cosine transform without explicit multiplication. The proposed hardware implementation has been validated for movement decoding of electrocorticography (ECoG) signals using a Xilinx FPGA Zynq-7000 board. It achieves more than 56× energy reduction over a reference design using band-pass filters for feature extraction.
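
    A reduced-resolution DCT feature extractor of this general kind can be prototyped in a few lines before committing it to hardware. The sketch below keeps only the first few DCT coefficients of each signal window as features; the window length, channel count and coefficient count are illustrative assumptions, not the paper's parameters.

    ```python
    import numpy as np
    from scipy.fft import dct  # type-II DCT

    def dct_features(window, n_coeffs=4):
        """Return the first n_coeffs DCT-II coefficients of a 1-D signal window.

        Truncating the transform is the 'reduced resolution' step: most of the
        slow envelope information survives in the low-order coefficients.
        """
        return dct(window, type=2, norm="ortho")[:n_coeffs]

    # Toy example: 16 channels x 256 samples of simulated ECoG.
    rng = np.random.default_rng(0)
    ecog = rng.standard_normal((16, 256))
    features = np.vstack([dct_features(ch) for ch in ecog])  # shape (16, 4)
    print(features.shape)
    ```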

  6. Using computer, mobile and wearable technology enhanced interventions to reduce sedentary behaviour: a systematic review and meta-analysis.

    Science.gov (United States)

    Stephenson, Aoife; McDonough, Suzanne M; Murphy, Marie H; Nugent, Chris D; Mair, Jacqueline L

    2017-08-11

    High levels of sedentary behaviour (SB) are associated with negative health consequences. Technology enhanced solutions such as mobile applications, activity monitors, prompting software, texts, emails and websites are being harnessed to reduce SB. The aim of this paper is to evaluate the effectiveness of such technology enhanced interventions aimed at reducing SB in healthy adults and to examine the behaviour change techniques (BCTs) used. Five electronic databases were searched to identify randomised-controlled trials (RCTs), published up to June 2016. Interventions using computer, mobile or wearable technologies to facilitate a reduction in SB, using a measure of sedentary time as an outcome, were eligible for inclusion. Risk of bias was assessed using the Cochrane Collaboration's tool and interventions were coded using the BCT Taxonomy (v1). Meta-analysis of 15/17 RCTs suggested that computer, mobile and wearable technology tools resulted in a mean reduction of -41.28 min per day (min/day) of sitting time (95% CI -60.99, -21.58, I2 = 77%, n = 1402), in favour of the intervention group at end point follow-up. The pooled effects showed mean reductions at short (≤ 3 months), medium (>3 to 6 months), and long-term follow-up (>6 months) of -42.42 min/day, -37.23 min/day and -1.65 min/day, respectively. Overall, 16/17 studies were deemed as having a high or unclear risk of bias, and 1/17 was judged to be at a low risk of bias. A total of 46 BCTs (14 unique) were coded for the computer, mobile and wearable components of the interventions. The most frequently coded were "prompts and cues", "self-monitoring of behaviour", "social support (unspecified)" and "goal setting (behaviour)". Interventions using computer, mobile and wearable technologies can be effective in reducing SB. Effectiveness appeared most prominent in the short-term and lessened over time. A range of BCTs have been implemented in these interventions. Future studies need to improve reporting

  7. Reduced-order modeling (ROM) for simulation and optimization powerful algorithms as key enablers for scientific computing

    CERN Document Server

    Milde, Anja; Volkwein, Stefan

    2018-01-01

    This edited monograph collects research contributions and addresses the advancement of efficient numerical procedures in the area of model order reduction (MOR) for simulation, optimization and control. The topical scope includes, but is not limited to, new out-of-the-box algorithmic solutions for scientific computing, e.g. reduced basis methods for industrial problems and MOR approaches for electrochemical processes. The target audience comprises research experts and practitioners in the field of simulation, optimization and control, but the book may also be beneficial for graduate students.

  8. A study on the effectiveness of lockup-free caches for a Reduced Instruction Set Computer (RISC) processor

    OpenAIRE

    Tharpe, Leonard.

    1992-01-01

    Approved for public release; distribution is unlimited. This thesis presents a simulation and analysis of the Reduced Instruction Set Computer (RISC) architecture and the effects on RISC performance of a lockup-free cache interface. RISC architectures achieve high performance by having a small, but sufficient, instruction set with most instructions executing in one clock cycle. Current RISC performance ranges from 1.5 to 2.0 CPI. The goal of RISC is to attain a CPI of 1.0. The major hind...

  9. A Novel Computational Method to Reduce Leaky Reaction in DNA Strand Displacement

    Directory of Open Access Journals (Sweden)

    Xin Li

    2015-01-01

    Full Text Available DNA strand displacement techniques are widely used in DNA programming, DNA biosensors, and gene analysis. In DNA strand displacement, leaky reactions can cause DNA signals to decay and DNA signal detection to fail. The most commonly used method of avoiding leakage is to clean up after upstream leaky reactions, and it remains a challenge to develop reliable DNA strand displacement techniques with low leakage. In this work, we address the challenge by experimentally evaluating the basic factors contributing to leakage in DNA strand displacement, including reaction time, ratio of reactants, and ion concentration. Specifically, fluorescent probes and a hairpin-structure reporting DNA strand are designed to detect the output of DNA strand displacement, and thus can evaluate the leakage of DNA strand displacement reactions with different reaction times, ratios of reactants, and ion concentrations. From the obtained data, mathematical models for evaluating leakage are derived by curve fitting. The results show that long incubation times, high concentrations of fuel strand, and inappropriate ion concentrations can weaken leaky reactions. This contributes a method for setting proper reaction conditions to reduce leakage in DNA strand displacement.
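
    The "mathematical models derived by curve fitting" amount to fitting a parametric curve to the measured leak signal as a function of a reaction condition. A minimal sketch with scipy is shown below; the data points and the saturating-exponential model form are purely illustrative assumptions, not values or models taken from the paper.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Hypothetical fluorescence leak signal (a.u.) versus incubation time (h).
    t = np.array([0.5, 1, 2, 4, 8, 16, 24], dtype=float)
    leak = np.array([0.02, 0.04, 0.07, 0.11, 0.15, 0.18, 0.19])

    def saturating(t, a, k):
        """Assumed leak model: approaches plateau a with rate constant k."""
        return a * (1.0 - np.exp(-k * t))

    (a_fit, k_fit), _ = curve_fit(saturating, t, leak, p0=(0.2, 0.1))
    print(f"plateau ~ {a_fit:.3f} a.u., rate constant ~ {k_fit:.3f} / h")
    ```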

  10. MarsSedEx III: linking Computational Fluid Dynamics (CFD) and reduced gravity experiments

    Science.gov (United States)

    Kuhn, N. J.; Kuhn, B.; Gartmann, A.

    2015-12-01

    Experiments conducted during the MarsSedEx I and II reduced gravity experiments showed that using empirical models for sediment transport on Mars developed for Earth violates fluid dynamics. The error is caused by the interaction between running water and sediment particles, which affect each other in a positive feedback loop. As a consequence, the actual flow conditions around a particle cannot be represented by drag coefficients derived on Earth. This study examines the implications of such gravity effects on sediment movement on Mars, with special emphasis on the limits of sandstones and conglomerates formed on Earth as analogues for sedimentation on Mars. Furthermore, options for correcting the errors using a combination of CFD and recent experiments conducted during the MarsSedEx III campaign are presented.

  11. Solution, solid phase and computational structures of apicidin and its backbone-reduced analogs.

    Science.gov (United States)

    Kranz, Michael; Murray, Peter John; Taylor, Stephen; Upton, Richard J; Clegg, William; Elsegood, Mark R J

    2006-06-01

    The recently isolated broad-spectrum antiparasitic apicidin (1) is one of the few naturally occurring cyclic tetrapeptides (CTP). Depending on the solvent, the backbone of 1 exhibits two gamma-turns (in CH2Cl2) or a beta-turn (in DMSO), differing solely in the rotation of the plane of one of the amide bonds. In the X-ray crystal structure, the peptidic C=O and N-H groups are on opposite sides of the backbone plane, giving rise to infinite stacks of cyclotetrapeptides connected by three intermolecular hydrogen bonds between the backbones. Conformational searches (Amber force field) on a truncated model system of 1 confirm all three backbone conformations to be low-energy states. The previously synthesized analogs of 1 containing a reduced amide bond exhibit the same backbone conformation as 1 in DMSO, which is further confirmed by the X-ray crystal structure of a model system of the desoxy analogs of 1. This similarity helps explain why the desoxy analogs retain some of the antiprotozoal activities of apicidin. The backbone-reduction approach, designed to facilitate the cyclization step of the acyclic precursors of the CTPs, seems to retain the conformational preferences of the parent peptide backbone.

  12. Bessel function expansion to reduce the calculation time and memory usage for cylindrical computer-generated holograms.

    Science.gov (United States)

    Sando, Yusuke; Barada, Daisuke; Jackin, Boaz Jessie; Yatagai, Toyohiko

    2017-07-10

    This study proposes a method to reduce the calculation time and memory usage required for calculating cylindrical computer-generated holograms. The wavefront on the cylindrical observation surface is represented as a convolution integral in the 3D Fourier domain. The Fourier transformation of the kernel function involving this convolution integral is analytically performed using a Bessel function expansion. The analytical solution can drastically reduce the calculation time and the memory usage without any cost, compared with the numerical method using fast Fourier transform to Fourier transform the kernel function. In this study, we present the analytical derivation, the efficient calculation of Bessel function series, and a numerical simulation. Furthermore, we demonstrate the effectiveness of the analytical solution through comparisons of calculation time and memory usage.

  13. Impact of reduced-radiation dual-energy protocols using 320-detector row computed tomography for analyzing urinary calculus components: initial in vitro evaluation.

    Science.gov (United States)

    Cai, Xiangran; Zhou, Qingchun; Yu, Juan; Xian, Zhaohui; Feng, Youzhen; Yang, Wencai; Mo, Xukai

    2014-10-01

    To evaluate the impact of reduced-radiation dual-energy (DE) protocols using 320-detector row computed tomography on the differentiation of urinary calculus components. A total of 58 urinary calculi were placed into the same phantom and underwent DE scanning with 320-detector row computed tomography. Each calculus was scanned 4 times with the DE protocols using 135 kV and 80 kV tube voltage and different tube current combinations, including 100 mA and 570 mA (group A), 50 mA and 290 mA (group B), 30 mA and 170 mA (group C), and 10 mA and 60 mA (group D). The acquisition data of all 4 groups were then analyzed by stone DE analysis software, and the results were compared with x-ray diffraction analysis. Noise, contrast-to-noise ratio, and radiation dose were compared. Calculi were correctly identified in 56 of 58 stones (96.6%) using group A and B protocols. However, only 35 stones (60.3%) and 16 stones (27.6%) were correctly diagnosed using group C and D protocols, respectively. Mean noise increased significantly and mean contrast-to-noise ratio decreased significantly from groups A to D (P < 0.05). The reduced-dose DE protocol thus still allowed accurate calculus component analysis while reducing patient radiation exposure to 1.81 mSv. Further reduction of tube currents may compromise diagnostic accuracy. Copyright © 2014 Elsevier Inc. All rights reserved.
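
    The contrast-to-noise ratio used to compare the four dose groups has a simple operational form: the difference between the mean attenuation in a stone ROI and in a background ROI, divided by the background noise. A sketch of that calculation on a synthetic slice follows; the ROI placement and numbers are illustrative, and this is not the vendor analysis software.

    ```python
    import numpy as np

    def contrast_to_noise(image, stone_mask, background_mask):
        """CNR = (mean HU in stone ROI - mean HU in background ROI) / background SD."""
        signal = image[stone_mask].mean()
        background = image[background_mask].mean()
        noise = image[background_mask].std()
        return (signal - background) / noise

    # Toy 64x64 "slice": noisy water-equivalent background plus a brighter 8x8 stone.
    rng = np.random.default_rng(1)
    img = rng.normal(0.0, 15.0, (64, 64))          # background ~0 HU, SD ~15 HU
    img[28:36, 28:36] += 400.0                     # calculus at ~+400 HU
    stone = np.zeros_like(img, dtype=bool); stone[28:36, 28:36] = True
    bg = np.zeros_like(img, dtype=bool); bg[:16, :16] = True
    print(f"CNR = {contrast_to_noise(img, stone, bg):.1f}")
    ```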

  14. Computer-assisted determination of left ventricular endocardial borders reduces variability in the echocardiographic assessment of ejection fraction

    Directory of Open Access Journals (Sweden)

    Lindstrom Lena

    2008-11-01

    Full Text Available Abstract Background Left ventricular size and function are important prognostic factors in heart disease. Their measurement is the most frequent reason for sending patients to the echo lab. These measurements have important implications for therapy but are sensitive to the skill of the operator. Earlier automated echo-based methods have not become widely used. The aim of our study was to evaluate an automatic echocardiographic method (with manual correction if needed) for determining left ventricular ejection fraction (LVEF), based on an active appearance model of the left ventricle (syngo® AutoEF, Siemens Medical Solutions). Comparisons were made with manual planimetry (manual Simpson), visual assessment, and automatically determined LVEF from quantitative myocardial gated single photon emission computed tomography (SPECT). Methods 60 consecutive patients referred for myocardial perfusion imaging (MPI) were included in the study. Two-dimensional echocardiography was performed within one hour of MPI at rest. Image quality did not constitute an exclusion criterion. Analysis was performed by five experienced observers and by two novices. Results LVEF (%) and end-diastolic and end-systolic volume/BSA (ml/m2) were, for uncorrected AutoEF, 54 ± 10, 51 ± 16, 24 ± 13; for corrected AutoEF 53 ± 10, 53 ± 18, 26 ± 14; for manual Simpson 51 ± 11, 56 ± 20, 28 ± 15; and for MPI 52 ± 12, 67 ± 26, 35 ± 23. The required time for analysis was significantly different for all four echocardiographic methods and was 79 ± 5 s for uncorrected AutoEF, 159 ± 46 s for corrected AutoEF, 177 ± 66 s for manual Simpson, and 33 ± 14 s for visual assessment. Compared with the expert manual Simpson, limits of agreement for novice corrected AutoEF were narrower than for novice manual Simpson (0.8 ± 10.5 vs. -3.2 ± 11.4 LVEF percentage points). Calculated for experts and with LVEF (%) categorized into ... Conclusion Corrected AutoEF reduces the variation in measurements compared with
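
    The manual reference method here, Simpson's biplane rule, reduces to a method-of-discs volume estimate from traced endocardial borders in two orthogonal views, with LVEF then given by (EDV - ESV)/EDV. The sketch below follows the standard textbook formulation; the disc diameters and lengths are made-up numbers, not patient data from this study.

    ```python
    import math

    def simpson_biplane_volume(diam_4ch, diam_2ch, length_cm):
        """Method of discs: V = (pi/4) * (L/n) * sum(a_i * b_i) over n paired diameters (cm -> ml)."""
        assert len(diam_4ch) == len(diam_2ch)
        n = len(diam_4ch)
        return math.pi / 4.0 * (length_cm / n) * sum(a * b for a, b in zip(diam_4ch, diam_2ch))

    # Illustrative traced diameters (cm) for 20 discs at end-diastole and end-systole.
    edv = simpson_biplane_volume([4.5] * 20, [4.3] * 20, 8.5)
    esv = simpson_biplane_volume([3.2] * 20, [3.0] * 20, 7.0)
    lvef = (edv - esv) / edv * 100.0
    print(f"EDV {edv:.0f} ml, ESV {esv:.0f} ml, LVEF {lvef:.0f}%")
    ```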

  15. Technical Note: Method of Morris effectively reduces the computational demands of global sensitivity analysis for distributed watershed models

    Directory of Open Access Journals (Sweden)

    J. D. Herman

    2013-07-01

    Full Text Available The increase in spatially distributed hydrologic modeling warrants a corresponding increase in diagnostic methods capable of analyzing complex models with large numbers of parameters. Sobol' sensitivity analysis has proven to be a valuable tool for diagnostic analyses of hydrologic models. However, for many spatially distributed models, the Sobol' method requires a prohibitive number of model evaluations to reliably decompose output variance across the full set of parameters. We investigate the potential of the method of Morris, a screening-based sensitivity approach, to provide results sufficiently similar to those of the Sobol' method at a greatly reduced computational expense. The methods are benchmarked on the Hydrology Laboratory Research Distributed Hydrologic Model (HL-RDHM) over a six-month period in the Blue River watershed, Oklahoma, USA. The Sobol' method required over six million model evaluations to ensure reliable sensitivity indices, corresponding to more than 30 000 computing hours and roughly 180 gigabytes of storage space. We find that the method of Morris is able to correctly screen the most and least sensitive parameters with 300 times fewer model evaluations, requiring only 100 computing hours and 1 gigabyte of storage space. The method of Morris proves to be a promising diagnostic approach for global sensitivity analysis of highly parameterized, spatially distributed hydrologic models.
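
    For readers wanting to reproduce this kind of screening on their own model, the elementary-effects method of Morris is available off the shelf, e.g. in the SALib Python package. A minimal sketch on a toy three-parameter function follows; the hydrologic model is replaced by a cheap stand-in, the parameter names and trajectory count are arbitrary, and the calls reflect SALib's commonly documented interface rather than anything specific to this study.

    ```python
    import numpy as np
    from SALib.sample import morris as morris_sample
    from SALib.analyze import morris as morris_analyze

    # Toy stand-in for an expensive distributed hydrologic model.
    def model(x):
        return 3.0 * x[:, 0] + x[:, 1] ** 2 + 0.1 * x[:, 2]

    problem = {
        "num_vars": 3,
        "names": ["soil_depth", "conductivity", "roughness"],
        "bounds": [[0.0, 1.0]] * 3,
    }

    X = morris_sample.sample(problem, N=50, num_levels=4)   # 50 trajectories
    Y = model(X)
    res = morris_analyze.analyze(problem, X, Y, num_levels=4, print_to_console=False)
    for name, mu_star in zip(problem["names"], res["mu_star"]):
        print(f"{name:>12s}: mu* = {mu_star:.3f}")   # larger mu* = more influential
    ```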

  16. PARP-1 depletion in combination with carbon ion exposure significantly reduces MMPs activity and overall increases TIMPs expression in cultured HeLa cells

    International Nuclear Information System (INIS)

    Ghorai, Atanu; Sarma, Asitikantha; Chowdhury, Priyanka; Ghosh, Utpal

    2016-01-01

    Hadron therapy is an innovative technique in which cancer cells are precisely killed, leaving surrounding healthy cells least affected, by high linear energy transfer (LET) radiation such as a carbon ion beam. The anti-metastatic effect of carbon ion exposure attracts investigators into the field of hadron biology, although details remain poorly understood. Poly(ADP-ribose) polymerase-1 (PARP-1) inhibitors are well-known radiosensitizers and several PARP-1 inhibitors are in clinical trials. Our previous studies showed that PARP-1 depletion makes cells more radiosensitive towards carbon ions than gamma rays. The purpose of the present study was to investigate the combined effects of PARP-1 inhibition and carbon ion exposure on metastatic properties in HeLa cells. Activities of matrix metalloproteinases 2 and 9 (MMP-2, MMP-9) were measured using gelatin zymography after 85 MeV carbon ion exposure or gamma irradiation (0-4 Gy) to compare metastatic potential between PARP-1 knockdown (HsiI) and control cells (H-vector: HeLa transfected with a vector without the shRNA construct). Expression of MMP-2, MMP-9, and the tissue inhibitors of MMPs TIMP-1, TIMP-2 and TIMP-3 was checked by immunofluorescence and western blot. Cell death by trypan blue, apoptosis and autophagy induction were studied after carbon ion exposure in each cell type. The data were analyzed using one-way ANOVA and a two-tailed paired-samples t-test. PARP-1 silencing significantly reduced MMP-2 and MMP-9 activities, and carbon ion exposure further diminished their activities to less than 3% of control H-vector. On the contrary, gamma radiation enhanced both MMP-2 and MMP-9 activities in H-vector but not in HsiI cells. The expression of MMP-2 and MMP-9 in H-vector and HsiI showed different patterns after carbon ion exposure. All three TIMPs were increased in HsiI, whereas only TIMP-1 was up-regulated in H-vector after irradiation. Notably, the expression of all TIMPs was significantly higher in HsiI than in H-vector at 4 Gy. Apoptosis was

  17. Symmetric dimeric bisbenzimidazoles DBP(n) reduce methylation of RARB and PTEN while significantly increasing methylation of rRNA genes in MCF-7 cancer cells.

    Directory of Open Access Journals (Sweden)

    Svetlana V Kostyuk

    Full Text Available Hypermethylation is observed in the promoter regions of suppressor genes in tumor cells. Reactivation of these genes by demethylation of their promoters is a prospective strategy of anticancer therapy. Previous experiments have shown that symmetric dimeric bisbenzimidazoles DBP(n) are able to block DNA methyltransferase activities. It was also found that DBP(n) produce a moderate effect on the activation of total gene expression in a HeLa-TI population containing an epigenetically repressed avian sarcoma genome. It is shown that DBP(n) are able to penetrate the cellular membranes and accumulate in the breast carcinoma cell line MCF-7, mainly in the mitochondria and in the nucleus, excluding the nucleolus. The DBP(n) are non-toxic to the cells and have a weak overall demethylation effect on genomic DNA. DBP(n) demethylate the promoter regions of the tumor suppressor genes PTEN and RARB. DBP(n) promote expression of the genes RARB, PTEN, CDKN2A, RUNX3, Apaf-1 and APC, which are "silent" in MCF-7 because of the hypermethylation of their promoter regions. Simultaneously with the demethylation of the DNA in the nucleus, a significant increase in the methylation level of rRNA genes in the nucleolus was detected. Increased rDNA methylation correlated with a reduction of the rRNA amount in the cells by 20-30%. It is assumed that during DNA methyltransferase activity inhibition by the DBP(n) in the nucleus, the enzyme is sequestered in the nucleolus and provides additional methylation of the rDNA, which is not shielded by DBP(n). It is concluded that DBP(n) are able to accumulate in the nucleus (excluding the nucleolus area) and in the mitochondria of cancer cells, reducing mitochondrial potential. The DBP(n) induce the demethylation of a cancer cell's genome, including the demethylation of the promoters of tumor suppressor genes. DBP(n) significantly increase the methylation of ribosomal RNA genes in the nucleoli. Therefore, further study of these compounds is needed.

  18. Holstein-Friesian calves selected for divergence in residual feed intake during growth exhibited significant but reduced residual feed intake divergence in their first lactation.

    Science.gov (United States)

    Macdonald, K A; Pryce, J E; Spelman, R J; Davis, S R; Wales, W J; Waghorn, G C; Williams, Y J; Marett, L C; Hayes, B J

    2014-03-01

    Residual feed intake (RFI), as a measure of feed conversion during growth, was estimated for around 2,000 growing Holstein-Friesian heifer calves aged 6 to 9 mo in New Zealand and Australia, and individuals from the most and least efficient deciles (low and high RFI phenotypes) were retained. These animals (78 New Zealand cows, 105 Australian cows) were reevaluated during their first lactation to determine if divergence for RFI observed during growth was maintained during lactation. Mean daily body weight (BW) gain during assessment as calves had been 0.86 and 1.15 kg for the respective countries, and the divergence in RFI between most and least efficient deciles for growth was 21% (1.39 and 1.42 kg of dry matter, for New Zealand and Australia, respectively). At the commencement of evaluation during lactation, the cows were aged 26 to 29 mo. All were fed alfalfa and grass cubes; it was the sole diet in New Zealand, whereas 6 kg of crushed wheat/d was also fed in Australia. Measurements of RFI during lactation occurred for 34 to 37 d with measurements of milk production (daily), milk composition (2 to 3 times per week), BW and BW change (1 to 3 times per week), as well as body condition score (BCS). Daily milk production averaged 13.8 kg for New Zealand cows and 20.0 kg in Australia. No statistically significant differences were observed between calf RFI decile groups for dry matter intake, milk production, BW change, or BCS; however a significant difference was noted between groups for lactating RFI. Residual feed intake was about 3% lower for lactating cows identified as most efficient as growing calves, and no negative effects on production were observed. These results support the hypothesis that calves divergent for RFI during growth are also divergent for RFI when lactating. The causes for this reduced divergence need to be investigated to ensure that genetic selection programs based on low RFI (better efficiency) are robust. Copyright © 2014 American Dairy

  19. Non-invasive imaging of myocardial bridge by coronary computed tomography angiography: the value of transluminal attenuation gradient to predict significant dynamic compression

    Energy Technology Data Exchange (ETDEWEB)

    Li, Yuehua; Yu, Mengmeng; Zhang, Jiayin; Li, Minghua [Shanghai Jiao Tong University Affiliated Sixth People' s Hospital, Institute of Diagnostic and Interventional Radiology, Shanghai (China); Lu, Zhigang; Wei, Meng [Shanghai Jiao Tong University Affiliated Sixth People' s Hospital, Department of Cardiology, Shanghai (China)

    2017-05-15

    To study the diagnostic value of the transluminal attenuation gradient (TAG) measured by coronary computed tomography angiography (CCTA) for identifying relevant dynamic compression of a myocardial bridge (MB). Patients with confirmed MB who underwent both CCTA and invasive coronary angiography (ICA) within one month were retrospectively included. TAG was defined as the linear regression coefficient between luminal attenuation and distance. The TAG of the MB vessel and the length and depth of the MB were measured and correlated with the presence and degree of dynamic compression observed at ICA. Systolic compression ≥50 % was considered significant. 302 patients with confirmed MB lesions were included. TAG was lowest (-17.4 ± 6.7 HU/10 mm) in patients with significant dynamic compression and highest in patients without MB compression (-9.5 ± 4.3 HU/10 mm, p < 0.001). Linear correlation revealed a relation between the percentage of systolic compression and TAG (Pearson correlation, r = -0.52, p < 0.001) and no significant relation between the percentage of systolic compression and MB depth or length. ROC curve analysis determined the best cut-off value of TAG as -14.8 HU/10 mm (area under curve = 0.813, 95 % confidence interval = 0.764-0.855, p < 0.001), which yielded high diagnostic accuracy (82.1 %, 248/302). The degree of ICA-assessed systolic compression of MB significantly correlates with TAG but not MB depth or length. (orig.)
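
    Since TAG is defined as the linear regression coefficient of luminal attenuation against distance along the vessel, it can be reproduced from a CCTA centerline sampling with an ordinary least-squares fit. A sketch follows with made-up attenuation samples every 5 mm; the numbers are illustrative only.

    ```python
    import numpy as np

    # Hypothetical luminal attenuation (HU) sampled along the MB vessel every 5 mm.
    distance_mm = np.arange(0, 45, 5, dtype=float)
    attenuation_hu = np.array([460, 452, 445, 431, 424, 418, 407, 399, 390], dtype=float)

    # TAG = slope of attenuation vs distance, conventionally reported per 10 mm.
    slope_per_mm, _ = np.polyfit(distance_mm, attenuation_hu, 1)
    tag = slope_per_mm * 10.0
    print(f"TAG = {tag:.1f} HU/10 mm")   # values below about -14.8 HU/10 mm were flagged in this study
    ```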

  20. Definition of bulky disease in early stage Hodgkin lymphoma in computed tomography era: prognostic significance of measurements in the coronal and transverse planes.

    Science.gov (United States)

    Kumar, Anita; Burger, Irene A; Zhang, Zhigang; Drill, Esther N; Migliacci, Jocelyn C; Ng, Andrea; LaCasce, Ann; Wall, Darci; Witzig, Thomas E; Ristow, Kay; Yahalom, Joachim; Moskowitz, Craig H; Zelenetz, Andrew D

    2016-10-01

    Disease bulk is an important prognostic factor in early stage Hodgkin lymphoma, but its definition is unclear in the computed tomography era. This retrospective analysis investigated the prognostic significance of bulky disease measured in transverse and coronal planes on computed tomography imaging. Early stage Hodgkin lymphoma patients (n=185) treated with chemotherapy with or without radiotherapy from 2000-2010 were included. The longest diameter of the largest lymph node mass was measured in transverse and coronal axes on pre-treatment imaging. The optimal cut off for disease bulk was maximal diameter greater than 7 cm measured in either the transverse or coronal plane. Thirty patients with maximal transverse diameter of 7 cm or under were found to have bulk in coronal axis. The 4-year overall survival was 96.5% (CI: 93.3%, 100%) and 4-year relapse-free survival was 86.8% (CI: 81.9%, 92.1%) for all patients. Relapse-free survival at four years for bulky patients was 80.5% (CI: 73%, 88.9%) compared to 94.4% (CI: 89.1%, 100%) for non-bulky; Cox HR 4.21 (CI: 1.43, 12.38) (P=0.004). In bulky patients, relapse-free survival was not impacted in patients treated with chemoradiotherapy; however, it was significantly lower in patients treated with chemotherapy alone. In an independent validation cohort of 38 patients treated with chemotherapy alone, patients with bulky disease had an inferior relapse-free survival [at 4 years, 71.1% (CI: 52.1%, 97%) vs 94.1% (CI: 83.6%, 100%), Cox HR 5.27 (CI: 0.62, 45.16); P=0.09]. Presence of bulky disease on multidimensional computed tomography imaging is a significant prognostic factor in early stage Hodgkin lymphoma. Coronal reformations may be included for routine Hodgkin lymphoma staging evaluation. In future, our definition of disease bulk may be useful in identifying patients who are most appropriate for chemotherapy alone. Copyright© Ferrata Storti Foundation.

  1. Computer classes and games in virtual reality environment to reduce loneliness among students of an elderly reference center: Study protocol for a randomised cross-over design.

    Science.gov (United States)

    Antunes, Thaiany Pedrozo Campos; Oliveira, Acary Souza Bulle de; Crocetta, Tania Brusque; Antão, Jennifer Yohanna Ferreira de Lima; Barbosa, Renata Thais de Almeida; Guarnieri, Regiani; Massetti, Thais; Monteiro, Carlos Bandeira de Mello; Abreu, Luiz Carlos de

    2017-03-01

    Physical and mental changes associated with aging commonly lead to a decrease in communication capacity, reducing social interactions and increasing loneliness. Computer classes for older adults make significant contributions to social and cognitive aspects of aging. Games in a virtual reality (VR) environment stimulate the practice of communicative and cognitive skills and might also bring benefits to older adults. Furthermore, it might help to initiate their contact to the modern technology. The purpose of this study protocol is to evaluate the effects of practicing VR games during computer classes on the level of loneliness of students of an elderly reference center. This study will be a prospective longitudinal study with a randomised cross-over design, with subjects aged 50 years and older, of both genders, spontaneously enrolled in computer classes for beginners. Data collection will be done in 3 moments: moment 0 (T0) - at baseline; moment 1 (T1) - after 8 typical computer classes; and moment 2 (T2) - after 8 computer classes which include 15 minutes for practicing games in VR environment. A characterization questionnaire, the short version of the Short Social and Emotional Loneliness Scale for Adults (SELSA-S) and 3 games with VR (Random, MoviLetrando, and Reaction Time) will be used. For the intervention phase 4 other games will be used: Coincident Timing, Motor Skill Analyser, Labyrinth, and Fitts. The statistical analysis will compare the evolution in loneliness perception, performance, and reaction time during the practice of the games between the 3 moments of data collection. Performance and reaction time during the practice of the games will also be correlated to the loneliness perception. The protocol is approved by the host institution's ethics committee under the number 52305215.3.0000.0082. Results will be disseminated via peer-reviewed journal articles and conferences. This clinical trial is registered at ClinicalTrials.gov identifier: NCT

  2. Reduced combustion mechanism for C1-C4 hydrocarbons and its application in computational fluid dynamics flare modeling.

    Science.gov (United States)

    Damodara, Vijaya; Chen, Daniel H; Lou, Helen H; Rasel, Kader M A; Richmond, Peyton; Wang, Anan; Li, Xianchang

    2017-05-01

    Emissions from flares consist of unburned hydrocarbons, carbon monoxide (CO), soot, and other partially burned and altered hydrocarbons along with carbon dioxide (CO2) and water. Soot or visible smoke is of particular concern for flare operators/regulatory agencies. The goal of the study is to develop a computational fluid dynamics (CFD) model capable of predicting flare combustion efficiency (CE) and soot emission. Since detailed combustion mechanisms are too complicated for CFD application, a 50-species reduced mechanism, LU 3.0.1, was developed. LU 3.0.1 is capable of handling C4 hydrocarbons and soot precursor species (C2H2, C2H4, C6H6). The new reduced mechanism LU 3.0.1 was first validated against experimental performance indicators: laminar flame speed, adiabatic flame temperature, and ignition delay. Further, CFD simulations using LU 3.0.1 were run to predict soot emission and CE of air-assisted flare tests conducted in 2010 in Tulsa, Oklahoma, using ANSYS Fluent software. Results of the non-premixed probability density function (PDF) model and the eddy dissipation concept (EDC) model are discussed. It is also noteworthy that when used in conjunction with the EDC turbulence-chemistry model, LU 3.0.1 can reasonably predict volatile organic compound (VOC) emissions as well. A reduced combustion mechanism containing 50 C1-C4 species and soot precursors has been developed and validated against experimental data. The combustion mechanism is then employed in CFD modeling of soot emission and combustion efficiency (CE) of controlled flares for which experimental soot and CE data are available. The validated CFD modeling tools are useful for oil, gas, and chemical industries to comply with the U.S. Environmental Protection Agency's (EPA) mandate to achieve smokeless flaring with a high CE.
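
    Validation against laminar flame speed, adiabatic flame temperature and ignition delay is the standard way to check a reduced mechanism before CFD use. The sketch below shows the flame-speed part with Cantera, using the public GRI-3.0 mechanism as a stand-in because LU 3.0.1 itself is not distributed with Cantera; attribute names follow recent Cantera releases and may differ in older versions.

    ```python
    import cantera as ct

    # Stand-in mechanism; a reduced mechanism would be loaded from its own YAML file.
    gas = ct.Solution("gri30.yaml")
    gas.TP = 298.0, ct.one_atm
    gas.set_equivalence_ratio(1.0, fuel="CH4", oxidizer="O2:1.0, N2:3.76")

    # Freely propagating premixed flame on a 3 cm domain.
    flame = ct.FreeFlame(gas, width=0.03)
    flame.set_refine_criteria(ratio=3, slope=0.07, curve=0.14)
    flame.solve(loglevel=0, auto=True)

    print(f"Laminar flame speed ~ {flame.velocity[0] * 100:.1f} cm/s")  # ~37 cm/s expected for stoichiometric CH4/air
    ```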

  3. The clinical significance of isolated loss of lordosis on cervical spine computed tomography in blunt trauma patients: a prospective evaluation of 1,007 patients.

    Science.gov (United States)

    Mejaddam, Ali Y; Kaafarani, Haytham M A; Ramly, Elie P; Avery, Laura L; Yeh, Dante D; King, David R; de Moya, Marc A; Velmahos, George C

    2015-11-01

    A negative computed tomographic (CT) scan may be used to rule out cervical spine (c-spine) injury after trauma. Loss of lordosis (LOL) is frequently found as the only CT abnormality. We investigated whether LOL should preclude c-spine clearance. All adult trauma patients with isolated LOL at our Level I trauma center (February 1, 2011 to May 31, 2012) were prospectively evaluated. The primary outcome was clinically significant injury on magnetic resonance imaging (MRI), flexion-extension views, and/or repeat physical examination. Of 3,333 patients (40 ± 17 years, 60% men) with a c-spine CT, 1,007 (30%) had isolated LOL. Among 841 patients with a Glasgow Coma Scale score of 15, no abnormalities were found on MRI, flexion-extension views, and/or repeat examinations, and all collars were removed. Among 166 patients with Glasgow Coma Scale less than 15, 3 (0.3%) had minor abnormal MRI findings but no clinically significant injury. Isolated LOL on c-spine CT is not associated with a clinically significant injury and should not preclude c-spine clearance. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. Achieving Cyber Resilience, Reducing Cybercrime and Increasing Cyber Defense Capabilities: Where Should the U.S. Department of Defense Concentrate Today to Prevent Cyberattacks of Significant Consequence

    Science.gov (United States)

    2016-04-24

    criminal actors know their targets. The Art of War by Sun Tzu reminds us, “If you know yourself but not the enemy, for every victory gained you...” (Tzu, Sun. The Art of War, The Internet Classics Archive).

  5. Automatic Identification of the Repolarization Endpoint by Computing the Dominant T-wave on a Reduced Number of Leads.

    Science.gov (United States)

    Giuliani, C; Agostinelli, A; Di Nardo, F; Fioretti, S; Burattini, L

    2016-01-01

    Electrocardiographic (ECG) T-wave endpoint (Tend) identification suffers from a lack of reliability due to the presence of noise and variability among leads. Tend identification can be improved by using global repolarization waveforms obtained by combining several leads. The dominant T-wave (DTW) is a global repolarization waveform that has been shown to improve Tend identification when computed using the 15 leads (I to III, aVr, aVl, aVf, V1 to V6, X, Y, Z) usually available in clinics, of which only 8 (I, II, V1 to V6) are independent. The aim of the present study was to evaluate whether the 8 independent leads are sufficient to obtain a DTW that allows reliable Tend identification. To this aim, Tend measures automatically identified from 15-dependent-lead DTWs of 46 control healthy subjects (CHS) and 103 acute myocardial infarction patients (AMIP) were compared with those obtained from 8-independent-lead DTWs. Results indicate that the Tend distributions have statistically indistinguishable median values (CHS: 340 ms vs. 340 ms; AMIP: 325 ms vs. 320 ms) and are strongly correlated (CHS: ρ=0.97; AMIP: ρ=0.88). Thus, for automatic Tend identification from the DTW, the 8 independent leads can be used without a statistically significant loss of accuracy but with a significant decrement of computational effort. The lead dependence of 7 out of 15 leads does not introduce a significant bias in the Tend determination from 15-dependent-lead DTWs.
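
    The record does not spell out how the dominant T-wave is constructed; one common way to build a global repolarization waveform from several leads is the first principal component (first singular vector) of the repolarization segments, and a crude Tend pick can then be made on that single waveform. The sketch below is an illustration under that assumption, not the authors' algorithm.

    ```python
    # Sketch (assumption): a dominant-T-wave-like global waveform as the first
    # principal component of the repolarization segment across leads, plus a
    # crude Tend estimate as the return toward baseline after the T peak.
    import numpy as np

    def dominant_wave(repol_segments: np.ndarray) -> np.ndarray:
        """repol_segments: shape (n_leads, n_samples), one repolarization window per lead."""
        x = repol_segments - repol_segments.mean(axis=1, keepdims=True)
        # The first right singular vector is a single global waveform over time.
        u, s, vt = np.linalg.svd(x, full_matrices=False)
        return s[0] * vt[0]

    def tend_index(wave: np.ndarray, baseline_frac: float = 0.05) -> int:
        """First sample after the T peak where |amplitude| drops below a fraction of the peak."""
        peak = int(np.argmax(np.abs(wave)))
        thresh = baseline_frac * abs(wave[peak])
        after = np.where(np.abs(wave[peak:]) < thresh)[0]
        return peak + (int(after[0]) if after.size else len(wave) - 1 - peak)
    ```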

  6. Reducing multi-qubit interactions in adiabatic quantum computation without adding auxiliary qubits. Part 1: The "deduc-reduc" method and its application to quantum factorization of numbers

    OpenAIRE

    Tanburn, Richard; Okada, Emile; Dattani, Nike

    2015-01-01

    Adiabatic quantum computing has recently been used to factor 56153 [Dattani & Bryans, arXiv:1411.6758] at room temperature, which is orders of magnitude larger than any number attempted yet using Shor's algorithm (circuit-based quantum computation). However, this number is still vastly smaller than RSA-768 which is the largest number factored thus far on a classical computer. We address a major issue arising in the scaling of adiabatic quantum factorization to much larger numbers. Namely, the...

  7. Reduce in Variation and Improve Efficiency of Target Volume Delineation by a Computer-Assisted System Using a Deformable Image Registration Approach

    International Nuclear Information System (INIS)

    Chao, K.S. Clifford; Bhide, Shreerang FRCR; Chen, Hansen; Asper, Joshua PAC; Bush, Steven; Franklin, Gregg; Kavadi, Vivek; Liengswangwong, Vichaivood; Gordon, William; Raben, Adam; Strasser, Jon; Koprowski, Christopher; Frank, Steven; Chronowski, Gregory; Ahamad, Anesa; Malyapa, Robert; Zhang Lifei; Dong Lei

    2007-01-01

    Purpose: To determine whether a computer-assisted target volume delineation (CAT) system using a deformable image registration approach can reduce the variation of target delineation among physicians with different head and neck (HN) IMRT experiences and reduce the time spent on the contouring process. Materials and Methods: We developed a deformable image registration method for mapping contours from a template case to a patient case with a similar tumor manifestation but different body configuration. Eight radiation oncologists with varying levels of clinical experience in HN IMRT performed target delineation on two HN cases, one with base-of-tongue (BOT) cancer and another with nasopharyngeal cancer (NPC), by first contouring from scratch and then by modifying the contours deformed by the CAT system. The gross target volumes were provided. Regions of interest for comparison included the clinical target volumes (CTVs) and normal organs. The volumetric and geometric variation of these regions of interest and the time spent on contouring were analyzed. Results: We found that the variation in delineating CTVs from scratch among the physicians was significant, and that using the CAT system reduced volumetric variation and improved geometric consistency in both BOT and NPC cases. The average timesaving when using the CAT system was 26% to 29% for more experienced physicians and 38% to 47% for the less experienced ones. Conclusions: A computer-assisted target volume delineation approach, using a deformable image-registration method with template contours, was able to reduce the variation among physicians with different experiences in HN IMRT while saving contouring time
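
    The core idea, deformably registering a template CT to a new patient and carrying the template contours along, can be sketched with an open-source toolkit; the example below uses SimpleITK's demons filter and nearest-neighbour resampling of the label map. SimpleITK, the file names, and the parameters are assumptions for illustration, not the CAT system's actual implementation.

    ```python
    # Sketch (assumption): propagate template contours onto a new patient CT via
    # a demons deformable registration, as a generic stand-in for the CAT approach.
    import SimpleITK as sitk

    template_img = sitk.ReadImage("template_ct.nii.gz", sitk.sitkFloat32)      # hypothetical files
    patient_img  = sitk.ReadImage("patient_ct.nii.gz", sitk.sitkFloat32)
    template_ctv = sitk.ReadImage("template_ctv_labels.nii.gz", sitk.sitkUInt8)

    # Rough intensity normalisation helps the demons similarity measure.
    template_img = sitk.HistogramMatching(template_img, patient_img)

    demons = sitk.FastSymmetricForcesDemonsRegistrationFilter()
    demons.SetNumberOfIterations(100)
    demons.SetStandardDeviations(1.5)              # smoothing of the displacement field
    displacement = demons.Execute(patient_img, template_img)   # fixed, moving
    transform = sitk.DisplacementFieldTransform(displacement)

    # Warp the template contours onto the patient grid; nearest neighbour keeps labels crisp.
    propagated_ctv = sitk.Resample(template_ctv, patient_img, transform,
                                   sitk.sitkNearestNeighbor, 0, sitk.sitkUInt8)
    sitk.WriteImage(propagated_ctv, "patient_ctv_initial.nii.gz")
    ```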

  8. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  9. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar in ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity discussing the impact and for addressing issues and solutions to the main challenges facing CMS computing. The lack of manpower is particul...

  11. A computationally fast, reduced model for simulating landslide dynamics and tsunamis generated by landslides in natural terrains

    Science.gov (United States)

    Mohammed, F.

    2016-12-01

    Landslide hazards such as fast-moving debris flows, slow-moving landslides, and other mass flows cause numerous fatalities, injuries, and damage. Landslide occurrences in fjords, bays, and lakes can additionally generate tsunamis with locally extremely high wave heights and runups. Two-dimensional depth-averaged models can simulate the entire lifecycle of the three-dimensional landslide dynamics and tsunami propagation efficiently and accurately under the appropriate assumptions. Landslide rheology is defined using viscous fluids, visco-plastic fluids, and granular material to account for the possible landslide source materials. Saturated and unsaturated rheologies are further included to simulate debris flows, debris avalanches, mudflows, and rockslides. The models are obtained by reducing the fully three-dimensional Navier-Stokes equations, with the internal rheological definition of the landslide material and the water body and appropriate scaling assumptions, to depth-averaged two-dimensional form. The landslide and tsunami models are coupled to include the interaction between the landslide and the water body for tsunami generation. The reduced models are solved numerically with a fast semi-implicit, finite-volume, shock-capturing algorithm. The well-balanced, positivity-preserving algorithm accurately accounts for the wet-dry interface transition for the landslide runout, the landslide-water body interface, and the tsunami wave flooding on land. The models are implemented as a General-Purpose computing on Graphics Processing Unit-based (GPGPU) suite of models, either coupled or run independently within the suite. The GPGPU implementation provides up to 1000 times speedup over a CPU-based serial computation. This enables simulations of multiple scenarios of hazard realizations, providing a basis for a probabilistic hazard assessment. The models have been successfully validated against experiments, past studies, and field data
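
    A toy version of a depth-averaged finite-volume update shows why this kind of model maps so well onto GPUs: the per-cell flux arithmetic is purely array-parallel. The sketch below advances the 1-D shallow-water equations with a simple Lax-Friedrichs flux in NumPy; swapping NumPy for CuPy runs the same array code on a GPU. This is a didactic stand-in, not the authors' semi-implicit, well-balanced scheme.

    ```python
    # Toy 1-D shallow-water finite-volume step with a Lax-Friedrichs flux.
    # Replacing `import numpy as np` with `import cupy as np` executes the same
    # array operations on a GPU (the essence of the GPGPU speedup noted above).
    import numpy as np

    g = 9.81

    def step(h, hu, dx, dt):
        """Advance water depth h and momentum hu by one explicit time step."""
        u = hu / np.maximum(h, 1e-8)                   # guard against dry cells
        F_h = hu                                       # mass flux
        F_hu = hu * u + 0.5 * g * h**2                 # momentum flux
        lam = dx / dt
        # Lax-Friedrichs numerical fluxes at the cell interfaces
        Fh_half = 0.5 * (F_h[:-1] + F_h[1:]) - 0.5 * lam * (h[1:] - h[:-1])
        Fhu_half = 0.5 * (F_hu[:-1] + F_hu[1:]) - 0.5 * lam * (hu[1:] - hu[:-1])
        h_new, hu_new = h.copy(), hu.copy()
        h_new[1:-1] -= dt / dx * (Fh_half[1:] - Fh_half[:-1])
        hu_new[1:-1] -= dt / dx * (Fhu_half[1:] - Fhu_half[:-1])
        return h_new, hu_new

    # Dam-break style initial condition
    x = np.linspace(0.0, 10.0, 1001)
    h = np.where(x < 5.0, 2.0, 1.0)
    hu = np.zeros_like(h)
    dx, dt = x[1] - x[0], 0.001
    for _ in range(1000):
        h, hu = step(h, hu, dx, dt)
    ```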

  12. A three-dimensional ground-water-flow model modified to reduce computer-memory requirements and better simulate confining-bed and aquifer pinchouts

    Science.gov (United States)

    Leahy, P.P.

    1982-01-01

    The Trescott computer program for modeling groundwater flow in three dimensions has been modified to (1) treat aquifer and confining-bed pinchouts more realistically and (2) reduce the computer-memory requirements for the input data. With the original program, simulating aquifer systems with nonrectangular external boundaries may result in a large number of nodes that are not involved in the numerical solution of the problem but still require computer storage. (USGS)
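
    The memory-saving idea, in spirit, is to stop storing the full rectangular box of nodes and instead index only the active cells of an irregular aquifer. The generic sketch below illustrates that mapping; it is an illustration of the concept, not the USGS program's actual data structures, and the mask file name is a placeholder.

    ```python
    # Sketch: size solver arrays by the active nodes of an irregular aquifer
    # rather than by the full rectangular (nlay, nrow, ncol) grid.
    import numpy as np

    active = np.load("active_mask.npy")          # hypothetical boolean array (nlay, nrow, ncol)
    n_active = int(active.sum())

    # Map each active grid cell to a compact index; inactive cells map to -1.
    compact_index = -np.ones(active.shape, dtype=np.int64)
    compact_index[active] = np.arange(n_active)

    # Solver arrays now hold only the active unknowns.
    head = np.zeros(n_active)                    # hydraulic head
    transmissivity = np.zeros(n_active)

    def active_neighbors(l, r, c):
        """Compact indices of the active 6-neighbours of cell (l, r, c)."""
        out = []
        for dl, dr, dc in ((1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)):
            ll, rr, cc = l + dl, r + dr, c + dc
            if (0 <= ll < active.shape[0] and 0 <= rr < active.shape[1]
                    and 0 <= cc < active.shape[2]):
                j = compact_index[ll, rr, cc]
                if j >= 0:
                    out.append(int(j))
        return out
    ```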

  13. Leukocyte-depletion of blood components does not significantly reduce the risk of infectious complications. Results of a double-blinded, randomized study

    DEFF Research Database (Denmark)

    Titlestad, I. L.; Ebbesen, L. S.; Ainsworth, A. P.

    2001-01-01

    Allogeneic blood transfusions are claimed to be an independent risk factor for postoperative infections in open colorectal surgery due to immunomodulation. Leukocyte-depletion of erythrocyte suspensions has been shown in some open randomized studies to reduce the rate of postoperative infection t...

  14. Clinical and prognostic significance of bone marrow abnormalities in the appendicular skeleton detected by low-dose whole-body multidetector computed tomography in patients with multiple myeloma

    International Nuclear Information System (INIS)

    Nishida, Y; Matsue, Y; Suehara, Y; Fukumoto, K; Fujisawa, M; Takeuchi, M; Ouchi, E; Matsue, K

    2015-01-01

    Clinical significance of medullary abnormalities in the appendicular skeleton (AS) detected by low-dose whole-body multidetector computed tomography (MDCT) in patients with multiple myeloma (MM) was investigated. A total of 172 patients with monoclonal gammopathy of undetermined significance (MGUS) (n=17), smoldering MM (n=47) and symptomatic MM (n=108) underwent low-dose MDCT. A CT value (CTv) of medullary density of the AS ⩾0 Hounsfield units (HU) was considered abnormal. The percentage of medullary abnormalities and the mean CTv of the AS in patients with MGUS, smoldering MM and symptomatic MM were 18, 55 and 62% and −44.5, −20.3 and 11.2 HU, respectively (P<0.001 and P<0.001). Disease progression of MM was independently associated with high CTv on multivariate analysis. In symptomatic MM, the presence of abnormal medullary lesions was associated with an increased incidence of high-risk cytogenetic abnormalities (34.4% vs 7.7%, P=0.002) and extramedullary disease (10.4% vs 0%, P=0.032). It was also an independent poor prognostic predictor (hazard ratio 3.546, P=0.04). This study showed that the CTv of the AS by MDCT correlates with disease progression of MM, and the presence of abnormal medullary lesions is a predictor of poor survival

  15. Feasibility of an automatic computer-assisted algorithm for the detection of significant coronary artery disease in patients presenting with acute chest pain

    International Nuclear Information System (INIS)

    Kang, Ki-Woon; Chang, Hyuk-Jae; Shim, Hackjoon; Kim, Young-Jin; Choi, Byoung-Wook; Yang, Woo-In; Shim, Jee-Young; Ha, Jongwon; Chung, Namsik

    2012-01-01

    Automatic computer-assisted detection (auto-CAD) of significant coronary artery disease (CAD) in coronary computed tomography angiography (cCTA) has been shown to have relatively high accuracy. However, to date, scarce data are available regarding the performance of auto-CAD in the setting of acute chest pain. This study sought to demonstrate the feasibility of an auto-CAD algorithm for cCTA in patients presenting with acute chest pain. We retrospectively investigated 398 consecutive patients (229 male, mean age 50 ± 21 years) who had acute chest pain and underwent cCTA between Apr 2007 and Jan 2011 in the emergency department (ED). All cCTA data were analyzed using an auto-CAD algorithm for the detection of >50% CAD on cCTA. The accuracy of auto-CAD was compared with the formal radiology report. In 380 of 398 patients (18 were excluded due to failure of data processing), per-patient analysis of auto-CAD revealed the following: sensitivity 94%, specificity 63%, positive predictive value (PPV) 76%, and negative predictive value (NPV) 89%. After the exclusion of 37 cases that were interpreted as invalid by the auto-CAD algorithm, the NPV was further increased up to 97%, considering the false-negative cases in the formal radiology report, and was confirmed by subsequent invasive angiogram during the index visit. We successfully demonstrated the high accuracy of an auto-CAD algorithm, compared with the formal radiology report, for the detection of >50% CAD on cCTA in the setting of acute chest pain. The auto-CAD algorithm can be used to facilitate the decision-making process in the ED.
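
    The per-patient figures quoted above follow directly from a 2x2 confusion matrix; the short sketch below shows the standard definitions. The example counts are made up for illustration only, since the abstract reports rates rather than raw counts.

    ```python
    # Standard definitions behind reported sensitivity / specificity / PPV / NPV.
    def diagnostic_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
        return {
            "sensitivity": tp / (tp + fn),   # true-positive rate
            "specificity": tn / (tn + fp),   # true-negative rate
            "ppv": tp / (tp + fp),           # positive predictive value
            "npv": tn / (tn + fn),           # negative predictive value
        }

    # Illustrative counts only (not the study's raw data).
    print(diagnostic_metrics(tp=94, fp=30, tn=50, fn=6))
    ```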

  16. Use of theory in computer-based interventions to reduce alcohol use among adolescents and young adults: a systematic review.

    Science.gov (United States)

    Tebb, Kathleen P; Erenrich, Rebecca K; Jasik, Carolyn Bradner; Berna, Mark S; Lester, James C; Ozer, Elizabeth M

    2016-06-17

    Alcohol use and binge drinking among adolescents and young adults remain frequent causes of preventable injuries, disease, and death, and there has been growing attention to computer-based modes of intervention delivery to prevent/reduce alcohol use. Research suggests that health interventions grounded in established theory are more effective than those with no theoretical basis. The goal of this study was to conduct a literature review of computer-based interventions (CBIs) designed to address alcohol use among adolescents and young adults (aged 12-21 years) and examine the extent to which CBIs use theories of behavior change in their development and evaluations. This study also provides an update on extant CBIs addressing alcohol use among youth and their effectiveness. Between November and December of 2014, a literature review of CBIs aimed at preventing or reducing alcohol use was conducted in PsychINFO, PubMed, and Google Scholar. The use of theory in each CBI was examined using a modified version of the classification system developed by Painter et al. (Ann Behav Med 35:358-362, 2008). The search yielded 600 unique articles; 500 were excluded because they did not meet the inclusion criteria. The 100 remaining articles were retained for analyses. Many articles were written about a single intervention; thus, the search revealed a total of 42 unique CBIs. In examining the use of theory, 22 CBIs (52 %) explicitly named one or more theoretical frameworks. Primary theories mentioned were social cognitive theory, the transtheoretical model, the theory of planned behavior and reasoned action, and the health belief model. The remaining 48 % did not name a theory but mentioned either a theoretical construct (such as self-efficacy) or an intervention technique (e.g., manipulating social norms). Only a few articles provided detailed information about how the theory was applied to the CBI; the vast majority included little to no information. Given the importance of theory in

  17. Use of theory in computer-based interventions to reduce alcohol use among adolescents and young adults: a systematic review

    Directory of Open Access Journals (Sweden)

    Kathleen P. Tebb

    2016-06-01

    Full Text Available Abstract Background Alcohol use and binge drinking among adolescents and young adults remain frequent causes of preventable injuries, disease, and death, and there has been growing attention to computer-based modes of intervention delivery to prevent/reduce alcohol use. Research suggests that health interventions grounded in established theory are more effective than those with no theoretical basis. The goal of this study was to conduct a literature review of computer-based interventions (CBIs) designed to address alcohol use among adolescents and young adults (aged 12–21 years) and examine the extent to which CBIs use theories of behavior change in their development and evaluations. This study also provides an update on extant CBIs addressing alcohol use among youth and their effectiveness. Methods Between November and December of 2014, a literature review of CBIs aimed at preventing or reducing alcohol use was conducted in PsychINFO, PubMed, and Google Scholar. The use of theory in each CBI was examined using a modified version of the classification system developed by Painter et al. (Ann Behav Med 35:358–362, 2008). Results The search yielded 600 unique articles; 500 were excluded because they did not meet the inclusion criteria. The 100 remaining articles were retained for analyses. Many articles were written about a single intervention; thus, the search revealed a total of 42 unique CBIs. In examining the use of theory, 22 CBIs (52 %) explicitly named one or more theoretical frameworks. Primary theories mentioned were social cognitive theory, the transtheoretical model, the theory of planned behavior and reasoned action, and the health belief model. The remaining 48 % did not name a theory but mentioned either a theoretical construct (such as self-efficacy) or an intervention technique (e.g., manipulating social norms). Only a few articles provided detailed information about how the theory was applied to the CBI; the vast majority included little

  18. Postoperative myocardial infarction documented by technetium pyrophosphate scan using single-photon emission computed tomography: Significance of intraoperative myocardial ischemia and hemodynamic control

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, D.C.; Chung, F.; Burns, R.J.; Houston, P.L.; Feindel, C.M. (Toronto Hospital, Ontario (Canada))

    1989-12-01

    The aim of this prospective study was to document postoperative myocardial infarction (PMI) by technetium pyrophosphate scan using single-photon emission computed tomography (TcPPi-SPECT) in 28 patients undergoing elective coronary bypass grafting (CABG). The relationships of intraoperative electrocardiographic myocardial ischemia, hemodynamic responses, and pharmacological requirements to the incidence of PMI were examined. Radionuclide cardioangiography and TcPPi-SPECT were performed 24 h preoperatively and 48 h postoperatively. A standard high-dose fentanyl anesthetic protocol was used. Twenty-five percent of the elective CABG patients developed PMI, as documented by TcPPi-SPECT, with an infarcted mass of 38.0 +/- 5.5 g. No significant difference in demographics, preoperative right and left ventricular function, number of coronary vessels grafted, or aortic cross-clamp time was observed between the PMI and non-PMI groups. The distribution of patients using preoperative beta-adrenergic blocking drugs or calcium channel blocking drugs was found to have no correlation with the outcome of PMI. As well, no significant differences in hemodynamic changes or pharmacological requirements were observed in the PMI and non-PMI groups during prebypass or postbypass periods, indicating careful intraoperative control of hemodynamic indices did not prevent the outcome of PMI in these patients. However, the incidence of prebypass ischemia was 39.3% and significantly correlated with the outcome of positive TcPPi-SPECT, denoting a 3.9-fold increased risk of developing PMI. Prebypass ischemic changes in leads II and V5 were shown to correlate with increased CPK-MB release (P less than 0.05) and tended to occur more frequently with lateral myocardial infarction.

  19. Whole-body computed tomography in trauma patients: optimization of the patient scanning position significantly shortens examination time while maintaining diagnostic image quality

    Directory of Open Access Journals (Sweden)

    Hickethier T

    2018-05-01

    Full Text Available Background: The study was conducted to compare examination time and artifact vulnerability of whole-body computed tomographies (wbCTs) for trauma patients using conventional or optimized patient positioning. Patients and methods: Examination time was measured in 100 patients scanned with the conventional protocol (Group A: arms positioned alongside the body for head and neck imaging and over the head for trunk imaging) and 100 patients scanned with the optimized protocol (Group B: arms flexed on a chest pillow without repositioning). Additionally, the influence of the two scanning protocols on image quality in the most relevant body regions was assessed by two blinded readers. Results: Total wbCT duration was about 35% or 3:46 min shorter in B than in A. Artifacts in the aorta (27% vs 6%), liver (40% vs 8%) and spleen (27% vs 5%) occurred significantly more often in B than in A. No incident of non-diagnostic image quality was reported, and no significant differences for lungs and spine were found. Conclusion: An optimized wbCT positioning protocol for trauma patients allows a significant reduction of examination time while still maintaining diagnostic image quality. Keywords: CT scan, polytrauma, acute care, time requirement, positioning

  20. Postoperative myocardial infarction documented by technetium pyrophosphate scan using single-photon emission computed tomography: Significance of intraoperative myocardial ischemia and hemodynamic control

    International Nuclear Information System (INIS)

    Cheng, D.C.; Chung, F.; Burns, R.J.; Houston, P.L.; Feindel, C.M.

    1989-01-01

    The aim of this prospective study was to document postoperative myocardial infarction (PMI) by technetium pyrophosphate scan using single-photon emission computed tomography (TcPPi-SPECT) in 28 patients undergoing elective coronary bypass grafting (CABG). The relationships of intraoperative electrocardiographic myocardial ischemia, hemodynamic responses, and pharmacological requirements to the incidence of PMI were examined. Radionuclide cardioangiography and TcPPi-SPECT were performed 24 h preoperatively and 48 h postoperatively. A standard high-dose fentanyl anesthetic protocol was used. Twenty-five percent of the elective CABG patients developed PMI, as documented by TcPPi-SPECT, with an infarcted mass of 38.0 +/- 5.5 g. No significant difference in demographics, preoperative right and left ventricular function, number of coronary vessels grafted, or aortic cross-clamp time was observed between the PMI and non-PMI groups. The distribution of patients using preoperative beta-adrenergic blocking drugs or calcium channel blocking drugs was found to have no correlation with the outcome of PMI. As well, no significant differences in hemodynamic changes or pharmacological requirements were observed in the PMI and non-PMI groups during prebypass or postbypass periods, indicating careful intraoperative control of hemodynamic indices did not prevent the outcome of PMI in these patients. However, the incidence of prebypass ischemia was 39.3% and significantly correlated with the outcome of positive TcPPi-SPECT, denoting a 3.9-fold increased risk of developing PMI. Prebypass ischemic changes in leads II and V5 were shown to correlate with increased CPK-MB release (P less than 0.05) and tended to occur more frequently with lateral myocardial infarction

  1. [Effective Techniques to Reduce Radiation Exposure to Medical Staff while Assisting with X-ray Computed Tomography Examinations].

    Science.gov (United States)

    Miyajima, Ryuichi; Fujibuchi, Toshioh; Miyachi, Yusuke; Tateishi, Satoshi; Uno, Yoshinori; Amakawa, Kazutoshi; Ohura, Hiroki; Orita, Shinichi

    2018-01-01

    Medical staff such as radiological technologists, doctors, and nurses are at an increased risk of radiation exposure while positioning the patient or monitoring contrast medium injection during computed tomography (CT). However, methods to protect medical staff from radiation exposure and protocols for using radiological protection equipment have not been standardized and differ among hospitals. In this study, the distribution of scattered X-rays in a CT room was measured by placing electronic personal dosimeters at the locations where medical staff stand beside the CT scanner gantry while assisting the patient, and the exposure dose at those locations was determined. Moreover, we evaluated non-uniform exposure and identified effective techniques to reduce the exposure dose to medical staff during CT. The dose of scattered X-rays was lowest at the gantry and at the examination table during both head and abdominal CT. The dose was highest at the trunk of the upper body of the operator, corresponding to a height of 130 cm, during head CT and at the head, corresponding to a height of 150 cm, during abdominal CT. The maximum dose to the crystalline lens was approximately 600 μSv during head CT. We found that the use of volumetric CT scanning, the use of X-ray protective goggles, and orienting the face toward the gantry reduced the exposure dose, particularly to the crystalline lens, for which a lower equivalent dose limit has recently been recommended in International Commission on Radiological Protection Publication 118.

  2. Using the Hadoop/MapReduce approach for monitoring the CERN storage system and improving the ATLAS computing model

    CERN Document Server

    Russo, Stefano Alberto; Lamanna, M

    The processing of huge amounts of data, already a fundamental task for research in the elementary particle physics field, is becoming more and more important also for companies operating in the Information Technology (IT) industry. In this context, if conventional approaches are adopted, several problems arise, starting from the congestion of the communication channels. In the IT sector, one of the approaches designed to minimize this congestion is to exploit data locality, or in other words, to bring the computation as close as possible to where the data resides. The most common implementation of this concept is the Hadoop/MapReduce framework. In this thesis work I evaluate the usage of Hadoop/MapReduce in two areas: a standard one similar to typical IT analyses, and an innovative one related to high energy physics analyses. The first consists in monitoring the history of the storage cluster which stores the data generated by the LHC experiments, the second in the physics analysis of the latter, ...
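
    The data-locality idea behind Hadoop/MapReduce is easiest to see in the canonical streaming mapper/reducer pair: the mapper runs on the node where its input block already resides and emits key-value pairs, and the framework sorts and groups them before the reducer aggregates. The generic sketch below is an illustration of that pattern, not the thesis's actual monitoring jobs; the log format assumed (first whitespace-separated field identifies the operation) is hypothetical.

    ```python
    # mapper.py - Hadoop Streaming style: reads raw log lines from stdin where the
    # data block is stored and emits <key, 1> pairs (assumed key: first field).
    import sys

    for line in sys.stdin:
        fields = line.rstrip("\n").split()
        if fields:
            print(f"{fields[0]}\t1")
    ```

    ```python
    # reducer.py - receives mapper output sorted by key and aggregates the counts.
    import sys

    current_key, count = None, 0
    for line in sys.stdin:
        key, value = line.rstrip("\n").split("\t")
        if key != current_key:
            if current_key is not None:
                print(f"{current_key}\t{count}")
            current_key, count = key, 0
        count += int(value)
    if current_key is not None:
        print(f"{current_key}\t{count}")
    ```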

  3. Reducing overlay sampling for APC-based correction per exposure by replacing measured data with computational prediction

    Science.gov (United States)

    Noyes, Ben F.; Mokaberi, Babak; Oh, Jong Hun; Kim, Hyun Sik; Sung, Jun Ha; Kea, Marc

    2016-03-01

    One of the keys to successful mass production of sub-20nm nodes in the semiconductor industry is the development of an overlay correction strategy that can meet specifications, reduce the number of layers that require dedicated chuck overlay, and minimize measurement time. Three important aspects of this strategy are: correction per exposure (CPE), integrated metrology (IM), and the prioritization of automated correction over manual subrecipes. The first and third aspects are accomplished through an APC system that uses measurements from production lots to generate CPE corrections that are dynamically applied to future lots. The drawback of this method is that production overlay sampling must be extremely high in order to provide the system with enough data to generate CPE. That drawback makes IM particularly difficult because of the throughput impact that can be created on expensive bottleneck photolithography process tools. The goal is to realize the cycle time and feedback benefits of IM coupled with the enhanced overlay correction capability of automated CPE without impacting process tool throughput. This paper will discuss the development of a system that sends measured data with reduced sampling via an optimized layout to the exposure tool's computational modelling platform to predict and create "upsampled" overlay data in a customizable output layout that is compatible with the fab user CPE APC system. The result is dynamic CPE without the burden of extensive measurement time, which leads to increased utilization of IM.

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, not only by using opportunistic resources like the San Diego Supercomputer Center which was accessible, to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  5. Reduced memory skills and increased hair cortisol levels in recent Ecstasy/MDMA users: significant but independent neurocognitive and neurohormonal deficits.

    Science.gov (United States)

    Downey, Luke A; Sands, Helen; Jones, Lewis; Clow, Angela; Evans, Phil; Stalder, Tobias; Parrott, Andrew C

    2015-05-01

    The goals of this study were to measure the neurocognitive performance of recent users of recreational Ecstasy and investigate whether it was associated with the stress hormone cortisol. The 101 participants included 27 recent light users of Ecstasy (one to four times in the last 3 months), 23 recent heavier Ecstasy users (five or more times) and 51 non-users. Rivermead paragraph recall provided an objective measure for immediate and delayed recall. The prospective and retrospective memory questionnaire provided a subjective index of memory deficits. Cortisol levels were taken from near-scalp 3-month hair samples. Cortisol was significantly raised in recent heavy Ecstasy users compared with controls, whereas hair cortisol in light Ecstasy users was not raised. Both Ecstasy groups were significantly impaired on the Rivermead delayed word recall, and both groups reported significantly more retrospective and prospective memory problems. Stepwise regression confirmed that lifetime Ecstasy predicted the extent of these memory deficits. Recreational Ecstasy is associated with increased levels of the bio-energetic stress hormone cortisol and significant memory impairments. No significant relationship between cortisol and the cognitive deficits was observed. Ecstasy users did display evidence of a metacognitive deficit, with the strength of the correlations between objective and subjective memory performances being significantly lower in the Ecstasy users. Copyright © 2015 John Wiley & Sons, Ltd.

  6. Design criteria for rhenium-reduced nickel-based single-crystal alloys. Identification and computer-assisted conversion

    International Nuclear Information System (INIS)

    Goehler, Thomas

    2016-01-01

    In the present work, design criteria and property models for the creep strength optimization of rhenium-free nickel-based single-crystal superalloys are investigated. The study focuses on a typical load condition of 1050 °C and 150 MPa, which is representative of flight engine applications. The key aspect is to link chemical composition, manufacturing processes, microstructure formation and a mechanistic understanding of dislocation creep through a computational materials engineering approach. Besides the positive effect of rhenium on solid solution hardening, a second mechanism by which rhenium increases high-temperature creep strength is identified: it indirectly stabilizes precipitation hardening by reducing the coarsening kinetics of γ'-rafting. Five 1st- and 2nd-generation technical superalloys show a comparable microstructure evolution for up to 2 % plastic elongation, while creep times differ by a factor of five. The application of a microstructure-sensitive creep model shows that these coarsening processes can activate γ-cutting and thus lead to an increasing creep rate. Based on these calculations, a threshold value of φ γ/γ' > 2.5 at 150 MPa is estimated. This ratio of matrix channel to raft thickness has been verified at multiple positions by microstructure analysis of interrupted creep tests. The mechanism described previously can be decelerated by enriching the γ-matrix with slowly diffusing elements. The same principle also increases the solid solution strength of the γ-matrix. Therefore, the present work delivers an additional mechanistic explanation of why creep properties of single-phase nickel-based alloys can be transferred to two-phase technical superalloys with a rafted γ'-structure. Subsequently, the best way to substitute both of rhenium's fundamental properties, namely a slow diffusion coefficient and a small solubility in γ', has been investigated by means of CALPHAD modeling. Only molybdenum and especially tungsten

  7. A Randomized Controlled Trial to Compare Computer-assisted Motivational Intervention with Didactic Educational Counseling to Reduce Unprotected Sex in Female Adolescents.

    Science.gov (United States)

    Gold, Melanie A; Tzilos, Golfo K; Stein, L A R; Anderson, Bradley J; Stein, Michael D; Ryan, Christopher M; Zuckoff, Allan; DiClemente, Carlo

    2016-02-01

    To examine a computer-assisted, counselor-guided motivational intervention (CAMI) aimed at reducing the risk of unprotected sexual intercourse. DESIGN, SETTING, PARTICIPANTS, INTERVENTIONS, AND MAIN OUTCOME MEASURES: We conducted a 9-month, longitudinal randomized controlled trial with a multisite recruitment strategy including clinic, university, and social referrals, and compared the CAMI with didactic educational counseling in 572 female adolescents with a mean age of 17 years (SD = 2.2 years; range = 13-21 years; 59% African American) who were at risk for pregnancy and sexually transmitted diseases. The primary outcome was the acceptability of the CAMI according to self-reported rating scales. The secondary outcome was the reduction of pregnancy and sexually transmitted disease risk using a 9-month, self-report timeline follow-back calendar of unprotected sex. The CAMI was rated easy to use. Compared with the didactic educational counseling, there was a significant effect of the intervention which suggested that the CAMI helped reduce unprotected sex among participants who completed the study. However, because of the high attrition rate, the intent to treat analysis did not demonstrate a significant effect of the CAMI on reducing the rate of unprotected sex. Among those who completed the intervention, the CAMI reduced unprotected sex among an at-risk, predominantly minority sample of female adolescents. Modification of the CAMI to address methodological issues that contributed to a high drop-out rate are needed to make the intervention more acceptable and feasible for use among sexually active predominantly minority, at-risk, female adolescents. Copyright © 2016 North American Society for Pediatric and Adolescent Gynecology. Published by Elsevier Inc. All rights reserved.

  8. Metaldyne: Plant-Wide Assessment at Royal Oak Finds Opportunities to Improve Manufacturing Efficiency, Reduce Energy Use, and Achieve Significant Cost Savings

    Energy Technology Data Exchange (ETDEWEB)

    2005-05-01

    This case study prepared for the U.S. Department of Energy's Industrial Technologies Program describes a plant-wide energy assessment conducted at the Metaldyne, Inc., forging plant in Royal Oak, Michigan. The assessment focused on reducing the plant's operating costs, inventory, and energy use. If the company were to implement all the recommendations that came out of the assessment, its total annual energy savings for electricity would be about 11.5 million kWh and annual cost savings would be $12.6 million.

  9. Determining the haemodynamic significance of arterial stenosis: the relationship between CT angiography, computational fluid dynamics, and non-invasive fractional flow reserve

    International Nuclear Information System (INIS)

    Pang, C.L.; Alcock, R.; Pilkington, N.; Reis, T.; Roobottom, C.

    2016-01-01

    Coronary artery disease causes significant morbidity and mortality worldwide. Invasive coronary angiography (ICA) is currently the reference standard investigation. Fractional flow reserve (FFR) complements traditional ICA by providing extra information on blood flow, which has convincingly led to better patient management and improved cost-effectiveness. Computed tomography coronary angiography (CTCA) is suitable for the investigation of chest pain, especially in the low- and intermediate-risk groups. FFR generated using CT data (producing FFR-CT) may improve the positive predictive value of CTCA. The basic science of FFR-CT is like a “black box” to most imaging professionals. A fundamental principle is that good quality CTCA is likely to make any post-processing easier and more reliable. Both diagnostic and observational studies have suggested that the accuracy and the short-term outcome of using FFR-CT are both comparable with FFR in ICA. More multidisciplinary research with further refined diagnostic and longer-term observational studies will hopefully pinpoint the role of FFR-CT in existing clinical pathways.
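
    FFR itself is a simple ratio once the pressures are known: mean distal coronary pressure over mean aortic pressure during hyperaemia, with values at or below roughly 0.80 commonly treated as haemodynamically significant (the cut-off used in the related records here). A minimal sketch, with illustrative pressure values only:

    ```python
    # FFR = mean distal coronary pressure / mean aortic pressure during hyperaemia.
    def fractional_flow_reserve(p_distal_mmhg: float, p_aortic_mmhg: float) -> float:
        return p_distal_mmhg / p_aortic_mmhg

    ffr = fractional_flow_reserve(64.0, 92.0)       # illustrative pressures, not study data
    print(f"FFR = {ffr:.2f}", "-> significant" if ffr <= 0.80 else "-> not significant")
    ```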

  10. Diagnostic value of thallium-201 myocardial perfusion IQ-SPECT without and with computed tomography-based attenuation correction to predict clinically significant and insignificant fractional flow reserve

    Science.gov (United States)

    Tanaka, Haruki; Takahashi, Teruyuki; Ohashi, Norihiko; Tanaka, Koichi; Okada, Takenori; Kihara, Yasuki

    2017-01-01

    The aim of this study was to clarify the predictive value of fractional flow reserve (FFR) determined by myocardial perfusion imaging (MPI) using thallium (Tl)-201 IQ-SPECT without and with computed tomography-based attenuation correction (CT-AC) for patients with stable coronary artery disease (CAD). We assessed 212 angiographically identified diseased vessels using adenosine-stress Tl-201 MPI-IQ-SPECT/CT in 84 consecutive, prospectively identified patients with stable CAD. We compared the FFR in 136 of the 212 diseased vessels using visual semiquantitative interpretations of corresponding territories on MPI-IQ-SPECT images without and with CT-AC. FFR inversely correlated most accurately with regional summed difference scores (rSDS) in images without and with CT-AC (r = −0.584 and r = −0.568, respectively). This system can predict FFR at an optimal cut-off of <0.80, and we propose a novel application of CT-AC to MPI-IQ-SPECT for predicting clinically significant and insignificant FFR even in nonobese patients. PMID:29390486

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much large at roughly 11 MB per event of RAW. The central collisions are more complex and...

  12. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  13. Reducing radiation dose in the diagnosis of pulmonary embolism using adaptive statistical iterative reconstruction and lower tube potential in computed tomography

    International Nuclear Information System (INIS)

    Kaul, David; Grupp, Ulrich; Kahn, Johannes; Wiener, Edzard; Hamm, Bernd; Streitparth, Florian; Ghadjar, Pirus

    2014-01-01

    To assess the impact of ASIR (adaptive statistical iterative reconstruction) and lower tube potential on dose reduction and image quality in chest computed tomography angiographies (CTAs) of patients with pulmonary embolism. CT data from 44 patients with pulmonary embolism were acquired using different protocols - Group A: 120 kV, filtered back projection, n = 12; Group B: 120 kV, 40 % ASIR, n = 12; Group C: 100 kV, 40 % ASIR, n = 12 and Group D: 80 kV, 40 % ASIR, n = 8. Normalised effective dose was calculated; image quality was assessed quantitatively and qualitatively. Normalised effective dose in Group B was 33.8 % lower than in Group A (p = 0.014) and 54.4 % lower in Group C than in Group A (p < 0.001). Group A, B and C did not show significant differences in qualitative or quantitative analysis of image quality. Group D showed significantly higher noise levels in qualitative and quantitative analysis, significantly more artefacts and decreased overall diagnosability. Best results, considering dose reduction and image quality, were achieved in Group C. The combination of ASIR and lower tube potential is an option to reduce radiation without significant worsening of image quality in the diagnosis of pulmonary embolism. (orig.)

  14. Reducing radiation dose in the diagnosis of pulmonary embolism using adaptive statistical iterative reconstruction and lower tube potential in computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Kaul, David [Campus Virchow-Klinikum, Department of Radiation Oncology, Charite School of Medicine and University Hospital, Berlin (Germany); Charite School of Medicine and University Hospital, Department of Radiology, Berlin (Germany); Grupp, Ulrich; Kahn, Johannes; Wiener, Edzard; Hamm, Bernd; Streitparth, Florian [Charite School of Medicine and University Hospital, Department of Radiology, Berlin (Germany); Ghadjar, Pirus [Campus Virchow-Klinikum, Department of Radiation Oncology, Charite School of Medicine and University Hospital, Berlin (Germany)

    2014-11-15

    To assess the impact of ASIR (adaptive statistical iterative reconstruction) and lower tube potential on dose reduction and image quality in chest computed tomography angiographies (CTAs) of patients with pulmonary embolism. CT data from 44 patients with pulmonary embolism were acquired using different protocols - Group A: 120 kV, filtered back projection, n = 12; Group B: 120 kV, 40 % ASIR, n = 12; Group C: 100 kV, 40 % ASIR, n = 12 and Group D: 80 kV, 40 % ASIR, n = 8. Normalised effective dose was calculated; image quality was assessed quantitatively and qualitatively. Normalised effective dose in Group B was 33.8 % lower than in Group A (p = 0.014) and 54.4 % lower in Group C than in Group A (p < 0.001). Group A, B and C did not show significant differences in qualitative or quantitative analysis of image quality. Group D showed significantly higher noise levels in qualitative and quantitative analysis, significantly more artefacts and decreased overall diagnosability. Best results, considering dose reduction and image quality, were achieved in Group C. The combination of ASIR and lower tube potential is an option to reduce radiation without significant worsening of image quality in the diagnosis of pulmonary embolism. (orig.)

  15. Utilizing Commercial Hardware and Open Source Computer Vision Software to Perform Motion Capture for Reduced Gravity Flight

    Science.gov (United States)

    Humphreys, Brad; Bellisario, Brian; Gallo, Christopher; Thompson, William K.; Lewandowski, Beth

    2016-01-01

    Long duration space travel to Mars or to an asteroid will expose astronauts to extended periods of reduced gravity. Since gravity is not present to aid loading, astronauts will use resistive and aerobic exercise regimes for the duration of the space flight to minimize the loss of bone density, muscle mass and aerobic capacity that occurs during exposure to a reduced gravity environment. Unlike on the International Space Station (ISS), the area available for an exercise device in the next generation of spacecraft is limited. Therefore, compact resistance exercise device prototypes are being developed. The NASA Digital Astronaut Project (DAP) is supporting the Advanced Exercise Concepts (AEC) Project, the Exercise Physiology and Countermeasures (ExPC) project and National Space Biomedical Research Institute (NSBRI) funded researchers by developing computational models of exercising with these new advanced exercise device concepts. To perform validation of these models and to support the Advanced Exercise Concepts Project, several candidate devices have been flown onboard NASA's Reduced Gravity Aircraft. In terrestrial laboratories, researchers typically have motion capture systems available for the measurement of subject kinematics. Onboard the parabolic flight aircraft it is not practical to utilize traditional motion capture systems due to the large working volume they require and their relatively high replacement cost if damaged. To support measuring kinematics on board parabolic aircraft, a motion capture system is being developed utilizing open source computer vision code with commercial off-the-shelf (COTS) video camera hardware. While the system's accuracy is lower than that of laboratory setups, it provides a means to produce quantitative comparison motion capture kinematic data. Additionally, data such as the required exercise volume for small spaces such as the Orion capsule can be determined. METHODS: OpenCV is an open source computer vision library that provides the
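
    A minimal version of the COTS-camera idea described here is to track a brightly coloured marker frame by frame with OpenCV, using colour thresholding and the centroid of the largest blob. The file name, colour range, and camera source below are placeholders, and this is a sketch of the general approach rather than the project's actual pipeline.

    ```python
    # Sketch: single-marker tracking from COTS video with OpenCV (colour threshold
    # + centroid of the largest contour). Colour bounds and file name are assumptions.
    import cv2

    cap = cv2.VideoCapture("parabolic_flight_clip.mp4")     # hypothetical recording
    trajectory = []                                          # (frame_idx, x_px, y_px)
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, (35, 80, 80), (85, 255, 255))   # green-ish marker
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if contours:
            largest = max(contours, key=cv2.contourArea)
            m = cv2.moments(largest)
            if m["m00"] > 0:
                trajectory.append((frame_idx, m["m10"] / m["m00"], m["m01"] / m["m00"]))
        frame_idx += 1
    cap.release()
    print(f"tracked marker in {len(trajectory)} frames")
    ```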

  16. Soluble CD36 and risk markers of insulin resistance and atherosclerosis are elevated in polycystic ovary syndrome and significantly reduced during pioglitazone treatment

    DEFF Research Database (Denmark)

    Glintborg, Dorte; Højlund, Kurt; Andersen, Marianne

    2007-01-01

    Objective: We investigated the relation between soluble CD36 (sCD36), risk markers of atherosclerosis and body composition, and glucose and lipid metabolism in polycystic ovary syndrome (PCOS). Research Design and Methods: Thirty PCOS patients were randomized to pioglitazone, 30 mg/day, or placebo... units), oxLDL (44.9 (26.9 - 75.1) vs. 36.1 (23.4 - 55.5) U/l), and hsCRP (0.26 (0.03 - 2.41) vs. 0.12 (0.02 - 0.81) mg/dl) were significantly increased in PCOS patients vs. controls (geometric mean (+/- 2SD)). In PCOS, positive correlations were found between central fat mass and sCD36 (r=0.43), hsCRP (r=0.43), and IL-6 (r=0.42), all significant. In PCOS patients and controls (n=44), sCD36 and oxLDL were significant...

  17. Does Liposomal Bupivacaine (Exparel) Significantly Reduce Postoperative Pain/Numbness in Symptomatic Teeth with a Diagnosis of Necrosis? A Prospective, Randomized, Double-blind Trial.

    Science.gov (United States)

    Glenn, Brandon; Drum, Melissa; Reader, Al; Fowler, Sara; Nusstein, John; Beck, Mike

    2016-09-01

    Medical studies have shown some potential for infiltrations of liposomal bupivacaine (Exparel; Pacira Pharmaceuticals, San Diego, CA), a slow-release bupivacaine solution, to extend postoperative benefits of numbness/pain relief for up to several days. Because the Food and Drug Administration has approved Exparel only for infiltrations, we wanted to evaluate if it would be effective as an infiltration to control postoperative pain. The purpose of this study was to compare an infiltration of bupivacaine with liposomal bupivacaine for postoperative numbness and pain in symptomatic patients diagnosed with pulpal necrosis experiencing moderate to severe preoperative pain. One hundred patients randomly received a 4.0-mL buccal infiltration of either bupivacaine or liposomal bupivacaine after endodontic debridement. For postoperative pain, patients were given ibuprofen/acetaminophen, and they could receive narcotic pain medication as an escape. Patients recorded their level of numbness, pain, and medication use the night of the appointment and over the next 5 days. Success was defined as no or mild postoperative pain and no narcotic use. The success rate was 29% for the liposomal group and 22% for the bupivacaine group, with no significant difference (P = .4684) between the groups. Liposomal bupivacaine had some effect on soft tissue numbness, pain, and use of non-narcotic medications, but it was not clinically significant. There was no significant difference in the need for escape medication. For symptomatic patients diagnosed with pulpal necrosis experiencing moderate to severe preoperative pain, a 4.0-mL infiltration of liposomal bupivacaine did not result in a statistically significant increase in postoperative success compared with an infiltration of 4.0 mL bupivacaine. Copyright © 2016 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  18. On the computational assessment of white matter hyperintensity progression: difficulties in method selection and bias field correction performance on images with significant white matter pathology

    Energy Technology Data Exchange (ETDEWEB)

    Valdes Hernandez, Maria del C.; Gonzalez-Castro, Victor; Wang, Xin; Doubal, Fergus; Munoz Maniega, Susana; Wardlaw, Joanna M. [Centre for Clinical Brain Sciences, Department of Neuroimaging Sciences, Edinburgh (United Kingdom); Ghandour, Dina T. [University of Edinburgh, College of Medicine and Veterinary Medicine, Edinburgh (United Kingdom); Armitage, Paul A. [University of Sheffield, Department of Cardiovascular Sciences, Sheffield (United Kingdom)

    2016-05-15

    Subtle inhomogeneities in the scanner's magnetic fields (B{sub 0} and B{sub 1}) alter the intensity levels of the structural magnetic resonance imaging (MRI) affecting the volumetric assessment of WMH changes. Here, we investigate the influence that (1) correcting the images for the B{sub 1} inhomogeneities (i.e. bias field correction (BFC)) and (2) selection of the WMH change assessment method can have on longitudinal analyses of WMH progression and discuss possible solutions. We used brain structural MRI from 46 mild stroke patients scanned at stroke onset and 3 years later. We tested three BFC approaches: FSL-FAST, N4 and exponentially entropy-driven homomorphic unsharp masking (E{sup 2}D-HUM) and analysed their effect on the measured WMH change. Separately, we tested two methods to assess WMH changes: measuring WMH volumes independently at both time points semi-automatically (MCMxxxVI) and subtracting intensity-normalised FLAIR images at both time points following image gamma correction. We then combined the BFC with the computational method that performed best across the whole sample to assess WMH changes. Analysis of the difference in the variance-to-mean intensity ratio in normal tissue between BFC and uncorrected images and visual inspection showed that all BFC methods altered the WMH appearance and distribution, but FSL-FAST in general performed more consistently across the sample and MRI modalities. The WMH volume change over 3 years obtained with MCMxxxVI with vs. without FSL-FAST BFC did not significantly differ (medians(IQR)(with BFC) = 3.2(6.3) vs. 2.9(7.4)ml (without BFC), p = 0.5), but both differed significantly from the WMH volume change obtained from subtracting post-processed FLAIR images (without BFC)(7.6(8.2)ml, p < 0.001). This latter method considerably inflated the WMH volume change as subtle WMH at baseline that became more intense at follow-up were counted as increase in the volumetric change. Measurement of WMH volume change remains
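
    Of the three bias-field correction methods compared, N4 has a widely available open-source implementation; the sketch below applies SimpleITK's N4 filter to a FLAIR volume. The file names are placeholders and the parameters are generic defaults, not the pipeline used in the study.

    ```python
    # Sketch: N4 bias field correction of a FLAIR image with SimpleITK
    # (illustrative defaults; not the study's exact processing chain).
    import SimpleITK as sitk

    image = sitk.ReadImage("flair.nii.gz", sitk.sitkFloat32)        # hypothetical file
    mask = sitk.OtsuThreshold(image, 0, 1, 200)                     # rough head mask

    # Estimate the field on a shrunk copy for speed, then apply it at full resolution.
    shrunk_image = sitk.Shrink(image, [4] * image.GetDimension())
    shrunk_mask = sitk.Shrink(mask, [4] * image.GetDimension())

    corrector = sitk.N4BiasFieldCorrectionImageFilter()
    corrector.Execute(shrunk_image, shrunk_mask)

    log_bias = corrector.GetLogBiasFieldAsImage(image)
    corrected = image / sitk.Exp(log_bias)
    sitk.WriteImage(corrected, "flair_n4.nii.gz")
    ```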

  19. Reduced estimated glomerular filtration rate (eGFR <60 mL/min/1.73 m2) at first transurethral resection of bladder tumour is a significant predictor of subsequent recurrence and progression.

    Science.gov (United States)

    Blute, Michael L; Kucherov, Victor; Rushmer, Timothy J; Damodaran, Shivashankar; Shi, Fangfang; Abel, E Jason; Jarrard, David F; Richards, Kyle A; Messing, Edward M; Downs, Tracy M

    2017-09-01

    To evaluate if moderate chronic kidney disease [CKD; estimated glomerular filtration rate (eGFR) <60 mL/min/1.73 m 2 ] is associated with high rates of non-muscle-invasive bladder cancer (NMIBC) recurrence or progression. A multi-institutional database identified patients with serum creatinine values prior to first transurethral resection of bladder tumour (TURBT). The CKD-epidemiology collaboration formula calculated patient eGFR. Cox proportional hazards models evaluated associations with recurrence-free (RFS) and progression-free survival (PFS). In all, 727 patients were identified, with a median (interquartile range [IQR]) patient age of 69.8 (60.1-77.6) years. Data for eGFR were available for 632 patients. During a median (IQR) follow-up of 3.7 (1.5-6.5) years, 400 (55%) patients had recurrence and 145 (19.9%) patients had progression of tumour stage or grade. Moderate or severe CKD was identified in 183 patients according to eGFR. Multivariable analysis identified an eGFR of <60 mL/min/1.73 m 2 (hazard ratio [HR] 1.5, 95% confidence interval [CI]: 1.2-1.9; P = 0.002) as a predictor of tumour recurrence. The 5-year RFS rate was 46% for patients with an eGFR of ≥60 mL/min/1.73 m 2 and 27% for patients with an eGFR of <60 mL/min/1.73 m 2 . An eGFR of <60 mL/min/1.73 m 2 (HR 3.7, 95% CI: 1.75-7.94; P = 0.001) was also associated with progression to muscle-invasive disease. The 5-year PFS rate was 83% for patients with an eGFR of ≥60 mL/min/1.73 m 2 and 71% for patients with an eGFR of <60 mL/min/1.73 m 2 (P = 0.01). Moderate CKD at first TURBT is associated with reduced RFS and PFS. Patients with reduced renal function should be considered for increased surveillance. © 2017 The Authors BJU International © 2017 BJU International Published by John Wiley & Sons Ltd.
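
    For context, the CKD-EPI (2009) creatinine equation referenced above can be computed directly. The following is a small illustrative sketch (not code from the study), using the standard published coefficients; the example patient is hypothetical.

        def ckd_epi_egfr(scr_mg_dl: float, age: float, female: bool, black: bool = False) -> float:
            """Estimated GFR (mL/min/1.73 m^2) from the 2009 CKD-EPI creatinine equation."""
            kappa = 0.7 if female else 0.9
            alpha = -0.329 if female else -0.411
            ratio = scr_mg_dl / kappa
            egfr = 141.0 * min(ratio, 1.0) ** alpha * max(ratio, 1.0) ** -1.209 * 0.993 ** age
            if female:
                egfr *= 1.018
            if black:
                egfr *= 1.159
            return egfr

        # Hypothetical example: a 70-year-old man with serum creatinine 1.4 mg/dL.
        print(round(ckd_epi_egfr(1.4, 70, female=False), 1))  # ~50, i.e. moderate CKD (<60)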

  20. Denver screening protocol for blunt cerebrovascular injury reduces the use of multi-detector computed tomography angiography.

    Science.gov (United States)

    Beliaev, Andrei M; Barber, P Alan; Marshall, Roger J; Civil, Ian

    2014-06-01

    Blunt cerebrovascular injury (BCVI) occurs in 0.2-2.7% of blunt trauma patients and has up to 30% mortality. Conventional screening does not recognize up to 20% of BCVI patients. To improve diagnosis of BCVI, both an expanded battery of screening criteria and multi-detector computed tomography angiography (CTA) have been suggested. The aim of this study is to investigate whether the use of CTA restricted to the Denver protocol screen-positive patients would reduce the unnecessary use of CTA as a pre-emptive screening tool. This is a registry-based study of blunt trauma patients admitted to Auckland City Hospital from 1998 to 2012. The diagnosis of BCVI was confirmed or excluded with CTA, magnetic resonance angiography and, if these imaging studies were inconclusive, four-vessel digital subtraction angiography. Thirty (61%) BCVI and 19 (39%) non-BCVI patients met eligibility criteria. The Denver protocol applied to our cohort of patients had a sensitivity of 97% (95% confidence interval (CI): 83-100%) and a specificity of 42% (95% CI: 20-67%). With a prevalence of BCVI in blunt trauma patients of 0.2% and 2.7%, post-test odds of a screen-positive test were 0.003 (95% CI: 0.002-0.005) and 0.046 (95% CI: 0.031-0.068), respectively. Restricting CTA to Denver protocol screen-positive trauma patients can decrease the use of CTA as a pre-emptive screening tool by 95-97% and reduce its hazards. © 2013 Royal Australasian College of Surgeons.
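
    The post-test odds quoted above follow from the positive likelihood ratio of the screen. A brief illustrative calculation (not taken from the paper's own analysis code) is sketched below using the sensitivity, specificity and prevalence values given in the abstract.

        def post_test_odds(sensitivity: float, specificity: float, prevalence: float) -> float:
            """Odds of BCVI after a positive Denver screen, via the positive likelihood ratio."""
            lr_positive = sensitivity / (1.0 - specificity)   # LR+ = sens / (1 - spec)
            pre_test_odds = prevalence / (1.0 - prevalence)   # convert prevalence to odds
            return pre_test_odds * lr_positive

        for prevalence in (0.002, 0.027):                     # 0.2% and 2.7% BCVI prevalence
            odds = post_test_odds(sensitivity=0.97, specificity=0.42, prevalence=prevalence)
            print(f"prevalence {prevalence:.1%} -> post-test odds {odds:.3f}")
        # Prints approximately 0.003 and 0.046, matching the point estimates in the abstract.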

  1. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  2. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  5. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  8. Intra-articular laser treatment plus Platelet Rich Plasma (PRP) significantly reduces pain in many patients who had failed prior PRP treatment

    Science.gov (United States)

    Prodromos, Chadwick C.; Finkle, Susan; Dawes, Alexander; Dizon, Angelo

    2018-02-01

    INTRODUCTION: In our practice, Platelet Rich Plasma (PRP) injections effectively reduce pain in most but not all arthritic patients. However, for patients who fail PRP treatment, no good alternative currently exists except total joint replacement surgery. Low level laser therapy (LLLT) on the surface of the skin has not been helpful for arthritis patients in our experience. However, we hypothesized that intra-articular laser treatment would be an effective augmentation to PRP injection and would increase its efficacy in patients who had failed prior PRP injection alone. METHODS: We offered Intra-articular Low Level Laser Therapy (IAL) treatment in conjunction with repeat PRP injection to patients who had received no benefit from PRP injection alone at our center. They were the treatment group. They were not charged for PRP or IAL. They also served as a historical control group since they had all failed prior PRP treatment alone. 28 patients (30 joints) accepted treatment after informed consent. 22 knees, 4 hips, 2 shoulder glenohumeral joints and 1 first carpo-metacarpal (1st CMC) joint were treated. RESULTS: All patients were followed up at 1 month and no adverse events were seen from the treatment. At 6 months post treatment 46% of patients had good outcomes, and at 1 year 17% still showed improvement after treatment. 11 patients failed treatment and went on to joint replacement. DISCUSSION: A single treatment of IAL with PRP salvaged 46% of patients who had failed PRP treatment alone, allowing avoidance of surgery and good pain control.

  9. Towards reducing impact-induced brain injury: lessons from a computational study of army and football helmet pads.

    Science.gov (United States)

    Moss, William C; King, Michael J; Blackman, Eric G

    2014-01-01

    We use computational simulations to compare the impact response of different football and U.S. Army helmet pad materials. We conduct experiments to characterise the material response of different helmet pads. We simulate experimental helmet impact tests performed by the U.S. Army to validate our methods. We then simulate a cylindrical impactor striking different pads. The acceleration history of the impactor is used to calculate the head injury criterion for each pad. We conduct sensitivity studies exploring the effects of pad composition, geometry and material stiffness. We find that (1) the football pad materials do not outperform the currently used military pad material in militarily relevant impact scenarios; (2) optimal material properties for a pad depend on impact energy and (3) thicker pads perform better at all velocities. Although we considered only the isolated response of pad materials, not entire helmet systems, our analysis suggests that by using larger helmet shells with correspondingly thicker pads, impact-induced traumatic brain injury may be reduced.
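
    As a rough illustration of the head injury criterion (HIC) mentioned above, the sketch below computes HIC15 from a sampled acceleration history. It is a generic textbook implementation, not the authors' simulation code, and the impact pulse used is purely synthetic.

        import numpy as np

        def hic(time_s: np.ndarray, accel_g: np.ndarray, max_window_s: float = 0.015) -> float:
            """Head Injury Criterion: max over windows (t2 - t1) <= max_window_s of
            (t2 - t1) * [ (1 / (t2 - t1)) * integral of a(t) dt ] ** 2.5, with a in g."""
            # Cumulative integral of acceleration (trapezoidal rule).
            cum = np.concatenate(([0.0], np.cumsum(0.5 * (accel_g[1:] + accel_g[:-1]) * np.diff(time_s))))
            best = 0.0
            for i in range(len(time_s)):
                for j in range(i + 1, len(time_s)):
                    dt = time_s[j] - time_s[i]
                    if dt > max_window_s:
                        break
                    avg = max((cum[j] - cum[i]) / dt, 0.0)  # mean acceleration over the window
                    best = max(best, dt * avg ** 2.5)
            return best

        # Synthetic half-sine impact pulse: 150 g peak over 10 ms, sampled at 10 kHz.
        t = np.linspace(0.0, 0.02, 201)
        a = np.where(t <= 0.010, 150.0 * np.sin(np.pi * t / 0.010), 0.0)
        print(f"HIC15 = {hic(t, a):.0f}")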

  10. Glucagon-like peptide-1 acutely affects renal blood flow and urinary flow rate in spontaneously hypertensive rats despite significantly reduced renal expression of GLP-1 receptors.

    Science.gov (United States)

    Ronn, Jonas; Jensen, Elisa P; Wewer Albrechtsen, Nicolai J; Holst, Jens Juul; Sorensen, Charlotte M

    2017-12-01

    Glucagon-like peptide-1 (GLP-1) is an incretin hormone increasing postprandial insulin release. GLP-1 also induces diuresis and natriuresis in humans and rodents. The GLP-1 receptor is extensively expressed in the renal vascular tree in normotensive rats, where acute GLP-1 treatment leads to increased mean arterial pressure (MAP) and increased renal blood flow (RBF). In hypertensive animal models, GLP-1 has been reported both to increase and decrease MAP. The aim of this study was to examine the expression of renal GLP-1 receptors in spontaneously hypertensive rats (SHR) and to assess the effect of acute intrarenal infusion of GLP-1. We hypothesized that GLP-1 would increase diuresis and natriuresis and reduce MAP in SHR. Immunohistochemical staining and in situ hybridization for the GLP-1 receptor were used to localize GLP-1 receptors in the kidney. Sevoflurane-anesthetized normotensive Sprague-Dawley rats and SHR received a 20 min intrarenal infusion of GLP-1, and changes in MAP, RBF, heart rate, diuresis, and natriuresis were measured. The vasodilatory effect of GLP-1 was assessed in isolated interlobar arteries from normo- and hypertensive rats. We found no expression of GLP-1 receptors in the kidney from SHR. However, acute intrarenal infusion of GLP-1 increased MAP, RBF, diuresis, and natriuresis without affecting heart rate in both rat strains. These results suggest that the acute renal effects of GLP-1 in SHR are caused either by extrarenal GLP-1 receptors activating other mechanisms (e.g., insulin) to induce the renal changes observed or possibly by an alternative renal GLP-1 receptor. © 2017 The Authors. Physiological Reports published by Wiley Periodicals, Inc. on behalf of The Physiological Society and the American Physiological Society.

  11. Human Tubal-Derived Mesenchymal Stromal Cells Associated with Low Level Laser Therapy Significantly Reduces Cigarette Smoke-Induced COPD in C57BL/6 mice.

    Directory of Open Access Journals (Sweden)

    Jean Pierre Schatzmann Peron

    Cigarette smoke-induced chronic obstructive pulmonary disease (COPD) is a very debilitating disease, with a very high prevalence worldwide, which results in a substantial economic and social burden. Therefore, new therapeutic approaches to treat these patients are of unquestionable relevance. The use of mesenchymal stromal cells (MSCs) is an innovative and accessible approach for acute and chronic pulmonary diseases, mainly due to their important immunoregulatory, anti-fibrogenic, anti-apoptotic and pro-angiogenic properties. In addition, adjuvant therapies aimed at boosting or synergizing with MSC function should be tested. Low level laser (LLL) therapy is a relatively new and promising approach, with very low cost, no invasiveness and no side effects. Here, we aimed to study the effectiveness of cell therapy with human tubal-derived MSCs (htMSCs) associated with 30 mW/3 J, 660 nm LLL irradiation in experimental cigarette smoke-induced chronic obstructive pulmonary disease. C57BL/6 mice were exposed to cigarette smoke for 75 days (twice a day) and all experiments were performed on day 76. Experimental groups received htMSCs either intraperitoneally or intranasally and/or LLL irradiation, either alone or in combination. We show that co-therapy greatly reduces lung inflammation, lowering the cellular infiltrate and pro-inflammatory cytokine secretion (IL-1β, IL-6, TNF-α and KC), which was followed by decreased mucus production, collagen accumulation and tissue damage. These findings seemed to be secondary to the reduction of both NF-κB and NF-AT activation in lung tissues, with a concomitant increase in IL-10. In summary, our data suggest that the concomitant use of MSCs + LLLT may be a promising therapeutic approach for lung inflammatory diseases such as COPD.

  12. Peak medial (but not lateral) hamstring activity is significantly lower during stance phase of running. An EMG investigation using a reduced gravity treadmill.

    Science.gov (United States)

    Hansen, Clint; Einarson, Einar; Thomson, Athol; Whiteley, Rodney

    2017-09-01

    The hamstrings are seen to work during late swing phase (presumably to decelerate the extending shank) and then during stance phase (presumably stabilizing the knee and contributing to horizontal force production during propulsion) of running. A better understanding of this hamstring activation during running may contribute to injury prevention and performance enhancement (targeting the specific role via specific contraction mode). Twenty active adult males underwent surface EMG recordings of their medial and lateral hamstrings while running on a reduced gravity treadmill. Participants ran under 36 different conditions, combining 50%-100% bodyweight (10% increments) and 6-16 km/h (2 km/h increments), for a minimum of 6 strides of each leg (maximum 32). EMG was normalized to the peak value seen for each individual during any stride in any trial to describe relative activation levels during gait. Increasing running speed effected greater increases in EMG for all muscles than did altering bodyweight. Peak EMG for the lateral hamstrings during running trials was similar for both swing and stance phase, whereas the medial hamstrings showed an approximately 20% reduction during stance compared to swing phase. It is suggested that the lateral hamstrings work equally hard during swing and stance phase, whereas the medial hamstrings are loaded slightly less during each stance phase. This likely helps explain the higher incidence of lateral hamstring injury. Hamstring injury prevention and rehabilitation programs incorporating running should consider running speed as a more potent stimulus for increasing hamstring muscle activation than impact loading. Copyright © 2017 Elsevier B.V. All rights reserved.
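
    The peak-normalisation step described above is straightforward to reproduce. The snippet below is a generic, hypothetical sketch (array names and values are placeholders, not the study's data) of expressing a rectified EMG envelope as a fraction of each individual's peak value across all trials.

        import numpy as np

        def normalise_to_individual_peak(trials: list[np.ndarray]) -> list[np.ndarray]:
            """Express each trial's EMG as a fraction of the subject's single highest
            value observed in any stride of any trial (so the peak equals 1.0)."""
            envelopes = [np.abs(trial) for trial in trials]        # full-wave rectification
            peak = max(env.max() for env in envelopes)             # subject-specific peak
            return [env / peak for env in envelopes]

        # Hypothetical example: three trials of raw EMG samples for one subject.
        rng = np.random.default_rng(0)
        raw_trials = [rng.normal(0.0, scale, 2000) for scale in (0.2, 0.5, 0.35)]
        normalised = normalise_to_individual_peak(raw_trials)
        print(max(trial.max() for trial in normalised))  # 1.0 by construction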

  13. Clinical evaluation of reducing acquisition time on single-photon emission computed tomography image quality using proprietary resolution recovery software.

    Science.gov (United States)

    Aldridge, Matthew D; Waddington, Wendy W; Dickson, John C; Prakash, Vineet; Ell, Peter J; Bomanji, Jamshed B

    2013-11-01

    A three-dimensional model-based resolution recovery (RR) reconstruction algorithm that compensates for collimator-detector response, resulting in an improvement in reconstructed spatial resolution and signal-to-noise ratio of single-photon emission computed tomography (SPECT) images, was tested. The software is said to retain image quality even with reduced acquisition time. Clinically, any improvement in patient throughput without loss of quality is to be welcomed. Furthermore, future restrictions in radiotracer supplies may add value to this type of data analysis. The aims of this study were to assess improvement in image quality using the software and to evaluate the potential of performing reduced time acquisitions for bone and parathyroid SPECT applications. Data acquisition was performed using the local standard SPECT/CT protocols for 99mTc-hydroxymethylene diphosphonate bone and 99mTc-methoxyisobutylisonitrile parathyroid SPECT imaging. The principal modification applied was the acquisition of an eight-frame gated data set acquired using an ECG simulator with a fixed signal as the trigger. This had the effect of partitioning the data such that the effect of reduced time acquisitions could be assessed without conferring additional scanning time on the patient. The set of summed data sets was then independently reconstructed using the RR software to permit a blinded assessment of the effect of acquired counts upon reconstructed image quality as adjudged by three experienced observers. Data sets reconstructed with the RR software were compared with the local standard processing protocols: filtered back-projection and ordered-subset expectation-maximization. Thirty SPECT studies were assessed (20 bone and 10 parathyroid). The images reconstructed with the RR algorithm showed improved image quality for both full-time and half-time acquisitions over local current processing protocols. In conclusion, the RR software improved image quality compared with local processing protocols and has been
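
    The partitioning trick described above (summing k of the 8 identically triggered gated frames to emulate k/8 of the full acquisition time) is easy to illustrate on an array of projection data. The sketch below is purely schematic; the array shape and variable names are assumptions, not taken from the study.

        import numpy as np

        def emulate_reduced_time(gated_projections: np.ndarray, frames_to_sum: int) -> np.ndarray:
            """Sum the first `frames_to_sum` gated frames (axis 0) to emulate a scan
            acquired in frames_to_sum / total_frames of the full acquisition time."""
            return gated_projections[:frames_to_sum].sum(axis=0)

        # Hypothetical gated SPECT projections: 8 frames x 60 angles x 64 x 64 pixels.
        rng = np.random.default_rng(1)
        gated = rng.poisson(lam=2.0, size=(8, 60, 64, 64)).astype(np.float32)

        full_time = emulate_reduced_time(gated, 8)   # all counts, i.e. full acquisition
        half_time = emulate_reduced_time(gated, 4)   # ~half the counts, emulating half time
        print(full_time.sum() / half_time.sum())     # roughly 2, as expected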

  14. Variability of the Cyclin-Dependent Kinase 2 Flexibility Without Significant Change in the Initial Conformation of the Protein or Its Environment; a Computational Study.

    Science.gov (United States)

    Taghizadeh, Mohammad; Goliaei, Bahram; Madadkar-Sobhani, Armin

    2016-06-01

    Protein flexibility, which has been referred to as a dynamic behavior, has various roles in proteins' functions. Furthermore, for some tools developed in bioinformatics, such as protein-protein docking software, considering protein flexibility yields a higher degree of accuracy. In the present work, we quantified and analyzed the variations in the flexibility of the human Cyclin-Dependent Kinase 2 (hCDK2) protein without any significant change in its initial environment or in the protein itself. The main goals of the present research were to calculate flexibility variations for each residue of hCDK2, to analyze these variations through clustering, and to investigate the functional aspects of the residues with high flexibility variations. Using the Gromacs package (version 4.5.4), three independent molecular dynamics (MD) simulations of the hCDK2 protein (PDB ID: 1HCL) were performed with no significant changes in their initial environments, structures, or conformations, followed by Root Mean Square Fluctuation (RMSF) calculation on these MD trajectories. The amount of variation among these three RMSF curves was calculated using two formulas. More than 50% of the variation in flexibility (the distance between the maximum and the minimum RMSF) was found in the region of Val-154. Other residues also showed major flexibility fluctuations, and these residues were mostly positioned in the vicinity of the functional residues. All hCDK2 residues were then clustered into four groups according to the amount of their flexibility variability and their position in the RMSF curves. This work introduces a new way of characterizing the flexibility of protein residues. By describing a new dynamic aspect of hCDK2 and, by extension, of other similar globular proteins, it could also help in designing and engineering proteins. In
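
    Per-residue RMSF of the kind analysed above can also be computed from an MD trajectory with MDAnalysis. The sketch below is a minimal, hypothetical example (file names are placeholders, and MDAnalysis >= 2.0 is assumed for the results attribute); the original work used GROMACS tooling, so this is an alternative route rather than the authors' procedure. The trajectory is fitted to a reference frame first, since RMSF is only meaningful after removing global motion.

        import MDAnalysis as mda
        from MDAnalysis.analysis import align
        from MDAnalysis.analysis.rms import RMSF

        # Placeholder topology/trajectory names for one replicate simulation.
        u = mda.Universe("hcdk2.tpr", "hcdk2_replica1.xtc")

        # Remove global rotation/translation by least-squares fitting to the first frame.
        align.AlignTraj(u, u, select="protein and name CA", in_memory=True).run()

        # RMSF of the C-alpha atoms, one value per residue.
        calphas = u.select_atoms("protein and name CA")
        rmsf = RMSF(calphas).run()

        for residue, value in zip(calphas.residues, rmsf.results.rmsf):
            print(residue.resid, residue.resname, round(float(value), 3))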

  15. A Computer-Based Glucose Management System Reduces the Incidence of Forgotten Glucose Measurements: A Retrospective Observational Study.

    Science.gov (United States)

    Okura, Tsuyoshi; Teramoto, Kei; Koshitani, Rie; Fujioka, Yohei; Endo, Yusuke; Ueki, Masaru; Kato, Masahiko; Taniguchi, Shin-Ichi; Kondo, Hiroshi; Yamamoto, Kazuhiro

    2018-04-17

    Frequent glucose measurements are needed for good blood glucose control in hospitals; however, this requirement means that measurements can be forgotten. We developed a novel glucose management system using an iPod® and electronic health records. A time schedule system for glucose measurement was developed using point-of-care testing, an iPod®, and electronic health records. The system contains the glucose measurement schedule, and an alarm sounds if a measurement is forgotten. The number of times measurements were forgotten was analyzed. Approximately 7000 glucose measurements were recorded per month. Before implementation of the system, the average number of times measurements were forgotten was 4.8 times per month. This significantly decreased to 2.6 times per month after the system started. We also analyzed the incidence of forgotten glucose measurements as a proportion of the total number of measurements for each period and found a significant difference between the two 9-month periods (43/64,049 vs. 24/65,870; P = 0.014, chi-squared test). This computer-based blood glucose monitoring system is useful for the management of glucose monitoring in hospitals. Johnson & Johnson Japan.
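
    The before/after comparison of forgotten-measurement rates reported above is a standard two-proportion chi-squared test. An illustrative re-calculation with SciPy is sketched below; it is not the authors' code, and the p-value may differ slightly from the published one depending on whether a continuity correction is applied.

        from scipy.stats import chi2_contingency

        # Rows: before vs. after system introduction; columns: forgotten vs. performed on time.
        table = [
            [43, 64_049 - 43],   # 9 months before: 43 forgotten out of 64,049 measurements
            [24, 65_870 - 24],   # 9 months after:  24 forgotten out of 65,870 measurements
        ]

        chi2, p_value, dof, expected = chi2_contingency(table)
        print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}")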

  16. Discovery of Highly Potent Tyrosinase Inhibitor, T1, with Significant Anti-Melanogenesis Ability by zebrafish in vivo Assay and Computational Molecular Modeling

    Science.gov (United States)

    Chen, Wang-Chuan; Tseng, Tien-Sheng; Hsiao, Nai-Wan; Lin, Yun-Lian; Wen, Zhi-Hong; Tsai, Chin-Chuan; Lee, Yu-Ching; Lin, Hui-Hsiung; Tsai, Keng-Chang

    2015-01-01

    Tyrosinase is involved in melanin biosynthesis, and the abnormal accumulation of melanin pigments leads to hyperpigmentation disorders that can be treated with depigmenting agents. A natural product, T1, bis(4-hydroxybenzyl)sulfide, isolated from the Chinese herbal plant Gastrodia elata, is a strong competitive inhibitor of mushroom tyrosinase (IC50 = 0.53 μM, Ki = 58 +/- 6 nM), outperforming kojic acid. Cell viability and melanin quantification assays demonstrate that 50 μM T1 attenuates the melanin content of normal human melanocytes by approximately 20% without significant cell toxicity. Moreover, a zebrafish in vivo assay reveals that T1 effectively reduces melanogenesis with no adverse side effects. An acute oral toxicity study confirms that the T1 molecule is free of discernible cytotoxicity in mice. Furthermore, molecular modeling demonstrates that the sulfur atom of T1 coordinating with the copper ions in the active site of tyrosinase is essential for mushroom tyrosinase inhibition and for the ability to diminish human melanin synthesis. These results show that T1 isolated from Gastrodia elata is a promising candidate for developing potent pharmacological and cosmetic skin-whitening agents.

  17. The chemical digestion of Ti6Al7Nb scaffolds produced by Selective Laser Melting reduces significantly ability of Pseudomonas aeruginosa to form biofilm.

    Science.gov (United States)

    Junka, Adam F; Szymczyk, Patrycja; Secewicz, Anna; Pawlak, Andrzej; Smutnicka, Danuta; Ziółkowski, Grzegorz; Bartoszewicz, Marzenna; Chlebus, Edward

    2016-01-01

    In our previous work we reported the impact of the hydrofluoric and nitric acid used for chemical polishing of Ti-6Al-7Nb scaffolds on the decrease in the number of Staphylococcus aureus biofilm-forming cells. Herein, we tested the impact of the aforementioned substances on the biofilm of a Gram-negative microorganism, Pseudomonas aeruginosa, a dangerous pathogen responsible for a plethora of implant-related infections. The Ti-6Al-7Nb scaffolds were manufactured using the Selective Laser Melting method. Scaffolds were subjected to chemical polishing using a mixture of nitric acid and fluoride or left intact (control group). Pseudomonal biofilm was allowed to form on the scaffolds for 24 hours and was then removed by mechanical vortex shaking. The number of pseudomonal cells was estimated by means of quantitative culture and Scanning Electron Microscopy. The presence of nitric acid and fluoride on the scaffold surfaces was assessed by means of infrared (IR) and X-ray spectroscopy. Quantitative data were analysed using the Mann-Whitney test (P ≤ 0.05). Our results indicate that the application of chemical polishing correlates with a significant drop in the number of biofilm-forming pseudomonal cells on the manufactured Ti-6Al-7Nb scaffolds (p = 0.0133, Mann-Whitney test) compared to non-polished scaffolds. As X-ray photoelectron spectroscopy revealed the presence of fluoride and nitrogen on the scaffold surface, we speculate that the drop in biofilm-forming cells may be caused by the biofilm-suppressing activity of these two elements.
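
    The Mann-Whitney comparison of colony counts used above can be reproduced generically with SciPy. The sketch below uses made-up CFU values purely to show the call; it is not the study's data or code.

        from scipy.stats import mannwhitneyu

        # Hypothetical CFU-per-scaffold counts recovered by vortex shaking (illustrative only).
        polished = [1.2e4, 3.5e4, 8.0e3, 2.1e4, 1.6e4, 9.5e3]
        untreated = [2.4e5, 1.1e5, 3.3e5, 1.8e5, 2.9e5, 2.2e5]

        stat, p_value = mannwhitneyu(polished, untreated, alternative="two-sided")
        print(f"U = {stat}, p = {p_value:.4f}")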

  18. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  19. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  20. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period into the preparation and monitoring of the February tests of Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  1. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns lead by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  2. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  5. Prognostic significance of metabolic tumor burden by positron emission tomography/computed tomography in patients with relapsed/refractory diffuse large B-cell lymphoma.

    Science.gov (United States)

    Tateishi, Ukihide; Tatsumi, Mitsuaki; Terauchi, Takashi; Ando, Kiyoshi; Niitsu, Nozomi; Kim, Won Seog; Suh, Cheolwon; Ogura, Michinori; Tobinai, Kensei

    2015-02-01

    The aim of the present study was to investigate the feasibility of measuring metabolic tumor burden using [F-18] fluorodeoxyglucose ((18) F-FDG) positron emission tomography/computed tomography (PET/CT) in patients with relapsed or refractory diffuse large B-cell lymphoma (DLBCL) treated with bendamustine-rituximab. Because the standardized uptake value is a critical parameter of tumor characterization, we carried out a phantom study of (18) F-FDG PET/CT to ensure quality control for 28 machines in the 24 institutions (Japan, 17 institutions; Korea, 7 institutions) participating in our clinical study. Fifty-five patients with relapsed or refractory DLBCL were enrolled. The (18) F-FDG PET/CT was acquired before treatment, after two cycles, and after the last treatment cycle. Treatment response was assessed after two cycles and after the last cycle using the Lugano classification. Using this classification, remission was complete in 15 patients (27%) and incomplete in 40 patients (73%) after two cycles of therapy, and remission was complete in 32 patients (58%) and incomplete in 23 patients (42%) after the last treatment cycle. The percentage change in all PET/CT parameters except for the area under the curve of the cumulative standardized uptake value-volume histogram was significantly greater in complete response patients than in non-complete response patients after two cycles and the last cycle. The Cox proportional hazard model and best subset selection method revealed that the percentage change of the sum of total lesion glycolysis after the last cycle (relative risk, 5.24; P = 0.003) was an independent predictor of progression-free survival. The percent change of sum of total lesion glycolysis, calculated from PET/CT, can be used to quantify the response to treatment and can predict progression-free survival after the last treatment cycle in patients with relapsed or refractory DLBCL treated with bendamustine-rituximab. © 2014 The Authors. Cancer Science
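
    The response metric highlighted above, the percentage change of the sum of total lesion glycolysis (TLG), where TLG per lesion is the metabolic tumour volume multiplied by the mean SUV, can be expressed in a few lines. The sketch below is illustrative only, with made-up lesion values rather than data from the study.

        def total_lesion_glycolysis(lesions):
            """Sum of TLG over lesions; each lesion is (metabolic tumour volume in mL, mean SUV)."""
            return sum(volume_ml * suv_mean for volume_ml, suv_mean in lesions)

        def percent_change(baseline: float, follow_up: float) -> float:
            """Percentage reduction from baseline (positive = metabolic response)."""
            return 100.0 * (baseline - follow_up) / baseline

        # Hypothetical example: two lesions at baseline, one residual lesion after treatment.
        baseline_tlg = total_lesion_glycolysis([(35.0, 8.2), (12.0, 5.1)])
        followup_tlg = total_lesion_glycolysis([(6.0, 3.4)])
        print(f"Change in summed TLG: {percent_change(baseline_tlg, followup_tlg):.1f}%")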

  6. Characteristics of patients with a significant stenosis in a conventional coronary angiogram with a normal multi-detector computed tomographic coronary angiogram

    International Nuclear Information System (INIS)

    Jeong, Hae Chang; Ahn, Youngkeun; Jeong, Myung Ho

    2009-01-01

    Multi-detector computed tomography (MDCT) has high diagnostic value for detecting or excluding coronary artery stenosis. However, conventional coronary angiograms (CCA) are occasionally required in patients having persistent chest pain with normal MDCT. We retrospectively analyzed 90 patients who underwent CCA due to persistent chest pain with normal MDCT. The patients were classified into patients having more than 50% diameter stenosis in CCA (false negative, group I: n=14, 62.6±7.5 years, 7 males) and those having less than 50% diameter stenosis (true negative, group II: n=76, 52.1±12.0 years, 42 males). Significant stenosis was observed in 9 patients at the left anterior descending artery, 4 at the right coronary artery, and 1 at the left circumflex artery in group I. Group I patients were older than group II patients (63±8 versus 52±12 years, P<0.001). There were more patients with hypertension and smoking in group I (64.3% versus 7.9%, 35.7% versus 3.9%, P<0.001, P<0.001, respectively). The levels of uric acid and homocysteine were higher in group I than in group II (5.7±1.5 versus 4.6±1.2 mg/dL, 9.6±3.1 versus 7.4±2.5 µmol/L, P=0.008, P=0.010, respectively). There were more ST or T changes in the electrocardiograms in group I (35.7% versus 1.3%) (P<0.001). In multivariate analysis, a history of hypertension, uric acid levels, and ischemic evidence in the electrocardiogram were independent factors for a false negative of MDCT (odds ratio 11.11, 4.76, 1.81, 95% confidence interval 4.67 to 10.00, 1.41 to 1.61, 1.05 to 3.33, P=0.009, P=0.012, P=0.046, respectively). In certain situations, the findings of coronary stenosis by MDCT do not always correlate with those of CCA. (author)

  7. Effects of a Web-Based Computer-Tailored Game to Reduce Binge Drinking Among Dutch Adolescents: A Cluster Randomized Controlled Trial.

    Science.gov (United States)

    Jander, Astrid; Crutzen, Rik; Mercken, Liesbeth; Candel, Math; de Vries, Hein

    2016-02-03

    Binge drinking among Dutch adolescents is among the highest in Europe. Few interventions so far have focused on adolescents aged 15 to 19 years. Because binge drinking increases significantly during those years, it is important to develop binge drinking prevention programs for this group. Web-based computer-tailored interventions can be an effective tool for reducing this behavior in adolescents. Embedding the computer-tailored intervention in a serious game may make it more attractive to adolescents. The aim was to assess whether a Web-based computer-tailored intervention is effective in reducing binge drinking in Dutch adolescents aged 15 to 19 years. Secondary outcomes were reduction in excessive drinking and overall consumption during the previous week. Personal characteristics associated with program adherence were also investigated. A cluster randomized controlled trial was conducted among 34 Dutch schools. Each school was randomized into either an experimental (n=1622) or a control (n=1027) condition. Baseline assessment took place in January and February 2014. At baseline, demographic variables and alcohol use were assessed. Follow-up assessment of alcohol use took place 4 months later (May and June 2014). After the baseline assessment, participants in the experimental condition started with the intervention consisting of a game about alcohol in which computer-tailored feedback regarding motivational characteristics was embedded. Participants in the control condition only received the baseline questionnaire. Both groups received the 4-month follow-up questionnaire. Effects of the intervention were assessed using logistic regression mixed models analyses for binge and excessive drinking and linear regression mixed models analyses for weekly consumption. Factors associated with intervention adherence in the experimental condition were explored by means of a linear regression model. In total, 2649 adolescents participated in the baseline assessment. At follow

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, are provided. The GlideInWMS and components installation are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each others time zones by monitoring/debugging pilot jobs sent from the facto...

  9. A Proposed Model for Improving Performance and Reducing Costs of IT Through Cloud Computing of Egyptian Business Enterprises

    OpenAIRE

    Mohamed M.El Hadi; Azza Monir Ismail

    2016-01-01

    Information technologies affect today's large business enterprises, from data processing and transactions to achieving goals efficiently and effectively; they create new business opportunities and new competitive advantages, and services must keep pace with recent IT trends such as cloud computing. Cloud computing technology can provide all IT services. Therefore, cloud computing offers an adaptable alternative to the current technology model, creating reduci...

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  11. Nuclear energy significantly reduces carbon dioxide emissions

    International Nuclear Information System (INIS)

    Koprda, V.

    2006-01-01

    This article is devoted to nuclear energy and to its acceptability, compatibility and sustainability. Nuclear energy is an indispensable part of the energy mix, with vast innovation potential. The safety of nuclear energy, radioactive waste disposal, and the prevention of risks from the misuse of nuclear material have to be very seriously assessed and solved. Nuclear energy is one of the ways to decrease the contamination of the atmosphere with carbon dioxide, and it also partially addresses the problem of global temperature increase and climate change. The main factors responsible for the renaissance of nuclear energy are given. (author)

  12. Utility of the inspiratory phase in high-resolution computed tomography evaluations of pediatric patients with bronchiolitis obliterans after allogeneic bone marrow transplant: reducing patient radiation exposure

    Energy Technology Data Exchange (ETDEWEB)

    Togni Filho, Paulo Henrique; Casagrande, Joao Luiz Marin; Lederman, Henrique Manoel, E-mail: paulotognifilho@gmail.com [Universidade Federal de Sao Paulo (EPM/UNIFESP), Sao Paulo, SP (Brazil). Escola Paulista de Medicina. Dept. of Diagnostico por Imagem; Universidade de Sao Paulo (InRad/HC/FMUSP), Sao Paulo, SP (Brazil). Hospital das Clinicas. Instituto de Radiologia

    2017-03-15

    Objective: To evaluate the utility of the inspiratory phase in high-resolution computed tomography (HRCT) of the chest for the diagnosis of post-bone marrow transplantation bronchiolitis obliterans. Materials and Methods: This was a retrospective, observational, cross-sectional study. We selected patients of either gender who underwent bone marrow transplantation and chest HRCT between March 1, 2002 and December 12, 2014. Ages ranged from 3 months to 20.7 years. We included all examinations in which the HRCT was performed appropriately. The examinations were read by two radiologists, one with extensive experience in pediatric radiology and another in the third year of residency, who determined the presence or absence of the following imaging features: air trapping, bronchiectasis, alveolar opacities, nodules, and atelectasis. Results: A total of 222 examinations were evaluated (mean, 5.4 ± 4.5 examinations per patient). The expiratory phase findings were comparable to those obtained in the inspiratory phase, except in one patient, in whom a small uncharacteristic nodule was identified only in the inspiratory phase. Air trapping was identified in a larger number of scans in the expiratory phase than in the inspiratory phase, as was atelectasis, although the difference was statistically significant only for air trapping. Conclusion: In children being evaluated for post-bone marrow transplantation bronchiolitis obliterans, the inspiratory phase can be excluded from the chest HRCT protocol, thus reducing by half the radiation exposure in this population. (author)

  13. Augmented Quadruple-Phase Contrast Media Administration and Triphasic Scan Protocol Increases Image Quality at Reduced Radiation Dose During Computed Tomography Urography.

    Science.gov (United States)

    Saade, Charbel; Mohamad, May; Kerek, Racha; Hamieh, Nadine; Alsheikh Deeb, Ibrahim; El-Achkar, Bassam; Tamim, Hani; Abdul Razzak, Farah; Haddad, Maurice; Abi-Ghanem, Alain S; El-Merhi, Fadi

    The aim of this article was to investigate the opacification of the renal vasculature and the urogenital system during computed tomography urography by using quadruple-phase contrast media in a triphasic scan protocol. A total of 200 patients with possible urinary tract abnormalities were equally divided between 2 protocols. Protocol A used the conventional single bolus and quadruple-phase scan protocol (pre, arterial, venous, and delayed), retrospectively. Protocol B included a quadruple-phase contrast media injection with a triphasic scan protocol (pre, arterial and combined venous, and delayed), prospectively. Each protocol used 100 mL of contrast and saline at a flow rate of 4.5 mL/s. Attenuation profiles and contrast-to-noise ratio of the renal arteries, veins, and urogenital tract were measured. Effective radiation dose calculation, data analysis by independent sample t test, receiver operating characteristic, and visual grading characteristic analyses were performed. In the arterial circulation, only the inferior interlobular arteries showed a statistically significant difference between the two protocols. Protocol B showed a higher contrast-to-noise ratio than protocol A (protocol B: 22.68 ± 13.72; protocol A: 14.75 ± 5.76). Overall, augmented quadruple-phase contrast media administration with a triphasic scan protocol increases image quality at a reduced radiation dose.

  14. Utility of the inspiratory phase in high-resolution computed tomography evaluations of pediatric patients with bronchiolitis obliterans after allogeneic bone marrow transplant: reducing patient radiation exposure

    International Nuclear Information System (INIS)

    Togni Filho, Paulo Henrique; Casagrande, Joao Luiz Marin; Lederman, Henrique Manoel; Universidade de Sao Paulo

    2017-01-01

    Objective: To evaluate the utility of the inspiratory phase in high-resolution computed tomography (HRCT) of the chest for the diagnosis of post-bone marrow transplantation bronchiolitis obliterans. Materials and Methods: This was a retrospective, observational, cross-sectional study. We selected patients of either gender who underwent bone marrow transplantation and chest HRCT between March 1, 2002 and December 12, 2014. Ages ranged from 3 months to 20.7 years. We included all examinations in which the HRCT was performed appropriately. The examinations were read by two radiologists, one with extensive experience in pediatric radiology and another in the third year of residency, who determined the presence or absence of the following imaging features: air trapping, bronchiectasis, alveolar opacities, nodules, and atelectasis. Results: A total of 222 examinations were evaluated (mean, 5.4 ± 4.5 examinations per patient). The expiratory phase findings were comparable to those obtained in the inspiratory phase, except in one patient, in whom a small uncharacteristic nodule was identified only in the inspiratory phase. Air trapping was identified in a larger number of scans in the expiratory phase than in the inspiratory phase, as was atelectasis, although the difference was statistically significant only for air trapping. Conclusion: In children being evaluated for post-bone marrow transplantation bronchiolitis obliterans, the inspiratory phase can be excluded from the chest HRCT protocol, thus reducing by half the radiation exposure in this population. (author)

  15. Cognitive cooperation groups mediated by computers and internet present significant improvement of cognitive status in older adults with memory complaints: a controlled prospective study

    Directory of Open Access Journals (Sweden)

    Rodrigo de Rosso Krug

    Objective: To estimate the effect of participating in cognitive cooperation groups, mediated by computers and the internet, on the Mini-Mental State Examination (MMSE) percent variation of outpatients with memory complaints attending two memory clinics. Methods: A prospective controlled intervention study carried out from 2006 to 2013 with 293 elders. The intervention group (n = 160) attended a cognitive cooperation group (20 sessions of 1.5 hours each). The control group (n = 133) received routine medical care. The outcome was the percent variation in the MMSE. Control variables included gender, age, marital status, schooling, hypertension, diabetes, dyslipidaemia, hypothyroidism, depression, vascular diseases, polymedication, use of benzodiazepines, exposure to tobacco, sedentary lifestyle, obesity and functional capacity. The final model was obtained by multivariate linear regression. Results: The intervention group obtained an independent positive variation of 24.39% (95% CI = 14.86/33.91) in the MMSE compared to the control group. Conclusion: The results suggest that cognitive cooperation groups, mediated by computers and the internet, are associated with cognitive status improvement of older adults in memory clinics.

  16. Safer passenger car front shapes for pedestrians: A computational approach to reduce overall pedestrian injury risk in realistic impact scenarios.

    Science.gov (United States)

    Li, Guibing; Yang, Jikuang; Simms, Ciaran

    2017-03-01

    Vehicle front shape has a significant influence on pedestrian injuries, and the optimal design for overall pedestrian protection remains an elusive goal, especially considering the variability of vehicle-to-pedestrian accident scenarios. Therefore this study aims to develop and evaluate an efficient framework for vehicle front shape optimization for pedestrian protection, accounting for the broad range of real world impact scenarios and their distributions in recent accident data. Firstly, a framework for vehicle front shape optimization for pedestrian protection was developed based on coupling of multi-body simulations and a genetic algorithm. This framework was then applied for optimizing passenger car front shape for pedestrian protection, and its predictions were evaluated using accident data and kinematic analyses. The results indicate that the optimization shows good convergence, that its predictions are corroborated by the available accident data, and that the optimization framework can distinguish 'good' and 'poor' vehicle front shapes for pedestrian safety. Thus, it is feasible and reliable to use the optimization framework for vehicle front shape optimization for reducing overall pedestrian injury risk. The results also show the importance of considering the broad range of impact scenarios in vehicle front shape optimization. A safe passenger car for overall pedestrian protection should have a wide and flat bumper (covering pedestrians' legs from the lower leg up to the shaft of the upper leg with generally even contacts), a bonnet leading edge height around 750 mm, and a short bonnet (17° or …). The optimal car front shapes for head and leg protection are generally consistent, but partially conflict with pelvis protection. In particular, both head and leg injury risk increase with increasing bumper lower height and depth, and decrease with increasing bonnet leading edge height, while pelvis injury risk increases with increasing bonnet leading
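
    To make the optimisation framework described above concrete, the sketch below shows a bare-bones genetic algorithm over a few front-shape parameters. The objective function is a stand-in surrogate (in the study it would be the pedestrian injury risk returned by multi-body impact simulations weighted over accident scenarios), and the parameter ranges and coefficients are illustrative assumptions only, not values from the paper.

        import random

        # Design variables: (bumper lower height mm, bumper depth mm,
        #                    bonnet leading edge height mm, bonnet length mm)
        BOUNDS = [(300.0, 550.0), (50.0, 250.0), (600.0, 900.0), (700.0, 1200.0)]

        def injury_risk(shape):
            """Stand-in for the weighted pedestrian injury risk from multi-body simulations.
            Coefficients are arbitrary; only the qualitative directions follow the abstract."""
            blh, depth, ble, bonnet_len = shape
            return 0.002 * blh + 0.003 * depth - 0.0015 * ble + 0.0005 * bonnet_len

        def random_shape():
            return [random.uniform(lo, hi) for lo, hi in BOUNDS]

        def crossover(a, b):
            return [random.choice(pair) for pair in zip(a, b)]

        def mutate(shape, rate=0.2):
            return [random.uniform(lo, hi) if random.random() < rate else x
                    for x, (lo, hi) in zip(shape, BOUNDS)]

        def optimise(pop_size=40, generations=50):
            population = [random_shape() for _ in range(pop_size)]
            for _ in range(generations):
                population.sort(key=injury_risk)             # lower risk is better
                parents = population[: pop_size // 2]         # truncation selection
                children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                            for _ in range(pop_size - len(parents))]
                population = parents + children
            return min(population, key=injury_risk)

        best = optimise()
        print([round(x, 1) for x in best], round(injury_risk(best), 3))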

  17. Significance of findings of both emergency chest X-ray and thoracic computed tomography routinely performed at the emergency unit in 102 polytrauma patients. A prospective study

    International Nuclear Information System (INIS)

    Grieser, T.; Buehne, K.H.; Haeuser, H.; Bohndorf, K.

    2001-01-01

    Purpose: To evaluate prospectively whether and to what extent both thoracic computed tomography (Tx-CT) and supine X-ray of the chest (Rx-Tx) are able to show additional findings that are therapeutically relevant. Patients and Methods: According to a fixed study protocol, we performed Rx-Tx and Tx-CT in 102 consecutive, haemodynamically stable polytrauma patients (mean age, 41.2 yrs; age range, 12-93 yrs). Findings of therapeutical relevance drawn from both Tx-CT and Rx-Tx, and urgent interventions indicated by an attending trauma team, were documented immediately on a standardized evaluation sheet. Any change in the patient's management that differed from routine life-saving procedures, and any therapeutical intervention done in the emergency room or elsewhere (operating theatre, angiographic facility), were considered therapeutically relevant. Results: Of 102 patients, 43 (42.2%) had a total of 51 therapeutically relevant findings. Rx-Tx alone yielded 23 relevant findings (45.1%) in 23 patients (22.5%). Of these, Tx-CT showed additional important findings in 7 patients (30.4%). When Tx-CT alone was considered, it revealed 22 new findings of therapeutical relevance (43.2%) in 20 patients (46.5%). Altogether, Tx-CT was able to show 30 relevant findings in 27 patients, i.e., there was a therapeutical benefit for 26.5% of all polytrauma patients included. Most frequently, there was a need for chest-tube insertion (n=29). Conclusions: Polytrauma patients, if haemodynamically stable, may profit from computed tomography of the chest when therapeutically relevant thoracic injuries are looked for or early therapeutical interventions are to be checked. However, chest X-ray should remain the 'front-line' screening method because of its quick feasibility and availability. (orig.)

  18. Use of a hybrid iterative reconstruction technique to reduce image noise and improve image quality in obese patients undergoing computed tomographic pulmonary angiography.

    Science.gov (United States)

    Kligerman, Seth; Mehta, Dhruv; Farnadesh, Mahmmoudreza; Jeudy, Jean; Olsen, Kathryn; White, Charles

    2013-01-01

    To determine whether an iterative reconstruction (IR) technique (iDose, Philips Healthcare) can reduce image noise and improve image quality in obese patients undergoing computed tomographic pulmonary angiography (CTPA). The study was Health Insurance Portability and Accountability Act compliant and approved by our institutional review board. A total of 33 obese patients (average body mass index: 42.7) underwent CTPA studies following standard departmental protocols. The data were reconstructed with filtered back projection (FBP) and 3 iDose strengths (iDoseL1, iDoseL3, and iDoseL5) for a total of 132 studies. FBP data were collected from 33 controls (average body mass index: 22) undergoing CTPA. Regions of interest were drawn at 6 identical levels in the pulmonary artery (PA), from the main PA to a subsegmental branch, in both the control group and study groups using each algorithm. Noise and attenuation were measured at all PA levels. Three thoracic radiologists graded each study on a scale of 1 (very poor) to 5 (ideal) by 4 categories: image quality, noise, PA enhancement, and "plastic" appearance. Statistical analysis was performed using an unpaired t test, 1-way analysis of variance, and linear weighted κ. Compared with the control group, noise was significantly higher with the FBP, iDoseL1, and iDoseL3 algorithms, whereas noise with the iDoseL5 algorithm in the study group was comparable to that in the control group. Analysis within the study group showed a significant and progressive decrease in noise and increase in the contrast-to-noise ratio as the level of IR was increased, and qualitative scores for image quality, noise and PA enhancement improved with increasing levels of iDose. The use of an IR technique leads to qualitative and quantitative improvements in image noise and image quality in obese patients undergoing CTPA.
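
    The quantitative part of such a comparison (noise as the standard deviation within a region of interest, and the contrast-to-noise ratio of the pulmonary artery against background) reduces to a few lines; the ROI samples below are invented placeholder numbers, not measurements from this study.

```python
import statistics

def roi_stats(pixel_values):
    """Mean attenuation (HU) and noise (standard deviation) of one region of interest."""
    return statistics.mean(pixel_values), statistics.pstdev(pixel_values)

def cnr(pa_roi, background_roi):
    """Contrast-to-noise ratio of the pulmonary artery against background tissue."""
    pa_mean, _ = roi_stats(pa_roi)
    bg_mean, bg_noise = roi_stats(background_roi)
    return (pa_mean - bg_mean) / bg_noise

# Placeholder ROI samples (HU) for a filtered back projection and an iterative reconstruction of one level.
fbp_pa, fbp_bg = [410, 395, 430, 402, 418], [55, 70, 40, 62, 48]
ir_pa, ir_bg = [408, 401, 415, 405, 412], [57, 60, 52, 58, 55]

print("FBP noise / CNR:", round(roi_stats(fbp_bg)[1], 1), "/", round(cnr(fbp_pa, fbp_bg), 1))
print("IR  noise / CNR:", round(roi_stats(ir_bg)[1], 1), "/", round(cnr(ir_pa, ir_bg), 1))
```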

  19. Calculation of limits for significant unidirectional changes in two or more serial results of a biomarker based on a computer simulation model

    DEFF Research Database (Denmark)

    Lund, Flemming; Petersen, Per Hyltoft; Fraser, Callum G

    2015-01-01

    BACKGROUND: Reference change values (RCVs) were introduced more than 30 years ago and provide objective tools for assessment of the significance of differences in two consecutive results from an individual. However, in practice, more results are usually available for monitoring, and the RCV concept must then be extended to a series of results. METHODS: From a large set of simulated data from healthy individuals, a series of up to 20 results from an individual was generated using different values for the within-subject biological variation plus the analytical variation. Each new result in this series was compared to the initial measurement result, and limits for significant unidirectional changes were expressed as the presented factors: the first result is multiplied by the appropriate factor for increase or decrease, which gives the limits for a significant difference.
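
    A minimal sketch of the simulation idea, assuming a log-normal combination of within-subject biological variation (CVi) and analytical variation (CVa); the factor that bounds, at the chosen significance level, the largest increase seen in a stable (healthy) series of a given length is read off empirically. The CV values, series length and simulation count are placeholders, and the log-normal model is an assumption, not necessarily the exact model of the paper.

```python
import math
import random

def change_factor(n_results, cv_i=0.05, cv_a=0.03, n_sim=20000, alpha=0.05):
    """Factor by which the first result may increase before any of the following
    results in a series of n_results signals a significant unidirectional change."""
    cv_total = math.sqrt(cv_i ** 2 + cv_a ** 2)
    sigma = math.sqrt(math.log(1.0 + cv_total ** 2))       # log-normal sigma for the combined CV
    max_ratios = []
    for _ in range(n_sim):
        first = random.lognormvariate(0.0, sigma)
        ratios = [random.lognormvariate(0.0, sigma) / first for _ in range(n_results - 1)]
        max_ratios.append(max(ratios))
    max_ratios.sort()
    return max_ratios[int((1.0 - alpha) * n_sim) - 1]       # empirical (1 - alpha) quantile

for n in (2, 5, 10, 20):
    print(f"series of {n} results: increase is significant above {change_factor(n):.3f} x first result")
```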

  20. Clinical significance of cerebrospinal fluid tap test and magnetic resonance imaging/computed tomography findings of tight high convexity in patients with possible idiopathic normal pressure hydrocephalus

    International Nuclear Information System (INIS)

    Ishikawa, Masatsune; Furuse, Motomasa; Nishida, Namiko; Oowaki, Hisayuki; Matsumoto, Atsuhito; Suzuki, Takayuki

    2010-01-01

    Idiopathic normal pressure hydrocephalus (iNPH) is a treatable syndrome with a classical triad of symptoms. The Japanese iNPH guidelines indicate that the cerebrospinal fluid (CSF) tap test and tight high-convexity on magnetic resonance (MR) imaging are important for the diagnosis. The relationships between the effectiveness of CSF shunt surgery in possible iNPH patients, the tap test result, and the MR imaging/computed tomography (CT) findings of tight high-convexity were evaluated in 88 possible iNPH patients (mean age 75 years) with one or more of the classical triad of symptoms, and mild to moderate ventricular dilation. All patients underwent the tap test in the outpatient clinic, and patients and caregivers assessed the clinical changes during one week. The tap test was positive in 47 patients and negative in 41 patients. Surgery was performed in 19 patients with positive tap test, and was effective in 17 patients. Although the findings were inconsistent in some patients, the result of the tap test was found to be highly correlated with the MR imaging/CT finding of tight high-convexity (p<0.0001), confirming that both these diagnostic tests are promising predictors of shunt effectiveness. (author)

  1. Computer simulation and experimental self-assembly of irradiated glycine amino acid under magnetic fields: Its possible significance in prebiotic chemistry.

    Science.gov (United States)

    Heredia, Alejandro; Colín-García, María; Puig, Teresa Pi I; Alba-Aldave, Leticia; Meléndez, Adriana; Cruz-Castañeda, Jorge A; Basiuk, Vladimir A; Ramos-Bernal, Sergio; Mendoza, Alicia Negrón

    2017-12-01

    Ionizing radiation may have played a relevant role in chemical reactions for prebiotic biomolecule formation on ancient Earth. Environmental conditions such as the presence of water and magnetic fields were possibly relevant in the formation of organic compounds such as amino acids. ATR-FTIR, Raman, EPR and X-ray spectroscopies provide valuable information about molecular organization of different glycine polymorphs under static magnetic fields. γ-glycine polymorph formation increases in irradiated samples interacting with static magnetic fields. The increase in γ-glycine polymorph agrees with the computer simulations. The AM1 semi-empirical simulations show a change in the catalyst behavior and dipole moment values in α and γ-glycine interaction with the static magnetic field. The simulated crystal lattice energy in α-glycine is also affected by the free radicals under the magnetic field, which decreases its stability. Therefore, solid α and γ-glycine containing free radicals under static magnetic fields might have affected the prebiotic scenario on ancient Earth by causing the oligomerization of glycine in prebiotic reactions. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. New algorithm to reduce the number of computing steps in reliability formula of Weighted-k-out-of-n system

    Directory of Open Access Journals (Sweden)

    Tatsunari Ohkura

    2007-02-01

    In the disjoint products version of reliability analysis of weighted–k–out–of–n systems, it is necessary to determine the order in which the weight of components is to be considered. The k–out–of–n:G(F) system consists of n components; each component has its own probability and positive integer weight such that the system is operational (failed) if and only if the total weight of some operational (failed) components is at least k. This paper designs a method to compute the reliability in O(nk) computing time and O(nk) memory space. The proposed method expresses the system reliability in fewer product terms than those already published.
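
    The O(nk) complexity quoted above can be illustrated with a simple dynamic-programming sketch (not the disjoint-products formulation of the paper): the distribution of accumulated working weight is updated one component at a time, with all totals at or above k lumped together.

```python
def weighted_k_out_of_n_reliability(weights, probs, k):
    """Reliability of a weighted-k-out-of-n:G system in O(n*k) time.
    dist[j] holds the probability that the working components seen so far have total
    weight j, with every total >= k lumped into dist[k]."""
    dist = [0.0] * (k + 1)
    dist[0] = 1.0
    for w, p in zip(weights, probs):
        new = [0.0] * (k + 1)
        for j, pj in enumerate(dist):
            if pj == 0.0:
                continue
            new[j] += pj * (1.0 - p)          # component fails: accumulated weight unchanged
            new[min(k, j + w)] += pj * p      # component works: add its weight, capped at k
        dist = new
    return dist[k]

# Three components with weights 2, 3, 4 and reliabilities 0.9, 0.8, 0.7; system needs total weight >= 5.
print(weighted_k_out_of_n_reliability([2, 3, 4], [0.9, 0.8, 0.7], 5))   # 0.902
```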

  3. A Methodology to Reduce the Computational Effort in the Evaluation of the Lightning Performance of Distribution Networks

    Directory of Open Access Journals (Sweden)

    Ilaria Bendato

    2016-11-01

    The estimation of the lightning performance of a power distribution network is of great importance to design its protection system against lightning. An accurate evaluation of the number of lightning events that can create dangerous overvoltages requires a huge computational effort, as it implies the adoption of a Monte Carlo procedure. Such a procedure consists of generating many different random lightning events and calculating the corresponding overvoltages. The paper proposes a methodology to deal with the problem in two computationally efficient ways: (i) finding out the minimum number of Monte Carlo runs that lead to reliable results; and (ii) setting up a procedure that bypasses the lightning field-to-line coupling problem for each Monte Carlo run. The proposed approach is shown to provide results consistent with existing approaches while exhibiting superior Central Processing Unit (CPU) time performances.
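
    The first idea, finding the minimum number of Monte Carlo runs that still gives reliable results, can be sketched as a sequential stopping rule on the confidence interval of the estimated probability of a dangerous overvoltage. The event generator and the overvoltage criterion below are toy stand-ins for the field-to-line coupling computation.

```python
import math
import random

def random_event():
    """Toy lightning event: (stroke peak current in kA, distance to the line in arbitrary units)."""
    return random.lognormvariate(3.0, 0.6), random.uniform(0.0, 20.0)

def dangerous_overvoltage(event):
    """Stand-in for the field-to-line coupling computation and insulation check."""
    peak_current, distance = event
    return peak_current / (1.0 + distance) > 8.0            # toy criterion only

def estimate_with_stopping(rel_tol=0.05, z=1.96, min_runs=1000, max_runs=2_000_000):
    """Run Monte Carlo until the 95% CI half-width falls below rel_tol of the estimate."""
    hits = 0
    for n in range(1, max_runs + 1):
        hits += dangerous_overvoltage(random_event())
        if n >= min_runs and hits > 0:
            p = hits / n
            if z * math.sqrt(p * (1.0 - p) / n) <= rel_tol * p:
                return p, n
    return hits / max_runs, max_runs

p, runs = estimate_with_stopping()
print(f"estimated probability of a dangerous overvoltage: {p:.4f} after {runs} runs")
```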

  4. Effectiveness of a Web-Based Computer-Tailored Multiple-Lifestyle Intervention for People Interested in Reducing their Cardiovascular Risk: A Randomized Controlled Trial.

    Science.gov (United States)

    Storm, Vera; Dörenkämper, Julia; Reinwand, Dominique Alexandra; Wienert, Julian; De Vries, Hein; Lippke, Sonia

    2016-04-11

    Web-based computer-tailored interventions for multiple health behaviors can improve the strength of behavior habits in people who want to reduce their cardiovascular risk. Nonetheless, few randomized controlled trials have tested this assumption to date. The study aim was to test an 8-week Web-based computer-tailored intervention designed to improve habit strength for physical activity and fruit and vegetable consumption among people who want to reduce their cardiovascular risk. In a randomized controlled design, self-reported changes in perceived habit strength, self-efficacy, and planning across different domains of physical activity as well as fruit and vegetable consumption were evaluated. This study was a randomized controlled trial involving an intervention group (n=403) and a waiting control group (n=387). Web-based data collection was performed in Germany and the Netherlands during 2013-2015. The intervention content was based on the Health Action Process Approach and involved personalized feedback on lifestyle behaviors, which indicated whether participants complied with behavioral guidelines for physical activity and fruit and vegetable consumption. There were three Web-based assessments: baseline (T0, N=790), a posttest 8 weeks after the baseline (T1, n=206), and a follow-up 3 months after the baseline (T2, n=121). Data analysis was conducted by analyzing variances and structural equation analysis. Significant group by time interactions revealed superior treatment effects for the intervention group, with substantially higher increases in self-reported habit strength for physical activity (F1,199=7.71, P=.006, Cohen's d=0.37) and fruit and vegetable consumption (F1,199=7.71, P=.006, Cohen's d=0.30) at posttest T1 for the intervention group. Mediation analyses yielded behavior-specific sequential mediator effects for T1 planning and T1 self-efficacy between the intervention and habit strength at follow-up T2 (fruit and vegetable consumption: beta=0.12, 95

  5. Clinical significance of magnetic resonance cholangiopancreatography for the diagnosis of cystic tumor of the pancreas compared with endoscopic retrograde cholangiopancreatography and computed tomography

    International Nuclear Information System (INIS)

    Mera, Kiyomi; Tajiri, Hisao; Muto, Manabu

    1999-01-01

    Cystic tumor of the pancreas has been investigated by a variety of imaging techniques. Magnetic resonance cholangiopancreatography (MRCP) is being widely used as a non-invasive diagnostic modality for investigation of the biliary tree and pancreatic duct system. The purpose of this study was to compare MRCP images with those of endoscopic retrograde cholangiopancreatography (ERCP) and computed tomography (CT) in order to clarify the diagnostic efficacy of MRCP for cystic tumor of the pancreas. We retrospectively studied 15 patients with cystic tumor of the pancreas that had been surgically resected and histopathologically confirmed. There were five cases of intraductal papillary adenocarcinoma, five of intraductal papillary adenoma, two of serous cyst adenoma, two of retention cyst associated with invasive ductal adenocarcinoma and one of solid cystic tumor. In all cases MRCP correctly identified the main pancreatic duct (MPD) and showed the entire cystic tumor and the communication between the tumor and the MPD. On the other hand, the detection rate by ERCP of the cystic tumor and the communication between the cystic tumor and the MPD was only 60%. Although the detection rates by CT for the septum and solid components inside the cystic tumor were 100 and 90.0%, respectively, those of MRCP for each were 58.3 and 20.0%. MRCP is capable of providing diagnostic information superior to ERCP for the diagnosis of cystic tumor of the pancreas. Although MRCP may provide complementary information about the whole lesion of interest, the characteristic internal features of cystic tumor of the pancreas should be carefully diagnosed in combination with CT. (author)

  6. Reduced-order computational model in nonlinear structural dynamics for structures having numerous local elastic modes in the low-frequency range. Application to fuel assemblies

    International Nuclear Information System (INIS)

    Batou, A.; Soize, C.; Brie, N.

    2013-01-01

    Highlights: • A ROM of a nonlinear dynamical structure is built with a global displacements basis. • The reduced order model of fuel assemblies is accurate and of very small size. • The shocks between grids of a row of seven fuel assemblies are computed. -- Abstract: We are interested in the construction of a reduced-order computational model for nonlinear complex dynamical structures which are characterized by the presence of numerous local elastic modes in the low-frequency band. This high modal density makes the use of the classical modal analysis method not suitable. Therefore the reduced-order computational model is constructed using a basis of a space of global displacements, which is constructed a priori and which allows the nonlinear dynamical response of the structure observed on the stiff part to be predicted with a good accuracy. The methodology is applied to a complex industrial structure which is made up of a row of seven fuel assemblies with possibility of collisions between grids and which is submitted to a seismic loading

  7. Reduced-order computational model in nonlinear structural dynamics for structures having numerous local elastic modes in the low-frequency range. Application to fuel assemblies

    Energy Technology Data Exchange (ETDEWEB)

    Batou, A., E-mail: anas.batou@univ-paris-est.fr [Université Paris-Est, Laboratoire Modélisation et Simulation Multi Echelle, MSME UMR 8208 CNRS, 5 bd Descartes, 77454 Marne-la-Vallee (France); Soize, C., E-mail: christian.soize@univ-paris-est.fr [Université Paris-Est, Laboratoire Modélisation et Simulation Multi Echelle, MSME UMR 8208 CNRS, 5 bd Descartes, 77454 Marne-la-Vallee (France); Brie, N., E-mail: nicolas.brie@edf.fr [EDF R and D, Département AMA, 1 avenue du général De Gaulle, 92140 Clamart (France)

    2013-09-15

    Highlights: • A ROM of a nonlinear dynamical structure is built with a global displacements basis. • The reduced order model of fuel assemblies is accurate and of very small size. • The shocks between grids of a row of seven fuel assemblies are computed. -- Abstract: We are interested in the construction of a reduced-order computational model for nonlinear complex dynamical structures which are characterized by the presence of numerous local elastic modes in the low-frequency band. This high modal density makes the use of the classical modal analysis method not suitable. Therefore the reduced-order computational model is constructed using a basis of a space of global displacements, which is constructed a priori and which allows the nonlinear dynamical response of the structure observed on the stiff part to be predicted with a good accuracy. The methodology is applied to a complex industrial structure which is made up of a row of seven fuel assemblies with possibility of collisions between grids and which is submitted to a seismic loading.
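
    Constructing a reduced-order model from a displacement basis amounts to a Galerkin projection of the full mass, damping and stiffness operators, with localized nonlinear (shock) forces projected onto the same basis. The sketch below uses a tiny placeholder model and an arbitrary orthonormal basis rather than the a priori global-displacement basis of the paper.

```python
import numpy as np

def build_reduced_operators(M, C, K, Phi):
    """Galerkin projection of the full-order operators onto the basis Phi (n_dof x n_modes)."""
    return Phi.T @ M @ Phi, Phi.T @ C @ Phi, Phi.T @ K @ Phi

def reduced_contact_force(Phi, q, contact_dof, gap, k_contact):
    """Localized shock force (penalty spring active once the gap closes), projected onto the basis."""
    u = Phi @ q                              # reduced coordinates back to physical displacements
    f = np.zeros(Phi.shape[0])
    penetration = u[contact_dof] - gap
    if penetration > 0.0:
        f[contact_dof] = -k_contact * penetration
    return Phi.T @ f                         # generalized force on the reduced coordinates

# Tiny placeholder full-order model (5 DOFs) and an arbitrary orthonormal 2-vector basis.
n = 5
K = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
M = np.eye(n)
C = 0.01 * K
Phi = np.linalg.qr(np.random.rand(n, 2))[0]

Mr, Cr, Kr = build_reduced_operators(M, C, K, Phi)
print("reduced stiffness matrix:\n", Kr)
print("reduced contact force:", reduced_contact_force(Phi, np.array([0.1, -0.05]), 2, 0.01, 1e3))
```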

  8. Pre-Treatment Deep Curettage Can Significantly Reduce Tumour Thickness in Thick Basal Cell Carcinoma While Maintaining a Favourable Cosmetic Outcome When Used in Combination with Topical Photodynamic Therapy

    International Nuclear Information System (INIS)

    Christensen, E.; Mork, C.; Foss, O. A.

    2011-01-01

    Topical photodynamic therapy (PDT) has limitations in the treatment of thick skin tumours. The aim of the study was to evaluate the effect of pre-PDT deep curettage on tumour thickness in thick (≥2 mm) basal cell carcinoma (BCC). Additionally, 3-month treatment outcome and change of tumour thickness from diagnosis to treatment were investigated. At diagnosis, mean tumour thickness was 2.3 mm (range 2.0-4.0). Pre- and post-curettage biopsies were taken from each tumour prior to PDT. Of 32 verified BCCs, tumour thickness was reduced by 50% after deep curettage (p≤0.001). Mean tumour thickness was also reduced from diagnosis to treatment. At 3-month follow-up, complete tumour response was found in 93% and the cosmetic outcome was rated excellent or good in 100% of cases. In conclusion, deep curettage significantly reduces BCC thickness and may, with topical PDT, provide a favourable clinical and cosmetic short-term outcome.

  9. Significant RF-EMF and thermal levels observed in a computational model of a person with a tibial plate for grounded 40 MHz exposure.

    Science.gov (United States)

    McIntosh, Robert L; Iskra, Steve; Anderson, Vitas

    2014-05-01

    Using numerical modeling, a worst-case scenario is considered when a person with a metallic implant is exposed to a radiofrequency (RF) electromagnetic field (EMF). An adult male standing on a conductive ground plane was exposed to a 40 MHz vertically polarized plane wave field, close to whole-body resonance where maximal induced current flows are expected in the legs. A metal plate (50-300 mm long) was attached to the tibia in the left leg. The findings from this study re-emphasize the need to ensure compliance with limb current reference levels for exposures near whole-body resonance, and not just rely on compliance with ambient electric (E) and magnetic (H) field reference levels. Moreover, we emphasize this recommendation for someone with a tibial plate, as failure to comply may result in significant tissue damage (increases in the localized temperature of 5-10 °C were suggested by the modeling for an incident E-field of 61.4 V/m root mean square (rms)). It was determined that the occupational reference level for limb current (100 mA rms), as stipulated in the 1998 guidelines of the International Commission on Non-Ionizing Radiation Protection (ICNIRP), is satisfied if the plane wave incident E-field levels are no more than 29.8 V/m rms without an implant and 23.4 V/m rms for the model with a 300 mm implant. © 2014 Wiley Periodicals, Inc.

  10. Leakage Reduction in Water Distribution Systems with Efficient Placement and Control of Pressure Reducing Valves Using Soft Computing Techniques

    Directory of Open Access Journals (Sweden)

    A. Gupta

    2017-04-01

    Reduction of leakages in a water distribution system (WDS) is one of the major concerns of water industries. Leakages depend on pressure, hence installing pressure reducing valves (PRVs) in the water network is a successful technique for reducing leakages. Determining the number of valves, their locations, and the optimal control settings are the challenges faced. This paper presents a new algorithm-based rule for determining the location of valves in a WDS having a variable demand pattern, which results in more favorable optimization of PRV localization than that achieved by previous techniques. A multiobjective genetic algorithm (NSGA-II) was used to determine the optimized control value of PRVs and to minimize the leakage rate in the WDS. Minimum required pressure was maintained at all nodes to avoid pressure deficiency at any node. The proposed methodology is applied in a benchmark WDS; after using PRVs, the average leakage rate was reduced by 6.05 l/s (20.64%), which is more favorable than the rate obtained with the existing techniques used for leakage control in the WDS. Compared with earlier studies, a lower number of PRVs was required for optimization, thus the proposed algorithm tends to provide a more cost-effective solution. In conclusion, the proposed algorithm leads to more favorable optimized localization and control of PRV with improved leakage reduction rate.
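
    A drastically simplified sketch of the underlying idea: leakage is pressure-driven (here with the commonly used exponent 1.18), a PRV outlet setting lowers downstream heads, and the setting is chosen to minimize leakage while keeping every node above a minimum service pressure. The hydraulic model, coefficients and node pressures are placeholders; the study itself applies NSGA-II to a full network model.

```python
LEAK_COEFF = 0.002      # emitter coefficient (placeholder value)
LEAK_EXP = 1.18         # commonly used pressure-leakage exponent
MIN_PRESSURE = 20.0     # minimum service pressure required at every node (m)
INLET_HEAD = 60.0       # head just upstream of the candidate PRV (m, placeholder)
node_pressures = [55.0, 48.0, 42.0, 38.0]   # downstream node pressures without the PRV (m, placeholders)

def leakage(pressures):
    """Total leakage with the usual pressure-driven model Q = C * P**N1."""
    return sum(LEAK_COEFF * max(p, 0.0) ** LEAK_EXP for p in pressures)

def apply_prv(setting):
    """Every downstream node sees the head drop introduced at the valve outlet."""
    drop = max(0.0, INLET_HEAD - setting)
    return [p - drop for p in node_pressures]

def best_setting(candidates):
    """Smallest-leakage setting that still keeps every node above MIN_PRESSURE."""
    feasible = [(leakage(apply_prv(s)), s) for s in candidates
                if min(apply_prv(s)) >= MIN_PRESSURE]
    return min(feasible) if feasible else None

leak, setting = best_setting([s / 2.0 for s in range(40, 121)])   # candidate outlet settings 20..60 m
print(f"best PRV outlet setting: {setting:.1f} m, leakage index {leak:.3f} "
      f"(without PRV: {leakage(node_pressures):.3f})")
```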

  11. Intraarticular Sacroiliac Joint Injection Under Computed Tomography Fluoroscopic Guidance: A Technical Note to Reduce Procedural Time and Radiation Dose

    International Nuclear Information System (INIS)

    Paik, Nam Chull

    2016-01-01

    Purpose: A technique for computed tomography fluoroscopy (CTF)-guided intraarticular (IA) sacroiliac joint (SIJ) injection was devised to limit procedural time and radiation dose. Methods: Our Institutional Review Board approved this retrospective analysis and waived the requirement for informed consent. Overall, 36 consecutive diagnostic or therapeutic IA SIJ injections (unilateral, 20; bilateral, 16) performed in 34 patients (female, 18; male, 16) with a mean age of 45.5 years (range 20–76 years) under CTF guidance were analyzed, assessing technical success (i.e., IA contrast spread), procedural time, and radiation dose. Results: All injections were successful from a technical perspective and were free of serious complications. Respective median procedural times and effective doses of SIJ injection were as follows: unilateral, 5.28 min (range 3.58–8.00 min) and 0.11 millisievert (mSv; range 0.07–0.24 mSv); and bilateral, 6.72 min (range 4.17–21.17 min) and 0.11 mSv (range 0.09–0.51 mSv). Conclusions: Given the high rate of technical success achieved in limited time duration and with little radiation exposure, CTF-guided IA SIJ injection is a practical and low-risk procedure.

  12. Intraarticular Sacroiliac Joint Injection Under Computed Tomography Fluoroscopic Guidance: A Technical Note to Reduce Procedural Time and Radiation Dose

    Energy Technology Data Exchange (ETDEWEB)

    Paik, Nam Chull, E-mail: pncspine@gmail.com [Arumdaun Wooldul Spine Hospital, Department of Radiology (Korea, Republic of)

    2016-07-15

    Purpose: A technique for computed tomography fluoroscopy (CTF)-guided intraarticular (IA) sacroiliac joint (SIJ) injection was devised to limit procedural time and radiation dose. Methods: Our Institutional Review Board approved this retrospective analysis and waived the requirement for informed consent. Overall, 36 consecutive diagnostic or therapeutic IA SIJ injections (unilateral, 20; bilateral, 16) performed in 34 patients (female, 18; male, 16) with a mean age of 45.5 years (range 20–76 years) under CTF guidance were analyzed, assessing technical success (i.e., IA contrast spread), procedural time, and radiation dose. Results: All injections were successful from a technical perspective and were free of serious complications. Respective median procedural times and effective doses of SIJ injection were as follows: unilateral, 5.28 min (range 3.58–8.00 min) and 0.11 millisievert (mSv; range 0.07–0.24 mSv); and bilateral, 6.72 min (range 4.17–21.17 min) and 0.11 mSv (range 0.09–0.51 mSv). Conclusions: Given the high rate of technical success achieved in limited time duration and with little radiation exposure, CTF-guided IA SIJ injection is a practical and low-risk procedure.

  13. Effectiveness of Bismuth Shield to Reduce Eye Lens Radiation Dose Using the Photoluminescence Dosimetry in Computed Tomography

    International Nuclear Information System (INIS)

    Jung, Mi Young; Kweon, Dae Cheol; Kwon, Soo Il

    2009-01-01

    The purpose of our study was to determine the eye radiation dose when performing routine multi-detector computed tomography (MDCT). We also evaluated dose reduction and the effect on image quality of using a bismuth eye shield when performing head MDCT. Examinations were performed with a 64-MDCT scanner. To compare the shielded/unshielded lens dose, the examination was performed with and without bismuth shielding in an anthropomorphic phantom. To determine the average lens radiation dose, we imaged an anthropomorphic phantom into which calibrated photoluminescence glass dosimeters (PLD) were placed to measure the dose to the lens. The phantom was imaged using the same protocol. Radiation doses to the lens with and without the lens shielding were measured and compared using the Student t test. In the qualitative evaluation of the MDCT scans, all were considered to be of diagnostic quality. We did not see any differences in quality between the shielded and unshielded brain. The mean radiation doses to the eye without and with the shield were 21.54 and 10.46 mGy, respectively. The lens shield enabled a 51.3% decrease in radiation dose to the lens. Bismuth in-plane shielding for routine eye and head MDCT decreased radiation dose to the lens without qualitative changes in image quality. Other radiosensitive superficial organs must likewise be protected with shielding.

  14. A criterion based on computational singular perturbation for the identification of quasi steady state species: A reduced mechanism for methane oxidation with NO chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Tianfeng; Law, Chung K. [Department of Mechanical and Aerospace Engineering, Princeton University, Princeton, NJ 08544 (United States)

    2008-09-15

    A criterion based on computational singular perturbation (CSP) is proposed to effectively distinguish the quasi steady state (QSS) species from the fast species induced by reactions in partial equilibrium. Together with the method of directed relation graph (DRG), it was applied to the reduction of GRI-Mech 3.0 for methane oxidation, leading to the development of a 19-species reduced mechanism with 15 lumped steps, with the concentrations of the QSS species solved analytically for maximum computational efficiency. Compared to the 12-step and 16-species augmented reduced mechanism (ARM) previously developed by Sung, Law and Chen, three species, namely O, CH₃OH, and CH₂CO, are now excluded from the QSS species list. The reduced mechanism was validated with a variety of phenomena including perfectly stirred reactors, auto-ignition, and premixed and non-premixed flames, with the worst-case error being less than 10% over a wide range of parameters. This mechanism was then supplemented with the reactions involving NO formation, followed by validations in both homogeneous and diffusive systems. (author)

  15. A New Human-Machine Interfaces of Computer-based Procedure System to Reduce the Team Errors in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sa Kil; Sim, Joo Hyun; Lee, Hyun Chul [Korea Atomic Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    In this study, we identify the emerging types of team errors, especially in the digitalized control rooms of nuclear power plants such as the APR-1400 main control room of Korea. Most work in the nuclear industry is performed by a team of more than two persons. Even though individual errors can be detected and recovered by qualified others and/or a well trained team, errors made by the team itself are seldom easily detected and properly recovered by that team. Note that a team is defined as two or more people who interact appropriately with each other; the team is a dependent aggregate which accomplishes a valuable goal. Organizational errors sometimes increase the likelihood of operator errors through the active failure pathway and, at the same time, enhance the possibility of adverse outcomes through defensive weaknesses. We incorporate crew resource management as a representative approach to deal with the team factors of human errors, and suggest a set of crew resource management training procedures for unsafe environments where human errors can have devastating effects. We are developing alternative interfaces against team errors for the condition of using a computer-based procedure system in a digitalized main control room; the computer-based procedure system is a representative feature of a digitalized control room. In this study, we propose new interfaces for the computer-based procedure system to reduce feasible team errors, and an effectiveness test is under way to validate whether the new interfaces can reduce team errors during operation with a computer-based procedure system in a digitalized control room.

  16. A New Human-Machine Interfaces of Computer-based Procedure System to Reduce the Team Errors in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kim, Sa Kil; Sim, Joo Hyun; Lee, Hyun Chul

    2016-01-01

    In this study, we identify the emerging types of team errors, especially in the digitalized control rooms of nuclear power plants such as the APR-1400 main control room of Korea. Most work in the nuclear industry is performed by a team of more than two persons. Even though individual errors can be detected and recovered by qualified others and/or a well trained team, errors made by the team itself are seldom easily detected and properly recovered by that team. Note that a team is defined as two or more people who interact appropriately with each other; the team is a dependent aggregate which accomplishes a valuable goal. Organizational errors sometimes increase the likelihood of operator errors through the active failure pathway and, at the same time, enhance the possibility of adverse outcomes through defensive weaknesses. We incorporate crew resource management as a representative approach to deal with the team factors of human errors, and suggest a set of crew resource management training procedures for unsafe environments where human errors can have devastating effects. We are developing alternative interfaces against team errors for the condition of using a computer-based procedure system in a digitalized main control room; the computer-based procedure system is a representative feature of a digitalized control room. In this study, we propose new interfaces for the computer-based procedure system to reduce feasible team errors, and an effectiveness test is under way to validate whether the new interfaces can reduce team errors during operation with a computer-based procedure system in a digitalized control room.

  17. Innovative computational tools for reducing exploration risk through integration of water-rock interactions and magnetotelluric surveys

    Energy Technology Data Exchange (ETDEWEB)

    Moore, Joseph [Univ. of Utah, Salt Lake City, UT (United States)

    2017-04-20

    Mapping permeability distributions in geothermal reservoirs is essential for reducing the cost of geothermal development. To avoid the cost and sampling bias of measuring permeability directly through drilling, we require remote methods of imaging permeability such as geophysics. Electrical resistivity (or its inverse, conductivity) is one of the most sensitive geophysical properties known to reflect long range fluid interconnection and thus the likelihood of permeability. Perhaps the most widely applied geophysical methods for imaging subsurface resistivity is magnetotellurics (MT) due to its relatively great penetration depths. A primary goal of this project is to confirm through ground truthing at existing geothermal systems that MT resistivity structure interpreted integratively is capable of revealing permeable fluid pathways into geothermal systems.

  18. Effectiveness of Adaptive Statistical Iterative Reconstruction for 64-Slice Dual-Energy Computed Tomography Pulmonary Angiography in Patients With a Reduced Iodine Load: Comparison With Standard Computed Tomography Pulmonary Angiography.

    Science.gov (United States)

    Lee, Ji Won; Lee, Geewon; Lee, Nam Kyung; Moon, Jin Il; Ju, Yun Hye; Suh, Young Ju; Jeong, Yeon Joo

    2016-01-01

    The aim of the study was to assess the effectiveness of the adaptive statistical iterative reconstruction (ASIR) for dual-energy computed tomography pulmonary angiography (DE-CTPA) with a reduced iodine load. One hundred forty patients referred for chest CT were randomly divided into a DE-CTPA group with a reduced iodine load or a standard CTPA group. Quantitative and qualitative image qualities of virtual monochromatic spectral (VMS) images with filtered back projection (VMS-FBP) and those with 50% ASIR (VMS-ASIR) in the DE-CTPA group were compared. Image qualities of VMS-ASIR images in the DE-CTPA group and ASIR images in the standard CTPA group were also compared. All quantitative and qualitative indices in the VMS-ASIR subgroup, except the attenuation value of the pulmonary artery, were superior to those in the VMS-FBP subgroup. VMS-ASIR images were also superior to ASIR images of the standard CTPA group, with a significant difference between VMS-ASIR images of the DE-CTPA group and ASIR images of the standard CTPA group (P = 0.001). The ASIR technique tends to improve the image quality of VMS imaging. Dual-energy computed tomography pulmonary angiography with ASIR can reduce contrast medium volume and produce images of comparable quality with those of standard CTPA.

  19. A modified Wright-Fisher model that incorporates Ne: A variant of the standard model with increased biological realism and reduced computational complexity.

    Science.gov (United States)

    Zhao, Lei; Gossmann, Toni I; Waxman, David

    2016-03-21

    The Wright-Fisher model is an important model in evolutionary biology and population genetics. It has been applied in numerous analyses of finite populations with discrete generations. It is recognised that real populations can behave, in some key aspects, as though their size is not the census size, N, but rather a smaller size, namely the effective population size, Ne. However, in the Wright-Fisher model, there is no distinction between the effective and census population sizes. Equivalently, we can say that in this model, Ne coincides with N. The Wright-Fisher model therefore lacks an important aspect of biological realism. Here, we present a method that allows Ne to be directly incorporated into the Wright-Fisher model. The modified model involves matrices whose size is determined by Ne. Thus apart from increased biological realism, the modified model also has reduced computational complexity, particularly so when Ne ≪ N. For complex problems, it may be hard or impossible to numerically analyse the most commonly-used approximation of the Wright-Fisher model that incorporates Ne, namely the diffusion approximation. An alternative approach is simulation. However, the simulations need to be sufficiently detailed that they yield an effective size that is different to the census size. Simulations may also be time consuming and have attendant statistical errors. The method presented in this work may then be the only alternative to simulations, when Ne differs from N. We illustrate the straightforward application of the method to some problems involving allele fixation and the determination of the equilibrium site frequency spectrum. We then apply the method to the problem of fixation when three alleles are segregating in a population. This latter problem is significantly more complex than a two allele problem and since the diffusion equation cannot be numerically solved, the only other way Ne can be incorporated into the analysis is by simulation. We have
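
    The key point, that the matrices of the modified model are sized by Ne rather than N, can be sketched with a plain binomial Wright-Fisher chain on 2*Ne + 1 allele-count states; this is a simplified illustration, not the authors' exact construction.

```python
import math

def wf_transition_matrix(n_e):
    """Binomial Wright-Fisher transition matrix whose size is set by the effective size Ne
    (2*Ne + 1 allele-count states for a diploid population), not by the census size N."""
    size = 2 * n_e
    T = [[0.0] * (size + 1) for _ in range(size + 1)]
    for i in range(size + 1):
        p = i / size
        for j in range(size + 1):
            T[i][j] = math.comb(size, j) * p ** j * (1.0 - p) ** (size - j)
    return T

def fixation_probability(n_e, start_copies, generations=2000):
    """Mass in the fixed state after iterating the chain from start_copies copies of the allele."""
    T = wf_transition_matrix(n_e)
    size = 2 * n_e
    dist = [0.0] * (size + 1)
    dist[start_copies] = 1.0
    for _ in range(generations):
        dist = [sum(dist[i] * T[i][j] for i in range(size + 1)) for j in range(size + 1)]
    return dist[size]

# A neutral allele at 10% frequency: the fixation probability approaches the initial frequency.
print(fixation_probability(n_e=20, start_copies=4))   # ~0.10
```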

  20. Treatment with a belly-board device significantly reduces the volume of small bowel irradiated and results in low acute toxicity in adjuvant radiotherapy for gynecologic cancer: results of a prospective study

    International Nuclear Information System (INIS)

    Martin, Joseph; Fitzpatrick, Kathryn; Horan, Gail; McCloy, Roisin; Buckney, Steve; O'Neill, Louise; Faul, Clare

    2005-01-01

    Background and purpose: To determine whether treatment prone on a belly-board significantly reduces the volume of small bowel irradiated in women receiving adjuvant radiotherapy for gynecologic cancer, and to prospectively study acute small bowel toxicity using an accepted recording instrument. Material and methods: Thirty-two gynecologic patients underwent simulation with CT scanning supine and prone. Small bowel was delineated on every CT slice, and treatment was prone on the belly-board using 3-5 fields-typically Anterior, Right and Left Lateral, plus or minus Lateral Boosts. Median prescribed dose was 50.4 Gy and all treatments were delivered in 1.8 Gy fractions. Concomitant Cisplatin was administered in 13 patients with cervical carcinoma. Comparison of small bowel dose-volumes was made between supine and prone, with each subject acting as their own matched pair. Acute small bowel toxicity was prospectively measured using the Common Toxicity Criteria: Version 2.0. Results: Treatment prone on the belly-board significantly reduced the volume of small bowel receiving ≥100; ≥95; ≥90; and ≥80% of the prescribed dose, but not ≥50%. This was found whether volume was defined in cubic centimeters or % of total small bowel volume. Of 29 evaluable subjects, 2 (7%) experienced 1 episode each of grade 3 diarrhoea. All other toxicity events were grade 2 or less and comprised diarrhoea (59%), abdominal pain or cramping (48%), nausea (38%), anorexia (17%), vomiting (10%). There were no Grade 4 events and no treatment days were lost due to toxicity. Conclusions: Treatment prone on a belly-board device results in significant small bowel sparing, during adjuvant radiotherapy for gynecologic cancer. The absence of Grade 4 events or Treatment Days Lost compares favorably with the published literature

  1. Respiratory-Gated Positron Emission Tomography and Breath-Hold Computed Tomography Coupling to Reduce the Influence of Respiratory Motion: Methodology and Feasibility

    International Nuclear Information System (INIS)

    Daouk, J.; Fin, L.; Bailly, P.; Meyer, M.E.

    2009-01-01

    Background: Respiratory motion causes uptake in positron emission tomography (PET) images of chest and abdominal structures to be blurred and reduced in intensity. Purpose: To compare two respiratory-gated PET binning methods (based on frequency and amplitude analyses of the respiratory signal) and to propose a 'BH-based' method based on an additional breath-hold computed tomography (CT) acquisition. Material and Methods: Respiratory-gated PET consists in list-mode (LM) acquisition with simultaneous respiratory signal recording. A phantom study featured rectilinear movement of a 0.5-ml sphere filled with 18 F-fluorodeoxyglucose ( 18 F-FDG) solution, placed in a radioactive background (sphere-to-background contrast 6:1). Two patients were also examined. Three figures of merit were calculated: the target-to-background ratio profile (TBRP) in the axial direction through the uptake (i.e., the sphere or lesion), full-width-at-half-maximum (FWHM) values, and maximized standard uptake values (SUVmax). Results: In the phantom study, the peak TBRP was 0.9 for non-gated volume, 1.83 for BH-based volume, and varied between 1.13 and 1.73 for Freq-based volumes and between 1.34 and 1.66 for Amp-based volumes. A reference volume (REF-static) was also acquired for the phantom (in a static, 'expiratory' state), with a peak TBRP at 1.88. TBRPs were computed for patient data, with higher peak values for all gated volumes than for non-gated volumes. Conclusion: Respiratory-gated PET acquisition reduces the blurring effect and increases image contrast. However, Freq-based and Amp-based volumes are still influenced by inappropriate attenuation correction and misregistration of mobile lesions on CT images. The proposed BH-based method both reduces motion artifacts and improves PET-CT registration

  2. Design and methods of the Echo WISELY (Will Inappropriate Scenarios for Echocardiography Lessen SignificantlY) study: An investigator-blinded randomized controlled trial of education and feedback intervention to reduce inappropriate echocardiograms.

    Science.gov (United States)

    Bhatia, R Sacha; Ivers, Noah; Yin, Cindy X; Myers, Dorothy; Nesbitt, Gillian; Edwards, Jeremy; Yared, Kibar; Wadhera, Rishi; Wu, Justina C; Wong, Brian; Hansen, Mark; Weinerman, Adina; Shadowitz, Steven; Johri, Amer; Farkouh, Michael; Thavendiranathan, Paaladinesh; Udell, Jacob A; Rambihar, Sherryn; Chow, Chi-Ming; Hall, Judith; Thorpe, Kevin E; Rakowski, Harry; Weiner, Rory B

    2015-08-01

    Appropriate use criteria (AUC) for transthoracic echocardiography (TTE) were developed to address concerns regarding inappropriate use of TTE. A previous pilot study suggests that an educational and feedback intervention can reduce inappropriate TTEs ordered by physicians in training. It is unknown if this type of intervention will be effective when targeted at attending level physicians in a variety of clinical settings. The aim of this international, multicenter study is to evaluate the hypothesis that an AUC-based educational and feedback intervention will reduce the proportion of inappropriate echocardiograms ordered by attending physicians in the ambulatory environment. In an ongoing multicentered, investigator-blinded, randomized controlled trial across Canada and the United States, cardiologists and primary care physicians practicing in the ambulatory setting will be enrolled. The intervention arm will receive (1) a lecture outlining the AUC and most recent available evidence highlighting appropriate use of TTE, (2) access to the American Society of Echocardiography mobile phone app, and (3) individualized feedback reports e-mailed monthly summarizing TTE ordering behavior including information on inappropriate TTEs and brief explanations of the inappropriate designation. The control group will receive no education on TTE appropriate use and order TTEs as usual practice. The Echo WISELY (Will Inappropriate Scenarios for Echocardiography Lessen Significantly in an education RCT) study is the first multicenter randomized trial of an AUC-based educational intervention. The study will examine whether an education and feedback intervention will reduce the rate of outpatient inappropriate TTEs ordered by attending level cardiologists and primary care physicians (www.clinicaltrials.gov identifier NCT02038101). Copyright © 2015 Elsevier Inc. All rights reserved.

  3. Project Energise: Using participatory approaches and real time computer prompts to reduce occupational sitting and increase work time physical activity in office workers.

    Science.gov (United States)

    Gilson, Nicholas D; Ng, Norman; Pavey, Toby G; Ryde, Gemma C; Straker, Leon; Brown, Wendy J

    2016-11-01

    This efficacy study assessed the added impact real time computer prompts had on a participatory approach to reduce occupational sedentary exposure and increase physical activity. Quasi-experimental. 57 Australian office workers (mean [SD]; age=47 [11] years; BMI=28 [5] kg/m²; 46 men) generated a menu of 20 occupational 'sit less and move more' strategies through participatory workshops, and were then tasked with implementing strategies for five months (July-November 2014). During implementation, a sub-sample of workers (n=24) used a chair sensor/software package (Sitting Pad) that gave real time prompts to interrupt desk sitting. Baseline and intervention sedentary behaviour and physical activity (GENEActiv accelerometer; mean work time percentages), and minutes spent sitting at desks (Sitting Pad; mean total time and longest bout) were compared between non-prompt and prompt workers using a two-way ANOVA. Workers spent close to three quarters of their work time sedentary, mostly sitting at desks (mean [SD]; total desk sitting time=371 [71] min/day; longest bout spent desk sitting=104 [43] min/day). Intervention effects were four times greater in workers who used real time computer prompts (an 8% decrease in work time sedentary behaviour and a corresponding increase in light intensity physical activity) than in workers who did not. Real time computer prompts facilitated the impact of a participatory approach on reductions in occupational sedentary exposure, and increases in physical activity. Copyright © 2016 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  4. The effectiveness of a training method using self-modeling webcam photos for reducing musculoskeletal risk among office workers using computers.

    Science.gov (United States)

    Taieb-Maimon, Meirav; Cwikel, Julie; Shapira, Bracha; Orenstein, Ido

    2012-03-01

    An intervention study was conducted to examine the effectiveness of an innovative self-modeling photo-training method for reducing musculoskeletal risk among office workers using computers. Sixty workers were randomly assigned to either: 1) a control group; 2) an office training group that received personal, ergonomic training and workstation adjustments or 3) a photo-training group that received both office training and an automatic frequent-feedback system that displayed on the computer screen a photo of the worker's current sitting posture together with the correct posture photo taken earlier during office training. Musculoskeletal risk was evaluated using the Rapid Upper Limb Assessment (RULA) method before, during and after the six weeks intervention. Both training methods provided effective short-term posture improvement; however, sustained improvement was only attained with the photo-training method. Both interventions had a greater effect on older workers and on workers suffering more musculoskeletal pain. The photo-training method had a greater positive effect on women than on men. Copyright © 2011 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  5. Left-colon water exchange preserves the benefits of whole colon water exchange at reduced cecal intubation time conferring significant advantage in diagnostic colonoscopy - a prospective, randomized controlled trial.

    Science.gov (United States)

    Wang, Xiangping; Luo, Hui; Xiang, Yi; Leung, Felix W; Wang, Limei; Zhang, Linhui; Liu, Zhiguo; Wu, Kaichun; Fan, Daiming; Pan, Yanglin; Guo, Xuegang

    2015-07-01

    Whole-colon water exchange (WWE) reduces insertion pain, increases cecal intubation success and adenoma detection rate, but requires longer insertion time, compared to air insufflation (AI) colonoscopy. We hypothesized that water exchange limited to the left colon (LWE) can speed up insertion with equivalent results. This prospective, randomized controlled study (NCT01735266) allocated patients (18-80 years) to the WWE, LWE or AI group (1:1:1). The primary outcome was cecal intubation time. Three hundred subjects were randomized to the WWE (n = 100), LWE (n = 100) or AI group (n = 100). Ninety-four to ninety-five per cent of patients underwent diagnostic colonoscopy. Baseline characteristics were balanced. The median insertion time was shorter in the LWE group (4.8 min (95%CI: 3.2-6.2)) than in the WWE (7.5 min (95%CI: 6.0-10.3)) and AI (6.4 min (95%CI: 4.2-9.8)) groups. Cecal intubation success rates in unsedated patients of the two water exchange methods (WWE 99%, LWE 99%) were significantly higher than that (89.8%) in the AI group (p = 0.01). The final success rates were comparable among the three groups after sedation was given. Maximum pain scores and the number of patients needing abdominal compression were comparable between the WWE and LWE groups, and both were lower than those in the AI group; the PDR in the right colon was higher in the WWE group. By preserving the benefits of WWE and reducing insertion time, LWE is appropriate for diagnostic colonoscopy, especially in settings with tight scheduling of patients. The higher PDR in the right colon in the WWE group deserves to be further investigated.

  6. Policaptil Gel Retard significantly reduces body mass index and hyperinsulinism and may decrease the risk of type 2 diabetes mellitus (T2DM) in obese children and adolescents with family history of obesity and T2DM.

    Science.gov (United States)

    Stagi, Stefano; Lapi, Elisabetta; Seminara, Salvatore; Pelosi, Paola; Del Greco, Paolo; Capirchio, Laura; Strano, Massimo; Giglio, Sabrina; Chiarelli, Francesco; de Martino, Maurizio

    2015-02-15

    Treatments for childhood obesity are critically needed because of the risk of developing co-morbidities, although the interventions are frequently time-consuming, frustrating, difficult, and expensive. We conducted a longitudinal, randomised, clinical study, based on a per protocol analysis, on 133 obese children and adolescents (n = 69 males and 64 females; median age, 11.3 years) with family history of obesity and type 2 diabetes mellitus (T2DM). The patients were divided into three arms: Arm A (n = 53 patients), Arm B (n = 45 patients), and Arm C (n = 35 patients) were treated with a low-glycaemic-index (LGI) diet and Policaptil Gel Retard, only an LGI diet, or only an energy-restricted diet (ERD), respectively. The homeostasis model assessment of insulin resistance (HOMA-IR) and the Matsuda, insulinogenic and disposition indexes were calculated at T0 and after 1 year (T1). At T1, the BMI-SD scores were significantly reduced from 2.32 to 1.80 in Arm A and from 2.23 to 1.99 in Arm B, and the proportion of patients with hyperinsulinism fell from 13.2% to 5.6%; the improvements were greater in Arms A and B than in Arm C. The results suggest that Policaptil Gel Retard with an LGI diet reduces body mass index and hyperinsulinism and may decrease the risk of T2DM in obese children and adolescents with family history of obesity and T2DM.

  7. Algebraic computing

    International Nuclear Information System (INIS)

    MacCallum, M.A.H.

    1990-01-01

    The implementation of a new computer algebra system is time consuming: designers of general purpose algebra systems usually say it takes about 50 man-years to create a mature and fully functional system. Hence the range of available systems and their capabilities changes little between one general relativity meeting and the next, despite which there have been significant changes in the period since the last report. The introductory remarks aim to give a brief survey of the capabilities of the principal available systems and highlight one or two trends. A reference to the most recent full survey of computer algebra in relativity is provided, together with brief descriptions of Maple, REDUCE, SHEEP and other applications. (author)

  8. Theory-driven, web-based, computer-tailored advice to reduce and interrupt sitting at work: development, feasibility and acceptability testing among employees.

    Science.gov (United States)

    De Cocker, Katrien; De Bourdeaudhuij, Ilse; Cardon, Greet; Vandelanotte, Corneel

    2015-09-24

    Because of the adverse health effects in adults, interventions to influence workplace sitting, a large contributor to overall daily sedentary time, are needed. Computer-tailored interventions have demonstrated good outcomes in other health behaviours, though few have targeted sitting time at work. Therefore, the present aims were to (1) describe the development of a theory-driven, web-based, computer-tailored advice to influence sitting at work, (2) report on the feasibility of reaching employees, and (3) report on the acceptability of the advice. Employees from a public city service (n = 179) were invited by e-mail to participate. Employees interested in requesting the advice (n = 112) were sent the website link, a personal login and password. The online advice was based on different aspects of the Theory of Planned Behaviour, Self-Determination Theory and Self-Regulation Theory. Logistic regressions were conducted to compare characteristics (gender, age, education, employment status, amount of sitting and psychosocial correlates of workplace sitting) of employees requesting the advice (n = 90, 80.4%) with those who did not. Two weeks after visiting the website, 47 employees (52.2%) completed an online acceptability questionnaire. Those with a high education were more likely to request the advice than those with a low education (OR = 2.4, CI = 1.0-5.8), and those with a part-time job were more likely to request the advice compared to full-time employees (OR = 2.9, CI = 1.2-7.1). The majority found the advice interesting (n = 36/47, 76.6%), relevant (n = 33/47, 70.2%) and motivating (n = 29/47, 61.7%). Fewer employees believed the advice was practicable (n = 15/47, 31.9%). After completing the advice, 58.0% (n = 25/43) reported to have started interrupting their sitting and 32.6% (n = 17/43) additionally intended to do so; 14.0% (n = 6/43) reported to have reduced their sitting and another 51.2% (n = 22/43) intended to do so. More efforts are needed to reach lower educated employees.

  9. A new model predictive control algorithm by reducing the computing time of cost function minimization for NPC inverter in three-phase power grids.

    Science.gov (United States)

    Taheri, Asghar; Zhalebaghi, Mohammad Hadi

    2017-11-01

    This paper presents a new control strategy based on finite-control-set model-predictive control (FCS-MPC) for neutral-point-clamped (NPC) three-level converters. Advantages such as fast dynamic response, easy inclusion of constraints and a simple control loop make the FCS-MPC method attractive as a switching strategy for converters. However, the large amount of required calculation hinders the widespread use of this method. To resolve this problem, this paper presents a modified method that effectively reduces the computation load compared with the conventional FCS-MPC method while not affecting control performance. The proposed method can be used for exchanging power between the electrical grid and DC resources by providing active and reactive power compensation. Experiments on a three-level converter in Power Factor Correction (PFC), inductive compensation and capacitive compensation modes verify good and comparable performance. The results have been simulated using MATLAB/SIMULINK software. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
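
    One control step of FCS-MPC can be sketched as predicting the current for each candidate switching level and picking the one with the lowest cost; restricting the candidate set is one simple way to cut the number of cost evaluations, in the spirit of reducing computing time. The single-phase RL model and all parameter values below are placeholders, not the paper's converter model.

```python
import math

V_DC = 400.0
LEVELS = [-V_DC / 2, 0.0, V_DC / 2]     # the three voltage levels one NPC phase leg can apply
R, L, TS = 0.5, 10e-3, 100e-6           # load resistance (ohm), inductance (H), sampling period (s)

def predict_current(i_now, v_candidate, v_grid):
    """One-step Euler prediction of the phase current for a candidate output voltage."""
    return i_now + (v_candidate - v_grid - R * i_now) * TS / L

def fcs_mpc_step(i_now, i_ref, v_grid, candidates=LEVELS):
    """Evaluate the cost of every candidate level and return the one minimizing the current error."""
    best_level, best_cost = None, math.inf
    for v in candidates:
        cost = abs(i_ref - predict_current(i_now, v, v_grid))
        if cost < best_cost:
            best_level, best_cost = v, cost
    return best_level, best_cost

# Restricting candidates to levels adjacent to the previously applied one is one simple way to
# cut the number of cost evaluations per sampling period.
print(fcs_mpc_step(i_now=5.0, i_ref=8.0, v_grid=100.0))
```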

  10. A Web-based computer-tailored game to reduce binge drinking among 16 to 18 year old Dutch adolescents: development and study protocol.

    Science.gov (United States)

    Jander, Astrid; Crutzen, Rik; Mercken, Liesbeth; de Vries, Hein

    2014-10-09

    In The Netherlands, excessive alcohol use (e.g., binge drinking) is prevalent among adolescents. Alcohol use in general and binge drinking in particular comes with various immediate and long term health risks. Thus, reducing binge drinking among this target group is very important. This article describes a two-arm Cluster Randomized Controlled Trial (CRCT) of an intervention aimed at reducing binge drinking in this target group. The intervention is a Web-based, computer-tailored game in which adolescents receive personalized feedback on their drinking behavior aimed at changing motivational determinants related to this behavior. The development of the game is grounded in the I-Change Model. A CRCT is conducted to test the effectiveness of the game. Adolescents are recruited through schools, and schools are randomized into the experimental condition and the control condition. The experimental condition fills in a baseline questionnaire assessing demographic variables, motivational determinants of behavior (attitude, social influences, self-efficacy, intention) and alcohol use. They are also asked to invite their parents to take part in a short parental component that focusses on setting rules and communicating about alcohol. After completing the baseline questionnaire, the experimental condition continues playing the first of three game scenarios. The primary follow-up measurement takes place after four months and a second follow-up after eight months. The control condition only fills in the baseline, four and eight month follow-up measurement and then receives access to the game (i.e., a waiting list control condition). The effectiveness of the intervention to reduce binge drinking in the previous 30 days and alcohol use in the last week will be assessed. Furthermore, intention to drink and binge drink are assessed. Besides main effects, potential subgroup differences pertaining to gender, age, and educational background are explored. The study described in this

  11. Significance of exercise-induced ST segment depression in patients with myocardial infarction involving the left circumflex artery. Evaluation by exercise thallium-201 myocardial single photon emission computed tomography

    International Nuclear Information System (INIS)

    Koitabashi, Norimichi; Toyama, Takuji; Hoshizaki, Hiroshi

    2000-01-01

    The significance of exercise-induced ST segment depression in patients with left circumflex artery involvement was investigated by comparing exercise electrocardiography with exercise thallium-201 single photon emission computed tomography (Tl-SPECT) and the wall motion estimated by left ventriculography. Tl-SPECT and exercise electrocardiography were simultaneously performed in 51 patients with left circumflex artery involvement (angina pectoris 30, myocardial infarction 21). In patients with myocardial infarction, exercise-induced ST depression was frequently found in the V2, V3 and V4 leads. In patients with angina pectoris, ST depression was frequently found in the II, III, aVF, V5 and V6 leads. There was no obvious difference in the leads of ST depression in patients with myocardial infarction with ischemia and without ischemia on Tl-SPECT images. In patients with myocardial infarction, the lateral wall motion of the infarcted area evaluated by left ventriculography was more significantly impaired in the patients with ST depression than without ST depression (p<0.01). Exercise-induced ST depression in the precordial leads possibly reflects wall motion abnormality rather than ischemia in the lateral infarcted myocardium. (author)

  12. Sensitive Data Protection Based on Intrusion Tolerance in Cloud Computing

    OpenAIRE

    Jingyu Wang; xuefeng Zheng; Dengliang Luo

    2011-01-01

    Service integration and on-demand supply in cloud computing can significantly improve the utilization of computing resources, reduce the power consumption per service, and effectively avoid errors in computing resources. However, cloud computing still faces the problem of intrusion tolerance for the cloud computing platform and for the sensitive data of the new enterprise data center. In order to address the problem of intrusion tolerance of cloud computing platform and sensitive data in...

  13. An automated Y-maze based on a reduced instruction set computer (RISC) microcontroller for the assessment of continuous spontaneous alternation in rats.

    Science.gov (United States)

    Heredia-López, Francisco J; Álvarez-Cervera, Fernando J; Collí-Alfaro, José G; Bata-García, José L; Arankowsky-Sandoval, Gloria; Góngora-Alfaro, José L

    2016-12-01

    Continuous spontaneous alternation behavior (SAB) in a Y-maze is used for evaluating working memory in rodents. Here, the design of an automated Y-maze equipped with three infrared optocouplers per arm, and commanded by a reduced instruction set computer (RISC) microcontroller is described. The software was devised for recording only true entries and exits to the arms. Experimental settings are programmed via a keyboard with three buttons and a display. The sequence of arm entries and the time spent in each arm and the neutral zone (NZ) are saved as a text file in a non-volatile memory for later transfer to a USB flash memory. Data files are analyzed with a program developed under LabVIEW® environment, and the results are exported to an Excel® spreadsheet file. Variables measured are: latency to exit the starting arm, sequence and number of arm entries, number of alternations, alternation percentage, and cumulative times spent in each arm and NZ. The automated Y-maze accurately detected the SAB decrease produced in rats by the muscarinic antagonist trihexyphenidyl, and its reversal by caffeine, having 100 % concordance with the alternation percentages calculated by two trained observers who independently watched videos of the same experiments. Although the values of time spent in the arms and NZ measured by the automated system had small discrepancies with those calculated by the observers, Bland-Altman analysis showed 95 % concordance in three pairs of comparisons, while in one it was 90 %, indicating that this system is a reliable and inexpensive alternative for the study of continuous SAB in rodents.
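
    The scoring described above can be summarised in a few lines. The sketch below is a generic illustration, not the authors' microcontroller firmware or LabVIEW program: every run of three consecutive entries into three different arms counts as one alternation, and the alternation percentage is alternations divided by (entries - 2).

    ```python
    # Score continuous spontaneous alternation from a sequence of Y-maze arm entries.
    def alternation_percentage(entries):
        """entries: sequence of arm labels, e.g. ['A', 'B', 'C', 'A', ...]."""
        if len(entries) < 3:
            return 0.0
        # sliding window of three consecutive entries
        triads = [entries[i:i + 3] for i in range(len(entries) - 2)]
        # a triad counts as an alternation when all three arms differ
        alternations = sum(1 for t in triads if len(set(t)) == 3)
        return 100.0 * alternations / (len(entries) - 2)

    print(alternation_percentage(list("ABCACBABCA")))  # 75.0 for this toy sequence
    ```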

  14. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Full Text Available Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way they can. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  15. Bringing MapReduce Closer To Data With Active Drives

    Science.gov (United States)

    Golpayegani, N.; Prathapan, S.; Warmka, R.; Wyatt, B.; Halem, M.; Trantham, J. D.; Markey, C. A.

    2017-12-01

    Moving computation closer to the data location has been a much theorized improvement to computation for decades. The increase in processor performance, the decrease in processor size and power requirement combined with the increase in data intensive computing has created a push to move computation as close to data as possible. We will show the next logical step in this evolution in computing: moving computation directly to storage. Hypothetical systems, known as Active Drives, have been proposed as early as 1998. These Active Drives would have a general-purpose CPU on each disk allowing for computations to be performed on them without the need to transfer the data to the computer over the system bus or via a network. We will utilize Seagate's Active Drives to perform general purpose parallel computing using the MapReduce programming model directly on each drive. We will detail how the MapReduce programming model can be adapted to the Active Drive compute model to perform general purpose computing with comparable results to traditional MapReduce computations performed via Hadoop. We will show how an Active Drive based approach significantly reduces the amount of data leaving the drive when performing several common algorithms: subsetting and gridding. We will show that an Active Drive based design significantly improves data transfer speeds into and out of drives compared to Hadoop's HDFS while at the same time keeping comparable compute speeds as Hadoop.
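
    As a rough illustration of the programming model being adapted, the sketch below expresses the two algorithms mentioned above, spatial subsetting and gridding, as plain map and reduce functions. It is a generic Python stand-in under an assumed record layout and bounding box, not Seagate's Active Drive interface or the Hadoop code used for comparison.

    ```python
    # MapReduce-style subsetting and gridding: the map step emits (grid cell, value)
    # pairs only for records inside a bounding box; the reduce step averages per cell.
    from collections import defaultdict

    def map_subset_and_grid(record, bbox=(30.0, 40.0, -110.0, -100.0), cell=1.0):
        """Emit (grid_cell, value) pairs for records inside the bounding box."""
        lat, lon, value = record
        lat_min, lat_max, lon_min, lon_max = bbox
        if lat_min <= lat <= lat_max and lon_min <= lon <= lon_max:
            yield (int(lat // cell), int(lon // cell)), value

    def reduce_mean(pairs):
        """Average all values that fall into the same grid cell."""
        sums, counts = defaultdict(float), defaultdict(int)
        for key, value in pairs:
            sums[key] += value
            counts[key] += 1
        return {key: sums[key] / counts[key] for key in sums}

    records = [(35.2, -105.3, 1.0), (35.7, -105.8, 3.0), (55.0, -105.0, 9.0)]
    pairs = [kv for r in records for kv in map_subset_and_grid(r)]
    print(reduce_mean(pairs))  # one cell holding the mean of the two in-box records
    ```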

  16. Cloud Computing with iPlant Atmosphere.

    Science.gov (United States)

    McKay, Sheldon J; Skidmore, Edwin J; LaRose, Christopher J; Mercer, Andre W; Noutsos, Christos

    2013-10-15

    Cloud Computing refers to distributed computing platforms that use virtualization software to provide easy access to physical computing infrastructure and data storage, typically administered through a Web interface. Cloud-based computing provides access to powerful servers, with specific software and virtual hardware configurations, while eliminating the initial capital cost of expensive computers and reducing the ongoing operating costs of system administration, maintenance contracts, power consumption, and cooling. This eliminates a significant barrier to entry into bioinformatics and high-performance computing for many researchers. This is especially true of free or modestly priced cloud computing services. The iPlant Collaborative offers a free cloud computing service, Atmosphere, which allows users to easily create and use instances on virtual servers preconfigured for their analytical needs. Atmosphere is a self-service, on-demand platform for scientific computing. This unit demonstrates how to set up, access and use cloud computing in Atmosphere. Copyright © 2013 John Wiley & Sons, Inc.

  17. Computational Modeling | Bioenergy | NREL

    Science.gov (United States)

    NREL uses computational modeling to study cell walls, which are the source of biofuels and biomaterials, and to investigate their properties. Quantum mechanical models of chemical and electronic properties and processes are used to reduce barriers.

  18. The DYD-RCT protocol: an on-line randomised controlled trial of an interactive computer-based intervention compared with a standard information website to reduce alcohol consumption among hazardous drinkers

    Directory of Open Access Journals (Sweden)

    Godfrey Christine

    2007-10-01

    Full Text Available Abstract Background Excessive alcohol consumption is a significant public health problem throughout the world. Although there are a range of effective interventions to help heavy drinkers reduce their alcohol consumption, these have little proven population-level impact. Researchers internationally are looking at the potential of Internet interventions in this area. Methods/Design In a two-arm randomised controlled trial, an on-line psychologically enhanced interactive computer-based intervention is compared with a flat, text-based information web-site. Recruitment, consent, randomisation and data collection are all on-line. The primary outcome is total past-week alcohol consumption; secondary outcomes include hazardous or harmful drinking, dependence, harm caused by alcohol, and mental health. A health economic analysis is included. Discussion This trial will provide information on the effectiveness and cost-effectiveness of an on-line intervention to help heavy drinkers drink less. Trial registration International Standard Randomised Controlled Trial Number Register ISRCTN31070347

  19. Benefits of adopting good radiation practices in reducing the whole body radiation dose to the nuclear medicine personnel during (18)F-fluorodeoxyglucose positron emission tomography/computed tomography imaging.

    Science.gov (United States)

    Verma, Shashwat; Kheruka, Subhash Chand; Maurya, Anil Kumar; Kumar, Narvesh; Gambhir, Sanjay; Kumari, Sarita

    2016-01-01

    Positron emission tomography has been established as an important imaging modality in the management of patients, especially in oncology. The higher gamma radiation energy of positron-emitting isotopes poses an additional radiation safety problem. Those working with this modality are likely to receive higher whole body doses than those working only in conventional nuclear medicine. The radiation exposure to the personnel occurs in dispensing the dose, administration of activity, patient positioning, and while removing the intravenous (i.v.) cannula. The estimation of radiation dose to the Nuclear Medicine Physician (NMP) involved in administration of activity to the patient and to the technical staff assisting in these procedures in a positron emission tomography/computed tomography (PET/CT) facility was carried out. An i.v. access was secured for the patient by placing the cannula, and blood sugar was monitored. The activity was then dispensed and measured in the dose calibrator and administered to the patient by the NMP. Personnel doses received by the NMP and technical staff were measured using electronic pocket dosimeters. The radiation exposure levels at various working locations were assessed with the help of a gamma survey meter. The radiation level at working distance while administering the radioactivity was found to be 106-170 μSv/h with a mean value of 126.5 ± 14.88 μSv/h, which was reduced to 4.2-14.2 μSv/h with a mean value of 7.16 ± 2.29 μSv/h with the introduction of an L-bench for administration of radioactivity. This shows a mean exposure level reduction of 94.45 ± 1.03%. The radiation level at working distance while removing the i.v. cannula post-scanning was found to be 25-70 μSv/h with a mean value of 37.4 ± 13.16 μSv/h, which was reduced to 1.0-5.0 μSv/h with a mean value of 2.77 ± 1.3 μSv/h with the introduction of the L-bench for removal of the i.v. cannula. This shows a mean exposure level reduction of 92.85 ± 1.78%. This study shows that good radiation practices are

  20. Prostate health index significantly reduced unnecessary prostate biopsies in patients with PSA 2-10 ng/mL and PSA >10 ng/mL: Results from a Multicenter Study in China.

    Science.gov (United States)

    Na, Rong; Ye, Dingwei; Qi, Jun; Liu, Fang; Helfand, Brian T; Brendler, Charles B; Conran, Carly A; Packiam, Vignesh; Gong, Jian; Wu, Yishuo; Zheng, Siqun L; Mo, Zengnan; Ding, Qiang; Sun, Yinghao; Xu, Jianfeng

    2017-08-01

    The performance of prostate health index (phi) in predicting prostate biopsy outcomes has been well established for patients with prostate-specific antigen (PSA) values between 2 and 10 ng/mL. However, the performance of phi remains unknown in patients with PSA >10 ng/mL, the vast majority of Chinese biopsy patients. We aimed to assess the ability of phi to predict prostate cancer (PCa) and high-grade disease (Gleason Score ≥7) on biopsy in a Chinese population. This is a prospective, observational, multi-center study of consecutive patients who underwent a transrectal ultrasound guided prostate biopsy at four hospitals in Shanghai, China from August 2013 to December 2014. In the cohort of 1538 patients, the detection rate of PCa was 40.2%. phi had a significantly better predictive performance for PCa than total PSA (tPSA). The areas under the receiver operating characteristic curve (AUC) were 0.90 and 0.79 for phi and tPSA, respectively (P < 0.001), and the superior performance was retained in patients with tPSA >10 ng/mL (N = 838, 54.5%). The detection rates of PCa were 35.9% and 57.7% in patients with tPSA 10.1-20 and 20.1-50 ng/mL, respectively. The AUCs of phi (0.79 and 0.89 for these two groups, respectively) were also significantly higher than those of tPSA (0.57 and 0.63, respectively), both P < 0.001 (including patients with tPSA >10 ng/mL). © 2017 Wiley Periodicals, Inc.

  1. IGOB131, a novel seed extract of the West African plant Irvingia gabonensis, significantly reduces body weight and improves metabolic parameters in overweight humans in a randomized double-blind placebo controlled investigation

    Directory of Open Access Journals (Sweden)

    Mbofung Carl MF

    2009-03-01

    Full Text Available Abstract Background A recent in vitro study indicates that IGOB131, a novel seed extract of the traditional West African food plant Irvingia gabonensis, favorably impacts adipogenesis through a variety of critical metabolic pathways including PPAR gamma, leptin, adiponectin, and glycerol-3 phosphate dehydrogenase. This study was therefore aimed at evaluating the effects of IGOB131, an extract of Irvingia gabonensis, on body weight and associated metabolic parameters in overweight human volunteers. Methods The study participants comprised 102 healthy, overweight and/or obese volunteers (defined as BMI > 25 kg/m2) randomly divided into two groups. The groups received, on a daily basis, either 150 mg of IGOB131 or matching placebo in a double-blinded fashion, 30–60 minutes before lunch and dinner. At baseline, 4, 8 and 10 weeks of the study, subjects were evaluated for changes in anthropometrics and metabolic parameters to include fasting lipids, blood glucose, C-reactive protein, adiponectin, and leptin. Results Significant improvements in body weight, body fat, and waist circumference as well as plasma total cholesterol, LDL cholesterol, blood glucose, C-reactive protein, adiponectin and leptin levels were observed in the IGOB131 group compared with the placebo group. Conclusion Irvingia gabonensis administered 150 mg twice daily before meals to overweight and/or obese human volunteers favorably impacts body weight and a variety of parameters characteristic of the metabolic syndrome. This is the first double blind randomized placebo controlled clinical trial regarding the anti-obesity and lipid profile modulating effects of an Irvingia gabonensis extract. The positive clinical results, together with our previously published mechanisms of gene expression modulation related to key metabolic pathways in lipid metabolism, provide impetus for much larger clinical studies. Irvingia gabonensis extract may prove to be a useful tool in dealing with the

  2. Frontline diagnostic evaluation of patients suspected of angina by coronary computed tomography reduces downstream resource utilization when compared to conventional ischemia testing

    DEFF Research Database (Denmark)

    Nielsen, L. H.; Markenvard, John; Jensen, Jesper Møller

    2011-01-01

    It has been proposed that the increasing use of coronary computed tomographic angiography (CTA) may introduce additional unnecessary diagnostic procedures. However, no previous study has assessed the impact on downstream test utilization of conventional diagnostic testing relative to CTA in patie...... prospective trials are needed in order to define the most cost-effective diagnostic use of CTA relative to conventional ischemia testing....

  3. Quasi-monte carlo simulation and variance reduction techniques substantially reduce computational requirements of patient-level simulation models: An application to a discrete event simulation model

    NARCIS (Netherlands)

    Treur, M.; Postma, M.

    2014-01-01

    Objectives: Patient-level simulation models provide increased flexibility to overcome the limitations of cohort-based approaches in health-economic analysis. However, the computational requirements of reaching convergence are a notorious barrier. The objective was to assess the impact of using

  4. Coregistration of the initial PET with the radiotherapy planning CT significantly reduces the variability of the clinical target volume in childhood Hodgkin disease

    Energy Technology Data Exchange (ETDEWEB)

    Metwally, H.; Blouet, A.; David, I.; Rives, M.; Izar, F.; Courbon, F.; Filleron, T.; Laprie, A. [Institut Claudius-Regaud, 31 - Toulouse (France); Plat, G.; Vial, J. [CHU-hopital des Enfants, 31 - Toulouse (France)

    2009-10-15

    There is great interobserver variability in the definition of the clinical target volume (CTV) in children suffering from Hodgkin disease. In this study, coregistration of the FDG-PET with the planning computed tomography led to significantly greater coherence in the clinical target volume definition. (N.C.)

  5. Significance evaluation in factor graphs

    DEFF Research Database (Denmark)

    Madsen, Tobias; Hobolth, Asger; Jensen, Jens Ledet

    2017-01-01

    in genomics and the multiple-testing issues accompanying them, accurate significance evaluation is of great importance. We here address the problem of evaluating statistical significance of observations from factor graph models. Results Two novel numerical approximations for evaluation of statistical...... significance are presented. First a method using importance sampling. Second a saddlepoint approximation based method. We develop algorithms to efficiently compute the approximations and compare them to naive sampling and the normal approximation. The individual merits of the methods are analysed both from....... Conclusions The applicability of saddlepoint approximation and importance sampling is demonstrated on known models in the factor graph framework. Using the two methods we can substantially improve computational cost without compromising accuracy. This contribution allows analyses of large datasets...
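
    The gain from importance sampling when estimating very small tail probabilities can be seen in a generic toy example, shown below. This is not the paper's factor-graph algorithm; it simply estimates a Gaussian tail probability by drawing from a proposal shifted into the tail and reweighting by the density ratio, which is the same idea applied here to a far-out test statistic.

    ```python
    # Importance sampling for a small tail probability P(X >= t) under a N(0,1) null.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    t, n = 5.0, 100_000

    # naive Monte Carlo: almost no samples land beyond t, so the estimate is poor
    naive = np.mean(rng.standard_normal(n) >= t)

    # importance sampling from N(t, 1); weight = null density / proposal density
    x = rng.normal(loc=t, scale=1.0, size=n)
    w = stats.norm.pdf(x) / stats.norm.pdf(x, loc=t)
    is_est = np.mean(w * (x >= t))

    print(f"exact   : {stats.norm.sf(t):.3e}")
    print(f"naive MC: {naive:.3e}")
    print(f"IS      : {is_est:.3e}")
    ```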

  6. Prospective evaluation of reduced dose computed tomography for the detection of low-contrast liver lesions. Direct comparison with concurrent standard dose imaging

    International Nuclear Information System (INIS)

    Pooler, B.D.; Lubner, Meghan G.; Kim, David H.; Chen, Oliver T.; Li, Ke; Chen, Guang-Hong; Pickhardt, Perry J.

    2017-01-01

    To prospectively compare the diagnostic performance of reduced-dose (RD) contrast-enhanced CT (CECT) with standard-dose (SD) CECT for detection of low-contrast liver lesions. Seventy adults with non-liver primary malignancies underwent abdominal SD-CECT immediately followed by RD-CECT, aggressively targeted at 60-70 % dose reduction. SD series were reconstructed using FBP. RD series were reconstructed with FBP, ASIR, and MBIR (Veo). Three readers - blinded to clinical history and comparison studies - reviewed all series, identifying liver lesions ≥4 mm. Non-blinded review by two experienced abdominal radiologists - assessing SD against available clinical and radiologic information - established the reference standard. RD-CECT mean effective dose was 2.01 ± 1.36 mSv (median, 1.71), a 64.1 ± 8.8 % reduction. Pooled per-patient performance data were (sensitivity/specificity/PPV/NPV/accuracy) 0.91/0.78/0.60/0.96/0.81 for SD-FBP compared with RD-FBP 0.79/0.75/0.54/0.91/0.76; RD-ASIR 0.84/0.75/0.56/0.93/0.78; and RD-MBIR 0.84/0.68/0.49/0.92/0.72. ROC AUC values were 0.896/0.834/0.858/0.854 for SD-FBP/RD-FBP/RD-ASIR/RD-MBIR, respectively. RD-FBP (P = 0.002) and RD-MBIR (P = 0.032) AUCs were significantly lower than those of SD-FBP; RD-ASIR was not (P = 0.052). Reader confidence was lower for all RD series (P < 0.001) compared with SD-FBP, especially when calling patients entirely negative. Aggressive CT dose reduction resulted in inferior diagnostic performance and reader confidence for detection of low-contrast liver lesions compared to SD. Relative to RD-ASIR, RD-FBP showed decreased sensitivity and RD-MBIR showed decreased specificity. (orig.)
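
    The pooled per-patient figures quoted above are ordinary 2x2-table metrics. The helper below merely shows how sensitivity, specificity, PPV, NPV, and accuracy are derived from true/false positive/negative counts; the counts in the usage line are hypothetical, not the study's data.

    ```python
    # Standard diagnostic performance metrics from a 2x2 confusion table.
    def diagnostic_metrics(tp, fp, tn, fn):
        return {
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp),
            "npv": tn / (tn + fn),
            "accuracy": (tp + tn) / (tp + fp + tn + fn),
        }

    # hypothetical counts, for illustration only
    print(diagnostic_metrics(tp=21, fp=14, tn=36, fn=2))
    ```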

  7. Prospective evaluation of reduced dose computed tomography for the detection of low-contrast liver lesions. Direct comparison with concurrent standard dose imaging

    Energy Technology Data Exchange (ETDEWEB)

    Pooler, B.D.; Lubner, Meghan G.; Kim, David H.; Chen, Oliver T. [University of Wisconsin School of Medicine and Public Health, Department of Radiology, Madison, WI (United States); Li, Ke; Chen, Guang-Hong [University of Wisconsin School of Medicine and Public Health, Department of Radiology, Madison, WI (United States); University of Wisconsin School of Medicine and Public Health, Department of Medical Physics, Madison, WI (United States); Pickhardt, Perry J. [University of Wisconsin School of Medicine and Public Health, Department of Radiology, Madison, WI (United States); University of Wisconsin School of Medicine and Public Health, E3/311 Clinical Science Center, Department of Radiology, Madison, WI (United States)

    2017-05-15

    To prospectively compare the diagnostic performance of reduced-dose (RD) contrast-enhanced CT (CECT) with standard-dose (SD) CECT for detection of low-contrast liver lesions. Seventy adults with non-liver primary malignancies underwent abdominal SD-CECT immediately followed by RD-CECT, aggressively targeted at 60-70 % dose reduction. SD series were reconstructed using FBP. RD series were reconstructed with FBP, ASIR, and MBIR (Veo). Three readers - blinded to clinical history and comparison studies - reviewed all series, identifying liver lesions ≥4 mm. Non-blinded review by two experienced abdominal radiologists - assessing SD against available clinical and radiologic information - established the reference standard. RD-CECT mean effective dose was 2.01 ± 1.36 mSv (median, 1.71), a 64.1 ± 8.8 % reduction. Pooled per-patient performance data were (sensitivity/specificity/PPV/NPV/accuracy) 0.91/0.78/0.60/0.96/0.81 for SD-FBP compared with RD-FBP 0.79/0.75/0.54/0.91/0.76; RD-ASIR 0.84/0.75/0.56/0.93/0.78; and RD-MBIR 0.84/0.68/0.49/0.92/0.72. ROC AUC values were 0.896/0.834/0.858/0.854 for SD-FBP/RD-FBP/RD-ASIR/RD-MBIR, respectively. RD-FBP (P = 0.002) and RD-MBIR (P = 0.032) AUCs were significantly lower than those of SD-FBP; RD-ASIR was not (P = 0.052). Reader confidence was lower for all RD series (P < 0.001) compared with SD-FBP, especially when calling patients entirely negative. Aggressive CT dose reduction resulted in inferior diagnostic performance and reader confidence for detection of low-contrast liver lesions compared to SD. Relative to RD-ASIR, RD-FBP showed decreased sensitivity and RD-MBIR showed decreased specificity. (orig.)

  8. A comparison of the reduced and approximate systems for the time dependent computation of the polar wind and multiconstituent stellar winds

    International Nuclear Information System (INIS)

    Browning, G.L.; Holzer, T.E.

    1992-01-01

    The reduced system of equations commonly used to describe the time evolution of the polar wind and multiconstituent stellar winds is derived from the equations for a multispecies plasma with known temperature profiles by assuming that the electron thermal speed approaches infinity. The reduced system is proved to have unbounded growth near the sonic point of the protons for many of the standard parameter cases. For the same parameter cases, however, the unmodified system (from which the reduced system is derived) exhibits growth in some of the Fourier modes, but this growth is bounded. An alternate system (the approximate system) in which the electron thermal speed is slowed down is introduced. The approximate system retains the mathematical behavior of the unmodified system and can be shown to accurately describe the smooth solutions of the unmodified system. The approximate system has a number of other advantages over the reduced system, particularly in the regime where the reduced system becomes inaccurate. Also, for three-dimensional flows the correct reduced system requires the solution of an elliptic equation, while the approximate system is hyperbolic and only requires a time step approximately 1 order of magnitude less than the reduced system. Numerical solutions from models based on the two systems are compared with each other to illustrate these points.

  9. A big oil company's approach to significantly reduce fatal incidents

    NARCIS (Netherlands)

    Peuscher, W.; Groeneweg, J.

    2012-01-01

    Within the Shell Group of companies (Shell), keeping people safe at work is a deeply held value and the company actively pursues the goal of no harm to people. Shell actively works to build a culture where every employee and contractor takes responsibility for making this goal possible - it is

  10. Significantly reducing registration time in IGRT using graphics processing units

    DEFF Research Database (Denmark)

    Noe, Karsten Østergaard; Denis de Senneville, Baudouin; Tanderup, Kari

    2008-01-01

    respiration phases in a free breathing volunteer and 41 anatomical landmark points in each image series. The registration method used is a multi-resolution GPU implementation of the 3D Horn and Schunck algorithm. It is based on the CUDA framework from Nvidia. Results On an Intel Core 2 CPU at 2.4GHz each...... registration took 30 minutes. On an Nvidia Geforce 8800GTX GPU in the same machine this registration took 37 seconds, making the GPU version 48.7 times faster. The nine image series of different respiration phases were registered to the same reference image (full inhale). Accuracy was evaluated on landmark...

  11. Innovative Phase Change Approach for Significant Energy Savings

    Science.gov (United States)

    2016-09-01

    Final report for ESTCP Project EW-201138, Innovative Phase Change Approach for Significant Energy Savings, September 2016, Dr. Aly H Shaaban, Applied... The effort concerns technology related to the production, use, transmission, storage, control, or conservation of energy that will (A) reduce the need for additional energy supplies... Conditions set for operation were: a. the computer with the broadband wireless card is to be used for data collection, transmission and...

  12. Reduced sintering of mass-selected Au clusters on SiO2 by alloying with Ti: an aberration-corrected STEM and computational study

    DEFF Research Database (Denmark)

    Niu, Yubiao; Schlexer, Philomena; Sebök, Béla

    2018-01-01

    Au nanoparticles represent the most remarkable example of a size effect in heterogeneous catalysis. However, a major issue hindering the use of Au nanoparticles in technological applications is their rapid sintering. We explore the potential of stabilizing Au nanoclusters on SiO2 by alloying them...... in the Au/Ti clusters, but in line with the model computational investigation, Au atoms were still present on the surface. Thus size-selected, deposited nanoalloy Au/Ti clusters appear to be promising candidates for sustainable gold-based nanocatalysis....

  13. A Comparison of Computer-Assisted and Self-Management Programs for Reducing Alcohol Use among Students in First Year Experience Courses

    Science.gov (United States)

    Lane, David J.; Lindemann, Dana F.; Schmidt, James A.

    2012-01-01

    The National Institute of Alcohol Abuse and Alcoholism has called for the use of evidence-based approaches to address high-risk drinking prevalent on many college campuses. In line with this recommendation, the present study evaluated the efficacy of two evidence-based approaches to reducing alcohol use. One hundred and three college students in…

  14. A reduced-scaling density matrix-based method for the computation of the vibrational Hessian matrix at the self-consistent field level

    International Nuclear Information System (INIS)

    Kussmann, Jörg; Luenser, Arne; Beer, Matthias; Ochsenfeld, Christian

    2015-01-01

    An analytical method to calculate the molecular vibrational Hessian matrix at the self-consistent field level is presented. By analysis of the multipole expansions of the relevant derivatives of Coulomb-type two-electron integral contractions, we show that the effect of the perturbation on the electronic structure due to the displacement of nuclei decays at least as r^-2 instead of r^-1. The perturbation is asymptotically local, and the computation of the Hessian matrix can, in principle, be performed with O(N) complexity. Our implementation exhibits linear scaling in all time-determining steps, with some rapid but quadratic-complexity steps remaining. Sample calculations illustrate linear or near-linear scaling in the construction of the complete nuclear Hessian matrix for sparse systems. For more demanding systems, scaling is still considerably sub-quadratic to quadratic, depending on the density of the underlying electronic structure.

  15. A theory-based computer mediated communication intervention to promote mental health and reduce high-risk behaviors in the LGBT population.

    Science.gov (United States)

    DiNapoli, Jean Marie; Garcia-Dia, Mary Joy; Garcia-Ona, Leila; O'Flaherty, Deirdre; Siller, Jennifer

    2014-02-01

    The Healthy People 2020 (2012) report has identified that isolation, lack of social services, and a shortage of culturally competent providers serve as barriers to the health of lesbian, gay, bisexual, and transgender (LGBT) individuals who have HIV/AIDS. Self-transcendence theory proposes that individuals who face increased vulnerability or mortality may acquire an increased capacity for self-transcendence and its positive influence on mental health and well-being. The use of technology-enabled social and community support and group interventions through computer mediated self-help (CMSH) with LGBT individuals may help meet mental health needs of this group, and support healthy lifestyle practices. This article presents an overview of steps taken to propose a theory-based CMSH intervention for testing in research and eventual application in practice. © 2013.

  16. Diagnostic value of thallium-201 myocardial perfusion IQ-SPECT without and with computed tomography-based attenuation correction to predict clinically significant and insignificant fractional flow reserve: A single-center prospective study.

    Science.gov (United States)

    Tanaka, Haruki; Takahashi, Teruyuki; Ohashi, Norihiko; Tanaka, Koichi; Okada, Takenori; Kihara, Yasuki

    2017-12-01

    The aim of this study was to clarify the predictive value of fractional flow reserve (FFR) determined by myocardial perfusion imaging (MPI) using thallium (Tl)-201 IQ-SPECT without and with computed tomography-based attenuation correction (CT-AC) for patients with stable coronary artery disease (CAD). We assessed 212 angiographically identified diseased vessels using adenosine-stress Tl-201 MPI-IQ-SPECT/CT in 84 consecutive, prospectively identified patients with stable CAD. We compared the FFR in 136 of the 212 diseased vessels using visual semiquantitative interpretations of corresponding territories on MPI-IQ-SPECT images without and with CT-AC. FFR inversely correlated most accurately with regional summed difference scores (rSDS) in images without and with CT-AC (r = -0.584 and r = -0.568, respectively, both P < 0.001). ... system can predict FFR at an optimal cut-off of ... All rights reserved.

  17. Detecting Novelty and Significance

    Science.gov (United States)

    Ferrari, Vera; Bradley, Margaret M.; Codispoti, Maurizio; Lang, Peter J.

    2013-01-01

    Studies of cognition often use an “oddball” paradigm to study effects of stimulus novelty and significance on information processing. However, an oddball tends to be perceptually more novel than the standard, repeated stimulus as well as more relevant to the ongoing task, making it difficult to disentangle effects due to perceptual novelty and stimulus significance. In the current study, effects of perceptual novelty and significance on ERPs were assessed in a passive viewing context by presenting repeated and novel pictures (natural scenes) that either signaled significant information regarding the current context or not. A fronto-central N2 component was primarily affected by perceptual novelty, whereas a centro-parietal P3 component was modulated by both stimulus significance and novelty. The data support an interpretation that the N2 reflects perceptual fluency and is attenuated when a current stimulus matches an active memory representation and that the amplitude of the P3 reflects stimulus meaning and significance. PMID:19400680

  18. Reducing computational costs in large scale 3D EIT by using a sparse Jacobian matrix with block-wise CGLS reconstruction

    International Nuclear Information System (INIS)

    Yang, C L; Wei, H Y; Soleimani, M; Adler, A

    2013-01-01

    Electrical impedance tomography (EIT) is a fast and cost-effective technique to provide a tomographic conductivity image of a subject from boundary current–voltage data. This paper proposes a time and memory efficient method for solving a large scale 3D EIT inverse problem using a parallel conjugate gradient (CG) algorithm. A 3D EIT system with a large number of measurement data can produce a large Jacobian matrix; this could cause difficulties in computer storage and the inversion process. One of the challenges in 3D EIT is to decrease the reconstruction time and memory usage while at the same time retaining the image quality. Firstly, a sparse matrix reduction technique is proposed using thresholding to set very small values of the Jacobian matrix to zero. By converting the Jacobian matrix into a sparse format, the zero elements are eliminated, which results in a saving of memory requirement. Secondly, a block-wise CG method for parallel reconstruction has been developed. The proposed method has been tested using simulated data as well as experimental test samples. A sparse Jacobian with block-wise CG enables the large scale EIT problem to be solved efficiently. Image quality measures are presented to quantify the effect of sparse matrix reduction in reconstruction results. (paper)
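
    A compact, generic sketch of the two ideas described above is given below: thresholding small Jacobian entries to zero to obtain a sparse matrix, then solving the regularized normal equations with a conjugate-gradient solver. It is an illustration under assumed problem sizes, threshold, and regularization, not the authors' block-wise GPU implementation.

    ```python
    # Sparsify a Jacobian by thresholding, then solve (J^T J + lam I) x = J^T b with CG.
    import numpy as np
    from scipy import sparse
    from scipy.sparse.linalg import cg

    rng = np.random.default_rng(1)
    m, n = 2000, 500                       # measurements x image elements (assumed)
    J = rng.standard_normal((m, n)) * np.exp(-rng.uniform(0, 10, (m, n)))  # many tiny entries
    b = rng.standard_normal(m)             # toy boundary voltage data

    # 1) sparsify: zero out entries below a fraction of the largest magnitude
    threshold = 1e-3 * np.abs(J).max()
    J_sparse = sparse.csr_matrix(np.where(np.abs(J) >= threshold, J, 0.0))
    print(f"kept {J_sparse.nnz / J.size:.1%} of Jacobian entries")

    # 2) CG on the Tikhonov-regularized normal equations
    lam = 1e-2
    A = (J_sparse.T @ J_sparse + lam * sparse.eye(n)).tocsr()
    x, info = cg(A, J_sparse.T @ b)
    print("CG converged:", info == 0)
    ```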

  19. Reducing computational costs in large scale 3D EIT by using a sparse Jacobian matrix with block-wise CGLS reconstruction.

    Science.gov (United States)

    Yang, C L; Wei, H Y; Adler, A; Soleimani, M

    2013-06-01

    Electrical impedance tomography (EIT) is a fast and cost-effective technique to provide a tomographic conductivity image of a subject from boundary current-voltage data. This paper proposes a time and memory efficient method for solving a large scale 3D EIT inverse problem using a parallel conjugate gradient (CG) algorithm. A 3D EIT system with a large number of measurement data can produce a large Jacobian matrix; this could cause difficulties in computer storage and the inversion process. One of the challenges in 3D EIT is to decrease the reconstruction time and memory usage while at the same time retaining the image quality. Firstly, a sparse matrix reduction technique is proposed using thresholding to set very small values of the Jacobian matrix to zero. By converting the Jacobian matrix into a sparse format, the zero elements are eliminated, which results in a saving of memory requirement. Secondly, a block-wise CG method for parallel reconstruction has been developed. The proposed method has been tested using simulated data as well as experimental test samples. A sparse Jacobian with block-wise CG enables the large scale EIT problem to be solved efficiently. Image quality measures are presented to quantify the effect of sparse matrix reduction in reconstruction results.

  20. Significant NRC Enforcement Actions

    Data.gov (United States)

    Nuclear Regulatory Commission — This dataset provides a list of significant enforcement actions issued by the Nuclear Regulatory Commission (NRC). These actions, referred to as "escalated", are issued by...

  1. Computational models can predict response to HIV therapy without a genotype and may reduce treatment failure in different resource-limited settings.

    Science.gov (United States)

    Revell, A D; Wang, D; Wood, R; Morrow, C; Tempelman, H; Hamers, R L; Alvarez-Uria, G; Streinu-Cercel, A; Ene, L; Wensing, A M J; DeWolf, F; Nelson, M; Montaner, J S; Lane, H C; Larder, B A

    2013-06-01

    Genotypic HIV drug-resistance testing is typically 60%-65% predictive of response to combination antiretroviral therapy (ART) and is valuable for guiding treatment changes. Genotyping is unavailable in many resource-limited settings (RLSs). We aimed to develop models that can predict response to ART without a genotype and evaluated their potential as a treatment support tool in RLSs. Random forest models were trained to predict the probability of response to ART (≤400 copies HIV RNA/mL) using the following data from 14 891 treatment change episodes (TCEs) after virological failure, from well-resourced countries: viral load and CD4 count prior to treatment change, treatment history, drugs in the new regimen, time to follow-up and follow-up viral load. Models were assessed by cross-validation during development, with an independent set of 800 cases from well-resourced countries, plus 231 cases from Southern Africa, 206 from India and 375 from Romania. The area under the receiver operating characteristic curve (AUC) was the main outcome measure. The models achieved an AUC of 0.74-0.81 during cross-validation and 0.76-0.77 with the 800 test TCEs. They achieved AUCs of 0.58-0.65 (Southern Africa), 0.63 (India) and 0.70 (Romania). Models were more accurate for data from the well-resourced countries than for cases from Southern Africa and India (P < 0.001), but not Romania. The models identified alternative, available drug regimens predicted to result in virological response for 94% of virological failures in Southern Africa, 99% of those in India and 93% of those in Romania. We developed computational models that predict virological response to ART without a genotype with comparable accuracy to genotyping with rule-based interpretation. These models have the potential to help optimize antiretroviral therapy for patients in RLSs where genotyping is not generally available.
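
    A minimal sketch of the modelling approach described above is given below: a random forest classifier trained to predict response from baseline features and evaluated by cross-validated AUC. The features and labels are synthetic stand-ins generated by scikit-learn, not the HIV treatment-change data used in the study.

    ```python
    # Random forest response prediction evaluated by cross-validated ROC AUC.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    # synthetic stand-in for baseline viral load, CD4 count, treatment history, etc.
    X, y = make_classification(n_samples=2000, n_features=20, n_informative=8,
                               weights=[0.4, 0.6], random_state=0)

    model = RandomForestClassifier(n_estimators=300, random_state=0)
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"cross-validated AUC: {auc.mean():.2f} +/- {auc.std():.2f}")
    ```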

  2. Parallel algorithms and architectures for computational structural mechanics

    Science.gov (United States)

    Patrick, Merrell; Ma, Shing; Mahajan, Umesh

    1989-01-01

    The determination of the fundamental (lowest) natural vibration frequencies and associated mode shapes is a key step used to uncover and correct potential failures or problem areas in most complex structures. However, the computation time taken by finite element codes to evaluate these natural frequencies is significant, often the most computationally intensive part of structural analysis calculations. There is a continuing need to reduce this computation time. This study addresses this need by developing methods for parallel computation.
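
    For context, the underlying computation is a generalized eigenvalue problem, K v = (omega^2) M v, whose smallest eigenvalues give the fundamental frequencies. The sketch below solves a toy version serially with a shift-invert sparse eigensolver; the spring-mass chain and its stiffness and mass values are assumptions, and it does not reproduce the parallel methods developed in the study.

    ```python
    # Lowest natural frequencies of a toy fixed-free spring-mass chain.
    import numpy as np
    from scipy import sparse
    from scipy.sparse.linalg import eigsh

    n = 1000
    ks, m = 1.0e4, 2.0                                  # spring stiffness, lumped mass (assumed)
    main = 2 * ks * np.ones(n)
    main[-1] = ks                                       # free end
    K = sparse.diags([main, -ks * np.ones(n - 1), -ks * np.ones(n - 1)],
                     [0, -1, 1], format="csc")
    M = m * sparse.identity(n, format="csc")

    # smallest eigenvalues of K v = (omega^2) M v via shift-invert about sigma = 0
    vals, vecs = eigsh(K, k=4, M=M, sigma=0, which="LM")
    freqs_hz = np.sqrt(vals) / (2 * np.pi)
    print("lowest natural frequencies (Hz):", np.round(freqs_hz, 4))
    ```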

  3. Dose-reduced 16-slice multidetector-row spiral computed tomography in children with bronchoscopically suspected vascular tracheal stenosis - initial results

    International Nuclear Information System (INIS)

    Honnef, D.; Wildberger, J.E.; Das, M.; Hohl, C.; Mahnken, A.; Guenther, R.W.; Staatz, G.; Schnoering, H.; Vazquez-Jimenez, J.

    2006-01-01

    Purpose: To evaluate the diagnostic accuracy of contrast-enhanced dose-reduced 16-slice multidetector-row CT (MDCT) in newborns and infants with fiberoptic bronchoscopically suspected vascular-induced tracheal stenosis. Materials and Methods: 12 children (4 days to 3 years, 1.2-13.5 kg body weight) were examined using i.v. contrast-enhanced 16-slice MDCT (SOMATOM Sensation 16, Forchheim, Germany) without breath-hold and under sedation (11/12). All MDCTs were performed with a dose reduction. The beam collimation was 16 x 0.75 mm, except in the case of one child. MPRs along the tracheal axis in the x-, y- and z-directions and volume-rendering-reconstructions (VRTs) were calculated based on a secondary raw data set in addition to conventional axial slices. 2 radiologists used a three-point grade scale to evaluate the image quality, motion, and contrast media artifacts as well as the usefulness of the 2D- and 3D-reconstructions for determining the diagnosis. Statistical analysis was performed on the basis of a Kappa test. Results: In all cases the cause of the fiberoptic bronchoscopically suspected tracheal stenosis was revealed: compression due to the brachiocephalic trunk (n=7), double aortic arch (n=2), lusorian artery (n=1), vascular compression of the left main bronchus (n=2). In 3 patients further thoracic anomalies, such as tracheobronchial (n=2), and vascular (n=2) and vertebral (n=1) anomalies were found. The attenuation in the anomalous vessels was 307±140 HU. The image noise was 9.8±1.9 HU. The mean dose reduction was 82.7±3.2% compared to a standard adult thoracic CT. All examinations were rated as diagnostically good (median 1, range 1, k=1). 3D images did not show any stair artifacts (median 2, range 1-2, k=1). The image noise was minor to moderate and hardly any motion artifacts were seen (median 1, range 1-2, k=0.8). Contrast media artifacts were rated zero to minor (median 1.5, range 1-2, k=0.676). MPRs (median 1, range 1, k=1) and VRTs (median 1

  4. Determining the role of missense mutations in the POU domain of HNF1A that reduce the DNA-binding affinity: A computational approach.

    Directory of Open Access Journals (Sweden)

    Sneha P

    Full Text Available Maturity-onset diabetes of the young type 3 (MODY3) is a non-ketotic form of diabetes associated with poor insulin secretion. Over the past years, several studies have reported the association of missense mutations in Hepatocyte Nuclear Factor 1 Alpha (HNF1A) with MODY3. Missense mutations in the POU homeodomain (POUH) of HNF1A hinder binding to the DNA, thereby leading to a dysfunctional protein. Missense mutations of HNF1A were retrieved from public databases and subjected to a three-step computational mutational analysis to identify the underlying mechanism. First, the pathogenicity and stability of the mutations were analyzed to determine whether they alter protein structure and function. Second, the sequence conservation and DNA-binding sites of the mutant positions were assessed, as the HNF1A protein is a transcription factor. Finally, the biochemical properties of the biological system were validated using molecular dynamic simulations in the Gromacs 4.6.3 package. Two arginine residues (131 and 203) in the HNF1A protein are highly conserved residues and contribute to the function of the protein. Furthermore, the R131W, R131Q, and R203C mutations were predicted to be highly deleterious by in silico tools and showed lower binding affinity with DNA when compared to the native protein using the molecular docking analysis. Triplicate runs of molecular dynamic (MD) simulations (50 ns) revealed smaller changes in patterns of deviation, fluctuation, and compactness in complexes containing the R131Q and R131W mutations, compared to the R203C mutant complex. We observed reduction in the number of intermolecular hydrogen bonds, compactness, and electrostatic potential, as well as the loss of salt bridges, in the R203C mutant complex. Substitution of arginine with cysteine at position 203 decreases the affinity of the protein for DNA, thereby destabilizing the protein. Based on our current findings, the MD approach is an important

  5. Doses of Coronary Study in 64 Channel Multi-Detector Computed Tomography : Reduced Radiation Dose According to Varity of Examnination Protocols

    International Nuclear Information System (INIS)

    Kim, Moon Chan

    2009-01-01

    that of conventional coronary CTA. The heart dose was 33.8 mGy, which represents a 67.4% reduction. In the sequential scan technique under prospective ECG gating with low kVp, the mean effective dose was 3.0 mSv, which represents an 83.2% reduction compared with that of conventional coronary CTA, and the heart dose was 17.7 mGy, which represents an 82.9% reduction. In coronary CTA with the retrospectively ECG-gated helical scan, the cardiac dose modulation technique using low kVp reduced the dose by 50% or more compared with the conventional helical scan, and the prospectively ECG-gated sequential scan offers a substantially reduced dose compared with the traditional retrospectively ECG-gated helical scan.

  6. The feasibility of using computer graphics in environmental evaluations : interim report, documenting historic site locations using computer graphics.

    Science.gov (United States)

    1981-01-01

    This report describes a method for locating historic site information using a computer graphics program. If adopted for use by the Virginia Department of Highways and Transportation, this method should significantly reduce the time now required to de...

  7. Significant Tsunami Events

    Science.gov (United States)

    Dunbar, P. K.; Furtney, M.; McLean, S. J.; Sweeney, A. D.

    2014-12-01

    Tsunamis have inflicted death and destruction on the coastlines of the world throughout history. The occurrence of tsunamis and the resulting effects have been collected and studied as far back as the second millennium B.C. The knowledge gained from cataloging and examining these events has led to significant changes in our understanding of tsunamis, tsunami sources, and methods to mitigate the effects of tsunamis. The most significant, not surprisingly, are often the most devastating, such as the 2011 Tohoku, Japan earthquake and tsunami. The goal of this poster is to give a brief overview of the occurrence of tsunamis and then focus specifically on several significant tsunamis. There are various criteria to determine the most significant tsunamis: the number of deaths, amount of damage, maximum runup height, major impact on tsunami science or policy, etc. As a result, descriptions will include some of the most costly (2011 Tohoku, Japan), the most deadly (2004 Sumatra, 1883 Krakatau), and the highest runup ever observed (1958 Lituya Bay, Alaska). The discovery of the Cascadia subduction zone as the source of the 1700 Japanese "Orphan" tsunami and a future tsunami threat to the U.S. northwest coast contributed to the decision to form the U.S. National Tsunami Hazard Mitigation Program. The great Lisbon earthquake of 1755 marked the beginning of the modern era of seismology. Knowledge gained from the 1964 Alaska earthquake and tsunami helped confirm the theory of plate tectonics. The 1946 Alaska, 1952 Kuril Islands, 1960 Chile, 1964 Alaska, and 2004 Banda Aceh tsunamis all resulted in warning centers or systems being established. The data descriptions on this poster were extracted from NOAA's National Geophysical Data Center (NGDC) global historical tsunami database. Additional information about these tsunamis, as well as water level data, can be found by accessing the NGDC website www.ngdc.noaa.gov/hazard/

  8. Specialized computer architectures for computational aerodynamics

    Science.gov (United States)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relatively high cost of performing these computations on commercially available general purpose computers, a cost that is high in terms of dollar expenditure and/or elapsed time. Today's computing technology will support a program designed to create specialized computing facilities dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  9. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-01-01

    Full Text Available Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed over the last decade in the foreign scientific and educational literature, and to substantiate its importance, practical utility, and right to affirmation in Russian education. Methods. The research is based on the analysis of foreign studies of the phenomenon of computational thinking and the ways of its formation in the process of education, and on comparing the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept «computational thinking» is analyzed from the point of view of intuitive understanding and of scientific and applied aspects. It is shown how computational thinking has evolved with the development of computer hardware and software. The practice-oriented interpretation of computational thinking that is dominant among educators is described, along with some ways of its formation. It is shown that computational thinking is a metasubject result of general education as well as its tool. From the point of view of the author, purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning. The dynamics of the development of this concept is described. This process is connected with the evolution of computer and information technologies, as well as the increase in the number of tasks whose effective solution requires computational thinking. The author substantiates the claim that including «computational thinking» in the set of pedagogical concepts used in the national education system fills an existing gap. Practical significance. New metasubject result of education associated with

  10. Testing Significance Testing

    Directory of Open Access Journals (Sweden)

    Joachim I. Krueger

    2018-04-01

    Full Text Available The practice of Significance Testing (ST remains widespread in psychological science despite continual criticism of its flaws and abuses. Using simulation experiments, we address four concerns about ST and for two of these we compare ST’s performance with prominent alternatives. We find the following: First, the 'p' values delivered by ST predict the posterior probability of the tested hypothesis well under many research conditions. Second, low 'p' values support inductive inferences because they are most likely to occur when the tested hypothesis is false. Third, 'p' values track likelihood ratios without raising the uncertainties of relative inference. Fourth, 'p' values predict the replicability of research findings better than confidence intervals do. Given these results, we conclude that 'p' values may be used judiciously as a heuristic tool for inductive inference. Yet, 'p' values cannot bear the full burden of inference. We encourage researchers to be flexible in their selection and use of statistical methods.
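
    The kind of simulation experiment summarized above can be sketched in a few lines. The toy below is not the authors' code; the prior, effect size, and sample size are assumptions. It generates studies under a true or false null with equal probability, computes a one-sample t-test p value for each, and compares how often the null is actually true among significant versus non-significant results.

    ```python
    # Simulate p values under a mixture of true and false nulls and relate them
    # to the probability that the tested (null) hypothesis is true.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    n_studies, n_obs, effect = 20_000, 30, 0.5

    null_true = rng.random(n_studies) < 0.5          # 50% prior on H0 (assumed)
    means = np.where(null_true, 0.0, effect)
    data = rng.normal(means[:, None], 1.0, size=(n_studies, n_obs))
    p = stats.ttest_1samp(data, 0.0, axis=1).pvalue

    sig = p < 0.05
    print("P(H0 true | p < .05): ", round(null_true[sig].mean(), 3))
    print("P(H0 true | p >= .05):", round(null_true[~sig].mean(), 3))
    ```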

  11. Safety significance evaluation system

    International Nuclear Information System (INIS)

    Lew, B.S.; Yee, D.; Brewer, W.K.; Quattro, P.J.; Kirby, K.D.

    1991-01-01

    This paper reports that the Pacific Gas and Electric Company (PG and E), in cooperation with ABZ, Incorporated and Science Applications International Corporation (SAIC), investigated the use of artificial intelligence-based programming techniques to assist utility personnel in regulatory compliance problems. The result of this investigation is that artificial intelligence-based programming techniques can successfully be applied to this problem. To demonstrate this, a general methodology was developed and several prototype systems based on this methodology were developed. The prototypes address U.S. Nuclear Regulatory Commission (NRC) event reportability requirements, technical specification compliance based on plant equipment status, and quality assurance assistance. This collection of prototype modules is named the safety significance evaluation system

  12. Predicting significant torso trauma.

    Science.gov (United States)

    Nirula, Ram; Talmor, Daniel; Brasel, Karen

    2005-07-01

    Identification of motor vehicle crash (MVC) characteristics associated with thoracoabdominal injury would advance the development of automatic crash notification systems (ACNS) by improving triage and response times. Our objective was to determine the relationships between MVC characteristics and thoracoabdominal trauma to develop a torso injury probability model. Drivers involved in crashes from 1993 to 2001 within the National Automotive Sampling System were reviewed. Relationships between torso injury and MVC characteristics were assessed using multivariate logistic regression. Receiver operating characteristic curves were used to compare the model to current ACNS models. There were a total of 56,466 drivers. Age, ejection, braking, avoidance, velocity, restraints, passenger-side impact, rollover, and vehicle weight and type were associated with injury (p < 0.05). The area under the receiver operating characteristic curve (83.9) was significantly greater than current ACNS models. We have developed a thoracoabdominal injury probability model that may improve patient triage when used with ACNS.

  13. Gas revenue increasingly significant

    International Nuclear Information System (INIS)

    Megill, R.E.

    1991-01-01

    This paper briefly describes the wellhead prices of natural gas compared to crude oil over the past 70 years. Although natural gas prices have never reached price parity with crude oil, the relative value of a gas BTU has been increasing. It is one of the reasons that the total amount of money coming from natural gas wells is becoming more significant. From 1920 to 1955 the revenue at the wellhead for natural gas was only about 10% of the money received by producers. Most of the money needed for exploration, development, and production came from crude oil. At present, however, over 40% of the money from the upstream portion of the petroleum industry is from natural gas. As a result, in a few short years natural gas may become 50% of the money revenues generated from wellhead production facilities

  14. Roadmap to greener computing

    CERN Document Server

    Nguemaleu, Raoul-Abelin Choumin

    2014-01-01

    A concise and accessible introduction to green computing and green IT, this book addresses how computer science and the computer infrastructure affect the environment and presents the main challenges in making computing more environmentally friendly. The authors review the methodologies, designs, frameworks, and software development tools that can be used in computer science to reduce energy consumption and still compute efficiently. They also focus on Computer Aided Design (CAD) and describe what design engineers and CAD software applications can do to support new streamlined business directi

  15. Tumor significant dose

    International Nuclear Information System (INIS)

    Supe, S.J.; Nagalaxmi, K.V.; Meenakshi, L.

    1983-01-01

    In the practice of radiotherapy, various concepts like NSD, CRE, TDF, and BIR are being used to evaluate the biological effectiveness of the treatment schedules on the normal tissues. This has been accepted as the tolerance of the normal tissue is the limiting factor in the treatment of cancers. At present when various schedules are tried, attention is therefore paid to the biological damage of the normal tissues only and it is expected that the damage to the cancerous tissues would be extensive enough to control the cancer. An attempt is made in the present work to evaluate the concept of tumor significant dose (TSD), which represents the damage to the cancerous tissue. Strandquist, in the analysis of a large number of cases of squamous cell carcinoma, found that for the 5 fraction/week treatment the total dose required to bring about the same damage to the cancerous tissue is proportional to T^(-0.22), where T is the overall time over which the dose is delivered. Using this finding, the TSD was defined as D × N^(-p) × T^(-q), where D is the total dose, N the number of fractions, T the overall time, and p and q are exponents to be suitably chosen. The values of p and q are adjusted such that p + q ≤ 0.24, with p varying from 0.0 to 0.24 and q from 0.0 to 0.22. Cases of cancer of cervix uteri treated between 1978 and 1980 in the V. N. Cancer Centre, Kuppuswamy Naidu Memorial Hospital, Coimbatore, India were analyzed on the basis of these formulations. These data, coupled with the clinical experience, were used for the choice of a formula for the TSD. Further, the dose schedules used in the British Institute of Radiology fractionation studies were also used to propose that the tumor significant dose is represented by D × N^(-0.18) × T^(-0.06)
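
    A worked example of the proposed formula, TSD = D × N^(-0.18) × T^(-0.06), is sketched below; the schedule values are placeholders chosen only to show the arithmetic, not a clinical recommendation.

```python
# Tumour significant dose as proposed in the abstract: TSD = D * N**(-p) * T**(-q)
# with p = 0.18 and q = 0.06. Input values are illustrative placeholders.
def tumour_significant_dose(total_dose, n_fractions, overall_time_days,
                            p=0.18, q=0.06):
    return total_dose * n_fractions ** (-p) * overall_time_days ** (-q)

# e.g. 6000 cGy delivered in 30 fractions over 42 days
print(round(tumour_significant_dose(6000, 30, 42), 1))
```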

  16. Uranium chemistry: significant advances

    International Nuclear Information System (INIS)

    Mazzanti, M.

    2011-01-01

    The author reviews recent progress in uranium chemistry achieved in CEA laboratories. Like its neighbors in the Mendeleev chart, uranium undergoes hydrolysis, oxidation and disproportionation reactions which make the chemistry of these species in water highly complex. The study of the chemistry of uranium in an anhydrous medium has made it possible to correlate the structural and electronic differences observed in the interaction of uranium(III) and the lanthanides(III) with nitrogen or sulfur molecules and the effectiveness of these molecules in An(III)/Ln(III) separation via liquid-liquid extraction. Recent work on the redox reactivity of trivalent uranium U(III) in an organic medium with molecules such as water or an azide ion (N₃⁻) in stoichiometric quantities led to extremely interesting uranium aggregates, in particular those involved in actinide migration in the environment or in aggregation problems in the fuel processing cycle. Another significant advance was the discovery of a compound containing the uranyl ion with a degree of oxidation (V), UO₂⁺, obtained by oxidation of uranium(III). Recently chemists have succeeded in blocking the disproportionation reaction of uranyl(V) and in stabilizing polymetallic complexes of uranyl(V), opening the way to a systematic study of the reactivity and the electronic and magnetic properties of uranyl(V) compounds. (A.C.)

  17. Meaning and significance of

    Directory of Open Access Journals (Sweden)

    Ph D Student Roman Mihaela

    2011-05-01

    Full Text Available The concept of "public accountability" is a challenge for political science as a new concept in this area, still in full debate and development, both in theory and practice. This paper is a theoretical approach presenting some definitions, relevant meanings and the significance of the concept in political science. The importance of this concept is that although it was originally used as a tool to improve the effectiveness and efficiency of public governance, it has gradually become a purpose in itself. "Accountability" has become an image of good governance, first in the United States of America and then in the European Union. Nevertheless, the concept is vaguely defined and provides ambiguous images of good governance. This paper begins with the presentation of some general meanings of the concept as they emerge from specialized dictionaries and encyclopaedias and continues with the meanings developed in political science. The concept of "public accountability" is rooted in the economics and management literature, becoming increasingly relevant in today's political science, both in theory and discourse as well as in practice in formulating and evaluating public policies. A first conclusion that emerges from the analysis of the evolution of this term is that it requires a conceptual clarification in political science. A clear definition will then enable an appropriate model of the system of public accountability in formulating and assessing public policies, in order to implement a system of assessment and monitoring thereof.

  18. ATTACK WARNING: Costs to Modernize NORAD's Computer System Significantly Understated

    National Research Council Canada - National Science Library

    Cross, F

    1991-01-01

    ...) Integrated Tactical Warning and Attack Assessment (ITW/AA) system. These subsystems provide critical strategic surveillance and attack warning and assessment information to United States and Canadian leaders...

  19. Diagnostic significance of computed tomography in gastric cancer

    International Nuclear Information System (INIS)

    Kang, Eun Young; Cha, Sang Hoon; Seol, Hae Young; Chung, Kyoo Byung; Suh, Won Hyuck

    1985-01-01

    Gastric cancer is the most common gastrointestinal malignancy in Korea. Identification and evaluation of gastric mass lesions and regional-distant metastasis by abdominal CT scan are important for the treatment planning and prognostic implications of gastric cancer patients. The authors retrospectively reviewed CT scans of 61 cases of pathologically proven gastric cancer over the 20 months from July 1983 to Feb. 1985 at the Department of Radiology, Korea University, Hae Wha Hospital. The results were as follows: 1. There were 50 cases of advanced adenocarcinoma, 8 cases of early gastric cancer, 2 cases of leiomyosarcoma, and 1 case of lymphoma among the total of 61 cases. 2. The sex ratio of male to female was 2 : 1. Age distribution was from 24 to 75 years and peak incidence was in the 6th decade. 3. The most frequent site of involvement with gastric cancer was the gastric antrum, in 51%. 4. 48 of 50 patients with advanced gastric adenocarcinoma (96%) had a wall thickness greater than 1 cm, and all of 8 cases of early gastric cancer had a wall thickness less than 1 cm. Regional lymph node tumor infiltration was found in 100% of cases with gastric wall thickness greater than 2.0 cm, in 64% of cases of 1.5 to 2.0 cm, in 50% of cases of 1.0 to 1.5 cm, and in 12.5% of cases of less than 1.0 cm. 5. In a comparison of enlargement of regional lymph nodes on CT scan to tumor infiltration of regional lymph nodes on histology, sensitivity was 52%, specificity was 87%, and reliability was 66%. 6. The structures involved by distant metastasis in these cases were the retroperitoneal lymph nodes in 15, the liver in 8, and the pancreas in 3. 7. The diagnostic accuracy of CT staging was considered to be about 68% by correlation with the surgical and histological findings. 8. The CT scan is an accurate and simple tool for evaluation of size, shape, and extent, as well as distant metastasis, in cases of gastric malignancies

  20. Significance of preoperative staging of gastric carcinoma by computed tomography

    International Nuclear Information System (INIS)

    Shin, Jie Yeul; Shim, Jeon Seop; Kim, Byung Young; Lee, Jong Kil

    1989-01-01

    Gastric cancer is the most common gastrointestinal tract malignancy in Korea. When detected, these tumors are usually advanced. CT is important for planning of treatment, assessing surgical resectability, postoperative evaluation and prognosis. The authors reviewed CT scans of 78 cases of gastric cancer confirmed by UGI series and endoscopic biopsy over the 14 months from May 1988 to June 1989 at the Department of Diagnostic Radiology, Taegu Fatima Hospital. The results were as follows: 1. The male to female ratio was 1.6:1 and the peak age groups were the 6th and 7th decades. 2. The most frequent site of involvement was the gastric antrum, in 44.9% (35/78), followed by the antrum and body in 23.1% (18/78). 3. The reliability of pancreatic involvement was 88.2% (45/51). 4. The diagnostic accuracy of CT staging was 66.7% (34/51) by correlation of surgical and pathological findings. 5. The most common cause of non-operation was stage IV disease (17 cases, 60.3%), followed by old age and refusal of operation. 6. The accuracy of regional lymph node involvement between CT and pathologic findings was 62.7% (32/51)

  1. Significant Radionuclides Determination

    Energy Technology Data Exchange (ETDEWEB)

    Jo A. Ziegler

    2001-07-31

    The purpose of this calculation is to identify radionuclides that are significant to offsite doses from potential preclosure events for spent nuclear fuel (SNF) and high-level radioactive waste expected to be received at the potential Monitored Geologic Repository (MGR). In this calculation, high-level radioactive waste is included in references to DOE SNF. A previous document, "DOE SNF DBE Offsite Dose Calculations" (CRWMS M&O 1999b), calculated the source terms and offsite doses for Department of Energy (DOE) and Naval SNF for use in design basis event analyses. This calculation reproduces only DOE SNF work (i.e., no naval SNF work is included in this calculation) created in "DOE SNF DBE Offsite Dose Calculations" and expands the calculation to include DOE SNF expected to produce a high dose consequence (even though the quantity of the SNF is expected to be small) and SNF owned by commercial nuclear power producers. The calculation does not address any specific off-normal/DBE event scenarios for receiving, handling, or packaging of SNF. The results of this calculation are developed for comparative analysis to establish the important radionuclides and do not represent the final source terms to be used for license application. This calculation will be used as input to preclosure safety analyses and is performed in accordance with procedure AP-3.12Q, "Calculations", and is subject to the requirements of DOE/RW-0333P, "Quality Assurance Requirements and Description" (DOE 2000) as determined by the activity evaluation contained in "Technical Work Plan for: Preclosure Safety Analysis, TWP-MGR-SE-000010" (CRWMS M&O 2000b) in accordance with procedure AP-2.21Q, "Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities".

  2. Rhythmic chaos: irregularities of computer ECG diagnosis.

    Science.gov (United States)

    Wang, Yi-Ting Laureen; Seow, Swee-Chong; Singh, Devinder; Poh, Kian-Keong; Chai, Ping

    2017-09-01

    Diagnostic errors can occur when physicians rely solely on computer electrocardiogram interpretation. Cardiologists often receive referrals for computer misdiagnoses of atrial fibrillation. Patients may have been inappropriately anticoagulated for pseudo atrial fibrillation. Anticoagulation carries significant risks, and such errors may carry a high cost. Have we become overreliant on machines and technology? In this article, we illustrate three such cases and briefly discuss how we can reduce these errors. Copyright: © Singapore Medical Association.

  3. Office ergonomics: deficiencies in computer workstation design.

    Science.gov (United States)

    Shikdar, Ashraf A; Al-Kindi, Mahmoud A

    2007-01-01

    The objective of this research was to study and identify ergonomic deficiencies in computer workstation design in typical offices. Physical measurements and a questionnaire were used to study 40 workstations. Major ergonomic deficiencies were found in physical design and layout of the workstations, employee postures, work practices, and training. The consequences in terms of user health and other problems were significant. Forty-five percent of the employees used nonadjustable chairs, 48% of computers faced windows, 90% of the employees used computers more than 4 hrs/day, 45% of the employees adopted bent and unsupported back postures, and 20% used office tables for computers. Major problems reported were eyestrain (58%), shoulder pain (45%), back pain (43%), arm pain (35%), wrist pain (30%), and neck pain (30%). These results indicated serious ergonomic deficiencies in office computer workstation design, layout, and usage. Strategies to reduce or eliminate ergonomic deficiencies in computer workstation design were suggested.

  4. Significance testing in ridge regression for genetic data

    Directory of Open Access Journals (Sweden)

    De Iorio Maria

    2011-09-01

    Full Text Available Abstract Background Technological developments have increased the feasibility of large scale genetic association studies. Densely typed genetic markers are obtained using SNP arrays, next-generation sequencing technologies and imputation. However, SNPs typed using these methods can be highly correlated due to linkage disequilibrium among them, and standard multiple regression techniques fail with these data sets due to their high dimensionality and correlation structure. There has been increasing interest in using penalised regression in the analysis of high dimensional data. Ridge regression is one such penalised regression technique which does not perform variable selection, instead estimating a regression coefficient for each predictor variable. It is therefore desirable to obtain an estimate of the significance of each ridge regression coefficient. Results We develop and evaluate a test of significance for ridge regression coefficients. Using simulation studies, we demonstrate that the performance of the test is comparable to that of a permutation test, with the advantage of a much-reduced computational cost. We introduce the p-value trace, a plot of the negative logarithm of the p-values of ridge regression coefficients with increasing shrinkage parameter, which enables the visualisation of the change in p-value of the regression coefficients with increasing penalisation. We apply the proposed method to a lung cancer case-control data set from EPIC, the European Prospective Investigation into Cancer and Nutrition. Conclusions The proposed test is a useful alternative to a permutation test for the estimation of the significance of ridge regression coefficients, at a much-reduced computational cost. The p-value trace is an informative graphical tool for evaluating the results of a test of significance of ridge regression coefficients as the shrinkage parameter increases, and the proposed test makes its production computationally feasible.
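
    One standard way to approximate significance for ridge coefficients (not necessarily the exact test developed in the paper) is to compute an approximate covariance for the penalised estimator and convert the resulting z-scores to p-values over a grid of shrinkage values, which is also enough to draw the p-value trace described above. A hedged sketch:

```python
# Approximate p-value trace for ridge regression coefficients.
# For shrinkage k, beta_hat = W X'y with W = (X'X + kI)^-1, and an approximate
# covariance is sigma^2 * W X'X W. This is a common approximation, offered only
# as an illustration of the idea of a p-value trace.
import numpy as np
from scipy import stats

def ridge_pvalue_trace(X, y, shrinkage_grid):
    n, p = X.shape
    traces = []
    for k in shrinkage_grid:
        W = np.linalg.inv(X.T @ X + k * np.eye(p))
        beta = W @ X.T @ y
        resid = y - X @ beta
        sigma2 = resid @ resid / (n - p)          # rough residual variance
        cov = sigma2 * W @ X.T @ X @ W            # approximate covariance of beta
        z = beta / np.sqrt(np.diag(cov))
        pvals = 2 * stats.norm.sf(np.abs(z))
        traces.append(-np.log10(pvals))           # one row per shrinkage value
    return np.array(traces)                       # plot columns against shrinkage_grid
```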

  5. Efficient 2-D DCT Computation from an Image Representation Point of View

    OpenAIRE

    Papakostas, G.A.; Koulouriotis, D.E.; Karakasis, E.G.

    2009-01-01

    A novel methodology that ensures the computation of 2-D DCT coefficients in gray-scale images as well as in binary ones, with high computation rates, was presented in the previous sections. Through a new image representation scheme, called ISR (Image Slice Representation) the 2-D DCT coefficients can be computed in significantly reduced time, with the same accuracy.
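
    For reference, the baseline 2-D DCT that the ISR scheme accelerates can be computed with separable 1-D transforms; a minimal sketch (not the paper's ISR implementation) is shown below.

```python
# Baseline 2-D DCT of an image block via two passes of the 1-D type-II DCT.
# The ISR method in the abstract speeds this computation up; this sketch only
# shows the reference transform it reproduces.
import numpy as np
from scipy.fft import dct

def dct2(image):
    # rows first, then columns, with orthonormal scaling
    return dct(dct(image, axis=0, norm="ortho"), axis=1, norm="ortho")

block = np.random.rand(8, 8)        # e.g. an 8x8 gray-scale block
coeffs = dct2(block)
```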

  6. Design criteria for rhenium-reduced nickel-based single-crystal alloys. Identification and computer-assisted conversion; Designkriterien fuer rheniumreduzierte Nickelbasis-Einkristalllegierungen. Identifikation und rechnergestuetzte Umsetzung

    Energy Technology Data Exchange (ETDEWEB)

    Goehler, Thomas

    2016-06-17

    In the present work, design criteria and property models for the creep strength optimization of rhenium-free nickel-based single-crystal superalloys are investigated. The study focuses on a typical load condition of 1050 °C and 150 MPa, which is representative of aircraft engine applications. The key aspect is to link chemical composition, manufacturing processes, microstructure formation and a mechanistic understanding of dislocation creep through a computational materials engineering approach. Besides the positive effect of rhenium on solid solution hardening, a second mechanism by which rhenium increases high temperature creep strength is identified: it indirectly stabilizes precipitation hardening by reducing the coarsening kinetics of γ'-rafting. Five 1st and 2nd generation technical superalloys show a comparable microstructure evolution for up to 2 % plastic elongation, while creep times differ by a factor of five. The application of a microstructure-sensitive creep model shows that these coarsening processes can activate γ'-cutting and thus lead to an increasing creep rate. Based on these calculations a threshold value of φ(γ/γ') > 2.5 at 150 MPa is estimated. This ratio of matrix channel to raft thickness has been verified at multiple positions by microstructure analysis of interrupted creep tests. The mechanism described previously can be decelerated by the enrichment of the γ-matrix with slowly diffusing elements. The same principle also increases the solid solution strength of the γ-matrix. Therefore, the present work delivers an additional mechanistic explanation of why creep properties of single-phase nickel-based alloys can be transferred to two-phase technical superalloys with a rafted γ'-structure. Subsequently, the best way to substitute both fundamental properties of rhenium, namely a slow diffusion coefficient and a small solubility in γ', has been investigated by means of CALPHAD modeling. Only molybdenum and especially

  7. Computer ray tracing speeds.

    Science.gov (United States)

    Robb, P; Pawlowski, B

    1990-05-01

    The results of measuring the ray trace speed and compilation speed of thirty-nine computers in fifty-seven configurations, ranging from personal computers to super computers, are described. A correlation of ray trace speed has been made with the LINPACK benchmark which allows the ray trace speed to be estimated using LINPACK performance data. The results indicate that the latest generation of workstations, using CPUs based on RISC (Reduced Instruction Set Computer) technology, are as fast or faster than mainframe computers in compute-bound situations.
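
    As a toy illustration of using such a correlation, one can fit a simple relation between LINPACK performance and measured ray trace speed and then predict the speed of an unmeasured machine; all numbers below are made up, not the paper's data.

```python
# Fit a power-law correlation between LINPACK MFLOPS and ray-trace speed on a
# few "measured" machines, then estimate the speed of a new machine from its
# LINPACK figure. The data points are placeholders for illustration only.
import numpy as np

linpack = np.array([0.5, 2.0, 8.0, 30.0])        # MFLOPS for measured machines
ray_speed = np.array([0.3, 1.1, 4.0, 13.0])      # rays traced per second (arbitrary units)

slope, intercept = np.polyfit(np.log(linpack), np.log(ray_speed), 1)
new_linpack = 50.0                                # MFLOPS of an unmeasured machine
estimate = np.exp(intercept) * new_linpack ** slope
print(f"estimated ray-trace speed: {estimate:.1f}")
```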

  8. Computational creativity

    Directory of Open Access Journals (Sweden)

    López de Mántaras Badia, Ramon

    2013-12-01

    Full Text Available New technologies, and in particular artificial intelligence, are drastically changing the nature of creative processes. Computers are playing very significant roles in creative activities such as music, architecture, fine arts, and science. Indeed, the computer is already a canvas, a brush, a musical instrument, and so on. However, we believe that we must aim at more ambitious relations between computers and creativity. Rather than just seeing the computer as a tool to help human creators, we could see it as a creative entity in its own right. This view has triggered a new subfield of Artificial Intelligence called Computational Creativity. This article addresses the question of the possibility of achieving computational creativity through some examples of computer programs capable of replicating some aspects of creative behavior in the fields of music and science.

  9. Brief: Managing computing technology

    International Nuclear Information System (INIS)

    Startzman, R.A.

    1994-01-01

    While computing is applied widely in the production segment of the petroleum industry, its effective application is the primary goal of computing management. Computing technology has changed significantly since the 1950's, when computers first began to influence petroleum technology. The ability to accomplish traditional tasks faster and more economically probably is the most important effect that computing has had on the industry. While speed and lower cost are important, are they enough? Can computing change the basic functions of the industry? When new computing technology is introduced improperly, it can clash with traditional petroleum technology. This paper examines the role of management in merging these technologies

  10. Application of ubiquitous computing in personal health monitoring systems.

    Science.gov (United States)

    Kunze, C; Grossmann, U; Stork, W; Müller-Glaser, K D

    2002-01-01

    A possibility to significantly reduce the costs of public health systems is to increasingly use information technology. The Laboratory for Information Processing Technology (ITIV) at the University of Karlsruhe is developing a personal health monitoring system, which should improve health care and at the same time reduce costs by combining micro-technological smart sensors with personalized, mobile computing systems. In this paper we present how ubiquitous computing theory can be applied in the health-care domain.

  11. Assessing the effectiveness of a patient-centred computer-based clinic intervention, Health-E You/Salud iTu, to reduce health disparities in unintended pregnancies among Hispanic adolescents: study protocol for a cluster randomised control trial.

    Science.gov (United States)

    Tebb, Kathleen P; Rodriguez, Felicia; Pollack, Lance M; Trieu, Sang Leng; Hwang, Loris; Puffer, Maryjane; Adams, Sally; Ozer, Elizabeth M; Brindis, Claire D

    2018-01-10

    Teen pregnancy rates in the USA remain higher than in any other industrialised nation, and pregnancies among Hispanic adolescents are disproportionately high. Computer-based interventions represent a promising approach to address sexual health and contraceptive use disparities. Preliminary findings have demonstrated that the Health-E You/Salud iTu computer application (app) is feasible to implement, acceptable to Latina adolescents and improves sexual health knowledge and interest in selecting an effective contraceptive method when used in conjunction with a healthcare visit. The app is now ready for efficacy testing. The purpose of this manuscript is to describe patient-centred approaches used both in developing and testing the Health-E You app and to present the research methods used to evaluate its effectiveness in improving intentions to use an effective method of contraception as well as actual contraceptive use. This study is designed to assess the effectiveness of a patient-centred computer-based clinic intervention, Health-E You/Salud iTu, on its ability to reduce health disparities in unintended pregnancies among Latina adolescent girls. This study uses a cluster randomised control trial design in which 18 school-based health centers from the Los Angeles Unified School District were randomly assigned, with equal chance, to either the intervention (Health-E You app) or control group. Analyses will examine differences between the control and intervention group's knowledge of and attitudes towards contraceptive use, receipt of contraception at the clinic visit and self-reported use of contraception at 3-month and 6-month follow-ups. The study began enrolling participants in August 2016, and a total of 1400 participants (700 per treatment group) are expected to be enrolled by March 2018. Ethics approval was obtained through the University of California, San Francisco Institutional Review Board. Results of this trial will be submitted for publication in peer

  12. The energetic significance of cooking.

    Science.gov (United States)

    Carmody, Rachel N; Wrangham, Richard W

    2009-10-01

    While cooking has long been argued to improve the diet, the nature of the improvement has not been well defined. As a result, the evolutionary significance of cooking has variously been proposed as being substantial or relatively trivial. In this paper, we evaluate the hypothesis that an important and consistent effect of cooking food is a rise in its net energy value. The pathways by which cooking influences net energy value differ for starch, protein, and lipid, and we therefore consider plant and animal foods separately. Evidence of compromised physiological performance among individuals on raw diets supports the hypothesis that cooked diets tend to provide more energy. Mechanisms contributing to energy being gained from cooking include increased digestibility of starch and protein, reduced costs of digestion for cooked versus raw meat, and reduced energetic costs of detoxification and defence against pathogens. If cooking consistently improves the energetic value of foods through such mechanisms, its evolutionary impact depends partly on the relative energetic benefits of non-thermal processing methods used prior to cooking. We suggest that if non-thermal processing methods such as pounding were used by Lower Palaeolithic Homo, they likely provided an important increase in energy gain over unprocessed raw diets. However, cooking has critical effects not easily achievable by non-thermal processing, including the relatively complete gelatinisation of starch, efficient denaturing of proteins, and killing of food-borne pathogens. This means that however sophisticated the non-thermal processing methods were, cooking would have conferred incremental energetic benefits. While much remains to be discovered, we conclude that the adoption of cooking would have led to an important rise in energy availability. For this reason, we predict that cooking had substantial evolutionary significance.

  13. Computer scientist looks at reliability computations

    International Nuclear Information System (INIS)

    Rosenthal, A.

    1975-01-01

    Results from the theory of computational complexity are applied to reliability computations on fault trees and networks. A well known class of problems which almost certainly have no fast solution algorithms is presented. It is shown that even approximately computing the reliability of many systems is difficult enough to be in this class. In the face of this result, which indicates that for general systems the computation time will be exponential in the size of the system, decomposition techniques which can greatly reduce the effective size of a wide variety of realistic systems are explored
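
    To make the decomposition point concrete, the sketch below computes the exact top-event probability of a small fault tree with independent, non-repeated basic events by simple recursion; for large systems with shared events this brute-force approach becomes intractable, which is what motivates the decomposition techniques discussed. This is a textbook illustration, not the paper's algorithm.

```python
# Exact top-event probability for a small fault tree, assuming independent
# basic events that are not shared between branches. Gates are nested tuples:
# ("and", [children]), ("or", [children]), or ("basic", name).
def gate_prob(node, p_basic):
    kind = node[0]
    if kind == "basic":
        return p_basic[node[1]]
    child = [gate_prob(c, p_basic) for c in node[1]]
    if kind == "and":                 # all inputs must fail
        out = 1.0
        for q in child:
            out *= q
        return out
    if kind == "or":                  # at least one input fails
        out = 1.0
        for q in child:
            out *= (1.0 - q)
        return 1.0 - out
    raise ValueError(f"unknown gate type: {kind}")

tree = ("or", [("and", [("basic", "pump_A"), ("basic", "pump_B")]),
               ("basic", "valve")])
print(gate_prob(tree, {"pump_A": 0.01, "pump_B": 0.02, "valve": 0.001}))
```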

  14. Quantum computing

    International Nuclear Information System (INIS)

    Steane, Andrew

    1998-01-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from

  15. Quantum computing

    Energy Technology Data Exchange (ETDEWEB)

    Steane, Andrew [Department of Atomic and Laser Physics, University of Oxford, Clarendon Laboratory, Oxford (United Kingdom)

    1998-02-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from

  16. Review of pump suction reducer selection: Eccentric or concentric reducers

    OpenAIRE

    Mahaffey, R M; van Vuuren, S J

    2014-01-01

    Eccentric reducers are traditionally recommended for the pump suction reducer fitting to allow for transportation of air through the fitting to the pump. The ability of a concentric reducer to provide an improved approach flow to the pump while still allowing air to be transported through the fitting is investigated. Computational fluid dynamics (CFD) was utilised to analyse six concentric and six eccentric reducer geometries at four different inlet velocities to determine the flow velocity ...

  17. Reduced chemical kinetic mechanisms for hydrocarbon fuels

    International Nuclear Information System (INIS)

    Montgomery, C.J.; Cremer, M.A.; Heap, M.P.; Chen, J-Y.; Westbrook, C.K.; Maurice, L.Q.

    1999-01-01

    Using CARM (Computer Aided Reduction Method), a computer program that automates the mechanism reduction process, a variety of different reduced chemical kinetic mechanisms for ethylene and n-heptane have been generated. The reduced mechanisms have been compared to detailed chemistry calculations in simple homogeneous reactors and experiments. Reduced mechanisms for combustion of ethylene having as few as 10 species were found to give reasonable agreement with detailed chemistry over a range of stoichiometries and showed significant improvement over currently used global mechanisms. The performance of reduced mechanisms derived from a large detailed mechanism for n-heptane was compared to results from a reduced mechanism derived from a smaller semi-empirical mechanism. The semi-empirical mechanism was advantageous as a starting point for reduction for ignition delay, but not for PSR calculations. Reduced mechanisms with as few as 12 species gave excellent results for n-heptane/air PSR calculations but 16-25 or more species are needed to simulate n-heptane ignition delay

  18. Applications of computer algebra

    CERN Document Server

    1985-01-01

    Today, certain computer software systems exist which surpass the computational ability of researchers when their mathematical techniques are applied to many areas of science and engineering. These computer systems can perform a large portion of the calculations seen in mathematical analysis. Despite this massive power, thousands of people use these systems as a routine resource for everyday calculations. These software programs are commonly called "Computer Algebra" systems. They have names such as MACSYMA, MAPLE, muMATH, REDUCE and SMP. They are receiving credit as a computational aid with increasing regularity in articles in the scientific and engineering literature. When most people think about computers and scientific research these days, they imagine a machine grinding away, processing numbers arithmetically. It is not generally realized that, for a number of years, computers have been performing non-numeric computations. This means, for example, that one inputs an equation and obtains a closed for...

  19. Quantum computing with trapped ions

    Energy Technology Data Exchange (ETDEWEB)

    Hughes, R.J.

    1998-01-01

    The significance of quantum computation for cryptography is discussed. Following a brief survey of the requirements for quantum computational hardware, an overview of the ion trap quantum computation project at Los Alamos is presented. The physical limitations to quantum computation with trapped ions are analyzed and an assessment of the computational potential of the technology is made.

  20. Optical Computing

    OpenAIRE

    Woods, Damien; Naughton, Thomas J.

    2008-01-01

    We consider optical computers that encode data using images and compute by transforming such images. We give an overview of a number of such optical computing architectures, including descriptions of the type of hardware commonly used in optical computing, as well as some of the computational efficiencies of optical devices. We go on to discuss optical computing from the point of view of computational complexity theory, with the aim of putting some old, and some very recent, re...

  1. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... to reduce the risk of an allergic reaction. These medications must be taken 12 hours prior to ... planes, and can even generate three-dimensional images. These images can be viewed on a computer monitor, ...

  2. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... to reduce the risk of an allergic reaction. These medications must be taken 12 hours prior to ... planes, and can even generate three-dimensional images. These images can be viewed on a computer monitor, ...

  3. Facilitating NASA Earth Science Data Processing Using Nebula Cloud Computing

    Science.gov (United States)

    Pham, Long; Chen, Aijun; Kempler, Steven; Lynnes, Christopher; Theobald, Michael; Asghar, Esfandiari; Campino, Jane; Vollmer, Bruce

    2011-01-01

    Cloud Computing has been implemented in several commercial arenas. The NASA Nebula Cloud Computing platform is an Infrastructure as a Service (IaaS) built in 2008 at NASA Ames Research Center and in 2010 at GSFC. Nebula is an open source Cloud platform intended to: a) Make NASA realize significant cost savings through efficient resource utilization, reduced energy consumption, and reduced labor costs. b) Provide an easier way for NASA scientists and researchers to efficiently explore and share large and complex data sets. c) Allow customers to provision, manage, and decommission computing capabilities on an as-needed basis

  4. Reduced Chemical Kinetic Mechanisms for JP-8 Combustion

    National Research Council Canada - National Science Library

    Montgomery, Christopher J; Cannon, S. M; Mawid, M. A; Sekar, B

    2002-01-01

    Using CARM (Computer Aided Reduction Method), a computer program that automates the mechanism reduction process, six different reduced chemical kinetic mechanisms for JP-8 combustion have been generated...

  5. Computer group

    International Nuclear Information System (INIS)

    Bauer, H.; Black, I.; Heusler, A.; Hoeptner, G.; Krafft, F.; Lang, R.; Moellenkamp, R.; Mueller, W.; Mueller, W.F.; Schati, C.; Schmidt, A.; Schwind, D.; Weber, G.

    1983-01-01

    The computer group has been reorganized to take charge of the general-purpose computers DEC10 and VAX and the computer network (Dataswitch, DECnet, IBM - connections to GSI and IPP, preparation for Datex-P). (orig.)

  6. Computer Engineers.

    Science.gov (United States)

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  7. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  8. Reduced regional cerebral blood flow in aged noninsulin-dependent diabetic patients with no history of cerebrovascular disease: evaluation by N-isopropyl-123I-p-iodoamphetamine with single-photon emission computed tomography

    International Nuclear Information System (INIS)

    Wakisaka, M.; Nagamachi, S.; Inoue, K.; Morotomi, Y.; Nunoi, K.; Fujishima, M.

    1990-01-01

    Regional cerebral blood flow was measured using N-isopropyl-123I-iodoamphetamine with single-photon emission computed tomography (SPECT) in 16 aged patients with noninsulin-dependent diabetes mellitus (NIDDM, average age 72.8 years, average fasting plasma glucose 7.7 mmol/L) and 12 nondiabetic subjects (71.6 years, 5.3 mmol/L). None had any history of a cerebrovascular accident. Systolic blood pressure (SBP), total cholesterol, and triglyceride levels did not differ between groups. Areas of hypoperfusion were observed in 14 diabetic patients (12 patients had multiple lesions) and in 6 nondiabetic subjects (3 had multiple lesions). Areas where radioactivity was greater than or equal to 65% of the maximum count of the slice were defined as regions with normal cerebral blood flow (region of interest A, ROI-A), and areas where the count was greater than or equal to 45% were defined as brain tissue regions other than ventricles (ROI-B). The average ROI-A/B ratio of 16 slices was used as a semiquantitative indicator of normal cerebral blood flow throughout the entire brain. The mean ROI-A/B ratio was 49.6 +/- 1.7% in the diabetic group, significantly lower than the 57.9 +/- 1.6% in the nondiabetic group (p less than 0.005). The ratio was inversely correlated with SBP (r = -0.61, p less than 0.05), total cholesterol (r = -0.51, p less than 0.05), and atherogenic index (r = -0.64, p less than 0.01), and was positively correlated with high-density lipoprotein (HDL) cholesterol (r = 0.51, p less than 0.05) in the diabetic, but not the nondiabetic group. These observations suggest that the age-related reduction in cerebral blood flow may be accelerated by a combination of hyperglycemia plus other risk factors for atherosclerosis

  9. Differences in prevalence of self-reported musculoskeletal symptoms among computer and non-computer users in a Nigerian population: a cross-sectional study

    Directory of Open Access Journals (Sweden)

    Ayanniyi O

    2010-08-01

    Full Text Available Abstract Background Literature abounds on the prevalent nature of Self Reported Musculoskeletal Symptoms (SRMS) among computer users, but studies that actually compared this with non computer users are meagre, thereby reducing the strength of the evidence. This study compared the prevalence of SRMS between computer and non computer users and assessed the risk factors associated with SRMS. Methods A total of 472 participants comprising equal numbers of age and sex matched computer and non computer users were assessed for the presence of SRMS. Information concerning musculoskeletal symptoms and discomforts from the neck, shoulders, upper back, elbows, wrists/hands, low back, hips/thighs, knees and ankles/feet was obtained using the Standardized Nordic questionnaire. Results The prevalence of SRMS was significantly higher in the computer users than the non computer users both over the past 7 days (χ2 = 39.11, p = 0.001) and during the past 12 months (χ2 = 53.56, p = 0.001). The odds of reporting musculoskeletal symptoms were lowest for participants above the age of 40 years (OR = 0.42, 95% CI = 0.31-0.64 over the past 7 days; OR = 0.61, 95% CI = 0.47-0.77 during the past 12 months) and were also reduced in female participants. Increasing daily hours and accumulated years of computer use and tasks of data processing and designs/graphics significantly increased the risk of reporting SRMS. Conclusion The prevalence of SRMS was significantly higher in the computer users than the non computer users, and younger age, being male, working longer hours daily, increasing years of computer use, data entry tasks and computer designs/graphics were the significant risk factors for reporting musculoskeletal symptoms among the computer users. Computer use may explain the increase in prevalence of SRMS among the computer users.
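
    The prevalence comparison and odds ratios reported above are of the kind shown in the sketch below: a chi-square test on a 2x2 table of symptom counts plus a crude odds ratio. The counts are made-up placeholders, not the study's data.

```python
# Chi-square comparison of SRMS prevalence between computer and non-computer
# users, with a crude (unadjusted) odds ratio. The 2x2 counts are placeholders.
import numpy as np
from scipy.stats import chi2_contingency

table = np.array([[150, 86],     # computer users:     [symptoms, no symptoms]
                  [ 90, 146]])   # non-computer users:  [symptoms, no symptoms]
chi2, p, dof, expected = chi2_contingency(table, correction=False)
odds_ratio = (table[0, 0] * table[1, 1]) / (table[0, 1] * table[1, 0])
print(f"chi2 = {chi2:.2f}, p = {p:.4f}, OR = {odds_ratio:.2f}")
```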

  10. Cloud Computing Bible

    CERN Document Server

    Sosinsky, Barrie

    2010-01-01

    The complete reference guide to the hot technology of cloud computingIts potential for lowering IT costs makes cloud computing a major force for both IT vendors and users; it is expected to gain momentum rapidly with the launch of Office Web Apps later this year. Because cloud computing involves various technologies, protocols, platforms, and infrastructure elements, this comprehensive reference is just what you need if you'll be using or implementing cloud computing.Cloud computing offers significant cost savings by eliminating upfront expenses for hardware and software; its growing popularit

  11. Reducing Resistance

    DEFF Research Database (Denmark)

    Lindell, Johanna

    Antibiotic resistance is a growing public health problem both nationally and internationally, and efficient strategies are needed to reduce unnecessary use. This dissertation presents four research studies, which examine how communication between general practitioners and patients in Danish primary care may influence decisions on antibiotic use. Based on video- and audio-recordings of physician-patient consultations, it is investigated how treatment recommendations are presented, can be changed, are forecast and explained, and finally, how they seemingly meet resistance and how this resistance is responded to. The first study in the dissertation suggests that treatment recommendations on antibiotics are often done in a way that encourages patient acceptance. In extension of this, the second study of the dissertation examines a case, where acceptance of such a recommendation is changed into a shared...

  12. Mapping the most significant computer hacking events to a temporal computer attack model

    CSIR Research Space (South Africa)

    Van Heerden, R

    2012-09-01

    Full Text Available of an impending network attack, and occur before any real damage has occurred. Popular reconnaissance actions include network mapping and scanning with tools such as Nmap and Nessus. Google and other search engines can also be used to identify potential weaknesses. ... - Reconnaissance: Using well-known search engines to search for potential weaknesses in the Windows operating system and Microsoft Outlook. - Ramp-up: Writing the I-LOVE-YOU worm code. - Damage: Spreading the I-LOVE-YOU worm via e-mail. It led...

  13. Diagnostic significance of CT discography

    International Nuclear Information System (INIS)

    Nishiyama, Toru; Tomita, Yu; Maeda, Katsuhisa; Takayama, Atsuya; Takada, Shunichi; Murakami, Masazumi; Saito, Yasufumi.

    1986-01-01

    In a total of 179 patients, comprising 161 with intervertebral disorder and 18 with spondylolysis or spondylolisthesis, CT discographic findings of 456 intervertebral discs were reviewed. Computed tomographic discography showed the direction of herniation, the size of hernial masses, and the deformity of the intervertebral disc and joint, being helpful in deciding indications for surgery and selecting surgical procedures. Computed tomographic discographic patterns of disk deformity were classified as types A, B, and C. This classification almost concurred with the findings of conventional discography. Discography or CT discography is recommended to be used when deformity at the L5-S1 level may be missed on myelography and there is extraforaminal lateral disc herniation. Combined use of myelography, discography, and CT discography would increase the diagnostic accuracy for lumbar diseases. (Namekawa, K.)

  14. Analog computing

    CERN Document Server

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  15. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer's point of view, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture.

  16. Quantum Computing

    OpenAIRE

    Scarani, Valerio

    1998-01-01

    The aim of this thesis was to explain what quantum computing is. The information for the thesis was gathered from books, scientific publications, and news articles. The analysis of the information revealed that quantum computing can be broken down to three areas: theories behind quantum computing explaining the structure of a quantum computer, known quantum algorithms, and the actual physical realizations of a quantum computer. The thesis reveals that moving from classical memor...

  17. Reducing costs by reducing size

    International Nuclear Information System (INIS)

    Hayns, M.R.; Shepherd, J.

    1991-01-01

    The present paper discusses briefly the many factors, including capital cost, which have to be taken into account in determining whether a series of power stations based on a small nuclear plant can be competitive with a series based on traditional large unit sizes giving the guaranteed level of supply. The 320 MWe UK/US Safe Integral Reactor is described as a good example of how the factors discussed can be beneficially incorporated into a design using proven technology. Finally it goes on to illustrate how the overall costs of a generating system can indeed be reduced by use of the 320 MWe Safe Integral Reactor rather than conventional units of around 1200 MWe. (author). 9 figs

  18. Interactive Computer Graphics

    Science.gov (United States)

    Kenwright, David

    2000-01-01

    Aerospace data analysis tools that significantly reduce the time and effort needed to analyze large-scale computational fluid dynamics simulations have emerged this year. The current approach for most postprocessing and visualization work is to explore the 3D flow simulations with one of a dozen or so interactive tools. While effective for analyzing small data sets, this approach becomes extremely time consuming when working with data sets larger than one gigabyte. An active area of research this year has been the development of data mining tools that automatically search through gigabyte data sets and extract the salient features with little or no human intervention. With these so-called feature extraction tools, engineers are spared the tedious task of manually exploring huge amounts of data to find the important flow phenomena. The software tools identify features such as vortex cores, shocks, separation and attachment lines, recirculation bubbles, and boundary layers. Some of these features can be extracted in a few seconds; others take minutes to hours on extremely large data sets. The analysis can be performed off-line in a batch process, either during or following the supercomputer simulations. These computations have to be performed only once, because the feature extraction programs search the entire data set and find every occurrence of the phenomena being sought. Because the important questions about the data are being answered automatically, interactivity is less critical than it is with traditional approaches.

  19. Design of Computer Experiments

    DEFF Research Database (Denmark)

    Dehlendorff, Christian

    The main topic of this thesis is design and analysis of computer and simulation experiments and is dealt with in six papers and a summary report. Simulation and computer models have in recent years received increasingly more attention due to their increasing complexity and usability. Software packages make the development of rather complicated computer models using predefined building blocks possible. This implies that the range of phenomena that are analyzed by means of a computer model has expanded significantly. As the complexity grows so does the need for efficient experimental designs and analysis methods, since the complex computer models often are expensive to use in terms of computer time. The choice of performance parameter is an important part of the analysis of computer and simulation models and Paper A introduces a new statistic for waiting times in health care units. The statistic...

  20. Total variation-based neutron computed tomography

    Science.gov (United States)

    Barnard, Richard C.; Bilheux, Hassina; Toops, Todd; Nafziger, Eric; Finney, Charles; Splitter, Derek; Archibald, Rick

    2018-05-01

    We perform the neutron computed tomography reconstruction problem via an inverse problem formulation with a total variation penalty. In the case of highly under-resolved angular measurements, the total variation penalty suppresses high-frequency artifacts which appear in filtered back projections. In order to efficiently compute solutions for this problem, we implement a variation of the split Bregman algorithm; due to the error-forgetting nature of the algorithm, the computational cost of updating can be significantly reduced via very inexact approximate linear solvers. We present the effectiveness of the algorithm in the significantly low-angular sampling case using synthetic test problems as well as data obtained from a high flux neutron source. The algorithm removes artifacts and can even roughly capture small features when an extremely low number of angles are used.
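
    A much-simplified analogue of this approach (not the paper's split Bregman implementation) is a proximal-gradient loop that alternates a gradient step on the data-fit term with a total variation denoising step; the sketch below assumes scikit-image's radon/iradon and denoise_tv_chambolle, whose argument names may differ between library versions.

```python
# TV-penalised tomographic reconstruction by a simple proximal-gradient loop:
# gradient step on ||R(x) - sinogram||^2 (R = Radon transform), then a TV
# proximal step. Step size and iteration count are illustrative only.
import numpy as np
from skimage.transform import radon, iradon
from skimage.restoration import denoise_tv_chambolle

def tv_reconstruct(sinogram, angles, tv_weight=0.1, step=1e-3, iters=50):
    n = sinogram.shape[0]                       # detector bins = image side (circle=True)
    img = np.zeros((n, n))
    for _ in range(iters):
        resid = radon(img, theta=angles, circle=True) - sinogram
        # unfiltered backprojection used as an (approximate) adjoint of radon
        img = img - step * iradon(resid, theta=angles, filter_name=None, circle=True)
        img = denoise_tv_chambolle(img, weight=tv_weight)   # TV proximal step
    return img
```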

  1. Secure cloud computing implementation study for Singapore military operations

    OpenAIRE

    Guoquan, Lai

    2016-01-01

    Approved for public release; distribution is unlimited Cloud computing benefits organizations in many ways. With characteristics such as resource pooling, broad network access, on-demand self-service, and rapid elasticity, an organization's overall IT management can be significantly reduced (in terms of labor, software, and hardware) and its work processes made more efficient. However, is cloud computing suitable for the Singapore Armed Forces (SAF)? How can the SAF migrate its traditional...

  2. A Brief Introduction on the Significance of the Establishment of the Cloud Computing Platform and Its Research Method in Maritime Department (简述海事部门云计算平台建设的意义及研究方法)

    Institute of Scientific and Technical Information of China (English)

    周章海

    2015-01-01

    This paper discussed the current situation of information resources processing and pointed out the objectives and significance of building a cloud computing platform in the maritime department. It also determined the research method of the platform, through which managers could integrate the existing information resources and do a thorough cleaning, filtering and aggregation of the business data, to ultimately realize the unified management of applications and business data of all national maritime departments.

  3. Reduced order surrogate modelling (ROSM) of high dimensional deterministic simulations

    Science.gov (United States)

    Mitry, Mina

    Often, computationally expensive engineering simulations can prohibit the engineering design process. As a result, designers may turn to a less computationally demanding approximate, or surrogate, model to facilitate their design process. However, owing to the curse of dimensionality, classical surrogate models become too computationally expensive for high dimensional data. To address this limitation of classical methods, we develop linear and non-linear Reduced Order Surrogate Modelling (ROSM) techniques. Two algorithms are presented, which are based on a combination of linear/kernel principal component analysis and radial basis functions. These algorithms are applied to subsonic and transonic aerodynamic data, as well as a model for a chemical spill in a channel. The results of this thesis show that ROSM can provide a significant computational benefit over classical surrogate modelling, sometimes at the expense of a minor loss in accuracy.
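
    A minimal sketch of the linear variant described here, assuming PCA for the output compression and an RBF interpolator over the design parameters (array names and sizes are illustrative only):

```python
# Linear reduced order surrogate model: PCA compresses high-dimensional
# simulation outputs, and an RBF interpolator maps design parameters to the
# retained PCA coefficients. Shapes and names are placeholders.
import numpy as np
from sklearn.decomposition import PCA
from scipy.interpolate import RBFInterpolator

params = np.random.rand(40, 3)        # (n_samples, n_design_variables)
fields = np.random.rand(40, 10000)    # (n_samples, n_grid_points) simulation outputs

pca = PCA(n_components=5).fit(fields)
coeffs = pca.transform(fields)                   # reduced coordinates per sample
surrogate = RBFInterpolator(params, coeffs)      # design parameters -> coefficients

new_design = np.array([[0.2, 0.5, 0.7]])
predicted_field = pca.inverse_transform(surrogate(new_design))
```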

  4. Advanced computations in plasma physics

    International Nuclear Information System (INIS)

    Tang, W.M.

    2002-01-01

    Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. In this paper we review recent progress and future directions for advanced simulations in magnetically confined plasmas with illustrative examples chosen from magnetic confinement research areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to

  5. The Computational Materials Repository

    DEFF Research Database (Denmark)

    Landis, David D.; Hummelshøj, Jens S.; Nestorov, Svetlozar

    2012-01-01

    The possibilities for designing new materials based on quantum physics calculations are rapidly growing, but these design efforts lead to a significant increase in the amount of computational data created. The Computational Materials Repository (CMR) addresses this data challenge and provides...

  6. Computational Medicine

    DEFF Research Database (Denmark)

    Nygaard, Jens Vinge

    2017-01-01

    The Health Technology Program at Aarhus University applies computational biology to investigate the heterogeneity of tumours.

  7. Grid Computing

    Indian Academy of Sciences (India)

    A computing grid interconnects resources such as high performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers ...

  8. Communication: Proper treatment of classically forbidden electronic transitions significantly improves detailed balance in surface hopping

    Energy Technology Data Exchange (ETDEWEB)

    Sifain, Andrew E. [Department of Physics and Astronomy, University of Southern California, Los Angeles, California 90089-0485 (United States); Wang, Linjun [Department of Chemistry, Zhejiang University, Hangzhou 310027 (China); Prezhdo, Oleg V. [Department of Physics and Astronomy, University of Southern California, Los Angeles, California 90089-0485 (United States); Department of Chemistry, University of Southern California, Los Angeles, California 90089-1062 (United States)

    2016-06-07

    Surface hopping is the most popular method for nonadiabatic molecular dynamics. Many have reported that it does not rigorously attain detailed balance at thermal equilibrium, but does so approximately. We show that convergence to the Boltzmann populations is significantly improved when the nuclear velocity is reversed after a classically forbidden hop. The proposed prescription significantly reduces the total number of classically forbidden hops encountered along a trajectory, suggesting that some randomization in nuclear velocity is needed when classically forbidden hops constitute a large fraction of attempted hops. Our results are verified computationally using two- and three-level quantum subsystems, coupled to a classical bath undergoing Langevin dynamics.
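
    The prescription highlighted in the abstract — reverse the nuclear velocity when a hop is classically forbidden — can be illustrated with a one-dimensional sketch. Everything below (energies, momentum handling, hop probability) is a simplified stand-in; the actual fewest-switches propagation, nonadiabatic couplings, and Langevin bath are omitted.

    ```python
    # Minimal 1D sketch of a hop attempt with velocity reversal on frustrated hops.
    import numpy as np

    def attempt_hop(p, mass, e_current, e_target, hop_probability, rng):
        """Attempt a surface hop; reverse the velocity if the hop is classically forbidden."""
        if rng.random() < hop_probability:
            kinetic = p**2 / (2.0 * mass)
            gap = e_target - e_current
            if kinetic >= gap:                  # allowed hop: rescale momentum to conserve energy
                p = np.sign(p) * np.sqrt(2.0 * mass * (kinetic - gap))
                return p, e_target
            else:                               # forbidden ("frustrated") hop: reverse velocity
                return -p, e_current
        return p, e_current                     # no hop attempted
    ```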

  9. Historical Significant Volcanic Eruption Locations

    Data.gov (United States)

    Department of Homeland Security — A significant eruption is classified as one that meets at least one of the following criteria: caused fatalities, caused moderate damage (approximately $1 million or...

  10. Quantum computers and quantum computations

    International Nuclear Information System (INIS)

    Valiev, Kamil' A

    2005-01-01

    This review outlines the principles of operation of quantum computers and their elements. The theory of ideal computers that do not interact with the environment and are immune to quantum decohering processes is presented. Decohering processes in quantum computers are investigated. The review considers methods for correcting quantum computing errors arising from the decoherence of the state of the quantum computer, as well as possible methods for the suppression of the decohering processes. A brief enumeration of proposed quantum computer realizations concludes the review. (reviews of topical problems)

  11. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  12. Pervasive Computing

    NARCIS (Netherlands)

    Silvis-Cividjian, N.

    This book provides a concise introduction to Pervasive Computing, otherwise known as the Internet of Things (IoT) and Ubiquitous Computing (Ubicomp), which addresses the seamless integration of computing systems within everyday objects. By introducing the core topics and exploring assistive pervasive

  13. Computational vision

    CERN Document Server

    Wechsler, Harry

    1990-01-01

    The book is suitable for advanced courses in computer vision and image processing. In addition to providing an overall view of computational vision, it contains extensive material on topics that are not usually covered in computer vision texts (including parallel distributed processing and neural networks) and considers many real applications.

  14. Spatial Computation

    Science.gov (United States)

    2003-12-01

    Computation and today's microprocessors with the approach to operating system architecture, and the controversy between microkernels and monolithic kernels... Both Spatial Computation and microkernels break away a relatively monolithic architecture into individual lightweight pieces, well specialized... for their particular functionality. Spatial Computation removes global signals and control, in the same way microkernels remove the global address

  15. Computational mathematics in China

    CERN Document Server

    Shi, Zhong-Ci

    1994-01-01

    This volume describes the most significant contributions made by Chinese mathematicians over the past decades in various areas of computational mathematics. Some of the results are quite important and complement Western developments in the field. The contributors to the volume range from noted senior mathematicians to promising young researchers. The topics include finite element methods, computational fluid mechanics, numerical solutions of differential equations, computational methods in dynamical systems, numerical algebra, approximation, and optimization. Containing a number of survey articles, the book provides an excellent way for Western readers to gain an understanding of the status and trends of computational mathematics in China.

  16. Energy-Aware Computation Offloading of IoT Sensors in Cloudlet-Based Mobile Edge Computing.

    Science.gov (United States)

    Ma, Xiao; Lin, Chuang; Zhang, Han; Liu, Jianwei

    2018-06-15

    Mobile edge computing is proposed as a promising computing paradigm to relieve the excessive burden of data centers and mobile networks, which is induced by the rapid growth of Internet of Things (IoT). This work introduces the cloud-assisted multi-cloudlet framework to provision scalable services in cloudlet-based mobile edge computing. Due to the constrained computation resources of cloudlets and limited communication resources of wireless access points (APs), IoT sensors with identical computation offloading decisions interact with each other. To optimize the processing delay and energy consumption of computation tasks, theoretic analysis of the computation offloading decision problem of IoT sensors is presented in this paper. In more detail, the computation offloading decision problem of IoT sensors is formulated as a computation offloading game and the condition of Nash equilibrium is derived by introducing the tool of a potential game. By exploiting the finite improvement property of the game, the Computation Offloading Decision (COD) algorithm is designed to provide decentralized computation offloading strategies for IoT sensors. Simulation results demonstrate that the COD algorithm can significantly reduce the system cost compared with the random-selection algorithm and the cloud-first algorithm. Furthermore, the COD algorithm can scale well with increasing IoT sensors.
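
    As described, the offloading game is solved by exploiting the finite improvement property of a potential game: sensors revise their decisions one at a time until no sensor can lower its own cost. The sketch below uses an assumed, simplified congestion-style cost model and update loop for illustration; it is not the paper's exact COD algorithm or cost function.

    ```python
    # Toy best-response dynamics for a binary offloading game (0 = local, 1 = offload).
    import random

    def cost(i, decisions, local_cost, offload_base):
        """Offloading cost grows with the number of sensors sharing the cloudlet/AP."""
        if decisions[i] == 0:                       # compute locally
            return local_cost[i]
        sharing = sum(1 for d in decisions if d == 1)
        return offload_base[i] * sharing            # congestion penalty when offloading

    def cod_like_iteration(n, local_cost, offload_base, max_rounds=100):
        decisions = [0] * n
        for _ in range(max_rounds):
            improved = False
            for i in random.sample(range(n), n):    # sensors update one at a time
                current = cost(i, decisions, local_cost, offload_base)
                decisions[i] ^= 1                   # try the alternative strategy
                if cost(i, decisions, local_cost, offload_base) >= current:
                    decisions[i] ^= 1               # revert if it does not improve
                else:
                    improved = True
            if not improved:                        # no unilateral improvement: Nash equilibrium
                break
        return decisions
    ```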

  17. Offline computing and networking

    International Nuclear Information System (INIS)

    Appel, J.A.; Avery, P.; Chartrand, G.

    1985-01-01

    This note summarizes the work of the Offline Computing and Networking Group. The report is divided into two sections; the first deals with the computing and networking requirements and the second with the proposed way to satisfy those requirements. In considering the requirements, we have considered two types of computing problems. The first is CPU-intensive activity such as production data analysis (reducing raw data to DST), production Monte Carlo, or engineering calculations. The second is physicist-intensive computing such as program development, hardware design, physics analysis, and detector studies. For both types of computing, we examine a variety of issues. These included a set of quantitative questions: how much CPU power (for turn-around and for through-put), how much memory, mass-storage, bandwidth, and so on. There are also very important qualitative issues: what features must be provided by the operating system, what tools are needed for program design, code management, database management, and for graphics

  18. Significance of irradiation of blood

    International Nuclear Information System (INIS)

    Sekine, Hiroshi; Gotoh, Eisuke; Mochizuki, Sachio

    1992-01-01

    Many reports of fatal GVHD occurring in non-immunocompromised patients after blood transfusion have been published in Japan. One explanation is that transfused lymphocytes are stimulated and attack the recipient's organs, which they recognize as HLA incompatible; this is the so-called 'one-way matching'. To reduce the risk of post-transfusion GVHD, one of the most convenient methods is to irradiate the donated blood at a dose appropriate for the inactivation of lymphocytes. Because the late effects of transfusing irradiated blood are unknown, prospective safety monitoring is necessary. (author)

  19. Principles for the wise use of computers by children.

    Science.gov (United States)

    Straker, L; Pollock, C; Maslen, B

    2009-11-01

    Computer use by children at home and school is now common in many countries. Child computer exposure varies with the type of computer technology available and the child's age, gender and social group. This paper reviews the current exposure data and the evidence for positive and negative effects of computer use by children. Potential positive effects of computer use by children include enhanced cognitive development and school achievement, reduced barriers to social interaction, enhanced fine motor skills and visual processing and effective rehabilitation. Potential negative effects include threats to child safety, inappropriate content, exposure to violence, bullying, Internet 'addiction', displacement of moderate/vigorous physical activity, exposure to junk food advertising, sleep displacement, vision problems and musculoskeletal problems. The case for child specific evidence-based guidelines for wise use of computers is presented based on children using computers differently to adults, being physically, cognitively and socially different to adults, being in a state of change and development and the potential to impact on later adult risk. Progress towards child-specific guidelines is reported. Finally, a set of guideline principles is presented as the basis for more detailed guidelines on the physical, cognitive and social impact of computer use by children. The principles cover computer literacy, technology safety, child safety and privacy and appropriate social, cognitive and physical development. The majority of children in affluent communities now have substantial exposure to computers. This is likely to have significant effects on child physical, cognitive and social development. Ergonomics can provide and promote guidelines for wise use of computers by children and by doing so promote the positive effects and reduce the negative effects of computer-child, and subsequent computer-adult, interaction.

  20. Parallel computations

    CERN Document Server

    1982-01-01

    Parallel Computations focuses on parallel computation, with emphasis on algorithms used in a variety of numerical and physical applications and for many different types of parallel computers. Topics covered range from vectorization of fast Fourier transforms (FFTs) and of the incomplete Cholesky conjugate gradient (ICCG) algorithm on the Cray-1 to calculation of table lookups and piecewise functions. Single tridiagonal linear systems and vectorized computation of reactive flow are also discussed.Comprised of 13 chapters, this volume begins by classifying parallel computers and describing techn

  1. Human Computation

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  2. Quantum computation

    International Nuclear Information System (INIS)

    Deutsch, D.

    1992-01-01

    As computers become ever more complex, they inevitably become smaller. This leads to a need for components which are fabricated and operate on increasingly smaller size scales. Quantum theory is already taken into account in microelectronics design. This article explores how quantum theory will need to be incorporated into the design of future computers in order for their components to function. Computation tasks which depend on quantum effects will become possible. Physicists may have to reconsider their perspective on computation in the light of understanding developed in connection with universal quantum computers. (UK)

  3. Significant Lactic Acidosis from Albuterol

    Directory of Open Access Journals (Sweden)

    Deborah Diercks

    2018-03-01

    Lactic acidosis is a clinical entity that demands rapid assessment and treatment to prevent significant morbidity and mortality. With increased lactate use across many clinical scenarios, lactate values themselves cannot be interpreted apart from their appropriate clinical picture. The significance of Type B lactic acidosis is likely understated in the emergency department (ED). Given the mortality that sepsis confers, a serum lactate is an important screening study. That said, it is with extreme caution that we should interpret and react to the resultant elevated value. We report a patient with a significant lactic acidosis. Though he had a high lactate value, he did not require aggressive resuscitation. A different classification scheme for lactic acidosis that focuses on the bifurcation of the “dangerous” and “not dangerous” causes of lactic acidosis may be of benefit. In addition, this case is demonstrative of the potential overuse of lactates in the ED.

  4. The Research of the Parallel Computing Development from the Angle of Cloud Computing

    Science.gov (United States)

    Peng, Zhensheng; Gong, Qingge; Duan, Yanyu; Wang, Yun

    2017-10-01

    Cloud computing is the outgrowth of parallel computing, distributed computing and grid computing, and its development has brought parallel computing into everyday life. Firstly, this paper expounds the concept of cloud computing and introduces several traditional parallel programming models. Secondly, it analyzes the principles, advantages and disadvantages of OpenMP, MPI and MapReduce respectively. Finally, it compares the MPI and OpenMP models with MapReduce from the angle of cloud computing. The results of this paper are intended to provide a reference for the development of parallel computing.
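
    For readers unfamiliar with the third model compared above, the map/shuffle/reduce structure of MapReduce can be mimicked in a few lines of plain Python. The word-count task and data here are hypothetical illustrations only, not taken from the paper.

    ```python
    # Toy illustration of the MapReduce programming model using the standard library.
    from collections import defaultdict
    from functools import reduce

    documents = ["cloud computing", "parallel computing", "grid and cloud"]

    # Map phase: emit (word, 1) pairs independently for each input split.
    mapped = [(word, 1) for doc in documents for word in doc.split()]

    # Shuffle phase: group intermediate pairs by key.
    groups = defaultdict(list)
    for word, count in mapped:
        groups[word].append(count)

    # Reduce phase: combine the values for each key.
    word_counts = {word: reduce(lambda a, b: a + b, counts) for word, counts in groups.items()}
    print(word_counts)  # {'cloud': 2, 'computing': 2, 'parallel': 1, 'grid': 1, 'and': 1}
    ```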

  5. Computed tomography

    International Nuclear Information System (INIS)

    Boyd, D.P.

    1989-01-01

    This paper reports on computed tomographic (CT) scanning which has improved computer-assisted imaging modalities for radiologic diagnosis. The advantage of this modality is its ability to image thin cross-sectional planes of the body, thus uncovering density information in three dimensions without tissue superposition problems. Because this enables vastly superior imaging of soft tissues in the brain and body, CT scanning was immediately successful and continues to grow in importance as improvements are made in speed, resolution, and cost efficiency. CT scanners are used for general purposes, and the more advanced machines are generally preferred in large hospitals, where volume and variety of usage justify the cost. For imaging in the abdomen, a scanner with a rapid speed is preferred because peristalsis, involuntary motion of the diaphragm, and even cardiac motion are present and can significantly degrade image quality. When contrast media is used in imaging to demonstrate scanner, immediate review of images, and multiformat hardcopy production. A second console is reserved for the radiologist to read images and perform the several types of image analysis that are available. Since CT images contain quantitative information in terms of density values and contours of organs, quantitation of volumes, areas, and masses is possible. This is accomplished with region-of-interest methods, which involve the electronic outlining of the selected region of the television display monitor with a trackball-controlled cursor. In addition, various image-processing options, such as edge enhancement (for viewing fine details of edges) or smoothing filters (for enhancing the detectability of low-contrast lesions) are useful tools.
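
    The region-of-interest quantitation mentioned above amounts to computing simple statistics over an outlined set of pixels. A minimal sketch with synthetic data and an assumed pixel spacing (neither taken from the paper) might look like this:

    ```python
    # ROI quantitation on a CT slice: mean density and area inside a mask.
    import numpy as np

    ct_slice = np.random.normal(40, 10, size=(512, 512))   # hypothetical density values (HU)
    roi_mask = np.zeros_like(ct_slice, dtype=bool)
    roi_mask[200:260, 240:300] = True                       # stand-in for an outlined ROI

    pixel_area_mm2 = 0.7 * 0.7                              # assumed in-plane pixel spacing
    mean_hu = ct_slice[roi_mask].mean()
    area_mm2 = roi_mask.sum() * pixel_area_mm2
    print(f"mean density {mean_hu:.1f} HU over {area_mm2:.0f} mm^2")
    ```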

  6. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  7. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  8. Quantum analogue computing.

    Science.gov (United States)

    Kendon, Vivien M; Nemoto, Kae; Munro, William J

    2010-08-13

    We briefly review what a quantum computer is, what it promises to do for us and why it is so hard to build one. Among the first applications anticipated to bear fruit is the quantum simulation of quantum systems. While most quantum computation is an extension of classical digital computation, quantum simulation differs fundamentally in how the data are encoded in the quantum computer. To perform a quantum simulation, the Hilbert space of the system to be simulated is mapped directly onto the Hilbert space of the (logical) qubits in the quantum computer. This type of direct correspondence is how data are encoded in a classical analogue computer. There is no binary encoding, and increasing precision becomes exponentially costly: an extra bit of precision doubles the size of the computer. This has important consequences for both the precision and error-correction requirements of quantum simulation, and significant open questions remain about its practicality. It also means that the quantum version of analogue computers, continuous-variable quantum computers, becomes an equally efficient architecture for quantum simulation. Lessons from past use of classical analogue computers can help us to build better quantum simulators in future.

  9. The historical significance of oak

    Science.gov (United States)

    J. V. Thirgood

    1971-01-01

    A brief history of the importance of oak in Europe, contrasting the methods used in France and Britain to propagate the species and manage the forests for continued productivity. The significance of oak as a strategic resource during the sailing-ship era is stressed, and mention is made of the early development of oak management in North America. The international...

  10. SSTRAP: A computational model for genomic motif discovery ...

    African Journals Online (AJOL)

    Computational methods can potentially provide high-quality prediction of biological molecules such as DNA binding sites and Transcription factors and therefore reduce the time needed for experimental verification and challenges associated with experimental methods. These biological molecules or motifs have significant ...

  11. Shrinkage Reducing Admixture for Concrete

    OpenAIRE

    ECT Team, Purdue

    2007-01-01

    Concrete shrinkage cracking is a common problem in all types of concrete structures, especially in structures and environments where cracks are prevalent and their repercussions are most severe. A liquid shrinkage-reducing admixture for concrete, developed by GRACE Construction Products and ARCO Chemical Company, significantly reduces shrinkage during concrete drying and potentially reduces overall cracking over time.

  12. Computational fluid dynamics simulations of light water reactor flows

    International Nuclear Information System (INIS)

    Tzanos, C.P.; Weber, D.P.

    1999-01-01

    Advances in computational fluid dynamics (CFD), turbulence simulation, and parallel computing ha