WorldWideScience

Sample records for total computational time

  1. Television viewing, computer use and total screen time in Canadian youth.

    Science.gov (United States)

    Mark, Amy E; Boyce, William F; Janssen, Ian

    2006-11-01

    Research has linked excessive television viewing and computer use in children and adolescents to a variety of health and social problems. Current recommendations are that screen time in children and adolescents should be limited to no more than 2 h per day. To determine the percentage of Canadian youth meeting the screen time guideline recommendations. The representative study sample consisted of 6942 Canadian youth in grades 6 to 10 who participated in the 2001/2002 World Health Organization Health Behaviour in School-Aged Children survey. Only 41% of girls and 34% of boys in grades 6 to 10 watched 2 h or less of television per day. Once the time of leisure computer use was included and total daily screen time was examined, only 18% of girls and 14% of boys met the guidelines. The prevalence of those meeting the screen time guidelines was higher in girls than boys. Fewer than 20% of Canadian youth in grades 6 to 10 met the total screen time guidelines, suggesting that increased public health interventions are needed to reduce the number of leisure time hours that Canadian youth spend watching television and using the computer.

  2. A new system of computer-assisted navigation leading to reduction in operating time in uncemented total hip replacement in a matched population.

    Science.gov (United States)

    Chaudhry, Fouad A; Ismail, Sanaa Z; Davis, Edward T

    2018-05-01

    Computer-assisted navigation techniques are used to optimise component placement and alignment in total hip replacement. The technology has developed over the last 10 years, but despite its advantages only 0.3% of all total hip replacements in England and Wales are done using computer navigation. One reason for this is that computer-assisted technology increases operative time. A new method of pelvic registration has been developed that does not require registration of the anterior pelvic plane (BrainLab hip 6.0), and it has been shown to improve the accuracy of THR. The purpose of this study was to find out whether the new method reduces the operating time. This was a retrospective analysis comparing operating time in computer-navigated primary uncemented total hip replacement using two methods of registration. Group 1 included 128 cases that were performed using BrainLab versions 2.1-5.1; these versions relied on the acquisition of the anterior pelvic plane for registration. Group 2 included 128 cases that were performed using the newest navigation software, BrainLab hip 6.0 (registration possible with the patient in the lateral decubitus position). The operating time was 65.79 (40-98) minutes using the old method of registration and 50.87 (33-74) minutes using the new method. This difference was statistically significant. The body mass index (BMI) was comparable in both groups. The study supports the use of the new method of registration for improving the operating time in computer-navigated primary uncemented total hip replacements.

  3. Can You Depend Totally on Computers?

    Indian Academy of Sciences (India)

    Can You Depend Totally on Computers? Computer Security, Availability and Correctness. H N Mahabala. General Article. Resonance – Journal of Science Education, Volume 3, Issue 2, February 1998, pp 35-44 ...

  4. Total variation-based neutron computed tomography

    Science.gov (United States)

    Barnard, Richard C.; Bilheux, Hassina; Toops, Todd; Nafziger, Eric; Finney, Charles; Splitter, Derek; Archibald, Rick

    2018-05-01

    We perform the neutron computed tomography reconstruction problem via an inverse problem formulation with a total variation penalty. In the case of highly under-resolved angular measurements, the total variation penalty suppresses high-frequency artifacts which appear in filtered back projections. In order to efficiently compute solutions for this problem, we implement a variation of the split Bregman algorithm; due to the error-forgetting nature of the algorithm, the computational cost of updating can be significantly reduced via very inexact approximate linear solvers. We demonstrate the effectiveness of the algorithm in the case of significantly reduced angular sampling, using synthetic test problems as well as data obtained from a high-flux neutron source. The algorithm removes artifacts and can even roughly capture small features when an extremely low number of angles is used.
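
    A worked restatement of the formulation above (the operator and weight symbols are illustrative, not taken from the paper): the reconstruction solves

        \min_x \; \tfrac{1}{2}\|Ax - b\|_2^2 + \lambda\,\mathrm{TV}(x), \qquad \mathrm{TV}(x) = \|\nabla x\|_1,

    where A is the angularly under-sampled projection operator, b the measured data and \lambda the penalty weight. Split Bregman introduces an auxiliary variable d \approx \nabla x and alternates a quadratic solve in x (where the inexact linear solvers mentioned above suffice) with a cheap shrinkage update in d.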

  5. Cross-sectional associations of total sitting and leisure screen time with cardiometabolic risk in adults. Results from the HUNT Study, Norway.

    Science.gov (United States)

    Chau, Josephine Y; Grunseit, Anne; Midthjell, Kristian; Holmen, Jostein; Holmen, Turid L; Bauman, Adrian E; van der Ploeg, Hidde P

    2014-01-01

    To examine associations of total sitting time, TV-viewing and leisure-time computer use with cardiometabolic risk biomarkers in adults. Population based cross-sectional study. Waist circumference, BMI, total cholesterol, HDL cholesterol, blood pressure, non-fasting glucose, gamma glutamyltransferase (GGT) and triglycerides were measured in 48,882 adults aged 20 years or older from the Nord-Trøndelag Health Study 2006-2008 (HUNT3). Adjusted multiple regression models were used to test for associations between these biomarkers and self-reported total sitting time, TV-viewing and leisure-time computer use in the whole sample and by cardiometabolic disease status sub-groups. In the whole sample, reporting total sitting time ≥10 h/day was associated with poorer BMI, waist circumference, total cholesterol, HDL cholesterol, diastolic blood pressure, systolic blood pressure, non-fasting glucose, GGT and triglyceride levels compared to those reporting total sitting time <10 h/day. Leisure-time computer use ≥1 h/day was associated with poorer BMI, total cholesterol, diastolic blood pressure, GGT and triglycerides compared with those reporting no leisure-time computing. Sub-group analyses by cardiometabolic disease status showed similar patterns in participants free of cardiometabolic disease, while similar albeit non-significant patterns were observed in those with cardiometabolic disease. Total sitting time, TV-viewing and leisure-time computer use are associated with poorer cardiometabolic risk profiles in adults. Reducing sedentary behaviour throughout the day and limiting TV-viewing and leisure-time computer use may have health benefits. Copyright © 2013 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  6. Single machine total completion time minimization scheduling with a time-dependent learning effect and deteriorating jobs

    Science.gov (United States)

    Wang, Ji-Bo; Wang, Ming-Zheng; Ji, Ping

    2012-05-01

    In this article, we consider a single machine scheduling problem with a time-dependent learning effect and deteriorating jobs. By the effects of time-dependent learning and deterioration, we mean that the job processing time is defined by a function of its starting time and total normal processing time of jobs in front of it in the sequence. The objective is to determine an optimal schedule so as to minimize the total completion time. This problem remains open for the case of -1 < a < 0, where a denotes the learning index; we show that an optimal schedule of the problem is V-shaped with respect to job normal processing times. Three heuristic algorithms utilising the V-shaped property are proposed, and computational experiments show that the last heuristic algorithm performs effectively and efficiently in obtaining near-optimal solutions.
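
    The article's exact processing-time model is not reproduced in the abstract, so the Python sketch below uses one representative form from this literature, purely as an assumption, to show how the total completion time of a sequence is evaluated and how small instances can be checked by enumeration:

      import itertools

      def total_completion_time(sequence, p, a=-0.3, c=0.1):
          """Illustrative model only: a job with normal time p_j, started at
          time t after jobs of total normal processing time P, takes
              p_j * (1 + P)**a * (1 + c*t)
          (a < 0 models learning, c > 0 models deterioration)."""
          t = total = P = 0.0
          for j in sequence:
              t += p[j] * (1 + P) ** a * (1 + c * t)  # completion time of job j
              total += t
              P += p[j]
          return total

      p = [4.0, 2.0, 7.0, 3.0]  # normal processing times
      best = min(itertools.permutations(range(len(p))),
                 key=lambda s: total_completion_time(s, p))
      print(best, total_completion_time(best, p))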

  7. Total and Partial Computation in Categorical Quantum Foundations

    Directory of Open Access Journals (Sweden)

    Kenta Cho

    2015-11-01

    Full Text Available This paper uncovers the fundamental relationship between total and partial computation in the form of an equivalence of certain categories. This equivalence involves on the one hand effectuses, which are categories for total computation, introduced by Jacobs for the study of quantum/effect logic. On the other hand, it involves what we call FinPACs with effects; they are finitely partially additive categories equipped with effect algebra structures, serving as categories for partial computation. It turns out that the Kleisli category of the lift monad (-)+1 on an effectus is always a FinPAC with effects, and this construction gives rise to the equivalence. Additionally, state-and-effect triangles over FinPACs with effects are presented.
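
    As a deliberately informal illustration of "partial computation as total computation into X + 1", Kleisli composition for the lift monad can be mimicked in Python with Optional, None playing the added point; this is an analogy, not the paper's categorical construction:

      import math
      from typing import Callable, Optional, TypeVar

      A, B, C = TypeVar("A"), TypeVar("B"), TypeVar("C")

      def kleisli(f: Callable[[A], Optional[B]],
                  g: Callable[[B], Optional[C]]) -> Callable[[A], Optional[C]]:
          """Compose two partial maps, modelled as total maps into X + 1
          (None plays the role of the added point of the lift monad)."""
          def composed(x):
              y = f(x)
              return None if y is None else g(y)
          return composed

      safe_inv = lambda x: None if x == 0 else 1.0 / x
      safe_sqrt = lambda x: None if x < 0 else math.sqrt(x)
      inv_sqrt = kleisli(safe_inv, safe_sqrt)
      print(inv_sqrt(4.0), inv_sqrt(0.0))   # 0.5 None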

  8. TV time but not computer time is associated with cardiometabolic risk in Dutch young adults.

    Science.gov (United States)

    Altenburg, Teatske M; de Kroon, Marlou L A; Renders, Carry M; Hirasing, Remy; Chinapaw, Mai J M

    2013-01-01

    TV time and total sedentary time have been positively related to biomarkers of cardiometabolic risk in adults. We aim to examine the association of TV time and computer time separately with cardiometabolic biomarkers in young adults. Additionally, the mediating role of waist circumference (WC) is studied. Data of 634 Dutch young adults (18-28 years; 39% male) were used. Cardiometabolic biomarkers included indicators of overweight, blood pressure, blood levels of fasting plasma insulin, cholesterol, glucose, triglycerides and a clustered cardiometabolic risk score. Linear regression analyses were used to assess the cross-sectional association of self-reported TV and computer time with cardiometabolic biomarkers, adjusting for demographic and lifestyle factors. Mediation by WC was checked using the product-of-coefficient method. TV time was significantly associated with triglycerides (B = 0.004; CI = [0.001;0.05]) and insulin (B = 0.10; CI = [0.01;0.20]). Computer time was not significantly associated with any of the cardiometabolic biomarkers. We found no evidence for WC to mediate the association of TV time or computer time with cardiometabolic biomarkers. We found a significantly positive association of TV time with cardiometabolic biomarkers. In addition, we found no evidence for WC as a mediator of this association. Our findings suggest a need to distinguish between TV time and computer time within future guidelines for screen time.
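
    The "product-of-coefficient method" named above multiplies the exposure-to-mediator path (a) by the mediator-to-outcome path (b). A minimal sketch with simulated data (all numbers are invented, statsmodels is an assumed dependency, and variable names are illustrative):

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      n = 634
      tv = rng.exponential(2.0, n)                  # hours/day TV (simulated)
      wc = 80 + 1.5 * tv + rng.normal(0, 8, n)      # waist circumference (mediator)
      trig = 1.0 + 0.02 * tv + 0.01 * wc + rng.normal(0, 0.3, n)

      # Path a: exposure -> mediator
      a = sm.OLS(wc, sm.add_constant(tv)).fit().params[1]
      # Path b: mediator -> outcome, adjusting for exposure
      X = sm.add_constant(np.column_stack([tv, wc]))
      b = sm.OLS(trig, X).fit().params[2]

      print("indirect (mediated) effect a*b =", a * b)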

  9. Computer Assisted Surgery and Current Trends in Orthopaedics Research and Total Joint Replacements

    Science.gov (United States)

    Amirouche, Farid

    2008-06-01

    Musculoskeletal research has brought about revolutionary changes in our ability to perform high precision surgery in joint replacement procedures. Recent advances in computer assisted surgery as well as better materials have led to reduced wear and greatly enhanced the quality of life of patients. New surgical techniques that reduce the size of the incision and the damage to underlying structures have been the primary advance toward this goal; these techniques are known as MIS, or Minimally Invasive Surgery. Total hip and knee arthroplasties are at an all-time high, reaching 1.2 million surgeries per year in the USA. Primary joint failures are usually due to osteoarthritis, rheumatoid arthritis, osteonecrosis and other inflammatory arthritis conditions. The methods for THR and TKA are critical to the initial stability and longevity of the prostheses. This research aims at understanding the fundamental mechanics of joint arthroplasty and providing an insight into current challenges in patient-specific fitting, fixing, and stability. Both experimental and analytical work will be presented. We will examine the success of cementless total hip arthroplasty over the last 10 years and the role computer assisted navigation is playing in follow-up studies. Cementless total hip arthroplasty attains permanent fixation by the ingrowth of bone into a porous coated surface. Loosening of an ingrown total hip arthroplasty occurs as a result of osteolysis of the periprosthetic bone and degradation of the bone prosthetic interface. The osteolytic process occurs as a result of polyethylene wear particles produced by the metal polyethylene articulation of the prosthesis. The total hip arthroplasty is a congruent joint, and the submicron wear particles produced are phagocytized by macrophages, initiating an inflammatory cascade. This cascade produces cytokines ultimately implicated in osteolysis. The resulting bone loss on both the acetabular and femoral sides eventually leads to component instability. As

  10. TimeSet: A computer program that accesses five atomic time services on two continents

    Science.gov (United States)

    Petrakis, P. L.

    1993-01-01

    TimeSet is a shareware program for accessing digital time services by telephone. At its initial release, it was capable of capturing time signals only from the U.S. Naval Observatory to set a computer's clock. Later the ability to synchronize with the National Institute of Standards and Technology was added. Now, in Version 7.10, TimeSet is able to access three additional telephone time services in Europe - in Sweden, Austria, and Italy - making a total of five official services addressable by the program. A companion program, TimeGen, allows yet another source of telephone time data strings for callers equipped with TimeSet version 7.10. TimeGen synthesizes UTC time data strings in the Naval Observatory's format from an accurately set and maintained DOS computer clock, and transmits them to callers. This allows an unlimited number of 'freelance' time generating stations to be created. Timesetting from TimeGen is made feasible by the advent of Becker's RighTime, a shareware program that learns the drift characteristics of a computer's clock and continuously applies a correction to keep it accurate, and also brings .01 second resolution to the DOS clock. With clock regulation by RighTime and periodic update calls by the TimeGen station to an official time source via TimeSet, TimeGen offers the same degree of accuracy within the resolution of the computer clock as any official atomic time source.

  11. Complications of fixed infrared emitters in computer-assisted total knee arthroplasties

    Directory of Open Access Journals (Sweden)

    Suárez-Vázquez Abelardo

    2007-07-01

    Full Text Available Abstract Background: The first stage in the implant of a total knee arthroplasty with computer-assisted surgery is to fasten the emitters to the femur and the tibia. These trackers must be hard-fixed to the bone. The objectives of our study are to evaluate the technical problems and complications of these tracker-pins, the time necessary to fix them to the bone, and the possible advantages of a new femoral-fixed tracker-pin. Methods: Three hundred and sixty-seven tracker-pins were used in one hundred and fifty-one computer-assisted total knee replacements. A bicortical screw was used to fix the tracker to the tibia in all cases; in the femur, however, a bicortical tracker was used in 112 cases, while a new device (OrthoLock) with percutaneous fixation pins was employed in the remaining 39. Results: Technical problems related to the fixing of the trackers appeared in nine cases (2.5%). The mean surgery time to fix the tracker pin to the tibia was 3 minutes (range 2–7), and 5 minutes in the case of the femoral pin (range 4–11), although with the new tool it was only three minutes (range 2–4) (p < 0.05). Conclusion: The incidence of problems and complications with the fixing systems used in knee navigation is very small. The use of a new device with percutaneous pins facilitates the fixing of femoral trackers and decreases the time needed to place them.

  12. 12 CFR 1102.27 - Computing time.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Computing time. 1102.27 Section 1102.27 Banks... for Proceedings § 1102.27 Computing time. (a) General rule. In computing any period of time prescribed... time begins to run is not included. The last day so computed is included, unless it is a Saturday...
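
    The quoted rule is mechanical enough to state as code. A minimal Python sketch (the empty holiday set is a placeholder; a real filing calendar would load the official federal holiday list):

      from datetime import date, timedelta

      HOLIDAYS = set()  # placeholder: federal holidays would be listed here

      def deadline(act_date: date, days: int) -> date:
          """Apply the rule quoted above: the day of the act is not counted,
          and if the last day falls on a Saturday, Sunday or holiday, the
          period runs to the next business day."""
          d = act_date + timedelta(days=days)   # day of the act excluded
          while d.weekday() >= 5 or d in HOLIDAYS:
              d += timedelta(days=1)
          return d

      print(deadline(date(2010, 1, 1), 30))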

  13. Total reduction of distorted echelle spectrograms - An automatic procedure. [for computer controlled microdensitometer

    Science.gov (United States)

    Peterson, R. C.; Title, A. M.

    1975-01-01

    A total reduction procedure, notable for its use of a computer-controlled microdensitometer for semi-automatically tracing curved spectra, is applied to distorted high-dispersion echelle spectra recorded by an image tube. Microdensitometer specifications are presented and the FORTRAN programs TRACEN and SPOTS are outlined. The intensity spectrum of the photographic or electrographic plate is plotted on a graphic display. The time requirements are discussed in detail.

  14. 12 CFR 622.21 - Computing time.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Computing time. 622.21 Section 622.21 Banks and... Formal Hearings § 622.21 Computing time. (a) General rule. In computing any period of time prescribed or... run is not to be included. The last day so computed shall be included, unless it is a Saturday, Sunday...

  15. Optimizing Ship Speed to Minimize Total Fuel Consumption with Multiple Time Windows

    Directory of Open Access Journals (Sweden)

    Jae-Gon Kim

    2016-01-01

    Full Text Available We study the ship speed optimization problem with the objective of minimizing the total fuel consumption. We consider multiple time windows for each port call as constraints and formulate the problem as a nonlinear mixed integer program. We derive intrinsic properties of the problem and develop an exact algorithm based on the properties. Computational experiments show that the suggested algorithm is very efficient in finding an optimal solution.
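
    A hedged sketch of one way such a model can be written (the cubic fuel-speed law is standard in this literature, but the symbols below are illustrative rather than the authors' notation): with leg distances d_i, speeds v_i and binary window choices z_{ik},

        \min_{v,\,t,\,z} \; \sum_i c\, d_i\, v_i^2
        \quad \text{s.t.} \quad t_i = t_{i-1} + d_i / v_i, \quad \sum_k z_{ik} = 1,
        \quad a_{ik} - M(1 - z_{ik}) \le t_i \le b_{ik} + M(1 - z_{ik}), \quad v_{\min} \le v_i \le v_{\max},

    since fuel burned per unit time grows roughly as v^3 and sailing time is d_i/v_i, each leg costs about c d_i v_i^2. The nonlinear terms d_i/v_i together with the binary window-selection variables make this a nonlinear mixed integer program, matching the formulation described above.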

  16. Cross-sectional associations of total sitting and leisure screen time with cardiometabolic risk in adults. Results from the HUNT Study, Norway

    NARCIS (Netherlands)

    Chau, J.Y.; Grunseit, A.; Midthjell, K.; Holmen, J.; Holmen, T.L.; Bauman, A.E.; van der Ploeg, H.P.

    2014-01-01

    Objectives: To examine associations of total sitting time, TV-viewing and leisure-time computer use with cardiometabolic risk biomarkers in adults. Design: Population based cross-sectional study. Methods: Waist circumference, BMI, total cholesterol, HDL cholesterol, blood pressure, non-fasting

  17. 12 CFR 908.27 - Computing time.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Computing time. 908.27 Section 908.27 Banks and... PRACTICE AND PROCEDURE IN HEARINGS ON THE RECORD General Rules § 908.27 Computing time. (a) General rule. In computing any period of time prescribed or allowed by this subpart, the date of the act or event...

  18. BIOMECHANICAL INDICES OF STANDING AND GAIT IN PATIENTS AFTER TOTAL KNEE REPLACEMENT USING COMPUTER NAVIGATION

    Directory of Open Access Journals (Sweden)

    Y. A. Bezgodkov

    2011-01-01

    Full Text Available Several biomechanical parameters of standing and walking were evaluated in 50 patients with osteoarthrosis after total knee arthroplasty. The patients were randomly divided into two equal groups: in the first group the surgery was performed with a computer navigation system, and in the second with traditional instruments. After TKA with computer navigation, the centres of common body pressure and of leg pressure during the standing phase improved significantly more than in the traditional group. Walking parameters such as step length, ground contact time and rhythm coefficient improved in both groups of patients, but without significant difference. Thus, the more precise implant orientation achieved during computer-assisted TKA leads to better functional performance at 6 and 12 months after surgery.

  19. Effects on mortality, treatment, and time management as a result of routine use of total body computed tomography in blunt high-energy trauma patients.

    Science.gov (United States)

    van Vugt, Raoul; Kool, Digna R; Deunk, Jaap; Edwards, Michael J R

    2012-03-01

    Currently, total body computed tomography (TBCT) is being rapidly implemented in the evaluation of trauma patients. With this review, we aim to evaluate the clinical implications (mortality, change in treatment, and time management) of the routine use of TBCT in adult blunt high-energy trauma patients, compared with a conservative approach using conventional radiography, ultrasound, and selective computed tomography. A literature search for original studies on TBCT in blunt high-energy trauma patients was performed. Two independent observers included studies concerning mortality, change of treatment, and/or time management as outcome measures. For each article, relevant data were extracted and analyzed. In addition, the quality was assessed according to the Oxford levels of evidence. From 183 articles initially identified, the observers included nine original studies in consensus. One of three studies described a significant difference in mortality; four described a change of treatment in 2% to 27% of patients because of the use of TBCT. Five studies found a gain in time with the use of immediate routine TBCT. Eight studies scored a level of evidence of 2b and one of 3b. The current literature has a predominantly suboptimal design for proving definitively that the routine use of TBCT results in improved survival of blunt high-energy trauma patients. TBCT can lead to a change of treatment and improves time intervals in the emergency department as compared with its selective use.

  20. Associations of Total and Domain-Specific Sedentary Time With Type 2 Diabetes in Taiwanese Older Adults

    Directory of Open Access Journals (Sweden)

    Ming-Chun Hsueh

    2016-07-01

    Full Text Available Background: The increasing prevalence of type 2 diabetes in older adults has become a public health concern. We investigated the associations of total and domain-specific sedentary time with risk of type 2 diabetes in older adults. Methods: The sample comprised 1046 older people (aged ≥65 years). Analyses were performed using cross-sectional data collected via computer-assisted telephone-based interviews in 2014. Data on six self-reported domains of sedentary time (Measure of Older Adults' Sedentary Time), type 2 diabetes status, and sociodemographic variables were included in the study. Binary logistic regression analysis was performed to calculate the adjusted odds ratios (ORs) and 95% confidence intervals (CIs) for total and individual sedentary behavior components and the likelihood of type 2 diabetes. Results: A total of 17.5% of the participants reported type 2 diabetes. No significant associations were found between total sitting time and risk of type 2 diabetes after controlling for confounding factors. After total sedentary behavior was stratified into six domains, only watching television for more than 2 hours per day was associated with higher odds of type 2 diabetes (OR 1.56; 95% CI, 1.10–2.21), but no significant associations were found between other domains of sedentary behavior (computer use, reading, socializing, transport, and hobbies) and risk of type 2 diabetes. Conclusions: These findings suggest that, among domain-specific sedentary behaviors, excessive television viewing might increase the risk of type 2 diabetes among older adults more than other forms of sedentary behavior.

  1. Novel crystal timing calibration method based on total variation

    Science.gov (United States)

    Yu, Xingjian; Isobe, Takashi; Watanabe, Mitsuo; Liu, Huafeng

    2016-11-01

    A novel crystal timing calibration method based on total variation (TV), abbreviated as 'TV merge', has been developed for a high-resolution positron emission tomography (PET) system. The proposed method was developed for a system with a large number of crystals and can provide timing calibration at the crystal level. In the proposed method, the timing calibration process was formulated as a linear problem. To robustly optimize the timing resolution, a TV constraint was added to the linear equation. Moreover, to solve the computer memory problem associated with the calculation of the timing calibration factors for systems with a large number of crystals, the merge component was used for obtaining the crystal-level timing calibration values. Compared with other conventional methods, the data measured from a standard cylindrical phantom filled with a radioisotope solution were sufficient for performing a high-precision crystal-level timing calibration. In this paper, both simulation and experimental studies were performed to demonstrate the effectiveness and robustness of the TV merge method. We compare the timing resolutions of a 22Na point source, located in the field of view (FOV) of the brain PET system, under various calibration techniques. After implementing the TV merge method, the timing resolution improved from 3.34 ns full width at half maximum (FWHM) to 2.31 ns FWHM.
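
    In symbols (illustrative notation, not the authors'), the linear formulation with a TV constraint described above can be written as

        \min_t \; \|A t - m\|_2^2 + \mu\,\mathrm{TV}(t),

    where t is the vector of per-crystal timing offsets, each row of A carries +1/-1 entries for the two crystals of a coincidence pair, m is the measured (time-of-flight corrected) offset for that pair, and \mu weights the TV constraint. The "merge" component mentioned above addresses the memory cost of forming this system when the number of crystals is very large.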

  2. Lot-Order Assignment Applying Priority Rules for the Single-Machine Total Tardiness Scheduling with Nonnegative Time-Dependent Processing Times

    Directory of Open Access Journals (Sweden)

    Jae-Gon Kim

    2015-01-01

    Full Text Available Lot-order assignment is to assign items in lots being processed to orders in order to fulfill the orders. It is usually performed periodically to meet the due dates of orders, especially in a manufacturing industry with a long production cycle time, such as the semiconductor manufacturing industry. In this paper, we consider the lot-order assignment problem (LOAP) with the objective of minimizing the total tardiness of the orders with distinct due dates. We show that we can solve the LOAP optimally by finding an optimal sequence for the single-machine total tardiness scheduling problem with nonnegative time-dependent processing times (SMTTSP-NNTDPT). Also, we address how the priority rules for the SMTTSP can be modified to those for the SMTTSP-NNTDPT to solve the LOAP. In computational experiments, we discuss the performance of the suggested priority rules and show that the proposed approach outperforms a commercial optimization software package.
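
    To illustrate how an SMTTSP priority rule can be adapted to time-dependent processing times, the sketch below applies a modified-due-date (MDD) style rule under an assumed linear growth model for processing times; both the rule variant and the model are illustrative stand-ins, not the paper's:

      def schedule_mdd(jobs, rate=0.05):
          """Greedy modified-due-date rule for single-machine total tardiness,
          with processing times that grow with start time:
              p_j(t) = p_j * (1 + rate*t)
          (an illustrative nonnegative time-dependent model).
          jobs: list of (normal_processing_time, due_date) pairs."""
          t, order, tardiness = 0.0, [], 0.0
          pending = list(range(len(jobs)))
          while pending:
              # MDD priority: max(due date, completion time if started now)
              def mdd(j):
                  p, d = jobs[j]
                  return max(d, t + p * (1 + rate * t))
              j = min(pending, key=mdd)
              pending.remove(j)
              t += jobs[j][0] * (1 + rate * t)
              tardiness += max(0.0, t - jobs[j][1])
              order.append(j)
          return order, tardiness

      print(schedule_mdd([(3, 6), (2, 4), (5, 18), (1, 3)]))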

  3. Timing the total reflection of light

    International Nuclear Information System (INIS)

    Chauvat, Dominique; Bonnet, Christophe; Dunseath, Kevin; Emile, Olivier; Le Floch, Albert

    2005-01-01

    We have identified for the first time the absolute delay at total reflection, envisioned by Newton. We show that there are in fact two divergent Wigner delays, depending on the polarisation of the incident light. These measurements give new insight into the passage from total reflection to refraction

  4. Development of a totally computer-controlled triple quadrupole mass spectrometer system

    International Nuclear Information System (INIS)

    Wong, C.M.; Crawford, R.W.; Barton, V.C.; Brand, H.R.; Neufeld, K.W.; Bowman, J.E.

    1983-01-01

    A totally computer-controlled triple quadrupole mass spectrometer (TQMS) is described. It has a number of unique features not available on current commercial instruments, including: complete computer control of source and all ion axial potentials; use of dual computers for data acquisition and data processing; and capability for self-adaptive control of experiments. Furthermore, it has been possible to produce this instrument at a cost significantly below that of commercial instruments. The triple quadrupole mass spectrometer has been constructed using components commercially available from several different manufacturers. The source is a standard Hewlett-Packard 5985B GC/MS source. The two quadrupole analyzers and the quadrupole CAD region contain Balzers QMA 150 rods, with Balzers QMG 511 rf controllers for the analyzers and a Balzers QHS-511 controller for the CAD region. The pulsed-positive-ion-negative-ion-chemical-ionization (PPINICI) detector is made by Finnigan Corporation. The mechanical and electronics designs were developed at LLNL for linking these diverse elements into a functional TQMS as described. The computer design for total control of the system is unique in that two separate LSI-11/23 minicomputers and assorted I/O peripherals and interfaces from several manufacturers are used. The evolution of this design concept from totally computer-controlled instrumentation into future self-adaptive or "expert" systems for instrumental analysis is described. Operational characteristics of the instrument and initial results from experiments involving the analysis of the high explosive HMX (1,3,5,7-Tetranitro-1,3,5,7-Tetrazacyclooctane) are presented

  5. 12 CFR 1780.11 - Computing time.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Computing time. 1780.11 Section 1780.11 Banks... time. (a) General rule. In computing any period of time prescribed or allowed by this subpart, the date of the act or event that commences the designated period of time is not included. The last day so...

  6. Real-time Tsunami Inundation Prediction Using High Performance Computers

    Science.gov (United States)

    Oishi, Y.; Imamura, F.; Sugawara, D.

    2014-12-01

    Recently, off-shore tsunami observation stations based on cabled ocean-bottom pressure gauges are actively being deployed, especially in Japan. These cabled systems are designed to provide real-time tsunami data before tsunamis reach coastlines, for disaster mitigation purposes. To receive real benefits from these observations, real-time analysis techniques that make effective use of these data are necessary. A representative study was made by Tsushima et al. (2009), who proposed a method to provide instant tsunami source prediction based on the tsunami waveform data received so far. As time passes, the prediction is improved by using updated waveform data. After a tsunami source is predicted, tsunami waveforms are synthesized from pre-computed tsunami Green functions of linear long wave equations. Tsushima et al. (2014) updated the method by combining the tsunami waveform inversion with an instant inversion of coseismic crustal deformation and improved the prediction accuracy and speed in the early stages. For disaster mitigation purposes, real-time predictions of tsunami inundation are also important. In this study, we discuss the possibility of real-time tsunami inundation predictions, which require faster-than-real-time tsunami inundation simulation in addition to instant tsunami source analysis. Although the computational cost of solving the non-linear shallow water equations for inundation predictions is large, such computation has become executable through recent developments in high performance computing technologies. We conducted parallel computations of tsunami inundation and achieved 6.0 TFLOPS by using 19,000 CPU cores. We employed a leap-frog finite difference method with nested staggered grids whose resolution ranges from 405 m to 5 m. The resolution ratio of each nested domain was 1/3. The total number of grid points was 13 million, and the time step was 0.1 seconds. Tsunami sources of the 2011 Tohoku-oki earthquake were tested. The inundation prediction up to 2 hours after the
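
    A toy version of the numerical core described above: a staggered-grid leap-frog update of the linear long-wave equations in one dimension. The depth, domain size and initial hump are invented for illustration; dx = 405 m and dt = 0.1 s echo the abstract:

      import numpy as np

      g, h0 = 9.81, 4000.0          # gravity, uniform depth (m) -- illustrative
      dx, dt = 405.0, 0.1           # grid spacing and time step from the abstract
      nx, nt = 2000, 600

      # Initial Gaussian surface hump, velocities on a staggered grid
      eta = np.exp(-((np.arange(nx) - nx // 2) * dx / 2e4) ** 2)
      u = np.zeros(nx + 1)

      for _ in range(nt):           # linear long-wave equations, leap-frog update
          u[1:-1] -= g * dt / dx * (eta[1:] - eta[:-1])
          eta -= h0 * dt / dx * (u[1:] - u[:-1])

      print("max surface elevation:", eta.max())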

  7. 6 CFR 13.27 - Computation of time.

    Science.gov (United States)

    2010-01-01

    ... 6 Domestic Security 1 2010-01-01 2010-01-01 false Computation of time. 13.27 Section 13.27 Domestic Security DEPARTMENT OF HOMELAND SECURITY, OFFICE OF THE SECRETARY PROGRAM FRAUD CIVIL REMEDIES § 13.27 Computation of time. (a) In computing any period of time under this part or in an order issued...

  8. Noise-constrained switching times for heteroclinic computing

    Science.gov (United States)

    Neves, Fabio Schittler; Voit, Maximilian; Timme, Marc

    2017-03-01

    Heteroclinic computing offers a novel paradigm for universal computation by collective system dynamics. In such a paradigm, input signals are encoded as complex periodic orbits approaching specific sequences of saddle states. Without inputs, the relevant states together with the heteroclinic connections between them form a network of states—the heteroclinic network. Systems of pulse-coupled oscillators or spiking neurons naturally exhibit such heteroclinic networks of saddles, thereby providing a substrate for general analog computations. Several challenges need to be resolved before it becomes possible to effectively realize heteroclinic computing in hardware. The time scales on which computations are performed crucially depend on the switching times between saddles, which in turn are jointly controlled by the system's intrinsic dynamics and the level of external and measurement noise. The nonlinear dynamics of pulse-coupled systems often strongly deviate from that of time-continuously coupled (e.g., phase-coupled) systems. The factors impacting switching times in pulse-coupled systems are still not well understood. Here we systematically investigate switching times in dependence of the levels of noise and intrinsic dissipation in the system. We specifically reveal how local responses to pulses coact with external noise. Our findings confirm that, like in time-continuous phase-coupled systems, piecewise-continuous pulse-coupled systems exhibit switching times that transiently increase exponentially with the number of switches up to some order of magnitude set by the noise level. Complementarily, we show that switching times may constitute a good predictor for the computation reliability, indicating how often an input signal must be reiterated. By characterizing switching times between two saddles in conjunction with the reliability of a computation, our results provide a first step beyond the coding of input signal identities toward a complementary coding for

  9. Accessible high performance computing solutions for near real-time image processing for time critical applications

    Science.gov (United States)

    Bielski, Conrad; Lemoine, Guido; Syryczynski, Jacek

    2009-09-01

    High Performance Computing (HPC) hardware solutions such as grid computing and General-Purpose computing on a Graphics Processing Unit (GPGPU) are now accessible to users with general computing needs. Grid computing infrastructures in the form of computing clusters or blades are becoming commonplace, and GPGPU solutions that leverage the processing power of the video card are quickly being integrated into personal workstations. Our interest in these HPC technologies stems from the need to produce near real-time maps from a combination of pre- and post-event satellite imagery in support of post-disaster management. Faster processing provides a twofold gain in this situation: (1) critical information can be provided faster and (2) more elaborate automated processing can be performed prior to providing the critical information. In our particular case, we test the use of the PANTEX index, which is based on analysis of image textural measures extracted using anisotropic, rotation-invariant GLCM statistics. The use of this index, applied in a moving window, has been shown to successfully identify built-up areas in remotely sensed imagery. Built-up index image masks are important input to the structuring of damage assessment interpretation because they help optimise the workload. The performance of computing the PANTEX workflow is compared on two different HPC hardware architectures: (1) a blade server with 4 blades, each having dual quad-core CPUs, and (2) a CUDA-enabled GPU workstation. The reference platform is a dual-CPU quad-core workstation, and the PANTEX workflow total computing time is measured. Furthermore, as part of a qualitative evaluation, the differences in setting up and configuring the various hardware solutions and the related software coding effort are presented.
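
    A small sketch of the GLCM texture measures that PANTEX builds on, using scikit-image (graycomatrix/graycoprops; spelled greycomatrix/greycoprops in older releases). The window size, grey-level count and random stand-in image are illustrative:

      import numpy as np
      from skimage.feature import graycomatrix, graycoprops

      rng = np.random.default_rng(1)
      image = rng.integers(0, 64, size=(256, 256), dtype=np.uint8)  # stand-in tile

      win = 51                       # moving-window size -- illustrative
      i, j = 100, 100                # window centre
      patch = image[i - win // 2:i + win // 2 + 1, j - win // 2:j + win // 2 + 1]

      # Rotation robustness via several angles at one-pixel distance
      glcm = graycomatrix(patch, distances=[1],
                          angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                          levels=64, symmetric=True, normed=True)
      contrast = graycoprops(glcm, "contrast").mean()   # average over angles
      print("windowed GLCM contrast:", contrast)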

  10. Effects of computing time delay on real-time control systems

    Science.gov (United States)

    Shin, Kang G.; Cui, Xianzhong

    1988-01-01

    The reliability of a real-time digital control system depends not only on the reliability of the hardware and software used, but also on the speed in executing control algorithms. The latter is due to the negative effects of computing time delay on control system performance. For a given sampling interval, the effects of computing time delay are classified into the delay problem and the loss problem. Analysis of these two problems is presented as a means of evaluating real-time control systems. As an example, both the self-tuning predicted (STP) control and Proportional-Integral-Derivative (PID) control are applied to the problem of tracking robot trajectories, and their respective effects of computing time delay on control performance are comparatively evaluated. For this example, the STP (PID) controller is shown to outperform the PID (STP) controller in coping with the delay (loss) problem.
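
    A toy numerical illustration of the two problems (the loop, gain and horizon are invented): in the "delay problem" the computed control is applied some samples late; in the "loss problem" some updates are skipped altogether:

      def track(delay_samples=0, drop_every=0, n=50, kp=0.5):
          """Toy discrete loop x_{k+1} = x_k + u, tracking a target of 1.0,
          with the control u either applied late (delay problem) or
          occasionally dropped (loss problem); returns the final error."""
          x, buf = 0.0, [0.0] * delay_samples
          for k in range(n):
              u = kp * (1.0 - x)               # control computed from current state
              buf.append(0.0 if drop_every and k % drop_every == 0 else u)
              x += buf.pop(0)                  # actuation arrives delay_samples late
          return abs(1.0 - x)

      print("no delay:", track())
      print("2-sample delay:", track(delay_samples=2))
      print("every 3rd update lost:", track(drop_every=3))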

  11. Trends in television and computer/videogame use and total screen time in high school students from Caruaru city, Pernambuco, Brazil: A repeated panel study between 2007 and 2012

    Directory of Open Access Journals (Sweden)

    Luis José Lagos Aros

    2018-01-01

    Full Text Available Abstract Aim: to analyze the pattern and trends of use of screen-based devices and associated factors from two surveys conducted on public high school students in Caruaru-PE. Methods: two representative school-based cross-sectional surveys conducted in 2007 (n=600) and 2012 (n=715) on high school students (15-20 years old). The time of exposure to television (TV) and computer/videogames (PC/VG) was obtained through a validated questionnaire, and ≥3 hours/day was considered excessive exposure. The independent variables were socioeconomic status, school-related factors, and physical activity. Crude and adjusted binary logistic regressions were employed to examine the factors associated with screen time. Statistical significance was set at p<0.05. Results: There was a significant reduction in TV time on weekdays and in total weekly TV time, but no change in the prevalence of excessive exposure. The proportion of exposure to PC/VG of ≥3 hours/day increased 182.5% on weekdays and 69.5% on weekends (p<0.05). In 2007, being physically active was the only protective factor against excessive exposure to total screen time. In 2012, girls presented lower odds of excessive exposure to all screen-based devices and total screen time. Other protective factors were studying at night and being physically active (PC/VG time), while residing in an urban area [OR 5.03 (2.77-7.41)] and having a higher family income [OR 1.55 (1.04-2.30)] were risk factors. Conclusion: Significant and important changes in the time trends and pattern of use of PC/VG were observed during the interval of 5 years. This rapid increase could be associated with increased family income and improved access to these devices, driven by technological developments.

  12. CONTEMPORARY VIEW ON COMPUTER NAVIGATION USE IN PRIMARY TOTAL KNEE REPLACEMENT (REVIEW)

    Directory of Open Access Journals (Sweden)

    A. I. Petukhov

    2010-01-01

    Full Text Available Topical questions concerning optical computer navigation in total knee arthroplasty are widely covered. The indications, contraindications, practical features and possible complications of this technique are listed. The analysis of the literature makes it clear that computer navigation improves the accuracy of endoprosthesis implantation, which may decrease the rate of revision surgery in the future.

  13. Television Viewing, Computer Use, Time Driving and All‐Cause Mortality: The SUN Cohort

    Science.gov (United States)

    Basterra‐Gortari, Francisco Javier; Bes‐Rastrollo, Maira; Gea, Alfredo; Núñez‐Córdoba, Jorge María; Toledo, Estefanía; Martínez‐González, Miguel Ángel

    2014-01-01

    Background Sedentary behaviors have been directly associated with all-cause mortality. However, little is known about different types of sedentary behaviors in relation to overall mortality. Our objective was to assess the association between different sedentary behaviors and all-cause mortality. Methods and Results In this prospective, dynamic cohort study (the SUN Project) 13 284 Spanish university graduates with a mean age of 37 years were followed up for a median of 8.2 years. Television, computer, and driving time were assessed at baseline. Poisson regression models were fitted to examine the association between each sedentary behavior and total mortality. All-cause mortality incidence rate ratios (IRRs) per 2 hours per day were 1.40 (95% confidence interval (CI): 1.06 to 1.84) for television viewing, 0.96 (95% CI: 0.79 to 1.18) for computer use, and 1.14 (95% CI: 0.90 to 1.44) for driving, after adjustment for age, sex, smoking status, total energy intake, Mediterranean diet adherence, body mass index, and physical activity. The risk of mortality was twofold higher for participants reporting ≥3 h/day of television viewing than for those reporting <1 h/day. Conclusions Television viewing was directly associated with all-cause mortality. However, computer use and time spent driving were not significantly associated with higher mortality. Further cohort studies and trials designed to assess whether reductions in television viewing are able to reduce mortality are warranted. The lack of association between computer use or time spent driving and mortality needs further confirmation. PMID:24965030

  14. Cluster Computing for Embedded/Real-Time Systems

    Science.gov (United States)

    Katz, D.; Kepner, J.

    1999-01-01

    Embedded and real-time systems, like other computing systems, seek to maximize computing power for a given price, and thus can significantly benefit from the advancing capabilities of cluster computing.

  15. Fast algorithms for computing phylogenetic divergence time.

    Science.gov (United States)

    Crosby, Ralph W; Williams, Tiffani L

    2017-12-06

    The inference of species divergence time is a key step in most phylogenetic studies. Methods have been available for the last ten years to perform the inference, but the performance of the methods does not yet scale well to studies with hundreds of taxa and thousands of DNA base pairs. For example a study of 349 primate taxa was estimated to require over 9 months of processing time. In this work, we present a new algorithm, AncestralAge, that significantly improves the performance of the divergence time process. As part of AncestralAge, we demonstrate a new method for the computation of phylogenetic likelihood and our experiments show a 90% improvement in likelihood computation time on the aforementioned dataset of 349 primates taxa with over 60,000 DNA base pairs. Additionally, we show that our new method for the computation of the Bayesian prior on node ages reduces the running time for this computation on the 349 taxa dataset by 99%. Through the use of these new algorithms we open up the ability to perform divergence time inference on large phylogenetic studies.

  16. Computer network time synchronization the network time protocol

    CERN Document Server

    Mills, David L

    2006-01-01

    What started with the sundial has, thus far, been refined to a level of precision based on atomic resonance: Time. Our obsession with time is evident in this continued scaling down to nanosecond resolution and beyond. But this obsession is not without warrant. Precision and time synchronization are critical in many applications, such as air traffic control and stock trading, and pose complex and important challenges in modern information networks.Penned by David L. Mills, the original developer of the Network Time Protocol (NTP), Computer Network Time Synchronization: The Network Time Protocol

  17. 5 CFR 890.101 - Definitions; time computations.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Definitions; time computations. 890.101....101 Definitions; time computations. (a) In this part, the terms annuitant, carrier, employee, employee... in section 8901 of title 5, United States Code, and supplement the following definitions: Appropriate...

  18. Development of real-time visualization system for Computational Fluid Dynamics on parallel computers

    International Nuclear Information System (INIS)

    Muramatsu, Kazuhiro; Otani, Takayuki; Matsumoto, Hideki; Takei, Toshifumi; Doi, Shun

    1998-03-01

    A real-time visualization system for computational fluid dynamics was developed for a network connecting a parallel computing server and a client terminal. Using the system, a user at a client terminal can visualize the results of a CFD (Computational Fluid Dynamics) simulation running on the parallel computer while the computation is still in progress on the server. Through a GUI (Graphical User Interface) on the client terminal, the user is also able to change parameters of the analysis and visualization during the calculation in real time. The system carries out both the CFD simulation and the generation of pixel image data on the parallel computer, and compresses the data. The amount of data sent from the parallel computer to the client is therefore so much smaller than without compression that images appear swiftly on the client. Parallelization of image data generation is based on the Owner Computation Rule. The GUI on the client is built as a Java applet, so real-time visualization is possible on any client PC on which a Web browser is implemented. (author)

  19. 29 CFR 1921.22 - Computation of time.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 7 2010-07-01 2010-07-01 false Computation of time. 1921.22 Section 1921.22 Labor Regulations Relating to Labor (Continued) OCCUPATIONAL SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR... WORKERS' COMPENSATION ACT Miscellaneous § 1921.22 Computation of time. Sundays and holidays shall be...

  20. Manual cross check of computed dose times for motorised wedged fields

    International Nuclear Information System (INIS)

    Porte, J.

    2001-01-01

    If a mass of tissue equivalent material is exposed in turn to wedged and open radiation fields of the same size, for equal times, it is incorrect to assume that the resultant isodose pattern will be effectively that of a wedge having half the angle of the wedged field. Computer programs have been written to address the problem of creating an intermediate wedge field, commonly known as a motorized wedge. The total exposure time is apportioned between the open and wedged fields, to produce a beam modification equivalent to that of a wedged field of a given wedge angle. (author)
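
    A minimal sketch of the apportioning step, assuming the common first-order approximation tan(theta_eff) = f * tan(theta_wedge), with f the fraction of monitor units delivered through the wedge; beam-specific commissioning data would replace this approximation clinically:

      import math

      def wedge_time_split(total_mu, theta_eff_deg, theta_wedge_deg=60.0):
          """Apportion monitor units between the wedged and open fields of a
          motorised wedge (approximation only: tan(eff) = f * tan(wedge))."""
          f = (math.tan(math.radians(theta_eff_deg))
               / math.tan(math.radians(theta_wedge_deg)))
          return f * total_mu, (1.0 - f) * total_mu  # (wedged MU, open MU)

      print(wedge_time_split(200, 30))   # e.g. a 30-degree effective wedge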

  1. General purpose computers in real time

    International Nuclear Information System (INIS)

    Biel, J.R.

    1989-01-01

    I see three main trends in the use of general purpose computers in real time. The first is more processing power. The second is the use of higher speed interconnects between computers (allowing more data to be delivered to the processors). The third is the use of larger programs running in the computers. Although there is still work that needs to be done, I believe that all indications are that the online needs for general purpose computers can be met for the SSC and LHC machines. 2 figs

  2. Total variation regularization for a backward time-fractional diffusion problem

    International Nuclear Information System (INIS)

    Wang, Liyan; Liu, Jijun

    2013-01-01

    Consider a two-dimensional backward problem for a time-fractional diffusion process, which can be considered as image de-blurring where the blurring process is assumed to be slow diffusion. In order to avoid the over-smoothing effect for object images with edges, and to construct a fast reconstruction scheme, the total variation regularizing term and the data residual error in the frequency domain are coupled to construct the cost functional. The well-posedness of this optimization problem is studied. The minimizer is sought approximately using an iteration process for a series of optimization problems with the Bregman distance as a penalty term. This iterative reconstruction scheme is essentially a new regularizing scheme, with the coupling parameter in the cost functional and the iteration stopping time as two regularizing parameters. We give the choice strategy for the regularizing parameters in terms of the noise level of the measurement data, which yields the optimal error estimate on the iterative solution. The series of optimization problems is solved by alternating iteration with an explicit exact solution, and therefore the amount of computation is greatly reduced. Numerical implementations are given to support our theoretical analysis on the convergence rate and to show the significant reconstruction improvements. (paper)
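
    Written out in illustrative notation (the symbols are not the authors'), the cost functional and Bregman-distance iteration described above take the form

        J(u) = \| \mathcal{F}(Ku) - \mathcal{F}(g) \|_2^2 + \alpha\,\mathrm{TV}(u),
        \qquad u^{k+1} = \arg\min_u \; \| \mathcal{F}(Ku) - \mathcal{F}(g) \|_2^2 + \alpha\, D_{\mathrm{TV}}^{p^k}(u, u^k),

    where K is the forward (slow-diffusion blurring) operator, \mathcal{F} the Fourier transform giving the frequency-domain residual, D_{TV}^{p} a Bregman distance of the TV term, and the coupling parameter \alpha together with the stopping index k are the two regularizing parameters discussed above.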

  3. Total knee arthroplasty with computer-assisted navigation: an analysis of 200 cases,

    Directory of Open Access Journals (Sweden)

    Marcus Vinicius Malheiros Luzo

    2014-04-01

    Full Text Available OBJECTIVE: to evaluate the results from surgery with computer-assisted navigation in cases of total knee arthroplasty. METHOD: a total of 196 patients who underwent total knee arthroplasty with computer-assisted navigation were evaluated. The extension and flexion spaces (gaps) were evaluated during the operation, and the alignment was assessed after the operation. The Knee Society Score (KSS) questionnaire for assessing patient function was applied preoperatively and postoperatively after a mean follow-up of 22 months. RESULTS: in all, 86.7% of the patients presented good alignment of the mechanical axis (less than 3° of varus or valgus in relation to the mechanical axis) and 96.4% of the patients presented balanced flexion and extension gaps. Before the operation, 97% of the patients presented poor or insufficient KSS, but after the operation, 77.6% presented good or excellent KSS. CONCLUSION: the navigation system made it possible to achieve aligned and balanced implants, with notable functional improvement among the patients. It was found to be useful for assessing, understanding and improving knowledge in relation to performing arthroplasty procedures.

  4. Hybrid Cloud Computing Architecture Optimization by Total Cost of Ownership Criterion

    Directory of Open Access Journals (Sweden)

    Elena Valeryevna Makarenko

    2014-12-01

    Full Text Available Achieving the goals of information security is a key factor in the decision to outsource information technology and, in particular, in deciding on the migration of organizational data, applications, and other resources to an infrastructure based on cloud computing. The key issue in selecting an optimal architecture, and in the subsequent migration of business applications and data to the cloud information environment of an organization, is the total cost of ownership of the IT infrastructure. This paper focuses on solving the problem of minimizing the total cost of ownership of a hybrid cloud.

  5. Time-Predictable Computer Architecture

    Directory of Open Access Journals (Sweden)

    Schoeberl Martin

    2009-01-01

    Full Text Available Today's general-purpose processors are optimized for maximum throughput. Real-time systems need a processor with both a reasonable and a known worst-case execution time (WCET. Features such as pipelines with instruction dependencies, caches, branch prediction, and out-of-order execution complicate WCET analysis and lead to very conservative estimates. In this paper, we evaluate the issues of current architectures with respect to WCET analysis. Then, we propose solutions for a time-predictable computer architecture. The proposed architecture is evaluated with implementation of some features in a Java processor. The resulting processor is a good target for WCET analysis and still performs well in the average case.

  6. Recent achievements in real-time computational seismology in Taiwan

    Science.gov (United States)

    Lee, S.; Liang, W.; Huang, B.

    2012-12-01

    Real-time computational seismology is now achievable; it requires a tight connection between seismic databases and high-performance computing. We have developed a real-time moment tensor monitoring system (RMT) using continuous BATS records and a moment tensor inversion (CMT) technique. A real-time online earthquake simulation service is also ready to open to researchers and for public earthquake science education (ROS). Combining RMT with ROS, an earthquake report based on computational seismology can be provided within 5 minutes after an earthquake occurs (RMT obtains the point-source information and ROS completes a 3D simulation in real time). For more information, visit the real-time computational seismology earthquake report webpage (RCS).

  7. 7 CFR 1.603 - How are time periods computed?

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 1 2010-01-01 2010-01-01 false How are time periods computed? 1.603 Section 1.603... Licenses General Provisions § 1.603 How are time periods computed? (a) General. Time periods are computed as follows: (1) The day of the act or event from which the period begins to run is not included. (2...

  8. Instruction timing for the CDC 7600 computer

    International Nuclear Information System (INIS)

    Lipps, H.

    1975-01-01

    This report provides timing information for all instructions of the Control Data 7600 computer, except for instructions of type 01X, to enable the optimization of 7600 programs. The timing rules serve as background information for timing charts which are produced by a program (TIME76) of the CERN Program Library. The rules that co-ordinate the different sections of the CPU are stated in as much detail as is necessary to time the flow of instructions for a given sequence of code. Instruction fetch, instruction issue, and access to small core memory are treated at length, since details are not available from the computer manuals. Annotated timing charts are given for 24 examples, chosen to display the full range of timing considerations. (Author)

  9. 50 CFR 221.3 - How are time periods computed?

    Science.gov (United States)

    2010-10-01

    ... 50 Wildlife and Fisheries 7 2010-10-01 2010-10-01 false How are time periods computed? 221.3... Provisions § 221.3 How are time periods computed? (a) General. Time periods are computed as follows: (1) The day of the act or event from which the period begins to run is not included. (2) The last day of the...

  10. A general method for computing the total solar radiation force on complex spacecraft structures

    Science.gov (United States)

    Chan, F. K.

    1981-01-01

    The method circumvents many of the existing difficulties in computational logic presently encountered in the direct analytical or numerical evaluation of the appropriate surface integral. It may be applied to complex spacecraft structures for computing the total force arising from either specular or diffuse reflection or even from non-Lambertian reflection and re-radiation.

  11. Computer-assisted total knee arthroplasty (Artroplastia total do joelho assistida por computador)

    Directory of Open Access Journals (Sweden)

    Roberto Freire da Mota e Albuquerque

    2006-01-01

    Full Text Available One of the most significant technological advances in current medicine is computer-assisted surgery, and in orthopaedics one of the most important applications of this technology is knee arthroplasty. The main contribution of Computer Aided Orthopaedic Surgery (CAOS) to knee arthroplasty is its potential to improve the precision of prosthesis implantation and of the alignment of the operated limb, contributing to the optimization and longevity of results. Image-free navigation, based on anatomical references acquired during surgery through infrared transmitters, has been the predominant technique in knee arthroplasty. We used the "OrthoPilot" version 2.2 total knee arthroplasty navigation system for the placement of 72 "Search Evolution" knee prostheses (Aesculap AG & Co. KG), with or without posterior stabilization, in a continuous series. The objective was to assess the accuracy of the alignment obtained with navigation, using panoramic radiographs taken postoperatively. We obtained a mean deviation from the neutral mechanical axis of 0.66°, with a standard deviation of 0.7°; 98.6% of the knees were within a 3° margin of error and 79.2% had an error of less than 1°. We conclude that the system is safe and precise, adding no morbidity to conventional surgery.

  12. Computer Package for Graphite Total Cross-Section Calculations

    International Nuclear Information System (INIS)

    Adib, M.; Fathalla, M.

    2008-01-01

An additive formula is given which allows calculation of the contributions of the total neutron cross-sections to the neutron transmission through crystalline graphite. The formula takes into account the graphite form (poly- or pyrolytic crystals) and its parameters. The computer package Graphite has been designed in order to provide the required calculations in the neutron energy range from 0.1 meV to 10 eV. The package includes three codes: PCG (Polycrystalline Graphite), PG (Pyrolytic Graphite) and HOPG (Highly Oriented Pyrolytic Graphite), for calculating, respectively, neutron transmission through fine graphite powder (polycrystalline), the neutron transmission and removal coefficient of a PG crystal in terms of its mosaic spread for neutrons incident along its c-axis, and the transmission of neutrons incident on an HOPG crystal at different angles. For comparison of experimental neutron transmission data with the calculated values, the program takes into consideration the effect of both wavelength spread and neutron beam divergence, in either a constant wavelength spread mode (δλ = constant) or a constant wavelength resolution mode (δλ/λ = constant). In order to check the validity of the computer package Graphite for application in cross-section calculations, a comparison between calculated values and the available experimental data was carried out. An overall agreement is indicated, with an accuracy sufficient to determine the neutron transmission characteristics.
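
    The transmission the package compares against experiment reduces to the standard attenuation law T(E) = exp(-N σ_tot(E) t). A minimal sketch, with a common handbook density for polycrystalline graphite and a made-up sample cross-section value:

```python
import numpy as np

# Transmission of a neutron beam through a slab: T(E) = exp(-N * sigma_tot(E) * t),
# where N is the atomic number density and sigma_tot the total cross section.
# The density and the example sigma_tot below are illustrative assumptions.
N_A = 6.022e23
rho, A = 2.25, 12.011                  # g/cm^3, g/mol (polycrystalline graphite)
n_density = rho * N_A / A              # atoms/cm^3

def transmission(sigma_tot_barn: float, thickness_cm: float) -> float:
    sigma_cm2 = sigma_tot_barn * 1e-24  # 1 barn = 1e-24 cm^2
    return float(np.exp(-n_density * sigma_cm2 * thickness_cm))

print(transmission(5.0, 2.0))  # e.g. sigma_tot = 5 b, 2 cm thick slab
```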

  13. Project Energise: Using participatory approaches and real time computer prompts to reduce occupational sitting and increase work time physical activity in office workers.

    Science.gov (United States)

    Gilson, Nicholas D; Ng, Norman; Pavey, Toby G; Ryde, Gemma C; Straker, Leon; Brown, Wendy J

    2016-11-01

This efficacy study assessed the added impact that real time computer prompts had on a participatory approach to reduce occupational sedentary exposure and increase physical activity. Quasi-experimental. 57 Australian office workers (mean [SD]: age=47 [11] years; BMI=28 [5] kg/m²; 46 men) generated a menu of 20 occupational 'sit less and move more' strategies through participatory workshops, and were then tasked with implementing strategies for five months (July-November 2014). During implementation, a sub-sample of workers (n=24) used a chair sensor/software package (Sitting Pad) that gave real time prompts to interrupt desk sitting. Baseline and intervention sedentary behaviour and physical activity (GENEActiv accelerometer; mean work time percentages), and minutes spent sitting at desks (Sitting Pad; mean total time and longest bout) were compared between non-prompt and prompt workers using a two-way ANOVA. Workers spent close to three quarters of their work time sedentary, mostly sitting at desks (mean [SD]: total desk sitting time=371 [71] min/day; longest bout spent desk sitting=104 [43] min/day). Intervention effects were four times greater in workers who used real time computer prompts (8% decrease in work time sedentary behaviour and increase in light intensity physical activity). Real time computer prompts facilitated the impact of a participatory approach on reductions in occupational sedentary exposure, and increases in physical activity. Copyright © 2016 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  14. Real-time computational photon-counting LiDAR

    Science.gov (United States)

    Edgar, Matthew; Johnson, Steven; Phillips, David; Padgett, Miles

    2018-03-01

    The availability of compact, low-cost, and high-speed MEMS-based spatial light modulators has generated widespread interest in alternative sampling strategies for imaging systems utilizing single-pixel detectors. The development of compressed sensing schemes for real-time computational imaging may have promising commercial applications for high-performance detectors, where the availability of focal plane arrays is expensive or otherwise limited. We discuss the research and development of a prototype light detection and ranging (LiDAR) system via direct time of flight, which utilizes a single high-sensitivity photon-counting detector and fast-timing electronics to recover millimeter accuracy three-dimensional images in real time. The development of low-cost real time computational LiDAR systems could have importance for applications in security, defense, and autonomous vehicles.
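
    The record gives no implementation details; the core direct time-of-flight relation it relies on, r = c·t/2 applied to the peak of a photon-arrival histogram, can be sketched on synthetic data:

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def range_from_histogram(bin_edges_s, counts):
    """Direct time-of-flight ranging: take the modal photon arrival time
    and convert the round trip to a one-way distance, r = c * t / 2."""
    centers = 0.5 * (bin_edges_s[:-1] + bin_edges_s[1:])
    t_peak = centers[np.argmax(counts)]
    return C * t_peak / 2.0

# Toy histogram: 100 ps bins, a return peaking around 66.7 ns (~10 m target).
edges = np.arange(0, 100e-9, 100e-12)
centers = 0.5 * (edges[:-1] + edges[1:])
counts = np.exp(-0.5 * ((centers - 66.7e-9) / 0.3e-9) ** 2)
print(f"estimated range: {range_from_histogram(edges, counts):.3f} m")
```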

  15. The position of a standard optical computer mouse affects cardiorespiratory responses during the operation of a computer under time constraints.

    Science.gov (United States)

    Sako, Shunji; Sugiura, Hiromichi; Tanoue, Hironori; Kojima, Makoto; Kono, Mitsunobu; Inaba, Ryoichi

    2014-08-01

This study investigated the association between task-induced stress and fatigue by examining the cardiovascular responses of subjects using different mouse positions while operating a computer under time constraints. Sixteen young, healthy men participated in the study, which examined the use of optical mouse devices attached to laptop computers. Two mouse positions were investigated: (1) the distal position (DP), in which the subjects place their forearms on the desk accompanied by the abduction and flexion of their shoulder joints, and (2) the proximal position (PP), in which the subjects place only their wrists on the desk without using an armrest. The subjects continued each task for 16 min. We assessed differences in several characteristics according to mouse position, including expired gas values, autonomic nerve activities (based on cardiorespiratory responses), operating efficiencies (based on word counts), and fatigue levels (based on the visual analog scale, VAS). Oxygen consumption (VO2), the ratio of inspiration time to total respiration time (Ti/Ttotal), respiratory rate (RR), minute ventilation (VE), and the ratio of expiration time to inspiration time (Te/Ti) were significantly lower when the participants were performing the task in the DP than those obtained in the PP. Tidal volume (VT), carbon dioxide output rates (VCO2/VE), and oxygen extraction fractions (VO2/VE) were significantly higher for the DP than they were for the PP. No significant difference in VAS was observed between the positions; however, as the task progressed, autonomic nerve activities were lower and operating efficiencies were significantly higher for the DP than they were for the PP. Our results suggest that the DP has fewer effects on cardiorespiratory functions, causes lower levels of sympathetic nerve activity and mental stress, and produces a higher total workload than the PP. This suggests that the DP is preferable to the PP when operating a computer.

  16. Radiologic total lung capacity measurement. Development and evaluation of a computer-based system

    Energy Technology Data Exchange (ETDEWEB)

    Seeley, G.W.; Mazzeo, J.; Borgstrom, M.; Hunter, T.B.; Newell, J.D.; Bjelland, J.C.

    1986-11-01

    The development of a computer-based radiologic total lung capacity (TLC) measurement system designed to be used by non-physician personnel is detailed. Four operators tested the reliability and validity of the system by measuring inspiratory PA and lateral pediatric chest radiographs with a Graf spark pen interfaced to a DEC VAX 11/780 computer. First results suggest that the ultimate goal of developing an accurate and easy to use TLC measurement system for non-physician personnel is attainable.

  17. Computations of concentration of radon and its decay products against time. Computer program; Obliczanie koncentracji radonu i jego produktow rozpadu w funkcji czasu. Program komputerowy

    Energy Technology Data Exchange (ETDEWEB)

    Machaj, B. [Institute of Nuclear Chemistry and Technology, Warsaw (Poland)

    1996-12-31

This research aims to develop a device for continuous monitoring of radon in air by measuring the alpha activity of radon and its short-lived decay products. The variation with time of the alpha activity of radon and its daughters influences the measured results, so this variation must be known. A computer program in the Turbo Pascal language was therefore developed to perform the computations using the known decay relations; the program is adapted for IBM PC computers. The program enables computation of the activity of ²²²Rn and its daughter products ²¹⁸Po, ²¹⁴Pb, ²¹⁴Bi and ²¹⁴Po every 1 min within the period of 0-255 min, for any state of radioactive equilibrium between radon and its daughter products. The program also computes the alpha activity of ²²²Rn + ²¹⁸Po + ²¹⁴Po against time, and the total alpha activity over a selected interval of time. The results of the computations are stored on the computer hard disk in ASCII format and can be used by a graphics program, e.g. DrawPerfect, to make diagrams. Equations employed for the computation of the alpha activity of radon and its decay products, as well as a description of the program functions, are given. (author). 2 refs, 4 figs.
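
    The chain activities such a program tabulates follow from the Bateman equations. A minimal sketch using standard half-life data (²¹⁴Po is omitted, since its 164 µs half-life keeps it in equilibrium with ²¹⁴Bi); the original Turbo Pascal source is not reproduced here:

```python
import numpy as np

# Half-lives for the 222Rn chain (standard nuclide data, in seconds).
HALF_LIFE = {"222Rn": 3.8235 * 86400, "218Po": 3.098 * 60,
             "214Pb": 26.8 * 60, "214Bi": 19.9 * 60}
LAM = np.array([np.log(2) / t for t in HALF_LIFE.values()])

def chain_activities(n1_0, lam, t):
    """Bateman solution for a linear decay chain starting with n1_0 atoms
    of the first member (others initially zero); returns the activities
    A_i(t) = lambda_i * N_i(t). 214Po (T1/2 = 164 us) is omitted because
    it stays in equilibrium with 214Bi on these time scales."""
    t = np.atleast_1d(np.asarray(t, dtype=float))
    acts = np.empty((len(lam), t.size))
    for i in range(len(lam)):
        li = lam[:i + 1]
        prefac = n1_0 * np.prod(li[:i])          # product of the first i lambdas
        s = np.zeros_like(t)
        for j in range(i + 1):
            denom = np.prod(np.delete(li, j) - li[j]) if i > 0 else 1.0
            s += np.exp(-li[j] * t) / denom
        acts[i] = li[i] * prefac * s
    return acts

minutes = np.arange(0, 256)                      # every 1 min within 0-255 min
A = chain_activities(1e6, LAM, minutes * 60.0)   # start from 1e6 radon atoms
print(A[:, :3])                                  # activities at t = 0, 1, 2 min
```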

  18. Current Role of Computer Navigation in Total Knee Arthroplasty.

    Science.gov (United States)

    Jones, Christopher W; Jerabek, Seth A

    2018-01-31

Computer-assisted surgical (CAS) navigation has been developed with the aim of improving the accuracy and precision of total knee arthroplasty (TKA) component positioning and therefore overall limb alignment. The historical goal of knee arthroplasty has been to restore the mechanical alignment of the lower limb by aligning the femoral and tibial components perpendicular to the mechanical axis of the femur and tibia. Despite over 4 decades of TKA component development and nearly 2 decades of interest in CAS, the fundamental question remains: does the alignment goal and/or the method of achieving that goal affect the outcome of the TKA in terms of patient-reported outcome measures and/or overall survivorship? The quest for reliable and reproducible achievement of the intraoperative alignment goal has been the primary motivator for the introduction, development, and refinement of CAS navigation. Numerous proprietary systems now exist, and rapid technological advancements in computer processing power are stimulating further development of robotic surgical systems. Three categories of CAS can be defined: image-based large-console navigation; imageless large-console navigation; and, more recently, accelerometer-based handheld navigation systems. A review of the current literature demonstrates that there are enough well-designed studies to conclude that both large-console CAS and handheld navigation systems improve the accuracy and precision of component alignment in TKA. However, missing from the evidence base, other than the subgroup analysis provided by the Australian Orthopaedic Association National Joint Replacement Registry, is any conclusive demonstration of clinical superiority in terms of improved patient-reported outcome measures and/or decreased cumulative revision rates in the long term. Few authors would argue that accuracy of alignment is a goal to ignore; therefore, in the absence of clinical evidence, many of the arguments against

  19. Minimizing total weighted tardiness for the single machine scheduling problem with dependent setup time and precedence constraints

    Directory of Open Access Journals (Sweden)

    Hamidreza Haddad

    2012-04-01

Full Text Available This paper tackles the single machine scheduling problem with dependent setup times and precedence constraints. The primary objective of this paper is the minimization of total weighted tardiness. Since the resulting problem is NP-hard, we use a metaheuristic method to solve the model. The proposed approach uses a genetic algorithm to solve the problem in a reasonable amount of time. Because of the high sensitivity of the GA to the initial values of its parameters, a Taguchi approach is presented to calibrate them. Computational experiments validate the effectiveness and capability of the proposed method.
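
    The genetic algorithm itself is not reproduced in the abstract, but the objective it minimizes is easy to state. A sketch evaluating total weighted tardiness for one job sequence, reading "dependent setup time" as sequence-dependent setups; the data are invented and the paper's precedence constraints are omitted:

```python
def total_weighted_tardiness(seq, p, w, d, setup):
    """Total weighted tardiness of a single machine processing jobs in `seq`:
    sum over jobs of w_j * max(0, C_j - d_j), where completion times C_j
    accumulate processing times p_j plus sequence-dependent setups
    setup[prev][next]."""
    t, obj, prev = 0.0, 0.0, None
    for j in seq:
        if prev is not None:
            t += setup[prev][j]      # setup depends on the preceding job
        t += p[j]
        obj += w[j] * max(0.0, t - d[j])
        prev = j
    return obj

# Three jobs: processing times, weights, due dates, and a setup matrix.
p = [4, 3, 6]; w = [2, 1, 3]; d = [5, 9, 12]
setup = [[0, 1, 2], [1, 0, 1], [2, 1, 0]]
print(total_weighted_tardiness([0, 2, 1], p, w, d, setup))  # -> 7.0
```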

  20. Time-of-Flight Cameras in Computer Graphics

    DEFF Research Database (Denmark)

    Kolb, Andreas; Barth, Erhardt; Koch, Reinhard

    2010-01-01

    Computer Graphics, Computer Vision and Human Machine Interaction (HMI). These technologies are starting to have an impact on research and commercial applications. The upcoming generation of ToF sensors, however, will be even more powerful and will have the potential to become “ubiquitous real-time geometry...

  1. 29 CFR 4245.8 - Computation of time.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 9 2010-07-01 2010-07-01 false Computation of time. 4245.8 Section 4245.8 Labor Regulations Relating to Labor (Continued) PENSION BENEFIT GUARANTY CORPORATION INSOLVENCY, REORGANIZATION, TERMINATION, AND OTHER RULES APPLICABLE TO MULTIEMPLOYER PLANS NOTICE OF INSOLVENCY § 4245.8 Computation of...

  2. Total quality through computer integrated manufacturing in the pharmaceutical industry.

    Science.gov (United States)

    Ufret, C M

    1995-01-01

The role of Computer Integrated Manufacturing (CIM) in the pursuit of total quality in pharmaceutical manufacturing is assessed. CIM key objectives, design criteria, and performance measurements, in addition to its scope and implementation in a hierarchical structure, are explored in detail. Key elements for the success of each phase of a CIM project and a brief status of current CIM implementations in the pharmaceutical industry are presented. The role of World Class Manufacturing performance standards and other key issues in achieving full CIM benefits are also addressed.

  3. Computation Offloading for Frame-Based Real-Time Tasks under Given Server Response Time Guarantees

    Directory of Open Access Journals (Sweden)

    Anas S. M. Toma

    2014-11-01

Full Text Available Computation offloading has been adopted to improve the performance of embedded systems by offloading the computation of some tasks, especially computation-intensive tasks, to servers or clouds. This paper explores computation offloading for real-time tasks in embedded systems, given server response time guarantees, to decide which tasks should be offloaded so that the results arrive in time. We consider frame-based real-time tasks with the same period and relative deadline. When the execution order of the tasks is given, the problem can be solved in linear time. However, when the execution order is not specified, we prove that the problem is NP-complete. We develop a pseudo-polynomial-time algorithm for deriving feasible schedules, if they exist. An approximation scheme is also developed to trade off the error of the algorithm against its complexity. Our algorithms are extended to minimize the period/relative deadline of the tasks for performance maximization. The algorithms are evaluated with a case study for a surveillance system and synthesized benchmarks.
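
    The paper's linear-time and pseudo-polynomial algorithms are not reproduced in the abstract. As a toy illustration of the decision problem only, here is a brute-force subset search under simplified assumptions: tasks run sequentially in index order, and an offloaded task occupies the processor only for its send time, with its result arriving a guaranteed response time after the send completes:

```python
from itertools import combinations

def feasible_offloading(local, send, response, deadline):
    """Brute-force (exponential, toy) search for a set of tasks to offload
    so that every task of the frame finishes by `deadline`. Returns a
    feasible offload set, or None. This is an illustration of the decision
    problem, not the paper's algorithms."""
    n = len(local)
    for k in range(n + 1):
        for subset in combinations(range(n), k):
            off = set(subset)
            t, ok = 0.0, True
            for i in range(n):
                if i in off:
                    t += send[i]                       # CPU pays only the send time
                    ok &= (t + response[i]) <= deadline  # result must return in time
                else:
                    t += local[i]                      # CPU executes the task itself
                    ok &= t <= deadline
            if ok:
                return off
    return None

print(feasible_offloading(local=[5, 8, 3], send=[1, 2, 1],
                          response=[4, 6, 4], deadline=12))  # -> {0}
```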

  4. Reducing the throughput time of the diagnostic track involving CT scanning with computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Lent, Wineke A.M. van, E-mail: w.v.lent@nki.nl [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital (NKI-AVL), P.O. Box 90203, 1006 BE Amsterdam (Netherlands); University of Twente, IGS Institute for Innovation and Governance Studies, Department of Health Technology Services Research (HTSR), Enschede (Netherlands); Deetman, Joost W., E-mail: j.deetman@nki.nl [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital (NKI-AVL), P.O. Box 90203, 1006 BE Amsterdam (Netherlands); Teertstra, H. Jelle, E-mail: h.teertstra@nki.nl [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital (NKI-AVL), P.O. Box 90203, 1006 BE Amsterdam (Netherlands); Muller, Sara H., E-mail: s.muller@nki.nl [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital (NKI-AVL), P.O. Box 90203, 1006 BE Amsterdam (Netherlands); Hans, Erwin W., E-mail: e.w.hans@utwente.nl [University of Twente, School of Management and Governance, Dept. of Industrial Engineering and Business Intelligence Systems, Enschede (Netherlands); Harten, Wim H. van, E-mail: w.v.harten@nki.nl [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital (NKI-AVL), P.O. Box 90203, 1006 BE Amsterdam (Netherlands); University of Twente, IGS Institute for Innovation and Governance Studies, Department of Health Technology Services Research (HTSR), Enschede (Netherlands)

    2012-11-15

Introduction: To examine the use of computer simulation to reduce the time between the CT request and the consult in which the CT report is discussed (the diagnostic track), while restricting idle time and overtime. Methods: After a pre-implementation analysis in our case-study hospital, three scenarios were evaluated by computer simulation with respect to CT access time, overtime and idle time; after implementation, these same aspects were evaluated again. Effects on throughput time were measured for outpatient short-term and urgent requests only. Conclusion: The pre-implementation analysis showed an average CT access time of 9.8 operating days and an average diagnostic track of 14.5 operating days. Based on the outcomes of the simulation, management changed the capacity for the different patient groups to facilitate a diagnostic track of 10 operating days, with a CT access time of 7 days. After the implementation of changes, the average diagnostic track duration was 12.6 days, with an average CT access time of 7.3 days. The fraction of patients with a total throughput time within 10 days increased from 29% to 44%, while utilization remained unchanged at 82%; idle time increased by 11% and overtime decreased by 82%. The fraction of patients that completed the diagnostic track within 10 days improved by 52%. Computer simulation proved useful for studying the effects of proposed scenarios in radiology management. Besides the tangible effects, the simulation increased the awareness that optimizing capacity allocation can reduce access times.

  5. Reducing the throughput time of the diagnostic track involving CT scanning with computer simulation

    International Nuclear Information System (INIS)

    Lent, Wineke A.M. van; Deetman, Joost W.; Teertstra, H. Jelle; Muller, Sara H.; Hans, Erwin W.; Harten, Wim H. van

    2012-01-01

Introduction: To examine the use of computer simulation to reduce the time between the CT request and the consult in which the CT report is discussed (the diagnostic track), while restricting idle time and overtime. Methods: After a pre-implementation analysis in our case-study hospital, three scenarios were evaluated by computer simulation with respect to CT access time, overtime and idle time; after implementation, these same aspects were evaluated again. Effects on throughput time were measured for outpatient short-term and urgent requests only. Conclusion: The pre-implementation analysis showed an average CT access time of 9.8 operating days and an average diagnostic track of 14.5 operating days. Based on the outcomes of the simulation, management changed the capacity for the different patient groups to facilitate a diagnostic track of 10 operating days, with a CT access time of 7 days. After the implementation of changes, the average diagnostic track duration was 12.6 days, with an average CT access time of 7.3 days. The fraction of patients with a total throughput time within 10 days increased from 29% to 44%, while utilization remained unchanged at 82%; idle time increased by 11% and overtime decreased by 82%. The fraction of patients that completed the diagnostic track within 10 days improved by 52%. Computer simulation proved useful for studying the effects of proposed scenarios in radiology management. Besides the tangible effects, the simulation increased the awareness that optimizing capacity allocation can reduce access times.

  6. Stochastic nonlinear time series forecasting using time-delay reservoir computers: performance and universality.

    Science.gov (United States)

    Grigoryeva, Lyudmila; Henriques, Julie; Larger, Laurent; Ortega, Juan-Pablo

    2014-07-01

Reservoir computing is a recently introduced machine learning paradigm that has already shown excellent performance in the processing of empirical data. We study a particular kind of reservoir computers called time-delay reservoirs, which are constructed by sampling the solution of a time-delay differential equation, and show their good performance in forecasting the conditional covariances associated with multivariate discrete-time nonlinear stochastic processes of VEC-GARCH type, as well as in predicting factual daily market realized volatilities computed with intraday quotes, using daily log-return series of moderate size as training input. We tackle some problems associated with the lack of task-universality of individually operating reservoirs and propose a solution based on the use of parallel arrays of time-delay reservoirs. Copyright © 2014 Elsevier Ltd. All rights reserved.
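
    Time-delay reservoirs sample a delay-differential system; as a software stand-in for the same computational paradigm, here is a minimal echo state network forecasting a toy series one step ahead. All hyperparameters are arbitrary, and no claim is made to match the paper's VEC-GARCH experiments:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy series: a noisy sine wave; the task is one-step-ahead prediction.
u = np.sin(np.linspace(0, 60, 1500)) + 0.05 * rng.standard_normal(1500)

N = 200                                   # reservoir size (arbitrary)
W_in = 0.5 * rng.uniform(-1, 1, N)        # input weights
W = rng.uniform(-1, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius 0.9

x = np.zeros(N)
states = np.empty((len(u) - 1, N))
for t in range(len(u) - 1):               # drive the reservoir with u[t]
    x = np.tanh(W @ x + W_in * u[t])
    states[t] = x

washout, split = 100, 1200
X_tr, y_tr = states[washout:split], u[washout + 1:split + 1]
# Ridge-regularized linear readout: the only trained part of an ESN.
ridge = 1e-6
W_out = np.linalg.solve(X_tr.T @ X_tr + ridge * np.eye(N), X_tr.T @ y_tr)
pred = states[split:] @ W_out
print("test RMSE:", np.sqrt(np.mean((pred - u[split + 1:]) ** 2)))
```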

  7. Improving the Prediction of Total Surgical Procedure Time Using Linear Regression Modeling

    Directory of Open Access Journals (Sweden)

    Eric R. Edelman

    2017-06-01

Full Text Available For efficient utilization of operating rooms (ORs), accurate schedules of assigned block time and sequences of patient cases need to be made. The quality of these planning tools is dependent on the accurate prediction of total procedure time (TPT) per case. In this paper, we attempt to improve the accuracy of TPT predictions by using linear regression models based on estimated surgeon-controlled time (eSCT) and other variables relevant to TPT. We extracted data from a Dutch benchmarking database of all surgeries performed in six academic hospitals in The Netherlands from 2012 till 2016. The final dataset consisted of 79,983 records, describing 199,772 h of total OR time. Potential predictors of TPT that were included in the subsequent analysis were eSCT, patient age, type of operation, American Society of Anesthesiologists (ASA) physical status classification, and type of anesthesia used. First, we computed the predicted TPT based on a previously described fixed ratio model for each record, multiplying eSCT by 1.33. This number is based on the research performed by van Veen-Berkx et al., which showed that 33% of SCT is generally a good approximation of anesthesia-controlled time (ACT). We then systematically tested all possible linear regression models to predict TPT using eSCT in combination with the other available independent variables. In addition, all regression models were again tested without eSCT as a predictor to predict ACT separately (which leads to TPT by adding SCT). TPT was most accurately predicted using a linear regression model based on the independent variables eSCT, type of operation, ASA classification, and type of anesthesia. This model performed significantly better than the fixed ratio model and the method of predicting ACT separately. Making use of these more accurate predictions in planning and sequencing algorithms may enable an increase in utilization of ORs, leading to significant financial and productivity related benefits.

  8. Improving the Prediction of Total Surgical Procedure Time Using Linear Regression Modeling.

    Science.gov (United States)

    Edelman, Eric R; van Kuijk, Sander M J; Hamaekers, Ankie E W; de Korte, Marcel J M; van Merode, Godefridus G; Buhre, Wolfgang F F A

    2017-01-01

    For efficient utilization of operating rooms (ORs), accurate schedules of assigned block time and sequences of patient cases need to be made. The quality of these planning tools is dependent on the accurate prediction of total procedure time (TPT) per case. In this paper, we attempt to improve the accuracy of TPT predictions by using linear regression models based on estimated surgeon-controlled time (eSCT) and other variables relevant to TPT. We extracted data from a Dutch benchmarking database of all surgeries performed in six academic hospitals in The Netherlands from 2012 till 2016. The final dataset consisted of 79,983 records, describing 199,772 h of total OR time. Potential predictors of TPT that were included in the subsequent analysis were eSCT, patient age, type of operation, American Society of Anesthesiologists (ASA) physical status classification, and type of anesthesia used. First, we computed the predicted TPT based on a previously described fixed ratio model for each record, multiplying eSCT by 1.33. This number is based on the research performed by van Veen-Berkx et al., which showed that 33% of SCT is generally a good approximation of anesthesia-controlled time (ACT). We then systematically tested all possible linear regression models to predict TPT using eSCT in combination with the other available independent variables. In addition, all regression models were again tested without eSCT as a predictor to predict ACT separately (which leads to TPT by adding SCT). TPT was most accurately predicted using a linear regression model based on the independent variables eSCT, type of operation, ASA classification, and type of anesthesia. This model performed significantly better than the fixed ratio model and the method of predicting ACT separately. Making use of these more accurate predictions in planning and sequencing algorithms may enable an increase in utilization of ORs, leading to significant financial and productivity related benefits.
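
    The two models compared above can be restated numerically on synthetic records: the fixed-ratio prediction TPT = 1.33 × eSCT versus an ordinary least-squares fit of TPT on eSCT. The study's categorical predictors are omitted and the data below are made up:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic records (minutes): true TPT = eSCT + ACT, with ACT noisy
# around 0.33 * eSCT plus a fixed overhead (all figures invented).
esct = rng.uniform(30, 300, 500)
tpt = esct + 0.33 * esct * (1 + 0.3 * rng.standard_normal(500)) + 15

pred_fixed = 1.33 * esct                          # fixed-ratio model

X = np.column_stack([np.ones_like(esct), esct])   # OLS: TPT ~ 1 + eSCT
beta, *_ = np.linalg.lstsq(X, tpt, rcond=None)
pred_ols = X @ beta

for name, pred in [("fixed ratio", pred_fixed), ("regression", pred_ols)]:
    rmse = np.sqrt(np.mean((pred - tpt) ** 2))
    print(f"{name}: RMSE = {rmse:.1f} min")
```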

  9. Computation of a long-time evolution in a Schroedinger system

    International Nuclear Information System (INIS)

    Girard, R.; Kroeger, H.; Labelle, P.; Bajzer, Z.

    1988-01-01

We compare different techniques for the computation of a long-time evolution and the S matrix in a Schroedinger system. As an application we consider a two-nucleon system interacting via the Yamaguchi potential. We suggest computing the time evolution for a very short time using Padé approximants, the long-time evolution being obtained by iterative squaring. Within the technique of strong approximation of Møller wave operators (SAM), we compare our calculation with computation of the time evolution in the eigenrepresentation of the Hamiltonian and with the standard Lippmann-Schwinger solution for the S matrix. We find numerical agreement between these alternative methods for time-evolution computation up to half the number of digits of internal machine precision, and fairly rapid convergence of both techniques towards the Lippmann-Schwinger solution.
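
    The short-time-propagator-plus-iterative-squaring idea can be sketched with the (1,1) Padé (Cayley) approximant of exp(-iHΔt) applied to a small random Hermitian matrix; the Yamaguchi-potential specifics of the paper are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((6, 6)) + 1j * rng.standard_normal((6, 6))
H = (A + A.conj().T) / 2                     # a random Hermitian "Hamiltonian"

T, n = 10.0, 20                              # evolve to t = T via 2**n squarings
dt = T / 2**n
I = np.eye(6)
# (1,1) Pade (Cayley) approximant of exp(-i H dt): unitary for Hermitian H.
U = np.linalg.solve(I + 0.5j * H * dt, I - 0.5j * H * dt)
for _ in range(n):                           # iterative squaring: U(2t) = U(t) @ U(t)
    U = U @ U

# Reference: exact evolution from the eigendecomposition of H.
w, V = np.linalg.eigh(H)
U_exact = V @ np.diag(np.exp(-1j * w * T)) @ V.conj().T
print("max error:", np.abs(U - U_exact).max())
```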

  10. The relationship between TV/computer time and adolescents' health-promoting behavior: a secondary data analysis.

    Science.gov (United States)

    Chen, Mei-Yen; Liou, Yiing-Mei; Wu, Jen-Yee

    2008-03-01

Television and computers provide significant benefits for learning about the world. Some studies have linked excessive television (TV) watching or computer game playing among adolescents to poorer health status or unhealthy behaviors. However, studies of the relationship between watching TV/playing computer games and adolescents' adoption of health-promoting behavior are limited. This study aimed to examine the relationship between time spent watching TV and on leisure use of computers and adolescents' health-promoting behavior, and associated factors. This paper used secondary data analysis from part of a health promotion project in Taoyuan County, Taiwan. A cross-sectional design was used and purposive sampling was conducted among adolescents in the original project. A total of 660 participants answered the questions appropriately for this work between January and June 2004. Findings showed that the mean age of the respondents was 15.0 +/- 1.7 years. The mean numbers of TV watching hours were 2.28 and 4.07 on weekdays and weekends, respectively. The mean hours of leisure (non-academic) computer use were 1.64 and 3.38 on weekdays and weekends, respectively. Results indicated that adolescents spent significant time watching TV and using the computer, which was negatively associated with adopting health-promoting behaviors such as life appreciation, health responsibility, social support and exercise behavior. Moreover, being a boy, being overweight, living in a rural area, and being a middle-school student were significantly associated with spending long periods watching TV and using the computer. Therefore, primary health care providers should record the TV and non-academic computer time of youths when conducting health promotion programs, and educate parents on how to become good and healthy electronic media users.

  11. Results of computer assisted mini-incision subvastus approach for total knee arthroplasty.

    Science.gov (United States)

    Turajane, Thana; Larbpaiboonpong, Viroj; Kongtharvonskul, Jatupon; Maungsiri, Samart

    2009-12-01

The mini-incision subvastus approach preserves the soft tissue of the knee. Its advantages include reduced blood loss, reduced pain, self-rehabilitation and faster recovery. However, whether improved visualization, component alignment, and greater blood preservation achieve better outcomes and prevent early failure of the Total Knee Arthroplasty (TKA) has remained debatable. Computer navigation has been introduced to improve alignment and reduce blood loss. The purpose of this study was to evaluate the short-term outcomes of the combined computer assisted mini-incision subvastus approach for Total Knee Arthroplasty (CMS-TKA). A prospective case series of the initial 80 patients who underwent CMS-TKA from January 2007 to October 2008 was carried out. The patients' conditions were classified into 2 groups: the simple OA knee (varus deformity less than 15 degrees, BMI less than 20%, no associated deformities) and the complex deformity (varus deformity more than 15 degrees, BMI more than 20%, associated flexion contracture). There were 59 patients in group 1 and 21 patients in group 2. Of the 80 knees, 38 were on the left and 42 on the right. The results of CMS-TKA [mean (range)] in group 1 : group 2 were, respectively: incision length [10.88 (8-13) : 11.92 (10-14) cm], operation time [118 (111.88-125.12) : 131 (119.29-143.71) minutes], lateral releases (0 in both groups), postoperative range of motion in flexion [94.5 (90-100) : 95.25 (90-105) degrees] and extension [1.75 (0-5) : 1.5 (0-5) degrees], blood loss in 24 hours [489.09 (414.7-563.48) : 520 (503.46-636.54) ml], blood transfusion [1 (0-1) unit in both groups], preoperative tibiofemoral angle [varus 4 (0-10) : varus 17.14 (15.7-18.5) degrees], postoperative tibiofemoral angle [valgus 1.38 (0-4) : valgus 2.85 (2.1-3.5) degrees], tibiofemoral angle outliers (85% both

  12. Imprecise results: Utilizing partial computations in real-time systems

    Science.gov (United States)

    Lin, Kwei-Jay; Natarajan, Swaminathan; Liu, Jane W.-S.

    1987-01-01

    In real-time systems, a computation may not have time to complete its execution because of deadline requirements. In such cases, no result except the approximate results produced by the computations up to that point will be available. It is desirable to utilize these imprecise results if possible. Two approaches are proposed to enable computations to return imprecise results when executions cannot be completed normally. The milestone approach records results periodically, and if a deadline is reached, returns the last recorded result. The sieve approach demarcates sections of code which can be skipped if the time available is insufficient. By using these approaches, the system is able to produce imprecise results when deadlines are reached. The design of the Concord project is described which supports imprecise computations using these techniques. Also presented is a general model of imprecise computations using these techniques, as well as one which takes into account the influence of the environment, showing where the latter approach fits into this model.
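
    A toy rendering of the milestone approach: an iterative refinement periodically records its latest result, and the caller receives the most recent milestone when the deadline arrives. Names and structure are illustrative, not taken from the Concord design:

```python
import time

def leibniz_pi_with_milestones(deadline_s, check_every=10_000):
    """Milestone-style imprecise computation: refine an estimate of pi with
    the Leibniz series and return the last recorded (imprecise) result when
    the deadline is reached. Illustrative only, not the Concord system."""
    start = time.monotonic()
    total, sign, k = 0.0, 1.0, 0
    milestone = None
    while True:
        total += sign / (2 * k + 1)
        sign, k = -sign, k + 1
        if k % check_every == 0:
            milestone = 4.0 * total            # record a usable partial result
            if time.monotonic() - start >= deadline_s:
                return milestone, k            # deadline reached: best-so-far

pi_approx, iters = leibniz_pi_with_milestones(0.05)
print(f"pi ~ {pi_approx:.6f} after {iters} terms (imprecise, deadline-bounded)")
```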

  13. Total Productive Maintenance at Paccar INC

    OpenAIRE

    Ştefan Farkas

    2010-01-01

This paper reports the application of the total productive maintenance method at the Paccar Inc. truck plant in Victoria, Australia. The total productive maintenance method and the total productive maintenance house are presented. The global equipment effectiveness is computed and exemplified. The production structure and the organisation of maintenance are presented. Results of the variation of global equipment effectiveness and autonomous maintenance over a two-week period are reported.

  14. Application of total care time and payment per unit time model for physician reimbursement for common general surgery operations.

    Science.gov (United States)

    Chatterjee, Abhishek; Holubar, Stefan D; Figy, Sean; Chen, Lilian; Montagne, Shirley A; Rosen, Joseph M; Desimone, Joseph P

    2012-06-01

The relative value unit system relies on subjective measures of physician input in the care of patients. A payment per unit time model relates surgeon reimbursement to the total care time spent in the operating room, postoperative in-house, and in clinic, to define payment per unit time. We aimed to compare common general surgery operations by using the total care time and payment per unit time method in order to demonstrate a more objective measurement for physician reimbursement. Average total physician payment per case was obtained for 5 outpatient operations and 4 inpatient operations in general surgery. Total care time was defined as the sum of operative time, 30 minutes per hospital day, and 30 minutes per office visit for each operation. Payment per unit time was calculated by dividing the physician reimbursement per case by the total care time. Total care time, physician payment per case, and payment per unit time for each type of operation demonstrated that the average payment per unit time was $455.73 for inpatient operations and slightly more, $467.51, for outpatient operations. Partial colectomy with primary anastomosis had the longest total care time (8.98 hours) and the least payment per unit time ($188.52). Laparoscopic gastric bypass had the highest payment per unit time ($707.30). The total care time and payment per unit time method can be used as an adjunct to compare reimbursement among different operations on an institutional level as well as on a national level. Although many operations have similar payment trends based on time spent by the surgeon, payment differences using this methodology are seen and may be in need of further review. Copyright © 2012 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
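
    The paper's arithmetic is simple enough to state directly; the figures in the example below are illustrative, chosen only to resemble the partial colectomy case above:

```python
def payment_per_unit_time(payment, op_time_h, hospital_days, office_visits):
    """Total care time = OR time + 0.5 h per hospital day + 0.5 h per office
    visit (the paper's convention); returns dollars per hour of care."""
    total_care_time_h = op_time_h + 0.5 * hospital_days + 0.5 * office_visits
    return payment / total_care_time_h

# Illustrative case resembling the partial colectomy example above:
# $1693 payment, 3 h in the OR, 10 hospital days, 2 office visits.
print(f"${payment_per_unit_time(1693.0, 3.0, 10, 2):.2f}/h")  # ~ $188/h
```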

  15. Real-Time Thevenin Impedance Computation

    DEFF Research Database (Denmark)

    Sommer, Stefan Horst; Jóhannsson, Hjörtur

    2013-01-01

operating state, and strict time constraints are difficult to adhere to as the complexity of the grid increases. Several suggested approaches for real-time stability assessment require Thevenin impedances to be determined for the observed system conditions. By combining matrix factorization, graph reduction, and parallelization, we develop an algorithm for computing Thevenin impedances an order of magnitude faster than previous approaches. We test the factor-and-solve algorithm with data from several power grids of varying complexity, and we show how the algorithm allows real-time stability assessment of complex power
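
    The factor-and-solve idea in miniature: factor the bus admittance matrix once, then obtain each Thevenin impedance as the self-impedance seen by a unit current injection. The 4-bus Ybus below is a toy, and the paper's graph reduction and parallelization are omitted:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

# Toy 4-bus admittance matrix: identical branch reactances x = 0.1 pu,
# plus a source tie at bus 0 so that Ybus is nonsingular.
y = 1.0 / 0.1j
Ybus = np.array([[2*y,  -y,  -y,   0 ],
                 [-y,  2*y,   0,  -y ],
                 [-y,   0,  2*y,  -y ],
                 [ 0,  -y,  -y,  2*y]], dtype=complex)
Ybus[0, 0] += 1.0 / 0.05j             # generator/ground tie at bus 0

lu = lu_factor(Ybus)                  # factor once (the expensive step)
n = Ybus.shape[0]
z_th = np.empty(n, dtype=complex)
for i in range(n):                    # solve Ybus @ v = e_i; Zth at bus i = v[i]
    e = np.zeros(n, dtype=complex)
    e[i] = 1.0
    z_th[i] = lu_solve(lu, e)[i]
print(z_th)
```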

  16. Spying on real-time computers to improve performance

    International Nuclear Information System (INIS)

    Taff, L.M.

    1975-01-01

    The sampled program-counter histogram, an established technique for shortening the execution times of programs, is described for a real-time computer. The use of a real-time clock allows particularly easy implementation. (Auth.)
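
    The same idea is easy to demonstrate in user space: sample the "program counter" (here, the current line of the calling thread's innermost Python frame) on a timer and histogram the hits. A toy sketch, not the paper's real-time-clock implementation:

```python
import collections, sys, threading, time

def sample_pc_histogram(run, interval=0.001):
    """Toy sampled program-counter histogram: a timer thread periodically
    records file:line of the calling thread's innermost frame while run()
    executes; hot lines accumulate the most samples."""
    hist = collections.Counter()
    target_id = threading.get_ident()
    stop = threading.Event()

    def sampler():
        while not stop.is_set():
            frame = sys._current_frames().get(target_id)
            if frame is not None:
                hist[f"{frame.f_code.co_filename}:{frame.f_lineno}"] += 1
            time.sleep(interval)

    t = threading.Thread(target=sampler, daemon=True)
    t.start()
    run()
    stop.set()
    t.join()
    return hist

def workload():
    s = 0
    for i in range(2_000_000):
        s += i * i          # the hot line: should dominate the histogram
    for i in range(200_000):
        s += i              # a cooler line
    return s

for line, hits in sample_pc_histogram(workload).most_common(3):
    print(hits, line)
```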

  17. Total sitting time, leisure time physical activity and risk of hospitalization due to low back pain

    DEFF Research Database (Denmark)

    Balling, Mie; Holmberg, Teresa; Petersen, Christina B

    2018-01-01

AIMS: This study aimed to test the hypotheses that a high total sitting time and vigorous physical activity in leisure time increase the risk of low back pain and herniated lumbar disc disease. METHODS: A total of 76,438 adults answered questions regarding their total sitting time and physical activity during leisure time in the Danish Health Examination Survey 2007-2008. Information on low back pain diagnoses up to 10 September 2015 was obtained from The National Patient Register. The mean follow-up time was 7.4 years. Data were analysed using Cox regression analysis with adjustment ... disc disease. However, moderate or vigorous physical activity, as compared to light physical activity, was associated with increased risk of low back pain (HR = 1.16, 95% CI: 1.03-1.30 and HR = 1.45, 95% CI: 1.15-1.83). Moderate, but not vigorous physical activity was associated with increased risk

  18. A Distributed Computing Network for Real-Time Systems.

    Science.gov (United States)

    1980-11-03

Only OCR-damaged cover-page text survives for this record: Naval Underwater Systems Center, Newport, RI, Technical Document 5932, "A Distributed Computing Network for Real-Time Systems," by Gordon E. Morson, November 1980; no readable abstract remains.

  19. Total Productive Maintenance at Paccar INC

    Directory of Open Access Journals (Sweden)

    Ştefan Farkas

    2010-06-01

Full Text Available This paper reports the application of the total productive maintenance method at the Paccar Inc. truck plant in Victoria, Australia. The total productive maintenance method and the total productive maintenance house are presented. The global equipment effectiveness is computed and exemplified. The production structure and the organisation of maintenance are presented. Results of the variation of global equipment effectiveness and autonomous maintenance over a two-week period are reported.
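
    The global (overall) equipment effectiveness figure mentioned above is conventionally computed as availability × performance × quality. A sketch with illustrative shift data:

```python
def equipment_effectiveness(planned_min, downtime_min, ideal_cycle_s,
                            total_count, good_count):
    """OEE-style global equipment effectiveness = availability x performance
    x quality (the standard TPM formula; the figures passed in below are
    illustrative, not data from the Paccar plant)."""
    run_min = planned_min - downtime_min
    availability = run_min / planned_min
    performance = (ideal_cycle_s * total_count) / (run_min * 60)
    quality = good_count / total_count
    return availability * performance * quality

# Example shift: 480 min planned, 45 min down, 30 s ideal cycle,
# 800 units made, 780 of them good.
print(f"{equipment_effectiveness(480, 45, 30, 800, 780):.1%}")  # ~81.2%
```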

  20. Real-time computing platform for spiking neurons (RT-spike).

    Science.gov (United States)

    Ros, Eduardo; Ortigosa, Eva M; Agís, Rodrigo; Carrillo, Richard; Arnold, Michael

    2006-07-01

    A computing platform is described for simulating arbitrary networks of spiking neurons in real time. A hybrid computing scheme is adopted that uses both software and hardware components to manage the tradeoff between flexibility and computational power; the neuron model is implemented in hardware and the network model and the learning are implemented in software. The incremental transition of the software components into hardware is supported. We focus on a spike response model (SRM) for a neuron where the synapses are modeled as input-driven conductances. The temporal dynamics of the synaptic integration process are modeled with a synaptic time constant that results in a gradual injection of charge. This type of model is computationally expensive and is not easily amenable to existing software-based event-driven approaches. As an alternative we have designed an efficient time-based computing architecture in hardware, where the different stages of the neuron model are processed in parallel. Further improvements occur by computing multiple neurons in parallel using multiple processing units. This design is tested using reconfigurable hardware and its scalability and performance evaluated. Our overall goal is to investigate biologically realistic models for the real-time control of robots operating within closed action-perception loops, and so we evaluate the performance of the system on simulating a model of the cerebellum where the emulation of the temporal dynamics of the synaptic integration process is important.
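
    The expensive part described above, a synaptic conductance with a time constant that injects charge gradually, can be sketched in a minimal explicit-Euler form. Parameters are arbitrary, and the hardware pipeline and SRM details are not modeled:

```python
import numpy as np

dt, T = 0.1e-3, 0.2                      # 0.1 ms step, 200 ms of simulation
steps = int(T / dt)
tau_s = 5e-3                             # synaptic time constant
v_rest, v_thresh, v_reset = -70e-3, -54e-3, -70e-3
E_syn, g_peak = 0.0, 8e-9                # excitatory reversal potential, peak conductance
C_m, g_leak = 200e-12, 10e-9             # membrane capacitance and leak

rng = np.random.default_rng(3)
input_spikes = rng.random(steps) < 40 * dt   # ~40 Hz Poisson input train

v, g, out = v_rest, 0.0, []
for t in range(steps):
    if input_spikes[t]:
        g += g_peak                       # each input spike opens conductance
    g -= dt * g / tau_s                   # conductance decays with tau_s
    # Membrane equation: C dv/dt = -g_leak (v - v_rest) - g (v - E_syn),
    # so charge is injected gradually while the conductance is open.
    v += dt * (-g_leak * (v - v_rest) - g * (v - E_syn)) / C_m
    if v >= v_thresh:                     # threshold crossing emits a spike
        out.append(t * dt)
        v = v_reset
print(len(out), "output spikes; first times (ms):",
      [round(s * 1e3, 1) for s in out[:5]])
```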

  1. 43 CFR 45.3 - How are time periods computed?

    Science.gov (United States)

    2010-10-01

    ... 43 Public Lands: Interior 1 2010-10-01 2010-10-01 false How are time periods computed? 45.3... IN FERC HYDROPOWER LICENSES General Provisions § 45.3 How are time periods computed? (a) General... run is not included. (2) The last day of the period is included. (i) If that day is a Saturday, Sunday...

  2. Relativistic Photoionization Computations with the Time Dependent Dirac Equation

    Science.gov (United States)

    2016-10-12

Only report-cover text survives for this record: Naval Research Laboratory, Washington, DC 20375-5320, memorandum report NRL/MR/6795--16-9698, "Relativistic Photoionization Computations with the Time Dependent Dirac Equation," by Daniel F. Gordon and Bahman Hafizi, Naval Research Laboratory, 4555 Overlook Avenue, SW; distribution unclassified/unlimited, 22 pages. Keywords: tunneling, photoionization; the report concerns ionization of inner-shell electrons by laser fields.

  3. STICK: Spike Time Interval Computational Kernel, a Framework for General Purpose Computation Using Neurons, Precise Timing, Delays, and Synchrony.

    Science.gov (United States)

    Lagorce, Xavier; Benosman, Ryad

    2015-11-01

    There has been significant research over the past two decades in developing new platforms for spiking neural computation. Current neural computers are primarily developed to mimic biology. They use neural networks, which can be trained to perform specific tasks to mainly solve pattern recognition problems. These machines can do more than simulate biology; they allow us to rethink our current paradigm of computation. The ultimate goal is to develop brain-inspired general purpose computation architectures that can breach the current bottleneck introduced by the von Neumann architecture. This work proposes a new framework for such a machine. We show that the use of neuron-like units with precise timing representation, synaptic diversity, and temporal delays allows us to set a complete, scalable compact computation framework. The framework provides both linear and nonlinear operations, allowing us to represent and solve any function. We show usability in solving real use cases from simple differential equations to sets of nonlinear differential equations leading to chaotic attractors.
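
    The core representational idea, a value carried by the precise interval between two spikes, fits in a few lines; the encoding constants below are illustrative, not those of the STICK framework:

```python
T_MIN, T_COD = 1.0e-3, 100.0e-3   # interval offset and coding window (arbitrary)

def encode(x: float) -> tuple[float, float]:
    """Represent x in [0, 1] as a pair of spike times whose interval is
    T_MIN + x * T_COD (interval coding in the spirit of STICK; the
    constants here are illustrative, not the paper's)."""
    assert 0.0 <= x <= 1.0
    t0 = 0.0
    return t0, t0 + T_MIN + x * T_COD

def decode(t0: float, t1: float) -> float:
    return ((t1 - t0) - T_MIN) / T_COD

t0, t1 = encode(0.37)
print(decode(t0, t1))   # -> 0.37 (up to floating point)
```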

  4. Computer-controlled neutron time-of-flight spectrometer. Part II

    International Nuclear Information System (INIS)

    Merriman, S.H.

    1979-12-01

    A time-of-flight spectrometer for neutron inelastic scattering research has been interfaced to a PDP-15/30 computer. The computer is used for experimental data acquisition and analysis and for apparatus control. This report was prepared to summarize the functions of the computer and to act as a users' guide to the software system

  5. Time series modeling, computation, and inference

    CERN Document Server

    Prado, Raquel

    2010-01-01

The authors systematically develop a state-of-the-art analysis and modeling of time series. … this book is well organized and well written. The authors present various statistical models for engineers to solve problems in time series analysis. Readers no doubt will learn state-of-the-art techniques from this book. —Hsun-Hsien Chang, Computing Reviews, March 2012 … My favorite chapters were on dynamic linear models and vector AR and vector ARMA models. —William Seaver, Technometrics, August 2011 … a very modern entry to the field of time-series modelling, with a rich reference list of the current literature.

  6. Heterogeneous real-time computing in radio astronomy

    Science.gov (United States)

    Ford, John M.; Demorest, Paul; Ransom, Scott

    2010-07-01

    Modern computer architectures suited for general purpose computing are often not the best choice for either I/O-bound or compute-bound problems. Sometimes the best choice is not to choose a single architecture, but to take advantage of the best characteristics of different computer architectures to solve your problems. This paper examines the tradeoffs between using computer systems based on the ubiquitous X86 Central Processing Units (CPU's), Field Programmable Gate Array (FPGA) based signal processors, and Graphical Processing Units (GPU's). We will show how a heterogeneous system can be produced that blends the best of each of these technologies into a real-time signal processing system. FPGA's tightly coupled to analog-to-digital converters connect the instrument to the telescope and supply the first level of computing to the system. These FPGA's are coupled to other FPGA's to continue to provide highly efficient processing power. Data is then packaged up and shipped over fast networks to a cluster of general purpose computers equipped with GPU's, which are used for floating-point intensive computation. Finally, the data is handled by the CPU and written to disk, or further processed. Each of the elements in the system has been chosen for its specific characteristics and the role it can play in creating a system that does the most for the least, in terms of power, space, and money.

  7. Ubiquitous computing technology for just-in-time motivation of behavior change.

    Science.gov (United States)

    Intille, Stephen S

    2004-01-01

    This paper describes a vision of health care where "just-in-time" user interfaces are used to transform people from passive to active consumers of health care. Systems that use computational pattern recognition to detect points of decision, behavior, or consequences automatically can present motivational messages to encourage healthy behavior at just the right time. Further, new ubiquitous computing and mobile computing devices permit information to be conveyed to users at just the right place. In combination, computer systems that present messages at the right time and place can be developed to motivate physical activity and healthy eating. Computational sensing technologies can also be used to measure the impact of the motivational technology on behavior.

  8. Whole blood coagulation time, haematocrit, haemoglobin and total ...

    African Journals Online (AJOL)

The study was carried out to determine the values of whole blood coagulation time (WBCT), haematocrit (HM), haemoglobin (HB) and total protein (TP) of one hundred and eighteen apparently healthy turkeys reared under an extensive management system in Zaria. The mean values for WBCT, HM, HB and TP were 1.12 ...

  9. Simplified neural networks for solving linear least squares and total least squares problems in real time.

    Science.gov (United States)

    Cichocki, A; Unbehauen, R

    1994-01-01

In this paper a new class of simplified low-cost analog artificial neural networks with on-chip adaptive learning algorithms is proposed for solving linear systems of algebraic equations in real time. The proposed learning algorithms for linear least squares (LS), total least squares (TLS) and data least squares (DLS) problems can be considered as modifications and extensions of well-known algorithms: the row-action projection (Kaczmarz) algorithm and/or the LMS (Adaline) Widrow-Hoff algorithms. The algorithms can be applied to any problem which can be formulated as a linear regression problem. The correctness and high performance of the proposed neural networks are illustrated by extensive computer simulation results.
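
    The row-action projection (Kaczmarz) iteration that the proposed networks build on, in plain NumPy with randomized row selection; the analog-network and TLS/DLS variants are not reproduced:

```python
import numpy as np

def kaczmarz(A, b, sweeps=200, seed=0):
    """Row-action projection (Kaczmarz) iteration: repeatedly project the
    iterate onto the hyperplane of one equation a_i^T x = b_i. It converges
    for consistent systems; the paper's networks extend this idea to
    LS/TLS/DLS settings, which are not reproduced here."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    row_norms = np.sum(A * A, axis=1)
    for _ in range(sweeps * m):
        i = rng.integers(m)                       # randomized row selection
        x += (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x

rng = np.random.default_rng(4)
A = rng.standard_normal((50, 10))
x_true = rng.standard_normal(10)
b = A @ x_true                                    # consistent system
print(np.linalg.norm(kaczmarz(A, b) - x_true))    # ~ machine precision
```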

  10. Continuous-Time Symmetric Hopfield Nets are Computationally Universal

    Czech Academy of Sciences Publication Activity Database

    Šíma, Jiří; Orponen, P.

    2003-01-01

    Roč. 15, č. 3 (2003), s. 693-733 ISSN 0899-7667 R&D Projects: GA AV ČR IAB2030007; GA ČR GA201/02/1456 Institutional research plan: AV0Z1030915 Keywords : continuous-time Hopfield network * Liapunov function * analog computation * computational power * Turing universality Subject RIV: BA - General Mathematics Impact factor: 2.747, year: 2003

  11. Does antegrade JJ stenting affect the total operative time during laparoscopic pyeloplasty?

    Science.gov (United States)

    Bolat, Mustafa Suat; Çınar, Önder; Akdeniz, Ekrem

    2017-12-01

We aimed to show the effect of retrograde JJ stenting and intraoperative antegrade JJ stenting techniques on operative time in patients who underwent laparoscopic pyeloplasty. A total of 34 patients (15 male and 19 female) with ureteropelvic junction obstruction were retrospectively investigated. As part of the surgery, 15 patients were stented retrogradely at the beginning of the procedure (Group 1) and 19 were stented antegradely during the procedure (Group 2). A transperitoneal dismembered pyeloplasty technique was performed in all patients. The two groups were retrospectively compared in terms of complications, mean total operative time, and mean stenting times. The mean ages of the patients were 31.5±15.5 and 33.2±15.5 years (p=0.09), and the mean body mass indexes were 25.8±5.6 and 26.2±8.4 kg/m² in Group 1 and Group 2, respectively. The mean total operative times were 128.9±38.9 min and 112.7±21.9 min (p=0.04); the mean stenting times were 12.6±5.4 min and 3.5±2.4 min (p=0.02); and the mean ratios of catheterization time to total surgery time were 0.1 and 0.03 (p=0.01) in Groups 1 and 2, respectively. The mean hospital stays and the mean anastomosis times were similar between the two groups (p>0.05). Antegrade JJ stenting during laparoscopic pyeloplasty significantly decreased the total operative time.

  12. An empirical method for peak-to-total ratio computation of a gamma-ray detector

    International Nuclear Information System (INIS)

    Cesana, A.; Terrani, M.

    1989-01-01

A simple expression for peak-to-total ratio evaluation of gamma-ray detectors in the energy range 0.3-10 MeV is proposed. The quantities one needs to know for the computation are: detector dimensions and chemical composition, photon cross sections, and an empirical energy-dependent function which is valid for all the detector materials considered. This procedure seems able to produce peak-to-total values with an accuracy comparable to that of the most sophisticated Monte Carlo calculations. It has been tested using experimental peak-to-total values of Ge, NaI, CsI and BGO detectors, but it is reasonable to suppose that it is valid for any detector material. (orig.)

  13. Multiscale Space-Time Computational Methods for Fluid-Structure Interactions

    Science.gov (United States)

    2015-09-13

Only fragments of the report body survive, as section titles: space-time (ST) thermo-fluid analysis of a ground vehicle and its tires; ST-SI computational analysis of a vertical-axis wind turbine; multiscale compressible-flow computation with particle tracking; and space-time VMS computation of wind-turbine rotor and tower aerodynamics (Tayfun Tezduyar, Spenser McIntyre, Nikolay Kostov, Ryan Kolesar, Casey Habluetzel).

  14. 29 CFR 779.253 - What is included in computing the total annual inflow volume.

    Science.gov (United States)

    2010-07-01

    ... FAIR LABOR STANDARDS ACT AS APPLIED TO RETAILERS OF GOODS OR SERVICES Employment to Which the Act May... taxes and other charges which the enterprise must pay for such goods. Generally, all charges will be... computing the total annual inflow volume. The goods which the establishment purchases or receives for resale...

  15. 12 CFR Appendix K to Part 226 - Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions

    Science.gov (United States)

    2010-01-01

    ... Appendix K to Part 226—Total Annual Loan Cost Rate Computations for Reverse Mortgage Transactions (a... loan cost rate for various transactions, as well as instructions, explanations, and examples for.... (2) Term of the transaction. For purposes of total annual loan cost disclosures, the term of a...

  16. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  17. Objectively Measured Total and Occupational Sedentary Time in Three Work Settings

    Science.gov (United States)

    van Dommelen, Paula; Coffeng, Jennifer K.; van der Ploeg, Hidde P.; van der Beek, Allard J.; Boot, Cécile R. L.; Hendriksen, Ingrid J. M.

    2016-01-01

Background: Sedentary behaviour increases the risk of morbidity. Our primary aim is to determine the proportion of, and the factors associated with, objectively measured total and occupational sedentary time in three work settings. A secondary aim is to study the proportions of physical activity and prolonged sedentary bouts. Methods: Data were obtained using ActiGraph accelerometers from employees of: 1) a financial service provider (n = 49 men, 31 women), 2) two research institutes (n = 30 men, 57 women), and 3) a construction company (n = 38 men). Total (over the whole day) and occupational sedentary time, physical activity and prolonged sedentary bouts (lasting ≥30 minutes) were calculated by work setting. Linear regression analyses were performed to examine general, health and work-related factors associated with sedentary time. Results: The employees of the financial service provider and the research institutes spent 76-80% of their occupational time in sedentary behaviour, 18-20% in light intensity physical activity and 3-5% in moderate-to-vigorous intensity physical activity. Occupational time in prolonged sedentary bouts was 27-30%. Over the whole day, less time was sedentary (64-70%) and more time was spent in light intensity physical activity (26-33%). The employees of the construction company spent 44% of their occupational time in sedentary behaviour, 49% in light and 7% in moderate intensity physical activity, and 7% in sedentary bouts. Over the whole day they spent 56% of their time in sedentary behaviour, 40% in light and 4% in moderate intensity physical activity, and 12% in sedentary bouts. For women, low to intermediate education was the only factor negatively associated with occupational sedentary time. Conclusions: Sedentary behaviour is high among white-collar employees, especially highly educated women. A relatively small proportion of sedentary time was accrued in sedentary bouts. It is recommended that worksite health promotion efforts focus on reducing sedentary time.

  18. 22 CFR 1429.21 - Computation of time for filing papers.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 2 2010-04-01 2010-04-01 true Computation of time for filing papers. 1429.21... MISCELLANEOUS AND GENERAL REQUIREMENTS General Requirements § 1429.21 Computation of time for filing papers. In... subchapter requires the filing of any paper, such document must be received by the Board or the officer or...

  19. Highly reliable computer network for real time system

    International Nuclear Information System (INIS)

    Mohammed, F.A.; Omar, A.A.; Ayad, N.M.A.; Madkour, M.A.I.; Ibrahim, M.K.

    1988-01-01

Many computer networks have been studied, with different trends regarding network architecture and the various protocols that govern data transfers and guarantee reliable communication among all nodes. A hierarchical network structure has been proposed to provide a simple and inexpensive way to realize a reliable real-time computer network. In such an architecture, all computers at the same level are connected to a common serial channel through intelligent nodes that collectively control data transfers over the serial channel. This level of computer network can be considered a local area computer network (LACN) that can be used in a nuclear power plant control system, since such a plant has geographically dispersed subsystems. Network expansion would be straightforward: the common channel is tapped for each added computer (host). All the nodes are designed around a microprocessor chip to provide the required intelligence. The node can be divided into two sections, namely a common section that interfaces with the serial data channel and a private section that interfaces with the host computer. The private section would naturally tend to have some variations in the hardware details to match the requirements of individual host computers. 7 figs.

  20. Total Work, Gender and Social Norms in EU and US Time Use

    OpenAIRE

    Burda , Michael C; Hamermesh , Daniel S; Weil , Philippe

    2008-01-01

    Using time-diary data from 27 countries, we demonstrate a negative relationship between real GDP per capita and the female-male difference in total work time--the sum of work for pay and work at home. We also show that in rich non-Catholic countries on four continents men and women do the same amount of total work on average. Our survey results demonstrate that labor economists, macroeconomists, sociologists and the general public consistently believe that women perform more total work. The f...

  1. Computer simulations of long-time tails: what's new?

    NARCIS (Netherlands)

    Hoef, van der M.A.; Frenkel, D.

    1995-01-01

    Twenty-five years ago, Alder and Wainwright discovered, by simulation, the 'long-time tails' in the velocity autocorrelation function of a single particle in a fluid [1]. Since then, few qualitatively new results on long-time tails have been obtained by computer simulations. However, within the

  2. Spike-timing-based computation in sound localization.

    Directory of Open Access Journals (Sweden)

    Dan F M Goodman

    2010-11-01

    Full Text Available Spike timing is precise in the auditory system and it has been argued that it conveys information about auditory stimuli, in particular about the location of a sound source. However, beyond simple time differences, the way in which neurons might extract this information is unclear and the potential computational advantages are unknown. The computational difficulty of this task for an animal is to locate the source of an unexpected sound from two monaural signals that are highly dependent on the unknown source signal. In neuron models consisting of spectro-temporal filtering and a spiking nonlinearity, we found that the binaural structure induced by spatialized sounds is mapped to synchrony patterns that depend on source location rather than on source signal. Location-specific synchrony patterns would then result in the activation of location-specific assemblies of postsynaptic neurons. We designed a spiking neuron model which exploited this principle to locate a variety of sound sources in a virtual acoustic environment using measured human head-related transfer functions. The model was able to accurately estimate the location of previously unknown sounds in both azimuth and elevation (including front/back discrimination) in a known acoustic environment. We found that multiple representations of different acoustic environments could coexist as sets of overlapping neural assemblies which could be associated with spatial locations by Hebbian learning. The model demonstrates the computational relevance of relative spike timing to extract spatial information about sources independently of the source signal.
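
The core idea above, recovering source location from relative spike timing rather than from the source signal itself, can be illustrated with a much simpler stand-in: estimating the interaural time difference (ITD) of two monaural signals from the peak of their cross-correlation. A minimal sketch in Python, using a synthetic noise burst rather than the paper's head-related transfer functions:

```python
import numpy as np

def estimate_itd(left, right, fs):
    """Return the time (s) by which the right-ear signal lags the left.
    Positive values mean the source is closer to the left ear."""
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)  # left[n] ~ right[n - lag]
    return -lag / fs

# Toy example: a noise burst arriving ~0.3 ms earlier at the left ear.
fs = 44100
rng = np.random.default_rng(0)
src = rng.standard_normal(2048)
delay = int(0.0003 * fs)                      # ~13 samples
left = np.concatenate([src, np.zeros(delay)])
right = np.concatenate([np.zeros(delay), src])
print(f"estimated ITD: {estimate_itd(left, right, fs) * 1e3:.2f} ms")
```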

  3. 5 CFR 831.703 - Computation of annuities for part-time service.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Computation of annuities for part-time... part-time service. (a) Purpose. The computational method in this section shall be used to determine the annuity for an employee who has part-time service on or after April 7, 1986. (b) Definitions. In this...

  4. PROCESS INNOVATION: HOLISTIC SCENARIOS TO REDUCE TOTAL LEAD TIME

    Directory of Open Access Journals (Sweden)

    Alin POSTEUCĂ

    2015-11-01

    Full Text Available The globalization of markets requires the continuous development of holistic business scenarios to ensure the flexibility needed to satisfy customers. Continuous improvement of the supply chain entails continuous improvement of material and product lead times and flows, of material and finished-product stocks, and an increase in the number of suppliers located as close by as possible. The contribution of our study is to present holistic scenarios for total lead time improvement and innovation through the implementation of supply chain policy.

  5. Real-time data acquisition and feedback control using Linux Intel computers

    International Nuclear Information System (INIS)

    Penaflor, B.G.; Ferron, J.R.; Piglowski, D.A.; Johnson, R.D.; Walker, M.L.

    2006-01-01

    This paper describes the experiences of the DIII-D programming staff in adapting Linux based Intel computing hardware for use in real-time data acquisition and feedback control systems. Due to the highly dynamic and unstable nature of magnetically confined plasmas in tokamak fusion experiments, real-time data acquisition and feedback control systems are in routine use with all major tokamaks. At DIII-D, plasmas are created and sustained using a real-time application known as the digital plasma control system (PCS). During each experiment, the PCS periodically samples data from hundreds of diagnostic signals and provides these data to control algorithms implemented in software. These algorithms compute the necessary commands to send to various actuators that affect plasma performance. The PCS consists of a group of rack mounted Intel Xeon computer systems running an in-house customized version of the Linux operating system tailored specifically to meet the real-time performance needs of the plasma experiments. This paper provides a more detailed description of the real-time computing hardware and custom developed software, including recent work to utilize dual Intel Xeon equipped computers within the PCS
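
The PCS itself is custom C software, but the acquire/compute/actuate cycle described above follows a generic pattern. A minimal sketch of such a periodic loop, assuming hypothetical read_diagnostics, algorithm and write_actuators callables (none of these are DIII-D names), with the sampling period as an illustrative value:

```python
import time

def control_cycle(read_diagnostics, algorithms, write_actuators,
                  period_s=1e-4, cycles=10_000):
    """Periodic sample -> compute -> actuate loop.

    read_diagnostics() -> dict of signal samples
    algorithms: list of functions mapping samples -> actuator commands
    write_actuators(commands) applies the merged command dict
    """
    next_deadline = time.perf_counter()
    for _ in range(cycles):
        samples = read_diagnostics()
        commands = {}
        for algo in algorithms:
            commands.update(algo(samples))  # later algorithms win on conflicts
        write_actuators(commands)
        # Busy-wait until the next sampling deadline to limit jitter
        # (burns a CPU core; a real-time OS would use a timed wait).
        next_deadline += period_s
        while time.perf_counter() < next_deadline:
            pass
```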

  6. A comprehensive combined experimental and computational framework for pre-clinical wear simulation of total knee replacements.

    Science.gov (United States)

    Abdelgaied, A; Fisher, J; Jennings, L M

    2018-02-01

    A more robust pre-clinical wear simulation framework is required in order to simulate the wider and higher ranges of activities observed in different patient populations, such as younger, more active patients. Such a framework will help to understand and address the reported higher failure rates for younger and more active patients (National_Joint_Registry, 2016). The current study has developed and validated a comprehensive combined experimental and computational framework for pre-clinical wear simulation of total knee replacements (TKR). The input mechanical (elastic modulus and Poisson's ratio) and wear parameters of the moderately cross-linked ultra-high molecular weight polyethylene (UHMWPE) bearing material were independently measured from experimental studies under realistic test conditions, similar to the loading conditions found in total knee replacements. The wear predictions from the computational wear simulation were validated against direct experimental wear measurements for size 3 Sigma curved total knee replacements (DePuy, UK) in an independent experimental wear simulation study under three different daily activities: walking, deep squat, and stair ascent kinematic conditions. The measured compressive mechanical properties of the moderately cross-linked UHMWPE material were more than 20% lower than those reported in the literature under tensile test conditions. The pin-on-plate wear coefficient of moderately cross-linked UHMWPE was significantly dependent on the contact stress and the degree of cross-shear at the articulating surfaces. The computational wear predictions for the TKR from the current framework were consistent and in good agreement with the independent full TKR experimental wear simulation measurements, with a coefficient of determination of 0.94 for the framework. In addition, the comprehensive combined experimental and computational framework was able to explain the complex experimental wear trends from the three different daily activities.

  7. Feasibility and safety of augmented-reality glass for computed tomography-assisted percutaneous revascularization of coronary chronic total occlusion: A single center prospective pilot study.

    Science.gov (United States)

    Opolski, Maksymilian P; Debski, Artur; Borucki, Bartosz A; Staruch, Adam D; Kepka, Cezary; Rokicki, Jakub K; Sieradzki, Bartosz; Witkowski, Adam

    2017-11-01

    Percutaneous coronary intervention (PCI) of chronic total occlusion (CTO) may be facilitated by projection of coronary computed tomography angiography (CTA) datasets in the catheterization laboratory. There are no data on the feasibility and safety outcomes of CTA-assisted CTO PCI using a wearable augmented-reality glass. A total of 15 patients scheduled for elective antegrade CTO intervention were prospectively enrolled and underwent preprocedural coronary CTA. Three-dimensional and curved multiplanar CT reconstructions were transmitted to a head-mounted hands-free computer worn by interventional cardiologists during CTO PCI to provide additional information on CTO tortuosity and calcification. The results of CTO PCI using a wearable computer were compared with a time-matched prospective angiographic registry of 59 patients undergoing antegrade CTO PCI without a wearable computer. Operators' satisfaction was assessed by a 5-point Likert scale. Mean age was 64 ± 8 years and the mean J-CTO score was 2.1 ± 0.9 in the CTA-assisted group. The voice-activated co-registration and review of CTA images in a wearable computer during CTO PCI were feasible and highly rated by PCI operators (4.7/5 points). There were no major adverse cardiovascular events. Compared with standard CTO PCI, CTA-assisted recanalization of CTO using a wearable computer showed more frequent selection of the first-choice stiff wire (0% vs 40%). CTA-assisted CTO PCI using a wearable augmented-reality glass is feasible and safe, and might reduce the resources required for the interventional treatment of CTO. Copyright © 2017 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.

  8. Leisure time computer use and adolescent bone health--findings from the Tromsø Study, Fit Futures: a cross-sectional study.

    Science.gov (United States)

    Winther, Anne; Ahmed, Luai Awad; Furberg, Anne-Sofie; Grimnes, Guri; Jorde, Rolf; Nilsen, Ole Andreas; Dennison, Elaine; Emaus, Nina

    2015-04-22

    Low levels of physical activity may have considerable negative effects on bone health in adolescence, and increasing screen time in place of sporting activity during growth is worrying. This study explored the associations between self-reported screen time at weekends and bone mineral density (BMD). In 2010/2011, 1038 (93%) of the region's first-year upper-secondary school students (15-18 years) attended the Tromsø Study, Fit Futures 1 (FF1). A follow-up survey (FF2) took place in 2012/2013. BMD at total hip, femoral neck and total body was measured as g/cm² by dual X-ray absorptiometry (GE Lunar Prodigy). Lifestyle variables were self-reported, including questions on hours per day spent in front of television/computer during weekends and hours spent on leisure time physical activities. Complete data sets for 388/312 girls and 359/231 boys at FF1/FF2, respectively, were used in analyses. Sex-stratified multiple regression analyses were performed. Many adolescents balanced 2-4 h of screen time with moderate or high physical activity levels. Screen time was positively related to body mass index (BMI) in boys (p=0.002), who spent more time in front of the computer than girls did. In boys, screen time was adversely associated with BMD at FF1 at all sites, and these associations remained robust to adjustments for age, puberty, height, BMI, physical activity, vitamin D levels, smoking, alcohol, calcium and carbonated drink consumption. Screen time was also negatively associated with total hip BMD at FF2 (p=0.031). In contrast, girls who spent 4-6 h in front of the computer had higher BMD than the reference group. In boys, time spent on screen-based sedentary activity was negatively associated with BMD levels, and this relationship persisted 2 years later. Such negative associations were not present among girls. Whether this surprising result is explained by biological differences remains unclear. Published by the BMJ Publishing Group Limited.

  9. Evaluation of a semi-automated computer algorithm for measuring total fat and visceral fat content in lambs undergoing in vivo whole body computed tomography.

    Science.gov (United States)

    Rosenblatt, Alana J; Scrivani, Peter V; Boisclair, Yves R; Reeves, Anthony P; Ramos-Nieves, Jose M; Xie, Yiting; Erb, Hollis N

    2017-10-01

    Computed tomography (CT) is a suitable tool for measuring body fat, since it is non-destructive and can be used to differentiate metabolically active visceral fat from total body fat. Whole body analysis of body fat is likely to be more accurate than single CT slice estimates of body fat. The aim of this study was to assess the agreement between semi-automated computer analysis of whole body volumetric CT data and conventional proximate (chemical) analysis of body fat in lambs. Data were collected prospectively from 12 lambs that underwent duplicate whole body CT, followed by slaughter and carcass analysis by dissection and chemical analysis. Agreement between methods for quantification of total and visceral fat was assessed by Bland-Altman plot analysis. The repeatability of CT was assessed for these measures using the mean difference of duplicated measures. When compared to chemical analysis, CT systematically underestimated total and visceral fat contents by more than 10% of the mean fat weight. Therefore, carcass analysis and semi-automated CT computer measurements were not interchangeable for quantifying body fat content without the use of a correction factor. CT acquisition was repeatable, with a mean difference of repeated measures being close to zero. Therefore, uncorrected whole body CT might have an application for assessment of relative changes in fat content, especially in growing lambs. Copyright © 2017 Elsevier Ltd. All rights reserved.
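
Bland-Altman analysis, the agreement method named above, reduces to the mean (bias) and standard deviation of the pairwise differences between the two measurement methods. A minimal sketch with made-up fat weights (the real lamb data are not reproduced here):

```python
import numpy as np

def bland_altman(method_a, method_b):
    """Bias and 95% limits of agreement between two methods
    measuring the same quantity (here: CT fat vs. chemical analysis)."""
    a, b = np.asarray(method_a, float), np.asarray(method_b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Illustrative (made-up) fat weights in kg for 12 lambs; the chemical
# values mimic CT underestimating by roughly 12%.
ct = np.array([3.1, 4.0, 2.7, 5.2, 3.8, 4.4, 2.9, 3.5, 4.9, 3.3, 4.1, 2.6])
chem = ct / 0.88
bias, (lo, hi) = bland_altman(ct, chem)
print(f"bias = {bias:.2f} kg, 95% limits of agreement = ({lo:.2f}, {hi:.2f}) kg")
```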

  10. Total and domain-specific sitting time among employees in desk-based work settings in Australia.

    Science.gov (United States)

    Bennie, Jason A; Pedisic, Zeljko; Timperio, Anna; Crawford, David; Dunstan, David; Bauman, Adrian; van Uffelen, Jannique; Salmon, Jo

    2015-06-01

    To describe the total and domain-specific daily sitting time among a sample of Australian office-based employees. In April 2010, paper-based surveys were provided to desk-based employees (n=801) in Victoria, Australia. Total daily and domain-specific (work, leisure-time and transport-related) sitting time (minutes/day) were assessed by validated questionnaires. Differences in sitting time were examined across socio-demographic (age, sex, occupational status) and lifestyle characteristics (physical activity levels, body mass index [BMI]) using multiple linear regression analyses. The median (95% confidence interval [CI]) total daily sitting time was 540 (531-557) minutes/day. Insufficiently active adults (median=578 minutes/day, [95% CI: 564-602]) and younger adults aged 18-29 years (median=561 minutes/day, [95% CI: 540-577]) reported the highest total daily sitting times. Occupational sitting time accounted for almost 60% of total daily sitting time. In multivariate analyses, total daily sitting time was negatively associated with age (unstandardised regression coefficient [B]=-1.58) and with weekly physical activity (minutes/week) (B=-0.03). Employees reported that more than half of their total daily sitting time was accrued in the work setting. Given the high contribution of occupational sitting to total daily sitting time among desk-based employees, interventions should focus on the work setting. © 2014 Public Health Association of Australia.

  11. New Flutter Analysis Technique for Time-Domain Computational Aeroelasticity

    Science.gov (United States)

    Pak, Chan-Gi; Lung, Shun-Fat

    2017-01-01

    A new time-domain approach for computing flutter speed is presented. Based on the time-history result of aeroelastic simulation, the unknown unsteady aerodynamics model is estimated using a system identification technique. The full aeroelastic model is generated via coupling the estimated unsteady aerodynamic model with the known linear structure model. The critical dynamic pressure is computed and used in the subsequent simulation until the convergence of the critical dynamic pressure is achieved. The proposed method is applied to a benchmark cantilevered rectangular wing.

  12. Distributed computing for real-time petroleum reservoir monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Ayodele, O. R. [University of Alberta, Edmonton, AB (Canada)

    2004-05-01

    Computer software architecture is presented to illustrate how the concept of distributed computing can be applied to real-time reservoir monitoring processes, permitting the continuous monitoring of the dynamic behaviour of petroleum reservoirs at much shorter intervals. The paper describes the fundamental technologies driving distributed computing, namely Java 2 Platform Enterprise edition (J2EE) by Sun Microsystems, and the Microsoft Dot-Net (Microsoft.Net) initiative, and explains the challenges involved in distributed computing. These are: (1) availability of permanently placed downhole equipment to acquire and transmit seismic data; (2) availability of high bandwidth to transmit the data; (3) security considerations; (4) adaptation of existing legacy codes to run on networks as downloads on demand; and (5) credibility issues concerning data security over the Internet. Other applications of distributed computing in the petroleum industry are also considered, specifically MWD, LWD and SWD (measurement-while-drilling, logging-while-drilling, and simulation-while-drilling), and drill-string vibration monitoring. 23 refs., 1 fig.

  13. Real Time Animation of Trees Based on BBSC in Computer Games

    Directory of Open Access Journals (Sweden)

    Xuefeng Ao

    2009-01-01

    Full Text Available Researchers in the field of computer games usually find it difficult to simulate the motion of actual 3D model trees because the tree model itself has a very complicated structure, and many sophisticated factors need to be considered during the simulation. Though there are some works on simulating 3D trees and their motion, few of them are used in computer games due to the high demand for real-time performance in computer games. In this paper, an approach to animating trees in computer games based on a novel tree model representation, Ball B-Spline Curves (BBSCs), is proposed. By taking advantage of the good features of the BBSC-based model, physical simulation of the motion of leafless trees blowing in the wind becomes easier and more efficient. The method can generate realistic 3D tree animation in real time, which meets the high requirement for real-time performance in computer games.

  14. Computer network time synchronization the network time protocol on earth and in space

    CERN Document Server

    Mills, David L

    2010-01-01

    Carefully coordinated, reliable, and accurate time synchronization is vital to a wide spectrum of fields-from air and ground traffic control, to buying and selling goods and services, to TV network programming. Ill-gotten time could even lead to the unimaginable and cause DNS caches to expire, leaving the entire Internet to implode on the root servers.Written by the original developer of the Network Time Protocol (NTP), Computer Network Time Synchronization: The Network Time Protocol on Earth and in Space, Second Edition addresses the technological infrastructure of time dissemination, distrib
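
At its simplest, the protocol the book covers can be exercised with a bare SNTP (RFC 4330) request: a 48-byte UDP packet to port 123 whose reply carries the server's transmit timestamp in the NTP epoch (seconds since 1900). A minimal sketch, assuming pool.ntp.org is reachable; full NTP adds filtering, round-trip compensation and clock discipline on top of this raw query:

```python
import socket
import struct
import time

NTP_EPOCH_OFFSET = 2208988800  # seconds between 1900-01-01 and 1970-01-01

def sntp_time(server="pool.ntp.org", timeout=2.0):
    """Query an NTP server with a minimal SNTP request and return the
    server's transmit time as a Unix timestamp."""
    packet = bytearray(48)
    packet[0] = 0x1B  # LI = 0, version = 3, mode = 3 (client)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(timeout)
        sock.sendto(packet, (server, 123))
        data, _ = sock.recvfrom(48)
    # Transmit timestamp: 32-bit seconds + 32-bit fraction, big-endian, at byte 40
    secs, frac = struct.unpack("!II", data[40:48])
    return secs - NTP_EPOCH_OFFSET + frac / 2**32

offset = sntp_time() - time.time()
print(f"local clock offset vs. server: {offset:+.3f} s")
```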

  15. The reliable solution and computation time of variable parameters Logistic model

    OpenAIRE

    Pengfei, Wang; Xinnong, Pan

    2016-01-01

    The reliable computation time (RCT, marked as Tc) when applying a double precision computation of a variable parameters logistic map (VPLM) is studied. First, using the method proposed, the reliable solutions for the logistic map are obtained. Second, for a time-dependent non-stationary parameters VPLM, 10000 samples of reliable experiments are constructed, and the mean Tc is then computed. The results indicate that for each different initial value, the Tcs of the VPLM are generally different...
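
The notion of a reliable computation time can be illustrated by iterating the logistic map in double precision alongside a high-precision reference orbit and recording when the two first disagree. The sketch below uses fixed rather than variable parameters and omits the paper's ensemble machinery; mpmath provides the 100-digit reference:

```python
from mpmath import mp, mpf

def reliable_steps(x0=0.3, r=3.9, tol=1e-6, max_steps=2000):
    """Iterate x -> r*x*(1-x) in float64 and in 100-digit arithmetic;
    return the step at which they first differ by more than tol
    (a rough stand-in for the reliable computation time Tc)."""
    mp.dps = 100
    x_double, x_ref = x0, mpf(x0)
    for n in range(max_steps):
        x_double = r * x_double * (1.0 - x_double)
        x_ref = mpf(r) * x_ref * (1 - x_ref)
        if abs(x_double - float(x_ref)) > tol:
            return n + 1
    return max_steps

print("double precision diverges from the reference after",
      reliable_steps(), "iterations")
```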

  16. Optical computation based on nonlinear total reflectional optical ...

    Indian Academy of Sciences (India)

    Optical computing; beam splitter; optical switch; polarized beams. ... main research direction called quantum information and quantum computation is .... above has several advantages: Firstly, it is easy to be integrated with appropriate.

  17. Optical computation based on nonlinear total reflectional optical ...

    Indian Academy of Sciences (India)

    2School of Education Science, South China Normal University, Guangzhou, 510631, China. *Corresponding ... Before the computation, all the inputs are prepared in the polarization state. The key .... The all-optical computing system described.

  18. Investigating the influence of eating habits, body weight and television programme preferences on television viewing time and domestic computer usage.

    Science.gov (United States)

    Raptou, Elena; Papastefanou, Georgios; Mattas, Konstadinos

    2017-01-01

    The present study explored the influence of eating habits, body weight and television programme preferences on television viewing time and domestic computer usage, after adjusting for sociodemographic characteristics and home media environment indicators. In addition, potential substitution or complementarity in screen time was investigated. Individual-level data were collected via questionnaires administered to a random sample of 2,946 Germans. The econometric analysis employed a seemingly unrelated bivariate ordered probit model to conjointly estimate television viewing time and time engaged in domestic computer usage. Television viewing and domestic computer usage represent two independent behaviours in both genders and across all age groups. Dietary habits have a significant impact on television watching, with less healthy food choices associated with increasing television viewing time. Body weight is positively correlated with television screen time in both men and women, and overweight individuals have a higher propensity for heavy television viewing. Similar results were obtained across age groups, where an increasing body mass index (BMI) in adults over 24 years old is more likely to be positively associated with a longer duration of television watching. With respect to the dietary habits of domestic computer users, participants of both genders aged over 24 years seem to adopt healthier dietary patterns. A downward trend in the BMI of domestic computer users was observed in women and in adults aged 25-60 years. On the contrary, young domestic computer users 18-24 years old have a higher body weight than non-users. Television programme preferences also affect television screen time, with clear differences observed between genders and across different age groups. In order to reduce total screen time, health interventions should target different types of screen viewing audiences separately.

  19. Soft Real-Time PID Control on a VME Computer

    Science.gov (United States)

    Karayan, Vahag; Sander, Stanley; Cageao, Richard

    2007-01-01

    microPID (uPID) is a computer program for real-time proportional + integral + derivative (PID) control of a translation stage in a Fourier-transform ultraviolet spectrometer. microPID implements a PID control loop over a position profile at a sampling rate of 8 kHz (sampling period 125 microseconds). The software runs in a stripped-down Linux operating system on a VersaModule Eurocard (VME) computer operating with real-time priority, using an embedded controller, a 16-bit digital-to-analog converter (D/A) board, and a laser-positioning board (LPB). microPID consists of three main parts: (1) VME device-driver routines, (2) software that administers a custom protocol for serial communication with a control computer, and (3) a loop section that obtains the current position from an LPB-driver routine, calculates the ideal position from the profile, and calculates a new voltage command by use of an embedded PID routine, all within each sampling period. The voltage command is sent to the D/A board to control the stage. microPID uses special kernel headers to obtain microsecond timing resolution. Inasmuch as microPID implements a single-threaded process and all other processes are disabled, the Linux operating system acts as a soft real-time system.
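
The embedded PID routine at the heart of such a loop is compact: at each sampling period it combines the error, its running integral and its finite-difference derivative, then clamps the command to the D/A range. A minimal sketch with made-up gains, limits and a toy stage response, not microPID's actual parameters:

```python
def make_pid(kp, ki, kd, dt=125e-6, u_min=-10.0, u_max=10.0):
    """Discrete PID controller with output clamping, run once per
    fixed sampling period dt."""
    integral = 0.0
    prev_error = None

    def step(setpoint, measured):
        nonlocal integral, prev_error
        error = setpoint - measured
        integral += error * dt
        derivative = 0.0 if prev_error is None else (error - prev_error) / dt
        prev_error = error
        u = kp * error + ki * integral + kd * derivative
        return max(u_min, min(u_max, u))  # clamp to D/A range

    return step

pid = make_pid(kp=2.0, ki=50.0, kd=1e-4)
position = 0.0
for _ in range(5):
    command = pid(setpoint=1.0, measured=position)
    position += 0.05 * command  # toy first-order stage response
    print(f"command={command:+.3f} V, position={position:.3f}")
```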

  20. Total knee arthroplasty with a computer-navigated saw: a pilot study.

    Science.gov (United States)

    Garvin, Kevin L; Barrera, Andres; Mahoney, Craig R; Hartman, Curtis W; Haider, Hani

    2013-01-01

    Computer-aided surgery aims to improve implant alignment in TKA but has only been adopted by a minority for routine use. A novel approach, navigated freehand bone cutting (NFC), is intended to achieve wider acceptance by eliminating the need for cumbersome, implant-specific mechanical jigs and avoiding the expense of navigation. We determined cutting time, surface quality, implant fit, and implant alignment after NFC of synthetic femoral specimens and the feasibility and alignment of a complete TKA performed with NFC technology in cadaveric specimens. Seven surgeons prepared six synthetic femoral specimens each, using our custom NFC system. Cutting times, quality of bone cuts, and implant fit and alignment were assessed quantitatively by CT surface scanning and computational measurements. Additionally, a single surgeon performed a complete TKA on two cadaveric specimens using the NFC system, with cutting time and implant alignment analyzed through plain radiographs and CT. For the synthetic specimens, femoral coronal alignment was within ± 2° of neutral in 94% of the specimens. Sagittal alignment was within 0° to 5° of flexion in all specimens. Rotation was within ± 1° of the epicondylar axis in 97% of the specimens. The mean time to make cuts improved from 13 minutes for the first specimen to 9 minutes for the fourth specimen. TKA was performed in two cadaveric specimens without complications and implants were well aligned. TKA is feasible with NFC, which eliminates the need for implant-specific instruments. We observed a fast learning curve. NFC has the potential to improve TKA alignment, reduce operative time, and reduce the number of instruments in surgery. Fewer instruments and less sterilization could reduce costs associated with TKA.

  1. Exact and Heuristic Solutions to Minimize Total Waiting Time in the Blood Products Distribution Problem

    Directory of Open Access Journals (Sweden)

    Amir Salehipour

    2012-01-01

    Full Text Available This paper presents a novel application of operations research to support decision making in blood distribution management. The rapidly and dynamically increasing demand, the criticality of the product, storage, handling and distribution requirements, and the different geographical locations of hospitals and medical centers have made blood distribution a complex and important problem. In this study, a real blood distribution problem involving 24 hospitals was tackled by the authors, and an exact approach is presented. The objective is to distribute blood and its products among hospitals and medical centers such that the total waiting time of those requiring the product is minimized. Building on the exact solution, a hybrid heuristic algorithm is proposed. Computational experiments showed that optimal solutions could be obtained for medium-size instances, while for larger instances the proposed hybrid heuristic is very competitive.

  2. Surgical time and complications of total transvaginal (total-NOTES, single-port laparoscopic-assisted and conventional ovariohysterectomy in bitches

    Directory of Open Access Journals (Sweden)

    M.A.M. Silva

    2015-06-01

    Full Text Available The recently developed minimally invasive techniques of ovariohysterectomy (OVH) have been studied in dogs in order to optimize their benefits and decrease risks to the patients. The purpose of this study was to compare surgical time, complications and technical difficulties of transvaginal total-NOTES, single-port laparoscopic-assisted and conventional OVH in bitches. Twelve bitches were submitted to total-NOTES (NOTES group), while 13 underwent single-port laparoscopic-assisted OVH (SPLA group) and 15 were submitted to conventional OVH (OPEN group). The intra-operative period was divided into 7 stages: (1) access to the abdominal cavity; (2) pneumoperitoneum; (3) approach to the right ovarian pedicle; (4) approach to the left ovarian pedicle; (5) approach to the uterine body; (6) abdominal or vaginal synthesis, performed in 6 out of 12 patients of the NOTES group; (7) inoperative time. Overall and per-stage operative times, intra- and postoperative complications and technical difficulties were compared among groups. Mean overall surgical time in the NOTES (25.7±6.8 minutes) and SPLA (23.1±4.0 minutes) groups was shorter than in the OPEN group (34.0±6.4 minutes) (P<0.05). The intraoperative stage that required the longest time was the approach to the uterine body in the NOTES group and abdominal and cutaneous sutures in the OPEN group. There was no difference regarding the rates of complications. Major complications included postoperative bleeding requiring reoperation in a bitch in the OPEN group, while minor complications included mild vaginal discharge in four patients in the NOTES group and seroma in three bitches in the SPLA group. In conclusion, total-NOTES and SPLA OVH were less time-consuming than conventional OVH in bitches. All techniques presented complications, which were properly managed.

  3. Computing return times or return periods with rare event algorithms

    Science.gov (United States)

    Lestang, Thibault; Ragone, Francesco; Bréhier, Charles-Edouard; Herbert, Corentin; Bouchet, Freddy

    2018-04-01

    The average time between two occurrences of the same event, referred to as its return time (or return period), is a useful statistical concept for practical applications. For instance, insurers or public agencies may be interested in the return time of a 10 m flood of the Seine river in Paris. However, due to their scarcity, reliably estimating return times for rare events is very difficult using either observational data or direct numerical simulations. For rare events, an estimator for return times can be built from the extrema of the observable on trajectory blocks. Here, we show that this estimator can be improved to remain accurate for return times of the order of the block size. More importantly, we show that this approach can be generalised to estimate return times from numerical algorithms specifically designed to sample rare events. So far, those algorithms have often computed probabilities rather than return times. The approach we propose provides a computationally extremely efficient way to estimate numerically the return times of rare events for a dynamical system, gaining several orders of magnitude of computational costs. We illustrate the method on two kinds of observables, instantaneous and time-averaged, using two different rare event algorithms, for a simple stochastic process, the Ornstein–Uhlenbeck process. As an example of realistic applications to complex systems, we finally discuss extreme values of the drag on an object in a turbulent flow.
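
The block-maxima estimator described above is short to state: divide a trajectory of duration N·T into N blocks of duration T, count the fraction m/N of blocks whose maximum exceeds the level a, and (assuming roughly Poisson exceedances) estimate r(a) = -T / ln(1 - m/N). A sketch under these assumptions, applied to a simulated Ornstein–Uhlenbeck trajectory with parameters chosen purely for illustration:

```python
import numpy as np

def return_time(trajectory, threshold, dt, block_len):
    """Block-maxima estimator: with m of N blocks of duration
    T = block_len * dt exceeding the level a, r(a) = -T / log(1 - m/N)."""
    n_blocks = len(trajectory) // block_len
    blocks = trajectory[:n_blocks * block_len].reshape(n_blocks, block_len)
    frac = np.mean(blocks.max(axis=1) > threshold)
    if frac in (0.0, 1.0):
        raise ValueError("threshold outside the range resolved by the data")
    return -block_len * dt / np.log(1.0 - frac)

# Toy Ornstein-Uhlenbeck trajectory: dx = -x dt + sqrt(2) dW (stationary std 1)
rng = np.random.default_rng(1)
dt, n = 0.01, 1_000_000
x = np.zeros(n)
kicks = rng.standard_normal(n - 1) * np.sqrt(2 * dt)
for i in range(n - 1):
    x[i + 1] = x[i] * (1 - dt) + kicks[i]
print(f"estimated return time of x > 2: {return_time(x, 2.0, dt, 5_000):.1f}")
```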

  4. A user's manual for a computer program which calculates time-optimal geocentric transfers using solar or nuclear electric and high thrust propulsion

    Science.gov (United States)

    Sackett, L. L.; Edelbaum, T. N.; Malchow, H. L.

    1974-01-01

    This manual is a guide for using a computer program which calculates time-optimal trajectories for high- and low-thrust geocentric transfers. Either SEP or NEP may be assumed, and a one- or two-impulse, fixed total delta-V, initial high-thrust phase may be included. A single impulse of specified delta-V may also be included after the low-thrust phase. The low-thrust phase utilizes equinoctial orbital elements to avoid the classical singularities, and Kryloff-Boguliuboff averaging to help ensure shorter computation times. The program is written in FORTRAN 4 in double precision for use on an IBM 360 computer. The manual includes a description of the problem treated, input/output information, examples of runs, and source code listings.

  5. Near real-time digital holographic microscope based on GPU parallel computing

    Science.gov (United States)

    Zhu, Gang; Zhao, Zhixiong; Wang, Huarui; Yang, Yan

    2018-01-01

    A transmission near real-time digital holographic microscope with in-line and off-axis light paths is presented, in which parallel computing technology based on the compute unified device architecture (CUDA) and digital holographic microscopy are combined. Compared to other holographic microscopes, which have to implement reconstruction in multiple focal planes and are therefore time-consuming, the reconstruction speed of the near real-time digital holographic microscope can be greatly improved with CUDA-based parallel computing, so it is especially suitable for measurements of particle fields at micrometer and nanometer scales. Simulations and experiments show that the proposed transmission digital holographic microscope can accurately measure and display the velocity of a particle field at micrometer scale, with an average velocity error lower than 10%. With graphics processing units (GPUs), the computing time for 100 reconstruction planes (512×512 grids) is below 120 ms, compared with 4.9 s using the traditional CPU-based reconstruction method; the reconstruction speed is thus raised by a factor of about 40. In other words, the system can handle holograms at 8.3 frames per second, realizing near real-time measurement and display of the particle velocity field. Real-time three-dimensional reconstruction of the particle velocity field is expected to be achieved by further optimization of software and hardware.
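
Per-plane numerical refocusing of the kind described, here via the angular spectrum method, is dominated by a forward and an inverse 2D FFT per plane, which is why 100 independent planes parallelise so well on a GPU. A minimal NumPy sketch under illustrative optical parameters (not the authors' CUDA implementation; CuPy's FFT API mirrors NumPy's, so on a GPU cupy can typically be swapped in):

```python
import numpy as np

def angular_spectrum(hologram, wavelength, dx, z):
    """Reconstruct one focal plane at distance z from a hologram
    sampled with pixel pitch dx, via the angular spectrum method."""
    ny, nx = hologram.shape
    fx = np.fft.fftfreq(nx, d=dx)
    fy = np.fft.fftfreq(ny, d=dx)
    fx2, fy2 = np.meshgrid(fx**2, fy**2, sparse=True)
    arg = 1.0 / wavelength**2 - fx2 - fy2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    transfer = np.where(arg > 0, np.exp(1j * kz * z), 0.0)  # drop evanescent waves
    return np.fft.ifft2(np.fft.fft2(hologram) * transfer)

# 100 planes on a 512x512 grid, as in the timing figures above;
# wavelength and pixel pitch are illustrative values.
holo = np.random.rand(512, 512)  # stand-in for a recorded hologram
planes = [angular_spectrum(holo, 0.633e-6, 3.45e-6, z)
          for z in np.linspace(1e-3, 5e-3, 100)]
```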

  6. Timing of Re-Transfusion Drain Removal Following Total Knee Replacement

    Science.gov (United States)

    Leeman, MF; Costa, ML; Costello, E; Edwards, D

    2006-01-01

    INTRODUCTION The use of postoperative drains following total knee replacement (TKR) has recently been modified by the use of re-transfusion drains. The aim of our study was to investigate the optimal time for removal of re-transfusion drains following TKR. PATIENTS AND METHODS The medical records of 66 patients who had a TKR performed between October 2003 and October 2004 were reviewed; blood drained before 6 h and the total volume of blood drained was recorded. RESULTS A total of 56 patients had complete records of postoperative drainage. The mean volume of blood collected in the drain in the first 6 h was 442 ml. The mean total volume of blood in the drain was 595 ml. Therefore, of the blood drained, 78% was available for transfusion. CONCLUSION Re-transfusion drains should be removed after 6 h, when no further re-transfusion is permissible. PMID:16551400

  7. Time reversibility, computer simulation, algorithms, chaos

    CERN Document Server

    Hoover, William Graham

    2012-01-01

    A small army of physicists, chemists, mathematicians, and engineers has joined forces to attack a classic problem, the "reversibility paradox", with modern tools. This book describes their work from the perspective of computer simulation, emphasizing the author's approach to the problem of understanding the compatibility, and even inevitability, of the irreversible second law of thermodynamics with an underlying time-reversible mechanics. Computer simulation has made it possible to probe reversibility from a variety of directions and "chaos theory" or "nonlinear dynamics" has supplied a useful vocabulary and a set of concepts, which allow a fuller explanation of irreversibility than that available to Boltzmann or to Green, Kubo and Onsager. Clear illustration of concepts is emphasized throughout, and reinforced with a glossary of technical terms from the specialized fields which have been combined here to focus on a common theme. The book begins with a discussion, contrasting the idealized reversibility of ba...

  8. Total sleep time severely drops during adolescence.

    Directory of Open Access Journals (Sweden)

    Damien Leger

    Full Text Available UNLABELLED: Restricted sleep duration among young adults and adolescents has been shown to increase the risk of morbidities such as obesity, diabetes or accidents. However, there are few epidemiological studies of normal total sleep time (TST) in representative groups of teenagers that would provide normative data. PURPOSE: To explore perceived total sleep time on schooldays (TSTS) and non-schooldays (TSTN) and the prevalence of sleep-initiating insomnia among a nationally representative sample of teenagers. METHODS: Data from 9,251 children aged 11 to 15 years old, 50.7% of which were boys, were analyzed as part of the cross-national 2011 HBSC study. Self-completion questionnaires were administered in classrooms. Estimates of TSTS and TSTN (week-ends and vacations) were calculated based on a specifically designed sleep habits report. Sleep deprivation was defined as a TSTN - TSTS difference >2 hours. Sleep-initiating insomnia was assessed according to the International Classification of Sleep Disorders (ICSD 2). Children who reported sleeping 7 hours or less per night were considered short sleepers. RESULTS: A serious drop of TST was observed between 11 yo and 15 yo, both during schooldays (9 hours 26 minutes vs. 7 h 55 min.; p<0.001) and, to a lesser extent, during week-ends (10 h 17 min. vs. 9 h 44 min.; p<0.001). Sleep deprivation concerned 16.0% of children aged 11 yo vs. 40.5% of those aged 15 yo (p<0.001). Too-short sleep was reported by 2.6% of the 11 yo vs. 24.6% of the 15 yo (p<0.001). CONCLUSION: Despite the obvious need for sleep in adolescence, TST drastically decreases with age among children from 11 to 15 yo, which creates a significant sleep debt increasing with age.

  9. Computation of reactor control rod drop time under accident conditions

    International Nuclear Information System (INIS)

    Dou Yikang; Yao Weida; Yang Renan; Jiang Nanyan

    1998-01-01

    The computation of reactor control rod drop time under accident conditions mainly involves establishing forced vibration equations for the components of the control rod drive line under the action of external forces, together with an equation of motion for the control rod moving in the vertical direction. The two kinds of equations are coupled by considering the impact effects between the control rod and its surrounding components. The finite difference method is adopted to discretize the vibration equations, and the Wilson-θ method is applied to solve the time-history problem. The nonlinearity caused by impact is treated iteratively with a modified Newton method. Experimental results are used to validate the validity and reliability of the computational method. Theoretical and experimental test problems show that the computer program based on this method is applicable and reliable. The program can act as an effective tool for design-by-analysis and safety analysis of the relevant components.
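
The Wilson-θ scheme mentioned above extends the linear-acceleration assumption over an enlarged step θΔt (θ ≥ 1.37 for unconditional stability) and then interpolates back to t + Δt. A textbook single-degree-of-freedom sketch of the scheme, not the cited program, which handles multi-component systems and impact:

```python
import numpy as np

def wilson_theta(m, c, k, f, dt, theta=1.4, u0=0.0, v0=0.0):
    """Wilson-theta time integration for m*u'' + c*u' + k*u = f(t),
    given the load f sampled at the time steps."""
    n = len(f)
    u = np.zeros(n); v = np.zeros(n); a = np.zeros(n)
    u[0], v[0] = u0, v0
    a[0] = (f[0] - c * v0 - k * u0) / m
    tau = theta * dt
    keff = k + 6 * m / tau**2 + 3 * c / tau
    for i in range(n - 1):
        df = f[i] + theta * (f[i + 1] - f[i])  # load extrapolated to t + theta*dt
        feff = (df + m * (6 * u[i] / tau**2 + 6 * v[i] / tau + 2 * a[i])
                   + c * (3 * u[i] / tau + 2 * v[i] + tau * a[i] / 2))
        u_ext = feff / keff  # displacement at t + theta*dt
        a[i + 1] = (6 * (u_ext - u[i]) / (theta * tau**2)
                    - 6 * v[i] / (theta * tau) + (1 - 3 / theta) * a[i])
        v[i + 1] = v[i] + dt * (a[i] + a[i + 1]) / 2
        u[i + 1] = u[i] + dt * v[i] + dt**2 * (2 * a[i] + a[i + 1]) / 6
    return u, v, a
```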

  10. Real-time fusion of coronary CT angiography with X-ray fluoroscopy during chronic total occlusion PCI

    Energy Technology Data Exchange (ETDEWEB)

    Ghoshhajra, Brian B.; Takx, Richard A.P. [Harvard Medical School, Cardiac MR PET CT Program, Massachusetts General Hospital, Department of Radiology and Division of Cardiology, Boston, MA (United States); Stone, Luke L.; Yeh, Robert W.; Jaffer, Farouc A. [Harvard Medical School, Cardiac Cathetrization Laboratory, Cardiology Division, Massachusetts General Hospital, Boston, MA (United States); Girard, Erin E. [Siemens Healthcare, Princeton, NJ (United States); Brilakis, Emmanouil S. [Cardiology Division, Dallas VA Medical Center and UT Southwestern Medical Center, Dallas, TX (United States); Lombardi, William L. [University of Washington, Cardiology Division, Seattle, WA (United States)

    2017-06-15

    The purpose of this study was to demonstrate the feasibility of real-time fusion of coronary computed tomography angiography (CTA) centreline and arterial wall calcification with X-ray fluoroscopy during chronic total occlusion (CTO) percutaneous coronary intervention (PCI). Patients undergoing CTO PCI were prospectively enrolled. Pre-procedural CT scans were integrated with conventional coronary fluoroscopy using prototype software. We enrolled 24 patients who underwent CTO PCI using the prototype CT fusion software, and 24 consecutive CTO PCI patients without CT guidance served as a control group. Mean age was 66 ± 11 years, and 43/48 patients were men. Real-time CTA fusion during CTO PCI provided additional information regarding coronary arterial calcification and tortuosity that generated new insights into antegrade wiring, antegrade dissection/reentry, and retrograde wiring during CTO PCI. Overall CTO success rates and procedural outcomes remained similar between the two groups, despite a trend toward higher complexity in the fusion CTA group. This study demonstrates that real-time automated co-registration of coronary CTA centreline and calcification onto live fluoroscopic images is feasible and provides new insights into CTO PCI, and in particular, antegrade dissection reentry-based CTO PCI. (orig.)

  11. Real-Time Accumulative Computation Motion Detectors

    Directory of Open Access Journals (Sweden)

    Saturnino Maldonado-Bascón

    2009-12-01

    Full Text Available The neurally inspired accumulative computation (AC) method and its application to motion detection have been introduced in past years. This paper revisits the fact that many researchers have explored the relationship between neural networks and finite state machines. Indeed, finite state machines constitute the best characterized computational model, whereas artificial neural networks have become a very successful tool for modeling and problem solving. The article shows how to reach real-time performance after describing the model as a finite state machine. This paper introduces two steps towards that direction: (a) a simplification of the general AC method is performed by formally transforming it into a finite state machine; (b) a hardware implementation in FPGA of such a designed AC module, as well as of an 8-AC motion detector, is presented, providing promising performance results. We also offer two case studies of the use of AC motion detectors in surveillance applications, namely infrared-based people segmentation and color-based people tracking, respectively.
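
In its simplest per-pixel form, accumulative computation maintains a "permanence" or charge value for each pixel: charge to a maximum where the frame changes, discharge by a fixed step where it does not, so recent motion leaves a decaying trace. Because each pixel only steps between a finite set of charge levels, this is naturally a finite state machine, which is what makes the FPGA mapping above direct. A sketch with illustrative constants, not the paper's values:

```python
import numpy as np

# Illustrative charge bounds, discharge step and frame-difference threshold.
V_MAX, V_MIN, DISCHARGE, DIFF_THRESH = 255, 0, 32, 18

def ac_update(charge, prev_frame, frame):
    """One accumulative-computation step on a pair of grayscale frames."""
    moving = np.abs(frame.astype(int) - prev_frame.astype(int)) > DIFF_THRESH
    return np.where(moving, V_MAX, np.maximum(charge - DISCHARGE, V_MIN))

charge = np.zeros((240, 320), dtype=int)
prev = np.zeros((240, 320), dtype=np.uint8)
for frame in (np.random.randint(0, 256, (240, 320), dtype=np.uint8)
              for _ in range(5)):  # stand-in for a video stream
    charge = ac_update(charge, prev, frame)
    prev = frame
```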

  12. Working time intervals and total work time on nursing positions in Poland

    Directory of Open Access Journals (Sweden)

    Danuta Kunecka

    2015-06-01

    Full Text Available Background: For the last few years, the topic of overwork in nursing positions has given rise to strong discussion. The author set herself the goal of answering the question of whether this reflects real overwork in this particular profession or rather the commonly assumed frustration of this professional group. The aim of this paper is to analyze working time in chosen nursing positions in relation to the time used as intervals in the course of standard professional activities during one working day. Material and Methods: The research material consisted of documentation of working time in chosen nursing workplaces, compiled between 2007–2012 within the framework of a nursing course at the Pomeranian Medical University in Szczecin. A photograph of a working day was used as the method of measurement. Measurements were performed in institutions located in 6 voivodeships in Poland. Results: The results suggest that only 6.5% of the surveyed representatives of the nursing profession spend the proper amount of time (meaning the time set by the applicable standards) on work intervals during a working day. Conclusions: The scale of the phenomenon indicates excessive workload in nursing positions, which over a longer period of time, with longer working hours, may decrease the efficiency of work and cause a drop in the quality of provided services. Med Pr 2015;66(2):165–172

  13. Neural Computations in a Dynamical System with Multiple Time Scales.

    Science.gov (United States)

    Mi, Yuanyuan; Lin, Xiaohan; Wu, Si

    2016-01-01

    Neural systems display rich short-term dynamics at various levels, e.g., spike-frequency adaptation (SFA) at the single-neuron level, and short-term facilitation (STF) and depression (STD) at the synapse level. These dynamical features typically cover a broad range of time scales and exhibit large diversity in different brain regions. It remains unclear what is the computational benefit for the brain to have such variability in short-term dynamics. In this study, we propose that the brain can exploit such dynamical features to implement multiple seemingly contradictory computations in a single neural circuit. To demonstrate this idea, we use continuous attractor neural network (CANN) as a working model and include STF, SFA and STD with increasing time constants in its dynamics. Three computational tasks are considered, which are persistent activity, adaptation, and anticipative tracking. These tasks require conflicting neural mechanisms, and hence cannot be implemented by a single dynamical feature or any combination with similar time constants. However, with properly coordinated STF, SFA and STD, we show that the network is able to implement the three computational tasks concurrently. We hope this study will shed light on the understanding of how the brain orchestrates its rich dynamics at various levels to realize diverse cognitive functions.

  14. 12 CFR 516.10 - How does OTS compute time periods under this part?

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 5 2010-01-01 2010-01-01 false How does OTS compute time periods under this part? 516.10 Section 516.10 Banks and Banking OFFICE OF THRIFT SUPERVISION, DEPARTMENT OF THE TREASURY APPLICATION PROCESSING PROCEDURES § 516.10 How does OTS compute time periods under this part? In computing...

  15. The position of a standard optical computer mouse affects cardiorespiratory responses during the operation of a computer under time constraints

    Directory of Open Access Journals (Sweden)

    Shunji Sako

    2014-08-01

    Full Text Available Objectives: This study investigated the association between task-induced stress and fatigue by examining the cardiovascular responses of subjects using different mouse positions while operating a computer under time constraints. Material and Methods: The study involved 16 young, healthy men and examined the use of optical mouse devices attached to laptop computers. Two mouse positions were investigated: (1) the distal position (DP), in which the subjects place their forearms on the desk accompanied by the abduction and flexion of their shoulder joints, and (2) the proximal position (PP), in which the subjects place only their wrists on the desk without using an armrest. The subjects continued each task for 16 min. We assessed differences in several characteristics according to mouse position, including expired gas values, autonomic nerve activities (based on cardiorespiratory responses), operating efficiencies (based on word counts), and fatigue levels (based on the visual analog scale, VAS). Results: Oxygen consumption (VO2), the ratio of inspiration time to respiration time (Ti/Ttotal), respiratory rate (RR), minute ventilation (VE), and the ratio of expiration to inspiration (Te/Ti) were significantly lower when the participants performed the task in the DP than in the PP. Tidal volume (VT), carbon dioxide output rates (VCO2/VE), and oxygen extraction fractions (VO2/VE) were significantly higher for the DP than for the PP. No significant difference in VAS was observed between the positions; however, as the task progressed, autonomic nerve activities were lower and operating efficiencies were significantly higher for the DP than for the PP. Conclusions: Our results suggest that the DP has fewer effects on cardiorespiratory functions, causes lower levels of sympathetic nerve activity and mental stress, and produces a higher total workload than the PP. This suggests that the DP is preferable to the PP when operating a computer under time constraints.

  16. Effect of the MCNP model definition on the computation time

    International Nuclear Information System (INIS)

    Šunka, Michal

    2017-01-01

    The presented work studies the influence of the method used to define the geometry in the MCNP transport code on the computational time, as well as the difficulty of preparing an input file describing the given geometry. Cases using different geometric definitions, including basic 2-dimensional and 3-dimensional objects and their combinations, were studied. The results indicate that an inappropriate definition can increase the computational time by up to 59% (a more realistic case indicates 37%) for the same results and the same statistical uncertainty. (orig.)

  17. Using a Cloud Computing System to Reduce Door-to-Balloon Time in Acute ST-Elevation Myocardial Infarction Patients Transferred for Percutaneous Coronary Intervention.

    Science.gov (United States)

    Ho, Chi-Kung; Chen, Fu-Cheng; Chen, Yung-Lung; Wang, Hui-Ting; Lee, Chien-Ho; Chung, Wen-Jung; Lin, Cheng-Jui; Hsueh, Shu-Kai; Hung, Shin-Chiang; Wu, Kuan-Han; Liu, Chu-Feng; Kung, Chia-Te; Cheng, Cheng-I

    2017-01-01

    This study evaluated the impact on clinical outcomes of using a cloud computing system to reduce the percutaneous coronary intervention hospital door-to-balloon (DTB) time for ST segment elevation myocardial infarction (STEMI). A total of 369 patients before and after implementation of the transfer protocol were enrolled. Of these patients, 262 were transferred through the protocol while the other 107 patients were transferred through the traditional referral process. There were no significant differences in DTB time, pain to door of STEMI receiving center arrival time, and pain to balloon time between the two groups. Pain to electrocardiography time in patients with Killip I/II and catheterization laboratory to balloon time in patients with Killip III/IV were significantly reduced in the protocol group compared with the traditional referral group. However, implementation of the cloud computing system in our present protocol did not reduce DTB time.

  18. Differentiation of acute total occlusion of coronary artery from chronic total occlusion in coronary computed tomography angiography

    International Nuclear Information System (INIS)

    Kwag, Hyon Joo

    2012-01-01

    To compare the features of coronary computed tomography angiography (CCTA) imaging of the patients with acute total occlusion (ATO) of coronary artery with those of chronic total occlusion (CTO). CCTA of 26 patients with complete interruption of the coronary artery in CCTA and occlusion in conventional coronary angiography, were retrospectively analyzed. Discrimination between the ATO group (n = 11, patients with non ST elevation myocardial infarction or unstable angina) and the CTO group (n = 15, patients with stable angina or nonspecific symptom) was arbitrarily determined by clinical diagnosis. Lesion length, remodeling index (RI), plaque density measured by Hounsfield units (HU), plaque composition, percentage attenuation drop across the lesion, and presence of myocardial thinning were evaluated. Comparisons between the ATO and CTO groups revealed significantly shorter lesion length in the ATO group (0.40 cm vs. 1.87 cm, respectively; p = 0.001), and significantly higher RI (1.56 vs. 1.10, respectively; p = 0.004). Plaque density of the ATO group was lower (37.0 HU vs. 104.7 HU, respectively; p < 0.001) and non calcified plaque was frequently seen in the ATO group (72.7% vs. 26.7%, respectively; p = 0.02). Percentage attenuation drop across the lesion was lower for the ATO group (10.92% vs. 25.44%, respectively; p = 0.005). Myocardial thinning was exclusively observed in the CTO group (seven of 15 patients, p = 0.01). CCTA shows various statistically significant differences between the ATO and CTO groups

  20. Scalable space-time adaptive simulation tools for computational electrocardiology

    OpenAIRE

    Krause, Dorian; Krause, Rolf

    2013-01-01

    This work is concerned with the development of computational tools for the solution of reaction-diffusion equations from the field of computational electrocardiology. We designed lightweight spatially and space-time adaptive schemes for large-scale parallel simulations. We propose two different adaptive schemes based on locally structured meshes, managed either via a conforming coarse tessellation or a forest of shallow trees. A crucial ingredient of our approach is a non-conforming morta...

  1. NNSA's Computing Strategy, Acquisition Plan, and Basis for Computing Time Allocation

    Energy Technology Data Exchange (ETDEWEB)

    Nikkel, D J

    2009-07-21

    This report is in response to the Omnibus Appropriations Act, 2009 (H.R. 1105; Public Law 111-8) in its funding of the National Nuclear Security Administration's (NNSA) Advanced Simulation and Computing (ASC) Program. This bill called for a report on ASC's plans for computing and platform acquisition strategy in support of stockpile stewardship. Computer simulation is essential to the stewardship of the nation's nuclear stockpile. Annual certification of the country's stockpile systems, Significant Finding Investigations (SFIs), and execution of Life Extension Programs (LEPs) are dependent on simulations employing the advanced ASC tools developed over the past decade plus; indeed, without these tools, certification would not be possible without a return to nuclear testing. ASC is an integrated program involving investments in computer hardware (platforms and computing centers), software environments, integrated design codes and physical models for these codes, and validation methodologies. The significant progress ASC has made in the past derives from its focus on mission and from its strategy of balancing support across the key investment areas necessary for success. All these investment areas must be sustained for ASC to adequately support current stockpile stewardship mission needs and to meet ever more difficult challenges as the weapons continue to age or undergo refurbishment. The appropriations bill called for this report to address three specific issues, which are responded to briefly here but are expanded upon in the subsequent document: (1) Identify how computing capability at each of the labs will specifically contribute to stockpile stewardship goals, and on what basis computing time will be allocated to achieve the goal of a balanced program among the labs. (2) Explain the NNSA's acquisition strategy for capacity and capability of machines at each of the labs and how it will fit within the existing budget constraints. (3

  2. Control bandwidth improvements in GRAVITY fringe tracker by switching to a synchronous real time computer architecture

    Science.gov (United States)

    Abuter, Roberto; Dembet, Roderick; Lacour, Sylvestre; di Lieto, Nicola; Woillez, Julien; Eisenhauer, Frank; Fedou, Pierre; Phan Duc, Than

    2016-08-01

    The new VLTI (Very Large Telescope Interferometer) instrument GRAVITY is equipped with a fringe tracker able to stabilize the K-band fringes on six baselines at the same time. It has been designed to achieve a performance for average seeing conditions of a residual OPD (Optical Path Difference) lower than 300 nm with objects brighter than K = 10. The control loop implementing the tracking is composed of a four-stage real time computer system comprising: a sensor, where the detector pixels are read in and the OPD and GD (Group Delay) are calculated; a controller, receiving the computed sensor quantities and producing commands for the piezo actuators; a concentrator, which combines the OPD commands with the real time tip/tilt corrections and offloads them to the piezo actuator; and finally a Kalman parameter estimator. This last stage is used to monitor current measurements over a window of a few seconds and estimate new values for the main Kalman control loop parameters. The hardware and software implementation of this design runs asynchronously and connects the four computers for data transfer via the Reflective Memory Network. With the purpose of improving the performance of the GRAVITY fringe tracking control loop, a deviation from the standard asynchronous communication mechanism has been proposed and implemented. This new scheme operates the four independent real time computers involved in the tracking loop synchronously, using the Reflective Memory Interrupts as the coordination signal. This synchronous mechanism had the effect of reducing the total pure delay of the loop from 3.5 [ms] to 2.0 [ms], which then translates into better stabilization of the fringes, as the bandwidth of the system is substantially improved. This paper will explain in detail the real time architecture of the fringe tracker in both its asynchronous and synchronous implementations. The achieved improvements on reducing the delay via this mechanism will be

  3. LHC Computing Grid Project Launches into Action with International Support. A thousand times more computing power by 2006

    CERN Multimedia

    2001-01-01

    The first phase of the LHC Computing Grid project was approved at an extraordinary meeting of the Council on 20 September 2001. CERN is preparing for the unprecedented avalanche of data that will be produced by the Large Hadron Collider experiments. A thousand times more computer power will be needed by 2006! CERN's need for a dramatic advance in computing capacity is urgent. As from 2006, the four giant detectors observing trillions of elementary particle collisions at the LHC will accumulate over ten million Gigabytes of data, equivalent to the contents of about 20 million CD-ROMs, each year of its operation. A thousand times more computing power will be needed than is available to CERN today. The strategy the collaborations have adopted to analyse and store this unprecedented amount of data is the coordinated deployment of Grid technologies at hundreds of institutes which will be able to search out and analyse information from an interconnected worldwide grid of tens of thousands of computers and storag...

  4. Fast Megavoltage Computed Tomography: A Rapid Imaging Method for Total Body or Marrow Irradiation in Helical Tomotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Magome, Taiki [Department of Radiological Sciences, Faculty of Health Sciences, Komazawa University, Tokyo (Japan); Department of Radiology, The University of Tokyo Hospital, Tokyo (Japan); Masonic Cancer Center, University of Minnesota, Minneapolis, Minnesota (United States); Haga, Akihiro [Department of Radiology, The University of Tokyo Hospital, Tokyo (Japan); Takahashi, Yutaka [Masonic Cancer Center, University of Minnesota, Minneapolis, Minnesota (United States); Department of Radiation Oncology, Osaka University, Osaka (Japan); Nakagawa, Keiichi [Department of Radiology, The University of Tokyo Hospital, Tokyo (Japan); Dusenbery, Kathryn E. [Department of Therapeutic Radiology, University of Minnesota, Minneapolis, Minnesota (United States); Hui, Susanta K., E-mail: shui@coh.org [Masonic Cancer Center, University of Minnesota, Minneapolis, Minnesota (United States); Department of Therapeutic Radiology, University of Minnesota, Minneapolis, Minnesota (United States); Department of Radiation Oncology and Beckman Research Institute, City of Hope, Duarte, California (United States)

    2016-11-01

    Purpose: Megavoltage computed tomographic (MVCT) imaging has been widely used for the 3-dimensional (3-D) setup of patients treated with helical tomotherapy (HT). One drawback of MVCT is its very long imaging time, the result of slow couch speeds of approximately 1 mm/s, which can be difficult for the patient to tolerate. We sought to develop an MVCT imaging method allowing faster couch speeds and to assess its accuracy for image guidance for HT. Methods and Materials: Three cadavers were scanned 4 times with couch speeds of 1, 2, 3, and 4 mm/s. The resulting MVCT images were reconstructed using an iterative reconstruction (IR) algorithm with a penalty term of total variation and with a conventional filtered back projection (FBP) algorithm. The MVCT images were registered with kilovoltage CT images, and the registration errors from the 2 reconstruction algorithms were compared. This fast MVCT imaging was tested in 3 cases of total marrow irradiation as a clinical trial. Results: The 3-D registration errors of the MVCT images reconstructed with the IR algorithm were smaller than the errors of images reconstructed with the FBP algorithm at fast couch speeds (2, 3, 4 mm/s). The scan time and imaging dose at a speed of 4 mm/s were reduced to 30% of those from a conventional coarse mode scan. For the patient imaging, faster MVCT (3 mm/s couch speed) scanning reduced the imaging time and still generated images useful for anatomic registration. Conclusions: Fast MVCT with the IR algorithm is clinically feasible for large 3-D target localization, which may reduce the overall time for the treatment procedure. This technique may also be useful for calculating daily dose distributions or organ motion analyses in HT treatment over a wide area. Automated integration of this imaging is at least needed to further assess its clinical benefits.

  5. Real-time brain computer interface using imaginary movements

    DEFF Research Database (Denmark)

    El-Madani, Ahmad; Sørensen, Helge Bjarup Dissing; Kjær, Troels W.

    2015-01-01

    Background: Brain Computer Interface (BCI) is the method of transforming mental thoughts and imagination into actions. A real-time BCI system can improve the quality of life of patients with severe neuromuscular disorders by enabling them to communicate with the outside world. In this paper...

  6. 17 CFR 402.2c - Appendix C-Consolidated computations of liquid capital and total haircuts for certain...

    Science.gov (United States)

    2010-04-01

    ... SECURITIES EXCHANGE ACT OF 1934 FINANCIAL RESPONSIBILITY § 402.2c Appendix C—Consolidated computations of liquid capital and total haircuts for certain subsidiaries and affiliates. (a) Consolidation. (1) A...) Principles of consolidation. The following minimum and non-exclusive requirements shall govern the...

  7. Contrast timing in computed tomography: Effect of different contrast media concentrations on bolus geometry

    International Nuclear Information System (INIS)

    Mahnken, Andreas H.; Jost, Gregor; Seidensticker, Peter; Kuhl, Christiane; Pietsch, Hubertus

    2012-01-01

    Objective: To assess the effect of low-osmolar, monomeric contrast media with different iodine concentrations on bolus shape in aortic CT angiography. Materials and methods: Repeated sequential computed tomography scanning of the descending aorta of eight beagle dogs (5 male, 12.7 ± 3.1 kg) was performed without table movement with a standardized CT scan protocol. Iopromide 300 (300 mg I/mL), iopromide 370 (370 mg I/mL) and iomeprol 400 (400 mg I/mL) were administered via a foreleg vein with an identical iodine delivery rate of 1.2 g I/s and a total iodine dose of 300 mg I/kg body weight. Time-enhancement curves were computed and analyzed. Results: Iopromide 300 showed the highest peak enhancement (445.2 ± 89.1 HU), steepest up-slope (104.2 ± 17.5 HU/s) and smallest full width at half maximum (FWHM; 5.8 ± 1.0 s). Peak enhancement, duration of FWHM, enhancement at FWHM and up-slope differed significantly between iopromide 300 and iomeprol 400 (p < 0.05). Conclusions: Low viscous iopromide 300 results in a better defined bolus with a significantly higher peak enhancement, steeper up-slope and smaller FWHM when compared to iomeprol 400. These characteristics potentially affect contrast timing.

  8. Protocol for concomitant temporomandibular joint custom-fitted total joint reconstruction and orthognathic surgery utilizing computer-assisted surgical simulation.

    Science.gov (United States)

    Movahed, Reza; Teschke, Marcus; Wolford, Larry M

    2013-12-01

    Clinicians who address temporomandibular joint (TMJ) pathology and dentofacial deformities surgically can perform the surgery in 1 stage or 2 separate stages. The 2-stage approach requires the patient to undergo 2 separate operations and anesthesia, significantly prolonging the overall treatment. However, performing concomitant TMJ and orthognathic surgery (CTOS) in these cases requires careful treatment planning and surgical proficiency in the 2 surgical areas. This article presents a new treatment protocol for the application of computer-assisted surgical simulation in CTOS cases requiring reconstruction with patient-fitted total joint prostheses. The traditional and new CTOS protocols are described and compared. The new CTOS protocol helps decrease the preoperative workup time and increase the accuracy of model surgery. Copyright © 2013 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  9. A prospective study of spine fractures diagnosed by total spine computed tomography in high energy trauma patients

    International Nuclear Information System (INIS)

    Takami, Masanari; Nohda, Kazuhiro; Sakanaka, Junya; Nakamura, Masamichi; Yoshida, Munehito

    2011-01-01

    Since it can be impossible to identify spinal fractures in high-energy trauma patients during the primary trauma evaluation, we have been performing total spine computed tomography (CT) in high-energy trauma cases. We prospectively investigated the spinal fractures detected by total spine CT in 179 cases and evaluated the usefulness of total spine CT. There were 54 (30.2%) spinal fractures among the 179 cases. Six (37.5%) of the 16 cervical spine fractures that were not detected on plain X-ray films were identified by total spine CT. Six (14.0%) of 43 thoracolumbar spine fractures were considered difficult to diagnose based on the clinical findings if total spine CT had not been performed. We therefore concluded that total spine CT is very useful and should be performed during the primary trauma evaluation in high-energy trauma cases. (author)

  10. [Determination of total and segmental colonic transit time in constipated children].

    Science.gov (United States)

    Zhang, Shu-cheng; Wang, Wei-lin; Bai, Yu-zuo; Yuan, Zheng-wei; Wang, Wei

    2003-03-01

    To determine the total and segmental colonic transit time of normal Chinese children and to explore its value in constipation in children. The subjects were divided into 2 groups. One was the control group, comprising 33 healthy children (21 males and 12 females) aged 2 - 13 years (mean 5 years). The other was the constipation group, comprising 25 patients (15 males and 10 females) aged 3 - 14 years (mean 7 years) with constipation according to Benninga's criteria. Written informed consent was obtained from the parents of each subject. In this study the simplified method of radio-opaque markers was used to determine the total gastrointestinal transit time and segmental colonic transit time of the normal and constipated children, and in some of these patients X-ray defecography was also used. The total gastrointestinal transit time (TGITT), right colonic transit time (RCTT), left colonic transit time (LCTT) and rectosigmoid colonic transit time (RSTT) of the normal children were 28.7 +/- 7.7 h, 7.5 +/- 3.2 h, 6.5 +/- 3.8 h and 13.4 +/- 5.6 h, respectively. In the constipated children, the TGITT, LCTT and RSTT were significantly longer than those in controls (92.2 +/- 55.5 h vs 28.7 +/- 7.7 h, P < 0.001; 16.9 +/- 12.6 h vs 6.5 +/- 3.8 h, P < 0.01; 61.5 +/- 29.0 h vs 13.4 +/- 5.6 h, P < 0.001), while the RCTT showed no significant difference. X-ray defecography demonstrated one rectocele, one perineal descent syndrome and one puborectal muscle syndrome. With the segmental colonic transit time, constipation can be divided into four types: slow-transit constipation, outlet obstruction, mixed type and normal transit constipation. X-ray defecography can demonstrate the anatomical or dynamic abnormalities within the anorectal area, with which constipation can be further divided into different subtypes, and

  11. Computational Approach to Profit Optimization of a Loss-Queueing System

    Directory of Open Access Journals (Sweden)

    Dinesh Kumar Yadav

    2010-01-01

    The objective of the paper is to deal with the profit optimization of a loss queueing system with finite capacity. Here, we define and compute the total expected cost (TEC), the total expected revenue (TER) and, consequently, the total optimal profit (TOP) of the system. In order to compute the total optimal profit of the system, a computing algorithm has been developed and a fast-converging Newton-Raphson (N-R) method has been employed, which requires less computing time and memory space than other methods. Sensitivity analysis and its graph-based observations add significant value to this model.
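
    The record does not spell its model out in full here, so the sketch below assumes a hypothetical M/M/1/K loss system: revenue accrues per accepted customer, capacity cost grows linearly, and the capacity maximizing TOP = TER - TEC is located with a Newton-Raphson step on finite-difference derivatives, echoing the abstract's N-R reference. All names and parameter values are illustrative.

      def blocking_prob(k, rho):
          # M/M/1/K loss system: stationary probability that an arrival is blocked
          if abs(rho - 1.0) < 1e-12:
              return 1.0 / (k + 1.0)
          return (1.0 - rho) * rho ** k / (1.0 - rho ** (k + 1))

      def profit(k, lam=5.0, mu=6.0, revenue=10.0, cost=4.0):
          # TOP(k) = TER - TEC for a hypothetical revenue/cost structure
          rho = lam / mu
          ter = revenue * lam * (1.0 - blocking_prob(k, rho))  # accepted customers
          tec = cost * k                                       # capacity cost
          return ter - tec

      def newton_raphson_argmax(f, x0, h=1e-4, tol=1e-8, max_iter=100):
          # maximize f by driving f' to zero, with central finite differences
          x = x0
          for _ in range(max_iter):
              d1 = (f(x + h) - f(x - h)) / (2 * h)
              d2 = (f(x + h) - 2 * f(x) + f(x - h)) / h ** 2
              if abs(d2) < 1e-15:
                  break
              step = d1 / d2
              x -= step
              if abs(step) < tol:
                  break
          return x

      k_star = newton_raphson_argmax(profit, x0=3.0)
      print(f"optimal capacity ~ {k_star:.2f} (round to an integer in practice), "
            f"profit {profit(k_star):.2f}")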

  12. Sorting on STAR. [CDC computer algorithm timing comparison]

    Science.gov (United States)

    Stone, H. S.

    1978-01-01

    Timing comparisons are given for three sorting algorithms written for the CDC STAR computer. One algorithm is Hoare's (1962) Quicksort, which is the fastest or nearly the fastest sorting algorithm for most computers. A second algorithm is a vector version of Quicksort that takes advantage of the STAR's vector operations. The third algorithm is an adaptation of Batcher's (1968) sorting algorithm, which makes especially good use of vector operations but has a complexity of N(log N)-squared as compared with a complexity of N log N for the Quicksort algorithms. In spite of its worse complexity, Batcher's sorting algorithm is competitive with the serial version of Quicksort for vectors up to the largest that can be treated by STAR. Vector Quicksort outperforms the other two algorithms and is generally preferred. These results indicate that unusual instruction sets can introduce biases in program execution time that counter results predicted by worst-case asymptotic complexity analysis.
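
    As a rough illustration of the tradeoff discussed above, the sketch below (plain Python, not STAR vector code) implements Batcher's odd-even mergesort as a data-independent network of compare-exchange operations: despite the worse N(log N)-squared operation count, every inner loop touches elements at a fixed stride, which is exactly the bulk access pattern a vector machine can exploit.

      import random

      def compare_exchange(a, i, j):
          if a[i] > a[j]:
              a[i], a[j] = a[j], a[i]

      def oddeven_merge(a, lo, n, r):
          # merge two sorted halves of a[lo:lo+n], comparing at stride r
          step = r * 2
          if step < n:
              oddeven_merge(a, lo, n, step)
              oddeven_merge(a, lo + r, n, step)
              for i in range(lo + r, lo + n - r, step):
                  compare_exchange(a, i, i + r)
          else:
              compare_exchange(a, lo, lo + r)

      def batcher_sort(a, lo=0, n=None):
          # n must be a power of two
          if n is None:
              n = len(a)
          if n > 1:
              m = n // 2
              batcher_sort(a, lo, m)
              batcher_sort(a, lo + m, m)
              oddeven_merge(a, lo, n, 1)

      data = [random.random() for _ in range(1 << 10)]
      batcher_sort(data)
      assert data == sorted(data)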

  13. Variation in computer time with geometry prescription in Monte Carlo code KENO-IV

    International Nuclear Information System (INIS)

    Gopalakrishnan, C.R.

    1988-01-01

    In most studies, the Monte Carlo criticality code KENO-IV has been compared with other Monte Carlo codes, but evaluation of its performance with different box descriptions has not been done so far. In Monte Carlo computations, any fractional savings of computing time is highly desirable. Variation in computation time with box description in KENO for two different fast reactor fuel subassemblies, of FBTR and PFBR, is studied. The k_eff of an infinite array of fuel subassemblies is calculated by modelling the subassemblies in two different ways: (i) multi-region, (ii) multi-box. In addition to these two cases, excess reactivity calculations of FBTR are also performed in two ways to study this effect in a complex geometry. It is observed that the k_eff values calculated by the multi-region and multi-box models agree very well. However, the increase in computation time from the multi-box to the multi-region model is considerable, while the difference in computer storage requirements for the two models is negligible. This variation in computing time arises from the way the neutron is tracked in the two cases. (author)

  14. SLMRACE: a noise-free RACE implementation with reduced computational time

    Science.gov (United States)

    Chauvin, Juliet; Provenzi, Edoardo

    2017-05-01

    We present a faster and noise-free implementation of the RACE algorithm. RACE has mixed characteristics between the famous Retinex model of Land and McCann and the automatic color equalization (ACE) color-correction algorithm. The original random spray-based RACE implementation suffers from two main problems: its computational time and the presence of noise. Here, we will show that it is possible to adapt two techniques recently proposed by Banić et al. to the RACE framework in order to drastically decrease the computational time and noise generation. The implementation will be called smart-light-memory-RACE (SLMRACE).

  15. Efficient Geo-Computational Algorithms for Constructing Space-Time Prisms in Road Networks

    Directory of Open Access Journals (Sweden)

    Hui-Ping Chen

    2016-11-01

    The space-time prism (STP) is a key concept in time geography for analyzing human activity-travel behavior under various space-time constraints. Most existing time-geographic studies use a straightforward algorithm to construct STPs in road networks by using two one-to-all shortest path searches. However, this straightforward algorithm can introduce considerable computational overhead, given the fact that accessible links in a STP are generally a small portion of the whole network. To address this issue, an efficient geo-computational algorithm, called NTP-A*, is proposed. The proposed NTP-A* algorithm employs the A* and branch-and-bound techniques to discard inaccessible links during the two shortest path searches, and thereby improves the STP construction performance. Comprehensive computational experiments are carried out to demonstrate the computational advantage of the proposed algorithm. Several implementation techniques, including the label-correcting technique and the hybrid link-node labeling technique, are discussed and analyzed. Experimental results show that the proposed NTP-A* algorithm can significantly improve STP construction performance in large-scale road networks by a factor of 100, compared with existing algorithms.
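
    For orientation, the sketch below implements the straightforward baseline the abstract describes, not NTP-A* itself: two one-to-all Dijkstra searches (from the origin, and to the destination over the reversed graph) classify a link as accessible whenever the best origin-to-link-to-destination travel time fits within the time budget. The toy network and budget are invented.

      import heapq
      from collections import defaultdict

      def dijkstra(adj, src):
          # one-to-all earliest travel times over a weighted digraph {u: [(v, w)]}
          dist = defaultdict(lambda: float("inf"))
          dist[src] = 0.0
          pq = [(0.0, src)]
          while pq:
              d, u = heapq.heappop(pq)
              if d > dist[u]:
                  continue
              for v, w in adj[u]:
                  if d + w < dist[v]:
                      dist[v] = d + w
                      heapq.heappush(pq, (dist[v], v))
          return dist

      def space_time_prism(adj, radj, origin, dest, budget):
          # a link (u, v, w) is accessible iff some origin->dest path through it
          # completes within the budget
          t_from = dijkstra(adj, origin)  # origin -> u
          t_to = dijkstra(radj, dest)     # v -> dest, via the reversed graph
          return [(u, v, w) for u in list(adj) for v, w in adj[u]
                  if t_from[u] + w + t_to[v] <= budget]

      edges = [("A", "B", 4), ("B", "C", 3), ("A", "C", 9),
               ("C", "D", 2), ("B", "D", 7)]           # (u, v, minutes)
      adj, radj = defaultdict(list), defaultdict(list)
      for u, v, w in edges:
          adj[u].append((v, w))
          radj[v].append((u, w))
      print(space_time_prism(adj, radj, "A", "D", budget=10))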

  16. An Enhanced Discrete Artificial Bee Colony Algorithm to Minimize the Total Flow Time in Permutation Flow Shop Scheduling with Limited Buffers

    Directory of Open Access Journals (Sweden)

    Guanlong Deng

    2016-01-01

    This paper presents an enhanced discrete artificial bee colony algorithm for minimizing the total flow time in the flow shop scheduling problem with limited buffer capacity. First, solutions are represented as discrete job permutations that convert directly to active schedules. Then, we present a simple and effective scheme called best insertion for the employed and onlooker bees, and introduce a combined local search exploring both the insertion and swap neighborhoods. To validate the performance of the presented algorithm, a computational campaign is carried out on the Taillard benchmark instances. Computations and comparisons show that the proposed algorithm not only solves the benchmark set better than the existing discrete differential evolution algorithm and iterated greedy algorithm, but also performs better than two recently proposed discrete artificial bee colony algorithms.
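
    A minimal sketch of the best-insertion move follows, assuming the standard permutation flow shop recursion for completion times and, for brevity, ignoring the limited-buffer constraint that the paper actually handles; the job data are random stand-ins rather than Taillard instances.

      import random

      def total_flow_time(perm, p):
          # p[j][m]: processing time of job j on machine m; completion times via
          # the usual permutation flow shop recursion (unlimited buffers assumed)
          machines = len(p[0])
          prev = [0.0] * machines
          total = 0.0
          for j in perm:
              c = [0.0] * machines
              c[0] = prev[0] + p[j][0]
              for m in range(1, machines):
                  c[m] = max(c[m - 1], prev[m]) + p[j][m]
              prev = c
              total += c[-1]
          return total

      def best_insertion(perm, p):
          # remove a random job, re-insert it wherever total flow time is lowest
          perm = perm[:]
          j = perm.pop(random.randrange(len(perm)))
          best = None
          for i in range(len(perm) + 1):
              cand = perm[:i] + [j] + perm[i:]
              score = total_flow_time(cand, p)
              if best is None or score < best[0]:
                  best = (score, cand)
          return best

      random.seed(1)
      p = [[random.randint(1, 9) for _ in range(3)] for _ in range(6)]  # 6 jobs, 3 machines
      perm = list(range(6))
      for _ in range(20):
          score, perm = best_insertion(perm, p)
      print(score, perm)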

  17. Computational Procedures for a Class of GI/D/k Systems in Discrete Time

    Directory of Open Access Journals (Sweden)

    Md. Mostafizur Rahman

    2009-01-01

    A class of discrete time GI/D/k systems is considered for which the interarrival times have finite support and customers are served in first-in first-out (FIFO) order. The system is formulated as a single server queue with new general independent interarrival times and constant service duration by assuming cyclic assignment of customers to the identical servers. The queue length is then set up as a quasi-birth-death (QBD) type Markov chain. It is shown that this transformed GI/D/1 system has special structures which make the computation of the matrix R simple and efficient, thereby reducing the number of multiplications in each iteration significantly. As a result the computation time is kept very low. Moreover, use of the resulting structural properties makes the computation of the distribution of the queue length of the transformed system efficient. The computation of the distribution of waiting time is also shown to be simple by exploiting the special structures.
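
    For context, the rate matrix R of a generic discrete-time QBD with level-up, same-level, and level-down blocks A0, A1, A2 is the minimal nonnegative solution of R = A0 + R A1 + R^2 A2; the sketch below finds it by plain successive substitution on small illustrative blocks. The paper's contribution is precisely that its special structure permits something cheaper than this generic iteration.

      import numpy as np

      def solve_R(A0, A1, A2, tol=1e-12, max_iter=10_000):
          # minimal nonnegative solution of R = A0 + R A1 + R^2 A2,
          # by successive substitution starting from R = 0
          R = np.zeros_like(A0)
          for _ in range(max_iter):
              R_next = A0 + R @ A1 + R @ R @ A2
              if np.max(np.abs(R_next - R)) < tol:
                  return R_next
              R = R_next
          return R

      # illustrative blocks of a positive-recurrent discrete-time QBD
      A0 = np.array([[0.10, 0.05], [0.05, 0.10]])  # level up
      A1 = np.array([[0.30, 0.10], [0.10, 0.30]])  # same level
      A2 = np.array([[0.30, 0.15], [0.15, 0.30]])  # level down
      R = solve_R(A0, A1, A2)
      print(R)
      print("spectral radius:", max(abs(np.linalg.eigvals(R))))  # < 1 => stable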

  18. In-Network Computation is a Dumb Idea Whose Time Has Come

    KAUST Repository

    Sapio, Amedeo; Abdelaziz, Ibrahim; Aldilaijan, Abdulla; Canini, Marco; Kalnis, Panos

    2017-01-01

    Programmable data plane hardware creates new opportunities for infusing intelligence into the network. This raises a fundamental question: what kinds of computation should be delegated to the network? In this paper, we discuss the opportunities and challenges for co-designing data center distributed systems with their network layer. We believe that the time has finally come for offloading part of their computation to execute in-network. However, in-network computation tasks must be judiciously crafted to match the limitations of the network machine architecture of programmable devices. With the help of our experiments on machine learning and graph analytics workloads, we identify that aggregation functions raise opportunities to exploit the limited computation power of networking hardware to lessen network congestion and improve the overall application performance. Moreover, as a proof-of-concept, we propose DAIET, a system that performs in-network data aggregation. Experimental results with an initial prototype show a large data reduction ratio (86.9%-89.3%) and a similar decrease in the workers' computation time.
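
    As a back-of-the-envelope illustration of the aggregation argument (not DAIET's actual protocol or switch pipeline), the sketch below compares the bytes a parameter server receives when an in-network element folds each worker's gradient vector into a running sum against the send-everything baseline; with 8 workers the reduction is 87.5%, the same ballpark as the 86.9%-89.3% reported above.

      import numpy as np

      N_WORKERS, DIM = 8, 100_000
      rng = np.random.default_rng(0)
      grads = [rng.standard_normal(DIM) for _ in range(N_WORKERS)]

      # baseline: the server receives every worker's full vector
      bytes_baseline = N_WORKERS * DIM * 8

      # in-network aggregation: the 'switch' folds each arriving vector into a
      # running sum and forwards a single result vector to the server
      aggregate = np.zeros(DIM)
      for g in grads:
          aggregate += g
      bytes_aggregated = DIM * 8

      print(f"data reduction: {1 - bytes_aggregated / bytes_baseline:.1%}")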

  20. Reduced computational cost in the calculation of worst case response time for real time systems

    OpenAIRE

    Urriza, José M.; Schorb, Lucas; Orozco, Javier D.; Cayssials, Ricardo

    2009-01-01

    Modern Real Time Operating Systems require reducing computational costs even though the microprocessors become more powerful each day. It is usual that Real Time Operating Systems for embedded systems have advanced features to administer the resources of the applications that they support. In order to guarantee either the schedulability of the system or the schedulability of a new task in a dynamic Real Time System, it is necessary to know the Worst Case Response Time of the Real Time tasks ...

  1. Real-time fusion of coronary CT angiography with x-ray fluoroscopy during chronic total occlusion PCI.

    Science.gov (United States)

    Ghoshhajra, Brian B; Takx, Richard A P; Stone, Luke L; Girard, Erin E; Brilakis, Emmanouil S; Lombardi, William L; Yeh, Robert W; Jaffer, Farouc A

    2017-06-01

    The purpose of this study was to demonstrate the feasibility of real-time fusion of coronary computed tomography angiography (CTA) centreline and arterial wall calcification with x-ray fluoroscopy during chronic total occlusion (CTO) percutaneous coronary intervention (PCI). Patients undergoing CTO PCI were prospectively enrolled. Pre-procedural CT scans were integrated with conventional coronary fluoroscopy using prototype software. We enrolled 24 patients who underwent CTO PCI using the prototype CT fusion software, and 24 consecutive CTO PCI patients without CT guidance served as a control group. Mean age was 66 ± 11 years, and 43/48 patients were men. Real-time CTA fusion during CTO PCI provided additional information regarding coronary arterial calcification and tortuosity that generated new insights into antegrade wiring, antegrade dissection/reentry, and retrograde wiring during CTO PCI. Overall CTO success rates and procedural outcomes remained similar between the two groups, despite a trend toward higher complexity in the fusion CTA group. This study demonstrates that real-time automated co-registration of coronary CTA centreline and calcification onto live fluoroscopic images is feasible and provides new insights into CTO PCI, and in particular, antegrade dissection/reentry-based CTO PCI. • Real-time semi-automated fusion of CTA/fluoroscopy is feasible during CTO PCI. • CTA fusion data can be toggled on/off as desired during CTO PCI. • Real-time CT calcium and centreline overlay could benefit antegrade dissection/reentry-based CTO PCI.

  2. Multiscale Methods, Parallel Computation, and Neural Networks for Real-Time Computer Vision.

    Science.gov (United States)

    Battiti, Roberto

    1990-01-01

    This thesis presents new algorithms for low and intermediate level computer vision. The guiding ideas in the presented approach are those of hierarchical and adaptive processing, concurrent computation, and supervised learning. Processing of the visual data at different resolutions is used not only to reduce the amount of computation necessary to reach the fixed point, but also to produce a more accurate estimation of the desired parameters. The presented adaptive multiple scale technique is applied to the problem of motion field estimation. Different parts of the image are analyzed at a resolution that is chosen in order to minimize the error in the coefficients of the differential equations to be solved. Tests with video-acquired images show that velocity estimation is more accurate over a wide range of motion with respect to the homogeneous scheme. In some cases introduction of explicit discontinuities coupled to the continuous variables can be used to avoid propagation of visual information from areas corresponding to objects with different physical and/or kinematic properties. The human visual system uses concurrent computation in order to process the vast amount of visual data in "real-time." Although with different technological constraints, parallel computation can be used efficiently for computer vision. All the presented algorithms have been implemented on medium grain distributed memory multicomputers with a speed-up approximately proportional to the number of processors used. A simple two-dimensional domain decomposition assigns regions of the multiresolution pyramid to the different processors. The inter-processor communication needed during the solution process is proportional to the linear dimension of the assigned domain, so that efficiency is close to 100% if a large region is assigned to each processor. Finally, learning algorithms are shown to be a viable technique to engineer computer vision systems for different applications starting from

  3. Computer Games and Instruction

    Science.gov (United States)

    Tobias, Sigmund, Ed.; Fletcher, J. D., Ed.

    2011-01-01

    There is intense interest in computer games. A total of 65 percent of all American households play computer games, and sales of such games increased 22.9 percent last year. The average amount of game playing time was found to be 13.2 hours per week. The popularity and market success of games is evident from both the increased earnings from games,…

  4. A strategy for reducing turnaround time in design optimization using a distributed computer system

    Science.gov (United States)

    Young, Katherine C.; Padula, Sharon L.; Rogers, James L.

    1988-01-01

    There is a need to explore methods for reducing the lengthy computer turnaround or clock time associated with engineering design problems. Different strategies can be employed to reduce this turnaround time. One strategy is to run validated analysis software on a network of existing smaller computers so that portions of the computation can be done in parallel. This paper focuses on the implementation of this method using two types of problems. The first type is a traditional structural design optimization problem, which is characterized by a simple data flow and a complicated analysis. The second type of problem uses an existing computer program designed to study multilevel optimization techniques. This problem is characterized by complicated data flow and a simple analysis. The paper shows that distributed computing can be a viable means for reducing computational turnaround time for engineering design problems that lend themselves to decomposition. Parallel computing can be accomplished with a minimal cost in terms of hardware and software.
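
    The strategy lends itself to a very small sketch: farm independent analysis runs out to worker processes so design evaluations overlap in wall-clock time. Here run_analysis is an invented placeholder for validated analysis software, and the sleep stands in for its cost; on one machine the process pool plays the role of the network of smaller computers.

      import math
      import time
      from concurrent.futures import ProcessPoolExecutor

      def run_analysis(design):
          # placeholder for one expensive, independent analysis of a candidate design
          time.sleep(0.1)
          return sum(math.sin(x) ** 2 for x in design)

      def evaluate_in_parallel(designs, workers=4):
          with ProcessPoolExecutor(max_workers=workers) as pool:
              return list(pool.map(run_analysis, designs))

      if __name__ == "__main__":
          designs = [[i, i + 1, i + 2] for i in range(16)]
          t0 = time.perf_counter()
          scores = evaluate_in_parallel(designs)
          print(f"16 analyses in {time.perf_counter() - t0:.2f}s "
                f"(~4x over running them serially)")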

  5. Computing the Maximum Detour of a Plane Graph in Subquadratic Time

    DEFF Research Database (Denmark)

    Wulff-Nilsen, Christian

    Let G be a plane graph where each edge is a line segment. We consider the problem of computing the maximum detour of G, defined as the maximum over all pairs of distinct points p and q of G of the ratio between the distance between p and q in G and the distance |pq|. The fastest known algorithm for this problem has O(n^2) running time. We show how to obtain O(n^{3/2}*(log n)^3) expected running time. We also show that if G has bounded treewidth, its maximum detour can be computed in O(n*(log n)^3) expected time.
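
    To make the quantity concrete, the sketch below brute-forces a vertex-restricted detour: one Dijkstra per vertex and an O(n^2) scan over vertex pairs, using edge lengths equal to Euclidean distances. The true maximum detour ranges over all points of G, including edge interiors, which is part of what makes the subquadratic bound nontrivial; the toy coordinates are invented.

      import heapq
      import math

      def dijkstra(adj, src):
          dist = {src: 0.0}
          pq = [(0.0, src)]
          while pq:
              d, u = heapq.heappop(pq)
              if d > dist.get(u, math.inf):
                  continue
              for v, w in adj[u]:
                  nd = d + w
                  if nd < dist.get(v, math.inf):
                      dist[v] = nd
                      heapq.heappush(pq, (nd, v))
          return dist

      def max_detour(points, edges):
          adj = {i: [] for i in range(len(points))}
          for u, v in edges:
              w = math.dist(points[u], points[v])   # straight-line edge length
              adj[u].append((v, w))
              adj[v].append((u, w))
          best = 1.0
          for s in range(len(points)):              # n Dijkstra runs
              dist = dijkstra(adj, s)
              for t in range(s + 1, len(points)):   # O(n^2) pair scan
                  best = max(best, dist[t] / math.dist(points[s], points[t]))
          return best

      pts = [(0, 0), (1, 0), (1, 1), (0, 1)]
      print(max_detour(pts, [(0, 1), (1, 2), (2, 3)]))  # path graph: detour 3.0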

  6. Applications of parallel computer architectures to the real-time simulation of nuclear power systems

    International Nuclear Information System (INIS)

    Doster, J.M.; Sills, E.D.

    1988-01-01

    In this paper the authors report on efforts to utilize parallel computer architectures for the thermal-hydraulic simulation of nuclear power systems and current research efforts toward the development of advanced reactor operator aids and control systems based on this new technology. Many aspects of reactor thermal-hydraulic calculations are inherently parallel, and the computationally intensive portions of these calculations can be effectively implemented on modern computers. Timing studies indicate faster-than-real-time, high-fidelity physics models can be developed when the computational algorithms are designed to take advantage of the computer's architecture. These capabilities allow for the development of novel control systems and advanced reactor operator aids. Coupled with an integral real-time data acquisition system, evolving parallel computer architectures can provide operators and control room designers improved control and protection capabilities. Current research efforts are currently under way in this area

  7. Challenges in reducing the computational time of QSTS simulations for distribution system analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Deboever, Jeremiah [Georgia Inst. of Technology, Atlanta, GA (United States); Zhang, Xiaochen [Georgia Inst. of Technology, Atlanta, GA (United States); Reno, Matthew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Broderick, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grijalva, Santiago [Georgia Inst. of Technology, Atlanta, GA (United States); Therrien, Francis [CME International T&D, St. Bruno, QC (Canada)

    2017-06-01

    The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which could take conventional computers a computational time of 10 to 120 hours when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.

  8. Objectively measured total and occupational sedentary time in three work settings

    NARCIS (Netherlands)

    Dommelen, P. van; Coffeng, J. K.; Ploeg, H.P. van der; Beek, A.J. van der; Boot, C.R.; Hendriksen, I.J.

    2016-01-01

    Background. Sedentary behaviour increases the risk for morbidity. Our primary aim is to determine the proportion and factors associated with objectively measured total and occupational sedentary time in three work settings. Secondary aim is to study the proportion of physical activity and prolonged

  9. Computationally determining the salience of decision points for real-time wayfinding support

    Directory of Open Access Journals (Sweden)

    Makoto Takemiya

    2012-06-01

    This study introduces the concept of computational salience to explain the discriminatory efficacy of decision points, which in turn may have applications to providing real-time assistance to users of navigational aids. This research compared algorithms for calculating the computational salience of decision points and validated the results via three methods: high-salience decision points were used to classify wayfinders; salience scores were used to weight a conditional probabilistic scoring function for real-time wayfinder performance classification; and salience scores were correlated with wayfinding-performance metrics. As an exploratory step to linking computational and cognitive salience, a photograph-recognition experiment was conducted. Results reveal a distinction between algorithms useful for determining computational and cognitive saliences. For computational salience, information about the structural integration of decision points is effective, while information about the probability of decision-point traversal shows promise for determining cognitive salience. Limitations from only using structural information and motivations for future work that include non-structural information are elicited.

  10. Computation of transit times using the milestoning method with applications to polymer translocation

    Science.gov (United States)

    Hawk, Alexander T.; Konda, Sai Sriharsha M.; Makarov, Dmitrii E.

    2013-08-01

    Milestoning is an efficient approximation for computing long-time kinetics and thermodynamics of large molecular systems, which are inaccessible to brute-force molecular dynamics simulations. A common use of milestoning is to compute the mean first passage time (MFPT) for a conformational transition of interest. However, the MFPT is not always the experimentally observed timescale. In particular, the duration of the transition path, or the mean transit time, can be measured in single-molecule experiments, such as studies of polymers translocating through pores and fluorescence resonance energy transfer studies of protein folding. Here we show how to use milestoning to compute transit times and illustrate our approach by applying it to the translocation of a polymer through a narrow pore.
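
    For contrast with the transit times targeted above, here is the generic mean-first-passage-time bookkeeping that milestoning commonly feeds (the assumed textbook form, not the paper's specific scheme): given a kernel K of milestone-to-milestone transition probabilities and mean lifetimes t_bar, the MFPTs tau to an absorbing milestone solve (I - K) tau = t_bar. The numbers below are invented.

      import numpy as np

      K = np.array([
          [0.0, 1.0, 0.0, 0.0],
          [0.5, 0.0, 0.5, 0.0],
          [0.0, 0.5, 0.0, 0.5],
          [0.0, 0.0, 0.0, 0.0],   # milestone 3 is absorbing
      ])
      t_bar = np.array([1.0, 2.0, 2.0, 0.0])  # mean lifetime at each milestone

      # tau_i = t_bar_i + sum_j K_ij tau_j  =>  (I - K) tau = t_bar
      tau = np.linalg.solve(np.eye(4) - K, t_bar)
      print("MFPT from milestone 0 to 3:", tau[0])   # 15.0 for these numbers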

  11. A heterogeneous hierarchical architecture for real-time computing

    Energy Technology Data Exchange (ETDEWEB)

    Skroch, D.A.; Fornaro, R.J.

    1988-12-01

    The need for high-speed data acquisition and control algorithms has prompted continued research in the area of multiprocessor systems and related programming techniques. The result presented here is a unique hardware and software architecture for high-speed real-time computer systems. The implementation of a prototype of this architecture has required the integration of architecture, operating systems and programming languages into a cohesive unit. This report describes a Heterogeneous Hierarchical Architecture for Real-Time (H²ART) and system software for program loading and interprocessor communication.

  12. Modern EMC analysis I time-domain computational schemes

    CERN Document Server

    Kantartzis, Nikolaos V

    2008-01-01

    The objective of this two-volume book is the systematic and comprehensive description of the most competitive time-domain computational methods for the efficient modeling and accurate solution of contemporary real-world EMC problems. Intended to be self-contained, it presents all well-known algorithms in detail, elucidating their merits and weaknesses, and accompanies the theoretical content with a variety of applications. Outlining the present volume, the analysis covers the theory of the finite-difference time-domain, the transmission-line matrix/modeling, and the finite i

  13. Parallel algorithm of real-time infrared image restoration based on total variation theory

    Science.gov (United States)

    Zhu, Ran; Li, Miao; Long, Yunli; Zeng, Yaoyuan; An, Wei

    2015-10-01

    Image restoration is a necessary preprocessing step for infrared remote sensing applications. Traditional methods remove the noise but penalize too heavily the gradients corresponding to edges. Image restoration techniques based on variational approaches can solve this over-smoothing problem thanks to their well-defined mathematical modeling of the restoration procedure. The total variation (TV) of the infrared image is introduced as an L1 regularization term added to the objective energy functional. This converts the restoration process into an optimization problem over a functional involving a fidelity term to the image data plus a regularization term. Infrared image restoration with the TV-L1 model makes full use of the acquired remote sensing data and preserves information at edges caused by clouds. The numerical implementation algorithm is presented in detail. Analysis indicates that the structure of this algorithm can be easily parallelized. Therefore a parallel implementation of the TV-L1 filter based on a multicore architecture with shared memory is proposed for infrared real-time remote sensing systems. Massive computation of image data is performed in parallel by cooperating threads running simultaneously on multiple cores. Several groups of synthetic infrared image data are used to validate the feasibility and effectiveness of the proposed parallel algorithm. A quantitative analysis measuring restored image quality against the input image is presented. Experiment results show that the TV-L1 filter can restore the varying background image reasonably, and that its performance can achieve the requirement of real-time image processing.
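
    A serial, minimal sketch of one TV-regularized restoration scheme follows, assuming a smoothed TV-L1 energy minimized by plain gradient descent; the paper's exact discretization and its multicore parallelization are not reproduced here, and all parameters are illustrative.

      import numpy as np

      def tv_l1_restore(f, lam=0.5, step=0.1, iters=200, eps=1e-3):
          # minimize sum |u - f|_eps + lam * TV_eps(u), where |x|_eps =
          # sqrt(x^2 + eps^2) smooths both the fidelity and the TV term
          u = f.copy()
          for _ in range(iters):
              g = (u - f) / np.sqrt((u - f) ** 2 + eps ** 2)   # L1 fidelity grad
              ux = np.diff(u, axis=1, append=u[:, -1:])        # forward diffs
              uy = np.diff(u, axis=0, append=u[-1:, :])
              mag = np.sqrt(ux ** 2 + uy ** 2 + eps ** 2)
              px, py = ux / mag, uy / mag
              div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
              u -= step * (g - lam * div)                      # descend the energy
          return u

      rng = np.random.default_rng(0)
      clean = np.zeros((64, 64))
      clean[16:48, 16:48] = 1.0                       # synthetic bright target
      noisy = clean + 0.2 * rng.standard_normal((64, 64))
      restored = tv_l1_restore(noisy)
      print(f"residual std: {np.std(restored - clean):.3f} "
            f"(noisy: {np.std(noisy - clean):.3f})")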

  14. Efficient quantum algorithm for computing n-time correlation functions.

    Science.gov (United States)

    Pedernales, J S; Di Candia, R; Egusquiza, I L; Casanova, J; Solano, E

    2014-07-11

    We propose a method for computing n-time correlation functions of arbitrary spinorial, fermionic, and bosonic operators, consisting of an efficient quantum algorithm that encodes these correlations in an initially added ancillary qubit for probe and control tasks. For spinorial and fermionic systems, the reconstruction of arbitrary n-time correlation functions requires the measurement of two ancilla observables, while for bosonic variables time derivatives of the same observables are needed. Finally, we provide examples applicable to different quantum platforms in the frame of the linear response theory.

  15. Increased Total Anesthetic Time Leads to Higher Rates of Surgical Site Infections in Spinal Fusions.

    Science.gov (United States)

    Puffer, Ross C; Murphy, Meghan; Maloney, Patrick; Kor, Daryl; Nassr, Ahmad; Freedman, Brett; Fogelson, Jeremy; Bydon, Mohamad

    2017-06-01

    A retrospective review of a consecutive series of spinal fusions comparing patient and procedural characteristics of patients who developed surgical site infections (SSIs) after spinal fusion. It is known that increased surgical time (incision to closure) is associated with a higher rate of postoperative SSIs. We sought to determine whether increased total anesthetic time (intubation to extubation) is a factor in the development of SSIs as well. In spine surgery for deformity and degenerative disease, SSI has been associated with operative time, revealing a nearly 10-fold increase in SSI rates in prolonged surgery. Surgical time is associated with infections in other surgical disciplines as well. No studies have reported whether total anesthetic time (intubation to extubation) has an association with SSIs. Surgical records were searched in a retrospective fashion to identify all spine fusion procedures performed between January 2010 and July 2012. All SSIs during that timeframe were recorded and compared with the list of cases performed between 2010 and 2012 in a case-control design. There were 20 (1.7%) SSIs in this fusion cohort. On univariate analyses of operative factors, there was a significant association between infection and both total anesthetic time (infection 7.6 ± 0.5 hrs vs. no infection 6.0 ± 0.1 hrs) and operative time (infection 5.5 ± 0.4 hrs vs. no infection 4.4 ± 0.06 hrs), whereas level of pathology and emergent surgery were not significant. On multivariate logistic analysis, BMI and total anesthetic time remained independent predictors of SSI whereas ASA status and operative time did not. Increasing BMI and total anesthetic time were independent predictors of SSIs in this cohort of over 1000 consecutive spinal fusions. Level of Evidence: 3.

  16. A Modular Environment for Geophysical Inversion and Run-time Autotuning using Heterogeneous Computing Systems

    Science.gov (United States)

    Myre, Joseph M.

    Heterogeneous computing systems have recently come to the forefront of the High-Performance Computing (HPC) community's interest. HPC computer systems that incorporate special purpose accelerators, such as Graphics Processing Units (GPUs), are said to be heterogeneous. Large scale heterogeneous computing systems have consistently ranked highly on the Top500 list since the beginning of the heterogeneous computing trend. By using heterogeneous computing systems that consist of both general purpose processors and special-purpose accelerators, the speed and problem size of many simulations could be dramatically increased. Ultimately this results in enhanced simulation capabilities that allow, in some cases for the first time, the execution of parameter space and uncertainty analyses, model optimizations, and other inverse modeling techniques that are critical for scientific discovery and engineering analysis. However, simplifying the usage and optimization of codes for heterogeneous computing systems remains a challenge. This is particularly true for scientists and engineers for whom understanding HPC architectures and undertaking performance analysis may not be primary research objectives. To enable scientists and engineers to remain focused on their primary research objectives, a modular environment for geophysical inversion and run-time autotuning on heterogeneous computing systems is presented. This environment is composed of three major components: 1) CUSH, a framework for reducing the complexity of programming heterogeneous computer systems; 2) geophysical inversion routines which can be used to characterize physical systems; and 3) run-time autotuning routines designed to determine configurations of heterogeneous computing systems in an attempt to maximize the performance of scientific and engineering codes. Using three case studies, a lattice-Boltzmann method, a non-negative least squares inversion, and a finite-difference fluid flow method, it is shown that

  17. A Non-Linear Digital Computer Model Requiring Short Computation Time for Studies Concerning the Hydrodynamics of the BWR

    Energy Technology Data Exchange (ETDEWEB)

    Reisch, F; Vayssier, G

    1969-05-15

    This non-linear model serves as one of the blocks in a series of codes to study the transient behaviour of BWR or PWR type reactors. This program is intended to be the hydrodynamic part of the BWR core representation or the hydrodynamic part of the PWR heat exchanger secondary side representation. The equations have been prepared for the CSMP digital simulation language. By using the most suitable integration routine available, the ratio of simulation time to real time is about one on an IBM 360/75 digital computer. Use of the slightly different language DSL/40 on an IBM 7044 computer takes about four times longer. The code has been tested against the Eindhoven loop with satisfactory agreement.

  18. Climate Data Provenance Tracking for Just-In-Time Computation

    Science.gov (United States)

    Fries, S.; Nadeau, D.; Doutriaux, C.; Williams, D. N.

    2016-12-01

    The "Climate Data Management System" (CDMS) was created in 1996 as part of the Climate Data Analysis Tools suite of software. It provides a simple interface into a wide variety of climate data formats, and creates NetCDF CF-Compliant files. It leverages the NumPy framework for high performance computation, and is an all-in-one IO and computation package. CDMS has been extended to track manipulations of data, and trace that data all the way to the original raw data. This extension tracks provenance about data, and enables just-in-time (JIT) computation. The provenance for each variable is packaged as part of the variable's metadata, and can be used to validate data processing and computations (by repeating the analysis on the original data). It also allows for an alternate solution for sharing analyzed data; if the bandwidth for a transfer is prohibitively expensive, the provenance serialization can be passed in a much more compact format and the analysis rerun on the input data. Data provenance tracking in CDMS enables far-reaching and impactful functionalities, permitting implementation of many analytical paradigms.

  19. Event Based Simulator for Parallel Computing over the Wide Area Network for Real Time Visualization

    Science.gov (United States)

    Sundararajan, Elankovan; Harwood, Aaron; Kotagiri, Ramamohanarao; Satria Prabuwono, Anton

    As the computational requirements of applications in computational science continue to grow tremendously, the use of computational resources distributed across the Wide Area Network (WAN) becomes advantageous. However, not all applications can be executed over the WAN due to communication overhead that can drastically slow down the computation. In this paper, we introduce an event based simulator to investigate the performance of parallel algorithms executed over the WAN. The event based simulator, known as SIMPAR (SIMulator for PARallel computation), simulates the actual computations and communications involved in parallel computation over the WAN using time stamps. Visualization of real time applications requires a steady stream of processed data. Hence, SIMPAR may prove to be a valuable tool to investigate types of applications and computing resource requirements to provide an uninterrupted flow of processed data for real time visualization purposes. The results obtained from the simulation show concurrence with the expected performance using the L-BSP model.
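
    In the same spirit, though far simpler than SIMPAR, the sketch below drives a round-based parallel computation with time-stamped events and a BSP-style barrier; swapping LAN-like for WAN-like latency and bandwidth immediately shows communication overhead eroding the benefit of parallel execution. All parameters are invented.

      import heapq
      import random

      def simulate(n_workers, rounds, latency, bandwidth, msg_size, seed=0):
          # workers alternate a compute phase and an all-to-all exchange; a
          # round ends when the last time-stamped message lands (cf. BSP)
          rng = random.Random(seed)
          clock = [0.0] * n_workers
          barrier = 0.0
          for _ in range(rounds):
              events = []
              for w in range(n_workers):
                  finish = clock[w] + rng.uniform(0.8, 1.2)           # compute
                  arrival = finish + latency + msg_size / bandwidth   # message lands
                  heapq.heappush(events, (arrival, w))
              while events:                        # drain time-stamped events
                  barrier, _ = heapq.heappop(events)
              clock = [barrier] * n_workers        # superstep barrier
          return barrier

      lan = simulate(8, 10, latency=5e-4, bandwidth=1e9, msg_size=1e6)
      wan = simulate(8, 10, latency=8e-2, bandwidth=1e7, msg_size=1e6)
      print(f"LAN makespan {lan:.2f}s vs WAN {wan:.2f}s")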

  20. An atomic orbital based real-time time-dependent density functional theory for computing electronic circular dichroism band spectra

    Energy Technology Data Exchange (ETDEWEB)

    Goings, Joshua J.; Li, Xiaosong, E-mail: xsli@uw.edu [Department of Chemistry, University of Washington, Seattle, Washington 98195 (United States)

    2016-06-21

    One of the challenges of interpreting electronic circular dichroism (ECD) band spectra is that different states may have different rotatory strength signs, determined by their absolute configuration. If the states are closely spaced and opposite in sign, observed transitions may be washed out by nearby states, unlike absorption spectra where transitions are always positive and additive. To accurately compute ECD bands, it is necessary to compute a large number of excited states, which may be prohibitively costly if one uses the linear-response time-dependent density functional theory (TDDFT) framework. Here we implement a real-time, atomic-orbital based TDDFT method for computing the entire ECD spectrum simultaneously. The method is advantageous for large systems with a high density of states. In contrast to previous implementations based on real-space grids, the method is variational, independent of nuclear orientation, and does not rely on pseudopotential approximations, making it suitable for computation of chiroptical properties well into the X-ray regime.

  1. Balancing Exploration, Uncertainty Representation and Computational Time in Many-Objective Reservoir Policy Optimization

    Science.gov (United States)

    Zatarain-Salazar, J.; Reed, P. M.; Quinn, J.; Giuliani, M.; Castelletti, A.

    2016-12-01

    As we confront the challenges of managing river basin systems with a large number of reservoirs and increasingly uncertain tradeoffs impacting their operations (due to, e.g. climate change, changing energy markets, population pressures, ecosystem services, etc.), evolutionary many-objective direct policy search (EMODPS) solution strategies will need to address the computational demands associated with simulating more uncertainties and therefore optimizing over increasingly noisy objective evaluations. Diagnostic assessments of state-of-the-art many-objective evolutionary algorithms (MOEAs) to support EMODPS have highlighted that search time (or number of function evaluations) and auto-adaptive search are key features for successful optimization. Furthermore, auto-adaptive MOEA search operators are themselves sensitive to having a sufficient number of function evaluations to learn successful strategies for exploring complex spaces and for escaping from local optima when stagnation is detected. Fortunately, recent parallel developments allow coordinated runs that enhance auto-adaptive algorithmic learning and can handle scalable and reliable search with limited wall-clock time, but at the expense of the total number of function evaluations. In this study, we analyze this tradeoff between parallel coordination and depth of search using different parallelization schemes of the Multi-Master Borg on a many-objective stochastic control problem. We also consider the tradeoff between better representing uncertainty in the stochastic optimization, and simplifying this representation to shorten the function evaluation time and allow for greater search. Our analysis focuses on the Lower Susquehanna River Basin (LSRB) system where multiple competing objectives for hydropower production, urban water supply, recreation and environmental flows need to be balanced. Our results provide guidance for balancing exploration, uncertainty, and computational demands when using the EMODPS

  2. Total sleep time, alcohol consumption, and the duration and severity of alcohol hangover

    NARCIS (Netherlands)

    van Schrojenstein Lantman, Marith; Mackus, Marlou; Roth, Thomas; Verster, Joris C|info:eu-repo/dai/nl/241442702

    2017-01-01

    INTRODUCTION: An evening of alcohol consumption often occurs at the expense of sleep time. The aim of this study was to determine the relationship between total sleep time and the duration and severity of the alcohol hangover. METHODS: A survey was conducted among Dutch University students to

  3. The influence of tourniquet use and operative time on the incidence of deep vein thrombosis in total knee arthroplasty.

    Science.gov (United States)

    Hernandez, Arnaldo José; Almeida, Adriano Marques de; Fávaro, Edmar; Sguizzato, Guilherme Turola

    2012-09-01

    To evaluate the association between tourniquet time and total operative time during total knee arthroplasty and the occurrence of deep vein thrombosis. Seventy-eight consecutive patients from our institution underwent cemented total knee arthroplasty for degenerative knee disorders. The pneumatic tourniquet time and total operative time were recorded in minutes. Four categories were established for total tourniquet time (the longest comprising times over 120 minutes), and three categories were defined for operative time (the longest comprising times over 150 minutes). Between 7 and 12 days after surgery, the patients underwent ascending venography to evaluate the presence of distal or proximal deep vein thrombosis. We evaluated the association between the tourniquet time and total operative time and the occurrence of deep vein thrombosis after total knee arthroplasty. In total, 33 cases (42.3%) were positive for deep vein thrombosis; 13 (16.7%) cases involved the proximal type. We found no statistically significant difference in tourniquet time or operative time between patients with or without deep vein thrombosis. We did observe a higher frequency of proximal deep vein thrombosis in patients who underwent surgery lasting longer than 120 minutes. The mean total operative time was also higher in patients with proximal deep vein thrombosis. The tourniquet time did not significantly differ in these patients. We concluded that surgery lasting longer than 120 minutes increases the risk of proximal deep vein thrombosis.

  4. Computed tomography for preoperative planning in total hip arthroplasty: what radiologists need to know

    Energy Technology Data Exchange (ETDEWEB)

    Huppertz, Alexander [Charite - University Hospitals Berlin, Department of Radiology, Berlin (Germany); Imaging Science Institute Charite, Berlin (Germany); Radmer, Sebastian [Proendo, Orthopedic Surgery, Berlin (Germany); Wagner, Moritz; Hamm, Bernd [Charite - University Hospitals Berlin, Department of Radiology, Berlin (Germany); Roessler, Torsten [Klinikum Ernst von Bergmann, Department of Trauma and Orthopedic Surgery, Potsdam (Germany); Sparmann, Martin [Proendo, Orthopedic Surgery, Berlin (Germany); Charite - University Hospital, Berlin (Germany)

    2014-08-15

    The number of total hip arthroplasties is continuously rising. Although less invasive surgical techniques, sophisticated component design, and intraoperative navigation techniques have been introduced, the rate of peri- and postoperative complications, including dislocations, fractures, nerve palsies, and infections, is still a major clinical problem. Better patient outcome, faster recovery and rehabilitation, and shorter operation times therefore remain to be accomplished. A promising strategy is to use minimally invasive techniques in conjunction with modular implants, aimed at independently reconstructing femoral offset and leg length on the basis of highly accurate preoperative planning. Plain radiographs have clear limitations for the correct estimation of hip joint geometry and bone quality. Three-dimensional assessment based on computed tomography (CT) allows optimizing the choice and positions of implants and anticipating difficulties to be encountered during surgery. Postoperative CT is used to monitor operative translation and plays a role in arthroplastic quality management. Radiologists should be familiar with the needs of orthopedic surgeons in terms of CT acquisition, post-processing, and data transfer. The CT protocol should be optimized to enhance image quality and reduce radiation exposure. When dedicated orthopedic CT protocols and state-of-the-art scanner hardware are used, radiation exposure can be decreased to a level just marginally higher than that of conventional preoperative radiography. Surgeons and radiologists should use similar terminology to avoid misunderstanding and inaccuracies in the transfer of preoperative planning. (orig.)

  5. Hard Real-Time Task Scheduling in Cloud Computing Using an Adaptive Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Amjad Mahmood

    2017-04-01

    In the Infrastructure-as-a-Service cloud computing model, virtualized computing resources in the form of virtual machines are provided over the Internet. A user can rent an arbitrary number of computing resources to meet their requirements, making cloud computing an attractive choice for executing real-time tasks. Economical task allocation and scheduling on a set of leased virtual machines is an important problem in the cloud computing environment. This paper proposes a greedy algorithm and a genetic algorithm with adaptive selection of suitable crossover and mutation operations (named AGA) to allocate and schedule real-time tasks with precedence constraints on heterogeneous virtual machines. A comprehensive simulation study has been done to evaluate the performance of the proposed algorithms in terms of their solution quality and efficiency. The simulation results show that AGA outperforms the greedy algorithm and the non-adaptive genetic algorithm in terms of solution quality.
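
    As an illustration of the adaptive idea, each crossover or mutation operator can be drawn with a probability proportional to a running score that is credited whenever the operator improves the incumbent schedule. This is a generic sketch, not the paper's AGA: the operator names, scores, and reward scheme are all assumptions.

    ```python
    import random

    OPERATORS = ["one_point", "two_point", "uniform"]  # hypothetical operator names
    scores = {op: 1.0 for op in OPERATORS}             # optimistic initial scores

    def pick_operator():
        """Roulette-wheel selection over the current operator scores."""
        r = random.uniform(0, sum(scores.values()))
        acc = 0.0
        for op, s in scores.items():
            acc += s
            if r <= acc:
                return op
        return OPERATORS[-1]

    def update_score(op, improved, decay=0.9, bonus=1.0):
        """Decay all scores, then reward the operator that just improved the schedule."""
        for k in scores:
            scores[k] *= decay
        if improved:
            scores[op] += bonus
    ```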

  6. A computer-based time study system for timber harvesting operations

    Science.gov (United States)

    Jingxin Wang; Joe McNeel; John Baumgras

    2003-01-01

    A computer-based time study system was developed for timber harvesting operations. Object-oriented techniques were used to model and design the system. The front-end of the time study system resides on MS Windows CE and the back-end is supported by MS Access. The system consists of three major components: a handheld system, data transfer interface, and data storage...

  7. Continuous-variable quantum computing in optical time-frequency modes using quantum memories.

    Science.gov (United States)

    Humphreys, Peter C; Kolthammer, W Steven; Nunn, Joshua; Barbieri, Marco; Datta, Animesh; Walmsley, Ian A

    2014-09-26

    We develop a scheme for time-frequency encoded continuous-variable cluster-state quantum computing using quantum memories. In particular, we propose a method to produce, manipulate, and measure two-dimensional cluster states in a single spatial mode by exploiting the intrinsic time-frequency selectivity of Raman quantum memories. Time-frequency encoding enables the scheme to be extremely compact, requiring a number of memories that are a linear function of only the number of different frequencies in which the computational state is encoded, independent of its temporal duration. We therefore show that quantum memories can be a powerful component for scalable photonic quantum information processing architectures.

  8. Asymptotic behavior of total times For jobs that must start over if a failure occurs

    DEFF Research Database (Denmark)

    Asmussen, Søren; Fiorini, Pierre; Lipsky, Lester

    the ready queue, or it may restart the task. The behavior of systems under the first two scenarios is well documented, but the third (RESTART) has resisted detailed analysis. In this paper we derive tight asymptotic relations between the distribution of task times without failures and the total time when...... including failures, for any failure distribution. In particular, we show that if the task time distribution has an unbounded support then the total time distribution H is always heavy-tailed. Asymptotic expressions are given for the tail of H in various scenarios. The key ingredients of the analysis

  9. Asymptotic behaviour of total times for jobs that must start over if a failure occurs

    DEFF Research Database (Denmark)

    Asmussen, Søren; Fiorini, Pierre; Lipsky, Lester

    2008-01-01

    the ready queue, or it may restart the task. The behavior of systems under the first two scenarios is well documented, but the third (RESTART) has resisted detailed analysis. In this paper we derive tight asymptotic relations between the distribution of task times without failures and the total time when...... including failures, for any failure distribution. In particular, we show that if the task-time distribution has an unbounded support, then the total-time distribution H is always heavy-tailed. Asymptotic expressions are given for the tail of H in various scenarios. The key ingredients of the analysis
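
    The RESTART effect is easy to reproduce numerically: after every failure all work is lost and the task reruns from scratch. A minimal Monte Carlo sketch, assuming exponential failure arrivals and an exponential task-time distribution (all parameter values are arbitrary):

    ```python
    import random

    random.seed(0)

    def restart_total_time(task_time, failure_rate):
        """Total time under RESTART: after each failure the task is rerun from
        scratch; failure inter-arrival times are exponential(failure_rate)."""
        total = 0.0
        while True:
            time_to_failure = random.expovariate(failure_rate)
            if time_to_failure >= task_time:   # this attempt completes
                return total + task_time
            total += time_to_failure           # work lost; start over

    # With an unbounded task-time distribution (here exponential), the total-time
    # distribution is heavy-tailed: extreme quantiles dwarf the median.
    samples = sorted(restart_total_time(random.expovariate(1.0), 0.5)
                     for _ in range(100_000))
    print("median:", samples[50_000], "99.9th percentile:", samples[99_900])
    ```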

  10. VNAP2: a computer program for computation of two-dimensional, time-dependent, compressible, turbulent flow

    Energy Technology Data Exchange (ETDEWEB)

    Cline, M.C.

    1981-08-01

    VNAP2 is a computer program for calculating turbulent (as well as laminar and inviscid), steady, and unsteady flow. VNAP2 solves the two-dimensional, time-dependent, compressible Navier-Stokes equations. The turbulence is modeled with either an algebraic mixing-length model, a one-equation model, or the Jones-Launder two-equation model. The geometry may be a single- or a dual-flowing stream. The interior grid points are computed using the unsplit MacCormack scheme. Two options to speed up the calculations for high Reynolds number flows are included. The boundary grid points are computed using a reference-plane-characteristic scheme with the viscous terms treated as source functions. An explicit artificial viscosity is included for shock computations. The fluid is assumed to be a perfect gas. The flow boundaries may be arbitrary curved solid walls, inflow/outflow boundaries, or free-jet envelopes. Typical problems that can be solved concern nozzles, inlets, jet-powered afterbodies, airfoils, and free-jet expansions. The accuracy and efficiency of the program are shown by calculations of several inviscid and turbulent flows. The program and its use are described completely, and six sample cases and a code listing are included.

  11. A Matter of Computer Time

    Science.gov (United States)

    Celano, Donna; Neuman, Susan B.

    2010-01-01

    Many low-income children do not have the opportunity to develop the computer skills necessary to succeed in our technological economy. Their only access to computers and the Internet--school, afterschool programs, and community organizations--is woefully inadequate. Educators must work to close this knowledge gap and to ensure that low-income…

  12. A note on computing average state occupation times

    Directory of Open Access Journals (Sweden)

    Jan Beyersmann

    2014-05-01

    Objective: This review discusses how biometricians would probably compute or estimate expected waiting times, if they had the data. Methods: Our framework is a time-inhomogeneous Markov multistate model, where all transition hazards are allowed to be time-varying. We assume that the cumulative transition hazards are given. That is, they are either known, as in a simulation, determined by expert guesses, or obtained via some method of statistical estimation. Our basic tool is product integration, which transforms the transition hazards into the matrix of transition probabilities. Product integration enjoys a rich mathematical theory, which has successfully been used to study probabilistic and statistical aspects of multistate models. Our emphasis will be on practical implementation of product integration, which allows us to numerically approximate the transition probabilities. Average state occupation times and other quantities of interest may then be derived from the transition probabilities.
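
    A minimal sketch of the product-integration recipe for a three-state illness-death model (states 0 = healthy, 1 = ill, 2 = dead) with constant hazards; the hazard values and time grid are assumptions. The transition matrix is accumulated as a product of I + dA factors, and average state occupation times follow by integrating the first row:

    ```python
    import numpy as np

    h01, h02, h12 = 0.10, 0.05, 0.20          # assumed transition hazards per year
    grid = np.linspace(0, 10, 1001)           # time grid over 10 years
    dt = grid[1] - grid[0]

    P = np.eye(3)                             # P(0, 0) = identity
    occupation = np.zeros(3)                  # expected years spent in each state
    for _ in grid[1:]:
        # hazard increments over [t, t+dt); here constant, but they may vary in t
        dA = np.array([[-(h01 + h02) * dt, h01 * dt, h02 * dt],
                       [0.0,              -h12 * dt, h12 * dt],
                       [0.0,               0.0,      0.0]])
        P = P @ (np.eye(3) + dA)              # one product-integral step
        occupation += P[0] * dt               # row 0: started healthy at time 0

    print("P(0, 10):", P[0])                  # transition probabilities from state 0
    print("expected state occupation times:", occupation)  # sums to 10 years
    ```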

  13. Computational electrodynamics the finite-difference time-domain method

    CERN Document Server

    Taflove, Allen

    2005-01-01

    This extensively revised and expanded third edition of the Artech House bestseller, Computational Electrodynamics: The Finite-Difference Time-Domain Method, offers engineers the most up-to-date and definitive resource on this critical method for solving Maxwell's equations. The method helps practitioners design antennas, wireless communications devices, high-speed digital and microwave circuits, and integrated optical devices with unsurpassed efficiency. There has been considerable advancement in FDTD computational technology over the past few years, and the third edition brings professionals the very latest details with entirely new chapters on important techniques, major updates on key topics, and new discussions on emerging areas such as nanophotonics. What's more, to supplement the third edition, the authors have created a Web site with solutions to problems, downloadable graphics and videos, and updates, making this new edition the ideal textbook on the subject as well.
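
    For orientation, the heart of the method is a leapfrog update of E and H on staggered grids. Below is a minimal 1D vacuum sketch in normalized units; it is not taken from the book, and the grid size, Courant number, and source are assumptions:

    ```python
    import numpy as np

    nz, steps = 200, 400
    E = np.zeros(nz)                           # E lives on integer grid points
    H = np.zeros(nz - 1)                       # H lives on half-step points
    c_dt_dx = 0.5                              # Courant number S = c*dt/dx <= 1

    for n in range(steps):
        H += c_dt_dx * (E[1:] - E[:-1])        # update H from the curl of E
        E[1:-1] += c_dt_dx * (H[1:] - H[:-1])  # update E from the curl of H
        E[nz // 4] += np.exp(-((n - 30) / 10.0) ** 2)  # soft Gaussian source

    print("peak |E| after propagation:", np.abs(E).max())
    ```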

  14. Total donor ischemic time: relationship to early hemodynamics and intensive care morbidity in pediatric cardiac transplant recipients.

    Science.gov (United States)

    Rodrigues, Warren; Carr, Michelle; Ridout, Deborah; Carter, Katherine; Hulme, Sara Louise; Simmonds, Jacob; Elliott, Martin; Hoskote, Aparna; Burch, Michael; Brown, Kate L

    2011-11-01

    Single-center studies have failed to link modest increases in total donor ischemic time to mortality after pediatric orthotopic heart transplant. We aimed to investigate whether prolonged total donor ischemic time is linked to pediatric intensive care morbidity after orthotopic heart transplant. Retrospective cohort review. Tertiary pediatric transplant center in the United Kingdom. Ninety-three pediatric orthotopic heart transplants between 2002 and 2006. Total donor ischemic time was investigated for association with early post-orthotopic heart transplant hemodynamics and intensive care unit morbidities. Of 43 males and 50 females with median age 7.2 (interquartile range 2.2, 13.0) yrs, 62 (68%) had dilated cardiomyopathy, 20 (22%) had congenital heart disease, and nine (10%) had restrictive cardiomyopathy. The mean total donor ischemic time was 225.9 (sd 65.6) mins. In the first 24 hrs after orthotopic heart transplant, age-adjusted mean arterial blood pressure increased. Longer total donor ischemic time was significantly associated with lower mean arterial blood pressure, longer duration of ventilation, longer stay in the intensive care unit (p = .004), and longer post-orthotopic heart transplant stay in hospital (p = .02). Total donor ischemic time was not related to levels of mean pulmonary arterial pressure (p = .62), left atrial pressure (p = .38), or central venous pressure (p = .76) early after orthotopic heart transplant. Prolonged total donor ischemic time has an adverse effect on the donor organ, contributing to lower mean arterial blood pressure, as well as more prolonged ventilation and intensive care unit and hospital stays post-orthotopic heart transplant, reflecting increased morbidity.

  15. Real-time data-intensive computing

    Energy Technology Data Exchange (ETDEWEB)

    Parkinson, Dilworth Y., E-mail: dyparkinson@lbl.gov; Chen, Xian; Hexemer, Alexander; MacDowell, Alastair A.; Padmore, Howard A.; Shapiro, David; Tamura, Nobumichi [Advanced Light Source, Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Beattie, Keith; Krishnan, Harinarayan; Patton, Simon J.; Perciano, Talita; Stromsness, Rune; Tull, Craig E.; Ushizima, Daniela [Computational Research Division, Lawrence Berkeley National Laboratory Berkeley CA 94720 (United States); Correa, Joaquin; Deslippe, Jack R. [National Energy Research Scientific Computing Center, Berkeley, CA 94720 (United States); Dart, Eli; Tierney, Brian L. [Energy Sciences Network, Berkeley, CA 94720 (United States); Daurer, Benedikt J.; Maia, Filipe R. N. C. [Uppsala University, Uppsala (Sweden); and others

    2016-07-27

    Today users visit synchrotrons as sources of understanding and discovery—not as sources of just light, and not as sources of data. To achieve this, the synchrotron facilities frequently provide not just light but often the entire end station and increasingly, advanced computational facilities that can reduce terabytes of data into a form that can reveal a new key insight. The Advanced Light Source (ALS) has partnered with high performance computing, fast networking, and applied mathematics groups to create a “super-facility”, giving users simultaneous access to the experimental, computational, and algorithmic resources to make this possible. This combination forms an efficient closed loop, where data—despite its high rate and volume—is transferred and processed immediately and automatically on appropriate computing resources, and results are extracted, visualized, and presented to users or to the experimental control system, both to provide immediate insight and to guide decisions about subsequent experiments during beamtime. We will describe our work at the ALS ptychography, scattering, micro-diffraction, and micro-tomography beamlines.

  16. GRAPHIC, time-sharing magnet design computer programs at Argonne

    International Nuclear Information System (INIS)

    Lari, R.J.

    1974-01-01

    This paper describes three magnet design computer programs in use at the Zero Gradient Synchrotron of Argonne National Laboratory. These programs are used in the time-sharing mode in conjunction with a Tektronix model 4012 graphic display terminal. The first program is called TRIM, the second MAGNET, and the third GFUN. (U.S.)

  17. Computation of quantum electron transport with local current conservation using quantum trajectories

    International Nuclear Information System (INIS)

    Alarcón, A; Oriols, X

    2009-01-01

    A recent proposal for modeling time-dependent quantum electron transport with Coulomb and exchange correlations using quantum (Bohm) trajectories (Oriols 2007 Phys. Rev. Lett. 98 066803) is extended towards the computation of the total (particle plus displacement) current in mesoscopic devices. In particular, two different methods for the practical computation of the total current are compared. The first method computes the particle and the displacement currents from the rate of Bohm particles crossing a particular surface and the time-dependent variations of the electric field there. The second method uses the Ramo–Shockley theorem to compute the total current on that surface from the knowledge of the Bohm particle dynamics in a 3D volume and the time-dependent variations of the electric field on the boundaries of that volume. From a computational point of view, it is shown that both methods achieve local current conservation, but the second is preferred because it is free from 'spurious' peaks. A numerical example, a Bohm trajectory crossing a double-barrier tunneling structure, is presented, supporting the conclusions.

  18. Toward a web-based real-time radiation treatment planning system in a cloud computing environment.

    Science.gov (United States)

    Na, Yong Hum; Suh, Tae-Suk; Kapp, Daniel S; Xing, Lei

    2013-09-21

    To exploit the potential dosimetric advantages of intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT), an in-depth approach is required to provide efficient computing methods. This needs to incorporate clinically related organ specific constraints, Monte Carlo (MC) dose calculations, and large-scale plan optimization. This paper describes our first steps toward a web-based real-time radiation treatment planning system in a cloud computing environment (CCE). The Amazon Elastic Compute Cloud (EC2) with a master node (named m2.xlarge containing 17.1 GB of memory, two virtual cores with 3.25 EC2 Compute Units each, 420 GB of instance storage, 64-bit platform) is used as the backbone of cloud computing for dose calculation and plan optimization. The master node is able to scale the workers on an 'on-demand' basis. MC dose calculation is employed to generate accurate beamlet dose kernels by parallel tasks. The intensity modulation optimization uses total-variation regularization (TVR) and generates piecewise constant fluence maps for each initial beam direction in a distributed manner over the CCE. The optimized fluence maps are segmented into deliverable apertures. The shape of each aperture is iteratively rectified to be a sequence of arcs using the manufacturer's constraints. The output plan file from the EC2 is sent to the simple storage service. Three de-identified clinical cancer treatment plans have been studied for evaluating the performance of the new planning platform with 6 MV flattening filter free beams (40 × 40 cm²) from the Varian TrueBeam™ STx linear accelerator. A CCE leads to speed-ups of up to 14-fold for both dose kernel calculations and plan optimizations in the head and neck, lung, and prostate cancer cases considered in this study. The proposed system relies on a CCE that is able to provide an infrastructure for parallel and distributed computing. The resultant plans from the cloud computing are

  19. Toward a web-based real-time radiation treatment planning system in a cloud computing environment

    International Nuclear Information System (INIS)

    Na, Yong Hum; Kapp, Daniel S; Xing, Lei; Suh, Tae-Suk

    2013-01-01

    To exploit the potential dosimetric advantages of intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT), an in-depth approach is required to provide efficient computing methods. This needs to incorporate clinically related organ specific constraints, Monte Carlo (MC) dose calculations, and large-scale plan optimization. This paper describes our first steps toward a web-based real-time radiation treatment planning system in a cloud computing environment (CCE). The Amazon Elastic Compute Cloud (EC2) with a master node (named m2.xlarge containing 17.1 GB of memory, two virtual cores with 3.25 EC2 Compute Units each, 420 GB of instance storage, 64-bit platform) is used as the backbone of cloud computing for dose calculation and plan optimization. The master node is able to scale the workers on an 'on-demand' basis. MC dose calculation is employed to generate accurate beamlet dose kernels by parallel tasks. The intensity modulation optimization uses total-variation regularization (TVR) and generates piecewise constant fluence maps for each initial beam direction in a distributed manner over the CCE. The optimized fluence maps are segmented into deliverable apertures. The shape of each aperture is iteratively rectified to be a sequence of arcs using the manufacturer's constraints. The output plan file from the EC2 is sent to the simple storage service. Three de-identified clinical cancer treatment plans have been studied for evaluating the performance of the new planning platform with 6 MV flattening filter free beams (40 × 40 cm²) from the Varian TrueBeam™ STx linear accelerator. A CCE leads to speed-ups of up to 14-fold for both dose kernel calculations and plan optimizations in the head and neck, lung, and prostate cancer cases considered in this study. The proposed system relies on a CCE that is able to provide an infrastructure for parallel and distributed computing. The resultant plans from the cloud computing are

  20. FRANTIC: a computer code for time dependent unavailability analysis

    International Nuclear Information System (INIS)

    Vesely, W.E.; Goldberg, F.F.

    1977-03-01

    The FRANTIC computer code evaluates the time-dependent and average unavailability for any general system model. The code is written in FORTRAN IV for the IBM 370 computer. Non-repairable components, monitored components, and periodically tested components are handled. One unique feature of FRANTIC is the detailed, time-dependent modeling of periodic testing, which includes the effects of test downtimes, test overrides, detection inefficiencies, and test-caused failures. The exponential distribution is used for the component failure times, and periodic equations are developed for the testing and repair contributions. Human errors and common mode failures can be included by assigning an appropriate constant probability for the contributors. The output from FRANTIC consists of tables and plots of the system unavailability along with a breakdown of the unavailability contributions. Sensitivity studies can be simply performed and a wide range of tables and plots can be obtained for reporting purposes. The FRANTIC code represents a first step in the development of an approach that can be of direct value in future system evaluations. Modifications resulting from use of the code, along with the development of reliability data based on operating reactor experience, can be expected to provide increased confidence in its use and potential application to the licensing process.
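
    The basic periodic-testing behaviour is easy to sketch. Assuming exponential failures and perfect renewal at each test, and ignoring FRANTIC's refinements (test downtimes, overrides, detection inefficiencies, test-caused failures), the time-dependent unavailability is a sawtooth whose average is roughly half its peak:

    ```python
    import numpy as np

    def unavailability(t, lam, tau):
        """q(t) = 1 - exp(-lam * (t mod tau)) for a component with exponential
        failure rate lam, perfectly renewed at each test, test interval tau."""
        return 1.0 - np.exp(-lam * (t % tau))

    lam, tau = 1e-4, 720.0                       # assumed: failures/hour, test every 720 h
    t = np.linspace(0, 3 * tau, 2161)
    q = unavailability(t, lam, tau)
    print("peak unavailability:", q.max())       # ~ lam * tau for lam*tau << 1
    print("average unavailability:", q.mean())   # ~ lam * tau / 2
    ```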

  1. A neuro-fuzzy computing technique for modeling hydrological time series

    Science.gov (United States)

    Nayak, P. C.; Sudheer, K. P.; Rangan, D. M.; Ramasastri, K. S.

    2004-05-01

    Intelligent computing tools such as artificial neural network (ANN) and fuzzy logic approaches are proven to be efficient when applied individually to a variety of problems. Recently there has been a growing interest in combining both these approaches, and as a result, neuro-fuzzy computing techniques have evolved. This approach has been tested and evaluated in the field of signal processing and related areas, but researchers have only begun evaluating the potential of this neuro-fuzzy hybrid approach in hydrologic modeling studies. This paper presents the application of an adaptive neuro-fuzzy inference system (ANFIS) to hydrologic time series modeling, illustrated by an application to modeling the river flow of the Baitarani River in Orissa state, India. An introduction to the ANFIS modeling approach is also presented. The advantage of the method is that it does not require the model structure to be known a priori, in contrast to most time series modeling techniques. The results showed that the ANFIS-forecasted flow series preserves the statistical properties of the original flow series. The model showed good performance in terms of various statistical indices. The results are highly promising, and a comparative analysis suggests that the proposed modeling approach outperforms ANNs and other traditional time series models in terms of computational speed, forecast errors, efficiency, and peak flow estimation. It was observed that the ANFIS model fully preserves the potential of the ANN approach and eases the model building process.

  2. Controlling total spot power from holographic laser by superimposing a binary phase grating.

    Science.gov (United States)

    Liu, Xiang; Zhang, Jian; Gan, Yu; Wu, Liying

    2011-04-25

    By superimposing a tunable binary phase grating on a conventional computer-generated hologram, the total power of multiple holographic 3D spots can be controlled with high accuracy to an arbitrary power value by changing the phase depth of the grating, enabling real-time optical manipulation without extra power loss. Simulation and experimental results indicate that a resolution of 0.002 in normalized total spot power can be achieved at low time cost.

  3. Patient-specific instrumentation for total knee arthroplasty does not match the pre-operative plan as assessed by intra-operative computer-assisted navigation.

    Science.gov (United States)

    Scholes, Corey; Sahni, Varun; Lustig, Sebastien; Parker, David A; Coolican, Myles R J

    2014-03-01

    The introduction of patient-specific instruments (PSI) for guiding bone cuts could increase the incidence of malalignment in primary total knee arthroplasty. The purpose of this study was to assess the agreement between one type of patient-specific instrumentation (Zimmer PSI) and the pre-operative plan with respect to bone cuts and component alignment during TKR using imageless computer navigation. A consecutive series of 30 femoral and tibial guides were assessed in-theatre by the same surgeon using computer navigation. Following surgical exposure, the PSI cutting guides were placed on the joint surface and alignment assessed using the navigation tracker. The difference between in-theatre data and the pre-operative plan was recorded and analysed. The error between in-theatre measurements and the pre-operative plan for the femoral and tibial components exceeded 3° for 3% and 17% of the sample, respectively, while the error for total coronal alignment exceeded 3° for 27% of the sample. The present results indicate that alignment with Zimmer PSI cutting blocks, assessed by imageless navigation, does not match the pre-operative plan in a proportion of cases. To prevent unnecessary increases in the incidence of malalignment in primary TKR, it is recommended that these devices should not be used without objective verification of alignment, either in real-time or with post-operative imaging. Further work is required to identify the source of discrepancies and validate these devices prior to routine use. Level of evidence: II.

  4. Computational time analysis of the numerical solution of 3D electrostatic Poisson's equation

    Science.gov (United States)

    Kamboh, Shakeel Ahmed; Labadin, Jane; Rigit, Andrew Ragai Henri; Ling, Tech Chaw; Amur, Khuda Bux; Chaudhary, Muhammad Tayyab

    2015-05-01

    3D Poisson's equation is solved numerically to simulate the electric potential in a prototype design of an electrohydrodynamic (EHD) ion-drag micropump. The finite difference method (FDM) is employed to discretize the governing equation. The system of linear equations resulting from FDM is solved iteratively by using the sequential Jacobi (SJ) and sequential Gauss-Seidel (SGS) methods, and the simulation results are compared to examine the differences between them. The main objective was to analyze the computational time required by both methods with respect to different grid sizes, and to parallelize the Jacobi method to reduce the computational time. Typically, the SGS method is faster than the SJ method, but the data parallelism of the Jacobi method may produce good speedup over the SGS method. In this study, the feasibility of using a parallel Jacobi (PJ) method is examined in relation to the SGS method. The MATLAB Parallel/Distributed computing environment is used and a parallel code for the SJ method is implemented. It was found that for small grid sizes the SGS method remains dominant over the SJ and PJ methods, while for large grid sizes both sequential methods may take excessively long to converge. Yet, the PJ method reduces the computational time to some extent for large grid sizes.
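
    A minimal sketch of the Jacobi iteration for the discretized 3D Poisson equation ∇²u = f, written with NumPy slicing rather than the MATLAB environment used in the study; the grid size, source term, and tolerance are assumptions:

    ```python
    import numpy as np

    n, h = 32, 1.0 / 31                       # grid points per axis, spacing
    f = np.ones((n, n, n))                    # assumed source term
    u = np.zeros((n, n, n))                   # Dirichlet zero boundaries

    for it in range(5000):
        u_new = u.copy()
        # Jacobi sweep: every interior point from the old neighbor values
        u_new[1:-1, 1:-1, 1:-1] = (u[2:, 1:-1, 1:-1] + u[:-2, 1:-1, 1:-1] +
                                   u[1:-1, 2:, 1:-1] + u[1:-1, :-2, 1:-1] +
                                   u[1:-1, 1:-1, 2:] + u[1:-1, 1:-1, :-2] -
                                   h * h * f[1:-1, 1:-1, 1:-1]) / 6.0
        if np.max(np.abs(u_new - u)) < 1e-6:  # stop when the sweep stalls
            break
        u = u_new

    print("iterations:", it, "center value:", u[n // 2, n // 2, n // 2])
    ```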

  5. Complexities of the storm-time characteristics of ionospheric total electron content

    International Nuclear Information System (INIS)

    Kane, R.P.

    1982-01-01

    The complexities of the storm-time variations of the ionospheric total electron content are briefly reviewed. It is suggested that large variations from storm to storm may be due to irregular flows from the auroral region towards the equator. A proper study of such flows needs an elaborate network of TEC measuring instruments. The need for planning and organizing such a network is emphasized.

  6. Polynomial-time computability of the edge-reliability of graphs using Gilbert's formula

    Directory of Open Access Journals (Sweden)

    Marlowe Thomas J.

    1998-01-01

    Reliability is an important consideration in analyzing computer and other communication networks, but current techniques are extremely limited in the classes of graphs which can be analyzed efficiently. While Gilbert's formula establishes a theoretically elegant recursive relationship between the edge reliability of a graph and the reliability of its subgraphs, naive evaluation requires consideration of all sequences of deletions of individual vertices, and for many graphs has time complexity essentially Θ(N!). We discuss a general approach which significantly reduces complexity, encoding subgraph isomorphism in a finer partition by invariants, and recursing through the set of invariants. We illustrate this approach using threshold graphs, and show that any computation of reliability using Gilbert's formula will be polynomial-time if and only if the number of invariants considered is polynomial; we then show families of graphs with polynomial-time, and non-polynomial, reliability computation, and show that these encompass most previously known results. We then codify our approach to indicate how it can be used for other classes of graphs, and suggest several classes to which the technique can be applied.

  7. Extending the length and time scales of Gram–Schmidt Lyapunov vector computations

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Anthony B., E-mail: acosta@northwestern.edu [Department of Chemistry, Northwestern University, Evanston, IL 60208 (United States); Green, Jason R., E-mail: jason.green@umb.edu [Department of Chemistry, Northwestern University, Evanston, IL 60208 (United States); Department of Chemistry, University of Massachusetts Boston, Boston, MA 02125 (United States)

    2013-08-01

    Lyapunov vectors have found growing interest recently due to their ability to characterize systems out of thermodynamic equilibrium. The computation of orthogonal Gram–Schmidt vectors requires multiplication and QR decomposition of large matrices, which grow as N² (with the particle count). This expense has limited such calculations to relatively small systems and short time scales. Here, we detail two implementations of an algorithm for computing Gram–Schmidt vectors. The first is a distributed-memory message-passing method using Scalapack. The second uses the newly-released MAGMA library for GPUs. We compare the performance of both codes for Lennard–Jones fluids from N=100 to 1300 between Intel Nehalem/InfiniBand DDR and NVIDIA C2050 architectures. To our best knowledge, these are the largest systems for which the Gram–Schmidt Lyapunov vectors have been computed, and the first time their calculation has been GPU-accelerated. We conclude that Lyapunov vector calculations can be significantly extended in length and time by leveraging the power of GPU-accelerated linear algebra.

  8. Extending the length and time scales of Gram–Schmidt Lyapunov vector computations

    International Nuclear Information System (INIS)

    Costa, Anthony B.; Green, Jason R.

    2013-01-01

    Lyapunov vectors have found growing interest recently due to their ability to characterize systems out of thermodynamic equilibrium. The computation of orthogonal Gram–Schmidt vectors requires multiplication and QR decomposition of large matrices, which grow as N² (with the particle count). This expense has limited such calculations to relatively small systems and short time scales. Here, we detail two implementations of an algorithm for computing Gram–Schmidt vectors. The first is a distributed-memory message-passing method using Scalapack. The second uses the newly-released MAGMA library for GPUs. We compare the performance of both codes for Lennard–Jones fluids from N=100 to 1300 between Intel Nehalem/InfiniBand DDR and NVIDIA C2050 architectures. To our best knowledge, these are the largest systems for which the Gram–Schmidt Lyapunov vectors have been computed, and the first time their calculation has been GPU-accelerated. We conclude that Lyapunov vector calculations can be significantly extended in length and time by leveraging the power of GPU-accelerated linear algebra.
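
    The algorithm itself is compact: evolve an orthonormal tangent-space frame with the system's Jacobian and reorthonormalize it by QR at every step, accumulating the logarithms of the diagonal of R. A toy sketch on the Hénon map, standing in for the paper's Lennard–Jones systems:

    ```python
    import numpy as np

    a, b = 1.4, 0.3
    x, y = 0.1, 0.1
    Q = np.eye(2)                        # orthonormal tangent-space frame
    lyap_sum = np.zeros(2)
    steps = 10_000

    for _ in range(steps):
        J = np.array([[-2.0 * a * x, 1.0],   # Jacobian of (x, y) -> (1 - a*x^2 + y, b*x)
                      [b, 0.0]])
        x, y = 1.0 - a * x * x + y, b * x    # advance the orbit
        Q, R = np.linalg.qr(J @ Q)           # Gram-Schmidt step via QR
        lyap_sum += np.log(np.abs(np.diag(R)))

    print("Lyapunov exponents:", lyap_sum / steps)  # roughly (+0.42, -1.62)
    ```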

  9. Effects of computer-assisted oral anticoagulant therapy

    DEFF Research Database (Denmark)

    Rasmussen, Rune Skovgaard; Corell, Pernille; Madsen, Poul

    2012-01-01

    BACKGROUND: Computer-assistance and self-monitoring lower the cost and may improve the quality of anticoagulation therapy. The main purpose of this clinical investigation was to use computer-assisted oral anticoagulant therapy to improve the time to reach and the time spent within the therapeutic target range...... RESULTS: Patients randomized to computer-assisted anticoagulation and the CoaguChek® system reached the therapeutic target range after 8 days compared to 14 days by prescriptions from physicians (p = 0.04). Time spent in the therapeutic target range did not differ between groups. The median INR value measured...... prescribed by physicians, and the total time spent within the therapeutic target range was similar. Thus computer-assisted oral anticoagulant therapy may reduce the cost of anticoagulation therapy without lowering the quality. INR values measured by CoaguChek® were reliable compared to measurements......

  10. Total sitting time, leisure time physical activity and risk of hospitalization due to low back pain: The Danish Health Examination Survey cohort 2007-2008.

    Science.gov (United States)

    Balling, Mie; Holmberg, Teresa; Petersen, Christina B; Aadahl, Mette; Meyrowitsch, Dan W; Tolstrup, Janne S

    2018-02-01

    This study aimed to test the hypotheses that a high total sitting time and vigorous physical activity in leisure time increase the risk of low back pain and herniated lumbar disc disease. A total of 76,438 adults answered questions regarding their total sitting time and physical activity during leisure time in the Danish Health Examination Survey 2007-2008. Information on low back pain diagnoses up to 10 September 2015 was obtained from The National Patient Register. The mean follow-up time was 7.4 years. Data were analysed using Cox regression analysis with adjustment for potential confounders. Multiple imputations were performed for missing values. During the follow-up period, 1796 individuals were diagnosed with low back pain, of whom 479 were diagnosed with herniated lumbar disc disease. Total sitting time was not associated with low back pain or herniated lumbar disc disease. However, moderate or vigorous physical activity, as compared to light physical activity, was associated with increased risk of low back pain (HR = 1.16, 95% CI: 1.03-1.30 and HR = 1.45, 95% CI: 1.15-1.83). Moderate, but not vigorous physical activity was associated with increased risk of herniated lumbar disc disease. The results suggest that total sitting time is not associated with low back pain, but moderate and vigorous physical activity is associated with increased risk of low back pain compared with light physical activity.

  11. Computing Refined Buneman Trees in Cubic Time

    DEFF Research Database (Denmark)

    Brodal, G.S.; Fagerberg, R.; Östlin, A.

    2003-01-01

    Reconstructing the evolutionary tree for a set of n species based on pairwise distances between the species is a fundamental problem in bioinformatics. Neighbor joining is a popular distance-based tree reconstruction method. It always proposes fully resolved binary trees despite missing evidence...... in the underlying distance data. Distance-based methods based on the theory of Buneman trees and refined Buneman trees avoid this problem by only proposing evolutionary trees whose edges satisfy a number of constraints. These trees might not be fully resolved but there is strong combinatorial evidence for each...... proposed edge. The currently best algorithm for computing the refined Buneman tree from a given distance measure has a running time of O(n⁵) and a space consumption of O(n⁴). In this paper, we present an algorithm with running time O(n³) and space consumption O(n²). The improved complexity of our

  12. Computational intelligence in time series forecasting theory and engineering applications

    CERN Document Server

    Palit, Ajoy K

    2005-01-01

    Foresight in an engineering enterprise can make the difference between success and failure, and can be vital to the effective control of industrial systems. Applying time series analysis in the on-line milieu of most industrial plants has been problematic owing to the time and computational effort required. The advent of soft computing tools offers a solution. The authors harness the power of intelligent technologies individually and in combination. Examples of the particular systems and processes susceptible to each technique are investigated, cultivating a comprehensive exposition of the improvements on offer in quality, model building and predictive control and the selection of appropriate tools from the plethora available. Application-oriented engineers in process control, manufacturing, production industry and research centres will find much to interest them in this book. It is suitable for industrial training purposes, as well as serving as valuable reference material for experimental researchers.

  13. Theory and computation of disturbance invariant sets for discrete-time linear systems

    Directory of Open Access Journals (Sweden)

    Kolmanovsky Ilya

    1998-01-01

    This paper considers the characterization and computation of invariant sets for discrete-time, time-invariant, linear systems with disturbance inputs whose values are confined to a specified compact set but are otherwise unknown. The emphasis is on determining maximal disturbance-invariant sets X that belong to a specified subset Γ of the state space. Such d-invariant sets have important applications in control problems where there are pointwise-in-time state constraints of the form x(t) ∈ Γ. One purpose of the paper is to unite and extend in a rigorous way disparate results from the prior literature. In addition there are entirely new results. Specific contributions include: exploitation of the Pontryagin set difference to clarify conceptual matters and simplify mathematical developments, special properties of maximal invariant sets and conditions for their finite determination, algorithms for generating concrete representations of maximal invariant sets, practical computational questions, extension of the main results to general Lyapunov stable systems, applications of the computational techniques to the bounding of state and output response. Results on Lyapunov stable systems are applied to the implementation of a logic-based, nonlinear multimode regulator. For plants with disturbance inputs and state-control constraints it enlarges the constraint-admissible domain of attraction. Numerical examples illustrate the various theoretical and computational results.

  14. Iterated greedy algorithms to minimize the total family flow time for job-shop scheduling with job families and sequence-dependent set-ups

    Science.gov (United States)

    Kim, Ji-Su; Park, Jung-Hyeon; Lee, Dong-Ho

    2017-10-01

    This study addresses a variant of job-shop scheduling in which jobs are grouped into job families, but they are processed individually. The problem can be found in various industrial systems, especially in reprocessing shops of remanufacturing systems. If the reprocessing shop is a job-shop type and has the component-matching requirements, it can be regarded as a job shop with job families since the components of a product constitute a job family. In particular, sequence-dependent set-ups in which set-up time depends on the job just completed and the next job to be processed are also considered. The objective is to minimize the total family flow time, i.e. the maximum among the completion times of the jobs within a job family. A mixed-integer programming model is developed and two iterated greedy algorithms with different local search methods are proposed. Computational experiments were conducted on modified benchmark instances and the results are reported.
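
    The iterated greedy template can be sketched independently of the shop details: destroy part of the incumbent sequence, greedily rebuild it, and accept improvements. Here `cost` is a placeholder for the total family flow time evaluation, and the destruction size `d` is an assumption:

    ```python
    import random

    def iterated_greedy(seq, cost, d=3, iters=1000, seed=1):
        """Destroy d jobs, greedily reinsert each at its best position,
        and keep the rebuilt sequence only if it improves the incumbent."""
        rng = random.Random(seed)
        best = list(seq)
        for _ in range(iters):
            partial = list(best)
            removed = [partial.pop(rng.randrange(len(partial))) for _ in range(d)]
            for job in removed:
                pos = min(range(len(partial) + 1),
                          key=lambda p: cost(partial[:p] + [job] + partial[p:]))
                partial.insert(pos, job)
            if cost(partial) < cost(best):
                best = partial
        return best

    # toy usage with a stand-in objective: total flow time on a single machine
    jobs = [5, 2, 8, 1, 9, 3]
    def cost(seq):
        t, total = 0, 0
        for p in seq:
            t += p
            total += t
        return total

    print(cost(jobs), "->", cost(iterated_greedy(jobs, cost)))
    ```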

  15. Use of inpatient continuous passive motion versus no CPM in computer-assisted total knee arthroplasty.

    Science.gov (United States)

    Alkire, Martha R; Swank, Michael L

    2010-01-01

    Continuous passive motion (CPM) has shown positive effects on tissue healing, edema, hemarthrosis, and joint function (L. Brosseau et al., 2004). CPM has also been shown to increase short-term early flexion and decrease length of stay (LOS) (L. Brosseau et al., 2004; C. M. Chiarello, C. M. S. Gundersen, & T. O'Halloran, 2004). The benefits of CPM for the population of patients undergoing computer-assisted total knee arthroplasty (TKA) have not been examined. The primary objective of this study was to determine whether the use of CPM following computer-assisted TKA resulted in differences in range of motion, edema/drainage, functional ability, and pain. This was an experimental, prospective, randomized study of patients undergoing unilateral, computer-assisted TKA. The experimental group received CPM thrice daily and physical therapy (PT) twice daily during their hospitalization. The control group received PT twice daily and no CPM during the hospital stay. Both groups received PT after discharge. Measurements included Knee Society scores, Western Ontario McMaster Osteoarthritis Index values, range of motion, knee circumference, and HemoVac drainage. Data were collected at various intervals from preoperatively through 3 months. Although the control group was found to be higher functioning preoperatively, there was no statistically significant difference in flexion, edema or drainage, function, or pain between groups through the 3-month study period.

  16. Ultrasonic divergent-beam scanner for time-of-flight tomography with computer evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Glover, G H

    1978-03-02

    The rotatable ultrasonic divergent-beam scanner is designed for time-of-flight tomography with computer evaluation. It measures parameters that are important for characterizing the structure of soft tissues, e.g., the time of flight as a function of the velocity distribution along a given propagation path (the method is analogous to transaxial X-ray tomography). Moreover, it permits quantitative measurement of two-dimensional velocity distributions and may therefore be applied in serial examinations for detecting cancer of the breast. Digital memories as well as analog-digital hybrid systems are suitable as computers.

  17. Computing moment to moment BOLD activation for real-time neurofeedback

    Science.gov (United States)

    Hinds, Oliver; Ghosh, Satrajit; Thompson, Todd W.; Yoo, Julie J.; Whitfield-Gabrieli, Susan; Triantafyllou, Christina; Gabrieli, John D.E.

    2013-01-01

    Estimating moment to moment changes in blood oxygenation level dependent (BOLD) activation levels from functional magnetic resonance imaging (fMRI) data has applications for learned regulation of regional activation, brain state monitoring, and brain-machine interfaces. In each of these contexts, accurate estimation of the BOLD signal in as little time as possible is desired. This is a challenging problem due to the low signal-to-noise ratio of fMRI data. Previous methods for real-time fMRI analysis have either sacrificed the ability to compute moment to moment activation changes by averaging several acquisitions into a single activation estimate or have sacrificed accuracy by failing to account for prominent sources of noise in the fMRI signal. Here we present a new method for computing the amount of activation present in a single fMRI acquisition that separates moment to moment changes in the fMRI signal intensity attributable to neural sources from those due to noise, resulting in a feedback signal more reflective of neural activation. This method computes an incremental general linear model fit to the fMRI timeseries, which is used to calculate the expected signal intensity at each new acquisition. The difference between the measured intensity and the expected intensity is scaled by the variance of the estimator in order to transform this residual difference into a statistic. Both synthetic and real data were used to validate this method and compare it to the only other published real-time fMRI method. PMID:20682350
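
    The idea behind the statistic can be sketched as follows; this is a simplified refit-per-volume version, not the authors' incremental implementation, and the nuisance regressors are assumptions:

    ```python
    import numpy as np

    def feedback_stat(y, X):
        """Activation statistic for the newest acquisition: fit a GLM of
        nuisance regressors X (t x p) to the series y (t,) on past data,
        then scale the newest residual by the residual standard deviation."""
        beta, *_ = np.linalg.lstsq(X[:-1], y[:-1], rcond=None)  # fit on past volumes
        expected = X[-1] @ beta                  # predicted intensity at time t
        resid = y[:-1] - X[:-1] @ beta
        sigma = resid.std(ddof=X.shape[1])       # noise scale from past residuals
        return (y[-1] - expected) / sigma        # z-like feedback statistic

    # toy usage: intercept + linear drift as assumed nuisance regressors
    t = 100
    X = np.column_stack([np.ones(t), np.linspace(0, 1, t)])
    y = X @ np.array([100.0, 5.0]) + np.random.default_rng(0).normal(0, 1, t)
    y[-1] += 4.0                                 # injected "activation"
    print("feedback statistic:", feedback_stat(y, X))
    ```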

  18. Time Domain Terahertz Axial Computed Tomography Non Destructive Evaluation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to demonstrate key elements of feasibility for a high speed automated time domain terahertz computed axial tomography (TD-THz CT) non destructive...

  19. On the Laws of Total Local Times for h-Paths and Bridges of Symmetric Lévy Processes

    Directory of Open Access Journals (Sweden)

    Masafumi Hayashi

    2013-01-01

    The joint law of the total local times at two levels for h-paths of symmetric Lévy processes is shown to admit an explicit representation in terms of the laws of the squared Bessel processes of dimensions two and zero. The law of the total local time at a single level for bridges is also discussed.

  20. Time Domain Terahertz Axial Computed Tomography Non Destructive Evaluation, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — In this Phase 2 project, we propose to develop, construct, and deliver to NASA a computed axial tomography time-domain terahertz (CT TD-THz) non destructive...

  1. The reliable solution and computation time of variable parameters logistic model

    Science.gov (United States)

    Wang, Pengfei; Pan, Xinnong

    2018-05-01

    The study investigates the reliable computation time (RCT, termed T_c) by applying a double-precision computation of a variable parameters logistic map (VPLM). Firstly, by using the proposed method, we obtain the reliable solutions for the logistic map. Secondly, we construct 10,000 samples of reliable experiments from a time-dependent non-stationary parameters VPLM and then calculate the mean T_c. The results indicate that, for each different initial value, the T_c values of the VPLM are generally different. However, the mean T_c tends to a constant value when the sample number is large enough. The maximum, minimum, and probable distribution functions of T_c are also obtained, which can help us to identify the robustness of applying a nonlinear time series theory to forecasting by using the VPLM output. In addition, the T_c of the fixed-parameter experiments of the logistic map is obtained, and the results suggest that this T_c matches the theoretical formula-predicted value.
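
    One practical way to estimate a reliable computation time is to iterate in double precision alongside a much higher-precision reference and record the first step at which the two disagree. A sketch for the fixed-parameter logistic map (not the VPLM), with mpmath as an assumed dependency and arbitrary parameter choices:

    ```python
    from mpmath import mp, mpf

    mp.dps = 100                       # 100 significant digits as "ground truth"

    def reliable_steps(x0=0.1, r=4.0, tol=1e-6, max_steps=10_000):
        """First iteration at which the double-precision orbit departs from
        the high-precision reference by more than tol."""
        x_double, x_exact = x0, mpf(x0)
        for n in range(max_steps):
            x_double = r * x_double * (1.0 - x_double)
            x_exact = r * x_exact * (1 - x_exact)
            if abs(x_double - float(x_exact)) > tol:
                return n + 1           # first unreliable step
        return max_steps

    print("T_c (steps):", reliable_steps())   # typically a few tens of steps
    ```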

  2. A Study on GPU-based Iterative ML-EM Reconstruction Algorithm for Emission Computed Tomographic Imaging Systems

    Energy Technology Data Exchange (ETDEWEB)

    Ha, Woo Seok; Kim, Soo Mee; Park, Min Jae; Lee, Dong Soo; Lee, Jae Sung [Seoul National University, Seoul (Korea, Republic of)

    2009-10-15

    The maximum likelihood-expectation maximization (ML-EM) is the statistical reconstruction algorithm derived from a probabilistic model of the emission and detection processes. Although the ML-EM has many advantages in accuracy and utility, its use is limited by the computational burden of the iterative processing on a CPU (central processing unit). In this study, we developed a parallel computing technique on a GPU (graphics processing unit) for the ML-EM algorithm. Using a GeForce 9800 GTX+ graphics card and CUDA (compute unified device architecture), the projection and backprojection in the ML-EM algorithm were parallelized with NVIDIA's technology. The time delays for computing the projection, the errors between measured and estimated data, and the backprojection in each iteration were measured. Total time included the latency of data transmission between RAM and GPU memory. The total computation times of the CPU- and GPU-based ML-EM with 32 iterations were 3.83 and 0.26 sec, respectively. In this case, the computing speed was improved about 15 times on the GPU. When the number of iterations increased to 1024, the CPU- and GPU-based computing took 18 min and 8 sec in total, respectively. The improvement was about 135 times and was caused by delays in CPU-based computing after a certain number of iterations. On the other hand, the GPU-based computation showed very small variation in time delay per iteration due to the use of shared memory. The GPU-based parallel computation for ML-EM significantly improved the computing speed and stability. The developed GPU-based ML-EM algorithm could be easily modified for some other imaging geometries.

  3. A Study on GPU-based Iterative ML-EM Reconstruction Algorithm for Emission Computed Tomographic Imaging Systems

    International Nuclear Information System (INIS)

    Ha, Woo Seok; Kim, Soo Mee; Park, Min Jae; Lee, Dong Soo; Lee, Jae Sung

    2009-01-01

    The maximum likelihood-expectation maximization (ML-EM) is the statistical reconstruction algorithm derived from a probabilistic model of the emission and detection processes. Although the ML-EM has many advantages in accuracy and utility, its use is limited by the computational burden of the iterative processing on a CPU (central processing unit). In this study, we developed a parallel computing technique on a GPU (graphics processing unit) for the ML-EM algorithm. Using a GeForce 9800 GTX+ graphics card and CUDA (compute unified device architecture), the projection and backprojection in the ML-EM algorithm were parallelized with NVIDIA's technology. The time delays for computing the projection, the errors between measured and estimated data, and the backprojection in each iteration were measured. Total time included the latency of data transmission between RAM and GPU memory. The total computation times of the CPU- and GPU-based ML-EM with 32 iterations were 3.83 and 0.26 sec, respectively. In this case, the computing speed was improved about 15 times on the GPU. When the number of iterations increased to 1024, the CPU- and GPU-based computing took 18 min and 8 sec in total, respectively. The improvement was about 135 times and was caused by delays in CPU-based computing after a certain number of iterations. On the other hand, the GPU-based computation showed very small variation in time delay per iteration due to the use of shared memory. The GPU-based parallel computation for ML-EM significantly improved the computing speed and stability. The developed GPU-based ML-EM algorithm could be easily modified for some other imaging geometries.
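
    The update that both implementations parallelize can be written in a few lines: x ← x / (Aᵀ1) · Aᵀ(y / (Ax)), with A the system matrix, y the measured projections and x the image estimate. A toy CPU sketch with an assumed random system matrix:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.random((64, 32))            # toy system matrix (projector)
    x_true = rng.random(32)
    y = A @ x_true                      # noiseless measured data

    x = np.ones(32)                     # uniform initial image
    sens = A.T @ np.ones(64)            # sensitivity image A^T 1
    for _ in range(32):                 # 32 iterations, as in the paper's timing
        proj = A @ x                    # forward projection
        ratio = y / np.maximum(proj, 1e-12)
        x *= (A.T @ ratio) / sens       # backprojection, multiplicative update

    print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
    ```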

  4. Constrained Total Generalized p-Variation Minimization for Few-View X-Ray Computed Tomography Image Reconstruction.

    Science.gov (United States)

    Zhang, Hanming; Wang, Linyuan; Yan, Bin; Li, Lei; Cai, Ailong; Hu, Guoen

    2016-01-01

    Total generalized variation (TGV)-based computed tomography (CT) image reconstruction, which utilizes high-order image derivatives, is superior to total variation-based methods in terms of the preservation of edge information and the suppression of unfavorable staircase effects. However, conventional TGV regularization employs an l1-based form, which is not the most direct way to promote sparsity. In this study, we propose a total generalized p-variation (TGpV) regularization model to improve the sparsity exploitation of TGV and offer efficient solutions to few-view CT image reconstruction problems. To solve the nonconvex optimization problem of the TGpV minimization model, we present an efficient iterative algorithm based on the alternating minimization of an augmented Lagrangian function. All of the resulting subproblems decoupled by variable splitting admit explicit solutions obtained by applying the alternating minimization method and a generalized p-shrinkage mapping. In addition, approximate solutions that can be easily performed and quickly calculated through the fast Fourier transform are derived using the proximal point method to reduce the cost of the inner subproblems. Qualitative and quantitative evaluations on simulated and real data validate the efficiency and feasibility of the proposed method. Overall, the proposed method exhibits reasonable performance and outperforms the original TGV-based method when applied to few-view problems.
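
    The generalized p-shrinkage mapping has a simple closed form; one common version, sketched below with an assumed threshold-scaling convention (for p = 1 it reduces to ordinary l1 soft-shrinkage):

    ```python
    import numpy as np

    def p_shrink(x, tau, p):
        """Generalized p-shrinkage: sign(x) * max(|x| - tau * |x|**(p-1), 0).
        Conventions for the tau scaling vary across papers; this is a sketch."""
        mag = np.abs(x)
        out = np.zeros_like(x, dtype=float)
        nz = mag > 0                     # guard against 0**(negative power)
        out[nz] = np.maximum(mag[nz] - tau * mag[nz] ** (p - 1.0), 0.0)
        return np.sign(x) * out

    x = np.linspace(-2, 2, 9)
    print(p_shrink(x, 0.5, 1.0))   # ordinary soft-thresholding
    print(p_shrink(x, 0.5, 0.5))   # nonconvex p < 1 shrinks small entries harder
    ```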

  5. 78 FR 38949 - Computer Security Incident Coordination (CSIC): Providing Timely Cyber Incident Response

    Science.gov (United States)

    2013-06-28

    [Docket No. ...-3383-01] Computer Security Incident Coordination (CSIC): Providing Timely Cyber Incident Response. ... exposed to various forms of cyber attack. In some cases, attacks can be thwarted through the use of... systems will be successfully attacked. When a successful attack occurs, the job of a Computer Security...

  6. CROSAT: A digital computer program for statistical-spectral analysis of two discrete time series

    International Nuclear Information System (INIS)

    Antonopoulos Domis, M.

    1978-03-01

    The program CROSAT computes, directly from two discrete time series, auto- and cross-spectra, transfer and coherence functions, using a Fast Fourier Transform subroutine. Statistical analysis of the time series is optional. While of general use, the program is constructed to be immediately compatible with the ICL 4-70 and H316 computers at AEE Winfrith and, perhaps with minor modifications, with any other hardware system. (author)
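
    The same quantities can be reproduced today with standard FFT-based estimators. A sketch using SciPy, with assumed test signals (a short moving-average filter plus noise):

    ```python
    import numpy as np
    from scipy import signal

    fs = 100.0                                # sampling frequency (assumed)
    rng = np.random.default_rng(0)
    x = rng.normal(size=4096)
    y = np.convolve(x, np.ones(5) / 5, mode="same") + 0.1 * rng.normal(size=4096)

    f, Pxx = signal.welch(x, fs=fs)           # auto-spectrum of x
    f, Pyy = signal.welch(y, fs=fs)           # auto-spectrum of y
    f, Pxy = signal.csd(x, y, fs=fs)          # cross-spectrum
    H = Pxy / Pxx                             # transfer function (H1 estimate)
    f, Cxy = signal.coherence(x, y, fs=fs)    # coherence function

    print("coherence near DC:", Cxy[1])       # close to 1 for this linear system
    ```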

  7. Radiotherapy Monte Carlo simulation using cloud computing technology.

    Science.gov (United States)

    Poole, C M; Cornelius, I; Trapp, J V; Langton, C M

    2012-12-01

    Cloud computing allows for vast computational resources to be leveraged quickly and easily in bursts as and when required. Here we describe a technique that allows for Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal where n is a factor of the total simulation time in hours. Using the technique, we demonstrate the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation without the need for dedicated local computer hardware, as a proof of principle.

  8. Radiotherapy Monte Carlo simulation using cloud computing technology

    International Nuclear Information System (INIS)

    Poole, C.M.; Cornelius, I.; Trapp, J.V.; Langton, C.M.

    2012-01-01

    Cloud computing allows for vast computational resources to be leveraged quickly and easily in bursts as and when required. Here we describe a technique that allows for Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal where n is a factor of the total simulation time in hours. Using the technique, we demonstrate the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation without the need for dedicated local computer hardware, as a proof of principle.
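
    The cost observation follows directly from per-hour billing: n machines each run for ceil(T/n) billable hours, so the billed total is n*ceil(T/n), which equals T exactly when n divides T. A toy sketch with an assumed total simulation time:

    ```python
    from math import ceil

    T = 12  # total simulation time in machine-hours (assumed value)
    for n in (1, 2, 3, 4, 5, 6, 7, 8, 12):
        billed = n * ceil(T / n)          # machine-hours actually billed
        print(f"n={n:2d}  wall time={ceil(T / n):2d} h  "
              f"relative cost={billed / T:.2f}")
    ```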

  9. Correlates of occupational, leisure and total sitting time in working adults: results from the Singapore multi-ethnic cohort.

    Science.gov (United States)

    Uijtdewilligen, Léonie; Yin, Jason Dean-Chen; van der Ploeg, Hidde P; Müller-Riemenschneider, Falk

    2017-12-13

    Evidence on the health risks of sitting is accumulating. However, research identifying factors influencing sitting time in adults is limited, especially in Asian populations. This study aimed to identify socio-demographic and lifestyle correlates of occupational, leisure and total sitting time in a sample of Singapore working adults. Data were collected between 2004 and 2010 from participants of the Singapore Multi Ethnic Cohort (MEC). Medical exclusion criteria for cohort participation were cancer, heart disease, stroke, renal failure and serious mental illness. Participants who were not working over the past 12 months and without data on sitting time were excluded from the analyses. Multivariable regression analyses were used to examine cross-sectional associations of self-reported age, gender, ethnicity, marital status, education, smoking, caloric intake and moderate-to-vigorous leisure time physical activity (LTPA) with self-reported occupational, leisure and total sitting time. Correlates were also studied separately for Chinese, Malays and Indians. The final sample comprised 9384 participants (54.8% male): 50.5% were Chinese, 24.0% Malay, and 25.5% Indian. For the total sample, mean occupational sitting time was 2.71 h/day, mean leisure sitting time was 2.77 h/day and mean total sitting time was 5.48 h/day. Sitting time in all domains was highest among Chinese. Age, gender, education, and caloric intake were associated with higher occupational sitting time, while ethnicity, marital status and smoking were associated with lower occupational sitting time. Marital status, smoking, caloric intake and LTPA were associated with higher leisure sitting time, while age, gender and ethnicity were associated with lower leisure sitting time. Gender, marital status, education, caloric intake and LTPA were associated with higher total sitting time, while ethnicity was associated with lower total sitting time. Stratified analyses revealed different associations within

  10. Minimizing Total Completion Time For Preemptive Scheduling With Release Dates And Deadline Constraints

    Directory of Open Access Journals (Sweden)

    He Cheng

    2014-02-01

    Full Text Available It is known that the single machine preemptive scheduling problem of minimizing total completion time with release date and deadline constraints is NP-hard. Du and Leung solved some special cases by the generalized Baker's algorithm and the generalized Smith's algorithm in O(n^2) time. In this paper we give an O(n^2) algorithm for the special case where the processing times and deadlines are agreeable. Moreover, for the case where the processing times and deadlines are disagreeable, we present two properties which could enable us to reduce the range of the enumeration algorithm.
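
    The paper's own O(n^2) algorithm is not reproduced in the abstract; as a point of reference, the sketch below implements the classical preemptive SRPT (shortest remaining processing time) rule, which minimizes total completion time with release dates when the deadline constraints are dropped. The job data are hypothetical.

        # Baseline sketch (not the paper's algorithm): preemptive SRPT,
        # optimal for total completion time with release dates when
        # deadlines are ignored. Jobs are (release, processing) pairs.
        import heapq

        def srpt_total_completion(jobs):
            jobs = sorted(jobs)              # order by release date
            t, i, total = 0, 0, 0
            ready = []                       # heap of remaining times
            while i < len(jobs) or ready:
                if not ready:                # idle until next release
                    t = max(t, jobs[i][0])
                while i < len(jobs) and jobs[i][0] <= t:
                    heapq.heappush(ready, jobs[i][1]); i += 1
                rem = heapq.heappop(ready)
                nxt = jobs[i][0] if i < len(jobs) else float("inf")
                if t + rem <= nxt:           # job finishes before next release
                    t += rem; total += t
                else:                        # preempt at the next release
                    heapq.heappush(ready, rem - (nxt - t)); t = nxt
            return total

        print(srpt_total_completion([(0, 5), (1, 2), (3, 1)]))  # -> 15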

  11. Cloud computing for comparative genomics.

    Science.gov (United States)

    Wall, Dennis P; Kudtarkar, Parul; Fusaro, Vincent A; Pivovarov, Rimma; Patil, Prasad; Tonellato, Peter J

    2010-05-18

    Large comparative genomics studies and tools are becoming increasingly more compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Computing Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high capacity compute nodes using the Amazon Web Service Elastic Map Reduce and included a wide mix of large and small genomes. The total computation time took just under 70 hours and cost a total of $6,302 USD. The effort to transform existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provides a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems.
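
    The throughput and cost figures quoted above imply some useful unit economics; the short computation below is plain arithmetic on the abstract's own numbers:

        # Back-of-envelope arithmetic on the figures quoted in the abstract.
        processes = 300_000
        nodes     = 100
        hours     = 70
        cost_usd  = 6302

        print(f"cost per process  : ${cost_usd / processes:.4f}")        # ~$0.02
        print(f"node-hours        : {nodes * hours}")                    # 7000
        print(f"cost per node-hour: ${cost_usd / (nodes * hours):.2f}")  # ~$0.90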

  12. Computation of the target state and feedback controls for time optimal consensus in multi-agent systems

    Science.gov (United States)

    Mulla, Ameer K.; Patil, Deepak U.; Chakraborty, Debraj

    2018-02-01

    N identical agents with bounded inputs aim to reach a common target state (consensus) in the minimum possible time. Algorithms for computing this time-optimal consensus point, the control law to be used by each agent and the time taken for the consensus to occur, are proposed. Two types of multi-agent systems are considered, namely (1) coupled single-integrator agents on a plane and (2) double-integrator agents on a line. At the initial time instant, each agent is assumed to have access to the state information of all the other agents. An algorithm, using convexity of attainable sets and Helly's theorem, is proposed, to compute the final consensus target state and the minimum time to achieve this consensus. Further, parts of the computation are parallelised amongst the agents such that each agent has to perform computations of O(N^2) run time complexity. Finally, local feedback time-optimal control laws are synthesised to drive each agent to the target point in minimum time. During this part of the operation, the controller for each agent uses measurements of only its own states and does not need to communicate with any neighbouring agents.
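
    For intuition (this is an illustration of the minimax idea, not the paper's Helly-theorem-based algorithm), consider single-integrator agents on a plane with a common speed bound: the time-optimal consensus point is the centre of the smallest circle enclosing the initial positions, and the minimum time is that circle's radius divided by the speed bound. A numerical sketch with hypothetical positions:

        # Illustration: minimum-time consensus point for single-integrator
        # agents with speed bound v_max is the minimax (1-centre) point.
        import numpy as np
        from scipy.optimize import minimize

        agents = np.array([[0.0, 0.0], [4.0, 0.0], [1.0, 3.0]])
        v_max = 1.0

        worst = lambda c: np.max(np.linalg.norm(agents - c, axis=1))
        res = minimize(worst, agents.mean(axis=0), method="Nelder-Mead")

        print("target point:", np.round(res.x, 3))         # ~ (2, 1)
        print("minimum time:", round(res.fun / v_max, 3))  # ~ 2.236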

  13. Objectively Measured Total and Occupational Sedentary Time in Three Work Settings

    OpenAIRE

    van Dommelen, Paula; Coffeng, Jennifer K.; van der Ploeg, Hidde P.; van der Beek, Allard J.; Boot, Cécile R. L.; Hendriksen, Ingrid J. M.

    2016-01-01

    Background. Sedentary behaviour increases the risk for morbidity. Our primary aim is to determine the proportion and factors associated with objectively measured total and occupational sedentary time in three work settings. Secondary aim is to study the proportion of physical activity and prolonged sedentary bouts. Methods. Data were obtained using ActiGraph accelerometers from employees of: 1) a financial service provider (n = 49 men, 31 women), 2) two research institutes (n = 30 men, 57 wom...

  14. A polynomial time algorithm for checking regularity of totally normed process algebra

    NARCIS (Netherlands)

    Yang, F.; Huang, H.

    2015-01-01

    A polynomial algorithm for the regularity problem of weak and branching bisimilarity on totally normed process algebra (PA) processes is given. Its time complexity is O(n^3 + mn), where n is the number of transition rules and m is the maximal length of the rules. The algorithm works for …

  15. On some methods for improving time of reachability sets computation for the dynamic system control problem

    Science.gov (United States)

    Zimovets, Artem; Matviychuk, Alexander; Ushakov, Vladimir

    2016-12-01

    The paper presents two different approaches to reducing the computation time of reachability sets. The first approach uses different data structures for storing the reachability sets in computer memory for calculation in single-threaded mode. The second approach is based on parallel algorithms operating on the data structures from the first approach. Within the framework of this paper, a parallel algorithm for approximate reachability set calculation on a computer with SMP architecture is proposed. The results of numerical modelling are presented in the form of tables which demonstrate the high efficiency of parallel computing technology and also show how the computing time depends on the data structure used.
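
    As a toy illustration of the kind of computation being optimized (the dynamics, grid and data structure below are placeholders, not the paper's), one step of a grid-based reachable-set calculation can be written with the set stored as a Python set of grid nodes:

        # Toy sketch: reachable set of x_{k+1} = x_k + u*dt, |u_i| <= 1,
        # stored as a set of grid nodes -- the kind of data-structure
        # choice such papers benchmark for speed.
        import itertools

        controls = [(ux, uy) for ux in (-1, 0, 1) for uy in (-1, 0, 1)]

        def step(reach, dt=0.5):
            return {(round(x + ux * dt, 3), round(y + uy * dt, 3))
                    for (x, y), (ux, uy) in itertools.product(reach, controls)}

        reach = {(0.0, 0.0)}
        for _ in range(4):
            reach = step(reach)
        print(len(reach), "grid nodes after 4 steps")  # 81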

  16. A computationally simple and robust method to detect determinism in a time series

    DEFF Research Database (Denmark)

    Lu, Sheng; Ju, Ki Hwan; Kanters, Jørgen K.

    2006-01-01

    We present a new, simple, and fast computational technique, termed the incremental slope (IS), that can accurately distinguish deterministic from stochastic systems even when the variance of the noise is as large as or greater than the signal, and that remains robust for time-varying signals.

  17. Hardware architecture design of image restoration based on time-frequency domain computation

    Science.gov (United States)

    Wen, Bo; Zhang, Jing; Jiao, Zipeng

    2013-10-01

    Image restoration algorithms based on time-frequency domain computation (TFDC) are highly mature and widely applied in engineering. To enable high-speed implementation of these algorithms, a TFDC hardware architecture is proposed. Firstly, the main module is designed by analyzing the common processing and numerical calculations. Then, to improve generality, an iteration control module is planned for iterative algorithms. In addition, to reduce the computational cost and memory requirements, the necessary optimizations are suggested for the time-consuming modules, which include the two-dimensional FFT/IFFT and complex-number calculations. Eventually, the TFDC hardware architecture is adopted for the hardware design of a real-time image restoration system. The result proves that the TFDC hardware architecture and its optimizations can be applied to image restoration algorithms based on TFDC, with good algorithm generality, hardware realizability and high efficiency.
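
    The computational pattern such hardware accelerates is FFT, then per-pixel complex arithmetic, then IFFT. A software sketch of one such algorithm (Wiener-filter restoration; the blur kernel and noise level are made-up placeholders):

        # Sketch of the FFT -> pointwise complex arithmetic -> IFFT pattern:
        # Wiener-filter restoration of an image blurred by a known kernel.
        import numpy as np

        rng = np.random.default_rng(3)
        img = rng.random((64, 64))
        psf = np.zeros((64, 64)); psf[:3, :3] = 1 / 9      # 3x3 box blur
        H = np.fft.fft2(psf)
        blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * H))

        snr = 100.0
        wiener = np.conj(H) / (np.abs(H) ** 2 + 1 / snr)   # regularized inverse
        restored = np.real(np.fft.ifft2(np.fft.fft2(blurred) * wiener))
        print("max abs restoration error:", float(np.max(np.abs(restored - img))))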

  18. Business resilience system (BRS) driven through Boolean, fuzzy logics and cloud computation real and near real time analysis and decision making system

    CERN Document Server

    Zohuri, Bahman

    2017-01-01

    This book provides a technical approach to a Business Resilience System with its Risk Atom and Processing Data Point based on fuzzy logic and cloud computation in real time. Its purpose and objectives define a clear set of expectations for organizations and enterprises so their network systems and supply chains are totally resilient and protected against cyber-attacks, manmade threats, and natural disasters. These enterprises include financial, organizational, homeland security, and supply chain operations with multi-point manufacturing across the world. Market shares and marketing advantages are expected to result from the implementation of the system. The collected information and defined objectives form the basis for monitoring and analyzing the data through cloud computation, and will guarantee their survivability against any unexpected threats. This book will be useful for advanced undergraduate and graduate students in the field of computer engineering, and for engineers who work for manufacturing com...

  19. Online Operation Guidance of Computer System Used in Real-Time Distance Education Environment

    Science.gov (United States)

    He, Aiguo

    2011-01-01

    Computer systems are useful for improving real-time and interactive distance education activities, especially when a large number of students participate in one distance lecture together and every student uses their own computer to share teaching materials or control discussions in the virtual classroom. The problem is that within…

  20. Just In Time Value Chain Total Quality Management Part Of Technical Strategic Management Accounting

    Directory of Open Access Journals (Sweden)

    Lesi Hertati

    2015-08-01

    Full Text Available This article aims to examine Just In Time, the value chain and Total Quality Management (TQM) as techniques in strategic management accounting. The aim of the Just In Time value chain, or Total Quality Management (TQM) value chain, is customer satisfaction in the long term, obtained from information. Quality information is the way to continuous improvement in order to increase the company's financial performance in the long term and to increase competitive advantage. The strategic management accounting process gathers competitor information, explores opportunities to reduce costs, and integrates accounting with an emphasis on the company's strategic position relative to the competition. An overall strategic plan is interrelated and serves as the basis for achieving future targets or goals.

  1. First passage times in homogeneous nucleation: Dependence on the total number of particles

    International Nuclear Information System (INIS)

    Yvinec, Romain; Bernard, Samuel; Pujo-Menjouet, Laurent; Hingant, Erwan

    2016-01-01

    Motivated by nucleation and molecular aggregation in physical, chemical, and biological settings, we present an extension to a thorough analysis of the stochastic self-assembly of a fixed number of identical particles in a finite volume. We study the statistics of times required for maximal clusters to be completed, starting from a pure-monomeric particle configuration. For finite volumes, we extend previous analytical approaches to the case of arbitrary size-dependent aggregation and fragmentation kinetic rates. For larger volumes, we develop a scaling framework to study the first assembly time behavior as a function of the total quantity of particles. We find that the mean time to first completion of a maximum-sized cluster may have a surprisingly weak dependence on the total number of particles. We highlight how higher statistics (variance, distribution) of the first passage time may nevertheless help to infer key parameters, such as the size of the maximum cluster. Finally, we present a framework to quantify formation of macroscopic sized clusters, which are (asymptotically) very unlikely and occur as a large deviation phenomenon from the mean-field limit. We argue that this framework is suitable to describe phase transition phenomena, as inherent infrequent stochastic processes, in contrast to classical nucleation theory.
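
    A stochastic simulation makes the quantity being analyzed concrete. The sketch below is an illustration, not the paper's method: it uses a constant pair-merging rate rather than the size-dependent kinetics treated above, and samples the first time M monomers coalesce into a single maximal cluster:

        # Illustrative Gillespie simulation of the first-assembly time:
        # any two clusters merge at rate k until one cluster of size M remains.
        import numpy as np
        rng = np.random.default_rng(2)

        def first_assembly_time(M, k=1.0):
            sizes = [1] * M                    # pure-monomeric start
            t = 0.0
            while len(sizes) > 1:
                n = len(sizes)
                rate = k * n * (n - 1) / 2     # total pair-merge propensity
                t += rng.exponential(1.0 / rate)
                i, j = rng.choice(n, size=2, replace=False)
                sizes[i] += sizes[j]           # merge cluster j into i
                sizes.pop(j)
            return t

        times = [first_assembly_time(M=20) for _ in range(200)]
        print(f"mean={np.mean(times):.3f}  std={np.std(times):.3f}")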

  2. A real-time computational model for estimating kinematics of ankle ligaments.

    Science.gov (United States)

    Zhang, Mingming; Davies, T Claire; Zhang, Yanxin; Xie, Sheng Quan

    2016-01-01

    An accurate assessment of ankle ligament kinematics is crucial in understanding the injury mechanisms and can help to improve the treatment of an injured ankle, especially when used in conjunction with robot-assisted therapy. A number of computational models have been developed and validated for assessing the kinematics of ankle ligaments. However, few of them can perform real-time assessment to allow for an input into robotic rehabilitation programs. An ankle computational model was proposed and validated to quantify the kinematics of ankle ligaments as the foot moves in real time. This model consists of three bone segments with three rotational degrees of freedom (DOFs) and 12 ankle ligaments. This model uses inputs for three position variables that can be measured from sensors in many ankle robotic devices that detect postures within the foot-ankle environment, and outputs the kinematics of ankle ligaments. Validation of this model in terms of ligament length and strain was conducted by comparing it with published data on cadaver anatomy and magnetic resonance imaging. The model's predicted ligament lengths and strains agree with those from the published studies, but are sensitive to ligament attachment positions. This ankle computational model has the potential to be used in robot-assisted therapy for real-time assessment of ligament kinematics. The results provide information regarding the quantification of kinematics associated with ankle ligaments related to the disability level and can be used for optimizing the robotic training trajectory.
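
    The core geometric computation is simple: rotate the moving segment's attachment point, then take the distance between attachment points as the ligament length and (L - L0)/L0 as the strain. A minimal sketch (the coordinates and the single-axis rotation are hypothetical placeholders, not the model's anatomical data):

        # Minimal sketch: ligament length and strain under a foot rotation.
        import numpy as np

        def rot_x(a):                      # rotation about the x-axis
            c, s = np.cos(a), np.sin(a)
            return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

        tibia_attach = np.array([0.00, 0.02, 0.40])       # fixed segment (m)
        talus_attach = np.array([0.01, 0.03, 0.36])       # moving segment (m)
        L0 = np.linalg.norm(talus_attach - tibia_attach)  # resting length

        for deg in (0, 10, 20):                           # dorsiflexion angles
            p = rot_x(np.radians(deg)) @ talus_attach
            L = np.linalg.norm(p - tibia_attach)
            print(f"{deg:2d} deg  L={L * 1000:6.2f} mm  strain={(L - L0) / L0:+.3f}")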

  3. Real time simulation of large systems on mini-computer

    International Nuclear Information System (INIS)

    Nakhle, Michel; Roux, Pierre.

    1979-01-01

    Most simulation languages will only accept an explicit formulation of differential equations, and logical variables hold no special status therein. The integration step of the usual methods is limited by the smallest time constant of the model submitted. The NEPTUNIX 2 simulation software has a language that will accept implicit equations and an integration method whose variable step is not limited by the time constants of the model. This, together with strong optimization of the generated code for time and memory resources, makes NEPTUNIX 2 a basic tool for simulation on mini-computers. Since the logical variables are specific entities under centralized control, correct processing of discontinuities and synchronization with a real process are feasible. NEPTUNIX 2 is the industrial version of NEPTUNIX 1. [fr]

  4. Real-time computing in environmental monitoring of a nuclear power plant

    International Nuclear Information System (INIS)

    Deme, S.; Lang, E.; Nagy, Gy.

    1987-06-01

    A real-time computing method is described for calculating the environmental radiation exposure due to a nuclear power plant, both during normal operation and in case of accident. The effects of the Gaussian plume are recalculated every ten minutes based on meteorological parameters measured at heights of 20 and 120 m as well as on emission data. During normal operation the quantity of radioactive materials released through the stacks is measured and registered while, in an accident, the source strength is unknown and the calculated relative data are normalized to the values measured at the eight environmental monitoring stations. The doses due to noble gases and to dry and wet deposition, as well as the time integral of the 131I concentration, are calculated and stored by a professional personal computer for 720 points of the environment within an 11 km radius. (author)
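
    The core of such a system is the Gaussian plume formula for air concentration. A sketch of the standard form (the release rate, wind speed and dispersion coefficients below are placeholders, not the plant's operational values):

        # Standard Gaussian plume concentration with ground reflection.
        import numpy as np

        def plume_conc(Q, u, y, z, H, sy, sz):
            """Q release rate (Bq/s), u wind speed (m/s), receptor offset y,
            height z, effective stack height H, dispersion sigmas sy, sz (m)."""
            lateral = np.exp(-y**2 / (2 * sy**2))
            vertical = (np.exp(-(z - H)**2 / (2 * sz**2))
                        + np.exp(-(z + H)**2 / (2 * sz**2)))  # reflection term
            return Q / (2 * np.pi * u * sy * sz) * lateral * vertical

        # e.g. on the plume axis at ground level, 100 m stack, sigmas for ~1 km:
        print(plume_conc(Q=1e9, u=5.0, y=0.0, z=0.0, H=100.0, sy=80.0, sz=50.0))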

  5. Design and development of a run-time monitor for multi-core architectures in cloud computing.

    Science.gov (United States)

    Kang, Mikyung; Kang, Dong-In; Crago, Stephen P; Park, Gyung-Leen; Lee, Junghoon

    2011-01-01

    Cloud computing is a new information technology trend that moves computing and data away from desktops and portable PCs into large data centers. The basic principle of cloud computing is to deliver applications as services over the Internet as well as infrastructure. A cloud is a type of parallel and distributed system consisting of a collection of inter-connected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources. The large-scale distributed applications on a cloud require adaptive service-based software, which has the capability of monitoring system status changes, analyzing the monitored information, and adapting its service configuration while considering tradeoffs among multiple QoS features simultaneously. In this paper, we design and develop a Run-Time Monitor (RTM) which is a system software to monitor the application behavior at run-time, analyze the collected information, and optimize cloud computing resources for multi-core architectures. RTM monitors application software through library instrumentation as well as underlying hardware through a performance counter optimizing its computing configuration based on the analyzed data.

  6. Design and Development of a Run-Time Monitor for Multi-Core Architectures in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Junghoon Lee

    2011-03-01

    Full Text Available Cloud computing is a new information technology trend that moves computing and data away from desktops and portable PCs into large data centers. The basic principle of cloud computing is to deliver applications as services over the Internet as well as infrastructure. A cloud is a type of parallel and distributed system consisting of a collection of inter-connected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources. The large-scale distributed applications on a cloud require adaptive service-based software, which has the capability of monitoring system status changes, analyzing the monitored information, and adapting its service configuration while considering tradeoffs among multiple QoS features simultaneously. In this paper, we design and develop a Run-Time Monitor (RTM) which is a system software to monitor the application behavior at run-time, analyze the collected information, and optimize cloud computing resources for multi-core architectures. RTM monitors application software through library instrumentation as well as underlying hardware through a performance counter, optimizing its computing configuration based on the analyzed data.

  7. Batch Scheduling for Hybrid Assembly Differentiation Flow Shop to Minimize Total Actual Flow Time

    Science.gov (United States)

    Maulidya, R.; Suprayogi; Wangsaputra, R.; Halim, A. H.

    2018-03-01

    A hybrid assembly differentiation flow shop is a three-stage flow shop consisting of Machining, Assembly and Differentiation Stages and producing different types of products. In the machining stage, parts are processed in batches on different (unrelated) machines. In the assembly stage, each part of the different parts is assembled into an assembly product. Finally, the assembled products will further be processed into different types of final products in the differentiation stage. In this paper, we develop a batch scheduling model for a hybrid assembly differentiation flow shop to minimize the total actual flow time, defined as the total time parts spend on the shop floor from their arrival times until their due dates. We also propose a heuristic algorithm for solving the problem. The proposed algorithm is tested using a set of hypothetical data. The solution shows that the algorithm can solve the problem effectively.

  8. Bound on quantum computation time: Quantum error correction in a critical environment

    International Nuclear Information System (INIS)

    Novais, E.; Mucciolo, Eduardo R.; Baranger, Harold U.

    2010-01-01

    We obtain an upper bound on the time available for quantum computation for a given quantum computer and decohering environment with quantum error correction implemented. First, we derive an explicit quantum evolution operator for the logical qubits and show that it has the same form as that for the physical qubits but with a reduced coupling strength to the environment. Using this evolution operator, we find the trace distance between the real and ideal states of the logical qubits in two cases. For a super-Ohmic bath, the trace distance saturates, while for Ohmic or sub-Ohmic baths, there is a finite time before the trace distance exceeds a value set by the user.
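
    The figure of merit used above is the trace distance D(ρ, σ) = ½ Tr|ρ − σ|. A small numerical illustration (the states below are toy examples, not the paper's logical-qubit states):

        # Trace distance between two density matrices via the eigenvalues
        # of their (Hermitian) difference.
        import numpy as np

        def trace_distance(rho, sigma):
            eigvals = np.linalg.eigvalsh(rho - sigma)
            return 0.5 * np.sum(np.abs(eigvals))

        rho   = np.array([[1.0, 0.0], [0.0, 0.0]])      # ideal |0><0|
        eps   = 0.01                                    # decoherence strength
        sigma = np.array([[1 - eps, 0.0], [0.0, eps]])  # slightly mixed state

        print(trace_distance(rho, sigma))  # 0.01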

  9. Computer Technology for Industry

    Science.gov (United States)

    1979-01-01

    In this age of the computer, more and more business firms are automating their operations for increased efficiency in a great variety of jobs, from simple accounting to managing inventories, from precise machining to analyzing complex structures. In the interest of national productivity, NASA is providing assistance both to longtime computer users and newcomers to automated operations. Through a special technology utilization service, NASA saves industry time and money by making available already developed computer programs which have secondary utility. A computer program is essentially a set of instructions which tells the computer how to produce desired information or effect by drawing upon its stored input. Developing a new program from scratch can be costly and time-consuming. Very often, however, a program developed for one purpose can readily be adapted to a totally different application. To help industry take advantage of existing computer technology, NASA operates the Computer Software Management and Information Center (COSMIC) (registered trademark), located at the University of Georgia. COSMIC maintains a large library of computer programs developed for NASA, the Department of Defense, the Department of Energy and other technology-generating agencies of the government. The Center gets a continual flow of software packages, screens them for adaptability to private sector usage, stores them and informs potential customers of their availability.

  10. The Educator´s Approach to Media Training and Computer Games within Leisure Time of School-children

    OpenAIRE

    MORAVCOVÁ, Dagmar

    2009-01-01

    The paper describes possible ways of approaching computer games playing as part of leisure time of school-children and deals with the significance of media training in leisure time. At first it specifies the concept of leisure time and its functions, then shows some positive and negative effects of the media. It further describes classical computer games, the problem of excess computer game playing and means of prevention. The paper deals with the educator's personality and the importance of ...

  11. Flexible structure control experiments using a real-time workstation for computer-aided control engineering

    Science.gov (United States)

    Stieber, Michael E.

    1989-01-01

    A Real-Time Workstation for Computer-Aided Control Engineering has been developed jointly by the Communications Research Centre (CRC) and Ruhr-Universitaet Bochum (RUB), West Germany. The system is presently used for the development and experimental verification of control techniques for large space systems with significant structural flexibility. The Real-Time Workstation essentially is an implementation of RUB's extensive Computer-Aided Control Engineering package KEDDC on an INTEL micro-computer running under the RMS real-time operating system. The portable system supports system identification, analysis, control design and simulation, as well as the immediate implementation and test of control systems. The Real-Time Workstation is currently being used by CRC to study control/structure interaction on a ground-based structure called DAISY, whose design was inspired by a reflector antenna. DAISY emulates the dynamics of a large flexible spacecraft with the following characteristics: rigid body modes, many clustered vibration modes with low frequencies and extremely low damping. The Real-Time Workstation was found to be a very powerful tool for experimental studies, supporting control design and simulation, and conducting and evaluating tests within one integrated environment.

  12. A Real-Time Plagiarism Detection Tool for Computer-Based Assessments

    Science.gov (United States)

    Jeske, Heimo J.; Lall, Manoj; Kogeda, Okuthe P.

    2018-01-01

    Aim/Purpose: The aim of this article is to develop a tool to detect plagiarism in real time amongst students being evaluated for learning in a computer-based assessment setting. Background: Cheating or copying all or part of source code of a program is a serious concern to academic institutions. Many academic institutions apply a combination of…

  13. Instructional Advice, Time Advice and Learning Questions in Computer Simulations

    Science.gov (United States)

    Rey, Gunter Daniel

    2010-01-01

    Undergraduate students (N = 97) used an introductory text and a computer simulation to learn fundamental concepts about statistical analyses (e.g., analysis of variance, regression analysis and General Linear Model). Each learner was randomly assigned to one cell of a 2 (with or without instructional advice) x 2 (with or without time advice) x 2…

  14. Estimation of total bacteria by real-time PCR in patients with periodontal disease.

    Science.gov (United States)

    Brajović, Gavrilo; Popović, Branka; Puletić, Miljan; Kostić, Marija; Milasin, Jelena

    2016-01-01

    Periodontal diseases are associated with the presence of elevated levels of bacteria within the gingival crevice. The aim of this study was to evaluate the total amount of bacteria in subgingival plaque samples from patients with periodontal disease. A quantitative evaluation of the total bacterial amount using quantitative real-time polymerase chain reaction (qRT-PCR) was performed on 20 samples from patients with ulceronecrotic periodontitis and on 10 samples from healthy subjects. The estimation of the total bacterial amount was based on the gene copy number for 16S rRNA, determined by comparison with the Ct values/gene copy numbers of the standard curve. A statistically significant difference between the average gene copy number of total bacteria in periodontal patients (2.55 x 10⁷) and healthy controls (2.37 x 10⁶) was found (p = 0.01). Also, a trend of higher gene copy numbers in deeper periodontal lesions (> 7 mm) was confirmed by a positive coefficient of correlation (r = 0.073). The quantitative estimation of total bacteria based on gene copy number could be an important additional tool in diagnosing periodontitis.
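
    The standard-curve step described above amounts to a linear fit of Ct against log10(copy number) for known standards, inverted for unknown samples. A sketch with hypothetical values:

        # Estimating gene copy number from Ct via a standard curve.
        import numpy as np

        log10_copies = np.array([3, 4, 5, 6, 7], dtype=float)    # standards
        ct           = np.array([30.1, 26.8, 23.4, 20.0, 16.7])  # measured Ct

        slope, intercept = np.polyfit(log10_copies, ct, 1)
        print(f"slope={slope:.2f}  PCR efficiency={10 ** (-1 / slope) - 1:.1%}")

        ct_unknown = 22.0
        copies = 10 ** ((ct_unknown - intercept) / slope)
        print(f"estimated copy number: {copies:.3g}")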

  15. Cloud computing for comparative genomics

    Directory of Open Access Journals (Sweden)

    Pivovarov Rimma

    2010-05-01

    Full Text Available Abstract Background Large comparative genomics studies and tools are becoming increasingly more compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Computing Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. Results We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high capacity compute nodes using the Amazon Web Service Elastic Map Reduce and included a wide mix of large and small genomes. The total computation time took just under 70 hours and cost a total of $6,302 USD. Conclusions The effort to transform existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provides a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems.

  16. What are the important manoeuvres for beginners to minimize surgical time in primary total knee arthroplasty?

    Science.gov (United States)

    Harato, Kengo; Maeno, Shinichi; Tanikawa, Hidenori; Kaneda, Kazuya; Morishige, Yutaro; Nomoto, So; Niki, Yasuo

    2016-08-01

    It was hypothesized that the surgical time of beginners would be much longer than that of experts. Our purpose was to investigate and clarify the important manoeuvres for beginners to minimize surgical time in primary total knee arthroplasty (TKA) as a multicentre study. A total of 300 knees in 248 patients (average age 74.6 years) were enrolled. All TKAs were done using the same instruments and the same measured resection technique at 14 facilities by 25 orthopaedic surgeons. Surgeons were divided into three surgeon groups (four experts, nine medium-volume surgeons and 12 beginners). The surgical technique was divided into five phases. Detailed surgical time and the ratio of the time in each phase to overall surgical time were recorded and compared among the groups in each phase. A total of 62, 119, and 119 TKAs were done by beginners, medium-volume surgeons, and experts, respectively. Significant differences in surgical time among the groups were seen in each phase. Concerning the ratio of the time, experts and medium-volume surgeons seemed cautious in fixation of the permanent component compared to other phases. Interestingly, even in ratio, beginners and medium-volume surgeons took more time in exposure of soft tissue compared to experts (0.14 in beginners, 0.13 in medium-volume surgeons, 0.11 in experts, P < 0.05). Beginners also took more time in exposure and closure of soft tissue compared to experts. Improvement in basic technique is essential to minimize surgical time among beginners. First of all, surgical instructors should teach basic techniques in primary TKA to beginners. Therapeutic studies, Level IV.

  17. Alternative majority-voting methods for real-time computing systems

    Science.gov (United States)

    Shin, Kang G.; Dolter, James W.

    1989-01-01

    Two techniques that provide a compromise between the high time overhead in maintaining synchronous voting and the difficulty of combining results in asynchronous voting are proposed. These techniques are specifically suited for real-time applications with a single-source/single-sink structure that need instantaneous error masking. They provide a compromise between a tightly synchronized system in which the synchronization overhead can be quite high, and an asynchronous system which lacks suitable algorithms for combining the output data. Both quorum-majority voting (QMV) and compare-majority voting (CMV) are most applicable to distributed real-time systems with single-source/single-sink tasks. All real-time systems eventually have to resolve their outputs into a single action at some stage. The development of the advanced information processing system (AIPS) and other similar systems serve to emphasize the importance of these techniques. Time bounds suggest that it is possible to reduce the overhead for quorum-majority voting to below that for synchronous voting. All the bounds assume that the computation phase is nonpreemptive and that there is no multitasking.

  18. Modeling of requirement specification for safety critical real time computer system using formal mathematical specifications

    International Nuclear Information System (INIS)

    Sankar, Bindu; Sasidhar Rao, B.; Ilango Sambasivam, S.; Swaminathan, P.

    2002-01-01

    Full text: Real time computer systems are increasingly used for safety critical supervision and control of nuclear reactors. Typical application areas are supervision of reactor core against coolant flow blockage, supervision of clad hot spot, supervision of undesirable power excursion, power control and control logic for fuel handling systems. The most frequent cause of fault in safety critical real time computer system is traced to fuzziness in requirement specification. To ensure the specified safety, it is necessary to model the requirement specification of safety critical real time computer systems using formal mathematical methods. Modeling eliminates the fuzziness in the requirement specification and also helps to prepare the verification and validation schemes. Test data can be easily designed from the model of the requirement specification. Z and B are the popular languages used for modeling the requirement specification. A typical safety critical real time computer system for supervising the reactor core of prototype fast breeder reactor (PFBR) against flow blockage is taken as case study. Modeling techniques and the actual model are explained in detail. The advantages of modeling for ensuring the safety are summarized

  19. Computer-assisted total knee arthroplasty marketing and patient education: an evaluation of quality, content and accuracy of related websites.

    Science.gov (United States)

    Shemesh, Shai S; Bronson, Michael J; Moucha, Calin S

    2016-10-01

    The internet is increasingly being used as a resource for health-related information by the general public. We sought to establish the authorship, content and accuracy of the information available online regarding computer-assisted total knee arthroplasty (CA-TKA). One hundred fifty search results from three leading search engines available online (Google, Yahoo!, Bing) from ten different countries worldwide were reviewed. While private physicians/groups authored 50.7 % of the websites, only 17.3 % were authored by a hospital/university. As compared to traditional TKA, 59.3 % of the websites claimed that navigated TKA offers better longevity, 46.6 % claimed accelerated recovery and 26 % claimed fewer complications. Only 11.3 % mentioned the prolonged operating room time required, and only 15.3 % noted the current lack of long-term evidence in support of this technology. Patients seeking information regarding CA-TKA through the major search engines are likely to encounter websites presenting a narrow, unscientific, viewpoint of the present technology, putting emphasis on unsubstantiated benefits while disregarding potential drawbacks. Survey of Materials-Internet.

  20. Physiotherapy Exercise After Fast-Track Total Hip and Knee Arthroplasty: Time for Reconsideration?

    DEFF Research Database (Denmark)

    Bandholm, Thomas; Kehlet, Henrik

    2012-01-01

    Bandholm T, Kehlet H. Physiotherapy exercise after fast-track total hip and knee arthroplasty: time for reconsideration? Major surgery, including total hip arthroplasty (THA) and total knee arthroplasty (TKA), is followed by a convalescence period, during which the loss of muscle strength ... fast-track methodology or enhanced recovery programs. It is the nature of this methodology to systematically and scientifically optimize all perioperative care components, with the overall goal of enhancing recovery. This is also the case for the care component "physiotherapy exercise" after THA and TKA. The 2 latest meta-analyses on the effectiveness of physiotherapy exercise after THA and TKA generally conclude that physiotherapy exercise after THA and TKA either does not work or is not very effective. The reason for this may be that the "pill" of physiotherapy exercise typically offered after THA and TKA does ...

  1. Joint association of physical activity in leisure and total sitting time with metabolic syndrome amongst 15,235 Danish adults

    DEFF Research Database (Denmark)

    Petersen, Christina Bjørk; Nielsen, Asser Jon; Bauman, Adrian

    2014-01-01

    BACKGROUND: Recent studies suggest that physical inactivity as well as sitting time are associated with metabolic syndrome. Our aim was to examine joint associations of leisure time physical activity and total daily sitting time with metabolic syndrome. METHODS: Leisure time physical activity and total daily sitting time were assessed by self-report in 15,235 men and women in the Danish Health Examination Survey 2007-2008. Associations between leisure time physical activity, total sitting time and metabolic syndrome were investigated in logistic regression analysis. RESULTS: Adjusted odds ratios (OR) for metabolic syndrome were 2.14 (95% CI: 1.88-2.43) amongst participants who were inactive in leisure time compared to the most active, and 1.42 (95% CI: 1.26-1.61) amongst those who sat for ≥10 h/day compared to ...

  2. Using a Cloud Computing System to Reduce Door-to-Balloon Time in Acute ST-Elevation Myocardial Infarction Patients Transferred for Percutaneous Coronary Intervention

    Directory of Open Access Journals (Sweden)

    Chi-Kung Ho

    2017-01-01

    Full Text Available Background. This study evaluated the impact on clinical outcomes of using a cloud computing system to reduce percutaneous coronary intervention hospital door-to-balloon (DTB) time for ST segment elevation myocardial infarction (STEMI). Methods. A total of 369 patients before and after implementation of the transfer protocol were enrolled. Of these patients, 262 were transferred through the protocol while the other 107 patients were transferred through the traditional referral process. Results. There were no significant differences in DTB time, pain to door of STEMI receiving center arrival time, and pain to balloon time between the two groups. Pain to electrocardiography time in patients with Killip I/II and catheterization laboratory to balloon time in patients with Killip III/IV were significantly reduced in the protocol group compared to the traditional referral group (both p<0.05). There were also no remarkable differences in the complication rate and 30-day mortality between the two groups. The multivariate analysis revealed that the independent predictors of 30-day mortality were elderly patients, advanced Killip score, and higher level of troponin-I. Conclusions. This study showed that the present protocol could reduce pain to electrocardiography time in Killip I/II patients and catheterization laboratory to balloon time in Killip III/IV patients. However, using a cloud computing system in the present protocol did not reduce DTB time.

  3. Computer-determined assay time based on preset precision

    International Nuclear Information System (INIS)

    Foster, L.A.; Hagan, R.; Martin, E.R.; Wachter, J.R.; Bonner, C.A.; Malcom, J.E.

    1994-01-01

    Most current assay systems for special nuclear materials (SNM) operate on the principle of a fixed assay time which provides acceptable measurement precision without sacrificing the required throughput of the instrument. Waste items to be assayed for SNM content can contain a wide range of nuclear material. Counting all items for the same preset assay time results in a wide range of measurement precision and wastes time at the upper end of the calibration range. A short time sample taken at the beginning of the assay could optimize the analysis time on the basis of the required measurement precision. To illustrate the technique of automatically determining the assay time, measurements were made with a segmented gamma scanner at the Plutonium Facility of Los Alamos National Laboratory with the assay time for each segment determined by counting statistics in that segment. Segments with very little SNM were quickly determined to be below the lower limit of the measurement range and the measurement was stopped. Segments with significant SNM were optimally assayed to the preset precision. With this method the total assay time for each item is determined by the desired preset precision. This report describes the precision-based algorithm and presents the results of measurements made to test its validity.
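
    For Poisson counting, the relative statistical uncertainty after accumulating N counts is 1/sqrt(N), so a precision-based stopping rule is straightforward to state. The simulation below is a schematic sketch of such a rule, not the report's algorithm; the count rates and limits are invented:

        # Schematic precision-based stopping rule: count until the relative
        # uncertainty 1/sqrt(N) meets the preset precision, or a maximum
        # time elapses (segments with very little SNM hit the time limit).
        import numpy as np
        rng = np.random.default_rng(0)

        def assay_time(rate_cps, target_rel_unc=0.01, dwell_s=1.0, max_s=600.0):
            counts, t = 0, 0.0
            while t < max_s:
                counts += rng.poisson(rate_cps * dwell_s)  # one dwell interval
                t += dwell_s
                if counts > 0 and counts ** -0.5 <= target_rel_unc:
                    break
            return t, counts

        for rate in (5.0, 200.0, 5000.0):
            t, n = assay_time(rate)
            print(f"rate={rate:7.1f} cps  stop at t={t:6.1f} s  N={n}")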

  4. Joint Time-Frequency-Space Classification of EEG in a Brain-Computer Interface Application

    Directory of Open Access Journals (Sweden)

    Molina Gary N Garcia

    2003-01-01

    Full Text Available Brain-computer interface is a growing field of interest in human-computer interaction with diverse applications ranging from medicine to entertainment. In this paper, we present a system which allows for classification of mental tasks based on a joint time-frequency-space decorrelation, in which mental tasks are measured via electroencephalogram (EEG) signals. The efficiency of this approach was evaluated by means of real-time experiments on two subjects performing three different mental tasks. To do so, a number of protocols for visualization, as well as training with and without feedback, were also developed. Obtained results show that it is possible to obtain good classification of simple mental tasks, in view of command and control, after a relatively small amount of training, with accuracies around 80%, and in real time.

  5. Cone-beam computed tomography fusion and navigation for real-time positron emission tomography-guided biopsies and ablations: a feasibility study.

    Science.gov (United States)

    Abi-Jaoudeh, Nadine; Mielekamp, Peter; Noordhoek, Niels; Venkatesan, Aradhana M; Millo, Corina; Radaelli, Alessandro; Carelsen, Bart; Wood, Bradford J

    2012-06-01

    To describe a novel technique for multimodality positron emission tomography (PET) fusion-guided interventions that combines cone-beam computed tomography (CT) with PET/CT before the procedure. Subjects were selected among patients scheduled for a biopsy or ablation procedure. The lesions were not visible with conventional imaging methods or did not have uniform uptake on PET. Clinical success was defined by adequate histopathologic specimens for molecular profiling or diagnosis and by lack of enhancement on follow-up imaging for ablation procedures. Time to target (time elapsed between the completion of the initial cone-beam CT scan and first tissue sample or treatment), total procedure time (time from the moment the patient was on the table until the patient was off the table), and number of times the needle was repositioned were recorded. Seven patients underwent eight procedures (two ablations and six biopsies). Registration and procedures were completed successfully in all cases. Clinical success was achieved in all biopsy procedures and in one of the two ablation procedures. The needle was repositioned once in one biopsy procedure only. On average, the time to target was 38 minutes (range 13-54 min). Total procedure time was 95 minutes (range 51-240 min, which includes composite ablation). On average, fluoroscopy time was 2.5 minutes (range 1.3-6.2 min). An integrated cone-beam CT software platform can enable PET-guided biopsies and ablation procedures without the need for additional specialized hardware. Copyright © 2012 SIR. Published by Elsevier Inc. All rights reserved.

  6. Computational model for real-time determination of tritium inventory in a detritiation installation

    International Nuclear Information System (INIS)

    Bornea, Anisia; Stefanescu, Ioan; Zamfirache, Marius; Stefan, Iuliana; Sofalca, Nicolae; Bidica, Nicolae

    2008-01-01

    Full text: At ICIT Rm. Valcea an experimental pilot plant was built, having as its main objective the development of a technology for detritiation of the heavy water processed in the CANDU-type reactors of the nuclear power plant at Cernavoda, Romania. Since the aspects related to safeguards and safety for such a detritiation installation are of great importance, a complex computational model has been developed. The model allows real-time calculation of the tritium inventory in a working installation. The detritiation technology applied is catalyzed isotopic exchange coupled with cryogenic distillation. Computational models for non-steady working conditions have been developed for each isotopic exchange process. By coupling these processes, the tritium inventory can be determined in real time. The computational model was developed based on the experience gained with the pilot installation. The model uses a set of parameters specific to the isotopic exchange processes. These parameters were experimentally determined in the pilot installation. The model is included in the monitoring system and uses as input data the parameters acquired in real time from the automation system of the pilot installation. A friendly interface has been created to visualize the final results as data or graphs. (authors)

  7. A real-time computer simulation of nuclear simulator software using standard PC hardware and linux environments

    International Nuclear Information System (INIS)

    Cha, K. H.; Kweon, K. C.

    2001-01-01

    A feasibility study, in which standard PC hardware and Real-Time Linux are applied to the real-time computer simulation of software for a nuclear simulator, is presented in this paper. The feasibility prototype was established with the existing software in the Compact Nuclear Simulator (CNS). Throughout the real-time implementation in the feasibility prototype, we identified that the approach can enable computer-based predictive simulation, due to both the remarkable improvement in real-time performance and the lesser effort needed for real-time implementation under standard PC hardware and Real-Time Linux environments.

  8. Hydrologic Response to Climate Change: Missing Precipitation Data Matters for Computed Timing Trends

    Science.gov (United States)

    Daniels, B.

    2016-12-01

    This work demonstrates the derivation of climate timing statistics and their application in determining the resulting hydroclimate impacts. Long-term daily precipitation observations from 50 California stations were used to compute climate trends of precipitation event Intensity, event Duration and Pause between events. Each precipitation event trend was then applied as input to a PRMS hydrology model which showed hydrology changes to recharge, baseflow, streamflow, etc. An important concern was precipitation uncertainty induced by missing observation values, causing errors in the quantification of precipitation trends. Many standard statistical techniques such as ARIMA and simple endogenous or even exogenous imputation were applied but failed to resolve these uncertainties. What helped resolve them was the use of multiple imputation techniques. This involved fitting Weibull probability distributions to multiply imputed values for the three precipitation trends. Permutation resampling techniques using Monte Carlo processing were then applied to the multiple imputation values to derive significance p-values for each trend. Significance at the 95% level was found for Intensity at 11 of the 50 stations, for Duration at 16 of the 50, and for Pause at 19, of which 12 were 99% significant. The significance-weighted trends for California are Intensity -4.61% per decade, Duration +3.49% per decade, and Pause +3.58% per decade. Two California basins with PRMS hydrologic models were studied: the Feather River in the northern Sierra Nevada mountains and the central coast Soquel-Aptos. Each local trend was changed without changing the other trends or the total precipitation. The Feather River Basin's critical supply to Lake Oroville and the State Water Project benefited from a total streamflow increase of 1.5%. The Soquel-Aptos Basin water supply was impacted by a total groundwater recharge decrease of -7.5% and a streamflow decrease of -3.2%.
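
    The permutation step is the easiest part to make concrete. The sketch below (with synthetic data standing in for one station's event-intensity record) computes a permutation p-value for a trend slope in the way the abstract describes:

        # Permutation-resampling p-value for a linear trend slope.
        import numpy as np
        rng = np.random.default_rng(1)

        years = np.arange(50)
        series = 0.02 * years + rng.normal(0, 0.5, size=years.size)

        def slope(y):
            return np.polyfit(years, y, 1)[0]

        obs = slope(series)
        perm = np.array([slope(rng.permutation(series)) for _ in range(5000)])
        p_value = np.mean(np.abs(perm) >= abs(obs))
        print(f"observed slope={obs:.4f}  permutation p-value={p_value:.4f}")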

  9. Software Accelerates Computing Time for Complex Math

    Science.gov (United States)

    2014-01-01

    Ames Research Center awarded Newark, Delaware-based EM Photonics Inc. SBIR funding to utilize graphics processing unit (GPU) technology, traditionally used for computer video games, to develop high-performance computing software called CULA. The software gives users the ability to run complex algorithms on personal computers with greater speed. As a result of the NASA collaboration, the number of employees at the company has increased 10 percent.

  10. Computational imaging with multi-camera time-of-flight systems

    KAUST Repository

    Shrestha, Shikhar

    2016-07-11

    Depth cameras are a ubiquitous technology used in a wide range of applications, including robotic and machine vision, human computer interaction, autonomous vehicles as well as augmented and virtual reality. In this paper, we explore the design and applications of phased multi-camera time-of-flight (ToF) systems. We develop a reproducible hardware system that allows for the exposure times and waveforms of up to three cameras to be synchronized. Using this system, we analyze waveform interference between multiple light sources in ToF applications and propose simple solutions to this problem. Building on the concept of orthogonal frequency design, we demonstrate state-of-the-art results for instantaneous radial velocity capture via Doppler time-of-flight imaging and we explore new directions for optically probing global illumination, for example by de-scattering dynamic scenes and by non-line-of-sight motion detection via frequency gating. © 2016 ACM.
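
    The baseline relation underlying any continuous-wave ToF camera is that a modulation at frequency f returns with a phase shift φ corresponding to depth d = cφ/(4πf), with unambiguous range c/(2f). A small worked example (the 30 MHz modulation frequency is a typical order of magnitude, not a figure from the paper):

        # Depth from phase for a continuous-wave time-of-flight camera.
        import numpy as np

        C = 299_792_458.0        # speed of light, m/s
        f_mod = 30e6             # assumed 30 MHz modulation frequency

        def depth_from_phase(phi_rad, f=f_mod):
            return C * phi_rad / (4 * np.pi * f)

        print(f"unambiguous range: {C / (2 * f_mod):.2f} m")           # ~5 m
        print(f"phase pi/2 -> depth {depth_from_phase(np.pi / 2):.3f} m")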

  11. Computer tomography urography assisted real-time ultrasound-guided percutaneous nephrolithotomy on renal calculus.

    Science.gov (United States)

    Fang, You-Qiang; Wu, Jie-Ying; Li, Teng-Cheng; Zheng, Hao-Feng; Liang, Guan-Can; Chen, Yan-Xiong; Hong, Xiao-Bin; Cai, Wei-Zhong; Zang, Zhi-Jun; Di, Jin-Ming

    2017-06-01

    This study aimed to assess the role of a pre-designed puncture route on computed tomography urography (CTU) in ultrasound-guided percutaneous nephrolithotomy (PCNL) for renal calculi. From August 2013 to May 2016, a total of 100 patients diagnosed with complex renal calculi in our hospital were randomly divided into a CTU group and a control group (without CTU assistance). CTU was used to design a rational puncture route in the CTU group. Ultrasound was used in both groups to establish a working channel in the operation areas. Patients' perioperative parameters and postoperative complications were recorded. All operations were successfully performed, without conversion to open surgery. The time for channel establishment in the CTU group (6.5 ± 4.3 minutes) was shorter than in the control group (10.0 ± 6.7 minutes) (P = .002). In addition, the CTU group had shorter operation times, lower rates of blood transfusion and secondary operation, and fewer established channels. The incidence of postoperative complications, including residual stones, sepsis, severe hemorrhage, and perirenal hematoma, was lower in the CTU group than in the control group. Pre-designing the puncture route on CTU images improves puncturing accuracy, reduces the number of channels established, and improves safety in ultrasound-guided PCNL for complex renal calculi, but at the cost of increased radiation exposure.

  12. Can a surgery-first orthognathic approach reduce the total treatment time?

    Science.gov (United States)

    Jeong, Woo Shik; Choi, Jong Woo; Kim, Do Yeon; Lee, Jang Yeol; Kwon, Soon Man

    2017-04-01

    Although pre-surgical orthodontic treatment has been accepted as a necessary process for stable orthognathic correction in the traditional orthognathic approach, recent advances in the application of miniscrews and in the pre-surgical simulation of orthodontic management using dental models have shown that it is possible to perform a surgery-first orthognathic approach without pre-surgical orthodontic treatment. This prospective study investigated the surgical outcomes of patients with diagnosed skeletal class III dentofacial deformities who underwent orthognathic surgery between December 2007 and December 2014. Cephalometric landmark data for patients undergoing the surgery-first approach were analyzed in terms of postoperative changes in vertical and horizontal skeletal pattern, dental pattern, and soft tissue profile. Forty-five consecutive Asian patients with skeletal class III dentofacial deformities who underwent surgery-first orthognathic surgery and 52 patients who underwent conventional two-jaw orthognathic surgery were included. The analysis revealed that the total treatment period for the surgery-first approach averaged 14.6 months, compared with 22.0 months for the orthodontics-first approach. Comparisons between the immediate postoperative and preoperative and between the postoperative and immediate postoperative cephalometric data revealed factors that correlated with the total treatment duration. The surgery-first orthognathic approach can dramatically reduce the total treatment time, with no major complications. Copyright © 2016 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  13. A neural computational model for animal's time-to-collision estimation.

    Science.gov (United States)

    Wang, Ling; Yao, Dezhong

    2013-04-17

    The time-to-collision (TTC) is the time elapsed before a looming object hits the subject. An accurate estimation of TTC plays a critical role in the survival of animals in nature and acts as an important factor in artificial intelligence systems that depend on judging and avoiding potential dangers. The theoretic formula for TTC is 1/τ≈θ'/sin θ, where θ and θ' are the visual angle and its variation, respectively, and the widely used approximation computational model is θ'/θ. However, both of these measures are too complex to be implemented by a biological neuronal model. We propose a new simple computational model: 1/τ≈Mθ-P/(θ+Q)+N, where M, P, Q, and N are constants that depend on a predefined visual angle. This model, weighted summation of visual angle model (WSVAM), can achieve perfect implementation through a widely accepted biological neuronal model. WSVAM has additional merits, including a natural minimum consumption and simplicity. Thus, it yields a precise and neuronal-implemented estimation for TTC, which provides a simple and convenient implementation for artificial vision, and represents a potential visual brain mechanism.
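
    Since the constants M, P, Q and N are stated to depend on a predefined visual angle, one way to see the model at work is to fit them against the theoretical relation over a range of angles. A sketch follows (the angle range, rate θ' and starting guesses are assumptions, and the model form is parsed as printed in the abstract):

        # Fitting the WSVAM constants to the theoretical 1/tau = theta'/sin(theta).
        import numpy as np
        from scipy.optimize import curve_fit

        theta = np.linspace(0.05, 1.0, 200)       # visual angle, rad
        theta_dot = 0.1                            # rad/s, held fixed
        inv_tau_true = theta_dot / np.sin(theta)   # theoretical 1/tau

        def wsvam(th, M, P, Q, N):
            return M * th - P / (th + Q) + N

        p0 = [0.0, -0.1, 0.01, 0.0]                # start near 0.1/theta
        params, _ = curve_fit(wsvam, theta, inv_tau_true, p0=p0)
        resid = np.max(np.abs(wsvam(theta, *params) - inv_tau_true))
        print("M, P, Q, N =", np.round(params, 4), " max residual:", round(resid, 4))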

  14. Reduced Operating Time but Not Blood Loss With Cruciate Retaining Total Knee Arthroplasty

    Science.gov (United States)

    Vermesan, Dinu; Trocan, Ilie; Prejbeanu, Radu; Poenaru, Dan V; Haragus, Horia; Gratian, Damian; Marrelli, Massimo; Inchingolo, Francesco; Caprio, Monica; Cagiano, Raffaele; Tatullo, Marco

    2015-01-01

    Background There is no consensus regarding the use of cruciate-retaining or cruciate-replacing implants for patients with limited deformity who undergo a total knee replacement. The scope of this paper is to evaluate whether a cruciate sparing total knee replacement has a reduced operating time compared to a posterior stabilized implant. Methods For this purpose, we performed a randomized study on 50 subjects. All procedures were performed by a single surgeon under the same conditions to minimize bias, and only knees with less than 20° varus deviation and/or a maximum 15° fixed flexion contracture were included. Results Surgery time was significantly shorter with the cruciate retaining implant (P = 0.0037). The mean duration for the Vanguard implant was 68.9 (14.7) minutes and for the NexGen II Legacy 80.2 (11.3) minutes. A higher range of motion, but no significant difference in Knee Society Scores, was found at 6 months follow-up. Conclusions In conclusion, both implants had the potential to assure great outcomes. However, if a decision has to be made, choosing a cruciate retaining procedure could significantly reduce the surgical time. When performed under tourniquet, this gain does not lead to reduced blood loss. PMID:25584102

  15. Total-System Performance Assessment for the Yucca Mountain Site

    International Nuclear Information System (INIS)

    Wilson, M.L.

    2001-01-01

    Yucca Mountain, Nevada, is under consideration as a potential site for a repository for high-level radioactive waste. Total-system performance-assessment simulations are performed to evaluate the safety of the site. Features, events, and processes have been systematically evaluated to determine which ones are significant to the safety assessment. Computer models of the disposal system have been developed within a probabilistic framework, including both engineered and natural components. Selected results are presented for three different total-system simulations, and the behavior of the disposal system is discussed. The results show that risk is dominated by igneous activity at early times, because the robust waste-package design prevents significant nominal (non-disruptive) releases for tens of thousands of years or longer. The uncertainty in the nominal performance is dominated by uncertainties related to waste-package corrosion at early times and by uncertainties in the natural system, most significantly infiltration, at late times.

  16. Independent and combined associations of total sedentary time and television viewing time with food intake patterns of 9- to 11-year-old Canadian children.

    Science.gov (United States)

    Borghese, Michael M; Tremblay, Mark S; Leduc, Genevieve; Boyer, Charles; Bélanger, Priscilla; LeBlanc, Allana G; Francis, Claire; Chaput, Jean-Philippe

    2014-08-01

    The relationships among sedentary time, television viewing time, and dietary patterns in children are not fully understood. The aim of this paper was to determine which of self-reported television viewing time or objectively measured sedentary time is a better correlate of the frequency of consumption of healthy and unhealthy foods. A cross-sectional study was conducted of 9- to 11-year-old children (n = 523; 57.1% female) from Ottawa, Ontario, Canada. Accelerometers were used to determine total sedentary time, and questionnaires were used to determine the number of hours of television watching and the frequency of consumption of foods per week. Television viewing was negatively associated with the frequency of consumption of fruits, vegetables, and green vegetables, and positively associated with the frequency of consumption of sweets, soft drinks, diet soft drinks, pastries, potato chips, French fries, fruit juices, ice cream, fried foods, and fast food. Except for diet soft drinks and fruit juices, these associations were independent of covariates, including sedentary time. Total sedentary time was negatively associated with the frequency of consumption of sports drinks, independent of covariates, including television viewing. In combined sedentary time and television viewing analyses, children watching >2 h of television per day consumed several unhealthy food items more frequently than did children watching ≤2 h of television, regardless of sedentary time. In conclusion, this paper provides evidence to suggest that television viewing time is more strongly associated with unhealthy dietary patterns than is total sedentary time. Future research should focus on reducing television viewing time, as a means of improving dietary patterns and potentially reducing childhood obesity.

  17. Variable dead time counters: 2. A computer simulation

    International Nuclear Information System (INIS)

    Hooton, B.W.; Lees, E.W.

    1980-09-01

    A computer model has been developed to give a pulse train which simulates that generated by a variable dead time counter (VDC) used in safeguards determination of Pu mass. The model is applied to two algorithms generally used for VDC analysis. It is used to determine their limitations at high counting rates and to investigate the effects of random neutrons from (α,n) reactions. Both algorithms are found to be deficient for use with masses of ²⁴⁰Pu greater than 100 g, and one commonly used algorithm is shown, by use of the model and also by theory, to yield a result which is dependent on the random neutron intensity. (author)
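
As a rough illustration of the kind of simulation described (not the authors' actual model), the sketch below generates a pulse train from a Poisson background of (α,n) singles plus correlated fission bursts, then counts it with a non-paralyzable dead time; all rates and multiplicities are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(1)

def poisson_times(rate, t_max):
    """Arrival times of a homogeneous Poisson process on [0, t_max)."""
    n = rng.poisson(rate * t_max)
    return np.sort(rng.uniform(0.0, t_max, n))

t_max = 10.0                                    # seconds of simulated counting
randoms = poisson_times(2000.0, t_max)          # (alpha,n) singles, 2 kHz (assumed)
fissions = poisson_times(300.0, t_max)          # fission events, 300 Hz (assumed)
mult = rng.poisson(0.8, fissions.size)          # detected pulses per fission
bursts = np.repeat(fissions, mult) + rng.exponential(50e-6, mult.sum())

train = np.sort(np.concatenate([randoms, bursts]))

def count_with_dead_time(times, tau):
    """Non-paralyzable counter: pulses within tau of the last accepted one are lost."""
    accepted, last = 0, -np.inf
    for t in times:
        if t - last >= tau:
            accepted += 1
            last = t
    return accepted

for tau in (0.0, 10e-6, 100e-6):
    print(f"dead time {tau*1e6:6.1f} us -> {count_with_dead_time(train, tau)} counts")
```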

  18. Total system for manufacture of nuclear vessels by computer: VECTRON

    International Nuclear Information System (INIS)

    Inagawa, Jin; Ueno, Osamu; Hanai, Yoshiharu; Ohkawa, Isao; Washizu, Hideyuki

    1980-01-01

    VECTRON (Vessel Engineering by Computer Tool and Rapid Operating for the N/C System) is a CAM (Computer Aided Manufacturing) system that has been developed to produce high quality and highly accurate vessels for nuclear power plants and other industrial plants. Outputs of this system are design drawings, manufacturing information and magnetic tapes of the N/C marking machine for vessel shell plates including their attachments. And it can also output information at each stage of designing, marking, cutting, forming and assembling by treating the vessels in three dimensions and by using data filing systems and plotting program for general use. The data filing systems consist of functional and manufacturing data of each part of vessels. This system not only realizes a change from manual work to computer work, but also leads us to improve production engineering and production jigs for safety and high quality. At present, VECTRON is being applied to the manufacture of the shell plates of primary containment vessels in the Kashiwazaki-Kariwa Nuclear Power Station Unit 1 (K-1) and the Fukushima Daini Nuclear Power Station Unit 3 (2F-3), to realize increased productivity. (author)

  19. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure.

    Science.gov (United States)

    Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei

    2011-09-07

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed.
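
The reported figures can be checked with a few lines of arithmetic; the extrapolation at the end assumes the parallel efficiency stays constant, which is our assumption rather than a claim from the abstract:

```python
# Quick arithmetic check of the speed-up reported above (figures from the abstract).
serial_s = 2.58 * 3600        # 2.58 h single-threaded run, in seconds
cloud_s = 3.3 * 60            # 3.3 min on the cloud, in seconds
nodes = 100

speedup = serial_s / cloud_s              # ~47x, matching the reported figure
efficiency = speedup / nodes              # fraction of ideal linear scaling
print(f"speed-up ~{speedup:.0f}x, efficiency ~{efficiency:.0%} on {nodes} nodes")

# Extrapolated runtime if time keeps scaling inversely with node count at this
# efficiency (an extrapolation, not a measured result):
for n in (10, 50, 200):
    print(f"{n:3d} nodes -> ~{serial_s / (efficiency * n) / 60:5.1f} min")
```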

  2. Near-real-time Estimation and Forecast of Total Precipitable Water in Europe

    Science.gov (United States)

    Bartholy, J.; Kern, A.; Barcza, Z.; Pongracz, R.; Ihasz, I.; Kovacs, R.; Ferencz, C.

    2013-12-01

    Information about the amount and spatial distribution of atmospheric water vapor (or total precipitable water) is essential for understanding weather and the environment, including the greenhouse effect, the climate system with its feedbacks, and the hydrological cycle. Numerical weather prediction (NWP) models need accurate estimates of water vapor content to provide realistic forecasts, including representation of clouds and precipitation. In the present study we introduce our research activity on the estimation and forecast of atmospheric water vapor in Central Europe using both observations and models. The Eötvös Loránd University (Hungary) has operated a polar-orbiting satellite receiving station in Budapest since 2002. This station receives Earth observation data from polar-orbiting satellites, including the MODerate resolution Imaging Spectroradiometer (MODIS) Direct Broadcast (DB) data stream from the Terra and Aqua satellites. The received DB MODIS data are automatically processed using freely distributed software packages. Using the IMAPP Level 2 software, total precipitable water is calculated operationally with two different methods. The quality of the TPW estimates is crucial for further application of the results; thus, a validation of the remotely sensed total precipitable water fields against radiosonde data is presented. In a current research project in Hungary we aim to compare different estimates of atmospheric water vapor content. Within the frame of the project we use an NWP model (DBCRAS; Direct Broadcast CIMSS Regional Assimilation System numerical weather prediction software developed by the University of Wisconsin, Madison) to forecast TPW. DBCRAS uses near-real-time Level 2 products from the MODIS data processing chain. From the wide range of derived Level 2 products, the MODIS TPW parameter found within the so-called mod07 results (Atmospheric Profiles Product) and the cloud top pressure and cloud effective emissivity parameters from the so

  3. Application verification research of cloud computing technology in the field of real time aerospace experiment

    Science.gov (United States)

    Wan, Junwei; Chen, Hongyan; Zhao, Jing

    2017-08-01

    According to the requirements of real-time operation, reliability and safety for aerospace experiments, a single-center cloud computing technology application verification platform was constructed. At the IaaS level, the feasibility of applying cloud computing technology to the field of aerospace experiments was tested and verified. Based on the analysis of the test results, a preliminary conclusion is obtained: the cloud computing platform can be applied to compute-intensive aerospace experiment workloads. For I/O-intensive workloads, the traditional physical machine is recommended.

  4. 21 CFR 10.20 - Submission of documents to Division of Dockets Management; computation of time; availability for...

    Science.gov (United States)

    2010-04-01

    Submission of documents to Division of Dockets Management; computation of time; availability for public disclosure. Section 10.20, Food and Drugs. (a) A submission to the Division of Dockets Management of a petition, comment, objection, notice, compilation of information, or any other...

  5. Time-based analysis of total cost of patient episodes: a case study of hip replacement.

    Science.gov (United States)

    Peltokorpi, Antti; Kujala, Jaakko

    2006-01-01

    Healthcare in the public and private sectors is facing increasing pressure to become more cost-effective. Time-based competition and work-in-progress have been used successfully to measure and improve the efficiency of industrial manufacturing. Seeks to address this issue. Presents a framework for time-based management of the total cost of a patient episode and applies it within the six sigma DMAIC process development approach. The framework is used to analyse hip replacement patient episodes in the Päijät-Häme Hospital District in Finland, which has a catchment area of 210,000 inhabitants and performs an average of 230 hip replacements per year. The work-in-progress concept is applicable to healthcare; notably, the DMAIC process development approach can be used to analyse the total cost of patient episodes. Concludes that a framework which combines the patient-in-process and DMAIC development approaches can be used not only to analyse the total cost of a patient episode but also to improve patient process efficiency. Presents a framework that combines the patient-in-process and DMAIC process development approaches, which can be used to analyse the total cost of a patient episode in order to improve patient process efficiency.

  6. Reliability of real-time computing with radiation data feedback at accidental release

    International Nuclear Information System (INIS)

    Deme, S.; Feher, I.; Lang, E.

    1990-01-01

    At the first workshop in 1985 we reported on the real-time dose computing method used at the Paks Nuclear Power Plant and on the telemetric system developed for the normalization of the computed data. At present, the computing method normalized to the telemetric data represents the primary information for deciding on any necessary countermeasures in case of a nuclear reactor accident. In this connection we analyzed the reliability of the results obtained in this manner. The points of the analysis were: how the results are influenced by the choice of certain parameters that cannot be determined by direct methods, and how improperly chosen diffusion parameters would distort the determination of environmental radiation parameters normalized on the basis of the measurements (¹³¹I activity concentration, gamma dose rate) at points lying at a given distance from the measuring stations. A further source of errors may be that, when determining the level of gamma radiation, the radionuclide doses in the cloud and on the ground surface are measured together by the environmental monitoring stations, whereas these doses appear separately in the computations. At the Paks NPP it is the time integral of the airborne activity concentration of vapour-form ¹³¹I which is determined. This quantity includes neither the other physical and chemical forms of ¹³¹I nor the other isotopes of radioiodine. We gave numerical examples of the uncertainties due to the above factors. As a result, we arrived at the conclusion that, when accident-related measures must be decided on the basis of the computing method, the dose uncertainties may reach one order of magnitude for points lying far from the monitoring stations. Different measures to make the uncertainties significantly lower are discussed.

  7. Time-dependent density functional theory description of total photoabsorption cross sections

    Science.gov (United States)

    Tenorio, Bruno Nunes Cabral; Nascimento, Marco Antonio Chaer; Rocha, Alexandre Braga

    2018-02-01

    The time-dependent version of density functional theory (TDDFT) has been used to calculate the total photoabsorption cross sections of a number of molecules, namely benzene, pyridine, furan, pyrrole, thiophene, phenol, naphthalene, and anthracene. The discrete electronic pseudo-spectra, obtained in an L² basis-set calculation, were used in an analytic continuation procedure to obtain the photoabsorption cross sections. The ammonia molecule was chosen as a model system for comparing the results obtained with TDDFT to those obtained with the linear-response coupled cluster approach, in order to make a link with our previous work and establish benchmarks.

  8. An Efficient Integer Coding and Computing Method for Multiscale Time Segment

    Directory of Open Access Journals (Sweden)

    TONG Xiaochong

    2016-12-01

    Full Text Available This article focuses on the problems and status of current time-segment coding and proposes a new approach: multi-scale time segment integer coding (MTSIC). The approach utilizes the tree structure and the size ordering that integers naturally form, so that the codes reflect the relationships among multi-scale time segments (order, inclusion/containment, intersection, etc.), and finally achieves a unified integer-coding process for multi-scale time. On this foundation, the research also studies methods for computing the time relationships of MTSIC codes, to support efficient calculation and query based on time segments, and preliminarily discusses application methods and prospects of MTSIC. Tests indicate that the implementation of MTSIC is convenient and reliable, that transformation between it and traditional methods is straightforward, and that it achieves very high efficiency in query and calculation.
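
The abstract does not spell out the coding scheme, so the following is only a plausible dyadic realization of integer-coded multi-scale time segments: a segment at scale level L (the epoch split into 2^L equal parts) with index i receives the code 2^L + i, so the codes form a complete binary tree in which containment and ordering reduce to bit shifts:

```python
def code(level: int, index: int) -> int:
    """Integer code of the index-th of 2**level equal segments over the epoch."""
    return (1 << level) + index

def level(c: int) -> int:
    return c.bit_length() - 1

def contains(a: int, b: int) -> bool:
    """True if segment a contains segment b (a is an ancestor of b in the tree)."""
    d = level(b) - level(a)
    return d >= 0 and (b >> d) == a

def order(a: int, b: int) -> int:
    """Compare start times of two segments at possibly different scales."""
    la, lb = level(a), level(b)
    m = max(la, lb)                     # bring both to the finer scale
    return (a << (m - la)) - (b << (m - lb))

day = code(3, 5)          # 6th of 8 segments at level 3
hour = code(6, 42)        # 43rd of 64 segments at level 6
print(contains(day, hour))    # True: 42 >> 3 == 5
print(order(day, hour) <= 0)  # True: day starts at or before hour
```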

  9. Rigorous bounds on survival times in circular accelerators and efficient computation of fringe-field transfer maps

    International Nuclear Information System (INIS)

    Hoffstaetter, G.H.

    1994-12-01

    Analyzing the stability of particle motion in storage rings contributes to the general field of stability analysis in weakly nonlinear motion. A method which we call pseudo-invariant estimation (PIE) is used to compute lower bounds on the survival time in circular accelerators. The pseudo-invariants needed for this approach are computed via nonlinear perturbative normal form theory, and the required global maxima of the highly complicated multivariate functions could only be rigorously bounded with an extension of interval arithmetic. The bounds on the survival times are large enough to be relevant; the same is true for the lower bounds on dynamical apertures, which can also be computed. The PIE method can lead to novel design criteria with the objective of maximizing the survival time. A major effort in the direction of rigorous predictions only makes sense if accurate models of accelerators are available. Fringe fields often have a significant influence on optical properties, but the computation of fringe-field maps by DA-based integration is slower by several orders of magnitude than DA evaluation of the propagator for main-field maps. A novel computation of fringe-field effects called symplectic scaling (SYSCA) is introduced. It exploits the advantages of Lie transformations, generating functions, and scaling properties and is extremely accurate. The computation of fringe-field maps is typically made nearly two orders of magnitude faster. (orig.)

  10. Universality of black hole quantum computing

    Energy Technology Data Exchange (ETDEWEB)

    Dvali, Gia [Muenchen Univ. (Germany). Arnold Sommerfeld Center for Theoretical Physics; Max-Planck-Institut fuer Physik, Muenchen (Germany); New York Univ., NY (United States). Center for Cosmology and Particle Physics; Gomez, Cesar [Muenchen Univ. (Germany). Arnold Sommerfeld Center for Theoretical Physics; Univ. Autonoma de Madrid (Spain). Inst. de Fisica Teorica UAM-CSIC; Luest, Dieter [Muenchen Univ. (Germany). Arnold Sommerfeld Center for Theoretical Physics; Max-Planck-Institut fuer Physik, Muenchen (Germany); Omar, Yasser [Instituto de Telecomunicacoes (Portugal). Physics of Information and Quantum Technologies Group; Lisboa Univ. (Portugal). Inst. Superior Tecnico; Richter, Benedikt [Muenchen Univ. (Germany). Arnold Sommerfeld Center for Theoretical Physics; Instituto de Telecomunicacoes (Portugal). Physics of Information and Quantum Technologies Group; Lisboa Univ. (Portugal). Inst. Superior Tecnico

    2017-01-15

    By analyzing the key properties of black holes from the point of view of quantum information, we derive a model-independent picture of black hole quantum computing. It has been noticed that this picture exhibits striking similarities with quantum critical condensates, allowing the use of a common language to describe quantum computing in both systems. We analyze such quantum computing by allowing coupling to external modes, under the condition that the external influence must be soft enough not to offset the basic properties of the system. We derive model-independent bounds on some crucial time-scales, such as the times of gate operation, decoherence, maximal entanglement and total scrambling. We show that for black-hole-type quantum computers all these time-scales are of the order of the black hole half-life time. Furthermore, we construct explicitly a set of Hamiltonians that generates a universal set of quantum gates for the black-hole-type computer. We find that the gates work at maximal energy efficiency. Furthermore, we establish a fundamental bound on the complexity of quantum circuits encoded on these systems, and characterize the unitary operations that are implementable. It becomes apparent that the computational power is very limited, due to the fact that the black hole life-time is of the same order as the gate operation time. As a consequence, it is impossible to retrieve the information within the life-time of a black hole by externally coupling to the black hole qubits. However, we show that, in principle, coupling to some of the internal degrees of freedom allows acquiring knowledge about the micro-state. Still, due to the trivial complexity of the operations that can be performed, there is no time advantage over the collection of Hawking radiation and subsequent decoding. (copyright 2016 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  11. Design considerations for computationally constrained two-way real-time video communication

    Science.gov (United States)

    Bivolarski, Lazar M.; Saunders, Steven E.; Ralston, John D.

    2009-08-01

    Today's video codecs have evolved primarily to meet the requirements of the motion picture and broadcast industries, where high-complexity studio encoding can be utilized to create highly-compressed master copies that are then broadcast one-way for playback using less-expensive, lower-complexity consumer devices for decoding and playback. Related standards activities have largely ignored the computational complexity and bandwidth constraints of wireless or Internet based real-time video communications using devices such as cell phones or webcams. Telecommunications industry efforts to develop and standardize video codecs for applications such as video telephony and video conferencing have not yielded image size, quality, and frame-rate performance that match today's consumer expectations and market requirements for Internet and mobile video services. This paper reviews the constraints and the corresponding video codec requirements imposed by real-time, 2-way mobile video applications. Several promising elements of a new mobile video codec architecture are identified, and more comprehensive computational complexity metrics and video quality metrics are proposed in order to support the design, testing, and standardization of these new mobile video codecs.

  12. Computed tomography for preoperative planning in minimal-invasive total hip arthroplasty: Radiation exposure and cost analysis

    Energy Technology Data Exchange (ETDEWEB)

    Huppertz, Alexander, E-mail: Alexander.Huppertz@charite.de [Imaging Science Institute Charite Berlin, Robert-Koch-Platz 7, D-10115 Berlin (Germany); Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Radmer, Sebastian, E-mail: s.radmer@immanuel.de [Department of Orthopedic Surgery and Rheumatology, Immanuel-Krankenhaus, Koenigstr. 63, D-14109, Berlin (Germany); Asbach, Patrick, E-mail: Patrick.Asbach@charite.de [Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Juran, Ralf, E-mail: ralf.juran@charite.de [Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Schwenke, Carsten, E-mail: carsten.schwenke@scossis.de [Biostatistician, Scossis Statistical Consulting, Zeltinger Str. 58G, D-13465 Berlin (Germany); Diederichs, Gerd, E-mail: gerd.diederichs@charite.de [Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Hamm, Bernd, E-mail: Bernd.Hamm@charite.de [Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Sparmann, Martin, E-mail: m.sparmann@immanuel.de [Department of Orthopedic Surgery and Rheumatology, Immanuel-Krankenhaus, Koenigstr. 63, D-14109, Berlin (Germany)

    2011-06-15

    Computed tomography (CT) was used for preoperative planning of minimally invasive total hip arthroplasty (THA). 92 patients (50 males, 42 females, mean age 59.5 years) with a mean body mass index (BMI) of 26.5 kg/m² underwent 64-slice CT to depict the pelvis, the knee and the ankle in three independent acquisitions using combined x-, y-, and z-axis tube current modulation. Arthroplasty planning was performed using 3D-Hip Plan (Symbios, Switzerland), and the patient radiation dose exposure was determined. The effects of BMI, gender, and contralateral THA on the effective dose were evaluated by analysis of variance. A process cost analysis from the hospital perspective was performed. All CT examinations were of sufficient image quality for 3D THA planning. A mean effective dose of 4.0 mSv (SD 0.9 mSv) was calculated, with BMI a significant predictor (p < 0.0001). The presence of a contralateral THA (9/92 patients; p = 0.15) and the difference between males and females (p = 0.08) were not significant. Personnel involved were the radiologist (4 min), the surgeon (16 min), the radiographer (12 min), and administrative personnel (4 min). A CT operation time of 11 min and direct per-patient costs of 52.80 Euro were recorded. Preoperative CT for THA was associated with a slight and justifiable increase in radiation exposure in comparison to conventional radiographs, and with low per-patient costs.

  13. Does the brake response time of the right leg change after left total knee arthroplasty? A prospective study.

    Science.gov (United States)

    Marques, Carlos J; Barreiros, João; Cabri, Jan; Carita, Ana I; Friesecke, Christian; Loehr, Jochen F

    2008-08-01

    Patients undergoing total knee arthroplasty often ask when they can safely resume car driving. There is little evidence available on which physicians can rely when advising patients on this issue. In a prospective study we assessed the brake response time of 24 patients admitted to the clinic for left total knee arthroplasty preoperatively and then 10 days after surgery. On each measurement day the patients performed two tasks, a simple and a complex brake response time task in a car simulator. Ten days after left TKA the brake response time for the simple task had decreased by 3.6% (p=0.24), the reaction time by 3.1% (p=0.34) and the movement time by 6.6% (p=0.07). However, the performance improvement was not statistically significant. Task complexity increased brake response time at both time points. A 5.8% increase was significant (p=0.01) at 10 days after surgery. Based on our results, we suggest that patients who have undergone left total knee arthroplasty may resume car driving 10 days after surgery as long as they drive a car with automatic transmission.

  14. A computational parametric study on edge loading in ceramic-on-ceramic total hip joint replacements.

    Science.gov (United States)

    Liu, Feng; Feng, Li; Wang, Junyuan

    2018-07-01

    Edge loading in ceramic-on-ceramic total hip joint replacement is an adverse condition that occurs as the result of direct contact between the head and the cup rim. It has been associated with translational mismatch in the centres of rotation of the cup and head, and has been found to cause severe wear and early failure of the implants. Edge loading has been considered in particular in relation to dynamic separation of the cup and head centres during a gait cycle. Research has been carried out both experimentally and computationally to understand the mechanism, including the influence of bearing component positioning on the occurrence and severity of edge loading. However, it is experimentally difficult to measure both the load magnitude and the duration of edge loading, as it occurs as a short impact within the tight space of the hip joint. Computationally, a dynamic contact model, for example one developed using the MSC ADAMS software for multi-body dynamics simulation, can be particularly useful for calculating the loads and characterising the edge loading. The aim of the present study was to further develop the computational model, improve the predictions of contact force and the understanding of the mechanism, and provide guidance on design and surgical factors that avoid or reduce edge loading and wear. The results show that edge loading can be avoided during gait when the translational mismatch between the cup and head centres of rotation is kept to about 1.0 mm or less for a cup at 45° inclination; that maintaining a correct cup inclination of 45° is important for reducing the severity of edge loading; and that edge loading can be avoided over a larger range of translational mismatch when the swing-phase load is increased. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. Cloud computing platform for real-time measurement and verification of energy performance

    International Nuclear Information System (INIS)

    Ke, Ming-Tsun; Yeh, Chia-Hung; Su, Cheng-Jie

    2017-01-01

    Highlights: • Application of the PSO algorithm can improve the accuracy of the baseline model. • The M&V cloud platform automatically calculates energy performance. • The M&V cloud platform can be applied to all energy conservation measures. • Real-time operational performance can be monitored through the proposed platform. • The M&V cloud platform facilitates the development of EE programs and ESCO industries. - Abstract: Nations worldwide are vigorously promoting policies to improve energy efficiency. The use of measurement and verification (M&V) procedures to quantify energy performance is an essential topic in this field. Currently, energy performance M&V is accomplished via a combination of short-term on-site measurements and engineering calculations. This requires extensive amounts of time and labor and can result in a discrepancy between actual energy savings and calculated results. In addition, because the M&V period typically lasts for several months or up to a year, a failure to immediately detect abnormal energy performance not only decreases the realized savings but also prevents timely corrections and misses the best opportunity to adjust or repair equipment and systems. In this study, a cloud computing platform for the real-time M&V of energy performance is developed. On this platform, particle swarm optimization and multivariate regression analysis are used to construct accurate baseline models. Instantaneous and automatic calculations of the energy performance and access to long-term, cumulative information about the energy performance are provided via a feature that allows direct uploads of the energy consumption data. Finally, the feasibility of this real-time M&V cloud platform is tested for a case study involving improvements to a cold storage system in a hypermarket. The cloud computing platform for real-time energy performance M&V is applicable to any industry and energy conservation measure. With the M&V cloud platform, real-time
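
A minimal sketch of the regression side of such an M&V baseline (the PSO tuning step described in the abstract is omitted, and all data below are synthetic): fit a multivariate baseline on pre-retrofit data, then report savings as baseline prediction minus metered post-retrofit use:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
temp = rng.uniform(15, 35, n)            # outdoor temperature (degC)
occ = rng.uniform(0.2, 1.0, n)           # occupancy fraction
kwh = 50 + 4.0 * temp + 30.0 * occ + rng.normal(0, 3, n)  # metered baseline energy

X = np.column_stack([np.ones(n), temp, occ])
coef, *_ = np.linalg.lstsq(X, kwh, rcond=None)   # ordinary least squares baseline

# Post-retrofit period: same drivers, 15% lower true consumption.
kwh_post = 0.85 * (50 + 4.0 * temp + 30.0 * occ) + rng.normal(0, 3, n)
baseline_pred = X @ coef
savings = baseline_pred - kwh_post
print(f"estimated savings: {savings.sum():.0f} kWh "
      f"({savings.sum() / baseline_pred.sum():.1%} of baseline)")
```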

  16. Qualità totale e mobilità totale Total Quality and Total Mobility

    Directory of Open Access Journals (Sweden)

    Giuseppe Trieste

    2010-05-01

    Full Text Available FIABA ONLUS (Italian Fund for the Elimination of Architectural Barriers) was founded in 2000 with the aim of promoting a culture of equal opportunities and, above all, has as its main goal involving public and private institutions in creating an environment that is genuinely accessible and usable for everyone. Total accessibility, total usability and total mobility are key indicators for defining quality of life within cities. A supportive environment that is free of architectural, cultural and psychological barriers allows everyone to live with ease and universality. People who can access goods and services in the urban context can use time and space to their advantage, carry out their activities, and maintain the relationships they deem significant for their social life. The main aim of urban accessibility is to raise the comfort of space for citizens, eliminating all barriers that discriminate against people and prevent equality of opportunity. "FIABA FUND - City of ... for the removal of architectural barriers" is an idea of FIABA that has already involved many regions of Italy, such as Lazio, Lombardy, Campania, Abruzzi and Calabria. It is a national project which provides for the opening of a bank account in each participating city, into which, for the first time, individuals and private and public institutions together can make donations to fund initiatives for the removal of architectural barriers within their own territory, for a real and effective total accessibility. Last February the fund was launched in Rome with the aim of achieving a capital without barriers and a European model town of accessibility and usability. Urban mobility is a prerequisite for access to goods and services and for organizing the activities of daily life. FIABA promotes the concept of sustainable mobility for all, supported by the European Commission's White Paper. We need a cultural change in the management and organization of public means of transport, which might focus on

  17. Computer-games for gravitational wave science outreach: Black Hole Pong and Space Time Quest

    International Nuclear Information System (INIS)

    Carbone, L; Bond, C; Brown, D; Brückner, F; Grover, K; Lodhia, D; Mingarelli, C M F; Fulda, P; Smith, R J E; Unwin, R; Vecchio, A; Wang, M; Whalley, L; Freise, A

    2012-01-01

    We have established a program aimed at developing computer applications and web applets to be used for educational purposes as well as for gravitational wave outreach activities. These applications and applets teach gravitational wave physics and technology. The computer programs are generated in collaboration with undergraduates and summer students as part of our teaching activities, and are freely distributed on a dedicated website. As part of this program, we have developed two computer games related to gravitational wave science: 'Black Hole Pong' and 'Space Time Quest'. In this article we present an overview of our computer-related outreach activities, discuss the games and their educational aspects, and report on some positive feedback received.

  18. Empirical forecast of quiet time ionospheric Total Electron Content maps over Europe

    Science.gov (United States)

    Badeke, Ronny; Borries, Claudia; Hoque, Mainul M.; Minkwitz, David

    2018-06-01

    An accurate forecast of the atmospheric Total Electron Content (TEC) is helpful for investigating space weather influences on the ionosphere and on technical applications such as satellite-receiver radio links. The purpose of this work is to compare four empirical methods for a 24-h forecast of vertical TEC maps over Europe under geomagnetically quiet conditions. TEC map data are obtained from the Space Weather Application Center Ionosphere (SWACI) and the Universitat Politècnica de Catalunya (UPC). The time-series methods Standard Persistence Model (SPM), a 27-day median model (MediMod) and a Fourier Series Expansion are compared on maps for the entire year of 2015. As a representative of the climatological coefficient models, the forecast performance of the Global Neustrelitz TEC model (NTCM-GL) is also investigated. Time periods of magnetic storms, which are identified with the Dst index, are excluded from the validation. By calculating the TEC values from the most recent maps, the time-series methods perform slightly better than the coefficient model NTCM-GL. The benefit of NTCM-GL is its independence from observational TEC data. Amongst the time-series methods mentioned, MediMod delivers the best overall performance regarding accuracy and data-gap handling. Quiet-time SWACI maps can be forecasted accurately and in real time by the MediMod time-series approach.
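
A minimal sketch of a 27-day median forecast in the spirit of MediMod (the paper's exact algorithm may differ; array shapes and values below are assumptions): tomorrow's map at each UT step is taken as the pixel-wise median of the previous 27 days at the same step:

```python
import numpy as np

rng = np.random.default_rng(7)
days, hours, ny, nx = 27, 24, 16, 32                   # history length and map grid
history = 10 + 5 * rng.random((days, hours, ny, nx))   # synthetic TEC maps (TECU)

# Pixel-wise median over the 27-day history, per UT hour -> 24-h map forecast.
forecast = np.median(history, axis=0)                  # shape (hours, ny, nx)
print(forecast.shape, round(float(forecast.mean()), 2))
```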

  19. Study and Implementation of a Real-Time Operating System on an ARM-Based Single Board Computer

    OpenAIRE

    A, Wiedjaja; M, Handi; L, Jonathan; Christian, Benyamin; Kristofel, Luis

    2014-01-01

    An operating system is an important piece of software in a computer system. For personal and office use, a general-purpose operating system is sufficient. However, mission-critical applications such as nuclear power plants and automatic braking systems in cars, which need a high level of reliability, require an operating system that operates in real time. The study aims to assess the implementation of a Linux-based real-time operating system on an ARM-based Single Board Computer (SBC), namely the Pandaboard ES with ...

  20. Impact of increasing social media use on sitting time and body mass index.

    Science.gov (United States)

    Alley, Stephanie; Wellens, Pauline; Schoeppe, Stephanie; de Vries, Hein; Rebar, Amanda L; Short, Camille E; Duncan, Mitch J; Vandelanotte, Corneel

    2017-08-01

    Issue addressed Sedentary behaviours, in particular sitting, increase the risk of cardiovascular disease, type 2 diabetes, obesity and poorer mental health status. In Australia, 70% of adults sit for more than 8 h per day. The use of social media applications (e.g. Facebook, Twitter, and Instagram) is on the rise; however, no studies have explored the association of social media use with sitting time and body mass index (BMI). Methods Cross-sectional self-report data on demographics, BMI and sitting time were collected from 1140 participants in the 2013 Queensland Social Survey. Generalised linear models were used to estimate associations of a social media score, calculated from social media use, perceived importance of social media, and number of social media contacts, with sitting time and BMI. Results Participants with a high social media score had significantly greater sitting times while using a computer in leisure time and significantly greater total sitting time on non-workdays. However, no associations were found between social media score and sitting to view TV, use motorised transport, work or participate in other leisure activities; or with total workday sitting time, total sitting time or BMI. Conclusions These results indicate that social media use is associated with increased sitting time while using a computer, and with total sitting time on non-workdays. So what? The rise in social media use may have a negative impact on health by contributing to computer sitting and total sitting time on non-workdays. Future longitudinal research with a representative sample and objective sitting measures is needed to confirm these findings.

  1. When is it safe to resume driving after total hip and total knee arthroplasty? a meta-analysis of literature on post-operative brake reaction times.

    Science.gov (United States)

    van der Velden, C A; Tolk, J J; Janssen, R P A; Reijman, M

    2017-05-01

    The aim of this study was to assess the currently available evidence about when patients might resume driving after elective, primary total hip (THA) or total knee arthroplasty (TKA) undertaken for osteoarthritis (OA). In February 2016, EMBASE, MEDLINE, Web of Science, Scopus, Cochrane, PubMed Publisher, CINAHL, EBSCO and Google Scholar were searched for clinical studies reporting on 'THA', 'TKA', 'car driving', 'reaction time' and 'brake response time'. Two researchers (CAV and JJT) independently screened the titles and abstracts for eligibility and assessed the risk of bias. Both fixed and random effects models were used to pool data and calculate mean differences (MD) and 95% confidence intervals (CI) between pre- and post-operative total brake response time (TBRT). A total of 19 studies were included. The assessment of the risk of bias showed that one study was at high risk, six studies at moderate risk and 12 studies at low risk. Meta-analysis of TBRT showed an MD decrease of 25.54 ms (95% CI -32.02 to 83.09) two weeks after right-sided THA, and of 18.19 ms (95% CI -6.13 to 42.50) four weeks after right-sided TKA, compared with the pre-operative value. The TBRT returned to baseline two weeks after a right-sided THA and four weeks after a right-sided TKA. These results may serve as guidelines for orthopaedic surgeons when advising patients on when to resume driving. However, the advice should be individualised. Cite this article: Bone Joint J 2017;99-B:566-76. ©2017 The British Editorial Society of Bone & Joint Surgery.
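
For readers unfamiliar with the pooling machinery behind such results, the sketch below computes a fixed-effect, inverse-variance pooled mean difference with a 95% CI; the three studies are made-up numbers, not data from this review:

```python
import numpy as np

md = np.array([30.0, 15.0, 28.0])      # per-study mean difference in TBRT (ms)
se = np.array([40.0, 25.0, 35.0])      # per-study standard errors (ms)

w = 1.0 / se**2                        # inverse-variance weights
pooled = np.sum(w * md) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled MD = {pooled:.1f} ms (95% CI {lo:.1f} to {hi:.1f})")
```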

  2. Real-time analysis of total, elemental, and total speciated mercury

    International Nuclear Information System (INIS)

    Schlager, R.J.; Wilson, K.G.; Sappey, A.D.

    1995-01-01

    ADA Technologies, Inc., is developing a continuous emissions monitoring system that measures the concentrations of mercury in flue gas. Mercury is emitted as an air pollutant from a number of industrial processes. The largest contributors of these emissions are coal and oil combustion, municipal waste combustion, medical waste combustion, and the thermal treatment of hazardous materials. It is difficult, time consuming, and expensive to measure mercury emissions using current testing methods. Part of the difficulty lies in the fact that mercury is emitted from sources in several different forms, such as elemental mercury and mercuric chloride. The ADA analyzer measures these emissions in real time, thus providing a number of advantages over existing test methods: (1) it will provide a real-time measure of emission rates, (2) it will assure facility operators, regulators, and the public that emissions control systems are working at peak efficiency, and (3) it will provide information as to the nature of the emitted mercury (elemental mercury or speciated compounds). This update presents an overview of the CEM and describes features of key components of the monitoring system--the mercury detector, a mercury species converter, and the analyzer calibration system

  4. Time-gated scintillator imaging for real-time optical surface dosimetry in total skin electron therapy

    Science.gov (United States)

    Bruza, Petr; Gollub, Sarah L.; Andreozzi, Jacqueline M.; Tendler, Irwin I.; Williams, Benjamin B.; Jarvis, Lesley A.; Gladstone, David J.; Pogue, Brian W.

    2018-05-01

    The purpose of this study was to measure surface dose by remote time-gated imaging of plastic scintillators. A novel technique for time-gated, intensified camera imaging of scintillator emission was demonstrated, and the key parameters influencing the signal were analyzed, including distance, angle and thickness. A set of scintillator samples was calibrated using thermoluminescence detector response as the reference. Examples of use in total skin electron therapy are described. The data showed excellent room-light rejection (signal-to-noise ratio of scintillation SNR ≈ 470), ideal scintillation dose-response linearity, and 2% dose-rate error. Individual sample scintillation response varied by 7% due to sample preparation. Corrections for the inverse-square distance dependence and for lens throughput error (8% per meter) were needed. At scintillator-to-source angles and observation angles <50°, the radiant energy fluence error was smaller than 1%. The achieved standard error of the scintillator cumulative dose measurement compared to the TLD dose was 5%. The results from this proof-of-concept study documented the first use of small scintillator targets for remote surface dosimetry in ambient room lighting. The measured dose accuracy renders our method comparable to thermoluminescent detector dosimetry, with the ultimate realization of accuracy likely to be better than shown here. Once optimized, this approach to remote dosimetry may substantially reduce the time and effort required for surface dosimetry.
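
A minimal sketch of the two signal corrections named in the abstract: undoing inverse-square falloff with camera distance, and compensating the stated ~8%-per-meter lens throughput loss. The compounding form of the throughput term and the calibration factor k are assumptions, not the authors' published formulas:

```python
def corrected_signal(raw, distance_m, d_ref_m=1.0, k=1.0):
    """Scale a scintillator signal measured at distance_m back to d_ref_m."""
    inverse_square = (distance_m / d_ref_m) ** 2            # undo 1/r^2 falloff
    throughput = (1.0 - 0.08) ** -(distance_m - d_ref_m)    # 8% loss per meter (assumed compounding)
    return k * raw * inverse_square * throughput

print(corrected_signal(raw=1000.0, distance_m=2.0))   # ~4347.8 relative units
```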

  5. Computing time-series suspended-sediment concentrations and loads from in-stream turbidity-sensor and streamflow data

    Science.gov (United States)

    Rasmussen, Patrick P.; Gray, John R.; Glysson, G. Doug; Ziegler, Andrew C.

    2010-01-01

    Over the last decade, use of a method for computing suspended-sediment concentration and loads using turbidity sensors (primarily nephelometry, but also optical backscatter) has proliferated. Because an in-situ turbidity sensor is capable of measuring turbidity instantaneously, a turbidity time series can be recorded and related directly to time-varying suspended-sediment concentrations. Depending on the suspended-sediment characteristics of the measurement site, this method can be more reliable and, in many cases, a more accurate means for computing suspended-sediment concentrations and loads than traditional U.S. Geological Survey computational methods. Guidelines and procedures for estimating time series of suspended-sediment concentration and loading as a function of turbidity and streamflow data have been published in a U.S. Geological Survey Techniques and Methods Report, Book 3, Chapter C4. This paper is a summary of these guidelines and discusses some of the concepts, statistical procedures, and techniques used to maintain a multiyear suspended-sediment time series.
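
A common concrete form of such a computation, sketched under the assumption of a power-law rating and with synthetic data (the log-retransformation bias correction used in practice is omitted): fit SSC = a * T^b in log space, then convert a turbidity and streamflow record into loads:

```python
import numpy as np

rng = np.random.default_rng(3)
turb = rng.uniform(5, 500, 60)                           # paired turbidity samples (FNU)
ssc = 2.0 * turb**0.9 * np.exp(rng.normal(0, 0.1, 60))   # sampled SSC (mg/L)

# Fit log(SSC) = log(a) + b*log(T) by least squares.
b, log_a = np.polyfit(np.log(turb), np.log(ssc), 1)
a = np.exp(log_a)

turb_series = rng.uniform(5, 500, 96)      # 15-min turbidity record (one day)
q_series = rng.uniform(1, 20, 96)          # streamflow (m^3/s)
ssc_series = a * turb_series**b            # mg/L, i.e. g/m^3
# load per 15-min step: (g/m^3)*(m^3/s) -> g/s, times 900 s per step, to kg
load_kg = ssc_series * q_series * 900.0 / 1000.0
print(f"a={a:.2f}, b={b:.2f}, daily load ~ {load_kg.sum():.0f} kg")
```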

  6. Green computing: power optimisation of vfi-based real-time multiprocessor dataflow applications

    NARCIS (Netherlands)

    Ahmad, W.; Holzenspies, P.K.F.; Stoelinga, Mariëlle Ida Antoinette; van de Pol, Jan Cornelis

    2015-01-01

    Execution time is no longer the only performance metric for computer systems. In fact, a trend is emerging to trade raw performance for energy savings. Techniques like Dynamic Power Management (DPM, switching to low power state) and Dynamic Voltage and Frequency Scaling (DVFS, throttling processor
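
To make the DVFS trade-off concrete, a toy power model (all constants are illustrative, including the crude linear voltage-frequency relation; this is not the paper's model) shows why running slower but longer can still reduce energy, provided real-time deadlines are still met:

```python
def energy_j(cycles, freq_hz, c_eff=1e-9, v_per_hz=1.2 / 2e9):
    volt = v_per_hz * freq_hz            # crude linear V-f relation (assumption)
    power = c_eff * volt**2 * freq_hz    # dynamic power ~ C * V^2 * f
    time_s = cycles / freq_hz
    return power * time_s, time_s

for f in (2.0e9, 1.5e9, 1.0e9):
    e, t = energy_j(1e10, f)
    print(f"{f/1e9:.1f} GHz: {t:4.1f} s, {e:5.2f} J")   # slower clock, lower energy
```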

  7. In this issue: Time to replace doctors’ judgement with computers

    Directory of Open Access Journals (Sweden)

    Simon de Lusignan

    2015-11-01

    Full Text Available Informaticians continue to rise to the challenge, set by the English Health Minister, of trying to replace doctors' judgement with computers. This issue describes successes and where there are barriers. However, whilst there is progress, it tends to be incremental, and there are grand challenges to be overcome before computers can replace clinicians. These grand challenges include: (1) improving usability so it is possible to more readily incorporate technology into clinical workflow; (2) rigorous new analytic methods that make use of the mass of available data, 'Big data', to create real-world evidence; (3) faster ways of meeting regulatory and legal requirements, including ensuring privacy; (4) provision of reimbursement models to fund innovative technology that can substitute for clinical time; and (5) recognition that innovations that improve quality also often increase cost. Informatics is more likely to support and augment clinical decision making than replace clinicians.

  8. Time-of-Flight Sensors in Computer Graphics

    DEFF Research Database (Denmark)

    Kolb, Andreas; Barth, Erhardt; Koch, Reinhard

    2009-01-01

    Time-of-Flight (ToF) sensing is relevant to a range of areas, including Computer Graphics, Computer Vision and Man Machine Interaction (MMI). These technologies are starting to have an impact on research and commercial applications. The upcoming generation of ToF sensors, however, will be even more powerful and will have the potential to become “ubiquitous real

  9. Individual and family environmental correlates of television and computer time in 10- to 12-year-old European children: the ENERGY-project.

    Science.gov (United States)

    Verloigne, Maïté; Van Lippevelde, Wendy; Bere, Elling; Manios, Yannis; Kovács, Éva; Grillenberger, Monika; Maes, Lea; Brug, Johannes; De Bourdeaudhuij, Ilse

    2015-09-18

    The aim was to investigate which individual and family environmental factors are related to television and computer time separately in 10- to 12-year-old children, within and across five European countries (Belgium, Germany, Greece, Hungary, Norway). Data were used from the ENERGY-project. Children and one of their parents completed a questionnaire, including questions on screen-time behaviours and related individual and family environmental factors. Family environmental factors included social, political, economic and physical environmental factors. Complete data were obtained from 2022 child-parent dyads (53.8 % girls, mean child age 11.2 ± 0.8 years; mean parental age 40.5 ± 5.1 years). To examine the association between individual and family environmental factors (i.e. independent variables) and television/computer time (i.e. dependent variables) in each country, multilevel regression analyses were performed using MLwiN 2.22, adjusting for children's sex and age. In all countries, children reported more television and/or computer time if children and their parents thought that the maximum recommended level for watching television and/or using the computer was higher, and if children had a higher preference for television watching and/or computer use and a lower self-efficacy to control television watching and/or computer use. Most physical and economic environmental variables were not significantly associated with television or computer time. Slightly more individual factors were related to children's computer time, and more parental social environmental factors to children's television time. We also found different correlates across countries: parental co-participation in television watching was significantly positively associated with children's television time in all countries except Greece. A higher level of parental television and computer time was only associated with a higher level of children's television and computer time in Hungary. Having rules

  10. Different but Equal: Total Work, Gender and Social Norms in EU and US Time Use

    OpenAIRE

    Daniel S Hamermesh; Michael C Burda; Philippe Weil

    2008-01-01

    Using time-diary data from 27 countries, we demonstrate a negative relationship between real GDP per capita and the female-male difference in total work time—the sum of work for pay and work at home. We also show that in rich non-Catholic countries on four continents men and women do the same amount of total work on average. Our survey results demonstrate that labor economists, macroeconomists, sociologists and the general public consistently believe that women perform more tot...

  11. Applications of soft computing in time series forecasting simulation and modeling techniques

    CERN Document Server

    Singh, Pritpal

    2016-01-01

    This book reports on an in-depth study of fuzzy time series (FTS) modeling. It reviews and summarizes previous research work in FTS modeling and also provides a brief introduction to other soft-computing techniques, such as artificial neural networks (ANNs), rough sets (RS) and evolutionary computing (EC), focusing on how these techniques can be integrated into different phases of the FTS modeling approach. In particular, the book describes novel methods resulting from the hybridization of FTS modeling approaches with neural networks and particle swarm optimization. It also demonstrates how a new ANN-based model can be successfully applied in the context of predicting Indian summer monsoon rainfall. Thanks to its easy-to-read style and the clear explanations of the models, the book can be used as a concise yet comprehensive reference guide to fuzzy time series modeling, and will be valuable not only for graduate students, but also for researchers and professionals working for academic, business and governmen...

  12. Real-time computer treatment of THz passive device images with the high image quality

    Science.gov (United States)

    Trofimov, Vyacheslav A.; Trofimov, Vladislav V.

    2012-06-01

    We demonstrate real-time computer code that significantly improves the quality of images captured by a passive THz imaging system. The code is not designed only for passive THz devices: it can be applied to any such device, and to active THz imaging systems as well. We applied our code to computer processing of images captured by four passive THz imaging devices manufactured by different companies. It should be stressed that computer processing of images produced by different companies usually requires different spatial filters. The performance of the current version of the computer code is greater than one image per second for a THz image having more than 5000 pixels and 24-bit number representation. Processing a single THz image produces about 20 images simultaneously, corresponding to various spatial filters. The computer code allows the number of pixels in processed images to be increased without noticeable reduction of image quality, and its performance can be increased many times by using parallel algorithms for processing the image. We have developed original spatial filters which allow one to see objects with sizes less than 2 cm. The imagery is produced by passive THz imaging devices which captured images of objects hidden under opaque clothes. For images with high noise we developed an approach which suppresses the noise during computer processing and yields a good-quality image. To illustrate the efficiency of the developed approach, we demonstrate the detection of a liquid explosive, an ordinary explosive, a knife, a pistol, a metal plate, a CD, ceramics, chocolate and other objects hidden under opaque clothes. The results demonstrate the high efficiency of our approach for the detection of hidden objects, and it is a very promising solution for the security problem.
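
As a generic illustration of this kind of spatial filtering (not the authors' proprietary filters), the sketch below applies an unsharp mask built from a simple mean filter to a synthetic noisy frame, raising the contrast of a hidden-object signature:

```python
import numpy as np

rng = np.random.default_rng(5)
frame = rng.normal(0.5, 0.05, (80, 64))      # synthetic noisy THz frame
frame[30:45, 20:35] += 0.15                  # simulated "hidden object" signature

def box_blur(img, k=5):
    """k x k mean filter with edge replication."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

sharpened = frame + 1.5 * (frame - box_blur(frame))   # unsharp masking
print(float(sharpened[35, 25] - sharpened.mean()))    # object stands out more
```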

  13. Time related total lactic acid bacteria population diversity and ...

    African Journals Online (AJOL)

    The total lactic acid bacterial community involved in the spontaneous fermentation of malted cowpea-fortified cereal weaning food was investigated by phenotypic and cultivation-independent methods. A total of 74 out of the 178 isolated strains were Lactobacillus plantarum, 32 were Pediococcus acidilactici and over 60% ...

  14. An assessment of the real-time application capabilities of the SIFT computer system

    Science.gov (United States)

    Butler, R. W.

    1982-01-01

    The real-time capabilities of the SIFT computer system, a highly reliable multicomputer architecture developed to support the flight controls of a relaxed static stability aircraft, are discussed. The SIFT computer system was designed to meet extremely high reliability requirements and to facilitate a formal proof of its correctness. Although SIFT represents a significant achievement in fault-tolerant system research, it presents an unusual and restrictive interface to its users. The characteristics of the user interface and its impact on application system design are assessed.

  15. Computation of the Short-Time Linear Canonical Transform with Dual Window

    Directory of Open Access Journals (Sweden)

    Lei Huang

    2017-01-01

    The short-time linear canonical transform (STLCT), which maps a time-domain signal into the joint time-frequency domain, has recently attracted some attention in the area of signal processing. However, its applications are still limited by the fact that the selection of coefficients of the short-time linear canonical series (STLCS) is not unique, because the time and frequency elementary functions (together known as the basis functions of the STLCS) do not constitute an orthogonal basis. To solve this problem, this paper investigates a dual-window solution. First, the nonorthogonality problem suffered by the original window is resolved by imposing an orthogonality condition on a dual window. Then, based on the obtained condition, a dual-window computation approach for the GT is extended to the STLCS. In addition, simulations verify the validity of the proposed condition and solutions. Furthermore, some possible directions for application are discussed.

  16. The Impact of Total Ischemic Time, Donor Age and the Pathway of Donor Death on Graft Outcomes After Deceased Donor Kidney Transplantation.

    Science.gov (United States)

    Wong, Germaine; Teixeira-Pinto, Armando; Chapman, Jeremy R; Craig, Jonathan C; Pleass, Henry; McDonald, Stephen; Lim, Wai H

    2017-06-01

    Prolonged ischemia is a known risk factor for delayed graft function (DGF), and its interaction with donor characteristics, the pathway of donor death, and graft outcomes may have important implications for allocation policies. Using data from the Australian and New Zealand Dialysis and Transplant registry (1994-2013), we examined the relationship between total ischemic time and graft outcomes among recipients who received their first deceased donor kidney transplants. Total ischemic time (in hours) was defined as the time from the interruption of the donor renal artery or placement of the aortic clamp until the release of the clamp on the renal artery in the recipient. A total of 7542 recipients were followed over a median follow-up time of 5.3 years (interquartile range of 8.2 years). Of these, 1823 (24.6%) experienced DGF and 2553 (33.9%) experienced allograft loss. Recipients with a total ischemic time of 14 hours or longer experienced increased odds of DGF compared with those with a total ischemic time of less than 14 hours. This effect was most marked among those with older donors (P value for interaction = 0.01). There was a significant interaction between total ischemic time, donor age, and graft loss (P value for interaction = 0.03). There was, on average, a 9% increase in the overall risk of graft loss per hour of increase in total ischemic time (adjusted hazard ratio, 1.09; 95% confidence interval, 1.01-1.18; P = 0.02) in recipients of older donation after circulatory death grafts. There is a clinically important interaction between donor age, the pathway of donor death, and total ischemic time on graft outcomes, such that the duration of ischemia has the greatest impact on graft survival in recipients of older donation after circulatory death kidneys.

  17. Smoking is associated with earlier time to revision of total knee arthroplasty.

    Science.gov (United States)

    Lim, Chin Tat; Goodman, Stuart B; Huddleston, James I; Harris, Alex H S; Bhowmick, Subhrojyoti; Maloney, William J; Amanatullah, Derek F

    2017-10-01

    Smoking is associated with early postoperative complications, increased length of hospital stay, and an increased risk of revision after total knee arthroplasty (TKA). However, the effect of smoking on time to revision TKA is unknown. A total of 619 primary TKAs referred to an academic tertiary center for revision TKA were retrospectively stratified according to patient smoking status. Smoking status was then analyzed for associations with time to revision TKA using a chi-square test. The association was also analyzed according to the indication for revision TKA. Smokers (37/41, 90%) have an increased risk of earlier revision for any reason compared with non-smokers (274/357, 77%, p=0.031) and compared with ex-smokers (168/221, 76%, p=0.028). Subgroup analysis did not reveal a difference in indication for revision TKA (p>0.05). Smokers are at increased risk of earlier revision TKA when compared with non-smokers and ex-smokers; the risk for ex-smokers was similar to that of non-smokers. Smoking appears to have an all-or-none effect on earlier revision TKA, as patients who smoked more did not have a higher risk of early revision TKA. These results highlight the need for clinicians to urge patients not to begin smoking and to encourage smokers to quit smoking prior to primary TKA.

  18. ADAPTATION OF JOHNSON SEQUENCING ALGORITHM FOR JOB SCHEDULING TO MINIMISE THE AVERAGE WAITING TIME IN CLOUD COMPUTING ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    SOUVIK PAL

    2016-09-01

    Cloud computing is an emerging paradigm of Internet-centric business computing in which Cloud Service Providers (CSPs) provide services to customers according to their needs. The key perception behind cloud computing is on-demand sharing of the resources available in the resource pool provided by the CSP, which implies a new emerging business model. Resources are provisioned when jobs arrive. Job scheduling and the minimization of waiting time are challenging issues in cloud computing. When a large number of jobs are requested, they have to wait to be allocated to servers, which in turn may increase the queue length and the waiting time. This paper includes a system design for implementation based on the Johnson scheduling algorithm, which provides the optimal sequence; with that sequence, service times can be obtained. The waiting time and queue length can be reduced using a multi-server, finite-capacity queuing model, which improves the job scheduling model.
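
    As context for the record above, a minimal sketch of Johnson's classic two-machine sequencing rule, which yields the optimal job order that such a scheduling model would feed into its queuing stage, is given below in Python; the job data and function names are illustrative assumptions, not taken from the paper.

      # Johnson's rule for sequencing n jobs on two machines (stage 1, then stage 2).
      # Jobs with stage-1 time < stage-2 time go to the front (ascending by stage-1
      # time); the remaining jobs go to the back (descending by stage-2 time).
      def johnson_sequence(jobs):
          """jobs: list of (job_id, t1, t2) tuples; returns an optimal job order."""
          front = sorted((j for j in jobs if j[1] < j[2]), key=lambda j: j[1])
          back = sorted((j for j in jobs if j[1] >= j[2]), key=lambda j: -j[2])
          return [j[0] for j in front + back]

      jobs = [("J1", 3, 6), ("J2", 7, 2), ("J3", 4, 4), ("J4", 5, 8)]
      print(johnson_sequence(jobs))  # ['J1', 'J4', 'J3', 'J2']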

  19. Effect of time lapse on the diagnostic accuracy of cone beam computed tomography for detection of vertical root fractures

    Energy Technology Data Exchange (ETDEWEB)

    Eskandarloo, Amir; Shokri, Abbas, E-mail: Dr.a.shokri@gmail.com [Dental Research Center, Department of Oral and Maxillofacial Radiology, Hamadan University of Medical Sciences, Hamadan (Iran, Islamic Republic of); Asl, Amin Mahdavi [Department of Oral and Maxillofacial Radiology, Golestan University of Medical Sciences, Gorgan (Iran, Islamic Republic of); Jalalzadeh, Mohsen [Department of Endodontics, Hamadan University of Medical Sciences, Hamadan (Iran, Islamic Republic of); Tayari, Maryam [Department of Pedodontics, Golestan University of Medical Sciences, Gorgan (Iran, Islamic Republic of); Hosseinipanah, Mohammad [Department of Anatomy, School of Medicine, Hamadan University of Medical Sciences, Hamadan (Iran, Islamic Republic of); Fardmal, Javad [Research Center for Health Sciences and Department of Epidemiology and Biostatistics, School of Public Health, Hamadan University of Medical Sciences, Hamadan (Iran, Islamic Republic of)

    2016-01-15

    Accurate and early diagnosis of vertical root fractures (VRFs) is imperative to prevent extensive bone loss and unnecessary endodontic and prosthodontic treatments. The aim of this study was to assess the effect of time lapse on the diagnostic accuracy of cone beam computed tomography (CBCT) for VRFs in endodontically treated dogs' teeth. Forty-eight incisors and premolars of three adult male dogs underwent root canal therapy. The teeth were assigned to two groups: VRFs were artificially induced in the first group (n=24), while the teeth in the second group remained intact (n=24). The CBCT scans were obtained with a NewTom 3G unit immediately after inducing the VRFs and after one, two, three, four, eight, 12 and 16 weeks. Three oral and maxillofacial radiologists blinded to the date of the radiographs assessed the presence or absence of VRFs on the CBCT scans. Sensitivity, specificity and accuracy values were calculated, and the data were analyzed using SPSS v.16 software and ANOVA. The total accuracy of detection of VRFs immediately after surgery and after one, two, three, four, eight, 12 and 16 weeks was 67.3%, 68.7%, 66.6%, 64.6%, 64.5%, 69.4%, 68.7% and 68%, respectively. The effect of time lapse on the detection of VRFs was not significant (p>0.05). The overall sensitivity, specificity and accuracy of CBCT for the detection of VRFs were 74.3%, 62.2% and 67.2%, respectively. Cone beam computed tomography is a valuable tool for the detection of VRFs; a time lapse of up to four months had no effect on the detection of VRFs on CBCT scans. (author)

  1. Reservoir computer predictions for the Three Meter magnetic field time evolution

    Science.gov (United States)

    Perevalov, A.; Rojas, R.; Lathrop, D. P.; Shani, I.; Hunt, B. R.

    2017-12-01

    The source of the Earth's magnetic field is the turbulent flow of liquid metal in the outer core. Our experiment's goal is to create an Earth-like dynamo, to explore the mechanisms and to understand the dynamics of the magnetic and velocity fields. Since it is a complicated system, prediction of the magnetic field is a challenging problem. We present results of mimicking the Three Meter experiment with a reservoir computer deep learning algorithm. The experiment consists of a three-meter-diameter outer sphere and a one-meter-diameter inner sphere, with the gap filled with liquid sodium. The spheres can rotate at up to 4 and 14 Hz respectively, giving a Reynolds number near 10^8. Two external electromagnets apply magnetic fields, while an array of 31 external and 2 internal Hall sensors measures the resulting induced fields. We use this magnetic probe data to train a reservoir computer to predict the 3M time evolution and mimic waves in the experiment. Surprisingly accurate predictions can be made for several magnetic dipole time scales. This shows that such a complicated MHD system's behavior can be predicted. We gratefully acknowledge support from NSF EAR-1417148.
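
    For readers unfamiliar with reservoir computing, a minimal echo state network of the kind used for such predictions is sketched below, trained here on a toy sine wave rather than the experiment's magnetic probe data; all sizes and hyperparameters are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(0)
      n_res, n_in = 200, 1                         # reservoir/input sizes (illustrative)
      W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
      W = rng.normal(0.0, 1.0, (n_res, n_res))
      W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # keep spectral radius below 1

      u = np.sin(0.1 * np.arange(2000))[:, None]   # toy signal standing in for probe data
      x = np.zeros(n_res)
      states = []
      for t in range(len(u) - 1):
          x = np.tanh(W @ x + W_in @ u[t])         # reservoir state update
          states.append(x.copy())
      S = np.array(states)

      # Ridge-regression readout: predict the next sample from the reservoir state.
      Y = u[1:]
      W_out = np.linalg.solve(S.T @ S + 1e-6 * np.eye(n_res), S.T @ Y)
      print("train MSE:", float(np.mean((S @ W_out - Y) ** 2)))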

  2. A stable computational scheme for stiff time-dependent constitutive equations

    International Nuclear Information System (INIS)

    Shih, C.F.; Delorenzi, H.G.; Miller, A.K.

    1977-01-01

    Viscoplasticity and creep type constitutive equations are increasingly being employed in finite element codes for evaluating the deformation of high temperature structural members. These constitutive equations frequently exhibit stiff regimes, which makes an analytical assessment of the structure very costly. A computational scheme for handling deformation in stiff regimes is proposed in this paper. By the finite element discretization, the governing partial differential equations in the spatial (x) and time (t) variables are reduced to a system of nonlinear ordinary differential equations in the independent variable t. The constitutive equations are expanded in a Taylor series about selected values of t. The resulting system of differential equations is then integrated by an implicit scheme that employs a predictor technique to initiate the Newton-Raphson procedure. To examine the stability and accuracy of the computational scheme, a series of calculations was carried out for uniaxial specimens and thick wall tubes subjected to mechanical and thermal loading. (Auth.)
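
    As a concrete illustration of the kind of implicit scheme described above, the sketch below applies backward Euler with an explicit predictor initiating Newton-Raphson iterations to a stiff scalar test equation; the test problem and tolerances are illustrative, not the paper's constitutive model.

      # Backward Euler with a predictor-initialized Newton-Raphson corrector,
      # demonstrated on the stiff test problem y' = -1000*(y - cos(t)).
      import math

      def f(t, y):
          return -1000.0 * (y - math.cos(t))

      def dfdy(t, y):
          return -1000.0

      def backward_euler(y0, t0, t1, n):
          h, t, y = (t1 - t0) / n, t0, y0
          for _ in range(n):
              t_new = t + h
              y_new = y + h * f(t, y)              # explicit (forward Euler) predictor
              for _ in range(50):                  # Newton-Raphson corrector
                  g = y_new - y - h * f(t_new, y_new)
                  if abs(g) < 1e-12:
                      break
                  y_new -= g / (1.0 - h * dfdy(t_new, y_new))
              t, y = t_new, y_new
          return y

      print(backward_euler(1.0, 0.0, 1.0, 100))    # stable even with h >> 1/1000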

  3. Decreasing Transition Times in Elementary School Classrooms: Using Computer-Assisted Instruction to Automate Intervention Components

    Science.gov (United States)

    Hine, Jeffrey F.; Ardoin, Scott P.; Foster, Tori E.

    2015-01-01

    Research suggests that students spend a substantial amount of time transitioning between classroom activities, which may reduce time spent academically engaged. This study used an ABAB design to evaluate the effects of a computer-assisted intervention that automated intervention components previously shown to decrease transition times. We examined…

  4. Computational Approaches to Simulation and Optimization of Global Aircraft Trajectories

    Science.gov (United States)

    Ng, Hok Kwan; Sridhar, Banavar

    2016-01-01

    This study examines three possible approaches to improving the speed of generating wind-optimal routes for air traffic at the national or global level: (a) using the resources of a supercomputer, (b) running the computations on multiple commercially available computers, and (c) implementing the same algorithms in NASA's Future ATM Concepts Evaluation Tool (FACET); it compares these to a standard implementation run on a single CPU. Wind-optimal aircraft trajectories are computed using global air traffic schedules. The run time and wait time on the supercomputer for trajectory optimization, using various numbers of CPUs ranging from 80 to 10,240 units, are compared with the total computational time for running the same computation on a single desktop computer and on multiple commercially available computers, to assess the potential computational enhancement from parallel processing on computer clusters. This study also re-implements the trajectory optimization algorithm for further reduction of computational time through algorithm modifications, and integrates it with FACET so that the new features, which calculate time-optimal routes between worldwide airport pairs in a wind field, can be used with existing FACET applications. The implementations of the trajectory optimization algorithms use the MATLAB, Python, and Java programming languages. The performance evaluations compare computational efficiency and consider the potential applications of the optimized trajectories. The paper shows that, in the absence of special privileges on a supercomputer, a cluster of commercially available computers provides a feasible approach for national and global air traffic system studies.
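
    Because each airport pair's wind-optimal route is independent of the others, the multi-computer approach is essentially an embarrassingly parallel map over pairs; the sketch below illustrates the pattern, with optimize_route and the airport pairs as illustrative placeholders rather than anything from the study.

      # Farm independent route computations out to a pool of worker processes.
      from multiprocessing import Pool

      def optimize_route(pair):
          origin, dest = pair
          # ... call the real wind-optimal trajectory solver here ...
          return (origin, dest, 0.0)  # placeholder cost

      if __name__ == "__main__":
          pairs = [("KSFO", "EGLL"), ("KJFK", "RJAA"), ("YSSY", "KLAX")]
          with Pool(processes=4) as pool:
              results = pool.map(optimize_route, pairs)  # one pair per task
          print(results)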

  5. Determinantal Representation of the Time-Dependent Stationary Correlation Function for the Totally Asymmetric Simple Exclusion Model

    Directory of Open Access Journals (Sweden)

    Nikolay M. Bogoliubov

    2009-04-01

    The basic model of non-equilibrium low-dimensional physics, the so-called totally asymmetric exclusion process, is related to the 'crystalline limit' (q → ∞) of the SU_q(2) quantum algebra. Using the quantum inverse scattering method, we obtain the exact expression for the time-dependent stationary correlation function of the totally asymmetric simple exclusion process on a one-dimensional lattice with periodic boundary conditions.

  6. Real-time dynamics of lattice gauge theories with a few-qubit quantum computer

    Science.gov (United States)

    Martinez, Esteban A.; Muschik, Christine A.; Schindler, Philipp; Nigg, Daniel; Erhard, Alexander; Heyl, Markus; Hauke, Philipp; Dalmonte, Marcello; Monz, Thomas; Zoller, Peter; Blatt, Rainer

    2016-06-01

    Gauge theories are fundamental to our understanding of interactions between the elementary constituents of matter as mediated by gauge bosons. However, computing the real-time dynamics in gauge theories is a notorious challenge for classical computational methods. This has recently stimulated theoretical effort, using Feynman’s idea of a quantum simulator, to devise schemes for simulating such theories on engineered quantum-mechanical devices, with the difficulty that gauge invariance and the associated local conservation laws (Gauss laws) need to be implemented. Here we report the experimental demonstration of a digital quantum simulation of a lattice gauge theory, by realizing (1 + 1)-dimensional quantum electrodynamics (the Schwinger model) on a few-qubit trapped-ion quantum computer. We are interested in the real-time evolution of the Schwinger mechanism, describing the instability of the bare vacuum due to quantum fluctuations, which manifests itself in the spontaneous creation of electron-positron pairs. To make efficient use of our quantum resources, we map the original problem to a spin model by eliminating the gauge fields in favour of exotic long-range interactions, which can be directly and efficiently implemented on an ion trap architecture. We explore the Schwinger mechanism of particle-antiparticle generation by monitoring the mass production and the vacuum persistence amplitude. Moreover, we track the real-time evolution of entanglement in the system, which illustrates how particle creation and entanglement generation are directly related. Our work represents a first step towards quantum simulation of high-energy theories using atomic physics experiments—the long-term intention is to extend this approach to real-time quantum simulations of non-Abelian lattice gauge theories.

  7. Low Computational Signal Acquisition for GNSS Receivers Using a Resampling Strategy and Variable Circular Correlation Time

    Directory of Open Access Journals (Sweden)

    Yeqing Zhang

    2018-02-01

    With the objective of substantially decreasing the computational complexity and time consumption of signal acquisition, this paper explores a resampling strategy and a variable circular correlation time strategy specific to broadband multi-frequency GNSS receivers. In broadband GNSS receivers, the resampling strategy works on top of conventional acquisition algorithms by resampling the main lobe of the received broadband signal at a much lower frequency. Variable circular correlation time is designed to adapt to different signal strength conditions and thereby increase the operational flexibility of GNSS signal acquisition. The acquisition threshold is defined as the ratio of the highest and second-highest correlation results in the search space of carrier frequency and code phase. Moreover, the computational complexity of signal acquisition is formulated in terms of the number of multiplication and summation operations in the acquisition process. Comparative experiments and performance analysis are conducted on four sets of real GPS L2C signals with different sampling frequencies. The results indicate that the resampling strategy can effectively decrease computation and time cost by nearly 90-94% with only a slight loss of acquisition sensitivity. With circular correlation time varying from 10 ms to 20 ms, the time cost of signal acquisition increases by about 2.7-5.6% per millisecond, with most satellites acquired successfully.
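
    The acquisition decision described above is a ratio test between the highest and second-highest correlation peaks over the carrier frequency/code phase search space; a minimal sketch follows, with the array shape and threshold value as illustrative assumptions.

      import numpy as np

      def acquire(corr, threshold=2.0):
          """corr: 2-D correlation power over (Doppler bin, code phase).
          Declares acquisition if peak / second peak exceeds the threshold."""
          flat = corr.ravel()
          peak_idx = int(np.argmax(flat))
          peak = flat[peak_idx]
          second = np.max(np.delete(flat, peak_idx))  # exclude the main peak
          ratio = peak / second
          dopp, phase = np.unravel_index(peak_idx, corr.shape)
          return ratio > threshold, ratio, (dopp, phase)

      corr = np.random.rand(41, 2046)  # synthetic search space
      corr[20, 1000] = 25.0            # inject a strong peak
      print(acquire(corr))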

  9. A new Mumford-Shah total variation minimization based model for sparse-view x-ray computed tomography image reconstruction.

    Science.gov (United States)

    Chen, Bo; Bian, Zhaoying; Zhou, Xiaohui; Chen, Wensheng; Ma, Jianhua; Liang, Zhengrong

    2018-04-12

    Total variation (TV) minimization for sparse-view x-ray computed tomography (CT) reconstruction has been widely explored to reduce radiation dose. However, due to the piecewise-constant assumption of the TV model, the reconstructed images often suffer from over-smoothing of image edges. To mitigate this drawback of TV minimization, we present a Mumford-Shah total variation (MSTV) minimization algorithm in this paper. The presented MSTV model is derived by integrating TV minimization and Mumford-Shah segmentation. Subsequently, a penalized weighted least-squares (PWLS) scheme with MSTV is developed for sparse-view CT reconstruction. For simplicity, the proposed algorithm is named 'PWLS-MSTV.' To evaluate the performance of the present PWLS-MSTV algorithm, both qualitative and quantitative studies were conducted using a digital XCAT phantom and a physical phantom. Experimental results show that the present PWLS-MSTV algorithm has noticeable gains over existing algorithms in terms of noise reduction, the contrast-to-noise ratio measure and edge preservation.
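
    To make the role of the TV term concrete, the sketch below performs smoothed TV-regularized least-squares denoising by gradient descent; it illustrates plain TV regularization only, not the Mumford-Shah coupling or the PWLS weighting of the proposed algorithm, and the step sizes are illustrative.

      import numpy as np

      def tv_denoise(y, lam=0.1, step=0.2, iters=200, eps=1e-6):
          """Minimize 0.5*||x - y||^2 + lam * TV_eps(x) by gradient descent,
          using a smoothed (differentiable) isotropic TV seminorm."""
          x = y.copy()
          for _ in range(iters):
              gx = np.diff(x, axis=0, append=x[-1:, :])  # forward differences
              gy = np.diff(x, axis=1, append=x[:, -1:])
              mag = np.sqrt(gx**2 + gy**2 + eps)
              px, py = gx / mag, gy / mag
              # Negative divergence of (px, py) is the gradient of smoothed TV
              # (boundary handling here is approximate).
              div = (np.diff(px, axis=0, prepend=px[:1, :])
                     + np.diff(py, axis=1, prepend=py[:, :1]))
              x -= step * ((x - y) - lam * div)
          return x

      y = np.random.rand(64, 64)
      x = tv_denoise(y)
      print(float(np.abs(np.diff(y, axis=0)).mean()),   # noisy gradient magnitude
            float(np.abs(np.diff(x, axis=0)).mean()))   # reduced after denoising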

  10. A closed-form solution to predict the total melting time of an ablating slab in contact with a plasma

    International Nuclear Information System (INIS)

    Yeh, F.-B.

    2007-01-01

    An exact melt-through time is derived for a one-dimensional heated slab in contact with a plasma when the melted material is immediately removed. The plasma is composed of a collisionless presheath and sheath on the slab, which partially reflects and secondarily emits ions and electrons. The energy transport from the plasma to the surface, accounting for the presheath and sheath, is determined from a kinetic analysis. This work proposes a semi-analytical model to calculate the total melting time of a slab based on direct integration of the unsteady heat conduction equation, and provides quantitative results applicable to controlling the total melting time of the slab. The total melting time as a function of the plasma parameters and the thermophysical properties of the slab is obtained. The predicted energy transmission factor as a function of dimensionless wall potential agrees well with the experimental data. The effects of the reflectivities of ions and electrons at the wall, the electron-to-ion source temperature ratio at the presheath edge, charge number, ion-to-electron mass ratio, ionization energy, plasma flow work-to-heat conduction ratio, Stefan number, melting temperature, Biot number and bias voltage on the total melting time of the slab are quantitatively provided in this work.

  11. Total vaginectomy and urethral lengthening at time of neourethral prelamination in transgender men.

    Science.gov (United States)

    Medina, Carlos A; Fein, Lydia A; Salgado, Christopher J

    2017-11-29

    For transgender men (TGM), gender-affirmation surgery (GAS) is often the final stage of their gender transition. GAS involves creating a neophallus, typically using tissue remote from the genital region, such as a radial forearm free-flap phalloplasty. Essential to this process is vaginectomy. The complexity of the vaginal fascial attachments, atrophy due to testosterone use, and the need to preserve the integrity of the vaginal epithelium for tissue rearrangement add to the intricacy of the procedure during GAS. We designed the technique presented here to minimize complications and contribute to the overall success of the phalloplasty procedure. After obtaining approval from the Institutional Review Board, our transgender (TG) database at the University of Miami Hospital was reviewed to identify cases with vaginectomy and urethral elongation performed at the time of radial forearm free-flap phalloplasty prelamination. The surgical technique for posterior vaginectomy and anterior vaginal wall-flap harvest with subsequent urethral lengthening is detailed. Six patients underwent total vaginectomy and urethral elongation at the time of radial forearm free-flap phalloplasty prelamination. Mean estimated blood loss (EBL) was 290 ± 199.4 ml for the vaginectomy and urethral elongation, and no patient required transfusion. There were no intraoperative complications (cystotomy, ureteral obstruction, enterotomy, proctotomy, or neurological injury). One patient had a urologic complication (urethral stricture) in the neobulbar urethra. Total vaginectomy and urethral lengthening at the time of GAS are relatively safe procedures, and the described technique provides excellent tissue for urethral prelamination and a low complication rate in both the short and long term.

  12. Artificial neuron operations and spike-timing-dependent plasticity using memristive devices for brain-inspired computing

    Science.gov (United States)

    Marukame, Takao; Nishi, Yoshifumi; Yasuda, Shin-ichi; Tanamoto, Tetsufumi

    2018-04-01

    The use of memristive devices for creating artificial neurons is promising for brain-inspired computing from the viewpoints of computation architecture and learning protocol. We present an energy-efficient multiplier accumulator based on a memristive array architecture incorporating both analog and digital circuitries. The analog circuitry is used to full advantage for neural networks, as demonstrated by the spike-timing-dependent plasticity (STDP) in fabricated AlOx/TiOx-based metal-oxide memristive devices. STDP protocols for controlling periodic analog resistance with long-range stability were experimentally verified using a variety of voltage amplitudes and spike timings.
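
    The pair-based STDP protocol mentioned above is conventionally summarized by an exponential update rule, sketched below; the amplitudes and time constants are illustrative values, not those measured for the fabricated devices.

      import math

      # Pair-based STDP: potentiate when the presynaptic spike precedes the
      # postsynaptic spike (dt > 0), depress otherwise.
      A_PLUS, A_MINUS = 0.05, 0.055      # update amplitudes (illustrative)
      TAU_PLUS, TAU_MINUS = 20.0, 20.0   # time constants in ms (illustrative)

      def stdp_dw(dt_ms):
          """Weight change for post-minus-pre spike timing difference dt_ms."""
          if dt_ms > 0:
              return A_PLUS * math.exp(-dt_ms / TAU_PLUS)
          return -A_MINUS * math.exp(dt_ms / TAU_MINUS)

      for dt in (-40, -10, 10, 40):
          print(dt, round(stdp_dw(dt), 4))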

  13. Real time recording system of radioisotopes by local area network (LAN) computer system and user input processing

    International Nuclear Information System (INIS)

    Shinohara, Kunio; Ito, Atsushi; Kawaguchi, Hajime; Yanase, Makoto; Uno, Kiyoshi.

    1991-01-01

    A computer-assisted real-time recording system was developed for the management of radioisotopes. The system is composed of two personal computers forming a LAN, an identification-card (ID-card) reader, and an electrically operated door lock. One computer is operated by the radiation safety staff and stores the records of radioisotopes; users of radioisotopes are registered on this computer. The other computer is installed in front of the storage room for radioisotopes. This computer is made ready for operation by a registered ID-card and accepts data input by the user. After the data input is complete, the door to the storage room is unlocked. The present system offers the following merits: radiation safety staff can easily keep up with the present state of radioisotopes in the storage room and save much labor; radioactivity is always corrected; the upper limit on radioactivity in use per day is automatically checked, and users are regulated when they input the amounts to be used; and users can obtain storage records of radioisotopes at any time. In addition, the system is applicable to facilities that have more than two storage rooms. (author)

  14. Quantification of Artifact Reduction With Real-Time Cine Four-Dimensional Computed Tomography Acquisition Methods

    International Nuclear Information System (INIS)

    Langner, Ulrich W.; Keall, Paul J.

    2010-01-01

    Purpose: To quantify the magnitude and frequency of artifacts in simulated four-dimensional computed tomography (4D CT) images using three real-time acquisition methods (direction-dependent displacement acquisition, simultaneous displacement and phase acquisition, and simultaneous displacement and velocity acquisition) and to compare these methods with commonly used retrospective phase sorting. Methods and Materials: Image acquisition for the four 4D CT methods was simulated with different displacement and velocity tolerances for spheres with radii of 0.5 cm, 1.5 cm, and 2.5 cm, using 58 patient-measured tumors and respiratory motion traces. The magnitude and frequency of artifacts, CT doses, and acquisition times were computed for each method. Results: The mean artifact magnitude was 50% smaller for the three real-time methods than for retrospective phase sorting. The dose was ~50% lower, but the acquisition time was 20% to 100% longer for the real-time methods than for retrospective phase sorting. Conclusions: Real-time acquisition methods can reduce the frequency and magnitude of artifacts in 4D CT images, as well as the imaging dose, but they increase the image acquisition time. The results suggest that direction-dependent displacement acquisition is the preferred real-time 4D CT acquisition method, because on average it delivers the lowest dose to the patient and requires the shortest acquisition time for the resulting number and magnitude of artifacts.

  15. Quantum computing without wavefunctions: time-dependent density functional theory for universal quantum computation.

    Science.gov (United States)

    Tempel, David G; Aspuru-Guzik, Alán

    2012-01-01

    We prove that the theorems of TDDFT can be extended to a class of qubit Hamiltonians that are universal for quantum computation. The theorems of TDDFT applied to universal Hamiltonians imply that single-qubit expectation values can be used as the basic variables in quantum computation and information theory, rather than wavefunctions. From a practical standpoint this opens the possibility of approximating observables of interest in quantum computations directly in terms of single-qubit quantities (i.e. as density functionals). Additionally, we also demonstrate that TDDFT provides an exact prescription for simulating universal Hamiltonians with other universal Hamiltonians that have different, and possibly easier-to-realize two-qubit interactions. This establishes the foundations of TDDFT for quantum computation and opens the possibility of developing density functionals for use in quantum algorithms.

  16. The importance of bony impingement in restricting flexion after total knee arthroplasty: computer simulation model with clinical correlation.

    Science.gov (United States)

    Mizu-Uchi, Hideki; Colwell, Clifford W; Fukagawa, Shingo; Matsuda, Shuichi; Iwamoto, Yukihide; D'Lima, Darryl D

    2012-10-01

    We constructed patient-specific models from computed tomography data after total knee arthroplasty to predict knee flexion based on implant-bone impingement. The maximum flexion before impingement between the femur and the tibial insert was computed using a musculoskeletal modeling program (KneeSIM; LifeModeler, Inc, San Clemente, California) during a weight-bearing deep knee bend. Postoperative flexion was measured in a clinical cohort of 21 knees (low-flex group: 6 knees with <125° of flexion at 2 years). Average predicted flexion angles were within 2° of clinical measurements for the high-flex group. In the low-flex group, 4 cases had impingement involving the bone cut at the posterior condyle, and the average predicted knee flexion was 102°, compared with 93° measured clinically. These results indicate that the level of the distal femoral resection should be carefully planned and that exposed bone proximal to the tips of the posterior condyles of the femoral component should be removed if there is a risk of impingement.

  17. Real time computer control of a nonlinear Multivariable System via Linearization and Stability Analysis

    International Nuclear Information System (INIS)

    Raza, K.S.M.

    2004-01-01

    This paper demonstrates that if a complicated nonlinear, non-square, state-coupled multivariable system is smartly linearized and subjected to a thorough stability analysis, then we can achieve our design objectives via a controller that is quite simple (in terms of resource usage and execution time) and very efficient (in terms of robustness). A further aim is to implement this controller via computer in a real-time environment. Therefore, a nonlinear mathematical model of the system is first derived. Intelligent work is done to decouple the multivariable system. Linearization and stability analysis techniques are employed for the development of a linearized and mathematically sound control law. Nonlinearities, such as saturation in the actuators, are also catered for. The controller is then discretized using Runge-Kutta integration. Finally, the discretized control law is programmed on a computer in a real-time environment. The programming is done in RT-Linux using GNU C for the real-time realization of the control scheme. The real-time processes, such as sampling and controlled actuation, and the non-real-time processes, such as the graphical user interface and display, are programmed as different tasks. The issue of inter-process communication between real-time and non-real-time tasks is addressed quite carefully. The results of this research pursuit are presented graphically. (author)

  18. Time-domain numerical computations of electromagnetic fields in cylindrical co-ordinates using the transmission line matrix: evaluation of radiaion losses from a charge bunch passing through a pill-box resonator

    International Nuclear Information System (INIS)

    Sarma, J.; Robson, P.N.

    1979-01-01

    The two-dimensional transmission line matrix (TLM) numerical method has been adapted to compute electromagnetic field distributions in cylindrical co-ordinates, and it is applied to evaluate the radiation loss from a charge bunch passing through a 'pill-box' resonator. The computer program has been developed to calculate not only the total energy loss to the resonator but also the component of it that exists in the TM010 mode. The numerically computed results are shown to agree very well with the analytically derived values found in the literature, which establishes the degree of accuracy obtained with the TLM method. The particular features of computational simplicity, numerical stability and the inherently time-domain solutions produced by the TLM method are cited as additional attractive reasons for using this numerical procedure to solve such problems. (Auth.)

  19. Guidelines and Procedures for Computing Time-Series Suspended-Sediment Concentrations and Loads from In-Stream Turbidity-Sensor and Streamflow Data

    Science.gov (United States)

    Rasmussen, Patrick P.; Gray, John R.; Glysson, G. Douglas; Ziegler, Andrew C.

    2009-01-01

    In-stream continuous turbidity and streamflow data, calibrated with measured suspended-sediment concentration data, can be used to compute a time series of suspended-sediment concentration and load at a stream site. Development of a simple linear (ordinary least squares) regression model for computing suspended-sediment concentrations from instantaneous turbidity data is the first step in the computation process. If the model standard percentage error (MSPE) of the simple linear regression model meets a minimum criterion, this model should be used to compute a time series of suspended-sediment concentrations. Otherwise, a multiple linear regression model using paired instantaneous turbidity and streamflow data is developed and compared to the simple regression model. If the inclusion of the streamflow variable proves to be statistically significant and the uncertainty associated with the multiple regression model results in an improvement over that for the simple linear model, the turbidity-streamflow multiple linear regression model should be used to compute a suspended-sediment concentration time series. The computed concentration time series is subsequently used with its paired streamflow time series to compute suspended-sediment loads by standard U.S. Geological Survey techniques. Once an acceptable regression model is developed, it can be used to compute suspended-sediment concentration beyond the period of record used in model development with proper ongoing collection and analysis of calibration samples. Regression models to compute suspended-sediment concentrations are generally site specific and should never be considered static, but they represent a set period in a continually dynamic system in which additional data will help verify any change in sediment load, type, and source.
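
    A minimal sketch of the model-selection step described above is given below: a turbidity-only model and a turbidity-plus-streamflow model are both fit by ordinary least squares and their model standard percentage errors compared; the data values are illustrative, and the log transforms often used in practice are omitted.

      import numpy as np

      def ols(X, y):
          """Ordinary least squares with an intercept column; returns (beta, fit)."""
          A = np.column_stack([np.ones(len(y)), X])
          beta, *_ = np.linalg.lstsq(A, y, rcond=None)
          return beta, A @ beta

      turb = np.array([12., 30., 55., 80., 140., 200.])   # turbidity (FNU)
      flow = np.array([3., 5., 9., 11., 20., 26.])        # streamflow (m^3/s)
      ssc = np.array([15., 38., 70., 95., 180., 240.])    # measured SSC (mg/L)

      beta1, fit1 = ols(turb[:, None], ssc)                  # simple linear model
      beta2, fit2 = ols(np.column_stack([turb, flow]), ssc)  # multiple linear model

      def mspe(y, fit, p):
          """Residual standard deviation as a percentage of the mean response."""
          s = np.sqrt(np.sum((y - fit) ** 2) / (len(y) - p))
          return 100.0 * s / np.mean(y)

      print("simple   MSPE %:", round(mspe(ssc, fit1, 2), 1))
      print("multiple MSPE %:", round(mspe(ssc, fit2, 3), 1))
      # Keep the simple model if its MSPE meets the criterion; otherwise adopt the
      # multiple model only if streamflow is significant and uncertainty improves.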

  20. Person-related determinants of TV viewing and computer time in a cohort of young Dutch adults: Who sits the most?

    NARCIS (Netherlands)

    Uijtdewilligen, L.; Singh, A.S.; Chin A Paw, M.J.M.; Twisk, J.W.R.; van Mechelen, W.

    2015-01-01

    We aimed to assess the associations of person-related factors with leisure-time television (TV) viewing and computer time among young adults. We analyzed self-reported TV viewing (h/week) and leisure computer time (h/week) from 475 Dutch young adults (47% male) who had participated in the Amsterdam Growth and Health Longitudinal Study.

  1. Increased epicardial fat volume quantified by 64-multidetector computed tomography is associated with coronary atherosclerosis and totally occlusive lesions

    International Nuclear Information System (INIS)

    Ueno, Koji; Anzai, Toshihisa; Jinzaki, Masahiro

    2009-01-01

    The relationship between the epicardial fat volume measured by 64-slice multidetector computed tomography (MDCT) and the extension and severity of coronary atherosclerosis was investigated. Both MDCT and conventional coronary angiography (CAG) were performed in 71 consecutive patients who presented with effort angina. The volume of epicardial adipose tissue (EAT) was measured by MDCT. The severity of coronary atherosclerosis was assessed by evaluating the extension of coronary plaques in 790 segments using the MDCT data, and the percentage diameter stenosis in 995 segments using the CAG data. The estimated volume of EAT indexed by body surface area was defined as VEAT. Increased VEAT was associated with advanced age, male sex, the degree of metabolic alterations, a history of acute coronary syndrome (ACS) and the presence of total occlusions, and showed a positive correlation with the stenosis score (r=0.28, P=0.02) and the atheromatosis score (r=0.67, P<0.001). Multivariate analysis identified increased VEAT (cm³/m²) as the strongest independent determinant of the presence of total occlusions (odds ratio 4.64, P=0.02). VEAT correlates with the degree of metabolic alterations and coronary atheromatosis. Excessive accumulation of EAT might contribute to the development of ACS and coronary total occlusions. (author)

  2. Y2K issues for real time computer systems for fast breeder test reactor

    International Nuclear Information System (INIS)

    Swaminathan, P.

    1999-01-01

    The presentation shows the classification of real-time systems related to the operation, control and monitoring of the fast breeder test reactor. The software life cycle includes software requirement specification, software design description, coding, commissioning, operation and management. A software scheme in the supervisory computer of the fast breeder test reactor is described, drawing on twenty years of experience in the design, development, installation, commissioning, operation and maintenance of computer-based supervisory control systems for nuclear installations, with particular emphasis on solving the Y2K problem.

  3. Elastic Spatial Query Processing in OpenStack Cloud Computing Environment for Time-Constraint Data Analysis

    Directory of Open Access Journals (Sweden)

    Wei Huang

    2017-03-01

    Geospatial big data analysis (GBDA) is extremely significant for time-constraint applications such as disaster response. However, time-constraint analysis is not yet a trivial task in the cloud computing environment. Spatial query processing (SQP) is typically computation-intensive and indispensable for GBDA, and spatial range query, join query, and nearest neighbor query algorithms are not scalable without using MapReduce-like frameworks. Parallel SQP algorithms (PSQPAs) are trapped in screw-processing, which is a known issue in Geoscience. To satisfy time-constrained GBDA, we propose an elastic SQP approach in this paper. First, Spark is used to implement PSQPAs. Second, Kubernetes-managed Core Operation System (CoreOS) clusters provide self-healing Docker containers for running Spark clusters in the cloud. Spark-based PSQPAs are submitted to Docker containers, where Spark master instances reside. Finally, the horizontal pod auto-scaler (HPA) scales out and scales in Docker containers to supply on-demand computing resources. Combined with an auto-scaling group of virtual instances, HPA helps to find each of the five nearest neighbors for 46,139,532 query objects from 834,158 spatial data objects in less than 300 s. The experiments conducted on an OpenStack cloud demonstrate that auto-scaling containers can satisfy time-constrained GBDA in clouds.
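
    The scale-out/scale-in decision made by the horizontal pod auto-scaler follows, in essence, a proportional rule; a sketch is given below, with illustrative metric and target values, noting that the real HPA adds tolerances, stabilization windows, and other safeguards around this core formula.

      import math

      def desired_replicas(current_replicas, current_metric, target_metric,
                           min_replicas=1, max_replicas=20):
          """Proportional autoscaling rule: ceil(current * metric / target),
          clamped to the configured replica bounds."""
          n = math.ceil(current_replicas * current_metric / target_metric)
          return max(min_replicas, min(max_replicas, n))

      # e.g. 4 Spark pods at 180% of a 60% CPU target -> scale out to 12
      print(desired_replicas(4, 180.0, 60.0))   # 12
      print(desired_replicas(12, 20.0, 60.0))   # 4 -> scale in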

  4. Resource-Aware Load Balancing Scheme using Multi-objective Optimization in Cloud Computing

    OpenAIRE

    Kavita Rana; Vikas Zandu

    2016-01-01

    Cloud computing is a service-based, on-demand, pay-per-use model consisting of interconnected and virtualized resources delivered over the internet. In cloud computing, there are usually a number of jobs that need to be executed with the available resources to achieve optimal performance, the least possible total time for completion, the shortest response time, efficient utilization of resources, etc. Hence, job scheduling is the most important concern, which aims to ensure that users' requirements are ...

  5. NUFFT-Based Iterative Image Reconstruction via Alternating Direction Total Variation Minimization for Sparse-View CT

    Directory of Open Access Journals (Sweden)

    Bin Yan

    2015-01-01

    Sparse-view imaging is a promising scanning method that can reduce the radiation dose in X-ray computed tomography (CT). The reconstruction algorithm for a sparse-view imaging system is of significant importance. The adoption of spatial-domain iterative algorithms for CT image reconstruction entails low operational efficiency and high computational requirements. A novel Fourier-based iterative reconstruction technique that utilizes the nonuniform fast Fourier transform is presented in this study, along with advanced total variation (TV) regularization, for sparse-view CT. Combined with the alternating direction method, the proposed approach shows excellent efficiency and a rapid convergence property. Numerical simulations and real-data experiments are performed on a parallel-beam CT. Experimental results validate that the proposed method has higher computational efficiency and better reconstruction quality than conventional algorithms, such as the simultaneous algebraic reconstruction technique using the TV method and the alternating direction total variation minimization approach, for the same time duration. The proposed method appears to have extensive applications in X-ray CT imaging.

  6. A Primal-Dual Approach for a Total Variation Wasserstein Flow

    KAUST Repository

    Benning, Martin; Calatroni, Luca; Düring, Bertram; Schönlieb, Carola-Bibiane

    2013-01-01

    We consider a nonlinear fourth-order diffusion equation that arises in denoising of image densities. We propose an implicit time-stepping scheme that employs a primal-dual method for computing the subgradient of the total variation seminorm. The constraint on the dual variable is relaxed by adding a penalty term, depending on a parameter that determines the weight of the penalisation. The paper is furnished with some numerical examples showing the denoising properties of the model considered.

  7. First-in-Man Computed Tomography-Guided Percutaneous Revascularization of Coronary Chronic Total Occlusion Using a Wearable Computer: Proof of Concept.

    Science.gov (United States)

    Opolski, Maksymilian P; Debski, Artur; Borucki, Bartosz A; Szpak, Marcin; Staruch, Adam D; Kepka, Cezary; Witkowski, Adam

    2016-06-01

    We report a case of successful computed tomography-guided percutaneous revascularization of a chronically occluded right coronary artery using a wearable, hands-free computer with a head-mounted display worn by interventional cardiologists in the catheterization laboratory. The projection of 3-dimensional computed tomographic reconstructions onto the display of the virtual-reality glasses allowed the operators to clearly visualize the distal coronary vessel and verify the direction of guide wire advancement relative to the course of the occluded vessel segment. This case provides proof of concept that wearable computers can improve operator comfort and procedure efficiency in interventional cardiology.

  8. Computing nature turing centenary perspective

    CERN Document Server

    Giovagnoli, Raffaela

    2013-01-01

    This book is about nature considered as the totality of physical existence, the universe, and our present day attempts to understand it. If we see the universe as a network of networks of computational processes at many different levels of organization, what can we learn about physics, biology, cognition, social systems, and ecology expressed through interacting networks of elementary particles, atoms, molecules, cells, (and especially neurons when it comes to understanding of cognition and intelligence), organs, organisms and their ecologies? Regarding our computational models of natural phenomena Feynman famously wondered: “Why should it take an infinite amount of logic to figure out what one tiny piece of space/time is going to do?” Phenomena themselves occur so quickly and automatically in nature. Can we learn how to harness nature’s computational power as we harness its energy and materials? This volume includes a selection of contributions from the Symposium on Natural Computing/Unconventional Com...

  9. Towards OpenVL: Improving Real-Time Performance of Computer Vision Applications

    Science.gov (United States)

    Shen, Changsong; Little, James J.; Fels, Sidney

    Meeting constraints for real-time performance is a main issue for computer vision, especially for embedded computer vision systems. This chapter presents our progress on our open vision library (OpenVL), a novel software architecture to address efficiency through facilitating hardware acceleration, reusability, and scalability for computer vision systems. A logical image understanding pipeline is introduced to allow parallel processing. We also discuss progress on our middleware, the vision library utility toolkit (VLUT), which enables applications to operate transparently over a heterogeneous collection of hardware implementations. OpenVL works as a state machine, with an event-driven mechanism to provide users with application-level interaction. Various explicit or implicit synchronization and communication methods are supported among distributed processes in the logical pipelines. The intent of OpenVL is to allow users to quickly and easily recover useful information from multiple scenes, in a cross-platform, cross-language manner across various software environments and hardware platforms. To validate the critical underlying concepts of OpenVL, a human tracking system and a local positioning system are implemented and described. The novel architecture separates the specification of algorithmic details from the underlying implementation, allowing different components to be implemented on an embedded system without recompiling code.

  10. Talking with the alien: interaction with computers in the GP consultation.

    Science.gov (United States)

    Dowell, Anthony; Stubbe, Maria; Scott-Dowell, Kathy; Macdonald, Lindsay; Dew, Kevin

    2013-01-01

    This study examines New Zealand GPs' interaction with computers in routine consultations. Twenty-eight video-recorded consultations from 10 GPs were analysed in micro-detail to explore: (i) how doctors divide their time and attention between computer and patient; (ii) the different roles ascribed to the computer; and (iii) how computer use influences the interactional flow of the consultation. All GPs engaged with the computer in some way for at least 20% of each consultation, and on average spent 12% of time totally focussed on the computer. Patterns of use varied; most GPs inputted all or most notes during the consultation, but a few set aside dedicated time afterwards. The computer acted as an additional participant enacting roles like information repository and legitimiser of decisions. Computer use also altered some of the normal 'rules of engagement' between doctor and patient. Long silences and turning away interrupted the smooth flow of conversation, but various 'multitasking' strategies allowed GPs to remain engaged with patients during episodes of computer use (e.g. signposting, online commentary, verbalising while typing, social chat). Conclusions were that use of computers has many benefits but also significantly influences the fine detail of the GP consultation. Doctors must consciously develop strategies to manage this impact.

  11. Resolving time of scintillation camera-computer system and methods of correction for counting loss, 2

    International Nuclear Information System (INIS)

    Iinuma, Takeshi; Fukuhisa, Kenjiro; Matsumoto, Toru

    1975-01-01

    Following the previous work, the counting-rate performance of camera-computer systems was investigated for two modes of data acquisition. The first was the 'LIST' mode, in which image data and timing signals were sequentially stored on magnetic disk or tape via a buffer memory. The second was the 'HISTOGRAM' mode, in which image data were stored in a core memory as digital images and the images were then transferred to magnetic disk or tape at the frame timing signal. First, the counting rates stored in the buffer memory were measured as a function of the display event rates of the scintillation camera for the two modes. For both modes, the stored counting rates (M) were expressed by the following formula: M = N(1 - Nτ), where N is the display event rate of the camera and τ is the resolving time, including analog-to-digital conversion time and memory cycle time. The resolving time for each mode may have been different, but it was about 10 μs for both modes in our computer system (TOSBAC 3400 model 31). Second, the data transfer speed from the buffer memory to the external memory, such as magnetic disk or tape, was considered for the two modes. For the 'LIST' mode, the maximum stored counting rate from the camera is expressed in terms of the size of the buffer memory and the access time and data transfer rate of the external memory. For the 'HISTOGRAM' mode, the minimum frame time is determined by the size of the buffer memory and the access time and transfer rate of the external memory. In our system, the maximum stored counting rate was about 17,000 counts/s with a buffer size of 2,000 words, and the minimum frame time was about 130 ms with a buffer size of 1,024 words. These values agree well with the calculated ones. From the present analysis, the design of camera-computer systems for quantitative dynamic imaging becomes possible, and future improvements are suggested. (author)
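
    The counting-loss relation quoted above, M = N(1 - Nτ), can be applied in both directions, as sketched below; the resolving time is the roughly 10 μs reported for the authors' system, and the inversion takes the physical root of the resulting quadratic in N.

      import math

      TAU = 10e-6  # resolving time in seconds (about 10 microseconds, per the text)

      def stored_rate(n_display):
          """Stored counting rate M for display event rate N: M = N(1 - N*tau)."""
          return n_display * (1.0 - n_display * TAU)

      def display_rate(m_stored):
          """Invert M = N(1 - N*tau) for N (smaller root of tau*N^2 - N + M = 0)."""
          disc = 1.0 - 4.0 * TAU * m_stored
          if disc < 0:
              raise ValueError("stored rate exceeds the model's maximum")
          return (1.0 - math.sqrt(disc)) / (2.0 * TAU)

      n = 20000.0                  # display events per second
      m = stored_rate(n)           # counts actually stored (16000 here)
      print(round(m), round(display_rate(m)))  # recovers ~20000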

  12. Job tasks, computer use, and the decreasing part-time pay penalty for women in the UK

    NARCIS (Netherlands)

    Elsayed, A.E.A.; de Grip, A.; Fouarge, D.

    2014-01-01

    Using data from the UK Skills Surveys, we show that the part-time pay penalty for female workers within low- and medium-skilled occupations decreased significantly over the period 1997-2006. The convergence in computer use between part-time and full-time workers within these occupations explains a large part of this decrease.

  13. Application of queueing models to multiprogrammed computer systems operating in a time-critical environment

    Science.gov (United States)

    Eckhardt, D. E., Jr.

    1979-01-01

    A model of a central processor (CPU) that services background applications in the presence of time-critical activity is presented. The CPU is viewed as an M/M/1 queueing system subject to periodic interrupts by a deterministic, time-critical process. The Laplace transform of the distribution of service times for the background applications is developed. The use of state-of-the-art queueing models for studying the background processing capability of time-critical computer systems is discussed, and the results of a model validation study that support this application of queueing models are presented.
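
    For reference, the baseline M/M/1 quantities underlying such a model can be computed as sketched below; the arrival and service rates are illustrative, and the paper's treatment of deterministic periodic interrupts is not reproduced.

      def mm1_metrics(lam, mu):
          """Classic M/M/1 results for arrival rate lam and service rate mu."""
          if lam >= mu:
              raise ValueError("unstable queue: need lam < mu")
          rho = lam / mu          # server utilization
          L = rho / (1 - rho)     # mean number in system
          W = 1.0 / (mu - lam)    # mean time in system
          Wq = rho / (mu - lam)   # mean wait in queue
          return {"rho": rho, "L": L, "W": W, "Wq": Wq}

      print(mm1_metrics(lam=8.0, mu=10.0))  # e.g. rho=0.8, W=0.5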

  14. [Evaluation of production and clinical working time of computer-aided design/computer-aided manufacturing (CAD/CAM) custom trays for complete denture].

    Science.gov (United States)

    Wei, L; Chen, H; Zhou, Y S; Sun, Y C; Pan, S X

    2017-02-18

    To compare the technician fabrication time and clinical working time of custom trays fabricated using two different methods (three-dimensional-printed custom trays and conventional custom trays), and to demonstrate the feasibility of computer-aided design/computer-aided manufacturing (CAD/CAM) custom trays for clinical use from the perspective of clinical time cost. Twenty edentulous patients were recruited into this prospective, single-blind, randomized self-controlled clinical trial. Two custom trays were fabricated for each participant: one fabricated using the functional suitable denture (FSD) system through a CAD/CAM process, and the other manually fabricated using conventional methods. The final impressions were then taken using both custom trays, and the final impressions were used to fabricate complete dentures respectively. The technician production time of the custom trays and the clinical working time of taking the final impression were recorded. The average times spent fabricating the three-dimensional-printed custom trays using the FSD system and fabricating the conventional custom trays manually were (28.6±2.9) min and (31.1±5.7) min, respectively. The average times spent making the final impression with the three-dimensional-printed custom trays and the manually fabricated conventional custom trays were (23.4±11.5) min and (25.4±13.0) min, respectively. There was a significant difference in the technician fabrication time and the clinical working time between the three-dimensional-printed custom trays and the manually fabricated conventional custom trays (P<0.05). To manufacture custom trays by the three-dimensional printing method, there is no need to pour a preliminary cast after taking the primary impression; therefore, it can save impression material and model material. As for complete denture restoration, manufacturing custom trays using the FSD system is worth being promoted.

  15. Confabulation Based Real-time Anomaly Detection for Wide-area Surveillance Using Heterogeneous High Performance Computing Architecture

    Science.gov (United States)

    2015-06-01

    Confabulation-based real-time anomaly detection for wide-area surveillance using a heterogeneous high performance computing architecture (Syracuse, NY; contract number FA8750-12-1-0251). The system was implemented on heterogeneous processors, including graphics processing units (GPUs) and Intel Xeon Phi processors. Experimental results showed significant speedups, which can enable real-time anomaly detection for wide-area surveillance.

  16. Virtual photons in imaginary time: Computing exact Casimir forces via standard numerical electromagnetism techniques

    International Nuclear Information System (INIS)

    Rodriguez, Alejandro; Ibanescu, Mihai; Joannopoulos, J. D.; Johnson, Steven G.; Iannuzzi, Davide

    2007-01-01

    We describe a numerical method to compute Casimir forces in arbitrary geometries, for arbitrary dielectric and metallic materials, with arbitrary accuracy (given sufficient computational resources). Our approach, based on well-established integration of the mean stress tensor evaluated via the fluctuation-dissipation theorem, is designed to directly exploit fast methods developed for classical computational electromagnetism, since it only involves repeated evaluation of the Green's function for imaginary frequencies (equivalently, real frequencies in imaginary time). We develop the approach by systematically examining various formulations of Casimir forces from the previous decades and evaluating them according to their suitability for numerical computation. We illustrate our approach with a simple finite-difference frequency-domain implementation, test it for known geometries such as a cylinder and a plate, and apply it to new geometries. In particular, we show that a pistonlike geometry of two squares sliding between metal walls, in both two and three dimensions with both perfect and realistic metallic materials, exhibits a surprising nonmonotonic "lateral" force from the walls.

  17. Region-oriented CT image representation for reducing computing time of Monte Carlo simulations

    International Nuclear Information System (INIS)

    Sarrut, David; Guigues, Laurent

    2008-01-01

    Purpose. We propose a new method for efficient particle transportation in voxelized geometry for Monte Carlo simulations, and describe its use for calculating dose distributions in CT images for radiation therapy. Material and methods. The proposed approach, based on an implicit volume representation named segmented volume, coupled with an adapted segmentation procedure and a distance map, allows us to minimize the number of boundary crossings, which slow down the simulation. The method was implemented with the GEANT4 toolkit and compared to four other methods: one box per voxel, parameterized volumes, octree-based volumes, and nested parameterized volumes. For each representation, we compared dose distribution, time, and memory consumption. Results. The proposed method allows us to decrease computational time by up to a factor of 15, while keeping memory consumption low, and without any modification of the transportation engine. The speedup is related to the geometry complexity and the number of different materials used. We obtained an optimal number of steps by removing all unnecessary steps between adjacent voxels sharing a similar material; however, the cost of each step is increased. When the number of steps cannot be decreased enough, due, for example, to a large number of material boundaries, such a method is not considered suitable. Conclusion. This feasibility study shows that optimizing the representation of an image in memory potentially increases computing efficiency. We used the GEANT4 toolkit, but other Monte Carlo simulation codes could potentially be used. The method introduces a tradeoff between speed and geometry accuracy, allowing gains in computational time. However, simulations with GEANT4 remain slow, and further work is needed to speed up the procedure while preserving the desired accuracy.
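
The core idea, replacing many per-voxel boundary crossings with one step per homogeneous region, can be illustrated with a simple run-length encoding of a row of voxel materials; this is only a didactic sketch, not GEANT4 code or the paper's segmented-volume implementation.

```python
# Collapse runs of adjacent voxels sharing a material, so a particle takes
# one step per homogeneous segment instead of one per voxel boundary.

from itertools import groupby

def segment_row(materials):
    """Run-length encode one row of per-voxel material ids."""
    return [(mat, len(list(run))) for mat, run in groupby(materials)]

row = ["air", "air", "air", "soft", "soft", "bone", "soft", "soft", "soft"]
print(segment_row(row))
# -> [('air', 3), ('soft', 2), ('bone', 1), ('soft', 3)]
# 8 voxel boundaries reduced to 3 material boundaries along this row.
```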

  18. Real-time computation of parameter fitting and image reconstruction using graphical processing units

    Science.gov (United States)

    Locans, Uldis; Adelmann, Andreas; Suter, Andreas; Fischer, Jannis; Lustermann, Werner; Dissertori, Günther; Wang, Qiulin

    2017-06-01

    In recent years graphical processing units (GPUs) have become a powerful tool in scientific computing. Their potential to speed up highly parallel applications brings the power of high performance computing to a wider range of users. However, programming these devices and integrating their use in existing applications is still a challenging task. In this paper we examined the potential of GPUs for two different applications. The first application, created at Paul Scherrer Institut (PSI), is used for parameter fitting during data analysis of μSR (muon spin rotation, relaxation and resonance) experiments. The second application, developed at ETH, is used for PET (Positron Emission Tomography) image reconstruction and analysis. Applications currently in use were examined to identify the parts of the algorithms in need of optimization. Efficient GPU kernels were created in order to allow the applications to use a GPU and to speed up the previously identified parts. Benchmarking tests were performed in order to measure the achieved speedup. During this work, we focused on single-GPU systems to show that real-time data analysis of these problems can be achieved without the need for large computing clusters. The results show that the currently used application for parameter fitting, which uses OpenMP to parallelize calculations over multiple CPU cores, can be accelerated around 40 times through the use of a GPU. The speedup may vary depending on the size and complexity of the problem. For PET image analysis, the obtained speedups of the GPU version were more than 40× compared to a single-core CPU implementation. The achieved results show that it is possible to improve the execution time by orders of magnitude.

  19. Automated selection of brain regions for real-time fMRI brain-computer interfaces

    Science.gov (United States)

    Lührs, Michael; Sorger, Bettina; Goebel, Rainer; Esposito, Fabrizio

    2017-02-01

    Objective. Brain-computer interfaces (BCIs) implemented with real-time functional magnetic resonance imaging (rt-fMRI) use fMRI time-courses from predefined regions of interest (ROIs). To reach best performances, localizer experiments and on-site expert supervision are required for ROI definition. To automate this step, we developed two unsupervised computational techniques based on the general linear model (GLM) and independent component analysis (ICA) of rt-fMRI data, and compared their performances on a communication BCI. Approach. 3 T fMRI data of six volunteers were re-analyzed in simulated real-time. During a localizer run, participants performed three mental tasks following visual cues. During two communication runs, a letter-spelling display guided the subjects to freely encode letters by performing one of the mental tasks with a specific timing. GLM- and ICA-based procedures were used to decode each letter, respectively using compact ROIs and whole-brain distributed spatio-temporal patterns of fMRI activity, automatically defined from subject-specific or group-level maps. Main results. Letter-decoding performances were comparable to supervised methods. In combination with a similarity-based criterion, GLM- and ICA-based approaches successfully decoded more than 80% (average) of the letters. Subject-specific maps yielded optimal performances. Significance. Automated solutions for ROI selection may help accelerating the translation of rt-fMRI BCIs from research to clinical applications.

  20. Reduced opiate use after total knee arthroplasty using computer-assisted cryotherapy.

    Science.gov (United States)

    Thijs, Elke; Schotanus, Martijn G M; Bemelmans, Yoeri F L; Kort, Nanne P

    2018-05-03

    Despite multimodal pain management and advances in anesthetic techniques, total knee arthroplasty (TKA) remains painful during the early postoperative phase. This trial investigated whether computer-assisted cryotherapy (CAC) is effective in reducing pain and consumption of opioids in patients operated for TKA following an outpatient surgery pathway. Sixty patients scheduled for primary TKA were included in this prospective, double-blind, randomized controlled trial, receiving CAC at 10-12 °C (Cold group, n = 30) or at 21 °C (Warm group, n = 30) during the first 7 days after TKA according to a fixed schedule. All patients received the same pre-, peri- and postoperative care with a multimodal pain protocol. Pain was assessed before and after every session of cryotherapy using the numerical rating scale for pain (NRS-pain). The consumption of opioids was strictly recorded during the first 4 postoperative days. Secondary outcomes were knee swelling, visual hematoma and patient-reported outcome measures (PROMs). These parameters were measured pre-, 1, 2 and 6 weeks postoperatively. In both study groups, a reduction in NRS-pain after every CAC session was seen during the postoperative period of 7 days. A mean reduction of 0.9 and 0.7 on the NRS-pain was seen for the Cold group (P = 0.008) and the Warm group (n.s.), respectively. A significantly (P = 0.001) lower number of opioids was used by the Cold group during the acute postoperative phase of 4 days: 47 and 83 tablets for the Cold and Warm group, respectively. No difference could be observed in secondary outcomes and adverse effects between the study groups. Postoperative CAC can be of added value in patients following an outpatient surgery pathway for TKA, resulting in reduced experienced pain and consumption of opioids during the first postoperative days.

  1. Cerebral perfusion computed tomography deconvolution via structure tensor total variation regularization

    Energy Technology Data Exchange (ETDEWEB)

    Zeng, Dong; Zhang, Xinyu; Bian, Zhaoying, E-mail: zybian@smu.edu.cn, E-mail: jhma@smu.edu.cn; Huang, Jing; Zhang, Hua; Lu, Lijun; Lyu, Wenbing; Feng, Qianjin; Chen, Wufan; Ma, Jianhua, E-mail: zybian@smu.edu.cn, E-mail: jhma@smu.edu.cn [Department of Biomedical Engineering, Southern Medical University, Guangzhou, Guangdong 510515, China and Guangdong Provincial Key Laboratory of Medical Image Processing, Southern Medical University, Guangzhou, Guangdong 510515 (China); Zhang, Jing [Department of Radiology, Tianjin Medical University General Hospital, Tianjin 300052 (China)

    2016-05-15

    Purpose: Cerebral perfusion computed tomography (PCT) imaging as an accurate and fast acute ischemic stroke examination has been widely used in clinic. Meanwhile, a major drawback of PCT imaging is the high radiation dose due to its dynamic scan protocol. The purpose of this work is to develop a robust perfusion deconvolution approach via structure tensor total variation (STV) regularization (PD-STV) for estimating an accurate residue function in PCT imaging with the low-milliampere-seconds (low-mAs) data acquisition. Methods: Besides modeling the spatio-temporal structure information of PCT data, the STV regularization of the present PD-STV approach can utilize the higher order derivatives of the residue function to enhance denoising performance. To minimize the objective function, the authors propose an effective iterative algorithm with a shrinkage/thresholding scheme. A simulation study on a digital brain perfusion phantom and a clinical study on an old infarction patient were conducted to validate and evaluate the performance of the present PD-STV approach. Results: In the digital phantom study, visual inspection and quantitative metrics (i.e., the normalized mean square error, the peak signal-to-noise ratio, and the universal quality index) assessments demonstrated that the PD-STV approach outperformed other existing approaches in terms of the performance of noise-induced artifacts reduction and accurate perfusion hemodynamic maps (PHM) estimation. In the patient data study, the present PD-STV approach could yield accurate PHM estimation with several noticeable gains over other existing approaches in terms of visual inspection and correlation analysis. Conclusions: This study demonstrated the feasibility and efficacy of the present PD-STV approach in utilizing STV regularization to improve the accuracy of residue function estimation of cerebral PCT imaging in the case of low-mAs.

  2. Self-Motion Perception: Assessment by Real-Time Computer Generated Animations

    Science.gov (United States)

    Parker, Donald E.

    1999-01-01

    Our overall goal is to develop materials and procedures for assessing vestibular contributions to spatial cognition. The specific objective of the research described in this paper is to evaluate computer-generated animations as potential tools for studying self-orientation and self-motion perception. Specific questions addressed in this study included the following. First, does a non-verbal perceptual reporting procedure using real-time animations improve assessment of spatial orientation? Are reports reliable? Second, do reports confirm expectations based on stimuli to the vestibular apparatus? Third, can reliable reports be obtained when self-motion description vocabulary training is omitted?

  3. Distributed computer control systems in future nuclear power plants

    International Nuclear Information System (INIS)

    Yan, G.; L'Archeveque, J.V.R.; Watkins, L.M.

    1978-09-01

    Good operating experience with computer control in CANDU reactors over the last decade justifies a broadening of the role of digital electronic and computer-related technologies in future plants. Functions of electronic systems in the total plant context are reappraised to help evolve an appropriate match between technology and future applications. The systems research, development and demonstration program at CRNL is described, focusing on the projects pertinent to the real-time data acquisition and process control requirements. (author)

  4. Central axis dose verification in patients treated with total body irradiation of photons using a Computed Radiography system

    International Nuclear Information System (INIS)

    Rubio Rivero, A.; Caballero Pinelo, R.; Gonzalez Perez, Y.

    2015-01-01

    To propose and evaluate a method for central-axis dose verification in patients treated with total body irradiation (TBI) of photons, using images obtained with a Computed Radiography (CR) system. Readings from the Computed Radiography (Fuji) portal imaging cassette were correlated with the absorbed dose in water measured with an ionization chamber in 10 x 10 irradiation fields on the 60Co unit. The analytical and graphical expression was obtained with the software Origin8; the TBI patient portal verification images were processed with the software ImageJ to obtain the patient dose. To validate the results, the absorbed dose in RW3 phantoms of different thicknesses was measured with an ionization chamber, simulating real TBI conditions. Finally, a retrospective study over the last 4 years was performed, obtaining the patients' absorbed dose from the image readings and comparing it with the planned dose. The analytical equation obtained permits estimating the absorbed dose using the image pixel value and the dose measured with the ionization chamber, correlated with the patient's clinical records. These results were compared with reported evidence, obtaining a difference of less than 2%; the three methods were compared and the results agree within 10%. (Author)

  5. Prospective pilot study of a tablet computer in an Emergency Department.

    Science.gov (United States)

    Horng, Steven; Goss, Foster R; Chen, Richard S; Nathanson, Larry A

    2012-05-01

    The recent availability of low-cost tablet computers can facilitate bedside information retrieval by clinicians. To evaluate the effect of physician tablet use in the Emergency Department. Prospective cohort study comparing physician workstation usage with and without a tablet. 55,000 visits/year Level 1 Emergency Department at a tertiary academic teaching hospital. 13 emergency physicians (7 Attendings, 4 EM3s, and 2 EM1s) worked a total of 168 scheduled shifts (130 without and 38 with tablets) during the study period. Physician use of a tablet computer while delivering direct patient care in the Emergency Department. The primary outcome measure was the time spent using the Emergency Department Information System (EDIS) at a computer workstation per shift. The secondary outcome measure was the number of EDIS logins at a computer workstation per shift. Clinician use of a tablet was associated with a 38 min (17-59) decrease in time spent per shift using the EDIS at a computer workstation (p < 0.05). Use of a tablet computer was associated with a reduction in the number of times physicians logged into a computer workstation and a reduction in the amount of time they spent there using the EDIS. The presumed benefit is that decreasing time at a computer workstation increases physician availability at the bedside. However, this association will require further investigation. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  6. Analysis of local ionospheric time varying characteristics with singular value decomposition

    DEFF Research Database (Denmark)

    Jakobsen, Jakob Anders; Knudsen, Per; Jensen, Anna B. O.

    2010-01-01

    In this paper, a time series from 1999 to 2007 of absolute total electron content (TEC) values has been computed and analyzed using singular value decomposition (SVD). The data set has been computed using a Kalman filter and is based on dual-frequency GPS data from three reference stations in Denmark.
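
A minimal sketch of the analysis pattern described above, applied to synthetic TEC data (three stations by hourly epochs); the station data, dimensions and the diurnal signal are invented for illustration, not taken from the study.

```python
# SVD of a (stations x epochs) TEC matrix: left singular vectors are spatial
# patterns, right singular vectors temporal modes, singular values their weights.

import numpy as np

rng = np.random.default_rng(0)
hours = np.arange(24 * 365)                            # one synthetic year
diurnal = 10.0 + 5.0 * np.sin(2.0 * np.pi * hours / 24.0)
tec = np.vstack([diurnal + rng.normal(0.0, 1.0, hours.size) for _ in range(3)])

centered = tec - tec.mean(axis=1, keepdims=True)       # remove station means
u, s, vt = np.linalg.svd(centered, full_matrices=False)
print("relative mode strengths:", np.round(s / s.sum(), 3))
```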

  7. Pair plasma relaxation time scales.

    Science.gov (United States)

    Aksenov, A G; Ruffini, R; Vereshchagin, G V

    2010-04-01

    By numerically solving the relativistic Boltzmann equations, we compute the time scale for relaxation to thermal equilibrium for an optically thick electron-positron plasma with baryon loading. We focus on the time scales of electromagnetic interactions. The collisional integrals are obtained directly from the corresponding QED matrix elements. Thermalization time scales are computed for a wide range of values of both the total-energy density (over 10 orders of magnitude) and of the baryonic loading parameter (over 6 orders of magnitude). This also allows us to study such interesting limiting cases as the almost purely electron-positron plasma or electron-proton plasma as well as intermediate cases. These results appear to be important both for laboratory experiments aimed at generating optically thick pair plasmas as well as for astrophysical models in which electron-positron pair plasmas play a relevant role.

  8. An Implementation of Parallel and Networked Computing Schemes for the Real-Time Image Reconstruction Based on Electrical Tomography

    International Nuclear Information System (INIS)

    Park, Sook Hee

    2001-02-01

    This thesis implements and analyzes parallel and networked computing libraries based on multiprocessor computer architectures as well as networked computers, aiming at improving the computation speed of ET (Electrical Tomography) systems, which require enormous CPU time to reconstruct the unknown internal state of the target object. As an instance of typical tomography technology, ET partitions the cross-section of the target object into tiny elements and calculates their resistivity from signal values measured at the boundary electrodes surrounding the surface of the object after injecting a predetermined current pattern through the object. The number of elements is determined considering the trade-off between the accuracy of the reconstructed image and the computation time. As the elements become finer, the number of elements increases and the system can obtain a better image. However, the reconstruction time increases polynomially with the number of partitioned elements, since the procedure consists of a number of time-consuming matrix operations such as multiplication, inverse, pseudo-inverse, Jacobian and so on. Consequently, the demand for improving computation speed via multiple processors grows indispensably. Moreover, currently released PCs can be equipped with up to 4 CPUs interconnected to a shared memory, while some operating systems enable an application process to benefit from such computers by allocating threaded jobs to each CPU, resulting in concurrent processing. In addition, a networked computing or cluster computing environment is commonly available to almost every computer which supports a communication protocol and is connected to a local or global network. After partitioning the given job (numerical operation), each CPU or computer calculates its partial result independently, and the results are merged via common memory to produce the final result. It is desirable to adopt a commonly used library such as Matlab.

  9. Computer architecture for efficient algorithmic executions in real-time systems: New technology for avionics systems and advanced space vehicles

    Science.gov (United States)

    Carroll, Chester C.; Youngblood, John N.; Saha, Aindam

    1987-01-01

    Improvements and advances in the development of computer architecture now provide innovative technology for recasting traditional sequential solutions into high-performance, low-cost parallel systems to increase system performance. Research conducted on the development of a specialized computer architecture for real-time algorithmic execution of an avionics guidance and control problem is described. A comprehensive treatment of both the hardware and software structures of a customized computer which performs real-time computation of guidance commands with updated estimates of target motion and time-to-go is presented. An optimal, real-time allocation algorithm was developed which maps the algorithmic tasks onto the processing elements. This allocation is based on critical path analysis. The final stage is the design and development of hardware structures suitable for efficient execution of the allocated task graph. The processing element is designed for rapid execution of the allocated tasks. Fault tolerance is a key feature of the overall architecture. Parallel numerical integration techniques, task definitions, and allocation algorithms are discussed. The parallel implementation is analytically verified and the experimental results are presented. The design of the data-driven computer architecture, customized for the execution of the particular algorithm, is discussed.
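
Since the allocation is based on critical path analysis, a small sketch may help: the longest path through a task DAG bounds the achievable schedule length. The task graph below is hypothetical, not the avionics task graph from the paper.

```python
# Longest (critical) path through a hypothetical task DAG:
# each task maps to (duration, set of predecessor tasks).

from graphlib import TopologicalSorter

tasks = {
    "sense":    (2, set()),
    "estimate": (4, {"sense"}),
    "guide":    (3, {"estimate"}),
    "control":  (1, {"estimate"}),
    "output":   (1, {"guide", "control"}),
}

def critical_path_length(tasks):
    finish = {}
    order = TopologicalSorter({name: deps for name, (_, deps) in tasks.items()})
    for name in order.static_order():
        duration, deps = tasks[name]
        finish[name] = duration + max((finish[d] for d in deps), default=0)
    return max(finish.values())

print("critical path length:", critical_path_length(tasks))  # -> 10
```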

  10. Total Correlation Function Integrals and Isothermal Compressibilities from Molecular Simulations

    DEFF Research Database (Denmark)

    Wedberg, Rasmus; Peters, Günther H.j.; Abildskov, Jens

    2008-01-01

    Generation of thermodynamic data, here compressed liquid density and isothermal compressibility data, using molecular dynamics simulations is investigated. Five normal alkane systems are simulated at three different state points. We compare two main approaches to isothermal compressibilities: (1...... in approximately the same amount of time. This suggests that computation of total correlation function integrals is a route to isothermal compressibility, as accurate and fast as well-established benchmark techniques. A crucial step is the integration of the radial distribution function. To obtain sensible results...

  11. Computing Camps for Girls : A First-Time Experience at the University of Limerick

    NARCIS (Netherlands)

    McInerney, Clare; Lamprecht, A.L.; Margaria, Tiziana

    2018-01-01

    Increasing the number of females in ICT-related university courses has been a major concern for several years. In 2015, we offered a girls-only computing summer camp for the first time, as a new component in our education and outreach activities to foster students' interest in our discipline.

  12. Computer-guided total synthesis of natural products: Recent examples and future perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Della-Felice, Franco; Pilli, Ronaldo A. [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Instituto de Química; Sarotti, Ariel M., E-mail: pilli@iqm.unicamp.br, E-mail: sarotti@iquir-conicet.gov.ar [Instituto de Química, Universidad Nacional de Rosario-CONICET (Argentina)

    2018-05-01

    Quantum chemical calculations of nuclear magnetic resonance (NMR) shifts and coupling constants have been extensively employed in recent years mainly to facilitate structural elucidation of organic molecules. When the results of such calculations are used to determine the most likely structure of a natural product in advance, guiding the subsequent synthetic work, the term 'computer-guided synthesis' could be coined. This review article describes the most relevant examples from recent literature, highlighting the scope and limitations of this merged computational/experimental approach as well. (author)

  14. A Computational Model for Real-Time Calculation of Electric Field due to Transcranial Magnetic Stimulation in Clinics

    Directory of Open Access Journals (Sweden)

    Alessandra Paffi

    2015-01-01

    The aim of this paper is to propose an approach for accurate and fast (real-time) computation of the electric field induced inside the whole brain volume during a transcranial magnetic stimulation (TMS) procedure. The numerical solution implements the admittance method for a discretized realistic brain model derived from Magnetic Resonance Imaging (MRI). Results are in good agreement with those obtained using commercial codes and require much less computational time. Integration of the developed code with neuronavigation tools will permit real-time evaluation of the stimulated brain regions during TMS delivery, thus improving the efficacy of clinical applications.

  15. A practical O(n log² n) time algorithm for computing the triplet distance on binary trees

    DEFF Research Database (Denmark)

    Sand, Andreas; Pedersen, Christian Nørgaard Storm; Mailund, Thomas

    2013-01-01

    The triplet distance is a distance measure that compares two rooted trees on the same set of leaves by enumerating all subsets of three leaves and counting how often the induced topologies of the trees are equal or different. We present an algorithm that computes the triplet distance between two rooted binary trees in time O(n log² n). The algorithm is related to an algorithm for computing the quartet distance between two unrooted binary trees in time O(n log n). While the quartet distance algorithm has a very severe overhead in the asymptotic time complexity that makes it impractical compared...
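
To make the measure concrete, here is a naive cubic-time sketch that enumerates all leaf triples and compares their induced topologies; the published algorithm achieves O(n log² n) with a far more sophisticated approach, and the nested-tuple tree encoding here is an assumption chosen for compactness.

```python
# Naive O(n^3) triplet distance: compare the induced topology of every
# leaf triple in both trees. Trees are nested tuples, e.g. ((("a","b"),"c"),"d").

from itertools import combinations

def clusters(tree):
    """Return (leaf set, set of all subtree leaf-sets) of a rooted binary tree."""
    if not isinstance(tree, tuple):
        leaf = frozenset([tree])
        return leaf, {leaf}
    (lset, lcl), (rset, rcl) = clusters(tree[0]), clusters(tree[1])
    here = lset | rset
    return here, lcl | rcl | {here}

def triple_topology(cl, a, b, c):
    """The pair of {a,b,c} grouped under a common ancestor excluding the third."""
    for x, y, z in ((a, b, c), (a, c, b), (b, c, a)):
        if any(x in s and y in s and z not in s for s in cl):
            return frozenset((x, y))
    return None  # unresolved

def triplet_distance(t1, t2):
    leaves, cl1 = clusters(t1)
    cl2 = clusters(t2)[1]
    return sum(triple_topology(cl1, *t) != triple_topology(cl2, *t)
               for t in combinations(sorted(leaves), 3))

t1 = ((("a", "b"), "c"), "d")
t2 = ((("a", "c"), "b"), "d")
print(triplet_distance(t1, t2))  # -> 1 (only the triple {a,b,c} differs)
```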

  16. Time Series Analysis, Modeling and Applications A Computational Intelligence Perspective

    CERN Document Server

    Chen, Shyi-Ming

    2013-01-01

    Temporal and spatiotemporal data form an inherent fabric of society, as we are faced with streams of data coming from numerous sensors, data feeds, and recordings associated with numerous areas of application embracing physical and human-generated phenomena (environmental data, financial markets, Internet activities, etc.). The quest for a thorough analysis, interpretation, modeling and prediction of time series comes with an ongoing challenge of developing models that are both accurate and user-friendly (interpretable). The volume aims to exploit the conceptual and algorithmic framework of Computational Intelligence (CI) to form a cohesive and comprehensive environment for building models of time series. The contributions covered in the volume are fully reflective of the wealth of CI technologies, bringing together ideas, algorithms, and numeric studies which convincingly demonstrate their relevance, maturity and visible usefulness. It reflects upon the truly remarkable diversity of methodological a...

  17. Computer-Aided Software Engineering - An approach to real-time software development

    Science.gov (United States)

    Walker, Carrie K.; Turkovich, John J.

    1989-01-01

    A new software engineering discipline is Computer-Aided Software Engineering (CASE), a technology aimed at automating the software development process. This paper explores the development of CASE technology, particularly in the area of real-time/scientific/engineering software, and a history of CASE is given. The proposed software development environment for the Advanced Launch System (ALS CASE) is described as an example of an advanced software development system for real-time/scientific/engineering (RT/SE) software. The Automated Programming Subsystem of ALS CASE automatically generates executable code and corresponding documentation from a suitably formatted specification of the software requirements. Software requirements are interactively specified in the form of engineering block diagrams. Several demonstrations of the Automated Programming Subsystem are discussed.

  18. Computer work and musculoskeletal disorders of the neck and upper extremity: A systematic review

    Directory of Open Access Journals (Sweden)

    Veiersted Kaj Bo

    2010-04-01

    Background: This review examines the evidence for an association between computer work and neck and upper extremity disorders (except carpal tunnel syndrome). Methods: A systematic critical review of studies of computer work and musculoskeletal disorders verified by a physical examination was performed. Results: A total of 22 studies (26 articles) fulfilled the inclusion criteria. Results show limited evidence for a causal relationship between computer work per se, and computer mouse and keyboard time, related to a diagnosis of wrist tendonitis, and for an association between computer mouse time and forearm disorders. Limited evidence was also found for a causal relationship between computer work per se and computer mouse time related to tension neck syndrome, but the evidence for keyboard time was insufficient. Insufficient evidence was found for an association between other musculoskeletal diagnoses of the neck and upper extremities, including shoulder tendonitis and epicondylitis, and any aspect of computer work. Conclusions: There is limited epidemiological evidence for an association between aspects of computer work and some of the clinical diagnoses studied. None of the evidence was considered moderate or strong, and there is a need for more and better documentation.

  19. Effect of temperature, time, and milling process on yield, flavonoid, and total phenolic content of Zingiber officinale water extract

    Science.gov (United States)

    Andriyani, R.; Kosasih, W.; Ningrum, D. R.; Pudjiraharti, S.

    2017-03-01

    Several parameters, such as temperature, time of extraction, and size of simplicia, play a significant role in medicinal herb extraction. This study aimed to investigate the effect of those parameters on extract yield, flavonoid, and total phenolic content in the water extract of Zingiber officinale. The temperatures used were 50, 70 and 90°C and the extraction times were 30, 60 and 90 min. Z. officinale in the form of powder and chips was used to study the effect of the milling treatment. The correlation among those variables was analysed using two-way ANOVA without replication. The result showed that time and temperature did not influence the extract yield of powder simplicia. However, extraction time influenced the extract yield of simplicia prepared without milling. On the other hand, flavonoid and total phenolic content were not influenced by temperature, time, or milling treatment.

  20. Time-driven Activity-based Cost of Fast-Track Total Hip and Knee Arthroplasty

    DEFF Research Database (Denmark)

    Andreasen, Signe E; Holm, Henriette B; Jørgensen, Mira

    2017-01-01

    As THA and TKA are potentially costly procedures and their numbers are increasing in an economically limited environment, the aim of this study is to present baseline detailed economic calculations of fast-track THA and TKA and to compare these between 2 departments with different logistical set-ups. METHODS: Prospective data collection was analyzed using the time-driven activity-based costing method (TDABC) on time consumed by different staff members involved in patient treatment in the perioperative period of fast-track THA and TKA in 2 Danish orthopedic departments with standardized fast-track settings but different logistical set-ups. RESULTS: Length of stay was median 2 days in both departments. TDABC revealed minor differences in the perioperative settings between departments, but the total cost excluding the prosthesis was similar at USD ... Combined with the fast-track methodology, the result could be a more cost-effective pathway altogether.

  1. Neural Computations in a Dynamical System with Multiple Time Scales

    Directory of Open Access Journals (Sweden)

    Yuanyuan Mi

    2016-09-01

    Neural systems display rich short-term dynamics at various levels, e.g., spike-frequency adaptation (SFA) at single neurons, and short-term facilitation (STF) and depression (STD) at neuronal synapses. These dynamical features typically cover a broad range of time scales and exhibit large diversity in different brain regions. It remains unclear what the computational benefit is for the brain to have such variability in short-term dynamics. In this study, we propose that the brain can exploit such dynamical features to implement multiple seemingly contradictory computations in a single neural circuit. To demonstrate this idea, we use a continuous attractor neural network (CANN) as a working model and include STF, SFA and STD with increasing time constants in their dynamics. Three computational tasks are considered: persistent activity, adaptation, and anticipative tracking. These tasks require conflicting neural mechanisms, and hence cannot be implemented by a single dynamical feature or any combination with similar time constants. However, with properly coordinated STF, SFA and STD, we show that the network is able to implement the three computational tasks concurrently. We hope this study will shed light on the understanding of how the brain orchestrates its rich dynamics at various levels to realize diverse cognitive functions.

  2. Self-reported screen time and cardiometabolic risk in obese Dutch adolescents.

    Directory of Open Access Journals (Sweden)

    Teatske M Altenburg

    BACKGROUND: It is not clear whether the association between sedentary time and cardiometabolic risk exists among obese adolescents. We examined the association between screen time (TV and computer time) and cardiometabolic risk in obese Dutch adolescents. METHODS AND FINDINGS: For the current cross-sectional study, baseline data of 125 Dutch overweight and obese adolescents (12-18 years) participating in the Go4it study were included. Self-reported screen time (Activity Questionnaire for Adolescents and Adults) and clustered and individual cardiometabolic risk factors (i.e. body composition, systolic and diastolic blood pressure, low-density (LDL-C), high-density (HDL-C) and total cholesterol (TC), triglycerides, glucose and insulin) were assessed in all participants. Multiple linear regression analyses were used to assess the association between screen time and cardiometabolic risk, adjusting for age, gender, pubertal stage, ethnicity and moderate-to-vigorous physical activity. We found no significant relationship between self-reported total screen time and clustered cardiometabolic risk or individual risk factors in overweight and obese adolescents. Unexpectedly, self-reported computer time, but not TV time, was slightly but significantly inversely associated with TC (B = -0.002; CI = [-0.003; -0.000]) and LDL-C (B = -0.002; CI = [-0.001; 0.000]). CONCLUSIONS: In obese adolescents we could not confirm the hypothesised positive association between screen time and cardiometabolic risk. Future studies should consider computer use as a separate class of screen behaviour, thereby also discriminating between active video gaming and other computer activities.

  3. Timing of urinary catheter removal after uncomplicated total abdominal hysterectomy: a prospective randomized trial.

    Science.gov (United States)

    Ahmed, Magdy R; Sayed Ahmed, Waleed A; Atwa, Khaled A; Metwally, Lobna

    2014-05-01

    To assess whether immediate (0h), intermediate (after 6h) or delayed (after 24h) removal of an indwelling urinary catheter after uncomplicated abdominal hysterectomy can affect the rate of re-catheterization due to urinary retention, rate of urinary tract infection, ambulation time and length of hospital stay. Prospective randomized controlled trial conducted at Suez Canal University Hospital, Egypt. Two hundred and twenty-one women underwent total abdominal hysterectomy for benign gynecological diseases and were randomly allocated into three groups. Women in group A (73 patients) had their urinary catheter removed immediately after surgery. Group B (81 patients) had the catheter removed 6h post-operatively while in group C (67 patients) the catheter was removed after 24h. The main outcome measures were the frequency of urinary retention, urinary tract infections, ambulation time and length of hospital stay. There was a significantly higher number of urinary retention episodes requiring re-catheterization in the immediate removal group compared to the intermediate and delayed removal groups (16.4% versus 2.5% and 0% respectively). Delayed urinary catheter removal was associated with a higher incidence of urinary tract infections (15%), delayed ambulation time (10.3h) and longer hospital stay (5.6 days) compared to the early (1.4%, 4.1h and 3.2 days respectively) and intermediate (3.7%, 6.8h and 3.4 days respectively) removal groups. Removal of the urinary catheter 6h postoperatively appears to be more advantageous than early or late removal in cases of uncomplicated total abdominal hysterectomy. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  4. Total Path Length and Number of Terminal Nodes for Decision Trees

    KAUST Repository

    Hussain, Shahid

    2014-09-13

    This paper presents a new tool for the study of relationships between total path length (average depth) and number of terminal nodes for decision trees. These relationships are important from the point of view of optimization of decision trees. In this particular case of total path length and number of terminal nodes, the relationships between these two cost functions are closely related to the space-time trade-off. In addition to an algorithm to compute the relationships, the paper also presents results of experiments with datasets from the UCI ML Repository. These experiments show how the two cost functions behave for a given decision table, and the resulting plots show the Pareto frontier, or Pareto set of optimal points. Furthermore, in some cases this Pareto frontier is a singleton, showing the total optimality of decision trees for the given decision table.
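
The Pareto frontier mentioned above can be extracted from a set of (total path length, number of terminal nodes) pairs with a few lines; the candidate points below are hypothetical, not taken from the UCI experiments.

```python
# Pareto frontier of (total path length, number of terminal nodes) pairs,
# minimizing both costs; hypothetical candidate trees.

def pareto_frontier(points):
    """Points not dominated by any other point (<= in both costs, different point)."""
    return [p for p in points
            if not any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points)]

trees = [(120, 14), (150, 9), (120, 12), (200, 8), (180, 8), (130, 12)]
print(sorted(pareto_frontier(trees)))  # -> [(120, 12), (150, 9), (180, 8)]
```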

  5. The use of diffusion theory to compute invasion effects for the pulsed neutron thermal decay time log

    International Nuclear Information System (INIS)

    Tittle, C.W.

    1992-01-01

    Diffusion theory has been successfully used to model the effect of fluid invasion into the formation for neutron porosity logs and for the gamma-gamma density log. The purpose of this paper is to present results of computations using a five-group time-dependent diffusion code on invasion effects for the pulsed neutron thermal decay time log. Previous invasion studies by the author involved the use of a three-dimensional three-group steady-state diffusion theory to model the dual-detector thermal neutron porosity log and the gamma-gamma density log. The five-group time-dependent code MGNDE (Multi-Group Neutron Diffusion Equation) used in this work was written by Ferguson. It has been successfully used to compute the intrinsic formation life-time correction for pulsed neutron thermal decay time logs. This application involves the effect of fluid invasion into the formation

  6. Self-guaranteed measurement-based quantum computation

    Science.gov (United States)

    Hayashi, Masahito; Hajdušek, Michal

    2018-05-01

    In order to guarantee the output of a quantum computation, we usually assume that the component devices are trusted. However, when the total computation process is large, it is not easy to guarantee the whole system when we have scaling effects, unexpected noise, or unaccounted for correlations between several subsystems. If we do not trust the measurement basis or the prepared entangled state, we do need to be worried about such uncertainties. To this end, we propose a self-guaranteed protocol for verification of quantum computation under the scheme of measurement-based quantum computation where no prior-trusted devices (measurement basis or entangled state) are needed. The approach we present enables the implementation of verifiable quantum computation using the measurement-based model in the context of a particular instance of delegated quantum computation where the server prepares the initial computational resource and sends it to the client, who drives the computation by single-qubit measurements. Applying self-testing procedures, we are able to verify the initial resource as well as the operation of the quantum devices and hence the computation itself. The overhead of our protocol scales with the size of the initial resource state to the power of 4 times the natural logarithm of the initial state's size.

  7. Computer-generated versus nurse-determined strategy for incubator humidity and time to regain birthweight

    NARCIS (Netherlands)

    Helder, Onno K.; Mulder, Paul G. H.; van Goudoever, Johannes B.

    2008-01-01

    To compare effects on premature infants' weight gain of a computer-generated and a nurse-determined incubator humidity strategy. An optimal humidity protocol is thought to reduce time to regain birthweight. Prospective randomized controlled design. Level IIIC neonatal intensive care unit in the Netherlands.

  8. Math modeling and computer mechanization for real time simulation of rotary-wing aircraft

    Science.gov (United States)

    Howe, R. M.

    1979-01-01

    Mathematical modeling and computer mechanization for the real-time simulation of rotary-wing aircraft are discussed. Error analysis in the digital simulation of dynamic systems, such as rotary-wing aircraft, is described. The method for digital simulation of nonlinearities with discontinuities, such as exist in typical flight control systems and rotor blade hinges, is discussed.

  9. Prosthetic liner wear in total hip replacement: a longitudinal 13-year study with computed tomography.

    Science.gov (United States)

    Weidenhielm, Lars; Olivecrona, Henrik; Maguire, Gerald Q; Noz, Marilyn E

    2018-06-01

    This case report follows a woman who had a total hip replacement in 1992 when she was 45 years old. Six serial computed tomography (CT) examinations over a period of 13 years provided information that allowed her revision surgery to be limited to liner replacement as opposed to replacement of the entire prosthesis. Additionally, they provided data that ruled out the presence of osteolysis and indeed none was found at surgery. In 2004, when the first CT was performed, the 3D distance the femoral head had penetrated into the cup was determined to be 2.6 mm. By 2017, femoral head penetration had progressed to 5.0 mm. The extracted liner showed wear at the thinnest part to be 5.5 mm, as measured with a micrometer. The use of modern CT techniques can identify problems, while still correctable without major surgery. Furthermore, the ability of CT to assess the direction of wear revealed that the liner wear changed from the cranial to dorsal direction.

  10. Prevalence and correlates of problematic internet experiences and computer-using time: a two-year longitudinal study in korean school children.

    Science.gov (United States)

    Yang, Su-Jin; Stewart, Robert; Lee, Ju-Yeon; Kim, Jae-Min; Kim, Sung-Wan; Shin, Il-Seon; Yoon, Jin-Sang

    2014-01-01

    To measure the prevalence of and factors associated with online inappropriate sexual exposure, cyber-bullying victimisation, and computer-using time in early adolescence. A two-year, prospective school survey was performed with 1,173 children aged 13 at baseline. Data collected included demographic factors, bullying experience, depression, anxiety, coping strategies, self-esteem, psychopathology, attention-deficit hyperactivity disorder symptoms, and school performance. These factors were investigated in relation to problematic Internet experiences and computer-using time at age 15. The prevalence of online inappropriate sexual exposure, cyber-bullying victimisation, academic-purpose computer overuse, and game-purpose computer overuse was 31.6%, 19.2%, 8.5%, and 21.8%, respectively, at age 15. Having older siblings, more weekly pocket money, depressive symptoms, anxiety symptoms, and passive coping strategy were associated with reported online sexual harassment. Male gender, depressive symptoms, and anxiety symptoms were associated with reported cyber-bullying victimisation. Female gender was associated with academic-purpose computer overuse, while male gender, lower academic level, increased height, and having older siblings were associated with game-purpose computer-overuse. Different environmental and psychological factors predicted different aspects of problematic Internet experiences and computer-using time. This knowledge is important for framing public health interventions to educate adolescents about, and prevent, internet-derived problems.

  11. Just-in-Time Compilation-Inspired Methodology for Parallelization of Compute Intensive Java Code

    Directory of Open Access Journals (Sweden)

    GHULAM MUSTAFA

    2017-01-01

    Compute-intensive programs generally consume a significant fraction of execution time in a small amount of repetitive code. Such repetitive code is commonly known as hotspot code. We observed that compute-intensive hotspots often possess exploitable loop-level parallelism. A JIT (Just-in-Time) compiler profiles a running program to identify its hotspots. Hotspots are then translated into native code for efficient execution. Using a similar approach, we propose a methodology to identify hotspots and exploit their parallelization potential on multicore systems. The proposed methodology selects and parallelizes each DOALL loop that is either contained in a hotspot method or calls a hotspot method. The methodology could be integrated in the front-end of a JIT compiler to parallelize sequential code just before native translation; however, compilation to native code is out of the scope of this work. As a case study, we analyze eighteen JGF (Java Grande Forum) benchmarks to determine the parallelization potential of hotspots. Eight benchmarks demonstrate a speedup of up to 7.6x on an 8-core system.
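
The DOALL notion above (a loop whose iterations share no cross-iteration dependences) translates directly to a worker pool; the paper targets Java hotspots inside a JIT front-end, so the following Python sketch is only a language-agnostic illustration of the idea.

```python
# A DOALL loop: every iteration depends only on its own index, so the whole
# iteration space can be mapped onto a pool of workers.

from multiprocessing import Pool

def body(i: int) -> float:
    """One independent loop iteration (stand-in for hotspot work)."""
    return sum((i + k) ** 0.5 for k in range(10_000))

if __name__ == "__main__":
    with Pool() as pool:                       # one worker per core by default
        results = pool.map(body, range(256))   # parallel 'for i in range(256)'
    print(len(results), round(sum(results), 1))
```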

  13. Quantitative analysis of orthopedic metal artefact reduction in 64-slice computed tomography scans in large head metal-on-metal total hip replacement, a phantom study

    NARCIS (Netherlands)

    Boomsma, Martijn F.; Warringa, Niek; Edens, Mireille A.; Mueller, Dirk; Ettema, Harmen B.; Verheyen, Cees C. P. M.; Maas, Mario

    2016-01-01

    Purpose: Quantification of the effect of O-MAR on decreasing metal artefacts caused by large head metal on metal total hip arthroplasty (MoM THA) in a dedicated phantom setup of the hip. Background: Pathological reactions of the hip capsule on Computed tomography (CT) can be difficult to diagnose

  14. Real time computer system with distributed microprocessors

    International Nuclear Information System (INIS)

    Heger, D.; Steusloff, H.; Syrbe, M.

    1979-01-01

    The usual centralized structure of computer systems, especially of process computer systems, cannot take sufficient advantage of the progress of very large-scale integrated semiconductor technology with respect to increasing reliability and performance and to decreasing expenses, especially those of the external periphery. This, and the increasing demands on process control systems, has led the authors to examine the structure of such systems in general and to adapt it to the new environment. Computer systems with distributed microprocessors coupled by optical fibre allow very favourable problem-solving with decentrally controlled bus lines and functional redundancy with automatic fault diagnosis and reconfiguration. A suitable programming system supports these hardware properties: PEARL for multicomputer systems, a dynamic loader, and processor and network operating systems. The necessary design principles are established mainly theoretically and by value analysis. An optimal overall system of this new generation of process control systems was established, supported by the results of 2 PDV projects (modular operating systems, input/output colour screen system as control panel), for the purpose of testing by applying the system to the control of 28 pit furnaces of a steel works. (orig.)

  15. Total and segmental colon transit time in constipated children assessed by scintigraphy with 111In-DTPA given orally.

    Science.gov (United States)

    Vattimo, A; Burroni, L; Bertelli, P; Messina, M; Meucci, D; Tota, G

    1993-12-01

    Serial colon scintigraphy using 111In-DTPA (2 MBq) given orally was performed in 39 children referred for constipation, and the total and segmental colon transit times were measured. The bowel movements during the study were recorded and the intervals between defecations (ID) were calculated. This method proved able to identify children with normal colon morphology (no. = 32) and those with dolichocolon (no. = 7). Normal children were not included for ethical reasons, and we used the normal range determined by others using x-ray methods (29 +/- 4 hours). Total and segmental colon transit times were found to be prolonged in all children with dolichocolon (TC: 113.55 +/- 41.20 hours; RC: 39.85 +/- 26.39 hours; LC: 43.05 +/- 18.30 hours; RS: 30.66 +/- 26.89 hours). In the group of children with a normal colon shape, 13 presented total and segmental colon transit times within the referred normal values (TC: 27.79 +/- 4.10 hours; RC: 9.11 +/- 2.53 hours; LC: 9.80 +/- 3.50 hours; RS: 8.88 +/- 4.09 hours) and normal bowel function (ID: 23.37 +/- 5.93 hours). Of the remaining children, 5 presented prolonged retention in the rectum (RS: 53.36 +/- 29.66 hours), and 14 a prolonged transit time in all segments. A good correlation was found between the transit time and bowel function. From the point of view of radiation dosimetry, the most heavily irradiated organs were the lower large intestine and the ovaries, and the level of radiation burden depended on the colon transit time. We can conclude that the described method is safe, accurate and fully diagnostic.

  16. Decreasing Computational Time for VBBinaryLensing by Point Source Approximation

    Science.gov (United States)

    Tirrell, Bethany M.; Visgaitis, Tiffany A.; Bozza, Valerio

    2018-01-01

    The gravitational lens of a binary system produces a magnification map that is more intricate than that of a single-object lens. This map cannot be calculated analytically, and one must rely on computational methods to resolve it. There are generally two methods of computing the microlensed flux of a source. One is based on ray-shooting maps (Kayser, Refsdal, & Stabell 1986), while the other is based on an application of Green's theorem. This second method finds the area of an image by calculating a Riemann integral along the image contour. VBBinaryLensing is a C++ contour integration code developed by Valerio Bozza which utilizes this method. The parameters at which the source object can be treated as a point source, in other words when the source is far enough from the caustic, were of interest, since they substantially decrease the computational time. The maximum and minimum values of the caustic curves produced were examined to determine the boundaries for which this simplification could be made. The code was then run for a number of different maps, with separation values and accuracies ranging from 10^-1 to 10^-3, to test the theoretical model and determine a safe buffer for which minimal error could be made for the approximation. The determined buffer was 1.5 + 5q, with q being the mass ratio. The theoretical model and the calculated points worked for all combinations of the separation values and accuracies except the map with accuracy and separation equal to 10^-3 for y1 max. An alternative approach has to be found in order to accommodate a wider range of parameters.
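
The buffer criterion reported above reduces to a one-line test; the distance to the caustic would in practice come from the lens geometry (e.g., via VBBinaryLensing), and the sample values here are arbitrary.

```python
# Point-source test from the reported buffer: safe when the source lies
# farther from the caustic than 1.5 + 5q (q = mass ratio).

def use_point_source(distance_to_caustic: float, q: float) -> bool:
    """True -> the cheaper point-source magnification is acceptable."""
    return distance_to_caustic > 1.5 + 5.0 * q

for q in (1e-3, 1e-2, 1e-1):                  # arbitrary mass ratios
    print(f"q = {q:g}: point source ok? {use_point_source(2.0, q)}")
```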

  17. Efficient Buffer Capacity and Scheduler Setting Computation for Soft Real-Time Stream Processing Applications

    NARCIS (Netherlands)

    Bekooij, Marco; Bekooij, Marco Jan Gerrit; Wiggers, M.H.; van Meerbergen, Jef

    2007-01-01

    Soft real-time applications that process data streams can often be intuitively described as dataflow process networks. In this paper we present a novel analysis technique to compute conservative estimates of the required buffer capacities in such process networks. With the same analysis technique

  18. Green computing: power optimisation of VFI-based real-time multiprocessor dataflow applications (extended version)

    NARCIS (Netherlands)

    Ahmad, W.; Holzenspies, P.K.F.; Stoelinga, Mariëlle Ida Antoinette; van de Pol, Jan Cornelis

    2015-01-01

    Execution time is no longer the only performance metric for computer systems. In fact, a trend is emerging to trade raw performance for energy savings. Techniques like Dynamic Power Management (DPM, switching to low power state) and Dynamic Voltage and Frequency Scaling (DVFS, throttling processor

  19. Time delay and duration of ionospheric total electron content responses to geomagnetic disturbances

    Directory of Open Access Journals (Sweden)

    J. Liu

    2010-03-01

    Full Text Available Although positive and negative signatures of ionospheric storms have been reported many times, global characteristics such as the time of occurrence, time delay and duration as well as their relations to the intensity of the ionospheric storms have not received enough attention. The 10 years of global ionosphere maps (GIMs of total electron content (TEC retrieved at Jet Propulsion Laboratory (JPL were used to conduct a statistical study of the time delay of the ionospheric responses to geomagnetic disturbances. Our results show that the time delays between geomagnetic disturbances and TEC responses depend on season, magnetic local time and magnetic latitude. In the summer hemisphere at mid- and high latitudes, the negative storm effects can propagate to the low latitudes at post-midnight to the morning sector with a time delay of 4–7 h. As the earth rotates to the sunlight, negative phase retreats to higher latitudes and starts to extend to the lower latitude toward midnight sector. In the winter hemisphere during the daytime and after sunset at mid- and low latitudes, the negative phase appearance time is delayed from 1–10 h depending on the local time, latitude and storm intensity compared to the same area in the summer hemisphere. The quick response of positive phase can be observed at the auroral area in the night-side of the winter hemisphere. At the low latitudes during the dawn-noon sector, the ionospheric negative phase responses quickly with time delays of 5–7 h in both equinoctial and solsticial months.

    Our results also show that there is a positive correlation between the intensity of geomagnetic disturbances and the time duration of both the positive phase and the negative phase. The durations of both the negative phase and the positive phase have clear latitudinal, seasonal and magnetic local time (MLT) dependence. In the winter hemisphere, long durations for the positive phase are 8–11 h and 12–14 h during the daytime at

  1. A State-of-the-Art Review of the Real-Time Computer-Aided Study of the Writing Process

    Science.gov (United States)

    Abdel Latif, Muhammad M.

    2008-01-01

    Writing researchers have developed various methods for investigating the writing process since the 1970s. The early 1980s saw the occurrence of the real-time computer-aided study of the writing process that relies on the protocols generated by recording the computer screen activities as writers compose using the word processor. This article…

  2. Massively Parallel Signal Processing using the Graphics Processing Unit for Real-Time Brain-Computer Interface Feature Extraction.

    Science.gov (United States)

    Wilson, J Adam; Williams, Justin C

    2009-01-01

    The clock speeds of modern computer processors have nearly plateaued in the past 5 years. Consequently, neural prosthetic systems that rely on processing large quantities of data in a short period of time face a bottleneck, in that it may not be possible to process all of the data recorded from an electrode array with high channel counts and bandwidth, such as electrocorticographic grids or other implantable systems. Therefore, in this study a method of using the processing capabilities of a graphics card [graphics processing unit (GPU)] was developed for real-time neural signal processing of a brain-computer interface (BCI). The NVIDIA CUDA system was used to offload processing to the GPU, which is capable of running many operations in parallel, potentially greatly increasing the speed of existing algorithms. The BCI system records many channels of data, which are processed and translated into a control signal, such as the movement of a computer cursor. This signal processing chain involves computing a matrix-matrix multiplication (i.e., a spatial filter), followed by calculating the power spectral density on every channel using an auto-regressive method, and finally classifying appropriate features for control. In this study, the first two computationally intensive steps were implemented on the GPU, and the speed was compared to both the current implementation and a central processing unit-based implementation that uses multi-threading. Significant performance gains were obtained with GPU processing: the current implementation processed 250 ms of data from 1000 channels in 933 ms, while the new GPU method took only 27 ms, an improvement of nearly 35 times.
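
    A NumPy sketch of the two offloaded stages (spatial filtering as one matrix-matrix multiplication, then a per-channel spectral estimate); the paper's implementation used NVIDIA CUDA, and the periodogram below merely stands in for its auto-regressive estimator. Swapping numpy for cupy would move the same array math onto a GPU:

        import numpy as np

        rng = np.random.default_rng(0)
        samples, chans = 300, 64          # hypothetical block of ECoG data
        data = rng.standard_normal((samples, chans))

        # Step 1: spatial filter as one matrix-matrix multiplication
        # (here a common-average-reference filter as a stand-in).
        W = np.eye(chans) - np.ones((chans, chans)) / chans
        filtered = data @ W

        # Step 2: per-channel power spectrum (periodogram stand-in for the
        # auto-regressive estimator used in the paper).
        psd = np.abs(np.fft.rfft(filtered, axis=0)) ** 2 / samples
        print(psd.shape)                  # (frequency bins, channels)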

  3. Massively parallel signal processing using the graphics processing unit for real-time brain-computer interface feature extraction

    Directory of Open Access Journals (Sweden)

    J. Adam Wilson

    2009-07-01

    Full Text Available The clock speeds of modern computer processors have nearly plateaued in the past five years. Consequently, neural prosthetic systems that rely on processing large quantities of data in a short period of time face a bottleneck, in that it may not be possible to process all of the data recorded from an electrode array with high channel counts and bandwidth, such as electrocorticographic grids or other implantable systems. Therefore, in this study a method of using the processing capabilities of a graphics card (GPU) was developed for real-time neural signal processing of a brain-computer interface (BCI). The NVIDIA CUDA system was used to offload processing to the GPU, which is capable of running many operations in parallel, potentially greatly increasing the speed of existing algorithms. The BCI system records many channels of data, which are processed and translated into a control signal, such as the movement of a computer cursor. This signal processing chain involves computing a matrix-matrix multiplication (i.e., a spatial filter), followed by calculating the power spectral density on every channel using an auto-regressive method, and finally classifying appropriate features for control. In this study, the first two computationally intensive steps were implemented on the GPU, and the speed was compared to both the current implementation and a CPU-based implementation that uses multi-threading. Significant performance gains were obtained with GPU processing: the current implementation processed 1000 channels in 933 ms, while the new GPU method took only 27 ms, an improvement of nearly 35 times.

  4. Time-Domain Terahertz Computed Axial Tomography NDE System

    Science.gov (United States)

    Zimdars, David

    2012-01-01

    NASA has identified the need for advanced non-destructive evaluation (NDE) methods to characterize aging and durability in aircraft materials to improve the safety of the nation's airline fleet. 3D THz tomography can play a major role in detection and characterization of flaws and degradation in aircraft materials, including Kevlar-based composites and Kevlar and Zylon fabric covers for soft-shell fan containment where aging and durability issues are critical. A prototype computed tomography (CT) time-domain (TD) THz imaging system has been used to generate 3D images of several test objects including a TUFI tile (a thermal protection system tile used on the Space Shuttle and possibly the Orion or similar capsules). This TUFI tile had simulated impact damage that was located and the depth of damage determined. The CT motion control gantry was designed and constructed, and then integrated with a T-Ray 4000 control unit and motion controller to create a complete CT TD-THz imaging system prototype. A data collection software script was developed that takes multiple z-axis slices in sequence and saves the data for batch processing. The data collection software was integrated with the ability to batch process the slice data with the CT TD-THz image reconstruction software. The time required to take a single CT slice was decreased from six minutes to approximately one minute by replacing the 320 ps, 100-Hz waveform acquisition system with an 80 ps, 1,000-Hz waveform acquisition system. The TD-THz computed tomography system was built from pre-existing commercial off-the-shelf (COTS) subsystems. A CT motion control gantry was constructed from COTS components that can handle larger samples. The motion control gantry allows inspection of sample sizes of up to approximately one cubic foot (about 0.03 cubic meters). The system reduced to practice a CT TD-THz system incorporating a COTS 80-ps/1-kHz waveform scanner. The incorporation of this scanner in the system allows acquisition of 3D

  5. Evolution of perturbed dynamical systems: analytical computation with time independent accuracy

    Energy Technology Data Exchange (ETDEWEB)

    Gurzadyan, A.V. [Russian-Armenian (Slavonic) University, Department of Mathematics and Mathematical Modelling, Yerevan (Armenia); Kocharyan, A.A. [Monash University, School of Physics and Astronomy, Clayton (Australia)

    2016-12-15

    An analytical method for investigating the evolution of dynamical systems with time-independent accuracy is developed for perturbed Hamiltonian systems. Error-free estimation using computer algebra enables the application of the method to complex multi-dimensional Hamiltonian and dissipative systems. It also opens principal opportunities for the qualitative study of chaotic trajectories. The performance of the method is demonstrated on perturbed two-oscillator systems. It can be applied to various non-linear physical and astrophysical systems, e.g. to long-term planetary dynamics. (orig.)

  6. 20 CFR 226.52 - Total annuity subject to maximum.

    Science.gov (United States)

    2010-04-01

    20 CFR Part 226, Employees' Benefits: Computing Employee, Spouse, and Divorced Spouse Annuities; Railroad Retirement Family Maximum. § 226.52 Total annuity subject to maximum. The total annuity amount which is compared to the maximum monthly amount to...

  7. Optimum filters with time width constraints for liquid argon total-absorption detectors

    International Nuclear Information System (INIS)

    Gatti, E.; Radeka, V.

    1977-10-01

    Optimum filter responses are found for triangular current input pulses occurring in liquid argon ionization chambers used as total absorption detectors. The filters considered are subject to the following constraints: finite width of the output pulse having a prescribed ratio to the width of the triangular input current pulse, and zero area of a bipolar antisymmetrical pulse or of a three-lobe pulse, as required for high event rates. The feasibility of pulse shaping giving an output equal to, or shorter than, the input one is demonstrated. It is shown that the signal-to-noise ratio remains constant for the chamber interelectrode gap which gives an input pulse width (i.e., electron drift time) greater than one third of the required output pulse width.

  8. hctsa: A Computational Framework for Automated Time-Series Phenotyping Using Massive Feature Extraction.

    Science.gov (United States)

    Fulcher, Ben D; Jones, Nick S

    2017-11-22

    Phenotype measurements frequently take the form of time series, but we currently lack a systematic method for relating these complex data streams to scientifically meaningful outcomes, such as relating the movement dynamics of organisms to their genotype or measurements of brain dynamics of a patient to their disease diagnosis. Previous work addressed this problem by comparing implementations of thousands of diverse scientific time-series analysis methods in an approach termed highly comparative time-series analysis. Here, we introduce hctsa, a software tool for applying this methodological approach to data. hctsa includes an architecture for computing over 7,700 time-series features and a suite of analysis and visualization algorithms to automatically select useful and interpretable time-series features for a given application. Using exemplar applications to high-throughput phenotyping experiments, we show how hctsa allows researchers to leverage decades of time-series research to quantify and understand informative structure in time-series data. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  9. Measuring older adults' sedentary time: reliability, validity, and responsiveness.

    Science.gov (United States)

    Gardiner, Paul A; Clark, Bronwyn K; Healy, Genevieve N; Eakin, Elizabeth G; Winkler, Elisabeth A H; Owen, Neville

    2011-11-01

    With evidence that prolonged sitting has deleterious health consequences, decreasing sedentary time is a potentially important preventive health target. High-quality measures, particularly for use with older adults, who are the most sedentary population group, are needed to evaluate the effect of sedentary behavior interventions. We examined the reliability, validity, and responsiveness to change of a self-report sedentary behavior questionnaire that assessed time spent in behaviors common among older adults: watching television, computer use, reading, socializing, transport and hobbies, and a summary measure (total sedentary time). In the context of a sedentary behavior intervention, nonworking older adults (n = 48, age = 73 ± 8 yr (mean ± SD)) completed the questionnaire on three occasions during a 2-wk period (7 d between administrations) and wore an accelerometer (ActiGraph model GT1M) for two periods of 6 d. Test-retest reliability (for the individual items and the summary measure) and validity (self-reported total sedentary time compared with accelerometer-derived sedentary time) were assessed during the 1-wk preintervention period, using Spearman (ρ) correlations and 95% confidence intervals (CI). Responsiveness to change after the intervention was assessed using the responsiveness statistic (RS). Test-retest reliability was excellent for television viewing time (ρ (95% CI) = 0.78 (0.63-0.89)), computer use (ρ (95% CI) = 0.90 (0.83-0.94)), and reading (ρ (95% CI) = 0.77 (0.62-0.86)); acceptable for hobbies (ρ (95% CI) = 0.61 (0.39-0.76)); and poor for socializing and transport (ρ < 0.45). Total sedentary time had acceptable test-retest reliability (ρ (95% CI) = 0.52 (0.27-0.70)) and validity (ρ (95% CI) = 0.30 (0.02-0.54)). Self-report total sedentary time was similarly responsive to change (RS = 0.47) as accelerometer-derived sedentary time (RS = 0.39). The summary measure of total sedentary time has good repeatability and modest validity and is responsive to change.
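
    The reliability statistics above are Spearman correlations with 95% confidence intervals; a small sketch of that computation on invented data, using the Fisher z-transform for the interval:

        import numpy as np
        from scipy import stats

        # Invented minutes/day of total sedentary time, two administrations.
        week1 = np.array([600, 420, 480, 720, 390, 540, 660, 450])
        week2 = np.array([580, 450, 500, 700, 420, 520, 610, 470])

        rho, _ = stats.spearmanr(week1, week2)
        z = np.arctanh(rho)                       # Fisher z-transform
        se = 1.0 / np.sqrt(len(week1) - 3)
        lo, hi = np.tanh(z - 1.96 * se), np.tanh(z + 1.96 * se)
        print(f"rho = {rho:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")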

  10. Terahertz time-domain attenuated total reflection spectroscopy applied to the rapid discrimination of the botanical origin of honeys

    Science.gov (United States)

    Liu, Wen; Zhang, Yuying; Yang, Si; Han, Donghai

    2018-05-01

    A new technique to identify the floral origins of honeys is needed. Terahertz time-domain attenuated total reflection spectroscopy combined with chemometric methods was applied to discriminate among different categories (Medlar honey, Vitex honey, and Acacia honey). Principal component analysis (PCA), cluster analysis (CA) and partial least squares-discriminant analysis (PLS-DA) were used to extract information about the botanical origins of the honeys. The spectral range was also examined to increase the precision of the PLS-DA model. An accuracy of 88.46% for the validation set was obtained using the PLS-DA model in the 0.5-1.5 THz range. This work indicates that terahertz time-domain attenuated total reflection spectroscopy is a viable approach to evaluating the quality of honey rapidly.
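
    A hedged sketch of the PLS-DA step: scikit-learn has no dedicated PLS-DA class, so a common recipe is PLSRegression on one-hot class labels; the spectra below are random stand-ins for real THz absorbance data:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(1)
        X = rng.standard_normal((60, 200))   # 60 spectra x 200 frequency points
        y = np.repeat([0, 1, 2], 20)         # Medlar / Vitex / Acacia labels
        Y = np.eye(3)[y]                     # one-hot targets for PLS-DA

        pls = PLSRegression(n_components=5).fit(X, Y)
        pred = pls.predict(X).argmax(axis=1) # class = largest predicted score
        print("training accuracy:", (pred == y).mean())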

  11. Could We Realize the Fully Flexible System by Real-Time Computing with Thin-Film Transistors?

    Directory of Open Access Journals (Sweden)

    Qin Li

    2017-11-01

    Full Text Available Flexible electronic devices, such as the typical thin-film transistors, are widely adopted in sensors, displays, wearable equipment, and other large-area applications, for their ability to bend and stretch; additionally, in some recent applications such as lower-resolution data converters, a trend has appeared of implementing more parts of the system with flexible devices to realize fully flexible systems. Nevertheless, relatively few works on computation with flexible electronic devices have been reported, because their poor carrier mobility blocks the way to fully flexible systems with a uniform manufacturing process. In this paper, a novel circuit architecture for an image-processing accelerator using oxide thin-film transistors (TFTs) is proposed, which can realize real-time image pre-processing and classification in the analog domain, exploiting the performance and fault tolerance of image signal processing. All of the computation is done in the analog signal domain and no clock signal is needed. Therefore, certain weaknesses of flexible electronic devices, such as low carrier mobility, can be remedied dramatically. Simulations based on an oxide TFT device model have demonstrated that the flexible computing parts can perform 5 × 5 Gaussian convolution at a speed of 3.3 MOPS/s with an energy efficiency of 1.83 TOPS/J, and realize image classification at a speed of 10 k fps with an energy efficiency of 5.25 GOPS/J. This shows the potential of realizing real-time computing parts of complex algorithms with flexible electronic devices, as well as future fully flexible systems containing sensors, data converters, energy supplies, and real-time signal-processing modules, all built from flexible devices.
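
    For reference, the digital counterpart of the 5 × 5 Gaussian convolution that the analog TFT array computes; the binomial kernel below is a standard choice, not taken from the paper:

        import numpy as np
        from scipy.ndimage import convolve

        g = np.array([1, 4, 6, 4, 1], dtype=float)
        kernel = np.outer(g, g) / 256.0      # separable 5x5 Gaussian, sums to 1

        image = np.random.default_rng(2).random((32, 32))
        blurred = convolve(image, kernel, mode="nearest")
        print(blurred.shape)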

  12. Time-Space Trade-Offs for the Longest Common Substring Problem

    DEFF Research Database (Denmark)

    Starikovskaya, Tatiana; Vildhøj, Hjalte Wedel

    2013-01-01

    The Longest Common Substring problem is to compute the longest substring which occurs in at least d ≥ 2 of m strings of total length n. In this paper we ask whether this problem allows a deterministic time-space trade-off using O(n^(1+ε)) time and O(n^(1-ε)) space for 0 ≤ ε ≤ 1. We give a positive answer in the case of two strings (d = m = 2), and in the general case we show that the problem can be solved in O(n^(1-ε)) space and O(n^(1+ε) log²n (d log²n + d²)) time for any 0 ≤ ε ≤ 1/3.

  13. Design and development of a diversified real time computer for future FBRs

    International Nuclear Information System (INIS)

    Sujith, K.R.; Bhattacharyya, Anindya; Behera, R.P.; Murali, N.

    2014-01-01

    The current safety-related computer system of the Prototype Fast Breeder Reactor (PFBR) under construction at Kalpakkam consists of two redundant Versa Module Europa (VME) bus based real-time computer systems with a switch-over logic circuit (SOLC). Since both VME systems are identical, the dual redundant system is prone to common cause failure (CCF). The probability of CCF can be reduced by adopting diversity. Design diversity has long been used to protect redundant systems against common-mode failures. The conventional notion of diversity relies on 'independent' generation of 'different' implementations. This paper discusses the design and development of a diversified real-time computer which will replace one of the computer systems in the dual redundant architecture. Compact PCI (cPCI) bus systems are widely used in safety-critical applications such as avionics, railways and defence, and use electrical signaling and logical specifications diverse from VME; hence cPCI was chosen for development of the diversified system. For the initial development, a CPU card based on an ARM-9 processor, a 16-channel relay output (RO) card and a 30-channel analog input (AI) card were developed. All the cards mentioned support hot-swap and geographic addressing capability. In order to mitigate the component-obsolescence problem, the 32-bit PCI target controller and associated glue logic for the slave I/O cards were indigenously developed using VHDL. U-Boot was selected as the boot loader and ARM Linux 2.6 as the preliminary operating system for the CPU card. Board-specific initialization code for the CPU card was written in ARM assembly language and serial-port initialization was written in C. The boot loader, the Linux 2.6 kernel and a jffs2 file system were flashed into the CPU card. Test applications written in C were used to test the various peripherals of the CPU card. Device drivers for the AI and RO cards were developed as Linux kernel modules, and an application library was also developed.

  14. Fast Determination of Distribution-Connected PV Impacts Using a Variable Time-Step Quasi-Static Time-Series Approach: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Mather, Barry

    2017-08-24

    The increasing deployment of distribution-connected photovoltaic (DPV) systems requires utilities to complete complex interconnection studies. Relatively simple interconnection study methods worked well for low penetrations of photovoltaic systems, but more complicated quasi-static time-series (QSTS) analysis is required to make better interconnection decisions as DPV penetration levels increase. Tools and methods must be developed to support this. This paper presents a variable-time-step solver for QSTS analysis that significantly shortens the computational time and effort to complete a detailed analysis of the operation of a distribution circuit with many DPV systems. Specifically, it demonstrates that the proposed variable-time-step solver can reduce the required computational time by as much as 84% without introducing any important errors to metrics, such as the highest and lowest voltage occurring on the feeder, number of voltage regulator tap operations, and total amount of losses realized in the distribution circuit during a 1-yr period. Further improvement in computational speed is possible with the introduction of only modest errors in these metrics, such as a 91% reduction with less than 5% error when predicting voltage regulator operations.
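
    A conceptual sketch of a variable-time-step QSTS loop, not the solver described in the paper: long steps are taken while the driving profile is quiet, with fine steps only around fast ramps; all names and thresholds are illustrative:

        # Conceptual variable-time-step QSTS loop (names and tolerance invented).
        def run_qsts(profile, solve_powerflow, coarse=60, fine=1, tol=0.02):
            """profile: per-second PV output (0..1); solve_powerflow(t) -> state."""
            t, results = 0, []
            while t + coarse < len(profile):
                ramp = abs(profile[t + coarse] - profile[t])
                step = coarse if ramp < tol else fine   # refine around fast ramps
                results.append((t, solve_powerflow(t)))
                t += step
            return results

        # Example with a flat profile and a stub power-flow solver:
        print(len(run_qsts([0.5] * 3600, lambda t: "state")))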

  15. Primary total knee arthroplasty assisted by computer (orthopilot)

    International Nuclear Information System (INIS)

    Naquira, Luis Felipe; Restrepo Tello, Fabio; Pineda Acero, Gustavo; Clavijo, Edgar; Buitrago, Mario

    2004-01-01

    Between December 2001 and December 2003, 21 patients suffering from osteoarthrosis of the knee were treated with computer-assisted primary total knee arthroplasty (Orthopilot) at the Department of Orthopedics and Traumatology of the Hospital Militar Central. Results were followed up for an average of 18 months. The criteria evaluated were function, the mechanical axis (using ortho-radiography), and length of surgery. The average value obtained was 4.3 degrees, with the mechanical axis passing through the centre of the knee joint in 95% of cases, confirmed by a mechanical femoral-tibial angle of -2.5 degrees. Length of surgery increased by an average of 1 hour compared to the conventional technique. Complications were as follows: one patient experienced an infection which required withdrawal of the prosthesis, and in four of the operations technical problems with the navigator made it necessary to proceed using conventional techniques; these patients were excluded from the study. Radiological results ranged from excellent to good in 100% of cases.

  16. Experimental technique for study on three-particle reactions in kinematically total experiments with usage of the two-processor complex on the M-400 computer basis

    International Nuclear Information System (INIS)

    Berezin, F.N.; Kisurin, V.A.; Nemets, O.F.; Ofengenden, R.G.; Pugach, V.M.; Pavlenko, Yu.N.; Patlan', Yu.V.; Savrasov, S.S.

    1981-01-01

    An experimental technique for the investigation of three-particle nuclear reactions in kinematically total experiments is described. The technique provides storage of one-dimensional and two-dimensional energy spectra from several detectors. A block diagram of the measuring system using this technique is presented. The measuring system consists of analog equipment for fast-slow coincidences and of a two-processor complex based on the M-400 computer with a common bus. The use of a two-processor complex, in which each computer has direct access to the memory of the other, makes it possible to separate the functions of data collection and on-line data presentation and to perform the necessary physical calculations. The software of the measuring complex, which includes programs written in ASSEMBLER for the first computer and functional programs written in BASIC for the second computer, is considered. The software of the first computer includes the DISPETCHER dialog control program, a driver package for the control of external devices, an applied program package and system modules. The technique described was tested in an experiment on the d + ¹⁰B → α + α + α three-particle reaction at a deuteron energy of 13.6 MeV. The two-dimensional energy spectrum of the reaction obtained with the help of the technique described is presented [ru]

  17. Continuous real-time water-quality monitoring and regression analysis to compute constituent concentrations and loads in the North Fork Ninnescah River upstream from Cheney Reservoir, south-central Kansas, 1999–2012

    Science.gov (United States)

    Stone, Mandy L.; Graham, Jennifer L.; Gatotho, Jackline W.

    2013-01-01

    Cheney Reservoir, located in south-central Kansas, is the primary water supply for the city of Wichita. The U.S. Geological Survey has operated a continuous real-time water-quality monitoring station since 1998 on the North Fork Ninnescah River, the main source of inflow to Cheney Reservoir. Continuously measured water-quality physical properties include streamflow, specific conductance, pH, water temperature, dissolved oxygen, and turbidity. Discrete water-quality samples were collected during 1999 through 2009 and analyzed for sediment, nutrients, bacteria, and other water-quality constituents. Regression models were developed to establish relations between discretely sampled constituent concentrations and continuously measured physical properties to compute concentrations of those constituents of interest that are not easily measured in real time because of limitations in sensor technology and fiscal constraints. Regression models were published in 2006 that were based on data collected during 1997 through 2003. This report updates those models using discrete and continuous data collected during January 1999 through December 2009. Models also were developed for four new constituents, including additional nutrient species and indicator bacteria. In addition, a conversion factor of 0.68 was established to convert the Yellow Springs Instruments (YSI) model 6026 turbidity sensor measurements to the newer YSI model 6136 sensor at the North Fork Ninnescah River upstream from Cheney Reservoir site. Newly developed models and 14 years of hourly continuously measured data were used to calculate selected constituent concentrations and loads during January 1999 through December 2012. The water-quality information in this report is important to the city of Wichita because it allows the concentrations of many potential pollutants of interest to Cheney Reservoir, including nutrients and sediment, to be estimated in real time and characterized over conditions and time scales that
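
    An illustrative surrogate-regression fit of the kind described above, regressing log-transformed constituent concentration on a continuously measured property; the turbidity and sediment values are invented:

        import numpy as np

        turbidity = np.array([5, 12, 30, 80, 150, 400], dtype=float)  # FNU
        ssc = np.array([10, 25, 70, 180, 350, 900], dtype=float)      # mg/L

        slope, intercept = np.polyfit(np.log10(turbidity), np.log10(ssc), 1)

        def estimate_ssc(turb):
            # Real surrogate models apply a bias-correction factor (e.g.,
            # Duan's smearing) when back-transforming; omitted for brevity.
            return 10 ** (intercept + slope * np.log10(turb))

        print(round(float(estimate_ssc(100.0)), 1))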

  18. Computer/Mobile Device Screen Time of Children and Their Eye Care Behavior: The Roles of Risk Perception and Parenting.

    Science.gov (United States)

    Chang, Fong-Ching; Chiu, Chiung-Hui; Chen, Ping-Hung; Miao, Nae-Fang; Chiang, Jeng-Tung; Chuang, Hung-Yi

    2018-03-01

    This study assessed the computer/mobile device screen time and eye care behavior of children and examined the roles of risk perception and parental practices. Data were obtained from a sample of 2,454 child-parent dyads recruited from 30 primary schools in Taipei city and New Taipei city, Taiwan, in 2016. Self-administered questionnaires were collected from students and parents. Fifth-grade students spend more time on new media (computer/smartphone/tablet: 16 hours a week) than on traditional media (television: 10 hours a week). The average daily screen time (3.5 hours) for these children exceeded the American Academy of Pediatrics recommendations (≤2 hours). Multivariate analysis results showed that after controlling for demographic factors, the parents with higher levels of risk perception and parental efficacy were more likely to mediate their child's eye care behavior. Children who reported lower academic performance, who were from non-intact families, reported lower levels of risk perception of mobile device use, had parents who spent more time using computers and mobile devices, and had lower levels of parental mediation were more likely to spend more time using computers and mobile devices; whereas children who reported higher academic performance, higher levels of risk perception, and higher levels of parental mediation were more likely to engage in higher levels of eye care behavior. Risk perception by children and parental practices are associated with the amount of screen time that children regularly engage in and their level of eye care behavior.

  19. Assessing total nitrogen in surface-water samples--precision and bias of analytical and computational methods

    Science.gov (United States)

    Rus, David L.; Patton, Charles J.; Mueller, David K.; Crawford, Charles G.

    2013-01-01

    The characterization of total-nitrogen (TN) concentrations is an important component of many surface-water-quality programs. However, three widely used methods for the determination of total nitrogen—(1) derived from the alkaline-persulfate digestion of whole-water samples (TN-A); (2) calculated as the sum of total Kjeldahl nitrogen and dissolved nitrate plus nitrite (TN-K); and (3) calculated as the sum of dissolved nitrogen and particulate nitrogen (TN-C)—all include inherent limitations. A digestion process is intended to convert multiple species of nitrogen that are present in the sample into one measurable species, but this process may introduce bias. TN-A results can be negatively biased in the presence of suspended sediment, and TN-K data can be positively biased in the presence of elevated nitrate because some nitrate is reduced to ammonia and is therefore counted twice in the computation of total nitrogen. Furthermore, TN-C may not be subject to bias but is comparatively imprecise. In this study, the effects of suspended-sediment and nitrate concentrations on the performance of these TN methods were assessed using synthetic samples developed in a laboratory as well as a series of stream samples. A 2007 laboratory experiment measured TN-A and TN-K in nutrient-fortified solutions that had been mixed with varying amounts of sediment-reference materials. This experiment identified a connection between suspended sediment and negative bias in TN-A and detected positive bias in TN-K in the presence of elevated nitrate. A 2009–10 synoptic-field study used samples from 77 stream-sampling sites to confirm that these biases were present in the field samples and evaluated the precision and bias of TN methods. The precision of TN-C and TN-K depended on the precision and relative amounts of the TN-component species used in their respective TN computations. Particulate nitrogen had an average variability (as determined by the relative standard deviation) of 13
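
    A worked illustration of the two computed TN methods compared above, with made-up concentrations in mg/L as N (TN-A is measured directly after digestion rather than computed):

        # Made-up concentrations, all mg/L as N.
        tkn = 1.10            # total Kjeldahl nitrogen (organic N + ammonia)
        no3_no2 = 0.80        # dissolved nitrate plus nitrite
        dissolved_n = 1.50    # total dissolved nitrogen
        particulate_n = 0.45  # particulate nitrogen

        tn_k = tkn + no3_no2                 # TN-K: TKN + nitrate/nitrite
        tn_c = dissolved_n + particulate_n   # TN-C: dissolved + particulate
        print(tn_k, tn_c)                    # TN-A is measured, not computed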

  20. Minimizing total costs of forest roads with computer-aided design ...

    Indian Academy of Sciences (India)

    minimum total road costs, while conforming to design specifications, environmental ..... quality, and enhancing fish and wildlife habitat, an appropriate design ..... Soil, Water and Timber Management: Forest Engineering Solutions in Response to.

  1. Job tasks, computer use, and the decreasing part-time pay penalty for women in the UK

    OpenAIRE

    Elsayed, A.E.A.; de Grip, A.; Fouarge, D.

    2014-01-01

    Using data from the UK Skills Surveys, we show that the part-time pay penalty for female workers within low- and medium-skilled occupations decreased significantly over the period 1997-2006. The convergence in computer use between part-time and full-time workers within these occupations explains a large share of the decrease in the part-time pay penalty. However, the lower part-time pay penalty is also related to lower wage returns to reading and writing which are performed more intensively b...

  2. Development of a real-time monitoring system and integration of different computer system in LHD experiments using IP multicast

    International Nuclear Information System (INIS)

    Emoto, Masahiko; Nakamura, Yukio; Teramachi, Yasuaki; Okumura, Haruhiko; Yamaguchi, Satarou

    2002-01-01

    There are several different computer systems in the LHD (Large Helical Device) experiment, and the coalition of these computers is therefore key to performing the experiment. A real-time monitoring system is also important because long discharges are needed in the LHD experiment. In order to achieve these two requirements, the technique of IP multicast was adopted. The authors have developed three new systems: the first is the real-time monitoring system, the second is the delivery system for the shot number, and the last is the real-time notification system for plasma data registration. The first system can deliver real-time monitoring data to the LHD experimental LAN through the firewall of the LHD control LAN at NIFS. The other two systems are used to realize close coalition of the different computers in the LHD plasma experiment. From various experiences we can conclude that IP multicast is very useful both in the LHD experiment and in future large plasma experiments. (author)
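
    A minimal Python sketch of the IP-multicast pattern on which these systems rely; the group address and port are placeholders, not the LHD values:

        import socket
        import struct

        GROUP, PORT = "239.1.1.1", 5007   # placeholder multicast group/port

        def send(data: bytes):
            s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
            s.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 2)
            s.sendto(data, (GROUP, PORT))

        def receive() -> bytes:
            s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
            s.bind(("", PORT))
            mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
            s.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
            return s.recv(4096)   # every subscribed host sees the same datagram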

  3. A cascadic monotonic time-discretized algorithm for finite-level quantum control computation

    Science.gov (United States)

    Ditz, P.; Borzì, A.

    2008-03-01

    A computer package (CNMS) is presented aimed at the solution of finite-level quantum optimal control problems. This package is based on a recently developed computational strategy known as monotonic schemes. Quantum optimal control problems arise in particular in quantum optics where the optimization of a control representing laser pulses is required. The purpose of the external control field is to channel the system's wavefunction between given states in its most efficient way. Physically motivated constraints, such as limited laser resources, are accommodated through appropriately chosen cost functionals. Program summary: Program title: CNMS. Catalogue identifier: ADEB_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADEB_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 770. No. of bytes in distributed program, including test data, etc.: 7098. Distribution format: tar.gz. Programming language: MATLAB 6. Computer: AMD Athlon 64 × 2 Dual, 2.21 GHz, 1.5 GB RAM. Operating system: Microsoft Windows XP. Word size: 32. Classification: 4.9. Nature of problem: Quantum control. Solution method: Iterative. Running time: 60-600 sec.

  4. Screen time viewing behaviors and isometric trunk muscle strength in youth.

    Science.gov (United States)

    Grøntved, Anders; Ried-Larsen, Mathias; Froberg, Karsten; Wedderkopp, Niels; Brage, Søren; Kristensen, Peter Lund; Andersen, Lars Bo; Møller, Niels Christian

    2013-10-01

    The objective of this study was to examine the association of screen time viewing behavior with isometric trunk muscle strength in youth. A cross-sectional study was carried out including 606 adolescents (14-16 yr old) participating in the Danish European Youth Heart Study, a population-based study with assessments conducted in either 1997/1998 or 2003/2004. Maximal voluntary contractions during isometric back extension and abdominal flexion were determined using a strain gauge dynamometer, and cardiorespiratory fitness (CRF) was obtained using a maximal cycle ergometer test. TV viewing time, computer use, and other lifestyle behaviors were obtained by self-report. Analyses of association of screen use behaviors with isometric trunk muscle strength were carried out using multivariable adjusted linear regression. The mean (SD) isometric strength was 0.87 (0.16) N·kg⁻¹. TV viewing, computer use, and total screen time use were inversely associated with isometric trunk muscle strength in analyses adjusted for lifestyle and sociodemographic factors. After further adjustment for CRF and waist circumference, associations remained significant for computer use and total screen time, but TV viewing was only marginally associated with muscle strength after these additional adjustments (-0.05 SD (95% confidence interval, -0.11 to 0.005) difference in strength per 1 h·d⁻¹ difference in TV viewing time, P = 0.08). Each 1 h·d⁻¹ difference in total screen time use was associated with -0.09 SD (95% confidence interval, -0.14 to -0.04) lower isometric trunk muscle strength in the fully adjusted model (P = 0.001). There were no indications that the association of screen time use with isometric trunk muscle strength was attenuated among highly fit individuals (P = 0.91 for CRF by screen time interaction). Screen time use was inversely associated with isometric trunk muscle strength independent of CRF and other confounding factors.

  5. An Energy Efficient Neuromorphic Computing System Using Real Time Sensing Method

    DEFF Research Database (Denmark)

    Farkhani, Hooman; Tohidi, Mohammad; Farkhani, Sadaf

    2017-01-01

    In spintronic-based neuromorphic computing systems (NCS), the switching of the magnetic moment in a magnetic tunnel junction (MTJ) is used to mimic neuron firing. However, the stochastic switching behavior of the MTJ and the effect of process variations lead to extra stimulation time, which causes extra energy consumption and delay in such NCSs. In this paper, a new real-time sensing (RTS) circuit is proposed to track the MTJ state and terminate the stimulation phase immediately after MTJ switching. This leads to a significant reduction in the energy consumption and delay of the NCS. Simulation results using a 65-nm CMOS technology and a 40-nm MTJ technology confirm that the energy consumption of an RTS-based NCS is improved by 50% in comparison with a typical NCS. Moreover, utilizing the RTS circuit improves the overall speed of an NCS by 2.75×.

  6. TIMED: a computer program for calculating cumulated activity of a radionuclide in the organs of the human body at a given time, t, after deposition

    International Nuclear Information System (INIS)

    Watson, S.B.; Snyder, W.S.; Ford, M.R.

    1976-12-01

    TIMED is a computer program designed to calculate cumulated radioactivity in the various source organs at various times after radionuclide deposition. TIMED embodies a system of differential equations which describes activity transfer in the lungs, gastrointestinal tract, and other organs of the body. This system accounts for delay of transfer of activity between compartments of the body and for radioactive daughters.
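
    A toy two-compartment version of the kind of model TIMED embodies: first-order transfer between compartments plus radioactive decay, with cumulated activity obtained as the time integral of activity; the rate constants are illustrative, not from the program:

        import numpy as np
        from scipy.integrate import solve_ivp

        lam = np.log(2) / 8.0   # decay constant for an 8-day half-life
        k12 = 0.3               # transfer rate, compartment 1 -> 2 (per day)

        def rhs(t, q):          # q = activity in each compartment
            q1, q2 = q
            return [-(lam + k12) * q1, k12 * q1 - lam * q2]

        sol = solve_ivp(rhs, (0.0, 50.0), [1.0, 0.0], dense_output=True)
        ts = np.linspace(0.0, 50.0, 2000)
        cumulated = np.trapz(sol.sol(ts), ts, axis=1)   # integral of activity
        print(cumulated)        # one cumulated-activity value per compartment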

  7. Total testosterone levels are often more than three times elevated in patients with androgen-secreting tumours

    DEFF Research Database (Denmark)

    Glintborg, Dorte; Lambaa Altinok, Magda; Petersen, Kresten Rubeck

    2015-01-01

    surgery. Terminal hair growth on lip and chin gradually increases after menopause, which complicates distinction from normal physiological variation. Precise testosterone assays have just recently become available in the daily clinic. We present three women diagnosed with testosterone-producing tumours...... when total testosterone levels are above three times the upper reference limit....

  8. Adaptive-weighted total variation minimization for sparse data toward low-dose x-ray computed tomography image reconstruction.

    Science.gov (United States)

    Liu, Yan; Ma, Jianhua; Fan, Yi; Liang, Zhengrong

    2012-12-07

    Previous studies have shown that by minimizing the total variation (TV) of the to-be-estimated image with some data and other constraints, piecewise-smooth x-ray computed tomography (CT) images can be reconstructed from sparse-view projection data without introducing notable artifacts. However, due to the piecewise constant assumption for the image, a conventional TV minimization algorithm often suffers from over-smoothness on the edges of the resulting image. To mitigate this drawback, we present an adaptive-weighted TV (AwTV) minimization algorithm in this paper. The presented AwTV model is derived by considering the anisotropic edge property among neighboring image voxels, where the associated weights are expressed as an exponential function and can be adaptively adjusted by the local image-intensity gradient for the purpose of preserving the edge details. Inspired by the previously reported TV-POCS (projection onto convex sets) implementation, a similar AwTV-POCS implementation was developed to minimize the AwTV subject to data and other constraints for the purpose of sparse-view low-dose CT image reconstruction. To evaluate the presented AwTV-POCS algorithm, both qualitative and quantitative studies were performed by computer simulations and phantom experiments. The results show that the presented AwTV-POCS algorithm can yield images with several notable gains, in terms of noise-resolution tradeoff plots and full-width at half-maximum values, as compared to the corresponding conventional TV-POCS algorithm.
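
    A sketch of the AwTV objective described above, with each finite difference weighted by an exponential of the local gradient so that strong edges are penalized less; the parameter delta and its value here are illustrative:

        import numpy as np

        def awtv(img, delta=0.005):
            """Adaptive-weighted TV: edge-preserving weights exp(-(grad/delta)^2)."""
            dx = np.diff(img, axis=0)
            dy = np.diff(img, axis=1)
            wx = np.exp(-(dx / delta) ** 2)   # weights shrink the edge penalty
            wy = np.exp(-(dy / delta) ** 2)
            return np.sum(np.sqrt(wx[:, :-1] * dx[:, :-1] ** 2 +
                                  wy[:-1, :] * dy[:-1, :] ** 2))

        print(awtv(np.random.default_rng(0).random((64, 64))))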

  9. New Mexico district work-effort analysis computer program

    Science.gov (United States)

    Hiss, W.L.; Trantolo, A.P.; Sparks, J.L.

    1972-01-01

    The computer program (CAN 2) described in this report is one of several related programs used in the New Mexico District cost-analysis system. The work-effort information used in these programs is accumulated and entered to the nearest hour on forms completed by each employee. Tabulating cards are punched directly from these forms after visual examinations for errors are made. Reports containing detailed work-effort data itemized by employee within each project and account and by account and project for each employee are prepared for both current-month and year-to-date periods by the CAN 2 computer program. An option allowing preparation of reports for a specified 3-month period is provided. The total number of hours worked on each account and project and a grand total of hours worked in the New Mexico District is computed and presented in a summary report for each period. Work effort not chargeable directly to individual projects or accounts is considered as overhead and can be apportioned to the individual accounts and projects on the basis of the ratio of the total hours of work effort for the individual accounts or projects to the total New Mexico District work effort at the option of the user. The hours of work performed by a particular section, such as General Investigations or Surface Water, are prorated and charged to the projects or accounts within the particular section. A number of surveillance or buffer accounts are employed to account for the hours worked on special events or on those parts of large projects or accounts that require a more detailed analysis. Any part of the New Mexico District operation can be separated and analyzed in detail by establishing an appropriate buffer account. With the exception of statements associated with word size, the computer program is written in FORTRAN IV in a relatively low and standard language level to facilitate its use on different digital computers. The program has been run only on a Control Data Corporation

  10. Clinical responses after total body irradiation by over permissible dose of γ-rays in one time

    International Nuclear Information System (INIS)

    Jiang Benrong; Wang Guilin; Liu Huilan; Tang Xingsheng; Ai Huisheng

    1990-01-01

    The clinical responses of patients after total-body γ-ray irradiation above the permissible dose were observed and analysed. The results showed that when the dose was above 5 cGy, there was some immunological depression, but no significant change in hematopoietic function. Five cases showed some transient changes of ECG, perhaps due to vagotonia caused by psychological imbalance. One case vomited 3-4 times after 28 cGy irradiation; this suggests that a few episodes of vomiting have no significance in the estimation of the received dose, and that the whole clinical picture must be concretely analysed.

  11. Reaction time for processing visual stimulus in a computer-assisted rehabilitation environment.

    Science.gov (United States)

    Sanchez, Yerly; Pinzon, David; Zheng, Bin

    2017-10-01

    To examine the reaction time when human subjects process information presented in the visual channel under both a direct vision and a virtual rehabilitation environment while walking. Visual stimuli consisted of eight math problems displayed in the peripheral vision of seven healthy human subjects in a virtual rehabilitation training environment (computer-assisted rehabilitation environment, CAREN) and in a direct vision environment. Subjects were required to verbally report the results of these math calculations within a short period of time. Reaction time, measured by a Tobii eye tracker, and calculation accuracy were recorded and compared between the direct vision and virtual rehabilitation environments. Performance outcomes measured for both groups included reaction time, reading time, answering time and the verbal answer score. A significant difference between the groups was found only for reaction time (p = .004). Participants had more difficulty recognizing the first equation in the virtual environment. Participants' reaction time was faster in the direct vision environment. This reaction-time delay should be kept in mind when designing skill-training scenarios in virtual environments. This was a pilot project for a series of studies assessing the cognitive ability of stroke patients undertaking a rehabilitation program in a virtual training environment. Implications for rehabilitation: Eye tracking is a reliable tool that can be employed in rehabilitation virtual environments. Reaction time changes between direct vision and virtual environments.

  12. Numerical analysis of boosting scheme for scalable NMR quantum computation

    International Nuclear Information System (INIS)

    SaiToh, Akira; Kitagawa, Masahiro

    2005-01-01

    Among initialization schemes for ensemble quantum computation beginning at thermal equilibrium, the scheme proposed by Schulman and Vazirani [in Proceedings of the 31st ACM Symposium on Theory of Computing (STOC'99) (ACM Press, New York, 1999), pp. 322-329] is known for the simple quantum circuit to redistribute the biases (polarizations) of qubits and small time complexity. However, our numerical simulation shows that the number of qubits initialized by the scheme is rather smaller than expected from the von Neumann entropy because of an increase in the sum of the binary entropies of individual qubits, which indicates a growth in the total classical correlation. This result--namely, that there is such a significant growth in the total binary entropy--disagrees with that of their analysis

  13. Chemistry, physics and time: the computer modelling of glassmaking.

    Science.gov (United States)

    Martlew, David

    2003-01-01

    A decade or so ago the remains of an early flat glass furnace were discovered in St Helens. Continuous glass production only became feasible after the Siemens Brothers demonstrated their continuous tank furnace at Dresden in 1870. One manufacturer of flat glass enthusiastically adopted the new technology and secretly explored many variations on this theme during the next fifteen years. Study of the surviving furnace remains using today's computer simulation techniques showed how, in 1887, that technology was adapted to the special demands of window glass making. Heterogeneous chemical reactions at high temperatures are required to convert the mixture of granular raw materials into the homogeneous glass needed for windows. Kinetics (and therefore the economics) of glassmaking is dominated by heat transfer and chemical diffusion as refractory grains are converted to highly viscous molten glass. Removal of gas bubbles in a sufficiently short period of time is vital for profitability, but the glassmaker must achieve this in a reaction vessel which is itself being dissolved by the molten glass. Design and operational studies of today's continuous tank furnaces need to take account of these factors, and good use is made of computer simulation techniques to shed light on the way furnaces behave and how improvements may be made. This paper seeks to show how those same techniques can be used to understand how the early Siemens continuous tank furnaces were designed and operated, and how the Victorian entrepreneurs succeeded in managing the thorny problems of what was, in effect, a vulnerable high temperature continuous chemical reactor.

  14. Time-driven activity based costing of total knee replacement surgery at a London teaching hospital.

    Science.gov (United States)

    Chen, Alvin; Sabharwal, Sanjeeve; Akhtar, Kashif; Makaram, Navnit; Gupte, Chinmay M

    2015-12-01

    The aim of this study was to conduct a time-driven activity based costing (TDABC) analysis of the clinical pathway for total knee replacement (TKR) and to determine where the major cost drivers lay. The in-patient pathway was prospectively mapped utilising a TDABC model, following 20 TKRs. The mean age for these patients was 73.4 years. All patients were ASA grade I or II and their mean BMI was 30.4. The 14 varus knees had a mean deformity of 5.32° and the six valgus knees had a mean deformity of 10.83°. Timings were prospectively collected as each patient was followed through the TKR pathway. Pre-operative costs including pre-assessment and joint school were £ 163. Total staff costs for admission and the operating theatre were £ 658. Consumables cost for the operating theatre were £ 1862. The average length of stay was 5.25 days at a total cost of £ 910. Trust overheads contributed £ 1651. The overall institutional cost of a 'noncomplex' TKR in patients without substantial medical co-morbidities was estimated to be £ 5422, representing a profit of £ 1065 based on a best practice tariff of £ 6487. The major cost drivers in the TKR pathway were determined to be theatre consumables, corporate overheads, overall ward cost and operating theatre staffing costs. Appropriate discounting of implant costs, reduction in length of stay by adopting an enhanced recovery programme and control of corporate overheads through the use of elective orthopaedic treatment centres are proposed approaches for reducing the overall cost of treatment. Copyright © 2015 Elsevier B.V. All rights reserved.
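
    The TDABC structure behind figures like these can be summarized as time per activity multiplied by a capacity cost rate, plus direct consumables and overheads; the minutes and rates below are placeholders, not the study's data:

        # Placeholder minutes and per-minute capacity cost rates (GBP).
        staff_minutes = {"admission": 45, "theatre": 120, "ward_round": 30}
        rate_per_min = {"admission": 1.8, "theatre": 4.0, "ward_round": 1.5}

        staff_cost = sum(m * rate_per_min[a] for a, m in staff_minutes.items())
        # Add direct consumables, ward stay, overheads and pre-op costs.
        total = staff_cost + 1862 + 910 + 1651 + 163
        print(round(staff_cost), round(total))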

  15. Influence of different maceration time and temperatures on total phenols, colour and sensory properties of Cabernet Sauvignon wines.

    Science.gov (United States)

    Şener, Hasan; Yildirim, Hatice Kalkan

    2013-12-01

    Maceration and fermentation time and temperatures are important factors affecting wine quality. In this study different maceration times (3 and 6 days) and temperatures (15 °C and 25 °C) during production of red wine (Vitis vinifera L. Cabernet Sauvignon) were investigated. In all wines standard wine chemical parameters and some specific parameters such as total phenols, tartaric esters, total flavonols and colour parameters (CD, CI, T, dA%, %Y, %R, %B, CIELAB values) were determined. Sensory evaluation was performed by descriptive sensory analysis. The results demonstrated not only the importance of skin contact time and temperature during maceration but also the effects of transition temperatures (different maceration and fermentation temperatures) on wine quality as a whole. The results of sensory descriptive analyses revealed that the temperature significantly affected the aroma and flavour attributes of wines. The highest scores for 'cassis', 'clove', 'fresh fruity' and 'rose' characters were obtained in wines with maceration (6 days) and fermentation at the low temperature (15 °C).

  16. Person-related determinants of TV viewing and computer time in a cohort of young Dutch adults: Who sits the most?

    Science.gov (United States)

    Uijtdewilligen, L; Singh, A S; Chinapaw, M J M; Twisk, J W R; van Mechelen, W

    2015-10-01

    We aimed to assess the associations of person-related factors with leisure time television (TV) viewing and computer time among young adults. We analyzed self-reported TV viewing (h/week) and leisure computer time (h/week) from 475 Dutch young adults (47% male) who had participated in the Amsterdam Growth and Health Longitudinal Study at the age of 32 and 36 years. Sociodemographic factors (i.e., marital and employment status), physical factors (i.e., skin folds, aerobic fitness, neuromotor fitness, back problems), psychological factors (i.e., problem- and emotion-focused coping, personality), lifestyle (i.e., alcohol consumption, smoking, energy intake, physical activity), and self-rated health (i.e., general health status, mild health complaints) were assessed. Univariable and multivariable generalized estimating equations were performed. Male gender, higher sum of skin folds, lower values of aerobic fitness, higher rigidity, higher self-sufficiency/recalcitrance, and smoking were positively associated with TV time. Male gender, higher sum of skin folds, higher scores on self-esteem, low energy intake, and a not so good general health status were significantly associated with higher computer time. Determinants of TV viewing and computer time were not identical, suggesting that both behaviors (a) have different at-risk populations and (b) should be targeted differently. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  17. Allocation of Internal Medicine Resident Time in a Swiss Hospital: A Time and Motion Study of Day and Evening Shifts.

    Science.gov (United States)

    Wenger, Nathalie; Méan, Marie; Castioni, Julien; Marques-Vidal, Pedro; Waeber, Gérard; Garnier, Antoine

    2017-04-18

    Little current evidence documents how internal medicine residents spend their time at work, particularly with regard to the proportions of time spent in direct patient care versus using computers. To describe how residents allocate their time during day and evening hospital shifts. Time and motion study. Internal medicine residency at a university hospital in Switzerland, May to July 2015. 36 internal medicine residents with an average of 29 months of postgraduate training. Trained observers recorded the residents' activities using a tablet-based application. Twenty-two activities were categorized as directly related to patients, indirectly related to patients, communication, academic, nonmedical tasks, and transition. In addition, the presence of a patient or colleague and use of a computer or telephone during each activity was recorded. Residents were observed for a total of 696.7 hours. Day shifts lasted 11.6 hours (1.6 hours more than scheduled). During these shifts, activities indirectly related to patients accounted for 52.4% of the time, and activities directly related to patients accounted for 28.0%. Residents spent an average of 1.7 hours with patients, 5.2 hours using computers, and 13 minutes doing both. Time spent using a computer was scattered throughout the day, with the heaviest use after 6:00 p.m. The study involved a small sample from 1 institution. At this Swiss teaching hospital, internal medicine residents spent more time at work than scheduled. Activities indirectly related to patients predominated, and about half the workday was spent using a computer. Information Technology Department and Department of Internal Medicine of Lausanne University Hospital.

  18. A Primer on High-Throughput Computing for Genomic Selection

    Directory of Open Access Journals (Sweden)

    Xiao-Lin eWu

    2011-02-01

    Full Text Available High-throughput computing (HTC) uses computer clusters to solve advanced computational problems, with the goal of accomplishing high throughput over relatively long periods of time. In genomic selection, for example, a set of markers covering the entire genome is used to train a model based on known data, and the resulting model is used to predict the genetic merit of selection candidates. Sophisticated models are very computationally demanding and, with several traits to be evaluated sequentially, computing time is long and output is low. In this paper, we present scenarios and basic principles of how HTC can be used in genomic selection, implemented using various techniques from simple batch processing to pipelining in distributed computer clusters. Various scripting languages, such as shell scripting, Perl and R, are also very useful to devise pipelines. By pipelining, we can reduce total computing time and consequently increase throughput. In comparison to the traditional data processing pipeline residing on the central processors, performing general purpose computation on a graphics processing unit (GPU) provides a new-generation approach to massive parallel computing in genomic selection. While the concept of HTC may still be new to many researchers in animal breeding, plant breeding, and genetics, HTC infrastructures have already been built in many institutions, such as the University of Wisconsin – Madison, which can be leveraged for genomic selection, in terms of central processing unit (CPU) capacity, network connectivity, storage availability, and middleware connectivity. Exploring existing HTC infrastructures as well as general purpose computing environments will further expand our capability to meet the increasing computing demands posed by the unprecedented genomic data that we have today. We anticipate that HTC will impact genomic selection via better statistical models, faster solutions, and more competitive products (e.g., from design of
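    The pipelining idea sketched above (running independent per-trait model fits concurrently instead of strictly in sequence) can be illustrated in a few lines of Python; train_model() and the trait names are hypothetical placeholders, not part of the paper.

        from concurrent.futures import ProcessPoolExecutor

        def train_model(trait: str) -> str:
            # Stand-in for a computationally demanding genomic-selection model fit.
            return f"{trait}: model trained"

        traits = ["milk_yield", "fertility", "longevity", "feed_efficiency"]

        if __name__ == "__main__":
            # Independent jobs run concurrently, reducing total computing time
            # roughly in proportion to the number of workers.
            with ProcessPoolExecutor(max_workers=4) as pool:
                for message in pool.map(train_model, traits):
                    print(message)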

  19. Real-Time Evaluation of Breast Self-Examination Using Computer Vision

    Directory of Open Access Journals (Sweden)

    Eman Mohammadi

    2014-01-01

    Full Text Available Breast cancer is the most common cancer among women worldwide and breast self-examination (BSE is considered as the most cost-effective approach for early breast cancer detection. The general objective of this paper is to design and develop a computer vision algorithm to evaluate the BSE performance in real-time. The first stage of the algorithm presents a method for detecting and tracking the nipples in frames while a woman performs BSE; the second stage presents a method for localizing the breast region and blocks of pixels related to palpation of the breast, and the third stage focuses on detecting the palpated blocks in the breast region. The palpated blocks are highlighted at the time of BSE performance. In a correct BSE performance, all blocks must be palpated, checked, and highlighted, respectively. If any abnormality, such as masses, is detected, then this must be reported to a doctor to confirm the presence of this abnormality and proceed to perform other confirmatory tests. The experimental results have shown that the BSE evaluation algorithm presented in this paper provides robust performance.
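    The third stage described above, detecting which blocks of the breast region have been palpated, could be approximated with simple frame differencing, as in the following sketch; the grid size and threshold are illustrative assumptions, not the authors' parameters.

        import cv2
        import numpy as np

        def update_palpated(prev_gray, gray, palpated, grid=(8, 8), thresh=12.0):
            """Flag grid cells whose mean frame-to-frame difference exceeds thresh."""
            diff = cv2.absdiff(gray, prev_gray)
            h, w = diff.shape
            ch, cw = h // grid[0], w // grid[1]
            for i in range(grid[0]):
                for j in range(grid[1]):
                    cell = diff[i * ch:(i + 1) * ch, j * cw:(j + 1) * cw]
                    if cell.mean() > thresh:
                        palpated[i, j] = True  # block touched at least once
            return palpated

        # In a capture loop, a correct BSE performance would end with every
        # cell of `palpated` set to True.
        palpated = np.zeros((8, 8), dtype=bool)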

  20. Real-time evaluation of breast self-examination using computer vision.

    Science.gov (United States)

    Mohammadi, Eman; Dadios, Elmer P; Gan Lim, Laurence A; Cabatuan, Melvin K; Naguib, Raouf N G; Avila, Jose Maria C; Oikonomou, Andreas

    2014-01-01

    Breast cancer is the most common cancer among women worldwide and breast self-examination (BSE) is considered as the most cost-effective approach for early breast cancer detection. The general objective of this paper is to design and develop a computer vision algorithm to evaluate the BSE performance in real-time. The first stage of the algorithm presents a method for detecting and tracking the nipples in frames while a woman performs BSE; the second stage presents a method for localizing the breast region and blocks of pixels related to palpation of the breast, and the third stage focuses on detecting the palpated blocks in the breast region. The palpated blocks are highlighted at the time of BSE performance. In a correct BSE performance, all blocks must be palpated, checked, and highlighted, respectively. If any abnormality, such as masses, is detected, then this must be reported to a doctor to confirm the presence of this abnormality and proceed to perform other confirmatory tests. The experimental results have shown that the BSE evaluation algorithm presented in this paper provides robust performance.

  1. Self-motion perception: assessment by real-time computer-generated animations

    Science.gov (United States)

    Parker, D. E.; Phillips, J. O.

    2001-01-01

    We report a new procedure for assessing complex self-motion perception. In three experiments, subjects manipulated a 6 degree-of-freedom magnetic-field tracker which controlled the motion of a virtual avatar so that its motion corresponded to the subjects' perceived self-motion. The real-time animation created by this procedure was stored using a virtual video recorder for subsequent analysis. Combined real and illusory self-motion and vestibulo-ocular reflex eye movements were evoked by cross-coupled angular accelerations produced by roll and pitch head movements during passive yaw rotation in a chair. Contrary to previous reports, illusory self-motion did not correspond to expectations based on semicircular canal stimulation. Illusory pitch head-motion directions were as predicted for only 37% of trials; whereas, slow-phase eye movements were in the predicted direction for 98% of the trials. The real-time computer-generated animations procedure permits use of naive, untrained subjects who lack a vocabulary for reporting motion perception and is applicable to basic self-motion perception studies, evaluation of motion simulators, assessment of balance disorders and so on.

  2. [Construction and analysis of a monitoring system with remote real-time multiple physiological parameters based on cloud computing].

    Science.gov (United States)

    Zhu, Lingyun; Li, Lianjie; Meng, Chunyan

    2014-12-01

    Existing multiple physiological parameter real-time monitoring systems have problems such as insufficient server capacity for physiological data storage and analysis (so that data consistency cannot be guaranteed), poor real-time performance, and other issues caused by the growing scale of data. We therefore proposed a new solution for multiple physiological parameters that performs clustered background data storage and processing based on cloud computing. In our studies, batch processing for longitudinal analysis of patients' historical data was introduced. The process included resource virtualization in the IaaS layer of the cloud platform, construction of a real-time computing platform in the PaaS layer, reception and analysis of the data stream in the SaaS layer, and the bottleneck problem of multi-parameter data transmission. The result was to achieve real-time transmission, storage and analysis of a large amount of physiological information. The simulation test results showed that the remote multiple physiological parameter monitoring system based on a cloud platform had obvious advantages in processing time and load balancing over the traditional server model. This architecture solved problems that exist in traditional remote medical services, including long turnaround time, poor real-time analysis performance, and lack of extensibility. Technical support was thereby provided to facilitate a "wearable wireless sensor plus mobile wireless transmission plus cloud computing service" mode moving towards home health monitoring with multiple physiological parameter wireless monitoring.

  3. Two-Agent Single-Machine Scheduling of Jobs with Time-Dependent Processing Times and Ready Times

    Directory of Open Access Journals (Sweden)

    Jan-Yee Kung

    2013-01-01

    Full Text Available Scheduling involving jobs with time-dependent processing times has recently attracted much research attention. However, multiagent scheduling with simultaneous considerations of jobs with time-dependent processing times and ready times is relatively unexplored. Inspired by this observation, we study a two-agent single-machine scheduling problem in which the jobs have both time-dependent processing times and ready times. We consider the model in which the actual processing time of a job of the first agent is a decreasing function of its scheduled position while the actual processing time of a job of the second agent is an increasing function of its scheduled position. In addition, each job has a different ready time. The objective is to minimize the total completion time of the jobs of the first agent with the restriction that no tardy job is allowed for the second agent. We propose a branch-and-bound algorithm and several genetic algorithms to obtain optimal and near-optimal solutions to the problem, respectively. We also present extensive computational results to test the proposed algorithms and examine the impacts of different problem parameters on their performance.
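    One candidate sequence for the problem above can be evaluated as in the sketch below: actual processing times depend on scheduled position, each job has a ready time, agent 1 accumulates total completion time, and any tardy agent-2 job makes the schedule infeasible. The position-effect exponents are illustrative, not the paper's exact functions.

        def evaluate(sequence):
            """sequence: list of (agent, base_time, ready_time, due_date).

            Agent 1 jobs shorten with later positions; agent 2 jobs lengthen.
            due_date is None for agent 1 jobs.
            """
            t = 0.0
            total_completion_agent1 = 0.0
            feasible = True
            for position, (agent, p, ready, due) in enumerate(sequence, start=1):
                rate = position ** (-0.2 if agent == 1 else 0.2)  # position effect
                t = max(t, ready) + p * rate
                if agent == 1:
                    total_completion_agent1 += t
                elif due is not None and t > due:
                    feasible = False  # no tardy job is allowed for agent 2
            return total_completion_agent1, feasible

        # Example: two agent-1 jobs and one agent-2 job with a due date of 12.
        print(evaluate([(1, 5.0, 0.0, None), (2, 3.0, 2.0, 12.0), (1, 4.0, 0.0, None)]))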

  4. Biofeedback effectiveness to reduce upper limb muscle activity during computer work is muscle specific and time pressure dependent

    DEFF Research Database (Denmark)

    Vedsted, Pernille; Søgaard, Karen; Blangsted, Anne Katrine

    2011-01-01

    Continuous electromyographic (EMG) activity level is considered a risk factor in developing muscle disorders. EMG biofeedback is known to be useful in reducing EMG activity in working muscles during computer work. The purpose was to test the following hypotheses: (1) unilateral biofeedback from trapezius (TRA) can reduce bilateral TRA activity but not extensor digitorum communis (EDC) activity; (2) biofeedback from EDC can reduce activity in EDC but not in TRA; (3) biofeedback is more effective in the no time constraint than in the time constraint working condition. Eleven healthy women performed computer work during two different working conditions (time constraint/no time constraint) while receiving biofeedback. Biofeedback was given from right TRA or EDC through two modes (visual/auditory) by the use of EMG or mechanomyography as biofeedback source. During control sessions (no biofeedback), EMG...

  5. Usefulness of measurement of circulation time using MgSO4: correlation with time-density curve using electron beam computed tomography

    International Nuclear Information System (INIS)

    Kim, Byung Ki; Lee, Hui Joong; Lee, Jong Min; Kim, Yong Joo; Kang, Duck Sik

    1999-01-01

    To determine the usefulness of MgSO4 for measuring the systemic circulation time. Systemic circulation time, defined as the elapsed time from the injection of MgSO4 solution to the onset of a pharyngeal burning sensation, was measured in 63 volunteers. MgSO4 was injected into a superficial vein of an upper extremity. Using dynamic electron beam computed tomography at the level of the abdominal aorta and celiac axis, a time-intensity curve was plotted, and for these two locations the maximal enhancement time was compared. For 60 of the 63 subjects, both systemic circulation time and maximal enhancement time were determined. Average systemic circulation time was 17.4 (SD: 3.6) secs, and average maximal enhancement times at the level of the abdominal aorta and celiac axis were 17.5 (SD: 3.0) secs and 18.5 (SD: 3.2) secs, respectively. Correlation coefficients between systemic circulation time and maximal enhancement time for the abdominal aorta and celiac axis were 0.73 (p < 0.05). Systemic circulation time measured by MgSO4 injection and maximal enhancement time for the abdominal aorta showed significant correlation. Thus, to determine the appropriate scanning time in contrast-enhanced radiological studies, MgSO4 can be used instead of a test bolus study

  6. Computer performance evaluation of FACOM 230-75 computer system, (2)

    International Nuclear Information System (INIS)

    Fujii, Minoru; Asai, Kiyoshi

    1980-08-01

    In this report are described computer performance evaluations for the FACOM 230-75 computers in JAERI. The evaluations are performed on the following items: (1) Cost/benefit analysis of timesharing terminals, (2) Analysis of the response time of timesharing terminals, (3) Analysis of throughput time for batch job processing, (4) Estimation of current potential demands for computer time, (5) Determination of the appropriate number of card readers and line printers. These evaluations are done mainly from the standpoint of cost reduction of computing facilities. The techniques adopted are very practical ones. This report will be useful for those people who are concerned with the management of a computing installation. (author)

  7. Fast computation of the characteristics method on vector computers

    International Nuclear Information System (INIS)

    Kugo, Teruhiko

    2001-11-01

    Fast computation of the characteristics method to solve the neutron transport equation in a heterogeneous geometry has been studied. Two vector computation algorithms, an odd-even sweep (OES) method and an independent sequential sweep (ISS) method, have been developed, and their efficiency for a typical fuel assembly calculation has been investigated. For both methods, a vector computation is 15 times faster than a scalar computation. Comparing the OES and ISS methods, the following was found: (1) there is only a small difference in computation speed, (2) the ISS method shows faster convergence, and (3) the ISS method saves about 80% of the computer memory required by the OES method. It is, therefore, concluded that the ISS method is superior to the OES method as a vectorization method. In the vector computation, a table-look-up method to reduce the computation time of the exponential function saves only 20% of the whole computation time. Both the coarse mesh rebalance method and the Aitken acceleration method are effective as acceleration methods for the characteristics method; a combination of them saves 70-80% of the outer iterations compared with free iteration. (author)

  8. Computer controlled quality of analytical measurements

    International Nuclear Information System (INIS)

    Clark, J.P.; Huff, G.A.

    1979-01-01

    A PDP 11/35 computer system is used in evaluating analytical chemistry measurements quality control data at the Barnwell Nuclear Fuel Plant. This computerized measurement quality control system has several features which are not available in manual systems, such as real-time measurement control, computer-calculated bias corrections and standard deviation estimates, surveillance applications, evaluation of measurement system variables, records storage, immediate analyst recertification, and the elimination of routine analysis of known bench standards. The effectiveness of the Barnwell computer system has been demonstrated in gathering and assimilating the measurements of over 1100 quality control samples obtained during a recent plant demonstration run. These data were used to determine equations for predicting measurement reliability estimates (bias and precision); to evaluate the measurement system; and to provide direction for modification of chemistry methods. The analytical chemistry measurement quality control activities represented 10% of the total analytical chemistry effort

  9. Brake response time is significantly impaired after total knee arthroplasty: investigation of performing an emergency stop while driving a car.

    Science.gov (United States)

    Jordan, Maurice; Hofmann, Ulf-Krister; Rondak, Ina; Götze, Marco; Kluba, Torsten; Ipach, Ingmar

    2015-09-01

    The objective of this study was to investigate whether total knee arthroplasty (TKA) impairs the ability to perform an emergency stop. An automatic transmission brake simulator was developed to evaluate total brake response time. A prospective repeated-measures design was used. Forty patients (20 left/20 right) were measured 8 days and 6, 12, and 52 wks after surgery. Eight days postoperatively, total brake response time had increased significantly by 30% in right TKA and insignificantly by 2% in left TKA. Brake force significantly decreased by 35% in right TKA and by 25% in left TKA during this period. Baseline values were reached at week 12 in right TKA; the impairment of outcome measures, however, was no longer significant at week 6 compared with preoperative values. Total brake response time and brake force in left TKA fell below baseline values at weeks 6 and 12. Brake force in left TKA was the only outcome measure significantly impaired 8 days postoperatively. This study highlights that categorical statements cannot be provided. This study's findings on automatic transmission driving suggest that right TKA patients may resume driving 6 wks postoperatively. Fitness to drive in left TKA is not fully recovered 8 days postoperatively. If testing is not available, patients should refrain from driving until they return from rehabilitation.

  10. Visual Fatigue Induced by Viewing a Tablet Computer with a High-resolution Display.

    Science.gov (United States)

    Kim, Dong Ju; Lim, Chi Yeon; Gu, Namyi; Park, Choul Yong

    2017-10-01

    In the present study, the visual discomfort induced by smart mobile devices was assessed in normal and healthy adults. Fifty-nine volunteers (age, 38.16 ± 10.23 years; male : female = 19 : 40) were exposed to tablet computer screen stimuli (iPad Air, Apple Inc.) for 1 hour. Participants watched a movie or played a computer game on the tablet computer. Visual fatigue and discomfort were assessed using an asthenopia questionnaire, tear film break-up time, and total ocular wavefront aberration before and after viewing smart mobile devices. Based on the questionnaire, viewing smart mobile devices for 1 hour significantly increased mean total asthenopia score from 19.59 ± 8.58 to 22.68 ± 9.39 (p < 0.001). Specifically, the scores for five items (tired eyes, sore/aching eyes, irritated eyes, watery eyes, and hot/burning eye) were significantly increased by viewing smart mobile devices. Tear film break-up time significantly decreased from 5.09 ± 1.52 seconds to 4.63 ± 1.34 seconds (p = 0.003). However, total ocular wavefront aberration was unchanged. Visual fatigue and discomfort were significantly induced by viewing smart mobile devices, even though the devices were equipped with state-of-the-art display technology. © 2017 The Korean Ophthalmological Society

  11. Application of a Statistical Linear Time-Varying System Model of High Grazing Angle Sea Clutter for Computing Interference Power

    Science.gov (United States)

    2017-12-08

    [Report fragments; no complete abstract is available in this record.] The report develops a statistical linear time-varying system model of high grazing angle sea clutter for computing interference power. One of the sinc factors is approximated with the Dirichlet kernel to facilitate computation of the interference integral; the resultant autocorrelation then follows by substituting the intermediate results, and the Python code used to generate the report's figures is included.

  12. A computationally efficient electricity price forecasting model for real time energy markets

    International Nuclear Information System (INIS)

    Feijoo, Felipe; Silva, Walter; Das, Tapas K.

    2016-01-01

    Highlights: • A fast hybrid forecast model for electricity prices. • Accurate forecast model that combines K-means and machine learning techniques. • Low computational effort by elimination of feature selection techniques. • New benchmark results by using market data for year 2012 and 2015. - Abstract: Increased significance of demand response and proliferation of distributed energy resources will continue to demand faster and more accurate models for forecasting locational marginal prices. This paper presents such a model (named K-SVR). While yielding prediction accuracy comparable with the best known models in the literature, K-SVR requires a significantly reduced computational time. The computational reduction is attained by eliminating the use of a feature selection process, which is commonly used by the existing models in the literature. K-SVR is a hybrid model that combines clustering algorithms, support vector machine, and support vector regression. K-SVR is tested using Pennsylvania–New Jersey–Maryland market data from the periods 2005–6, 2011–12, and 2014–15. Market data from 2006 has been used to measure performance of many of the existing models. Authors chose these models to compare performance and demonstrate strengths of K-SVR. Results obtained from K-SVR using the market data from 2012 and 2015 are new, and will serve as benchmark for future models.
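    A minimal sketch of the hybrid scheme named above, assuming the combination works roughly as described: cluster historical observations with k-means, fit one support vector regression per cluster, and predict with the model of the nearest cluster. The random features stand in for the paper's actual price and load inputs.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.svm import SVR

        rng = np.random.default_rng(0)
        X = rng.random((500, 4))   # stand-ins for lagged prices, load, etc.
        y = rng.random(500)        # locational marginal price to predict

        # Step 1: cluster similar market conditions with k-means.
        kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

        # Step 2: one SVR per cluster, trained only on that cluster's hours.
        models = {
            k: SVR(C=10.0, epsilon=0.01).fit(X[kmeans.labels_ == k],
                                             y[kmeans.labels_ == k])
            for k in range(3)
        }

        # Step 3: route a new observation to its cluster's model.
        x_new = rng.random((1, 4))
        cluster = int(kmeans.predict(x_new)[0])
        print(models[cluster].predict(x_new))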

  13. Validating the Accuracy of Reaction Time Assessment on Computer-Based Tablet Devices.

    Science.gov (United States)

    Schatz, Philip; Ybarra, Vincent; Leitner, Donald

    2015-08-01

    Computer-based assessment has evolved to tablet-based devices. Despite the availability of tablets and "apps," there is limited research validating their use. We documented timing delays between stimulus presentation and (simulated) touch response on iOS devices (3rd- and 4th-generation Apple iPads) and Android devices (Kindle Fire, Google Nexus, Samsung Galaxy) at response intervals of 100, 250, 500, and 1,000 milliseconds (ms). Results showed significantly greater timing error on Google Nexus and Samsung tablets (81-97 ms), than Kindle Fire and Apple iPads (27-33 ms). Within Apple devices, iOS 7 obtained significantly lower timing error than iOS 6. Simple reaction time (RT) trials (250 ms) on tablet devices represent 12% to 40% error (30-100 ms), depending on the device, which decreases considerably for choice RT trials (3-5% error at 1,000 ms). Results raise implications for using the same device for serial clinical assessment of RT using tablets, as well as the need for calibration of software and hardware. © The Author(s) 2015.
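    The quantity being validated above is essentially the gap between a stimulus timestamp and a response timestamp, as in the minimal sketch below; real device validation needs external hardware, since software clocks miss display and touch-digitizer latency.

        import time

        NOMINAL_RT_MS = 250.0  # the simulated response interval

        stimulus_ts = time.perf_counter()    # stimulus nominally shown here
        time.sleep(NOMINAL_RT_MS / 1000.0)   # stand-in for a simulated touch
        response_ts = time.perf_counter()

        observed_rt_ms = (response_ts - stimulus_ts) * 1000.0
        print(f"timing error: {observed_rt_ms - NOMINAL_RT_MS:+.2f} ms")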

  14. Computer control system of TRISTAN

    International Nuclear Information System (INIS)

    Kurokawa, Shin-ichi; Shinomoto, Manabu; Kurihara, Michio; Sakai, Hiroshi.

    1984-01-01

    For the operation of a large accelerator, an enormous number of electromagnets, power sources, vacuum devices, high-frequency accelerating equipment and so on must be connected and controlled harmoniously. For this purpose, a number of computers are adopted and connected with a network; in this way, a large computer system for laboratory automation which integrates and controls the whole system is constructed. As a large-scale distributed system, functions such as electromagnet control, file processing and operation control are assigned to respective computers, and total control is made feasible by network connection. At the same time, as the interface with controlled equipment, CAMAC (computer-aided measurement and control) is adopted to ensure the flexibility and expandability of the system. Moreover, the language ''NODAL'', which has a network support function, was developed so that software can easily be written without considering the composition of the complex distributed system. The accelerator in the TRISTAN project is composed of an electron linear accelerator, an accumulation ring of 6 GeV and a main ring of 30 GeV. The two ring-type accelerators must be operated synchronously as one body, and are controlled with one computer system. The hardware and software are outlined. (Kako, I.)

  15. Computation and Communication Evaluation of an Authentication Mechanism for Time-Triggered Networked Control Systems

    Science.gov (United States)

    Martins, Goncalo; Moondra, Arul; Dubey, Abhishek; Bhattacharjee, Anirban; Koutsoukos, Xenofon D.

    2016-01-01

    In modern networked control applications, confidentiality and integrity are important features to address in order to prevent against attacks. Moreover, network control systems are a fundamental part of the communication components of current cyber-physical systems (e.g., automotive communications). Many networked control systems employ Time-Triggered (TT) architectures that provide mechanisms enabling the exchange of precise and synchronous messages. TT systems have computation and communication constraints, and with the aim to enable secure communications in the network, it is important to evaluate the computational and communication overhead of implementing secure communication mechanisms. This paper presents a comprehensive analysis and evaluation of the effects of adding a Hash-based Message Authentication (HMAC) to TT networked control systems. The contributions of the paper include (1) the analysis and experimental validation of the communication overhead, as well as a scalability analysis that utilizes the experimental result for both wired and wireless platforms and (2) an experimental evaluation of the computational overhead of HMAC based on a kernel-level Linux implementation. An automotive application is used as an example, and the results show that it is feasible to implement a secure communication mechanism without interfering with the existing automotive controller execution times. The methods and results of the paper can be used for evaluating the performance impact of security mechanisms and, thus, for the design of secure wired and wireless TT networked control systems. PMID:27463718
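    The computational overhead evaluated above reduces to the cost of computing a keyed tag per message; the following sketch estimates that cost with Python's standard hmac module. Key, payload size, and iteration count are illustrative.

        import hashlib
        import hmac
        import time

        key = b"\x00" * 32      # illustrative shared key
        payload = b"\xaa" * 64  # a small time-triggered frame payload

        N = 10_000
        start = time.perf_counter()
        for _ in range(N):
            tag = hmac.new(key, payload, hashlib.sha256).digest()
        elapsed = time.perf_counter() - start
        print(f"~{elapsed / N * 1e6:.1f} us per HMAC-SHA256 tag")

        # Receiver side: constant-time comparison avoids leaking timing info.
        expected = hmac.new(key, payload, hashlib.sha256).digest()
        assert hmac.compare_digest(tag, expected)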

  16. Specialized computer architectures for computational aerodynamics

    Science.gov (United States)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relative high cost of performing these computations on commercially available general purpose computers, a cost high with respect to dollar expenditure and/or elapsed time. Today's computing technology will support a program designed to create specialized computing facilities to be dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  17. Totally Optimal Decision Trees for Monotone Boolean Functions with at Most Five Variables

    KAUST Repository

    Chikalov, Igor

    2013-01-01

    In this paper, we present the empirical results for relationships between time (depth) and space (number of nodes) complexity of decision trees computing monotone Boolean functions, with at most five variables. We use Dagger (a tool for optimization of decision trees and decision rules) to conduct experiments. We show that, for each monotone Boolean function with at most five variables, there exists a totally optimal decision tree which is optimal with respect to both depth and number of nodes.

  18. Computational materials design

    International Nuclear Information System (INIS)

    Snyder, R.L.

    1999-01-01

    Full text: Trial and error experimentation is an extremely expensive route to the development of new materials. The coming age of reduced defense funding will dramatically alter the way in which advanced materials have developed. In the absence of large funding we must concentrate on reducing the time and expense that the R and D of a new material consumes. This may be accomplished through the development of computational materials science. Materials are selected today by comparing the technical requirements to the materials databases. When existing materials cannot meet the requirements we explore new systems to develop a new material using experimental databases like the PDF. After proof of concept, the scaling of the new material to manufacture requires evaluating millions of parameter combinations to optimize the performance of the new device. Historically this process takes 10 to 20 years and requires hundreds of millions of dollars. The development of a focused set of computational tools to predict the final properties of new materials will permit the exploration of new materials systems with only a limited amount of materials characterization. However, to bound computational extrapolations, the experimental formulations and characterization will need to be tightly coupled to the computational tasks. The required experimental data must be obtained by dynamic, in-situ, very rapid characterization. Finally, to evaluate the optimization matrix required to manufacture the new material, very rapid in situ analysis techniques will be essential to intelligently monitor and optimize the formation of a desired microstructure. Techniques and examples for the rapid real-time application of XRPD and optical microscopy will be shown. Recent developments in the cross linking of the world's structural and diffraction databases will be presented as the basis for the future Total Pattern Analysis by XRPD. Copyright (1999) Australian X-ray Analytical Association Inc

  19. Minimally invasive computer-navigated total hip arthroplasty, following the concept of femur first and combined anteversion: design of a blinded randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Woerner Michael

    2011-08-01

    Full Text Available Abstract. Background: Impingement can be a serious complication after total hip arthroplasty (THA), and is one of the major causes of postoperative pain, dislocation, aseptic loosening, and implant breakage. Minimally invasive THA and computer-navigated surgery were introduced several years ago. We have developed a novel, computer-assisted operation method for THA following the concept of "femur first"/"combined anteversion", which incorporates various aspects of performing a functional optimization of the cup position, and comprehensively addresses range of motion (ROM) as well as cup containment and alignment parameters. Hence, the purpose of this study is to assess whether the artificial joint's ROM can be improved by this computer-assisted operation method. Second, the clinical and radiological outcome will be evaluated. Methods/Design: A registered patient- and observer-blinded randomized controlled trial will be conducted. Patients between the ages of 50 and 75 admitted for primary unilateral THA will be included. Patients will be randomly allocated to either receive minimally invasive computer-navigated "femur first" THA or the conventional minimally invasive THA procedure. Self-reported functional status and health-related quality of life (questionnaires) will be assessed both preoperatively and postoperatively. Perioperative complications will be registered. Radiographic evaluation will take place up to 6 weeks postoperatively with a computed tomography (CT) scan. Component position will be evaluated by an independent external institute on a 3D reconstruction of the femur/pelvis using image-processing software. Postoperative ROM will be calculated by an algorithm which automatically determines bony and prosthetic impingements. Discussion: In the past, computer navigation has improved the accuracy of component positioning. So far, there are only few objective data quantifying the risks and benefits of computer navigated THA. Therefore, this

  20. Computational derivation of quantum relativist electromagnetic systems with forward-backward space-time shifts

    International Nuclear Information System (INIS)

    Dubois, Daniel M.

    2000-01-01

    This paper is a continuation of our preceding paper dealing with the computational derivation of the Klein-Gordon quantum relativist equation and the Schroedinger quantum equation with forward and backward space-time shifts. The first part introduces forward and backward derivatives for discrete and continuous systems. Generalized complex discrete and continuous derivatives are deduced. The second part deduces the Klein-Gordon equation from the space-time complex continuous derivatives. These derivatives take into account forward-backward space-time shifts related to an internal phase velocity u. The internal group velocity v is related to the speed of light by u·v = c^2 and to the external group and phase velocities by u·v = v_g·v_p. Without time shift, the Schroedinger equation is deduced, with a supplementary term which could represent a reference potential. The third part deduces the quantum relativist Klein-Gordon equation for a particle in an electromagnetic field

  1. Computer games and fine motor skills.

    Science.gov (United States)

    Borecki, Lukasz; Tolstych, Katarzyna; Pokorski, Mieczyslaw

    2013-01-01

    The study seeks to determine the influence of computer games on fine motor skills in young adults, an area of incomplete understanding and verification. We hypothesized that computer gaming could have a positive influence on basic motor skills, such as precision, aiming, speed, dexterity, or tremor. We examined 30 habitual game users (F/M - 3/27; age range 20-25 years) of the highly interactive game Counter Strike, in which players impersonate soldiers on a battlefield, and 30 age- and gender-matched subjects who declared never to play games. Selected tests from the Vienna Test System were used to assess fine motor skills and tremor. The results demonstrate that the game users scored appreciably better than the control subjects in all tests employed. In particular, the players did significantly better in the precision of arm-hand movements, as expressed by a lower time of errors, 1.6 ± 0.6 vs. 2.8 ± 0.6 s, a lower error rate, 13.6 ± 0.3 vs. 20.4 ± 2.2, and a shorter total time of performing a task, 14.6 ± 2.9 vs. 32.1 ± 4.5 s in non-players, respectively (p < 0.05). These findings point to a positive influence of computer games on psychomotor functioning. We submit that playing computer games may be a useful training tool to increase fine motor skills and movement coordination.

  2. Concentration and flux of total and dissolved phosphorus, total nitrogen, chloride, and total suspended solids for monitored tributaries of Lake Champlain, 1990-2012

    Science.gov (United States)

    Medalie, Laura

    2014-01-01

    Annual and daily concentrations and fluxes of total and dissolved phosphorus, total nitrogen, chloride, and total suspended solids were estimated for 18 monitored tributaries to Lake Champlain by using the Weighted Regressions on Time, Discharge, and Seasons regression model. Estimates were made for 21 or 23 years, depending on data availability, for the purpose of providing timely and accessible summary reports as stipulated in the 2010 update to the Lake Champlain “Opportunities for Action” management plan. Estimates of concentration and flux were provided for each tributary based on (1) observed daily discharges and (2) a flow-normalizing procedure, which removed the random fluctuations of climate-related variability. The flux bias statistic, an indicator of the ability of the Weighted Regressions on Time, Discharge, and Season regression models to provide accurate representations of flux, showed acceptable bias (less than ±10 percent) for 68 out of 72 models for total and dissolved phosphorus, total nitrogen, and chloride. Six out of 18 models for total suspended solids had moderate bias (between 10 and 30 percent), an expected result given the frequently nonlinear relation between total suspended solids and discharge. One model for total suspended solids with a very high bias was influenced by a single extreme value; however, removal of that value, although reducing the bias substantially, had little effect on annual fluxes.

  3. Neither pre-operative education or a minimally invasive procedure have any influence on the recovery time after total hip replacement.

    Science.gov (United States)

    Biau, David Jean; Porcher, Raphael; Roren, Alexandra; Babinet, Antoine; Rosencher, Nadia; Chevret, Sylvie; Poiraudeau, Serge; Anract, Philippe

    2015-08-01

    The purpose of this study was to evaluate pre-operative education versus no education and mini-invasive surgery versus standard surgery to reach complete independence. We conducted a four-arm randomized controlled trial of 209 patients. The primary outcome criterion was the time to reach complete functional independence. Secondary outcomes included the operative time, the estimated total blood loss, the pain level, the dose of morphine, and the time to discharge. There was no significant effect of either education (HR: 1.1; P = 0.77) or mini-invasive surgery (HR: 1.0; 95 %; P = 0.96) on the time to reach complete independence. The mini-invasive surgery group significantly reduced the total estimated blood loss (P = 0.0035) and decreased the dose of morphine necessary for titration in the recovery (P = 0.035). Neither pre-operative education nor mini-invasive surgery reduces the time to reach complete functional independence. Mini-invasive surgery significantly reduces blood loss and the need for morphine consumption.

  4. Timing organization of a real-time multicore processor

    DEFF Research Database (Denmark)

    Schoeberl, Martin; Sparsø, Jens

    2017-01-01

    Real-time systems need a time-predictable computing platform. Computation, communication, and access to shared resources all need to be time-predictable. We use time division multiplexing to statically schedule all computation and communication resources, such as access to main memory or message passing over a network-on-chip. We use time-driven communication over an asynchronous network-on-chip to enable time division multiplexing even in a globally asynchronous, locally synchronous multicore architecture. Using time division multiplexing at all levels of the architecture yields a time-predictable multicore processor.
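    A toy illustration of the static time division multiplexing described above: access to a shared resource follows a fixed, precomputed slot table, so the worst-case wait of any core is bounded by the table length. The schedule contents are illustrative.

        # Static slot table: each entry names the core (or NoC transmitter)
        # that owns the shared resource during that slot.
        SLOT_TABLE = ["core0", "core1", "core2", "noc_tx", "core3"]

        def owner(cycle: int) -> str:
            """Owner of the shared resource in a given cycle; the worst-case
            wait for any owner is bounded by the table length."""
            return SLOT_TABLE[cycle % len(SLOT_TABLE)]

        for cycle in range(7):
            print(cycle, owner(cycle))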

  5. A note on neighborhood total domination in graphs

    Indian Academy of Sciences (India)

    [1] Arumugam S and Sivagnanam C, Neighborhood total domination in graphs, Opuscula. Mathematica 31 (2011) 519–531. [2] Chellali M and Haynes T W, A note on the total domination number of a tree, J. Combin. Math. Combin. Comput. 58 (2006) 189–193. [3] Haynes T W, Hedetniemi S T and Slater P J, Fundamentals ...

  6. Computer usage and its relationship with adolescent lifestyle in Hong Kong.

    Science.gov (United States)

    Ho, S M; Lee, T M

    2001-10-01

    To determine the patterns of computer usage among adolescents in Hong Kong and to examine whether computer usage is associated with less physical activity and social support among adolescents. A total of 2110 secondary school students (52% boys and 48% girls) in Hong Kong completed a set of questionnaires to measure their computer usage and lifestyle. Mean age of the respondents was 14.16 years (SD = 1.81 years). Computer usage was tapped by asking the students to indicate how much time (in minutes) they spent on the computer each day for doing homework assignments; playing computer games; "surfing" the Internet; and communicating with others. The students also provided information on their social-physical lifestyle. Student's t-tests and analysis of variance were used to examine group differences. Pearson product moment correlations were used to explore relationships between computer usage and lifestyle. Boys who use computers for doing homework, "surfing" the Internet, and communicating with others engage in more social-physical activities than others. Boys who use computers to play games tend to be more social-behaviorally inactive. For girls, patterns of computer usage are not related to lifestyle. Computer users tended to engage in social-physical activities more frequently and had higher social support than nonusers. But among computer users, the amount of time spent daily on the computer was not associated with lifestyle. Instead, patterns of computer usage are more related to lifestyle and the relationship is moderated by gender.

  7. Kajian dan Implementasi Real Time Operating System pada Single Board Computer Berbasis Arm

    Directory of Open Access Journals (Sweden)

    Wiedjaja A

    2014-06-01

    Full Text Available An operating system is an important piece of software in a computer system. For personal and office use, a general-purpose operating system is sufficient. However, mission-critical applications such as nuclear power plants and braking systems on cars (automatic braking systems), which need a high level of reliability, require an operating system that operates in real time. The study aims to assess the implementation of a Linux-based operating system on an ARM-based Single Board Computer (SBC), namely the Pandaboard ES with a dual-core ARM Cortex-A9 (TI OMAP 4460). Research was conducted by implementing the general-purpose OS Ubuntu 12.04 OMAP4-armhf and the RTOS Linux 3.4.0-rt17+ on the PandaBoard ES, and then comparing the latency of each OS under no-load and full-load conditions. The results show that the maximum latency of the RTOS under full load is 45 µs, much smaller than the maximum value of the GPOS under full load of 17,712 µs. The lower latency demonstrates that the RTOS is much better able than the GPOS to run processes within a guaranteed period of time.

  8. Observed and simulated time evolution of HCl, ClONO2, and HF total column abundances

    Directory of Open Access Journals (Sweden)

    B.-M. Sinnhuber

    2012-04-01

    Full Text Available Time series of total column abundances of hydrogen chloride (HCl, chlorine nitrate (ClONO2, and hydrogen fluoride (HF were determined from ground-based Fourier transform infrared (FTIR spectra recorded at 17 sites belonging to the Network for the Detection of Atmospheric Composition Change (NDACC and located between 80.05° N and 77.82° S. By providing such a near-global overview on ground-based measurements of the two major stratospheric chlorine reservoir species, HCl and ClONO2, the present study is able to confirm the decrease of the atmospheric inorganic chlorine abundance during the last few years. This decrease is expected following the 1987 Montreal Protocol and its amendments and adjustments, where restrictions and a subsequent phase-out of the prominent anthropogenic chlorine source gases (solvents, chlorofluorocarbons were agreed upon to enable a stabilisation and recovery of the stratospheric ozone layer. The atmospheric fluorine content is expected to be influenced by the Montreal Protocol, too, because most of the banned anthropogenic gases also represent important fluorine sources. But many of the substitutes to the banned gases also contain fluorine so that the HF total column abundance is expected to have continued to increase during the last few years. The measurements are compared with calculations from five different models: the two-dimensional Bremen model, the two chemistry-transport models KASIMA and SLIMCAT, and the two chemistry-climate models EMAC and SOCOL. Thereby, the ability of the models to reproduce the absolute total column amounts, the seasonal cycles, and the temporal evolution found in the FTIR measurements is investigated and inter-compared. This is especially interesting because the models have different architectures. The overall agreement between the measurements and models for the total column abundances and the seasonal cycles is good. Linear trends of HCl, ClONO2, and HF are calculated from both

  9. Computer Aided Continuous Time Stochastic Process Modelling

    DEFF Research Database (Denmark)

    Kristensen, N.R.; Madsen, Henrik; Jørgensen, Sten Bay

    2001-01-01

    A grey-box approach to process modelling that combines deterministic and stochastic modelling is advocated for identification of models for model-based control of batch and semi-batch processes. A computer-aided tool designed for supporting decision-making within the corresponding modelling cycle...

  10. A time and imaging cost analysis of low-risk ED observation patients: a conservative 64-section computed tomography coronary angiography "triple rule-out" compared to nuclear stress test strategy.

    Science.gov (United States)

    Takakuwa, Kevin M; Halpern, Ethan J; Shofer, Frances S

    2011-02-01

    The study aimed to examine time and imaging costs of 2 different imaging strategies for low-risk emergency department (ED) observation patients with acute chest pain or symptoms suggestive of acute coronary syndrome. We compared a "triple rule-out" (TRO) 64-section multidetector computed tomography protocol with nuclear stress testing. This was a prospective observational cohort study of consecutive ED patients who were enrolled in our chest pain observation protocol during a 16-month period. Our standard observation protocol included a minimum of 2 sets of cardiac enzymes at least 6 hours apart followed by a nuclear stress test. Once a week, observation patients were offered a TRO (to evaluate for coronary artery disease, thoracic dissection, and pulmonary embolus) multidetector computed tomography with the option of further stress testing for those patients found to have evidence of coronary artery disease. We analyzed 832 consecutive observation patients including 214 patients who underwent the TRO protocol. Mean total length of stay was 16.1 hours for TRO patients, 16.3 hours for TRO plus other imaging test, 22.6 hours for nuclear stress testing, 23.3 hours for nuclear stress testing plus other imaging tests, and 23.7 hours for nuclear stress testing plus TRO (P < .0001 for TRO and TRO + other test compared to stress test ± other test). Mean imaging times were 3.6, 4.4, 5.9, 7.5, and 6.6 hours, respectively (P < .05 for TRO and TRO + other test compared to stress test ± other test). Mean imaging costs were $1307 for TRO patients vs $945 for nuclear stress testing. Triple rule-out reduced total length of stay and imaging time but incurred higher imaging costs. A per-hospital analysis would be needed to determine if patient time savings justify the higher imaging costs. Copyright © 2011 Elsevier Inc. All rights reserved.

  11. Feasibility study of helical tomotherapy for total body or total marrow irradiation

    International Nuclear Information System (INIS)

    Hui, Susanta K.; Kapatoes, Jeff; Fowler, Jack; Henderson, Douglas; Olivera, Gustavo; Manon, Rafael R.; Gerbi, Bruce; Mackie, T. R.; Welsh, James S.

    2005-01-01

    Total body irradiation (TBI) has been used for many years as a preconditioning agent before bone marrow transplantation. Many side effects still plague its use. We investigated the planning and delivery of total body irradiation (TBI) and selective total marrow irradiation (TMI) with a reduced radiation dose to sensitive structures using image-guided helical tomotherapy. To assess the feasibility of using helical tomotherapy, (A) we studied variations in pitch, field width, and modulation factor on total body and total marrow helical tomotherapy treatments. We varied these parameters to provide a uniform dose along with treatment times similar to conventional TBI (15-30 min). (B) We also investigated limited (head, chest, and pelvis) megavoltage CT (MVCT) scanning for three-dimensional pretreatment setup verification rather than total body MVCT scanning, to shorten the overall treatment time per treatment fraction. (C) We placed thermoluminescent detectors (TLDs) inside a Rando phantom to measure the dose at seven anatomical sites, including the lungs. A simulated TBI treatment showed homogeneous dose coverage (±10%) to the whole body. Doses to the sensitive organs were reduced by 35%-70% of the target dose. TLD measurements on Rando showed an accurate dose delivery (±7%) to the target and critical organs. In the TMI study, the dose was delivered conformally to the bone marrow only. The TBI and TMI treatment delivery time was reduced (by 50%) by increasing the field width from 2.5 to 5.0 cm in the inferior-superior direction. A limited MVCT reduced the target localization time by 60% compared to whole body MVCT. MVCT image-guided helical tomotherapy offers a novel method to deliver a precise, homogeneous radiation dose to the whole body target while significantly reducing the dose to all critical organs. A judicious selection of pitch, modulation factor, and field size is required to produce a homogeneous dose distribution along with an acceptable treatment time. In

  12. Finite difference time domain calculation of three-dimensional phononic band structures using a postprocessing method based on the filter diagonalization

    International Nuclear Information System (INIS)

    Su Xiaoxing; Ma Tianxue; Wang Yuesheng

    2011-01-01

    If the band structure of a three-dimensional (3D) phononic crystal (PNC) is calculated by using the finite difference time domain (FDTD) method combined with the fast Fourier transform (FFT)-based postprocessing method, good results can only be ensured by a sufficiently large number of FDTD iterations. On a common computer platform, the total computation time will be very long. To overcome this difficulty, an excellent harmonic inversion algorithm called the filter diagonalization method (FDM) can be used in the postprocessing to reduce the number of FDTD iterations. However, the low efficiency of the FDM, which occurs when a relatively long time series is given, does not necessarily ensure an effective reduction of the total computation time. In this paper, a postprocessing method based on the FDM is proposed. The main procedure of the method is designed considering the aim to make the time spent on the method itself far less than the corresponding time spent on the FDTD iterations. To this end, the FDTD time series is preprocessed to be shortened significantly before the FDM frequency extraction. The preprocessing procedure is performed with the filter and decimation operations, which are widely used in narrow-band signal processing. Numerical results for a typical 3D solid PNC system show that the proposed postprocessing method can be used to effectively reduce the total computation time of the FDTD calculation of 3D phononic band structures.
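    The preprocessing step described above, low-pass filtering and decimating the FDTD time series before handing it to the harmonic inversion, might look like the following scipy sketch; the synthetic two-mode signal and the decimation factor are illustrative.

        import numpy as np
        from scipy.signal import decimate

        dt = 1e-9                              # FDTD time step (illustrative)
        t = np.arange(200_000) * dt
        # Synthetic two-mode "FDTD record" standing in for a real field probe.
        series = (np.sin(2 * np.pi * 1.2e6 * t)
                  + 0.5 * np.sin(2 * np.pi * 3.4e6 * t))

        # Low-pass filter and downsample by 10: the harmonic inversion (FDM)
        # then works on 20,000 samples instead of 200,000.
        short = decimate(series, q=10, ftype="fir", zero_phase=True)
        print(len(series), "->", len(short))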

  13. Finite difference time domain calculation of three-dimensional phononic band structures using a postprocessing method based on the filter diagonalization

    Energy Technology Data Exchange (ETDEWEB)

    Su Xiaoxing [School of Electronic and Information Engineering, Beijing Jiaotong University, Beijing 100044 (China); Ma Tianxue; Wang Yuesheng, E-mail: xxsu@bjtu.edu.cn [Institute of Engineering Mechanics, Beijing Jiaotong University, Beijing 100044 (China)

    2011-10-15

    If the band structure of a three-dimensional (3D) phononic crystal (PNC) is calculated by using the finite difference time domain (FDTD) method combined with the fast Fourier transform (FFT)-based postprocessing method, good results can only be ensured by a sufficiently large number of FDTD iterations. On a common computer platform, the total computation time will be very long. To overcome this difficulty, an excellent harmonic inversion algorithm called the filter diagonalization method (FDM) can be used in the postprocessing to reduce the number of FDTD iterations. However, the low efficiency of the FDM, which occurs when a relatively long time series is given, does not necessarily ensure an effective reduction of the total computation time. In this paper, a postprocessing method based on the FDM is proposed. The main procedure of the method is designed considering the aim to make the time spent on the method itself far less than the corresponding time spent on the FDTD iterations. To this end, the FDTD time series is preprocessed to be shortened significantly before the FDM frequency extraction. The preprocessing procedure is performed with the filter and decimation operations, which are widely used in narrow-band signal processing. Numerical results for a typical 3D solid PNC system show that the proposed postprocessing method can be used to effectively reduce the total computation time of the FDTD calculation of 3D phononic band structures.

  14. Cluster Computing For Real Time Seismic Array Analysis.

    Science.gov (United States)

    Martini, M.; Giudicepietro, F.

    A seismic array is an instrument composed of a dense distribution of seismic sensors that allows measurement of the directional properties of the wavefield (slowness or wavenumber vector) radiated by a seismic source. Over the last years, arrays have been widely used in different fields of seismological research. In particular, they are applied in the investigation of seismic sources on volcanoes, where they can be successfully used for studying the volcanic microtremor and long-period events, which are critical for getting information on the evolution of volcanic systems. For this reason, arrays could be usefully employed for volcano monitoring; however, the huge amount of data produced by this type of instrument and the quite time-consuming processing techniques have limited their potential for this application. In order to favor a direct application of array techniques to continuous volcano monitoring, we designed and built a small PC cluster able to compute in near real time the kinematic properties of the wavefield (slowness or wavenumber vector) produced by local seismic sources. The cluster is composed of 8 Intel Pentium-III bi-processor PCs working at 550 MHz, and has 4 gigabytes of RAM memory. It runs under the Linux operating system. The developed analysis software package is based on the Multiple Signal Classification (MUSIC) algorithm and is written in Fortran. The message-passing part is based upon the LAM programming environment package, an open-source implementation of the Message Passing Interface (MPI). The developed software system includes modules devoted to receiving data over the internet and graphical applications for continuously displaying the processing results. The system has been tested with a data set collected during a seismic experiment conducted on Etna in 1999, when two dense seismic arrays were deployed on the northeast and southeast flanks of the volcano. A real time continuous acquisition system has been simulated by

  15. Evaluation of computed tomography coronary angiography in patients with a high heart rate using 16-slice spiral computed tomography with 0.37-s gantry rotation time

    International Nuclear Information System (INIS)

    Zhang, Shi-Zheng; Hu, Xiu-Hua; Zhang, Qiao-Wei; Huang, Wen-Xin

    2005-01-01

    The aim of our study is to evaluate computed tomography (CT) coronary angiography in patients with a high heart rate using 16-slice spiral CT with 0.37-s gantry rotation time. We compare the image quality of patients whose heart rates were over 70 beats per minute (bpm) with that of patients whose heart rates were 70 bpm or less. Sixty patients with various heart rates underwent retrospectively ECG-gated multislice spiral CT (MSCT) coronary angiography. Two experienced observers who were blind to the heart rates of the patients evaluated all the MSCT coronary angiographic images and calculated the assessable segments. A total of 620 out of 891 (69.6%) segments were satisfactorily visualized. On average, 10.3 coronary artery segments per patient could be evaluated. In 36 patients whose heart rates were below 70 bpm [mean 62.2 bpm ± 5.32 (standard deviation, SD)], the number of assessable segments was 10.72 ± 2.02 (SD). In the other 24 patients whose heart rates were above 70 bpm [mean 78.6 bpm ± 8.24 (SD)], the corresponding number was 9.75 ± 1.74 (SD). No statistically significant difference was found between the two subgroups (t test, P > 0.05). The new generation of 16-slice spiral CT with 0.37-s rotation time can satisfactorily evaluate the coronary arteries of patients with high heart rates (above 70 bpm, up to 102 bpm). (orig.)

  16. Dosimetric evaluation of total marrow irradiation using 2 different planning systems

    International Nuclear Information System (INIS)

    Nalichowski, Adrian; Eagle, Don G.; Burmeister, Jay

    2016-01-01

    This study compared 2 different treatment planning systems (TPSs) for quality and efficiency of total marrow irradiation (TMI) plans. The TPSs used in this study were VOxel-Less Optimization (VoLO) (Accuray Inc, Sunnyvale, CA) using helical dose delivery on a Tomotherapy Hi-Art treatment unit and Eclipse (Varian Medical Systems Inc, Palo Alto, CA) using volumetric modulated arc therapy (VMAT) dose delivery on a Varian iX treatment unit. A total dose of 1200 cGy was prescribed to cover 95% of the planning target volume (PTV). The plans were optimized and calculated based on a single CT data and structure set using the Alderson Rando phantom (The Phantom Laboratory, Salem, NY) and physician contoured target and organ at risk (OAR) volumes. The OARs were lungs, heart, liver, kidneys, brain, and small bowel. The plans were evaluated based on plan quality, time to optimize the plan and calculate the dose, and beam on time. The resulting mean and maximum doses to the PTV were 1268 and 1465 cGy for VoLO and 1284 and 1541 cGy for Eclipse, respectively. For 5 of 6 OAR structures the VoLO system achieved lower mean and D10 doses ranging from 22% to 52% and 3% to 44%, respectively. Total computational time including only optimization and dose calculation were 0.9 hours for VoLO and 3.8 hours for Eclipse. These times do not include user-dependent target delineation and field setup. Both planning systems are capable of creating high-quality plans for total marrow irradiation. The VoLO planning system was able to achieve more uniform dose distribution throughout the target volume and steeper dose fall off, resulting in superior OAR sparing. VoLO's graphics processing unit (GPU)–based optimization and dose calculation algorithm also allowed much faster creation of TMI plans.

  17. Dosimetric evaluation of total marrow irradiation using 2 different planning systems

    Energy Technology Data Exchange (ETDEWEB)

    Nalichowski, Adrian, E-mail: nalichoa@karmanos.org [Karmanos Cancer Center, Detroit, MI (United States); Eagle, Don G. [Wayne State University School of Medicine, Detroit, MI (United States); Burmeister, Jay [Karmanos Cancer Center, Detroit, MI (United States); Wayne State University School of Medicine, Detroit, MI (United States)

    2016-10-01

    This study compared 2 different treatment planning systems (TPSs) for quality and efficiency of total marrow irradiation (TMI) plans. The TPSs used in this study were VOxel-Less Optimization (VoLO) (Accuray Inc, Sunnyvale, CA) using helical dose delivery on a Tomotherapy Hi-Art treatment unit and Eclipse (Varian Medical Systems Inc, Palo Alto, CA) using volumetric modulated arc therapy (VMAT) dose delivery on a Varian iX treatment unit. A total dose of 1200 cGy was prescribed to cover 95% of the planning target volume (PTV). The plans were optimized and calculated based on a single CT dataset and structure set using the Alderson Rando phantom (The Phantom Laboratory, Salem, NY) and physician-contoured target and organ at risk (OAR) volumes. The OARs were lungs, heart, liver, kidneys, brain, and small bowel. The plans were evaluated based on plan quality, time to optimize the plan and calculate the dose, and beam-on time. The resulting mean and maximum doses to the PTV were 1268 and 1465 cGy for VoLO and 1284 and 1541 cGy for Eclipse, respectively. For 5 of 6 OAR structures the VoLO system achieved lower mean and D10 doses, by 22% to 52% and 3% to 44%, respectively. Total computational time, including only optimization and dose calculation, was 0.9 hours for VoLO and 3.8 hours for Eclipse. These times do not include user-dependent target delineation and field setup. Both planning systems are capable of creating high-quality plans for total marrow irradiation. The VoLO planning system was able to achieve a more uniform dose distribution throughout the target volume and steeper dose fall-off, resulting in superior OAR sparing. VoLO's graphics processing unit (GPU)-based optimization and dose calculation algorithm also allowed much faster creation of TMI plans.

  18. Effects of Computer Course on Computer Self-Efficacy, Computer Attitudes and Achievements of Young Individuals in Siirt, Turkey

    Science.gov (United States)

    Çelik, Halil Coskun

    2015-01-01

    The purpose of this study is to investigate the effects of computer courses on young individuals' computer self-efficacy, attitudes and achievement. The study group of this research included 60 unemployed young individuals (18-25 ages) in total; 30 in the experimental group and 30 in the control group. An experimental research model with pretest…

  19. Cloud Compute for Global Climate Station Summaries

    Science.gov (United States)

    Baldwin, R.; May, B.; Cogbill, P.

    2017-12-01

    Global Climate Station Summaries are simple indicators of observational normals which include climatic data summarizations and frequency distributions. These are typically statistical analyses of station data over 5-, 10-, 20-, 30-year or longer time periods. The summaries are computed from the global surface hourly dataset. This dataset, totaling over 500 gigabytes, comprises 40 different types of weather observations from 20,000 stations worldwide. NCEI and the U.S. Navy developed these value-added products in the form of hourly summaries from many of these observations. Enabling this compute functionality in the cloud is the focus of the project. An overview of the approach and the challenges associated with transitioning the application to the cloud will be presented.
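
    These station summaries reduce to grouped statistics over long hourly series. Below is a minimal sketch of a 30-year monthly temperature normal using pandas; the station identifier, column names, and synthetic observations are assumptions, not the actual global surface hourly schema:

      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(1)
      hours = pd.date_range("1991-01-01", "2020-12-31 23:00", freq="h")
      obs = pd.DataFrame({
          "station": "725300-94846",  # hypothetical station identifier
          "temp_c": 10 + 12 * np.sin(2 * np.pi * hours.dayofyear / 365.25)
                    + rng.normal(0, 3, len(hours)),
      }, index=hours)

      # Monthly normals: average every hourly observation by calendar month.
      normals = obs.groupby(obs.index.month)["temp_c"].mean().round(1)
      print(normals)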

  20. Total body height estimation using sacrum height in Anatolian Caucasians: multidetector computed tomography-based virtual anthropometry

    International Nuclear Information System (INIS)

    Karakas, Hakki Muammer; Celbis, Osman; Harma, Ahmet; Alicioglu, Banu

    2011-01-01

    Estimation of total body height is a major step when a subject has to be identified from his/her skeletal structures. In the presence of decomposed skeletons and missing bones, estimation is usually based on regression equations for intact long bones. If these bones are fragmented or missing, alternative structures must be used. In this study, the value of sacrum height (SH) in total body height (TBH) estimation was investigated in a contemporary population of adult Anatolian Caucasians. Sixty-six men (41.6 ± 14.9 years) and 43 women (41.1 ± 14.2 years) were scanned with 64-row multidetector computed tomography (MDCT) to obtain high-resolution anthropometric data. SH of midsagittal sections was electronically measured. The technique and methodology were validated on a standard skeletal model. Sacrum height was 111.2 ± 12.6 mm (77-138 mm) in men and 104.7 ± 8.2 mm (89-125 mm) in women. The difference between the two sexes regarding SH was significant (p < 0.0001). SH did not significantly correlate with age in men, whereas the correlation was significant in women (p < 0.03). The correlation between SH and the stature was significant in men (r = 0.427, p < 0.0001) and was insignificant in women. For men the regression equation was [Stature = (0.306 x SH) + 137.9] (r = 0.54, SEE = 56.9, p < 0.0001). Sacrum height differs between the sexes, but in men it is not susceptible to age. In the presence of incomplete male skeletons, SH helps to determine the stature. This study is also one of the initial applications of MDCT in virtual anthropometric research. (orig.)
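
    As a worked example of the male regression equation above, applied to the reported mean male sacrum height; the units are an assumption (SH in mm, stature in cm), chosen because they yield a plausible adult stature:

      # Stature = 0.306 * SH + 137.9, for men (from the abstract above).
      # Unit assumption: SH in mm, stature in cm (not stated explicitly).
      def estimate_stature_cm(sacrum_height_mm: float) -> float:
          return 0.306 * sacrum_height_mm + 137.9

      print(f"{estimate_stature_cm(111.2):.1f} cm")  # about 171.9 cm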

  1. Total body height estimation using sacrum height in Anatolian Caucasians: multidetector computed tomography-based virtual anthropometry

    Energy Technology Data Exchange (ETDEWEB)

    Karakas, Hakki Muammer [Inonu University Medical Faculty, Turgut Ozal Medical Center, Department of Radiology, Malatya (Turkey); Celbis, Osman [Inonu University Medical Faculty Turgut Ozal Medical Center, Department of Forensic Medicine, Malatya (Turkey); Harma, Ahmet [Inonu University Medical Faculty Turgut Ozal Medical Center, Department of Orthopaedics and Traumatology, Malatya (Turkey); Alicioglu, Banu [Trakya University Medical Faculty, Department of Radiology, Edirne (Turkey); Trakya University Health Sciences Institute, Department of Anatomy, Edirne (Turkey)

    2011-05-15

    Estimation of total body height is a major step when a subject has to be identified from his/her skeletal structures. In the presence of decomposed skeletons and missing bones, estimation is usually based on regression equations for intact long bones. If these bones are fragmented or missing, alternative structures must be used. In this study, the value of sacrum height (SH) in total body height (TBH) estimation was investigated in a contemporary population of adult Anatolian Caucasians. Sixty-six men (41.6 ± 14.9 years) and 43 women (41.1 ± 14.2 years) were scanned with 64-row multidetector computed tomography (MDCT) to obtain high-resolution anthropometric data. SH of midsagittal sections was electronically measured. The technique and methodology were validated on a standard skeletal model. Sacrum height was 111.2 ± 12.6 mm (77-138 mm) in men and 104.7 ± 8.2 mm (89-125 mm) in women. The difference between the two sexes regarding SH was significant (p < 0.0001). SH did not significantly correlate with age in men, whereas the correlation was significant in women (p < 0.03). The correlation between SH and the stature was significant in men (r = 0.427, p < 0.0001) and was insignificant in women. For men the regression equation was [Stature = (0.306 x SH) + 137.9] (r = 0.54, SEE = 56.9, p < 0.0001). Sacrum height differs between the sexes, but in men it is not susceptible to age. In the presence of incomplete male skeletons, SH helps to determine the stature. This study is also one of the initial applications of MDCT in virtual anthropometric research. (orig.)

  2. Geographic Location of a Computer Node Examining a Time-to-Location Algorithm and Multiple Autonomous System Networks

    National Research Council Canada - National Science Library

    Sorgaard, Duane

    2004-01-01

    .... A time-to-location algorithm can successfully resolve a geographic location of a computer node using only latency information from known sites and mathematically calculating the Euclidean distance...
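
    A minimal sketch of the latency-based idea described above: round-trip times from known landmark sites are converted into distance estimates, and the node's position is taken as the point whose Euclidean distances best match them. The landmark coordinates, latencies, and propagation factor are illustrative assumptions:

      import numpy as np
      from scipy.optimize import least_squares

      landmarks = np.array([[0.0, 0.0], [900.0, 0.0], [0.0, 900.0]])  # km, hypothetical
      rtt_ms = np.array([8.0, 6.5, 9.2])   # measured round-trip times, hypothetical
      km_per_ms = 100.0                    # assumed effective propagation factor
      dist_est = rtt_ms / 2 * km_per_ms    # one-way latency -> distance estimate

      def residuals(p):
          return np.linalg.norm(landmarks - p, axis=1) - dist_est

      result = least_squares(residuals, x0=landmarks.mean(axis=0))
      print("estimated position (km):", result.x.round(1))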

  3. Computer-based anthropometrical system for total body irradiation.

    Science.gov (United States)

    Sánchez-Nieto, B; Sánchez-Doblado, F; Terrón, J A; Arráns, R; Errazquin, L

    1997-05-01

    For total body irradiation (TBI) dose calculation requirements, anatomical information about the whole body is needed. Despite the fact that video image grabbing techniques are used by some treatment planning systems for standard radiotherapy, there are no such systems designed to generate anatomical parameters for TBI planning. The paper describes an anthropometrical computerised system based on video image grabbing which was purpose-built to provide anatomical data for a PC-based TBI planning system. Using software, the system controls the acquisition and digitalisation of the images (external images of the patient in treatment position) and the measurement procedure itself (on the external images or the digital CT information). An ASCII file, readable by the TBI planning system, is generated to store the required parameters of the dose calculation points, i.e. depth, backscatter tissue thickness, thickness of inhomogeneity, off-axis distance (OAD) and source to skin distance (SSD).
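
    The abstract names the parameters stored for each dose calculation point but not the file layout, so the ASCII format sketched below is purely hypothetical, illustrating the kind of file such a system might hand to a TBI planning program:

      # Hypothetical ASCII parameter file for TBI dose calculation points.
      # Field names and layout are illustrative; only the parameter list
      # (depth, backscatter, inhomogeneity, OAD, SSD) comes from the abstract.
      from dataclasses import dataclass

      @dataclass
      class CalcPoint:
          name: str
          depth_cm: float          # depth of the calculation point
          backscatter_cm: float    # backscatter tissue thickness
          inhom_cm: float          # thickness of inhomogeneity (e.g., lung)
          oad_cm: float            # off-axis distance
          ssd_cm: float            # source-to-skin distance

      points = [CalcPoint("umbilicus", 11.2, 9.8, 0.0, 4.5, 400.0),
                CalcPoint("lung", 8.4, 7.1, 6.2, 18.0, 401.5)]

      with open("tbi_points.txt", "w") as f:
          f.write("# name depth backscatter inhom OAD SSD (cm)\n")
          for p in points:
              f.write(f"{p.name} {p.depth_cm} {p.backscatter_cm} "
                      f"{p.inhom_cm} {p.oad_cm} {p.ssd_cm}\n")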

  4. Development of a computer program for drop time and impact velocity of the rod cluster control assembly

    International Nuclear Information System (INIS)

    Choi, K.-S.; Yim, J.-S.; Kim, I.-K.; Kim, K.-T.

    1993-01-01

    In PWRs the rod cluster control assembly (RCCA) for shutdown is released upon the action of the control rod drive mechanism and falls down through the guide thimble under its own weight. Drop time and impact velocity of the RCCA are two key parameters with respect to reactivity insertion time and the mechanical integrity of the fuel assembly. Therefore, precise control of the drop time and impact velocity is a prerequisite for modifying the existing design features of the RCCA and guide thimble or newly designing them. During its fall into the core, the RCCA is retarded by various forces acting on it, such as flow resistance and friction caused by the RCCA movement, buoyancy, and mechanical friction caused by contact with the inner surface of the guide thimble. However, the complicated coupling of these forces makes it difficult to derive an analytical dynamic equation for the drop time and impact velocity. This paper deals with the development of a computer program containing an analytical dynamic equation applicable to the Korean Fuel Assembly (KOFA) loaded in Korean nuclear power plants. The computer program is benchmarked against available single control rod drop tests. Since the predicted values are in good agreement with the test results, the computer program developed in this paper can be employed to modify the existing design features of the RCCA and guide thimble and to develop new design features for advanced nuclear reactors. (author)
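
    A minimal sketch of the kind of dynamic model involved: the assembly falls under gravity while retarded by buoyancy and a velocity-dependent hydraulic resistance, and integrating the equation of motion yields drop time and impact velocity. All coefficients are illustrative assumptions, not KOFA data:

      # m*dv/dt = m*g - F_buoyancy - c*v^2, integrated until the travel
      # distance is covered; coefficients are assumed for illustration.
      mass = 100.0          # kg, assumed RCCA mass
      g = 9.81              # m/s^2
      buoyancy = 150.0      # N, assumed buoyant force
      drag_coeff = 40.0     # N/(m/s)^2, lumped hydraulic resistance, assumed
      drop_height = 3.6     # m, assumed travel distance

      t, v, x, dt = 0.0, 0.0, 0.0, 1e-4
      while x < drop_height:
          accel = (mass * g - buoyancy - drag_coeff * v * v) / mass
          v += accel * dt   # semi-implicit Euler step
          x += v * dt
          t += dt

      print(f"drop time ~ {t:.2f} s, impact velocity ~ {v:.2f} m/s")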

  5. Computer versus paper--does it make any difference in test performance?

    Science.gov (United States)

    Karay, Yassin; Schauber, Stefan K; Stosch, Christoph; Schüttpelz-Brauns, Katrin

    2015-01-01

    CONSTRUCT: In this study, we examine the differences in test performance between the paper-based and the computer-based version of the Berlin formative Progress Test. In this context it is the first study that allows controlling for students' prior performance. Computer-based tests make possible a more efficient examination procedure for test administration and review. Although university staff will benefit greatly from computer-based tests, the question arises whether computer-based tests influence students' test performance. A total of 266 German students from the 9th and 10th semester of medicine (comparable with the 4th-year North American medical school schedule) participated in the study (paper = 132, computer = 134). The allocation of the test format was conducted as a randomized matched-pair design in which students were first sorted according to their prior test results. The organizational procedure, the examination conditions, the room and seating arrangements, as well as the order of questions and answers, were identical in both groups. The sociodemographic variables and pretest scores of both groups were comparable. The test results from the paper and computer versions did not differ. Both groups remained within the allotted time, but students using the computer version (particularly the high performers) needed significantly less time to complete the test. In addition, we found significant differences in guessing behavior: low performers using the computer version guessed significantly more than low-performing students using the paper-pencil version. Participants in computer-based tests are not at a disadvantage in terms of their test results. The computer-based test required less processing time. The longer processing time for the paper-pencil version might be due to the time needed to write the answer down and to check that it was transferred correctly. It is still not known why students using the computer version (particularly low…

  6. The Effectiveness of a Web-Based Computer-Tailored Intervention on Workplace Sitting: A Randomized Controlled Trial.

    Science.gov (United States)

    De Cocker, Katrien; De Bourdeaudhuij, Ilse; Cardon, Greet; Vandelanotte, Corneel

    2016-05-31

    Effective interventions to influence workplace sitting are needed, as office-based workers demonstrate high levels of continued sitting, and sitting too much is associated with adverse health effects. Therefore, we developed a theory-driven, Web-based, interactive, computer-tailored intervention aimed at reducing and interrupting sitting at work. The objective of our study was to investigate the effects of this intervention on objectively measured sitting time, standing time, and breaks from sitting, as well as self-reported context-specific sitting among Flemish employees in a field-based approach. Employees (n=213) participated in a 3-group randomized controlled trial that assessed outcomes at baseline, 1-month follow-up, and 3-month follow-up through self-reports. A subsample (n=122) were willing to wear an activity monitor (activPAL) from Monday to Friday. The tailored group received an automated Web-based, computer-tailored intervention including personalized feedback and tips on how to reduce or interrupt workplace sitting. The generic group received automated Web-based generic advice with tips. The control group was a wait-list control condition, initially receiving no intervention. Intervention effects were tested with repeated-measures multivariate analysis of variance. The tailored intervention was successful in decreasing self-reported total workday sitting and self-reported leisure time sitting (time × group: P=.03), and in increasing objectively measured breaks at work (time × group: P=.07); this was not the case in the other conditions. The changes in self-reported total nonworkday sitting, sitting during transport, television viewing, and personal computer use, objectively measured total sitting time, and sitting and standing time at work did not differ between conditions. Our results point out the significance of computer tailoring for sedentary behavior and its potential use in public health promotion, as the effects of the tailored condition…

  7. Using Just-in-Time Information to Support Scientific Discovery Learning in a Computer-Based Simulation

    Science.gov (United States)

    Hulshof, Casper D.; de Jong, Ton

    2006-01-01

    Students encounter many obstacles during scientific discovery learning with computer-based simulations. It is hypothesized that an effective type of support, one that does not interfere with the scientific discovery learning process, should be delivered on a "just-in-time" basis. This study explores the effect of facilitating access to…

  8. Copyright and Computer Generated Materials – Is it Time to Reboot the Discussion About Authorship?

    Directory of Open Access Journals (Sweden)

    Anne Fitzgerald

    2013-12-01

    Computer generated materials are ubiquitous and we encounter them on a daily basis, even though most people are unaware that this is the case. Blockbuster movies, television weather reports and telephone directories all include material that is produced by utilising computer technologies. Copyright protection for materials generated by a programmed computer was considered by the Federal Court and Full Court of the Federal Court in Telstra Corporation Limited v Phone Directories Company Pty Ltd. The court held that the White and Yellow pages telephone directories produced by Telstra and its subsidiary, Sensis, were not protected by copyright because they were computer-generated works which lacked the requisite human authorship. The Copyright Act 1968 (Cth) does not contain specific provisions on the subsistence of copyright in computer-generated materials. Although the issue of copyright protection for computer-generated materials has been examined in Australia on two separate occasions by independently-constituted Copyright Law Review Committees over a period of 10 years (1988 to 1998), the Committees' recommendations for legislative clarification by the enactment of specific amendments to the Copyright Act have not yet been implemented and the legal position remains unclear. In the light of the decision of the Full Federal Court in Telstra v Phone Directories it is timely to consider whether specific provisions should be enacted to clarify the position of computer-generated works under copyright law and, in particular, whether the requirement of human authorship for original works protected under Part III of the Copyright Act should now be reconceptualised to align with the realities of how copyright materials are created in the digital era.

  9. A constructive heuristic for time-dependent multi-depot vehicle routing problem with time-windows and heterogeneous fleet

    Directory of Open Access Journals (Sweden)

    Behrouz Afshar-Nadjafi

    2017-01-01

    In this paper, we consider the time-dependent multi-depot vehicle routing problem. The objective is to minimize the total heterogeneous fleet cost, assuming that the travel time between locations depends on the departure time. In addition, hard time-window constraints for the customers and a limit on the maximum number of vehicles per depot must be satisfied. The problem is formulated as a mixed integer programming model. A constructive heuristic procedure is proposed for the problem, and its efficiency is evaluated on 180 test problems. The computational results indicate that the procedure is capable of obtaining satisfactory solutions.
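
    A minimal sketch of a constructive heuristic in the spirit of the one described above: starting from a depot, repeatedly append the cheapest customer that can still be reached within its hard time window, using a departure-time-dependent travel time. This simplification uses one vehicle and one depot; the travel-time matrix, windows, and peak-period factor are illustrative assumptions:

      # Greedy route construction with hard time windows and
      # departure-time-dependent travel times (toy data).
      def travel_time(i, j, depart, base):
          factor = 1.3 if 100 <= depart < 200 else 1.0  # assumed "peak" slowdown
          return base[i][j] * factor

      base = {0: {1: 40, 2: 60, 3: 50}, 1: {2: 30, 3: 45}, 2: {3: 25}, 3: {}}
      for i in list(base):                      # symmetrize the matrix
          for j, d in list(base[i].items()):
              base.setdefault(j, {})[i] = d

      windows = {1: (0, 150), 2: (80, 260), 3: (150, 400)}  # hard time windows

      route, now, here = [0], 0.0, 0            # start at depot 0 at time 0
      unvisited = set(windows)
      while unvisited:
          feasible = []
          for c in unvisited:
              arrive = max(now + travel_time(here, c, now, base), windows[c][0])
              if arrive <= windows[c][1]:       # respects the hard window
                  feasible.append((arrive, c))
          if not feasible:
              break                             # no insertable customer remains
          now, here = min(feasible)             # earliest-arrival customer next
          route.append(here)
          unvisited.remove(here)

      print("route:", route, "finish time:", now)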

  10. The investigation and implementation of real-time face pose and direction estimation on mobile computing devices

    Science.gov (United States)

    Fu, Deqian; Gao, Lisheng; Jhang, Seong Tae

    2012-04-01

    The mobile computing device has many limitations, such as a relatively small user interface and slow computing speed. Face pose estimation, which augmented reality often requires, can be used as an HCI and entertainment tool. For real-time implementation of head pose estimation on resource-limited mobile platforms, various constraints must be met while retaining sufficient estimation accuracy. The proposed face pose estimation method met this objective. Experimental results on a test Android mobile device delivered satisfactory real-time performance and accuracy.

  11. A Robust Computational Technique for Model Order Reduction of Two-Time-Scale Discrete Systems via Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Othman M. K. Alsmadi

    2015-01-01

    A robust computational technique for model order reduction (MOR) of multi-time-scale discrete systems (single-input single-output (SISO) and multi-input multi-output (MIMO)) is presented in this paper. This work is motivated by the singular perturbation of multi-time-scale systems, where some specific dynamics may not have significant influence on the overall system behavior. The new approach is proposed using genetic algorithms (GA), with the advantages of obtaining a reduced order model, maintaining the exact dominant dynamics in the reduced order, and minimizing the steady state error. The reduction process is performed by obtaining an upper triangular transformed matrix of the system state matrix defined in state space representation, along with the elements of the B, C, and D matrices. The GA computational procedure is based on maximizing the fitness function corresponding to the response deviation between the full and reduced order models. The proposed computational intelligence MOR method is compared to recently published work on MOR techniques, where simulation results show the potential and advantages of the new approach.
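
    A minimal sketch of the fitness idea described above: simulate the step responses of the full and reduced discrete-time models and score a candidate by the negated response deviation, which a GA would then maximize. The GA loop, the triangularizing transformation, and the matrices below are all assumptions:

      import numpy as np

      def step_response(A, B, C, D, n=50):
          # Discrete-time simulation x[k+1] = A x[k] + B u[k] with unit step u.
          x = np.zeros(A.shape[0])
          y = []
          for _ in range(n):
              y.append(C @ x + D * 1.0)
              x = A @ x + B.flatten() * 1.0
          return np.array(y).flatten()

      # Full 2nd-order model vs a candidate 1st-order reduced model (toy values).
      A_f, B_f = np.array([[0.9, 0.1], [0.0, 0.2]]), np.array([[0.0], [1.0]])
      C_f, D_f = np.array([[1.0, 0.5]]), 0.0
      A_r, B_r, C_r, D_r = np.array([[0.9]]), np.array([[0.6]]), np.array([[1.0]]), 0.0

      deviation = np.sum((step_response(A_f, B_f, C_f, D_f)
                          - step_response(A_r, B_r, C_r, D_r)) ** 2)
      fitness = -deviation  # the GA maximizes fitness, i.e., minimizes deviation
      print(f"fitness = {fitness:.4f}")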

  12. Review of quantum computation

    International Nuclear Information System (INIS)

    Lloyd, S.

    1992-01-01

    Digital computers are machines that can be programmed to perform logical and arithmetical operations. Contemporary digital computers are ''universal,'' in the sense that a program that runs on one computer can, if properly compiled, run on any other computer that has access to enough memory space and time. Any one universal computer can simulate the operation of any other; and the set of tasks that any such machine can perform is common to all universal machines. Since Bennett's discovery that computation can be carried out in a non-dissipative fashion, a number of Hamiltonian quantum-mechanical systems have been proposed whose time-evolutions over discrete intervals are equivalent to those of specific universal computers. The first quantum-mechanical treatment of computers was given by Benioff, who exhibited a Hamiltonian system with a basis whose members corresponded to the logical states of a Turing machine. In order to make the Hamiltonian local, in the sense that its structure depended only on the part of the computation being performed at that time, Benioff found it necessary to make the Hamiltonian time-dependent. Feynman discovered a way to make the computational Hamiltonian both local and time-independent by incorporating the direction of computation in the initial condition. In Feynman's quantum computer, the program is a carefully prepared wave packet that propagates through different computational states. Deutsch presented a quantum computer that exploits the possibility of existing in a superposition of computational states to perform tasks that a classical computer cannot, such as generating purely random numbers, and carrying out superpositions of computations as a method of parallel processing. In this paper, we show that such computers, by virtue of their common function, possess a common form for their quantum dynamics

  13. Time diary and questionnaire assessment of factors associated with academic and personal success among university undergraduates.

    Science.gov (United States)

    George, Darren; Dixon, Sinikka; Stansal, Emory; Gelb, Shannon Lund; Pheri, Tabitha

    2008-01-01

    A sample of 231 students attending a private liberal arts university in central Alberta, Canada, completed a 5-day time diary and a 71-item questionnaire assessing the influence of personal, cognitive, and attitudinal factors on success. The authors used 3 success measures: cumulative grade point average (GPA), Personal Success--each participant's rating of congruence between stated goals and progress toward those goals--and Total Success--a measure that weighted GPA and Personal Success equally. The greatest predictors of GPA were time-management skills, intelligence, time spent studying, computer ownership, less time spent in passive leisure, and a healthy diet. Predictors of Personal Success scores were clearly defined goals, overall health, personal spirituality, and time-management skills. Predictors of Total Success scores were clearly defined goals, time-management skills, less time spent in passive leisure, healthy diet, waking up early, computer ownership, and less time spent sleeping. Results suggest alternatives to traditional predictors of academic success.

  14. Methodology of modeling and measuring computer architectures for plasma simulations

    Science.gov (United States)

    Wang, L. P. T.

    1977-01-01

    A brief introduction is given to plasma simulation using computers and to the difficulties encountered on currently available computers. Through the use of an analyzing and measuring methodology, SARA, the control flow and data flow of a particle simulation model, REM2-1/2D, are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization can be configured to achieve the computation bound of an application problem. A sequential-type simulation model, an array/pipeline-type simulation model, and a fully parallel simulation model of the code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have an implicitly parallel nature.

  15. Computer-associated health complaints and sources of ergonomic instructions in computer-related issues among Finnish adolescents: A cross-sectional study

    Science.gov (United States)

    2010-01-01

    Background The use of computers has increased among adolescents, as have musculoskeletal symptoms. There is evidence that these symptoms can be reduced through an ergonomics approach and through education. The purpose of this study was to examine where adolescents had received ergonomic instructions related to computer use, and whether receiving these instructions was associated with a reduced prevalence of computer-associated health complaints. Methods Mailed survey with nationally representative sample of 12 to 18-year-old Finns in 2001 (n = 7292, response rate 70%). In total, 6961 youths reported using a computer. We tested the associations of computer use time and received ergonomic instructions (predictor variables) with computer-associated health complaints (outcome variables) using logistic regression analysis. Results To prevent computer-associated complaints, 61.2% reported having been instructed to arrange their desk/chair/screen in the right position, 71.5% to take rest breaks. The older age group (16-18 years) reported receiving instructions or being self-instructed more often than the 12- to 14-year-olds. However, ergonomic instructions on how to prevent computer-related musculoskeletal problems fail to reach a substantial number of children. Furthermore, the reported sources of instructions vary greatly in terms of reliability. PMID:20064250

  16. Real-time management (RTM) by cloud computing system dynamics (CCSD) for risk analysis of Fukushima nuclear power plant (NPP) accident

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Hyo Sung [Yonsei Univ., Wonju Gangwon-do (Korea, Republic of). Dept. of Radiation Convergence Engineering; Woo, Tae Ho [Yonsei Univ., Wonju Gangwon-do (Korea, Republic of). Dept. of Radiation Convergence Engineering; The Cyber Univ. of Korea, Seoul (Korea, Republic of). Dept. of Mechanical and Control Engineering

    2017-03-15

    The earthquake- and tsunami-induced accident at the Fukushima nuclear power plant (NPP) is investigated using the real-time management (RTM) method. This non-linear safety-management logic is applied to enhance methodological confidence in NPP reliability. The earthquake case study is modeled to exploit the fast-reaction characteristics of RTM. System dynamics (SD) modeling simulations and cloud computing are applied in the RTM method, where real-time simulation provides fast and effective communication for accident remediation and prevention. Current tablet computing systems can improve the safety standard of the NPP. Finally, the procedure of the cloud computing system dynamics (CCSD) modeling is constructed.

  17. Real-time management (RTM) by cloud computing system dynamics (CCSD) for risk analysis of Fukushima nuclear power plant (NPP) accident

    International Nuclear Information System (INIS)

    Cho, Hyo Sung; Woo, Tae Ho; The Cyber Univ. of Korea, Seoul

    2017-01-01

    The earthquake- and tsunami-induced accident at the Fukushima nuclear power plant (NPP) is investigated using the real-time management (RTM) method. This non-linear safety-management logic is applied to enhance methodological confidence in NPP reliability. The earthquake case study is modeled to exploit the fast-reaction characteristics of RTM. System dynamics (SD) modeling simulations and cloud computing are applied in the RTM method, where real-time simulation provides fast and effective communication for accident remediation and prevention. Current tablet computing systems can improve the safety standard of the NPP. Finally, the procedure of the cloud computing system dynamics (CCSD) modeling is constructed.
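
    A minimal stock-and-flow sketch of the system dynamics (SD) modeling style referenced above, integrated with explicit Euler steps. The stocks, rates, and hazard pulse are illustrative assumptions, not the authors' Fukushima model:

      # Two stocks (risk, mitigation) with a feedback loop: mitigation is
      # deployed in proportion to current risk, in the RTM spirit.
      risk = 1.0            # accumulated risk level (stock), arbitrary units
      mitigation = 0.0      # deployed mitigation capacity (stock)
      dt, horizon, t = 0.1, 20.0, 0.0

      while t < horizon:
          hazard_inflow = 0.3 if 5.0 <= t < 6.0 else 0.02  # assumed earthquake pulse
          risk += (hazard_inflow - 0.2 * mitigation) * dt
          risk = max(risk, 0.0)
          mitigation += 0.1 * risk * dt                    # risk-driven response
          t += dt

      print(f"final risk = {risk:.2f}, mitigation = {mitigation:.2f}")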

  18. Integer batch scheduling problems for a single-machine with simultaneous effect of learning and forgetting to minimize total actual flow time

    Directory of Open Access Journals (Sweden)

    Rinto Yusriski

    2015-09-01

    This research discusses integer batch scheduling problems for a single machine with position-dependent batch processing times due to the simultaneous effect of learning and forgetting. The decision variables are the number of batches, the batch sizes, and the sequence of the resulting batches. The objective is to minimize total actual flow time, defined as the total interval time between the arrival times of parts in all respective batches and their common due date. Two algorithms are proposed to solve the problem. The first is developed using the Integer Composition method and produces an optimal solution. Since the first algorithm solves the problem in a worst-case time complexity of O(n·2^(n-1)), this research proposes the second algorithm, a heuristic based on the Lagrange Relaxation method. Numerical experiments show that the heuristic algorithm gives outstanding results.
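
    A minimal sketch of the Integer Composition enumeration underlying the first algorithm: each way of cutting a sequence of n jobs into consecutive batches is one composition, giving 2^(n-1) candidates to score, which is why exhaustive search is O(n·2^(n-1)). The cost function below is a placeholder, not the paper's total-actual-flow-time model:

      # Enumerate all compositions of n into ordered batch sizes and pick
      # the cheapest under a stand-in objective.
      from itertools import combinations

      def compositions(n):
          for k in range(n):                        # k = number of cut points
              for cuts in combinations(range(1, n), k):
                  bounds = (0, *cuts, n)
                  yield [bounds[i + 1] - bounds[i] for i in range(len(bounds) - 1)]

      def toy_cost(batch_sizes):
          # placeholder objective favoring balanced batches (illustrative only)
          return sum(size * size for size in batch_sizes) + 2 * len(batch_sizes)

      n = 6
      best = min(compositions(n), key=toy_cost)
      print("best batch sizes:", best, "cost:", toy_cost(best))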

  19. ES-5052 storage devices on magnetic disks for the BESM-4 computer

    International Nuclear Information System (INIS)

    Bezrukova, N.B.; Vinogradov, A.F.; Eliseev, G.N.; Ivanchenko, Z.M.; Pervushov, V.I.; Samojlov, V.N.; Stuk, G.P.; Shchelev, S.A.; Chulkov, N.I.

    1975-01-01

    The basic principles of connecting ES-5052 magnetic disk storage devices to a BESM-4 computer are set forth. The interchange of instructions and guidance information between the computer and the magnetic disk storage is accomplished through a disk control unit. The time taken to find the required cylinder (to pass from track to track) is 20 ms, and the access time is 60 to 95 ms. The transfer time for a data block (512 45-digit machine words) is 20 ms. The total disk storage capacity is equivalent to 250 BESM-4 memory units. The instructions for BESM-4 access to the ES-5052 magnetic disk storage are described.

  20. Accelerated time-of-flight (TOF) PET image reconstruction using TOF bin subsetization and TOF weighting matrix pre-computation

    International Nuclear Information System (INIS)

    Mehranian, Abolfazl; Kotasidis, Fotis; Zaidi, Habib

    2016-01-01

    Time-of-flight (TOF) positron emission tomography (PET) technology has recently regained popularity in clinical PET studies for improving image quality and lesion detectability. Using TOF information, the spatial location of annihilation events is confined to a number of image voxels along each line of response; the cross-dependencies of image voxels are thereby reduced, which in turn results in improved signal-to-noise ratio and convergence rate. In this work, we propose a novel approach to further improve the convergence of the expectation maximization (EM)-based TOF PET image reconstruction algorithm through subsetization of emission data over TOF bins as well as azimuthal bins. Given the prevalence of TOF PET, we elaborated the practical and efficient implementation of TOF PET image reconstruction through the pre-computation of TOF weighting coefficients, exploiting the same in-plane and axial symmetries used in pre-computation of the geometric system matrix. In the proposed subsetization approach, TOF PET data were partitioned into a number of interleaved TOF subsets, with the aim of reducing the spatial coupling of TOF bins and thereby improving the convergence of the standard maximum likelihood expectation maximization (MLEM) and ordered subsets EM (OSEM) algorithms. The comparison of on-the-fly and pre-computed TOF projections showed that pre-computation of the TOF weighting coefficients can considerably reduce the computation time of TOF PET image reconstruction. The convergence rate and bias-variance performance of the proposed TOF subsetization scheme were evaluated using simulated, experimental phantom, and clinical studies. Simulations demonstrated that as the number of TOF subsets is increased, the convergence rate of the MLEM and OSEM algorithms is improved. It was also found that for the same computation time, the proposed subsetization gives rise to further convergence. The bias-variance analysis of the experimental NEMA phantom and a clinical…
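
    A minimal sketch of the interleaved TOF-bin subsetization described above: the TOF bin indices are partitioned into S interleaved subsets so that each subset spans the whole TOF range, reducing the spatial coupling of the bins updated together. Bin and subset counts are illustrative:

      # Partition TOF bins 0..12 into 4 interleaved subsets.
      n_tof_bins, n_subsets = 13, 4

      subsets = [list(range(s, n_tof_bins, n_subsets)) for s in range(n_subsets)]
      for s, bins in enumerate(subsets):
          print(f"subset {s}: TOF bins {bins}")
      # In an OSEM-style loop, each update would use only events whose TOF bin
      # falls in the current subset (optionally crossed with azimuthal subsets).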